Automated Deep Learning: Joint Neural Architecture and Hyperparameter Search (discussions) | AISC

Published on 2018-12-16 ● Video Link: https://www.youtube.com/watch?v=ghtM-DILSj0



Category: Discussion
Duration: 36:42
Views: 384


Toronto Deep Learning Series, 10 December 2018

Paper: https://arxiv.org/abs/1807.06906

Discussion Lead: Mark Donaldson (Ryerson University)
Discussion Facilitator: Masoud Hashemi (RBC)

Host: Shopify
Date: Dec 10th, 2018

Towards Automated Deep Learning: Efficient Joint Neural Architecture and Hyperparameter Search

While existing work on neural architecture search (NAS) tunes hyperparameters in a separate post-processing step, we demonstrate that architectural choices and other hyperparameter settings interact in a way that can render this separation suboptimal. Likewise, we demonstrate that the common practice of using very few epochs during the main NAS phase and many more epochs during post-processing is inefficient, because the relative rankings of architectures under these two training regimes are only weakly correlated. To combat both of these problems, we propose to use a recent combination of Bayesian optimization and Hyperband for efficient joint neural architecture and hyperparameter search.
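The "recent combination of Bayesian optimization and Hyperband" the abstract refers to is BOHB (Falkner et al., 2018). As a rough illustration of the joint-search idea only, here is a minimal, self-contained Python sketch of one Hyperband-style successive-halving bracket run over a joint architecture + hyperparameter space. Everything in it is a hypothetical stand-in: train_and_evaluate simulates training a network rather than actually doing it, and the random sample_config is precisely the piece BOHB would replace with a model-based (Bayesian) sampler fitted on past evaluations.

import math
import random

def train_and_evaluate(config, budget):
    # Hypothetical stand-in: pretend to train the network described
    # by `config` for `budget` epochs and return a validation error.
    # The toy objective favors moderate learning rates and wider nets,
    # with noise that shrinks as the epoch budget grows.
    base = abs(math.log10(config["lr"]) + 2.5) + 1.0 / config["width"]
    noise = random.gauss(0, 1.0 / math.sqrt(budget))
    return base + noise

def sample_config():
    # Joint search space: architectural choices (depth, width) are
    # sampled together with training hyperparameters (lr, batch size),
    # rather than tuning hyperparameters in a separate post-processing
    # step. BOHB swaps this random sampler for a Bayesian one.
    return {
        "depth": random.choice([2, 4, 8]),
        "width": random.choice([64, 128, 256]),
        "lr": 10 ** random.uniform(-4, -1),
        "batch_size": random.choice([32, 64, 128]),
    }

def successive_halving(n_configs=27, min_budget=1, eta=3, rounds=3):
    # One Hyperband bracket: evaluate many configs on a small epoch
    # budget, keep the best 1/eta of them, and re-train the survivors
    # on an eta-times larger budget.
    configs = [sample_config() for _ in range(n_configs)]
    budget = min_budget
    for _ in range(rounds):
        scored = sorted(
            ((train_and_evaluate(c, budget), c) for c in configs),
            key=lambda pair: pair[0],
        )
        configs = [c for _, c in scored[: max(1, len(configs) // eta)]]
        budget *= eta
    return scored[0]

if __name__ == "__main__":
    best_error, best_config = successive_halving()
    print(f"best config: {best_config}, val error ~ {best_error:.3f}")

Note that the small-budget rounds decide which configurations survive to the large-budget rounds; the abstract's second point is that when rankings at small and large budgets are only weakly correlated, decoupling the cheap search from the expensive final training wastes compute, which is why the paper searches architectures and hyperparameters jointly across budgets.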




Other Videos By LLMs Explained - Aggregate Intellect - AI.SCIENCE


2019-02-04 TDLS: Learning Functional Causal Models with GANs - part 2 (results and discussion)
2019-02-04 Neural Ordinary Differential Equations - part 1 (algorithm review) | AISC
2019-02-04 Neural Ordinary Differential Equations - part 2 (results & discussion) | AISC
2019-02-04 Parallel Collaborative Filtering for the Netflix Prize (algorithm review) | AISC Foundational
2019-02-04 Parallel Collaborative Filtering for the Netflix Prize (results & discussion) | AISC Foundational
2019-01-14 TDLS - Announcing Fast Track Stream
2019-01-09 Extracting Biologically Relevant Latent Space from Cancer Transcriptomes w/ VAEs (discussions) | AISC
2019-01-09 Extracting Biologically Relevant Latent Space from Cancer Transcriptomes w/ VAEs (algorithm) | AISC
2019-01-08 [original backprop paper] Learning representations by back-propagating errors (part 1) | AISC
2019-01-08 [original backprop paper] Learning representations by back-propagating errors (part 2) | AISC
2018-12-16 Automated Deep Learning: Joint Neural Architecture and Hyperparameter Search (discussions) | AISC
2018-12-16 Automated Deep Learning: Joint Neural Architecture and Hyperparameter Search (algorithm) | AISC
2018-12-09 Automated Vulnerability Detection in Source Code Using Deep Learning (discussions) | AISC
2018-12-09 Automated Vulnerability Detection in Source Code Using Deep Learning (algorithm) | AISC
2018-12-05 [DQN] Human-level control through deep reinforcement learning (discussions) | AISC Foundational
2018-12-05 Deep Q-Learning paper explained: Human-level control through deep reinforcement learning (algorithm)
2018-12-03 SMOTE, Synthetic Minority Over-sampling Technique (discussions) | AISC Foundational
2018-12-02 TDLS - Classics: SMOTE, Synthetic Minority Over-sampling Technique (algorithm)
2018-11-30 Visualizing Data using t-SNE (algorithm) | AISC Foundational
2018-11-30 Visualizing Data using t-SNE (discussions) | AISC Foundational
2018-11-27 [BERT] Pretrained Deep Bidirectional Transformers for Language Understanding (discussions) | TDLS



Tags:
neural architecture search
hyperparameter optimization
deep learning
auto-ml
Automated Deep Learning