Training Deep Neural Networks With Dropout | Two Minute Papers #62

Subscribers: 1,550,000
Video Link: https://www.youtube.com/watch?v=LhhEv1dMpKE
Duration: 3:29
Views: 9,101


In this episode, we discuss the bane of many machine learning algorithms: overfitting. We also explain why it is an undesirable way to learn and how to combat it via dropout.
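
A minimal sketch of what dropout does in code, using NumPy; the keep probability, array sizes, and function names below are illustrative assumptions on our part, not values from the paper or the video:

import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, keep_prob=0.5, training=True):
    # During training, keep each unit with probability keep_prob and rescale
    # the survivors by 1/keep_prob (inverted dropout), so the expected
    # activation magnitude is the same at test time, when nothing is dropped.
    if not training:
        return activations
    mask = rng.random(activations.shape) < keep_prob
    return activations * mask / keep_prob

# Illustrative usage on a batch of 4 examples with 8 hidden units:
hidden = rng.standard_normal((4, 8))
print(dropout(hidden, keep_prob=0.5, training=True))  # roughly half the units zeroed
print(dropout(hidden, training=False))                # unchanged at test time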

_____________________

The paper "Dropout: A Simple Way to Prevent Neural Networks from
Overtting" is available here:
https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf

Andrej Karpathy's autoencoder is available here:
http://cs.stanford.edu/people/karpathy/convnetjs/demo/autoencoder.html

Recommended for you:
Overfitting and Regularization For Deep Learning - https://www.youtube.com/watch?v=6aF9sJrzxaM
Decision Trees and Boosting, XGBoost - https://www.youtube.com/watch?v=0Xc9LIb_HTw
A full playlist with machine learning and deep learning-related Two Minute Papers videos - https://www.youtube.com/playlist?list=PLujxSBD-JXglGL3ERdDOhthD3jTlfudC2

WE WOULD LIKE TO THANK OUR GENEROUS SUPPORTERS WHO MAKE TWO MINUTE PAPERS POSSIBLE:
Sunil Kim, Vinay S.
https://www.patreon.com/TwoMinutePapers

Subscribe if you would like to see more of these! - http://www.youtube.com/subscription_center?add_user=keeroyz

The thumbnail image background was created by Norma (CC BY 2.0) - https://flic.kr/p/ejXPXt
Splash screen/thumbnail design: Felícia Fehér - http://felicia.hu

Károly Zsolnai-Fehér's links:
Facebook → https://www.facebook.com/TwoMinutePapers/
Twitter → https://twitter.com/karoly_zsolnai
Web → https://cg.tuwien.ac.at/~zsolnai/

Tags:
two minute papers
dropout
l1 regularization
l2 regularization
deep learning regularization
deep learning overfitting
neural network overfitting
machine learning overfitting
neural network regularization
what to do overfitting
deep neural network overfitting
deep learning regularizer
dropconnect
neural network dropout
underfitting
overfit
deep learning dropout
regularization with dropout
Dropout: A Simple Way to Prevent Neural Networks from Overfitting
overfitting