What Can We Learn From Deep Learning Programs? | Two Minute Papers #75
The paper "Model Compression" is available here:
https://www.cs.cornell.edu/~caruana/compression.kdd06.pdf
A recorded talk on the paper is also available here:
http://research.microsoft.com/apps/video/default.aspx?id=103668&r=1
Discussions on this topic:
1. https://www.linkedin.com/pulse/computer-vision-research-my-deep-depression-nikos-paragios
2. https://www.reddit.com/r/MachineLearning/comments/4lq701/yann_lecuns_letter_to_cvpr_chair_after_bad/
Recommended for you:
Neural Programmer Interpreters - https://www.youtube.com/watch?v=B70tT4WMyJk
WE WOULD LIKE TO THANK OUR GENEROUS PATREON SUPPORTERS WHO MAKE TWO MINUTE PAPERS POSSIBLE:
David Jaenisch, Sunil Kim, Julian Josephs.
https://www.patreon.com/TwoMinutePapers
We also thank Experiment for sponsoring our series. - https://experiment.com/
Subscribe if you would like to see more of these! - http://www.youtube.com/subscription_center?add_user=keeroyz
The thumbnail background image was created by John Lord - https://flic.kr/p/nVUaB
Splash screen/thumbnail design: Felícia Fehér - http://felicia.hu
Károly Zsolnai-Fehér's links:
Facebook → https://www.facebook.com/TwoMinutePapers/
Twitter → https://twitter.com/karoly_zsolnai
Web → https://cg.tuwien.ac.at/~zsolnai/