Succinct Neural Networks
Video Link: https://www.youtube.com/watch?v=ZAQkBTjK5AM
Adam Kalai (Microsoft Research)
https://simons.berkeley.edu/talks/succinct-neural-networks
Lower Bounds, Learning, and Average-Case Complexity
Abstract
We present a deep neural network (DNN) architecture that is succinct: with a constant number of parameters, it can represent any constant-sized Turing machine. In contrast, most prior DNNs are non-uniform in the sense that they require polynomial size to represent computations. One consequence is that our simple architecture can provably learn as well as any efficient learning algorithm, starting from simple random initializations. Our architecture combines recurrent and convolutional neural networks and builds on the work of Abbe and Sandon [2020].
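The abstract describes the construction only at a high level, so the following is a minimal illustrative sketch (my own, not the talk's actual architecture) of the underlying idea: a fixed, constant-sized Turing machine can be simulated with constant-size transition tensors, a recurrent state update over time, and a convolution-like shift of the head position, so the parameter count is independent of the input length. The toy machine, alphabet, and helper names below are all hypothetical.

```python
import numpy as np

# Hypothetical toy TM: in state q0, flip each bit and move right;
# halt (state q1) upon reading the blank symbol 'b'.
SYMS = ['0', '1', 'b']          # tape alphabet
STATES = ['q0', 'q1']           # q1 = halt
S, Q = len(SYMS), len(STATES)

# Transition tables as constant-size tensors, independent of input length:
# next_state[q, s], write_sym[q, s], move[q, s] in {1: right, 0: stay}.
next_state = np.array([[0, 0, 1],
                       [1, 1, 1]])
write_sym  = np.array([[1, 0, 2],   # flip 0 <-> 1, leave blank as-is
                       [0, 1, 2]])  # halt state rewrites what it reads
move       = np.array([[1, 1, 0],
                       [0, 0, 0]])

def step(tape, head, state):
    """One TM step on one-hot encodings: tape (n, S), head (n,), state (Q,)."""
    s = np.argmax(head @ tape)                   # read symbol under the head
    q = np.argmax(state)
    tape = tape.copy()
    pos = np.argmax(head)
    tape[pos] = np.eye(S)[write_sym[q, s]]       # write
    head = np.roll(head, move[q, s])             # shift head (convolution-like)
    state = np.eye(Q)[next_state[q, s]]          # update state (recurrent)
    return tape, head, state

# Run on input "1011" padded with blanks.
inp = "1011"
n = len(inp) + 2
tape = np.stack([np.eye(S)[SYMS.index(c)] for c in inp + 'b' * (n - len(inp))])
head = np.eye(n)[0]
state = np.eye(Q)[0]
for _ in range(n):                               # fixed unrolling depth
    tape, head, state = step(tape, head, state)
print(''.join(SYMS[i] for i in tape.argmax(1)))  # -> 0100bb
```

Note that the three transition tensors have size Q x S regardless of how long the tape is; only the unrolling depth and the one-hot head vector grow with the input, which is the sense in which the parameter count stays constant.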
Tags:
Simons Institute
theoretical computer science
UC Berkeley
Computer Science
Theory of Computation
Theory of Computing
Lower Bounds Learning and Average-Case Complexity
Adam Kalai