Reducing the Dimension of Language: A Spectral Perspective on Transformers
Video Link: https://www.youtube.com/watch?v=dLu50h0xmgE
Elad Hazan (Princeton University)
https://simons.berkeley.edu/talks/elad-hazan-princeton-university-2025-04-01
The Future of Language Models and Transformers
Can we build neural architectures that go beyond Transformers by leveraging principles from dynamical systems?
In this talk, we introduce a novel approach to sequence modeling, inspired by the paradigm of online control of dynamical systems, that achieves long-range memory, fast inference, and provable robustness.
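To make the dynamical-systems view of sequence modeling concrete, here is a minimal, hypothetical sketch (not the talk's actual architecture): a linear dynamical system whose fixed-size hidden state summarizes the entire input history, so per-step inference cost is constant in sequence length. All matrix names and dimensions below are illustrative assumptions.

```python
import numpy as np

# Illustrative linear dynamical system (LDS) as a sequence model:
#   x_{t+1} = A x_t + B u_t   (state update)
#   y_t     = C x_t + D u_t   (output)
# The state x_t is a fixed-size summary of all past inputs, giving
# O(1) memory per step regardless of how long the sequence is.

rng = np.random.default_rng(0)
d_state, d_in, d_out = 8, 4, 4

# Stable dynamics: rescale A so its spectral radius is below 1,
# which keeps the hidden state bounded over long horizons.
A = rng.standard_normal((d_state, d_state))
A *= 0.9 / max(abs(np.linalg.eigvals(A)))
B = rng.standard_normal((d_state, d_in))
C = rng.standard_normal((d_out, d_state))
D = rng.standard_normal((d_out, d_in))

def run_lds(inputs):
    """Roll the LDS over a sequence of input vectors."""
    x = np.zeros(d_state)
    outputs = []
    for u in inputs:
        outputs.append(C @ x + D @ u)
        x = A @ x + B @ u
    return np.array(outputs)

seq = rng.standard_normal((1000, d_in))
ys = run_lds(seq)
print(ys.shape)  # one output vector per input step
```

Note how the loop carries only the small state `x` between steps; this constant-size recurrence is the property that makes dynamical-systems-based models attractive for fast inference compared to attention, whose cost grows with context length.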