Reducing the Dimension of Language: A Spectral Perspective on Transformers

Video Link: https://www.youtube.com/watch?v=dLu50h0xmgE





Elad Hazan (Princeton University)
https://simons.berkeley.edu/talks/elad-hazan-princeton-university-2025-04-01
The Future of Language Models and Transformers

Can we build neural architectures that go beyond Transformers by leveraging principles from dynamical systems?
In this talk, we'll introduce a novel approach to sequence modeling that draws inspiration from the paradigm of online control of dynamical systems to achieve long-range memory, fast inference, and provable robustness.
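To make the dynamical-systems idea concrete, here is a minimal, hedged sketch of spectral filtering in the spirit of Hazan and coauthors' work on linear dynamical systems: instead of learning a recurrent hidden state, the input history is projected onto the top eigenvectors of a fixed Hankel matrix, giving memory that does not depend on a learned state dimension. The function names, the sequence length `T`, and the number of filters `k` are illustrative choices, not details stated in this abstract.

```python
import numpy as np

def hankel_matrix(L):
    """Fixed Hankel matrix Z with Z[i, j] = 2 / ((i + j)^3 - (i + j)),
    using 1-based indices i, j as in the spectral-filtering literature."""
    idx = np.arange(1, L + 1)
    s = idx[:, None] + idx[None, :]      # s = i + j, ranges over 2..2L
    return 2.0 / (s**3 - s)

def spectral_features(u, k=8):
    """Project the history of a (T, d) input sequence onto the top-k
    eigenvectors ("spectral filters") of the Hankel matrix."""
    T, d = u.shape
    Z = hankel_matrix(T)                 # symmetric positive definite
    _, eigvecs = np.linalg.eigh(Z)       # eigenvalues in ascending order
    phis = eigvecs[:, -k:]               # top-k filters, shape (T, k)
    feats = np.zeros((T, k, d))
    for t in range(T):
        # Feature at time t: past inputs u[0..t], weighted by each
        # filter read backwards from the current step.
        feats[t] = phis[: t + 1][::-1].T @ u[: t + 1]
    return feats

T, d = 32, 4
u = np.random.randn(T, d)
feats = spectral_features(u, k=8)
print(feats.shape)  # (32, 8, 4)
```

Because the filters are fixed (data-independent), the projection can be precomputed and applied by convolution, which is one route to the fast inference the talk mentions.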