Open-Source and Science in the Era of Foundation Models
Percy Liang (Stanford University)
https://simons.berkeley.edu/talks/percy-liang-stanford-university-2024-11-15
Domain Adaptation and Related Areas
As the capabilities of foundation models skyrocket, openness plummets. In this talk, I argue that open-source models are essential for the long-term goal of building a rigorous foundation for AI. Greater access---from API to open-weight to open-source---enables deeper forms of research. API access allows us to push the frontier of agents, and I will present our recent work on simulation and problem-solving agents. Open weights enable reproducible research on safety, interpretability, and, more generally, “model forensics”. Open-source unlocks fundamental innovations in architectures, training procedures, and data curation methods. Of course, the key obstacle to building open-source models is the resources required (data, compute, and research/engineering). I will conclude with some promising directions that leverage the community and bring us closer to the vision of open-source foundation models.