Efficient Exploration in Bayesian Optimization – Optimism and Beyond by Andreas Krause

Video Link: https://www.youtube.com/watch?v=p_PK1CuEuAE
Duration: 1:15:19

A Google TechTalk, presented by Andreas Krause, 2021/06/07
ABSTRACT: A central challenge in Bayesian Optimization and related tasks is the exploration-exploitation dilemma: selecting inputs that are informative about the unknown function, while focusing exploration where we expect high return. In this talk, I will present several approaches based on nonparametric confidence bounds that are designed to navigate this dilemma. We'll explore the limits of the well-established optimistic principle, especially when we need to distinguish epistemic and aleatoric uncertainty, or in light of constraints. I will also present recent results aiming to meta-learn well-calibrated probabilistic models (Gaussian processes and Bayesian neural networks) from related tasks. I will demonstrate our algorithms on several optimal experimental design tasks.
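The optimistic principle the abstract refers to is typified by GP-UCB-style acquisition: fit a Gaussian process to the observations, then query the input maximizing an upper confidence bound (posterior mean plus a multiple of the posterior standard deviation). The following is a minimal illustrative sketch, not code from the talk; the RBF kernel, the toy objective `f`, the fixed confidence width `beta`, and all numerical settings are assumptions for illustration (the theory prescribes a growing schedule for beta).

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # Squared-exponential kernel matrix between 1-D point sets a and b.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-3):
    # Standard GP regression posterior mean and standard deviation.
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_query, x_train)
    mu = Ks @ np.linalg.solve(K, y_train)
    var = np.diag(rbf(x_query, x_query) - Ks @ np.linalg.solve(K, Ks.T))
    return mu, np.sqrt(np.maximum(var, 0.0))

def f(x):
    # Hypothetical black-box objective, for illustration only.
    return np.sin(3 * x) + 0.5 * x

grid = np.linspace(0.0, 2.0, 200)     # candidate inputs
x_obs = np.array([0.1, 1.9])          # two initial observations
y_obs = f(x_obs)

beta = 2.0  # confidence-width parameter (assumed constant here)
for _ in range(10):
    mu, sigma = gp_posterior(x_obs, y_obs, grid)
    ucb = mu + beta * sigma           # upper confidence bound
    x_next = grid[np.argmax(ucb)]     # optimism in the face of uncertainty
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, f(x_next))

print(x_obs[np.argmax(y_obs)], y_obs.max())
```

The exploration-exploitation trade-off is visible in the acquisition rule: large `sigma` (epistemic uncertainty) draws queries toward unexplored regions, while large `mu` draws them toward regions already known to score well.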

About the speaker: Andreas Krause is a Professor of Computer Science at ETH Zurich, where he leads the Learning & Adaptive Systems Group. He also serves as Academic Co-Director of the Swiss Data Science Center and Chair of the ETH AI Center, and co-founded the ETH spin-off LatticeFlow. Before that, he was an Assistant Professor of Computer Science at Caltech. He received his Ph.D. in Computer Science from Carnegie Mellon University (2008) and his Diplom in Computer Science and Mathematics from the Technical University of Munich, Germany (2004). He is a Microsoft Research Faculty Fellow and a Kavli Frontiers Fellow of the US National Academy of Sciences. He received ERC Starting Investigator and ERC Consolidator grants, the Deutscher Mustererkennungspreis, an NSF CAREER award, and the ETH Golden Owl teaching award. His research has received awards at several premier conferences and journals, including the ACM SIGKDD Test of Time award 2019 and the ICML Test of Time award 2020. Andreas Krause served as Program Co-Chair for ICML 2018, regularly serves as Area Chair or Senior Program Committee member for ICML, NeurIPS, AAAI, and IJCAI, and is an Action Editor for the Journal of Machine Learning Research.




Other Videos By Google TechTalks


2021-10-06  Greybeard Qualification (Linux Internals) part 5: Block Devices & File Systems
2021-10-06  Greybeard Qualification (Linux Internals) part 4: Startup and Init
2021-09-30  A Regret Analysis of Bilateral Trade
2021-09-29  CoinPress: Practical Private Mean and Covariance Estimation
2021-09-29  "I need a better description": An Investigation Into User Expectations For Differential Privacy
2021-09-29  On the Convergence of Deep Learning with Differential Privacy
2021-09-29  A Geometric View on Private Gradient-Based Optimization
2021-09-29  BB84: Quantum Protected Cryptography
2021-09-29  Fast and Memory Efficient Differentially Private-SGD via JL Projections
2021-09-29  Leveraging Public Data for Practical Synthetic Data Generation
2021-07-13  Efficient Exploration in Bayesian Optimization – Optimism and Beyond by Andreas Krause
2021-07-13  Learning to Explore in Molecule Space by Yoshua Bengio
2021-07-13  Resource Allocation in Multi-armed Bandits by Kirthevasan Kandasamy
2021-07-13  Grey-box Bayesian Optimization by Peter Frazier
2021-06-10  Is There a Mathematical Model of the Mind? (Panel Discussion)
2021-06-04  Dataset Poisoning on the Industrial Scale
2021-06-04  Towards Training Provably Private Models via Federated Learning in Practice
2021-06-04  Breaking the Communication-Privacy-Accuracy Trilemma
2021-06-04  Cronus: Robust Knowledge Transfer for Federated Learning
2021-06-04  Private Algorithms with Minimal Space
2021-06-04  Tackling the Objective Inconsistency Problem in Heterogeneous Federated Optimization