Probabilistic Numerics — moving BayesOpt expertise to the inner loop by Philipp Hennig

Published on: 2022-02-14
Video Link: https://www.youtube.com/watch?v=jUzASdzqTIA



Duration: 59:09


A Google TechTalk, presented by Philipp Hennig, 2022/02/08
ABSTRACT: BayesOpt Speaker Series. Bayesian Optimization experts are Gaussian process experts, and there is much more for Gaussian inference to do in the algorithmic space beyond outer-loop optimization. Using simulation — the solution of differential equations — as an example, I will argue that the advent of machine learning should put probabilistic functionality and theory at the center of numerical computations. This includes decidedly low-level functionality: a chance for BayesOpt experts to move their skills from the outer to the inner loops of computation.
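To make the abstract's example concrete, here is a minimal sketch (not taken from the talk, and not Hennig's implementation) of treating an ODE solve as Gaussian-process inference in the inner loop: an integrated-Wiener-process prior on the solution is conditioned, step by step via Kalman filtering, on the residual y'(t) - f(t, y(t)) being zero. The function name `ode_filter` and all parameter choices below are illustrative assumptions.

```python
import numpy as np

def ode_filter(f, y0, t0, t1, h, sigma2=1.0):
    """Illustrative EK0-style probabilistic ODE solver (a sketch, not
    a reference implementation).

    State m = [y, y'] carries a once-integrated Wiener-process prior;
    at each grid point the Gaussian belief is conditioned on the
    zero-valued residual y' - f(t, y).
    """
    A = np.array([[1.0, h], [0.0, 1.0]])               # IWP(1) transition
    Q = sigma2 * np.array([[h**3 / 3.0, h**2 / 2.0],
                           [h**2 / 2.0, h]])           # process noise
    H = np.array([[0.0, 1.0]])                         # observe y' only (EK0)
    m = np.array([y0, f(t0, y0)])                      # exact initialization
    P = np.zeros((2, 2))
    n = int(round((t1 - t0) / h))
    for i in range(1, n + 1):
        t = t0 + i * h
        # predict under the Wiener-process prior
        m = A @ m
        P = A @ P @ A.T + Q
        # condition on the ODE residual z = y'_pred - f(t, y_pred) = 0
        z = m[1] - f(t, m[0])
        S = (H @ P @ H.T)[0, 0]
        K = (P @ H.T)[:, 0] / S
        m = m - K * z
        P = P - np.outer(K, H @ P)
    return m, P                                        # posterior mean and cov

# Example: y' = -y, y(0) = 1, whose exact solution at t = 1 is exp(-1).
mean, cov = ode_filter(lambda t, y: -y, 1.0, 0.0, 1.0, 1e-3)
```

The point of the sketch is the one the abstract makes: the solver's output is not a point estimate but a Gaussian posterior (mean and covariance), so the same GP machinery familiar from BayesOpt surrogates runs inside the numerical method itself.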

About the speaker: Philipp Hennig holds the Chair for the Methods of Machine Learning at the University of Tübingen, and is an adjunct senior research scientist at the Max Planck Institute for Intelligent Systems. Since his PhD with David MacKay in Cambridge, he has been interested in the relationship between computation and inference. Hennig is the deputy speaker of the Cyber Valley initiative of the German State of Baden-Württemberg; an ELLIS fellow and co-director of the ELLIS Program for Theory, Algorithms and Computation of Learning Machines; and core faculty of the Tübingen AI Center. His research has been supported by Emmy Noether, Max Planck, and ERC fellowships and grants.




Other Videos By Google TechTalks


2022-05-05 | 2022 Blockly Developers Summit: Bad Blocks
2022-05-05 | 2022 Blockly Developers Summit: Debugging in Blockly
2022-05-05 | 2022 Blockly Developers Summit: Year in Review and Roadmap
2022-05-05 | 2022 Blockly Developers Summit: Customizing Blockly
2022-05-05 | 2022 Blockly Developers Summit: Blockly at Google - Scratch for CS First
2022-05-05 | 2022 Blockly Developers Summit: Serialization
2022-05-05 | 2022 Blockly Developers Summit: Block Definitions - Past, Present, and Future
2022-05-05 | 2022 Blockly Developers Summit: TypeScript Migration
2022-05-05 | 2022 Blockly Developers Summit: Contributing to Blockly
2022-05-05 | 2022 Blockly Developers Summit: Backwards Execution
2022-02-14 | Probabilistic Numerics — moving BayesOpt expertise to the inner loop by Philipp Hennig
2022-02-08 | Information-Constrained Optimization: Can Adaptive Processing of Gradients Help?
2022-02-08 | Differential privacy dynamics of noisy gradient descent
2022-02-08 | Consistent Spectral Clustering of Network Block Models under Local Differential Privacy
2022-02-08 | The Skellam Mechanism for Differentially Private Federated Learning
2022-02-08 | Statistical Heterogeneity in Federated Learning
2022-02-08 | Improved Information Theoretic Generalization Bounds for Distributed and Federated Learning
2022-02-08 | Tight Accounting in the Shuffle Model of Differential Privacy
2022-02-08 | Distributed Point Functions: Efficient Secure Aggregation and Beyond with Non-Colluding Servers
2022-02-08 | How to Turn Privacy ON and OFF and ON Again
2022-02-08 | Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout