Extensions of Bayesian Optimization for Real-World Applications

Video Link: https://www.youtube.com/watch?v=24a8pBisH_g
Duration: 1:16:13

Bayesian Optimization (BO) is a popular approach in statistics and machine learning for the global optimization of expensive blackbox functions. It has strong theoretical foundations and also yields state-of-the-art empirical results for optimizing functions with a small number of purely continuous inputs. However, many blackbox optimization problems in real-world applications do not fit into this scope. For example, the "algorithm configuration" problem of identifying the best instantiation of a parametric algorithm poses various challenges to BO, including high dimensionality, mixed discrete/continuous optimization, function evaluations of varying costs, partial function evaluations that only yield a bound on the true function value, and the need for computational efficiency when performing tens of thousands of function evaluations. In this talk, I discuss recent work at UBC that extends BO to handle these challenges. Empirical results demonstrate that the resulting methods achieve state-of-the-art performance for the configuration of algorithms for solving hard combinatorial problems and for the configuration of machine learning classifiers. Based on joint work with Holger Hoos, Kevin Leyton-Brown, and Nando de Freitas and his machine learning group.
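For readers unfamiliar with the basic BO loop the abstract builds on, here is a minimal toy sketch: a Gaussian-process surrogate (RBF kernel) fit to past evaluations, with an expected-improvement acquisition function selecting the next point. This illustrates only the standard continuous, low-dimensional setting the talk starts from, not the speaker's extensions; all function names and the toy objective are illustrative assumptions.

```python
import math
import numpy as np

def rbf_kernel(a, b, lengthscale=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    d = np.asarray(a)[:, None] - np.asarray(b)[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, X_query, noise=1e-6):
    """GP posterior mean and stddev at X_query, given observations (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, X_query)
    alpha = np.linalg.solve(K, y)
    v = np.linalg.solve(K, Ks)
    mu = Ks.T @ alpha
    # prior variance of the RBF kernel at a point is 1.0
    var = np.clip(1.0 - np.sum(Ks * v, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best_y):
    """EI for minimization: E[max(best_y - f, 0)] under the GP posterior."""
    z = (best_y - mu) / sigma
    Phi = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))
    phi = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    return (best_y - mu) * Phi + sigma * phi

def blackbox(x):
    """Toy stand-in for an expensive blackbox objective (to be minimized)."""
    return (x - 0.7) ** 2 + 0.1 * math.sin(8.0 * x)

rng = np.random.default_rng(0)
X = list(rng.uniform(0.0, 1.0, 3))   # small initial design
y = [blackbox(x) for x in X]
grid = np.linspace(0.0, 1.0, 200)    # candidate set for the acquisition

for _ in range(15):                  # sequential BO loop
    mu, sigma = gp_posterior(np.array(X), np.array(y), grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, min(y)))]
    X.append(float(x_next))
    y.append(blackbox(x_next))

print("best x:", X[int(np.argmin(y))], "best f:", min(y))
```

In the algorithm-configuration setting the abstract describes, this vanilla recipe breaks down: GP surrogates scale poorly to tens of thousands of evaluations and do not natively handle mixed discrete/continuous spaces, which is precisely the gap the extensions in the talk address.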

Tags:
microsoft research