Extensions of Bayesian Optimization for Real-World Applications

Published 28 July 2016
Bayesian Optimization (BO) is a popular approach in statistics and machine learning for the global optimization of expensive blackbox functions. It has strong theoretical foundations and also yields state-of-the-art empirical results for optimizing functions with a small number of all-continuous inputs. However, many blackbox optimization problems in real-world applications do not fit into this scope. For example, the "algorithm configuration" problem of identifying the best instantiation of a parameterized algorithm poses various challenges to BO, including: high dimensionality, mixed discrete/continuous optimization, function evaluations of varying costs, partial function evaluations that only yield a bound on the true function value, and computational efficiency with tens of thousands of function evaluations. In this talk, I discuss recent work at UBC that extends BO to handle these challenges. Empirical results demonstrate that the resulting methods achieve state-of-the-art performance for the configuration of algorithms for solving hard combinatorial problems and for the configuration of machine learning classifiers. Based on joint work with Holger Hoos, Kevin Leyton-Brown, and Nando de Freitas and his machine learning group.
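
To make the basic BO loop concrete, the following is a minimal Python sketch, not the UBC methods described in the talk: it fits a Gaussian-process surrogate to the observations so far, scores random candidate points with an Expected Improvement acquisition, and evaluates the most promising one. The toy objective f, the bounds, and all settings are illustrative assumptions.

    import numpy as np
    from scipy.stats import norm
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def f(x):
        # Toy stand-in for an expensive blackbox objective (to be minimized).
        return np.sin(3.0 * x) + 0.1 * x ** 2

    rng = np.random.default_rng(0)
    bounds = (-3.0, 3.0)

    # Start from a small random initial design.
    X = rng.uniform(*bounds, size=(3, 1))
    y = f(X).ravel()

    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

    for _ in range(20):
        gp.fit(X, y)

        # Expected Improvement over the best observation so far.
        cand = rng.uniform(*bounds, size=(1000, 1))
        mu, sigma = gp.predict(cand, return_std=True)
        best = y.min()
        z = (best - mu) / np.maximum(sigma, 1e-9)
        ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

        # Evaluate the most promising candidate and augment the data.
        x_next = cand[np.argmax(ei)].reshape(1, 1)
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next).ravel())

    print("best input:", X[np.argmin(y)].item(), "best value:", y.min())

Each challenge named in the abstract relaxes one of this sketch's assumptions: high-dimensional or mixed discrete/continuous inputs change the surrogate and how candidates are proposed, varying evaluation costs and partial (bound-only) evaluations change what the acquisition trades off, and large evaluation budgets stress the cubic cost of the GP itself.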