BOHB: Robust and Efficient Hyperparameter Optimization at Scale

4 Jul 2018 | Stefan Falkner, Aaron Klein, Frank Hutter
BOHB is a hyperparameter optimization method that combines Bayesian optimization with Hyperband to achieve both strong anytime performance and fast convergence to optimal configurations. In extensive empirical evaluations, it outperforms both Bayesian optimization and Hyperband across a wide range of problem types, including high-dimensional toy functions, support vector machines, feed-forward neural networks, Bayesian neural networks, deep reinforcement learning, and convolutional neural networks.

BOHB is robust, versatile, and conceptually simple, making it easy to implement. It is designed to satisfy several desiderata for practical hyperparameter optimization: strong anytime performance, strong final performance, effective use of parallel resources, scalability, and robustness. It keeps Hyperband's strategy of allocating budgets to configurations via successive halving, but replaces Hyperband's random configuration selection with a model-based Bayesian optimization component that steers the search toward promising regions. The system also parallelizes effectively, allowing it to exploit large parallel resources. Empirically, BOHB often finds good solutions much faster than Bayesian optimization and converges to the best solutions much faster than Hyperband, and it is robust to its own hyperparameter settings across different problem types and dimensions. BOHB is implemented as an open-source library and can be used in a variety of machine learning applications.
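The core idea described above can be illustrated in miniature: run Hyperband-style successive halving over budgets, but draw new configurations from a crude density model fitted to the best observations instead of sampling purely at random. The sketch below is an illustration of that combination only, not the authors' implementation; the objective `evaluate`, the one-dimensional search space, and the Gaussian stand-in for BOHB's kernel density estimator are all assumptions made for the example.

```python
import random

def evaluate(config, budget):
    # Hypothetical objective: stands in for training a model for `budget`
    # epochs. Loss shrinks toward the config's "true" quality as budget grows.
    return config["x"] ** 2 + 1.0 / budget

def model_based_sample(history, space=(-5.0, 5.0), top_frac=0.3):
    """Sample a config, BOHB-style: random while data is scarce, otherwise
    from a crude density centred on the best observed configurations
    (a 1-D stand-in for BOHB's kernel density estimator)."""
    if len(history) < 8:                       # fall back to random early on
        return {"x": random.uniform(*space)}
    ranked = sorted(history, key=lambda h: h[1])
    good = ranked[: max(2, int(top_frac * len(ranked)))]
    centre = random.choice(good)[0]["x"]       # pick one good config's value
    width = (space[1] - space[0]) / len(good)  # narrower as evidence grows
    x = min(max(random.gauss(centre, width), space[0]), space[1])
    return {"x": x}

def bohb_sketch(iterations=4, max_budget=27, eta=3, seed=0):
    random.seed(seed)
    history = []                               # (config, loss) at any budget
    for _ in range(iterations):
        # One successive-halving bracket: start many configs on a small
        # budget, keep the best 1/eta of them at each rung.
        n, budget = 9, max_budget // 9
        configs = [model_based_sample(history) for _ in range(n)]
        while n >= 1:
            results = [(c, evaluate(c, budget)) for c in configs]
            history.extend(results)
            results.sort(key=lambda r: r[1])
            n, budget = n // eta, budget * eta
            configs = [c for c, _ in results[:n]]
    return min(history, key=lambda h: h[1])

best_config, best_loss = bohb_sketch()
```

After the first bracket, `history` holds enough observations for the model-based sampler to take over from random search, which mirrors how BOHB's anytime behaviour starts Hyperband-like and becomes increasingly model-guided.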