Sequential Model-Based Optimization for General Algorithm Configuration (extended version)

Frank Hutter, Holger H. Hoos, and Kevin Leyton-Brown
This paper introduces two novel sequential model-based optimization (SMBO) methods for general algorithm configuration: ROAR and SMAC. Both extend SMBO to handle categorical parameters and multiple problem instances, lifting previous limitations of SMBO approaches that supported only numerical parameters and optimization on a single instance. ROAR is a simple model-free method that selects parameter configurations uniformly at random and compares each against the current best configuration (the incumbent) using an intensification mechanism. SMAC is a more sophisticated SMBO method that uses response-surface models to select promising configurations, supporting numerical and categorical parameters as well as multiple instances.

The methods are evaluated on SAT and MIP problems, configuring the local search SAT solver SAPS, the tree search SAT solver SPEAR, and the commercial MIP solver CPLEX. Across 17 configuration scenarios with small runtimes, SMAC outperforms existing configurators such as ParamILS and GGA in many cases, achieving state-of-the-art performance. The paper also presents a theoretical analysis of SMAC and ROAR, proving convergence to the optimal configuration under certain conditions. Overall, the results demonstrate that SMAC is a promising approach for general algorithm configuration, capable of handling complex parameter spaces and multiple instances.
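Both methods share the same SMBO-style loop: repeatedly pick a challenger configuration (uniformly at random in ROAR, model-guided in SMAC) and intensify it against the incumbent on a growing set of instance runs. The sketch below is a minimal, illustrative Python rendition of that loop, not the paper's implementation; `run_target`, the instance list, and the simple mean-cost ranking that stands in for SMAC's random-forest model are assumptions made for brevity.

```python
import random

def smbo(configs, run_target, max_runs=200, use_model=True):
    """Schematic SMBO loop in the spirit of ROAR/SMAC (illustrative sketch only).

    configs    : list of candidate configurations (assumed to contain >= 2 entries)
    run_target : callable(config, instance) -> observed cost (e.g. runtime)
    use_model  : True -> SMAC-like model-guided selection, False -> ROAR-like random selection
    """
    instances = list(range(10))      # placeholder problem instances
    history = []                     # (config, instance, cost) run records
    incumbent = configs[0]           # start from the first (default) configuration

    def mean_cost(config):
        # Empirical mean cost of a configuration over all of its recorded runs.
        costs = [c for cfg, _, c in history if cfg is config]
        return sum(costs) / len(costs) if costs else float("inf")

    runs = 0
    while runs < max_runs:
        if use_model and history:
            # Stand-in for SMAC's model-based selection: rank challengers by
            # observed mean cost (SMAC itself fits a random-forest model over
            # parameter values and instance features to predict cost).
            challenger = min((c for c in configs if c is not incumbent), key=mean_cost)
        else:
            # ROAR: sample a challenger uniformly at random.
            challenger = random.choice(configs)

        # Intensification: race the challenger against the incumbent on a few
        # instances, stopping early as soon as the challenger falls behind.
        for inst in random.sample(instances, k=3):
            history.append((challenger, inst, run_target(challenger, inst)))
            history.append((incumbent, inst, run_target(incumbent, inst)))
            runs += 2
            if mean_cost(challenger) > mean_cost(incumbent):
                break
        else:
            incumbent = challenger   # challenger kept up on all sampled instances

    return incumbent
```

As a usage sketch, `run_target` could wrap a call to the target solver and return its measured runtime on the given instance; the real SMAC additionally caps runs, handles censored (timed-out) runs, and shares instance/seed pairs between incumbent and challenger, all of which are omitted here.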