Conference paper

Hybrid Bayesian Optimization with DIRECT

Abstract

We present a hybrid Bayesian optimization method that incorporates the well-known DIRECT algorithm for black-box optimization. DIRECT originates from Lipschitz optimization for global optimization and searches by simultaneously considering all possible Lipschitz constants, thereby balancing global and local search. It is effective at locating promising regions of the feasible space but suffers from slow asymptotic convergence. As a result, DIRECT can require extensive function evaluations to obtain a high-quality solution, which is impractical in many machine learning applications. Bayesian optimization (BO), by contrast, exploits prior knowledge about the objective; for example, the unknown function is typically modeled as a Gaussian process. Within a small sub-region, this modeling assumption is more likely to hold, allowing BO to converge quickly in the confined feasible domain identified by DIRECT. The new hybrid algorithm, Bayesian DIRECT (BD), combines the strengths of both methods to achieve better convergence to the global optimum. We provide a convergence result, and experiments demonstrate the efficiency of the hybrid algorithm compared with standard Bayesian optimization.
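The two-stage idea in the abstract can be sketched as follows, assuming SciPy's DIRECT implementation for the global stage and a simple Gaussian-process surrogate with expected improvement for the local stage. The test objective, sub-region half-width, and evaluation budgets are illustrative assumptions, not the paper's actual BD algorithm.

```python
import numpy as np
from scipy.optimize import direct
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def f(x):
    # Toy objective for illustration (not from the paper); minimum at x = (0.3, 0.3).
    return float(np.sum((x - 0.3) ** 2))

bounds = [(-5.0, 5.0), (-5.0, 5.0)]

# Stage 1: DIRECT locates a promising region under a small evaluation budget.
res = direct(f, bounds, maxfun=200)

# Stage 2: BO with a GP prior refines within a confined box around DIRECT's best point.
radius = 0.5  # assumed half-width of the sub-region
lo = np.maximum([b[0] for b in bounds], res.x - radius)
hi = np.minimum([b[1] for b in bounds], res.x + radius)

rng = np.random.default_rng(0)
X = rng.uniform(lo, hi, size=(5, len(bounds)))   # initial design in the sub-region
y = np.array([f(x) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(20):
    gp.fit(X, y)
    cand = rng.uniform(lo, hi, size=(256, len(bounds)))
    mu, sigma = gp.predict(cand, return_std=True)
    best = y.min()
    # Expected improvement for minimization.
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (best - mu) / sigma
        ei = (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
        ei[sigma == 0.0] = 0.0
    x_next = cand[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next))

print(y.min())  # best value found after local BO refinement
```

In this sketch DIRECT supplies only a coarse incumbent, and BO operates on a GP fitted to points drawn from the restricted box, where the Gaussian-process assumption is more plausible than over the full domain.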

Related Work