Increasing the Scope as You Learn: Adaptive Bayesian Optimization in Nested Subspaces

Leonard Papenmeier, Luigi Nardi, Matthias Poloczek

Research output: Chapter in Book/Report/Conference proceeding › Paper in conference proceeding › peer-review

Abstract

Recent advances have extended the scope of Bayesian optimization (BO) to expensive-to-evaluate black-box functions with dozens of dimensions, aspiring to unlock impactful applications, for example, in the life sciences, neural architecture search, and robotics. However, a closer examination reveals that the state-of-the-art methods for high-dimensional Bayesian optimization (HDBO) suffer from degrading performance as the number of dimensions increases, or even risk failure if certain unverifiable assumptions are not met. This paper proposes BAxUS, which leverages a novel family of nested random subspaces to adapt the space it optimizes over to the problem. This ensures high performance while removing the risk of failure, which we assert via theoretical guarantees. A comprehensive evaluation demonstrates that BAxUS achieves better results than the state-of-the-art methods for a broad set of applications.
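To make the nested-subspace idea concrete, the following is a minimal NumPy sketch of one plausible construction: a sparse sign embedding whose target space is grown by splitting dimensions, so that each earlier subspace, and each earlier observation, is contained in the next. The function names, the splitting rule, and the uniform reassignment are illustrative assumptions for this sketch, not the paper's exact algorithm.

```python
import numpy as np

def sample_embedding(input_dim, target_dim, rng):
    """Sparse sign embedding: each input dimension is tied to exactly one
    target dimension with a random sign (count-sketch style)."""
    assignment = rng.integers(0, target_dim, size=input_dim)
    signs = rng.choice([-1.0, 1.0], size=input_dim)
    return assignment, signs

def project_up(z, assignment, signs):
    """Lift a point z from the low-dimensional subspace to the input space."""
    return signs * z[assignment]

def grow_embedding(assignment, old_dim, new_dim, rng):
    """Grow the target space so the old subspace is nested in the new one.
    parent[j] records which old dimension the new dimension j descends from;
    the signs of the input dimensions are left unchanged."""
    parent = np.concatenate([
        np.arange(old_dim),
        rng.integers(0, old_dim, size=new_dim - old_dim),
    ])
    new_assignment = assignment.copy()
    for j_old in range(old_dim):
        children = np.flatnonzero(parent == j_old)
        mask = assignment == j_old
        new_assignment[mask] = rng.choice(children, size=int(mask.sum()))
    return new_assignment, parent

def lift_observations(Z_old, parent):
    """Re-express evaluated low-dimensional points in the grown space:
    copying each parent coordinate into all of its children reproduces
    the same high-dimensional points under project_up."""
    return Z_old[:, parent]

# Tiny check of the nestedness property (illustrative only).
rng = np.random.default_rng(0)
a0, s = sample_embedding(input_dim=100, target_dim=4, rng=rng)
z = rng.uniform(-1, 1, size=4)
x_before = project_up(z, a0, s)
a1, parent = grow_embedding(a0, old_dim=4, new_dim=7, rng=rng)
z_lifted = lift_observations(z[None, :], parent)[0]
x_after = project_up(z_lifted, a1, s)
assert np.allclose(x_before, x_after)  # same point, larger subspace
```

The invariant checked by the final assert is the point of the construction: lifting old low-dimensional observations into the grown space reproduces the same high-dimensional points, so previously collected data remains usable as the optimization scope increases.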
Original language: English
Title of host publication: Advances in Neural Information Processing Systems, NeurIPS 2022
Publisher: Curran Associates, Inc.
ISBN (Print): 9781713871088
Publication status: Published - 2022
Event: Advances in Neural Information Processing Systems 35, NeurIPS 2022 - New Orleans, United States
Duration: 2022 Nov 28 – 2022 Dec 9

Conference

Conference: Advances in Neural Information Processing Systems 35, NeurIPS 2022
Country/Territory: United States
City: New Orleans
Period: 2022/11/28 – 2022/12/09

Subject classification (UKÄ)

  • Robotics

Free keywords

  • Bayesian Optimization
  • Global Optimization
  • Gaussian Process
  • High-Dimensional
