Bayesian optimization in high dimensions: a journey through subspaces and challenges

Research output: Thesis › Doctoral Thesis (compilation)

Abstract

This thesis explores the challenges and advancements in high-dimensional Bayesian optimization (HDBO), focusing on understanding, quantifying, and improving optimization techniques in high-dimensional spaces.
Bayesian optimization (BO) is a powerful method for optimizing expensive black-box functions, but its effectiveness diminishes as the dimensionality of the search space increases due to the curse of dimensionality. The thesis introduces novel algorithms and methodologies to make HDBO more practical.
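For readers new to BO, the following is a minimal sketch of a generic BO loop on an illustrative objective; it is not code from the thesis, and the objective, dimensionality, budget, and confidence-bound acquisition are assumptions chosen only to show the structure of the method. Even at a modest d = 20, the surrogate needs many evaluations before its predictions become informative, which hints at why the curse of dimensionality bites.

```python
# Minimal, illustrative Bayesian optimization loop (not code from the thesis).
# A GP surrogate is fit to the observed points, and a confidence-bound
# acquisition proposes the next evaluation.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):                       # illustrative black-box function
    return np.sum((x - 0.5) ** 2)

d, n_init, n_iter = 20, 5, 25           # illustrative dimensionality and budget
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(n_init, d))
y = np.array([objective(x) for x in X])

for _ in range(n_iter):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    cand = rng.uniform(0.0, 1.0, size=(1024, d))   # random candidate set
    mu, sigma = gp.predict(cand, return_std=True)
    lcb = mu - 2.0 * sigma                         # lower confidence bound (minimization)
    x_next = cand[np.argmin(lcb)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))

print("best value found:", y.min())
```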
Key contributions include the development of the BAxUS algorithm, which leverages nested subspaces to optimize high-dimensional problems without estimating the dimensionality of the effective subspace.
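The sketch below illustrates the general idea of searching through a low-dimensional, axis-aligned embedding that maps back into the ambient space, the family of techniques BAxUS builds on. The embedding construction, the dimensions, and the objective here are simplified assumptions; in particular, this toy sketch does not reproduce the nested growth of the subspace that is central to BAxUS.

```python
# Illustrative sketch of optimizing through a low-dimensional embedding
# (a simplified stand-in, not the BAxUS implementation).
import numpy as np

D, d_low = 100, 5                       # ambient and assumed target dimensionality
rng = np.random.default_rng(1)

# Sparse sign embedding: each ambient dimension is assigned to one target
# dimension with a random sign.
target_dim = rng.integers(0, d_low, size=D)
signs = rng.choice([-1.0, 1.0], size=D)

def up_project(z):
    """Map a point from the low-dimensional search space to the ambient space."""
    x = signs * z[target_dim]
    return np.clip(x, -1.0, 1.0)

def objective(x):                        # illustrative high-dimensional black box
    return np.sum(x[:5] ** 2)            # only a few dimensions matter

# Search in the low-dimensional space, evaluate in the ambient space.
Z = rng.uniform(-1.0, 1.0, size=(50, d_low))
values = [objective(up_project(z)) for z in Z]
print("best value via embedding:", min(values))
```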
Additionally, the Bounce algorithm extends these techniques to combinatorial and mixed spaces, providing robust solutions for real-world applications.
The thesis also explores how to quantify the exploration behavior of acquisition functions, proposing new exploration measures and strategies for designing more effective optimization approaches.
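As context, the snippet below shows where an explicit exploration term appears in two standard acquisition functions: the weighted posterior standard deviation in the upper confidence bound, and the sigma-dependent part of expected improvement. The exploration measures proposed in the thesis itself are not reproduced here; the values are illustrative.

```python
# Where "exploration" enters two common acquisition functions (maximization).
import numpy as np
from scipy.stats import norm

def ucb(mu, sigma, beta=2.0):
    # beta scales the exploration term (sigma) against the exploitation term (mu).
    return mu + beta * sigma

def expected_improvement(mu, sigma, best):
    # EI blends exploitation (mu - best) and exploration (sigma) implicitly.
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

mu, sigma, best = 0.2, 0.5, 0.3          # illustrative posterior at one candidate
print(ucb(mu, sigma), expected_improvement(mu, sigma, best))
```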
Furthermore, this work analyzes why simple BO setups have recently shown promising performance in high-dimensional spaces, challenging the conventional belief that BO is limited to low-dimensional problems.
This thesis offers insights and recommendations for designing more efficient HDBO algorithms by identifying and addressing failure modes such as vanishing gradients and biases in model fitting. Through a combination of theoretical analysis, empirical evaluation, and practical implementation, it advances our understanding of high-dimensional optimization and provides actionable methods for improving BO's performance in complex scenarios.
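One of the failure modes mentioned above, vanishing gradients, can be illustrated with a toy calculation (an assumption-laden sketch, not an excerpt from the thesis): with a fixed lengthscale, the RBF kernel correlation between random points decays roughly exponentially in the dimensionality, so the GP posterior flattens toward its prior away from the data and acquisition-function gradients become negligible.

```python
# Toy illustration of kernel correlations collapsing as dimensionality grows,
# one intuition behind vanishing acquisition gradients (illustrative values only).
import numpy as np

rng = np.random.default_rng(2)
lengthscale = 0.5                            # illustrative fixed lengthscale
for d in (2, 10, 50, 200):
    x, x_train = rng.uniform(size=d), rng.uniform(size=d)
    sq_dist = np.sum((x - x_train) ** 2)
    k = np.exp(-0.5 * sq_dist / lengthscale ** 2)   # RBF kernel correlation
    print(f"d={d:4d}  kernel value ~ {k:.2e}")
```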
Original language: English
Qualification: Doctor
Supervisors/Advisors
  • Nardi, Luigi, Assistant supervisor
  • Malec, Jacek, Supervisor
Award date: 2025 Jun 12
Place of Publication: Lund
Publisher
ISBN (print): 978-91-8104-547-5
ISBN (electronic): 978-91-8104-548-2
Publication status: Published - 2025

Bibliographical note

Defence details
Date: 2025-06-12
Time: 13:15
Place: Lecture Hall E:1406, building E, Klas Anshelms väg 10, Faculty of Engineering LTH, Lund University, Lund.
External reviewer(s)
Name: Garnett, Roman
Title: Assoc. Prof.
Affiliation: Washington University in St Louis, USA.
---

Subject classification (UKÄ)

  • Probability Theory and Statistics

Free keywords

  • optimization
  • Bayesian optimization
  • Gaussian process
  • machine learning
