Adaptive sequential Monte Carlo by means of mixture of experts

Julien Cornebise, Eric Moulines, Jimmy Olsson

Research output: Contribution to journal › Article › peer-review


Abstract

Appropriately designing the proposal kernel of a particle filter is an issue of significant importance, since a bad choice may lead to deterioration of the particle sample and, consequently, a waste of computational power. In this paper we introduce a novel algorithm that adaptively approximates the so-called optimal proposal kernel by a mixture of integrated curved exponential distributions with logistic weights. This family of distributions, referred to as mixtures of experts, is broad enough to be used in the presence of multi-modality or strongly skewed distributions. The mixtures are fitted, via online EM methods, to the optimal kernel through minimisation of the Kullback-Leibler divergence between the auxiliary target and instrumental distributions of the particle filter. At each iteration of the particle filter, the algorithm solves only a single optimisation problem for the whole particle sample, yielding complexity that is linear in the number of particles. In addition, we illustrate in a simulation study how the method can be successfully applied to optimal filtering in nonlinear state-space models.
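
For readers who want a concrete feel for the pipeline the abstract describes, below is a heavily simplified, self-contained Python sketch. It is not the paper's algorithm: the mixture of integrated curved exponential distributions with logistic (state-dependent) weights is replaced by a state-independent Gaussian mixture, and the online EM procedure by a few batch EM iterations on a weighted pilot sample. The state-space model, all function names, and all parameter values are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear state-space model (a common benchmark; hypothetical here,
# not necessarily the model used in the paper's simulation study):
#   x_t = 0.5 x_{t-1} + 25 x_{t-1} / (1 + x_{t-1}^2) + 8 cos(1.2 t) + v_t
#   y_t = x_t^2 / 20 + w_t
SIG_V, SIG_W = np.sqrt(10.0), 1.0

def f_mean(x, t):
    return 0.5 * x + 25.0 * x / (1.0 + x ** 2) + 8.0 * np.cos(1.2 * t)

def log_norm(x, mu, sig):
    return -0.5 * ((x - mu) / sig) ** 2 - np.log(sig) - 0.5 * np.log(2 * np.pi)

def log_g(y, x):  # log-likelihood of observation y given state x
    return log_norm(y, x ** 2 / 20.0, SIG_W)

def fit_weighted_gmm(x, w, K=3, iters=25):
    """Weighted batch EM for a 1-D Gaussian mixture: a crude stand-in for
    the paper's online-EM fit of a mixture of experts."""
    w = w / w.sum()
    mu = np.quantile(x, np.linspace(0.2, 0.8, K))
    sig = np.full(K, x.std() + 1e-6)
    pi = np.full(K, 1.0 / K)
    for _ in range(iters):
        # E-step: responsibilities, scaled by the importance weights.
        logr = np.log(pi) + log_norm(x[:, None], mu, sig)
        r = np.exp(logr - logr.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        r *= w[:, None]
        # M-step: weighted moment updates.
        nk = r.sum(axis=0) + 1e-12
        pi = nk / nk.sum()
        mu = (r * x[:, None]).sum(axis=0) / nk
        sig = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
    return pi, mu, sig

def gmm_logpdf(x, pi, mu, sig):
    comp = np.log(pi) + log_norm(x[:, None], mu, sig)
    m = comp.max(axis=1, keepdims=True)
    return (m + np.log(np.exp(comp - m).sum(axis=1, keepdims=True))).ravel()

def adapted_smc_step(particles, logw, y, t):
    N = particles.size
    W = np.exp(logw - logw.max()); W /= W.sum()
    anc = rng.choice(N, size=N, p=W)  # resample ancestors
    # Pilot sample from the prior kernel, weighted by the likelihood:
    # a particle approximation of the optimal proposal.
    pilot = f_mean(particles[anc], t) + SIG_V * rng.standard_normal(N)
    lg = log_g(y, pilot)
    pi, mu, sig = fit_weighted_gmm(pilot, np.exp(lg - lg.max()))
    # Propose from the fitted mixture; weight = transition * likelihood / proposal.
    k = rng.choice(pi.size, size=N, p=pi)
    new = mu[k] + sig[k] * rng.standard_normal(N)
    logw_new = (log_norm(new, f_mean(particles[anc], t), SIG_V)
                + log_g(y, new) - gmm_logpdf(new, pi, mu, sig))
    return new, logw_new

# Minimal run: simulate data, then filter.
T, N = 50, 500
x, xs, ys = np.sqrt(5.0) * rng.standard_normal(), [], []
for t in range(T):
    x = f_mean(x, t) + SIG_V * rng.standard_normal()
    xs.append(x); ys.append(x ** 2 / 20.0 + SIG_W * rng.standard_normal())

particles, logw = np.sqrt(5.0) * rng.standard_normal(N), np.zeros(N)
for t in range(T):
    particles, logw = adapted_smc_step(particles, logw, ys[t], t)
W = np.exp(logw - logw.max()); W /= W.sum()
print("final |x| estimate:", (W * np.abs(particles)).sum(), "truth:", abs(xs[-1]))
```

The structural point this sketch shares with the paper's method is that a single mixture fit is performed per time step for the whole particle sample, keeping the cost per step linear in the number of particles; the paper's mixture of experts additionally lets the proposal depend on each particle's previous state through its logistic weights.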
Original language: English
Pages (from-to): 317-337
Journal: Statistics and Computing
Volume: 24
Issue number: 3
DOIs
Publication status: Published - 2014

Subject classification (UKÄ)

  • Probability Theory and Statistics

Keywords

  • Optimal proposal kernel
  • Adaptive algorithms
  • Kullback-Leibler divergence
  • Coefficient of variation
  • Expectation-maximisation
  • Particle filter
  • Sequential Monte Carlo
  • Shannon entropy
