Abstract
We study the class of state-space models and perform maximum likelihood estimation for the model parameters. We consider a stochastic approximation expectation–maximization (SAEM) algorithm to maximize the likelihood function, with the novelty of using approximate Bayesian computation (ABC) within SAEM. The task is to provide each iteration of SAEM with a filtered state of the system, and this is achieved using an ABC sampler for the hidden state, based on sequential Monte Carlo methodology. It is shown that the resulting SAEM-ABC algorithm can be calibrated to return accurate inference, and in some situations it can outperform a version of SAEM incorporating the bootstrap filter. Two simulation studies are presented: first a nonlinear Gaussian state-space model, then a state-space model whose dynamics are expressed by a stochastic differential equation. Comparisons with iterated filtering for maximum likelihood inference, and with Gibbs sampling and particle marginal methods for Bayesian inference, are presented.
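The ABC filtering step described above can be sketched for a toy linear-Gaussian model. Everything below is an illustrative assumption, not the paper's actual examples or code: the AR(1) model, the parameter values, and the Gaussian ABC kernel with bandwidth `delta` are all stand-ins chosen for a minimal, runnable demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(T, phi=0.7, sx=0.5, sy=0.3):
    """Simulate a toy AR(1) state-space model (illustrative only):
    x_t = phi * x_{t-1} + N(0, sx^2),   y_t = x_t + N(0, sy^2)."""
    x = np.zeros(T)
    x[0] = rng.normal(0.0, sx)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sx)
    y = x + rng.normal(0.0, sy, T)
    return x, y

def abc_filter(y, phi=0.7, sx=0.5, sy=0.3, delta=0.5, N=500):
    """ABC particle filter: rather than evaluating the measurement
    density, simulate pseudo-observations from the model and weight
    particles with a Gaussian ABC kernel of bandwidth delta."""
    T = len(y)
    particles = rng.normal(0.0, sx, N)
    means = np.zeros(T)
    for t in range(T):
        if t > 0:  # propagate particles through the state transition
            particles = phi * particles + rng.normal(0.0, sx, N)
        y_sim = particles + rng.normal(0.0, sy, N)  # pseudo-observations
        w = np.exp(-0.5 * ((y_sim - y[t]) / delta) ** 2)  # ABC kernel
        w /= w.sum()
        means[t] = np.sum(w * particles)  # filtered mean at time t
        particles = particles[rng.choice(N, size=N, p=w)]  # resample
    return means
```

Within an SAEM-ABC scheme, a state trajectory sampled from such an ABC filter would supply the "filtered state" fed to each stochastic approximation step; the bandwidth `delta` plays the role of the ABC tolerance that must be calibrated for accurate inference.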
Original language  English
Pages (from-to)  179-212
Journal  Computational Statistics
Volume  33
Issue number  1
Early online date  2017 Oct 23
DOIs
Publication status  Published - 2018 Mar
Subject classification (UKÄ)
 Probability Theory and Statistics
Keywords
 Hidden Markov model
 Maximum likelihood
 Particle filter
 SAEM
 Sequential Monte Carlo
 Stochastic differential equation
Projects

Stochastic modelling of protein folding and likelihood-free statistical inference methods
Picchini, U., Forman, J., Lindorff-Larsen, K. & Wiqvist, S.
2015/01/01 → …
Project: Research