Monte Carlo Filtering Objectives

Shuangshuang Chen, Sihao Ding, Yiannis Karayiannidis, Mårten Björkman

Research output: Chapter in Book/Report/Conference proceeding › Paper in conference proceeding › peer-review

Abstract

Learning generative models and inferring latent trajectories for time series is challenging because the marginal likelihoods of flexible generative models are intractable. This can be addressed by optimizing surrogate objectives. We propose Monte Carlo filtering objectives (MCFOs), a family of variational objectives for jointly learning parametric generative models and amortized adaptive importance proposals for time series. MCFOs extend the choice of likelihood estimator beyond the Sequential Monte Carlo estimators used in state-of-the-art objectives, possess important properties that reveal the factors governing the tightness of the objectives, and allow for gradient estimates with lower bias and variance. We demonstrate that the proposed MCFOs and gradient estimators lead to efficient and stable model learning, that the learned generative models explain the data well, and that the importance proposals are more sample-efficient on various kinds of time series data.
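
As a rough illustration of the kind of objective the abstract describes (a sketch of the general construction shared by estimator-based bounds such as IWAE and FIVO, not the paper's exact definition), one can take any unbiased Monte Carlo estimator of the marginal likelihood, built from a proposal with parameters φ, and use its expected logarithm as a lower bound on the log marginal likelihood; the symbols θ, φ, and p̂ below are illustrative:

```latex
% Sketch: a generic Monte Carlo objective built from an unbiased
% marginal-likelihood estimator \hat{p}_{\theta,\phi}(y_{1:T}).
% The notation (\theta for the model, \phi for the proposal) is assumed,
% not taken verbatim from the paper.
\mathcal{L}(\theta, \phi; y_{1:T})
  \;=\; \mathbb{E}\!\left[\log \hat{p}_{\theta,\phi}(y_{1:T})\right]
  \;\le\; \log \mathbb{E}\!\left[\hat{p}_{\theta,\phi}(y_{1:T})\right]
  \;=\; \log p_{\theta}(y_{1:T})
```

The inequality is Jensen's, and the gap depends on the variance of the chosen estimator; swapping the estimator (e.g., plain importance sampling versus Sequential Monte Carlo) changes the tightness of the bound, which is the degree of freedom the abstract refers to when it speaks of extending the choice of likelihood estimator.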
Original language: English
Title of host publication: IJCAI International Joint Conference on Artificial Intelligence
Pages: 2256-2262
Number of pages: 7
Publication status: Published - 2021
Externally published: Yes
Event: The 30th International Joint Conference on Artificial Intelligence (IJCAI-21) - Montreal-themed Virtual Reality
Duration: 2021 Aug 19 - 2021 Aug 26
Conference number: 30
https://ijcai-21.org/

Conference

Conference: The 30th International Joint Conference on Artificial Intelligence (IJCAI-21)
Abbreviated title: IJCAI
Period: 2021/08/19 - 2021/08/26
Internet address: https://ijcai-21.org/

Bibliographical note

Part of proceedings: ISBN 978-0-9992411-9-6
QC 20220816

Subject classification (UKÄ)

  • Control Engineering
