Abstract
We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, so we can also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). Compared to previous deep learning methods for learning summary statistics, our results are highly competitive, both for time series and for static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
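To illustrate the fully exchangeable special case mentioned in the abstract, here is a minimal, hypothetical DeepSets-style summary network: an inner network embeds each observation, a sum pools the embeddings (making the output invariant to permutations of the data), and an outer network maps the pooled embedding to summary statistics. The weights, dimensions, and function names below are illustrative assumptions, not the authors' actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fixed weights for the inner network phi: R -> R^8
# and the outer network rho: R^8 -> R^2 (two summary statistics).
W_phi = rng.normal(size=(1, 8))
b_phi = rng.normal(size=8)
W_rho = rng.normal(size=(8, 2))
b_rho = rng.normal(size=2)

def phi(x):
    """Inner network: embed each scalar observation separately."""
    return np.tanh(x[:, None] @ W_phi + b_phi)

def rho(s):
    """Outer network: map the pooled embedding to summary statistics."""
    return np.tanh(s @ W_rho + b_rho)

def deepsets_summary(x):
    """Permutation-invariant summary: rho(sum_i phi(x_i))."""
    return rho(phi(x).sum(axis=0))

x = rng.normal(size=10)
s1 = deepsets_summary(x)
s2 = deepsets_summary(rng.permutation(x))  # same data, shuffled
assert np.allclose(s1, s2)  # invariant under exchangeability
```

Because the sum over embeddings ignores the order of its terms, the learned summaries depend only on the empirical distribution of the data, which is exactly the invariance exchangeable data calls for; PENs generalize this by pooling only over the blocks that partial exchangeability allows to be switched.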
Original language: English
Number of pages: 13
Publication status: Unpublished (2019)
Subject classification (UKÄ)
 Probability Theory and Statistics
Partially Exchangeable Networks and Architectures for Learning Summary Statistics in Approximate Bayesian Computation

Projects

Stochastic modelling of protein folding and likelihood-free statistical inference methods
Picchini, U., Forman, J., Lindorff-Larsen, K. & Wiqvist, S.
2015/01/01 → …
Project: Research