Samuel Wiqvist, Pierre-Alexandre Mattei, Umberto Picchini, Jes Frellsen
Research output: Chapter in book/report/conference proceeding › Conference paper in proceeding › Peer review
We present a novel family of deep neural architectures, named partially exchangeable networks (PENs), that leverage probabilistic symmetries. By design, PENs are invariant to block-switch transformations, which characterize the partial exchangeability properties of conditionally Markovian processes. Moreover, we show that any block-switch invariant function has a PEN-like representation. The DeepSets architecture is a special case of PEN, and we can therefore also target fully exchangeable data. We employ PENs to learn summary statistics in approximate Bayesian computation (ABC). When comparing PENs to previous deep learning methods for learning summary statistics, our results are highly competitive for both time series and static models. Indeed, PENs provide more reliable posterior samples even when using less training data.
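A minimal sketch of a PEN-style network for a scalar time series of Markov order d, under the assumption that the architecture sums an inner network over sliding windows and concatenates the result with the first d values before an outer network (d = 0 recovers a DeepSets-style model). The class and attribute names (`PEN`, `inner_net`, `outer_net`) are illustrative, not the authors' reference implementation.

```python
# Sketch of a PEN-style, block-switch-invariant summary-statistics network.
import torch
import torch.nn as nn


class PEN(nn.Module):
    """Partially exchangeable network of order d; d = 0 is DeepSets-like."""

    def __init__(self, d: int, hidden: int = 32, out_dim: int = 2):
        super().__init__()
        self.d = d
        # Inner network acts on sliding windows (x_{i-d}, ..., x_i).
        self.inner_net = nn.Sequential(
            nn.Linear(d + 1, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )
        # Outer network combines the first d values with the pooled window
        # features to produce the learned summary statistics.
        self.outer_net = nn.Sequential(
            nn.Linear(d + hidden, hidden), nn.ReLU(), nn.Linear(hidden, out_dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n) scalar time series.
        windows = x.unfold(dimension=1, size=self.d + 1, step=1)  # (batch, n-d, d+1)
        pooled = self.inner_net(windows).sum(dim=1)               # sum pooling -> invariance
        head = x[:, : self.d]                                     # first d values kept as-is
        return self.outer_net(torch.cat([head, pooled], dim=1))


# Example: summaries for a batch of length-100 series from an order-1 Markov model.
net = PEN(d=1)
stats = net(torch.randn(8, 100))  # shape (8, 2): learned summary statistics
```

The sum over windows makes the output invariant to reorderings that preserve the local window contents, which is the block-switch invariance the abstract refers to; in ABC, `stats` would then replace hand-crafted summary statistics.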
Original language | English |
---|---|
Title of host publication | 36th International Conference on Machine Learning, ICML 2019 |
Publisher | International Machine Learning Society (IMLS) |
Pages | 11795-11804 |
Number of pages | 10 |
ISBN (electronic) | 9781510886988 |
Publication status | Published - 2019 |
Event | 36th International Conference on Machine Learning, ICML 2019 - Long Beach, USA; Duration: 9 June 2019 → 15 June 2019 |
Name | 36th International Conference on Machine Learning, ICML 2019 |
---|---|
Volume | 2019-June |
Conference | 36th International Conference on Machine Learning, ICML 2019 |
---|---|
Country/Territory | USA |
City | Long Beach |
Period | 2019/06/09 → 2019/06/15 |