Bayesian inference with models made of modules

A post about a talk given at ISBA 2022 on Bayesian modular inference
Published

August 27, 2022

Large models are often built by combining more basic models.

In June I had the privilege of giving the Susie Bayarri lecture at the 2022 ISBA World Meeting, a fairly large conference on Bayesian analysis that takes place every other year. The topic was “Bayesian inference with models made of modules”, also known as “cut posteriors” or “cutting feedback”. Susie Bayarri worked on this topic herself, with a view toward computer models [1]. My own papers on the topic include [2] and [3].
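To make the idea concrete, here is a toy sketch of a cut posterior (this example is mine, not taken from the talk). Suppose module 1 has a parameter theta1 informed by data Y, and module 2 has a parameter theta2 informed by data Z given theta1. The full posterior lets Z feed back into inference on theta1; the cut distribution blocks that feedback, sampling theta1 from p(theta1 | Y) alone and then theta2 from p(theta2 | theta1, Z). In a simple conjugate Gaussian setting both steps are exact draws:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-module model (illustrative only):
#   Module 1:  Y_i ~ N(theta1, 1),           prior theta1 ~ N(0, 1)
#   Module 2:  Z_j ~ N(theta1 + theta2, 1),  prior theta2 ~ N(0, 1)
Y = rng.normal(1.0, 1.0, size=50)
Z = rng.normal(1.0 + 2.0, 1.0, size=50)

# Module-1 posterior p(theta1 | Y): conjugate Normal update.
n = len(Y)
post1_var = 1.0 / (n + 1.0)
post1_mean = post1_var * Y.sum()

def sample_cut(num_samples):
    """Draw from the cut distribution p(theta1 | Y) p(theta2 | theta1, Z).

    The "cut" means theta1 is sampled using module 1 only, so
    misspecification in module 2 cannot contaminate it."""
    theta1 = rng.normal(post1_mean, np.sqrt(post1_var), size=num_samples)
    # Conditional posterior p(theta2 | theta1, Z), conjugate given theta1.
    m = len(Z)
    post2_var = 1.0 / (m + 1.0)
    post2_mean = post2_var * (Z.sum() - m * theta1)
    theta2 = rng.normal(post2_mean, np.sqrt(post2_var))
    return theta1, theta2

theta1, theta2 = sample_cut(100_000)
```

By construction the theta1 marginal of the cut distribution is exactly the module-1 posterior, whereas under the full posterior theta1 would also be shifted by Z. In non-conjugate models the second stage typically requires a Markov chain per theta1 draw, which is one reason computation for cut posteriors is an active topic.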

These days quite a few people work on this topic, which is exciting. In my lecture I tried to summarize the current state of affairs, although I’m sure I’ve missed some relevant references. A video recording of the lecture is available on YouTube along with the other keynotes, unfortunately without the slides. If you would like to follow the talk, you can find the slides here.

[1] Bayarri, M. J., Berger, J. O. and Liu, F. (2009). Modularization in Bayesian analysis, with emphasis on analysis of computer models. Bayesian Analysis 4 119–150.
[2] Jacob, P. E., Murray, L. M., Holmes, C. C. and Robert, C. P. (2017). Better together? Statistical learning in models made of modules. arXiv preprint arXiv:1708.08719.
[3] Pompe, E. and Jacob, P. E. (2021). Asymptotics of cut distributions and robust modular inference using posterior bootstrap. arXiv preprint arXiv:2110.11149.