Nov 9 – 13, 2015
Europe/Zurich timezone
There is a live webcast for this event.

ABC Method

Nov 10, 2015, 9:45 AM
222/R-001 (CERN)





Richard Wilkinson (University of Sheffield)


Approximate Bayesian computation (ABC) is the name given to a collection of Monte Carlo algorithms used for fitting complex computer models to data. The methods rely upon simulation rather than likelihood-based calculation, and so can be used to calibrate a much wider class of simulation models. The simplest version of ABC is intuitive: we sample repeatedly from the prior distribution and accept parameter values that give a close match between the simulation and the data. This has been extended in many ways: for example, reducing the dimension of the data using summary statistics and calibrating to the summaries instead of the full data; using more efficient Monte Carlo algorithms (MCMC, SMC, etc.); and introducing modelling approaches to reduce computational cost and minimize the error in the approximation. The two key challenges for ABC methods are i) dealing with computational constraints and ii) finding good low-dimensional summaries. Much of the early work on i) was based upon finding efficient sampling algorithms, adapting methods such as MCMC and sequential Monte Carlo to explore good regions of parameter space more efficiently. Although these methods can dramatically reduce the amount of computation needed, they still require hundreds of thousands of simulations. Recent work has instead focused on the use of meta-models or emulators: cheap statistical surrogates that approximate the simulator and can be used in its place to find the posterior distribution. A key question when using these methods concerns the experimental design: where should we next run the simulator in order to maximise our information about the posterior distribution?
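The basic rejection sampler described in the abstract can be sketched in a few lines. The Gaussian toy simulator, uniform prior, mean-based summary statistic, and tolerance below are all illustrative assumptions, not part of the talk:

```python
# Sketch of ABC rejection sampling with a toy Gaussian simulator.
# All model choices here (simulator, prior, summary, tolerance) are
# hypothetical examples chosen for illustration.
import random
import statistics

rng = random.Random(42)

def simulator(theta, n=100):
    # Toy "expensive" simulator: n draws from N(theta, 1).
    # In practice the likelihood of such a model may be intractable.
    return [rng.gauss(theta, 1.0) for _ in range(n)]

def abc_rejection(observed, prior_sample, distance, eps, n_draws=10000):
    """Draw theta from the prior; keep it if the simulated data
    lie within eps of the observed data under the given distance."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        sim = simulator(theta)
        if distance(sim, observed) < eps:
            accepted.append(theta)
    return accepted

# Pretend data generated with true theta = 2.
observed = simulator(2.0)

# Summary statistic: compare sample means (a crude dimension reduction
# of the 100-point dataset to a single number).
dist = lambda a, b: abs(statistics.mean(a) - statistics.mean(b))

post = abc_rejection(observed, lambda: rng.uniform(-5, 5), dist, eps=0.1)
print(len(post), statistics.mean(post))
```

The accepted values approximate draws from the posterior; as the abstract notes, the low acceptance rate is what motivates MCMC/SMC variants and emulator-based approaches.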
