Aug 19 – 20, 2013
Europe/London timezone
Data are expensive to obtain and come with uncertainties. What is the best way to use experimental data in the formulation of theoretical models that attempt to explain the results? This workshop will discuss the use of information theory in the analysis of experiments, and the use of applied mathematics and statistics within the context of theoretical models dealing with current and future data. It follows on from a preliminary meeting held in Krakow in 2012.

The meeting will take place in Room 323 of the Kelvin Building, University of Glasgow. See the campus map on the left panel for further information.

Topics
• Information theory
• Bayesian approaches
• Uncertainty quantification
• Statistical correlations
• Computational techniques

Key Questions
• How can we estimate statistical and systematic errors on calculated quantities?
• How can the uniqueness and usefulness of an observable be assessed, i.e., its information content with respect to current theoretical models?
• How can model-based extrapolations be validated and verified?
• What experimental data are crucial for better constraining current nuclear models?
• How can the statistical tools of nuclear theory help in planning future experiments and experimental programmes?

Aims
• Establish a network of interested people in the U.K.
• Produce a write-up in Journal of Physics G (topical volume).

The scientific method uses experimentation to assess theoretical predictions. Based on experimental data, the theory is modified and can then be used to guide future measurements. The process is repeated until the theory is able to explain the observations and experiment is consistent with theoretical predictions. The positive feedback in the loop "experiment-theory-experiment" can be enhanced if statistical methods and scientific computing are applied to determine the independence of model parameters, the parameter uncertainties, and the errors of calculated observables.
Nuclei communicate with us through a great variety of observables. Some are easy to measure; some take considerable effort and experimental ingenuity. But not every observable has the potential to impact theoretical developments: some are more important than others. Nuclear theory is developing tools to deliver uncertainty quantification and error analysis for theoretical studies as well as for the assessment of new experimental data. Statistical tools can also be used to assess the information content of an observable with respect to current theoretical models, and to evaluate the degree of correlation between different observables. Such technologies are essential for providing predictive capability, estimating uncertainties, and assessing model-based extrapolations, since theoretical models are often applied to entirely new nuclear systems and conditions that are not accessible to experiment.
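As a minimal illustration of the kind of analysis discussed above (not part of the workshop material), the sketch below fits a toy linear model to synthetic data by weighted least squares, extracts the parameter covariance matrix, and propagates the parameter uncertainties to an extrapolated observable. The model, data, and numbers are invented for illustration; real nuclear-theory applications involve far more elaborate models.

```python
import numpy as np

# Synthetic "experimental" data from an invented linear model y = a + b*x,
# with an assumed constant measurement error sigma.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 20)
sigma = 0.5
y = 2.0 + 0.3 * x + rng.normal(0.0, sigma, x.size)

# Weighted least-squares fit: scale the design matrix and data by 1/sigma.
A = np.column_stack([np.ones_like(x), x]) / sigma
theta, *_ = np.linalg.lstsq(A, y / sigma, rcond=None)

# Parameter covariance matrix for the chi-square fit: C = (A^T A)^{-1}.
C = np.linalg.inv(A.T @ A)

# Model-based extrapolation to x0 outside the data range, with the
# propagated uncertainty var(y0) = g^T C g, where g is the gradient of
# the prediction with respect to the parameters.
x0 = 15.0
g = np.array([1.0, x0])
y0 = g @ theta
y0_err = np.sqrt(g @ C @ g)
print(f"y({x0}) = {y0:.2f} +/- {y0_err:.2f}")
```

The off-diagonal elements of `C` quantify the correlation between the fitted parameters, and the growth of `y0_err` with the extrapolation distance illustrates why model-based extrapolations need explicit uncertainty estimates.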
The meeting is supported by the IoP Nuclear Physics Group and is open to all. Contributions from postdoctoral researchers and PhD students are especially welcome. Interested participants are encouraged to contact the organisers.

The meeting will take place at the Kelvin Building, University of Glasgow.

Campus Map:

The School of Physics and Astronomy is located on the main (Gilmorehill) campus, near the Botany Gate. Further travel details can be found here:

Participants are expected to book their own accommodation. The University of Glasgow Conference and Visitor Service (CVSO) provides a hotel booking service:
The SeeGlasgow guide also links to a hotel booking service: