Big data management - From CERN/LHC to personalised medicine

May 2, 2016, 11:50 AM


Alberto Di Meglio (CERN)


The transformations in Information and Communication Technology over the past 20 years have given rise to a new scientific research paradigm in which data-intensive, large-scale projects combine experiment, theory and computing to address fundamental questions about ourselves and our universe. The large-scale computing and data-analysis infrastructure set up by the High Energy Physics (HEP) community to support the research of the LHC experiments at CERN and at hundreds of collaborating facilities worldwide is one of the foremost examples of this paradigm. Today, however, HEP is no longer the only field producing increasingly large amounts of data. Biomedical and healthcare research and practice could benefit from a broader use of big-data analysis and simulation platforms. The biomedical domain, however, requires a new focus on the careful governance and use of data and information, respecting the social and human value of such data, and on the design and deployment of collaborative frameworks in which medical research, clinical practice and modern information technologies can constructively interact to deliver personalised care. This talk briefly describes the state of the art of large-scale data-analytics platforms as used in the HEP community and the ongoing work to adapt and extend such platforms for the benefit of medical applications.