Dorothea vom Bruch | CPPM/CNRS, France
Dorothea vom Bruch is a research scientist with the Centre National de la Recherche Scientifique (CNRS) at the Particle Physics Center in Marseille (CPPM).
She is a passionate physicist, driven by two major questions: What does particle physics teach us about the laws of nature? More explicitly, is lepton flavor universal, as predicted by the Standard Model of particle physics? And how can we use modern computing architectures, such as graphics processing units (GPUs), to handle the huge data stream produced by particle physics experiments?
To test lepton universality, she has worked on three different experiments: Pienu (TRIUMF, Canada), Mu3e (PSI, Switzerland) and now LHCb (CERN, Switzerland). On the latter two, she has developed real-time selection systems running entirely on GPUs. The Allen system, the first software trigger stage implemented on GPUs for LHCb, has been used for data taking since 2022.
Daniel Campora | Maastricht University, the Netherlands
Daniel Cámpora received his PhD in Computer Engineering from the University of Sevilla and worked at CERN from 2010 to 2019. He is an expert in GPU computing and co-leads the Allen project, a GPU trigger application for LHCb. He has worked in the Online teams of ATLAS and LHCb and has several years of experience with Data Acquisition Systems under the high-throughput, real-time conditions that occur at the LHC.
He likes to keep up with the latest processor developments and software techniques, multicore and manycore alike. He is now Assistant Professor at Maastricht University and an active researcher of reconstruction algorithm techniques in LHCb.
Andrzej Nowak
Andrzej Nowak has spent the last 15 years at the juncture of technology, business and innovation. Between 2007 and 2014 he worked at CERN openlab, a collaboration of CERN and industrial partners such as Google, HP, Huawei, Intel, Oracle and Siemens. He was also part of the openlab CTO office, where he helped set up next-generation technology projects for CERN.
More recently, Andrzej founded a small technology and innovation consultancy (TIK Services) as well as a fintech start-up. In the last few years, he worked in management consulting in finance and in innovation management. He co-founded various schools and course series, including the thematic CSC, that trained over 2,000 students in 17 countries. Andrzej holds a PhD from EPFL.
Danilo Piparo | CERN
Danilo is an experimental HEP physicist and has worked at CERN in the Experimental Physics department for a decade.
He coordinates the Offline Software and Computing team of CMS, which is responsible for delivering the experiment's software and for the distributed processing of its data.
He previously held responsibilities in the parallelisation of the CERN software suite, most notably working on Gaudi and ROOT, as well as contributing to the initial parallelisation and vectorisation of the CMS software. He obtained a PhD in Particle Physics at the Karlsruhe Institute of Technology, Germany.
Sebastien Ponce | CERN
Sebastien Ponce is a member of the EP department at CERN, where he works on the LHCb software framework. He leads the LHCb software upgrade targeting LHCb Run 3, which aims at parallelizing, vectorizing and generally optimizing the LHCb code.
He previously spent 10 years in the CERN IT department, working on mass storage solutions as the lead developer of the CERN Advanced Storage Manager (CASTOR), the software holding all of CERN's physics data (> 150 PB). He obtained his PhD at EPFL, working on the parallelization of the LHCb computing software framework. He originally graduated as an engineer from the Ecole Nationale Superieure des Telecoms in Paris and, before that, from the Ecole Polytechnique Paris.