Dr Catherine Biscarat (LPSC/IN2P3/CNRS, France)
We describe the synergy between CIMENT, a regional multidisciplinary HPC centre, and the infrastructures used for the analysis of data recorded by the ATLAS experiment at the LHC collider and by the D0 experiment at the Tevatron. CIMENT is the High Performance Computing (HPC) centre developed by Grenoble University. It federates several scientific departments and is based on the gridification of a dozen HPC machines with iRODS storage. CIMENT is a medium-scale centre, or a tier-2 in the HPC pyramidal scheme, of about 35 TFlops, which placed it among the top five French tier-2 sites in 2012. The acquisition of an additional machine in spring 2013 will more than double its computing capacity and hence consolidate CIMENT's leading role in the French HPC landscape. CIMENT aims to provide significant computing power for tests and algorithm development before execution on national [tier-1] or European [tier-0] platforms. This profile of resource allocation necessarily implies that not all resources are used at all times.

The main goals of the LHC collider at CERN are the search for a subatomic particle called the Higgs boson and the search for new phenomena beyond the Standard Model of particle physics. The observation of a Higgs-like boson was reported last year by ATLAS and CMS, the general-purpose experiments operating at the LHC. Current research focuses on the characterisation of the newly discovered boson and on the search for new phenomena. Researchers at LPSC in Grenoble are leading the search in ATLAS for one type of new phenomenon, namely additional spatial dimensions, which are likely to manifest themselves in LHC collisions. Given the rich multitude of physics studies proceeding in parallel in the ATLAS collaboration, one of the limiting factors in the timely analysis of ATLAS data is the availability of computing resources for physics analysis. Another LPSC team faces a similar limitation.
This team is leading the ultimate precision measurement based on the data from the D0 experiment: the precise measurement of the W boson mass, which yields an indirect constraint on the mass of the Higgs boson and complements the direct search at the LHC. A relative precision of 10^-4 in the measurement of the W boson mass is needed to help elucidate the nature of the newly discovered Higgs-like particle. Such a measurement requires simulations of unprecedented precision, and therefore considerable computing power. The limitation in available computing power becomes problematic in the months preceding the international conferences at which major results are released.

The sharing of resources between different scientific fields, such as those discussed in this article, constitutes a valuable synergy, because the spikes in demand for computing resources are uncorrelated in time between the fields (HPC and HEP). The results of this collaboration between fields are manifest in the timely delivery of HEP results eagerly awaited by both the particle physics community and the general public.