Oct 16 – 20, 2017
Asia/Tokyo timezone

Optimising the resource needs for the LHC computing: ideas for a common approach

Oct 18, 2017, 12:15 PM


1-1 Oho, Tsukuba, Ibaraki 305-0801 Japan (36°09'01.0"N 140°04'28.1"E)
Computing & Batch Services


Andrea Sciaba (CERN)


The increase in the scale of LHC computing expected for Run 3, and even more so for Run 4 (HL-LHC), over the next 10 years will almost certainly require radical changes to the computing models and the data processing of the LHC experiments. Translating the requirements of the physics programme into resource needs is an extremely complicated process, subject to significant uncertainties, and currently possible only with complex tools and procedures developed internally by each LHC collaboration. Recently there has been much interest in developing a common model for estimating resource costs, which would benefit the experiments, WLCG and the sites, in particular in understanding and optimising the path towards HL-LHC. For example, such a model could be used to estimate the impact of changes in the computing models or to optimise resource allocation at the site level. In this presentation we outline some preliminary ideas on how this could be achieved, with a special focus on the site perspective, and provide some real-world examples.
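As a purely illustrative sketch of what "translating the physics programme into resource needs" can look like at its simplest, the toy function below converts an event count and a per-event CPU cost into an aggregate CPU requirement. All parameter names and numbers here are invented for illustration; they are not the model discussed in the talk, which each collaboration currently implements with far more complex internal tools.

```python
# Toy cost-model sketch (hypothetical numbers, not from the talk):
# convert a workload description into a CPU requirement in HS06-years,
# the kind of quantity sites and WLCG pledge and account for.

def cpu_needed_hs06_years(events, hs06_sec_per_event, cpu_efficiency):
    """CPU required, in HS06-years, to process `events` events.

    events            -- number of events to process
    hs06_sec_per_event -- CPU cost per event in HS06-seconds
    cpu_efficiency    -- fraction of wall-clock time spent on CPU (0..1)
    """
    seconds_per_year = 365.25 * 24 * 3600
    return events * hs06_sec_per_event / cpu_efficiency / seconds_per_year

# Hypothetical Run-3-like workload: 1e10 events at 50 HS06.s/event,
# processed with 80% CPU efficiency at the sites.
need = cpu_needed_hs06_years(1e10, 50.0, 0.8)
print(f"CPU needed: {need:.0f} HS06-years")
```

A common model would refine each of these inputs (per-workflow event costs, efficiencies per site, storage and network terms) so that, for instance, the effect of a change in a computing model shows up as a change in one parameter rather than a full re-estimate.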

Desired length 20

Primary authors

Jose Flix Molina (Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas), Markus Schulz (CERN), Andrea Sciaba (CERN)

Presentation materials