29–30 May 2017
CERN
Europe/Zurich timezone

Session

Implementations & Technologies

29 May 2017, 14:50

IT Auditorium (CERN)

Conveners

  • Chris Roderick (CERN)
  • Andrei Dumitru (CERN)

  1. Paul James Laycock (CERN)
    29/05/2017, 14:50
    Implementations & Technologies

    The conditions data infrastructures of both ATLAS and CMS have to manage several terabytes of data. Distributed computing access to these data requires particular care and attention to handle request rates of up to several tens of kHz. Thanks to the large overlap in use cases and requirements, ATLAS and CMS have worked towards a common solution for conditions data management...

  2. Elizabeth Gallas (University of Oxford (GB)), Gancho Dimitrov (CERN)
    29/05/2017, 15:10
    Implementations & Technologies

    Relational databases are critical backend storage for many systems in ATLAS (online, offline, and on the grid), storing essential data for the processing of past and current data as well as supporting daily operations. These systems have been refined over time into robust applications, optimized and provisioned for established use cases. Relational storage is well suited for many of these...

  3. Nikolay Tsvetkov (CERN)
    29/05/2017, 16:00
    Implementations & Technologies

    The CERN Accelerator Logging Service (CALS) was designed in 2001 and has been in production for 14 years. It is a mission-critical service for the operation of the LHC (Large Hadron Collider).

    CALS uses an Oracle database for storage of technical accelerator data and persists approximately 0.75 petabytes of data coming from more than 1.5 million pre-defined signals. These signals represent data...

  4. Oliver Gutsche (Fermi National Accelerator Lab. (US))
    29/05/2017, 16:20
    Implementations & Technologies

    Experimental Particle Physics has been at the forefront of analyzing the world’s largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called “Big Data” technologies, have emerged from industry and open source projects to support...

  5. Piotr Golonka (CERN)
    29/05/2017, 16:40
    Implementations & Technologies
  6. Serhiy Boychenko (Universidade de Coimbra (PT))
    29/05/2017, 17:00
    Implementations & Technologies

    The Post Mortem system was designed almost a decade ago to enable the collection and analysis of high-resolution, transient data recordings of relevant events, such as beam dumps in the LHC accelerator. Since then, the storage has been constantly evolving, both to accommodate larger datasets and to satisfy new requirements and use cases, not only for the LHC but also for the first machines in the injector complex...

  7. Ignacio Coterillo Coz (CERN)
    29/05/2017, 17:20
    Implementations & Technologies