LHC Post Mortem Workshop - I

Europe/Zurich
874/1-011 (CERN)

Adriaan Rijllart (CERN), Robin Lauckner (CERN), Rüdiger Schmidt (CERN)
Description
This is the first workshop on the recording and analysis of data after an event in the LHC, such as a magnet quench or a beam dump (Post Mortem). Data will come from transient recorders, from the logging systems, from alarms, and probably from other sources.

The main aims of the workshop are:
- to verify readiness of hardware and software systems for imminent powering of the cold circuits
- to develop the roadmap for beam operation in 2007 and beyond.

Many groups have started to prepare their systems for the different phases of commissioning and operation. The workshop will review their activities, identify open issues and help to define the future roles and responsibilities.

Participants
  • Adriaan Rijllart
  • Alessandro Raimondo
  • Andrew Butterworth
  • Andrzej SIEMKO
  • Antonio Vergara Fernandez
  • Bernd Dehning
  • Blanca Perea Solano
  • Boris Bellesia
  • Boris Khomenko
  • Brennan Goddard
  • Bruno Puccio
  • Cedric Charrondiere
  • Chris Roderick
  • David Nisbet
  • David Widegren
  • Enrico Bravin
  • Eric Veyrunes
  • Etienne Carlier
  • Eugenia Hatziangeli
  • Felix Rodriguez Mateos
  • Frank Zimmermann
  • Frederic Gicquel
  • Frederick Bordry
  • Georges-Henry Hemelsoet
  • Gianluigi Arduini
  • Giulia Bellodi
  • Hermann Schmickler
  • Herve Milcent
  • Hubert Reymond
  • Jacques Lettry
  • Jan Uythoven
  • Javier Serrano
  • Jean-Jacques GRAS
  • Jean-Pierre Koutchouk
  • John Jowett
  • Jorg Wenninger
  • Julian Lewis
  • Karl Hubert Mess
  • Karol Cwalina
  • Katarina Sigerud
  • Kudryavtsev Dmitriy
  • Lars Jensen
  • Laurette Ponce
  • Louis Walckiers
  • Luigi Serio
  • Magali Gruwe
  • Mario Batz
  • Markus Zerlauth
  • Massimo Giovannozzi
  • Matteo Solfaroli Camillocci
  • Michael Draper
  • Michel Jonker
  • Mirko Pojer
  • Paul Collier
  • Pierre Charrue
  • Pierre Pugnat
  • Quentin King
  • Ralph Assmann
  • Rasoaseheno Dit Michel Eric
  • Reiner Denz
  • Reyes Alemany Fernandez
  • Rhodri Jones
  • Roberto Losito
  • Robin Lauckner
  • Roger Bailey
  • Roger Rabehl
  • Roland Garoby
  • Ronny Billen
  • Rüdiger Schmidt
  • Stefano Redaelli
  • Stephan Petit
  • Stephane Bart Pedersen
  • Stephen Page
  • Thiesen Hugues
  • Thomas Pettersson
  • Thomas Weiler
  • Verena Kain
  • Volker Mertens
  • Walter Venturini Delsolaro
  • Tuesday 16 January
    • 09:00 09:30
      Introduction 874/1-011

      General Introduction with the main aims of the Post Mortem System

      slides
    • 09:30 12:00
      Session 1 874/1-011

      What exists - PM System, Logging, Alarms

      • 09:30
        PM system architecture, front-ends, servers, triggering 30m
        Speaker: Robin Lauckner (CERN)
        Slides
      • 10:00
        PM Data Collection and Storage 30m
        This talk will cover the following items: the PM data model, the client API, the PM server, data processing and SDDS conversion, performance and scalability, and the current status.
        Speaker: Nikolai Trofimov (CERN)
        Slides
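
As an aside on the PM data model and client API mentioned in the talk above, the following is a minimal, purely illustrative sketch of what a front-end buffer pushed to the PM server might contain. All class, field and signal names (PMBuffer, I_MEAS, the JSON serialisation standing in for the SDDS conversion) are assumptions for illustration, not the actual PM API.

```python
import json
import time
from dataclasses import dataclass, asdict, field
from typing import Dict, List

# Hypothetical sketch of a PM client buffer; the real PM data model and
# client API are defined by the PM project and will differ in detail.
@dataclass
class PMBuffer:
    system: str                       # e.g. "QPS", "BLM", "RF"
    source: str                       # equipment / front-end identifier
    trigger_stamp_ns: int             # timestamp of the PM trigger event
    signals: Dict[str, List[float]] = field(default_factory=dict)

    def to_record(self) -> str:
        """Serialise the buffer into a self-describing record (JSON here,
        standing in for the SDDS conversion done on the server side)."""
        return json.dumps(asdict(self))

# Usage: a front-end freezes its buffer on a PM trigger and pushes the record.
buf = PMBuffer(system="DEMO", source="fe.sector78.crate01",
               trigger_stamp_ns=time.time_ns(),
               signals={"I_MEAS": [11850.0, 11849.7, 0.0]})
record = buf.to_record()              # in reality this would go to the PM server
print(len(record), "bytes queued for the PM server")
```
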
      • 10:30
        Coffee break 30m
      • 11:00
        SDDS to LabVIEW, the path from client data to viewing and analysis 30m
        The presentation will cover: data arrival and event building; the SDDS format and its implementation for PM; the PMX method for data description and control; the generic SDDS converter and its possible enhancement; the LabVIEW application/framework for individual data modules (PMM), including the PMM data locator, the PMM SDDS ASCII/binary loader and the internal data classes; data viewing; data analysis; automatic analysis; diagnostic tools; and conclusions.
        Speaker: Boris Khomenko (Joint Institute for Nuclear Research (JINR))
        Slides
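
The converter described above turns PM client data into SDDS files for viewing and analysis in LabVIEW. As a rough illustration of what a consumer sees in a self-describing SDDS ASCII header, here is a minimal header reader; it is a sketch only, assumes simple one-line namelists, and ignores binary mode, string quoting and most of the SDDS specification.

```python
import re
from typing import List, Tuple

def read_sdds_header(path: str) -> Tuple[List[str], List[str]]:
    """Very small SDDS ASCII header reader: returns (parameter names, column names).

    Illustrative sketch only; it stops at the &data namelist and does not
    handle data pages, binary mode or multi-line namelist definitions.
    """
    params, columns = [], []
    with open(path, "r") as f:
        for line in f:
            line = line.strip()
            if line.startswith("&parameter"):
                m = re.search(r"name\s*=\s*([\w.:/-]+)", line)
                if m:
                    params.append(m.group(1))
            elif line.startswith("&column"):
                m = re.search(r"name\s*=\s*([\w.:/-]+)", line)
                if m:
                    columns.append(m.group(1))
            elif line.startswith("&data"):
                break          # header finished, data section follows
    return params, columns
```
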
      • 11:30
        Alarms in relation with Post Mortem 15m
        LASER will provide alarm event information to the PM system in the case of a PM event. A first solution, agreed between the LASER and PM teams at the end of 2005, will be described. Since then, the LASER system has evolved, which opens up other possibilities for integration. These solutions will be discussed, as well as the questions they give rise to.
        Speaker: Katarina Sigerud (CERN)
        Slides
      • 11:45
        Logging data in relation with PM and archiving 15m
        This presentation will explain briefly the purpose, scope and architecture of the LHC Logging Service. More detail will be given on the interaction with the Post-Mortem system, including naming conventions and their enforcement, the data lifetime policy, and the combination and correlation of slow logging data with external transient data. Finally, some ideas and possibilities will be discussed, such as the use of the Measurement Service and the storage of PM summary information.
        Speaker: Ronny Billen (CERN)
        Slides
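
One aspect mentioned in the abstract above is the correlation of slow logging data with external transient (PM) data. The sketch below illustrates the idea of selecting logged samples in a time window around a PM trigger; the function name, the in-memory representation and the window lengths are assumptions for illustration, not the Logging Service interface.

```python
from bisect import bisect_left, bisect_right
from typing import List, Tuple

def logged_points_around_trigger(points: List[Tuple[float, float]],
                                 trigger_s: float,
                                 before_s: float = 600.0,
                                 after_s: float = 60.0) -> List[Tuple[float, float]]:
    """Select slow-logged (timestamp, value) samples around a PM trigger time.

    Illustrative only: window lengths are arbitrary, and the points are
    assumed to be already sorted by timestamp.
    """
    times = [t for t, _ in points]
    lo = bisect_left(times, trigger_s - before_s)
    hi = bisect_right(times, trigger_s + after_s)
    return points[lo:hi]
```
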
    • 12:00 14:00
      Lunch 866 - Rest #3

    • 14:00 16:30
      Session 2 874/1-011

      Cold circuits – data, analysis

      • 14:00
        Powering of the SC circuits: procedures and strategies for circuit validation 30m
        The commissioning of the warm part of the superconducting circuits of the LHC started in 2005 with the short-circuit tests of the power converters, in which the non-superconducting elements of the circuits are commissioned together with their associated general services. Once the circuits are at their operating temperature and before powering them, the interlock system will be validated (PIC tests). The overall commissioning of the superconducting circuits will start in February 2007 with the first powering up to nominal current of all the magnets in Sector 7-8. This talk will introduce the sequence of steps and detailed procedures leading to the powering of the different superconducting circuit types, the powering strategies designed to be ready for 450 GeV beam commissioning on schedule, and the needs of the hardware commissioning team for diagnostics and for ensuring the integrity of the hardware.
        Speaker: Antonio Vergara Fernandez (Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT))
      • 14:30
        Analysis requirements for the SC magnet systems 30m
        Effective commissioning of the LHC hardware demands a well-designed set of high-level software tools for equipment performance analysis and validation. The challenge includes a large amount of equipment, integrating heterogeneous systems such as powering, energy extraction, the distributed magnet protection systems, cryogenics and vacuum with their distributed instrumentation, as well as the technical services. Various operational conditions must be dealt with, such as the superconducting magnet quench phenomenon and quench effects, including their constraints on the next powering cycle, given the destructive energy stored in the magnet system. The level of commissioning of the main ring superconducting magnet system will depend not only on the time allocated to the commissioning, but also on the availability of the high-level software analysis tools. The tools required for the various phases of the LHC start-up will be elucidated and discussed. The role of the newly created Main Ring Magnet System Performance Panel (MPP) in defining the high-level software tools for equipment commissioning and performance analysis will also be briefly addressed.
        Speaker: Andrzej SIEMKO (CERN)
        Slides
      • 15:00
        Present status of the individual systems analysis applications 30m
        Three components of the Post Mortem Analysis are already used by the equipment support teams. This talk will present the status and the modes of operation for each of them. Then the present architecture will be detailed, followed by the implementation dedicated to the Hardware Commissioning.
        Speaker: Hubert Reymond (CERN)
        Slides
      • 15:30
        Tea break 30m
      • 16:00
        How do we tackle the extended requirements? 30m
        The first Post-Mortem requirements have come from the needs of the individual systems involved in the first phase of hardware commissioning, using short-circuit tests. The second phase of powering the circuits, involving systems such as vacuum, cryogenics and the DFBs, will extend the analysis requirements to a new scale. This talk will show how we plan to include these new analysis requirements in the present framework, how the framework interfaces with the sequencer, and how the analysis could be triggered by spontaneous events. Important aspects such as modularity, flexibility, sequencing and scalability will be covered.
        Speaker: Adriaan Rijllart (CERN)
        Slides
    • 16:30 17:00
      General Discussion 874/1-011

  • Wednesday 17 January
    • 09:00 12:00
      Session 3 874/1-011

      Operation with beam - PM requirements

      • 09:00
        Beam quality checks at injection 30m
        For each beam injection into the LHC a well-defined series of beam quality checks needs to be made, starting in the SPS just before extraction and continuing in the LHC immediately after injection. These checks will depend on the beam type, intensity and position in the filling sequence, and will use transient data which must be acquired and analysed at the appropriate time and within a specified time window. The requirements in terms of functionality, response times and scope are described, and the equipment subsystems identified. Potential issues are discussed.
        Speaker: Verena Kain (CERN)
        Slides
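
To illustrate the kind of beam-type-dependent tolerance check described above, here is a small sketch; the parameter names, the tolerance table and the threshold values are invented for illustration and do not represent the actual injection quality checks.

```python
# Hypothetical tolerances per beam type; the real limits are defined by the
# machine protection and operation teams, not by this sketch.
TOLERANCES = {
    "pilot":   {"max_trajectory_mm": 4.0, "min_intensity": 1e9},
    "nominal": {"max_trajectory_mm": 1.5, "min_intensity": 1e11},
}

def injection_quality_ok(beam_type: str, peak_trajectory_mm: float,
                         intensity: float) -> bool:
    """Return True if the injected beam passes these illustrative quality checks."""
    tol = TOLERANCES[beam_type]
    return (abs(peak_trajectory_mm) <= tol["max_trajectory_mm"]
            and intensity >= tol["min_intensity"])

print(injection_quality_ok("pilot", peak_trajectory_mm=2.3, intensity=5e9))  # True
```
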
      • 09:30
        Beam dump XPOC analysis 30m
        Each dump action must be followed by an XPOC, which is launched automatically and is designed to verify that the dump was correctly executed. If an anomaly is discovered during these tests, the XPOC must withhold the User Permit to the BIS (via a software channel). The XPOC comprises beam instrumentation and other signals which will come from the logging and Post-Mortem systems, or directly from the equipment. The XPOC must be triggered by the dump action, must retrieve and analyse key data, compare the relevant parameters against specified reference values, and then give or withhold the User Permit according to the result. The requirements in terms of functionality, response times and scope are described, and the equipment subsystems identified. Data types, reduction, volumes and rates are estimated.
        Speaker: Brennan Goddard (CERN)
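
The core of the XPOC described above is a comparison of measured parameters against reference values, followed by giving or withholding the software User Permit. A minimal sketch of that logic follows; the signal names, reference values and tolerances are illustrative assumptions only.

```python
from typing import Dict

# Illustrative reference values and relative tolerances; the real XPOC
# references and signal names come from the beam dumping system experts.
REFERENCE = {"MKD_voltage_kV": 30.0, "dump_sweep_length_mm": 340.0}
REL_TOL   = {"MKD_voltage_kV": 0.02, "dump_sweep_length_mm": 0.05}

def xpoc_user_permit(measured: Dict[str, float]) -> bool:
    """Give (True) or withhold (False) the software User Permit to the BIS,
    depending on whether all measured parameters are within tolerance."""
    for name, ref in REFERENCE.items():
        if name not in measured:
            return False                     # missing data: withhold permit
        if abs(measured[name] - ref) > REL_TOL[name] * abs(ref):
            return False                     # out of tolerance: withhold permit
    return True

print(xpoc_user_permit({"MKD_voltage_kV": 30.2, "dump_sweep_length_mm": 339.0}))
```
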
      • 10:00
        Emergency dump Post Mortem 30m
        After an emergency dump a general Post-Mortem request will be issued to acquire transient data from a variety of systems. The analysis of a Post-Mortem event may take from minutes to many months, depending on the desired level of detail. Key data must, however, be presented in a way which allows simple and efficient fault-finding. Operation crews must be given clear information indicating whether operation may continue or whether expert interventions are required after the emergency beam dump. The key equipment and instrumentation data required to identify the source and causes of an emergency abort are described.
        Speaker: Jorg Wenninger (CERN)
        Slides
      • 10:30
        Coffee break 30m
      • 11:00
        Transient beam data acquisition 30m
        In addition to systematic transient data acquisition, operation of the LHC will also require the possibility to make ad-hoc acquisitions of some transient beam and possibly equipment data, in order to diagnose and solve specific problems and to cope with unforeseen difficulties. An attempt is made to outline the different transient data required for general operational purposes, together with the requirements for triggering and acquisition which are distinct from the general Post-Mortem data.
        Speaker: Canceled
      • 11:30
        Post Mortem acquisition triggering 30m
        A post-mortem timing event distributed by the LHC machine timing system is used to freeze the PM buffers of a large fraction of the LHC equipment. This event must be generated automatically whenever the BIS issues a beam dump request by changing the state of the beam permit signal. This presentation outlines the present ideas on how to generate the PM timing event. The issue of PM event suppression in the case of single-beam dumps or special operation modes like 'inject and dump' will be addressed.
        Speaker: Julian Lewis (CERN)
        Abstract
        Slides
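
Below is a minimal sketch of the decision described above: emit the PM timing event when the beam permit is removed, unless a special mode such as 'inject and dump' suppresses it. The function name, the mode names and the suppression list are assumptions; the real decision is taken in the BIS/timing hardware and the suppression rules remain an open issue.

```python
def should_send_pm_event(beam_permit_before: bool, beam_permit_after: bool,
                         operation_mode: str) -> bool:
    """Decide whether a PM timing event should be broadcast.

    Sketch only: the actual logic lives in the timing/BIS hardware, and the
    list of suppressed modes is an assumption for illustration.
    """
    SUPPRESSED_MODES = {"inject_and_dump"}
    permit_removed = beam_permit_before and not beam_permit_after
    return permit_removed and operation_mode not in SUPPRESSED_MODES

print(should_send_pm_event(True, False, "physics"))           # True  -> PM event
print(should_send_pm_event(True, False, "inject_and_dump"))   # False -> suppressed
```
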
    • 12:00 13:45
      Lunch 866 - Rest #3

    • 13:45 16:15
      Session 4 874/1-011

      Data providers, volume, type of analysis.

      • 13:45
        Overview of providers 20m
        Post Mortem will be the key to mastering the full complexity of LHC operation and the interaction between systems. Many systems will be involved in the full optimisation and understanding of performance. Today a few systems are providing data to validate and understand hardware commissioning. This must be extended, giving priority to obtaining the essential information related to achieving first collisions. This talk will review the systems involved, discuss the nature of the information to be provided and attempt to identify some priorities. The vacuum system will be examined to demonstrate how these demands are being met.
        Speaker: Robin Lauckner (CERN)
      • 14:05
        Beam Instrumentation 30m
        The key beam instruments for post-mortem diagnostics in the LHC include:
        • the beam position monitors (BPM),
        • the beam loss monitors (BLM),
        • the beam current transformers (BCT),
        • the non-destructive beam profile monitors,
        • the tune measurement,
        • the abort gap monitors.
        Turn-by-turn (or highest time resolution) data will be provided for all systems for the equivalent of 1000 turns before the post-mortem trigger. Coarser data will also be provided for a time interval of around 20 seconds before the trigger, as well as 10-20 samples after the trigger. The data volume depends on the PM data sent to the PM server; for instance, the 64 BPM systems will each send 36 samples of 1000 points, approximately 300 kB per system. An external trigger (BST system) will be required to freeze the post-mortem buffers.
        Speaker: Stephane Bart Pedersen (CERN)
        Slides
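
The quoted figure of roughly 300 kB per BPM system can be checked with simple arithmetic, assuming (this is an assumption, not stated above) that each point is stored as an 8-byte value:

```python
# Back-of-the-envelope check of the quoted BPM post-mortem data volume.
samples_per_system = 36        # buffers per BPM system (from the abstract)
points_per_sample  = 1000      # ~1000 turns of data
bytes_per_point    = 8         # assumed encoding (8-byte floating point)
systems            = 64

per_system_kB = samples_per_system * points_per_sample * bytes_per_point / 1024
total_MB      = per_system_kB * systems / 1024
print(f"{per_system_kB:.0f} kB per system, ~{total_MB:.1f} MB for all 64 systems")
# -> roughly 280 kB per system, consistent with the ~300 kB quoted above.
```
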
      • 14:35
        R.F. 30m
        The RF acceleration (ACS) and transverse damper (ADT) systems will supply post-mortem data at various acquisition rates. The PLCs controlling the power systems acquire at a few Hz, while high-speed digitizers and acquisition buffers embedded in the low-level hardware acquire transient signals at 80 MSamples/s over time periods ranging from a few milliseconds to several hundred milliseconds. The high-speed acquisitions in particular will result in high volumes of data, and some local data analysis and reduction may be necessary to alleviate this. An overview of the available data signals will be presented, along with tentative requirements on data analysis, logging and alarms.
        Speaker: Dr Andrew Butterworth (CERN)
        Slides
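
The abstract above notes that the 80 MSamples/s acquisitions may need local reduction before being sent to the PM system. The sketch below shows one generic possibility, a per-block min/max envelope; both the block size and the method itself are illustrative assumptions, not the chosen RF implementation.

```python
from typing import List, Tuple

def minmax_reduce(samples: List[float], block: int = 1000) -> List[Tuple[float, float]]:
    """Reduce a high-rate transient to per-block (min, max) pairs.

    One possible local reduction for fast acquisitions; block size and method
    are assumptions for illustration only.
    """
    out = []
    for i in range(0, len(samples), block):
        chunk = samples[i:i + block]
        out.append((min(chunk), max(chunk)))
    return out

# 10 ms at 80 MS/s is 800 000 samples; reduced here to 800 (min, max) pairs.
reduced = minmax_reduce([0.0] * 800_000)
print(len(reduced), "envelope points")
```
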
      • 15:05
        Kickers 20m
        Reliable operation of the LHC injection, tune/aperture and LBDS kicker systems relies on continuous on-line and off-line surveillance of their critical operational characteristics. Different acquisition techniques such as trend logging, shot-by-shot logging and fast transient recording will be used to acquire and record the diverse types of signals existing within the kicker systems. Correlation between the acquired data will be achieved through precise time-stamping of the data acquisition, coupled with an internal management of the possible acquisition trigger sources. The structure of the different post-mortem buffers will be presented for each kicker system, with an estimation of their volume and a description of the different acquisition, analysis and recording mechanisms. In addition, the triggering logic will be described, and the remaining open issues, linked mainly to the distribution of post-mortem event(s), will be highlighted.
        Speaker: Etienne Carlier (CERN)
        Slides
      • 15:25
        Tea break 20m
      • 15:45
        Collimators and movable objects 30m
        The LHC collimation system is responsible for providing clean beam conditions and hence for assuring the protection of the equipment in the LHC. A failure of the collimation system may trigger a beam dump to avoid magnet quenches. The post mortem data of the collimation system supplies the following information:
        • demanded and actual positions of all collimator jaws, with millisecond accuracy (information on the actual positions is provided by resolvers, position and gap LVDTs, as well as end switches and anti-collision switches),
        • temperatures of the jaws,
        • jaw vibrations over a period of a few seconds before and after the beam dump,
        • BLM transient data during a collimator movement,
        • command history.
        The first analysis of the collimator post mortem data must assure that there were no internal failures in maintaining the actual collimator positions. A second analysis, in combination with information from beam loss, beam position and beam profile monitors, should validate that the collimation efficiency was as required.
        Speaker: Michel Jonker (CERN)
        Slides
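
The first-level analysis described above has to verify that the actual jaw positions followed the demanded ones. A minimal sketch of such a check is shown below; the tolerance value and function name are illustrative assumptions.

```python
from typing import List

def jaw_positions_ok(demanded_mm: List[float], actual_mm: List[float],
                     tolerance_mm: float = 0.05) -> bool:
    """First-level collimator PM check: actual jaw positions must follow the
    demanded positions within a tolerance. The tolerance is illustrative."""
    return all(abs(d - a) <= tolerance_mm
               for d, a in zip(demanded_mm, actual_mm))

# Millisecond-sampled demanded vs. actual positions around the dump (toy data).
print(jaw_positions_ok([6.00, 6.00, 6.00], [6.01, 6.00, 5.99]))   # True
print(jaw_positions_ok([6.00, 6.00, 6.00], [6.01, 6.40, 5.99]))   # False
```
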
    • 16:15 17:45
      Discussion Session 874/1-011

      874/1-011

      CERN

      Open issues: structure, technology, roadmap, priorities

      slides