- Robin Lauckner (CERN), 16/01/2007, 09:30
- Nikolai Trofimov (CERN), 16/01/2007, 10:00
  This talk will cover the following items:
  - PM data model
  - Client API
  - PM server
  - Data processing and SDDS conversion
  - Performance and scalability
  - Current status
- Boris Khomenko (Joint Institute for Nuclear Research (JINR)), 16/01/2007, 11:00
  The presentation will cover:
  - Data arrival and event building
  - SDDS format and its implementation for PM
  - PMX method for data description and control
  - SDDS converter: generic version
  - Possible enhancement of the converter
  - LabVIEW application/framework for individual data module (PMM)
  - PMM data locator
  - PMM SDDS ascii/binary loader
  - Internal data classes
  - Data viewing
  - ...
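To illustrate the kind of SDDS conversion these two talks refer to, here is a minimal sketch of writing one transient buffer as an ASCII SDDS page. The parameter and column names (`Timestamp`, `t`, `I`) are illustrative assumptions, not the actual PM data schema; only the generic SDDS layout (header namelist commands, then one page of parameter values, row count, and rows) is real.

```python
# Minimal sketch of an ASCII SDDS writer for a transient PM buffer.
# Parameter/column names are hypothetical; the SDDS page layout is standard:
# header commands, then parameter values, a row count, and the data rows.

def write_sdds_ascii(path, timestamp, times, currents):
    """Write one SDDS page: a string timestamp parameter plus (t, I) columns."""
    assert len(times) == len(currents)
    with open(path, "w") as f:
        f.write("SDDS1\n")
        f.write("&parameter name=Timestamp, type=string, &end\n")
        f.write("&column name=t, type=double, units=s, &end\n")
        f.write("&column name=I, type=double, units=A, &end\n")
        f.write("&data mode=ascii, &end\n")
        f.write(f'"{timestamp}"\n')   # parameter values, one per line
        f.write(f"{len(times)}\n")    # number of rows in this page
        for t, i in zip(times, currents):
            f.write(f"{t:.6e} {i:.6e}\n")

write_sdds_ascii("pm_buffer.sdds", "2007-01-16T10:00:00",
                 [0.0, 0.001, 0.002], [100.0, 99.5, 0.0])
```

A generic converter of the kind described above would derive the header commands from a data description (the PMX role in the talk) rather than hard-coding them as here.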
- Katarina Sigerud (CERN), 16/01/2007, 11:30
  LASER will provide alarm event information to the PM system in the case of a PM event. A first solution, agreed between the LASER and PM teams at the end of 2005, will be described. Since then, the LASER system has evolved, which opens up other possibilities for integration. These solutions will be discussed, as well as the questions they give rise to.
- Ronny Billen (CERN), 16/01/2007, 11:45
  This presentation will briefly explain the purpose, scope and architecture of the LHC Logging Service. More detail will be given on the interaction with the Post-Mortem system, including naming conventions and enforcement, data lifetime policy, and the combining and correlation of slow logging data and external transient data. Finally, some ideas and possibilities will be discussed, such as the...
- Antonio Vergara Fernandez (Cent. de Investigac. Energeticas Medioambientales y Tecnol. (CIEMAT)), 16/01/2007, 14:00
  The commissioning of the warm part of the superconducting circuits of the LHC started in 2005 with the short-circuit tests of the power converters, where the non-superconducting elements of the circuits are commissioned together with their associated general services. Once the circuits are at their operating temperature and before powering them, the interlock system will...
- Andrzej Siemko (CERN), 16/01/2007, 14:30
  Effective commissioning of the LHC hardware demands a well-designed set of high-level software tools, required for equipment performance analysis and validation. The challenge includes a large amount of equipment, integrating heterogeneous systems such as powering, energy extraction, distributed magnet protection systems, cryogenics and vacuum, with their distributed...
- Hubert Reymond (CERN), 16/01/2007, 15:00
  Three components of the Post Mortem Analysis are already used by the equipment support teams. This talk will present the status and the modes of operation for each of them. Then the present architecture will be detailed, followed by the implementation dedicated to the Hardware Commissioning.
- Adriaan Rijllart (CERN), 16/01/2007, 16:00
  The first Post-Mortem requirements have come from the needs of the individual systems involved in the first phase of hardware commissioning, using short-circuit tests. The second phase of powering the circuits, involving systems such as vacuum, cryogenics and DFBs, will extend the requirements of analysis to a new scale. This talk will show how we plan to include these new analysis...
- Verena Kain (CERN), 17/01/2007, 09:00
  For each beam injection into the LHC, a well-defined series of beam quality checks needs to be made, starting in the SPS just before extraction and in the LHC immediately after injection. These checks will be dependent on the beam type, intensity and position in the filling sequence, and will use transient data which must be acquired and analysed at the appropriate time and within...
- Brennan Goddard (CERN), 17/01/2007, 09:30
  Each dump action must be followed by an XPOC, which is launched automatically and is designed to verify that the dump was correctly executed. If an anomaly is discovered during these tests, the XPOC must withhold the User Permit to the BIS (via a software channel). The XPOC comprises beam instrumentation and other signals which will come from the logging and Post-Mortem systems, or...
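The XPOC decision logic described in this abstract can be sketched in a few lines: run every post-operational check against the dump record, and raise the permit only if all pass. The check names, thresholds, and the dictionary-based record below are hypothetical stand-ins, not the real BIS or XPOC interfaces.

```python
# Sketch of XPOC-style logic: all checks must pass, otherwise the User
# Permit would be withheld. Check names and thresholds are invented here.

def run_xpoc(dump_record, checks):
    """Run each post-operational check; permit is OK only if none fail."""
    failures = [name for name, check in checks.items()
                if not check(dump_record)]
    return (len(failures) == 0, failures)

# Hypothetical checks on a hypothetical dump record.
checks = {
    "beam_fully_extracted": lambda r: r["residual_intensity"] < 1e9,
    "kicker_waveform_ok":   lambda r: abs(r["kick_strength"] - 1.0) < 0.05,
}

ok, failed = run_xpoc({"residual_intensity": 5e8, "kick_strength": 1.02},
                      checks)
# ok is True for this record; on any failure the real system would
# withhold the User Permit to the BIS via a software channel.
```

The point of the structure is that adding a new instrument signal to the XPOC means adding one named check, without touching the permit decision itself.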
- Jorg Wenninger (CERN), 17/01/2007, 10:00
  After an emergency dump, a general Post-Mortem request will be issued to acquire transient data from a variety of systems. The analysis of a Post-Mortem event may take from minutes to many months, depending on the desired level of detail. Key data must, however, be presented in a way which allows for simple and efficient fault-finding. Operation crews must be presented with clear...
- Canceled, 17/01/2007, 11:00
  In addition to systematic transient data acquisition, operation of the LHC will also require the possibility of making ad-hoc acquisitions of transient beam and possibly equipment data, in order to diagnose and solve specific problems and to cope with unforeseen difficulties. An attempt is made to outline the different transient data required for general operational purposes,...
- Julian Lewis (CERN), 17/01/2007, 11:30
  A post-mortem timing event distributed by the LHC machine timing system is used to freeze the PM buffers of a large fraction of the LHC equipment. This event must be generated automatically whenever the BIS issues a beam dump request by changing the state of the beam permit signal. This presentation outlines the present ideas on how to generate the PM timing event. The issue...
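The triggering rule in this abstract, generate the PM event when the beam permit changes state, is essentially falling-edge detection. A minimal sketch, assuming a sampled boolean permit signal (the real timing system works on hardware signals, not Python lists):

```python
# Sketch of the triggering idea: a PM timing event is generated on each
# True -> False transition of the beam permit signal. The sampled-list
# signal source is a stand-in for the real BIS/timing hardware.

def pm_events(permit_samples):
    """Yield sample indices where the beam permit goes True -> False."""
    previous = True
    for i, permit in enumerate(permit_samples):
        if previous and not permit:
            yield i  # here the timing system would broadcast the PM event
        previous = permit

samples = [True, True, False, False, True, False]
print(list(pm_events(samples)))  # prints [2, 5]: one event per falling edge
```

Edge detection (rather than level detection) matters here: the buffers must be frozen once per dump request, not continuously while the permit stays down.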
- Robin Lauckner (CERN), 17/01/2007, 13:45
  Post Mortem will be the key to mastering the full complexity of LHC operation and the interaction between systems. Many systems will be involved in the full optimisation and understanding of performance. Today a few systems are providing data to validate and understand hardware commissioning. This must be extended, giving priority to obtaining essential information related to...
- Stephane Bart Pedersen (CERN), 17/01/2007, 14:05
  The key beam instruments for post-mortem diagnostics in the LHC include:
  - the beam position monitors (BPM)
  - the beam loss monitors (BLM)
  - the beam current transformers (BCT)
  - the non-destructive beam profile monitors
  - the tune measurement
  - the abort gap monitors
  Turn-by-turn (or highest time resolution) data will be provided for all systems...
- Andrew Butterworth (CERN), 17/01/2007, 14:35
  The RF acceleration (ACS) and transverse damper (ADT) systems will supply post-mortem data at various acquisition rates. The PLCs controlling the power systems acquire at a few Hz, while high-speed digitizers and acquisition buffers embedded in the low-level hardware acquire transient signals at 80 MSamples/s over time periods ranging from a few milliseconds to several hundred...
- Etienne Carlier (CERN), 17/01/2007, 15:05
  Reliable operation of the LHC injection, tune/aperture and LBDS kicker systems relies on continuous on-line and off-line surveillance of their critical operational characteristics. Different acquisition techniques, such as trend logging, shot-by-shot logging and fast transient recording, will be used to acquire and record the diverse types of signals existing within the kicker systems...
- Michel Jonker (CERN), 17/01/2007, 15:45
  The LHC collimation system is responsible for providing clean beam conditions and hence for ensuring the protection of the equipment in the LHC. A failure of the collimation system may trigger a beam dump to avoid magnet quenches. The post-mortem data of the collimation system supplies the following information:
  - demanded and actual positions of all collimator jaws (millisecond...
- Rüdiger Schmidt (CERN): General introduction, with the main aims of the Post Mortem system