Speaker: Dr Jean-Roch Vlimant (CERN)
Description
The analysis of LHC data at the CMS experiment requires the production of a large number of simulated events. In 2012, CMS produced over 4 billion simulated events in about one hundred thousand datasets. Over the past years, a tool (PREP) has been developed to manage the production of these thousands of samples.
Considerable experience has been gained with this tool, and conclusions have been drawn about its limitations. To interface better with the CMS production infrastructure and data bookkeeping system, a new database technology (CouchDB) has been adopted, and a more recent server infrastructure (CherryPy + Java) has been chosen as the platform for an evolution of PREP. The operational limitations encountered over years of usage have been resolved in the new system, and the aggregation of sample production information has been substantially improved, allowing better traceability and prioritization of work.
This contribution describes the functionality of this major evolution of the software for managing samples of simulated events in CMS.
Author: Dr Jean-Roch Vlimant (CERN)