The CERN openlab is a unique public-private partnership between CERN and leading ICT companies. It was created in 2001 in support of the ambitious computing and data management goals set by the construction of the Large Hadron Collider (LHC) and its detectors. Building on more than 10 years of ground-breaking work, the CERN openlab continues to address the key topics in the CERN scientific and technical programme driven by the planned LHC upgrade activities spanning the next 20 years.
The LHC will restart in 2015 and run for three years, followed by the second Long Shutdown (LS2) in 2018. During LS2 the operational parameters of the LHC and the experiments will be pushed to new limits, dramatically increasing the quantity and rate of data generated. In addition, new international research infrastructures have been designed that are expected to produce comparable or even greater amounts of data in diverse scientific domains such as neurology, radio astronomy or genetics. Sharing experience and expertise with such research infrastructures will improve the ICT models and broaden the potential market for solutions.
At the same time, the ever-increasing usage of the World Wide Web and the advent of consumer-oriented services have led to the generation, storage and movement of data on the order of hundreds of petabytes (PB) each month. Technologies that today are at the bleeding edge of research will be commodity items tomorrow.
Continuous collaboration between research infrastructures and ICT companies is therefore more critical than ever to ensure that scientific objectives and technology roadmaps remain aligned. The CERN openlab plays an important role in this endeavour, setting goals and providing opportunities for collaboration, technical expertise and educational programmes.
In order to define the long-term technological context in which joint research activities can take place over the next five years, the CERN IT Department and the CERN openlab have started defining a number of ambitious challenges covering the most crucial needs of ICT infrastructures, from data acquisition and processing to large-scale computing resources management, from exascale data storage to big data analytics. This process was started earlier this year with events such as the workshop on "IT Requirements for the Next Generation Research Infrastructures".
This workshop provides an occasion to explore in practical terms the set of challenges and the use cases addressing the needs of research teams in physics and other worldwide scientific communities. It also provides an opportunity for the CERN openlab commercial partners to highlight their technology roadmaps and match them with the concrete needs of the research teams.
The expected outcome of the workshop is a shared definition of the most critical long-term ICT challenges, a set of use cases within each challenge, and an initial expression of interest from both research teams and commercial partners in common objectives for work to be started from 2014/2015 onwards.