CERN has always been at the forefront of computing and data challenges. Even before the advent of detectors with electronic readout, significant (for those times) volumes of information, e.g. from bubble chamber pictures, needed processing, and complex mathematical calculations and physics simulations were, and still are, a must.


With the LHC, the challenge posed by the data volumes to be processed has reached a level where worldwide cooperation between research institutes is needed. A worldwide computing network, with CERN as the central “Tier-0” node, provides the experiments with the computing power needed to store the ~15 petabytes of data generated each year, process them through multiple levels of reconstruction of the underlying physics, and distribute them around the world. Simulations also take up a huge share of the computing power, with demands of a similar order of magnitude in CPU power and storage as the data itself.


The majority of CERN's computing needs are met using open-source platforms and software. CERN itself also invests heavily in computing and software R&D, and returns many tools to the worldwide scientific and broader community. The best known is the World Wide Web, for which the protocols and first servers were pioneered at CERN and released free of charge to the world – a spark that kindled the explosion of the internet.

Sir Tim Berners-Lee, at CERN at that time, made this breakthrough a reality; but did you know that a Belgian from Tongeren, Robert Cailliau, also played a crucial role in this development?
Check him out: https://en.wikipedia.org/wiki/Robert_Cailliau


Such data volumes and computing demands bring further challenges: high-speed reliable networking, mass long-term storage, security, portability, data preservation, public data releases, etc.

More information can be found on this page: https://home.cern/science/computing


The VUB is also very active on the computing side. Together with the other Belgian institutes in CMS, we host a large “Tier-2” computing facility in Brussels, which provides services to the CMS experiment – in particular for simulation – and serves as a powerful computing hub which our researchers use extensively to perform analyses, develop tools, calibrate data, etc.