For various reasons the computing facility for LHC data analysis has been organised as a widely distributed computational grid. Will this be able to meet the requirements of the experiments as LHC energy and luminosity ramp up? Will grid operation become a basic component of science infrastructure? Will virtualisation and the cloud model eliminate the need for complex grid
middleware? Will multi-core personal computers relegate the grid to a data delivery service? The talk will look at some of the advantages and some of the drawbacks of the grid approach, and will present a personal view on how things might evolve.
Les Robertson has been involved in the development and management of the
central computing services at CERN since 1974, taking a very active part in
the evolution from supercomputers through general purpose mainframes to
PC-based computing fabrics. He was involved from an early stage in planning
the data handling services for the experiments using the LHC, and led
the Worldwide LHC Computing Grid project for six years from its start in