The LHC experiments constitute a challenge in several disciplines of both High Energy Physics and Information Technologies. This is definitely the case for data acquisition, processing and analysis.
This challenge has been addressed by many years of R&D activity, during which prototypes of components or subsystems have been developed. This prototyping phase is now culminating in an evaluation of the prototypes in large-scale tests (appropriately called "Data Challenges").
In a period of restricted funding, the expectation is to realize the LHC data acquisition and computing infrastructures by making extensive use of standard and commodity components.
The lectures will start with a brief overview of the requirements of the LHC experiments in terms of data acquisition and computing. The different tasks of the experimental data chain will also be explained: data acquisition, selection, storage, processing and analysis. The major trends of the computing and networking industries will then be outlined, with particular attention to their influence on LHC data acquisition and computing. Finally, the status and results of the "Data Challenges" performed by the LHC experiments and the IT division will be presented and discussed. A vision of the data acquisition and processing system for the LHC era, and its promise for the future, will conclude the series.