Description
The ever-growing data volumes produced by HEP experiments, particularly at the CERN Large Hadron Collider (LHC) and upcoming facilities, demand innovative approaches to data processing and analysis. Traditional data acquisition and processing methods are no longer adequate for handling the scale, speed, and complexity of this data. In response, the field has seen a transformative shift toward edge AI for intelligent trigger and front-end systems, fundamentally changing how experiments manage data acquisition and processing in real time. This talk will cover promising implementations of these new approaches in current and future HEP experiments and their impact on accelerating discovery and pushing the boundaries of scientific knowledge in high-energy physics and beyond.