Ken Bloom
(University of Nebraska-Lincoln)
27/07/2009, 14:00
Computing in HEP
Each LHC experiment will produce datasets with sizes of order one petabyte
per year. All of this data must be stored, processed, transferred, simulated
and analyzed, which requires a computing system of a larger scale than ever
mounted for any particle physics experiment, and possibly for any enterprise
in the world. I will discuss how CMS has chosen to address these challenges,...
Alden Stradling
(UT Arlington)
27/07/2009, 14:30
Computing in HEP
In anticipation of the calibration, performance, and physics activities that will be performed on the first LHC data, the ATLAS experiment is continuously refining elements of its Computing and Analysis Models. We will present an overview of some of the recent developments in ATLAS computing, including the current resource estimates for Tier-1s and Tier-2s, issues related to the establishment of Tier...
Dr Burt Holzman
(CMS)
27/07/2009, 15:00
Computing in HEP
The Open Science Grid (OSG) enables collaborative science by providing a national cyber-infrastructure of distributed computing and storage resources. The goal of the OSG is to transform processing and data intensive science through a cross-domain, self-managed, nationally distributed cyber-infrastructure that brings together campus and community resources. The High Energy Physics community...
Dr Shawn McKee
(University of Michigan)
27/07/2009, 15:30
Computing in HEP
Large-scale computing in ATLAS is based on a grid-linked system of tiered computing centers. The ATLAS Great Lakes Tier-2 came online in September 2006 and is now being commissioned at full capacity to provide significant computing power and services to the USATLAS community. Our Tier-2 Center also hosts the Michigan Muon Calibration Center, which is responsible for daily calibrations of the ATLAS...
Sasha Vanyashin
(Argonne National Laboratory)
27/07/2009, 16:30
Computing in HEP
ATLAS event data processing requires access to non-event data (detector conditions, calibrations, etc.) stored in relational databases. The database-resident data are critical for the event data reconstruction processing steps and are often required for user analysis. A main focus of ATLAS database operations is on the worldwide distribution of the Conditions DB data, which are necessary for...
Amir Farbin
(University of Texas, Arlington)
27/07/2009, 16:50
Computing in HEP
Although High Energy Physics (HEP) helped drive the computing industry in the early 1990s by establishing the HTTP protocol and the first web servers, the long time-scales for planning and building modern HEP experiments have resulted in a generally slow adoption by HEP of emerging computing technologies that rapidly become commonplace in business and other scientific fields. We will review...