Seminar:
LHC Computing Grid
Speaker: Dr. Ashiq Anjum, CERN Geneva, Switzerland.
Venue: Department of Physics, University of the Punjab, Lahore
Time and Date: January 5, 2009
http://www.khwarzimic.org/activities/lhcgc_ashiq.jpg
Spock
January 13, 2009, 4:21pm
Re: LHC Computing Grid
Interesting. Shed some more light, my friend.
Re: LHC Computing Grid
The LHC Computing Grid, launched on October 3, 2008, is a distributed computing network designed by CERN to handle the massive amounts of data produced by the Large Hadron Collider (LHC). It incorporates both private fiber-optic links and existing high-speed portions of the public Internet.
The detectors produce a data stream of approximately 300 GB/s, which is filtered for “interesting events” down to a “raw data” stream of about 300 MB/s. The CERN computer center, considered “Tier 0” of the LHC Computing Grid, has a dedicated 10 Gb/s connection to the counting room.
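A quick back-of-envelope check of the rates quoted above (illustrative arithmetic only, not an official CERN figure): the event filter cuts the stream by roughly a factor of 1,000, and the surviving raw stream fits comfortably on the 10 Gb/s Tier 0 link.

```python
# Sanity-check the quoted data rates (illustrative only).

detector_rate_GBps = 300   # GB/s from the detectors, before filtering
raw_rate_MBps = 300        # MB/s after the "interesting event" filter

# Reduction factor achieved by the event filter
reduction_factor = detector_rate_GBps * 1000 / raw_rate_MBps
print(f"filter reduction: ~{reduction_factor:.0f}x")

# The raw stream in network terms (bits), against the 10 Gb/s Tier 0 link
raw_rate_Gbps = raw_rate_MBps * 8 / 1000
print(f"raw stream: {raw_rate_Gbps:.1f} Gb/s of a 10 Gb/s link")
```

So the filtered stream uses only about a quarter of the dedicated link's nominal capacity, leaving headroom for overhead and retransmission.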
The project is expected to generate 27 TB of raw data per day, plus 10 TB of “event summary data”, which represents the output of calculations done by the CPU farm at the CERN data center. This data is sent out from CERN to eleven Tier 1 academic institutions in Europe, Asia, and North America via dedicated 10 Gb/s links. More than 150 Tier 2 institutions are connected to the Tier 1 institutions by general-purpose national research and education networks. The data produced by the LHC across its entire distributed computing grid is expected to add up to 10–15 PB per year.
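The per-day and per-year figures above are mutually consistent, as a rough calculation shows (assuming year-round running, which overstates actual beam time, so this is an upper bound):

```python
# Check that 27 TB raw + 10 TB summary data per day lands in the
# quoted 10-15 PB/year range (illustrative arithmetic only).

raw_tb_per_day = 27
summary_tb_per_day = 10
days_per_year = 365  # assumes continuous running: a rough upper bound

annual_pb = (raw_tb_per_day + summary_tb_per_day) * days_per_year / 1000
print(f"~{annual_pb:.1f} PB/year")
```

The result, about 13.5 PB/year, sits inside the quoted 10–15 PB range.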
The Tier 1 institutions receive specific subsets of the raw data, for which they serve as a backup repository for CERN. They also perform reprocessing when recalibration is necessary. The primary configuration for the computers used in the grid is based on Scientific Linux.
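The tiered fan-out described above can be sketched as a small data structure. The role descriptions are paraphrased from this post; the structure itself is illustrative, not an official WLCG schema:

```python
# A minimal sketch of the tiered architecture described above:
# one Tier 0 (CERN), eleven Tier 1 centres, 150+ Tier 2 sites.

tiers = {
    0: {"sites": 1,   "role": "CERN data centre; source of raw and event summary data"},
    1: {"sites": 11,  "role": "regional backup of raw-data subsets; reprocessing on recalibration"},
    2: {"sites": 150, "role": "end-user analysis over research and education networks"},
}

for level, info in tiers.items():
    print(f"Tier {level}: {info['sites']}+ site(s): {info['role']}")
```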
Distributed computing resources for analysis by end-user physicists are provided by the Open Science Grid, Enabling Grids for E-sciencE, and LHC@home projects.
The project is formally known as the Worldwide LHC Computing Grid (WLCG).