One of the biggest computing problems in the world is how to process the massive data volumes expected from the world's biggest particle physics experiment - the Large Hadron Collider (LHC) at CERN in Geneva - which will recreate conditions from moments after the Big Bang in order to study the fundamental properties of sub-atomic particles.
When the LHC becomes operational in 2007, it will produce Petabytes (millions of Gigabytes) of data. To manage this, researchers have been building a Grid that distributes the processing and storage of that data around the world.
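To give a flavour of the idea, the toy sketch below shows how a batch of data files might be spread across several sites and processed in parallel. It is purely illustrative: the site names and the analyse() function are hypothetical stand-ins, and the real LCG relies on dedicated Grid middleware rather than anything like this code.

```python
# Illustrative sketch only: a toy model of distributing processing across Grid sites.
# Site names and analyse() are hypothetical; this is not the LCG middleware.

from concurrent.futures import ThreadPoolExecutor

SITES = ["site-a", "site-b", "site-c"]                    # stand-ins for participating centres
DATA_FILES = [f"event-file-{i:03d}" for i in range(12)]   # stand-ins for detector data files

def analyse(site: str, filename: str) -> str:
    """Pretend to run one analysis job for one file at one site."""
    return f"{filename} processed at {site}"

def distribute(files, sites):
    """Assign files to sites round-robin and run the jobs concurrently."""
    assignments = [(sites[i % len(sites)], f) for i, f in enumerate(files)]
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda job: analyse(*job), assignments))

if __name__ == "__main__":
    for result in distribute(DATA_FILES, SITES):
        print(result)
```

The same principle, applied at vastly larger scale and with data replicated to tape and disk at many centres, is what allows the Grid to absorb Petabytes of experimental data.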
Yesterday, the LHC Computing Grid (LCG) project announced that its computing Grid now includes more than 100 sites in 31 countries, making it the world's largest international scientific Grid.
The sites participating in the LCG project are primarily universities and research laboratories. Together they contribute more than 10,000 central processing units (CPUs) and nearly 10 million Gigabytes (around 10 Petabytes) of storage capacity on disk and tape. More than 2,000 of these CPUs are in the
LCG receives substantial support from the EU-funded Enabling Grids for E-sciencE (EGEE) project, which is a major contributor to its operations.
The Grid operated by the LCG project is already being tested by the four major experiments that will use the LHC, namely ALICE, ATLAS, CMS and LHCb, to simulate the computing conditions expected once the LHC is fully operational.