Data Centre

The CERN Data Centre is the heart of CERN's entire scientific, administrative and computing infrastructure. All services, including email, scientific data management and videoconferencing, use equipment based in the data centre.

Networking

State-of-the-art network equipment and over 35 000 km of optical fibre provide the infrastructure that is essential for running the computing facilities and services of the laboratory as well as for transferring the colossal amount of LHC data to and from the data centre. 
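
To get a rough sense of the scale involved, here is a short back-of-the-envelope calculation in Python showing how long it would take to move a single petabyte over one high-speed link. The 100 Gb/s link speed is an assumed round figure used purely for illustration, not an official specification of the CERN network.

    # Back-of-the-envelope estimate: time to transfer one petabyte over a
    # single link. The 100 Gb/s link speed is an assumed round figure for
    # illustration only, not an official CERN value.
    petabyte_bits = 1e15 * 8          # one petabyte expressed in bits
    link_speed_bps = 100e9            # assumed link speed: 100 gigabits per second
    seconds = petabyte_bits / link_speed_bps
    print(f"~{seconds / 3600:.1f} hours to move one petabyte over a single 100 Gb/s link")
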

Data Storage

The LHC experiments produce over 50 petabytes of data per year, and the other (non-LHC) experiments at CERN produce a further 25 petabytes or so per year. Archiving these vast quantities of data is an essential function at CERN.

Magnetic tape is used as the main long-term storage medium. Tape is actually a very sophisticated storage medium that can hold huge volumes of data: it is inexpensive, compact, consumes little electricity and is durable for long-term storage. With the tsunami of data from the LHC, being able to retrieve petabytes of stored data quickly is essential for physicists to make ground-breaking discoveries.

We have many tape robots to ensure efficient storage and retrieval of data. We have also kept all the data from previous and other experiments, although this represents only a small fraction of the total (e.g. the data from the LEP experiments amounts to about 400 terabytes, peanuts compared to today's hundreds of petabytes!).
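
To picture what archiving roughly 75 petabytes a year means in physical terms, the small sketch below estimates how many tape cartridges that fills. Only the 50 and 25 petabyte figures come from the text above; the 12 TB cartridge capacity is an assumed value for illustration, not a statement about the cartridges actually in use.

    # Rough estimate of yearly tape usage. The ~50 PB (LHC) and ~25 PB
    # (non-LHC) figures come from the text above; the 12 TB cartridge
    # capacity is an assumed value for illustration only.
    lhc_pb_per_year = 50
    non_lhc_pb_per_year = 25
    cartridge_capacity_tb = 12        # assumed capacity of one tape cartridge
    total_tb = (lhc_pb_per_year + non_lhc_pb_per_year) * 1000
    cartridges = total_tb / cartridge_capacity_tb
    print(f"~{cartridges:,.0f} tape cartridges filled per year")
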

Cooling

Air conditioning is a major challenge for data centres all over the world. As processors get faster they also get hotter, and at the moment heat output is increasing faster than performance. Rack-mounted machines are even worse, as they are densely packed with processors.

Some of the racks in the computing centre contain only a few machines, since there is not enough cooling capacity to fill them completely. The room was designed with a single supercomputer in a corner in mind, not several thousand processors!

To maximise cooling efficiency, we use a hot/cold aisle configuration: the fronts of the racks face each other across the 'cold' aisle and expel hot air from their backs into the 'hot' aisle. Doors and roofs placed over the cold aisles increase efficiency by allowing only cold air into the machines. The cold air enters through grilles in the floor of the 'cold' aisles.

The cold air is introduced into the building through the big blue pipes that come from the roof and run down to the floor. Three chillers on the roof of the building are responsible for cooling the air. This process consumes no energy during the winter months, when cold air is taken directly from outside.
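
The idea behind this 'free cooling' can be sketched as a toy decision rule. The 18 °C threshold below is a hypothetical value chosen for illustration, not the actual set point used at CERN.

    # Toy sketch of the free-cooling idea: if the outside air is cold enough,
    # use it directly; otherwise run the chillers. The 18 degree C threshold
    # is a hypothetical value, not CERN's actual set point.
    def cooling_mode(outside_temp_c, threshold_c=18.0):
        if outside_temp_c < threshold_c:
            return "free cooling (outside air)"
        return "chillers"

    for temp_c in (5, 15, 25):
        print(f"{temp_c} deg C outside -> {cooling_mode(temp_c)}")
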

Electricity

The electrical infrastructure is a vital element of the Data Centre.

The Data Centre is protected by a UPS (uninterruptible power supply) system. In the event of a major power cut, diesel generators can be started to power critical systems and allow time to shut down non-critical systems.

Strategies for increasing power efficiency are continually investigated in order to maximise the computing power serving CERN's infrastructure and scientific programme.