Data Centre

The CERN Data Centre is the heart of CERN's entire scientific, administrative and computing infrastructure. All services, including email, scientific data management and videoconferencing, use equipment based in the data centre. The 110 000 processor cores and 10 000 servers hosted in its three rooms run 24/7. A remote extension of the CERN Data Centre is hosted at the Wigner Research Centre for Physics in Hungary, providing the extra computing power required to cover CERN's growing needs.


State-of-the-art network equipment and over 35 000 km of optical fibre provide the infrastructure essential for running the laboratory's computing facilities and services, as well as for transferring the colossal amount of LHC data to and from the data centre. Some 4 000 users can use the WiFi network simultaneously, and around 6 000 devices connect to it every day. The CERN Data Centre processes one petabyte of data each day, the equivalent of around 210 000 DVDs. In 2013, the Wigner Data Centre in Hungary was added to increase the overall capacity, with two 100 Gb/s (gigabits per second) fibre-optic lines linking the two sites.
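As a rough sanity check of those figures, the arithmetic works out as follows. This is a minimal sketch; the 4.7 GB single-layer DVD capacity is an assumption, and CERN's own figures may use a slightly different conversion.

```python
# Back-of-the-envelope check of the data-centre throughput figures.
# Assumes a single-layer DVD capacity of 4.7 GB (decimal units throughout).

PETABYTE = 1e15          # bytes
DVD_CAPACITY = 4.7e9     # bytes per single-layer DVD (assumption)

daily_volume = 1 * PETABYTE
dvds_per_day = daily_volume / DVD_CAPACITY
print(f"1 PB/day is about {dvds_per_day:,.0f} DVDs/day")  # ~212,766, i.e. around 210 000

# Sustained network rate needed to move 1 PB in 24 hours
rate_gbps = daily_volume * 8 / (24 * 3600) / 1e9
print(f"Average rate: {rate_gbps:.1f} Gb/s")               # ~92.6 Gb/s
```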

Data Storage

The LHC experiments produce over 30 petabytes of data per year. Archiving the vast quantities of data is an essential function at CERN. 

Magnetic tape is used as the main long-term storage medium. People often wonder why we still use tape, seeing it as an old-fashioned technology. Tape is in fact a very sophisticated storage medium that can hold huge volumes of data: the data stored on thousands of reels for the OPAL experiment in the 1990s now fits on a single modern cartridge. Tape is inexpensive, compact, consumes little electricity and is durable, making it well suited to long-term storage. With the tsunami of data from the LHC, being able to retrieve petabytes of stored data quickly is essential for physicists to make ground-breaking discoveries.

We use many tape robots to ensure the efficient storage and retrieval of data. CERN has more than 130 petabytes of data in storage (the equivalent of 700 years of full HD-quality movies). We have also kept all the data from earlier experiments, although this represents only a small fraction of the total (e.g. the data from the LEP experiments amounts to about 400 TB, peanuts compared with today's more than 130 PB).
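Out of curiosity, we can work backwards from the "700 years of movies" comparison to see what video bitrate it implies. This is purely illustrative; the exact conversion used is not stated in the source.

```python
# What does "130 PB = 700 years of full-HD movies" imply about the
# assumed video bitrate? (Illustrative back-calculation only.)

PETABYTE = 1e15                      # bytes
stored = 130 * PETABYTE
hours = 700 * 365.25 * 24            # 700 years of continuous playback

bytes_per_hour = stored / hours
bitrate_mbps = bytes_per_hour * 8 / 3600 / 1e6
print(f"Implied bitrate: {bitrate_mbps:.0f} Mb/s")  # ~47 Mb/s, roughly Blu-ray quality
```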

Accessing Data

Accessing data on tape is relatively slow: it takes about one to three minutes for a tape to be located, mounted and read, and for the data to be sent. Physicists often need to access the latest data immediately, so it is also made available on disk servers, where access times are significantly faster. We have around 120 PB of raw disk space available for this purpose.
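A simple model illustrates why the minute-scale tape latency matters for interactive work but not for bulk archiving. The mount penalty and streaming rates below are illustrative assumptions, not measured CERN figures.

```python
# Rough model of single-file retrieval time: tape pays a fixed
# locate/mount penalty before streaming; disk starts almost immediately.
# The 2-minute mount penalty and drive speeds are assumptions.

def tape_seconds(size_gb, mount_s=120, stream_mb_s=250):
    return mount_s + size_gb * 1000 / stream_mb_s

def disk_seconds(size_gb, stream_mb_s=150):
    return size_gb * 1000 / stream_mb_s

for size in (1, 100):
    print(f"{size:>4} GB  tape: {tape_seconds(size):6.0f} s   disk: {disk_seconds(size):6.0f} s")

# For small files the mount latency dominates on tape; for large sequential
# reads the streaming rates are comparable. This is why frequently accessed
# 'hot' data is kept on disk while tape handles the long-term archive.
```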


Cooling

Air conditioning is currently a major challenge for data centres all over the world. As processors get faster they also get hotter, and at present their heat output is increasing faster than their performance. Rack-mounted machines make matters worse, as they are densely packed with processors.

Some of the racks at the computing centre contain only a few machines, since there is not currently enough cooling capacity to fill them. The room was designed with one supercomputer in a corner in mind, not several thousand processors!

To maximise cooling efficiency, we use a hot/cold aisle configuration: the fronts of the racks face each other across the 'cold' aisle and expel heat from their backs into the 'hot' aisle. Doors and roofs enclosing the cold aisles increase efficiency by ensuring that only cold air enters the machines. The cold air comes up through grilles in the floor of the 'cold' aisles.

The cold air is introduced into the building through the big blue pipes that run from the roof down to the floor. Three chillers on the building's roof are responsible for cooling the air. This process consumes no energy during the winter months, when cold air is taken directly from outside.

Wigner Data Centre


The Wigner Data Centre in Budapest was inaugurated in June 2013 as an extension to the CERN Data Centre.

The equipment at Wigner is managed and operated from CERN, in the same way as the equipment in the CERN Data Centre. Only activities requiring physical access to the equipment, such as installing equipment into racks or repairing servers, are performed by the Wigner Data Centre staff.

Around 30% of the capacity of the CERN Data Centre has been installed at the Wigner Data Centre. We are continually increasing the resources available to us at Wigner, with the aim of having a level of resources there similar to that at the CERN Data Centre in Geneva by 2020.

The CERN and Wigner Data Centres are connected via two independent, dedicated 100 Gb/s (gigabits per second) fibre-optic lines, a bandwidth equivalent to transferring five full DVDs per second. The network latency (the time taken between sending data and receiving it at the other end) between the two sites, which are 1800 km apart, is about 25 milliseconds.
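Both of these figures can be checked with simple arithmetic. In the sketch below, the 4.7 GB DVD capacity and the speed of light in fibre (roughly 200 000 km/s) are assumptions.

```python
# Sanity check of the CERN-Wigner link figures. DVD capacity (4.7 GB)
# and the propagation speed in fibre (~2/3 of c) are assumptions.

links = 2
link_gbps = 100
total_bytes_per_s = links * link_gbps * 1e9 / 8
dvds_per_s = total_bytes_per_s / 4.7e9
print(f"{dvds_per_s:.1f} DVDs/s")     # ~5.3, matching "five full DVDs per second"

distance_km = 1800
fibre_km_per_s = 2e5                   # light in glass travels at ~200 000 km/s
min_latency_ms = distance_km / fibre_km_per_s * 1000
print(f"Theoretical minimum one-way latency: {min_latency_ms:.0f} ms")  # ~9 ms

# The quoted ~25 ms is plausibly higher than the ~9 ms floor because the
# fibre path is longer than the straight-line distance and routing and
# equipment add delay along the way.
```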


