Data goes faster than ever

Using store-bought computers and commercially available optical fibre lines, researchers from the California Institute of Technology (Caltech), the University of Victoria, the University of Michigan, CERN and Florida International University broke the world speed record for LHC data transfer. The result caught the attention of high-energy physics experiments worldwide – including those at the LHC – which rely on ever-improving technology to share their results.

The equipment used by the Caltech team to break the data transfer record. Photo credit: D. Foster.

At November’s SuperComputing 2011 (SC11) conference in Seattle, the Caltech team sent LHC data between the University of Victoria and the Seattle exhibition floor at a combined two-way (full duplex) rate of 186 gigabits per second on a 100 Gbps circuit provided by CANARIE and BCnet. That circuit alone offers ten times the capacity of the current 10 gigabits per second circuits between CERN and each of the 11 major Grid Tier-1 centres that receive LHC data – small wonder the result made headlines in the IT world.
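
To put those rates side by side, here is a quick back-of-the-envelope comparison in Python. Every figure comes from the paragraph above; the rest is plain arithmetic.

# Rates quoted above, in gigabits per second.
demo_rate_gbps = 186     # combined two-way rate demonstrated at SC11
circuit_gbps = 100       # capacity of the CANARIE/BCnet circuit (each direction)
tier1_link_gbps = 10     # current circuits between CERN and each Tier-1 centre

print(f"Circuit capacity:  {circuit_gbps / tier1_link_gbps:.0f}x a Tier-1 link")
print(f"Demonstrated rate: {demo_rate_gbps / tier1_link_gbps:.1f}x a Tier-1 link")
# Circuit capacity:  10x a Tier-1 link
# Demonstrated rate: 18.6x a Tier-1 link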

“What is remarkable about the Caltech experiment is the use of ‘commodity’ technology – that is, the use of computing resources that are commercially available as part of a careful end-to-end systems design,” says David Foster, Deputy Head of the CERN IT Department, who attended the SC11 conference. “While terabit technology (that’s 1000 gigabits/sec) is being demonstrated under laboratory conditions, 100 Gbps technology is now making it into the mass market. The Caltech team combined this commodity equipment with state-of-the-art software and delivered state-of-the-art results. CERN is already deploying 100 Gbps circuits internally and will do so internationally this year.”

Transferring data from one place to another requires much more than a high-speed network cable: a whole system of components – computers, network interfaces, disks and software – needs to be tuned to work together efficiently. Increasing network bandwidth alone is not enough to produce excellent transfer rates. It was by taking this “whole systems” approach to the problem that the Caltech team achieved its record-breaking results.
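
As a concrete example of why end-to-end tuning matters, consider TCP’s bandwidth-delay product: the hosts at each end must be able to keep at least one round trip’s worth of data in flight, or the connection can never fill the link. The sketch below is illustrative only – the round-trip times are assumptions, not measurements from the demonstration.

# Bandwidth-delay product (BDP): the data that must be "in flight"
# to keep a link busy. TCP buffers smaller than the BDP cap the
# achievable throughput, no matter how fast the circuit is.
def bdp_bytes(rate_bps: float, rtt_s: float) -> float:
    """Bytes in flight needed to fill the pipe for one round trip."""
    return rate_bps * rtt_s / 8

for rtt_ms in (10, 50, 100):                # assumed round-trip times
    bdp = bdp_bytes(100e9, rtt_ms / 1000)   # the 100 Gbps circuit
    print(f"RTT {rtt_ms:>3} ms -> buffer needed: {bdp / 1e9:.3f} GB")
# RTT  10 ms -> buffer needed: 0.125 GB
# RTT  50 ms -> buffer needed: 0.625 GB
# RTT 100 ms -> buffer needed: 1.250 GB

Default TCP socket buffers are only a few megabytes, which is why network cards, kernels, disks and transfer software all have to be tuned together to sustain rates like these.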

The landmark exercise could have far-reaching consequences for the LHC experiments and for other high-energy physics experiments worldwide. Last year, the LHC produced over 22 PB of experimental data – the equivalent of more than 5.5 million DVDs – compared with the planned 15 PB. If data production continues to increase, the transfer rate between CERN and its partner institutions will have to increase as well.
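
A rough calculation shows what is at stake. Taking 1 PB as 10^15 bytes and assuming – idealistically, since real transfers never run at full line rate – a sustained end-to-end rate, moving a year’s worth of data looks like this:

# Idealised (lower-bound) time to move 22 PB at a sustained rate.
DATA_BITS = 22e15 * 8    # 22 PB, taking 1 PB = 1e15 bytes

for label, rate_bps in [("10 Gbps Tier-1 link ", 10e9),
                        ("100 Gbps circuit    ", 100e9),
                        ("186 Gbps (SC11 demo)", 186e9)]:
    days = DATA_BITS / rate_bps / 86400
    print(f"{label} ~{days:5.1f} days")
# 10 Gbps Tier-1 link  ~203.7 days
# 100 Gbps circuit     ~ 20.4 days
# 186 Gbps (SC11 demo) ~ 11.0 days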

“We are constantly monitoring these types of exercises,” says Foster. “As we work together with institutes like Caltech and the University of Victoria to improve the technology of data transfer, we will be able to adjust and improve how we distribute LHC data to collaborating institutes.”

by Katarina Anthony