LHCb: full-steam strategy pays off

LHCb looks at LHC proton collisions from a special angle. The experiment studies rare decays of B particles to probe the physical processes in which new physics might be hiding. Designed to operate at moderate luminosity, LHCb has been more daring over the past year and is running in conditions tougher than nominal. The new strategy is paying off, as important physics results have just started to emerge…

 

Event display presented at the EPS-HEP 2011 conference showing a B0s meson decaying into a μ+ and μ- pair. 

The LHCb detector was originally designed to run at moderate luminosity and low interaction pile-up. In other words, unlike the CMS and ATLAS experiments, the whole LHCb experimental set-up and data-taking infrastructure were designed to process just one proton–proton interaction per bunch crossing.

For the last year, however, this has all been old news. A change in LHCb strategy was made possible when it became clear that the LHC was going to first increase the number of protons in the bunches and only afterwards increase the number of bunches in the machine. “Had we continued with the old policy last year and this year, we would have collected five times less data,” says Richard Jacobsson, LHCb run coordinator. “Very soon we realized that the detector was actually able to stand much tougher operating conditions and could cope with more than just one interaction per bunch crossing. With the proton-rich bunches that the LHC was sending us, we saw that we could even process as many as six interactions per bunch crossing.”
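For scale, the average number of interactions per bunch crossing, often called μ, follows directly from the per-bunch luminosity and the inelastic proton–proton cross-section (the figures below are standard textbook values, not numbers quoted in this article):

    \mu = \frac{\mathcal{L}\,\sigma_{\mathrm{inel}}}{n_b\, f_{\mathrm{rev}}}

where \mathcal{L} is the instantaneous luminosity, \sigma_{\mathrm{inel}} (roughly 60 mb at 7 TeV) is the inelastic pp cross-section, n_b is the number of colliding bunch pairs at LHCb and f_{\mathrm{rev}} ≈ 11.2 kHz is the LHC revolution frequency. Packing more protons into each bunch raises the luminosity per bunch pair, and with it μ, which is how LHCb ended up handling several interactions per crossing instead of one.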

With such a high luminosity per bunch in July last year, all sub-detectors had to prove once again their ability to perform as expected. “Basically we saw that the whole detector was illuminated and we were saturating everywhere, up to the optical fibres for the data transmission. Actually, the trigger challenge required heroic efforts when our online computer farm also became saturated!” explains Richard. “However, we could also see that, overall, the detector was perfectly capable of withstanding these extreme conditions.”

The problem of saturation was particularly evident for the offline event reconstruction and analysis. With a higher pile-up the events become ‘dirtier’ – that is, more particles come into play at a given time, which makes it increasingly difficult for researchers to separate the interesting events from the background. “At the end of 2010 it became clear that we were reaching our limits. The LHC was obviously only going to increase its luminosity during 2011 and we had to work out a stable solution,” says Richard.

The technical solution is called “luminosity levelling” (see box). Thanks to this technique, the LHCb experiment constantly runs at the highest luminosity it can handle, without compromising safety or reliability. “The current system we use to keep our luminosity stable is automatic. It adjusts itself according to the natural changes that occur in the LHC luminosity over one fill,” he confirms.

Constant running at this maximum rate remains a big challenge for the offline processing and for the data analysts, who have to reconstruct each event in full (the nature of the particles, their energies, their trajectories, etc.). However, thanks to the new strategy and the huge amount of additional data it has brought in, the experiment is on its way to delivering very important physics results. “There are very interesting, but extremely rare decays of the B particle involving muons that are benefiting a lot from the current high-luminosity strategy,” says Pierluigi Campana, LHCb Spokesperson. “If everything goes as expected, by the end of this year we will have collected about 1 inverse femtobarn of integrated luminosity, which should enable us to present our results with unprecedented precision.”
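To put “one inverse femtobarn” in perspective, the expected number of events N for a process with cross-section σ is simply the cross-section multiplied by the integrated luminosity (the 1 pb cross-section below is a round number chosen for illustration, not a figure from the article):

    N = \sigma \int \mathcal{L}\,dt \qquad \Rightarrow \qquad N \approx 1\ \mathrm{pb} \times 1\ \mathrm{fb}^{-1} = 1000\ \text{events}

before accounting for detector acceptance and selection efficiency. This is why the jump in integrated luminosity translates directly into the precision of rare-decay measurements.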

The results that the LHCb collaboration has started to release concern the rate and other specific parameters of the decays of the Bs (a particle made of a bottom antiquark and a strange quark) and the Bd (a bottom antiquark plus a down quark). Some of these parameters have already been studied at Fermilab’s CDF and at the B-factories, and the current values show possible deviations from the Standard Model that make them worth studying in detail and with better precision. “One of the things we are looking into is the rate of the decay of the Bs into two muons. The decay is so rare that we can only expect to observe a handful of them per one billion Bs decays,” explains Pierluigi Campana. “From theory we know the value to expect within the Standard Model. Recently, CDF announced possible evidence for a higher value of this rate, but data presented at the EPS Conference in Grenoble by LHCb (and by CMS, although with less precision) make this possibility quite unlikely. For the moment we have to stick to the Standard Model.”

Later in the summer, the LHCb Collaboration also plans to finalise the analysis of the decay of the Bs into the J/ψ and φ particles, which is potentially very sensitive to new physics. “The Grenoble Conference was another triumph of the Standard Model,” concluded Pierluigi Campana, “as new phenomena had not (yet) been discovered. But we know precision physics has only just started at the LHC, and most probably the devil (i.e. new physics) will be in the details.”

 

Luminosity levelling: how does it work? 

In order to keep the luminosity stable over the whole duration of a fill in the LHC, the beams are artificially separated vertically (by about 80 microns) as they approach the LHCb collision point. From the data observed so far, such a separation does not produce any adverse beam–beam effects that could degrade the beam quality or the luminosity lifetime.
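The effect of the vertical offset can be sketched with a simple Gaussian-overlap formula (a minimal model assuming two identical, round Gaussian beams; the beam-size value mentioned below is an assumption for illustration, not from the article):

    \mathcal{L}(d) = \mathcal{L}_0 \exp\!\left(-\frac{d^{2}}{4\sigma_y^{2}}\right)

where \mathcal{L}_0 is the head-on luminosity, d is the vertical separation and \sigma_y is the vertical beam size at the collision point. With d of order 80 microns and \sigma_y of a few tens of microns, the overlap factor reduces the luminosity substantially, and small adjustments of d are enough to hold \mathcal{L} at the desired level as the beams evolve during a fill.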

Luminosity levelling automatically maximises the luminosity LHCb can safely take by monitoring the main parameters of the whole experiment: the read-out performance, the collision rate, the bandwidth, the event size, the average number of interactions per crossing, and the luminosity limit that is considered safe for the detector compared with the luminosity the LHC can deliver. This luminosity limit is being raised progressively this year as more is learned about the detector’s performance.
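The control logic can be illustrated with a small feedback loop. The sketch below, in Python, is purely illustrative: the function name, the parameter values and the step size are assumptions, not LHCb’s actual control software. It simply shows how a measured luminosity, a target value and the Gaussian-overlap model from the box above can combine to give the next beam separation.

    import math

    def next_separation(measured_lumi, target_lumi, current_sep_um,
                        sigma_y_um=40.0, max_step_um=2.0):
        """Suggest a new vertical beam separation (microns) that moves the
        measured luminosity towards the target.

        Illustrative only: uses the Gaussian-overlap model
        L(d) = L0 * exp(-d^2 / (4 * sigma_y^2)); all values are assumptions.
        """
        # Infer the head-on luminosity from the current separation.
        overlap = math.exp(-current_sep_um ** 2 / (4.0 * sigma_y_um ** 2))
        head_on_lumi = measured_lumi / overlap

        # Separation that would deliver exactly the target luminosity.
        ratio = min(target_lumi / head_on_lumi, 1.0)
        ideal_sep_um = (math.sqrt(-4.0 * sigma_y_um ** 2 * math.log(ratio))
                        if ratio < 1.0 else 0.0)

        # Approach the ideal value in small steps so the beams are never jolted.
        step = max(-max_step_um, min(max_step_um, ideal_sep_um - current_sep_um))
        return current_sep_um + step

Run every few seconds, such a loop would open the separation when the LHC delivers more luminosity than the detector’s safe limit and close it again as the beam intensities decay, which is the self-adjusting behaviour described above.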

 

by CERN Bulletin