LHC prospects for 2012

The Large Hadron Collider continues its remarkable 2011 run with the 4/fb mark passed today for all-time delivered luminosity (depending on whose stats you believe). With four more weeks of proton physics left this year, some of them reserved for TOTEM, we can expect the final count to reach 5/fb if the present run efficiency is maintained.

Some of you will want to remind me that earlier in the year I optimistically predicted a total of 10/fb. At that time the expected peak luminosity for this year was around 1.7/nb/s, but by pushing bunch intensity, emittance and squeeze beyond design limits they have actually doubled this to 3.3/nb/s. The reasons this did not lead to even more integrated luminosity were (A) they took a bit longer than I expected to ramp up to maximum and (B) the running efficiency has not been quite as good as it could have been.
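For the curious, the scaling behind those peak figures can be sketched with the standard single-ring luminosity formula. This is only a back-of-the-envelope check; the machine parameters below are illustrative late-2011 values I am assuming, not official settings.

```python
import math

def luminosity(n_bunches, protons_per_bunch, f_rev, gamma, emittance_n, beta_star, F=1.0):
    """L = n_b * N^2 * f_rev * gamma / (4*pi*eps_n*beta*) * F.
    With lengths in metres this gives m^-2 s^-1; convert to cm^-2 s^-1."""
    L = n_bunches * protons_per_bunch**2 * f_rev * gamma \
        / (4 * math.pi * emittance_n * beta_star) * F
    return L * 1e-4  # m^-2 s^-1 -> cm^-2 s^-1

# Assumed late-2011 parameters (illustrative, not official):
L = luminosity(
    n_bunches=1380,            # bunches per beam at 50ns spacing
    protons_per_bunch=1.45e11, # pushed beyond the 1.15e11 design value
    f_rev=11245.0,             # LHC revolution frequency in Hz
    gamma=3730,                # relativistic gamma for 3.5 TeV protons
    emittance_n=2.4e-6,        # normalised emittance in metres
    beta_star=1.0,             # beta* at the interaction point in metres
    F=0.8,                     # geometric loss factor from the crossing angle
)
print(f"L ~ {L / 1e33:.1f}/nb/s")  # 1e33 cm^-2 s^-1 = 1/nb/s
```

With these assumed inputs the sketch gives a bit over 3/nb/s, in the right ballpark for the peak luminosity quoted above.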

Nevertheless the run for this year still counts as a humongous success given that the original target was just 1/fb for the whole year. Collecting five times that amount means that they now have enough data to catch a glimpse of the long-sought-after God particle, whose mass is now rumored to be 119 GeV while his partner Goddess particle sits nearby at 140 GeV. The next conference where new information may be released is Hadron Collider Physics 2011, due to open in Paris on 14th November, just three weeks after ATLAS and CMS stop recording data for the year.

What Run Parameters for 2012?

The next question is how much data they can collect during 2012. A previous tentative schedule that would have delayed the start of the run to allow more work during the winter shutdown has apparently been ditched. That is probably a good thing because with no beyond-standard-model physics emerging at 7 TeV yet they will now want to get to the design energy of 14 TeV as soon as possible. That will still not happen until 2014. Meanwhile the draft schedule for 2012 looks like this.

Possible scenarios for the 2012 running parameters according to Mike Lamont are as follows:

A couple of crucial questions not yet decided are what energy to run at and whether to move to 25ns bunch spacing or stick with 50ns. The ability to go to 4 TeV per beam, for a centre-of-mass energy of 8 TeV instead of 7 TeV, can only be determined during the winter shutdown using thermal amplifier tests to check the splices. A higher energy of 9 TeV is not ruled out but will probably be deemed too risky. Let’s assume they will run at 8 TeV.

The next question is about the bunch spacing. In principle moving to 25ns doubles the number of proton bunches that can be fitted in, which would double the luminosity, but you can see from the table above that this is not quite the case. Firstly, the emittance at 25ns would not be so small because of limitations of the injector chain. Secondly, with more bunches they cannot push the bunch intensity quite so high because the extra load on the cryogenic systems would just be too much. In fact the projected luminosity may be less at 25ns. So far they have only run with 216 bunches at 25ns and it is not certain what the limitations may be. So why would they still consider 25ns?

The effects of Pile-up

The answer is pile-up, meaning the number of collision events that take place in each bunch crossing. This number, shown as “Peak mean mu” in the table, has already gone up to an average of 16. This is much more than the experiments expected to see at this stage of the game, and they have had to work hard to cope with it. There is a limit to how many sets of event data the onboard computers can register and save before the rate gets too high. To control this they need to adjust the trigger settings to catch just the most interesting events. They also have to work harder to reconstruct the collision vertices, which can be very close together.
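As a rough cross-check of that figure, the mean pile-up is just the event rate divided by the bunch-crossing rate: mu = L·sigma/(n_b·f_rev). The inelastic cross-section and bunch count below are my assumed values, not numbers from Lamont's table.

```python
# Mean pile-up per bunch crossing: mu = L * sigma_inel / (n_bunches * f_rev)
L_peak = 3.3e33      # peak luminosity in cm^-2 s^-1 (the 3.3/nb/s quoted above)
sigma_inel = 70e-27  # assumed ~70 mb inelastic pp cross-section, in cm^2
n_bunches = 1380     # assumed number of colliding bunches at 50ns spacing
f_rev = 11245.0      # LHC revolution frequency in Hz

mu = L_peak * sigma_inel / (n_bunches * f_rev)
print(f"mean pile-up ~ {mu:.0f} events per crossing")
```

With these round numbers the estimate lands close to the observed average of 16.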

More uncertainty is unavoidable as the pile-up increases, and this affects some analyses more than others. The worst hit are the ones that look for missing energy and momentum, a classic signal of particles leaving the experiment undetected. These could be ordinary neutrinos or massive particles responsible for dark matter that are as yet unknown to science. To tell the difference they need to account for all the missing energy and momentum by adding up the individual contributions from every particle that is detected and subtracting the total from the energy and momentum of the original particles in the collision. Given the precise energy and momentum of a particle you can work out its mass.

The ATLAS and CMS detectors have been designed with an exceptional ability to catch as many of these particles as possible and measure them accurately, but if two collision events are too close it becomes hard to be sure which event each particle came from, and the energy resolution suffers.

As well as searches for dark matter and supersymmetry, some channels used in the Higgs boson search are being hit by the high pile-up seen this year. The WW decay mode has been particularly affected because the W boson decays always produce neutrinos, which are not directly detected. The lower energy resolution means that this channel is not going to be so good for identifying the mass of the Higgs boson and may even mislead us into thinking it has been ruled out at its real mass. Luckily the Higgs at low mass can also decay into two hard photons or two Z bosons that decay into leptons. These are easily detected, and the higher luminosity means that they are seen more often. These are now the best channels to look for the Higgs at the LHC.

Overall I think the reduction of pile-up will be seen as worth the extra effort and risk of moving to 25ns so I am placing my bet on them aiming for that.

Yet more squeeze?

From Lamont’s table it can be seen that there is not much prospect of further major increases in luminosity next year. The LHC has already been pushed very close to its limits and will require some major work on the cryogenics, injectors and of course the splices before it can perform at energies and luminosities much beyond what we have seen this year. At best a small increase in energy and a 50% increase in luminosity can be expected for 2012.

But Lamont has been too shy to mention one further possibility. During the machine development periods this year they have tested a different way to squeeze the beams at the collision points using ATS optics. In this way they showed that it may be possible to get down to a beta* of 0.3m at the present energy. That means a further factor of 3 improvement in luminosity. It is a delicate and difficult process and I have no idea if they will really be able to make use of it next year, but at least there is some hope of further luminosity improvements without adding much strain to the cryogenic systems. We shall see.
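Since luminosity scales as 1/beta* with everything else held fixed, the potential gain from the ATS squeeze is easy to estimate. I am ignoring the crossing-angle geometric factor, which would eat into the gain at smaller beta*, and the 2011 value of beta* below is my assumption.

```python
beta_star_2011 = 1.0  # assumed 2011 squeeze, in metres
beta_star_ats = 0.3   # demonstrated with ATS optics during machine development

# At fixed bunch parameters, L is proportional to 1/beta*
gain = beta_star_2011 / beta_star_ats
print(f"potential luminosity gain ~ x{gain:.1f}")
```

That is where the "further factor of 3" comes from, before geometric losses are taken into account.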

Runtime efficiency

The remaining big unknown is the runtime efficiency achievable next year. The rapid push to high luminosities has taken its toll on the percentage of time that the LHC has been in stable beams. For the long term the strategy to maximise peak luminosity as soon as possible makes sense. By learning about the problems that come up now they can plan the work during the long shutdown in 2013 so that they have smoother running when they restart at full energy in 2014. For now this has meant running with a rather low efficiency, measured with a Hübner Factor of just 0.2.

In comparison to other colliders such as LEP and the Tevatron a figure of 0.2 is not unreasonable, but those machines were always limited to that amount by their design. The Tevatron takes a day to build up a store of anti-protons with enough intensity for a single run. In the meantime each store keeps running, but the luminosity decreases with a half life of just a few hours. This means that no matter how smoothly it runs, the Tevatron can never do much better than a Hübner Factor of around 0.2. The LHC avoids these limitations because it only uses protons, which can be extracted from a bottle of hydrogen on demand. LHC fills also have much better luminosity lifetimes, ranging from 10 hours at the start of a run to 30 hours as the luminosity decreases. In theory, if the LHC can be made to run smoothly without unwanted dumps and without delays while getting ready for the next fill, they could reach a Hübner Factor of over 0.5.
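A toy model makes the comparison concrete: take an exponentially decaying luminosity, integrate it over the fill, and divide by the peak luminosity times the total of fill plus turnaround time. The lifetimes and turnaround times below are illustrative guesses on my part, not measured values.

```python
import math

def hubner_factor(tau_h, fill_h, turnaround_h):
    """Toy model with L(t) = L0 * exp(-t / tau).
    Returns integrated luminosity over (L0 * total wall-clock time)."""
    integrated = tau_h * (1.0 - math.exp(-fill_h / tau_h))  # in units of L0 * hours
    return integrated / (fill_h + turnaround_h)

# Tevatron-like: short luminosity lifetime, ~a day to stack antiprotons
print(f"Tevatron-like: {hubner_factor(tau_h=6, fill_h=15, turnaround_h=10):.2f}")
# LHC-like: long lifetime, quick refill when everything runs smoothly
print(f"LHC-like:      {hubner_factor(tau_h=20, fill_h=12, turnaround_h=3):.2f}")
```

With these assumed inputs the first case comes out around 0.2 and the second above 0.5, matching the two limits described above.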

But for now we must accept the vagaries of the machine. It uses technologies such as cryogenics and superconducting magnets running at scales way beyond anything ever tried before. It will take time to iron out all the problems, so for next year we should still expect a factor of 0.2. Given 130 days of clear proton physics and luminosities around 4/nb/s this implies a total luminosity of about 1o/fb for 2012. Enough to identify the real Higgs and perhaps see some hints of more beyond, if nature is kind to us.
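The arithmetic behind that projection is simple enough to spell out, using the numbers quoted above (130 days, a peak around 4/nb/s, and a Hübner Factor of 0.2):

```python
days = 130       # clear proton physics days expected for 2012
peak_L = 4e33    # assumed peak luminosity in cm^-2 s^-1 (4/nb/s)
hubner = 0.2     # fraction of the ideal integrated luminosity achieved

integrated = peak_L * days * 86400 * hubner  # in cm^-2
print(f"projected 2012 total ~ {integrated / 1e39:.0f}/fb")  # 1/fb = 1e39 cm^-2
```

With these round numbers the sketch gives about 9/fb, consistent with the estimate of roughly 10/fb.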

34 Responses to LHC prospects for 2012

  1. Luboš Motl says:

    Welcome update, Phil.

    Concerning “total luminosity of about 1o/fb for 2012”, did you learn how to type using 19th century typewriters that type zero as “oh”? 😉

  2. carla says:

    I wonder how bothered they are about getting past the 5/fb milestone? “LHC collects over 5 times expected data to conclude a successful year” has more of a positive ring than “LHC collects over 4 times” or “collects nearly 5 times”.

    • Philip Gibbs says:

      I think they will be very keen to get to 5/fb but if it is a choice between that and doing the TOTEM runs then TOTEM should win.

  3. briv1016 says:

    If they increase the energy in 2012, how readily can they combine the ’10, ’11, and ’12 data? Will they essentially be starting at zero integrated luminosity?

    • Philip Gibbs says:

      Some things can be combined easily at different energies and some can’t. For example it is not very difficult to combine the Higgs exclusion plots from runs at different energy. Results from the Tevatron can be combined with results from the LHC if desired. It is not really much more difficult than combining runs at different luminosities or with different trigger settings.

      Different beam energies do not change the branching ratios for each particle’s decay modes. It just affects the production rates and the background. If they are trying to measure the branching ratios they can easily take this into account. They just have to calculate the background using all the different run parameters set at each stage and compare with what they observe. If they are trying to measure production rates they might be interested in the differences at different energies, so the change in energy could add to the information they obtain.

  4. Stéphane says:

    A quick question from a layman: why increasing the intensity of the beams has an impact on the cryogenics? Stray protons heating up the cold pipe? just wondering…
    And thanks a lot for your blog, it’s really a pleasure to read!

  5. Tony Smith says:

    If the LHC has about 4 weeks left in this run,
    if some of it is reserved for TOTEM
    if TOTEM requires significantly different settings from the Higgs search
    what is the schedule for TOTEM ?

    For example, if TOTEM gets a week,
    would it be reasonable to continue the Higgs search for 3 weeks and
    then reset the LHC for TOTEM and run for TOTEM the final week of the 4 ?

    Would that let the Higgs search end a week early and give the analysts at ATLAS and CMS an extra week to prepare for the Paris meeting on 14 November ?


    • anna v says:

      “Would that let the Higgs search end a week early and give the analysts at ATLAS and CMS an extra week to prepare for the Paris meeting on 14 November ? ”

      The collaborations are so large (3000) that the analysis can go on independently of the running of the experiment. Different physicists. The experimental contribution for the average physicist is in shifts at this stage, and the ones working on the analysis can easily be let off.

  6. Tony Smith says:

    My point was not about which people do what work,
    the analysts could not do the final analysis until the final data has been collected,
    so that
    if the final data were to be collected a week early,
    the final analysis could begin a week early.


    PS – By “final data” I mean all of the Higgs search data of the 2011 run.

    • Philip Gibbs says:

      It sounds like a reasonable plan to me but I have no idea if they will aim for that. They may prefer to take their time and analyse the results in more detail before letting us have them. We should at least get an LHC combo using lepton-photon data at that conference.

  7. ondra says:

    Nice summary Philip, while 5 fb-1 remains a realistic target, it’s probably not really a priority; CMS and ATLAS can gain rather good information about a 119 GeV Higgs even with 4 fb-1.
    Tony’s plan is of course a good one, but there is already a plan on Friday for a 90 m TOTEM/ALFA run and next week a 25 ns MD, plus some abort gap MD this week. So no 500 pb-1 this week.
    I think the target for next year is to run with possibly 25 ns, and I think better squeeze should be the target, to basically test all nominal LHC parameters except energy.
    The question also is what will be the gain from 10 or 15 fb-1; while you can find predictions for the Higgs, and with 15 fb-1 you basically enter precision-studies ground for the Higgs, what you will gain from 15 fb-1 for other BSM models is hard to find.

    • Philip Gibbs says:

      There are many searches they can do that only become viable with large amounts of data, so it is always worth adding more. The diphoton modes for example could reveal other scalar particles, whether Higgs-like or not. These might only be seen after larger amounts of data have been recorded. There could always be things that no theorist thought of, so it is important to get high stats on every possible channel to look for deviations from the standard model.

      • ondra says:

        That’s true Philip, I am just wondering what the real prospects are for LHC physics results in 2012 other than the Higgs, of course just in case further 2011 data shows a growing signal at 119 or 140 GeV.
        Is it better to analyze 10 fb-1 for ATLAS, CMS and 2 fb-1 for LHCb and have a fair decision to stop and go for the 13-14 TeV fix because there is nothing conclusive in the 7 TeV data?
        Btw I am sure LHCb precision studies and ALICE ion results will grow in importance if the Higgs is not there or unclear.
        We have to remember that the new European strategy for particle physics should be ready by the end of 2012, and without a clear seller it will be pretty tough to get financing for a new large-scale project such as a linear collider.

  8. Anders Lund says:

    An update without mentioning the problem of SEUs, which seems to be the largest problem for the LHC at the moment. It looks like SEUs have already cost several inverse femtobarns. Do you know anything new about efforts to battle these radiation-induced incidents?

  9. ondra says:

    Anders, they try to mitigate these problems with shielding or relocation and by installing more monitoring equipment; other major changes are prepared for the xmas stop. They also try to push the intensity high to find these weak places, together with the influence on cryo and rf. Take a look at http://indico.cern.ch/conferenceOtherViews.py?view=standard&confId=144632 R2E section. Fortunately it seems UFOs are not such a big problem.

  10. Clara says:


    Is it correct to expect that 5 inverse fb are completely sufficient to rule out any SM Higgs, in the case that it does not exist? And will we know about it in November already?

    • Philip Gibbs says:

      5/fb per experiment is enough to push the expected CL_s line below the 95% exclusion limit for all masses between 115 GeV and 600 GeV using the ATLAS+CMS combined results. However, even if the Higgs does not exist there can be fluctuations that mean the limit is reached either sooner or later. The low mass region would be the last to go and so far has fluctuations that suggest it will take longer. 10/fb is more certain to rule it out (if it does not exist); at 20/fb it would be almost certain.

      Having said that, I do not think that the Higgsless solution is the correct one, both because the theoretical models are weak and because current signals favor its existence. Similar amounts of data work for observation if it does exist, but it might take a little more to get the full discovery significance in the worst case.

      In November we only expect them to provide a combination for the summer data. They may decide not to release the data from the individual experiments with the complete 2011 data until they have looked at it more carefully and/or prepared an official combination. That could mean some weeks or even months of delay, during which time leaks would become increasingly likely.

    • Philip Gibbs says:

      The HCP program looks good, with talks where they could present new search data and combinations. There is no entry for the symposium on any indico server that I can see, so no chance to view abstracts and see what they are intending. It is possible that they may not put slides online at all so that we can’t see them, but any public results should appear as conference notes.

    • ondra says:

      Well, let’s hope we get something new in the HCP results.
      On the LHC update, the 90 m run for ALFA and TOTEM ended without any data recorded and a frustrating UPS problem.
      Btw more MDs in next week’s plan, but we still have 4 weeks of proton run to go, so still enough time to get to those recorded 5 fb-1.
      On Philip’s note, anyone should actually realize by now that theory people will bend their theory to fit any data, so even if you see a resonance it won’t be certain that it is the real SM Higgs. That’s also why the LHC plans to collect those 1000’s of fb-1.

  11. ondra says:

    ATLAS and CMS, including the last fill, should reach 4 fb-1 recorded luminosity today! Congratulations!

    • Philip Gibbs says:

      If you count 2010 data they are both already there

      • carla says:

        There was a prediction of data collected by the LHC over the years of its operation with something like 1/fb in the first year, 2/fb the second year, 3-5/fb the third etc. Do you know where it is?

  12. Phillip,

    25ns bunches will be tested today!

    • Daniel de Franca says:

      Now, stable beams at 25ns! But only 60 bunches at only 60/μb/s

    • Philip Gibbs says:

      That is the first time they collided with 25ns beams. If they have time they will increase the bunch numbers according to the Tuesday morning report.

      They also wanted to do some pile-up tests. Not sure exactly what that means but I imagine they could collide some high intensity bunches to give the experiments a taste of higher pile-ups. That way they can test algorithms to see how well they can reconstruct the events.

      These tests are important to give them the information they need to decide how best to run the collider next year.

  13. Luboš Motl says:

    You may vote on your predicted mass of the Higgs boson:

    • Philip Gibbs says:

      I voted for 140 GeV

      • Luboš Motl says:

        I didn’t – but at any rate, you may see that “no Higgs by end of 2012” is the clear frontrunner haha.

      • Philip Gibbs says:

        That’s just the contrarian view plus the misleading reports about the signal fading. I am not very confident about the 140 GeV bet; it is just the clearest signal in the summer results. I would be happier if there is one at 119 GeV, but the Tevatron bb results almost kill it. A heavier CP-odd Higgs would be nice too, but no sign of that yet, so I think the lighter one will be seen first.

      • Philip Gibbs says:

        I might have voted for two found at once if the option was included. So what did you vote for ? 🙂

      • JollyJoker says:

        115-119, obviously. He’s too optimistic to think the cross section * 2012 luminosity is too low to give 3 sigma.

      • I was the 1st to vote for no higgs. If we have FTL neutrinos, why not no Higgs so that we can have a lot of fun with a completely f*ck up W?
