New LHC Records and bunch splitting

Yesterday the Large Hadron Collider produced a record run lasting 20 hours and delivering a record 62/pb. The run ended with a programmed dump, the first in a while. They then turned the machine around in just four hours for a new run with 1380 bunches for the first time. This is the maximum number of bunches that will be used for physics runs this year.

At this significant point in the LHC commissioning process it is worth reflecting on just how much of an achievement it is to run with so many bunches. For comparison, the Tevatron runs with just 36 bunches per beam. Of course the LHC is bigger, so it is possible to fit more bunches in, but it is only four and a bit times bigger. To get 1380 bunches in, they have to pack them much closer together. In the Tevatron the bunches run about 175 meters apart on average, but in the LHC they are on average only about 19 meters apart.
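The spacing figures above can be sanity-checked with a few lines of arithmetic. This is just an illustrative sketch: the circumferences used (LHC about 26,659 m, Tevatron about 6,283 m) are approximate published values, and it assumes the bunches are spread evenly around the ring.

```python
# Sanity check of the average bunch spacing quoted above.
# Circumferences are approximate: LHC ~26,659 m, Tevatron ~6,283 m.

def average_spacing(circumference_m, bunches):
    """Average distance between bunches if they were spread evenly around the ring."""
    return circumference_m / bunches

lhc_spacing = average_spacing(26659, 1380)   # about 19.3 m
tev_spacing = average_spacing(6283, 36)      # about 174.5 m

print(f"LHC:      {lhc_spacing:.1f} m between bunches")
print(f"Tevatron: {tev_spacing:.1f} m between bunches")
```

In practice the bunches travel in trains with gaps between them, so the in-train spacing is tighter than this average: at the 50 ns bunch spacing used in these runs, neighbouring bunches within a train are about 15 meters apart.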

This improvement in the design of the LHC over previous hadron colliders is just as important as the increase in energy. Hadron collisions are messy processes, and to get the full information out the physicists will need to look for very rare events with clear signals of unusual processes. By the time the LHC has run its full length it will have collected thousands of inverse femtobarns of data to explore the Higgs sector in the best possible detail. To achieve this it has to run with lots of bunches and with high quality, low emittance beams.
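To see why so much data is needed, the basic bookkeeping is that the expected number of events of a given type is the integrated luminosity multiplied by the cross-section for that process. Here is a minimal sketch; the 20 pb cross-section and the 0.1% efficiency are illustrative round numbers of my own, not official figures.

```python
# Expected event count: N = integrated luminosity x cross-section.
# The 20 pb cross-section below is an illustrative round number, not an official value.

PB_PER_INV_FB = 1000  # 1/fb = 1000/pb, so luminosity in /fb times cross-section in pb

def expected_events(int_lumi_inv_fb, cross_section_pb, efficiency=1.0):
    """Events expected for a process, optionally scaled by a selection efficiency."""
    return int_lumi_inv_fb * PB_PER_INV_FB * cross_section_pb * efficiency

# A ~20 pb process seen with 1/fb of data gives ~20,000 produced events,
# but with a rare decay mode and tight selection cuts only a handful survive.
print(expected_events(1.0, 20.0))
print(expected_events(1.0, 20.0, efficiency=0.001))
```

This is why delivered luminosity, not just energy, decides what the experiments can say about rare processes.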

You can’t just inject individual proton bunches into an accelerator very close together, because there is a limit to how fast the kicker magnets can change as they inject the bunches into the ring. As the energy increases the magnets have to produce more powerful fields and it gets harder to pack the bunches together. The injection process uses a series of increasingly powerful rings to put together the bunches in trains (see my earlier post about the injection chain). The early stages have lower energy so the bunches can be slotted closer, but the rings are smaller and fill up quickly. You can build up as you go along, but this alone is not enough to get the bunches as close together as they need to be.

The trick that made this possible was invented in the 1990s using the PS accelerator at CERN, which is now part of the injection chain for the LHC. They first considered debunching the protons in the ring so that new, smaller bunches could then be formed, but they found that this ruined the good emittance properties of the beams. The solution was to split the bunches by starting with a low-frequency RF signal in the ring and gradually boosting one of its harmonics to a higher amplitude. If you raise the second harmonic each bunch splits in two, and if you raise the third harmonic each bunch splits in three. In the PS they start with 6 big bunches. These are first split in three to provide 18 bunches. The bunches are then accelerated to a higher energy before being further split in two. The 36-bunch trains are moved to the larger SPS ring and gathered into 144-bunch trains, which are further accelerated before being injected into the main LHC ring. Later, possibly next year, they will split the bunches one more time in the PS to double the number of bunches again.
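The bunch bookkeeping described above can be sketched in a few lines of arithmetic, using only the numbers quoted in the post:

```python
# Sketch of the bunch-splitting arithmetic described above (numbers from the post).

def split(bunches, factor):
    """Raising the factor-th RF harmonic splits each bunch into that many smaller ones."""
    return bunches * factor

ps = 6                  # the PS starts with 6 big bunches
ps = split(ps, 3)       # triple split: 6 -> 18
ps = split(ps, 2)       # double split after acceleration: 18 -> 36
assert ps == 36         # one PS train delivered to the SPS

sps_train = 4 * ps      # the SPS gathers four PS trains into one 144-bunch train
assert sps_train == 144

doubled = split(ps, 2)  # the planned extra split would double the count again
assert doubled == 72
```

The splitting factors multiply, which is why a modest 6-bunch fill in the PS ends up as densely packed 144-bunch trains by the time it reaches the LHC.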

I’ve no idea who worked out how to do this bunch splitting but they are just some of the many unsung heroes of the LHC.

18 Responses to New LHC Records and bunch splitting

  1. […] of protons, the number 144bpi; Philip Gibbs offers a very good explanation in his post “New LHC Records and bunch splitting,” viXra log, June 28, 2011 (the “bunch splitting” in the title of the post). […]

  2. Tony Smith says:

    Now that the LHC has collected over 1/fb,
    how long will it take to analyze the data:

    1 – for WW Higgs search (high priority and well rehearsed)

    2 – for the CDF Wjj bump study (not as high a priority as the Higgs, maybe somewhat more complex to set up, and probably not so well rehearsed).


    • Philip Gibbs says:

      There are lots of talks scheduled for Europhysics-HEP at the end of July, including separate and combined Higgs channels from ATLAS and CMS. I think they will use 1/fb or thereabouts, but I can’t be certain.

      I am not sure that they can make much contribution to Wjj with this amount of data but both CDF and D0 will report on it again at Europhysics-HEP. It will be interesting to see if their position has changed.

      • TonySmith says:

        Talks tell me when announcements might be made,
        but would the analysis of important, well-rehearsed things like the WW Higgs search actually be finished well before then,

        at least before Bastille Day (14 July), so that the people at the mini-Chamonix (15 July) would have a pretty good idea of whether or not there is evidence for the SM Higgs,
        and if so be able to formulate an efficient plan for the rest of 2011 for the LHC to study in detail any SM Higgs for which evidence might have been found in the first 1/fb?

        What are the chances of any such results getting to the public before Europhysics-HEP ?


      • Philip Gibbs says:

        I don’t know much about what might go on internally, but my sense of how the mini-Chamonix (16th July of course) is likely to run is that it will not be influenced by any early physics results. They will just be considering how the many technical problems can be fixed or improved, and whether further increases in luminosity by raising intensity or reducing emittance are possible and wise at this stage. They may also think about useful Machine Development studies they can do and when they can fit in Totem runs.

        It has always been the plan for this year to collect as much luminosity as possible without taking any undue risks. For next year the plan will be the same, except they may increase energy if certain tests go well. They did change the plan when they saw that an early Higgs discovery was possible, but they can’t easily change it again at this late stage.

        It is fun to wonder how any important news will emerge. Will it be a press release, a low-key seminar, a conference announcement, a rumour? I have no idea.

  3. algernon says:

    I believe fill #1900 and #1901 marked the first time their “last 10 fills efficiency” graph went over 50% time in stable beams?

    Anyway, wow, 108/pb delivered to ATLAS in less than 40 hours… too bad they have to enter MD mode for a while 😉

    • Philip Gibbs says:

      It will be interesting to see what they are doing for MD, though.

      • algernon says:

        I wonder whether MD will help them even out the peak lumi they’re getting. They did 2 runs at 1236b with 1265/µb/s peak (the first and last, ironically), plus a number of runs in the 1150-1170/µb/s region, which is really a lot worse.
        I’m not sure whether they tweaked the intensity during the runs, but it seemed to always be around 118-120 billion.

        And their first attempt at 1380b gave “just” 1248/µb/s, which is consistent only with the worst results at 1236b, and didn’t even set a record.
        Getting the smoothest possible injection so they can go consistently over 1300/µb/s at 1380b seems the highest priority right now (before touching more fundamental beam parameters such as intensity and squeeze), don’t you think?

      • Philip Gibbs says:

        Sorting out those things is what they do all the time. MD tends to be for more forward looking developments.

        The intensity for the 1380 run was kept low because of the earlier booster problems. There were also some losses. I’m sure it will improve after the technical stop.

        They do sometimes mention the possibility of increasing intensity or lowering emittance as if they are quite keen to try it. This suggests it is seen as a relatively easy step, independent of other problems, but I suspect they will only try it after a good series of clean runs.

  4. Clara says:

    Guys, what is “MD mode”?

  5. Joao says:

    When reading the MD reports on the latest news page, they seemed pretty happy about how the 25ns injection tests had worked. Is this a sign of them changing their plans again and trying to go over the 1380 bunches “limit” they had set for 2011?

    • Philip Gibbs says:

      There is an ongoing plan which will require further tests during the remaining two MD slots this year. If it works well there is a chance that they will use 25ns next year. It can’t really be done this year.

      I am not so sure about how the ATS tests will go but I think it must be on a similar schedule. That could also give a much better squeeze and a big luminosity increase for 2012 if it works out.

      I think the only things that have a chance of being used this year are the higher intensity bunches and the lower emittance. These can be used without any major new setup.

      • carla says:

        As a guess, their aim will be to collect at least 1.5 /fb over the next 6 week slot, and 2 /fb over the final 8 weeks giving around 4.5/fb in total which they should easily do, given they collected around 1/fb over the last 4 weeks while still increasing bunches and overcoming commissioning problems.

        How about doing a poll on how much data will be collected over the next 6 week slot?

        1, 1.5, 2, 2.5, 3/fb even?

        No doubt you’ll write up an interesting article about the conclusions of the MD phase, what it means for raising luminosity etc.

  6. Kevin says:

    They’ve just been able to squeeze the beam to 0.30 m, which is a 5-fold increase in luminosity; with a little more bunch intensity, that makes something like, ha, 10^34 cm^-2 s^-1. So in the end, your supposedly over-optimistic prediction seems to have been over-pessimistic: they could have 15/fb at the end of 2011 and another 30/fb or more in 2012 🙂

    • Philip Gibbs says:

      The ATS squeeze is an impressive trick. I think they are aiming to use these things next year and will just aim for about 5/fb this year, working on improving the percentage of time in stable beams. A small increase in intensity might be used if the protection systems are OK for it. We might get more details of the plans from the mini-Chamonix if that is public.

  7. Bill K says:

    It is hard to keep up with all these acronyms! I found something that explains what ATS is. Now what is “90 m physics”?

    • Philip Gibbs says:

      The 90 m physics is where they do the opposite of squeezing, increasing beta to 90 meters. This is a requirement for the TOTEM and ALFA experiments.
