New LHC Records and bunch splitting

June 28, 2011

Yesterday the Large Hadron Collider produced a record run lasting 20 hours and delivering a record 62/pb. The run ended with a programmed dump for the first time in a while. They then turned the machine round in just four hours for a new run with 1380 bunches for the first time. This is the maximum bunch number that will be used for physics runs this year.

At this significant point in the LHC commissioning process it is worth reflecting just how much of an achievement it is to run with so many bunches. For comparison, the Tevatron runs with just 36 bunches per beam. Of course the LHC is bigger so it is possible to get more bunches in, but it is only four and a bit times bigger. To get 1380 bunches in they have to pack them much closer together. In the Tevatron the bunches run about 175 meters apart on average but in the LHC they are on average 20 meters apart.
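The spacing figures can be checked with a quick back-of-the-envelope calculation. Here is a sketch in Python, assuming the published ring circumferences (roughly 26,659 m for the LHC and 6,283 m for the Tevatron; those two values are my assumption, not from the post):

```python
# Rough average bunch spacing from circumference / bunch count.
lhc_circumference_m = 26_659       # assumed LHC circumference
tevatron_circumference_m = 6_283   # assumed Tevatron circumference

lhc_spacing = lhc_circumference_m / 1380       # ~19 m between bunches
tevatron_spacing = tevatron_circumference_m / 36  # ~175 m between bunches
size_ratio = lhc_circumference_m / tevatron_circumference_m  # ~4.2x

print(f"LHC spacing: {lhc_spacing:.0f} m")
print(f"Tevatron spacing: {tevatron_spacing:.0f} m")
print(f"Ring size ratio: {size_ratio:.1f}x")
```

So the LHC is only about four times the size but packs its bunches roughly nine times closer together.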

This improvement in the design of the LHC over previous hadron colliders is just as important as the increase in energy. Hadron collisions are messy processes, and to get the full information out the physicists will need to look for very rare events with clear signals of unusual processes. By the time the LHC has run its full length it will have collected thousands of inverse femtobarns of data to explore the Higgs sector in the best possible detail. To achieve this it has to run with lots of bunches and with high-quality, low-emittance beams.

You can’t just inject individual proton bunches into an accelerator very close together because there is a limit to how fast the kicker magnets can change as they inject the bunches into the ring. As the energy increases the magnets have to produce more powerful fields and it gets harder to pack the bunches together. The injection process uses a series of increasingly powerful rings to put together the bunches in trains (see my earlier post about the injection chain). The early stages have lower energy so the bunches can be slotted closer, but the rings are smaller and fill up quickly. You can build up as you go along but this is not enough to get the bunches as close together as they need them.

The trick that made this possible was invented in the 1990s using the PS accelerator at CERN, which is now part of the injection chain for the LHC. They first considered a procedure of debunching the protons in the ring so that they could then reform new smaller bunches, but they found that this ruined the good emittance properties of the beams. The solution was to split the bunches by starting with a low-frequency RF signal in the ring and gradually boosting one of its harmonics to higher amplitude. If you raise the second harmonic the bunches split in two, and if you raise the third harmonic they split in three. In the PS they start with 6 big bunches. These are first split in three to provide 18 bunches. The bunches are then accelerated to a higher energy before being further split in two. The 36-bunch trains are moved to the larger SPS ring and gathered into 144-bunch trains, which are further accelerated before being injected into the main LHC ring. Later, possibly next year, they will split the bunches one more time in the PS to double the number of bunches again.
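The counting in the splitting scheme above can be sketched in a few lines of Python. This is just the bookkeeping of bunch numbers as described, not a simulation of the RF manipulation itself:

```python
# Bunch-count bookkeeping for the PS/SPS splitting scheme described above.
bunches = 6      # the PS starts with 6 big bunches
bunches *= 3     # raise the third RF harmonic: triple split -> 18 bunches
bunches *= 2     # raise the second RF harmonic: double split -> 36 bunches
assert bunches == 36

# The SPS gathers four 36-bunch trains into one 144-bunch train for the LHC.
train = 4 * bunches
assert train == 144
print(bunches, train)
```

One further double split in the PS, as mentioned for next year, would take the chain from 6 to 72 bunches per PS cycle.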

I’ve no idea who worked out how to do this bunch splitting but they are just some of the many unsung heroes of the LHC.

Strings 2011

June 27, 2011

The Strings 2011 conference opened today in Sweden. You can watch videos of the talks live or recorded straight after, starting with the introduction by David Gross.

Gross asked the usual questions starting with the number one “What is String Theory?” and ending with number 11 “What will we learn from the LHC?”

He is a bit disappointed that the LHC has not found anything surprising yet, but he still holds high hopes for SUSY.

There have been promising new results in trying to solve large-N SUSY gauge theory, which has beautiful mathematics: twistors and polytopes in the Grassmannian, for example. Since this theory is dual to string theory, Gross thinks these discoveries could tell us about the fundamentals of string theory.

He goes on to mention entropic gravity, which he said was also promising, but he had a little smirk on his face when he said it and also implied that it is ambitious. There will be a talk from Verlinde later in the conference.

Apparently it is unfortunate that we seem to live in de Sitter space; the theories work much better in anti-de Sitter space. There are lots of questions, but the most important product of knowledge is ignorance; then again, it would be nice to have some answers, he said at the end.

Coming soon, Europhysics HEP 2011

June 25, 2011

In just four weeks the Europhysics HEP conference for 2011 will begin. This is a biennial meeting, alternating each year with the ICHEP conference. This year it is being held in Grenoble and promises to be the biggest HEP event of the year, with several hundred talks and poster sessions.

Abstracts of many talks have already been posted covering a wide diversity of experimental and theoretical subjects, but the main interest will be in the CMS and ATLAS searches. At the recent PLHC conference there were about a dozen reports using new LHC data: a couple from CMS and about ten from ATLAS. At that time about 200/pb of data was available. Although that was only a few weeks ago, the LHC has now delivered much more, and at Europhysics HEP they should be able to show the results of searches using 1000/pb. That is enough to say something significant about new physics. So far there are 13 abstracts posted for ATLAS that promise to use 2011 data and 20 for CMS.

These talks include a number of SUSY searches. If these find no new physics, much of the most likely phase space for supersymmetry will be wiped out. SUSY in some form will still be possible, but it will have to be different from what the phenomenologists have predicted as the most likely scenarios. It will start to look like the last 30 years of research in hep-ph have been on the wrong track. If they do see hints of something, they will quickly be able to narrow down the available theories and tell us something very useful about the laws of physics.

Other presentations will look at the Higgs searches. With the 1/fb of luminosity likely to be included it should be possible to exclude the Higgs for masses above 135 GeV on the assumption that the standard model is the only physics in play. However, the standard model predicts vacuum instabilities for lighter Higgs masses. If this is the way it plays out we will know that something new must be there even if we don’t yet see it. The other possibility is that a promising signal for the Higgs will be seen at a higher mass. Either way there will be something to remember about Europhysics HEP 2011.

With just the 2010 data of 40/pb each, CMS and ATLAS have already produced hundreds of papers looking at different possible decay channels and search scenarios. As the amount of data available increases the possibility to look for rarer events in different channels goes up. With 1000/pb there will be a lot of things they can explore. Even with 6000 physicists between them divided up into small groups they will have to prioritize what they do for this conference. If you are one of those 6000 you should be too busy to be reading this so get back to work, and don’t expect any rest for a long time.

Tough Week for the LHC

June 24, 2011

Today the Large Hadron Collider has taken another step up in luminosity by increasing the number of proton bunches per beam from 1092 to 1236. The first run at this new intensity equaled the previous luminosity record. They may beat it in subsequent runs by pushing up the bunch intensity. One more step up is required to reach this year's maximum possible bunch count of 1380 bunches per beam. It may be too late to reach that step before the next technical stop.

The advance comes at the end of a tough week with only one run during a period of five days. At least that one run was itself a record with integrated luminosity for a single run of 47.7/pb in ATLAS. For CMS the total delivered was 46.3/pb but they issued a special note to say that they had also recorded 44.3/pb, more than the entire amount recorded during 2010.

The main reason for the delay this week was problems with cryogenics caused by clogged oil filters, possibly worsened by a lightning strike and/or an industrial strike. When any one of the 8 major cryogenic plants fails it can easily take two days to get it fixed and return the superconducting magnets to their working temperature of 1.9 kelvin. There were two such outages this week, with the record run in between.

Update 27-Jun-2011: The situation has improved in the last few days culminating in a run today lasting about 20 hours that delivered a record 62/pb. The luminosity was still above half its peak value at the end of the run demonstrating just how good the luminosity lifetimes are. The next run will attempt 1380 bunches, the maximum possible with 50ns spacing. They have just one day left before machine development time takes over, followed by a technical stop.

LHC Status Report

June 18, 2011

Last week we celebrated 1 inverse femtobarn (1/fb) of integrated luminosity delivered to ATLAS and CMS. Of that data ATLAS has recorded about 95% and CMS about 92%, so with a little more added ATLAS have now recorded over 1/fb.

The milestones have been celebrated with a CERN press release.

Last week the LHC Control group held an open meeting to report on progress of the beams and experiments. Slides and videos are available for some of the talks, including the Machine Status Report by Steve Myers, who revealed that during the last Machine Development period the bunch intensity was tested up to 195 billion protons per bunch, going well beyond the 170 billion "ultimate" intensity limit. The intensity currently in use is about 120 billion, but there is hope that this may be increased later in the year.

Although the LHC has delivered 1/fb in record time as a result of its better than expected early performance, there is some frustration that technical problems are holding it back from achieving even better results. A string of difficulties has been making it hard to get the beams circulating while other glitches cause the beams to be dumped early. The time in stable beams has been about 36% since they started running with 1092 bunches and it should be possible to do better than that.

Unidentified Falling Objects

In his talk Myers gave some more information about UFOs. These are mysterious rapid beam-loss events thought to be caused by particles falling into the beam path. They can trigger the protection mechanisms to dump the beams. Studies have shown that they most often occur at the injection points, almost always shortly after injection, causing problems before they get to stable beams. Surprisingly, their frequency is not increasing with further intensity advances. There were 110 of these UFO events last year and already 5000 this year, but only the strongest cases can trigger a beam dump.

An extensive report on UFOs can be found here.

RF Power Couplers

Another series of problems concerns the RF components. The couplers can take 200 kW of power and currently are being loaded up to 190 kW. This figure increases with beam intensity. If one of the ceramic couplers breaks it would put the LHC out of action for five to six weeks. These and other concerns have been preventing them from raising the bunch number to the next step of 1236 bunches. There is also a special report on the RF power issues and how they have been addressed. With the situation coming back under control it is hoped that the next luminosity step can still be taken this weekend.


In order to decide how to proceed for the rest of the year there will be a "mini-Chamonix" meeting on 15 July. There we may hear more about addressing these and other problems, as well as prospects for any further luminosity increases, e.g. by raising the bunch intensity.

Status Reports of the Experiments

At the LPCC meeting there were also reports from the individual LHC experiments. CMS has produced 80 papers using LHC data while ATLAS has about 190 and there are also good initial results from LHCb and ALICE. With the luminosity increasing at faster than expected rates there has been more pileup of events in the detector than anticipated. ATLAS reports an average of 6 events for each bunch crossing. There is significant impact on the calorimeter reconstruction resulting in increased systematic uncertainties in the analysis. Low transverse momentum jet events are the worst affected, but it is a small price to pay for so much extra data. Pileup will get worse if the bunch intensity is raised further.

Summary of Physics Results

Most of the physics results published so far have used just the 40/pb of data collected in 2010, with just a handful using up to 240/pb. A selective summary of results from ATLAS is shown on this slide (click to see full-sized). Within a few weeks we will have many more results, including some using the 1/fb now collected. The EPS-HEP conference at the end of July is the next major opportunity for physics presentations.

2000 papers at viXra

June 16, 2011

Today it is the turn of viXra to pass an important milestone, with over 2000 papers now in the e-print archive. The total has been built up since the archive started a little less than two years ago. When I started this project I did not imagine that so many people would make use of the service, so I would like to take this opportunity to thank the 600 authors who have supported us by submitting their work. This is also a good moment to thank Huping Hu and Jonathan Dickau, who have kindly provided mirror servers for the site and have helped out with submission administration to keep the service running at times when I am away. Their backup and support give me confidence to say that viXra will survive as a long-term repository. I am also very grateful to those who have made generous donations to help cover the costs of running the server.

Despite the unqualified success of viXra I still get a sense that there are a lot more people out there who could benefit from using it. Some of them simply don't know about it, so this blog is one thing I do to publicize its existence. viXra is primarily here for scientists who do not have access to a qualified endorser for arXiv, usually because they are non-professionals working independently of any academic institution. Many of them have left professional research to follow a different career but retain an interest and continue their research in their spare time. Of course viXra welcomes submissions from anyone, and we do even get some contributions from people with .edu addresses, along with the occasional high-school student and unqualified amateurs who like to think for themselves. To encourage more of these people to join in, here are answers to a couple of the questions people ask.

Does submission to viXra give you as much exposure as other archives such as arXiv?

It would be a plain lie to say that viXra is as well known or as well browsed as arXiv, yet the stats indicate that papers on viXra get just as many hits as they do on arXiv. Let's look at the numbers. According to the arXiv usage stats they typically get something like 800,000 web connections each day. From the logs of viXra our corresponding figure is around 4000 per day, just 0.5 percent of the arXiv number. But of course we have far fewer papers. arXiv now receives about 6000 new papers per month while viXra receives just 80 per month, or about 1.3% of their number, so relative to new papers submitted arXiv is getting about 2.7 times as many hits as viXra. Then again, arXiv has been around a lot longer and many accesses come from searches on its back-catalog. arXiv has 680,000 papers compared to viXra's total of 2000, which is just 0.3%. So relative to the total archive we actually get about 70% more hits than they do. Real usage is a mixture of looking at old and new papers, so within the uncertainties of this crude analysis I think it is fair to say that a paper submitted to viXra gets a similar number of hits as one on arXiv.
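For anyone who wants to check the arithmetic, the same crude comparison can be reproduced in a few lines of Python, using only the rough figures quoted above (not official statistics):

```python
# Crude hits-per-paper comparison using the figures quoted in the post.
arxiv_hits_per_day, vixra_hits_per_day = 800_000, 4_000
arxiv_new_per_month, vixra_new_per_month = 6_000, 80
arxiv_total_papers, vixra_total_papers = 680_000, 2_000

# Hits normalized by new submissions: arXiv leads, roughly 2.7x
per_new = (arxiv_hits_per_day / arxiv_new_per_month) / \
          (vixra_hits_per_day / vixra_new_per_month)

# Hits normalized by total archive size: viXra leads, roughly 1.7x
per_total = (vixra_hits_per_day / vixra_total_papers) / \
            (arxiv_hits_per_day / arxiv_total_papers)

print(f"arXiv hits per new paper: {per_new:.1f}x viXra's")
print(f"viXra hits per archived paper: {per_total:.2f}x arXiv's")
```

The two normalizations pull in opposite directions, which is why the honest conclusion is only that the per-paper exposure is broadly similar.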

How can this be? In fact most hits do not come from people browsing new submissions. They come from people who search on Google and other search engines for keywords of interest to them. Since viXra is just as well indexed on Google as arXiv is, it actually gets just as many hits per paper.

Will submission to viXra damage your credibility because it is full of "crackpots" who cannot make it into arXiv?

It is really important to understand that it is not the purpose of an e-print archive to give you or your work credibility. Such recognition comes only from other sources, such as acceptance in a peer-reviewed journal, use or citations of your work, empirical verification of any predictions, or the occasional Nobel Prize. All that you get from an archive such as viXra or arXiv is a long-term repository and a fixed link to your work so that people can find it and make links to it that will stay in place. It also provides independently verified timestamps so that others can check who reported any idea first in questions of priority.

Other scientists find papers in an archive mostly by searching on Google or by being referred from somewhere else. Once they find it they are not bothered about where it is. If they are sufficiently interested in the subject to have come across it they will be qualified to judge it on its own merits.

As to the accusation that viXra is full of crank papers, I would be a fool to claim that the work of non-professionals who do not have access to arXiv is going to be of the same quality as the work of professionals who do, at least on average. But there are plenty of people who know their subject and have interesting work to publish who nevertheless do not have access to a qualified endorser as required by arXiv. Conversely, access to such an endorser does not mean someone's work is good either. There are good papers on viXra just as there are bad ones on arXiv.

On a lighter note, enjoy this video, (warning: bad words!)

Lunar Eclipse in Progress

June 15, 2011

This is how the Moon looks near maximum eclipse as seen on Google/Slooh. Still a little time left to view it. Hope some of you have clear night skies unlike us!

In case you haven’t noticed you can see the eclipse live on the main google search page.

The intense red colour of the moon is due to light being refracted through the Earth’s atmosphere which scatters the blue light leaving just the red end of the spectrum to bathe the moon in a warm glow. The colour is said to have been deepened by dust from recent volcanoes.

In 2009 Japan's Kaguya lunar orbiter took some spectacular pictures of a similar eclipse from lunar orbit. The Earth passes in front of the Sun, making it look like a solar eclipse, except that the Earth is bigger than the Moon so the Sun disappears for much longer. The atmosphere of the Earth continues to be illuminated by the Sun from behind, like a continuous ring of twilight. In this sequence the eclipse was rising above the Moon's surface, which blocked the beginning of the eclipse as seen from the orbiter. In the final frame the Sun emerges again from behind the Earth.