Rencontres de Blois

May 31, 2011

Leonardo da Vinci spent many of his last years in the Loire Valley, and if you visit Amboise you may still see his ghost parading for the tourists. While he was there he contributed a number of beautiful designs to the châteaux of the region, including this spiral staircase at the Royal Château de Blois. There is an even more spectacular example in the nearby picturesque Château de Chambord, where he used a double helix design so that people can pass from one floor to another without meeting, but it is at Blois where this week we find a conference discussing some new findings in particle physics and cosmology.

Dark Matter in Cosmology

Yesterday included a talk by Joe Silk, professor of astronomy at Oxford, who summarised our understanding of the cold dark matter model, including dark energy and dark matter. Simulations of galactic cluster and structure formation favour a model where dark matter is made of weakly interacting cold particles with a mass in the range 10 GeV to 10 TeV. The lighter end seems at odds with accelerator experiments, which should have produced such particles unless they have a very low cross-section for production from interactions with standard model particles. However, some of the detectors built to directly detect passing WIMPs have found evidence for 7 GeV particles. They have even seen an annual variation in the signal consistent with a cosmological origin. Dark matter particles may also annihilate to produce a cosmic halo of positrons and electrons that space observatories such as PAMELA may have detected. From the slides Silk seems to have neglected to mention the results from XENON100, now in theory the most sensitive dark matter detector, which is at odds with the other results because it sees nothing. Overall it is a very exciting time for dark matter, with the hope that the LHC will resolve the matter by producing the expected particles in proton collisions.

Silk also looked at dark energy results and provided this awe-inspiring plot of how different observations constrain the two-parameter space of cosmological models.

The blue SNe area is from the supernova data that first indicated the acceleration of the universe. This was followed by the cosmic microwave background (CMB) analysis that produced an orthogonal constraint, pinning down the cosmological parameters to a narrow region consistent with a flat universe. This year a third source of data came from measurements of galactic structures, giving the BAO band in excellent agreement with the earlier data. This is also sufficient to show that dark energy is well modeled by a cosmological constant rather than a variable that changes with time. A cosmological constant has less impact in the early universe, while now it accounts for 73% of the non-gravitational energy of the universe, and as the universe ages its domination will continue to increase. Despite this observational triumph its theoretical origin remains mysterious and is probably tied up with theories of quantum gravity not yet understood.
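
To see why the constant dominates late rather than early, it helps to plug numbers into the Friedmann equation. Here is a minimal sketch in Python, assuming a flat ΛCDM model with Ω_Λ = 0.73 and Ω_m = 0.27 today and neglecting radiation; the parameter values are illustrative, not taken from Silk's slides.

```python
# Fraction of the energy budget held by the cosmological constant
# as a function of scale factor a (a = 1 today), for flat LCDM.
# Matter dilutes as a^-3 while the constant stays fixed, so Lambda
# inevitably takes over at late times.

OMEGA_L = 0.73  # dark energy fraction today (assumed)
OMEGA_M = 0.27  # matter fraction today (dark + baryonic, assumed)

def lambda_fraction(a):
    matter = OMEGA_M * a**-3  # matter density relative to today's critical density
    return OMEGA_L / (OMEGA_L + matter)

for a in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(f"a = {a:3.1f}  Lambda fraction = {lambda_fraction(a):.3f}")
# a = 0.1 -> 0.003 (matter dominated), a = 1.0 -> 0.730, a = 5.0 -> 0.997
```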

Searches for the Higgs Boson

Later in the afternoon Giovanni Punzi delivered a talk that has already caused a stir around the blogosphere, see here, here, here, here. First take a look at some of his collected plots for Higgs searches.

All these plots show a small excess around the 110 GeV – 120 GeV region, but the size is not significant and the positions are not quite consistent. Will we look back on these graphics in a few months' time as the first signals of the Higgs or as statistical flukes?

The real interest today is in observations from jets at CDF. Here is a plot from a few weeks back that appeared to show a mystery bump. This was widely pooh-poohed by commentators, including myself, who said that it could be a statistical fluke or a background effect.

Now they have fought back with a second plot from a completely independent run of data.

It shows exactly the same thing! This rules out a statistical fluke at about the 5 sigma level. CDF has also responded to suggestions that it could be due to incorrect scaling of the background, showing that this cannot account for the excess when modelled carefully. Finally they rule out a standard model effect from top quarks. We are left to conclude that it must be a real new effect due to a particle of 150 GeV that is not part of the standard model. The big question is whether it is also seen at D0, and whether the LHC will confirm it.
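
As a rough illustration of why an independent repetition strengthens the result so much, two independent measurements of the same excess can be combined with Stouffer's method. The sketch below is my own back-of-envelope illustration, assuming each dataset alone shows roughly a 3.5 sigma bump; these are made-up figures, not CDF's published numbers.

```python
from math import erfc, sqrt

def sigma_to_p(z):
    """One-sided Gaussian tail probability for a z-sigma effect."""
    return 0.5 * erfc(z / sqrt(2))

# Stouffer's method: two independent, equally weighted z-scores
# combine as z = (z1 + z2) / sqrt(2).
z1 = z2 = 3.5  # assumed per-dataset significance (illustrative)
z_comb = (z1 + z2) / sqrt(2)
print(f"combined: {z_comb:.2f} sigma, p = {sigma_to_p(z_comb):.1e}")
# -> about 4.9 sigma: an independent repeat of a ~3.5 sigma bump
#    takes the result close to the 5 sigma discovery threshold.
```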


New luminosity record for LHC + Injector Chain

May 29, 2011

I know this is getting a bit repetitive but the Large Hadron Collider has a new luminosity record of 1270/μb/s. This was achieved this morning using 1092 bunches.

Notice that this is a change from the previously reported plan. The new plan is to run with 1092 bunches at least once using 108-bunch trains; this is the maximum number of bunches that can be fitted in with 108-bunch injection. They will then switch to 144-bunch injection with the same total number of bunches. It will then be possible to fit in another train of 144 bunches to make 1236 in total, and then the final train goes in to reach the limit of 1380 bunches. All this will take two or three weeks depending on how smoothly it goes. The final peak luminosity will be around 1600/μb/s, but there is potential to increase it a little further by pushing up the bunch intensity.

Update 29-May-2011: More records! The current fill finally collected over 37/pb, which is a record for one fill. CMS have over 500/pb delivered in 2011, and if you count the data from last year as well they will soon pass 500/pb recorded too. The data collected in the last seven days is 169/pb despite the days lost to cryogenic problems. If they keep up the same rate for the 130 physics days they have left this year they would add another 3000/pb = 3/fb, but we expect them to do much better.

Update 30-May-2011: With the second run at 1092 bunches well underway, a record for total luminosity over 24 hours has been added to the list: 69.2/pb has been delivered in one day. If this could be maintained for 130 days they would collect 9000/pb = 9/fb to add to the 0.5/fb so far collected, bringing the total for 2011 very close to the 10/fb I have predicted. Of course they can't run this efficiently every day; on the other hand the peak luminosity will rise by another 30% in the next few weeks, and possibly further later on if they decide to increase the bunch intensity.
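
For anyone who wants to play with these projections, the arithmetic is a straight extrapolation. A minimal sketch; the daily rate, remaining days and efficiency factor are knobs to experiment with, not official figures.

```python
# Extrapolate 2011 integrated luminosity from a daily delivery rate.
# Units: 1/pb per day; 1 fb^-1 = 1000 pb^-1.

best_day_pb = 69.2     # record 24-hour delivery (30 May 2011)
physics_days = 130     # proton physics days left in the year
so_far_pb = 500.0      # roughly what has been delivered already

# Naive projection assuming every day matches the record:
naive = best_day_pb * physics_days + so_far_pb
print(f"record pace:    {naive / 1000:.1f}/fb")   # ~9.5/fb

# More honestly, scale the record pace by an overall efficiency:
efficiency = 0.5  # a guess
realistic = best_day_pb * physics_days * efficiency + so_far_pb
print(f"50% efficiency: {realistic / 1000:.1f}/fb")  # ~5/fb
```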

The Injector Chain

Unless you have been following the developments at the LHC in detail, all this talk of bunches and luminosity probably sounds a bit confusing. Since it's a lazy Sunday afternoon I'll dig around and put together some info about how the process of injecting proton bunches into the collider works.

Duoplasmatron

The protons that collide inside the experiments of the LHC begin their life as ordinary hydrogen gas inside a red bottle. From there they are piped into a duoplasmatron. Hydrogen molecules are made of two hydrogen atoms, each of which contains one proton and one electron. The duoplasmatron uses strong electric fields to tear the molecules and the atoms apart so that the protons move freely in a hot plasma. More electric fields then draw the protons out of the duoplasmatron in pulses and accelerate them into a beam. They then shoot through a radio frequency quadrupole that focuses and accelerates the protons towards the next stage in their journey, the LINAC.

LINAC 2

LINAC 2 is a 30 meter long linear accelerator that has been in use since 1978. Pulses of protons from the duoplasmatron are accelerated in a straight line using electric fields. The fields alternate from positive to negative in a wave that travels along the LINAC at just the right speed so that the protons are always being forced forwards, riding the wave like a surfer on a breaker that somehow gets faster and faster as it approaches the beach. Finally they shoot out the other end with an energy of 50 MeV. The proton's rest mass is equivalent to almost 20 times this energy, so they are still only travelling at about a third of the speed of light. The pulses of protons can be sent down the LINAC at a rate of about one every second. It is these pulses that will eventually form the bunches of protons inside the LHC ring, and already they need to be well collimated so that they remain in a tight bunch all the way.
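
The speed follows from special relativity: 50 MeV of kinetic energy on top of a 938 MeV rest mass gives a Lorentz factor barely above one. A quick sketch of the calculation, using the stage energies quoted in this post (standard kinematics, my own arithmetic):

```python
from math import sqrt

M_PROTON_MEV = 938.3  # proton rest energy

def beta(T_mev):
    """Speed as a fraction of c for a proton of kinetic energy T (MeV)."""
    gamma = 1.0 + T_mev / M_PROTON_MEV  # total energy over rest energy
    return sqrt(1.0 - 1.0 / gamma**2)

stages = [("LINAC 2", 50.0), ("PS Booster", 1.4e3), ("PS", 26e3),
          ("SPS", 450e3), ("LHC 2011", 3.5e6)]
for name, T in stages:
    print(f"{name:10s} {T / 1e3:8.2f} GeV  v = {beta(T):.6f} c")
# LINAC 2 protons leave at about 0.31c; by the SPS they are already
# within a few parts per million of the speed of light.
```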

The PS Booster

From the LINAC the protons pass into the Proton Synchrotron Booster (PSB), a circular accelerator with a diameter of 50 meters that dates back to 1972. The PSB accelerates the protons by an energy factor of 28, up to 1.4 GeV. It is actually made of four separate rings stacked one on top of another, each of which takes in a bunch of protons from LINAC 2 to accelerate.

The Proton Synchrotron

The four bunches of protons in the PS Booster, followed by another injection of two bunches, are fed into the Proton Synchrotron (PS), another circular accelerator four times as big as the PSB. The PS has been in use since 1959, making it by far the oldest part of the injector chain. When it was first built it was itself the most powerful accelerator in the world, and it remained CERN's principal accelerator until 1970. It has of course been upgraded since then to serve as a feeder to later generations of accelerators. The PS has a diameter of 200 meters and uses 277 ordinary electromagnets to bend the protons around its circular ring. Between the magnets the protons are accelerated using the same principles as the linear accelerator.

In its role as a feeder to the LHC, the PS now has a new party trick. With 6 bunches inside to start with, it can split the bunches to make more. Initially each bunch circulates with a separation of 300 ns, with a bigger gap at one point to allow them to be kicked out. The LHC may run with bunch spacings of 150 ns, 75 ns, 50 ns or 25 ns. To form these spacings the PS can split each bunch into either three smaller bunches or two. To get this year's spacing of 50 ns each bunch must be split once into three, each of which is then split again into two. The PS then has 36 bunches circulating in a nice train with a much bigger gap at the end. Next year the bunches may be split one further time to produce trains of 72 bunches with 25 ns spacing. These bunches can be accelerated by a further energy factor of 18 to reach a proton energy of 26 GeV before they are fed into the next stage, the SPS.
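
The splitting arithmetic is easy to check: each split multiplies the bunch count and divides the spacing by the same factor. A toy bookkeeping sketch, with the spacings inferred from the 300 ns starting separation:

```python
# PS bunch gymnastics: start with 6 bunches 300 ns apart.

def split(bunches, spacing_ns, factor):
    return bunches * factor, spacing_ns / factor

state = (6, 300.0)
state = split(*state, 3)  # triple split
state = split(*state, 2)  # double split -> this year's running
print("50 ns scheme: %d bunches at %.0f ns" % state)  # 36 at 50 ns

state = split(*state, 2)  # one further double split
print("25 ns scheme: %d bunches at %.0f ns" % state)  # 72 at 25 ns
```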

The Super Proton Synchrotron

In 1976 CERN commissioned a new accelerator ring that dwarfed the PS. The Super Proton Synchrotron has a diameter of 2.2 km and can now accelerate protons fed from the PS by a factor of 17, up to 450 GeV, using 960 electromagnets to bend the proton bunches around the ring. The SPS never held the title of the world's highest energy accelerator because it was beaten to that title by Fermilab. However, it initially had some design advantages that enabled it to find the W and Z bosons first.

Now the SPS is the final stage that injects the protons into the Large Hadron Collider. Since it is more than ten times as big as the PS, it can easily build up much larger trains of protons by collecting together the trains of bunches fed from the PS. For the fill running now it takes three trains of 36 bunches from the PS, one at a time, to form trains of 108 bunches to feed into the LHC.

The Large Hadron Collider

The main LHC accelerator is an underground circular tunnel over 8 km in diameter, nearly four times as big as the SPS. It has thousands of superconducting magnets and accelerates the protons by a final factor of over 7, up to 3.5 TeV. In 2014 this will be increased to 7 TeV. It actually consists of two rings that cross over at intersection points inside large particle detectors where the protons finally collide. Maximizing the rate of collisions means packing as many bunches as possible into these rings.

The trains from the SPS can be directed along tunnels into either ring of the LHC to circulate in opposite directions. Once the larger trains are in, smaller trains can be injected to squeeze into any remaining gaps, but there will always be some bigger spaces left that cannot be filled. This is because the process of moving the trains from the PS to the SPS and then to the LHC requires powerful electromagnets to be fired up to kick the protons out of one ring into another. It is as if the trains are being transferred from one rail track to another by moving points. Because the points take time to switch position there must be gaps between the trains to allow time for them to move, otherwise the trains would be derailed when they run into the moving points! There are gaps of different sizes because, as the trains get faster at each stage, heavier points are needed (meaning more powerful magnets) which take longer to switch.

While the bunches themselves are separated by gaps of 50 ns, the trains of 36 bunches that come from the PS are separated by 225 ns. The trains of 108 bunches must then be separated by bigger 975 ns gaps, because bigger magnets are needed to kick the trains at higher energy into the LHC. Finally, an even bigger abort gap of 3000 ns must be left as they circulate in the LHC so that the protons can be safely dumped out of the LHC at its full energy of 7 TeV. To pack the maximum number of bunches into the LHC the number of larger gaps must be kept down, which is why the number of bunches injected in each train must be increased. Trains of 144 bunches will be needed to get the maximum of 1380 bunches into the LHC in the next few weeks.
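
To get a feel for the time budget, here is a toy calculation of how many bunches fit into one ring given these gap sizes. It assumes the ~88,924 ns revolution period of the LHC and treats the fill as uniform trains with a single gap size, so it overshoots the real limits; the extra gaps between PS batches within each SPS cycle are ignored.

```python
# Toy time budget for one LHC ring: trains of bunches at 50 ns
# spacing, 975 ns gaps between trains, one 3000 ns abort gap.

REVOLUTION_NS = 88_924
ABORT_GAP_NS = 3_000
SPACING_NS = 50
TRAIN_GAP_NS = 975

def toy_max_bunches(train_len):
    train_span = (train_len - 1) * SPACING_NS  # first to last bunch
    per_train = train_span + TRAIN_GAP_NS      # plus the gap behind it
    n_trains = (REVOLUTION_NS - ABORT_GAP_NS) // per_train
    return n_trains * train_len

for train_len in (108, 144):
    print(f"trains of {train_len}: ~{toy_max_bunches(train_len)} bunches")
# Gives ~1404 and ~1440; the real limits, with all the extra
# injection gaps, are 1092 and 1380 respectively.
```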

With more bunches there are more proton collisions which means more data for discovering new physics.

By the way, if you want to watch the injection process in detail the page to look at is SPS-page 1. The yellow line in the graph shows the build-up of protons in the PS in the steps needed to build up the three or four trains that go into the SPS. The white line shows the ramp of the magnets in the PS and then the SPS as the protons are accelerated. A moving vertical white line indicates the progress through this build-up. When it reaches the end after about 40 seconds, the trains are finally ready to be injected into the LHC. Use the LHC announcer page to follow these injections and the subsequent ramping up of energy inside the LHC.


Seminar Watch

May 27, 2011

With LHC and Tevatron insider physicists sitting on a growing mound of unpublished results from new data, excitement amongst the rest of the physics community is reaching fever pitch. Some of us are even looking daily at conference and seminar programs to try and identify where the next big real non-rumour result will emerge.

A couple of conferences have already passed this week including “Quark Matter 2011” at Annecy and “Flavour Physics and CP Violation 2011” in Israel. Sadly no results based on new data were presented as far as I can see, but I give you the links to the contribution lists in case you want to check the slides.

The hot favorite for the first new results has to be the upcoming “Physics at LHC 2011” conference. There are many presentations that could include beyond-standard-model physics, especially over the last two days, with even a 30 minute unallocated slot right at the end labelled simply “Extra Time for Experimental Talk”. There are closed meetings going on at CERN today with titles like “Exotica Pre-approvals for PLHC” and “SUSY Rehearsals for PLHC”.

According to an article in New Scientist, new LHC results should be announced at a CERN seminar, and most likely this would come under the “Joint EP/PP/LPCC” series where new results have appeared in the past. Sure enough, there is already a seminar listed on “Studies of jets and their properties using the ATLAS detector at the LHC” by Jonathan Butterworth on June 7th. From the information given in the abstract this may or may not include new results, but the timing is interesting as it falls during PLHC just before the (possibly) most interesting talks.

Watch this space!


24 Hours of the LHC vs 24 Years of the Tevatron

May 24, 2011

In the last 24 hours the Large Hadron Collider has delivered a record 43/pb worth of collision data, equivalent to the entire proton physics run of 2010. Almost all the published results from the LHC so far are based on that amount of data (with the single exception of one report needed to squash a recent rumour). This provides a convenient opportunity to compare 24 hours of data from the LHC with 24 years from the Tevatron.

The Tevatron, which began running in 1987, has of course made some significant discoveries, including the top quark and some real hints of physics beyond the standard model. It has also excluded the Higgs over a wide mass range. The LHC has so far made no new discoveries, but with its 43/pb of 2010 it has surpassed the Tevatron in searches for heavy quarks, supersymmetry and other exotic physics, placing limits well beyond the reach of the Tevatron. This superiority at heavier mass scales is due to its higher proton energy. In many ways 24 hours of LHC data is more interesting than 24 years' worth of Tevatron data. Luckily we are not forced to make the difficult choice between the two possibilities; we have had both.

There was some controversy when it was announced that the Tevatron would stop running this year, a decision based on lack of government funding rather than the wishes of the Fermilab directors. By the time the summer conferences of 2011 have passed, CDF and D0 will have presented their last Tevatron results worth listening to, and the LHC will have surpassed them in all channels. The Tevatron will cease its work of colliding hadrons, and Fermilab will concentrate on its admirable neutrino physics program and the ambitious plan to design and build Project X, a high intensity proton accelerator that could one day lead to a future muon collider and a leapfrog to the next scale of physics.

To help appreciate the formidable success of the LHC it is worth looking back to its status one year ago and the predictions for its future running. They were summarized in this table presented at ICHEP 2010.

As you can see from the first line, they planned to reach luminosities of 188/μb/s during 2010 and run at that level during 2011 to collect 1/fb before the long shutdown of 2012. They intended to do this using a squeeze of 2m and 796 bunches with 75 ns spacing. This plan changed quickly in the autumn of 2010 when they found that the LHC was running much better than expected.

They thought they would only be able to reach bunch intensities of up to 80 billion protons, as shown in this table. In fact they quickly reached nominal intensities of 115 billion protons with no difficulty. This year they have pushed the limit higher to 170 billion, well over twice what they thought was possible before interactions between bunches would take their toll, leading to instabilities. The emittance of the beams was also looking good. They had predicted an emittance figure of 3.75 μm, a measure of how far the particle momenta in each bunch spread away from the ideal. Recently they have dropped the emittance as low as 1.5 μm in development tests.

Given this unanticipated performance they adapted the plans to suit. Last year they used a squeeze of just 3.5m and a larger bunch spacing of 150 ns. With these less ambitious parameters they were still able to exceed the target luminosities, reaching 210/μb/s in 2010. This left open the possibility of much better luminosities for 2011. In fact they have gone beyond the original plan on all fronts, using an even tighter squeeze of 1.5m and 50 ns spacing to allow up to 1380 bunches in each ring. The result is luminosities already peaking at 1100/μb/s and likely to end 10 times higher than those expected when the above slide was prepared a year ago.

With the extra data it became worth extending the early run of the LHC into 2012, with the long shutdown now scheduled for 2013. By then it is likely that the LHC will have produced enough data to discover the Higgs if it is there, whatever mass it has. There is always the possibility that nature has it in store for us to see nothing because it is not there, but as Lubos Motl put it “if nothing “new” is found, it will be a bizarre situation in which “new physics” has to exist but its goal is to suppress the number of interesting and qualitatively different things you can see.”

The early discovery of the Higgs will mean that the LHC has achieved its primary goal. With the pressure off, the engineers and physicists of the collider teams will be able to concentrate on the upgrades for the long shutdown, which is itself likely to be extended to include further enhancements. The LHC will return towards the end of 2014, ready to explore even higher energies with even higher luminosities. If nature is kind there will be much to learn.

Finally you may ask how the LHC managed to reach such an advanced position ahead of schedule, compared to the many years it took the Tevatron. The answer, of course, is that the LHC has built on lessons learnt from the Tevatron and other predecessor accelerators such as LEP, HERA and KEK. There is only one international scientific community with a common goal of understanding nature. The physicists who work on these machines stay in the game for many years and move from one project to the next, crossing borders without regret and bringing their expertise with them. The LHC uses magnets built at Fermilab, and many of its systems, such as the cryogenics, are direct descendants of the pioneering systems built there. The Tevatron may be yesterday's gadget, but the LHC could not have been created without the knowledge it brought.

Update: Here are some of the people from Fermilab celebrating 11/fb (from facebook today)

Higgs exclusion plots presented so far have used only about 8/fb, so there is still some hope for a last minute revelation before they are eclipsed by the LHC. Good luck!


LHC Luminosity Prospects

May 21, 2011

After a two week break from physics runs, the Large Hadron Collider is once again clocking up proton-proton collisions at a steady rate. The break included a few days of machine development time, a planned technical stop for routine maintenance, Van der Meer scans to study the beam size and distribution, and finally alignments for the TOTEM and ALFA experiments.

Now they are back to the process of building up the luminosity. The current fill has 768 bunches, matching previous runs except that they are now injecting 108 bunches at a time compared to the previous scheme of 72-bunch injections. This is important because the next step up, to 912 bunches, will require the 108-bunch injections, and that is now expected for the next run later today.

The 912 bunch step is likely to take the luminosity through another milestone by peaking at more than 1000/μb/s. This is the official target for 2011 luminosity, but it will be exceeded further as they step up through 1056, 1200 and then 1380 bunches in the next couple of weeks. These last three steps will require 144-bunch injections, taking the injection systems to new limits. In the last few days they already injected 1308 bunches during a scrubbing run, but without ramping to full energy for collisions.

(Update 23-May-2011: A record luminosity of 1100/μb/s was reached on the third fill with 912 bunches per ring this morning.)

With the current plans it should be possible to reach a luminosity of at least 1500/μb/s for the long run from June to October, during which 124 days are allocated to proton-proton physics. So how much further will it be possible to increase the luminosity, and when? As part of the machine development time they tried out some processes that could potentially increase luminosity further. The possibilities for higher luminosity without upgrading the hardware are the following:

  • Increase of bunch intensity
  • Improved emittance
  • Better Squeeze
  • More bunches
Let’s look at each of these in turn.

Increase of bunch intensity: The proton bunches currently circulating start out with 115 billion protons each. This is the “nominal” bunch intensity that was originally planned for the collider with its present hardware. However, it is possible to inject an “ultimate” intensity of 170 billion protons per bunch from the SPS. The luminosity increases in proportion to the bunch intensity in both beams, so the potential luminosity increase is a factor of (170/115)² ≈ 2.2. It had been anticipated that the beams would become too unstable at these levels, but already last year they found that reaching nominal intensity was easier than expected. During the latest machine development time they injected a few ultimate intensity bunches to see how they perform. Once again they found that there was no problem with the head-on beam-beam tune shifts that had been expected to be a limiting factor, so the way is open to a further increase of luminosity.

The beam operators are being publicly cautious about when this could happen. Paul Collier, who heads the team, said in a message on the LHCPortal forum that “The studies with very intense bunches are for the future – no plans to push higher bunch intensities into operation for the moment.”

Improved emittance: The emittance ε is related to the size of the beam by w = √(εβ). Transversely the emittance should be 3.75 μm, but with better than expected performance it has typically been 2 μm to 3 μm. During the machine development tests they were able to reduce this to 1.5 μm. If they can do this for injections of full trains it represents a further increase of luminosity.
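
Plugging in numbers: assuming the quoted figures are normalized emittances, the geometric emittance is the quoted value divided by the relativistic γ, about 3730 for 3.5 TeV protons. A quick sketch of the beam size at the interaction point under that assumption:

```python
from math import sqrt

GAMMA = 3730      # relativistic gamma at 3.5 TeV (assumed)
BETA_STAR = 1.5   # current squeeze at the collision point, metres

def beam_size_um(norm_emittance_um):
    """Transverse beam size w = sqrt(eps * beta), where eps is the
    geometric emittance = normalized emittance / gamma."""
    eps = norm_emittance_um * 1e-6 / GAMMA  # metres
    return sqrt(eps * BETA_STAR) * 1e6      # micrometres

for eps_n in (3.75, 2.5, 1.5):
    print(f"emittance {eps_n:4.2f} um -> beam size {beam_size_um(eps_n):.0f} um")
# 3.75 um gives ~39 um at the collision point. Luminosity scales as
# 1/w^2, so going from 3.75 um to 1.5 um is a factor ~2.5 gain.
```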

Despite the cautious words there are signs that increased intensity and improved emittance could be introduced into the physics runs this year, taking luminosities up to “a few” times the 1000/μb/s target figure. Whether this happens, and to what level, will depend on how smoothly operations go during the first few weeks of the long run with 1380 bunches.

Better Squeeze: The squeeze is the process of focusing the beams into a smaller cross section as they pass the collision points inside the detectors. The current squeeze is represented by a figure of β* = 1.5m, but a tighter squeeze may be possible. The nominal squeeze is 0.55m, but this will not be possible until the full 7 TeV beam energy is available after the long shutdown. Even after that, further improvements are possible. As Paul Collier explained, the minimum β* is limited by the aperture of the triplets and also by the chromatic aberrations introduced by the very tight squeeze. Another test during the machine development period looked at ATS injection, a novel and complicated scheme to bypass the strength limit of the existing lattice sextupoles and allow a much smaller β*. Values of around 0.15m should be possible with existing hardware.

These developments are definitely not for this year. Moving to a tighter squeeze by any means will require a long process of resetting collimators and building up luminosity again in steps. Too much time would be lost to make this worthwhile for current runs but in 2012 a smaller β* might be used.

More bunches: The last route to increased luminosity is more bunches. To exceed 1380 bunches per beam they must use a shorter bunch separation of 25 ns instead of the 50 ns spacing currently in use. Since the bunches travel at virtually the speed of light, this smaller spacing places them just 7.5m apart. Such a change might require a larger crossing angle to avoid parasitic collisions as bunches pass close to each other 7.5m away from the desired interaction points. This would again mean resetting the collimators. Furthermore, it is not known whether the vacuum and cryogenics can yet cope with the higher intensities that twice as many bunches would imply, especially if ultimate intensity bunches are being used as well. So realistically this is not likely to happen this year either; however, we can expect some tests of 25 ns spacing during further scheduled machine development time, and a switch to 25 ns for 2012 is a possibility if the tests go well.
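
Putting the four knobs together, the standard proportionality L ∝ n_b N² / (ε β*) lets us tabulate the potential gains; constant factors such as energy and revolution frequency drop out of the ratios. A minimal sketch, taking this year's plateau of 1380 bunches at nominal intensity as an assumed baseline and ignoring crossing-angle corrections:

```python
# Relative peak luminosity: L ~ n_b * N^2 / (emittance * beta*).

def rel_lumi(n_bunches, protons_per_bunch, emittance_um, beta_star_m):
    return n_bunches * protons_per_bunch**2 / (emittance_um * beta_star_m)

base = rel_lumi(1380, 115e9, 2.5, 1.5)  # assumed 2011 plateau

upgrades = [
    ("ultimate intensity 170e9",  rel_lumi(1380, 170e9, 2.5, 1.5)),
    ("emittance down to 1.5 um",  rel_lumi(1380, 115e9, 1.5, 1.5)),
    ("ATS squeeze beta* 0.15 m",  rel_lumi(1380, 115e9, 2.5, 0.15)),
    ("25 ns spacing, 2x bunches", rel_lumi(2760, 115e9, 2.5, 1.5)),
]
for name, lumi in upgrades:
    print(f"{name:27s} -> x{lumi / base:.1f}")
# x2.2, x1.7, x10.0 and x2.0 respectively. Intensity plus doubled
# bunches alone would take 1500/ub/s into the 5000-10000/ub/s range.
```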

Altogether this could take luminosities up to a remarkable 10000/μb/s next year if everything works out; 5000/μb/s is probably a more realistic target. Further down the line there will be upgrades allowing for yet higher intensities and a better squeeze. Peak luminosities of 50000/μb/s are anticipated after a few years. At some point the pile-up from collisions coming too frequently to be separated will become a problem, and the luminosities will have to be limited. Pile-up is already expected to be a challenge for ATLAS and CMS if further luminosity increases are introduced this year. Even then it is worth pushing the peak luminosity higher, because it makes it possible to sustain the maximum desired luminosity during much of the length of a long run. This is already being done for LHCb and ALICE.
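
The pile-up itself is easy to estimate from the peak luminosity: the mean number of collisions per bunch crossing is μ = Lσ/(n_b·f_rev). A rough sketch, taking ~70 mb for the inelastic proton-proton cross-section and 11245 Hz for the LHC revolution frequency (standard figures), with 2808 bunches assumed for the nominal 25 ns machine:

```python
# Mean pile-up per bunch crossing: mu = L * sigma / (n_bunches * f_rev)

SIGMA_INEL_MB = 70.0  # inelastic pp cross-section at 7 TeV, ~70 mb
F_REV_HZ = 11_245     # LHC revolution frequency

def pileup(lumi_ub_s, n_bunches):
    lumi_mb_s = lumi_ub_s * 1e3  # 1/ub = 1000/mb
    return lumi_mb_s * SIGMA_INEL_MB / (n_bunches * F_REV_HZ)

print(f" 1500/ub/s, 1380 bunches: mu = {pileup(1500, 1380):.1f}")
print(f" 5000/ub/s, 1380 bunches: mu = {pileup(5000, 1380):.1f}")
print(f"50000/ub/s, 2808 bunches: mu = {pileup(50000, 2808):.0f}")
# About 7 overlapping events per crossing this year, 20+ at next
# year's 5000/ub/s, and over 100 at the anticipated 50000/ub/s.
```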

With so much data the potential for new discoveries and precision measurements over the coming years is huge. In the shorter timespan of the next two or three months we can already anticipate some great results. Two conferences this summer are especially worth following. The Physics at LHC 2011 conference starting on June 6th will be an opportunity to show results using at least 200/pb of the data collected up to now, significantly more than the 40/pb already analysed in detail. On July 21st the International Europhysics Conference on High Energy Physics could already be presenting first results from as much as 2000/pb worth of collisions. However, if there are any really important discoveries forthcoming there will be a longer process of analysis, checking and consultation before going public. Rumours can of course be based on what they don't show straight away.


How will the Higgs be announced by CERN?

May 18, 2011

Following the recent rumours and leaks about new particles, New Scientist has been keeping up with the story. Now they have an interesting interview with the CERN press chief about how an important new discovery is supposed to be announced. The answer is worth quoting but you should also read the full interview.

What is supposed to happen if someone really does find the Higgs at the LHC?

We have devised a protocol for dealing with a blockbuster result. If one of the collaborations has a result to announce, they inform the director general of CERN. This sets in motion a chain of events. Other experiments with potentially the same physics are given the chance to confirm the findings. If the result is big enough, like discovery of a supersymmetric particle or the Higgs boson, we inform the heads of other laboratories and all our member states that this is coming, and organise an announcement seminar at CERN.

So if they manage to prevent a leak from the original collaboration of 3000 physicists, then the 3000 physicists in the competing collaboration also get a chance to leak it, followed by any other big laboratory and the politicians and civil servants of the 20 member states of CERN. If they all manage to stay quiet long enough, we will notice a major seminar being scheduled at CERN, which will itself spark rampant speculation in the days before it is held. If that is really the plan then I think the news will certainly reach a blog first. Perhaps the DG should just blog it himself to save the inevitable.

However, there is always a suitable conference just around the corner and I suspect that a conference presentation might be a more likely forum for the first real results. Hopefully we shall see soon enough.


Shuttle Endeavour ready to Launch

May 16, 2011

Endeavour is preparing to launch on its last mission, 16 days in which it will deliver CERN's Alpha Magnetic Spectrometer to the International Space Station. This experiment will measure cosmic ray fluxes and should be able to tell us if there is any antimatter at all in space. Within the Earth's atmosphere we can detect only the byproducts of most cosmic ray collisions with the atmosphere or the Earth itself; only weakly interacting particles such as neutrinos can be detected directly. In space it is possible to measure directly the energy of cosmic ray particles before they collide, but you have to wait a long time to get a good sample.

The launch can be watched on NASA TV and is due soon.

Update: the shuttle has launched on time.


Channel 4 gets excited about LHC and String Theory

May 8, 2011

These days I get most of my news from the internet, but occasionally I catch the Channel 4 News on TV after watching Time Team. Twice in the space of a few weeks they have mentioned the LHC, a sure sign that media interest is increasing. The first time was about the recent rumour (now quashed), but today they ran a piece about the Large Hadron Collider and String Theory. I get excited about both subjects so it is nice to see them on the UK news, but I also know that the LHC is not really about proving String Theory, so I watched with some trepidation, the imagined words of Woit calling it “hype” ringing in my head.

The piece was in fact about the study of the “quark gluon soup” in the heavy ion run last year, and the key line was that “The maths of string theory might explain the weirdness of this subatomic soup.” I breathed a sigh of relief, seeing that this is basically a reasonable statement and not a confusion about using the effects to test String Theory as a unified theory, even if most viewers were unlikely to appreciate the distinction.

Unfortunately they blew it right at the end with this very confused statement:
“In the coming years the hadron collider will run at higher and higher power, each time probing deeper into the subatomic world. When physicists finally turn it up to 11 – multiple dimensions if they exist should make their appearance in our three dimensional world” – eh?

Well, it does not bother me as much as it does some people that the journalists never get it quite right. It is still good that they consider these things interesting enough to report to the masses, and it will whet people's appetite for any discoveries that might come this year.


ATLAS crushes new particle rumour

May 8, 2011

Once again the latest rumour of a new particle has proven too good to be true. With the first public plots using 2011 data at the LHC, the ATLAS collaboration has shown that there is no evidence yet for a new particle at 115 GeV. The first 94/pb of 2011 data on its own did look tantalising, but it was inconclusive and certainly not the 4 sigma effect originally described. If the 2010 data is added, the apparent effect fades away, as shown in these two plots. The result is explained in a note released today.

Note that 131/pb is the amount of data that was available on 22nd April when the rumour first broke. To find a standard Higgs signal in this region it is expected that at least 100 times as much data will be needed.

According to a report in New Scientist, a leaked internal CMS presentation on the 25th April also concluded that there was no signal. This was followed by the most detailed “No Comment” in the history of particle physics, leading to an interesting discussion between Tommaso and an NS editor about whether or not a PowerPoint presentation is a “document”. Once again the dust settles, until the next time.

This latest report was first seen on Not Even Wrong where the original rumour started.

Update: The report is also covered nicely by Lubos and Tommaso.


Tevatron steams on!

May 7, 2011

I'm always reporting the progress of the Large Hadron Collider, so it is good to have a look at the Tevatron for a change. There is an excellent Facebook page for the Fermilab collider where they report on the data collected during each store. Over the last week they had a particularly good run and posted this plot.

They collected 78/pb in one week, with typical fills of 15 to 20 hours providing as much as 11/pb each time. Given the peak luminosity of 430/μb/s (which by the way was a new peak luminosity record for the Tevatron), the Hübner factor is about H = 78000/(430×3.6×24×7) ≈ 0.3. This is an optimal value for the factor, given excellent running conditions with minimal stoppages. The longer term value is presumably a bit less than this.
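
For anyone who wants to check the arithmetic, the Hübner factor is just the luminosity actually delivered divided by what non-stop running at peak luminosity would have delivered. A minimal sketch using the figures from the plot above:

```python
# Huebner factor: delivered luminosity over the ideal of running
# at peak luminosity continuously.

PEAK_UB_S = 430.0        # peak luminosity, 1/ub/s
DELIVERED_PB = 78.0      # delivered in the week, 1/pb
SECONDS = 7 * 24 * 3600  # one week

ideal_pb = PEAK_UB_S * SECONDS / 1e6  # 1 pb^-1 = 1e6 ub^-1
print(f"ideal week: {ideal_pb:.0f}/pb -> H = {DELIVERED_PB / ideal_pb:.2f}")
# ~260/pb if the peak were held all week, giving H ~ 0.30
```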

The turnround times on a good day last week appear to be about 5 hours between fills. In fact they can refill in as little as one hour when all goes well. This is very good compared with the Large Hadron Collider, whose best turnround times have been about 2.5 hours.

Given that a Hübner factor of 0.3 is exceptionally good for the Tevatron, you may wonder how the Large Hadron Collider, with its longer turnround times, could do better. The answer is that it has much longer luminosity half-lives. At the Tevatron half the luminosity is lost every 6 to 7 hours (my estimate from the above plot). The LHC has been producing half-lives about three times as long. During a run of 19 hours the Tevatron's luminosity can drop to as little as 15% of its peak, while the LHC would still be at 50%. This is enough to double the Hübner factor for the LHC.
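
The effect of the half-life on a long store is simple exponential decay. A quick comparison sketch; the 6.5 and 19.5 hour half-lives are my rough estimates, as above:

```python
from math import log

def remaining(t_h, half_life_h):
    """Fraction of peak luminosity left after t hours."""
    return 0.5 ** (t_h / half_life_h)

def average(t_h, half_life_h):
    """Mean luminosity fraction over the store (exact integral)."""
    return (1 - remaining(t_h, half_life_h)) * half_life_h / (t_h * log(2))

t = 19.0  # hours, a long store
for name, half_life in (("Tevatron", 6.5), ("LHC", 19.5)):
    print(f"{name:8s}: ends at {remaining(t, half_life):.0%} of peak, "
          f"averages {average(t, half_life):.0%}")
# Tevatron: ends near 13% and averages ~43% of peak;
# LHC: ends near 51% and averages ~73%. That difference roughly
# doubles the integrated luminosity per hour of stable beams.
```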

What are the reasons for the longer lifetime at the LHC? Good question! I can only speculate that it is a combination of factors, including the fact that the LHC uses two beam pipes compared with the single beam pipe design of the Tevatron. Keeping the bunches further apart outside the collision points should help stability. The LHC may also have better damping systems to keep the beam focused. Its larger size gives more scope for correcting the blow-up of emittance. The higher energy also helps, I think. Perhaps there are other reasons. The LHC is just a much more up-to-date machine.

For now though, it is the Tevatron that is steadily collecting data to add to its already impressive store. It’s a glorious run to the end of its long and fruitful life in September.