LHC end of run update

October 30, 2011

Today is scheduled as the end of proton physics at the Large Hadron Collider and the last few fills are circulating this morning. The integrated luminosity recorded this year will end at about 5.2/fb each for CMS and ATLAS, 1.1/fb for LHCb and 5/pb for ALICE. For the remainder of the year the machine will switch to heavy ion physics until the winter shutdown.

The good news this year has been the high luminosity achieved, with peaks at 3.65/nb/s. This compares with the expectation of 0.288/nb/s estimated before the 2011 run began. The higher luminosity has been made possible by pushing beam parameters (number of bunches, bunch intensity, emittance, beta*) to give better than expected performance. The not-so-good news is that out of 230 days available for physics runs, only 55 (24%) were spent in stable beams. This was due to a barrage of technical difficulties including problems with RF, vacuum, cryogenics, power stability, UFOs (unidentified falling objects), SEUs (single-event upsets) and more. There were times when everything ran much more smoothly and the time in stable beams was then twice the average. The reality is that the Large Hadron Collider pushes a number of technologies far beyond anything attempted before, and nothing on such scales can be expected to run smoothly first time out. The remarkable amount of data collected this year is testament to the competence and dedication of the teams of engineers and physicists in the operations groups.
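As a back-of-the-envelope check on these figures (the arithmetic here is my own, not from any official report), the peak rate, the stable-beam fraction and the recorded total fit together as follows:

```python
# Rough bookkeeping for the 2011 proton run, using the numbers quoted
# above; the duty-cycle arithmetic is purely illustrative.
PEAK_LUMI = 3.65          # peak instantaneous luminosity, /nb/s
NB_PER_FB = 1e6           # 1/fb = 1,000,000/nb
SECONDS_PER_DAY = 86400

# A whole day spent at peak luminosity would deliver about 0.32/fb:
ideal_day = PEAK_LUMI * SECONDS_PER_DAY / NB_PER_FB
print(f"ideal day at peak: {ideal_day:.2f}/fb")

# 55 stable-beam days out of 230 available gives the 24% quoted:
print(f"stable-beam fraction: {55 / 230:.0%}")

# 5.2/fb recorded over 55 stable-beam days implies an average well
# below peak, as expected from luminosity decay during each fill:
print(f"average per stable day: {5.2 / 55:.3f}/fb")
```

On this rough accounting even a perfect duty cycle at peak luminosity would have delivered only about 0.3/fb per day, so the 5.2/fb total is consistent with the quoted stable-beam time.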

After the heavy ion runs attention will turn to next year. There will be a workshop at Evian in mid-December to review the year and prepare for 2012. Mike Lamont, the LHC Machine Coordinator, will be providing a less technical overview in the John Adams Lecture on 18th November.

Brian Cox, Bloggers and Peer Review

October 24, 2011

Brian Cox is a professor of physics at Manchester and a member of the ATLAS collaboration. He is very well known as a television science presenter, especially in the UK, and has been credited with a 40% increase in uptake of maths and science subjects at UK schools. He is clever, funny and very popular. If you are in the UK and missed his appearance on the comedy quiz QI last week you should watch it now (4 days left to view).

At the weekend the Guardian published a great question-and-answer article with Brian Cox and Jeff Forshaw, whom I am less familiar with. The answers all made perfect sense except one:

How do you feel about scientists who blog their research rather than waiting to publish their final results?

BC: The peer review process works and I’m an enormous supporter of it. If you try to circumvent the process, that’s a recipe for disaster. Often, it’s based on a suspicion of the scientific community and the scientific method. They often see themselves as the hero outside of science, cutting through the jungle of bureaucracy. That’s nonsense: science is a very open pursuit, but peer review is there to ensure some kind of minimal standard of professionalism.

JF: I think it’s unfair for people to blog. People have overstepped the mark and leaked results, and that’s just not fair on their collaborators who are working to get the result into a publishable form.

I would be interested to know what Brian Cox was thinking of here. Which bloggers does he think see themselves as “the hero outside of science”, and what examples back up the idea that bloggers try to circumvent peer review?

It is not clear to me whether Brian Cox is referring to the internal review process that experimental collaborations go through or the peer review provided by journals as part of publication. Surely it cannot be the latter, because most science research, and especially that from CERN, is widely circulated long before it reaches the desk of any journal editor: not by bloggers, but by CERN itself through conferences, press releases, preprints and so on. So Cox must be talking about internal review, but that does not really count as peer review in the usual sense. In any case, people within a collaboration do not get away with blogging about results before approval.

There have been a few leaks of results from CERN and Fermilab before approval by the collaborations. For example, one plot featured here earlier this year came from a talk that turned out not to be intended for the public. However, by the time I had passed it on it was already in Google, having been “accidentally” released in a form that made it look like any other seminar where new preliminary results are shown. There were a few other examples of leaks, but none that fit what Cox is saying that I can think of.

Given his obvious dislike for blogs I can’t hold much optimism that Brian will comment here and elaborate on what he means, but it would be nice if he did. Otherwise perhaps someone else knows some examples that could justify his answer. Please let us know.

New Higgs Results

October 22, 2011

ATLAS have released a conference note with new results for Higgs decays to ZZ with two leptons and two neutrinos in the final state. The update uses 2.05/fb compared to the 1.04/fb first shown at the Europhysics conference in July. This is not the most exciting channel for discovery potential because it does not cover the light mass region, but it is good to see new data appearing now. No significant excesses are seen. Here is the main plot.

The note is aimed at the Hadron Collider Physics conference starting 16th November in Paris, where we are already expecting to see a full LHC combination of Higgs searches in all channels based on data from the Lepton-Photon meeting.

As the 2011 proton physics run enters its last week, nearly 5/fb have been collected per experiment. That is enough data to give good observation potential at all possible Higgs masses except in the crucial 115 GeV – 125 GeV light mass region. We may yet have to wait a few weeks before plots using that amount of data emerge.

Update 23-Oct-2011: Both ATLAS and CMS now have over 5/fb recorded.

BBC: Faster Than the Speed of Light?

October 20, 2011

Yesterday evening the BBC ran a documentary about the OPERA neutrino results. If you are in the UK and missed it you can watch repeats over the next few days or view it online here. It will probably be available in other countries in some form soon.

The programme was presented by mathematician and author Marcus du Sautoy, who has become a familiar science host on the BBC in recent years. The tone of the show was skeptical but open-minded, and I think this reflects the range of views that scientists have on the subject. Marcus described the results and surrounding debate as “a great example of science in action”. The show must have been put together very quickly, but it follows clear logical steps and includes most of the relevant points that should be discussed at a popular programme level. I think they did a great job of bringing in the more exciting possibilities without hype. Here are a few highlight quotes from the guest scientists.

Marcus du Sautoy: “You can almost feel the shudder that passes through the entire scientific community when a result as strange as this comes out. Everybody’s talking about it. Is this the moment for a grand new theory to emerge that makes sense of all the mysteries that still pervade physics, or has there just been a mistake in the measurement?”

Marcus du Sautoy

Chiara Sirignano (OPERA): “On top of us we have 1400m of rock, the top of the Gran Sasso mountain. Here the cosmic rays are very few because outside they are 200 per square meter per second and here it is just 1 per square meter per hour. This is a very huge shielding”

Chiara Sirignano

John Ellis: “If the speed of light turned out not to be absolute, we would just have to tear up all the textbooks and start all over again. On the other hand it would be nice if it were true.”

John Ellis

Fay Dowker: “For me it would mean that the direction of my own research was wrong, so it would be a revolution but to me it would also mean that nature is just playing tricks with us”

Fay Dowker

Jon Butterworth: “I actually heard about this result in the coffee bar at CERN about two weeks before it came out, and I laughed. I have to say that was my thought, they have got something wrong haven’t they?”

Jon Butterworth

Stefan Söldner-Rembold: “MINOS and T2K will both work very hard to get a similar measurement with a similar precision in the next few years, but it will take a few years I think”

Stefan Söldner-Rembold

Joao Magueijo: “Obviously this result contradicts what you find in textbooks, but if you are actually working in the frontier of physics, if you are really trying to find new theories this is not as tragic as you might think. It is a crisis, but we need a crisis because there are a lot of things in physics in those textbooks which don’t really make any sense.”

Joao Magueijo

Mike Duff: “Well, I have been working on the idea of extra dimensions for over 30 years so no one would be happier than I if the experimentalists were to find evidence for them. However, to be frank, although I like the idea of extra dimensions, this is not the way they are going to show up in my opinion. So I am not offering extra dimensions as an explanation for the phenomenon that the Italian physicists are reporting.”

Mike Duff

Tara Shears: “This could be one of those moments that turns our understanding on its head yet again, lets us see further into the universe, lets us understand more about how it ticks, how it sticks together, how things are related inside. If it does that, if we understand more, then it’s one of those magical moments that you get in the history of physics that just twists your understanding and brings the universe into focus, and if we are seeing the start of that now, and we are documenting it, then we are really, really, really privileged to be doing so.”

Tara Shears

Name A Very Large Radio Telescope Array

October 15, 2011

Do you remember the radio telescopes in the film Contact where Jodie Foster and her team of geeks received the first haunting signal from an alien intelligence? That was actually the Very Large Array, run by the NRAO in New Mexico, and it has just finished a big upgrade to its electronic systems. They think that VLA is not a sufficiently imaginative name, so they want to rename it, but they also want the public to come up with the new name. You can very quickly and easily make a suggestion or several suggestions here.

I have already suggested “Carl Sagan Radio Observatory” and I am sure I will not be the only one using that theme. The BBC has gone for “Unfeasibly Large Telescopes”. There must be some more sophisticated ideas out there, so submit them and let us know.


LHC Update and new records

October 9, 2011

ATLAS and CMS are now reporting about 4.2/fb recorded, while LHCb recently celebrated its first 1/fb. There are 20 days of proton physics left this year, enough time to bring the total to about 5/fb each for the big two. Some time is being reserved for extra machine development studies looking forward to next year's runs.

Last week they collided bunches with 25ns spacing for the first time. Next week they will do this again with more bunches and will also run some pile-up tests. I think this means they will collide some high intensity bunches so the experiments can test their algorithms to see how well they can deal with even higher event pile-ups.

Meanwhile the physics runs continue to collect data and today they established a new peak luminosity record of 3.42/nb/s, beating the record of 3.3/nb/s from a few weeks ago. This suggests that they are returning to some adiabatic intensity increases as the end of the run approaches.

If you are following the physics results don’t miss “Heavy flavor physics with the CMS experiment” this Tuesday and “Searches for Exotic Physics with the ATLAS Detector” next Tuesday. Both talks will probably be webcast live from CERN.

Update 14-Oct-2011: The all-time delivered luminosity for the LHC has now passed 5/fb per experiment. The more important figure for recorded luminosity in ATLAS and CMS is over 4.5/fb with two weeks of proton physics remaining for 2011.

Let’s talk about FTL

October 9, 2011

Since the announcement of the OPERA result there have been numerous theory papers written about the Faster Than Light neutrinos and posted to arXiv and even viXra. For next Friday the CERN theory group have organised a three-hour seminar to discuss the various theories. The rule of engagement is that nobody is allowed to talk about the result being wrong; they just have to imagine that it has been robustly confirmed and consider how they would explain it. It is a great idea, and a pity that they are too shy to webcast it.

In any such discussion I think the first thing to remember is that the measurement was a purely classical one, so you first have to address the classical (non-quantum) implications. This can go two ways: either the Lorentz transforms are (locally) valid or not. In the experiment, protons were fired at a fixed target to generate pions and kaons that decay to provide a beam of neutrinos. If we want to keep the principles of special relativity intact in our explanation, then we have to face the fact that the experiment can be transformed to one where a fast-moving target is smashed into stationary protons, as seen by someone at rest in the reference frame of the protons. This is enough of a Lorentz boost to transform the neutrino worldlines so that they would become anti-neutrinos that began life at the OPERA detector and headed towards CERN to meet the pions. This means they would have to anticipate the experiment, so causality is dramatically violated. You can’t escape this result if you want to keep the Lorentz transformations. It does not matter whether the neutrinos are acting like classical tachyons with imaginary mass or if they are passing through a stargate buried underground that teleports them closer to the detector. The fact is that if Lorentz invariance holds then you can use the experiment to send information back in time. Some imaginative people may be able to dream up theories in which time travel is acceptable due to branching timelines or whatever, but you might as well believe in Dr Who.
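The boost needed for this reversal is not exotic. As a sketch, using the widely reported OPERA figures of a fractional excess (v − c)/c of roughly 2.5×10⁻⁵ over the roughly 730 km baseline (numbers quoted for illustration, not taken from this post), a standard Lorentz transformation of the emission and arrival events gives:

```python
import math

C = 299_792_458.0   # speed of light, m/s
DELTA = 2.5e-5      # assumed fractional superluminal excess (v - c)/c
L = 730e3           # approximate CERN -> Gran Sasso baseline, m

v = C * (1 + DELTA)     # neutrino speed
t_arrival = L / v       # lab-frame flight time, with emission at t = 0

def boosted_interval(beta):
    """Emission-to-arrival time seen by an observer moving at beta*c
    along the beam line (standard Lorentz transformation of t)."""
    gamma = 1 / math.sqrt(1 - beta**2)
    return gamma * (t_arrival - beta * L / C)

# In the lab frame the arrival follows the emission, as it should:
assert boosted_interval(0.0) > 0

# But for any observer with beta above c/v = 1/(1 + DELTA), about
# 0.999975, the arrival happens BEFORE the emission, which is the
# causality violation described above:
beta_critical = 1 / (1 + DELTA)
assert boosted_interval(beta_critical * 1.0000001) < 0
```

A boost of β ≈ 0.999975 is enormous for a macroscopic observer but perfectly legal in special relativity, which is why keeping the Lorentz transformations forces the causality problem.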

The second alternative is to consider violations of Lorentz invariance, and this is what most theorists would do. It remains true that the size of the violation is large and classical in nature. This is not some subtle quantum gravity effect that only reveals itself at the Planck scale. It has to be something that is hidden only because of the difficulty of detecting neutrinos. Lorentz violation would justify the headlines that “Einstein was wrong”, and not just at scales where spacetime structure is expected to break down: this is being seen at velocity scales accessible to a modest particle accelerator.

The measurements tell us that the superluminal velocity of the neutrinos does not vary much with energy. They do not seem to approach the speed of light as the energy increases, as classical tachyons would. In fact the lack of dispersion observed suggests a fixed speed for neutrinos, at least over the range of energies produced in the experiment. Other observations of cosmic neutrinos tell us that much lower energy neutrinos seem to travel at the speed of light. You can consider variations on the possible behaviour, but I think it is difficult to escape one of two possible conclusions. Either the speed of light a few kilometers underground, where the neutrinos passed, is faster than the speed of light above ground, or there is a second fixed speed everywhere that high energy neutrinos adhere to.
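For orientation, the size of the claimed excess follows directly from the reported timing. Taking the widely quoted figures of a ~60 ns early arrival over the ~730 km baseline (illustrative values, again not from this post):

```python
C = 299_792_458.0   # speed of light, m/s
L = 730e3           # approximate baseline, m
DT_EARLY = 60e-9    # reported early arrival, s

t_light = L / C                           # light travel time, ~2.44 ms
excess = DT_EARLY / (t_light - DT_EARLY)  # (v - c)/c
print(f"(v - c)/c ~ {excess:.2e}")        # ~2.5e-5
```

A 60 ns advance is only about 18 metres of head start over 730 km, which is why the fractional excess is so small despite being enormous by the standards of precision tests of relativity.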

In the first case you could drill a deep hole and send down an atomic clock; when you bring it back up, you would find that time has passed more quickly on it. This would have to be a much bigger effect than the known GR effects. I can’t see how such an effect would not have been seen in some other observation, so I won’t consider it further.
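A rough weak-field estimate (my own numbers) shows just how large the gap is between the required effect and ordinary gravitational time dilation over the Gran Sasso overburden:

```python
G_SURFACE = 9.81    # surface gravity, m/s^2
C = 299_792_458.0   # speed of light, m/s
DEPTH = 1400.0      # rock overburden quoted earlier in the post, m

# Weak-field GR: fractional clock-rate shift ~ g*h/c^2 over the depth
gr_shift = G_SURFACE * DEPTH / C**2
print(f"GR clock shift over 1400 m: {gr_shift:.1e}")   # ~1.5e-13

# Shift that would be needed to mimic a ~2.5e-5 speed excess:
needed = 2.5e-5
print(f"required shift is ~{needed / gr_shift:.1e} times larger")
```

The required clock-rate anomaly would exceed the known gravitational one by roughly eight orders of magnitude, which is why it could hardly have gone unnoticed.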

The remaining possibility is that there are two (or more) constant speeds everywhere in nature. This is not something you can attribute to small violations of Lorentz invariance; it would imply that Lorentz invariance is simply wrong. But wait: Einstein replaced special relativity with general relativity, where the spacetime metric is just a dynamical field associated with gravity. In GR the Lorentz transformation is just a subset of a more general class of transformations that locally preserve the metric. Suppose there were two metric fields that both transform according to the rules of general relativity, but one of them couples only to neutrinos and other weakly interacting matter. This, I think, is the best hope for a classical theory that could explain the superluminal neutrinos without causality violations.

However, with two metrics on spacetime you can combine them to define a preferred reference frame. For example, you can multiply one metric by the inverse of the other and construct the eigenvectors of the result to define vector fields that pick out a stationary frame. Effectively you have created an aether theory, but at least one where the aether field is dynamical and nearly invisible. I think this is the least radical way to explain the OPERA result if it stands up.
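As a sketch of this construction (a 1+1-dimensional toy model of my own with illustrative numbers, in units where c = 1): start in the preferred frame where both metrics are diagonal, boost to another frame, and then recover the preferred frame from the eigenvectors of one metric times the inverse of the other.

```python
import math

def mat_mul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_inv(A):
    """2x2 matrix inverse."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [[ A[1][1] / det, -A[0][1] / det],
            [-A[1][0] / det,  A[0][0] / det]]

def boost(beta):
    """1+1-dimensional Lorentz boost with velocity beta (c = 1)."""
    g = 1 / math.sqrt(1 - beta**2)
    return [[g, -g * beta], [-g * beta, g]]

V_NU = 1 + 2.5e-5   # hypothetical second invariant speed for neutrinos

# In the preferred frame both metrics are diagonal:
g_light = [[-1.0, 0.0], [0.0, 1.0]]
g_nu    = [[-V_NU**2, 0.0], [0.0, 1.0]]

# The mixed tensor M = g_light^{-1} g_nu transforms by similarity,
# M -> A M A^{-1}, so its eigenvectors transform as vectors:
A = boost(0.6)
M = mat_mul(mat_inv(g_light), g_nu)
M_boosted = mat_mul(mat_mul(A, M), mat_inv(A))

# The eigenvector of M_boosted with eigenvalue V_NU**2 points along
# the preferred frame's time axis. Solve (M' - lam I) x = 0 using the
# first row, a*t + b*x = 0, so (t, x) = (-b, a) is a solution:
lam = V_NU**2
a = M_boosted[0][0] - lam
b = M_boosted[0][1]
t_comp, x_comp = -b, a

# Its slope recovers the boost velocity, identifying the rest frame:
print(f"recovered frame velocity: {x_comp / t_comp:+.3f}")  # -0.600
```

The eigen-directions play the role of the aether field mentioned above: a dynamically determined rest frame that exists wherever the two metrics differ.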

What about the extra-dimensional theories that some people are getting excited about? They don’t escape the classical arguments I have given, and I suspect that these arguments can be made more robust if someone believes the OPERA result strongly enough to try. You will either have to accept strong causality violations or an aether field that determines the frame for a second fixed speed. Any such argument will make assumptions, but violating those assumptions would require a paradigm shift to something so radical that we can’t really anticipate it.

Of course the much simpler explanation is that the experiment has neglected some systematic error, but that is too boring.

Embargoes and Neutrinos

October 7, 2011

The Embargo Watch blog has revealed an interesting aspect of how the recent news of faster-than-light neutrinos was released, namely that information had been issued to mainstream media news outlets before the rumours even started to spread on the blogs. Their report includes a statement by the CERN press officer James Gillies detailing how he thinks the news broke, but it leaves out some important details. It is interesting to look back at what did happen, because news of other discoveries may emerge in a similar way in the future, so for the record here is the timeline as I witnessed it.

12th September – A seminar was scheduled at CERN for 16th September with the title “Seminar DG”. I saw it posted on Indico and added it to the viXra event calendar. There was no indication of what it was about, but as we now know CNRS had asked CERN if they could report their results there. CERN does not operate OPERA; it just provides the neutrino beam.

13th September – According to Embargo Watch journalists were briefed about the results at about this time and asked not to publish yet.

15th September – An anonymous commenter reported on Resonaances that a 6.1 sigma effect was about to be reported by CERN but that the seminar had been cancelled. I saw the comment and checked my link to the “Seminar DG” to find that it had indeed disappeared. I posted a note on an earlier Seminar Watch post and on Twitter, but was not sure if the rumour was genuine.

16th September – Anonymous comments were posted on Resonaances, Not Even Wrong and viXra saying that the report would be about faster-than-light neutrinos at OPERA and that the seminar had been rescheduled. I added a link to the new seminar to the calendar.

19th September – Dorigo posted a report about the findings on A Quantum Diaries Survivor. Posts quickly followed on viXra, The Reference Frame and other blogs. Dorigo then withdrew the post under pressure from his employer.

22nd September – Another Italian physicist gave an interview about it to an Italian paper. According to Gillies this is when CERN briefed some journalists with the intention that the news should be published the next day. Reuters and some others published immediately.

23rd September – An e-print appeared on arXiv in the morning and the news was widely reported in the media. The seminar was held later that day, the official press release was issued, etc.

What do we learn from this? Firstly, a week is too long to contain a rumour about particle physics, and if the rumour starts in Italy it is far too long. If they had stuck to the original schedule, the information would have emerged from the seminar as planned. Briefing the press and then delaying the seminar was not good. The original intention was to let the mainstream media prepare the story before the blogs got it, but the result was that the news leaked onto the blogs while the press were under an agreement to stay silent. What a mess.

When Dorigo posted, they should not have forced him to remove it. Other bloggers already knew what the news was, and by all accounts it was being discussed widely by physicists. I for one was ready to post more at that time anyway.

The CERN press office and the DG give the strong impression that they do not like bloggers they don’t have control over. As freelance bloggers we often get information in advance, and contrary to what some people think we don’t always post it. They need to stop working against us if they want that to continue.

ESA’s EUCLID to explore dark energy while NASA’s WFIRST is in doubt

October 5, 2011

Just as the Nobel prize in physics is awarded for the discovery that points to dark energy, Europe’s space agency has announced that it will go ahead with its mission to map out the effects of dark energy on the distribution of galaxies over time. The mission, christened EUCLID, is scheduled to launch in 2019 and will map the positions of galaxies out towards the edge of the observable universe. EUCLID was one of two missions that ESA announced yesterday under the banner “Dark and Bright”, the other being Solar Orbiter, to launch in 2017.

ESA's EUCLID observatory

The news comes shortly after doubt was cast on the future of WFIRST, a similar mission planned by NASA. The problem faced by the American space agency is that JWST, its ambitious next-generation space telescope, is over budget and absorbing funding from other projects.

The status of big science in the US has recently taken some big blows. With the Tevatron bowing out to the superiority of Europe’s Large Hadron Collider, and NASA’s manned space capability ending with the demise of the shuttle while China builds up for a spectacular new space programme, the days of US superiority in science seem to be fading into night. Many hopes now rest with the James Webb Space Telescope, which has the potential to be a ground-breaking observatory, especially for the exploration of the early universe, but the risk is high. The JWST is a complex instrument that will be sent to the second Sun–Earth Lagrange point, far away from the Earth. Even if the US had a manned space programme, there would be no hope of servicing the mission as was done for the Hubble Space Telescope. It has to work first time and keep working. At least the Americans can still say they are bold.

Nobel Prize for Chemistry 2011 is awarded to Daniel Shechtman

October 5, 2011

The Nobel Prize for Chemistry 2011 has been awarded to Daniel Shechtman for the discovery of quasicrystals. He has previously been awarded the Wolf prize in Physics.

Quasicrystals are substances whose molecular structure is ordered in a form similar to crystals but is not periodic. Such materials can naturally form into geometrical solids just as crystals can, but the solids formed from crystals are always shapes that can tessellate space, such as a cube. Quasicrystals, on the other hand, can form shapes such as a dodecahedron. The mathematical patterns that describe the layouts of the molecules were studied by Roger Penrose, but have also been found in medieval Islamic art.

Shechtman first observed such a structure, in a rapidly cooled aluminium–manganese alloy, in 1982, publishing the discovery in 1984.

Ho-Mg-Zn icosahedral quasicrystal