While you eat your Xmas dinner the engineers at the LHC are busy fixing up the systems ready for next year’s run, there is no time to lose. Here is a picture.
This has been a great year for experimental big science, with ground-breaking findings in particle physics and astronomy. One of the most remarkable breakthroughs has been the success in the search for planets around other stars. The word "exoplanet", which according to Google first appeared in print in 1995 and has become a popular term in news reports only in the last three years, has grown ever more familiar this year as reports from the Kepler space telescope have taken the number of candidate exoplanets into the thousands.
Kepler constantly watches about 145,000 main sequence stars in our nearby region of the Milky Way galaxy, in the direction of the constellations Lyra and Cygnus. It is looking for the tiny dimming of light that tells us that a planet has passed in front of the star's disk. By recording the amount of dimming, how long it lasts and how frequently it repeats, Kepler can estimate the size and orbit of the planet. In February NASA released a catalog of 1236 candidate exoplanets and this month the number increased to 2326. These have to be verified by ground-based observation, and so far the catalog of confirmed exoplanets has 716 entries.
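The transit arithmetic described above can be sketched in a few lines. This is only an illustration with round numbers, not Kepler pipeline code: the fractional dimming gives the planet-to-star radius ratio, and the repeat period gives the orbit through Kepler's third law.

```python
import math

# Illustrative transit arithmetic with round numbers -- not Kepler pipeline
# code. A transiting planet blocks a fraction of the star's light equal to
# the ratio of the disk areas, so depth = (Rp / Rstar)**2.
def planet_radius(depth, r_star):
    """Planet radius from fractional dimming, in the same units as r_star."""
    return r_star * math.sqrt(depth)

def semi_major_axis_au(period_years, stellar_mass_suns=1.0):
    """Kepler's third law: a^3 = M * P^2 with a in AU, P in years, M in solar masses."""
    return (stellar_mass_suns * period_years ** 2) ** (1.0 / 3.0)

# An Earth-sized planet (radius ~1/109 of the Sun's) dims a Sun-like star
# by only about 0.008%; a one-year repeat period puts it at 1 AU.
depth = (1.0 / 109.0) ** 2
print(planet_radius(depth, r_star=109.0))   # ~1.0 (in Earth radii)
print(semi_major_axis_au(1.0))              # 1.0 AU
```

The tiny 0.008% depth for an Earth analogue is why Kepler needs such extreme photometric precision.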
The real interest in exoplanets concerns whether or not there is other life in the universe, and if there is, how common it is. A whole new industry of exoplanetary statistics has been born, with scientists inventing habitability indexes that can be applied to the exoplanet catalogs to gauge which ones could support life. One habitable planet catalog has two exoplanets regarded as more Earth-like than Mars. These are HD 85512b and Gliese 581d, both found earlier this year, but they are rather large for us to live on comfortably. If they have an atmosphere it could be too thick due to the stronger gravity. Even Venus has an atmosphere so thick that the temperature and pressure are too high for us to survive. If we discovered an exoplanet like Venus we would be very excited because it is in the habitable zone and is very similar to Earth in size, yet finding out about its atmosphere would be difficult from a distance of many light-years.
This week some new "Earth twins" were announced: Kepler-20f and Kepler-20e. They are very similar in size to Earth, but they are not in their star's habitable zone, where the temperature would be about right for liquid water and conditions similar to Earth's. It is good news that Kepler has proven that it can find planets of this size, but we will need to wait before we find ones where we could really live. It is said that these planets may have been further from their star in the past, so life could once have formed there. This just serves to emphasize one more characteristic an exoplanet must have if it is to support life as we know it on Earth: it must stay in a stable orbit around a stable star for billions of years, so that life can evolve without being obliterated by heat or freezing.
Kepler has a planned lifespan of 3.5 years and may have its mission extended. This should give it time to find some more Earth-like planets orbiting Sun-like stars with periods of about one year. Kepler takes time to find these because they need to pass in front of their star at least twice to confirm their existence and orbit. Because Kepler works by looking at such transit events, it only sees planetary systems whose orbital plane is aligned with Earth. If it were looking at Earth from afar, there would be only a one in 700 chance that this alignment occurred. If Kepler finds one Earth-like planet we could guess that there are 700 in the sample it is looking at, which represents one millionth of the stars in our galaxy, but will it really find any?
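As a rough sketch of the alignment argument: the geometric chance that a randomly oriented orbit transits, seen from afar, is roughly the star's radius divided by the orbital radius. For an Earth-Sun analogue that comes out at a few hundred to one, the same ballpark as the one in 700 quoted above (which also folds in the need to catch repeat transits).

```python
# Geometric sketch of the transit alignment probability: roughly
# R_star / a for a randomly oriented orbit (an approximation, ignoring
# grazing transits and the repeat-transit requirement).
R_SUN_AU = 0.00465   # the Sun's radius in astronomical units

def transit_probability(a_au, r_star_au=R_SUN_AU):
    return r_star_au / a_au

p = transit_probability(1.0)   # a planet at 1 AU from a Sun-like star
print(1.0 / p)                 # ~215 such systems per transiting one
```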
My guess at this stage is that it will find a number of Earth-sized planets and a number of small planets in the habitable zone, but the statistics may be against it finding an Earth-sized planet in the habitable zone of a stable star like ours. Probably we will be able to estimate how many such planets there are, and it may be something like a million in our galaxy. It could be a lot less. The next step will be to determine how many are likely to have the right chemical mixture to form water and an Earth-like atmosphere. We don't yet know the answer, but it is exciting that the data we need to answer these questions is starting to become available.
Yes, I know that physicists don't use the term "God particle", but it has entered popular culture, and when the terms "Higgs Boson" and "God Particle" were trending on Twitter and Google earlier this week it was the latter that trended higher. Contrary to what some scientists imagine of the interested public, very few people think that there is some religious significance attached to the particle because of this name. It's just a catchy moniker and we need not be afraid to use it.
Following the CERN announcement earlier this week, physicists have been giving some very different assessments of the chances that the ATLAS and CMS detectors have seen the Higgs boson. The CERN DG says merely that they have seen some “interesting fluctuations”, while Tommaso Dorigo, (an expert on the statistical aspects of the CMS analysis) calls it “firm evidence“. Theorist Lubos Motl is even more positive. He says that it is a “sure thing“, but another theorist Matt Strassler has criticised such positive reports. He regards the situation as 50/50 and backed this up with a poll of experimenters that came up 9 to 1 in favour of uncertainty. This contrasts with a similar poll by Bill Murray who is lead Higgs analyst for the ATLAS collaboration. In an interview he reported a 10 to 0 vote that the Higgs had indeed been found.
So can we make a more objective and quantitative assessment of the current level of uncertainty over the result? You might want to know the probability that the Higgs Boson has been seen for example. Unfortunately this quantity depends on the prior probability that the Higgs Boson exists. Theoretical physicists have a very wide range of opinions on this depending on which theories they favour. Experimenters are supposed to make their assessments independently of such prejudices. So how can we measure the situation objectively?
Luckily there is a different question that is model independent. We can ask for the probability that the experiments would produce results as strong or stronger than those reported if there were no Higgs Boson. This conditional probability removes the theory dependence in the question so the answer should be a number that everyone could in principle agree on. The smaller this probability is, the better the certainty that the Higgs Boson has been found.
Before we can calculate the result we must define precisely what we mean by the "strength of the result". This has to be a single number, so it should come from the combined results of both experiments. I will define it to be the maximum value of the CLs likelihood ratio anywhere on the plot. This takes into account both the exclusion side and the signal side of the statistics and is in standard use for Higgs searches. Don't worry if you are not familiar with this quantity; it will become clearer in a minute.
The Higgs combination group have tried to spread propaganda that my unofficial combinations cannot be trusted because only people familiar with the inner details of the experimental analysis are capable of doing it correctly. This is not true. I repeatedly acknowledge that my method is an approximation and that only the official combination can be used to claim a discovery, but it is a good approximation and is perfectly acceptable for making a rough assessment of the combined certainty.
They warn that people should not add the event histograms from separate experiments but that is not how my combination is done. They say that only the experts can understand the systematic uncertainties of the detectors well enough to do the combination, but these uncertainties are all built into the individual exclusion plots that they have shown and are therefore taken into account when I combine them. They warned in the past that there are correlations between the background calculations because both experiments use the same algorithms. These correlations are there and must be accounted for to get the most accurate combination possible, but they have been shown to be small. You can ignore these correlations and still get a very good approximation.
In fact the largest source of error comes from the fact that the approximate combination method assumes a normal (Gaussian) probability distribution at each mass point, when in reality a more complex function based on Poisson distributions would be correct. Happily the central limit theorem says that any distribution with finite variance becomes approximately normal given high enough statistics, so the approximation gets better as more data is added.
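As a minimal sketch of what the Gaussian approximation buys you, with made-up numbers rather than actual ATLAS/CMS values: at a single mass point each experiment's result is treated as a normal measurement, and the two are merged with inverse-variance weighting.

```python
import math

# Minimal sketch of the Gaussian approximation at one mass point, with
# hypothetical numbers (not the official CLs machinery): each experiment's
# signal strength mu is treated as a normal measurement and the two are
# combined with inverse-variance weighting.
def combine(mu1, sigma1, mu2, sigma2):
    w1, w2 = 1.0 / sigma1 ** 2, 1.0 / sigma2 ** 2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    sigma = math.sqrt(1.0 / (w1 + w2))
    return mu, sigma

# A 2.0-sigma excess and a compatible 1.5-sigma excess combine to
# roughly 2.5 sigma:
mu, sigma = combine(1.0, 0.5, 0.9, 0.6)
print(mu / sigma)
```

This is why two modest, consistent excesses can be more persuasive together than either one alone.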
When the combination group published their first result in November I was able to compare it with my unofficial combination done in September. This confirmed that the approximation was good. This was no surprise to me because it had already been demonstrated with the Tevatron combinations and some earlier unpublished LHC combinations. I acknowledge that my combinations for some of the individual channels were not so good because the number of events has been low, especially for the ZZ channels. This will have improved for the latest results because there is now much more data but still these individual channel combinations should be considered less certain than the overall combination.
The assessment I am doing today depends mainly on the overall combination, so this is not a big issue. However, it is worth showing one further comparison between my combination and the official one for a signal channel. The plot below shows the official combination for the diphoton channel published in November, when ATLAS used just 1.1/fb and CMS used just 1.7/fb. The red line is the unofficial result from viXra. It will be interesting to see how much this has improved for 5/fb.
It is possible to do a systematic evaluation of the probability in question using the combined plot. This takes into account the statistical uncertainties as well as the theoretical uncertainties in the background due to imprecise measurements of the standard model parameters (e.g. the W mass) and the approximation methods used in the theory. It also includes the uncertainty in the energy resolution and other similar uncertainties in detector performance. All these things have been considered by the experts from the experimental collaborations and built into the plots, so we don't need to know the details to do the calculation. (If anyone tries to claim otherwise they are wrong.)
However, there is also the possibility that the experimenters have made some more fundamental kind of error. There may be a subtle fault in the detectors that has not shown up in all the calibration tests which causes an excess on the plot where there should not be one. This should not happen because there are hundreds of people checking for such errors and they are all very competent. Nevertheless bad luck can strike and throw everything out. This has been the case before and it is probably the case with the OPERA result indicating that neutrinos are faster than light.
A second similar possibility is that the theorists have underestimated the accuracy of some of their calculations so that the background calculation is a little off in one mass range. The analysis involves subtracting a very small signal from a large background, especially in the diphoton channel, so the scope for magnifying any inaccuracy has to be considered. A miscalculation of the signal size is also possible but less likely to lead to a bad result.
As I said, the published plots include all the known experimental and theoretical uncertainties, but these other unknown errors in experiment and theory cannot be accounted for exactly. They can only be estimated based on past experience. Some "expert theorists" say that we more "naive theorists" don't appreciate these facts. Do we really sound so stupid?
How often do experimental faults contribute to a false positive like the excess reported this week? We can only look at past performance, but I am not aware of any careful surveys, so a guesstimate is required. Someone else may be able to do better. The answer might be one in a hundred, but let's be more conservative and say one in ten. If you think it is more common, please feel free to re-evaluate for yourselves.
However, with the CERN Higgs result we have good evidence that such a fault is not the cause of the excess. That is because there are two independent experiments reporting a very similar result. ATLAS and CMS may seem very similar from the answers they produce, but the detector technologies they use are quite different. The chance of a common fault producing the excess in both detectors must therefore be very small. I am going to assume that this is negligible. If anyone thinks otherwise please explain why.
This means that if the excess is due to such a fault, it must be a coincidence that it has a similar effect for both experiments. If there is a one in ten chance of a fault in one experiment, the chance for two independent experiments is one in 100, but even then that is only the chance that they would each produce a fault, possibly at different places. Let's have a look at the two signal plots together.
The positions of the maximum excess differ by about 2 GeV, but the mass resolution is around 2%, so this is not an inconsistency. If these excesses were produced by detector faults, the chance of them lining up so closely would be small. How small? That depends on some unknowns. We can't just say the fault could appear anywhere in the mass range, so let's be conservative and just call it a one in three chance.
Overall then we arrive at a one in 300 chance for the observed excess to be explained by a coincidental combination of detector faults. I think this is conservative. Someone else might estimate it to be more probable.
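The arithmetic behind that one in 300 is just the product of the estimates above:

```python
# An independent fault in each experiment (one in ten each) that also
# lands at nearly the same mass (taken above as one in three).
p_fault_each = 1.0 / 10.0
p_same_place = 1.0 / 3.0
p_coincident_faults = p_fault_each * p_fault_each * p_same_place
print(1.0 / p_coincident_faults)   # ~300
```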
The other outside possibility is that the result has been afflicted by a misunderstood background so that the observed excess is really just a subtle effect of the Higgsless standard model that the theorists failed to recognize or estimate correctly. Again this is unlikely but it happens and must be considered. How often does it happen? Once in a hundred perhaps? I will be more cautious and assume one in ten. You may think that is an underestimate in which case you can make your own evaluation.
But again we have more than one place to look. The separate experiments could well be affected by the same theoretical error but the different decay channels are much more independent. There may be some small chance that a single theoretical error could affect all the channels but this would have a small probability, say one in a hundred. If you think it is bigger please justify how that could happen.
So now let’s look at the combined signal plots for the three main channels; diphoton, ZZ->4l and WW->lvlv. For the WW plot I can’t use the latest CMS results because the plots shown are frankly rubbish quality. I hope they will improve them before publication. However the WW channel has good sensitivity even with less data so I will show the combination from the summer.
All three channels show an excess in the same low mass region, so if this is due to independent faults it would require a coincidence. However, the excess is not as good in ZZ and WW as in the diphoton channel. I am going to put the probability at one in a hundred overall and add to this a probability of one in a hundred for a common fault that affects all three. So the overall chance of a fault from theory is one in 50. Some people will say that this estimate is too low and some will say that it is too high. Others will say that it is nonsense to attempt such an estimate. Never mind, I am just giving it my best honest shot. Let others do the same.
The last thing to consider is the probability of getting a signal as strong or stronger than that observed according to the statistical analysis. Actually this also takes into account some theoretical uncertainty and measurement error, but mostly it is statistical. This is a probability that can be worked out more scientifically, but it does include the Look Elsewhere Effect, which is partly subjective.
First consider what would be the chance of seeing a signal as strong as the one reported at the fixed mass point of the maximum excess if in fact there was no Higgs Boson. The plot shows a three sigma excess at 124-125 GeV. This would have been much stronger if the peaks from the two experiments had coincided more closely, possibly about 4 sigma. This discrepancy may be due to some detector calibration that could be corrected but it is correct that we do not take that possibility into account. The 3 sigma excess is what we should work with.
As everyone knows, the probability of a three sigma fluctuation is one in 370, but that allows for fluctuations up or down. So the probability for an excess this big or stronger at this point is one in 740. But we need to know the probability for an excess this strong anywhere on the plot. In other words we need to multiply by the Look Elsewhere Effect factor. Have a look at the plot over the entire range.
Notice that for the entire range from 130 GeV to 600 GeV the line remains within 2 sigma of the zero line. Big deviations are indeed rare but how rare?
Another point to consider is that if there were a three sigma fluctuation at, say, 180 GeV, the Higgs would still be excluded at that point. This would not count as such a strong signal. This is why I specified that the strength should be measured using the CLs statistic, which takes the ratio of the probability for the signal hypothesis over the probability for the null hypothesis. This means that the probability of getting a signal as strong in the regions where the Higgs is excluded is much smaller. In fact we can neglect this altogether. So we need only count the regions from 114 GeV (using LEP) to about 135 GeV, and perhaps 500 to 600 GeV. How big is the LEE factor for these regions? This depends on the width of the signal, which we see to be about 5 GeV in the low mass range due to the mass resolution of the detector, and which is much bigger above 500 GeV due to the very large natural width of a high mass Higgs Boson. The LEE factor will therefore be about 6, but let's call it 10 to be extra cautious.
This gives a final answer for the probability of a fluctuation to be about one in 70.
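These numbers can be checked against the standard normal tail, using only the figures already stated above:

```python
import math

# A 3 sigma fluctuation has a two-sided probability near 1 in 370 and a
# one-sided probability near 1 in 740; multiplying by a look-elsewhere
# factor of 10 gives roughly 1 in 74 ("about one in 70" above).
def one_sided_p(n_sigma):
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

p_local = one_sided_p(3.0)
print(1.0 / p_local)        # ~740
p_global = 10 * p_local     # apply the LEE factor of 10
print(1.0 / p_global)       # ~74
```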
Combining the three things I have considered, I get an overall probability for such a strong signal if there is no Higgs of about 1 in 30. Perhaps I have failed to account for combinations where more than one of these effects could combine. That requires further coincidences, but let's just call the overall result 1 in 20. In other words, everything considered, I take the observed result to be a two sigma effect.
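The bookkeeping for that overall figure: the three contributions are small and treated as roughly independent, so the chance that at least one of them explains the excess is close to their sum.

```python
# Summing the three small, roughly independent probabilities estimated
# in the preceding sections.
p_detector = 1.0 / 300.0   # coincident detector faults
p_theory   = 1.0 / 50.0    # misestimated background from theory
p_fluct    = 1.0 / 70.0    # statistical fluctuation including the LEE
p_total = p_detector + p_theory + p_fluct
print(1.0 / p_total)       # ~27, i.e. about 1 in 30
```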
There is one more thing you need to take into account when considering how likely a result of any number of sigmas significance is to stand the test of time. That is your prior estimate for the probability of it being true. The OPERA neutrino observation is a good example of an extreme case. A six sigma effect was observed, but the prior probability of neutrinos going faster than light would be considered very small by most theoretical physicists. It follows that the probability of this result going away is quite high despite the statistical significance. An experimental fault is likely to be the biggest contributing factor, despite the care of the experimenters.
In fact most 3 sigma excesses for observations beyond the standard model do go away. This is because the prior chance of any one such effect being correct is very small. You can consider this to be part of the Look Elsewhere Effect too. However, the observation of the Higgs Boson is a very different case. Most theoretical physicists would estimate that the prior probability for the existence of a Higgs(like) Boson is very high. The standard model provides a very simple explanation of electroweak symmetry breaking, but there is no simple way to understand a Higgsless universe. This makes the prior probability high, which means that the chance of the 2 sigma result going away is small. There is a bigger chance, however, that it could move to a different mass.
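The prior argument can be made concrete with Bayes' theorem. The numbers here are illustrative round figures, not measured values: p_sig is the chance of an excess this strong if the effect is real (taken as 1), and p_bkg the chance if it is not (the 1-in-20 estimate above).

```python
# Illustrative Bayes update with made-up round numbers: the posterior
# probability that the effect is real, given the excess, depends strongly
# on the prior.
def posterior(prior, p_sig=1.0, p_bkg=0.05):
    return prior * p_sig / (prior * p_sig + (1.0 - prior) * p_bkg)

print(posterior(0.9))     # high prior (a Higgs-like boson): ~0.99
print(posterior(0.001))   # tiny prior (e.g. FTL neutrinos): ~0.02
```

The same strength of evidence leaves a high-prior hypothesis nearly certain and a low-prior one still probably wrong, which is the point being made above.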
Not everyone agrees with this. Some people do not think that the Higgs Boson can exist; Stephen Hawking is one of them. These people would assign a low value to the prior probability that a signal for the Higgs will be seen, and so they will consider it very likely that the present observation will go away. I doubt that there are enough people of this opinion to account for much doubt among the experimenters.
To claim a discovery the combined results must give a 5 sigma excess without considering the Look Elsewhere Effect. How long this takes depends on a certain amount of luck. If the peaks of the excesses come closer together with more data, then the excess will grow faster than you would otherwise expect. In that case the matter might be settled with just twice as much data and the whole thing will be over by the summer. On the other hand, if they are unlucky it could easily require the full dataset from 2012 to get enough data to finish the job properly. It would then not be until March 2013, when the combination is ready, that they could finally declare a discovery.
I have been accused by theorist and blogger Matt Strassler of being over-enthusiastic about the case for the Higgs Boson and the strength of the latest results. In fact I have not made any overly strong claims. Examples of things I have said previously include:
“The result is very convincing if you start from the assumption that there should be a Higgs Boson somewhere in the range. Everywhere is ruled out except 115 GeV to 130 GeV and within that window there is a signal with the right strength at around 125 GeV with 3 sigma significance. They will have to wait for that to reach 5 sigma to claim discovery and next years data should be enough to get there or almost.”
“Some caution is merited. The signal is only 3 sigma combined and the possibility of systematic contributions is there. However, look elsewhere effect is very small given that most regions are strongly excluded. Systematic effects look less likely because of consistency across channels. I agree with Dorigo’s more optimistic assessment but until they have 5 sigma it is not a discovery and collapse of the signal is not out of the question.”
“This is basically a half-full/half-empty result. You can state it optimistically or pessimistically according to your political requirements. Another twelve months will be needed to settle it, but it is much more probable that it will be settled with a positive outcome.”
I have not said anything stronger than this and I stand by what I said. These are not bold claims and are no different from what has been said by some of the prominent members of the collaborations. I find it very bizarre that someone is insinuating that my conclusions are overstated and naive. My detailed assessment of the situation here bears out my earlier level of optimism. If anyone wants to criticize any aspect of my calculations I am open to discussion, but if you think I should just bow to the superior authority of people in apparently better positions, please forget it.
Good morning and welcome to what is expected to be an exceptional day for physics, as CERN announces important new results in the hunt for the elusive Higgs Boson. Here in one mammoth expanding post I will be reporting on the search for the Higgs Boson in straightforward terms, free from silly analogies and patronizing phrases such as "for the layman". I hope that many interested people with varying degrees of foreknowledge will find the level helpful. I will explain the basic preliminaries first, but if there is anything you don't understand just Google it or wing it.
The present excitement started to build during the summer when it became clear that the Large Hadron Collider experiment was gathering data at a much higher rate than anticipated, meaning that they would soon be able to tell whether the Higgs boson exists or not and most importantly, what mass it has.
I am a theoretical particle physicist based near London independent of the teams working at CERN, and I have been following events at the Large Hadron Collider and blogging about them since it started colliding protons in 2009. In a minute I will answer a few basic questions about the Higgs for the uninitiated, including the Paxman question “What does the Higgs boson look like?” Then I will be live-blogging the events from CERN as they happen, so first let’s look at the schedule of today’s events.
During the 1960s and 1970s theoretical physicists using data from the first generation of particle accelerators assembled a theory of elementary particles known as the Standard Model. It included familiar particles such as the electron, photon and neutrinos as well as unseen quarks that bind to form protons and neutrons inside the atom. All the particles in the standard model are of two types with one exception.
The particles which build up matter are all spin-half fermions which obey an equation formulated by Dirac in 1928. They include the three generations of quark pairs and the three corresponding pairs of leptons: the electron, muon and tauon with their neutrino partners. Each of these has an antimatter partner, so there are 24 distinct fermions in the Standard Model. The second set of particles are the spin-one bosons. These play the role of binding together the fermions with the electromagnetic force (the photon), the strong nuclear force (the gluons) and the weak nuclear force (the W and Z bosons). Of these only the W is charged and so has a distinct anti-particle, meaning that there are 5 different bosons.
Aside from these it was found that the standard model required one further particle. It was known that a consistent model of spin-half fermions and spin-one bosons free from infinities required gauge symmetry, that is a mechanism that would in theory make the bosons massless. On the other hand, nature had shown that the bosons that mediate the weak nuclear force must have mass. The solution was a mechanism worked out around 1960 by a number of physicists that introduces an unusual field into the theory. The field has an unorthodox energy potential that is minimised away from the central point of symmetry so that the value of the field in the vacuum state of space-time must be shifted away from the central point, thus breaking the underlying symmetry and giving mass to some of the particles.
Peter Higgs, one of the pioneers of this mechanism, pointed out that the remnant of this field in its broken form would have excitations corresponding to a unique elementary particle that might be observed as final confirmation of the theory. Unlike the other particles in the Standard Model, this one would be a spin-zero boson. Observation of this hypothetical particle, named the Higgs Boson in his honour, is what the Large Hadron Collider has been looking for, 50 years later.
The Higgs Boson exists only for fleeting moments as a fuzzy quantum wave on scales smaller than the inner workings of the proton. It is therefore impossible, both theoretically and practically, to "see" it in the normal sense of the word. What we can see are traces of its existence in data gathered from countless collisions between high energy protons in the Large Hadron Collider.
In the LHC at CERN on the Swiss-Franco border near Geneva, physicists have been accelerating protons to unprecedented high energies in a circular underground ring 27 km in circumference. When the protons are brought together in a head-on collision the energy can form new particles, perhaps including some never observed before such as the Higgs boson. Many trillions of collisions have been observed but the processes that form a Higgs boson are so rare that only a few thousand are likely to have been created in the experiment so far.
Once created, a Higgs boson should live for a fleeting 10^-22 seconds, enough time for it to travel between ten and a hundred times the width of the protons from which it emerged. Then it decays into other particles, most often a matched pair of bottom/anti-bottom quarks, which have a much longer lifetime of 10^-12 seconds. As the bottom quarks fly apart, a string of gluon flux stretches between them before breaking to form new quarks. These emerge, along with the decay products of the bottom quarks, as jets of hadrons that reach the detectors. Sometimes the bottom quarks will each decay into another quark plus a lepton (electron or muon) with an accompanying neutrino. The lepton makes tell-tale tracks in the detector, while the neutrino flies off without a trace, its presence only inferred when the energy of all the other particles is added up and some is found to be missing. Unfortunately there are many other less remarkable processes that produce similar jets and leptons at the LHC, making it very difficult to observe the Higgs Boson when it decays in this way.
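A quick check of that distance scale: at close to the speed of light, a lifetime of 10^-22 seconds covers a few tens of femtometres, which is tens of proton widths (a proton is roughly 1 fm across).

```python
# Distance covered in one Higgs lifetime at near light speed,
# converted from metres to femtometres.
C_M_PER_S = 3.0e8    # speed of light, approximate
tau_s = 1.0e-22      # Higgs lifetime in seconds
distance_fm = C_M_PER_S * tau_s * 1e15   # metres -> femtometres
print(distance_fm)   # -> 30.0 fm, a few tens of proton widths
```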
If the latest rumours about the measurements at CERN are correct, the Higgs Boson could have a mass approximately equal to that of a caesium atom. If so, about one in 500 of the Higgs bosons produced will decay into two high energy photons that fly away in opposite directions. Unlike the bottom quarks, these fly away cleanly, carrying all their energy and momentum to the inner layers of the detectors, where a surrounding vessel of liquid argon has been placed to capture them. There they produce a shower of lower energy particles that are carefully tracked so that their energy and trajectory can be measured to reveal the parameters of the original photon. During all of this year's run at the LHC this may have happened only a dozen times in each detector, but it could be enough to reveal the Higgs Boson.
Such photons will be thousands of times more energetic than the harmful gamma rays that emanate from nuclear reactions, but they are still photons identical to those of light which differ only by having less energy. If you want to know what the Higgs boson looks like it is the faint glow of these rare photons that answers the question most directly. In the LHC they shine faintly among the brighter radiation of other processes that produce equally energetic gamma rays. The ones coming from the Higgs Boson can only be noticed when enough have been detected to show up as a slightly brighter peak in the energy spectrum of thousands of observations. It is this that we are hoping to hear news of today.
The theory of the Higgs Boson has been around a long time and all the other particles of the standard model have been found. Several of them were found after they were predicted by the model, especially the gluons, W and Z bosons and top quark. This means that the Standard Model is on very solid ground experimentally. Indeed, physicists have been hoping for some experimental deviation from its predictions for decades and have come away disappointed. Every experiment just seems to confirm its correctness with ever more accuracy. (There are some exceptions, such as the measurement of the muon magnetic anomaly and the cosmological observation of dark matter, that seem to point to something beyond the standard model at higher energy.)
With such success it is no wonder that the theorists are quite confident that the Higgs Boson will be found as the last missing piece of the Standard Model. However, experiment is the ultimate judge of nature and theorists are not always right. A minority of physicists notably including Stephen Hawking and Nobel laureate Martinus Veltman have said that they do not believe the Higgs Boson will be found because according to their theories it cannot exist. They are considered contrarians by other physicists but until the “Goddamned” particle has been found nobody can be certain.
One thing that is sure is that the Large Hadron Collider will either discover the Higgs Boson or rule it out as predicted by the Standard Model. If all goes well this will be achieved before the end of 2012, perhaps much sooner. It has been said that if the Standard Model Higgs Boson is ruled out it will be an even greater discovery than its mere existence. This is not just an excuse for what some people may portray as a failure. Such a result would indeed be a breakthrough, inevitably leading to a new and better understanding of physics.
It is also possible that the Higgs Boson exists but that its characteristics are different from those of the Standard Model. In particular, it may decay into other lighter unknown particles, making it hard to detect. In that case it might appear not to be there even though it is. That will still count as ruling out the Standard Model Higgs but until further experiments are done it will not be known whether it does not exist at all, or is merely hidden from view by non-standard processes. Another even more exciting possibility is that there is more than one Higgs Boson, possibly including some heavier versions that are charged. This is predicted by some grander theories such as supersymmetry.
However, results from the LHC so far suggest that whatever happens there will be something positive to report today. It will not be quite a full discovery but it will be a strong signal that something like a Higgs Boson exists. Although we have heard some quite detailed rumours already, it is only by seeing the actual graphs that we can get a good idea of what the possibilities are. All physicists are now eagerly waiting to see them.
You might think that since the Higgs Boson was predicted 50 years ago its discovery today will not be very exciting news. Indeed, before the LHC started collecting data, many physicists saw its discovery as inevitable and uninteresting. This view has changed, partly because nothing else has been quick to manifest itself at the LHC as hoped. This means that the Higgs Boson is likely to be the leading discovery of any new physics.
The mass of the Higgs Boson is a free parameter in the Standard Model. Once it is known, all other features such as its lifetime and interaction rates can be calculated. However, analysis of the physics of the Standard Model shows that if the mass is not within strict limits the theory will break down at higher energies. In particular, if it is too light the vacuum will not be sufficiently stable, but we know that this cannot be happening in the real world. The mass range in which the Higgs Boson can still be found includes values where this would be a problem for the theory.
If it is lighter than 126 GeV then that may be an indication of new physics that could be found with more data. The theory of supersymmetry, which is very popular with theorists, actually favours the lighter Higgs and corrects problems with the stability of the vacuum, but it does not sit so well with a heavier mass. For these reasons today’s announcement could signal the direction of research for future physics, depending on what mass is indicated by the experiments.
Despite the rumours, it is not certain exactly what will be shown today, but we are hoping for full reporting of all the results in the Higgs search from the two individual experiments. This would include the analysis of each possible decay mode that the experiments can currently observe, plus two combinations of results from all channels, one for ATLAS and one for CMS. The amount of data collected this year corresponds to an integrated luminosity of 5 inverse femtobarns (5/fb) in each experiment, so anything less than this is not complete.
There are three sets of decay channels that are currently of special relevance to the search:
If recent rumours are correct it is the diphoton channel that holds the most interest, with a signal of a possible Higgs Boson at a mass of 125 GeV, but we will be very interested in the other channels to see if there is any supporting evidence or signs of anything at other masses. It will be especially interesting to see if the earlier weak signal at 140 GeV has gone away entirely. These and other channels may provide signs of something interesting at higher masses but most likely there will just be a strengthening of the evidence for exclusion above 140 GeV.
During the presentations delivered by the collaborations today we will see a lot of new graphs. If you are not familiar with these they will require some explanation. The ones that everyone will be looking out for are the “Brazil band” plots, named for their distinctive green and yellow bands. These plots are the main way of showing the results from each Higgs Boson decay mode as well as the all important combinations.
Here is the best LHC combination plot for Higgs boson searches made public prior to today. It incorporates about a third as much data as gathered during the whole year and was shown in November at the Hadron Collider Physics conference, but I have redrawn it to add some extra features. (With any plot on this blog you can click on the image to enlarge for a clearer picture)
The horizontal axis is marked with the range of possible masses for the Higgs Boson. The units are Giga electron-Volts as an energy equivalent of mass. This is the standard way to measure mass in an accelerator experiment. If the Higgs Boson has a mass of 125 GeV as rumoured you should be able to see where it would appear on this plot.
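If the energy units seem abstract, a quick back-of-envelope conversion using standard constants gives a feel for what a 125 GeV particle weighs:

```python
# Convert the rumoured 125 GeV Higgs mass into everyday units.
GEV_PER_KG = 1.783e-27   # mass equivalent of 1 GeV/c^2 in kilograms
M_PROTON_GEV = 0.938     # proton mass in GeV

m_higgs_gev = 125.0
m_higgs_kg = m_higgs_gev * GEV_PER_KG    # ~2.2e-25 kg
protons = m_higgs_gev / M_PROTON_GEV     # ~133 proton masses
```

In other words the Higgs would weigh as much as a caesium atom, all packed into a single elementary particle.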
The black line is usually called “Observed CLs” and represents the calculated result from all the experiments. Its value for any given mass gives a quantity labelled “95% Confidence Level limit for σ/σSM” on the vertical axis. What does this mean exactly? Take an example: at 200 GeV the observed CLs has a value of about 0.6. What this says is that if the signal cross-section over all the decay modes were just 0.6 times the amount expected if the Standard Model is correct and the Higgs Boson has a mass of 200 GeV, then there would be a 95% probability of seeing more events than they did. This is a roundabout way of saying that we have seen far too few events, so we can rule out the Higgs Boson at this mass with some confidence.
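The logic can be sketched with a toy counting experiment. The numbers below (background of 100 events, a Standard Model signal of 30, an observed 95) are invented purely for illustration, and this is the simpler CLs+b-style construction rather than the full CLs ratio the experiments actually use, but the idea of scaling the signal strength until only 5% of experiments would see so few events is the same:

```python
import math

def poisson_cdf(n, lam):
    """P(N <= n) for a Poisson with mean lam, summed in log space to avoid overflow."""
    return sum(math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))
               for k in range(n + 1))

def limit_95(background, sm_signal, n_observed):
    """Scale factor mu on the SM signal such that, if the true rate were
    background + mu * signal, 95% of experiments would see more than
    n_observed events (i.e. P(N <= n_observed) = 0.05). Found by bisection."""
    lo, hi = 0.0, 20.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if poisson_cdf(n_observed, background + mid * sm_signal) > 0.05:
            lo = mid   # not enough signal yet to make the observation this unlikely
        else:
            hi = mid
    return 0.5 * (lo + hi)

mu95 = limit_95(background=100.0, sm_signal=30.0, n_observed=95)
```

With these toy numbers the limit comes out around 0.4, i.e. a slight deficit of events excludes any signal stronger than 0.4 times the Standard Model rate, just as the black line does at 200 GeV.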
When the black line descends below the red horizontal line at 1.0 on the vertical axis, people sometimes say that the Higgs Boson has been ruled out at the 95% confidence level at that mass. This is not strictly correct because such confidence would depend on our prior assessment of the probability for the existence of the Higgs Boson in this mass range in the first place, and the “Look Elsewhere Effect” would also have to be considered. Such knowledge is subjective and dependent on outside influences, but loosely speaking you can interpret it that way.
In the background of the plot I have shaded areas in various grades of pink. The lightest pink indicates an exclusion at 95% confidence. This is often stated as 2-sigma significance because statistically it corresponds to 2 standard deviations away from the normal expectation. Darker shades of pink indicate 3-sigma and 4-sigma confidence. Until recently it was generally accepted that 2 sigma was enough to rule out the presence of the Higgs Boson at a given mass, but recently people have said they want 5-sigma significance, the same as for the discovery of a new particle. I think in reality most people will accept 3 sigma for exclusions.
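The sigma-to-confidence dictionary is just the tail probability of a Gaussian, which the standard error function gives directly. (Strictly, 95% one-sided corresponds to about 1.64 sigma and 95% two-sided to 1.96 sigma; "2 sigma ≈ 95%" is the common rounding.)

```python
import math

def one_sided_p(n_sigma):
    """Probability of a Gaussian fluctuation at least n_sigma above the mean."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

# 2 sigma -> p ~ 0.023, 3 sigma -> p ~ 0.0013, 5 sigma -> p ~ 2.9e-7
p2, p3, p5 = (one_sided_p(n) for n in (2, 3, 5))
```

The tiny 5-sigma tail probability, about one in 3.5 million, is why that threshold is reserved for claiming discovery of a new particle.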
But we are no longer just interested in exclusions. How do we know from this plot if the Higgs Boson has been seen? This is where the yellow and green bands come in. The central blue line indicates the expected value under the condition that no Higgs Boson exists at a given mass. The green and yellow bands are the 1-sigma and 2-sigma deviations from that expectation. This means that if there is no Higgs Boson the observed CLs line should wander within these two bands. Statistically it is likely to go outside the yellow bands for about 5% of its range. When we look at the plot we see that this is indeed the case. Despite the excess exceeding 2 sigma around the 140 GeV region, we can only say that the result is consistent with the absence of a Higgs Boson over the whole range. That is not a very encouraging way to put it. Notice that mass ranges where there are excesses are shaded in grades of green in the background.
Can we at least say that the plot is also consistent with the hypothesis that there is a Higgs Boson somewhere in the mass range? We can see that it is excluded over the range from 140 GeV to 480 GeV at 2-sigma significance but we can still accept the possibility that it is in the low or high mass region. There are theoretical reasons to strongly doubt that it is at the high mass end, so the range 115 GeV to 140 GeV is the best bet.
It is possible to display the same results in a different way that handles the existence and exclusion of the Higgs Boson in a more symmetrical way. This is sometimes called the “best fit” plot or “signal” plot and for the combination above it would look like this.
The experimenters don’t often display their results this way, but as a theorist I find it the best plot for giving a feel for where we stand. If I can get the data from the talks today I may show some of these plots.
The black line varies around a range of signal values where a signal of zero would indicate just the Standard Model background with no Higgs Boson and a signal of one is just the right strength for its existence. The blue and cyan bands are error bands (mostly statistical) around the observed data. When the blue and cyan error bands extend over the whole range between the red line at zero and the green line at one we really have no indication either way for a Higgs Boson or its exclusion in the mass range. However, when it starts to settle on one of either the red or green line and moves clear of the other, then we know that we have the right signal strength for the presence or absence of the Higgs Boson.
Whatever comes out today there will still be a lot more work to be done. At the moment the LHC is shut down for the winter to allow for maintenance and to save electricity at a time when domestic demand is highest. It will start up again in February next year. Meanwhile the physicists will be using the time to continue the analysis of the data already collected during 2011, and that will include preparing the official combination of today’s results from ATLAS and CMS.
Next year the LHC will run again, probably at a slightly higher energy of 8 TeV rather than the 7 TeV used this year. It is expected to collect three times as much data in 2012 as it did in 2011 so by the end of the year they will have a total of at least 20/fb on tape for each of ATLAS and CMS. If they don’t already have enough data to know whether the Higgs Boson exists they almost certainly will by then.
More importantly, they will start to study the properties of the Higgs Boson to check that it matches the standard model by decaying into all types of lighter particle at the predicted rates. If it doesn’t then they will know that there is new physics outside the Standard Model to be understood.
That assumes that the standard Higgs Boson will show up. If it doesn’t they will have the job of looking for what replaces it. That can be done by looking at interactions between W bosons, which should get stronger with increasing energy if there is no Higgs Boson until something gives. Present rumours suggest that the Higgs does exist but these WW scattering experiments will still be interesting.
After 2012 the LHC will shut down for about 18 months to prepare it for running at higher energies, probably 13 TeV during 2015 and 14 TeV later. They will be searching for more new particles but they will also be checking the parameters of the Standard Model, including the Higgs Boson, in more detail to eke out any signs of dark matter or anything else not seen before. The LHC will continue to run at higher luminosity and possibly even higher energy for perhaps another 30 years. This is just the beginning of what it has to do.
This morning ATLAS have released an update to the Higgs search in the WW -> lvlv channel. They are using 2.05/fb in place of the previous 1.66/fb so it is only a small advance. This had been around for some time unofficially but was not shown at the HCP2011 conference. Hopefully it will be obsolete in a matter of hours but here is the plot anyway. It provides 95% exclusion from 145 GeV to 200 GeV.
Just to remind everyone, the official build-up for this event is as follows: “These results will be based on the analysis of considerably more data than those presented at the summer conferences, sufficient to make significant progress in the search for the Higgs boson, but not enough to make any conclusive statement on the existence or non-existence of the Higgs.”
If you come here expecting a life-changing discovery to be announced you will be disappointed, but if you want to see some science in action taking a small step forwards you may enjoy it.
With two hours to go the auditorium was already full.
Here in the UK the BBC are already running reports on the network news. They are saying that each experiment is finding a blip in the same place giving a strong hint of the Higgs.
Speakers introduced, talks getting underway
ATLAS have updated the three most sensitive channels: diphoton to 4.9/fb, ZZ->4l to 4.8/fb and WW->lvlv to 2.1/fb (as above)
I have the CMS combo; here it is, with exclusion from 130 GeV up. An excess of about 2.5 sigma is seen at about 123 GeV.
Here is the CMS diphoton plot showing where the excess comes from, but there are other excesses nearly as big
Here is the ATLAS version from the talk. Updated from conference notes.
The CMS ZZ->4l clearly rules out the 140 GeV possibility, but has an excess at lower mass.
ATLAS ZZ->4l and full combo from the talk. Updated from conference notes.
ATLAS full combo from the talk. Updated from conference notes.
First talk is over, now over to CMS
CMS have two versions of the WW channel, cut-based and BDT
Here is the first of my unofficial combinations as the discussion time ends. This is the diphoton channels combined for ATLAS+CMS. Remember that this is approximate and you should not try to read the number of sigmas from it. I may revise it later when better versions of the plots become available.
ATLAS have now released 3 new conference notes, so I will update the pictures
I have now digitised the CMS combined plot and produced this signal plot. It gives a clean indication of no Higgs above about 130 GeV and the right size signal for a Higgs at about 125 GeV, but there is still noise at lower mass so there is a chance that it could be moved.
Here is the same thing for the ATLAS data
Here is the fully combined exclusion plot. The signal fits best at 124 GeV and just makes 3-sigma. Remember the official version is likely to be a little different. This is just a quick approximation.
Here is the fully combined signal plot. It looks very convincing but the region below 120 GeV is not resolved yet. Until it is there will be a little room for doubt.
But of course we can clean up the lower region by including LEP and Tevatron too. An official combination with Tevatron data included is also planned.
A zoomed version
Finally here is one last combination for diphoton + ZZ in CMS and ATLAS. These are the high-resolution channels so they give a cleaner signal, but without WW the significance is less.
Conclusions: The result is very convincing if you start from the assumption that there should be a Higgs Boson somewhere in the range. Everywhere is ruled out except 115 GeV to 130 GeV, and within that window there is a signal with the right strength at around 125 GeV with 3 sigma significance. They will have to wait for that to reach 5 sigma to claim discovery and next year’s data should be enough to get there, or nearly so. I calculate that they will need 25/fb per experiment at 7 TeV to make the discovery. A big congratulations to everyone from the LHC, ATLAS and CMS who found the Higgs when it hid in the hardest place.
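The 25/fb figure follows from the usual rule of thumb that, at fixed mass, significance grows like the square root of the integrated luminosity. Taking a per-experiment significance of roughly 2.2 sigma from this year’s 5/fb (my own rough reading of the plots; treat it as an assumption):

```python
l_now = 5.0      # integrated luminosity collected so far, /fb
z_now = 2.2      # assumed per-experiment significance at 125 GeV
z_target = 5.0   # discovery threshold

# significance ~ sqrt(L), so the required luminosity scales as the square
# of the ratio of significances
l_needed = l_now * (z_target / z_now) ** 2   # ~26/fb per experiment
```

A real projection would redo the statistical analysis channel by channel, but the square-root scaling gets you to the same ballpark.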
I was lucky enough to meet Peter Higgs many years ago when I was a postdoc at Edinburgh and I have a big smile knowing that this has been achieved in his lifetime. Congratulations to him and the other physicists involved in discovering the mechanism of symmetry breaking. Finally, in case they are forgotten, well done also to all the phenomenologists who did the calculations to work out how the Higgs Boson could be found, not least John Ellis.
From here there is much more work to do in order to check that the particle seen today has exactly the characteristics of the Higgs, if indeed it is confirmed with more data. That will take many more years of runs at the LHC. It will also be exciting to see how this mass affects our understanding of what other physics could be in reach. I hope there are some Champagne corks popping at CERN this evening. They have had a remarkable year.
13 December: please follow the live blog for up-to-date news
Jester has kindly provided some more refined rumours to give us something to talk about and make the time go quickly while we wait for the Big Event. Here are my comments
“The Standard Model Higgs boson is excluded down to approximately 130 GeV, but not below.”
Very nice but this will be using the WW channel. I don’t fully trust this decay mode for exclusions in the lower energy range because of the poor energy resolution. Previously we have seen both exclusions and excesses near this region. It could mean that there is a non-standard Higgs Boson at 140 GeV that might appear to have lower signal because e.g. it decays to something unknown. It could also just be an effect of the poor WW resolution. I will be looking to see what happens at the 140 GeV point in the combined diphoton and ZZ -> 4l channel without WW to understand this better.
“As already reported widely on blogs, both experiments have an excess of events consistent with the Higgs particle of mass around 125 GeV.”
The interesting thing here is going to be to see how big the excess is when the two experiments are combined. Combining the excess strengths is not just a matter of adding in quadrature. That gives just a crude approximation. I will do a better approximation when I have the data. I am also wondering whether the size of the signal is consistent with a Standard Higgs or bigger. I think it has to be bigger by a factor of two because we only expect 2-sigma significance without the WW channel. I will also look forward to seeing how this shows up on the raw event count plots. Overall a lot of what is seen here will be noise because the sensitivity is still relatively low, but a high sigma combined excess would mean there is probably something.
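For two roughly equally sensitive experiments, the textbook improvement on adding in quadrature is Stouffer’s method: add the z-scores and divide by the square root of the number of experiments. Using the rumoured per-experiment significances as toy inputs (neither method replaces a proper combined likelihood fit, which is why quadrature is only a crude guide):

```python
import math

z_atlas, z_cms = 3.0, 2.0   # rumoured significances, used here as toy inputs

# Crude approximation: add in quadrature
z_quad = math.sqrt(z_atlas**2 + z_cms**2)          # ~3.6 sigma

# Stouffer's method for two equally weighted, independent experiments
z_stouffer = (z_atlas + z_cms) / math.sqrt(2.0)    # ~3.5 sigma
```

Note that Stouffer’s method rewards two excesses of the same sign at the same mass, whereas quadrature would also inflate the result if one experiment saw a deficit; that is one reason the official combination has to work with the full likelihoods.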
“The excess is larger at ATLAS, where it is driven by the H→γγ channel, and supported by 3 events reconstructed in the H→ZZ*→4l channel at that mass. The combined significance is around 3 sigma, the precise number depending on statistical methods used, in particular on how one includes the look-elsewhere-effect.”
How close in energy are these three events? That could be key. In any case we should not expect much contribution from ZZ at 125 GeV yet. The channel is just not sensitive enough with 10/fb and will be mostly weighted out in the combination with diphoton.
“CMS has a smaller excess at 125 GeV, mainly in the H→γγ channel, but their excess in H→4l is oddly shifted to somewhat lower masses of order 119 GeV. All in all, the significance at 125 GeV in CMS is only around 2 sigma.”
No surprise that the CMS ZZ result is inconsistent. There is too much noise in this channel at < 130 GeV to know what is the real signal at this point. At the end of next year it will start to come through. For now it will add just a little contribution to the diphoton channel. 2 sigma is very little but when combined with ATLAS it adds up.
“With some good faith, one could cherish other 2-sigmish bumps in the γγ channel, notably around 140 GeV. Those definitely cannot be the signal of the Standard Model Higgs, but could well be due to Higgs-like particles in various extensions of the Standard Model.”
Indeed, but the big question is whether the 140 GeV bumps previously seen in the ZZ channel are still there. This is now very sensitive at 140 GeV so we should know something. Since there is no rumour about this it might mean that nothing is there and the diphoton bump is just the remainder of the big excess seen there in the summer.
Aside from all that we are interested to see what remains at higher mass, especially around 240 GeV and 600 GeV. Stay tuned.
This is the last week of physics runs for the LHC during 2011 before the winter shutdown. The last few weeks have been occupied with heavy ion physics, which has been going very well. The luminosity lifetime during fills is much less for heavy ions than it is for protons, which means the fills have to be much shorter. On the other hand the lower radiation means that the fills can be reinjected much faster and there are far fewer problems that get in the way. The overall effect is that the heavy ion runs have been much more trouble-free except for a few days lost to cryogenic outages. The luminosity collected will be about 150 to 160 inverse microbarns in each of the three experiments CMS, ATLAS and ALICE. This compares with only about 10 inverse microbarns last year so there should be big improvements in the physics to look forward to when they report over the next few months.
With new physics also about to be reported from the analysis of 5/fb of proton luminosity per experiment gathered in 2011, they are now looking at how much more will be produced during 2012. Readers of viXra Log will know that my estimates for 2011 turned out to be a bit optimistic, so I am relieved that this time Steve Myers himself is sticking his neck out to give some predictions which should be better than mine. These were presented this morning at the LHCC meeting that also included reports from the individual experiments. These have been recorded on video for anyone who is interested.
Myers has based his estimates for 2012 on two scenarios. The first is running with 50 ns bunch spacing as they did this year, using a squeeze with a beta* of 0.7 and a beam energy of 4 TeV, slightly above this year’s 3.5 TeV. The number of bunches, intensity and emittance would be kept at around the levels at the end of this year and cannot be pushed up much further due to injector and intensity limits. These parameters are just a working hypothesis. The final plan will be drawn up at the Chamonix meeting in February as usual. There are 148 days available on the schedule for proton physics runs next year and, based on 2011, a Hübner factor of 0.231 is assumed. This provides a predicted luminosity accumulation as shown in the following plot.
As you can see, this would give them over 16/fb of integrated luminosity, more than three times what was delivered this year, but variations in the Hübner factor mean that this should be regarded as just a crude estimate.
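The headline number can be reproduced from the schedule with nothing more than the definition of the Hübner factor (the fraction of the scheduled time spent delivering peak-equivalent luminosity). The peak luminosity below is my assumption, chosen to reproduce the ~16/fb figure; the real projection integrates a modelled fill cycle rather than a single number:

```python
peak_lumi = 5.4e33   # assumed 2012 peak luminosity, cm^-2 s^-1
days = 148           # scheduled proton physics days
hubner = 0.231       # Hubner factor assumed from 2011 running

seconds = days * 86400
integrated = peak_lumi * seconds * hubner   # delivered luminosity, cm^-2
in_fb = integrated / 1e39                   # 1/fb = 1e39 cm^-2, gives ~16/fb
```

The same three numbers explain why my 2011 estimates went wrong: the Hübner factor is the hardest of the three to guess in advance.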
One potential problem is the high pile-up that this would provide; in other words there would be more collisions per beam crossing than the experiments would like. The number is about 27 on average for this set of parameters. The pile-up makes it very hard for some types of analysis, especially where missing energy is involved, e.g. processes that produce neutrinos or unknown dark matter WIMPs would be harder to see. Several solutions are possible, each with its disadvantages. One solution is to use 25 ns spacing instead of 50 ns. The better spacing would decrease pile-up by a factor of three for just a little less luminosity. Myers worked out a corresponding prediction for luminosity giving about 11/fb for this case. I understand that the experiments would prefer this but the beam operations group are cautious because there are further unknowns that could reduce the luminosity further at 25 ns spacing. It may be harder to reach beta* of 0.7, there may be limits to the total intensity due to heating, and some unknown amount of time would have to be dedicated to scrubbing to remove the e-cloud effect.
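The pile-up figure follows from the interaction rate per bunch crossing: mu = L × sigma_inel / (n_bunches × f_rev). With round numbers (the luminosity and inelastic cross-section here are my own assumptions, picked to illustrate the scale):

```python
lumi = 6.0e33          # assumed instantaneous luminosity, cm^-2 s^-1
sigma_inel = 7.0e-26   # ~70 mb inelastic pp cross-section, approximate
n_bunches = 1380       # colliding bunches at 50 ns spacing
f_rev = 11245.0        # LHC revolution frequency, Hz

# average number of pp interactions per bunch crossing
mu = lumi * sigma_inel / (n_bunches * f_rev)   # ~27
```

Doubling the number of bunches at 25 ns spacing halves mu at the same luminosity, and the lower per-bunch intensity that goes with it reduces it further, which is where the factor of three comes from.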
Myers proposed that instead of using 25 ns spacing the experiments might care to accept some luminosity leveling during the start of fills to reduce peak luminosity and pile-up. He also had a suggestion to use bunches of different intensities to get some lower pile-up bunch crossings at the same time. This turned out to be an unofficial, unapproved suggestion that reached Myers from an individual in one of the experiments, according to the spokesperson. I will make the even more unofficial suggestion that they aim for long runs and use the lower end-of-run luminosity for the experiments that suffer from pile-up.
The maximum possible luminosity will be needed next year to get conclusive statistics for the Higgs Boson if it is confined to a low mass region such as 125 GeV. Only the diphoton and 4 lepton channels have the resolution and sensitivity to see the Higgs there, and they will need the full amount of luminosity predicted by Myers. These observations are less affected by pile-up because the rare high-energy photons and leptons are directly detected as they emerge from the collision chaos. I suspect that this case will win the day and they will run at 50 ns rather than the more risky 25 ns with lower luminosity.
Myers also went on to look ahead to 2015, after the long shutdown when the LHC will return with repairs that will allow it to get nearer its design energy. The scenario considered is an energy of 6.5 TeV per beam, beta* of 0.5 and 50 ns spacing. The full energy of 7 TeV per beam will not be available until later because early experience with the magnets showed that they are not yet ready for training to maximum magnetic field. With 50 ns at 6.5 TeV the pile-up reaches 50, which could be a problem. With the increased energy the priority will return to looking for dark matter candidates that might show up as missing energy (SUSY or otherwise). The 25 ns option will also be on the table, offering a more acceptable pile-up of 17 with an integrated luminosity of 22/fb during the year, subject to the unknowns. My guess is that the 25 ns option will be favoured, but this depends on physics results that may be seen during 2012.
A further meeting at Evian is scheduled for next week where the beam operation groups will go over the technicalities in much more detail. We, however, will be distracted by news of the Higgs. Meanwhile, well done to the operation groups for a very successful year.
This is just an admin note for submissions of e-prints to the viXra archive. I am happy to say that there is now a system of web-based submission forms that authors can use to send us their articles for upload.
This system should free up the administrators (mainly me) to give us more time for other things. It will also mean fewer errors and more control for submitters. We will still honour submissions and other requests sent by email for a little while, but please expect form submissions to be dealt with usually within 24 hours, while e-mail submissions could take a week or longer.
The rumours tell us that next week ATLAS and CMS will announce a strong but inconclusive signal for the Higgs boson at about 125 GeV. This may be wrong and even if it is right there may be other candidate signals to think about, and it will take much more data to verify that the signal is indeed correct for the Higgs, but if it is right, what then are the implications of the Higgs at this mass?
This question will be the subject of much discussion in the coming months and I can only touch on it here. Certainly the central topic of the debate will be the stability of the vacuum and whether it implies new physics, and if so, at what scale?
It has been known for about twenty years that for a low Higgs mass relative to the top quark mass, the quartic Higgs self-coupling runs at high energy towards lower values. At some point it would turn negative indicating that the vacuum is unstable. In other words the universe could in theory spontaneously explode at some point releasing huge amounts of energy as it fell into a more stable lower energy vacuum state. This catastrophe would spread across the universe at the speed of light in an unstoppable wave of heat that would destroy everything in its path. Happily the universe has survived a very long time without such mishaps so this can’t be part of reality, or can it?
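Schematically, the effect comes from the renormalisation group running of the Higgs quartic self-coupling. Keeping only the dominant one-loop terms (and dropping the smaller gauge contributions), the running is roughly

```latex
16\pi^2 \frac{d\lambda}{d\ln\mu} \;\approx\; 24\lambda^2 + 12\lambda y_t^2 - 6 y_t^4
```

For a light Higgs, λ is small at low energies, so the negative top Yukawa term −6y_t⁴ dominates and drives λ negative at some high scale, which is exactly the instability described above. A heavier Higgs means a larger λ, letting the positive 24λ² term keep the coupling positive all the way up.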
As it turns out, a Higgs mass of 125 GeV is quite a borderline case. The situation was analysed, taking into account the best recent values for the top mass and weak coupling constants, by Ellis et al in 2009. Here is their most relevant graphic with a line running across at 125 GeV (plus or minus 1 GeV) added by me. The horizontal axis tells us the energy at which the coupling constant goes negative. The yellow band indicates the limit for vacuum stability. Because of uncertainty in the top mass and the weak coupling, and also due to some theoretical unknowns, the exact point at which this limit is reached is not precisely known. The yellow band covers the range of possibilities.
The second plot, taken from Quiros, shows the scale of instability as a function of the top and Higgs masses. I have added a green spot where we now seem to live.
At 126 GeV the vacuum might remain stable up to Planck energies (see e.g. Shaposhnikov and Wetterich). If this is the case then there is nothing to worry about, but depending on the precise values of the standard model parameters, instability could also set in at energies around a million TeV. This is well above anything we can explore at the LHC but such energies are found in the more extreme parts of the universe and nothing bad has happened. The most likely explanation would be that some new unknown physics changes the running of the coupling to avert it from going negative. Examples of something that could do this include the existence of a Higgsino or a stop as predicted by supersymmetry, but there are other possibilities.
It is also possible that some amount of vacuum instability could really be present. If there is meta-stability the vacuum could remain in its normal state. There would be the possibility of disaster at any moment but the half-life for the decay of the vacuum would have to be more than about the 13 billion years that it has survived so far. In the plot above the blue band indicates the region where a more immediately unstable vacuum is reached. It is unlikely that this case is realised in nature.
As the plot shows, if the mass of the Higgs turns out to be 120 GeV despite present rumours to the contrary, then the stability problem would be a big deal. This would be a big boost for SUSY models that stabilize the vacuum and mostly prefer the light Higgs mass. If on the other hand the Higgs mass were found at 130 GeV or more, then the stability problem would be no issue. 125 GeV leaves us in the uncertain region where more research and better measurements of the top mass will be required. It will still encourage the SUSY theorists, as work such as that of Kane shows, but the door will still be open to a range of possibilities.
There are other things apart from the stability of the vacuum that theorists will look at. What is the nature of the electro-weak phase transition implied by this Higgs mass? Can it play some role in inflation or other phenomenology of the early universe? How does the result fit with electro-weak precision measurements, and what else would be required to reconcile theory with experiment in such tests, especially the muon magnetic anomaly? 2011 has been a great year for the experimentalists but next year the theorists will also have a lot of work to do.
A rumour that reached our comment section suggests that a signal for the Higgs boson has been seen at 125 GeV with 2-3 sigma significance. This would be a great result if confirmed because at this mass the standard model has problems with vacuum stability that are likely to require supersymmetry or something similar to stabilize. If on the other hand the Higgs were at 140 GeV we would be left with a simple but unsatisfying model that could exist without modification up to energies well beyond anything we can explore in man-made experiments. In other words we might never be able to detect anything new. A Higgs that is just 15 GeV lighter is a different story altogether, so theorists will be very happy if that is the answer.
A statement by the CERN DG circulated to staff says that the results to be released on 13th December will be inconclusive. This means that a 5 sigma signal is not yet available. A Higgs signal at 140 GeV would probably be conclusive, at least once the 10/fb of data were combined, but of course the combination has not yet been done. In other words, the inconclusiveness is consistent with the lighter mass, though not conclusively so. Another rumour says that the signal is only seen in the diphoton final state for both experiments. This again suggests the lighter mass, because anything in the range 130 GeV to 150 GeV would show up strongly in the ZZ to 4 lepton channel, but 125 GeV won't. Oddly enough the diphoton channels up to 3/fb combined showed no excess at 125 GeV, but events at this mass would be very rare and if there is a signal it will be just a few events in the 10/fb sample.
It has to be said that the best way to foil rumours is to spread false rumours, but the consistency of the rumours we have suggests that they are genuine. The only thing I know of that counts against them is that the Tevatron search in the bb mode shows no signal at 125 GeV, where they have good reach. That could just be bad luck. Even so, it will be interesting to see the whole plots, which might hold other potential signals at higher energy. A Higgs at 125 GeV may well be accompanied by other heavier Higgs states that show only a partial signal, either because they can decay into the lighter Higgs or because they have odd CP (rather than the even CP of the standard model Higgs).
With an inconclusive signal, the combination of results from ATLAS and CMS is all important. Approximate combinations of the type I have been doing will be good indicators, but only the carefully prepared official combination can lead to a definitive result. Last month I looked at best fits to the combined summer data and found the 140 GeV signal to be best. If I do a fit to a two-Higgs model I get a second one at 128 GeV. The lighter peak at 119 GeV favoured by Italian bloggers has error bars too big to grab the second place. It is going to be very interesting to repeat this with the 10/fb of data and see if both of these signals survive the fit.
One last thing worth mentioning is that the gfitter calculations have been estimating 125 GeV as the best fit for the Higgs mass for some time. Well done to them. This would mean that, if it is confirmed at that mass, no further physics is required at this energy to account for the precision tests. On the other hand, the gfitter calculations also fit doublet models well to the data, so other physics is not ruled out either.
Another piece of good news is that the results meeting on 13th December will be webcast. Unless they enlist a heavy-duty streaming service, their normal webcast channels will certainly be overwhelmed by public interest in this event, which has already been reported by the BBC, Telegraph, Guardian, and many others.
Update 3-Dec-2011: Some clarification at NEW is that ATLAS has a 3 sigma excess at 126 GeV while CMS has a smaller excess, perhaps 2 sigma, at the same mass, both in diphoton channels. These are close enough to combine, giving about 3.5 sigma. That would be enough to claim an “observation” but is well short of a “discovery”. There will be interest in whether other channels such as ZZ or WW add anything to the result. By the end of 2012 they will have up to four times the data, which is enough to double the significance if the signal holds up. (I am assuming that the results to be shown on the 13th will use the full 5/fb collected this year. It could be less.)
Update: The latest version of the rumour at NEW gives 3.5 sigma in ATLAS and 2.5 sigma in CMS which amounts to about 4.3 sigma combined for the 10/fb. This is about right for the expected significance at this mass.
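As a rough rule of thumb, independent significances combine in quadrature, and significance grows like the square root of the accumulated luminosity. A quick back-of-envelope sketch of both rules (these are crude approximations, not the official combination procedure, and the function names are my own):

```python
import math

def combine_sigmas(*sigmas):
    """Combine independent significances in quadrature (rough rule of thumb)."""
    return math.sqrt(sum(s * s for s in sigmas))

def scaled_sigma(sigma, lumi_factor):
    """Significance grows roughly like the square root of the luminosity."""
    return sigma * math.sqrt(lumi_factor)

# ATLAS 3.5 sigma and CMS 2.5 sigma combine to roughly 4.3 sigma
print(round(combine_sigmas(3.5, 2.5), 1))  # 4.3
# Four times the data would roughly double a 3.5 sigma signal
print(round(scaled_sigma(3.5, 4.0), 1))  # 7.0
```

The same arithmetic reproduces the earlier numbers: 3 sigma and 2 sigma combine to about 3.6 sigma, consistent with the 3.5 sigma quoted above.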
Tommaso’s post at QDS is worth reading but we will need to wait until the official announcement for his analysis because he knows too much.
Update 4-Dec-2011: Nature blog has an interesting snippet of news about the rumours including a comment from Bill “Nonsense” Murray who says that ATLAS collaboration approvals will (hopefully) be finalized at a meeting on 7th December, followed by management approvals.
Update 8-Dec-2011: The BBC has run another story including an interview with John Ellis and a quote from Sergio Bertolucci that “I think we may get indications that are not consistent with its non-existence.” As director of research at CERN Bertolucci is likely to be one of a very small number of people who have officially seen both sets of results from CMS and ATLAS.
On 12th December the CERN council will meet and announce the latest news about the search for the Higgs boson to its member states. This will be done in closed meetings, but the next day the spokespersons for CMS and ATLAS will each deliver a 30-minute talk in public, followed by a one-hour discussion period. Hopefully this indicates that some meaningful result has been obtained and they will be able to tell us what the Higgs mass is, or that it does not exist in its Standard Model form.
Unless they do their own approximate combinations I will be doing them myself here. This means I will have to digitise all the points in the CMS and ATLAS plots and run them through my spreadsheet. However, the real interest may come from the diphoton and ZZ channels, so I will have to digitise another four plots and combine those too. I am going to be very busy, but I will aim to have it all done before the end of the discussion period, unless they find some way of foiling my evil plot, such as by not posting the plots online until later.
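For anyone curious what an approximate combination of this kind involves: at each mass point, the best-fit signal strengths read off the two digitised plots can be merged with an inverse-variance weighted average, assuming roughly Gaussian and independent errors. A minimal sketch with hypothetical numbers (the function name and readings are illustrative, not the official procedure):

```python
def combine_signal_strength(points):
    """Inverse-variance weighted average of signal-strength measurements.

    points: list of (mu, sigma) pairs read off the digitised plots,
    one per experiment, at a single mass point.
    """
    weights = [1.0 / (s * s) for _, s in points]
    mu = sum(w * m for w, (m, _) in zip(weights, points)) / sum(weights)
    err = (1.0 / sum(weights)) ** 0.5
    return mu, err

# Hypothetical readings at one mass: ATLAS mu = 1.2 +/- 0.5, CMS mu = 0.8 +/- 0.6
mu, err = combine_signal_strength([(1.2, 0.5), (0.8, 0.6)])
```

Repeating this across the scanned mass range gives an approximate combined curve; the official combination handles correlated systematics properly and can differ noticeably.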