Octonions in String Theory

April 29, 2011

John Baez and his student John Huerta have an article in Scientific American this month about octonions in string theory and M-theory. Peter Woit has given it a bit of a cynical review, describing it as hype. The defence from John in the comments is worth reading. Here is a bit of what he says:

“So, don’t try to make it sound like an obscure yawn-inducing technicality about “some supersymmetry algebra working out a certain way in a certain dimension”. It’s a shocking and bizarre fact, which hits you in the face as soon as you start trying to learn about superstrings. It’s a fact that I’d been curious about for years. So when John Huerta finally made it really clear, it seemed worth explaining — in detail in some math papers, and in a popularized way in Scientific American.”

The article, entitled “The Strangest Numbers in String Theory”, is about an early observation from the study of superstring theories that the four division algebras are related to four classical formulations of superstring theory in 3, 4, 6 and 10 dimensions. The four division algebras are the reals, complex numbers, quaternions and octonions, with dimensions 1, 2, 4 and 8, and the dimensions of the superstring theories are 2 higher in each case.
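Written as a formula, the pattern is simply that the spacetime dimension of each classical superstring is two more than the real dimension of the corresponding division algebra:

$$
D = \dim_{\mathbb{R}} \mathbb{K} + 2, \qquad \mathbb{K} \in \{\mathbb{R},\, \mathbb{C},\, \mathbb{H},\, \mathbb{O}\}, \qquad D \in \{3,\, 4,\, 6,\, 10\}.
$$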

Many of our readers will be familiar with the internet writings of John Baez, where he has described these things, so I won’t attempt to cover any details. His student John Huerta has just finished his thesis, in which he clarifies these observations using higher Lie algebras. The results extend to one dimension higher when strings are replaced by membranes. In the quantum theory only the highest dimensional versions, related to the octonions, hold up consistently, giving us the 10 dimensional superstring theories and M-theory in 11 dimensions. Of course this is not complete, since we still don’t know what the full formulation of M-theory is. Even these higher dimensional observations are not new; see for example Mike Duff’s brane scan from 1987, where the relationships were already plotted out. The new work clarifies these results using the concept of Lie 3-algebras from n-category theory.

The Scientific American version does not go into great detail but is a very well written introduction to the ideas. If you don’t have access to it, don’t worry: John says he will be allowed to post an online version after a month or so. You can also explore what has been posted already, starting here, which is more advanced than the article but still very pedagogical.

Personally I find these algebraic ideas for M-theory very enticing. It is a major goal to formulate a complete non-perturbative version of string theory that encompasses all its forms and I think the purely algebraic approach is the best line of attack. It is especially intriguing that the octonions have such a direct relationship to the dimensions in which these theories work, but ultimately the algebraic structures we need to understand it fully are probably much more complex.

The work of Mike Duff and his collaborators which brings in the algebra of hyperdeterminants and qubits to understand a slightly different role of the octonions in string theory is one of the areas to follow. This work brings in the duality algebras found in string theory black holes. I know that several of our regular commenters are very familiar with this already so I need not give more details. Indeed it is the fact that the same algebraic structures keep appearing in different contexts that is so intriguing, yet so confusing, as if we are missing some principle of unification that relates these things.


LHC Luminosity Estimates for 2011 (and poll)

April 28, 2011

Last night the Large Hadron Collider clocked up another record luminosity using 624 bunches per beam. In ATLAS the peak luminosity was measured at 737/μb/s, while CMS saw about 10% less at 660/μb/s. In theory they should be seeing about the same amount. Paul Collier (head of the beams department at CERN) has said that they suspect the ATLAS figure is too high. A more accurate measurement will be made after the next technical stop in two weeks’ time, when a Van der Meer scan will be run. Meanwhile I will quote the lower CMS numbers and will be updating the figure in the viXra Log banner above.

An important date coming up in the LHC calendar is the 6th June when the “Physics at LHC 2011” conference starts. The experiment teams will be keen to show results using as much data as possible collected before that date. That will be challenging given the rate at which data will then be accumulating and the need to get about 3000 physicists in each collaboration to sign off on any results before they are presented.

A few weeks ago I said the LHC should be able to deliver 200/pb before the conference. In fact they have already surpassed that figure for 2011 and can add last year’s 40/pb as well to make nearly 240/pb. I will also be tracking this number above. Two weeks ago I upped the estimate to 500/pb for the conference, but even that is looking a little unambitious now. According to the plan there are about 30 days left for physics before the 6th June and they can now easily collect about 30/pb each day. My new estimate is at least 1000/pb, which is 1/fb, in time for the conference, assuming they don’t lose too many days to machine development or faults. Recall that 1/fb is the official target for the whole of 2011! It may sound like their original expectations were a bit low, but really this amazing result is due to the exceptional early performance of the collider when compared with previous models.

The 6th June also marks the point when commissioning of the LHC for this year should officially stop. If all goes well they will be aiming to reach this year’s maximum luminosity by that date, which will mean filling with 1404 bunches per beam to provide something like 70/pb per day. From 6th June there are 124 days marked on the plan for running proton physics. Do the sums and you will get my new estimate for this year’s total of 10/fb, easily enough to find the Higgs boson whatever mass it has, or rule it out.
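For anyone who wants to redo the sums, here is a back-of-envelope sketch of the arithmetic using the round numbers quoted above (my own rough assumptions, not official figures):

```python
# Back-of-envelope sums behind the estimates above, using the round
# numbers quoted in the post (rough assumptions, not official figures).

delivered_so_far = 240.0   # /pb already delivered (2010 plus 2011 to date)
days_before_conf = 30      # physics days left in the plan before 6th June
rate_now = 30.0            # /pb per day at the current ~624 bunches

before_conference = delivered_so_far + days_before_conf * rate_now
print(f"by 6th June: ~{before_conference:.0f}/pb = {before_conference / 1000:.1f}/fb")

days_after_conf = 124      # proton physics days on the plan after 6th June
rate_later = 70.0          # /pb per day once filling with ~1404 bunches

year_total = before_conference + days_after_conf * rate_later
print(f"2011 total:  ~{year_total / 1000:.0f}/fb")
```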

Update: If you don’t like my estimate of 10/fb, here is your chance to say how much luminosity you think the LHC will deliver this year.


The future of science news reporting in the MSM

April 26, 2011

The Main Stream Media have increasing trouble reporting science news because journalists are rarely sufficiently expert in specialized fields and just make mistakes. Newsy.com (who are not really MSM) has found the solution. Just quote lots of snippets from science blogs and with next to no effort they have an incredibly well-balanced report!


LHC takes hadron collider luminosity record

April 22, 2011

Until yesterday the highest luminosity measured in a hadron collider was 402.4/μb/s, achieved at the Tevatron last year. Last night the Large Hadron Collider smashed this record with a luminosity of 481/μb/s using a 480 bunch filling scheme. This takes them nearly halfway to this year’s target of 1/nb/s. In fact they are likely to reach that level in the next few weeks and can go well beyond if they stick to the 50ns bunch spacing.

The superior energy of the LHC over the Tevatron means that it now has every advantage. Strong hints of new physics from about 70/pb of collision data are already showing up, yet CERN could collect 200 times that amount before the end of 2011. This could be a good year to be doing particle physics! (The data collected in 2011 now amounts to 90/pb in each of ATLAS and CMS to add to the 40/pb from 2010)

CERN’s new record follows a visit on Wednesday by José Manuel Barroso, president of the European Commission, who said “CERN is a true European centre of excellence: a place where our collective talent is pooled and produces cutting-edge research, European in its foundations but global in its outlook.”

Update 23-Apr-2011: The latest run collected a record 15/pb taking this year’s total to over 100/pb. Until at least Monday they plan to stay with 480 bunches. At this level they should collect 20/pb per day. With about 150 days of proton physics left this year that would already provide 3/fb. However it will probably not be long before they continue to push up the luminosity. Another factor of four should be possible if they can go all the way to 1404 bunches and use nominal intensities.

Update 27-Apr-2011: Now they are filling with 624 bunches. This could take luminosities up to 750/μb/s.


New New Particle Rumour

April 22, 2011

There is a new rumour about a new particle doing the rounds of the physics blogs. So far it is reported at Not Even Wrong, Resonaances and The Reference Frame.

I do not claim to be the best source of details about new phenomenology, but sometimes my intuition is good, so here are a few points.

  • This is not a hoax. It was revealed in an internal memo that was apparently more public than it should have been.
  • The memo described a bump in the diphoton channel at 115 GeV.
  • This is where SUSY prefers the Higgs, but said Higgs does not give such a high rate of diphoton decays.
  • As seen recently bumps can be made to disappear by adjusting the analysis, and this analysis is not completed and approved so it could go away.
  • However, the signal seems to be consistent with the old LEP observation and with faint hints in Tevatron results, so this suggests something real.
  • If it is real then CMS must see it too. Therefore Tommaso Dorigo must remain silent.
  • Although the old LEP result was always considered a hint of a Higgs and this seems likely to be the same thing, we should be open-minded about what it really is.
  • On the one hand I want it to be SUSY and this is possible. We expect to hear about variants of SUSY that fit the bill at the next arXiv hep-ph update.
  • On the other hand, a high photon coupling suggests a neutral composite particle made of charged constituents, like a meson. I don’t know if this is plausible but someone will be suggesting such possibilities soon.
  • Whatever it is, if it is real it is almost certainly Beyond Standard Model and therefore very exciting.
  • The data collected by ATLAS is already quite a bit more than the amount used in this analysis and it will increase by a large factor in the coming weeks. There will be more news soon.

Update: Well I was wrong about Tommaso remaining silent. He is sceptical because of comparison with Tevatron and also because of look-elsewhere effect, but the Tevatron can just about accommodate this and there is no look-elsewhere effect if you were specifically looking for a state at 115 GeV 🙂

Update 23-Apr-2011: Tommaso is now willing to offer $1000 against your $500 that this is not a genuine observation. If you are tempted to take him on you should consider that he may have access to data from CMS that would decisively resolve the question 🙂

Update 25-Apr-2011: Channel 4 News UK picked up this story and included a well-balanced interview with John Butterworth.



LHC back to physics runs with 50ns bunch spacing

April 15, 2011

The Large Hadron Collider today returned to physics runs. Over the last few days they had been filling the collider for scrubbing runs in an effort to remove the e-cloud problem. In fact they only did a few hours of scrubbing runs before surprisingly returning to physics with 50ns bunch spacing.

On the face of it this is great news because it suggests that they have concluded that the problems are not too severe and that physics with 50ns spacing will be the norm for this year. However I have heard no official word, so I can’t confirm that they won’t change back to 75ns spacing.

With the 50ns spacing they will be able to get 1400 bunches into the collider. This is half the design limit of the machine, which may only be achievable when they move to the higher energy of 14 TeV in 2014. During the scrubbing runs they were already able to circulate 1020 bunches, so perhaps it will not be long before they reach similar numbers with colliding beams. It is very impressive that they have reached this point so quickly.

For today they have been running with 228 bunches, providing a luminosity of 250/μb/s, just below the records of a few weeks ago. They need three runs and 20 hours of stable beams before they can go to the next step at about 300 bunches. Just one more run of about 5 hours should do it. Today’s runs have already added about 10/pb to the total integrated luminosity delivered to each of ATLAS and CMS.

Update 16-Apr-2011: The next step is set as 336 bunches and still with 50ns spacing. Expected peak luminosity is about 360/μb/s which will be a new record, fingers crossed. Total luminosity delivered for 2011 has passed the 40/pb delivered in 2010.

Update 17-Apr-2011: After a few false starts a good long run with 336 bunches is now in progress. It started with a peak luminosity of 380/μb/s in ATLAS and 360/μb/s for CMS. After 9 hours the integrated luminosity has passed 10/pb, the highest yet for a single run. This may be a good moment to speculate about how much data they can collect before the “Physics at LHC” conference to be held in Italy from 6-11 June. The answer depends on many things, but I don’t think 0.5/fb is overly optimistic at this stage, assuming they proceed with continuous physics runs and increasing luminosity according to the plan, which has six weeks of running before the conference. To give some meaning to that amount of data, here are the expected Higgs limits for ATLAS.

0.5/fb would exclude the Higgs between 140 GeV and 180 GeV, unless it is there. Of course even if they do collect 0.5/fb before the conference they would have to do the analysis very quickly to be able to report it in time, and I don’t expect that to be possible. Other results are more likely to be shown and the Higgs limit should just be taken as an indication of how much progress has been made when these landmarks are reached.


Spring Cleaning and a Scary Moment for the LHC

April 9, 2011

Spring is very much in the air here in Europe, with warm temperatures and sunny skies over most areas. As many of us spring clean our homes, the Large Hadron Collider is also undergoing a thorough scrubbing of its pipes to make it ready for its first full-scale physics runs. For the collider this does not involve teams of cleaning ladies with dusters and scrubbing brushes. The pipes are scoured clean by sending high intensity beams through them for a period of several days.

Before the scrubbing could begin the LHC had to recover from a scheduled technical stop at the start of the month. This process can sometimes seem painfully slow to us outsiders watching the progress, but on Wednesday we were given a stark reminder of why they have to be so cautious. During the build-up to the scrubbing runs, as they attempted to inject a record 600 bunches to produce the high intensity needed, a signal was generated that triggered an immediate quench of the magnets in one sector, sending high currents through the high-tension lines. As far as I know it was the first time such an unplanned quench had happened since the disastrous events of 2008, when the splices proved unable to take the high load, leading to an explosive release of helium that shut down the collider for over a year. This time round they had the added complication of high intensity beams in the rings which had to be dumped safely. Failure to do so could have sent trains of proton bunches into sensitive areas of the collider, causing damage that would have meant another long shutdown.

The problem was caused by a faulty temperature sensor where some cables had been plugged in the wrong way. It was lucky that there were two identical systems functioning and the problem only existed with one of them. This made it possible to detect the fault, and fortunately the protection systems all worked as designed and no harm was done. It must have been a heart-stopping moment for the operators, who reported that without the double protection system they could have seen “beyond repair damage”.

The collider was stopped for more than a day while they checked all similar connections to ensure that no further faults were still present. The scrubbing runs have now restarted.

These runs were planned at the end of last year after they first reached high intensity with the proton beam runs. They noticed excess background events in the ATLAS detector which were found to be caused by a build-up of an electron cloud in the pipes where the beams approached the detectors. The electrons are released from residual gas molecules in the walls of the pipes. These electrons then strike back on the surface releasing a cascade of further electrons. The circulating proton beams collide with the electrons sending particles towards the detectors that have to be filtered out of the analysis. Too much background of this sort can hide the signals they are looking for.

One step taken to resolve the problem is the installation of solenoids that produce magnetic fields around the affected areas, steering the electrons away from the pipe walls to avoid the cascades. But in addition they scheduled these scrubbing runs, in which high intensity proton beams are circulated at 450 GeV to tear the residual gas molecules out of the walls of the pipes. They can then be pumped out using the vacuum systems.

The process will last about a week and could see as many as 1200 bunches being circulated. The protons will not be ramped up to the full energy of 3.5 TeV at which collisions have been made but the runs will also allow the operators to see how well the injection systems work to produce the beam intensities that could be used in physics runs later this year.

The extent to which these scrubbing runs are successful will determine how much collision data can be collected this year. If they manage to remove most of the electron clouds they will progress to running with the 50ns bunch spacing, which will make it possible to inject up to 1400 bunches for the physics runs. Otherwise they will stick with the 75ns spacing currently in use for physics, but that would limit the number of bunches to about 900 and would mean correspondingly lower luminosities for the runs this year.

Update 11-Apr-2011: For the scrubbing runs they have injected 1020 bunches per beam. This is near to the maximum intensity that they can get with 50ns spacing. Although they are not ramping up to 3.5 TeV or colliding particles at this time, it is good to see that this step can already be done successfully.


New Particle?

April 6, 2011

Other people do the phenomenology stuff much better than me so I don’t try to compete. See e.g. here, here and here, and now it’s also here, here, here, here, here, here and finally here. However, sometimes I like to see my plots without those fitted lines that lead the eye.

I can always add my own alternative bumps


Or maybe…


Hidden Variables and the 24 Rays of Peres

April 4, 2011

A lot of people like to worry about the measurement problem and wave function collapse in quantum mechanics. How can a physical outcome depend in such a fundamental way on how we observe it? Many of us have been happy to accept that quantum mechanics works as described and that no real paradoxes arise from its interpretation, but ever since Einstein challenged Bohr’s Copenhagen interpretation, a minority of physicists from Bohm to ‘t Hooft have tried to find a hidden variable explanation that avoids the philosophical problems. Even if you don’t worry about such things, the maths and physics behind such ideas can be quite interesting. Last week I came across the paper arXiv:1103.6058 by Waegell and Aravind that relates a proof of the Kochen-Specker theorem to the 24-cell. The Kochen-Specker theorem is a no-go result for hidden variable theories and the 24-cell is a unique mathematical structure that comes up in the context of systems of qubits, as I discussed just recently.

Any hidden variable theory must avoid certain no-go theorems of this type. The most well known is Bell’s inequality, which follows from the assumption of locality and has been shown to be violated in experiments. This is consistent with quantum mechanics but rules out local hidden variable theories. It is quite a strong refutation of Einstein’s philosophical stance against quantum mechanics, because he had a strong belief in locality that followed from his work on relativity, where he established the principle that no signal can be sent faster than light. It turns out that quantum mechanics does not violate this principle in the practical sense, yet it is formulated in such a way that wave-function collapse describes an apparent non-local effect. To some physicists, including Einstein, this seemed philosophically unsatisfactory. Too bad! Sometimes the universe does not respect our philosophical preferences, and this is one of those times. The experimental verification of the violation of Bell’s inequality shows that we have to accept non-locality in some form.

But non-locality is not the only philosophical objection that physicists have. The bigger problem is that the rules of wave-function collapse depend on what is measured. This seems to give observers a special role in the laws of physics, but there is no good way to define what an observer is from first principles. In a hidden variable theory we would postulate the existence of state variables with definite values that cannot be easily seen but which determine the outcome of quantum mechanical measurements. So could there be a hidden variable theory of quantum mechanics which is non-local but where no variable depends on the context of the measurement? The Kochen-Specker theorem, proved by Simon B. Kochen and Ernst Specker in 1967, rules out such a theory, and it does so in a very interesting way.

The original proof was quite complex, but in 1991 quantum information expert Asher Peres gave a simpler proof. Although he did not mention it at the time, his proof relies on the symmetry of the root system of the exceptional Lie algebra F4. This comprises 48 vectors in 4D space which can be interpreted as the vertices of a 24-cell and its dual. You don’t need to know anything about root systems or Lie groups or the 24-cell to understand the proof, so don’t be put off.

Each root vector is paired with its negative to define a line through the origin in 4D space. These 24 lines are the 24 rays of Peres. I listed these points in my previous post on the 24-cell, but here again are the 24 rays, numbered as in the new paper so that I can refer to them.

1 (2,0,0,0) 2 (0,2,0,0) 3 (0,0,2,0) 4 (0,0,0,2)
5 (1,1,1,1) 6 (1,1,-1,-1) 7 (1,-1,1,-1) 8 (1,-1,-1,1)
9 (-1,1,1,1) 10 (1,-1,1,1) 11 (1,1,-1,1) 12 (1,1,1,-1)
13 (1,1,0,0) 14 (1,-1,0,0) 15 (0,0,1,1) 16 (0,0,1,-1)
17 (0,1,0,1) 18 (0,1,0,-1) 19 (1,0,1,0) 20 (1,0,-1,0)
21 (1,0,0,-1) 22 (1,0,0,1) 23 (0,1,-1,0) 24 (0,1,1,0)

Suppose these (when normalised) are 24 quantum states |ψi> in a 4-dimensional Hilbert space, e.g. it might be a system of two qubits. For each state we can define a projection operator

Pi = |ψi><ψi|

These are Hermitian operators with three eigenvalues of 0 and one of 1. They can be considered as observables and we could set up an experimental system where we prepare states and measure these observables to check that they comply with the rules of quantum mechanics. One thing that we observe is that there are sets of 4 operators which commute because the 4 rays they are based on are mutually orthogonal. An example would be the four operators P1, P2, P3, P4.
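To make this concrete, here is a small numerical sketch (my own, not from the paper) that builds the projectors for the first four rays and checks these properties:

```python
import numpy as np

# First four Peres rays (mutually orthogonal), normalised to unit vectors.
rays = [np.array(v, dtype=float) for v in
        [(2, 0, 0, 0), (0, 2, 0, 0), (0, 0, 2, 0), (0, 0, 0, 2)]]
psis = [v / np.linalg.norm(v) for v in rays]

# Projection operators P_i = |psi_i><psi_i|.
P = [np.outer(psi, psi) for psi in psis]

# Each projector is Hermitian with eigenvalues 0, 0, 0, 1.
for Pi in P:
    assert np.allclose(Pi, Pi.T)
    assert np.allclose(np.linalg.eigvalsh(Pi), [0, 0, 0, 1])

# Projectors onto mutually orthogonal rays commute, and these four
# resolve the identity, so measuring all of them gives 1,0,0,0 in some order.
for i in range(4):
    for j in range(4):
        assert np.allclose(P[i] @ P[j], P[j] @ P[i])
assert np.allclose(sum(P), np.eye(4))
print("P1..P4: Hermitian, spectrum {0,0,0,1}, mutually commuting, sum to identity")
```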

We know from the theory of quantum mechanics that if we measure these observables in any order we will end up with a state which is a common eigenvector, i.e. one of the first four rays. The values of the observables will always be given by 1,0,0,0 in some order. This can be checked experimentally. There are actually 24 sets of 4 different rays that are mutually orthogonal, but we just need nine of them as follows:

{P2, P4, P19, P20}
{P10, P11, P21, P24}
{P7, P8, P13, P15}
{P2, P3, P21, P22}
{P6, P8, P17, P19}
{P11, P12, P14, P15}
{P6, P7, P22, P24}
{P3, P4, P13, P14}
{P10, P12, P17, P20}

At this point you need to check two things: first, that each of these sets of 4 observables is mutually commuting because the rays are orthogonal; second, that there are 18 observables, each of which appears in exactly two of the sets.

Imagine now that there is some hidden variable theory that explains this system and which reproduces all the predictions of quantum mechanics. At any given moment the system would be in a definite state and the values of each of the 18 operators would be determinate, even if it is hard for us to see what they are directly. The values must be 0 or 1, but they must also comply with the rule that exactly one observable in each of the nine sets is equal to 1. The other three values in each set will be 0. So there must be nine values set to 1 overall. But this is impossible! Each observable appears in exactly two sets, so whichever observables take the value 1, the total number of ones counted over all nine sets will always be even, and nine is not even. This proves the Kochen-Specker theorem. (This version of the proof, using only 18 of the 24 vectors, is a later refinement due to Kernaghan, Cabello, Estebaranz, and Garcia-Alcaine; see the paper linked above for references.)
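If you don’t want to do the orthogonality and double-counting checks by hand, here is a short script (again my own sketch, not from the paper) that verifies both conditions and spells out the parity obstruction:

```python
import numpy as np
from itertools import combinations
from collections import Counter

# The 24 Peres rays (unnormalised), numbered 1-24 as in the post.
rays = {
     1: (2, 0, 0, 0),   2: (0, 2, 0, 0),    3: (0, 0, 2, 0),    4: (0, 0, 0, 2),
     5: (1, 1, 1, 1),   6: (1, 1, -1, -1),  7: (1, -1, 1, -1),  8: (1, -1, -1, 1),
     9: (-1, 1, 1, 1), 10: (1, -1, 1, 1),  11: (1, 1, -1, 1),  12: (1, 1, 1, -1),
    13: (1, 1, 0, 0),  14: (1, -1, 0, 0),  15: (0, 0, 1, 1),   16: (0, 0, 1, -1),
    17: (0, 1, 0, 1),  18: (0, 1, 0, -1),  19: (1, 0, 1, 0),   20: (1, 0, -1, 0),
    21: (1, 0, 0, -1), 22: (1, 0, 0, 1),   23: (0, 1, -1, 0),  24: (0, 1, 1, 0),
}

def orthogonal(a, b):
    return np.dot(rays[a], rays[b]) == 0

# All sets of 4 mutually orthogonal rays among the 24 (there are 24 of them).
tetrads = {q for q in combinations(rays, 4)
           if all(orthogonal(a, b) for a, b in combinations(q, 2))}
print(len(tetrads), "mutually orthogonal sets of 4 rays")

# The nine sets used in the proof.
bases = [
    (2, 4, 19, 20), (10, 11, 21, 24), (7, 8, 13, 15),
    (2, 3, 21, 22), (6, 8, 17, 19),   (11, 12, 14, 15),
    (6, 7, 22, 24), (3, 4, 13, 14),   (10, 12, 17, 20),
]

# Check 1: each of the nine sets is mutually orthogonal, so its projectors commute.
assert all(b in tetrads for b in bases)

# Check 2: 18 distinct rays, each appearing in exactly two of the nine sets.
counts = Counter(r for b in bases for r in b)
assert len(counts) == 18 and set(counts.values()) == {2}

# Parity obstruction: an assignment with exactly one value 1 per set puts
# nine 1s across the sets (odd), but each ray is counted twice, so the
# total over the sets is always even.  No such assignment exists.
print("both checks pass; nine is odd, double counting is even: contradiction")
```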

The conclusion is that if you want to believe in hidden variable theories you had better find a way of implementing them that does not comply with the assumptions of this theorem. It is not enough to look for non-local hidden variable theories. You also have to drop the condition that the variables have definite values, or the condition that they are independent of the context of the measurement. For my money that takes you away from the philosophical objections that hidden variable theories are supposed to answer, but if you want to dispute that you can do it here :).


Mike Duff Finds New Way to Test Strings

April 1, 2011

String theorist Mike Duff, famed for his pioneering work on M-theory, has announced a novel and practical way to test the theory of strings. The Abdus Salam Professor of Theoretical Physics at Imperial College delivered a talk on “Black Holes and Qubits” in Durban earlier this month. His work with Borsten, Dahanayake, Ebrahim, Marrani and Rubens caused a stir last year because of the widely misinterpreted claim that it provides a test of the mathematics of string theory. “Two different branches of theoretical physics, string theory and quantum information theory (QIT), share many of the same features, allowing knowledge on one side to provide new insights on the other. In particular the matching of the classification of black holes and the classification of four-qubit entanglement provides a falsifiable prediction of string theory in the field of QIT,” he said.

During the workshop, Duff teamed up with new collaborators from various fields of quantum physics to try out a completely new way of testing strings that came to light during the interdisciplinary discussions, and this time it was very much for real. Andrzej Dragan, Jason Doukas, Ivette Fuentes, Mike Duff and Nick Menicucci demonstrated the method, which involves placing long strings under tension. The objective was to observe the temperature that results from uniform acceleration, known as the Unruh effect. Some of their critics have already described it as “highly risky” and “jumping to wild conclusions”, but Duff has responded by challenging them to test it for themselves. ViXra Log has exclusive video of how the amazing experiment turned out.

Update: I probably did not fool many people with this post, but in case anyone is wondering, that really is Mike Duff and other participants of the Relativistic Quantum Information Workshop doing the bungy jumping. The jump-off point is 106 meters above the Moses Mabhida Stadium in Durban. If anyone has experienced any equally unusual activities organised at conferences and workshops please do tell.