Altarelli’s State of the Standard Model January 31, 2008

Posted by dorigo in news, physics, science.

The Italian workshop on LHC physics which is taking place in Perugia offered a few enlightening lectures this morning. The high point was reached when Guido Altarelli took the stage for a 50′ talk on the status of the Standard Model.

Guido is a distinguished particle theorist who has worked in quantum chromodynamics (QCD) and standard model physics since their birth; he has made a wealth of important contributions to the subject, and he is best known for one cornerstone of QCD: the famous Altarelli-Parisi equation (which some call Dokshitzer-Gribov-Lipatov-Altarelli-Parisi, or DGLAP).

That formula with many fathers describes the departure of hadron cross sections from a scaling law as a result of QCD radiation off the partons, parametrized by suitable splitting functions, as a function of the energy scale at which hadrons are probed. It would take a long post to explain it in detail, but it would be a wonderful challenge for me to write it, and I do love these challenges. So I am tying a virtual knot in my handkerchief to remind myself to treat this topic soon.
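For the impatient, the simplest (non-singlet) form of the equation, as I would write it in that future post, reads

\frac{\partial q(x,Q^2)}{\partial \ln Q^2} = \frac{\alpha_s(Q^2)}{2 \pi} \int_x^1 \frac{dz}{z} \, P_{qq}(z) \, q(x/z, Q^2) ,

where q(x,Q^2) is the quark density at momentum fraction x and scale Q^2, and P_{qq}(z) is the quark-to-quark splitting function; the full set of equations couples quark and gluon densities through the matrix of splitting functions.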

In the following I give a summary of the lecture Guido Altarelli gave this morning. I have to warn my non-physicist readers: in this instance, I was unable to describe things in a way simple enough to make them understandable to everybody… I will expiate soon with some more accessible material on the Higgs boson… Stay tuned: breaking news is coming tomorrow!

The picture Altarelli painted was “impressionist” in his own words, because of the vastness of the topic. He started with a discussion of the status of QCD, which “surpassed QED as a prototype of a gauge theory, thanks to its richer structure”. The problem with QCD is to extract consequences and predictions from it. This is a crucial job for the LHC, a prerequisite for the discoveries that the new machine will make possible. For the most part, QCD phenomena are non-perturbative in nature: they cannot be computed by making first-order approximations and then refining them with smaller and smaller corrections, because the “corrections” are too large. There are two main methods to circumvent this hindrance: simulations on the lattice, and “effective Lagrangians”.

Guido concentrated on a discussion of the first approach, which has continued to improve and has become a very important tool to understand QCD. In lattice quantum chromodynamics, calculations are made on a lattice of points, by discretizing spacetime. The results depend on the lattice spacing, which can then be extrapolated to zero, obtaining the “continuum limit”. Lattice QCD has given us many results and is continuing to progress, ranging from hadron spectroscopy (explaining the masses of mesons and baryons) to flavor physics (with calculations of form factors in hadron decays and studies of CKM matrix elements), to the study of phase transitions.
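Schematically, the extrapolation exploits the fact that a lattice result at spacing a differs from the continuum value by terms suppressed by powers of a (which powers, depends on the discretization used):

O(a) = O_{\rm continuum} + c_1 a + c_2 a^2 + \dots ,

so that computing the same observable at several spacings and fitting the a dependence allows one to read off the physical value at a = 0.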

A sector of QCD which contains a few important open issues is the deciphering of the phase diagram when both the temperature and the density of a partonic gas are high. This can be studied with heavy-ion beams at the LHC. One piece of evidence for deconfinement coming from lattice QCD calculations is based on the slope of the potential between two colored charges (take a pair of quarks, for instance): the potential becomes flatter as the temperature increases, until it totally flattens out and stays so as T is further increased. The critical temperature at which this happens is a fundamental parameter, which depends on the number of flavors. Our current knowledge of the dynamics of these ultradense states of matter suggests that they are simply described as an ideal fluid, for which hydrodynamics gives a good description. Evidence of this, however, is still indirect, and the interest in a direct confirmation of these effects at the LHC is very high.
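A rough way to picture the lattice result is through the static potential between a heavy quark and antiquark, often parametrized in a Cornell-like form with a Coulomb piece plus a linearly rising, confining piece whose effective string tension melts away above the critical temperature:

V(r) \simeq -\frac{4}{3} \frac{\alpha_s}{r} + \sigma(T) \, r , \qquad \sigma(T) \to 0 \ {\rm for}\ T > T_c .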

Coming back to “standard” particle physics, Guido noticed that as far as the perturbative regime of QCD is concerned, the technology of calculations has now reached impressive heights: to obtain sufficient precision we nowadays rely on very high orders of the perturbative expansions and on complex resummations of leading logarithmic contributions to all orders. Computations which are routine today were thought impossible only ten years ago. For instance, those on splitting functions: a computation by Moch, Vermaseren and Vogt in 2005 used about 10,000 different diagrams. Another work, on hadronic inclusive decays of Z bosons, has reached NNNLO level, that is, contributions to the fourth order in \alpha_s. The famous R-ratio has now been computed to fourth order in \alpha_s too, and with this improvement the agreement with experimental results has improved further.
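For reference, the R-ratio mentioned here is the ratio of the hadronic to muonic cross sections in e^+e^- annihilation, whose perturbative expansion starts as

R \equiv \frac{\sigma(e^+e^- \to {\rm hadrons})}{\sigma(e^+e^- \to \mu^+\mu^-)} = N_c \sum_q Q_q^2 \left( 1 + \frac{\alpha_s}{\pi} + \dots \right) ,

and it is the terms up to the fourth power of \alpha_s in that series that the computations Altarelli refers to now provide.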

In summary, QCD has become a very complete theory, which is used for precision measurements and careful comparisons with experiment. But the standard model issues which the LHC will have to address are mostly in the electroweak sector. The problem of the Higgs boson is central and connected to all the others: the flavor sector of the SM, the hierarchy problem, the existence of new physics at the TeV scale.

Given the centrality of the Higgs boson in today’s physics, the list of experimental results we presently have which provide information on it is very short. The Higgs is light, because measurements of radiative corrections say so. LEP II, on the other hand, gave a lower mass limit of 114.4 GeV. We also know pretty well that if the Higgs boson exists it is a weak isospin doublet, because otherwise the ratio between the W and Z masses would not equal the cosine of the Weinberg angle, M_W/M_Z = \cos \theta_W.
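The doublet statement can be phrased through the tree-level \rho parameter, which equals one for any number of Higgs doublets but not, in general, for higher representations:

\rho \equiv \frac{M_W^2}{M_Z^2 \cos^2 \theta_W} = 1 \quad {\rm at\ tree\ level\ for\ doublets} .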

We also know that some Higgs mechanism must exist to break electroweak symmetry. However, the real nature of the mechanism is not known yet. A single doublet of fields? More doublets? SUSY ones? Is the Higgs a composite? The LHC will answer these questions.

Altarelli then mentioned the inputs coming from low energy, most notably the g-2 experiments, which are sensitive to new physics. This is especially true for the Brookhaven experiment with muons: the muon mass being roughly two hundred times that of the electron, and the sensitivity to heavy new physics growing with the square of the lepton mass, the muon measurement is about 4 \times 10^4 times more sensitive. Presently, there is a discrepancy between theory and experiment at the level of 3.3 standard deviations. However, the fact that the largest part of the uncertainty in the theoretical prediction comes from the evaluation of virtual hadronic contributions to g calls for some caution in interpreting this result. On the other hand, some light supersymmetry could give a contribution to the anomalous magnetic moment of the muon of just the right size.
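The scaling behind that statement is that a generic contribution of heavy new physics at a scale \Lambda to the lepton anomalous moment grows with the square of the lepton mass,

a_\ell^{\rm NP} \sim C \, \frac{m_\ell^2}{\Lambda^2} \quad \Rightarrow \quad \frac{a_\mu^{\rm NP}}{a_e^{\rm NP}} \sim \left( \frac{m_\mu}{m_e} \right)^2 \approx 4 \times 10^4 ,

which is why the muon, despite being measured less precisely than the electron, is the better probe.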

If one examines the global electroweak fits to standard model parameters, one sees that the largest discrepancy affecting the global fit probability (which stands at 15%: not bad, but not great either) comes from the hadronic and leptonic determinations of \sin^2 \theta. If one puts that quantity on the y axis of a plot with the Higgs mass on the abscissa, and places the hadronic and leptonic determinations at the x values where theory would predict the Higgs mass to be, one can visualize this contrasting indication: the average of the two quantities is a political compromise, since the two determinations do not match well with each other.
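In case it helps: the “leptonic” and “hadronic” determinations come mainly from the leptonic asymmetries (dominated by SLD) and from the b-quark forward-backward asymmetry at LEP, which are related to the effective mixing angle by the standard expressions

A_{FB}^{0,b} = \frac{3}{4} A_e A_b , \qquad A_f = \frac{2 g_V^f g_A^f}{(g_V^f)^2 + (g_A^f)^2} , \qquad \frac{g_V^f}{g_A^f} = 1 - 4 |Q_f| \sin^2 \theta_{\rm eff}^f .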

What could be wrong in the leptonic versus hadronic measurements of the Weinberg angle? There could be new physics in the Zbb vertex, which affects the third family of fermions. The size of the new physics contribution would have to be of the order of 30% on the left-handed couplings: a huge effect at tree level! But modifying the standard model at tree level is incredibly hard without jeopardizing the excellent agreement of all physics measurements made so far, so one faces the challenge of inventing a new particle with which the W or Z bosons might mix, with quantum numbers chosen so as not to spoil past measurements.

Guido also discussed flavor physics, where no effect is seen, while measurements have reached a high level of precision. Any new physics must enter these processes very silently. Operators describing new physics effects which correct the standard model must be induced by loops and not arise at tree level, and since the standard model has very strong protections (such as the GIM mechanism and similar cancellations), the new physics effects must be small. As far as neutrinos are concerned, the indication is that these particles are very light because they are Majorana particles, and very massive counterparts, beyond our reach, keep them light. Because of that, neutrinoless double beta decay experiments are very important, because they could establish the violation of lepton number.
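The mechanism alluded to here is the see-saw: with a Dirac mass m_D of the order of the other fermion masses and a very large Majorana mass M for the heavy partners, the light neutrino mass comes out as

m_\nu \simeq \frac{m_D^2}{M} ,

so that an M not far below the grand-unification scale naturally yields sub-eV neutrino masses.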

To conclude, Altarelli said that the standard model awaits the LHC to see, on one side, the completion of the scalar sector, but much more is in principle possible. When asked by Guido Tonelli to spill his guts and declare what he expects the LHC will find, he mentioned that he had previously foreseen new physics at LEP II, and was proven wrong: so he is apparently not the right person to ask. However, he mentioned that since SUSY is the best model that theorists have been able to conceive in the last twenty years to explain many of the existing problems, he would be happier if what were found was not SUSY, but rather something different: much of the theoretical work on SUSY has already been done. He would be much happier if some kind of extra dimensions relevant for electroweak physics were discovered: this would necessitate a much richer theoretical overhaul of our present preconceptions. On the other hand, if nothing is found by the LHC, particle physics might be at its last stop. With this gloomy remark, he left the podium.

Comments

1. Paolo - January 31, 2008

Thanks Tommaso. Just in case someone missed it, this rather recent colloquium is also highly recommended (IMHO):

http://indico.cern.ch/conferenceDisplay.py?confId=a07123

2. DB - January 31, 2008

Paragraph 5, you probably mean “surpassed QED”, not QCD.

My last physics paper (http://xxx.lanl.gov/abs/hep-ph/9604325) was on Zbb, trying to explain the Z->bb excess over Z->cc. It invoked 3 fairies: SUSY, R-parity violation, and a fourth generation. The experimental signature went away before the paper could get published😦

3. Coin - February 1, 2008

“He started with a discussion of the status of QCD, which ‘surpassed QCD…'”

Is this a typo?

4. lazopolis - February 1, 2008

“which surpassed QCD-as-a-prototype-of-a-gauge-theory”,

I guess

5. JustChecking - February 1, 2008

Very interesting post, thanks a lot for taking your time to write it all down!
Looking forward to the DGLAP post!😀

Btw, enjoy your vacation!🙂

6. dorigo - February 1, 2008

Hi all,

it should read “which surpassed QED”…

Sorry! Will change the text.

Paolo, thanks for your link.
JC, it will take a couple of weeks for me to write it…

Cheers,
T.

