
Two papers say the SM is doomed. But is it? March 7, 2008

Posted by dorigo in news, physics, science.

A colleague pointed me today to two papers that recently appeared on the arXiv: this and that. First signs of physics beyond the Standard Model? Not if you ask me. Let me describe briefly (because I am about to go to sleep!) what these new and interesting works are about.

The first paper (0803.0512) discusses leptonic decays of D_s mesons, i.e. those involving a charged lepton and a neutrino. An average of different determinations finds that they are more frequent than theory predicts, and the effect is quoted at 3.8 standard deviations. Hmmm. What is it, however, that the experimental average is being compared to, to yield such a significant discrepancy?

It is a computation from lattice QCD. The paper reviews “critically” two different computations of the decay constant f_{D_s}, which is the quantity measured at 277 \pm 9 MeV by BaBar, Belle, and CLEO. One computation finds 241 \pm 3 MeV, another finds 249 \pm 3 \pm 16 MeV. There are reasons, according to the authors, to prefer the former to the latter, and there you go, a discrepancy of 36 \pm 9.3 MeV arises.

In my deep ignorance of lattice QCD, I spent some time pondering the paper's arguments for preferring the calculation which disagrees most with experiment, and I came to the conclusion that they are insufficient. You can go through the same process and come to your own conclusion, and of course it might differ from mine: the fact remains, however, that if you take the determination with the larger, more conservative error bar, you get a disagreement with experiment of only 1.5 standard deviations. If I have to choose which is the more likely, I am sorry, but I prefer to buy the smaller discrepancy and cast doubt on the more “precise” lattice calculation, rather than going around screaming that we have seen the first evidence of charged Higgs decays in D_s mesons…
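For the record, here is a minimal back-of-the-envelope sketch of the arithmetic behind those two numbers, assuming the experimental and lattice uncertainties are uncorrelated Gaussians that can simply be added in quadrature (naive quadrature gives about 9.5 MeV for the first combined error, close to the 9.3 MeV quoted above; the paper's own error treatment may differ in detail):

```python
# Naive significance check, assuming uncorrelated Gaussian uncertainties
# added in quadrature (the paper's own error treatment may differ in detail).
from math import sqrt

exp_val, exp_err = 277.0, 9.0   # experimental average of f_Ds (MeV)

for lat_val, lat_err, label in [
    (241.0, 3.0, "preferred lattice calculation"),
    (249.0, sqrt(3.0**2 + 16.0**2), "conservative lattice calculation"),
]:
    diff = exp_val - lat_val
    err = sqrt(exp_err**2 + lat_err**2)
    print(f"{label}: {diff:.0f} +- {err:.1f} MeV  ->  {diff / err:.1f} sigma")
# -> about 3.8 sigma for the former, about 1.5 sigma for the latter
```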

And actually, there is a sentence that to me gives away how eagerly the paper hunts for a discrepancy. Here it is:

“The only other modern lattice-QCD calculation agrees, 249 \pm 3 \pm 16 MeV, but its quoted error is five times larger and would not influence a weighted average with [the more precise result]”.

Weighted average? A weighted average of two lattice-QCD calculations, with poorly understood uncertainties and similar methodologies? What about correlations? I guess I understand too little of lattice QCD, but to me this sounds like an alarm bell: a metallic voice is whispering in my ears… do-not-trust-these-figures-they-are-just-an-exercise-yet. I am sorry if I sound disrespectful: these are esteemed theorists and I do respect them and their work. It is only that I have learned to be wary of lattice QCD calculations, especially when they are used to claim new physics. Hell, they go straight to claiming charged Higgs bosons and leptoquarks as the possible source of the discrepancy…
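To be fair, the quoted sentence is arithmetically true. Here is a minimal sketch of such an inverse-variance weighted average, naively treating the two calculations as independent, which is exactly the assumption I find questionable:

```python
# Minimal inverse-variance weighted average of the two lattice results,
# naively treated as independent (correlations ignored, which is precisely
# the questionable assumption).
from math import sqrt

results = [(241.0, 3.0),                      # more precise calculation (MeV)
           (249.0, sqrt(3.0**2 + 16.0**2))]   # conservative one, errors in quadrature

weights = [1.0 / err**2 for _, err in results]
avg = sum(w * val for (val, _), w in zip(results, weights)) / sum(weights)
avg_err = 1.0 / sqrt(sum(weights))
print(f"weighted average = {avg:.1f} +- {avg_err:.1f} MeV")
# -> about 241.3 +- 3.0 MeV: the less precise result barely moves the average,
#    just as the paper says, but only because correlations are ignored.
```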

Paper number 2 (0803.0659) is titled even more boldly “First evidence of new physics in b<->s transitions”. What is this about? It is a combined fit of quantities measured at the Tevatron in the B_s sector, which finds a disagreement with the Standard Model at the more-than-3-sigma level in the phase of B_s mixing. I must say I am no more an expert on B mixing analyses than I am on lattice QCD, so I have to keep my criticism of the method to a minimum. The analysis combines CDF and D0 data which rely on different theoretical assumptions about the strong phases. I was not able to fully understand what exactly they do to combine the results, but they do make an attempt at explaining it, noting that “Hopefully D0 will present results without assumptions on the strong phases in the future, allowing for a more straightforward combination”. What I get from this is that I am not alone in finding their combined result rather un-straightforward.
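Just to illustrate why the combination procedure matters, here is a deliberately oversimplified toy: two hypothetical Gaussian determinations of the mixing phase (the numbers are made up and are not the CDF and D0 results), combined naively and compared to the tiny phase expected in the SM. The real likelihoods are non-Gaussian and carry discrete ambiguities from the unknown strong phases, which is exactly why the actual combination is not this straightforward:

```python
# Toy illustration only: naive Gaussian combination of two hypothetical
# measurements of the B_s mixing phase (made-up numbers, NOT the real
# CDF/D0 results). The real likelihoods are non-Gaussian and have
# strong-phase ambiguities, so the actual combination is far less trivial.
from math import sqrt

meas = [(-0.70, 0.35), (-0.60, 0.30)]   # hypothetical (phase, error) in radians
sm_phase = -0.04                        # roughly the small phase expected in the SM

w = [1.0 / err**2 for _, err in meas]
combined = sum(wi * val for (val, _), wi in zip(meas, w)) / sum(w)
combined_err = 1.0 / sqrt(sum(w))
pull = abs(combined - sm_phase) / combined_err

print(f"combined phase = {combined:.2f} +- {combined_err:.2f} rad, "
      f"{pull:.1f} sigma away from the SM expectation")
```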

The authors come to the conclusion that the B_s mixing phase \phi_{B_s} is different from zero at about three standard deviations, and say:

“We conclude that the combined analysis gives a stable evidence for new physics, although the precise number of standard deviations depends on the procedure followed to combine presently available data.”

To me, that sounds like saying, “we wish this were evidence for new physics, but we do not really know whether it is solid enough. But the deviation, if there is any, can be explained well by models of new physics beyond the SM”. I cannot help remembering the warnings of Michelangelo Mangano in a recent paper, where he says that establishing a deviation from the Standard Model and interpreting it in terms of new physics of this or that kind should be kept as separate steps…

One last remark about the paper. They admit that it was triggered by input from a CDF member, an Italian colleague of mine whom I hold in high esteem, Marco Rescigno. Despite the esteem, though, I have to note that Marco’s prodding means the analysis was not blind, but triggered by a discrepancy. I cannot help thinking that it would be really remarkable if, among all the beautiful measurements that CDF is making these days, one were unable to squeeze out a 3-sigma discrepancy with Standard Model expectations. Sure, it is a discrepancy that fits well with some model-independent new physics scenario. But is that enough to get hyped up? Here is what the authors have to say in their conclusions:

“With the procedure we followed to combine the available data, we obtain evidence for NP at more than 3 \sigma. […] We are eager to see updated measurements using larger data sets from both the Tevatron experiments in order to strengthen the present evidence, waiting for the advent of LHCb for a high-precision measurement of the NP phase”.

What can I say? Good luck. I would be so damn happy if you were right… But I bet you are just being optimistic.
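Coming back to my remark about how easy it is to find a 3-sigma effect somewhere: assuming, purely for illustration, Gaussian statistics and N independent measurements, the chance that at least one of them fluctuates to 3 sigma by chance alone is anything but negligible.

```python
# Probability that at least one of N independent measurements shows a
# >= 3 sigma (two-sided) fluctuation purely by chance: a rough illustration
# of why a single 3 sigma deviation among many searches is not remarkable.
from math import erf, sqrt

p_single = 1.0 - erf(3.0 / sqrt(2.0))   # two-sided 3 sigma p-value, ~0.0027
for n in (10, 100, 500):
    p_any = 1.0 - (1.0 - p_single) ** n
    print(f"N = {n:4d}:  P(at least one 3 sigma fluke) = {p_any:.2f}")
# -> about 0.03 for N = 10, 0.24 for N = 100, 0.74 for N = 500
```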

Comments

1. Kea - March 7, 2008

Hmmm, it’s been a while since I worked in lattice QCD, but your analysis sounds quite reasonable. Would anyone really expect beyond-SM physics to come out of an analysis using only the tools of SM physics? Hardly.

2. superweak - March 7, 2008

I’ll only comment on the D_s result. The quality of lattice results has increased dramatically over the last few years; the HPQCD and Fermilab/MILC results use vastly different methodologies, and the (much more precise!) HPQCD result features careful evaluation of their systematic uncertainties. The +- 3 MeV lattice uncertainty is intended to be taken seriously. I’d personally rather have the discrepancy appear in fD/fD_s because the ratio should be even better controlled on the lattice, but unfortunately dominant experimental uncertainties don’t cancel in the ratio.
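(A small aside on the ratio, with made-up illustrative numbers rather than anyone's actual error budget: in linear error propagation an uncertainty that is fully correlated between numerator and denominator drops out of the ratio, while independent uncertainties add in quadrature in relative terms; this is why a ratio helps on the lattice side but not on the experimental side.)

```python
# Illustration with made-up numbers (not the real f_D / f_Ds error budgets):
# in linear error propagation, a fully correlated relative uncertainty cancels
# in a ratio, while independent relative uncertainties add in quadrature.
from math import sqrt

corr_rel = 0.02   # hypothetical 2% uncertainty common to numerator and denominator
ind_num = 0.01    # hypothetical independent 1% uncertainty on the numerator
ind_den = 0.01    # hypothetical independent 1% uncertainty on the denominator

err_single = sqrt(corr_rel**2 + ind_num**2)   # each quantity alone: ~2.2%
err_ratio = sqrt(ind_num**2 + ind_den**2)     # ratio: correlated part cancels, ~1.4%
print(f"single quantity: {err_single:.1%},  ratio: {err_ratio:.1%}")
```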

Could BSM physics appear in a SM analysis? The Homestake solar neutrino experiment says: yes!

3. fliptomato - March 7, 2008

Hi Tommaso, thanks for offering your thoughts on these two papers! I’m curious, as a naive theory student, whether it’s non-trivial to combine experimental results as in the B_s paper. Is this a well-defined, unambiguous process, or is it open to bias by the analysts?

Secondly, how well do we really understand the standard model prediction for these processes? I know next to nothing about lattice methods, but it seems like there’s a lot of wiggle room in values for decay constants, etc. Does it make sense to pick a particular lattice methodology and then compare experimental results to that? Or should one also `average’ over lattice predictions? (Do they do this?)

Thanks,
Flip

4. chris - March 7, 2008

hi tomaso,

being a lattice guy myself i am less than excited that the ‘new’ ‘prediction’ is featured on your blog. not that i do not appreciate your efforts, quite on the contrary. i would really like lattice calculations to make headlines (and they do and will). but it is not so long ago that a few people too eager for fame discredited the whole field with premature conclusions. remember the pentaquark and its silent departure a few years back? i don’t even want to talk about direct cp violation in K decays… fact is that most of these calculations still are quite far away from the physical point and rely crucially on various extrapolations. and estimating the systematics is a difficult task. furthermore, this paper is based exclusively on the staggered fermion formulation, which is in the process of being phased out because it may have severe conceptual problems (i.e., its continuum limit might not be qcd). so although f_D_s is a rather clean quantity, this level of precision – 3 MeV – is in my opinion overly optimistic.

of course time will tell whether this is a real effect and of course i would like to see a lattice prediction dealing the death blow to the standard model. but unfortunately there is the very real chance that some overly eager individuals discredit the whole field – yet again.

5. chris - March 7, 2008

superweak,

the +/- 3 MeV, among many other things, implies a scale setting precision of less than 1% (this is a dimensionful quantity). despite a careful analysis of systematics and with all due respect, we (lattice in general) are not quite there yet in my opinion.

6. DB - March 7, 2008

I’m reminded of the Brookhaven muon g-2 announcement a few years ago. A lot of excitement until the uncertainties in the leading-order hadronic vacuum polarisation were pointed out. A lot of to-ing and fro-ing ever since, so that today we have a 3 sigma discrepancy if you calculate the contribution from e+e- to hadrons data, or 0.9 sigma if you use hadronic tau decay data.
This process and those referred to by Tommaso are marvellous to watch: soon after a suspected deviation is announced, the community lines up its big guns and attacks it mercilessly to uncover flaws. All the PR and hoo-ha is replaced by long hours of painstaking and subtle analyses. All good examples of scientific scepticism at work. Remarkable claims require remarkable evidence, which must stand up to the fiercest independent assaults.

7. dorigo - March 7, 2008

Hi Kea, well, I guess what you mean is that a discrepancy is not a NP model. I agree. And below, Superweak warns us that Homestake did see BSM effects by measuring a SM cross section. However, one can argue effectively that despite twenty years of stark disagreement between Homestake, Gallex, Sage, and other neutrino experiments and the predicted neutrino fluxes, it was only after SuperKamiokande data came in that the disagreement was accepted as a signal of neutrino mixing.

Cheers,
T.

8. dorigo - March 7, 2008

Hi Superweak,

I do not possess the means to question the lattice prediction with its small error in my post, but I note that 3 MeV is a darn small uncertainty, and that the existence of an independent result which is more in line with the measurement casts some doubt on it.
Your point about uncertainties not canceling in the ratio is interesting and it leaves me disarmed; I hereby declare that I need to study the matter more, and leave the floor to anybody who wants to explain this subtlety.

Best,
T.

9. dorigo - March 7, 2008

Hi Flip,

yes, the combination of CDF and D0 results was non-trivial and subject to a degree of arbitrariness, according to the authors of the paper themselves. The problem is twofold: the non-homogeneity of starting assumptions on one side, and the lack of complete information on the likelihood function used by D0 on the other.

As to lattice QCD and averaging predictions, I have my doubts that it is a good idea. Again, I am no expert, and again I am more than happy if experts clarify this here; in any case, I feel as you do that some of the black magic that has made lattice QCD so successful in recent years is still in need of a more solid foundation. Whenever I read (or rather, browse through) an LQCD paper I am left with the impression that it is a work in progress – and thus that the results are preliminary.

Cheers,
T.

10. dorigo - March 7, 2008

Hello Chris,

thank you for your valuable contribution. I have mixed feelings about the idea that an insider going too far with some claims discredits a whole field. Physicists are vaccinated against these mechanisms, and the harm can only come from turning the opinion of the general public against basic research. If the authors of the paper wanted to express their conviction about these calculations and their discrepancy, I think there is nothing wrong with that. Without speculation, ours would be a much more tedious job!
On the other hand, I believe your fear is that LQCD gets discredited by attacks on this new claim. I think LQCD is much stronger now than it was a decade ago, and it cannot be discredited as a way of doing science as easily as in the past. Let’s think positive… I personally think lattice calculations are a wonderful tool, and I share your hope that one day they will be a key player in the discovery of new physics.

Cheers,
T.

11. dorigo - March 7, 2008

DB, yes, this is the process by which science advances – by running over egos and killing false claims, only to see who’s still standing. Not for the faint-hearted, but admittedly a pleasure to watch from the sidelines.

Cheers,
T.

12. chris - March 7, 2008

hi Tomaso,

thanks for the encouraging reply. one thing to note is that although one of the authors of the current paper is a lattice person, none of them actually authored the original lattice prediction.

you can also be sure that there will be a lot of lattice papers on this topic during the next months.

13. goffredo - March 7, 2008

Science praises uncertainty. Ideologies and egos abhor it.

14. Not Even Wrong » Blog Archive » HEP and Politics News - March 9, 2008

[…] accurate top mass measurement, reports of not very convincing deviations from the Standard model in B-mixing and charm decays, and stringent new limits on WIMPs that make SUSY more […]

15. C. - March 29, 2008

ciao Tommaso

Another lattice practitioner speaking here (though not trying the argumentum ad verecundiam… ;-). Indeed, lattice QCD is still not at the point of delivering results which are accurate enough to make such claims — I mean, not if systematic uncertainties are properly and honestly taken into account. On the positive side: the tools to do it are now finally there, and a few years’ hard work will usher in the era of precise, controlled phenomenological lattice results.

However, some collaborations are jumping ahead and routinely make bold claims about the absolute precision of their results. HPQCD and Fermilab/MILC, mentioned above, are at the forefront of this strategy. A (very) large fraction of the lattice community considers that, at best, such computations cannot even be deemed a first-principles approach. In any case, trying to prove the appearance of new physics on this basis… I find it simply ludicrous.

So, I very much appreciate the objectivity of your analysis of the D_s paper. There are many phenomenologists around way less cunning than that…

16. dorigo - March 29, 2008

Hi C,

thank you for your input. I indeed think that the methods for lattice calculations have been tested enough that we now know their error margin and their range of applicability. What in fact is lacking from the paper cited above is a more critical approach to the problem and a conservative stand. The point has been made, and I would tend to agree, that these attempts risk damaging the field.

Cheers,
T.

17. Antonio Masiero: Astroparticles in the LHC Era « A Quantum Diaries Survivor - April 18, 2008

[…] and some three-sigma discrepancies in b<->s transitions. Concerning the latter, a month ago there was a study showing some discrepancy from the SM for the phase in B_s mixing. We have some possible indications, but however how much we can like this […]

