
Awaiting news on the formerly 2.1-sigma excess of MSSM Higgs
October 2, 2007

Posted by dorigo in news, physics, science.

Last January, readers of this blog and of Cosmic Variance got acquainted with a funny effect seen by CDF in the data where they were searching for a signal of supersymmetric Higgs boson decays to tau-lepton pairs: the data did allow for a small signal of H \to \tau \tau decays, if a Higgs mass of about 150-160 GeV was hypothesized, together with a hitherto not excluded value of some critical parameters describing the model considered in the search. The plot below shows the mass distribution of events compatible with the sought double tau-lepton final state: backgrounds from QCD, electroweak, and Drell-Yan processes are in grey, magenta, and blue, respectively, and the tentative signal is shown in yellow.

Although John Conway (the writer at CV and one of the analysis authors) and I were quite adamant in explaining that the effect was most likely due to a fluctuation of the data, and that its significance was in any case very small, the rumor of a possible discovery spread around the web, and was eventually picked up by articles which appeared in March in New Scientist and the Economist. I have described the whole process and its implications in detail time and again (check my Higgs search page – tab above), so I will not add anything about that here.

What I wish I could discuss today is the new result obtained by John and his team in the same search, which is now based on twice as much statistics. You would guess that if you double the statistics, a true signal would roughly double in size, and its significance would grow by about 40%. Correct. Further, if you had some experience with hadron collider results, you would actually expect an even larger increase, because analyses in that environment keep improving as time goes by and a better understanding of the backgrounds is achieved. A fluctuation, on the other hand, would most likely get washed away by a doubling of the data…
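For the record, here is the back-of-the-envelope arithmetic behind that 40% figure, in the usual counting-experiment approximation where the significance of a signal of S events over a background of B events scales as S/\sqrt{B}: doubling the data doubles both S and B, so the significance becomes 2S/\sqrt{2B} = \sqrt{2} \times S/\sqrt{B}, a growth of about 41%.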

CDF has a policy of making a physics result public only after careful internal scrutiny and several passes of review. After the result is “blessed”, there is nothing wrong in distributing it – but a nagging moral responsibility remains toward the authors themselves, who must be given the chance to be the first to present their findings to the outside world. I did not use to consider this a real obligation, until I discussed the matter with a few colleagues – among them, the very John Conway who is the mastermind behind the H \to \tau \tau analysis. I hold John in high esteem, an opinion matured during a decade of collaboration, and he was instrumental in making me change my mind about the issue. For that reason, I cannot disclose here the details of his brand new result, which was blessed last week in CDF, until I get news of a public talk on the matter.

Because of the above, this post will not discuss the details of the new result, and it will remain unfinished business for a while. I will update it with a description of the result when I get a green light; for the time being, I think I can still do something useful: try to put readers in a position to understand the main nuts and bolts of the theoretical model within which the 2.1-sigma excess was found nine months ago.

1 – TWO WORDS ABOUT SUSY

First of all, what is the MSSM? MSSM stands for “Minimal Supersymmetric Standard Model”. It is an extension of the Standard Model of particle physics which attempts to solve some of its unsatisfactory features; it is the minimal version of a class of theories called SUperSYmmetric – SUSY for friends. These theories postulate a symmetry between fermions (particles having a half-integer value of the quantum number called “spin”) and bosons (particles with zero or integer spin): for every known fermion (spin 1/2) there exists a supersymmetric partner whose characteristics are the same except for having spin 0, and likewise every known boson (spin 1) has a spin-1/2 partner. Such a doubling of all known particles would automatically solve the “fine tuning” problem of the Standard Model (which was excellently explained by Michelangelo Mangano recently, see https://dorigo.wordpress.com/2007/04/27/explaining-the-naturalness-problem/ ; also see Andrea Romanino’s perspective on the issue), and it would have the added benefit of allowing a unification of the coupling constants of the different interactions at a common, very high energy scale. Some say SUSY would make the whole theory of elementary particles considerably prettier; others disagree. If you ask me where I stand, I think it just makes things messier.

Physicists have always been wary of adding parameters or entities to their model of nature, even when the model is obviously incomplete or when the addition appears justified by experimental observation. Scientific investigation proceeds well by following Occam’s principle: “entia non sunt multiplicanda praeter necessitatem”, entities should not be multiplied needlessly.

The supersymmetric extension of the Standard Model implies the existence of not just one but a score of new, as-yet unseen elementary particles: in order for SUSY to be there and still undiscovered, we need to have missed all these bodies so far, and the only way that is possible is if all SUSY particles have large masses – so large that we have so far been unable to produce them in our accelerators. Such a striking difference between particles and sparticles can be due to a “SUSY-breaking” mechanism, a contraption by which the symmetry between particles and sparticles is broken, endowing all sparticles with masses much larger than those of the corresponding particles: and funnily enough, their value has to be juuuuust right above the lower limits set by direct searches at the Tevatron and elsewhere, in order for the coveted “unification of coupling constants” to be possible.

So if we marry the hypothesis of SUSY, we need to swallow the existence of a whole set of new bodies AND an uncalled-for mechanism which has hidden them from view until today. Plus, of course, scores of new parameters: mass values, mixing matrix elements, what-not. Occam’s razor is drooling to come into action. In fact, so many choices are possible for the free parameters of the theory that, in order to be sure of talking about the same model, phenomenologists have conceived some “benchmark scenarios”: choices of parameters that describe “typical” points in the multi-dimensional parameter space.

2 – THE HIGGS SECTOR OF THE MSSM

A very important subclass of these models (some would frown at my calling it a benchmark: it is more like a model of its own) is the so-called “Minimal Supersymmetric extension” of the Standard Model, also known as MSSM. In the MSSM the Higgs mechanism yields the smallest possible number of Higgs bosons: five physical particles, as opposed to the single neutral scalar particle of the Standard Model. Let me introduce them:

  • two neutral, CP-even states: h, H (with m_h < m_H)
  • one neutral, CP-odd state, A
  • two electrically charged states: H^+, H^-.

The CP-parity of the states need not bother you: it is irrelevant for the searches discussed in this post. What you should take away, however, is that there are three neutral scalar bosons to search for, not just one.

Where do these five states come from? Well, the symmetry structure of SUSY requires two different Higgs doublets: one responsible for the mass of up-type fermions (the u, c, t quarks), the other for down-type fermions (the d, s, b quarks and the e, \mu, \tau charged leptons). Two (2) doublets (x2) of complex (x2) scalar fields make for a total of eight degrees of freedom – eight different real numbers, to be clear; three of these are spent giving mass to the W and Z bosons through the Higgs mechanism, and five physical particles remain.
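In symbols, the bookkeeping reads: 2 doublets \times 2 complex fields \times 2 real numbers per complex field = 8 degrees of freedom; the three “spent” ones are the Goldstone bosons eaten by the W^+, W^-, and Z to form their longitudinal polarizations, so 8 - 3 = 5 physical states remain: h, H, A, H^+, H^-.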

One thing to keep in mind when discussing the phenomenology of these theories is the following: among the three neutral scalars, a pair ([h,A] or [H,A]) is usually very close in mass, so that the two states effectively add their signals together, which are by all means indistinguishable. Therefore, rather than discussing the search for a specific state among h, H, and A, experimentalists prefer to discuss a generic scalar \phi, a placeholder for the two degenerate states.

There are a few interesting “benchmarks” in the MSSM. One is called the “no mixing” scenario, and it is the one most frequently used by experimentalists – mainly because it is among the most accessible to direct searches. There are quite a few others – “Mh max”, “gluophobic Higgs”, “small \alpha_{eff}”… – but we need not discuss them here. What matters is that once the no-mixing scenario or any other has been selected, just two additional parameters are needed to calculate the masses and couplings of the five Higgs bosons: the mass of the A boson, m_A, and tan(\beta), the ratio of the vacuum expectation values of the two Higgs doublets.
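For readers who like to play with numbers, below is a minimal sketch (in Python) of the tree-level mass relations which follow once m_A and tan(\beta) are fixed. Be warned that real benchmark predictions include large radiative corrections – that is precisely what distinguishes “no mixing” from “Mh max” – so take this only as an illustration of how few inputs the MSSM Higgs sector needs:

    import math

    MZ = 91.19  # Z boson mass, GeV
    MW = 80.40  # W boson mass, GeV

    def tree_level_higgs_masses(mA, tan_beta):
        """Tree-level MSSM Higgs boson masses from (m_A, tan beta).
        Loop corrections, which among other things push m_h above MZ,
        are deliberately ignored: this is only an illustration."""
        cos2b = math.cos(2.0 * math.atan(tan_beta))
        s = mA**2 + MZ**2
        d = math.sqrt(s**2 - (2.0 * mA * MZ * cos2b)**2)
        mh = math.sqrt(0.5 * (s - d))    # lighter CP-even state h
        mH = math.sqrt(0.5 * (s + d))    # heavier CP-even state H
        mHpm = math.sqrt(mA**2 + MW**2)  # charged states H+, H-
        return mh, mH, mHpm

    # At large tan(beta), H and A are nearly degenerate:
    print(tree_level_higgs_masses(mA=160.0, tan_beta=50.0))
    # -> roughly (91.1, 160.1, 179.1)

This also shows explicitly the degeneracy mentioned above: at large tan(\beta) the heavy CP-even state H sits essentially on top of the A, which is why one searches for the generic \phi.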

It turns out that if tan(\beta) is large, the production rate of Higgs bosons can be hundreds of times higher than that predicted in the Standard Model! Of course, very large values of tan(\beta) have already been excluded by direct searches because of that very feature: if no Higgs bosons have been found this far, their production rate must be smaller than a certain value, and that translates into an upper bound on tan(\beta). Nonetheless, the parameter space – usually plotted as a plane with m_A on the abscissa and tan(\beta) on the y-axis – is still mostly unexplored experimentally. Below you can see the region excluded by the analysis of Conway et al. last January: it is the dark purple region in the plot, while the lighter purple marks the region CDF expected to exclude with their search. The difference is due to the fact that an excess was seen in the data!
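The shape of the excluded region is easy to understand, at least roughly: the coupling of the neutral states to down-type fermions grows approximately like tan(\beta), and rates scale with the square of the coupling, so the production rate picks up a factor of roughly tan^2(\beta) – for tan(\beta) = 30 that is already a factor of about 900. Since for each value of m_A the data cap the observable rate, each point on the m_A axis translates into a maximum allowed tan(\beta), and that is exactly the boundary of the purple region.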

3 – MSSM HIGGS PRODUCTION AND DECAY

Higgs production in the MSSM is not too special: the diagrams producing a neutral scalar (h, H, or A) are the same as in the Standard Model. However, due to the highly boosted couplings of two of these three states to down-type fermions (an increase roughly equal to tan(\beta)), two diagrams contribute the most: gluon-gluon fusion via a b-quark loop (see below, left) and direct b-quark annihilation (right). The b-quark is in fact privileged, being a down-type quark AND having a large mass.

As for the decay of these particles, the same enhancement in the couplings dictates that the most likely decay is to b-quark pairs (about 85 to 90%). The remainder is essentially a 10-15% chance of decay to tau-lepton pairs, which are also down-type fermions and also have a largish mass: 1.777 GeV, to be compared to the 3-4 GeV of b-quarks “photographed” at high Q^2. Decay rates scale with the square of the coupling, and the coupling scales with the mass: that explains the order-of-magnitude difference between the two decay rates.
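To make the last statement quantitative (using a running b-quark mass of about 3 GeV, and remembering that quarks come in three colors, which triples their rate): \Gamma(\phi \to b \bar{b}) / \Gamma(\phi \to \tau^+ \tau^-) \approx 3 \times (m_b / m_\tau)^2 \approx 3 \times (3.0/1.777)^2 \approx 8.5, which indeed corresponds to branching ratios of roughly 90% and 10%.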

Since I cannot go on to describe the analysis, I will conclude this incomplete post with a point about the parameter space. There is in fact one subtlety to mention. As tan(\beta) becomes large, the usually narrow Higgs bosons acquire a very large width. The width of a particle is the attribute which defines how close to the nominal mass the actual mass of the state can be. Now, the Higgs boson of the Standard Model has a width much smaller than 1 GeV, which is totally irrelevant when compared with the experimental resolution of the mass reconstruction. The same cannot be said of MSSM Higgs bosons if tan(\beta) is large: it is in fact the large coupling to down-type fermions that causes the large indetermination in the mass. As tan(\beta) grows beyond about 60, the coupling becomes non-calculable by perturbation theory, the width becomes really large and rather ill-determined (10 GeV and above), and the Higgs resonances lose their most significant attribute, i.e. a well-defined mass.
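To put a rough number on the statement above, here is a sketch of the leading-order width of a scalar decaying to b-quark pairs with a tan(\beta)-enhanced coupling; phase-space and QCD corrections are neglected, and the running b-quark mass of about 2.8 GeV at this scale is an assumption of mine:

    import math

    GF = 1.166e-5  # Fermi constant, GeV^-2
    MB = 2.8       # running b-quark mass at the ~150 GeV scale, GeV

    def width_phi_to_bb(m_phi, tan_beta):
        """Leading-order width of a tan(beta)-enhanced neutral scalar
        to b bbar: Gamma = 3 G_F m_phi m_b^2 tan^2(beta) / (4 sqrt(2) pi).
        Phase space and QCD corrections are neglected."""
        return (3.0 * GF * m_phi * MB**2 * tan_beta**2
                / (4.0 * math.sqrt(2.0) * math.pi))

    # At tan(beta) = 60 the width is already comparable to, or larger
    # than, the experimental mass resolution:
    print(width_phi_to_bb(150.0, 60.0))  # about 8 GeV

which is indeed in the ballpark of the 10 GeV quoted above.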

The effect discussed above has two consequences. One is that the region of parameter space corresponding to very large values of tan(\beta) is not well-defined theoretically. The other is that if one were to perform the search carefully in that region, one would need to fold the effect of the large width into the mass templates used to search for the Higgs bosons: given a mass of the A, a different mass template would then be needed for each value of tan(\beta), making the analysis quite a bit more complex. Physicists like to approximate, and they mostly get away with it when the neglected effects are small; but at large tan(\beta) the approximation fails, and a precise computation is not possible.

The bottom line is: a grain of salt is really needed when interpreting the results of an MSSM Higgs search.

Comments

1. fliptomato - October 2, 2007

Hi Tommaso — a minor point of lexicography: I think MSSM stands for “Minimal Supersymmetric Standard Model” rather than “Minimal SuperSymmetric Model.” There are certainly more `minimal’ SUSY models than the MSSM, such as the Wess-Zumino model. But the MSSM is the minimal supersymmetric model that contains the Standard Model.

Thanks for a great post!

2. dorigo - October 2, 2007

Thank you flip, I will fix it.

Cheers,
T.

3. island - October 2, 2007

The bottomline is: a grain of salt is really needed when interpreting the results of a MSSM Higgs search.

And bottle of aspirin!

for every known fermion (spin 1/2) there exists a supersymmetric partner

Yeah, right. More like, for every spin 1/2 fermion, there exists an equal amount of (negative pressure) vacuum energy that is just too “thinned-out” to be a particle.

“Antimatter” doesn’t have the characteristics of a real massive particle until you condense enough vacuum energy over a finite enough region of space to attain the matter density, and this effectively “stretches” the vacuum which pulls back harder as negative pressure increases causing the vacuum to expand at an accelerating rate.

Whoops… I’m ahead of the success of the LHC at proving that there is no freaking Higgs for the above reasons.

Oh well, the solution will sit right here and wait… *eyeroll*@discouragement of research into fundamental physics.

4. Luboš Motl - October 2, 2007

It sounds pretty good that you decided to include a crash course of MSSM. Don’t forget: if your and LHC teams won’t find SUSY by Summer 2008, I will have to sue you for negligence because I may lose a $1000 bet.

5. dorigo - October 2, 2007

Wow Lubos, that is quite a dangerous bet! Never trust the schedule of these giant experimental endeavours. My $1000 bet with Distler and Watts implies that we wait until 2 years after CMS and ATLAS have collected 10/fb of data… They were quite cautious with the stipulation, since they lose if SUSY (or other new physics) is not found at the LHC.

Cheers,
T.

6. Luboš Motl - October 2, 2007

Our original bet had the Summer 2006 deadline – delays were not taken into account. The deadline was a formality added to the conditions to make it well-defined. Of course that the moral content of the bet is whether there is low-energy supersymmetry or not and the deadline was only chosen to allow the no-SUSY side to win in principle. That’s why the shift from 2006 to 2008 was easy and when it is morally (or by partial hints) justified, I am convinced that we will be ready to shift it more.

7. Arun - October 3, 2007

I assume that if the signal for the MSSM Higgs was stronger, it would have leaked all over the place, if not been shouted from the rooftops.

8. Quasar9 - October 3, 2007

Hi Tommaso,
the search for the higgs field(s) is on
but what about dark energy – new physics is predicted
Yet there is no ‘observable’ change in light travelling thru this field?

PS – I think the expression is “a pinch of salt”

9. Tripitaka - October 5, 2007

/\ “grain of salt” is how it is expressed where I come from

10. dorigo - October 5, 2007

Hi Arun,

indeed I never said there is a signal in the new data… Let’s wait for the authors to speak though!

Quasar, about dark energy… there is a nice guest post, check the tab above.

Cheers,
T.

11. dorigo - October 5, 2007

Tripi, thanks for restoring my confidence🙂
T.

12. carlbrannen - October 8, 2007

Knowing what I do now about salaries in Italy, $1000 is quite a dangerous bet indeed.

13. dorigo - October 8, 2007

Yes Carl, but worry not – the deadline appears far, far in the future!

Cheers,
T.

14. Arun - October 9, 2007

It is a week later; no updates?

15. Arun - October 9, 2007

Is it USD 1000 or Euro 1000? In December 2003, 1000 USD == 1000 Euro. Today, 1000 USD = 713 Euro.

16. Quasar9 - October 9, 2007

lol Tipitaka, don’t know where you come from – so the expression there could be a “grain of salt”

but the English expression is a ‘pinch of salt’ as in when you cook or want to put salt on your food, you take a ‘pinch of salt’ between your thumb and finger (certainly more than a grain) and sprinkle it on.
Hence the expression to take life or a statement with a pinch of salt.
Oh, and don’t forget to throw some over your shoulder, just for luck.

However … to see the universe in a grain of sand
I guess a ‘pinch’ of sand could give us a multiverse, though perhaps not quite 10^500 universes, unless it is very ‘fine’ sand.

On the other hand the saying “not having one grain of truth” implies it is absolutely false.

All the best.

17. dorigo - October 9, 2007

Hi Arun,

no, no updates – I will contact John Conway to hear whether they have already presented their stuff. Stay tuned.

As for the bet, it is $1000, whatever the exchange rate.
Cheers,
T.

18. dorigo - October 9, 2007

Thank you for the clarification quasar9, but I am more confused than before…🙂

I guess I will use ” a pinch of salt” because I have in fact heard it in the past. I’m however still happy to know that “a grain of salt” is possible too…

Cheers,
T.

19. Arun - October 10, 2007

This Webster entry may interest you:

Main Entry: grain of salt
Part of Speech: n
Definition: skepticism, reservation
Etymology: Latin cum grano salis, based on antidote to poison needing a grain of salt to work
Usage: used with ‘with a’

“Pinch of salt” can only be a derivative.

20. roger muldavin - October 14, 2007

Much appreciation for learning that the Higgs bosons come in five types:
h, H, A (the neutrals) and H+ and H-; the separation of the neutrals into three types, that’s the added feature. I’ll track down the essay soon.

And glad to find this site for the advanced knowledge shown.

One exception: Hans Dehmelt Nobel Prize (circa 1998) conjecture that the electron is a flat equal lateral triangle (felt) with each vertex a 1/3 charge triplet with an origin “Cosmonium” subsequently in which each vertex itself divided into triplets that today reached a kth level.

Visualize the Cosmonium as a mobile attached to the ceiling, each of the initial three, n=0, next dividing n=1, dropping under force of gravity or gravity related to separation from an anti-cosmonium, and say at n=k, we have today’s touching surface electron, so we can be considered at the zero position of the whole number line for the unitary everyday electron with (1/3)^n, where n=0,1,2,3,4,…k.

Even add local spreads for the powers of (1/3) for mixing.

Heuristic is futuristic, assuming we humans survive.

Dehmelt used a Penning Trap, a vacuum chamber resonating in an electromagnetic field of some 60 mega hertz and a static magnetic field such that the “electron” would gyrate in moving up and down.

His lab measured the gyromagnetic ratio (mass rotation axial force to magnetic rotational force) out to 16 decimal places, i.e. 2.000….[16].

This was done with relatively inexpensive laboratory equipment.

Last time I checked, he is still at Univ. of Washington, Seattle, and appears to have switched fields to ancient diets, that is, amounts of essential food types. Recently another U of W, Seattle professor announced that, based on three high energy collision research results, the neutron has a negatively charged edge and center, with a positive charge in between.

Thus constructing rows of wafered triplets and Higgs pentlets has kept me somewhat energized to construct pictorially possible ways to track the Internet literature.

I ponder why Dehmelt switched fields of study.

I find very few people even appreciate his electron triplet conjecture that also included that each vertex 1/3 e+ charge (a single positron oscillated in the Penning Traps for over a month) had an energy some ten billion times that of the at-a-distance electron, which is some 1/1820 the mass of the proton.

So ever since my son at Univ. of Michigan showed me Dehmelt’s Nobel Prize paper I have been pursuing the triplets.

The first obvious connection was that the quark charts gave the “down~strange~bottom” as a triplet of e-/3s. So why not start there?

Next the connect between the regular triplet based polyhedron and polyhedra and elemental chemical nuclei. Classical Greek Polyhedrons.

Then connect the polyhedra to mathematical descriptions.

Recently, arXiv:0709.3450v1 [hep-ph] 21 Sep 2007

“Fermion Masses and Mixing from Dihedral Flavor Symmetries with Preserved Subgroups”

Still studying this essay, excellent, actually kind of fun to study for most of the material resides within the 25 pages. Group Theory.

To the polyhedra consider that the triplets can be stacked into wafers and these can be the basis of crystals.

Also the near 100 GeV thresholds offer plenty of opportunity for single brain analysis. Have inches of printout, one sided, to study.

So thanks for your collaborative efforts.

Best, rmuldavin

21. Arun - October 15, 2007

Still waiting for the news…..

22. dorigo - October 15, 2007

Arun, thank you for digging out the truth on the expression I used. I’m more than happy when people correct my English, but still happier when finally proven right🙂

And, I just emailed John to see when they will present their findings publicly.

Cheers,
T.

23. dorigo - October 15, 2007

Dear RMuldavin,

thank you for your interesting comment. I was unaware of the research you mention, and although I find it more imaginative than solid, I salute any off-stream attempts at understanding the structure of our universe with an open mind.

I will look up the work you point to. Anyway, winning a nobel prize is not a guarantee, a certification of seriousness of one’s research. And, people change. We are not the same as 10, 20, 30 years ago.
So, my evaluation of the recent works of Dehmelt would not consider his past achievements.

Cheers,
T.

24. dorigo - October 15, 2007

Arun, all,

I will be able to post about the updated result on the MSSM h into tau pairs in two or three days, but while you wait, why not look for hints in the wine and cheese talk on the Hb->bbb search by Thomas Wright from last September (which I linked already somewhere else, but here it is:
http://theory.fnal.gov/jetp/talks/wright.pdf )?

Cheers,
T.


