
New Higgs limits with taus from D0
July 31, 2007

Posted by dorigo in news, physics, science.

The D0 collaboration has just finished a nice new analysis of 1 inverse femtobarn of Run II data, looking for direct production of Higgs bosons with a decay to W boson pairs.

At the Tevatron proton-antiproton collider, Higgs bosons are mainly produced, if they exist, in two ways: directly in the hard scattering of the colliding bodies, or in associated production, where the Higgs is radiated off a vector boson. Direct p \bar p \rightarrow H production is ten times more “frequent” (say, twice a day on a good day of running!) than associated p \bar p \rightarrow WH or p \bar p \rightarrow ZH production, but the two processes have different strengths, and both need to be exploited if the Tevatron experiments are to stand a chance of discovering the particle before the LHC does.

The slide below, which I dug out of a talk I gave in La Thuile a couple of years ago, shows all there is to know about Higgs production at the Tevatron: the top right plot shows the various decay modes of the Higgs boson as a function of its mass, and their relative frequency; the plot on the lower left shows the production rate of Higgs bosons in direct and associated modes, and the one on the lower right shows the Feynman diagrams for direct (top) and associated (bottom) production.

As you can see, direct Higgs production (red curve in the plot above) is about ten times more frequent than associated production (blue and green curves for WH and ZH, respectively), but it is hard to exploit unless the Higgs has a mass of at least 140 GeV: below that value, its decay to b-quark pairs makes it utterly indistinguishable from the “QCD background”, the continuum production of b-quark pairs by the strong force. Conversely, associated production is less frequent, but it fills the gap at low mass with several possible signatures which retain discovery potential.

For Higgs masses close to and above 160 GeV, the possibility of materializing into a pair of W bosons makes the search for direct production appealing, and it is indeed in that ballpark that the Tevatron will most likely soon start setting real limits on the existence of H. In order to do that, CDF and D0 will have to use all possible search methodologies and combine their results.

Today’s news is that D0 has shown how to search for H \rightarrow WW decays by allowing one of the W bosons to decay into a tau lepton and a tau neutrino, a difficult signature. The final state they sought includes a muon from one W, a tau from the other, and missing transverse energy caused by the energetic neutrinos produced in the W decays. The recent analysis proves to be not far from the sensitivity of the more “gold-plated” modes involving only electrons and muons. What’s more, the tau lepton is explicitly identified through its decay to a hadronic jet, so the event candidates form a set completely orthogonal to those considered for the gold-plated modes.

Tau leptons are indeed fascinating, difficult animals from an experimental point of view. At 1.777 GeV, the mass of these “fat leptons” is large enough to allow a fast decay to either hadrons or lighter leptons via the weak force. If you take the trouble to visit the Particle Data Group web site (no trouble at all, it is a great site!), you will discover that taus have a very complex set of possible decay modes. A total of 203 possible final states have been either measured or constrained by past studies! The most important of them from an experimental standpoint are cataloged by the type and number of charged particles produced (a quick tally follows the list below):

  • fully leptonic modes: \tau \rightarrow \mu \nu_\mu \nu_\tau or \tau \rightarrow e \nu_e \nu_\tau – these account for about 17% of the total each;
  • “one prong” modes: \tau \rightarrow \pi \nu_\tau, K \nu_\tau, and the like, with a single charged hadron plus possible neutral pions – for a total of about 50%;
  • “three prong” modes: \tau \rightarrow h^+ h^- h^+ \nu_\tau (where h is a charged hadron) – for another 15%.
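
For orientation, here is the quick tally promised above: a few lines of Python adding up the rounded fractions quoted in this post (the precise PDG values differ slightly).

# Rounded tau branching fractions as quoted above (not precise PDG values).
modes = {
    "tau -> mu nu nu (leptonic)":           0.17,
    "tau -> e nu nu  (leptonic)":           0.17,
    "one prong  (1 charged hadron + pi0s)": 0.50,
    "three prong (3 charged hadrons)":      0.15,
}
for name, frac in modes.items():
    print(f"{name:40s} {frac:5.0%}")
print(f"{'total':40s} {sum(modes.values()):5.0%}")  # ~99%; rare modes fill the rest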

Of course, D0 did not look for the 203 final states individually, but they did separate the tau candidates into three main categories: narrow jets with a single charged track and no signal of additional electromagnetic deposits (which are usually due to neutral pions); narrow jets with a single track and additional deposits; and narrow jets with two or three charged tracks (with total invariant mass below that of the tau lepton!). Then they taught a neural network classifier to distinguish real tau jets from ordinary jets due to quark hadronization, the large background that hadronically-decaying taus have to fight against. They found that the third category – jets with two or three tracks – is tougher to discriminate, and they left it unused in the present analysis. Tau candidates belonging to the first and second categories were instead analyzed with the network, and the outputs were used together with additional information from the narrow jet to construct two different “tau likelihoods”.
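
To make the categorization concrete, here is a minimal Python sketch of the three tau-candidate types described above. The variable names and the 1 GeV electromagnetic-deposit threshold are my own inventions for illustration; the actual D0 selection uses calorimeter-cluster shapes and a trained neural network with many more inputs.

from dataclasses import dataclass

M_TAU = 1.777  # tau lepton mass in GeV

@dataclass
class TauCandidate:
    n_tracks: int      # charged tracks inside the narrow jet
    em_energy: float   # additional electromagnetic deposit in GeV (from pi0s)
    track_mass: float  # invariant mass of the track system in GeV

def tau_category(cand: TauCandidate) -> int:
    """Assign a D0-style tau-candidate type (0 = none of the three).
    Type 1: single track, no extra EM deposit (e.g. tau -> pi nu)
    Type 2: single track plus EM deposits    (e.g. tau -> pi pi0 nu)
    Type 3: two or three tracks, mass below the tau mass (three prong)
    """
    if cand.n_tracks == 1:
        return 1 if cand.em_energy < 1.0 else 2  # the 1 GeV cut is hypothetical
    if cand.n_tracks in (2, 3) and cand.track_mass < M_TAU:
        return 3
    return 0

# Example: one track plus a 3 GeV pi0-like deposit falls in the second category.
print(tau_category(TauCandidate(n_tracks=1, em_energy=3.0, track_mass=0.8)))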

Incidentally, it is amazing to see the giant steps forward we have taken in particle physics with advanced analysis methods. When I was a student, neural networks were frowned upon as a means of signal discrimination, and likelihood techniques were considered very suspicious too. The thought of combining the output of a NN with other information into a global likelihood in a discovery search would have been considered delirious 15 years ago! These methods are now commonplace, and results obtained with them are routinely approved by the collaborations…

Anyway, back to the neat D0 analysis: the tau likelihoods were only a part of the discrimination process. In fact, H \rightarrow WW decays need to be distinguished from generic WW pairs and several other backgrounds, including W + jets QCD production. The many kinematic observables of the event (angle between leptons, missing transverse energy, lepton momenta) were used to cook up another global event likelihood, and the final selection retained a slice of the plane of tau and kinematic likelihoods.
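
Schematically, each event ends up as a point in the (tau likelihood, kinematic likelihood) plane, and the selection keeps a signal-enriched slice of it. Here is a toy sketch of the idea, with invented distributions and a simple diagonal boundary standing in for D0’s actual dividing line:

import numpy as np

rng = np.random.default_rng(42)
# Toy per-event likelihoods in [0, 1]: signal tends toward (1, 1),
# background toward (0, 0).  Purely illustrative shapes.
signal = rng.beta(5.0, 2.0, size=(1_000, 2))      # columns: (L_tau, L_kin)
background = rng.beta(2.0, 5.0, size=(10_000, 2))

def in_signal_region(l_tau, l_kin, offset=1.1):
    """Keep events above the line L_tau + L_kin = offset (a made-up boundary)."""
    return l_tau + l_kin > offset

sig_eff = in_signal_region(signal[:, 0], signal[:, 1]).mean()
bkg_eff = in_signal_region(background[:, 0], background[:, 1]).mean()
print(f"signal efficiency: {sig_eff:.2f}, background efficiency: {bkg_eff:.3f}")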

The plot above shows the distribution of simulated background and signal events in the plane for the second tau-candidate category, with a dashed line dividing the signal-enriched region from the discarded region. The green and red rectangles describe the relative frequency of the main backgrounds in the plane, while the hatched black rectangles show the signal distribution.

At the end of the game it always boils down to a counting experiment, and the D0 analysis is no exception. The number of events remaining in the data after cutting on the likelihood plane was compared with background expectations to derive a limit on Higgs production as a function of the unknown Higgs boson mass (which impacts the result through a variation of the selection acceptance).
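
For the curious, here is roughly what such a counting-experiment limit looks like in code. This is a minimal sketch using the classic flat-prior Poisson formula (Helene’s formula); D0’s actual limit-setting machinery is a modified-frequentist procedure with systematic uncertainties folded in, and all the numbers below are invented.

from math import exp

def poisson_cdf(n: int, mu: float) -> float:
    """P(k <= n) for a Poisson distribution of mean mu, by direct summation."""
    term = total = exp(-mu)
    for k in range(1, n + 1):
        term *= mu / k
        total += term
    return total

def upper_limit(n_obs: int, bkg: float, cl: float = 0.95) -> float:
    """Upper limit on the signal mean s at confidence level cl, given n_obs
    observed events and expected background bkg.  Solves Helene's formula
    P(k <= n_obs | s + bkg) / P(k <= n_obs | bkg) = 1 - cl by bisection."""
    denom = poisson_cdf(n_obs, bkg)
    lo, hi = 0.0, 100.0
    for _ in range(60):
        s = 0.5 * (lo + hi)
        if poisson_cdf(n_obs, s + bkg) / denom > 1.0 - cl:
            lo = s  # ratio still too large: the limit lies above s
        else:
            hi = s
    return 0.5 * (lo + hi)

# Invented example: 10 events observed on an expected background of 9.
s_up = upper_limit(10, 9.0)
eff, lumi = 0.01, 1000.0  # made-up acceptance x efficiency, luminosity in pb^-1
print(f"s < {s_up:.1f} events at 95% CL => sigma x BR < {s_up/(eff*lumi):.2f} pb")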

The final result can be seen in the plot above, which shows in picobarns the observed cross section limit (red curve) as a function of the Higgs boson mass. The blue curve shows the limit D0 expected to observe with 1/fb worth of data, and the grey band shows 1-sigma variations around the blue curve. The fact that the red curve lies within the grey band does not mean much, but it confirms that what D0 sees is no surprise.
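
The expected curve and its band are, conceptually, the median and spread of limits obtained in background-only pseudo-experiments. A toy version, again with invented numbers and the same Helene-style limit as in the previous sketch, here via scipy:

import numpy as np
from scipy.stats import poisson

def upper_limit(n_obs: int, bkg: float, cl: float = 0.95) -> float:
    """Same flat-prior Poisson limit as in the previous sketch, via scipy."""
    denom = poisson.cdf(n_obs, bkg)
    lo, hi = 0.0, 100.0
    for _ in range(60):
        s = 0.5 * (lo + hi)
        if poisson.cdf(n_obs, s + bkg) / denom > 1.0 - cl:
            lo = s
        else:
            hi = s
    return 0.5 * (lo + hi)

def expected_band(bkg: float, n_toys: int = 2000, seed: int = 1):
    """Median expected limit and 1-sigma band from background-only toys."""
    rng = np.random.default_rng(seed)
    limits = [upper_limit(int(n), bkg) for n in rng.poisson(bkg, n_toys)]
    return np.percentile(limits, [16, 50, 84])  # -1 sigma, median, +1 sigma

lo, med, hi = expected_band(9.0)  # invented background expectation
print(f"expected limit: {med:.1f} events, 1-sigma band [{lo:.1f}, {hi:.1f}]")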

It will be very interesting to see the many new results on Higgs searches by CDF and D0 combined in a single exclusion plot this summer. In the region around 160 GeV, I believe the Tevatron should be able to set limits at no more than twice the SM expectation…

To find out more about this nice new analysis, please visit the analysis page on the D0 site or read the preprint.

Comments

1. jeff - July 31, 2007

Hi Tommaso
it is a little exaggerated to say that 15 years ago it would have been considered “delirious” to combine neural nets with other information. Even the expression “frowned upon” is an unhappy one. I do think there was “suspicion”. But the word “suspicion” shouldn’t raise suspicion. Neural nets are tricky.

2. dorigo - July 31, 2007

Hello Jeff,

(for others: Jeff and I collaborated in 1992 on a top search analysis which did use neural networks! But eventually we avoided NNs and used simpler kinematical cuts to produce our first observation of all-hadronic top decays in 1997…)

I agree to some extent; neural networks were already in use back then (even in a level-2 trigger! In that respect CDF was indeed at the forefront…). However, my point is that a discovery would not have been based on their massive usage without raising pure hell in the review process. And taking a NN output as an input to a relative likelihood would indeed have been hard to digest – remember the Leone/Grassmann/Cobal analysis of single-lepton top with a kinematical likelihood? Their methodology was only accepted years after the top discovery (when Monte Carlo simulations had, to be fair, become better tools).

Cheers,
T.

3. jeff - July 31, 2007

Grassmann methodology?
How to rub hundreds of people the wrong way.

4. dorigo - July 31, 2007

Well yes, he was such a pain in the butt… Which, in a large collaboration, totally offsets *any* scientific merit or intuition.

Cheers,
T.

5. Kea - July 31, 2007

Gee, thanks for going to all this trouble to tell us this. Any news on this subject is very exciting! I find it mindboggling how a Tevatron team can sort out something so complicated that any one person cannot possibly be familiar with all the details. It is so very different to any of the science that I have ever done.

6. Frank - August 1, 2007

Just in case you are open to new, strictly gut-felt new physics theory on what the Higgs particle is—please click on http://www.geocities.com/CapeCanaveral/Hall/2638/HiggsParticleIsFound.doc

7. dorigo - August 1, 2007

Hi Kea, no trouble at all… It is a way to keep in touch with the most recent developments.
Indeed, most analyses have so many facets that you cannot be in touch with all of them. Even PhD students who do most of the work on an analysis have to rely on algorithms, detector components, and triggers in the development of which they had little or no part.
Things are just too complicated…
Cheers,
T.

8. Alejandro Rivero - August 2, 2007

#6 “strictly gut-felt”… It took me a while to understand it meant really gut-felt, not SO(10)-felt or E6-felt.

