
Higgs decays to photon pairs! March 4, 2009

Posted by dorigo in news, physics, science.

It was with great pleasure that I found yesterday, on the public page of the DZERO analyses, a report on their new search for Higgs boson decays to photon pairs. On that quite rare decay process (along with another non-trivial one, the H \to \tau \tau reaction) the LHC experiments base their hopes to see the Higgs boson if that particle has a mass close to the LEP II upper bound, i.e. not far from 115 GeV. And this is the first high-statistics search for the SM Higgs in that final state to obtain results that are competitive with the more standard searches!

My delight increased when I saw that the results of the DZERO search are based on a data sample corresponding to a whopping 4.2 inverse femtobarns of integrated luminosity. This is the largest set of hadron-collider data ever used for an analysis. 4.2 inverse femtobarns correspond to about three hundred trillion collisions sorted out by DZERO. Of course, both DZERO and CDF have so far collected more data than that: almost five inverse femtobarns. However, it always takes some time before calibration, reconstruction, and production of the newest datasets is performed… DZERO is catching up nicely with the accumulated statistics, it appears.

The most interesting few tens of billions or so of those events have been fully reconstructed by the software algorithms, identifying charged tracks, jets, electrons, muons, and photons. Yes, photons: quanta of light, only very energetic ones: gamma rays.

When photons have an energy exceeding a GeV or so (i.e. one corresponding to a proton mass or above), they can be counted and measured individually by the electromagnetic calorimeter. One must look for very localized energy deposits which cannot be spatially correlated with a charged track: something hits the calorimeter after crossing the inner tracker, but no signal is found there, implying that the object was electrically neutral. The shape of the energy deposition then confirms that one is dealing with a single photon, and not -for instance- a neutron, or a pair of photons traveling close to each other. Let me expand on this for a moment.

Background sources of photon signals

In general, every proton-antiproton collision yields dozens, or even hundreds, of energetic photons. This is not surprising, as there are multiple significant sources of GeV-energy gamma rays to consider.

  1. Electrons, as well as in principle any other electrically charged particle emitted in the collision, have the right to produce photons by the process called bremsstrahlung: by passing close to the electric field generated by a heavy nucleus, the particle emits electromagnetic radiation, thus losing a part of its energy. Note that this is a process which cannot happen in vacuum, since there are no target nuclei there to supply the electric field with which the charged particle interacts (one can in principle have bremsstrahlung also in the presence of neutral particles, since what matters is the capability of the target to absorb a part of the colliding body’s momentum; but in that case one needs a more complicated scattering process, so let us forget about it). For particles heavier than the electron, the process is suppressed except at the very highest energies (where particle masses are irrelevant with respect to their momenta), and is only worth mentioning for muons and pions in heavy materials.
  2. By far the most important process for photon creation at a collider is the decay of neutral hadrons. A high-energy collision at the Tevatron easily yields a dozen neutral pions, and these particles decay more than 99% of the time into pairs of photons, \pi^\circ \to \gamma \gamma. Of course, these photons would only have an energy equal to half the neutral pion mass (about 0.07 GeV) if the neutral pions were at rest; it is only through the large momentum of the parent that the photons may be energetic enough to be detected in the calorimeter.
  3. A similar fate to that of neutral pions awaits other neutral hadrons heavier than the \pi^\circ: most notably the particle called eta, in the decay \eta \to \gamma \gamma. The eta has a mass four times larger than that of the neutral pion, and is less frequently produced.
  4. And other hadrons may produce photons in de-excitation processes, albeit not in pairs: excited hadrons often decay radiatively into their lower-mass brothers, and the radiated photon may display a significant energy, again critically depending on the parent’s speed in the laboratory.
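Point 2 above is worth a quick numerical aside. The lab-frame photon energies from a boosted \pi^\circ follow from a simple Lorentz transformation, E_\gamma = (E_\pi/2)(1 \pm \beta \cos\theta^*). Here is a minimal Python sketch (the function name and interface are mine, purely for illustration):

```python
import math

M_PI0 = 0.1349766  # neutral pion mass in GeV (PDG value)

def photon_energies(e_pi, cos_theta_star):
    """Lab-frame energies of the two photons from a pi0 -> gamma gamma decay.

    e_pi: pion energy in GeV (must be >= the pion mass);
    cos_theta_star: decay angle in the pion rest frame,
    measured with respect to the boost direction.
    """
    beta = math.sqrt(1.0 - (M_PI0 / e_pi) ** 2)  # pion velocity in units of c
    e1 = 0.5 * e_pi * (1.0 + beta * cos_theta_star)
    e2 = 0.5 * e_pi * (1.0 - beta * cos_theta_star)
    return e1, e2

# A pion at rest gives two photons of ~0.067 GeV each; a 10 GeV pion
# decaying along its flight direction gives one photon nearly all the energy.
```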

All in all, that’s quite a handful of photons our detectors are showered with on an event-by-event basis! How the hell can DZERO then sort out, amidst over three hundred trillion collisions, the maybe five or ten which saw the decay of a Higgs to two photons?

And the Higgs signal amounts to…

Five to ten events. Yes, we are talking of a tiny signal here. To eyeball how many standard model Higgs boson decays to photon pairs we may expect in a sample of 4.2 inverse femtobarns, we make some approximations. First of all, we take a 115 GeV Higgs as a reference: that is the Higgs mass where the analysis should be most sensitive, if we accept that the Higgs cannot be much lighter than that. For heavier Higgses the expected number decreases, because the heavier a particle is, the less frequently it is produced.

The cross section for the direct-production process p \bar p \to H + X (where with X we denote our unwillingness to specify whatever else may be produced together with the Higgs) is, at the Tevatron collision energy of 1.96 TeV, of the order of one picobarn. I am purposely avoiding fetching a plot of the cross section versus mass to give you the exact number: it is in that ballpark, and that is enough.

The other input we need is the branching ratio of the H decay to two photons. This is the fraction of disintegrations yielding the final state that DZERO has been looking for. It depends on the detailed properties of the Higgs particle, which couples to other particles in proportion to their mass. The larger a particle’s mass, the stronger its coupling to the Higgs, and the more frequent the H decay into a pair of those: each partial width depends on the squared mass of the particle. But since the sum of all branching ratios is one -if we say the Higgs decays, then there is a 100% chance of its decaying into something, no less and no more!- any single branching fraction depends on ALL the other particle masses!
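Since each branching fraction is a partial width divided by the total width, changing any one coupling reshuffles all the fractions. A toy sketch makes the point (the widths below are made-up numbers, not real Higgs partial widths):

```python
def branching_ratios(partial_widths):
    """BR_i = Gamma_i / sum_j Gamma_j: each fraction depends on ALL widths."""
    total = sum(partial_widths.values())
    return {mode: width / total for mode, width in partial_widths.items()}

# Illustrative partial widths in arbitrary units (NOT real Higgs numbers):
brs = branching_ratios({"bb": 100.0, "tautau": 10.0, "gammagamma": 0.1})
# Doubling the bb width would dilute the gamma-gamma branching ratio,
# even though Gamma(gamma gamma) itself stays exactly the same.
```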

“Wait a minute,” I would like to hear you say now, “the photon is massless! How can the Higgs couple to it?!” Right: H does not couple directly to photons, but it can nevertheless decay into them via a virtual loop of electrically charged particles. Just as happens when your US plug won’t fit into a European AC outlet: you do not despair, and insert an adaptor, something endowed with the right holes on one side and pins on the other. Much in the same way, a virtual loop of top quarks, for instance, will do a good job: the top has a large mass -so it couples aplenty to the Higgs- and it has an electric charge, so it is capable of emitting photons. The three dominant Feynman diagrams for the H \to \gamma \gamma decay are shown above: the first two of them involve a loop of W bosons, the third a loop of top quarks.

So, how much is the branching ratio to two photons in the end? It is a complicated calculation, but the result is roughly one thousandth. One in a thousand low-mass Higgses will disintegrate into energetic light: two angry gamma rays, each roughly carrying the kinetic energy of a 2-milligram mosquito launched at the whopping speed of four inches per second toward your buttocks.
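We can sanity-check the mosquito analogy with a few lines of arithmetic (my own back-of-envelope numbers: a 57.5 GeV photon, i.e. half of a 115 GeV Higgs decaying at rest, versus the 2 mg mosquito at 4 inches per second):

```python
GEV_TO_JOULE = 1.602176634e-10  # 1 GeV expressed in joules

e_photon = 57.5 * GEV_TO_JOULE        # half of a 115 GeV Higgs, in J
m_mosquito = 2e-6                      # 2 milligrams, in kg
v_mosquito = 4 * 0.0254                # 4 inches per second, in m/s
ke_mosquito = 0.5 * m_mosquito * v_mosquito ** 2  # classical kinetic energy

# Both come out at about 1e-8 joules: the analogy holds to within a factor 2.
```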

Now we have all the ingredients for our computation of the number of signal events we may be looking at, amidst the trillions produced. The master formula is just

N = \sigma L B

where N is the number of decays of the kind we want, \sigma is the production cross section for Higgs at the Tevatron, L is the integrated luminosity on which we base our search, and B is the branching ratio of the decay we study.

With \sigma = 1 pb, L = 4.2 fb^{-1} = 4200 pb^{-1}, and B = 0.001, the result is, guess what, 4.2 events. 4.2 in three hundred trillion. A needle in the haystack is a kids’ game in comparison!
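The arithmetic of the master formula, spelled out as a trivial sketch (the function name is mine; the key point is keeping the cross section and luminosity in matching units):

```python
def expected_events(sigma_pb, lumi_inv_pb, branching):
    """N = sigma * L * B, with the cross section in pb and L in pb^-1."""
    return sigma_pb * lumi_inv_pb * branching

# sigma ~ 1 pb, L = 4.2 fb^-1 = 4200 pb^-1, B ~ 0.001:
n = expected_events(sigma_pb=1.0, lumi_inv_pb=4200.0, branching=0.001)
# n comes out to about 4.2 expected signal events
```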

The DZERO analysis

I will not spend much of my and your time discussing the details of the DZERO analysis here, primarily because this post is already rather long, but also because the analysis is pretty straightforward to describe at an elementary level: one selects events with two photons of suitable energy, computes their combined invariant mass, and compares the expectation for Higgs decays -a roughly bell-shaped curve centered at the Higgs mass with a width of ten GeV or so- with the expected backgrounds from all the processes capable of yielding pairs of energetic photons, plus all those yielding fake photons. [Yes, fake photons: of course the identification of gamma rays is not perfect -one may have failed to detect a charged track pointing at the calorimeter energy deposit, for instance.] Then, a fit of the mass distribution extracts an upper limit on the number of signal events that may be hiding there. From the upper limit on the signal size, an upper limit is obtained on the signal cross section.

Ok, the above was a bit too quick. Let me be slightly more analytic. The data sample is collected by an online trigger requiring two isolated electromagnetic deposits in the calorimeter. Offline, the selection requires that both photon candidates have a transverse energy exceeding 25 GeV, and that they be isolated from other calorimetric activity -a requirement which removes fake photons due to hadronic jets.

Further, there must be no charged tracks pointing close to the deposit, and a neural-network classifier is used to discriminate real photons from backgrounds using the shape of the energy deposition and other photon quality variables. The NN output is shown in the figure below: real photons (described by the red histogram) cluster on the right. A cut on the data (black points) of a NN output larger than 0.1 accepts almost all signal and removes 50% of the backgrounds (the hatched blue histogram). One important detail: the shape of the NN output for real high-energy photons is modeled by Monte Carlo simulations, but is found in good agreement with that of real photons in radiative Z boson decay processes, p \bar p \to l^+ l^- \gamma. In those processes, the detected photon is 100% pure!

After the selection, surviving backgrounds are due to three main processes: real photon pairs produced by quark-antiquark interactions, Compton-like gamma-jet events where the jet is mistaken for a photon, and Drell-Yan processes yielding two electrons, both of which are mistaken for photons. You can see the relative importance of the three sources in the graph below, which shows the diphoton invariant mass distribution for the data (black dots) compared to the sum of backgrounds. Real photon pairs are in green, Compton-like gamma-jet events are in blue, and the Drell-Yan contribution is in yellow.

The mass distribution has a very smooth exponential shape, and to search for Higgs events DZERO fits the spectrum with an exponential, masking off a signal window where Higgs decays may contribute. The fit is then extrapolated into the signal window, and a comparison with the data found there provides the means for a measurement; different signal windows are assumed to search for different Higgs masses. Below are shown four different hypotheses for the Higgs mass, ranging from 120 to 150 GeV in 10-GeV intervals. The expected signal distribution, shown in purple, is multiplied by a factor of 50 in the plots, for display purposes.
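The fit-and-extrapolate idea can be illustrated on a toy spectrum. This is not the DZERO code, just a sketch of the sideband method on a fake exponential background, with a hypothetical signal window at 120-140 GeV:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

# Toy diphoton mass spectrum: a smoothly falling exponential background.
masses = rng.exponential(scale=30.0, size=100_000) + 70.0  # GeV
counts, edges = np.histogram(masses, bins=60, range=(70.0, 250.0))
centers = 0.5 * (edges[:-1] + edges[1:])

def expo(m, n0, slope):
    return n0 * np.exp(-slope * m)

# Mask off a hypothetical signal window around m_H = 130 GeV...
window = (centers > 120.0) & (centers < 140.0)
popt, _ = curve_fit(expo, centers[~window], counts[~window], p0=(1e5, 0.03))

# ...and extrapolate the sideband fit into it to predict the background.
expected_bkg = expo(centers[window], *popt).sum()
observed = counts[window].sum()
# A significant excess of `observed` over `expected_bkg` would hint at a signal.
```

In the real analysis the comparison is of course done with a proper statistical treatment, yielding upper limits rather than a simple event count.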

From the fits, a 95% upper limit on the Higgs boson production cross section is extracted by standard procedures. As is by now commonplace, the cross-section limit is displayed divided by the expected standard model Higgs cross section, to show how far one is from excluding the SM-produced Higgs at any mass value. The graph is shown below: readers of this blog may by now recognize at first sight the green 1-sigma and yellow 2-sigma bands showing the expected range of limits that the search was predicted to set. The actual limit is shown in black.

One notices that while this search is not sensitive to the Higgs boson yet, it is not so far from it any more! The LHC experiments will have a large advantage with respect to DZERO (and CDF) in this particular business, since there the Higgs production cross section is significantly larger. Backgrounds are also larger, however, so a detailed understanding of the detectors will be required before such a search is carried out with success at the LHC. For the time being, I congratulate my DZERO colleagues for pulling off this nice new result!

Comments

1. Daniel de França MTd2 - March 4, 2009

I see that it is not possible to find the mass of the Higgs based on these 5 to 10 signals due to the background, but is it possible to say what is the mass of these signals, individually, considering they are Higgs?

dorigo - March 4, 2009

Daniel, which 5 or 10 events ? For each event one can compute the mass. The histograms shown in the plots display the mass values of events in the selected sample. The sample is compatible with containing only background. The statistical tests show that if there was any Higgs signal, it would be smaller in size than about 50 events (based on the fact that they expect, after analysis cuts, about 2.5 signal events, and they set a limit on the production at x20 times the SM). So just pick 50 entries from the mass histogram and call them Higgs, if you want. You have complete freedom to decide which ones to pick.

Cheers,
T.

3. Daniel de França MTd2 - March 4, 2009

I see. You said these events:

“Five to ten events. Yes, we are talking of a tiny signal here.”

dorigo - March 4, 2009

I was talking of the expected signal. This means 5 or 10 events in 300 trillions. Then you select as well as you can, and you end up with a few thousand events, with 2.5 expected. Immersed in background. In HEP, you can NEVER say that one event comes from a process rather than another with absolute certainty. In this case, the probability is not 90%, nor 1%, but rather one in a thousand or less, IF one buys that the signal is indeed there -which nobody warrants.
T.

4. Daniel de França MTd2 - March 4, 2009

I understand what you say. But why be so enthusiastic if almost nothing could possibly be found, as you said? I really thought that some real events were found, somehow, not just background.

dorigo - March 4, 2009

I am enthusiastic about this search because, for the first time, I see the search for H to two photons becoming concrete, something I had always been skeptical about. Of course there are real events, but nobody will ever be able to say which ones are signal. Still, the signal may appear as a property of the dataset in its totality. We will some day be able to touch it with our hands, though not event by event.

Cheers,
T.

5. Daniel de França MTd2 - March 4, 2009

I don’t understand why you don’t keep your skepticism. You really said this is not sensitive yet to the Higgs boson.

6. Daniel de França MTd2 - March 4, 2009

If it were magically possible to increase the cross section of the Higgs, how many Higgses should be found among these 3×10^14 events so that one could find it?

7. Daniel de França MTd2 - March 4, 2009

If it were magically possible to increase the cross section of the Higgs, how many Higgses should be found among these 3×10^14 events so that one could find it, or at least see anything above the background?

8. dorigo - March 4, 2009

#9: I also said, Daniel, that this makes me less skeptical about the LHC seeing it, not about DZERO!

#10,11: well, increasing the Higgs cross section magically is something I would rather not discuss in this blog. However, if I were a magician then I would make the Higgs decay 100% of the time to a same-sign muon-electron pair. That would allow me to see a signal of very few events even amidst twice as much data…

9. Luboš Motl - March 4, 2009

Hi Tommaso, that’s a very interesting and promising strategy!

How many more events – total integrated luminosity (divided by what has been used in this work) – would you need to exclude some big mass intervals at 5 sigma, assuming that the signal would remain proportional to what you drew on the last picture in your post?

10. dorigo - March 4, 2009

Hi Lubos,

the paper quotes, at 130 GeV, 2 signal events expected from the SM and 615 ± 10 events from backgrounds. Under such circumstances, it would take an enormous amount of data to exclude the SM cross section even just at 95% CL, because the signal size is five times smaller than the background uncertainty…

Imagine, for instance, that you multiply the data by a million: you would get 615 million background events, and two million signal events. The Poisson fluctuation of the background would be of “only” 25,000 events, but the systematic uncertainty on the background would still be ±10,000,000: five times larger than the “excess” of 2M events. One would have an excess of 0.2 sigma, and no statistical increase could ever make up for that.

Fortunately, systematic uncertainties also scale down with the amount of data one processes -to some extent. However, until they decrease to a relative value which is much smaller than the relative contamination from the searched signal, the search will not be able to obtain significant results, either on its existence or its absence.

Some believe in an N^{-1/4} scaling law for the relative size of systematics, i.e., take 16 times more data and the relative size of the systematics will decrease by half. If that were the case for this analysis, one would get, in 420/fb of data (say), 2000 signal events, 615,000 background, a Poisson error of about ±785, and a systematic error of about ±55 events. That would be just a 2.5-sigma effect, in a dataset a hundred times larger!
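The saturation effect is easy to reproduce numerically with a naive s/sqrt(b + syst²) significance estimate (my own illustrative sketch, using the 130 GeV numbers quoted above):

```python
import math

def significance(s, b, rel_syst):
    """Naive s / sqrt(b + (rel_syst * b)^2) significance estimate."""
    return s / math.sqrt(b + (rel_syst * b) ** 2)

# The 130 GeV numbers quoted above: 2 signal events on 615 +- 10 background,
# i.e. a ~1.6% relative systematic uncertainty on the background.
base = significance(2.0, 615.0, 10.0 / 615.0)

# Multiply the dataset by a million: the Poisson term grows like sqrt(N),
# but the absolute systematic grows like N, so the significance saturates
# at about 0.2 sigma no matter how much data is added.
scaled = significance(2.0e6, 615.0e6, 10.0 / 615.0)
```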

This search, in other words, is only capable of setting limits on a signal much larger than the SM expected Higgs, at the Tevatron. But at the LHC, the S/N is much larger…

Cheers,
T.

11. Daniel de França MTd2 - March 4, 2009

Hi Tommaso, my question #10,11 was serious and it was the same as Lubos’. :p If it was a joke, I wouldn’t ask for the least thing to find…

12. dorigo - March 4, 2009

Hmmm you asked to increase the cross section of the signal, not the whole dataset… That would indeed be magical! However, sorry if I misunderstood you…
T.

13. Luboš Motl - March 4, 2009

Tommaso, thanks, I see. I will avoid further words, attempting not to sound like Daniel #7.😉

14. carlbrannen - March 5, 2009

I have to admit that the first time I scanned through this I thought it was announcing the discovery of the Higgs. This was partly from the enthusiasm and partly from the first graph, which shows such a big bump on the right. It was only when I looked carefully at the four diagrams showing “data” “background” and “signal” and then realized that it actually said “signal x 50” that I realized I’d misinterpreted it.

15. The Large Hadron Collider, Tiny Black Holes, & The Higgs Boson, oh my! » 27keith.com - - June 3, 2009

[…] higgs particle decays into photon pairs, that is 1 higgs automatically becomes 2 photons  (It could be 4, I need to verify my source on […]

