
New limits on scalar leptoquarks from D0 August 2, 2007

Posted by dorigo in news, physics, science.

Leptoquarks are exotic bosons hypothesized by several theoretical extensions of the Standard Model. These particles would be produced singly or in pairs in proton-antiproton collisions, for example through processes such as the ones pictured in the diagrams below.

In the first, a pair of quarks from the incoming proton and antiproton “feel” each other by exchanging a lepton: at each quark-lepton vertex the quark becomes a leptoquark. In the second, quark-antiquark annihilation yields a very energetic gluon, which materializes into a pair of leptoquarks. We are talking of heavy objects here, and the processes just described are therefore rare. There is no a priori reason for leptoquarks to be heavy, but lower limits in the 100-200 GeV range have been set by previous searches.

The decay of a hypothetical leptoquark is very fast, and it yields a lepton (e, \mu, \tau or their neutrinos) together with a quark of the same generation. This latter constraint does not seem to have a strong theoretical motivation, but it is forced by the lack of experimental observation of flavor-changing neutral currents – that is, no quark is observed to decay to another quark of the same charge. Model builders look more and more like tightrope walkers these days…

Anyway, because of the lack of flavor-changing neutral currents, one is also led to consider three different leptoquarks: a first-, second-, and third-generation one, each coupling to the fermions of the corresponding generation: LQ1 would couple to the quartet of first-generation fermions (\nu_e, e, u, d), LQ2 to (\nu_\mu, \mu, c, s), and LQ3 to (\nu_\tau, \tau, t, b). The same rule dictates that a proton-antiproton collision does not create pairs of leptoquarks of different generations – that is, unless one considers fancy, rare processes.

All the above is to say that experimental searches at the Tevatron usually look separately for signatures of leptoquarks of different generations. The recent analysis by the D0 collaboration is no exception: it considers the pair production of third-generation leptoquarks, which would yield four third-generation particles; a possible final state is thus composed of two b-quarks and two tau leptons. The recently perfected tools to identify tau leptons at D0 allow that final state to be sought with good discrimination power and relatively small backgrounds.

D0 can reconstruct muons with high acceptance and good efficiency. Since the tau lepton has a 17% probability of decaying into a muon and two neutrinos, it makes sense to insist that one of the two taus in the final state produces a muon, while the other tau is sought in its more frequent, but more background-ridden, hadronic decay mode: the mixed signature is a good compromise between background reduction and signal rate.
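A rough counting exercise shows why, using the approximate branching fractions B(\tau \rightarrow \mu \nu \bar\nu) \approx 0.17 and B(\tau \rightarrow hadrons + \nu) \approx 0.65: the mixed muon-plus-hadrons channel collects about 2 \times 0.17 \times 0.65 \approx 0.22 of all tau pairs, roughly seven times more than the cleaner dimuon channel at 0.17^2 \approx 0.03.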

As I briefly described in my latest post below (concerning taus used by D0 for Higgs searches), tau leptons decay weakly to light hadrons (such as pions or kaons), which produce a narrow jet in the detector, with one or three charged tracks pointing at it. Three separate classes of tau candidates are analyzed, each by its own neural network classifier, which sorts real tau leptons out from the backgrounds (generic jets from quark or gluon hadronization) by relying on the characteristics defining that class.
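Just to fix ideas, here is a minimal sketch of what such a classification could look like; the split into classes (a single track without or with electromagnetic subclusters, or three tracks) and all the names are illustrative assumptions of mine, not D0 code:

def tau_candidate_class(n_tracks, n_em_subclusters):
    """Assign a hadronic tau candidate to one of three illustrative classes,
    one per decay topology, each of which would feed its own neural network."""
    if n_tracks == 1 and n_em_subclusters == 0:
        return 1   # single charged pion, no neutral pions
    if n_tracks == 1:
        return 2   # single track plus pi0 -> gamma gamma subclusters
    if n_tracks == 3:
        return 3   # three-prong decay
    return None    # not classified as a tau candidate in this sketch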

Today’s result is based on a dataset of one inverse femtobarn of proton-antiproton collisions. After a preliminary data selection requiring a muon, a hadronic tau candidate, and two hadronic jets, the data are still too rich in background, given the expected rarity of the leptoquark pair signal. In particular, the presence of a good muon candidate makes the sample rich in W \rightarrow \mu \nu_\mu decays – particularly those coming from top pair production, which easily produces extra jets and even a tau signal. D0 gets rid of events due to W decays by requiring that the transverse mass, reconstructed from the muon momentum and the missing transverse energy in the event, be lower than 50 GeV. As you can see in the distribution of transverse masses in the following plot, the cut reduces backgrounds significantly:
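For reference, the transverse mass used here is the usual W-boson variable, M_T = \sqrt{2 \, p_T^\mu \, E_T^{miss} \, (1 - \cos\Delta\phi)}, with \Delta\phi the azimuthal angle between the muon and the missing transverse energy: real W \rightarrow \mu \nu_\mu decays cluster near the W mass of about 80 GeV, while in signal events the missing energy comes mostly from the neutrinos of the tau decays, so the requirement M_T < 50 GeV costs little signal.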

Finally, only events containing one or two b-tagged jets are retained. The table shown below details the expected background contributions to the data, after these selection cuts and after single and double tagging (last two columns). The third-to-last line in the table shows the total background expected, the next-to-last line shows the observed events, and the last one shows the expected signal events for a LQ3 mass of 220 GeV.

Single and double b-tagged events (i.e., events with one or two jets recognized as coming from b-quark hadronization) are considered separately: the former have a larger acceptance, the latter a better purity. D0 cooks up a variable that further separates the backgrounds from the sought signal: the scalar sum S of the transverse momenta of the two jets, the hadronic tau candidate, the muon, and the missing transverse energy. This variable is larger for LQ3 pair production than for all background processes surviving in the data, as clearly shown by the picture below, where the signal is drawn as an empty black histogram (hatched for an LQ3 mass of 180 GeV, continuous for a mass of 220 GeV) and the backgrounds are shown in different colors – the green top pair background being the most prominent at large S.
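Just to make S concrete, a minimal sketch of the computation (illustrative Python with made-up names, not D0 code) is the following:

def scalar_sum_st(jet_pts, tau_pt, muon_pt, met):
    """Scalar sum S of transverse momenta (GeV): the two leading jets,
    the hadronic tau candidate, the muon, and the missing transverse energy."""
    leading_jets = sorted(jet_pts, reverse=True)[:2]
    return sum(leading_jets) + tau_pt + muon_pt + met

# Example: jets of 60 and 45 GeV, a 35 GeV tau, a 25 GeV muon and 40 GeV
# of missing energy give S = 205 GeV.
print(scalar_sum_st([60.0, 45.0], 35.0, 25.0, 40.0))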

Rather than cutting on S to further increase signal purity, the S distribution of the data is compared with those expected from the backgrounds and from a leptoquark signal, to extract an upper limit on the LQ3 pair production cross section.

D0 thus finds that third-generation scalar leptoquarks are excluded up to a mass of 180 GeV, as shown in the picture above, where the mass limit is obtained as the intersection of the theoretical cross section curve (yellow band) with the cross section limit (red curve). But there is fine print to read: in order to make the limit easier to convert to different models, a 100% branching ratio of the LQ3 to a b-quark and a tau lepton has been assumed. That means that if LQ3 particles decay to that final state only half of the time, the limit on the cross section becomes four times higher – and the mass exclusion becomes considerably weaker. No criticism of D0, of course: it is a good way to present their data, which allows theorists to make good use of the extracted limit.
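The arithmetic behind that factor of four is simple: the search really constrains \sigma \times \beta^2, where \beta is the branching ratio of the leptoquark to a b-quark and a tau lepton, since both leptoquarks in the pair must decay that way to give the b \tau b \tau final state. The quoted cross section limit thus scales as 1/\beta^2, and for \beta = 0.5 that is 1/0.25 = 4.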

I must say I have a small criticism of a detail of the analysis instead. It concerns a very subtle correction applied to b-jets containing muons. D0 corrects for the presence of muons in the jets by adding twice the muon momentum to the jet energy, to compensate for the energy carried away, and not released in the calorimeter, by both the muon and the accompanying neutrino. Now, I happen to have studied the matter in detail ten years ago for the reconstruction of b-jets from Z decays, and I know a few things about the energy correction of b-jets. Well, not surprisingly, there is a negative correlation between the observed muon momentum and the neutrino energy, so that if one uses the former to estimate the latter, as D0 does in this analysis, the result is a sub-optimal correction, to put it mildly. It is a detail, but it bothers me: in principle, one could reconstruct the leptoquark mass directly by taking the four-momenta of all observed objects in the event, plus the missing energy, and constructing a likelihood: one b and one tau must have the same invariant mass as the other b and the other tau, after all. This appears to be ignored in the analysis (but I imagine D0 must have attempted something like that). I would have appreciated a sentence discussing that option in the paper describing the result in detail… The reason why I believe D0 does not attempt a direct mass reconstruction is that the resolution on the mass of objects decaying into a tau and a b-jet is mediocre. Only very careful studies could improve it to a level that would make the mass likelihood a viable analysis tool. Among them, a better correction for the semileptonic b-quark decay into a muon and a neutrino is certainly on the agenda…
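For concreteness, the correction being discussed amounts to something like the sketch below (illustrative Python, not D0 code); my complaint is precisely that the neutrino term should not be taken equal to the measured muon momentum, but rather parametrized as a function of it, given the anti-correlation between the two:

def correct_semileptonic_bjet(jet_energy, muon_momentum):
    """Add back the energy not measured by the calorimeter in a semileptonic
    b decay: the muon (nearly minimum-ionizing) and the invisible neutrino.
    Approximating the neutrino energy with the measured muon momentum yields
    the factor of two described in the post."""
    return jet_energy + 2.0 * muon_momentum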

Comments

1. Not Even Wrong » Blog Archive » Various Stuff - August 8, 2007

[…] some excellent detailed postings about recent experimental HEP results from Tommaso Dorigo, see here and here. For blogging from CHARM 07 by Alexey Petrov, see […]


