
## Some recent posts you might want to read (March 6, 2010)

Posted by dorigo in Blogroll, internet, news, physics, science.

As the less distracted among you know, I moved my blogging activities to Scientific Blogging last April. I wish to report here a list of interesting posts I have produced there in the course of the last few months (precisely, since the start of 2010). They are given in reverse chronological order and with zero commentary: come see if you are curious.

## CMS and extensive air showers: ideas for an experiment (February 6, 2009)

Posted by dorigo in astronomy, cosmology, physics, science.

The paper by Thomas Gehrmann and collaborators that I cited a few days ago has inspired me to have a closer look at the problem of understanding the features of extensive air showers: the localized streams of high-energy particles that originate when a very energetic proton or light nucleus hits the upper atmosphere.

Layman facts about cosmic rays

While the topic of cosmic rays, their sources, and their study is largely terra incognita to me (I only know the very basic facts, having learned them like most of you from popularization magazines), I do know that a few of their features are not too well understood yet. Let me mention only a few issues below, with no fear of being shown how ignorant of the topic I am:

• The highest-energy cosmic rays have no clear explanation in terms of their origin. A few events with energy exceeding $10^{20}$ eV have been recorded by at least a couple of experiments, and they are the subject of an extensive investigation by the Pierre Auger Observatory.
• There are a number of anomalies in their composition, their energy spectrum, and the composition of the showers they develop. The data from PAMELA and ATIC are just two recent examples of things we do not understand well, and which might have an exotic explanation.
• While models of their formation assume that the flux of primary hadrons is composed only of light nuclei (iron at most), some data (for instance this study by the DELPHI collaboration) seem to imply otherwise.

The paper by Gehrmann addresses in particular the latter point. There appears to be a failure in our ability to describe the development of air showers that produce very large numbers of muons, and this failure might be due to modeling uncertainties, heavy nuclei as primaries, or the creation of exotic particles with muonic decays, in decreasing order of likelihood. For sure, if an exotic particle like the 300 GeV one hypothesized in the interpretation paper produced by the authors of the CDF study of multi-muon events (see the tag cloud in the right column for an extensive review of that result) existed, the Tevatron would not be the only place to find it: high-energy cosmic rays would produce it in sizable amounts, and the observed multi-muon signature from its decay in the atmosphere might end up showing in those air showers as well!

Mind you, large numbers of muons are by no means a surprising phenomenon in high-energy cosmic-ray showers. What happens is that a hadronic collision between the primary hadron and a nucleus of nitrogen or oxygen in the upper atmosphere creates dozens of secondary light hadrons. These in turn hit other nuclei, and the developing hadronic shower progresses until the hadrons fall below the energy required to create more secondaries. The created hadrons then decay, and in particular $K^+ \to \mu^+ \nu_{\mu}$ and $\pi^+ \to \mu^+ \nu_{\mu}$ decays will create a lot of muons.
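To get a feeling for the numbers, the simple Heitler-Matthews toy model of hadronic showers predicts that the muon count grows almost linearly with the primary energy, $N_\mu \sim (E_0/\xi_c)^\beta$, where $\xi_c$ is the energy below which pions decay rather than re-interact and $\beta \approx 0.9$. The sketch below is my own illustration, with assumed textbook-level values for $\xi_c$ and $\beta$ (it is not taken from the Gehrmann paper):

```python
def n_muons(e0_gev, xi_c=20.0, beta=0.90):
    """Rough Heitler-Matthews estimate of the muon count of a
    proton-induced shower: N_mu ~ (E0/xi_c)^beta.  xi_c is the pion
    critical energy (where decay wins over re-interaction) and
    beta ~ 0.9; both values are illustrative assumptions."""
    return (e0_gev / xi_c) ** beta

for e in (1e6, 1e8, 1e11):  # 1 PeV, 100 PeV, 10^20 eV, in GeV
    print(f"E0 = {e:.0e} GeV -> N_mu ~ {n_muons(e):.2e}")
```

A PeV primary already yields tens of thousands of muons in this crude picture, which is why "many muons" per se is no anomaly.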

Muons have a lifetime of about 2.2 microseconds, and if they are energetic enough they can travel many kilometers, reaching the ground and whatever detector we set there. In addition, muons are very penetrating: a muon needs just 52 GeV of energy to make it 100 meters underground, through the rock lying on top of the CERN detectors. Of course, air showers include not just muons but also electrons, neutrinos, and photons, plus protons and other hadronic particles. But none of these particles, except neutrinos, can make it deep underground. And neutrinos pass through unseen…
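Both figures in this paragraph follow from one-line estimates: the decay length is the time-dilated $\gamma c\tau$, and the 52 GeV threshold follows from the minimum-ionizing loss of roughly 2 MeV per g/cm² in standard rock. A sketch with rounded textbook numbers (my own assumptions, not precise values):

```python
M_MU  = 0.10566   # muon mass, GeV
C_TAU = 658.6     # c * tau for a muon at rest, metres
DEDX  = 2.0e-3    # ionization loss, GeV per g/cm^2 (minimum-ionizing, rough)
RHO   = 2.65      # standard-rock density, g/cm^3

def decay_length_m(e_gev):
    """Mean decay length of a muon of energy e_gev: gamma * c * tau."""
    gamma = e_gev / M_MU
    return gamma * C_TAU

def energy_to_cross_rock(depth_m):
    """Energy lost to ionization crossing depth_m of standard rock."""
    return DEDX * RHO * depth_m * 100.0   # convert depth to cm

print(f"10 GeV muon decay length: {decay_length_m(10)/1000:.0f} km")
print(f"energy to cross 100 m of rock: {energy_to_cross_rock(100):.0f} GeV")
```

A 10 GeV muon already flies tens of kilometers on average before decaying, and 100 m of rock costs about 53 GeV, consistent with the 52 GeV quoted above.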

Now, if one reads the DELPHI publication, as well as information from other experiments which have studied high-multiplicity cosmic-ray showers, one learns a few interesting facts. DELPHI found a large number of events with so many muon tracks that they could not even count them! In a few cases, they could only quote a lower limit on the number of muons crossing the detector volume. One such event is shown in the picture on the right: they infer that an air shower passed through the detector by observing voids in the distribution of hits!

The number of muons seen underground is an excellent estimator of the energy of the primary cosmic ray, as the KASCADE collaboration result on the left demonstrates (the abscissa is the logarithm of the energy of the primary cosmic ray, and the ordinate the number of muons per square meter measured by the detector). But to compute the energy and composition of cosmic rays from the characteristics we observe on the ground, we need detailed simulations of the mechanisms creating the shower, and these simulations require an understanding of the physical processes underlying the production of secondaries, which are known only to a certain degree. I will get back to this point; here I just mean to point out that a detector measuring the number of muons gets an estimate of the energy of the primary nucleus. The energy, but not the species!

As I was mentioning, the DELPHI data (and that of other experiments, too) showed that there are too many high-muon-multiplicity showers. The graph on the right shows the observed excess at very high muon multiplicities (the points at the far right of the graph). This is a 3-sigma effect, and it might be caused by modeling uncertainties, but it might also mean that we do not understand the composition of the primary cosmic rays: a heavier nucleus of a given energy usually produces more muons than a lighter one.
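The last statement follows from the superposition model: a nucleus of mass number $A$ and energy $E$ behaves approximately like $A$ independent nucleons of energy $E/A$, so $N_\mu(A) = A \,(E/A/\xi_c)^\beta = A^{1-\beta} N_\mu(p)$. A minimal sketch, with $\beta$ an assumed illustrative value:

```python
def muon_ratio(a, beta=0.90):
    """Superposition model: a nucleus of mass number `a` acts like `a`
    independent nucleons of energy E/a, giving
    N_mu(A) = a * (E/(a*xi))^beta = a^(1-beta) * N_mu(proton).
    beta ~ 0.9 is an assumed illustrative value."""
    return a ** (1.0 - beta)

print(f"iron/proton muon ratio at fixed energy: {muon_ratio(56):.2f}")
```

With these numbers an iron primary yields roughly 50% more muons than a proton of the same energy, which is why the muon count alone cannot separate energy from species.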

The modeling uncertainties are due to the fact that the very forward production of hadrons in a nucleus-nucleus collision is governed by QCD at very small energy scales, where we cannot calculate the theory to a good approximation. So we cannot really compute with the precision we would like how likely it is that a 1,000,000-TeV proton, say, produces a forward-going 1-TeV proton in the collision with a nucleus of the atmosphere. The energy distribution of the secondaries produced in the forward direction is not well known, that is, and this is reflected in the uncertainty on the shower composition.

Enter CMS

Now, what does CMS have to do with all the above? Well. For one thing, last summer the detector was turned on in the underground cavern at Point 5 of the LHC, and it collected 300 million cosmic-ray events. This is a huge data sample, made possible by the large size of the detector and the beautiful performance of its muon chambers (which, by the way, were designed by physicists of Padova University!). Such a large dataset already includes very high-multiplicity muon showers, and some of my collaborators are busy analyzing that gold mine. Measurements of cosmic-ray properties are ongoing.

One might hope that the collection of cosmic rays will continue even after the LHC is turned on. I believe it will, but only during the short periods when there is no beam circulating in the machine. The cosmic-ray data thus collected are typically used to keep the system “warm” while waiting for more proton-proton collisions, but they will not provide an orders-of-magnitude increase in statistics with respect to what was already collected last summer.

The CMS cosmic-ray data can indeed provide an estimate of several characteristics of the air showers, but they will not be capable of providing results qualitatively different from the findings of DELPHI, although, of course, they might confirm the simulations, disproving the excess observed by that experiment. The problem is that very energetic events are rare, so one must actively pursue them rather than turning on cosmic-ray data collection only when not in collider mode. But there is one further important point: since only muons are detected, one cannot really check whether the simulation is tuned correctly, and one cannot obtain a critical additional piece of information: the amount of energy that the shower produced in the form of electrons and photons.

The electron and photon component of the air shower is a good discriminant of the nucleus which produced the primary interaction, as the plot on the right shows. It is in fact crucial information to rule out the presence of nuclei heavier than iron, or to determine the composition of primaries in terms of light nuclei. Since the number of muons in high-multiplicity showers is connected to the nuclear species as well, by determining both quantities one would really be able to understand what is going on. [In the plot, the quantity Y is plotted as a function of the primary cosmic-ray energy. Y is the ratio between the logarithms of the numbers of detected muons and electrons. You can observe that Y is higher for iron-induced showers (the full black squares).]
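Why Y separates the species can be seen with the same crude superposition picture used above: the electron count scales with the primary energy almost independently of $A$, while the muon count carries an extra $A^{1-\beta}$ factor, so $Y = \ln N_\mu / \ln N_e$ is larger for iron. A sketch with assumed illustrative constants:

```python
import math

XI_PI, BETA, EPS_E = 20.0, 0.90, 0.085   # assumed GeV-scale constants

def y_discriminant(e0_gev, a):
    """Y = ln(N_mu)/ln(N_e) in a crude superposition + Heitler picture.
    N_e ~ E0/eps_e is nearly A-independent; N_mu grows like A^(1-beta),
    so heavier primaries give a larger Y.  All numbers are assumptions."""
    n_mu = a ** (1.0 - BETA) * (e0_gev / XI_PI) ** BETA
    n_e = e0_gev / EPS_E
    return math.log(n_mu) / math.log(n_e)

e0 = 1.0e7  # a 10 PeV primary
print(f"Y(proton) = {y_discriminant(e0, 1):.3f}")
print(f"Y(iron)   = {y_discriminant(e0, 56):.3f}")
```

The separation is small in absolute terms, which is why one needs both the muon and the electromagnetic measurements to exploit it.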

Idea for a new experiment

The idea is thus already there, if you can add one plus one. CMS is underground. We need a detector at ground level to be sensitive to the “soft” component of the air shower, the one due to electrons and photons, which cannot punch through more than a meter of rock. So we may take a number of scintillation counters, layered in alternation with lead sheets, all sitting on top of a thicker set of lead bricks, underneath which we may set some drift tubes or, even better, resistive plate chambers.

We can build a 20- to 50-square-meter detector this way with a relatively small amount of money, since the technology is really simple and we can even scavenge material here and there (for instance, we can use spare chambers from the CMS experiment!). Then we just build a simple coincidence logic between the resistive plate chambers, requiring that several parts of our array fire together at the passage of many muons, and send the trigger signal 100 meters down, where CMS may receive an “auto-accept” to read out the event regardless of the presence of a collision in the detector.
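The coincidence logic is the simplest part: fire only when a minimum number of array modules see a hit in the same time window, which suppresses single muons while keeping bundles. A toy sketch (module count, efficiency, and threshold are all made-up illustrative numbers, not a design):

```python
import random

def majority_trigger(hits, threshold=4):
    """Fire if at least `threshold` modules report a hit in the same
    coincidence window.  `hits` is a list of booleans, one per module."""
    return sum(hits) >= threshold

random.seed(1)
# toy comparison: 8 modules, each 90% efficient, under a wide muon
# bundle (all modules crossed) versus a single muon (one module crossed)
bundle = [random.random() < 0.9 for _ in range(8)]
single = [False] * 7 + [True]
print("bundle fires:", majority_trigger(bundle))
print("single fires:", majority_trigger(single))
```

A real implementation would of course be done in the front-end electronics, but the logic is exactly this simple.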

The latter is the most complicated part of the whole idea: modifying existing things is always harder than creating new ones. But it should not be too hard to read out CMS parasitically, and to collect those high-multiplicity showers at very low frequency. Then the readout of the ground-based electromagnetic calorimeter should provide us with an estimate of the (local) electron-to-muon ratio, which is what we need to determine the weight of the primary nucleus.

If the above sounds confusing, it is entirely my fault: I have dumped here some loose ideas, with the aim of coming back to them when I need them. After all, this is a log: a web log, but always a log of my ideas… But I wish to investigate the feasibility of this project further. Indeed, CMS will for sure pursue cosmic-ray measurements with the 300M events it has already collected. And CMS does have spare muon chambers. And CMS does have plans to store them at Point 5… Why not just power them up and build a poor man’s trigger? A calorimeter might come later…

## Babysitting this week (February 1, 2009)

Posted by dorigo in news, personal, physics.

Blogging is one of the activities that will be slightly reduced this week, along with others that are not strictly necessary for my survival. Mariarosa left for Athens this morning with three high-school classes of her school, Liceo Foscarini. They will visit Greece for a whole week, and be back in Venice on Saturday.

I am not scared by the obligation of caring for my two kids, and I do like such challenges (I maintain that my wife should not complain too much when it is me who leaves for a week, much more frequently), but of course the management of our family life will take all of my spare time, plus some.

Blogging material, in the meantime, is piling up. There are beautiful results coming out of CDF these days (isn’t that becoming a rule?). Furthermore, the Tevatron has recently been running excellently, and the LHC seems in the middle of a crisis over whether to risk a second, colossal failure by pushing the energy up to 10 TeV to put the Tevatron off the table in the shortest time possible, or to play it safe and keep the collision energy at 6 TeV, accepting the risk of being scooped of the juiciest bits of physics left over to party with.

And multi-muons keep me busy these days. Besides the starting analysis in the CMS-Padova group, there are papers worth discussing on the arxiv. This one was published a few days ago, and last Thursday we had in Padova one of the authors, Thomas Gehrmann, discussing QCD calculations of event-shape observables in a seminar, which of course allowed me to chat with him about his hunch on the hidden-valley scenarios he discusses in his paper. More on these things next week, after I put my kids to sleep!

## Cosmic-ray studies of the CMS tracker (January 28, 2009)

Posted by dorigo in news, personal, physics.

It is always nice to open the web browser in the morning, check the arxiv for new interesting preprints, and be surprised to find one’s own name in an author list. That is what happened to me today while lazily browsing the list of new hep-ex papers, as my eyes set on “Performance studies of the CMS strip tracker before installation”.

The paper describes the full testing of a sector of the CMS tracker. The tracker (see picture on the right, showing a detail of its inner barrels) is a daring device made of many concentric barrels of silicon strip sensors. During the summer of 2007 a quarter of the device was fully instrumented, cooled, and read out while being exposed to cosmic rays, and a total of 4.5 million tracks were reconstructed. This allowed the collaboration to gain critical experience with its operation, to perform detailed studies of its tracking capabilities, to tune a simulation of the detector, and to develop advanced tools.

Of course I knew the paper was being prepared (the submitter is Patrizia Azzi, a member of my group in Padova, although she’s full-time at CERN), but no, I did not contribute to it in any significant way and no, I had not even read the draft!

To be fair, the author list includes over 400 names, the members of the CMS Tracker Collaboration (people who were somehow involved in the construction of the tracker), so you should not run out screaming “Dorigo is a parasite!”; at least, I am not the only one! This is how things work in large collaborations: you focus on one or two studies at a time, on the time scale of two to three years, but you do not just sign your own papers: you sign all of them.

In retrospect, I should be even less severe with myself. Although the paper contains no results of mine, I did work on the analysis of the data. I did a study of multi-track events, trying to figure out how the presence of large amounts of hits close together could affect the tracking (a matter of relevance for the LHC, where dozens of tracks will pack together within small volumes), and I studied the extraction of the angle of incidence of tracks from the width of the clusters of charge in the silicon strips (tracks crossing a layer of silicon at normal incidence leave an ionization trail which gets collected in a few strips, while tracks crossing at a large angle leave a signal in many adjoining strips).
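The cluster-width method rests on simple geometry: the ionization trail of a track crossing at angle $\theta$ from the normal projects onto roughly $t \tan\theta / p$ strips, for sensor thickness $t$ and strip pitch $p$. A sketch with assumed illustrative values for thickness and pitch (not the actual CMS sensor geometry):

```python
import math

THICKNESS_UM = 300.0   # silicon sensor thickness in microns (assumed)
PITCH_UM = 100.0       # strip pitch in microns (assumed)

def expected_cluster_width(theta_deg):
    """Rough geometric estimate: the charge of a track crossing at
    angle theta from the normal spreads over ~ t*tan(theta)/pitch
    strips, plus one strip for the track's entry point."""
    t = math.radians(theta_deg)
    return 1.0 + THICKNESS_UM * math.tan(t) / PITCH_UM

for theta in (0, 30, 60):
    print(f"theta = {theta:2d} deg -> ~{expected_cluster_width(theta):.1f} strips")
```

Inverting this relation is what lets one estimate the incidence angle from the measured cluster width.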

Those studies did not end up providing a valuable addition to the paper (mostly because I did not bring them to a conclusion) and they were left out of it, but I invested at least one month of work in them. Not much, but I do not feel like a parasite after all: the paper is maybe the result of 20 or 30 man-years of studies, so each of the 400 authors contributed, on average, less than one month of full-time work!

## Multi-muon news (January 26, 2009)

Posted by dorigo in news, personal, physics, science.

This post is not it, but no, I have not given up on my promise to complete my series on the anomalous multi-muon signal found by CDF in its Run II data. In fact, I expect to be able to post once more on the topic this week. There, I hope to discuss the kinematic characteristics of multi-lepton jets. [I am lazy today, so I will refrain from adding links to past discussions of the topic here: if you need references, just click on the tag cloud in the right column, where it says "anomalous muons"!]

In the meantime, I am happy to report that I have just started working on the same analysis for the CMS experiment! In Padova we have recently put together a group of six (one professor, three researchers, a PhD student, and an undergrad) and we will pursue the investigation of the same signature seen by CDF. And today, together with Luca, our brilliant new PhD student, I started looking at the reconstruction of neutral kaon decays $K^0 \to \pi^+ \pi^-$, a clean source of well-identified pion tracks with which we hope to be able to study muon mis-identification in CMS.
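The kaon reconstruction itself boils down to computing the invariant mass of oppositely charged track pairs under the pion hypothesis and looking for the $K^0$ peak near 498 MeV. A minimal sketch with toy kinematics (back-to-back pions in the kaon rest frame; this is not CMS reconstruction code):

```python
import math

M_PI = 0.13957  # charged-pion mass, GeV

def four_momentum(px, py, pz, mass):
    """Build (E, px, py, pz) for a track under a given mass hypothesis."""
    e = math.sqrt(px * px + py * py + pz * pz + mass * mass)
    return (e, px, py, pz)

def inv_mass(p1, p2):
    """Invariant mass of a two-track system from its four-momenta."""
    e = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in (1, 2, 3))
    return math.sqrt(e * e - px * px - py * py - pz * pz)

# toy: back-to-back pions with the ~206 MeV momentum of K0 -> pi+ pi-
p_star = 0.206
pi1 = four_momentum(+p_star, 0.0, 0.0, M_PI)
pi2 = four_momentum(-p_star, 0.0, 0.0, M_PI)
print(f"m(pi+ pi-) = {inv_mass(pi1, pi2) * 1000:.0f} MeV")
```

In data one would loop over all track pairs, keep those with a common displaced vertex, and fit the resulting mass peak.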

Meanwhile, the six-strong group in Padova is already expanding. Last Wednesday professor Fotios Ptochos, a longtime colleague in CDF, a good friend, and, crucially, one of the authors of the multi-muon analysis, came to Padova and presented a two-hour seminar on the CDF signal in front of a very interested group of forty physicists spanning four generations, from Milla Baldo Ceolin to our youngest undergraduates. The seminar was enlightening, and I was very happy with the result of a week spent organizing the whole thing! (I will have to ask Fotios whether I can make the slides of his talk available here….)

Fotios, a professor at the University of Cyprus, is a member of CMS, and a true expert of measurements in the B-physics sector at hadron machines. We plan to work together to repeat the controversial CDF analysis with the first data that CMS will collect -hopefully later this year.

The idea of repeating the CDF analysis in CMS is obvious. Both CDF and D0 can say something about the signal on a reasonable time scale, but whatever the outcome, the matter will only be settled by the LHC experiments. Imagine, for instance, that in a few months D0 publishes an analysis which disproves the CDF signal. Will we then conclude that CDF has completely screwed up its measurement? We will probably have quite a clue in that case, but we will need to keep an open mind until at least a third, possibly more precise, measurement is performed by an independent experiment. That measurement is surely going to be worth a useful publication.

And now imagine, on the contrary, that the CDF signal is real…

## Guess the function: results (January 21, 2009)

Posted by dorigo in physics, science.

Thanks to the many offers of help I received a few days ago, when I asked for hints on possible functional forms to interpolate a histogram I was finding hard to fit, I have successfully solved the problem, and can now release the results of my study.

The issue is the following: at the LHC, Z bosons are produced by electroweak interactions through quark-antiquark annihilation. The colliding quarks have variable energies, determined by parton distribution functions (PDFs) which describe how much of the proton’s energy they carry; and the Z boson has a resonance shape with a sizable width: 2.5 GeV, for a 91 GeV mass. The varying center-of-mass energy, determined by the random values of the quark energies drawn from the PDFs, “samples” the resonance curve, creating a distortion in the mass distribution of the produced Z bosons.
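This distortion mechanism is easy to reproduce in a toy Monte Carlo: sample the Breit-Wigner resonance weighted by a steeply falling parton luminosity, and the average mass gets pulled below the pole. The $1/m^4$ luminosity below is purely an illustrative assumption (the real calculation, as described next, uses actual PDFs):

```python
import random

MZ, GZ = 91.19, 2.50   # Z mass and width, GeV

def breit_wigner(m):
    """Relativistic Breit-Wigner lineshape (unnormalized)."""
    return m ** 2 / ((m ** 2 - MZ ** 2) ** 2 + (MZ * GZ) ** 2)

def toy_parton_lumi(m):
    """Toy stand-in for the falling quark-antiquark luminosity;
    the 1/m^4 shape is an assumption, not a real PDF set."""
    return 1.0 / m ** 4

def sample_mass(rng, lo=71.0, hi=111.0):
    """Accept-reject sampling of lineshape x luminosity on [lo, hi]."""
    fmax = breit_wigner(MZ) * toy_parton_lumi(lo)  # valid envelope
    while True:
        m = rng.uniform(lo, hi)
        if rng.random() * fmax < breit_wigner(m) * toy_parton_lumi(m):
            return m

rng = random.Random(42)
masses = [sample_mass(rng) for _ in range(5000)]
mean = sum(masses) / len(masses)
print(f"mean sampled mass: {mean:.2f} GeV (pulled below the {MZ} GeV pole)")
```

The low-mass side of the peak gets enhanced, which is exactly the asymmetry the fits below have to describe.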

The above is not the end of the story, but just the beginning: in fact, there are electromagnetic (QED) corrections due to the radiation of photons, both “internally” and by the two muons into which the Z decays (I am focusing on that final state of Z production: a pair of high-momentum muons from $Z \to \mu^+ \mu^-$). Also, electromagnetic interactions cause an interference with Z production, because a virtual photon may produce the same final state (two muons) by means of the so-called Drell-Yan process. All these effects can only be accounted for by detailed Monte Carlo simulations.

Now, let us treat all of that as a black box: we only care to describe the mass distribution of muon pairs from Z production at the LHC, and we have a pretty good simulation program, Horace (developed by four physicists at Pavia University: C.M. Carloni Calame, G. Montagna, O. Nicrosini and A. Vicini), which handles the effects discussed above. My problem is to describe with a simple function the produced Z boson lineshape (the mass distribution) in different bins of Z rapidity. Rapidity is a quantity connected to the momentum of the particle along the beam direction: since the colliding quarks have variable energies, the Z may have a large boost along that direction. And crucially, the lineshape varies with Z rapidity.

In the post I published a few days ago I presented the residuals of lineshape fits which used the original resonance form, neglecting all PDF and QED effects. By fitting those residuals with a properly parametrized function, I was trying to arrive at a better parametrization of the full lineshape.

After many attempts, I can now release the results. The template for the residuals is shown below, interpolated with the function I obtained following advice from Lubos Motl:

After multiplying that function by the original Breit-Wigner resonance function, I could fit the 24 lineshapes extracted from a binning in rapidity. This produced additional residuals, which are of course much smaller than the first-order ones above, and this time have a sort of parabolic shape. A couple of them are shown on the right.

I then interpolated those residuals with parabolas and extracted their fit parameters. Then I could parametrize those parameters, as the graph below shows: the three degrees of freedom of the parabola have roughly linear variations with Z rapidity. The graphs show the five parameter dependences on Z rapidity for lineshapes extracted with the CTEQ set of parton PDFs (left column); for the MRST set (center column); and the ratio of the two parametrizations (right column), which is not too different from 1.0.
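The two-step procedure (a parabola fit per rapidity bin, then a linear parametrization of the parabola coefficients in rapidity) can be sketched on toy residuals; the coefficient values, noise level, and bin edges below are invented for illustration, not taken from the actual fits:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy residuals in 24 rapidity bins: r(m) = a + b*m + c*m^2, with the
# coefficients drifting linearly in |y| (invented numbers)
y_bins = np.linspace(0.05, 2.35, 24)
m = np.linspace(-20, 20, 41)   # mass offset from the peak, GeV

coeffs = []
for y in y_bins:
    a, b, c = 0.01 * y, -0.002 * y, 1e-4 * (1 + y)
    r = a + b * m + c * m ** 2 + rng.normal(0, 1e-4, m.size)
    coeffs.append(np.polyfit(m, r, 2))   # step 1: parabola per bin
coeffs = np.array(coeffs)                # columns: c, b, a

# step 2: each parabola coefficient as a linear function of rapidity
for name, col in zip("cba", coeffs.T):
    slope, intercept = np.polyfit(y_bins, col, 1)
    print(f"{name}(y) ~ {intercept:+.2e} {slope:+.2e}*y")
```

The recovered slopes reproduce the injected linear drifts, which is the pattern the real fits showed.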

Finally, the 24 fits which use the $f(m,y)$ shape, now with all of the rapidity-dependent parameters fixed, are shown below (the graph shows only one fit; click to enlarge and see all of them together).

The function used is detailed in the slide below:

I am rather satisfied with the result, because the residuals of these final fits are really small, as shown on the right: they are certainly smaller than the uncertainties due to PDF and QED effects. The $f(m,y)$ function above will now be used to derive a parametrization of the probability of observing a dimuon pair with a given mass $m$ at a rapidity $y$, as a function of the momentum scale in the tracker and the muon momentum resolution.

## Some posts you might have missed in 2008 (January 5, 2009)

Posted by dorigo in cosmology, personal, physics, science.

To start 2009 with a tidy desk, I wish to put some order in the posts about particle physics I wrote in 2008. By collecting a few links here, I save the most meaningful of them from oblivion, or at least I make them just a bit more accessible. In due time I will update the “physics made easy” page, but that is work for another free day.

The list below collects, in reverse chronological order, the posts from the first six months of 2008; tomorrow I will complete the list with the second half of the year. The list includes neither guest posts nor conference reports, which may be valuable but belong to a different list (and are linked from the permanent pages above).

June 17: A description of a general search performed by CDF for events featuring photons and missing transverse energy along with b-quark jets – a signature which may arise from new physics processes.

June 6: This post reports on the observation of the decay of J/Psi mesons to three photons, a rare and beautiful signature found by CLEO-c.

June 4 and June 5 offer a riddle from a simple measurement of the muon lifetime. Readers are given a description of the experimental apparatus, and they have to figure out what they should expect as the result of the experiment.

May 29: A detailed discussion of the search performed by CDF for a MSSM Higgs boson in the two-tau-lepton decay. Since this final state provided a 2.1-sigma excess in 2007, the topic deserved a careful look, which is provided in the post.

May 20: Commented slides of my talk at PPC 2008, on new results from the CDF experiment.

May 17: A description of the search for dimuon decays of the B mesons in CDF, which provides exclusion limits for a chunk of SUSY parameter space.

May 2: A description of the search for Higgs bosons in the 4-jet final state, which is dear to me because I worked on that signature in the past.

Apr 29: This post describes the method I am working on to correct the measurement of charged track momenta by the CMS detector.

Apr 23, Apr 28, and May 6: This is a lengthy but simple, general discussion of dark matter searches with hadron colliders, based on a seminar I gave to undergraduate students in Padova. In three parts.

Apr 6 and Apr 11: a detailed two-part description of the detectors of electromagnetic and hadronic showers, and the related physics.

Apr 05: a general discussion of the detectors for LHC and the reasons they are built the way they are.

Mar 29: A discussion of the recent Tevatron results on Higgs boson searches, with some considerations on the consistency of a light Higgs boson with the available data.

Mar 25: A detailed discussion on the possibility that more than three families of elementary fermions exist, and a description of the latest search by CDF for a fourth-generation quark.

Mar 17: A discussion of the excess of events featuring leptons of the same electric charge, seen by CDF and evidenced by a global search for new physics. Can be read alone or in combination with the earlier post on the same subject.

Mar 10: This is a discussion of the many measurements obtained by CDF and D0 on the top-quark mass, and their combination, which involves a few subtleties.

Mar 5: This is a discussion of the CDMS dark matter search results, and the implications for Supersymmetry and its parameter space.

Feb 19: This is a popular-level description of the ways the proton structure can be studied in hadron collisions, through the parton distribution functions and how these affect the scattering measurements in proton-antiproton collisions.

Feb 13: A discussion of luminosity, cross sections, and rate of collisions at the LHC, with some easy calculations of the rate of multiple hard interactions.

Jan 31: A summary of the enlightening review talk on the standard model that Guido Altarelli gave in Perugia at a meeting of the Italian LHC community.

Jan 13: commented slides of the paper seminar given by Julien Donini on the measurement of the b-jet energy scale and the $p \bar p \to Z X \to b \bar b X$ cross section, the latter measured for the first time ever at a hadron machine. This is the culmination of a twelve-year effort by me and my group.

Jan 4: An account of the CDF search for Randall-Sundrum gravitons in the $ZZ \to eeee$ final state.

## More on the Z lineshape at LHC (December 19, 2008)

Posted by dorigo in personal, physics, science.

Yesterday I posted a nice-looking graph without abounding in explanations of how I determined it. Let me fill that gap here today.

A short introduction

Z bosons will be produced copiously at the LHC in proton-proton collisions. What happens is that a quark from one proton hits an antiquark of the same flavour in the other proton, and the pair annihilates, producing the Z. This is a weak interaction: a relatively rare process, because weak interactions are much less frequent than strong interactions. Quarks carry colour charge as well as weak hypercharge, and most of the time when they hit each other what “reacts” is their colour, not their hypercharge. Similarly, when you meet John at the coffee machine you discuss football more often than Chinese checkers: in particle-physics terms, that is because your football coupling with John is stronger than your Chinese-checkers coupling.

## Result now, explanation later (December 18, 2008)

Posted by dorigo in personal, physics, science.

Tonight I feel accomplished, since I have completed a crucial update of the cornerstone of the algorithm which provides the calibration of the CMS momentum scale. I have no time to discuss the details tonight, but I will share with you the final result of a complicated multi-part calculation (at least, by my mediocre standards): the probability distribution function for measuring the Z boson mass at a certain value $M$, using the four-momenta of two muon tracks which correspond to an estimated mass resolution $\sigma_M$, when the rapidity of the Z boson is $Y_Z$.

The above might (and should, if you are not a HEP physicist) sound rather meaningless, but the family of two-dimensional functions $P(M,\sigma_M)_Y$ is needed for a precise calibration of the CMS tracker. They can be derived by convoluting the production cross section of Z bosons as a function of mass at a given rapidity $Y$ with the proton’s parton distribution functions through a factorization integral, and then convoluting the resulting functions with a Gaussian smearing distribution of width $\sigma_M$.
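The second convolution step, smearing a fixed lineshape with a Gaussian resolution, can be sketched numerically. The toy lineshape below (Breit-Wigner times a $1/m^4$ tilt) stands in for the result of the factorization integral, which in the real calculation uses the actual parton densities; everything here is an illustrative assumption:

```python
import numpy as np

MZ, GZ = 91.19, 2.50   # Z mass and width, GeV

def lineshape(m):
    """Stand-in for the PDF-distorted lineshape: Breit-Wigner times a
    falling toy parton-luminosity factor (the 1/m^4 is an assumption)."""
    bw = m ** 2 / ((m ** 2 - MZ ** 2) ** 2 + (MZ * GZ) ** 2)
    return bw / m ** 4

def smeared_pdf(m_grid, sigma_m):
    """P(M | sigma_M): convolute the lineshape with a Gaussian of
    width sigma_m on a uniform grid, then normalize."""
    dm = m_grid[1] - m_grid[0]
    f = lineshape(m_grid)
    kx = np.arange(-5 * sigma_m, 5 * sigma_m + dm, dm)
    kernel = np.exp(-0.5 * (kx / sigma_m) ** 2)
    p = np.convolve(f, kernel, mode="same")
    return p / (p.sum() * dm)

m_grid = np.linspace(71.0, 111.0, 801)
for s in (1.0, 3.0):
    p = smeared_pdf(m_grid, s)
    print(f"sigma_M = {s} GeV -> peak at {m_grid[np.argmax(p)]:.2f} GeV")
```

Evaluating this for a grid of $\sigma_M$ values in each rapidity bin yields exactly a two-dimensional surface of the kind shown in the next paragraph.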

Still confused? No worry. Today I will only show one sample result, the probability distribution as a function of $M$ and $\sigma_M$ for Z bosons produced at rapidity $2.8 < |Y| < 2.9$, and tomorrow I will explain in simple terms how I obtained that curve and the other 39 I have extracted today.

In the three-dimensional graph above, one axis has the reconstructed mass of the muon pairs, $M$ (from 71 to 111 GeV), the other the expected mass resolution $\sigma_M$ (from 0 to 10 GeV). The height of the function is the probability of observing the mass value $M$ if the expected resolution is $\sigma_M$. On top of the graph one also sees, in colors, the curves of equal probability displayed on a projected plane. It will not escape the keen eye that the function is asymmetric in mass around its peak: that is entirely the effect of the parton distribution functions…

## Arkani-Hamed: “Dark Forces, Smoking Guns, and Lepton Jets at the LHC” (December 11, 2008)

Posted by dorigo in news, physics, science.

As we have been waiting for the LHC to turn on and turn the world upside down, some interesting data have been coming out of astrophysics, and a lot of striking new signals could show up. This motivates theoretical investigations of the origins of dark matter and related issues, particularly in the field of Supersymmetry.

Nima said he wanted to tell the story from the top-down approach: what all the anomalies were, what motivated his and his colleagues’ work. But instead, he offered a parable as a starter.

Imagine there are creatures made of dark matter: ok, dark matter does not clump, but anyway, leaving disbelief aside, let’s imagine there are these dark astrophysicists, who work hard, make measurements, and eventually see that 4% of the universe is dark to them: they can’t explain the matter budget of the universe. So they try to figure out what’s missing. A theorist comes up with a good idea: a single neutral fermion. This is quite economical, and the theory surely receives a lot of subscribers. But another theorist envisions a totally unknown gauge theory, with a broken SU(2)xU(1) group, three generations of fermions, the whole shebang… It seems crazy, but this guy has the right answer!

So, we really do not know what is in the dark sector. It could be more interesting than just a single neutral particle. Since this is a top-down discussion, let us imagine the next most complicated thing: dark matter could be charged. If the gauge symmetry were exact, there would be some degenerate gauge bosons. How does this stuff make contact with the Standard Model?

Let us take a mass of a TeV: everything is normal about it, and the coupling this dark U(1) can have to our sector is a kinetic mixing between the SM gauge fields and the new ones, a term of the form $\frac{1}{2} \epsilon F_{\mu \nu}^{dark} F^{\mu \nu}$ in the Lagrangian density.

In general, any particle at any mass scale charged under both groups will induce such a mixing at one loop through its hypercharge above the weak scale. All SM particles then acquire a tiny charge under the new U(1)', proportional to their electric charge. The size of the coupling could be in the $10^{-3}$–$10^{-4}$ range.
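A quick back-of-the-envelope check that a loop-induced mixing lands in this range (the coupling values and scale ratio below are my assumptions, not numbers from the talk):

```python
import math

# One-loop estimate of kinetic mixing induced by a heavy particle charged
# under both hypercharge and the dark U(1):
#   epsilon ~ g_Y * g_D / (16 pi^2) * log(Lambda / m)
# All numerical inputs here are illustrative assumptions.
g_Y = 0.36        # hypercharge gauge coupling (assumed)
g_D = 0.35        # dark gauge coupling, electromagnetic-like (assumed)
log_ratio = 5.0   # log of the ratio of scales running in the loop (assumed)

epsilon = g_Y * g_D / (16 * math.pi**2) * log_ratio
print(f"epsilon ~ {epsilon:.1e}")  # a few times 10^-3
```

With order-one couplings and a modest logarithm, the estimate naturally falls in the $10^{-3}$–$10^{-4}$ window quoted above.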

This construct would mess up our picture of dark matter, and a lot about our cosmology. And if there are Higgs bosons in this sector, we have the usual hierarchy problem. We know the simplest solution to the hierarchy is SUSY, so we imagine supersymmetrizing the whole thing. There is then the MSSM in our sector, a whole SUSY dark sector, and a tiny kinetic mixing between the two. If the mixing is $10^{-3}$, the breaking of symmetry at a mass scale of about 100 GeV in our sector induces a breaking in the dark world of radiative origin, through loop diagrams, at a mass scale of a few GeV.

So the gauge interaction in the DM sector is broken at the GeV scale: a bunch of vectors, and other particles, right next door. These particles would couple to SM ones proportionally to their charge, at the $10^{-3}$–$10^{-4}$ level. This is dangerous, since the suppression is not large. The best limits on such a scenario come from e+e- factories. It is really interesting to go back and look for these things in BaBar and other experiments: the data is on tape. We might discover something there!

All the cosmological inputs have difficulty with the standard WIMP scenario. DAMA, PAMELA, and ATIC have recently evidenced anomalies that do not fit our simplest-minded picture, but they get framed nicely in this one.

The scale of these new particles is more or less fixed at the GeV region. This has an impact on every way that you look at DM. As for the spectrum of the theory, there is a splitting in masses, given by the coupling constant $\alpha$ in the DM sector times the mass in the DM sector: a scale of the order $\alpha M$. It is radiative. There are thus MeV-like splittings between the states, and new bosons with GeV masses that couple to them. These vectors couple off-diagonally to the DM. This is a crucial fact, simply because if you have N states, their gauge interaction is a subpart of a rotation between them. The only possible interaction these particles can have with the vector is off-diagonal. That gives a cross section comparable to the weak scale.
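The $\alpha M$ numerology can be checked in one line (the specific values of $\alpha$ and of the GeV-scale mass below are my illustrative assumptions):

```python
# Rough numerology for the radiative splitting, delta ~ alpha_D * M,
# with M the GeV-scale mass in the dark sector (values assumed, not from the talk):
alpha_D = 1e-2       # dark fine-structure constant, assumed electromagnetic-like
M_GeV = 1.0          # GeV-scale mass in the dark sector (assumed)
delta_GeV = alpha_D * M_GeV
print(f"splitting ~ {delta_GeV * 1e3:.0f} MeV")  # ~10 MeV, i.e. MeV-like
```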

The particles annihilate into the new vectors, which eventually have to decay. They would be stable, but there is a non-zero coupling to our world; so what do they decay into? Not proton-antiproton pairs, but electron or muon pairs. These features are hard to get with ordinary WIMPs.

And there is something else to expect: these particles move slowly, have long-range interactions and geometric cross sections, and they may go into excited states. Their splitting is of the order of the MeV, which is comparable to their kinetic energy in the galaxy. So, with the big geometric cross section they have, you expect them not to annihilate but to get excited. They decay back by emitting e+e- pairs. That is a source of low-energy electrons and positrons, which explains an integral excess of these particles in cosmic rays.
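The statement that MeV splittings are comparable to galactic kinetic energies is simple arithmetic (the TeV mass and virial velocity below are the usual ballpark values, assumed here for illustration):

```python
# Kinetic energy of a TeV-mass dark matter particle at galactic velocity
# (illustrative numbers): KE = (1/2) M v^2 with v ~ 10^-3 c.
M_GeV = 1000.0      # assumed TeV-scale dark matter mass
v_over_c = 1e-3     # typical galactic virial velocity, ~300 km/s
KE_GeV = 0.5 * M_GeV * v_over_c**2
print(f"kinetic energy ~ {KE_GeV * 1e3:.1f} MeV")  # ~0.5 MeV
```

So a collision between two such particles can indeed supply the MeV needed to reach the excited state.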

If they hit a nucleus, the nucleus has a charge and the vector is light, so the cross section is comparable to Z and Higgs exchange. But the collision is not elastic: it changes the nature of the particle. This changes the analysis you would do, and it becomes possible for DAMA to be consistent with the other experiments.
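A sketch of why inelasticity changes the direct-detection analysis (all numbers below are my illustrative assumptions, using the ~100 keV splitting typically invoked in inelastic dark matter scenarios): only DM particles fast enough to pay the splitting can scatter at all, and the threshold velocity sits right in the galactic range.

```python
import math

# Inelastic-scattering kinematics (illustrative numbers, not from the talk):
# to up-scatter by a splitting delta, the DM-nucleus relative velocity must
# exceed roughly v_min = sqrt(2 * delta / mu), with mu the reduced mass.
m_dm = 1000.0                      # DM mass in GeV (assumed TeV scale)
m_I = 118.2                        # iodine nucleus mass in GeV (DAMA's target)
delta = 1e-4                       # 100 keV splitting, in GeV (assumed)
mu = m_dm * m_I / (m_dm + m_I)     # reduced mass of the DM-nucleus system
v_min = math.sqrt(2 * delta / mu)  # threshold velocity in units of c
print(f"v_min ~ {v_min * 3e5:.0f} km/s")  # a few hundred km/s, galactic scale
```

Because the threshold favors heavy targets like iodine over lighter ones, experiments with different nuclei probe different parts of the velocity distribution, which is how the apparent tension between DAMA and the others can be relaxed.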

Of course, the picture drawn above is not the most minimal possible thing: imagining that dark matter is charged and has gauge interactions is in fact quite far-fetched. But it can give a correlated explanation of the cosmological inputs.

Now, why does this have the potential of making life so good at the LHC? Because we can actually probe this sector sitting next door, particularly in the SUSY picture. In fact, SUSY fits nicely in the picture, while being motivated elsewhere.

This new “hidden” sector has been studied by Strassler and collaborators in hidden valley models. It is the leading way by which a new gauge sector can talk to our Standard Model.

The particular sort of hidden valley model discussed here is motivated if you take the hints from astrophysics seriously. Now, what does it do at the LHC? GeV-scale particles that have gone unseen for thirty years… But that is because there is a price to pay: the tiny mixing.

Now, what happens with SUSY is nice: if you produce superpartners, you will always end up in this sector. The reason is simple: normally particles decay into the LSP, which is stable. But now it cannot be stable any longer, because the coupling gives a mixing between the gaugino in our sector and the photino in theirs. Thus the LSP decays to lighter particles in the other sector, producing more particles there. These particles are light, so they come out very boosted. They make a Higgs boson in the other sector, which decays to a W pair of that sector, and the chain finally ends with its lightest vector: that shows up as an electron-positron pair in our sector.

There is a whole set of decays that gives lots of leptons, all soft in their sector but coming from the decay of a 100 GeV particle. The signature could be jets of leptons, and every SUSY event will contain two: two jets of leptons, each with at least two, if not many more, high-$P_T$ leptons featuring small opening angles and invariant masses. That is the smoking gun. As for lifetime, these leptons are typically prompt, although they might also be displaced; the preferred situation is that they are prompt.
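The collimation of the lepton jets follows from simple kinematics (the parent and dark-vector masses below are my illustrative choices within the ranges quoted above):

```python
# Why the leptons cluster into "jets" (illustrative kinematics): a GeV-scale
# dark vector emitted in the decay of a ~100 GeV superpartner carries an
# energy of order half the parent mass, so it is highly boosted.
m_parent = 100.0   # GeV, assumed mass of the decaying superpartner
m_vector = 1.0     # GeV, assumed dark vector mass
gamma = (m_parent / 2) / m_vector   # Lorentz boost of the emitted vector
theta = 2.0 / gamma                 # typical opening angle of its e+e- pair, in radians
print(f"gamma ~ {gamma:.0f}, opening angle ~ {theta:.2f} rad")
```

A boost of order 50 collimates each decay into a cone of a few degrees, which is why the signature looks like a "jet" of leptons with small pairwise invariant masses rather than isolated leptons.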