A shot in the dark? October 6, 2007. Posted by dorigo in personal, physics, science.
Yesterday I was in Bari, where I attended a meeting with some members of the Italian CMS-tracker group. At the meeting we discussed a research project aimed at developing new technologies for the upgrade of the detector to whose construction our group heavily contributed: the silicon tracker.
The silicon tracker of CMS (in the picture you can see its inner layers, the TIB - Tracker Inner Barrel - being assembled last year) is a fantastic device, and we confidently expect it to deliver what it was designed to. So why are we thinking about upgrading it, when it has seen nothing more than cosmic rays so far?
Well, experimental physicists specialized in the design of particle experiments have learned to think ahead – ten, fifteen years ahead: that is the time scale of today’s experiments. The Large Hadron Collider (LHC), which will provide proton-proton collisions at 14 TeV to the CMS and ATLAS experiments from next year onwards, cannot be easily upgraded to increase the beam energy it provides, but its instantaneous luminosity can be pushed up by a full order of magnitude by increasing the number of protons and the interaction rate.
Instantaneous luminosity is a quantity basically obtained by multiplying the number of protons circulating in each beam by the revolution frequency f, and dividing by the area of the beams at the interaction point: L = N_1 N_2 f / A. The larger the number of particles in each beam and their crossing rate, and the smaller the area where they interact, the higher the resulting luminosity. Luminosity is thus directly related to the production rate of a rare process: if we call sigma the cross section for the process, the rate is simply R = L sigma.
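As a quick sanity check, one can plug the nominal LHC design parameters into this formula (the numbers below are approximate design values - bunch intensity, number of bunches, revolution frequency, transverse beam size - used here purely for illustration) and verify that the advertised luminosity of about 10^34 cm^-2 s^-1 comes out:

```python
import math

# Approximate nominal LHC design parameters (illustrative values)
N = 1.15e11        # protons per bunch
n_bunches = 2808   # number of colliding bunches
f_rev = 11245.0    # revolution frequency, Hz
sigma_beam = 16.7e-4  # transverse beam size at the interaction point, cm

# For round Gaussian beams, L = N^2 * n_b * f / (4 * pi * sigma^2)
L = N**2 * n_bunches * f_rev / (4 * math.pi * sigma_beam**2)
print(f"L = {L:.2e} cm^-2 s^-1")   # of order 1e34

# Event rate R = L * sigma for a process with a 1 picobarn cross section
sigma_process = 1e-36  # 1 pb in cm^2
R = L * sigma_process
print(f"R = {R:.2e} Hz")
```

A tenfold luminosity increase would raise R by the same factor, which is the whole point of the upgrade.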
If the LHC works and finds new physics - that is a big if, but let me finish the sentence - an increase in its beam luminosity is useful in many scenarios: more luminosity means more data, a larger discovery potential, and smaller statistical errors on all measurements. Of course, after twenty years spent designing and building the LHC and its experiments, running for only a few years and then selling the pieces as scrap metal is not the best option! Better to think of an upgrade which can extend the lifetime of the experiments and increase their discovery potential, if possible.
While in principle an increase in beam luminosity is a good thing, the detectors that are to benefit from it must come prepared, for several reasons.
First, a higher luminosity causes larger fluences of particles through the detectors, which thus have to withstand more radiation damage. Of course, the radiation dose falls roughly as the square of the distance from the interaction point, so the closest devices must be the toughest. The silicon microstrip and pixel detectors of CMS are radiation hard, but their tolerances do not allow for an order-of-magnitude increase in dose.
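The inverse-square scaling makes the hierarchy of the problem concrete. A two-line sketch (the radii below are round illustrative numbers, not the actual CMS geometry) shows how much harsher life is for the innermost layers:

```python
# Toy inverse-square scaling of particle fluence with radius.
# Radii are illustrative round numbers, not the real CMS layer positions.
def relative_fluence(r_cm, r_ref_cm=4.0):
    """Fluence at radius r relative to a reference layer at r_ref."""
    return (r_ref_cm / r_cm) ** 2

print(relative_fluence(4.0))    # a layer a few cm from the beam: 1.0
print(relative_fluence(110.0))  # an outer strip layer: about a thousandth
```

So an order-of-magnitude luminosity increase hits the innermost pixel layers hundreds of times harder than the outer strips.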
Second, the 100 interactions occurring every 12.5 nanoseconds in the core of the detectors during SLHC operation will cause a large occupancy of the detector components closest to the interaction region: this imposes constraints on a design that must maintain the current tracking performance. One would not like to collect ten times more data if the deal involved a detector performing much worse!
Third, muons - particles which are crucial for several searches of new physics - require a very good momentum resolution at trigger level, in order to allow an efficient filtering of the few interesting events among the 80 MHz of interactions. This is because the probability of finding a true muon falls steeply as its transverse momentum increases. We are interested in the high-momentum ones (from the decay of W and Z bosons, or Higgs bosons too), and so we select them with a cut p_T > X GeV, a threshold dictated by our limited ability to write events to tape. Now, if there is even a slight chance - because of insufficiently good momentum resolution - that we mistake a low-momentum muon for one passing the threshold, we are dead: there are so many more muons at lower momentum that, whatever threshold X we set, we will collect mostly the low-momentum ones, filling our tapes with uninteresting stuff.
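The effect is easy to demonstrate with a toy Monte Carlo. The spectrum shape (a pT^-5 power law), the 20 GeV threshold, and the resolution values below are all made-up illustrative numbers, not CMS specifications; the point is only the mechanism, namely that smearing a steeply falling spectrum promotes a flood of low-momentum muons above the cut:

```python
import random

random.seed(1)

# Toy steeply falling spectrum: dN/dpT ~ pT^-5 above 4 GeV
# (hypothetical shape, sampled via the inverse CDF)
def sample_pt():
    u = random.random()
    return 4.0 * (1.0 - u) ** (-1.0 / 4.0)

THRESHOLD = 20.0  # GeV, a hypothetical trigger cut

def trigger_purity(resolution):
    """Fraction of triggered muons whose TRUE pT exceeds the threshold,
    for a given relative momentum resolution at trigger level."""
    passed = true_high = 0
    for _ in range(200_000):
        pt = sample_pt()
        measured = pt * random.gauss(1.0, resolution)  # Gaussian smearing
        if measured > THRESHOLD:
            passed += 1
            if pt > THRESHOLD:
                true_high += 1
    return true_high / passed

p_good = trigger_purity(0.02)  # good resolution: mostly genuine high-pT muons
p_bad = trigger_purity(0.30)   # poor resolution: tapes fill with low-pT fakes
print(p_good, p_bad)
```

With 2% resolution nearly everything above the cut is a genuinely high-momentum muon; with 30% resolution the majority of triggered muons are low-momentum ones smeared upward, exactly the disaster described above.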
A decision on a design for the upgrade of the CMS tracker that addresses all these issues will soon have to be made. There are quite a few subtleties, many issues involving cost effectiveness, redundancy, triggering capabilities of the tracker alone, and so on. However, what I feel is most urgently needed is a preliminary assessment of the physics the upgrade will address. Nobody's fault: it is simply impossible to make a really meaningful case for a LHC upgrade right now, if you ask me. That is: an upgrade is certainly a good idea, but which one, well…
Indeed, deciding now on an upgrade of an LHC experiment seems to me a shot in the dark. We can only guess whether CMS and ATLAS will discover the Higgs, and whether they will find SUSY. We can argue about the likelihood of new, unpredicted discoveries. But what one would need in order to decide whether design A is better or worse than design B, C, D…, in a scenario where the LHC runs for a few years with a tenfold increase in luminosity, is a clear idea of what we would gain in the study of a few flagship signals. And knowing whether SUSY is there or not makes a hell of a difference.
Sure, we could make a physics case with stuff we are almost certain we will find, such as the Higgs. To me that would be more meaningful than trying to sell a better determination of the full spectrum of SUSY particles or large extra dimensions. An example: would we be able to see exclusive Higgs production, and thus measure that particle's quantum numbers, only with design A?
Knowing that would be something… But it looks unimpressive to somebody who has to decide whether to fund a multi-million-dollar project!