
Latest LHC schedule and luminosity for 2008
May 9, 2008

Posted by dorigo in news, physics, science.

Here is an excerpt of the latest LHC schedule for the following few months, as agreed in a meeting at CERN chaired by the Director-General, with the experiments and LHC machine heads.

Based on the good progress for the cool down of the LHC sectors, and on the powering tests from two sectors, the following planning was arrived at:

  1. End of June: The LHC is expected to be cooled down. […]
  2. Mid of July: The experimental caverns will be closed […]
  3. End of July: First particles may be injected, and the commissioning with beams and collisions will start.
  4. It is expected that it will take about 2 months to have first collisions at 10 TeV.
  5. Energy of the 2008 run: Agreed to be 10 TeV. The machine group considers this to be a safe setting to optimize up-time of the machine until the winter shut-down (likely starting around the end of November).[…]
  6. The winter shut-down will then be used to commission and train the magnets up to full current, so that the 2009 run will start at the full 14 TeV design energy.

The above means that the machine will deliver collisions from the end of September on, for at most nine weeks in 2008. More conservatively, one can assume 6 full weeks of data-taking. What luminosity do we expect to collect?

A state-of-the-art estimate was made by a colleague, who used his past experience with LEP as well as information on the current limitations of the RF system, which will make the proton bunches shorter than planned (RMS of 5.4 cm), with a transverse size of 46 microns. At the lower energy the low-beta squeeze will also be loosened from 2 to 3 meters. These figures reduce the instantaneous luminosity, and the estimate for 6 weeks of collisions is about 40 inverse picobarns of data in 2008.
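As a rough cross-check that 40/pb is at least plausible, here is a back-of-envelope sketch of my own (not the colleague's calculation): the duty factor and average instantaneous luminosity below are assumed round numbers for an early commissioning run, not machine-group figures.

```python
# Back-of-envelope integrated luminosity for ~6 weeks of early running.
# duty_factor and avg_lumi are assumptions, not official LHC numbers.
SECONDS_PER_WEEK = 7 * 24 * 3600

weeks = 6
duty_factor = 0.5        # assumed fraction of calendar time with stable beams
avg_lumi = 2e31          # assumed average instantaneous luminosity, cm^-2 s^-1

live_seconds = weeks * SECONDS_PER_WEEK * duty_factor
integrated_cm2 = avg_lumi * live_seconds     # integrated luminosity in cm^-2
integrated_pb = integrated_cm2 / 1e36        # 1 pb^-1 = 1e36 cm^-2

print(f"~{integrated_pb:.0f} pb^-1")
```

With these assumed inputs one lands in the few-tens-of-inverse-picobarns range, consistent with the 40/pb estimate.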

If ATLAS and CMS are fully operational during the weeks of collisions, these 40 inverse picobarns will yield, in my opinion:

  • A top pair production cross section with 10-15% accuracy
  • A sizable sample of vector boson decays to leptons, very useful for calibrations and checks of lepton efficiency studies
  • The first estimates of b-tagging and tau-tagging capabilities of current algorithms
  • no information on the Higgs
  • no SUSY discovery (of course!)

All the above will have a chance of being ready for the 2009 winter conferences, if all goes well…


1. Andrea Giammanco - May 9, 2008

Can you please elaborate on your prediction for ttbar x-section accuracy?
Does it take into account only one experiment or the combination of both, only one channel or a combination of all?
Did you scale all the backgrounds properly? (Remember that the ttbar/QCD ratio will scale differently from the ttbar/Wjets ratio, etc.) Or, vice versa, are you neglecting background contamination and only considering what you get for statistical precision with some reasonable cut efficiency?
(Don’t be intimidated by the fact that I’m currently working on a ttbar x-section measurement optimized for 14 TeV, I’m sure that your reasoning will provide me with food for thought :))

2. Latest LHC schedule - May 9, 2008

[…] Read more here. […]

3. dorigo - May 9, 2008

Hi Andrea,

of course I am intimidated! You know the topic of ttbar xs measurements better than I do!

I can only tell you what my reasoning is: two days ago we approved in CMS an analysis based on 100/pb of data taken at 14 TeV, which claimed a S/N of the order of 90/1 in the dilepton final state with b-tagging, and a xs measurement with a statistical accuracy of 8%.

At 10 TeV I do not expect things to be very different. The S/N can be tuned to be close to that value by changing cuts, at the expense of statistics. Say you pay a factor of 1.5 in statistics, and add the x2.5 reduction going from 100 to 40/pb: that is roughly a factor of 4 in statistics, or a factor of 2 in the statistical error -> 16%. The systematics will be dominated by efficiency uncertainties, b-tagging uncertainties, and luminosity. I take these to be at most another 15%. So we are looking at a 20% measurement.
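The scaling in the paragraph above can be written down explicitly. This is just the same arithmetic in code form: the 8% statistical error of the approved 100/pb analysis is inflated by the square root of the event-count penalty, then combined in quadrature with the quoted 15% systematics.

```python
import math

# Scale the approved 14 TeV / 100 pb^-1 dilepton result to the 2008 scenario.
stat_err_14tev = 0.08    # statistical error of the approved 100/pb analysis
stats_penalty = 1.5      # assumed event loss from retuning cuts at 10 TeV
lumi_ratio = 100 / 40    # 2.5x fewer events going from 100/pb to 40/pb

events_factor = stats_penalty * lumi_ratio           # ~3.75, "a factor of 4"
stat_err = stat_err_14tev * math.sqrt(events_factor) # error grows as 1/sqrt(N)

syst_err = 0.15          # efficiency + b-tagging + luminosity, as quoted
total_err = math.hypot(stat_err, syst_err)           # combine in quadrature

print(f"stat ~{stat_err:.0%}, total ~{total_err:.0%}")
```

The result is a ~15-16% statistical error and a total error just above 20%, matching the "20% measurement" conclusion.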

Then, one takes the single lepton final state. We approved a result on this channel as well two days ago, but we only have signal and background numbers to extrapolate from in that case. It looks like the single lepton channel should provide a measurement with better statistical accuracy and similar systematics (background systematics are not a concern until one pushes accuracy below 10%).

Overall, the combination could indeed provide a 10-15% measurement with 2008 data. That, however, depends on how much work we invest in these studies. Even though 2009 data might make us eager to do other things with our time, I believe the determination of the ttbar xs at 10 TeV is an important measurement: we may never again have that particular beam energy. So I predict that for cross sections we will indeed squeeze out an accurate determination.


4. Peter Woit - May 9, 2008

Hi Tommaso,

If there really are strongly interacting superpartners with masses just out of reach of the Tevatron, how hard will it be for the LHC to see them? Will 40 inverse picobarns and a few months to analyze them be enough?

5. dorigo - May 9, 2008

Hi Peter,

well, in some special points of the parameter space you would indeed get cascades of gluino decays with a striking signature. The cross sections are large – for the LM1 test point considered in the CMS physics TDR (M_0=60, M_1/2=250, tan(beta)=10, sgn(mu)=+, A0=0) the cross section is 49 pb at LO for 14 TeV cm energy, with squark mass of 560 GeV and gluino mass of 611 GeV.

Accounting for trigger (missing Et>200 GeV) and reconstruction efficiencies (13% total), and backgrounds (mainly from ttbar and QCD) this would provide a 5-sigma observation with just 6/pb at 14 TeV, and probably still less than 40/pb at 10 TeV.
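The quoted numbers can be turned into a quick event count. The background estimate below is my own illustrative placeholder, not a number from the CMS physics TDR; the cross section and efficiency are the ones quoted above.

```python
import math

# Rough event-count check for the LM1 numbers quoted above (14 TeV, LO).
sigma_pb = 49.0     # LM1 production cross section at LO, pb
efficiency = 0.13   # trigger (missing Et > 200 GeV) x reconstruction efficiency
lumi_pb = 6.0       # integrated luminosity, pb^-1

n_signal = sigma_pb * efficiency * lumi_pb   # expected signal events

# Illustrative placeholder background from ttbar + QCD (an assumption,
# not a TDR number):
n_bkg = 10.0
significance = n_signal / math.sqrt(n_bkg)   # naive S/sqrt(B) estimate

print(f"S ~ {n_signal:.0f} events, S/sqrt(B) ~ {significance:.1f} sigma")
```

With ~38 signal events over a handful of background events, the naive significance is comfortably above 5 sigma, consistent with the claim that 6/pb would suffice at 14 TeV.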

Overall, if those 40/pb were all the LHC got, it would be quite enough to see a SUSY signal just out of Tevatron reach (the current limit on gluinos is about 350 GeV). But how lucky do we think we are going to be? The argument that you will win big money by betting just one more chip, after you have squandered your whole pile, is always tempting but seldom a true picture of reality.


6. arcadianfunctor - May 10, 2008

I have one question: assuming clear signals, fast analysis, etc., with only 40/pb at 10 TeV, how long do we have before you guys release any new (i.e. maybe non-SUSY) mass spectra?

7. dorigo - May 10, 2008

Hi Kea,

CDF, after years of running, manages to provide calibrated and reconstructed data to analyses within two to three months. I expect there will be several initial issues (although the trigger should not be one, since we will be taking data at very low rates, while still saturating our processing bandwidth).

On the other hand, past experience at the Tevatron will help LHC experiments, and indeed some analyses have been worked out in detail with Monte Carlo simulated samples.

All in all, the winter 2009 conferences appear to me as a really ambitious, although possible, goal for the very earliest distributions.


8. DB - May 10, 2008

I’m with Martinus Veltman: “SUSY is always just around the corner. It’s been like that for 25 years”

9. Coin - May 10, 2008

Question– Is there any theoretical reason to expect the Higgs is lighter than the lightest supersymmetric partner, or vice versa? In short, is it known or can it be expected which will be found first, the Higgs or the LSP?

I seem to remember reading somewhere that the LSP, if it exists, was expected to be heavier than the Higgs*, but I encountered someone this week who believed that SUSY would be found before the Higgs, and I was wondering if that was in fact possible/probable.

* I may have been confused about the difference between the Higgs and Higgsino here.

10. dorigo - May 11, 2008

Hi Coin,

I may not be the right person to ask this question, but as far as I recall there are no reasons for a constraint one way or the other. In some models the LSP is lighter than half the Higgs mass, so that the h \to \chi \chi decay is kinematically allowed, subtracting from the observable decay modes of h and making the discovery of the latter harder, but this is only a detail.

A totally different question, which should not be confused with the issue of mass values, is the discovery reach. It is simply false that the heavier a body is, the harder it is to find. That is only true at electron-positron accelerators, where we know what \sqrt s is. A cascade of squarks and gluinos to LSPs, leptons, and jets would be very easy to see at the LHC even with gluino masses of 600 GeV, because gluinos interact strongly and have a large cross section. A Higgs, instead, might be harder to see even at 180 GeV, where the decay to four muons or four electrons would be a really clean signature. It is a matter of production rates, and masses are only part of the equation there.
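The rates argument can be made concrete with round numbers. The cross sections and efficiencies below are illustrative assumptions of my own (the SUSY numbers echo the LM1-like figures quoted earlier in the thread; the Higgs numbers are a rough stand-in for a ~180 GeV H \to 4 leptons search), not official CMS values.

```python
# Illustrative rate comparison: a strongly produced ~600 GeV gluino pair
# can out-rate a 180 GeV Higgs in a clean channel, despite being heavier.
# All cross sections and efficiencies are assumed round numbers.
lumi_pb = 40.0  # integrated luminosity assumed for 2008, pb^-1

processes = {
    "SUSY cascade":   {"sigma_pb": 49.0, "eff": 0.13},    # LM1-like point
    "H -> 4 leptons": {"sigma_pb": 20.0, "eff": 0.0003},  # assumed sigma x BR x eff
}

counts = {}
for name, p in processes.items():
    counts[name] = lumi_pb * p["sigma_pb"] * p["eff"]  # expected events
    print(f"{name}: ~{counts[name]:.1f} expected events in 40/pb")
```

With these assumptions the heavy SUSY cascade gives hundreds of events while the lighter Higgs gives a fraction of one: production rate, not mass, decides visibility.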


11. Coin - May 14, 2008

Dorigo, thanks!

12. Not Even Wrong » Blog Archive » News From CERN and Fermilab - May 23, 2008

[…] produce total luminosity of “tens of pb-1“. Tommaso Dorigo predicts 40 pb-1, see more here. Also, don’t miss his series of recent posts from PPC 2008 giving the best blogging from a […]
