
At a trigger meeting 15 years ago… April 27, 2007

Posted by dorigo in personal, physics, politics, science.

David asks me to recall the story behind the famous sentence uttered by Melissa Franklin at a CDF trigger meeting 15 years ago. I comply below. By the way, I have to clarify here that I admire Melissa as a very brilliant physicist and I love her – so if you are looking for anything against her here, look elsewhere.

Anyway. Fifteen years have passed, but I do remember the whole thing quite well, since at that very meeting I gave my first presentation ever in CDF – also, my first presentation in English (which I did not speak fluently back then). To tell the story, I need to provide quite a bit of background information below… Read at your own risk.

We were at the start of Run 1a, and CDF had begun collecting the datasets which would prove decisive for the top quark discovery. We knew that we had a chance: the top had to be there, definitely within our reach. We just had to collect as much data as we could, and be reasonably smart in designing algorithms for b-tagging (the crucial ingredient for beating down the W+jets background and allowing the top-pair production signal to emerge), so people’s attention to the quality of data collection was really high.

A parenthesis here: if you take an old enough version of the Review of Particle Properties (I religiously keep a 1994 version in my office), you will find out that in 1992 the first precise bracketings of the top mass from fits to electroweak observables were emerging: a LEP global fit quoted Mt = 132 ± 35 GeV [Phys. Lett. B276 (1992), 247], while a PDG fit quoted Mt = 150 ± 30 GeV [Phys. Rev. D45 (1992), 1]. A 150 GeV top would have a cross section of about 15-20 picobarns, equivalent to a rate of about one golden top pair event per inverse picobarn of collected data: given expected backgrounds, 20 inverse picobarns could suffice for an observation [and 20 were on the agenda for Run 1a in 1992-93]!
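
To make the arithmetic behind that last claim explicit (the round factors below are my own illustrative choices, not numbers from those 1992 papers): with a production cross section of 20 pb, a lepton-plus-jets branching fraction of about 0.3, and a combined trigger, selection, and b-tagging efficiency of, say, 0.15, the effective cross section is

sigma_eff ≈ sigma × BR × eff ≈ 20 pb × 0.3 × 0.15 ≈ 1 pb,

i.e. roughly one golden candidate per inverse picobarn of collected data.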

And the Tevatron was delivering. The accelerator experts at Fermilab had a strategy of small incremental improvements in the understanding of the beam, and the rate of collisions inside CDF was growing slowly but steadily. Which made all of us happy, and paranoid. People responsible for the safety of the brand new silicon vertex detector, which had just been installed at the core of the apparatus, were losing sleep keeping watch for beam instabilities, which had the potential of cooking the inner layers of the precious device. Everybody was working as one to take the data and knock the top quark off the list of unknown entities. It was nice to be there and see it happening.

Among the things that required continuous attention, the Level-1 trigger was at the top of the list. Of course, a steadily increasing luminosity meant that continuous adjustments to the menu of conditions used to filter events had to be made. Tighter and tighter cuts had to be placed on the data, since our capability to write data to tape was unfortunately not increasing with time.

Now, the trigger had a crucial task to perform. During Run I the CDF trigger received 300,000 collisions every second, and could only write 30 or so to disk. Surprising as it may sound, that is fine, since most of the collisions are not very interesting: you already know the physics of those 299,970 events you are going to throw away, and want to collect the very few that provide new insight, the highest-energy collisions, those which could signal the creation of new particles or allow the measurement of ill-studied quantities at the forefront of particle physics.

So three lines of defense were organized. At the forefront stood Level-1, where only a few microseconds were available to decide whether an event was worth passing on to the second line, Level-2. Level-1 had to discard 99 events out of every 100: otherwise Level-2 would not be able to perform its analysis on every event passed to it, and the result would be the dreaded dead time, to be avoided at all costs. Level-2 received only a few thousand events per second, and could take several tens of microseconds to decide whether each event was worth keeping, by running a fast reconstruction with custom hardware. Events passing the Level-2 filter went on to Level-3, which saw an input rate of only about a hundred events per second, and had the time to reconstruct them fully with time-optimized software algorithms, apply a very well tuned selection, and send the 30 or so fortunate survivors to the storage devices.
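
For those who like to see the numbers play out, here is a minimal sketch of such a cascade in Python. The input and output rates are the round values quoted above; the per-level rejection factors and the 50-microsecond Level-2 decision time are my own illustrative assumptions, not CDF specifications.

```python
# Back-of-the-envelope model of a three-level trigger cascade.
# Round numbers from the text: 300,000 collisions/s in, ~30 Hz to tape.
# The rejection factors below are illustrative guesses, not CDF values.

def cascade(input_rate_hz, rejection_factors):
    """Propagate an input rate through successive trigger levels,
    each keeping one event out of every `rejection` it receives."""
    rates = [input_rate_hz]
    for rejection in rejection_factors:
        rates.append(rates[-1] / rejection)
    return rates

# Level-1 keeps ~1/100, Level-2 ~1/30, Level-3 ~30/100 (illustrative).
rates = cascade(300_000, [100, 30, 100 / 30])
for name, rate in zip(("input", "after Level-1", "after Level-2", "to tape"), rates):
    print(f"{name:>14}: {rate:9.0f} Hz")

# Dead time: if Level-2 is blind for t seconds while examining each
# event, its busy fraction is roughly rate_in * t (for rate_in * t << 1).
l2_rate = rates[1]   # ~3000 Hz into Level-2
l2_time = 50e-6      # "several tens of microseconds"
print(f"approximate Level-2 dead time: {l2_rate * l2_time:.0%}")
```

With these round numbers, a Level-2 that spends fifty microseconds per input event is busy about 15% of the time, which is exactly the scale of dead time that comes up later in this story.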

Level-1 used almost raw information from the fastest parts of the detector to perform its task. One of the requirements that would allow an event to pass was the presence of a localized energy deposit in one of the elements of the calorimeter – a “calorimeter tower”. Energetic towers in the calorimeter occur quite often in proton-antiproton collisions, so the threshold on their energy has to be high if one wants to be selective. And the thresholds on forward (as opposed to central) calorimeter towers were the ones on trial at that trigger meeting of late November 1992.

Enter myself. I had studied how an increase of the threshold from 10 to 51 GeV (basically the highest possible value short of switching that data stream off) would affect the collection of top pair decays to six jets, a channel nobody at the time believed useful for top physics (we had a few of them change their minds in the following years, ha!). My conclusion was that, since our candidate top events would be collected more efficiently by other selections, raising the forward thresholds would not affect top physics.
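
To give a flavor of what such a study amounts to, here is a toy version in Python. The event model is entirely invented for illustration (an exponentially falling forward-tower spectrum, and a 95% chance that a top event already fires some central trigger); it is not the 1992 analysis.

```python
import random

random.seed(1)

def toy_top_event():
    """Return (max forward tower E_T in GeV, fires_central_trigger)."""
    forward_et = random.expovariate(1 / 15.0)  # steeply falling spectrum (toy)
    fires_central = random.random() < 0.95     # top decays are mostly central (toy)
    return forward_et, fires_central

events = [toy_top_event() for _ in range(100_000)]

# An event is collected if a central trigger fires OR its forward
# tower passes the forward threshold under study.
for threshold in (10.0, 51.0):
    kept = sum(1 for et, central in events if central or et > threshold)
    print(f"forward threshold {threshold:4.0f} GeV: "
          f"{kept / len(events):.1%} of top events collected")
```

The toy makes the same point I made at the meeting: when the other selections already collect nearly all of the signal, even a drastic raise of the forward threshold costs top physics almost nothing.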

And I did present those results, in broken English and with trembling legs, in the glorious “pump room” of the CDF building, next to the control room. A space of eight by four meters, with a long table in the middle, chairs everywhere, and a screen on one side. At that meeting, I think no fewer than sixty people had crammed into it. The smell of sweat, heavy breaths, and top-level physics could be cut with a knife.

There ensued a discussion. The trigger conveners (Hans Jensen used to chair the meeting back then) said that we either raised those thresholds, or we would be throwing away events randomly through the resulting dead time – thereby cutting into all trigger streams democratically, but also reducing our discovery reach for the top quark. I think the Level-2 dead time we were discussing was at the level of 15% or so.
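
(To put that figure in perspective with the back-of-the-envelope rate worked out earlier, which assumed about one golden top event per inverse picobarn: 0.15 × 20 pb^-1 × 1 event/pb^-1 ≈ 3 golden events lost over a Run 1a-sized dataset. Keep that number in mind for the end of the story.)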

The decision was not easy to take, though, because in the early days of CDF, QCD studies were all the rage. QCD, the theory of strong interactions, was not as well established back then as it is now. The data CDF was collecting was being used to test and tune our models. Forward jets, which the raised thresholds would eliminate from our bounty of data saved to tape, were very useful to understand the structure of the proton and the soft-limit regime of QCD.

Monte Carlo generators needed those data for tuning. Theoreticians needed those data for a better understanding of parton distribution functions. Diffractive physics enthusiasts needed those data for their analyses of Pomeron exchange. And there they all were, in person or in spirit, arguing endlessly against a brick wall: we cannot afford dead time.

CDF was meant to find the top quark. You could not argue against that. And anything standing between the experiment and the discovery was going to be swept away.

Melissa was very interested in the forward jet data. She had been studying QCD with it. But there was more: the forward calorimeters were her baby. She had built the darn thing – a gas calorimeter, one of the last attempts at that kind of device before plastic scintillators and even fancier technologies took over. And raising the threshold in her detector was like kissing it good-bye: very little would be done with it from then on. People were not using it for high-energy analyses triggered by central objects anyway, and the lack of a specifically triggered dataset would mean oblivion.

I do not remember for how long the discussion went on, but I remember a tense, heavy atmosphere. Everybody knew everybody else well, and the cards were on the table: the decision to be taken was a political one, but compromise was not an option. And the decision was to raise the thresholds.

Melissa rose from the table, looked around in a rage, and explained why she was disgusted by the decision. Then she made her colorful wish, and left the room.

I must say I understood Melissa’s arguments, and at the meeting it was not completely clear to me which was the right decision to take. But in retrospect, I think chances are that we would have been unable to publish the 1994 “top quark evidence” paper if we had kept running with a significant dead time at Level-2: our first evidence was convincing in some ways, but arguably hanging by a thread. Just a few lost events, and CDF would now be unable to claim it saw the top quark first, after all.

Comments

1. Andrea Giammanco - April 27, 2007

Right, but it’s unfair to judge the goodness of a past decision according to our present knowledge.
An example from history: the people searching for the tau lepton at the ADONE collider in Frascati fought hard to increase the center-of-mass energy as much as possible.
The committee in charge of the decision, as far as I was told, judged that the resulting penalty to all the other standard analyses (which were less fancy than the search for a new particle, but very important for understanding the details of e+e- collisions) was excessive.
When, immediately after, other colliders discovered the J/psi and the tau lepton (at energies “just around the corner” for ADONE), everybody in retrospect judged this a very unwise decision.
It’s always a gamble…:)

2. dorigo - April 27, 2007

Agreed. But what is fair is to evaluate the wisdom of those who argued in one direction or the other. Maybe the ones arguing for a higher-energy run of ADONE were seers, maybe they were just guessing right by pure chance. But for sure they deserve some post-mortem credit… As for the naysayers in CDF who cut physics datasets to avoid even a minor loss of high-Pt data, well… They were right too.

Cheers,
T.

3. Sean Carroll - April 27, 2007

Great post, Tommaso. Thanks for the inside story! It’s the difficult physics decisions that make it interesting, although the personal flair is always fun.

4. D - April 27, 2007

Can you link to the earlier post? After reading all that, I really want to know what was said.

5. David Heffernan - April 28, 2007

Hey Tommaso, thanks for the full story. These trigger meetings sound like a lot of fun, but I’m glad I’m not one of those who have to fight to get their pet analysis prioritized. I’ve heard a few more good stories from disgruntled people studying Bs oscillations, who were passed over in favour of the people who wanted to prioritize the Higgs triggers.

Glad it’s not as much of an issue on a lepton machine, although we still get to see people argue over the benefits and trade-offs of taking data at the Y(4S) versus the Y(5S) and so on.

6. Andrea Giammanco - April 29, 2007

Well, the CERN director general had the same kind of problem in November 2000, when he had to decide whether the possible Higgs bump in the LEP data justified an extension of the existing accelerator’s run, i.e. a delay for the LHC.
Now that the significance of that bump has decreased from 3 to 2 sigmas, almost everybody agrees that he made the wisest choice. At the time, almost everybody (apart from Tevatron people, isn’t it strange?;)) thought he was making a catastrophic choice.
If the Higgs is found at the same mass as that bump, everybody will probably remember him as “the guy who slowed human knowledge by N years” (with N>7).

