Physics Highlights – May 2009 June 2, 2009Posted by dorigo in news, physics, science.
Tags: CDF, DZERO, Fermi, heavy quarks, Hess, QCD, Randall, standard model
Here is a list of noteworthy pieces I published on my new blog site in May. Those of you who have not yet updated your links to point there might benefit from it…
Four things about four generations -the three families of fermions in the Standard Model could be complemented by a fourth: a recent preprint discusses the possibility.
Fermi and Hess do not confirm a dark matter signal: a discussion of recent measurements of the electron and positron cosmic ray fluxes.
Nit-picking on the Omega_b Discovery: A discussion of the significance of the signal found by DZERO, attributed to an Omega_b particle.
Nit-picking on the Omega_b Baryon -part II: A pseudoexperiments approach to the assessment of the significance of the signal found by DZERO.
The real discovery of the Omega_b released by CDF today: Announcing the observation of the Omega_b by CDF.
CDF versus DZERO: and the winner is…: A comparison of the two “discoveries” of the Omega_b particle.
The Tevatron Higgs limits strengthened by a new theoretical study: a discussion of a new calculation of Higgs cross sections, showing an increase in the predictions with respect to the numbers used by the Tevatron experiments.
Citizen Randall: a report on the conferral of honorary citizenship of Padova on Lisa Randall.
Hadronic Dibosons seen -next stop: the Higgs: A report of the new observation of WW/WZ/ZZ decays where one of the bosons decays to jet pairs.
!! Moving !! April 15, 2009Posted by dorigo in news, personal, travel.
Effective today: after forty months of service on wordpress, my blogging activity is moving to scientific blogging, an excellent site which collects many top-notch science writers. As soon as I manage to work it out, I will provide here a widget with links to my posts on that site, and possibly I will keep this site from being reported as mature by making a monthly entry with a link to the most interesting read in the new site. Other necessary adjustments are also in order, fixing some external links etcetera. But really, if you love me, please follow me there now.
I know, I can almost hear some of you complaining about this uncalled-for, unanticipated decision: it has indeed been a quick resolution, and quite uncharacteristically the idea was not submitted here in advance, to make a dry run and hear your opinions on the plan before it became a fact.
So, why am I doing it ? There are several answers to this question, and you may pick the one you like the most.
1) Blogging for me is about reaching as large an audience as possible, because I do conceive it as an educational mission. I have repeatedly explained here that I do not feel guilty for the time I invest in blogging, because my research position does not require me to teach, and I am glad to distribute through the internet some of the knowledge I have had the chance to accumulate through my studies and my job. Now, the target site of this move will increase my reach to potential readers, especially ones who might be interested in particle physics but have so far not realized it is within their grasp. This by itself is a valuable asset.
2) While this blog has served my large ego extremely well, providing me with surprising opportunities and gratifications over the course of these forty months, I have realized that in its present form its further growth is problematic. Moving to scientificblogging.com will supply fresh air, new ideas, and a possibility to interact more closely with a stimulating crew of writers.
3) The real question is, why not ? I am not losing paternity or rights on my writings, the interface and functionalities at the new site are no worse, and the freedom to write what I like is assured to stay unchanged. There are paid ads there, that is true; however, I have decided to trust the owner of the site that they will remain as non-invasive and reasonable as they look now. As for the slight loss of control that the move entails, there are a couple of things to say against wordpress itself too, for instance the obligatory, auto-generated “possibly related” links that are supplied to every post, or the arbitrary takedown of this site that I experienced because of my failure to delete spam comments.
4) As for money, that is really not the reason. No, really. I will indeed get paid at the new site, but the sum I will earn is not going to change appreciably the depth of my pockets. Rather, I see the higher visibility I am expecting there as a reward, one which may help any future endeavour I entertain myself fantasizing about. Like writing a book.
So, it is your turn now to speak up and tell me what you think of this. I know, nobody likes change. But change is a fundamental ingredient of life, and specifically, one associated with growth.
Testing the Bell inequality with Lambda hyperons April 14, 2009Posted by dorigo in news, physics, science.
Tags: bell inequality, quantum mechanics, quantum optics, stern gerlach
This morning I came back to my office from Easter vacations and was suddenly assaulted by a pile of errands crying to be dealt with, but I prevailed, and still found some time to get fascinated browsing through a preprint that appeared a week ago on the arXiv, 0904.1000. The paper, by Xi-Qing Hao, Hong-Wei Ke, Yi-Bing Ding, Peng-Nian Shen, and Xue-Qian Li [wow, I’m past the hard part of this post], is titled “Testing the Bell Inequality at Experiments of High Energy Physics“. Here is the abstract:
Besides using the laser beam, it is very tempting to directly testify the Bell inequality at high energy experiments where the spin correlation is exactly what the original Bell inequality investigates. In this work, we follow the proposal raised in literature and use the successive decays to testify the Bell inequality. […] (We) make a Monte-Carlo simulation of the processes based on the quantum field theory (QFT). Since the underlying theory is QFT, it implies that we pre-admit the validity of quantum picture. Even though the QFT is true, we need to find how big the database should be, so that we can clearly show deviations of the correlation from the Bell inequality determined by the local hidden variable theory. […]
Testing the Bell inequality with the decay of short-lived subatomic particles sounds really cool, doesn’t it ? Or does it ? Unfortunately, my quantum mechanics is too rusty to let me put together a careful post which explains things tidily, in the short time left between now and a well-deserved sleep. You can read elsewhere about the Bell inequality, and how it tests whether pure quantum mechanics rules -with its correlations between quantum systems separated by a space-like interval- or whether a local hidden variable theory holds instead: besides, almost anybody can write a better account of that than I can, so if you feel you can help, you are invited to guest-blog about it here.
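Still, to give you at least a feel for what is being tested, here is a small toy Monte Carlo of my own (a sketch, not taken from the paper): it simulates the standard CHSH version of the Bell test for a spin-singlet pair, where local hidden variable theories bound the combination S of four correlations by |S| &lt;= 2, while quantum mechanics reaches 2*sqrt(2). The measurement angles below are the textbook optimal settings, nothing specific to the hyperon proposal.

```python
# Toy Monte Carlo of a CHSH test: correlations of a spin-singlet pair
# violate the local-hidden-variable bound |S| <= 2.
# The angles a, a2, b, b2 are the standard optimal CHSH settings.
import math
import random

random.seed(1)

def correlation(theta, n=200_000):
    """MC estimate of E(theta) = -cos(theta) for a spin-singlet pair."""
    p_same = math.sin(theta / 2) ** 2   # QM probability of equal outcomes
    same = sum(random.random() < p_same for _ in range(n))
    return (same - (n - same)) / n       # E = P(same) - P(different)

a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
S = (correlation(a - b) - correlation(a - b2)
     + correlation(a2 - b) + correlation(a2 - b2))
print(f"S = {S:.3f}  (LHV bound: |S| <= 2, QM maximum: 2*sqrt(2) ~ 2.83)")
```

With these settings the simulated |S| lands near 2.83, comfortably beyond the local-hidden-variable bound of 2, which is exactly the kind of deviation the authors want to resolve with hyperon decays instead of photons.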
Besides embarrassing myself, I still wanted to mention the paper today, because the authors make an honest attempt at proposing an experiment which might actually work, and which could avoid some drawbacks of all the experimental tests attempted so far, which belong to the realm of quantum optics. In their own words,
Over a half century, many experiments have been carried out […] among them, the polarization entanglement experiments of two-photons and multi-photons attract the widest attention of the physics society. All photon experimental data indicate that the Bell inequality and its extension forms are violated, and the results are fully consistent with the prediction of QM. The consistency can reach as high as 30 standard deviations. […] when analyzing the data, one needs to introduce additional assumptions, so that the requirement of LHVT cannot be completely satisfied. That is why as generally considered, so far, the Bell inequality has not undergone serious test yet.
Being totally ignorant of quantum optics I am willing to buy the above as true, although, being a sceptical son of a bitch, the statement makes me slightly dubious. Anyway, let me get to the point of this post.
Any respectable quantum mechanic could convince you that checking the Bell inequality with the decay chain mentioned above boils down to measuring the correlation between the pions emitted in the decays of the Lambda particles, i.e., the polarization of the Lambda baryons: in the end, one just measures a single, clean angle between the observed final-state pions. The authors show that this would require about one billion decays of the mesons produced by an electron-positron collider running at a 3.09 GeV center-of-mass energy (the mass of the J/psi resonance), because the decay chain involving the clean final state is rare: the first branching fraction is 0.013, the next decay occurs once in a thousand cases, and finally, each Lambda hyperon has a 64% chance to yield a proton-pion final state. So, 0.013 times 0.001 times 0.64 squared makes a chance about as frequent as a Pope appointment. However, if we had such a sample, here is what we would get:
The plot shows the measured angle between the two charged pions one would obtain from 3382 pion pairs (resulting from a billion decays through the double hyperon decay), compared with the pure quantum mechanics prediction (the blue line) and with the Bell inequality (the area within the green lines). The simulated events are taken to follow the QM predictions, and such statistics would indeed refute the Bell inequality -although not by a huge statistical margin.
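For what it is worth, the rate arithmetic above is easy to reproduce. The sketch below just multiplies the factors quoted in the text (the variable names are my own labels, since the decay symbols were lost here); the smaller number of pion pairs quoted for the plot presumably reflects additional acceptance and selection effects which I am not modeling.

```python
# Back-of-the-envelope rate estimate, using only the factors quoted above.
br_first = 0.013        # first branching fraction quoted in the text
br_rare  = 0.001        # the once-in-a-thousand step
br_ppi   = 0.64         # Lambda -> proton + pion, needed for both hyperons

p_event  = br_first * br_rare * br_ppi ** 2   # chance per produced meson
n_decays = 1_000_000_000                      # the billion decays quoted

print(f"per-decay probability: {p_event:.2e}")
print(f"expected clean decays in a billion: {p_event * n_decays:.0f}")
```

The per-decay probability comes out around 5e-6: a few parts per million, which is indeed Pope-appointment territory.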
So, the one above is an interesting distribution, but if the paper were all about showing they can compute branching fractions and run a toy Monte Carlo simulation (which even I could do in the time it takes to write a lousy post), it would not be worth much. Instead, they have an improved idea: to apply a suitable magnetic field and exploit the anomalous magnetic moment of the Lambda baryons to measure their polarization simultaneously along three independent axes, turning a passive measurement -one involving a check of the decay kinematics of the Lambda particles- into an active one -a direct determination of the polarization. This is a sort of double Stern-Gerlach experiment. Here I would really love to explain what a Stern-Gerlach experiment is, and even more to make sense of the above gibberish, but for today I feel really drained, and I will just quote the authors again:
One can install two Stern-Gerlach apparatuses at the two sides, with flexible angles with respect to the electron-positron beams. The apparatus provides a non-uniform magnetic field which may deflect the trajectory of the neutral Lambda due to its non-zero anomalous magnetic moment, i.e. the force is proportional to mu dB/dn, where mu is the anomalous magnetic moment of the Lambda, B is a non-uniform external magnetic field and d/dn is a directional derivative. Because the Lambda is neutral, the Lorentz force does not apply, therefore one may expect to use the apparatus to directly measure the polarization […]. But one must first identify the particle flying into the Stern-Gerlach apparatus […]. It can be determined by its decay product […]. Here one only needs the decay product to tag the decaying particle, but does not use it to do kinematic measurements.
I think this idea is brilliant, and it might actually be turned into a technical proposal. However, the experimental problems connected with setting up such an apparatus, detecting the golden decays in a huge background of impure quantum states, and capturing Lambdas inside inhomogeneous magnetic fields, are mind-boggling: no wonder the authors do not have a Monte Carlo for that. Also, it remains to be seen whether such pains are really called for. If you ask me, quantum mechanics is right, period: why bother ?
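To see why I call the problems mind-boggling, here is a rough order-of-magnitude estimate of the deflection of a Lambda in an inhomogeneous field. The particle constants are PDG values; the 100 T/m field gradient is an arbitrary assumption of mine, just for illustration.

```python
# Order-of-magnitude check of the Stern-Gerlach force on a Lambda baryon.
# The force on a neutral particle with magnetic moment mu in a field
# gradient dB/dn is F ~ mu * dB/dn; the Lambda then decays after ~2.6e-10 s.
MU_N      = 5.0508e-27    # nuclear magneton, J/T (PDG)
mu_lambda = 0.613 * MU_N  # magnitude of the Lambda magnetic moment (PDG)
m_lambda  = 1.986e-27     # Lambda mass in kg (1.116 GeV/c^2)
tau       = 2.63e-10      # Lambda mean lifetime, s (PDG)
dB_dn     = 100.0         # assumed field gradient, T/m (my own assumption)

force = mu_lambda * dB_dn             # F ~ mu * dB/dn
accel = force / m_lambda
displacement = 0.5 * accel * tau**2   # deflection over one lifetime

print(f"force: {force:.1e} N")
print(f"displacement in one lifetime: {displacement:.1e} m")
```

The deflection over one (rest-frame) lifetime comes out many orders of magnitude below any conceivable detector resolution; time dilation for a boosted Lambda stretches the flight time, but the conclusion of a desperately small effect does not change much.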
Grow Triops with your kids! April 13, 2009Posted by dorigo in personal, science.
Tags: scientific toys, triops
Just a short, advertising post to mention a very entertaining “scientific toy” for children. Three weeks ago Filippo was given a box for his birthday: “Triassic Triops”, a toy made in the USA by Triops Inc. The box rested untouched for a while, until Filippo asked me to try it. The figure on the cover shows a mean-looking crustacean, and it took me a while to decide to try and grow those creatures in our home.
The box contains a small plastic tank, an envelope with gravel and tiny wood bits to create a reasonable habitat in the tank, a tiny box containing about 20 eggs, a tape-on thermometer, and a parcel of food pellets, plus instructions and a plastic pipette. Instructions are quite precise and easy to follow, but the hard part is to find a place in your home where you can stabilize the temperature: these creatures will hatch and grow only if the temperature is in the 22-29°C range.
We followed the instructions carefully, using bottled water and waiting until the temperature had stabilized, but when the time came to drop the eggs in the tank I was rather amazed: the little box did not seem to contain anything! Only by looking very carefully could I spot tiny grains smaller than half a millimeter across.
Nothing happened for a couple of days, but then we started to see two or three teeny-tiny little beings swimming around. We started feeding them, and they grew quite fast. I have no idea why only a few of the eggs hatched, but I am really not sure whether there were twenty in the box originally, nor whether I managed to drop all of them in the tank…
In ten days, two triops have grown to about two-thirds of an inch in length (see right), probably killing the third one in the process (its carcass is still floating around). They are happily swimming day and night, and they eat all the pellets I put in the tank within a couple of hours… According to the instructions, the life cycle of these amazing creatures is about one to three months. They should grow to about two inches in length -roughly the size of a small shrimp. Then they will die, and -by totally drying the tank and pouring warm water in again- it should be possible to revive their eggs, if they have produced any.
The interesting thing about these creatures is that they have remained biologically identical to the original beings that populated our earth in the Triassic age. Quoting from the leaflet:
“Millions of years before Tyrannosaurus Rex ruled the earth, Triops longicaudatus was evolving a method of reproduction that allowed the developing embryos (developing eggs) to survive the drying up of the temporary ponds. This amazing process is known as suspended animation, or diapause. Scientists have found that Triops eggs can remain dormant for more than 25 years. In other words, the little Triops slept while the dinosaurs disappeared”.
Regardless of the accuracy of the above reconstruction, I think it is really amazing to drop dried eggs into water and see creatures growing up in it. Try it with your kids: fun is assured!
Things I should have blogged on last week April 13, 2009Posted by dorigo in cosmology, news, physics, science.
Tags: anomalous muons, CDF, dark matter, DZERO, Higgs boson, neutrino
It rarely happens that four days pass without a new post on this site, but it is never because of a lack of things to report on: the world of experimental particle physics is wonderfully active and always entertaining. Usually hiatuses are due to a bout of laziness on my part. In this case, I can blame Vodafone, the provider of the wireless internet service I use when I am on vacation. From Padola (the place in the eastern Italian Alps where I spent the last few days) the service is horrible, and I sometimes lack the patience to wait for the moment of outburst when bytes flow freely.
Things I would have wanted to blog on during these days include:
- The document describing the DZERO search for a CDF-like anomalous muon signal is finally public, about two weeks after the talk which announced the results at Moriond. Having had an unauthorized draft in my hands, I have a chance of comparing the polished with the unpolished version… Should be fun, but unfortunately unbloggable, since I owe some respect to my colleagues in DZERO. Still, the many issues I raised after the Moriond seminar should be discussed in light of an official document.
- DZERO also produced a very interesting search for ttH production. This is the associated production of a Higgs boson and a pair of top quarks, a process whose rate is made significant by the large coupling of top quarks to Higgs bosons, by virtue of the large top quark mass. By searching for a top-antitop signature and the associated Higgs boson decay to a pair of b-quark jets, one can investigate the existence of Higgs bosons in the mass range where that decay is most frequent -i.e., the region where all indirect evidence puts it. However, ttH production is invisible at the Tevatron, and very hard at the LHC, so the DZERO search is really just a check that there is nothing sticking out which we have missed by simply forgetting to look there. In any case, the signature is extremely rich and interesting to study (I had a PhD student doing this for CMS a couple of years ago), thus my interest.
- I am still sitting on my notes for Day 4 of the NEUTEL2009 conference in Venice, which included a few interesting talks on gravitational waves, CMB anisotropies, the PAMELA results, and a talk by Marco Cirelli on dark matter searches. With some effort, I should be able to organize these notes in a post in a few days.
- And new beautiful physics results are coming out of CDF. I cannot anticipate much, but I assure you there will be much to read about in the forthcoming weeks!
Quake homeless “enjoy a weekend of camping” April 8, 2009Posted by dorigo in news, politics.
Tags: berlusconi, earthquake
The quiz of the day: who uttered such a profanity today, during an interview with German television, referring to the victims of the destructive earthquake which destroyed part of central Italy two days ago ?
I will give you a few hints. It is the same person who just a few days ago embarrassed Italy at the G20, the same person who a few years ago suggested in a verbal exchange at the European Parliament that Schultz, a German socialist, would make a very good Nazi kapo.
Ok, if you haven’t figured it out already, check it out on the Times Online.
I am ashamed of the person Italians have elected as their representative, and I feel especially bad since I cannot even tell myself anymore that the US president is worse.
Milind Diwan: recent MINOS results April 8, 2009Posted by dorigo in news, physics, science.
Tags: minos, neutrino, neutrino experiments, neutrino oscillations, Sterile neutrinos
I offer below another piece of the notes I took at the NEUTEL09 conference in Venice last month. For the slides of the talk reported here, see the conference site.
Milind’s presentation concentrated on results on muon-neutrino to electron-neutrino conversions. MINOS is the “Main Injector Neutrino Oscillation Search”. It is a long-baseline experiment: the beam from the Main Injector, Fermilab’s high-intensity source of protons feeding the Tevatron accelerator, is sent from Batavia (IL) to the Soudan mine in Minnesota, 735 km away. There are actually two detectors, a near and a far detector: this is the unique feature of MINOS. The spectra collected at the two sites are compared to measure muon neutrino disappearance and appearance. The near detector is 1 km away from the target.
The beam is a horn-focused muon-neutrino beam. Horns are parabolic-shaped magnets. 120 GeV protons produce pions, which are focused by these structures; negative ones are defocused, and the beam contains predominantly muon neutrinos from the decay of the focused pions. The accelerator provides 10-microsecond pulses every 2.2 seconds. 95% of the resulting neutrino flux is muon neutrinos, 4% muon antineutrinos.
Besides the presence of two detectors in line, another unique feature of the Fermilab beam is the possibility to move the target in and out, shifting the spectrum of the neutrinos that come out, because the focal point of the horns changes. Two positions of the target are used, corresponding to two beam configurations. In the high-energy configuration one gets a beam centered at an energy of 8 GeV or so, while the low-energy configuration is centered at 3 GeV. Most of the time MINOS runs with the 3 GeV beam.
The detectors are made of steel and scintillator planes: about one kiloton worth in the near detector, and 5.4 kT in the far detector. Scintillator strips are 1 cm thick and 4.1 cm wide, and their Moliere radius is 3.7 cm. A 1-GeV muon crosses 27 planes. The iron in the detectors is magnetized with a 1 Tesla magnetic field.
MINOS event topologies include CC-like and NC-like events. A charged-current (CC) muon event gives a muon plus hadrons: a long charged track from the muon, which is easy to find. A neutral-current (NC) event makes a diffuse splash, since all one sees is the signal from the disintegration of the target nucleus; an electron CC event leaves a dense, short shower, with a typical electromagnetic shower profile. The three processes giving rise to the observed signatures are described by the Feynman diagrams below.
The analysis challenge is to put together a selection algorithm capable of rejecting backgrounds and selecting CC events. Fluxes are measured in the near detector, and they allow one to predict what should be found in the far detector. This minimizes the dependence on Monte Carlo, because there are too many factors that may cause fluctuations in the real data, and the simulation cannot possibly handle them all. So they carry out a blind analysis. They check background estimates with independent samples: this serves the purpose of avoiding biasing oneself with expectations of what one should observe. They generate many simulated samples not containing an oscillation signal, to check all analysis procedures.
Basic cuts are applied to the data sample to ensure data quality. Fiducial-volume cuts provide rejection of cosmic-ray backgrounds. Simple cuts lead to an S/N ratio of 1:12. By “signal” one means the appearance of electron neutrinos. Electron-neutrino events are selected with artificial neural networks, which use the properties of the shower, the lateral shower spread, etcetera. These can discriminate NC interactions from electron-neutrino-induced CC interactions. After the application of the algorithm, the S/N ratio is 1:4. At this stage, one important remaining source of background is due to muon-neutrino CC events, which can be mistaken for electron-neutrino ones when the muon is not seen in the detector.
They can also select events with a “library event matching” (LEM). This matches each data event to a shower library, computing the fraction of the best 50 matches which are electron-neutrino events. It is more or less an evolved “nearest-neighbor” algorithm. As a result, they get a better separation. However, according to the speaker this method is not ready yet, since they still need to understand its details better.
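If I understand the description correctly, the idea can be caricatured as a k-nearest-neighbor classifier. The toy below is my own sketch: the two-dimensional “events”, the Gaussian clusters, and the library are made-up stand-ins, not actual MINOS quantities; only the use of the signal fraction among the 50 best matches follows the description above.

```python
# Toy sketch of "library event matching": score an event by the fraction
# of electron-neutrino events among its 50 closest library matches.
import math
import random

random.seed(2)

# Toy library of (features, is_nue) pairs: "signal" events cluster near
# (1, 1), "background" events near (0, 0). Purely illustrative.
library = (
    [((random.gauss(1, 0.5), random.gauss(1, 0.5)), True) for _ in range(500)]
    + [((random.gauss(0, 0.5), random.gauss(0, 0.5)), False) for _ in range(500)]
)

def lem_score(event, k=50):
    """Fraction of nu_e events among the k best-matching library events."""
    neighbors = sorted(library, key=lambda entry: math.dist(event, entry[0]))[:k]
    return sum(is_nue for _, is_nue in neighbors) / k

print(lem_score((1.0, 1.0)))  # signal-like event: score near 1
print(lem_score((0.0, 0.0)))  # background-like event: score near 0
```

The real algorithm matches full shower topologies rather than two numbers, of course, but the discriminant has the same flavor: a local density ratio of signal-like to background-like library events.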
[As I was taking these notes, I observed that data and Monte Carlo simulation do not match well in the low-ANN-output region. The speaker claims that the fraction of events in the tail of the Monte Carlo distribution can be modeled only with some approximation, but that they do not need to model that region too well for their result. However, it looks as if the discrepancy between data and MC is not well understood. Please refer to the graph shown below, which shows the NN output in data and simulation at preselection level.]
Back to the presentation. To obtain a result, the calculation they perform is simple: how many events are expected in the far detector ? The ratio of far to near flux is known: 1.3E-6. This includes all geometrical factors. For this analysis they have 3E20 protons on target. They expect 27 events for the ANN selection, and 22 for the LEM analysis.
They need to separate backgrounds in NC and CC, so they do a trick: they take data in two different beam configurations, then they look at the spectrum in the near detector, where they expect muon-type events to be rejected much more easily because they are more deflected. From this they can separate the two contributions.
Their final result for the predicted number of electron-induced CC events is 27 ± 5 (stat) ± 2 (syst).
A second check of the background calculation consists in removing the muon in tagged CC events, and using these for two different calculations. One is an independent background calculation: they can add a simulated electron to the raw event data after removing the muon. This checks whether the signal is modeled correctly. From these studies they conclude that it is.
The results show that there is indeed a small signal: they observe 35 events, when they expect 27, in the high-NN output region, as shown in the figure above. For the other method, LEM, results are consistent. The energy spectrum of the selected events is shown in the graph below.
With the observation of this small excess (which is compatible with predictions), a 90% confidence limit is set in the plane of the two oscillation parameters, sin^2(2 theta_13) versus the CP-violating phase delta. It goes up to 0.35, with a small oscillation dependent on the value of delta. You can see it in the figure on the right below.
The speaker claims that if the excess they are observing disappears with further accumulated data, they will be able to reach below the existing bound.
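As an aside, the mildness of the excess mentioned above is easy to quantify: a one-sided Poisson p-value for 35 observed events with 27 expected (my own quick calculation, ignoring the quoted systematic uncertainty on the prediction) sits around the ten-percent level, far from anything one would call evidence.

```python
# One-sided Poisson p-value for the quoted numbers: 35 observed events
# with 27 expected from background plus standard-oscillation sources.
import math

expected, observed = 27.0, 35

# P(X >= observed) for X ~ Poisson(expected)
p_value = 1.0 - sum(
    math.exp(-expected) * expected**k / math.factorial(k)
    for k in range(observed)
)
print(f"one-sided p-value: {p_value:.3f}")  # a mild excess, roughly 1.5 sigma
```

Folding in the ±5 (stat) ± 2 (syst) uncertainty on the prediction would dilute the significance further, so the speaker's caution that the excess may well disappear with more data seems entirely warranted.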
The other result of MINOS comes from muon-neutrino disappearance studies. The signal amounts to a deficit of several hundred events. They can put a limit on an empirical parameter which determines what fraction of the initial flux has gone into sterile neutrinos. They have now taken 6.6E20 protons on target. The fraction of sterile neutrinos is less than 0.68 at 90% CL.
What is the Y(4140)? The plot thickens April 6, 2009Posted by dorigo in news, physics, science.
Tags: charmonium, heavy quarks, QCD, Y(4140)
I read with interest -but it would probably be more honest to say I browsed it, since I could understand less than 50%- a preprint released three days ago on “The hidden charm decay of Y(4140) by the rescattering mechanism“, by Xiang Liu, from Peking University (now at Coimbra, Portugal). The Y particle was recently discovered by CDF.
The existence of the several new resonances of masses above 3 GeV recently unearthed by B factories and by the CDF experiment poses a challenge to our interpretation of these states as simple quark-antiquark bound states, because of their properties -in particular, their decay pattern and their natural widths.
Already with the first “exotic” meson discovered a few years ago (and recently measured with great precision by CDF), the X(3872), the puzzle was evident: with a mass almost coincident with twice the mass of conventional charmed mesons (the states labeled “D”, composed of two quarks: a charm quark and an up or down antiquark), the X was immediately suggested to be a molecular state of two D particles. I wrote an account of the studies of the nature of the X particle a few years ago if you are interested -but mind you, advancements in this research field come quickly, and I believe the material I wrote back then is a bit dated by now.
The paper by Liu tries to determine whether the interpretation of the Y particle as a pure second radial excitation of P-wave charmonium holds water, once the observed branching ratio of the Y into the J/psi phi final state seen by CDF, and the measured decay width, are compared to a theoretical calculation.
The nice thing about the decay of the Y into the observed final state is that it occurs only through a so-called “rescattering” mechanism, by means of the diagrams shown in the graph below (the ones shown refer to the J=0 hypothesis, but similar diagrams are discussed for the J=1 state in the paper).
As you can see, the Y produces the two final-state particles by means of a triangle loop of D mesons. These diagrams usually describe rare processes, and in fact Liu’s calculations end up finding a small branching fraction. I am unable to delve into the details of the computation, so I will just state the result: the typical values of the branching ratio depend on a parameter which, if taken in a “reasonable” range of values, yields very small estimates. This appears inconsistent with the observation provided by the CDF experiment.
Clearly, work is in progress here, so I would abstain from concluding anything definite on the matter. So, for now, let us call this an indication that the simple interpretation of the Y as an excited charmonium state is problematic.
3 megatons strike in central Italy April 6, 2009Posted by dorigo in news, science.
A destructive earthquake struck last night in central Italy, at 3:32 AM, in a mountainous region of the Apennines close to the city of L’Aquila. The magnitude of the earthquake has been estimated at 6.3 on the Richter scale, for a release of energy equal to about 3 megatons of TNT (not 16 as I previously reported, which would correspond to magnitude 6.7 on the Richter scale).
Many small towns close to the epicenter report more than half of their houses razed to the ground. The biggest worries come from L’Aquila, which has about 70,000 inhabitants; but many smaller towns scattered around the mountainous region of the Abruzzi have certainly suffered major damage. There are reports of dozens of bodies already extracted from the rubble. I will post updates here as soon as I gather more information.
UPDATE: while bodies continue to be drawn out of collapsed buildings, a disturbing detail emerges. It transpires that a researcher at the Gran Sasso national laboratories had predicted the event, and had warned that a disastrous seismic event would occur. Giampaolo Giuliani had recorded a large release of radon gas from the ground on March 29th, and had concluded that an earthquake would probably take place in a matter of hours. Giuliani had predicted the event would happen a week before it actually did, and on March 31st the head of civil protection Guido Bertolaso had bitterly criticized the prediction and “quegli imbecilli che si divertono a diffondere notizie false” (those imbeciles who enjoy spreading false news). Giuliani is now facing charges of causing a false alarm, but he was right after all.
UPDATE: here are a few excerpts from an interview given by Giampaolo Giuliani this morning:
“Se commento adesso c’e’ il rischio che a me domani mi mettono in galera. Allora, non e’ vero, e’ falso, che i terremoti non possono essere previsti. Sono quasi dieci anni che noi riusciamo a prevedere eventi nel raggio d’azione di 120-150 chilometri dai nostri rivelatori.”
“Sono tre giorni che vedevamo un forte incremento di Radon. I forti incrementi di Radon, al di fuori delle soglie di sicurezza, significano forti terremoti.”
“Anche la tecnologia classica avrebbe potuto prevederlo. Se qualcuno fosse stato a lavorare, ai posti dovuti, o se qualcuno si fosse preoccupato.”
“Questa notte anche la sala sismica si sarebbe potuto accorgere che sarebbe avvenuta una forte scossa. Il mio sismografo indicava una forte scossa di terremoto e ce l’avevamo online, tutti potevano osservarlo, e tanti lo hanno osservato e si sono resi conto che le scosse crescevano.”
(“If I comment now there is the risk that tomorrow I get imprisoned. Now: it is not true, it is false, that earthquakes cannot be predicted. We have been able to predict events for almost ten years in a range of 120-150 kilometers from our detectors.”
“In the last three days we saw a large increase of Radon. Large increases of Radon, above safety thresholds, mean strong earthquakes.”
“Even classic technology could have predicted it. If somebody had been at work, in the proper places, or if somebody had gotten worried.”
“Tonight even the seismic room could have realized that a strong shake was going to happen. My seismograph indicated a strong earthquake and we had it online, everybody could watch it, and many did and realized that the tremors were increasing.”)
UPDATE: Michelangelo Ambrosio, a director of research of the INFN (national institute for nuclear physics) section in Napoli, thus defends the claims of Giuliani:
“trascurare con superficialita’ le applicazioni di nuove tecnologie solo perche’ proposte da ricercatori non appartenenti allo establishment preposto a tale funzione e’ una negligenza criminale di cui oggi paghiamo le conseguenze.”
(“Disregarding with superficiality the applications of new technologies only for the reason they are brought forward by researchers not belonging to the establishment addressing those functions is a criminal neglect of which today we all pay the consequences”.)