
Guest post: Fabio Zandanel, “Dark Matter and the MAGIC telescope” July 12, 2007

Posted by dorigo in astronomy, Blogroll, physics, science.

Fabio Zandanel is a graduate student working with the MAGIC experiment group of Padua University – INFN at the Physics Department "G. Galilei". In his recent diploma thesis, he analyzed cosmic gamma-ray data for a dark matter search within the MAGIC collaboration.

The Facts

Despite the remarkable progress of physics in the last century, several issues remain open. The latest WMAP results brought excitement to the scientific community, in particular to cosmologists. The experiment showed that the expansion of the Universe is accelerating, that its geometry is flat, and that about 96% of its content is in an unknown form. Only 4% of the Universe is accounted for by baryonic matter, the so-called "ordinary" matter.

About 73% of the energy density of the Universe is accounted for by dark energy, a bizarre form of energy or matter that is, in effect, gravitationally repulsive. Dark energy is thought to be responsible for the current accelerated expansion of the Universe.

About 23% of the Universe is made up of the so-called dark matter, thought to be non-baryonic, weakly-interacting matter (i.e. interacting essentially only gravitationally). There is abundant evidence for the existence of dark matter. The most striking observational evidence on galactic scales comes from galaxy rotation curves, i.e. plots of the circular velocities of gas and stars as a function of distance from the center of the galaxy in question. In a Newtonian framework, once the radius exceeds the extent of the visible mass, the circular velocity of an object at distance r from the center of a galaxy should fall off as 1/√r. Observations of many galaxies show that circular velocities rise in the central region (as expected) but then remain roughly constant far into the galactic halo, instead of declining as expected. This suggests the presence of an unseen matter component responsible for the discrepancy. There is further experimental evidence for dark matter on both sub-galactic and inter-galactic scales, coming from a great variety of data. For example, recent results on the so-called "Bullet Cluster" of galaxies provide strong evidence for the existence of dark matter: by studying the collision between two galaxy clusters and mapping the total amount of matter through gravitational lensing, the presence of a large quantity of matter that did not feel the effects of the collision becomes evident.
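The rotation-curve argument can be made concrete with a toy calculation, using v = √(GM(r)/r). This is only an illustrative sketch: the disk mass and the halo normalization below are hypothetical round numbers, not fitted values.

```python
import math

G = 4.30091e-6  # Newton's constant in kpc * (km/s)^2 / M_sun

def v_circ(m_enclosed_msun, r_kpc):
    """Circular velocity (km/s) from the mass enclosed within radius r."""
    return math.sqrt(G * m_enclosed_msun / r_kpc)

radii = [5.0, 10.0, 20.0, 40.0]  # kpc, sampling out into the halo

# Visible matter alone: beyond the edge of the luminous disk the enclosed
# mass stops growing, so v falls as 1/sqrt(r) (Keplerian decline).
m_disk = 5e10  # solar masses, a rough disk-galaxy scale (illustrative)
v_no_halo = [v_circ(m_disk, r) for r in radii]

# A simple isothermal dark halo has M(r) growing linearly with r,
# which keeps the rotation curve approximately flat at large radii.
def m_halo(r_kpc, k=1e10):  # k is a hypothetical normalization, M_sun/kpc
    return k * r_kpc

v_with_halo = [v_circ(m_disk + m_halo(r), r) for r in radii]
```

Without the halo the velocity drops by a factor √8 ≈ 2.8 between 5 and 40 kpc; with the halo term it stays nearly flat, which is what the observed curves look like.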

The total amount of dark matter can be estimated through observations on cosmological scales. By analyzing the temperature anisotropies of the CMB (Cosmic Microwave Background), it is possible to put stringent constraints on the abundances of baryons and matter in the Universe; from the WMAP data, one derives the estimate of the dark matter content of the Universe quoted above.

The Hypothesis

There are many theories which try to describe dark matter, each with its own preferred candidate. As said above, dark matter seems to be non-baryonic, weakly-interacting and relatively massive matter (the latter in order to explain its large contribution to the total density of the Universe).

The search for a solution to the dark matter puzzle leads to physics beyond the Standard Model. Among the great variety of hypotheses, the most studied candidates are the supersymmetric neutralino, in the framework of supersymmetric theories; the Kaluza-Klein particle, in the framework of extra-dimension theories; and the axion, a particle postulated to solve the CP violation problem in particle physics. Unfortunately, our current knowledge does not allow us to favor one candidate over the others. Explaining dark matter is hard work, owing to the difficulty of making direct measurements of its nature.

One possibility for detecting dark matter indirectly is through observation of its annihilation products. In our work, we investigate the possibility of a gamma-ray flux coming from dark matter annihilation, i.e. of a detectable photon emission. Such gamma-ray emission is proportional to the square of the dark matter density. Thus, a detectable photon flux is expected from those regions of the Universe where the dark matter density is strongly enhanced, that is, where strong gravitational fields are present.
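The ρ² scaling is usually packaged into a line-of-sight integral of the squared density (often called the astrophysical factor). Here is a minimal numerical sketch using an NFW density profile; the normalization, the observer distance, and the integration setup are all illustrative assumptions, not the actual analysis.

```python
import math

def rho_nfw(r_kpc, rho_s=0.01, r_s=20.0):
    """NFW dark matter density profile (normalization illustrative)."""
    x = r_kpc / r_s
    return rho_s / (x * (1.0 + x) ** 2)

def j_los(psi_rad, d_kpc=8.5, l_max=100.0, steps=20000):
    """Line-of-sight integral of rho^2 at angle psi from the Galactic Center.

    d_kpc is the observer's distance from the center; this is a toy setup,
    not the real analysis chain.
    """
    dl = l_max / steps
    total = 0.0
    for i in range(steps):
        l = (i + 0.5) * dl  # midpoint rule
        # galactocentric radius of the point at distance l along the line of sight
        r = math.sqrt(d_kpc ** 2 + l * l - 2.0 * d_kpc * l * math.cos(psi_rad))
        total += rho_nfw(max(r, 1e-3)) ** 2 * dl
    return total

# Because the emission scales as rho squared, it is strongly peaked where
# the density is enhanced: a line of sight through the center collects far
# more annihilation flux than one 30 degrees away.
j_center = j_los(math.radians(0.1))
j_offset = j_los(math.radians(30.0))
```

The quadratic dependence is the key point: doubling the density quadruples the emission, which is why density "spikes" matter so much for detectability.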

This is, for example, the case of the Galactic Center, where a supermassive black hole should reside, or of the dwarf spheroidal galaxies, which appear to contain a large amount of dark matter. Another scenario which has gained importance in recent years involves Intermediate-Mass Black Holes (IMBHs). These objects should be black holes with an origin distinct from both stellar and supermassive black holes, with masses ranging from about 20 to a million solar masses. Their existence has not been definitively demonstrated yet, although a good deal of experimental evidence for it exists. However, the models involving IMBHs give more optimistic predictions for the possibility of observation than, for example, the Galactic Center case.

In particular, in our work we used a recent model (Bertone, Zentner, Silk, 2005) that predicts a detectable gamma-ray emission from the dark matter density enhancements around IMBHs possibly present in our Galactic Halo. The main strength of this model is that its predictions for the gamma-ray flux from these enhancements (called "mini-spikes") are very stable with respect to the particle physics parameters of the problem, i.e. with respect to changes in the particle assumed to constitute dark matter (and thus the annihilation channels, branching ratios and so on), while the dark matter density profiles around the mini-spikes are well established: there are few "free" parameters, and they have little impact on the model's final predictions. Moreover, these mini-spikes should be very bright gamma-ray sources!

The signature of such a signal would be a power-law spectrum with an exponential cut-off at an energy related to the dark matter particle mass (the figure shows some prediction examples for a possible mini-spike signal; in this case the dark matter is assumed to consist of supersymmetric neutralinos). The observation of several objects with the same spectral characteristics and with cut-offs at the same energy would be a smoking-gun signature both of dark matter and of IMBHs.
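The cut-off power-law shape described above can be written as dN/dE ∝ E^−Γ exp(−E/E_cut). A minimal sketch, with an illustrative spectral index and cut-off energy (not values from the model):

```python
import math

def dnde(e_gev, norm=1.0, gamma=1.5, e_cut=1000.0):
    """Power law with exponential cutoff: dN/dE = N * E^-gamma * exp(-E/E_cut).

    e_cut tracks the dark matter particle mass; all values here are
    illustrative placeholders.
    """
    return norm * e_gev ** (-gamma) * math.exp(-e_gev / e_cut)

# Well below the cutoff the spectrum behaves like a pure power law...
ratio_low = dnde(20.0) / dnde(10.0)
pure_pl = (20.0 / 10.0) ** (-1.5)

# ...while above the cutoff the exponential suppression dominates,
# producing the sharp spectral break that identifies the particle mass.
ratio_high = dnde(4000.0) / dnde(2000.0)
```

Seeing the same break energy in several independent sources is what would turn this shape into the "smoking gun" mentioned above.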

The MAGIC Experiment

As explained above, our interest is in detecting a gamma-ray flux coming from the "mini-spikes" around IMBHs.

The MAGIC telescope (Major Atmospheric Gamma-ray Imaging Cherenkov) is a ground-based gamma-ray instrument, one of the four IACTs (Imaging Atmospheric Cherenkov Telescopes) currently in operation worldwide. Its purpose is the observation of astrophysical sources through the detection of the gamma rays they emit, using the so-called imaging Cherenkov technique. Direct observation of gamma rays from the ground is impossible, because they are completely absorbed by the atmosphere. However, the absorption process produces a cascade of charged particles, an atmospheric shower, which in turn produces flashes of light, the so-called Cherenkov light. An IACT observes these flashes, focusing them onto a pixelized camera composed of photomultipliers. The imaging technique then analyzes the images formed by the Cherenkov flashes on the camera to recover information about the primary gamma ray, principally its energy and arrival direction. It is important to emphasize that photons from the source make up only about 0.1% of the cosmic-ray events that constantly impinge on the Earth's atmosphere, so IACTs need very sophisticated analysis techniques to extract the desired signal from a dominant background.
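Once the background has been suppressed, IACT analyses quantify a detection with the significance formula of Li & Ma (1983, eq. 17), comparing counts in a source ("on") region with counts in background ("off") regions. A short sketch, with hypothetical event counts:

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """Li & Ma (1983) eq. 17: detection significance in standard deviations.

    n_on: events in the source region; n_off: events in the background
    regions; alpha: ratio of on to off exposure.
    """
    term_on = n_on * math.log((1.0 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * math.log((1.0 + alpha) * n_off / (n_on + n_off))
    return math.sqrt(2.0) * math.sqrt(term_on + term_off)

# Hypothetical numbers: 1300 events in the on region, 3600 in three off
# regions of equal exposure (alpha = 1/3), i.e. an excess of 100 events
# over an expected background of 1200.
sig = li_ma_significance(1300, 3600, 1.0 / 3.0)
```

Even a 100-event excess on a 1200-event background yields only about 2.5 standard deviations, which illustrates why the overwhelming cosmic-ray background makes these searches so demanding.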

Space-borne telescopes, on the other hand, can detect gamma rays directly, in space, through the pair-production process. The EGRET experiment, the latest space-borne gamma-ray telescope, which ended operation in 2000, observed about 300 sources at energies from the MeV range up to 10 GeV, while the older ground-based telescopes observed only about ten sources above 250 GeV. The idea that some important astrophysical processes occur in the energy gap inaccessible to either class of instruments led the MAGIC collaboration to propose the construction of a 17 m diameter imaging Cherenkov telescope.

The MAGIC telescope was designed to efficiently study the energy range between 50 and 300 GeV, in order to cover the energy gap left by the previous gamma-ray instruments. The telescope is located at the Roque de los Muchachos observatory on the Canary Island of La Palma, at an altitude of 2245 m above sea level. Its principal characteristics are its large parabolic reflecting surface and its very light structure, which permits very fast repointing (in about 30 s), important for studying transient phenomena such as gamma-ray bursts. The MAGIC telescope is the biggest Cherenkov telescope in the world and the IACT with the lowest energy threshold. A second telescope of identical characteristics, MAGIC II, is under construction at the same site to increase the sensitivity of the experiment, particularly at the low end of the energy spectrum.

The Observations

The MAGIC collaboration is carrying out an observation campaign to search for a gamma-ray flux from dark matter annihilation. The Galactic Center was observed by MAGIC, and also by other ground-based telescopes, and a gamma-ray signal was in fact detected. However, the observed signal does not match any dark matter scenario. The problem is that the direction of the Galactic Center is crowded with bright gamma-ray sources, making it difficult to disentangle a dark matter annihilation signal from less exotic astrophysical emission.

The dwarf spheroidal galaxy Draco is also under observation, and analysis results are expected soon.

In the IMBH scenario, as said above, these objects should be very bright gamma-ray sources. Thus, the EGRET experiment could have observed some of them in our Galactic Halo. Among the roughly 300 sources observed by EGRET, about 100 are still unidentified, i.e. their emission remains unexplained. It is therefore natural to search for IMBHs among the so-called unidentified EGRET sources.

Candidates can be selected from among the unidentified EGRET sources in a way that satisfies the constraints of the chosen model. The biggest problem with these sources is the uncertainty on their positions and possible errors on their fluxes. EGRET had poor positional accuracy compared with current ground-based telescopes and with the next generation of space-borne instruments, such as the Italian AGILE (already in orbit) or GLAST (to be launched at the end of this year). Moreover, the fluxes measured by EGRET could suffer from much larger uncertainties than those quoted, so wrong predictions are possible.

At this moment, no significant results have been obtained. We must perform further observations and await the results of the next-generation space-borne instruments, which may spur new effort in this field. For example, GLAST, with its enhanced sensitivity, will be able to observe several IMBHs – if they exist – providing very accurate positions for follow-up by ground-based telescopes. The collaboration between space-borne and ground-based telescopes will be very important, because the former can measure the fluxes at "low" energies while the latter can observe the high-energy part of the spectrum.


1. Andrew Daw - July 13, 2007

Could there really exist a kind of matter that constitutes some 90% of the material universe and yet has evaded direct detection in any experiment for over twenty years? Neutrinos have been detected, so why not dark matter, which would need to consist of more massive particles than the neutrino?

There have been attempts to produce modified theories of gravity to fit the observations, but without any success.

But then suppose galaxy cluster and spiral galaxy behaviour and cosmic lensing effects are all inexplicable just by describing the known properties of gravity, in the same sense that the subatomic structure of atoms and molecules is inexplicable just by describing the known properties of the atomic and nuclear forces, and thus quantum wave behaviour and entanglement are inexplicable in these terms?

2. dorigo - July 13, 2007

Andrew, are you suggesting that there is something fundamental that we do not understand in the large-scale behavior of gravitation ? That is not a new idea; e.g. MOND – modified newtonian dynamics – seeks an explanation of observed phenomena by modifying the long-distance behavior of gravitational forces.


3. Andrew Daw - July 13, 2007

No, this isn’t MOND. I’m saying it’s more fundamental than that.

So basically I’m suggesting that the problems in present cosmology, and especially in explaining structure formation from the scale of galaxies through galaxy clusters to cosmic voids, stem from the assumption that the evolution of this structure resulted from the action of the forces alone.

Whereas physics has no causal explanation for quantum wave behaviour in particular but also quantum entanglement and thus, one can insist, there is no adequate explanation for how any atomic or molecular structure is possible given just the action of the atomic and nuclear forces, and just as these causes have been measured and described.

While I’ve found some reasons to believe that a cause of quantum wave behaviour could also act on the astronomical scale in addition to all the forces. So it is this cause that would act non-locally to modify the effects of gravity.

But then this hypothesis is highly radical and, although I have been able to develop a rudimentary diagrammatic account of the action of such an additional cause, I have been unable to develop the cosmological hypothesis mathematically.

4. island - July 13, 2007

Could there really exist a kind of matter that constitutes some 90% of the material universe and yet has evaded direct detection in any experiment for over twenty years?

More likely, the mass-density of the vacuum is simply less than zero, so the theoretically projected estimate of 90% is actually just a voluminous exaggeration that came about as a result of the missed reinterpretation of the vacuum and Dirac’s negative energy states.

What a breakthrough.

5. dorigo - July 13, 2007

Well, Island, what Fabio described is the standard view of cosmology. I think he is young enough to deserve to be spared from the controversy for a while longer…


6. dorigo - July 13, 2007


ok. I get it, there is much more below the surface of your first comment. As an experimentalist, and not the smartest one at that, I am not the best person to discuss the topic, though…


7. Guess Who - July 13, 2007

I’m too dumb not to ask: what’s the rationale for claiming that “subatomic structure of atoms and molecules is inexplicable just by describing the known properties of the atomic and nuclear forces”?

AFAIK, electron shell configurations of atoms, and the structure of entire molecules, can be computed numerically quite successfully ab initio, using nothing more complicated than relativistic quantum mechanics. I believe the materials and pharmaceutical industries do this routinely.

Andrew, are you thinking of nuclei? There we have the problem that QCD is not amenable to perturbation theory in the relevant regime, but chiral perturbation theory (an effective field theory written in terms of pions and nucleons, with the perturbative expansion being in their masses and momenta instead of the coupling) works fine.

And for the mesons and nucleons themselves you have QCD-based lattice computations.

So I’m afraid I don’t understand where the problem is.

P.S. Oh come on, Fabio has to grow up someday. Let’s break it to him a little at a time: there are horrible, horrible people in the world who dare to doubt the concordance model and even plot to overturn it. Yes, it’s true! Watch out, you never know where they might be lurking! 😉

8. island - July 13, 2007

LOL@GW… and some of them even live on the rim of the mainstream:


9. Guess Who - July 14, 2007

Oh, Cahill. I looked at his “Process Physics” several years ago. He is not only questioning the concordance model, he has a whole theory of everything supposedly unifying quantum mechanics and general relativity.


Superficially, it looks a lot like the spin foam models dear to Smolin et al (Bee may want to say something about this), but a notch up with respect to violations of Lorentz invariance and other impractical consequences.


You really don’t have to go to such lengths to question “standard” cosmology.

10. dorigo - July 14, 2007

Hi guys,

Fabio may reply himself here if he reads this column… I think he does not need to be taught that there is something beyond concordance cosmology. His post was within those boundaries for didactical purposes, that’s all…


11. Fabio - July 19, 2007

Hello to all,

Excuse me for the delay…
Well, first I would say that with this post I don’t intend to give an extensive explanation of all the existing cosmological models, of course. I know that there are other models, and other explanations for the current cosmological problems, but I’m young and not an expert, so let me study a few more years…
I’ve tried to explain the current strategy of the ground-based telescopes in the dark matter field. For now, it seems the only thing that we can try. If dark matter exists, and if this 23% is effectively matter of some kind, and if dark matter annihilates and so on (there are a lot of as-yet untested hypotheses), we can try to observe the products of its annihilation. It is true that the problem is overloaded with hypotheses and theories, and probably many of these are wrong predictions. But from the experimental point of view, and in particular from the gamma-ray astronomy point of view, the detection of a signal coming from dark matter annihilation is the only thing we can hope for (for now)!
It is also true that I’ve described the so-called standard cosmology, but the model we used sits inside the “standard” cosmology. I think that searching beyond the “standard” or most “accepted” theories is a good thing. We must search for a solution for this 90% of unknown things!
I’m not a theorist and I don’t know some of the things you wrote in this column… But the scenario described in my post is simply the most probable one given the available data… For example, neutrinos have the “important property” that they exist, but neutrinos are simply not abundant enough (from the available data) and not massive enough (again, from the available estimates) to account for the dark matter.

I hope for your kindness toward such a young researcher as me… 🙂





