
String theorists betting against SUSY July 23, 2008

Posted by dorigo in physics, science.

This post contains second-hand information, but I place it here anyway, because a blog is also a record of things. So, I read with interest on Peter Woit’s blog a summary of the latest paper posted on the ArXiv by Bert Schellekens. Peter’s review is worth reading from head to tail (I don’t know about the 80-something-page-long article), but I especially found interesting a quote from Schellekens’ paper, which says it as clearly as one can:

With the start of the LHC just months away (at least, I hope so), this is more or less the last moment to make a prediction. Will low energy supersymmetry be found or not? I am convinced that without the strange coincidence of the gauge coupling convergence, many people (including myself) would bet against it. It just seems to have been hiding itself too well, and it creates the need for new fine-tunings that are not even anthropic (and hence more serious than the one supersymmetry is supposed to solve).

Be sure to get this right: he is a die-hard landscape-enthusiast string theorist. And he is saying he would bet against SUSY at the LHC.

With the CERN machine’s start just around the corner, things are indeed getting to some accumulation point. I myself, after having bet heavily (well, by my standards) against the observation of SUSY at the LHC, am starting to think I might in the end turn out to be the happy loser.

My final prediction, however, is worth mentioning: as soon as protons start hitting other protons head-on at 10 TeV this fall, we will slowly relax and realize it is going to take a while before we can say anything from the mess of hadrons that will come out of the centers of CMS and ATLAS every 25 nanoseconds.

Liam McAllister: Inflation in String Theory May 23, 2008

Posted by dorigo in cosmology, news, physics.

Here we go with another report from PPC 2008. This one is on the talk given by Liam McAllister in yesterday’s afternoon session. In this case, I feel obliged to warn that my utter ignorance of the subject makes it quite probable that my notes contain nonsensical statements. I apologize in advance, and hope that what I managed to put together is still of some use to you, dear reader.

The main idea discussed in Liam’s talk is the following: if we detect primordial tensor perturbations in the cosmic microwave background (CMB), we will know that the inflaton (the scalar field responsible for the inflationary epoch) moved more than a Planck distance in field space. Understanding such a system requires confronting true quantum gravity questions, and string theory provides a tool to study this.

Inflation predicts scalar fluctuations in the CMB temperature. These evolve to create approximately scale-invariant fluctuations, which are also approximately Gaussian. The goal we set for ourselves is to use cosmological observations to probe physics at the highest energy scales.

The scalar field \phi has a potential V(\phi) which drives the acceleration, and the acceleration is prolonged if V(\phi) is rather flat. How reasonable is that picture? It is not a microscopic model: what is \phi? The simplest inflation models often invoke smooth potentials over field ranges larger than the Planck mass. In an effective field theory with a cutoff \Lambda one writes the potential as a series in powers of the ratio \phi/\Lambda. Flatness is then imposed over distances \Delta \phi > \Lambda. But \Lambda must be smaller than the Planck mass, except in a theory of quantum gravity.
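If I understood the point correctly, the trouble can be seen by expanding the potential in the effective theory, V(\phi) = V_0 + \frac{1}{2} m^2 \phi^2 + \sum_n c_n \phi^{4+n}/\Lambda^n: once the field moves over a distance of order \Lambda, all the higher-dimension terms give order-one corrections to the slow-roll parameters, so keeping the potential flat over \Delta \phi > \Lambda means controlling an infinite number of coefficients c_n.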

So one needs to assume something about quantum gravity just to write down a potential. It is too easy to write an inflation model, so the framework is not constrained enough to be predictive. We need to move to some more constrained scenario.

Allowing an arbitrary metric on the kinetic term, and an arbitrary number of fields in the Lagrangian, the potential is very model-dependent. The kinetic term can also contain higher-derivative terms. One can write the kinetic term of the scalar fields with a metric tensor G: G is the metric on some manifold, and can well depend on the fields themselves. An important notion is that of the field range.
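In formulas (my own transcription, so take it with a grain of salt), the kinetic term is something like \frac{1}{2} G_{ij}(\phi) \partial_\mu \phi^i \partial^\mu \phi^j, and the field range is the geodesic distance traversed in the field space whose metric is G_{ij}.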

Liam noted that the prospects for excitement in theory and in experiment are coupled. If the parameter n_s turns out to be just below 1, with no tensors and no non-gaussianity, then we may never get more clues about the inflaton sector than we have right now. We will have to be lucky, but the good thing is that if we are, we are lucky both ways: observationally non-minimal scenarios are often theoretically non-minimal, since detectable tensors require a large field range, and this requires high-energy input. If anything goes well, it will do so both experimentally and theoretically.

String theory lives in 10 dimensions. To connect string theory to our 4D reality, we compactify the six additional dimensions. The additional dimensions must be small, otherwise we would not see a Newtonian law of gravity, since gravity would propagate too far away from our brane.

Moduli correspond to massless scalar fields in four dimensions; there are size and shape moduli for the Calabi-Yau manifold. Light scalars with gravitational-strength couplings absorb energy during inflation, and they can spoil the pattern of big bang nucleosynthesis (BBN) and overclose the universe. The solution is that sufficiently massive fields decay before BBN, so they are harmless for it (however, if they decay to gravitinos they may still be harmful).

The main technical extension came with D-branes, introduced by Polchinski in 1995. If you take a D-brane and wrap it in the compact space, the wrapping costs energy, and the tension of the D-brane makes distorting the space cost energy too: this creates a potential for the moduli and makes the space rigid.

Any light scalars that do not couple to the SM are called moduli. Warped D-brane inflation implies warped throats: a Calabi-Yau space is distorted to make a throat. This is a Randall-Sundrum region; it is the way string theory realizes such a geometry. A D3-brane and an anti-D3-brane attract each other.

The tensor-to-scalar ratio is large only if the field moves over Planckian distances, \Delta \phi/M_p of order one. That is the diameter of the field space. It is ultraviolet-sensitive, but not too much so. In this framework, observable tensors in the CMB mean that there has been trans-Planckian field variation.
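If I remember the argument correctly, this is essentially the Lyth bound: the field excursion per e-fold of inflation is d\phi/dN \simeq M_p \sqrt{r/8}, so sustaining a detectable tensor-to-scalar ratio r \sim 0.01 over a few tens of e-folds requires \Delta \phi of order M_p.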

Can we compute (\Delta \phi /M_p)_{max} in a model of string inflation? Liam says we can. Planckian distances can be computed in string theory using the geometry. The field \phi is the position of the brane in the throat, so \Delta \phi is the length of the throat: the problem is reduced to one of geometry. The field range comes out to satisfy (\Delta \phi/M_p)^2 < 4/N, where N is the number of colors of the Yang-Mills theory associated with the throat region, and N is at least a few hundred! So the parameter r_{CMB} is small with respect to the threshold for detection in the next decade, since r_{CMB}/0.009 < 4/N.
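If I did the arithmetic right, with N of a few hundred the quoted bound means r_{CMB} < 0.009 \times 4/300 \approx 10^{-4}, and with N \sim 1000 one gets r_{CMB} < 4 \times 10^{-5}: well below the r \sim 0.01 level that CMB polarization experiments hope to reach in the next decade.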

N has to be large for us to be able to use supergravity. You can conceive a configuration with N not large, but then we cannot compute it: it is not in the regime of honest physics, in that case. There are boundaries in the space of string parameters, so we are constraining ourselves to a region where we can make computations. It would be very interesting to find a string theory model that gives a large value of r.

Liam’s conclusions were that inflation in string theory is developing rapidly and believable predictions are starting to become available. In D-brane inflation, the computation of field range in Planck units shows that detectable tensors are virtually impossible.

Dinner with Gordie Kane May 23, 2008

Posted by dorigo in personal, physics, science.

Yesterday evening the conference banquet of PPC 2008 was held at Yanni’s, a nice restaurant on Central Avenue in Albuquerque. I was lucky to sit at a table in the quite interesting company of several distinguished colleagues. Most notably, to my right sat Gordie Kane, with whom I had an interesting discussion about the expectations for Supersymmetry at the LHC and about the promise that String Theory may one day go as far as to explain really fundamental things, such as why quarks have the masses we observe, why the CKM matrix elements are what they are, and why the other queue is always faster.

Gordie was really surprised by my $1000 bet against new physics discoveries at the LHC. He was willing to take it himself, but I said I am already exposed with Distler and Watts. He is positive that the LHC experiments will find Supersymmetry, and in general he has a very optimistic attitude which is infectious. I went as far as to say I would be willing to buy string theory if somebody showed me there are prospects for really explaining things such as those I listed above, and after a further glass of wine I invited him to offer a guest post on this blog, where he could discuss the matter or, more loosely, the reasons to be optimistic about new physics being just around the corner. He said he would do it, although he is quite busy right now. So, expect a guest post by none less than Gordie Kane here within a month or two…

For the time being, I can just offer the following picture, taken by Mandeep Gill at my request during the banquet:

A review of yesterday’s afternoon talks: non-thermal gravitino dark matter and non-standard cosmologies May 21, 2008

Posted by dorigo in cosmology, news, physics, science.

In the afternoon session at PPC 2008 yesterday there were several quite interesting talks, although they were not easy for me to follow. I give a transcript of two of the presentations below, for my own record as well as for your convenience. The web site of the conference is quite quick at putting the talk slides online, however, so you might want to check it if some of what is written below interests you.

Ryuichiro Kitano talked about “Non-thermal gravitino dark matter”. Please accept my apologies if you find the following transcript confusing: you are not alone. Despite my lack of understanding of some parts of it, I decided to put it online anyway, in the hope that I will have the time one day to read a couple of papers and understand some of the details discussed…

Ryuichiro started by discussing the standard scenario for SUSY dark matter, with a WIMP neutralino: a weakly interacting, massive, stable particle. In general, one has a mixture of bino, wino, and higgsinos, and that is what we call the neutralino. In the early universe it follows an equilibrium (Boltzmann) distribution; then there is a decoupling phase, when the process inverse to production becomes negligible, and after freeze-out the number density of the neutralino simply scales as T^3. The final abundance is computed from the Boltzmann equation, dn_\chi/dt + 3 H n_\chi = -\langle \sigma v \rangle (n_\chi^2 - n_{eq}^2), by equating the annihilation rate n_\chi \langle \sigma v \rangle to the expansion rate H at the time of decoupling.
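If I remember my cosmology correctly, the outcome of that exercise is the well-known estimate \Omega_\chi h^2 \approx (3 \times 10^{-27} cm^3 s^{-1}) / \langle \sigma v \rangle, which is why a weak-scale annihilation cross section lands naturally in the ballpark of the observed dark matter density.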

In this mechanism there are some assumptions. The first is that the neutralino is the LSP, so it is stable. The second is that the universe is radiation-dominated at the time of decoupling. A third assumption is that there is no entropy production below T = O(100 GeV), otherwise the relative abundances would be modified. Are these assumptions reasonable? Assumption one restricts us to gravity mediation, but there is almost always a moduli problem, and that is inconsistent with assumptions 2 and 3. If you take anomaly mediation instead, the LSP is a wino, and that gives too small an abundance. We thus need a special circumstance for the standard mechanism to work.

The moduli/gravitino problem: in the gravity mediation scenario, there is always a singlet scalar field S which obtains a mass through SUSY breaking. S is a singlet under any symmetry, and it is what gives mass to the gauginos. Its potential cannot be stabilized, and it gets a mass only through SUSY breaking. Therefore, there exists a modulus field, and we need to include it when considering the cosmological history, because it has implications.

During inflation, the S potential is deformed, because S gets its mass only from SUSY breaking, so the initial value of the modulus will be modified. Once S-domination happens, it is a cosmological disaster. If the gravitino is not the LSP, it decays with a lifetime of the order of one year, and it destroys the standard picture of big bang nucleosynthesis (BBN); if that decay is forbidden, it is S that has a lifetime of O(1 y), still a disaster. This is inconsistent with the neutralino DM scenario, or better, gravity mediation is inconsistent.

So we need a special inflation model which does not couple to the S field, a very low-scale inflation such that the deformation of the S potential is small, and a lucky initial condition such that S-domination does not happen. Is there a good cosmological scenario that does not require such conditions?

Non-thermal gravitino DM is a natural and simple solution to the problem, and gauge mediation offers the possibility. The SUSY breaking sector needs to be fixed in this scenario, but most choices share the same effective Lagrangian. This implies two parameters in addition to the others: the height of the potential (describing how large the breaking is) and the curvature, m^4/\Lambda^4. In this framework, the gravitino is the LSP.

In the non-thermal gravitino dark matter scenario, the mechanism can produce the DM candidate as follows. After inflation, the S oscillation starts: we have a potential for it, with a quadratic term. The second step is the decay. The S couplings to superparticles are proportional to their masses, while the S-gravitino coupling is suppressed; this gives a smaller branching ratio to gravitinos, which is good for the gravitino abundance. Also, the shorter lifetime compared to gravity mediation is good news for BBN.

The decay of S to a bino pair must be forbidden to preserve the BBN abundances, so S \to hh is the dominant decay mode if it is open. If we calculate the decay temperature, we find a good match with BBN, and it is perfect for DM as far as its abundance is concerned.

There are two parameters: the height of the potential and the curvature. We have to explain the size of the gaugino mass, and that fixes one of the parameters. The gravitino abundance is explained if the gravitino mass is about 1 GeV. The baryon abundance, however, has to be produced by other means.

Step three is gravitino cooling. Are the gravitinos cold? They are produced relativistic, in the decay of 100 GeV particles, and their distribution is non-thermal; they slow down by redshifting. They must be non-relativistic at the time of structure formation.

If we think of SUSY cosmology, we should be careful about the consistency with the underlying model, e.g. gravity mediation. Gauge mediation provides a viable cosmology with non-thermal gravitino DM.

Next, Paolo Gondolo gave a thought-provoking talk on “Dark matter and non-standard cosmologies”. Again, I do not claim that the writeup below makes full sense (not to me, at least; maybe to you it does).

Paolo started by pointing out the motivations for his talk: they come directly from the previous talks, namely the problems with the gravitino and with the moduli. One might need to modify the usual cosmology before nucleosynthesis. Another motivation is more phenomenological. The standard results on neutralino DM are presented in the usual (M_0, M_{1/2}) parameter space, and one gets a very narrow allowed band due to the dark matter constraints from cosmology. These constraints assume standard cosmology back to primordial nucleosynthesis: neutralinos were produced thermally, decoupled at a later time, and remained with a residual abundance. This might not be true, and if it isn’t, the whole parameter space might still be consistent with the cosmological constraints.

[This made me frown: isn’t the SUSY parameter space large enough already? Do we really need to revitalize parts not belonging to the hypersurface allowed by WMAP and other constraints?]

The above occurs just by changing the evolution of the universe before nucleosynthesis. By changing \tan \beta you can span a wider chunk of parameter space, but that is only because you are looking at a projection. The cosmological constraints select an (n-1)-dimensional hypersurface, and one can extend the allowed region outside of it. But this comes at the price of more parameters. Don’t we have enough parameters already?

The cosmological density of neutralinos may differ from the usual thermal value because of non-thermal production or non-standard cosmologies. J. Barrow, in 1982, wrote of massive particles as a probe of the early universe, so it is an old idea. It continued in 1990 with a paper by Kamionkowski and Turner, “Thermal relics: do we know their abundances?”.

So let us review the relic density in standard cosmology, and the effect of non-standard ones. In standard cosmology the Friedmann equation governs the evolution of the scale factor a, and the dominant dependence of \rho on a determines the expansion rate. Today we are matter-dominated, and we were radiation-dominated before, because \rho scales with different powers of the scale factor: now \rho \propto a^{-3}, while before it went as a^{-4}. Before radiation domination there was reheating, and before that, the inflationary era. At early times, neutralinos are produced in e+e- and \mu+\mu- collisions. Then production freezes out; here people say that neutralino annihilation ceases, but it really is the production which ceases. Annihilation continues at smaller rates until today, so that we can look for it, but it is production that stops. The number of neutralinos per photon is constant after freeze-out. The higher the annihilation rate, the lower the final density: there is an inverse proportionality.
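For the record, the equation in question is H^2 = (\dot{a}/a)^2 = (8 \pi G/3) \rho: with \rho \propto a^{-4} (radiation) the scale factor grows as a \propto t^{1/2}, while with \rho \propto a^{-3} (matter) it grows as a \propto t^{2/3}.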

The freeze-out occurs during the radiation-dominated epoch, at T of about 1/20th of the particle mass, so at a much higher temperature than that of the matter-dominated universe. Freeze-out occurs before BBN, so we are making an assumption about the evolution of the universe before BBN. What can we do in non-standard scenarios? We can decrease the density of particles by producing photons after freeze-out (entropy dilution): increasing the number of photons lowers the final density. One can also increase the density of particles by creating them non-thermally, from decays. Another way is to make the universe expand faster during freeze-out, for instance in quintessence models.

The expansion-rate mechanism works because a faster expansion makes freeze-out happen earlier, leaving a higher density. What if instead we want to keep the standard abundance? If we want to produce WIMPs with the thermal mechanism, we need a standard Hubble expansion rate down to T = m/23, i.e. down to freeze-out. A plot of m/T_{max} versus \langle \sigma v \rangle shows that thermal production is aborted if m/T_{max} > 23.

How can we produce entropy to decrease the density of neutralinos after freeze-out? We add massive particles that decay or annihilate late, for example a next-to-LSP. We end up increasing the photon temperature and entropy, while the neutrino temperature is unaffected.

We can increase the expansion rate at freeze-out by adding energy to the Universe, e.g. a scalar field, or by modifying the Friedmann equation through an extra dimension. Alternatively, one can produce more particles through decays.

In braneworld scenarios, matter is confined to the brane, and gravitons propagate in the bulk. This gives extra terms in the Friedmann equation, proportional to the square of the density, with a coefficient set by the five-dimensional Planck mass. We can get different relic densities. For example, in the plane of m_0 versus gravitino mass, the wino is usually not a good candidate for DM, but in Randall-Sundrum type II scenarios it is. We can resuscitate models of SUSY that people think are ruled out by cosmology.
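If I recall correctly, in Randall-Sundrum type II cosmology the modified Friedmann equation reads roughly H^2 = (8 \pi G/3) \rho (1 + \rho/2\lambda), where \lambda is the brane tension, fixed by the five-dimensional Planck mass: the extra \rho^2 term speeds up the expansion at high density, makes freeze-out happen earlier, and so raises the relic abundance.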

Antiprotons from WIMP annihilation in the galactic halo constrain the RS type II model. The five-dimensional Planck mass M_5 is constrained differently by different observables; antiprotons push the bound above 1000 TeV.

Non-thermal production from gravitational acceleration: at the end of inflation the acceleration was so high that massive particles could be created. They can have the right dark matter density if their mass is of the order of the Hubble parameter at that time. Non-thermal production from particle decays is another non-standard case which is not ruled out.

Then there is the possibility of neutralino production from a decaying scalar field. In string theories, the extra dimensions may be compactified as orbifolds or Calabi-Yau manifolds. The surface shown in the slides was the solution of an equation such as z_1^5 + z_2^5 = 1, with the z_i complex numbers. The size and shape of the compactified dimensions are parametrized by moduli fields \phi_1, \phi_2, \phi_3…, and the values of the moduli fields fix the coupling constants.

Two new parameters are needed to evade the cosmological constraints on SUSY models: the reheating temperature T_{rh} of the radiation when \phi decays, which must be above about 5 MeV from BBN constraints, and the number b of neutralinos produced per \phi decay divided by the \phi mass, b/m_\phi. The quantity b depends on the choice of the Kähler potential, the superpotential, and the gauge kinetic function, hence on the high-energy theory: the compactification, the fluxes, etc. you put in your model. By lowering the reheating temperature you can decrease the density of particles; the higher b/m_\phi, the higher the density you can get. So you can get almost any DM density you want.
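If I followed correctly, when annihilations after the \phi decay are negligible the yield comes out roughly as n_\chi/s \simeq (3/4) T_{rh} (b/m_\phi), which indeed contains both knobs: a lower T_{rh} means less dark matter, a larger b/m_\phi means more.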

Neutralinos can be cold dark matter candidates anywhere in the MSSM parameter space, provided one allows these other parameters to vary.

If you work with a non-standard cosmology, the constraints are transferred from the low-energy to the high-energy theory. The discovery of non-thermal neutralino DM may open an experimental window on string theory.

[And it goes without saying that I find this kind of syllogism quite a stretch!]