Gary Steigman: Neutrinos and Big Bang Nucleosynthesis April 17, 2008

Posted by dorigo in astronomy, cosmology, physics, science.

Here follows a summary of the talk by Gary Steigman at the Neutrino Oscillations 2008 conference I am attending in Venice. Due to a chronic shortage of time, I will not attempt to reorganize my notes in a coherent way - apologies if the text is obscure: the lack of the figures and graphs Gary used in his talk cannot, unfortunately, be made up for by clever explanations on my part. I therefore advise readers with no background in basic cosmology to jolly well skip this post and read something else. On the other hand, insiders will find no really new information here… So this post is essentially just for my own record! I will add pictures if I find the slides on the web; if not… too bad.

Gary started by noting that in our attempts at understanding the evolution of the Universe, the evidence from large-scale structures allows us to study times from a few minutes after the big bang to about ten billion years after it. The evolution is punctuated by three important moments.

When the Universe is a tenth of a second old, neutrinos decouple from matter. This transition, like the others, is not sharp: neutrinos continue to interact at this time, but on a time scale that is becoming long compared to the age of the Universe. A few minutes later, elements begin to form. Nuclear reactions continue to happen afterwards, but there is primordial nucleosynthesis only when the Universe is a few minutes old. Finally, about 400 thousand years later, electrons combine with protons to form neutral atoms, and that is when the relic photons become free and can propagate all the way to us.

So we have Big Bang Nucleosynthesis (BBN, 20 minutes after the big bang), the Cosmic Microwave Background (400 kiloyears after BB), and the Large Scale Structure of the Universe (10 gigayears after BB). They are all complementary probes of the early evolution of the Universe.

The question to ask oneself, according to Steigman, is whether the predictions and observations of the baryon density \eta_B and the expansion rate agree at these different epochs.

The early hot and dense universe is, during part of its evolution, a cosmic nuclear reactor. As the universe expands, BBN begins when the temperature is about 70 keV, when the ratio between neutron and proton abundances is n/p \simeq 1/7. This ratio is crucial for the helium abundance. Nucleosynthesis begins but very quickly ends, because the temperature drops and there are Coulomb barriers between charged nuclei; neutrons also get used up by beta decay, and at T = 30 keV, 24 minutes after the start, nucleosynthesis ends.
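The arithmetic behind the helium number quoted later is simple: if essentially all available neutrons end up bound in Helium-4, the mass fraction follows directly from n/p. A quick sketch (my own illustration, not from the slides):

```python
# Helium-4 mass fraction from the neutron-to-proton ratio at the onset of BBN.
# Each He-4 nucleus binds 2 neutrons and 2 protons, so if essentially all
# neutrons are captured, the mass fraction is Y_P = 2(n/p) / (1 + n/p).

def helium_mass_fraction(n_over_p):
    """Primordial He-4 mass fraction assuming all neutrons end up in He-4."""
    return 2 * n_over_p / (1 + n_over_p)

print(helium_mass_fraction(1 / 7))  # n/p ~ 1/7 -> Y_P = 0.25
```

With n/p = 1/7 this gives exactly 0.25, the plateau value Gary quotes below.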

When we talk about the baryon density we mean the nucleon density. But as the universe expands, the density changes. A parameter which remains invariant is the ratio \eta_B = N_B / N_\gamma between nucleon and photon number densities. \eta_B is of course very small, so \eta_{10} is defined as the same number multiplied by ten billion. In terms of \Omega_B and the Hubble parameter h, we can write \eta_{10} = 274 \Omega_B h^2.
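The relation is trivial to evaluate; as a sanity check (the input value of \Omega_B h^2 here is an illustrative CMB-era number of my choosing, not one from the talk):

```python
# Baryon-to-photon ratio eta_10 = 10^10 * (N_B / N_gamma) from Omega_B h^2,
# using the relation quoted in the post: eta_10 = 274 * Omega_B * h^2.

def eta10(omega_b_h2):
    """eta_10 from the physical baryon density Omega_B h^2."""
    return 274 * omega_b_h2

# Illustrative input Omega_B h^2 ~ 0.0223 gives eta_10 ~ 6.1,
# consistent with the numbers quoted later in the post.
print(round(eta10(0.0223), 2))
```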

One of the key elements produced in the early universe is Deuterium. Its abundance is maximal at about 300 seconds; there is none before 100 seconds. Then it burns through Tritium to end up in Helium-4, and it decreases. When the universe is about 1000 seconds old the relative abundance D/H stops changing. This ratio depends on \eta_{10}: more nucleon density means less Deuterium produced. So Deuterium is a baryometer: it measures the density of baryons. As the Universe evolves, D is destroyed. Everywhere, the relative abundance of Deuterium is smaller than its plateau value: its evolution since the big bang is monotonic. It can only decrease.

The predicted value of the D/H ratio is sensitive to the baryon density: a 10% determination of the D abundance translates into a 6% determination of the baryon-to-photon ratio.
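The 10%-to-6% mapping follows from the steep scaling of D/H with the baryon density. The exponent below is the commonly quoted D/H \propto \eta^{-1.6} scaling, an assumption on my part since it was not in my notes:

```python
# Error propagation for the deuterium baryometer. D/H scales roughly as
# eta^(-1.6) (commonly quoted exponent, not from the talk), so fractional
# errors map as sigma_eta/eta = (1/1.6) * sigma_(D/H)/(D/H).

def eta_fractional_error(dh_fractional_error, exponent=1.6):
    """Fractional error on eta induced by a fractional error on D/H."""
    return dh_fractional_error / exponent

print(eta_fractional_error(0.10))  # 10% on D/H -> ~6% on eta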

The way we observe deuterium is in absorption against bright light sources, high-redshift quasi-stellar objects. H I and D I lines are seen in absorption, but their spectra are identical: an isotope shift is completely equivalent to a velocity shift, so we have to be very careful in interpreting it. Unresolved velocity structure in the measured object causes errors in N(H I). We need to measure the heavy elements to determine the velocity structure.

Data on D/H can be plotted against metallicity - the ratio of heavy to light elements, such as the relative abundance of Silicon and Hydrogen, Si/H: a measure of heavy element abundance. We only have six data points from background sources at various (small) values of metallicity, and we have to understand their velocity structure well. The points show a lot of dispersion in D/H, and it is hard to see a clear plateau at low metallicity. However, we can fit 10^5 (D/H) = 2.68 \pm 0.27, taking the dispersion around the mean as the uncertainty. We thus find a 10% error in the deuterium abundance. We can use that to measure \eta_{10} = 6.0 \pm 0.4, a 6% uncertainty.

The evolution of the Helium-4 mass fraction is represented by astronomers as Y_P. It evolves starting from 200 seconds, increases up to 0.25 at 300 seconds, and then plateaus. Helium-4 production starts once deuterium starts burning. Then all neutrons are quickly used up, and we get a plateau. With a neutron-to-proton ratio of 1/7 when nucleosynthesis begins, Y_P is 0.25 with very little spread. So the helium abundance is insensitive to the value of \eta_{10}, but it depends crucially on the competition between the weak interaction rates - charged-current weak interactions - and the expansion rate of the Universe: so the helium abundance can provide constraints on the expansion rate of the early Universe.

The expansion rate is usually defined in terms of the Hubble parameter H, which provides a probe of non-standard physics. There are many models where H deviates from its standard value. The square of the ratio of the modified rate H' to the standard H measures the energy density in relativistic particles relative to the Standard Model expectation with three families of light neutrinos; anything that changes that picture causes a deviation. An expansion rate parameter S = H'/H measures the deviation, with S^2 = 1 + 7 \Delta N_\nu / 43 when the deviation is parameterized by extra effective neutrino species N_\nu. There can be many reasons why S departs from 1: extra dimensions like those in the Randall-Sundrum model change S, and S is also sensitive to a difference of the gravitational constant from today's value.
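The parameterization above is a one-liner; a sketch of how S responds to extra effective neutrino species:

```python
import math

# Expansion rate factor S = H'/H as a function of extra effective neutrino
# species, using the standard parameterization S^2 = 1 + 7*DeltaN_nu/43.

def expansion_factor(delta_n_nu):
    """S = H'/H for DeltaN_nu extra effective relativistic species."""
    return math.sqrt(1 + 7 * delta_n_nu / 43)

print(expansion_factor(0.0))              # Standard Model: S = 1
print(round(expansion_factor(1.0), 3))    # one extra species: S ~ 1.078
```

So even a full extra neutrino species speeds up the expansion by only about 8%, which is why the helium plateau is such a sensitive probe.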

We can determine Y_P from helium abundance measurements, and we find $Y_P = 0.24 \pm 0.006$.
As a function of the oxygen-to-hydrogen abundance O/H one can determine Y. Systems with O/H of about 10^{-4} allow a linear extrapolation to zero oxygen abundance, and one finds the value Y = 0.24. Alternatively, instead of using about 90 data points with an uncertainty dominated by systematics, there are other analyses more careful about systematics, where the trend with metallicity is seen more clearly. Any helium one sees is greater than the primordial value, so an upper bound can be extracted from the data, and it is found at Y < 0.255.

From standard big bang nucleosynthesis the prediction is Y_P = 0.248, consistent with observation. Plotting the deuterium and helium observations together, \eta_{10} and S can be determined as a function of Y_P and the deuterium abundance. The helium abundance depends strongly on the expansion rate factor S, while the deuterium abundance depends on it only slightly. Putting these together we find a consistent, not unexpected, possibility of explaining everything with N_\nu = 2.4 \pm 0.4, about 1.5 standard deviations below the standard value of three neutrinos.
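The level of (dis)agreement with three families is just a pull computation on the numbers quoted above:

```python
# Pull of the BBN-fitted effective neutrino number from the Standard Model
# value of three light neutrino families, using N_nu = 2.4 +- 0.4.

def pull(measured, error, expected=3.0):
    """Distance of a measurement from the expected value, in standard deviations."""
    return abs(expected - measured) / error

print(pull(2.4, 0.4))  # -> 1.5 standard deviations
```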

In the N_\nu vs \eta_{10} plane, one has a nice contour plot from V. Simha and G. Steigman. We are consistent with 3 neutrinos. In particular, the BBN constraint from He-4 shows very clearly that at least one neutrino flavor was present in the early Universe: at more than 2-sigma, there was at least one of them. Also, four neutrino flavors are excluded by BBN.

About lithium: it is produced in the big bang in low abundance, in the form of Li-7. There is a gap at mass 5, very hard to jump in nucleosynthesis, but there are some reactions that take you up to mass seven. It stops there: only light elements are formed in the early Universe. The standard BBN prediction of the Li abundance is off: as a function of the iron-to-hydrogen abundance Fe/H, one finds values a factor of three lower than predicted. Question: should we see a plateau - the Spite plateau - at low metallicity? If there is a plateau, we can determine it by drawing a line through the data points, finding a Li abundance of 12 + log(Li/H) = 2.1. There is too little lithium according to the measurements.
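For readers unfamiliar with the astronomers' logarithmic notation, 12 + log10(Li/H) = 2.1 is easily converted back to a number ratio:

```python
# Convert the astronomers' logarithmic abundance A(Li) = 12 + log10(Li/H)
# back to the lithium-to-hydrogen number ratio, for the plateau value above.

def li_over_h(a_li):
    """Li/H number ratio from the logarithmic abundance A(Li)."""
    return 10 ** (a_li - 12)

print(li_over_h(2.1))  # ~1.26e-10: roughly one Li nucleus per 10^10 H atoms
```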

The CMB radiation provides a complementary probe. The temperature fluctuation spectrum constrains the baryon density: different curves can be drawn over the measured temperature fluctuations of the cosmic microwave background, corresponding to different values of the baryon-to-photon ratio, e.g. \eta_{10} = 4.5, 6.1, 7.5. This illustrates that the data can discriminate the baryon density: one has an early-Universe baryometer which is better than deuterium. One finds \eta_{10} = 6.1 \pm 0.2, an uncertainty a factor of two better than what we can get from the deuterium abundance. In fact, if we superpose them, we find excellent agreement.
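Since the two baryometers agree, they can be combined; a standard inverse-variance weighting of the two \eta_{10} values quoted in this post (my own illustration):

```python
# Inverse-variance weighted combination of two eta_10 determinations:
# BBN/deuterium (6.0 +- 0.4) and CMB (6.1 +- 0.2), as quoted in the post.

def combine(values_errors):
    """Weighted mean and error of (value, error) pairs, weights = 1/error^2."""
    weights = [1 / e ** 2 for _, e in values_errors]
    mean = sum(v * w for (v, _), w in zip(values_errors, weights)) / sum(weights)
    error = (1 / sum(weights)) ** 0.5
    return mean, error

mean, err = combine([(6.0, 0.4), (6.1, 0.2)])
print(round(mean, 2), round(err, 2))  # 6.08 0.18: the CMB dominates the average
```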

If one puts the CMB results together with the BBN results, they overlap well; taking the CMB values of the baryon density and N_\nu and using BBN to predict the abundances, one finds good matches for Y_P and the deuterium abundance, but no good agreement on the lithium abundance.

What are the consequences of the good agreement between the physics at 20 minutes and at 400,000 years? Entropy conservation: the ratio of the number of photons inferred from the CMB to that inferred from BBN is found to be 0.92 \pm 0.07. This ratio is one unless there is entropy creation in between, so one can place upper limits on entropy production.

A modified radiation density from the late decay of a massive particle also gives different abundances at the two time scales, and one finds constraints on that too. For variations of the gravitational constant, one can interpret the expansion parameter in terms of G; comparing the BBN value with the present value, one finds a ratio consistent with one.

Comments

1. forrest noble - May 15, 2008

Hey Tommaso, never will spell it wrong again

Time spent on BB nucleosynthesis, I believe, would be much better spent studying the ancient Roman gods. In the latter case you would at least be learning something. If one loves pure science fiction then BB nucleosynthesis is certainly the right field to entertain oneself. If one eventually realizes there was no BB, the synthesis of it would be like studying Ptolemy’s epicycle system today. Interesting but no cigar or prize of any kind. Granted you have to have ability, but it seems to me that to believe the BB is presently on sound footing, as a theory, one has to be kidding himself: Dark energy, uneven expansion of the universe, bogus explanations of superluminosity, old galaxies observed at 11 billion light years (a valid observation), quasars as special entities rather than AGN, vacuous black holes, multiple universes and dimensions, the CMB, etc. etc. 21st century theories in general, I believe, will be the biggest jokes of the 22nd century. Within the next 5 to 10 years it will begin to become apparent.

forrest_forrest@netzero.net

your friend forrest

2. dorigo - May 15, 2008

Hi Forrest,

while you put your finger on a few mysteries that are indeed nagging evidence that we do not understand everything under and beyond the sun, your attack on the big bang as a valuable theory is rather silly. It is the best theory we have so far, period. Science moves forward by formulating theories: these provide the guideline to investigate further. No alternative to the big bang exists which comes even close to explaining what the BB explains with great accuracy - sure, after one has bought a few additional unjustified assumptions.
As for BBN, it provides great precision in tracking the abundance of light elements in the universe.

Cheers,
T.

3. forrest noble - May 16, 2008

Granted, Tommaso

There are, I believe, as many mysteries concerning The Big Bang as are continuously pointed out by each new cosmological observation. Most every new observation would seem to completely contradict the BB. Although I have my own book and theory, which I believe, based upon the evidence, predicts these observations before they are observed, and makes hundreds of new predictions. Ten predictions have already been observed. The BB in contrast makes only one prediction: that the universe is 13.7B years old.

Other theories, which I also don’t subscribe to, but which I think are closer to the truth, are the Steady State Theory, and one that is maybe not as good as the Steady State Theory but is better than the BB, the Plasma Cosmology theory. The Theory of Inflation is needed to explain BB cosmology; my own theory, I think, is backed by more evidence and is much more logical, and requires maybe only 1% of the mathematics and new physics.

The lighter elements can be explained by nuclear fission processes within stars, which is a simpler and provable theory as opposed to nucleosynthesis.

I know you are probably not interested in alternative theory, but one thing cannot be denied: if Occam’s razor were the primary value indicator, the BB theory is maybe the most complicated theory of dozens of alternatives. My own, I believe, is a much simpler explanation of the cosmos that is also backed by observation. Unlike the Big Bang Theory, I BELIEVE MY THEORY CAN CORRECTLY EXPLAIN ALL THE MOST IMPORTANT QUANDARIES OF ALL TIMES. Most science practitioners and students that have read it entirely usually prefer this theory over all other alternatives.

Please let me know if you think that I have misstated the facts.

forrest_forrest@netzero.com

your friend forrest

