Summary of the summary talks March 31, 2007

Posted by dorigo in astronomy, news, physics, science.

A day has passed since the end of the “Outstanding Questions in Cosmology” conference, but only now have I found the time to write down my final report, centered on the summary talks given on Thursday afternoon. I was busy enjoying London with my wife… But let’s keep this about Cosmology.

So. Al Stebbins gave the first of the two summary talks.

He started by noting that the conference had not created too much “back and forth”, and so he hoped he would manage to say something that made somebody mad. But that did not happen, unfortunately: his talk was quite uncontroversial…

Stebbins tried to put the current “precision cosmology measurements” in a bit of perspective, by noting that if you asked cosmologists in the mid 70s what the universe was made of, they would reply baryons 40% and curvature 60%, end of story. The universe, that is, did not look closed back then. A decade later, the general view was that dark matter made up 85%, and baryons 15%. Moreover, cosmologists would agree that by symmetry the universe must be flat: zero curvature. In more recent years, all of a sudden dark matter went down to 24%, baryons to 4%, and dark energy up to a whopping 72%. So the universe is made of repulsive stuff, mostly. You could extrapolate what could happen in some more years…

The Fermilab scientist did not go as far as to compute it, but I guess I can do the math myself: baryons will cease to exist in five more years. So cosmologists really have to work hard since there is not much time left before pencils start to vanish… If that were so, Krauss’ worries about the miserable future of Cosmology would have been far too optimistic (and he in fact denied being a pessimist during the question time following his talk!) Anyway, seriously: the point made by Stebbins was rather that theorists must remain vigilant. We could indeed be declaring victory too soon.

Stebbins then reminded the audience of a few of the Big Questions that were dealt with this week: questions of content, of kinematics… Where are things, what is the intercluster medium, where are the baryons… And other questions of dynamics: what are the forces that cause things to move. Also, questions about initial conditions. He noted that inflation was never discussed very much at the conference. And indeed, inflation, as many had noted in their talks, is not a testable theory. No matter what you say you measure, you cannot really test it.

Later, Einstein was quoted: “The most incomprehensible thing about the world is that it is at all comprehensible”. Stebbins noted that scientists tend to really believe in their models. The models are not perceived as just good descriptions, but as plain real. But there are problems with Physics: it is difficult to explain everything. And the universe seems to be “too explainable” rather than “at all comprehensible” (another Albert quote, but Stebbins this time, not Einstein).

Model selection, a topic discussed widely at the conference, was then mentioned. Stebbins noted that in Cosmology there seem to be brief periods of time when you learn a lot about a specific topic. The most recent trends in Cosmo-statistics see Bayesian statistical tests, Fisher matrices, MCMC (Gibbs sampling), Bayesian Evidence/Model selection. Many people at the conference have computed “Bayesian evidence”. It has some sort of sociological content. But the relevant question remains, Do we really believe in our model ? Or are we forced into it by the data ? If that is the case, soon a lot of people might have to stop doing what they are doing.
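For readers unfamiliar with what “computing the Bayesian evidence” means in practice, here is a toy sketch of my own (fake data and an invented one-parameter model; nothing to do with any actual analysis shown at the conference). The evidence is the likelihood integrated over the prior, and it automatically penalizes unused parameter freedom:

```python
import numpy as np

# Toy Bayesian model selection: compare a zero-parameter model (w fixed at
# -1, i.e. a cosmological constant) against a one-parameter model (w free,
# flat prior) on fake measurements, via the evidence Z = integral L(w)p(w)dw.

sigma = 0.1
data = -1.0 + sigma * np.linspace(-1.0, 1.0, 20)  # fake w measurements, mean exactly -1

def log_like(w):
    resid = data - w
    return -0.5 * np.sum(resid**2) / sigma**2 - len(data) * np.log(sigma * np.sqrt(2 * np.pi))

# Model A: w = -1 exactly, no free parameters -> the evidence is just the likelihood.
log_Z_A = log_like(-1.0)

# Model B: w free, flat prior on [-2, 0] (prior density 1/2) -> integrate numerically.
w_grid = np.linspace(-2.0, 0.0, 4001)
dw = w_grid[1] - w_grid[0]
Z_B = np.sum(np.exp([log_like(w) for w in w_grid])) * dw * 0.5
log_Z_B = np.log(Z_B)

# The unused parameter freedom costs Model B an "Occam factor": with data
# perfectly consistent with w = -1, the simpler model wins the Bayes factor.
print(log_Z_A - log_Z_B)  # positive -> the data favor the simpler model
```

The log Bayes factor comes out positive (about 3.6 here): the data are explained equally well by both models, so the simpler one wins. Whether that means we “really believe” Model A is, as Stebbins noted, another question.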

Then Stebbins said a few things about dark energy. Why do not we stop at a cosmological constant ? Is it because it does not fit the data, or is it because some do not like the whole idea ? If you believed in the degrees of freedom that are actually there until you take a stand on a particular model, you would not be able to constrain anything with the experiments. Here Stebbins quoted Krauss: even in the case of just two parameters, it is practically impossible to put constraints on anything. You learn very little even by putting together all the data from different experiments and tests. Stebbins however also quoted Pahud, who said that the Planck experiment will probably be able to do something to move us from the impasse, especially in combination with low-z observations.

About dark matter and modified gravity models, new particles or WIMPs: Stebbins said dark matter appears to be not controversial, but it should be. The thing is, we prefer to add parameters to somebody else’s standard model to make our fit work, rather than adding parameters to our own. The highlight is mine, because this is really a witty remark which we should think over. And another remark by Stebbins I subscribe to was that globular clusters and dwarf galaxies are simple systems, so we should study those and try to understand them. And in fact, another comment I venture to insert here is that the new evidence of black holes sitting in star clusters, and the apparent constancy of the ratio of black-hole mass to total host mass for these and other much larger systems, are compelling reasons to get back to the blackboard for an explanation of these things.

In any case, even if presently proposed alternatives to dark energy do not work, one can be assured that there is an alternative which is indistinguishable from the standard cosmological model. So the question really is, Stebbins noted: do we have to wait for the data to require an alternative ? The real truth is that these missions that measure dark energy are also measuring a lot of other things. And, Stebbins said, there will always be anomalies: Situation Normal, All Fucked Up: SNAFU. Which, by the way, is what makes things interesting to me: as long as there are disagreements with the most fashionable theoretical model, there is a chance that advances can be made… Compare the situation in cosmology with what happens with the standard model of particle physics and you get the picture.

Talking about anomalies, Stebbins mentioned the wide exposure that the CMBR anomaly got at the conference. It is an effect of low statistical significance, but worth investigating. Spending a few words on the issue, Stebbins mentioned different views on the matter: according to Andrew Jaffe, a particular topology of the universe can explain it. Carlo Contaldi instead favors the possibility that the anomaly is a result of anisotropic inflation. Others gave their own interpretations, but my own pitch is that this particular anomaly is really not even that: a two-sigma effect is too little to cause so much interest, in my opinion. It is good to be prepared for a possible future when those two sigma become four or five, but spending too much time discussing the alternatives now seems a bit overkill to me.

– – –

After Stebbins, Richard Lieu gave a more detailed summary, in a style complementary to that of the former presentation. He started by showing a table of different elements of the present-day cosmology, with crosses on two columns, signalling whether there is strong evidence or established observation of each effect, or insufficient or controversial evidence. Then he went on to discuss each line, by showing the most interesting plots presented by the contributors, and discussing them in turn. I did not take detailed notes of his talk, because of its more strict “review” nature, and because I had gotten fed up with typing!

However, I did notice one point he made when he discussed the plot of Delta(M-m) versus redshift (z) of supernovae in far galaxies. Lieu clarified that the supernova data are actually spread vertically quite a bit for every value of z. What one usually gets to see in the supernova plots is the combination of many different Delta(M-m) values measured from different supernovae at each z. And the combination is done by taking the median of the Delta(M-m) values, not the mean! Now, there surely is a good reason for doing that – but I had not noticed it, and I will investigate the matter for my own education as soon as I have a chance.
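While I investigate, here is a quick sanity check of what the median buys you (toy numbers of my own invention, not real supernova measurements): a single badly-measured object drags the mean a lot, and the median hardly at all.

```python
import statistics

# Toy illustration, not real data: five well-behaved Delta(M-m) values at
# one redshift, then the same set with one catastrophic outlier added
# (e.g. a misidentified or badly calibrated supernova).
delta_mu = [0.21, 0.19, 0.23, 0.20, 0.22]
delta_mu_bad = delta_mu + [1.50]

print(statistics.mean(delta_mu), statistics.median(delta_mu))          # ~0.21 and 0.21
print(statistics.mean(delta_mu_bad), statistics.median(delta_mu_bad))  # mean jumps to ~0.43, median only to 0.215
```

So if the vertical spread at fixed z is contaminated by a few pathological objects, the median is the safer combination; whether that is the actual reason the supernova people use it, I still have to check.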

In fact, in the conference I also learned that Ia supernovae – those that appear to be caused by accretion from a companion until criticality is reached, and could thus be thought of as close copies of each other (for, if mass is by far the most important variable in determining the dynamics of the explosion, one might expect that a critical mass is equal to any other) are far from being “standard candles”: they show a wide variety of time evolution in their luminosity, such that their use for distance measurements is not completely straightforward. Enough to doubt the results of the analyses ? Probably not, but worth keeping in mind.

By reading back the last paragraph, I notice that one sentence is 95 words long. It is a defect – ok let’s call it a feature – in my writing style which I inherited from my father. Was the sentence too hard to follow ? Probably yes. I need to restrain myself a bit in that respect. And with this totally irrelevant comment, dear reader, I leave you and get a deserved sleep.

Comments

1. Guess Who - March 31, 2007

Thank you for the great conference coverage!

A last late-night comment:

Why do not we stop at a cosmological constant ? Is it because it does not fit the data, or is it because some do not like the whole idea ?

Roll the clock back to roughly a decade ago, when neutrino experiments were suggesting a negative neutrino mass squared. Do you remember anybody taking this at face value and concluding that neutrinos are tachyons? Actually, there was one guy, a notorious crackpot best known for a gravitational perpertuum mobile… but real physicists knew what a can of worms that would open. It would have been an extraordinary conclusion, so absent extraordinary evidence they simply assumed that the experiments must be flawed.

Compare dark energy. To the average cosmologist, it’s really no big deal; it’s just a constant which you’ve always been free to add to Einstein’s equations, no problem. To see the problem, you need to know your high energy theory. And then you see a Really Big Problem: the thing is absurdly small compared to naive estimates of zero point energy in QFT coupled to gravity. To match the measured result, you need cancellations miraculously fine-tuned to some 120 decimals. There’s nothing like this anywhere else in physics. It’s totally unnatural. If the measured value had been 0, you could have assumed the existence of some exact symmetry leading to that result, but this extraordinarily small, finite value throws that option out the window.
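[The mismatch GW quotes can be checked on the back of an envelope with standard textbook numbers; the following rough sketch is an editorial illustration, not part of the original comment.]

```python
import math

# Rough order-of-magnitude check of the famous mismatch: cutting the QFT
# zero-point energy off at the Planck scale gives a naive vacuum energy
# density rho_vac ~ M_Planck^4 in natural units, while the observed dark
# energy density corresponds to a scale of roughly 10^-3 eV, i.e. ~(10^-3 eV)^4.
M_planck_GeV = 1.2e19          # Planck mass in GeV
rho_naive = M_planck_GeV ** 4  # naive zero-point estimate, in GeV^4
rho_obs = (1.0e-12) ** 4       # (10^-3 eV = 10^-12 GeV)^4, in GeV^4

print(math.log10(rho_naive / rho_obs))  # ~124 -> the famous "~120 decimals"
```

Depending on where exactly one puts the cutoff and the observed scale, the ratio lands between roughly 60 and 120 orders of magnitude, which is why both numbers circulate in the literature.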

If cosmologists were high energy theoreticians, they might have reacted like physicists did a decade ago when confronted with apparent negative neutrino mass squared: shrug it off as just too far-fetched to be taken seriously. You can see such a tendency among the few people who work in the interface between particle physics and cosmology and understand both fields well (see Sarkar’s talk). But to most cosmologists, dark energy is just that harmless-looking constant in Einstein’s equations, or at most some simple inflaton-like field with a suitably constructed potential (looking completely contrived to a high energy theorist who worries about such things as renormalizability and, again, naturalness). You could say that they don’t know any better. Or maybe that they prefer to add parameters to somebody else’s standard model to make their fits work, rather than adding parameters to their own…

But the problem caused by dark energy is truly extraordinary: you are left with the choice between a universe which is ridiculously fine-tuned (by whom?) and one which only looks like it is (because we wouldn’t be around if it weren’t). The first option practically begs for an Intelligent Designer which fine-tuned the parameters of our universe, the second for an anthropic landscape which realizes all possible combinations, leaving us to live in some infinitesimal part of it where the conditions are right. Both “solutions” fall outside the scope of the scientific paradigm: they are unfalsifiable, the choice between them boils down to a matter of faith, and no matter which one you choose, the hope to ever arrive at a simple fundamental description of the universe is lost.

What’s a tachyon compared to that?

2. Tony Smith - March 31, 2007

Guess Who said “… roughly a decade ago, when neutrino experiments were suggesting a negative neutrino mass squared … real physicists knew what a can of worms that would open. It would have been an extraordinary conclusion, so … they simply assumed that the experiments must be flawed. …”.

Even today, a negative neutrino mass squared has not been ruled out by experiment.
hep-ex/0412056, entitled “Final Results from phase II of the Mainz Neutrino Mass Search in Tritium beta Decay”, says in part:
“… Data taking on the search for the neutrino mass covered the years 1998 to 2001 and yielded the so far narrowest limit on the observable m^2(nu_e) of (-0.6 +/- 3.0) eV^2/c^4 …
In 2000 the KATRIN collaboration formed proposing to build a large MAC-E-filter in combination with a gaseous T2 source at the site of the Forschungszentrum Karlsruhe. … The present design aims at reaching within 3 years of measurement a precision of m^2(nu_e) = 0.02 eV^2/c^4 corresponding to a sensitivity limit of 0.2 eV/c^2 for the mass itself. The experiment should be ready to go in 2008. …”.

So, people like me who believe that experimental results are far more important than theoretical prejudice should wait until about 3 years after 2008 to be confident whether or not negative mass squared for neutrinos has been ruled out.

Tony Smith

3. Andrea Giammanco - March 31, 2007

the median is a fairly good estimator of the most probable value (i.e. the peak position) for asymmetric distributions, under very general conditions.
maybe the distribution is asymmetric?

4. Andrea Giammanco - March 31, 2007

or maybe the distribution is almost symmetric but there is a concern about outliers.
the median, as you may easily understand, is quite insensitive to outliers; the mean is not (a rare outlier, but with a huge deviation from the mean, pushes the arithmetic mean quite a lot).

5. Starrynight - March 31, 2007

I’d just like to say thanks for that really enjoyable summary, it was spot on!

6. dorigo - March 31, 2007

Andrea,

I do know a bit about outliers and statistics which are insensitive to them – say, the KS distance for instance. I agree, medians are less sensitive to outliers than means. What I find peculiar is that no attempt is made at explaining odd distributions. But I may have missed some.

Cheers,
T.

7. dorigo - March 31, 2007

Hi GW,

thank you for your comment – give me some time and I will indeed guess who you are.

Anyway, “Both “solutions” fall outside the scope of the scientific paradigm: they are unfalsifiable, the choice between them boils down to a matter of faith, and no matter which one you choose, the hope to ever arrive at a simple fundamental description of the universe is lost.”

I agree…It is painful but true.

Cheers,
T.

8. dorigo - March 31, 2007

Hi Tony,

I do agree there is no definitive conclusion about positive neutrino masses, but evidence is strong. That does not imply much for model building if one is willing to overcome that evidence, but it should be kept in mind.

Personally, when I first opened a particle physics book, I never believed that neutrinos were massless – it just seemed an unverified simplification to me. The best option is that they are massive, but light. How massive ? We’ll soon know, I guess.

Cheers,
T.

9. Tony Smith - March 31, 2007

Tommaso, this may be off-topic, but maybe is worth looking at anyway.
Peter Woit’s blog had a link to a Fermilab Today article dated 29 March 2007 that said in part:

“… On Tuesday, March 27, there was a serious failure in a high-pressure test at CERN of a Fermilab-built “inner-triplet” series of three quadrupole magnets in the tunnel of the Large Hadron Collider. …
preliminary indications are that structures supporting the inner “cold mass” of one of the three magnets within its enclosing cryostat broke at a pressure of 20 atmospheres, in response to asymmetric forces applied during the test. Such forces are expected on occasion during normal operation of the LHC. …
While the full cause of the problem is not yet known, failure to account for the asymmetric loads in the engineering design of the magnet appears to be a likely cause. The test configuration corresponds to conditions that occur during a magnet quench, when a superconducting magnet suddenly “goes normal,” releasing large amounts of energy. …
engineering reviews of the magnets … do not appear to have addressed these asymmetric loads. Tests at Fermilab were done on single magnets where such loads do not develop. …”.

Do you have any further information, from CDF or CMS or anywhere, about the situation and its possible consequences ?

Tony Smith

10. Amedeo - March 31, 2007

Hi Tommaso,
another great post!
I just feel compelled to add my opinion to the whole discussion. Clearly, Guess Who is no cosmologist…🙂
I am a bit puzzled by the quotation that he is referring to, namely:

“Why do not we stop at a cosmological constant ? Is it because it does not fit the data, or is it because some do not like the whole idea ?”

In fact, I guess that Albert was just asking a rhetorical question, since the cosmological constant *does* fit the data, and currently there is actually no indication that we should go for anything more sophisticated. The reason for going beyond the cosmological constant is conceptual, not data-driven, and here I tend to disagree with GW when he says that:

“Both “solutions” fall outside the scope of the scientific paradigm: they are unfalsifiable, the choice between them boils down to a matter of faith, and no matter which one you choose, the hope to ever arrive at a simple fundamental description of the universe is lost.”

The problem with the cosmological constant and fine tuning is not something that cosmologists came up with because they needed to fit some data. It has been standing there for decades, and there is a great review by Steven Weinberg (Rev. Mod. Phys. 61, 1-23 (1989), well before any supernova data forced us to take the thing seriously) that pointed out the “unnaturalness” of the whole thing. He is, as far as I know, a high energy theorist, and he did not shrug his shoulders… Unfortunately, the problem would stay there even if the cosmological data showed no evidence for a cosmological constant: why does QFT predict a vacuum energy that is 60 to 120 orders of magnitude larger than observed? In my opinion, it may well be that none of the alternatives that GW indicates is the correct one: a shift of paradigm may come in the future, and a new theoretical framework would show that there is a clear physical mechanism explaining why the value of the vacuum energy is so small. Think of the ultraviolet catastrophe that led to quantum mechanics: we might today be in a similar situation. Should this not challenge our wits?

11. Guess Who - March 31, 2007

Amedeo, when Weinberg wrote that article, the question being asked was different: people were asking why the cosmological constant was 0, and what that was telling us about the TOE unifying QFT and GR. The idea was that some exact symmetry of that theory would protect the cosmological constant from acquiring a non-zero value. That is not a problem, it’s a feature.

Weinberg played around with the then hypothetical question “well, what if it isn’t actually 0; how large could it actually be consistently with our being here, and what would THAT tell us about the universe?”. He famously came up with an anthropic “prediction” for the most likely value IF not 0, but as you know he got “close” only compared to a fine-tuning of 120 decimal places.

If you re-read what I wrote, I think (or at least hope) you will see that I do indeed not subscribe to either the Intelligent Designer or the anthropic landscape. I agree with Sarkar: there really is nowhere near sufficient evidence to accept the extraordinary claim of dark energy.

12. Amedeo - April 1, 2007

GW, I never suspected that you subscribed to either of those alternatives. It is clear that you would rather do without a cosmological constant, but I think that the problem would remain: we still should explain what sort of perfect symmetry erases the vacuum energy with an accuracy of 120 orders of magnitude. I agree that the problem would be much less painful than having to deal with an imperfect cancellation, but it still has no solution. As for the evidence in favour of dark energy, I am afraid that the data are pretty clear and we only have two choices: either we deal with the cosmological constant or something very similar to it, or we decide that our entire cosmological paradigm is wrong and we are in the same situation as those ancient Greeks who invented complicated epicycles to explain planetary orbits only because they wanted to stick to circles. The latter is a perfectly respectable opinion and I have nothing against it: but at the moment the paradigm shift does not seem anywhere near.

13. Guess Who - April 1, 2007

Amedeo, if you care to look at the work of Sarkar and others you will see that the cosmological data is anything but clear. I remind you that there is no direct observation of dark energy, only an inference based on fits to (messy) observations of other quantities, crucially hinging on priors (e.g. the assumption of a simple inflationary fluctuation spectrum) and on prejudice about the acceptable number of parameters. Alternative models do exist (just one example: are you aware that a linearly coasting cosmology does as well from a data fitting perspective as LCDM?).

Once upon a time, very smart men believed that the world was made of air, earth, fire and wind; clearly a simpler model than modern chemistry with its 100+ elements, or even the standard model (of physics) with its 30 (or so) parameters. But hardly better. If LCDM is right, those 30 parameters only describe a few % of the content of the universe; the rest is in the dark sector. But for some reason, that dominating sector is supposed to be adequately described by only six parameters…

I don’t know about you, but saving a few phenomenological parameters of fit at the cost of giving up the scientific paradigm that got us to this point in the first place does not strike me as a great trade.

Regarding symmetry, that’s exactly what would make the difference between exact 0 and an unnaturally small value. When you write down an action and quantize, there are rules to respect; in particular, radiative corrections breaking tree-level symmetries cause anomalies, which can easily destroy the consistency of the theory. This provides a very powerful constraint on model building. If you are into strings, you know that the number of extra dimensions is not a frivolous choice; it’s imposed by the condition of anomaly freedom. If you are into preons (or even just QCD), you know that chiral symmetry can protect bound states from acquiring a mass.

So you see, there would be nothing particularly remarkable about a symmetry forcing the cosmological constant to equal 0 exactly. Quite the contrary: that’s the natural expectation based on everything we know about the role of symmetries in quantum theory. It’s only when you start to speculate about a deviation from 0 that you run into trouble: then you must explain why a value evidently not protected by a symmetry still remains so small, much smaller than the order of magnitude of the radiative corrections which it’s now allowed to pick up without making the theory inconsistent.

Now that is a frivolous choice.

14. dorigo - April 1, 2007

Amedeo and GW,

despite the rather rough nature of your exchange, I think it is a positive one, and I actually thank both for the interesting arguments.
Only, GW – please try to avoid attacks such as the one to Carl in the other thread… The same points could have been made without being so vitriolic! I like nc and others who contribute to this blog even if they do not have a PhD or have used it outside academia, and I would like them not to be discouraged from visiting here.

Cheers,
T.

15. dorigo - April 1, 2007

Thank you Starrynight, you’re of course most welcome.

Cheers,
T.

16. Guess Who - April 1, 2007

Hi TD. I’m not sure which post(s) you are referring to when you write about “attacks such as the one to Carl”. If you mean Carl Brennan, I seem to remember (without actually going back to check all recent threads) that I have only ever responded to one post by him, and only because it contained an insinuation about the competence of another poster (me), ironically backed by a factually incorrect assessment of a Google search (on the name of yet another poster). I tried to make it a tit-for-tat, i.e. non-escalating, limiting the response to reversing (but not adding to) his own accusation.

I have nothing at all against people without Ph.D.s and/or working outside academia. Einstein and Dyson come to mind.

Regarding roughness in this thread, the only roughness I’ve seen so far is that I keep mis-spelling and mistyping things, from “perpertuum” to “air, earth, fire and wind” (water, for heaven’s sake, water!). Sorry about that.

17. dorigo - April 1, 2007

Yes, GW, sorry – I was confusing Carl with Nigel. As you see, we all make mistakes…

It is important to me that people are not discouraged from contributing to the interesting discussions that sometimes flourish here: if we all agreed on things it would be boring, but disagreeing should not mean fighting. And it takes two people to fight, not just one, so I have nobody to blame if we went close to the danger line a couple of times this week… You were only involved in one of the episodes.

Cheers,
T.

18. Amedeo - April 2, 2007

The discussion with GW is very interesting (I hope not just to ourselves…) and I am quite enjoying it, I am just afraid it is taking us a bit too far and is getting too technical:
I’d like to make only a few very quick points, which I hope both GW and I can agree on, so we can move on, maintaining our disagreement on anything else🙂
The problem with lambda different from zero is much worse than lambda exactly zero. I do not deny it. I am just saying that even if a “perfect” symmetry would be more natural, we still don’t know what it is.
I think everything eventually boils down to a matter of attitude: if physics is only what we can measure in the laboratory, then we have as yet no direct evidence of anything in the dark sector. I can see a point in that, and I would not mind if someone could come up with a model of the universe made only of baryons which can simultaneously fit *all* cosmological data. Unfortunately, no working model like that currently exists. I am not an advocate of dark energy; in fact I find it quite problematic. I am open to alternatives: simply, at the moment, I don’t see any.

19. Guess Who - April 2, 2007

I agree that we don’t know what the exact symmetry enforcing lambda = 0 is, for the simple reason that we do not have a finished TOE unifying QFT and GR. There is no lack of reasonable candidates though. A few minutes on the Arxiv is all you need to find examples of (physics) models in which lambda is exactly zero (e.g. good old SUGRA, as in hep-ph/0511259). Their problem is that the authors then typically go on to mess them up by adding some ad-hoc terms which produce a small non-zero value, and then off we go with the usual ridiculous fine tuning. They do this only because observational cosmologists told them it’s needed, and because they are used to trusting experimental input, spoiled as they are by decades of painstaking, rigorous work of the kind done by Dorigo. Few seem to appreciate the huge difference between experimental HEP and cosmology. Those who do take the time to look for themselves tend to react like Sarkar, i.e. with the polite version of a “you’ve got to be kidding me!”.

Please don’t get me wrong, I’m not saying that cosmologists are all dumb, I’m saying that the problem of extracting any useful information at all from cosmological observation is atrociously difficult, and heavily dependent on the assumptions entering the data analysis.

I guess we’ll have to keep disagreeing about the lack of viable alternative cosmological models. Those who look shall find.🙂

20. dorigo - April 2, 2007

Hey, I can recognize adulation when it’s thrown at me… Thank you GW! As a friend of mine likes to say, “Drink whatever you like, it’s on me”.

Cheers,
T.

21. Black holes, the winged seeds of our Universe « A Quantum Diaries Survivor - January 8, 2009

[…] and the black hole they contain at their center. This has been known for a while -I learned it at an intriguing talk by Al Stebbins at the “Outstanding Questions in Cosmology” conference, in March 2007 at the Imperial […]

