## Black holes hype does not decay (February 3, 2009)

Posted by dorigo in astronomy, Blogroll, cosmology, humor, news, physics, politics, religion, science.

While the creation of black holes in the high-energy proton-proton collisions that the LHC will hopefully start providing this fall is by no means a given, and while the scientific establishment is essentially unanimous in claiming that those microscopic entities would in any case decay in a time so short that even top quarks look long-lived in comparison, the hype about doomsday being unwittingly delivered by the hands of psychotic, megalomaniac CERN scientists continues unhindered.

Here are a few recent links on the matter (thanks to M.M. for pointing them out):

The source of the renewed fire appears to be a paper posted on the arXiv a couple of weeks ago. In it, the authors (R. Casadio, S. Fabi, and B. Harms) discuss a very specific model (a warped brane-world scenario), in whose context microscopic black holes might have a chance to survive for a few seconds.

Never mind the fact that the authors state in the very abstract, as if feeling the impending danger of being exploited, "we argue against the possibility of catastrophic black hole growth at the LHC". This is not the way it should be done: you cannot assume a very specific model and then draw general conclusions, because others opposing your view may always use the same crooked logic and reverse the conclusions. However, I understand that the authors made a genuine effort to figure out what the phenomenology of microscopic black holes created in the scenario they considered could be.

The accretion of a black hole may occur via direct collision with matter and via gravitational interactions with it. For microscopic black holes, however, the latter (called Bondi accretion) is basically negligible. The authors compute the evolution of the mass of the BH as a function of time for different values of a critical mass parameter $M_c$, which depends on the model and is connected to the characteristic thickness of the brane. They work out two explicit examples: in the first, with $M_c=100 kg$, a 10 TeV black hole created with 5 TeV/c momentum is shown to decay following a roughly exponential law, but with a lifetime -of the order of a picosecond- much longer than that usually assumed for a micro-BH evaporating through Hawking radiation. In the second case, with $M_c=10^6 kg$, the BH mass reaches a maximum of $3.5 \times 10^{21} kg$ after about one second. Even in this scenario, the capture radius of the object is very small, and the object decays with a lifetime of about 100 seconds. The authors also show that "there is a rather narrow range of parameters [...] for which RS black holes produced at the LHC would grow before evaporating".

In the figure on the right, the base-10 logarithm of the maximum distance traveled by the black hole (expressed in meters) is computed as a function of the base-10 logarithm of the critical mass (expressed in kilograms), for a black hole of 10 TeV mass produced by the LHC with a momentum of 5 TeV/c. As you can see, if the critical mass parameter is large enough, these things would be able to reach you in your bedroom. Scared? Let's read their conclusions then.
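Just to get a feeling for the scales involved, here is a back-of-the-envelope estimate of my own (not a number from the paper): take a black hole of 10 TeV mass and 5 TeV/c momentum, and assign it the roughly 100-second lifetime of the $M_c=10^6 kg$ example above.

```python
import math

# Illustrative kinematics for a hypothetical micro black hole:
# mass 10 TeV/c^2, momentum 5 TeV/c, and the ~100 s proper
# lifetime quoted above for the M_c = 10^6 kg example.
c = 2.998e8                  # speed of light, m/s

m = 10.0                     # mass, TeV/c^2
p = 5.0                      # momentum, TeV/c
E = math.sqrt(p**2 + m**2)   # energy, TeV

beta = p / E                 # velocity in units of c
gamma = E / m                # Lorentz time-dilation factor

tau = 100.0                  # proper lifetime, seconds
distance = beta * c * gamma * tau   # lab-frame flight distance, meters

print(f"beta = {beta:.3f}, gamma = {gamma:.3f}")
print(f"distance ~ {distance:.2e} m")
```

The object moves at less than half the speed of light, but the flight distance comes out around $1.5 \times 10^{10}$ meters, more than a thousand Earth diameters: such a black hole would indeed exit the Earth long before decaying, consistently with the authors' remark below.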

“[...] Indeed, in order for the black holes created at the LHC to grow at all, the critical mass should be $M_c>10^5 kg$. This value is rather close to the maximum compatible with experimental tests of Newton’s law, that is $M_c=10^6 kg$ (which we further relaxed to $M_c=10^8 kg$ in our analysis). For smaller values of $M_c$, the black holes cannot accrete fast enough to overcome the decay rate. Furthermore, the larger $M_c$ is taken to be, the longer a black hole takes to reach its maximum value and the less time it remains near its maximum value before exiting the Earth.

We conclude that, for the RS scenario and black holes described by the metric [6], the growth of black holes to catastrophic size does not seem possible. Nonetheless, it remains true that the expected decay times are much longer (and possibly >>1 sec) than is typically predicted by other models, as was first shown in [4]“.

Here are some random reactions I collected from the physics arXiv blog -no mention of the authors' names, since they do not deserve it:

• This is starting to get me nervous.
• Isn’t the LHC in Europe? As long as it doesn’t suck up the USA, I’m fine with it.
• It is entirely possible that the obvious steps in scientific discovery may cause intelligent societies to destroy themselves. It would provide a clear resolution to the Fermi paradox.
• I’m pro science and research, but I’m also pro caution when necessary.
• That’s what I asked and CERN never replied. My question was: “Is it possible that some of these black might coalesce and form larger black holes? larger black holes would be more powerful than their predecessors and possibly aquire more mass and grow still larger.”
• The questions is, whether these scientists are competent at all, if they haven’t made such analysis a WELL BEFORE the LHC project ever started.
• I think this is bad. American officials should do something about this because if scientists do end up destroying the earth with a black hole it won’t matter that they were in Europe, America will get the blame. On the other hand, if we act now to be seen dealing as a responsible member of the international community, then, if the worst happens, we have a good chance of pinning it on the Jews.
• The more disturbing fact about all this is the billions and billions being spent to satisfy the curiosity of a select group of scientists and philosophers. Whatever the results will yield little real-world benefit outside some incestuous lecture circuit.
• “If events at the LHC swallow Switzerland, what are we going to do without wrist watches and chocolate?” Don’t worry, we’ll still have Russian watches. they’re much better, faster even.

It goes on, and on, and on. Boy, it is highly entertaining, but unfortunately, I fear this is taking a bad turn for Science. I tend to believe that on this particular issue, no discussion would be better than any discussion -it is like trying to argue with a fanatic about the reality of a statue of the Virgin weeping blood.

… So, why don't we just shut up on this particular matter?

Hmm, if I post this, I would be going against my own suggestion. Damned either way.

## Black holes, the winged seeds of our Universe (January 8, 2009)

Posted by dorigo in astronomy, cosmology, news, science.

From Percy Bysshe Shelley’s “Ode to the West Wind” (1819), one of my favourite poems:

[...]O thou,
Who chariotest to their dark wintry bed
The winged seeds, where they lie cold and low,
Each like a corpse within its grave, until
Thine azure sister of the Spring shall blow
Her clarion o’er the dreaming earth, and fill
(Driving sweet buds like flocks to feed in air)
With living hues and odors plain and hill:
Wild Spirit, which art moving everywhere;
Destroyer and preserver; hear, oh, hear!

The winged seeds -of galaxies, and ultimately of everything there is to see in our Universe- appear today to be black holes: this is what emerges from the studies of Chris Carilli, of the National Radio Astronomy Observatory (NRAO). In a press release of January 6th, Carilli explains that the evidence that black holes are antecedent to galaxy formation is piling up.

In a nutshell, there appears to be a constant ratio between the mass of objects like galaxies and giant globular clusters and that of the black hole they contain at their center. This has been known for a while -I learned it at an intriguing talk by Al Stebbins at the "Outstanding Questions in Cosmology" conference, held in March 2007 at Imperial College London. But what has been discovered more recently is that the very oldest objects contain more massive black holes than expected, a sign that black holes started growing earlier than their surroundings.

This is incredibly interesting, and I confess I had always suspected it when looking at the beautiful spiral galaxies, drawn into a giant vortex by their massive center. I think this realization is a true gate to a deeper understanding of our Universe and its formation. A thought today goes to Louise, who has always held that black holes have a special role in the formation of our Universe.

## Some posts you might have missed in 2008 (January 5, 2009)

Posted by dorigo in cosmology, personal, physics, science.

To start 2009 with a tidy desk, I wish to put some order in the posts about particle physics I wrote in 2008. By collecting a few links here, I save the most meaningful of them from oblivion -or at least I make them just a bit more accessible. In due time I will update the "physics made easy" page, but that is work for another free day.

The list below collects, in reverse chronological order, the posts from the first six months of 2008; tomorrow I will complete the list with the second half of the year. The list includes neither guest posts nor conference reports, which may be valuable but belong to a different list (and are linked from permanent pages above).

June 17: A description of a general search performed by CDF for events featuring photons and missing transverse energy along with b-quark jets – a signature which may arise from new physics processes.

June 6: This post reports on the observation of the decay of J/Psi mesons to three photons, a rare and beautiful signature found by CLEO-c.

June 4 and June 5 offer a riddle from a simple measurement of the muon lifetime. Readers are given a description of the experimental apparatus, and they have to figure out what they should expect as the result of the experiment.

May 29: A detailed discussion of the search performed by CDF for a MSSM Higgs boson in the two-tau-lepton decay. Since this final state provided a 2.1-sigma excess in 2007, the topic deserved a careful look, which is provided in the post.

May 20: Commented slides of my talk at PPC 2008, on new results from the CDF experiment.

May 17: A description of the search for dimuon decays of the B mesons in CDF, which provides exclusion limits for a chunk of SUSY parameter space.

May 2: A description of the search for Higgs bosons in the 4-jet final state, which is dear to me because I worked on that signature in the past.

Apr 29: This post describes the method I am working on to correct the measurement of charged track momenta by the CMS detector.

Apr 23, Apr 28, and May 6: This is a lengthy but simple, general discussion of dark matter searches with hadron colliders, based on a seminar I gave to undergraduate students in Padova. In three parts.

Apr 6 and Apr 11: a detailed two-part description of the detectors of electromagnetic and hadronic showers, and the related physics.

Apr 05: a general discussion of the detectors for LHC and the reasons they are built the way they are.

Mar 29: A discussion of the recent Tevatron results on Higgs boson searches, with some considerations on the compatibility of a light Higgs boson with the available data.

Mar 25: A detailed discussion on the possibility that more than three families of elementary fermions exist, and a description of the latest search by CDF for a fourth-generation quark.

Mar 17: A discussion of the excess of events featuring leptons of the same electric charge, seen by CDF and highlighted by a global search for new physics. Can be read alone or in combination with an earlier post on the same subject.

Mar 10: This is a discussion of the many measurements obtained by CDF and D0 on the top-quark mass, and their combination, which involves a few subtleties.

Mar 5: This is a discussion of the CDMS dark matter search results, and the implications for Supersymmetry and its parameter space.

Feb 19: An accessible description of the ways the proton structure can be studied in hadron collisions, via the parton distribution functions and their effect on scattering measurements in proton-antiproton collisions.

Feb 13: A discussion of luminosity, cross sections, and rate of collisions at the LHC, with some easy calculations of the rate of multiple hard interactions.

Jan 31: A summary of the enlightening review talk on the Standard Model that Guido Altarelli gave in Perugia at a meeting of the Italian LHC community.

Jan 13: Commented slides of the paper seminar given by Julien Donini on the measurement of the b-jet energy scale and the $p \bar p \to Z X \to b \bar b X$ cross section, the latter measured for the first time ever at a hadron machine. This is the culmination of a twelve-year effort by me and my group.

Jan 4: An account of the CDF search for Randall-Sundrum gravitons in the $ZZ \to eeee$ final state.

## Scientific wishes for 2009 (December 31, 2008)

Posted by dorigo in astronomy, Blogroll, cosmology, personal, physics, science.

I hope 2009 will bring an answer to a few important questions:

• Can the LHC run?
• Can the LHC run at 14 TeV?
• Will I get tenure?
• Are multi-muons a background?
• Are the PAMELA/ATIC signals a prologue to a new scientific revolution?
• Will England allow a NZ scientist to work on Category Theory on its soil?
• Is the Standard Model still alive and kicking in the face of several recent attempts at its demise?

I believe the answer to all the above questions is yes. However, I am by no means sure all of them will be answered next year.

## Selected holiday links (December 27, 2008)

Posted by dorigo in astronomy, Blogroll, chess, cosmology, internet, italian blogs, news, physics, politics, science.

Being too lazy to generate content while relaxing after a day on the ski slopes in Padola, I offer you a few selected links worth a visit. Not all about physics, and not all recent -that's what you get from a very erratic web surfer.

• Tim Krabbé, the Dutch novelist and chess player, has a very interesting piece on the very peculiar chess problem stipulation called the worst possible move. It now appears that, while the game of chess is virtually inexhaustible -at least for us humans-, we have an answer to what is the worst possible move you can make in a chess game. Or, at least, we get very close to the best of the worst, with Sampsa Lahtonen's 1.Qxc4+, a move by white that forces black to administer mate in one, when all the other 52 possible moves by white would have mated black.
• David Orban, my futurologist friend, spoke at the Italian Parliament on December 12th about the internet and new technologies and their use. He offers a report (which includes two video clips), but it is currently available only in the Italian version of his blog. You can easily translate the text using the powerful tools at http://translate.google.com; as for the video… hehm. Maybe in the web 3.0.
• Michael Schmitt, a professor at Northwestern University and a colleague in CMS and CDF, is back in blogging mood, hopefully to stay. He has returned with momentum, with a few very interesting posts, the latest of which is about Dark Matter as a Quantum Liquid. Welcome back, Michael!
• The always excellent Resonaances has yet another post out on the PAMELA/ATIC anomalies, and this one can't be missed any more than the previous ones could. Highly recommended.

## Guest post: Ben Allanach, “Predictions for SUSY Particle Masses” (September 4, 2008)

Posted by dorigo in cosmology, news, physics, science.

Ben Allanach is a reader in theoretical physics at the University of Cambridge. Before that he was a post-doc at LAPP (Annecy, France), CERN (Geneva, Switzerland), Cambridge (UK) and the Rutherford Appleton Laboratory (UK). He likes drawing and playing guitar in dodgy rock bands. He is currently interested in beyond the standard model collider phenomenology, and is the author of SOFTSUSY, a computer program that calculates the SUSY particle spectrum. He also tries to do a bit of outreach from time to time. I invited him to discuss the results of his studies here after I discussed the paper by Buchmuller et al. two days ago, since I was interested in understanding the subtle differences between today’s different SUSY forecasts.

In a paper last year, "Natural Priors, CMSSM Fits and LHC Weather Forecasts", we (Kyle Cranmer, Chris Lester, Arne Weber and myself) performed a global fit to a simple supersymmetric model (the CMSSM). The data included were:

• relic density of dark matter
• top mass, strong coupling constant, bottom mass, and fine-structure constant
• electroweak data: W mass and the weak mixing angle
• anomalous magnetic moment of the muon
• B physics: $B_s \rightarrow \mu\mu$ branching ratio, $b \rightarrow s \gamma$ branching ratio, and $B \rightarrow K^* \gamma$ isospin asymmetry
• all direct search limits, including Higgs limits from LEP2

These data were used to make predictions for supersymmetric particle masses and cross sections. We showed two characterisations of the data: Bayesian (with various prior probability measures) and the more familiar frequentist one, which I'll discuss here.

We vary all parameters in order to produce a profile likelihood plot of the LHC cross-sections for producing either strongly interacting SUSY particles, weak gaugino SUSY particles, or sleptons directly. This is equivalently a plot of $e^{-\chi^2/2}$:
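For readers unfamiliar with profiling, here is a toy sketch of how such a curve is built (entirely illustrative, with a made-up quadratic $\chi^2$, not the actual CMSSM fit): at each value of the parameter of interest, the $\chi^2$ is minimized over the remaining nuisance parameters, and $e^{-\chi^2_{min}/2}$ is plotted.

```python
import numpy as np

# Toy illustration of a profile likelihood (NOT the actual CMSSM fit):
# chi2 depends on a parameter of interest x and a nuisance parameter nu;
# profiling means minimizing chi2 over nu at each fixed x.
def chi2(x, nu):
    # hypothetical quadratic chi-square with a correlated nuisance term
    return (x - 2.0)**2 + (nu - 0.5 * x)**2 + nu**2

xs = np.linspace(0.0, 4.0, 81)
nus = np.linspace(-2.0, 4.0, 241)

# profile: minimum chi2 over the nuisance grid, for each x
profile_chi2 = np.array([chi2(x, nus).min() for x in xs])
profile_like = np.exp(-profile_chi2 / 2.0)

best = xs[np.argmax(profile_like)]
print(f"best-fit x = {best:.2f}")
```

The 95% confidence level bounds quoted below correspond to reading off where such a profiled $\chi^2$ rises by the appropriate amount above its minimum.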

The good news is that the LHC has great prospects for producing SUSY particles in large numbers, assuming the CMSSM: with 1 $fb^{-1}$ of data, we expect the production of over 2000 of them at 95% confidence level (shown by the downward-facing arrows). Of these, some fraction will escape detection, but the message is very positive. The CMSSM prefers a light Higgs, as shown by this plot:

The different curves correspond to different assumptions about the priors (the green one, labelled "profile", shows the usual $\chi^2$ interpretation), but as the figure shows, these aren't so important. Arrows show the 95% confidence level upper bounds: 118 GeV for the lightest neutral Higgs $h$.

### Comparison of results from two papers

The results are quite similar to the recent ones from the Buchmueller et al. crowd (who use recently updated data and more observables): lightish SUSY is preferred, primarily because the anomalous magnetic moment of the muon prefers a non-zero SUSY contribution. Also, the W boson mass and weak mixing angle show a slight preference for light SUSY. Because the LHC has enough energy to produce these particles, detection should be quite easy.

The central results of each paper can be expressed in the parameter plane $m_0$ vs $M_{1/2}$ (scalar supersymmetric particle masses vs gaugino supersymmetric particle mass). Here, I show the result of our fit on the left and theirs on the right:

To compare the two figures, you must match the axes of the right-hand figure to those of the left-hand one (note the different scales, although I tried to re-size them to make the scales comparable – apologies to Buchmueller et al. for flipping their axes to aid comparison). The comparison should be between the solid line of the right-hand diagram and the outer solid line on the left (both 95% confidence level contours), but the Buchmueller et al. gang get lighter scalars than us, by a factor

### Why should the two results differ?

The top mass has changed in the last year from $m_t=170.9 \pm 1.8$ GeV to $m_t=172.4 \pm 1.2$ GeV. Also, Buchmueller et al. include additional observables: other electroweak, B- and K-physics ones. My understanding is that none of these is very sensitive to the SUSY particle masses, though, given the constraints from direct searches. Perhaps most of these extra observables very slightly prefer light SUSY, so that they disfavour the $m_0=1000-2000$ GeV range? Buchmueller et al. should be able to tell us by examining their data.

Thanks to Tommaso for inviting this guest post.

## Predictions for SUSY particle masses! (September 2, 2008)

Posted by dorigo in cosmology, news, science.

Dear reader, if you are not a particle physicist you might find this post rather obscure. I apologize to you in that case, and I prefer to direct you to some easier discussions of Supersymmetry rather than attempt to shed light for you on the highly technical information discussed below:

• For an introduction, see here.
• For dark matter searches at colliders, see a three-part post here and here and here.
• Other dark matter searches and their implications for SUSY are discussed here.
• For a discussion of the status of Higgs searches and the implications of SUSY see here and here.
• For a discussion of the implications for supersymmetry of the g-2 measurement, see here;
• A more detailed discussion can be found in a report of a seminar by Massimo Passera on the topic, here and here.
• For $B \to \mu \mu$ searches and their impact on SUSY parameter space, see here.
• For other details on the subject, see this search result.
• And for past rumors on MSSM Higgs signals found at the Tevatron, have a look at these links.

If you have some background in particle physics, instead, you should definitely take a look at this new paper, which appeared on the arXiv on August 29th. Like previous studies, it uses a wealth of experimental input coming from precision Standard Model electroweak observables, B physics measurements, and cosmological constraints to determine the allowed range of parameters within two constrained models of Supersymmetry -namely, the CMSSM and the NUHM1. However, the new study does much more than just turn a crank for you. Here is what you get in the package:

1. direct -and more up-to-date- assessments of the amount of data which LHC will need to wipe these models off the board, if they are incorrect;
2. a credible spectrum of the SUSY particle masses, for the parameters which provide the best agreement between experimental data and the two models considered;
3. a description of how much will be known about these models as soon as a few discoveries are made (if they are), such as the observation of an edge in the dilepton mass distribution extracted by CMS and ATLAS data;
4. a sizing up of the two models, CMSSM and NUHM1 -which are just special cases of the generic minimal supersymmetric extension of the standard model. Their relative merit in accommodating the current value of SM parameters is compared;
5. most crucially, a highly informative plot showing just how much we are going to learn on the allowed space of SUSY parameters from future improvements in a few important observables.

So, if you want to know the current best estimate of the gluino mass: it is very high, above 700 GeV in the CMSSM and a bit below 600 GeV for the NUHM1. The lightest Higgs boson, instead, is -perhaps unsurprisingly- lying very close to the lower LEP II limit, in the 115 GeV ballpark (actually, even a bit lower than that, but that is a detail – read the paper if you want to know more). The LSP is instead firmly in the 100 GeV range. For instance, check the figure below, showing the best fit for the CMSSM (which, by the way, implies $M_0 = 60 GeV$, $M_{1/2}=310 GeV$, $A_0 = 240 GeV$, and $\tan \beta =11$).

The best plots, however, are the two I attach below: they represent a commendable effort to make things simpler for us -really a highly distilled result of the gazillions of CPU-intensive computations which went into determining the area of parameter space that current particle physics measurements allow. In them, you can read off the relative merit of future improvements in a few of the most crucial measurements in electroweak physics, B physics, and cosmology, as far as our knowledge of MSSM parameters is concerned. The allowed area in the space of two parameter pairs -$m_0$ vs $m_{1/2}$, as well as $m_0$ vs $\tan \beta$-, at 95% confidence level, is studied as a function of the variation in the total uncertainty on five quantities: the error on the anomalous magnetic moment of the muon, $\Delta (g-2)_\mu$, the uncertainty in the radiative decay $b \to s \gamma$, the uncertainty in the cold dark matter density $\Omega h^2$, the branching fraction of $B \to \tau \nu$ decays, and the W boson mass $M_W$.

Extremely interesting stuff! One learns that future improvements in the measurement of the dark matter fraction will yield NO improvement in the constraints on the MSSM parameter space. In a way, dark matter does point to a sparticle candidate, but WMAP has already measured it too well!

Another point to take from the graphs above is that, of the observables listed, the W boson mass is the one whose uncertainty is going to be reduced sizably soonest -that is where we expect to improve matters most in the near future, provided of course that the LHC does not see SUSY first! The $b \to s \gamma$ branching fraction, instead, might actually turn out to need larger uncertainties than those assumed in the paper, making the allowed MSSM parameter space larger rather than smaller. As for the muon $g-2$, things can go in both directions there as well, as more detailed estimates of the current uncertainties are revised. These issues are discussed in detail in the paper, so I had better direct you to it rather than insert my own misunderstandings.

Finally, the current fits slightly favor the NUHM1 scenario (the single-parameter Non-Universal Higgs Model) over the CMSSM. The NUHM1 scenario includes one further parameter, governing the difference between the soft SUSY-breaking contribution to $M_H^2$ and those to squark and slepton masses. The overall best-fit $\chi^2$ is better, which implies that the additional parameter is used successfully by the fitter. The lightest Higgs boson mass also comes out at a "non-excluded" value of 118 GeV, higher than for the best-fit point of the CMSSM.

## GLAST Makes its First Light Public!!! (August 26, 2008)

Posted by dorigo in astronomy, cosmology, internet, news, physics, science.

This just in: J.D. Harrington from NASA and Rob Gutro from the Goddard Space Flight Center inform us that NASA will announce the first results from the Gamma-ray Large Area Space Telescope today. These include gamma-ray bursts that the telescope has observed since its launch just two months ago, and some analysis of pulsar sources which were not well measured in the past, and which now show their nature very clearly.

Another piece of news is that the experiment is going to change its name! The new name of the telescope will be presented at the press conference.

I am impressed by the promptness with which the data analysis was carried out. Of course, with a space mission things are different from ground-based detectors: everything has to be ready beforehand. Nevertheless, getting a high-resolution all-sky map of the universe at the highest light frequencies, considerably increasing the resolution with respect to previous measurements, in such a small amount of time is -well, what else?- an impressive scientific achievement. I do not know many members of GLAST, but the researchers working on the experiment for the INFN section of Padova University -Prof. Antonio Saggion, Dr. Denis Bastieri, Dr. Riccardo Rando and Dr. Luigi Tibaldo- are all very skilled, brilliant physicists. They provided a visible contribution to instrument analysis and to the study of backgrounds and diffuse sources.

The press conference is scheduled for today at 2PM EDT. You can get more information here. An audio webcast will be streamed live at this link.

## Good stuff around (August 15, 2008)

Posted by dorigo in Blogroll, computers, cosmology, games, humor, internet, news, physics, science.

Here are a few links you might be interested in following. They lead to posts in blogs I read, and you should too:

• Louise has news of the appearance of a new puzzling “ghost” green galaxy.
• Marco explains the disappearance of a ghost propagator instead.
• Jester unexplains the unhiggs in an undeniably understandable, yet unserious way.
• Alex challenges you to pilot a submarine -and to find a windows bug while you’re at it.
• Bee explains the equivalence principle and why general relativity is sexy.
• Roberto offers some astounding pictures of the most astounding thing ever built.

## Fine tuning and numerical coincidences (July 1, 2008)

Posted by dorigo in Blogroll, cosmology, games, internet, physics, science.

The issue of fine tuning is a headache for today's theorists in particle physics. I reported here some time ago the brilliant and simple explanation of the problem with the Higgs boson mass given by Michelangelo Mangano. In a nutshell, the Higgs boson mass is expected to be light in the Standard Model, and yet it is very surprising that it is so, given that there are a dozen very large contributions to its value, each of which could make the Higgs hugely massive: yet altogether they magically cancel. They are "fine-tuned" to nullify one another, like gin and Martini around the olive in a perfectly crafted drink.

A similar coincidence -and actually an even more striking one- happens with dark energy in cosmology. Dark energy has a density which is orders and orders of magnitude smaller than what one would expect from simple arguments, calling for an explanation which is still unavailable today. Of course, the fact that there is as of today no solid experimental evidence for either the Higgs boson or dark energy is no deterrent: these entities are quite hard to part with, if we insist that we have understood, at least in rough terms, what exists in the Universe and what causes electroweak symmetry breaking in particle physics. Yet, we should not forget that there might not be a problem after all.

I came across a brilliant discussion of fine tuning in this paper today by sheer chance -or rather, by that random process I entertain myself with every once in a while, called "checking the arXiv". For me, that simply means looking at recent hep-ph and hep-ex papers, browsing through every third page, and getting caught by the title of some other article quoted in the bibliography, then iterating the process until I remind myself I have to run some errand.

So, take the two numbers 987654321 and 123456789: could you imagine a more random choice of two 9-digit integers? Well, what if I then argued that it is by no means a random choice but an astute one, by showing that their ratio is 8.000000073, which deviates from a perfect integer by only nine parts in a billion?
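You can check the claim in a couple of lines:

```python
# Check the "coincidence": the ratio of the two nine-digit numbers
# is an integer to within a few parts in a hundred million.
a, b = 987654321, 123456789
ratio = a / b
deviation = abs(ratio - 8) / 8   # relative deviation from the integer 8

print(f"ratio = {ratio:.9f}")
print(f"relative deviation = {deviation:.1e}")
```

Indeed, the ratio is 8.000000073, off from 8 by about nine parts in a billion.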

Another more mundane and better-known example is the 2000 US elections: the final ballots in Florida revealed that the Republican party got 2,913,321 votes, while the Democratic votes were only 2,913,144: a difference of sixty parts in a million.

Numerical "coincidences" such as the first one above have always had a tremendous impact on the standard crackpot: a person enamoured with a discipline but missing, at least in part, the institutional background required to be regarded as an authoritative source. A crackpot physicist, if shown a similarly odd coincidence (imagine if those numbers represented two apparently uncorrelated measurements of different physical quantities), would certainly start to build a theory around it with the means at his or her disposal. This would be enough to be tagged as a true crackpot. But there is nothing wrong with trying to understand a numerical coincidence! The only difference is that acknowledged scientists only get interested when those coincidences are really, really, really odd.

Yes, the feeling of being fooled by Nature (the bitch, not the magazine) is what lies underneath. You study electroweak theory, figure that the Higgs boson cannot be much heavier than 100 GeV, and find out that for that to be so there has to be a highly unlikely numerical coincidence in effect: this is enough for serious physicists to build new theories. And sometimes it works!

The guy in the picture on the right, Johann Jakob Balmer, got his name in all textbooks by discovering the ratio (in the Latin sense) of the measured hydrogen emission lines. He was no crackpot, but in truth all he did to become textbook-famous was to find out that the wavelengths of the hydrogen lines in the visible part of its emission spectrum could be obtained with a simple formula involving an integer number n -none other than the principal quantum number of the hydrogen atom.
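Balmer's formula is simple enough to reproduce in a few lines: $\lambda = B \, n^2/(n^2-4)$, with $B \approx 364.6$ nm his empirical constant (the value below is the approximate modern air-wavelength figure):

```python
# Balmer's empirical formula for the visible hydrogen lines:
# wavelength = B * n^2 / (n^2 - 4), for n = 3, 4, 5, ...
B = 364.6  # nm, Balmer's constant (approximate, air wavelengths)

def balmer(n):
    return B * n**2 / (n**2 - 4)

for n, name in [(3, "H-alpha"), (4, "H-beta"), (5, "H-gamma"), (6, "H-delta")]:
    print(f"{name} (n={n}): {balmer(n):.1f} nm")
```

The outputs land within a small fraction of a nanometer of the measured lines at about 656.3, 486.1, 434.0, and 410.2 nm, which is exactly the kind of agreement that earns a formula a place in the textbooks.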

So, is it a vacuous occupation to try and find out the underlying reason -the ratio- of the Koide mass formula or other coincidences? I think it only partly depends on the tools one uses; much more on the likelihood that these observed oddities are really random or not. And, since a meaningful cut-off in probability is impossible to determine, we should not laugh at the less compelling attempts.

As far as the numerical coincidence I quoted above is concerned, you might have guessed it: it is no coincidence! Greg Landsberg explains in a footnote to the paper I quoted above that one could in fact demonstrate, with some skill in algebra, that

“It turns out that in the base-N numerical system the ratio of the number composed of digits N through 1 in the decreasing order to the number obtained from the same digits, placed in the increasing order, is equal to N-2 with the precision asymptotically approaching $10^{-N}$. Playing with a hexadecimal calculator could easily reveal this via the observation that the ratio of FEDCBA987654321 to 123456789ABCDEF is equal to 14.000000000000000183, i.e. 14 with the precision of $1.3\times 10^{-17}$.”
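Landsberg's footnote is straightforward to verify with exact integer arithmetic, for instance:

```python
from fractions import Fraction

# Verify the footnote: in base N, the number with digits N-1, ..., 1
# in decreasing order, divided by the same digits in increasing order,
# equals N-2 up to a tiny relative deviation.
def descending(N):
    # digit d sits at place value N^(d-1): digits N-1 down to 1
    return sum(d * N**(d - 1) for d in range(1, N))

def ascending(N):
    # digit d sits at place value N^(N-1-d): digits 1 up to N-1
    return sum(d * N**(N - 1 - d) for d in range(1, N))

for N in (10, 16):
    num, den = descending(N), ascending(N)
    # exact relative deviation of num/den from the integer N-2
    dev = Fraction(abs(num - (N - 2) * den), (N - 2) * den)
    print(f"base {N}: ratio = {N - 2} + {float(dev):.2e} (relative)")
```

For base 10 this reproduces the nine-parts-in-a-billion deviation of 987654321/123456789 from 8, and for base 16 the $1.3\times 10^{-17}$ precision Landsberg quotes (exact fractions are used because the hexadecimal case is below double-precision resolution).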

Aptly, he concludes the note as follows:

“Whether the $10^{-38}$ precision needed to fine-tune the SM [Standard Model] could be a result of a similarly hidden principle is yet to be found out.”

Ah, the beauty of Math! It is so reassuring to know the absolute truth about something… Alas, too bad for Gödel's incompleteness theorem. Whether one can demonstrate that the Florida elections were fixed, on the other hand, remains to be shown.