It’s been 10 years now since physicists first raised the possibility that particle accelerators on Earth could produce microscopic black holes. This phenomenon initially seemed hugely exciting since it hinted at a way scientists could test their ideas about quantum gravity, the hypothesised theory that would reconcile quantum mechanics with general relativity.
Since then, much of the excitement has died down. It turns out that the energy required to create these objects vastly exceeds what is possible in the world’s most powerful accelerators and, indeed, is far more than is carried by the most powerful cosmic ray ever recorded.
There are various loopholes that allow micro-black holes to form at lower energies, however. The most widely discussed is the possibility that the universe has extra dimensions on microscopic scales that significantly weaken gravity at this level. These dimensions would need to operate at a scale greater than 10^-19 metres to allow microscopic black holes to form more easily.
But here again, the evidence is constraining this idea. The world’s most powerful accelerator, the Large Hadron Collider, has been running for a year or so and has so far failed to produce black holes with masses up to 4.5 TeV. That means any extra dimensions must be smaller than 10^-12 metres in size.
Nevertheless, black holes could still be produced at the LHC at a rate of perhaps 100 per year. But how to spot them?
Today, Marcus Bleicher at the Frankfurt Institute for Advanced Studies in Germany and a few pals outline some of the open problems concerning black hole production and detection at the LHC, assuming it takes place at all.
These guys assume that after microscopic black holes form, they would go through four phases. First there is the balding phase, in which the newly formed black hole evolves from a highly asymmetric object to a more symmetric one, shedding its asymmetry through gravitational radiation.
In the second phase, called the spin-down phase, the black hole loses mass and angular momentum by emitting Hawking radiation. In the third, the Schwarzschild phase, the black hole becomes spherical and the rate of mass loss slows down. And in the final Planck phase, the black hole winks out of existence.
Of these phases, only the Schwarzschild phase is understood in any detail, mainly because of the symmetry involved. The other phases are poorly understood, particularly the Planck phase, which can only be described in terms of quantum gravity, which is itself an untested idea.
One thing that could help clarify many of these questions is more data and the possibility of an upgrade to the LHC at some point in the future.
The 800-pound gorilla in all this is the safety of these kinds of experiments. There is a widespread belief in the particle physics community that black hole production is a zero-risk procedure. Indeed, particle physicists brook no discussion on this topic, and Bleicher and co do not mention it.
By contrast, they point out that the physics involved is highly speculative. Indeed what interests them is the possibility that these processes will reveal new physics beyond our existing understanding of the universe. That’s hard to reconcile with the categorical assurances that the public has been given over safety.
There’s little confidence to be gained from safety assessments that have been carried out in the past. Back in the late 1990s, a reader’s letter in Scientific American raised the question of whether the Relativistic Heavy Ion Collider (RHIC), then being built at the Brookhaven National Laboratory, could produce black holes that might destroy the planet.
As a result, Brookhaven’s director commissioned a report from four physicists on the safety of the machine. This report concluded that the probability of catastrophe was 2 x 10^-4, describing this as “a comfortable margin of error”. Another report by a group of CERN physicists came to the “extremely conservative conclusion [that] it is safe to run RHIC for 500 million years.”
These papers were widely used at the time to provide reassurance to the public, and yet both later turned out to contain serious errors. The “comfortable margin of error” is actually a 1 in 5000 chance, not one that most people would consider comfortable. When this was pointed out, the team revised its figure by adding another zero, making it a 1 in 50,000 chance, and added that “we do not attempt to decide what is an acceptable upper limit on [the probability of a disaster].”
The CERN group had mangled its numbers too. It turned out that their calculations merely suggested that there was a low probability that Earth would be destroyed very early on in a run at RHIC. In fact, their calculations were consistent with a high probability of planetary destruction over a long run.
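The distinction matters because a per-run risk, however small, compounds over repeated runs. As a rough illustration of that compounding (using hypothetical numbers, not the figures from either the Brookhaven or CERN reports):

```python
# Illustrative sketch only: how a tiny per-year probability of catastrophe
# compounds over a long running period. The probability value below is
# hypothetical, chosen to make the effect visible; it is not taken from
# the RHIC or CERN safety assessments.

def cumulative_risk(p_per_year: float, years: int) -> float:
    """Probability of at least one catastrophe over `years` years,
    assuming each year is independent with probability `p_per_year`."""
    return 1 - (1 - p_per_year) ** years

p = 1e-8  # hypothetical per-year probability

# In any single year the risk looks negligible...
print(cumulative_risk(p, 1))            # ~1e-8

# ...but over 500 million years it approaches certainty.
print(cumulative_risk(p, 500_000_000))  # ~0.99
```

This is the shape of the error attributed to the CERN calculation: a bound on the probability of early destruction says little about the accumulated probability over a very long run.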
None of these errors were widely reported.
Just before the LHC was due to be switched on, CERN commissioned its own report on the safety of what is now the world’s most powerful accelerator. This report concluded that the machine was safe.
An important question is what confidence the public should place in this report. There are various reasons to be cautious, not least of which are the errors that appeared in earlier assessments.
Just as serious is the fact that the report was written by five employees at CERN who relied on the scientific work of one other CERN employee and a scientist with a pending visiting position at the organisation.
These are people whose entire careers and livelihoods depended on the LHC being switched on. With the best will in the world, it’s hard to see how this was a sensible choice.
Since then the debate has moved on, with a number of new concerns being raised over safety. We’ve covered these on this blog on several occasions. These concerns have yet to be addressed.
What’s needed, of course, is for the safety of the LHC to be investigated by an independent team of scientists with a strong background in risk analysis but with no professional or financial links to CERN. A competent team could surely be put together even though this condition would probably exclude most particle physicists.
The talk now is of an LHC upgrade to increase the machine’s luminosity and its energy to some 16.5 TeV. Safety should be a central part of these plans and yet it is not. The public should demand to know why.
Ref: arxiv.org/abs/1111.0657: Micro Black Holes In The Laboratory