A View from Emerging Technology from the arXiv
Do Nuclear Decay Rates Depend on Temperature?
The debate over whether nuclear decay rates change with temperature is about to get hotter.
In 1913, Marie Curie and Heike Kamerlingh Onnes measured the rate of decay of radium at room temperature and after it had been cooled in liquid hydrogen. Their conclusion was that the decay rate was entirely independent of temperature. Since then, numerous investigations have shown that alpha and beta decays are not influenced by external conditions such as temperature, air pressure, or the surrounding material. By contrast, decays that proceed by electron capture are known to be susceptible to their environment and so have been placed in a different category.
In the last few years, however, a number of new results have threatened to overturn this picture. Various groups have shown that the rates of alpha, beta, and electron capture decays all depend on temperature and on whether the nuclei are embedded in an insulating or a conducting material. That’s exciting because it raises the possibility of treating radioactive waste products. But it also raises a problem for particle physicists, whose entire standard model assumes that decay rates cannot be influenced by external factors.
The anomalous results are puzzling. One group found that the alpha emitter polonium-210, when placed in a copper container at 12 kelvin, had a half-life that was six percent shorter than at room temperature. Another report claimed that the half-life of the beta(-) emitter gold-198 was 3.6 percent longer at 12 kelvin than at room temperature. And yet another group showed that the half-life of beryllium-7, which decays by electron capture, depends on the material in which it is placed, increasing by 0.9 percent in palladium at 12 kelvin and by 0.7 percent in indium at 12 kelvin. There is even a theory to explain what is going on: that a temperature-dependent screening effect inside metallic containers influences electron capture. This, of course, ought to affect all nuclei that decay in this way.
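To get a feel for what a six percent shorter half-life means, here is a minimal sketch of the arithmetic, assuming simple exponential decay N(t) = N0·exp(-λt) and using polonium-210's accepted half-life of about 138.4 days (the specific numbers below are illustrative, not from the disputed experiments):

```python
import math

def decay_constant(half_life):
    """Decay constant lambda from half-life: N(t) = N0 * exp(-lambda * t)."""
    return math.log(2) / half_life

# Polonium-210's accepted half-life, in days.
t_half_room = 138.4
# The claimed six percent shortening at 12 kelvin would give:
t_half_cold = t_half_room * (1 - 0.06)

lam_room = decay_constant(t_half_room)
lam_cold = decay_constant(t_half_cold)

# A shorter half-life means a proportionally faster decay rate.
print(f"decay constant at room temperature: {lam_room:.6f} per day")
print(f"decay constant at 12 K (claimed):   {lam_cold:.6f} per day")
print(f"relative increase in decay rate: {(lam_cold / lam_room - 1) * 100:.1f} percent")
```

Because the decay constant is inversely proportional to the half-life, a six percent shorter half-life corresponds to roughly a 6.4 percent faster decay rate, an effect far larger than the precision of modern counting experiments.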
And if these confusing claims aren’t hard enough to stomach, another group claims that decay rates are influenced by the Earth’s distance from the Sun.
What on Earth is going on?
Today, normal service returns with a report on the decay rates, inside a metal host, of ruthenium-97, which decays by electron capture, and of ruthenium-103 and rhodium-105, both beta(-) emitters. John Hardy and pals at the Cyclotron Institute at Texas A&M University measured the decay rates of this stuff at room temperature and at 19 kelvin with a precision that was, in most cases, much higher than any previous experiment achieved.
Their results? Zip, zilch, zero. They found no temperature dependence in any of their data.
The conclusions that can be drawn from this result offer an interesting insight into the nature of the scientific process, where it’s all too easy to dismiss null results.
While Hardy and his group point out that they cannot comment on the validity of other groups’ results, their null result has a significant bearing on the status of the screening effect. Their experiment shows that the screening effect does not apply to ruthenium-97 and therefore cannot be a general phenomenon. That’s a significant finding that will send the theorists scurrying back to their blackboards.
Perhaps more important is the effect of this result on particle physicists, who have been sharpening their pencils in preparation for rewriting their textbooks should a temperature dependence, or any other environmental dependence, rear its head.
Today, it looks as if they can rest easy. At least, until the next salvo in this debate.
Ref: arxiv.org/abs/0910.4338: Half-life of the Electron-Capture decay of Ru-97: Precision Measurement Shows No Temperature Dependence