February 1907: Bertram Boltwood Estimates Earth Is at Least 2.2 Billion Years Old
Though he was off by over 2 billion years, his method of uranium-lead radiometric dating is still used today.
We take for granted that Earth is very old, almost incomprehensibly so. But for much of human history, estimates of Earth’s age were scattershot at best. In February 1907, a chemist named Bertram Boltwood published a paper in the American Journal of Science detailing a novel method of dating rocks that would radically change these estimates. In mineral samples gathered from around the globe, he compared lead and uranium levels to determine the minerals’ ages. One was a bombshell: A sample of the mineral thorianite from Sri Lanka (known in Boltwood’s day as Ceylon) yielded an age of 2.2 billion years, suggesting that Earth must be at least that old as well. While Boltwood was off by more than 2 billion years (Earth is now estimated to be about 4.5 billion years old), his method undergirds one of today’s best-known radiometric dating techniques.
In the Christian world, Biblical cosmology placed Earth’s age at around 6,000 years, but fossil and geological discoveries began to upend this idea in the 1700s. In 1862, physicist William Thomson, better known as Lord Kelvin, used Earth’s supposed rate of cooling and the assumption that it had started out hot and molten to estimate that it had formed between 20 and 400 million years ago. He later whittled that down to 20-40 million years, an estimate that rankled Charles Darwin and other “natural philosophers” who believed life’s evolutionary history must be much longer. “Many philosophers are not yet willing to admit that we know enough of the constitution of the universe and of the interior of our globe to speculate with safety on its past duration,” Darwin wrote. Geologists also saw this timeframe as far too short for the slow processes that had built up Earth’s many rock layers.
Lord Kelvin and other physicists continued their studies of Earth’s heat, but a new concept — radioactivity — was about to upend their conclusions. In the 1890s, Henri Becquerel discovered radioactivity, and the Curies discovered the radioactive elements radium and polonium. Still, wrote physicist Alois F. Kovarik in a 1929 biographical sketch of Boltwood, “Radioactivity at that time was not a science as yet, but merely represented a collection of new facts which showed only little connection with each other.”
Then, in 1902, physicist Ernest Rutherford and chemist Frederick Soddy proposed that radioactivity was the transmutation of one element into another, with alpha, beta, or gamma radiation released during this breakdown. They also found that a radioactive element decays into a different element at a rate determined by a property known as its “half-life” — the time it takes for half of the element’s atoms in a sample to decay.
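The Rutherford-Soddy picture implies a simple exponential law: after a time t, the fraction of a radioactive parent that survives is (1/2)^(t/T), where T is the half-life. Here is a minimal sketch of that law, in modern notation rather than anything drawn from their papers:

```python
def fraction_remaining(t, half_life):
    """Fraction of a radioactive parent surviving after time t (t and half_life in the same units)."""
    return 0.5 ** (t / half_life)

# One half-life leaves 1/2 of the parent, two leave 1/4, three leave 1/8, and so on.
print(fraction_remaining(1600, 1600))  # 0.5   (1,600 years is the modern half-life of radium-226)
print(fraction_remaining(4800, 1600))  # 0.125
```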
In 1904, Boltwood sat in the audience for a lecture Rutherford gave at Yale on the dating potential of radioelements. By measuring the amounts of a radioactive element and of its decay end-product in a rock, scientists could calculate the rock’s age using the element’s known half-life.
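That recipe can be sketched in a few lines. What follows is a simplified, modern parent-daughter calculation that assumes every daughter atom in the rock came from decay of the parent, with none present at the start; it illustrates the principle, not the procedure Boltwood himself would go on to use:

```python
from math import log2

def age_from_decay(daughter_atoms, parent_atoms, half_life):
    """Age of a sample, assuming all daughter atoms were produced by decay of the parent.

    The original parent count was parent_atoms + daughter_atoms, so the surviving
    fraction is parent / (parent + daughter); solving the decay law for time gives
    age = half_life * log2(1 + daughter/parent).
    """
    return half_life * log2(1 + daughter_atoms / parent_atoms)

# A sample with equal numbers of parent and daughter atoms is exactly one half-life old.
# 4.47 billion years is the modern half-life of uranium-238.
print(age_from_decay(1, 1, 4.47e9))  # ~4.47e9 years
```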
Inspired, Boltwood set out to look for the non-radioactive end product of uranium decay. At the time, he was working out of his own private laboratory as a consultant for mining companies, analyzing ore samples and collecting mineral specimens along the way. In every one of his mineral samples that contained uranium, Boltwood also detected lead. He concluded that lead must be the final product of the uranium decay series, which was already known to include radium as an intermediate step.
Rutherford, who was corresponding with Boltwood, endorsed this idea. By Rutherford’s calculations, uranium decaying to radium, and radium in turn emitting five alpha particles, would produce an element close in atomic weight to lead. “Knowing the rate of disintegration of uranium, it would be possible to calculate the time required for the production of the proportions of lead found in the different minerals, or in other words the ages of the minerals,” Boltwood wrote in his American Journal of Science paper.
Rutherford’s initial calculations were off, and the half-life of radium was revised several times in 1905 and 1906. Boltwood used the latest (but still inaccurate) half-life value of 2,600 years, along with the finding that most rocks contain 380 parts of radium per billion parts of uranium, to deduce that each year, one part of radium decays, and one part of lead forms, for every 10 billion parts of uranium in a rock. He put together a formula: A rock’s age in years equals 10 billion times its ratio of lead to uranium atoms. He applied it to a cache of mineral samples, including uraninite from Connecticut and the Ceylonese thorianite. The latter yielded the primeval age of 2.2 billion years, by far the oldest scientific estimate of Earth’s age made up to that time.
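As a rough check on the arithmetic, Boltwood’s rule of thumb (age in years ≈ 10 billion times the lead-to-uranium ratio) can be sketched as below. The 0.22 ratio is an illustrative value, back-calculated from the published 2.2-billion-year figure rather than taken from his data table:

```python
def boltwood_age_years(lead_to_uranium_ratio):
    """Boltwood's 1907 rule of thumb: age in years ~= 10 billion times the lead/uranium ratio."""
    return 1e10 * lead_to_uranium_ratio

# A lead/uranium ratio of 0.22 (illustrative, inferred from the published result)
# reproduces the roughly 2.2-billion-year age reported for the Ceylon thorianite.
print(boltwood_age_years(0.22))  # 2.2e9 years
```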
Though spectacular, Boltwood’s results landed with a muted thud. Contemporary geologists were reluctant to accept radiometric dating. And Boltwood’s calculations were indeed incorrect, both because of the erroneous half-life value (the longest-lived radium isotope is now known to have a half-life of 1,600 years, not 2,600) and because he failed to account for the decay of another radioactive element, thorium, into lead in minerals that contained both thorium and uranium. He never returned to the topic, and after Yale offered him a position as professor and chair of radiochemistry in 1910, Boltwood slowed his research in favor of teaching and directing several college laboratories.
But by the time Kovarik was writing his biographical remembrance of Boltwood in 1929, the consensus was that “Boltwood laid the foundation for the best method we have today in calculating the ‘age of the earth.’” Scientists reached today’s accepted age, about 4.54 billion years, in part from a meteorite that crashed down in Arizona (assumed to be made of the same primordial material as the Solar System and Earth), dated using the principles of lead dating that Boltwood first demonstrated with his thorianite.