Radiometric dating is a method by which the age of materials such as rocks can be determined. The method relies on the fact that certain atoms decay into other atoms at a known, measurable rate, so the age of a sample can be established by comparing how much of the original material remains with how much of its decay product has accumulated. The invention of radiometric dating was a crucial step in determining the age of the Earth, a question that troubled scientists for centuries before a largely accepted answer was finally reached in the 20th century.
The discovery of radiometric dating is largely attributed to Ernest Rutherford, a New Zealand-born physicist who became interested in the study of radioactivity in the late 19th century. Radioactivity had only recently been introduced to the scientific community, first through the work of Henri Becquerel and then through the extensive investigations of Marie and Pierre Curie. Rutherford, along with several collaborators, discovered that certain radioactive isotopes, versions of an element whose nuclei contain an unstable combination of protons and neutrons, decay over time into stable forms. The time it takes for half of the atoms of a given isotope to decay into the stable form is a fixed property of that isotope, known as the half-life, and it forms the basis of radiometric dating: by comparing the amount of the original isotope in a sample with the amount of its decay product, the elapsed time can be calculated.
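As a rough illustration of how a half-life translates into an age, the sketch below (in Python, with made-up numbers) uses the standard decay law N(t) = N0 · (1/2)^(t / half-life): if the remaining parent isotope and its accumulated daughter product can both be measured, the elapsed time can be solved for directly. The function name and the values are illustrative assumptions, not taken from any particular laboratory procedure.

```python
import math

def age_from_ratio(parent_atoms, daughter_atoms, half_life_years):
    """Estimate elapsed time from the parent/daughter ratio.

    Assumes the sample started with no daughter atoms and that every
    decayed parent atom became one daughter atom (a simplification
    of real isotope systems).
    """
    # Original number of parent atoms = what remains + what decayed.
    original = parent_atoms + daughter_atoms
    remaining_fraction = parent_atoms / original
    # Invert N(t) = N0 * (1/2) ** (t / half_life) for t.
    return half_life_years * math.log(remaining_fraction, 0.5)

# Made-up example: 1 part parent to 3 parts daughter means a quarter
# of the original parent remains, i.e. two half-lives have passed.
print(age_from_ratio(250, 750, half_life_years=1_000))  # -> 2000.0
```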
Radiometric dating is sometimes referred to as carbon dating, because one of the most commonly used forms of the technique measures the decay of carbon-14, a carbon isotope with six protons and eight neutrons. Carbon dating, however, is only accurate for organic remains, such as bone and wood, that are less than about 50,000 years old. Older samples are dated using isotopes with much longer half-lives, including isotopes of potassium and uranium.
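To make the 50,000-year limit concrete, the sketch below applies the same decay law to carbon-14, whose half-life is roughly 5,730 years. The helper functions are hypothetical, but they show why the method runs out of reach: by 50,000 years only about a quarter of one percent of the original carbon-14 remains, which becomes very difficult to measure reliably.

```python
import math

CARBON_14_HALF_LIFE_YEARS = 5_730  # approximate, widely cited value

def remaining_fraction(age_years, half_life_years=CARBON_14_HALF_LIFE_YEARS):
    """Fraction of the original carbon-14 left after age_years."""
    return 0.5 ** (age_years / half_life_years)

def carbon_age(fraction_remaining, half_life_years=CARBON_14_HALF_LIFE_YEARS):
    """Age implied by a measured fraction of carbon-14 remaining."""
    return half_life_years * math.log(fraction_remaining, 0.5)

print(remaining_fraction(5_730))    # 0.5     -> one half-life
print(remaining_fraction(50_000))   # ~0.0024 -> near the practical limit
print(round(carbon_age(0.25)))      # 11460   -> two half-lives
```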
One of the biggest concerns with this method of dating is contamination. For a sample to be measured accurately, neither the unstable parent isotope nor the stable daughter isotope can have entered or left the sample after the material originally formed. Because contamination is such a common issue, it is standard practice to test many different samples of a material in order to arrive at an accurate age range.
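The practice of measuring many samples can be illustrated with a small sketch; the ages below are invented and the statistics deliberately simple (a mean and a spread), just to show how repeated measurements are combined into a reported range and how an outlier can flag a contaminated sample.

```python
import statistics

# Hypothetical ages (in years) from several samples of the same material;
# an outlier like the last value can signal contamination.
sample_ages = [11_900, 12_100, 12_050, 11_950, 14_800]

mean_age = statistics.mean(sample_ages)
spread = statistics.stdev(sample_ages)
print(f"All samples: {mean_age:.0f} +/- {spread:.0f} years")

# Dropping the suspected contaminated sample narrows the range considerably.
clean = sample_ages[:-1]
print(f"Cleaned:     {statistics.mean(clean):.0f} +/- {statistics.stdev(clean):.0f} years")
```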
The first truly accurate measurement of the age of the Earth was made by a geochemist named Clair Patterson, who began the work in the late 1940s. Patterson's insight was that the best possible estimate for the age of the Earth could be made by applying radiometric dating to meteorites, since meteorites date back to the formation of the solar system and thus came into being around the same time as the Earth. By measuring the lead isotopes produced by the decay of uranium in meteorite samples, Patterson arrived at an estimate of 4.5 billion years in the 1950s, which remains the most widely accepted figure in the 21st century.
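Patterson's actual analysis relied on ratios of lead isotopes measured across several meteorites, which is more involved than a single parent-daughter calculation. As a simplified illustration of uranium-based dating only, and not of his method, the sketch below uses the uranium-238 to lead-206 decay (half-life roughly 4.47 billion years) and the standard age equation t = ln(1 + daughter/parent) / decay constant; the function and sample values are hypothetical.

```python
import math

U238_HALF_LIFE_YEARS = 4.47e9  # approximate half-life of uranium-238

def uranium_lead_age(lead206_atoms, uranium238_atoms):
    """Simplified uranium-238 -> lead-206 age, assuming no initial
    lead-206 and a closed system (no gain or loss of either isotope)."""
    decay_constant = math.log(2) / U238_HALF_LIFE_YEARS
    return math.log(1 + lead206_atoms / uranium238_atoms) / decay_constant

# A daughter/parent ratio of 1 corresponds to an age of one half-life.
print(f"{uranium_lead_age(1.0, 1.0):.2e} years")  # ~4.47e9
```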