Posted: Nov 26, 2013 8:18 am
by Blackadder
cbm1203 wrote:I'm trying to understand how radiometric dating accounts for the time lag between the synthesis of radioisotopes and their subsequent deposition on the proto-Earth. Assuming that radioisotopes begin decaying soon after they are formed in supernovae, it would seem that radiometric dating indicates when the isotopes began decaying rather than the age of the Earth in which they are found today. If all radiometric dating on Earth points to a common starting point, that could simply mean the isotopes all came from the same source supernova. If the time between radioisotope synthesis and the formation of our solar system is very small, then the age of the isotopes is a reasonable estimate of the age of the Earth. But if the time between the synthesis of the isotopes and the formation of the Earth is a significant fraction of the age of the isotopes, then the Earth would be younger. How this lag would be determined is not obvious to me. Please clarify my understanding.

Radiometric dating works on the principle of RATIOS of parent isotopes to daughter products (created by the decay of the parent) found in a particular sample. When the parent isotope was first formed is not relevant to this exercise. What is relevant is how much of the daughter element(s) may have been present in the sample at the time of its formation, and whether any subsequent contamination by external daughter elements may have taken place. The former is addressed by running multiple isotopic tests on various minerals within the same sample to establish how much of the daughter products would have been initially present, so that this variable can be fixed in the age equation.
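To make that concrete, here is a minimal sketch (in Python) of the isochron technique the paragraph above describes. The numbers are made up for illustration and the system is assumed to be Rb-Sr: several minerals from the same rock share the same age and the same initial daughter ratio, so plotting daughter/stable against parent/stable gives a straight line whose slope encodes the age and whose intercept recovers the initial daughter content.

```python
import math

HALF_LIFE_RB87 = 4.88e10           # years, approximate half-life of Rb-87
LAMBDA = math.log(2) / HALF_LIFE_RB87   # decay constant

def isochron_age(parent_ratios, daughter_ratios):
    """Least-squares slope of the isochron line, converted to an age in years.

    parent_ratios:   87Rb/86Sr measured in each mineral
    daughter_ratios: 87Sr/86Sr measured in each mineral
    Returns (age_in_years, initial_daughter_ratio).
    """
    n = len(parent_ratios)
    mx = sum(parent_ratios) / n
    my = sum(daughter_ratios) / n
    slope = (sum((x - mx) * (y - my)
                 for x, y in zip(parent_ratios, daughter_ratios))
             / sum((x - mx) ** 2 for x in parent_ratios))
    intercept = my - slope * mx    # initial 87Sr/86Sr, recovered for free
    # slope = e^(lambda * t) - 1, so invert for t:
    return math.log(1 + slope) / LAMBDA, intercept

# Synthetic minerals: assumed initial 87Sr/86Sr = 0.700, true age = 3.0e9 years.
t_true, d0 = 3.0e9, 0.700
growth = math.exp(LAMBDA * t_true) - 1
p = [0.1, 0.5, 1.0, 2.0]                 # 87Rb/86Sr, one value per mineral
d = [d0 + x * growth for x in p]         # 87Sr/86Sr, one value per mineral

age, initial = isochron_age(p, d)
print(age, initial)   # recovers ~3.0e9 years and ~0.700
```

Note that the method never needs to know when the parent isotope was synthesised, only how much parent and daughter are in the rock now; the initial daughter ratio falls out of the fit rather than being assumed.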

There is a very good paper (actually written for Christians, would you believe) which explains this very well in simple language. I refer you to page 4 of this paper: