The only context in which I've heard it mentioned is the idea of trying to date the Earth by its temperature. Assume it was a hot molten mess in the past and has been cooling down ever since. Factor in the rate at which it would radiate heat and all of the energy coming in from the Sun, and... you conclude that the planet is very young, because you didn't account for the radioactive material within the planet keeping it warm.
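
That zero-heat-source assumption is the whole game. A rough sketch of the cooling argument, in the spirit of Kelvin's conduction model: a half-space that starts uniformly hot develops a surface temperature gradient dT/dz = T0 / sqrt(pi * kappa * t), so a measured geothermal gradient implies an age. The inputs below (initial excess temperature, rock diffusivity, gradient) are illustrative assumptions, not Kelvin's exact figures:

    # Kelvin-style conductive cooling age for the Earth.
    # Surface gradient of a cooling half-space: dT/dz = T0 / sqrt(pi * kappa * t),
    # so t = (T0 / grad)^2 / (pi * kappa). All inputs are illustrative assumptions.
    import math

    T0    = 2000.0   # assumed initial excess temperature, K
    kappa = 1e-6     # thermal diffusivity of rock, m^2/s (typical value)
    grad  = 0.036    # geothermal gradient, K/m (roughly 1 K per 28 m)

    t_seconds = (T0 / grad) ** 2 / (math.pi * kappa)
    t_years   = t_seconds / 3.156e7
    print(f"Kelvin-style age: {t_years:.1e} years")  # ~3e7 years

You land in the tens of millions of years. Add radioactive heating and the inferred "age" stops meaning anything.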

Cooling-rate dating was also used to estimate the age of the Sun, most famously (or notoriously) by William Thomson, a.k.a. Lord Kelvin. Without knowledge of radioactivity, nuclear chemistry, or nuclear fusion, Thomson assumed the heat of both the Earth and the Sun came from gravitational collapse, and came up with an age of "a moderate number of millions of years" for the former and no more than 500 million years for the Sun.
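
For the Sun, that arithmetic is what's now called the Kelvin--Helmholtz timescale: gravitational binding energy (~GM^2/R, up to an order-unity structure factor, which I'm ignoring here) divided by the present luminosity. With modern values:

    # Kelvin-Helmholtz timescale: how long gravitational contraction alone
    # could power the Sun at its current luminosity.
    G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
    M = 1.989e30    # solar mass, kg
    R = 6.957e8     # solar radius, m
    L = 3.828e26    # solar luminosity, W

    E_grav = G * M**2 / R           # binding energy, up to an O(1) factor
    t_kh   = E_grav / L / 3.156e7   # seconds -> years
    print(f"Kelvin-Helmholtz timescale: {t_kh:.1e} years")  # ~3e7 years

Roughly 30 million years, so Kelvin's 500-million-year ceiling was, if anything, generous under his own assumptions.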

The discovery of radioactive decay (providing both a heat source and a clock for the Earth), Rutherford's discovery of the atomic nucleus, and the later understanding of hydrogen fusion into helium (the Sun's energy source, and hence its age) refuted Thomson's estimates. Together they provided an empirical basis for the current age estimates of the Earth and Sun (about 4.5 billion years), for the Sun's total lifespan (about 10--12 billion years), and for how long habitable conditions on Earth may persist (as little as another 800 million years for some conditions, another 2--3 billion on the outside).
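
The fusion lifespan figure is straightforward energy accounting: fusing hydrogen into helium releases about 0.7% of the fused mass as energy, and only roughly the inner tenth of the Sun's mass ever gets hot enough to fuse on the main sequence (that 10% core fraction is the usual back-of-envelope assumption):

    # Main-sequence lifetime: usable fusion energy divided by luminosity.
    M = 1.989e30    # solar mass, kg
    L = 3.828e26    # solar luminosity, W
    c = 2.998e8     # speed of light, m/s

    efficiency    = 0.007   # fraction of fused mass released as energy (H -> He)
    core_fraction = 0.1     # rough fraction of solar mass that ever fuses

    E_fusion = efficiency * core_fraction * M * c**2
    t_years  = E_fusion / L / 3.156e7
    print(f"Main-sequence lifetime: {t_years:.1e} years")  # ~1e10 years

That's how you get to the roughly 10-billion-year figure above.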

<https://www.teachastronomy.com/textbook/Our-Sun-The-Nearest-...>

<https://en.wikipedia.org/wiki/Lord_Kelvin#Age_of_Earth> citing Burchfield, Joe D. (1990). Lord Kelvin and the Age of the Earth. University of Chicago Press. p. 43. ISBN 978-0-226-08043-7. <https://en.wikipedia.org/wiki/Special:BookSources/978-0-226-...>