Millions of Years and the Bible, Part 2a: A Scientific Defense of a Young Earth

Submitted by Hannah D. on Sat, 06/22/2013 - 19:41

I just realized how long this essay is, so I felt that I should split it into two parts. This will discuss radioactive dating, while 2b will discuss distant starlight and various young earth evidences.
____________________________________________________________
It is clear that an honest reader of Scripture will find no hint of millions of years. Is the Christian faith, then, based on blind faith? Is it rational to believe in 6,000 years when scientists, more often than not, say 4.5 billion?

Blind faith is not a biblical concept: “Now faith is being sure of what we hope for and certain of what we do not see.” (Heb. 11:1) There are many reasons, biblical and extra-biblical, to believe in a young earth. God created the universe, after all, and His creation ought to support what He says in His Word.

One of the most-used evidences to prove an old earth is radiometric dating. It is based on the decay rate of radioactive isotopes (radioisotopes). The nucleus of such an isotope is too large to be stable, so it emits particles: heavy isotopes like Uranium typically emit alpha particles (really just the nucleus of a Helium atom – two protons and two neutrons bound together), while others, like Carbon-14, emit beta particles. When so much of a radioisotope has decayed that only half of the original remains, it has undergone one half-life. In this way, a radioisotope decays into a more stable element. Carbon-14 (the parent isotope) decays into Nitrogen-14 (the daughter isotope), for example, and Uranium-238 ultimately decays into lead.

The idea behind radiometric dating is that the rate at which these radioactive materials decay is constant. C-14 has a half-life of about 5,730 years, while U-238's is about 4.5 billion years. These half-lives are taken to be constants. If you know how much of the parent isotope was in a certain rock sample before it started to decay, and you know how much is there now, you can use the known decay rate to determine how old that rock is.
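
To make the method concrete, here is a minimal sketch (in Python) of the calculation it implies. The isotope amounts are made-up inputs, and real laboratories use more sophisticated isochron techniques, but the underlying arithmetic is just this: assume a constant rate, assume all the daughter isotope came from the parent, and solve for the elapsed time.

```python
import math

def age_from_ratio(parent_now, daughter_now, half_life_years):
    """Age implied by a parent/daughter ratio, ASSUMING a constant decay
    rate, no initial daughter isotope, and a closed (uncontaminated) system."""
    # N(t) = N0 * (1/2)**(t / half_life)  =>  t = half_life * log2(N0 / N)
    original = parent_now + daughter_now   # every daughter atom assumed radiogenic
    return half_life_years * math.log2(original / parent_now)

# Hypothetical sample: one part parent left for every three parts daughter.
print(age_from_ratio(1.0, 3.0, 4.5e9))     # two half-lives: ~9.0e9 years
```

Notice how much has to be assumed before the formula ever runs; the rest of this essay is about those assumptions.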

Say I have a creek in my backyard. Every year, rain comes in, and it adds an inch of depth to it. My creek is a foot deep, so I can assume that this has been going on for the past twelve years.

What I don’t know is that a few years back, the rain came heavier than usual and carved seven inches into the creek in a single year. I think it’s twelve years old; in reality, it is only six – one seven-inch year plus five ordinary years at an inch each. By assuming that the creek’s deepening rate is constant, I have grossly overestimated the age of the creek.
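
To put numbers on the analogy (a toy calculation, nothing more):

```python
# The constant-rate assumption versus the creek's actual history.
inches_per_normal_year = 1
depth_now = 12                        # inches

assumed_age = depth_now / inches_per_normal_year
print(assumed_age)                    # 12.0 years, if every year were ordinary

actual_history = [7, 1, 1, 1, 1, 1]   # one wet year, then five ordinary ones
print(sum(actual_history), len(actual_history))   # 12 inches in only 6 years
```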

There is actually a good deal of evidence suggesting that at some point in earth’s history, the decay rates of many radioisotopes – the kinds used to date rocks – accelerated. Something happened that caused U-238 to decay far faster than one half-life every few billion years. The first evidence for this has to do with those alpha particles.

Uranium is found in granite. (So are a lot of other radioactive isotopes, but let’s just stick with this one for a moment.) Some granite beds run very deep, and according to the Principle of Superposition, the deeper rock layers must be older than the shallower ones (if I know anything about science, it is that scientists love giving abstruse names to simple concepts).

There is another way to date these granites, however, and that is by the amount of Helium trapped within them (remember, an alpha particle is really just a Helium nucleus). Granite has a crystalline structure, and if you could look in at its atoms, you would find small gaps throughout the lattice. Helium is a very small atom, and it slips out through these gaps rather quickly. Helium escaping from rock in this way is called Helium diffusion, and its rate can be measured directly in the laboratory – so we arguably know it better than we know decay rates whose constancy is in question. It would be more reliable, then, to test the age of the rocks by the amount of Helium they still contain.
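
As a greatly simplified sketch of that idea: treat Helium leakage as a simple exponential and solve for the time implied by how much is still left. (The leak rate below is a placeholder number, not a measured diffusion rate; real diffusion depends strongly on temperature and crystal size, and Helium is also continually replenished by ongoing decay, all of which this toy model ignores.)

```python
import math

def age_from_retention(fraction_retained, leak_rate_per_year):
    """Toy model: He(t) = He0 * exp(-k*t), ignoring ongoing Helium
    production. The leak rate is a made-up placeholder value."""
    return -math.log(fraction_retained) / leak_rate_per_year

# If a crystal leaks at k = 1e-4 per year and still holds 58% of its Helium:
print(age_from_retention(0.58, 1e-4))   # ~5,400 years
```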

A group of creationist, Bible-believing scientists – the RATE (Radioisotopes and the Age of The Earth) group – decided to test this. They first took the radiometric dates of the granite at different depths. They used a young-earth model to predict how much Helium each portion of the rock should still hold. Then they measured the amount of Helium actually contained in the granite.

Even in the deepest samples, there was far too much Helium left for the rock to be millions of years old. In fact, it couldn’t be older than a few thousand years. The ages derived from the Helium in the granites matched the creationist predictions almost perfectly; on the graph, the Helium-date line seemed to hug the prediction line, matching it at every rise, fall, and slope.

If there is too much Helium in the rocks for them to be millions of years old, then the radioisotopes gave ages that are way too old.

Then there is C-14 dating, which is used on fossils, since C-14 is found in living organisms alongside ordinary Carbon. This radioisotope has a half-life of about 5,730 years. After any radioisotope has undergone 10 half-lives, it is rendered undetectable by most modern equipment. Thus, C-14 should be undetectable after roughly 57,000 years. Some very sensitive equipment can trace it a while longer, but even generously, nothing older than about 100,000 years ought to contain any detectable C-14.
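
The halving arithmetic behind those limits is easy to check:

```python
# Fraction of the original C-14 remaining after n half-lives (T = 5,730 years).
half_life_years = 5730
for n in (1, 5, 10, 17):
    print(n * half_life_years, "years:", 0.5 ** n)
# 10 half-lives (~57,300 years) leaves 1/1024, about 0.1%, of the original;
# 17 half-lives (~97,000 years) leaves less than one part in 100,000.
```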

However, C-14 is found in fossils all throughout the fossil record. Even fossils from the Geologic Column’s Precambrian and Cambrian layers, supposed to be hundreds of millions of years old, contain plenty of C-14. Why is there so much of it still in there?

Most secular scientists claim that C-14 is in these fossils because they were, somehow, contaminated. The same is said of coal, which is also supposed to be millions of years old yet contains traceable amounts of C-14. But C-14 is also found in diamonds, which are supposed to be billions of years old. How do you contaminate the hardest material on earth?

So the dates that radioisotopes give are simply not possible, considering how much Helium (in rock samples) or C-14 (in biological specimens) is often found alongside them. There are, however, even more problems with the isotopes used to give million-year ages to the rocks they are found in.

U-238 is not the only isotope used in radioactive dating. Rubidium, for example, is another radioisotope used; it decays into strontium. Several such parent isotopes are often found within the same sample, but when dating a rock, scientists usually use only one of them to get the age, assuming that the others would give the same answer.

The RATE group decided to test this assumption as well. Not only did they find that different radioisotopes could give ages ranging from tens of thousands to hundreds of millions of years for the same rock; the same isotope could give wildly different ages for different fragments of the same rock. And when they took rocks of known age, such as the new rock formed by Mt. St. Helens’ 1980 eruption, each isotope gave grossly inflated ages to samples known to be only a few decades old.
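
The consistency check itself is simple to state. Here is a minimal sketch, reusing the single-sample formula from earlier; the half-lives are standard published values, but the measured amounts are invented for illustration. A sample that satisfied the method’s assumptions would return roughly the same age from every system:

```python
import math

def age(parent, daughter, half_life):
    # Same formula as before: constant rate, no initial daughter,
    # and a closed system are all assumed.
    return half_life * math.log2((parent + daughter) / parent)

# Invented measurements for one hypothetical rock (arbitrary units).
rb_sr = age(10.0, 0.09, 4.9e10)   # Rb-87 -> Sr-87, half-life ~49 billion years
u_pb  = age(10.0, 1.00, 4.47e9)   # U-238 -> Pb-206, half-life ~4.5 billion years
print(round(rb_sr / 1e6), round(u_pb / 1e6))   # ~634 vs ~615 million years
# Roughly concordant; wildly different answers would mean at least one
# assumption has failed for that sample.
```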

Clearly, there is something very, very wrong with radioactive dating. Not only does it fail to match the younger ages of better-understood dating methods, it even has age squabbles with its own kind. But if the rocks really are younger, then why do the radioisotopes give such inflated ages? What could have caused them to decay so fast that they look as though they had been decaying for millions, even billions, of years? The answer lies in the secrets of polonium radiohaloes, a geological phenomenon found around traces of U-238.

Let’s start with a microscopic view of granite. Among the minerals that make up granite are biotite (thin, flaky sheets) and zircon (tiny crystals). The zircon crystals are where the U-238 is found – or rather, the U-238’s radiohaloes.

Radiohaloes are caused by the alpha particles an element emits as it decays. The particles damage the surrounding rock, leaving a dark sphere around the original speck of radioisotope. U-238 isn’t the only isotope to leave these haloes in the rock it forms in, but its halo can be identified by eight characteristic rings – one for each alpha-emitting step in its decay chain.

Outside the zircon crystal, however, there exists something totally unexpected: three different types of Polonium radiohaloes (each of the three Polonium isotopes leaves its own kind of halo – one with a single ring, one with two, and one with three). One of the reasons this is so surprising is that Polonium is a very rare element, almost never found in nature on its own. Where did the Polonium come from, and how did it get into the biotite flakes surrounding the zircon?

U-238 does not decay straight to lead. It passes through a long chain of intermediate elements, emitting eight alpha particles (and several beta particles) along the way. The fourth alpha emission produces Radon, a radioactive gas. Farther down the chain come all three Polonium isotopes, and the chain finally ends at stable lead.
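
For reference, here are the eight alpha-emitting steps of that chain (half-lives are rounded published values; the beta-decay steps in between are omitted):

```python
# The eight alpha emitters in the U-238 decay chain, in order.
alpha_emitters = [
    ("U-238",  "4.5 billion years"),
    ("U-234",  "246,000 years"),
    ("Th-230", "75,000 years"),
    ("Ra-226", "1,600 years"),      # its alpha decay produces the Radon
    ("Rn-222", "3.8 days"),         # the radioactive gas
    ("Po-218", "3.1 minutes"),
    ("Po-214", "164 microseconds"),
    ("Po-210", "138 days"),
]
for i, (isotope, half_life) in enumerate(alpha_emitters, 1):
    print(f"alpha emission {i}: {isotope} (half-life {half_life})")
```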

Another thing about Polonium: it has an affinity for water.

What if, once the U-238 chain reached Radon, that gas seeped out of the zircon crystal and into the biotite? If water was flowing through the granite, it could carry the resulting Polonium isotopes even farther into the flakes. This would not only explain the source of the radiohaloes, but also how they ended up so far from their source (in this case, a few millimeters). When RATE predicted that more Polonium radiohaloes would be found where there was evidence that a lot of water once flowed through the granite, they were right. So far, so good.

So then, what’s the big deal? Simply this: each Polonium isotope has a different half-life. Po-210 decays with a half-life of about 138 days; Po-218, about three minutes; and Po-214, a fraction of a millisecond. Within a second, that last Polonium isotope is gone.
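
A quick calculation shows what those half-lives mean in practice, assuming simple exponential decay with no replenishment – which matches this scenario, since the water has carried the Polonium away from its parent isotopes:

```python
# Fraction of each Polonium isotope surviving a one-minute wait.
half_lives_seconds = {"Po-210": 138 * 86400,   # ~138 days
                      "Po-218": 3.1 * 60,      # ~3 minutes
                      "Po-214": 164e-6}        # ~164 microseconds

wait = 60.0  # seconds
for isotope, t_half in half_lives_seconds.items():
    print(isotope, 0.5 ** (wait / t_half))
# Po-210 is essentially untouched after a minute, Po-218 is down about 20%,
# and Po-214 has vanished entirely (the result underflows to zero).
```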

But radiohaloes can only form once the granite is fully formed and solid. (After all, they are damage marks in the rock; if the rock were still soft and forming, the damage would simply fill back in.) Thus, for the radiohaloes to be there at all – for the Polonium to be there at all – the granite had to harden quickly, before the Polonium could disappear.

It had to form in a matter of seconds.

It is known that under the high pressure of large amounts of water, granites can cool very quickly – not over millions of years. And the granites containing these radiohaloes must have cooled that quickly. There is no way the rocks could be millions of years old; if they were, the Polonium radiohaloes would never have been captured.

Folks, radioactive dating doesn’t work. It contradicts outside evidence and itself; it gives inflated ages for rocks we know the age of; it gives inflated ages to rocks that couldn’t possibly have been formed in millions of years.

What’s more, radioactive dating is used in a lot of circular reasoning. Mitochondrial DNA is used as a genetic “molecular clock” that lets scientists estimate when two organisms last shared a common ancestor, and the ages often come out in millions of years. But on what are these ages based? Radiometric dating – and then “Mitochondrial Eve” is used to affirm the results of radiometric dating. It’s completely circular reasoning.

A quick review of the evidence: Radiometric dating gives ages of millions of years to objects containing C-14, which can last, at most, about 100,000 years. It gives the same long ages to rocks containing too much Helium to be that old. Different radioisotope dating methods contradict each other on the same rock sample, or even on different pieces of the same sample. And in the presence of Polonium radiohaloes, it gives vast ages to rocks that, had they really formed over those long ages, would never have captured those haloes at all.

Dating methods, radioactive or not, rely on lots of assumptions, and uniformitarianism is the key one. If you assume that slow-and-gradual processes and current rates have been going on in the past as they do today, then you will conclude the earth is a lot older than it actually is. But in the past, something happened. Something occurred that sped up radioactive decay rates, and it included lots and lots – and lots and lots – of water.

And the Biblical account of the Great Flood offers just the answers needed to understand the evidence.

Whether Christian or not, everyone considers evidence subjectively, not objectively. A lot of people have the idea that scientists are unbiased researchers observing the evidence and following it wherever it leads, but that is simply not the case. Everyone has their own ideas, their own worldview, and their own religion through which they interpret the world around them. And uniformitarianism, along with naturalism, empiricism, relativism, and evolutionism, is every bit as religious in nature as Biblical Christianity. Scientists who believe these things are every bit as biased as Christians are.

Anyone can study human health and become a doctor; anyone can study modern physics and help put a man on the moon. Creationists don’t deny science in this sense. When you’re talking about observational science – the kind that creates cures for life-threatening diseases and designs the latest in modern technology (in other words, science that deals with the present) – creationists and evolutionists see and learn from the same things. But explaining how old the Grand Canyon is, how life originated on earth, or where human consciousness came from involves a great deal of interpretation. Whether you start from the Bible or from man’s word determines whether you think the Grand Canyon was formed over millions of years by the Colorado River or in a short period of time by the Great Flood.

This worldview bias can be seen in another evidence for the age of the earth: the distant starlight problem. We can see some 46 billion light-years out into space; how did the light get here if the universe is only 6,000 years old?
