Beating Fermi by 1.7 billion years
February 16, 2005 2:45 PM

The site of the world's first nuclear reactor? Gabon. About 1.7 billion years ago several deposits of uranium in Oklo, Gabon spontaneously began to undergo nuclear chain reactions fed by small drips of water. These natural breeder reactors ran for almost a million years, producing both intense heat and plutonium byproducts. Aside from the strangeness of naturally occurring reactors, Oklo provides the only existing case of how highly radioactive waste behaves over a period of tens of millions of years -- exactly the problem faced by the DOE's Yucca Mountain nuclear waste site.
posted by blahblahblah (19 comments total) 4 users marked this as a favorite
Interesting. I had never heard of this phenomenon.

The LA Times did a recent story on Yucca Mountain.
posted by euphorb at 2:53 PM on February 16, 2005

As a side note to my post, 9 out of the 15 reactors found have been destroyed by mining. Oklo uranium is used to provide fuel for French nuclear plants.
posted by blahblahblah at 2:57 PM on February 16, 2005

I work in the industry, and read about the Oklo reactor a couple of years ago. Neat stuff.

Lots more great information on nuclear power can be found at the Canadian Nuclear FAQ, or the Virtual Nuclear Tourist sites.
posted by Popular Ethics at 3:40 PM on February 16, 2005

Is there any oil in Gabon?

Will it be made clear that their behavior internally and externally is out of step with the direction and desire of the international community and that they need to stop pursuing their unregulated nuclear activities?
posted by anthill at 3:44 PM on February 16, 2005


For me, the most interesting thing about the Oklo reactor is that it currently sets the tightest limits for temporal drift of fundamental constants [pdf], such as the fine-structure constant. In fact, the latest reanalysis gives a positive result and is therefore the first hint that some laws of physics may change with time. A lot of great corroborating work is being done on this problem including cosmological observations as well as precision optical atomic frequency standards but it turns out that several tons of dirt currently sets the gold standard.

If anyone wants to wade through the paper cited above, the main point is that a significant over-abundance of a particular fission byproduct (149Sm) was detected at the site.
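To get a sense of the scale involved, turning a fractional drift bound over Oklo's lifetime into a yearly rate is simple arithmetic. A minimal sketch (the 1e-7 bound is an illustrative order of magnitude I'm assuming, not the paper's actual figure):

```python
# Back-of-envelope: convert an assumed fractional drift bound on the
# fine-structure constant over Oklo's age into a per-year rate.
delta_alpha_over_alpha = 1e-7   # assumed bound, illustrative only
oklo_age_years = 1.7e9          # age of the Oklo reactors

rate_per_year = delta_alpha_over_alpha / oklo_age_years
print(f"drift rate bound: ~{rate_per_year:.1e} per year")
```

That works out to a few parts in 10^17 per year, which is why lab-based frequency standards have a hard time competing with several tons of dirt.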
posted by fatllama at 4:28 PM on February 16, 2005

make that an under-abundance
posted by fatllama at 5:27 PM on February 16, 2005

fatllama: what does that signify?
posted by ParisParamus at 5:52 PM on February 16, 2005

Note that these are fossil reactors, meaning that the chain reaction is effectively no longer taking place. Indeed, as Wikipedia points out:

A key to the creation was that at the time [i.e. 1.7 billion years ago], the abundance of fissionable U-235 was about 3%. Due to U-235's shorter half life than U-238, the current abundance of U-235 in natural uranium [meaning generically, here there and everywhere] is about 0.7%. Therefore a natural nuclear reactor is no longer possible on Earth.
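The quoted abundance figures can be back-extrapolated from the two isotopes' half-lives. A rough sketch (the half-life values are commonly quoted figures, assumed here):

```python
# Back-extrapolate the U-235 atom fraction in natural uranium.
T_U235 = 703.8e6   # U-235 half-life in years (assumed, commonly quoted)
T_U238 = 4.468e9   # U-238 half-life in years (assumed, commonly quoted)

def u235_fraction(years_ago, f235_now=0.0072):
    # Undo the decay: each isotope was more abundant by 2**(t / half-life)
    n235 = f235_now * 2 ** (years_ago / T_U235)
    n238 = (1 - f235_now) * 2 ** (years_ago / T_U238)
    return n235 / (n235 + n238)

print(f"{u235_fraction(1.7e9):.1%}")  # close to the ~3% the quote mentions
```

Because U-235 decays about six times faster than U-238, the fissionable fraction has fallen from reactor-grade levels to today's 0.7%.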

Paris: temporal drift is the change in the accuracy of a measurement over time. For instance, we base certain metric constants on radioactive decay rates, which are all but infinitesimally accurate -- but we know (current theory states) that the decay rate will change over time, making those measurements less accurate. What Oklo gives us, through its Petri dish of radioactive isotopes, is a test case for measuring the rate of change. If I understand the reasoning, we have a rough certainty of the age of the radioactive materials involved, and a rough certainty of what the decay byproducts would be if decay processes remained constant; the Oklo materials show a lower amount of certain byproducts, suggesting that the decay rate has indeed slowed, producing fewer of them.
posted by dhartung at 6:34 PM on February 16, 2005

Do you mean: what do changing fundamental "constants" signify? If so, I'll try to give an answer, but take it with a grain of salt as I'm a lowly experimentalist outside the field and the implications are somewhat profound.

Background reading
It's been sold to readers of popular science texts that the current Standard Model of physics is at a best/worst of times moment. The best of times: it accurately predicts and precisely explains the properties of dozens of fundamental and many hundreds of composite particles, not to mention all known spectroscopic properties of atoms and by extension molecules. It has been successful at predicting the existence of particles before their discovery (charged W and neutral Z bosons, top quark, neutrino, to name only a few).

The worst of times: at present we do not know why forces have the strengths they do, though answers may come soon. We currently have no understanding1 of the mass hierarchy problem (why are the light quarks so light compared to the heavy quarks, why are neutrinos so light compared to everything else). We have no hint as to why the Universe seems to be only made of matter, not anti-matter. Worse, the mechanism by which we think all particles attain mass (coupling to a scalar2 field called the Higgs) creates more theoretical problems than it solves unless there are two of them and some variant of supersymmetry turns out to be true3. Worse still, no account of gravity is made by the Standard Model. Worst of all, it gives no hint as to what 96% of the mass and energy in the universe is.

Proposed extensions to the standard model (string theory as one example but not at all the only one) attack these problems all at once. One feature many proposals have in common, I conjecture, is that they tie the expansion of the Universe to a new scalar field called, no joke, the inflaton. Rapid inflation after the big bang is represented in these models as the field quickly losing potential energy (light reading, heavy reading, picture from this summary). Along with the scale factor of the universe changing via the action of this scalar field, some models also predict that the relative strengths of the forces change as well. This is a theoretical nuisance... unless, of course, evidence for time-changing fundamental "constants" is unearthed. Seek and ye shall find.

If anyone reading this is working in the field, please correct any misstatements. I'm reaching.

[on preview: dhartung, no. This reactor went critical 1.7 billion years ago and quickly burnt itself out. That merely explains the deficit of uranium. However, the relative concentrations of other byproducts are not what you'd expect if you burnt that much fuel today in a lab. It turns out that the production rate of one particular byproduct, 149Sm, depends critically on the value of the fine-structure constant (a measure of the strength of the electromagnetic force) and by extension on the relative strengths of the electromagnetic and the weak and strong nuclear forces. The implications are much larger than what you seem to present.]

1 Without the need for fine tuning parameters, that is.
2 Scalar is jargon for a particle with a spin-number of 0, which is more jargon. In mathematics, a scalar is an ordinary number with a magnitude in contrast to a vector which is a magnitude and a direction. So a scalar field in mathematics refers to a continuous function of numbers, such as, say, the map of temperatures across the country on the morning weather forecast. A vector field might be the same map with arrows showing wind speeds and directions. A scalar field in a physics context is an entity with only a number value at each point in space.
3 Supersymmetry (briefly, the existence of yet-undiscovered partner particles for every particle we now know of, bosons for fermions, fermions for bosons) is necessary in order to make sense of scalar fields (like the Higgs boson!) under our current understanding of field theory.
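Footnote 2's distinction can be made concrete in code. A toy sketch (the temperature and wind functions are invented illustrations, nothing physical): a scalar field returns one number per point, a vector field returns components.

```python
import math

def temperature(x, y):
    # Scalar field: a single number at each point, like the weather map.
    return 20.0 + 5.0 * math.sin(x) * math.cos(y)

def wind(x, y):
    # Vector field: a magnitude-and-direction pair at each point,
    # represented here by its (vx, vy) components.
    return (-math.sin(y), math.cos(x))

print(temperature(0.0, 0.0))  # one number per point
print(wind(0.0, 0.0))         # a component pair per point
```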
posted by fatllama at 7:33 PM on February 16, 2005

Interesting -- so can someone tell me why we're not moving ahead on Yucca Mountain? This all seems to point to a green light for safe storage...
posted by minnesotaj at 7:38 PM on February 16, 2005

Well, the people of Las Vegas don't want a storage facility within 80 miles of their city... and they are proving very effective at stopping it. For example, they raised the environmental impact horizon from 100,000 years to 1 million years. Of course, having the Senate minority leader on their side doesn't hurt, either. Take a look at euphorb's link (the first comment).

fatllama - your post wins "Best Physics Comment" and "Best Use of Footnotes" in a MeFi discussion -- thanks!
posted by blahblahblah at 8:11 PM on February 16, 2005

minnesotaj, don't be misled. The article states that:
It can be determined, for instance, that the atoms of plutonium produced never moved from the grains of uranium where they were formed, despite exposure to ground water movement for over two billion years.
But it is well known why plutonium is usually immobile in water--it is insoluble! Other radioactive waste products, such as the highly dangerous americium, are highly mobile in water. Also keep in mind that seepage in the desert is not the only danger facing a centralized nuclear waste storage facility. I happen to think the status quo is much worse and that the Yucca plan is probably the best alternative, but let's never set reason aside in a rush for quick fixes.
posted by fatllama at 8:55 PM on February 16, 2005

that is some cool stuff.
posted by virga at 9:15 PM on February 16, 2005

I worked for the Yucca Mountain Project, and while I believe that it is the best alternative (you would not BELIEVE some of the places currently storing nuclear waste - security nightmares abound), I also saw a level of incompetence and laziness while working on the safety documents and regulatory licensing applications that shocked me. Believe me, whether or not this is a good idea, Bechtel Nevada has NO FUCKING BUSINESS running it. Worst place I've ever worked.
posted by u.n. owen at 10:06 PM on February 16, 2005

Cart it up the space elevator and rocket it into the sun.
posted by TheOnlyCoolTim at 11:08 PM on February 16, 2005

Those interested in problems with long-term storage of nuclear waste should read the essay "Building a Marker of Nuclear Warning" by Julia Bryan-Wilson from the book Monuments and Memory, Made and Unmade. (Nelson/Olin)

It will make you wonder about the wisdom of continuing to produce nuclear waste without knowing how to render it safe.

We have a nuclear reactor. It's the sun, and it is a safe distance from us.
posted by scottr at 10:05 AM on February 17, 2005

It will make you wonder about the wisdom of continuing to produce nuclear waste without knowing how to render it safe

We continue to produce heavy-metal and radioactive toxins from coal without knowing how to render them safe. We continue to produce disgustingly toxic wastes from chip production and photovoltaic production, without knowing how to render them safe. We continue to produce toxic waste for hybrid car batteries without knowing how to render it safe.

And most of those wastes will be highly toxic forever.

Makes you wonder about the wisdom of continuing to produce anything, knowing that making just about anything produces some sort of waste that we don't know how to render safe or where it's fundamentally impossible to render it safe.
posted by ROU_Xenophobe at 10:58 AM on February 17, 2005

temporal drift is the change in the accuracy of a measurement over time

Er, isn't the change of time over time a confounding element? ISTR some recent news that time may not be a constant.
posted by five fresh fish at 11:05 AM on February 17, 2005


This thread has been archived and is closed to new comments