July 18, 2011
Nuclear Power Generates Half Of Earth Core Heat
Want to get away from nuclear power and all that runs on nuclear power? Your only choice: move off planet. Half the heat energy from Earth's core comes from nuclear fission. Advocates of geothermal power are really advocates of nuclear fission power. By contrast, advocates of solar energy are really advocates of nuclear fusion power.
What spreads the sea floors and moves the continents? What melts iron in the outer core and enables the Earth's magnetic field? Heat. Geologists have used temperature measurements from more than 20,000 boreholes around the world to estimate that some 44 terawatts (44 trillion watts) of heat continually flow from Earth's interior into space. Where does it come from?
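For a sense of scale, the 44 terawatts quoted above works out to a tiny average flux when spread over Earth's whole surface. A quick sketch (the radius value is a standard figure, not from the post):

```python
# Back-of-envelope: average geothermal heat flux implied by ~44 TW
# escaping through Earth's surface. Values are rough assumptions.
import math

total_heat_flow_w = 44e12          # ~44 TW from the borehole-based estimate
earth_radius_m = 6.371e6           # mean Earth radius (standard value)
surface_area_m2 = 4 * math.pi * earth_radius_m**2

flux_w_per_m2 = total_heat_flow_w / surface_area_m2
print(f"Surface area:   {surface_area_m2:.2e} m^2")
print(f"Mean heat flux: {flux_w_per_m2:.3f} W/m^2")
```

That comes to under a tenth of a watt per square meter, which is why geothermal heat is easy to ignore at the surface even though the planetary total is enormous.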
Radioactive decay of uranium, thorium, and potassium in Earth's crust and mantle is a principal source, and in 2005 scientists in the KamLAND collaboration, based in Japan, first showed that there was a way to measure the contribution directly. The trick was to catch what KamLAND dubbed geoneutrinos – more precisely, geo-antineutrinos – emitted when radioactive isotopes decay. (KamLAND stands for Kamioka Liquid-scintillator Antineutrino Detector.)
One thing that's at least 97-percent certain is that radioactive decay supplies only about half the Earth's heat. Other sources – primordial heat left over from the planet's formation, and possibly others as well – must account for the rest.
While the debate about the practicality of thorium nuclear energy still rages, Earth is already getting about as much energy from thorium as from uranium.
All models of the inner Earth depend on indirect evidence. Leading models of the kind known as bulk silicate Earth (BSE) assume that the mantle and crust contain only lithophiles ("rock-loving" elements) and the core contains only siderophiles (elements that "like to be with iron"). Thus all the heat from radioactive decay comes from the crust and mantle – about eight terawatts from uranium 238 (238U), another eight terawatts from thorium 232 (232Th), and four terawatts from potassium 40 (40K).
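Adding up the BSE figures quoted above shows where the "about half" claim comes from. A minimal sketch, using only the terawatt values from the paragraph:

```python
# Radiogenic heat budget under the BSE model figures quoted above
# (rounded terawatt values, used here only for illustration).
radiogenic_tw = {
    "U-238":  8.0,
    "Th-232": 8.0,
    "K-40":   4.0,
}
total_surface_flow_tw = 44.0   # total heat flow from Earth's interior

radiogenic_total = sum(radiogenic_tw.values())
print(f"Radiogenic heat: {radiogenic_total} TW")
print(f"Fraction of {total_surface_flow_tw} TW surface flow: "
      f"{radiogenic_total / total_surface_flow_tw:.2f}")
```

Twenty terawatts of radiogenic heat against a 44-terawatt total is a bit under half, with the remainder attributed to primordial heat and possibly other sources.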
Suppose you want to get away from Earth's nuclear power. Where to go? Near as I can tell, the iron-sulfur core of Mars has little or no nuclear fission going on. So Mars looks like a good zone for nuclear-free living. Though, being so far from the Sun, I wonder how life there could be made to work without fusion reactors. Does the Martian crust provide the materials needed to build massive solar panel installations and batteries?
Nuclear decay is quite a different process than fission
Does this tell us anything about the age of the earth? Should we still be radioactive if we are as old as some say we are?
I wonder how much we get from tidal effects (the moon). Seems likely to be substantial given the size of our moon. Also, what D.F. Linton said... I don't think emitting alpha particles during radioactive decay really qualifies as 'fission'.
This also gets me thinking: if half of the heat remaining is primordial, then there would necessarily have been somewhat more primordial heat in the past (say, 250 million years ago), and this should have resulted in higher surface temperatures (but how much higher?).
Not significantly higher surface temperatures, but a steeper temperature gradient in the crust and more tectonic/volcanic activity.
The half life of U-238 is 4.468×10^9 y, about the age of the Earth. So, half of the original U-238 in the Earth has decayed. Thorium 232 has a half life of ~14 billion years, which is a little more than the age of the Universe.
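The half-life arithmetic in that comment is easy to check with the standard decay formula N/N0 = 2^(-t/T_half). The half-lives below are the ones quoted above (Th-232 rounded to 14.05 Gyr); the age of Earth is taken as ~4.54 billion years:

```python
# Fraction of a radioisotope remaining after time t, via N/N0 = 2**(-t/T_half).
# Half-lives in gigayears; age of Earth ~4.54 Gyr (assumed standard values).
def fraction_remaining(t_gyr, half_life_gyr):
    return 2 ** (-t_gyr / half_life_gyr)

age_earth_gyr = 4.54
print(f"U-238 remaining:  {fraction_remaining(age_earth_gyr, 4.468):.2f}")
print(f"Th-232 remaining: {fraction_remaining(age_earth_gyr, 14.05):.2f}")
print(f"K-40 remaining:   {fraction_remaining(age_earth_gyr, 1.25):.3f}")
```

Roughly half the original U-238 is gone, most of the Th-232 is still here, and the great majority of the K-40 has already decayed, which is consistent with thorium contributing as much heat as uranium today.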
The Moon's effects are:
1. slowing the rate of rotation, i.e., the day gets longer
2. the Moon is pumped up into a higher orbit, i.e., it gets farther away
The protoplanet collision which led to the Moon's formation probably also increased the Earth's spin and angle of inclination, and contributed to its retained heat of formation.
Mars no longer has active plate tectonics and an internal dynamo, but it did in the past. It retains a fragmented fossilized magnetic field.
The occasional impact from asteroids will impart heat to the Earth's interior.
"Nuclear decay is quite a different process than fission"
Fission is a type of nuclear decay. In spontaneous fission you have tunneling through a potential barrier, similar to alpha decay. Most of the radioactive heat in the Earth comes from the alpha decay of uranium and thorium, since the branching ratio to fission is small.
Fission is a type of nuclear decay, but the decays they are seeing are not fission.
The interesting question here is what's the source of the other 50%, if it's not radioactive decay? Primordial heat is out, since the time constant for that to leave the earth is in the tens of millions of years. Maybe annihilation of trapped WIMPs?
'the time constant for that to leave the earth is in the tens of millions of years.'
Pretty sure that's wrong. I get back-of-the-envelope figures of billions or tens of billions of years for half of the energy to be conducted out. On the other hand, I'm assuming a million-meter-thick layer of solid rock, and the actual Earth is rather more fluid, but tens of millions of years is too low a number. I doubt the paper's authors would have suggested primordial heat as a source if you were right.
Spontaneous fission does occur, but only with a ratio of about 5.4e-7 fissions per alpha decay, so as a source of heat it's negligible.
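The branching ratio quoted in that comment can be turned into a rough heat-share estimate. The per-decay energies below are approximate textbook figures, not from the thread:

```python
# Rough heat share of spontaneous fission vs. alpha decay in U-238,
# using the ~5.4e-7 fission/alpha branching ratio quoted above.
# Per-event energies are approximate assumed values.
alpha_energy_mev = 4.27      # U-238 alpha decay Q-value, roughly
fission_energy_mev = 200.0   # typical energy release per fission event

branching_ratio = 5.4e-7     # fissions per alpha decay
heat_share = branching_ratio * fission_energy_mev / alpha_energy_mev
print(f"Fission heat / alpha heat: {heat_share:.1e}")
```

Even though each fission releases roughly fifty times the energy of an alpha decay, the rarity of spontaneous fission leaves its contribution at a few parts in a hundred thousand of the alpha-decay heat.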
bbartlog: look up the history of Kelvin's estimates of the age of the Earth. You've made a mistake somewhere.
Kelvin may have been doing an entirely different calculation. Even in his time I see that others disagreed with his conclusions, with John Perry getting a value of 2-3 billion years in 1895. Fundamentally it appears that Kelvin assumed that the Earth first cooled (fairly rapidly) to the melting point of rock, with the solid rock sinking to the core, and that only after the entire globe had cooled to that melting point did conduction become the manner in which heat escaped. Because of this, his result is rather dependent on his assumptions regarding the melting point of rock(s). I can't quite see how he would have reconciled the (known) thermal gradient near the surface (about 1 degree F for every fifty feet) with the idea of a core that had a temperature of only 3,000 to 4,000 degrees F. So it would appear to me that it's Lord Kelvin who made a mistake somewhere.
My model assumes:
- specific heat of earth, about 0.5 J/gK (based on iron)
- mass of earth, 6x10^27g
- temperature of core, 3000 deg K
- thickness of insulating granite shell, 10^6m (underlying core is assumed to transfer heat to the shell efficiently, either by convection or due to being made of iron)
- thermal conductivity of said shell, 3 W/mK
- area of shell, 5x10^14 sq m
Under these assumptions, Earth holds 9x10^30 J of heat energy, and it escapes through the insulation at 4.5x10^12 J/s (or watts). With the rate declining in proportion to the temperature, it would take about 1.4x10^18 seconds for half the heat to be lost. That's roughly forty-four billion years. Now, I expect that my insulation values are too high; you don't have to go 1000km into the earth in order to see convection and/or the higher conductivity values of iron rather than solid rock, and certainly in the early days of the earth the solid skin would have been even thinner. But as you can see, even if you use a 10km insulating shell rather than a 1000km shell, you will end up with a temperature halving time of 500 million years...
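That back-of-envelope model can be reproduced directly from the stated assumptions (treating the cooling as a simple exponential, so the half-loss time is the time constant times ln 2):

```python
# Reproducing the back-of-envelope cooling model above, same assumptions.
import math

specific_heat = 0.5          # J/(g K), iron-like
mass_g = 6e27                # mass of Earth in grams
core_temp_k = 3000.0         # assumed core temperature
shell_thickness_m = 1e6      # 1000 km insulating granite shell
conductivity = 3.0           # W/(m K), granite
shell_area_m2 = 5e14         # area of the shell

heat_content_j = specific_heat * mass_g * core_temp_k
heat_flow_w = conductivity * shell_area_m2 * core_temp_k / shell_thickness_m

# Exponential cooling: half the heat escapes after tau * ln(2).
tau_s = heat_content_j / heat_flow_w
half_time_gyr = tau_s * math.log(2) / 3.156e7 / 1e9
print(f"Heat content:   {heat_content_j:.1e} J")
print(f"Heat flow:      {heat_flow_w:.1e} W")
print(f"Half-loss time: {half_time_gyr:.0f} Gyr")
```

The numbers match the comment's figures, and as the commenter notes, the result scales linearly with the assumed shell thickness, so even a hundredfold thinner shell still gives a half-loss time of hundreds of millions of years.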
You state that half of the Earth's heat is generated in the core by nuclear fission. You even provide a link to verify that statement, though the website your link opens says nothing like that. It states that nuclear decay of certain elements in the mantle generates half that heat. Very different. And to suggest that, because fusion is happening in our solar system, nuclear fission is somehow an acceptable energy source, is absurd.