2006 December 05 Tuesday
Thermal Rectifier Controls Direction Of Heat Flow

MIT's Technology Review reports on an experimental thermal rectifier made from nanotubes that allows heat to flow more easily in one direction than in the other.

Scientists have been precisely controlling electric current for decades, building diodes and transistors that shuttle electrons around and make computers and cell phones work. But similarly controlling the flow of heat in solids stayed in the realm of theoretical physics--until now.

Alex Zettl and his colleagues at the University of California, Berkeley (UC Berkeley), have shown that it is possible to make a thermal rectifier, a device that directs the flow of heat, with nanotubes. If made practical, the rectifier, which the researchers described in last week's Science, could be used to manage the overheating of microelectronic devices and to help create energy-efficient buildings, and it could even lead to new ways of computing with heat.

The difference in the ease of heat flow they produced was not enormous. But it was a good start. Think of it as a material that has a higher insulation rating in one direction than another.

Imagine two sides of a wall. Sometimes the outside is hotter. Sometimes the inside is hotter. It is not hard to imagine scenarios where you would want heat to flow out when it is too hot but to never flow in. Or perhaps you'd want to control the direction of heat flow depending on the season. The ability to easily flip around a section of thermal rectifier wall material would come in very handy.

By Randall Parker    2006 December 05 10:58 PM   Entry Permalink | Comments (9)
2006 June 05 Monday
Better Building Designs Could Halve Energy For Air Conditioning

MIT researchers have developed ways to design better natural ventilation systems to reduce the need for air conditioning.

Operating commercial buildings consumes a sixth of all the energy used in the Western world. Getting rid of air conditioning could cut that consumption by as much as a third -- but people don't like to work in sweltering heat.

So MIT researchers are making computer-based tools to help architects design commercial buildings that cool occupants with natural breezes.

Buildings can be designed to encourage airflow and maintain temperatures that minimize or eliminate the need for conventional air-conditioning systems. "That approach improves air quality, ensures good ventilation and saves both energy and money," said Professor Leon R. Glicksman, director of MIT's Building Technology Program. Indeed, studies have shown that people generally feel more comfortable in a naturally ventilated building than in an air-conditioned one.

The researchers studied a building in Luton, Britain, which cools itself using natural ventilation, and built a computer model to simulate how the building's air circulates. They found ways to improve natural ventilation designs and think they can cut air conditioning costs in half.

"We found what we initially thought were some strange results when we did the full-scale-building tests," said Glicksman. "But using the computer model, we now understand the physics of it, first of all confirming that it's a real effect and second, why it occurred." Such effects can be corrected by building in automatic control systems that, for example, turn on the vent fans when needed to ensure the continuous flow of fresh air.

Based on these findings, the MIT team is formulating a simple, user-friendly computer tool that will help architects design for natural ventilation. They plan to incorporate the tool into their "Design Advisor," a web site (designadvisor.mit.edu) that lets architects and planners see how building orientation, window technology, and other design choices will affect energy use and occupant comfort.

Natural ventilation does, of course, have its limits. For example, during hot summers in Hong Kong or even Boston, conventional air conditioning would still be needed. But just using natural ventilation during spring and fall in Boston, for example, could save at least half the energy now used for year-round air conditioning, the researchers estimate.

Most popular discussions about energy costs tend to revolve around cars and other vehicles. But boosting efficiency of new building designs seems to me an easier goal to achieve and does not require so many basic breakthroughs in science and technology.

By Randall Parker    2006 June 05 11:44 PM   Entry Permalink | Comments (11)
2006 March 06 Monday
Metal-Organic Frameworks Advance In Hydrogen Energy Storage

Has GM been pursuing hydrogen as a Machiavellian intrigue to delay a shift to a better technology? I've never believed that. But some people have made this argument in the comments sections of previous posts. Well, suppose that hydrogen vehicles turn out to work and General Motors puts them into production (see below). Paranoid conspiracists could always argue that the success was an accident and that the plotters thought scientists wouldn't come up with workable solutions so quickly. Conspiracy theorizing can pretty much explain away any evidence and make it fit the theory. Chemists have now achieved sufficient density of hydrogen in a storage material for transportation needs, but their method still requires a very low temperature.

Chemists at UCLA and the University of Michigan report an advance toward the goal of cars that run on hydrogen rather than gasoline. While the U.S. Department of Energy estimates that practical hydrogen fuel will require concentrations of at least 6.5 percent, the chemists have achieved concentrations of 7.5 percent — nearly three times as much as has been reported previously — but at a very low temperature (77 kelvin, about -196 degrees Celsius).

The research, scheduled to be published in late March in the Journal of the American Chemical Society, could lead to a hydrogen fuel that powers not only cars, but laptop computers, cellular phones, digital cameras and other electronic devices as well.

"We have a class of materials in which we can change the components nearly at will," said Omar Yaghi, UCLA professor of chemistry, who conducted the research with colleagues at the University of Michigan. "There is no other class of materials where one can do that. The exciting discovery we are reporting is that, using a new material, we have identified a clear path for how to get above seven percent of the material's weight in hydrogen."

The materials, which Yaghi invented in the early 1990s, are called metal-organic frameworks (MOFs), pronounced "moffs," which are like scaffolds made of linked rods — a structure that maximizes the surface area. MOFs, which have been described as crystal sponges, have pores, openings on the nanoscale in which Yaghi and his colleagues can store gases that are usually difficult to store and transport. MOFs can be made highly porous to increase their storage capacity; one gram of a MOF has the surface area of a football field! Yaghi's laboratory has made more than 500 MOFs, with a variety of properties and structures.

Yaghi sounds optimistic about solving the temperature problem using his metal-organic frameworks (MOFs) approach. He also does not see cost as an obstacle.

"We have achieved 7.5 percent hydrogen; we want to achieve this percent at ambient temperatures," said Yaghi, a member of the California NanoSystems Institute. "We can store significantly more hydrogen with the MOF material than without the MOF."

MOFs can be made from low-cost ingredients, such as zinc oxide — a common ingredient in sunscreen — and terephthalate, which is found in plastic soda bottles.

"MOFs will have many applications. Molecules can go in and out of them unobstructed. We can make polymers inside the pores with well-defined and predictable properties. There is no limit to what structures we can get, and thus no limit to the applications."

In the push to develop hydrogen fuel cells to power cars, cell phones and other devices, one of the biggest challenges has been finding ways to store large amounts of hydrogen at the right temperatures and pressures. Yaghi and his colleagues have now demonstrated the ability to store large amounts of hydrogen at the right pressure; in addition, Yaghi has ideas about how to modify the rod-like components to store hydrogen at ambient temperatures (0–45°C).

"A decade ago, people thought methane would be impossible to store; that problem has been largely solved by our MOF materials. Hydrogen is a little more challenging than methane, but I am optimistic."
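To put those storage percentages in context, here is a quick back-of-envelope sketch of my own (not from the paper). It assumes the weight percent figure means hydrogen mass divided by the mass of hydrogen plus storage material, and it uses 5 kg of hydrogen as the fuel load — a commonly cited figure for a fuel cell car, not a number from the article:

```python
# Rough back-of-envelope: what a 7.5 weight-percent storage figure implies.
# Assumes wt% = mass of H2 / (mass of H2 + mass of storage material);
# the DOE target is defined on a whole-system basis, so real tank
# hardware would make the numbers somewhat worse.

def sorbent_mass_kg(h2_mass_kg, wt_percent):
    """Mass of storage material needed to hold h2_mass_kg of hydrogen."""
    frac = wt_percent / 100.0
    return h2_mass_kg * (1.0 - frac) / frac

# Assume a fuel cell car carries roughly 5 kg of H2.
h2_needed = 5.0
print(sorbent_mass_kg(h2_needed, 7.5))   # MOF at 7.5 wt%: ~61.7 kg of MOF
print(sorbent_mass_kg(h2_needed, 6.5))   # at the DOE 6.5 wt% floor: ~71.9 kg
```

So even at 7.5 percent the storage material weighs over ten times as much as the hydrogen it holds, which is why every fraction of a percent matters.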

In a separate story "Seicmic" points me to an announcement by General Motors that they expect to start selling hydrogen cars in 4 to 9 years. (same article here)

General Motors Corp has made major steps in developing a commercially viable hydrogen-powered vehicle and expects to get the emission-free cars into dealerships in the next four to nine years, a spokesman told Agence France-Presse.

GM also expects it will be able to 'equal or better gas engines in terms of cost, durability and performance' once it is able to ramp up volume to at least 500,000 vehicles a year, spokesman Scott Fosgard said.

Hydrogen storage containers, like batteries, are just a way to store energy. The cheapest way to make hydrogen currently is from fossil fuels. But a workable way to store hydrogen at room temperature would, like better batteries, make it a lot easier to end the dependence of cars on oil. Advances in solar, wind, and nuclear power will eventually lower their costs far enough to make them cheaper sources of energy for producing hydrogen. Also, a cost effective hydrogen storage technology, just like cheaper batteries, would allow solar and wind power to supply a larger fraction of all energy used, because the ability to store energy helps any energy source that is not continuously available.

We still also need a big acceleration of research and development on both photovoltaics and nuclear reactor designs. We need cheaper non-fossil fuel energy sources. The storage problems are not going to be what prevents the transition away from fossil fuels. The higher costs of the alternatives remain the biggest obstacle to phasing out fossil fuels.

By Randall Parker    2006 March 06 09:08 PM   Entry Permalink | Comments (20)
2006 February 01 Wednesday
More Rigorous Sonofusion Test Yields Positive Results

Can an easy way to tap fusion energy be developed? A new study on a way to create fusion with waves has found neutrons were generated by this approach.

Troy, N.Y. — A team of researchers from Rensselaer Polytechnic Institute, Purdue University, and the Russian Academy of Sciences has used sound waves to induce nuclear fusion without the need for an external neutron source, according to a paper in the Jan. 27 issue of Physical Review Letters. The results address one of the most prominent questions raised after publication of the team’s earlier results in 2004, suggesting that “sonofusion” may be a viable approach to producing neutrons for a variety of applications.

By bombarding a special mixture of acetone and benzene with oscillating sound waves, the researchers caused bubbles in the mixture to expand and then violently collapse. This technique, which has been dubbed “sonofusion,” produces a shock wave that has the potential to fuse nuclei together, according to the team.

But other scientists were skeptical of the results from that earlier round of sonofusion experiments.

In response to those earlier criticisms, this group of scientists has tried a sonofusion approach that does not use an external source of neutrons.

The telltale sign that fusion has occurred is the production of neutrons. Earlier experiments were criticized because the researchers used an external neutron source to produce the bubbles, and some have suggested that the neutrons detected as evidence of fusion might have been left over from this external source.

“To address the concern about the use of an external neutron source, we found a different way to run the experiment,” says Richard T. Lahey Jr., the Edward E. Hood Professor of Engineering at Rensselaer and coauthor of the paper. “The main difference here is that we are not using an external neutron source to kick the whole thing off.”

In the new setup, the researchers dissolved natural uranium in the solution, which produces bubbles through radioactive decay. “This completely obviates the need to use an external neutron source, resolving any lingering confusion associated with the possible influence of external neutrons,” says Robert Block, professor emeritus of nuclear engineering at Rensselaer and also an author of the paper.

The experiment was specifically designed to address a fundamental research question, not to make a device that would be capable of producing energy, Block says. At this stage the new device uses much more energy than it releases, but it could prove to be an inexpensive and portable source of neutrons for sensing and imaging applications.

To verify the presence of fusion, the researchers used three independent neutron detectors and one gamma ray detector. All four detectors produced the same results: a statistically significant increase in the amount of nuclear emissions due to sonofusion when compared to background levels.
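For readers wondering what a "statistically significant increase" over background means in a counting experiment like this, here is a minimal sketch of the standard Poisson excess test. The counts below are purely hypothetical illustrations, not the paper's actual data:

```python
import math

def excess_significance(signal_counts, background_counts):
    """Approximate significance, in standard deviations, of an excess of
    counts over an equal-duration background run, assuming both counts
    follow Poisson statistics: z = (S - B) / sqrt(S + B)."""
    return (signal_counts - background_counts) / math.sqrt(signal_counts + background_counts)

# Hypothetical numbers: 1200 detected events versus 1000 background events.
z = excess_significance(1200, 1000)
print(round(z, 2))  # ~4.26 sigma above background
```

An excess of a few standard deviations is what lets the researchers claim the extra neutrons are a real effect rather than a background fluctuation.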

A way to produce energy from fusion would put a pretty permanent end to our energy woes. But using more conventional approaches to creating the conditions under which fusion happens looks like it will take decades before fusion reactors are a reality. Can a less conventional approach provide a cost effective solution much sooner? If it did the implications would be staggering. Energy is the only resource whose limits matter for what humanity can accomplish. There is no single mineral whose short supply would stop or reverse economic growth. With enough energy we can mold many types of matter into states that would allow those types of matter to substitute for whatever is in short supply.

By Randall Parker    2006 February 01 08:55 PM   Entry Permalink | Comments (33)
2006 January 21 Saturday
Building Cooling Electric Demand Could Be Shifted Toward Mornings

Buildings could be pre-cooled in the mornings.

Engineers have developed a method for "precooling" small office buildings and reducing energy consumption during times of peak demand, promising not only to save money but also to help prevent power failures during hot summer days.

The method has been shown to reduce the cooling-related demand for electricity in small office buildings by 30 percent during hours of peak power consumption in California's sweltering summer climate. Small office buildings represent the majority of commercial structures, so reducing the electricity demand for air conditioning in those buildings could help California prevent power-capacity problems like those that plagued the state in 2000 and 2001, said James Braun, a Purdue University professor of mechanical engineering.

The results focus on California because the research was funded by the California Energy Commission, but the same demand-saving approach could be tailored to buildings in any state.

"California officials are especially concerned about capacity problems in the summertime," said Braun, whose research is based at Purdue's Ray W. Herrick Laboratories.

A building's physical mass could get cooled down in the morning and therefore help keep the building cooler later in the day.

Findings will be detailed in three papers to be presented on Monday (Jan. 23) during the Winter Meeting of the American Society of Heating, Refrigerating and Air-Conditioning Engineers in Chicago. Two of the papers were written by Braun and doctoral student Kyoung-Ho Lee. The other paper was written by researchers at the Lawrence Berkeley National Laboratory, a U.S. Department of Energy laboratory managed by the University of California.

The method works by running air conditioning at cooler-than-normal settings in the morning and then raising the thermostat to warmer-than-normal settings in the afternoon, when energy consumption escalates during hot summer months. Because the building's mass has been cooled down, it does not require as much energy for air conditioning during the hottest time of day, when electricity is most expensive and in highest demand.

Better ways could be found to do this so that humans are less affected by the temperature changes. A building could be constructed (or upgraded) to contain a large mass (made out of lead perhaps?) that gets cooled at night more than the air does. The air conditioner could cool it down well below normal room temperature (say, close to freezing or even below). Granted, the method reported here requires only an upgrade to the thermostat electronics. But it has drawbacks and limits on what it can achieve. Demand could get shifted for more hours or even days if a high density mass was cooled in the summer. Also, using solar or wind energy, such a mass could get heated in the winter whenever the wind blew or the sun shone.

Basically they shift demand from the afternoon to the morning.

Precooling structures so that it takes less power to cool buildings during times of peak demand is not a new concept. But researchers have developed a "control algorithm," or software that determines the best strategy for changing thermostat settings in a given building in order to save the most money. Research has shown that using a thermal mass control strategy improperly can actually result in higher energy costs. Factors such as a building's construction, the design of its air-conditioning system, number of windows, whether the floors are carpeted, and other information must be carefully considered to determine how to best use the method.

"The idea is to set the thermostat at 70 degrees Fahrenheit for the morning hours, and then you start adjusting that temperature upwards with a maximum temperature of around 78 during the afternoon hours," Braun said. "When the thermostat settings are adjusted in an optimal fashion, the result is a 25 percent to 30 percent reduction in peak electrical demand for air conditioning.

But currently there is little incentive for most businesses to shift a portion of their electricity demand from afternoon to morning. What is needed are utility rate structure changes to implement dynamic pricing so that current price comes closer to the marginal price. That'd make electricity much more expensive during peak times but cheaper during low usage times.

"If you couple this reduction in demand with a utility rate structure that charges more during critical peak periods, utility costs will drop. Without such a change in peak rates, though, the actual impact on operating costs is relatively small, with about $50 in annual savings per 1,000 square feet of building space.

"A good incentive for reducing peak demand would be to impose a higher peak demand charge for the critical peak-pricing periods, and if customers reduce their consumption during these times, they are rewarded with lower energy costs for the rest of the time."

Some of the technology developments needed to allow demand shifting are pretty low tech. It is easy to write a computer program that varies the thermostat setting as a function of the time of day, and not much harder to develop software and a communications system to broadcast marginal prices so that companies could adjust their demand in response to current electric prices. The bigger obstacle is at the policy level, not the technological level.
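A minimal sketch of such a thermostat program, using the 70-to-78 degree schedule Braun describes. The ramp hours and the linear shape are my own illustrative assumptions; the real control algorithm optimizes these per building:

```python
def precool_setpoint_f(hour):
    """Thermostat setpoint in degrees Fahrenheit for a given hour (0-23),
    sketching Braun's precooling schedule: cool hard in the morning, then
    ramp the setpoint upward through the afternoon peak."""
    if hour < 6 or hour >= 18:
        return 78.0                     # overnight/evening: relaxed setback
    if hour < 12:
        return 70.0                     # morning: precool the building mass
    # afternoon: linear ramp from 70 F at noon to 78 F at 6 p.m.
    return 70.0 + 8.0 * (hour - 12) / 6.0

for h in (8, 12, 15, 17):
    print(h, precool_setpoint_f(h))
```

A real implementation would replace the fixed ramp with the researchers' optimization over building construction, windows, and rate structure, but the principle is just this: a time-varying setpoint.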

If public utilities were to more widely implement dynamic pricing of electricity then businesses would pretty quickly implement lower tech methods of adjusting demand. At the same time, incentives would then come into existence to develop better technologies for shifting demand. For example, the value of better battery technologies would increase and therefore dynamic pricing would accelerate the development of better battery technologies.

An acceleration of battery technology development in response to dynamic electric pricing would eventually accelerate the shift toward hybrid and pure electric cars. Increased demand for electric power storage technologies would increase investment to develop such technologies.

The deployment of technologies and business practices that allow rapid demand adjustment in response to dynamic pricing would be bullish for both solar and wind electric power. Businesses would treat rises in electric prices that happen when the sun isn't shining or the wind isn't blowing as reason to shift business activity (or accumulation of energy in batteries or cool or heat in previously mentioned building masses) toward the times when the sun does shine and the wind does blow. To put it another way: if demand can be made more dynamic by market forces then the inconstancy of solar and wind power would pose less of a problem for their wider spread adoption. Greater market forces in electric power distribution would accelerate energy technology development and deployment.

By Randall Parker    2006 January 21 02:54 PM   Entry Permalink | Comments (10)
2005 May 12 Thursday
Study Sees Hydrogen Problems Requiring Decades To Solve

Use of hydrogen to transport and store energy is still a distant prospect.

WEST LAFAYETTE, Ind. – Researchers conclude in an article to be published in June that it could take "several decades" to overcome daunting technical challenges standing in the way of the mass production and use of hydrogen fuel cell cars.

The article notes that "success is not certain" in efforts to develop inexpensive, hydrogen-powered fuel cells and to create the vast storage and transportation infrastructure needed for the vehicles, stressing that hydrogen's "wide-scale use is laden with potential technical, economic and societal impasses." In case fuel cells never do become practical for cars, the researchers conclude, it would be wise for the nation to "maintain a robust portfolio of energy research and development" in other areas.

"In my mind, developing practical hydrogen fuel cells for cars is definitely doable, but we must solve very daunting technical challenges," said Rakesh Agrawal, Purdue University's Winthrop E. Stone Distinguished Professor of Chemical Engineering.

The article will appear as the cover story in the June issue of the AIChE Journal, a publication of the American Institute of Chemical Engineers. The article was written by Agrawal, Martin Offutt, from the National Research Council, and Michael P. Ramage, a retired executive from ExxonMobil Corp.

Fuel cells cost too much to build and have short operating lifetimes.

"Today's fuel cells generate power at a cost of greater than $2,000 per kilowatt, compared with $35 per kilowatt for the internal combustion engine, so they are more than 10 times more expensive than conventional automotive technology," Agrawal said. "At the same time, fuel cells have an operating lifetime for cars of less than 1,000 hours of driving time, compared with at least 5,000 hours of driving time for an internal combustion engine.

"That means fuel cells wear out at least five times faster than internal combustion engines. If I buy a new car, I expect it to last, say, 10 years, which equates to about 3,000 hours of driving time. If my fuel cell only lasts 1,000 hours, you can see that's not very practical."
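Running Agrawal's own numbers shows the cost gap is actually far bigger than the "more than 10 times" he conservatively cites. A quick check of the arithmetic behind the quote:

```python
# Ratios behind Agrawal's cost and lifetime comparison.
fuel_cell_cost_per_kw = 2000.0   # dollars per kilowatt, today's fuel cells
ice_cost_per_kw = 35.0           # dollars per kilowatt, internal combustion

fuel_cell_life_hours = 1000.0    # driving hours before wear-out
ice_life_hours = 5000.0
car_life_hours = 10 * 300.0      # ~10 years at ~300 driving hours per year

print(fuel_cell_cost_per_kw / ice_cost_per_kw)   # ~57x more expensive
print(ice_life_hours / fuel_cell_life_hours)     # wears out 5x faster
print(car_life_hours)                            # 3000 h expected service life
```

So a fuel cell would need to last three times longer and cost a fiftieth as much just to match the engine it replaces.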

Cheaper and longer lasting catalysts are needed. Plus, in order to use fuel cells to burn hydrogen, the hydrogen transportation and storage problems need to be solved.

To bring down the cost of fuel cells, less expensive catalysts and membrane materials are needed, Agrawal said.

Developing an infrastructure of hydrogen storage and transportation represents other significant challenges.

"A fuel-cell car built with today's technology would cost about $250,000, but you would have no place to fill up the tank," Agrawal said.

Hydrogen is a light gas, which makes it more expensive to transport and store. Because its molecular weight is only 2 – compared with heavier gases, such as methane, which has a molecular weight of 16 – less hydrogen is contained in the same space as heavier gases, making its transport more expensive.
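The density penalty follows directly from the ideal gas law: at the same temperature and pressure a gas holds the same number of molecules per unit volume, so mass density scales with molecular weight. A quick sketch of my own to check the comparison in the article:

```python
# Why hydrogen's low molecular weight makes it bulky to transport:
# ideal gas density is rho = P * M / (R * T), so at equal pressure and
# temperature density is proportional to molar mass.

R = 8.314  # gas constant, J/(mol K)

def ideal_gas_density(molar_mass_kg_per_mol, pressure_pa=101325.0, temp_k=273.15):
    """Density in kg/m^3 from the ideal gas law at the given conditions."""
    return pressure_pa * molar_mass_kg_per_mol / (R * temp_k)

rho_h2 = ideal_gas_density(0.002)    # H2, molecular weight 2
rho_ch4 = ideal_gas_density(0.016)   # methane, molecular weight 16
print(round(rho_h2, 4))              # ~0.0892 kg/m^3
print(round(rho_ch4, 4))             # ~0.7139 kg/m^3
print(rho_ch4 / rho_h2)              # exactly 8x, i.e. 16 / 2
```

A pipeline or tank of a given volume therefore moves an eighth the mass of hydrogen that it would of methane at the same pressure, which is the nub of the transport-cost problem.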

Agrawal sees hydrogen vehicles starting to show up on the road in the year 2020.

"I believe we can probably solve the technological problems related to making hydrogen fuel cells practical as a replacement for the internal combustion engine, but it won't be easy and it likely won't happen very soon," Agrawal said. "An optimistic prediction would be that a significant number of hydrogen fuel cell cars will be entering the marketplace around 2020, and by 2050 everybody will be driving them."

But that is an optimistic prediction. A lot of problems must be solved for hydrogen deployment even to start in 2020. In the meantime the market for gas-electric hybrid vehicles is going to become quite large. Many of those hybrids will be pluggable and some people will be charging them from their home outlets. Photovoltaic prices might drop to the point that a portion of that car battery recharging will be done using electricity generated right at home.

Suppose nuclear power experiences a resurgence. Hydrogen could be generated at nuclear plants. But if superconductor technology continues to improve and battery technology does as well, then superconducting power lines, which suffer no resistive losses, might deliver nuclear power to electric vehicle batteries at home more conveniently and at a lower cost than a hugely expensive infrastructure for delivering hydrogen to fuel stations.

In my view hydrogen's eventual role as primary vehicle fuel is by no means assured. Future solutions to hydrogen's technological problems will not compete with today's other energy technologies. Hydrogen's supporting technologies will compete with tomorrow's batteries, superconductors, and other energy technologies. Those competing technologies will be delivering benefits decades before hydrogen begins to do so, and therefore industry, academic, and government labs will continue to refine those other technologies. By the time hydrogen is ready the competition might be too firmly entrenched and too cheap to be dislodged.

Fuel cells have a future independent of hydrogen. If the cost and durability problems with fuel cells could be solved for burning liquid hydrocarbon fuels, then fuel cells could be adopted much more rapidly as a more efficient way to burn fossil fuels. Liquid-fuel-burning fuel cells could even work in hybrid vehicles, with batteries providing increased efficiency through regenerative braking.

By Randall Parker    2005 May 12 11:18 AM   Entry Permalink | Comments (15)
2005 April 14 Thursday
Porphyrin Nanodevices May Use Light To Generate Hydrogen From Water

The Glittering Eye alerts me to a post about nanomaterials that can use light to split water into hydrogen and oxygen. The original Sandia National Laboratories press release adds more details on how these nanoscale devices might be the key to the use of solar energy to produce hydrogen for energy.

Sunlight splitting water molecules to produce hydrogen by devices too small to be seen in a standard microscope. That’s a goal of a research team led by Sandian John Shelnutt (1116) that has captured the interest of chemists around the world who pursue this “Holy Grail of chemistry."

“The broad objective of the research is to design and fabricate new types of nanoscale devices,” John says. “This investigation is exciting because it promises to provide fundamental scientific breakthroughs in chemical synthesis, self-assembly, electron and energy transfer processes, and photocatalysis. Controlling these processes is necessary to build nanodevices for efficient water splitting, potentially enabling a solar hydrogen-based economy.”

The prospect of using sunlight to split water at the nanoscale grew out of John’s research into the development of hollow porphyrin nano-tubes (see “Porphyrin nanotubes versus carbon” on page 4). These light-active nanotubes can be engineered to have minute deposits of platinum and other metals and semiconductors on the outside or inside of the tube.

The key to making water-splitting nanodevices is the discovery by Zhongchun Wang (1116) of nanotubes composed entirely of porphyrins. Wang is a postdoctoral fellow at the University of Georgia working in John’s Sandia research group. The porphyrin nanotubes are micrometers in length and have diameters in the range of 50-70 nm with approximately 20-nm-thick walls. They are prepared by ionic self-assembly of two oppositely charged porphyrins — molecules that are closely related to chlorophyll, the active parts of photosynthetic proteins.

Photovoltaic devices that convert photonic energy into electricity are just one of several approaches to converting solar energy into more useful forms. Devices that use light to catalyze the splitting of water would generate hydrogen that could be burned in fuel cells to generate electricity or burned in more conventional engines to produce mechanical energy.

Another approach might be to copy nature, which uses photonic energy captured by chlorophyll to drive the fixing of hydrogen from water and carbon from carbon dioxide into hydrocarbons. Most biomass energy approaches rely on the ability of plants to do this. However, entirely synthetic materials could be developed that do the same thing, and such materials have the potential to generate hydrocarbons more efficiently than plants can manage.

Shelnutt thinks hydrogen-generating nanodevices could absorb and use a very large portion of the light energy spectrum. This could make them more efficient than either plants or all currently produced photovoltaic cell designs.

“Laboratory-scale devices of this type have already been built by others,” John says. “All we are doing is reducing the size of the device to reap the benefits of the nanoscale architecture.”

John says the nanodevice could efficiently use the entire visible and ultraviolet parts of the solar spectrum absorbed by the tubes to produce hydrogen, one of the “Holy Grails of chemistry.”

These nanotube devices could be suspended in a solution and used for photocatalytic solar hydrogen production.

“Once we have functional nanodevices that operate with reasonable efficiency in solution, we will turn our attention to the development of nanodevice-based solar light-harvesting cells and the systems integration issues involved in their production,” John says. “There are many possible routes to the construction of functional solar cells based on the porphyrin nanodevices. For example, we may fabricate nanodevices in arrays on transparent surfaces, perhaps on a masked free-standing film. However, we have a lot of issues to resolve before we get to that point.”

If solar energy can be harnessed to produce pure hydrogen we will still be faced with the problems of how to store and transport the hydrogen. We need both better battery technologies and better hydrogen storage materials.

By Randall Parker    2005 April 14 03:17 PM   Entry Permalink | Comments (17)
2005 April 01 Friday
New Fuel Cell Design Avoids Need For Hydrogen Storage

A new, highly efficient fuel cell design converts a liquid hydrocarbon fuel into hydrogen on board and then uses that hydrogen to generate electricity.

"A hydrogen economy is not a perfectly clean system," said Scott A. Barnett, professor of materials science and engineering. "You have to process fossil fuels at a plant to produce hydrogen fuel as well as develop an infrastructure to get that fuel into vehicles. We have bypassed these technological hurdles by basically bringing the hydrogen plant inside and pairing it with a high-temperature fuel cell in one compact unit that has a fuel efficiency of up to 50 percent."

In a paper to be published online today (March 31) by the journal Science, Barnett and graduate student Zhongliang Zhan report the development of a new solid oxide fuel cell, or SOFC, that converts a liquid transportation fuel -- iso-octane, a high-purity compound similar to gasoline -- into hydrogen which is then used by the fuel cell to produce energy. The cells could lead to cost-effective, clean and efficient electrical-power sources for applications ranging from aircraft and homes to cars and trucks.

Although only demonstrated on a small scale, Barnett and Zhan's fuel cells are projected to have a 50 percent fuel efficiency when used in a full-sized fuel cell generator, which would improve on other technologies. Higher fuel efficiencies mean less precious fuel is consumed and less carbon dioxide, a greenhouse-effect gas related to global warming, is produced. Internal combustion engines have a "well-to-wheels" efficiency of a mere 10 to 15 percent. Current hydrogen fuel cells that require hydrogen plants and new infrastructure have been calculated to have a 29 percent fuel efficiency while commercial gas/electric hybrid vehicles already have achieved 32 percent.

"The advent of hybrid vehicles has shaken up the fuel cell community and made researchers rethink hydrogen as a fuel," said Barnett, who drives a Toyota Prius and foresees his new fuel cells being developed for use in battery/SOFC hybrid technology for vehicle propulsion or in auxiliary power units. "We need to look at the solid oxide fuel cell -- the one kind of fuel cell that can work with other fuels beside hydrogen -- as an option."
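The well-to-wheels efficiency figures quoted above invite a direct comparison. A minimal sketch, taking the 10 to 15 percent internal-combustion range at its midpoint and treating every other number as the article quotes it:

```python
# Relative fuel burned per unit of useful work, given the quoted
# well-to-wheels efficiencies.
efficiencies = {
    "internal combustion engine": 0.125,   # midpoint of the 10-15% range
    "hydrogen fuel cell (central H2 plant)": 0.29,
    "gas/electric hybrid": 0.32,
    "SOFC generator (projected)": 0.50,
}

baseline = efficiencies["internal combustion engine"]
for name, eff in efficiencies.items():
    # Fuel needed for the same useful work, relative to a conventional engine.
    relative_fuel = baseline / eff
    print(f"{name}: {eff:.0%} efficient, {relative_fuel:.2f}x the fuel")
```

By this crude measure the projected solid oxide fuel cell would burn only a quarter of the fuel, and emit a quarter of the CO2, of a conventional engine doing the same work.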

They use the heat from the fuel cell's operation to catalyze the breaking of the carbon-hydrogen bonds in the liquid hydrocarbon fuel. Smart approach.

Because conventional solid oxide fuel cells operate at such high temperatures (between 600 and 800 degrees Centigrade) Barnett recognized that the heat could be used internally for the chemical process of reforming hydrogen, eliminating the need for hydrogen plants with their relatively low fuel efficiency. Barnett and Zhan found the optimal temperature for their system to be 600 to 800 degrees.

The real key to the new fuel cell is a special thin-film catalyst layer through which the hydrocarbon fuel flows toward the anode. That porous layer, which contains stabilized zirconia and small amounts of the metals ruthenium and cerium, chemically and cleanly converts the fuel to hydrogen.

This approach avoids the need to solve all the difficult technical problems that stand in the way of using hydrogen as a form of energy. Even if the technical problems of hydrogen distribution and storage were solved, there would still be the need to build the infrastructure to transport and store hydrogen. This approach sidesteps the massive capital investments required to deliver hydrogen to cars.

Also, using the fuel cell's own heat to reform the fuel into hydrogen probably achieves a larger overall system efficiency than could be achieved if hydrogen were produced in special chemical plants that had to generate their own heat for reforming. As long as fossil fuels are the source of the energy used to generate the hydrogen, using fuel cell heat to convert hydrocarbon fuel to hydrogen will increase overall efficiency. However, if hydrogen could be generated from nuclear or solar power the efficiency advantage of converting from liquid fuel to hydrogen in a vehicle would not be as great.

Another thought: Fuel cells as energy sources in cars will not make the batteries in hybrids obsolete. Why? Hybrid vehicles get part of their fuel efficiency boost from regenerative braking: applying the brakes kicks in an electric generator, driven by the wheels' rotational energy, that recharges the batteries. This recaptures some of the energy used to accelerate the vehicle. Even if the internal combustion engine is replaced by fuel cells at some future date a hybrid design would still enable energy recapture when braking to improve fuel mileage.

Batteries may also allow fuel cells to operate more efficiently by reducing the frequency with which fuel cells are activated. Note the high operating temperature mentioned above. In their design that heat is harnessed to generate hydrogen. But every time the fuel cell is turned off waste heat is lost as the fuel cell cools. Some fuel cell designs may even need to be heated up before they can start operating. Plus, there is also the need to generate enough heat initially to produce the hydrogen fuel. On shorter trips batteries could avoid the need to use energy to warm up and run a fuel cell and avoid the energy lost as a fuel cell cools.

By Randall Parker    2005 April 01 10:25 PM   Entry Permalink | Comments (45)
2005 February 28 Monday
Do Hydroelectric Dams Cause Global Warming?

For years hydroelectric dams have shown up on lists of energy sources that are renewable and non-polluting. Environmental complaints about dams have centered on more local considerations, such as the fact that dams can disrupt fish spawning and that they increase water evaporation by enlarging the surface area over which water can evaporate. Well, Philip Fearnside of Brazil's National Institute for Research in the Amazon says that dams increase the amount of plant matter that decomposes in anaerobic conditions, producing methane, which is 21 times more potent as a greenhouse gas than carbon dioxide.

In a study to be published in Mitigation and Adaptation Strategies for Global Change, Fearnside estimates that in 1990 the greenhouse effect of emissions from the Curuá-Una dam in Pará, Brazil, was more than three-and-a-half times what would have been produced by generating the same amount of electricity from oil.

This is because large amounts of carbon tied up in trees and other plants are released when the reservoir is initially flooded and the plants rot. Then after this first pulse of decay, plant matter settling on the reservoir's bottom decomposes without oxygen, resulting in a build-up of dissolved methane. This is released into the atmosphere when water passes through the dam's turbines.

Note that a dam in Brazil right on the equator probably receives a lot more plant matter from the river that fills its reservoir than would a dam on a river further from the equator. Even if some hydroelectric dams turn out to be net producers of greenhouse gases, we can't assume that all hydroelectric dams cause more warming from methane production than they prevent through avoided carbon dioxide release.

Methane is a valuable gas to capture in situations where capture is practical, because methane can be burned for energy. Also, burning methane converts its carbon into carbon dioxide, a far less potent greenhouse gas.
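The arithmetic behind that point can be made concrete. A rough sketch using the 21x potency figure cited above and standard molar masses (the one-tonne quantity is just illustrative):

```python
# CO2-equivalent accounting for one tonne of methane, using the
# 21x greenhouse potency figure cited in the article.
GWP_CH4 = 21                 # tonnes CO2-equivalent per tonne of CH4
M_CH4, M_CO2 = 16.0, 44.0    # molar masses, g/mol

tonnes_ch4 = 1.0
vented_co2e = tonnes_ch4 * GWP_CH4           # warming effect if released unburned
burned_co2 = tonnes_ch4 * (M_CO2 / M_CH4)    # CH4 + 2 O2 -> CO2 + 2 H2O
print(f"Vented: {vented_co2e:.1f} t CO2e")
print(f"Burned: {burned_co2:.2f} t CO2")
print(f"Burning avoids {vented_co2e - burned_co2:.2f} t CO2e per tonne of CH4")
```

So every tonne of methane captured and burned avoids roughly 18 tonnes of CO2-equivalent warming, before even counting the energy recovered.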

Parenthetically, James Hansen of NASA has been arguing for several years that reduction in methane emissions would reduce global warming effects more cheaply than lowering carbon dioxide emissions (and see more on this here and here). I especially like his argument that lowering methane emissions would both increase air quality down at ground level where we live and decrease greenhouse warming effects. Hansen still thinks carbon dioxide emissions restrictions will be necessary. But why not first implement the far cheaper option of decreasing methane emissions and also get better ground level air quality in the bargain? Just the increase in ground level air quality alone would, in my opinion, justify the costs. Efforts to capture methane would be at least partially paid back because the captured methane could be burned for energy.

Update: Dave Schuler mentioned methane production from agriculture in the comments. I can't answer his question about the relative contribution agriculture makes to methane emissions. But this reminds me of recent research at the University of California at Davis which showed that most methane from cows comes from cow belching.

California dairy cows produce only half the amount of certain air pollutants as had been believed and, perhaps more important, most of a dairy cow's contribution to smog comes not from her fresh manure, but from her belching, according to preliminary findings by a UC Davis scientist.

Those unexpected results may affect the thinking and practices of California regulators and dairy operators trying to reduce air pollution.

"We have to re-think the idea that the only good solutions are engineering solutions, and consider biological avenues such as animal feeding and management to reduce emissions," said Frank Mitloehner, the UC Davis air quality specialist who is conducting the study.

For three months, Mitloehner and his co-workers have studied dairy cows in sealed environmental chambers to simulate emissions from one type of cow housing, known as freestall conditions. Under these controlled conditions -- the first study of its kind -- the researchers were able to collect precise measurements of the volatile organic compound (VOC) emissions that cows and their fresh waste produce.

"For the first time we can tell dairy farmers the source of VOCs from the cow-housing part of their dairy," Mitloehner said. "For the most tightly regulated pollutant, the 700 ozone-forming gases collectively called volatile organic compounds, that source is not the cows' fresh waste. It's the cows."

This result makes methane emissions reduction easier than was previously thought. Bacteria or compounds that inhibit methane production could be added to cow feed. But the idea of changing feedstocks to reduce methane emissions from cows is nothing new and I've come across mentions of research along these lines in Switzerland, France, Australia, and New Zealand. In NZ agricultural scientists are experimenting with different types of grasses to lower methane production in grazing animals. They found substantial differences in methane production depending on which grass was fed to the animals. (The CH4 mentioned below is methane.)

Improving the quality of the diet of ruminants tends to result in higher feed intakes, which in turn tends to increase productivity and CH4 output per animal. However, if CH4 is expressed per unit of product, then using a smaller number of high-producing animals to produce a given amount of product emits less CH4 than using a larger number of lower producing animals. This is because a smaller proportion of the feed eaten is required to maintain the animal and because high feed intakes tend to reduce CH4 yield per unit of feed eaten. Concentrate diets produce less CH4 than forage diets but are too expensive for extensive use in New Zealand. Research undertaken by AgResearch and Dexcel indicates that certain forage species e.g. white clover, lotus and sulla, improve animal performance and produce less CH4 per unit of feed eaten. Experiments are currently underway to look at whether ryegrass cultivars selected for improved animal performance also result in lower CH4 yields per unit of product.

One can easily imagine a great reduction in agricultural methane production by seeding pastures and farm fields with grasses found to reduce the methane production of cows. Countries willing to genetically engineer grasses with factors that reduce methane production will be able to achieve the greatest reductions in agricultural methane emissions. This will probably turn out to be a fairly cheap and easy way to reduce methane emissions.

Agricultural scientists Garry Waghorn and Michael Tavendale of AgResearch Grasslands in New Zealand have found that higher levels of condensed tannins in grasses reduce methane production.

Methane is either burped or expelled out in breath, and is a by-product of the fermentation of feed in the rumen. Dr Waghorn and Dr Tavendale say about 90 percent of all methane emissions come from ruminants. Greenhouse gases affect everyone, because the Government is committed to ratifying the Kyoto Protocol. Once the agreement is signed, New Zealand will face financial penalties if it exceeds the emissions it recorded in 1990. But, and this is the dilemma for the country, if agricultural production expands, so will gas emissions. Condensed tannins are a naturally occurring compound found in red wine, apple skins and cocoa, as well as some pasture grasses, including lotus and sulla. They can also be found in docks, white clover flowers and some seeds. Besides reducing methane emissions, condensed tannins have other animal-related benefits, including improved milk yields, increased liveweight gain, decreased internal parasite burden and reduced bloat, dags and fly strike. Dr Waghorn said tannins had in the past been considered "evil" because some plants, especially tropical ones, contained them in high concentrations, which were bad for animals. But in a temperate climate such as New Zealand's, condensed tannins were found in "weedy species", or less common plants, he said. "It's unusual to find it in grasses, which is a problem because animals eat grass," Dr Waghorn said.

But in some reports I came across condensed tannins reduced methane emissions by only 15%. Tannins alone might not be a panacea.

Whereas reportedly only 2 percent of greenhouse gas effects in the United States come from agriculture, New Zealand has only 4 million people but 10 million cattle and 45 million sheep. Therefore most of New Zealand's greenhouse gas emissions come from agriculture.

"New Zealand is unique in that over 50 percent of its greenhouse gas emissions arise from methane released by enteric fermentation," said Katharine Hayhoe, an atmospheric scientist at the University of Illinois, Urbana-Champaign, Illinois.

Therefore it is not surprising that New Zealand agricultural scientists are especially interested in reducing methane emissions from cows and sheep.

By Randall Parker    2005 February 28 12:58 AM   Entry Permalink | Comments (18)
2005 February 04 Friday
Power Plant Waste Heat Could Produce Fresh Water

University of Florida Department of Mechanical and Aerospace Engineering professors James Klausner and Renwei Mei have developed a method to use power plant waste heat to lower the cost of water desalination.

Since power plants need water for cooling purposes and desalination plants need heat, why not combine the needs of both? The professors - James Klausner and Renwei Mei - calculate that their process would shave a sixth of the cost from today's most efficient technology.

If we either develop cheaper energy sources or more of the world becomes industrialized then there will be no world-scale shortage of drinkable water. If people can afford to pay for water it can always be produced by desalination. Alarmist talk in some circles about future water shortages assumes a high rate of poverty. Water shortages will become a bigger problem in the future only where severe poverty continues to be a problem.

Employing a major modification to distillation, Klausner's technology relies on a physical process known as mass diffusion, rather than heat, to evaporate salt water.

In a nutshell, pumps move salt water through a heater and spray it into the top of a diffusion tower – a column packed with a polyethylene matrix that creates a large surface area for the water to flow across as it falls. Other pumps at the bottom of the tower blow warm, dry air up the column in the opposite direction of the flowing water. As the trickling salt water meets the warm dry air, evaporation occurs. Blowers push the now-saturated air into a condenser, the first stage in a process that forces the moisture to condense as fresh water.

Klausner said the key feature of his system is that it can tap warmed water plants have used to cool their machines to heat the salt water intended for desalination, turning a waste product into a useful one.

He has successfully tested a small experimental prototype in his lab, producing about 500 gallons of fresh water daily. His calculations show that a larger version, tapping the waste coolant water from a typically sized 100-megawatt power plant, has the potential to produce 1.5 million gallons daily. The cost is projected at $2.50 per 1,000 gallons, compared with $10 per thousand gallons for conventional distillation and $3 per thousand gallons for reverse osmosis.

Because the equipment would have to extract as much heat as possible from the coolant water, it would need to be installed when a plant is built, he said. Another potential caveat is that a full-scale version of the mechanism would require a football-field-sized plot of land, likely to be expensive in coastal areas where power plants are located, Klausner said. Presumably a utility would sell the fresh water it produces, recouping and then profiting from its investment, he said.
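The quoted cost figures make the comparison easy to run. A small sketch using only the numbers above:

```python
# Cost comparison for the desalination figures quoted above.
gallons_per_day = 1_500_000      # projected output tapping a 100 MW plant's coolant
costs_per_kgal = {               # dollars per 1,000 gallons
    "diffusion-driven (Klausner)": 2.50,
    "reverse osmosis": 3.00,
    "conventional distillation": 10.00,
}

for method, cost in costs_per_kgal.items():
    daily = gallons_per_day / 1000 * cost
    print(f"{method}: ${daily:,.0f}/day, ${daily * 365:,.0f}/year")
```

At the projected $2.50 rate the full-scale unit's 1.5 million gallons per day would cost about $3,750 a day to produce, versus $15,000 by conventional distillation.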

Limited quantities of energy and intelligence are the two biggest factors holding down the rate of economic development. Every technology that increases the efficiency of utilization of energy or lowers the cost of energy spurs economic growth. Anything that raises human intelligence levels will do the same. Also, computer technologies effectively increase the efficiency of the use of human intelligence by unburdening many tasks from human minds. So computers are brain utilization efficiency increasers.

By Randall Parker    2005 February 04 01:13 AM   Entry Permalink | Comments (3)
2005 January 13 Thursday
Peter Huber And Mark Mills On Our Energy Future

Peter W. Huber and Mark P. Mills have a new book out about energy policy entitled The Bottomless Well: The Twilight Of Fuel, The Virtue Of Waste, And Why We Will Never Run Out Of Energy which includes a strong pitch for nuclear power as our best choice to meet continuously rising energy demand. Tyler Cowen finds the book interesting. The latest edition of the City Journal (which FuturePundit strongly recommends) has a long article by Huber and Mills which provides shorter versions of the book's arguments. Each American continually uses about 1,400 watts of electricity on average.

Think of our solitary New Yorker on the Upper West Side as a 1,400-watt bulb that never sleeps—that’s the national per-capita average demand for electric power from homes, factories, businesses, the lot. Our average citizen burns about twice as bright at 4 pm in August, and a lot dimmer at 4 am in December; grown-ups burn more than kids, the rich more than the poor; but it all averages out: 14 floor lamps per person, lit round the clock. Convert this same number back into a utility’s supply-side jargon, and a million people need roughly 1.4 “gigs” of power—1.4 gigawatts (GW). Running at peak power, Entergy’s two nuclear units at Indian Point generate just under 2 GW. So just four Indian Points could take care of New York City’s 7-GW round-the-clock average. Six could handle its peak load of about 11.5 GW. And if we had all-electric engines, machines, and heaters out at the receiving end, another ten or so could power all the cars, ovens, furnaces—everything else in the city that oil or gas currently fuels.

Note that the 6 and 10 Indian Points translate into 12 and 20 nuclear power plants. So they are talking about supplying all the power for New York City for all purposes with about 32 nuclear power plants. In a previous post about how all transportation energy could be supplied by 1000 nuclear power plants I pointed to Westinghouse's new AP1000 nuclear plant design that would generate about 1,100 MW or 1.1 GW. My guess is that since nuclear plants have down times it would take 32 AP1000 plants to run NYC. So we can place a price of about $32 billion on their construction. If anyone has any good estimates on yearly operations and fuel costs or nuclear waste disposal and decommissioning costs I'd like to hear them in the comments.

NYC has a population of about 8 million people. The United States as a whole has about 293 million people. So the US as a whole is almost 37 times larger. If the American people would need proportionately as many nuclear power plants as the denizens of the Big Apple then the US as a whole could be operated completely on nuclear power with 32 times 37 or 1184 nuclear power plants. However, that doesn't sound reasonable given the previous estimate I've referenced that claimed 1000 nuclear plants would be needed to power just the cars in America. Anyone understand the cause of the different results of these calculations?
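For anyone who wants to check or extend the scaling argument, here it is as a few lines of arithmetic (all inputs are the figures from this post):

```python
# Scaling NYC's hypothetical all-nuclear plant count up to the whole US.
nyc_pop = 8_000_000
us_pop = 293_000_000
nyc_plants = 32                  # plants to run all of NYC, from the estimate above

scale = us_pop / nyc_pop         # ~36.6
us_plants = nyc_plants * scale   # ~1,172 (rounding the ratio up to 37 gives 1,184)
print(f"US/NYC population ratio: {scale:.1f}")
print(f"Plants to run the whole US at NYC's rate: {us_plants:.0f}")
```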

Vehicles use only 30% of the raw energy now consumed in America.

The U.S. today consumes about 100 quads—100 quadrillion BTUs—of raw thermal energy per year. We do three basic things with it: generate electricity (about 40 percent of the raw energy consumed), move vehicles (30 percent), and produce heat (30 percent). Oil is the fuel of transportation, of course. We principally use natural gas to supply raw heat, though it’s now making steady inroads into electric power generation. Fueling electric power plants are mainly (in descending order) coal, uranium, natural gas, and rainfall, by way of hydroelectricity.

Note that in spite of the attention lavished upon oil as a political topic it accounts for less than half of all energy use in the United States. But there is a difference between energy generated and energy used. A large fraction of the energy generated in electric power generation plants is lost by the time electricity flows through a wall socket. To supply enough nuclear power to operate cars would require more heat generation than is currently generated from burning gasoline in cars. See, for example, the efficiency column of the first table of Engineer-Poet's Ergosphere post laying out his vision of our energy future.

In spite of the lower energy efficiency of electricity it is so convenient and useful that it is a growing percentage of total energy used in America and likely worldwide as well.

That shift is already under way. About 60 percent of the fuel we use today isn’t oil but coal, uranium, natural gas, and gravity—all making electricity. Electricity has met almost all of the growth in U.S. energy demand since the 1980s.

Will this shift toward electricity as the preferred medium for delivering energy continue? To put it another way: Does hydrogen stand any chance of becoming a major medium for the distribution of power? Hydrogen has a lot of problems as an energy storage medium. Perhaps advances in nanotechnology will solve some of those problems. But we'd still be left with the need to use nuclear power plants or solar photovoltaic panels to generate the power we'd use to produce hydrogen in the first place. At the same time, materials advances will reduce electric power transmission costs and so hydrogen is not going to compete against a static target.

Electricity has an inherent advantage over hydrogen: Many end uses require electricity. Think about computers or any electronic devices. The devices run on electricity. Hydrogen use for these applications would require generation of hydrogen from some other energy source (possibly in nuclear power plants designed to optimize hydrogen production), hydrogen transportation and hydrogen storage devices where needed, and then the use of hydrogen fuel cells to generate electricity where and when it is needed. Any guesses on why that approach can be expected to cost more or less than the construction of more superconducting high voltage lines?

Huber and Mills see electricity continuing to encroach on natural gas and other competing energy sources in end-use applications.

Electricity is taking over ever more of the thermal sector, too. A microwave oven displaces much of what a gas stove once did in a kitchen. So, too, lasers, magnetic fields, microwaves, and other forms of high-intensity photon power provide more precise, calibrated heating than do conventional ovens in manufacturing and the industrial processing of materials. These electric cookers (broadly defined) are now replacing conventional furnaces, ovens, dryers, and welders to heat air, water, foods, and chemicals, to cure paints and glues, to forge steel, and to weld ships. Over the next two decades, such trends will move another 15 percent or so of our energy economy from conventional thermal to electrically powered processes. And that will shift about 15 percent of our oil-and-gas demand to whatever primary fuels we’ll then be using to generate electricity.

Huber and Mills also point out that cars are becoming big electric appliances. They expect the trend toward hybrid vehicles to effectively turn the engine under the hood into an electric power plant serving the growing number of devices that run off its electricity. I agree with this assessment and have previously argued that Cars May Become Greater Electricity Generators Than Big Electric Plants. However, while the engines in hybrids may eventually become huge electricity generators, Huber and Mills argue that using gasoline to generate car electric power is an expensive way to charge car batteries. Electric power delivered from electric utility companies is much cheaper:

Once you’ve got the wheels themselves running on electricity, the basic economics strongly favor getting that electricity from the grid if you can. Burning $2-a-gallon gasoline, the power generated by current hybrid-car engines costs about 35 cents per kilowatt-hour. Many utilities, though, sell off-peak power for much less: 2 to 4 cents per kilowatt-hour. The nationwide residential price is still only 8.5 cents or so.

This makes pluggable hybrids (hybrids that can be recharged from wall sockets while parked) the next logical step. I expect we will see the development of better and cheaper batteries to facilitate this transition.
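As a rough check on the 35-cents-per-kilowatt-hour figure, one can back out the implied fuel-to-electricity conversion efficiency. This sketch assumes roughly 33.7 kWh of thermal energy per gallon of gasoline (the usual gasoline-gallon-equivalent figure); the prices are the ones quoted above:

```python
# Back-of-envelope check on the cost of hybrid-generated electricity.
gas_price = 2.00            # $/gallon of gasoline
onboard_cost = 0.35         # $/kWh of electricity from the hybrid's engine
grid_offpeak = 0.03         # $/kWh, midpoint of the quoted 2-4 cent range
thermal_kwh_per_gal = 33.7  # assumed thermal energy content of a gallon

electric_kwh_per_gal = gas_price / onboard_cost            # ~5.7 kWh per gallon
implied_efficiency = electric_kwh_per_gal / thermal_kwh_per_gal
print(f"Implied fuel-to-electricity efficiency: {implied_efficiency:.0%}")
print(f"Cost ratio, onboard vs. off-peak grid: {onboard_cost / grid_offpeak:.1f}x")
```

An implied efficiency of about 17 percent is plausible for an engine-plus-generator chain, and it makes off-peak grid power roughly a tenth the cost of gasoline-generated power, which is the economic case for the pluggable hybrid.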

Huber and Mills make a pitch for nuclear power that has an interesting twist to it: Because nuclear plant reactors are so small compared to the power that comes from them it is easy to overbuild them to protect against terrorist attacks.

And uranium’s combination of power and super-density makes the fuel less of a terror risk, not more, at least from an engineering standpoint. It’s easy to “overbuild” the protective walls and containment systems of nuclear facilities, since—like the pyramids—the payload they’re built to shield is so small. Protecting skyscrapers is hard; no builder can afford to erect a hundred times more wall than usable space. Guaranteeing the integrity of a jumbo jet’s fuel tanks is impossible; the tanks have to fly. Shielding a nuclear plant’s tiny payload is easy—just erect more steel, pour more concrete, and build tougher perimeters.

Because uranium fuel amounts to only one tenth of the total cost of nuclear power it is a great energy source for baseload power needs. Also, since hybrid vehicles could be recharged at night an electric industry run by nuclear power would work extremely well with residential variable rate electric power metering, where late night prices for electric power would be much lower than peak day rates.

Huber and Mills mention that the Hoover Dam on the Colorado generates 2 GW of electric power. Increasingly I find myself automatically translating energy numbers into nuclear plant terms. For the cost of two $1 billion Westinghouse AP1000 nuclear plants, and perhaps a few billion more in operations, fuel, and waste disposal costs, the Hoover Dam could be torn down and the Colorado River allowed to return to its natural state. So perhaps the whole project would run to $6 billion. Sound far-fetched? Over a period of decades we spend trillions of dollars on environmental protection. Once nuclear power again becomes an acceptable energy source I predict that the idea of building nuclear power plants to enable the tearing down of hydroelectric dams will become popular in environmentalist circles.
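That replacement estimate reduces to a couple of lines. A sketch using the post's own numbers (1.1 GW nominal output per plant, $1 billion construction each, and a few billion of allowance for everything else):

```python
import math

# Rough cost to replace Hoover Dam's output with nuclear plants.
hoover_gw = 2.0
plant_gw = 1.1        # nominal output per plant
plant_cost_b = 1.0    # construction cost per plant, $ billions
other_costs_b = 4.0   # allowance for operations, fuel, and waste disposal

plants_needed = math.ceil(hoover_gw / plant_gw)        # round up to whole plants
total_b = plants_needed * plant_cost_b + other_costs_b
print(f"{plants_needed} plants, roughly ${total_b:.0f} billion all-in")
```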

Huber and Mills end their City Journal article with a plea to end our need to buy oil from the Middle East in order to stop the flow of money into a region so intent upon violence. I agree.

Update: What are nuclear fission power's biggest competitors in the medium run of the next 20 to 40 years? I see two:

  • Coal plants that have extremely low emissions. By extremely low I mean less than 1% of the mercury, sulfur dioxide, soot, and other emissions now coming from the worst American coal-fired electric power plants. Also, the plants would have near total carbon dioxide sequestration. Coal emissions control technology has already greatly improved and further improvements are possible. So clean coal might become possible for less than the cost of nuclear power.
  • Cheaper solar photovoltaics coupled with cheaper storage systems. The costs of photovoltaic panels inevitably will fall by an order of magnitude and more. At the same time, carbon nanotubes or lithium polymers will enable the building of batteries that are cheaper, higher power density, and longer lasting.

In the longer run what are nuclear fission's other competitors? Two more will become feasible:

  • Nuclear fusion. Will fusion be cheaper than fission?

  • Solar satellites. The satellites could provide constant power and the light hitting them would be more intense than the light hitting solar panels down on Earth. A carbon nanotube beanstalk into space may eventually make construction and deployment of such satellites orders of magnitude cheaper than it is today.

Natural gas in the form of clathrates on the ocean floor might become a major source. However, it is unclear how much natural gas is tied up in clathrates.

By Randall Parker    2005 January 13 04:06 PM   Entry Permalink | Comments (87)
2004 December 24 Friday
Planned Coal Plants Reverse 5 Times CO2 Impact Of Kyoto Protocol

Mark Clayton of the Christian Science Monitor has written an excellent analysis of the carbon dioxide emissions of planned coal-powered electric generation plants.

China is the dominant player. The country is on track to add 562 coal-fired plants - nearly half the world total of plants expected to come online in the next eight years. India could add 213 such plants; the US, 72.

The rapid industrialization of China and to a lesser extent India, rising natural gas prices, concerns about energy security, and large coal reserves in the United States and China are all driving the shift toward the construction of electricity generators that burn coal.

It is worth noting that the Kyoto treaty does not place any CO2 emissions restrictions on India or China.

Without the development and deployment of CO2 sequestration technology these coal plants alone will boost world CO2 emissions by 14% in a mere 8 years.

Without such technology, the impact on climate by the new coal plants would be significant, though not entirely unanticipated. They would boost CO2 emissions from fossil fuels by about 14 percent by 2012, Schmidt estimates.

One thing that is striking about these numbers is the scale of world economic growth. Coal-fired electric plant construction alone in only 8 years can boost worldwide CO2 emissions by 14%. There is so much money sloshing around in the world that in a relatively short period of time a few hundred gigawatts of electric power plants can be constructed.
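To put "a few hundred gigawatts" in perspective, here is a rough tally of the planned plants (the 500 MW average plant size is my assumption, not the article's):

```python
# Tallying the planned coal plants listed in the article.
planned = {"China": 562, "India": 213, "US": 72}
avg_plant_mw = 500                  # assumed average plant size, not from the article

listed = sum(planned.values())      # 847 plants in the three countries listed
# China's 562 is described as "nearly half the world total".
world_total = round(562 / 0.5)
print(f"Listed plants: {listed}; implied world total: ~{world_total}")
print(f"At {avg_plant_mw} MW each: ~{world_total * avg_plant_mw / 1000:.0f} GW of new coal capacity")
```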

The other point that strikes me is that if nuclear power were cheaper than coal (perhaps using a pebble bed modular reactor approach) then these plants would be getting built as nuclear plants and there would be no CO2 emissions to worry about.

Clayton comments that FutureGen, a US government effort to build a large-scale technology demonstrator coal generation plant with CO2 sequestration, is slipping its schedule due to lack of a push for it by the Bush Administration. Right he is. The schedule now has the FutureGen plant being completed in 2018, but even that late date looks unlikely to happen.

Industry groups are becoming increasingly impatient with the US Energy Department's (DOE) handling of the FutureGen project because the government has failed to provide basic financial details and DOE's schedule for building the near-zero-emissions coal power plant is slipping.

...

Under DOE's plan, the department would spend $620-mil by 2018, industry would contribute $250-mil and foreign participants would kick in $80-mil toward the $950-mil facility. Part of that money would go toward a carbon sequestration system that would be used to store CO2 captured at the site.

I do not know whether we face a serious future problem with global warming. But I think it is prudent to spend the money to develop lots of energy-related technology to make it far cheaper to handle the problem should it become clear at some point in the future that CO2 emissions must be drastically decreased.

Over on the Crumb Trail blog the pseudonymous blogger back40 has reacted to my comments about how it would be cheaper to develop energy technologies that reduce CO2 emissions than to lay on a worldwide regulatory regime, and he provides details on the hundreds of trillions of dollars in cost estimates made for some of the CO2 emissions reduction proposals. The underfunded DOE FutureGen proposal is a piddlingly small amount of money compared to what it would cost to implement Kyoto. Even a very ambitious energy research effort on the scale of $10 billion per year (as Nobelist Richard Smalley argues for) would still be orders of magnitude less than the cost of enforcing Kyoto starting now. Such an effort would produce cleaner technologies that would be cheaper than today's energy technologies. The market alone would then move many of those technological developments into widespread use without any regulatory costs.

Even if we accept the contention of some of the global warming believers (and MIT climate scientist Richard Lindzen thinks the belief in global warming is religious and irrational) that we face a real problem from global warming in the future, that problem will not become hugely expensive to the human race for decades to come. Therefore if we spend relatively smaller amounts of money (but still tens of billions of dollars) now on energy research, we will eventually be able to reduce CO2 emissions at orders of magnitude less cost than if we attempt to reduce them using today's technologies. A big energy research effort is the most rational and cost-effective response to the potential threat of global warming.

Update: The construction of coal-fired power plants is going to continue well beyond just the next 8 years discussed above. The US alone needs the equivalent of 1,200 new power plants of 300 megawatts each over the next 25 years.

Electricity demand will increase 53.4 percent over the next 25 years. Meeting this rising growth rate will require the construction of the equivalent of more than 1,200 new power plants of 300 megawatts each—the equivalent of about 65 plants each year. Coal will remain the largest single source of electricity—accounting for 51 percent of power generation in 2025. Clean coal technologies will help meet these needs, plus continue the decline in SO2 and NOx emissions already underway.

That need could be met by building about 330 AP1000 1,100 megawatt nuclear power plants (perhaps more if the nuclear plants have lower capacity utilization rates). What would be cheaper? Coal CO2 emissions sequestration or nuclear power? Both costs are likely to drop in the future. Hard to say which will cost less in the longer run.
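As a back-of-the-envelope check on the plant-count figures above (assuming 300 MW per coal plant and 1,100 MW per nuclear plant, and ignoring differences in capacity factors):

```python
# Total new capacity implied by the projected coal build-out.
coal_plants = 1200
coal_mw_each = 300
total_mw = coal_plants * coal_mw_each        # 360,000 MW of new capacity

# How many 1,100 MW nuclear plants would supply the same capacity.
nuclear_mw_each = 1100
nuclear_plants = total_mw / nuclear_mw_each  # roughly 327, i.e. "about 330"
print(total_mw, round(nuclear_plants))
```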

By Randall Parker    2004 December 24 02:45 PM   Entry Permalink | Comments (4)
2004 December 04 Saturday
On Public Electric Power Grid Reliability And Terrorism

Mark P. Mills and Peter W. Huber have an article in the City Journal on whether terrorists could manage to knock out electric power to New York City for a long period of time and on how to make the electric grid more reliable.

It takes almost 11 gigawatts of electricity to keep New York City lit in the late afternoon on a hot summer day—a huge amount of power. All the air conditioners, lights, elevators, and quietly humming computers inside use a whole lot more energy than the cars and trucks out on the streets. But because the fuels and infrastructure that deliver the electric power are so distant and well hidden, it's hard to focus public attention on how vital they are to the city's survival. And how vulnerable.

Few of us have even the vaguest idea just how much a gigawatt of power might be. So let's talk Pontiacs instead: 110,000 of them, parked door to door in Central Park. At exactly the same moment, 110,000 drivers start the 110,000 engines, shift into neutral, push pedal to metal, and send 110,000 engines screaming up to the tachometer's red line. Collectively, these engines are now generating a total of about 11 gigawatts of shaft power.
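The Pontiac analogy checks out arithmetically: dividing the quoted 11 gigawatts across 110,000 engines gives a per-car output that is plausible for an engine at redline (the conversion factor of 745.7 watts per horsepower is the standard one; the rest of the numbers come from the quote):

```python
# Per-engine share of the 11 GW New York City demand figure quoted above.
total_watts = 11e9
cars = 110_000
watts_per_car = total_watts / cars   # 100,000 W = 100 kW per engine
hp_per_car = watts_per_car / 745.7   # about 134 horsepower at redline
print(watts_per_car, round(hp_per_car))
```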

The writers do not bring this up but the power of those cars suggests a future solution to the power back-up needs of New York City. See my posts Will Electric Hybrid Cars Be Used As Peak Electric Power Sources? and Cars May Become Greater Electricity Generators Than Big Electric Plants.

Mills and Huber say that with more sophisticated technology the massive power blackout that struck the American Northeast and Canada on August 14, 2003 could have been prevented.

Had they had real-time access to SCADA networks in Ohio, utilities across the Northeast would have seen the August 14 problem coming many minutes, if not hours, before it hit and could have activated protective switches before the giant wave swept east to overpower them. But in the deregulatory scramble of the 1990s, regulators had pushed the physical interconnection of power lines out ahead of the interconnection of SCADA networks. The software systems needed for automated monitoring and control across systems had not yet been deployed. By contrast, on-site power networks in factories and data centers nationwide are monitored far more closely and make much more sophisticated use of predictive failure algorithms.

Nor had the grid's key switches kept pace. To this day, almost all the grid's logic is provided by electromechanical switches—massive, spring-loaded devices that take fractions of seconds (an eternity by electrical standards) to actuate. But ultra-high-power silicon switches could control grid power flows much faster, more precisely, and more reliably. Already, these truck-size cabinets, containing arrays of solid-state switches that can handle up to 35 megawatts, safeguard power supplies at military bases, airport control hubs, and data and telecom centers. At levels up to 100 megawatts, enormous custom-built arrays of solid-state switches could both interconnect and isolate high-power transmission lines, but so far, they're operating at only about 50 grid-level interconnection points worldwide.

The current structure of regulation of the electric power grid provides disincentives for building a more reliable grid. If you want to follow the debate on government regulation of the electric power industry and energy policy more generally be sure to check out Lynne Kiesling's Knowledge Problem blog.

One quibble I have with their essay has to do with their argument that capacity for locally generated power increases the reliability of the public grid.

At the same time, distributed, small-scale, and often private efforts to secure power supplies at individual nodes directly strengthen the reliability of the public grid. Large-area power outages like the one on 8/14 often result from cascading failures. Aggressive load shedding is the only way to cut off chain reactions of this kind. A broadly distributed base of secure capacity on private premises adds resilience to the public grid, simply by making a significant part of its normal load less dependent on it. Islands of especially robust and reliable power serve as centers for relieving stresses on the network and for restoring power more broadly. Thus, in the aggregate, private initiatives to secure private power can increase the stability and reliability of the public grid as well.

Local generation capacity reduces the dependence of those locations on the public grid. But if the public grid is overloaded and in danger of a cascade of failures those local generation plants only help if they kick on before the cascade of failures and in response to a building strain in the grid. If local generating capacity is designed to turn on in response to a loss of power then their switching on will come too late to prevent a large grid cascading failure. Back-up generators reduce the disruption to society as a whole by allowing hospitals, phone services, and other services to continue to function. But generators that kick on in response to grid failure do not prevent grid failure.

If those local plants operate continuously as a permanent substitute for getting power from the public grid they have no effect on the reliability of the public grid. Such sites do not figure into calculations of how much generation or distribution capacity the public grid needs. So the public grid will be smaller due to the lack of demand from self-sufficient sites and the public grid will be just as liable to reach some critical point and fall over. If terrorists knock out transmission lines or switching stations or natural gas pipelines that deliver gas to electric plants then local generation capacity will not save the grid from failure. If the local generators are run on natural gas then they will even stop working in response to attacks on natural gas pipeline systems just as the public natural gas powered electric generator plants will stop.

Ultimately our vulnerability to public grid failures may end because the economies of scale from building large electric power plants may vanish, along with the need for a grid to distribute power from those large plants. For how that might come to pass see my post Thin Film Fuel Cells May Obsolesce Large Electric Plants. However, if fuel cells are powered by natural gas then such a turn of events will not end the vulnerability of society to attacks on large power distribution networks; we would still be vulnerable to attacks on natural gas pipeline systems. Liquid fuel systems, by contrast, are much less vulnerable to attack because local storage of liquid fuels is cheap and easy.

An economy based on cheap batteries and cheap solar photovoltaics would be much less vulnerable to attack in some parts of the country. But the problem with solar is that it is not concentrated enough to allow enough local generation of power in a highly dense area such as New York City. So in a solar/battery economy grids would still be needed to bring power to major metropolitan areas, especially in regions closer to the poles.

However, solar power used to drive artificial hydrocarbon synthesis (carbon-hydrogen fixing either by direct photosynthesis in an artificial structure or by electrically driven chemical reactions, which is what photosynthesis really is anyway) to create liquid hydrocarbon compounds would be fairly invulnerable to any man-made disruption. Though a very large volcanic eruption would seriously disrupt any solar-based power system (and most life on the planet). Nuclear power would continue to deliver power after a huge volcanic disruption.

By Randall Parker    2004 December 04 01:51 PM   Entry Permalink | Comments (0)
2004 December 02 Thursday
High Temperature Ceramic Hydrogen Generation Process Announced

Ceramatec of Salt Lake City, Utah and the US Department of Energy's Idaho National Engineering and Environmental Laboratory have announced a new technique that boosts the efficiency of converting energy into hydrogen.

SALT LAKE CITY -- Researchers at the U.S. Department of Energy’s Idaho National Engineering and Environmental Laboratory and Ceramatec, Inc. of Salt Lake City are reporting a significant development in their efforts to help the nation advance toward a clean hydrogen economy.

Laboratory teams have announced they’ve achieved a major advancement in the production of hydrogen from water using high-temperature electrolysis. Instead of conventional electrolysis, which uses only electric current to separate hydrogen from water, high-temperature electrolysis enhances the efficiency of the process by adding substantial external heat – such as high-temperature steam from an advanced nuclear reactor system. Such a high-temperature system has the potential to achieve overall conversion efficiencies in the 45 percent to 50 percent range, compared to approximately 30 percent for conventional electrolysis. Added benefits include the avoidance of both greenhouse gas emissions and fossil fuel consumption.

“We’ve shown that hydrogen can be produced at temperatures and pressures suitable for a Generation IV reactor,” said lead INEEL researcher Steve Herring. “The simple and modular approach we’ve taken with our research partners produces either hydrogen or electricity, and most notable of all – achieves the highest-known production rate of hydrogen by high-temperature electrolysis.”

This development is viewed as a crucial first step toward large-scale production of hydrogen from water, rather than fossil fuels.

The major private-sector collaborator has been Ceramatec, Inc. located at 2425 S. 900 West, Salt Lake City. “We’re pleased that the technology created over the nearly two decades dedicated to high-temperature fuel cell research at Ceramatec is directly applicable to hydrogen production by steam electrolysis,” said Ashok Joshi, Ph.D., Ceramatec chief executive officer.

“In fact, both fuel cell and hydrogen generation functionality can be embodied in a single device capable of seamless transition between the two modes. These years of investment, both public and private, in high temperature fuel cell research have enabled the Ceramatec-INEEL team to move quickly and achieve this important milestone toward establishing hydrogen as a part of our national energy strategy.”

The ceramic works as a sieve to separate the oxygen from the hydrogen. (same article here)

The new method involves running electricity through water that has a very high temperature. As the water molecule breaks up, a ceramic sieve separates the oxygen from the hydrogen.

But this approach requires the design and construction of new nuclear reactors that use helium gas as the cooling medium, with the helium heated to much higher temperatures than water is heated in existing reactor designs.

The idea is to build a reactor that would heat the cooling medium in the nuclear core, in this case helium gas, to about 1,000 degrees Celsius, or more than 1,800 degrees Fahrenheit. The existing generation of reactors--used exclusively for electric generation--use water for cooling and heat it to only about 300 degrees Celsius.

This latest advance does not solve any of the hydrogen transportation or storage problems. A whole new generation of nuclear reactors would need to be designed (with, no doubt, some unique and difficult design problems to solve) and built to operate with high temperature helium gas in their cores. Design and construction of those reactors would take several years (my guess is probably 10 or 12 and possibly longer). Given the costs and lead time involved and the existing unsolved problems in hydrogen transportation and storage my guess is that this technology is not going to see widespread use until the 2020s at the earliest.
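To put the quoted efficiency figures in concrete terms, here is a rough calculation of the input energy needed per kilogram of hydrogen. The 39.4 kWh/kg figure (hydrogen's higher heating value) and the reading of the quoted percentages as overall energy-in/energy-out efficiencies are my assumptions, not the laboratory's:

```python
# Energy input per kg of hydrogen at the quoted conversion efficiencies.
hhv_kwh_per_kg = 39.4                 # chemical energy in 1 kg of H2 (HHV)
conventional = hhv_kwh_per_kg / 0.30  # ~131 kWh in per kg at 30% efficiency
high_temp = hhv_kwh_per_kg / 0.50     # ~79 kWh in per kg at 50% efficiency
savings = 1 - high_temp / conventional  # 40% less input energy per kg
print(round(conventional), round(high_temp), savings)
```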

The hydrogen economy is still a distant prospect. Advances in nanotechnology will probably eventually solve the hydrogen storage problem. But other advances in nanotechnology will also eventually solve the car battery problem to allow the construction of cheap electric-powered cars that can travel long distances. Once the car battery problem is solved it takes far less of a change in energy infrastructure to migrate to electric power than it would to migrate to a hydrogen economy. This is just a guess but in the next 20 years I expect better batteries to do more than better hydrogen technologies to change the energy infrastructure of the world.

For more on the potential of electric power for cars over on the Ergosphere blog Engineer-Poet (yet another pseudonymous blogger whose real identity is a mystery) provides some great tables on where our energy comes from today and an argument for plug-in hybrid cars. Check out the table that shows the amount of energy that comes from different sources as measured in quadrillion BTUs ("Quads"). Note that the coal and natural gas quads combined exceed the petroleum quads. But the coal and natural gas burned for electricity lose a lot of energy in conversion. So delivered to the wheel of a car petroleum does more work per BTU. Would a switch to coal to power electric cars therefore require a much greater total consumption of energy? E-P also casts a skeptical look at someone else's skeptical look at hybrids.

By Randall Parker    2004 December 02 01:48 PM   Entry Permalink | Comments (13)
2004 October 27 Wednesday
UK Researchers Find Workable Hydrogen Storage Nanomaterial

Some British researchers have found a way to store hydrogen without sustained high pressures.

A team from the Universities of Newcastle upon Tyne and Liverpool in the UK, who report their findings in the prestigious academic journal, Science, have found a safe way of storing and releasing hydrogen to produce energy. They do this using nanoporous materials, which have tiny pores that are one hundred-thousandth (100,000th) the thickness of a sheet of paper.

...

The Liverpool and Newcastle researchers have found a workable method of injecting the gas at high pressure into the tiny pores - of ten to the minus nine metres in size - in specially-designed materials to give a dense form of hydrogen. They then reduce the pressure within the material in order to store the captured hydrogen safely. Heat can be applied to release the hydrogen as energy, on which a car could potentially run.

Professor Mark Thomas, of Newcastle University's Northern Carbon Research Laboratories in the School of Natural Sciences, a member of the research team, said:

"This is a proof of principle that we can trap hydrogen gas in a porous material and release it when required. However, if developed further, this method would have the potential to be applied to powering cars or any generator supplying power. Although hydrogen-powered cars are likely to be decades away, our discovery brings this concept a step towards becoming reality.

"Now that we have a mechanism that works, we can go on to design and build better porous framework materials for storing hydrogen, which may also be useful in industries that use gas separation techniques."

Professor Matt Rosseinsky, of the University of Liverpool's Department of Chemistry, said "Our new porous materials can capture hydrogen gas within their channels, like a molecular cat-flap.

"After allowing the hydrogen molecule – the 'cat' – in, the structure closes shut behind it. The important point is that the hydrogen is loaded into the materials at high pressure but stored in them at a much lower pressure - a unique behaviour. This basic scientific discovery may have significant ramifications for hydrogen storage and other technologies that rely on the controlled entrapment and release of small molecules."

The ability to store hydrogen at high density but under low pressure without extreme cooling is the holy grail for making hydrogen storage in cars practical. But this one result probably doesn't solve that problem. The nanotech material used might be very expensive to manufacture (as is presently the case with many nanotech materials such as nanotubes). Or it might not work over a wide range of environmental conditions. Or it might not work over hundreds of recharges. Still, this report is reason to be hopeful that hydrogen storage is a solvable problem. My guess is that nanotechnology approaches will be where the solutions are found. This report is therefore a step in the right direction.

By Randall Parker    2004 October 27 05:18 PM   Entry Permalink | Comments (5)
2004 October 13 Wednesday
Higher Temperature Superconducting Wire Continues To Advance

An article in the Christian Science Monitor reports that Sumitomo Electric Industries and American Superconductor Corp. (AMSC) are heading to market with next generation high temperature superconducting ceramic wire.

The wire, produced in Osaka, Japan, is narrower than the width of a pencil.

To develop the market, Sumitomo - Japan's biggest electric cablemaker - will offer the cable at competitive prices - about two to five times the price of conventional copper, Mr. Saeki says.

But Sumitomo will soon have competition. American Superconductor Corp. of Westborough, Mass., is working with the Oak Ridge National Laboratory on a more advanced version of the wire, which could be used as transmission lines for electric utilities.

This type of wire still needs to be cooled by liquid nitrogen to a range of -452 to -320 degrees Fahrenheit. So this stuff isn't going to be used as building wiring. But it could still be used for power lines and in motors for ships, trains, and other large pieces of equipment.

The article gives the impression that Sumitomo's next generation wire is coming to market right now. AMSC is still 3 or 4 years from hitting the market with its next generation wires, but it is already shipping existing designs and claims to be the world leader in higher temperature superconducting wire sales.

AMSC is the world’s leading developer and manufacturer of High Temperature Superconductor (HTS) wire. AMSC's first generation HTS wire, based on a multi-filamentary composite architecture, is capable of carrying over 140 times the power of copper wires of the same dimensions. It is the industry leader in both price and performance and is the product of choice in a variety of applications including power cables, motors, generators, and specialty magnets.

AMSC announced break-through results in September of 2002 of its second generation HTS wire beating the Department of Energy's benchmark for performance by 15 months. Second generation wire, when available in commercial quantities in the next three to four years, is expected to cost two to five times lower than first generation HTS wire and will significantly broaden the market for HTS-based products and applications. As a form-fit-function replacement for first generation wire, second generation will require no re-engineering of applications developed and commercialized using first generation wire.

What sort of future will higher temperature superconducting materials make possible? Jesse H. Ausubel, director of the Program for the Human Environment at The Rockefeller University in New York, has an article in The Industrial Physicist on one potential future application of higher temperature superconductors: the zero-emission power plant (ZEPP) and the Continental SuperGrid.

The ZEPP is a supercompact, superfast, superpowerful turbine putting out electricity and carbon dioxide (CO2) that can be sequestered. Investments by energy producers will make methane (natural gas) overtake coal globally as the lead fuel for making electricity over the next two to three decades. Methane tops the hydrocarbon fuels in heat value, measured in joules per kilogram, and thus lends itself to scaling up. Free of sulfur, mercury, and other contaminants of coals and oils, methane is the best hydrocarbon feedstock.

Ausubel quotes a source that expects ZEPP plants to boost methane-to-electric conversion efficiency from 55% to 70% and imagines a future of 5,000 MW and 10,000 MW methane-fueled electric power plants fed by oxygen purified from the atmosphere using cryogenic separation. He envisions power plants operating under such enormous pressures that the carbon dioxide by-product of combustion comes out in liquid form for easy capture and delivery to sequestration facilities. The whole article is pretty interesting. Though a competing argument can be made for the continued spread of smaller electric power generators for local generation and use of electricity. Knowledge Problem blogger Lynne Kiesling thinks distributed energy generation systems are a real possibility, especially if the regulatory environment can be changed to be more accommodating to them. As I've previously pointed out, this might ultimately lead all the way down to cars as distributed electric power generators.

By Randall Parker    2004 October 13 04:07 PM   Entry Permalink | Comments (4)
2004 October 08 Friday
Thousand Nuclear Reactors Could Hydrogen Power All Cars In America

Andrew Oswald, an economist at the University of Warwick, and his brother Jim claim that switching to hydrogen power for vehicles would require either covering half of California with wind turbines or building 1,000 nuclear reactors.

Converting every vehicle in the United States to hydrogen power would demand so much electricity that the country would need enough wind turbines to cover half of California or 1,000 extra nuclear power stations.

The Oswalds are making the argument that hydrogen isn't an easy solution to our energy problems. Fair enough. But could hydrogen play a role if we really thought we were better off ending our reliance on fossil fuels? Let us leave aside the fact that hydrogen has a lot of problems associated with it that its enthusiasts tend to ignore. Perhaps some day those problems will be solved. Or, given a non-fossil-fuel way to generate enough power for our cars, we could instead use that power to make synthetic hydrocarbons, or we could develop better battery technology. The more important question is whether we could get that power from somewhere if we really wanted to.

While I would oppose the construction of so many wind turbines on esthetic grounds, some might disagree. I'm not sure what the cost of all those wind turbines would be, but the 1,000 nuclear reactors are at least within the realm of the affordable. It is not clear what reactor size the Oswalds assumed in their calculation. But suppose they based it on the new and very large Westinghouse AP1000, an approximately 1,100 megawatt nuclear reactor. The cost for a pair is estimated to be about $2.2 to $2.7 billion. But if 1,000 of them were built it seems safe to assume that there'd be considerable economies of scale. So let us suppose the reactors would cost $1 billion each. Well, that is only $1 trillion to build 1,000 of them.

Put that $1 trillion in perspective. The US burns about 20 million barrels of oil per day, which at $50 per barrel is $1 billion per day, or $365 billion per year. Though much of that is not for cars. Still, is that $1 trillion affordable if we really needed to switch to nuclear? The United States has an $11 trillion a year economy. For a cost equaling slightly more than one month's economic production we could drastically cut our use of fossil fuels. So when people say we have no choice but to use fossil fuels, well, that just isn't true.
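The arithmetic above, worked through with the same round numbers (all of them assumptions from the text, not engineering estimates):

```python
# Cost of 1,000 reactors at an assumed $1 billion each after economies of scale.
reactors = 1000
cost_per_reactor = 1e9
total_cost = reactors * cost_per_reactor      # $1 trillion

# Annual US oil bill at 20 million barrels/day and $50/barrel.
oil_per_year = 20e6 * 50 * 365                # about $365 billion

# The reactor build-out as a share of an $11 trillion/year economy.
gdp_per_year = 11e12
months_of_output = total_cost / gdp_per_year * 12  # just over one month
print(total_cost / 1e12, oil_per_year / 1e9, round(months_of_output, 1))
```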

Granted, we couldn't convert to a nuclear economy in a year. We'd have to develop a number of supporting technologies and deploy them. It would take a couple of decades to make the full transition. Yet it really could be done.

There are problems with going the nuclear route. Waste disposal is a problem and is a large cost too. Operations and fuels are additional costs but much lower than construction costs. Securing so many nuclear reactors against terrorist attacks would be another substantial problem. Plus, increased use of nuclear power throughout the world would raise the risk of nuclear materials falling into the hands of terrorists.

Also, implementation of a massive nuclear reactor building program might be premature. Pebble Bed Modular Reactor technology could first be developed to provide a safer and cheaper nuclear option. Then PBMR reactors could be built instead. But even the current cost of nuclear power demonstrates that we do not absolutely need fossil fuels in order to maintain a modern industrial economy with fairly high living standards.

Nuclear power is also not the only energy alternative available that could totally displace fossil fuels. Another option would be to construct massive arrays of space-based solar photovoltaic panels, usually referred to as Space Solar Power Satellites (SSPS). Though it is harder to estimate what the costs would be of such an undertaking it seems safe to assume that an effort of that scale would create enough demand for space launch capabilities that space launch technologies would advance as a consequence of the demand for launch services. In conjunction with a space solar power project giant reflectors could be built in space to prevent global warming.

Ironically, while Hoffert’s team recommends harnessing the Sun’s energy from space, they also suggest blocking some of it, either with giant translucent shields or mirrors. About 2 percent of the Sun’s energy would need to be blocked in order to correct for climate-warming gas production. Such an effort is called geoengineering.

"For this application a sunshield or solar parasol would have to be very large (thousands of kilometers in diameter), possibly very thin, and possibly fabricated from lunar materials," Hoffert said. "At this point, space mirrors are more of a thought experiment than a real option."

We could build space-based solar power collection systems or space-based reflectors to cool the Earth. So we could either eliminate our need for fossil fuels or neutralize the warming effects of the continuing increase in atmospheric carbon dioxide due to fossil fuel burning.

Rather than either government spending or government mandates for private spending on massive non-fossil-fuel power systems, my own preference is for an increase in government funding of energy research, combined with government prizes offered for achievements in technologies that would help toward the development of alternative energy. Better to develop new technologies that the market will then choose to implement. A mandated alternative power source in the United States would be more costly than current energy sources and would still only reduce American demand for fossil fuels, while the demand of the rest of the world would grow to eventually far exceed today's world aggregate demand.

On the subject of prizes to advance energy technologies imagine, for example, a $1 million prize for every demonstrated single point increase in photovoltaic material conversion efficiency. The size of the prize per percentage point increase could even be scaled to provide larger prizes the higher the best existing efficiency becomes. So increasing from 25% to 26% conversion efficiency would not yield as big a prize as going from 50% to 51%.
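One way such a scaled schedule could work, sketched as code. The specific scaling rule (payout growing as the record approaches the 100% ceiling) and the $1 million base figure are illustrative assumptions only, not a real program:

```python
def prize_for_improvement(old_pct: int, new_pct: int,
                          base: float = 1_000_000) -> float:
    """Total prize for raising the record efficiency from old_pct to new_pct.

    Each single-point gain from p to p+1 pays the base amount scaled up
    by how close p already is to the 100% theoretical ceiling, so later
    points pay more than earlier ones."""
    return sum(base * 100 / (100 - p) for p in range(old_pct, new_pct))

# Going from 25% to 26% pays less than going from 50% to 51%.
print(prize_for_improvement(25, 26), prize_for_improvement(50, 51))
```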

The challenge with a prize system for advancing energy technologies would be to find a large and appropriate set of technological goals, each with a prize offered for its attainment. For example, prizes for better batteries would have to reward higher energy/weight density, greater total energy capacity, smaller size, and a larger number of recharge cycles the batteries can endure. There could not be just one prize for achieving a battery good enough to make electric cars feasible. We'd need lots of prizes to reward the reaching of intermediate points toward the ultimate goal.

The biggest problem with solar power is the cost. But to incentivize academic researchers to come up with better materials for making solar cells it would make more sense to leave aside the cost question and instead reward achievement of more scientific and technical goals such as new efficiency records for each of several different classes of materials. For example, separate rewards could be made for higher efficiencies of thin film carbon-based, silicon-based, nanotube-based, and other categories.

Update: Ergosphere blogger Engineer-Poet takes a look at the Oswald paper and argues that the Oswalds made some errors in their calculations and that the number of nuclear reactors needed to power cars isn't nearly as many as the Oswalds believe.

By Randall Parker    2004 October 08 11:28 PM   Entry Permalink | Comments (28)
2004 September 07 Tuesday
Electric Power Grid Vulnerabilities Explored

Reka Albert, an assistant professor of physics at Penn State, has led a team examining the national electrical grid of the United States for vulnerabilities, and her team has found that failures in a fairly small portion of the network can lead to a major disruption. (same article here)

"Our analysis indicates that major disruption can result from loss of as few as two percent of the grid's substations," says Albert, whose research team includes Istvan Albert, research associate in the Bioinformatics Consulting Center at Penn State, and Gary L Nakarado at the National Renewable Energy Laboratory. One implication of the research is that identification of strategic points in the grid system can enhance defense against interruptions, whether by equipment failure, natural disasters, or human activity. Major blackouts caused by failures in the grid, such as the one that affected the northeastern part of the country during the summer of 2003, incur tremendous economic, public-health, and security risks.

The study, titled "Structural Vulnerability of the North American Power Grid," was published in a recent issue of the journal Physical Review E. The researchers constructed a model of the entire transmission grid with over 14,000 "nodes," including generators, transmission substations, and distribution substations, and over 19,000 "edges," corresponding to the high-voltage transmission lines that carry power between the nodes. They measured the importance of each substation node based on its "load," or the number of shortest paths between other nodes that pass through it. "While 40 percent of the nodes had a load below one thousand, the analysis identified 1 percent of the nodes--approximately 140--that have a load higher than one million," Albert says.

...

However, the grid quickly becomes disconnected when the high-load transmission substations are selectively removed from the system--if the nodes that have the highest load are removed first, followed progressively by the nodes with successively lower loads. According to the model, a loss of only 4 percent of the 10,287 transmission substations results in a 60 percent loss of connectivity. During a cascading failure, in which the high-load substations fail in sequence, the model shows that the loss of only 2 percent of the nodes causes a catastrophic failure of the entire system.
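The load measure and the selective-removal experiment can be illustrated with a toy graph. This is a minimal sketch of the general idea, not the researchers' actual model (which counted shortest paths across more than 14,000 nodes); here a single hub substation joins two clusters, so it carries every inter-cluster shortest path and its loss splits the network:

```python
from collections import deque
from itertools import combinations

def shortest_path(graph, src, dst):
    """BFS shortest path from src to dst; returns a node list or None."""
    prev = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nbr in graph[node]:
            if nbr not in prev:
                prev[nbr] = node
                queue.append(nbr)
    return None

def node_loads(graph):
    """Load of a node = number of shortest paths (one per node pair)
    that pass through it as an intermediate stop."""
    loads = {n: 0 for n in graph}
    for a, b in combinations(graph, 2):
        path = shortest_path(graph, a, b)
        if path:
            for n in path[1:-1]:
                loads[n] += 1
    return loads

def largest_component(graph):
    """Size of the largest connected component (a connectivity proxy)."""
    seen, best = set(), 0
    for start in graph:
        if start in seen:
            continue
        comp, queue = {start}, deque([start])
        while queue:
            for nbr in graph[queue.popleft()]:
                if nbr not in comp:
                    comp.add(nbr)
                    queue.append(nbr)
        seen |= comp
        best = max(best, len(comp))
    return best

# A toy "grid": two clusters joined through a single hub substation.
grid = {
    "a": {"b", "hub"}, "b": {"a", "hub"},
    "hub": {"a", "b", "x", "y"},
    "x": {"hub", "y"}, "y": {"hub", "x"},
}
loads = node_loads(grid)
hub = max(loads, key=loads.get)  # the hub carries all inter-cluster paths
without_hub = {n: nbrs - {hub} for n, nbrs in grid.items() if n != hub}
print(hub, largest_component(grid), largest_component(without_hub))  # hub 5 2
```

Removing the single highest-load node cuts the largest connected piece from five nodes to two, which is the small-scale analog of the paper's finding that losing a few percent of high-load substations collapses grid connectivity.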

Whether regulation can be changed to allow market forces to make the electric grid less prone to massive failure is a question that Lynne Kiesling addresses on her blog. My guess is that the costs from failures haven't gotten large enough to overcome the status quo bias problem that Lynne has discussed recently. Also see her post Network Reliability As A Public Good And What To Do About It.

Advances in technologies such as fuel cells, cheaper batteries, and Vehicle to Grid will likely lead to much more capacity for local generation of electric power in the longer run. So grid reliability will become relatively less important than it is today.

By Randall Parker    2004 September 07 01:30 AM   Entry Permalink | Comments (7)
2004 August 26 Thursday
Two New Hydrogen Generation Techniques

A pair of University of Wisconsin researchers claim they have developed a more efficient and less polluting way to convert hydrocarbons into hydrogen.

Carbon monoxide, or CO, has long been a major technical barrier to the efficient operation of fuel cells. But now, chemical and biological engineers at UW-Madison have not only cleared that barrier - they also have discovered a method to capture carbon monoxide's energy.

To be useful in a power-generating fuel cell, hydrocarbons such as gasoline, natural gas or ethanol must be reformed into a hydrogen-rich gas. A large, costly and critical step to this process requires generating steam and reacting it with carbon monoxide (CO). This process, called water-gas shift, produces hydrogen and carbon dioxide (CO2). Additional steps then are taken to reduce the CO levels further before the hydrogen enters a fuel cell.

James Dumesic, professor of chemical and biological engineering, postdoctoral researcher Won Bae Kim, and graduate students Tobias Voitl and Gabriel Rodriguez-Rivera eliminated the water-gas shift reaction from the process, removing the need to transport and vaporize liquid water in the production of energy for portable applications.

The team, as reported in the Aug. 27 issue of Science, uses an environmentally benign polyoxometalate (POM) compound to oxidize CO in liquid water at room temperature. The compound not only removes CO from gas streams for fuel cells, but also converts the energy content of CO into a liquid that subsequently can be used to power a fuel cell.

Note that their focus is on the development of supporting technologies aimed at making portable fuel cells more practical. Their approach does not generate any energy and they need hydrocarbon fuels to start with. Still, conversion of hydrocarbons to hydrogen to burn in fuel cells might some day make cars more efficient in their use of liquid hydrocarbons. At the very least their approach might provide portable power sources for personal computers and other gadgets humans lug around.

With the aim of enabling solar power to be tapped as an economical source of energy, a pair of Australian scientists claim they will be able to build titanium oxide ceramics that use sunlight to split water and generate hydrogen.

Australian scientists predict that a revolutionary new way to harness the power of the sun to extract clean and almost unlimited energy supplies from water will be a reality within seven years.

Using special titanium oxide ceramics that harvest sunlight and split water to produce hydrogen fuel, the researchers say it will then be a simple engineering exercise to make an energy-harvesting device with no moving parts and emitting no greenhouse gases or pollutants.

It would be the cheapest, cleanest and most abundant energy source ever developed: the main by-products would be oxygen and water.

"This is potentially huge, with a market the size of all the existing markets for coal, oil and gas combined," says Professor Janusz Nowotny, who with Professor Chris Sorrell is leading a solar hydrogen research project at the University of New South Wales (UNSW) Centre for Materials and Energy Conversion. The team is thought to be the most advanced in developing the cheap, light-sensitive materials that will be the basis of the technology.

"Based on our research results, we know we are on the right track and with the right support we now estimate that we can deliver a new material within seven years," says Nowotny.

...

The UNSW team opted to use titania ceramic photoelectrodes because they have the right semiconducting properties and the highest resistance to water corrosion.

Solar hydrogen, Professor Sorrell argues, is not incompatible with coal. It can be used to produce solar methanol, which produces less carbon dioxide than conventional methods. "As a mid-term energy carrier it has a lot to say for it," he says.

Okay, seven years is a long way off, with obviously a number of technical problems yet to be solved. They haven't proved they can really make their approach work or that their materials will really turn out to be cheap to manufacture. Still, they might succeed.

We need many more teams in research labs working on materials to use solar power to generate hydrogen, electricity, and hydrocarbons (artificial photosynthesis). We also need more teams working on fuel cell technologies and materials for newer lighter types of batteries. Many battery and fuel cell technologies would allow fossil fuels to be burned more efficiently while also acting as enabling technologies for solar power by allowing energy captured by solar technologies to be stored.

By Randall Parker    2004 August 26 03:49 PM   Entry Permalink | Comments (8)
2004 August 17 Tuesday
Glass Coating Reflects More Heat At Higher Temperatures

A vanadium dioxide derivative with a precise amount of added tungsten automatically lets in more infrared light when the temperature is cold than it does when the temperature is warmer.

Soaring air conditioning bills or suffering in the sweltering heat could soon be a thing of the past, thanks to University College London chemists.

Reporting in the Journal of Materials Chemistry, researchers reveal they have developed an intelligent window coating that, when applied to the glass of buildings or cars, reflects the sun’s heat so you don’t get too hot under the collar.

While conventional tints block both heat and light, the coating, which is made from a derivative of vanadium dioxide, allows visible wavelengths of light through at all times but reflects infrared light when temperatures rise over 29 degrees Celsius. Wavelengths of light in this region of the spectrum cause heating, so blocking infrared reduces unwanted rays from the sun.

The coating’s ability to switch between absorbing and reflecting light means occupants benefit from the sun’s heat in cooler conditions but when temperatures soar room heating is reduced by up to 50 per cent.

Professor Ivan Parkin, of UCL’s Department of Chemistry and senior author of the paper, says:

“Technological innovations such as intelligent window coating really open the door to more creative design. The current trend towards using glass extensively in building poses a dilemma for architects. Do they tint the glass, which reduces the benefit of natural light or face hefty air conditioning bills?

Professor Parkin says the next item on their research agenda is to investigate the durability of the coating and also to change its color.

Better materials for windows, walls, ceilings, and doors could greatly reduce the amount of energy used for heating and cooling.

By Randall Parker    2004 August 17 06:35 PM   Entry Permalink | Comments (6)
2004 July 30 Friday
Will Electric Hybrid Cars Be Used As Peak Electric Power Sources?

Future hybrid cars may be used as "Vehicle To Grid" or V2G power sources to meet peak electric power demand.

But if automakers were to make 1 million next-generation V2G vehicles by 2020, they could generate up to 10,000 megawatts of electricity - about the capacity of 20 average-size power plants, according to a 2001 study by AC Propulsion, the electric vehicle maker in San Dimas, Calif., that created the V2G Jetta.

While vehicles could generate plenty of power - studies show they sit idle 90 percent of the time - it would be far too costly to use as simple "base-load" power. Their main value would be in supplying spurts of peak and other specialty "ancillary" power for which utilities pay premium prices. It would be far cheaper for utilities to tap the batteries of thousands of cars, say, than the current practice of keeping huge turbines constantly spinning just to supply power at a moment's notice, studies show.

With hybrids it wouldn't be necessary to start the engines of parked cars in order to use them as electric power sources. The hybrids have lots of batteries in them. If they plug in when stopped, part of their battery charge could be drained whenever a distributed computer network decided to switch them onto the grid. The switching could be fairly automated and used to deal with quick spikes in electric power demand.
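The study's projection implies roughly 10 kilowatts of grid-deliverable power per vehicle. A back-of-the-envelope check (fleet size and total capacity come from the quoted AC Propulsion study; the plugged-in share is my own illustrative guess, not a figure from the study):

```python
fleet_size = 1_000_000        # next-generation V2G vehicles projected by 2020
total_capacity_mw = 10_000    # total output cited in the AC Propulsion study

per_car_kw = total_capacity_mw * 1000 / fleet_size
print(per_car_kw)             # 10.0 kW of deliverable power per vehicle

# Only cars that are parked and plugged in can feed the grid at a given
# moment; the 50% share below is an assumption for illustration only.
plugged_in_share = 0.5
dispatchable_mw = total_capacity_mw * plugged_in_share
print(dispatchable_mw)        # 5000.0 MW available for peak spikes
```

Even the discounted figure is the equivalent of several large power plants of fast-responding capacity, which is why the premium "ancillary" power market is the natural fit.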

Next generation electric hybrids will have electric power generation costs that are too high to compete with large electric power plants for non-peak electric power uses. But advances in fuel cell technologies will eventually provide a way for cars to generate electricity more cheaply than internal combustion engines can. Whether those car fuel cells can ever compete with natural gas or coal fired electric power plants remains to be seen. For more on that possibility see my previous post Cars May Become Greater Electricity Generators Than Big Electric Plants.

Even if car electricity doesn't become cost competitive, expect car electric power generators to become emergency back-up power sources. If a big grid failure happens 10 or 20 years from now we won't have to wait while the long distance electric power lines and switching stations are repaired. People will just plug houses into cars.

Aside about hybrids on the street: Quite a number of people in Santa Barbara, California have Toyota Prius electric hybrid cars. I see them every day. One of the most curious aspects of the hybrids is that they are so quiet when running off of battery power. As a result I've had to adjust my behavior a bit. You can't rely as much on the absence of engine sound to know that a car is not coming when, say, doing a daily dog run up a street.

By Randall Parker    2004 July 30 02:20 AM   Entry Permalink | Comments (17)
2004 July 26 Monday
Humans Use A Fifth Of All Land Plant Materials

Dr. Marc L. Imhoff, a Principal Investigator in NASA's Carbon Cycle Science and Land Cover Land Use Change Programs, visiting scientist Lahouari Bounoua and colleagues have added up the human consumption of plant matter and found that globally humans are consuming 20% of the world's plant life.

NASA scientists working with the World Wildlife Fund and others have measured how much of Earth's plant life humans need for food, fiber, wood and fuel. The study identifies human impact on ecosystems.

Satellite measurements were fed into computer models to calculate the annual net primary production (NPP) of plant growth on land. NASA developed models were used to estimate the annual percentage of NPP humans consume. Calculations of domesticated animal consumption were made based on plant-life required to support them.

Marc Imhoff and Lahouari Bounoua, researchers at NASA's Goddard Space Flight Center (GSFC), Greenbelt, Md., and colleagues, found humans annually require 20 percent of NPP generated on land. Regionally, the amount of plant-based material used varied greatly compared to how much was locally grown.

Note that this analysis does not include the oceans. So human fishing activities as a percentage of all ocean-based biomass production are not included in the above analysis.

North America's lower latitudes and lower population density allow it to produce 9 times more carbon in plant matter than Europe. So even though North America consumes much more plant matter than Europe, humans in North America consume a smaller percentage of their local plant matter than Europeans do.

The effort resulted in a worldwide map of consumption that could be broken down to individual regions by population to compare how much an area such as North America, for example, consumes with the amount it can produce locally - about 24 percent of its annual plant production.

Highly populated Western Europe and South Central Asia, on the other hand, each consume 70 percent of the greenery they produce.

With a much larger population China already consumes more plant matter per year than North America even though China consumes only a fourth the amount per capita. Continued industrialization therefore seems likely to greatly increase world human demand for plant matter. This seems likely to result in rising timber prices and a shift toward the use of other types of building materials.

The bigger problem with rising Chinese demand and overall rising world demand is that it will decrease the amount of biomass available for wild animal life.

The rising demand for biomass also argues against a big role for biomass materials as future energy sources. Granted, some waste biomass could be converted to energy. But even wastes are potential food sources for bacteria, fungi, insects, fish, and other life forms. A perfectly efficient human-managed biomass cycle is going to squeeze out other life forms that rely upon plants, dead animals, and wastes as food sources.

Also see Structures In United States Cover Area Equal To Ohio.

By Randall Parker    2004 July 26 01:50 PM   Entry Permalink | Comments (6)
2004 June 01 Tuesday
Propane Drives Turbine To Harness Waste Heat, Reduce Pollution

Currently almost two thirds of the heat produced by burning coal and natural gas for electricity generation is wasted.

When steam is used to turn a generator, it must be pressurised and raised to around 650 °C. Below 450 °C, the process no longer operates efficiently because the steam pressure drops too low. This means that the heat in flue gases below 450 °C cannot be used to generate electricity, and so is lost to the atmosphere.

Two engineers have developed a mechanism using a pair of heat exchangers and propane cycling between liquid and vapor states to drive a turbine to generate more electricity from heat that is currently wasted.

But now Daniel Stinger, a turbine engineer, and Farouk Mian, a petroleum engineer, have developed a surprisingly simple way to harness almost all this waste heat. They calculate that a second turbine, driven by the waste heat from the first, would capture almost all the remaining energy. The first turbine's waste heat would vaporise and pressurise still more propane to drive the second (see diagram).

Daniel Stinger and Farouk Mian have founded Wow Energies and have a patent pending on their invention.

A new patent pending technology is available which replaces the steam turbine system with a Cascading Closed Loop Cycle (CCLC) system producing an increase in MW output of 150% to 600% over a steam turbine system operating at the same heat source temperatures. The CCLC can also be installed to operate in conjunction with an existing steam turbine system to increase the output by 100% without using additional fuel, the Super CCLC system. If the CCLC turbine system had been installed in place of or with the steam turbines in use today, it is estimated that the U.S. economy could save $100 billion in fuel costs annually. The savings to the economy if the CCLC technology is used to retrofit existing units is conservatively estimated at $35 billion annually.

Other industry processes are equally inefficient. Industries that depend on burning fossil fuels in boilers, furnaces, ovens, kilns, gas turbines, internal combustion (IC) engines, fuel cells, nuclear power plants, etc. all produce equivalent losses in the form of waste heat exhausted to the atmosphere. Major industries, in addition to the power generation industry, which can benefit from the CCLC technology include refining, petrochemical, transportation, cement, pulp & paper, metals and pharmaceutical.

Even more dramatic are the corresponding environmental benefits of conservation of non-renewable fuel resources and dramatic reductions or elimination of emissions when installed on an existing waste heat source.

The CCLC system uses off-the-shelf components. The three (3) major components are a pump, heat exchanger and turbo-expander (turbines) that are readily available from numerous suppliers. For example, both axial and centrifugal turbo-expanders are used extensively in the petrochemical and oil & gas industries and are readily available from suppliers such as GE, Atlas Copco, Mafi Trench, Mitsubishi, Siemens and MAN.

Less fossil fuel burned to generate electric power translates into fewer pollutants released across the board. But there is an additional benefit of their approach. By converting more of the heat into electricity they lower the temperature of exhaust air, and that causes many pollutants to condense into liquids and solids instead of being released into the atmosphere.

The CCLC system is so efficient that during the process of converting waste heat to power, it reduces the flue gas temperature to near ambient where conditions are favorable for elimination of pollutants. At these temperatures, vaporized pollutants such as Mercury, Vanadium, Lead, Cadmium, as well as Vaporized Organic Compounds (VOC), can no longer exist in a vaporized state and are “forced” to condense out of the flue gas as a liquid or solid. The remaining SOx and NOx can be removed using a low temperature Final Flue Gas Cleanup (FFGC) system by circulating a dilute water solution of sodium hydroxide and hydrogen peroxide in a scrubber that reacts with any remaining SOx and NOx to form stable salt solutions. The dilute solution also serves to remove PM2.5 and PM10 particulates, returning the flue gas to the environment in a pristine state. Low temperature scrubbers are commonly used in the petrochemical and pharmaceutical industries where they must totally prevent far more dangerous pollutants from entering the environment. Any pollutants escaping their plants would be instantly destructive, whereas the pollutants noted above only slowly but surely damage the environment and destroy our health.

Lower costs and less pollution are double wins for their invention.

The CCLC system uses off-the-shelf components to generate electricity by recovering the trillions of BTUs discharged hourly to the environment in the form of 300 °F to 700 °F waste heat. Instead of vaporizing water to produce steam to drive a steam turbine, the CCLC process vaporizes propane to drive turbo-expanders in a sealed closed loop system. The propane is identical to that used in back-yard grills for cooking, stored in tanks for heating homes, and as a clean fuel for cars, trucks and other vehicles. Propane is not consumed in the process and serves only as the medium to convert thermal energy to mechanical energy; requiring only 130 Btu/lb to vaporize versus 1000 Btu/lb for water. More importantly, propane will vaporize and absorb superheat at low ambient temperatures – not possible with water. Turbo-expanders have been used for decades throughout industry to expand vaporized hydrocarbons, including propane, to produce electrical power. The uniqueness of the CCLC patent pending system is the use of twin turbo-expanders and multiple heat exchangers, in a parallel/series arrangement, resulting in conversion of nearly all the temperature from the heat source to electrical power.
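The heat-of-vaporization figures quoted above (130 Btu/lb for propane versus roughly 1,000 Btu/lb for water) are the heart of the efficiency claim. A quick comparison of how much working fluid a given slug of low-grade waste heat can vaporize:

```python
H_VAP_PROPANE = 130.0   # Btu/lb to vaporize propane, from the quote
H_VAP_WATER = 1000.0    # Btu/lb to vaporize water, approximate

waste_heat_btu = 1_000_000.0   # an arbitrary slug of low-grade waste heat
lbs_propane = waste_heat_btu / H_VAP_PROPANE
lbs_steam = waste_heat_btu / H_VAP_WATER

# The same heat vaporizes ~7.7x more propane than water by weight, and
# propane vaporizes at temperatures where steam cycles stop working.
print(round(lbs_propane / lbs_steam, 1))   # 7.7
```

The ratio alone doesn't prove the claimed output gains, since cycle efficiency also depends on the temperatures and pressures involved, but it shows why a propane loop can run on exhaust heat that a steam turbine must throw away.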

If this turns out to work then consider the implications. Rather than building new electric power generation plants, existing plants could be outfitted to generate more electricity from the same amount of fuel. Even nuclear plants could have their electric power output boosted. Plus, the CCLC system could be hooked up to all sorts of industrial processes in other industries to provide yet more electric power and less pollution to boot.

By Randall Parker    2004 June 01 05:32 PM   Entry Permalink | Comments (16)
2004 May 11 Tuesday
Improved Gasoline To Hydrogen Converter For Cars

One problem holding back the use of hydrogen to supply fuel cells in cars is that there is no good way to store hydrogen onboard, and conversion to a hydrogen distribution infrastructure would be very expensive. Yet hydrogen burns very efficiently. Looking for a way around the limitations of hydrogen as a storage medium while still capturing some of its environmental and efficiency gains as a fuel, researchers at the Pacific Northwest National Laboratory have found a faster way to convert the hydrocarbons in gasoline into pure hydrogen.

RICHLAND, Wash. — Researchers at the Department of Energy's Pacific Northwest National Laboratory are developing a system to rapidly produce hydrogen from gasoline in your car. "This brings fuel cell-powered cars one step closer to the mass market," said Larry Pederson, project leader at PNNL. Researchers will present their developments at the American Institute for Chemical Engineers spring meeting in New Orleans, on April 27th, 2004.

Fuel cells use hydrogen to produce electricity which runs the vehicle. Fuel cell-powered vehicles get about twice the fuel efficiency of today's cars and significantly reduce emissions. But how do you "gas up" a hydrogen car? Instead of building a new infrastructure of hydrogen fueling stations you can convert or reform gasoline onboard the vehicle. One approach uses steam reforming, in which hydrocarbon fuel reacts with steam at high temperatures over a catalyst. Hydrogen atoms are stripped from water and hydrocarbon molecules to produce hydrogen gas.

The problem has been that you have to wait about 15 minutes before you can drive. It has taken steam reformer prototypes that long to come up to temperature to begin producing hydrogen to power the vehicle. This delay is unacceptable to drivers.

However, PNNL has demonstrated a very compact steam reformer which can produce large amounts of hydrogen-rich gas from a liquid fuel in only 12 seconds. "This kind of fast start was thought to be impossible until just a couple of years ago," said Pederson.

The Department of Energy recognized that a fast start was vital to the viability of onboard fuel processing and established an ultimate goal of 30 seconds for cold start time with an intermediate target of 60 seconds by 2004. The steam reformer is the highest temperature component within the fuel processor and represents the biggest hurdle to achieving rapid startup. "Hence, the PNNL achievement of a 12 second steam reformer startup is a big step towards a complete fuel processor which can start up in 30 seconds," said Greg Whyatt, the project's lead engineer.

PNNL engineers called upon their expertise in microtechnology to develop the reforming reactor. Microchannels, narrower than a paper clip, provide high rates of heat and mass transport within the reactor. This allows significantly faster reactions and dramatically reduces the size of the reactor. A complete microchannel fuel processor for a 50 kilowatt fuel cell is expected to be less than one cubic foot. At this size, the system will readily fit into an automobile.

"The key feature of the new design is that the reforming reactor and water vaporizer are configured as thin panels with the hot gases flowing through the large surface area of the panel," said Whyatt. This allows high gas flows to be provided with an inexpensive, low-power fan while still providing efficient heat transfer to rapidly heat the steam reformer.

"In addition, the panel configuration allows higher combustion temperatures and flows without risking damage to the metal structure while a low pressure drop reduces the electrical power consumed by the fan during startup and steady operation" said Whyatt.

PNNL researchers are now working to reduce the fuel consumption and air flow required during startup. In addition, integration with other components is needed to demonstrate a complete fuel processor system that can achieve startup in less than 30 seconds. However, PNNL's fuel reformer technology appears to have overcome a major stumbling block for onboard reformation: the need for speed.

Converting the hydrocarbons in the gasoline to hydrogen would allow both a less polluting burn and a more efficient burn.

In my view too many future energy scenarios neglect the advantages and future potential of continued use of liquid chemical fuels. We do not have batteries or methods of storing hydrogen that compare to the density and ease of use of liquid hydrocarbons. Even if global warming is a serious problem that must be dealt with, that is not necessarily a reason to abandon liquid hydrocarbons. Better catalysts for artificial photosynthesis (which would, parenthetically, create an artificial carbon cycle that would stop the rise of carbon dioxide in the atmosphere), combined with more efficient ways of burning liquid hydrocarbon fuels, may some day become a cost competitive set of technologies for gradually reducing and eventually eliminating our reliance on fossil fuels. The burning of liquid hydrocarbons using emerging technologies such as hydrogen reformers promises to increase fuel efficiency while simultaneously reducing emissions. This increased efficiency will be gained regardless of whether the liquid fuel comes from fossil hydrocarbons or from synthetic liquid hydrocarbons produced using solar energy or energy generated by nuclear plants.

Combine the conversion of gasoline to hydrogen in the car with continuing advances in hybrid car technologies and use of liquid fuels may well continue to have a bright future. Proposals for a massive and incredibly expensive conversion to a pure hydrogen energy economy ought to be compared to the possibilities for continued development of a much higher tech and environmentally cleaner liquid hydrocarbon future.

By Randall Parker    2004 May 11 02:07 PM   Entry Permalink | Comments (9)
2004 March 04 Thursday
Method To Do Desktop Fusion Discovered?

Scientists at Oak Ridge National Laboratory may have found a cheap way to cause hydrogen atoms to fuse. (same article here)

The researchers expose the clear canister of liquid to pulses of neutrons every five milliseconds, or thousandths of a second, causing tiny cavities to form. At the same time, the liquid is bombarded with a specific frequency of ultrasound, which causes the cavities to form into bubbles that are about 60 nanometers – or billionths of a meter – in diameter. The bubbles then expand to a much larger size, about 6,000 microns, or millionths of a meter – large enough to be seen with the unaided eye.

"The process is analogous to stretching a slingshot from Earth to the nearest star, our sun, thereby building up a huge amount of energy when released," Taleyarkhan said.

Within nanoseconds these large bubbles contract with tremendous force, returning to roughly their original size, and release flashes of light in a well-known phenomenon known as sonoluminescence. Because the bubbles grow to such a relatively large size before they implode, their contraction causes extreme temperatures and pressures comparable to those found in the interiors of stars. Researchers estimate that temperatures inside the imploding bubbles reach 10 million degrees Celsius and pressures comparable to 1,000 million earth atmospheres at sea level.

At that point, deuterium atoms fuse together, the same way hydrogen atoms fuse in stars, releasing neutrons and energy in the process. The process also releases a type of radiation called gamma rays and a radioactive material called tritium, all of which have been recorded and measured by the team. In future versions of the experiment, the tritium produced might then be used as a fuel to drive energy-producing reactions in which it fuses with deuterium.

Whereas conventional nuclear fission reactors produce waste products that take thousands of years to decay, the waste products from fusion plants are short-lived, decaying to non-dangerous levels in a decade or two. The desktop experiment is safe because, although the reactions generate extremely high pressures and temperatures, those extreme conditions exist only in small regions of the liquid in the container – within the collapsing bubbles.

The ability to sustain nuclear fusion could provide a way to produce enormous quantities of energy. If this could be done very cheaply then the age of fossil fuels would come to an end.
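To get a feel for the scale of the bubble dynamics quoted above: growing from 60 nanometers to 6,000 microns in diameter is a factor of 100,000, which is a factor of 10^15 in volume. A quick check of the arithmetic:

```python
initial_diameter_m = 60e-9    # 60 nanometers, as formed by the neutron pulse
peak_diameter_m = 6000e-6     # 6,000 microns at maximum expansion

linear_ratio = peak_diameter_m / initial_diameter_m
volume_ratio = linear_ratio ** 3
print(f"{linear_ratio:.0e}")  # 1e+05  (100,000-fold in diameter)
print(f"{volume_ratio:.0e}")  # 1e+15  (a quadrillion-fold in volume)
```

That enormous volume ratio is what makes the "slingshot" analogy apt: the collapse concentrates the stored energy of a macroscopic bubble into a nearly point-sized region.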

The research paper reporting this work went through intense review before being approved for publication.

Although no one has tried repeating the latest work, Lee Riedinger, deputy director for science and technology at Oak Ridge, says that it went through an "extraordinary level of review" before being accepted for publication by Physical Review E.

By the standards of plasma physics research the money needed to try to repeat this experiment is peanuts.

For decades, physicists have dreamed of harnessing the ferocious alchemy of the Sun as a clean, limitless energy source. Most experiments have been conducted in giant, expensive reactors using magnetic fields to confine the ultrahot gases.

In contrast, the new experiment, which cost less than $1 million, uses the power of sound to create energy comparable to the inside of stars.

Hopefully granting agencies will allocate the money needed for other labs to check this result. If this result holds up then our future could take a really big turn.

By Randall Parker    2004 March 04 03:13 AM   Entry Permalink | Comments (3)
2004 February 29 Sunday
Ocean Gas Hydrate Estimates Seen As Too High

Previous estimates of natural gas bound up in ocean floor hydrates may be too high.

One widely cited estimate proposes that 10,000 gigatonnes (Gt) of methane carbon is bound up as hydrate on the ocean floor.

But Dr Alexei Milkov of BP America says his research shows reserves are between 500 and 2,500 Gt, a significantly smaller figure than has been previously estimated.

Gas hydrates are still very expensive to extract from the ocean floor.

"Drilling gas hydrates is estimated to be six times more expensive than exploitation of oil and other gas sources," said Prof Bahman Tohidi, director of the Centre for Gas Hydrate Research in Edinburgh.

Even the lower estimate is still a huge amount of energy. To put it in perspective, from 1850 through 2000 the total amount of natural gas burned in the world was only 61 gigatonnes measured in oil equivalent weight. (I am not sure how that compares to the gigatonnes figures above, though.) Gas hydrates researcher Anne Trehu at Oregon State University says previous models may be too high in some cases but there is still a lot of gas hydrate methane in the oceans.
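As a rough check on how the hydrate figures stack up against historical gas consumption, here is a back-of-envelope conversion from gigatonnes of methane carbon to gigatonnes of oil equivalent. The heating values are my own approximate assumptions, not figures from the article:

```python
# Convert hydrate estimates given in gigatonnes of methane *carbon*
# into approximate gigatonnes of oil equivalent, for comparison with
# the ~61 Gt oil-equivalent of gas burned worldwide from 1850 to 2000.
# Heating values below are rough assumptions, not from the article.

CH4_PER_C = 16.0 / 12.0   # mass ratio of methane (CH4) to its carbon content
LHV_CH4 = 50.0            # MJ/kg, approximate lower heating value of methane
LHV_OIL = 42.0            # MJ/kg, approximate oil-equivalent heating value

def hydrate_oil_equivalent_gt(gt_methane_carbon):
    """Gigatonnes of methane carbon -> gigatonnes of oil equivalent."""
    gt_methane = gt_methane_carbon * CH4_PER_C
    return gt_methane * LHV_CH4 / LHV_OIL

low = hydrate_oil_equivalent_gt(500)     # Milkov's low estimate
high = hydrate_oil_equivalent_gt(2500)   # Milkov's high estimate
print(round(low), round(high))           # roughly 794 and 3968 Gt oil-equivalent
```

On these assumptions even the low estimate is more than ten times all the natural gas burned between 1850 and 2000, which supports the point that the downward revision still leaves an enormous resource.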

Trehu and her colleagues have found that some widely cited previous models estimating the total mass of methane trapped in marine sediments are probably too high.

On the other hand, some local concentrated deposits may be larger than previously thought. “There is still a lot of methane out there, even if the models were wrong,” Trehu said.

Arthur H. Johnson, Chairman and Chief Executive Officer of Hydrate Energy International, presented Congressional testimony in June 2003 on the potential of gas hydrates as an energy source.

Gas hydrate is a crystalline substance composed of gas and water. It forms when water and natural gas combine under conditions of moderately high pressure and low temperature. If gas hydrate is either warmed or depressurized it will revert back to water and natural gas, a process termed “dissociation”. Natural gas is concentrated in hydrate so that the dissociation of a cubic foot of hydrate will yield 0.8 cubic feet of water and approximately 160 cubic feet of natural gas. The conditions where hydrates occur are common in sediments off the coasts of the United States in water depths greater than approximately 1600 feet and at shallower depths in sediments associated with deep permafrost in the Arctic. Preliminary investigations indicate that considerable volumes of gas hydrate are present in at least some of these areas.
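The expansion arithmetic in the testimony above can be sketched directly. The yield figures come from the quoted passage; the function itself is just illustrative:

```python
# Dissociation yields from the Congressional testimony quoted above:
# one cubic foot of hydrate releases about 0.8 cubic feet of water
# and about 160 cubic feet of natural gas.

WATER_YIELD = 0.8    # cubic feet of water per cubic foot of hydrate
GAS_YIELD = 160.0    # cubic feet of natural gas per cubic foot of hydrate

def dissociation_products(hydrate_ft3):
    """Return (water_ft3, gas_ft3) released by a given hydrate volume."""
    return hydrate_ft3 * WATER_YIELD, hydrate_ft3 * GAS_YIELD

water, gas = dissociation_products(1000.0)   # a hypothetical 1,000 ft^3 deposit
print(water, gas)                            # 800.0 160000.0
```

That 160-fold gas concentration is why even modest hydrate volumes translate into large natural gas resources.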

The total volume of gas hydrate in the United States is not known, although the results of a wide variety of investigations conducted over the past thirty years indicate that the volume is very large, on the order of hundreds of thousands of TCF. More important, however, is the amount of hydrate that can be commercially recovered. Characterization of hydrate resources that has been carried out, for example in the MacKenzie Delta of Canada, the North Slope of Alaska, offshore Japan, and elsewhere indicate that the total in less explored areas of the U.S. hydrate province is likely in the range of many thousands of TCF.

Gas hydrate investigations have been undertaken by many Federal agencies during the past 30 years. These include the U.S. Geological Survey, Naval Research Laboratory, National Science Foundation, and Department of Energy. The Methane Hydrate Research and Development Act of 2000 initiated a new program to study several aspects of gas hydrates, including seafloor stability, global climate change, and the potential of gas hydrate as a commercial resource. The resource target has been for production in the year 2020. Funding for the new program, which is managed by the DOE, has typically been on the order of $10 million per year.

Given the potential of gas hydrates as a huge energy source, the $10 million per year spent on research by the US government strikes me as chump change. The United States spends tens of billions more on the military than it would if it were not dependent on Middle Eastern oil. Therefore basic research in alternative energy sources ought to be funded at a level commensurate with how much Middle Eastern oil costs us in defense spending, in aid spent in the region to achieve foreign policy goals, and in increased homeland defense spending against the threat of terrorism.

For national security arguments on why energy research should be accelerated see my previous ParaPundit posts Intervention In Liberia Linked To Oil Dependency and Michael Scott Doran: The Saudi Paradox and China Energy Consumption Growth Complicates Anti-Terrorist Efforts.

Also see my previous post Natural Gas May Be Extractable From Ocean Gas Hydrates.

By Randall Parker    2004 February 29 03:55 PM   Entry Permalink | Comments (0)
2004 January 28 Wednesday
Cars May Become Greater Electricity Generators Than Big Electric Plants

Since cars use more energy to move than houses use for electricity, a car powered by a hydrogen fuel cell that generates electricity to run an electric motor would have the capacity to supply all the power a house would need.

Another possibility that comes from such a system is the homeowner's ability to power the house from a fuel-cell vehicle. The fuel cell in a typical fuel-cell vehicle would have an output power from 25 kW to more than 100 kW. Because the average home only uses between 2 and 10 kW of electricity, it would be possible to "plug" the car into the home to provide power from the fuel cell using the hydrogen stored on the vehicle.

Of course, to make this work we first need fuel cells that are cheap enough and light and durable enough to serve as power sources for cars. But we would also need a way to store hydrogen in a dense enough form to make hydrogen a viable mobile power storage source. Even if a way to store hydrogen in a dense form in vehicles could be found we'd still face the need for a power source to use to generate the hydrogen in the first place.

In spite of these big caveats about the serious problems hydrogen faces as an energy storage form, the idea that a car could generate enough power to run a few houses is appealing. In fact, the use of fuel cells to generate home electric power does not have to depend on hydrogen as an energy storage form. Advances in Solid Oxide Fuel Cells (also see here and here) show promise for the ability to burn fossil fuels to generate electricity more efficiently than gas turbines currently can. If fossil fuel-burning solid oxide fuel cells become competitive for use in vehicles then most people will come to own vehicles that can generate more electricity than they need to run their homes.

Whether burning fuel in those vehicles (or in a smaller fuel cell attached to the house) to power a house can be done more cheaply than in large centralized electric power plants remains to be seen. The potential exists because fuel cells can convert fossil fuels to electricity more efficiently than current large electric power plants do. Plus, energy losses in electric power lines could be avoided by generating electricity much closer to where it is used. At the very least, fossil fuel-burning fuel cells ought to make central power outages less of a concern to anyone who outfits their house with a connector for plugging in their car to run the house.
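The quoted kilowatt figures can be put side by side in a quick sketch. The 25-100 kW and 2-10 kW ranges are from the quoted passage; the function is just illustrative arithmetic:

```python
# Comparing the quoted fuel-cell vehicle output (25-100 kW) against the
# quoted average home demand (2-10 kW) to see how many homes one
# vehicle's fuel cell could carry.

def homes_powered(fuel_cell_kw, home_demand_kw):
    """Whole number of homes at a given demand one vehicle could supply."""
    return fuel_cell_kw // home_demand_kw

print(homes_powered(25, 10))    # worst case (small fuel cell, hungry home): 2
print(homes_powered(100, 2))    # best case (big fuel cell, frugal home): 50
```

Even the worst-case combination leaves the vehicle able to run more than one house, which is the point of the quoted passage.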

Update: As yet a suitable way to store hydrogen for use in cars has not been developed. Some University of Chicago researchers have just demonstrated that hydrogen can be turned into a clathrate that will remain stable at normal atmospheric pressure.

University scientists have proposed a new method for storing hydrogen fuel in a paper that appeared in the Monday, Jan. 5 to Friday, Jan. 9 online edition of the Proceedings of the National Academy of Sciences.

The lack of practical storage methods has hindered the more widespread use of hydrogen fuels, which are both renewable and environmentally clean. The most popular storage methods—liquid hydrogen and compressed hydrogen—require that the fuel be kept at extremely low temperatures or high pressures. But the University’s Wendy Mao and David Mao have formed icy materials made of molecular hydrogen that require less stringent temperature and pressure storage conditions.

“This new class of compounds offers a possible alternative route for technologically useful hydrogen storage,” said Russell Hemley, Senior Staff Scientist at the Geophysical Laboratory of the Carnegie Institution of Washington. The findings also could help explain how hydrogen becomes incorporated in growing planetary bodies, he said.

The father-daughter team synthesized compounds made of hydrogen and water, hydrogen and methane, and hydrogen and octane in a diamond-anvil cell, which researchers often use to simulate the high pressures found far beneath Earth’s surface. The hydrogen and water experiments produced the best results. “The hydrogen-water system has already yielded three compounds, with more likely to be found,” said Wendy Mao, a graduate student in Geophysical Sciences.

The compound that holds the most promise for hydrogen storage, called a hydrogen clathrate hydrate, was synthesized at pressures between 20,000 and 30,000 atmospheres and temperatures of minus 207 degrees Fahrenheit. More importantly, the compound remains stable at atmospheric pressure and a temperature of minus 320 degrees Fahrenheit, the temperature at which liquid nitrogen boils.
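Converting the quoted storage conditions out of Fahrenheit makes the liquid-nitrogen claim easy to sanity-check (the conversion formulas are standard; the temperatures are from the quoted passage):

```python
# Convert the quoted clathrate temperatures to Celsius and kelvin as a
# sanity check that -320 F is indeed near the boiling point of liquid
# nitrogen (about 77.4 K).

def f_to_c(f):
    """Fahrenheit to Celsius."""
    return (f - 32.0) * 5.0 / 9.0

def f_to_k(f):
    """Fahrenheit to kelvin."""
    return f_to_c(f) + 273.15

print(round(f_to_c(-207), 1))   # synthesis temperature: -132.8 C
print(round(f_to_k(-320), 1))   # storage temperature: 77.6 K
```

The storage temperature comes out at about 77.6 K, consistent with the article's statement that liquid nitrogen suffices to keep the clathrate stable.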

“We thought that would be economically very feasible. Liquid nitrogen is easy and cheap to make,” Wendy Mao said.

The hydrogen in a clathrate can be released when heated to 207 degrees Fahrenheit. The clathrate’s environmentally friendly byproduct is water.

The unanswered question here is: how much energy does it take to convert hydrogen into a clathrate? Also, if the hydrogen has to be heated to release it from the clathrate then how much energy is required to do that?

By Randall Parker    2004 January 28 12:23 AM   Entry Permalink | Comments (15)
2003 December 02 Tuesday
New Superconducting Power Cables Stronger And Cheaper

Next generation superconducting power cables will be able to replace copper cables while increasing the capacity of underground conduits.

New research from the National Institute of Standards and Technology (NIST) suggests that next-generation, high-temperature superconductor (HTS) wire can withstand more mechanical strain than originally thought. As a result, superconductor power cables employing this future wire may be used for transmission grid applications. Projected to become available in three to four years, the advanced superconductor wire (known in the industry as second generation HTS wire) is expected to cost less than the HTS wire used in today's superconductor power cables. The NIST research is described in the Nov. 17 issue of Applied Physics Letters.

Superconductor power cables can carry three to five times the power of conventional copper cables. Compact, underground superconductor cables can be used to expand capacity and direct power flows at strategic points on the electric power grid and can be used in city centers where there is enormous demand, but little space under the streets for additional copper cables. One important challenge in using this next-generation HTS wire in such applications is the need for sufficient strength and resiliency to withstand the stretching and bending that occurs during power cable fabrication and installation.

Using superconductor ceramic coatings on metallic substrates fabricated by American Superconductor Corp. and Oak Ridge National Laboratory, the NIST researchers tested the material's electromechanical properties. According to lead author Najib Cheggour, they found that these advanced wires could stretch almost twice as much as previously believed without any cracking of the superconductor coating and with almost no loss in the coating's ability to carry electricity.

Moreover, the NIST team found that strain-induced degradation of the superconductors' ability to carry electricity is reversible up to a certain critical strain value. That is, the materials return to their original condition once the strain is relieved. The strain tolerance of this future HTS wire was found to be high enough for even the most demanding electric utility applications. The discovered reversible strain effect also opens new opportunities for better understanding of the mechanisms governing the conduction of electricity in this class of superconductors.

I just love it when better living is made possible by advances in materials science.

By Randall Parker    2003 December 02 02:30 PM   Entry Permalink | Comments (6)
2003 October 25 Saturday
Plasmatron May Boost Internal Combustion Engine Efficiency

A plasmatron is a device that can convert gasoline and diesel fuel into hydrogen. Hydrogen can be used in diesel engines to reduce nitrogen oxides (NOx) emissions.

The researchers and colleagues from industry report that the plasmatron, used with an exhaust treatment catalyst on a diesel engine bus, removed up to 90 percent of nitrogen oxides (NOx) from the bus’s emissions. Nitrogen oxides are the primary components of smog.

The plasmatron reformer also cut in half the amount of fuel needed for the removal process. “The absorption catalyst approach under consideration for diesel exhaust NOx removal requires additional fuel to work,” explained Daniel R. Cohn, one of the leaders of the team and head of the Plasma Technology Division at MIT's Plasma Science and Fusion Center (PSFC). “The plasmatron reformer reduced that amount of fuel by a factor of two compared to a system without the plasmatron.”

In gasoline engines the use of plasmatrons could boost car fuel efficiency by 20 percent.

"If widespread use of plasmatron hydrogen-enhanced gasoline engines could eventually increase the average efficiency of cars and other light-duty vehicles by 20 percent, the amount of gasoline that could be saved would be around 25 billion gallons a year," Cohn said. "That corresponds to around 70 percent of the oil that is currently imported by the United States from the Middle East."
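One subtlety in the quoted numbers: a 20 percent efficiency gain does not save 20 percent of the fuel, because the same miles are driven on 1/1.2 of the fuel. Working backward from the quoted 25 billion gallon figure gives an implied baseline consumption; that baseline is my own inference, not a figure from the article:

```python
# A 20% efficiency improvement lets the same miles be driven on 1/1.2 of
# the fuel, a ~16.7% saving. Inverting the quoted 25 billion gallons/yr
# savings gives the implied baseline gasoline consumption (my inference).

EFFICIENCY_GAIN = 0.20
QUOTED_SAVINGS_GAL = 25e9   # gallons/yr, from the quote above

fuel_fraction_saved = 1.0 - 1.0 / (1.0 + EFFICIENCY_GAIN)
implied_baseline = QUOTED_SAVINGS_GAL / fuel_fraction_saved

print(round(fuel_fraction_saved, 4))   # 0.1667
print(round(implied_baseline / 1e9))   # ~150 billion gallons/yr
```

An implied baseline of roughly 150 billion gallons per year is in the right ballpark for total US light-duty gasoline use of that era, so the quoted savings figure looks internally consistent.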

The Bush administration has made development of a hydrogen-powered vehicle a priority, Heywood noted. "That's an important goal, as it could lead to more efficient, cleaner vehicles, but is it the only way to get there? Engines using plasmatron reformer technology could have a comparable impact, but in a much shorter time frame," he said.

"Our objective is to have the plasmatron in production—and in vehicles—by 2010," Smaling said. ArvinMeritor is working with a vehicle concept specialist company to build a proof-of-concept vehicle that incorporates the plasmatron in an internal combustion engine. "We'd like to have a driving vehicle in one and a half years to demonstrate the benefits," Smaling said.

In the meantime, the team continues to improve the base technology. At the DEER meeting, Bromberg, for example, reported cutting the plasmatron's consumption of electric power "by a factor of two to three."

Lots of small refinements to gasoline engine vehicles could cumulatively boost fuel efficiency quite substantially. Compressed air, compressed hydraulic fluid, and other powertrain design ideas also show promise. Also see the related previous post Hydrogen Not Good Short, Medium Term Form Of Fuel.

By Randall Parker    2003 October 25 03:59 PM   Entry Permalink | Comments (10)
2003 September 12 Friday
BNL Scientists Develop Bacteria To Make Methane From Coal

A couple of years ago, two Brookhaven National Laboratory scientists developed bacteria to recover methane from coal in a more environmentally friendly manner.

NEW YORK, NY — Scientists at the U.S. Department of Energy’s Brookhaven National Laboratory are exploring the use of bacteria to increase the recovery of methane, a clean natural gas, from coal beds, and to decontaminate water produced during the methane-recovery process.

Methane gas, which burns without releasing sulfur contaminants, is becoming increasingly important as a natural gas fuel in the U.S. But the process of recovering methane, which is often trapped within porous, unrecovered or waste coal, produces large amounts of water contaminated with salts, organic compounds, metals, and naturally occurring radioactive elements. “Our idea is to use specially developed bacteria to remove the contaminants from the wastewater, and also help to release the trapped methane,” says Brookhaven chemist Mow Lin.

Lin’s team has developed several strains of bacteria that can use coal as a nutrient and adsorb or degrade contaminants. They started with natural strains already adapted to extreme conditions, such as the presence of metals or high salinity, then gradually altered the nutrient mix and contaminant levels and selected the most hardy bugs (see details).

In laboratory tests, various strains of these microbes have been shown to absorb contaminant metals, degrade dissolved organics, and break down coal in a way that would release trapped methane. The use of such microbe mixtures in the field could greatly improve the efficiency and lower the associated clean-up costs of coal-bed methane recovery, Lin says.

This latest report suggests these scientists are still pursuing this line of work. The potential benefits are considerable. The United States has more energy in coal than Saudi Arabia has in oil.

Over half of the electricity produced in the United States is generated by coal-based power plants. Coal is affordable. Supplies are plentiful. And, the United States possesses 275 billion tons of recoverable coal reserves, or about one-fourth of the world's total.

U.S. coal reserves are equivalent to four times the oil of Saudi Arabia, 1.3 times the oil of OPEC and equal to all the world's proved oil reserves.
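The "four times the oil of Saudi Arabia" claim can be roughly cross-checked. The heating values and the Saudi reserve figure below are my own approximate assumptions, not numbers from the article:

```python
# Rough energy cross-check of the quoted claim that 275 billion tons of
# US recoverable coal equals about four times Saudi Arabia's oil.
# Heating values and the ~260 billion barrel Saudi proved-reserve figure
# are approximate assumptions, not from the article.

COAL_GJ_PER_TON = 24.0     # GJ per ton of coal (varies widely by rank)
OIL_GJ_PER_BARREL = 6.1    # GJ per barrel of crude oil

us_coal_energy = 275e9 * COAL_GJ_PER_TON        # GJ in US recoverable coal
saudi_oil_energy = 260e9 * OIL_GJ_PER_BARREL    # GJ in Saudi proved reserves

print(round(us_coal_energy / saudi_oil_energy, 1))   # ~4.2
```

A ratio of roughly 4.2 on these assumptions is consistent with the quoted claim.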

The development of environmentally friendly and cheaper ways to use coal for more purposes holds out the hope of considerably reducing US dependence on Middle Eastern oil and, by doing so, improving America's strategic position in a number of ways. A total reduction of US dependence on foreign oil would provide a number of benefits for the United States:

  • Improve US balance of payments.
  • Reduce US need to keep Middle East politically stable enough to supply oil. This would reduce the US need for defense spending.
  • Reduce the risk that the whole world's economy would go into a deep depression should the Saudis undergo a revolution.
  • Reduce the amount of revenue available to spread Wahhabi Islam and terrorist activities. The lower US demand would reduce world market prices for oil and would therefore reduce mischief-funding oil revenue that now flows to the Middle East from around the world.
  • Give the US greater foreign policy decision-making leeway by removing some constraints on US decision-making.

Methods to more cheaply extract oil from US oil shale or Canadian oil sands would have most of the same set of benefits though in the case of the Canadian oil sands some of the economic benefits would of course flow to Canada rather than to the US. Still, the resulting lower world oil prices and reduction in the need for defense spending would yield substantial benefits for the US economy as well.

In my view there are very large compelling reasons of grand national strategy for the US government to push the development of a broad range of technologies to provide cost-competitive replacements for oil. That the US government has been and continues to be willing to spend hundreds of billions per year on national security and yet so little on meaningful energy research seems unwise when we consider that a substantial portion of defense and even foreign aid spending is due to the presence of so much oil in the Middle East.

Look at it this way: some day methods to extract energy from coal, oil shale, and oil sands will be found. Why not make that day come sooner? Some day methods to make orders of magnitude cheaper photovoltaics by using nanotechnology fabrication methods and materials will be developed. Why not make that day come sooner too? Some day we will have lithium polymer batteries light enough and sufficiently long lasting to use for powering cars. Again, why not make that day come sooner as well? Similar arguments could be made for new nuclear reactor designs that would be cheaper and safer and that would produce far less nuclear waste and far less material useful for making nuclear bombs. Ditto for a wide range of other energy-related technologies. US national security and US living standards would be improved by the development of these technologies and the development costs would be repaid many times over.

Update: Some may wonder whether we should look for ways to shift to coal for a greater portion of our energy consumption given that coal burning generates more carbon dioxide per unit of energy than other fossil fuel energy sources do. It is still debatable whether the build-up of carbon dioxide in the atmosphere will be a net detriment or benefit to humanity. But even if it becomes clear at some point in the future that the build-up will have to be stopped and perhaps even reversed, that does not mean fossil fuel consumption will necessarily have to be stopped. Dan Giammar, an assistant professor of civil engineering at Washington University in Saint Louis, is studying ways to sequester carbon dioxide deep underground by bonding it with silicate minerals in solid form.

"If you make more of it (carbon dioxide), you're going to have to do something with it," said Giammar. "Storing and sequestering is a good option."

Giammar's research may lead to not only storage but also permanent sequestration of carbon dioxide. He has found that when combined with silicate minerals containing either calcium, magnesium, or iron, carbon dioxide will precipitate, or change, into a carbonate solid.

"If you just have gaseous carbon dioxide stored underground, it becomes problematic when you think about leakage. But the carbonate mineral is a solid. It can't leak."

If carbon dioxide were injected into deep saline aquifers, several reactions would occur. The minerals would begin to dissolve as the pH of the saltwater became more acidic. The porosity of the rock would increase, allowing for the addition of more carbon dioxide. Eventually, carbonate solids would precipitate. This last phase is the most important in this model.
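The precipitation step Giammar describes can be written for representative calcium and magnesium silicates (wollastonite and forsterite); these are textbook mineral-carbonation reactions, not equations taken from the article:

```latex
\mathrm{CaSiO_3} + \mathrm{CO_2} \longrightarrow \mathrm{CaCO_3} + \mathrm{SiO_2}
```
```latex
\mathrm{Mg_2SiO_4} + 2\,\mathrm{CO_2} \longrightarrow 2\,\mathrm{MgCO_3} + \mathrm{SiO_2}
```

In both cases the carbon ends up locked in a carbonate solid, which is why, as the quote puts it, "it can't leak."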

"Reactive transport models now make assumptions based on calculations that carbonates will precipitate at a certain time," said Giammar. "If that's not what is really happening in the environment, we should know that. If we can understand this process, potentially it could give us the ability to control when and where these minerals form."

Carbon dioxide sequestration is still in its infancy. Giammar began his work on the project as part of the Carbon Mitigation Initiative at Princeton University. The United States Department of Energy (DOE) currently is planning a heavily monitored system to inject carbon dioxide into a sandstone aquifer on the Texas Gulf Coast. Another project in the North Sea has been storing carbon dioxide in an aquifer beneath the ocean for several years. And most recently, drilling began in July 2003 on a 10,000-foot well to evaluate underground rock layers in New Haven, W.Va., as part of a DOE carbon sequestration research project now underway at the American Electric Power Mountaineer plant there.

Even with current technology, carbon dioxide removal from coal burning would not be prohibitively expensive should it become necessary.

By Randall Parker    2003 September 12 02:47 PM   Entry Permalink | Comments (6)
2003 August 25 Monday
Hydrogen Pollution Research Points To Importance Of Soil Microbe Uptake

Hydrogen would inevitably leak if we shifted to a hydrogen economy, and the question arises: would large quantities of leaked hydrogen cause environmental problems?

In the August 21 issue of the journal Nature, a group of researchers from the California Institute of Technology and other institutions reports results of a study of the atmospheric chemical reactions that produce and destroy hydrogen in the stratosphere. Funded in part by the National Science Foundation (NSF), the study concludes that most of the hydrogen eliminated from the atmosphere goes into the ground, and therefore that scientists will need to turn their focus toward developing an understanding of soil destruction of hydrogen to accurately predict whether hydrogen emissions will eventually accumulate in the air.

The researchers reached this conclusion after carefully measuring the abundance of a rare isotope of hydrogen known as deuterium. It has long been known that atmospheric hydrogen is rich in deuterium, but it was unclear why. The only reasonable explanation, scientists believed, is that atmospheric hydrogen is mostly destroyed by chemical reactions in the air, and that those reactions are relatively slow for deuterium-rich hydrogen, so it accumulates like salt in an evaporating pan of water.

If correct, this would mean that oxidizing atmospheric trace gases control the natural hydrogen cycle and that soils are relatively unimportant. But new research results suggest that one of the main natural sources of atmospheric hydrogen--the breakdown of methane--is actually responsible for the atmosphere's enrichment in deuterium. This result implies that reactions with atmospheric oxidants may be less important to the hydrogen cycle, and that uptake by soils, where microbial processes involve methane, is the driving force.

Air samples from the stratosphere indicate that most atmospheric hydrogen is taken up by the soil.

Hydrogen is a highly reactive element, but answers to the questions of when and where it reacts, and under what circumstances, are difficult to unravel. These questions are simplified in the stratosphere, where it's easier to single out and understand specific reactions. According to John Eiler, a geochemist at the California Institute of Technology and an author of the Nature paper, the new data were gathered from air samples taken from the stratosphere with one of the high-flying ER-2 planes operated by the NASA Dryden Flight Research Center in the Mojave Desert.

The big question is whether the inevitable hydrogen leakage in an economy converted to hydrogen would accumulate in the atmosphere and cause harmful effects.

With precise information on the deuterium content of hydrogen formed from methane, the researchers were able to calculate that the soil uptake of hydrogen is as high as 80 percent. It is suspected that this hydrogen is used by soil-living microbes to carry on their biological functions, although the details of this process are poorly understood and have been the subject of only a few previous studies.

It seems likely, according to the scientists, that the hydrogen taken up by soils is relatively free of environmental consequences, but the question still remains of how much more hydrogen the soil can "consume." If future use of hydrogen in transportation results in a significant amount of leakage, then soil uptake must increase dramatically or it will be inadequate to cleanse the released hydrogen from the atmosphere, Eiler says.

It is assumed by most advocates of hydrogen as an energy storage form that it will not have any harmful pollutant effects. Well, this assumption might turn out to be correct. But for atmospheric scientists this is still an open question.

By Randall Parker    2003 August 25 10:18 AM   Entry Permalink | Comments (2)
2003 July 20 Sunday
Hydrogen Not Good Short, Medium Term Form Of Fuel

UC Berkeley academics throw cold water on the prospects for hydrogen.

In a paper appearing in the July 18 issue of Science magazine, Alex Farrell, assistant professor of energy and resources at UC Berkeley, and David Keith, associate professor of engineering and public policy at Carnegie Mellon University, present various short- and long-term strategies that they say would achieve the same results as switching from gasoline-powered vehicles to hydrogen cars.

"Hydrogen cars are a poor short-term strategy, and it's not even clear that they are a good idea in the long term," said Farrell. "Because the prospects for hydrogen cars are so uncertain, we need to think carefully before we invest all this money and all this public effort in one area."

Farrell and Keith compared the costs of developing fuel cell vehicles to the costs of other strategies for achieving the same environmental and economic goals.

"There are three reasons you might think hydrogen would be a good thing to use as a transportation fuel - it can reduce air pollution, slow global climate change and reduce dependence on oil imports - but for each one there is something else you could do that would probably work better, work faster and be cheaper," Farrell said.

The biggest problem with hydrogen as a means to reduce pollution is that it has to be produced from another energy source. But the most cost competitive energy sources are all forms of fossil fuels. The production of the hydrogen is not 100% efficient and producing it from fossil fuels produces pollution. The transportation and storage of the hydrogen also use substantial amounts of energy.

Hydrogen is also more difficult to store and transport and takes up much more space than liquid hydrocarbon fuels. It is not the only conceivable approach for reducing net greenhouse gas emissions from vehicles. Another approach would be to develop a light-driven chemical process that fixes carbon out of atmospheric carbon dioxide to make hydrocarbon fuels. Or, if cheap photovoltaic solar cells could be developed, electricity from solar cells could drive the chemical process to fix carbon from carbon dioxide. Effectively gasoline would be generated from solar power. Then the gasoline could be burned in cars. This artificial carbon cycle would eliminate the net addition of carbon dioxide gas to the atmosphere.

Back in 2000 the MIT Sloan Automotive Laboratory report On The Road: A life-cycle analysis of new automobile technologies by Malcolm A. Weiss, John B. Heywood, Elisabeth M. Drake, Andreas Schafer, and Felix F. AuYeung registered reservations about the future of hydrogen fuel. (PDF Format)

Continued evolution of the traditional gasoline car technology could result in 2020 vehicles that reduce energy consumption and GHG emissions by about one third from comparable current vehicles and at a roughly 5% increase in car cost. This evolved “baseline” vehicle system is the one against which new 2020 technologies should be compared.

More advanced technologies for propulsion systems and other vehicle components could yield additional reductions in life cycle GHG emissions (up to about 50% lower than the evolved baseline vehicle) at increased vehicle purchase and use costs (up to about 20% greater than the evolved baseline vehicle).

...

If automobile systems with drastically lower GHG emissions are required in the very long run future (perhaps in 30 to 50 years or more), hydrogen and electrical energy are the only identified options for “fuels”, but only if both are produced from non-fossil sources of primary energy (such as nuclear or solar) or from fossil primary energy with carbon sequestration.

A more recent MIT study released in March 2003 voices even greater doubts about the viability and desirability of hydrogen as a vehicle fuel in the next couple of decades.

Published in MIT Tech Talk, March 5, 2003.

Even with aggressive research, the hydrogen fuel-cell vehicle will not be better than the diesel hybrid (a vehicle powered by a conventional engine supplemented by an electric motor) in terms of total energy use and greenhouse gas emissions by 2020, says a study recently released by the Laboratory for Energy and the Environment (LFEE).

And while hybrid vehicles are already appearing on the roads, adoption of the hydrogen-based vehicle will require major infrastructure changes to make compressed hydrogen available. If we need to curb greenhouse gases within the next 20 years, improving mainstream gasoline and diesel engines and transmissions and expanding the use of hybrids is the way to go.

These results come from a systematic and comprehensive assessment of a variety of engine and fuel technologies as they are likely to be in 2020 with intense research but no real "breakthroughs." The assessment was led by Malcolm A. Weiss, LFEE senior research staff member, and John B. Heywood, the Sun Jae Professor of Mechanical Engineering and director of MIT's Laboratory for 21st-Century Energy.

...

However, the researchers do not recommend stopping work on the hydrogen fuel cell. "If auto systems with significantly lower greenhouse gas emissions are required in, say, 30 to 50 years, hydrogen is the only major fuel option identified to date," said Heywood. The hydrogen must, of course, be produced without making greenhouse gas emissions, hence from a non-carbon source such as solar energy or from conventional fuels while sequestering the carbon emissions.

The full text of the March 2003 MIT study Comparative Assessment Of Fuel Cells is available as a PDF document.

Curiously, in spite of the drawbacks of hydrogen as a way to store and transport energy, hydrogen produced in cars for immediate burning may be a way to increase the efficiency of internal combustion engines.

But the researchers want to take the concept a big step further, using plasma technology to turn cars into small-scale hydrogen-producing plants - and sharply boosting the spark-ignition engine's efficiency along the way.

"Spark-ignition engines are roughly 30 percent efficient and diesels are about 40 percent efficient," notes Cohn. "We want to approach a diesel level of efficiency while avoiding diesel's pollution problems."

The plasmatron - about the size of a half-gallon milk carton - would convert about a third of a vehicle's gasoline stream into hydrogen. In doing so, it would boost efficiency in varied ways.

I think the hydrogen fuel hype is vastly overblown. The US government spending on hydrogen development is money that would be better spent developing photovoltaic materials that can be made much more cheaply than current photovoltaics. The goal of US government-funded energy research ought to be to obsolesce fossil fuels by developing cheaper competitors.

Update: A big step forward in battery tech that lowered the cost and weight of batteries far enough to make hybrid vehicles competitive would allow reductions in emissions and in fossil fuel use while using all the existing fuel infrastructure. Donald Sadoway of MIT says that a big step forward in battery tech is achievable. On the subject of whether much better batteries could be developed for use in hybrid vehicles see my Energy Tech archives, and in particular see the bottom part of my post Is Hydrogen The Energy Of The Future?, where I link to Sadoway's views.

On the question of whether photovoltaics would have to take up too much space: first of all, it will eventually be possible to achieve fairly high solar photovoltaic cell efficiency. See my post Material Discovered For Full Spectrum Photovoltaic Cell about some LBNL researchers who found a material that is 50% efficient. Surely nanotubes will be able to achieve a still higher efficiency.

Also, I've done rough calculations on the surface area needed for photovoltaics, and the energy needed looks achievable with a fairly small portion of the Earth's surface. On my Parapundit.com blog, in the Grand Strategy archive, see the comment section of my post Energy Policy, Islamic Terrorism, And Grand Strategy, where I introduce some rough calculations on the area needed for photovoltaics. I'd appreciate it if anyone could point to more accurate calculations of how much energy the United States currently uses and how much space in the southern parts of the US would be needed to collect enough energy for current consumption rates.

You can also follow a debate about this post on Arnold Kling's EconLog.

By Randall Parker    2003 July 20 10:14 PM   Entry Permalink | Comments (9)
2003 May 20 Tuesday
Metal-Organic Frameworks May Solve Hydrogen Storage Problem

Metal-Organic Frameworks show promise as a way to solve the hydrogen storage problem for vehicles.

A new class of materials achieves that aim without the problems associated with other approaches, researchers report in the May 16 issue of the journal Science. Their work also points to ways of making the materials hold even more hydrogen.

"Hydrogen is an ideal fuel, because when burned it produces only water, which is quite harmless," said University of Michigan chemistry professor Omar Yaghi, whose work over the past 12 years led to the new materials. "But the problem has been, how do you store enough hydrogen for an automobile to run for 300 to 400 miles without refueling? You can't just put a huge tank of hydrogen on the back of an automobile; you have to concentrate the hydrogen into a small volume." That can be done by cooling hydrogen to an extremely low temperature or by compressing it under very high pressure, but neither option would be practical in a car or electronic gadget.

"Our idea was to create a material with pores that attract hydrogen," said Yaghi. "That makes it possible to 'stuff' more hydrogen molecules into a small area without resorting to high pressure or low temperature." The class of materials, called metal-organic frameworks (MOFs), can be made from low-cost ingredients, such as zinc oxide—a common component of sunscreen—and terephthalate, which is used in plastic soda bottles. Sometimes called crystal sponges, MOFs are essentially scaffolds made up of linked rods—a structure that makes for maximum surface area. Just one gram of a MOF, in fact, has the surface area of a football field.

The researchers found that they can increase the material's storage capacity by modifying the rods in various ways. "The material that we're reporting on takes up two percent of its weight in hydrogen," Yaghi said. "The U.S. Department of Energy (DOE) standard for use of hydrogen in automobile applications is about six percent. The exciting thing about this report is not only that we've discovered a new material that takes up hydrogen, but also that we've identified a clear path for how to get to six percent." In work published in Science last year, the researchers found that MOFs can also store large amounts of methane. "We now have materials that exceed the DOE requirements for methane, and we think we can apply the same sort of strategy for hydrogen storage."

MOFs should prove superior to metal hydride alloys, which also are being explored for hydrogen storage, said Yaghi. "One of the problems with metal hydride is that the stored hydrogen is chemically bound to the metal. That means that you have to pressurize the material to charge it with hydrogen, and you have to heat the material to high temperatures to discharge the hydrogen. The process of charging and discharging under these extreme conditions ends up contaminating the metal and breaking the whole process down, so these materials have a limited lifetime. With MOFs, the hydrogen is physically absorbed, not chemically absorbed, so it's easier to take the hydrogen out and put it back in without much energy cost."

A solution to the hydrogen storage problem would not by itself reduce the demand for fossil fuels. There would still be the need for alternative energy sources to use to generate the hydrogen in the first place. Still, the ability to easily and cheaply store and retrieve hydrogen with minimal energy loss would be a great enabling technology for the use of other energy sources.
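
The capacity figures quoted above translate directly into storage material mass. Here is a minimal sketch, assuming a hypothetical 5 kg of hydrogen for a 300 to 400 mile fuel cell vehicle range (an illustrative round number, not a figure from the article):

```python
# Mass of storage material needed to hold a given amount of hydrogen,
# where the material stores some percentage of its own weight in
# hydrogen. The 5 kg hydrogen requirement is an assumed round number.

def storage_material_kg(h2_kg, capacity_wt_percent):
    """Mass of material that stores capacity_wt_percent of its own
    weight in hydrogen."""
    return h2_kg / (capacity_wt_percent / 100.0)

print(round(storage_material_kg(5.0, 2.0)))  # reported MOF, 2 wt%: 250 kg
print(round(storage_material_kg(5.0, 6.0)))  # DOE target, 6 wt%: 83 kg
```

The gap between roughly 250 kg and 83 kg of sorbent shows why reaching the DOE 6 percent target matters so much for automotive use.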

The key point to keep in mind is that fossil fuels are both fuel sources and great forms of fuel storage (though natural gas is less easy to store). To move to a different source of energy (e.g. wind or solar or nuclear) for, say, transportation applications we need both that alternative source of energy and a way to put that energy into a form that is easy to put into vehicles. Many alternative sources of energy are made into electricity but existing types of batteries weigh too much and cost too much. However, existing methods for storing hydrogen are all quite unsatisfactory for transportation applications as well.

Whether a new battery technology or a new hydrogen storage technology will become the first viable non-fossil fuel energy storage technology for vehicles remains to be seen. A major commitment to shift to hydrogen as an energy storage form remains premature as long as there is not a great way to store it in vehicles.

Update: Yaghi explains why the MOFs do not need large temperature or pressure changes to store and retrieve the hydrogen.

"The hydrogen is physically attracted to the walls of the [material's] pores," he said. "This attraction makes it possible to stuff more hydrogen molecules into a small area without requiring either low temperatures or high pressures."

By Randall Parker    2003 May 20 06:50 PM   Entry Permalink | Comments (6)
2003 April 17 Thursday
Natural Gas Made Into Pellet Hydrates For Easier Transportation

Technology Review has an interesting article on the work of Japanese researchers to convert natural gas into a solid form, making it easier to transport from small remote fields that would otherwise be too expensive to operate.

Japanese researchers Hajime Kanda and Yasuhara Nakajima at Mitsui Engineering and Shipbuilding in Tokyo think they’ve found a solution with the aid of hydrates, solid crystals in which natural gas—composed chiefly of methane—is caged inside of water molecules.

If the article is correct then currently most of the natural gas in the world is not exploitable because the fields are too small to justify the cost of building pipelines to transport the gas to market. If these Japanese researchers succeed then natural gas could become a much larger percentage of total fossil fuel use.

It is worth having a look at world natural gas reserves. The world's total known oil reserves are 1212.811 billion barrels; total known natural gas reserves are 5501.424 trillion cubic feet. Saudi Arabia has the biggest oil reserves at 261.800 billion barrels, or about 21% of world oil reserves. But Russia has 1680 trillion cubic feet of natural gas, or over 30% of world natural gas. Russia has only 60 billion barrels of oil reserves while Saudi Arabia has only 224.7 trillion cubic feet of natural gas.

What we really need to know is how to compare natural gas reserves and oil reserves for energy content. Some handy tables of energy conversion units provide the needed data. 1 cubic foot of natural gas has 0.00102 million btus of energy whereas 1 barrel of oil contains 5.46 million btus. Therefore 5352.94 cubic feet of natural gas have as much energy as 1 barrel of oil. Armed with these conversion factors let's see how Russia and Saudi Arabia compare.

  • Saudi Arabia: (261.800 billion barrels x 5.46 million btus/barrel) + (224.7 trillion cubic feet x 0.00102 million btus/cu feet) = 1429.428 + 229.194 = 1658.622 million billion btus.
  • Russia: (60 billion barrels x 5.46 million btus/barrel) + (1680 trillion cubic feet x 0.00102 million btus/cu feet) = 327.6 + 1713.6 = 2041.2 million billion btus.
  • World Oil: (1212.811 billion barrels x 5.46 million btus/barrel) = 6621.94806 million billion btus.
  • World Natural Gas: (5,501.424 trillion cubic feet x 0.00102 million btus/cu feet) = 5611.45248 million billion btus.

If these calculations are correct then Russia has more energy than Saudi Arabia and the world has almost as much energy in the form of natural gas energy as it has in the form of oil. While the Middle East has 56% of the world's oil it has only 36% of the world's natural gas. Any technological development that makes it easier to store and transport natural gas will have a large impact on energy markets. Of the fossil fuel energy producers Russia will benefit the most and the world's demand for energy from the Middle East will be reduced.
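
As a sanity check, here is the same arithmetic as a small Python sketch, so the totals can be recomputed as reserve estimates are revised:

```python
# Rough energy-content comparison of oil and natural gas reserves,
# using the conversion factors above:
#   1 barrel of oil      ~ 5.46 million Btu
#   1 cubic foot of gas  ~ 0.00102 million Btu (about 1020 Btu)

BTU_PER_BARREL = 5.46e6   # Btu per barrel of oil
BTU_PER_CUFT = 1020.0     # Btu per cubic foot of natural gas

def reserve_energy_btu(oil_billion_barrels, gas_trillion_cuft):
    """Total energy of the given oil and gas reserves, in Btu."""
    oil_btu = oil_billion_barrels * 1e9 * BTU_PER_BARREL
    gas_btu = gas_trillion_cuft * 1e12 * BTU_PER_CUFT
    return oil_btu + gas_btu

saudi = reserve_energy_btu(261.800, 224.7)
russia = reserve_energy_btu(60.0, 1680.0)

# Express in "million billion" (1e15) Btu, as in the post:
print(round(saudi / 1e15, 1))   # ~1658.6
print(round(russia / 1e15, 1))  # ~2041.2
```

Swapping in updated reserve figures for the two function arguments reproduces the rest of the table the same way.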

By Randall Parker    2003 April 17 01:39 AM   Entry Permalink | Comments (3)
2003 April 07 Monday
Mini Fuel Cells For Personal Lightweight Cooling Systems

Researchers at the Pacific Northwest National Laboratory are developing miniaturized high output fuel cells for military applications.

“Our miniaturized fuel processor incorporates several chemical processes and operations in one device,” said Evan Jones, PNNL principal investigator. The fuel processor system contains two vaporizers, a heat exchanger, a catalytic combustor and a steam reformer, all within a compact package no larger than a dime.

When ready for final deployment, the military envisions many useful applications for this emerging miniaturized energy-generating technology. According to Terry Doherty, director of PNNL’s Department of Defense programs, soldiers could power personal, lightweight cooling systems while wearing protective suits and gear, prolonging their own comfort and efficiency during a reconnaissance.

“Vital personal communications devices could function for extended periods without the added weight of bulky, inefficient batteries,” Doherty said. He added that miniature sensors powered by the same technology could be scattered before advancing troops to monitor ground vibrations or detect dangerous toxic agents and relay this information electronically to soldiers. This technology broadens the possibilities for using self-sustaining items such as mobile devices in remote or difficult-to-access locations.

While methanol has proved to be the most effective fuel source, other liquid fuels such as butane, jet fuel — also known as JP-8 — or even diesel may be used. And, because the hydrogen power source is only produced as needed, there is no need to store or carry the volatile gas, reducing risk and creating a lighter load.

Testing has revealed that performance from the reformer and fuel cell prototype is impressive. “This system can produce an equivalent power (20 mW) to batteries, but at one-third the weight,” Jones said. Similar micro fuel cell systems with greater power output (50 W) currently under development are providing power equal to that of batteries weighing 10 times as much. Researchers suggest that with additional system efficiencies and improvements, even greater performance may be achievable. Development will now focus on creating a deployable system suitable for military use or industrial application.

High electric power output lightweight mini fuel cells would have many civilian applications as well. For example, workers in hot desert oil fields could wear cooling suits with lightweight backpack fuel cells that would allow them to work for longer periods outdoors. A former Bechtel worker who worked in Saudi Arabia once told me how they would work outside for a half hour and then come into a cooled mobile home for a half hour of recovery. This cycle of half hour on and half hour off was how they worked all day in Saudi Arabian oil fields where temperatures could reach 120 degrees Fahrenheit or even higher.

Construction workers in any really hot environment would find cooler suits incredibly useful as productivity enhancers. Also, if the energy of a fuel cell can be used to cool a suit it certainly can be used to warm one as well. Therefore, oil field workers in extremely cold environments could wear heated suits powered by mini fuel cells.

One problem with the use of fossil fuel powered fuel cells is that they produce carbon dioxide and possibly other pollutants whose build-up indoors could be a health problem. But in outdoor applications that gaseous build-up wouldn't be a problem.

Portable fuel cells would have a lot of great uses in hiking and camping trips. They could provide heat for stoves, electricity to power light fixtures, and electricity for communications, computers, and other applications in remote locations.

By Randall Parker    2003 April 07 03:14 PM   Entry Permalink | Comments (3)
2003 March 24 Monday
Is Hydrogen The Energy Of The Future?

The April 2003 issue of Wired has an article written by Peter Schwartz and Doug Randall advocating an accelerated conversion to a hydrogen economy. After discussing the problems inherent to storing hydrogen in gaseous and liquid forms they argue that solid materials as hydrogen sponges will be the best long term solution.

In the long run, the most promising approach is to fill the tank with a solid material that soaks up hydrogen like a sponge at fill-up and releases it during drive time. Currently, the options include lithium hydride, sodium borohydride, and an emerging class of ultraporous nanotech materials. Unlike gaseous hydrogen, these substances can pack a lot of power into a small space of arbitrary shape. And unlike liquid hydrogen, they can be kept at room temperature. On the other hand, energy is required to infuse the solid medium with hydrogen, and in some cases very high temperatures are required to get the fuel back out, exacting a huge toll in efficiency. Also, filling the tank can take far more time than pumping gasoline. Government money could bridge the gap between today's experiments and a viable solution.

But will the problems involved in solid hydrogen storage be any more tractable, and yield any better solutions, than the problems with gaseous or liquid storage? Will the solid material needed to store the hydrogen weigh as much as a battery containing the same amount of energy? The authors provide no indication as to why their preferred approach will turn out to be so advantageous.

The bigger problem with the article is that it does not explain why the use of hydrogen will allow us to reduce and eventually eliminate the use of fossil fuels. Hydrogen is not a source of energy. It would be more accurate to say that hydrogen is a way to store, transport, and use energy. Therefore it competes with other forms of stored energy. In cars and other vehicles hydrogen could be burned in fuel cells. But energy is needed to produce the hydrogen in the first place. To be a better automotive fuel hydrogen would somehow have to reduce the total usage of fossil fuels and do that better than other approaches that could be pursued.

Fossil fuels are a major source of energy today. Fossil fuels could be converted to hydrogen. But hydrogen advocates have not made a clear case for why hydrogen as an intermediate storage and end use form of energy is a more efficient way to use fossil fuels. There are too many unsolved problems and questions. Again, hydrogen does not really compete against other types of originating fuels. Rather, it relies on other types of originating fuels because it has to be produced using these other fuels.

If hydrogen is produced from electricity then the electricity must first be generated, and most electricity is generated by burning coal or natural gas. Hydro and nuclear produce smaller fractions of the total electric supply. We've pretty much harnessed the available hydroelectric sources, and hydroelectric is a small fraction of total generation in any case. The other big current alternative is nuclear energy. But for electricity generation nuclear power costs more than burning fossil fuels, so there is no big economic incentive on a global scale to build the massive numbers of nuclear power stations needed for a conversion to a nuclear-hydrogen economy. Also, widespread use of nuclear power on a global scale would so increase the availability of enriched uranium and plutonium that it would create unacceptable risks of nuclear and radiological weapons proliferation.

The economic case for the use of nuclear power looks even worse than current fossil fuel prices suggest. The marginal cost of oil production (in some fields it is about $3/barrel) in the Middle East is much lower than current oil prices. Therefore nuclear power can not displace the use of Middle Eastern fossil fuels unless nuclear power becomes much cheaper than it is now.

Fossil fuels could be used to generate hydrogen. Would this be a more efficient way to use fossil fuels for transportation purposes? Keep in mind that each step in the use of hydrogen would produce an energy loss. The efficiency of the energy conversion of fossil fuels to hydrogen would be less than 100%. The hydrogen could then be piped (or driven) to what are now gasoline stations. If liquid hydrogen were used in cars then the hydrogen would have to be cooled first to liquid form. Keeping it cool would require a great deal of insulation and probably ongoing additional cooling. Therefore a car just sitting in a parking lot would consume energy at some low rate. As the Wired article points out, even a solid storage method may require energy to get the hydrogen into the solid and to get it back out again. Meanwhile, there is an assortment of ways to make the old internal combustion vehicle more fuel efficient. Therefore hydrogen is not just competing against today's internal combustion engine transportation systems. It is also competing against tomorrow's.
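
The compounding of losses is worth making explicit. A toy sketch with purely hypothetical efficiency numbers (placeholders for illustration, not measured values) shows how per-step losses multiply through a fuel pathway:

```python
# Illustrative well-to-wheel comparison: each conversion step
# multiplies in, so several individually modest losses compound.
# All efficiency numbers below are hypothetical placeholders.

def chain(*efficiencies):
    """Overall efficiency of a series of conversion steps."""
    result = 1.0
    for e in efficiencies:
        result *= e
    return result

# Hypothetical hydrogen pathway: reform fossil fuel to hydrogen,
# compress and distribute it, then convert it in a fuel cell.
hydrogen_path = chain(0.75, 0.85, 0.50)

# Hypothetical gasoline pathway: refine and distribute gasoline,
# then burn it in an improved internal combustion engine.
gasoline_path = chain(0.85, 0.98, 0.35)

print(round(hydrogen_path, 2))  # 0.32
print(round(gasoline_path, 2))  # 0.29
```

With these made-up numbers the two pathways land close together, which is the point: hydrogen's efficiency advantage at the fuel cell can be eaten by the extra conversion and handling steps upstream.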

Hydrogen would most likely propel vehicles by being burned in a fuel cell. In theory fuel cells are a more efficient means of converting a liquid or gaseous fuel to mechanical power than the internal combustion engine. But hydrogen is not the only energy form that can be burned in fuel cells. There are fuel cell designs that will burn methane gas, for instance. In fact, due to the greater efficiency of fuel cells for the conversion of fossil fuels to electricity, fuel cells will become widely used for electric power generation from fossil fuels before they become used in transportation.

Is hydrogen the only viable candidate as an energy storage form to replace gasoline and diesel fuel in vehicles? In a word, no. Lead acid batteries have an energy storage density of 35 Watt Hours per kilogram. This leads to electric cars that weigh too much and have too short a range between recharges. MIT professor Donald R. Sadoway believes lithium polymer batteries can be developed that will have over an order of magnitude greater energy density than lead acid batteries.

Niels Bohr, the Danish physicist and Nobel Laureate, once cautioned that prediction is always dangerous, especially when it is about the future. With this disclaimer, then, we speculate on what is in store for rechargeable lithium batteries. In the near term, expect the push for all-solid-state, flexible, thin-film batteries to continue. This is driven by the desire to maximize the electrode–electrolyte interfacial area while minimizing diffusion distances within the electrodes themselves, in order to combine high capacity with high rate capability. Recent results from our laboratory indicate that in a multi-layer configuration comprising an anode of metallic lithium, a solid polymer electrolyte, and a cathode of dense, thin-film vanadium oxide, it is possible to construct a battery with projected values of specific energy exceeding 400 Wh/kg (700 Wh/l) and specific power exceeding 600 W/kg (1000 W/l).10,11 Another trend is distributed power sources as opposed to a single central power supply. This allows for miniaturization (e.g., the microbattery). Expect also the integration of energy generation with energy storage, for example, a multilayer laminate comprising a photovoltaic charger and a rechargeable battery. Ultimately, if scientific discoveries prove to be scalable and cost-effective, we should witness the large-scale adoption of electric vehicles.

When the cost of photovoltaics is lowered far enough to compete with fossil fuels then a combination of photovoltaics and lithium polymer batteries may well be the combination of technologies that will lead to the phase-out of the use of fossil fuels as vehicle power sources.
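
The specific energy numbers above translate directly into battery pack mass. A rough sketch, assuming a hypothetical vehicle that consumes 300 Wh per mile (an illustrative round number, not a figure from the sources quoted):

```python
# Battery pack mass needed for a given driving range, at the specific
# energies discussed above: lead acid ~35 Wh/kg versus a projected
# lithium polymer cell at ~400 Wh/kg. The 300 Wh/mile consumption
# figure is an assumed round number for illustration.

WH_PER_MILE = 300.0  # hypothetical vehicle energy consumption

def pack_mass_kg(range_miles, specific_energy_wh_per_kg):
    """Battery mass required to deliver the given range."""
    return range_miles * WH_PER_MILE / specific_energy_wh_per_kg

print(round(pack_mass_kg(200, 35)))   # lead acid: 1714 kg
print(round(pack_mass_kg(200, 400)))  # lithium polymer: 150 kg
```

An order-of-magnitude jump in specific energy turns an absurd pack mass into a manageable one, which is why Sadoway's projection matters so much for electric vehicles.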

The article co-authored by Donald Sadoway and Anne Mayes is from the August 2002 issue of MRS Bulletin dedicated to lithium batteries.

By Randall Parker    2003 March 24 02:45 AM   Entry Permalink | Comments (9)
2003 March 04 Tuesday
Hydrogen Economy Cost Calculations

Harry Braun, Chairman of the Hydrogen Political Action Committee, has written an article laying out some costs and arguing for the use of windpower to generate hydrogen for a hydrogen economy.

With state-of-the-art electrolyzers, about 55 kWh will be needed to manufacture the energy content of a gallon of gasoline in the form of gaseous hydrogen. Assuming electricity costs of 3 cents/kWh, the electricity costs alone would be $14.00/mBtu, which is equivalent to gasoline costing $1.60 per gallon. The cost and maintenance of the electrolyzer and related hydrogen storage and pumping system also needs to be factored in to the equation.

One problem that Braun brings up about liquid hydrogen as a transportation fuel is that while a gallon of gasoline has 115,000 Btus of energy, a gallon of liquid hydrogen has only 30,000 Btus. Therefore liquid hydrogen tanks would need to be much larger and at the same time stronger and insulated in order to hold the extremely cold liquid hydrogen. Not exactly an appealing prospect. Also, liquefying the hydrogen itself takes energy that boosts the costs by nearly a quarter.
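
Braun's electricity cost arithmetic is easy to check. The sketch below uses his 55 kWh and 3 cents/kWh figures plus the 115,000 Btu per gallon of gasoline; it lands slightly above his rounded $1.60 and $14/mBtu:

```python
# Check of the quoted electrolysis cost arithmetic: electricity needed
# to make hydrogen with the energy content of a gallon of gasoline.

KWH_PER_GALLON_EQUIV = 55.0       # electrolyzer input, from the quote
DOLLARS_PER_KWH = 0.03            # assumed electricity price
BTU_PER_GALLON_GASOLINE = 115_000

cost_per_gallon_equiv = KWH_PER_GALLON_EQUIV * DOLLARS_PER_KWH
cost_per_mbtu = cost_per_gallon_equiv / (BTU_PER_GALLON_GASOLINE / 1e6)

print(round(cost_per_gallon_equiv, 2))  # 1.65
print(round(cost_per_mbtu, 2))          # 14.35
```

Note this covers the electricity alone; as the quote says, the electrolyzer, storage, and pumping costs still have to be added on top.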

Even if we accept his assumptions for how far windpower costs could drop with mass production, his calculations take little account of the infrastructure costs of the huge hydrogen transition he envisions. Also, windpower seems a worse choice than photovoltaics for the United States in the long term, in part because wind farms have to be built where the wind is, whereas people in the US have been moving toward the Southern parts of the country, toward where there is more solar power to be tapped. Eventually (eventually? how long is eventually? er, I don't know), thin film photovoltaics will allow electric power to be generated much closer to where it is used.

Given the drawbacks of hydrogen as a power source it still seems possible that a big advance in battery technology could make batteries a viable alternative to hydrogen fuel cells.

An EE Times article from 2001 surveyed the field of battery development and found that experts think batteries viable as automotive power sources are still years away.

Similar efforts are in progress at Massachusetts Institute of Technology (MIT), where researchers have developed a competing lithium-polymer battery that could ultimately achieve energy densities of 300 W-hr/kg, according to its developers. The technology, which uses a multiple-layer configuration of polymer and metal resembling a potato chip bag, is funded by the Office of Naval Research and is said to be 5 to 10 years from commercialization.

That article does a good job of describing how far a battery technology would have to advance in order to become competitive for automotive applications. The MIT effort, if successful, would create batteries with about 4 times the energy density of the nickel-metal hydride batteries found in the most expensive uncompetitive electric vehicles (whose market prices are way below manufacturing costs btw). That would make the batteries dense enough. The cost is a question though.

In a more recent article Donald Sadoway and John Heywood (director of MIT's Sloan Automotive Lab) are noticeably lacking in enthusiasm for hydrogen as an automotive power source.

“Their state of development is oversold,” said Heywood. Sadoway put it another way: “In the context of portable power, fuel cells are not a technology, they’re a laboratory curiosity.”

Among other things, fuel cells are now far too expensive for use in mainstream applications like cars. That’s because they’re made partly of platinum, the same metal used in expensive jewelry. And an alternative to platinum will be difficult to discover, said Sadoway; “that’s Nobel Prize-winning work.”

Another key challenge: “How are we going to produce, distribute and store the hydrogen” for fuel cells, asked Heywood. He pointed out that the production of hydrogen itself involves generating various greenhouse gases. “So when people argue that the fuel cell produces only water vapor, that’s deceptive in the context of a complete transportation system,” he said.

Battery technology is appealing from an infrastructure standpoint because batteries could be recharged at night when existing electric power plants run well below maximum capacity. Then when photovoltaics become cost effective vehicles could be recharged during the day.

Stationary applications for alternative power sources are not as hard. There are lots of future possibilities for better ways to get energy for stationary uses. Some NASA researchers think thin film batteries and thin film photovoltaic cells could be integrated into roof tiles that would collect and store electrical energy.

There are also numerous applications that could exploit integrated power devices. Examples of these include: battery and solar cell devices integrated into a roof tile to provide a total power system for homes, or solar-rechargeable power systems for the military, for recreational vehicles, for cell phones or for other consumer products designed to be used in remote locations. In summary, the same considerations that provide performance edges for space applications make these power technologies applicable for terrestrial needs both individually or used in tandem.

By Randall Parker    2003 March 04 05:33 PM   Entry Permalink | Comments (1)
2003 January 24 Friday
Lithium Nitride Sets New Hydrogen Storage Record

The material has drawbacks because it requires either a higher temperature or low pressure to cause the hydrogen to release. But this result is important because it identifies a class of compounds that are worth investigating for hydrogen storage potential.

The researchers have found a material that can store and quickly release large amounts of hydrogen. Lithium nitride can store 11.4 percent of its own weight in hydrogen, which is 50 percent more than magnesium hydride, the previous best hydrogen storage material. Other metal hydrides generally store only 2 to 4 percent of their weight.
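
A quick back-check of the quoted figures: if lithium nitride's 11.4 percent is 50 percent more than magnesium hydride's capacity, the implied magnesium hydride figure follows directly.

```python
# Back-checking the quoted storage figures: lithium nitride stores
# 11.4 wt% hydrogen, described as 50 percent more than magnesium
# hydride, the previous best material.

li3n_capacity = 11.4
implied_mgh2 = li3n_capacity / 1.5

print(round(implied_mgh2, 1))  # 7.6 wt% implied for magnesium hydride
```

Both materials comfortably beat the 2 to 4 percent quoted for other metal hydrides, and lithium nitride clears the DOE's roughly 6 percent automotive target mentioned in the earlier MOF post.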

By Randall Parker    2003 January 24 12:52 AM   Entry Permalink | Comments (5)
2002 December 26 Thursday
US Navy Develops New Fuel Cell

The enormous weight of lead-acid batteries and limited range of electric cars illustrate the importance of energy density in energy storage technologies.

Researchers at the US Naval Undersea Warfare Center Division have developed a semi-fuel cell, which is a high energy density source for underwater vehicle applications with energy densities approaching 6 to 7 times that of silver-zinc batteries. The new electrochemical system is based on a magnesium anode, a seawater/catholyte electrolyte and an electrocatalyst of palladium and iridium catalyzed on carbon paper.

To put this in perspective compare some of other battery technologies currently in use:

Long a mainstay for undersea vehicle programs, the lead acid battery has been used because of its low cost, known performance, reliability and reasonable cycle life. Its principal disadvantages are low specific energy (30 Wh/kg) and energy density (65 Wh/litre), loss of capacity at low temperatures and the production of hydrogen gas during charges as well as high rate discharges. Nickel cadmium batteries have a specific energy (30 Wh/kg) and energy density (75 Wh/litre) that are comparable to lead acid. Their cost, performance, reliability and cycle life are also comparable. Unlike lead acid batteries, however, cold temperatures do not degrade their performance significantly. A major limitation of the nickel cadmium battery is memory effect, requiring more stringent battery management.

Until recently, the silver-zinc battery has been the battery of choice for long range missions. Silver-zinc batteries are available off-the-shelf and have a higher specific energy (130 Wh/kg) and density (240 Wh/litre) than most other commonly available secondary batteries. High cost, limited cycle and shelf life, and a long recharging process reduce its overall attraction. While at normal discharge rates, 40 to 50 cycles can be expected from the battery, this reduces to 10 or 15 at high discharge rates. Cycle life is also reduced if the battery is discharged below 80% of rated capacity and thus, a 20% reserve is required at the end of the mission. Silver-zinc batteries have been used extensively in AUVs and their performance is reliable and documented. Their high cost and short life have, however, prompted consideration of alternative technologies.

Note that the high cost and short lifetime of the silver-zinc battery have restricted its use to specialty applications such as underwater vehicles. It's not clear what the lifetime of the Navy semi-fuel cell would be. Still, its energy density greatly surpasses that of any type of battery that turned up in some Google searches.
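To put the quoted specific-energy numbers side by side, a sketch. The semi-fuel cell range is inferred from the stated "6 to 7 times that of silver-zinc," not a measured figure:

```python
# Specific energy (Wh/kg) figures quoted above for undersea vehicle batteries.
specific_energy = {
    "lead-acid": 30,
    "nickel-cadmium": 30,
    "silver-zinc": 130,
}

# Implied range for the Navy semi-fuel cell at 6 to 7 times silver-zinc:
semi_fuel_cell = (6 * specific_energy["silver-zinc"],
                  7 * specific_energy["silver-zinc"])
print(semi_fuel_cell)  # (780, 910) Wh/kg
```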

By Randall Parker    2002 December 26 12:21 PM   Entry Permalink | Comments (1)
2002 December 09 Monday
Wind Power Rapidly Growing In Europe

Wind now supplies 28 million Europeans with electricity.

Europe's wind-driven energy has been growing at 40 percent a year. With a capacity of more than 20,000 megawatts installed on land, it now represents three-fourths of the world's total wind-power output. Europe hopes to raise this to 60,000 megawatts in the next six years. Much of that growth is expected to come from sea-based turbines.
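The quoted targets actually imply a slower pace than the recent 40 percent per year. A quick compound-growth check:

```python
# Annual growth rate needed to triple installed capacity in six years,
# using the figures quoted above (20,000 MW installed, 60,000 MW hoped for).
start_mw, target_mw, years = 20_000, 60_000, 6
required_annual_growth = (target_mw / start_mw) ** (1 / years) - 1
print(f"{required_annual_growth:.1%} per year")  # ~20.1% per year
```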

Unfortunately, the article is rather short on cost information (why didn't the NY Times editors demand the writer include it?), and it doesn't sound like wind power is really cost competitive with other energy sources:

Then there is the issue of price. Industry spokesmen contend that, strictly speaking, the price of wind-driven energy is close to being competitive with other sources. They argue that traditional fossil fuels and nuclear energy get enormous hidden or indirect subsidies, to the tune of billions of dollars a year. For example, in some European countries, governments pay for the insurance of nuclear power plants.

The nuclear insurance costs are a poor example of a power subsidy because nuclear power is not the lowest cost method of producing electricity in the first place. Fossil fuels (I'm guessing natural gas in particular) are the lowest cost energy sources for generating electricity. What subsidies exist for them are mostly in the form of not forcing producers to pay all external costs generated by the pollution from burning the fuels. Such costs are hard to estimate.

I get annoyed by articles like this New York Times article. What it needed (and what the NY Times surely could have gotten from industry sources fairly easily) was a graph of historical generation costs for new fossil fuel and new wind power facilities. If we want to project forward about the prospects for wind power it would be useful to know how rapidly it is closing the cost gap with other power sources.

By Randall Parker    2002 December 09 11:22 AM   Entry Permalink | Comments (4)
2002 December 04 Wednesday
New Australian Photovoltaic Cell To Be Cheaper

While their manufacturing process uses fewer silicon wafers, they neglect to say how much that will reduce the manufacturing cost of their cells.

A joint venture between the Australian National University and Origin Energy has developed a new type of solar cell with the potential to revolutionise the global solar power industry.

Director of the ANU Centre for Sustainable Energy Systems, Professor Andrew Blakers today unveiled the Sliver Cell™, which uses just one tenth of the costly silicon used in conventional solar panels while matching power, performance and efficiency.

Professor Blakers said, "A solar panel using Sliver Cell™ technology needs the equivalent of two silicon wafers to convert sunlight to 140 watts of power. By comparison, a conventional solar panel needs about 60 silicon wafers to achieve this performance.

"By dramatically reducing the amount of expensive pure silicon, the largest cost in solar panels today, this new technology represents a major advance in solar power technology."

Origin Energy's Executive General Manager, Generation, Andrew Stock said, "Origin Energy has worked with ANU's Centre for Sustainable Energy Systems for several years, investing more than $6 million in research to discover a way to harness the sun's power at much lower cost.

"Due to the economy and flexibility of Sliver Cells™, we believe this technology will play an important role in the future widespread adoption of solar power. Sliver Cell™ technology is an excellent example of the way Australian researchers can work with Australian industry to innovate a product that leads the world".

ANU Vice-Chancellor, Professor Ian Chubb welcomed the research breakthrough. "Origin Energy is to be congratulated for its foresight and persistence in supporting the ANU team in this project. The company has made a substantial contribution since establishing the research partnership with ANU," Professor Chubb said.

The most expensive part of traditional solar power panels is the silicon from which the individual cells are made. The Sliver Cell™ is a radically different concept in photovoltaics. Sliver Cells™ are produced using special micro-machining techniques, then assembled into solar panels using similar methods to those used to make conventional solar panels.

The new technology reduces costs in two main ways – by using much less expensive silicon for similar efficiency and power output, and needing less capital to build a solar panel plant of similar capacity.

The unique attributes of Sliver Cell™ technology could open many new Sliver Cell™ applications, in addition to conventional rooftop and off-grid uses, including:

  • Transparent Sliver Cell™ panes to replace building windows and cladding
  • Flexible, roll-up solar panels
  • High-voltage solar panels, and
  • Solar powered aircraft, satellite and surveillance systems
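The wafer figures quoted above (two wafers versus about 60 for a 140 watt panel) can be turned into a rough watts-per-wafer comparison. Note this sketch counts wafers, not total silicon mass, so it differs from the "one tenth of the silicon" headline number:

```python
# Rough watts produced per silicon wafer, from the quoted panel figures.
panel_watts = 140
sliver_wafers = 2           # Sliver Cell panel, per Professor Blakers
conventional_wafers = 60    # conventional panel, per the same quote

print(panel_watts / sliver_wafers)          # 70.0 W per wafer
print(panel_watts / conventional_wafers)    # ~2.33 W per wafer
print(conventional_wafers / sliver_wafers)  # 30.0x reduction in wafer count
```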
By Randall Parker    2002 December 04 12:15 PM   Entry Permalink | Comments (4)
2002 November 28 Thursday
DOE EIA Renewable Energy Annual 2001

The Energy Information Administration of the US Department Of Energy has just released its Renewable Energy Annual report for 2001. The summary is available on a web page.

There were dramatic changes in the patterns of photovoltaic (PV) cell and module shipments. Domestic shipments shot up nearly 80 percent in 2001 to 36.3 peak megawatts, while exports declined 10 percent. This reverses a 10-year history of largely modest growth in domestic shipments and strong gains in exports. Overall, total PV cell and module shipments rose 11 percent in 2001 to 98 peak megawatts.

There were also substantial changes in the type of module produced. For example, thin-film silicon, which had never had more than 4 peak megawatts shipped in a single year, had almost 13 peak megawatts of cells and modules shipped in 2001. This was partially at the expense of cast-and-ribbon cells and modules, whose shipments decreased from 33 peak megawatts in 2000 to 30 peak megawatts in 2001.

Module manufacturers purchased substantially less product in 2001, receiving shipments of 14 peak megawatts of cells and modules, compared with 19 peak megawatts in 2000. Despite this trend, total module shipments rose from 55,007 peak kilowatts to 67,033 peak kilowatts.

The total value of PV cell and module shipments rose to $305 million in 2001, a 13-percent gain over 2000. The average price per peak watt held fairly steady for both cells and modules during 2001 at $2.46 and $3.42, respectively.

A 34-percent surge in shipments to the residential market enabled it to regain its ranking as the top market for PV cells and modules in 2001. Manufacturers shipped 33 peak megawatts of cells and modules to the residential market in 2001, compared with 25 in 2000. Shipments to the second-largest market sector, industrial, declined slightly from 29 to 28 peak megawatts.

What's interesting here is that thin film solar photovoltaics are expanding their market share at the expense of other types of photovoltaics. Also, wind is growing faster than solar and has almost equalled solar, as seen in Table H1. My guess is that hydroelectric's decline is due to changes in weather patterns.

As can be seen in this chart renewables are only 6% of total US energy production and solar is only 1% of total renewables (so solar is about 0.06% of total energy production). What's surprising (at least to me) is that biomass is a bigger source of energy than hydroelectric. Three quarters of the biomass is wood and wood wastes.
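The share arithmetic in the paragraph above is simple enough to check directly:

```python
# Solar's share of total US energy production, from the chart figures
# cited above (renewables ~6% of total; solar ~1% of renewables).
renewables_share = 0.06
solar_share_of_renewables = 0.01
solar_share_of_total = renewables_share * solar_share_of_renewables
print(f"{solar_share_of_total:.2%}")  # 0.06%
```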

There is no dramatic trend of declining prices in the photovoltaics market. From the full text PDF version of the report:

The total value of photovoltaic cell and module shipments grew 13 percent to $305 million in 2001 from $270 million in 2000 (Table 29). The average price for modules (dollars per peak watt) decreased 1 percent, from $3.46 in 2000 to $3.42 in 2001. For cells, the average price increased 3 percent, from $2.40 in 2000 to $2.46 in 2001.

Looked at over a longer period of time a more dramatic price drop can be seen:

Twenty-one companies were involved in the production of 88,221 kWp of solar PV in 2000, says the Energy Information Administration in its 'Renewable Energy Annual' report. The total of 85,155 kW of crystalline silicon and 2,736 kW of thin-film silicon is an increase from the 12,492 and 1,321 (respectively) produced in 1990. The cost was US$3.46 per peak watt for modules and $2.40 for cells, compared with $5.69 and $3.84 a decade earlier.
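The decade of module prices quoted above implies a fairly slow compound rate of decline. A sketch (the $1 per peak watt target is my arbitrary illustration, not a figure from the report):

```python
import math

# Compound annual decline in module price per peak watt, 1990 to 2000,
# using the figures quoted above ($5.69 down to $3.46).
p1990, p2000 = 5.69, 3.46
annual_decline = 1 - (p2000 / p1990) ** (1 / 10)
print(f"{annual_decline:.1%} per year")  # ~4.9% per year

# Hypothetical: years to reach $1/Wp if that pace simply continued.
years_to_one_dollar = math.log(p2000 / 1.0) / -math.log(1 - annual_decline)
print(round(years_to_one_dollar))  # ~25 years
```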

Still, we aren't going to get cost effective photovoltaics any time soon unless annual price drops become consistent and dramatic. Gradual refinement of existing photovoltaics manufacturing techniques is probably not going to be what makes photovoltaic devices into a low cost energy source. The research on thin films and nanotechnology will probably be what produces the technological breakthrus that will finally make solar power cost competitive with fossil fuels.

By Randall Parker    2002 November 28 02:42 PM   Entry Permalink | Comments (0)
2002 November 26 Tuesday
Stanford Global Climate & Energy Project

Stanford has set up a privately funded energy and climate research effort to the tune of $225 million dollars. From the project FAQ:

The money:

Q: What corporations are providing funding for this Project and how much are they contributing?

A: To date, sponsors and their contributions to help fund the research are:

  • ExxonMobil (NYSE: XOM), the world's largest publicly traded petroleum and petrochemical company (up to $100 million);
  • General Electric (NYSE: GE), the world leader in power generation technology and services ($50 million); and
  • Schlumberger (NYSE: SLB), a global technology services company ($25 million).

The university expects to involve additional global companies in the automotive and technology industries as the research progresses. E.ON, Europe's largest privately owned energy service provider, has signaled its intention to contribute $50 million and join G-CEP along with other academic and corporate sponsors from Europe. The value of this combined sponsorship is equal to the total of all the corporate-sponsored research at Stanford over the past 10 years.

The areas of research:

Q: What are some immediate projects/innovations that you will be exploring?

A: This Project will give researchers the freedom to explore a variety of new energy technology fields, some of which are in their infancy now but need further exploration. Stanford will develop and maintain a portfolio of specific research options that would further the objectives of the Project and will consider at a minimum the following topics:

  • Low greenhouse gas electric power production, storage, and distribution
  • Advanced transportation techniques
  • Production, distribution, and use of hydrogen
  • Production, distribution, and use of biomass fuels
  • Advanced nuclear technologies
  • Renewable energy supplies (for example, wind and solar energy)
  • Carbon sinks, CO2 separation and storage
  • Coal utilization
  • Material, combustion, and systems science
  • Enabling infrastructure
  • Geoengineering

Specific research initially will focus on:

  • Development of a methodology for an integrated assessment of technology options
  • Hydrogen production and utilization, including biological hydrogen production and efficient hydrogen fuel cells
  • Advanced combustion systems aimed at increasing efficiency of combustion devices and reducing environmental impact
  • Geologic sequestration of CO2

The ownership of the resulting intellectual property:

Q: Who will hold title to new technologies brought to market through this initiative?

A: Stanford will hold formal legal title to all technology and information derived from this program. It also will hold formal legal title to all patents sought.

The complete project white paper is available as a downloadable PDF.

By Randall Parker    2002 November 26 12:12 PM   Entry Permalink | Comments (0)
2002 November 24 Sunday
BASF Hydrogen Storage Nanocubes

BASF is going after the future market for fuel cells as a way to power portable electronic devices. The hope is that a fuel cell combined with a storage device would yield a higher power to weight ratio than existing rechargeable batteries and hence longer battery life.

The hydrogen in the cartridge would be subject to 10 times atmospheric pressure--about the same level as in a butane cigarette lighter, BASF says. The nanocubes provide controlled release of the hydrogen to the fuel cell, the company says. The hydrogen-fed fuel cells could power portable devices for more than 10 hours, it adds.

However, as a portable power source the hydrogen fuel cells face a competitor in the form of liquid powered fuel cells. There are companies bringing out prototype liquid fuel cells for portable electric power sources. A company called Smart Fuel Cell argues that methanol fuel cells will be more convenient since recharging will be easier.

In April 2002 SFC had presented the first prototype power supply for mobile office applications at the Hanover Fair. Manfred Stefener, founder and CEO of Smart Fuel Cell: "So far we have miniaturised our products every six months by more than 50 %. The recent progress demonstrated now is based upon an entirely new DMFC stack design. Furthermore, we have made every system component smaller in close collaboration with our supplier network."

SFC has furthermore built up the first infrastructure for fuel cartridges. Stefener said, "It is essential that cartridges are widely available for the consumers, for example at filling stations and supermarkets. This is a lot easier to establish for methanol cartridges than for hydrogen-based systems, and we have already realized the complete logistics chain of the cartridges for our first series product."

Update: Another interesting article on recent fuel cell advances can be found here.

By Randall Parker    2002 November 24 01:59 PM   Entry Permalink | Comments (2)
2002 November 20 Wednesday
Material Discovered For Full Spectrum Photovoltaic Cell

A new discovery raises the prospect of a more efficient photovoltaic cell for lower cost solar energy. Two layers of indium gallium nitride in a solar cell design could convert sunlight to electricity at 50% efficiency.

BERKELEY, CA — Researchers in the Materials Sciences Division (MSD) of Lawrence Berkeley National Laboratory, working with crystal-growing teams at Cornell University and Japan's Ritsumeikan University, have learned that the band gap of the semiconductor indium nitride is not 2 electron volts (2 eV) as previously thought, but instead is a much lower 0.7 eV.

The serendipitous discovery means that a single system of alloys incorporating indium, gallium, and nitrogen can convert virtually the full spectrum of sunlight -- from the near infrared to the far ultraviolet -- to electrical current.

"It's as if nature designed this material on purpose to match the solar spectrum," says MSD's Wladek Walukiewicz, who led the collaborators in making the discovery.

What began as a basic research question points to a potential practical application of great value. For if solar cells can be made with this alloy, they promise to be rugged, relatively inexpensive -- and the most efficient ever created.

The original URL for the article has a graph.

Update: For more details also see this article from the Lawrence Berkeley National Laboratory site:

Working with crystal growers from Cornell and Ritsumeikan University, Japan, the LBNL team performed optical tests (absorption and “photoluminescence”) on a wide range of extremely high quality InN and InxGa1-xN films grown under carefully controlled conditions. It was found that the direct band gap of pure InN is 0.7 eV rather than the previously reported 2.0 eV, which had been measured in lower quality material. Furthermore, it was shown that alloying the InN with GaN to form InxGa1-xN can produce materials whose bandgaps can be continuously varied from 0.7 eV to 3.4 eV. This single semiconductor alloy system, therefore, has an almost perfect match to the entire solar spectrum. Not only does this range include the optimal bandgap values (1.1 and 1.7 eV) for a two-layer cell, it will also enable the fabrication of optimized tandem cells with more layers, for which materials whose band gaps extend close to the lower and nearly all the way to the upper bounds of the usable region of the solar spectrum are required. More recent work has shown that the InxAl1-xN system has direct band gaps spanning an even wider energy range: from 0.7 – 6.2 eV; thus, this related materials system may be useful for both solar energy conversion and for other optoelectronic applications in the near-IR to deep ultraviolet regions of the spectrum.

Although grown on lattice mismatched substrates, all the InxGa1-xN films show an exceptionally strong and robust photoluminescence, demonstrating insensitivity of the optoelectronic properties to structural imperfections. This observation bodes very well for applications of these materials in environmentally harsh conditions. To fully implement the InxGa1-xN alloys for photovoltaic applications some additional hurdles such as control of p-type doping must be overcome, however the work demonstrates that III-V nitride alloys are promising candidates for the development of new solar cells with efficiencies as high as 50%. Furthermore, the discovery extends the range of potential optoelectronic applications of III-V nitride alloys from the near infrared to the deep ultraviolet spectral regions.
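The quoted band gaps map onto photon wavelengths via the standard relation λ(nm) ≈ 1240 / E(eV), which makes the spectral coverage concrete. The wavelength interpretations in this sketch are mine, not from the article:

```python
# Convert semiconductor band gaps (eV) to the corresponding photon
# wavelengths (nm) using lambda = hc / E, with hc ~= 1239.84 eV*nm.
def bandgap_to_nm(gap_ev):
    return 1239.84 / gap_ev

for gap in (0.7, 1.1, 1.7, 3.4):
    print(f"{gap} eV -> {bandgap_to_nm(gap):.0f} nm")
# 0.7 eV -> 1771 nm (near infrared)
# 1.1 eV and 1.7 eV -> the optimal two-layer cell gaps quoted above
# 3.4 eV -> 365 nm (near ultraviolet)
```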

Update II: The reason that InGaN turned out to have a different absorption spectrum than expected may have been due to impurities in previously used samples. The InGaN used in this recent set of experiments was much purer but also much more expensive. Now the researchers need to find out whether less pure material can exhibit a similarly wide light absorption spectrum:

The samples Walukiewicz tested were made using a painstaking, and prohibitively expensive, method to grow very pure crystals of InGaN one atomic layer at a time. The team now hopes to collaborate with the National Renewable Energy Laboratory in Colorado to try to build cheap InGaN solar cells.

Update III: These scientists have a number of problems to solve before this breakthru turns into something useful. They need to find out how to make other forms of this material:

There's a lot of work to be done before practical solar cells can be made from indium nitride, however. The researchers have not yet made the p-type form of the material. "One of the biggest challenges is to make p-type doped indium nitride," said Walukiewicz. The indications are good, however. It is theoretically easier to make p-type doped indium nitride than to do the same with gallium nitride, which has already been done, he said. Gallium nitride is also a direct band-gap material.

The researchers' next step is to make p-type indium nitride. They are also working to make p-type gallium indium nitride, he said. And they are more thoroughly testing the properties of the two materials under high-energy particle irradiation, he said.

The researchers have only tested a few samples, said Cheng Hsiao Wu, a professor of electrical and computer engineering at the University of Missouri at Rolla. The reasons for the measurements are not yet clear; there could be a mechanism involved other than a different band gap, he said.

By Randall Parker    2002 November 20 04:36 PM   Entry Permalink | Comments (14)
2002 November 04 Monday
The Next 50 Years Is A Long Time In Technology

Why, when thinking about technology, does Martin Hoffert think that 50 years is not a long time?

There is no current alternative to fossil fuels that would maintain world economic growth while generating fewer environmental toxins, the team found.

"We don't have those energy sources off the shelf right now, but we have some time to develop them," the report's lead author, Martin Hoffert, a professor of physics at New York University in New York City, told United Press International.

"We have about 50 years. However, 50 years is not a long time."

Given the rate at which biotech and electronics tech are advancing the next 50 years is an extremely long time for technological advances. 50 years from now we will have computers that are many orders of magnitude faster than the computers of today. Those computers will be able to simulate all manner of physical processes and simulation experiments will turn out all sorts of ways to make photovoltaic cells, fuel cells, materials for wind catching propellers for wind power, and for countless other energy-related technologies. We will have complete control of DNA and will be able to make new species of plants and single cell organisms that would make for better biomass energy generators. As an example of where future biotech advances can make a big difference consider how gene tweaking will allow improvements on the already promising prospects for using algae to generate hydrogen fuel:

The breakthrough, Melis said, was discovering what he calls a "molecular switch." This is a process by which the cell's usual photosynthetic apparatus can be turned off at will, and the cell can be directed to use stored energy with hydrogen as the byproduct. "The switch is actually very simple to activate," Melis said. "It depends on the absence of an essential element, sulfur, from the micro alga growth medium." The absence of sulfur stops photosynthesis and thus halts the cell's internal production of oxygen. Without oxygen from any source, the anaerobic cells are not able to burn stored fuel in the usual way, through metabolic respiration. In order to survive, they are forced to activate the alternative metabolic pathway, which generates the hydrogen and may be universal in many types of algae. "They're utilizing stored compounds and bleeding hydrogen just to survive," Melis said. "It's probably an ancient strategy that the organism developed to live in sulfur-poor anaerobic conditions." He said the alga culture couldn’t live forever when it is switched over to hydrogen production, but that it can manage for a considerable period of time without negative effects.

The folks at Melis Energy are working to improve the yields of naturally occurring algae. But imagine what bioengineering of algae DNA will make possible in 10 or 20 years. Algae will be optimizable for energy generation tasks. Also, nanotechnological advances will allow fabrication of materials we can only dream about today. The onus to justify their pessimistic viewpoints belongs on the people who do not believe that photovoltaics, fuel cells, and biomass will be cost effective in 30 or 40 years.

Why is it that some people think that only a large scale international coordination of efforts by governments can solve large scale problems?

"What our research clearly shows is that scientific innovation can only reverse this trend if we adopt an aggressive, global strategy for developing alternative fuel sources that can produce up to three times the amount of power we use today," New York University physicist Martin Hoffert said.

Scientific innovations will reverse the trend even if governments do not get involved in funding alternative energy research. Could government money accelerate the process? Only if the government restricts its involvement to basic research and if it stays clear of picking particular technologies to be winners. If it picks the wrong ones and that intimidates private funders from pursuing competitors it is even possible that government involvement could slow the rate of progress. But the development of new energy sources is a process that is going to happen anyhow.

They drag out the proposal to build solar arrays in space to then beam energy down here.

Is it feasible to replace fossil fuels with cleaner sources of energy? A new study concludes that it could be done with enough “political will” and what the lead researcher described as a global effort pursued with the same urgency as the Apollo space program. Europe is showing that will, recently embarking on a massive investment program in hydrogen and fuel cells. But the researchers didn’t see a similar push in the United States.

For space enthusiasts that is a fun proposal. But wouldn't it make more sense to spend a small fraction of that amount of money to just develop processes that will lower the cost of making solar panels?

Policy discussions ought to be restricted to how much to give university basic researchers to work on basic related science problems.

Joel Darmstadter, an energy researcher at Resources for the Future, an energy think tank, said the study by Hoffert and others is a useful review of the technical status of the world's alternate energy systems. The study, he said, could prompt policy discussions because it gives an evaluation of what is possible to replace fossil fuels.

But Darmstadter said the study failed to draw a clear picture of which of the alternative systems should have the highest priority and bases some of the discussion on "far out and highly speculative" technologies, such as the power satellite.

If governments wanted to increase funding to basic physics and chemistry researchers who are trying to understand the qualities of photovoltaic materials then I think the rate of advance could be accelerated. Ditto for scientists who are investigating how to do nanotech manipulations with materials or biological scientists who are studying how chloroplasts do photosynthesis. But if governments fund production line construction or other types of decisions that are best left to business then I have serious doubts about the ability of governments to increase the rate of progress.

You can find my previous energy technology posts here.

By Randall Parker    2002 November 04 04:25 PM   Entry Permalink | Comments (6)
2002 October 10 Thursday
Microbial Fuel Cell To Run On Kitchen Scraps

This reminds me of the fusion reactor that took scraps in the Back To The Future movie. This one produces hydrogen gas to run a fuel cell.

Although such "microbial fuel cells" (MFCs) have been developed in the past, they have always proved extremely inefficient and expensive. Now Chris Melhuish and technologists at the University of the West of England (UWE) in Bristol have come up with a simplified MFC that costs as little as £10 to make.

By Randall Parker    2002 October 10 08:42 AM   Entry Permalink | Comments (1)