2005 December 31 Saturday
Fischer-Tropsch Coal Gas Cost Effective With Current Oil Prices?

The Chinese are planning to start converting coal into liquid fuels.

ROYAL Dutch/Shell Group has handed in a feasibility study report on a coal-to-liquid project in China worth several billion dollars to compete with Sasol Ltd, the National Development and Reform Commission said.

The project involves the proposed building of two plants with a combined investment of US$6 billion to US$8 billion in Shaanxi Province and Ningxia Hui Autonomous Region, which represent areas with the most significant coal reserves in the country.

The estimated crude production capacities of the two plants are up to 80,000 barrels per day, or more than 1 percent of China's total oil consumption currently.

Shell is competing against South African company Sasol, which currently makes 150,000 barrels of oil per day from coal.

The Governor of Montana wants to have Montana coal converted to synthetic gas and liquid fuel.

Gov. Brian Schweitzer believes Montana is sitting on the answer, and it’s in the form of the nation’s second largest coal reserve. Schweitzer wants the state to begin using an 80-year-old technology developed by Nazi Germany to turn Montana’s vast supplies of coal into usable, ultra-clean-burning diesel and aviation fuel.


With oil prices more than doubling the break-even point of producing synthetic fuels, oil companies and world leaders are beginning to take a serious look at the future of Fischer-Tropsch fuels. Schweitzer predicts they could be produced at a cost of about $1 per gallon in Montana if large-scale commercial plants could be developed in the state.


Closer to home, the Great Plains Synfuels Plant near Beulah, N.D., began operating in 1984 in response to the 1970s energy crisis and today produces more than 54 billion cubic feet of natural gas using the Fischer-Tropsch process.

“The Department of Energy was going to build hundreds of those plants but then oil prices dropped and we all forgot about it,” says Schweitzer, who visited the Beulah plant three weeks ago. “The cost of production [of syngas] at that plant last year was $3 per MCF (thousand cubic feet). Now the price of [natural gas] is $7 per MCF.”

Are Governor Schweitzer's numbers correct? What is the cost to convert coal to natural gas and then to liquid fuel with the Fischer-Tropsch process or some other process? Is synthetic gas (gaseous hydrocarbons - not gasoline) already cheaper than the current price of natural gas in the United States?
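As a quick sanity check on the Beulah numbers quoted above, here is the gross margin implied by a $3/MCF production cost, a $7/MCF market price, and 54 billion cubic feet of annual output (the function and figures are just a restatement of the article's numbers, not audited plant data):

```python
# Rough check of the Beulah syngas economics from the quoted article.
def annual_margin(cost_per_mcf, price_per_mcf, annual_bcf):
    """Gross margin for a syngas plant, in dollars per year."""
    mcf_per_year = annual_bcf * 1_000_000  # 1 billion cubic feet = 1 million MCF
    return (price_per_mcf - cost_per_mcf) * mcf_per_year

# $3/MCF cost, $7/MCF price, 54 bcf/year of output.
margin = annual_margin(cost_per_mcf=3.0, price_per_mcf=7.0, annual_bcf=54)
print(f"Gross margin: ${margin/1e6:.0f} million per year")  # $216 million
```

If those figures hold, the plant clears a few hundred million dollars a year over production cost, which is why the Governor's numbers are worth checking.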

Three coal gasification plants are proposed in Illinois.

Three coal gasification plants currently are proposed in Illinois; only three are operating in the nation. A 260 megawatt gasification plant in Florida currently uses Illinois Basin coal. The state government, coal companies and even utilities have banded together to lobby for an Illinois siting of the federally-subsidized near-zero emission coal plant of the future known as FutureGen.

Planning for nine additional Gas-To-Liquids plants was underway worldwide in 2004.

Synthesis gas can also be created from natural gas - and this is less costly than from coal. Since 1993, Shell in Malaysia (Bintulu) and PetroSA in South Africa (Mossel Bay) have been operating industrial Fischer-Tropsch Synthesis facilities, which produce liquid fuels from synthesis gas which comes originally from natural gas (Gas To Liquid, GTL). A third similar plant is being built by Sasol and Qatar Petroleum in Qatar in the Persian Gulf. Last year nine more GTL-facilities were being planned world-wide; most of them are to use Fischer-Tropsch Synthesis.

Both coal gasification and conversion of natural gas to liquid fuels are becoming more widely used.

Liquid fuel produced by the Fischer-Tropsch Gas-To-Liquids (GTL) process burns more cleanly than conventional fossil fuels and hence is less polluting.

Shell and ExxonMobil are ramping up production on a fuel, called Gas-to-Liquids, that's derived from natural gas. It significantly reduces the sulfur, carbon monoxide and other pollutants that belch from car tailpipes. And although more costly than regular gas, it should help crimp the air pollution in places like Los Angeles, or in New Delhi, where diesel buses are banned.

One impetus behind use of the Fischer-Tropsch GTL process is that natural gas is hard to transport. At the same time, the demand for liquid fuel is strong and prices are high. Shell has a plant in Borneo and is building another in Qatar to convert locally produced natural gas into a diesel-compatible fuel that burns much more cleanly than does diesel made from oil.

Note that natural gas gets used to generate electricity. If nuclear, wind, and solar generated all electricity then more natural gas would be available to make liquid fuel for transportation. There is already a lot of potential for substitutability of fuels even without development of better battery technologies.

Big politics and big money are also converging to support oil shale development.

Legislation recently signed by President Bush instructs the Interior Department to lease 35 percent of the federal government's oil shale lands within the next year, provides tax breaks to the industry, reduces the ability of states and local communities to influence where projects are located and compresses lengthy environmental assessments into a single analysis good for 10 years.


To produce the oil, Shell and other companies sink heaters half a mile into oil shale seams for up to four years, subjecting the rock to 700 degrees. Over time, natural gas and a liquid that can be refined into light crude oil rise to the surface. To prevent the brewing hydrocarbons from spoiling groundwater, the heated rock core would be surrounded by a 20-to-30-foot-thick impermeable ice wall, which also requires electricity to keep it frozen.

The federal government has begun leasing land for oil shale production. Ten new research and development leases are being processed by the Bureau of Land Management in Colorado. Others have been awarded on federal land in Utah and Wyoming.

See my previous post on Shell's effort to develop a better method to extract oil from oil shale. On coal gasification see a couple of Green Car Congress posts: "Rentech Moves on Its PolyGeneration Strategy: Fertilizer, FT Fuels and Power" and "Rentech Tracking to Startup Coal-to-Liquids Pilot Plant in Q4 2006". Also, Rentech makes the argument that synthetic liquid fuels made from natural gas (or coal gas for that matter) contain fewer of the contaminants that mess up fuel cells than do liquid hydrocarbons made from oil. So when fuel cell technology matures that might increase the demand for synthetic liquid fuels even further.

Nuclear power could free up more fossil and biomass fuels to use as liquid fuel for ground transportation. This could be done a number of ways including the following:

  • Build a nuclear plant next to the Canadian tar sands and use nuclear energy to heat up the tar to extract oil from it. I think over half the energy extracted from the tar gets used to do the extraction. The planned Canadian natural gas pipeline will have part of its natural gas going toward tar oil extraction. That could be avoided with nuclear power. Then more oil could be made from the natural gas.
  • Build a nuclear power plant next to a coal field and run all the Fischer-Tropsch processing steps using nuclear power.
  • Build a nuclear plant next to the oil shale fields and use nuclear power to heat up the shale under ground using Shell's extraction process.
  • Use nuclear power for agricultural uses such as powering water pumps and drying corn. Then biomass liquid fuel production would not use fossil fuels.

There's enough coal to provide liquid fuel for a long time to come even if part of the coal is used to generate energy to process other coal into natural gas and liquids. Ditto with oil shale and Canadian tar sands. But for making liquid fuels nuclear power could stretch the supplies of coal, oil shale, and oil tar (perhaps doubling or tripling the amount extractable as liquid fuel) and also reduce the total amount of polluting emissions generated by the production of liquid fuel.

My bigger point here is that even if the "Peak Oil" pessimists are right and oil production peaks sometime in the next 10 years that would not spell the end of the fossil fuels economy or the end of heavy reliance on cars and trucks. We will not enter a worldwide depression. The liquid hydrocarbon alternatives to oil are not prohibitively expensive. The quantities of capital needed to rapidly build up conversion plants would be available because the energy marketplace deals in sales in the hundreds of billions of dollars every year. Non-fossil fuel energy sources can even be used to help convert non-liquid hydrocarbons into liquid hydrocarbons.

Looking down the road a few decades I expect solar photovoltaics to become cheap as a result of nanotech advances. That might happen as early as 10 or 20 years from now. If Peak Oil comes early we can keep vehicles moving using liquid fuels made from coal and oil shale. Then we can transition to nuclear and solar to charge better batteries once battery technologies advance far enough to make pure electric vehicles possible.

I'd rather that the transition to nuclear, solar, and batteries happen sooner for environmental, national security, and economic reasons. But I'm not worried about Peak Oil if the transition away from oil comes as soon as the pessimists predict.

Update: I've come across a number of companies developing what they claim are better catalysts and other improvements on the Fischer-Tropsch process. For example, Syntroleum claims to have a better catalyst for Gas-To-Liquid conversion.

During the last two years at Syntroleum's 70 barrel per day gas-to-liquids (GTL) facility at the Port of Catoosa near Tulsa, Okla., Syntroleum utilized its proprietary FT-410 cobalt catalyst to successfully demonstrate the Syntroleum(R) Process by producing ultra-clean diesel and jet fuels from natural gas feedstock for various U.S. government programs.

This new testing program will demonstrate the effectiveness of the Syntroleum FT catalyst with proven coal-derived syngas clean-up and treatment processes for use in a coal-to-liquids (CTL) application. Syngas, which consists of hydrogen and carbon monoxide, is the building block for many chemical processes including FT ultra-clean fuels produced from the Syntroleum(R) Process.

"This testing program is an important step for Syntroleum in demonstrating that our proven natural gas-to-liquids technology is also applicable to coal-to-liquids as well," said Ken Roberts, senior vice president of business development for Syntroleum. "We see this as an opportunity to develop our position toward participation in future coal-to-liquids plants."


"Coal-to-liquids technology has the potential of providing a tremendous source of ultra-clean fuels from abundant coal reserves in the United States and other regions of the world," Roberts said. "The U.S. has the world's largest estimated recoverable coal reserves equaling over 268 billion tons. If only 5 percent of this coal were converted to FT liquids, it would be equivalent to the entire oil reserves currently held by the U.S."
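Roberts' claim is easy to sanity check. The 268 billion ton reserve figure and the 5 percent fraction come from the press release; the 2 barrels-per-ton liquids yield and the roughly 22 billion barrel US proven oil reserve figure are my own rough assumptions:

```python
# Back-of-envelope check on the Syntroleum coal-to-liquids claim.
coal_reserves_tons = 268e9    # estimated recoverable US coal (from the quote)
fraction_converted = 0.05     # 5 percent, as in the quote
bbl_per_ton = 2.0             # assumed FT liquids yield per ton of coal
us_oil_reserves_bbl = 22e9    # assumed US proven oil reserves, mid-2000s

ft_liquids_bbl = coal_reserves_tons * fraction_converted * bbl_per_ton
print(f"FT liquids from 5% of coal: {ft_liquids_bbl/1e9:.1f} billion barrels")
print(f"Ratio to assumed US oil reserves: {ft_liquids_bbl/us_oil_reserves_bbl:.1f}x")
```

With those assumed numbers the 5 percent figure comes out at roughly 27 billion barrels, in the same ballpark as US proven oil reserves, so the claim is at least plausible.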

I'd like to know what fraction of the energy extracted from coal makes it into liquid form. The process must require considerable amounts of energy at every step. Unless nuclear, solar, or wind power drives the conversion process, liquids made from coal will effectively emit much more CO2 and other gases than petroleum used for the same end purposes. So a switch to coal would increase the greenhouse gas effect of fossil fuel usage.
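To make the CO2 concern concrete, here is a crude estimate. The carbon intensities and the 50 percent conversion efficiency below are illustrative assumptions of mine, not measured data for any particular plant:

```python
# Back-of-envelope estimate of the coal-to-liquids CO2 penalty.
coal_c_intensity = 25.0  # kg carbon emitted per GJ of coal burned (assumed)
oil_c_intensity = 20.0   # kg carbon per GJ of petroleum fuel burned (assumed)
ctl_efficiency = 0.50    # assumed fraction of coal energy ending up in liquids

# All of the coal's carbon is emitted somewhere (at the plant or the
# tailpipe), so per GJ of delivered liquid fuel the carbon released is:
ctl_c_per_gj_fuel = coal_c_intensity / ctl_efficiency
ratio = ctl_c_per_gj_fuel / oil_c_intensity
print(f"CTL emits roughly {ratio:.1f}x the carbon of petroleum per unit of fuel")
```

Under those assumptions coal-derived liquids carry well over twice the carbon emissions of the petroleum fuels they replace, which is why the source of the conversion energy matters so much.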

Update II: Coal is not immune to price rises.

The demand for coal on the world market is up, according to The Associated Press. China has gone from being an exporter of coal to being an importer. Because demand is higher, the price is higher. Futures contracts for Western U.S. coal have gone from about $9 in June to $19.50 in October.


Coal is in demand because it produces energy at low cost. In July, electric companies could produce one megawatt-hour of electricity for $17 by burning coal. It cost $59 to produce the same energy with natural gas and $64 with liquid fuels such as kerosene, according to the AP.
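The AP's per-MWh figures can be roughly reproduced from fuel prices and heat rates. The prices and heat rates below are round numbers I am assuming for mid-2005 plants, not figures from the AP story:

```python
# How per-MWh generation costs fall out of fuel prices and heat rates.
def fuel_cost_per_mwh(fuel_price_per_mmbtu, heat_rate_btu_per_kwh):
    """Fuel cost of one MWh: price per MMBtu times MMBtu burned per MWh."""
    mmbtu_per_mwh = heat_rate_btu_per_kwh * 1000 / 1_000_000
    return fuel_price_per_mmbtu * mmbtu_per_mwh

# Assumed: ~$1.70/MMBtu delivered coal, ~$7.40/MMBtu natural gas.
print(f"Coal: ${fuel_cost_per_mwh(1.70, 10_000):.0f}/MWh")
print(f"Gas:  ${fuel_cost_per_mwh(7.40, 8_000):.0f}/MWh")
```

Those assumptions land right on the AP's $17 coal and $59 gas figures: the gap comes almost entirely from the fuel price, since gas plants actually burn fewer Btu per MWh than coal plants do.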

But my guess is that higher prices will be transient since lots of coal mines can be opened up.

By Randall Parker 2005 December 31 10:45 PM  Energy Fossil Fuels
Entry Permalink | Comments(37)
2005 December 30 Friday
Would You Rather Be Watched By Computers Or People?

Since I think the death of privacy is inevitable anyway the idea of computer programs looking for patterns in huge numbers of phone call records does not bother me much and it seems preferable to human spying.

What has not been publicly acknowledged is that N.S.A. technicians, besides actually eavesdropping on specific conversations, have combed through large volumes of phone and Internet traffic in search of patterns that might point to terrorism suspects. Some officials describe the program as a large data-mining operation.


Officials in the government and the telecommunications industry who have knowledge of parts of the program say the N.S.A. has sought to analyze communications patterns to glean clues from details like who is calling whom, how long a phone call lasts and what time of day it is made, and the origins and destinations of phone calls and e-mail messages. Calls to and from Afghanistan, for instance, are known to have been of particular interest to the N.S.A. since the Sept. 11 attacks, the officials said.

If a computer program analyzes tens or hundreds of billions of call records and some dozens of those records are from calls you made, do you feel like your privacy has been invaded? I don't. Statistical analysis of massive gobs of data doesn't make me feel like I'm being watched. It just isn't personal enough. I like the idea that such impersonal means of analysis of data can lead to the identification of circles of friends and associates around terrorists.

If intelligence agencies were restricted to using conventional wiretapping court orders aimed at watching specific individuals then data mining computer programs would have nothing to analyze for useful patterns. The whole idea of the approach is to try to find the needle in a haystack by rapidly comparing very large numbers of objects. Each object gets a very limited examination and few of the objects get looked at by real humans.

What is it about privacy invasion that most bothers you? Do you simply not like the idea of people watching you? Or is the objection more along the lines of specific harms incurred as a result? Are you afraid someone who watches you will use the information thus gleaned to blackmail or otherwise harm you?

Also, if someone is going to watch you would you prefer it is employees of an intelligence agency or local police or your neighbor?

I'd rather have governments discover the identity of terrorists by doing statistical analysis of large numbers of phone calls or credit card transactions or flight reservations rather than by, say, planting bugs to listen to conversations of people with ties to the Middle East. Computer analyses seem less invasive because human minds are not finding out intimate details of lives.

The use of computers seems preferable to having law enforcement personnel going around questioning lots of people about the personal lives of other people they know. The questioning can quite unfairly hurt a person's reputation. Whereas a computer program comparing billions of records in databases does not make your neighbors or employers or co-workers or friends think you might be involved in nefarious activities.

Update: When I present the choice as computers or people watching us I think this is an accurate representation of the truth. Intelligence agencies are searching for the terrorist needle in the human haystack. Either they use automation to find the terrorists or they employ much larger (orders of magnitude larger in all likelihood) numbers of people to sit in cars watching who comes to whose apartment, who has lunch with whom, or where someone goes when they fly out of the country and so on.

See Heather Mac Donald's City Journal article where she explains how the TIA project could have linked all the al-Qaeda operatives together before 9/11.

Why DARPA’s interest in commercial repositories? Because that is where the terror tracks are. Even if members of sleeper cells are not in government intelligence databases, they are almost certainly in commercial databases. Acxiom, for example, the country’s largest data aggregator, has 20 billion customer records covering 96 percent of U.S. households. After 9/11, it discovered 11 of the 19 hijackers in its databases, Fortune magazine reports. The remaining eight were undoubtedly in other commercial banks: data aggregator Seisint, for example, found five of the terrorists in its repository.

Had a system been in place in 2001 for rapidly accessing commercial and government data, the FBI’s intelligence investigators could have located every single one of the 9/11 team once it learned in August 2001 that al-Qaida operatives Khalid al-Midhar and Nawaq al-Hazmi, two of the 9/11 suicide pilots, were in the country. By using a process known as link analysis (simpler than data mining), investigators would have come up with the following picture: al-Midhar’s and al-Hazmi’s San Diego addresses were listed in the phone book under their own names, and they had shared those addresses with Mohamed Atta and Marwan al-Shehi (who flew United 175 into the South Tower of the World Trade Center). A fifth hijacker, Majed Moqed, shared a frequent-flier number with al-Midhar. Five other hijackers used the same phone number Atta had used to book his flight reservations to book theirs. The rest of the hijackers (who crashed in Pennsylvania) could have been tracked down from addresses and phones shared with hijacker Ahmed Alghamdi, a visa violator—had the INS bothered to locate him before the flight by running his name on its overstayer watch list.
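The link analysis Mac Donald describes is mechanically simple: connect records that share an address, phone number, or frequent-flier number, then pull out the connected component around a known suspect. Here is a toy sketch; the names come from the quoted passage but the shared attributes are simplified placeholders of my own:

```python
# Toy link analysis: breadth-first search over shared record attributes.
from collections import defaultdict

records = {
    "al-Midhar": {"addr:SanDiego", "ff:12345"},
    "al-Hazmi":  {"addr:SanDiego"},
    "Atta":      {"addr:SanDiego", "phone:555-0100"},
    "Moqed":     {"ff:12345"},          # shared frequent-flier number
    "al-Shehi":  {"phone:555-0100"},    # shared booking phone number
    "unrelated": {"addr:Elsewhere"},
}

# Index each attribute to the people who share it.
by_attr = defaultdict(set)
for person, attrs in records.items():
    for a in attrs:
        by_attr[a].add(person)

def component(seed):
    """Everyone reachable from the seed through chains of shared attributes."""
    found, frontier = {seed}, [seed]
    while frontier:
        person = frontier.pop()
        for a in records[person]:
            for other in by_attr[a] - found:
                found.add(other)
                frontier.append(other)
    return found

print(sorted(component("al-Midhar")))
```

Starting from one known name, the search sweeps in everyone chained to him through shared data while leaving the unconnected record untouched, and no call content is ever examined.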

Data mining can find the needle in the haystack. It can do this without listening in on phone conversations. Of course, there is a third choice: let terrorist attacks happen.

Also see my posts "Privacy Concerns Block Response To Terrorist Threat" (which includes a discussion of science fiction writer David Brin's argument that the death of privacy really is inevitable), "Heather Mac Donald on US Senate TIA Ban", and my favorite on the absurd: "Heather Mac Donald: Government Panel Opposes Google Searches By Spies".

By Randall Parker 2005 December 30 10:22 PM  Surveillance Computers
Entry Permalink | Comments(38)
2005 December 29 Thursday
Better Diet Decreases Incidence Of Age Related Macular Degeneration

Eat good food to reduce the rate at which your eyes age.

A diet with a high intake of beta carotene, vitamins C and E, and zinc is associated with a substantially reduced risk of age-related macular degeneration in elderly persons, according to a study in the December 28 issue of JAMA.

Age-related macular degeneration (AMD) is a degenerative disorder of the macula, the central part of the retina, and is the most common cause of irreversible blindness in developed countries, according to background information in the article. Late-stage AMD results in an inability to read, recognize faces, drive, or move freely. The prevalence of late AMD steeply increases with age, affecting 11.5 percent of white persons older than 80 years. In the absence of effective treatment for AMD, the number of patients severely disabled by late-stage AMD is expected to increase in the next 20 years by more than 50 percent to 3 million in the United States alone. Epidemiological studies evaluating both dietary intake and serum levels of antioxidant vitamins and AMD have provided conflicting results. One study (called AREDS) showed that supplements containing 5 to 13 times the recommended daily allowance of beta carotene, vitamins C and E, and zinc given to participants with early or single eye late AMD resulted in a 25 percent reduction in the 5-year progression to late AMD.

Redmer van Leeuwen, M.D., Ph.D., of Erasmus Medical Centre, Rotterdam, the Netherlands, and colleagues investigated whether antioxidants, as present in normal daily foods, play a role in the primary prevention of AMD. Dietary intake was assessed at baseline in the Rotterdam Study (1990-1993) using a semiquantitative food frequency questionnaire. Follow-up continued through 2004. The Rotterdam Study included inhabitants aged 55 years or older from a middle-class suburb of Rotterdam, the Netherlands. Of 5,836 persons at risk of AMD at baseline, 4,765 had reliable dietary data and 4,170 participated in the follow-up.

Average follow-up of participants was 8.0 years. During this period, 560 persons (13.4 percent) were diagnosed as having new AMD, the majority of whom had early-stage AMD. A significant inverse association was observed for intake of vitamin E, iron, and zinc. After adjustment, a 1-standard deviation increase in intake was associated with a reduced risk of AMD of 8 percent for vitamin E and 9 percent for zinc. An above-median (midpoint) intake of beta carotene, vitamins C and E, and zinc, compared with a below-median intake of at least 1 of these nutrients, was associated with a 35 percent reduced risk of AMD, adjusted for all potential confounders. In persons with a below-median intake of all 4 nutrients, the risk of AMD was increased by 20 percent. Adding nutritional supplement users to the highest quartile of dietary intake did not change the results.

The benefit of the nutrients is higher when the nutrients come from foods.

"This study suggests that the risk of AMD can be modified by diet; in particular, by dietary vitamin E and zinc. A higher intake of vitamin E can be achieved by consumption of whole grains, vegetable oil, eggs, and nuts. High concentrations of zinc can be found in meat, poultry, fish, whole grains, and dairy products. Carrots, kale, and spinach are the main suppliers of beta carotene, while vitamin C is found in citrus fruits and juices, green peppers, broccoli, and potatoes. Based on this study, foods high in these nutrients appear to be more important than nutritional supplements. Until more definitive data are available, this information may be useful to persons with signs of early AMD or to those with a strong family history of AMD. Although in need of confirmation, our observational data suggest that a high intake of specific antioxidants from a regular diet may delay the development of AMD," the authors conclude.

The foods contain other beneficial compounds and part of the measured benefit associated with high consumption of E, zinc, and so on probably comes from other compounds. For example, foods high in the D-alpha tocopherol form of vitamin E also have other tocopherols and some foods even have the related tocotrienols. Also beta carotene is part of a very large family of carotenoids. In addition, healthful foods have a variety of anti-oxidants that are not even related to vitamins.

Improve your diet to increase your odds of living long enough to take advantage of Strategies for Engineered Negligible Senescence. I'm convinced that there are people alive right now who will one day become young again using coming advances in biotechnology. We should all support a big push to make that day come sooner. But while we do that and while we wait for rejuvenation therapies we should eat excellent diets to increase our odds of still being around for that day.

By Randall Parker 2005 December 29 10:22 PM  Aging Diet Eye Studies
Entry Permalink | Comments(5)
2005 December 27 Tuesday
Many Small Hydropower Projects In Planning Stages

High energy prices are making a lot of dam upgrade projects for hydroelectric power look economic.

Propelled by high energy costs, federal incentives, and an eased licensing process, at least 104 projects in 29 states - with 2,400 megawatts of new capacity - have been granted "preliminary permits" by the Federal Energy Regulatory Commission (FERC), which regulates hydropower development. Many other projects in the works have not yet been officially reported by FERC, observers say.

This is happening against a backdrop of fights by environmentalists to get old dams torn down. Make energy prices high enough and I think the rate at which environmentalists win the dam battles will drop.

If the most promising existing dams were retrofitted to generate electricity they could produce up to 17 gigawatts of power.

About 4 in 5 projects on the books are tiny - producing less than 20 megawatts of power. But if all 104 projects now in the planning stages are built, they would contribute 2.4 gigawatts to generating capacity nationwide.

The potential exists for much more, say federal researchers. Of 80,000 existing dams, only about 2,500 generate electricity. Upgrading those hydropower dams could boost power by 4,300 megawatts. Retrofitting the most promising of the remaining 77,000 dams could generate as much as 17,000 megawatts, according to a recent US Department of Energy Report.

Would you rather have more dams for electricity or more coal burning power plants or more nuclear power plants?

To put that potential 17 gigawatts of additional hydroelectric power in perspective, in the United States the US government's Energy Information Administration projects 174 gigawatts of additional electric power generating capacity from coal by the year 2030.
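Capacity comparisons can mislead because hydro and coal plants run at very different fractions of their nameplate ratings. The capacity factors below are typical assumed values, not DOE figures:

```python
# Convert nameplate capacity to annual energy using a capacity factor.
def annual_twh(capacity_gw, capacity_factor=0.40):
    """Annual generation in terawatt-hours: GW x capacity factor x hours/year."""
    return capacity_gw * capacity_factor * 8760 / 1000  # GWh -> TWh

# Assumed capacity factors: ~40% for hydro, ~75% for baseload coal.
print(f"17 GW of hydro retrofits: ~{annual_twh(17):.0f} TWh/year")
print(f"174 GW of new coal:       ~{annual_twh(174, capacity_factor=0.75):.0f} TWh/year")
```

On those assumptions the hydro retrofits would deliver only around a twentieth of the energy of the projected coal build-out, even though the capacity ratio alone suggests a tenth.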

In the AEO2006 reference case, the projected average prices of natural gas and coal delivered to electricity generators in 2025 are 31 cents and 11 cents per million Btu, respectively—higher than the comparable prices in AEO2005. Although the projected levels of coal consumption for electricity generation in 2025 are similar in the two forecasts, higher natural gas prices and slower growth in electricity demand in AEO2006 lead to significantly lower levels of natural gas consumption for electricity generation. As a result, projected cumulative capacity additions and generation from natural-gas-fired power plants are lower in the AEO2006 reference case, and capacity additions and generation from coal-fired power plants through 2025 are similar to those in AEO2005. In the later years of the AEO2006 projection, natural-gas-fired generation is expected to decline, displaced by generation from new coal-fired plants (Figure 5). The AEO2006 projection of 1,070 billion kilowatthours of electricity generation from natural gas in 2025 is 24 percent lower than the AEO2005 projection of 1,406 billion kilowatthours.

In the AEO2006 reference case, the natural gas share of electricity generation (including generation in the end-use sectors) is projected to increase from 18 percent in 2004 to 22 percent around 2020, before falling to 17 percent in 2030. The coal share is projected to decline slightly, from 50 percent in 2004 to 49 percent in 2020, before increasing to 57 percent in 2030. Additions to coal-fired generating capacity in the AEO2006 reference case are projected to total 102 gigawatts between 2004 and 2025, as compared with 86 gigawatts in AEO2005. Over the entire period from 2004 to 2030, 174 gigawatts of new coal-fired generating capacity is projected to be added in the AEO2006 reference case, including 19 gigawatts at CTL plants.

Nuclear generating capacity in the AEO2006 reference case is projected to increase from about 100 gigawatts in 2004 to about 109 gigawatts in 2019 and to remain at that level (about 10 percent of total U.S. generating capacity) through 2030. The total projected increase in nuclear capacity between 2004 and 2030 includes 3 gigawatts expected to come from uprates of existing plants that continue operating and 6 gigawatts of capacity at newly constructed power plants, stimulated by the provisions in EPACT2005, that are expected to begin operation between 2014 and 2020.

Coal is cheaper than natural gas. But burning coal generates more pollution. Making coal burn with less emissions raises the cost of burning coal. About half of the new electric power generation capacity added between now and 2030 is projected to be from coal.

Currently the whole world has about 350 gigawatts of nuclear electric capacity. But China might build 300 gigawatts of nuclear power plants by the year 2050.

If you do not want higher energy prices and you do not want energy sources that pollute then the only remaining option is to greatly accelerate the rate of advance of energy technologies. But I do not see a political consensus in favor of that option in the United States or in Europe for that matter. Over in China they are going to burn enormously larger quantities of coal and build dozens or hundreds of nuclear power plants. The US is sticking with the use of increasing amounts of coal. Technological advances will eventually make coal less bad than it is today. But it is not clear to what extent governments will force the coal burners to use more expensive technologies to burn coal more cleanly. Is there any prospect for coal gasification to make cleaner coal at no higher total cost?

By Randall Parker 2005 December 27 09:59 PM  Energy Policy
Entry Permalink | Comments(11)
2005 December 26 Monday
New England To Limit CO2 Emissions

The New England states have agreed to a mini-Kyoto Accord.

At the heart of the Regional Greenhouse Gas Initiative (RGGI) is a "cap and trade" program that sets a fixed limit on CO2 emissions. The right to emit the gas then becomes a tradable commodity on Jan 1, 2009. Companies that produce less carbon dioxide can sell their credits to others, giving an economic incentive to cut emissions and sell, rather than buy, credits.

The RGGI caps regional CO2 emissions at 121.3 million short tons through 2014, then cuts them to 10 percent below that level by 2018. Some say the pact will cost households an additional $3 to $24 per year on their electric bills, although the RGGI governors expect new technology and energy efficiency to reduce rates.
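The economic logic of cap and trade is that credits flow to whoever can abate most cheaply. Here is a minimal two-firm sketch; every number in it is invented for illustration:

```python
# Minimal cap-and-trade sketch: a fixed cap, two emitters with different
# marginal abatement costs, and credits traded to the low-cost abater.
firms = {
    "A": {"emissions": 70, "abatement_cost": 10},  # $/ton to cut CO2
    "B": {"emissions": 60, "abatement_cost": 40},
}
cap = 100  # total allowed tons

total = sum(f["emissions"] for f in firms.values())
required_cut = total - cap  # 30 tons must be abated somewhere

# Cheapest-first abatement: firm A does all the cutting and sells its
# spare credits to B at any price between $10 and $40, so both gain.
cheapest = min(firms, key=lambda k: firms[k]["abatement_cost"])
cost_with_trading = required_cut * firms[cheapest]["abatement_cost"]
cost_without = required_cut / 2 * sum(f["abatement_cost"] for f in firms.values())
print(f"Abatement cost with trading:    ${cost_with_trading}")
print(f"If each firm cut 15 tons alone: ${int(cost_without)}")
```

The cap fixes total emissions either way; trading just lets the region hit the same cap at a fraction of the cost of uniform cuts, which is why the projected household impact can stay in the $3 to $24 range.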

As I've previously argued, for electric power our three choices are coal at its current level of dirtiness, nuclear, or higher prices. The New England states (which already have higher prices) have opted for still higher prices. Keep in mind that New England is more densely populated than most of the US. So in order to achieve the same level of air quality as the plains or Rockies states New England has to impose tougher and more expensive emissions regulations.

Though in this case the New England states are pursuing emissions reduction for a gas that mainly affects temperature (and in a way that would make winters less severe). The New Englanders would benefit more by incurring the same level of additional costs to cut particulate, mercury, and other pollutants that have more direct health impacts. While a CO2 reduction will reduce some other pollutants as well the same amount of money could reduce those other pollutants much more if CO2 reduction is not part of the equation.

As for increased energy efficiency: The New England governors would need to increase electric bill costs by much more than $24 per year to get people to install a lot of more energy efficient devices. But the best way for New England to reduce energy usage would be to spend on reducing the use of fossil fuels for heating. Better insulation and solar heating would provide the best bang for the buck.

We ought to solve the potential CO2 emissions problem in the longer term by obsolescing fossil fuels. If one is to believe Ray Kurzweil (see The Singularity Is Near: When Humans Transcend Biology) then by the 2020s artificial intelligence will solve the nanotech problems and solar power will provide us with all the power we need. If Ray is right (and he might well be) then we ought to focus environmental efforts on improving the air quality for breathability so we can avoid illness in the shorter run. Then let rapidly advancing technology obsolesce oil long before the temperatures rise by much.

If Ray is right then the New Englanders have their environmental priorities wrong.

By Randall Parker 2005 December 26 11:03 PM  Energy Policy
Entry Permalink | Comments(22)
2005 December 25 Sunday
Intelligence Correlates With Brain Size

These results are consistent with lots of other studies which found a positive correlation between brain size and intelligence. Bigger is better.

Brain size matters for intellectual ability and bigger is better, McMaster University researchers have found.

The study, led by neuroscientist Sandra Witelson, a professor in the Michael G. DeGroote School of Medicine, and published in the December issue of the journal Brain, has provided some of the clearest evidence on the underlying basis of differences in intelligence.

The study involved testing of intelligence in 100 neurologically normal, terminally ill volunteers, who agreed that their brains be measured after death.

It found bigger is better, but there are differences between women and men.

In women, verbal intelligence was clearly correlated with brain size, accounting for 36 percent of the verbal IQ score. In men, this was true for right-handers only, indicating that brain asymmetry is a factor in men.

Spatial intelligence was also correlated with brain size in women, but less strongly. In men, spatial ability was not related to overall brain size. These results suggest that women may use verbal strategies in spatial thinking, but that in men, verbal and spatial thinking are more distinct.
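The "accounting for 36 percent of the verbal IQ score" figure is variance explained, i.e. the square of the correlation coefficient (r² = 0.36 implies r = 0.6). A minimal sketch of how that statistic is computed, using made-up brain volume and IQ numbers rather than the study's data:

```python
import numpy as np

# Hypothetical brain volumes (cc) and verbal IQ scores -- illustrative
# simulated numbers only, not data from the Witelson study.
rng = np.random.default_rng(0)
volume = rng.normal(1250, 100, size=100)          # brain volume in cc
iq = 0.06 * volume + rng.normal(0, 8, size=100)   # IQ partly driven by volume

r = np.corrcoef(volume, iq)[0, 1]   # Pearson correlation coefficient
variance_explained = r ** 2         # fraction of IQ variance "accounted for"

print(f"r = {r:.2f}, r^2 = {variance_explained:.2f}")
```

With the simulated effect size above, r comes out around 0.6, comparable to the reported 36 percent of variance explained.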

How hard is it to measure gray versus white matter volumes in dead brains? My guess from other reading is that spatial reasoning abilities will vary more as a function of gray matter volume and perhaps more for gray matter in some areas of the brain than other areas.

It may be that the size or structure of the localized brain regions which underlie spatial skills in men is related to spatial intelligence, as was shown in previous research in Witelson's lab on the brain of Albert Einstein.

In a further sex difference, brain size decreased with age in men over the age span of 25 to 80 years, but age hardly affected brain size in women. It is not known what protective factors, which could be genetic, hormonal or environmental, operate in women.

It remains to be determined what the contribution of nature and nurture are to this cerebral size relationship with intelligence, Witelson said. She added that the results point to the need for responsibility in considering the likely future use of magnetic imaging (or MRIs) of brain structure as a measure of ability in student and workforce settings.

"We're going to need to be careful if, in the future, we use MRI brain scans as a measure of ability in any selection process," she said.

That brain size should correlate with intelligence strikes me as unsurprising on a number of grounds. The brain burns a disproportionate amount of fuel for its size. The brain is expensive for the body. Why should it be bigger unless being bigger provides some Darwinian fitness benefit? For larger brains to offer no benefit the people with smaller brains would need to have mutations that allow them to process just as much information but in a smaller space. But then why wouldn't such mutations sweep through a population?

I'm expecting we'll see the development of far more accurate ways to measure intelligence using MRI and other physical measures.

By Randall Parker 2005 December 25 10:41 PM  Brain Intelligence
Entry Permalink | Comments(5)
2005 December 22 Thursday
Venture Capital Flows Into Solar Power Companies

Venture capital for solar energy has more than doubled in the last year.

In the first three quarters of this year, U.S. venture-capital firms funneled $67.7 million into the solar-energy sector, up from $31.4 million for all of 2004, according to the National Venture Capital Association, an Arlington, Va., trade group.

That's more than 30 times the amount invested 10 years ago and presents more evidence that record-high energy prices have incited a monumental push for cheaper forms of energy.

The NVCA says solar investments for the first three quarters of 2005 represented more than a third of the $194.6 million invested by venture-capital firms in the entire U.S. energy industry.

The article also reports revenues for solar energy sales have grown 50% in the last year. However, a venture capitalist quoted in the article says few solar start-ups are near to getting new products to market and most start-ups are basically doing science projects hoping for a breakthrough. So while the higher oil and natural gas prices have stimulated demand for solar equipment no big price breakthroughs resulting from technological advances are in sight.

I expect the price of oil to stay high enough in future years to maintain higher levels of investments in alternative energy technologies. But one can only guess when investments will finally pay off with price competitive alternatives to fossil fuels.

While solar electric gets the bulk of the attention in the popular press it remains too costly while solar heating has a rapid return on investment.

The cost of solar repels many homeowners. Solar panels, storage batteries and installation start at about $20,000 for modest dwellings, and $25,000 to $30,000 for three-bedroom homes. It’s more depending on the direction the house faces and the duration of sunshine in winter months, and the battery systems require monthly monitoring and maintenance.

The first step is making homes more energy-efficient with extra insulation and fluorescent lighting, Deri said.

Deri also recommends systems that only supplement existing systems - solar water heaters, or solar air systems - which cost from $1,500 to $5,000.

"They pay for themselves in three to six years," he said.

The biomass story is similar to the solar story: Biomass and solar are both more competitive for heating than they are for electric generation or transportation. Yet electricity and cars get more attention than space heating. A pity that. A lot more people ought to be using corn and solar to heat their houses and commercial buildings.

By Randall Parker 2005 December 22 10:33 PM  Energy Policy
Entry Permalink | Comments(14)
2005 December 20 Tuesday
More Evidence For Fructose Obesity Link

Fructose fruit sugar is not a harmless substitute for glucose.

University of Florida researchers have identified one possible reason for rising obesity rates, and it all starts with fructose, found in fruit, honey, table sugar and other sweeteners, and in many processed foods.

Fructose may trick you into thinking you are hungrier than you should be, say the scientists, whose studies in animals have revealed its role in a biochemical chain reaction that triggers weight gain and other features of metabolic syndrome - the main precursor to type 2 diabetes. In related research, they also prevented rats from packing on the pounds by interrupting the way their bodies processed this simple sugar, even when the animals continued to consume it.

The findings, reported in the December issue of Nature Clinical Practice Nephrology and in this month's online edition of the American Journal of Physiology-Renal Physiology, add to growing evidence implicating fructose in the obesity epidemic and could influence future dietary guidelines. UF researchers are now studying whether the same mechanism is involved in people.

"There may be more than just the common concept that the reason a person gets fat is because they eat too many calories and they don't do enough exercise," said Richard J. Johnson, M.D., the J. Robert Cade professor of nephrology and chief of nephrology, hypertension and transplantation at UF's College of Medicine. "And although genetic predispositions are obviously important, there's some major environmental force driving this process. Our data suggest certain foods and, in particular, fructose, may actually speed the process for a person to become obese."

Physical inactivity, increased caloric intake and consumption of high-fat foods undoubtedly account for part of the problem, Johnson said. But Americans are feasting on more fructose than ever. It's in soft drinks, jellies, pastries, ketchup and table sugar, among other foods, and is the key component in high fructose corn syrup, a sugar substitute introduced in the early 1970s.

Since then, fructose intake has soared more than 30 percent, and the number of people with metabolic syndrome has more than doubled worldwide, to more than 55 million in the United States alone, Johnson said. The condition, characterized by insulin resistance, obesity and elevated triglyceride levels in the blood, is linked to the development of type 2 diabetes and hypertension.

"If you feed fructose to animals they rapidly become obese, with all features of the metabolic syndrome, so there is this strong causal link," Johnson said, "And a high-fructose intake has been shown to induce certain features of the metabolic syndrome pretty rapidly in people."

An increase in uric acid is involved in creating the negative consequences from fructose consumption.

Now UF research implicates a rise in uric acid in the bloodstream that occurs after fructose is consumed, Johnson said. That temporary spike blocks the action of insulin, which typically regulates how body cells use and store sugar and other food nutrients for energy. If uric acid levels are frequently elevated, over time features of metabolic syndrome may develop, including high blood pressure, obesity and elevated blood cholesterol levels.

Researchers from UF and the Baylor College of Medicine studied rats fed a high-fructose diet for 10 weeks. Compared with rats fed a control diet, those on the high-fructose diet experienced a rise in uric acid in the bloodstream and developed insulin resistance.

Blocking of uric acid prevented the weight and blood sugar problems.

"When we blocked or lowered uric acid, we were able to largely prevent or reverse features of the metabolic syndrome," Johnson said. "We were able to significantly reduce weight gain, we were able to significantly reduce the rise in the triglycerides in the blood, the insulin resistance was less and the blood pressure fell."

UF researchers are now studying the uric acid pathway in cell cultures in the laboratory, in animals and in people, and are also eyeing it as a possible factor in the development of cardiovascular and kidney diseases because of its effects on blood vessel responses. They are conducting a National Institutes of Health-funded trial to determine if lowering uric acid in blacks with hypertension improves blood pressure control and are collaborating with scientists at Baylor to determine if lowering uric acid will reduce blood pressure in adolescents with hypertension.

"We cannot definitively state that fructose is driving the obesity epidemic," said Johnson. "But we can say that there is evidence supporting the possibility that it could have a contributory role - if not a major role. I think in the next few years we'll have a better feel for whether or not these pathways that can be shown in animals may be relevant to the human condition."

Findings to date suggest certain sugar carbohydrates are actually better than others, he added, because some do not activate the uric acid pathway.

I love it when a scientific discovery suggests obvious practical ways to make use of the new information.

"It may well be we don't need to cut out carbohydrates but just certain types of carbohydrates," Johnson said. "So this may be an alternative to the Atkins type of approach, which cuts out carbohydrates indiscriminately."

As scientists learn more about the pathway, Johnson said, and as studies are completed in people, the findings may influence how to make wise choices about the foods we eat.

"With the caveat that people are different from rodents in many ways, the link between urate levels, blood pressure elevation and insulin resistance demonstrated in rats fed fructose is extremely provocative," said Brian F. Mandell, M.D., Ph.D., vice chairman of medicine for education and a professor of medicine at the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University. "Whether the fructose supplementation to the diet in the United States is partially responsible for the 'epidemic' of obesity remains to be proven - but this is an association which can be tested, and the work of Dr. Johnson and his collaborators makes the evaluation of the fructose-metabolic link in people an academic and public health imperative."

So how can one keep uric acid levels down? Anyone know?

Also see my previous post "Fructose Consumption May Lead To Obesity".

By Randall Parker 2005 December 20 10:11 PM  Brain Appetite
Entry Permalink | Comments(12)
2005 December 18 Sunday
Vitamin D Crucial For Long Term Lung Health

Yet another argument for getting plenty of vitamin D:

Vitamin D may play a role in keeping our lungs healthy, with greater concentrations of vitamin D resulting in greater lung health benefits. A study in the December issue of CHEST, the peer-reviewed journal of the American College of Chest Physicians (ACCP), shows that patients with higher concentrations of vitamin D had significantly better lung function, compared with patients with lower concentrations of vitamin D.

"Low levels of vitamin D have been associated with osteoporosis, hypertension, diabetes, and cancer," said lead author Peter Black, MB, ChB, Department of Medicine, University of Auckland, Auckland, New Zealand. "Our research shows that vitamin D may also have a strong influence on lung health, with greater levels of vitamin D associated with greater and more positive effects on lung function."

Researchers from the University of Auckland examined the relationship between vitamin D and lung function using participants from the United States National Health and Nutrition Examination Survey (NHANES III) carried out during 1988 to 1994. The study included 14,091 people aged >= 20 years, who were interviewed at mobile examination centers, had spirometry performed, and had serum 25-hydroxyvitamin D measured. Vitamin D measurements were divided into five groups (quintiles), ranging from more than 85.7 nmol/L to less than 40.4 nmol/L. After adjusting for gender, age, ethnicity, body mass index, and smoking status, the differences between the lowest quintile of vitamin D and the next quintile were 79 mL for FEV1 and 71 mL for FVC. In comparison, the differences between the highest and lowest quintiles of vitamin D were 126 mL for FEV1 and 172 mL for FVC. With further adjustment for physical activity, intake of vitamin D supplements and milk, and antioxidant level, the difference between the highest and lowest quintiles of vitamin D also was significant at 106 mL for FEV1 and 142 mL for FVC. In addition, an association between vitamin D and FEV1 was seen in non-Hispanic whites and blacks and was greater for those over 60 years and current or former smokers.
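The analysis described above sorts subjects into quintiles of serum vitamin D and compares mean lung function between the top and bottom fifths. A sketch of that comparison on simulated data (the values and effect size are invented for illustration, not NHANES III numbers):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000
vit_d = rng.normal(60, 20, n)                       # serum 25-OH vitamin D, nmol/L
fev1 = 3000 + 2.0 * vit_d + rng.normal(0, 300, n)   # FEV1 in mL, partly tied to vitamin D

# Assign each subject to a vitamin D quintile (0 = lowest fifth, 4 = highest).
quintile = np.digitize(vit_d, np.quantile(vit_d, [0.2, 0.4, 0.6, 0.8]))

low_mean = fev1[quintile == 0].mean()
high_mean = fev1[quintile == 4].mean()
print(f"FEV1 difference, highest vs lowest quintile: {high_mean - low_mean:.0f} mL")
```

The real study then adjusts those quintile contrasts for covariates (age, BMI, smoking, and so on), which this sketch omits.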

Here is the most amazing part.

"The difference in lung function between the highest and lowest quintiles of vitamin D is substantial and greater than the difference between former and nonsmokers," said Dr. Black. "Although there is a definite relationship between lung function and vitamin D, it is unclear if increases in vitamin D through supplements or dietary intake will actually improve lung function in patients with chronic respiratory diseases."

Overall, male gender, younger age, white ethnicity, nonsmoking status, and regular, vigorous physical activity were associated with the highest lung function. Vitamin D was higher in men than women, was inversely related to BMI, and declined with age. Vitamin D also was lower in non-Hispanic blacks and Mexican-Americans, compared with non-Hispanic whites, and it was lower in participants smoking more than 20 cigarettes a day compared with nonsmokers.

"Vitamin D would be a relatively simple, low-cost intervention that would likely have high compliance to prevent or slow loss of lung function in susceptible subgroups. However, further studies examining the relationship between vitamin D and lung function are warranted to identify who may benefit from such an intervention," said author of the study's corresponding editorial Rosalind Wright, MD, MPH, Channing Laboratory, Department of Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, MA.

"Chronic lung conditions compromise quality of life for millions of people in the United States and around the world," said W. Michael Alberts, MD, FCCP, President of the American College of Chest Physicians. "By understanding the effect that vitamins have on lung function, we may be able to identify new and more effective treatments for these debilitating diseases."

People working indoors do not get the sun exposure that their ancestors got as farmers and manual laborers. Vitamin D deficiency is a common problem.

Also see my previous posts Vitamin D Could Decrease Overall Cancer Risk 30% and Higher Vitamin D Reduces Aging Bone Fracture Risks.

By Randall Parker 2005 December 18 08:05 PM  Aging Diet Studies
Entry Permalink | Comments(5)
PDYN Brain Gene Modified During Primate Evolution

The set of genes known to have been under selective pressure during primate evolution has gained another member: Prodynorphin. Regulatory regions for prodynorphin, an important brain gene, have been under natural selective pressure during evolution from lower primates to humans.

Durham, N.C. -- Researchers have discovered the first brain regulatory gene that shows clear evidence of evolution from lower primates to humans. They said the evolution of humans might well have depended in part on hyperactivation of the gene, called prodynorphin (PDYN), that plays critical roles in regulating perception, behavior and memory.

They reported that, compared to lower primates, humans possess a distinctive variant in a regulatory segment of the prodynorphin gene, which is a precursor molecule for a range of regulatory proteins called "neuropeptides." This variant increases the amount of prodynorphin produced in the brain.

While the researchers do not understand the physiological implications of the activated PDYN gene in humans, they said their finding offers an important and intriguing piece of a puzzle of the mechanism by which humans evolved from lower primates.

They also said that the discovery of this first evolutionarily selected gene is likely only the beginning of a new pathway of exploring how the pressure of natural selection influenced evolution of other genes.

They also said their finding demonstrates how evolution can act more efficiently to alter the regulatory segments, or "promoters," that determine genes' activity, rather than on the gene segment that determines the structure of the protein it produces. Such regulatory alteration, they said, can more readily generate variability than the hit-or-miss mutations that alter protein structure and function.

I think they are exaggerating to call PDYN "this first evolutionarily selected gene". See my discussion below of Microcephalin, ASPM, and the Ashkenazi Jewish genetic disease genes such as the sphingolipid pathway genes.

Prodynorphin is involved in many brain functions.

The researchers published their findings in an article in the December 2005 issue of the Public Library of Science. They were Gregory Wray and David Goldstein of Duke University; Matthew Rockman of Princeton University; Matthew Hahn of Indiana University; Nicole Soranzo of University College London; and Fritz Zimprich of the Medical University of Vienna in Austria. The research was sponsored by the National Science Foundation and NASA.

"We focused on the prodynorphin gene because it has been shown to play a central role in so many interesting processes in the brain," said Wray. "These include a person's sense of how well they feel about themselves, their memory and their perception of pain. And it's known that people who don't make enough of prodynorphin are vulnerable to drug addiction, schizophrenia, bipolar disorders and a form of epilepsy. So, we reasoned that humans might uniquely need to make more of this substance, perhaps because our brains are bigger, or because they function differently.

Note how the study of gene sequence variations from an evolutionary perspective allows scientists to find what areas most likely changed to make human minds different than the minds of other primates. The theory of evolution is not just an explanation of ancient events. Genetic models of evolution help in doing practical research in how the brain works.

"Also importantly, the part of the gene that produces the prodynorphin protein shows no variation within humans, or even between humans and any of the great apes," said Wray, who is a professor of biology. "So, if we found any variation in this gene due to evolution, it was likely to be in its regulation. And our premise is that the easiest way to generate evolutionary change is to alter regulation."

In their studies, the researchers analyzed the sequence structure of the PDYN promoter segment in humans and in seven species of non-human primates -- chimpanzees, bonobos, gorillas, orangutans, baboons, pig-tailed macaques and rhesus monkeys. They found significant mutational changes in the regulatory sequence leading to humans that indicated preservation due to positive evolutionary selection. They also found an "evolution-by-association," in which sequences near the regulatory segment showed greater mutational change -- as if they were "dragged along" with the evolving regulatory sequence.

The identification of areas of the genome which were under active selective pressure in evolution helps the search for genetic sequence variations which boost intelligence in the smarter folks among us. Areas of brain genes which are shown to be under recent selective pressure are the areas most likely to contain genetic variations that account for the huge range of intellectual ability found in the human population.

The report above about PDYN reminds me of previous reports about the brain gene ASPM. First Bruce Lahn of the University of Chicago found that ASPM underwent extended selective pressure and change in the primates. Then Lahn showed that ASPM and Microcephalin have been under strong selective pressure in recent human history and the genetic variations recently selected for were not selected for equally in all human populations. PDYN is basically at the research stage that ASPM was at before Lahn compared large numbers of humans for their ASPM variations. The next logical step with PDYN would be to compare regulatory regions for that gene in different human populations to see if they differ in their frequency of different genetic variations.

ASPM, PDYN, and Microcephalin make excellent candidates for genes whose variations cause differences in levels of intelligence. What we need is a massive study of people who would get IQ tested and also get genetically tested for which variations they have for these genes and for the regulatory regions for these genes. A number of other genes would also make good candidates for inclusion in such a study. As Greg Cochran, Henry Harpending, and Jason Hardy have recently demonstrated, the genes which cause Jewish genetic diseases are also excellent candidates for comparisons of genetic sequences and IQ levels.

The Duke University researchers in the report at the top of this post have 250 more genes active in the brain that they are going to examine for signs of natural selective pressure during brain evolution. My guess is that in the next 5 years the evolutionary approach to brain gene study is going to lead to the identification of many genetic variations that cause differences in intelligence. The work could go much faster if the genetic basis for IQ differences were not so politically taboo. But enough excellent work is getting done in this area that I'm hopeful about some major discoveries in spite of the taboo.

The discovery of genetic variations which boost IQ will lay the groundwork for attempts to boost human intelligence. People will use genetic tests to choose mates and choose egg and sperm donors to get smarter offspring. Also, the knowledge that up and down regulation of specific genes affects intelligence levels will lead to attempts to develop drugs which change the regulation of those genes in hopes of boosting intelligence. So the search for signs of selective pressures on brain genes will lead to IQ boosts that will eventually cause revolutionary changes in human societies.

By Randall Parker 2005 December 18 11:24 AM  Brain Evolution
Entry Permalink | Comments(0)
2005 December 17 Saturday
Total World Dementia Seen Tripling By 2040

A group of researchers from Britain, Australia, Brazil, the United States, China, Japan, and Sweden has published a report in the British medical journal The Lancet arguing that barring advances in treatment the number of people in the world suffering dementia due to aging will more than triple by the year 2040. (requires free registration)

We have generated expert consensus estimates of age-specific dementia prevalence for different world regions using the Delphi technique. We estimate that 24 million people have dementia today and that this amount will double every 20 years to 42 million by 2020 and 81 million by 2040, assuming no changes in mortality, and no effective prevention strategies or curative treatments. Of those with dementia, 60% live in developing countries, with this number rising to 71% by 2040. The rate of increase in numbers of people with dementia is predicted to be three to four times higher in developing areas than in developed regions.
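Taking the report's "doubles every 20 years" rule literally gives simple exponential growth. A sketch of that projection (the 24 million baseline comes from the quote; note the rule only approximately reproduces the report's 42 and 81 million figures, since their baseline year precedes publication):

```python
def dementia_projection(baseline_millions: float, years: float,
                        doubling_time: float = 20.0) -> float:
    """Project prevalence assuming it doubles every `doubling_time` years."""
    return baseline_millions * 2 ** (years / doubling_time)

# Starting from the quoted 24 million with dementia today:
for years in (20, 40):
    print(years, "years out:", round(dementia_projection(24, years)), "million")
```

Strict doubling from 24 million yields 48 and 96 million at 20 and 40 years, a bit above the report's 42 and 81 million, which is why the authors' figures imply growth slightly slower than a clean doubling.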

Obviously, lots of advances in medical treatments will occur in the interim. Some advances will increase longevity by keeping old bodies alive longer. Those sorts of advances will increase the number of people with dementia by allowing more people to live to an age where their brains fail. On the other hand, medical advances that prevent Alzheimer's Disease and other causes of dementia will surely be developed as well.

Prevention of brain aging is much harder than rejuvenation of the rest of the body. The reason for this is simple: We will develop ways to grow and build replacement parts for most of the body. But our brains hold our identities. We can't get a brain replaced with a younger brain without replacing ourselves with a different person. Now, maybe some day nanotechnological methods will allow us to replicate our memories in another brain and that new brain will think it is us. Though I would not view a copy of me as being me. But given such advanced technologies why not instead apply those nano-devices to fully repair the brain we already have?

The costs of millions of demented people are enormous. People with early onset Alzheimer's are lost from the workforce. Regardless of age of onset, the costs of caring for each patient are high because the patients gradually lose the ability to care for themselves. Both families and governments shoulder large portions of the costs. The burden per working person is rising as the average age of populations rises. Taxes will go up in all the developed countries in the next decade and levels of service will simultaneously be cut in order to pay for the growing population of old folks.

These costs of caring for the demented and of old people suffering from other maladies are a strong argument for a huge increase of government funding for research to develop rejuvenation therapies (what Aubrey de Grey calls Strategies for Engineered Negligible Senescence or SENS). Once developed such therapies will become far cheaper to administer than the costs of caring for an aging population. People who are too worn out to work will, once rejuvenated, be able to return to work. Many will once again become net payers of taxes rather than net recipients of taxes paid by younger workers.

Brain rejuvenation combined with technologies to boost cognitive function will cause an enormous increase in average human productivity. The increases in human productivity will pay back the costs of medical research many times over.

We are going to pay for the aging population one way or another. I prefer to pay for it by solving the underlying problem: reverse aging. That way of paying for it requires larger government expenditures in the short to medium run but will avoid much larger government expenditures in the long run while simultaneously allowing us to become young again.

By Randall Parker 2005 December 17 12:52 PM  Aging Population Problems
Entry Permalink | Comments(5)
Dietary Fiber Does Not Lower Colorectal Cancer Risk

The role of dietary fiber as a preventive against colon cancer remains unproven.

In an analysis combining data from 13 studies, high intake of dietary fiber was not associated with reduced risk of colorectal cancer, according to a study in the December 14 issue of JAMA.

Dietary fiber has been hypothesized to reduce the risk of colorectal cancer, according to background information in the article. However, the results of numerous epidemiological studies have been inconsistent. Ecological correlation studies and many case-control studies have found an inverse association between dietary fiber intake and risk of colorectal cancer. But most prospective cohort studies have found no association between dietary fiber intake and risk of colorectal cancer or adenomas (precursors of colorectal cancer), and randomized clinical trials of dietary fiber supplementation have failed to show reductions in the recurrence of colorectal adenomas.

Yikyung Park, Sc.D., formerly of the Harvard School of Public Health, Boston, and colleagues evaluated the association between dietary fiber intake and risk of colorectal cancer by reanalyzing the primary data from 13 prospective cohort studies (Pooling Project of Prospective Studies of Diet and Cancer). The pooled analysis included 725,628 men and women who were followed-up for 6 to 20 years across studies.

During the follow-up, 8,081 colorectal cancer cases were identified. Among the studies, median (midpoint) energy-adjusted dietary fiber intake ranged from 14 to 28 g/d in men and from 13 to 24 g/d in women. The major source of dietary fiber varied across studies with cereals as a major contributor to dietary fiber intake in the European studies, and fruits and vegetables as the main sources in the North American studies.

Note the factors here that influenced risk. These are the things you want to get more or less of to lower your risk.

In the age-adjusted model, dietary fiber intake was significantly associated with a 16 percent lower risk of colorectal cancer in the highest quintile compared with the lowest. This association was attenuated slightly but still remained statistically significant after adjusting for nondietary risk factors, multivitamin use, and total energy intake. Additional adjustment for dietary folate intake further weakened the association. In the final model, which further adjusted for other dietary factors, such as red meat, total milk, and alcohol intake, only a nonsignificant weak inverse association was found. Fiber intake from cereals, fruits, and vegetables was not associated with risk of colorectal cancer.

The factors that weakened the association between fiber and risk are the factors that probably really do affect risk. So, for example, eat less red meat and drink less milk. (Did alcohol intake reduce or increase the risk of colon cancer?) Probably the fruits and vegetables decreased risk due to other compounds (notably vitamins, minerals, and anti-carcinogens) and not due to fiber. The fiber just happened to always be there.
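The pattern the JAMA analysis describes, an association that shrinks once a correlated dietary factor like folate enters the model, is classic confounding. A toy simulation (variable names and effect sizes are invented) showing how a fiber "effect" can vanish after adjustment:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000
folate = rng.normal(0, 1, n)
fiber = 0.8 * folate + rng.normal(0, 0.6, n)   # fiber intake tracks folate intake
risk = 1.0 * folate + rng.normal(0, 1, n)      # outcome driven by folate only

# Unadjusted model: risk ~ fiber (intercept plus fiber term)
X1 = np.column_stack([np.ones(n), fiber])
b_unadj = np.linalg.lstsq(X1, risk, rcond=None)[0][1]

# Adjusted model: risk ~ fiber + folate
X2 = np.column_stack([np.ones(n), fiber, folate])
b_adj = np.linalg.lstsq(X2, risk, rcond=None)[0][1]

print(f"fiber coefficient: unadjusted {b_unadj:.2f}, adjusted {b_adj:.2f}")
```

In this setup fiber has no causal effect at all, yet the unadjusted coefficient is strongly positive; adding folate to the regression drives it toward zero, mirroring how the pooled analysis's fiber association faded under covariate adjustment.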

One theory on why fiber should lower colon cancer risk is that it dilutes toxins and speeds the passage of toxins through the intestinal tract. Maybe adjusting for red meat consumption factors out the value of eating the fiber. If a person does eat a lot of red meat then maybe fiber really does protect against toxins formed in the intestines from red meat. Though I'm speculating.

But consumption of whole plant foods like vegetables that are high in dietary fiber is associated with lower risks of heart disease and diabetes. So you could eat high fiber fruits and vegetables anyway.

"The association between dietary fiber intake and risk of colorectal cancer has been inconsistent among observational studies and several factors may explain the disparity: potential biases in each study, the failure to adjust for covariates in the multivariate models, and the range of dietary fiber intake," the authors write.

"In conclusion, we did not find support for a linear inverse association between dietary fiber intake and risk of colorectal cancer in a pooled analysis of 13 prospective cohort studies. Although high dietary fiber intake may not have a major effect on the risk of colorectal cancer, a diet high in dietary fiber from whole plant foods can be advised because this has been related to lower risks of other chronic conditions such as heart disease and diabetes," the researchers write.

Just eating a high fiber food extract is probably a waste of time. But vegetables continue to be good for you. Most people (myself included) continue to not eat enough veggies.

By Randall Parker 2005 December 17 09:27 AM  Aging Diet Cancer Studies
Entry Permalink | Comments(4)
2005 December 14 Wednesday
N Acetyl Cysteine Might Reduce Cocaine Cravings

All still very preliminary.

"Cocaine is highly addictive and can have devastating effects on the health and well being of users," says lead researcher Peter Kalivas, Ph.D., Professor and Chair of the Department of Neurosciences at the Medical University of South Carolina (MUSC). "The discovery that a readily available herbal supplement can reduce the intense cravings associated with cocaine use is an important finding for individuals undergoing treatment for cocaine addiction. Reduced craving might help addicted individuals restrain from abusing cocaine."

In the first phase of the study, Dr. Kalivas and the research team conditioned rats on a regimen of cocaine to establish their addiction. The rats in the treatment group were then treated with NAC. After treatment, the cocaine-addicted rats exposed to NAC were significantly less likely to seek out cocaine than those without NAC. Those treated with NAC ceased to actively seek cocaine, but showed normal food-seeking behaviors.

In the second phase of the study headed by Drs. Robert Malcolm, Hugh Myrick, Steve LaRowe, and Pascale Mardikian in the Department of Psychiatry at MUSC, NAC treatment was investigated in a small inpatient study (n=15) involving non-treatment seeking cocaine-dependent subjects. In this phase of research, subjects were asked to look at pictures that were either neutral (e.g., trees, boats) or cocaine-related (e.g., drug paraphernalia). Those individuals treated with NAC reported less craving for cocaine and spent less time looking at the cocaine-related pictures. In addition, when using a functional magnetic resonance imaging (fMRI) test, subjects treated with NAC had reduced brain activity in the prefrontal cortex, the area of the brain activated during cocaine craving and used to modulate the addictive behavior of chronic cocaine use. An open label trial, which was recently completed, indicated that cocaine-dependent patients could take NAC on an extended outpatient basis, with minimal side effects. More importantly, patients taking higher doses of NAC were more likely to complete the trial, providing further indication of the potential benefits of NAC.

"The potential to use NAC for the treatment of individuals addicted to cocaine is a major finding," emphasized Dr. Kalivas. "For those individuals who have the desire to end their addictive habit, a NAC supplement might help to control their cravings."

A larger clinical trial that will follow 282 cocaine-dependent individuals has just begun in order to further understand and corroborate how NAC works in the brain to reduce cocaine craving. Dr. Kalivas stresses that while the initial findings are very promising, the widespread use of NAC in cocaine treatment is not advised until larger scale studies are complete.

The number of humans involved (15) is too small to know for sure. The bigger study they have begun will provide a more definitive answer.

By Randall Parker 2005 December 14 07:15 AM  Brain Addiction
Entry Permalink | Comments(6)
2005 December 11 Sunday
Leucine Stops Muscle Breakdown In Aging Rats

Don't want to become less muscular as you age? Here is news you can probably use.

Muscle in adults is constantly being built and broken down. As young adults we keep the two processes in balance, but when we age breakdown starts to win. However, adding the amino acid leucine to the diet of old individuals can set things straight again. This is the finding of research performed by Lydie Combaret, Dominique Dardevet and colleagues at the Human Nutrition Research Centre of Auvergne, INRA, Clermont-Ferrand, France.

After the age of 40, humans start losing muscle at around 0.5–2% per year. Immediately after a meal, degradation of protein slows down and synthesis doubles. This process is triggered by the arrival of a plentiful supply of amino acids. In older animals this stimulus is less effective; synthesis slows down, and previous work also suggests that breakdown may be affected. While adding leucine to the diet restores protein building, there was no knowledge about this supplement's effect on breakdown.
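To get a feel for what that quoted 0.5–2% annual loss means over decades, here is a quick compounding sketch (the 30-year horizon and the assumption that the rate compounds steadily are my own illustrative choices, not claims from the study):

```python
def remaining_muscle(annual_loss_rate, years):
    """Fraction of muscle mass left after compounding an annual loss rate."""
    return (1 - annual_loss_rate) ** years

# From age 40 to age 70 (30 years), at the low and high ends of the quoted range:
low = remaining_muscle(0.005, 30)   # 0.5% per year -> about 86% remaining
high = remaining_muscle(0.02, 30)   # 2.0% per year -> about 55% remaining
print(f"0.5%/yr: {low:.0%} remaining, 2.0%/yr: {high:.0%} remaining")
```

Even the low end of the range adds up to a noticeable loss by age 70; the high end cuts muscle mass nearly in half.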

To address this, researchers compared protein breakdown in young (8-month) and old (22-month) rats. They discovered that the slow down in degradation that normally follows a meal does not occur in old animals, so there is excessive breakdown. But adding leucine to the diet restored a balanced metabolism.

The team of researchers believe that the age-related problem results from defective inhibition of ubiquitin-proteasome dependent proteolysis, a complex degradative machinery that breaks down contractile muscle protein, and that leucine supplementation can fully restore correct function.

"Preventing muscle wasting is a major socio-economic and public health issue, that we may be able to combat with a leucine-rich diet," says senior co-author Didier Attaix.

Commenting on the work Michael Rennie from the University of Nottingham Medical School at Derby says: "This is exciting because it strengthens the idea of a co-ordinated linkage between the meal-related stimulation of protein synthesis and the inhibition of breakdown."

Here for the convenience of American readers is a Froogle Google search on Leucine.

No, I do not know what would be an appropriate human daily dose. Does any reader have an informed basis for estimating a reasonable daily leucine dose?

By Randall Parker 2005 December 11 09:12 AM  Aging Diet Studies
Entry Permalink | Comments(7)
2005 December 10 Saturday
Electric Power Plant Operators Looking Hard At Nuclear

Electric power plant operators see nuclear power in their plans for the year 2015 and beyond.

COLUMBIA, Mo. - Ameren Corp. is considering building a second nuclear power unit in Callaway County, although the idea is only in the discussion phase, said Gary Rainwater, chief executive officer of Ameren.

Rainwater expects such a plant to take 5 years to construct and to go online in 2017 at the earliest. He says on paper nuclear power looks like the best choice.

Rainwater said the Electric Power Research Institute projects that tighter EPA carbon dioxide emission rules could double the cost of burning coal to more than 8 cents a kilowatt hour. The cost of nuclear power, meanwhile, would remain flat at about 4.5 cents per kilowatt hour.

Think about it from the perspective of Rainwater and other CEOs of electric power plant operators. Right now the price gap between coal and nuclear is pretty small. On the one hand fear of nuclear elicits much more public opposition. But coal puts out much more pollution in the form of particulates, oxides of sulfur and nitrogen, mercury, and carbon dioxide. If electric power plant operators build more coal plants and emissions regulations get tougher then they'll be stuck with large additional costs. Then if competitors build nuclear plants the coal operators will get undersold in competitive electric power supply markets. A choice to build more coal plants means they could incur massive losses. But if they build nuclear plants then their risk of additional costs due to regulatory changes are much lower once the nuclear plants are in operation.
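As a rough sketch of what that projected price gap means for a single large plant (the 8 and 4.5 cents per kWh figures come from the EPRI projection quoted above; the 1,000 MW plant size and 90% capacity factor are my own illustrative assumptions):

```python
def annual_generation_kwh(capacity_mw, capacity_factor):
    """Annual kWh from a plant running at the given average capacity factor."""
    return capacity_mw * 1000 * capacity_factor * 8760  # 8760 hours per year

def annual_cost(capacity_mw, capacity_factor, cents_per_kwh):
    """Annual generation cost in dollars at a given cost per kWh."""
    return annual_generation_kwh(capacity_mw, capacity_factor) * cents_per_kwh / 100

coal = annual_cost(1000, 0.9, 8.0)     # coal under tightened CO2 rules
nuclear = annual_cost(1000, 0.9, 4.5)  # projected nuclear cost
print(f"Annual cost gap: ${coal - nuclear:,.0f}")  # about $276 million per year
```

A gap on that order of magnitude, multiplied across a fleet of plants and decades of operation, is exactly the regulatory risk a CEO has to weigh.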

I keep saying that we have three choices for more electric power in the foreseeable future:

A) More coal plants with more pollution.

B) Nuclear power.

C) Higher prices.

An opponent of nuclear power has to then choose between options A and C.

In California, legislators and regulators have clearly chosen option C (and utility bills in California already bear witness to this fact). California electric power generators have regulatory requirements to get more power from renewable sources.

The fundamental feature of California's new program is the requirement that all major utilities in the state buy 1 percent more renewable energy each year so that at least 20 percent of their total electric supply portfolio is made up of renewable generation by 2017. This requirement could result in procurement of up to an additional 21,000 GWh of renewable energy each year.

When you see news reports that California power generators are making deals to build wind farms and large Stirling solar electric generator facilities out in California deserts, this is not a sign that market costs for these power sources have fallen to equal conventional sources. To consider wind and solar competitive we have to assign high external costs to conventional sources. Those external costs do not show up in market prices, so perhaps Stirling solar and wind really are competitive in some situations once external costs are accounted for. It is hard to tell because external costs are hard to price (and I welcome links in the comments to sources for external costs of different power sources).

Wind doesn't work as an option in areas with little wind such as the American Southeast. Also, the sun obviously does not shine at night and shines less in the winter. In the longer run better battery technologies and long distance superconducting transmission technologies will eventually solve some of the problems that come from the inconsistent availability of wind and sun. But for the foreseeable future we still need electric power sources that can satisfy baseload demand growth. So in this era of high natural gas prices we end up coming back to nuclear or coal for electric power generation.
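Here is a rough sketch of why intermittency matters for baseload planning. The capacity factors below are my own ballpark assumptions (roughly 90% for nuclear, 30% for wind, 20% for solar), not figures from any source quoted here:

```python
def expected_annual_kwh(nameplate_mw, capacity_factor):
    """Expected yearly output given how often the source actually produces."""
    return nameplate_mw * 1000 * capacity_factor * 8760  # 8760 hours per year

# Three plants with identical 100 MW nameplate capacity:
for name, cf in [("nuclear", 0.90), ("wind", 0.30), ("solar", 0.20)]:
    print(f"{name}: {expected_annual_kwh(100, cf) / 1e6:,.0f} million kWh/yr")
```

Same nameplate capacity, very different delivered energy, and the wind and solar output arrives only when the weather cooperates rather than when demand peaks.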

Check out this table of electric power costs by sector and by state for the United States (and see here for more charts and tables from the US Department of Energy Electric Power Monthly report). California and the Northeast have high electric costs, as do outliers Alaska and Hawaii. The rest of the United States has much lower electric costs. While Californians pay over 12 cents per kilowatt hour (kwh) and New Englanders pay over 13 cents per kwh, almost all of the rest of the nation pays less than 9 cents per kwh and a few even pay less than 7 cents per kwh. Proximity to hydroelectric dams and coal fields accounts for some of the lower cost regions. But also lower population density states can tolerate higher emissions per kwh generated and still have cleaner air to breathe than the higher population density states.
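To translate those rates into household terms (the 13 and 9 cents per kWh rates come from the paragraph above; the 900 kWh per month usage figure is my own assumption for a typical household):

```python
def annual_bill(monthly_kwh, cents_per_kwh):
    """Yearly electric bill in dollars for a given monthly usage and rate."""
    return monthly_kwh * 12 * cents_per_kwh / 100

new_england = annual_bill(900, 13)  # $1,404 per year
cheap_state = annual_bill(900, 9)   # $972 per year
print(f"Difference: ${new_england - cheap_state:.0f} per year")  # $432 per year
```

A few hundred dollars a year per household is the scale of the price premium at stake in the choice between the three options above.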

So what do you prefer? Nuclear, coal with current emissions regulations, or higher prices?

Update: While some might protest, we really do face the three choices listed above. I'm not arguing that "higher prices" is a bad choice. Whether it is or not depends on what the external costs are for nuclear and for coal at currently allowed emissions levels. But we need to start off with a clear understanding that, before we consider external costs, the choices we face differ in price.

We face several higher priced choices:

C.a) Coal at varying levels of increased emissions controls. Prices would rise. Pollution levels would drop. Electric consumption would not grow as rapidly.

C.b) No more power plants. Increased demand would translate into higher prices. This would force people to adopt more conservation measures. It would also lower living standards.

C.c) Solar Stirling. Only works during the day. In theory we could use Solar Stirling so that coal plants could shut down during the day so that total emissions are reduced.

C.d) Wind. Similar to Solar Stirling in that it would cost more and not be available all the time. So we'd still need more coal (or nuclear) plants to provide reliable baseload supply. But the coal plants would not operate as much per plant on average.

C.e) Solar photovoltaics. More expensive than Solar Stirling or Wind. But allows local generation and use.

C.f) Lots of Liquefied Natural Gas terminals. These are widely opposed for safety reasons but could probably bring in enough natural gas to meet the increased electric demand with more natural gas electric plants. Domestic natural gas production is declining in the US and Britain. World natural gas demand is going to rise. Prices will rise as well.

If we choose a tough enough version of option C.a (in the extreme: zero emissions) while also disallowing new nuclear plant construction we will automatically get some amount of the other options as the price for coal electric rises.

The energy debate comes down to the question of how much are you willing to pay to avoid risks or external costs that you oppose?

Now, in the longer run we'll have technological advances that will reduce risks, external costs, and market prices for most of these options. Some day photovoltaics will be much cheaper and batteries will allow easier shifting of solar power from day to night. Advances in nuclear power plant technology will reduce waste problems, costs, and risks. Wind and Solar Stirling will advance and get cheaper as well. I have repeatedly argued for a sort of massive Manhattan Project to develop a large range of energy technologies, and far more formidable figures such as the recently deceased Nobel Laureate scientist Richard Smalley have made this argument as well. But right now we need more electric generator plants and we will need more next year and the year after that and so on. We are faced with today's choices and today's costs for each choice. So given today's choices, which do you choose and why? Do you favor higher priced choices?

Update II: One key question in the energy debate is just how big are the external costs for each energy source? How to measure these costs? How big is the uncertainty in those measurements?

Also, some oppose nuclear power on national security grounds (e.g. the possibility that terrorists could blow up a nuclear bomb next to a nuclear reactor and thereby release large amounts of nuclear materials in surrounding areas - I think terrorists would blow up NYC or DC first though). But fossil fuels have their own national security costs that don't get the attention I think they deserve. Saudi money has corrupted portions of the political class in Washington DC. Oil money funds the spread of Wahhabism and radical Islam. A portion of the US defense budget goes to US forces in the Middle East. How to measure those costs? They seem pretty big to me.

For a number of reasons I favor government funding of energy research aimed at developing energy sources that have much lower external costs. First off, new energy technologies that would be cheaper than fossil fuels would lower overall energy costs even at market prices. Plus, the cleaner technologies would lower the external costs that regular market failure and political market failure allow to happen. In addition, national security costs would be reduced for a number of reasons as less money flowed to Middle Eastern oil producers and political competition for influence over Middle Eastern oil greatly declined.

By Randall Parker 2005 December 10 08:53 PM  Energy Policy
Entry Permalink | Comments(47)
2005 December 08 Thursday
Low Carbohydrate And Low Fat Diets Compared On Cholesterol

Low carbohydrates and low fat diets produce different heart benefits.

DURHAM, N.C. -- People who followed a low-carbohydrate diet for six months raised their good cholesterol and lowered their triglycerides, changes that can help lower the risk of heart disease, Duke University Medical Center researchers found.

The Duke study compared the effects of a low-carbohydrate diet, which included nutritional supplements, with a low-fat, low-cholesterol, low-calorie diet. The two diets improved cardiac risk in different ways, said lead researcher Eric Westman, M.D., associate professor of medicine at Duke University Medical Center.

The low-carb diet improved HDL, or good cholesterol levels, and lowered triglycerides, the researchers found. The reduced fat diet lowered total cholesterol levels and triglyceride levels. Both diets brought down blood levels of small LDL particles, the form of bad cholesterol most likely to lead to hardened arteries, they found.

The results appeared early online November 16, 2005 in the International Journal of Cardiology and will appear in print in 2006. The research was funded by an unrestricted grant from the Robert C. Atkins Foundation. The study authors have no financial interest in Atkins Nutritionals, Inc.

"I think the emerging science shows different diets improve cardiac risk in different ways. We are moving from a one-size-fits-all approach to considering many different diets to fit the many different types of cardiac risk," Westman said.

Triglycerides fell much more on the low carb diet.

Overall, both diets had positive effects on cholesterol, Westman said. The triglyceride levels improved significantly in both groups, falling 74.2 points for the low-carb group and 27.9 points for the low-fat group. People on the low-carb diet showed an increase in HDL cholesterol by 5.5 points, a positive change, while those following the low-fat diet did not have a significant change. LDL cholesterol levels did not change significantly in either group but small LDL particles decreased 17.4 points for the low-carb dieters and 19.2 points for the low-fat dieters, a similar improvement. The total cholesterol of the low-fat dieters saw a 13.7 point decline over 6 months but did not change significantly in the low-carb dieters.

My guess is there is enough genetic variability in how bodies respond to different diets that one would benefit from trying different diets and then getting one's cholesterol tested.

They say the participants were "in generally good health". But some of the participants had body mass indexes as high as 60. Er, this doesn't strike me as compatible with the "generally good health" label. Also, these people all had high cholesterol levels.

The 120 study participants were randomly assigned to either the low-carbohydrate diet or the low-fat, low-cholesterol, low-calorie diet. All were between 18 and 65 years old and in generally good health, with a body mass index (BMI) between 30 and 60, indicating obesity, and a total cholesterol level of more than 200 mg/dL. None had tried dieting or weight loss pills in the previous six months.
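For reference, body mass index is weight in kilograms divided by the square of height in meters. A quick sketch of what the study's BMI range means in practice (the 1.75 m example height is my own illustration):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms over height in meters squared."""
    return weight_kg / height_m ** 2

def weight_for_bmi(target_bmi, height_m):
    """Weight in kilograms that produces the given BMI at the given height."""
    return target_bmi * height_m ** 2

# For a 1.75 m (about 5'9") person:
print(f"BMI 30 -> {weight_for_bmi(30, 1.75):.0f} kg")  # 92 kg, about 203 lb
print(f"BMI 60 -> {weight_for_bmi(60, 1.75):.0f} kg")  # 184 kg, about 405 lb
```

So the heaviest participants at that height would weigh around 400 pounds, which underlines my skepticism about the "generally good health" label.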

There's also the possibility that people who have low cholesterol levels react to low fat and low carbohydrate diets differently than do people with high cholesterol. So if you have low cholesterol yet still want to optimize your diet be aware this study might not provide useful guidance.

The low carb dieters were allowed unlimited meat and eggs. But the details matter a great deal. My guess is you'd be a lot better off eating unlimited salmon than unlimited beef and then fowl would fall somewhere in between salmon and beef in terms of health effects. But what proportion of each meat type did the dieters choose?

The low-carbohydrate group was permitted daily unlimited amounts of animal foods (meat, fowl, fish and shellfish); unlimited eggs; 4 oz. of hard cheese; two cups of salad vegetables such as lettuce, spinach or celery; and one cup of non-starchy vegetables such as broccoli, cauliflower or asparagus. They also received daily nutritional supplements -- a multivitamin; essential oils including flax seed oil, borage oil and fish oil; and chromium picolinate. There were no restrictions on total calories, but carbohydrates were kept below 20 grams per day at the start of the diet.

The low-carbohydrate diet appears to have a favorable effect on cardiac risk, Westman said. "While the low-carbohydrate group received extra nutritional supplements, and experienced greater weight loss, these differences did not fully account for the changes in cardiac risk factors that we saw," he said.

The low-fat, low-cholesterol, low-calorie group followed a diet consisting of less than 30 percent of daily caloric intake from fat; less than 10 percent of calories from saturated fat; and less than 300 milligrams of cholesterol daily. They were also advised to cut back on calories. The recommended daily calorie level was 500 to 1,000 calories less than the participant's maintenance diet -- the calories needed to maintain current weight.

The size of the cutback in fat does not seem that radical to me and falls far short of a Pritikin-style diet.

Both diets leave out junk foods.

Westman noted that the diets have one often-ignored similarity. "It's possible that the common denominator of these diets is what they're not eating – both diets did not allow refined sugar or junk food," Westman said.

Study participants were encouraged to exercise 30 minutes at least three times per week, but no formal exercise program was provided. Both sets of dieters had group meetings at an outpatient research clinic regularly for six months.

I have a hard time coming to a conclusion on the debate about the ideal relative ratio of carbos to proteins to fats. Certainly the types of fats matter a great deal with differences as a function of omega 3 versus omega 6 fats and also fats of various levels of saturation. Plus, particular fatty acids might have especially harmful or healthful effects.

I'd like to see blood fat and cholesterol comparisons of diets with different types of carbs. How does a fruit diet differ from, say, a potato diet? Carb glycemic index differences cause differences in the size of blood sugar spikes when consuming carbs, and that might lead to differences in amounts of triglycerides and cholesterol. Imagine two groups ate equal amounts of rice, but one group ate low glycemic index basmati rice and another ate the really sticky high glycemic index rice found at Chinese restaurants. My guess is the latter rice diet would have worse effects on blood fats and cholesterol than a basmati rice diet.

The benefit from eating a lot of fruits and vegetables seems more certain than the benefit of changing the relative proportions of fats, proteins, and carbs. Also, a number of foods have clear health risks. For example, charcoal-cooked steak should be avoided. The flames and the steak fat burning on the charcoal are a guaranteed way to produce a lot of carcinogens. Also, trans fatty acids found in many processed foods such as potato chips and many commercial cookies are definitely unhealthy.

If you want to start improving your diet first remove the very bad foods from it while adding clear winners such as fruits and vegetables.

By Randall Parker 2005 December 08 10:10 PM  Aging Diet Studies
Entry Permalink | Comments(14)
2005 December 06 Tuesday
Decision Made To Fund FutureGen Coal Plant

FutureGen gets the green light.

MONTREAL, Dec. 6 - Under pressure from other industrialized countries at talks here on global warming, the Bush administration announced on Tuesday that it had signed an agreement with a coalition of energy companies to build a prototype coal-burning power plant with no emissions.

The project, called FutureGen, has been in planning stages since 2003. But the Energy Department said here that a formal agreement had been signed under which companies would contribute $250 million of a cost estimated at $1 billion.

A lot of coal plants will get built and put into operation during the 10 year construction time for the FutureGen plant. Once it is completed, new coal plants will not all get built using the FutureGen technologies. Even the more advanced coal emissions control technologies will probably cost more than not using them.

On the one hand, acceleration of technological advances to reduce emissions is a good thing. On the other hand, note that the emphasis is on developing technology that will lower the cost of emissions control. The emphasis is not on taxing pollutants or simply outlawing pollutants. Well, why is that? Industry lobbies and people do not want to pay more for electricity.

I tend to favor the expenditure of taxpayer money to accelerate research into ways to produce energy that are inherently cleaner in large part because there's a limit to how much people will impose costs on themselves in order to reduce external costs on others. Humanity's fairness and virtue are pretty limited. We evolved to have these limitations. Research will eventually produce cleaner technologies that will get deployed and displace fossil fuels by being cheaper even before considering external costs. This can be accomplished without changing or improving human nature.

By Randall Parker 2005 December 06 11:04 PM  Energy Policy
Entry Permalink | Comments(13)
2005 December 04 Sunday
Girls Sold For Marriage In India Due To Sex Selective Abortions

A shortage of females in India due to sex selective abortion is producing a market for females as brides.

BIR KHURD, India -- Harmesh Singh, a 40-year-old vegetable farmer, had explored the conventional methods of finding a wife. He had consulted female relatives and older women in his village and buttonholed family members in other towns in a quest for leads. After years of searching, however, he still had not found a bride. So last year, he bought one.

Dipping into his meager savings, Singh paid a marriage broker to introduce him to Bibi Kaur, a runaway from Calcutta who says she is 17 but looks considerably younger. Singh would not say how much he paid, but social workers in the area say the fee typically ranges from $100 to $300, depending on the age and appearance of the bride-to-be.

So runaway girls in India probably get hunted down by marriage brokers.

The relative wealth of the Punjab state allows people to afford ultrasound and sex selective abortions. As a result in 2001 there were only 874 females per 1000 males. Probably the imbalance is even greater at younger ages. Once again, the streets find their own uses for technology. The uses often bear little resemblance to the goals the developers of a technology had in mind.
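A quick sketch of what that 874-per-1000 ratio implies (assuming, for simplicity, that men seek brides only from their own local cohort, which is exactly the assumption the bride market is breaking):

```python
def surplus_males_fraction(females_per_1000_males):
    """Fraction of males with no same-cohort bride available locally."""
    return (1000 - females_per_1000_males) / 1000

# Punjab's 2001 ratio of 874 females per 1000 males:
print(f"{surplus_males_fraction(874):.1%}")  # 12.6% of men in the cohort
```

Roughly one man in eight comes up empty, which is plenty of demand to sustain a market in brides brought in from poorer regions.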

Some girls get sold by their families.

The dearth of potential young brides in Punjab has fueled a demand for women from poor eastern states such as West Bengal and from neighboring Bangladesh and Nepal, where the sex ratio is not as skewed but unemployment and poverty are widespread. Some of the women go willingly. But others are enticed by false promises of jobs, social workers say, or are sold by their families to brokers, a practice that is illegal.

Keep in mind that many of these people are extremely poor. The sale of a daughter might yield enough money to buy something valuable for producing food, such as a water buffalo.

What I want to know: Will the selling of females essentially as slaves lead to an end to dowry and eventually even a rise in the status of women in India? Or will only further industrialization create conditions that raise the status of women?

By Randall Parker 2005 December 04 10:02 PM  Trends Demographic
Entry Permalink | Comments(10)
Support Builds For Nanomaterials Safety Regulations

Fear of nanomaterials toxicity is leading to greater regulatory agency interest.

Nanomaterials are already being integrated into a wide range of products, including sports equipment, computers, food wrappings, stain-resistant fabrics and an array of cosmetics and sunscreens -- a market expected to exceed $1 trillion a year within a decade. Preliminary studies suggest that most of these products do not pose significant risks in their bulk form or embedded in the kinds of products that so far use them.

But the same cannot be said of the particles themselves, which can pose health risks to workers where they are made and may cause health or environmental problems as discarded products break down in landfills.

Lab animal studies have already shown that some carbon nanospheres and nanotubes behave differently than conventional ultrafine particles, causing fatal inflammation in the lungs of rodents, organ damage in fish and death in ecologically important aquatic organisms and soil-dwelling bacteria.

I look at this intuitively: One of the reasons scientists make nanoscale particles is that the same materials behave differently when made at much smaller sizes. The small sizes powerfully change behavior. For example, carbon nanotubes make miniature springs that do not wear out and that might be useful in human joints to increase their ability to absorb strong shocks. But with greater changes in behavior come greater risks of unexpected and undesired side effects. Well, this intuitive expectation is being borne out by recent experiments, as the previous article reports.

Some recent work at Lawrence Berkeley National Laboratory found that both nanotubes and nano-onions (really, I am not making this up) cause changes in gene regulation and cell division.

The increasing use of nanotechnology in consumer products and medical applications underlies the importance of understanding its potential toxic effects to people and the environment. Although both fullerene and carbon nanotubes have been demonstrated to accumulate to cytotoxic levels within organs of various animal models and cell types and carbon nanomaterials have been exploited for cancer therapies, the molecular and cellular mechanisms for cytotoxicity of this class of nanomaterial are not yet fully apparent. To address this question, we have performed whole genome expression array analysis and high content image analysis based phenotypic measurements on human skin fibroblast cell populations exposed to multiwall carbon nano-onions (MWCNOs) and multiwall carbon nanotubes (MWCNTs). Here we demonstrate that exposing cells to MWCNOs and MWCNTs at cytotoxic doses induces cell cycle arrest and increases apoptosis/necrosis. Expression array analysis indicates that multiple cellular pathways are perturbed after exposure to these nanomaterials at these doses, with material-specific toxigenomic profiles observed.

You've no doubt heard about the health benefits of onions. Well, nano-onions are better for you than nanotubes.

Chen and colleagues found that exposure to the nanotubes and nano-onions activated genes involved in cellular transport, metabolism, cell-cycle regulation and stress response. Multi-walled carbon nanotubes induced genes related to a strong immune and inflammatory response, while the presence of nano-onions caused most changes in genes induced in response to external stimuli. The nanotubes appeared to be ten times more toxic than the nano-onions.

These sorts of effects are potentially useful in medical research. The power of nanomaterials, like any capability, is a double-edged sword. The ability to effect changes in the environment is useful or harmful in different contexts. Nothing new about that idea. Just different kinds of materials creating the promise and the risk.

On the bright side, research on nanomaterials toxicity has turned up ways to reduce the toxicity.

Researchers from Rice University, US, have found that the toxicity of water-soluble carbon nanotubes to human skin cells decreased as the functionalization of the tubes increased. The results are similar to the team's findings for fullerene molecules last year, although the nanotubes were generally less toxic than the fullerenes.

I expect nanotechnology to bring orders of magnitude more benefits than costs - at least as long as we do not make nano-replicators.

By Randall Parker 2005 December 04 09:49 PM  Dangers Nanotech
Entry Permalink | Comments(8)
Two Natural Gas Pipelines From Arctic Gain Support

High energy prices are paving the way for decades old proposals.

Of the two lines, the Alaska Gas Pipeline is the behemoth. Its most likely route would stretch 1,700 miles from Alaska's Prudhoe Bay to Canada's Alberta province. The line would cost $20 billion and take a decade to build, but the project has picked up momentum under the whip of Alaska Gov. Frank H. Murkowski (R) and $18 billion in loan guarantees approved last year by Congress.

The second line, the Mackenzie Valley Pipeline, would start 250 miles east of the Alaska line, on Canada's portion of the Beaufort Sea. It would snake 800 miles through forests of spruce and pine along the Mackenzie River -- one of the world's longest with no bridge or dam. This all-Canada route would cost $6 billion and is predicted to take three years to complete once construction begins.

I can see one free market argument for the loan guarantees: most of the risk is political. If the government were stuck with the bill for a partially completed pipeline halted by environmentalist opposition, then the government would be less likely to bow to environmentalist pressure to stop the project.

Environmentalists are opposed but see at least one pipeline as inevitable. I figure both will get built. Declining US natural gas production in the lower 48 states makes the economics too attractive, and neither the US nor the Canadian public wants high heating and electric bills.

The bigger footprint, after the construction crews have left, will be in opening the mineral-rich area to further exploration and development.

Mostly for that reason, some environmentalists favor the Alaska Pipeline, which follows the route of the existing oil pipeline and Alaska Highway.

"We think it's the lesser environmental evil," said Stephen Hazell, a director of the Sierra Club of Canada. Environmental groups have largely bowed to the inevitability of at least one of the projects.

The Canadian pipeline is being delayed by negotiations with native tribes. Environmentalists are more opposed to the smaller Canadian project because it would open up an area for development that currently is hard to reach. The roads built in Alaska for the oil pipeline could be used for the Alaskan natural gas pipeline and so won't do as much to make inaccessible areas accessible.

The environmentalists also fear that the Canadian natural gas will be used as an energy source for harvesting oil from the Alberta tar sands. About that fear: The environmentalists who want to bring an end to the age of fossil fuels should spend a lot more time promoting the idea of a broad research effort to develop cleaner technologies. Their fight against fossil fuels is doomed because as soon as prices rise high enough the majority of the public will swing around toward supporting more pipelines and drilling. Fossil fuel use will not get regulated out of existence. Only lower prices for other energy sources will bring it to an end.

By Randall Parker 2005 December 04 09:05 PM  Energy Policy
Entry Permalink | Comments(3)
2005 December 02 Friday
Leptin Restoration After Weight Loss Keeps Off Pounds

Leptin levels drop as people lose weight, and that drop drives weight regain.

A team at New York's Columbia University has shown the key is falling levels of the hormone leptin, which controls appetite.

They found that giving people who had recently lost weight injections of the hormone helped them to avoid putting the pounds straight back on.

The study features in the Journal of Clinical Investigation.

Leptin is a peptide (i.e. a chain of amino acids, the same building blocks that make up proteins). Peptides taken orally get digested. So to use leptin you would have to inject it with a syringe.

To test their theory, the researchers gave doses of leptin to lean and obese volunteers who had recently lost weight.

They found that most of the metabolic and hormonal changes which mean people cannot keep the weight from creeping back on were reversed once leptin levels were restored to pre-weight loss levels.

Leptin is known to play a role in controlling appetite, but as yet the exact way that it works is unclear.

Injections of leptin have been used to help morbidly obese people with a deficiency of the hormone to lose weight, but a similar approach has no effect on obese people with normal leptin levels.

So go on a diet. Lose weight. Then start taking leptin injections to keep the weight off.

Development of a method to increase leptin receptor concentrations on fat cells would also keep the weight off.

A new study by researchers at UT Southwestern Medical Center suggests that when fat cells increase in size – as they do during the development of obesity – the cells progressively lose receptors for the hormone leptin, a powerful stimulus for fat burning.

Leptin, a hormone produced by the body’s fat cells and involved in the regulation of body weight, was first discovered in 1994. It was thought leptin itself would be a key to curing obesity in humans, but the hypothesis did not readily translate into weight loss in obese people. Using mouse models, UT Southwestern researchers have now shown that if enough receptors are present on the fat cells, it is impossible for the cells to store fat and obesity would be blocked.

The new findings, appearing in an upcoming issue of the Proceedings of the National Academy of Sciences and currently available online, bring researchers a step closer to understanding obesity in humans, said Dr. Roger Unger, director of the Touchstone Diabetes Research Center at UT Southwestern and senior author of the study.

“We now think that people with naturally high levels of leptin receptors may not gain weight as rapidly over time as people who have low levels of leptin receptors,” said Dr. Unger. “It could explain why some people can eat more and do not gain weight.”

To test this hypothesis, the UT Southwestern researchers used genetically modified rats in which the leptin receptor remained present in large quantities even during marked overfeeding. In normal rats, the high-fat diet caused massive obesity with enlargement of fat cells to almost three times their normal size. In rats with the forced overexpression of the leptin receptor on their fat cells no obesity occurred, even though they too were fed high-fat, highly caloric diets.

“The fat-storing function of the fat cells requires the disappearance of the leptin receptor,” Dr. Unger said. “This is done in order to block the action of the leptin fat cells produce.”

Add the newly discovered peptide hormone obestatin to the injections and weight control becomes even more likely.

Examination of the ghrelin gene showed Hsueh, Zhang and their colleagues that it in fact codes for a second peptide hormone — the hormone they named obestatin.

“Obestatin appears to act as an anorexic hormone,” said the Nov. 11 ‘Science’ article, “by decreasing food intake, gastric emptying activities, jejunal motility and body-weight gain.”

These observations led to the christening of the newly discovered hormone.

“On the basis of the bioinformatics prediction that another peptide also derived from proghrelin exists, we isolated a hormone from rat stomach and named it obestatin — a contraction of obese, from the Latin ‘obedere,’ meaning to devour, and ‘statin,’ denoting suppression,” Hsueh said.

Also see my previous post Obestatin Hormone Suppresses Appetite.

We might be 10 or 15 years away from effective weight control treatments. Keep in mind that drugs normally take as long as 10 years to make it through the drug approval process. Some drug already in the pipeline might cure obesity. Or perhaps leptin could be sold for injections without a lengthy approval process. So it is hard to guess when obesity will become easily curable. But knowledge about the mechanisms which cause obesity has advanced so much in the last few years that we now have sufficient scientific understanding on which to base development of obesity control therapies and appetite control therapies.

By Randall Parker 2005 December 02 07:04 AM  Brain Appetite
Entry Permalink | Comments(2)
2005 December 01 Thursday
Bupropion Reduces Methamphetamine Cravings

Addicts participating in a small study at UCLA reported less intense cravings for methamphetamine as a result of taking bupropion.

A new study led by researchers at UCLA's Semel Institute suggests the antidepressant bupropion may help treat methamphetamine addiction. No medications presently are approved for treating methamphetamine addicts.

Appearing Nov. 23 as an advance online publication of the peer-reviewed journal Neuropsychopharmacology, the study finds bupropion blunts the methamphetamine "high" and reduces cravings prompted by visual cues such as ambient drug use.

The research team hypothesizes that bupropion reduces the effects of methamphetamine by preventing the drug from entering brain cells, where methamphetamine can produce release of neurotransmitters that cause feelings of euphoria.

The study is the first to examine the effectiveness of bupropion for treating methamphetamine addiction in humans. A multisite Phase II clinical trial led by UCLA researchers is in progress.

Bupropion is the active ingredient in the smoking cessation drug Zyban and the antidepressant Wellbutrin. So meth addicts could easily start trying bupropion right now to help them quit. Since a lot of meth users also smoke cigarettes they might also find it easier to quit smoking at the same time.

Drug addiction is a sign that humans are not adapted to the environments they have created for themselves using advances in technology. Evolution did not select us to handle the drugs that scientists have turned up. We need to develop technologies that will allow us to adapt ourselves to the elements in our environment which many of us cannot handle. Lest you think the "many of us" doesn't include you, I have a few questions to ask: Are you overweight? Do you get less than an optimal amount of exercise? Ever had any problems with addictions or substance abuse? Have any destructive or at least partially disabling cravings? Spend too much time reading on the internet?

By Randall Parker 2005 December 01 10:19 PM  Brain Addiction
Entry Permalink | Comments(8)