2007 January 31 Wednesday
Uric Acid Correlates With Brain Aging

Rising uric acid may be making our minds slow down as we age.

WASHINGTON— Researchers at the Johns Hopkins and Yale university medical schools have found that a simple blood test for uric acid, a marker of kidney function, might reveal a risk factor for cognitive problems in old age. Of 96 community-dwelling adults aged 60 to 92 years, those with uric-acid levels at the high end of the normal range had the lowest scores on tests of mental processing speed, verbal memory and working memory.

The findings appear in the January issue of Neuropsychology, which is published by the American Psychological Association (APA).

High-normal uric acid levels, defined in this study as 5.8 to 7.6 mg/dL for men and 4.8 to 7.1 mg/dL for women, remained associated with cognitive problems even after the researchers controlled for age, sex, weight, race, education, diabetes, hypertension, smoking and alcohol abuse. These findings suggest that older people with serum (blood) uric-acid levels at the high end of the normal range are more likely to process information slowly and experience failures of verbal and working memory, as measured by the Wechsler Adult Intelligence Scale and other well-established neuropsychological tests.

“It might be useful for primary-care physicians to ask elderly adults with high normal serum uric acid about any problems they might be having with their thinking, and perhaps refer those who express concern, or whose family members express concern, for neuropsychological screening,” says lead author David Schretlen, PhD.

The link between high-normal uric acid and cognitive problems is also sufficiently intriguing for the authors to propose clinical studies of whether medicines that reduce uric acid, such as allopurinol, can help older people with high-normal uric acid avoid developing the mild cognitive deficits that often precede dementia.

Would the growth of younger replacement kidneys prevent the rise of uric acid with age? Or could stem cell therapies or gene therapies do the trick?

For reasons that are not entirely clear, uric acid levels increase with age, says Dr. Schretlen. Higher levels of uric acid are linked with known risk factors for dementia, including high blood pressure, atherosclerosis, Type 2 diabetes and the “metabolic syndrome” of abdominal obesity and insulin resistance. Dr. Schretlen also says there is mounting evidence that end-stage renal (kidney) disease increases the risk of cognitive dysfunction and dementia in elderly adults. Given this web of connections, uric acid could potentially become a valuable biological marker for very early cognitive problems in old age.

If our uric acid levels rise as a side effect of kidney aging, and if the higher levels deliver no real benefit, then efforts to keep uric acid down would slow brain aging. The brain is the toughest rejuvenation challenge. Anything we can do to delay brain aging will give us more time to find ways to make the 100 billion neurons in our brains young again.

Update: As some commenters have pointed out, this study does not prove the direction of causation runs from higher uric acid to faster brain aging. Another possible direction of causality runs from high oxidative stress to both higher uric acid and faster brain aging. The higher oxidative stress could impair kidney function and brain function and cause higher uric acid to correlate with faster brain aging. Or the body could make more uric acid as a protectant against certain kinds of radicals. For example, uric acid appears to protect against peroxynitrite. So the higher uric acid might be an indicator of some other problem that is causing the accelerated brain aging.

Early diabetes suppresses uric acid. Yet diabetes increases oxidative stress and accelerates aging. Could it be that the effects of diabetes are worsened by the lower uric acid? Also, insulin prevents oxidative-stress-induced decreases in intracellular uric acid and in the intracellular antioxidant glutathione. Perhaps, rather than take drugs to lower uric acid, a person with high uric acid should first try a variety of antioxidants and other brain-protecting compounds.

Update II: Bonnie Firestein's research team at Rutgers University has found that uric acid stimulates brain astroglial cells to make transporter proteins that haul away compounds that damage nerves; uric acid may therefore be neuroprotective.

Uric acid's effects on the health of neurons had been observed by other researchers, but the mechanism by which it confers protection had remained a mystery.

"It is interesting to note that people with gout never seem to develop multiple sclerosis," Firestein said. "In animal models of multiple sclerosis, the addition of uric acid reduces symptoms and improves prognosis. The same is true for one type of Parkinson's disease tested."

The Firestein team's breakthrough studies revealed that uric acid can stimulate astroglial cells to produce transporter proteins that carry harmful compounds away from neurons in jeopardy of chemical damage. This opens the door to identifying a unique drug target for new therapies.

Glutamate is a compound that under normal circumstances aids neurons in transmitting signals for cognitive functions in the brain, such as learning and memory. In the case of spinal cord injury or stroke where there is physical cell damage, however, an excess of glutamate is released and it accumulates around the remaining intact neurons, eventually choking them to death.

When Firestein's group added uric acid to a mixed culture of rat spinal cord neurons and astroglial cells, the production of the glutamate transporter EAAT-1 increased markedly. The challenge now is to find the most effective strategy for increasing production of the transporter, using drug therapies or other means.

So then does the higher blood uric acid increase brain aging or decrease it?

By Randall Parker 2007 January 31 10:34 PM  Brain Aging
Entry Permalink | Comments(12)
2007 January 30 Tuesday
Bioengineers Simulate Human Metabolism

Some bioengineers at UCSD are building a model to simulate parts of human metabolism.

Bioengineering researchers at UC San Diego have painstakingly assembled a virtual human metabolic network that will give researchers a new way to hunt for better treatments for hundreds of human metabolic disorders, from diabetes to high levels of cholesterol in the blood. This first-of-its-kind metabolic network builds on the sequencing of the human genome and contains more than 3,300 known human biochemical transformations that have been documented during 50 years of research worldwide.

Note that these people are engineers, not scientists. They are treating the human body as just another complex system to engineer. They are using simulation just as engineers simulate airplanes, cars, and other systems designed by humans. Their simulations are a prelude to efforts to re-engineer the human body.

Simulations allow much more rapid testing of far larger combinations of conditions. For human bodies, simulations will allow testing of drugs and other treatments without the huge sums of money spent on real human trials and without waiting for real wall-clock time to pass. Plus, simulations can check out dangerous scenarios that would be far too risky to try on real humans.

An increasing portion of all biomedical research and development will take place in simulations. The cost of computing will continue to decline while the software becomes more sophisticated and data from real lab experiments feed in to make the models increasingly realistic.

The simulation can predict the behavior of actual human cells.

In a report in the Proceedings of the National Academy of Sciences (PNAS) made available on the journal's website on Jan. 29, the UCSD researchers led by Bernhard Ø Palsson, a professor of bioengineering in the Jacobs School of Engineering, unveiled the BiGG (biochemically, genetically, and genomically structured) database as the end product of this phase of the research project.

Each person's metabolism, which represents the conversion of food sources into energy and the assembly of molecules, is determined by genetics, environment, and nutrition. In a demonstration of the power and flexibility of the BiGG database, the UCSD researchers conducted 288 simulations, including the synthesis of testosterone and estrogen, as well as the metabolism of dietary fat. In every case, the behavior of the model matched the published performance of human cells in defined conditions.

This simulation is limited to known interactions and transformations carried out by cellular components. As more interactions are discovered and characterized, these additional pieces of the puzzle can be added to existing simulations such as this one at UCSD. Fortunately, biochips that measure proteins and genes keep getting more powerful. For example, see my post Chip Measures Protein Binding Energies In Parallel.
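The report doesn't spell out the model's mechanics, but the core constraint behind constraint-based metabolic networks like this one is steady-state mass balance: multiply a stoichiometric matrix by a vector of reaction rates and require that no metabolite accumulates. Here is a toy sketch of that idea; the three-metabolite network and all its numbers are invented for illustration, not taken from the BiGG database.

```python
# Toy steady-state mass-balance check, the core constraint behind
# genome-scale metabolic models. The network below is invented:
# rows are metabolites A, B, C; columns are reactions
#   R1: -> A (uptake), R2: A -> B, R3: B -> C, R4: C -> (secretion)
S = [
    [1, -1,  0,  0],  # metabolite A
    [0,  1, -1,  0],  # metabolite B
    [0,  0,  1, -1],  # metabolite C
]

def is_steady_state(S, v, tol=1e-9):
    """True if S.v = 0, i.e. no metabolite accumulates or is depleted."""
    return all(abs(sum(row[j] * v[j] for j in range(len(v)))) < tol
               for row in S)

balanced = [2.0, 2.0, 2.0, 2.0]    # equal flux down the whole chain
unbalanced = [2.0, 1.0, 1.0, 1.0]  # A piles up: not a valid steady state

print(is_steady_state(S, balanced))    # True
print(is_steady_state(S, unbalanced))  # False
```

A real model like BiGG has thousands of such rows and columns plus flux bounds, and simulation means solving for rate vectors that satisfy all the constraints at once, but the balance test is the same.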

By Randall Parker 2007 January 30 10:43 PM  Biotech Advance Rates
Entry Permalink | Comments(2)
2007 January 29 Monday
Protein In Hypothalamus Regulates Hunger

A protein called SH2B1 regulates whether mice gain weight or stay skinnier and healthier.

Previously, Rui and his team have shown that mice that lack the gene for SH2B1 -- called knockout mice -- become obese, diabetic, and unable to stop eating. Their bodies lose the ability to sense the signals sent by leptin and insulin that tell the brain to slow down food intake and fat storage.

For the new paper, they looked not only at normal mice and mice that didn’t have the SH2B1 gene, but also at mice that made SH2B1 only in brain cells, in either normal or larger-than-normal amounts. They found that restoring SH2B1 just in the brain not only completely corrected the metabolic disorders that the knockout mice had developed, but also improved the brain cells’ ability to respond to leptin signals and produce further signals that regulate eating.

What’s more, the mice that were treated to make extra SH2B1 didn’t become obese or lose their ability to respond to leptin signals even after being fed a high-fat diet that caused those effects in other mice.

More SH2B1 means less obesity. So any drug or gene therapy that would increase SH2B1 will probably decrease hunger and cause weight loss.

“Obesity, whether in mice or humans, is the product of an altered balance between energy intake and energy use. The imbalance is linked to alterations in leptin and insulin signaling that lead to excess weight gain and Type 2 diabetes, respectively,” says Rui, an assistant professor of molecular and integrative physiology at U-M. “SH2B1 appears to play a key regulatory role in this system, through its direct influence on the processing of leptin and insulin signals in cells of the brain’s hypothalamus.”

Do people who are naturally skinny have genetic variations that cause their brain cells to make more SH2B1?

SH2B1 binds to enzymes called kinases, which place phosphate groups onto other proteins and thereby typically activate or deactivate them. So SH2B1 modulates how tyrosine kinases regulate other proteins.

The team and other researchers have found that SH2B1, which was previously called SH2-B, is a kind of jack-of-all-trades in the world of cell signaling. Able to shuttle between the area just beneath the cell membrane and the nucleus, it can bind to many different molecules and facilitate signaling.

Specifically, it can bind to a variety of molecules called tyrosine kinases, including ones that serve as receptors for insulin and growth factors that circulate in the brain and body. One of its most important binding partners is JAK2, which is activated every time a leptin molecule binds to a cell.

Since leptin is the body’s messenger boy to the brain for “stop eating, we’re full” messages, and JAK2 helps receive those messages as they arrive, SH2B1’s partnership with JAK2 is an important one. In a previous paper, Rui and his former mentor and current colleague Christin Carter-Su, Ph.D., showed that SH2B1 encourages the action and production of JAK2, unlike two other proteins that have been shown by other teams to reduce its activity. Carter-Su is a professor of molecular and integrative physiology and heads the biomedical research division of the Michigan Diabetes Research and Training Center.

These scientists and many others are gradually piecing together a very detailed model of how the feeling of hunger is regulated in the brain. With better understanding of a complex system comes better ability to manipulate it. The identification and study of every piece eventually leads to candidate targets for drug development and ideas for how to alter the environment of cells to cause them to act differently.

By Randall Parker 2007 January 29 11:25 PM  Brain Appetite
Entry Permalink | Comments(0)
2007 January 28 Sunday
MIT Study Sees Big Future For Geothermal Energy

A new MIT-led study suggests that geothermal power does not get the attention it deserves.

A comprehensive new MIT-led study of the potential for geothermal energy within the United States has found that mining the huge amounts of heat that reside as stored thermal energy in the Earth's hard rock crust could supply a substantial portion of the electricity the United States will need in the future, probably at competitive prices and with minimal environmental impact.

An 18-member panel led by MIT prepared the 400-plus page study, titled "The Future of Geothermal Energy" (PDF, 14.1 MB). Sponsored by the U.S. Department of Energy, it is the first study in some 30 years to take a new look at geothermal, an energy resource that has been largely ignored.

See the body of the report for cost details. I also include a couple of excerpts from the cost sections below. Also, see Figure 1.11 on page 42, which shows cost estimates for a future geothermal site at Clear Lake in Kelseyville, California. Note the cost range is from about 2.8 to 4.4 cents per kilowatt-hour. That is competitive with coal's current cost. And that coal electricity comes with no carbon sequestration and with much more mercury, particulates and other pollution than I'd like to see. Geothermal would avoid all that.

For geothermal, note that drilling costs are falling due to ongoing efforts by the oil and natural gas industries to develop cheaper methods for deep drilling. But geothermal has some additional needs for technological advancement and for exploration to identify the best drilling locations.

Here is a surprise. Did you know that geothermal generates more power than wind and solar combined in the United States?

According to panel member M. Nafi Toksöz, professor of geophysics at MIT, "geothermal energy could play an important role in our national energy picture as a non-carbon-based energy source. It's a very large resource and has the potential to be a significant contributor to the energy needs of this country." Toksöz added that the electricity produced annually by geothermal energy systems now in use in the United States at sites in California, Hawaii, Utah and Nevada is comparable to that produced by solar and wind power combined. And the potential is far greater still, since hot rocks below the surface are available in most parts of the United States.

Wind blows weakly in some parts of the United States, especially in the southeast which is experiencing rapid population growth. But geothermal installations won't stop when the wind slows and the sun goes down.

The panel makes a number of recommendations:

  • More detailed and site-specific assessments of the U.S. geothermal energy resource should be conducted.
  • Field trials running three to five years at several sites should be done to demonstrate commercial-scale engineered geothermal systems.
  • The shallow, extra-hot, high-grade deposits in the west should be explored and tested first.
  • Other geothermal resources such as co-produced hot water associated with oil and gas production and geopressured resources should also be pursued as short-term options.
  • On a longer time scale, deeper, lower-grade geothermal deposits should be explored and tested.
  • Local and national policies should be enacted that encourage geothermal development.
  • A multiyear research program exploring subsurface science and geothermal drilling and energy conversion should be started, backed by constant analysis of results.

With better battery technology, geothermal joins the list of energy sources that could replace oil as a means to power ground transportation.

What about cost? The body of the 372-page report has an extensive discussion of cost considerations.

Because the field-demonstration program involves staged developments at different sites, committed support for an extended period will be needed to demonstrate the viability, robustness, and reproducibility of methods for stimulating viable, commercial-sized EGS reservoirs at several locations. Based on the economic analysis we conducted as part of our study, a $300 million to $400 million investment over 15 years will be needed to make early-generation EGS power plant installations competitive in evolving U.S. electricity supply markets.

These funds compensate for the higher capital and financing costs expected for early-generation EGS plants, which would be expected as a result of somewhat higher field development (drilling and stimulation) costs per unit of power initially produced. Higher generating costs, in turn, lead to higher perceived financial risk for investors with corresponding higher-debt interest rates and equity rates of return. In effect, the federal investment can be viewed as equivalent to an “absorbed cost” of deployment. In addition, investments in R&D will also be needed to reduce costs in future deployment of EGS plants.

But one big pay-off would be to avoid construction of dozens of polluting coal plants.

Based on growing markets in the United States for clean, base-load capacity, the panel thinks that with a combined public/private investment of about $800 million to $1 billion over a 15-year period, EGS technology could be deployed commercially on a timescale that would produce more than 100,000 MWe or 100 GWe of new capacity by 2050. This amount is approximately equivalent to the total R&D investment made in the past 30 years to EGS internationally, which is still less than the cost of a single, new-generation, clean-coal power plant.

Geothermal would have a small surface footprint and would cut very little into habitat areas. Unlike biomass energy sources such as corn and cane sugar ethanol, geothermal wouldn't compete with food crops for agricultural acreage. But in order for geothermal (or solar or nuclear or wind) to displace fossil fuels for transportation uses we need better batteries. Battery technology is going to be the key to ending the era of fossil fuels without turning most of the world into crop lands.

Given the timelines discussed in the report, geothermal is not much of a short-term solution, though the panel thinks development of hot sites near the surface in the western US could come sooner than the deeper sites. But if we start now on the research and development recommended in this report, then geothermal could start displacing a lot of new coal plant construction by the 2020s. The amount of money needed to develop this option is fairly small. Why not do it?

Update: Current geothermal projects under development would almost double existing geothermal generation capacity.

Some 58 new geothermal energy projects are already under development in the United States, according to a November 2006 survey by the Geothermal Energy Association, GEA, an industry trade group, which says federal and state incentives to promote geothermal energy are paying off.

Anyone know how big the government incentives are for geothermal? I'm guessing California's mandate to get more electricity from renewables makes geothermal more cost effective in California than in most other states.

“This represents the U.S. geothermal industry’s most dramatic wave of expansion since the 1980s,” said Karl Gawell, GEA’s executive director. "We are seeing a geothermal power renaissance in the U.S."

These projects, when developed, would provide up to 2,250 megawatts of electric power capacity, enough to serve the needs of 1.8 million households.

Those geothermal projects combined add up to about one large nuclear power plant.
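A quick check on the figures quoted above bears out the nuclear comparison (the reactor size range in the comment is my own rough assumption, not from the article):

```python
# Sanity check on the GEA survey figures quoted above.
new_geothermal_mw = 2250   # capacity of projects under development
households = 1.8e6         # households those projects would serve

# Implied average draw per household, in kilowatts
kw_per_household = new_geothermal_mw * 1000 / households
print(round(kw_per_household, 2))  # 1.25

# A large nuclear reactor runs very roughly 1,000 to 1,600 MWe, so
# 2,250 MW of geothermal is indeed on the order of one big plant.
```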

One problem geothermal has is that most of the best sites in the United States are in the west. But if geothermal electricity becomes really cheap then energy intensive industries such as computing and aluminum could relocate to near geothermal sites.

By Randall Parker 2007 January 28 08:54 AM  Energy Geothermal
Entry Permalink | Comments(9)
2007 January 27 Saturday
Ethanol Demand Driving Up Corn Prices

The demand for corn to produce ethanol is driving up the price of tortillas in Mexico and creating a political problem.

Mexico is in the grip of the worst tortilla crisis in its modern history. Dramatically rising international corn prices, spurred by demand for the grain-based fuel ethanol, have led to expensive tortillas. That, in turn, has led to lower sales for vendors such as Rosales and angry protests by consumers.

The uproar is exposing this country's outsize dependence on tortillas in its diet -- especially among the poor -- and testing the acumen of the new president, Felipe Calderón. It is also raising questions about the powerful businesses that dominate the Mexican corn market and are suspected by some lawmakers and regulators of unfair speculation and monopoly practices.

Tortilla prices have tripled or quadrupled in some parts of Mexico since last summer.

Biomass energy puts more affluent car drivers in competition with poor food buyers for the same fields of land. Biomass energy also increases the amount of competition between farmers and wildlife for the same fields of land. The poor people and the wildlife lose in such competitions.

Higher corn prices will eventually translate into higher prices for other basic foods. Why? Farmers will plant less of other crops and more of corn. Plus, people will shift away from eating corn and toward eating other foods.

Ethanol producers are getting squeezed by higher corn prices and lower gasoline prices.

Corn prices, 75 percent of the cost of ethanol production, have doubled in the past six months, to more than $4 a bushel. At the same time, the price of ethanol has followed the price of gasoline downward.

Absent a rescue from Capitol Hill, the glut is going to get worse. AgResource's Basse estimates the blending demand for ethanol at 10 billion gallons, 7 percent of the 150 billion gallons of blended fuel burned each year. Current nationwide ethanol capacity is 5.4 billion gallons. But 6.1 billion gallons' worth of capacity is now under construction, according to the Renewable Fuels Association. That would push supply right past demand and destroy ethanol prices. Unless mandates are tightened. At the moment the motor fuel industry is meeting environmental minimums and exceeding the energy independence ones.

Under present law the independence minimum comes to 4.7 billion gallons of ethanol this year and 7.5 billion in 2012. But now the Bush Administration is considering boosting this mandate to 60 billion gallons by 2030.
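The coming capacity glut is easy to verify from the numbers in the excerpt above:

```python
# All figures in billions of gallons per year, from the article.
blend_demand = 10.0       # estimated blending demand for ethanol
total_fuel = 150.0        # blended fuel burned each year
current_capacity = 5.4    # existing nationwide ethanol capacity
under_construction = 6.1  # capacity now being built

share = blend_demand / total_fuel
print(round(share * 100, 1))  # 6.7, rounding to the article's "7 percent"

future_capacity = current_capacity + under_construction
print(round(future_capacity, 1))  # 11.5, well past the 10-billion-gallon demand
```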

Such a huge increase in ethanol production will raise food costs even if we shift to using switchgrass with cellulosic technology to make ethanol. Lots of land will get shifted into switchgrass production and away from food crop production in that case.

Corn has more than doubled in price.

NEW YORK - The economic viability of ethanol as an alternative to petrol has been thrown into question as the oil price fell below US$50 a barrel yesterday for the first time in nearly two years, while the price of corn - the main ingredient in the new fuel - surged to a new 10-year high.


After a decade of trading between US$2 and US$3 per bushel, corn was trading yesterday at US$4.09 a bushel.

Some ethanol plants extract as much as 2.8 gallons of ethanol per bushel. So a doubling of corn prices adds at least 71 cents to the cost of a gallon of ethanol. The only way ethanol can get produced is with a large taxpayer-funded subsidy. You subsidize the production of ethanol and as a result you pay more for ham, chicken, steak, corn muffins, and tortillas. Plus, the demand for corn causes farmers to shift away from wheat, soy, and other crops to grow more corn. So you pay more for the other grains as well.
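To make the 71-cent arithmetic explicit:

```python
price_rise_per_bushel = 2.00  # corn roughly doubled, from ~$2 to ~$4 a bushel
best_yield = 2.8              # gallons of ethanol per bushel at the best plants

added_cost_per_gallon = price_rise_per_bushel / best_yield
print(round(added_cost_per_gallon, 2))  # 0.71

# Plants yielding less than 2.8 gallons per bushel pay even more per
# gallon, which is why the added cost is "at least" 71 cents.
```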

Calves headed to feedlots cost $200 more each due to the doubling of corn prices.

Corn prices have gone from $2 to $4 per bushel. Ranchers say that's costing them about $200 a head for calves headed to the feedlots. Even worse, cattle raisers are bracing for even higher prices.

That $200 is just at the stage of calves. As the calves continue to get fed the costs from higher corn prices continue to add up even higher.

The biggest chicken producer says we may pay more for meats this year due to the use of corn for ethanol production.

Tyson has warned rising corn prices could mean consumers will have to pay more for chicken, beef and pork this year. The price of corn, which is used as animal feed, has been skyrocketing as demand has increased for ethanol, an alternative source of fuel to gasoline.

Chickens cost 6 cents more per lb wholesale due to ethanol.

"We estimate that ethanol demand has already increased the price of chicken by six cents per pound wholesale," said William P. Roenigk, senior vice president and chief economist for NCC. "If government continues to push corn out of livestock and poultry feed and into the energy supply, the cost of producing food will only increase."

The retail price increase is higher, but how much higher? 10 cents per lb total perhaps?

Ethanol gets a 51 cents per gallon subsidy in the United States. If that subsidy remains in place then the yearly cost of that subsidy could rise to over $17 billion per year by 2017.

Production of ethanol is currently subsidized by the federal government through a tax credit of 51 cents per gallon of ethanol added by fuel blenders. In his State of the Union message, President Bush called for an increase in the production of renewable and alternative fuels from 7.5 billion gallons to 35 billion gallons by 2017. He also proposed $2 billion in loans for the development of fuel from sources other than corn, such as switchgrass or other "cellulosic" sources.
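The $17 billion figure follows directly, assuming the 51-cent blender credit stays in place and applies to all 35 billion gallons:

```python
subsidy_per_gallon = 0.51  # dollars, federal blender tax credit
target_gallons = 35e9      # Bush's 2017 target for renewable/alternative fuels

annual_cost = subsidy_per_gallon * target_gallons
print(round(annual_cost / 1e9, 2))  # 17.85, i.e. over $17 billion per year
```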

Some of the post commenters argue against government spending on energy research in photovoltaics, batteries, and other areas. Well, we are spending billions per year on ethanol production subsidies. I'd rather spend on research that will lower costs than on use of technologies that are expensive. Politicians are going to spend big money on energy policy. I'd rather they spend in ways that will lower costs and reduce environmental impacts.

Corn ethanol is driving up the costs of raising pigs.

Greg Boerboom raises 37,000 pigs a year on his farm in Marshall, Minn. Those hogs eat a lot of corn—10 bushels each from weaning to sale. In past years he has bought feed for about $2 a bushel. But since late summer, average corn prices have leapt to nearly $4 a bushel. To reduce feed costs, he sells his pigs before they reach the normal 275 pounds, and keeps them warmer so they don't devour more food fighting off the cold. Still, Boerboom hopes just to break even. "It's been a pretty tight squeeze on pork producers," he says. "The next eight months will be really tough."

That is only $20 extra per pig, or less than 10 cents per pound.
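Working that out from the numbers in the excerpt:

```python
bushels_per_pig = 10   # corn eaten from weaning to sale
price_rise = 2.0       # dollars per bushel, from ~$2 to ~$4
sale_weight_lb = 275   # normal sale weight in pounds

extra_feed_cost = bushels_per_pig * price_rise
print(extra_feed_cost)  # 20.0 dollars of extra feed cost per pig

per_lb_cents = extra_feed_cost / sale_weight_lb * 100
print(round(per_lb_cents, 1))  # 7.3 cents per pound
```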

Recently Lester Brown and colleagues at the Earth Policy Institute counted up all the ethanol plants in operation, under construction, getting expanded, and in planning. They discovered a much larger scaling up of ethanol production than has previously been measured by other organizations.

According to the EPI compilation, the 116 plants in production on December 31, 2006, were using 53 million tons of grain per year, while the 79 plants under construction—mostly larger facilities—will use 51 million tons of grain when they come online. Expansions of 11 existing plants will use another 8 million tons of grain (1 ton of corn = 39.4 bushels = 110 gallons of ethanol).

In addition, easily 200 ethanol plants were in the planning stage at the end of 2006. If these translate into construction starts between January 1 and June 30, 2007, at the same rate that plants did during the final six months of 2006, then an additional 3 billion gallons of capacity requiring 27 million more tons of grain will likely come online by September 1, 2008, the start of the 2008 harvest year. This raises the corn needed for distilleries to 139 million tons, half the 2008 harvest projected by USDA. This would yield nearly 15 billion gallons of ethanol, satisfying 6 percent of U.S. auto fuel needs. (And this estimate does not include any plants started after June 30, 2007, that would be finished in time to draw on the 2008 harvest.)
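The EPI conversion factor (1 ton of corn = 110 gallons of ethanol) makes their bottom line easy to check:

```python
gallons_per_ton = 110      # EPI: 1 ton of corn ~ 39.4 bushels ~ 110 gallons

# Corn demand in tons per year, from the EPI compilation above.
in_production = 53e6       # plants running at the end of 2006
under_construction = 51e6  # plants being built
expansions = 8e6           # expansions of existing plants
planned = 27e6             # plants likely online by September 2008

total_tons = in_production + under_construction + expansions + planned
print(total_tons / 1e6)  # 139.0 million tons, matching EPI's figure

gallons = total_tons * gallons_per_ton
print(round(gallons / 1e9, 1))  # 15.3 billion gallons, EPI's "nearly 15 billion"
```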

I think the rising cost of corn combined with the declining cost of oil may prevent many of those planning stage ethanol plants from ever getting built. A removal of the subsidy for ethanol production would reduce food prices and save money. We'd be better off spending the tax dollars on developing new energy technologies that have less environmental impact than biomass.

By Randall Parker 2007 January 27 11:48 PM  Energy Biomass
Entry Permalink | Comments(12)
2007 January 25 Thursday
Stanford Expert Sees Rush To Ethanol As Premature

George W. Bush wants to scale up ethanol production in order to reduce gasoline use by 20% in a decade. I continue to think this is a bad idea. David Victor at Stanford University tells MIT's Technology Review that a big scaling up of ethanol production is premature without cellulosic technology.

TR: One of the technologies the president emphasized is converting wood chips and grasses, known as cellulosic feedstocks, into ethanol. Could that make his goals achievable?

DV: You have to be careful because a very large part of our biofuels policy is not about energy at all. It's really about the heartland and farm politics because the current corn-based biofuels don't really save us that much energy. Cellulosic biomass [which is potentially much more efficient] is still really some distance off in the future. If we try to meet these aggressive targets very quickly, what we're going to end up with is a much, much larger version of the current, already inefficient, corn-based ethanol program.

TR: Documents released by the White House said that the vast majority of the 20 percent reduction in gasoline use in the next decade should come from using more biofuels such as ethanol. Is this a good strategy?

DV: In my view, this is a dangerous goal because the other technologies [such as cellulosic ethanol] are not available, [and] it really demands that we dramatically scale up our corn-based ethanol program. And I think that has serious ecological problems because of the large amount of land that they're going to have to put under cultivation. [There are] big economic problems because [making ethanol from corn] certainly isn't competitive with other ways of making biofuels, such as from sugar.

Note when he says that biofuels made from sugar are more competitive, he's almost certainly referring to cane sugar from Brazil, not beet sugar from US farm fields. Currently the United States restricts cane sugar imports in order to protect the domestic farmers who produce cane or beets for sugar. The Brazilians can grow sugar cane at lower cost and can therefore make ethanol more cheaply.

The US government also effectively blocks Brazilian ethanol imports. So neither Brazilian cane sugar nor ethanol made from sugar cane can be imported at a competitive price. But there's an ecological advantage in blocking US imports of Brazilian ethanol: this reduces agricultural demand for Brazilian rain forest land.

I'd like to repeat what is surely a familiar refrain for long time FuturePundit readers: We'd be better off accelerating battery, nuclear, and photovoltaics technologies. They'll eventually provide cheaper energy than ethanol. Plus, they'll use a much smaller land footprint and produce less pollution than ethanol produced from agriculture.

My fear about cellulosic technology: It will make biomass ethanol so cheap that humanity will put large swathes of the world under cultivation to make ethanol. Continued world economic growth will eventually double or triple the demand for transportation fuel, and more. If we make biomass energy cheap then say goodbye to the natural state of ever larger chunks of land.

By Randall Parker 2007 January 25 11:16 PM  Energy Policy
Entry Permalink | Comments(23)
Insula In Brain Key For Cigarette Cravings

If you can't stop smoking blame it on your insula.

Smokers with a damaged insula – a region in the brain linked to emotion and feelings – quit smoking easily and immediately, according to a study in the Jan. 26 issue of the journal Science.

The study provides direct evidence of smoking's grip on the brain.

It also raises the possibility that other addictive behaviors may have an equally strong hold on neural circuits for pleasure.

The senior authors of the study are Antoine Bechara and Hanna Damasio, both faculty in the year-old Brain and Creativity Institute at the University of Southern California, in collaboration with graduate students Nasir Naqvi, who was first author on the study, and David Rudrauf, both from the University of Iowa.

"This is the first study of its kind to use brain lesions to study a drug addiction in humans," Naqvi said.

In the 1990s, Antonio Damasio proposed the insula, a small island enclosed by the cerebral cortex, as a "platform for feelings and emotion." The Science study shows that the pleasure of smoking appears to rest on this platform.

"It's really intriguing to think that disrupting this region breaks the pleasure feelings associated with smoking," said Damasio, director of the institute and holder of the David Dornsife Chair in Neuroscience at USC.

"It is immediate. It's not that they smoke less. They don't smoke, period."

Strokes damage many different areas of the brain. Some stroke patients happen to suffer damage to their insulas and, with it, a reduction in their cravings for cigarettes.

The study, published today in the journal Science, was inspired by a patient who smoked 40 cigarettes a day before having a stroke that damaged his insula. He quit immediately, telling doctors that he “forgot the urge to smoke”.

The scientists then turned to a database of stroke patients held by the University of Iowa and identified 69 who had smoked at least five cigarettes a day for at least two years before they suffered brain damage. They found that 19 of these patients had damage to the insula and 13 of them had given up smoking, 12 of them quickly and easily. The other six continued to smoke — possibly reflecting damage to different parts of the insula.
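
A quick tally of the counts reported above shows how lopsided the effect was among the insula-damaged patients. This is a sketch using only the numbers quoted; the study itself also compared against the patients whose damage lay elsewhere, figures not reproduced here.

```python
# Quit rates among insula-damaged smokers, using only the counts
# reported in the study summary above.
insula_damaged = 19      # smokers with insula damage
quit = 13                # of those, gave up smoking
quit_easily = 12         # of those, quit quickly and easily

print(f"quit after insula damage: {quit / insula_damaged:.0%}")   # → 68%
print(f"quit quickly and easily:  {quit_easily / insula_damaged:.0%}")  # → 63%
```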

Comparisons of insulas done with brain scanning technologies such as functional magnetic resonance imaging (fMRI) may lead to identification of exactly where the insula must be damaged to stop cigarette craving. I bet some smokers would subject themselves to brain surgery to damage part of their insula if they could be assured of little or no side effects aside from a decreased desire to smoke.

Insula damage did not reduce the desire to eat.

The patients’ desire to eat, by contrast, was intact. This suggests, the authors wrote, that the insula is critical for behaviors whose bodily effects become pleasurable because they are learned, like cigarette smoking.

The insula, for years a wallflower of brain anatomy, has emerged as a region of interest based in part on recent work by Dr. Antonio Damasio, a neurologist and director of the Brain and Creativity Institute. The insula has widely distributed connections, both in the thinking cortex above, and down below in subcortical areas, like the brain stem, that maintain heart rate, blood pressure and body temperature, the body’s primal survival systems.

Based on his studies and others’, Dr. Damasio argues that the insula, in effect, maps these signals from the body’s physical plant, and integrates them so the conscious brain can interpret them as a coherent emotion.

The search will now start in earnest: Scientists will look for drugs or try biofeedback training methods or try transcranial magnetic stimulation or other therapies in order to tweak the insula to reduce cravings.

By Randall Parker 2007 January 25 10:52 PM  Brain Addiction
Entry Permalink | Comments(1)
Nanotechnology Measures Activity Of Single Cells

A method of measuring the molecules created by a single gene's expression uses nanotechnology to accomplish this feat.

A new nanotechnology that can examine single molecules in order to determine gene expression, paving the way for scientists to more accurately examine single cancer cells, has been developed by an interdisciplinary team of researchers at UCLA's California Nanosystems Institute (CNSI), New York University's Courant Institute of Mathematical Sciences, and Veeco Instruments, a nanotechnology company. Their work appears in the January issue of the journal Nanotechnology.

This ability to measure the expression of a single gene in a single cell is part of a larger (or smaller) trend: The development of tools that can measure and manipulate biological systems at the scale of their smallest individual components.

Cancer researchers and other biomedical researchers have spent decades trying and failing to cure many diseases because they lacked the tools needed to figure out disease mechanisms. What is going to make the next 20 years so different from what has come before is the development of tools for manipulating and measuring in detail the activities inside of cells.

Their use of the phrase "individual transcript molecules" sounds like they can measure the presence of individual molecules of messenger RNA.

Previously, researchers have been able to determine gene expression using microarray technology or DNA sequencing. However, such processes could not effectively measure single gene transcripts—the building blocks of gene expression. With their new approach, the researchers of the work reported in Nanotechnology were able to isolate and identify individual transcript molecules—a sensitivity not achieved with earlier methods.

"Gene expression profiling is used widely in basic biological research and drug discovery," said Jason Reed of UCLA's Department of Chemistry and Biochemistry and the study's lead author. "Scientists have been hampered in their efforts to unlock the secrets of gene transcription in individual cells by the minute amount of material that must be analyzed. Nanotechnology allows us to push down to the level of individual transcript molecules."

Biomedical science is going to be revolutionized by "GRIN" technology that will be much cheaper and more powerful.

"We are likely to see more of these kinds of highly multi-disciplinary research aimed at single molecule sequencing, genomics, epigenomic, and proteomic analysis in the future," added Bud Mishra, a professor of Computer Science, Mathematics, and Cell Biology from NYU's Courant Institute and School of Medicine. "The most exciting aspect of this approach is that as we understand how to intelligently combine various components of genomics, robotics, informatics, and nanotechnology—the so-called GRIN technology—the resulting systems will become simple, inexpensive, and commonplace."

Computers got cheaper because they kept getting smaller. Microfluidic devices and microsensors on chips are going to do the same thing to biological science and medicine. Small and mass produced devices driven by complex software will accelerate the rate of advance of biomedical research by an order of magnitude or more. We will develop cures for almost every disease and rejuvenation technologies as well.

By Randall Parker 2007 January 25 10:31 PM  Nanotech for Biotech
Entry Permalink | Comments(1)
2007 January 24 Wednesday
Gene Activation Stops Cancer In Mice

Reactivation of a key gene for cellular regulation stopped and wiped out a number of cancers in genetically engineered mice.

Howard Hughes Medical Institute researchers have developed two strategies to reactivate the p53 gene in mice, causing blood, bone and liver tumors to self destruct. The p53 protein is called the “guardian of the genome” because it triggers the suicide of cells with damaged DNA.

Inactivation of p53 can set the stage for the development of different types of cancer. The researchers' findings show for the first time that inactivating the p53 gene is necessary for maintaining tumors. While the researchers caution that cancers can mutate to circumvent p53 reactivation, they believe their findings offer ideas for new approaches to cancer therapy.

The research was carried out independently by two Howard Hughes Medical Institute (HHMI) research teams led by Tyler Jacks at the Massachusetts Institute of Technology and Scott Lowe at Cold Spring Harbor Laboratory. Both papers were published online January 24, 2007, in advance online publication articles in the journal Nature.

The techniques the teams used to reactivate the p53 gene could not be used therapeutically in humans because they genetically engineered the mice to have their p53 genes controllable with drugs.

To reactivate p53, Lowe and his colleagues used a genetic technique they had developed to induce an aggressive form of liver cancer in mice. Although they had inactivated p53 in the mice, they genetically engineered the mice so that they could reverse p53 inactivation by giving the animals the antibiotic doxycycline. They suppressed p53 protein levels by using RNA interference (RNAi) that had been modified so that RNAi could be switched off by the antibiotic. The RNA interference technology was developed in collaboration with HHMI investigator Gregory Hannon at Cold Spring Harbor Laboratory.

When the researchers reactivated p53 in the mice they found that the liver tumors completely disappeared. “This was quite surprising,” said Lowe. “We were working with a very advanced, aggressive tumor, but when we reestablished p53, not only did it stop growing, it went away.”

The most obvious way to try to duplicate this result in humans would be to send in p53 genes in some sort of gene therapy delivery package. But we lack the technologies needed to do that. Gene therapy delivery is a hard problem.

The way that p53 activation stopped the cancers was surprising. It made cells go into a senescent state rather than to commit suicide. But once in the senescent state the immune system in the mice attacked these cells.

“But the second surprise—and perhaps the more scientifically interesting one—was why the tumor went away,” said Lowe. “We expected the tumor cells to undergo programmed cell death, or apoptosis. But instead, we saw evidence for a very different process that p53 also regulates—senescence, or growth arrest. What really excited us was evidence that this senescence somehow triggered the innate immune system to kill the tumor cells.” Involvement of the innate immune system suggests there may be an unknown mechanism by which cancers can trigger the immune system, he said. Lowe and his colleagues are now exploring how the innate immune system might be enlisted against cancer.

How do cancer cells suppress the immune system to prevent it from attacking them? These mice make for a good research model to try to figure out how cancer cells protect themselves from immune attack. Once scientists know how that works they can develop therapies that'll basically block the immune suppression mechanisms used by cancer cells. Then cancer vaccines would become far more potent.

For some types of cancer the gene for p53 might still be intact but deactivated. A drug might be able to reactivate it. But in other types of cancer p53 is probably so mutated that it can't be reactivated. Delivery of replacement genetic material might be the only solution. But that approach runs up against the problem of how to deliver a replacement gene into every single cancer cell in a person's body.

Genetically engineered copies of p53 could find use in human stem cell therapies and in the development of replacement organs. Replacement cells could have not only normal copies of p53 but also additional copies which will activate only in the presence of a particular drug. That way if cells ever go cancerous due to mutations in regular p53 genes a back-up set of p53 genes could get activated by a prescription antibiotic or some other drug. Think of such genetically engineered p53 genes as akin to emergency brakes in cars. If your main cancer brakes give way additional sets of cellular brakes could get turned on.

By Randall Parker 2007 January 24 08:52 PM  Biotech Cancer
Entry Permalink | Comments(3)
2007 January 23 Tuesday
Fiber Against Breast Cancer

Ladies, eat more fiber to protect your breasts!

Pre-menopausal women who eat large amounts of fibre could halve their breast cancer risk, a UK study has suggested.

The University of Leeds researchers, who studied 35,000 women, found those who ate 30g of fibre a day had half the risk of those who ate less than 20g.
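
For context, that halved relative risk applies to a small absolute baseline. A minimal calculation from the two cohort figures quoted above (study duration is not modeled):

```python
# Absolute-risk context for the halved relative risk reported above,
# using only the cohort figures quoted in the study summary.
women = 35_000
breast_cancer_cases = 257

overall_incidence = breast_cancer_cases / women
print(f"overall incidence in the cohort: {overall_incidence:.2%}")  # → 0.73%
```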

Also, reduce protein intake while increasing vitamin C intake.

257 pre-menopausal women developed breast cancer during the study, which was initially funded by the World Cancer Research Fund.

They were found to be women who had a greater percentage of energy derived from protein, and lower intakes of dietary fibre and vitamin C, compared to women who did not develop cancer.

Eat beans and berries. Eat whole grains. Vegetables too.

By Randall Parker 2007 January 23 11:23 PM  Aging Diet Cancer Studies
Entry Permalink | Comments(0)
Tomatoes And Broccoli Against Prostate Cancer

Broccoli and tomatoes shrink prostate cancer tumors in rats.

URBANA - A new University of Illinois study shows that tomatoes and broccoli--two vegetables known for their cancer-fighting qualities--are better at shrinking prostate tumors when both are part of the daily diet than when they're eaten alone.

"When tomatoes and broccoli are eaten together, we see an additive effect. We think it's because different bioactive compounds in each food work on different anti-cancer pathways," said University of Illinois food science and human nutrition professor John Erdman.

In a study published in the January 15 issue of Cancer Research, Erdman and doctoral candidate Kirstie Canene-Adams fed a diet containing 10 percent tomato powder and 10 percent broccoli powder to laboratory rats that had been implanted with prostate cancer cells. The powders were made from whole foods so the effects of eating the entire vegetable could be compared with consuming individual parts of them as a nutritional supplement.

Other rats in the study received either tomato or broccoli powder alone; or a supplemental dose of lycopene, the red pigment in tomatoes thought to be the effective cancer-preventive agent in tomatoes; or finasteride, a drug prescribed for men with enlarged prostates. Another group of rats was castrated.

After 22 weeks, the tumors were weighed. The tomato/broccoli combo outperformed all other diets in shrinking prostate tumors. Biopsies of tumors were evaluated at The Ohio State University, confirming that tumor cells in the tomato/broccoli-fed rats were not proliferating as rapidly. The only treatment that approached the tomato/broccoli diet's level of effectiveness was castration, said Erdman.

"As nutritionists, it was very exciting to compare this drastic surgery to diet and see that tumor reduction was similar. Older men with slow-growing prostate cancer who have chosen watchful waiting over chemotherapy and radiation should seriously consider altering their diets to include more tomatoes and broccoli," said Canene-Adams.

How much tomato and broccoli should a 55-year-old man concerned about prostate health eat in order to receive these benefits? The scientists did some conversions.

You'd need to eat a cup and a half of broccoli and a half cup of tomato paste to get a similar dose scaled up to human size. I do not see consumption of so much tomato paste as a problem. But the broccoli? Ugh.

"To get these effects, men should consume daily 1.4 cups of raw broccoli and 2.5 cups of fresh tomato, or 1 cup of tomato sauce, or ½ cup of tomato paste. I think it's very doable for a man to eat a cup and a half of broccoli per day or put broccoli on a pizza with ½ cup of tomato paste," said Canene-Adams.

What I want to know: Can cabbage serve in place of broccoli as a prostate cancer risk reducer?

Tomatoes reduce testosterone in rats. Do they have this effect in humans?

Another recent Erdman study shows that rats fed the tomato carotenoids phytofluene, lycopene, or a diet containing 10 percent tomato powder for four days had significantly reduced testosterone levels. "Most prostate cancer is hormone-sensitive, and reducing testosterone levels may be another way that eating tomatoes reduces prostate cancer growth," Erdman said.

I've long suspected that many common foods have pharmacological effects. If a large study systematically put people on a variety of controlled diets with only a few foods in each diet and then measured many hormones and other blood markers, my guess is all sorts of interactions would pop up from the data. Lots of compounds in foods accidentally bind at locations in human bodies and change how our metabolisms function.

By Randall Parker 2007 January 23 11:20 PM  Aging Diet Cancer Studies
Entry Permalink | Comments(5)
2007 January 22 Monday
Altruistic People Differ In Brain Scans

Researchers Scott A. Huettel and Dharol Tankersley at Duke University have found that people who are more altruistic have more activity in the posterior superior temporal sulcus region of the brain while watching a computer play a game.

In the study, researchers scanned the brains of 45 people while they either played a computer game or watched the computer play the game on its own. In both cases, successful playing of the game earned money for a charity of the study participant's choice.

The researchers scanned the participants' brains using a technique called functional magnetic resonance imaging (fMRI), which uses harmless magnetic pulses to measure changes in oxygen levels that indicate nerve cell activity.

The scans revealed that a region of the brain called the posterior superior temporal sulcus was activated to a greater degree when people perceived an action -- that is, when they watched the computer play the game -- than when they acted themselves, Tankersley said. This region, which lies in the top and back portion of the brain, is generally activated when the mind is trying to figure out social relationships.

The researchers then characterized the participants as more or less altruistic, based on their responses to questions about how often they engaged in different helping behaviors, and compared the participants' brain scans with their estimated level of altruistic behavior. The fMRI scans showed that increased activity in the posterior superior temporal sulcus strongly predicted a person's likelihood for altruistic behavior.

Do people do better in some occupations if they have more or less activity in the posterior superior temporal sulcus? Imagine a career counselor telling someone "You have so little activity in the posterior superior temporal sulcus there's no way a career in nursing makes any sense. How about sales?".

These scientists hypothesize that something about how people model the world makes them more likely to commit altruistic acts.

According to the researchers, the results suggest that altruistic behavior may originate from how people view the world rather than how they act in it.

"We believe that the ability to perceive other people's actions as meaningful is critical for altruism," Tankersley said.

Not sure what he means by "meaningful". Any speculations?

A twins study found that about half of the tendency toward altruism is genetic. The ability to identify those who are altruistic combined with cheap genetic testing will lead to the identification of the genetic variations that make people more or less altruistic. Psychopathy is also at least partially genetically determined. The same will also turn out to be the case for other ways in which people differ cognitively.

I've previously expressed my conviction that when people can choose genetic variations for their offspring they will choose to make their kids more genetically determined. In other words, people will leave less to chance. If they want their kids to be altruistic they'll choose those genetic variations that absolutely assure altruism. If they want their kids to be selfish they'll choose genes that leave no role for chance in the outcome.

I'm worried about that genetically more determined future for a few reasons. First off, people won't all choose the same sets of characteristics. Imagine one group decides to make their kids more altruistic while another group makes their kids more selfish. They'll disagree more deeply. The differences in outlooks will widen. Big divisions can lead to civil wars, wars between nations, and other problems. So one problem is that we'll get more people at the extremes as parents give their kids stronger doses of whatever qualities they like in themselves or wish they possessed.

Another problem is that some people will choose qualities for their children that make those kids lousier citizens and lousier human beings. I happen to disagree with Objectivists who believe that the ability to reason alone is enough to equip people with the potential to respect the rights of others. For example, the impulse to carry out altruistic punishment is probably essential in the vast majority of a populace in order for the criminal justice system to work and for people to deal fairly with each other in a wide range of work and social settings.

I can imagine why some people (especially those who have a weak or non-existent impulse to carry out altruistic punishment) will choose to make offspring that lack that instinct. Will enough make that choice that some future generation will have less of that desire?

I think altruism serves a useful and even necessary function in some contexts. At the same time, it causes problems. We need to learn more about how altruism works and what causes it to get expressed in pathological ways (e.g. stifling high tax welfare states that reduce the costs of irresponsible behaviors and reduce the incentives and means to carry out more productive behaviors). Do some genetic variations for altruism deliver net benefits while others deliver net damage? We won't all agree on the answers to that question even when the data is in. Differences in values (at least some of which will be genetically caused) will cause differences in decisions about which effects are good or bad.

By Randall Parker 2007 January 22 10:19 PM  Brain Altruism
Entry Permalink | Comments(9)
2007 January 21 Sunday
Mitochondria Activation Reduces Cancer

A small molecule activates suppressed mitochondria in cancer cells and the cells start acting normal.

January 16, 2007 - Edmonton - DCA is an odourless, colourless, inexpensive, relatively non-toxic, small molecule. And researchers at the University of Alberta believe it may soon be used as an effective treatment for many forms of cancer.

One qualifier to the above statement: Whether dichloroacetate (DCA) would really be non-toxic when used in therapeutic doses against cancer remains to be seen. When used to treat a genetic disorder involving high lactic acid, DCA caused peripheral neuropathy. DCA inhibits a kinase enzyme that deactivates an enzyme called pyruvate dehydrogenase (PDH) which is involved in mitochondrial metabolism (i.e., it is involved in sugar breakdown to make energy).

Dr. Evangelos Michelakis, a professor at the U of A Department of Medicine, has shown that dichloroacetate (DCA) causes regression in several cancers, including lung, breast and brain tumors.

Michelakis and his colleagues, including post-doctoral fellow Dr. Sebastian Bonnet, have published the results of their research in the journal Cancer Cell.

Many cancer cells do not break sugar down completely. They just do a step called glycolysis. They do not do a step called the Krebs cycle (aka the citric acid cycle or tricarboxylic acid cycle or TCA cycle) which extracts all the energy out of sugar molecules to make energy carrier molecules called NADH and ATP. This was first observed about cancer all the way back in the 1930s. Up until now the assumption to explain this was that cancer cells lost that ability. But this result suggests that not only do cancer cells suppress that ability but that suppression helps them grow uncontrollably.
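
The energetic difference between the two modes is large. Here is a rough comparison using standard textbook ATP yields; these numbers are illustrative, not from the study, and the full-oxidation figure varies between roughly 30 and 38 depending on the accounting used.

```python
# Approximate ATP yield per glucose molecule (textbook figures).
glycolysis_only = 2    # net ATP from glycolysis alone
full_oxidation = 30    # glycolysis + Krebs cycle + oxidative phosphorylation

factor = full_oxidation // glycolysis_only
print(f"ATP yield sacrificed by skipping the mitochondria: {factor}x")  # → 15x
```

That a cell would voluntarily forgo most of its energy yield is what makes the suppression look like an active strategy rather than an accident.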

Pyruvate dehydrogenase (PDH) synthesizes acetyl-CoA which is used in the first step of the TCA cycle in mitochondria. If DCA has either toxicity problems or problems with achieving sufficient doses that does not defeat this approach to anti-cancer drug development. The kinase that DCA blocks could become a target for drug development. A drug that would disable that kinase would likely activate mitochondria in cancer cells just like DCA does.

I remember a scientist telling me decades ago that classic intermediary metabolism doesn't get the attention it deserves because everyone was rushing into genetics. Many scientists decided that there was little of interest left to learn from studying the main pathways of energy metabolism. This result argues for his view. How did we get all the way to the year 2007 without noticing sooner such powerful results from a simple, long-known molecule?

Michelakis decided the conventional wisdom on cancer and mitochondria might be wrong and decided to test it.

Until recently, researchers believed that cancer-affected mitochondria are permanently damaged and that this damage is the result, not the cause, of the cancer. But Michelakis, a cardiologist, questioned this belief and began testing DCA, which activates a critical mitochondrial enzyme, as a way to "revive" cancer-affected mitochondria.

The results astounded him.

Michelakis and his colleagues found that DCA normalized the mitochondrial function in many cancers, showing that their function was actively suppressed by the cancer but was not permanently damaged by it.

More importantly, they found that the normalization of mitochondrial function resulted in a significant decrease in tumor growth both in test tubes and in animal models. Also, they noted that DCA, unlike most currently used chemotherapies, did not have any effects on normal, non-cancerous tissues.

No single molecule is going to cure all cancers by itself. But combinations of compounds where all have toxicity highly specific to cancer cells will certainly end up curing a great many cancers. Monoclonal antibodies targeted at cancers, anti-angiogenesis compounds that block blood vessel growth in cancers, gene therapies that activate in cancer cells, and assorted other compounds such as DCA are going to cure many cancers when used in combination.

"I think DCA can be selective for cancer because it attacks a fundamental process in cancer development that is unique to cancer cells," Michelakis said. "Cancer cells actively suppress their mitochondria, which alters their metabolism, and this appears to offer cancer cells a significant advantage in growth compared to normal cells, as well as protection from many standard chemotherapies. Because mitochondria regulate cell death - or apoptosis - cancer cells can thus achieve resistance to apoptosis, and this appears to be reversed by DCA."

The suppression of mitochondria might be a way for cancer cells to divide in the low oxygen environments found deep in tumors lacking sufficient vasculature. Turning on the mitochondria in these cells probably increases their need for oxygen, and that likely contributes to their death. This suggests that DCA might work well in combination with anti-angiogenesis drugs, since blocking blood vessel growth will decrease the amount of oxygen available to tumors and therefore make more cells in tumors susceptible to the effects of DCA.

DCA (aka Ceresine) has a big problem: It is not patentable and hence provides little incentive for commercial companies to raise money to fund clinical studies to develop it as an anti-cancer drug. People who are philosophically opposed to patents ought to take note of this.

Furthermore, the DCA compound is not patented or owned by any pharmaceutical company, and, therefore, would likely be an inexpensive drug to administer, Michelakis added.

However, as DCA is not patented, Michelakis is concerned that it may be difficult to find funding from private investors to test DCA in clinical trials. He is grateful for the support he has already received from publicly funded agencies, such as the Canadian Institutes for Health Research (CIHR), and he is hopeful such support will continue and allow him to conduct clinical trials of DCA on cancer patients.

If DCA is on the market in less regulated countries then maybe it'll get tried out in human cancer patients under less restrictive regulatory regimes.

DCA hasn't been tried yet in humans against cancer.

Evangelos Michelakis of the University of Alberta in Edmonton, Canada, and his colleagues tested DCA on human cells cultured outside the body and found that it killed lung, breast and brain cancer cells, but not healthy cells. Tumours in rats deliberately infected with human cancer also shrank drastically when they were fed DCA-laced water for several weeks.

People who have fatal diseases should be allowed to try anything as a treatment.

By Randall Parker 2007 January 21 09:06 PM  Biotech Cancer
Entry Permalink
2007 January 20 Saturday
Plasma Converts Wastes To Energy

How about an energy technology that will reduce the need for landfills while replacing as much as a quarter of the gasoline burned in the United States?

The technology, developed originally by researchers at MIT and at Batelle Pacific Northwest National Labs (PNNL), in Richland, WA, doesn't incinerate refuse, so it doesn't produce the pollutants that have historically plagued efforts to convert waste into energy. Instead, the technology vaporizes organic materials to produce hydrogen and carbon monoxide, a mixture called synthesis gas, or syngas, that can be used to synthesize a wide variety of fuels and chemicals. The technology has been further developed and commercialized by a spinoff called Integrated Environmental Technologies (IET), also based in Richland, WA. In addition to processing municipal waste, the technology can be used to create ethanol out of agricultural biomass waste, providing a potentially less expensive way to make ethanol than current corn-based plants.

If you go to the IET company history page you'll find the base technology was originally developed using US Department of Energy research funds. The DOE wanted a way to better process nuclear wastes. But the scientists involved in the work recognized they could adapt the technology to process a larger range of wastes and produce energy as a result. I point all this out for the benefit of my orthodox libertarian readers who repeatedly argue that government funded research can't solve our energy problems.

There is enough municipal and industrial waste produced in the United States for the system to replace as much as a quarter of the gasoline used in this country, says Daniel Cohn, a cofounder of IET and a senior research scientist at the Plasma Science and Fusion Center.

But can the process make ethanol at a lower cost than corn? IET thinks its process will cost 95 cents per gallon of ethanol or less. Since a gallon of ethanol contains only about two thirds the energy of a gallon of gasoline, you have to multiply by roughly 1.5 to get the equivalent cost per gasoline gallon.
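As a quick sanity check on that conversion (a minimal sketch; the 95-cent figure is IET's own projection and the 1.5 multiplier is the approximate energy ratio, not an exact constant):

```python
# Gasoline-equivalent cost of ethanol: a gallon of ethanol carries roughly
# two thirds the energy of a gallon of gasoline, so multiply by ~1.5.
ETHANOL_COST_PER_GALLON = 0.95  # dollars, IET's projected cost
ENERGY_RATIO = 1.5              # gallons of ethanol per gasoline-equivalent gallon

gasoline_equivalent_cost = ETHANOL_COST_PER_GALLON * ENERGY_RATIO
print(f"${gasoline_equivalent_cost:.2f} per gasoline-equivalent gallon")
```

That works out to roughly $1.42 per gasoline-equivalent gallon, which is the number to compare against wholesale gasoline prices, not against corn ethanol's per-gallon price.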

IET has a big cost advantage over corn ethanol plants: They would get paid to take the waste they'll use as inputs to their plants. How much of an advantage would depend on how much they get paid for the waste. If their plants could be designed to fit in densely populated areas (think a borough of New York City) then in heavily urban areas they'd save on hauling costs to take the trash to more distant landfills. In rural areas their plants would tend to be further away than the nearest dump. So their technology strikes me as better suited for the most densely populated areas.

IET has not yet decided how best to use the carbon monoxide and hydrogen their process produces. They need a cost effective system with a catalyst that'll bind the hydrogen to the carbon to produce liquid hydrocarbons. Can they get their costs low enough to do that?

Since they have hydrogen as an intermediate product with the right catalyst they might be able to produce a carbon-based liquid fuel that is more reduced (has more hydrogen) than ethanol does. That'd make the fuel more like gasoline and reduce the frequency of gasoline station stops by a third as compared to ethanol.

By Randall Parker 2007 January 20 12:35 PM  Energy Biomass
Entry Permalink | Comments(19)
2007 January 18 Thursday
Human Brains Have Limited Parallel Processing Capabilities

Vanderbilt University neuroscientists Paul E. Dux and René Marois used functional magnetic resonance imaging of human brains to discover how the brain responds to the need to perform two tasks at once. They discovered parts of the brain that are bottlenecks which serialize the processing of information for multiple tasks.

To overcome this limitation, Dux and Marois rapidly sampled brain activity using fMRI while subjects were performing two demanding tasks. Evaluation of the data produced by this rapid sampling method allowed them to characterize the temporal pattern of activity in specific brain areas.

The two tasks consisted of pressing the appropriate computer key in response to hearing one of eight possible sounds and uttering an appropriate syllable in response to seeing one of eight possible images. Different senses and motor responses were enlisted in order to ensure that any interference between the two tasks was not specific to a particular sensory or motor modality, but instead originated at a central information-processing bottleneck.

The results revealed that the central bottleneck was caused by the inability of the lateral frontal and prefrontal cortex, and also the superior frontal cortex, to process the two tasks at once. Both areas have been shown in previous experiments to play a critical role in cognitive control.

"We determined these brain regions responded to tasks irrespective of the senses involved, they were engaged in selecting the appropriate response, and, most importantly, they showed 'queuing' of neural activity--the neural response to the second task was postponed until the response to the first was completed," Dux said.

"Neural activity seemed to be delayed for the second task when the two tasks were presented nearly simultaneously – within 300 milliseconds of each other," Marois said. "If individuals have a second or more between tasks, we did not see this delay."

What I'd like to know: Do higher IQ people have an enhanced ability to process two tasks at once? Or do they serialize just as strictly but finish processing each task more quickly?

When you do two tasks at once your response to stimuli for each task gets slowed for as much as a second. So all those people driving around with cell phones are at greater risk for causing an accident.

"I'm Australian, and it's illegal there, so I'm trained not to," Dux said. "Even so, I would never do it. Dual-task costs can be up to a second, and that's a long time when you're traveling at 60 miles per hour."

It would be really handy to have a greater capacity to process two or three or more problems at once. It would also be really handy to have a much larger short term memory. One of the challenges of future post-human genetic engineering is to develop DNA sequences that code for brains that can handle more problems at once.

I am expecting the use of genetic engineering on offspring to make it easy to expand short term memory working set size. Plenty of people have bigger memory working sets. We'll be able to compare their genetic sequences to those of lesser minds and identify the best genes to tweak for bigger short term memories. But will we discover genetic variations that increase the ability of human minds to do many tasks at once? Do such variations exist in the human population? Or are the architectural changes needed to allow parallel processing on major problems too big for such genetic variations to come into existence naturally?

By Randall Parker 2007 January 18 10:57 PM  Brain Limits
Entry Permalink | Comments(11)
Folic Acid Slows Brain Aging Unless B-12 Deficient

In the January 2007 issue of the American Journal of Clinical Nutrition, A. David Smith of the University of Oxford surveys recent reports on the health effects of higher folic acid consumption. In a nutshell: folic acid appears to slow brain aging in those who have plenty of B-12, but in those with a B-12 deficiency higher folic acid consumption makes the brain decline more rapidly.

Interestingly, Morris et al report both a "good" and a "not-so-good" side of folate. In agreement with current knowledge, they found that a low vitamin B-12 status is associated with macrocytosis, anemia, and cognitive impairment. The key finding in this report concerns interactions between folate status and vitamin B-12 status. The "good news" is that, in subjects with a normal vitamin B-12 status, high serum folate (>59 nmol/L) was associated with protection from cognitive impairment. This finding is remarkable in a population with a much higher mean folate concentration (39 nmol/L) than that seen in countries where there is no mandatory folate fortification. A similar result was reported for Latinos living in California, where higher red blood cell folate concentrations after fortification were associated with protection from cognitive impairment and dementia (11). Simply put, if your vitamin B-12 status is good, folate supplementation is good for you!

So far so good. But there's a gray lining in that silver cloud. Higher folic acid consumption appears to lower cognitive function in those with low B-12 in their blood serum.

The "not-so-good" news from the study by Morris et al is that the relation between high serum folate and cognitive impairment was reversed in subjects who had a low vitamin B-12 status. Those with a low vitamin B-12 status (serum cobalamin <148 pmol/L) and high serum folate (>59 nmol/L) had an odds ratio for cognitive impairment of 5 compared with those whose vitamin B-12 status and folate status were both normal. This group, which had a low vitamin B-12 status and a high serum folate concentration also had an odds ratio close to 5 for anemia. Thus, the simple interpretation is that the cognitive impairment and anemia usually associated with low vitamin B-12 status are made much worse by a high folate status.

This result supports the theory of some nutrition researchers that folic acid supplementation doesn't just mask B-12 deficiency. Folic acid makes the damaging effects of B-12 deficiency worse. One of the effects of B-12 deficiency is neurological damage.
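For concreteness, an odds ratio of 5 means the odds of impairment in the low-B12/high-folate group are five times the odds in the reference group. The mechanical calculation from a 2x2 table looks like this (the counts below are hypothetical, chosen only to illustrate an odds ratio of 5; they are not the Morris et al data):

```python
# Odds ratio from a 2x2 table: (cases/controls in the exposed group)
# divided by (cases/controls in the unexposed group).
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    return (exposed_cases / exposed_controls) / (unexposed_cases / unexposed_controls)

# Hypothetical: low-B12/high-folate group with 25 impaired vs 50 unimpaired,
# normal-B12/normal-folate group with 10 impaired vs 100 unimpaired.
print(odds_ratio(25, 50, 10, 100))  # 5.0
```

Note an odds ratio is not the same as "five times as likely"; for uncommon outcomes it approximates the relative risk, but the two diverge when the outcome is common.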

You might think you are not at risk of B-12 deficiency because you eat lots of meat or perhaps you take a B-12 supplement. You might be right. But some people have an impaired ability to absorb B-12. So without a blood test you can't be absolutely sure that your B-12 status is fine.

4% of the elderly in this one study had both high folic acid and low B12.

Morris et al found that ≈4% of the elderly persons they studied had a combination of low vitamin B-12 status and high folate status. If the same proportion of all elderly persons in the United States is affected, then ≈1.8 million elderly might be at increased risk of cognitive impairment and anemia because of an imbalance between folate and vitamin B-12.

If you are deficient in B-12 you are at increased risk of neural damage already. The inability to absorb B-12 rises with age. But periodic B-12 injections can restore B-12 levels. So those with B12 deficiency are best off treating their problem with diet and with injections if necessary.

Dr. Jane Durga of Wageningen University in the Netherlands just published a paper in The Lancet finding that late-middle-aged and early-elderly people who take daily folic acid have higher performing brains.

Researchers found that men and post-menopausal women aged between 50 and 70 who took daily doses had the mental abilities of those almost five years their junior.

The supplements also helped maintain speed of information processing, reactions involving movement and overall brain power. These abilities decline with age, and their loss has been linked to a higher risk of dementia.

Another study just published in Archives of Neurology found those with higher folic acid consumption have reduced risk of Alzheimer's Disease.

The study, led by Dr. Jose Luchsinger of Columbia University Medical Center in New York, looked at 965 people age 65 and older in Manhattan. Those with higher levels of folate through diet and supplements were less likely to get the devastating brain ailment, the study found.

In an attempt to reduce the incidence of neural tube defects, the US Food and Drug Administration mandated the fortification of many grains with folic acid in 1998. But the whole grains used in whole grain breads are not fortified. So a popular shift toward whole grains, combined with attempts by many women to reduce carbohydrate consumption to control weight, has reduced the concentration of folic acid in the blood of American women since 2000.

Now, it seems, even that first sign of progress is eroding — an apparent victim of dietary shifts, obesity and the stubborn resistance of women in their childbearing years to taking a multivitamin. In a report issued Jan. 5, the CDC found that among women in their childbearing years, blood folate levels had declined 16% by 2004 from the levels recorded in 2000.

The March of Dimes is calling for a doubling of grain folic acid fortification. But as the first article above shows, some scientists are afraid higher folic acid consumption will cause net harm to millions who do not absorb enough B-12.

It might be a good idea to get your blood B-12 tested. If you are deficient then you can change your diet or get a periodic B-12 shot or try taking a B-12 supplement. Once you have enough B-12 then boosting your folic acid is probably a good idea. My advice: Get the folic acid from beans and greens. You'll derive numerous other benefits that way. If you avoid animal products in your diet then get B-12 in a supplement or in highly fortified foods (e.g. Total cereal).

By Randall Parker 2007 January 18 09:29 PM  Aging Diet Brain Studies
Entry Permalink | Comments(3)
2007 January 17 Wednesday
Total US Cancer Deaths Decline 2nd Year In Row

Not only did the rate of cancer deaths per 100,000 go down; the total number of people who died from cancer in the United States went down in 2003 and 2004 as well.

Fewer people died of cancer in 2004 than in 2003, marking the second consecutive year that cancer deaths have declined in the United States, a new American Cancer Society report shows. According to Cancer Statistics 2007, there were 3,014 fewer cancer deaths in 2004 compared to the previous year. The report is published in the latest issue of the ACS journal CA: A Cancer Journal for Clinicians.

That number is much higher than the drop of 369 deaths reported between 2002 and 2003. And that suggests the trend is more than just a statistical blip, experts say.

A decline in deaths from colorectal cancer caused about a third of the total decline. Part of the decline was due to wider use of colonoscopy to remove polyps and catch cancer sooner. But new drugs also are driving down the death rate from colorectal cancer.

Improved treatment has also played a part in lowering the death rate from colorectal cancer, Dr. Neugut said. “There was a revolution in treatment between 1998 and 2000, and revolution is a mild word,” he said. “We went from having one drug to having six or seven good drugs. The cure and survival rates have increased dramatically as a result. The cost of care has also gone up, but you get what you pay for.”

The total number of cancer deaths declined in spite of both an aging population and a growing population.

The death rate from cancer has been falling by slightly less than 1 percent a year since 1991, but until 2003 the actual number of deaths kept rising because the population was growing and aging. Then, in 2003, the cumulative drop in death rates finally became large enough to outpace aging and population growth.

Cancer deaths can be driven down further by vaccines against viral causes of cancer and antibiotics against bacterial causes. The new Gardasil vaccine against human papilloma virus will lower the incidence of cervical cancer. Also, vaccines against hepatitis B can lower the incidence of liver cancer. Wider testing for Helicobacter pylori bacterial infections of the stomach could lead more people to take antibiotics to cure them. That would lower the incidence of stomach cancer. Identification of more pathogenic causes of cancer will lead to the development of still more ways to prevent cancer.

The percentage decline in the cancer death rate has increased in recent years.

In 2003 and 2004, the cancer death rate declined by about 2 percent each year -- more than offsetting the effects of aging and population growth.
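The tipping point described above is just the race between two exponentials: total deaths are the death rate times the population, so a falling rate is masked until it declines faster than population growth plus aging. A toy model (figures below are illustrative, not the actual US numbers):

```python
# Total deaths = death rate per 100,000 * population.
def total_deaths(rate_per_100k, population):
    return rate_per_100k / 100_000 * population

rate = 200.0        # hypothetical deaths per 100,000
population = 280e6  # hypothetical population
growth = 1.012      # 1.2%/yr effective increase from population growth + aging

before = total_deaths(rate, population)
after_1pct = total_deaths(rate * 0.99, population * growth)  # ~1% rate decline
after_2pct = total_deaths(rate * 0.98, population * growth)  # ~2% rate decline

print(after_1pct > before)  # True: totals keep rising despite a falling rate
print(after_2pct < before)  # True: a 2% rate decline finally outpaces growth
```

This is why the rate could fall for over a decade before the absolute death count finally turned down.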

We are going to witness an acceleration in the decline of the cancer death rate. Combinations of anti-angiogenesis factors to stop blood vessel growth in tumors will stop some cancers. Immune approaches such as monoclonal antibodies carrying toxins will get better.

We are also going to see some big successes with the use of stem cells against cancer. A stem cell treatment was used to locally activate chemo agents mostly at tumor sites and this cured all treated mice of neuroblastoma cancer.

Researchers at City of Hope and St. Jude Children's Research Hospital may have found a way to treat cancers that have spread throughout the body more effectively. They used modified neural stem cells to activate and concentrate chemotherapeutic drugs predominately at tumor sites, so that normal tissue surrounding the tumor and throughout the body remain relatively unharmed.

The neural stem cells are attracted to compounds that tumors secrete. Possibly the compounds which attract the neural stem cells are angiogenesis compounds that stimulate blood vessel growth. Or perhaps the damaged nature of cancer cells causes them to secrete lots of free radical compounds that signal damage which stem cells rush in to repair.

Most chemotherapy drugs affect both normal and cancerous tissue, which is why they also are toxic to naturally fast-growing cells in the body such as hair follicles and intestinal cells. Aboody and her colleagues have developed a two-part system to infiltrate metastatic tumor sites, and then activate a chemotherapeutic drug, thereby localizing the drug's effects to the tumor cells.

The technique takes advantage of the tendency for invasive tumors to attract neural stem cells. The researchers injected modified neural stem/progenitor cells into immunosuppressed mice that had been given neuroblastoma cells, which then formed tumors. After waiting a few days to allow the stem cells to migrate to the tumors, researchers administered a precursor-drug. When it reached the stem cells, the drug interacted with an enzyme the stem cells expressed, and was converted into an active drug that kills surrounding tumor cells. The precursor-drugs were administered for two weeks, then after a two-week break, a second round of stem/progenitor cells and drugs were administered.

One hundred percent of the neuroblastoma mice appeared healthy and tumor-free at six months. Without treatment, all the neuroblastoma mice died within two-and-a-half months.

The results hold promise for treating solid tumors that metastasize including neuroblastoma, which represents 6 percent to 10 percent of all childhood cancers worldwide, with higher proportions in children under 2 years of age.

Cancer is not an unsolvable problem. To study and develop treatments for cancer scientists are getting far more powerful tools. Nanotechnologies such as microfluidics are going to accelerate the rate of progress of biomedical research by orders of magnitude. If you do not get cancer in the next 20 years you aren't going to die from it. Our tools are going to become far more powerful than the disease.

By Randall Parker 2007 January 17 10:12 PM  Biotech Cancer
Entry Permalink | Comments(8)
2007 January 16 Tuesday
Molecule Excites Appetite-Stimulating Neurons When Fasting

Your brain works overtime to make you eat when you fast.

During periods of fasting, brain cells responsible for stimulating the appetite make sure that you stay hungry. Now, a new study of mice reported in the January issue of the journal Cell Metabolism, published by Cell Press, reveals the complex series of molecular events that keep those neurons active.

The researchers revealed a link between active thyroid hormone in the brain and increases in an "uncoupling" protein (UCP2) that boosts the number of power-generating mitochondria in neurons that drive hunger. The increase in mitochondria, in turn, allows the brain's hunger center to remain active when periods of food scarcity result in a "negative energy balance," said Sabrina Diano of Yale University School of Medicine, who led the study.

Indeed, the researchers found, animals lacking either UCP2 or an enzyme that stimulates thyroid hormone's production ate less than normal after a period of food deprivation.

"This shows the key importance of UCP in the brain and its effect on neuronal activity," Diano said. "It's how neurons 'learn' that food is missing, and it keeps them ready to eat when food is introduced."

A drug that targets this mechanism would make fasting much easier.

Appetite-stimulating neurons get more mitochondria made in them. Those mitochondria produce more energy molecules that drive the appetite-stimulating neurons to send out more signals to produce a stronger desire to eat.

Now, the researchers found that support cells in the hypothalamus producing an enzyme that catalyzes active thyroid hormone production are side by side with appetite-stimulating neurons that express UCP2. In mice that were fasted for 24 hours, the arcuate nucleus showed an increase in the "DII" enzyme's activity and local thyroid production, in parallel with increased UCP2 activity.

This fasting-induced, T3-mediated UCP2 activation resulted in mitochondrial proliferation in the neurons, an event that was critical for the brain cells' increased excitability and consequent rebound feeding by the animals following food deprivation.

"Our results indicate that this mechanism is critical in sustaining an increased firing rate in these [hunger-stimulating] cells so that appetite remains elevated during fasting," Diano's group concluded. "Overall, our study provides strong evidence for an interplay between local T3 production and UCP2 during fasting and reveals a central thermogenic-like mechanism in the regulation of food intake."

Strong genetically driven mechanisms to make us eat are the product of selective pressures going back millions of years. Food shortages were a very common problem. Hence we have neural mechanisms that make us eat too much in an era of plenty. We need to develop biotechnologies that let us control our instincts for food.

By Randall Parker 2007 January 16 09:47 PM  Brain Appetite
Entry Permalink | Comments(0)
Growth Hormone Bad Anti-Aging Bet

Older folks who are taking growth hormone aren't doing themselves any favors.

PHILADELPHIA, Jan. 16, 2007 -- A review of published data on use of human growth hormone (GH) by healthy elderly people found that the synthetic hormone was associated with small changes in body composition but not in body weight or other clinically important outcomes.

Further, people who took GH had increased rates of unhealthy side effects such as soft tissue swelling, joint pain, carpal tunnel syndrome, and, in men, abnormal breast development. They were also somewhat more likely to develop diabetes.

The review, "The Safety and Efficacy of Growth Hormone in the Healthy Elderly," was published in the Jan. 16, 2007, issue of Annals of Internal Medicine and is available on the Web at www.annals.org on that day.

"Growth hormone has been widely promoted as an anti-aging therapy," said Hau Liu, MD, a research fellow in endocrinology and health policy at Stanford University and an author of the review.

"But the scant clinical experience of GH in the healthy elderly suggests that although GH may minimally alter body composition, it does not improve other clinically relevant outcomes such as bone density, cholesterol levels, stamina, and longevity in this population.

"And it's associated with high rates of adverse events.

"So, on the basis of available evidence, we cannot recommend growth hormone use for anti-aging in the healthy elderly."

Growth hormone always struck me as a bad idea for rejuvenation. It is about growth and we can't go on growing in size. If higher hormone levels could extend life then selective pressures would likely have caused humans to make more hormones in their old age. But lower hormone levels probably reduce the risk that aged cells will get too stimulated and go cancerous.

We need gene therapies, cell therapies and replacement organs to reverse the accumulated damage that comes with age. Hormones aren't going to do it.

By Randall Parker 2007 January 16 09:32 PM  Aging Drugs
Entry Permalink | Comments(5)
2007 January 15 Monday
Abused Children Have More Inflammation As Adults

Children who were physically or sexually abused show higher levels of inflammation indicators in blood as adults.

Researchers at King's College London followed 1,000 people in New Zealand from birth to the age of 32.

A third of those who were maltreated had high levels of inflammation - an early indicator of conditions such as heart disease and diabetes.


They took blood samples to measure levels of C-reactive protein, fibrinogen and white blood cells - substances which are known to be associated with inflammation in the body.

Adult survivors of childhood maltreatment who appeared to be healthy were twice as likely to show clinically relevant levels of inflammation compared to those who had not been maltreated.

So you get abused as a kid. Bad enough. But then you go on to suffer more diseases when you get older. The suffering lasts a lifetime. How incredibly cruel.

Twenty years hence will cheap in-school testing of kids for elevated inflammation response get used to spot kids who might be getting abused at home?

Abused kids are already known to get more heart disease and other illnesses which could be caused by sustained inflammation response.

The findings could explain why children who are abused show a higher incidence of conditions such as heart disease and diabetes as adults, the researchers say. Until now, it has not been clear exactly how early stress could cause these future health problems, says Andrea Danese, a psychiatrist at King's College London in the UK.

What is the mechanism? Do the various endocrine organs become more prone to turn on the inflammatory responses? Or does brain development in the abused alter in a way that makes the brain send out stress chemicals in potentially stressful situations? I'm going to guess that the latter mechanism is at least partially responsible because abuse of children makes them more prone to violence when they get older - especially if they have the right version of the gene for the monoamine oxidase A enzyme.

This brings up another thought: Could people who have higher levels of stress-related inflammation indicators get trained by biofeedback or other means to reduce their inflammatory response? It might not be that easy. During childhood development cells throughout the body might have gotten their epigenetic state altered to make them more prone to inflammation response.

More generally: We need better ways to dampen down inflammation responses. We have lots of responses that have ceased to be adaptive in the modern environment. Ever been in an argument at work where you felt the "fight or flight" urge? That response is a maladaptive vestige of our evolutionary history. You might some day find yourself in a situation where the adrenaline rush could help you survive. But in most cases the response just makes you age more rapidly.

By Randall Parker 2007 January 15 09:54 PM  Brain Development
Entry Permalink | Comments(1)
2007 January 14 Sunday
Exercise Participation Partly Inherited

No need to meekly accept the disapproving moral sanction of others when you do not exercise enough. If you do not like to exercise blame it on your genes.


A sedentary lifestyle remains a major threat to health in contemporary societies. To get more insight in the relative contribution of genetic and environmental influences on individual differences in exercise participation, twin samples from seven countries participating in the GenomEUtwin project were used.


Self-reported data on leisure time exercise behavior from Australia, Denmark, Finland, Norway, the Netherlands, Sweden and United Kingdom were used to create a comparable index of exercise participation in each country (60 minutes weekly at a minimum intensity of four metabolic equivalents).

Principal Findings

Modest geographical variation in exercise participation was revealed in 85,198 subjects, aged 19–40 years. Modeling of monozygotic and dizygotic twin resemblance showed that genetic effects play an important role in explaining individual differences in exercise participation in each country. Shared environmental effects played no role except for Norwegian males. Heritability of exercise participation in males and females was similar and ranged from 48% to 71% (excluding Norwegian males).
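The logic behind those heritability estimates can be sketched with the classic Falconer back-of-envelope approximation, which compares how similar identical (monozygotic) twins are to fraternal (dizygotic) twins. The GenomEUtwin study itself fit full variance-components models; the correlations below are hypothetical, chosen only to land in the reported 48-71% range:

```python
# Falconer's approximation: h^2 = 2 * (r_MZ - r_DZ), where r_MZ and r_DZ
# are the within-pair trait correlations for monozygotic and dizygotic twins.
# MZ twins share ~100% of their genes, DZ twins ~50%, so the excess MZ
# similarity, doubled, estimates the genetic share of the variance.
def falconer_heritability(r_mz, r_dz):
    return 2 * (r_mz - r_dz)

print(falconer_heritability(0.62, 0.31))  # about 0.62, i.e. 62% heritability
```

If shared family environment mattered much, DZ twins would be more than half as similar as MZ twins and the estimate would shrink; the study found shared environment played essentially no role except for Norwegian males.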

This result suggests one potential solution for the tendency of people in modern societies to get insufficient exercise: Genetically engineer offspring to get more joy from exercise. Also, identification of the genetic variations that contribute to the urge to exercise could lead to development of drugs that make exercise more enjoyable.

But genetic engineering will likely lead to the development of methods to reduce the need for exercise. Look at how anabolic steroids increase muscle mass build-up in response to exercise. We'll eventually have safer ways to control the extent of muscle mass. We'll also gain complete control of appetite and find ways to cause the body to burn off excess fat. We'll also find lower exercise ways to produce other physiological benefits that we now gain from exercise.

Since an increasing fraction of all physical labor is automated we really need to find ways to adapt our metabolisms to the changes in our environment which are the results of our technology. This problem is only going to get worse as robots take over cleaning the house, mowing the lawn, and other physical chores that we desk jockeys and drivers still do. Biotechnology will provide the solutions.

By Randall Parker 2007 January 14 05:42 PM  Human Bodies Athletics
Entry Permalink | Comments(3)
SORL1 Genetic Risks For Alzheimer's Disease

Want to know if you'll slowly lose all your memory and control of your body in your 70s and 80s? Probably not. Hopefully a cure for Alzheimer's won't take more than 10 or 15 years and any genetic risk you have for Alzheimer's will never get a chance to slowly destroy your mind. But if you want to know whether you are at risk, a research team has identified yet another genetic variation that increases the risk of late-onset Alzheimer's Disease.

Researchers led by Howard Hughes Medical Institute (HHMI) international research scholar Peter St George-Hyslop have identified a new genetic risk factor associated with the most common form of Alzheimer's disease. The research implicates a gene called SORL1 in late-onset Alzheimer's, which usually strikes after age 65.

In an advance online publication in Nature Genetics on January 14, 2007, St George-Hyslop and colleagues connected the gene to the disease in six different groups of people, although they did not pinpoint the exact genetic mutations in SORL1 responsible for Alzheimer's. In their studies, the researchers used databases that include genetic information about people with and without Alzheimer’s disease. More than 6,800 individuals—45.8 percent of them affected with the disease—were included in the analysis, which is considered a large data set in the field, said St George-Hyslop.

These SORL1 variations join apolipoprotein E variation ApoE4 as known genetic risks for late onset Alzheimer's.

“We looked for variations of SORL1 in nine different groups of people and found those variations to be associated with an increased risk of Alzheimer's in six of them,” St George-Hyslop said. “That implies that SORL1 is not the only cause of Alzheimer's, but it's one of several. Some people with the disease will have a SORL1-related cause, and some won't.” St George-Hyslop is a professor in the department of medicine and director of the Center for Research in Neurodegenerative Disease at the University of Toronto and an HHMI international research scholar. Through its international research scholars program, HHMI supports leading scientists in 28 countries outside the United States.

The researchers studied several groups of Caucasians, one group of African Americans, one group of Hispanics from the Dominican Republic, and a group of Israeli Arabs. They tracked the SORL1 genes via single nucleotide polymorphisms, or SNPs, which are single-letter changes in a gene's sequence. They found that the Caucasians with Alzheimer's displayed a certain SNP signature at one end of the gene, while the African Americans, Hispanics, and Israeli Arabs with the disease displayed another SNP signature. “This implies that there are at least two, and possibly more, gene variants at work here,” said St George-Hyslop. “That's not unusual—in many diseases you see multiple variations that all impact a specific gene.”

So how can the scientists know that a gene has a variation that contributes to a disease without knowing which particular variation is responsible? See the mention of SNPs (single nucleotide polymorphisms) above. Those are locations in the genome where people differ from each other by a single letter of DNA. SNPs tend to be inherited in groups. Suppose at a particular location you have the letter A in your genome while other people have a G, and those who have the G have a greater risk of Alzheimer's. That G usually will occur along with a characteristic set of letters at other nearby SNP locations, and the A will occur with a different set of letters at those same locations. The puzzle is to figure out which of the nearby SNPs is the one that actually contributes to the disease risk.
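At its core, an association study at a single SNP is just a comparison of allele frequencies between people with the disease and people without it. A bare-bones sketch (the genotype lists are hypothetical single alleles, invented purely for illustration):

```python
from collections import Counter

# Compare the frequency of a candidate risk allele ("G") at one SNP
# between cases (people with the disease) and controls (people without).
cases    = ["G", "G", "A", "G", "G", "G", "A", "G"]
controls = ["A", "A", "G", "A", "A", "G", "A", "A"]

def allele_frequency(alleles, allele):
    return Counter(alleles)[allele] / len(alleles)

print(allele_frequency(cases, "G"))     # 0.75: G is common among cases
print(allele_frequency(controls, "G"))  # 0.25: and rarer among controls
```

A real study adds statistical tests for whether such a frequency difference could arise by chance, and repeats this across many SNPs spanning the gene, which is how the SORL1 team found distinct SNP signatures in different ethnic groups.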

The cost of testing for SNPs in genes is declining because chips are coming to market that can test for the presence of hundreds of thousands of SNPs at a time. The decline in SNP testing costs is enabling a growing flood of successful searches for genetic variations that contribute to disease risks. Within 5 years time I expect the number of discovered and easily testable genetic risk factors will become large enough to make personal DNA testing worthwhile.

But which risks will be worth testing for? Those you'll be able to do something about. Suppose a genetic variation makes Alzheimer's inevitable at middle age and that diet has little influence on when you'll get it. Well, I guess you could decide to avoid taking on family responsibilities that you won't be around to fulfill. But initially the biggest potential for doing something about a risk will involve risks that can be influenced by diet or exercise. What we need: genetic sample collection on big population studies of diets and lifestyles. Existing on-going longitudinal studies of diet and lifestyle risks could have their diet and lifestyle information compared against disease outcomes for those with high genetic disease risk to see if any dietary factors delayed or reduced the risk of major diseases.

What you should do when you discover 5 or 10 years hence that you have high genetic risk of a disease: Write your elected officials and argue for more research on the disease you are on course to get. Lobby for cures for diseases that will otherwise kill you and your loved ones.

Update: If the result with SORL1 is replicated it will join 14 other genes which have variations which are known risk factors for Alzheimer's.

Two geneticists at Massachusetts General Hospital, Lars Bertram and Rudolph Tanzi, have tried to bring order to this confused field by combining the data from many different studies. In an article in Nature Genetics earlier this month, they presented a group of 13 genes besides apolipoprotein E that have a statistically significant association with Alzheimer’s.

Dr. Tanzi said that he had run the numbers on SORL1 and that it would qualify at present for a place in his canon. “This is another gene worth paying attention to,” he added, “but we really have to wait for more replications.”

Over on the Gene Expression blog Amnestic points to evidence that APOE4 boosts episodic memory in the young at the expense of greater Alzheimer's risk in old age. APOE4 might be a variation worth having for someone being born now. The short term advantage might not cost you anything in the long run because 50 years from now Alzheimer's will be easily preventable.

We are going to find that many genetic variations which increase disease risks also provide benefits. The task of choosing ideal genetic variations for offspring will not be straightforward with a simple list of good genes and another list of bad genes. The best trade-offs will depend on guesses about the future availability of technologies, guesses about the shape of future societies, and one's values.

By Randall Parker 2007 January 14 03:26 PM  Brain Alzheimers Disease
Entry Permalink | Comments(0)
2007 January 10 Wednesday
MicroRNAs Elevated In Pancreatic Cancers

Pancreatic cancer cells have very high levels of some microRNAs,

COLUMBUS , Ohio – A pattern of micro molecules can distinguish pancreatic cancer from normal and benign pancreatic tissue, new research suggests.

The study examined human pancreatic tumor tissue and compared it to nearby normal tissue and control tissue for levels of microRNA (miRNA). It identified about 100 different miRNAs that are present usually at very high levels in the tumor tissue compared with their levels in normal pancreatic tissue.

Each of these miRNAs has a unique short sequence of RNA letters. RNA is like DNA except that it carries signals and instructions around inside cells. That so many kinds of miRNAs occur at higher levels in tumors is useful for the development of anti-cancer treatments.

These researchers envision anti-cancer drugs aimed at the miRNAs which occur at higher concentrations in cancers.

“Our findings show that a number of miRNAs are present at very different levels in pancreatic cancer compared with benign tissue from the same patient or with normal pancreatic tissue,” says principal investigator Thomas D. Schmittgen, associate professor of pharmacy and a researcher with the Ohio State's Comprehensive Cancer Center.

“Most are present at much higher levels, which suggests that developing drugs to inhibit them might offer a new way to treat pancreatic cancer. It also means that a test based on miRNA levels might help diagnose pancreatic cancer.”

I see a more sophisticated way to use this information to develop anti-cancer therapies: develop gene therapies that would go into the cells of organs which harbor cancer cells. The gene therapies would carry sequences that match the miRNA sequences involved in cancer. If enough cancer-associated miRNAs bind to their matching DNA segments in the therapeutic construct, that binding could activate the gene therapy in a cell and kill that cell.

To put it another way: Develop a gene therapy that acts like a computer program that activates in the presence of genetic signals that indicate cancer. This would allow far greater selectivity in killing of cancer cells without killing normal tissue.
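The "program" idea above can be sketched in a few lines. This is purely illustrative: the miRNA names, thresholds, and the two-of-three voting rule are made up, standing in for whatever molecular logic a real therapeutic construct would implement chemically.

```python
# Hypothetical cancer signature: miRNA -> fold-change over normal tissue
# required to count as a "cancer signal". Names and numbers are invented.
CANCER_SIGNATURE = {
    "miR-21": 8.0,
    "miR-155": 5.0,
    "miR-221": 4.0,
}

def should_activate(measured_fold_changes, required_matches=2):
    """Fire the kill switch only if at least `required_matches` signature
    miRNAs are elevated past their thresholds -- a stand-in for the
    AND/OR logic the blog post imagines a gene therapy computing."""
    hits = sum(
        1 for mir, threshold in CANCER_SIGNATURE.items()
        if measured_fold_changes.get(mir, 0.0) >= threshold
    )
    return hits >= required_matches

# Two hypothetical cells: one with a tumor-like miRNA profile, one normal.
tumor_cell = {"miR-21": 30.0, "miR-155": 12.0, "miR-221": 2.0}
normal_cell = {"miR-21": 1.1, "miR-155": 0.9, "miR-221": 1.0}
```

Requiring multiple independent signals is what would buy the selectivity: a normal cell with one stray elevated miRNA would not trip the switch.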

Out of about 470 miRNAs currently known these researchers measured the levels of 225 of them. Likely other miRNAs both known and yet to be discovered would help to improve the accuracy of this method of cancer identification.

For this study, the researchers used a technique developed by Schmittgen and a group of colleagues in 2004 to measure miRNA in small tissue samples. The method is based on a technology called real-time PCR profiling, which is highly sensitive and requires very small amounts of tissue, Schmittgen says.

The researchers used the method to compare the levels of 225 miRNAs in samples of pancreatic tumors from patients with adjacent normal tissue, normal pancreatic tissue and nine pancreatic cancer cell lines.

Computer analysis of the data identified a pattern of miRNAs that were present at increased or decreased levels in pancreatic tumor tissue compared with normal tissue. The analysis correctly identified 28 out of 28 pancreatic tumors, 11 of 15 adjacent benign tissues and six of six normal tissues.

Levels of some miRNAs were increased by more than 30- and 50-fold, with a few showing decreased levels of eight- to 15-fold.
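Fold changes like these come out of the standard comparative-Ct calculation for real-time PCR: a target's cycle threshold (Ct) is normalized to a reference RNA in each sample, and the difference between tumor and normal is converted to a fold change as 2^-ΔΔCt. The Ct values below are invented for illustration.

```python
def fold_change(ct_target_tumor, ct_ref_tumor, ct_target_normal, ct_ref_normal):
    """Fold change of a target miRNA in tumor vs. normal tissue,
    normalized to a reference RNA measured in the same samples
    (the 2^-ddCt method). Lower Ct means more starting RNA."""
    delta_tumor = ct_target_tumor - ct_ref_tumor
    delta_normal = ct_target_normal - ct_ref_normal
    delta_delta_ct = delta_tumor - delta_normal
    return 2 ** (-delta_delta_ct)

# A target that reaches threshold 5 cycles earlier in tumor tissue
# (after normalization) is elevated 2^5 = 32-fold.
elevated = fold_change(20.0, 18.0, 25.0, 18.0)

# The mirror case, 5 cycles later, is a 32-fold decrease.
reduced = fold_change(25.0, 18.0, 20.0, 18.0)
```

The exponential in the formula is why a modest-looking difference of a few PCR cycles corresponds to the 30- to 50-fold differences reported above.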

We need methods to deliver gene therapy into all the cells in an organ. With such a capability we could periodically get gene therapy that'd wipe out precancerous cells before they even become full blown cancers.

By Randall Parker 2007 January 10 11:12 PM  Biotech Cancer
Entry Permalink | Comments(0)
2007 January 09 Tuesday
Detroit Automakers Want Half Billion For Battery Research

Do the Detroit automakers want to move toward electric vehicles? GM just introduced their Volt electric car prototype which would go 40 miles on a wall socket charge and further on a 3 cylinder engine recharge. Now the US automakers want the US government to spend $100 million per year to speed up the development of batteries that can work better in cars and trucks.

General Motors Corp., Ford Motor Co. and DaimlerChrysler AG have asked the U.S. government for $500 million over five years to subsidize research into advanced batteries for cars and trucks.

The automakers made the request last month after meeting with President George W. Bush in the White House in November, said Stephen Zimmer, an advanced engineering director at DaimlerChrysler's Chrysler unit.

The Detroit makers are saddled with an expensive union and a high exchange rate for the dollar as a result of manipulations by East Asian governments. Therefore they are losing big money and aren't in a position to do much research.

Current US federal government funding of battery research is very low.

Since 1991, the U.S. government has subsidized battery research at the rate of about $25 million a year.

$25 million a year is chump change. Even $100 million per year isn't much. The article reports a claim by a spokesman for GM that Japan and other East Asian countries are spending a few hundred million dollars to subsidize the development of battery technologies in order to give their automakers a competitive advantage.

Given the current $3 billion per week burn rate for US forces in Iraq (which understates total costs since deaths and disabilities will cost us far into the future) the $100 million per year proposed above would pay for 6 hours of the US expenditures in Iraq. 6 hours. We could defund Muslim fundamentalists if we developed cleaner and cheaper replacements for oil and we'd raise our living standards in the process.
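The arithmetic behind that comparison is simple enough to check directly:

```python
# How many hours of Iraq spending would fund a year of the proposed
# battery research budget?
weekly_burn = 3_000_000_000           # dollars per week spent in Iraq
hourly_burn = weekly_burn / (7 * 24)  # about $17.9 million per hour
battery_budget = 100_000_000          # proposed annual research budget

hours = battery_budget / hourly_burn  # 5.6 hours, i.e. roughly 6
```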

Fossil fuels usage brings big external costs in the form of pollution. We are better off accelerating the development of technologies that'll reduce and eventually eliminate the need for fossil fuels.

By Randall Parker 2007 January 09 08:02 PM  Energy Policy
Entry Permalink | Comments(32)
2007 January 08 Monday
Method Found To Route MicroRNA Into Nucleus

6 letter sequences of RNA act like zip codes to control the routing of slightly longer sequences of RNA within cells.

MicroRNAs, already implicated in cancer and normal development, latch on to and gum up larger strands of RNA that carry instructions for making the proteins that do all the cell’s work. They are, says Joshua Mendell, M.D., Ph.D., an assistant professor in the McKusick-Nathans Institute of Genetic Medicine at Hopkins, like “molecular rheostats that fine-tune how much protein is being made from each gene.”

That’s why normally microRNAs always have appeared to stick close to the cell’s protein-making machinery.

But during a survey of more than 200 of the 500 known microRNAs found in human cells, Mendell’s team discovered one lone microRNA “miles away” (in cellular terms) from all the others.

A weird unexpected result led to the discovery of how short sequences of RNA act like a sort of zip code for routing longer pieces of RNA.

Mendell's team discovered that 6 RNA nucleotide letters on the end of a microRNA control where in a cell the microRNA gets routed. By changing which 6 letters were on the end of a microRNA they could make it go to the nucleus rather than to the areas where most microRNAs are found.

Consisting of only 20 to 25 nucleotide building blocks (compared to other types of RNA that can be thousands of nucleotides long), each microRNA has a different combination of blocks. Mendell’s team realized that six building blocks at the end of the wayward miR-29b microRNA were noticeably different from the ends of other microRNAs.

Suspicious that the six-block end might have something to do with miR-29b’s location, the researchers chopped them off and stuck them on the end of another microRNA. When put into cells, the new microRNA behaved just like miR-29b, wandering far away from the cell’s protein-making machinery and into the nucleus, where the cell’s genetic material is kept.
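The swap experiment above can be sketched as a toy routing rule: look at the last six letters and decide the destination. The specific hexamer used here is a placeholder, not the actual miR-29b sequence element.

```python
# Hypothetical 6-letter "zip code" that routes an RNA to the nucleus.
NUCLEAR_ZIP_CODE = "AGUGUU"

def destination(rna_sequence):
    """Route an RNA by its terminal hexanucleotide, mimicking the
    behavior the graft experiment suggests the cell implements."""
    if rna_sequence.endswith(NUCLEAR_ZIP_CODE):
        return "nucleus"
    return "cytoplasm"  # near the protein-making machinery

# Grafting the element onto the end of another small RNA redirects it,
# just as chopping the miR-29b end onto another microRNA did.
small_rna = "UGAGGUAGUAGGUUGUAUAGUU"          # an arbitrary 22-letter RNA
redirected = small_rna[:-6] + NUCLEAR_ZIP_CODE
```

The point of the model is that the routing decision depends only on the terminal element, not on the rest of the sequence, which is why the element is portable between microRNAs and siRNAs.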

Their ability to apply the same finding to a different kind of RNA opens up the possibility of using the cell's RNA routing method to suppress genes for medical purposes.

The researchers then stuck the same six-block end onto another type of small RNA, a small-interfering RNA or siRNA that turns off genes. This also forced the siRNA into the nucleus.

According to Mendell, these results demonstrate for the first time that despite their tiny size, microRNAs contain elements consisting of short stretches of nucleotide building blocks that can control their behavior in a cell. Mendell hopes to take advantage of the built-in “cellular zip code” discovered in miR-29b as an experimental tool. For example, he plans to force other microRNAs and siRNAs into the nucleus to turn off specific sets of genes.

Since some genes suppress other genes, the ability to suppress some genes could also be used to turn up the activity of other genes.

Think of this discovery as equivalent to a computer programming hack. The ability to, for example, send small-interfering RNA (siRNA) into the nucleus means the ability to turn off specific genes. There's still the additional problem of how to get siRNA into cells in the first place. We need better gene therapy delivery mechanisms to do that. But once we get those mechanisms we'll be able to turn off specific genes in the nucleus. That could be used to suppress cancer or flip the state of genes for all sorts of other medical reasons.

By Randall Parker 2007 January 08 11:16 PM  Biotech Gene Hacking
Entry Permalink | Comments(0)
Chip Measures Protein Binding Energies In Parallel

To accelerate the pace of biological research we need automation and miniaturization to drive down costs. The development of miniature silicon devices that can measure biological systems with a high degree of parallelism is going to drive down costs by orders of magnitude, just as happened in the computer industry. The trend toward labs on a chip continues to accelerate. In a recent example of this trend, Stanford microfluidics researcher Stephen Quake and collaborator Sebastian Maerkl have developed a chip that can measure the affinity of transcription factor proteins (which regulate gene expression) for sections of DNA, making simultaneous measurements of 2,400 protein-DNA pairs.

To understand complex biological systems and predict their behavior under particular circumstances, it is essential to characterize molecular interactions in a quantitative way, Quake said. Binding energy, the energy with which one protein binds to another or to DNA, is one important quantitative measurement researchers would like to know. But these interactions are highly transient and often involve extremely low binding affinities, so they are difficult to measure on a large scale. To overcome this hurdle, Quake and Maerkl set out to develop a microlaboratory that could trap a type of protein known as a transcription factor. Once the transcription factor was trapped, the scientists hoped to measure the binding energy of the transcription factor bound to specific DNA sequences.

But simply measuring the binding energy between a transcription factor and a single DNA sequence is not enough, Quake said. He said it would be more meaningful to know the energy involved in a transcription factor binding to many different DNA sequences. This would give researchers a more complete picture of the “DNA binding energy landscape” of each transcription factor.

To determine the binding energy landscape, Quake and Maerkl's microlaboratory needed to conduct thousands of binding-energy experiments at once. The apparatus they created, which they called mechanically induced trapping of molecular interactions (MITOMI), consists of 2,400 individual reaction chambers, each controlled by two valves and including a button membrane. Each of the chambers is less than a nanoliter in volume. That's one-billionth of a liter—enough to hold a snippet of human hair only as long as the hair's diameter. The MITOMI apparatus fits over a 2,400-unit DNA microarray, or gene chip, onto which the researchers can dab minute amounts of DNA sequences. Each spot of DNA is then enclosed in its own reaction chamber.
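The quantity each chamber ultimately yields connects to thermodynamics through a standard relation: the binding free energy of a protein-DNA pair follows from its measured dissociation constant as ΔG = RT ln(Kd). The Kd values below are illustrative, not measurements from the MITOMI paper.

```python
import math

R = 8.314    # gas constant, J/(mol*K)
T = 298.15   # room temperature, K

def binding_free_energy(kd_molar):
    """Binding free energy in kJ/mol for a given dissociation constant
    (in mol/L); more negative means tighter binding."""
    return R * T * math.log(kd_molar) / 1000.0

# A tight transcription-factor site (Kd ~ 1 nM) vs. a weak one (Kd ~ 1 uM):
tight = binding_free_energy(1e-9)
weak = binding_free_energy(1e-6)
```

Because the relation is logarithmic, a thousandfold difference in affinity shifts the free energy by only about 17 kJ/mol, which is part of why these measurements demand such sensitive instruments.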

Quake wants to use this approach to map all the protein-protein binding energies of a single organism. The ability to use semiconductor industry manufacturing processes to cheaply mass produce silicon chips will make this possible.

The ability to conduct many measurements cheaply and in parallel will eventually enable the use of simulations to carry out much biological research. The measurements of biological phenomena made by silicon chips will serve as useful data to feed into computer simulations.

According to Quake, MITOMI brings scientists closer to an important goal. “To test theories of systems biology, we should now be able to predict biology without making any measurements on the organism itself,” he said.

Technologies from the computer industry are accelerating the rate of advance of biomedical science. This trend is why I expect the defeat of almost all diseases in the lifetimes of some people who are already alive. Technologies to achieve full body rejuvenation will even stop the process of aging.

By Randall Parker 2007 January 08 06:26 PM  Biotech Advance Rates
Entry Permalink | Comments(0)
Benefit Of Tea Cancelled By Milk

The English should stop adding milk to their tea.

Research published on-line (Tuesday 9 January) in European Heart Journal[1] has found that the protective effect that tea has on the cardiovascular system is totally wiped out by adding milk.

Tests on volunteers showed that black tea significantly improves the ability of the arteries to relax and expand, but adding milk completely blunts the effect. Supporting tests on rat aortas (aortic rings) and endothelial (lining) cells showed that tea relaxed the aortic rings by producing nitric oxide, which promotes dilation of blood vessels. But, again, adding milk blocked the effect.

The findings, by cardiologists and scientists from the Charité Hospital, Universitätsmedizin-Berlin, Campus Mitte, Germany, are bad news for tea-drinking nations like the British, who normally add milk to their beverage. The results have led the researchers to suggest that tea drinkers who customarily add milk should consider omitting it some of the time.

Proteins in milk probably bind to the catechins and render them unavailable.

Their study showed that the culprit in milk is a group of proteins called caseins, which they found interacted with the tea to decrease the concentration of catechins in the beverage. Catechins are the flavonoids in tea that mainly contribute to its protection against cardiovascular disease.

Senior researcher Dr Verena Stangl, Professor of Cardiology (Molecular Atherosclerosis) at the hospital, said: "There is a broad body of evidence from experimental and clinical studies indicating that tea exerts antioxidative, anti-inflammatory and vasodilating effects, thereby protecting against cardiovascular diseases. As worldwide tea consumption is second only to that of water, its beneficial effects represent an important public health issue. But, up to now, it's not been known whether adding milk to tea, as widely practised in the UK and some other countries, influences these protective properties. So, we decided to investigate the effects of tea, with and without milk, on endothelial function, because that is a sensitive indicator of what is happening to blood vessels."

In East Asian countries where green tea is popular the use of milk in tea is relatively rare. So they are deriving a greater benefit from tea drinking.

High resolution ultrasound allowed measure of the effects of tea and milk on an artery.

Sixteen healthy postmenopausal women drank either half a litre of freshly brewed black tea, black tea with 10% skimmed milk, or boiled water (as a control) on three separate occasions under the same conditions. The endothelial function of the brachial artery in the forearm was measured by high resolution ultrasound before and two hours after drinking, with measurements being taken every 15 seconds for up to two minutes a time.

Said first author Dr Mario Lorenz, a molecular biologist: "We found that, whereas drinking tea significantly increased the ability of the artery to relax and expand to accommodate increased blood flow compared with drinking water, the addition of milk completely prevents the biological effect. To extend our findings to a functional model, we determined vasodilation in rat aortic rings by exposing them to tea on its own and tea with individual milk proteins added, and got the same result."

Milk contains a number of different proteins: by testing each one separately, the researchers found that it was the three caseins that accounted for the inhibiting effect, probably by forming complexes with tea catechins.

Casein proteins make up a substantial portion of cheese. This suggests that red wine (which has a healthy amount of catechins) and cheese is a bad combination. Chocolate milk similarly blunts the effects of the catechins in the chocolate. Drink your wine with nuts instead. Nuts have their own beneficial compounds, including another type of flavonoid called anthocyanins (tea has some too). As for chocolate milk: why dilute something as great as chocolate with mere milk?

Casein proteins show up as food additives in a variety of foods and you can watch for the terms caseinate, casein, milk solids, milk protein, and curds as indicators of their presence.

By Randall Parker 2007 January 08 06:26 PM  Aging Diet Studies
Entry Permalink | Comments(9)
2007 January 07 Sunday
A Scientific Middle Emerges On Global Warming

The New York Times reports that as climate catastrophists have painted the threat from global warming in increasingly extreme terms, a more scientific middle-ground school of thought has developed in reaction to the catastrophists.

They agree that accumulating carbon dioxide and other heat-trapping smokestack and tailpipe gases probably pose a momentous environmental challenge, but say the appropriate response is more akin to buying fire insurance and installing sprinklers and new wiring in an old, irreplaceable house (the home planet) than to fighting a fire already raging.

“Climate change presents a very real risk,” said Carl Wunsch, a climate and oceans expert at the Massachusetts Institute of Technology. “It seems worth a very large premium to insure ourselves against the most catastrophic scenarios. Denying the risk seems utterly stupid. Claiming we can calculate the probabilities with any degree of skill seems equally stupid.”

I agree with this insurance premium argument. We can't prove either disaster or minimal impact. The amount of atmospheric carbon dioxide (CO2) increase is large and, with rapidly industrializing China set to surpass the United States in CO2 emissions by 2009, the massive change in atmospheric CO2 content is accelerating. It seems unwise to me to take a passive stance in response to this threat. I've been arguing that it is simply imprudent to do nothing.

My own argument for what to do: A massive research effort to develop cheaper non-fossil fuel energy sources. This approach holds several advantages, not least of which is that it will accelerate rather than decelerate economic growth in the medium to long term.

Prometheus web log author Roger A. Pielke Jr. (who writes great stuff btw) says this middle group are not so much climate skeptics as they are heretics on what to do about it.

There are enough experts holding such views that Roger A. Pielke Jr., a political scientist and blogger at the University of Colorado, Boulder, came up with a name for them (and himself): “nonskeptical heretics.”

“A lot of people have independently come to the same sort of conclusion,” Dr. Pielke said. “We do have a problem, we do need to act, but what actions are practical and pragmatic?”

One practical and pragmatic thing to do is to decrease methane emissions. Methane is a more potent greenhouse gas and the costs of lowering methane emissions are lower than the costs of lowering CO2 emissions.

Mike Hulme, Professor of Environmental Sciences at the University of East Anglia and Director of the Tyndall Centre for Climate Change Research, says that as environmental political activists have adopted a more catastrophist approach he finds himself getting criticised by catastrophist believers as well as by warming skeptics.

It seems that mere "climate change" was not going to be bad enough, and so now it must be "catastrophic" to be worthy of attention.

The increasing use of this pejorative term - and its bedfellow qualifiers "chaotic", "irreversible", "rapid" - has altered the public discourse around climate change.

This discourse is now characterised by phrases such as "climate change is worse than we thought", that we are approaching "irreversible tipping in the Earth's climate", and that we are "at the point of no return".

I have found myself increasingly chastised by climate change campaigners when my public statements and lectures on climate change have not satisfied their thirst for environmental drama and exaggerated rhetoric.

It seems that it is we, the professional climate scientists, who are now the (catastrophe) sceptics. How the wheel turns.

Catastrophe movies are exciting and the prospects of catastrophe in real life even more exciting. Plus, lots of people want to think they are fighting in a moral crusade for a great good against evil and ignorance. They think they need to paint an extremely disastrous picture of the future in order to motivate people. So the prospect of global warming has a lot to offer. Plus, Mother Gaia is morally superior to us human enviro-sinners. Never mind that we are the products or creations of Mother Gaia. We fell out of Eden somehow or other when Mother Gaia's natural selection made us too intelligent.

The biggest problem with the catastrophe scenarios is that they involve projections of trends that will not continue even if governments around the world do little to alter current trends. While fossil fuel consumption will likely rise for a decade or two, the march of technology looks set to obsolesce fossil fuels even without government intervention. Nuclear, photovoltaics, batteries, and wind will all get cheaper and eventually their costs will fall below the costs of fossil fuels.

But we have several quite compelling reasons to take steps to bring the fossil fuel era to an earlier end. For example fossil fuel usage produces conventional pollutants such as particulates, mercury, and oxides of sulfur and nitrogen. Why expose ourselves to pollutants? Why let the neurotoxin mercury accumulate in the food chains for fish? Why breathe carcinogens and stuff that makes our eyes sting and our throats hurt?

The deep skeptic school on global warming is also making an economic mistake. They correctly point out that restraints on CO2 emissions will raise the price of energy and therefore slow economic growth and lower living standards. But when they fail to push instead for a huge acceleration of nuclear, solar, wind, and other non-fossil fuel technology development they miss the opportunity to help create technologies that lower energy costs and clean up environments at the same time.

Additional advantages of a big R&D push come from the effects on trade. First off, the need for expensive energy imports will vanish along with the need for oil. US trade deficits will get smaller. Also, my grandmother always used to say "Idle hands are the devil's workshop". That certainly holds true for Wahhabi Middle Eastern oil emirates where oil allows plenty of Muslims to live the life of Riley (or his Arab equivalent). Fewer people would become terrorists if more had to get up each day and go to work. The cut-off of money for Saudi Arabia would also cut off money now used to spread Wahhabi Islam around the globe.

We have the real potential for large technological steps forward (e.g. nanodevice replicators to produce incredibly cheap photovoltaics) to make oil obsolete and to make clean energy sources very cheap. The late Richard Smalley, nanomaterials researcher and 1996 Nobel Prize winner for his work on fullerenes ("buckyballs"), argued for a $10 billion a year research effort on a wide range of energy technologies and believed such an effort would lead to big breakthroughs in cost and cleanliness of energy technologies. This effort is worth doing even if the global warming skeptics are correct. Lower energy costs, reduced flow of money to the Middle East, and a cleaner environment each by themselves will pay back the money spent.

If we fail to do anything to accelerate the development of new energy technologies and the projections of the global warming catastrophists turn out correct then we will not suffer ruin by any means. The nanotechnology advances we gain in the next 50 years will make movement of equipment into space much cheaper. Nanotech beanstalk space elevators will lower costs by orders of magnitude. We'll be able to move up massive amounts of materials to build big deflector satellites to cool off the poles to stop and reverse ice melts. Climate engineering technologies will save us from catastrophe. But we can become wealthier and healthier sooner by accelerating the development of clean and cheap energy technologies.

By Randall Parker 2007 January 07 03:34 PM  Climate Policy
Entry Permalink | Comments(33)
2007 January 03 Wednesday
Brain Gray Matter Holes Caused By Silent Strokes?

The brain is going to be the hardest organ in the body to rejuvenate. But prevention of damage is more important than repair because loss of brain cells means loss of information. Replacement of brain cells won't give you back lost memories. One way we can prevent brain cell loss is to prevent blockages in small arteries in the brain.

A team of UC San Diego physicists and neuroscientists has discovered a bottleneck in the network of blood vessels in the brain that makes it vulnerable to strokes. The finding may explain the origin of the puzzling damage to the brain’s gray matter often detected in brain scans, especially among the elderly.

In the study, published this week in the journal Proceedings of the National Academy of Sciences, the researchers used a laser technique they developed to precisely monitor changes in blood flow resulting from an induced blockage in a tiny artery, or arteriole, in the brains of anesthetized rats. They found that the penetrating arterioles, which connect the blood vessels on the brain’s surface with deeper blood vessels, are a vulnerable link in the network.

“The blood vessels on the surface of the brain are like a collection of city streets that provide multiple paths to get somewhere,” explained David Kleinfeld, a professor of physics at UCSD, who led the team. “If one of the vessels is blocked, blood flow quickly rearranges itself. On the other hand, the penetrating arterioles are more like freeways. When blocked, the blood flow is stopped or slowed significantly in a large region around the clot.”
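The streets-versus-freeways point is really a claim about network redundancy, which a toy graph model makes concrete. The vessel topology below is invented for illustration: a loop of surface vessels (redundant paths) feeding a single penetrating arteriole (a lone bridge to the deep tissue).

```python
from collections import deque

def reachable(edges, start, goal, blocked=frozenset()):
    """Breadth-first search from start to goal over an undirected
    vessel network, skipping any edge listed in `blocked` (a clot)."""
    graph = {}
    for a, b in edges:
        if (a, b) in blocked or (b, a) in blocked:
            continue  # this vessel is clotted; drop it from the network
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            return True
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# A loop of surface vessels (s1-s2-s3) plus one penetrating arteriole.
vessels = [("heart", "s1"), ("s1", "s2"), ("s2", "s3"), ("s3", "s1"),
           ("s2", "deep")]

# Block a surface vessel: flow reroutes around the loop and still
# reaches the deep tissue. Block the arteriole: nothing gets through.
surface_block = reachable(vessels, "heart", "deep", blocked={("s1", "s2")})
arteriole_block = reachable(vessels, "heart", "deep", blocked={("s2", "deep")})
```

In graph terms, the penetrating arteriole is a bridge edge: its removal disconnects the deep tissue, while removing any single surface edge leaves the loop connected.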

Many more people have had strokes than are aware of it. So lots of old people suffer from reduced cognitive ability and loss of memory due to silent strokes.

The obstruction of blood flow resulted in damage to the surrounding brain area, which the researchers report resembled damage seen in the brains of humans and thought to be the result of “silent strokes.” Silent strokes have attracted attention recently because magnetic resonance imaging has made it possible to follow changes in the brains of individuals as they age. MRI scans have revealed that, over time, small holes accumulate in the gray matter of many patients, including those who have no obvious behavioral signs of a stroke.

I do not want to get lots of holes in my brain gray matter as the years go by.

The researchers say their results support the hypothesis, made by clinicians, that the penetrating arterioles may be the location of small strokes that cause the death of sections of brain tissue in humans. The accumulation of damage may lead to memory loss, and may be a risk factor for having a larger stroke, according to Pat Lyden, a professor of neurosciences at UCSD’s School of Medicine and head of the UCSD Stroke Center.

Development of youthful artery stem cells that can replace aged stem cells could help repair brain arteries and by doing so avoid silent strokes that wipe out pockets of neurons in the brain. Genetic engineering of the liver to improve blood cholesterol could also reduce the risk of brain damage from stroke and poor circulation.

We need faster progress on stem cell research. The survival of our brain cells is at stake.

By Randall Parker 2007 January 03 10:53 PM  Brain Aging
Entry Permalink | Comments(5)
Methamphetamine Increases Stroke Risk

Methamphetamine and cocaine probably are toxic to artery cells.

ST. PAUL, Minn – Methamphetamine use may be associated with increased risks of major neck artery tears and stroke, according to an article published in the December 26, 2006, issue of Neurology, the scientific journal of the American Academy of Neurology.

“It appears methamphetamine use is toxic to large blood vessels,” said the study’s senior author Wengui Yu, MD, PhD, with the University of California, Irvine Medical Center and a member of the American Academy of Neurology.

The article reviewed the cases of two women, ages 36 and 29, who had sudden onset of speech difficulty and weakness following recent use of methamphetamine.

Brain scans showed both women had severe strokes from carotid artery dissection, which is a tear in the inner lining of one of the major arteries in the neck. On the National Institutes of Health Stroke Scale, the 36-year-old woman received a score of 21 and was treated with tissue plasminogen activator. The 29-year-old woman, who required a stent to treat the blockage in her common carotid artery, received a score of 17. Stroke Scale scores over 16 predict a high probability of death or severe disability.

Toxic drugs that cause cell death accelerate the aging process. Even if abusers stop taking meth or coke, they've already forced their stem cells to do a lot more dividing to repair the damage they were constantly causing while still using. All that extra cell division makes the stem cells wear out more quickly. So as former users get older they'll probably develop circulatory problems sooner than they otherwise would have.

Meth and coke users probably also get silent strokes due to damage to smaller blood vessels that affect smaller regions in the brain. Brain cell death amounts to the loss of part of your identity. Even once we gain the ability to grow replacement brain cells from youthful stem cells, that won't bring back the memories lost to stroke-induced cell death.

By Randall Parker 2007 January 03 10:20 PM  Brain Addiction
Entry Permalink | Comments(0)
Herpes Virus And ApoE Gene Cause Alzheimer's Risk

The ApoE-4 version of the ApoE gene, which is associated with a higher risk of Alzheimer's Disease, probably increases that risk by doing a poorer job of suppressing the herpes virus that causes cold sores.

A gene known to be a major risk factor for Alzheimer's disease puts out the welcome mat for the virus that causes cold sores, allowing the virus to be more active in the brain compared to other forms of the gene. The new findings, published online in the journal Neurobiology of Aging, add some scientific heft to the idea, long suspected by some scientists, that herpes somehow plays a role in bringing about Alzheimer's disease.

The work links a form of the ApoE gene known as ApoE-4, which after advanced age is the leading known risk factor for getting Alzheimer's disease, with the form of herpes – herpes simplex 1 or HSV – that infects more than 80 percent of Americans and causes cold sores around the mouth. The findings from a group at the University of Rochester Medical Center show that the particular form of the gene that puts people at risk also creates a fertile environment for herpes in the brain, allowing the virus to be more active than other forms of the ApoE gene permit.

We need vaccines that will prevent Herpes virus infections. We also need drugs or perhaps gene therapies that'll suppress or kill Herpes in the brain and peripheral nerves.

Scientists have known for more than 15 years that the ApoE-4 gene is a player in Alzheimer's disease, but the idea that it works in concert with the herpes virus is new.

Note how we've known about the ApoE-4 link to Alzheimer's for 15 years without being able to do anything about it. That's true of many other genetic variations with known roles in causing diseases. We lack the gene therapy technologies to intervene. Though the knowledge that specific genes play roles in the development of diseases does allow many scientists to focus their attention on how those genes operate and how they may help cause diseases.

Different lines of evidence point toward an ApoE-4 plus herpes connection with Alzheimer's.

Ruth Itzhaki of the University of Manchester has led the way with several studies showing a correlation between herpes and Alzheimer's. She has shown that Alzheimer's patients who have the ApoE-4 form of the gene have more herpes DNA in the brain regions that are affected by Alzheimer's, compared to Alzheimer's patients who also have herpes but who have a different form of the ApoE gene. And she has shown that people with the ApoE-4 version of the gene who are infected with herpes are more likely to get Alzheimer's disease than people infected with herpes who have a different form of the ApoE gene, or than people who have the ApoE-4 gene but who don't have herpes.

Other scientists have found that a herpes infection is active more often – causing the tell-tale cold sores around the mouth – in the 25 percent of people who have a copy of the ApoE-4 gene. In other words, people who are frequently troubled by cold sores are more likely to have the gene that makes them more vulnerable to Alzheimer's disease.

Every time you get a cold sore do you suffer mild brain damage? Seems plausible at least.

ApoE-4 does not increase the odds of infection but it does increase the amount of time the virus is active.

The team found that the virus infiltrates brain cells about the same no matter which gene is involved. But they found that the subsequent activity level of the virus generally mirrored the disease-causing potential of the gene. They found that in animals with the ApoE-4 gene, the virus is less likely to be in the quiet, latent stage of its life cycle, suggesting it has more of an opportunity to replicate. In animals with the ApoE-2 gene, the virus was less active.

Brain aging is the form of aging I most want to slow down and delay. How to rejuvenate the 100 billion neurons in the brain is the hardest task facing rejuvenation medicine. We'll be able to reverse the aging of the rest of the body before we develop the ability to make the brain youthful once again. Therefore any treatments we can come up with to slow brain aging will provide great benefits and give us more time to develop brain rejuvenation therapies.

By Randall Parker 2007 January 03 09:59 PM  Brain Alzheimers Disease
Entry Permalink | Comments(1)
2007 January 02 Tuesday
Greg Cochran Says Replicators Will Solve Energy Problems

Physicist and genetic theorist Gregory Cochran thinks within decades we will have more energy than we can imagine and incredible wealth too.

Hardly anyone seems to realize it, but we're on the threshold of an era of unbelievable abundance. Within a generation—sooner if we want it enough—we will be able to make a self-replicating machine, first seriously suggested by John von Neumann.

Such a machine would absorb energy through solar cells, eat rock and use the energy and minerals to make copies of itself. Numbers would grow geometrically, and if we manage to design one with a reasonably short replication time—say six months—we could have trillions working for humanity in another generation.
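As a rough sanity check on the quoted growth claim, a short calculation shows how a six-month replication time yields trillions of machines within a generation. The assumptions here are mine, not Cochran's: each machine makes one copy of itself per cycle, so the population doubles every six months, and "another generation" is taken as roughly 20 years.

```python
# Back-of-the-envelope check of the geometric growth claim above.
# Assumption (mine): each machine copies itself once per cycle,
# so the population doubles each replication cycle.

def replicator_population(cycles, start=1):
    """Population after a given number of doubling cycles."""
    return start * 2 ** cycles

# A six-month replication time means 2 cycles per year.
years = 20  # roughly "another generation"
cycles = years * 2
print(f"After {years} years: {replicator_population(cycles):.2e} machines")
# About 1.1 trillion, consistent with "trillions working for humanity"
```

Forty doublings from a single machine gives 2^40, just over a trillion, so the quoted claim only requires that replication not be interrupted.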

Regarding getting technological advances sooner: the failure to more vigorously pursue big pay-off breakthroughs has got to be the single largest opportunity cost we inflict on ourselves. We've somehow managed to let the United States get into a position where it will waste about $150 billion in Iraq this fiscal year. Yet the total budget for the US National Science Foundation, covering a wide range of research efforts across many areas of science, is about $6 billion. Granted, some other research agencies get much larger chunks of money than the NSF. But ridiculous fiascos get far more.

The vast bulk of the wealth gained from advances in knowledge goes to people other than those who generate the knowledge. So marketplaces really under-reward those who push back the edges of scientific understanding. We therefore do not get as much net benefit from science as we could. We underfund science.

Self-replicating devices (which better have great built-in controls for stopping replication) can make solar photovoltaics cheap.

Right now the human race uses about 13 trillion watts: the solar cells required to produce that much power would take up less than a fifth of one percent of the Earth's land surface—remember that the Earth intercepts more solar energy in an hour than the human race uses in a year. That's a lot of solar cell acreage, but it's affordable as long as they make themselves. We could put them in deserts—in fact, they'd all fit inside the Rub' al Khali, the Empty Quarter of Saudi Arabia. As I understand it, we like depending on the Saudis for energy.
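The land-area figure in that passage is easy to check. Using assumed numbers of my own (a 24-hour average desert insolation of about 250 W/m² and 20% efficient cells, neither of which comes from the article), the required area comes out under a fifth of one percent of Earth's land surface, matching the claim.

```python
# Sanity check of the land-area claim above.
# Assumptions (mine, not from the article): ~250 W/m^2 round-the-clock
# average desert insolation, 20% cell efficiency.

HUMAN_POWER_USE = 13e12        # watts, from the quoted article
EARTH_LAND_AREA = 1.49e14      # m^2, about 149 million km^2
AVG_DESERT_INSOLATION = 250.0  # W/m^2, averaged over day and night
CELL_EFFICIENCY = 0.20

def land_fraction_needed(power, insolation, efficiency):
    """Fraction of Earth's land needed to supply `power` continuously."""
    area = power / (insolation * efficiency)  # m^2 of cells required
    return area / EARTH_LAND_AREA

frac = land_fraction_needed(HUMAN_POWER_USE, AVG_DESERT_INSOLATION, CELL_EFFICIENCY)
print(f"Fraction of land surface needed: {frac:.2%}")  # about 0.17%
```

With those assumptions the answer is roughly 0.17%, so "less than a fifth of one percent" holds up.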

But there are better ways. Solar energy works better in space—sure, the weather is better, but also consider that the vast majority of the Sun's energy misses the Earth. In fact only about one part in two billion warms this planet. Space-based self-replicating systems could harvest some of that lost sunlight—enough to make possible a lot of energy-expensive projects that are currently impractical.
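The "one part in two billion" figure follows from simple geometry: Earth intercepts sunlight over its cross-sectional disk, while the Sun radiates over a full sphere of radius equal to the Earth-Sun distance. A quick check (using standard values for Earth's radius and the astronomical unit, which are my inputs rather than the article's):

```python
import math

# Check of the "one part in two billion" figure above.
# Earth's intercepted fraction = (disk area) / (full sphere at 1 AU).

EARTH_RADIUS = 6.371e6     # m
EARTH_SUN_DIST = 1.496e11  # m (1 astronomical unit)

def fraction_intercepted():
    """Fraction of the Sun's output that strikes Earth."""
    disk = math.pi * EARTH_RADIUS ** 2
    sphere = 4 * math.pi * EARTH_SUN_DIST ** 2
    return disk / sphere

f = fraction_intercepted()
print(f"Earth intercepts 1 part in {1 / f:.2e} of the Sun's output")
# Roughly 1 part in 2.2 billion
```

The result is about one part in 2.2 billion, consistent with the quoted figure, which makes vivid how much sunlight space-based collectors could in principle harvest.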

Greg also expects the incredibly greater ability to harness energy and matter to make interstellar probes possible within the lifetimes of some people who are still alive.

If Greg is correct then projections that we'll be burning huge amounts of fossil fuels in the latter part of the 21st century are entirely wrong. We could bring the age of fossil fuels to an end much sooner by pushing much harder to accelerate the rate of advance of nanotechnology. That'd pay orders of magnitude greater dividends than attempts to make everyone think warm fuzzy thoughts about hybrids and hostile thoughts about SUVs.

By Randall Parker 2007 January 02 11:08 PM  Nanotech Wealth
Entry Permalink | Comments(29)
Olive Oil Lowers Oxidative Stress Marker

The ability of olive oil to lower a marker for oxidative damage of DNA in cells suggests that olive oil might lower the risk of cancer.

If you want to avoid developing cancer, then you might want to add eating more olive oil to your list of New Year's resolutions. In a study to be published in the January 2007 issue of The FASEB Journal, scientists from five European countries describe how the anti-cancer effects of olive oil may account for the significant difference in cancer rates among Northern and Southern Europeans.

The authors drew this conclusion based on the outcomes of volunteers from Denmark, Finland, Germany, Italy, and Spain, who consumed 25 milliliters (a little less than a quarter cup) of olive oil every day for three weeks. During this time, the researchers examined urine samples of the subjects for specific compounds known to be waste by-products of oxidative damage to cells, a precursor to cancer. At the beginning of the trial, the presence of these waste by-products was much higher in Northern European subjects than their Southern European counterparts. By the end of three weeks, however, the presence of this compound in Northern European subjects was substantially reduced.

"Determining the health benefits of any particular food is challenging because it involves relatively large numbers of people over significant periods of time," said lead investigator Henrik E. Poulsen, M.D. of Rigshospitalet, Denmark. "In our study, we overcame these challenges by measuring how olive oil affected the oxidation of our genes, which is closely linked to development of disease. This approach allows us to determine if olive oil or any other food makes a difference. Our findings must be confirmed, but every piece of evidence so far points to olive oil being a healthy food. By the way, it also tastes great."

I'd like to see more dietary studies using oxidative stress markers as a quicker way to guess at the likely long term effects of various food choices.

Surprisingly, the polyphenols in olive oil do not look like the cause of the lowered oxidative stress marker.

Another interesting finding in the study suggests that researchers are just beginning to unlock the mysteries of this ancient "health food." Specifically, the researchers found evidence that the phenols in olive oil are not the only compounds that reduced oxidative damage. Phenols are known antioxidant compounds that are present in a wide range of everyday foods, such as dark chocolate, red wine, tea, fruits, and vegetables. Despite reducing the level of phenols in the olive oil, the study's subjects still showed that they were receiving the same level of health benefits.

I'd like to see studies done using different high-phenol foods that are low in fat to see whether any of them can lower oxidative stress using the same marker (8oxodG, an oxidized form of the DNA base guanine) that these researchers used.

The researchers measured the compound 8oxodG in the urine as an indicator of oxidative stress and damage and found olive oil lowered 8oxodG.

Oxidative damage is a process whereby the metabolic balance of a cell is disrupted by exposure to substances that result in the accumulation of free-radicals, which can then damage the cell.

The men were found to have around 13% less 8oxodG compared with their levels at the beginning of the study.

At the beginning of the study, men from northern Europe had higher levels of 8oxodG than those from southern Europe, supporting the idea that olive oil had a reductive effect.

I've started eating more olives and olive oil. The olive oil is displacing canola oil. But we need a comparative study of the effects of olive oil and canola oil on urine 8oxodG. Ditto for fish oils.

The bigger story on olive oil has been the suspected heart benefit. A September 2006 paper published in the Annals of Internal Medicine found olive oil boosts heart healthy HDL cholesterol while lowering triglycerides and lowering oxidized LDL cholesterol.

Results: A linear increase in high-density lipoprotein (HDL) cholesterol levels was observed for low-, medium-, and high-polyphenol olive oil: mean change, 0.025 mmol/L (95% CI, 0.003 to 0.05 mmol/L), 0.032 mmol/L (CI, 0.005 to 0.05 mmol/L), and 0.045 mmol/L (CI, 0.02 to 0.06 mmol/L), respectively. Total cholesterol–HDL cholesterol ratio decreased linearly with the phenolic content of the olive oil. Triglyceride levels decreased by an average of 0.05 mmol/L for all olive oils. Oxidative stress markers decreased linearly with increasing phenolic content.

When you can lower heart disease and cancer risk with the same dietary practice, that sounds like a winner to me.

By Randall Parker 2007 January 02 10:48 PM  Aging Diet Cancer Studies
Entry Permalink | Comments(4)
2007 January 01 Monday
Home Robots Grow In Popularity

Joel Garreau says people are falling in love with their Roomba robotic vacuum cleaners.

This week, women all over America -- and not a few men -- are cooing and doting over their surprise hit Christmas present. They swoon when it hides under the couch and plays peekaboo. When it gets tired and finds its way back to its nest, sings a little song and then settles into a nap, its little power button pulsing like a beating heart, on, off, on, off, they swear they can hear it breathe.

It's as cute as E.T., as devoted as R2D2, more practical than a robotic dog and cheaper than some iPods.

iRobot's Roomba is a big seller.

More than 2 million of the machines, which range in price from about $150 to $330, have been sold. The day after Christmas, a Roomba was among the top 20 items in Amazon.com's vast home-and-garden section, ahead of the top-selling iron, the top-selling blender, the top-selling coffeemaker and the top-selling George Foreman grill. In Housewares, different models were Nos. 1, 6 and 8, ahead of all the other vacuum cleaners, including the DustBusters.

Automation of boring house work is a wonderful thing. I especially want full automation of food preparation. Picture a bunch of bins that you'd load with noodles, rice, and other basic dried goods. Plus, imagine a bunch of small spice bins. Then an automated system like a miniaturized warehouse robot would take small amounts from each bin and put the ingredients into a pot which would first be removed from a standard position on a rack and placed on a stove. If the automated system needed to, say, take an unopened bottle of ketchup from a shelf it would put an RFID tag on the bottle and put the bottle in the refrigerator after removing some ketchup. So when will we get the kitchen cook robot? 10 years? 20 years?

An MIT Technology Review article on the future of robots reports home robots surpassed industrial robots in number in 2005.

Domestics. If the latest figures are to be believed, 2007 will be the year of the robotic revolution. According to the latest Robotics Survey, published in October by the United Nations Economic Commission for Europe, domestic robots now outstrip their industrial cousins. In 2005, the number of domestic droids exceeded the one million milestone, a figure that is now expected to rise into the several millions over the next few years. Christensen believes that next year South Korea will likely come out with the first truly multifunctional home robot. The South Korean government is committed to becoming a leader in robotics and has announced a plan to have a robot in every home by 2013.

The industrial robots cost more and deliver more economic value. But the trend is clear. Home robotics has started to become a part of the present and not just a science fiction dream about the future.

While the term "Roomba" has achieved popularity in mainstream culture, iRobot also makes some less well known mass market floor-cleaning robots. The Scooba cleans hard floors such as those found in kitchens and bathrooms. The Dirt Dog picks up nails, bolts, and other small debris from shop floors. Scooba can clean just about any floor that a mop can clean.

Scooba is designed to safely clean all sealed hard floor surfaces, including tile, linoleum, marble and sealed hardwood—wherever you would typically use a standard mop. Scooba uses water and a specially designed Clorox® cleaning solution that is safe and effective on all sealed hard floor surfaces.

I can see one problem with these devices: Pets! My late great Australian Shepherd thought all wheeled devices were things to bite at. If I was pushing along a lawn mower that was not running he'd try to bite the wheels. So if you have a dog at home with access to the insides of the house and you set the Roomba or Scooba to do cleaning while you are at work what is Fido or Fluffy going to do when one of these devices starts cruising around? Maybe the simple solution is to start it running as one goes out the door to walk the dog.

How quickly will we get a taller device that'll vacuum couches and chairs or dust window sills and other ledges? The liability risk would be much higher for such a device. Plus, it would be a tougher problem to solve since the device would move in more dimensions with more axes of motion. The same difficulties hold for something tall enough to clear the dishes from the table, remove the table scraps, and put the dishes into the dishwasher.

Spiralling costs and an aging population make health care an area that cries out for robotic automation. Another Technology Review article reports on efforts to provide better physical feedback from robots to surgeons.

Robotic surgical systems have become a staple in operating rooms, advancing the field of minimally invasive surgery. These computer-assisted tools help surgeons conduct more-precise in-depth procedures. The robots are often praised for their dexterity, advanced visualization technologies, and mechanical stamina. But there is one important aspect the robots are missing: a sense of touch, also known as haptics.

A Johns Hopkins team is working on the haptics problem.

To develop such technology, Okamura and her team are working with the da Vinci surgical system made by Intuitive Surgical; it's the only robot approved by the FDA for conducting surgical procedures. The da Vinci is particularly useful in laparoscopic surgical procedures, such as the removal of the gallbladder or prostate. It also makes it possible to perform minimally invasive procedures for general noncardiac surgical procedures inside the chest.

Surgical robots will serve as aides to human surgeons in much the same way that automatic pilots do work for real pilots. Surgical robots will eventually do subsets of steps within longer surgical procedures. For example, a surgical robot could probably be designed to show graphically what it plans to do for a sequence by overlaying an animation on an already sliced-open area of the body. Then a surgeon could approve that sequence and the robot could perform it more rapidly and accurately than a human could.

We are moving beyond the stage where robots were used only in controlled and therefore relatively simple factory environments. The home and the surgical operating table are both much more complicated environments with more unplanned and unexpected elements that can show up. Recent advances in robotic vehicles demonstrate the potential for robotic systems to handle complex environments outside of factories. The success of robots in the mass market will provide revenue flows to fund the development of more robotic products. We should expect the introduction of new kinds of home and workplace robots in the next few years. Robots are a growing part of our everyday lives.

By Randall Parker 2007 January 01 09:35 PM  Robotics Trends
Entry Permalink | Comments(7)
Lower Fertility Drug Doses Just As Effective For IVF

By maturing eggs with minuscule doses of fertility drugs after the eggs are removed from the ovaries with needles, fertility researchers have found a way to give women far smaller doses of fertility drugs for in vitro fertilization.

Clinical trials in Denmark have shown that a pioneering technique known as in-vitro maturation (IVM) has a success rate of 30 per cent, comparable to standard IVF procedures. The patient, however, does not have to take expensive fertility drugs that can carry serious side-effects.

This lowers costs, perhaps by as much as half. It also reduces the risks of side effects from fertility drugs. So IVF becomes less risky and less costly at the same time.

Professor Lindenberg, who works at the Nordica Fertility Centre in Copenhagen, explained: “We give a very low dose of a stimulating drug for three days early in the cycle and rescue up to ten eggs. For the first 24 hours a tiny amount of stimulating hormone is added to the culture, in fact one hundreth of the dose the woman would receive, and after that the eggs go on to mature in the culture alone.”

This is great news for those with fertility problems trying to make babies now. But in the longer run this advance will get even more widely used by those who start using pre-implantation genetic diagnosis to select embryos for implantation based on desired genetic characteristics.

Lower hormone doses reduced the number of genetically damaged embryos produced.

Professor Bert Fauser - who carried out the study - said: 'Women are paying a high price financially and they are risking their health and psychological well being when low-dose therapy will work for the majority of patients.'

A second study by Professor Fauser's team at the University of Utrecht found that high stimulation of the ovaries with hormone drugs created more chromosomally damaged embryos compared to women on mild stimulation treatment.

Milder methods of extracting eggs will also reduce the risk for egg donors and therefore should increase the availability of donor eggs.

In the longer run I expect stem cell research to discover methods to create eggs from adult stem cells. This will solve the problem faced by women whose ovaries have gotten too old or never worked well in the first place.

At the conference where these results were presented the general theme was to find ways to reduce the severity of treatments used to boost fertility.

Dr Geeta Nargund, the organiser of the Congress on Natural Cycle and Minimal Stimulation IVF, and head of reproductive medicine at St George’s Hospital, London, believes it is time to stop giving women hormones to make them more fertile. In the week that IVF laws had a government shake-up, she says there is a back-to-basics approach to help women conceive that is safer, cheaper and, according to new studies presented by her peers at the congress, just as effective. Dr Nargund has pioneered techniques of scanning the ovaries for blood flow, which enables specialists to accurately predict which eggs are most likely to be fertilised successfully, doing away with the need to artifically stimulate the production of lots of eggs.

A new gene chip that can test 650,000 single letter genetic differences at once means we are getting close to finding large numbers of useful genetic variations. What causes IQ differences? We're going to know. We are finding out what causes hair color, skin color, and eye color differences. We are going to find out what causes differences in height, musculature, fat distribution (including breast size of course), teeth color, teeth enamel quality, facial shape, ear shape, and everything else that makes us look or think or feel differently from each other.

We are at the tip of a flood of information about human genetic differences. Prospective parents are going to use that information to choose embryos to get the kinds of kids they want. Look at brothers and look at sisters. Two brothers from the same parents can differ greatly in height, eye color, hair color, physical attractiveness, smarts, aggressiveness, and many other qualities. Couples are going to have the information they need to select which of their own genetic variations they will pass down to their future children, and I predict a substantial fraction of prospective parents will jump at the chance to make smarter and better looking children who are less prone to crime, depression, and assorted other problems.

The ability to fertilize several embryos, do genetic testing on them, and then choose the genetically most preferred embryo will accelerate the rate of human evolution. While many of the effects, such as intelligence boosting, will be beneficial, I worry that different groups will go for different ideals and cause the human race to go off in divergent directions. I'm especially worried about divergences that cause different tendencies in beliefs and moral sensibilities.

Will some choose genes that make their offspring more likely to be religious while others choose genes that make their offspring more likely to be skeptical? Will some choose genes which make people more likely to feel morally outraged while others choose genes that make their kids more amoral? Seems to me such choices will become possible and the human race could split into groups that cognitively see the world so differently that wars between them become inevitable.

By Randall Parker 2007 January 01 06:33 PM  Biotech Reproduction
Entry Permalink | Comments(0)