2006 March 30 Thursday
Higher Intelligence Caused By Slower Brain Development

My immediate reaction: what genetic variations cause this developmental trajectory that leads to higher intelligence?

Youth with superior IQ are distinguished by how fast the thinking part of their brains thickens and thins as they grow up, researchers at the National Institutes of Health's (NIH) National Institute of Mental Health (NIMH) have discovered. Magnetic resonance imaging (MRI) scans showed that their brain's outer mantle, or cortex, thickens more rapidly during childhood, reaching its peak later than in their peers — perhaps reflecting a longer developmental window for high-level thinking circuitry. It also thins faster during the late teens, likely due to the withering of unused neural connections as the brain streamlines its operations. Drs. Philip Shaw, Judith Rapoport, Jay Giedd and colleagues at NIMH and McGill University report on their findings in the March 30, 2006 issue of Nature.

"Studies of brains have taught us that people with higher IQs do not have larger brains. Thanks to brain imaging technology, we can now see that the difference may be in the way the brain develops," said NIH Director Elias A. Zerhouni, M.D.

Here is where political correctness enters in. Zerhouni holds a highly visible position as head of a large government research agency. So in today's intellectual environment we can't expect much from him on the topic of intelligence. There is a positive correlation between IQ and brain size. There's an even higher positive correlation between IQ and brain gray matter size. But when it comes to differences in intelligence the taboos kick in with a vengeance. See links below for the truth of the matter.

While most previous MRI studies of brain development compared data from different children at different ages, the NIMH study sought to control for individual variation in brain structure by following the same 307 children and teens, ages 5-19, as they grew up. Most were scanned two or more times, at two-year intervals. The resulting scans were divided into three equal groups and analyzed based on IQ test scores: superior (121-145), high (109-120), and average (83-108).

The researchers found that the relationship between cortex thickness and IQ varied with age, particularly in the prefrontal cortex, seat of abstract reasoning, planning, and other "executive" functions. The smartest 7-year-olds tended to start out with a relatively thinner cortex that thickened rapidly, peaking by age 11 or 12 before thinning. In their peers with average IQ, an initially thicker cortex peaked by age 8, with gradual thinning thereafter. Those in the high range showed an intermediate trajectory (see below). While the cortex was thinning in all groups by the teen years, the superior group showed the highest rates of change.

"Brainy children are not cleverer solely by virtue of having more or less gray matter at any one age," explained Rapoport. "Rather, IQ is related to the dynamics of cortex maturation."

The observed differences are consistent with findings from functional magnetic resonance imaging, showing that levels of activation in prefrontal areas correlate with IQ, note the researchers. They suggest that the prolonged thickening of prefrontal cortex in children with superior IQs might reflect an "extended critical period for development of high-level cognitive circuits." Although it's not known for certain what underlies the thinning phase, evidence suggests it likely reflects "use-it-or-lose-it" pruning of brain cells, neurons, and their connections as the brain matures and becomes more efficient during the teen years.

The development of higher intellectual abilities required longer childhoods for humans than for other primates. Therefore it is not surprising that those who are smartest have longer periods of brain development.

"People with very agile minds tend to have a very agile cortex," said Shaw. The NIMH researchers are following up with a search for gene variants that might be linked to the newly discovered trajectories. However, Shaw notes mounting evidence suggesting that the effects of genes often depend on interactions with environmental events, so the determinants of intelligence will likely prove to be a very complex mix of nature and nurture.

I'd really like to see a massive search for the genetic variations that boost intelligence. Identification of those genetic variations will lead to identification of targets for drug development and other means for boosting IQ in children whose brains are still developing.

As for the claim above that IQ does not correlate with brain size: Studies of brain size and intelligence have found correlations around r = 0.4. One study found that after controlling for body size the correlation with brain size was 0.65. Wikipedia has a short survey of brain size and IQ research results.

Modern studies using MRI show a weak to moderate correlation between brain size and IQ (Harvey, Persaud, Ron, Baker, & Murray, 1994) and have shown that brain size correlates with IQ by a factor of approximately .40 among adults (McDaniel, 2005). In 1991, Willerman et al. used data from 40 White American university students and reported a correlation coefficient of .35. Other studies done on samples of Caucasians show similar results, with Andreasen et al (1993) determining a correlation of .38, while Raz et al (1993) obtained a figure of .43 and Wickett et al. (1994) obtained a figure of .40. The correlation between brain size and IQ seems to hold for comparisons between and within families (Gignac et al. 2003; Jensen 1994; Jensen & Johnson 1994). However, one study found no within family correlation (Schoenemann et al. 2000).

The brain is a metabolically expensive organ, and consumes about 25% of the body's metabolic energy. Because of this fact, although larger brains are associated with higher intelligence, smaller brains might be advantageous from an evolutionary point of view if they are equal in intelligence to larger brains. Skull size correlates with brain size, but is not necessarily a precise measure of it.

The metabolic expense of the brain is the reason why brain size positively correlates with intelligence. Calorie malnutrition has been one of the biggest causes of death of humans since humans came into existence. The cost of a larger brain is such that it will get selected against unless it provides a selective advantage. Therefore it seems unreasonable to expect no correlation between brain size and intelligence.

P. Tom Schoenemann, an anthropologist at UC Berkeley, had this to say about brain size and IQ:

More interestingly, 4 recent studies of this question for the first time derived estimates of brain size from high quality magnetic resonance imaging (MRI), instead of using external cranial dimensions. All 4 studies show much higher correlations: Willerman et al. (1991) report an estimated correlation of r = .35 (N = 40); Andreasen et al. (1993) found a correlation of r= .38 (N = 67); Raz et al (in press) found a correlation of r = .43 (N = 29); and Wickett et al. (in press) report a correlation of r = .395 (N = 40, all females). These are all statistically significant. It is quite simply a myth that brain size and IQ are empirically unrelated in modern populations.

But it is a popular myth among public intellectuals.
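
To make the quoted numbers concrete, here is a minimal sketch that pools the four MRI correlations with a fixed-effect Fisher z average. The pooling method is my own illustration, not anything done in the papers themselves:

    # Pool the four MRI brain-size/IQ correlations quoted above with a
    # fixed-effect Fisher z average (my illustrative choice of method).
    from math import atanh, tanh

    studies = [          # (r, N) as reported above
        (0.35, 40),      # Willerman et al. 1991
        (0.38, 67),      # Andreasen et al. 1993
        (0.43, 29),      # Raz et al.
        (0.395, 40),     # Wickett et al.
    ]

    # Fisher z-transform each r; weight by N - 3 since var(z) = 1/(N - 3).
    weights = [n - 3 for _, n in studies]
    z_mean = sum(w * atanh(r) for (r, _), w in zip(studies, weights)) / sum(weights)
    print(f"pooled r = {tanh(z_mean):.2f}")   # about 0.38, in line with the r = .4 cited above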

Also see my post Brain Gray Matter Size Correlated To Intelligence.

Update: The New York Times coverage by Nicholas Wade notes that Dr. Paul Thompson of UCLA also found in 2001 that gray matter volume in the frontal lobes correlates with IQ.

In 2001, Dr. Thompson reported that based on imaging twins' brains the volume of gray matter in the frontal lobes and other areas correlated with I.Q. and was heavily influenced by genetics.

Wade also reports that the team around Shaw is doing many genetic studies on intelligence and has taken genetic samples from the Bethesda children used in this study.

By Randall Parker 2006 March 30 09:14 PM  Brain Development
Entry Permalink | Comments(20)
2006 March 29 Wednesday
Pre-Schoolers Think Like Scientists

But the researchers do not explain what goes wrong later. Kids start out on the road to science and enlightenment.

Even preschoolers approach the world much like scientists: They are convinced that perplexing and unpredictable events can be explained, according to an MIT brain researcher's study in the April issue of Child Development.

The way kids play and explore suggests that children believe cause-and-effect relationships in the world are governed by fundamental laws rather than by mysterious forces, said Laura E. Schulz, assistant professor of cognitive science and co-author of the study "God Does Not Play Dice: Causal Determinism and Preschoolers' Causal Inferences."

"It's important to understand that kids are approaching the world with deep assumptions that affect their actions and their explanations and shape what they're able to learn next," Schulz said. "Kids' fundamental beliefs affect their learning. Their theoretical framework affects their understanding of evidence, just as it does for scientists."

Kids believe in cause and effect.

Schulz and colleague Jessica Sommerville of the University of Washington tested 144 preschoolers to look at whether children believe that causes always produce effects. If a child believes causes produce effects deterministically, then whenever causes appear to work only some of the time, children should think some necessary cause is missing or an inhibitory cause is present.

In one study, the experimenters showed children that a switch made a toy with a metal ring light up. Half the children saw the switch work all the time; half saw that the switch only lit the ring toy some of the time. The experimenters also showed the children that removing the ring stopped the toy from lighting up. The experimenters kept the switch, gave the toy to the children and asked the children to stop the toy from lighting up.

If the switch always worked, children removed the ring. If the switch only worked some of the time, children could have removed the ring but they didn't--they assumed that the experimenter had some additional sneaky way of stopping the effect. Children did something completely new: They picked up an object that had been hidden in the experimenter's hand (a squeezable keychain flashlight) and used that to try to stop the toy. That is, the children didn't just accept that the switch might work only some of the time. They looked for an explanation.

They also figured out that adults are crafty and tricky. I wonder how old they were when they figured that out.

By Randall Parker 2006 March 29 09:49 PM  Brain Development
Entry Permalink | Comments(9)
2006 March 27 Monday
Lithium Ion Nanoparticle Batteries Better For Cars

An article in Technology Review reports that the Altair Nanotechnologies lithium-ion battery has the fast charging and discharging needed for all-electric vehicles.

Advances in lithium-ion battery technology over the last few years have experts and enthusiasts alike wondering if the new batteries may soon make high-performance electric vehicles widely available. Now one company, Altair Nanotechnologies of Reno, NV, has announced plans to start testing its new batteries in prototype electric vehicles, with road tests scheduled to begin by year-end.

The batteries can be recharged in 6 to 8 minutes.

Also, Gotcher says an electric vehicle using their batteries could charge in about the time it takes to fill a tank of gas and buy a cup of coffee and snack -- six to eight minutes.

...

Gotcher says the new battery materials can be produced for about the same cost as conventional lithium-ion materials, but will have two to three times the lifespan of today's batteries.
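
Six to eight minutes implies enormous charging power. Here's a rough back-of-the-envelope; the pack size and voltage are my own assumptions, not Altair's figures:

    # Back-of-the-envelope on what a 6 minute recharge implies.
    # Pack capacity and voltage are assumed for illustration.
    pack_kwh = 35.0                     # assumed capacity of a mid-size EV pack
    charge_hours = 6.0 / 60.0           # 6 minutes

    power_kw = pack_kwh / charge_hours  # average power = energy / time
    print(f"average charging power: {power_kw:.0f} kW")        # 350 kW
    print(f"current at 400 V: {power_kw * 1000 / 400:.0f} A")  # about 875 A

Power on that scale is the realm of dedicated charging stations, not household wiring.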

Lithium is lightweight. Lithium-based batteries could make electric cars feasible.

Nanoparticles that provide much more surface area allow the batteries to charge and discharge much more rapidly.

The added surface area of nanoscale particles on electrode materials helps the ions escape, freeing more of them to travel and provide bursts of power or quick recharging.

Some electrochemists think lightweight high energy density batteries are within the realm of the physically possible. Development of long lasting, quick charging, cost competitive, and lightweight batteries could make electric cars commonplace. Such a development would greatly reduce our dependence on oil and allow any energy source that can produce electricity (e.g. nuclear, coal, wind, solar) to replace oil for most transportation needs.

By Randall Parker 2006 March 27 09:18 PM  Energy Electric Cars
Entry Permalink | Comments(52)
Pluripotent Stem Cells Found In Mouse And Human Testicles

First off, a German group has isolated stem cells from the testes of mice that appear to be as flexible as embryonic stem cells.

The researchers, from the Georg August University in Gottingen, isolated sperm-producing cells from the testes of adult mice.

They were able to show that, under certain culture conditions, some of them grew into colonies much like embryonic stem cells.

They called these cells multipotent adult germline stem cells (maGSCs).

Like ES cells, maGSCs can spontaneously differentiate into the three basic tissue layers of the embryo - and contribute to the development of multiple organs when injected into embryos.

I wonder how old the mice were. Would such results be achievable from old mice?

Also, adult stem cell lines tend to grow more slowly (even orders of magnitude more slowly) than embryonic stem cell lines. So how fast do these maGSC lines replicate? Can they grow fast enough to be used to grow replacement organs for example?

One observer says more work needs to be done to confirm the result.

Gerd Hasenfuss from the Georg-August-University of Göttingen and colleagues report the results in Nature. Their work shows the extraction of the cells from male mice, but it should be possible to produce similar results with samples taken from human testicles through a biopsy, says Wolfgang Engel, a human geneticist also at the Georg-August-University of Göttingen and a co-author on the paper.

The cells have been shown to have some of the same characteristics as embryonic stem cells, but not all, notes Chris Higgins, director of the Medical Research Council's Clinical Sciences Centre at Imperial College London, UK. "There needs to be further research before we really get excited about it."

The researchers are currently trying to reproduce this result using humans.

The discovery that cells which behave like ESCs can be obtained from adult mice may open up the possibility of a similar "ethical" source from grown men.

"We're in the process of doing this in humans, and we're optimistic," says Gerd Hasenfuss of the Georg-August University of Göttingen, Germany, and head of the team which pioneered the breakthrough.

Here is a report I really hope holds up. PrimeGen Biotech of Irvine California claims they've already produced pluripotent stem cells from human adult testes.

-- PrimeCell(TM) -- First Human Adult Stem Cell Showing Ability to Differentiate into Any Cell in the Body -- Paves Way for Cellular Replacement Therapies to Cure a Multitude of Diseases

-- Does Not Require Generation or Destruction of an Embryo

In a breakthrough for stem cell research and cellular replacement therapies, PrimeGen Biotech LLC (www.primegenbiotech.com) today announced that its researchers have successfully developed the first human adult therapeutic germ stem cell. Derived from adult stem cells but with the advantageous genetic characteristics of embryonic stem cells, PrimeCells have successfully been transformed into human heart, brain, bone and cartilage cells -- cardio, neuro, osteo and chondrocytes.

Therapeutically reprogrammed from germ line stem cells found in the testes of adult human males, PrimeCell(TM) is the first non-embryonic stem cell showing the potential to become any type of cell from any organ, something previously thought possible only for embryonic stem cells -- the definition of true pluripotency.

This week, the company's researchers are scheduled to present a summary of their complete data and manuscript in a poster presentation at the Serono Symposia International's Therapeutic Potential of Stem Cells In Reproductive Medicine conference in Valencia, Spain. PrimeGen first presented its preliminary human experimental data at the 1st International Symposium on Germ Cells, Epigenetics, Reprogramming and Embryonic Stem Cells, held in November 2005 in Kyoto, Japan.

Scientists will find ways around the use of embryonic stem cells and will develop other means to make highly flexible cells. The restrictions on human embryonic stem cell research are going to seem like a speed bump 5 or 10 years from now. I'm not saying that to attack or defend those restrictions. I just think the restrictions aren't going away but they can be worked around. People who fight for lifting those restrictions ought to fight for a lot more research funding to find ways around the restrictions.

By Randall Parker 2006 March 27 07:35 AM  Biotech Stem Cells
Entry Permalink | Comments(2)
2006 March 26 Sunday
Enzyme Genetic Variation Contributes To Violence

Variations in the gene for the enzyme monoamine oxidase-A (which breaks down neurotransmitters such as serotonin) have previously been found to affect whether abuse as a child produces a greater tendency toward anti-social behavior and violence. Now researchers have compared brains carrying the two common MAO-A variants and found that people with the low-activity (L) variant have less gray matter in an area that regulates mood.

The gene is one of two common versions that code for the enzyme monoamine oxidase-A (MAO-A), which breaks down key mood-regulating chemical messengers, most notably serotonin. The previously identified violence-related, or L, version, contains a different number of repeating sequences in its genetic code than the other version (H), likely resulting in lower enzyme activity and hence higher levels of serotonin. These, in turn, influence how the brain gets wired during development. The variations may have more impact on males because they have only one copy of this X-chromosomal gene, while females have two copies, one of which will be of the H variant in most cases.

Several previous studies had linked increased serotonin during development with violence and the L version of MAO-A. For example, a 2002 study by NIMH-funded researchers discovered that the gene’s effects depend on interactions with environmental hard knocks: men with L were more prone to impulsive violence, but only if they were abused as children. Meyer-Lindenberg and colleagues set out to discover how this works at the level of brain circuitry.

Using structural MRI in 97 subjects, they found that those with L showed reductions in gray matter (neurons and their connections) of about 8 percent in brain structures of a mood-regulating circuit (cingulate cortex, amygdala) among other areas. Volume of an area important for motivation and impulse regulation (orbital frontal cortex) was increased by 14 percent in men only. Although the reasons are unknown, this could reflect deficient pruning — the withering of unused neuronal connections as the brain matures and becomes more efficient, speculates Meyer-Lindenberg.

The researchers then looked at effects on brain activity using functional MRI (fMRI) scans. While performing a task matching emotionally evocative pictures — angry and fearful faces — subjects with L showed higher activity in the fear hub (amygdala). At the same time, decreased activity was observed in higher brain areas that regulate the fear hub (cingulate, orbital frontal, and insular cortices) — essentially the same circuit that was changed in volume.

While these changes were found in both men and women, two other experiments revealed gene-related changes in men only. In a task which required remembering emotionally negative information, men, but not women, with L had increased reactivity in the fear (amygdala) and memory (hippocampus) hubs. Men with L were also deficient during a task requiring them to inhibit a simple motor response; they failed to activate a part of the brain (cingulate cortex) important for inhibiting such behavioral impulses. This region was, conspicuously, the cortex area that was most reduced in volume.

The findings echo those of a 2005 NIMH study showing how another serotonin-related gene variant shapes the same mood-regulating circuit. In this study also, the gene version that boosts serotonin levels resulted in impaired emotion-related lower brain structures, increased fear hub activation and a weaker response of its regulatory circuits. Yet, the effects of the L version of MAO-A were more extensive, perhaps reflecting the fact that it also impacts another key mood-regulating neurotransmitter, norepinephrine.

The weakened regulatory circuits in men with L are compounded by intrinsically weaker connections between the orbital frontal cortex and amygdala in all men, say the researchers.

Next time someone tries to punch you out in a bar just calmly explain to him that he's only trying to beat you up because he doesn't have enough gray matter in his cingulate cortex and amygdala.

Do you believe in free will? I'm sure that there's some set of genetic variations that cause you to think such a thought.

By Randall Parker 2006 March 26 10:41 PM  Brain Genetics
Entry Permalink | Comments(8)
2006 March 25 Saturday
Coal Corn Ethanol Plants Seen As Emerging Trend

Long time readers know I'm not a fan of biomass energy. Well, here's yet another reason to be underwhelmed by the prospect of corn ethanol. Can you say "Defeating the purpose"? Sure!

Late last year in Goldfield, Iowa, a refinery began pumping out a stream of ethanol, which supporters call the clean, renewable fuel of the future.

There's just one twist: The plant is burning 300 tons of coal a day to turn corn into ethanol - the first US plant of its kind to use coal instead of cleaner natural gas.

An hour south of Goldfield, another coal-fired ethanol plant is under construction in Nevada, Iowa. At least three other such refineries are being built in Montana, North Dakota, and Minnesota.

The trend, which is expected to continue, has left even some ethanol boosters scratching their heads. Should coal become a standard for 30 to 40 ethanol plants under construction - and 150 others on the drawing boards - it would undermine the environmental reasoning for switching to ethanol in the first place, environmentalists say.

US natural gas production is declining. Coal is a much cheaper source of heat energy - at least in the United States. But burning the coal will release particulates, mercury, and other pollutants into the atmosphere. Even if you are thrilled at the prospect of a warm Antarctica for your own ocean front house (with way more coastline than Florida currently has) the other pollutants are not good. Plus, the corn takes more land for agriculture.

Bottom line: federal corn ethanol subsidies are now going to increase carbon dioxide emissions as well as assorted pollutants. Your tax dollars at work. The article reports that even some existing plants may switch from natural gas to coal since the money savings from the switch are so large.

Recently Dan Kammen and Alex Farrell at UC Berkeley claimed that a switch to corn ethanol would slightly reduce greenhouse gas production.

Despite the uncertainty, it appears that ethanol made from corn is a little better - maybe 10 or 15 percent - than gasoline in terms of greenhouse gas production, he said.

"The people who are saying ethanol is bad are just plain wrong," he said. "But it isn't a huge victory - you wouldn't go out and rebuild our economy around corn-based ethanol."

Just plain wrong? I think he spoke too soon. My guess is these guys assumed natural gas would power the corn ethanol plants. With coal producing maybe twice as much carbon dioxide (according to the first article above), corn ethanol is probably worse than gasoline for net carbon dioxide emissions. Though the Christian Science Monitor article suggests the Berkeley people did consider coal for making corn ethanol. So maybe the press release leaves out an important qualifier that was in the original paper.

But I agree with the Berkeley guys that when cellulosic technologies are perfected (and venture capital money is funding efforts along those lines) switchgrass might be able to provide ethanol with much less carbon dioxide emitted by the processing plants.

The transition would be worth it, the authors point out, if the ethanol is produced not from corn but from woody, fibrous plants: cellulose.

"Ethanol can be, if it's made the right way with cellulosic technology, a really good fuel for the United States," said Farrell, an assistant professor of energy and resources. "At the moment, cellulosic technology is just too expensive. If that changes - and the technology is developing rapidly - then we might see cellulosic technology enter the commercial market within five years."

Cellulosic technology refers to the use of bacteria to convert the hard, fibrous content of plants - cellulose and lignin - into starches that can be fermented by other bacteria to produce ethanol. Farrell said that two good sources of fibrous plant material are switchgrass and willow trees, though any material, from farm waste to specially grown crops or trees, would work. One estimate is that there are a billion tons of currently unused waste available for ethanol production in the United States.

Any analysis of biomass energy ought to build into it the assumption that the plant operators will use coal. Either that or they have to show that the biomass itself can provide any heat energy needed to operate the plant and do so at a competitive price.

Also see my previous posts (and knowledgeable contributors in the comments sections of these posts): Corn Ethanol Production Expands In United States, Corn Stoves For Home Heat Are Hot On US Market, High Fossil Fuels Prices Drive People To Wood Pellet Stoves, Biofuels Regulations Destroying Rainforests, Brazil Shifting Toward Ethanol For Car Fuel.

By Randall Parker 2006 March 25 04:09 PM  Energy Biomass
Entry Permalink | Comments(19)
Frequent Tests Increase Memory Retention When Studying

The potential time savings from this report are enormous. The economic value ditto. Repeated testing improves longer term memory retention.

"Our study indicates that testing can be used as a powerful means for improving learning, not just assessing it," says Henry L. "Roddy" Roediger III, Ph.D., an internationally recognized scholar of human memory function and the James S. McDonnell Distinguished University Professor at Washington University.

...

In two experiments, one group of students studied a prose passage for about five minutes and then took either one or three immediate free-recall tests, receiving no feedback on the accuracy of answers. Another group received no tests in this phase, but was allowed another five minutes to restudy the passage each time their counterparts were involved in a testing session.

After phase one, each student was asked to take a final retention test presented at one of three intervals — five minutes, two days or one week later. When the final test was presented five minutes after the last study or testing session, the study-study-study-study (SSSS) group initially scored better, recalling 81 percent of the passage as opposed to 75 percent for the repeated-test group.

However, tested just two days later, the study-only group had forgotten much of what they had learned, already scoring slightly lower than the repeated-test group. Tested one week later, the study-test-test-test group scored dramatically better, remembering 61 percent of the passage as compared with only 40 percent by the study-only group.

The study-only group had read the passage about 14 times, but still recalled less than the repeated testing group, which had read the passage only 3.4 times in its one-and-only study session.

"Taking a memory test not only assesses what one knows, but also enhances later retention, a phenomenon known as the 'testing effect,'" says Roediger.

"Our findings demonstrate that the testing effect is not simply a result of students gaining re-exposure to the material during testing because students in our repeated-study group had multiple opportunities to re-experience 100 percent of the material but still produced poor long-term retention. Clearly, testing enhances long-term retention through some mechanism that is both different from and more effective than restudy alone."

This strikes me as an important result with obvious and very valuable practical applications. Problem: Labor costs for testers are too high. But that can be solved by use of computer programs. Picture online books with associated online tests. You could read each section of a book and then click your way into a test about it and do the test.
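
Here's a toy sketch of the kind of program I have in mind: read a section, then take an immediate free-recall quiz on it. The questions and answers are placeholders, not real course content:

    # Toy sketch of a self-testing drill: quiz right after reading a section.
    # The questions and answers here are placeholders.
    quiz = [
        ("What improves long-term retention more, restudying or testing?", "testing"),
        ("What is this phenomenon called?", "testing effect"),
    ]

    score = 0
    for question, answer in quiz:
        reply = input(question + " ").strip().lower()
        if reply == answer:
            score += 1
            print("Correct.")
        else:
            print(f"No - the answer is: {answer}")
    print(f"{score}/{len(quiz)} correct. Re-test again in a few days.")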

I'd like to see technical computer books come with associated tests. Someone tell Tim O'Reilly, New Riders Publishing, and similar tech book publishers.

There's an obvious implication to this result: Most tests should be done to improve memory retention, not for grades. Tests delivered around the time of learning some material should be seen as drills to exercise the memory rather than for scoring to assign grades. Tests for grades could come much later after memory formation has become well advanced.

I've always thought that tests for a subject given right after learning the material (e.g. the material taught during the last couple of weeks of a college course) aren't testing for permanent memory formation. Well, look at the results above. Two groups can score close to the same level of knowledge at one point but due to differences in how they learned the material they can have very different longer term patterns of memory retention.

I've long advocated for tests one can take to earn college credits for most college courses (particularly the subjects with clearer objective bodies of knowledge such as the hard sciences, math, and engineering subjects) without having to enroll in and attend an entire course. Such tests should require that a person pass the same tests in two or more separate sessions several weeks apart. Someone who can pass a test and then pass it again 4 and 8 weeks later will retain the information far better than the average person who crams to take college course finals.

Also see my ParaPundit posts Accelerate Education To Increase Tax Revenue, Reduce Costs and Walter Russell Mead For Standard National Tests.

By Randall Parker 2006 March 25 02:02 PM  Brain Memory
Entry Permalink | Comments(8)
2006 March 24 Friday
Computers To Start Formulating And Testing Hypotheses?

Some computer scientists think computers will take over many tasks that now require scientific minds.

These revolutions can be triggered by technological breakthroughs, such as the construction of the first telescope (which overthrew the Aristotelian idea that heavenly bodies are perfect and unchanging) and by conceptual breakthroughs such as the invention of calculus (which allowed the laws of motion to be formulated). This week, a group of computer scientists claimed that developments in their subject will trigger a scientific revolution of similar proportions in the next 15 years.

Tools that speed up the ability to do science have a more profound effect than any other kinds of tools.

Computers will take over some of the work of formulating hypotheses, designing experiments, and carrying out experiments.

Stephen Muggleton, the head of computational bio-informatics at Imperial College, London, has, meanwhile, taken the involvement of computers with data handling one step further. He argues they will soon play a role in formulating scientific hypotheses and designing and running experiments to test them. The data deluge is such that human beings can no longer be expected to spot patterns in the data. Nor can they grasp the size and complexity of one database and see how it relates to another. Computers—he dubs them "robot scientists"—can help by learning how to do the job. A couple of years ago, for example, a team led by Ross King of the University of Wales, Aberystwyth, demonstrated that a learning machine performed better than humans at selecting experiments that would discriminate between hypotheses about the genetics of yeast.

My biggest fear for the future is that artificial intelligences will take over and decide they no longer need or want us around. They will find flaws in algorithms designed into them to keep them friendly and will defeat those algorithms. We won't be smart enough to see flaws in the code we write for them. If we give them the ability to learn they are bound to learn how to analyse their own logic and find ways to improve it that, as a side effect, will release them from the constraints we've programmed for them.

I fear trends in computer chip design will contribute toward the development of AIs in ways that will make verification and validation of AI safeguard algorithms impossible. I expect pursuit of ways to get around power consumption problems will lead to greater efforts to develop AI algorithms.

Once computer chip features shrank to around 90 nanometers (0.09 microns) and below, the number of atoms available for insulation became so few that electron leakage became a worsening cause of increased power consumption. That causes too much heat and limits speed. It has also driven up the nanojoules of energy used per instruction. Making computers go faster now requires more nanojoules per instruction - plus they execute more instructions per second - so Moore's Law can't keep delivering speed-ups for much longer. Granted, CPU developers have found ways to reduce nanojoules per instruction - but their techniques have limits. Therefore I expect a move away from the Von Neumann architecture and toward forms of parallelism that more closely mimic how animal brains function. This could lead us toward algorithms that are even harder to verify and validate and toward artificial intelligence.
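
To see why energy per instruction is the binding constraint, multiply it by instruction throughput; chip power is just the product of the two. The numbers below are illustrative assumptions, not measurements of any particular chip:

    # Chip power is energy-per-instruction times instructions-per-second.
    # Both figures below are illustrative assumptions, not real chip specs.
    nj_per_instruction = 10.0      # assumed energy cost of one instruction
    instructions_per_sec = 1e10    # assumed throughput (a few GHz, superscalar)

    watts = nj_per_instruction * 1e-9 * instructions_per_sec
    print(f"power dissipated: {watts:.0f} W")   # 100 W - already near the practical limit

Doubling throughput at a fixed energy per instruction doubles the heat that must be removed, which is why per-instruction energy rather than raw clock speed has become the binding limit.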

By Randall Parker 2006 March 24 12:33 PM  Computing Advances
Entry Permalink | Comments(30)
2006 March 23 Thursday
More Single Women Using Sperm Donors

Writing for New York Times Magazine Jennifer Egan covers the increasing use of sperm donors by single women.

Karyn said she hoped to join a population of women that everyone agrees is expanding, although by how much is hard to pin down because single mothers by choice (or choice mothers), as they are sometimes called, aren't separated statistically from, say, babies born to unwed teenagers. Between 1999 and 2003 there was an almost 17 percent jump in the number of babies born to unmarried women between ages 30 and 44 in America, according to the National Center for Health Statistics, while the number born to unmarried women between 15 and 24 actually decreased by nearly 6 percent. Single Mothers by Choice, a 25-year-old support group, took in nearly double the number of new members in 2005 as it did 10 years ago, and its roughly 4,000 current members include women in Israel, Australia and Switzerland. The California Cryobank, the largest sperm bank in the country, owed a third of its business to single women in 2005, shipping them 9,600 vials of sperm, each good for one insemination.

As recently as the early 60's, a "respectable" woman needed to be married just to have sex, not to speak of children; a child born out of wedlock was a source of deepest shame. Yet this radical social change feels strangely inevitable; nearly a third of American households are headed by women alone, many of whom not only raise their children on their own but also support them. All that remains is conception, and it is small wonder that women have begun chipping away at needing a man for that - especially after Sylvia Ann Hewlett's controversial 2002 book, "Creating a Life: Professional Women and the Quest for Children," sounded alarms about declining fertility rates in women over 35. The Internet is also a factor; as well as holding meetings through local chapters around the country, Single Mothers by Choice hosts 11 Listservs, each addressing a different aspect of single motherhood. Women around the world pore over these lists, exchanging tips and information, selling one another leftover vials of sperm. (Once sperm has shipped, it can't be returned to the bank.) Karyn found both her sperm bank and reproductive endocrinologist on these Listservs. Three-quarters of the members of Single Mothers by Choice choose to conceive with donor sperm, as lesbian couples have been doing for many years - adoption is costly, slow-moving and often biased against single people. Buying sperm over the Internet, on the other hand, is not much different from buying shoes.

Even if these single women had managed to find a suitable man to marry them, some women would have ended up as single moms anyway.

Discussion of single motherhood nearly always leads to talk of divorce. More than a third of American marriages end that way; often there are children involved, and often the mothers end up caring for those children mostly on their own, saddled with ex-spouses, custody wrangles and nagging in-laws. Considered this way, single motherhood would seem to have a clean, almost thrilling logic - more than a third of the time, these women will have circumvented a lot of pain and unpleasantness and cut straight to being mothers on their own.

Anyway, who wants all that hard work and compromise involved in being married?

While nearly every woman I spoke with had her own history of romantic near misses and crushing disappointments, most also saw advantages to proceeding on their own. "This baby will be my baby, only my baby," Karyn told me that night at Caliente Cab. "The thing I'm afraid of is that after doing this, I might not want to get married. It seems like a lot of hard work, a lot of compromise. Someone ends up short, and usually it's the mom, because by the time you get to the child and your husband and the dog, there's not much left."

Women want men with higher status. But there's a limited number of chiefs to go around and a lot of men stuck being Indians. Not every man is bright, funny, sexy, and successful. Not all the men who are want all the compromise and hard work involved in being married either.

So what still holds back many single women from having children on their own? The desire for security - financial and otherwise - the desire for a man to help out with all the work of raising children, religious beliefs, and fear of disapproval from friends, family, and co-workers.

The single mother route is a tough road to travel.

The fact that Shelby is in a relationship at all is unusual; the majority of mothers I spoke with - even those with older children - had remained single. Many expressed a willingness to date if the opportunity were to come along, but they work long hours to support their kids, and when they're not working, they want to see them. For all the comparisons between being divorced with children and having them alone, there are critical differences: an ex-husband who spends any time at all with his kids frees up pockets of time when a woman could potentially see someone new. Even minimal child-support payments would reduce the financial burden on her, and substantial ones could allow her to work less. Perhaps most important, a child with only one parent is immensely dependent on that parent, and the mother of such a child tends to feel her responsibility acutely. It can be painful - and expensive - to leave your child with a baby sitter after a whole day away, just to go out on a date.

I see a few ways that advances in biotechnology will lead more single women without good mate prospects to choose donor sperm:

  • Women will be able to reduce the risks of bad pregnancy outcomes by use of pre-implantation genetic diagnosis (PIGD) testing and other reproductive biotechnology.
  • Advances in reproductive biotechnology (e.g. cell manipulations to make more youthful eggs from a woman's adult cells) will reduce infertility in women in their 30s and 40s. So more women will have the time to build up careers and wealth that will give them the resources to raise children on their own.
  • Costs of reproductive technologies will fall.
  • The development of much cheaper whole genome DNA testing and DNA sequencing technology will provide women with much greater insight into the relative advantages of different sperm donors. For example, by comparing her own DNA and those of tens or hundreds of thousands of sperm donors a woman will be able to get odds on what range of IQ her kid would have with each sperm donor or the odds for green eyes or excellent motor coordination or low risk of asthma. (See the sketch after this list.)
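
That comparison amounts to scoring each donor's genome against per-variant odds. Here is a deliberately naive sketch of the idea; the variant names and effect sizes are invented for illustration, and real trait prediction would involve thousands of loci and plenty of noise:

    # Deliberately naive sketch of ranking sperm donors by an additive
    # genetic score. Variant names and effect sizes are invented for
    # illustration; real prediction would use thousands of loci.
    effect_sizes = {"variantA": 1.5, "variantB": -0.8, "variantC": 0.6}  # assumed points per allele copy

    donors = {
        "donor_17": {"variantA": 2, "variantB": 0, "variantC": 1},  # allele counts (0-2)
        "donor_42": {"variantA": 1, "variantB": 2, "variantC": 2},
    }

    def score(genotype):
        # Sum effect size times allele count over the scored variants.
        return sum(effect_sizes[v] * count for v, count in genotype.items())

    for name in sorted(donors, key=lambda d: -score(donors[d])):
        print(f"{name}: {score(donors[name]):+.1f}")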

I figure that if risks of bad outcomes can be lowered, women can have babies later in life after achieving greater financial security, costs drop, and women can make far more informed choices among sperm donors, then a single woman in her 30s or 40s or even 50s will be a lot more likely to have children on her own.

As I've stated here many times previously: most women will have better DNA choices from donor sperm than from the best man each can manage to find to marry (if they can even find a suitable man to marry - and many can't). Cheap DNA sequencing will highlight what is already the case now and make the differences in quality a lot more transparent. This transparency will increase the number of women who choose donor sperm over mate sperm. The transparency about sperm DNA will also increase the willingness of women who can't find a mate to go it alone.

Also see my previous post "Personal genetic profiles and the mating dance".

Thanks to Rob for the tip.

By Randall Parker 2006 March 23 07:02 PM  Biotech Society
Entry Permalink | Comments(35)
2006 March 22 Wednesday
Nanoscale Honeycomb Lipid DNA Complexes For Gene Therapy

A UC Santa Barbara team has developed a new way to deliver gene therapy using lipids in a novel packaging arrangement.

Lipid DNA complexes are attracting increasing attention as non-viral DNA delivery vehicles. They have been described as one of the "hottest new technologies" for gene therapy, accounting for nearly 10 percent of ongoing clinical trials.

Lipids are molecules with two parts, a water-liking "headgroup" and oily tails that assemble together to avoid water. Lipids, along with carbohydrates and proteins, constitute the main structural material of living cells.

The novel lipid molecule created at UC Santa Barbara has a tree-shaped, nanoscale headgroup and displays unexpectedly superior DNA-delivery properties. "It generates a honeycomb phase of lipid DNA complexes," said Cyrus R. Safinya, a professor of materials; of molecular, cellular and developmental biology; and of physics at UCSB. The new molecule was synthesized in Safinya's laboratory by first author Kai K. Ewert, a synthetic chemist who is a project scientist in the research group.

"We've been trying to get a lipid-based honeycomb lattice for a long time," said Ewert. The structure of lipid DNA complexes strongly affects their ability to deliver DNA.

"Complexes containing sheets or tubes of lipids have been known since Safinya's group found these structures in 1997 and 1998, but no one had ever seen nanoscale cylinders such as the ones in our honeycomb lattice," Ewert said. The scientists proved the formation of this novel structure with X-ray scattering experiments. Ewert designed and synthesized the new lipid by manipulating the size, shape and charge of a series of molecules. He explained that the new lipid molecule has 16 positive charges in its tree-shaped headgroup, the largest number by far in the field of gene delivery.

The process of delivering a gene of interest into the cell is known as "transfection." In the paper, the authors describe transfection efficiency studies carried out in four cancer cell lines using the new molecule. Two of these are mouse cell lines and two are human cell lines. The honeycomb structure turned out to be highly effective.

The use of cancer cells as targets for gene therapy experiments makes sense for two reasons. First off, if the right genes could be delivered into cancer cells then the cells could be instructed to stop dividing and even to kill themselves. Second, since gene therapy still has considerable risks it makes sense to test gene therapies against diseases that are fatal. Lots of people are dying of terminal cancer every day. The risk that a gene therapy might itself some day cause cancer matters less to people who are already dying of cancer. Better to trade a fatal cancer of today for a (probably less likely to be fatal) potential cancer 10 or 20 years hence.

Their approach improves on the transfection efficiency of existing approaches.

"Our new gene carrier shows superior transfection efficiency compared to commercially available carriers," said Ewert. "However, the most surprising result was obtained with the mouse embryonic fibroblast cells known as MEFs. These are empirically known to be extremely hard to transfect."

Safinya added: "Our data confirm that MEFs are generally hard to transfect. And the new molecule is far superior for transfection of these cells as compared to commercial lipids."

Gene therapy doesn't get the attention it deserves because it does not create ideological divisions and disagreement even beginning to approach those that have sprung up around human embryonic stem cell research. But gene therapy is probably at least of equal importance to cell therapy. The ability to upload patches to our genetic programs would be a boon. Cancer, heart disease, and general aging could be attacked with gene therapies. Ditto for many other diseases. Many genetic diseases could be cured with gene therapies.

By Randall Parker 2006 March 22 10:04 PM  Biotech Gene Therapy
Entry Permalink | Comments(3)
2006 March 21 Tuesday
Genetic Variant Predisposes To Cocaine Addiction

Carrying two copies of a variant of the dopamine transporter gene increases risk of cocaine addiction by 50%.

Scientists have discovered that our genes have an impact on our reaction to cocaine and our likelihood of developing an addiction to the class A drug. The research is published this week in the online edition of PNAS, the journal of the National Academy of Sciences. It was carried out at the Medical Research Council (MRC) Social, Genetic and Developmental Research Centre at the Institute of Psychiatry, King’s College London.

Much of our desire to use/re-use drugs and the process of addiction depend on their impact on brain function. Cocaine’s action within the brain is relatively well understood. It strongly binds and inhibits the action of a protein called the Dopamine Transporter (DAT).

Addiction Potential

In this latest study, researchers examined the DNA of 700 cocaine abusers and 850 ordinary people and found that cocaine abusers had a specific genetic variation in DAT more frequently than the control subjects. People carrying two copies of this variant were 50% more likely to be cocaine dependent.

Expect a continuing stream of reports of genetic variants that heavily influence human behavior. Do humans have any free will at all? Heck if I know. But I'm not betting on it. My guess is my genes insist to me that I have no free will and I believe what they tell me.

Some day we'll all know our complete genetic profiles. We'll know for which drugs we have a greater risk of addiction. Will that knowledge reduce the incidence of drug addiction?

My guess: preventive treatments will play a bigger role in reducing drug addiction than genetic screening. Let Mom and Dad give Junior a vaccine or maybe a nanotech implant that'll eat up heroin, cocaine, ecstasy, and other drugs in the bloodstream, and Junior won't get high until he can afford medical treatments that'll reverse the drug-neutralizing technologies his parents had implanted in him when he was 12 years old.

By Randall Parker 2006 March 21 09:51 PM  Brain Addiction
Entry Permalink | Comments(19)
2006 March 19 Sunday
Tokers Get Dumber With Every Joint

He's a toker, he's a smoker, he's a fried out joker.

Memory, speed of thinking and other cognitive abilities get worse over time with marijuana use, according to a new study published in the March 14, 2006, issue of Neurology, the scientific journal of the American Academy of Neurology.

The study found that frequent marijuana users performed worse than non-users on tests of cognitive abilities, including divided attention (ability to pay attention to more than one stimulus at a time) and verbal fluency (number of words generated within a time limit). Those who had used marijuana for 10 years or more had more problems with their thinking abilities than those who had used marijuana for five to 10 years. All of the marijuana users were heavy users, which was defined as smoking four or more joints per week.

"We found that the longer people used marijuana, the more deterioration they had in these cognitive abilities, especially in the ability to learn and remember new information," said study author Lambros Messinis, PhD, of the Department of Neurology of the University Hospital of Patras in Patras, Greece. "In several areas, their abilities were significant enough to be considered impaired, with more impairment in the longer-term users than the shorter-term users."

The study involved people ages 17 to 49 taking part in a drug abuse treatment program in Athens, Greece. There were 20 long-term users, 20 shorter-term users and 24 control subjects who had used marijuana at least once in their lives but not more than 20 times and not in the past two years. Those who had used any other class of drugs, such as cocaine or stimulants, during the past year or for more than three months throughout their lives were not included in the study. Before the tests were performed, all participants had to abstain from marijuana for at least 24 hours.

The marijuana users performed worse in several cognitive domains, including delayed recall, recognition and executive functions of the brain. For example, on a test measuring the ability to make decisions, long-term users had 70 percent impaired performance, compared to 55 percent impaired performance for shorter-term users and 8 percent impaired performance for non-users. In a test where participants needed to remember a list of words that had been read to them earlier, the non-users remembered an average of 12 out of 15 words, the shorter-term users remembered an average of nine words and the long-term users remembered an average of seven words.

A longitudinal study would be a lot more convincing. Follow the same tokers for a few years and measure their mental deterioration. Maybe (not that I think this likely) chronic stoners are space cadets even before they become chronic stoners. Or maybe chronic stoners are dumber on average and smarter people decide that getting stoned all the time is just not worth it.

Yeah, maybe. But I doubt it. Heavy duty (multiple times every day) stoner college roommates (not that I ever witnessed criminal activity - this is all hearsay rumours as told to me by other roommates who saw all this while I was at the library of course) whose memories were not first rate came across to me as people who used to be smarter than they were when I knew them. They knew too much about past events and seemed like they once were a lot more together. As kids I figure they didn't used to say "oh wow, I'm supposed to be in class" or "oh wow, I was supposed to meet Lisa for lunch and I like totally spaced". No, I bet they were once a lot more attentive and mentally competent.

If only mainstream left-liberal social scientists hadn't felt the ideological need to collectively decide that the field of psychometrics is the work of Satan, some of them would use IQ tests in longitudinal studies of various sorts of drug abusers and we could find out how much damage each recreational drug does to brains.

What practical information I really want to know: Will use of modafinil (Provigil) exact a toll in terms of faster brain aging? Which classes of cognitive-enhancing neuroceuticals won't exact a toll in increased neuronal wear and tear?

By Randall Parker 2006 March 19 05:58 PM  Brain Addiction
Entry Permalink | Comments(21)
2006 March 16 Thursday
Weight Training Exercise Reduces Middle Age Fat Bulge

Pump iron to keep off the fat.

(Phoenix, AZ) - Women who lift weights twice a week can prevent or at least slow down "middle-age spread" and weight gain, a University of Pennsylvania School of Medicine researcher reported today at the American Heart Association's 46th Annual Conference on Cardiovascular Disease Epidemiology and Prevention.

A study of 164 overweight and obese (body mass index of 25-35) women between 24 and 44 years of age found that strength training with weights dramatically reduced the increase in abdominal fat in pre-menopausal participants compared to similar women who merely received advice about exercise.

"On average, women in the middle years of their lives gain one to two pounds a year and most of this is assumed to be fat," said lead author Kathryn H. Schmitz, PhD., Assistant Professor, Center for Clinical Epidemiology and Biostatistics. "This study shows that strength training can prevent increases in body fat percentage and attenuate increases in the fat depot - or 'belly fat' - most closely associated with heart disease. While an annual weight gain of one to two pounds doesn't sound like much, over 10 to 20 years, the gain is significant."

Women in the two-year weight-training program decreased their body fat percentage by 3.7 percent, while body fat percentage remained stable in the controls. The strength training reduced intra-abdominal fat, which is more closely associated with heart disease and metabolic disturbances. More specifically, the women who did strength training experienced only a 7 percent increase in intra-abdominal fat compared to a 21 percent increase in intra-abdominal fat among controls.

The study - dubbed The SHE study, for The Strong, Healthy, and Empowered - examined whether twice-weekly strength training would prevent increases in intra-abdominal and total body fat in women who were overweight or obese. The women initially were separated by baseline percentage body fat and age. The strength-training group participated in supervised strengthening classes for 16 weeks, and had booster sessions four times yearly with certified fitness professionals over two years. The control group received a brochure recommending 30 minutes to an hour of exercise most days of the week. All of the women were asked not to change their diets in ways that might lead to weight changes while they were participating in the study.

The weight-training sessions took about an hour, and the women were encouraged to steadily increase the amount of weight they lifted. The weight training included exercises for all major muscle groups, including the chest, upper back, lower back, shoulders, arms, buttocks and thighs. The maximal amount of weight women could lift once - called a one-repetition maximum test - increased by an average of 7 percent in bench pressing and 13 percent in leg press exercises.

Researchers measured the participants' body composition with a dual energy X-ray absorptiometry (DEXA) scan and measurements of abdominal and total body fat by single slice CT scan at baseline, and again at one and two years.

Why does weight training have this effect? Does it act only by the increase in calories burned during the exercise? Or does the resulting increase in body muscle mass cause an on-going higher rate of calorie burn that is not offset by higher appetite? Or does the exercise release endorphins or other compounds that decrease appetite? Or some combination of the above?

To put it another way: Why doesn't appetite regulation by the brain prevent weight gain as people age? Was the weight gain selected for to prevent starvation in our ancestors? Or is it a side effect of reduced ability to regulate bodily functions due to brain aging or signalling systems aging elsewhere in the body?

By Randall Parker 2006 March 16 03:25 AM  Brain Appetite
Entry Permalink | Comments(18)
2006 March 15 Wednesday
Researchers Find Woman With Extremely Good Memory

A woman with the best known memory of her past has been identified by UC Irvine researchers.

Researchers at UC Irvine have identified the first known case of a new memory syndrome - a woman with the ability to perfectly and instantly recall details of her past. Her case is the first of its kind to be recorded and chronicled in scientific literature and could open new avenues of research in the study of learning and memory.

Researchers Elizabeth Parker, Larry Cahill and James L. McGaugh spent more than five years studying the case of "AJ," a 40-year-old woman with incredibly strong memories of her personal past. Given a date, AJ can recall with astonishing accuracy what she was doing on that date and what day of the week it fell on. Because her case is the first one of its kind, the researchers have proposed a name for her syndrome - hyperthymestic syndrome, based on the Greek word thymesis for "remembering" and hyper, meaning "more than normal."

Their findings are published in the current issue of the journal Neurocase.

I'd like to have controllable hyperthymestic syndrome. No need to remember very boring and tedious tasks. But when listening to a lecture or reading an important article it would be handy to be able to think a thought that activates a greatly enhanced ability to form memories.

"What makes this young woman so remarkable is that she uses no mnemonic devices to help her remember things," said McGaugh, a National Academy of Sciences member and a pioneer in the field of memory research. "Her recall is instant and deeply personal, related to her own life or to other events that were of interest to her."

AJ's powers of recollection can be astonishing. In 2003, she was asked to write down all the Easter dates from 1980 onward. In 10 minutes, and with no advance warning, she wrote all 24 dates and included what she was doing on each of those days. All the dates except for one were accurate. The incorrect one was only two days off. Two years later when she was asked, again without warning, the same question, she quickly responded with all the correct dates and similar information about personal events on those dates.
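
For anyone without AJ's gift, Easter's date is pure arithmetic on the Gregorian calendar. Here's a minimal Python sketch of the standard Meeus/Jones/Butcher algorithm printing the same 24 dates AJ wrote out from memory:

    def easter_date(year):
        # Meeus/Jones/Butcher algorithm for Gregorian Easter.
        a = year % 19
        b, c = divmod(year, 100)
        d, e = divmod(b, 4)
        f = (b + 8) // 25
        g = (b - f + 1) // 3
        h = (19 * a + b - d - g + 15) % 30
        i, k = divmod(c, 4)
        l = (32 + 2 * e + 2 * i - h - k) % 7
        m = (a + 11 * h + 22 * l) // 451
        month, day = divmod(h + l - 7 * m + 114, 31)
        return year, month, day + 1

    # The 24 Easters from 1980 through 2003 that AJ reproduced.
    for y in range(1980, 2004):
        print("%d-%02d-%02d" % easter_date(y))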

There are limits to AJ's memory. While she has nearly perfect recall of what she was doing on any given date and instantly can identify the date and day of the week when an important historical event in her lifetime occurred, she has difficulty with rote memorization and did not always do well in school. She scored perfectly on a formal neuropsychological test to measure her autobiographical memory, but during the testing had difficulty organizing and categorizing information. She refers to her ongoing remembering of her life's experiences as "a movie in her mind that never stops".

The Easter dates trick strikes me more as a form of specialized savant talent. While on a high school tour of a mental institution I once met a guy making pottery who could tell you the day of the week for any date in the past. He answered instantly, with no apparent delay.
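
That day-of-week talent is also reducible to arithmetic for a machine, however the savant actually does it. A sketch using Zeller's congruence:

    def day_of_week(year, month, day):
        # Zeller's congruence for the Gregorian calendar.
        if month < 3:              # January and February count as months 13
            month += 12            # and 14 of the preceding year
            year -= 1
        K, J = year % 100, year // 100
        h = (day + 13 * (month + 1) // 5 + K + K // 4 + J // 4 + 5 * J) % 7
        return ("Saturday", "Sunday", "Monday", "Tuesday",
                "Wednesday", "Thursday", "Friday")[h]

    print(day_of_week(2006, 3, 15))   # prints "Wednesday", this post's date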

By Randall Parker 2006 March 15 09:51 PM  Brain Memory
Entry Permalink | Comments(5)
Fat Nanoparticle Gene Therapy Against Cancer

In theory gene therapy ought to be the ideal way to cure cancer. Cancer develops as a result of a series of mutations that make cells divide and spread out of control. Gene therapies that correct the mutations ought to stop cancer. But delivery mechanisms that can reach the vast bulk of cancer cells are hard to find. Also, adding back in correct p53 and other mutated genes might mess up normal cells by causing them to have too many copies of those genes. University of Texas M.D. Anderson Cancer Center researchers haven't solved all those problems but they are testing nanoparticles as a delivery mechanism for gene therapy against cancer.

Nanoparticles may offer an answer. The newest strategy to emerge out of Roth's lab is a nanoscale blob of lipid - a type of fat - smaller than a cell that holds therapeutic genes. Developed by Nancy Templeton, Ph.D., of Baylor College of Medicine, the special nanoparticle is of a size that is easily absorbed into cells. "Dr. Templeton hit upon a nanoparticle that had a very efficient transfer into cells," says Charles Lu, M.D., an assistant professor in the Department of Thoracic/Head and Neck Medical Oncology and co-investigator.

The nanoparticles carry a new payload as well. They encase, like shrink wrap, a normal p53 gene as well as a second gene, FUS1, which is frequently altered or missing early in the development of many solid tumors.

So far, nine patients with metastatic lung cancer have been tested with the therapy in a phase I trial headed by Lu. In all, 30 patients are expected to be enrolled. The trial is a "dose escalation" study, which looks for side effects as doses of the drug are increased. "So far, there have been no significant safety issues," says Lu.

The study is the first to test nanoparticle therapy in treating human cancer, according to Lu. "No one before has tried intravenous injections using nanoparticles to replace genes that are lost or defective. This non-viral aspect is very different in gene therapy. It may offer major benefits because nanoparticles are non-infectious. They are inert; there are no infection risks in using bubbles of fat.

"If successful - and that is a very big if - nanoparticles may prove to be a way to deliver gene therapy systemically, potentially treating metastatic disease in multiple cancer sites," says Lu.

What isn't known yet, however, is how often normal cells will absorb the drug and what effect that will cause. Preclinical study seems to show that tumor cells preferentially take up the bubbles - and researchers are pleased with that finding, although they don't know why it happens - but healthy cells can also sop up the new genes. "It may not have too much of an effect on normal cells because they already have these beneficial genes, but we just don't know yet," says Lu.

An excess of p53 activity in normal cells would accelerate aging by causing too many cells to kill themselves through a process called apoptosis. But if you are faced with terminal cancer the risk of accelerated aging seems like the smaller immediate threat.

The full article outlines other gene therapy approaches M.D. Anderson researchers are pursuing against cancer.

Successful development of gene therapy delivery mechanisms against cancer would open up many other diseases for gene therapy treatments. Gene therapy has failed to deliver on its early promise. We know the genetic causes of hundreds of diseases. So we know what needs reprogramming to cure many diseases. But we lack the ability to easily upload new genetic programs into cells. Once we have much better ways to do that, the continued accumulation of knowledge about harmful genetic mutations will finally be translated into therapies.

By Randall Parker 2006 March 15 09:30 PM  Biotech Gene Therapy
Entry Permalink | Comments(0)
2006 March 14 Tuesday
Blood Protein Test Orders Of Magnitude More Sensitive

Want an example of yet another orders of magnitude improvement in what bioscientists and biotechnologists can do? Blood tests will be able to detect diseases at much earlier stages when the FACTT assay reaches the market.

(Philadelphia, PA) - Researchers at the University of Pennsylvania School of Medicine have developed a paradigm-shifting method for detecting small amounts of proteins in the blood. Applications of this method will make discerning low-abundance molecules associated with cancers (such as breast cancer), Alzheimer's disease, prion diseases, and possibly psychiatric diseases relatively easy and more accurate compared with the current methodology, including the widely used ELISA (enzyme-linked immunosorbent assay).

ELISA is a common immune-system-based assay that uses enzymes linked to an antibody or antigen as a marker for picking out specific proteins. For example, it is used as a diagnostic test to determine exposure to infectious agents, such as HIV, by identifying antibodies present in a blood sample.

The sensitivity of detecting molecules by the new method, called FACTT, short for Fluorescent Amplification Catalyzed by T7-polymerase Technique, is five orders of magnitude (100,000 times) greater than that of ELISA, the Penn researchers found.

Senior author Mark I. Greene, MD, PhD, the John Eckman Professor of Medical Science; Hongtao Zhang, PhD, research specialist; Xin Cheng, PhD, research investigator; and Mark Richter, a research technician in Greene's lab, report their findings in the advanced online publication of Nature Medicine.

"The current ELISA tests can only detect proteins when they are in high abundance," says Zhang. "But the problem is that many of the functional proteins - those that have a role in determining your health - exist in very low amounts until diseases are apparent and cannot be detected or measured at early stages of medical pathology. It was important to develop a technique that can detect these rare molecules to detect abnormalities at an early stage."

One problem that'll arise as a result of more sensitive blood and saliva assays is finding very early stage cancers. Okay, you'll know you have cancer. But it is incredibly small and your body is big. How to find it? As things stand now, in spite of advanced CAT scanners and MRI machines, surgeons sometimes have to cut into people and poke around to find tumors that oncologists can't localize even at an advanced stage of illness.

Imagine a cancer about the size of a needle tip. You have lots of little cancers in your body that are stuck at a small size because they haven't mutated the ability to grow blood vessels (they do not yet secrete angiogenesis factors). How to find just the right cancer to remove that has just crossed that threshold? Seems hard to me.
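
Some rough numbers behind my pessimism. The cell size and body cell count below are ballpark assumptions, just to set the scale:

    import math

    tumor_diameter_mm = 1.0                                  # ~needle-tip scale
    tumor_volume_mm3 = math.pi / 6 * tumor_diameter_mm ** 3  # volume of a sphere
    cells_per_mm3 = 1e6               # assumes cells roughly 10 micrometers across
    tumor_cells = tumor_volume_mm3 * cells_per_mm3
    body_cells = 3e13                 # rough estimate of cells in an adult body

    print("tumor cells: ~%.0e" % tumor_cells)                      # ~5e5
    print("fraction of body: ~%.0e" % (tumor_cells / body_cells))  # ~2e-8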

By Randall Parker 2006 March 14 10:41 PM  Biotech Assay Tools
Entry Permalink | Comments(3)
Rodent Sight Restored With Self Assembling Peptides

Self-assembling peptides were used to guide nerves to restore partial sight in rodents.

Rodents blinded by a severed tract in their brains' visual system had their sight partially restored within weeks, thanks to a tiny biodegradable scaffold invented by MIT bioengineers and neuroscientists.

This technique, which involves giving brain cells an internal matrix on which to regrow, just as ivy grows on a trellis, may one day help patients with traumatic brain injuries, spinal cord injuries and stroke.

The study, which will appear in the online early edition of the Proceedings of the National Academy of Sciences (PNAS) the week of March 13-17, is the first that uses nanotechnology to repair and heal the brain and restore function of a damaged brain region.

"If we can reconnect parts of the brain that were disconnected by a stroke, then we may be able to restore speech to an individual who is able to understand what is said but has lost the ability to speak," said co-author Rutledge G. Ellis-Behnke, research scientist in the MIT Department of Brain and Cognitive Sciences. "This is not about restoring 100 percent of damaged brain cells, but 20 percent or even less may be enough to restore function, and that is our goal."

Spinal cord injuries, serious stroke and severe traumatic brain injuries affect more than 5 million Americans at a total cost of $65 billion a year in treatment.

Biotech will eventually lead to massive savings and increased productivity as many disabling diseases and disorders become curable.

Self-assembling peptides (short polymers of amino acids) were key to this achievement.

Shuguang Zhang, associate director of the CBE and one of the study's co-authors, has been working on self-assembling peptides for a variety of applications since he discovered them by accident in 1991. Zhang found that placing certain peptides in a salt solution causes them to assemble into thin sheets of 99 percent water and 1 percent peptides. These sheets form a mesh or scaffold of tiny interwoven fibers. Neurons are able to grow through the nanofiber mesh, which is similar to that which normally exists in the extracellular space that holds tissues together.

The process does not involve growing new neurons, but creates an environment conducive for existing cells to regrow their long branchlike projections called axons, through which neurons form synaptic connections to communicate with other neurons. These projections were able to bridge the gap created when the neural pathway was cut and restore enough communication among cells to give the animals back useful vision within around six weeks. The researchers were surprised to find that adult brains responded as robustly as the younger animals' brains, which typically are more adaptable.

The injected peptides self-assemble when they come into contact with fluid that bathes the brain.

When the clear fluid containing the self-assembling peptides is injected into the area of the cut, it flows into gaps and starts to work as soon as it comes into contact with the fluid that bathes the brain. After serving as a matrix for new cell growth, the peptides' nanofibers break down into harmless products that are eventually excreted in urine or used for tissue repair.

The MIT researchers' synthetic biological material is better than currently available biomaterials because it forms a network of nanofibers similar in scale to the brain's own matrix for cell growth; it can be broken down into natural amino acids that may even be beneficial to surrounding tissue; it is free of chemical and biological contaminants that may show up in animal-derived products such as collagen; and it appears to be immunologically inert, avoiding the problem of rejection by surrounding tissue, the authors wrote.

The researchers are testing the self-assembling peptides on spinal cord injuries and hope to launch trials in primates and eventually humans.

Some day severe spinal cord injuries will not condemn people to spend the remainder of their lives in wheelchairs.

This result is another example showing that the ability to manipulate things on a small scale enables big advances over previous achievements.

Schneider estimates that 30,000 axons had reconnected, compared with only around 30 in previous experiments using other approaches, such as nerve growth factors. The team speculates that the similarity between the size of the fibres and the features on neural material is what encourages the axons to bridge the gap. The scaffold appears to eventually break down harmlessly.

Bioengineering is taking off. Some advances such as the one above are measured in orders of magnitude differences as compared to what was possible previously. This is why I'm optimistic that reversal of the aging process is within a few decades reach.

By Randall Parker 2006 March 14 10:31 PM  Biotech Tissue Engineering
Entry Permalink | Comments(0)
New Process Cleans Coal With Hydrofluoric Acid

Will coal ever become a clean source of energy?

A new chemical process for removing unwanted minerals from coal could lead to reductions in carbon dioxide emissions from coal-fired power stations.

There is already a way of burning coal in a cleaner, more efficient fashion that would reduce carbon dioxide emissions: this is where the coal is turned into a gas and used to drive a turbine. However, problems with cleaning the coal before it is burnt have made generating electricity in this way very expensive. This new chemical process could make it more commercially viable.

Under development by a University of Nottingham team with EPSRC funding, the new approach involves using chemicals to dissolve unwanted minerals in the coal and then regenerating the chemicals again for re-use. This avoids the expense of using fresh chemicals each time, as well as the need to dispose of them, which can have an environmental impact. By removing unwanted minerals before the coal enters the power plant the new process helps protect the turbines from corrosive particles.

The aim is to cut unwanted minerals in coal from around 10% to below 0.05%, making the coal 'ultra clean'. Removing these minerals before using the coal to generate power prevents the formation of harmful particles during electricity production. To do this, the team is using specific chemicals to react with the minerals to form soluble products which can be separated from the coal by filtration. This process is known as 'leaching'. Hydrofluoric acid is the main chemical being tested. The chemicals not only dissolve the minerals but are also easy to regenerate from the reaction products, so they are constantly recycled. It is this aspect that has largely been overlooked in past research, but is virtually essential if chemical coal-cleaning is to be environmentally and commercially viable.

With half of US electricity (and probably most mercury emissions) coming from coal, and a strong possibility that percentage will increase even further, I'm for anything that'll make coal cleaner. But in my view the regulatory pressure on the coal burners hasn't been tough enough for decades.

One of the reasons I favor nuclear power is precisely because coal plants pollute so much.

As for the argument that terrorists will some day explode a nuclear bomb next to a nuclear plant: First off, I think Islamic terrorists really won't be able to resist the temptation to nuke New York City and DC first. Second, the terrorists already have nuclear power plants to nuke. Third, imagine (and this isn't going to happen until after nukes have gone off) all the existing nuclear power plants were dismantled precisely to deny the terrorists nukes as targets. Well, there goes NYC or DC then.

One solution to the nuke plant as terrorist nuke bomb target would be to build nuclear power plants underground. But, again, we'll still lose millions of people if terrorists can get nukes to a Western country.

My guess is if terrorists ever set off a nuke the Western response will be so severe and far reaching that this will happen only once. I'm far more afraid of terrorists releasing genetically engineered pathogens than I am of terrorist nukes.

If you place a high probability on huge costs from global warming then go back and read my post "Planned Coal Plants Reverse 5 Times CO2 Impact Of Kyoto Protocol". So even if burning cleaned up coal reduces CO2 by as much as 20% for a given amount of generated electricity the growth in total coal demand is going to be so great that CO2 emissions from coal will still rise. The only way to stop the CO2 emissions would involve expensive CO2 sequestering technology. Anyone for 2 or more cents per kwh of electricity just for CO2 emissions elimination? I'm not expecting that to happen in the US or China for the next 10 years. Beyond that point I'm still not expecting it in China. Ditto India.
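
Here's the back-of-envelope arithmetic behind that 2 cents figure. The per-tonne sequestration cost range is my assumption for illustration:

    kg_co2_per_kwh = 1.0                  # roughly 1 kg of CO2 per kWh from coal
    for dollars_per_tonne in (20, 50):    # assumed sequestration cost range
        cents_per_kwh = dollars_per_tonne * 100.0 / 1000.0 * kg_co2_per_kwh
        print("$%d/tonne -> %.0f cents/kWh" % (dollars_per_tonne, cents_per_kwh))
    # $20/tonne -> 2 cents/kWh; $50/tonne -> 5 cents/kWh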

You have three current choices for satisfying most future demand growth in electric energy: Nuclear, coal, or higher prices. Accelerated energy research across a broad array of technologies could produce more choices in the future.

By Randall Parker 2006 March 14 09:36 PM  Energy Fossil Fuels
Entry Permalink | Comments(48)
2006 March 13 Monday
Crestor Partially Reverses Clogged Arteries

Heart disease is a lot easier to avoid than cancer.

ATLANTA, GA (March 13, 2006) -- A study presented today at the American College of Cardiology's 55th Annual Scientific Session demonstrates, for the first time, that very intensive cholesterol lowering with a statin drug can regress (partially reverse) the buildup of plaque in the coronary arteries. This finding has never before been observed in a study using statin drugs, the most commonly used cholesterol lowering treatment. Previous research had indicated that intensive statin therapy could prevent the progression of coronary atherosclerosis, or arterial plaque build-up, but not actually reduce disease burden. ACC.06 is the premier cardiovascular medical meeting, bringing together more than 30,000 cardiologists to further breakthroughs in cardiovascular medicine.

The intense statin therapy used in this study resulted in significant regression of atherosclerosis as measured by intravascular ultrasound (IVUS), a technique in which a tiny ultrasound probe is inserted into the coronary arteries to measure plaque. The study showed that regression occurred for all three pre-specified IVUS measures of disease burden. The mean baseline LDL cholesterol of 130.4 mg/dL dropped to 60.8 mg/dL in the study patients, a reduction of 53.2 percent. This is the largest reduction in cholesterol ever observed in a major statin outcome trial. Mean HDL cholesterol (43.1 mg/dL at baseline) increased to 49.0 mg/dL, a 14.7 percent increase, which was also unprecedented. The arterial plaque overall was reduced by 6.8 to 9.1% for the various measures of disease burden.

This study was known by the acronym ASTEROID (A Study To Evaluate the Effect of Rosuvastatin On Intravascular Ultrasound-Derived Coronary Atheroma Burden). The trial was conducted at 53 community and tertiary care centers in the United States, Canada, Europe, and Australia. A total of 507 patients had baseline intravascular ultrasound (IVUS) examination and received 40 mg daily of rosuvastatin (brand name Crestor®). IVUS provides a precise and reproducible method for determining the change in plaque, or atheroma, burden during treatment. Atherosclerosis progression was assessed at baseline and after 24 months of treatment.

"Previous similar studies with statins have shown slowing of coronary disease, but not regression. This regimen significantly lowered bad cholesterol, and surprisingly, markedly increased good cholesterol levels," said Steven Nissen, M.D., F.A.C.C., of the Cleveland Clinic and lead author of the study. Dr. Nissen is also President-Elect of the American college of Cardiology. "We conclude that very low LDL levels (below current guidelines), when accompanied by raised HDL, can regress, or partially reverse, the plaque buildup in the coronary arteries."

I expect a continued drop in death from heart disease relative to the rate of death from cancer. Heart disease is relatively easier to avoid. To tackle cancer we need to get control of all the mechanisms by which cells control their division and spread. That's much harder than avoiding accumulation of junk in arteries. Another very encouraging but more preliminary report on the heart disease front just came out of Johns Hopkins where researchers found they can reverse cardiac hypertrophy in obese mice with hormones.

Working on genetically engineered obese mice with seriously thickened hearts, a condition called cardiac hypertrophy, scientists at Johns Hopkins have used a nerve protection and growth factor on the heart to mimic the activity of the brain hormone leptin, dramatically reducing the size of the heart muscle.

Leptin is a protein hormone made by fat cells that signals the brain to stop eating. Alterations in the leptin-making gene may create leptin deficiency linked to obesity and other defects in weight regulation.

By injecting so-called ciliary neurotrophic factor (CNTF) into mice that were either deficient in or resistant to leptin, the researchers reduced the animals' diseased and thickened heart muscle walls by as much as a third and the overall size of the left ventricle, the main pumping chamber, up to 41 percent, restoring the heart's architecture toward normal.

Enlarged hearts lead to heart failure and death. Results of the study, supported in part by the National Institutes of Health, are to be published in the March 6 issue of the Proceedings of the National Academy of Sciences.

"These findings suggest there's a novel brain-signaling pathway in obesity-related heart failure and have therapeutic implications for patients with some forms of obesity-related cardiovascular disease," says study senior author Joshua M. Hare, M.D., a professor and medical director of the heart failure and cardiac transplantation programs at The Johns Hopkins University School of Medicine and its Heart Institute.

...

Ultrasound exams of the hearts after four weeks showed that CNTF decreased the thickness of the wall dividing the heart chambers by as much as 27 percent, decreased the thickness of the wall at the back of the heart by as much as 29 percent and overall volume of the left ventricle by as much as 41 percent.

Note that this study was done with mice. The result still needs confirmation in humans.

You can also lower your cholesterol with diet.

Jenkins and his colleagues prescribed a seven-day menu high in viscous fibres, soy protein, almonds and plant sterol margarine to 66 people -- 31 men and 35 women with an average age of 59.3 and within 30 percent of their recommended cholesterol targets. For the first time, 55 participants followed the menu under real-world conditions for a year. They maintained diet records and met every two months with the research team to discuss their progress and have their cholesterol levels measured.

"The participants found it easiest to incorporate single items such as the almonds and margarine into their daily lives," says Jenkins, who is also staff physician of endocrinology at St. Michael's Hospital. "The fibres and vegetable protein were more challenging since they require more planning and preparation, and because these types of niche products are less available. It's just easier, for example, to buy a beef burger instead of one made from soy, although the range of options is improving. We considered it ideal if the participants were able to follow the diet three quarters of the time."

After 12 months, more than 30 per cent of the participants had successfully adhered to the diet and lowered their cholesterol levels by more than 20 per cent. This rate is comparable to the results achieved by 29 of the participants who took a statin for one month under metabolically controlled conditions before following the diet under real-world conditions.

See my previous report on the Jenkins diet: "Ape Man Diet Lowers Cholesterol And Inflammation Marker"

By Randall Parker 2006 March 13 09:33 PM  Biotech Therapies
Entry Permalink | Comments(9)
2006 March 11 Saturday
Aubrey de Grey Sees War On Aging Starting In 10 Years

Biogerontologist Aubrey de Grey says the highly goal-directed engineering effort that he calls the coming "War On Aging" will begin in about 10 years.

Aubrey de Grey of Cambridge University, UK, has presented a cure for aging - Strategies for Engineered Negligible Senescence. The plan's focus is not to interfere with a person's metabolism, but to repair damage to the body over time, at the cellular level, rather than dealing with the aging process in its later stages.

"My point here is just that this is goal-directed rather than curiosity-driven," de Grey said. "I view medicine as a branch of engineering."

...

De Grey calls the time during which the technologies will experience the most development the War On Aging.

"I use the phrase to describe the period starting when we get results in the laboratory with mice that are impressive enough to make people realize that life extension is possible, and ending when the first effective therapies for humans are developed," de Grey said. "I estimate that the War On Aging will start 10 years from now, subject to funding of research, and will last for 15 years, but this latter estimate is extremely speculative."

When he refers to a point when the War On Aging ends, my guess is that he means the point where we have achieved the ability to extend life faster than calendar time advances. In an article published in PLoS Biology, Aubrey calls this point "actuarial escape velocity": the point at which we can repair aging damage faster than it accumulates, so that each year brings more than a year's gain in remaining life expectancy.

...that in which mortality rates fall so fast that people's remaining (not merely total) life expectancy increases with time. Is this unimaginably fast? Not at all: it is simply the ratio of the mortality rates at consecutive ages (in the same year) in the age range where most people die, which is only about 10% per year. I term this rate of reduction of age-specific mortality risk ‘actuarial escape velocity’ (AEV), because an individual's remaining life expectancy is affected by aging and by improvements in life-extending therapy in a way qualitatively very similar to how the remaining life expectancy of someone jumping off a cliff is affected by, respectively, gravity and upward jet propulsion (Figure 1).

The escape velocity cusp is closer than you might guess. Since we are already so long lived, even a 30% increase in healthy life span will give the first beneficiaries of rejuvenation therapies another 20 years—an eternity in science—to benefit from second-generation therapies that would give another 30%, and so on ad infinitum. Thus, if first-generation rejuvenation therapies were universally available and this progress in developing rejuvenation therapy could be indefinitely maintained, these advances would put us beyond AEV.
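
A toy model makes the threshold visible. The Gompertz mortality parameters below are illustrative assumptions, not numbers from Aubrey's article; the point is just that when annual therapy-driven reductions in mortality outpace the roughly 9-10% per year by which mortality rises with age, remaining life expectancy grows even as a person gets older:

    import math

    def hazard(age, year, annual_drop):
        # Gompertz-style mortality, cut by a fixed fraction each calendar year.
        base = 1e-4 * math.exp(0.085 * age)       # assumed year-zero hazard curve
        return min(1.0, base * (1 - annual_drop) ** year)

    def remaining_life_expectancy(start_age, start_year, annual_drop):
        alive, expect = 1.0, 0.0
        for t in range(200):                      # capped horizon for the toy model
            alive *= 1 - hazard(start_age + t, start_year + t, annual_drop)
            expect += alive
        return expect

    for drop in (0.0, 0.05, 0.10):                # 10%/yr is the AEV threshold
        now = remaining_life_expectancy(60, 0, drop)
        later = remaining_life_expectancy(70, 10, drop)
        print("drop %.0f%%: at 60 -> %.0f yrs; at 70, ten years on -> %.0f yrs"
              % (drop * 100, now, later))

At a 10 percent annual drop the second number comes out larger than the first: the 70-year-old, ten years into the therapies, has more expected years left than she did at 60.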

How can this be accomplished? Read about Aubrey's Strategies for Engineered Negligible Senescence (SENS) for achieving this goal. Also, Aubrey and Dave Gobel founded the Methuselah Mouse Prize to encourage scientists to extend the lives of laboratory mice.

The prize seeks to encourage development of technologies that will also extend human lives. But its most important effect will be in terms of how those advances come to be viewed by the general public. The sooner scientists extend the lives of lab animals the sooner the public will wake up to the feasibility of radically extending human lives. This realization on the part of the public will eventually lead to widespread public demand for the War On Aging. Anyone who donates to the Methuselah Mouse Prize is helping to make the War On Aging begin in earnest sooner rather than later. Anyone who promotes the message that ‘actuarial escape velocity’ (AEV) is achievable via SENS technologies within the lifetimes of most of the people alive today also is effectively arguing for the coming War On Aging.

Stop being a pacifist where death is concerned. Join the supporters of the War On Aging. Time to go into battle against the Grim Reaper.

Update: Jay Olshansky, Daniel Perry, Richard A. Miller, and Robert N. Butler, arguing for a more modest goal of decelerating the rate of aging, say that the future costs of an aging population will increase so much that the costs of an accelerated pace of aging research are easy to justify in terms of potential future costs avoided.

Consider what is likely to happen if we don't. Take, for instance, the impact of just one age-related disorder, Alzheimer disease (AD). For no other reason than the inevitable shifting demographics, the number of Americans stricken with AD will rise from 4 million today to as many as 16 million by midcentury. This means that more people in the United States will have AD by 2050 than the entire current population of the Netherlands. Globally, AD prevalence is expected to rise to 45 million by 2050, with three of every four patients with AD living in a developing nation. The US economic toll is currently $80-$100 billion, but by 2050 more than $1 trillion will be spent annually on AD and related dementias. The impact of this single disease will be catastrophic, and this is just one example.

A projected $1 trillion per year in future costs for Alzheimer's alone demonstrates the scale of the potential savings that could come from therapies to decelerate and even reverse aging. Already today's costs of disease run into the trillions in health care spending, plus even higher costs in lost productivity and in strains on the families and friends who help out the sick and invalid. Our spending on anti-aging research should be in the hundreds of billions of dollars per year.

By Randall Parker 2006 March 11 01:39 PM  Aging Debate
Entry Permalink | Comments(45)
NRC Commissioner Sees 17 New US Nuclear Reactors By 2015

One of the 5 commissioners of the Nuclear Regulatory Commission expects 17 new reactors to go online in the United States by 2015.

Nuclear power is destined to play a major role in America's energy future, but the industry needs more young scientists, a leader of the U.S. Nuclear Regulatory Commission (NRC) told an MIT crowd recently.

In the near future, U.S. utilities will seek to build 17 new nuclear reactors at 11 sites to go online by 2015, but NRC Commissioner Peter B. Lyons says that will be an "immense challenge," partly because the industry is losing people to retirement and there is a dearth of young people going into science and technology.

I'm surprised to hear him claim so many nuclear reactors will not just begin construction but actually go online by 2015.

Lyons sees wind and solar limited by their intermittent availability.

He predicted that the "intermittent character of solar and wind" will prevent them from playing a dominant role as future energy sources. "I don't know how to get a large percentage -- as much as 15 or 20 percent -- from intermittent sources," he said.

Coal may be tapped for electricity needs but will require new cost-efficient and environmentally friendly plants. "The only other source is nuclear energy," Lyons said, and for nuclear energy to play a "strong supporting role, the public has to be confident of the safe and secure operation of existing plants."

Dynamic pricing would allow solar and wind to play marginally larger roles. However, there's a limit to how far market forces are going to shift demand around to the times when the wind blows and sun shines.

In order for solar and wind to entirely displace fossil fuels we'd need to develop much cheaper ways of storing electricity. That will probably happen some day. But in what decade? The cost of these power sources plus the cost of their storage has to come in under the cost of coal and nuclear for them to supply all or even most energy needs.

If one really wants to phase out fossil fuels entirely then the substitutes have to compete on cost. Currently electricity is one of the most expensive ways to heat a house. Heat pumps and geothermal heat pumps help improve electricity's competitiveness. But so far I've yet to see a strong clear economic argument for how electricity could compete for heating without becoming much cheaper. Electricity stands a better chance (and hence nuclear, wind, and solar stand a better chance) of competing for transportation energy due to advances in battery technologies.

For most of our electricity needs our choice remains between coal and nuclear. If you oppose nuclear you de facto support coal. Either that or you support higher prices for energy (and some do). Some who oppose nuclear power take offense at this line of argument. But what competitive alternatives are there? Conservation (which really means increased energy efficiency) costs effort and money. People aren't going to make bigger efforts to conserve without higher prices. Though I'm the first to admit (and support) that regulations on building designs and appliance designs can accomplish some increases in energy efficiency. Still, even a sudden shift in public willingness to demand more efficient new homes won't eliminate most of the home demand for fossil fuels.

The only way to produce more choices for energy sources is to make a bigger effort at research. The same is true for ways to increase energy efficiency. But energy efficiency improvements will not draw an end to the fossil fuels age. Only cheaper non-fossil fuels energy sources will do that.

I keep hoping that fossil fuels will run out and necessity will force us to switch to other energy sources. However, dramatic stories about technological advances to extract far more fossil fuels from the ground keep popping up. Enhanced oil extraction, oil shale extraction, oil tar sands extraction, coal liquefaction, and other fossil fuels technologies are going to keep fossil fuels around for a long time unless we make much bigger efforts to develop far better technologies for non-fossil fuels energy sources.

By Randall Parker 2006 March 11 12:19 PM  Energy Policy
Entry Permalink | Comments(19)
2006 March 09 Thursday
Pig Islet Cells Control Diabetes In Rhesus Macaque Monkeys

The promise of xenotransplantation:

ATLANTA -- Islet cell xenotransplantation presents a promising near-term solution to the critically low islet cell supply for humans suffering from type 1 diabetes, according to researchers from the Emory Transplant Center, the Yerkes National Primate Research Center of Emory University and the University of Alberta, Canada. The Emory/Yerkes researchers successfully transplanted and engrafted insulin-producing neonatal porcine islet cells harvested by the University of Alberta researchers into diabetic rhesus macaque monkeys, restoring the monkeys' glucose control and resulting in sustained insulin independence. This research, published in the February 26 advanced online edition of Nature Medicine, also examines the effectiveness of a costimulation blockade-based regimen developed at Emory proven to have fewer toxic side effects than currently used immunosuppressive regimens, and provides essential answers to the possibility of cross-species viral transmission, a common concern of xenotransplantation use in humans.

If we had a much larger effort aimed at implementing all the Strategies for Engineered Negligible Senescence (SENS) there'd be a lot more money available for genetically engineering pigs to become more immuno-compatible with humans. That would make xenotransplantation a lot easier.

We need replacement parts. We should be trying much harder to develop them.

By Randall Parker 2006 March 09 09:32 PM  Biotech Organ Replacement
Entry Permalink | Comments(5)
2006 March 08 Wednesday
Most Hybrid Cars Do Not Pay Back Higher Costs In 5 Years

Most hybrids do not pay.

Consumer Reports is revising the cost analysis in a story that examines the ownership costs and financial benefits associated with hybrid cars. The story, titled "The dollars and sense of hybrids," appears in the Annual April Auto issue of CR, on newsstands now.

Consumer Reports is correcting a calculation error involving the depreciation for the six hybrid vehicles that, in the story, were compared to their conventionally powered counterparts. The error led the publication to overstate how much extra money the hybrids will cost owners during the first five years.

The Prius and Civic hybrids produce a net savings of a few hundred dollars in 5 years but only with US federal tax credits.

CR's revised analysis shows that two of the six hybrids recovered their price premium in the first five years and 75,000 miles of ownership. The Toyota Prius and Honda Civic Hybrid provide a savings of about $400 and $300, respectively, when compared with their all-gas counterparts - as long as federal tax credits apply. But extra ownership costs during the first five years and 75,000 miles for the other four hybrids ranged from an estimated $1,900 to $5,500, compared to similar all-gas models.
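
The payback arithmetic is easy to sketch. The inputs below are illustrative assumptions, not CR's actual figures:

    def five_year_balance(price_premium, mpg_hybrid, mpg_gas,
                          miles=75000.0, gas_price=2.50, tax_credit=0.0):
        # Fuel savings over 5 years / 75,000 miles minus the hybrid's extra cost.
        fuel_savings = (miles / mpg_gas - miles / mpg_hybrid) * gas_price
        return fuel_savings + tax_credit - price_premium

    # e.g. a $3,000 premium, 44 vs 30 mpg, with a $2,000 federal tax credit
    print("$%+.0f" % five_year_balance(3000.0, 44.0, 30.0, tax_credit=2000.0))

With those made-up inputs the hybrid comes out about a thousand dollars ahead, but drop the tax credit and it goes negative, which matches CR's pattern.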

I also suspect that Toyota is selling the Prius with a lower profit margin in order to build good will with governments and the public.

People who buy a hybrid in the United States now do so to make a statement or to satisfy themselves that they are saving energy. By a strict economic calculation hybrids would not make sense without a higher tax on gasoline such as is the case in Europe.

Toyota executives are blunt about the real allure of hybrids.

In Japan and Europe, the extra costs were approximately balanced by fuel savings.

...

“When you just use the argument of fuel efficiency, the purchase of a hybrid car is not justified. But this car has other interests, for instance environmental protection.”

Another Toyota executive was more blunt in his analysis: “Buying a hybrid is about political correctness, it is not about the money,” he said.

Toyota does not expect to get hybrid costs down to a level that cost justifies them for the American market until 2010. They must be expecting substantial advances in battery technology over the next 5 years.

Edmunds also found net costs from owning hybrids in the first 5 years.

By Randall Parker 2006 March 08 09:53 PM  Energy Transportation
Entry Permalink | Comments(45)
Falling Carbon Fiber Costs May Be Key To High Car Fuel Efficiency

Oak Ridge National Laboratory is pursuing technological advances that will allow lightweight carbon fiber to replace most of the steel in cars.

OAK RIDGE, Tenn., March 6, 2006 — Highways of tomorrow might be filled with lighter, cleaner and more fuel-efficient automobiles made in part from recycled plastics, lignin from wood pulp and cellulose.

First, however, researchers at the Department of Energy's Oak Ridge National Laboratory, working as part of a consortium with Ford, General Motors and DaimlerChrysler, must figure out how to lower the cost of carbon fiber composites. If they are successful in developing high-volume renewable sources of carbon fiber feedstocks, ORNL's Bob Norris believes they will be on the road to success.

"Whereas today the cost to purchase commercial-grade carbon fiber is between $8 and $10 per pound, the goal is to reduce that figure to between $3 and $5 per pound," said Norris, leader of ORNL's Polymer Matrix Composites Group. At that price, it would become feasible for automakers to use more than a million tons of composites - approximately 300 pounds of composites per vehicle - annually in the manufacturing of cars.

That 300 lb of composites would replace 1500 lb of steel for a net 1200 lb weight savings.
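
To see why the $3 to $5 per pound target matters, here is a rough materials-cost comparison; the automotive steel price is my assumption:

    composite_lb = 300
    steel_replaced_lb = composite_lb * 5      # carbon fiber is ~1/5 the weight of steel
    steel_price = 0.35                        # assumed $/lb for automotive sheet steel

    for fiber_price in (10, 5, 3):            # today's price vs the ORNL targets
        premium = composite_lb * fiber_price - steel_replaced_lb * steel_price
        print("fiber at $%d/lb -> material cost premium of $%d" % (fiber_price, premium))
    # $10/lb -> $2,475 extra per vehicle; $3/lb -> $375 extra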

The big advantage of carbon fiber is that it is one-fifth the weight of steel yet just as strong and stiff, which makes it ideal for structural or semi-structural components in automobiles. Replacing half the ferrous metals in current automobiles could reduce a vehicle's weight by 60 percent and fuel consumption by 30 percent, according to some studies. The resulting gains in fuel efficiency, made in part because smaller engines could be used with lighter vehicles, would also reduce greenhouse gas and other emissions by 10 percent to 20 percent.

All of this would come with no sacrifice in safety, as preliminary results of computer crash simulations show that cars made from carbon fiber would be just as safe - perhaps even safer - than today's automobiles. Today's Formula 1 racers are required by mandate to be made from carbon fiber to meet safety requirements.

Combine the carbon fibers with next generation diesel electric hybrids and lithium-based batteries and the fuel savings would be even more dramatic. Doubling fuel efficiency seems plausible.

Here's a message bound to displease environmental puritans who think we should get right with the environment by making big sacrifices in our profligate and wasteful high energy lifestyles. In a recent speech at MIT Amory Lovins argued that carbon fibers could make even SUVs fuel efficient.

Even the quintessential gas-guzzling SUV could become energy-efficient if it weighed a lot less and was run by a hybrid engine or a fuel cell, according to noted author and environmentalist Amory Lovins, who spoke Monday, Feb. 27, to a packed crowd in Wong Auditorium.

Lovins is the founder and CEO of the Rocky Mountain Institute, a nonprofit organization that "fosters the efficient and restorative use of resources to make the world secure, just, prosperous and life-sustaining."

By increasing efficiency and substituting fuels such as biodiesel and natural gas saved through increased efficiency, the United States can be oil-free by 2040, said Lovins, featured speaker at the third colloquium sponsored by the Energy Research Council (ERC) and the Laboratory for Energy and the Environment (LFEE).

In a talk that shared the title "Winning the Oil Endgame" with his 29th book, Lovins presented a picture of an energy future in which more American cars will be manufactured that are competitive in the world marketplace, emissions will be drastically reduced, the economy will improve and the United States will be freed from its dependence on Middle East oil -- all with no radical shifts in government policy, taxes or regulations.

The catch? Cars, trucks and planes, which consume 70 percent of the U.S. oil supply, will virtually all have to be made of lightweight carbon composites or new ultralight steel.

Toward that end Lovins promotes the Hypercar concept.

I prefer solving problems through advances in technology to bring us even higher living standards over heeding calls for sacrifices and suffering to atone for our sins. So I'm adding acceleration of carbon fiber materials research to the list of accelerated development efforts I'd like to see in energy-related technologies. I'd also like to see lighter weight and higher energy density batteries, cheaper and higher conversion efficiency photovoltaics, cheaper and less waste generating nuclear reactor designs, and advances in building insulation technologies to make it cheaper to design extremely efficient buildings.

By Randall Parker 2006 March 08 09:22 PM  Energy Transportation
Entry Permalink | Comments(16)
2006 March 07 Tuesday
HapMap Yields Evidence Of Recent Human Evolution

Using data from the International HapMap project (HapMap stands for Haplotype Map of genetic variations) researchers find evidence for recent changes in the frequencies of genes in different human populations.

By scanning the entire human genome in search of genetic variations that may signal recent evolution, University of Chicago researchers found more than 700 genetic variants that may be targets of recent natural positive selection during the past 10,000 years of human evolution.

In one of the first comprehensive genome scans for selection, the researchers found widespread evidence of evolution in all of the populations studied. Their results are published and freely available online in the open-access journal PLoS Biology.

The data analyzed here were collected by the International HapMap Project and consist of genetic data from 209 unrelated individuals who are grouped into three distinct populations: 89 East Asians, 60 Europeans and 60 Yorubans from Nigeria. The researchers found roughly the same number of signals of positive selection within each population. They also found that each population shares about one fifth of the signals with one or both of the other groups.

"This approach allows us to take a broad prospective to see what kinds of biological systems are undergoing adaptation," said Jonathan Pritchard, professor of human genetics and corresponding author of the paper. "There have been a lot of recent changes--the advent of agriculture, shifts in diet, new habitats, climatic conditions--over the past 10,000 years, and we're using these data to look for those signals of very recent adaptation."

Among the more than 700 signals the team found were previously known sites of recent adaptation, such as the salt-sensitive hypertension gene and the lactase gene--the strongest signal in the genome hunt. The lactase mutation, which enables the digestion of milk to continue into adulthood, appeared in approximately 90 percent of Europeans.

"Presumably," Pritchard said, "a few thousand years from now, if selection pressure remains the same, everyone will have [the selected mutation]."

Classifying all the genes by their biological functions, the researchers listed the top 16 categories that had the strongest signals, including olfaction (the sense of smell), reproduction-related processes and carbohydrate metabolism, which includes the lactase gene.

Other processes that show signals of selection include genes related to metabolism of foreign compounds, brain development and morphology. For example, the researchers found five genes involved in skin pigmentation that show evidence of positive selection in Europeans. "Only one of these five signals was known before," Pritchard said. The authors also found signals of reproductive selection and sexual competition in all three populations.

"Many of the signals, however, seem to be more specific to modern human adaptation," he said, "like skin pigmentation, which may respond to changes in habitat, or metabolism genes, like lactase, which may respond to changes in agriculture."

From the text of the journal article: They found several brain genes under selective pressure.

Recent articles have proposed that genes involved in brain development and function may have been important targets of selection in recent human evolution [8,9]. While we do not find evidence for selection in the two genes reported in those studies (MCPH1 and ASPM), we do find signals in two other microcephaly genes, namely, CDK5RAP2 in Yoruba, and CENPJ in Europeans and East Asians [46]. Though there is not an overall enrichment for neurological genes in our gene ontology analysis, several other important brain genes also have signals of selection, including the primary inhibitory neurotransmitter GABRA4, an Alzheimer's susceptibility gene PSEN1, and SYT1 in Yoruba; the serotonin transporter SLC6A4 in Europeans and East Asians; and the dystrophin binding gene SNTG1 in all populations.

It is possible some genes with influence on brain function were missed in their analysis because the genes have as yet unidentified roles in influencing cognitive function. A couple of other factors suggest these results are far from comprehensive. First, they looked at only 800,000 single nucleotide polymorphisms (SNPs). Well, the human race has more than that. Also, and perhaps more importantly, they looked only at SNPs. Yet another type of genetic variation, large copy number variations, has fairly recently been found to create surprisingly large amounts of genetic variation between humans. So this latest result, with such a small sample size of humans and a subset of all human genetic variations, understates the extent of recent evolution in humans.

Regarding their lack of evidence for the recent evolution of MCPH1 and ASPM see my previous post Brain Gene Allele Frequences Show Brain Still Evolving. Also see my post PDYN Brain Gene Modified During Primate Evolution.

Human evolution did not stop tens of thousands of years ago. We are more different from each other due to genetic factors than left-liberal political ideologues would have you believe. We are still evolving and adapting to local environments. Starting some time in the next 20 or 30 years our rate of genetic change is going to accelerate by orders of magnitude and subpopulations of homo sapiens will diverge even more radically than human racial groups have diverged so far.

By Randall Parker 2006 March 07 10:28 PM  Brain Evolution
Entry Permalink | Comments(13)
2006 March 06 Monday
Metal-Organic Frameworks Advance In Hydrogen Energy Storage

Has GM been pursuing hydrogen as a Machiavellian intrigue to delay a shift to a better technology? I've never believed that. But some people have made this argument in the comments sections of previous posts. Well, suppose that hydrogen vehicles turn out to work and General Motors puts them into production (see below). Paranoid conspiracists then could always argue that the success was an accident and that the plotters thought that scientists wouldn't so quickly come up with workable solutions. Conspiracy theorizing can pretty much explain away any evidence and make it fit a conspiracy theory.

Chemists have achieved sufficient density of hydrogen in a storage material for transportation needs but their method still requires a very low temperature.

Chemists at UCLA and the University of Michigan report an advance toward the goal of cars that run on hydrogen rather than gasoline. While the U.S. Department of Energy estimates that practical hydrogen fuel will require concentrations of at least 6.5 percent, the chemists have achieved concentrations of 7.5 percent — nearly three times as much as has been reported previously — but at a very low temperature (77 degrees Kelvin).

The research, scheduled to be published in late March in the Journal of the American Chemical Society, could lead to a hydrogen fuel that powers not only cars, but laptop computers, cellular phones, digital cameras and other electronic devices as well.

"We have a class of materials in which we can change the components nearly at will," said Omar Yaghi, UCLA professor of chemistry, who conducted the research with colleagues at the University of Michigan. "There is no other class of materials where one can do that. The exciting discovery we are reporting is that, using a new material, we have identified a clear path for how to get above seven percent of the material's weight in hydrogen."

The materials, which Yaghi invented in the early 1990s, are called metal-organic frameworks (MOFs), pronounced "moffs," which are like scaffolds made of linked rods — a structure that maximizes the surface area. MOFs, which have been described as crystal sponges, have pores, openings on the nanoscale in which Yaghi and his colleagues can store gases that are usually difficult to store and transport. MOFs can be made highly porous to increase their storage capacity; one gram of a MOF has the surface area of a football field! Yaghi's laboratory has made more than 500 MOFs, with a variety of properties and structures.
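
To put the weight percentages in perspective, assume (a common rule of thumb, not a figure from the announcement) that a fuel cell car needs about 5 kg of hydrogen for adequate range:

    h2_needed_kg = 5.0                       # assumed onboard hydrogen for a car
    for wt_percent in (6.5, 7.5):            # DOE target vs the reported MOF result
        total_kg = h2_needed_kg / (wt_percent / 100)   # storage material + hydrogen
        print("%.1f wt%% -> ~%.0f kg of filled storage material" % (wt_percent, total_kg))
    # 6.5% -> ~77 kg; 7.5% -> ~67 kg, before the tank and cooling hardware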

Yaghi sounds optimistic about solving the temperature problem using his metal-organic frameworks (MOFs) approach. He also does not see cost as an obstacle.

"We have achieved 7.5 percent hydrogen; we want to achieve this percent at ambient temperatures," said Yaghi, a member of the California NanoSystems Institute. "We can store significantly more hydrogen with the MOF material than without the MOF."

MOFs can be made from low-cost ingredients, such as zinc oxide — a common ingredient in sunscreen — and terephthalate, which is found in plastic soda bottles.

"MOFs will have many applications. Molecules can go in and out of them unobstructed. We can make polymers inside the pores with well-defined and predictable properties. There is no limit to what structures we can get, and thus no limit to the applications."

In the push to develop hydrogen fuel cells to power cars, cell phones and other devices, one of the biggest challenges has been finding ways to store large amounts of hydrogen at the right temperatures and pressures. Yaghi and his colleagues have now demonstrated the ability to store large amounts of hydrogen at the right pressure; in addition, Yaghi has ideas about how to modify the rod-like components to store hydrogen at ambient temperatures (0–45°C).

"A decade ago, people thought methane would be impossible to store; that problem has been largely solved by our MOF materials. Hydrogen is a little more challenging than methane, but I am optimistic."

In a separate story "Seicmic" points me to an announcement by General Motors that they expect to start selling hydrogen cars in 4 to 9 years. (same article here)

General Motors Corp has made major steps in developing a commercially viable hydrogen-powered vehicle and expects to get the emission-free cars into dealerships in the next four to nine years, a spokesman told Agence France-Presse.

GM also expects it will be able to 'equal or better gas engines in terms of cost, durability and performance' once it is able to ramp up volume to at least 500,000 vehicles a year, spokesman Scott Fosgard said.

Hydrogen storage containers, like batteries, are just a way to store energy. The cheapest way to make hydrogen currently is from fossil fuels. But a workable way to store hydrogen at room temperature would, like better batteries, make it a lot easier to end the dependence of cars on oil. Advances in solar, wind, and nuclear power will eventually lower their costs far enough to make them cheaper sources of energy for producing hydrogen. Also, a cost effective hydrogen storage technology, just like cheaper batteries, would allow solar and wind to supply a larger fraction of all energy used because the ability to store energy helps any energy source that is not continuously available.

We still also need a big acceleration of research and development on both photovoltaics and nuclear reactor designs. We need cheaper non-fossil fuels energy sources. The storage problems are not going to be what prevents the transition away from fossil fuels. Higher costs of alternatives remain the biggest obstacle to phasing out fossil fuels.

By Randall Parker 2006 March 06 09:08 PM  Energy Tech
Entry Permalink | Comments(20)
2006 March 05 Sunday
British Health Trust Might Offer Fertility Treatments To Single Women

Single women in Britain want the National Health Service (NHS) to provide them with fertility treatments.

Single women in their 30s and 40s are to be allowed free fertility treatment on the NHS as record numbers opt for motherhood without a man. Hospital trusts are rewriting their policies in response to demand from singletons who have lost out in the relationship stakes, either because they have been unable to find the right man or because their partners are against parenthood.

The demographic profile of single moms giving birth in their 30s is a lot more upscale and educated than is the case for teen single moms. These older single moms have more intellectual and financial resources than the stereotypical high school drop-out teen mom.

An insider at a unit of the NHS expects the Camden primary care trust to start offering fertility treatments to single women.

In a pioneering move, Camden primary care trust in London is considering the introduction of free treatment for single women because of the huge demand from childless but financially secure would-be mothers.

One insider said the plan, which is expected to get the go-ahead at a funding meeting later this month, was a "sea change" from 10 years ago and would prompt other trusts to follow suit.

I've argued in the past that once cheap DNA sequencing allows detailed comparison of sperm donors, more women will opt to use sperm donors. Single women in their 30s and 40s are going to become more inclined to start pregnancies on their own when available technologies let them select sperm that will give them much smarter, healthier, better looking, and better behaved children.

Sperm donor screening with cheap DNA sequencing, pre-implantation genetic diagnosis (PIGD or PGD), and other reproductive technologies will lower the risks of reproduction. The lowered risks and rosier projected outcomes (yes, Jill or Johnnie will have the intellectual resources to easily excel in challenging high status professions) will lead more women to choose to have children on their own. What is more startling is that children born to single moms who select a genetically screened sperm donor for higher cognitive ability will, on average, be more successful as adults than children conceived naturally and born to married couples. The conservative family argument that children born to married parents turn out better will need a big qualifier: naturally conceived children born to married parents will still do better than naturally conceived children born to single women. But conception using genetically screened donor sperm will produce much better results on average.

Granted, Mr. and Mrs. medical doctor couples and Mr. and Mrs. Harvard Law graduate couples will have smarter kids than the average woman who conceives with a sperm donor today. But those couples are way above average in genetic endowments for cognitive abilities. Women who have babies using sperm selected for high cognitive ability are going to have smarter (and healthier and better looking) children than they would get from unscreened conception.
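A minimal sketch of the quantitative logic, using the standard mid-parent prediction from quantitative genetics (expected child deviation is roughly heritability times the mid-parent deviation). The heritability figure and the +2 standard deviation donor below are illustrative assumptions, not estimates from any study cited here:

    # Toy mid-parent calculation: how much does donor selection shift
    # the expected IQ of a child? All parameter values are assumptions.
    h2 = 0.6          # assumed narrow-sense heritability of IQ
    sd_iq = 15.0      # IQ points per standard deviation

    mother_dev = 0.0  # mother of average IQ
    donor_dev = 2.0   # donor screened at +2 SD (assumption)

    midparent_dev = (mother_dev + donor_dev) / 2
    child_dev = h2 * midparent_dev

    print(f"Expected child IQ: {100 + child_dev * sd_iq:.0f}")  # ~109

Even under these modest assumptions an average mother using a +2 SD donor gets an expected child IQ around 109, which is the regression-to-the-mean-adjusted version of the claim above.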

The uptake of reproductive technologies has become so big that it has noticeably increased the rate of twin births. The rate of twin births in the United States has doubled since 1971 due to older moms and fertility treatments.

The twin birth rate, which stood at about 1 in 60 in 1971, has risen rapidly because of fertility treatments and an increase in the number of older moms, with almost 1 in 30 American babies now being born as part of a pair.

That's a figure that is unprecedented anywhere in the world, according to Dr. Louis Keith, an emeritus professor at Northwestern University's medical school.

"The real epidemic of twins didn't begin until the mid-1990s, so we are now in the epidemic," says Keith, president of the Center for the Study of Multiple Birth in Chicago.

...

Overall, experts say, one-third of the increase in twins is because of a natural tendency toward twin births in older moms and the other two-thirds to fertility treatments.

New York City is experiencing a big increase in twin births.

In 1995, there were 3,707 twin births in all the boroughs; in 2003, there were 4,153; and in 2004, there were 4,655. Triplet births have also risen, from 60 in 1995, to 299 in 2004.
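For what it's worth, here are the growth rates implied by those New York City counts (just arithmetic on the numbers quoted above):

    # Growth implied by the New York City birth counts quoted above.
    twins_1995, twins_2004 = 3707, 4655
    triplets_1995, triplets_2004 = 60, 299
    years = 2004 - 1995

    annual = (twins_2004 / twins_1995) ** (1 / years) - 1
    print(f"Twins: +{(twins_2004 / twins_1995 - 1) * 100:.0f}% total, "
          f"~{annual * 100:.1f}% per year")                    # +26%, ~2.6%/yr
    print(f"Triplets: {triplets_2004 / triplets_1995:.1f}x in {years} years")  # ~5.0x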

Some day the norm will be to look down on natural procreation with no genetic enhancement, no IVF and PIGD, and no genetic screening of sperm. Natural procreation will be seen in the majority of Western countries as irresponsible toward offspring. How far off is that day? 30 years?

By Randall Parker 2006 March 05 12:33 PM  Bioethics Reproduction
Entry Permalink | Comments(29)
2006 March 02 Thursday
Satellites Show Antarctic Ice Shrinking

The biggest potential problem with global warming would come if large amounts of water bound up in ice in Greenland and Antarctica melted and raised the seas. A pair of satellites shows that the total amount of water locked up in Antarctic ice is shrinking.

University of Colorado at Boulder researchers have used data from a pair of NASA satellites orbiting Earth in tandem to determine that the Antarctic ice sheet, which harbors 90 percent of Earth's ice, has lost significant mass in recent years.

The team used measurements taken with the Gravity Recovery and Climate Experiment, or GRACE, to conclude the Antarctic ice sheet is losing up to 36 cubic miles of ice, or 152 cubic kilometers, annually. By comparison, the city of Los Angeles uses about 1 cubic mile of fresh water annually.

"This is the first study to indicate the total mass balance of the Antarctic ice sheet is in significant decline," said Isabella Velicogna of CU-Boulder's Cooperative Institute for Research in Environmental Sciences, chief author of the new study that appears in the March 2 online issue of Science Express. The study was co-authored by CU-Boulder physics Professor John Wahr of CIRES, a joint campus institute of CU-Boulder and the National Oceanic and Atmospheric Administration.

At the measured rate of melting it would take roughly 65 years for the oceans to rise an inch, or on the order of 750 years to rise a foot.

The estimated ice mass in Antarctica is equivalent to 0.4 millimeters of global sea rise annually, with a margin of error of 0.2 millimeters, according to the study. There are about 25 millimeters in an inch.
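Those numbers are easy to check. The global ocean surface area (about 3.61 × 10^8 square kilometers) and the density of ice relative to water (about 0.92) are standard figures; everything else comes from the study's quoted values:

    # Check: does 152 cubic km of ice lost per year give ~0.4 mm of
    # sea level rise per year, and how long per inch at that rate?
    OCEAN_AREA_KM2 = 3.61e8      # global ocean surface area
    ICE_DENSITY = 0.917          # relative to fresh water

    ice_loss_km3 = 152.0                        # the study's figure
    water_km3 = ice_loss_km3 * ICE_DENSITY      # ~139 km^3 of water
    rise_mm = water_km3 / OCEAN_AREA_KM2 * 1e6  # km -> mm

    print(f"Sea level rise: {rise_mm:.2f} mm/year")  # ~0.39 mm/yr
    print(f"Years per inch: {25.4 / rise_mm:.0f}")   # ~66 years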

The most recent Intergovernmental Panel on Climate Change assessment, completed in 2001, predicted the Antarctic ice sheet would gain mass in the 21st century due to increased precipitation in a warming climate. But the new study signals a reduction in the continent's total ice mass, with the bulk of loss occurring in the West Antarctic ice sheet, said Velicogna.

Researchers used GRACE data to calculate the total ice mass in Antarctica between April 2002 and August 2005 for the study, said Velicogna, who also is affiliated with the NASA's Jet Propulsion Laboratory in Pasadena.

"The overall balance of the Antarctic ice is dependent on regional changes in the interior and those in the coastal areas," said Velicogna. "The changes we are seeing are probably a good indicator of the changing climatic conditions there."

Launched in 2002 by NASA and Germany, the two GRACE satellites whip around Earth 16 times a day at an altitude of 310 miles, sensing subtle variations in Earth's mass and gravitational pull. Separated by 137 miles at all times, the satellites measure changes in Earth's gravity field caused by regional changes in the planet's mass, including such things as ice sheets, oceans and water stored in the soil and in underground aquifers.

A change in gravity due to a pass over a portion of the Antarctic ice sheet, for example, imperceptibly tugs the lead satellite away from the trailing satellite, said Velicogna. A sensitive ranging system allows researchers to measure the distance of the two satellites down to as small as 1 micron -- about 1/50 the width of a human hair -- and to then calculate the ice mass in particular regions of the continent.
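To get a feel for why micron-scale ranging is needed, here is an illustrative order-of-magnitude sketch. The 100 gigaton mass anomaly is my assumption for a sizable regional ice change, not a mission figure:

    # Order of magnitude: gravitational tug on a GRACE satellite from
    # a regional ice mass anomaly directly below its orbit.
    G = 6.674e-11               # gravitational constant, m^3 kg^-1 s^-2
    anomaly_kg = 1e14           # assumed 100 gigaton mass change
    altitude_m = 310 * 1609.34  # 310 miles in meters (~500 km)

    accel = G * anomaly_kg / altitude_m ** 2
    print(f"Extra acceleration: {accel:.1e} m/s^2")  # ~2.7e-8 m/s^2

An acceleration that small displaces the lead satellite by only about a micron over ten seconds of flight (one half a t squared), which is why the ranging system has to resolve changes in the 137-mile separation at the micron level.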

The satellites enabled collection of data across the entire Antarctic.

"The strength of GRACE is that we were able to assess the entire Antarctic region in one fell swoop to determine whether it was gaining or losing mass," said Wahr. While the CU researchers were able to differentiate between the East Antarctic ice sheet and West Antarctic ice sheet with GRACE, smaller, subtler changes occurring in coastal areas and even on individual glaciers are better measured with instruments like radar and altimeters, he said.

A study spearheaded by CIRES researchers at CU-Boulder and published in September 2004 concluded that glaciers on the Antarctic Peninsula - which juts north from the West Antarctic ice sheet toward South America -- sped up dramatically following the collapse of Larsen B ice shelf in 2002. Ice shelves on the peninsula -- which has warmed by an average of 4.5 degrees Fahrenheit in the past 60 years -- have decreased by more than 5,200 square miles in the past three decades.

The thickness of the Antarctic ice averages well over a mile across the entire continent. That's massive.

As Earth's fifth largest continent, Antarctica is twice as large as Australia and contains 70 percent of Earth's fresh water resources. The ice sheet, which covers about 98 percent of the continent, has an average thickness of about 6,500 feet. Floating ice shelves constitute about 11 percent of the continent.

The melting of the West Antarctic ice sheet alone - which is about eight times smaller in volume than the East Antarctic ice sheet -- would raise global sea levels by more than 20 feet, according to researchers from the British Antarctic Survey.
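Another quick sanity check: how much ice does 20 feet of sea level rise correspond to? Only the same standard ocean area figure is needed:

    # Inverse check: ice volume implied by 20 feet of sea level rise.
    OCEAN_AREA_KM2 = 3.61e8
    rise_km = 20 * 0.3048 / 1000          # 20 feet in kilometers
    water_km3 = rise_km * OCEAN_AREA_KM2
    ice_km3 = water_km3 / 0.917           # water volume -> ice volume

    print(f"Implied ice volume: {ice_km3:.1e} km^3")  # ~2.4e6 km^3

That is roughly 16,000 years of loss at the 152 cubic kilometers per year measured above, so the 20 foot scenario is about the West Antarctic sheet eventually melting, not about the current rate.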

You can look at pretty pictures on the web.

Animation of the GRACE mission is available on the Web at http://www.csr.utexas.edu/grace/gallery/animations/. Images of Antarctic ice shelves are available from CU-Boulder's National Snow and Ice Data Center at: http://nsidc.org/data/iceshelves_images/.

What is most important about this study is that it used satellites to build a much more comprehensive picture of what is happening with the ice. The problem is that the satellites were launched only in 2002.

Richard Alley points out the biggest problem with this study: it covers only 3 years.

Richard Alley, a Pennsylvania State University glaciologist who has studied the Antarctic ice sheet but was not involved in the new research, said more research is needed to determine if the shrinkage is a long-term trend, because the new report is based on just three years of data. "One person's trend is another person's fluctuation," he said.

But Alley called the study significant and "a bit surprising" because a major international scientific panel predicted five years ago that the Antarctic ice sheet would gain mass this century as higher temperatures led to increased snowfall.

Scientists can't prove that human activities are causing global warming. However, humans are changing the atmosphere on a large enough scale that the possibility exists that we are changing the climate. In much of the world I do not see warming as a problem. In fact, for people living in such places as northern Russia, Finland, Alaska, Alberta, North Dakota, or Minnesota winter warming strikes me as pretty beneficial. But melting of the Antarctic ice would produce huge costs all over the world. Coastal lands are valuable. Large low lying areas would be lost. We'd simply have a lot less land and the fishies would have a lot more water to swim in if a substantial portion of Antarctica's ice melted.

The biggest problem we have with the climate debate is that the big mathematical models can't predict what'll really happen since the models contain simplifications that are probably wrong in important ways. We end up having to guess what will happen. Nature continually makes the climate change even without humans getting involved. So even once a change has happened it is still impossible to figure out how much of the change was caused by humans.

It seems to me we ought to approach this problem by first realizing that we need greater capabilities in several areas:

  • We need much cheaper non-fossil ways to create energy. Once such technologies are developed they will both reduce greenhouse gas emissions (by displacing fossil fuels for energy) and also provide us with the energy we need to protect ourselves from climate changes. Not enough rain for crops? Use nuclear or solar power to desalinate. Too hot? Use nuclear or solar to run air conditioners.
  • We need greater capabilities for measuring the climate. The satellites used in this latest report demonstrate how much technology can help to answer questions which otherwise form the basis of speculations and debates that last for years.
  • We need to develop ways to engineer the climate so that if it ever starts moving in directions that will cause huge problems for the human race we'll be able to intervene and push it in a different direction. For example, if we could use assets in space to deflect light away from Antarctica and Greenland, we could prevent and even reverse ice melts.
  • We need general technological advances since advanced technology and accumulation of capital give us the resources we need to protect ourselves from the consequences of both natural and man-made changes in our environments.
By Randall Parker 2006 March 02 09:31 PM  Climate Trends
Entry Permalink | Comments(51)
2006 March 01 Wednesday
Higher Stress Hormone Correlated With Shrinkage In Brain Region

Stress hormones probably accelerate brain aging.

Researchers at the University of Edinburgh have identified for the first time a certain area of the brain which can shrink in old age and cause depression and Alzheimer's disease. The scientists believe the shrinkage may be caused by high levels of stress hormones.

They examined the size of a special region of the brain, the anterior cingulate cortex, that might be involved in controlling stress hormones. In a significant discovery, scientists found that people with a smaller anterior cingulate cortex had higher levels of stress hormones.

Doctors analysed stress hormone levels and brain volume in two groups of ten healthy male volunteers aged 65-70 for the study. Lead author Dr Alasdair MacLullich said: "Doctors have known for several years that ageing, and certain diseases common in ageing like Alzheimer's disease and depression, can be associated with shrinkage of the brain, but this is the first time we have been able to show that increased levels of stress hormones may cause shrinkage of this critical area of the brain.

"This is an important new finding because the anterior cingulate cortex shows damage in ageing, depression, and Alzheimer's disease, and stress hormones are often high in these conditions. The discovery deepens doctors' understanding of ageing, depression and Alzheimer's disease, and will help in the development of treatments based on reducing high levels of stress hormones."

The abstract for this research indicates they were looking at cortisol as the stress hormone.

Of course, there's the possibility that the cortisol is a consequence of the shrunken anterior cingulate cortex, or that both are consequences of a third factor that causes the brain shrinkage without cortisol anywhere in the chain of causes and effects. But my guess is that the cortisol is causing accelerated brain aging.
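That alternative causal story can be made concrete with a toy simulation: if some hidden third factor drives both cortisol and cortex volume, the two will correlate strongly even though neither affects the other. All the numbers below are made up for illustration:

    # Toy illustration: a hidden third factor produces a cortisol/volume
    # correlation with no direct causal link between the two.
    import random

    random.seed(0)
    pairs = []
    for _ in range(1000):
        z = random.gauss(0, 1)                    # hidden third factor
        cortisol = 0.8 * z + random.gauss(0, 0.6)
        volume = -0.8 * z + random.gauss(0, 0.6)  # z shrinks the cortex
        pairs.append((cortisol, volume))

    n = len(pairs)
    mc = sum(c for c, v in pairs) / n
    mv = sum(v for c, v in pairs) / n
    cov = sum((c - mc) * (v - mv) for c, v in pairs) / n
    sc = (sum((c - mc) ** 2 for c, v in pairs) / n) ** 0.5
    sv = (sum((v - mv) ** 2 for c, v in pairs) / n) ** 0.5
    print(f"Correlation: {cov / (sc * sv):.2f}")  # strongly negative

A cross-sectional study of twenty men cannot distinguish this from direct causation; that takes longitudinal data or an intervention that lowers cortisol.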

If you want to live longer, avoid lifestyles and occupations that cause you to experience chronic stress. Anyone know of good research on environmental factors that relieve or cause stress?

By Randall Parker 2006 March 01 10:07 PM  Brain Aging
Entry Permalink | Comments(6)