2005 October 31 Monday
Paternal Inheritance Found For Telomeres Which Influence Longevity

A research team at Umeå University in Sweden has found that the influence of telomere length on aging appears to be inherited from fathers, not mothers.

Telomeres are stretches of repetitive genetic material at the ends of chromosomes, and their main function is believed to be protecting the rest of the genetic material from degradation. Telomeres shorten each time a cell divides, which in broad terms means that, in theory, the longer a cell’s telomeres are, the longer the individual can live. A person’s telomeres shorten with age, which the findings of the study indeed show: telomeres were shortened by an average of 21 base pairs per year in the subjects studied.
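As a back-of-the-envelope illustration of what a 21-base-pair-per-year loss implies, here is a toy linear extrapolation. The starting length and critical threshold below are assumed round numbers for illustration only, not figures from the study:

```python
LOSS_PER_YEAR = 21        # base pairs lost per year (from the study)
START_LENGTH = 10_000     # assumed telomere length at birth, in base pairs
CRITICAL_LENGTH = 5_000   # assumed length at which cell division is impaired

def telomere_length(age_years, start=START_LENGTH, loss=LOSS_PER_YEAR):
    """Linear extrapolation of telomere length with age."""
    return start - loss * age_years

# Length at the study's mean parental age of 66:
print(telomere_length(66))  # 8614 base pairs under these toy assumptions

# Years for calendar-time loss alone to reach the assumed critical threshold:
years_to_critical = (START_LENGTH - CRITICAL_LENGTH) / LOSS_PER_YEAR
print(round(years_to_critical))  # ~238 years
```

Under these assumed numbers, the measured yearly loss alone would take centuries to exhaust the telomeres, which underscores that the per-division shortening in frequently dividing cells, not calendar time by itself, does most of the damage.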

The study, soon to appear in the U.S. scientific journal PNAS, Proceedings of the National Academy of Sciences, was carried out on 132 healthy subjects from 49 families in northern Sweden with no close kinship to each other. The subjects consisted of fathers and mothers (mean age 66 years) and their daughters and sons (mean age 37 years). Blood samples were taken, and mononuclear immune cells were isolated.

Half of these were simply frozen, while the other half were infected with Epstein-Barr virus (EBV) and cultured for 18-55 days, whereupon the surviving cells were frozen. DNA was then extracted from both cell types using standardized techniques, and the length of the telomeres was ascertained.

The findings show that changes in the length of the telomeres in the cultured cells are determined by the original length of the telomeres, and the length of the telomeres in the second generation, both sons and daughters, proved to be inherited from the father.

Telomeres are caps on the ends of chromosomes. They get shorter each time cells divide. As the telomere caps become really short they start to interfere with cell division. This is one of the causes of aging. Telomere cap shortening is probably an evolved mechanism to reduce the risk of death from cancer. Cancers need to undergo a mutation to activate telomerase to grow longer telomere caps so that the cancer cells can divide many more times than normal cells can.

Given that shorter telomere lengths probably reduce cancer risk, it is by no means guaranteed that people who inherit longer telomeres from their father will live longer on average.

I would be curious to know what mechanism might cause telomere length to track with paternal inheritance. Do sperm genes for telomerase expression carry a methylation pattern or some other mark that makes them become active as soon as sperm and egg meet? Do paternal telomerase regulatory genes get turned on while maternal telomerase regulatory genes are suppressed?

Suppose this finding holds up upon further investigation. The female line of descent still has at least one big unique influence upon longevity: mitochondrial DNA. That comes solely from the female line of descent the vast bulk of the time (though perhaps not always).

The abstract and full paper can be found here.

Also see my previous posts "Telomere Length Indicates Mortality Risk" and "Chronic Stress Accelerates Aging As Measured By Telomere Length".

By Randall Parker 2005 October 31 09:15 PM  Aging Studies
Entry Permalink | Comments(4)
2005 October 30 Sunday
COMT Gene Deletion A Cause Of Schizophrenia

Deletion of the gene for an enzyme that breaks down dopamine causes less than 5% of schizophrenia cases.

STANFORD, Calif. –A gene that regulates dopamine levels in the brain is involved in the development of schizophrenia in children at high risk for the disorder, say researchers at the Stanford University School of Medicine, Lucile Packard Children’s Hospital and the University of Geneva. The finding adds to mounting evidence of dopamine’s link to psychiatric and neurological disorders. It may also allow physicians to pinpoint a subset of these children for treatment before symptoms start.

“The hope is that we will one day be able to identify the highest-risk groups and intervene early to prevent a lifetime of problems and suffering,” said Allan L. Reiss, MD. “As we gain a much better understanding of these disorders, we can design treatments that are much more specific and effective.”

Gene therapy to restore COMT activity (see below) would probably be the ideal method for early intervention. More generally, as we discover the genetic contributors to more diseases we will need better gene therapy delivery techniques to make use of the discovered information.

About 30% of those with a deletion at a particular location on chromosome 22 will develop schizophrenia or a similar mental disorder.

Reiss and the study’s first author Doron Gothelf, MD, a child psychiatrist and postdoctoral scholar at Stanford, studied 24 children with a small deletion in one copy of chromosome 22. About 30 percent of children with this deletion, which occurs in about one in 4,000 births, will develop schizophrenia or a related psychotic disorder. These children also often have special facial features, cardiac defects and cleft anomalies that often make their speech hypernasal. Although these characteristics make it possible to identify them before psychiatric disorders develop, the disorder, called velocardiofacial syndrome, is under-diagnosed and under-recognized in this country despite its link to schizophrenia.

“We have strong evidence that this deletion is a major risk factor for the development of schizophrenia or related psychotic disorders,” said Reiss. “We asked, ‘What is it about this deletion that causes such an increase in risk?’”

The answer lay in the fact that one of the missing genes encodes a dopamine-degrading protein called COMT. Natural variations in the gene generate two versions of the protein: one with high activity, one with low.

Because most people have two copies of the gene, it doesn’t usually matter which versions of COMT they inherit; high-high, high-low and low-low all seem to provide enough COMT activity to get the job done (though some combinations confer a mild advantage for some cognitive tasks).

Note the point above about how some combinations of COMT variations confer a mild cognitive advantage. Quite possibly some of the COMT variations that contribute to schizophrenia exist due to Darwinian selective pressure for higher cognitive ability. Some people get combinations of genes that boost their cognitive ability at the cost of a higher risk of schizophrenia.

The researchers decided to see if excess levels of dopamine due to insufficient COMT activity perhaps acted as neurotoxins that brought on schizophrenia.

But children with the deletion have only the one copy that remains on their intact chromosome 22. Reiss and Gothelf, who is also an assistant professor at Tel Aviv University in Israel, surmised that a single copy of the low-activity COMT might not dispose of enough dopamine to produce optimal brain function. They set out to determine if the clinical course of the children with deletions who developed schizophrenia varied with the version of the COMT protein they had.

Since chromosomes come in pairs, a deletion on one chromosome 22 still leaves a copy of the gene on the other chromosome 22. So the scientists investigated the COMT variations on that remaining copy in subjects who carry the deletion.

The researchers' surmise turned out to be correct. Among children missing one copy of COMT, those whose remaining copy was the low-activity version had worse symptoms than those with the high-activity version.

As expected, about 29 percent, or seven, of the children with the deletion had developed a psychotic disorder by the second round of testing, compared with only one child in the control group. Of these seven, those with the low-activity version of COMT had experienced a significantly greater drop in their verbal IQ and expressive language skills and a markedly greater decrease in the volume of their prefrontal cortex than did their peers with the more highly active version of COMT. The psychotic symptoms of the low-activity subset were also significantly more severe.

In contrast, members of the control group experienced no significant differences in any of these categories, regardless of their COMT profiles.

What I want to know: are the COMT low activity and deletion mutations more examples of IQ "overclocking" mutations? The press release doesn't provide enough detail to tell. Which combinations of COMT variations resulted in the best cognitive performance? Also, are low activity versions of COMT or deletions of COMT more common among Ashkenazi Jews than among other populations?

By Randall Parker 2005 October 30 12:49 PM  Brain Disorders
Entry Permalink | Comments(5)
2005 October 28 Friday
Cruciferous Vegetables Lower Cancer Risk Only For Some With Inactive Genes

Eating cruciferous vegetables only helps lower the risk of cancer if you have inactive forms of glutathione-S-transferase enzymes.

Paul Brennan of the International Agency for Cancer Research and other scientists have just completed a lung-cancer study that appears to back up this theory. In particular, the team studied the diets and genes of more than 4,000 people in Eastern and Central Europe.

According to the results published today in the journal The Lancet, the researchers found that people with an inactive form of the GSTM1 gene were 33-per-cent less likely to get lung cancer if they ate cruciferous vegetables on a weekly basis.

Furthermore, "in people who had an inactive GSTT1, there was a 37-per-cent protective effect, while those with both genes inactivated had a 72-per-cent protective effect."

They found no protective effect in people with active forms of the genes.

Think about that last sentence. If you could get tested and discovered that you have active forms of both genes then you'd have no health reason to eat Brussels sprouts and broccoli. That strikes me as something I'd really like to know.

On the other hand, this result might also provide support for the idea of developing drugs to deactivate the glutathione-S-transferase enzymes so that isothiocyanates will hang around the body for longer periods of time. Or maybe high dose isothiocyanate pills could overwhelm the enzymes that break them down so that the isothiocyanates can still provide protection.

Over 4,000 people took part in the study.

For this study, the researchers looked at 2,141 people with lung cancer, comparing them with 2,168 healthy individuals in the Czech Republic, Hungary, Poland, Slovakia, Romania and Russia, where consumption of these vegetables has traditionally been high.

Participants filled out a food questionnaire, and also gave a blood sample so researchers could detect GSTM1 and GSTT1.

The questionnaire listed 23 foods, including three cruciferous vegetables: cabbage and a combination of Brussels sprouts with broccoli.

In non-smokers with active genes there might still be a low protective effect.

When the results were stratified by smoking status, a protective effect was seen in smokers with both genes inactive (OR 0.31; 95% CI 0.12-0.82) but not in people with both genes active. In non-smokers, there seemed to be a protective effect regardless of genotype, however the results did not reach statistical significance.
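For readers unfamiliar with the odds-ratio (OR) notation in the excerpt, here is a minimal sketch of how a figure such as OR 0.31 arises from a case-control table. The counts below are invented for illustration, not taken from the Lancet study:

```python
# Hypothetical 2x2 case-control table (counts invented for illustration):
#                       lung cancer    healthy controls
# eats crucifers            31              100
# does not eat them        100              100
def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Cross-product odds ratio: (a/b) / (c/d) = (a*d) / (b*c)."""
    return (exposed_cases * unexposed_controls) / (exposed_controls * unexposed_cases)

or_value = odds_ratio(31, 100, 100, 100)
print(or_value)  # 0.31: vegetable eaters have 0.31x the odds of being a case
```

An OR below 1 indicates a protective association, and the 95% confidence interval (0.12-0.82 in the quoted result) excluding 1 is what makes the smokers' result statistically significant while the non-smokers' result was not.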

By Randall Parker 2005 October 28 02:00 PM  Aging Diet Cancer Studies
Entry Permalink | Comments(5)
Evidence For More Genetic Large Structural Variations Between Humans

A comparison of human and chimpanzee DNA sequences for large structural variations such as inversions (flips) has led to the discovery of more genetic variations between humans than were previously known.

By comparing the human genome with that of the chimpanzee, man's closest living relative, researchers have discovered that chunks of similar DNA that have been flipped in orientation and reinserted into chromosomes are hundreds of times more common in primates than previously thought. These large structural changes in the genome, called inversions, may account for much of the evolutionary difference between the two species. They may also shed light on genetic changes that lead to human diseases.

Although humans and chimpanzees diverged from one another genetically about six million years ago, the DNA sequences of the two species are approximately 98 percent identical.

The researchers published their findings in the October 28, 2005, issue of the journal Public Library of Science Genetics (PLoS Genetics).

That previous link is free access.

This paper provides yet more evidence that the early focus on single point mutations (Single Nucleotide Polymorphisms or SNPs) as a measure of genetic variation between humans has understated the extent to which humans differ from each other genetically.

This research expands on a Nature paper published on September 1, 2005, by HHMI investigator Evan E. Eichler at the University of Washington. Eichler's group determined that novel duplications of genetic material within humans also significantly contribute to differences between the species.

Instead of identifying sequence changes between the two genomes at the base-pair level, Scherer focused his research on large structural variations in chromosomes between humans and chimps, specifically genetic inversions. Inversions can disrupt the expression of genes at the point where the chromosome breaks, as well as genes adjacent to breakpoints.

“From a medical genetics perspective, there are probably hundreds of disease genes that have not yet been characterized,” said Scherer. “The vast majority of disease gene discovery has been based on gene sequencing, but this is not a comprehensive view of chromosomes. We are using an evolutionary approach to identify mutations that may predispose people to disease.”

According to Scherer, prior to this research, only nine inversions between humans and chimps had been identified. Using a computational approach, Scherer's group identified 1,576 presumed inversions between the two species, 33 of which span regions larger than 100,000 base pairs—a sizeable chunk of DNA. The average human gene is smaller, only about 60,000 bases in length.

Scherer's team experimentally confirmed 23 out of 27 inversions tested so far. Moreover, by comparing the chimp genome with its ancestor, the gorilla genome, they determined that more than half of the validated inversions flipped sometime during human evolution.

The genetic sequence inversions found in humans are not found uniformly across all humans.

Perhaps even more interesting than the abundance of inversions that Scherer's group unveiled was their discovery that a subset of the inversions are polymorphic—taking different forms—within humans, meaning that the human genome is still evolving. When the 23 experimentally confirmed inversions were tested against a panel of human samples, the scientists found three inversions with two alleles or pairs of genes displaying the human inversion in some people, whereas others had one allele of the human inverted sequence and one allele of the normal sequence in chimps.

Having one allele with an inversion and one allele without represents a ticking time bomb in genetic terms, Scherer said, since these alleles may improperly align and recombine during replication, ultimately causing DNA deletions or a loss of DNA that subsequent generations inherit. Scherer's prior research on Williams-Beuren syndrome, a disease caused by DNA micro-deletions, identified a significantly higher incidence of inversions among the parents of afflicted patients.

Because these researchers compared only a small sample of humans, many more structural variation polymorphisms probably went undetected in this report.

Scherer said that his group looked at only a very small subset of the human population when assessing the prevalence of polymorphisms. He suspects that polymorphisms, and structural variations in general, may be much more common than his preliminary analyses suggest.

We need DNA testing methods for easily and cheaply detecting large copy variations, inversions, and other large structural variations. It is obviously not enough just to compare single point mutations as the HapMap project is doing. Lots of important genetic variations exist as larger structural differences. While most of the SNP differences have been identified, most of the large structural variations probably still await discovery.

Also see my recent posts "Human Genetic HapMap Phase I Published" and "Genetic Analysis Shows Signs Of Selective Pressure In Human Evolution".

By Randall Parker 2005 October 28 12:30 PM  Trends, Human Evolution
Entry Permalink | Comments(0)
Implantable Remote Controlled Surgical Robot Developed

University of Nebraska researchers have developed a 3-inch (about 8-centimeter) insertable, remote-controlled robot for abdominal surgery.

But, these tiny-wheeled robots – slipped into the abdomen and controlled by surgeons hundreds of kilometers away – may be giants in saving the lives of roadside accident victims and soldiers injured on the battlefield.

Each camera-carrying robot -- the width of a lipstick case -- would illuminate the patient’s abdomen, beam back video images and carry different tools to help surgeons stop internal bleeding by clamping, clotting or cauterizing wounds.

Sound far-fetched? Not for physicians and engineers at the University of Nebraska Medical Center and University of Nebraska-Lincoln, who already are turning the sci-fi idea into reality with a handful of miniature prototypes.

“We want to be the Microsoft leader in this technology and be the state that changes the way surgery is done,” said Shane Farritor, Ph.D., associate professor in the Department of Mechanical Engineering in UNL’s College of Engineering and Technology.

“This work has the potential to completely change the minimally invasive surgery landscape,” said Dmitry Oleynikov, M.D., director of education and training for the minimally invasive and computer-assisted surgery initiative. “This is just the start of things to come regarding robotic devices at work inside the body during surgery.”

So when will surgery by hands-on surgeons become less common than surgery by robots controlled by surgeons? 20 years? 30 years? When will surgeon-controlled robots be replaced by totally automated robots?

This approach provides greater control and more views than existing laparoscopic techniques.

It’s a stark contrast to existing laparoscopic techniques, which allow surgeons to perform operations through small incisions. The benefits of laparoscopy are limited to less complex procedures, however, because of losses in imaging and dexterity compared to conventional surgery.

“These remotely controlled in vivo robots provide the surgeon with an enhanced field of view from arbitrary angles, as well as provide dexterous manipulators not constrained by small incisions in the abdominal wall,” Dr. Oleynikov said.

In fact, the view is better than the naked eye, he said, because the in-color pictures from the roaming robots are magnified 10x.

Future remote use applications include space, battlefield, and civilian emergencies.

On the battlefield, these tiny soldiers can be inserted into wounds and allow remote surgeons to determine how critical the injury is and what immediate steps can be taken to ensure survival.

The UNMC and UNL team also plans to soon test a final prototype of a mobile biopsy robot designed to take samples of tissue. In addition, the design team is making modified robots that can be inserted into the stomach cavity through the esophagus.

The 3-inch long, aluminum-cased robots contain gears, motors, lenses, camera chips and electrical boards. “Three inches seems to be our limit at the moment because of the electrical components we use,” said designer Mark Rentschler, a Ph.D. candidate in biomedical engineering at UNL. “If we were to make 1,000 robots we would be able to afford customized electrical components that would reduce the size of the robot by half.”

The design team said initially the mini-robots will be single-use devices, although they eventually may be able to be sterilized for multiple use.

The group intends to create a local, spin-off company and then seek FDA approval of the devices, which would be applicable for any laparoscopic or minimally invasive surgery – from gall bladder to hernia repair.

NASA will begin trials next spring with an astronaut in a submarine off of Florida. The scientists hope to begin clinical trials with humans within a year in the UK.

One can also imagine an insertable stem cell incubator that would continually produce stem cells aimed at an especially damaged part of the body. Or how about an insertable robot surgeon that stays in the body for days and weeks to gradually reshape damaged tissue with a combination of a series of small surgical modifications, drug delivery, and stem cell delivery? In the longer run nanobots will do a lot of that work. But before nanobots become practical more conventional miniaturized robots will do a lot of repair work.

By Randall Parker 2005 October 28 10:34 AM  Biotech Manipulations
Entry Permalink | Comments(8)
Cytokines Stimulate Stem Cells To Heal Heart Attack Patients?

Heart specialist Sebastiano Marra at Turin University in Italy found that injection of cytokine hormones into the body after a heart attack marshals stem cells to repair the heart and leads to better outcomes.

In the new technique, hormones called cytokines are injected into the body during the 24 hours after emergency heart surgery and immediately spur the production of stem cells in spinal fluid.

The stem cells race to rescue the damaged heart, Marra said.

"The acute inflammation of the heart attracts the stem cells whose role in the body is to repair cardiac tissue," said Marra, who operated on the patients at Turin's Molinette Hospital.

Tests on eight patients who were operated on immediately after a heart attack have produced "amazing" results, he said.

"They were soon back on their bicycles or going to swimming pools." Compared to experimental methods used so far, the Turin technique is far less invasive, Marra continued.

Mind you, this is a news report on only 8 patients, not a journal article describing a larger group of patients with controls and a detailed comparison of outcomes. Still, the approach is at least plausible. Stimulation of stem cell production is already used to harvest stem cells from donors to treat leukemia. But Marra is trying to stimulate stem cells within the same body that needs them for heart muscle repair.

The technique might work less well in the very old because stem cell reservoirs in older people are aged and do not divide as quickly. However, one study found that elements in the blood of old mice caused their stem cells to grow less rapidly. So it isn't so much that the stem cells are old as that they are getting signals telling them not to grow. Perhaps cytokines or other compounds can override those suppressor signals. So Marra's approach might work even for old folks.

Thanks to Brock Cusick for the pointer.

By Randall Parker 2005 October 28 09:43 AM  Biotech Stem Cells
Entry Permalink | Comments(8)
2005 October 27 Thursday
Dearth Of Suitable Males For University Educated Japanese Women

Social demographer James Raymo of the University of Wisconsin-Madison and Miho Iwasawa of the National Institute of Population and Social Security Research in Tokyo, Japan, have found that the willingness of better educated Japanese men to marry less educated women has left many of the more educated Japanese women single.

Highly educated women generally seek out equally educated spouses, says Raymo, but in Japan, husbands don't necessarily share a similar preference. Due to the extreme difficulty of tending to family and having a job simultaneously, Japanese wives are more likely than their American counterparts to stay home and financially depend on their husbands. Consequently, the researcher notes, Japanese men have less incentive to choose partners of the same educational background.

"Most highly educated Japanese women still want to marry, but can't do that as easily as they could in the past," says Raymo. "Women's increasing educational attainment, combined with lack of change in family roles, has created a potential 'marriage market mismatch' in Japan."

Raymo and co-author Miho Iwasawa of Tokyo's National Institute of Population and Social Security Research analyzed data from Japan's largest fertility and marriage survey, which has collected information on up to 10,000 married women and a similar number of unmarried men and women since 1952. The researchers' analysis revealed that approximately one-fourth of the decline in marriages among university-educated Japanese women can be explained by the dearth of available mates.

In a nutshell: smarter men are willing to marry relatively dumber women. This leaves a deficit of smarter men for the smarter women to marry. This problem is by no means unique to Japan. Does the preference for equally well educated spouses run more strongly in some Western countries than in others?

Why does this mismatch of male and female mating preferences happen? It is probably a consequence of Darwinian natural selection. Women tend to place a higher value on status in mates. Men tend to place a higher value on fertility signals (most notably youth). So a man can get a younger and sexier looking mate by accepting a mate of lower status. But that leaves quite a few smart women with a much less desirable field of men to choose from.

Where this phenomenon occurs, one would also expect to see a lower rate of marriage among poorly educated and lower IQ men than among women of similarly low intellectual ability. Such societies end up with lots of unmarried smarter women and unmarried dumber men.

Some years back I can remember when Lee Kuan Yew (former prime minister and current "senior minister" of Singapore) organized social events to bring smart men and smart women together at dances or dinners. He wanted the smart male engineers to marry the smart women who had majored in the humanities in college. I wonder if he had any success and whether these social get-togethers are still arranged by the government in Singapore.

Also see my previous posts "Men Prefer Subordinate Women For Long Term Relationships", "Humans Have Been Polygamous For Most Of History", and "Baby Boys Keep Marriages Together Better Than Baby Girls".

By Randall Parker 2005 October 27 01:26 PM  Trends Demographic
Entry Permalink | Comments(20)
2005 October 26 Wednesday
Human Genetic HapMap Phase I Published

The human genome Haplotype Map (HapMap) project has reached an important milestone.

At the project's outset in October 2002, the consortium set an ambitious goal of creating a human haplotype map, or HapMap, within three years. The Nature paper marks the attainment of that goal with its detailed description of the Phase I HapMap, consisting of more than 1 million markers of genetic variation, called single nucleotide polymorphisms (SNPs). The consortium is also nearing completion of the Phase II HapMap that will contain nearly three times more markers than the initial version and will enable researchers to focus their gene searches even more precisely on specific regions of the genome.

Identification of locations where chromosomes swap sections of genetic code provides a number of benefits. One big advantage is that it allows testing of a smaller number of locations in order to better guess which sequence variations will be found at untested locations. This lowers the cost of genetic testing, which is great for tracking which genetic sequence variations are correlated with specific diseases, physical body shapes and colors, cognitive differences, and other areas of human variation.
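The idea of guessing untested variants from tested ones can be sketched concretely. In this toy example the haplotypes and SNP sites are invented; in real use they would come from a HapMap-style reference panel. Because two sites travel together on the same haplotypes (strong linkage disequilibrium), observing the tested "tag" SNP predicts the untested one:

```python
from collections import Counter

# Invented reference haplotypes over three SNP sites (not real HapMap data).
reference_haplotypes = [
    ("A", "G", "T"),
    ("A", "G", "T"),
    ("A", "G", "C"),
    ("C", "T", "C"),
    ("C", "T", "C"),
]

def impute(tag_index, tag_allele, target_index):
    """Predict the most common allele at target_index among reference
    haplotypes that carry tag_allele at the tested tag_index."""
    matches = [h[target_index] for h in reference_haplotypes
               if h[tag_index] == tag_allele]
    return Counter(matches).most_common(1)[0][0]

# Testing only site 0 and seeing "A" predicts "G" at the untested site 1:
print(impute(0, "A", 1))  # G
```

Real imputation methods are statistical rather than a simple majority vote, but the principle is the same: the fewer independent blocks of variation, the fewer sites need to be genotyped directly.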

Just because people who have particular SNPs at particular locations usually also have particular other SNPs at other locations does not mean this is always the case. Some people will have rarer mutations (in fact, we probably all carry unique mutations). So the coming decline in the cost of DNA sequencing by orders of magnitude will provide useful benefits, especially for identifying rarer mutations and determining who carries them.

Identification of the key SNP locations also provides a sort of roadmap for the study of human evolution. The fact that some sections with common sets of SNPs are shorter and other sections are longer is helpful in identifying genetic functionality that has been under a great deal of selective pressure.

Genetic diversity in humans is increased by recombination, which is the swapping of DNA from the maternal and paternal lines. It has been recently realized that in humans, most such swapping occurs primarily at a limited number of "hotspots" in the genome. By analyzing the HapMap data, the researchers have produced a genome-wide inventory of where recombination takes place. This will enable more detailed studies of this fundamental property of inheritance, as well as serve to improve the design of genetic studies of disease.

The scientists found evidence of selective pressures for both immune response and cognitive function.

The HapMap consortium found that genes involved in immune response and neurological processes are more diverse than those for DNA repair, DNA packaging and cell division. Researchers speculate the difference might be explained by natural selection shaping in the human population in ways that favor increased diversity for genes that influence the body's interactions with the environment, such as those involved in immune response, and that do not favor changes in genes involved in core cellular processes.

As expected, the vast majority of both rare and common patterns of genetic variation were found in all of the populations studied. However, the consortium did find evidence that a very small subset of human genetic variation may be related to selection pressures related to geographic or environmental factors, such as microorganisms that cause infectious diseases. This evidence appears as significant differences in genetic variation patterns in particular genomic regions among the populations studied. While more follow-up study is needed to explore the differences, researchers say some of the most striking examples merely serve to confirm well-known genetic differences among populations, such as the Duffy blood group, which plays a role in response to malaria, and the lactase gene, which influences the ability to digest milk products.

In 20 or 30 years most of the selective pressures that acted on the human race in local environments will probably be well characterized and their effects quantified down to the level of frequencies of genetic variations in different human populations. We are living in the final years of our ignorance of how natural selection molded humans to produce the wide range of human variation we see today.

The HapMap is for single point mutations. But recent progress in identifying larger scale structural variations such as large copy variations is starting to paint a picture of much larger amounts of genetic variation between individuals and groups.

By Randall Parker 2005 October 26 03:17 PM  Trends, Human Evolution
Entry Permalink | Comments(2)
Women Have More Facial Nerve Receptors Than Men

A larger number of facial nerve receptors may provide one explanation for why women feel pain more intensely.

ARLINGTON HEIGHTS, Ill. – For centuries, it has been generally believed women are the more sensitive gender. A new study says that, when it comes to pain, women are in fact more sensitive. According to a report published in October's Plastic and Reconstructive Surgery®, the official medical journal of the American Society of Plastic Surgeons (ASPS), women have more nerve receptors, which cause them to feel pain more intensely than men.

"This study has serious implications about how we treat women after surgery as well as women who experience chronic pain," said Bradon Wilhelmi, MD, ASPS member and author of the study. "Because women have more nerve receptors, they may experience pain more powerfully than men, requiring different surgical techniques, treatments or medicine dosages to help manage their pain and make them feel comfortable."

According to the study, women averaged 34 nerve fibers per square centimeter of facial skin while men only averaged 17 nerve fibers. Despite psychosocial expectations for men to be tougher than women when feeling pain, these findings illustrate that women's lower pain tolerance and threshold are physical.

But does the higher concentration of nerve fibers in the face of women reflect a similar difference between men and women in other parts of their bodies? I suspect not.

Also, I've come across lots of studies that contradict each other on the question of whether men or women have lower pain thresholds or feel more pain. Thinking about this study, a thought occurs: maybe men have lower pain thresholds for some parts of the body while women do for others. Or perhaps men and women have different ratios of pain sensitivity for acute versus chronic pain.

Most plastic surgeries are done on women.

"Eighty-seven percent of the 9.2 million cosmetic surgery procedures performed last year were on women," said Dr. Wilhelmi. "The ability to minimize pain often affects a patient's perception of their results. We hope this data will give new perspective on how to better treat post-operative pain in women."

A lot of people are feeling pain.

Currently, 15 to 20 percent of the U.S. population suffers from acute pain, says Dr. Wilhelmi, while 25 to 30 percent suffer from chronic pain.

By Randall Parker 2005 October 26 10:06 AM  Brain Pain
Entry Permalink | Comments(9)
2005 October 25 Tuesday
Quantum Dots Make LEDs Full Spectrum Light Sources

The days of Edison's light bulb are numbered.

Take an LED that produces intense, blue light. Coat it with a thin layer of special microscopic beads called quantum dots. And you have what could become the successor to the venerable light bulb.

The resulting hybrid LED gives off a warm white light with a slightly yellow cast, similar to that of the incandescent lamp.

Until now quantum dots have been known primarily for their ability to produce a dozen different distinct colors of light simply by varying the size of the individual nanocrystals: a capability particularly suited to fluorescent labeling in biomedical applications. But chemists at Vanderbilt University discovered a way to make quantum dots spontaneously produce broad-spectrum white light. The report of their discovery, which happened by accident, appears in the communication "White-light Emission from Magic-Sized Cadmium Selenide Nanocrystals" published online October 18 by the Journal of the American Chemical Society.

In the last few years, LEDs (short for light emitting diodes) have begun replacing incandescent and fluorescent lights in a number of niche applications. Although these solid-state lights have been used for decades in consumer electronics, recent technological advances have allowed them to spread into areas like architectural lighting, traffic lights, flashlights and reading lights. Although they are considerably more expensive than ordinary lights, they are capable of producing about twice as much light per watt as incandescent bulbs; they last up to 50,000 hours or 50 times as long as a 60-watt bulb; and, they are very tough and hard to break. Because they are made in a fashion similar to computer chips, the cost of LEDs has been dropping steadily. The Department of Energy has estimated that LED lighting could reduce U.S. energy consumption for lighting by 29 percent by 2025, saving the nation's households about $125 million in the process.

Doesn't that amount of savings seem small? Does the United States really spend so little on electricity for incandescent lighting?
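As a back-of-envelope check, the implied scale of residential lighting spending can be estimated from a few round numbers. Every input below is my own rough assumption for the mid-2000s (household count, usage, lighting share, electricity price), not a figure from the DOE estimate:

```python
# Rough sanity check on the $125 million savings figure. All inputs are
# assumed round numbers, not data from the DOE study.
households = 110e6                 # approximate number of U.S. households
kwh_per_household_year = 10_000    # assumed average annual electricity use
lighting_fraction = 0.09           # assumed share of that going to lighting
price_per_kwh = 0.09               # assumed average residential rate, $/kWh
savings_fraction = 0.29            # DOE's projected cut in lighting energy use

lighting_spend = (households * kwh_per_household_year
                  * lighting_fraction * price_per_kwh)
savings = lighting_spend * savings_fraction
print(f"Household lighting spend: ${lighting_spend / 1e9:.1f} billion/year")
print(f"Implied savings at 29%: ${savings / 1e9:.1f} billion/year")
```

Under these assumptions the implied savings run into the billions of dollars per year, which suggests the $125 million figure must refer to something much narrower than total household lighting electricity.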

LEDs are more efficient because they do not emit in the infrared.

Of course, quantum dots, like white LEDs, have the advantage of not giving off large amounts of invisible infrared radiation unlike the light bulb. This invisible radiation produces large amounts of heat and largely accounts for the light bulb's low energy efficiency.

The breakthrough came accidentally and was the result of making quantum dots smaller than they are usually made.

Bowers works in the laboratory of Associate Professor of Chemistry Sandra Rosenthal. The accidental discovery was the result of the request of one of his coworkers, post-doctoral student and electron microscopist James McBride, who is interested in the way in which quantum dots grow. He thought that the structure of small-sized dots might provide him with new insights into the growth process, so he asked Bowers to make him a batch of small-sized quantum dots that he could study.

"I made him a batch and he came back to me and asked if I could make them any smaller," says Bowers. So he made a second batch of even smaller nanocrystals. But once again, McBride asked him for something smaller. So Bowers made a batch of the smallest quantum dots he knew how to make. It turns out that these were crystals of cadmium and selenium that contain either 33 or 34 pairs of atoms, which happens to be a "magic size" that the crystals form preferentially. As a result, the magic-sized quantum dots were relatively easy to make even though they are less than half the size of normal quantum dots.

After Bowers cleaned up the batch, he pumped a solution containing the nanocrystals into a small glass cell and illuminated it with a laser. "I was surprised when a white glow covered the table," Bowers says. "I expected the quantum dots to emit blue light, but instead they gave off a beautiful white glow."

"The exciting thing about this is that it is a nano-nanoscience phenomenon," Rosenthal comments. In the larger nanocrystals, which produce light in narrow spectral bands, the light originates in the center of the crystal. But, as the size of the crystal shrinks down to the magic size, the light emission region appears to move to the surface of the crystal and broadens out into a full spectrum.

As all manner of materials are made at smaller sizes, more interesting, unexpected, and useful behaviors will be found.

By Randall Parker 2005 October 25 10:54 AM  Energy Lighting
Entry Permalink
Lawrence Berkeley Group Develops Thin Film Photovoltaics

Yet another promising photovoltaics fabrication method:

Imagine a future in which the rooftops of residential homes and commercial buildings can be laminated with inexpensive, ultra-thin films of nano-sized semiconductors that will efficiently convert sunlight into electrical power and provide virtually all of our electricity needs. This future is a step closer to being realized, thanks to a scientific milestone achieved at the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab).

Researchers with Berkeley Lab and the University of California, Berkeley, have developed the first ultra-thin solar cells comprised entirely of inorganic nanocrystals and spin-cast from solution. These dual nanocrystal solar cells are as cheap and easy to make as solar cells made from organic polymers and offer the added advantage of being stable in air because they contain no organic materials.

Their point about stability is important. Think about how plastic and rubber (made from hydrocarbons) degrade under exposure to sunlight. The longer photovoltaics last, the better the economics become. Also, if photovoltaics could be built right into structure surfaces, rather than bolted on as separate apparatuses, even larger cost reductions become possible.

"Our colloidal inorganic nanocrystals share all of the primary advantages of organics -- scalable and controlled synthesis, an ability to be processed in solution, and a decreased sensitivity to substitutional doping - while retaining the broadband absorption and superior transport properties of traditional photovoltaic semiconductors," said Ilan Gur, a researcher in Berkeley Lab's Materials Sciences Division and fourth-year graduate student in UC Berkeley's Department of Materials Science and Engineering.

Gur is the principal author of a paper appearing in the October 21 issue of the journal Science that announces this new development. He is a doctoral candidate in the research group of Paul Alivisatos, director of Berkeley Lab's Materials Sciences Division, and the Chancellor's Professor of Chemistry and Materials Science at UC Berkeley. Alivisatos is a leading authority on nanocrystals and a co-author of the Science paper. Other co-authors are Berkeley Lab's Neil A. Fromer and UC Berkeley's Michael Geier.

While the initial conversion efficiency is still low, the process lends itself to scaling up at low cost should they find ways to boost that efficiency.

In this paper, the researchers describe a technique whereby rod-shaped nanometer-sized crystals of two semiconductors, cadmium-selenide (CdSe) and cadmium-telluride (CdTe), were synthesized separately and then dissolved in solution and spin-cast onto a conductive glass substrate. The resulting films, which were about 1,000 times thinner than a human hair, displayed efficiencies for converting sunlight to electricity of about 3 percent. This is comparable to the conversion efficiencies of the best organic solar cells, but still substantially lower than conventional silicon solar cell thin films.
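To put the 3 percent figure in perspective, here is a simple area calculation. The 1,000 watts per square meter insolation value is the standard full-sun benchmark, and the 15 percent silicon figure is a typical round number I am assuming, not one taken from the paper:

```python
# Cell area needed to generate one kilowatt at peak sunlight, for the
# reported 3% nanocrystal film versus an assumed 15% silicon cell.
PEAK_INSOLATION = 1000.0  # W per square meter, standard full-sun benchmark

def area_per_kw(efficiency):
    """Square meters of cell needed for 1 kW of output at peak insolation."""
    return 1000.0 / (PEAK_INSOLATION * efficiency)

print(f"3% nanocrystal film: {area_per_kw(0.03):.1f} m^2 per kW")
print(f"15% silicon cell:    {area_per_kw(0.15):.1f} m^2 per kW")
```

A fivefold efficiency gap means five times the roof area for the same power, which is why boosting conversion efficiency matters even when the material itself is cheap.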

"We obviously still have a long way to go in terms of energy conversion efficiency," said Gur, "but our dual nanocrystal solar cells are ultra-thin and solution-processed, which means they retain the cost-reduction potential that has made organic cells so attractive vis-a-vis their conventional semiconductor counterparts."

Silicon crystals that are used in manufacturing current silicon photovoltaic cells represent a large fraction of total photovoltaics costs. Approaches that avoid the need to make lots of relatively thick crystals are probably essential for driving down the cost of photovoltaics far enough to make photovoltaic installations ubiquitous. So any new photovoltaic fabrication method that avoids the use of silicon crystals warrants notice.

Another advantage of this approach is low weight. Thin film solar cells with high durability and low weight could potentially get coated onto electric and hybrid car surfaces to recharge batteries.

Compare this report to my previous post UCLA Team Cuts Photovoltaics Cost With Plastics. Note the 15 to 20 year life expectancy for the UCLA approach. The Lawrence Berkeley material would probably last longer. But which group can boost conversion efficiency the most and the soonest?

By Randall Parker 2005 October 25 09:31 AM  Energy Wind
Entry Permalink | Comments(5)
2005 October 23 Sunday
IVF Embryo Genetic Defect Rate For Young Women Has Ethical Implications

Studies using Pre-Implantation Genetic Diagnosis (PGD or PGID) on embryos created using In Vitro Fertilization (IVF) found that most embryos formed from eggs donated by young healthy women have errors in chromosome count.

Paulette Browne, at the Shady Grove Center for Preimplantation Genetics in Rockville, Maryland, US, and her colleagues, examined 275 embryos created from the donated eggs of women aged between 21 and 31. All the donors were ostensibly healthy. The researchers removed cells three days after conception and examined them for aneuploidies. They found that 137 – half – of the embryos had at least one error.

Aneuploidy is the state of having too many or too few chromosomes. So the embryo cells had either extra or missing copies in some chromosome pairs. Note that aneuploidy is just one type of genetic abnormality, and testing for aneuploidy by itself leads to an underestimate of the incidence of genetic abnormalities.

Another study found a high rate of chromosome damage.

In research presented at the American Society for Reproductive Medicine in Montreal yesterday, Jeffrey Nelson of the Huntingdon Reproductive Centre in California used a technique called preimplantation genetic diagnosis (PGD) to screen 289 embryos created from healthy egg donors, all of whom were under 30. He found that 42% had damaged chromosomes, the strands of DNA that together hold the entire complement of human genes. The extent of damage ranged from 28% to as high as 83% in some women. "We had always assumed that embryos created from eggs donated by younger women would not have these defects," Dr Nelson said. "But just the fact that we are seeing this high rate of abnormality suggests that we should be using [PGD] more."

The spelling for that center is really "Huntington Reproductive Center".

A third study by Peter Nagy of Reproductive Biology Associates in Atlanta found similar results.

At Reproductive Biology Associates in Atlanta, GA, researchers investigated the differing incidence of aneuploidy in young infertility patients as compared with older patients and found that the frequency of chromosomally abnormal embryos is unexpectedly high in those of young reproductive age. In a prospective on-going study, 36 infertile patients (average age 32.5, all under 35), with no prior treatment and representing all diagnoses of infertility proportionally, had IVF with PGD. Their PGD results were compared with a control population of women over 38 (average age 40.7) who were undergoing IVF at the same time. Young patients in the study population had an average of 17.6 eggs retrieved, of which 70% fertilized; the older control patients had 13.5 eggs on average, of which 69% fertilized. The younger women in total had 103 embryos identified as normal and 198 abnormal embryos. The older women had, as expected, a higher proportion of abnormal embryos: 323 abnormal to 116 normal. Of the younger patients, 56% became pregnant, while 33% of the older patients became pregnant.
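The abnormality rates implied by those counts can be checked directly. The numbers below are exactly the figures quoted in the studies above; nothing new is added:

```python
# Abnormal-embryo rates computed from the counts reported in the studies.
studies = {
    "Browne donor eggs (ages 21-31)": (137, 275),        # abnormal, total
    "Nagy young patients (avg 32.5)": (198, 198 + 103),
    "Nagy older controls (avg 40.7)": (323, 323 + 116),
}
for name, (abnormal, total) in studies.items():
    print(f"{name}: {abnormal}/{total} = {abnormal / total:.0%} abnormal")
```

Roughly half to two thirds of embryos from the young women come out abnormal, against nearly three quarters in the older controls.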

Eric Surrey, MD, President of SART, remarked, “PGD may become a very useful technique for maximizing the chances of success of a particular cycle of IVF. And these results do shed light on some of the reasons why a particular young donor or patient might produce many eggs, which fertilize and develop as embryos of normal appearance, but do not result in pregnancy. However, PGD, especially using a single cell, is not fail-safe. Mosaicism, the presence of normal and abnormal cells in the same embryo can confound the results of single-cell PGD.”

Nagy says that the extent of genetic abnormalities was even greater than his test showed.

These findings overturn long-held assumptions that reproductive problems are primarily age-related, Nagy said.

"This is new information," he said, adding that the genetic-defect rate could turn out to be even higher once pre-implantation genetic diagnosis extends to the entire genome.

"We tested 11 chromosomes, not the whole genome," he said.

Even if he'd tested all the chromosomes, that would only have detected larger scale abnormalities. Smaller scale genetic damage would not show up with current testing techniques.

Most pregnancies fail before women even know they are pregnant.

Researchers also suggested the rate of genetic problems seen in IVF embryos mirrors the real world.

There's a "baseline of abnormal" in the general population that increases with age, Nelson said.

An estimated 60 to 70 per cent of pregnancies are lost before a woman recognizes she'd been pregnant, he added.

The human reproductive system produces a lot of genetically damaged embryos. Think about that. Two thirds of pregnancies end before women even know they are pregnant. Some additional percentage miscarry after a woman knows she's pregnant. Consider these facts in light of religious beliefs held by some that at the moment of conception a spirit is somehow attached to the fertilized egg. Does God attach spirits to all these fertilized eggs that are doomed to never attach to the uterus or that initially attach but fail due to genetic damage?

Consider this result in light of the recent work by Rudolf Jaenisch and Alexander Meissner at MIT's Whitehead Institute to create mouse embryos that cannot grow a placenta. This results in embryos that cannot develop very far. Their goal is to find ways to develop embryonic stem cells that will not elicit as many objections from some religious folks. Nature (or God if you prefer) already generates lots of embryos that cannot develop into humans. Likely most embryos created naturally lack that capacity. Doesn't that fact make the Jaenisch and Meissner approach more ethically acceptable?

Genetic deactivation and/or genetic deletion using genetic engineering techniques essentially mirrors what happens naturally. What happens naturally, if done by human will, would be considered highly morally objectionable by some. Natural selection produced an outcome where the most efficient way to make viable humans is to allow large numbers of fertilizations to fail to develop all the way to birth.

If an embryo lacks genes needed to develop a complete human but has most of the genes needed to produce a human it is not even a potential human. It lacks the potential to become a human. Should we consider a thing that lacks even the potential to become a human as possessing the rights of a human?

By Randall Parker 2005 October 23 11:47 AM  Bioethics Reproduction
Entry Permalink | Comments(6)
2005 October 22 Saturday
Genetic Analysis Shows Signs Of Selective Pressure In Human Evolution

Cornell researcher Carlos Bustamante and associates have found signs of Darwinian selective pressure in recent human evolution.

ITHACA, N.Y. -- The most detailed analysis to date of how humans differ from one another at the DNA level shows strong evidence that natural selection has shaped the recent evolution of our species, according to researchers from Cornell University, Celera Genomics and Celera Diagnostics.

In a study published in the Oct. 20 issue of the journal Nature, Cornell scientists analyzed 11,624 genes, comparing how genes vary not only among 39 humans but also between the humans and a chimpanzee, whose DNA is 99 percent identical to humans.

The comparisons within and between species suggest that about 9 percent of genes that show some variability within humans or differences between humans and chimpanzees have evolved too rapidly to be explained simply by chance. The study suggests that positive Darwinian natural selection -- in which some forms of a gene are favored because they increase the probability of survival or reproduction -- is responsible for the increased rate of evolution. Since genes are blueprints for proteins, positive selection causes changes in the amino acid sequence of the protein for which the gene codes.
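The kind of test behind this finding can be sketched in a few lines. The study compared variation within humans to divergence from the chimp, in the spirit of a McDonald-Kreitman test; the counts below are invented purely for illustration, and the study's actual statistics are far more sophisticated:

```python
# Toy McDonald-Kreitman-style comparison. Under neutral evolution, the ratio
# of amino-acid-changing (nonsynonymous) to silent (synonymous) changes should
# be about the same between species as within a species. An excess of
# nonsynonymous divergence between species points to positive selection.
def ns_ratio(nonsynonymous, synonymous):
    return nonsynonymous / synonymous

# Invented counts for one hypothetical gene:
divergence = ns_ratio(nonsynonymous=20, synonymous=10)    # human vs. chimp
polymorphism = ns_ratio(nonsynonymous=5, synonymous=10)   # within humans

if divergence > polymorphism:
    print("Excess nonsynonymous divergence: consistent with positive selection")
else:
    print("No excess: consistent with neutral or purifying evolution")
```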

I wish the press release on this study provided a bit more detail. How recent is "recent" in this context? For any of the genes are there signs of selective pressure just within the last few thousand years? Also, how many of the genes examined have functions specific to the nervous system and what percentage of them show recent signs of selective pressure?

Cheaper DNA sequencing technologies will eventually enable studies with much larger groups of people, larger numbers of genes, and more types of genetic difference. My guess is this study looked mainly at point mutations and not large copy variations (see the second report below about large copy variations). As large copy variations become more easily testable more signs of selective pressure will be found.

Several categories of genes underwent selective pressure to create modern humans.

"Our study suggests that natural selection has played an important role in patterning the human genome," said the paper's lead author, Carlos Bustamante, assistant professor of biological statistics and computational biology at Cornell.

The Cornell/Celera team found that genes involved in immune function, sperm and egg production, sensory perception and transcription factors (proteins that control which genes are turned on or off) have been particularly affected by positive selection and show rapid evolution in the last 5 million years, when humans shared a common ancestor with chimps.

13 percent of the genes examined appear to be under negative selection against variations that are harmful.

Likewise, the researchers found that approximately 13 percent of the genes that may vary show evidence of slightly deleterious or harmful mutations in human populations; these include genes involved in determining the basic structure of cells and muscles as well as genes that control traffic in and out of the cell. These mutations are subject to weak negative selection, according to the study. In general, negative selection eliminates from the population very harmful changes to proteins that kill or stop reproduction. But mutations that have led to slightly deleterious versions of the gene -- mutations that may cause disease or only slightly reduce the average number of children left by those that carried the mutation -- can by chance become quite common in the population.

Mildly harmful mutations take a long time to get selected out. Identification of purely harmful mutations would be very useful in offspring genetic engineering to produce human offspring with far fewer purely harmful genetic mutations. Such humans will function better mentally and physically in a large assortment of ways.

All of us now living were born with lots of harmful mutations. But we are not permanently stuck with old flawed genetic software. In a couple of decades we'll be able to get replacement organs grown from stem cells genetically engineered to remove harmful mutations. We'll also be able to get stem cell therapies to upgrade our bodies with better genetic software.

Read this excellent article in Nature which surveys recent discoveries of larger amounts of genetic variation in humans than has previously been predicted.

How common, exactly? Last July, Wigler's group reported that it had looked at 20 normal individuals and found 221 places in the genome where those people had different copy numbers of stretches of DNA. Some of these copy-number changes showed up in more than one person, and so qualify as 'polymorphisms' — shorthand for particular spots in the genome that regularly differ between individuals. In the Book of Life analogy, these polymorphisms represent sections of text where certain paragraphs are repeated different numbers of times in different individuals.

About 76 of the variations Wigler's team found were polymorphisms, and each person had about 11 of them in his or her genome. Soon after, Lee and Scherer reported that in a survey of 55 people they had found 255 copy-number variants, 102 of which were polymorphisms.

Large copy variations can produce large effects. Picture a gene for making a neuroreceptor. An extra copy of it could increase the concentration of that receptor on the end of a neuron. Or the replication of a gene which makes a protein which stimulates neural stem cells to divide could cause more rapid neural stem cell growth and hence larger brains in those who carry extra copies.

In addition to the large copy variations, researchers have since found large scale rearrangements, deletions, and insertions. Together all these variations are referred to as structural variations. The number of structural variations being discovered is so large that the claim of 99% shared genetic sequence between different humans may turn out to be too high.

Genome researchers now have a catch-all phrase for the vast array of rearrangements — including copy-number polymorphisms, inversions, deletions and duplications — that occur normally in the human genome. They call it structural variation, and have described at least 800 individual variants that, in total, account for about 3.5% of the human genome. And the sheer number of variants seems likely to catch up with the number of known single nucleotide polymorphisms — the single-letter 'typos' in the Book of Life. That makes structural variation a potentially major source of diversity. It is even possible that we're not all 99.9% similar, as the Human Genome Project predicted.

The increasing discovery of genetic dissimilarity between humans is evidence of selective pressure to adjust to local environments as humans spread out and colonized the globe.

Also see my previous post "Brain Gene Allele Frequences Show Brain Still Evolving".

By Randall Parker 2005 October 22 08:45 PM  Trends, Human Evolution
Entry Permalink | Comments(6)
DNA And Cell Based Vaccines In Development Against Pandemics

British researchers call for accelerated development of DNA vaccines in case an H5N1 influenza pandemic breaks out in humans.

Researchers scrambling to combat a virulent form of bird flu that could mutate into a form easily spread among humans should consider developing vaccines based on DNA, according to British biochemical engineers. DNA vaccines, they say, can be produced more rapidly than conventional vaccines and could possibly save thousands of lives if a global influenza outbreak occurs.

A DNA-based vaccine could be a potent weapon against this emerging threat, particularly if enough conventional vaccine isn't available, according to Peter Dunnill, DSc., and his colleagues at University College London. However, they caution that any DNA vaccine should only be used as needed to slow the spread of the disease because the technique is largely untested in humans. The analysis appears in the November-December issue of the journal Biotechnology Progress, a co-publication of the American Chemical Society and the American Institute of Chemical Engineers.

The avian virus, H5N1, has spread among birds throughout Southeast Asia and has been recently detected in Eastern Europe. The virus has killed more than 60 people in Asia since 2003 and forced the slaughter of millions of birds. There are no confirmed cases of human-to-human transmission of this flu, but that could change as the virus continues to mutate, Dunnill says.

If that occurs, current production facilities are unlikely to meet global demands for conventional vaccines in time to avert a pandemic, Dunnill says. But it might be possible to quickly produce a DNA vaccine by adapting the manufacturing processes of selected biopharmaceutical and antibiotic plants in countries such as the United States, China and India.

Current vaccine production facilities couldn't meet demand fast enough to avert a pandemic. If a pandemic happens, current production capacity will not be able to make enough vaccine even for the industrialized countries for a year or two. For the whole world, production of sufficient vaccine might take much longer. Just how big a hole we'd be in would depend on the size of the antigen doses needed in a vaccine against a pandemic influenza strain. But the egg-based method currently used for making influenza vaccine probably couldn't yield enough vaccine for the whole world for 2 or 3 years. Hence the need for faster and more easily scalable methods for making vaccine.

"A DNA vaccine is not a panacea, however it could be useful if the situation gets out of hand," Dunnill says. "But if we're going to try it, we need to move. You can't expect to walk into a production facility, hand over the instructions, and expect them to make it on the spot. It's going to take some weeks, and we really don't know how much time we have."

A DNA vaccine could be produced in as little as two or three weeks, Dunnill says. To do it, scientists would create a "loop" of DNA that contains the construction plans for a protein on the outer surface of the H5N1 virus. When that DNA is injected into cells, it would quickly reproduce the protein and trigger immunization in much the same way as a conventional vaccine.

In contrast, producing conventional vaccines from viruses incubated in fertilized eggs can take up to six months, which is too long to effectively prevent an influenza pandemic, Dunnill says.

Although no commercial influenza DNA vaccine is currently available, these vaccines have worked well in animals. However, human trials are still in the early stages, so the safety and efficacy of these vaccines aren't fully established in people. But these trials could be accelerated, Dunnill says, particularly if the H5N1 virus eventually causes large numbers of human deaths and outpaces the supply of conventional vaccine. In the worst case scenario, he suggests, using a DNA vaccine could be a "stop-gap" measure until enough conventional vaccine is available to corral the pandemic.

Given a working DNA vaccine, a group at Cardiff University has developed a painless and easy delivery mechanism.

Researchers at Cardiff University have discovered a means of delivering DNA directly into skin cells, allowing it to be spread efficiently throughout the body.

The breakthrough could lead to mass immunisation campaigns being carried out by post. Patients would be able to administer the vaccine themselves by pressing a silicon chip embedded with 400 microscopic needles onto the back of their hand for a few seconds.

A painless small silicon chip placed on the skin would deliver the DNA vaccine into surface skin cells, where the DNA would be expressed to make antigen against which the body's immune system would raise antibodies.

The new micro-needles are long enough to penetrate the skin but not to reach pain receptors.

They were designed to introduce a DNA vaccination directly into skin cells.

Vical has just got a DARPA contract to investigate methods for rapidly producing large quantities of DNA vaccine.

SAN DIEGO, Sept. 22 /PRNewswire-FirstCall/ -- Vical Incorporated (Nasdaq: VICL) today announced that it has been awarded funding for a one-year, $0.5 million project for the Defense Advanced Research Projects Agency (DARPA), of the U.S. Department of Defense. The award will fund feasibility studies of a new approach for rapidly manufacturing large quantities of DNA vaccines.

Conventional vaccine development and manufacturing methods require years of effort after the emergence of a new pathogen for production of even a single dose for testing. Current DNA vaccine development and manufacturing processes allow initial production of vaccines in as little as three months after selection of a gene sequence associated with a pathogen, but quantities are limited by the batch-processing capacity of available manufacturing equipment. Vical intends to use the funding to evaluate new methods that would dramatically reduce the manufacturing time and increase yields, allowing production of millions of doses in a matter of weeks.

French vaccine maker Sanofi-Aventis received a $97 million contract from the US Department of Health and Human Services in April 2005 to develop a non-egg, cell-based method of making vaccine in infected cells in large stainless steel vats. A number of other companies are also pursuing cell-based and DNA-based methods for rapidly scaling up vaccine production.

Sanofi is the first company to be awarded a U.S. government contract for developing a new method of vaccine production, but it is not the only drug maker experimenting with alternate methods. Crucell, which works on cell-based and DNA-based methods of vaccine production, has also licensed its technology to British drug maker GlaxoSmithKline and Swiss drug maker Roche, according to Bernstein analyst Gbola Amusa, who projected that a non-egg production method could be on the market by 2008. PowderMed, a privately-held British company, is also developing DNA-based methods for vaccine production, while Philadelphia-based Hemispherx Biopharma is working on a cell-based method.

Acceleration of research and development of more easily scalable and rapid methods for making vaccine ought to be the top priority for preparations against an H5N1 avian flu pandemic. The technology developed will be useful for any flu pandemic and also for producing vaccines against a large range of other diseases.

By Randall Parker 2005 October 22 12:46 AM  Pandemic Vaccines
Entry Permalink | Comments(2)
2005 October 21 Friday
Half Of Pre-Teens With Behavior Problems Become Criminals

A Swedish research group claims that half of all children who seriously violate norms of behavior go on to become criminals.

The future is bleak for children whose behavior seriously goes against the norm at a tender age. Early and long-term interventions make all the difference. This is shown in a research survey presented by IMS, the Institute for Evidence-Based Social Work Practice at the Swedish National Board of Health and Welfare together with the National Board of Institutional Care.

The behavior of such children is often more serious and aggressive than that of children who do not violate the norm until they are teenagers. Moreover, it more often continues into adulthood. Current research shows that as many as every other boy and one in five girls in this group will exhibit criminal behavior as a grown-up.

Suppose the ability to predict future criminal behavior gets further refined with genetic tests, brain scans, and other measures. Suppose that some portion of 12-year-olds can be identified as having a 95+% chance of becoming lifelong criminals (and I think it inevitable that we will some day have the means to make predictions that accurate for some fraction of society). Then suppose that drugs are found that, if taken by those 12-year-olds, would bump their brain development in a direction that cuts their odds of becoming criminals by two thirds or three quarters. Would you favor or disfavor mandatory preventative treatment of all 12-year-olds who can be shown to have very high odds of becoming criminals?

Greater abilities to predict future behavior and to modify development to alter future behavior will inevitably bring up the question of when to allow or require use of methods to alter brain development and behavioral tendencies. I can't predict exactly when such capabilities will be developed. But I feel quite confident that many of us alive right now will live to see the development of such capabilities. I also expect the capabilities to become widely popular once they are available. The popularity of Ritalin demonstrates that technology for behavioral modification of kids isn't going to face serious opposition.

By Randall Parker 2005 October 21 02:26 PM  Bioethics Debate
Entry Permalink | Comments(15)
Space Exploration Proposal: Edible Moon Buggy

To save weight on a future moon mission some students are exploring ways to construct an edible moon buggy.

The students' task is to make an edible moon buggy. Eating your transportation is probably not always a good idea, admits project leader Walter Smith, at Ball State University in Muncie, Indiana, US. Neither is devouring anything coated in moon dust for that matter, but for college and middle school students aged 11 or 12, designing edible model rovers serves as a good learning tool, he says.

In the early part of a moon mission astronauts could do their travelling with an edible lunar rover. But toward the end of the mission they would shift toward working around their base and start eating parts of their rover.

A better way to save weight on food seems obvious though: grow the food while on the moon. Sunlight is not a problem, though filters against the UV bands might be needed. Genetically engineer algae or other plant species to grow well under lunar conditions under filtered glass. Water would be needed of course. But genetically engineered organisms could process the human wastes of astronauts to recover the water and grow food.

If soil containing substantial amounts of oxygen could be found, then only hydrogen would need to be transported to the moon. Research into hydrogen storage for earthbound energy applications may eventually produce better methods of hydrogen transport.

Oxygen from rocks for plants and for human breathing probably won't be a problem. The Hubble Space Telescope recently discovered areas of the moon with rocks rich in oxygen.

The Hubble Space Telescope has detected oxygen in moon minerals that future explorers could use for breathing, to make electricity, and for rocket fuel. Scientists say the findings will help them determine whether the amounts available in the lunar soil will be enough for future astronauts to use.

The orbiting Hubble observatory is usually aimed at extremely distant areas of the universe. But for a few days in August, the U.S. space agency, NASA, pointed it at the moon to look at the landing sites of the Apollo 15 and 17 missions of the early 1970s and a 45-kilometer wide impact crater on a plateau never visited by astronauts.

The Apollo missions had returned rock samples containing an oxygen-bearing mineral called ilmenite. Planetary scientist Mark Robinson of Northwestern University near Chicago says planners of future moon missions want to know if the plateau region contains an equally rich amount of ilmenite.

"All the minerals you find on the moon have oxygen in them, but ilmenite is special in the sense that it is relatively easy to break it apart to get to the oxygen," said Mr. Robinson.

By Randall Parker 2005 October 21 01:06 PM  Space Exploration
Entry Permalink | Comments(1)
Governments Waste Energy

New Haven Connecticut schools have started saving $600,000 per year by not heating and cooling schools when no one is in the schools.

NEW HAVEN, CONN. – They're not rocket scientists. But conservation consultants John Pierson and Parthiban Mathavan were able to save New Haven Public Schools $1.1 million in energy costs last fiscal year.

How? By peeking out the window and deciding that a mild winter morning does not require full-blast heat at the 50 schools they monitor.

"We are always dreaming up ways to be more efficient," says Mr. Pierson. Typically, heat or air-conditioning was on 24/7 - even if no one was in school. Stopping that saved $600,000 the first year. "A lot of it is common sense."

So how many millions or tens of millions of dollars has New Haven wasted cooling and heating buildings for years or decades while the buildings were empty? How many other city, county, and state governments are still doing this even today?

Some conservation is very easy to do but governments lack the incentive to make even easy decisions to save costs.

"One of the big problems I see with municipalities is they get used to paying the bills, grumble about the price, and don't do a lot to investigate cost and consumption," Melchiori says. "To me, we are not employing any technology that anyone else couldn't employ.... It is a lot of common-sense application of existing possibilities; we just actually applied it."

A state government could do its citizens a favor by passing a law requiring all state, county, and local governments to collect basic information on energy costs and publish it on the web in a state government database. Imagine the database contained energy costs per month along with the energy types used, quantities of each energy type (e.g. kilowatt-hours of electricity, gallons of heating fuel, thousands of cubic feet of natural gas, and so on), and the square feet of space in each building. Then concerned citizens could search the database, look at energy usage per building, compare similar buildings, and spot likely sources of waste. Such a database would be even more helpful if it included temperature information per town per day so that outside temperature could be adjusted for when analyzing costs.
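As a sketch of how citizens might mine such a database, suppose a hypothetical table holds one record per building per month. All field names and numbers below are illustrative, not from any real state database; the point is that normalizing energy use by floor area and by heating degree days makes dissimilar buildings roughly comparable:

```python
# Hypothetical schema for the proposed public energy database.
# Field names and figures are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class EnergyRecord:
    building: str
    month: str
    kwh: float           # electricity used that month
    square_feet: float   # floor area of the building
    degree_days: float   # heating degree days, to adjust for weather

def energy_intensity(records):
    """kWh per square foot per heating degree day, by building.

    Dividing by floor area and degree days lets buildings in
    different towns (or different months) be compared fairly."""
    totals = {}
    for r in records:
        kwh, norm = totals.get(r.building, (0.0, 0.0))
        totals[r.building] = (kwh + r.kwh,
                              norm + r.square_feet * r.degree_days)
    return {b: kwh / norm for b, (kwh, norm) in totals.items() if norm > 0}

records = [
    EnergyRecord("Elm St School", "2005-01", 90_000, 60_000, 900),
    EnergyRecord("Oak St School", "2005-01", 40_000, 60_000, 900),
]
# Elm St burns more than twice the energy of a similar building in
# the same weather -- exactly the kind of outlier a journalist or
# concerned citizen could spot and ask about.
print(energy_intensity(records))
```

A real implementation would pull daily temperatures per town, as suggested above, rather than taking degree days as a given column.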

The knowledge that journalists and regular citizens were going to be looking through their energy usage data would give quite a few elected officials the incentive to find ways to eliminate waste and use cheaper energy sources. Transparency on costs will improve efficiency, lower costs, and improve the quality of governments.

By Randall Parker 2005 October 21 09:33 AM  Energy Policy
Entry Permalink | Comments(11)
2005 October 19 Wednesday
New Method Grows Collagen For Surgery Orders Of Magnitude Faster

University College London researcher Robert Brown and his team have stumbled upon a cell-free way to rapidly produce collagen protein polymers for surgical repair of collagen damage.

A research team from the UCL Tissue Repair and Engineering Centre (TREC), the UCL Eastman Dental Institute and the UCL Institute of Orthopaedics, have pioneered a novel technique for engineering tissues, which has the capacity to greatly reduce the time taken to fabricate implantable human tissue.

Tissue engineering is a method whereby the patient has cells extracted from his or her body and grown under laboratory conditions for a myriad of applications such as cartilage, skin grafts, heart valves and tendons, without the risk of rejection, infection or the ethical dilemma involved in transplanting a donated organ.

Current tissue engineering methods depend on the ability of the cultured cells themselves to grow new tissue around a cell scaffold, which is slow, expensive and has limited success. Professor Brown’s process is cell-independent, controlled engineering of scaffolds by rapidly removing fluid from hyper-hydrated collagen gels.

The fluid is removed by employing plastic compression, a process that the team found produces dense, cellular, mechanically strong collagen structures that can be controlled at nano and micro scales and which mimic biochemical processes.

Principal investigator Professor Robert Brown (UCL TREC) said: “The fluid removal dramatically shrinks the collagen by well over 100 times its original volume, which provides the ability to introduce controlled mechanical properties, and tissue-like microlayering, without cell participation. Crucially, this takes minutes instead of the conventional days and weeks without substantial harm to the embedded cells. The rapidity and biomimetic potential of the plastic compression fabrication process opens a new route for the production of biomaterials and patient-customised tissues and represents a new concept in ‘engineered’ tissues.”

A paper describing the process in detail will be published in the October edition of the ‘Advanced Functional Materials’ journal.

The method takes 35 minutes instead of weeks and produces collagen twice as strong as that made using existing methods.

Cell-seeded collagen gel typically takes weeks to develop into weak, early-stage tissues. In the UCL study, published today in Advanced Functional Materials online, the team sucked most of the water out of the structure in a procedure known as plastic compression rather than waiting for the cells to do their job.

The result - following huge-scale shrinkage by a factor of at least 100 - was a simple collagen-based tissue created in 35 minutes. The shrinkage – a new approach to microfabrication – also gave the tissue an average break strength of 0.6 MPa (megapascals) compared with tissues conventionally grown over 1 to 12 weeks of around 0.3 MPa. Given that collagen sheets are fragile – in the study they were around 30 micrometres thick – the team rolled them up like swiss rolls to produce 3D rods which were easier to handle and manipulate.

Tissue engineering typically involves placing cells on a polymer scaffold and allowing them to grow into the desired tissues, which can then be used for surgical implantation. The process can take days or weeks and is difficult to control, cells may also fail to develop into the target material and at best, produce a tissue of less than 1 MPa break strength compared with natural collagen which can be up to 100 MPa strong in a human tendon.

Collagen makes up a quarter of the body.

Collagen is a protein which acts as a structural support for skin, bone, tendon, ligament, cartilage, blood vessels and nerves and as such it makes up 25 per cent of the human body.

Collagen accumulates damage with age and injury just like the rest of the body. So the ability to produce collagen quickly for implant is another step down the road toward full body rejuvenation. However, since collagen's arrangement is microscopic in structure and spread across so many locations in the body, surgery is not a practical method of replacing all or even most worn collagen. A lot of the damaged collagen in an aged body will need to be replaced by cells sent into the body as cell therapy programmed to do collagen repair. Still, this latest discovery has uses for repair of larger pieces of collagen.

The discovery was made accidentally.

Professor Robert Brown, of the UCL Institute of Orthopaedics, says: “We stumbled across this discovery while trying to measure the compression properties of collagen gel. Our method offers a simple and controllable means of quickly engineering tissue structures. The next stage is to test whether this method could help repair injured tissues.

“The speed and control it offers means that our method could one day be used to produce implant tissue at the bedside or in the operating theatre. We have a proof of concept grant from UCL BioMedica to produce a semi-automatic device for implant production. Ultimately, the goal is to design a rapid, inexpensive, automatic process for creating strong tissues which could supply hospital surgical units with a tool kit of spare parts for reconstructive surgery.”

This result fits a larger pattern of bioscientific and biotechnological advances where smarter manipulation of biological molecules allows processes to be speeded up by orders of magnitude.

By Randall Parker 2005 October 19 11:00 AM  Biotech Tissue Engineering
Entry Permalink | Comments(1)
Smoking Accelerates Cognitive Decline

If the threat of cancer, stroke, heart disease, and other old age diseases isn't enough to scare smokers into quitting, how about the threat of becoming dumber as the toxins in cigarette smoke damage the brain?

ANN ARBOR, MI – Smokers often say that smoking a cigarette helps them concentrate and feel more alert. But years of tobacco use may have the opposite effect, dimming the speed and accuracy of a person's thinking ability and bringing down their IQ, according to a new study led by University of Michigan researchers.

The association between long-term smoking and diminished mental proficiency in 172 alcoholic and non-alcoholic men was a surprising finding from a study that set out to examine alcoholism's long-term effect on the brain and thinking skills.

While the researchers confirmed previous findings that alcoholism is associated with thinking problems and lower IQ, their analysis also revealed that long-term smoking is too. The effect on memory, problem-solving and IQ was most pronounced among those who had smoked for years. Among the alcoholic men, smoking was associated with diminished thinking ability even after alcohol and drug use were accounted for.

The findings are the first to suggest a direct relationship between smoking and neurocognitive function among men with alcoholism. And, the results suggest that smoking is associated with diminished thinking ability even among men without alcohol problems.

Avoid neurotoxins. Your brain is your most valuable asset.

Those who think they will live long enough to enjoy the benefits of rejuvenation therapies ought to keep in mind that the brain is going to be the organ that is hardest to rejuvenate. The development of technologies for the growth of replacement organs will allow lots of old parts to get replaced with young parts. But you obviously can not replace your brain without replacing your identity with a different identity. Technologies for brain rejuvenation will come more slowly. Even stem cell therapies that replace dead neurons are far less than ideal because dead neurons take memories and personality elements with them. Take good care of your brain. Avoid toxins and avoid sports that might give you a concussion. You need it to last a long time.

By Randall Parker 2005 October 19 09:58 AM  Aging Studies
Entry Permalink | Comments(12)
2005 October 17 Monday
New Methods To Create Embryonic Stem Cells Sidestep Some Ethical Objections

Scientists at the MIT Whitehead Institute demonstrated in mouse cells that altered nuclear transfer (ANT), in which genes are deactivated in the donor nucleus, allows creation of embryonic stem cells from embryos that could never develop into a full organism.

Some senators unhappy with those proposals have suggested that 'alternative' methods of deriving the cells, which don't require the destruction of viable embryos, could help to bridge the ethical divide (see Nature 436, 309; 2005).

Until now, such methods have been purely theoretical, but in work published online by Nature this week, two teams report their successful use in mice. Rudolf Jaenisch and Alexander Meissner of the Massachusetts Institute of Technology describe a variant of therapeutic cloning called altered nuclear transfer (ANT), in which a gene in the patient's donated cell is switched off before the nucleus is transferred into a fertilized egg. The resulting egg grows into a normal ball of cells called a blastocyst from which ES cells can be derived, but the deactivated gene means that the ball lacks the ability to implant in a uterus and so develop into a baby (A. Meissner and R. Jaenisch Nature doi:10.1038/nature04257; 2005).

The deactivation of a single gene makes it impossible to initiate a pregnancy.

Jaenisch and Alexander Meissner, a graduate student in his lab, focused on a gene called Cdx2, which enables an embryo to grow a placenta. In order to create a blastocyst that cannot implant in a uterus, the researchers disabled Cdx2 in mouse cells.

They accomplished this with a technique called RNA interference, or RNAi. Here, short interfering RNA (siRNA) molecules are designed to target an individual gene and disrupt its ability to produce protein. In effect, the gene is shut off. Jaenisch and Meissner designed a particular form of siRNA that shut off this gene in the donor nucleus and then incorporated itself into all the cells comprising the blastocyst. As a result, all of the resulting mouse blastocysts were incapable of implantation.

However, once the stem cells had been extracted from the blastocysts, Cdx2 was still disabled in each of these new cells, something that needed to be repaired in order for these cells to be useful. To correct this, Meissner deleted the siRNA molecule by transferring a plasmid into each cell. (A plasmid is a unit of DNA that can replicate in a cell apart from the nucleus. Plasmids are usually found in bacteria, and they are a staple for recombinant DNA techniques.) The stem cells resulting from this procedure proved to be just as robust and versatile as stem cells procured in the more traditional fashion.

"The success of this procedure in no way precludes the need to pursue all forms of human embryonic stem cell research," says Jaenisch, who is also a professor of biology at MIT. "Human embryonic stem cells are extraordinarily complicated. If we are ever to realize their therapeutic potential, we must use all known tools and techniques in order to explore the mechanisms that give these cells such startling characteristics."

ANT, Jaenisch emphasizes, is a modification, but not an alternative, to nuclear transfer, since the approach requires additional manipulations of the donor cells. He hopes that this modification may help resolve some of the issues surrounding work with embryonic stem cells and allow federal funding.

I like Jaenisch's approach because it will work with any donor nucleus. So a person could have their own nuclear DNA used to create a cell line from which to make stem cell therapies and replacement organs matching their own DNA.

Some will object to Jaenisch's approach as essentially consigning a potential human to death. But as more genes involved in development are identified, additional genes could get turned off in an extended version of this approach, making the resulting embryo look ever less like something that would develop into a human. Imagine, for example, that to grow kidneys one turned off all the developmental genes for a head so that the embryo would only have the potential to grow into a few chest organs.

Robert Lanza of Advanced Cell Technology in Worcester, Massachusetts and colleagues took a different but very straightforward approach: they removed a single cell from an early-stage 8-cell embryo, called a morula, and then exposed that cell to existing embryonic stem cells to stabilize it as an embryonic stem cell.

The procedure involves removing a blastomere, one of eight cells that make up an early embryo before implantation in the uterus, and putting it in a culture with other stem cells to encourage it to form into independent stem cells.

The technique, used in pre-implantation genetic diagnosis (PGD) by couples with a family history of hereditary conditions such as muscular dystrophy to screen out affected embryos, allows the embryo to continue to grow normally despite the removal of the cell.

To meet the objections of ethical opponents, Lanza's technique still requires treating the other 7 cells as destined to make a new baby. That is problematic. Suppose you want to make a stem cell line from your own DNA to, for example, grow a replacement kidney for yourself. Chances are most people won't be keen on creating a baby as a side effect. Also, in order for the stem cell line to perfectly match your DNA the embryo has to be a clone, so you have to clone yourself if you go down this path.

The other 7 cells remain in the embryo and get implanted in a womb.

But the key benefit of this technique may be that the remaining 7-cell embryos, when implanted into the wombs of female mice, developed into completely normal baby mice. Of the 47 implanted, 23 came to term, exactly the same rate as for “control” 8-cell morulas that had not had a blastomere removed.

“It means we overcome the key pro-life objection, that you must destroy life to save life,” says Lanza. Also, he says that the technique used to extract the blastomere is identical to that used routinely in pre-implantation diagnosis during IVF to screen out embryos which are defective and have no chance of surviving. “This procedure has been done hundreds of thousands of times, so we know it has a minimal or negligible effect on the embryo,” he says.

This works okay as a way to get a cell from IVF procedures that are going to get done anyway. So this technique can be utilized to generate new embryonic stem cell (ESC) lines for research and potentially for therapy as well.

Some people reject these approaches.

Much of the debate centers on the precise definition of "embryo," because it is considered by some people to have the same moral status as a human being. In one of the new sets of experiments, researchers crafted stem cell lines from lab creations characterized as "nonviable" entities.

Others dismissed such arguments as semantic quibbling.

"This is an attempt to solve an ethical issue through a scientific redefinition that really doesn't solve the issue," said Jaydee Hanson, director of human genetics at the International Center for Technology Assessment, a Washington, D.C., nonprofit organization that opposes some kinds of cloning and stem cell research on moral grounds.

My guess is that not all moral objectors have to be satisfied by new methods of making ESCs. The approaches just have to win over enough of the objectors that the remaining opponents can not form political coalitions big enough to stop work done on human ESCs created with one of these approaches.

Scientists will keep on developing improved techniques for making pluripotent stem cells in ways that satisfy an increasing number of the objectors to existing methods for making stem cells from embryos. As more genes involved in development are identified and more techniques for manipulating genetic regulatory state are discovered, stem cell researchers will find all sorts of additional ways to skate around ethical objections. In the process they will also develop useful toolboxes for manipulating stem cells for other goals as well.

By Randall Parker 2005 October 17 03:16 PM  Bioethics Debate
Entry Permalink | Comments(1)
2005 October 15 Saturday
Brain Enhancement Drugs Headed To Market

Michael S. Gazzaniga, director of the Center for Cognitive Neuroscience at Dartmouth College, has an essay in the October 2005 edition of Scientific American entitled Smarter On Drugs.

Work on memory enhancers may be furthest along. Eric R. Kandel of Columbia University, who won a Nobel Prize for his research on learning and memory in the sea slug Aplysia, is one proponent. He found that learning occurs at the synapse (the junction between two neurons) by several means. The synapse is enhanced when a protein called CREB is activated, and CREB plays a role in memory formation in fruit flies and in mice. With these discoveries came the 1998 birth of Memory Pharmaceuticals, Kandel's Montvale, N.J.-based company, which hopes to formulate a drug that will raise the amount of CREB in the human neural system and thus facilitate the formation of long-term memories. One of the most promising chemicals is called MEM 1414. If clinical trials go well, MEM 1414 could be on the market after 2008. At least one other company, Helicon Therapeutics in Farmingdale, N.Y., is also investigating CREB to improve human memory formation.

Alternative drugs are also in the works based on other brain mechanisms. Before a neuron naturally increases CREB, certain channels on its membrane must open to allow positive ions to flow into the cell. The ions then trigger a cascade of events leading to the activation of CREB. One channel of interest is known as NMDA. In 1999 Joseph Z. Tsien, Ya-Ping Tang and their colleagues, then at Princeton University, discovered that increasing the number of NMDA receptors in the mouse hippocampus led to better performance on a spatial-memory task. Now researchers and pharmaceutical companies are pursuing NMDA receptor agonists (they combine with the receptors) as nootropes. At least a dozen new drugs of this kind are making their way toward clinical trials.

Some FuturePundit readers have requested I post more on currently available cognitive enhancement methods. Well, Gazzaniga points to one way to boost learning:

Self-medicating with Starbucks is one thing. But consider the following. In July 2002 Jerome Yesavage and his colleagues at Stanford University discovered that donepezil, a drug approved by the FDA to slow the memory loss of Alzheimer's patients, improves the memory of the normal population. The researchers trained pilots in a flight simulator to perform specific maneuvers and to respond to emergencies that developed during their mock flight, after giving half the pilots donepezil and half a placebo. One month later they retested the pilots and found that those who had taken the donepezil remembered their training better, as shown by improved performance. The possibility exists that donepezil could become a Ritalin for college students. I believe nothing can stop this trend, either.

Donepezil is marketed by Pfizer as Aricept. Note that doctors in the United States can prescribe drugs for purposes other than their original FDA approved purpose. Therefore donepezil can be had by anyone in America who can find a cooperative doctor willing to write an Aricept prescription. In some Third World countries you just have to show up at a pharmacy and wave some money in front of the pharmacist and Aricept can be acquired without the cooperation of the doctor. Note that I'm not advocating this. My point is that some people are going to try Aricept to boost their learning and no doubt some already are doing this.

Since donepezil blocks acetylcholinesterase it boosts levels of the neurotransmitter acetylcholine. This probably has other consequences aside from increased learning. For example, I've noticed that taking large amounts of choline (which presumably also boosts acetylcholine) makes me a lot more prone to depression. Your mileage may vary.

Gazzaniga also relays claims that Ritalin enhances cognitive function not just for hyperactives but also for regular minds. Has anyone ever come across any controlled studies on SAT tests, IQ tests, or other measures of cognitive function that demonstrate this claim? A lot of claims are floating around out there. But in the absence of controlled (preferably double blind) studies I'll treat such claims with skepticism.

The article above is adapted from Gazzaniga's book The Ethical Brain.

By Randall Parker 2005 October 15 02:03 PM  Brain Enhancement
Entry Permalink | Comments(15)
2005 October 13 Thursday
Uranium Pellet Design Allows Longer Lower Temperature Burning

Purdue University researchers have discovered a way to operate uranium pellets in nuclear reactors at lower temperatures, which will also allow the pellets to last longer before needing replacement.

WEST LAFAYETTE, Ind. – Purdue University nuclear engineers have developed an advanced nuclear fuel that could save millions of dollars annually by lasting longer and burning more efficiently than conventional fuels, and researchers also have created a mathematical model to further develop the technology.

New findings regarding the research will be detailed in a peer-reviewed paper to be presented on Oct. 6 during the 11th International Topical Meeting on Nuclear Reactor Thermal Hydraulics in Avignon, France. The paper was written by Shripad Revankar, an associate professor of nuclear engineering; graduate student Ryan Latta; and Alvin A. Solomon, a professor of nuclear engineering.

The research is funded by the U.S. Department of Energy and focuses on developing nuclear fuels that are better at conducting heat than conventional fuels. Current nuclear fuel is made of a material called uranium dioxide with a small percentage of a uranium isotope, called uranium-235, which is essential to induce the nuclear fission reactions inside current reactors.

Better heat conduction allows a cooler internal operating temperature and hence less cracking and longer life. This could lengthen the interval between refuelings, giving reactors more up-time while also reducing fuel consumption.

"Although today's oxide fuels are very stable and safe, a major problem is that they do not conduct heat well, limiting the power and causing fuel pellets to crack and degrade prematurely, necessitating replacement before the fuel has been entirely used," Solomon said.

Purdue researchers, led by Solomon, have developed a process to mix the uranium oxide with a material called beryllium oxide. Pellets of uranium oxide are processed to be interlaced with beryllium oxide, or BeO, which conducts heat far more readily than the uranium dioxide.

This "skeleton" of beryllium oxide enables the nuclear fuel to conduct heat at least 50 percent better than conventional fuels.

"The beryllium oxide is like a heat pipe that sucks the heat out and helps to more efficiently cool the fuel pellet," Solomon said.

A mathematical model developed by Revankar and Latta has been shown to accurately predict the performance of the experimental fuel and will be used in future work to further develop the fuel, Revankar said.

Pellets of nuclear fuel are contained within the fuel rods of nuclear fission reactors. The pellets are surrounded by a metal tube, or "cladding," which prevents the escape of radioactive material.

Longer lasting fuel also translates into less waste generated.

Because uranium oxide does not conduct heat well, during a reactor's operation there is a large temperature difference between the center of the pellets and their surface, causing the center of the fuel pellets to become very hot. The heat must be constantly removed by a reactor cooling system because overheating could cause the fuel rods to melt, which could lead to a catastrophic nuclear accident and release of radiation – the proverbial "meltdown."

"If you add this high-conductivity phase beryllium oxide, the thermal conductivity is increased by about 50 percent, so the difference in temperature from the center to the surface of these pellets turns out to be remarkably lower," Solomon said.
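To see why a 50 percent conductivity boost matters so much, consider the standard conduction formula for the center-to-surface temperature drop in a cylindrical pellet with uniform volumetric heating: ΔT = q'''R²/4k. The numbers in this sketch are assumed round figures for illustration only, not values from the Purdue study:

```python
# Illustrative estimate of the center-to-surface temperature drop in a
# cylindrical fuel pellet with uniform volumetric heat generation:
#   delta_T = q''' * R^2 / (4 * k)
# All input values below are assumed, round-figure magnitudes.

def pellet_delta_t(q_vol, radius, k):
    """Center-to-surface temperature drop (K) for uniform heat generation."""
    return q_vol * radius**2 / (4.0 * k)

q_vol = 3.0e8            # volumetric heat generation, W/m^3 (assumed)
radius = 0.0041          # pellet radius, m (~4.1 mm, typical order of magnitude)
k_uo2 = 3.0              # thermal conductivity of UO2, W/(m*K) (assumed)
k_beo_mix = 1.5 * k_uo2  # 50 percent better conduction, per the article

dt_plain = pellet_delta_t(q_vol, radius, k_uo2)
dt_mixed = pellet_delta_t(q_vol, radius, k_beo_mix)
print(f"plain UO2: delta_T ~ {dt_plain:.0f} K")
print(f"with BeO:  delta_T ~ {dt_mixed:.0f} K")
```

Since the drop scales as 1/k at a fixed power level, a 50 percent higher conductivity cuts the center-to-surface temperature difference by a third.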

Revankar said the experimental fuel promises to be safer than conventional fuels, while lasting longer and potentially saving millions of dollars annually.

"We can actually enhance the performance of the fuel, especially during an accident, because this fuel heats up less than current fuel, which decreases the possibility of a catastrophic accident due to melting," Revankar said. "The experimental fuel also would not have to be replaced as often as the current fuel pellets.

"Currently, the nuclear fuel has to be replaced every three years or so because of the temperature-related degradation of the fuel, as well as consumption of the U-235. If the fuel can be left longer, there is more power produced and less waste generated. If you can operate at a lower temperature, you can use the fuel pellets for a longer time, burning up more of the fuel, which is very important from an economic point of view. Lower temperatures also means safer, more flexible reactor operation."

Solomon said a 50 percent increase in thermal conductivity represents a significant increase in performance for the 103 commercial nuclear reactors currently operating in the United States.

A small group of academic researchers figured out how to reduce uranium consumption, increase reactor performance, and reduce waste generation, all in one fell swoop. Pretty impressive. Nuclear reactor technology continues to advance, just as other energy technologies advance.

Even if oil production peaks in the next 10 years I do not see the economies of developed countries being slowed down for long. Too many good minds would react to necessity and demonstrate once again that it really is the mother of invention.

By Randall Parker 2005 October 13 09:38 PM  Energy Nuclear
Entry Permalink | Comments(20)
UCLA Team Cuts Photovoltaics Cost With Plastics

A UCLA team may have found a path to make photovoltaics cost competitive.

In research published today in Nature Materials magazine, UCLA engineering professor Yang Yang, postdoctoral researcher Gang Li and graduate student Vishal Shrotriya showcase their work on an innovative new plastic (or polymer) solar cell they hope eventually can be produced at a mere 10 percent to 20 percent of the current cost of traditional cells, making the technology more widely available.

"Solar energy is a clean alternative energy source. It's clear, given the current energy crisis, that we need to embrace new sources of renewable energy that are good for our planet. I believe very strongly in using technology to provide affordable options that all consumers can put into practice," Yang said.

The high cost of purified silicon currently prevents photovoltaics from reaching cost competitiveness. Another approach being pursued is the construction of plants for making less highly purified silicon. But the plastics approach bypasses the silicon problem altogether.

Electricity from quality traditional solar modules typically costs around three to four times as much as electricity from fossil fuels. While prices have dropped since the early 1980s, the solar module itself still represents nearly half of the total installed cost of a traditional solar energy system.

Currently, nearly 90 percent of solar cells in the world are made from a refined, highly purified form of silicon -- the same material used in manufacturing integrated circuits and computer chips. High demand from the computer industry has sharply reduced the availability of quality silicon, resulting in prohibitively high costs that rule out solar energy as an option for the average consumer.

Made of a single layer of plastic sandwiched between two conductive electrodes, UCLA's solar cell is easy to mass-produce and costs much less to make -- roughly one-third of the cost of traditional silicon solar technology. The polymers used in its construction are commercially available in such large quantities that Yang hopes cost-conscious consumers worldwide will quickly adopt the technology.

Independent tests on the UCLA solar cell already have received high marks. The nation's only authoritative certification organization for solar technology, the National Renewable Energy Laboratory (NREL), located in Golden, Colo., has helped the UCLA team ensure the accuracy of their efficiency numbers. The efficiency of the cell is the percentage of energy the solar cell gathers from the total amount of energy, or sunshine, that actually hits it.

The conversion efficiency they have achieved is not yet high enough. But they think they can increase conversion efficiency by a factor of 3 or 4 to make the cells competitive.

According to Yang, the 4.4 percent efficiency achieved by UCLA is the highest number yet published for plastic solar cells.

"As in any research, achieving precise efficiency benchmarks is a critical step," Yang said. "Particularly in this kind of research, where reported efficiency numbers can vary so widely, we're grateful to the NREL for assisting us in confirming the accuracy of our work."

Given the strides the team already has made with the technology, Yang calculates he will be able to double the efficiency percentage in a very short period of time. The target for polymer solar cell performance is ultimately about 15 percent to 20 percent efficiency, with a 15–20 year lifespan. Large-sized silicon modules with the same lifespan typically have a 14 percent to 18 percent efficiency rating.
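It's worth noting that even at 4.4 percent efficiency the plastic cells could already undercut silicon on a cost-per-peak-watt basis if the 10 to 20 percent cost figure holds, because cost per watt is area cost divided by watts per area. All the dollar figures in this back-of-envelope sketch are my own assumed round numbers, not figures from the UCLA team:

```python
# Back-of-envelope cost-per-peak-watt comparison. All dollar amounts are
# assumed illustrative values, not reported figures.

si_eff = 0.15          # silicon module efficiency (typical of the 14-18% range)
si_cost_area = 350.0   # $/m^2 for a silicon module (assumed)
plastic_eff = 0.044    # reported efficiency of the UCLA plastic cell
cost_fraction = 0.15   # plastic at 15% of silicon's area cost (midpoint of 10-20%)

insolation = 1000.0    # W/m^2, standard test-condition sunlight

si_cost_per_wp = si_cost_area / (si_eff * insolation)
plastic_cost_per_wp = (cost_fraction * si_cost_area) / (plastic_eff * insolation)

print(f"silicon: ${si_cost_per_wp:.2f}/Wp")
print(f"plastic: ${plastic_cost_per_wp:.2f}/Wp")
```

The catch is that installation, wiring, and land costs scale with area rather than with watts, which is why the team still targets 15 to 20 percent efficiency rather than competing on cell cost alone.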

Plastic decays in sunlight. So I'm not surprised by the projected 15 to 20 year lifespan. Other approaches as replacements for silicon could potentially last longer.

This development is not yet ready for market.

The plastic solar cell is still a few years away from being available to consumers, but the UCLA team is working diligently to get it to market.

"We hope that ultimately solar energy can be extensively used in the commercial sector as well as the private sector. Imagine solar cells installed in cars to absorb solar energy to replace the traditional use of diesel and gas. People will vie to park their cars on the top level of parking garages so their cars can be charged under sunlight. Using the same principle, cell phones can also be charged by solar energy," Yang said. "There are such a wide variety of applications."

Photovoltaics will become cost competitive some day. But it is very hard to guess when. The fact that talented groups of researchers (including some start-ups with funding from major venture capitalists) are working on approaches that avoid the high cost of silicon crystals makes me optimistic that a breakthrough will come within 10 years. We also still face the battery problem: how to store solar power for night use and for transportation.

By Randall Parker 2005 October 13 08:18 AM  Energy Solar
Entry Permalink | Comments(18)
2005 October 11 Tuesday
Tony Blair Privately For More Nuclear Plants In UK

British Prime Minister Tony Blair is saying privately that he is for construction of additional nuclear power plants in the UK.

TONY Blair has thrown his personal backing behind the expansion of nuclear power generation in Britain. The new reactors would be built on existing nuclear sites and replace those which are to be decommissioned in the near future.

The Prime Minister will sell the nuclear build programme to the public and the Labour Party as a job-creating solution to the problems posed by global warming and Britain’s growing dependence on imported energy supplies from unstable countries. The Prime Minister expects a year-long inquiry into Britain’s future energy requirements to conclude that more nuclear energy is the only practical way to reduce greenhouse gas emissions.

Blair has privately disclosed that he is in favour of more nuclear reactors and that he expects the findings of the inquiry to make a case that can be supported by an all-party consensus.

Blair is responding to the failure of the EU to meet its Kyoto Accord CO2 reduction targets, the unlikelihood that even tougher reduction targets can be met without nuclear power, and public concern about dependence on oil from the Middle East. The war in Iraq and high oil prices are fueling (sorry, couldn't resist) that concern.

Publicly Blair is warming to nuclear but not officially endorsing it.

Blair is prepared to go as far as he can without prejudging the nuclear review. A fortnight ago, he made the case for nuclear power to the Labour Party conference while stopping short of calling for its implementation.

"Global warming is too serious... to split into opposing factions on it," he told delegates. "And for how much longer can countries like ours allow the security of our energy supply to be dependent on some of the most unstable parts of the world?"

The Department of Trade & Industry confirmed on Friday that it has been holding discreet talks with major energy providers about nuclear options: E-On and RWE of Germany, and EdF of France. BNFL has a design for a new plant.

I have long argued for greatly accelerated development of new technologies as the most appropriate response to both the potential threat of global warming (about which I'm not much concerned) and the eventual exhaustion of fossil fuel energy sources. New energy technologies will be cleaner just in terms of ground level conventional pollutants and this alone is reason enough to develop them. Blair has recently demonstrated a new appreciation of the value of accelerated energy technology development. But will he allocate more public funds toward this purpose or make other policy changes that accelerate the rate of energy technology development?

Environmentalists signal the extent of their belief in the danger of global warming when they start arguing for nuclear power as an energy source. I think we could eventually phase out the use of fossil fuels by making solar energy our primary energy resource. But that requires many technological advances that lie in the future. We could make those advances come more quickly. The global warming alarmists ought to put half as much effort into lobbying for photovoltaics and battery research as they do into raising alarms about a supposed coming environmental disaster. We'd be better off with the resulting technology even if the global warming fears are exaggerated. However, while waiting for those advances, those who urgently want to reduce CO2 emissions should treat nuclear as a necessary substitute for many uses of fossil fuels today.

Nuclear power has one sort of insurance policy advantage: If a huge volcanic eruption or a massive meteor ever blotted out the sun for a few years solar power would become worthless. Nuclear power would keep on ticking. If you want to survive natural disaster scenarios involving reduction of sunlight then nuclear is the best power source.

By Randall Parker 2005 October 11 11:36 AM  Energy Policy
Entry Permalink | Comments(28)
Energy Price Shocks Bring Conservation Back In Vogue

High energy prices have sent people looking for ways to reduce heating costs and other energy costs.

Some schools are turning down thermostats, limiting bus service or hiring energy consultants. In Council, Idaho, the schools expect to halve their $10,000 monthly heating bill with a new system that runs on wood chips produced when state crews thin trees along the highways. Last month, Gov. Sonny Perdue of Georgia closed the schools for two days because school buses were running out of diesel fuel.

In Marengo, Iowa, the county courthouse remains closed on Mondays to give its gas boilers an extra day off, and employees work four 10-hour days. In Marshalltown, Iowa, officials have traded traffic lights for stop signs at six intersections.

Insulation upgrades, shifts to alternative fuels, shifts to more fuel efficient vehicles, and changes to lifestyles are all reported in the article.

The growing role of biomass for heating is the most interesting response. People are turning to biomass heat sources including wood and corn to save money.

Some suppliers of the stoves and the pellets and wood they burn are running out of inventory or hiring extra employees to meet the demand. In Walla Walla, Wash., Chris Neufeld, vice president of Blaze King Industries-USA, said his company had a backlog worth $1 million for stoves that cost about $2,000 apiece.

In Waverly, Ill., Don Magelitz, who sells corn stoves, is more than eight weeks behind on deliveries and has a backlog of 200 orders.

Biomass for heating makes more sense than biomass for ethanol production. When corn is used for heating, a far larger fraction of its chemical energy serves a useful purpose. Producing ethanol takes energy to operate the chemical plant, and some energy is lost as heat. But when corn is burned inside a building, that heat serves a useful purpose.

I was recently surprised to learn that corn is a very economically competitive source of heating energy. In the comments section of my recent post on ethanol in Brazil you will find a discussion of corn stoves and the economics of different heat energy sources. Unless you happen to have a free source of wood, the cheapest heat energy source appears to be corn. A lot of corn stoves have automated corn fuel feeders and thermostats. But most of the models I looked at had bins that stored only a day or two of fuel. Construction of a bigger bin and feeding system would allow a much longer period between refuelings. Anyone seriously considering this option should check out the price of corn by the bushel for deliveries of many bushels at a time. You'd also need storage facilities for handling hundreds or thousands of pounds of corn. Ideally the stored corn from a large bin would gravity-feed into the corn stove.
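For anyone who wants to run the comparison themselves, here is a simple delivered-heat cost calculation. Every price, heat content, and appliance efficiency below is an assumed round figure of roughly 2005 vintage; substitute your own local numbers before drawing conclusions:

```python
# Rough heating-cost comparison in dollars per million BTU of delivered heat.
# All prices, heat contents, and efficiencies are assumed round figures.

def cost_per_mmbtu(price, btu_per_unit, efficiency):
    """Delivered-heat cost: $ per million BTU after appliance losses."""
    return price / (btu_per_unit * efficiency / 1e6)

# corn: $2.00/bushel, 56 lb/bushel, ~7,000 BTU/lb, 75% stove efficiency
corn = cost_per_mmbtu(price=2.00, btu_per_unit=56 * 7000, efficiency=0.75)
# heating oil: $2.40/gallon, ~138,500 BTU/gallon, 80% furnace efficiency
oil = cost_per_mmbtu(price=2.40, btu_per_unit=138_500, efficiency=0.80)
# natural gas: $1.40/therm, 100,000 BTU/therm, 80% furnace efficiency
gas = cost_per_mmbtu(price=1.40, btu_per_unit=100_000, efficiency=0.80)

for name, c in [("corn", corn), ("oil", oil), ("gas", gas)]:
    print(f"{name}: ${c:.2f} per MMBtu delivered")
```

With these assumed figures corn comes in far below oil and gas, which matches what the corn stove dealers are seeing.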

Liquid fuels are most needed for transportation. To the extent that corn and other biomass sources displace heating oil, they free up a liquid fuel far better suited for transportation. The heat loss of converting corn to ethanol is avoided and the total amount of useful energy is increased. More generally, stationary uses of liquid fuel should be especially targeted for displacement by biomass, solar, nuclear, and other sources. Liquid fuel is too valuable for transportation to be wasted in other applications. Currently (as of 2002) 82% of the #2 heating oil consumed in America is burned in the Northeast. Policymakers ought to take notice and encourage migration to other heat energy sources and better insulation in the Northeast.

The need for liquid fuel in transportation is also an argument for the development of better battery technology. Eliminate the need for liquid fuel and suddenly many more energy sources such as nuclear, wind, biomass, and photovoltaics could be used to power cars.

By Randall Parker 2005 October 11 10:41 AM  Energy Policy
Entry Permalink | Comments(9)
Fish In Diet Slows Rate Of Cognitive Decline

Here is more evidence for the benefit of fish for reducing the rate of brain aging.

CHICAGO – Consuming fish at least once a week was associated with a 10 percent per year slower rate of cognitive decline in elderly people, according to a new study posted online today from Archives of Neurology, one of the JAMA/Archives journals. The study will be published in the December print edition of the journal.

Martha Clare Morris, ScD, of Rush University Medical Center, and colleagues analyzed six years of data from an ongoing study of Chicago residents, 65 years and older, first interviewed between 1993 and 1997 and every three years in two follow-up interviews. Interviews included four standardized cognitive tests and dietary questions on the frequency of consumption of 139 different foods, as well as questions of daily activities, exercise levels, alcohol consumption and medical history.

Morris found dietary intake of fish was inversely associated with cognitive decline over six years in this older, biracial community study. "The rate of decline was reduced by 10 percent to 13 percent per year among persons who consumed one or more fish meals per week compared with those with less than weekly consumption. The rate reduction is the equivalent of being three to four years younger in age," she said.

The benefit might not be from omega 3 fatty acids.

Although fish is a direct source of omega-3 fatty acids, which have been shown to protect against Alzheimer's disease and stroke, the dietary intake of omega-3 fatty acids was not associated with cognitive change in this study.

In addition, neither consumption of fruit and vegetables nor overall cardiovascular health appeared to account for the study findings, the researchers said.

However, see below for another report that demonstrates a mechanism by which omega 3 fatty acid DHA reduces inflammation and protects brain cells from damage and cell death.

Morris has previously reported that consumption of foods high in vitamin E reduces the incidence of Alzheimer's Disease.

Louisiana State University researcher Nicolas G. Bazan has just recently discovered a mechanism by which omega 3 fatty acid DHA protects the brain from neurotoxins and prevents cell death.

Their study shows that docosahexaenoic acid (DHA), an omega-3 fatty acid found in coldwater fish such as mackerel, sardines and salmon, reduces levels of a protein known to cause damaging plaques in the brains of Alzheimer's patients.

What's more, the researchers discovered that a derivative of DHA, which they dubbed "neuroprotectin D1" (NPD1), is made in the human brain. That natural substance plays a key role, too, in protecting the brain from cell death, the study showed.

Here is Bazan's paper.

A time-dependent release of endogenous free DHA followed by NPD1 formation occurs, suggesting that a phospholipase A2 releases the mediator's precursor. When NPD1 is infused during ischemia-reperfusion or added to RPE cells during oxidative stress, apoptotic DNA damage is down-regulated. NPD1 also up-regulates the anti-apoptotic Bcl-2 proteins Bcl-2 and Bcl-xL and decreases pro-apoptotic Bax and Bad expression. Moreover, NPD1 inhibits oxidative stress-induced caspase-3 activation. NPD1 also inhibits IL-1β-stimulated expression of COX-2. Overall, NPD1 protects cells from oxidative stress-induced apoptosis.

I just decided to have salmon for lunch.

By Randall Parker 2005 October 11 09:49 AM  Aging Diet Brain Studies
Entry Permalink | Comments(9)
2005 October 10 Monday
Habitual Liar Brains Look Different On Scans

Adrian Raine and Yaling Yang have found that habitual liars have more white matter and less gray matter in their brains as compared to less lie-prone people.

The research – led by Yaling Yang and Adrian Raine, both of the USC College of Letters, Arts and Sciences – is published in the October issue of the British Journal of Psychiatry.

Consider what these numbers say about the make-up of a temporary employment pool in a major US city:

The subjects were taken from a sample of 108 volunteers pulled from Los Angeles’ temporary employment pool. A series of psychological tests and interviews placed 12 in the category of people who had a history of repeated lying (11 men, one woman); 16 who exhibited signs of antisocial personality disorder but not pathological lying (15 men, one woman); and 21 who were normal controls (15 men, six women).

At least a quarter of the temporary employees examined were pathological liars or had antisocial personalities.

“We looked for things like inconsistencies in their stories about occupation, education, crimes and family background,” said Raine, a psychology professor at USC and co-author of the study.

“Pathological liars can’t always tell truth from falsehood and contradict themselves in an interview. They are manipulative and they admit they prey on people. They are very brazen in terms of their manner, but very cool when talking about this.”

A significant portion of the human race are predatory liars and con artists. On top of that there are rapists, murderers, and assorted other criminals and psychopaths as well. Think about that next time someone speaks about humanity and the human future in lofty terms.

The habitual liars had histories of conning other people.

Aside from having histories of conning others or using aliases, the habitual liars also admitted to malingering, or telling falsehoods to obtain sickness benefits, Raine said.

Magnetic Resonance Imaging (MRI) scans found more white matter and less gray matter in the liars. So then are women better liars than men on average? Women also have a higher ratio of white matter to gray matter than men do.

After they were categorized, the researchers used Magnetic Resonance Imaging to explore structural brain differences between the groups. The liars had significantly more “white matter” and slightly less “gray matter” than those they were measured against, Raine said.

Specifically, liars had a 25.7 percent increase in prefrontal white matter compared to the antisocial controls and a 22 percent increase compared to the normal controls. Liars had a 14.2 percent decrease in prefrontal gray matter compared to normal controls.

The white matter probably helps in the formulation of deceptions.

More white matter – the wiring in the brain – may provide liars with the tools necessary to master the complex art of deceit, Raine said.

“Lying takes a lot of effort,” he said.

“It’s almost mind reading. You have to be able to understand the mindset of the other person. You also have to suppress your emotions or regulate them because you don’t want to appear nervous. There’s quite a lot to do there. You’ve got to suppress the truth.

“Our argument is that the more networking there is in the prefrontal cortex, the more the person has an upper hand in lying. Their verbal skills are higher. They’ve almost got a natural advantage.”

But in normal people, it’s the gray matter – or the brain cells connected by the white matter – that helps keep the impulse to lie in check.

Imagine genetically engineered people who are talented liars. Or imagine artificial intelligences which are geniuses at lying.

Pathological liars have a surplus of white matter, the study found, and a deficit of gray matter. That means they have more tools to lie coupled with fewer moral restraints than normal people, Raine said.

“They’ve got the equipment to lie, and they don’t have the disinhibition that the rest of us have in telling the big whoppers,” he said.

One of the reasons why I'm not particularly sanguine about our transhumanist future is that human ethical constraints are in large part a product of genetic coding. I do not buy the argument that rational self interest by itself provides enough basis to maintain a civilized society. Well, once biotechnology provides ways to enhance the ability to lie and to feel less remorse or guilt, won't some people opt to use this technology? Mightn't there even be a sort of mental arms race where people find it necessary to enhance their ability to deceive in order to protect themselves from other deceivers?

The ethical features of human cognition that were selected for to work in hunter gatherer groups and in small village life might get heavily selected against when humanity enters its transhumanist phase. I have a similar worry about altruistic punishment and I have a high expectation that the tendency to want to carry out altruistic punishment is coded for by genes as well. Also, I expect the desire to carry out altruistic punishment will be found to vary between individuals due to differences in gene sequences. Brain scans show that the brain rewards itself for carrying out altruistic punishment. Well, will all genetically engineered babies of the future be as likely to be coded for that desire as humans are today? The desire to carry out altruistic punishment might be essential to maintenance of a fair degree of cooperation within societies.

Adrian Raine has also previously found differences in the brains of psychopaths and normal people which are recognizable in brain scans.

Modest proposal: Require politicians running for office to get brain scans and publish their gray matter to white matter ratio. If the public really wants more honest politicians (and I'm not entirely convinced that is the case) then the public could vote for candidates that have higher gray to white matter ratios. Also, politicians should have to disclose any indications that they have brains shaped for psychopathy.

By Randall Parker 2005 October 10 03:12 PM  Brain Society
Entry Permalink | Comments(14)
Chronic Disease Biggest Killer World Wide

Chronic diseases will kill 35 million in 2005.

By the end of 2005, twice as many people will have died from chronic diseases as from all infectious diseases, starvation and pregnancy and birth complications combined, international experts have warned.

The “neglected epidemic” of chronic disease will take 35 million lives in 2005, out of the total 58 million who will die globally. And contrary to popular belief, most of the deaths - 80% - from chronic conditions such as heart disease, diabetes and cancer will be in low to middle-income countries.

Obesity and the demon tobacco are big contributors to this trend. But, while the article does not state this, this is really a success story in one sense: more people in less developed countries are living long enough to get diseases of old age.

Deaths from infectious diseases and starvation continue to be viewed by most people as morally repugnant and to be fought against. Death from heart disease, cancer, and other diseases associated with aging continues to be seen as natural and inevitable, and that perceived naturalness leads to the view that such deaths are not morally undesirable.

As more diseases of old age become manipulable by biotechnology this belief in their naturalness and in their inevitability will wane. If more people could make that mental shift sooner then the result would be much greater political support for increased funding of research aimed at developing rejuvenation therapies.

By Randall Parker 2005 October 10 12:44 PM  Aging Debate
Entry Permalink | Comments(4)
2005 October 09 Sunday
Researchers Find Cheaper Way To Reduce Coal Mercury Emissions

Lehigh University researchers have found a cheaper way to reduce mercury emissions from coal plants.

Researchers at Lehigh University's Energy Research Center (ERC) have developed and successfully tested a cost-effective technique for reducing mercury emissions from coal-fired power plants.

In full-scale tests at three power plants, says lead investigator Carlos E. Romero, the Lehigh system reduced flue-gas emissions of mercury by as much as 70 percent or more with modest impact on plant performance and fuel cost.

The reductions were achieved, says Romero, by modifying the physical conditions of power-plant boilers, including flue gas temperature, the size of the coal particles that are burned, the size and unburned carbon level of the fly ash, and the fly ash residence time. These modifications promote the in-flight capture of mercury, Romero said.

Aside: One hears Orwellian talk of "clean coal" as if it is a reality today. But if coal was already so clean there'd be no need for research in how to reduce coal power plant emissions.

Coal-fired power plants are considered to be the biggest sources of mercury emissions. Only now, 35 years after the Clean Air Act, has the US EPA finally gotten around to restricting mercury emissions from coal plants.

Coal-fired power plants are the largest single-known source of mercury emissions in the U.S. Estimates of total mercury emissions from coal-fired plants range from 40 to 52 tons.

The U.S. Environmental Protection Agency last March issued its first-ever regulations restricting the emission of mercury from coal-fired power plants. The order mandates reductions of 23 percent by 2010 and 69 percent by 2018. Four states - Massachusetts, New Jersey, Connecticut and Wisconsin - issued their own restrictions before the March 15 action by the EPA.

My take on the Bush Administration mercury reduction regulations is that they came after too many years and do not reduce mercury rapidly enough. Similarly, I fault the Clinton Administration for not already imposing more restrictive standards 10 years ago. Neurotoxins are bad. We should do a lot more about neurotoxins than about the possible threat of global warming. But global warming is a far more fashionable worry.

The trick is to make the mercury become oxidized.

The changes in boiler operating conditions, said Romero, prevent mercury from being emitted at the stack and promote its oxidation in the flue gas and adsorption into the fly ash instead. Oxidized mercury is easily captured by scrubbers, filters and other boiler pollution-control equipment.

Note that computer simulations played a role in identifying operating conditions likely to reduce mercury emissions. This is part of a much larger, long-running trend in which simulations speed up the rate of scientific and technological advance. What I'd like to know: just how much faster will science and technology be able to advance 20 or 30 years from now due to the ability to rapidly run simulated experiments? Will the rate of advance speed up by orders of magnitude due to simulations alone?

The ERC team used computer software to model boiler operating conditions and alterations and then collaborated with Western Kentucky University on the field tests. Analysis of stack emissions showed that the new technology achieved a 50- to 75-percent reduction of total mercury in the flue gas with minimal to modest impact on unit thermal performance and fuel cost. This was achieved at units burning bituminous coals.

Only about one-third of mercury is captured by coal-burning power plant boilers that are not equipped with special mercury-control devices, Romero said.

Romero estimated that the new ERC technology could save a 250-megawatt power unit as much as $2 million a year in mercury-control costs. The savings could be achieved, he said, by applying the ERC method solely or in combination with a more expensive technology called activated carbon injection, which would be used by coal-fired power plants to reduce mercury emissions. The resulting hybrid method, says Romero, would greatly reduce the approximately 250 pounds per hour of activated carbon that a 250-MW boiler needs to inject to curb mercury emissions.
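The $2 million figure is easy to sanity check from the injection rate quoted above. The activated carbon price and the capacity factor in this sketch are my own assumptions:

```python
# Sanity check on the claimed savings from avoiding activated carbon
# injection. The injection rate (~250 lb/hr for a 250 MW boiler) comes
# from the article; the carbon price and capacity factor are assumed.

injection_rate = 250.0    # lb of activated carbon per hour
hours_per_year = 8760
capacity_factor = 0.90    # fraction of the year the unit runs (assumed)
carbon_price = 1.00       # $/lb of activated carbon (assumed)

lb_per_year = injection_rate * hours_per_year * capacity_factor
annual_cost = lb_per_year * carbon_price
print(f"~{lb_per_year / 2000:.0f} tons/yr, ~${annual_cost / 1e6:.1f}M/yr")
```

Nearly a thousand tons of carbon a year, costing on the order of $2 million, so the estimate looks plausible.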

Reducing emissions of sulfur and nitrogen oxides causes, as a side effect, a big reduction in mercury emissions as well. So a more rapid tightening of sulfur and nitrogen oxide limits would also lead to reduced mercury emissions.

Humans have doubled or tripled the amount of mercury in the atmosphere.

Best estimates to date suggest that human activities have about doubled or tripled the amount of mercury in the atmosphere, and the atmospheric burden is increasing by about 1.5 percent per year. Global anthropogenic emissions of mercury are estimated to range between 2000 and 6000 metric tons per year. Electric utilities, municipal waste combustors, commercial and industrial boilers, and medical waste incinerators account for approximately 80 percent of the total amount. Coal-fired utility boilers are the largest point source of unregulated mercury emissions in the United States.
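A 1.5 percent annual increase compounds faster than intuition suggests. A quick calculation of the implied doubling time for the atmospheric burden:

```python
# Doubling time for a quantity growing 1.5 percent per year:
# compound growth gives t = ln(2) / ln(1 + r).
import math

growth = 0.015
doubling_time = math.log(2) / math.log(1 + growth)
print(f"doubling time: {doubling_time:.0f} years")
```

At that rate the atmospheric mercury burden would double again in under half a century.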

I'd really like to know how much of the mercury in fish is there due to human pollution. Have humans doubled or tripled the amount of mercury in fish? I've yet to come across any reports on research that attempts to quantify the impact of human mercury sources on fish.

Chlorine plants are another major source of mercury.

In 2000, for instance, these chlorine plants reported 79 tons of mercury consumed, according to federal and industry data cited in the report. Fourteen of those tons were emitted or released into the environment; the rest - 65 tons - was officially classified as "unaccounted for" by the US Environmental Protection Agency (EPA).

That's an amount that shocks environmentalists because, by contrast, the nation's 497 mercury-emitting power plants sent 49 tons of the toxin into the air that year, Oceana reports.

Relatively few of the chlorine plants in the United States still use mercury, though a larger number in Europe do. Why not shut down the old plants or force them to shift to mercury-free manufacturing methods?

Indeed, most of the 43 chlor-alkali manufacturing plants in the US today use advanced mercury-free manufacturing processes that are relatively clean. But nine US factories - and 53 older ones in Europe - still use older "mercury-cell" technology that requires huge quantities of mercury to do the same job, Oceana reports.

One can debate the effects of greenhouse gases for decades, and people have. But mercury is bad for the brain. Why let chlorine or power plants emit much mercury at all?

By Randall Parker 2005 October 09 11:56 AM  Energy Fossil Fuels
Entry Permalink | Comments(6)
2005 October 07 Friday
Brazil Shifting Toward Ethanol For Car Fuel

Fueling cars with a mix of gasoline and alcohol from sugar cane is becoming the norm in Brazil.

Alcohol made from sugar cane is becoming the fuel of choice in Brazil, and other countries - so much so that global sugar prices hit a seven-year high this week.

Faced with this high sugar price signal, Brazilian farmers will plant sugar cane on more acres. If ethanol usage grows by orders of magnitude then sugar cane acreage could do likewise. Will this end up cutting into rain forests? I'd like to know how many acres of Brazilian land would be needed to shift all cars now operating on oil over to ethanol. Currently Brazil harvests about 400 million tonnes of sugar cane per year. Anyone know the ratio between tonnes of sugar cane and gallons of ethanol produced from it?
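A hedged back-of-envelope answer to the yield question, assuming the 400 million tonne figure refers to harvested cane and taking a commonly cited Brazilian mill yield of very roughly 85 liters of ethanol per tonne of cane (both treated as assumptions here):

```python
# Rough upper bound on Brazilian ethanol output if the entire cane crop went to fuel.
# Both inputs are assumptions: ~85 L/tonne is a commonly cited mill yield,
# and 400 million tonnes is the harvest figure from the post.
cane_tonnes = 400e6
ethanol_l_per_tonne = 85          # assumed yield
liters_per_gallon = 3.785

total_liters = cane_tonnes * ethanol_l_per_tonne
total_gallons = total_liters / liters_per_gallon
print(f"~{total_liters/1e9:.0f} billion liters (~{total_gallons/1e9:.0f} billion gallons)")
```

That is an upper bound only: in practice much of the crop goes to sugar rather than ethanol, so actual fuel output is far lower.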

Ethanol from Brazilian sugar is cheaper per mile than gasoline. The market alone is enough to drive the shift toward more alcohol fuel usage.

Unlike hybrids sold in the US, for example, flex cars sold in Brazil don't cost any more than traditional models. In fact, some models are only available with flex engines now. Ethanol engines use 25 percent more ethanol per mile than gasoline. But ethanol (the alcohol produced by fermenting sugar) usually sells at somewhere between a third to half of the price of gas. Even people who were reluctant to take the plunge and buy a flex say they have been won over by the savings.

Does Brazil tax the gasoline component of mixed fuels more than the ethanol component? Is this shift driven by the real pre-tax cost of both these fuels?
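The per-mile economics in the quote above reduce to simple arithmetic; the sketch below just restates the quoted figures (25 percent more fuel burned per mile, ethanol selling at one-third to one-half the gasoline price):

```python
# Cost-per-mile comparison implied by the quoted figures.
extra_fuel_factor = 1.25  # ethanol volume burned per mile, relative to gasoline

def ethanol_cost_ratio(price_ratio):
    """Ethanol cost per mile as a fraction of gasoline cost per mile."""
    return extra_fuel_factor * price_ratio

low = ethanol_cost_ratio(1/3)       # ethanol at a third the price of gas
high = ethanol_cost_ratio(1/2)      # ethanol at half the price of gas
breakeven = 1 / extra_fuel_factor   # price ratio at which the fuels cost the same per mile
print(f"{low:.2f}-{high:.2f} of gasoline's per-mile cost; break-even at {breakeven:.0%}")
```

So at the quoted Brazilian prices ethanol costs roughly 40 to 60 percent as much per mile as gasoline, and stays cheaper as long as it sells below 80 percent of the gasoline price.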

Sugar traders are bullish on sugar prices due to high oil prices.

Raw sugar futures have surged by a third to almost 12 cents per lb this year, having stood at 9.04 cents at the end of 2004.

"Definitely, we're going to 12 cents," said Marius Sonnen of sugar trader Sonnen and Co. Inc. in the United States. "As long as oil prices are this high, the Brazilians will convert more cane into ethanol. I don't see any end in sight to this rally."

Note that import restrictions erected at the behest of domestic sugar producers keep the price of sugar much higher in the United States. Therefore the cost of ethanol made from sugar is much higher in the United States, and US taxpayers have to pay subsidies for ethanol production.

The US sugar industry simultaneously gets restrictions on imports that drive up the price of sugar in the US market plus subsidies for conversion of sugar to alcohol since costly protected domestic sugar is too expensive to compete.

Countries such as Brazil have embraced sugar-based ethanol, which accounts for 40 percent of the fuel Brazilians pump into their gas tanks. But sugar is less expensive in that country than in the United States, where critics contend import quotas artificially raise sugar prices. The industry should not get both trade protections and a subsidy to make sugar-ethanol competitive, critics said.

Brazil could conceivably end up exporting ethanol made from Brazilian sugar. Many US candy factories have moved to Canada and other countries in order to get cheaper sources of sugar. The candy isn't subjected to import restrictions analogous to those on raw sugar. The United States might end up importing ethanol made from foreign sugar as well.

By Randall Parker 2005 October 07 11:32 PM  Energy Biomass
Entry Permalink | Comments(46)
Nicotine Leaves Long Lasting Effects On Brain Reward Systems

Scripps researchers Athina Markou and Paul Kenny found that in rodents nicotine causes an elevation in mood that lasts for weeks after the nicotine is gone.

Nicotine induces a long-lasting activation of the brain's reward systems that is not seen after excessive consumption of other drugs of abuse, such as cocaine or heroin. This slight elevation in mood is there regardless of how much nicotine is consumed, and it persists long after the nicotine is gone from the body.

"It's almost a memory of nicotine in the brain," says Kenny, who is now a staff scientist at Scripps Florida. "The reward system becomes hyperactive, even when the nicotine isn't there."

This persistence of reward activity, Kenny adds, appears unique to nicotine among drugs of abuse and is probably crucial in maintaining the nicotine habit. Knowing this may have relevance to prevention of nicotine addiction and smoking cessation programs.

Weeks after the nicotine was gone the effects on rodent brain reward systems remained. Normal pleasures were still enhanced by the long-gone nicotine.

In their study, Markou and Kenny looked at the effect of nicotine self-administration on brain reward systems in laboratory rodents. They allowed the rodents to have extended access to nicotine self-administration, and they directly measured the changes in neuronal activity in the brain.

As predicted, the scientists found that nicotine acutely stimulates the brain's reward system and seems to enhance the normal pleasures in the environment for hours. Unexpectedly, however, rather than the depression-like state induced when cocaine and heroin leave the system, nicotine's elevation of mood persists. The measurements of neuronal activity in the brain's reward system one hour after the nicotine consumption looked similar to those twelve hours after consumption.

In fact, this increased sensitivity to reward persists for days or weeks after the nicotine disappears. The excitation of these systems cannot be due to the presence in the brain of nicotine, which is readily metabolized by enzymes in the body so that all traces of it are gone after a matter of about three to four hours.

So if the nicotine is metabolized and cannot be responsible for the elevation in mood, what is the explanation? One possibility is that nicotine leads to an upregulation of the brain receptors to which it binds--the nicotinic receptors. Since the neurotransmitter acetylcholine also binds to these receptors, the elevation in nicotinic receptors due to nicotine may be behind the persistent elevation in mood.

Do addicts of the demon weed tobacco experience increased pleasure from life as a result of smoking tobacco? Had they never started smoking would their average level of experience of pleasure be lower?

What I'd like to see: a study of twins in their 20s (i.e. young enough not to have too much accumulated damage from cigarette smoking) where one member of each twin pair smokes and the other doesn't. Test them for average mood level and see if the smoker experiences more daily pleasure than the non-smoker. Perhaps, though, the toxic effects of the cigarettes cause damage that reduces pleasure in other ways. I'd expect that to be the case after enough years of smoking.

In the long run I expect to see the development of wide-spectrum, long-lasting feel-good drugs that do as nicotine does but without undesired side effects such as addiction and without a delivery mechanism, such as smoke, that brings along lots of toxins. One reason that some people are happier than others is that they are wired up that way due to their genes. Those who go through life generally less joyful (and not just those who are depressed) will some day have the option of getting their average pleasure level turned up by adjustment of receptor concentrations in their neurons.

By Randall Parker 2005 October 07 08:14 PM  Brain Addiction
Entry Permalink | Comments(7)
Feelings Of Mortality Change Charitable Gift Sizes And Food Choices

Feelings of mortality change charitable giving and food choices.

After the Sept. 11, 2001, terrorist attacks many Americans reported dramatic changes in their behavior, from increased church attendance and charitable giving to -- at the other end of the scale -- overeating and overspending.

Intrigued by these anecdotal reports, Baba Shiv, associate professor of marketing at Stanford Graduate School of Business, and colleagues Rosellina Ferraro and James R. Bettman from Duke University's Fuqua School of Business set out to discover what might be driving these changed behaviors. Their paper, "Let Us Eat and Drink, for Tomorrow We Shall Die," is published in the June 2005 Journal of Consumer Research.

They discovered that when confronted with thoughts of death, people tend to act in ways that will boost their self-esteem. They also have fewer cognitive resources to resist behaviors that are not central to their self-image. People for whom being slim or fit is important to their self-image, for instance, will not be as likely to overeat, but if physical appearance isn't as important, the willpower to resist that fudge sundae will plummet.

I've long suspected that terrorists are motivated in large part by a desire to manage their own feelings of self-esteem. They carry out terrorist acts as a way to feel more efficacious and in control.

As a basis for the study, the researchers built upon an existing body of work on "Terror Management Theory," which attempts to explain how people cope when faced with a threat to their mortality. The theory holds that when reminded of their own mortality people first tend to cope in two ways: forcefully defending their cultural worldview (which could manifest itself in forms such as acting more aggressively toward someone with different political beliefs); or behaving in ways most likely to bolster their self-esteem.

The researchers decided to test the second part of this theory by examining two possible sources of self-esteem: physical appearance and being virtuous.

People who consider being virtuous as more important donate more when reminded of their mortality.

Researchers told 115 student subjects that they would be entered into a $200 lottery as part of their compensation for participating in a study. All participants first answered a questionnaire that measured the importance of virtue to their self-esteem. Half were then asked questions about the prospects of their own deaths while the other half were asked about dental pain. At the end, subjects in each group were asked how much of the $200 they would be willing to donate to charity if they won the lottery.

Participants for whom virtue was an important source of self-esteem offered significantly higher donations (an average of $65) when reminded of their mortality than when they were not (an average of just $34.50). Participants for whom virtue was not an important source of self-esteem actually donated less when reminded of their own deaths (an average of $20.36 compared with $28.60).

I wonder about the motivations here. Is the reminder of mortality more likely to result in virtuous behavior among those religious believers who expect they'll some day be judged by God in the afterlife? How would the results of this experiment differ with believers in different religions?

Do those who respond to reminders of mortality by donating money otherwise act more virtuous on average as compared to those who do not respond by upping their donations? What other differences are there between people who attach high importance to virtuous behavior and those who do not?

Will charities start running ads reminding people of their mortality as a way to boost donations? Or how about charities that help the homeless? Would they get more donations if they reminded people that health and financial reversals might some day make them homeless too?

Next we come to the very important choice between chocolate cake and fruit salad when placed in the context of future death.

A second part of the study investigated body image as a source of self-esteem. Participants were first asked questions to determine how their attitude toward their bodies contributed to their sense of self. Half were then asked about their reactions to September 11 while the others were asked about a recent local fire in which a building was destroyed but no one was hurt.

Afterward, all participants were given the choice between two snacks -- chocolate cake or fruit salad -- ostensibly as part of their reward for participating in the study.

Among women whose bodies contributed greatly to their self-esteem, only 23 percent chose cake when their sense of mortality was high; a dramatic drop from the 38 percent that chose cake when their sense of mortality was low.

These numbers make perfect sense, according to Shiv. "When people are reminded of their deaths, they are desperately seeking to cope, and if a decision can help them boost their self-esteem, they will make that decision. Thus women whose bodies were important to their self-esteem were much more likely to reject the offer of chocolate cake when their sense of mortality was heightened."

However, among women whose bodies did not contribute significantly to their sense of self-esteem, 94 percent chose chocolate cake when reminded of 9/11 -- more than twice as many as those who chose cake when their sense of mortality was low (44 percent).

"If you are using all your resources to focus on some important aspect of self-esteem in order to cope with thoughts of death, and body esteem is not central to you, then you simply have fewer resources to help you resist that tempting piece of chocolate cake," said Shiv.

I see the use of chocolate cake in this experiment as unfortunate. A lot of people think that chocolate makes them feel better when unhappy. This is especially so when heart broken over the end of a romance. So maybe the people who ate more chocolate cake when reminded of 9/11 were self-medicating.

A heightened sense of mortality did not cause men to resist chocolate cake.

Interestingly, there was no difference between the men's choice of cake or fruit salad based on their sense of mortality. The researchers attributed this to the fact that physical appearance is more of a "hot button" for women than for men.

I wonder if men would more likely choose high protein foods in response to a heightened sense of mortality on the theory that protein is needed to build muscles and muscles help in self defense.

I wonder how much of the response reported above is due to generalized feelings of stress versus feelings of lowered self esteem. Would other forms of stressors cause the same response? Or in response to stressors that do not remind us of our mortality would more people eat chocolate as a way to reduce the feelings of stress?

In the future the ability to manage one's emotional state might reduce charitable behavior. If people feel up and unstressed all the time I bet they are a lot less likely to donate to charities. Add in some future nanotech implants that automatically release compounds that block stress or negative emotions and the desire to engage in charitable behavior might decline markedly. I'd love to see a study of the charitable giving of depressed people before and after they go on Prozac or Paxil. I bet anti-depressants reduce charitable giving by emotionally distancing people from those who suffer.

By Randall Parker 2005 October 07 03:53 PM  Brain Economics
Entry Permalink | Comments(6)
2005 October 06 Thursday
Tallness In Women Correlates With Masculine Ambitions

Taller women are more ambitious.

Scots academics questioned 1,220 women from the UK, United States, Canada and Australia and found the taller ones were less broody, had fewer children and were more ambitious. They were also likely to have their first child at a later age.

Shorter women tended to be more maternal and homely, according to research carried out by psychologists Denis Deady, of Stirling University, and Miriam Law Smith, from St Andrews University.

The researchers theorize that higher testosterone made the women taller while simultaneously causing their brains to develop more masculine features.

They conclude, rather, that taller women have more of the male sex hormone testosterone, which could give them more “male” traits, such as being assertive, competitive and ambitious.

This is a plausible hypothesis. But note that they have not done any biological testing; such studies are orders of magnitude more expensive. Also, testosterone might be higher in taller women only during development, so testing adult women might not catch the higher levels.

It would be interesting to know whether homosexual women are taller than heterosexual women on average.

Also, adjusted for IQ, are taller women more likely to commit crimes than shorter women? One would expect greater masculinity to correlate with higher crime rates, all else being equal.

Also, do taller women have a higher ratio of mathematical and spatial reasoning aptitudes to verbal aptitudes as compared to shorter women?

Taller women are driven less by the desire to have children and more by the desire to succeed in a career.

Ms Law Smith, 27, who is 5ft 7in and has no children, said: "We related the height of every woman with their scores. It wasn't so much that women above a certain height were less maternal . . . more that the taller she was the less maternally driven she was likely to be.

I also wonder whether particular types of diets during development cause changes in the relative ratios of sex hormones. For example, would a high protein versus high fat versus high carbohydrates diet cause small but significant differences in the absolute levels and relative ratios of androgen hormones during development? Or would higher saturated fat versus less saturated fat cause differences in absolute and relative androgen levels? Or how about omega 3 versus omega 6 ratios?

In the longer run will society become more masculine? Will people use offspring genetic engineering techniques to make their daughters slightly more masculine in order to make them more ambitious to pursue careers? Will they make sure all their sons are not wimps? Will most countries become societies of alpha personalities and much higher levels of competition?

By Randall Parker 2005 October 06 11:17 AM  Brain Development
Entry Permalink | Comments(28)
2005 October 05 Wednesday
1918 Killer Pandemic Was An Avian Flu

A complete recreation of the 1918 H1N1 pandemic flu strain shows that it was an avian flu.

Scientists have re-created the “Spanish flu” virus that killed up to 50 million people in 1918-19 and shown that it shared traits with the H5N1 strain of avian flu.

An analysis of the re-created pathogen has shown that, like its modern cousin, it began as a bird virus and jumped species into humans with mutations that made it peculiarly virulent and lethal.

Dr. Jeffrey Taubenberger of the US Armed Forces Institute of Pathology and other scientists have just published papers in Science and Nature demonstrating that the 1918 virus was avian in origin.

"We now think that the best interpretation of the data available to us is that the 1918 virus was an entirely avian-like virus that adapted to humans," Taubenberger told reporters in a telephone briefing.

"It suggests that pandemics can form in more than one way."

The more deadly 1918 pandemic virus is unlike the 1957 and 1968 flus in that it did not recombine with human influenza strains. That the 1918 strain avoided such recombination and was at the same time orders of magnitude more lethal is probably not a coincidence.

"We now think that the 1918 virus was an entirely avian-like virus that adapted to humans," said Mr. Taubenberger. "This is a different situation than the last two pandemics we had, the Asian flu in 1957 and the Hong Kong flu in 1968, which are mixtures in which a human-adapted influenza virus acquired two or three new genes from an avian influenza source. So it suggests that pandemics can form in more than one way, and this is a very important point."

He says it also suggests that the current Asian bird flu, known by its scientific designation H5N1, could evolve into a human killer with just a few more mutations that allow it to jump more efficiently among people.

"It suggests to us the possibility that these H5 viruses are actually being exposed to some human adaptive pressures and that they might be acquiring some of these same changes," he added. "In a sense, they might be going down a similar path that ultimately led to 1918."

This is the most important fact here: The 1918 H1N1 influenza virus did not need to co-infect a human and swap genes with a human influenza strain in order to gain the mutations needed to cause a highly lethal human influenza pandemic. That ups the probability that H5N1 could start a human pandemic.

The scientists took the recreated 1918 virus and performed experiments on it in a CDC Atlanta lab to identify proteins and genes key to its virulence.

"We felt that we had to re-create the virus and run these experiments to understand the biological properties that made the 1918 virus so exceptionally deadly," said Terrance Tumpey, a flu researcher at the Centers for Disease Control and Prevention and lead author of the Science study. "We wanted to identify the specific genes responsible for virulence, which we feel will advance our ability to prepare vaccines and make antiviral medicines that are effective against future pandemic strains."

Although the genetic data has been made part of a public database, the 10 or so vials of the virus itself - grown in human kidney cells - are contained under tight security guidelines set for potential biological weapons at the CDC's lab in Atlanta.

Fox News has a good article reporting that some of the mutations which H5N1 has already picked up are similar to 1918 H1N1 mutations and probably are moving H5N1 closer to human transmissibility and a human pandemic.

The good news is that the H5N1 flu bug still has a long way to go. The 1918 bug seemed to need several changes in every one of its eight genes. The H5N1 virus is making similar changes but isn't very far along.

"So, for example, in the nuclear protein gene we speculate there are six genes crucial [for human adaptation]," Taubenberger says. "Of those six, three are present in one or another H5N1 strain. But usually there is only one of these changes per virus isolate. That is true of other genes as well. You see four, five, or six changes per gene in the 1918 virus, whereas H5N1 viruses only have one change or so. It shows they are subjected to similar [evolutionary] pressures, but the H5 viruses are early on in this process."

Which H5N1 strains is Taubenberger comparing the 1918 H1N1 strain to? How old are the strains he is comparing to? Has he compared to any H5N1 strains isolated from recent Indonesians who have recently died from bird flu?

H5N1 appears to have picked up important mutations on the road toward human adaptation including one that has made it much more lethal in mice.

There is one ominous sign. It's in a flu gene protein called PB2. A single change in this gene makes H5N1 extremely deadly to mice. The same single change helps bird flu to adapt to mammals.

For example, the change in PB2 was seen in six of the seven H5N1 viruses spreading among captive tigers in Thailand.

Robert Webster of St. Jude Children's Research Hospital showed in a PNAS paper in the summer of 2004 that H5N1 has become much more lethal in mice and this is an indicator that H5N1 is becoming better adapted to mammals. Webster also points to an expanded range for H5N1 including tigers and domestic cats. The expanded range gives the virus more ecological niches in which it can further adapt to mammals and pick up more mutations that would help it become transmissible in humans.

Also check out a CDC report on possible H5N1 transmission between tigers. I realize some of my readers think I'm being excessively alarmist by writing posts about avian flu. But I just do not see humans as so different from all the other mammals as to think that a virus that is hopping between a bunch of species is going to draw the line and avoid humans. These new reports about the 1918 H1N1 strain, which hopped from birds to humans, and the parallels with H5N1 strike me as a strong reason not to dismiss the threat this virus poses.

The New Scientist has a good article on the latest findings. The 1918 H1N1 virus was less dependent on cellular machinery to replicate.

Meanwhile, Terrence Tumpey at the US Centers for Disease Control in Atlanta and colleagues used the sequences to rebuild the virus itself, and infect mice with it. They report this week that unlike other flu viruses, 1918 does not need a protein-splitting enzyme from its surroundings to replicate, instead using some hitherto-unknown mechanism. And as in 1918, it rapidly destroys lungs (Science, vol 310, p 77).

Pathogens that jump species are a lot more lethal than pathogens that have been transmitted primarily between members of a species for a long time. The knowledge that the 1918 influenza came from birds with no recombination with human influenza strains should give pause to anyone wondering whether H5N1 poses a serious threat.

Update: It probably took only a couple dozen mutations to turn the 1918 influenza into a massive killer.

The bird flu viruses now prevalent share some of the crucial genetic changes that occurred in the 1918 flu, scientists said, but not all. The scientists suspect that with the 1918 flu, changes in just 25 to 30 out of about 4,400 amino acids in the viral proteins turned the virus into a killer. The new work also reveals that 1918 virus acts much differently from ordinary human flu viruses. It infects cells deep in the lungs of mice and infects lung cells, like the cells lining air sacs, that would normally be impervious to flu. And while other human flu viruses do not kill mice, this one, like today's bird flus, does.

As these researchers advance along in their work they are going to come up with much better metrics for measuring how far the H5N1 avian flu is from being capable of creating a pandemic in humans.

Update II: Henry Niman claims the 1918 flu was the product of a human flu and swine flu recombination.

By Randall Parker 2005 October 05 09:56 PM  Pandemic Signs
Entry Permalink | Comments(6)
2005 October 04 Tuesday
Carbon Dioxide Emissions Continue Rapid Increase

A rapid rise in fossil fuels burning is causing a rapid rise in carbon dioxide emissions.

World energy-related carbon dioxide (CO2) emissions rose by 4.5% last year, their fastest rate since 2000, according to first estimates by German economics institute DIW. The figures show that EU-15 emissions climbed only marginally in 2004 after increasing significantly in 2003.

Nearly half of carbon dioxide emissions growth in 2004 came from China. Continued economic growth in China and elsewhere in Asia means even more growth in CO2 emissions.

DIW's early review of 2004 data confirms China as currently the major driver of global emissions growth. It released an extra 579m tonnes of CO2 in 2004, a year-on-year increase of 15%. In comparison, world emissions increased by 1.2bn tonnes to stand at 27.5bn tonnes, or 26% above their 1990 level.
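The quoted DIW figures can be cross-checked with back-of-envelope arithmetic:

```python
# Sanity check on the DIW figures quoted above.
china_increase = 579e6          # tonnes CO2, China's 2004 increase
world_increase = 1.2e9          # tonnes CO2, world 2004 increase
world_total = 27.5e9            # tonnes CO2, world total in 2004
above_1990 = 0.26               # world total stands 26% above the 1990 level

china_share = china_increase / world_increase      # "nearly half" of the growth
level_1990 = world_total / (1 + above_1990)        # implied 1990 baseline
print(f"China's share of growth: {china_share:.0%}; "
      f"implied 1990 level: {level_1990/1e9:.1f}bn tonnes")
```

China's 579 million extra tonnes is about 48 percent of the 1.2 billion tonne world increase, matching the "nearly half" claim, and the figures imply a 1990 baseline of roughly 21.8 billion tonnes.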

The folks who fear global warming are going to find themselves talking to a wall (a "Great Wall") if they try to convince the Chinese to stop their increase in fossil fuels consumption.

Emissions growth in industrialised countries in 2004 was far less rampant. Energy-related CO2 rose by 1.3% across the OECD area, DIW reported. In the USA it increased by 1.4%. In the old EU-15 countries it rose by 0.7%, less than half the rate of increase in 2003, according to official EU figures (ED 21/06/05).

Meanwhile, DIW estimates that EU-15 emissions of all six Kyoto greenhouse gases rose by just 0.3% in 2004, again well down on their 1.3% increase in 2003 according to official EU figures. According to the German institute, EU-15 emissions are now 1.4% below their 1990 level compared with a commitment to minus 8% by 2010.

The "EU-15" refers to the core, more industrialized western European Union countries. Note the 0.7% rise in CO2 emissions in the EU-15 versus 1.4% in the US and 1.3% in the whole set of OECD countries. The EU-15 aren't managing to stop, let alone reverse, CO2 emissions growth. The EU is growing more slowly in population and in total economic output than the United States. So a substantial portion of the EU-15's slower growth in CO2 is a consequence of slower economic growth rather than the success of government policies aimed at reducing emissions.

To make their Kyoto Accord goals by 2010 the EU-15 will have to reverse emissions growth very soon and achieve substantial reductions every year. I do not see them accomplishing this goal. They'd have to slow their economies even more. That's politically unpalatable. The EU is getting a lot of help from high oil and natural gas prices. But unless oil prices go higher still I'm skeptical that the EU countries will honor their Kyoto commitments.
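The scale of the required reversal can be made concrete. Taking the figures quoted above at face value (EU-15 emissions 1.4% below the 1990 level as of 2004, target of 8% below by 2010), the implied annual rate of decline is:

```python
# Annual emissions decline the EU-15 would need to hit its Kyoto target,
# using the DIW figures quoted above (1.4% below 1990 in 2004, -8% by 2010).
level_2004 = 1 - 0.014   # emissions relative to the 1990 baseline
target_2010 = 1 - 0.08   # target relative to the 1990 baseline
years = 2010 - 2004

required_annual_change = (target_2010 / level_2004) ** (1 / years) - 1
print(f"required change: {required_annual_change:.2%} per year")
```

That works out to a cut of roughly 1.15 percent every year for six straight years, against a recent track record of annual increases of 0.3 to 1.3 percent.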

If the countries with the strongest dedication to CO2 emissions reduction cannot manage any better than they've achieved so far, what chances are there for the rest of the world to stop CO2 emissions growth? Not gonna happen, folks.

The best hope the CO2 emissions reduction advocates have is if Matthew Simmons and the other "Peak Oil" advocates are right and the peak of oil production is on the near horizon of the next 5 to 10 years. That'd put a huge brake on CO2 emissions growth. However, even if conventional oil production peaks my bet is that massive investments will bring on big production increases from oil shale in Wyoming and Colorado, oil tar sands in Alberta Canada, and coal in several countries.

CO2 emissions will eventually decline substantially when solar, wind, and nuclear energy become cheap enough to substitute for fossil fuels. Also, advances in battery technology would enable the use of solar, wind, and nuclear energy for transportation, and that would shift a lot of demand away from fossil fuels. People who fear global warming (and I'm not yet convinced it will be severe or a net harm) should join those who think technological advances are the way to bring the fossil fuel age to an end. I want to obsolesce fossil fuels for other reasons, including the desire for cleaner air, less flow of money to Islamic theocracies, elimination of a big import expense, and greater economic efficiency and economic growth from the development of better energy technologies. Those seem compelling enough reasons regardless of climate effects.

Update: An EU press release from 2004 shows that the EU core states experienced greenhouse gas emissions increases in 2 out of 3 years in the 2000-2002 period.

Greenhouse gas emissions from the EU's 15 pre-2004 member states dropped by 0.5% between 2001 and 2002, latest estimates compiled by the European Environment Agency show.

The reasons for the decrease include warmer weather in most EU countries which reduced the use of carbon dioxide-producing fossil fuels to heat homes and offices. Slower economic growth in manufacturing industries, which also lowered fossil fuel use, a continuing shift from coal to gas and specific measures to reduce greenhouse gas emissions were the other main reasons.

Emissions of the six greenhouse gases had risen by 0.2% and 1.3% a year in 2000 and 2001 respectively.

The fall in 2002 took total EU15 emissions to 2.9% below their level in the base year used for calculations - 1990 in most cases.

A mild winter and slow economic growth in a single year allowed EU-15 emissions to decline. Hardly a sign that they are on track to meet their goals.
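As a rough sanity check on how steep the required path is: the 2.9%-below-1990 figure for 2002 comes from the press release above, while the EU-15 Kyoto target of 8% below 1990 levels by 2008-2012 is my assumption, not stated in the release.

```python
# Required average annual emissions change for the EU-15 to move from
# 2.9% below 1990 levels (2002) to 8% below 1990 (the Kyoto target
# figure is my assumption, not stated in the press release above).
level_2002 = 1.0 - 0.029     # 2002 emissions, as a fraction of 1990
target = 1.0 - 0.08          # Kyoto target, as a fraction of 1990
years = 2010 - 2002          # taking 2010 as the midpoint of 2008-2012

required_annual_change = (target / level_2002) ** (1 / years) - 1
print(f"Required change per year: {required_annual_change:.2%}")  # about -0.67%
```

Even that modest-looking rate would mean cutting emissions every single year through 2010, whereas the EU-15 actually grew emissions in two of the three years from 2000 to 2002.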

By Randall Parker 2005 October 04 04:45 PM  Energy Policy
2005 October 03 Monday
Influenza Hits Preschoolers First

Influenza cases show up several weeks earlier in preschoolers than in adults.

Current immunization policies recommend universal flu vaccination for children aged 6-23 months, but shots are advised for older children only if they have high-risk medical conditions. New data compiled by researchers at Children's Hospital Boston and Harvard Medical School, reported in the October 1 issue of the American Journal of Epidemiology, suggest that otherwise healthy 3- and 4-year-olds drive flu epidemics, a pattern that may warrant consideration when formulating immunization policy.

The researchers leveraged a real-time computerized biosurveillance system linking five diverse health-care settings in Greater Boston, and examined medical visits from 2000 to 2004. Children aged 3 to 4 clearly led influenza epidemics, presenting with flu-like respiratory illness as early as late September. Children aged 0-2 began arriving a week or two later, while older children first arrived in October and adults began arriving only in November.

Moreover, flu-like illness in children under age 5, compared with all other age groups, was the most predictive of pneumonia and influenza deaths in the general population as determined from a Centers for Disease Control and Prevention database. Visits by children aged 0-2 provided the best prediction of mortality, but those of 3- and 4-year-olds followed close behind, suggesting that preschoolers, not just infants and toddlers, are important spreaders of flu to vulnerable groups.

"The data make sense because preschools and daycares, with their close quarters, are hotbeds of infection," says Dr. John Brownstein, the paper's lead author and a faculty member of the Children's Hospital Informatics Program at the Harvard-MIT Health Sciences and Technology program. "The data suggest that when kids are sneezing, the elderly begin to die. Three- and 4-year-olds are sentinels that allow us to focus our surveillance systems."

Influenza kills tens of thousands of Americans each year. Previous studies have shown decreases in household flu transmission and in adult flu mortality when children are immunized. Additional studies have also suggested that preschoolers drive flu epidemics, but they are based on simulations.

"Our study was not a simulation," says senior investigator Dr. Kenneth Mandl, an attending physician in Children's Department of Emergency Medicine and an informatics program faculty member. "This was real life."

With that previous report in mind, consider this next one. H5N1 bird flu is currently very lethal to children.

The disease has been particularly deadly for children. In Thailand, 89% of patients under the age of 15 years died an average of nine or 10 days after illness onset.

Protecting children is the biggest problem in a pandemic. During a highly lethal pandemic schools will be closed. But lots of kids are in day care, and kids come into contact with each other while playing. Also, with so many working mothers, kids end up under the care of others during the day. This sets up transmission belts of influenza from home to children to day care environments to other kids and then back to other homes.

Some ideas to protect children from a deadly flu pandemic:

  • Keep all boarding schools open but quarantined. Staff and teachers should live on the premises and agree that if they leave they can't come back. Only occasional delivery trucks should enter the grounds to drop off large crates of supplies which then get collected by staff after the delivery trucks leave.
  • Some summer camps for children should get converted into year-round boarding schools with the same quarantine rules as those put in place for boarding schools.
  • Child labor laws should get temporarily lifted so that adolescents can join the workforce to work and live at quarantined workplaces (which I call "workplace cocoons").
  • Farmhouses out in the plains states should take on children as boarders to live in isolation. The farmers would have to agree to avoid all contact with outsiders except for rare, carefully made large supply deliveries.
  • Where economically feasible mothers could stay home and home school their children. If they have spouses the spouses could stay somewhere else in order to avoid bringing infection into the home.
  • Turn abandoned towns (such places exist in the US plains states) into quarantined camps for children.
  • Take hotels and turn them into quarantined boarding schools.
  • Poorly behaving children should be put in separate facilities to reduce quarantine rule breaking and skin contact when kids beat on other kids.
  • Identify isolated islands (e.g. Catalina) and valleys where entry and exit could be controlled and where children could move to stay in homes and in camps set up over larger areas. This would allow much larger areas in which kids could roam, go to school, and function in more normal fashion.

Moving children or adults into longer-term quarantine facilities requires intermediate quarantine facilities where people stay even more isolated for a week or two to establish that they are not already infected. So one problem is how to process large numbers of children through roughly 10-day cycles to separate the infected from the infection-free. The infection-free could then be transported under quarantine to longer-term living facilities of the sorts listed above.

Children do not work in the economy. Therefore their isolation from society should be easier to manage than that of working adults. At the same time, children are the biggest transmission belts for influenza, so their isolation would do more to reduce the spread of the virus than the isolation of most other members of society. Plus, they would be at great risk of dying in an H5N1 pandemic.

By Randall Parker 2005 October 03 01:29 PM  Pandemic Prepare Children
2005 October 02 Sunday
Finland To Vaccinate Entire Population Against Avian Flu

Finland's policy makers are taking a bold step against the threat of an avian flu pandemic.

In response to a recommendation by the World Health Organization, according to which the avian influenza pandemic threat is real, Finland is preparing to vaccinate its entire population against the disease.

The argument against doing this is that if H5N1 bird flu makes the mutational jump into easy transmissibility in human populations then that strain may be (probably would be) immunologically different than whatever H5N1 strain(s) Finland chooses to immunize against.

So is the Finnish government wasting time and money? I can think of two arguments for why a preliminary vaccine might help, and I'd really like to know whether either has scientific merit: A) partial immunity from a premature vaccine would reduce the lethality of an eventual pandemic infection, and/or B) vaccination with a premature H5N1 vaccine would reduce the dose size or number of doses needed for a later vaccine built on a more exact antigen target matched to an eventual pandemic strain (assuming such a strain arises).

One reason put forward for why avian H5N1 influenza vaccine doses must be very large (see below) is that humans have few antibodies aimed at anything remotely like H5N1, since it is a bird influenza. Well, couldn't we treat the current strains of H5N1 as trainer strains to teach the human immune system about that category of influenza? Is there some reason to expect a future pandemic strain of H5N1 to be much more antigenically different from current H5N1 than human influenza strains are from each other over a period of a few years?

If either of these arguments has merit then affluent folks who see avian flu as a big threat might want to start planning a medical tourism trip to Finland or to any other country that announces plans to vaccinate their entire population.

Recent vaccine testing for H5N1 bird flu found that 12 times the normal vaccine dose was needed for a good immune response. That's very bad news. At such a large dose size, current worldwide vaccine production capacity could make only 25 million doses per year against H5N1 influenza. By contrast, when used against conventional human influenza strains that same capacity translates to 300 million yearly doses. The French vaccine maker Sanofi Pasteur is rumored to be conducting trials using just 3.5 micrograms of H5N1 antigen (compared to the standard 15 microgram dose, or two 90 microgram doses for H5N1) plus an adjuvant compound to enhance immune response. If Sanofi Pasteur's trial works and all vaccine makers can follow the same approach, then worldwide yearly production would effectively quadruple to 1.2 billion doses per year.
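The capacity figures above hang together arithmetically. A back-of-the-envelope sketch, using only the dose sizes and the 300 million dose baseline quoted in this post:

```python
# Back-of-the-envelope check on the vaccine capacity figures quoted above.
NORMAL_DOSES_PER_YEAR = 300_000_000   # world capacity in standard flu doses
NORMAL_DOSE_UG = 15                   # micrograms of antigen per standard dose

# Total antigen production capacity, in micrograms per year.
antigen_capacity_ug = NORMAL_DOSES_PER_YEAR * NORMAL_DOSE_UG

# H5N1 test vaccines: two 90 microgram doses per person.
h5n1_course_ug = 2 * 90
h5n1_courses = antigen_capacity_ug / h5n1_course_ug
print(f"H5N1 two-dose courses per year: {h5n1_courses:,.0f}")  # 25,000,000

# Adjuvanted single 3.5 microgram dose (the Sanofi Pasteur trial figure).
adjuvanted_doses = antigen_capacity_ug / 3.5
print(f"Adjuvanted doses per year: {adjuvanted_doses:,.0f}")   # ~1.29 billion
```

Note the adjuvanted figure comes out slightly above the 1.2 billion quoted, which reads as "quadruple the 300 million baseline" rounded conservatively.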

Current H5N1 test vaccines have to be taken as two large doses several weeks apart (and, yes, you would therefore be at risk of infection for those several weeks). The ability to produce much smaller single dose vaccines would greatly reduce the length of a pandemic, the death toll, and the size of the economic disruption. Therefore a lot potentially rides on a successful outcome of Sanofi Pasteur's trials.

70% of the world's flu vaccines are produced in Europe.

Most of the world's flu vaccine is produced in nine countries: Australia, Britain, Canada, France, Germany, Italy, Japan, the Netherlands and the United States. Europe produces 70% of the vaccines. And Europe's vaccine producers are worried.

Each nation on that list is a potential place to get a preliminary bird influenza vaccine before a pandemic hits. How about a medical tourism tour of European bird flu vaccine producing countries? Get about 3 or 4 different preliminary H5N1 vaccines and hope for a decent amount of partial immunity.

Canada's "pandemic readiness" fee contract for vaccine production sounds like the best idea I've heard for flu pandemics so far.

Canada, for example, recently signed a ten-year agreement with a manufacturer for its seasonal vaccine supply, and the country also pays an annual “pandemic readiness fee” which stipulates the company has the capacity to produce 8m doses of vaccine per month for four months.

Canada would still face an up-front delay of some months (3, 4, 5, 6 perhaps?) to develop the vaccine and put it into production. But if its supplier can live up to that contract, then any pandemic would last less than a year in Canada. My guess, though, is that the contract assumes normal vaccine dose sizes, so Canada's preparedness probably still depends on an optimistic expectation about dose size.
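The dose-size caveat matters because the contracted volume only just covers the country once over. A quick sketch (the roughly 32 million population figure for Canada circa 2005 is my assumption, not from the article):

```python
# Rough check on Canada's "pandemic readiness" contract: 8m doses/month
# for four months. The ~32 million population figure is my assumption.
DOSES_PER_MONTH = 8_000_000
MONTHS = 4
POPULATION = 32_000_000

total_doses = DOSES_PER_MONTH * MONTHS
print(f"Contracted doses: {total_doses:,}")  # 32,000,000

# At one dose per person the contract covers the whole population...
print(f"Coverage at 1 dose/person:  {total_doses / POPULATION:.0%}")
# ...but at the two-large-dose H5N1 regimen, coverage is halved.
print(f"Coverage at 2 doses/person: {total_doses / (2 * POPULATION):.0%}")
```

So under the two-dose regimen the contract would cover only about half the population in those four months, which is why the assumption about dose size carries so much weight.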

By Randall Parker 2005 October 02 09:24 PM  Pandemic Vaccines