2006 July 31 Monday
Corn Price Rise Coming Due To Use For Ethanol

The growing use of corn to produce ethanol is expected to drive up the price of corn by about 25% within a single year.

Fresh signs of ethanol's new economic impact are expected soon. After languishing for years, corn prices are projected to rise about 25 percent from around $2.00 a bushel currently to $2.45 a bushel this next crop year, reports the US Department of Agriculture (USDA). But as ethanol demand for corn kicks in, prices could go much higher in the future depending on gasoline prices. Meat and grocery prices could eventually rise as well, some analysts say.

"Ethanol has had huge impact on corn markets," says Jason Hill, a University of Minnesota researcher and coauthor of a study on ethanol's environmental impact published in the proceedings of the National Academy of Science last month. "Competition between food and fuel is growing, along with the environmental consequences as more ethanol facilities are built," the study says.

The drive to produce food-based biofuels is misplaced, because even if all US corn and soybeans were used, they "would meet only 11 percent of gasoline demand and 8.7 percent of diesel demand. There is a great need for renewable energy supplies that do not cause significant environmental harm and do not compete with food supply," the study says.

The rising use of biomass for energy production is going to put food buyers in direct competition with car drivers for the same agricultural output.

The price of meat will rise as a result.

One key impact is that the price of feed corn for cattle, hogs, and poultry could rise 60 to 70 percent over the next two years, although meat and other grocery items may not see significant price gains for up to four years, Wisner says.

So will the price of popcorn, corn tortillas, and corn muffins for that matter.

The rise in demand for corn to produce ethanol might be short-lived. The development of cellulosic technology will eventually enable bushes, trees, and most notably perennial switchgrass to be used to produce ethanol.

Clearly, there's a great deal of potential energy to be tapped. A study at Argonne National Laboratory estimates that a gallon of ethanol produced from kernels of corn in today's processes provides about 20,000 BTUs more energy than the energy that went into making it. The study projects that using cellulose from switchgrass would triple that net gain, to about 60,000 BTUs per gallon, mostly because little fossil fuel would be used in farming the grass. But costs need to come down to make this practical.
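
To make those net-gain figures easier to compare, here is a minimal Python sketch that converts them into energy-return ratios. The 76,000 BTU per gallon figure for ethanol's energy content is my own assumption (a standard lower-heating-value estimate), not a number from the Argonne study:

    # Convert the Argonne net-gain figures into energy return on investment.
    ETHANOL_BTU_PER_GALLON = 76_000  # assumed lower heating value of ethanol

    def eroi(net_gain_btu_per_gallon):
        """Energy out divided by energy in, per gallon of ethanol."""
        energy_in = ETHANOL_BTU_PER_GALLON - net_gain_btu_per_gallon
        return ETHANOL_BTU_PER_GALLON / energy_in

    print(f"corn kernel ethanol: {eroi(20_000):.2f}")  # ~1.36
    print(f"switchgrass ethanol: {eroi(60_000):.2f}")  # ~4.75

By that rough measure, tripling the net gain more than triples the return on the fossil energy invested.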

It was this "cellulosic" ethanol that President Bush spoke about when he proposed adding $150 million to next year's federal budget for research into using switchgrass. Raab says switchgrass is appealing; for one thing, an acre of land can produce four times the mass of switchgrass as of corn. And switchgrass is far hardier and easier to grow than corn. "The energy balance for ethanol from switchgrass is tremendously better," he says. "It doesn't require all the fertilizer, all the irrigation, all the energy intensity that corn does."

Switchgrass has many environmental advantages over corn as an energy source.

Perennial grasses, such as switchgrass, and other forage crops are promising feedstocks for ethanol production. "Environmentally switchgrass has some large benefits and the potential for productivity increases," says John Sheehan of the National Renewable Energy Laboratory (NREL). The perennial grass has a deep root system, anchoring soils to prevent erosion and helping to build soil fertility. "As a native species, switchgrass is better adapted to our climate and soils," adds Nathanael Greene, NRDC Senior Policy Analyst. "It uses water efficiently, does not need a lot of fertilizers or pesticides and absorbs both more efficiently."

Switchgrass already produces much more energy per acre than corn. The problem is that we need cheaper and more efficient ways to break down cellulose, a sugar polymer, into simple sugars that can be fermented into ethanol. Once the cellulosic technologies mature, breeding programs could more than double switchgrass yield per acre and further widen its advantage over corn.

"The key to producing enough ethanol is switchgrass," says Greene. Switchgrass shows great potential for improving yields, offers environmental benefits and can be grown in diverse areas across the country. Current average yields are five dry tons per acre. Crop experts have concluded standard breeding techniques, applied progressively and consistently, could more than double the yield of switchgrass. Yield improvements predicted by the report of 12.4 dry tons per acre are in keeping with results from breeding programs with crops such as corn and other grasses. The innovations discussed have a net effect of reducing the total land required to grow switchgrass to an estimated 114 million acres. Sufficient switchgrass could be grown on this acreage to produce 165 billion gallons of ethanol by 2050, which is equivalent to 108 billion gallons of gasoline. The next logical question is how do we integrate switchgrass production into our agricultural systems. The answer lies with the ability to produce animal protein from switchgrass. "If we have cost-effective agricultural policy, farmers will rethink what they plant," says Lynch "For example, we are using 70 million acres to grow soybeans for animal feed. You can grow more animal feed protein per acre with switchgrass. If there were a demand for biomass feedstocks to produce ethanol and other biofuels, farmers would be able to increase their profits by growing one crop producing two high value products."

To put that equivalent of 108 billion gallons of gasoline in perspective: the United States consumes over 320 million gallons of gasoline per day, or about 117 billion gallons per year. So in theory 114 million acres of land (a bit over a third of an acre per person) could produce enough switchgrass to power all cars in the United States. For comparison, US farmers plant about 74 million acres of soybeans, 81 million acres of corn, 14 million acres of cotton, and 59 million acres of wheat. So planting 114 million acres of switchgrass is not impossible by any means.

The United States covers 2.3 billion acres in total, with 442 million acres, or about 19%, used by crops. Devoting 114 million acres to switchgrass for ethanol would therefore increase cropland usage by about a quarter.
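
As a sanity check on the acreage and fuel figures in the last few paragraphs, here is a quick Python sketch. The 300 million US population figure is my assumption for the per-capita number; everything else comes from the numbers quoted above:

    # Sanity-check the switchgrass land and fuel arithmetic.
    switchgrass_acres = 114e6
    ethanol_gallons = 165e9
    gasoline_equivalent_gallons = 108e9
    gasoline_gallons_per_day = 320e6
    us_population = 300e6        # assumed 2006 US population
    total_cropland_acres = 442e6

    print(gasoline_gallons_per_day * 365 / 1e9)           # ~116.8 billion gallons/year
    print(ethanol_gallons / switchgrass_acres)            # ~1,450 gallons of ethanol per acre
    print(gasoline_equivalent_gallons / ethanol_gallons)  # ~0.65: ethanol carries about 2/3 the energy of gasoline
    print(switchgrass_acres / us_population)              # ~0.38 acres per person
    print(switchgrass_acres / total_cropland_acres)       # ~0.26: about a quarter of current cropland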

Even if switchgrass becomes a really cheap way to produce biomass energy, that won't prevent a rise in the price of corn. Some of the tens of millions of acres put into production for switchgrass will be land that otherwise would have been planted with corn, wheat, soy, and other crops used to feed humans and livestock.

My standard rant on biomass: The development of cheap photovoltaics would allow land that is not used for food crops to produce energy. Much of the surface area covered by cheap photovoltaics will be existing buildings and other structures built by humans. Biomass energy competes with wild plants and animals for use of the same land. Photovoltaics (and nuclear power for that matter) leave more of nature in its natural state.

Some Americans look at the US, see huge amounts of wide open spaces, and conclude that expanded planting of crops will have little impact. But the development of cheap cellulosic technologies will also create demand for expanded planting in parts of the world far more densely populated and already suffering from shrinking natural areas. Think of India for example and imagine large chunks of its land shifted into biomass energy production.

My guess is that the cellulosic technology problems will get solved and we will witness a huge shift toward use of switchgrass to produce ethanol.

By Randall Parker 2006 July 31 10:47 PM  Energy Biomass
2006 July 30 Sunday
Infections In 19th Century Caused Earlier Chronic Illnesses

Gina Kolata of the New York Times has written a great article surveying the growing body of evidence that earlier generations developed the classic diseases of old age sooner, due to infections in early childhood and poorer nutrition. (I strongly urge you to read the full article.)

New research from around the world has begun to reveal a picture of humans today that is so different from what it was in the past that scientists say they are startled. Over the past 100 years, says one researcher, Robert W. Fogel of the University of Chicago, humans in the industrialized world have undergone “a form of evolution that is unique not only to humankind, but unique among the 7,000 or so generations of humans who have ever inhabited the earth.”

We humans alive today are physically very different, on average, from previous generations.

In previous centuries heart disease, lung disease, and other ailments showed up decades earlier in human lives.

The biggest surprise emerging from the new studies is that many chronic ailments like heart disease, lung disease and arthritis are occurring an average of 10 to 25 years later than they used to. There is also less disability among older people today, according to a federal study that directly measures it. And that is not just because medical treatments like cataract surgery keep people functioning. Human bodies are simply not breaking down the way they did before.

What is most interesting about these results is the suspected causes: events in the womb and in early childhood can set people up for chronic diseases decades later.

The proposed reasons are as unexpected as the changes themselves. Improved medical care is only part of the explanation; studies suggest that the effects seem to have been set in motion by events early in life, even in the womb, that show up in middle and old age.

“What happens before the age of 2 has a permanent, lasting effect on your health, and that includes aging,” said Dr. David J. P. Barker, a professor of medicine at Oregon Health and Science University in Portland and a professor of epidemiology at the University of Southampton in England.

But it is too late for us to go back in time and tell our mothers to avoid people with colds and flus and other infectious diseases. Our bodies are damaged even from before birth. To fix that damage we need gene therapy, stem cells, and the rest of the panoply of coming rejuvenation therapies.

We are taller and heavier, live longer, and get sick later. Almost half of 65-year-olds can expect to reach 85. I want that percentage to rise much higher.

In 1900, 13 percent of people who were 65 could expect to see 85. Now, nearly half of 65-year-olds can expect to live that long.

People even look different today. American men, for example, are nearly 3 inches taller than they were 100 years ago and about 50 pounds heavier.

One factor that is different today is that we get infected less often and suffer from infections for shorter periods of time. Improved hygiene (e.g., refrigeration and a variety of methods of killing and avoiding food-borne pathogens), vaccines, antibiotics, better nutrition, and less exposure to extremes of weather all reduce our rates of infectious disease.

Even if one does not die while infected, infectious diseases take their toll and accelerate aging in a number of ways. First, the pathogens directly damage the body. Second, the immune system's response does damage: in the process of attacking pathogens, chemical compounds released by immune cells cause collateral damage to our own cells and tissues. Third, infection reduces our ability to stay nourished, through decreased appetite, diarrhea, decreased ability to do the activities that bring in food, and other mechanisms. A reduction in infectious disease exposure has therefore reduced the rate at which our bodies accumulate damage.

Conventional wisdom has it that people live longer today because when they do get sick medical treatments can keep them alive. But Dr. Fogel's study of US Civil War veteran medical records shows that back then people got serious illnesses at much younger ages, decades sooner than today, and lived with these illnesses for much of their lives.

Instead of inferring health from causes of death on death certificates, Dr. Fogel and his colleagues looked at health throughout life. They used the daily military history of each regiment in which each veteran served, which showed who was sick and for how long; census manuscripts; public health records; pension records; doctors’ certificates showing the results of periodic examinations of the pensioners; and death certificates.

They discovered that almost everyone of the Civil War generation was plagued by life-sapping illnesses, suffering for decades. And these were not some unusual subset of American men — 65 percent of the male population ages 18 to 25 signed up to serve in the Union Army. “They presumably thought they were fit enough to serve,” Dr. Fogel said.

Suddenly travel to the past in a time machine has gotten a lot less attractive. Even if one could go back with more vaccines than we have today, the environment back then would take a heavy toll. Though if you went back, got rich, and chose a less severe environment, you could buffer yourself from some of the ravages of previous eras.

Note that people living back in the 1800s ate what today would be considered a much more natural diet. No pesticides. No trans fatty acids on french fries. But they had a much higher incidence of heart disease.

Eighty percent had heart disease by the time they were 60, compared with less than 50 percent today. By ages 65 to 74, 55 percent of the Union Army veterans had back problems. The comparable figure today is 35 percent.

That higher rate of heart disease could at least in part be due to chronic infections.

Economist Douglas V. Almond at Columbia University examined health records of children born around the time of the deadly 1918 influenza pandemic and found that women who were pregnant during the pandemic gave birth to children who fared much worse by several measures than children born just before or after the pandemic.

To his astonishment, Dr. Almond found that the children of women who were pregnant during the influenza epidemic had more illness, especially diabetes, for which the incidence was 20 percent higher by age 61. They also got less education — they were 15 percent less likely to graduate from high school. The men’s incomes were 5 percent to 7 percent lower, and the families were more likely to receive public assistance.

The effects, Dr. Almond said, occurred in whites and nonwhites, in rich and poor, in men and women. He convinced himself, he said, that there was something to the Barker hypothesis.

Pet peeve: I think employers should organize workplaces to reduce the incidence of disease transmission at work. Discourage sick people from working. I hate hearing people coughing over the cubicle walls and then seeing other people get sick. Not only is this economically costly, it is probably also shortening our lifespans. Workplace doors, bathrooms, kitchens, and other locations could be reworked to reduce touching of common surfaces.

You wash your hands in the lavatory sink but have to turn the faucet handle to turn off the water (how about foot pedals?) and then turn a door handle to get out of the room. Low cubicle walls also allow cough droplets to travel across a room. In workplaces where employers push wellness programs on employees, shouldn't the employers put more effort into reducing our odds of getting sick while at work?

These results go a long way toward confirming the arguments of evolutionary theorists Gregory Cochran and Paul Ewald that the role of infections in causing chronic illnesses has been much underestimated.

By Randall Parker 2006 July 30 02:23 PM  Aging Studies
2006 July 27 Thursday
Ampakines Reverse Brain Aging In Rats

Ampakines reverse some aspects of brain aging in rats.

A drug made to enhance memory appears to trigger a natural mechanism in the brain that fully reverses age-related memory loss, even after the drug itself has left the body, according to researchers at UC Irvine.

Professors Christine Gall and Gary Lynch, along with Associate Researcher Julie Lauterborn, were among a group of scientists who conducted studies on rats with a class of drugs known as ampakines. Ampakines were developed in the early 1990s by UC researchers, including Lynch, to treat age-related memory impairment and may be useful for treating a number of central nervous system disorders, such as Alzheimer’s disease and schizophrenia. In this study, the researchers showed that ampakine drugs continue to reverse the effects of aging on a brain mechanism thought to underlie learning and memory even after they are no longer in the body. They do so by boosting the production of a naturally occurring protein in the brain necessary for long-term memory formation.

I am surprised this was so easy to do. Some aspects of brain aging will require gene therapy, cell therapy, and other techniques to reverse. But this study's results strongly suggest that conventional drugs will play an important role in preventing and reversing brain aging.

Ampakines boosted a protein involved in memory formation and improved the quality of connections between nerve cells.

The researchers treated two groups of middle-aged rats twice a day for four days with either a solution that contained ampakines or one that did not. They then studied the hippocampus region of the rats’ brains, an area critical for memory and learning. They found that in the ampakine-treated rats, there was a significant increase in the production of brain-derived neurotrophic factor (BDNF), a protein known to play a key role in memory formation. They also found an increase in long-term potentiation (LTP), the process by which the connection between the brain cells is enhanced and memory is encoded. This enhancement is responsible for long-term cognitive function, higher learning and the ability to reason. With age, deficits in LTP emerge, and learning and memory loss occurs.

Significantly, restoration of LTP was found in the middle-aged rats’ brains even after the ampakines had been cleared from the animals’ bodies. The drug used in the injections has a half-life of only 15 minutes; the increase in LTP was seen in the rats’ brains more than 18 hours later. According to the researchers, this study suggests that pharmaceutical products based on ampakines can be developed that do not need to be in the system at all times in order to be effective. Most drugs used to deal with central nervous system disorders, such as Parkinson’s disease, are only effective when they are in the body. Further studies will be needed to determine exactly how long the effect on LTP will be maintained after the ampakines leave the system.
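
To see what "cleared from the body" means quantitatively, here is a minimal sketch assuming simple first-order (exponential) elimination; that kinetic assumption is mine, not the researchers':

    # Fraction of drug remaining after t hours, given a 15 minute half-life
    # and first-order (exponential) elimination.
    HALF_LIFE_HOURS = 15 / 60

    def fraction_remaining(t_hours):
        return 0.5 ** (t_hours / HALF_LIFE_HOURS)

    print(fraction_remaining(18))  # ~2.1e-22: effectively no drug left

Eighteen hours is 72 half-lives, so essentially no drug molecules remain at the time the enhanced LTP is still observed.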

The economic impact of drugs that reduce and reverse brain aging will be huge. People in their 50s, 60s, and 70s will be far more economically productive when brain aging can be reduced and even reversed. The question isn't whether this can be done but when it will be done.

By Randall Parker 2006 July 27 11:39 PM  Aging Reversal
Sulfur Could Reverse Global Warming

Nobel Prize winning chemist Paul Crutzen says we could cool the planet by injecting sulfur into the atmosphere.

Injecting sulfur into the atmosphere to slow down global warming is worthy of serious consideration, according to Nobel laureate Paul Crutzen from the Max Planck Institute for Chemistry in Germany and the Scripps Institution of Oceanography, University of California at San Diego. His thought-provoking paper is published in the August issue of the Springer journal Climatic Change, devoted this month to the controversial field of geoengineering.

The sulfur would reflect light back into space.

Crutzen’s proposed planet-saving scheme, which artificially injects sulfur into the earth’s stratosphere (the second atmospheric layer closest to earth) to offset greenhouse gas warming, is based on this phenomenon.

His “albedo enhancement method”, or, in other words, his proposed way of increasing the earth’s reflective powers so that a significant proportion of solar radiation is reflected back into space, aims to replicate the cooling effect these man-made sulfate particles achieve.

If we get into desperate straits, sulfur could be used as an emergency climate treatment. It would require continuous application, since the sulfur does not stay in the atmosphere.

In Crutzen’s experiment, artificially enhancing earth’s reflective powers would be achieved by carrying sulfur into the stratosphere on balloons, using artillery guns to release it. In contrast to the slowly developing effects of global warming associated with man-made carbon dioxide emissions, the climatic response of the albedo enhancement method could theoretically start taking effect within six months. The reflective particles could remain in the stratosphere for up to two years.

Would the sulfur cause acid rain or other problems? How big would those problems be? Volcanoes inject large amounts of sulfur. What other effects does that sulfur cause?

On most issues involving fears of worst case outcomes of human activity, my take is that we can use technology to prevent or reverse those outcomes. That's not an argument for total complacency. But it is an argument against claiming that civilization is going to collapse or that we are going to suffer terribly.

The best way to cut carbon dioxide emissions is to develop cleaner energy technologies that are cheaper than the dirtier ones. Then we'd get both cheaper energy and a cleaner environment.

By Randall Parker 2006 July 27 11:35 PM  Climate Engineering
2006 July 26 Wednesday
Potent H5N1 Avian Influenza Vaccine Developed

GlaxoSmithKline (GSK) has developed a vaccine for H5N1 flu that is potent in much lower doses than previously tested H5N1 vaccines.

In a clinical trial, 80% of volunteers who received two vaccine doses containing 3.8 mcg of antigen with an adjuvant (a chemical that stimulates the immune system) had a strong immune response, the British-based company said in a news release. A typical dose of seasonal flu vaccine is 15 mcg.

"This is the first time such a low dose of H5N1 vaccine has been able to stimulate this level of strong immune response," GSK Chief Executive Officer J.P. Garnier said in the news release.

By comparison, an H5N1 vaccine developed by Sanofi Pasteur induced a good immune response in 67% of volunteers who received two 30-mcg doses with an adjuvant, according to findings reported in May. The US government is stockpiling the Sanofi vaccine.

Garnier called the GSK vaccine a breakthrough because, with the effectiveness of the low dose, a given amount of antigen will go much further than it would otherwise.

"The meaning of this is that we are going to be in a position, starting later this year, to produce hundreds of millions of doses of an effective pandemic vaccine, so this is a big breakthrough," Garnier said on BBC Radio, as reported today by Agence France-Presse (AFP).

World influenza vaccine production capacity is too low in the event of a deadly pandemic flu outbreak. A vaccine that is potent in such a small dose greatly expands the number of doses that could be made in a year in response to a deadly pandemic.
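
The practical meaning of the lower dose is easy to see with a little arithmetic. This Python sketch compares two-dose courses obtainable from a fixed antigen supply; the one-kilogram supply figure is purely an illustrative assumption:

    # Two-dose vaccination courses from a fixed antigen supply.
    def courses(antigen_grams, mcg_per_dose, doses_per_course=2):
        micrograms = antigen_grams * 1e6  # 1 gram = 1,000,000 micrograms
        return micrograms / mcg_per_dose / doses_per_course

    SUPPLY_GRAMS = 1_000  # assumed 1 kg of antigen, illustrative only
    print(courses(SUPPLY_GRAMS, 3.8))   # GSK:    ~132 million courses
    print(courses(SUPPLY_GRAMS, 30.0))  # Sanofi: ~17 million courses

At 3.8 mcg per dose the same antigen supply covers roughly eight times as many people as at 30 mcg per dose.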

Other vaccines against H5N1 are much less potent.

Previous attempts to produce a low dose vaccine had been unsuccessful, with required doses as high as 180 micrograms. Companies then tried adding an adjuvant – a chemical which stimulates the immune system and increases the potency of a vaccine. But, again, results were disappointing.

We need newer and far more easily scaled vaccine production technologies. The lead time for vaccine production is several months. The production capacity is low because only a small fraction of the world's population gets vaccinated for the flu in a typical year. A better vaccine production method would be faster and would scale up rapidly with easily producible capital equipment.

By Randall Parker 2006 July 26 10:55 PM  Dangers Natural Bio
2006 July 25 Tuesday
Distractions Reduce Learning

You learn less when you have to juggle more distractions.

Multi-tasking affects the brain's learning systems, and as a result, we do not learn as well when we are distracted, UCLA psychologists report this week in the online edition of Proceedings of the National Academy of Sciences.

"Multi-tasking adversely affects how you learn," said Russell Poldrack, UCLA associate professor of psychology and co-author of the study. "Even if you learn while multi-tasking, that learning is less flexible and more specialized, so you cannot retrieve the information as easily. Our study shows that to the degree you can learn while multi-tasking, you will use different brain systems.

"The best thing you can do to improve your memory is to pay attention to the things you want to remember," Poldrack added. "Our data support that. When distractions force you to pay less attention to what you are doing, you don't learn as well as if you had paid full attention."

You'll remember less about how you did a task if you had to do another task at the same time.

Participants in the study, who were in their 20s, learned a simple classification task by trial-and-error. They were asked to make predictions after receiving a set of cues concerning cards that displayed various shapes, and divided the cards into two categories. With one set of cards, they learned without any distractions. With a second set of cards, they performed a simultaneous task: listening to high and low beeps through headphones and keeping a mental count of the high-pitch beeps. While the distraction of the beeps did not reduce the accuracy of the predictions - people could learn the task either way - it did reduce the participants' subsequent knowledge about the task during a follow-up session.

When the subjects were asked questions about the cards afterward, they did much better on the task they learned without the distraction. On the task they learned with the distraction, they could not extrapolate; in scientific terms, their knowledge was much less "flexible."

This result demonstrates a reduced capacity to recall memories when placed in a different context, Poldrack said.

If you have only one task to focus on, you can notice more patterns and look at it in more ways while you are doing it. You can basically sift through it and make more sense of it. That lets you use the experience of that task in more ways.

It is a continual source of amazement to me just how distracting office workplaces are. This report is yet another argument against rows of cubicles, with all the noise and distraction the lack of walls brings.

By Randall Parker 2006 July 25 10:34 PM  Brain Performance
2006 July 23 Sunday
Eye Laser Detects Early Stage Alzheimer's In Mice

Imagine getting diagnosed with Alzheimer's decades before obviously failing memory would lead to a clinical diagnosis.

Building on their discovery that people with Alzheimer’s have ß-amyloid deposits that appear as unusual cataracts in the lens of the eye, Lee E. Goldstein, M.D., Ph.D., of Brigham & Women's Hospital and Harvard Medical School, Boston, and colleagues have developed a new, non-invasive, laser technology that may detect Alzheimer’s at its earliest stages.

Clumps of abnormal ß-amyloid protein (known as “plaques”) accumulate outside the brain’s nerve cells in people with Alzheimer’s. As Goldstein and colleagues previously reported in the British medical journal The Lancet, these same ß-amyloid clumps also collect in the lens of the eye as unusual “supranuclear cataracts.” These Alzheimer’s cataracts are different from common, age-related cataracts. This is the first evidence to date that Alzheimer’s-related amyloid pathology may occur outside the brain.

In their most recent experiments to be reported in Madrid, the researchers used genetically engineered Alzheimer’s mice to test a new, non-invasive molecular diagnostic technology. Goldstein and his team directed a brief pulse of infrared light – barely visible to humans – into the eye of each of four non-anesthetized Alzheimer mice and four age-matched normal mice every month starting at five months of age. Analysis of how the light bounced back from the lens completely separated the two types of mice by 10 months of age, when amyloid lesions were not detectable in the brain or eye by conventional means. The scientists believe that this technology, known as quasi-elastic light scattering (QLS), may detect the very earliest stages of ß-amyloid pathology, even in eyes that are completely clear.

“Amyloid in the lens can be detected using extremely sensitive, non-invasive optical techniques. This makes the lens an ideal window for early detection and disease monitoring in Alzheimer’s,” Goldstein said.

Early identification of developing Alzheimer's might move back by literally decades the point at which a person can be diagnosed. Imagine being told at, say, age 45 that 20 or 25 years from now your memory will degrade far enough that you'll have clinical Alzheimer's. I hope people so diagnosed will react by making loud demands of their elected officials (or the dictators who rule them, as the case may be) to accelerate the development of treatments. People should treat early diagnosis as a wake-up call to become politically active and fight for much larger efforts to discover the causes and cures of their diseases.

Early diagnosis will also greatly speed up research on preventive therapies, whether those therapies be drugs, diet, or other techniques. Long longitudinal studies that watch for increased risks of diseases will be replaced by shorter studies that watch people for several years to see which factors are associated with very early stage development of Alzheimer's and other diseases of aging. Also, people so diagnosed will be able to try new therapies, and scientists will be able to find out whether each therapy works before memory has degraded by a substantial amount.

The article linked above also reports on an fMRI (functional magnetic resonance imaging) study suggesting that small blood vessel ruptures may cause dementia in the elderly. Better measurement methods for this problem will lead to better ways to test therapies in shorter periods of time.

Update: Once you get the diagnosis of very early stage Alzheimer's Disease you'll of course want an immediate cure before the brain deteriorates much. Well, some Australian researchers might have the ticket. A drug called PBT2 might stop and reverse the build-up of the plaque that probably causes Alzheimer's.

Professor Ashley Bush, MD, PhD, of the Mental Health Research Institute of Victoria (Australia) and co-founding scientist of Prana Biotechnology Limited (Nasdaq: PRAN, ASX: PBT) today presented data at the 10th International Conference on Alzheimer's Disease (ICAD) in Madrid demonstrating that in mouse models PBT2:

-- improved memory performance within five (5) days of oral dosing

-- rapidly reduced the levels of soluble beta-amyloid ("Abeta") in the brain, and

-- restored normal function to Abeta impaired synapses.

I'm expecting cures for Alzheimer's before cures for cancer. Preventing the build-up of a protein plaque seems a lot easier than stopping some of your own cells from dividing like mad.

The results sound promising.

In addition, Professor Bush referenced studies he and colleagues performed on 15-month old transgenic Alzheimer's mice treated with 30 mg/kg PBT2, which showed the drug reduced soluble Abeta40 and Abeta42 levels by 60 percent within 24 hours of oral PBT2 administration. Professor Bush also presented mechanistic findings showing that PBT2 blocks the copper-dependent formation of amyloid oligomers, considered by many to be the toxic chemical entity leading to brain damage in Alzheimer's disease. Professor Bush showed that, by this mechanism, PBT2 in the rodent brain blocks synaptotoxicity caused by soluble beta-amyloid oligomers and restores LTP (long-term potentiation) -- the neuronal electrical activity that underlies memory formation.

Another team has just reported preliminary results of a monoclonal antibody against Alzheimer's plaques. Drugs, vaccines, and monoclonal antibodies will all work against Alzheimer's eventually. We'd already have vaccines against Alzheimer's if the US Food and Drug Administration didn't demand excessively low levels of side effects. I'd rather run the risk of brain inflammation from a vaccine if I knew I was in the process of losing my memory and ability to think. But the FDA doesn't think we should be allowed to judge such trade-offs for ourselves.

By Randall Parker 2006 July 23 11:24 PM  Brain Alzheimers Disease
Neanderthal Genome Sequenced In 2 Years?

But they make no mention of using the sequence information to resurrect the species.

The Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, in collaboration with 454 Life Sciences Corporation, in Branford, Connecticut, today announce a plan to have a first draft of the Homo neanderthalensis genome within two years.

Come on guys. Some day some group is going to use human eggs and Neanderthal DNA made with DNA synthesis machines to create Neanderthal babies. Your work is laying the foundations. We should consider what the consequences will be. Will the Neanderthal faculty for speech be good enough for them to sing? I want to hear Neanderthals cover the Kinks classic "I'm an ape man, I'm an ape ape man, I'm an ape man. I'm a king kong man, I'm a voodoo man, I'm an ape man".

Neanderthals might not make nice semi-people. Would they be smart enough, and civilizable enough, to qualify for human rights? One of the biggest debates of the 21st century (at least until the robots take over) is going to be over which attributes an intelligence must possess to be eligible for rights, or even to avoid being immediately destroyed or imprisoned. But that debate hasn't started in earnest yet because all the politically correct liberals are still denying that genetics plays a big role in creating the cognitive characteristics that determine why human societies take the forms we see.

Advances in sequencing technology made by 454 Life Sciences make the sequencing attempt possible. (same press release here as PDF)

"The Max Planck Institute and 454 Life Sciences are working together to sequence the Neandertal genome. Our expertise with ancient DNA and the Neandertal, coupled with 454 Sequencing, a next generation sequencing technology with unparalleled throughput, makes this an ideal collaboration," explained Svante Paabo, Ph.D., Director of the Department of Evolutionary Anthropology at the Max Planck Institute. "The advent of 454 Sequencing has enabled us to move forward with a project that was previously thought to be impossible."

Neandertal inhabited Europe and the Near East until about 30,000 years ago then disappeared after his successor, Homo sapiens, migrated to Europe. This year marks the 150th anniversary of the discovery of the first Neandertal fossil in Germany's Neander Valley near Dusseldorf. Dr. Paabo was the first to sequence DNA from a Neandertal fossil in 1997 while at the University of Munich.

That these scientists can get DNA fragments good enough to be worth trying to sequence is the amazing part of it.

Extracting, identifying and sequencing ancient DNA from fossils is a technically challenging task. When an organism dies, its tissues are overrun by bacteria and fungi. Much of the DNA is simply destroyed, and the small amount remaining is broken into short pieces and chemically modified during the long period of fossil formation. This means that when scientists mine tiny samples of ancient bones for DNA, much of the DNA obtained is actually from contaminants such as bacteria, fungi and even scientists who have previously handled the bones.

Over the last twenty years, Dr. Paabo's research group has developed methods for demonstrating the authenticity of ancient DNA results, as well as technical solutions to the problems of working with short, chemically-modified DNA fragments. Together with 454 Life Sciences, they will now combine these methods with high-throughput DNA sequencing. By enabling a method of sequencing that is more comprehensive and less expensive than conventional sequencing methods, 454 Sequencing is well suited for such a project.

"Unlike the human genome project, Neandertal samples are extremely scarce and have been contaminated with microbial DNA over tens of thousands of years. Therefore, this project is only possible with 454 Sequencing technology," said Michael Egholm, Ph.D., Vice President, Molecular Biology, 454 Life Sciences.

Due to such sample contamination, the task of sequencing the Neandertal genome is much more extensive than the task of sequencing the human genome. 454 Life Sciences' Genome Sequencer 20 System makes such an endeavor feasible by allowing approximately a quarter of a million single DNA strands from small amounts of bone to be sequenced in only about five hours by a single machine. The DNA sequences determined by the Genome Sequencer 20 System are 100-200 base pairs in length, which coincides neatly with the length of ancient DNA fragments.
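
A back-of-envelope calculation shows why the contamination makes this so much harder than an ordinary genome project. In this Python sketch the 150 base pair average read length (the midpoint of the quoted 100-200 bp range) and the 5 percent Neandertal-DNA fraction are my illustrative assumptions:

    # Machine time to reach 1x coverage of a 3 billion base genome.
    GENOME_BASES = 3e9
    READS_PER_RUN = 250_000     # single DNA strands per ~5 hour run
    AVG_READ_LENGTH = 150       # assumed midpoint of 100-200 bp
    HOURS_PER_RUN = 5
    NEANDERTAL_FRACTION = 0.05  # assumed share of reads that are not microbial

    bases_per_run = READS_PER_RUN * AVG_READ_LENGTH       # 37.5 million bases
    runs_if_clean = GENOME_BASES / bases_per_run          # ~80 runs with pure samples
    runs_realistic = runs_if_clean / NEANDERTAL_FRACTION  # ~1,600 runs
    print(runs_if_clean * HOURS_PER_RUN)   # ~400 machine-hours, ideal case
    print(runs_realistic * HOURS_PER_RUN)  # ~8,000 machine-hours with contamination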

Over the next two years, the Neandertal sequencing team will reconstruct a draft of the 3 billion bases that made up the genome of Neandertals. For their work, they will use samples from several Neandertal individuals, including the type of specimen found in 1856 in Neander Valley and a particularly well-preserved Neandertal from Croatia. The Max Planck Society's decision to fund the project is based on an analysis of approximately one million base pairs of nuclear Neandertal DNA from a 45,000-year-old Croatian fossil, sequenced by 454 Life Sciences.

See here for more information.

By Randall Parker 2006 July 23 10:33 PM  Trends, Human Evolution
Sleeping With Women Drains Male Brains?

University of Vienna researcher Gerhard Kloesch arranged for childless couples to spend 10 nights sleeping together and 10 nights apart. Both wives and husbands slept more poorly together, and the men suffered more degraded performance than the wives did.

While men thought they slept better with a partner, and women believed they didn't, actually both sexes had more disturbed sleep, even when they did not have sex. Lack of sleep led to increased stress hormone levels in men, and reduced their ability to perform simple cognitive tests the next day.

Women can handle interruptions more easily. The reason is probably Darwinian: natural selection favored females who could handle the interruptions of babies and children, whereas men spent more time concentrating on one task at a time, such as hunting.

A sleep expert in England says married couples shouldn't be so determined to sleep together.

Dr Neil Stanley, a sleep expert at the University of Surrey, said: "It's not surprising that people are disturbed by sleeping together.

"Historically, we have never been meant to sleep in the same bed as each other. It is a bizarre thing to do.

"Sleep is the most selfish thing you can do and it's vital for good physical and mental health.

"Sharing the bed space with someone who is making noises and who you have to fight with for the duvet is not sensible.

Once the sex becomes infrequent (or so married men assure me), why not sleep apart some of the time?

Marriage doesn't just make men dumber. It also takes away some of their drive. Previous research has shown that getting married lowers male testosterone and having kids lowers it even further.

Gray studied testosterone in saliva collected from 58 men (48 of them Harvard students) between the ages of 20 and 41. Half were married, and of those, 15 were married with children. He took four saliva samples from each man: two in the morning and two in the evening. The subjects also completed questionnaires about their demographic, marital, and parenting backgrounds. Among other things, the questionnaires asked how much time the men spent with their spouses (instead of hanging out with the guys) on their last day off from work, and measured the effort they expended caring for their children. Analysis showed that marriage, fatherhood, and longer periods spent with wives and children were all linked to lower testosterone levels. Fathers in particular had levels significantly lower than those of unmarried men. Researchers also observed that hormone levels in the morning samples were high and relatively even among the men; the differences appeared at night.

On the other hand, the lower testosterone might reduce the risk of prostate cancer and reduce the general rate of aging.

By Randall Parker 2006 July 23 03:24 PM  Brain Sleep
Genes Show Same Aging Pattern Across Species

Humans, mice, and flies show the same patterns of changes in gene regulation with age.

STANFORD — We can dye gray hair, lift sagging skin or boost lost hearing, but no visit to the day spa would be able to hide a newly discovered genetic marker for the toll that time takes on our cells. “We’ve found something that is at the core of aging,” said Stuart Kim, PhD, professor of developmental biology and of genetics at the Stanford University School of Medicine.

In a study published in the July 21 issue of Public Library of Science-Genetics, Kim and colleagues report finding a group of genes that are consistently less active in older animals across a variety of species. The activity of these genes proved to be a consistent indicator of how far a cell had progressed toward its eventual demise.

Until now, researchers have studied genes that underlie aging in a single animal, such as flies or mice, or in different human tissues. However, a protein associated with aging in one species may not be relevant to the aging function in a different animal. This limitation had made it difficult to study the universal processes involved in aging.

Kim’s work overturns a commonly held view that all animals, including humans, age like an abandoned home. Slowly but surely the windows break, the shingles fall off and floorboards rot, but there’s no master plan for the decay.

What we need to know: Which genes first start changing? Or which key regulatory switches start telling genes to start expressing differently? To put it more generally: What is the sequence of events that causes the genes to start behaving differently with age?

One possibility: the genes in the mitochondria (the sub-cellular organelles that generate energy molecules for the rest of the cell) could become mutated and damaged, and then the genes in the nucleus could start expressing differently due to signals coming out of the mitochondria.

Energy metabolism takes a big hit with age.

In the study, Kim and his colleagues looked at which genes were actively producing protein and at what level in flies and mice in a range of ages and in tissue taken from the muscle, brain and kidney of 81 people ranging in age from 20 to 80. The group used a microarray, which can detect the activity level of all genes in a cell or tissue. Genes that are more active are thought to be making more proteins.

One group of genes consistently made less protein as cells aged in all of the animals and tissues the group examined. These genes make up the cellular machinery called the electron transport chain, which generates energy in the cell’s mitochondria.

Kim said the gene activity is a better indicator of a cell’s relative maturity than a person’s birthday. One 41-year-old participant had gene activity similar to that of people 10 to 20 years older; muscle tissue from the participant also appeared similar to that of older people. Likewise, the sample from a 64-year-old participant, whose muscles looked like those of a person 30 years younger, also showed gene activity patterns similar to a younger person.

Biopsies of many organs in your body might tell you which organs are going to wear out first and which need replacements. With the sort of biotechnology we'll have 10 or 20 years from now we'll be able to start growing replacements for the worn out parts. Ideally, the replacements could be grown inside your own body and then connected up with surgery.

You can read the full article online: Transcriptional Profiling of Aging in Human Muscle Reveals a Common Aging Signature

We analyzed expression of 81 normal muscle samples from humans of varying ages, and have identified a molecular profile for aging consisting of 250 age-regulated genes. This molecular profile correlates not only with chronological age but also with a measure of physiological age. We compared the transcriptional profile of muscle aging to previous transcriptional profiles of aging in the kidney and the brain, and found a common signature for aging in these diverse human tissues. The common aging signature consists of six genetic pathways; four pathways increase expression with age (genes in the extracellular matrix, genes involved in cell growth, genes encoding factors involved in complement activation, and genes encoding components of the cytosolic ribosome), while two pathways decrease expression with age (genes involved in chloride transport and genes encoding subunits of the mitochondrial electron transport chain). We also compared transcriptional profiles of aging in humans to those of the mouse and fly, and found that the electron transport chain pathway decreases expression with age in all three organisms, suggesting that this may be a public marker for aging across species.

People who had worse muscle function also had gene expression patterns characteristic of more aged muscles.

The authors profiled gene expression changes in the muscles of 81 individuals with ages spanning eight decades. They found 250 genes and 3 genetic pathways that displayed altered levels of expression in the elderly. The transcriptional profile of age-regulated genes was able to discern elderly patients with severe muscle aging from those that retained high levels of muscle function; that is, the gene expression profiles reflected physiological as well as chronological age.

Another use for this information: Study people on different diets and lifestyles and see if particular diets or patterns of living cause particular organs to age more rapidly.

Some day I expect spouses to cite DNA tests of body aging to argue that their spouses are aging them too rapidly.

By Randall Parker 2006 July 23 03:14 PM  Aging Genetics
2006 July 20 Thursday
Tesla Roadster 100% Electric Goes 250 Miles Per Charge

Tesla Motors, located south of San Francisco and funded by Silicon Valley's famous Sand Hill Road Menlo Park venture capitalists, claims their new all-electric Roadster sports car will go 0 to 60 mph in about 4 seconds and costs just 1 cent per mile in electricity to operate. For someone who drives 15,000 miles per year that'd cost $150. It goes on sale in California in the summer of 2007 and in Chicago in fall 2007, with other locales coming later. This is not a car for the masses. The Tesla Roadster will cost from $85,000 to $100,000.

Tesla Motors, a four-year-old Silicon Valley start-up, has raised $60 million and spent about $25 million developing a two-seat roadster that will sell for $85,000 to $100,000.

It goes from zero to 60 miles per hour, or 96 kilometers per hour, in four seconds - "wicked fast," said the company's chairman, Martin Eberhard. Because it is an electric, the driver does not have to shift into second gear until the car hits 65 miles an hour, he said.

The long charge time makes it unsuitable for long trips.

The car comes with a kit that connects to a 240-volt circuit and fully charges dead batteries in three and a half hours. It can also be charged on a normal 110-volt household outlet, though that takes longer.

A house with only 110 volts would need an electrical upgrade for an outlet which can provide 240 volts. Still, even at 110 volts a car could easily charge overnight. You could even take it on a 200 mile trip if you were going to stay overnight somewhere you could charge it up.

The penny per mile cost is based on a cheap night rate that isn't available to most who have regular home electric service. Tesla CEO Martin Eberhard says that at 13 cents per kWh the car costs 2.6 cents per mile. He's in California and therefore pays a lot more than the American average for electricity.

Tesla's Frequently Asked Questions (FAQ) list claims the batteries will last for 500 recharging cycles. In theory that gives about 125,000 miles before replacement. In practice you might get less, since most people aren't going to want to run their batteries all the way down and therefore will charge up more often than every 250 miles.
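
The operating cost claims in the last few paragraphs are easy to check. In this Python sketch the 0.2 kWh per mile energy use is derived from Eberhard's own numbers (2.6 cents per mile at 13 cents per kWh), and the 5 cent night rate is my assumption for the penny-per-mile case:

    # Electricity cost and theoretical battery life for the Roadster.
    KWH_PER_MILE = 0.026 / 0.13  # ~0.2 kWh/mile, implied by the quoted figures

    def annual_cost(usd_per_kwh, miles_per_year=15_000):
        return miles_per_year * KWH_PER_MILE * usd_per_kwh

    print(annual_cost(0.05))  # ~$150/year at an assumed 5 cent night rate
    print(annual_cost(0.13))  # ~$390/year at California-like rates

    # Battery pack life: 500 full charge cycles at 250 miles per charge.
    print(500 * 250)          # 125,000 miles in theory; less in practice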

Lotus in Britain will build the cars.

Just before Christmas 2004, 30 employees and board members from Tesla came to Eberhard's Woodside, California, house to decide what the car would look like. He had commissioned four top automotive designers to draw sketches, which he taped to his living room wall. He gave everyone three red stickers and three green and told them to flag what they liked and didn't like. By the time the eggnog was gone, the green dots had coalesced around a drawing by Barney Hatt of Lotus Design in England. This is how a Silicon Valley startup does car design.

Lotus had manufactured cars for GM, in addition to its own lightweight aluminum sports car, the Elise. So Eberhard contracted the company to assemble his new vehicle, codenamed Dark Star (after a classic low-budget sci-fi movie). The electric motor would be built in Taiwan, and engineering and R&D would be conducted in a San Carlos warehouse.

Part of the choice to build a sports car was probably driven by the idea that upper class people will pay a lot of money and accept some trade-offs to buy an eco-friendly high performance sports car. But another reason they went with the sports car approach is that they didn't have to provide much room for passengers or luggage. So more space could be given over to batteries. In other words, a 250 mile range electric sports car does not demonstrate that more common sedans and SUVs could be built to operate with that range.

What I'd like to know: First, how much do the batteries cost? Second, how quickly will the battery costs drop? Third, how quickly will the energy density go up for lithium-based batteries?

While this car is interesting and will provide a lot of fun for some highly affluent people, hybrid vehicles, because they generate mass production volumes, are much more important for driving the development of better batteries. Battery makers and venture capitalists are funding battery research in order to chase after really big purchase orders from Toyota, Honda, GM, Ford, and Nissan. Hybrids are the path we will take to eventually reach all electric high production volume cars.

I would go even further: Hybrids are less important for the fossil fuel they save in the short run than they are for the battery technology innovations they will spark. Those innovations will enable mass produced pure electric cars. More efficient ways to burn gasoline just lead to bigger and faster cars with little net gain in fuel efficiency. Pure electric vehicles will enable the use of non-fossil fuels for transportation. That will be the greatest legacy of hybrids.

By Randall Parker 2006 July 20 07:30 PM  Energy Transportation
2006 July 19 Wednesday
Car Fuel Efficiency Gains Used For Speed And Size

Technological gains that in theory could increase vehicle fuel efficiency instead get used to make cars bigger and faster at the same average level of fuel efficiency.

The Environmental Protection Agency said in its annual report, based on sales projections provided by automakers, that the estimated average fuel economy for 2006 vehicles was 21 miles per gallon, the same as 2005 models.

Since they are using sales projections, they are weighting for number of units sold. High oil prices? Expensive gasoline? Yes, but faster acceleration is just so much fun.

Even as Toyota ramps up hybrid production it is ramping up bigger cars even faster, for a net loss in fuel efficiency per mile travelled.

Honda Motor Co. had the highest fuel economy rating by manufacturer, 24.2 mpg, followed by Toyota Motor Corp., with a 23.8 mpg average. But both Japanese automakers saw their averages drop from the previous year as they placed more of an emphasis on larger vehicles.

Six-speed transmissions, engines that turn off some cylinders while cruising, hybrids, new lighter materials, and other innovations, plus the big rise in oil prices, were not enough to change the average fuel economy of new cars. Attempts to increase efficiency get undermined in at least three ways by consumers:

  • People choose bigger cars and SUVs.
  • People choose models that accelerate more rapidly.
  • People drive more miles.

Most of the increases in wealth due to productivity gains are going to those who already earn higher incomes. The gaps between the classes are widening. Upper class folks are less affected by higher gasoline prices. At the same time, they buy a disproportionate fraction of all new cars. So new car buying patterns haven't shifted as much in response to higher oil prices as you might expect. Lower class folks tend to buy used cars, so the fuel economy of the used car fleet is set by the choices upper class folks made when they bought those cars new.

Auto pricing changes also have reduced buyer responses to higher fuel costs. The auto makers have higher profit margins on larger vehicles. They've partially compensated for higher gasoline prices by lowering prices more on less efficient large vehicles. So the buyers effectively haven't seen as large an increase in total vehicle ownership costs as you might think if you look at gasoline price increases alone.

Fuel economy has changed little since 1992 even though automotive technology has advanced.

Since 1992, average real-world fuel economy has been relatively constant, ranging from 20.6 to 21.4 mpg. This 21.0 mpg value is five percent lower than the fleet-average fuel economy peak value of 22.1 mpg achieved in 1987-1988. For model year 2006, cars and light trucks are each projected to account for about 50 percent of vehicle sales. After two decades of steady growth, the light truck market share has been relatively stable for five years. New technologies have maintained fuel economy while supporting the heaviest and fastest new vehicle fleet since EPA began compiling data in 1975. Recent technology developments, such as hybrid-electric vehicles, clean diesel technology, improved transmission designs, and engines equipped with variable valve timing and cylinder deactivation, hold promise for stable or improving fuel economy in the future.

There's also a type of higher-efficiency gasoline engine under development that uses compression for ignition, as a diesel does. Gasoline engines of this type do use spark ignition at lower and higher engine speeds, but not at mid-range engine speeds.

Buyers have shifted from cars to heavier trucks.

Between 1975 and 2006, market share for new passenger cars and station wagons decreased by over 30 percent. For model year 2006, cars are estimated to average 24.6 mpg, vans 20.6 mpg, SUVs 18.5 mpg, and pickups 17.0 mpg. The increased market share of light trucks, which in recent years have averaged more than six mpg less than cars, accounted for much of the decline in fuel economy of the overall new light-duty vehicle fleet from the peak that occurred in 1987-88.
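
Note that fleet-average fuel economy is a gallons-weighted (harmonic) mean rather than a simple average, so a shift in sales toward trucks drags the average down faster than intuition suggests. A minimal Python illustration; the share splits and the 18 mpg blended truck figure are hypothetical:

    # Fleet mpg = total miles / total gallons: a share-weighted harmonic mean.
    def fleet_mpg(segments):
        """segments: list of (sales_share, mpg) pairs whose shares sum to 1.0."""
        gallons_per_mile = sum(share / mpg for share, mpg in segments)
        return 1 / gallons_per_mile

    CAR_MPG, TRUCK_MPG = 24.6, 18.0  # truck figure: rough blend of vans, SUVs, pickups
    print(fleet_mpg([(0.7, CAR_MPG), (0.3, TRUCK_MPG)]))  # ~22.2 mpg
    print(fleet_mpg([(0.5, CAR_MPG), (0.5, TRUCK_MPG)]))  # ~20.8 mpg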

Customers use improved drive trains to get bigger, faster vehicles at the same level of fuel efficiency per distance travelled.

Vehicle weight and performance are two of the most important engineering parameters that determine a vehicle’s fuel economy. All other factors being equal, higher vehicle weight (which can be a proxy for some vehicle utility attributes) and faster acceleration performance (e.g., lower 0 to 60 time), both decrease a vehicle’s fuel economy. Improved engine, transmission, and powertrain technologies continue to penetrate the new light-duty vehicle fleet. The trend has clearly been to apply these innovative technologies to accommodate increases in average new vehicle weight, power, and performance while maintaining a relatively constant level of fuel economy. This is reflected by heavier average vehicle weight, rising average horsepower, and faster average 0-to-60 mile-per-hour acceleration time. MY2006 light-duty vehicles are estimated, on average, to be the heaviest, fastest and most powerful vehicles than in any year since EPA began compiling such data.

The gap between the more and the less fuel efficient auto makers is closing, with most of the closing coming from more rapid deterioration in fuel economy among the makers with higher average fuel efficiency.

For MY2006, the eight highest-selling marketing groups (that account for over 95 percent of all sales) fall into two fuel economy groupings: Honda, Toyota, Hyundai-Kia (HK), and Volkswagen all have estimated fuel economies of 23.5 to 24.2 mpg, while General Motors, Nissan, Ford, and DaimlerChrysler all have estimated fuel economies of 19.1 to 20.5 mpg.

Each of these marketing groups has lower average fuel economy today than in 1987. Since then, the differences between marketing group fuel economies have narrowed considerably, with the higher mpg marketing groups in 1987 (e.g., Hyundai-Kia, Honda, and Nissan) generally showing a larger fuel economy decrease than the lower mpg marketing groups (e.g., Ford and General Motors). Two marketing groups (Toyota and DaimlerChrysler) show a slight increase in average fuel economy since 1997. For MY2006, the six top-selling marketing groups all have truck shares in excess of 40 percent; only Hyundai-Kia and Volkswagen have a truck market share of less than 40 percent and the Hyundai-Kia truck share is increasing rapidly.

Clearly people have a strong preference for bigger and faster. Technological advances will eventually enable a big drop in the cost of hybrids and other fuel efficiency enhancing technologies. When that happens cars will get faster, bigger, and moderately more fuel efficient all at the same time. Plus, people will drive more. But the only way we can substantially reduce the use of fossil fuels in transportation is to develop ways to use non-fossil fuels for transportation.

To repeat: Fuel efficiency increases are not a panacea for reducing fossil fuel usage. Only the development of competitive non-fossil fuel energy sources can make a very big impact on fossil fuels consumption.

By Randall Parker 2006 July 19 11:03 PM  Energy Transportation
Entry Permalink | Comments(25)
2006 July 18 Tuesday
AI Research Generating Useful Results

John Markoff of the New York Times reports on signs that the rate of advance in artificial intelligence research is accelerating with many useful technologies entering the market.

At Stanford University, for instance, computer scientists are developing a robot that can use a hammer and a screwdriver to assemble an Ikea bookcase (a project beyond the reach of many humans) as well as tidy up after a party, load a dishwasher or take out the trash.

One pioneer in the field is building an electronic butler that could hold a conversation with its master — à la HAL in the movie “2001: A Space Odyssey” — or order more pet food.

Though most of the truly futuristic projects are probably years from the commercial market, scientists say that after a lull, artificial intelligence has rapidly grown far more sophisticated. Today some scientists are beginning to use the term cognitive computing, to distinguish their research from an earlier generation of artificial intelligence work. What sets the new researchers apart is a wealth of new biological data on how the human brain functions.

Computers continue to get faster and higher in capacity. At the same time, neuroscience is generating useful insights into how brains work. Both these trends look set to continue, enabling the implementation of more complex computer algorithms that do more of what human brains can do and many things that human brains cannot do well.

The article cites many examples of impressive advances, such as the winning of the DARPA Grand Challenge prize by a robot car that guided itself over a long desert course.

Last October, a robot car designed by a team of Stanford engineers covered 132 miles of desert road without human intervention to capture a $2 million prize offered by the Defense Advanced Research Projects Agency, part of the Pentagon. The feat was particularly striking because 18 months earlier, during the first such competition, the best vehicle got no farther than seven miles, becoming stuck after driving off a mountain road.

Now the Pentagon agency has upped the ante: Next year the robots will be back on the road, this time in a simulated traffic setting. It is being called the “urban challenge.”

Once artificial intelligences become smarter than humans I do not see how they can be kept friendly toward us. I hope we do not reach a short-lived period of technological utopia followed by our extinction.

By Randall Parker 2006 July 18 11:10 PM  Computing AI
Entry Permalink | Comments(41)
Auto Parts Build Solar Power Generator

MIT's Technology Review has an interesting report on how MIT graduate student Matthew Orosz, while on a Peace Corps trip to Lesotho in southern Africa, saw Africans using a parabolic reflector to bake bread. Building on this idea, Orosz came up with a way to use common mass-produced parts to make a solar electric generator that is cheaper than photovoltaics.

The basic design of Orosz's solar generator system is simple: a parabolic trough (taking up 15 square meters in this case) focuses light on a pipe containing motor oil. The oil circulates through a heat exchanger, turning a refrigerant into steam, which drives a turbine that, in turn, drives a generator.

The refrigerant is then cooled in two stages. The first stage recovers heat to make hot water or, in one design, to power an absorption process chiller, like the propane-powered refrigerators in RVs. The solar-generated heat would replace or augment the propane flame used in these devices. The second stage cools the refrigerant further, which improves the efficiency of the system, Orosz says. This stage will probably use cool groundwater pumped to the surface using power from the generator. The water can then be stored in a reservoir for drinking water.

Since the parts are mass produced for automobiles, the complete system costs less than half as much as photovoltaics for generating electricity.

As a result, the complete system for generating one kilowatt of electricity and 10 kilowatts of heat, including a battery for storing the power generated, can be built for a couple thousand dollars, Orosz says, which is less than half the cost of one kilowatt of photovoltaic panels.

But does that cost estimate include maintenance costs and replacement parts? I'd expect a much higher mean time between failures for photovoltaics. Though when this gadget fails, people with fairly common auto mechanic skills would be able to fix most of it, and they'd be able to get many of the parts from an auto parts store.

There's a downside to the mass-produced parts: it is unlikely that many of them could be made much cheaper. The design is not as amenable to cost reduction as photovoltaics. Eventually photovoltaics will drop below the cost of this system. Still, it is a pretty neat idea today.

Some questions: How much heat and electricity would this device generate in winter? Would the cold air prevent it from working? Also, can the heat do anything useful in the summer? Solar hot water comes to mind.
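
A back-of-envelope answer to the winter question is possible under stated assumptions. The 15 square meter aperture and the 1 kW electric / 10 kW heat ratings come from the article; the seasonal insolation values, and the conversion efficiencies inferred by assuming those ratings apply at 1 kW per square meter of peak sun, are my assumptions.

```python
# Back-of-envelope seasonal output for the 15 m^2 trough. The rated
# 1 kW electric and 10 kW heat are from the article; efficiencies are
# inferred by assuming those ratings apply at 1 kW/m^2 of peak sun,
# and the daily insolation values are assumptions, not measurements.
APERTURE_M2 = 15.0
ELEC_EFF = 1.0 / 15.0   # ~6.7% of collected sunlight to electricity
HEAT_EFF = 10.0 / 15.0  # ~67% recovered as useful heat

daily_sun_kwh_per_m2 = {"summer": 6.5, "winter": 3.0}  # assumed

for season, sun in daily_sun_kwh_per_m2.items():
    collected = APERTURE_M2 * sun  # kWh of sunlight per day
    print(f"{season}: ~{collected * ELEC_EFF:.1f} kWh electric, "
          f"~{collected * HEAT_EFF:.0f} kWh heat per day")
```

This ignores extra thermal losses to cold winter air, which would cut the winter figures further.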

Another question: How noisy is it?

By Randall Parker 2006 July 18 06:45 PM  Energy Solar
Entry Permalink | Comments(5)
2006 July 16 Sunday
Customized Cancer Treatments With Proteomics

Proteomic techniques combined with gene silencing have led to the identification of yet another gene which can mutate to contribute to cancer.

In a step toward personalized medicine, Howard Hughes Medical Institute investigator Brian J. Druker and colleagues have developed a new technique to identify previously unknown genetic mutations that can trigger cancerous growth. By analyzing the proteins – instead of the genes – inside acute myeloid leukemia (AML) cells, the researchers have dramatically reduced the time it takes to zero in on molecular abnormalities that might be vulnerable to specific drug treatments.

The researchers are correct in arguing that this approach could be used to identify the specific mutations in individual cancers so that treatments can be tailored to the characteristics of each case. But that is not the only value of this approach. They also have hit upon a faster way to find proteins whose mutations can cause or at least contribute to the development of cancer. My guess is that the identification of more genes which can mutate to contribute to cancer will be the greater value.

"This approach gives us a way to figure out what's driving the growth of a cancer in an individual patient and ultimately match that patient with the right drug," said Druker, who is based at the Oregon Health & Science University in Portland. Druker's team collaborated on the research, which was published in the July 17, 2006, issue of the journal Cancer Cell, with scientists in the lab of D. Gary Gilliland, an HHMI investigator at Brigham and Women's Hospital, as well as researchers at the Portland VA Medical Center, Cell Signaling Technology, the University of Chicago, and Yale University.

Traditionally, cancer-gene hunters have scanned the genome looking for mutations that trigger out-of-control cell growth. Druker tried this approach, but found it wanting. "We were doing some high-throughput DNA sequencing, and we weren't really finding much," he said.

DNA sequencing is a hard way to look for mutations that drive cancer because cancer cells are genetically very unstable and carry large numbers of mutations that are just side effects of the cancer. Also, genomes are very large, so researchers end up sequencing lots of DNA that is not genes or that is not being expressed even where it is genes.

They decided to instead sequence the peptides that make up proteins. This reduces the sequencing job by orders of magnitude.

Instead, the team added tools from the burgeoning field of proteomics, the study of proteins. "We decided this more functional assay would get us to the disease-causing genes more rapidly," said Druker, who has been studying a group of cell-signaling proteins called tyrosine kinases for 20 years.

Tyrosine kinases play a key role in many cancers. In healthy cells, they help form a chain of signals that prompt normal cell growth and division. Sometimes, though, a tyrosine kinase gets stuck in an "on" position, driving out-of-control cell division and, ultimately, cancer. This potentially devastating kinase activation carries a calling card in the form of a molecule called a phosphate.

"The phosphates signal activated tyrosine kinases," said Druker. "So we decided to use the phosphates as markers."

To find these markers, the team took myeloid leukemia cells and chemically digested them into a mixture of protein snippets called peptides. Next, they extracted all of the peptides carrying extra phosphates and sent them through a mass spectrometer, which precisely measured the weight of each peptide. Sophisticated software then sifted through a massive protein database at the National Library of Medicine, identifying each of the team's peptides as a segment of a specific protein. The analysis showed that many of the peptides came from tyrosine kinases. Scanning this list, Druker picked out five as likely suspects.
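
The database-matching step described here can be sketched in miniature. Everything below (the two-entry "database", the sequences, the observed mass and tolerance) is invented for illustration; real pipelines digest full protein databases and handle charge states, modifications, and mass calibration far more carefully.

```python
# Toy version of the matching step described above: digest protein
# sequences in silico (trypsin cuts after K or R), compute each
# peptide's mass, and match an observed mass within a tolerance.
# Sequences and the "observed" mass are invented for illustration.
RESIDUE_MASS = {  # monoisotopic residue masses, daltons
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "V": 99.06841,
    "L": 113.08406, "K": 128.09496, "R": 156.10111, "E": 129.04259,
}
WATER = 18.01056  # added once per peptide

def peptide_mass(seq: str) -> float:
    return sum(RESIDUE_MASS[aa] for aa in seq) + WATER

def tryptic_peptides(protein: str):
    """Split after every K or R (ignoring the proline exception)."""
    peptide = ""
    for aa in protein:
        peptide += aa
        if aa in "KR":
            yield peptide
            peptide = ""
    if peptide:
        yield peptide

database = {  # hypothetical two-protein "database"
    "kinase_A": "GASVKLLERVAGK",
    "kinase_B": "SSLREGAVKGGR",
}

observed_mass = 529.32  # hypothetical mass-spec measurement
TOLERANCE = 0.02        # daltons

for name, protein in database.items():
    for pep in tryptic_peptides(protein):
        if abs(peptide_mass(pep) - observed_mass) < TOLERANCE:
            print(f"match: {pep} from {name}")  # -> LLER from kinase_A
```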

Sounds like they used RNA interference (RNAi) to block the candidate genes. So they used a number of fairly new techniques to do this research.

Druker's team then introduced into their leukemia cells five segments of RNA that each shut down one of the candidate kinases. Silencing four of the kinases with RNA did nothing – the cells still grew out of control. But with the fifth, the cells no longer became cancerous.

"That left one gene to sequence. We found that the gene, called JAK3, had a mutation that drives the growth of leukemia cells in mice," said Druker. Analysis of additional patient samples later identified two more mutations in the JAK3 gene.

Thomas Mercher, a postdoctoral fellow in Gilliland's lab, then tested the mutation in a mouse model. "It was important to show that the JAK3 mutation, when introduced in mice, would lead to a leukemia-like illness. It did, confirming that the JAK3 mutations play a central role in leukemia," said Gilliland.

One of the reasons I'm optimistic that cancer will be cured within 10 to 20 years is that cancer researchers have much better tools for doing their work than they did 10 or even 5 years ago. Also, their tools will get better next year and the year after that, and enormously better 15 or 20 years from now. Researchers will be able to identify mutations, introduce mutations, interfere with gene function, sequence DNA, sequence peptides, and do other tasks with biological systems more cheaply and rapidly in the future with microfluidics and other advances in techniques. Experiments that are not even possible to do today will become possible and then increasingly easy to do.

By Randall Parker 2006 July 16 10:23 PM  Biotech Cancer
Entry Permalink | Comments(3)
Future Of Nuclear Power Surveyed

In the New York Times Magazine, Jon Gertner has written an excellent article surveying the state of the nuclear power industry and the signs that new nuclear power plant construction will commence in less than 10 years. If you are seriously interested in energy policy, I urge you to read this long article in full.

Thanks partly to large government incentives and to market forces that have pushed the price of other electric plant fuels (especially natural gas) to historic heights, the prospect of starting a new nuclear reactor in this country for the first time in 30 years has become increasingly likely. By early summer a dozen utilities around the country had informed the U.S. Nuclear Regulatory Commission, which oversees all civilian nuclear activity in this country, that they were interested in building 18 new facilities, nearly all of which would be sited next to existing nuclear reactors.

The electric power industry is taking nuclear power very seriously.

The sooner carbon taxes come, and the higher they are, the more attractive nuclear power will become. But nuclear power plants take several years from the beginning of planning to first power production. So the electric power industry must make multi-billion dollar guesses about the state of emissions regulations in future decades.

Moreover, what makes the choice of fuels such a knotty problem is that something that is cheap now, like coal, may not be so cheap in 10 years. This isn’t because we’re running out; we probably have at least a century’s worth of coal reserves in the United States alone. But if the government were to impose a tax or a cap on carbon emissions, something that almost everyone I spoke with in the energy industry believes is inevitable, or if new laws mandate that coal plants must adopt more expensive technologies to burn the coal cleaner — or to “sequester” the carbon-dioxide byproducts underground — the financial equation will change: a kilowatt-hour generated by coal suddenly becomes more expensive. There are other contingencies at play, too: fuels, like natural gas, could experience a supply interruption that leads to enormous price spikes. As for the hope that wind and solar power will generate large amounts of clean, affordable electricity in the near future? I encountered great skepticism inside and outside the utility companies. “Maybe in 40 years,” Paul Joskow, of M.I.T., told me.

Looking out over decades the electric power industry also has to guess about the rate of technological advances in wind, photovoltaics, and other non-fossil fuels based alternatives for generating electric power. Nuclear power plants do not pay back their capital costs for decades. So the cost of competing electric power sources 20, 30, and 40 years hence have to figure into decisions about whether to start building nuclear power plants today.

If carbon taxes become a major cost, they might drive the cost of coal electricity well above nuclear. But another risk nuclear faces is the potential for innovations that lower the cost of carbon capture when burning coal. So even if the evidence for global warming from carbon dioxide becomes very strong, that is no guarantee that nuclear will become the lowest cost electric power source.
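
A minimal sketch of how a carbon tax shifts that comparison; every number below is an illustrative assumption, not a figure from the article.

```python
# Illustrative effect of a carbon tax on coal vs. nuclear electricity
# costs. Every number is an assumption for the sketch, not a figure
# from the article.
COAL_COST = 0.040     # $/kWh before any carbon tax (assumed)
NUCLEAR_COST = 0.055  # $/kWh (assumed)
COAL_CO2 = 0.001      # tonnes of CO2 per kWh, i.e. ~1 kg (assumed)

for tax in (0, 10, 25, 50):  # $ per tonne of CO2
    coal = COAL_COST + tax * COAL_CO2
    winner = "coal" if coal < NUCLEAR_COST else "nuclear"
    print(f"${tax}/tonne: coal ${coal:.3f}/kWh -> {winner} is cheaper")
```

Under these assumed numbers the crossover comes near $15 per tonne; any innovation that cuts coal's emissions intensity or capture cost pushes the crossover higher.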

Some see construction of new electric plants as avoidable by use of technologies that greatly improve energy efficiency.

There is a counterargument to building large new power plants. One view — voiced most forcefully, perhaps, by Amory Lovins, a physicist who runs Rocky Mountain Institute, which advises corporations and utilities on energy efficiency — is that we don’t need to increase our electrical supply. We need to decrease demand by rewarding utilities for getting customers to reduce electricity use by, say, updating their appliances, furnaces and lighting. Lovins, a longtime critic of nuclear power, contends that it remains financially uncompetitive and that the 30-year absence of new plants is proof that the market has rejected nuclear power as a viable technology. When we spoke about whether utilities need to build more big generating plants in this country, he told me no — not now, not in 15 years, not even after that. “I think if you do,” he remarked, “your shareholders and ratepayers will be asking awkward questions that you would really rather not want to answer.” Yet the concern, even among Lovins’s admirers, is that if he is mistaken — that is, if either his estimates on efficiencies can’t accommodate population and industrial growth, or because what is possible in principle for energy efficiency is not possible in the real world — then the utilities will require an alternative plan. And that would entail more supply, likely meaning more big base-load plants (whether they rely on uranium, gas or coal) as well as large investments in renewable sources like wind and solar power.

My view: Only a big rise in the cost of electricity will substantially reduce per capita electric usage. Rising living standards will make electric power more affordable. People will find more ways to use electricity if they can afford it. They'll buy bigger televisions and faster computers, run air conditioners at lower temperatures, and so on. Sure, technological advances will improve energy efficiency. But when energy efficiency rises, part of the response is to do more of whatever is now more efficient to do. For example, make cars more fuel efficient and people will drive more miles and get bigger cars. Also, other technological advances will raise incomes, and so people will buy more gadgets that use more power. This is especially the case in the industrializing countries, most notably China. So I do not see conservation as a solution. Increases in energy efficiency can raise living standards. But it is unlikely they will stop the increase in demand for energy.

Westinghouse, with their AP1000 design, and other nuclear reactor designers claim they've gotten their costs down far enough to be competitive. But read the full article for the reasons behind the uncertainty about their cost estimates.

But the appeal of the AP1000 remains doubtful, even as 11 utilities, including the Southern Company, have expressed interest in the design. Westinghouse maintained to me that the cost will ultimately be somewhere between $1.4 billion and $1.9 billion. “We’re negotiating contracts,” Dan Lipman, who runs the new-power-plant division at Westinghouse, told me over lunch at the company cafeteria. “We’re well beyond the should-we-do-nuclear phase. It’s now a matter of, How should we do it?” So I asked Lipman what it would mean to actually cut a deal with a utility for a new plant, the first in 30 years. Would it happen a year from now? Two years? “If your definition of a deal is, when do you first start getting money, then that could happen very soon,” he said. “I look for that this year, with big money committed after licensing by the N.R.C.” From his continuing negotiations, Lipman said, it’s clear that his customers are interested in “off-ramps”: clauses in the contracts that allow them to bow out if they hit an unexpected financial or construction snag.

The industry has a number of advantages that it did not have during the last wave of nuclear reactor construction. First off, computers can track design changes, automate communications, manage order tracking and parts inventory, and otherwise manage the design and construction process. Computers have made large construction projects more manageable. Also, the industry is going to use standard designs this time around. So each new plant won't have a large assortment of unique problems to work out. The industry has even formed a consortium for constructing the first reactors that use the new designs. This consortium will allow a great deal of sharing of regulatory forms and knowledge about costs and technological problems encountered during the construction process.

The second cushion is the creation of an industry consortium, called NuStart, to test the licensing process. NuStart is filing several applications for nuclear plants, on behalf of its members, with the Nuclear Regulatory Commission. These applications — for the Grand Gulf plant in Mississippi and the Bellefonte site in Alabama — have preceded all others and may end up being built first. One goal of NuStart is to prove to Wall Street that utilities can get a license in a timely manner. Another goal is to establish a way for the industry to pool risk and information. If NuStart’s construction-and-operating applications for its two sites are approved, in other words, any utility in the consortium (including Entergy, Exelon and Southern Company) can copy huge parts of the approved application for its own use, thus saving time and money.

A lot is going to hinge on the costs of building the initial reactors that test out the regulatory process and the new designs. We will find out from the costs and schedules of those reactors how far the nuclear power industry has progressed toward making nuclear power competitive. The big wild card for nuclear power is global warming. If the global warming threat starts looking serious enough to justify large carbon taxes then I expect a huge shift toward nuclear for new electric power plants.

Again, read the full article if you are seriously interested in the energy debate.

By Randall Parker 2006 July 16 08:22 PM  Energy Nuclear
Entry Permalink | Comments(25)
Drug Recovers Function In Rat Model Of Parkinson's Disease

Exciting results for Parkinson's sufferers:

In preliminary results, researchers have shown that a drug which mimics the effects of the nerve-signaling chemical dopamine causes new neurons to develop in the part of the brain where cells are lost in Parkinson's disease (PD). The drug also led to long-lasting recovery of function in an animal model of PD. The findings may lead to new ways of treating PD and other neurodegenerative diseases. The study was funded in part by the NIH's National Institute of Neurological Disorders and Stroke (NINDS).

The study suggests that drugs which affect dopamine D3 receptors might trigger new neurons to grow in humans with the disease. Some of these drugs are commonly used to treat PD. The finding also suggests a way to develop new treatments for PD. The results appear in the July 5, 2006, issue of The Journal of Neuroscience.

Parkinson's disease, a progressive neurodegenerative disorder that causes tremors, stiffness, slow movements, and impaired balance and coordination, results from the loss of dopamine-producing neurons in part of the brain called the substantia nigra. While many drugs are available to treat these symptoms during the early stages of the disease, the treatments become less effective with time. There are no treatments proven to slow or halt the course of PD. However, many researchers have been trying to find ways of replacing the lost neurons. One possible way to do this would be to transplant new neurons that are grown from embryonic stem cells or neural progenitor cells. However, this type of treatment is very difficult for technical reasons.

The new study, conducted by Christopher Eckman, Ph.D., and Jackalina Van Kampen, Ph.D., at the Mayo Clinic College of Medicine in Jacksonville, Florida, focused on a second possible way to restore function — prompting stem cells that normally remain dormant in the adult brain to develop into neurons.

The drug requires continual infusion.

"This is the first study to show that endogenous neurogenesis [development of new neurons from cells already in the brain] can lead to recovery of function in an animal model of Parkinson's disease," says Dr. Eckman.

The researchers gave either 2-, 4-, or 8-week continuous infusions of a drug called 7-OH-DPAT, which increases the activity of dopamine D3 receptors, into the brain ventricles of adult rats with neuron loss in the substantia nigra and symptoms similar to human PD on one side of the body. 7-OH-DPAT is not used in humans, but its effects on dopamine receptors are similar to the drugs pramipexole and ropinirole, which are approved to treat PD. The rats also received injections of a chemical called bromodeoxyuridine (BrdU), which marks proliferating cells, and infusions of a substance that fluorescently "traces" how neurons connect. The animals were tested before and 3 days after receiving the treatment to see how well they could walk and reach to retrieve food pellets with their paws. A subset of the rats was tested again 2 and 4 months following the treatment.

Rats treated with 7-OH-DPAT had more than twice as many proliferating cells in the substantia nigra as rats that were treated with saline, the researchers found. Many of the newly generated cells appeared to develop into mature neurons, and approximately 28 percent of them appeared to be dopamine neurons by 8 weeks after treatment. Animals treated for 8 weeks also developed almost 75 percent of the normal number of neuronal connections with other parts of the brain and showed an approximately 80 percent improvement in their movements and a significantly improved ability to retrieve food pellets. These effects lasted for at least 4 months after the treatment ended.

Similar drugs exist and the researchers are examining the effects of other drugs in rats as a prelude to trying therapies in human sufferers of Parkinson's. This result could turn out to yield an effective therapy without the need to solve the many problems involved with the development of adult or embryonic stem cells grown outside of the body.

By Randall Parker 2006 July 16 02:16 PM  Brain Disorder Repair
Entry Permalink | Comments(3)
2006 July 13 Thursday
Soy Better Than Corn For Biomass Energy

Soy for biodiesel is better than corn for ethanol but even soy has a very limited role to play as an energy source.

MINNEAPOLIS / ST. PAUL (7/10/2006) -- The first comprehensive analysis of the full life cycles of soybean biodiesel and corn grain ethanol shows that biodiesel has much less of an impact on the environment and a much higher net energy benefit than corn ethanol, but that neither can do much to meet U.S. energy demand.

The study, which was funded in part by the University of Minnesota’s Initiative for Renewable Energy and the Environment, was conducted by researchers in the university’s College of Biological Sciences and College of Food, Agricultural and Natural Resource Sciences. The study will be published online July 12 in the Proceedings of the National Academy of Sciences.

The researchers tracked all the energy used for growing corn and soybeans and converting the crops into biofuels. They also looked at how much fertilizer and pesticide corn and soybeans required and how much greenhouse gases and nitrogen, phosphorus, and pesticide pollutants each released into the environment.

“Quantifying the benefits and costs of biofuels throughout their life cycles allows us not only to make sound choices today but also to identify better biofuels for the future,” said Jason Hill, a postdoctoral researcher in the department of ecology, evolution, and behavior and the department of applied economics and lead author of the study.

The study showed that both corn grain ethanol and soybean biodiesel produce more energy than is needed to grow the crops and convert them into biofuels. This finding refutes other studies claiming that these biofuels require more energy to produce than they provide. The amount of energy each returns differs greatly, however. Soybean biodiesel returns 93 percent more energy than is used to produce it, while corn grain ethanol currently provides only 25 percent more energy.

Still, the researchers caution that neither biofuel can come close to meeting the growing demand for alternatives to petroleum. Dedicating all current U.S. corn and soybean production to biofuels would meet only 12 percent of gasoline demand and 6 percent of diesel demand. Meanwhile, global population growth and increasingly affluent societies will increase demand for corn and soybeans for food.

The authors showed that the environmental impacts of the two biofuels also differ. Soybean biodiesel produces 41 percent less greenhouse gas emissions than diesel fuel whereas corn grain ethanol produces 12 percent less greenhouse gas emissions than gasoline. Soybeans have another environmental advantage over corn because they require much less nitrogen fertilizer and pesticides, which get into groundwater, streams, rivers and oceans. These agricultural chemicals pollute drinking water, and nitrogen decreases biodiversity in global ecosystems. Nitrogen fertilizer, mainly from corn, causes the 'dead zone' in the Gulf of Mexico.
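
The "percent more energy" figures in the release translate directly into output/input ratios, which makes the gap between the two fuels easier to see; only the two percentages below come from the study.

```python
# Convert the quoted net-energy gains into output/input ratios:
# "returns X% more energy than is used" means output = (1 + X) * input.
net_gain = {"soybean biodiesel": 0.93, "corn grain ethanol": 0.25}

for fuel, gain in net_gain.items():
    ratio = 1.0 + gain
    input_share = 1.0 / ratio  # fraction of output that just repays inputs
    print(f"{fuel}: {ratio:.2f} units out per unit in; "
          f"{input_share:.0%} of output merely repays the inputs")
```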

Advances in biotechnology will continue to increase crop yields and the energy efficiency of agriculture. But total demand for energy will rise quite rapidly. Still, biomass's prospects will improve when cellulosic technology matures to allow use of all of a plant for production of energy. Yet even then I expect biomass to play only a minor role in providing energy.

Development of cheaper and higher efficiency photovoltaic materials seems like a better long term prospect than biomass for several reasons. First off, photovoltaics can provide energy all year rather than just during the growing season. Second, photovoltaics will provide energy even during droughts. Third, photovoltaics allow energy to be generated closer to where it gets used. When used to cover a building (e.g. with photovoltaic tiles) photovoltaics generate electricity where it gets used. Fourth, photovoltaics reduce the need for additional land to generate more energy. Photovoltaics on buildings and other human structures do not cover more ground than those structures already cover. Fifth, photovoltaics can be placed where few plants will grow (e.g. deserts) and therefore won't compete as much with wildlife as agriculture does. Sixth, even when plants are growing, their efficiency at turning light into chemical energy is lower than what photovoltaics will achieve in the future.

We should accelerate photovoltaics research and development. Ditto nuclear research. Ditto batteries research (though already plug-in hybrid cars are coming).

In spite of lame government policies with regard to solar power, Technology Review reports many signs that solar is starting to take off.

The announcement last month that Palo Alto, CA-based Nanosolar had raised $100 million to finance a new solar-cell factory based on an inexpensive process, similar to that used to print newspapers, and that it will make enough cells to produce 430 megawatts of power annually, is just one sign that new types of solar power are emerging as a viable alternative energy source (see "Large-Scale, Cheap Solar Electricity").

While Nanosolar's new factory capacity, equivalent to one-quarter of the total global solar capacity last year, is unprecedented for a new technology, it's just part of equally impressive overall growth in the solar industry. For the last several years, solar cell production has been doubling every two years, and indicators suggest this will not slow soon, says industry analyst Michael Rogol, managing director of Photon Consulting in Aachen, Germany.
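
"Doubling every two years" pins down an implied annual growth rate and makes rough projection easy. The doubling period and the one-quarter-of-global-capacity figure are from the article; the ten-year horizon is arbitrary.

```python
# Implied growth rate from "production doubling every two years",
# projected forward from an assumed base. The article calls Nanosolar's
# 430 MW one-quarter of last year's global capacity, so the base is
# taken as 4 * 430 MW; the ten-year horizon is arbitrary.
annual_growth = 2 ** 0.5 - 1
print(f"implied annual growth: {annual_growth:.0%}")  # ~41%

base_mw = 4 * 430
for year in range(0, 11, 2):
    print(f"year {year:2d}: ~{base_mw * 2 ** (year / 2):,.0f} MW")
```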

The price of oil just exceeded $78 per barrel. The high cost of oil is doing more to accelerate development of new energy technologies than all government energy policies together. If biomass can make the grade, high oil prices mean we'll find out.

By Randall Parker 2006 July 13 11:08 PM  Energy Biomass
Entry Permalink | Comments(18)
2006 July 12 Wednesday
Loners At Greater Risk Of Heart Attack

Lonely hearts run a greater risk of heart disease.

People who live alone run double the risk of serious heart disease compared with those who live with a partner, suggests research in the Journal of Epidemiology and Community Health. This includes severe angina and heart attack.

The finding is based on a study of more than 138,000 adults between the ages of 30 and 69 living in one area (Aarhus) of Denmark.

Between 2000 and 2002, 646 people were diagnosed with severe angina, or sustained a heart attack, or sudden cardiac death, a spectrum of conditions known as acute coronary syndrome.

When analysed in detail, using information from population registers, poor educational attainment and living on a pension were associated with an increased risk of the syndrome.

But the two strongest predictive factors for the syndrome were age and living alone.

Women above the age of 60 and living by themselves, and men over the age of 50, in the same position, were twice as likely to have the syndrome as everyone else.

Lone women over 60 comprised just over 5 per cent, and lone men over 50 just under 8 per cent, of the whole population.

Yet lone women in this age group accounted for a third of all deaths from the syndrome within 30 days of diagnosis, while lone men in this age group accounted for two thirds of deaths.

The lowest risks included cohabiting with a partner, a high level of education, and being in work. Women divorcees also enjoyed a lower risk of the syndrome.

The authors say that certain risk factors tend to be more common in the lifestyles of those who live by themselves, which may help to explain the differences.

These include smoking, obesity, high cholesterol and fewer visits to the family doctor. People living on their own may be less able to draw on social support networks as well, say the authors.

Keep working and try to stay hooked up.

By Randall Parker 2006 July 12 10:48 PM  Aging Studies
Entry Permalink | Comments(0)
Less Sleep Increases Obesity

Fight the battle of the bulge by getting enough sleep.

Research by Warwick Medical School at the University of Warwick has found that sleep deprivation is associated with an almost two-fold increased risk of being obese for both children and adults.

Early results of a study by Professor Francesco Cappuccio of the University of Warwick's Warwick Medical School were presented to the International AC21 Research Festival hosted this month by the University of Warwick.

The research reviewed current evidence in over 28,000 children and 15,000 adults. For both groups Professor Cappuccio found that shorter sleep duration is associated with almost a two-fold increased risk of being obese.

The research also suggests that those who sleep less have a greater increase in body mass index and waist circumference over time and a greater chance of becoming obese over time.

This result is consistent with other studies. See my previous posts Lack Of Sleep Linked To High Blood Pressure, Other Risks and Sleep A Lot To Avoid Burn-Out From Stress And To Stay Skinny.

Update: If you get enough sleep, your mind will probably do a better job of memory formation as well.

In the new work, the researchers studied the influence of sleep on declarative memory in healthy, college-aged adults. The results demonstrated a robust effect: Compared to participants who did not sleep during the trials, those who slept between learning and testing were able to recall more of the original words they had learned earlier. The beneficial influence of sleep was particularly marked when participants were presented with the challenge of "interference"--competing word-pair information--just prior to testing. A follow-up group further demonstrated that this sleep benefit for memory persists over the subsequent waking day. This work clarifies and extends previous study of sleep and memory by demonstrating that sleep does not just passively and transiently protect memories; rather, sleep plays an active role in memory consolidation.

I wonder whether people could learn more if they napped a few times a day between learning exercises.

By Randall Parker 2006 July 12 10:40 PM  Brain Appetite
Entry Permalink | Comments(3)
2006 July 11 Tuesday
Psilocybin Studied For Mystical And Mood Effects

People asked by scientists to try the hallucinogenic mushroom compound psilocybin under controlled conditions experienced mood improvements that lasted for months.

Using unusually rigorous scientific conditions and measures, Johns Hopkins researchers have shown that the active agent in "sacred mushrooms" can induce mystical/spiritual experiences descriptively identical to spontaneous ones people have reported for centuries.

The resulting experiences apparently prompt positive changes in behavior and attitude that last several months, at least.

The agent, a plant alkaloid called psilocybin, mimics the effect of serotonin on brain receptors, as do some other hallucinogens, but precisely where in the brain and in what manner are unknown.

What causes the change in mood? Do people feel happier about life after experiencing it in a very different way? Severe alteration of perceptions caused by a hallucinogen would not seem sufficient by itself to cause a more positive outlook. After all, a nightmare does not brighten one's outlook and scary hallucinations probably wouldn't brighten one's mood afterward either. So what causes the mood brightening? A happy hallucination? Or do alterations in feelings about self and non-self make people feel less isolated and more connected after the hallucinatory experience? Do they feel that the world makes more sense? More generally, does the change in mood come from processing the meaning of the experience? In other words, is the mood change the result of cognitive processing that interprets the experience? Or does the drug cause lasting side effects on neurons separate from the memory of the experience?

Many of the subjects found 'shrooming to be a deeply significant and meaningful experience. By contrast, I'm skeptical that a chemical compound from this universe can allow one to experience the supernatural.

In the study, more than 60 percent of subjects described the effects of psilocybin in ways that met criteria for a "full mystical experience" as measured by established psychological scales. One third said the experience was the single most spiritually significant of their lifetimes; and more than two-thirds rated it among their five most meaningful and spiritually significant. Griffiths says subjects liken it to the importance of the birth of their first child or the death of a parent.

Two months later, 79 percent of subjects reported moderately or greatly increased well-being or life satisfaction compared with those given a placebo at the same test session. A majority said their mood, attitudes and behaviors had changed for the better. Structured interviews with family members, friends and co-workers generally confirmed the subjects' remarks. Results of a year-long followup are being readied for publication.

Psychological tests and subjects' own reports showed no harm to study participants, though some admitted extreme anxiety or other unpleasant effects in the hours following the psilocybin capsule. The drug has not been observed to be addictive or physically toxic in animal studies or human populations. "In this regard," says Griffiths, a psychopharmacologist, "it contrasts with MDMA (ecstasy), amphetamines or alcohol."

The study isn't the first with psilocybin, the researchers say, though some of the earlier ones, done elsewhere, had notably less rigorous design, were less thorough in measuring outcomes or lacked longer-term follow-up.

In the present work, 36 healthy, well-educated volunteers-most of them middle-aged-with no family history of psychosis or bipolar disorder were selected. All had active spiritual practices. "We thought a familiarity with spiritual practice would give them a framework for interpreting their experiences and that they'd be less likely to be confused or troubled by them," Griffiths says. All gave informed consent to the study approved by Hopkins' institutional review board.

Thirty of the subjects each attended two separate 8-hour drug sessions, at two-month intervals. In one they received psilocybin; in the other, methylphenidate (Ritalin), the active placebo.

I've long thought that the mind is a very flawed instrument when it comes to accurately assessing and understanding the world. Since natural selection probably selected for genes that cause us to perceive and focus in ways that enhance survival and reproduction at the expense of accuracy, I can imagine that a drug could temporarily block cognitive processes that hobble our ability to assess the world accurately. So while I'm skeptical that drugs can help one see God, I do think it is possible that hallucinatory compounds could help improve the quality of cognitive processing.

On the other hand, improvements in mood could actually be the result of strengthened delusions about reality. I'm reminded of a study I read about a few years ago which found that depressed people had more accurate perceptions of how their co-workers evaluated them than non-depressed people. Non-depressed people tended to think their co-workers rated them higher than the co-workers actually did. Anyone recall the study? I can't find it and would appreciate a link to it.

So then does psilocybin improve mood by helping people better understand the world? Or by feeding their delusions? Or by acting like an anti-depressant drug that alters neurotransmitter levels?

I am reminded of Michael Persinger's use of electromagnetic fields to induce a state of mind in which some think they experience the divine. Though some are skeptical of Persinger's research.

By Randall Parker 2006 July 11 06:11 PM  Brain Spirituality
Entry Permalink | Comments(23)
2006 July 10 Monday
Fish Lowers Aging Eye Disease Risks

Eat fish and do not smoke if you want to avoid age-related macular degeneration of the eyes.

BOSTON (July 10, 2006) -- Researchers in Boston studied elderly male twins and found that those who smoke or have a history of smoking had an increased risk of developing age-related macular degeneration as compared to those who never smoked. At the same time, those who ate more fish and had diets with higher levels of omega-3 fatty acids reduced their risk of this blinding disease. Their findings are published in the July 2006 issue of the Archives of Ophthalmology.

Researchers at the Massachusetts Eye and Ear Infirmary and Department of Biostatistics at Harvard Medical School studied 681 male twins from the National Academy of Sciences-National Research Council World War II Veteran Twin Registry. To determine genetic and environmental risk factors for AMD, twins were surveyed for a prior diagnosis of AMD and underwent an eye examination, fundus photography, and food frequency and risk factor questionnaires. The study included 222 twins with intermediate and late stage AMD and 459 twins with no signs of the disease.

“Current smokers had a 1.9-fold increased risk of developing AMD, while past smokers had about a 1.7-fold increased risk,” said Johanna M. Seddon, M.D., director of the Epidemiology Unit at the Massachusetts Eye and Ear Infirmary and an associate professor of ophthalmology at Harvard Medical School. “We also found that increased intake of fish reduced the risk of AMD, particularly if they ate two or more servings per week. Dietary omega-3 fatty acid intake was also inversely associated with AMD. This study of twins provides further evidence that cigarette smoking increases risk while fish consumption and omega-3 fatty acid intake reduce risk of AMD.”

The effects of cigarettes are no surprise. Previous studies have shown a beneficial effect of fish for AMD. This study strengthens the evidence for that finding.

The problem: The oceans are getting depleted of fish. We really need genetic engineering done on crop plants in order to make more scalable and cheaper land-based sources of omega 3 fatty acids.

By Randall Parker 2006 July 10 11:29 PM  Aging Diet Eye Studies
Entry Permalink | Comments(7)
2006 July 09 Sunday
First US Coal To Liquid Plant Coming

The New York Times reports on plans by Rentech to build a plant to convert coal to liquid fuel burnable in diesel engines.

Here in East Dubuque, Rentech Inc., a research-and-development company based in Denver, recently bought a plant that has been turning natural gas into fertilizer for forty years. Rentech sees a clear opportunity to do something different because natural gas prices have risen so high. In an important test case for those in the industry, it will take a plunge and revive a technology that exploits America's cheap, abundant coal and converts it to expensive truck fuel.

"Otherwise, I don't see us having a future," John H. Diesch, the manager of the plant, said.

If a large scaling up of coal-to-liquid (CTL) production takes place then an increase in pollution seems likely. Though perhaps advances in conversion technologies and tougher regulations could prevent this. The use of coal to make liquid fuels will increase CO2 emissions since the conversion plants will emit CO2 and of course the liquid fuel will emit CO2 just as conventional diesel fuel does. Those who view rising CO2 emissions with alarm therefore see a shift to CTL as a harmful trend.

And, uniquely in this country, the plant will take coal and produce diesel fuel, which sells for more than $100 a barrel.

The cost to convert the coal is $25 a barrel, the company says, a price that oil seems unlikely to fall to in the near future. So Rentech is discussing a second plant in Natchez, Miss., and participating in a third proposed project in Carbon County in Wyoming.

That sounds very profitable. The longer the price of oil stays high, the likelier capitalists will decide it is worth the risk to build CTL plants. Many are holding back, worried that oil prices could tank again as happened in the early 1980s. That price decline drove the Great Plains Synfuels Plant in Beulah, North Dakota into bankruptcy. It was later restarted and now produces natural gas from coal profitably. But the bankruptcy cut the capital cost of operating that plant, so it is not a perfect measure of the profitability of processes to convert coal to gas or liquid.
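
A quick margin sketch shows why this looks profitable and where the risk lies. The $25 conversion cost and the over-$100 diesel price are from the article; the coal feedstock cost is my own assumption, and the sketch ignores capital recovery, which dominates real CTL economics.

```python
# Rough coal-to-liquids margin per barrel. The $25 conversion cost and
# the >$100 diesel price are from the article; the coal feedstock cost
# is assumed, and capital recovery is ignored.
DIESEL_PRICE = 100.0    # $/barrel, "more than $100" per the article
CONVERSION_COST = 25.0  # $/barrel, from the article
COAL_FEEDSTOCK = 15.0   # $/barrel-equivalent of coal input (assumed)

margin = DIESEL_PRICE - CONVERSION_COST - COAL_FEEDSTOCK
breakeven = CONVERSION_COST + COAL_FEEDSTOCK
print(f"gross margin: ~${margin:.0f}/barrel")
print(f"operating break-even product price: ~${breakeven:.0f}/barrel")
```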

The rise in US natural gas prices has led to a cutback in the production of ammonia and ammonia-based fertilizers from natural gas in the United States. Ammonia production has ramped up in countries which have cheaper natural gas due to a surplus of local production (e.g. in the Middle East). But synthetic gas can be produced from coal for cheaper than recent natural gas prices. Rentech intends to convert an existing ammonia plant to coal gas and then at the same site they will implement a coal-to-liquids process to make synthetic diesel fuel.

Denver, Colorado-Rentech, Inc. (AMEX:RTK) announced today that its wholly owned subsidiary, Rentech Energy Midwest Corporation (REMC), has entered a Professional Services Agreement (PSA) with Kiewit Energy Company (KEC), Houston, Texas, a subsidiary of Kiewit Corporation located in Omaha, Nebraska. WorleyParsons, under contract to KEC, will lead the Front End Engineering and Design (FEED) from their offices in Houston, Texas. FEED services will include the planned coal gasification conversion of REMC's natural gas fed ammonia fertilizer facility in East Dubuque, Illinois and the production of ultra-clean fuels based on Rentech’s patented and proprietary Fischer-Tropsch (FT) coal-to-liquids (CTL) technology.

Rentech intends to implement ConocoPhillips E-Gas™ Technology for clean coal gasification at the site to produce syngas initially for use in the production of ammonia and ammonia-based fertilizers and ultra-clean FT fuels.

The FEED contract for the first phase of the project, which includes the conversion of the ammonia plant feedstock to coal derived syngas and the installation of FT liquids production, has been initiated. The results of the FEED work will be used to advance the project and provide a basis for an Engineering Procurement and Construction Contract. FEED activities are scheduled for completion during the first half of 2007. REMC's East Dubuque facility is expected to be the site of the United States' first commercial CTL plant utilizing clean coal gasification in conjunction with the Rentech Process.

The New York Times article also mentions GreatPoint Energy, which has a pilot coal-to-gas (CTG) plant in Des Plaines, Illinois. Their process uses a new catalyst that lowers the heat needed and therefore the energy loss from conversion of coal to natural gas. They list several advantages of their process:

Produces methane in a single step and in a single reactor

  • Pipeline grade product
  • No need for external water gas shift reactor
  • No need for external methanation reactor
  • Produces CO2 as a valuable sequestration-ready byproduct

Significantly reduces operating temperature

  • Lower cost reactor components
  • Lower maintenance costs and higher reliability
  • Eliminates costly high temperature cooling

Utilizes steam methanation

  • Eliminates costly air separation plant

High efficiency

  • 65% overall efficiency
  • Thermally neutral reaction process
  • No need for integrated power plant

I like the fact that this process produces CO2 already separated out. This lowers the cost of CO2 sequestration.

The price of oil has gotten so high that lots of talented people with entrepreneurial streaks are coming up with cheaper ways to get liquid and gaseous fuel from coal. Given that I do not expect the political system to impose sufficiently stringent emissions regulations on coal-based energy processes, I'm ambivalent about these developments. But the GreatPoint Energy process is reason for more optimism. If they really can produce the CO2 as a separate product of their process then perhaps coal-based energy technologies which are both low cost and low in emissions are an attainable goal.

Even if the plants that convert coal to liquid and gaseous fuels can be made to be carbon-neutral at low cost a shift to coal on a scale sufficient to increase liquid fuels production would still lead to higher CO2 emissions. The liquids would get burned in vehicles and vehicle consumption of hydrocarbon liquids would rise.

Advances in battery technology show the most promise for reducing both conventional pollutant emissions and CO2 emissions. First off, cheap and high energy density batteries would lower the cost of hybrids. That would increase the efficiency of burning liquid fuels and therefore reduce emissions. Also, batteries will enable a migration toward use of electricity to charge up vehicles. Then stationary power plants - whose emissions are far easier to reduce - will supply an increasing fraction of all energy used in transportation.
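
A rough per-mile comparison shows why shifting transportation energy to the grid can cut CO2 even before the grid itself gets cleaner. Every number below is an illustrative assumption.

```python
# Illustrative per-mile CO2 for a gasoline car vs. a grid-charged
# electric vehicle. Every number is an assumption for the sketch.
KG_CO2_PER_GALLON = 8.9  # combustion CO2 per gallon of gasoline (assumed)
CAR_MPG = 25.0           # assumed conventional car
EV_KWH_PER_MILE = 0.3    # assumed electric drive consumption

print(f"gasoline car: {KG_CO2_PER_GALLON / CAR_MPG:.2f} kg CO2/mile")

grid_intensity = {"coal": 1.0, "natural gas": 0.5, "nuclear/wind": 0.0}
for source, kg_per_kwh in grid_intensity.items():  # kg CO2/kWh (assumed)
    print(f"EV on {source}: {EV_KWH_PER_MILE * kg_per_kwh:.2f} kg CO2/mile")
```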

The migration to pluggable hybrids and pure electric vehicles will allow nuclear, solar, and wind to provide power for transportation. Also, coal-burning electric plants could in theory be made to have zero emissions. Whether the cost of zero emissions coal will ever compete with nuclear, wind, and future cheaper photovoltaics remains to be seen.

Also see my posts US Air Force Plans Shift To Coal Derived Jet Fuel, Fischer-Tropsch Coal Gas Cost Effective With Current Oil Prices?, and Rapid Switch From Oil To Coal Possible?

By Randall Parker 2006 July 09 09:31 AM  Energy Policy
Entry Permalink | Comments(23)
2006 July 06 Thursday
THC Primes Rats For Heavier Heroin Abuse

Marijuana really does lead to higher risk of heavy duty drug abuse - at least in lab rats.

To rule out social factors, the researchers turned to an animal model. They dosed some rats with the active ingredient of cannabis and others with a neutral compound during their adolescence (when they were about four to six weeks old). After that, they gave the rats intermittent access to heroin for several weeks, obtained by pressing a lever.

Although all rats helped themselves to heroin, the ones given cannabis's key compound, called Delta-9-tetrahydrocannabinol (THC), during their formative years showed a greater escalation in their self-dosing during the experiment. By the end, rats that'd had cannabis in their 'teens' were pressing the lever that delivered heroin about 1.5 times more than the rats that had previously been drug-free.

I find this incredibly unsurprising. The developing brain develops differently if exposed to drugs that activate brain pleasure circuitry.

Decreased sensitivity leads to greater risk of addiction.

“At first, all the rats behaved the same and began to self-administer heroin frequently,” says Hurd. “But after a while, they stabilised their daily intake at a certain level. We saw that the ones that had been on THC as teenagers stabilised their intake at a much higher level than the others – they appeared to be less sensitive to the effects of heroin. And this continued throughout their lives.”

Hurd says reduced sensitivity to the heroin means the rats take larger doses, which has been shown to increase the risk of addiction.

Earlier use of alcohol in humans is associated with greater risk of alcoholism.

Data from a survey of 43,000 U.S. adults heighten concerns that early alcohol use, independent of other risk factors, may contribute to the risk of developing future alcohol problems. Those who began drinking in their early teens were not only at greater risk of developing alcohol dependence at some point in their lives, they were also at greater risk of developing dependence more quickly and at younger ages, and of developing chronic, relapsing dependence. Among all respondents who developed alcoholism at some point, almost half (47 percent) met the diagnostic criteria for alcohol dependence (alcoholism) by age 21.

The associations between early drinking and later problems held even after investigators controlled for other risk factors for dependence, adding to concerns that drinking at a young age might raise the risk of future alcohol problems rather than being an identifying feature of young people predisposed to risky behavior. The study appears in the July issue of Archives of Pediatrics & Adolescent Medicine, Volume 160, pages 739-746.

Timing of first alcohol use leads to a huge difference in risk.

In results that echo earlier studies, of those individuals who began drinking before age 14, 47 percent experienced dependence at some point, vs. 9 percent of those who began drinking at age 21 or older. In general, each additional year earlier than 21 that a respondent began to drink, the greater the odds that he or she would develop alcohol dependence at some point in life. While one quarter of all drinkers in the survey started drinking by age 16, nearly half (46 percent) of drinkers who developed alcohol dependence began drinking at age 16 or younger.

New findings showed that among all drinkers, early drinking was associated not only with a higher risk of developing alcoholism at some point, but also within 10 years of first starting to drink, before age 25, and within any year of adult life. Early drinking was also associated with increased risk of having multiple episodes of alcoholism. Further, among respondents who had had alcohol dependence at some point, those who began drinking young had episodes of longer duration and with a wider range of symptoms than those who started later.

The developing brain gets altered by drug and alcohol use. We do not have free will. If legalization would increase the amount of teen drug use, then it would lead to much more drug abuse and addiction.

Adolescent brains are at much higher risk because they are still developing. Industrialized societies need to do a better job of protecting teenagers from substances that will mess up their brain development.

Also see my post Adolescence Is Tough On The Brain.

By Randall Parker 2006 July 06 10:13 PM  Brain Addiction
Entry Permalink | Comments(19)
New Scaffold For Burn Victim Skin Cell Growth

Using an electric field to direct electrospinning produces a dissolving fiber that provides a base for skin cell growth.

The process used to make the scaffold is based on a technique called electrospinning, which produces polymer fibres down to nano-scale by applying an electric field. However, the team has developed a new method of making aligned-fibre 'mats' from the same biodegradable polymers. These promote the growth of nerves, tendons and cartilage.

Prof Tony Ryan, who leads the research at Sheffield University, said: 'Normal electrospinning leaves the fibres running in random directions. We have developed a method to control the orientation of the fibres by controlling the electric field. This is now being patented.'

Ryan said the breakthrough was as much about understanding cell behaviour as the scaffold. 'What we have shown is that cells know the order in which they need to build, so you get the same strata in the new skin as you had in your own.'

The researchers see human trials as still at least 2 years off.

By Randall Parker 2006 July 06 08:39 PM  Biotech Tissue Engineering
Entry Permalink | Comments(0)
Cell Therapy Repairs Knee Osteoarthritis Damage

British scientists at the University of Bristol report some success with cell therapy to repair osteoarthritis damage in knees.

Knee cartilage injuries can be effectively repaired by tissue engineering and osteoarthritis does not stop the regeneration process concludes research led by scientists at the University of Bristol.

The study, "Maturation of tissue engineered cartilage implanted in injured and osteoarthritic human knees", published in the July 2006 (Volume 12, Number 7) issue of Tissue Engineering, demonstrates that engineered cartilage tissue can grow and mature when implanted into patients with a knee injury. The novel tissue engineering approach can lead to cartilage regeneration even in knees affected by osteoarthritis.

They grew cells extracted from the same patients into whom the cells were later implanted. This approach avoids immune rejection problems.

The tissue engineering method used in this study involved isolating cells from healthy cartilage removed during surgery from 23 patients with an average age of 36 years. After growing the cells in culture for 14 days, the researchers seeded them onto scaffolds made of esterified hyaluronic acid, grew them for another 14 days on the scaffolds, and then implanted them into the injured knees of the study patients.

Cartilage regeneration was seen in ten of 23 patients, including in some patients with pre-existing early osteoarthritis of the knee secondary to traumatic injury. Maturation of the implanted, tissue-engineered cartilage was evident as early as 11 months after implantation.

Anthony Hollander, ARC Professor of Rheumatology & Tissue Engineering at Bristol University who led the study, said: "This is the first time we have shown that tissue-engineered cartilage implanted into knees can mature within 12 months after implantation, even in joints showing signs of osteoarthritis."

Initial tissue engineering successes will accelerate the advance of tissue engineering and stem cell therapies more broadly. Experience gained from scaling up and automating cell therapy delivery will lead to discoveries about how to refine and improve processes for handling cells. The revenue from stem cell therapy delivery will fund further development.

Many steps had to be worked out to go from extraction to reimplantation of cells.

Although the researchers did not carry out physical tests of the patients' mobility, these testing techniques have previously been shown to provide a good indicator of the cartilage's function, suggesting movement should be improved too. The reason why not all patients benefited from the engineered cartilage is not yet clear, although Hollander says giving the engineered tissue longer to settle in may help.

The new study is a "textbook example" of how tissue engineering should work, says Julian Chaudhuri, a tissue engineer working on cartilage at Bath University, UK. "Every step is in place from growing the tissue to implanting in patients, and it's been shown to work," he says. "It looks very exciting."

A December 2005 report on Hollander's progress shows he managed to grow human cartilage outside of the body.

Professor Anthony Hollander and his team at Southmead Hospital have successfully grown human cartilage from a patient's own stem cells for the first time ever.

This means people suffering from the severe form of the bone disease osteoarthritis, which leaves them unable to walk, could in the future have cartilage transplant operations.

Scientists took stem cells from the bone marrow of pensioners undergoing NHS replacement operations because they have arthritis.

They took just over a month to grow the cells into a half-inch length of cartilage.

An American team is pursuing a stem cell approach using a single donor for many patients.

Stem cell therapy — a technique that relies on the idea that stem cells can be prompted to turn into cartilage cells that will grow and repair damage — is another possible avenue for future treatment. Johns Hopkins researcher Jennifer Elisseeff has used the method in rats, finding that stem cells can fill in holes in the cartilage.

"These cells have the amazing ability to repair parts of the body," says Thomas Vangsness, an orthopedic surgeon at the University of Southern California in Los Angeles.

Vangsness and his colleagues are testing a stem cell therapy developed by Osiris Therapeutics. The Baltimore company has developed a solution of stem cells taken from a single adult donor. Vangsness and his colleagues injected the stem cell solution into the knees of 55 patients with a torn meniscus, cartilage-like tissue in the knee. They're hoping the stem cells will turn into cartilage cells and repair the injury, but the data are just now being analyzed, Vangsness says.

With multiple teams doing human clinical trials the development of successful joint repair using cell therapies no longer seems a distant prospect.

By Randall Parker 2006 July 06 07:50 PM  Biotech Tissue Engineering
Entry Permalink | Comments(14)
2006 July 05 Wednesday
Lifestyle Factors Reduce Heart Risk

You can't just take a pill to avoid health problems. At least not yet. Some day biotechnological advances will free us from the need to select our diets carefully. But for now drugs cannot do the job alone.

Boston, MA -- A prospective study of 42,847 middle-aged and older U.S. men participating in the Health Professionals Follow-up Study has found that a healthy lifestyle is associated with a lower risk of coronary heart disease (CHD), even among men taking antihypertensive or lipid-lowering medications. The research, which is the first to look at the role of a healthy lifestyle and CHD in men in this age group, is published in the July 3, 2006, online edition of Circulation.

The research team, led by Eric Rimm, associate professor of epidemiology and nutrition at Harvard School of Public Health (HSPH) and Stephanie Chiuve, research fellow in nutrition at HSPH, did a 16-year follow-up of men aged 40-75 in the Health Professionals Follow-up Study, a men's health study that began in 1986. The researchers defined healthy lifestyle factors as not smoking, daily exercise, moderate alcohol consumption, a healthy body weight and a healthy diet (based upon the Alternate Healthy Eating Index developed by HSPH, which targets food and nutrients associated with lower risk of chronic disease). The study, which documented 2,183 coronary events, found that men with all five healthy lifestyle factors had a lower risk of CHD compared to men with none of those factors. It also found that 62 percent of coronary events may have been prevented if all men in the study population adhered to all five healthy lifestyle factors; for those men taking medications, 57 percent could have been prevented. Men who adopted two or more low-risk factors during the study period (1986-2002) had a 27 percent lower risk of CHD. Overall, for each healthy lifestyle factor, the authors found an inverse association with CHD risk.
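That 62 percent figure is a population attributable risk estimate. The press release doesn't show the arithmetic, but a minimal sketch of the standard population attributable fraction formula, using made-up incidence numbers rather than the study's actual data, shows how a figure like that falls out:

```python
# Population attributable fraction (PAF): the share of coronary events
# that would not occur if every man had the low-risk lifestyle profile.
# These incidence numbers are hypothetical, chosen only to illustrate
# the formula, not taken from the study.
incidence_all_men = 0.050    # hypothetical CHD incidence, whole cohort
incidence_low_risk = 0.019   # hypothetical incidence, all 5 healthy factors

paf = (incidence_all_men - incidence_low_risk) / incidence_all_men
print(f"Preventable fraction: {paf:.0%}")  # 62% with these inputs
```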

Well, statins used to lower cholesterol will not provide an optimal outcome on their own. Diet still matters, not surprisingly.

If you want to follow their dietary recommendations then check out the food pyramid for the Alternate Healthy Eating Index. My guess is you can do even better than that food pyramid by enhancing it with the "Ape Diet" recommendations of David Jenkins at U Toronto.

By Randall Parker 2006 July 05 10:12 PM  Aging Diet Studies
Entry Permalink | Comments(0)
Pomegranate Slows Prostate Cancer

Drink pomegranate juice.

Drinking an eight ounce glass of pomegranate juice daily increased by nearly four times the period during which PSA levels in men treated for prostate cancer remained stable, a three-year UCLA study has found.

The study involved 50 men who had undergone surgery or radiation but quickly experienced increases in prostate-specific antigen or PSA, a biomarker that indicates the presence of cancer. UCLA researchers measured "doubling time," how long it takes for PSA levels to double, a signal that the cancer is progressing, said Dr. Allan Pantuck, an associate professor of urology, a Jonsson Cancer Center researcher and lead author of the study.

Doubling time is crucial in prostate cancer, Pantuck said, because patients who have short doubling times are more likely to die from their cancer. The average doubling time is about 15 months. In the UCLA study, Pantuck and his team observed increases in doubling times from 15 months to 54 months, an almost four-fold increase.
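PSA doubling time is conventionally estimated by assuming exponential PSA growth between two measurements. Here is a minimal sketch of that standard calculation, with illustrative numbers rather than any patient's data:

```python
import math

def psa_doubling_time(psa1, psa2, months_apart):
    """Months for PSA to double, assuming exponential growth
    between two measurements taken months_apart apart."""
    return months_apart * math.log(2) / math.log(psa2 / psa1)

# Illustrative only: a PSA rise from 1.0 to 1.5 ng/mL over 9 months
# works out to roughly the 15-month average doubling time cited above.
print(f"{psa_doubling_time(1.0, 1.5, 9):.1f} months")  # about 15.4
```

The slower the PSA rises between measurements, the longer the computed doubling time, which is why stretching it from 15 to 54 months signals much slower cancer progression.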

"That's a big increase. I was surprised when I saw such an improvement in PSA numbers," Pantuck said. "In older men 65 to 70 who have been treated for prostate cancer, we can give them pomegranate juice and it may be possible for them to outlive their risk of dying from their cancer. We're hoping we may be able to prevent or delay the need for other therapies usually used in this population such as hormone treatment or chemotherapy, both of which bring with them harmful side effects."

That's a huge increase in doubling times. If you get diagnosed with prostate cancer now and pomegranate juice extends your life by several years then you might live long enough for a cure to be developed.

Pomegranate juice contains all sorts of good things.

Pomegranate juice is known to have anti-inflammatory effects and high levels of anti-oxidants, which are believed to protect the body from free-radical damage. It also contains poly-phenols, natural antioxidant compounds found in green tea, as well as isoflavones commonly found in soy, and ellagic acid, which is believed to play a role in cancer cell death.

"There are many substances in pomegranate juice that may be prompting this response," Pantuck said. "We don't know if it's one magic bullet or the combination of everything we know is in this juice. My guess is that it's probably a combination of elements, rather than a single component."

Some of the men have kept their PSA levels from rising for 3 years with pomegranate.

Pantuck said he has men on the study more than three years out who are not being treated for prostate cancer other than drinking pomegranate juice and their PSA levels continue to be suppressed.

"The juice seems to be working," he said.

Some of you ladies might be thinking that pomegranate against cancer is just a guy thing. Not so. Back in 2001 an Israeli team found evidence of pomegranate against breast cancer.

The Technion-Israel Institute of Technology research team presented two studies at an international conference in June indicating that pomegranate seed oil triggers apoptosis -- a self-destruct mechanism in breast cancer cells. Furthermore, pomegranate juice can be toxic to most estrogen-dependent breast cancer cells, while leaving normal breast cells largely unaffected. Estrogen is a hormone often prescribed to protect postmenopausal women against heart disease and osteoporosis.

In the first study, laboratory-grown breast cancer cells were treated for three days with pomegranate seed oil. The researchers observed apoptosis in 37 to 56 percent of the cancer cells, depending upon the dose of oil applied.

In the second study, both normal and cancerous breast cells were exposed to fermented pomegranate juice (pomegranate wine) and pomegranate peel extracts, which contain polyphenols (powerful antioxidants). The vast majority of the normal cells remained unaffected by the two pomegranate derivatives. But more than 75 percent of the estrogen-dependent cancer cells, and approximately half of the non-estrogen dependent cancer cells were destroyed by exposure to these same pomegranate products.

"Pomegranates are unique in that the hormonal combinations inherent in the fruit seem to be helpful both for the prevention and treatment of breast cancer," explains Dr. Ephraim Lansky, who headed the studies. "Pomegranates seem to replace needed estrogen often prescribed to protect postmenopausal women against heart disease and osteoporosis, while selectively destroying estrogen-dependent cancer cells."

Pomegranate might also reduce the odds of getting skin cancer. Still other scientists are using liquid chromatography to isolate components in pomegranate with the strongest anti-cancer effects.

By Randall Parker 2006 July 05 08:22 PM  Aging Diet Cancer Studies
Entry Permalink | Comments(2)
2006 July 02 Sunday
Americans Becoming Lonelier

I hear Sting singing "I feel lonely, I'm so lonely, I feel so low". People have fewer confidants.

Durham, N.C. -- Americans’ circle of confidants has shrunk dramatically in the past two decades and the number of people who say they have no one with whom to discuss important matters has more than doubled, according to a new study by sociologists at Duke University and the University of Arizona.

“The evidence shows that Americans have fewer confidants and those ties are also more family-based than they used to be,” said Lynn Smith-Lovin, Robert L. Wilson Professor of Sociology at Duke University and one of the authors of "Social Isolation in America: Changes in Core Discussion Networks Over Two Decades."

“This change indicates something that’s not good for our society. Ties with a close network of people create a safety net. These ties also lead to civic engagement and local political action,” she said.

The study, published in the June issue of American Sociological Review, is based on the first nationally representative survey on this topic in 19 years.

It compared data from 1985 and 2004 and found that the mean number of people with whom Americans can discuss matters important to them dropped by nearly one-third, from 2.94 people in 1985 to 2.08 in 2004.

Researchers also found that the number of people who said they had no one with whom to discuss such matters more than doubled, to nearly 25 percent. The survey found that both family and non-family confidants dropped, with the loss greatest in non-family connections.

The study paints a picture of Americans’ social contacts as a “densely connected, close, homogeneous set of ties slowly closing in on itself, becoming smaller, more tightly interconnected, more focused on the very strong bonds of the nuclear family.”

That means fewer contacts created through clubs, neighbors and organizations outside the home -- a phenomenon popularly known as “bowling alone,” from the 2000 book of the same title by Robert D. Putnam.

The researchers speculated that changes in communities and families, such as the increase in the number of hours that family members spend at work and the influence of Internet communication, may contribute to the decrease in the size of close-knit circles of friends and relatives.

The study also finds that:

-- The trend toward social isolation mirrors other class divides. Non-whites and people with less education tend to have smaller networks than white Americans and those with higher educational levels.

-- Racial diversity among people’s networks has increased. The percentage of people who count at least one person of another race in their close network has gone up from about 9 percent to more than 15 percent.

-- The percentage of people who talk only to family members about important matters increased from about 57 percent to about 80 percent, while the number of people who depend totally on their spouse has increased from about 5 percent to about 9 percent.

The social scientists who did this research are uncertain about explanations.

One possibility is that people interpreted the questions differently in 2004 than they did in 1985. What people define as “important” might have changed, or people might not equate emailing or instant messaging with “discussing.”

The researchers also suggest that changes in work and the geographical scattering of families may foster a broader, shallower network of ties, rather than the close bonds measured by this study.

Research also shows a decline in the number of groups that people belong to and the amount of time they spend with these clubs and other organizations. Members of families spend more time at work and have less time to spend on activities outside the home that might lead to close relationships.

And new technology, while it allows people to connect over larger distances, might diminish the need for face-to-face visits with friends, family or neighbors, the study said.

I certainly spend more time communicating with people remotely due to the internet. But hasn't the decline in the cost of phone calls also shifted communication time toward remote conversations?

On the one hand, phones let people stay in contact with other people who are no longer living near them. On the other hand, time spent on the phone reduces time available to deal with people face-to-face. That face time seems more likely to develop friendships.

What I wonder: Are people specializing their relationship needs? Instead of having one friend who serves many purposes, do people have more relationships where each satisfies a narrower range of needs?

We can meet many more people. We can live in more places, work in more places, play in more places. We can communicate with people around the world. Look at yourself reading my writing right now. You can read some thoughts of some guy who is not a professional writer and you can respond to him in the comments and read my responses in return. That all pulls you away from developing relationships in person where you are.

But if the decline in relationships is greater in the lower classes then that argues against the internet playing the major role. The influence of the internet is greater among the more affluent.

Further into the future perhaps this trend will reverse. When we develop the ability to reverse the aging process people will have centuries in which to develop relationships. Automation will probably increase the amount of free time. Will people spend some of that time developing relationships?

On the other hand, automation will reduce many ways in which we need to come into contact with others. For example, look at education. It is incredibly inefficient. Every year the same basic calculus course gets taught thousands of times. Why not record more lectures and save all that labor? Plus, people could watch lectures whenever they wanted to. Tests could get delivered and graded automatically over the internet. But automation of much of learning would reduce contacts with teachers, school administrators, and fellow students.

Will work involve more human-to-human contact or less? Also, some jobs require lots of contact but of very short duration or via phone. Will jobs tend to involve more longer term relationships? Or will more of the customized delivery of services be done via computer records remembering customer preferences and will each customer service representative deal with a constantly changing set of customers with short interactions?

Anyone think they have insights on the causes of the decrease in real friendships?

Update: To what extent does movement to take new jobs cause a reduction in the number of friendships people maintain? Over the last couple of decades has there been a reduction in the average amount of time people spend in a neighborhood before moving a substantial distance?

I also wonder whether rising economic output and the resulting widening range of incomes has decreased the amount people have in common with each other. When a larger fraction of the population worked in factories, salaries and career trajectories were more similar. Knowledge work might pull people apart as specialization in education and in careers reduces the extent of common experiences.

Update II: I think media make people less interested in those around them. You can find better looking people on TV, in movies, and on the internet. In electronic venues, in contrast to whom you can meet locally, you can find funnier, more thoughtful, more original, more energetic, prettier, sexier, more alluring, smarter, and better informed people. Getting to know your neighbors seems unrewarding to most people. That's a harsh thing to say. But for the vast bulk of the population it is also true. Most people are not that interesting to most other people.

What does this portend for the future?

By Randall Parker 2006 July 02 10:09 PM  Trends Demographic
Entry Permalink | Comments(19)
Ultrasound Regrows Damaged Teeth

University of Alberta scientists have developed a wearable microminiature ultrasound generator that causes damaged teeth to generate more tooth material.

Hockey players, rejoice! A team of University of Alberta researchers has created technology to regrow teeth--the first time scientists have been able to reform human dental tissue.

Using low-intensity pulsed ultrasound (LIPUS), Dr. Tarak El-Bialy from the Faculty of Medicine and Dentistry and Dr. Jie Chen and Dr. Ying Tsui from the Faculty of Engineering have created a miniaturized system-on-a-chip that offers a non-invasive and novel way to stimulate jaw growth and dental tissue healing.

"It's very exciting because we have shown the results and actually have something you can touch and feel that will impact the health of people in Canada and throughout the world," said Chen, who works out of the Department of Electrical and Computer Engineering and the National Institute for Nanotechnology.

The wireless design of the ultrasound transducer means the minuscule device will be able to fit comfortably inside a patient's mouth while packed in biocompatible materials. The unit will be easily mounted on an orthodontic or "braces" bracket or even a plastic removable crown. The team also designed an energy sensor that will ensure the LIPUS power is reaching the target area of the teeth roots within the bone. TEC Edmonton, the U of A's exclusive tech transfer service provider, filed the first patent recently in the U.S. Currently, the research team is finishing the system-on-a-chip and hopes to complete the miniaturized device by next year.

"If the root is broken, it can now be fixed," said El-Bialy. "And because we can regrow the teeth root, a patient could have his own tooth rather than foreign objects in his mouth."

The device is aimed at those experiencing dental root resorption, a common effect of mechanical or chemical injury to dental tissue caused by diseases and endocrine disturbances. Mechanical injury from wearing orthodontic braces causes progressive root resorption, limiting the duration that braces can be worn. This new device will work to counteract the destructive resorptive process while allowing for the continued wearing of corrective braces. With approximately five million people in North America presently wearing orthodontic braces, the market size for the device would be 1.4 million users.

This would allow more rapid realignment of teeth for those undergoing orthodontic therapy.

El-Bialy had previously demonstrated this effect using a larger ultrasound generator. He teamed up with other faculty and developed a wearable device so that the benefit could be had more easily. His previous research showed that the ultrasound also helped cause damaged bones to repair.

El-Bialy has shown in earlier research that ultrasound waves, the high frequency sound waves normally used for diagnostic imaging, help bones heal and tooth material grow.

"I was using ultrasound to stimulate bone formation after lower-jaw lengthening in rabbits," El-Bialy said in an interview Tuesday.

To his surprise, not only did he help heal the rabbits' jaws after the surgery, but their teeth started to grow as well.

He foresees the day when people with broken bones will wear ultrasound emitters wrapped into the bandages.

This approach by itself probably can't solve the problem of growing replacements for entirely missing teeth. However, ultrasound might help stimulate tooth building cells once scientists develop techniques for creating suitable cells. Still, additional problems must be solved to get tooth building cells to produce the particular tooth shape desired.

By Randall Parker 2006 July 02 07:22 PM  Biotech Teeth And Gums
Entry Permalink | Comments(438)