2005 January 31 Monday
Human Embryonic Stem Cells Converted Into Motor Neurons

Human embryonic stem cells (hESCs) have previously been converted into many different cell types including a number of nerve cell types. But until now no lab has been successful in converting hESCs into motor neurons. Motor neurons are the nerve cells that run down the spinal cord to send messages to muscle cells to cause muscles to contract. Your body won't motor around without motor neurons to order your muscles to push you along. Suffer an injury that cuts the motor neurons in your spine and you'll find yourself desiring some replacement motor neurons about as soon as you regain consciousness and are apprised of your tragic predicament. Well, University of Wisconsin-Madison scientists Su-Chun Zhang and Xue-Jun Li have found a sequence of growth factors and other chemicals that can be applied to hESCs to turn them into motor neurons. (also found here)

MADISON - After years of trial and error, scientists have coaxed human embryonic stem cells to become spinal motor neurons, critical nervous system pathways that relay messages from the brain to the rest of the body.

The new findings, reported online today (Jan. 30, 2005) in the journal Nature Biotechnology by scientists from the University of Wisconsin-Madison, are important because they provide critical guideposts for scientists trying to repair damaged or diseased nervous systems.

Motor neurons transmit messages from the brain and spinal cord, dictating almost every movement in the body from the wiggling of a toe to the rolling of an eyeball. The new development could one day help victims of spinal-cord injuries, or pave the way for novel treatments of degenerative diseases such as amyotrophic lateral sclerosis (ALS), also known as Lou Gehrig's disease. With healthy cells grown in the lab, scientists could, in theory, replace dying motor neurons to restore function and alleviate the symptoms of disease or injury.

Much sooner in the future, the advance will allow researchers to create motor neuron modeling systems to screen new drugs, says study leader Su-Chun Zhang, an assistant professor of anatomy and neurology in the Stem Cell Research Program at the Waisman Center at UW-Madison.

Scientists have long believed in the therapeutic promise of embryonic stem cells with their ability to replicate indefinitely and develop into any of the 220 different types of cells and tissues in the body.

But researchers have struggled to convert blank-slate embryonic stem cells into motor neurons, says Zhang. The goal proved elusive even in simpler vertebrates such as mice, whose embryonic stem cells have been available to scientists for decades.

There is a fairly small window in time during which developing embryo cells possess the capacity to turn into motor neurons.

One reason scientists have had difficulty making motor neurons, Zhang believes, may be that they are one of the earliest neural structures to emerge in a developing embryo. With the ticking clock of development in mind, Zhang and his team deduced that there is only a thin sliver of time - roughly the third and fourth week of human development - in which stem cells could be successfully prodded to transform themselves into spinal motor neurons.

I think it is inevitable that methods will be found to dedifferentiate (i.e. make less specialized or less committed to a single purpose) both adult stem cell types and fully specialized cell types (e.g. liver cells or skin fibroblast cells) to turn these cells back into less differentiated stem cells and even all the way back into embryonic stem cells. So for the production of motor neurons we will not always be limited to starting with embryonic stem cells to pass them through that 2 week window in early embryonic development during which embryonic stem cells can be converted into motor neurons. In fact, compounds that cause cellular dedifferentiation have already been found. I expect many more techniques for dedifferentiating cells will be found.

Think of cells as enormously complex state machines. Currently it is much easier (though not easy in an absolute sense) to coax cells to switch from the embryonic state into other states. The reason for this is pretty obvious: Cells in the embryonic state must be capable of transitioning through a series of steps into all the other states (e.g. to the state that heart muscle cells are in or the state that liver cells are in or the state that insulin-secreting pancreatic islets of Langerhans cells are in) because embryos develop to produce cells in all those states. They must have that capacity or else a full organism couldn't develop starting from an embryo. However, just because there are some cell state transitions that do not happen under normal conditions of development that doesn't mean that those transitions can't be made to happen with the right (and waiting to be discovered) sequences of hormones, growth factors, gene therapies, and other stimuli.
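Since I just compared cells to state machines, here is that analogy rendered literally as a toy Python sketch. Every state, signal, and transition below is invented for illustration - real differentiation involves vastly more states and continuous signals - but it captures the point: normal development only walks edges leading away from the embryonic state, and dedifferentiation amounts to discovering stimuli that add reverse edges.

```python
# A toy rendering of the "cells as state machines" analogy. All states
# and signals here are invented, not real biology.
transitions = {
    ("embryonic", "signal_A"): "neural_stem",
    ("embryonic", "signal_B"): "mesoderm_progenitor",
    ("neural_stem", "signal_C"): "motor_neuron",
    ("mesoderm_progenitor", "signal_D"): "heart_muscle",
    # A hypothetical dedifferentiation cue, absent in normal development:
    ("heart_muscle", "reprogramming_factor"): "embryonic",
}

def apply_stimuli(state, stimuli):
    """Walk the state machine, ignoring stimuli with no defined edge."""
    for s in stimuli:
        state = transitions.get((state, s), state)
    return state

# Forward differentiation, then a reverse transition:
print(apply_stimuli("embryonic", ["signal_B", "signal_D"]))     # heart_muscle
print(apply_stimuli("heart_muscle", ["reprogramming_factor"]))  # embryonic
```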

Just because some day we will have methods to turn non-embryonic cell types into all other cell types that does not mean that avoidance of the use of hESCs in developing therapies has no future cost in terms of the health of some fraction of the human population. There is a very real possibility that hESCs can be developed for some therapeutic uses faster than other cell types can be developed for those same uses. My guess is that at least for some purposes hESCs will be ready to provide treatments faster than adult stem cell types can be coaxed to do the same. We will see more research results such as this paper offering the possibility of a cell therapy treatment for which the development of alternative non-hESC based cell therapy treatments is a more distant prospect.

Zhang's group had to use precise timings of changes in the biochemical cocktails fed to the cells to produce the desired outcome.

In addition to the narrow time frame, it was also critical to expose the growing stem cells to an array of complex chemical cocktails. The cocktails consist of naturally secreted chemicals - a mix of growth factors and hormones - that provide the exact growing conditions needed to steer the cells down the correct developmental pathway. "You need to teach the [embryonic stem cells] to change step by step, where each step has different conditions and a strict window of time," says Zhang. "Otherwise, it just won't work."

To differentiate into a functional spinal motor neuron, the stem cells advanced through a series of mini-stages, each requiring a unique growing medium and precise timing. To start, the Wisconsin team generated neural stem cells from the embryonic stem cells. They then transformed the neural cells into progenitor cells of motor neurons, which in turn developed in a lab dish into spinal motor neuron cells.

Note that this group had to try many different compounds and timings to find a recipe that worked. Greater automation of lab equipment is accelerating and will continue to accelerate this kind of work by increasing the rate at which different chemical cocktails can be tried in searches for techniques to turn various cell types into other cell types. So I expect the rate of advance of stem cell research of all kinds to accelerate regardless of the likely outcomes of political debates about human embryonic stem cell research.

By Randall Parker 2005 January 31 10:05 PM  Biotech Organ Replacement
2005 January 29 Saturday
Technique Speeds Search For Methods To Change Cell Types

Some UC San Diego researchers have developed a process to help automate the search for compounds that will turn less differentiated stem cells into more differentiated specialized cell types.

Bioengineering researchers at the University of California, San Diego have invented a process to help turn embryonic stem cells into the types of specialized cells being sought as possible treatments for dozens of human diseases and health conditions. Sangeeta Bhatia and Shu Chien, UCSD bioengineering professors, and Christopher J. Flaim, a bioengineering graduate student, described the cell-culture technique in a paper published in the February issue of Nature Methods, which became available online on Jan. 21.

It is very likely this technique would be useful in testing proteins for their ability to turn non-embryonic stem cells into other cell types as well.

To find out what would work, they had to develop the means to automatically test many different combinations and concentrations of proteins.

“We kept the other factors constant and developed a miniaturized technique to precisely vary extracellular matrix proteins as a way to identify which combinations were optimal in producing differentiated cells from stem cells,” said Bhatia. She, Chien, and Flaim described in their paper a technique that enabled them to identify the precise mix of proteins that optimally prompted mouse embryonic stem cells to begin the differentiation process into liver cells. Bhatia, Chien, and Flaim designed the technique with other cell biologists in mind so that any of them could duplicate it with off-the-shelf chemicals and standardized laboratory machinery. “We think other researchers could easily use this technique with any other tissue in mouse, or human, or any other species,” said Bhatia.

They adapted an existing machine that delivers tiny volumes of DNA and made it deliver protein instead.

In their experiments, the UCSD researchers took advantage of the knowledge that the extracellular matrix in liver is comprised primarily of just five proteins. They applied spots of all 32 possible combinations of the five proteins as engineered niches onto the surface of gel-coated slides, and then added mouse embryonic stem cells to the niches. After the cells were allowed to grow, the researchers assayed their progression into liver cells. “We looked at all the combinations at once,” said Bhatia. “Nobody has done this combinatorial approach before.”
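The arithmetic behind the 32 spots is just the power set of five proteins: each protein is either present or absent, giving 2^5 = 32 mixtures. Here is a small Python sketch of that layout; the five protein names are my assumption about the liver extracellular matrix set (the press release does not name them), and the spotting details are invented.

```python
# Sketch of the combinatorial layout: five ECM proteins, each present
# or absent, give 2**5 = 32 mixtures to spot onto a gel-coated slide.
# The protein names are assumed, not taken from the press release.
from itertools import product

ecm_proteins = ["collagen-1", "collagen-3", "collagen-4",
                "laminin", "fibronectin"]

combinations = list(product([0, 1], repeat=len(ecm_proteins)))
assert len(combinations) == 32  # includes the all-absent control spot

for row, mask in enumerate(combinations):
    mixture = [p for p, present in zip(ecm_proteins, mask) if present]
    print(f"spot {row:2d}: {' + '.join(mixture) if mixture else '(gel only)'}")
```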

Bhatia, Chien, and Flaim reported that either collagen-1 or fibronectin had strongly positive effects on the differentiation of the stem cells they tested. Unexpectedly however, when both collagen-1 and fibronectin were combined in one niche, the liver cell differentiation process was subtly inhibited. “You would not predict that from the customary cell biology experiments,” said Bhatia. “By using this combinatorial technique we were surprised to find many interesting interactions, and we were able to tease out the effects of each protein, alone and in combination with others.”

Cell biologists have not performed such combinatorial assays for other desired cell types because they had no practical way to do so. Bhatia, Chien, and Flaim seized on the unique ability of so-called DNA spotting machines to deliver tiny volumes of liquid, about one trillionth of a liter per spot. The spotting machines, which cost about $20,000, have become common fixtures at most research universities, but the innovation reported today in Nature Methods involved using such a machine to spot solutions of proteins rather than DNA. The UCSD researchers also refined other parameters so that the technique would be reproducible in other research laboratories.

The more important story here is not the discovery of particular protein combinations that make stem cells differentiate into liver cells. What will be more valuable in the longer run is the ability to apply their technique to more combinations of proteins to convert embryonic and other cell types into various desired cell types. With better tools the rate of progress can accelerate by orders of magnitude. This is yet another example of the general trend toward the development of techniques that are accelerating the rate of advance of biomedical research.

By Randall Parker 2005 January 29 11:44 PM  Biotech Organ Replacement
2005 January 28 Friday
Hockey Stick Climate Temperature Trend Theory Challenged

A pair of Canadian researchers, University of Guelph economist Ross McKitrick and Toronto-based mineral exploration consultant Stephen McIntyre, have a paper coming out in Geophysical Research Letters that challenges the "Hockey Stick" temperature trends model which shows the 20th century as the hottest century in the last 1000 years.

Until now, criticisms of the hockey stick have been dismissed as fringe reports from marginal global warming skeptics. Today, however, the critical work of two Canadian researchers, Ross McKitrick, an economics professor at Guelph University, and Toronto consultant Stephen McIntyre, will be published by Geophysical Research Letters, the prestigious journal that published one of the early versions of Michael Mann's 1,000-year tracking of Northern Hemisphere temperatures.

Publication in Geophysical Research sets McIntyre and McKitrick's analysis and conclusions in direct opposition to the Mann research. Their criticism can no longer be dismissed as if it were untested research posted on obscure Web sites by crank outsiders. Their work is now a full challenge to the dominant theme of the entire climate and global warming movement.

The paper will be published in February. So as of this writing it is not on the Geophysical Research Letters web site. However, a pre-publication version of the paper "Hockey Sticks, Principal Components and Spurious Significance" is available (PDF format).

For a graphical comparison of the original hockey stick chart and the McIntyre and McKitrick analysis see this page from McKitrick's web site. That page has a lot of other useful links. McIntyre and McKitrick also have another web site with a lot more useful links.

Dutch science journalist Marcel Crok has a two part series in the Canadian Financial Post on the McIntyre and McKitrick research paper.

Up to January, 2005, none of McIntyre and McKitrick's findings had been published by major scientific journals. Thus, in the opinion of established climate researchers, there was no reason to take them seriously. Climate researchers were quite comfortable in their consensus and repeatedly referred to this "consensus" as a basis for policy. The official expression of the consensus comes from the IPCC. This group, under the flag of the United Nations, comes out with a bulky report every five years on the state of affairs in climate research. Hundreds of climate researchers from every corner of the world contribute to it. In the third report in 2001, Mann himself was a lead author of the chapter on climate reconstructions.

McKitrick and McIntyre had a hard time getting access to the data and source code used in the analysis by Mann and colleagues that led to their claim that the 20th century was the hottest in the last 1000 years. No other group had seriously tried to replicate the Mann analysis.

McIntyre sent an e-mail to Michael Mann in spring 2003, asking him for the location of the data used in his study. "Mann replied that he had forgotten the location," he said. "However, he said that he would ask his colleague Scott Rutherford to locate the data. Rutherford then said that the information did not exist in any one location, but that he would assemble it for me. I thought this was bizarre. This study had been featured in the main IPCC policy document. I assumed that they would have some type of due-diligence package for the IPCC on hand, as you would have in a major business transaction. If there was no such package, perhaps there had never been any due diligence on the data, as I understood the term. In the end, this turned out to be the case. The IPCC had never bothered to verify Mann, Bradley and Hughes' study."

Despite billions of dollars spent on climate research, academic and institutional researchers had never bothered to replicate Mann's work either. In 2003, McIntyre tackled the job and, from an unusual hobby, the task has since grown to become almost a full-time occupation. On an Internet forum for climate skeptics, he met Ross McKitrick, professor of economics at the University of Guelph, just outside of Toronto. Since meeting in person in September of 2003, the two have been working on the project together. McIntyre does most of the research and McKitrick asks questions and assists in the writing of papers.

When people tell us that we urgently need to spend hundreds of billions or trillions to fix some problem we ought to demand a higher standard of proof than the effort that went into the original Hockey Stick paper.

Read the full article. Keep in mind as you read it that published science should be transparent, verifiable, and reproducible. Science that can not be checked and reproduced has no place as a basis for public policy that could cost the world's collective economies hundreds of billions or even trillions of dollars.

A lone Canadian Gaspe peninsula cedar tree's rings were heavily weighted in Mann's model for North American temperature in the 15th century.

"More strangely," said McIntyre, "the series appears twice in Mann's data set, as an individual proxy, and in the North American network. But it is only extrapolated in the first case, where its influence is very strong." McIntyre and McKitrick went back to the source of the Gaspe series and then to the archived data at the World Data Center for Paleoclimatology."We found that although the Gaspe series begins in 1404, up until 1421, it is based on only one tree. Dendrochronologists (tree ring researchers) generally do not use data based on one or two trees. The original authors only used this series from 1600 onwards in their own temperature reconstructions. This series should never have been used in the 15th century, let alone counted twice and extrapolated."

Go and read the full articles I'm linking to. Note how McIntyre and McKitrick were able to find a Fortran program and crucial datasets on an FTP server used by Mann's group that led McIntyre and McKitrick to an understanding of how Mann and his colleagues made serious mistakes in how they did a mathematical analysis called principal component analysis (PCA) on their datasets. There is a larger lesson here: More data and source code on which scientific research papers are based ought to be available in the public domain to allow replication of mathematical analyses used in scientific research papers.
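For readers who want a feel for the PCA dispute, here is a minimal sketch, on synthetic data, of the centering issue McIntyre and McKitrick raised: Mann's procedure centered each proxy series on the 20th century calibration period rather than on the full record, which tends to promote hockey-stick shapes into the leading principal component even from trendless red noise. This is an illustration of the published critique, not a reproduction of either side's actual code or data.

```python
# Illustration of full-record vs. short (calibration-period) centering
# before PCA, using a synthetic "proxy network" of trendless red noise.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_proxies, phi = 600, 50, 0.7

proxies = np.zeros((n_years, n_proxies))
for t in range(1, n_years):
    proxies[t] = phi * proxies[t - 1] + rng.standard_normal(n_proxies)

def leading_pc(data, center_rows):
    """First principal component after centering on the given rows."""
    centered = data - data[center_rows].mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[0]

pc_full = leading_pc(proxies, slice(None))        # full-record centering
pc_short = leading_pc(proxies, slice(-80, None))  # calibration-period centering

# With short centering, PC1 tends to acquire a spurious "blade" in the
# final 80 "years" even though the input is trendless noise.
def blade(pc):
    return abs(pc[-80:].mean() - pc[:-80].mean()) / pc.std()

print(f"blade size, full centering:  {blade(pc_full):.2f}")
print(f"blade size, short centering: {blade(pc_short):.2f}")
```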

Note also that in climate research McIntyre and McKitrick are essentially self-taught amateurs. But they had the mathematical chops to apply basic analytical techniques to datasets, and apparently that is all that is needed to do analyses on climate history data.

Finally, regarding the idea of a scientific consensus on global warming the words of MIT meteorologist Richard Lindzen bear repeating:

"Do you believe in global warming? That is a religious question. So is the second part: Are you a skeptic or a believer?" said Massachusetts Institute of Technology professor Richard Lindzen, in a speech to about 100 people at the National Press Club in Washington, D.C.

"Essentially if whatever you are told is alleged to be supported by 'all scientists,' you don't have to understand [the issue] anymore. You simply go back to treating it as a matter of religious belief," Lindzen said. His speech was titled, "Climate Alarmism: The Misuse of 'Science'" and was sponsored by the free market George C. Marshall Institute. Lindzen is a professor at MIT's Department of Earth, Atmospheric, and Planetary Sciences.

...

According to Lindzen, climate "alarmists" have been trying to push the idea that there is scientific consensus on dire climate change.

"With respect to science, the assumption behind the [alarmist] consensus is science is the source of authority and that authority increases with the number of scientists [who agree.] But science is not primarily a source of authority. It is a particularly effective approach of inquiry and analysis. Skepticism is essential to science -- consensus is foreign," Lindzen said.

Alarmist predictions of more hurricanes, the catastrophic rise in sea levels, the melting of the global poles and even the plunge into another ice age are not scientifically supported, Lindzen said.

"It leads to a situation where advocates want us to be afraid, when there is no basis for alarm. In response to the fear, they want us to do what they want," Lindzen said.

If global warming eventually becomes a problem we will be able to handle it. We can switch to nuclear power. In time photovoltaic cells will become much cheaper and we may switch away from fossil fuels in 30 years because market forces cause the switch even without government regulations that force the switch. The most prudent action to take at this point would be to accelerate the rate of energy research to develop cheaper alternatives to fossil fuels and cheaper ways to capture and sequester CO2. The imposition of huge costs on economies to reduce CO2 emissions today is an excessive response to a potential problem that, if it comes, could be much more cheaply handled in the future.

Update: Check out the Prometheus blog on science policy and the post A Third Way on Climate?. For insights into the problems caused by scientists playing policy advocates while simultaneously trying to serve the role of providing authoritative answers about scientific knowledge to the public see these posts Chris Landsea Leaves IPCC, Follow Up On Landsea/IPCC, Landsea on Hurricanes, More Politics and IPCC, and A Good Example why Politics/IPCC Matters. There is a lot of sensible thinking in those posts. Hope you agree.

By Randall Parker 2005 January 28 03:22 PM  Climate Trends
2005 January 27 Thursday
Pheromone Increases Sexual Attractiveness Of Postmenopausal Women

Researchers Susan Rako M.D., a medical doctor in private practice as a psychiatrist in Newton, Massachusetts, and Joan Friebely Ed.D., a researcher at Harvard's psychiatry department, have demonstrated that a synthesized pheromone applied to postmenopausal women appears to make them more sexually attractive to their partners. Here is the abstract of the paper "Pheromonal Influences on Sociosexual Behavior in Postmenopausal Women".

To determine whether a putative human sex-attractant pheromone increases specific sociosexual behaviors of postmenopausal women, we tested a chemically synthesized formula derived from research with underarm secretions from heterosexually active, fertile women that was recently tested on young women.

Participants (n=44, mean age = 57 years) were postmenopausal women who volunteered for a double-blind placebo-controlled study designed “to test an odorless pheromone, added to your preferred fragrance, to learn if it might increase the romance in your life.” During the experimental 6-week period, a significantly greater proportion of participants using the pheromone formula (40.9%) than placebo (13.6%) recorded an increase over their own weekly average baseline frequency of petting, kissing, and affection (p = .02). More pheromone (68.2%) than placebo (40.9%) users experienced an increase in at least one of the four intimate sociosexual behaviors (p = .04). Sexual motivation frequency, as expressed in masturbation, was not increased in pheromone users. These results suggest that the pheromone formulation worn with perfume for a period of 6 weeks has sex-attractant effects for postmenopausal women.
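As a back-of-the-envelope check on those percentages: with 44 participants, 40.9% and 13.6% correspond to 9 of 22 pheromone users versus 3 of 22 placebo users. The even split between arms is my inference from the percentages, and the abstract does not say which statistical test produced p = .02, so the sketch below tries two plausible ones.

```python
# Rough check of the abstract's proportions. Group sizes (22 vs. 22)
# are inferred from the percentages, and the choice of test is a guess.
from math import sqrt
from scipy.stats import fisher_exact, norm

increased = [9, 3]   # pheromone, placebo users who increased over baseline
n = [22, 22]

# Fisher's exact test on the 2x2 table.
table = [[increased[0], n[0] - increased[0]],
         [increased[1], n[1] - increased[1]]]
_, p_fisher = fisher_exact(table, alternative="greater")

# Pooled one-sided z-test for two proportions.
p1, p2 = increased[0] / n[0], increased[1] / n[1]
pooled = sum(increased) / sum(n)
se = sqrt(pooled * (1 - pooled) * (1 / n[0] + 1 / n[1]))
p_z = norm.sf((p1 - p2) / se)

print(f"Fisher one-sided p = {p_fisher:.3f}")  # ~0.04
print(f"z-test one-sided p = {p_z:.3f}")       # ~0.02, near the reported value
```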

The question of whether humans are even capable of response to pheromones looks increasingly to have the answer of "Yes". This of course creates all sorts of possibilities for the future. But it also brings up some interesting issues about individual rights and free will.

Does the wearing of a pheromone violate the rights or integrity of others by invading their bodies and changing their desires and behaviors? Or does the efficacy of pheromones demonstrate limits to trying to organize societies around the idea of human rights? After all, if wearing artificially synthesized pheromones is a rights violation because it stealthily manipulates others then doesn't the natural excretion of pheromones do the same? Does it matter whether the person giving off the pheromone scent consciously chose to do so?

Then there is the question of free will. Doesn't every discovery of how chemicals alter behavior eat away at the idea of a core in every person that is free to choose?

The full paper appears to be on the web as well.


Postmenopausal women may get better results from longer use since many of them are starting out without being involved in any kind of physically intimate relationship.

Within specific behaviors, a significantly higher proportion of pheromone than placebo users increased over their baseline behaviors in average weekly frequency of petting/affection and kissing. However, the other sociosexual behaviors did not significantly increase.

Results in the menopausal group appear to be more modest than the results for men and women in their fertile years. Several explanations are possible. A reduced availability of male sexual partners occurs after 50. There is some evidence that with increasing postmenopausal age, there is a decreasing interest in sexual intercourse.

Moreover, in contrast with younger women, there is a reduced availability of male sexual partners for postmenopausal women. In fact, partner status at baseline (p=0.01) as well as pheromone use (p=0.03) were the two independent variables that significantly increased the likelihood that postmenopausal women would increase at least one intimate behavior during the 6-week experimental period. Postmenopausal women may require a longer experimental period, particularly if they need to find a partner, to bring about increases in more intimate sexual behaviors.

Details of the chemical used are being kept secret until patents have been granted.

However, Winnifred Cutler, who discovered the pheromone, has said she will keep its true identity secret until patents have been granted to her Women's Wellness Research Centre, in Chester Springs, Pennsylvania.

But aren't patent applications in the public domain? Also, once patents are applied for is there any need for continued secrecy? My impression is that there isn't. Anyone know?

One could imagine the use of such a pheromone helping to keep together marriages of middle aged and old aged couples. But we can also expect to find that young males and females differ in the amounts of pheromones they excrete. So expect young low pheromone producers to go for pheromone perfumes that level the playing field. Also expect eventually to see tests that measure your pheromone output. Perhaps more natural pheromone compounds will be discovered and we will be able to be tested for our pheromone profiles. There might be chemicals that elicit lust and other chemicals that elicit love from others.

Further into the future expect to see the development of gene therapy-carrying viruses on the black market that will reprogram the sexual and romantic feelings of any intended target. One can imagine these illicit treatments being used both to hook and to dump objects of desire at different stages in relationships.

By Randall Parker 2005 January 27 11:59 AM  Brain Sexuality
2005 January 25 Tuesday
New Gene Therapy Selectively Kills Only Cancer Cells

Columbia University researchers have developed a gene therapy that is selectively toxic only in cancer cells.

"What's exciting is we may now be able to design a therapy that will seek out and destroy only cancer cells," said the study's senior author, Paul B. Fisher, Ph.D., professor of clinical pathology and Michael and Stella Chernow Urological Cancer Research Scientist at Columbia University Medical Center. "We hope it will be particularly powerful in eradicating metastases that we can't see and that can't be eliminated by surgery or radiation. Gene therapy, especially for cancer, is really starting to make a comeback."

The virus's selectivity for cancer cells is based on two molecules called PEA-3 and AP-1 that, the researchers found, are usually abundant inside cancer cells. Both of the molecules flip a switch (called PEG) that turns on the production of a cancer-inhibiting protein uniquely in tumor cells.

The researchers say the PEG switch can be exploited to produce gene therapies that will only kill cancer cells even if the therapy enters normal cells.

As an example, the researchers constructed an adenovirus that carries the PEG switch and a toxic protein. The switch and the protein were connected to each other so that the deadly protein is only unleashed inside cancer cells when the switch is flipped on by PEA-3 or AP-1.

When added to a mix of normal and prostate cancer cells, the virus entered both but only produced the toxic protein inside the cancer cells. All the prostate cancer cells died while the normal cells were unaffected.

The same virus also selectively killed human cancer cells from melanoma and ovarian, breast, and glioma (brain) tumors.

This approach is important because cancer can not be cured without the development of therapeutic agents that have far greater ability than conventional chemotherapy agents to selectively target cancer cells while leaving normal cells unharmed. The use of molecular switches that will flip on to deliver therapies only in cancer cells is going to be one of the major ways that cancer is going to be defeated and perhaps even ultimately the best way. There are two parts to such a therapy. The first is the switching part that detects unique signature patterns in cancer cells to know to activate. The other part is what will get done once the activation of the switch has happened. There are many possibilities for the second part. Imagine, for example, an enzyme that gets synthesized in cancer cells that can metabolize inert chemotherapy compounds into toxic forms. Or imagine a protein made from the switch that effectively punches a hole in a cell. Or perhaps the switch would turn on a bigger package of genes that would restore normal cell division regulation. The gene package could include a non-mutated p53 cell division regulating gene to replace the mutated p53 genes found in many types of cancer.
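In software terms the two-part design is simply a guarded payload: a sensor predicate (the PEG switch reading PEA-3 or AP-1 levels) gating an action (the toxic protein). A toy sketch of that logic, with invented thresholds and numbers:

```python
# Toy model of the sensor-plus-payload design. Thresholds and molecule
# levels are invented for illustration, not taken from the paper.
def peg_switch(cell):
    """Promoter fires only when PEA-3 or AP-1 is abundant (cancer cells)."""
    return cell.get("PEA-3", 0) > 1.0 or cell.get("AP-1", 0) > 1.0

def deliver(cell, payload):
    # The virus enters every cell, but the payload runs only if the
    # switch flips; normal cells are left unharmed.
    return payload(cell) if peg_switch(cell) else "no effect"

toxin = lambda cell: "cell killed"

normal_cell = {"PEA-3": 0.1, "AP-1": 0.2}
cancer_cell = {"PEA-3": 3.5, "AP-1": 2.8}
print(deliver(normal_cell, toxin))  # no effect
print(deliver(cancer_cell, toxin))  # cell killed
```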

Also see my previous posts "DNA Nanomachine Computers Against Cancer" and "A Couple Of Novel Cancer Therapies Reported".

Update: After watching a lecture by Judah Folkman on anti-angiogenesis compounds to control cancer a thought occurs to me: What would be neat would be a gene therapy that turns on anti-angiogenesis genes only in cells that are cancerous. Then anti-angiogenesis compounds would be produced in an area of the body only as long as cancer cells were growing in that area. Or imagine a gene therapy that only in cancer cells would make RNAi (RNA interference) segments against the messenger RNA for VEGF and other angiogenesis molecules.

There would be a distinct advantage, however, to a gene therapy that just killed all the cancer cells and even the pre-cancerous cells. A cell killing therapy would have the benefit of also being a rejuvenating therapy since it would wipe out a lot of damaged cells and therefore provide healthier cells room to grow. Depending on what internal conditions of a cell were used to activate the gene therapy it would be effective even in cancers that have not mutated to the point of being able to develop new vasculature. So even the very small (less than a millimeter) cancers (that most people die with undiagnosed) could be wiped out. Given that those damaged cancerous and precancerous cells are not doing their original jobs well (if at all) and are likely to be releasing inflaming molecules and/or free radicals one can expect a rejuvenating benefit from such a treatment.

By Randall Parker 2005 January 25 03:13 PM  Biotech Therapies
Israel Scientist Develops Libido-Meter Brain Wave Test

Dr. Yoram Vardi, in charge of neurology at the Rambam Hospital and professor of urology at the Technion - Israel Institute of Technology, both in Haifa, Israel, has demonstrated that p300 brain waves are most reduced by exposure to sexually arousing images.

Vardi conducted experiments on 14 male and 16 female volunteers with normal sexual function. Using standard EEG equipment, a pair of headphones and a computer monitor, the subjects listened to music and other sounds to stimulate p300 brain waves. These waves, produced 300 milliseconds after an event, are the brain's normal response to stimuli.

While reduction in p300 brain wave amplitude occurs in response to visual stimuli in general, the greatest reduction in p300 brain wave amplitude came in response to sexually explicit clips.

"What we found was that, in all subjects, the most significant reduction in p300 amplitudes was found when viewing sexual clips. At the end of the testing session, we also asked the subjects via the questionaire how much they were attracted to the sexual content. We found a very significant statistical correlation between what the subject told us and the amount the brain waves were diminished."

For all subjects, a lesser degree of reduction, but still statistically significant from baseline, was found when viewing sports and romantic clips. According to Vardi, what the study shows - and the questionnaire verifies - is that the p300 testing accurately reveals if and how much a person's brain waves react to the film clips of sexual content.

The extent of reduction of p300 brain waves correlated with self-reported sexual interest in the sexual images shown.

This has all sorts of potential uses. One use would be to test people in court cases who claim they have suffered a loss of libido in response to an accident. Also, men who claim they are interested in sex but can't get it up could be tested to see if they really have an unexpressable interest. Another idea: Test for pedophilic and other forbidden tendencies. Want to test a treatment to stop pedophiles from desiring children? Show them pictures and monitor their brain waves. Some day such a test might even need to be passed as a condition for parole.

In medical research an objective test for libido would speed the clinical testing of drugs for treating sexual dysfunction and the development of aphrodisiacs, even for normal people. Also, diagnosis of types of sexual dysfunction could be done more quickly and accurately.

By Randall Parker 2005 January 25 02:56 PM  Brain Sexuality
2005 January 23 Sunday
Some Day You May Take An Aphrodisiac For Your Heart

I can easily imagine someone lying to their lover: "I have to take Viagra and act like this or my heart will fail. You don't want me to die, do you?"

Researchers at Johns Hopkins have found that sildenafil citrate (Viagra), a drug used to treat erectile dysfunction (ED) in millions of men, effectively treats enlarged hearts in mice, stopping further muscle growth from occurring and reversing existing growth, including the cellular and functional damage it created.

"A larger-than-normal heart is a serious medical condition, known as hypertrophy, and is a common feature of heart failure that can be fatal," says study senior author and cardiologist David Kass, M.D., a professor at The Johns Hopkins University School of Medicine and its Heart Institute. Kass is also the Abraham and Virginia Weiss Professor of Cardiology at Hopkins.

Sildenafil, Kass says, was the focus of his research because it blocks or stops an enzyme, called phosphodiesterase 5 (PDE5A), involved in the breakdown of a key molecule, cyclic GMP, which serves as a "natural brake" to stresses and overgrowth in the heart. "We thought we could more strongly apply the brake on hypertrophy in the heart if we used sildenafil to prevent the breakdown of cyclic GMP," he says. The makers of the drug had no involvement in the design or support of the research. PDE5A is also the biological pathway blocked in the penis to promote the relaxation of blood vessels and maintain erections.

The Johns Hopkins findings, to be published in the journal Nature Medicine online Jan. 23, are the first to show that sildenafil is an effective treatment for a chronic heart condition. It is also the first study to reveal that the enzyme pathway blocked by sildenafil (PDE5A), never before known to play a significant role in the heart, is active when the heart is exposed to pressure stress and hypertrophied. The results provide some of the strongest evidence to date that blocking the heart's adaptive response to hypertrophy does not harm its function but, in fact, may improve it, Kass says. Already, plans are under way by the Hopkins researchers for a multicenter trial to test if sildenafil has the same effects on hypertrophy in humans.

In the first of several experiments, each involving groups of 10 to 40 male mice, the Hopkins team stimulated hypertrophy for up to nine weeks, but only by half as much in those that had also consumed sildenafil in their food at 100 milligrams per kilogram per day. In mice, this dose produces blood levels similar to those achieved in humans given standard clinical doses.

The mice fed sildenafil also showed 67 percent less muscle fibrosis, a complication that often occurs with hypertrophy, as compared to mice that were not fed the drug. The treated mice also had smaller hearts and improved heart function, whereas the untreated hearts were dilated with weakened function. For all mice with hypertrophy, the condition was surgically produced by constricting the main artery carrying blood from the heart to create pressure stress.

In a second experiment, the researchers used the same dose of sildenafil and examined its effects on reversing hypertrophy that had already occurred. Initially, these mice were exposed to pressure stress for seven to 10 days, with hearts developing fibrosis and muscle growth by nearly 65 percent. After two weeks of therapy, fibrosis and muscle growth almost completely disappeared. In mice that did not have therapy, hearts continued to get bigger.

In a surprising result, the researchers found that heart function, as measured by pressure-volume analysis of the muscle's ability to contract and pump blood, actually improved after hypertrophy had been stopped and treated. While researchers previously thought that hypertrophy was an adaptive response to pressure stress, the functional gains lasted despite the heart's continued exposure to high blood pressure. Improvements were seen in more than 10 measures of heart function, including heart relaxation, cardiac output and heart contractility, which increased by nearly 40 percent. These improvements were seen even when therapy was deferred and started two weeks after hypertrophy had already developed.

"This study shows that sildenafil can make hypertrophy go away," says Kass. "Its effects can be both stopped in their tracks and reversed. Overall, the results provide a better understanding of the biological pathways involved in hypertrophy and heart dilation, leading contributors to heart failure. They suggest possible therapies in the future, including sildenafil, which has the added benefit of already being studied as safe and effective for another medical condition."

This is a funny result. Picture millions of men in the future complaining they can't let it down because to do so would cause them heart failure. The future is going to be a strange place.

By Randall Parker 2005 January 23 07:44 PM  Biotech Therapies
2005 January 20 Thursday
A Drink A Day Reduces Cognitive Decline In Old Age

Harvard researchers have found in the on-going Nurses' Health Study that elderly women who drink one alcoholic beverage a day experience less cognitive decline than otherwise similar women who do not have a daily drink.

"Low levels of alcohol appear to have cognitive benefits," said Francine Grodstein of Brigham and Women's Hospital in Boston, senior author on the study, published in today's New England Journal of Medicine. "Women who consistently were drinking about one-half to one drink per day had both less cognitive impairment as well as less decline in their cognitive function compared to women who didn't drink at all," Grodstein said.

This result is expected to hold for men as well.

Both wine and beer were beneficial.

They found that the women who had the equivalent of one drink a day had a 23% lower risk of becoming mentally impaired during the two-year period, compared with non-drinkers.

12,400 elderly women were in the study.

Stampfer and colleagues focused on more than 12,400 Nurses' Health Study participants. The women were 70 to 81 years old. The researchers collected information about the women's alcohol intake as part of a food questionnaire every two to four years, starting in 1980. They asked the women how often on average they drank beer, wine, or liquor during the previous year.

The result held even after many other factors were adjusted for.

The results held true even after the researchers factored in characteristics about the women that could have confused the findings, such as age, education, how many friends and family members they had, how much exercise they got, and whether they had any other health problems.

The suspected mechanism by which alcohol delivers this benefit is improved blood flow. This suggests a few things. First, younger people, who typically already have good blood flow, may not gain any cognitive benefit from drinking alcohol. So it might make more sense to delay making a daily drink part of your routine until late middle age. Though I'd like to see more research on this point.

Another point is that alcohol is not the only way available - or even the most powerful way - to improve blood flow or for preventing a decline in blood flow. A heart healthy diet and lifestyle is probably the best bet for keeping strong blood flow available in all parts of the body. Foods low in saturated fats, lots of fruits and vegetables and some nuts as well would all be good bets. I'm going to guess that nuts high in arginine will eventually be shown to be especially beneficial on this count. Plus, plenty of exercise, sufficient sleep, avoidance of chronically stressing situations, and of course no smoking are wise lifestyle choices.

Another point here is that statin drugs that keep cholesterol down may also slow the rate of cognitive decline. In the future stem cell therapies that allow the circulatory system to be rejuvenated will also reduce the rate of cognitive decline. Though note that poor circulation is only one cause of cognitive decline. Other causes of brain aging must also be addressed before cognitive decline can be halted and even reversed.

By Randall Parker 2005 January 20 02:45 PM  Brain Aging
Cancer Drug Approval Process Getting Tougher At FDA

Scott Gottlieb M.D., recently departed US Food and Drug Administration (FDA) Director of Medical Policy Development, a practicing physician and fellow at the American Enterprise Institute, argues that the FDA is making cancer drug approval harder and more costly.

Even more concerning, there is also a whiff of caution coming from the agency's cancer division--the Oncology Drug Advisory Committee (ODAC)--where delays on the approval of new drugs can have a dramatic impact on the lives of patients who are suffering from terminal disease.

At a meeting last month, outside advisers to ODAC as well as rank-and-file FDA medical reviewers expressed criticism of the applications they are seeing and a desire to clamp down on the number of cancer drugs qualifying for accelerated approval. Accelerated approval regulations allow the FDA to approve products based on preliminary test results, with the proviso that the company continues with clinical trials after the drug is marketed.

The chair of the FDA's cancer drug advisory committee rejected the idea that cancer drugs should be allowed onto the market if they are reasonably safe and have some degree of effectiveness (known as "efficacy") with the understanding that oncologists will determine their value through routine use of the drug.

Higher costs per drug developed translate into fewer drugs being developed at any one time and fewer new drugs available in the future.

The FDA is moving toward making it harder for a new drug to be approved if it can not be demonstrated to work better than the off-label use of an existing drug.

The other point of contention at the cancer advisory meeting was whether it is appropriate for the FDA to require drugs up for accelerated approval for treatment of a specific kind of terminal cancer to prove that they were superior to other drugs being used off-label to treat the same cancer.

The debate turned on the FDA's consideration of Inex Pharmaceutical's application for accelerated approval of Marqibo vincristine sulfate liposome injection to treat relapsed, aggressive non-Hodgkin's lymphoma (NHL). The FDA noted that no products have been approved for the indication, and suggested that the committee consider whether a number of products that are used off-label for the indication should be considered "available therapy." By suggesting that ODAC consider off-label uses as available therapies, the FDA dramatically raised the bar for approval of Marqibo.

Imagine you have relapsed NHL. Suppose there is an existing drug that works against it for some though not all people. Would you want the FDA to hold another drug off the market that also works for some but not all people? Or do you think you should have the right to choose between the two drugs or even to take both of them?

I believe that once someone has been given a diagnosis of a fatal disease that they should be given the equivalent of a "get out of the FDA drug approval jail free card". In other words, once your days are numbered due to a specific diagnosed disease you should be free to take any experimental therapy and drug and biotech companies should be free to sell you any experimental therapy without the FDA being able to stop them.

A smaller step in the right direction would be to replace the existing membership of the above mentioned Oncology Drug Advisory Committee (ODAC) with a new membership made up of people with diagnosed cancer. There are plenty of scientists, epidemiologists, and medical doctors walking around today with diagnosed cancers. So the committee would not have to lack for expertise. But what it would cease to lack is a great sense of urgency and a sense that patients should be given more choices.

If you are going down the elevator for the final check-out from the big hotel of life why should the government have any power to prevent your trying any therapy imaginable on your way down? Someone close to me is dying from metastatic cancer and I do not understand why the government should have any power at all over what experimental treatments someone such as this person might try. The government can not protect people from death. The government can not regulate death out of existence. But the government can and does impose such costs and obstacles on the drug development process that the rate of development of new drugs is greatly slowed and the date at which various diseases become curable gets pushed much farther into the future than it needs to be.

By Randall Parker 2005 January 20 01:01 PM  Policy Medical
2005 January 19 Wednesday
Old Beagles Learn Better If Given Fruits, Vegetables, Vitamins, Exercise

Dogs are like humans in yet another way. Elderly dogs demonstrate better cognitive performance if given higher antioxidant diets and more stimulating environments.

During the two-year longitudinal study, William Milgram, Ph.D., of the University of Toronto, Elizabeth Head, Ph.D., and Carl Cotman, Ph.D., of the University of California, Irvine and their colleagues found older beagles performed better on cognitive tests and were more likely to learn new tasks when they were fed a diet fortified with plenty of fruits, vegetables and vitamins, were exercised at least twice weekly, and were given the opportunity to play with other dogs and a variety of stimulating toys. The study is reported in the January 2005 Neurobiology of Aging.

Citrus pulp mixed in with dog food? I wonder if they had problems getting the dogs to eat it.

For the study, the researchers divided 48 older beagles (ages 7 to 11) into four groups. One group was fed a regular diet and received standard care; a second group received standard care but was fed an antioxidant fortified diet, consisting of standard dog food supplemented with tomatoes, carrot granules, citrus pulp, spinach flakes, the equivalent of 800 IUs of vitamin E, 20 milligrams per kilogram of vitamin C, and two mitochondrial co-factors--lipoic acid and carnitine; the third was fed a regular diet, but their environment was enriched (regular exercise, socialization with other dogs, and access to novel toys); the fourth group received a combination of the antioxidant diet as well as environmental enrichment. In addition, a set of 17 young dogs (ages 1 to 3) were divided into two groups, one fed a regular diet and the other fed the antioxidant fortified diet.

I am skeptical that the vitamin E was a big benefit. Too much of a single antioxidant can actually dampen down metabolism by quenching too many free radicals. Not all free radicals are purely detrimental. The body uses free radical molecules for intracellular and intercellular signalling. Dampen down those signals too much and the net result can be harmful. I'd like to see this experiment repeated with more fruits and vegetables and no vitamins. I bet well chosen fruits and vegetables such as blueberry, spinach, kale, and perhaps even some nuts could provide as much antioxidant punch as this study's mixture that included vitamins.

The fruits and vegetables added to the antioxidant fortified diet were the equivalent of increasing intake from 3 servings to 5 or 6 servings daily. Previous research suggests that antioxidants might reduce free radical damage to neurons in the brain, which scientists believe is involved in age-associated learning and memory problems. Mitochondrial co-factors may help neurons function more efficiently, slash free radical production and lead to improvements in brain function. Other studies suggest that stimulating environments improve learning ability, induce beneficial changes in cellular structure, may help the brain grow new neurons, and increase the resistance of neurons to injury.

I've had Australian Shepherds turn up their noses at me when I offered them various fruits - and this in spite of their begging when they saw me eating out of a human food bowl. But perhaps mixed in with much tastier foods (like some blood poured out of a red meat package) dogs could be persuaded to eat their fruits. Though getting them to eat tomato sauce is not hard when it is mixed with pasta and some oil.

The combination of better environment and better diet had the most powerful effect.

Overall, older dogs in the combined intervention group did the best on these learning tasks, outperforming dogs in the control group (standard diet, standard care) as well as those that received either the antioxidant diet or environmental enrichment. However, older beagles that received at least one of these interventions also did better than the control group. For instance, all 12 of the older beagles in the combined intervention group were able to solve the reversal learning problem. In comparison, 8 of the 12 dogs that ate the antioxidant diet without environmental enrichment and 8 of the 10 that received environmental enrichment without the antioxidant diet solved the problem. Only two of the eight older dogs in the control group were able to do this task. Dietary intervention in the younger canines had no effect.

Similar dietary changes for older humans would probably provide a similar cognitive benefit.

Also see my previous posts "Concord Grape Juice Improves Memory Of Aged Rats" and "Choline May Restore Middle Aged Memory Formation".

By Randall Parker 2005 January 19 02:47 AM  Brain Enhancement
2005 January 17 Monday
Proposal For Open Source Drug Development

How about open source development of drugs for Third World tropical diseases?

Only about 1% of newly developed drugs are for tropical diseases, such as African sleeping sickness and dengue fever. While patent incentives have driven commercial pharmaceutical companies to make Western health care the envy of the world, the commercial model only works if companies can sell enough patented products to cover their R&D costs and produce profits for shareholders. The model thus fails in the developing world, where few patients can afford to pay patented prices for drugs. The solution to this devastating problem, say Stephen Maurer, Arti Rai, and Andrej Sali in the premier open-access medical journal PLoS Medicine, is to adopt an "open source" approach to discovering new drugs for neglected diseases.

They call their approach the Tropical Diseases Initiative (www.tropicaldisease.org), or TDI. "We envisage TDI as a decentralized, Web-based, community-wide effort where scientists from laboratories, universities, institutes, and corporations can work together for a common cause."

What would open-source drug discovery look like? "As with current software collaborations, we propose a website where volunteers could search and annotate shared databases. Individual pages would host tasks such as searching for new targets, finding chemicals to attack known targets, and posting data from related chemistry and biology experiments. Volunteers could use chat rooms and bulletin boards to announce discoveries and debate future research directions. Over time, the most dedicated and proficient volunteers would become leaders."

The key to TDI's success, they argue, is that any discovery would be off patent. An open-source license would keep all discoveries freely available to researchers and--eventually--manufacturers. The absence of patents, and the use of volunteer staff, would contain the costs of drug development.

Reflecting the range of issues that a proposal like this must address, the three fellows making this proposal, Stephen Maurer, Arti Rai, and Andrej Sali, are respectively two lawyers and a computational biologist.

You can read the full journal article for free. Note that open source drug development is becoming possible because of advances in computer and communications technology.

Ten years ago, TDI would not have been feasible. The difference today is the vastly greater size and variety of chemical, biological, and medical databases; new software; and more powerful computers. Researchers can now identify promising protein targets and small sets of chemicals, including good lead compounds, using computation alone. For example, a SARS protein similar to mRNA cap-1 methyltransferases—a class of proteins with available inhibitors—was recently identified by scanning proteins encoded by the SARS genome against proteins of known structure [9]. This discovery provides an important new target for future experimental validation and iterative lead optimization. More generally, existing projects such as the University of California at San Francisco's Tropical Disease Research Unit (San Francisco, California, United States) show that even relatively modest computing, chemistry, and biology resources can deliver compounds suitable for clinical trials [10]. Increases in computing power and improved computational tools will make these methods even more powerful in the future.
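To make the genome-scanning idea concrete, here is a toy sketch of the general pattern: compare proteins predicted from a pathogen genome against a library of proteins with known structures and flag the best matches as candidate targets. Real pipelines use profile and structure-based methods (e.g. PSI-BLAST or threading); the k-mer overlap score and every sequence below are invented stand-ins.

```python
# Toy target scan: match pathogen ORFs against a library of proteins of
# known structure. The Jaccard overlap of k-mers is only a crude proxy
# for homology, and all sequences here are made up for illustration.
def kmers(seq, k=3):
    """Set of overlapping k-length substrings of a protein sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def similarity(a, b, k=3):
    """Jaccard overlap of k-mer sets."""
    ka, kb = kmers(a, k), kmers(b, k)
    return len(ka & kb) / len(ka | kb)

pathogen_orfs = {"orf1": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ",
                 "orf2": "MSDNGPQNQRNAPRITFGGPSDSTGSNQNGERS"}
structure_lib = {"methyltransferase_1": "MKTAYIAKQRQISFVKSHFSRQAPEERLGLIE",
                 "protease_A":          "MGSSHHHHHHSSGLVPRGSHMASMTGGQQMGR"}

for orf, seq in pathogen_orfs.items():
    best = max(structure_lib, key=lambda name: similarity(seq, structure_lib[name]))
    score = similarity(seq, structure_lib[best])
    if score > 0.2:  # arbitrary reporting threshold
        print(f"{orf}: best match {best} (score {score:.2f}) -> candidate target")
```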

As computers and modelling software become steadily cheaper the rate of advance of biomedical research will accelerate. More work will be done in computer simulations. More collaborations across great distances will take place. Ideas and results will be shared far more rapidly.

However, I see at least one big problem with this approach: if patent royalties cannot be earned on drugs developed this way, then no company will have the financial incentive to pay the hundreds of millions of dollars it takes to push an open source drug through the drug approval processes of the more industrialized countries. So the drug would never be developed for First World uses. Whether the potential First World uses were identical or for completely different purposes, the financial incentive to pay for First World clinical trials and regulatory hoops would be lacking.

Take, for example, artemisinin. Henry Lai and Narendra Singh at the University of Washington Department of Bioengineering have found that artemisinin, a compound used in less developed countries to treat malaria, works against cancer cells. Preliminary research has found anti-cancer effects in animals, and desperate cancer patients are already taking it on their own (since it comes from an herb it is available over the counter as an herbal extract). But because artemisinin is unpatentable it has failed to attract major investment from big pharma companies. It may be possible for companies to develop patentable compounds that work the same way, or to attach artemisinin or one of its active forms to antibodies that target cancer cells and get a patentable treatment that way. Still, the fact that artemisinin and artemether (another form of the compound) are not patentable has slowed the development of this therapy for cancer.

By Randall Parker 2005 January 17 12:44 PM  Biotech Advance Rates
Entry Permalink | Comments(8)
66 Year Old Romanian Woman Gives Birth

Her pregnancy was delivered at 33 weeks, 7 weeks short of a full 40 week term.

She underwent fertility treatment for nine years, including procedures to reverse the effects of menopause, before being artificially inseminated and then having a Caesarean at 33 weeks.

Most of the news reports on this story do not mention the most important point: she had a donor egg.

Adriana Iliescu, who was artificially inseminated using sperm and egg from anonymous donors, delivered her daughter Eliza Maria by Caesarean section, doctors at the Giulesti Maternity Hospital in Bucharest said. The child's twin sister was stillborn, they said.

So drugs could get her reproductive tract to the point where she could start and maintain a pregnancy for over 7 months.

The pregnancy started out with triplets but only one made it.

Marinescu said Iliescu was successfully inseminated on the first attempt, and that she initially was carrying triplets but lost the third fetus after nine to 10 weeks.

After two of the fetuses died a Caesarean was done.

The girl was born prematurely by Caesarean section after her twin sister died in the womb, the hospital said.

When it becomes possible to rejuvenate reproductive tracts the world could be faced with a population explosion. Career women especially can be expected to have children when they are finally able to do so with assurance in their 40s, 50s, and 60s. Here is the future for the most ambitious and talented women: Make it to the top of the corporate ladder, and stash away millions. Then get rejuvenation treatments to get young again, retire, and make babies.

Artificial wombs will also eventually remove the limits on reproduction caused by aging. Cloning techniques combined with rejuvenation techniques will even allow women to make babies with lab-produced eggs rather than with ovaries. Then there will be no need for women to burden themselves with menstrual cycles or pregnancy. My guess though is that some women will still opt for pregnancy for the experience.

Once rejuvenation becomes possible I predict that some or all governments will eventually limit the number of births each woman can have. Otherwise rejuvenated women who like children could have dozens or hundreds of children over a period of centuries.

By Randall Parker 2005 January 17 11:58 AM  Biotech Reproduction
Entry Permalink | Comments(8)
2005 January 15 Saturday
Rapid Gene Synthesizer Will Enable Custom Microbe Construction

A new method to synthesize long sequences of DNA lowers costs and increases speed of synthesis by orders of magnitude. (same article here)

HOUSTON, Dec. 22, 2004 – Devices the size of a pager now have greater capabilities than computers that once occupied an entire room. Similar advances are being made in the emerging field of synthetic biology at the University of Houston, now allowing researchers to inexpensively program the chemical synthesis of entire genes on a single microchip.

Xiaolian Gao, a professor in the department of biology and biochemistry at UH, works at the leading edge of this field. Her recent findings on how to mass produce multiple genes on a single chip are described in a paper titled "Accurate multiplex gene synthesis from programmable DNA microchips," appearing in the current issue of Nature, the weekly scientific journal for biological and physical sciences research.

"Synthetic genes are like a box of Lego building blocks," Gao said. "Their organization is very complex, even in simple organisms. By making programmed synthesis of genes economical, we can provide more efficient tools to aid the efforts of researchers to understand the molecular mechanisms that regulate biological systems. There are many potential biochemical and biomedical applications."

Most immediately, examples include understanding the regulation of gene function. Down the road, these efforts will improve health care, medicine and the environment at a fundamental level.

Long-time FuturePundit readers have heard me argue that the rate of advance of biotechnology increasingly resembles the rate of advance of electronics technologies such as silicon chip fabrication, hard drive fabrication, and fiber optic fabrication, where gains in capacity and speed come with doubling times measured in months to years. Well, for just this one biotechnological capability - the rate at which DNA can be synthesized - the advance has been two whole orders of magnitude in a single step forward. That is, at least briefly, many times faster than the rates of increase of the electronics technologies.

Using current methods, programmed synthesis of a typical gene costs thousands of dollars. Thus, the prospect of creating the most primitive of living organisms, which requires synthesis of several thousand genes, would be prohibitive, costing millions of dollars and years of time. The system developed by Gao and her partners employs digital technology similar to that used in making computer chips and thereby reduces cost and time factors drastically. Gao's group estimates that the new technology will be about one hundred times more cost- and time-efficient than current technologies.

With this discovery, Gao and her colleagues have developed a technology with the potential to make complete functioning organisms that can produce energy, neutralize toxins and make drugs and artificial genes that could eventually be used in gene therapy procedures. Gene therapy is a promising approach to the treatment of genetic disorders, debilitating neurological diseases such as Parkinson's and endocrine disorders such as diabetes. This technology may therefore yield profound benefits for human health and quality of life.
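To put the claimed gain in perspective, here is a quick back-of-the-envelope calculation. The dollar figures are loose assumptions pinned to the press release's own wording ("thousands of dollars" per gene, "several thousand genes" for a minimal organism, "about one hundred times" more efficient), not measured data:

```python
# Back-of-the-envelope: what a ~100x cost improvement means. The dollar
# figures are loose assumptions from the press release, not measured data.
import math

old_cost_per_gene = 2000.0          # assumed: "thousands of dollars" per gene
improvement = 100.0                 # the ~100x estimated by Gao's group
genes_for_minimal_organism = 3000   # assumed: "several thousand genes"

new_cost_per_gene = old_cost_per_gene / improvement
print(f"Cost per gene: ${old_cost_per_gene:,.0f} -> ${new_cost_per_gene:,.0f}")
print(f"A 100x gain equals {math.log2(improvement):.1f} cost halvings at once")
print(f"Minimal genome: ${genes_for_minimal_organism * old_cost_per_gene:,.0f}"
      f" -> ${genes_for_minimal_organism * new_cost_per_gene:,.0f}")
```

Under those assumptions a whole minimal genome drops from millions of dollars to tens of thousands, which is consistent with the press release's framing.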

Is this a sign of more things to come? Will the rate of increase of DNA sequencing technologies take a step forward as great as the step just made by DNA synthesis technologies? A couple of orders of magnitude decrease in DNA sequencing costs would put personal DNA sequencing within reach of at least wealthy people, and would greatly accelerate the rate at which the effects of various genetic sequence variations are identified.

This advance in DNA synthesis sounds great, right? But some of you must be thinking that this technology could be used for nefarious purposes, such as constructing dangerous pathogens like smallpox or a massive killer influenza strain. This fear is not limited to circles of educated laymen. Nicholas Wade of the New York Times reports that some scientists are concerned that fast DNA synthesis technology will make construction of dangerous pathogens too easy. (same article here and here)

"This has the potential for a revolutionary impact in the ease of synthesis of large DNA molecules," said Richard Ebright, a molecular biologist at Rutgers University with an interest in bioterrorism.

"This will permit efficient and rapid synthesis of any select agent virus genome in very short order," he added, referring to the list of dangerous pathogens and toxins that must be registered with the Centers for Disease Control and Prevention.

George Church of Harvard, one of the collaborators in the development of these machines, is so concerned about the potential danger of this technology that he would like to see the machines sold only to labs that register with the government.

Most of the really dangerous pathogens have had their DNA sequenced and published in the public domain. For example, the genomes of at least a dozen pox viruses including smallpox have been sequenced and published. Feel safe with the knowledge that some governments are contracting production of large amounts of smallpox vaccine? Well, greater understanding of many viruses (very likely smallpox among them) will eventually make it possible to construct virus variants whose surface antigen structures differ enough that existing vaccines will not provide much if any protection.

Now of course the problem with highly transmissible pathogens such as smallpox as bioterrorism agents is that they are likely to spread across the world and into the societies that terrorists may be seeking to protect against perceived threats from other societies in a Clash of Civilizations. Most terrorist groups are therefore likely to rule out the use of highly transmissible pathogens as terrorism weapons. A guy like Osama Bin Laden must understand that if he releases a large amount of smallpox in America then good Muslims will die when the disease inevitably spreads across international borders.

Still, it is not at all impossible that some religious fringe group could decide God has called on it to kill people all over the world because the human race has rejected some special message that all humans should recognize as obvious. Other terrorists could decide that God has told them that only the unfaithful will be felled by some pathogen.

Technologies and capabilities are needed to enable better responses to an outbreak of a lethal natural or human-made pandemic disease. We need biotechnologies that accelerate by orders of magnitude the rate at which new vaccines and drug treatments can be developed. We also need to develop capabilities to "harden" society against a major pandemic in the same way that militaries harden bunkers against bombs. We need ways to reduce inter-human contacts by orders of magnitude while still allowing the bulk of the normal operations of society to go on. And we need rapidly manufacturable face masks, building air filters, and other technologies that could reduce the ease of transmission of airborne pathogens.

By Randall Parker 2005 January 15 09:34 PM  Biotech Advance Rates
Entry Permalink | Comments(9)
2005 January 13 Thursday
Peter Huber And Mark Mills On Our Energy Future

Peter W. Huber and Mark P. Mills have a new book out about energy policy entitled The Bottomless Well: The Twilight Of Fuel, The Virtue Of Waste, And Why We Will Never Run Out Of Energy which includes a strong pitch for nuclear power as our best choice to meet continuously rising energy demand. Tyler Cowen finds the book interesting. The latest edition of the City Journal (which FuturePundit strongly recommends) has a long article by Huber and Mills which provides shorter versions of the book's arguments. Each American continually uses about 1,400 watts of electricity on average.

Think of our solitary New Yorker on the Upper West Side as a 1,400-watt bulb that never sleeps—that’s the national per-capita average demand for electric power from homes, factories, businesses, the lot. Our average citizen burns about twice as bright at 4 pm in August, and a lot dimmer at 4 am in December; grown-ups burn more than kids, the rich more than the poor; but it all averages out: 14 floor lamps per person, lit round the clock. Convert this same number back into a utility’s supply-side jargon, and a million people need roughly 1.4 “gigs” of power—1.4 gigawatts (GW). Running at peak power, Entergy’s two nuclear units at Indian Point generate just under 2 GW. So just four Indian Points could take care of New York City’s 7-GW round-the-clock average. Six could handle its peak load of about 11.5 GW. And if we had all-electric engines, machines, and heaters out at the receiving end, another ten or so could power all the cars, ovens, furnaces—everything else in the city that oil or gas currently fuels.

Note that the 6 and 10 Indian Points translate into 12 and 20 nuclear power plants, since each Indian Point has two units. So they are talking about supplying all the power for New York City for all purposes with about 32 nuclear power plants. In a previous post about how all transportation energy could be supplied by 1000 nuclear power plants I pointed to Westinghouse's new AP1000 nuclear plant design, which would generate about 1,100 MW or 1.1 GW. My guess is that since nuclear plants have down times it would take 32 AP1000 plants to run NYC. So we can place a price of about $32 billion on their construction. If anyone has good estimates on yearly operations and fuel costs or nuclear waste disposal and decommissioning costs I'd like to hear them in the comments.

NYC has a population of about 8 million people. The United States as a whole has about 293 million, almost 37 times larger. If Americans as a whole needed proportionately as many nuclear plants as New Yorkers then the US could be run completely on nuclear power with 32 times 37, or roughly 1,184, plants. However, that doesn't sound consistent with the previous estimate I've referenced claiming that 1000 nuclear plants would be needed to power just the cars in America. Anyone understand the cause of the different results of these calculations?
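Here is a rough sketch of the calculation using only the numbers cited above; the 90% capacity factor is my own assumption. Note that the naive population scaling covers all energy uses (electricity, transportation, and heat), while the 1,400 watt per-capita figure covers electricity alone, which may account for much of the apparent discrepancy:

```python
# A sketch of both plant-count estimates, using numbers from the post.
# The 90% capacity factor is my assumption; everything else is as cited.
nyc_plants = 32            # AP1000-class units to run all of NYC (from the post)
nyc_population = 8e6
us_population = 293e6

scale = us_population / nyc_population
print(f"US/NYC population ratio: {scale:.1f}")
print(f"Plants for the whole US by naive scaling: {nyc_plants * scale:.0f}")

# Cross-check against the 1,400 W per-capita average electric demand.
# This covers electricity only, not transportation or heating fuel.
avg_demand_w = 1400
plant_capacity_w = 1.1e9   # one AP1000-class unit
capacity_factor = 0.9      # assumed average output fraction
plants_for_electricity = us_population * avg_demand_w / (plant_capacity_w * capacity_factor)
print(f"Plants for US electricity alone: {plants_for_electricity:.0f}")
```

Roughly 400 plants would cover today's electricity demand alone; getting to 1,100 plus requires also electrifying the transportation and heating loads, with their conversion losses.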

Vehicles use only 30% of the raw energy now consumed in America.

The U.S. today consumes about 100 quads—100 quadrillion BTUs—of raw thermal energy per year. We do three basic things with it: generate electricity (about 40 percent of the raw energy consumed), move vehicles (30 percent), and produce heat (30 percent). Oil is the fuel of transportation, of course. We principally use natural gas to supply raw heat, though it’s now making steady inroads into electric power generation. Fueling electric power plants are mainly (in descending order) coal, uranium, natural gas, and rainfall, by way of hydroelectricity.

Note that in spite of the attention lavished upon oil as a political topic it accounts for less than half of all energy use in the United States. But there is a difference between energy generated and energy used. A large fraction of the energy generated in electric power generation plants is lost by the time electricity flows through a wall socket. To supply enough nuclear power to operate cars would require more heat generation than is currently generated from burning gasoline in cars. See, for example, the efficiency column of the first table of Engineer-Poet's Ergosphere post laying out his vision of our energy future.
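For readers who want to check the magnitudes, here is the unit conversion from the quoted 100 quads per year into continuous power. Only the 100 quad figure comes from the quoted article; the rest is arithmetic:

```python
# Unit conversion: 100 quads/year of raw thermal energy into continuous
# power. Only the 100-quad figure comes from the quoted article.
BTU_TO_JOULES = 1055.0
SECONDS_PER_YEAR = 365.25 * 24 * 3600

quads_per_year = 100
raw_thermal_watts = quads_per_year * 1e15 * BTU_TO_JOULES / SECONDS_PER_YEAR
us_population = 293e6

print(f"US raw thermal power: {raw_thermal_watts / 1e9:,.0f} GW continuous")
print(f"Per person: {raw_thermal_watts / us_population / 1000:.1f} kW thermal")
# Roughly 11 kW of raw heat per person versus 1.4 kW of delivered
# electricity: most raw energy never reaches an end user as electric power.
```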

In spite of the lower end-to-end energy efficiency of electricity, it is so convenient and useful that it is a growing percentage of total energy used in America and likely worldwide as well.

That shift is already under way. About 60 percent of the fuel we use today isn’t oil but coal, uranium, natural gas, and gravity—all making electricity. Electricity has met almost all of the growth in U.S. energy demand since the 1980s.

Will this shift toward electricity as the preferred medium for delivering energy continue? To put it another way: Does hydrogen stand any chance of becoming a major medium for the distribution of power? Hydrogen has a lot of problems as an energy storage medium. Perhaps advances in nanotechnology will solve some of those problems. But we'd still be left with the need to use nuclear power plants or solar photovoltaic panels to generate the power we'd use to produce hydrogen in the first place. At the same time, materials advances will reduce electric power transmission costs and so hydrogen is not going to compete against a static target.

Electricity has an inherent advantage over hydrogen: Many end uses require electricity. Think about computers or any electronic devices. The devices run on electricity. Hydrogen use for these applications would require generation of hydrogen from some other energy source (possibly in nuclear power plants designed to optimize hydrogen production), hydrogen transportation and hydrogen storage devices where needed, and then the use of hydrogen fuel cells to generate electricity where and when it is needed. Any guesses on why that approach can be expected to cost more or less than the construction of more superconducting high voltage lines?

Huber and Mills see electricity continuing to encroach on natural gas and other competing energy sources in end-use applications.

Electricity is taking over ever more of the thermal sector, too. A microwave oven displaces much of what a gas stove once did in a kitchen. So, too, lasers, magnetic fields, microwaves, and other forms of high-intensity photon power provide more precise, calibrated heating than do conventional ovens in manufacturing and the industrial processing of materials. These electric cookers (broadly defined) are now replacing conventional furnaces, ovens, dryers, and welders to heat air, water, foods, and chemicals, to cure paints and glues, to forge steel, and to weld ships. Over the next two decades, such trends will move another 15 percent or so of our energy economy from conventional thermal to electrically powered processes. And that will shift about 15 percent of our oil-and-gas demand to whatever primary fuels we’ll then be using to generate electricity.

Huber and Mills also point out that cars are becoming big electric appliances. They expect the trend toward hybrid vehicles to effectively turn car engines into electric generators connected to a large number of devices that run off the electricity supplied by the power plant under the hood. I agree with this assessment and have previously argued that Cars May Become Greater Electricity Generators Than Big Electric Plants. However, while the engines in hybrids may eventually become huge electricity generators, Huber and Mills argue that using gasoline to generate a car's electric power is an expensive way to charge car batteries. Electric power delivered by electric utility companies is much cheaper:

Once you’ve got the wheels themselves running on electricity, the basic economics strongly favor getting that electricity from the grid if you can. Burning $2-a-gallon gasoline, the power generated by current hybrid-car engines costs about 35 cents per kilowatt-hour. Many utilities, though, sell off-peak power for much less: 2 to 4 cents per kilowatt-hour. The nationwide residential price is still only 8.5 cents or so.

This makes pluggable hybrids (hybrids that can be recharged from wall sockets while parked) the next logical step. I expect we will see the development of better and cheaper batteries to facilitate this transition.
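The quoted 35 cents per kilowatt-hour is easy to sanity-check. The gasoline energy content and the engine-generator efficiency below are my own rough assumptions, not numbers from the article:

```python
# Sanity check of the quoted 35 cents/kWh for gasoline-generated
# electricity. Energy content and efficiency are my rough assumptions.
gasoline_price = 2.00               # $/gallon, as quoted
kwh_thermal_per_gallon = 33.7       # approximate energy content of gasoline
engine_generator_efficiency = 0.17  # assumed for a small car engine-generator

cost_per_kwh = gasoline_price / (kwh_thermal_per_gallon * engine_generator_efficiency)
print(f"Implied cost: {cost_per_kwh * 100:.0f} cents/kWh")   # ~35 cents
print(f"Versus 3 cent off-peak grid power: {cost_per_kwh / 0.03:.0f}x more expensive")
```

Under those assumptions off-peak grid power is roughly an order of magnitude cheaper per kilowatt-hour than power generated under the hood, which is the whole case for plugging in.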

Huber and Mills make a pitch for nuclear power with an interesting twist: because nuclear reactors are so small compared to the power that comes out of them, it is easy to overbuild their protective walls and containment to guard against terrorist attacks.

And uranium’s combination of power and super-density makes the fuel less of a terror risk, not more, at least from an engineering standpoint. It’s easy to “overbuild” the protective walls and containment systems of nuclear facilities, since—like the pyramids—the payload they’re built to shield is so small. Protecting skyscrapers is hard; no builder can afford to erect a hundred times more wall than usable space. Guaranteeing the integrity of a jumbo jet’s fuel tanks is impossible; the tanks have to fly. Shielding a nuclear plant’s tiny payload is easy—just erect more steel, pour more concrete, and build tougher perimeters.

Because uranium fuel amounts to only one tenth the total cost of nuclear power, nuclear is a great source for baseload power needs. Also, since hybrid vehicles could be recharged at night, an electric industry run on nuclear power would work extremely well with residential variable rate electric power metering, under which late night prices for electric power would be much lower than peak daytime rates.

Huber and Mills mention that the Hoover Dam on the Colorado generates 2 GW of electric power. Increasingly I find myself automatically translating energy numbers into nuclear plant terms. For the cost of two $1 billion Westinghouse AP1000 nuclear plants, plus perhaps a few billion more for operations, fuel, and waste disposal, the Hoover Dam could be torn down and the Colorado River allowed to return to its natural state. So perhaps the whole project would run to $6 billion. Sound far-fetched? Over a period of decades we spend trillions of dollars on environmental protection. Once nuclear power again becomes an acceptable energy source I predict that the idea of building nuclear power plants to enable the tearing down of hydroelectric dams will become popular in environmentalist circles.

Huber and Mills end their City Journal article with a plea to end our need to buy oil from the Middle East in order to stop the flow of money into a region so intent upon violence. I agree.

Update: What are nuclear fission power's biggest competitors in the medium run of the next 20 to 40 years? I see two:

  • Coal plants that have extremely low emissions. By extremely low I mean less than 1% of the mercury, sulfur dioxide, soot, and other emissions now coming from the worst American coal-fired electric power plants. Also, the plants would have near total carbon dioxide sequestration. Coal emissions control technology has already greatly improved and further improvements are possible. So clean coal might become possible for less than the cost of nuclear power.
  • Cheaper solar photovoltaics coupled with cheaper storage systems. The costs of photovoltaic panels inevitably will fall by an order of magnitude and more. At the same time, carbon nanotubes or lithium polymers will enable the building of batteries that are cheaper, higher power density, and longer lasting.

In the longer run what are nuclear fission's other competitors? Two more will become feasible:

  • Nuclear fusion. Will fusion be cheaper than fission?

  • Solar satellites. The satellites could provide constant power and the light hitting them would be more intense than the light hitting solar panels down on Earth. A carbon nanotube beanstalk into space may eventually make construction and deployment of such satellites orders of magnitude cheaper than it is today.

Natural gas in the form of clathrates on the ocean floor might become a major source. However, it is unclear how much natural gas is tied up in clathrates.

By Randall Parker 2005 January 13 04:06 PM  Energy Tech
Entry Permalink | Comments(87)
2005 January 11 Tuesday
Compulsive Gamblers Feel Less Pleasure From Gambling?

Brain scans show differences between how the brains of compulsive gamblers and those of normal people respond to gambling.

Compared with the controls, the pathological gamblers showed a lower level of activity in the ventral striatum, the dopamine-producing brain region that provides the pleasure in winning, suggesting gamblers remain unsatisfied even when winning. The scans also showed decreased activation of the ventrolateral prefrontal cortex -- the brain's "superego," which keeps people from acting impulsively.

Mick Jagger would understand. Gamblers "just can't get no satisfaction".

I'm trying to picture the logistics of a guy gambling while in an MRI machine.

In the study, the brains of 12 compulsive gamblers and 12 non-gamblers were monitored using functional magnetic resonance imaging (fMRI) while they played a simple card guessing game.

If dopamine levels could be boosted in the brains of compulsive gamblers they might not feel as great a need to gamble. However, another group reported in May 2004 that when an unpredictable monetary reward arrives, dopamine release increases in one area of the brain while decreasing in other areas.

Zald and his colleagues used positron emission tomography (PET scanners) to view brain activity in nine human research subjects who had been injected with a chemical that binds to dopamine receptors in the brain, but is less able to bind when the brain is releasing dopamine. A decrease in binding to the receptors is associated with an increase in dopamine release, while an increase in binding indicates reduced release of dopamine. This technique allows researchers to study the strength and location of dopamine release more precisely than has previously been possible.

The team studied the subjects under three different scenarios. Under the first scenario, the subject selected one of four cards and knew a monetary reward of $1 was possible but did not know when it would occur. During the second scenario, subjects knew they would receive a reward with every fourth card they selected. Under the third scenario, subjects chose cards but did not receive or expect any rewards.

Zald and his team found that over the course of the experiment, dopamine transmission increased more in one part of the brain in the unpredictable first scenario, while showing decreases in neighboring regions. In contrast, the receipt of a reward under the predictable second scenario did not result in either significant increases or decreases in dopamine transmission.

The most effective treatments for gambling and drug addictions are going to have to involve manipulation of the pleasure regions of the brain. Well, that is quite the Rubicon to cross. Once pleasure centers are effectively rewired to treat addictions, rewiring for other purposes will not be too far behind. Imagine the possibilities for individual and group manipulation if the brain regions for pleasure and pain can be reorganized to any significant extent.

Another interesting consequence of the ability to conduct experiments that show neurological differences between addicts and normal people is that this will eventually lead to very objective methods for diagnosing addictions. Junior just got busted for using. Oh my, is Junior a crackhead? Or is he just recreationally using cocaine on occasion? Mom and dad will demand a brain scan test to find out. Or juvenile courts will require the brain scan. Similar work is bound to lead to effective means by which to reliably diagnose assorted compulsions and preferences. I predict that some day a demonstrated lack of response to pictures of children while being brain scanned will be a required condition for parole of convicted pedophiles. Also, one can imagine recovering addicts being required to pass a neurological test for lack of craving for an addictive drug as a condition of parole or to regain custody of children taken away by the state.

By Randall Parker 2005 January 11 07:27 PM  Brain Addiction
Entry Permalink | Comments(1)
Love Is Blind: Couples In Love Can't Identify Who Else Is In Love

People who were in love and people who were not were asked to view film clips of couples at different levels of emotional involvement. The viewers who were themselves in love were the least able to identify which of the filmed couples were in love.

"Love is truly blind," said Frank J. Bernieri, professor and chair of the Department of Psychology at Oregon State University and one of the authors of the study. "People in the study who had the longest relationships, were immersed in reading romance novels and spent lots of time watching romantic movies just loved this research. They all were quite confident of their ability to identify others in love."

"And without exception," he added, "they were, by far, the least accurate in their assessment."

The study was just published in the Journal of Nonverbal Behavior. Bernieri co-authored the paper with lead investigator Maya Aloni, who was an honors undergraduate at the University of Toledo when Bernieri was on the faculty there. She is now at State University of New York-Buffalo pursuing graduate studies.

A team of clinical psychologists at McGill University in Montreal filmed 25 couples for another study and used a battery of common assessment tools -- including the Sternberg Love Scale, the Hatfield Passion Scale and other relationship measures -- to determine the depth of couples' affection for one another. All of the couples had been together for at least three weeks; many for several months.

On film, the couples were seen interacting casually. Bernieri showed snippets of each couple to a series of volunteers and asked them to assess the depths of the filmed couples' feelings for each other.

"The range of accuracy was really extraordinary," Bernieri said. "Those who were best at it were about twice as good as those who did the worst. Imagine observing 10 couples and trying to identify the five who love each other the most, and the five who loved each other the least. If you were in love at the time of the study, you would only get three or four out of 10 couples -- so you'd be wrong twice as much as you'd be right."

"But if you weren't in love, you'd get it right six or seven times out of 10," he added. "That, in my book, is a huge difference."

If being drunk on alcohol at the time of getting married can be grounds for annulment then why can't being in love also be grounds for annulment? After all, people in love are in an obviously naturally drugged mental state and they obviously can't think straight. So shouldn't people in love be treated as suffering from a mental handicap or a special form of mental incapacitation? Should the law treat lovers as legally competent to enter into the serious and important contract of marriage?

Another interesting point about this study: Some people are especially skilled at identifying which couples are in love. Well, there are also rare individuals who have exceptional talent at identifying when someone is telling the truth or a falsehood. The technique these researchers used could be applied to a much larger set of subjects to identify people who are exceptionally skilled at telling who is in love. This has all sorts of practical applications in the war between the sexes. Imagine a woman who is uncertain if her boyfriend really loves her. She could arrange a dinner party or other gathering and pay an expert relationship evaluator to attend and figure out whether the boyfriend is just having a fling or is more committed.

One can also imagine use of expert relationship evaluators in marriage counseling. It would save a lot of time to be able to simply say "Jill obviously doesn't love Jack any more but she is reluctant to admit it." Or "Hey, these people hate each other and love each other at the same time".

Another neat application would be in the spy business both real and fictional. Imagine Alias star Jennifer Garner as Sydney Bristow pretending to be in love with a fellow agent while on a mission. The Covenant (or whatever shadowy international group is the current enemy of the CIA in Alias - I've lost track) could have some corrupt psychologist recruit a talented observer who would detect that Sydney is faking her love for some guy at an embassy reception. Then a big gun battle would ensue.

In the long run I predict drugs will be developed that will induce and halt the feeling of being in love.

Update: The brain changes in physically measurable ways when people fall in love. See my previous posts Love Deactivates Brain Areas For Fear, Planning, Critical Social Assessment and What Brain Scans Of People Falling In Love Tell Us and Hormone Levels Change When Falling In Love.

By Randall Parker 2005 January 11 03:19 PM  Brain Love
Entry Permalink | Comments(15)
Hearts Age More Slowly In Women Than In Men

Women have longer life expectancies than men in industrialized countries. Slower aging of hearts is one reason for greater female longevity. (same article here)

Research by exercise scientists at Liverpool John Moores University (LJMU) may have an answer to the age old question of why women live longer than men.

On average, women live longer than men and women over 60 are now the fastest growing cohort in today’s ageing society. LJMU’s findings show that women’s longevity may be linked to the fact that their hearts age differently to men’s and do not lose their pumping power as they get older.

David Goldspink, LJMU’s Professor of Cell and Molecular Sports Science explains: “We have found that the power of the male heart falls by 20-25% between 18 and 70 years of age. In stark contrast, over the same period there was no age-related decline in the power of the female heart, meaning that the heart of a healthy 70 year-old women could perform almost as well as a 20 year-old’s. This dramatic gender difference might just explain why women live longer than men.”

The results are based on the findings of the largest study ever undertaken on the effects of ageing on our cardiovascular system. Since the study began two years ago, Professor Goldspink and a team of scientists at LJMU’s Research Unit for Human Development and Ageing have examined more than 250 healthy men and women between the ages of 18 and 80 years.

As we age the whole circulatory system deteriorates. Blood vessels become less elastic and less able to carry blood to muscles and skin.

  • Blood pressure increases both at rest and during exercise, because the large arteries become stiffer and less elastic as we age.
  • Blood flow to the muscles and skin of the limbs also progressively decreases. These changes in the structure of blood vessels occur earlier in men, but women soon catch up after the menopause.

Aging pretty much amounts to going to hell in a handbasket. The spin from some quarters that it brings wisdom, maturity, and contentment is no consolation to FuturePundit. Those spinners mostly just make me feel irritated. Aging brings decline, decay, disorders, diseases, and for many people chronic pain and suffering. As Roger Waters famously put it, you're "older, shorter of breath, and one day closer to death". We should not meekly accept this fate. We can develop the ability to repair and replace worn out parts. Aging reversal is going to come some day. The question is whether it will come soon enough for each of us, and how many chronic maladies each of us will have to live with for years before rejuvenation therapies are developed.

A temporary solution? Become a veteran athlete.

In a related study, Prof Goldspink found that the hearts of veteran male athletes (aged 50-70) were as powerful, if not more powerful, than those of inactive 20-year-old male undergraduates.

"The 20-25 per cent loss of power in the ageing male heart can be prevented or slowed down by engaging in regular aerobic exercise."

Okay, let us all go back in our time machines and join the pro tennis circuit when we were only 15 years old. Misplaced your time machine? Darn, I can't find mine either. So we are left with rather more pedestrian options such as getting lots of exercise and eating better food. But it will be a lot easier when stem cell therapy can replace all the lost and tired heart cells and artery cells.

By Randall Parker 2005 January 11 02:26 PM  Aging Studies
Entry Permalink | Comments(3)
2005 January 10 Monday
Lasers Release Anti-Cancer Chemicals From Packages In Cells

Capsules containing anti-cancer agents and coated with gold nanoparticles can be melted open inside cancer cells using near-infrared lasers, and someday this may be done in patients without damaging non-cancerous cells.

So Frank Caruso and his team at the University of Melbourne, Australia, are developing an ingenious way of doing this. Their trick is to enclose the drug in polymer capsules that are peppered with gold nanoparticles and attached to tumour-seeking antibodies.

When injected into the bloodstream, the capsules will concentrate inside tumours. When enough capsules have gathered there, a pulse from a near-infrared laser will melt the gold, which strongly absorbs near-infrared wavelengths. This will rupture the plastic capsules and release their contents.

This packaging is neat because it would prevent the damage that conventional chemotherapy causes to normal cells all over the body while en route to cancer cells.

What is the biggest problem with cancer treatment? Cancer cells are too much like normal cells. Therefore it is hard to selectively kill cancer cells. It remains to be seen whether all cancer cells will have enough unique surface proteins to be targetable using antibodies. But for those cancer cells that do present distinct surface antigen patterns, an approach like Caruso's for packaging and selectively delivering toxic compounds (or even gene therapies) to cancer cells is going to be what ends up curing many types of currently incurable cancers.

Even if some types of cancer cells do not present a single unique antigen this approach could still be used to attack them. Suppose cancer cells present combinations of antigens that are rarely found together on normal cells. A few interdependent chemo agents could be placed in different packages attached to different antibodies. Imagine cancer cells have antigens A, B, and C. Then chemicals X, Y, and Z could be packaged with antibodies aimed at antigens A, B, and C respectively. Only cells displaying all 3 of the targeted antigens, which in combination mark them as cancer cells, would receive all the different types of chemical packages. Think of this as analogous to explosives that work only when two or three different chemicals are mixed together. One could create a metaphorical set of anti-cancer explosives: chemicals that become deadly only when X, Y, and Z are all released from their different packages into the same cell.
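To make the combination idea concrete, here is a toy model of the multi-antigen AND gate described above. The antigen and chemical names are purely illustrative, and real delivery would of course be probabilistic rather than all-or-nothing:

```python
# A toy model of the multi-antigen "AND gate" idea described above.
# Antigen names, chemicals, and cell types are purely illustrative.
from dataclasses import dataclass

@dataclass
class Cell:
    name: str
    antigens: set  # surface antigens this cell displays

# Each chemo component is delivered only to cells bearing its target antigen.
DELIVERY = {"X": "A", "Y": "B", "Z": "C"}

def lethal_dose_delivered(cell: Cell) -> bool:
    """The combined toxin forms only if every component reaches the same cell."""
    delivered = {chem for chem, antigen in DELIVERY.items() if antigen in cell.antigens}
    return delivered == set(DELIVERY)

cells = [
    Cell("tumor cell", {"A", "B", "C"}),
    Cell("normal cell 1", {"A"}),        # receives only X: harmless alone
    Cell("normal cell 2", {"A", "B"}),   # receives X and Y: still harmless
]
for cell in cells:
    print(cell.name, "killed" if lethal_dose_delivered(cell) else "spared")
```

The design choice here is exactly the explosives analogy: each component is inert on its own, so mis-delivery to a normal cell bearing one or two of the antigens does no harm.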

Monoclonal antibodies that deliver chemotherapy compounds are already in clinical trials. But Caruso's approach would allow most of the delivery packages to get into cancer cells first and then release their contents all at once, creating a bigger spike of cancer-killing compound. It would also work better with compounds that need to detach from their antibody delivery vehicles, and it seems likely that more chemo molecules could be delivered per antibody.

You can view a slide show and read text of a presentation that Caruso delivered in May 2003 that explains how some of the pieces of this capability were created. That presentation doesn't include the step of using antibodies to deliver capsules. But it does include some interesting bits of information on how the capsules were constructed to be able to be opened by a laser.

Update: One problem with this approach is that it may not work well for cancers that have widely metastasized. I came across one report claiming that the near-infrared lasers can penetrate a few millimeters of skin or be delivered endoscopically (and the light has to shine for only ten billionths of a second). But what if, for example, one has cancer that has metastasized to the bone? Lasers therefore seem problematic as activation agents. However, if capsules could be constructed to burst open in response to ultrasound then cancers in brains and other less accessible locations might be reachable with microcapsule chemo delivery vehicles.

By Randall Parker 2005 January 10 03:31 PM  Biotech Therapies
Entry Permalink | Comments(1)
2005 January 07 Friday
Twins Study Shows About Half Of Altruism Is Genetic

Male altruism is more heavily genetically determined than female altruism.

A paper showing a strong genetic contribution to social responsibility was published in the December 22 issue of Proceedings of the Royal Society: Biological Sciences, 271, 2583-2585, entitled "Genetic and environmental contributions to pro-social attitudes: a twin study of social responsibility."

The study compared identical twins with non-identical twins to see how much they agreed on 22 questions, such as "I am a person people can count on," "It is important to finish anything you have started," and "Cheating on income tax is as bad as stealing," using a scale from 1 (strongly disagree) to 5 (strongly agree). Answers are known to predict real-life behavior such as whether a person votes in elections or volunteers to help others.

The twins came from the University of London Twin Register. There were 174 pairs of monozygotic (identical twins, who share all their genes) and 148 pairs of dizygotic (non-identical twins, who share only half their genes). If monozygotic twins agree more than dizygotic twins it suggests that morality has a biological basis and is part of our evolved psychology.

The answers of the identical twins were almost twice as alike as those of the non-identical twins. The results showed that genes account for 42% of the individual differences in attitudes, growing up in the same home for 23%, and differences within the same home for the rest.

The study also found that genes had a stronger influence on males than females (50% vs. 40%) and that home upbringing had a stronger influence on females (40% vs. 0%). This suggests parents may watch over the behavior of daughters more carefully than they do for their sons.

In previous research Rushton has shown that genes influence people's levels of altruism and aggression--including feelings of empathy like enjoying watching people open presents and acts of violence such as fighting with a weapon. Rushton has also demonstrated that the male sex hormone testosterone sets the levels of aggression and altruism.

When asked about his findings Prof. Rushton noted, "They join a host of recent research in showing that both genes and upbringing influence almost every human behavior. It is especially interesting to see that this applies to moral attitudes." He said that he agreed with George Eliot's sentiment: "What do we live for, if it is not to make life less difficult for each other?"

If your reaction is that identical twins share more of their social environment, keep in mind that a great many twin studies have been done, including on twins reared apart. My impression from reading across these studies is that the common experiences of identical twins, as compared to non-identical twins and non-twin siblings, do not end up counting for much. So Rushton's use of this data to draw his conclusions about heritability is sound in my opinion.
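For readers curious how such variance components are typically estimated, here is a sketch using Falconer's classic twin formulas. The twin correlations below are back-solved to reproduce the reported 42%/23% split; the paper's actual correlations may differ:

```python
# Falconer's classic twin estimates, a sketch of the kind of calculation
# behind the 42%/23% figures. The correlations below are back-solved
# from the reported results, not taken from the paper itself.
def falconer(r_mz: float, r_dz: float):
    h2 = 2 * (r_mz - r_dz)   # heritability: genes
    c2 = 2 * r_dz - r_mz     # shared (family) environment
    e2 = 1 - r_mz            # non-shared environment plus measurement error
    return h2, c2, e2

r_mz, r_dz = 0.65, 0.44      # assumed MZ and DZ twin correlations
h2, c2, e2 = falconer(r_mz, r_dz)
print(f"genes: {h2:.0%}, shared environment: {c2:.0%}, non-shared: {e2:.0%}")
# -> genes: 42%, shared environment: 23%, non-shared: 35%
```

The logic is simple: identical twins share all their genes and fraternal twins share half, so doubling the gap between the two correlations isolates the genetic contribution.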

What is most important about this result? When something is genetic then it becomes manipulable using genetic technologies. Once people can control what genetic variations their offspring can have will they choose genetic variations that make their children more or less altruistic, more or less empathetic, more or less desirous to see justice done (with different levels of brain rewards for carrying out altruistic punishment), or more or less prone to being aggressive?

Once genetic variations for behavior and cognition become choosable by parents they will make choices that differ from the outcomes of chance combinations of their own genes. So human offspring will change somehow as a result. The question is how?

My guess is that the average decision made will differ from culture to culture. So cultures will become less alike as humans in different cultures make different average decisions about the behavioral characteristics of their offspring.

The abstract:

Abstract: Although 51 twin and adoption studies have been performed on the genetic architecture of antisocial behaviour, only four previous studies have examined a genetic contribution to pro-social behaviour. Earlier work by the author with the University of London Institute of Psychiatry Adult Twin Register found that genes contributed approximately half of the variance to measures of self-report altruism, empathy, nurturance and aggression, including acts of violence. The present study extends those results by using a 22-item Social Responsibility Questionnaire with 174 pairs of monozygotic twins and 148 pairs of dizygotic twins. Forty-two per cent of the reliable variance was due to the twins' genes, 23% to the twins' common environment and the remainder to the twins' non-shared environment.

Paid access to the full article is available here.

By Randall Parker 2005 January 07 01:55 PM  Brain Altruism
Entry Permalink | Comments(7)
2005 January 05 Wednesday
Jared Diamond Wrong To Worry About Environmental Collapse

Jared Diamond has a new book, Collapse: How Societies Choose to Fail or Succeed, about past societies that failed due to damage they inflicted on their environments through deforestation, overfarming, and other bad things that humans have done to the environment. He also argues that today we are at risk of a similar fate. Oh humans, you terrible people. Look at how you get your just deserts if you don't do right by the environment. Picture me rolling my eyes. Yet Diamond will be taken seriously in some quarters.

Steve Sailer points out that most societies that have fallen (and there have been many) did not do so as a result of damage they inflicted upon their environment.

Contra Diamond, in reality, most societies down through history died because they were conquered. Generally speaking, not suicide, but homicide was the fate of most extinct societies.

Diamond cites the Maya, but I cite the Aztecs and the Incas. He cites the Anasazi, but I cite the Cherokee, the Sioux, and countless others. He cites the Easter Islanders, but I cite the Maoris, the Tasmanians, the Australian Aborigines, the Chatham Islanders (exterminated by the Maori), and so forth. He cites the Vikings in Greenland, but I cite the Saxons in Britain and the Arabs in Sicily, both conquered by the descendents of the Vikings. We can go on like this all day.

Diamond used to be a terrific independent thinker, as shown in his 1993 book The Third Chimpanzee (indeed, many of my examples come from this book). But he sold out to political correctness, most profitably, in his bestseller Guns, Germs, and Steel.

How many people will pick up on the absurdity of Diamond's latest argument? It is so politically correct that it deserves an Onion-style parody. But some suckers will buy it. I predict he'll make a fair amount of money off a left-leaning segment of our society prone to excitement over uncontextualized trivia about environmental disaster. This sort of thing reinforces their prejudices and so will be welcome. P.T. Barnum was right after all. Those who treat environmentalism as a sort of secular religion will see Diamond's book as a bunch of clever new arguments (and they are in need of such arguments) to use in making converts and bucking up their own faith. Bjorn Lomborg and other rationalists (notably Julian Simon before him) have been bringing up disquieting rational arguments against some of the nuttier environmentalist claims. The faithful need something like this book.

Speaking of parody, how about WWII? The Germans, by attacking 3 major powers including one with an industrial base a few times greater than their own (albeit one that was terribly polluting at the time), brought on a counter-attack that devastated the environment of Germany. Allied bombings caused an ecological disaster which wrecked the quality of German water and food supplies and left many Germans without adequate shelter. This directly led to the fall of the Third Reich. The Germans should have had more fighter interceptors to protect their environment. The Luftwaffe should have been rebranded as the Aerial Environmental Protection Agency.

Or hey, how about the Carthaginians? By challenging the supremacy of Rome the Carthaginians provoked counter-attacks on the environment around Carthage. How irresponsible. The Carthaginians did not put enough resources into environmental protection (probably because their capitalists were funnelling money off to invest in Egypt or Syria) and eventually the Romans were able to defeat the Carthaginians on the field of battle. This left Carthaginian farm fields completely unprotected from Roman efforts to salt the earth. The result? Carthaginian fields became an unfarmable ecological disaster that the Carthaginians (at least those few still left alive) failed to repair.

There was an alternative for the Carthaginians. They could have pursued a policy of appeasement and let themselves become servants of Rome. Appeasement might have protected their fields. Though the Romans might have forced them to overfarm in order to ship more grain to Rome. That might have allowed the Roman Empire to go on longer before ecological collapse caused by the damage from all those Visigoth horse hooves. In any case, not only did the Carthaginians fail at their responsibility of environmental protection but they also failed to set up (let alone adequately fund) something like the EPA Superfund program to repair their damaged environment. Looked at this way by righteous environmentalists the Carthaginians clearly deserved their fate.

Tyler Cowen makes the correct argument that we have more than enough technological and human resources to deal with any environmental problem.

The key to the "meta-book" is Diamond's claim that part one -- the history of deforestation -- means we should worry more about part two, namely current environmental problems. The meta-book fails.

Yes we should worry about the environment today, but largely because of current data and analysis, not because of past history. If you look at the past, the single overwhelming fact is that all previous environmental problems, at the highest macro level, were overcome. We moved from the squalor of year 1000 to the mixed but impressive successes of 2005, a huge step forward. Environmental problems, however severe, did not prevent this progress. We may not arrive in 3005 with equal ease, but if you are a pessimist you should be concerned with the uniqueness of the contemporary world, not its similarities to the past.

Tyler's argument is ultimately why I am not deeply concerned about the possibility of global warming. Humanity's base of technological capabilities is only going to grow more advanced in the future. The global scale environmental problems we have now are ultimately solvable. For example, should we ever need to stop using fossil fuels then, as I've previously argued, nuclear power plants could provide all the power we need for transportation at a cost that would still allow modern lifestyles. Huge amounts of capital are available to build new coal-fired electric power plants. As CO2 extraction and sequestration technologies advance, the cost of adding CO2 emissions control systems will fall to the point where stopping CO2 emissions will become much cheaper than it would be today. Energy shortages are not going to stop us.

How can environmental pollution bring down modern civilization? I just do not see it. Take the apocalyptic warnings of future water shortages as an example. In the industrialized countries we have too many ways to deal with potential future water shortages. We can desalinate. Desalination is more expensive but still affordable. We can stop subsidizing agricultural uses of water. Farmers can adopt practices that use water more efficiently. We can put more efficient fixtures in showers. We currently mix all waste water together even though some types are much harder to process, so we could gradually rebuild our plumbing and waste water street pipes to separate them. There are just too many options for more efficient use and reuse of water that are doable at affordable prices. In the face of warnings about water shortages, as of 2003 and for the first time in history, more than half of the human race has piped water. As China and some other Asian countries industrialize, hundreds of millions more will get piped water. Nanotech advances in materials and biological engineering will make water filtration cheaper. So water isn't going to be what brings us low.

So what should we worry about with regard to the future? I think Tyler hits the right note when he speaks of the uniqueness of modern problems. Future dangers I worry about are nuclear proliferation, germ warfare pathogens, robots some day taking over, self-replicating nanotech that gets out of control, and genetically engineered ruthless semi-humans who lack the necessary empathy and feelings of fairness and altruism to make a workable society. You can read some of those items as ecological. But they would not be the result of overusing resources or emitting pollutants (unless someone wants to take seriously my strategic bombing pollution parody or perhaps categorize robots as pollutants).

Steve Sailer says that once upon a time, before Diamond made his run for fame and fortune by pitching appealing arguments to the politically correct, he had much more interesting and insightful things to say about the human condition.

Jared Diamond didn't used to be so boring: Jared Diamond has a new book out called Collapse about societies that have collapsed due to environmental disasters such as deforestation. It's a useful topic, but in the large scheme of things, a minor one, which is why Diamond spends so much time on famously trivial edge-of-the-world cultures like the Vikings in Greenland and the Polynesians on Easter Island. But Diamond is so good at getting publicity that the fact that ecology has little to do with the reason most societies collapse will likely be overlooked. The main reason you don't see many Carthaginians or Aztecs or members of other collapsed civilizations around these days is they got beat in war, as Edward Creasy's famous 1851 book "Fifteen Decisive Battles of the World" makes clear.

Steve is not the only one to make that argument. As Godless Capitalist has found, a younger and less politically correct Jared Diamond once said provocative things about selective pressures in human populations. But those days are long past.

In a way Diamond has flipped from the position he took in Guns, Germs, & Steel: he now focuses on the worries that come from noticing that humans can alter the environment. One of my favorite historians, William Hardy McNeill, wrote a great review of Diamond's GG&S for the New York Review of Books. McNeill's review costs 4 dollars. It prompted Diamond to write a letter to the New York Review of Books and you can read Diamond's response and McNeill's reply. Here is a bit of what McNeill said (my bold emphasis added):

Secondly, Diamond accuses historians of failing "to explain history's broadest patterns." I answer that some few historians are trying to do so, among them myself, and with more respect for natural history than Diamond has for the conscious level of human history. He wants simple answers to processes far more complex than he has patience to investigate. Brushing aside the autonomous capability of human culture to alter environments profoundly—and also irreversibly—is simply absurd.

So now Diamond is overemphasising the importance of human damage to the environment. Before, he was overemphasising the importance of environment as a restraint on human achievement and development while simultaneously sidestepping the importance of local environments as selective pressures. Diamond is responding to his own left-liberal academic environment and allowing himself to be far too constrained in what causes of history he will consider and what conclusions he will allow himself to draw.

Update: back40 examined some essays Diamond wrote as shorter versions of the arguments in his book.

As stated earlier, Diamond isn't convinced by his own analysis and is still perplexed. I am perplexed why we should pay much attention to the prescriptions of someone who is bewildered by the problem he seeks to cure.

Why were Easter Islanders so foolish as to cut down all their trees, when the consequences would have been so obvious to them? This is a key question that nags everyone who wonders about self-inflicted environmental damage. I have often asked myself, "What did the Easter Islander who cut down the last palm tree say while he was doing it?" Like modern loggers, did he shout "Jobs, not trees!"? Or: "Technology will solve our problems, never fear, we'll find a substitute for wood"? Or: "We need more research, your proposed ban on logging is premature"?

No, they didn't want to abandon their projects in which they had invested so much already and they didn't want to disrupt their group consciousness. They couldn't bear that double loss even though in the end it meant that they would lose everything. It is the self justification noted by Brockner: "when the group is faced with a negative feedback, members will not suggest abandoning the earlier course of action, because this might disrupt the existing unanimity." The individual human susceptibility to the "Concorde fallacy" is amplified by group consciousness.

It isn't the "globalization, international trade, jet planes, and the Internet" that Diamond worries about that are the problem, it is the “Concorde fallacy”, big projects entered into for flimsy reasons and maintained even when it is crystal clear that they are nothing but resource sinks. It's important to grasp this because Diamond's solution is to engage in even "greater integration of parts" so that he can enforce his proposed bans on logging or whatever. Group behaviors are less intelligent than individual behaviors for such problems and the larger the group the more this is true.

As long as we have enough energy we can clean up any industrial or agricultural processes that cause environmental problems. With sufficient wealth and energy any environmental disaster can be avoided. In Western industrialized societies overall environments are getting better, not worse. We already have enough wealth and technology to get plenty of energy from non-fossil fuel sources. So I do not see some coming future collapse of society due to lack of energy. Resource depletion and pollution are poor choices for speculations about disasters in the future. If you want to worry about the future, worry about natural dangers such as an asteroid collision or a repeat of the Yellowstone area eruption of 600,000 years ago that spewed out 240 cubic miles of debris. Or if you want to worry about human dangers, worry about runaway nanotech lifeforms or a robot take-over. Common forms of pollution or depletion of trees or fish or minerals just aren't going to bring down our civilization.

Update II: Regarding my comment about complaining about CBS and the NY Times: Someone emailed me to complain about this comment. In case anyone else didn't get it, I was joking! If we can't criticize left-liberal major media without bringing on a civil war then we are doomed anyway. In that case we obviously might as well coordinate our criticisms and make them reach a coordinated peak as a way to choose when the civil war will start. This will give the critics of the Grey Lady a decisive advantage in the outcome of the war. Though of course such an advantage would not be needed since the conservatives dominate the military anyway.

By Randall Parker 2005 January 05 11:37 PM  Trends Future Issues
Entry Permalink | Comments(51)
Appetite Regulator Galanin Increases Craving For Alcohol

A neuropeptide compound injected into rats caused them to consume large amounts of ethanol.

A brain chemical that stokes hunger for food and fat also triggers thirst for alcohol and may play a role in chronic drinking, according to a study led by Princeton University scientists.

The study showed that rats injected with galanin, a natural signaling agent in the brain, chose to drink increasing quantities of alcohol even while consuming normal amounts of food and water. The finding helps explain one of the mechanisms involved in alcohol dependence and strengthens scientists' understanding of the neurological link between the desires for alcohol and food.

"There seems to be a cycle of positive feedback," said Bartley Hoebel, co-author of a paper appearing in the December issue of Alcoholism: Clinical and Experimental Research. "Consumption of alcohol produces galanin, and galanin promotes the consumption of alcohol. That would perpetuate the behavior."

This suggests the obvious possibility that a compound that blocked the synthesis of galanin, or its binding to its receptors, would help alcoholics stop drinking.
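To make the feedback logic concrete, here is a toy simulation. It is my own illustration, not the researchers' model; the function name, the "block" parameter, and all the coefficients are arbitrary assumptions. It only shows that two mutually reinforcing quantities run away until something saturates, and that damping one arm of the loop breaks the cycle:

```python
# Toy positive-feedback sketch, not from the study: galanin level g
# and drinking rate d each decay a bit per step but also stimulate
# each other. A hypothetical "block" parameter scales down galanin's
# effect on drinking. All coefficients are arbitrary illustrations.

def simulate(block=0.0, steps=40):
    g, d = 0.2, 0.2                                    # arbitrary starting levels
    for _ in range(steps):
        g = min(1.0, 0.8 * g + 0.2 * d)                # drinking raises galanin
        d = min(1.0, 0.8 * d + 0.3 * g * (1 - block))  # galanin drives drinking
    return round(d, 3)

print(simulate(block=0.0))  # loop runs drinking up to the cap (1.0)
print(simulate(block=0.9))  # blocker breaks the loop; drinking decays toward zero
```

The point of the sketch is just that any loop in which each quantity amplifies the other grows until it saturates, while weakening either arm is enough to break the cycle. That is why a galanin blocker looks like an appealing target.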

The research was conducted by Michael Lewis, a visiting research fellow in Hoebel's lab, in collaboration with Hoebel, a professor of psychology; Deanne Johnson, a research staff member; Daniel Waldman, a senior undergraduate; and Sarah Leibowitz, a neurobiologist at Rockefeller University.

Galanin, a kind of small protein fragment called a neuropeptide, had previously been shown to play a role in appetite, particularly for fatty foods. Consumption of fat causes a part of the brain called the hypothalamus to produce more galanin, which, in turn, increases the appetite for fat. In a healthy person, however, there are counteracting signals that break this loop, said Hoebel.

In animals given galanin and access to alcohol, the role of the chemical appeared to be subverted: it boosted alcohol intake instead of eating. The effect was especially noticeable during daylight hours, when the nocturnal animals normally do not eat and drink much. Those given galanin drank alcohol during the day, but did not consume any more food or water than normal.

"Alcohol is the only drug of abuse that is also a calorie-rich food, and it undoubtedly has important interactions with systems that control food intake and nutrition," said Lewis, who is also a senior fellow of the National Institute on Alcohol Abuse and Alcoholism (NIAAA).

A drug that blocks the effect of galanin reduced alcohol consumption. However, it would be difficult to make a drug that would do the same in humans.

When the animals were given a drug that blocked the effects of galanin, they maintained normal eating and drinking habits. This observation helps confirm the conclusion that galanin affects alcohol consumption and also suggests the possibility of someday creating a drug that blocks galanin in order to fight alcoholism. However, Hoebel noted that such an achievement would be a long way off, because it is hard to make drugs that cross from the blood into the brain and interact with neuropeptide receptors. In addition, galanin plays many roles in other parts of the brain, which could be adversely affected by trying to block its effects related to food or alcohol.

The researchers plan to explore further the role of galanin and other neuropeptides in alcohol use, as well as the role of fat intake and metabolism on alcohol intake.

An effective drug to stop alcohol cravings would prevent enormous economic losses from brain damage in alcoholics, brain damage to fetuses of alcohol-abusing pregnant women, lost work time, crimes committed while drunk, accidental deaths and injuries, and still other losses. Addictions cost the US economy hundreds of billions per year. In 1992 a US government study tallied up total alcohol abuse costs at about $150 billion yearly. The health care costs alone were $18 billion in 1992. A separate category for motor vehicle crash costs was $24.7 billion. My guess is these costs are higher today.

My guess is that there are some major categories of cost which are not captured by that analysis. For example, there are children who have a genetic variation in monoamine oxidase A (MAOA) which causes them to react to child abuse by becoming permanently more impulsive and violent and to lack remorse for their assaults upon others. Well, how many of those children were sent down the path of a life of crime by fathers abusing them while on alcoholic benders? What is the cost to the rest of us from the assaults, murders, rapes, and other acts that come as a result of that abuse?

By Randall Parker 2005 January 05 06:40 PM  Brain Addiction
Entry Permalink | Comments(1)
2005 January 04 Tuesday
Embryonic Stem Cells Reduce Parkinson's Symptoms In Monkeys

Embryonic stem cells treated to become dopamine-producing neurons lessened the symptoms of Parkinson's Disease in monkeys.

The replenishment of missing neurons in the brain as a treatment for Parkinson disease reached the stage of human trials over 15 years ago, however the field is still in its infancy. Researchers from Kyoto University have now shown that dopamine-producing neurons (DA neurons) generated from monkey embryonic stem cells and transplanted into areas of the brain where these neurons have degenerated in a monkey model of Parkinson disease, can reverse parkinsonism. Their results appear in the January 3 issue of the Journal of Clinical Investigation.

Studies of animal models of Parkinson disease, as well as clinical investigations, have shown that transplantation of fetal DA neurons can relieve the symptoms of this disease. However the technical and ethical difficulties in obtaining sufficient and appropriate donor fetal brain tissue have limited the application of this therapy.

These researchers previously demonstrated that mouse embryonic stem cells can differentiate into neurons when cultured under specific conditions. These same culture conditions, technically simple and efficient, were recently applied to primate embryonic stem cells and resulted in the generation of large numbers of DA neurons. In their current JCI study, Jun Takahashi and colleagues generated neurons from monkey embryonic stem cells and exposed these cells to FGF20, a growth factor that is produced exclusively in the area of the brain affected by Parkinson disease and is reported to have a protective effect on DA neurons. The authors observed increased DA neuron development and subsequently transplanted these neurons into monkeys treated with an agent called MPTP, which is considered a primate model for Parkinson disease. These transplanted cells were able to function as DA neurons and diminished Parkinsonian symptoms.

In an accompanying commentary, J. William Langston from the Parkinson's Institute, California, describes this study as a milestone in the development of stem cell technology but cautions that while the observations are encouraging, the reported number of surviving DA neurons was very low, only 1–3% of the cells surviving, well below the estimated number of DA neurons that survive after fetal cell transplants (approximately 10%). While this may be a difference observed between transplantation in monkeys and humans, Langston stresses that it may be necessary for far more DA neurons to survive and for that survival to be long lasting in order to render this approach as a useful therapy in humans.

Langston highlights that "clearly the study reported here will advance research aimed at validating the use of stem cells to treat neurodegenerative disease" and this is most welcome particularly as investigators face yet another presidential moratorium endeavoring to limit the number of human stem cell lines that can be used for future research and treatment.

There are more hurdles here than just making the cells more viable before implanting them. There is the other extreme: the cells should not divide too much and replace more cells than are needed. Also, the cells should replace cells only in the parts of the brain where Parkinson's Disease has caused losses. Though adding extra dopaminergic neurons in small numbers in other parts of the brain might not cause a problem (leaving aside the possibility that a person's personality might change and they might effectively become someone else).

Eventually this work is going to progress to the point that researchers will want to try human trials. In Japan human embryonic stem cell use will probably not elicit much political opposition. So my guess is this avenue of research will eventually progress all the way to useful human therapies. At that point expect a big political fight in the United States over therapeutic cloning to produce human embryonic stem cell lines.

Animal models of diseases are very useful for the development of disease treatments. To use embryonic stem cells in therapy research on other species one must first produce embryonic stem cells. This is difficult to do in some species. In this context it is worth noting that one month ago a team at the University of Pittsburgh reported producing cloned rhesus monkey embryos.

Using newer cloning techniques, including the "gentle squeeze" method described by South Korean researchers who earlier this year reported creating the first cloned human embryonic stem cell line, University of Pittsburgh scientists have taken a significant step toward successful therapeutic cloning of nonhuman primate embryos.

It is the first time researchers have applied methods developed in the Seoul laboratory to nonhuman primate eggs. Resulting cloned embryos progressed to the blastocyst stage, a developmental step in which the embryo resembles a hollow, fluid-filled cavity surrounded by a single layer of cells. Called the inner cell mass, this layer contains embryonic stem cells. Growth of a cloned nonhuman primate egg to the blastocyst stage is farther along the developmental spectrum than ever achieved before, Gerald Schatten, Ph.D., director of the Pittsburgh Development Center at Magee-Womens Research Institute, and his colleagues report.

It remains to be seen whether cells produced using this technique will be a useful source of monkey embryonic stem cells.

So how did the Japanese team get monkey embryonic stem cells for their research? My guess is that embryonic stem cells from a conventionally initiated monkey pregnancy were the source. But does anyone reading this know for sure?

By Randall Parker 2005 January 04 01:34 PM  Brain Disorder Repair
Entry Permalink | Comments(4)
2005 January 03 Monday
Do Men Want Dumber Women As Mates Or Are Smart Women Too Choosy?

Social scientists at the universities of Aberdeen, Bristol, Edinburgh and Glasgow in Britain tested the IQs of 900 boys and girls at the age of 11 and then checked on their rates of marriage 40 years later. They found that higher IQ increases the chances a man will marry but high IQ causes an even greater decrease in the chances that a woman will marry. (same article here)

“The finding that IQ in early life appears to be associated with the likelihood to marry is important because factors in childhood may determine a person’s marital status in adulthood, which may in turn influence future health and mortality,” says the study, to appear in the Journal of Personality and Individual Differences.

For boys, there is a 35% increase in the likelihood of marriage for each 16-point rise in IQ. For girls, there is a 40% drop for each 16-point increase.
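The report gives only the per-16-point rates. As a rough illustration of how such rates compound across the IQ range, here is a minimal Python sketch; the multiplicative reading and every name in it are my own assumptions, not the study's model:

```python
# Minimal sketch, not from the study: treat the reported 35% rise
# (men) and 40% drop (women) as compounding multiplicatively per
# 16-point IQ step above a base of 100. Hypothetical illustration.

def marriage_multiplier(iq, base_iq=100, step=16):
    """Relative likelihood of marrying versus a base-IQ peer."""
    steps = (iq - base_iq) / step
    men = 1.35 ** steps    # +35% per 16 points
    women = 0.60 ** steps  # -40% per 16 points
    return men, women

for iq in (100, 116, 132):
    men, women = marriage_multiplier(iq)
    print(f"IQ {iq}: men x{men:.2f}, women x{women:.2f}")
# IQ 132: men ~1.82x as likely to marry, women ~0.36x
```

Under that reading, a 32-point IQ advantage roughly doubles a man's chances of marrying while cutting a woman's to about a third, which conveys the size of the effect better than the per-step percentages do.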

One possible cause of this result is that many smarter women find it beneath them to be wives. Or perhaps they are too choosy in wanting higher status men, whereas the men are not as choosy about the status of females and hence can find a suitable mate from a much larger pool of women. Men are more driven to seek physical beauty and youth as a result of selective pressures to seek fertile mates, whereas natural selection favored a female preference for higher status men as better providers.

For the lower status and less intelligent women the smart successful men (and smart men are more successful on average) look like great catches that allow the women to move up in status and in creature comforts. They might also see smarter men as likely to treat them more thoughtfully (at least on average - though there are smart and callous men of course).

Another possible cause of the reduction in marriage rates for higher IQ women is that they spend more time in school than lower IQ women and therefore delay marriage past the point of their maximum attractiveness and maximum fertility. This is certainly consistent with a study on the Australian Twins Registry which found that higher education reduces the reproductive fitness of women. It would be interesting to look at the women in the most recent study to see if higher IQ still lowered marriage rates once educational attainment was adjusted for.

Go back and read the comments of my previous post Men Prefer Subordinate Women For Long Term Relationships. Note that some people really took issue when I advanced the argument that smarter women are at a disadvantage in finding a mate. Here is social science data that strongly supports the common intuition. Anyone still want to dispute this argument?

Here is what I want to know: Are genes for higher IQ being selected against? If smarter men are marrying more, are they having more kids to compensate for the fact that smarter women are having fewer kids? My guess is that there is a net dysgenic effect. However, in America there is one higher IQ group that has a higher fertility rate: higher income Republicans have more children than lower income Republicans and various groups of Democrats. So the selective pressures on genes for IQ are hard to tease out. We need cheap DNA sequencing, which will probably come along in 5 to 10 years, to settle this question.

By Randall Parker 2005 January 03 02:18 PM  Human Mating
Entry Permalink | Comments(102)
Vitamin D Could Decrease Overall Cancer Risk 30%

A forthcoming epidemiological study strengthens the case that higher vitamin D intake could dramatically lower the rate of cancer in the United States.

Other studies have suggested that higher vitamin D levels help protect against colon, prostate, and breast cancer, but a long-term study of 50,000 men by researchers at Harvard School of Public Health suggests vitamin D may reduce the risk of all cancers. The study, which is still under review for publication, found that men who consumed higher levels of vitamin D reduced their overall cancer risk by at least 30 percent, according to lead author, Ed Giovannucci. The findings were statistically significant, he said, and a separate study of women is expected to produce similar results.

This is big stuff. Imagine an anti-cancer drug that reduced overall cancer deaths by 30%. It would be hailed as a medical wonder. But it is much better to avoid getting cancer in the first place.

Another interesting angle here is that this huge benefit against cancer is coming from a vitamin that is not classified as an antioxidant. For decades researchers have been trying to use antioxidant, free-radical-quenching vitamins such as beta carotene, vitamin E, and vitamin C to reduce cancer, heart disease, and other diseases. The results have been pretty disappointing. Now the biggest potential benefit turns out to be from a vitamin which is most likely operating by a mechanism unrelated to prevention of free radical damage.

Keep in mind that this result will not carry over to the world as a whole. Some populations are consistently exposed to enough sunlight for their skins to synthesize the amount of vitamin D that they need. But a 30% reduction in cancer in America looks to be possible. That would be an enormous boon, both lengthening lives and reducing medical costs.

This latest study does not come as a surprise. It builds upon a larger body of epidemiological evidence for a wide array of benefits from consumption of greater quantities of vitamin D. A previous analysis found that addition of Vitamin D and calcium to grains would reduce the incidences of fractures and colon cancer and save $3 billion per year for a cost of less than $20 million per year.

Currently, the federal government requires that manufacturers enrich cereal-grain products with five nutrients—iron and the vitamins thiamine (B1), riboflavin (B2), niacin (B3), and folate (B9). The total cost to U.S. consumers of adding calcium and vitamin D to the list should be no more than about $19 million a year, Harold L. Newmark of Rutgers University and his colleagues report in the August American Journal of Clinical Nutrition. Conservatively, they calculate, this investment would spare U.S. consumers some $3 billion in direct medical costs from illnesses and injuries stemming from their inadequate intake of calcium and vitamin D.
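As a quick check of the arithmetic behind that claim:

$$\frac{\$3 \times 10^{9}\ \text{(avoided medical costs)}}{\$19 \times 10^{6}\ \text{(annual fortification cost)}} \approx 158$$

So if Newmark's estimates hold, every dollar spent on fortification would return on the order of $158 in avoided direct medical costs.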

Most Americans get less than the officially recommended amount of vitamin D.

The human body can generate 10,000 to 12,000 international units (IU) of vitamin D from a half-hour of summer-sun exposure. The National Academies recommend that adults, depending on their age, get from 200 to 600 IU of the vitamin each day.

In practice, however, most people in the United States get a daily intake from food and sun exposure well below that recommended intake, especially during winter. People living in the United States and Europe or farther from the equator have trouble getting enough sun to maintain adequate blood concentrations of the vitamin. When people heed dermatologists' warnings about preventing skin cancer by limiting sun exposure and using sunscreen, they also reduce their vitamin D production.

Even the officially recommended amounts of vitamin D are probably well below the level that would provide maximal benefit. Click through on the following link to read an argument for getting 800 to 1000 IU of vitamin D daily.
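Taking the synthesis rate quoted above (10,000 to 12,000 IU from a half-hour of summer sun) and assuming, as a rough approximation, that skin production scales linearly with exposure time, even the higher 1,000 IU target corresponds to only a few minutes of midday summer sun for fair skin:

$$30\ \text{min} \times \frac{1{,}000\ \text{IU}}{10{,}000\ \text{IU}} = 3\ \text{min}$$

In reality synthesis varies with season, latitude, and skin type and eventually saturates, so treat this as a back-of-the-envelope figure, not a dosing recommendation.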

Lots of aspects of modern society reduce sun exposure. For example, work in office buildings contributes to a reduction in sun exposure and reduced vitamin D synthesis. So does the message from dermatologists to avoid the sun as a way to lower the risk of skin cancer. This has led to a debate in medical circles about whether sun exposure increases or decreases net cancer risk. This debate has so upset the dermatologists that vitamin D researcher Michael Holick was forced out of Boston University's dermatology department because he veered too far from the accepted orthodoxy among dermatologists about sun exposure. My own view is that moderate sun exposure decreases net cancer risk and that the evidence is building up to the point that science is going to vindicate Holick. By the way, Holick thinks the top daily safe dose for vitamin D is at least 5000 IU, which is much higher than the current officially recommended maximum daily dose (which is 2000 IU if memory serves).

Another element of modern society that is causing vitamin D deficiency is the migration of darker-skinned peoples to places farther from the equator. This has put them in environments where their darker skin pigment blocks too much of the sun to allow sufficient vitamin D synthesis.

Global location and skin color also affect the amount of vitamin D a person's skin manufactures. UV intensity falls as one moves from the equator toward Earth's poles, increasing latitude. Evolution compensated by selecting for increasingly unpigmented skin in northern populations, says Boston University endocrinologist Michael F. Holick.

Melanin pigment protects the skin from the damage of UV rays but also lowers the skin's production of vitamin D. In the March American Journal of Clinical Nutrition, Holick quantifies this effect: Fair-skinned people who sunburn easily and rarely tan need just 2 to 10 percent as much sun exposure to produce a unit of vitamin D as do people with the darkest skin.
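Inverting Holick's percentages gives a sense of the scale of that difference: if fair skin needs only 2 to 10 percent as much exposure, then the darkest skin needs roughly 10 to 50 times as long in the sun to make the same unit of vitamin D:

$$\frac{1}{0.10} = 10\times \qquad\text{to}\qquad \frac{1}{0.02} = 50\times$$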

Given that blacks especially have higher rates of lactose intolerance, the fortification of milk with vitamin D is not reaching a group most in need of dietary vitamin D.

Consumption of fruit appears to boost the level of the biologically active form of vitamin D in the blood. Though if you do not have enough vitamin D in your body, fruit cannot make up for that deficiency.

Vitamin D does protect men from prostate cancer. In the USA and many other countries, milk is fortified with vitamin D. Even so, calcium in milk and other foods lowers the amount of usable vitamin D in the body. Eating several servings of fruit a day keeps the level of vitamin D raised.

A high circulating level of the biologically active form of vitamin D (1,25(OH)2 vitamin D [1,25(OH)2D]) is known to inhibit formation of cancer in the prostate. Eating a diet high in meat and milk and low in fruit reduces the level of this anti-prostate cancer vitamin. "High intakes of calcium and phosphorus, largely from dairy products, lower circulating 1,25(OH)2D level, and sulfur-containing amino acids from animal protein lower blood pH, which also suppresses 1,25(OH)2D production."

Fortification of foods with calcium alone may well have the effect of lowering the rate of colon cancer while boosting the rate of prostate cancer. One concern I have with the combined calcium and vitamin D food fortification is that the level of vitamin D added needs to be high enough to more than cancel any increase in prostate cancer risk from the added calcium.

It is clear that in America the fortification of milk alone with vitamin D is inadequate. Declining milk consumption prevents milk from being an avenue for boosting vitamin D in a growing portion of the population. Fortification of other milk products such as cheese and yogurt, and even a boost in the level of fortification of milk, seems called for. Also, the potential benefit of grain fortification with vitamin D is so large that it warrants urgent consideration. Higher vitamin D in diets would reduce the risk of cancers, type I diabetes, osteoporosis, unexplained muscle and bone pain, hypertension, a wide array of auto-immune diseases (including multiple sclerosis), and possibly other disorders and diseases as well.

The studies about vitamin D and health are great news. The incidence of several major diseases can be reduced for a trivially low cost. We need many more such discoveries that show how to cheaply improve human health.

By Randall Parker 2005 January 03 02:33 AM  Aging Studies
Entry Permalink | Comments(3)