2003 November 27 Thursday
TriStem Claims Converts Blood Cells To Stem Cells

New Scientist magazine reports on the claims of London-based biotech company TriStem that it has developed a very rapid way to convert blood cells into less differentiated stem cells.

TriStem has been claiming for years that it can take half a litre of anyone's blood, extract the white blood cells and make them revert to a "stem-cell-like" state within hours. The cells can be turned into beating heart cells for mending hearts, nerve cells for restoring brains and so on.

The company has now finally provided proof that at least some of its claims might be true. In collaboration with independent researchers in the US, the company has used its technique to turn white blood cells into the blood-generating stem cells found in bone marrow.

The ability to dedifferentiate mature cells (the article uses the term retrodifferentiation) would be incredibly valuable for leukemia treatment alone. If all of the company's claims are correct, the number of applications would be enormous. TriStem hasn't yet proven all of its claims, and if you read the full article you will find different scientists voicing varying degrees of skepticism about them. But TriStem has begun to demonstrate some of its claims to outside scientists, including Tim McCaffrey of George Washington University, and a clinical trial in Britain will attempt to use TriStem's technique to treat aplastic anemia, with results due by March 2004. So we will soon know a lot more about these claims.

The ability to take a person's own blood cells, dedifferentiate them, and grow them in large numbers for conversion into a wide variety of cell types would provide the advantages of human embryonic stem cells (hESCs) while avoiding the political opposition stemming from the ethical objections that have been raised against their use. Such cells would also have a big advantage over hESCs because hESCs from another person pose potential immune incompatibility problems, whereas one's own cells are unlikely to provoke immune rejection.

There have been a lot of reports lately of success using and manipulating adult stem cells. See recent posts MIT Technique To Produce Large Numbers Of Adult Stem Cells, Stem Cells On Spinal Cord Injury Opened Connection To Brain, and Adult Stem Cell Research Promising For Heart, Lung Disease.

By Randall Parker 2003 November 27 09:52 PM  Biotech Organ Replacement
Entry Permalink | Comments(8)
Insulin Production May Resume If Auto-Immune Attack Halted

Many researchers have been pursuing what they believed to be two separate parts of the solution to type I diabetes: A) stop the auto-immune response that kills the pancreatic islet of Langerhans cells and B) either replace the lost cells or deliver gene therapy to instruct other cells to take their place as insulin releasers. Well, the good news is that while pursuing these problems researchers may have discovered that there are adult precursor stem cells in the spleen that have the ability to take over the function of the lost insulin-making pancreatic islet cells.

Cells from an unexpected source, the spleen, appear to develop into insulin-producing pancreatic islet cells in adult animals. This surprising finding from Massachusetts General Hospital (MGH) researchers, published in the Nov. 14 issue of Science, is a followup to the same team's 2001 report of a treatment that cures advanced type 1 diabetes in mice. In discovering the biological mechanism behind that accomplishment, the researchers also have opened a potential new approach to replacing diseased organs and tissues using adult precursor cells.

"We have found that it is possible to rapidly regrow islets from adult precursor cells, something that many thought could not be done," says Denise Faustman, MD, PhD, director of the MGH Immunobiology Laboratory and principal investigator of the study. "By accomplishing effective, robust and durable islet regeneration, this discovery opens up an entirely new approach to diabetes treatment."

David M. Nathan, MD, director of the MGH Diabetes Center, notes, "These exciting findings in a mouse model of Type 1 diabetes suggest that patients who are developing this disease could be rescued from further destruction of their insulin-producing cells. In addition, patients with fully established diabetes possibly could have their diabetes reversed." Nathan has developed a protocol to test this approach in patients, but additional grant support is needed before a clinical trial can begin.

Type 1 diabetes develops when the body's immune cells mistakenly attack the insulin-producing islet cells of the pancreas. As islet cells die, insulin production ceases, and blood sugar levels rise, damaging organs throughout the body.

In their earlier study, Faustman's team directly attacked this process by retraining the immune system not to attack islet cells. They first used a naturally occurring protein, TNF-alpha, to destroy the mistargeted cells. Then they injected the mice with donor spleen cells from nondiabetic mice. A protein complex on these cells plays a key role in teaching new immune cells to recognize the body's own tissues, a process that goes awry in diabetes and other autoimmune disorders.

The earlier study's results are also quite important because it is essential to develop the ability to selectively knock out immune cells that are causing an auto-immune response in order to cure diabetes, rheumatoid arthritis, lupus, multiple sclerosis, and other auto-immune disorders.

The researchers expected to follow that process, which eliminated the autoimmune basis of the animals' diabetes, with transplants of donor islet cells. However, they were surprised to find that most of the mice did not subsequently need the transplant: Their bodies were producing normal islet cells that were secreting insulin.

"The unanswered question from that study was whether this was an example of rescuing a few remaining islet cells in the diabetic mice or of regeneration of the insulin-secreting islets from another source," says Faustman. "We've found that islet regeneration was occurring and that cells were growing from both the recipient's own cells and from the donor cells." An associate professor of Medicine at Harvard Medical School, Faustman notes that it has been generally believed that most adult organs cannot regenerate and that adult stem cells or cellular precursors would not be powerful enough to reconstitute functioning insulin-secreting islets.

In order to determine whether or not the new islets had developed from the donated spleen cells, the researchers carried out the same treatment using spleen cells from healthy male donors to re-educate the immune cells of female diabetic mice. In those diabetic mice that achieved long-term normal glucose metabolism, the researchers found that all of the new functioning islets had significant numbers of cells with Y chromosomes, indicating they had come from the male donors. In another experiment, donor spleen cells were marked with a fluorescent green protein, and again donor cells were found throughout the newly developed islets.

Here comes the especially interesting part: if the auto-immune response can be halted in human diabetes sufferers then it is likely that over a period of months the body will slowly develop the ability to secrete enough insulin to control blood sugar without insulin shots.

A separate experiment, however, indicated that islets also could grow from remaining precursor cells in the diabetic mice and resume insulin secretion once the autoimmune process had been halted. Such regrowth from the animal's own cells was slightly slower than regeneration from donor cells – taking about 120 days – but the eventual regeneration of islets was just as complete. The result suggests that, given time, regrowth of islets can occur in animals who have immune system re-education to eradicate their diabetes but do not receive the donor islet cell precursors.

The researchers then separated spleen cells into those with a surface molecule called CD45, which indicates the cell is destined to become an immune cell, and those without CD45. They injected labeled spleen cells with or without CD45 – or unseparated cells – into young mice in which autoimmunity had begun but full-blown diabetes had not yet developed. After the immune system re-education therapy, all of the mice maintained normal glucose control, while their untreated littermates soon became diabetic. However, close examination of pancreatic tissue from the treated mice revealed markers from the donor cells only in the islets of those who had received spleen cells without CD45.

"It's the cells without CD45 that are the precursors for pancreatic islets. They have a distinct function that has not previously been identified for the spleen," Faustman says.

Any time a new source of cells capable of turning into other cell types is found in the body, that alone is a small reason to celebrate. Each type of stem cell that is identified is another useful piece of the puzzle and helps with the development of future stem cell treatments. But this report is also great news because it indicates that cell therapy to replace islet cells may not even be necessary in order to cure type I diabetes.

By Randall Parker 2003 November 27 07:24 PM  Biotech Organ Replacement
Entry Permalink | Comments(10)
FDA Stands In The Way Of Promising Heart Stem Cell Therapy

The buttheads at the US Food and Drug Administration are standing in the way of rapid trials to test the efficacy of the use of blood stem cells to treat heart disease.

“After we went public, FDA told us not to conduct any similar procedures,” says Steven Timmis, the cardiologist at Beaumont who performed the bone marrow transplant. “We had also proposed a 100-patient randomized clinical trial, but FDA has denied this.”

The doctors did not seek FDA approval prior to the transplant because it was an emergency operation and they believed the procedure was allowed because the teen received injections of stem cells from his own blood. Transplants of bone marrow, the primary source of stem cells, have been performed routinely in the United States for more than 30 years.

The teenager first received a drug that increases the production of stem cells in the bone marrow. The stem cells are released into the bloodstream. The doctors then collected the cells from his circulating blood.

In 1999, researchers showed that bone marrow stem cells injected into mice with damaged heart muscle homed in to the damaged tissue and restored function. Since then, clinical trials in Britain, Germany, Hong Kong, China, and Brazil have shown that patients with heart disease who are injected with their own bone marrow stem cells improve significantly.

When people have heart disease that is going to kill them in a few months or a few years it is time for the government to butt out and let people choose what risks they want to take with their lives. This sort of report infuriates me. The FDA is insisting that the researchers spend a couple of years on animal trials when groups in other countries are charging ahead of US researchers and getting promising results.

The doctors working on this method at Beaumont Hospital in Royal Oak, Michigan tried it out as an emergency treatment on a 16 year old kid named Dimitri Bonneville whose heart was punctured by nails from a nail gun in an accident. The kid would probably be dead by now, but instead the emergency treatment worked and he's running around being a kid again. But the FDA doesn't like surprise trials of treatments with press conferences to announce the unexpected results. That sort of thing is not controlled by expert bureaucrats sitting in committees passing expert judgement and so we can't have that.

The FDA's jurisdiction is questionable since use of one's own tissues (homologous use) is supposed to be allowed without approval. So the FDA is claiming that the pressure at which the catheter injects the cells makes the procedure experimental.

In my very strongly held view anyone with a terminal illness should be free to try any experimental therapy that they can find a medical doctor to deliver. Will there be abuse? Sure. But do we own our own lives or not? Also, greater freedom to experiment will accelerate the rate of medical advance and more lives will be saved than will be lost by failed experimental therapies. Keep in mind that if the people allowed this exception are ones who have only a few years or months or weeks or days to live they may feel the gamble is worth it. Who are government bureaucrats to make a value judgement about such a decision?

Update: Some other blogs are linking to this item and the comments made on those blogs indicate that there seems to be some misunderstanding about why the FDA is taking the position it is taking toward the Michigan researchers. The FDA's position has nothing to do with the controversy over human embryonic stem cells. The stem cells used by the Michigan researchers are not embryonic and there is no substantial religious opposition to the use of non-embryonic stem cells. To those misinterpreters: Back off on the intense reflexive unthinking irrational partisanship. This is not a story about ethical objections to a promising medical treatment. The FDA is not following the instructions of the Bush White House on this question. This is just the FDA acting the way it normally acts under both Democratic and Republican administrations. If you don't like it then you ought to tell your Congressional representative that the FDA has too much power to prevent terminally ill people from trying experimental therapies.

By Randall Parker 2003 November 27 03:30 PM  Biotech Organ Replacement
Entry Permalink | Comments(5)
Choline May Restore Middle Aged Memory Formation

MIT professor Richard Wurtman and post-doc Lisa Teather found that choline in the form of cytidine (5')-diphosphocholine (CDP-choline) improved learning in rats after 2 months of supplementation.

Among rats not getting CDP-choline, the older animals seemed to forget much of the previous day's learning, Teather says, while the young ones didn't. By the end of 4 days of testing, she notes, the difference between these groups "was really huge," suggesting that the older ones had trouble forming long-term memories. However, she notes, among CDP-choline–supplemented rats, middle-aged animals "mastered the [maze learning] as readily as the young animals did." Her group is now in the process of evaluating the impact of CDP-choline on memory development in the rodent equivalent of senior citizens.

No quick fix

"The interesting thing," observes Teather, "is that if you feed the [rats the supplemented] diet for 1 month, you can't rescue memories." The animals had to get CDP-choline for at least 2 months to receive some memory protection. And that, she says, points to a mechanism for what the nutritional supplement is doing.

Teather and Wurtman theorize that it takes a month of choline supplementation for the brain to build up enough of the acetylcholine transmitter for the choline to start getting used in the synthesis of phospholipids to make more membranes. Then gradually enough extra neuronal membrane is made to be available for new memory formation. Also, they expect that humans would have to get only 500 mg per day because the human body retains choline more efficiently.

So how to get choline in the diet? See the lists below. Keep in mind that there are about 454 grams in a pound and so 100 grams of food is less than a quarter pound.

FOOD SOURCES
Food mg choline/100g
Liver, desiccated 2170
Heart, beef 1720
Brewer’s yeast 300
Nuts 220
Pulses 120
Citrus fruits 85
Bread, wholemeal 80
Bananas 44

Note the different serving sizes here.

Food/serving mg choline/serving
Beef Liver, 85 grams (3 ounces) 453.2
Egg, 61 grams (1 large) 345.0
Beef Steak, 85 grams (3 ounces) 58.5
Cauliflower, 99 grams (1/6 medium head) 43.9
Iceberg Lettuce, 89 grams, (1/6 medium head) 28.9
Peanuts, 1 ounce 28.3
Peanut Butter, 32 grams (2 Tbsp) 26.1
Grape Juice, 8 ounces 12.9
Potato, 148 grams (1 medium) 12.9
Orange, 154 grams (1 medium) 11.5
Whole Milk, 8 ounces 9.7

Someone who doesn't want to eat eggs, liver, or beef heart is probably not going to get enough choline in their diet to achieve the effect that the MIT researchers expect. The amount of nuts needed to get enough choline would add up to a lot of calories, though nuts do have other health benefits.
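To make the arithmetic concrete, here is a minimal sketch in Python of how many servings of a few foods it would take to reach the 500 mg per day figure the researchers expect humans would need. The per-serving values are copied from the table above; the script itself is just illustrative arithmetic.

```python
# Rough servings per day needed to reach a 500 mg/day choline target,
# using the per-serving figures from the table above (illustrative arithmetic only).

DAILY_TARGET_MG = 500.0

# (food, serving description, mg choline per serving)
foods = [
    ("Beef liver", "85 g (3 oz)", 453.2),
    ("Egg",        "1 large",     345.0),
    ("Beef steak", "85 g (3 oz)",  58.5),
    ("Peanuts",    "1 oz",         28.3),
    ("Whole milk", "8 oz",          9.7),
]

for name, serving, mg_per_serving in foods:
    servings_needed = DAILY_TARGET_MG / mg_per_serving
    print(f"{name:12s} ({serving}): about {servings_needed:.1f} servings/day")
```

By this rough count a single serving of liver plus an egg covers the target, while milk or peanuts alone would take an implausible number of servings, which is the point about calories made above.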

By Randall Parker 2003 November 27 03:05 PM  Brain Enhancement
Entry Permalink | Comments(0)
FBI May Start Collecting Juvenile Offender DNA

A bill currently working through the US Congress would expand the scope of FBI DNA data collection and storage.

WASHINGTON — DNA profiles from hundreds of thousands of juvenile offenders and adults arrested but not convicted of crimes could be added to the FBI's national DNA crime-fighting program under a proposed law moving through Congress.

The article reports that thirty states already collect DNA from juveniles. Some of the opposition to the spread of this practice, as compared to the already universally accepted collection of fingerprints, must stem from the fear that DNA can reveal more about the innate characteristics of a person than fingerprints can. But if that fear is justified then what drives the opposition is fear that the truth about human nature will be used to treat people who are innately different in ways that respond to those innate differences.

There is fear in the minds of many modern liberal thinkers that people will not be considered equal before the law if it is known that they have innate tendencies to behave in ways different from each other. So there is an element of "don't want to know" in the attempts to prevent information from being collected that might at some future point turn out to be useful for automatically identifying differences in innate behavioral tendencies.

Given that juveniles commit assault, murder, rape, armed robbery, and a large assortment of other crimes, and that some juveniles do so repeatedly, that part of the expansion of DNA collection does not strike me as unreasonable. It is hard to see why juveniles should be treated so differently from adults when they commit crimes every bit as brutal as those committed by adults and when juvenile criminals can pose threats as repeat offenders every bit as great as those posed by adult criminals.

Given that we do not now have the ability to analyze DNA to produce a detailed picture of genetic factors that influence behavior, the current drive to collect more DNA samples is being driven by the same purpose for which fingerprint evidence is already collected: it allows the identification of more criminals from evidence found at crime scenes. This provides a few different benefits in terms of protection of the innocent population. First, it increases the rate at which criminals are caught. This removes dangerous people from the streets and also increases the deterrent effect of the law on would-be criminals.

But more accurate identification of criminals does something else that is rarely mentioned: it decreases the rate of investigation and conviction of innocent people. Every time a criminal commits a crime there is some chance that an innocent person will be incorrectly suspected of having committed it. More accurate identification reduces the rates of false arrests (with all the stigma and costs they entail), trials in which innocents are acquitted (which must be terrible and expensive ordeals for the innocents caught up in them), and trials in which innocents are found guilty (even worse). Each crime that is correctly connected to its real perpetrator is a crime that is unlikely to involve a prosecution of an innocent. Also, if the deterrent effect of the law is heightened and more criminals are jailed, the result is that fewer crimes will be committed and hence fewer innocents will be incorrectly implicated in something they didn't do.

A reduction in crime rates reduces victimizations both by criminals and by governments. It also reduces the amount of fear and inconvenience visited upon those who live with considerable risk of becoming crime victims. Proposals for measures that will have the effect of reducing crime rates need to be weighed with all of those factors in mind.

By Randall Parker 2003 November 27 01:21 PM  Surveillance Society
Entry Permalink | Comments(2)
Geron Using hESC In Animal Spinal Cord Injury Models

The biotech company Geron Corporation is working with human embryonic stem cells (hESC) converted into oligodendroglial progenitor cells for use in treating spinal cord injuries in rats and mice.

Menlo Park, CA – November 13, 2003 – Geron Corporation (Nasdaq: GERN) today announced the presentation of results demonstrating that the transplant of cells differentiated from human embryonic stem cells (hESCs) can result in functional improvement in animals with spinal cord injuries. This work provides proof of concept of the efficacy of hESC-based therapies in spinal cord injury.

In two presentations at the Society for Neurosciences Annual Meeting in New Orleans, Dr. Hans Keirstead and his colleagues from the University of California at Irvine detailed studies demonstrating that when hESC-derived oligodendroglial progenitors were transplanted into rats that had received a thoracic spinal cord contusion injury, statistically significant improvements in the ambulatory activity of the rats could be observed approximately one month later. In these blinded studies, animals showed evidence of improved weight-bearing capacity, paw placement, tail elevation and toe clearance activity compared to injured untreated animals. Control animals that received transplants of human fibroblasts instead of oligodendroglial progenitors showed little, if any improvement.

“These results are exciting. They show that cells derived from hESCs can have therapeutic efficacy in a model of human disease,” stated Jane S. Lebkowski, Ph.D., Geron’s vice president of regenerative medicine.

In these studies, the cells were transplanted directly into the spinal cord lesions seven days after injury. Dr. Keirstead found evidence of both increased oligodendrocyte-mediated myelination and some neural sprouting upstream of the lesion in the test animals. These observations were further supported by additional transplant studies from Dr. Keirstead’s lab in which the oligodendroglial progenitors were implanted into the spinal cord of Shiverer mice, a mutant mouse that is deficient in myelin basic protein and hence lacks normal neuronal myelination. In those mice the researchers observed evidence of oligodendrocyte-mediated remyelination of nerve cell axons. No evidence of tumor formation from the transplanted cells or other adverse events was observed in any of these studies.

In a third presentation at the meeting, Dr. Keirstead and his colleagues presented data showing how hESCs can be differentiated in tissue culture to oligodendroglial progenitors, the precursors of oligodendrocytes. Oligodendrocytes are specialized neural cells that produce myelin, the protective sheath that insulates the axons of nerve cells allowing normal nerve impulse conduction. Oligodendrocytes also produce a variety of neurotropic factors which can induce the sprouting of nerve cells. In a spinal cord contusion injury, neurons that are spared during the initial injury can be demyelinated during the subsequent inflammatory response. Such demyelination can lead to decreased nerve conduction velocity and eventual death of the “denuded” axons, producing impaired sensory and motor function.

“This work demonstrates the versatility of hESCs and their potential utility for broad-based cellular therapeutics,” added Thomas B. Okarma, Ph.D., M.D., Geron’s president and chief executive officer. “In these studies, oligodendroglial progenitors were produced multiple times from the same human embryonic stem cell line over a period of months. The success of these studies and potential economies from large batch production of oligodendroglial progenitors from hESCs supports development of this potential product for the treatment of patients with acute spinal cord injury.”

Geron is now initiating formal preclinical safety and efficacy studies and is planning for scaled-up production of the cells for potential use in human clinical trials.

Suppose that within 5 or 10 years Geron has a therapy derived from hESCs ready for market for those suffering from spinal cord injuries. Suppose that therapy helps restore some spinal cord function with some degree of restored control and feeling in the lower body. The opponents of the use of hESCs for therapeutic purposes are going to be faced with a much more difficult political position as the larger public has to weigh the ethical concerns voiced by hESC therapy opponents against the very concrete ability to allow children and adults to rise from wheelchairs. My guess is that the hESC opponents will fail to keep hESC therapies off the market since they have failed to get a complete ban on the development of hESC therapies. Once a single therapy that provides a benefit as dramatic as helping crippled people walk again reaches the market the political opposition to hESC-based therapies will wither.

Geron is working on the basic nuts and bolts of being able to deliver therapies based on hESC as demonstrated by a report of their progress on developing better techniques for growing hESC in culture.

Menlo Park, CA – November 19, 2003 – Geron Corporation (Nasdaq: GERN) today announced the development of a defined, serum-free culture system for the propagation of human embryonic stem cells (hESCs). This new culture system relies solely on completely defined components for hESC growth, facilitating safe and scalable expansion of these cells for cell-based therapeutics.

In a presentation at the 2003 annual meeting of the American Institute of Chemical Engineers in San Francisco, Geron presented studies demonstrating that hESCs could be expanded in a culture medium that contains only human-sourced proteins and defined recombinant growth factors. Using these defined conditions, hESCs could be propagated for at least 120 days in culture while maintaining normal morphology, doubling time, and expression of a panel of markers characteristic of hESCs. Moreover, hESCs propagated under these conditions continued to be pluripotent, differentiating into cells representative of endoderm, mesoderm, and ectoderm, the three cell lineages of the human body.

This work extends Geron’s previous development of feeder-free growth conditions for hESCs. Geron had earlier developed protocols to culture hESCs in the absence of direct contact with feeder cells by using extracellular matrix proteins and cell-free media that had been previously conditioned by feeder cells. “This new development allows the replacement of conditioned medium with a fully defined medium that contains only human-sourced proteins and purified growth factors,” stated Jane S. Lebkowski, Ph.D., Geron’s vice president of regenerative medicine. “This advance greatly facilitates the scalable production of the cells while essentially eliminating the risk of contamination by non-human infectious agents in the culture process for undifferentiated cells.”

By Randall Parker 2003 November 27 10:49 AM  Biotech Organ Replacement
Entry Permalink | Comments(0)
Chimpanzees Provide Insights Into Human Behavior

Nicholas Wade of the New York Times has written an excellent article reviewing what is known about chimpanzee behavior and differences and similarities with human behavior. For instance, chimps are very territorial and patrol borders in order to maintain large territorial areas for gathering fruit.

In two known cases, a chimp community has wiped out all of a neighbor's males. Though the females may be absorbed into the victors' community, the basic goal seems to be getting rid of a rival rather than capturing females, since male chimps often attack strange females.

Within a community, there is a male hierarchy that is subject to what primatologists euphemistically call elections. Alpha males can lose elections when other males form alliances against them. Losing an election is a bad idea. The deposed male sometimes ends up with personal pieces torn off him and is left to die of his wounds.

But the bonobos may hold more appeal for the most militant feminists.

An intriguing variation on the chimpanzee social system is that of bonobos, which split from chimps some 1.8 million years ago. With bonobos, who live in Congo south of the Congo River, the female hierarchy is dominant to that of males, and males do not patrol the borders to kill neighbors. Though bonobos are almost as aggressive as chimps, they have developed a potent reconciliation technique — the use of sex on any and all occasions, between all ages and sexes, to abate tension and make nice.

Anyone else flash on the 1960s hippie slogan "Make love, not war"?

Stand back far enough and forget for a moment that humans are so much smarter and capable of developing very complex technologies and of discovering scientific laws. Just look at human social forms. They seem similar to those of other primates. At the same time it seems conceivable that a different sentient species could naturally favor very different structures of relationships than what humans form. It seems likely that a substantial portion of how humans organize into groups has a genetic basis.

When it becomes possible to genetically engineer human personality characteristics and behavioral tendencies different groups of humans may choose to engineer their children to be as different from each other in social behavior as are bonobos and chimpanzees. Such engineered splits in the human race may lead to wars between groups. Disagreements about values that are genetically engineered to be radically different will not be easily resolved by negotiation.

Wade's mention of elections to throw out an alpha male reminds me of Paul H. Rubin's argument that reverse dominance hierarchies from the Pleistocene era serve as the impetus for creating democracy. It could well be that the characteristic that has led to the drive for democracy can be traced all the way back to before humans and chimpanzees split off from each other roughly five to seven million years ago. See the post: Human Desire For Freedom Evolved Before We Lived In Cities

By Randall Parker 2003 November 27 12:34 AM  Trends, Human Evolution
Entry Permalink | Comments(0)
2003 November 25 Tuesday
Even Black Bears Getting Fat From Urban Living

The argument that humans are evolutionarily maladapted to urban lifestyles and fast food finds support in a report that black bears are becoming fat and lazy from eating junk food from dumpsters.

NEW YORK (Nov. 24) -- Black bears living in and around urban areas are up to a third less active and weigh up to thirty percent more than bears living in wild areas, according to a recent study by scientists from the Bronx Zoo-based Wildlife Conservation Society (WCS).

The study, published in the latest issue of the Journal of Zoology says that black bears are spending less time hunting for natural food, which can consist of everything from berries up to adult deer. Instead, they are choosing to forage in dumpsters behind fast-food restaurants, shopping centers, and suburban homes, often eating their fill in far less time than it would take to forage or hunt prey.

"Black bears in urban areas are putting on weight and doing less strenuous activities," said WCS biologist Dr. Jon Beckmann, the lead author of the study. "They're hitting the local dumpster for dinner, then calling it a day."

In addition, the authors say that urban black bears are becoming more nocturnal due to increased human activities, which bears tend to avoid. Bears are also spending less time denning than those populations living in wild areas, which the authors say is linked to garbage as a readily available food source.

The authors suggest that as humans continue to expand into wild areas, and as bears colonize urbanized regions, people must be educated to reduce potential conflicts. Local ordinances should be passed mandating bear-proof garbage containers for homes and businesses. "Black bears and people can live side-by-side, as long as bears don't become dependent on hand-outs and garbage for food," Beckmann said. "Lawmakers should take a proactive stance to ensure that these important wild animals remain part of the landscape."

This change in environment is a selective pressure. While it is not clear what traits are being selected for in bears by urban environments, it seems very likely that different traits are being selected for among urban bears than among bears that live out in the wild.

Update: I know what you are all thinking: In the face of the rising obesity of our inner city black bear population no government agencies or health professionals are taking steps to encourage the urban black bears to come to cholesterol screening clinics to see if their junk food diets are putting them at risk for heart disease. Even if they come to the clinics they have no medical insurance that includes a prescription drug benefit to pay for Lipitor. Something must be done. Also, are any black bears being screened for type II diabetes? And who, if anyone, is giving them dietary advice? They need nutritional counseling and they should be on a diet high in fibrous nuts and berries. Shouldn't the junk food industry be taxed for making cheap high calorie meals so easily available in dumpsters?

By Randall Parker 2003 November 25 02:30 AM  Evolutionary History
Entry Permalink | Comments(2)
Junk DNA Result Of Slowness Of Natural Selection

Species that replicate at a slower rate and that are fewer in number do not experience enough selective pressure to prevent junk DNA from accumulating.

Genetic mutations occur in all organisms. But since large-scale mutations -- such as the random insertion of large DNA sequences within or between genes -- are almost always bad for an organism, Lynch and University of Oregon computer scientist John Conery suggest the only way junk DNA can survive the streamlining force of natural selection is if natural selection's potency is weakened.

When populations get small, Lynch explained, natural selection becomes less efficient, which makes it possible for extraneous genetic sequences to creep into populations by mutation and stay there. In larger populations, disadvantageous mutations vanish quickly.

Most experts believe that the first eukaryotes, which were probably single-celled, appeared on Earth about 2.5 billion years ago. Multicellular eukaryotes are generally believed to have evolved about 700 million years ago. If Lynch's and Conery's explanation of why bacterial and eukaryotic genomes are so different is true, it provides new insights into the genomic characteristics of Earth's first single-celled and multicellular eukaryotes.

A general rule in nature is that the bigger the species, the less populous it is. With a few exceptions, eukaryotic cells are so big that they make most bacteria look like barnacles on the side of a dinghy. If the first eukaryotes were larger than their bacterial ancestors, as Lynch believes, then their population sizes probably went down. This decrease in eukaryote population sizes is why a burgeoning of large-scale mutations survived natural selection in the first single-celled and multicellular eukaryotes, according to Lynch and Conery.

To estimate long-term population sizes of 50 or so species for which extensive genomic data was available, Lynch and Conery examined "silent-site" mutations. Silent-site mutations are single nucleotide changes within genes that don't affect the gene product, which is a protein. Because of their unique characteristics, silent-site mutations can't be significantly influenced by natural selection. The researchers were able to calculate rough estimates of the species' long-term population sizes by assessing variation in the species' silent-site nucleotides.

Of the original group of sampled organisms, Lynch and Conery selected a subset of about 30 and calculated, for each organism, the number of genes per total genome size as well as the longevity of gene duplications per total genome size. They also calculated the approximate amount of each organism's genome taken up by DNA sequences that do not contain genes.

The researchers found that a consistent pattern emerged when genomic characteristics of bacteria and various eukaryotes were plotted against the species' total genome sizes. Bigger species, such as salmon, humans and mice, tended to have small, long-term population sizes, more genes, more junk DNA and longer-lived gene duplications. Almost without exception, the species found to have large, long-term population sizes, fewer genes, less junk DNA and shorter-lived gene duplications were bacteria.

The data suggest it is genetic drift (an evolutionary force whose main component is randomness), not natural selection, that preserves junk DNA and other extraneous genetic sequences in organisms. When population sizes are large, drift is usually overpowered by natural selection, but when population sizes are small, drift may actually supersede natural selection as the dominant evolutionary force, making it possible for weakly disadvantageous DNA sequences to accumulate.
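As an aside on the method described in the excerpt: the standard neutral-theory relation behind this kind of estimate is that silent-site nucleotide diversity is roughly 4Neμ for diploid nuclear loci (roughly 2Neμ for haploids), where Ne is the long-term effective population size and μ is the per-site mutation rate. Here is a minimal back-of-the-envelope sketch of that calculation; the numerical values are illustrative placeholders, not figures from Lynch and Conery's paper.

```python
# Back-of-the-envelope effective population size (Ne) from silent-site diversity,
# using the neutral-theory expectation pi ~= 4 * Ne * mu (diploid) or 2 * Ne * mu (haploid).
# All numbers below are illustrative placeholders, not data from the paper.

def effective_population_size(pi, mu, ploidy_factor):
    """Estimate long-term Ne from silent-site diversity pi and per-site mutation rate mu."""
    return pi / (ploidy_factor * mu)

examples = {
    "large-eukaryote-like": (0.001, 1e-8, 4),  # diploid, vertebrate-ish placeholder values
    "bacteria-like":        (0.05,  1e-9, 2),  # haploid, high-diversity placeholder values
}

for label, (pi, mu, factor) in examples.items():
    ne = effective_population_size(pi, mu, factor)
    print(f"{label:20s} Ne ~ {ne:.1e}")
```

The point of the contrast is the one the excerpt makes: the smaller the long-term effective population size, the weaker selection is relative to drift, so mildly disadvantageous insertions such as junk DNA can persist.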

Junk DNA costs energy to duplicate and to carry around as part of each cell, so natural selection operates against it. But if junk DNA gets generated by errors in replication faster than natural selection can select against it then junk DNA can accumulate.

At some point in the 21st century, barring some natural or human-caused disaster, biotechnology will advance far enough to make it possible to edit out junk sequences from cells. So it should become possible to have offspring that have far fewer junk DNA sequences. Therefore junk DNA may eventually disappear from the human species. Also, replacement organs will eventually be genetically enhanced with more beneficial variants of genes that play important roles in each organ type. It seems reasonable to expect that at least some people will opt to have their DNA edited to eliminate junk DNA sequences from cells that will be used to grow replacement organs. So even some of us who today are walking around with junk DNA will have less of it once we are able to have replacement organs grown for us.

Update: Carl Zimmer raises a number of specific objections against the idea of removing junk DNA but he also sees one point in favor of doing so: some junk DNA sections can hop around the genome and cause mutations when they embed in new locations.

There are also arguments for getting rid of junk DNA that Futurepundit doesn't mention. When mobile elements jump around to new homes, they can trigger diseases as they mutate the genome.

As for mobile elements that jump around the genome: Yes, note that this reason for removing junk DNA is especially strong in the case of stem cells that are going to be used to grow replacement organs. The cells in those replacement organs (with the exception of testes and ovaries) are not going to have their DNA passed along to progeny and therefore the ability of their junk DNA to mutate to create new environmental adaptations provides no benefit while the junk DNA does pose a mutational threat that can result in cancer and other diseases.

The effects of removing various junk sequences will be testable by producing organs without them and then seeing how those organs perform. This will be relatively less risky to experiment with in the case where humans have two of an organ. So, for instance, one could have just one kidney replaced with a junk-free kidney and then, with the other kidney still available as back-up, the functionality of the junk-free kidney could be monitored over time. The same could be done with many muscles. Replace a thigh muscle with a junk-free thigh muscle. If the thigh muscle fails the result is unlikely to be fatal. There would still be risks from such an experiment as one could imagine fatal failure modes where, for instance, an organ releases toxins or clotting factor or something else that damages some other more critical part of the body.

Next he raises the point that what seems like junk DNA might not really be junk DNA.

Junk-free genomes may indeed become possible in the future, but they're probably not a wise idea. Even if junk DNA doesn't benefit us in any obvious way, that doesn't mean that we can do without it. Many stretches of DNA encode RNA which never become proteins, but that doesn't make the RNA useless--instead, it regulates the production of other proteins. Some broken genes (known as "pseudogenes") may no longer be able to encode for proteins, but they can still help other genes produce more of their proteins

Well, my response to this is pretty simple: Yes, it is hard to be certain that some DNA section has no benefit to the cell. But suppose at some point in the future we can assign a really high probability to the idea that some chunk of DNA has no value and that it actually is far more likely to cause disease than benefit? Why not then remove it?

This reminds me of another point: Some genetic theorists make the argument that we each have dozens and perhaps hundreds of purely harmful mutations because natural selection can't select out harmful mutations as fast as they are generated during reproduction. If this argument is correct (and I believe it is) then we should also have junk DNA that is either of no value or harmful. Someone who holds this more pessimistic view of our genomes as full of flaws and parasitic DNA sections is going to be more willing to decide to throw out the suspected junk, with the view that the odds are great that the suspected junk really is junk. Of course, there's no rush here and we ought to wait a couple of decades for a lot more evidence to accumulate before acting on this belief.

Zimmer also brings up the argument that simply by making the genome bigger junk DNA may serve a useful function by keeping cells at the correct size. I'm skeptical of this argument mostly because an assortment of different kinds of intracellular components cross-react with each other in undesirable ways and turn into compounds that the cell cannot eject or destroy. As a result, cells accumulate junk and this junk accumulation robs the cells of needed space and decreases the efficiency of cells as they age. The junk also serves as a source of free radical generation. This problem with junk accumulation has even led Aubrey de Grey to argue for the transfer of lysosomal enzymes from other species into humans as a rejuvenation treatment. Analogously, genomic junk is taking up space that could be used by cells to do useful work. Get rid of it and the cells might become ever so slightly more efficient.

Next Zimmer brings up the value of junk DNA and, in particular, pseudogenes, as potential sources of future beneficial mutations:

It's on this evolutionary scale where purging junk DNA makes the least sense. The pasting and copying of junk DNA is a major source of new genetic variation. Instead of changing a nucleotide here or there, mobile elements can shuffle big stretches of DNA into new arrangements, taking regulatory switches and other genetic components and attaching them to different genes. While some of this variation may lead to diseases, it also prepares our species to adapt to new environmental challenges. (Similarly, pseudogenes that are truly broken still have the potential to become working genes again. Some scientists have proposed calling them "potogenes.")

Here's my problem with that argument: Natural selection is going to cease to be the major source of new beneficial genetic change in humans within 20 or 30 years. We are going to have our genomes changed by bioengineering. Therefore junk DNA will have no value to us as a source of variation. Going into future centuries our bioengineering techniques will advance even further and we will be able to simulate the effects of variations orders of magnitude more quickly than mutations occur naturally.

There's another point about junk DNA that especially holds for agricultural plants and animals: to the extent that junk DNA can be removed from crops and livestock, a source of variability that essentially serves as noise is removed. If someone develops some ideal dairy cow and wants to clone it he does not want jumping genes creating variations that cause some of the clones to produce less milk. Similarly, jumping genes could create variations in the growth of corn or wheat that would be undesirable.

It should be possible to grow up replacement organs in other species first and to try out junk removal in organs and whole genomes in other species before trying it out in humans. This will provide an important way to discover functional purposes served by parts of genomes that are mistakenly thought to be junk. The mechanisms by which those parts serve useful functions will then be able to be searched for in humans as well. In my view, the discovery of which sections of the genome really are junk is a technical challenge that will be solved with time. Once purely junk sections are identified with a fairly high probability of correct classification and techniques for removing it are developed it seems inevitable that more daring individuals will opt to try to have the junk removed from their replacement organs and progeny.

By Randall Parker 2003 November 25 01:39 AM  Evolutionary History
Entry Permalink | Comments(15)
2003 November 21 Friday
MIT Technique To Produce Large Numbers Of Adult Stem Cells

An MIT researcher has discovered a way to produce large numbers of adult stem cells.

CAMBRIDGE, Mass. -- In a finding that may help create unlimited quantities of therapeutically valuable adult stem cells, an MIT researcher fortified adult rat liver stem cells with a metabolite that allows them to multiply like embryonic stem cells.

In the absence of the metabolite, the cells revert to acting like normal adult stem cells, which produce differentiating cells without increasing their own numbers. Stem cells proliferating unchecked can cause cancer.

The idea here is that when an adult stem cell divides normally the result is one adult stem cell plus one differentiated cell to supply to the target tissue that needs a new cell. The problem is that for therapeutic purposes what is needed is a way to tell an adult stem cell to divide into two adult stem cells, and to have each of those divide into two more, and so on until many times more adult stem cells are available. Only then, in most therapies, would there be enough stem cells to produce the needed amount of differentiated cells for target tissues.
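To see why this matters numerically, here is a minimal sketch comparing the size of the stem cell pool under the two kinds of division kinetics. The cycle count is arbitrary, and the differentiating daughters (which, as the MIT material quoted below notes, go on to proliferate separately) are only counted once.

```python
# Compare stem cell pool size under asymmetric vs. symmetric division kinetics.
# Asymmetric: each division yields one stem cell plus one differentiating daughter,
#             so the stem cell pool never grows.
# Symmetric:  each division yields two stem cells, so the pool doubles each cycle.
# The cycle count is arbitrary and purely illustrative.

CYCLES = 10

asym_stem, asym_daughters = 1, 0   # daughters counted once; they proliferate separately
sym_stem = 1

for _ in range(CYCLES):
    asym_daughters += asym_stem    # asymmetric: one differentiating daughter per stem cell
    sym_stem *= 2                  # symmetric: the stem cell pool doubles

print(f"After {CYCLES} division cycles:")
print(f"  asymmetric: {asym_stem} stem cell(s), {asym_daughters} first-generation daughters")
print(f"  symmetric:  {sym_stem} stem cells")
```

After ten cycles the asymmetric pool is still a single stem cell while the symmetric pool has grown a thousand-fold, which is why suppressing the asymmetric kinetics, as Sherley's SACK approach described below aims to do, is the key to producing stem cells in therapeutically useful numbers.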

“If we want to do cell replacement therapy with stem cells, we have to be able to monitor them and avoid mutations that cause tumors in people,” said James L. Sherley, associate professor of biological engineering in MIT’s Biotechnology Process Engineering Center, Center for Environmental Health Science and Center for Cancer Research.

Embryonic stem cells can become virtually any human tissue or organ, offering potentially powerful treatments for damaged or diseased organs, spinal injuries, neurological diseases and more. Unlike embryonic stem cells, which exist only during early prenatal development, adult stem cells create new tissues throughout our lifetimes. Their potential to produce mature tissue cells may be limited to cells of the tissues in which they reside.

Actually, some adult stem cell types can become several different differentiated (i.e. fully specialized) cell types. But no single kind of adult stem cell can become all cell types. Though at some point in the future expect to see techniques developed that will make it possible to instruct adult stem cells to become more kinds of differentiated cell types. A simple rule to remember that FuturePundit thinks by: Matter becomes steadily more manipulable and transformable to achieve more kinds of outcomes as technology advances.

One of the problems of working with adult stem cells is that they are very rare and difficult to isolate. Researchers who attempt to grow adult stem cells in the laboratory find that they cannot increase the number of stem cells in culture, because when adult stem cells divide, they produce both new replacement stem cells and regular cells, which quickly proliferate and vastly outnumber the stem cells. Adult stem cells divide to replace themselves and create daughter cells, which either differentiate immediately or divide exponentially to produce expanded lineages of differentiating cells.

In previous work, Sherley created cells that divide the way adult stem cells do -- by hanging onto their original DNA and passing copies on to the next generation of daughter cells. The theory goes that through this unique pattern of chromosome segregation, adult stem cells avoid mutations that may arise from DNA replication errors.

DIVIDING IN TWO

Sherley has dubbed this pattern asymmetrical cell kinetics because the cells don’t divide symmetrically into two identical cells. His new approach to growing adult stem cells suppresses this asymmetrical mechanism. He calls it SACK (suppression of asymmetrical cell kinetics).

Through SACK, Sherley created a way to make cells that were dividing asymmetrically like stem cells revert to dividing symmetrically. This involves manipulating biochemical pathways regulated by the expression of the p53 gene (tied to many human cancers) by exposing cells to certain nucleotide metabolites that activate growth regulatory proteins. In the absence of the metabolites, cells are converted from asymmetric cell kinetics to symmetric cell kinetics.

When p53 is switched on, cells grow like adult stem cells. While others have attempted to alter adult stem cells genetically to force them to duplicate themselves, “what’s neat about this approach is that we are regulating the biochemistry of the cell, not changing its genetics,” Sherley said.

It sounds like he has a technique that suppresses the asymmetric kinetics regulated by p53 so that each division produces two adult stem cells.

What is needed for use in conjunction with this technique are better ways to test DNA quality noninvasively. There are many genes which, if they mutate, put a cell one step closer to becoming a cancer. To develop safe rejuvenation therapies what we need is a way to test stem cells and weed out cells that have accumulated mutations in genes that are crucial for cell division control. Then aged adult stem cell reservoirs could safely be replenished with cells that are at very low risk of going on a rampage of cancerous growth.

One other point: It would be interesting to know whether this research team tested the stem cells induced to divide in this manner to discover whether the cells became any less differentiated as a result of being induced to make two cells that are still stem cells. Would continued division of this sort result in daughter cells that do not replicate the methylation patterns and other epigenetic state that the adult stem cells started out with?

Update: The ability to replicate adult stem cells would be useful in leukemia treatments and other treatments that are already being practiced today. Keith Sullivan M.D. of Duke University explains the difficulties involved in collecting stem cells today.

Sullivan explains that there are two ways to donate stem cells. "The first is from one's own bone marrow," he says. "This typically requires an hour or two in the operating room under anesthesia to have stem cells collected by a mini-surgical procedure in the area of the hip bone.

"The other option is to collect blood for stem cells, which is not the same as simply giving blood. Stem cells are quite rare in circulating blood, so what's needed is three or four days' worth of growth factors and shots to increase the percentage of stem cells. These stem cells are then collected on a pheresis machine, which collects the stem cells and gives the red and white platelets back."

Sullivan also notes one other important source of stem cells: "If a woman is pregnant and wishes to donate some of the blood in the umbilical cord at the time of birth, these cells have the advantage of being early, undifferentiated cells. Therefore they have less potential for reactivity and adverse complications."

If stem cells could be easily replicated then just a few could be removed from the blood and grown up to large quantities. This might eliminate the need for surgery or the administration of a few days worth of growth factors.

By Randall Parker 2003 November 21 02:50 PM  Biotech Organ Replacement
Entry Permalink | Comments(0)
Steroid Use May Increase Aggressiveness Long Term

Anabolic steroid use may cause permanent changes in personality and behavior.

(11-20-03) BOSTON, Mass. – With the recent revelations about steroid use in Major League Baseball and the bust last week of several Oakland Raiders players for drug abuse, Northeastern University psychology professor Richard Melloni, who studies the link between steroid use and aggression, has recently found evidence that use of anabolic steroids may have long-term effects on players’ behavior and aggression levels well after they stop abusing these performance enhancing drugs.

With funding from the NIH (recently extended through 2008), Melloni and doctoral student Jill Grimes have been studying how steroids used during adolescence may permanently alter the brain's ability to produce serotonin. In their experiments, adolescent Syrian hamsters - given their similar brain circuitry to human adolescents – were administered doses of anabolic steroids and then measured for aggressiveness over certain periods of time.

The researchers initially hypothesized that steroid use during adolescence might permanently alter the brain's chemistry and a person's tendency toward aggression long after use has stopped. Their most recent findings, published this week in Hormones and Behavior, enabled them to confirm this hypothesis and conclude that there is indeed a lengthy price – namely long-term aggression – to pay for drug abuse even after the ingestion of steroids ceases. “We know testosterone or steroids affect the development of serotonin nerve cells, which, in turn, decreases serotonin availability in the brain,” Melloni says. “The serotonin neural system is still developing during adolescence and the use of anabolic steroids during this critical period appears to have immediate and longer-term neural and behavioral consequences. What we know at this point is that aggressiveness doesn’t simply cease after the ingestion of steroids does.”

Based on this research, Melloni also believes that athletes who abuse steroids may also be inclined toward aggressive behavior long after their drug abuse – and musculature – have waned.

The press release doesn't detail their findings unfortunately. But the claims here are at least plausible. See my previous posts on reports about drugs and environmental influences causing changes to brain development: Drugs And Stress Have Variety Of Effects On Brain Development and Adolescence Is Tough On The Brain.

By Randall Parker 2003 November 21 01:20 PM  Brain Development
Entry Permalink | Comments(3)
2003 November 20 Thursday
Donald Kennedy: Brain Scan Privacy Should Be Protected

Donald Kennedy, editor of the journal Science, calls for brain scans to be given privacy protections equal to DNA sequences.

America recently passed legislation preventing businesses from obtaining customers' DNA amid fears they could use it to discriminate against those deemed more risky. In Britain a moratorium is in place to prevent companies from accessing customers' genetic material.

Prof Kennedy told the Guardian: "There's a push to prevent genetic information being used by companies for adverse selection, and at least equal protection should be given to brain scan data."

What makes brain scans seem special is that scanning techniques may eventually provide insights into personality type, behavioral tendencies, and, when conducted in concert with appropriate environmental stimuli, they might even eventually provide insights into beliefs and memories. For instance, brain scans might eventually be usable as part of a better lie detector test. The desire to keep one's own thoughts secret is certainly a reason to place some sort of restrictions on what insurance companies can get access to. But most brain scans are made for more mundane purposes that make them little different from scans done in other parts of the body: to discover tumors, clots, leaks in arteries, and other medical problems. A blanket ban on insurance company access to brain scans is no more or less justified than a blanket ban on insurance company access to chest scans.

Can all that much about a person's thoughts be divined from a brain scan? Razib of Gene Expression has pointed to a pretty good response to the recent report about being able to detect racism with fMRI (functional Magnetic Resonance Imaging) brain scans. In that response Carl Zimmer says that there is a lot of subjective judgement involved in interpreting brain scans and we should take such reports with a grain of salt. In time more rigorous brain scan studies linking scan results to beliefs and feelings will be done with appropriate double-blind controls and larger groups of patients, even as brain scanning machines become more sensitive, accurate, and cheaper to operate and interpret. The ability to use fMRI brain scans to learn more about a person's thoughts will no doubt improve with time. But is that single application of brain scans a reason to restrict insurance company access to all brain scan results? Couldn't insurance companies just be banned from accessing brain scans that are done to study beliefs and feelings?

Is there anything about brain scan results that makes them logically equivalent to DNA sequence information as far as insurance companies are concerned? Remember that the big problem with DNA tests is that they will eventually provide a great deal of insight about the long term risks that each person has for various diseases. Brain scans may eventually do that if they can, for instance, detect the early stages of Alzheimer's Disease decades before disease symptoms become noticeable. But the same may turn out to be true of blood tests that may eventually be able to predict Alzheimer's risks and other neurological disease risks decades in advance.

The early detection of neurological diseases is part of a larger trend that is resulting from the broad advance of medical testing in general. Look at how cholesterol tests have become increasingly more refined as the single number for blood cholesterol has been broken down into HDL, LDL, and other components, and now even subtypes of HDL are being discovered while other potential risk factors such as C Reactive Protein (CRP) are being investigated. Also, scans to detect plaque build-up and artery and heart abnormalities have steadily become more accurate and useful. The sensitivity of a broad range of biological tests is going to continue to advance to make it increasingly easier to detect a large variety of diseases and disease risk factors at progressively earlier stages.

At first glance, what might seem to make DNA tests different than other types of tests - including brain scans - is that DNA tests will be able to provide an assessment of many health risks before any sort of disease process has even begun. For instance, a female baby at birth will be able to be scored for breast cancer risk before the baby has even gone through puberty to grow breasts that can become cancerous in the first place. But DNA sequence tests are not unique in their ability to detect disease risks decades before diseases develop. For instance, there are events that happen during development that cause variations in outcomes by changing epigenetic programming in various parts of the body. Epigenetic information tests will also eventually become available. One way to respond to this would be to extend genetic testing privacy laws to encompass epigenetic testing results as well.

But other ways to detect differences in developmental outcomes will also be developed. For instance, advanced imaging techniques may be able to measure the relative sizes and details of the shapes and activity of glands and organs. From those scans it may be possible to calculate risks for glandular disorders and organ disorders. Imaging and other sensing techniques may be able to detect heart problems decades before they become life-threatening. There does not appear to be a clear dividing line between health risks detected well in advance of disease using genetic testing and risks detected in advance using other kinds of tests.

The problem posed by advances in medical testing for insurance is not limited to DNA testing or even DNA testing plus brain scans. Costs of tests will fall, newer and less onerous tests will be introduced, and existing tests will become more sensitive even as more sophisticated and automated methods will be developed to analyse test results and use them to more accurately predict the development of future health problems. People will therefore discover more health risks at much earlier stages of their lives. This will cause those at greater risk of diseases to seek more medical insurance while those at less risk will buy less insurance. Bans on insurance company access to medical test results will not prevent this problem from developing because the high risk buyers of insurance will buy more while low risk buyers buy less. Insurance companies will have fewer healthy customers and more unhealthy customers.

Update: There is an important and beneficial way that early disease risk identification can actually improve the workings of the medical insurance market: If insurance companies are allowed to know as much about the health risks of insurance applicants as insurance applicants know about themselves then insurance companies will eventually offer policies contingent upon the applicants getting certain treatments in advance or continually in order to maintain coverage. Look at elevated cholesterol for example. It would make some sense for an insurance company to require a 50 year old with elevated cholesterol to take Lipitor and/or to go on a cholesterol lowering diet to lower cholesterol below some target point as a condition of coverage. One can even imagine a sliding scale of premiums based on cholesterol test results.
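To make the sliding-scale idea concrete, here is a minimal sketch in Python of what such a premium schedule might look like. The LDL thresholds, discounts, and surcharges are invented for illustration and are not drawn from any actual insurer's pricing.

    # Purely illustrative sliding-scale premium keyed to an LDL cholesterol test.
    # All thresholds and percentages below are invented numbers, not real pricing.

    def annual_premium(base_premium, ldl_mg_dl, on_statin):
        """Scale a base premium up or down according to LDL cholesterol level."""
        if ldl_mg_dl < 100:        # optimal LDL: small discount
            factor = 0.90
        elif ldl_mg_dl < 130:      # near-optimal: standard rate
            factor = 1.00
        elif ldl_mg_dl < 160:      # borderline-high: surcharge
            factor = 1.15
        else:                      # high LDL: larger surcharge
            factor = 1.35
        if on_statin and ldl_mg_dl >= 130:
            factor -= 0.05         # partial credit for complying with treatment
        return base_premium * factor

    print(round(annual_premium(3000.0, 155, on_statin=True), 2))   # 3300.0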

As more health risks become identifiable at early stages and as more treatments are developed to reduce specific risks the incentive for the insurance companies is going to be to require treatments as a condition of coverage. The insurance companies may require that the applicant pay for the risk-reducing treatments. Some treatments, such as cholesterol lowering drugs, may need to be taken continually for years. But in the future real "fix it" treatments will become available. For instance, a more permanent way to fix elevated cholesterol would be to do gene therapy on the liver so that it produces different quantities of the lipoproteins that carry various forms of cholesterol in the blood. Basically, change the DNA programming of a high risk person's liver to make it function more like the liver of a very low risk person.

This approach of providing incentives for risk reduction could be expanded in all sorts of ways. For instance, as various forms of medical tests become cheaper and easier to do imagine periodic testing to measure how well each person is nourished and how much stress a person is under. A person pursuing a lifestyle that causes less wear and tear on the body ought to be able to pay lower insurance premiums than a person who chooses a diet and lives under conditions that pose greater health risks.

This approach of pricing more accurately to risks has obvious precedents in other insurance markets where, for instance, insurance companies offer lower rates if fire detection and fire fighting equipment is installed and where structures are inspected and modified to be less likely to catch fire in the first place.

Update: A few recent reports illustrate how medical testing advances will allow progressively earlier identification of diseases and disease risks. First off, MRI brain scans can identify those at risk of Alzheimer's Disease several years before clinical symptoms appear.

Using a new technique to measure the volume of the brain, they were able to identify healthy individuals who would later develop memory impairment, a symptom associated with a high risk for future Alzheimer's disease. The study is published in the December issue of the journal Radiology.

In the small study, led by Henry Rusinek, Ph.D., Associate Professor of Radiology at NYU School of Medicine, the researchers used MRI scans and a computational formula to measure a region of the brain called the medial-temporal lobe over a period of two years. This area contains the hippocampus and the entorhinal cortex, key structures allied with learning and memory. The researchers found that each year, this region of the brain shrank considerably more in people who developed memory problems compared with people who didn't. The medial-temporal lobe holds about 30 cubic centimeters -- the equivalent of one-sixth of a cup -- of brain matter in each hemisphere of the brain.

"With our findings, we now know that the normal healthy brain undergoes a predictable shrinkage that can be used to help recognize Alzheimer's several years before clinical symptoms emerge," says Dr. Rusinek. "We believe this is the first MRI study to report these findings in healthy people, but it is only the first demonstration that extremely early diagnosis is possible, and the technique still requires additional work before it is ready for the clinic," he adds.

The technique was about 90 percent accurate, meaning that it correctly predicted cognitive decline in nine out of 10 people, and it also correctly identified 90 percent of those whose memories would remain normal for their age.

However, the study only involved 45 people; future studies need to ascertain whether the technique would be as accurate in a much larger pool of subjects. In addition, it remains to be shown whether other neurodegenerative diseases that affect the aging brain can also be accurately identified with this technique.
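The two 90 percent figures above are, in effect, the sensitivity and specificity of the prediction. Here is a minimal sketch of the arithmetic; the press release does not say exactly how the 45 subjects split into decliners and non-decliners, so the 10/35 split and the counts of correct calls are assumptions made only for illustration.

    # Sensitivity/specificity arithmetic for a prediction study like the one above.
    # The 10/35 split of the 45 subjects and the correct-call counts are assumed
    # for illustration; the press release gives only the rounded 90 percent figures.

    declined, stayed_normal = 10, 35      # assumed split of the 45 subjects
    correct_decline_calls = 9             # "nine out of 10" decliners caught
    correct_normal_calls = 31             # roughly 90 percent of the 35 non-decliners

    sensitivity = correct_decline_calls / declined        # fraction of decliners caught
    specificity = correct_normal_calls / stayed_normal    # fraction of non-decliners cleared
    print(f"sensitivity={sensitivity:.0%}, specificity={specificity:.0%}")
    # sensitivity=90%, specificity=89%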

In another study blood pressure and C-reactive protein combine to more accurately predict stroke risk.

When levels of both blood pressure and C-reactive protein (CRP) were elevated, the risk of future heart attack and stroke increased as much as eight times, researchers report in the Nov. 25 issue of Circulation.

"What our study shows is that, at all levels of blood pressure, knowledge of CRP levels greatly improves our ability to predict which patients are at very high risk," said Dr. Paul Ridker, the director of the Center for Cardiovascular Disease Prevention at Brigham and Women's Hospital in Boston and the senior author of the study.

In yet another study Duke University Medical Center researcher Jason Allen has found evidence suggesting that nitric oxide metabolite levels are inversely associated with cardiovascular disease risks.

"First, it appears that a nitric oxide metabolite measured in the blood after exercise may discriminate between healthy patients and those with cardiovascular disease and is related with a physiological response of the artery diameter," Allen said. "Also, these biochemical and physiological markers can be positively influenced by exercise in patients who are at risk for cardiovascular disease."

The number of tests for health risks and the accuracy of the predictions made from test results will steadily increase. DNA sequence testing will be just one of many kinds of tests that will be used to more accurately predict health risks.

By Randall Parker 2003 November 20 01:01 PM  Biotech Society
Entry Permalink | Comments(0)
2003 November 19 Wednesday
On The Declining Costs Of DNA Sequencing

DNA sequencing costs have already dropped by orders of magnitude and promise to drop by many more orders of magnitude in the future. (The Scientist website, free registration required - and an excellent site worth the time to register)

Sequencing costs have dropped several orders of magnitude, from $10 per finished base in 1990 to today's cost, which Green estimates at about 5 or 6 cents per base for finished sequence and about 2 to 4 cents for draft sequence. For some comparisons, draft sequence is adequate. Last spring NHGRI projected future cost at about a cent per finished base by 2005.

Although the plummeting price of sequencing is welcome, it is due to incremental improvements on the basic technology. “What we're all praying for is one of those great breakthroughs—a new technology that will allow us to read single-molecule sequences, or whatever the trick is going to be that will give us several orders of magnitude increase in speed and reduced cost,” Robertson said.

The article quotes Eric Green, scientific director of the National Human Genome Research Institute (NHGRI), as saying that it costs about $50 million to $100 million to sequence each vertebrate species. This is an argument for spending a lot more money on basic research in areas such as microfluidics and nanopore technology that will lead to orders of magnitude cheaper DNA sequencing technologies.
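To put those per-base prices in whole-genome terms, here is a quick back-of-the-envelope calculation in Python. It assumes a human-sized genome of roughly 3 billion bases and uses midpoints of the quoted cost ranges; note how the draft-sequence figure lands in the same ballpark as Green's $50 to $100 million per vertebrate estimate.

    # Rough genome-cost arithmetic from the per-base figures quoted above,
    # assuming a human-sized genome of roughly 3 billion base pairs.

    genome_bases = 3_000_000_000

    per_base_costs = [
        ("1990 finished sequence", 10.00),   # $10 per finished base
        ("2003 finished sequence", 0.055),   # 5-6 cents, midpoint
        ("2003 draft sequence",    0.03),    # 2-4 cents, midpoint
        ("2005 projected",         0.01),    # about a cent per finished base
    ]

    for label, cost in per_base_costs:
        print(f"{label}: ${genome_bases * cost:,.0f} per genome")

    # 1990 finished sequence: $30,000,000,000 per genome
    # 2003 finished sequence: $165,000,000 per genome
    # 2003 draft sequence: $90,000,000 per genome
    # 2005 projected: $30,000,000 per genome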

The lengthy NHGRI vision statement includes a section on dreams of future basic biotechnology advances that would be of particular value. (also see same article here)

During the course of the NHGRI's planning discussions, other ideas were raised about analogous 'technological leaps' that seem so far off as to be almost fictional but which, if they could be achieved, would revolutionize biomedical research and clinical practice.

The following is not intended to be an exhaustive list, but to provoke creative dreaming:

  • the ability to determine a genotype at very low cost, allowing an association study in which 2,000 individuals could be screened with about 400,000 genetic markers for $10,000 or less;
  • the ability to sequence DNA at costs that are lower by four to five orders of magnitude than the current cost, allowing a human genome to be sequenced for $1,000 or less;
  • the ability to synthesize long DNA molecules at high accuracy for $0.01 per base, allowing the synthesis of gene-sized pieces of DNA of any sequence for between $10 and $10,000;
  • the ability to determine the methylation status of all the DNA in a single cell; and
  • the ability to monitor the state of all proteins in a single cell in a single experiment.
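The arithmetic implicit in those cost targets is worth making explicit. Here is a short sketch; it assumes a roughly 3-billion-base human genome for the second item, and the per-unit figures are simply the listed totals divided out.

    # Unit costs implied by the NHGRI "dream" targets listed above.

    # Association study: 2,000 people x 400,000 markers for $10,000 or less.
    genotypes = 2_000 * 400_000
    print(f"cost per genotype: ${10_000 / genotypes:.7f}")        # $0.0000125

    # The $1,000 genome: implied cost per base for a ~3-billion-base genome.
    print(f"cost per base: ${1_000 / 3_000_000_000:.2e}")         # ~$3.3e-07 per base

    # DNA synthesis at $0.01 per base: the $10-$10,000 range corresponds to
    # pieces from about 1,000 up to about 1,000,000 bases long.
    for length in (1_000, 1_000_000):
        print(f"{length:,} bases: ${length * 0.01:,.0f}")          # $10 and $10,000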

Methylation of DNA is used by cells to regulate gene expression. The DNA methylation pattern in a cell is part of the epigenetic state of the cell, which is basically the information state of the cell outside of the primary DNA sequence. The ability to determine the methylation state of all the DNA in a cell would be incredibly valuable for understanding cell differentiation and that, in turn, would be incredibly valuable for the development of cell therapies and for the development of the ability to grow replacement organs. Plus, the ability to read DNA methylation patterns in cancer cells would be very valuable for understanding the changes that cause cells to become cancerous. Methylation pattern changes may be essential for carcinogenesis in some or all cancer types. The development of the ability to reverse a methylation pattern change may eventually be useful as an anti-cancer treatment.

Speedier and cheaper bioassay technologies have a great many applications both in research and in clinical treatment. For instance, lab-on-a-chip technology promises to allow instant diagnosis of bacterial infections and more precise and lower cost choices of antibiotics to treat them.

ST's polymerase chain-reaction-on-chip system, announced at last year's Chips to Hits conference, is a good example of the type of heterogeneous integration required in this new field. The chip contains microfluidic channels and reaction chambers heated with electronic resistors. DNA samples are amplified on chip in the chambers and piped to a DNA sample array for optical analysis.

"We want to transfer the complexity of large-scale labs onto these chips and use volume manufacturing to reduce cost," LoPriore said. Once it is in volume production, the MobiDiag system will replace lab equipment costing more than $10,000 with a small-scale unit costing only a few thousand, he said.

Equally significant is the short response time of the diagnostic system. Today, a patient with an unspecified infection needs to take a broad-spectrum antibiotic until a diagnosis is obtained that allows a switch to a narrow-spectrum antibiotic. With the new system, the doctor would know immediately which pathogen to target. "This could save billions of dollars per year, just in the cost of antibiotics," LoPriore said.

A lot of press reports are dedicated to reporting various discoveries about such things as how cells work, whether some drug works, or what is the best diet. But what excites me the most are advances in instrumentation and assay technologies. With a much better set of tools all discoveries could be made much sooner and with less effort and all treatments could be developed much more rapidly.

By Randall Parker 2003 November 19 12:12 PM  Biotech Advance Rates
Entry Permalink | Comments(3)
Human Population May Add 2.6 Billion By 2050

Current projections put the human population at 8.9 billion people on planet Earth by 2050.

It took from the beginning of time until 1950 to put the first 2.5 billion people on the planet. Yet in the next half-century, an increase that exceeds the total population of the world in 1950 will occur.

So writes Joel E. Cohen, Ph.D., Dr.P.H., professor and head of the Laboratory of Populations at The Rockefeller University and Columbia University, in a Viewpoint article in the November 14 issue of the journal Science.

In "Human Population: The Next Half-Century," Cohen examines the history of human population and how it might change by the year 2050. By then, the earth's present population of 6.3 billion is estimated to grow by 2.6 billion.

...

In the Science article, Cohen reports such statistical information as the following:

  • history of human population: It took from the beginning of time until about 1927 to put the first 2 billion people on the planet; less than 50 years to add the next 2 billion people (by 1974); and just 25 years to add the next 2 billion (by 1999). In the most recent 40 years, the population doubled.
  • birth rates: The global total fertility rate fell from five children per woman per lifetime in 1950 to 2.7 children in 2000, a result of worldwide efforts to make contraception and reproductive health services available, as well as other cultural changes. Encouraging as this is, if fertility remains at present levels instead of continuing to decline, the population would grow to 12.8 billion by 2050 instead of the projected 8.9 billion.
  • urbanization: In 1800, roughly 2 percent of people lived in cities; in 1900, 12 percent; in 2000, more than 47 percent. In 1900, not one metropolitan region had 10 million people or more. By 1950, one region did -- New York. In 2000, 19 urban regions had 10 million people or more. Of those 19, only four (Tokyo, Osaka, New York, and Los Angeles) were in industrialized countries.
  • poor, underdeveloped regions:
  • population density: The world's average population density is expected to rise from 45 people per square kilometer in the year 2000 to 66 people per square kilometer by 2050. Assuming 10 percent of land is arable, population densities per unit of arable land will be roughly 10 times higher, posing unprecedented problems of land use and preservation for the developing world.
  • aging population: The 20th century will probably be the last when younger people outnumbered older ones. By 2050, there will be 2.5 people aged 60 years or older for every child 4 years old or younger, a shift that has serious implications for health care spending for the young and old.
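The density figures in the list above follow from simple division; a quick sketch using the article's 10 percent arable-land assumption shows how modest-looking global averages translate into much higher densities on the land that actually grows food.

    # Back-of-the-envelope check on the population density figures quoted above.

    density_2000, density_2050 = 45, 66   # people per square kilometer, global average
    arable_fraction = 0.10                # assumption used in the article

    print(density_2000 / arable_fraction)                 # 450.0 people per arable km^2 in 2000
    print(density_2050 / arable_fraction)                 # 660.0 people per arable km^2 in 2050
    print(f"{density_2050 / density_2000 - 1:.0%} rise")  # 47% rise in density by 2050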

Although it is not possible to predict how global demographics will affect families or international migration, Cohen points out that three factors set the stage for major changes in families: fertility falling to very low levels; increasing longevity; and changing mores of marriage, cohabitation and divorce.

In a population with one child per family, no children have siblings, Cohen explains. In the next generation, the children of those children have no cousins, aunts, or uncles.

If people are between ages 20 and 30 on the average when they have children and live to 80 years of age, they will have decades of life after their children have reached adulthood, and their children will have decades of life with elderly parents, Cohen also points out.

If family sizes shrink in the Middle East one consequence will be to reduce tribalism. But if life expectancy increases dramatically then the tribal bonds may continue for a somewhat longer period of time due to intergenerational bonds and because the older generations, who still have siblings and cousins galore, will stick around longer.

All else equal, the political value of having a larger population is that it makes a country potentially stronger as a result of having more workers and also greater economies of scale. But the costs of crowding, pollution, and burden on the environment rise as well (and suburbs and freeways stretching as far as the eye can see are esthetically undesirable to most of us). For a country that wants to compete in terms of power and influence, and given that the productivity of workers varies literally by orders of magnitude, it makes much more sense to have a much smaller increase in population and to make that increase much more heavily weighted toward people who have very high economic productivity.

Productive potential is a function of innate cognitive ability, training, and motivation. A large raw increase in population decreases the amount of resources available to train each new member of a society. It is more cost-effective to add people who have much higher cognitive ability because:

  • smarter folks are faster learners and are quicker and therefore cheaper to train.
  • smarter folks can do brain work more rapidly.
  • smarter folks can do types of brain work that are beyond the ability of lesser minds.

The value of physical labor continues to decline relative to the value of complex mental work as the sum total of all knowledge increases and provides a larger body of information which can be manipulated to create economic value. But even the value of having a smarter brain may eventually be made obsolete by technological advances.

If computers become smarter than humans then the economic value of having a larger number of smarter and more productive humans will eventually pale next to value of having smart artificially intelligent computers.

By Randall Parker 2003 November 19 12:17 AM  Trends Demographic
Entry Permalink | Comments(8)
2003 November 18 Tuesday
Stem Cells On Spinal Cord Injury Opened Connection To Brain

A Brazilian team took blood stem cells from patients suffering from paralysis and delivered the stem cells to an artery running to the site of injury.

The researchers harvested stem cells from the patients' blood, and reintroduced them into the artery supplying the area which was damaged.

Electrical nerve signals evoked in the part of the body below the injury site could be detected showing up as brain activity.

"Two to six months after treatment, we found that some patients were showing signs of responding to somatosensory evoked potential tests," says Barros.

This approach produced a measurable result in 12 out of 30 patients.

A team from the University of São Paulo in Brazil, led by Tarciscio Barros, said after treatment 12 out of 30 patients responded to electrical stimulation of their paralysed limbs.

Keep in mind that these patients did not get a large amount of function restored. They had to be tested to see a measurable difference.

The published accounts of this work do not provide enough details to be able to make any guess as to the quality of the work. Two obvious questions come to mind: Were the somatosensory evoked potential tests also performed on all the patients before they had the stem cell treatment? Also, were any controls used to compare the difference between getting and not getting the treatment?

If there really was a benefit then the question is why? The mechanism of the effect is unknown at this time. The stem cells might be delivering growth factors to the existing nerve cells to cause them to recover and grow. Or the stem cells might be merging with the nerve cells and, as a result, enhancing their functionality. Or the stem cells might be differentiating into nerve cells that bridge the gap of the injury site.

By Randall Parker 2003 November 18 10:03 PM  Biotech Organ Replacement
Entry Permalink | Comments(12)
2003 November 17 Monday
CIA Panel On Our Darker Bioweapons Future

The Federation of American Scientists has put up a file containing a rather ominous report released by the CIA Office of Transnational Issues entitled The Darker Bioweapons Future. (PDF format)

A panel of life science experts convened for the Strategic Assessments Group by the National Academy of Sciences concluded that advances in biotechnology, coupled with the difficulty in detecting nefarious biological activity, have the potential to create a much more dangerous biological warfare (BW) threat. The panel noted:

  • The effects of some of these engineered biological agents could be worse than any disease known to man.
  • The genomic revolution is pushing biotechnology into an explosive growth phase. Panelists asserted that the resulting wave front of knowledge will evolve rapidly and be so broad, complex, and widely available to the public that traditional intelligence means for monitoring WMD development could prove inadequate to deal with the threat from these advanced biological weapons.
  • Detection of related activities, particularly the development of novel bioengineered pathogens, will depend increasingly on more specific human intelligence and, argued panelists, will necessitate a closer - and perhaps qualitatively different - working relationship between the intelligence and biological sciences communities.

The Threat From Advanced BW

In the last several decades, the world has witnessed a knowledge explosion in the life sciences based on an understanding of genes and how they work. According to panel members, practical applications of this new and burgeoning knowledge base will accelerate dramatically and unpredictably:

  • As one expert remarked: "In the life sciences, we now are where information technology was in the 1960s; more than any other science, it will revolutionize the 21st century."

Growing understanding of the complex biochemical pathways that underlie life processes has the potential to enable a class of new, more virulent biological agents engineered to attack distinct biochemical pathways and elicit specific effects, claimed panel members. The same science that may cure some of our worst diseases could be used to create the world's most frightening weapons.

The know-how to develop some of these weapons already exists. For example:

  • Australian researchers recently inadvertently showed that the virulence of mousepox virus can be significantly enhanced by the incorporation of a standard immunoregulator gene, a technique that could be applied to other naturally occurring pathogens such as anthrax or smallpox, greatly increasing their lethality.
  • Indeed, other biologists have synthesized a key smallpox viral protein and shown its effectiveness in blocking critical aspects of the human immune response.
  • A team of biologists recently created a polio virus in vitro from scratch.

According to the scientists convened, other classes of unconventional pathogens that may arise over the next decade and beyond include binary BW agents that only become effective when two components are combined (a particularly insidious example would be a mild pathogen that when combined with its antidote becomes virulent); "designer" BW agents created to be antibiotic resistant or to evade an immune response; weaponized gene therapy vectors that effect permanent change in the victim's genetic makeup; or a "stealth" virus, which could lie dormant inside the victim for an extended period before being triggered. For example, one panelist cited the possibility of a stealth virus attack that could cripple a large portion of people in their forties with severe arthritis, concealing its hostile origin and leaving a country with massive health and economic problems.

According to experts, the biotechnology underlying the development of advanced biological agents is likely to advance very rapidly, causing a diverse and elusive threat spectrum. The resulting diversity of new BW agents could enable such a broad range of attack scenarios that it would be virtually impossible to anticipate and defend against, they say. As a result, there could be a considerable lag time in developing effective biodefense measures.

However, effective countermeasures, once developed, could be leveraged against a range of BW agents, asserted attendees, citing current research aimed at developing protocols for augmenting common elements of the body's response to disease, rather than treating individual diseases. Such treatments could strengthen our defense against attacks by ABW agents.

They cited the pace, breadth, and volume of the evolving bioscience knowledge base, coupled with its dual-use nature and the fact that most is publicly available via electronic means and very hard to track, as the driving forces for enhanced cooperation. Most panelists agreed that the US life sciences research community was more or less "over its Vietnam-era distrust" of the national security establishment and would be open to more collaboration.

  • One possibility, they argued, might be early government assistance to life sciences community efforts to develop its own "standards and norms" intended to differentiate between "legitimate" and "illegitimate" research, efforts recently initiated by the US biological sciences community.
  • A more comprehensive vision articulated by one panelist was for the bioscience community at large to aid the government by acting as "a living sensor web" - at international conferences, in university labs, and through informal networks - to identify and alert it to new technical advances with weaponization potential. The workshop did not discuss the legal or regulatory implications of any such changes.

Attempts to prevent the spread of nuclear weapons technology are already glaringly inadequate and are failing (and also see here). Efforts to prevent bioweapons development will fare even worse because the "footprint" of a bioweapons development effort can be made far smaller than that of a nuclear weapons development effort. The development of microfluidics devices and nanotechnology holds the potential to revolutionize medical research, disease treatment, and the development of rejuvenation therapies. But those same technologies also will make it easier to use biotech for nefarious purposes.

Illustrating the speed with which biotechnology is advancing to create new bioterrorism threats is a recent announcement by Craig Venter and his Institute for Biological Energy Alternatives that they have synthetically created working copies of the known existing bacteriophage virus Phi X174.

Scientists at the Institute for Biological Energy Alternatives (IBEA) in Rockville, Maryland , announced their findings, along with the Secretary of the Department of Energy (DOE), Spencer Abraham, at a press conference Thursday in Washington, D.C. DOE funded the research.

J. Craig Venter, president of IBEA, led the research, working with longtime collaborators Nobel Laureate Hamilton O. Smith of IBEA and Clyde A. Hutchinson of the University of North Carolina, Chapel Hill. Venter and Smith were principal collaborators on sequencing the human genome. Smith, in his 70s, and Hutchinson, in his 60s, pulled all-nighters “just like post-docs” to create the genome in record time, said Venter.

Venter and his colleagues created the genome of a virus that infects bacteria but is harmless to humans. The genome of this particular virus, called phi X, was already a bit of a celebrity in the world of genomics. In 1978, it was the first virus ever sequenced. It has been extensively studied in the laboratory since the 1950s.

In 14 days, the researchers created the artificial phi X by piecing together synthetic DNA ordered from a biotechnology company. They used a technique called polymerase cycle assembly (PCA) to link the strands of DNA together.

Venter says techniques involved are like working with Lego building blocks.

As a demonstration, researchers at the Institute for Biological Energy Alternatives announced yesterday that they had created a simple virus in just 14 days by stitching together strands of synthetic DNA purchased through the mail.

"You can envision this like building something out of Legos," said IBEA President J. Craig Venter, who led the race to decode human DNA before joining the effort to build organisms from scratch.

The Phi X174 virus is 5,386 DNA letters long.

The team used enzymes to glue the oligonucleotides together accurately into the complete 5,386-base genetic strand, and to copy it many times. When the synthetic viral genome was injected into bacteria, the bacterial cell's machinery read the instructions and created fully fledged viruses.

By contrast, the earlier project to synthesize the 7,500-base-long poliovirus took two years, and the resulting poliovirus had errors in its DNA sequence.

Other researchers had previously synthesised the poliovirus, which is slightly bigger, employing enzymes usually found in cells. But this effort took years to achieve and produced viruses with defects in their code.

So the timescale to make a virus has shifted from years to weeks. There are other, bigger viruses that would require more time to assemble. The biggest viruses are about 400,000 base pairs long. HIV contains about 10,000 base pairs (10 kbp, where kilo means thousand), hepatitis B about 3,000, human cytomegalovirus about 230 kbp, and influenza about 12 kbp. By contrast the E. coli bacterium is about 4 million base pairs, the bacterium that causes tuberculosis is 4,411,532 base pairs (bp), and the bacterium that causes leprosy is 3,268,203 bp. So building artificial bacteria from scratch is a much bigger job. But keep in mind that 12 kbp number for influenza. Individual influenza strains have killed tens of millions of people. Imagine a bioengineered influenza attack that unleashed many deadly strains at once. The results for the human race would be catastrophic.
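To put those numbers in perspective, here is a quick calculation of how much more DNA a bacterial genome represents compared with phi X174, along with the rough synthesis rates implied by the two projects (taking "two years" for the poliovirus work as roughly 730 days):

    # Scale comparison based on the genome sizes mentioned above, plus the rough
    # synthesis rates implied by the phi X174 and poliovirus projects.

    phi_x_bases, polio_bases = 5_386, 7_500
    e_coli_bases, tb_bases = 4_000_000, 4_411_532

    print(f"E. coli vs phi X174: ~{e_coli_bases / phi_x_bases:,.0f} times more DNA")
    print(f"TB bacterium vs phi X174: ~{tb_bases / phi_x_bases:,.0f} times more DNA")
    print(f"phi X174 project: ~{phi_x_bases / 14:,.0f} bases assembled per day")
    print(f"poliovirus project: ~{polio_bases / 730:,.0f} bases assembled per day")

    # E. coli vs phi X174: ~743 times more DNA
    # TB bacterium vs phi X174: ~819 times more DNA
    # phi X174 project: ~385 bases assembled per day
    # poliovirus project: ~10 bases assembled per day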

The research results for how to rapidly make a synthetic virus were held up by a security review that reached all the way to the White House.

Before the work was publicized, officials at the Department of Energy consulted with the White House and the Department of Homeland Security to make sure there were no security concerns. And the paper describing the results, which will be published in the Proceedings of the National Academy of Sciences, was subjected to an extra level of scientific review, according to Venter, who heads the Institute for Biological Energy Alternatives in Rockville, Md., where the work was done.

Note from the previous article that Venter thinks his team could make a bacterium with a genome about 60 times larger from scratch within about a year of starting.

The Venter team used improvements of a process called polymerase cycling assembly (PCA) to achieve the fast construction time. This technique could also be applied to dangerous viruses whose sequences are known.

But the DNA sequences of several nasty viruses, including smallpox, are now known and publicly available. And as one of the team observed, the entry proteins for smallpox might be provided by a related but harmless virus. Let’s hope nobody tries.

The debate about whether to destroy smallpox virus stocks is pointless because any virus or bacteria whose DNA sequence is published is eventually going to be easily creatable by labs all around the world.

By Randall Parker 2003 November 17 04:50 PM  Dangers Biowarfare
Entry Permalink | Comments(0)
White Minds React To Black Faces With fMRI Watching

Picture a politically correct job interview in which both participants get their brains recorded while they go through the interview.

"We were surprised to find that brain activity in response to faces of black individuals predicted how research participants performed on cognitive tasks after actual interracial interactions," says Jennifer Richeson, Assistant Professor of Psychological and Brain Sciences, the lead author on the paper. "To my knowledge, this is the first study to use brain imaging data in tandem with more standard behavioral data to test a social psychological theory."

Their findings suggest that harboring racial bias, however unintentional, makes negotiating interracial interactions more cognitively demanding. Similar to the depletion of a muscle after intensive exercise, the data suggest that the demands of the interracial interaction result in reduced capacity to engage in subsequent cognitive tasks, say the researchers.

For the study, thirty white individuals were measured for racial bias, which involved a computer test to record the ease with which individuals associate white American and black American racial groups with positive and negative concepts. Racial bias is measured by a pattern in which individuals take longer to associate the white Americans with negative concepts and black Americans with positive concepts. The study participants then interacted with either a black or a white individual, and afterward they were asked to complete an unrelated cognitive task in which they had to inhibit instinctual responses. In a separate fMRI session, these individuals were presented with photographs of unfamiliar black male and white male faces, and the activity of brain regions thought to be critical to cognitive control was assessed.

"We found that white people with higher scores on the racial bias measure experienced greater neural activity in response to the photographs of black males," says Richeson. "This heightened activity was in the right dorsolateral prefrontal cortex, an area in the front of the brain that has been linked to the control of thoughts and behaviors. Plus, these same individuals performed worse on the cognitive test after an actual interaction with a black male, suggesting that they may have been depleted of the necessary resources to complete the task."

According to Richeson, most people find it unacceptable to behave in prejudiced ways during interracial interactions and make an effort to avoid doing so, regardless of their level of racial bias. A different research project by Richeson and her colleagues suggested that these efforts could leave individuals temporarily depleted of the resources needed to perform optimally on certain cognitive tasks. This new study by Richeson provides striking evidence that supports the idea that interracial contact temporarily impairs cognitive task performance.
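For readers unfamiliar with how a reaction-time measure of bias works, here is a highly simplified sketch of the underlying idea. The actual test used in the study is an implicit-association-style task with counterbalanced blocks and a more elaborate scoring procedure; the reaction times below are invented purely for illustration.

    # Highly simplified illustration of a reaction-time bias score. Real implicit
    # association tests use counterbalanced blocks and more elaborate scoring;
    # the reaction times below (in milliseconds) are invented for illustration.

    from statistics import mean

    # Time to categorize items under the two pairing conditions.
    white_pos_black_neg = [620, 650, 610, 640, 630]   # pairing most subjects find easier
    white_neg_black_pos = [720, 760, 700, 740, 730]   # pairing most subjects find harder

    # A larger gap between the two averages is read as a stronger implicit bias.
    bias_score_ms = mean(white_neg_black_pos) - mean(white_pos_black_neg)
    print(f"bias score: {bias_score_ms:.0f} ms")      # bias score: 100 ms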

This study will of course occasion considerable discussion about the continued existence of racial stereotypes and the harm therefrom. But since this site is dedicated to taking less conventional looks at human nature and our future let us look at some other issues that others will tend to ignore.

What would be interesting is to see this study repeated with much larger groups of people of different races, occupations, and histories of living in different areas. Does the feeling of bias run stronger among those who have more or less experience with other races? Does it vary as a function of the age at which the person had the most experience with other races? Does it run stronger as some sort of function of IQ? Does it vary as a function of personality type, with someone who is outgoing having more or less bias than someone who is shy and retiring? Do some races bear more animosity or fear toward other races? This result was only with whites and a pretty small sample of them. So the really interesting questions can't be answered yet.

Think about the economic implications of this work. People whose work performance varies a great deal as a function of how much cognitive effort they can muster (for instance engineers, computer programmers) ought to avoid sources of cognitive drain. One way to avoid sources of cognitive drain would be to isolate oneself from them (like by not answering a phone call from a girlfriend who wants an emotionally complicated conversation while I'm trying to program something complicated - not that I'd ever do such a thing. But gotta love caller ID! ;>). Another way might be to learn desensitization techniques. If a biofeedback machine or some other device could allow one to measure the extent of one's responses one might be able to learn to dampen the responses that cause cognitive drain.

There are other implications that go beyond race. What kinds of physical appearance and personality characteristics in some people cause which other kinds of people to strain to react cordially? Are there personality types that are simply incompatible and that will drain off too much in the way of cognitive resources when working with each other? Could employers use that knowledge to divide people up into teams that will reduce the total amount of cognitive waste that results from emotional reactions between co-workers?

By Randall Parker 2003 November 17 11:20 AM  Biological Mind
Entry Permalink | Comments(4)
2003 November 13 Thursday
Adult Stem Cell Research Promising For Heart, Lung Disease

A medical conference of the American Heart Association, Scientific Sessions 2003, has produced a number of encouraging reports on the use of bone marrow stem cells to regenerate hearts, blood vessels, and even lungs. A German group used bone marrow stem cells in humans to restore heart muscle function in failing hearts.

ORLANDO, Fla., Nov. 10 – Bone marrow stem cells restored heart muscle that was damaged from a heart attack, providing a new treatment for failing hearts, researchers reported today at the American Heart Association’s Scientific Sessions 2003.

The bone marrow cells came from patients’ own blood and were injected into their ailing hearts. The cells fueled new cell growth, which strengthened the heart’s pumping capacity.

“These results demonstrate for the first time that transplantation of a person’s own stem cells through direct intracoronary injection increased cardiac function, blood flow and metabolism in the damaged zone,” said senior author Bodo E. Strauer, M.D., professor of medicine at Heinrich Heine University in Düsseldorf, Germany.

“If a prospective, randomized, multicenter study confirms these encouraging results, a new therapy for heart attacks could be in reach,” he said.

A Canadian group successfully used endothelial progenitor cells from bone marrow to restore heart function and lower blood pressure in rats.

“This is a novel and exciting approach,” said Duncan Stewart, M.D., professor and director of cardiology at the University of Toronto and head of cardiology at St. Michael’s Hospital.

Pulmonary arterial hypertension (PAH) is abnormally high blood pressure in the arteries between the heart and lungs. It is a progressive disease that can affect the arterioles and capillaries that supply blood to the lungs.

“PAH reduces the heart’s ability to pump blood through the lungs and gradually leads to heart failure. Today, we can achieve some improvement with drugs, but the treatment is palliative and can only delay death,” Stewart said.

Restoring blood flow to the lungs with a stem cell transplant in the pulmonary vessels may hold promise as a new treatment for PAH, Stewart said.

His team used endothelial progenitor cells. Endothelial cells form a thin lining in blood vessels, providing an interface between the vessel and blood. This lining, called the endothelium, regulates a host of basic processes, such as blood clotting and blood pressure.

“Our results show that endothelial progenitor cells from the bone marrow circulate in the bloodstream. We can use them to form new blood vessels or repair damaged ones,” Stewart said.

Stewart and co-investigator Yidan Zhao, M.D., a research associate at the University of Toronto and St. Michael’s Hospital, removed vascular progenitor cells from rats’ bone marrow. The cells were cultured for five days, then injected into the pulmonary circulation of rats with PAH. A second group of rats with PAH received skin fibroblasts (cells), while a third group, which did not have PAH, were used as controls.

Right ventricular systolic blood pressure (the pressure when the heart contracts) was measured 21 days later. The systolic pressure of the untreated, normal rats was 26 millimeters of mercury (mm Hg). Rats with PAH had a systolic pressure of 47 mm Hg. Systolic pressure fell to 32 mm Hg in those treated with endothelial progenitor cells, and it was relatively unchanged (45 mm Hg) in control animals treated with skin fibroblasts.

A Texas Heart Institute group used bone marrow stem cells to restore heart function in humans.

Researchers at the Texas Heart Institute in Houston tested mononuclear bone marrow cell transplant injections in patients with severe ischemic heart failure — the first such study in a severely ill population. There are few treatment options for patients with end-stage ischemic heart failure, according to the study’s lead author Emerson C. Perin, M.D., Ph.D., a clinical assistant professor of medicine at Baylor College of Medicine at the University of Texas Health Science Center in Houston.

Previous laboratory research has shown that mononuclear cells taken from bone marrow then injected into human tissue can promote growth in oxygen-deprived tissue. Mononuclear cells can differentiate into tissue and new blood vessels, and secrete a wide variety of proteins and growth factors, said Perin, who is also director of new cardiovascular interventional technology at the Texas Heart Institute.

Treated patients had better blood flow and could walk longer on treadmill tests than controls. They also reported less chest pain and were able to breathe better.

“To have the sustained ability to exercise at six months is significantly different than the controls. They’re functional and they have their lives back,” Perin said.

In what could turn out to be a useful discovery for purposes extending beyond heart disease treatment, a Tufts University group discovered a new kind of stem cell in bone marrow that may be able to differentiate into a wide range of cell types.

ORLANDO, Fla., Nov. 10 – A “universal stem cell clone” found in adult bone marrow regenerated blood vessels and heart muscle, according to research reported at the American Heart Association’s Scientific Sessions 2003.

The cells, called human bone marrow-derived multipotent stem cells (hBMSC), were implanted into animal hearts where they formed multiple cell types.

The hBMSC improved animals’ heart function, said the study’s lead author, Young Sup Yoon, M.D., Ph.D., assistant professor of medicine at Tufts University School of Medicine in Boston.

“This study is exciting because it is the first to show that human bone marrow includes a clonal stem cell population that can differentiate into both vessels and heart muscle. These cells can regenerate the essential tissues of the heart,” Yoon said. This finding comes from animal and laboratory research.

Such stem cells might be used to regenerate damaged hearts for people who have acute and chronic heart failure. They also might help people with hypertension, diabetes or other blood vessel diseases.

The researchers found that these stem cells didn’t belong to any previously known bone marrow-derived stem cell population (such as hematopoietic cells, the source for all types of blood cells or mesenchymal cells that give rise to cell types like bone and cartilage).

These adult bone marrow stem cells have been shown to differentiate into all three so-called “germ layers.” The three germ layers of cells in early human development are the beginnings of the body’s tissues and organs. Differentiation is the term that describes the process in which stem cells change into these specialized cells.

A University of Ottawa group has found that a bone marrow stem cell growth factor helped five heart attack patients.

ORLANDO, Fla., Nov. 11 – A drug that stimulates bone marrow to produce stem cells helped regenerate damaged heart muscle in one of the first studies of its kind, according to a report presented at the American Heart Association’s Scientific Sessions 2003.

The drug, granulocyte colony stimulating factor (G-CSF), treats some forms of cancer. It stimulates bone marrow to produce the different types of blood cells, including white blood cells that can become depleted after disease or chemotherapy.

G-CSF might help repopulate the heart’s muscle cells, which in turn could help repair the damaged heart, said lead author Chris A. Glover, M.D.

“Research has shown that there are cells in the heart that come from bone marrow stem cells. We hypothesized increasing these cells after a heart attack may help the heart regenerate heart muscle cells, and this is supported by our results,” said Glover, assistant professor of medicine at the University of Ottawa and the Ottawa Heart Institute in Ontario.

A larger clinical trial on G-CSF is now planned.

The drug -- granulocyte colony stimulating factor, or G-CSF -- was tested in only five heart-attack patients, says Chris Glover, a researcher at the University of Ottawa and the Ottawa Heart Institute. A clinical trial will begin this spring involving up to 85 to 100 patients.

Tissue-engineered heart valve replacements made from a patient's own tissues were also reported.

ORLANDO, Fla., Nov. 12 – Heart valves engineered from patients’ own tissue may offer a new treatment for valvular heart disease, researchers reported today at the American Heart Association’s Scientific Sessions 2003.

“Using this tissue-engineered valve overcomes many of the problems with mechanical or donor valves because it is a living structure from the patient’s own tissue, and so it does not cause an immunological reaction,” said Pascal M. Dohmen, M.D., head of tissue engineering research and staff surgeon of the department of cardiovascular surgery at Charité Hospital in Berlin, Germany.

The action in heart disease research aimed at developing therapies has obviously shifted toward repair using cell therapies and growth factors that stimulate and guide cells. The sheer number of research groups reporting encouraging results at a single conference suggests an even larger number of groups must be working on such therapies. It is also worth noting that many of the results mentioned above were obtained in humans. Heart disease is a major killer. These reports in total suggest that cell therapies to repair hearts are no longer a distant, uncertain prospect and successful therapies will not have to wait on advances in poorly funded embryonic stem cell research. The ability to use a patient's own cells to do repair also avoids immune incompatibility problems.

These reports are not a reason for complacency about diet or exercise, even if you think heart disease will be curable by the time you are old enough to have a heart attack. The risk factors for heart disease also cause general aging to happen more quickly. A high level of cholesterol in the blood probably increases total free radical oxidative stress on the body. There is considerable overlap between a diet that is ideal for reducing heart disease risk and one that is ideal for reducing cancer risk. General brain aging is going to be accelerated by a high level of oxidative stress. Plaque build-up puts you at risk for a stroke and brain damage. Eat a great diet. Get plenty of exercise. We are decades away from the day when medicine can fully protect us from our vices.

By Randall Parker 2003 November 13 01:45 PM  Biotech Organ Replacement
Entry Permalink | Comments(120)
2003 November 12 Wednesday
Drugs And Stress Have Variety Of Effects On Brain Development

A number of recent reports underscore how much various drugs and stress can cause lasting changes to brains, and particularly to younger brains that are still developing. Adolescent brains even appear to be more vulnerable in some cases than younger brains. First off, here is a report on stress-induced permanent changes to the hippocampus.

Studies by Susan Andersen, PhD, of McLean Hospital in Belmont, Mass., and colleagues show that stressful events experienced during adolescence can lead to enduring changes in brain structure in adulthood. This work is the first to demonstrate that exposure to a significant stress during adolescence can impact neuronal connections in the adult brain.

The researchers found that adult rats exposed to a social stress during adolescence (ages approximating 13 to 15 years in humans) showed a significant decrease in a specific protein found in the hippocampus, a brain region important for learning and memory. In fact, the loss of this protein, synaptophysin, is at least as great as that occurring in animals exposed to more severe stressors at a younger age, suggesting that adolescents may be more vulnerable to the effects of stress than younger animals.

Under typical conditions, synaptophysin, which is often used as an index of the number of neuronal connections, or synapses, reaches a peak during young adulthood (approximately ages 18 to 20), with the rise occurring primarily during adolescence. The team tested whether a social stress during this key developmental period might alter this pattern. A control group of rats was housed with their peers, and an experimental group of rats was housed individually during adolescence; individual housing in normally social animals such as rats is a stressful experience. The brains of both groups were then examined during young adulthood. The team found that rats exposed to the social stressor did not show the normal increase in synaptophysin during this period. These data suggest that social stress during adolescence causes either a loss of synapses or a decrease in the synaptophysin protein.

The researchers then compared the loss of synaptophysin in rats that were stressed during adolescence with rats that experienced significant stress during ages comparable to childhood. The stressor used for this age group was repeated maternal separation (RMS). The scientists found no significant difference in synaptic density between rats that had social stress during adolescence or rats that had early RMS. However, the density of synapses in the hippocampus of both groups was reduced significantly when compared with control rats.

These findings are the first to demonstrate that exposure to a significant stress during adolescence can have enduring consequences on the connections formed in the hippocampus in adulthood. These data may suggest why early traumatic stress, such as physical or sexual abuse or neglect, is associated with a decrease in the size of the hippocampus in adulthood.

“These preclinical data suggest that stress experienced early in life alters the normal developmental trajectory of the hippocampus, but that these changes are not apparent until later in life,” says Andersen.

Adolescent brains undergo a great amount of change and therefore anything that interferes with development during that stage has the potential of creating lasting impacts on cognitive function. See the previous post Adolescence Is Tough On The Brain for reports on some of the changes that happen in adolescence.

So what to do about stress causing harmful effects on the brain? Picture at some point in the future nanotech sensors injected into a child's body to provide Mom and Dad with a daily report of whether the kid is feeling enough stress for it to have a harmful effect on cognitive development. If that happens the kid will be put on stress-dampening drugs in order to protect the brain. Sound far-fetched? Sensors will eventually become sensitive enough, small enough, and sufficiently long lasting for that part of this scenario to work. A sensor reader could be mounted on the front door or perhaps in interior rooms of the house, with the sensor data getting routed to the house computer. As Amy Arnsten explains in the following article, drugs with the desired effects already exist.

Amy F.T. Arnsten wrote an interesting article a few years ago in J Am Acad Child Adolesc Psychiatry that explains a different mechanism: stress causes the release of catecholamines, which produce temporary or even permanent changes to the prefrontal cortex.

Animals or humans with lesions to the PFC exhibit poor attention regulation, disorganized and impulsive behavior, and hyperactivity. Recent research in animals indicates that exposure to stress can produce a functional “lesion” of the PFC. During stress exposure, catecholamines are released in both the peripheral and central nervous systems. In the periphery, the catecholamines norepinephrine and epinephrine are released from the sympathetic nervous system and adrenal gland, respectively. These catecholamine actions serve to “turn on” our heart and muscles and “turn off” the stomach to prepare for fight-or-flight responses during stress.

In the brain, high levels of the catecholamines dopamine and norepinephrine are released in the PFC during stress exposure, even during relatively mild psychological stress. As basal levels of dopamine and norepinephrine have essential beneficial influences on PFC function, it was originally presumed that high levels of catecholamine release during stress might facilitate PFC function. However, research in monkeys and rats demonstrated the contrary: exposure to stress impairs the working memory functions of the PFC.

If you read Arnsten's full article you will see that she discusses drugs that can prevent the damage caused by catecholamines released during stress. Keep that in mind when reading below how nicotine can provide the brain with protection against stress. Nicotine, being an addictive drug that also causes other and perhaps undesirable changes to the brain, is far from the ideal compound to use for stress protection. But it does point the way toward the development of compounds that would provide stress protection without the various harmful side-effects caused by nicotine.

Nicotine causes changes in fetal brains.

The children of women who smoke during pregnancy have been found to be at greater risk for a wide variety of emotional and behavioral disorders, such as attention deficit hyperactivity disorder (ADHD) and conduct disorder. Now, new animal studies from the Yale University School of Medicine demonstrate that the effects of developmental nicotine on emotional learning last into adulthood.

“If we can identify the mechanism for this long-term behavioral change, we may be able to develop new therapies for human emotional disorders that are linked to prenatal nicotine exposure,” says Sarah King, PhD.

For their most recent study, King and her colleague, Marina Picciotto, PhD, used an animal model of emotional learning known as passive avoidance. This model measures how long an animal avoids a dark chamber in which it had previously received a mild electric shock. King and Picciotto found that nicotine-treated mice showed a hypersensitive response and avoided the dark compartment longer than non-exposed mice.

This response was identical to one the researchers had reported on previously (Journal of Neuroscience, May 2003) in genetically altered mice that lack high affinity nicotine receptors as a result of a knockout mutation. “We believe that nicotine exposure during development— the same kind of exposure that occurs in mothers who smoke during pregnancy — disrupts normal nicotine receptor activity, much like the knockout mutation, and that this leads to altered emotional learning in adulthood,” says King.

King and Picciotto have also identified a novel brain circuit — glutamate neurons, which originate in the cortex and project to the thalamus (corticothalamic neurons) — as the likely site where changes occur in the brain during early nicotine exposure. They are currently working to identify the molecular changes that developmental exposure to nicotine triggers in the corticothalamic neurons.

Each year, about 2 million teenagers become regular smokers, according to the American Lung Association. Because the brain continues to develop during adolescence — and beyond — scientists at George Mason University decided to investigate the effect that exposure to nicotine during adolescence has on later behavioral functioning. The researchers implanted 46 rats with small minipumps that dispensed either 3 or 6 mg of nicotine per kilogram of body weight per day — or no nicotine at all (controls). When the animals reached adulthood, they were tested for spatial learning and memory.

Nicotine made a significant difference in the animals’ performance in the tests. Low and high doses of nicotine altered behavior in opposite directions: The low-dose group tended to learn faster and the high-dose group tended to learn slower than the control animals. “Whether performance improved or declined is probably less important than the demonstration that nicotine does produce long-lasting changes in the animals’ performance, presumably reflecting long-lasting effects on brain development,” says Robert Smith, PhD.

Although this research was done in rats, the processes of brain development are similar in humans, which leads Smith to believe that teenagers who smoke aren’t risking only addiction, but also lasting changes in the development of their brains. Smith and his colleagues are now examining the genetic mechanisms that are involved in producing this lasting change in behavior.

During times of stress, smokers tend to increase the number of cigarettes they light up — perhaps as a form of self-medication to counteract the harmful effects of stress on the brain. Stress, which may range from mild anxiety to posttraumatic stress disorder, has been shown to impair normal brain function, including learning and memory.

Researchers in the laboratories of Karim Alkadhi, PhD, at the University of Houston College of Pharmacy recently studied the effect of nicotine on stress-induced memory impairment in rats. They found that when stressed animals were given nicotine, they performed significantly better at short-term memory tests than stressed animals not given the chemical. In fact, the nicotine-treated stressed animals performed the same as unstressed (control) animals.

“Our findings are important to the understanding of the mechanism by which nicotine repairs stress-damaged brain function,” says Abdulaziz Aleisa, a doctoral student at UH. “This research may eventually help in the designing of new, safe approaches to the treatment of Alzheimer’s and Parkinson’s diseases — approaches that mimic the beneficial effect of nicotine on stress.”

Before you start thinking that nicotine would be great to give to adolescents to help them learn more quickly, check out a previous post: Early Nicotine Exposure Increases Nicotine Craving. Nicotine is one of many addictive drugs that cause problems. See also: Adolescent Mice More Sensitive To Addictive Drugs.

Also, note the opposite effects of the lower and higher nicotine doses on speed of learning. The brain is a finely balanced device. Likely some day it will become possible to use drugs to guide brain development to improve long-term memory and other cognitive abilities. But there are more ways to go wrong than to go right with this kind of intervention and at this point there are no clear safe ways to try to guide brain development to yield some desired outcome.

Not all drug use during adolescence primes the brain to want more drugs later in life. The same Susan Andersen mentioned above has previously found evidence that Ritalin given to juveniles may decrease their desire for cocaine later in life.

December 2, 2001 -- Belmont, MA -- Exposure to Ritalin early in life may make one less vulnerable to the allure of cocaine later, according to a new report by McLean researchers. Susan Andersen, PhD, William Carlezon, PhD, and their colleagues found adult rats that were given Ritalin as juveniles spent less time seeking out cocaine than did their Ritalin-free peers. Moreover, in some cases, the rats appeared to actively avoid places where they had been exposed to cocaine in the past.

The findings, which appear in the Dec. 3 online version of Nature Neuroscience, could help resolve several controversies surrounding the use of Ritalin, or methylphenidate, a stimulant prescribed for children who have an abnormally high level of activity or attention deficit hyperactivity disorder (ADHD).

Also see a more recent report on Ritalin's effect on long term drug and alcohol use: Ritalin For Children Reduces Later Alcohol and Drug Abuse.

By Randall Parker 2003 November 12 01:16 PM  Brain Addiction
Entry Permalink | Comments(4)
2003 November 11 Tuesday
Ireland First To Require Car Flight Data Recorders

Modelled after the aircraft flight data recorders that are used to record crash information, car data recorders are now cheap enough to become widely used.

On Nov. 6, Ireland's Transportation Minister announced an agreement to outfit the nation's vehicles with black-box data recorders and link them to an emergency notification system. Under the agreement, Safety Intelligence Systems (SIS), a private New York-based company, will partner with IBM (IBM) as its exclusive information-technology provider, to supply the boxes and build a comprehensive crash-data network.

The data recorders can use cellular links to automatically phone in the location recorded by a built-in GPS sensor. The recorders can report location, the pattern of deceleration leading up to the crash, and other information that can be used to determine the likelihood of occupant injury.

Insurance companies in the US may eventually offer discounts to drivers who agree to install recorders. The recorded information has many uses beyond accident reconstruction. Picture recording and reporting from all vehicles that come down an off-ramp to measure whether vehicles have trouble decelerating in the length of ramp available, or whether they tend to slide on a particular ramp or road curve when road surfaces are wet.

There are of course privacy concerns about the use of this sort of technology. But even if individuals resist allowing recorders to be placed in their own cars or place limits on what can be done with the data from their own cars the privacy issue will play out differently for fleet vehicles. An operator of a fleet of delivery vehicles would love to know whether any driver drives too quickly, tends to wait too long to decelerate, tends to accelerate thru intersections (a sign of running lights just turning red), or takes side trips that are not on the approved route. Fleet operators will probably be more willing to provide insurance companies with greater access to recorded data in exchange for lower rates. One can imagine a day when insurance companies will routinely come to fleet operators to demand that particular reckless drivers be fired before they cause accidents. One can also imagine how insurance companies will be able to develop databases of driver behavior and even make hiring recommendations to fleet operators based on the performance of those drivers in previous jobs.

Fleet data recorders could also provide useful information about driving patterns that lower gas mileage or increase tire wear or general vehicle wear. But fleet operators are not the only vehicle owners who will want to collect data on the driving of others. How about parents who want to monitor the driving behavior of their teenage kids? Here's a future conversation that will eventually take place many times: "You can't have a car unless the car has a very high capacity recording device". What's the kid going to do, say no? Here's a case with no government or insurance company involvement, and one where it would be hard to argue against the monitoring on civil liberties grounds. Do parents not have a right to monitor their kids in this manner?

A really smart box that was monitoring g force shifts and direction might even be able to detect drivers impaired by drugs, alcohol, or some other factor and the box could report this while the driving trip was taking place. Police could be summoned with a continuously updating position and direction of the vehicle. Or the vehicle could be ordered to shut down or at least to slow down to some low maximum speed.
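
To make the idea concrete, here is a minimal sketch (entirely my own illustration, not anything a recorder vendor actually ships) of the sort of heuristic such a box might run: a trip gets flagged when the lateral g-force wanders around far more than it does for a steady, attentive driver. The threshold and sample values are made up.

```python
# A hypothetical impaired-driving flag: large spread in lateral acceleration
# (weaving) over recent samples triggers a report.

from statistics import pstdev

def looks_impaired(lateral_g_samples, threshold=0.15):
    """Return True if the spread of lateral acceleration (in g) over the
    recent samples exceeds a made-up 'erratic driving' threshold."""
    return pstdev(lateral_g_samples) > threshold

steady_driver  = [0.01, -0.02, 0.03, 0.00, -0.01, 0.02]
weaving_driver = [0.25, -0.30, 0.40, -0.35, 0.30, -0.28]

print(looks_impaired(steady_driver))   # False
print(looks_impaired(weaving_driver))  # True
```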

Of course, in the longer run computers will gradually take over driving responsibilities. This has already begun in a limited manner with ABS and even with airbag deployment. But more could be done. For instance, a computer could detect a traffic light changing color, or even be told by a radio signal that the light has changed, and then flash a light or otherwise warn the driver that he cannot make it thru the intersection before the light turns red. Also, computers could be told that a traffic accident or fog lies ahead and alert the driver of the need to slow down and of where exactly the danger lies. Also, a driver could be given optional adaptive cruise control (this has already been tested - has it been deployed anywhere?) that would decelerate the vehicle in order to maintain some minimum following distance from the car in front.

By Randall Parker 2003 November 11 01:23 PM  Surveillance Society
Entry Permalink | Comments(3)
2003 November 10 Monday
False Versus Real Memory Recall Looks Different In MRI Brain Scan

Functional Magnetic Resonance Imaging (fMRI) can detect differences in brain patterns when brains correctly and incorrectly think they have seen an image before.

Neuroimaging techniques can help determine if the neural processes driving this retrieval of inaccurate memories are different from those that drive the retrieval of accurate memories. Several research groups are using functional magnetic resonance imaging (fMRI) to address this question. The hope is that neuroimaging can help determine the various potential sources of false memories.

Daniel Schacter, PhD, and his colleagues at Harvard University have looked at neural activity associated with the creation of false memories. Previous studies had focused on neural activity associated with the retrieval of false memories.

Relying upon earlier work that showed the right fusiform cortex is involved in encoding the exact visual details of objects and the left fusiform cortex is involved in more general processing, Schacter’s group designed an experiment to test the role of the right fusiform area in avoiding the formation of false memories for objects similar to those seen previously but not exactly the same.

In the study, led by graduate student Rachel Garoff, participants underwent brain scans using fMRI as they made judgments about the size of various objects. A surprise memory test was then given when the patients were outside the MRI scanner. During the test, patients saw objects identical to those seen earlier, objects similar to those they had seen earlier, and new objects they had not seen at all.

Although the study is still in progress, results to date indicate that the right fusiform area was more active in these individuals during the encoding of objects participants later labeled the “same” as objects they had seen before. The right fusiform area was less active when patients incorrectly labeled objects the same when they were only similar, or when they labeled objects similar when they were actually the same.

“This preliminary finding supports the idea that the right fusiform area is tied to the encoding of specific visual details,” Schacter said. “It also suggests that false memories of objects can be reduced through additional activity of the right fusiform area during encoding.”

In another study, Schacter’s group showed that visual processing regions of the brain were reactivated during true memory but not during false memory.

Scott Slotnick and Schacter constructed “prototype” shapes by adjoining four curves into various shapes, then they distorted these prototypes to form “exemplar” shapes. The twelve individuals who have taken part in the study thus far were instructed to remember each shape and whether it appeared to the left or the right on a screen. “True memory” was defined as recognizing a shape that was seen previously, and “false memory” referred to mistakenly recognizing a shape that resembled a shape seen earlier but that was not actually seen. In the next step, fMRI was used to determine which areas of the brain were associated with true and false memory.

“We found that participants gave the same response regarding whether an object was “new” or “old” during true and false memory, which leads you to expect that the associated brain activity might be indistinguishable,” Schacter said. “But fMRI revealed there is a different activation of brain regions involved in visual processing during true versus false memory. What we need to do now is understand the meaning of this difference.”

This is not as impressive as it first sounds. The researchers cannot say in each instance whether the memory being recalled is true or false. They only see a difference on average over many experimental runs.

However, Schacter points out that their work currently averages brain activity over many trials, so detecting the accuracy of a single memory is not yet possible.

Still, these experiments suggest that it might be possible to train people to be more aware of the strength of their own memory recall. If there are differences in brain activity when recalling accurate versus inaccurate matches between viewed images and memory, it might be possible to develop a training regime that tells people when they are making false matches between memories and viewed objects. By getting that immediate feedback, people might be able to calibrate their own sense of certainty and develop a better sense of just how strong the feeling of a match has to be before it is likely to be accurate.

By Randall Parker 2003 November 10 02:49 PM  Brain Memory
Entry Permalink | Comments(3)
Trade-Off For Enhancing Working And Long Term Memory

Yale University researchers say one drug approach for enhancing long term memory may worsen working memory.

Memory Enhancing Drugs May Worsen Working Memory

New Haven, Conn. -- A new study cautions that while drugs being designed to enhance memory in the elderly seem to be effective for some types of memory, they may actually worsen working memory, according to a study by Yale researchers published Thursday in the journal Neuron.

Working memory is the cognitive ability that intelligently regulates our thoughts, actions and feelings, letting us plan and organize for the future. It is governed by the prefrontal cortex. This type of memory is constantly updated and is known to be impaired by the normal aging process.

The ability to lay down long-term memories depends upon another region of the brain, the hippocampus.

The study by Amy Arnsten, associate professor and director of graduate studies in neurobiology at Yale School of Medicine, shows that the prefrontal cortex and hippocampus have different chemical needs, and that medications being developed to enhance long-term memory actually worsen working memory in aged animals.

Biotech companies are focusing on activating protein kinase A, an enzyme in hippocampal cells which strengthens long-term memory formation. Arnsten and colleagues found that activation of protein kinase A in prefrontal cortex worsened working memory, while inhibiting this enzyme in prefrontal cortex improved working memory in aged rats. In collaboration with Ronald Duman's laboratory, they found that aged rats with naturally occurring working memory impairment had signs of overactive protein kinase A in their prefrontal cortex.

"Because PKA is over-activated in the aged prefrontal cortex, PKA stimulation actually makes the situation worse by further impairing working memory," Arnsten said.

The study was funded by the National Institute on Aging (NIA) of the National Institutes of Health (NIH). "This important study tells us that one size may not fit all when developing treatment strategies for cognitive deficits," says Molly Wagster, program director for neuropsychology of aging research at the NIA. "The differing effects of PKA activity in the hippocampus and the prefrontal cortex suggest that distinct neurochemical needs of different regions of the brain must be addressed for the development of effective ways to enhance cognition."

Co-authors included Brian Ramos, Shari Birnbaum, Isabelle Lindenmayer, Samuel Newton and Ronald Duman.

Most drugs are imprecise tools for modifying metabolism because they typically act on more than one site with more than one effect. It is therefore not surprising that a drug intended to have one desired effect on the hippocampus has an undesired effect on the prefrontal cortex. Drugs that attempt to reverse or slow aging are going to be tough to develop because the most effective treatments would reverse the damage that accumulates with aging. Attempts to simply suppress or up-regulate some activity are unlikely to reverse most forms of accumulated damage. It may well be possible to develop drugs that reverse some subset of all the types of accumulated damage. But theoretically more powerful approaches such as gene therapy and cell therapy will likely prove more efficacious in the long run.

Alzheimer's is an even bigger threat to the hippocampus than normal aging.

"Prefrontal cortex functions are essential to the information age and they naturally decrease with normal aging, so it's particularly important to see what this cortex needs and to give it back to this part of the brain," says study author Amy F.T. Arnsten, an associate professor and director of graduate studies in neurobiology at Yale University School of Medicine. "There's some deterioration in the hippocampus with normal aging, but what really erodes the hippocampus is Alzheimer's."

By Randall Parker 2003 November 10 11:37 AM  Brain Enhancement
Entry Permalink | Comments(2)
2003 November 09 Sunday
New Zealand May Screen Embryo Adopters

Need an embryo to start a pregnancy? If you are in New Zealand you may find yourself going through the same sorts of screening steps that baby adopters routinely do now in many jurisdictions.

Infertile couples adopting an embryo may have to undergo police checks to determine if they are suitable recipients. The move is being considered by the National Ethics Committee on Assisted Human Reproduction (NECAHR) as it sets guidelines for embryo donation for reproductive purposes.

Conceptually this is not all that different from screening people who want to make babies from their own sperm and eggs. So you have to use someone else's egg. Why is that any more reason to screen for parental competence and character than if one uses one's own egg?

Recipient screening is only one side of the issue. In Denmark a sperm bank is doing criminal background checks on sperm donors.

In Denmark, the world's biggest sperm bank - Cryos International Sperm Bank in Aarhus - has been forced to start screening donors for any criminal record after it emerged that a man who killed his two baby daughters was on its books.

Think about how this is going to develop once alleles are identified that contribute to criminality. There will be calls to prevent men with criminal tendencies from reproducing. But rather than an outright ban on reproduction by criminals there might be a move to prevent criminals from passing on just the genetic variations that make the biggest contribution to criminality. How could that be done? Pre-implantation genetic screening. A recent advance in biotechnology may make pre-implantation screening for genetic variations easier to do.

Hundreds of cells have been grown from a single cell taken from an early mouse embryo. If the same feat can be repeated in humans, it would make screening embryos for genetic defects during IVF much easier and more accurate.

Pre-implantation genetic screening is already becoming popular for sex selection. Surprisingly, in Australia genetic screening for sex selection is being done more often to select for a girl than for a boy.

But at Sydney IVF – a leading company for IVF and genetic testing – more than 250 couples have used PGD for sex selection since 1995.

Just over a third of the treatments resulted in a pregnancy and 64 per cent of parents wanted a girl.

Suppose genetic screening for sex selection becomes much more widely used. One way to prevent a large imbalance between the sexes would be to tax babies born of the more popular sex and give the proceeds to those who have babies of the less popular sex. The size of the tax could be set at whatever level is needed to achieve a balance between the sexes (a toy calculation of this follows below). That would be a lot easier to administer than a ban on sex selection, since a ban would be hard to enforce while tax collection and disbursement would be comparatively easy to carry out. While poor parents would present a problem for any tax collection system, the collection side could be made progressive and it would still work on the middle class and above.
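
To make the mechanism concrete, here is a toy calculation. Every number except the 64 per cent preference figure quoted above is made up, and the demand curve is purely hypothetical; the point is only that a simple search can find the tax level at which births even out.

```python
# A toy sketch of setting a sex-selection tax so that births balance.
# The demand curve below is entirely hypothetical.

def share_choosing_popular_sex(tax):
    """Assume 64% of parents prefer the popular sex with no tax, and each
    $1,000 of tax peels away 2 percentage points of that demand."""
    return max(0.5, 0.64 - 0.02 * (tax / 1000.0))

low, high = 0.0, 50_000.0
for _ in range(50):                      # bisect on the tax level
    tax = (low + high) / 2
    if share_choosing_popular_sex(tax) > 0.5:
        low = tax                        # still imbalanced: raise the tax
    else:
        high = tax

print(f"balancing tax is roughly ${tax:,.0f}")   # about $7,000 under these made-up numbers
```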

By Randall Parker 2003 November 09 07:32 PM  Biotech Reproduction
Entry Permalink | Comments(0)
2003 November 07 Friday
UN Debates Therapeutic, Reproductive Cloning Ban

The UN is worried about cloning.

While there is virtually universal support at the United Nations for a treaty banning human cloning, the international community is deeply divided over therapeutic cloning.

Scientists see it as a promising avenue in the battle against disease while anti-abortion activists and many Catholics see it as the taking of human lives.

There is something almost classic about this debate, where diplomats and the forces of international law are so assured they are tackling an important emerging issue while they ignore a real problem that has been developing for years. In particular, the UN is unified about the supposed threat of reproductive cloning while ignoring a change in reproduction practices that is a far greater threat to society, in large part because it is already happening on a large scale. What change am I referring to? If you guessed sex selection go to the front of the class. See, for instance: Girl Shortage Causes Wife Buying In India, Genetic Testing Changing South India Mating Practices, and Human Natural Selection In Taiwan to see just how rapidly this practice is spreading. There is an upside in that it will probably select for higher intelligence in offspring. But the downside will be societies with large numbers of sexually frustrated males, and that could cause everything from internal unrest to wars.

Aside: Europe has experienced quite a comedown from its certainty that it was not torn by the sorts of divisions over abortion that characterise US politics. But along came more southern European members, and suddenly Europe too faces debates about abortion and therapeutic cloning that leave people in Brussels unable to find a consensus on issues that provoke strong passions in opposing factions.

But what is even more interesting about all this? At the risk of seeming a bit esoteric: I think the definition of "clone" is going to end up being very difficult to pin down in the long run. If cloning is defined by reference to the DNA sequence of a donor, and if the prohibition is against a person having an exact duplicate made of himself or herself, then what happens when someone decides to have a child who is made from a 2.0 improved version of their DNA sequence? After all, 20 years from now we will all know our personal DNA sequences, and I can easily see someone deciding to make someone a whole lot like themselves but without, say, the heartbreak of seborrhea, allergies, asthma, or a hairline that starts receding at the age of 17. Many women will go for permanent blondness for their daughters. Just a couple of SNP changes and suddenly no need for peroxide. Allow your kid to look almost like you but be smarter, healthier, and better-looking. Make a child who will grow up to be an idealized image of what you always wanted to be.

You don't suppose those UN folks have considered this possibility, do you? My guess is they haven't. Think about it. When it becomes possible to make small alterations in offspring DNA, how many SNP alterations (Single Nucleotide Polymorphisms, or single letter DNA changes) should be required in order for an offspring not to count as an exact clone of yourself? I figure if I needed to satisfy some high SNP difference requirement I'd opt for a whole bunch of silent mutations (changes made in ways and in places that do not cause any functional changes - and there are just tons of those that can be made, btw) that would not appreciably alter the result.
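
For readers who want to see what makes a change "silent": a substitution is silent when the altered codon still codes for the same amino acid. Here is a small self-contained sketch using the standard genetic code; the example codons are arbitrary.

```python
# Checking whether a single-letter DNA change inside a codon is silent,
# i.e. leaves the encoded amino acid unchanged.

bases = "TCAG"
codons = [a + b + c for a in bases for b in bases for c in bases]
# Standard genetic code in TCAG order; "*" marks stop codons.
amino_acids = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
codon_table = dict(zip(codons, amino_acids))

def is_silent(codon, position, new_base):
    """Return True if substituting new_base at the given position (0-2)
    of the codon leaves the encoded amino acid unchanged."""
    mutant = codon[:position] + new_base + codon[position + 1:]
    return codon_table[codon] == codon_table[mutant]

print(is_silent("CTG", 2, "A"))  # True: CTG and CTA both encode leucine
print(is_silent("CTG", 0, "A"))  # False: CTG (Leu) becomes ATG (Met)
```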

Bottom line: if the UN bans reproductive cloning expect people to carefully read the treaty language and then to "program around it".

Update: One other point about reproductive cloning: if a government or a cult made hundreds, thousands, or even hundreds of thousands of copies of the same person, that would be a problem. But would single-copy cloning of a person for progeny really create a substantial problem? It'd be like having more twins. My guess is that by the time reproductive cloning can be done safely and cheaply the technology will have advanced to the point where the "version 2.0" approach of making clones better will be available, and most cloners will opt for it. So most clones will not be identical. Expect them to be healthier, smarter, and better looking. Individual-level cloning will not cause much of a problem. But cult- or government-level cloning could pose problems.

The more substantial conflict is going to come over the question of what future generations should be like. Once all the genetic variations that influence cognitive function are identified battles and perhaps literally wars will be fought over the moral and empathetic characteristics and sensibilities of future generations.

By Randall Parker 2003 November 07 04:18 PM  Biotech Society
Entry Permalink | Comments(2)
2003 November 06 Thursday
Prion Gene Influences Cognitive Ability

In a report in Molecular Psychiatry entitled "M129V variation in the prion protein may influence cognitive performance" German scientists Dan Rujescu, Annette Hartmann, Claudia Gonnermann, Hans-Jürgen Möller, and Ina Giegling of Ludwig-Maximilians-University, in Munich, Germany report that the gene for prion protein has genetic variations that influence cognitive ability.

Cognitive abilities are influenced by an interplay of genes and environment. With regard to the genetic component, multiple genes are assumed to be responsible for interindividual variation in cognitive abilities. Despite tremendous progress in molecular genetics, little is known about specific genes that contribute to this complex behavior. In an attempt to further delineate the genetic component of cognitive abilities, the authors investigated the relationship between a genetic variation in the prion protein and variations in cognitive abilities in 335 healthy volunteers. The main result is that a common variation in the prion protein gene is associated with cognitive abilities in our sample of healthy volunteers. These findings are further strengthened by the observation that the effect occurs in a gene dose dependent manner. The effect of this variation accounted for 2.7% of the total variability in cognitive abilities, further strengthening the assumption that many genetic variations with only a small effect influence human cognitive abilities. The mechanisms by which the prion protein might actually act on cognitive performance are unclear, but several lines of evidence suggest that this protein is involved in neuroprotection. To the authors' knowledge, this is one of the first reports on the influence of a common genetic variation on individual differences of cognitive abilities in healthy individuals. Nevertheless, it should be emphasized that replications of our findings are needed before firm conclusions can be drawn.
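
To unpack what "gene dose dependent" and "accounted for 2.7% of the total variability" mean in practice, here is a toy sketch using simulated data (not the authors' analysis): code each subject's genotype as 0, 1, or 2 copies of the allele, fit a line from copy count to a cognitive score, and read off the fraction of variance explained. The genotype frequencies, effect size, and noise level below are all hypothetical.

```python
# Illustrating a gene-dose-dependent effect and "variance explained" (R-squared).

import numpy as np

rng = np.random.default_rng(0)
n = 335                                                         # sample size reported in the abstract
copies = rng.choice([0, 1, 2], size=n, p=[0.49, 0.42, 0.09])    # hypothetical genotype frequencies
score = 100 + 2.0 * copies + rng.normal(0, 8, size=n)           # small additive effect plus noise

slope, intercept = np.polyfit(copies, score, 1)
predicted = slope * copies + intercept
r_squared = 1 - np.var(score - predicted) / np.var(score)
print(f"fraction of variance explained: {r_squared:.3f}")       # roughly a couple of percent
```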

Researchers on cognitive ability believe a large number of sites in the genome have genetic variations that influence cognitive abilities. In one of his books brain genetics researcher Robert Plomin says the same holds true for personality types. So it will take a large number of reports such as the one above to identify all the genetic variations that cause intelligence and personality differences.

By Randall Parker 2003 November 06 08:40 AM  Brain Genetics
Entry Permalink | Comments(1)
2003 November 05 Wednesday
Excess Serotonin 5-HT1A Receptor Increases Depression, Suicide

A genetic variation (aka allele) has been identified that increases the risk of suicide. (same press release here)

OTTAWA, Oct 17, 2003 (Canada NewsWire via COMTEX) -- After a pioneering seven-year study, Canadian scientists have discovered a new genetic difference in people suffering from severe depression and in those who have committed suicide. The findings by collaborative researchers at the University of Ottawa and the Institute of Mental Health Research, and McGill University's Douglas Hospital, Montreal -- represent a significant step forward in identifying individuals at risk for debilitating depression or even death.

The study - "Impaired repression at a 5-hydroxytryptamine -1A receptor gene polymorphism associated with major depression and suicide", published in a recent issue of the Journal of Neuroscience, showed the same genetic difference or 'single nucleotide polymorphism (SNP)' in a gene that contributes to the serotonin system which regulates mood cycles in human beings. This SNP in the serotonin-1A gene was two-fold enriched in people with depression, and four-fold enriched in those who had completed suicide, as compared to normal control groups.

...

The studies showed for the first time that the polymorphism of the serotonin-1A gene acts by inhibiting the function of a protein called NUDR, leading to abnormal levels of serotonin-1A gene expression and decreased serotonin, a key factor in the incidence of depression.

The frequency of variations in the serotonin receptor 5-HT2A may differ between populations that have different rates of suicide.

The researchers found a mutation in the gene encoding for the receptor, a protein that transmits brain signals, which more than doubles the risk of suicidal behaviour in those who carry it.

An analysis of the DNA showed 41% of the suicidal patients had the 5-HT2A receptor mutation, compared with 24% of the non-suicidal patients and 18% of the healthy subjects.

...

A genetic variability might also explain why suicide rates vary strongly between populations with different ethnic origins. For example, the annual suicide rate in Finland (for males) is 43 per 100,000 people, one of the highest rates in the world.

But the rate for neighbouring Norway is only 21 per 100,000, less than half.

"It may be interesting to look into the distribution of the [mutation] in these countries," said Dr. Hrdina.

As the costs of DNA Single Nucleotide Polymorphism (SNP) testing of single letter differences (a.k.a. alleles) in DNA sequences become cheaper the distributions of various alleles that affect mental state and behavior will become better known. Populations at greater risk for assorted mental illnesses and behavioral problems will have their problems traced down to a genetic level.

Whether it will be possible to develop compounds to target the various alleles that cause mental problems remains to be seen. Certainly pharmaceutical companies will try. My guess is that the answer will be "Yes" for some but not all. An allele that causes overexpression, for instance, may simply offer no easy way for a compound to be designed that can reach all the way into the nucleus and selectively bind somewhere to repress it. The level of specificity needed may be beyond what a conventional drug can achieve. Also, for rarer alleles the market may not be big enough for a drug company to spend hundreds of millions on development. If you are going to have a major mental problem caused by a genetic variation it is best to have a variation that is fairly common in industrialized countries.

Aside: I think the above article misspells the receptor abbreviation. The journal abstract below for the original paper spells it slightly differently.

From the abstract of the published paper Impaired Repression at a 5-Hydroxytryptamine 1A Receptor Gene Polymorphism Associated with Major Depression and Suicide in the Journal of Neuroscience:

Our data indicate that NUDR is a repressor of the 5-HT1A receptor in raphe cells the function of which is abrogated by a promoter polymorphism. We suggest a novel transcriptional model in which the G(-1019) allele derepresses 5-HT1A autoreceptor expression to reduce serotonergic neurotransmission, predisposing to depression and suicide.

Update: Here is another article on the 5-HT2A link to suicide:

The discovery could lead to the development of genetic tests to identify those at risk. But it also poses questions about the ramifications of such testing. During their 10-year study investigating the causes of suicide, the Canadian team discovered a genetic variation that affects brain chemistry. They found that depressed individuals with a mutation in the gene encoding the serotonin 5-HT2A receptor are more than twice as likely to attempt suicide as those who suffered from depression but did not carry the mutation, says Pavel Hrdina, a neurobiologist at the Royal Ottawa Hospital and study co-author. Serotonin is a neurotransmitter that carries messages between brain cells and is thought to be involved in the regulation of emotion, among other functions. For some years, scientists have suspected that the genes regulating the serotonin system could be one of the culprits.

Here is an abstract of work in this area by Pavel Hrdina and David Bakish from 2000.

By Randall Parker 2003 November 05 02:23 PM  Brain Genetics
Entry Permalink | Comments(1)
2003 November 04 Tuesday
Synthetic HDL Cholesterol Reduces Artery Clogging In 6 Weeks

The reduction of clogging was 4% in just 6 weeks.

Intravenous doses of a synthetic component of "good" cholesterol reduced artery disease in just six weeks in a small study with startlingly big implications for treating the nation's No. 1 killer.

How would you like to quickly reduce your risk of heart attacks or the pain from angina?

In a small, preliminary study, the laboratory-made substance, which mimics a type of cholesterol discovered in a group of surprisingly healthy villagers in rural Italy, significantly reduced in just six weeks the amount of plaque narrowing arteries of heart attack and chest pain patients, the researchers reported.

Note to life extension skeptics: With this report is there any reason to think artery hardening will be a major cause of mortality in industrialized countries 20 years from now?

The drug is made using recombinant DNA that matches the gene sequence for a variant of ApoA-I, a protein component of HDL cholesterol, found in a small population of people in northern Italy. (same article here)

The development of this investigational drug is an unusual story. About 30 years ago, researchers discovered 40 individuals in Limone Sul Garda in Northern Italy who appeared perfectly healthy, despite having very low levels of good cholesterol. Ordinarily, such people would have a high risk of heart disease, but these people did not. Intrigued, researchers wanted to find out why. Their studies revealed a variant in a protein known as Apolipoprotein A- I, which is a component of HDL. This variant was named ApoA-I Milano after the city of Milan, where the initial laboratory work was done.

ApoA-I Milano is being developed into a potential treatment for heart disease by Esperion Therapeutics Inc., an Ann Arbor, Mich.-based biopharmaceutical company. Esperion's investigational treatment, designated ETC-216, is a recombinant version of ApoA-I Milano combined with a phospholipid. After pre-clinical studies showed rapid removal of plaques from diseased arteries, scientists at Esperion came to Dr. Nissen to help them design a study to determine whether infusions of the ApoA-I Milano/phospholipid complex could reverse coronary plaque buildup in patients with heart disease.

The Cleveland Clinic-directed study administered the ApoA-I Milano/phospholipid complex intravenously over a five-week period to a randomized group of patients initially hospitalized for acute chest pain. Researchers measured arterial plaques using intravascular ultrasound (IVUS) before and after the six-week study. Patients who were given the synthetic protein showed a dramatic decrease in arterial plaques, whereas a comparison group given saline showed no change in plaques.

It sounds like it is time for phase 3 clinical trials.

"These results demonstrate for the first time that it is possible to rapidly regress the major underlying cause of heart attack," said Roger S. Newton, Ph.D., President and CEO of Esperion Therapeutics. "By enhancing the removal of cholesterol from plaques in artery walls, a process known as reverse lipid transport, HDL therapy may provide an innovative approach to the treatment of atherosclerosis. We are excited about these results and look forward to continuing the development of ETC-216 in more patients with longer follow-up and assessing more endpoints, including morbidity and mortality."

In the Phase 2 clinical trial, 47 patients with acute coronary syndromes (ACS) received five weekly intravenous infusions of placebo (n=11 patients), ETC-216 at 15 mg/kg (n=21 patients) and ETC-216 at 45 mg/kg (n=15 patients). Plaque volume was measured before treatment and within two weeks after the final infusion using intravascular ultrasound (IVUS). With IVUS, a tiny ultrasound probe is inserted into the coronary artery to directly image and measure the size of the atherosclerotic plaques. The study revealed a statistically significant reduction (p=0.02) in percent atheroma (plaque) volume in the combined ETC-216 treatment groups comparing end-of-treatment values to baseline values.

Additional IVUS endpoints in the trial, such as total atheroma volume and maximum atheroma thickness, also showed statistically significant improvements.

"This study shows that ETC-216 could become an important new option for the treatment of people affected by atherosclerosis," said Steven E. Nissen, M.D., F.A.C.C., principal investigator of the study and medical director of the Cleveland Clinic Cardiovascular Coordinating Center. "We now have evidence that it is possible to rapidly and directly reverse the atherosclerotic disease process in artery walls."

Eventually, in the much longer term (10 to 20 years is my guess), we can expect to see a gene therapy developed to deliver the ApoA-I Milano gene into the livers of the vast bulk of us who do not have this beneficial variant of the gene (Update: the more likely scenario would be to just add more copies of the regular ApoA-I gene to be expressed at a higher level and thereby raise normal blood HDL levels). That way the benefit would be there all the time. To derive an even bigger benefit the gene therapy could also deliver a Cholesteryl Ester Transfer Protein (CETP) variant that makes cholesterol particles bigger and extends life as a result.

Esperion's web site explains the mechanism of action.

The RLT pathway is a four-step process responsible for removing excess cholesterol and other lipids from the walls of arteries and other tissues, and transporting them to the liver for elimination from the body. The first step is the removal of cholesterol from the walls of arteries by HDL in a process called "cholesterol removal". In the second step, cholesterol is converted to a new form that is more tightly associated with HDL as it is carried in the blood; this process is called "cholesterol conversion". The third step is the transport and delivery of that converted cholesterol to the liver in a process called "cholesterol transport". The final step is the transformation and discarding of cholesterol by the liver in a process known as "cholesterol elimination". We believe our product candidates have the potential to enhance the effectiveness of these four steps in the RLT pathway in humans.

In a healthy human body, there is a balance between the delivery and removal of cholesterol. Over time, however, an imbalance can occur in our bodies in which there is too much cholesterol delivery by LDL and too little removal by HDL. When people have a high level of LDL-cholesterol, or LDL-C, and a low level of HDL-C, the imbalance results in more cholesterol being deposited in arterial walls than being removed. This imbalance can also be exaggerated by, among other factors, age, gender, high blood pressure, smoking, diabetes, obesity, genetic factors, physical inactivity and consumption of a high fat diet. The excess cholesterol carried in the blood in LDL particles can be deposited throughout the body, but can frequently end up in the arterial walls, especially those found in the heart. As a consequence, repeated deposits of cholesterol called plaque can form and possibly narrow the arteries, which may lead to acute chest pain (i.e. angina) or a heart attack. These are known as the "acute coronary syndromes".

As more genetic variants that affect health and longevity are found look for attempts to basically take "best of breed" genetic variations and stuff them all into each person who wants them. A lot of genes are expressed in only certain parts of the body and it may be practical to, for instance, upgrade livers to make better blood proteins. The eventual development of the ability to easily grow new replacement organs will facilitate this trend as people will get organs replaced with younger organs and in the process will opt to get genetic improvements added to the starter cells used to grow their new replacement organs.

Update: ApoA-I Milano was chosen because it is patented, not because it is any better than the more common form of apoA-I.

A second but obvious choice would be to simply give people H.D.L., infusing it into their veins. But there was a problem. The idea of giving ordinary H.D.L. was in the public domain and was not protected by patent, and so companies were not interested.

There was, however, one H.D.L. that had been patented, and Dr. Roger Newton, the president and chief executive of Esperion Therapeutics, a small company in Ann Arbor, licensed the rights to develop it.

Again, the main advantage of the Milano type of HDL is that it is patented.

Rader noted that there could be nothing particularly special about this particular form of HDL. It could be that it's the only one that's been tested this way because it's a form of HDL that can be patented. Other companies are developing different ways of using HDL to fight heart disease, such as drugs that boost the body's own production of HDL.

So this therapeutic approach works simply by raising the amount of apoA-I, which is a component of HDL cholesterol particles.

This illustrates a serious problem in the development of new treatments: if the infusion of naturally occurring compounds produced by the body is not going to be pursued all that vigorously because most of the compounds are not patentable, then the rate of advance of new and very useful therapies will be much slower. The fact that ApoA-I Milano happened to have been patented (perhaps before US patent laws changed to make it harder to patent human genes?) turns out to be very beneficial to us all in this case because it provided an incentive for a company to pursue the various rather expensive phases of animal and human trials.

Perhaps what is needed is some regulatory category for drugs that functions as a proxy for a patent, granting the sole right to sell a compound for some period of time if a company takes the time to go thru the regulatory approval steps.

Update II: The Cleveland Clinic has also done recent work on injecting cells that express stromal cell derived factor-1 (SDF-1) into the heart so that the SDF-1 will instruct stem cells to repair the heart.

Previous studies have indicated that damaged heart muscle could be regenerated by directly injecting stem cells into the bloodstream or by chemically mobilizing stem cells from bone marrow either prior to a heart attack or within 48 hours afterward. At The Cleveland Clinic, Dr. Penn and his colleagues looked at the potential of stem cells in repairing hearts weeks after a heart attack, during congestive heart failure. To determine whether SDF-1 was sufficient to induce stem cell homing and recovery of heart function, investigators transplanted cells that expressed SDF-1 into hearts eight weeks following a heart attack. Their research showed that re-establishing SDF-1 expression in the heart led to the homing of circulating stem cells to the injured organ, the growth of new blood cells and the recovery of cardiac tissue. Reintroducing SDF-1 to the heart yielded nearly a 90 percent increase in heart function compared to hearts treated with cells alone. Just increasing the number of circulating stem cells using drugs that induce stem-cell mobilization eight weeks after a heart attack was not enough to initiate meaningful tissue regeneration, supporting the notion that repair to the damaged tissue is possible for only a limited amount of time following the heart attack. Finally, this research suggests a clinically viable strategy for delivering this molecule that can be tested in future trials involving other organ systems.

Update III: Derek Lowe thinks it is possible to construct a patentable version of natural Apo-A1.

Note also that the natural, presumably unpatentable, form of Apo-A1 has been tweaked and modified quite a bit in its clinical studies. There are, I should think, eminently patentable processes there. Anyone, for example, who found a way to get around the purification difficulties of the native protein would patent the method immediately, and they'd get it, too. Any nonobvious formulations or dosing methods would be patentable - just find one that works. And I haven't even mentioned all the peptide analogs that people have been making - patentable, every one of 'em.

Update IV: A September 26, 2004 report claims that the Catholic Medical Center in Manchester New Hampshire expects to be participating in the next round of Pfizer's ApoA-I Milano clinical trials which have been temporarily delayed for a reorganization.

Dr. Mary McGowan is director of the Cholesterol Management Center at CMC’s New England Heart Institute. She explained the company that developed the treatment— known as ApoA-I Milano — has since been purchased by the pharmaceutical giant Pfizer, and that has temporarily delayed the start of clinical trials while the company is reorganized.

McGowan said the NEHI has worked on cholesterol studies with Pfizer in the past. And she said last week, “I think there’s very little doubt that we’ll be working with ApoA-I Milano.”

Most clinical trial enrollments are done through the medical centers conducting the trials. So readers would be better off contacting the Cleveland Clinic or the Catholic Medical Center in hopes of getting in on the next round of trials.

By Randall Parker 2003 November 04 03:06 PM  Biotech Therapies
Entry Permalink | Comments(76)
Human Desire For Freedom Evolved Before We Lived In Cities

Denis Dutton has written a review for Arts & Letters Daily of Emory University professor of economics and law Paul H. Rubin's book entitled Darwinian Politics: The Evolutionary Origin of Freedom

Rubin begins with that bracing idea that the often-coercive political control placed on human beings since the advent of cities is characteristic only of the Holocene. The human desire for freedom, he argues, is an older, deeper prehistoric adaptation: for most of their existence, human beings have experienced relative freedom from political coercion. Many readers will find Rubin’s thesis counterintuitive: we tend to assume that political liberty is a recent development, having appeared for a while with the Greeks, only to be reborn in the eighteenth century, after millennia of despotisms, for the benefit of the modern world. This is a false assumption, a bias produced by the fact that what we know best is recorded history, those 500 generations since the advent of cities and writing.

The fall from the proverbial garden of freedom began around ten thousand years ago at the beginning of the Holocene era when humans developed agriculture. With agriculture came the ability to maintain much more dense settlements and that, in turn, led to governments and the coercive power of governments to control people.

If this argument is correct (and I think it is) then the advent of dense settlements and coercive political systems must have created new selective pressures on genes that shape human personality and behavioral tendencies. Therefore people whose ancestors lived in denser communities for longer periods of time probably have different distributions of alleles for personality than people whose ancestors were more recently hunter-gatherers. We might expect, therefore, to find different average personality types in Mongolia than in the most densely populated regions of northern China. Also, we might expect to see different average personality types among the Hmong and other more remote groups in southeast Asia than among those living in the Mekong Delta.

Any area that has managed to maintain a strong administrative system of control for many generations almost certainly selected for different kinds of progeny. People more likely, due to temperament and behavior, to be killed or imprisoned by governments were less likely to reproduce. People who were adept at advancing thru the ranks of elaborate administrative systems (and China probably stands out in terms of sheer continued length of such systems) would have different personality types and would have been more likely to leave more progeny.

If the desire for freedom is a primitive Pleistocene urge and if city systems and larger empire administrative structures selected for different characteristics what does this hold for humanity's future? It is hard to say. Certainly, the innate desires to get along with large numbers of people and to submit to laws and norms of behavior are all useful. But there is a complex interaction between the many elements of human personality and once it becomes possible to control what personality characteristics offspring get it is hard to predict what humans will become like.

The Pleistocene era's selection for reverse dominance hierarchies probably provides the human mental characteristics that serve as the basis of democracy.

Rubin cites studies showing that hunter-gatherers had what are called “reverse dominance hierarchies,” where less dominant males acted individually or cooperated with each other to curtail the power of would-be dominants. Strategies for this would include “ridicule, refusal to obey commands, forcible resistance, and even homicide against those with too strong a desire for power.” A desire for freedom, then, for relative personal autonomy within the group, is a powerful Pleistocene adaptation pitted against extreme coercive hierarchy.

Imagine a country in which the government forces offspring to be born with less of the innate desire to form reverse dominance hierarchies. The result would be people who do not put up as much (or even any) resistance to the dictates of those in power. In another country where people are free to choose the genetic characteristics of their offspring it is hard to predict what choices will be made and, as a consequence, what the resulting political system will look like.

Rubin sees the impulse to support the welfare state, the opposition to high taxes, and the resentment toward freeloaders all as consequences of Pleistocene adaptations. Helping others in tough times might lead to their helping you out at a later point. At the same time, food was too scarce to tolerate freeloading. Rubin also argues that libertarianism is contrary to human nature and that humans want to meddle in each others' lives. Read the whole review. Very interesting.

One other point: My guess is that the distribution of alleles for the desire to be altruistic or to enforce rules or to force people not to be freeloaders will be found to be different in different parts of the political spectrum. A lot of political divisions will turn out to be, at least in part, due to average differences in personality characteristics that have their origins in the Pleistocene era. My bet is that once people start genetically tinkering with their offspring purer forms of socialists, libertarians, social conservatives, and other political types will be born and the political divisions within some societies and between societies will become greater as a result.

By Randall Parker 2003 November 04 12:47 PM  Trends, Human Evolution
Entry Permalink | Comments(5)
2003 November 03 Monday
GAD2 Gene Variation Increases Obesity

A variation of a gene called GAD2 was found to be more common among obese test subjects.

GAD2, which sits on chromosome 10, acts by speeding up production of a neurotransmitter in the brain called GABA, or gamma-amino butyric acid. When GABA interacts with another molecule named neuropeptide Y in a specific area of the brain - the paraventricular nucleus of the hypothalamus - we are stimulated to eat.

The researchers behind this study believe that people who carry a more active form of the GAD2 gene build up a larger than normal quantity of GABA in the hypothalamus, and suggest that this over accumulation of GABA drives the stimulus to eat further than normal, and is thus a basis for explaining why obese people overeat.

Professor Philippe Froguel, senior author of the research, from Imperial College London, and Hammersmith Hospital, London, and who carried out the research while at the Institut Pasteur de Lille, France, said: "The discovery that this one gene plays a role in determining whether someone is likely to overeat could be crucial in understanding the continued rise in obesity rates around the world.

"Genetic factors alone can not explain the rapid rise in obesity rates, but they may provide clues to preventative and therapeutic approaches that will ease the health burden associated with obesity.

"Having identified this gene, it may be possible to develop a screening programme to identify those who may be at risk of becoming obese later in life, and take effective preventative measures."

The team compared genome-wide scans of 576 obese and 646 normal weight adults in France, from which they identified two alternative forms, or alleles, of the GAD2 gene.

One form of the gene was found to be protective against obesity, while another increased the risk of obesity. The normal weight group of French adults had a higher frequency of the protective form of the GAD2 gene. Obesity is three to five times less prevalent in France than in the USA.

GAD2 codes for a protein involved in insulin metabolism.

The discovery, which will be published in an upcoming issue of the journal Public Library of Science (http://www.plos.org), involves researchers originally from Sweden and France who collaborated at the University of Washington in Seattle.

The gene, on Chromosome 10, was first connected to diabetes in 1991 by Dr. Åke Lernmark, R. H. Williams Professor of Medicine and adjunct professor of immunology at the UW. The GAD2 gene is responsible for the protein GAD65, which plays a role in the healthy use of insulin by the body. Lernmark is a native of Sweden, which has one of the highest rates of Type I diabetes incidence in the world.

If this result is confirmed in other populations, expect GAD2 expression and the activity of the GAD65 protein to become targets for drug development.
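
For readers curious about what such a comparison actually involves: the heart of a case-control study like this is a test of whether allele frequencies differ between the cases and the controls. Here is a minimal sketch in Python using SciPy's fisher_exact; the allele counts are hypothetical placeholders chosen only to match the cohort sizes mentioned above, not the actual data from the Froguel group's scan.

    # Minimal sketch of the allele-frequency comparison behind a
    # case-control association study. The counts are hypothetical
    # placeholders sized to 576 obese cases and 646 normal-weight
    # controls (two allele copies per person); they are NOT the
    # study's actual data.
    from scipy.stats import fisher_exact

    #                    obese cases   controls
    risk_allele       = [        690,       640]   # copies of the risk-associated allele
    protective_allele = [        462,       652]   # copies of the protective allele

    table = [risk_allele, protective_allele]
    odds_ratio, p_value = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2g}")

A Fisher's exact test on a single 2x2 table is the simplest version of this comparison. A genome-wide scan of the kind described above tests many markers at once and must correct for that multiple testing, but the underlying question at each marker is the same: is one allele over-represented among the cases relative to the controls?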

By Randall Parker 2003 November 03 12:47 AM  Brain Genetics
Entry Permalink | Comments(2)
2003 November 01 Saturday
Wanted: Half Billion Dollars To Jumpstart Eternal Youthfulness Research

The New York Times has an article by James Gorman about University of Cambridge biogerontologist Aubrey de Grey's appearance at the Pop!Tech conference.

Getting old and dying are engineering problems. Aging can be reversed and death defeated. People already alive will live a thousand years or longer.

He was at pains to argue that what he calls "negligible senescence," and what the average person would call living forever, is inevitable. His proposed war on aging, he said, is intended to make it happen sooner and make it happen right.

Aubrey says he only needs a half billion dollars to start the coming explosion in anti-aging research.

Mr. de Grey has no illusions about the challenge he faces. He wants to establish an institute to direct research, he said, adding that he probably needs $500 million to achieve the goal of using mouse research to kick-start a global research explosion on human aging. That includes the prize fund.

If you are in the Washington DC area, be aware that on November 5, 2003 Aubrey de Grey will be debating the prospects for rolling back aging at an AAAS meeting.

WASHINGTON, DC, Nov. 1, 2003 (PRIMEZONE) -- The Methuselah Foundation is proud to announce a landmark debate between two pioneering scientists on not just how, but when, science will reverse the aging process -- hosted by the AAAS and funded by the Alliance for Aging Research.

In a November 5th debate at the American Association for the Advancement of Science, 1200 New York Ave, 11 AM, Dr. Aubrey de Grey, University of Cambridge, will discuss the very real possibility of a modern day medical fountain-of-youth with Dr. Richard Sprott, Executive Director of the Ellison Medical Foundation. Dr. de Grey is a pioneering biogerontologist, the Senior Science Advisor to the Methuselah Foundation, and serves on the Board of Directors of the International Association of Biomedical Gerontology and the American Aging Association.

These two leading biogerontologists will debate the implications of recent advances in aging and anti-aging research, and set forth a timeline for reversal of aging and its associated diseases. Morton Kondracke, Executive Editor of Roll Call and author of Saving Milly, a personal chronicle of his wife's battle with Parkinson's disease, will moderate.

Some people claim that we can't extend human life by hundreds or thousands of years because biological systems are too complicated or the problems are too complicated. The term "complicated" in this context means several separate things and it is worth trying to break them apart. Here is a first stab at describing what might be meant by the term "too complicated" when used by anti-aging therapy pessimists:

  • Too complicated to understand. Even if we could somehow collect data on every single thing that happens in cells, some would argue that our minds just won't be able to figure out all the myriad causes and effects and how they all interact with each other. Well, we have ever faster computers that can be used to process the data and to run simulations. We also have increasingly large numbers of smart people becoming scientists all over the world. My judgement is that humans can get a handle on all that happens within and between cells and understand the ramifications.
  • Too complicated to measure. The idea here is that we just can't measure everything that needs to be measured to get the data we need to analyse in order to understand cells. Well, the trend has been that every year our ability to measure biological systems increases as new types of instruments are developed and existing types of instruments become faster and more sensitive. We have all sorts of instruments to work with, such as MRI machines, DNA sequencing machines, DNA microarrays, and microfluidics devices. The advances in semiconductors, nanotechnology, and other areas look like they will increase our ability to measure by orders of magnitude. Measurement does not look like it will be the roadblock.
  • Too complicated to manipulate. The argument here is that even if we can figure out what to fix, it will turn out to be impossible to get into organisms and fix things. This argument fails for a number of reasons. For many types of problems we won't need to fix a particular part. We will just replace it. If we can grow replacement internal organs then all those parts won't need to be fixed. Then the argument becomes that it will turn out to be too hard to grow replacement organs. But early indications so far are that organ growth will turn out to be a solvable problem. The biggest problem is the brain. We can't replace the brain without effectively killing the old identity. So will it become possible to fix aged brain cells in situ? That is the hardest manipulation problem of all.

Aubrey argues that we don't really need to understand everything that goes wrong in aging. We just need to be able to fix it. He is quite right to argue that we should be approaching the problem of human aging with a mentality more like that of an engineer or an auto mechanic. We can develop techniques to fix things without understanding every last detail. Therefore the "too complicated to understand" argument is even less of an objection.

Still, even if we just want to fix things there is value in developing greater understanding in particular areas. The ability to measure what goes on in cells as they differentiate is very important for developing the ability to fix and replace old parts because we need a way to measure the results of our attempts to turn cells into other cell types. But advances in measuring epigenetic information and gene expression promise to make the study of cellular differentiation progressively easier to do. If we can measure something then we can test out ways to manipulate it. Instrumentation advances are very important for the advance of biological science and biotechnology. Fortunately, the steady advances in semiconductors and nanotechnology assure that the instrumentation advances will continue to come at a fairly rapid pace.

Update: It is also possible to watch the debate remotely as a webcast.

By Randall Parker 2003 November 01 01:21 PM  Aging Reversal
Entry Permalink | Comments(3)