2004 March 31 Wednesday
House May Be Automatically Built In 24 Hours

A USC professor is developing technology that will allow a complete house to be built in 24 hours.

Degussa AG, one of the world’s largest manufacturers and suppliers of construction materials, will collaborate in the development of a USC computer-controlled system designed to automatically “print out” full-size houses in hours.

Funded by a grant from the National Science Foundation, Behrokh Khoshnevis of the USC Viterbi School of Engineering’s Information Sciences Institute has been developing his automated house-building process, called “Contour Crafting,” for more than a year.

Khoshnevis believes his system will be able to construct a full-size, 2,000-square-foot house with utilities embedded in 24 hours. He now has a working machine that can build full-scale walls and is hoping to actually construct his first house in early 2005.

Contour Crafting uses crane- or gantry-mounted nozzles, from which building material - concrete, in the prototype now operating in his laboratory - comes out at a constant rate.

Moveable trowels surrounding the nozzle mold the concrete into the desired shape, as the nozzle moves over the work.

Robots and other automated equipment have increased factory automation so much that factories account for a dwindling share of all jobs. The next big target for automation has been and continues to be office work. Office automation is being addressed with the development of huge amounts of software and information systems.

What never seems to get as much attention is how to automate all the other places where people work aside from the office and the factory. Construction automation is an obvious big target. One approach is to prefabricate walls and other building pieces in highly automated factories and then ship the prefabricated parts to the construction site. But automated methods for doing construction at the site have advantages because they avoid the difficulty of shipping large walls, floors, and ceilings to a site. Also, automated on-site construction techniques allow more flexibility in site design.

By Randall Parker 2004 March 31 01:44 AM  Robots Home
Entry Permalink | Comments(11)
2004 March 29 Monday
Genetically Engineered Mosquitoes Could Stop Malaria

Researchers at the European Molecular Biology Laboratory (EMBL) in Heidelberg, Germany have discovered mosquito proteins that determine how well the malaria parasite Plasmodium falciparum reproduces in mosquitoes.

EMBL scientists have identified four mosquito proteins that affect the ability of the malaria parasite (Plasmodium) to survive and develop in the malaria-carrier mosquito (Anopheles). This breakthrough, featured in recent issues of Cell (March 5, 2004) and Science (March 26, 2004), could be used to block the transmission of malaria from mosquitoes to humans.

"Many researchers focus on the direct effects of Plasmodium on the human body but the mosquito is an equally important battleground in fighting the disease," notes Prof. Fotis C. Kafatos, EMBL's Director-General and leader of the group focusing on malaria research. "We now see a way to potentially stop the parasite in its tracks."

The malaria parasite has to be able to reproduce in the mosquito in order to be able to infect humans.

When a blood-feeding Anopheles mosquito bites an infected organism, the insect feeds on its blood - taking in the malaria-causing Plasmodium. After three weeks of developing within the mosquito, the Plasmodium moves from the insect gut into the salivary glands and is ready for transmission: at the next bloodmeal it will be injected into the bloodstream along with the mosquito's saliva, initiating a new infection cycle.

But one fact that had continued to puzzle malaria researchers is why within one mosquito species, some mosquitoes transmit malaria (termed "susceptible"), whereas others do not ("refractory"). It was suspected that protein factors of the mosquito's immune system might be responsible for this difference. EMBL scientists have now shown this to be the case - with a new twist.

Two of these mosquito proteins, TEP1 and LRIM1, were shown to be true defenders of the mosquito - killing the parasite in the insect's gut.

"The TEP1 and LRIM1 studies proved that the mosquito's immune system has the ability to defend itself against malaria. By enhancing these natural defenders, we may be able to block the parasite-mosquito cycle," says EMBL PhD student Stephanie Blandin, who worked on the TEP1 studies with CNRS researcher (and EMBL alumna) Elena Levashina and collaborators from the University of Leiden (The Netherlands).

"Our studies on TEP1 represent an important step because they show that TEP1 specifically locks onto the Plasmodium and it is this binding that mediates the killing of the parasite," notes Levashina. "Different forms of this protein are present in susceptible and refractory mosquitoes, potentially accounting for the fact that refractory mosquitoes do not sustain parasite development."

In the Kafatos Group, a collaboration between postdoctoral fellow Mike Osta and Staff Scientist George Christophides revealed a new twist: in addition to the mosquito defender protein LRIM1, they discovered two proteins, CTL4 and CTLMA2, which have an opposite effect, actually protecting the parasite as it develops in the mosquito gut. If these proteins were eliminated, the parasites died.

One way to use this new information would be to develop chemicals that target these proteins, stripping away the protection they provide to Plasmodium falciparum. The chemicals would be used in a fashion analogous to pesticides, but with the aim of allowing mosquito immune systems to kill the malaria parasite rather than killing the mosquitoes.

"It is now clear that if we strip away protective proteins, the parasite becomes vulnerable to the mosquito's immune system," Christophides notes. "Developing novel chemicals to inhibit the ability of such proteins to protect the parasite is a promising avenue to decrease the prevalence of malaria."

Prof. Kafatos agrees. "These studies are the first to show the power of the mosquito's immune system and give us some very real options for fighting the disease in the insect before it even has a chance to be passed to a human," he explains. "There is no single 'magic bullet' for controlling this ancient scourge of humanity, but we want to exploit this new lead to contribute to the defeat of malaria."

If the protein made by the CTL4 gene could be deactivated or simply not made, then up to 97% of the parasites would be killed.

When one gene, called CTL4, is inactivated, the mosquitoes destroy up to 97% of the parasites developing inside their bodies. When the other, called LRIM1, is removed, it has the opposite effect: the parasites multiply readily.

The more radical approach to stopping the malaria parasite would be to make genetically engineered mosquitoes that are highly resistant to Plasmodium falciparum infection and release those mosquitoes into the wild to displace existing wildtype mosquitoes.

The discoveries have raised new possibilities for stopping mosquitoes from spreading the parasite. For example, genetically engineering mosquitoes with extra genes to attack the parasites, or lacking the genes that protect them, could help.

One objection raised to this approach is that it would be difficult to displace all wildtype mosquitoes. But repeated releases of genetically engineered mosquitoes could make a large dent in the wildtype population and at the very least decrease the rate of human infection by malaria. Consider the number of lives at stake. Currently every year 300-500 million people are infected and 1.5-2.7 million people die from malaria. Malaria causes damage to livers, kidneys, and other parts of the body. Even the people who do not die and who are not permanently damaged still suffer and are far less able to work and provide for themselves and their families. Those numbers represent a great deal of human suffering.

This is an idea that could be applied to a number of other diseases by making analogous discoveries in other insects and bugs to find out how to make them resistant to pathogens that they pass on to humans. Genetic engineering of ticks, mosquitoes, and other bugs could protect humans against Lyme Disease, West Nile Virus, and other pathogens transmitted by various sorts of creatures. This idea could even be extended as far as genetically engineering chickens, pigs, and other organisms to be more resistant to influenza infection in order to reduce the risk of virulent influenza strains jumping from other species into humans.

Many of the usual suspects who are opposed to genetically modified food crops can be expected to oppose genetic engineering of mosquitoes and other bugs. But the potential number of lives saved could run into the millions and even the tens of millions over a longer period of time. One big advantage of genetic engineering is that it avoids the costs, potential human health risks, and potential environmental harm that would come from repeated application of chemicals in areas where malaria or some other disease is being spread by insects into the human population.

By Randall Parker 2004 March 29 04:19 PM  Biotech Pathogen Control
Entry Permalink | Comments(6)
2004 March 27 Saturday
X-43A Scramjet Test Flight Reaches 5000 MPH

This is very good news. An unmanned supersonic combustion ramjet (scramjet) test vehicle soared to 95,000 feet and Mach 7 (about 5,000 mph).

(Dulles, VA 27 March 2004) - Orbital Sciences Corporation (NYSE: ORB) announced today that its Hyper-X Launch Vehicle was successfully launched on Saturday, March 27 in a flight test that originated from NASA's Dryden Flight Research Center located at Edwards Air Force Base, California. The Hyper-X launch vehicle uses a modified first stage rocket motor, originally designed and flight-proven aboard Orbital's Pegasus® space launch vehicle, to accelerate NASA's X-43A air-breathing scramjet to seven times the speed of sound.

Unlike vehicles with conventional rocket engines, which carry oxygen onboard, the air-breathing X-43A scoops and compresses oxygen from the atmosphere using the shape of the vehicle's airframe. This type of propulsion system could potentially increase payload capacity of future launch vehicles and make high-speed passenger travel feasible since no onboard supply of oxidizer would be required.

"We are extremely pleased with the results of the Hyper-X flight," said Ron Grabe, Executive Vice President and General Manager of Orbital's Launch Systems Group. "After several years of detailed analysis, design upgrades and testing to address the factors that contributed to the failure of the program's first flight, it is all the more gratifying to have carried out this successful flight test. This flight was one of the most challenging missions Orbital has ever conducted and demonstrated our ability to take on and tackle the toughest technical challenges."

Mr. Grabe added, "Our congratulations go out to NASA and all the partners on this program who persevered to get it right. We now have our sights set on a successful third mission to provide even more critical data to NASA's research into the field of hypersonic flight and to extend the flight speed record set today to Mach 10."

On launch day, flight operations began when NASA's B-52B carrier aircraft took off and flew a predetermined flight path to a point 50 miles off the California coast. The Hyper-X vehicle was released from the B-52 at 2:00 p.m. (PST) approximately 40,000 feet over the Pacific Ocean. Following rocket motor ignition, the Hyper-X Launch Vehicle, carrying the X-43A scramjet, accelerated to a velocity of approximately Mach 7 (or seven times the speed of sound) and reached an altitude of 95,000 feet. Approximately 90 seconds after ignition, with the booster at a precise trajectory condition, the Hyper-X launch vehicle sent commands to the X-43A scramjet, which then separated from the booster.

Early flight results indicate that the X-43A stabilized, ignited its scramjet and provided flight data back to NASA engineers. Following the engine burn, the X-43A executed a number of aerodynamic maneuvers during its eight-minute coast to an ocean impact approximately 450 miles from the launch point. After separation, the spent booster impacted the ocean in a pre-determined splash area.

The lure of a scramjet engine is that unlike a rocket it does not need to carry its oxidizer. It carries fuel but scoops oxygen from the atmosphere. A conventional jet engine does this as well but theoretically a scramjet can operate at much higher speeds. However, the heat and pressure at such high speeds have made the development of scramjet vehicles an extremely difficult challenge. Before this latest flight there was enough skepticism about scramjets that some press reports were predicting that if this flight failed the X-43A program would be cancelled. Fortunately this second X-43A test flight succeeded and NASA will continue to do scramjet vehicle development unless the Bush Administration is foolish enough to redirect the money toward a Moon and Mars expedition.

NASA's press release:

The unpiloted vehicle's supersonic combustion ramjet, or scramjet, ignited as planned and operated for the duration of its hydrogen fuel supply. The X-43A reached its test speed of Mach 7, or seven times the speed of sound.

The flight originated from NASA's Dryden Flight Research Center at Edwards Air Force Base, Calif. Taking off at 12:40 p.m. PST, NASA's B-52B launch aircraft carried the X-43A, which was mounted on a modified Pegasus booster rocket. The booster was launched from the B-52B just before 2 p.m. PST. The rocket boosted the X-43A up to its test altitude of about 95,000 ft. over the Pacific Ocean, where the X-43A separated from the booster and flew freely for several minutes.

Aside: Note the reference to the B-52B above. Am I the only one who is under the impression that only B-52G and B-52H aircraft are still operational?

The flight is part of the Hyper-X program, a research effort designed to demonstrate alternate propulsion technologies for access to space and high-speed flight within the atmosphere. It will provide unique "first time" free flight data on hypersonic air-breathing engine technologies that have large potential pay-offs.

See some diagrams of the test flight here.

By Randall Parker 2004 March 27 07:57 PM  Airplanes and Spacecraft
Entry Permalink | Comments(10)
Special Computer Speeds Protein Folding Calculations

A parallel computer designed for high energy physics is speeding up protein folding calculations by three orders of magnitude.

MONTREAL, CANADA -- Scientists at the U.S. Department of Energy's Brookhaven National Laboratory are proposing to use a supercomputer originally developed to simulate elementary particles in high-energy physics to help determine the structures and functions of proteins, including, for example, the 30,000 or so proteins encoded by the human genome. Structural information will help scientists better understand proteins' role in disease and health, and may lead to new diagnostic and therapeutic agents.

Unlike typical parallel processors, the 10,000 processors in this supercomputer (called Quantum Chromodynamics on a Chip, or QCDOC, for its original application in physics) each contain their own memory and the equivalent of a 24-lane superhighway for communicating with one another in six dimensions. This configuration allows the supercomputer to break the task of deciphering the three-dimensional arrangement of a protein's atoms -- 100,000 in a typical protein -- into smaller chunks of 10 atoms per processor. Working together, the chips effectively cut the computing time needed to solve a protein's structure by a factor of 1000, says James Davenport, a physicist at Brookhaven. This would reduce the time for a simulation from approximately 20 years to 1 week.

"The computer analyzes the forces of attraction and repulsion between atoms, depending on their positions, distances, and angles. It shuffles through all the possible arrangements to arrive at the most stable three-dimensional configuration," Davenport says.

This is a familiar theme to long-time FuturePundit readers: the rate of advance in biological science and technology is accelerating because technological advances are producing tools that allow scientists to find answers literally orders of magnitude faster. While I post more often about instrumentation advances, the ability of computers to simulate biological systems may turn out to be more important in the long run. Many experiments that are now done through lab work will in the future instead be done with computer simulations.

By Randall Parker 2004 March 27 02:01 PM  Biotech Advance Rates
Entry Permalink | Comments(1)
2004 March 25 Thursday
Last Ice Age Happened In South America Too

Researchers at UW-Madison, including Michael Kaplan and Brad Singer, along with researchers at Woods Hole Oceanographic Institution, have shown that the last ice age also happened in South America.

Using a technique to read the changes imposed by cosmic rays—charged, high-energy particles that bombard the Earth from outer space—on atoms found in the mineral quartz, the researchers were able to precisely date a sequence of moraines, ridge-like glacial features composed of an amalgam of rocks, clay, sand and gravel. Their results show that glacial ice in South America reached its apex 22,000 years ago and had begun to disappear by 16,000 years ago.

"The team has applied an innovative investigative technique to an untapped archive of data on natural climate variability to help reduce uncertainty in our knowledge of how Earth's climate works," said David Verardo, director of the National Science Foundation’s (NSF) paleoclimate program, which funded the research. NSF is the independent federal agency that supports fundamental research and education across all fields of science and engineering.

The work is certain to help researchers of past climates unravel the mysteries of ice ages that periodically gripped the planet, Verardo said, but it also will help those trying to understand current and future climate change by helping to determine the natural causes of changes in the Earth's climate system on a global scale.

"We've been able to get quite precise ages directly on these glacial deposits," says team leader Brad Singer, of UW-Madison. "We found that the structure of the last South-American ice age is indistinguishable from the last major glacier formation in the Northern Hemisphere."

And, said Kaplan, "During the last two times in Earth's history when glaciers formed in North America, the Andes also had major glacial periods."

The results address a major debate in the scientific community, according to Singer and Kaplan, because they seem to undermine a widely held idea that global redistribution of heat through the oceans is the primary mechanism that drove major climate shifts of the past.

The implications of the new work, say the study authors, support a different hypothesis: Rapid cooling of the Earth's atmosphere synchronized climate change around the globe during each of the last two glacial epochs.

"Because the Earth is oriented in space in such a way that the hemispheres are out of phase in the amount of solar radiation they receive, it is surprising to find that the climate in the Southern Hemisphere cooled off repeatedly during a period when it received its largest dose of solar radiation," says Singer. "Moreover, this rapid synchronization of atmospheric temperature between the polar hemispheres appears to have occurred during both of the last major ice ages that gripped the Earth."

There are big heat conveyor flows in the world's oceans. One fear of climatologists is that global warming might cause a halt to the flow of warm water into the North Atlantic due to a decrease in salinity caused by glacier melting and changing precipitation patterns. This would cause a dramatic cooling especially in Europe and even in parts of North America. However, while the possibility of that scenario still exists, these researchers are arguing that a halt in the heat conveyor probably didn't cause the glacial periods. This research shows the glacial periods happened in South America as well, and a halt in the conveyor would not have caused the same cooling effect in South America that it would have caused in Europe.

Most of the debate about climate change today centers on the questions of how humanity is changing the world's climate and how big an effort should be made to reduce the human impact on climate. However, if humanity survives long enough then eventually humans are going to have to decide whether to try to stop much larger climate changes that have non-human causes. When the Earth begins to descend into another ice age should humanity intervene to stop it? Some naturalists believe that processes that have non-sentient causes should be considered sacred and not be tampered with. In their minds that which is natural is morally superior and changes caused by sentient minds are inherently sinful. My guess is that if humanity lives long enough to witness the beginning of the next ice age then artificial intelligences will ally with the majority of the human race in favor of an intervention to prevent the ice age.

By Randall Parker 2004 March 25 10:52 AM  Climate Trends
Entry Permalink | Comments(10)
Two Approaches For Preventing Injured Nerve Cell Death

When an accident or any other event damages nerve cells, much of the nerve cell death and resulting disability occurs hours or even days after the traumatic event. The injured nerves, many of which are not damaged in ways that make death inevitable, go through changes that cause them to commit cell suicide. Two approaches for how to prevent nerve cell death have just been reported. The first approach, tried at Wake Forest University, involves the use of so-called heat stress proteins, which normally are found inside cells but which, if delivered outside of cells, still prevent a substantial fraction of damaged nerve cells from dying.

WINSTON-SALEM, N.C. – New findings in animals suggest a potential treatment to minimize disability after spinal cord and other nervous system injuries, say neuroscientists from Wake Forest University Baptist Medical Center.

“Our approach is based on a natural mechanism cells have for protecting themselves, called the stress protein response,” said Michael Tytell, Ph.D., a neuroscientist and the study’s lead researcher. “We believe it has potential for preventing some of the disability that occurs as a result of nervous system trauma and disease.”

The research showed that up to 50 percent of the motor and sensory nerve cell death could be prevented in mice with sciatic nerve injury. It is reported in the current issue of Cell Stress and Chaperones, a journal of stress biology and medicine.

“We are on our way to developing a treatment that is effective in preventing motor nerve cell death, which is significant to people because loss of motor neurons means paralysis,” said Tytell, professor of neurobiology and anatomy at Wake Forest Baptist.

The goal of the work is to prevent or minimize the “secondary” cell death that occurs in the hours and days after a spinal cord or brain injury. During this period, cells surrounding the injury can become inflamed and die, a cascading response that worsens disability.

Either these proteins could be delivered at injury sites or drugs could be developed to stimulate the production of Hsc70 and Hsp70 in damaged nerve cells.

For the study, the researchers treated injured sciatic nerves in mice with Hsc70 and Hsp70. In mice treated with the proteins, cell death was reduced by up to 50 percent compared to mice that weren’t treated.

Tytell said it is a novel idea that cells can be successfully treated with a protein that is ordinarily made inside the cells.

“We don’t know whether the protein is functioning in the same way as when it’s made in the cells,” he said. “We’re working to learn more about this effect. If we can understand it better, we’ll know what form it should be in and what the doses should be to maximize the protective benefits.”

Tytell and colleagues hope to use their knowledge about the proteins and how they work to develop drugs that could be used to treat injury. One idea is to develop a drug that would increase the production of the protective proteins.

Another group which includes lead author Sung Ok Yoon of Ohio State University used an antibody to neutralize a protein released by damaged nerve cells that plays a role in causing cell death.

Researchers report in the current online issue of the Proceedings of the National Academy of Sciences that they were able to prevent the death of damaged neurons by neutralizing a specific protein the injured cells secreted. Neurons carry messages from the brain to the spinal cord and the rest of the body.

Damaged neurons are rendered useless by the physical interaction of two cellular proteins – proNGF and p75, the researchers report. They learned that treating these injured cells with a proNGF antibody kept the proteins from interacting. In turn the neurons were saved from almost certain loss.

The approach of using an antibody to neutralize proNGF (which is a precursor to Nerve Growth Factor) saved most of the cells that otherwise would have died.

The researchers saw substantial increases in proNGF and p75 in damaged neurons within 24 hours after injury. Levels of p75 peaked three days after injury, as did neuronal death.

The researchers took another group of damaged neurons and treated these cells with an antibody to proNGF. Doing so kept proNGF from interacting with p75, and resulted in a 92 percent survival rate of otherwise damaged neurons.

"The antibody notably reduced the number of neurons that normally die after such injury," Yoon said. "But it's too soon to say if these rescued cells would function normally again after treatment.

"We do know that injury decreased the number of healthy, viable neurons by half," she said. "But the number of intact neurons remained at nearly 100 percent after antibody treatment."

To treat cancer what we need are better ways to cause cell death and to prevent cells from dividing. To develop stem cell therapies we need better ways to order cells to move to desired target locations, to divide, and to become specialized for various purposes. But for nerve cell damage caused by trauma or perhaps by toxins, what we need is the ability to prevent the series of steps that lead cells to commit suicide (called apoptosis). That two different teams have almost simultaneously come up with two different approaches for doing this in lab settings is encouraging. These results are not going to immediately lead to new treatments but they do demonstrate that such treatments are probably possible to create and these results provide useful information about what directions to pursue for further research.

By Randall Parker 2004 March 25 02:12 AM  Biotech Therapies
Entry Permalink | Comments(0)
2004 March 23 Tuesday
Closely Related Western Gorilla Males Less Likely To Fight

Western gorillas get along better across social groups when the groups' silverback males are more closely related.

Scientists studying the elusive western gorilla observed that neighboring social groups have surprisingly peaceful interactions, in contrast to the aggressive male behavior well documented in mountain gorillas. By analyzing the DNA from fecal and hair samples of the western gorilla, scientists uncovered evidence that these neighboring social groups are often led by genetically related males. These findings suggest connections between genetic relationships and group interactions, parallels with human social and behavioral structures, and clues to the social world of early humans.

In the new work, reported by Brenda Bradley and colleagues at the Max Planck Institute for Evolutionary Anthropology and Stony Brook University, the researchers collected DNA samples to characterize patterns of paternity within and among western gorilla social groups. The authors found that a strong majority of silverbacks were related to other silverbacks in the area and that in almost all cases, the nest sites of related silverbacks were found near each other. It was already known that both male and female western gorillas leave their natal group once mature, but the new findings suggest that the dispersing males may remain in the vicinity of male kin, forming a so-called "dispersed male network."

It makes sense that selective pressures would favor a greater willingness to harm those who are more genetically distant. Closer relations share more DNA, and therefore their well-being and reproductive success are more in the interest of their close relatives than is the case with more distant members of the same species.

The researchers theorize that the dispersed male network and the social behavior of the western gorilla may be connected, in part because peaceful interactions between related males may be beneficial. This idea is in keeping with kin-selection theory, a well-regarded set of ideas for how related members of a society interact to benefit the related group. According to the authors, western gorilla male networks may benefit younger males as they attempt to attract females and form new groups, since male-male aggression is thought to hinder the acquisition and retention of females. Similar scenarios have been reported for some bird species, and there is ample evidence of such relationships underlying aspects of human social interactions, including marriage patterns. In addition, some relevant aspects of western gorilla society are shared with chimpanzees. The new findings point to characteristics that appear to be held in common between humans and some other African apes, suggesting that kinship patterns both within and among groups may have played an important role in shaping the social world of early humans.

I'll bet that humans will eventually be found to have genetic variations that encourage this sort of difference in behavior as well.

By Randall Parker 2004 March 23 10:02 AM  Trends, Human Evolution
Entry Permalink | Comments(1)
2004 March 22 Monday
Sadness, Disgust Motivate People To Buy And Sell

Jennifer Lerner and colleagues at Carnegie Mellon University showed people movies that portrayed events that were either emotionally neutral, disgusting or saddening. While disgust caused people to be willing to sell at a lower price, it also caused them to be less willing to buy unless the price was also low. By contrast, sadness caused both an increased desire to sell and an increased desire to buy.

Sadness, too, cut people's selling price, to $3.06, compared with what emotionally neutral volunteers demanded. Feeling blue made people so desperate for change that they held a fire sale. Sadness also raised, to $4.57, the price people would pay for the pens.

Sad people are therefore more likely to sell things for less than the best market price they could get and to buy things for more than the market price. Is this caused by a desire to change one's surroundings in hopes of relieving the feeling of sadness? Or does a sad person just place a low value on their own money?

Physical disgust may have evolved to cause people to avoid things that are toxic or infectious to them.

"Disgust is a form of evasive action to protect us against signs of threat, such as disease," says Val Curtis, who led the research. "Women need to have a higher level of sensitivity to infection or disease, because they are the main carers of infants. And, as reproductive ability declines with age, so does disgust."

Avoid shopping or selling things when you are sad.

Update: Sadness makes people want to change their circumstances while disgust makes people want to get rid of what they have.

"We're showing for the first time that incidental emotions from one situation can exert a causal effect on economic behavior in other, ostensibly unrelated situations," said Jennifer Lerner, an assistant professor of social and decision sciences and psychology. Lerner co-authored the study with Economics Professor George Loewenstein and Deborah Small, a doctoral student in the Department of Social and Decision Sciences who soon will be joining the faculty at the Wharton School of Business at the University of Pennsylvania.

To study the effects of sadness on economic behavior, Lerner and her colleagues had one group of participants watch a scene from the film "The Champ" in which a young boy's mentor dies. The researchers elicited disgust in another group by showing a clip from the movie "Trainspotting" in which a man uses an unsanitary toilet. To augment their emotional states, the participants were asked to write about how they would feel in situations similar to those portrayed in the movies. Individuals were then given the opportunity to either sell a highlighter set they were given, or set a price that they would be willing to pay to buy a highlighter set. The study participants then actually bought and sold the highlighters.

The researchers concluded that sadness triggered an implicit need for individuals to change their circumstances, thus a greater willingness to buy new goods or to sell goods that they already had, while disgust made people want to get rid of what they had and made them reluctant to take on anything new, depressing all prices. When asked whether emotion played a role in their decisions, participants said no, indicating that they were unaware that their emotional state could be costing them money. The researchers already have replicated their results in another experiment.

By Randall Parker 2004 March 22 02:38 AM  Brain Economics
Entry Permalink | Comments(1)
2004 March 19 Friday
Primate Frontal Cortex Hyperscales

Across primates the size of the frontal cortex goes up faster than the size of the rest of the brain.

"In primates, having a bigger brain means you have a disproportionately larger frontal cortex," said Eliot Bush, a PhD candidate at Caltech who worked on the study.

The large total brain size of humans results in a disproportionately larger frontal cortex.

PASADENA, Calif.--Everybody from the Tarzan fan to the evolutionary biologist knows that our human brain is more like a chimpanzee's than a dog's. But is our brain also more like a tiny lemur's than a lion's?

In one previously unsuspected way, the answer is yes, according to neuroscientists at the California Institute of Technology. In the current issue of the Proceedings of the National Academy of Sciences (PNAS), graduate student Eliot Bush and his professor, John Allman, report their discovery of a basic difference between the brains of all primates, from lemurs to humans, and all the flesh-eating carnivores, such as lions and tigers and bears.

The difference lies in the way the percentage of frontal cortex mass increases as the species gets larger. The frontal cortex is the portion of brain just behind the forehead that has long been associated with reasoning and other "executive" functions. In carnivores, the frontal cortex becomes proportionately larger as the entire cortex of the individual species increases in size--in other words, a lion that has a cortex twice the size of another carnivore's also has a frontal cortex twice the size.

By contrast, primates like humans and apes tend to have a frontal cortex that gets disproportionately larger as the overall cortex increases in size. This phenomenon is known as "hyperscaling," according to Bush, the lead author of the journal article.

What this says about the human relationship to the tiny lemurs of Madagascar is that the two species likely share a developmental or structural quirk, along with all the other primates, that is absent in all the carnivores, Bush explains. "The fact that humans have a large frontal cortex doesn't necessarily mean that they are special; relatively large frontal lobes have developed independently in aye-ayes among the lemurs and spider monkeys among the New World monkeys."

...

The hyperscaling mechanism is genetic, and was presumably present when the primates first evolved. "Furthermore, it is probably peculiar to primates," says Allman, who is Hixon Professor of Neurobiology at Caltech.

Given that brain size differs between humans, it would be interesting to know whether this same mathematical relationship exists when comparing humans with each other.

From the abstract of the paper:

Primate frontal cortex hyperscales relative to the rest of neocortex and the rest of the brain. The slope of frontal cortex contrasts on rest of cortex contrasts is 1.18 (95% confidence interval, 1.06-1.30) for primates, which is significantly greater than isometric. It is also significantly greater than the carnivore value of 0.94 (95% confidence interval, 0.82-1.07). This finding supports the idea that there are substantial differences in frontal cortex structure and development between the two groups.
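To make the slope numbers concrete, here is a small illustrative calculation of what hyperscaling means. A slope of b on the log-log plot implies frontal cortex size scales roughly as (rest of cortex)^b; the absolute sizes below are arbitrary units, not measured values:

def frontal_size(rest_of_cortex, slope, k=1.0):
    # Allometric relation implied by a log-log regression slope:
    # frontal = k * rest**slope (arbitrary units)
    return k * rest_of_cortex ** slope

for rest in (1.0, 2.0, 4.0):
    print(rest, frontal_size(rest, 1.18), frontal_size(rest, 0.94))
# Doubling the rest of the cortex multiplies the primate frontal cortex by
# 2**1.18 ≈ 2.27, but a carnivore's by only 2**0.94 ≈ 1.92; that gap is why a
# bigger primate brain has a disproportionately larger frontal cortex.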

What might have been the cause of this hyperscaling? Is there something about the way all primates function that makes it advantageous?

By Randall Parker 2004 March 19 02:29 AM  Trends, Human Evolution
Entry Permalink | Comments(1)
2004 March 18 Thursday
Climate Researchers Find Less Water Evaporation As Earth Warms

Kenneth R. Minschwaner of the New Mexico Institute of Mining and Technology and Andrew Dessler of the University of Maryland have found that warming caused by carbon dioxide build-up will cause less evaporation of water and therefore less additional warming than climate modellers have been assuming. (same article here and both have graphic illustrations)

A NASA-funded study found some climate models might be overestimating the amount of water vapor entering the atmosphere as the Earth warms. Since water vapor is the most important heat-trapping greenhouse gas in our atmosphere, some climate forecasts may be overestimating future temperature increases.

In response to human emissions of greenhouse gases, like carbon dioxide, the Earth warms, more water evaporates from the ocean, and the amount of water vapor in the atmosphere increases. Since water vapor is also a greenhouse gas, this leads to a further increase in the surface temperature. This effect is known as "positive water vapor feedback." Its existence and size have been contentiously argued for several years.

Ken Minschwaner, a physicist at the New Mexico Institute of Mining and Technology, Socorro, N.M., and Andrew Dessler, a researcher with the University of Maryland, College Park, and NASA's Goddard Space Flight Center, Greenbelt, Md, did the study. It is in the March 15 issue of the American Meteorological Society's Journal of Climate. The researchers used data on water vapor in the upper troposphere (10-14 km or 6-9 miles altitude) from NASA's Upper Atmosphere Research Satellite (UARS).

Their work verified water vapor is increasing in the atmosphere as the surface warms. They found the increases in water vapor were not as high as many climate-forecasting computer models have assumed. "Our study confirms the existence of a positive water vapor feedback in the atmosphere, but it may be weaker than we expected," Minschwaner said.

Water evaporation will contribute to warming but not as much as previously predicted.

In most computer models relative humidity tends to remain fixed at current levels. Models that include water vapor feedback with constant relative humidity predict the Earth's surface will warm nearly twice as much over the next 100 years as models that contain no water vapor feedback.

Using the UARS data to actually quantify both specific humidity and relative humidity, the researchers found, while water vapor does increase with temperature in the upper troposphere, the feedback effect is not as strong as models have predicted. "The increases in water vapor with warmer temperatures are not large enough to maintain a constant relative humidity," Minschwaner said. These new findings will be useful for testing and improving global climate models.
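As a rough illustration of why a constant-relative-humidity feedback roughly doubles the predicted warming, here is the standard linearized feedback approximation in a few lines of Python. The no-feedback warming and the feedback strengths below are illustrative assumptions, not numbers from the study:

dT_no_feedback = 1.2   # deg C of warming from CO2 alone (illustrative assumption)

def equilibrium_warming(feedback_fraction):
    # Linear feedback approximation: dT = dT_no_feedback / (1 - f)
    return dT_no_feedback / (1.0 - feedback_fraction)

print(equilibrium_warming(0.0))    # no water vapor feedback: 1.2 deg C
print(equilibrium_warming(0.5))    # f = 0.5 doubles the warming: 2.4 deg C
print(equilibrium_warming(0.35))   # a weaker feedback lands in between: ~1.8 deg C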

While gradual warming is the most widely discussed possible future climate scenario, there is a real possibility that the Earth's climate could undergo a rapid cooling initiated by natural processes or human-caused changes.

While policymakers have worried long and hard about global warming, which might raise Earth's temperature 1.4 to 5.8 degrees C by century's end, a growing body of evidence suggests natural forces could just as easily plunge Earth's average temperatures downward. In the past, the planet's climate has changed 10 degrees in as little as 10 years.

...

For example: Regional and global climates have undergone quick and dramatic changes even after what would appear to be only gentle prodding by natural influences, Dr. Eakin says. In many cases, that prodding has been far less severe than the changes humans have wrought via industrial emissions of carbon dioxide.

We have not seen sudden drastic climate changes in the lifetimes of anyone now living. But the Little Ice Age temperature dips have happened as recently as the US Revolutionary War period and into the 19th century.

The general trends reflected in the tree-ring record include cooler conditions in the early 1700s, followed by warming that started mid-century. An abrupt cooling occurred in the late 1700s and continued for much of the 1800s. The coldest period was between the 1830s and 1870s, after which a steadily increasing warming trend began.

We shouldn't be completely surprised if the Earth's climate begins to undergo some sudden change in the coming years and decades. It has happened often enough in the past that it is really a question of when, not if, this will occur.

By Randall Parker 2004 March 18 07:58 PM  Climate Trends
Entry Permalink | Comments(1)
2004 March 17 Wednesday
Coal Staging Comeback As Natural Gas, Oil Prices Rise

Triggering a lot of thoughts about energy is a good article Mark Clayton wrote in The Christian Science Monitor on February 26, 2004 that has been in my "ought to post about this" list for too long. The article is entitled America's new coal rush.

After 25 years on the blacklist of America's energy sources, coal is poised to make a comeback, stoked by the demand for affordable electricity and the rising price of other fuels.

At least 94 coal-fired electric power plants - with the capacity to power 62 million American homes - are now planned across 36 states.

Many different electric power companies have decided that coal is going to be cheaper than natural gas as a source of energy for generating electric power. After a long period during which most new electric power plants were built to burn natural gas in order to reduce emissions, this represents a substantial shift in long-term views about the availability of different fossil fuels. While part of that shift may be due to advances in coal-burning technologies that reduce emissions, it also appears to be part of a growing belief that neither oil nor natural gas production will be able to rise as rapidly as demand. At some point in the next two decades it is quite probable that their production will actually fall. This spells a coming era of wrenching readjustments and difficult economic times.

Some experts claim that only half these plants may be built. But that is still a large number.

But experts caution that perhaps no more than half of all proposed plants will ever be built. It can take seven to 10 years for a coal power plant to go from planning to construction - and legal action and public protests often halt them.

My guess is that rising prices for other forms of energy will create conditions that will lead to the building of all of these planned coal-fired electricity generation plants and probably many more.

Industry plans for building coal electric power plants come from a US Department of Energy National Energy Technology Laboratory Office of Coal and Environmental Systems February 24, 2004 report entitled Tracking New Coal-Powered Power Plants: Coal's Resurgence In Electric Power Generation (PDF format).

CIBC World Markets economist Jeffrey Rubin says there are already signs that conventional oil production may have peaked.

Strip out unconventional sources of supply, and crude production is hovering around 65 million barrels, where it has been for the past four years. Has the world already seen the peak in conventional crude production?

The roughly 82 million barrels per day of total production today includes oil sands extraction and very deep sea extraction.

Dr. David Goodstein, Vice Provost and Professor of Physics and Applied Physics at Caltech, has recently written a book entitled Out of Gas: The End of the Age of Oil where he argues that the peak of oil production is rapidly approaching. A CalTech press release on the book provides a sketch of Goodstein's arguments on the coming decline in the production of oil.

But even the 1970s' experience would be nothing compared to a worldwide peak, Goodstein explains. Indeed, the country then experienced serious gas shortages and price increases, exacerbated in no small part by the Arab oil embargo. But frustration and exasperation aside, there was oil to buy on the global market if one could locate a willing seller. By contrast, the global peak will mean that prices will thereafter rise steadily and the resource will become increasingly hard to obtain.

Goodstein says that best and worst-case scenarios are fairly easy to envision. At worst, after the so-called Hubbert's peak (named after M. King Hubbert, the Texas geophysicist who was nearly laughed out of the industry in the 1950s for even suggesting that a U.S. production peak was possible), all efforts to deal with the problem on an emergency basis will fail. The result will be inflation and depression that will probably result indirectly in a decrease in the global population. Even the lucky survivors will find the climate a bit much to take, because billions of people will undoubtedly rely on coal for warmth, cooking, and basic industry, thereby spewing a far greater quantity of greenhouse gases into the air than that which is currently released.

"The change in the greenhouse effect that results eventually tips Earth's climate into a new state hostile to life. End of story. In this instance, worst case really means worst case."

The best-case scenario, Goodstein believes, is that the first warning that Hubbert's peak has occurred will result in a quick and stone-sober global wake-up call. Given sufficient political will, the transportation system will be transformed to rely at least temporarily on an alternative fuel such as methane. Then, more long-term solutions to the crisis will be put in place--presumably nuclear energy and solar energy for stationary power needs, and hydrogen or advanced batteries for transportation.

The preceding is the case that Goodstein makes in the first section of the book. The next section is devoted to a nontechnical explanation of the facts of energy production. Goodstein, who has taught thermodynamics to a generation of Caltech students, is particularly accomplished in conveying the basic scientific information in an easily understandable way. In fact, he often does so with wit, explaining in a brief footnote on the naming of subatomic particles, for example, that the familiar "-on" ending of particles, such as "electrons," "mesons," and "photons," may also suggest an individual quantum of humanity known as the "person."

The remainder of the book is devoted to suggested technological fixes. None of the replacement technologies are as simple and cheap as our current luxury of going to the corner gas station and filling up the tank for the equivalent of a half-hour's wages, but Goodstein warns that the situation is grave, and that things will change very soon.

"The crisis will occur, and it will be painful," he writes in conclusion. "Civilization as we know it will come to an end sometime in this century unless we can find a way to live without fossil fuels."

Goodstein sees the peak coming in this decade or the next decade. Needless to say, the world is in no way prepared to adjust to a declining supply of oil.

Goodstein says that at current photovoltaic conversion efficiencies it would take an area of land 300 by 300 miles to get as much energy as we get from fossil fuels.

Solar energy will be an important component, an important part of the solution. If you want to gather enough solar energy to replace the fossil fuel that we’re burning today—and remember we’re going to need more fossil fuel in the future- using current technology, then you would have to cover something like 220,000 square kilometers with solar cells. That’s far more than all the rooftops in the country. It would be a piece of land about 300 miles on a side, which is big but not unthinkable.

Dr. Goodstein was kind enough to provide me with some of the basic facts that went into those figures. The energy that would be collected by a 300 by 300 mile area is for the whole world, and he's assuming a current world total fossil fuel burn of 10 TW (ten trillion watts). He's also assuming a 10% conversion efficiency for the photovoltaics.

Note of course that part of that energy could be gotten from rooftops. Also, some could be gotten from other human structures. It is conceivable, for example, that future materials advances may allow the construction of roads that could operate as huge photovoltaic power collectors. Also, boosts in conversion efficiency could reduce the amount of area needed by a factor of perhaps 4 or 5 or even higher. For example, some researchers at Lawrence Berkeley Labs have shown that an indium gallium nitride material could boost conversion efficiency to 50%. Also, many uses of power could be made much more energy efficient.
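Here is a rough reconstruction of the arithmetic behind those figures. The 10 TW fossil fuel burn, 10% efficiency, and 220,000 square kilometer area come from the text above; the average insolation is not stated, so the sketch solves for the value those numbers imply:

fossil_fuel_power = 10e12    # watts, world fossil fuel burn (from the text)
pv_efficiency = 0.10         # photovoltaic conversion efficiency (from the text)
area_m2 = 220_000 * 1e6      # quoted collector area in square meters

implied_insolation = fossil_fuel_power / (pv_efficiency * area_m2)
print(implied_insolation)          # ~455 W per square meter
print((220_000 ** 0.5) / 1.609)    # square side length: ~290 miles, close to the quoted 300

# 455 W/m^2 falls between peak midday sunlight (~1000 W/m^2) and the 24-hour
# average (~200-250 W/m^2), so the quoted area is an order-of-magnitude estimate.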

Another recent book, Hubbert's Peak: The Impending World Oil Shortage by Kenneth S. Deffeyes, makes similar arguments that the peak of world oil production is approaching.

Deffeyes used a slightly more sophisticated version of the Hubbert method to make the global calculations. The numbers pointed to 2003 as the year of peak production, but because estimates of global reserves are inexact, Deffeyes settled on a range from 2004 to 2008. Three things could upset Deffeyes's prediction. One would be the discovery of huge new oil deposits. A second would be the development of drilling technology that could squeeze more oil from known reserves. And a third would be a steep rise in oil prices, which would make it profitable to recover even the most stubbornly buried oil.

In a delightfully readable and informative primer on oil exploration and drilling, Deffeyes addresses each point. First, the discovery of new oil reserves is unlikely--petroleum geologists have been nearly everywhere, and no substantial finds have been made since the 1970s. Second, billions have already been poured into drilling technology, and it's not going to get much better. And last, even very high oil prices won't spur enough new production to delay the inevitable peak.

"This much is certain," he writes. "No initiative put in place starting today can have a substantial effect on the peak production year. No Caspian Sea exploration, no drilling in the South China Sea, no SUV replacements, no renewable energy projects can be brought on at a sufficient rate to avoid a bidding war for the remaining oil."

I've previously written here on the coming oil production peak.

On my ParaPundit site I've written extensively about the political ramifications of rising oil demand during a period of rising prices and greater dependence on the Middle East. One possible source of hope is the possibility of extracting natural gas from ocean gas hydrates. Or perhaps we will be saved by a breakthrough in desktop fusion. Conventional nuclear power has both cost and proliferation problems. What we need is a massive research push on the order of $5 to $10 billion per year in many different energy technology areas to develop methods to produce energy from other sources and to use energy more efficiently.

Update: At a February 24, 2004 symposium hosted by the Center for Strategic & International Studies, energy industry investment banker Matthew W. Simmons presented a skeptical analysis of official Saudi Arabian oil reserve claims. (PDF format and the following links as well) A couple of Saudi Aramco employees argued for Saudi estimates. If Simmons is correct then the biggest oil field in Saudi Arabia may already be mostly depleted and the beginning of the decline of oil production in Saudi Arabia may happen decades sooner than conventional wisdom expects. Also from the event: the introduction and the event transcript.

Demand for energy is going to rise with rising populations and growing economies even as oil production may start to decline.

One of Goodstein's Caltech colleagues, chemistry professor Nathan S. Lewis, has calculated the total energy used in the world today, coming up with a grand total of 13 trillion watts consumed annually. That figure, he expects, will rise to 28 trillion watts in the next 40 years or so as the world's population increases from 6 billion to 10 billion.
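A quick per-capita check of those numbers (both figures from the quote above):

print(13e12 / 6e9)    # ~2,170 watts per person today
print(28e12 / 10e9)   # 2,800 watts per person at 10 billion people
# The projection therefore assumes per-capita energy use keeps rising, not just population.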

By Randall Parker 2004 March 17 05:56 PM  Energy Fossil Fuels
Entry Permalink | Comments(8)
2004 March 16 Tuesday
Brain Scans Show Abnormalities In Psychopaths

A USC professor used MRI brain scans, a battery of cognitive function tests, and criminal histories to compare normal people with psychopaths and also to compare psychopaths who manage to avoid getting caught with psychopaths who get arrested for committing crimes.

Adrian Raine, a professor of psychology and neuroscience in the USC College of Letters, Arts & Sciences, focused his research on two parts of the brain: the hippocampus, a portion of the temporal lobe that regulates aggression and transfers information into memory; and the corpus callosum, a bridge of nerve fibers that connects the cerebral hemispheres.

One type of psychopath is adept at avoiding getting caught committing crimes but another type is not.

To explore the physical roots to psychopathic behavior, Raine and his colleagues recruited 91 men from Los Angeles’ temporary employment pool and gave them a battery of tests to assess cognitive ability, information processing skills and criminal history. They also were given MRIs, or brain scans.

In the study of the hippocampus, the research team expanded the scope of previous studies by comparing the brains of two groups for the first time: “successful” psychopaths - those who had committed crimes but had never been caught - and “unsuccessful” psychopaths - those who had been caught.

The hippocampus plays a critical role in regulating aggression and in learning which situations one should be afraid of - a process called contextual fear conditioning.

With psychopaths, contextual fear conditioning plays a part in learning the concept of what to do and what not to do, Raine said. It has been theorized that the disruption of the circuit linking the hippocampus with the prefrontal cortex could contribute to the impulsiveness, lack of control and emotional abnormalities observed in psychopaths.

“It is learning what is right and what is wrong in a certain situation,” he said.

The difference between successful psychopaths (those who avoid getting arrested) and unsuccessful psychopaths is that the more successful ones have a greater ability to learn fear of getting caught and to therefore guide their own behavior to minimize the chances of getting caught.

He tested the theory that psychopaths with hippocampal impairments could become insensitive to cues that predicted punishment and capture. As a result, he said, these “impaired’ psychopaths were more likely to be apprehended than psychopaths without that deficit.

Fewer than half of both the control subjects and the “successful” psychopaths had an asymmetrical hippocampus.

Ninety-four percent of the unsuccessful psychopaths had that same abnormality, with the right side of the hippocampus larger than the left.

The successful and unsuccessful psychopaths share in common a different form of faulty brain wiring that causes them to lack empathy and consideration for other people.

These findings were bolstered by the results of the second study, which focused on the corpus callosum.

The corpus callosum is a bundle of nerve fibers that connects the two hemispheres of the brain, enabling them to work together to process information and regulate autonomic function. Raine explored its role in psychopathy for the first time.

“There’s faulty wiring going on in psychopaths. They’re wired differently than other people,” Raine said. “In a way, it’s literally true in this case.”

He found that the psychopaths’ corpus callosums were an average of 23 percent larger and 7 percent longer than the control groups’.

“The corpus callosum is bigger, but it’s also thinner. That suggests that it developed abnormally,” Raine said.

The rate that the psychopaths transmitted information from one hemisphere to the other through the corpus callosum also was abnormally high, Raine said.

But that didn’t mean things worked better.

With an increased corpus callosum came less remorse, fewer emotions and less social connectedness - the classic hallmarks of a psychopath, he said.

“These people don’t react. They don’t care,” Raine said. “Why that occurs, we don’t fully know, but we are beginning to get important clues from neuro-imaging research.”

When it becomes possible to diagnose psychopaths, should they be placed under greater sustained law enforcement scrutiny? The better adapted psychopaths who feel a great deal of fear of getting caught are currently getting away with many crimes. If we can identify who they are, should they be treated differently?

Also, if a psychopath can be diagnosed in advance as extremely dangerous, should it be permissible to lock such a person up in an institution before he rapes or kills or does other harm to people? What if a person could be identified as a psychopath at the age of 14? Should such a person be removed from normal society?

Suppose it became possible to treat the brains of psychopaths to cause them to have greater empathy, greater remorse, and less impulsiveness. Should the government have the power to compel psychopaths to accept treatment that will change the wiring of their brains?

Also, if there is a genetic basis for psychopathy and it becomes possible to test for it then should people who have the genetic variations for psychopathic brain wiring be allowed to reproduce? Should they be allowed to reproduce if only they submit to genetic engineering of their developing offspring?

I predict that most of these hypothetical questions will become real questions that will be debated in many countries around the world. I also predict that most populations will support either preemptive restraint of psychopaths or forced treatment to change the brains of psychopaths to make them less dangerous.

By Randall Parker 2004 March 16 03:04 PM  Brain Society
Entry Permalink | Comments(41)
2004 March 15 Monday
Cal Tech Quake Lab Extracts DNA From Single Cell

The Stephen Quake lab at the California Institute of Technology (Caltech) has developed a microfluidic device that will extract the DNA from a single cell.

By shrinking laboratory machines to minute proportions, California scientists have built a postage stamp-sized chip that drags DNA from cells. The device might one day shoulder some of scientists' routine tasks.

Microfluidic devices are going to revolutionize biological science and medicine because they will lower costs and increase automation by orders of magnitude.

The chip requires thousands to millions times less of the expensive chemicals required to isolate and process nucleic acids such as DNA and RNA. Once commercialized, it could have a profound impact on the nucleic acid isolation market, which is worth $232 million per year in the United States alone. Current leaders in that market include Qiagen in Germany, Sigma-Aldrich in St. Louis and Amersham Biosciences in Britain.

Parallel processing samples on a chip will speed up the rate of analysis and lower costs.

Steve Quake's team describes the general architecture for parallel processing of nanoliter fluids for biotechnology in a letter in the March 15 Nature Biotechnology. “We think it's an important milestone in general biological automation,” he told The Scientist.

Automation lowers costs, and the ability to use smaller samples and smaller amounts of reagents also lowers costs. But another advantage of microfluidics is that it enables the measurement of things that otherwise could not be measured at all. It is often very difficult to get large samples in the first place. Many cell types are difficult or impossible to grow in culture. Also, growing cells in culture will change their internal chemical state. The ability to look inside and measure the contents and structure of a single cell will allow many types of experiments and tests that are just not possible today.

Also see my previous posts on the Quake lab's work: Cal Tech Chemical Lab On A Silicon Chip and Microfluidics to Revolutionize Biosciences.

By Randall Parker 2004 March 15 11:13 AM  Biotech Advance Rates
Entry Permalink | Comments(0)
Aubrey de Grey: First Person To Live To 1000 Already Alive

University of Cambridge biogerontologist Aubrey de Grey says the first person who will live to be 1000 is 45 years old right now.

The first person to hit 150, he believes, is already 50 now. And the first individual to celebrate 1,000 -- imagine the candles on that birthday cake -- is just five years younger, he contends.

Aubrey thinks aging is barbaric. Aubrey is right.

"Aging is fundamentally barbaric, and something should be done about it," said de Grey, who has published research in Science and other peer-reviewed journals. "It shouldn't be allowed in polite society."

Aubrey believes there will be a sea change in public opinion about the reversibility of aging once genetic engineering, stem cell therapies, and several other aging-reversal therapies allow mice to live much longer. Toward this end Aubrey is one of the founders of the Methuselah Mouse Foundation, which offers cash prizes to scientists who develop techniques that set new records for mouse longevity. This work will lead to the development of a combination of treatments that will allow the attainment of engineered negligible senescence, where the body effectively ceases to age from one year to the next. The major categories of approaches to reverse aging are called Strategies for Engineered Negligible Senescence, or SENS for short.

By Randall Parker 2004 March 15 10:35 AM  Aging Reversal
Entry Permalink | Comments(16)
Transplanted Stem Cells Grow Hair In Mice

George Cotsarelis, M.D. of the University of Pennsylvania Medical School has shown in mice that stem cells in hair follicles can be labelled, extracted, and reimplanted to cause hair growth in the target implant area.

Stem cells plucked from the follicles of mice can grow new hair when implanted into another animal. The work represents a dramatic step forward that is sure to stimulate new research into treatments for human baldness.

Cotsarelis' group and a different group at Rockefeller University headed by Elaine Fuchs each separately developed means to label the stem cells around follicles so that they could be isolated from other cells around them. Then gene arrays were used to study which genes were on and off. That pattern of gene expression ought to be useful for discovering the same cell type in humans.

After purifying a sufficient amount of these cells, both groups used gene chips to find which genes were switched on in the stem cells. For the first time, this provides a signature that researchers can use to identify the same cells in humans.

Cells will be extracted from areas where hair grows and implanted where the hair doesn't now grow.

"We've shown for the first time these cells have the ability to generate hair when taken from one animal and put into another," Cotsarelis said in a telephone interview. "You can envision a process of isolating existing stem cells and re-implanting them in the areas where guys are bald."

One advantage of using your own adult stem cells for this purpose is that immune rejection problems are avoided.

This incredibly important, momentous, world altering, glorious, and revolutionary advance in medical science will be available for men to use within 5 to 10 years.

"I think this or something like it will be available in the next five to 10 years," said George Cotsarelis

Here's the abstract of the Fuchs group paper on their method for identifying the hair follicle stem cells:

Many adult regenerative cells divide infrequently but have high proliferative capacity. We developed a strategy to fluorescently label slow-cycling cells in a cell type-specific fashion. We used this method to purify the label-retaining cells (LRCs) that mark the skin stem cell (SC) niche. We found that these cells rarely divide within their niche but change properties abruptly when stimulated to exit. We determined their transcriptional profile, which, when compared to progeny and other SCs, defines the niche. Many of the >100 messenger RNAs preferentially expressed in the niche encode surface receptors and secreted proteins, enabling LRCs to signal and respond to their environment.

By Randall Parker 2004 March 15 02:53 AM  Biotech Organ Replacement
Entry Permalink | Comments(0)
2004 March 13 Saturday
Partial Recovery From Methamphetamine-Induced Brain Damage

Gene-Jack Wang of Brookhaven National Laboratory (also assistant professor of radiology at SUNY Stony Brook) and coworkers used positron emission tomography (PET scans) to show that only part of the damage caused by methamphetamine abuse is repaired 12 to 17 months after the end of the drug abuse.

Researchers found that former meth abusers showed improved glucose metabolism in a brain region called the thalamus after staying off the drug for 12 to 17 months.

The striatum does not appear to recover.

There was, however, no evidence of improved metabolism in a brain region called the striatum -- suggesting, researchers say, that some meth-induced brain changes are long lasting.

From the abstract of the research paper:

RESULTS: Significantly greater thalamic, but not striatal, metabolism was seen following protracted abstinence relative to metabolism assessed after a short abstinence interval, and this increase was associated with improved performance in motor and verbal memory tests. Relative to the comparison subjects, the methamphetamine abusers tested after protracted abstinence had lower metabolism in the striatum (most accentuated in the caudate and nucleus accumbens) but not in the thalamus.

Take home lesson: addictive drugs really do fry the brain and should be avoided. Not a particularly novel observation? Yes, that is true. But evidence from brain scans makes obvious truths harder to deny.

By Randall Parker 2004 March 13 05:47 PM  Brain Addiction
Entry Permalink | Comments(102)
2004 March 12 Friday
Bone Marrow Stem Cell Aging Leads To Heart Disease

Blood levels of a type of adult stem cell called endothelial progenitor cells are inversely correlated with the arterial damage that leads to heart disease.

NEW ORLEANS -- Duke University Medical Center researchers have uncovered a strong relationship between the severity of heart disease and the level of endothelial progenitor cells circulating in the bloodstream. This relationship, if confirmed by ongoing studies, could represent an important new diagnostic and therapeutic target for the treatment of coronary artery disease, they said.

These endothelial progenitor cells (EPC) are produced in the bone marrow, and one of their roles is to repair damage to the lining of blood vessels. Duke cardiologists believe that one cause of coronary artery disease is an increasing inability over time of these EPCs to keep up with the damage caused to the arterial lining, or endothelium.

"In our study we found that patients with multi-vessel disease had many fewer EPCs, which supports our hypothesis that these cells play an important role in protecting blood vessels," said cardiologist Geoffrey Kunz, M.D., of the Duke Clinical Research Institute. "If you don't have enough of the cells, the ongoing damage to the endothelium from traditional risk factors occurs faster than the body's ability for repair."

EPC levels are independently correlated with heart disease incidence.

"We found that the patients with multi-vessel disease had significantly lower EPC counts than those without -- 13 CFU vs. 41.7 CFU," Kunz said. "Additionally, for every 10 CFU increase in EPC level, a patient's likelihood for multi-vessel disease declined by 20 percent."

While the EPC levels did not vary significantly by age, gender or other risk factors, the researchers found that the levels were lower for diabetics (19 CFU vs. 36 CFU) and for patients who had suffered a recent heart attack (23 CFU vs. 34 CFU).

"These findings demonstrate a strong inverse relationship between circulating EPCs and coronary artery disease, independent of traditional disease risk factors," Kunz said.

The researchers said that it might ultimately be possible to forestall or even prevent development of atherosclerosis by injecting these cells into patients or by retraining the patient's own stem cells to differentiate into progenitor cells capable of arterial repair.

While the direct clinical use of stem cells as a treatment might be many years off, the researchers said it is likely that strategies currently used to reduce the risks for heart disease -- such as lifestyle modifications and/or different medications -- preserve these rejuvenating cells for a longer period of time, which delays the onset of atherosclerosis.
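
Treating the 20-percent-per-10-CFU figure quoted above as multiplicative — an assumption on my part, since the release does not say how the relationship scales — the gap between patients with and without multi-vessel disease works out to roughly half the odds:

$$ \frac{41.7 - 13}{10} \approx 2.9, \qquad 0.8^{2.9} \approx 0.52 $$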

This latest result in humans illustrates yet again how important it is to develop stem cell therapies to replace aged adult stem cell reservoirs with rejuvenated stem cells. Duke researchers have already successfully shown that replacement bone marrow stem cell therapies reduce the development of atherosclerosis in mice. See my previous post Bone Marrow Stem Cell Aging Key In Atherosclerosis. Also, this latest result is not the first indication that the aging of stem cells in humans is a heart disease risk. See my previous post Aged Blood Stem Cells Indicator For Cardiovascular Disease Risk. These results demonstrate that we do not need to develop a greater understanding of aging in order to start developing rejuvenation therapies. The major challenge now is to develop effective treatments that will repair and replace aged tissue. Research aimed at developing useful stem cell therapies is a key piece of the rejuvenation puzzle.

By Randall Parker 2004 March 12 07:59 PM  Aging Reversal
Entry Permalink | Comments(0)
Collapsing European Populations: Time To Reverse Aging

The Daily Telegraph (free registration required) has an interesting article on demographic trends in Latvia and other European countries.

"Abortion on demand, which carries no social stigma, is almost as common as live birth." The collapse in the fertility rate has now continued so long that further contraction appears inevitable. The United Nations forecasts that Latvia will lose 44 per cent of its population by 2050. The projected collapse for Estonia is 52 per cent, Russia 30 per cent, Italy 22 per cent, Poland 15 per cent and Greece 10 per cent. Britain will grow slowly to 66 million, while France and Germany will contract gently.

The governments are changing their policies on taxes and benefits to encourage child birth. Will this work? Past attempts by governments to do so are hardly encouraging. As an extreme example Romania during the communist years tried drastic measures to boost birth rates.

Although government expenditures on material incentives rose by 470 percent between 1967 and 1983, the birthrate actually decreased during that time by 40 percent. After 1983, despite the extreme measures taken by the regime to combat the decline, there was only a slight increase, from 14.3 to 15.5 per 1,000 in 1984 and 16 per 1,000 in 1985. After more than two decades of draconian anti- abortion regulation and expenditures for material incentives that by 1985 equalled half the amount budgeted for defense, Romanian birthrates were only a fraction higher than those rates in countries permitting abortion on demand.

Romanian demographic policies continued to be unsuccessful largely because they ignored the relationship of socioeconomic development and demographics. The development of heavy industry captured most of the country's investment capital and left little for the consumer goods sector. Thus the woman's double burden of child care and full-time work was not eased by consumer durables that save time and labor in the home. The debt crisis of the 1980s reduced the standard of living to that of a Third World country, as Romanians endured rationing of basic food items and shortages of other essential household goods, including diapers. Apartments were not only overcrowded and cramped, but often unheated. In the face of such bleak conditions, increased material incentives that in 1985 amounted to approximately 3.61 lei per child per day--enough to buy 43 grams of preserved milk--were not enough to overcome the reluctance of Romanian women to bear children.

Part of the reason why the Ceausescu regime failed to sustain an increase in the birth rate in Romania is that an expenditure level equal to half of Romania's annual defense budget was still only enough to buy each child 43 grams of preserved milk per day. Well, think about those 43 grams as calories. Fat is 9 calories per gram while sugar and protein are 4 calories per gram. So the Romanians were so poor that half their defense budget could pay for only a small fraction of their children's daily calorie needs. Still, even pronatalist policies by Western European countries seem to be having little effect, as free child care and other benefits have done little to slow the decline in birth rates in Scandinavia.
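
A rough check of that arithmetic, assuming preserved milk runs somewhere around 3 to 5 calories per gram (my estimate) and a young child needs on the order of 1,500 calories a day:

$$ 43\ \mathrm{g} \times 4\ \mathrm{kcal/g} \approx 170\ \mathrm{kcal} \approx \tfrac{1}{9}\ \text{of a child's daily calorie needs} $$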

Children who grow up to earn high incomes pay more in taxes than they receive in benefits. Yet the idea of aiming for an increase in the high income-earning population rarely shows up in public debates about social policy and the idea is totally ignored by advocates of high levels of immigration. In an article about retirement and children that Jonathan Rauch wrote for The Atlantic back in 1989 he puts his finger on the core of the problem.

If boys and girls grew up to become industrial machinery instead of men and women, it would be easy to see that everybody had a stake in other people's children.

But how to go from that observation to useful social policies is hardly obvious. One problem with pronatalist policies that dole out equal amounts per child regardless of income is that they essentially become incentives for the poorest and least educated to have children. An amount of money that is a lot for a poor person is not much for a high income earner. So equally sized benefits for all children become a recipe for growing the size of the impoverished class. But the amount of money needed to provide substantial incentives for higher earners to have children would likely be politically unacceptable. The cry would go out against subsidizing the rich, and the amount of money involved would have to be so large that other programs would have to be cut to pay for it.

So can policy makers do anything at all to prevent the collapse of their populations? One policy I've argued for on other grounds, accelerating education, would also likely increase family sizes. Increased education is anti-natalist in part because it delays the point at which women will be ready to have children. That education-caused delay is acting as a Darwinian selective pressure on the population of industrialized societies.

Part of the twin data analysis aimed to discover the effect that social, psychological and historical factors had on the number and timing of children born to the 2,710 pairs of twins studied.

The researchers found many of the variations in the three traits were controlled by social factors such as religion and education (5). For example, Roman Catholic women had 20 per cent higher reproductive fitness than other religions. University educated women had 35 per cent lower fitness than those who left school as early as possible.

"I was staggered by the results we got," said Dr Owens. "When we decided to control for these factors, I wasn't expecting anything to come out of it. I thought, 'let's just run with the analysis'. But there was a massive difference in the number of children born to families with a religious affiliation. Many of the Catholic twins we studied had an average family of five children, where other families were having only one or two children.

"We also found that mothers with more education were typically having just one child at an older age. Their reproductive fitness was much lower than their peers who left school as early as possible. Again, and again, our analyses for these two factors came back with the same results."

The influence of religion and education in family size may seem an obvious finding - but what the scientists found really astonishing was that after controlling for these social factors, genetic changes were influencing the three life traits studied.

"Even after we controlled for these social factors, there was still lots of genetically heritable genetic variation in the three life history traits. This is a really unexpected finding."

However, he cautions against linking this work with the possibility of a eugenic programme for selective human breeding.

"Looking to the future, I would expect to pick up genetic changes within the ten generations (6) since industrialisation. However, what this work doesn't indicate or find, is a genetic marker for human reproduction - so you can't breed for early reproduction from our data. All the traits that we have examined are controlled by interactions between the environment and many genes."

The future work aims to understand more fully, the contribution psychological factors make, says Dr Owens. "We also want to repeat our experiments using twins databases elsewhere, to really put our results into a 'western world' context," he said.

Some of the genetic variants he can't yet point to are probably alleles which enhance intelligence. Other variants might be factors that cause people to like children or to act impulsively. Also, genes could be selected for that increase the level of sex drive or that decrease the selectivity of whom one is attracted to. The more general idea here is that the genetic variants being selected among are most likely ones that affect cognitive function in a variety of ways.

An extremely appealing approach to the problems caused by declining birth rates would be to develop rejuvenation treatments. Populations won't decrease if people do not die. The half billion dollars that Aubrey de Grey wants in order to jumpstart eternal youthfulness research seems like pretty small potatoes compared to the future benefit.

Just how small a cost is biomedical research as a way to solve the aging population problem faced by Western countries? Well, to put it in perspective, the United States government has $45 trillion in unfunded liabilities as a result of the aging of the population. (also see more from Alex Tabarrok on this here and here and the discussion on Arnold Kling's blog here). When I look at the financial consequences of the declining birth rates and longer life expectancies my first reaction is that we need an absolutely massive effort to develop therapies that will reduce aging. The size of our liabilities dwarfs any amount of money that could possibly be spent on research to reverse aging. Aubrey de Grey argues that the current widespread fatalism about the inevitability of aging is unjustified by our current level of understanding of the causes of and possible treatments for aging. This fatalism is greatly holding back the rate of progress. At least stem cell therapy research is being pursued for less ambitious reasons, so rejuvenation therapies are advancing anyway, albeit at a slower rate than would be possible if serious money were thrown at the problem. Aubrey argues that with proper funding 7 approaches to rejuvenation could all be tried on mice within a decade's time. So let's get started!

By Randall Parker 2004 March 12 05:58 PM  Trends Demographic
Entry Permalink | Comments(2)
Cambridge Team Sets New Nanotube Fiber Length Record

Alan Windle and his team of researchers at the University of Cambridge have set a new record for length of produced carbon nanotube fibers.

A thread of carbon nanotubes more than 100 metres long has been pulled from a fiery furnace. The previous record holder was a mere 30 centimetres long.

...

"This is ground-breaking research - but it's early days" says Harry Swan, whose company Thomas Swan of Consett, UK, is helping to finance the development of the new manufacturing technique.

But the fibers produced by this method are not exceptionally strong.

So far, the fibres aren't outstandingly strong — they're no better than typical textile fibres. But Windle thinks that there's still plenty of scope for improving the process to make stronger fibres, for example by finding ways to make the nanotubes line up better. In Kevlar it's the good alignment of molecules that generates the high strength.

If Windle's group can improve the strength of the fibers produced by this approach then nanotube fibers could finally move into use in industrial applications. The potential exists to lower the cost of cars, aircraft, trains, suspension bridges, and a large assortment of other vehicles and stationary structures. Work on making nanotube strands line up better to increase their strength is also showing signs of progress. If carbon nanotube fibers ever achieve sufficient strength (and lots of scientists believe they can) then construction of a space elevator becomes possible. That would lower the cost of getting into space by 2 or 3 orders of magnitude.

By Randall Parker 2004 March 12 03:34 PM  Nanotech Advances
Entry Permalink | Comments(0)
Choline During Pregnancy Boosts Rat Brains

Choline taken during pregnancy might enhance the cognitive function of offspring.

Taking a nutrient called choline during pregnancy could "super-charge" children's brains for life, suggests a study in rats.

Offspring born to pregnant rats given the supplement were known to be faster learners with better memories. But the new work, by Scott Swartzwelder and colleagues at Duke University Medical Center in North Carolina, US, shows this is due to having bigger brain cells in vital areas.

Previous studies have shown that the offspring of rats fed choline have better memories and their cognitive function does not decay as rapidly as they age.

Eggs are a good source of choline. For a chart of choline food sources see my previous post Choline May Restore Middle Aged Memory Formation. For more on choline's effects down at the level of genetic regulation see my post Nutrients Change Embryonic DNA Methylation, Risk Of Disease.

Update: The Duke University press release for the study provides more details and background:

In the current study, the researchers explored the effects of choline on neurons in the hippocampus, a brain region that is critical for learning and memory. They fed pregnant rats extra amounts of choline during a brief but critical window of pregnancy, then studied how their hippocampal neurons differed from those of control rats.

The researchers found that hippocampal neurons were larger, and they possessed more tentacle-like "dendrites" that reach out and receive signals from neighboring neurons.

"Having more dendrites means that a neuron has more surface area to receive incoming signals," said Scott Swartzwelder, Ph.D., senior author of the study and a neuropsychologist at Duke and the Durham VA Medical Center. "This could make it easier to push the neuron to the threshold for firing its signal to another neuron." When a neuron fires a signal, it releases brain chemicals called "neurotransmitters" that trigger neighboring neurons to react. As neurons successively fire, one to the next, they create a neural circuit that can process new information, he said.

Not only were neurons structured with more dendrites, they also "fired" electrical signals more rapidly and sustained their firing for longer periods of time, the study showed. The neurons also rebounded more easily from their resting phase in between firing signals. These findings complement a previous study by this group showing that neurons from supplemented animals were less susceptible to insults from toxic drugs that are known to kill neurons.

Collectively, these behaviors should heighten the neurons' capacity to accept, transmit and integrate incoming information, said Swartzwelder.

"We've seen before that the brains of choline-supplemented rats have a greater plasticity -- or an ability to change and react to stimuli more readily than normal rats -- and now we are beginning to understand why," he said.

By Randall Parker 2004 March 12 11:11 AM  Brain Enhancement
Entry Permalink | Comments(5)
2004 March 11 Thursday
Long-Lived Grandmothers Increase Childbearing Of Their Children

Long-lived mothers increased the number of offspring their children had.

The researchers looked at 2,800 women living in two 18th and 19th century farming communities in Finland and Canada.

They wanted to see how long women lived after their menopause and what effect that might have on how many babies their own children had.

The data showed women had two extra grandchildren for each extra decade they lived after 50.

The grandmothers not only increased the number of children their children had but also increased the number that survived childhood.

This effectively would have selected for women to live longer.

The research, led by Mirkka Lahdenpera, a professor of human ecology at the University of Turku in Turku, Finland, found that "prolonged post-reproductive longevity in humans is associated with greater grandchild production."

In both Canada and Finland, women gained two extra grandchildren for every decade they lived beyond age 50.

Greater distance between parents and children may be contributing to the decline in family size in industrial societies.

The research also reveals that the declining role of grandmothers in childrearing is one factor among many that have led to the birth rate falling in modern societies.

Because of greater mobility people are less likely to live next to their parents. Therefore they are less likely to have parental babysitting services available. It would be interesting to look at industrial society populations today, measure the physical distance between parents and their children, and compare that to the number of offspring the children have. In light of this study it seems likely that people who do not live near their mothers have smaller families than those who do.

The effect of living near one's mother might not be as strong on childbearing today for the upper classes in particular since wealthy people can afford nannies and babysitters.

Humans are unusual in the length of time they survive after they become infertile.

The post-reproductive survival of humans—women in particular—is truly unusual. Non-reproductive “helpers” of individuals who are breeding are found in many species. But they are usually young animals that have yet to establish themselves, rather than relics from previous generations. The post-reproductive elderly just die. Chimpanzees, for example, have a similar pattern of fertility to people. A female chimp's fertility peaks in her late 20s, and is more or less extinguished by her mid-40s. But in chimpanzees, mortality rises as fertility declines.

By Randall Parker 2004 March 11 03:27 PM  Trends, Human Evolution
Entry Permalink | Comments(3)
2004 March 10 Wednesday
Human Brain Size Regulating Gene To Be Inserted Into Mice

A previous FuturePundit post reported on the work of Bruce Lahn at the University of Chicago in exploring the evolution of the Abnormal Spindle-Like Microcephaly Associated (ASPM) gene. Mutations in ASPM have played a key role in causing the evolutionary lineage leading up to modern humans to develop bigger and smarter brains. Well, a recent report on Lahn's work covers much the same ground but ends with a very intriguing mention of his next step: insertion of the human ASPM gene into mice.

In future experiments, Lahn will insert the human ASPM gene into mice to see what affect it has on brain development. He hopes to reconstruct the detailed story of how the human brain grew and changed as the result of natural selection, thereby creating the thing that makes us each unique—the human mind.

The creation of transgenic mice using a human gene which plays a role in determining brain size could potentially produce a larger brained and smarter mouse. That outcome is by no means assured. Yet this real-life experiment brings to mind David Brin's Uplift Saga series of books including Sundiver, the Hugo and Nebula award winning Startide Rising, and The Uplift War. The term "uplift" in this context refers to the lifting up of less intelligent species to a level of sentience similar to that of humans. In Brin's saga humanity has used genetic engineering to uplift both chimpanzees and dolphins into human-like sentience.

It seems inevitable (barring the extinction of the human race in the next few decades) that the knowledge will be found for how to genetically engineer human-level intelligence into other species. At the same time, the knowledge will also be found for how to genetically engineer humans to be much smarter. Once DNA sequencing becomes cheap enough, the ability to compare the DNA sequences of humans of different levels of intelligence will by itself lead to the discovery of many variations that will allow the average level of intelligence to be boosted quite dramatically. Building on that knowledge it will be possible to discover variations that have not yet occurred in humans and that will allow an even greater boosting of human intelligence to levels not seen in any humans to date.

One danger of uplifting other species is that they may not feel any loyalty or empathy to humans. We may just create competitors who will clash with us in ways that make conflicts between human groups seem tame by comparison. In light of this threat an argument can be made for the idea that uplifting dogs will pose less of a threat to humans than uplifting various primates. Dogs have been bred for tens of thousands of years to form bonds with humans and to feel protective toward humans. It will also eventually be possible to genetically engineer this form of loyalty and empathy toward humans into other uplifted species. But with dogs we will be able to start with a species that already possesses some of the desired qualities that would reduce the danger that another sentient species would become a threat to humans.

By Randall Parker 2004 March 10 02:00 PM  Trends, Human Evolution
Entry Permalink | Comments(23)
2004 March 09 Tuesday
Biological Basis For Sheep Homosexuality Discovered

This is the first discovery in a non-human animal species of a brain structure difference underlying homosexual preferences.

About 8 percent of domestic rams display preferences for other males as sexual partners. Scientists don't believe it's related to dominance or flock hierarchy; rather, their typical motor pattern for intercourse is merely directed at rams instead of ewes.

"They're one of the few species that have been systematically studied, so we're able to do very careful and controlled experiments on sheep," Roselli said. "We used rams that had consistently shown exclusive sexual preference for other rams when they were given a choice between rams and ewes."

The study examined 27 adult, 4-year-old sheep of mixed Western breeds reared at the U.S. Sheep Experiment Station. They included eight male sheep exhibiting a female mate preference – female-oriented rams – nine male-oriented rams and 10 ewes.

OHSU researchers discovered an irregularly shaped, densely packed cluster of nerve cells in the hypothalamus of the sheep brain, which they named the ovine sexually dimorphic nucleus or oSDN because it is a different size in rams than in ewes. The hypothalamus is the part of the brain that controls metabolic activities and reproductive functions.

The oSDN in rams that preferred females was "significantly" larger and contained more neurons than in male-oriented rams and ewes. In addition, the oSDN of the female-oriented rams expressed higher levels of aromatase, a substance that converts testosterone to estradiol so the androgen hormone can facilitate typical male sexual behaviors. Aromatase expression was no different between male-oriented rams and ewes.

The study was the first to demonstrate an association between natural variations in sexual partner preferences and brain structure in nonhuman animals.

The Endocrinology study is part of a five-year, OHSU-led effort funded through 2008 by the National Center for Research Resources, a component of the National Institutes of Health. Scientists will work to further characterize the rams' behavior and study when during development these differences arise. "We do have some evidence the nucleus is sexually dimorphic in late gestation," Roselli said.

They would also like to know whether sexual preferences can be altered by manipulating the prenatal hormone environment, such as by using drugs to prevent the actions of androgen in the fetal sheep brain.

I predict that some day it will be possible to alter this structure in adult humans. Will more people at that point choose to switch from heterosexual to homosexual orientation or vice versa? There are more heterosexuals to make the switch. So that tilts the odds in favor of hetero to homo transitions. But on the other hand the stigma associated with homosexuality is still great enough to provide incentive to switch in the other direction.

If a test on fetuses for sexual orientation can be developed and if a treatment for altering fetal sexual orientation can also be developed then that would probably favor a net shift toward heterosexuality since most parents would choose to guarantee their children will be heterosexual. Also, even without such a test if it becomes possible to control the genetic and environmental factors that influence the development of the part(s) of the brain that determine sexual orientation then many parents will opt to, metaphorically speaking, tilt the playing field even more toward the odds of heterosexuality in their offspring. In other words, it seems reasonable to expect that most parents will avail themselves of medical treatments that will make sure their kids will turn out to be heterosexuals.

Whether the ability to alter sexual orientation at the fetal and adult stages will cause a net change in the balance of the population in a more homosexual or heterosexual direction is hard to predict. It seems likely that males and females will, on average, make different decisions. So the ratio of male to female homosexuality could either increase or decrease once sexual orientation becomes malleable. Also, the ratio will likely diverge between cultures and population groups as different groups make different choices on average.

This report brings to mind Big Gay Al's Big Gay Animal Sanctuary from the South Park classic episode 104 Big Gay Al's Big Gay Boat Ride.

Hello there little pup, I'm Big Gay Al. [Sparky looks at him] Have you been outcast? [Sparky pants an affirmative] Well, then I'm so glad you found my Big Gay Animal Sanctuary. We're all big gay friends here. Would you like to live with us? [Sparky pants an affirmative] Come on in little fellow, nobody will ever oppress you here.

By Randall Parker 2004 March 09 11:53 AM  Biological Mind
Entry Permalink | Comments(1)
2004 March 08 Monday
GPS Monitoring Of Criminals Increasing

Florida legislators are proposing to extend the use of wearable GPS monitoring devices for not only high risk parolees but also high risk suspects out on bail awaiting trial.

TALLAHASSEE - Evoking the slaying of 11-year-old Carlie Brucia, Florida law enforcement officials descended on the state Capitol Wednesday to urge key lawmakers to invest $35-million next year to keep minute-by-minute track of thousands of paroled criminals.

Pitched as the next technological revolution in crime-fighting and already up and running in four Florida counties including Pinellas and Citrus, the VeriTracks system uses global positioning technology to track criminals released from jail or prison. It then cross-references those locations nightly with criminal activity.

Note that many of these systems are not doing real-time reporting of location. The device constantly records where the wearer is. But it has to be downloaded at the end of the day to reveal where the parolee has been. This delayed reporting is a lot cheaper because systems that use real-time tracking have to transmit information frequently via automated cellular messaging. However, more expensive models support real-time reporting of the locations of wearers as they move around throughout the day and night. More on that below.

Florida already uses a mix of "active" or real-time reporting devices and "passive" or delayed end-of-day download devices.

Florida has been one of the most aggressive states in using satellite technology to track criminals. The Department of Corrections uses "active" GPS for about 400 probationers, mostly sex offenders and people who've committed violent crimes. About another 150 are monitored with "passive" GPS which checks offenders whereabouts less often.

The main company that provides the tracking system, Pro Tech, is based in Florida. Its system is used in 33 states.

Pro Tech's web site points out that there are millions of criminal offenders who are potential wearers of these devices.

On any given day, 5 million offenders in the U.S. are either on probation, parole or some other form of community supervision. These same offenders account for 33% of violent crimes. These staggering statistics led to the founding of Pro Tech Monitoring, Inc. and the creation of SMART® System Technology.

Pro Tech has a real-time version of their monitoring system that uses wireless to send signals to report real-time movements of wearers.

The key components of the SMART® Active Tracking System are a Portable Tracking Device (PTD), ankle bracelet, charging stand, and GPS satellites.

Offenders are fitted with a tamper-resistant ankle bracelet and assigned a PTD to keep near them at all times. The ankle bracelet acts as an "electronic tether" which transmits signals to the PTD.

The PTD uses GPS signals and a wireless network to locate and report an offender's every move. The PTD monitors the signal strength of the GPS satellites to ensure accurate location information and incorporates a motion detector to monitor movement in areas of insufficient GPS signal strength. Pro Tech's Offender Tracking Center (OTC) monitors this information. The PTD is equipped with an LCD, used to notify the offender of violations and for sending text messages from the agency. This patented communication capability has demonstrated it's effectiveness in modifying offender behavior and reducing recidivism.

Authorities can even create multiple Inclusion and Exclusion zones, and be notified by fax, pager or email whenever a zone violation occurs.

Note the use of Inclusion and Exclusion zones. Depending on the implementation, the real-time reporting devices can use this capability to reduce the cost of reporting because the wireless reporting method can be used to do real-time reports only when a criminal enters or is moving around in a forbidden zone.
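
To make the zone logic concrete, here is a minimal sketch of an exclusion-zone check of the kind described above. This is not Pro Tech's or iSecureTrac's actual code; the zone coordinates, radii, and reporting rule are hypothetical.

```python
import math

# Hypothetical exclusion zones: (name, latitude, longitude, radius in meters)
EXCLUSION_ZONES = [
    ("elementary school", 27.9500, -82.4570, 300.0),
    ("victim residence",  27.9710, -82.4650, 500.0),
]

def distance_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters (haversine formula)."""
    earth_radius = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius * math.asin(math.sqrt(a))

def zone_violations(lat, lon):
    """Return the exclusion zones that a GPS fix falls inside.

    A device built this way would transmit over the cellular network only
    when this list is non-empty, which is how zone-based reporting keeps
    wireless costs down compared to continuous real-time reporting."""
    return [name for name, zlat, zlon, radius in EXCLUSION_ZONES
            if distance_m(lat, lon, zlat, zlon) <= radius]

# Example: a fix recorded by the ankle unit near the first zone
violations = zone_violations(27.9502, -82.4571)
if violations:
    print("ALERT - offender inside exclusion zone(s):", violations)
```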

Minnesota is also using the Pro Tech Monitoring devices.

On average, offenders will use the devices for six months. The technology will cost the Corrections Department $175,000 to $200,000 a year based on the annual release of about 45 Level 3 offenders. The daily cost of about $17 for the technology and vendor expenses is in addition to the $20 daily expense for monitoring offenders on intensive supervision.

Another supplier of this type of equipment, iSecureTrac, is selling to a number of jurisdictions around the United States, including jurisdictions in Florida and Mississippi.

Court Programs will place tracNET24 units on a wide range of offenders including deadbeat dads, juveniles, domestic violence and misdemeanor cases, and those on pre-trial release for felonies. By adopting tracNET24, Court Programs is providing Mississippi and Florida with the most advanced offender tracking available. The device allows authorities to monitor, via a satellite system, the whereabouts of offenders who are outfitted with small 12-ounce personal tracking units (PTU). A PTU receives signals from the Department of Defense's GPS satellite system and after the PTU is docked for charging, the PTU downloads the offenders' movements into a database accessed by correction officials. In addition, authorities may program exclusion areas, places where the presence of an offender is prohibited; such as certain residences, schools, and child care facilities. tracNET24 gives authorities verifiable records of where offenders have been, 24 hours per day, seven days a week.

The more expensive iSecureTrac model is also capable of real-time reporting of violations of exclusion zones.

Currently only law enforcement agencies and the companies that sell these devices have access to the position data that is collected. However, if public demand ever arose for wider access this could easily be done with current technology. Imagine, for instance, the ability of a battered woman to get a beeper that would notify her in real time when her ex-husband was too close to her current location. She would need to wear a similar device (unless at home or some other fixed location) that would report where she was so that the central database could compare her position to the location of the man who is a threat to her.

There are a lot of other possible uses of this technology. Schools could get real-time reports of convicted pedophiles in their vicinity. Also, night clubs could be notified when a convicted rapist enters their premises, or shopping mall security could be notified to watch someone in a parking lot.

The key to using real-time data to allow people to avoid criminals is that one person's location must be compared to another person's location. That requires a huge amount of real-time messaging of the locations of both criminals and of the far larger population of non-criminals in order to do the comparison. However, one way around that problem would be to add to a criminal's worn device a transmitter that constantly broadcasts his location with a low power radio signal over a distance of, say, a half mile. Then anyone in the larger population could use a radio receiver coupled to an embedded computer to be notified when a criminal is nearby. Buildings could contain such receivers and building security could be alerted automatically when known criminals are nearby. The notifications could be filtered by types of previous convictions or other characteristics.
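
A minimal sketch of the receiver-side check in that broadcast scheme follows. The message format, the half-mile threshold, and the flat-earth distance shortcut are illustrative assumptions, not a description of any existing product.

```python
import math

HALF_MILE_M = 805.0            # roughly half a mile, in meters
METERS_PER_DEG_LAT = 111320.0  # rough conversion, fine at these distances

def approx_distance_m(lat1, lon1, lat2, lon2):
    """Flat-earth approximation, adequate over sub-mile distances."""
    dy = (lat2 - lat1) * METERS_PER_DEG_LAT
    dx = (lon2 - lon1) * METERS_PER_DEG_LAT * math.cos(math.radians(lat1))
    return math.hypot(dx, dy)

def proximity_alerts(my_lat, my_lon, broadcasts):
    """broadcasts: iterable of (offender_id, lat, lon) tuples picked up
    from nearby low-power transmitters. Returns the IDs of offenders
    within half a mile of the protected person's receiver."""
    return [offender_id for offender_id, lat, lon in broadcasts
            if approx_distance_m(my_lat, my_lon, lat, lon) <= HALF_MILE_M]

# Hypothetical example: two transmitters heard by the receiver
print(proximity_alerts(27.9500, -82.4500,
                       [("A123", 27.9520, -82.4510),    # ~250 m away -> alert
                        ("B456", 27.9900, -82.4000)]))  # several km away -> no alert
```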

The ability to track the locations of people has a lot of other applications of course. As the tracking devices become smaller and cheaper expect to see parents putting them in their children both to protect their children from kidnapping and also simply to find out what trouble the kids are getting themselves into.

Another possible interesting application would be to manage affinity groups. Imagine a traveller who is cruising down a road trying to decide which night club to try out. If people registered with an affinity tracking service then a traveller could choose a club or restaurant whose currently present patrons fit some desired demographic profile. One obvious problem with such a service is that just because one person likes a particular type of person doesn't mean that most who fit a desired profile will like that person in return. Look at celebrities for example. They are loved by all sorts of people who the celebrities would very much like to avoid. So a service would need to develop eligibility criteria that require matching of preferences in both directions before that person driving down the street would get a flashing light on their car LCD pointing them to a particular bar or night club.

By Randall Parker 2004 March 08 11:43 AM  Surveillance GPS
Entry Permalink | Comments(47)
2004 March 06 Saturday
UC Berkeley Engineers Develop Human Exoskeleton Prototype

University of California, Berkeley robotics researchers have demonstrated a working prototype for a strength-enhancing human exoskeleton.

"We set out to create an exoskeleton that combines a human control system with robotic muscle," said Homayoon Kazerooni, professor of mechanical engineering and director of UC Berkeley's Robotics and Human Engineering Laboratory. "We've designed this system to be ergonomic, highly maneuverable and technically robust so the wearer can walk, squat, bend and swing from side to side without noticeable reductions in agility. The human pilot can also step over and under obstructions while carrying equipment and supplies."

The Berkeley Lower Extremity Exoskeleton (BLEEX), as it's officially called, consists of mechanical metal leg braces that are connected rigidly to the user at the feet, and, in order to prevent abrasion, more compliantly elsewhere. The device includes a power unit and a backpack-like frame used to carry a large load.

Such a machine could become an invaluable tool for anyone who needs to travel long distances by foot with a heavy load. The exoskeleton could eventually be used by army medics to carry injured soldiers off a battlefield, firefighters to haul their gear up dozens of flights of stairs to put out a high-rise blaze, or rescue workers to bring in food and first-aid supplies to areas where vehicles cannot enter.

"The fundamental technology developed here can also be developed to help people with limited muscle ability to walk optimally," said Kazerooni.

The researchers point out that the human pilot does not need a joystick, button or special keyboard to "drive" the device. Rather, the machine is designed so that the pilot becomes an integral part of the exoskeleton, thus requiring no special training to use it. In the UC Berkeley experiments, the human pilot moved about a room wearing the 100-pound exoskeleton and a 70-pound backpack while feeling as if he were lugging a mere 5 pounds.

The project, funded by the Defense Advanced Research Projects Agency, or DARPA, began in earnest in 2000. Next week, from March 9 through 11, Kazerooni and his research team will showcase their project at the DARPA Technical Symposium in Anaheim, Calif.

For the current model, the user steps into a pair of modified Army boots that are then attached to the exoskeleton. A pair of metal legs frames the outside of a person's legs to facilitate ease of movement. The wearer then dons the exoskeleton's vest that is attached to the backpack frame and engine. If the machine runs out of fuel, the exoskeleton legs can be easily removed so that the device converts to a large backpack.

More than 40 sensors and hydraulic actuators form a local area network (LAN) for the exoskeleton and function much like a human nervous system. The sensors, including some that are embedded within the shoe pads, are constantly providing the central computer brain information so that it can adjust the load based upon what the human is doing. When it is turned on, the exoskeleton is constantly calculating what it needs to do to distribute the weight so little to no load is imposed on the wearer.

...

One significant challenge for the researchers was to design a fuel-based power source and actuation system that would provide the energy needed for a long mission. The UC Berkeley researchers are using an engine that delivers hydraulic power for locomotion and electrical power for the computer. The engine provides the requisite energy needed to power the exoskeleton while affording the ease of refueling in the field.

The current prototype allows a person to travel over flat terrain and slopes, but work on the exoskeleton is ongoing, with the focus turning to miniaturization of its components. The UC Berkeley engineers are also developing a quieter, more powerful engine, and a faster, more intelligent controller, that will enable the exoskeleton to carry loads up to 120 pounds within the next six months. In addition, the researchers are studying what it takes to enable pilots to run and jump with the exoskeleton legs.
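
The sensor-to-actuator loop described above is, at its core, a feedback controller: estimate how much of the payload the wearer is actually feeling, then command the hydraulics to take up the difference. The sketch below is a toy illustration of that general idea only; the sensor inputs, gain, and 5-pound target are invented for illustration and are not BLEEX's actual control law.

```python
TARGET_FELT_LOAD_LB = 5.0   # hypothetical goal: wearer feels only ~5 lb
GAIN = 0.5                  # hypothetical proportional gain

def control_step(felt_load_lb, hydraulic_support_lb, payload_lb):
    """One cycle of a toy load-relieving feedback loop.

    felt_load_lb         -- payload weight the wearer is currently bearing,
                            as estimated from the shoe-pad force sensors
    hydraulic_support_lb -- the current actuator support command
    payload_lb           -- total backpack weight
    Returns the new actuator support command."""
    error = felt_load_lb - TARGET_FELT_LOAD_LB
    new_support = hydraulic_support_lb + GAIN * error
    # The actuators cannot support more weight than the payload actually
    # has, and cannot apply negative support.
    return max(0.0, min(new_support, payload_lb))

# Example: the wearer momentarily feels 12 lb of a 70 lb pack
print(control_step(felt_load_lb=12.0, hydraulic_support_lb=58.0, payload_lb=70.0))
```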

Check out a 1 megabyte jpeg picture of a human wearing a BLEEX exoskeleton and see more images at the Berkeley Lower Extremity Exoskeleton (BLEEX) project page.

The BLEEX brings to mind the powerloader exoskeleton that Sigourney Weaver as Ripley wore to battle the alien queen in the climactic fight scene in the 1986 movie Aliens. As images here, here, and here demonstrate, its larger size and huge grappler hands made it look far more industrial and powerful than BLEEX. But a real life implementation of Ripley's industrial power loader would weigh too much and give its users too large a profile to be useful in most battlefield applications.

By Randall Parker 2004 March 06 01:04 PM  Cyborg Tech
Entry Permalink | Comments(5)
2004 March 05 Friday
Molecular Mechanism Of Alcohol On Brain Suggests Addiction Treatment

Scripps Research Institute researchers have discovered part of the mechanism by which ethanol affects the brain.

Previous studies have also shown that alcohol enhances GABA neurotransmission in the amygdala, the so-called pleasure center of the brain. Interestingly, the brain corticotropin releasing factor (CRF) stress system also increases GABA transmission in the amygdala.

CRF is a common peptide in the brain that is responsible for activating the hypothalamic-pituitary-adrenal stress response and in the amygdala for activating sympathetic and behavioral responses to stressors. CRF is found in lots of different parts of the brain and is known to be involved in the brain in response to stress, anxiety, and depression.

Significantly, the CRF system also seems to be central to alcoholism, and scientists at Scripps Research and elsewhere have shown that CRF is involved in the transition from alcohol use to alcohol dependence. Scripps Research Professor George Koob and his colleagues found recently that levels of CRF increase in brains treated with alcohol. Other studies have shown that CRF levels increase when animals are withdrawing from alcohol as well—a situation analogous to an alcoholic's protracted abstinence.

In their latest paper, Siggins and his colleagues show, at the cellular level, how alcohol and CRF interact. When neurons are exposed to alcohol, says Siggins, they release CRF, and this causes the release of GABA in the amygdala. And when the CRF receptor is removed altogether (by genetic knock out), the effect of alcohol and CRF on GABA neurotransmission is lost.

Siggins and his colleagues say that this suggests a cellular mechanism underlying involvement of CRF in alcohol's behavioral and motivational effects. During withdrawal, CRF levels increase and these changes may persist for a long time.

It also suggests a possible way of treating alcoholism—using CRF antagonists, or compounds that block the effects of CRF. In the current study, when the scientists applied an antagonist of CRF, they found that alcohol no longer had an effect.

"Not only did the antagonists block the effect of CRF in enhancing GABA transmission, it also blocked the effect of alcohol," says Siggins. "The response was totally gone—alcohol no longer did anything."

An understanding of the mechanisms by which ethanol acts on the brain will lead to better treatments for alcoholism. But an understanding of the various mechanisms by which ethanol intervenes in brain function will also identify targets for drugs that emulate some of the effects of alcohol while avoiding many of its harmful side effects.

It seems likely that in the next few decades a deeper understanding of how drugs cause addiction will lead to the development of effective treatments for many and perhaps even all forms of addiction. Addiction may become much less common as a result.

By Randall Parker 2004 March 05 03:39 AM  Brain Addiction
Entry Permalink | Comments(5)
2004 March 04 Thursday
Cannibals Differ In Cognitive Ability To Protect Against Prion Disease?

Some people claim that there is no evidence that natural selection has produced differences between human populations in the frequency of genetic variations that affect cognitive performance. However, a recent combination of reports about a single gene which affects cognitive function provides such evidence. Reports on the effects that variations of the prion protein gene (PRNP) have on cognitive ability, other reports that PRNP variations affect the risk of getting prion diseases such as kuru and Creutzfeldt-Jakob Disease (CJD), and still other reports on the distribution of PRNP gene variations in different human populations together suggest that natural selection has indeed operated in different ecological niches to produce differences in human cognitive function.

For a specific example of a genetically caused difference in cognitive ability between human populations that has been produced by natural selection, first see my previous post Prion Gene Influences Cognitive Ability, which reported on how the M129V variation in the prion protein gene (PRNP) may cause differences in cognitive ability between those who have and those who do not have that variation. Here's an excerpt from the abstract of a research paper on PRNP genetic variations and cognitive function.

We have recently shown that methionine at codon 129 in the prion protein is associated with white matter reduction in a group of healthy volunteers and schizophrenic patients. The present study examines the influence of the same genetic variation on psychometric cognitive performance measurements in 335 community-based healthy volunteers. The polymorphism was associated with Full Scale IQ (genotype: F=4.38, df=2/317, P=0.013; allele: F=8.04, df=1/658, P=0.005), as measured by HAWIE-R (German version of the Wechsler Adult Intelligence Scale, Revised). Genotype accounted for 2.7% of the total variability in Full Scale IQ (partial eta2=0.027).
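For readers unfamiliar with the effect-size statistic quoted above: partial eta squared is simply the share of the total variance in IQ attributable to genotype. Here is a minimal sketch of how it is computed, using made-up scores rather than the study's data (the group means and sample sizes below are assumptions for illustration only):

```python
# Minimal sketch of computing partial eta squared for a one-way
# genotype-vs-IQ comparison. All numbers are invented for illustration;
# they are not the study's data.
import numpy as np

def partial_eta_squared(groups):
    """groups: list of 1-D score arrays, one per genotype (e.g. MM, MV, VV)."""
    all_scores = np.concatenate(groups)
    grand_mean = all_scores.mean()
    # Between-group (effect) sum of squares
    ss_effect = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
    # Within-group (error) sum of squares
    ss_error = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return ss_effect / (ss_effect + ss_error)

rng = np.random.default_rng(0)
mm = rng.normal(102, 15, 120)   # hypothetical Met/Met IQ scores
mv = rng.normal(100, 15, 150)   # hypothetical Met/Val IQ scores
vv = rng.normal(97, 15, 65)     # hypothetical Val/Val IQ scores
print(partial_eta_squared([mm, mv, vv]))  # a small fraction of total variance
```

A value of 0.027, as reported, means genotype accounts for about 2.7% of the variability in Full Scale IQ, a small but statistically detectable effect in a sample of this size.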

Why is this interesting in terms of natural selection for cognitive performance in different human environments? We know that the M129V variation occurs at different frequencies in different populations. In fact, cannibalism may have selected for heterozygous carriers of the M129V variation among the cannibals of Papua New Guinea.

From approximately 1920 to 1950, a kuru epidemic devastated the Fore in the Highlands of Papua New Guinea. At mortuary feasts, kinship groups would consume deceased relatives, a practice that probably started around the end of the 19th Century, according to local oral history. The Australian authorities imposed a ban on cannibalism there in the mid-1950s.

The same genetic variation in the prion protein that helps protect against Creutzfeld Jacob disease turned out to do the same for kuru. Studying Fore women who had participated in mortuary feasts, Collinge's group found that 23 out of the 30 women were heterozygous for the prion protein gene, possessing one normal copy and one with the M129V mutation.

The researchers sequenced and analyzed the prion protein gene in more than 2000 chromosome samples from people selected to represent worldwide genetic diversity. They found either M129V or E219K in every population, with the prevalence decreasing in East Asia (except for the Fore, who have the highest frequency in the world).

Collinge's team also studied the diversity of sequence variations in a block of DNA containing the prion protein gene, in European, African, Japanese, and Fore populations. The prevalence of the M129V and E219K variations, even when the sequence at other spots was highly variable, indicated that the variations were ancient--more than 500,000 years old, according to authors' estimates.

Finally, the researchers identified a telltale signature of balancing selection in the gene: a greater than average number of highly variable sites, and a smaller than average number of low-frequency variations.

These findings are consistent with other lines of evidence indicating that prehistoric populations practiced cannibalism, such as cuts and burn marks on Neanderthal bones, and biochemical analysis of fossilized human feces.
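The "balancing selection" mentioned above most plausibly takes the form of heterozygote advantage: people carrying one copy of each variant resist prion disease better than either kind of homozygote, so neither allele can drift to fixation. A toy deterministic model shows how even modest heterozygote advantage keeps both alleles in a population indefinitely (the fitness values are assumptions chosen purely for illustration):

```python
# Toy model of heterozygote advantage, one form of balancing selection.
# Fitness values are illustrative assumptions, not estimates from the study.
def next_allele_freq(p, w_AA=0.8, w_Aa=1.0, w_aa=0.8):
    """One generation of selection on allele A at frequency p, random mating."""
    q = 1.0 - p
    mean_w = p*p*w_AA + 2*p*q*w_Aa + q*q*w_aa
    return (p*p*w_AA + p*q*w_Aa) / mean_w

p = 0.05  # start with allele A rare
for generation in range(200):
    p = next_allele_freq(p)
print(round(p, 3))  # approaches 0.5 here; both alleles persist indefinitely
```

Under this kind of selection a population shows exactly the signature described above: an excess of heterozygotes and variants maintained far longer than neutral drift would allow.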

M129V allele frequencies do not differ only between populations that were recently cannibals and populations that were not. For instance, M129V allele frequencies in Turkey differ from those in most of Europe and East Asia.

Three known polymorphisms but no other gene variants were detected in the PRNP coding sequence of the Turkish individuals. Genotype frequencies at codon 129 were 57% Met/Met, 34% Met/Val and 9% Val/Val, with an allele frequency of 0.740:0.260 Met:Val. These distributions are considerably different from those reported for other normal populations residing in Western Europe and East Asia, except in Crete. The higher frequency of 129 Met-homozygotes in Turkey than in Western Europe suggests that the Turkish are at greater risk of developing CJD.
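The allele frequency quoted for the Turkish sample follows directly from the genotype frequencies: each Met/Met individual contributes two Met alleles and each Met/Val individual contributes one. A quick check of that arithmetic, along with the Hardy-Weinberg genotype proportions those allele frequencies would predict:

```python
# Recomputing the Met:Val allele frequency from the quoted genotype
# frequencies (57% Met/Met, 34% Met/Val, 9% Val/Val).
met_met, met_val, val_val = 0.57, 0.34, 0.09

freq_met = met_met + met_val / 2   # 0.57 + 0.17 = 0.74
freq_val = val_val + met_val / 2   # 0.09 + 0.17 = 0.26
print(round(freq_met, 3), round(freq_val, 3))  # matches the reported 0.740:0.260

# Hardy-Weinberg expectation from those allele frequencies, for comparison:
print(round(freq_met**2, 3), round(2*freq_met*freq_val, 3), round(freq_val**2, 3))
# ~0.548 Met/Met, ~0.385 Met/Val, ~0.068 Val/Val
```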

If the research showing that M129V reduces brain volume and intelligence is confirmed upon further investigation, then this will be an example of differing selective pressures in different environments causing differences in the frequencies of alleles that affect cognitive function.

By Randall Parker 2004 March 04 03:58 PM  Brain Genetics
Entry Permalink | Comments(0)
Method To Do Desktop Fusion Discovered?

Scientists at Oak Ridge National Laboratory may have found a cheap way to cause hydrogen atoms to fuse. (same article here)

The researchers expose the clear canister of liquid to pulses of neutrons every five milliseconds, or thousandths of a second, causing tiny cavities to form. At the same time, the liquid is bombarded with a specific frequency of ultrasound, which causes the cavities to form into bubbles that are about 60 nanometers – or billionths of a meter – in diameter. The bubbles then expand to a much larger size, about 6,000 microns, or millionths of a meter – large enough to be seen with the unaided eye.

"The process is analogous to stretching a slingshot from Earth to the nearest star, our sun, thereby building up a huge amount of energy when released," Taleyarkhan said.

Within nanoseconds these large bubbles contract with tremendous force, returning to roughly their original size, and release flashes of light in a well-known phenomenon known as sonoluminescence. Because the bubbles grow to such a relatively large size before they implode, their contraction causes extreme temperatures and pressures comparable to those found in the interiors of stars. Researchers estimate that temperatures inside the imploding bubbles reach 10 million degrees Celsius and pressures comparable to 1,000 million earth atmospheres at sea level.

At that point, deuterium atoms fuse together, the same way hydrogen atoms fuse in stars, releasing neutrons and energy in the process. The process also releases a type of radiation called gamma rays and a radioactive material called tritium, all of which have been recorded and measured by the team. In future versions of the experiment, the tritium produced might then be used as a fuel to drive energy-producing reactions in which it fuses with deuterium.

Whereas conventional nuclear fission reactors produce waste products that take thousands of years to decay, the waste products from fusion plants are short-lived, decaying to non-dangerous levels in a decade or two. The desktop experiment is safe because, although the reactions generate extremely high pressures and temperatures, those extreme conditions exist only in small regions of the liquid in the container – within the collapsing bubbles.
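The sizes quoted above give a feel for why the implosion is so violent: going from about 60 nanometers to about 6,000 microns is a factor of roughly 100,000 in linear dimension, so the bubble's volume grows by roughly a factor of 10^15 before the collapse concentrates all of that back into a tiny region. A back-of-the-envelope check:

```python
# Back-of-the-envelope check of the expansion ratios quoted above.
initial_diameter = 60e-9      # meters (about 60 nanometers)
expanded_diameter = 6000e-6   # meters (about 6,000 microns, i.e. 6 mm)

linear_ratio = expanded_diameter / initial_diameter
volume_ratio = linear_ratio ** 3

print(f"linear expansion: {linear_ratio:.0e}x")   # ~1e5
print(f"volume expansion: {volume_ratio:.0e}x")   # ~1e15
```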

The ability to sustain nuclear fusion could provide a way to produce enormous quantities of energy. If this could be done very cheaply then the age of fossil fuels would come to an end.

The research paper reporting this work went through intense review before being approved for publication.

Although no one has tried repeating the latest work, Lee Riedinger, deputy director for science and technology at Oak Ridge, says that it went through an "extraordinary level of review" before being accepted for publication by Physical Review E.

By the standards of plasma physics research the money needed to try to repeat this experiment is peanuts.

For decades, physicists have dreamed of harnessing the ferocious alchemy of the Sun as a clean, limitless energy source. Most experiments have been conducted in giant, expensive reactors using magnetic fields to confine the ultrahot gases.

In contrast, the new experiment, which cost less than $1 million, uses the power of sound to create energy comparable to the inside of stars.

Hopefully granting agencies will allocate the money needed for other labs to check this result. If this result holds up, then our future could take a really big turn.

By Randall Parker 2004 March 04 03:13 AM  Energy Tech
Entry Permalink | Comments(3)
2004 March 03 Wednesday
Harvard To Found Human Embryonic Stem Cell Institute

The Boston Globe has reported that Harvard University is going to use private funding to create a stem cell research institute that will work with human embryonic stem cells.

Set to be announced in April, the stem cell plan will bring together researchers from Harvard and all of the Harvard-affiliated hospitals to unlock the mysteries of a type of cell that has the potential to develop into any healthy tissue in the body, but has triggered ethical controversy over the way it is created. Though not housed in a central building, the initiative will be large, even by Harvard standards, with a fund-raising goal of about $100 million, according to the scientists involved.

Harvard has confirmed the Boston Globe report.

Harvard issued a statement Sunday confirming its plans, saying the school is "proceeding in the direction of establishing a stem cell institute." Final details are not complete, it said.

One goal of the new institute appears to be to use private money in order to bypass Bush Administration funding restrictions on embryonic stem cell research.

Provost Steven E. Hyman confirmed plans were in progress for the Harvard Stem Cell Center, which would bring together researchers from the University and affiliated hospitals who are already exploring the promising cells’ potential to help cure diseases like AIDS and diabetes. “We are moving forward on a stem cell center,” Hyman said. “It’s something Harvard ought to be doing. It is something we can be preeminent in.”

The revelation, first reported yesterday in The Boston Globe, comes two weeks after a South Korean laboratory became the first to extract a line of stem cells from a cloned human embryo, disappointing Harvard researchers who had been pursuing the achievement.

A report circulated by Dean of the Faculty of Arts and Sciences (FAS) William C. Kirby in January included a proposal to establish a stem cell research program on the University’s lands in Allston.

“Not only does the Institute propose to bridge the gap from basic to applied life science, it also proposes to address the complex social, ethical and religious questions that have arisen as stem cell research has advanced,” read the report obtained by The Crimson.

Human embryonic stem cells, which can harness the potency of fertilized eggs to form any variety of human tissue, have emerged as a pivotal—and controversial—field of study.

Bush administration restrictions limit government-financed research to pre-existing stem cells, but Hyman said the University would seek funding for the center from private donors and foundations.

One way or another human embryonic stem cell research is going to be done. It will be done by private money in the United States. It will be done in some other countries, particularly in East Asia where there is enough scientific talent and money and little in the way of government restrictions.

I think some proponents of human embryonic stem cell research have promoted unrealistic expectations about how quickly human embryonic stem cell research would produce useful treatments if only there were fewer political obstacles to this area of research. Much of the work that needs to be done to understand how to manipulate stem cells can be and is being done in various animal models. This is similar to how many other kinds of research are done in other species for reasons of cost, ethics, ease of the work, and other factors. Plus, a lot of work on stem cells can be done on non-embryonic stem cells.

I'm not saying all this to make an argument against human embryonic stem cell (hESC) research. Decide for yourself whether you think that kind of research is ethically acceptable. My point is that, in order to fight for the legality of this research, its proponents have overstated how urgent the need is for doing human embryonic stem cell research at this point in time. At least one prominent stem cell researcher has put forth a similar view on this controversy.

Update: On a related note a group at Harvard led by researcher Douglas Melton has used private funding from the Howard Hughes Medical Institute (HHMI) to develop 17 new human embryonic stem cell (hESC) lines.

March 3, 2004— Howard Hughes Medical Institute researchers at Harvard University announced today that they have derived 17 new human embryonic stem-cell lines. The new cell lines will be made available to researchers, although at this time United States policies prohibit the use of federal funds to investigate these cells.

The cell lines were derived using private funds by researchers in the laboratory of Douglas A. Melton, a Howard Hughes Medical Institute (HHMI) investigator at Harvard University. The researchers described the stem-cell lines in an article published online on March 3, 2004, in the New England Journal of Medicine (NEJM). The article will also be in the March 25, 2004, print edition of NEJM.

HHMI funds a great deal of excellent biomedical research. It has an endowment currently worth about $11 billion, which provides $450 million per year in biomedical research funding to hundreds of investigators. So HHMI has pockets deep enough to make hESC research happen in the United States without federal government money.

By Randall Parker 2004 March 03 02:13 AM  Biotech Organ Replacement
Entry Permalink | Comments(10)
2004 March 02 Tuesday
Hippocampus Sorts Information Into Memory Categories

Monkey brains remember things in part by placing them into pre-existing categories.

“When you need to remember people you’ve just met at a meeting, the brain probably doesn’t memorize each person’s facial features to help you identify them later,” says Sam Deadwyler, Ph.D., a Wake Forest neuroscientist and study investigator. “Instead, it records vital information, such as their hairstyle, height, or age, all classifications that we are familiar with from meeting people in general. Our research suggests how the brain might do this, which could lead to ways to improve memory in humans.”

The researchers found that when monkeys were taught to remember computer clip art pictures, their brains reduced the level of detail by sorting the pictures into categories for recall, such as images that contained “people,” “buildings,” “flowers,” and “animals.”

The categorizing cells were found in the hippocampus, an area of the brain that processes sensory information into memory. It is essential for remembering all things including facts, places, or people, and is severely affected in Alzheimer’s disease.

“One of the intriguing questions is how information is processed by the hippocampus to retain and retrieve memories,” said Robert Hampson, Ph.D., co-investigator. “The identification of these cells in monkeys provides evidence that information can be remembered more effectively by separating it into categories. It is likely that humans use a similar process.”

The researchers measured individual cell activity in the hippocampus while the monkeys performed a video-game-like memory task. Each monkey was shown one clip art picture, and after a delay of one to 30 seconds, picked the original out of two to six different images to get a juice reward.

By recording cell activity during hundreds of these trials in which the pictures were all different, the researchers noticed that certain cells were more active when the pictures contained similar features, such as images of people – but not other objects. They found that different cells coded images that fit different categories.

One really interesting aspect of this report is that different monkeys developed different ways for categorizing the same sensory inputs:

“Unlike other cells in the brain that are devoted to recording simply an object’s shape, color or brightness, the category cells grouped images based on common features – a strategy to improve memory,” said Terry Stanford, Ph.D., study investigator. “For example, the same cell responded to both tulips and daisies because they are both flowers.”

The researchers found, however, that different monkeys classified the same pictures differently. For example, with a picture of a man in a blue coat, some monkeys placed the image in the “people” category, while others appeared to encode the image based on features that were not related to people such as “blue objects” or “types of coats."

While such categorization is a highly efficient memory process, it may also have a downside, said the researchers.

“The over generalization of a category could result in errors,” said Deadwyler. “For example, when the trials included more than one picture with people in it, instead of different images, the monkeys often confused the image with a picture of other people.”
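Here is a toy illustration of the trade-off the researchers describe: storing only a category label for a sample image is compact and usually sufficient for recall, but it fails when another item in the trial shares the same category. The images, features, and categories below are invented for illustration only; this is not a model of actual hippocampal coding.

```python
# Toy model of category-based memory: remember only the sample's category,
# then try to pick the sample out of the choice images at recall.
# All images and categories are invented for illustration.

def category_of(image):
    """Crude stand-in for a category cell: report the image's category label."""
    return image["category"]

def recall(sample, choices):
    """Store only the sample's category; pick the first choice that matches it."""
    remembered = category_of(sample)
    for choice in choices:
        if category_of(choice) == remembered:
            return choice

sample = {"name": "man in blue coat", "category": "people"}

# Easy trial: distractor is in another category -> correct recall.
easy = [{"name": "tulip", "category": "flowers"}, sample]
print(recall(sample, easy)["name"])   # man in blue coat

# Hard trial: another "people" picture is present -> over-generalization error.
hard = [{"name": "woman in red hat", "category": "people"}, sample]
print(recall(sample, hard)["name"])   # woman in red hat (wrong!)
```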

This really matches with what one would expect intuitively. A doctor is going to look at people and remember them by disease characteristics. A fashion magazine editor is going to remember them by types of clothing and jewelry worn. Others with different previous training and life experiences are going to split people and things up in different ways. There obviously must be ways that networks of neurons have formed to favor different approaches for filtering and categorizing sensory input.

The process of choosing categories in your mind to sort what you learn and experience is an important part of becoming an effective learner and analyzer of information.

By Randall Parker 2004 March 02 01:28 AM  Brain Memory
Entry Permalink | Comments(18)
Different Parts Of Brain Remember Boring, Exciting Words

How well the mind remembers and what part of the brain is involved in memory formation depends on whether the words being memorized are emotionally arousing.

For the study, Elizabeth Kensinger, a researcher in MIT's Department of Brain and Cognitive Sciences, and Suzanne Corkin, professor of behavioral neuroscience in the same department, asked 14 men and 14 women to "learn" 150 words related to events, while the participants' brains were being scanned in an fMRI (functional magnetic resonance imaging) procedure. Some of the words represented arousing events, such as "rape" or "slaughter." Others were nonarousing, such as "sorrow" and "mourning."

They then tested the participants to see which of the words they remembered having been shown. Kensinger and Corkin found that the arousing and the nonarousing words were remembered by way of different brain regions.

"This result suggests that stress hormones, which are released as part of the response to emotionally arousing events, are responsible for enhancing memories of those events," said the researchers. "We think that detailed cognitive processing may underlie the enhanced memory for the nonarousing events."

Memory storage can be enhanced by associating memories with emotionally exciting events. But use of such a technique raises a question that one ought to ponder before trying to learn important material while, say, scaring oneself watching a scary movie: Do you want various memories to be stored in areas of the brain associated with strong emotional reactions? Doing so may make the memories easier to recall but will also probably cause emotional reactions upon recall. Sometimes it makes sense to place memories where they will be linked to various emotional reactions. But in other cases it makes more sense to be able to retrieve some memories without having to feel a potentially stressful and draining emotional reaction.

It would be useful to be able to measure the intensity and type of emotional reactions as memories are recalled. Then it would be even more useful to be able to disconnect the recall of a particular set of memories from the evocation of an undesired emotional reaction.

This result is not surprising either intuitively or based on previous scientific results. Also see my previous post Gory Pictures Improve Memory Retention.

By Randall Parker 2004 March 02 12:47 AM  Brain Memory
Entry Permalink | Comments(3)