2004 January 29 Thursday
Women Hold Babies On The Left To Connect To Emotional Half Of The Brain

Victoria Bourne and Brenda Todd of the University of Sussex in England have found that women tend to hold babies on the side of the body that connects to the brain hemisphere dominant for processing emotions.

The right side of the brain controls the left side of the body and usually helps to process emotions, explains Bourne. So holding the baby on the left-hand side may help to direct the sight of emotionally charged information, such as tears or laughter, to the specialized right hemisphere for processing, she says.

Keep in mind that the left brain gets input from the right side of the body and the right brain gets input from the left side. It is interesting that the heart, traditionally regarded as the seat of the emotions, happens to sit on the left side of the body - the side that connects to the brain's center of emotional processing in most people.

If you want to find out which side of your brain does most of your emotional processing, take this quick test. The test is too short to be definitive; if anyone knows of a longer test with more pictures, please post a link to it in the comments.

The abstract for the research paper explains more clearly what they did. The researchers used both people who have left-brain dominance for emotional processing and people who do not.

Previous research has indicated that 70-85% of women and girls show a bias to hold infants, or dolls, to the left side of their body. This bias is not matched in males (e.g. deChateau, Holmberg & Winberg, 1978; Todd, 1995). This study tests an explanation of cradling preferences in terms of hemispheric specialization for the perception of facial emotional expression. Thirty-two right-handed participants were given a behavioural test of lateralization and a cradling task. Females, but not males, who cradled a doll on the left side were found to have significantly higher laterality quotients than right cradlers. Results indicate that women cradle on the side of the body that is contralateral to the hemisphere dominant for face and emotion processing and suggest a possible explanation of gender differences in the incidence of cradling.
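For those unfamiliar with the term, a laterality quotient is just a normalized difference between right-side and left-side responses on a behavioural test. Here is a minimal sketch assuming the common (R - L) / (R + L) x 100 formula; Bourne and Todd's exact scoring may differ:

```python
def laterality_quotient(right_responses: int, left_responses: int) -> float:
    """Return a quotient from -100 (fully left-lateralized) to +100 (fully right)."""
    total = right_responses + left_responses
    if total == 0:
        raise ValueError("need at least one response")
    return 100.0 * (right_responses - left_responses) / total

# Example: 14 of 16 test trials favored the right side.
print(laterality_quotient(14, 2))  # 75.0
```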

One thing that would be interesting to discover is whether less emotionally intense women are less likely to prefer one side over the other for holding a baby.

By Randall Parker 2004 January 29 05:12 PM  Brain Emotions
2004 January 28 Wednesday
Sandia Researchers Monitoring EEGs To Improve Meeting Productivity

Researchers in the Sandia National Laboratories Advanced Concepts Group are using computers hooked up to a variety of medical monitoring devices to measure brain and body changes in people in meetings, and feeding this information back to the participants in order to change the way they respond.

Aided by tiny sensors and transmitters called a PAL (Personal Assistance Link) your machine (with your permission) will become an anthroscope - an investigator of your up-to-the-moment vital signs, says Sandia project manager Peter Merkle. It will monitor your perspiration and heartbeat, read your facial expressions and head motions, analyze your voice tones, and correlate these to keep you informed with a running account of how you are feeling - something you may be ignoring - instead of waiting passively for your factual questions. It also will transmit this information to others in your group so that everyone can work together more effectively.

"We're observing humans by using a lot of bandwidth across a broad spectrum of human activity," says Merkle, who uses a Tom Clancy-based computer game played jointly by four to six participants to develop a baseline understanding of human response under stress.

"If someone's really excited during the game and that's correlated with poor performance, the machine might tell him to slow down via a pop-up message," says Merkle. "On the other hand, it might tell the team leader, 'Take Bill out of loop, we don't want him monitoring the space shuttle today. He's had too much coffee and too little sleep. Sally, though, is giving off the right signals to do a great job.'"

The idea of the devices has occasioned some merry feedback, as from a corporate executive who emailed, "Where do we get the version that tells people they are boring in meetings? Please hurry and send that system to us. A truck full or two should cover us."

More seriously, preliminary results on five people interacting in 12 sessions beginning Aug. 18 indicate that personal sensor readings caused lower arousal states, improved teamwork and better leadership in longer collaborations. A lowered arousal state - the amount of energy put into being aware - is preferable in dealing competently with continuing threat.

...

"Some people think you have to start with a theory. Darwin didn't go with a theory. He went where his subjects were and started taking notes. Same here," he says. Merkle presented a paper on his group's work at the NASA Human Performance conference Oct. 28-29 in Houston. "Before we knew that deep-ocean hydrothermal vents existed, we had complex theories about what governed the chemistry of the oceans. They were wrong." Now it's state-of-the-art to use EEG systems to link up brain events to social interactions, he says. "Let's get the data and find out what's real."

The tools for such a project - accelerometers to measure motion, face-recognition software, EMGs to measure muscle activity, EKGs to measure heart beat, blood volume pulse oximetry to measure oxygen saturation, a Pneumotrace(tm) respiration monitor to measure breathing depth and rapidity - are all off-the-shelf items.

...

Further work is anticipated in joint projects between Sandia and the University of New Mexico, and also with Caltech.

"In 2004 we intend to integrate simultaneous four-person 128-channel EEG recording," says Merkle, "correlating brain events, physiologic dynamics, and social phenomena to develop assistive methods to improve group and individual performance."

How many potential abuses of this technology can you imagine? One of the worst I can think of would be bosses using it to ensure that everyone is paying attention in a seminar introducing the latest management fad.

On the more optimistic side, if the device could measure confusion it would help alert a speaker that his explanation of some problem or proposal is not getting across. Each person could also use it to detect his own anger and frustration and work at reducing the stress felt in some situations - which would have the beneficial side effect of slowing the rate of aging. The ability to do biofeedback training while stuck in silly meetings would also make those meetings more productive - at least for one's own personal quest to develop thought patterns that remain unstressed by the folly found in so many corporate settings.
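Nothing published describes the actual feedback rule, but the behavior Merkle sketches - flag a participant whose arousal is high while recent performance is poor - is easy to picture in code. A hypothetical sketch; the sensor scalings, weights, and thresholds below are all invented for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Vitals:
    heart_rate_bpm: float        # from the EKG
    skin_conductance_us: float   # perspiration proxy, in microsiemens
    resting_hr_bpm: float        # participant's baseline heart rate

def arousal_index(v: Vitals) -> float:
    """Crude 0..1 arousal score; the weights and scale factors are invented."""
    hr_part = max(0.0, min(1.0, (v.heart_rate_bpm - v.resting_hr_bpm) / 40.0))
    sc_part = max(0.0, min(1.0, v.skin_conductance_us / 20.0))
    return 0.6 * hr_part + 0.4 * sc_part

def feedback(v: Vitals, recent_task_score: float) -> Optional[str]:
    """Nag only when high arousal coincides with poor recent performance."""
    if arousal_index(v) > 0.7 and recent_task_score < 0.5:
        return "High arousal, falling performance -- consider slowing down."
    return None

print(feedback(Vitals(118.0, 15.0, 62.0), recent_task_score=0.35))
```

A real system would presumably learn those thresholds per person from the baseline game-playing sessions Merkle describes.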

By Randall Parker 2004 January 28 10:30 AM  Cyborg Tech
Cars May Become Greater Electricity Generators Than Big Electric Plants

Since a car uses more energy to move than a house uses for electricity, a car powered by a hydrogen fuel cell that generates electricity to run an electric motor would have the capacity to supply all the power a house needs.

Another possibility that comes from such a system is the homeowner's ability to power the house from a fuel-cell vehicle. The fuel cell in a typical fuel-cell vehicle would have an output power from 25 kW to more than 100 kW. Because the average home only uses between 2 and 10 kW of electricity, it would be possible to "plug" the car into the home to provide power from the fuel cell using the hydrogen stored on the vehicle.
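The arithmetic behind that claim is worth making explicit. Using just the figures from the passage above:

```python
fuel_cell_kw_min, fuel_cell_kw_max = 25.0, 100.0  # vehicle fuel cell output (from quote)
home_kw_min, home_kw_max = 2.0, 10.0              # average household draw (from quote)

# Worst case: the smallest fuel cell feeding the hungriest house.
print(fuel_cell_kw_min / home_kw_max)   # 2.5 homes
# Best case: the largest fuel cell feeding frugal houses.
print(fuel_cell_kw_max / home_kw_min)   # 50.0 homes
```

So even the weakest vehicle in the quoted range covers a couple of houses at peak draw, and the strongest could in principle carry a small neighborhood.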

Of course, to make this work we first need fuel cells that are cheap enough, light enough, and durable enough to serve as power sources for cars. We would also need a way to store hydrogen in a dense enough form to make it a viable mobile energy store. And even if dense on-vehicle hydrogen storage could be worked out, we would still need a power source with which to generate the hydrogen in the first place.

In spite of these big caveats about the serious problems hydrogen faces as an energy storage form, the idea that a car could generate enough power to run a few houses is a neat one. In fact, the use of fuel cells to generate home electric power does not have to depend on hydrogen at all. Advances in Solid Oxide Fuel Cells (also see here and here) show promise for burning fossil fuels to generate electricity more efficiently than gas turbines currently can. If fossil fuel-burning solid oxide fuel cells become competitive for use in vehicles then most people will come to own vehicles that can generate more electricity than their homes need.

Whether burning fuel in those vehicles (or in a smaller fuel cell attached to the house) can be done more cheaply than in large centralized electric power plants remains to be seen. The potential is there, because fuel cells may convert fossil fuels to electricity more efficiently than large power plants do today, and generating electricity close to where it is used avoids the energy losses in electric power lines. At the very least, fossil fuel-burning fuel cells ought to make central power outages less of a concern for anyone who outfits a house with a connector for plugging into the car.
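A back-of-envelope comparison shows why distributed generation could plausibly win on delivered efficiency. The numbers below are my own rough assumptions (typical central plant efficiency and line losses circa 2004), not figures from the articles cited above:

```python
# Central generation: plant efficiency times what survives the wires.
central_plant_eff = 0.35   # assumed typical large fossil-fuel plant
line_loss = 0.07           # assumed transmission and distribution losses
central_delivered = central_plant_eff * (1.0 - line_loss)

# Local solid oxide fuel cell: generation at the point of use, no line losses.
local_sofc_eff = 0.50      # assumed mid-range hope for SOFCs

print(f"central plant, delivered: {central_delivered:.2f}")  # ~0.33
print(f"local SOFC, delivered:    {local_sofc_eff:.2f}")     # 0.50
```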

Update: As yet no suitable way to store hydrogen for use in cars has been developed. Some University of Chicago researchers have just demonstrated that hydrogen can be turned into a clathrate that remains stable at normal atmospheric pressure, though only at liquid nitrogen temperatures.

University scientists have proposed a new method for storing hydrogen fuel in a paper that appeared in the Monday, Jan. 5 to Friday, Jan. 9 online edition of the Proceedings of the National Academy of Sciences.

The lack of practical storage methods has hindered the more widespread use of hydrogen fuels, which are both renewable and environmentally clean. The most popular storage methods—liquid hydrogen and compressed hydrogen—require that the fuel be kept at extremely low temperatures or high pressures. But the University’s Wendy Mao and David Mao have formed icy materials made of molecular hydrogen that require less stringent temperature and pressure storage conditions.

“This new class of compounds offers a possible alternative route for technologically useful hydrogen storage,” said Russell Hemley, Senior Staff Scientist at the Geophysical Laboratory of the Carnegie Institution of Washington. The findings also could help explain how hydrogen becomes incorporated in growing planetary bodies, he said.

The father-daughter team synthesized compounds made of hydrogen and water, hydrogen and methane, and hydrogen and octane in a diamond-anvil cell, which researchers often use to simulate the high pressures found far beneath Earth’s surface. The hydrogen and water experiments produced the best results. “The hydrogen-water system has already yielded three compounds, with more likely to be found,” said Wendy Mao, a graduate student in Geophysical Sciences.

The compound that holds the most promise for hydrogen storage, called a hydrogen clathrate hydrate, was synthesized at pressures between 20,000 and 30,000 atmospheres and temperatures of minus 207 degrees Fahrenheit. More importantly, the compound remains stable at atmospheric pressure and a temperature of minus 320 degrees Fahrenheit, the temperature at which liquid nitrogen boils.

“We thought that would be economically very feasible. Liquid nitrogen is easy and cheap to make,” Wendy Mao said.

The hydrogen in a clathrate can be released when heated to 207 degrees Fahrenheit. The clathrate’s environmentally friendly byproduct is water.
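The press release mixes Fahrenheit and atmospheres, which obscures how extreme the synthesis conditions are and how comparatively mild the storage conditions are. Converting the quoted figures to SI units:

```python
def fahrenheit_to_kelvin(f: float) -> float:
    return (f - 32.0) * 5.0 / 9.0 + 273.15

def atm_to_gpa(atm: float) -> float:
    return atm * 101325.0 / 1e9

print(fahrenheit_to_kelvin(-207))            # ~140 K, synthesis temperature
print(fahrenheit_to_kelvin(-320))            # ~77 K, liquid-nitrogen storage
print(fahrenheit_to_kelvin(207))             # ~370 K (~97 C), release temperature
print(atm_to_gpa(20000), atm_to_gpa(30000))  # ~2.0 to 3.0 GPa synthesis pressure
```

So synthesis takes pressures like those deep in the Earth's crust, but storage needs only liquid-nitrogen temperatures.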

The unanswered question here is: how much energy does it take to convert hydrogen into a clathrate? Also, if the hydrogen has to be heated to release it from the clathrate then how much energy is required to do that?

By Randall Parker 2004 January 28 12:23 AM  Energy Tech
2004 January 26 Monday
Cambridge UK Conference Looks Into Climate Engineering

A recent scientific conference looked into methods for climate engineering to counteract global warming. (same press release here)

The meeting is being jointly hosted by the Tyndall Centre for Climate Change Research and the Cambridge-MIT Institute.

The symposium, called “Macro-engineering options for climate change management and mitigation” is at the Isaac Newton Institute in Cambridge from 7-9 January.

“We urgently need to explore the feasibility of imaginative new ideas for reducing global warming in the future, either by slashing carbon dioxide emissions, or by counteracting its effects, if we are to avoid dangerous climate change”, says Professor John Shepherd, a Director of the Tyndall Centre.

Proposed options for reducing carbon dioxide pollution currently include underground burying of liquefied carbon dioxide; disposal in the sea; fertilising its absorption by marine algae; reflecting the sun’s rays in the atmosphere; and stabilizing sea-level rise. These and other macro-engineering ideas will be evaluated against a strict set of criteria, including effectiveness, environmental impacts, cost, public acceptability, and reversibility. All of these options go beyond the conventional approaches of improving energy efficiency and reducing carbon intensity by using more renewable energy sources, and may be needed in addition to these conventional approaches.

“Because of the urgency of implementing climate-change management, more innovative approaches to the mitigation of climate change might be needed. This is really a big thought experiment, to critically evaluate which macro-engineering options might be feasible and worth pursuing” comments John Shepherd. “Some of the macro-engineering options which have been suggested are big and rather scary, and some may even appear to be crazy. That is precisely why they should be evaluated – and if necessary dismissed – as soon as possible, so that society can decide which should be developed as serious options for future use, if & when they are needed.”

“Most of these macro-engineering options are not yet in the mainstream for climate policy, but the mere fact that they have been suggested places an obligation on scientists from many disciplines to explore their feasibility and evaluate their consequences and their wider implications” comments Shepherd.

Bubble-making machines could delay global warming for decades.

Instead, the scientists backed more way-out systems for reflecting the sun's rays back into space. Plan A would float thousands of bubble-making machines across the world's oceans to send huge amounts of salt spray into the atmosphere. The trillions of tiny droplets would make the clouds bigger, whiter, and more reflective -- enough, in theory, to shut down several decades worth of global warming.

Plan B would flood the stratosphere with billions of tiny metal-coated balloons, "optical chaff" to backscatter the sun's rays. Most sophisticated of all, Plan C would assemble giant mirrors in orbit, ready to be positioned at will by a global climate controller.

The BBC reports on 4 major categories of conceivable climate engineering approaches.

  • "sequestering" (storing) carbon dioxide, for example in the oceans, by removing it from the air for storage, or by improved ways of locking it up in forests
  • "insolation management" - modifying the albedo (reflectivity) of clouds and other surfaces to affect the amount of the Sun's energy reaching the Earth
  • climate design, for example by long-term management of carbon for photosynthesis, or by glaciation control
  • impacts reduction, which includes stabilising ocean currents by river deviation, and providing large-scale migration corridors for wildlife.

A test of one technique for climate engineering is currently underway. A German scientific team led by Victor Smetacek set sail on January 21, 2004 on board the research ship Polarstern, headed for the Antarctic to attempt a massive experiment in salting the ocean with iron to encourage phytoplankton to remove carbon dioxide (CO2) from the atmosphere.

The team plans to dissolve an iron sulphate solution in a 150-200 square-kilometre patch of the Southern Ocean, near Antarctica, where currents are expected to keep the iron within a limited area. The team will then monitor the growth of phytoplankton from a helicopter and examine which kinds of algae and other creatures flourish over a period of eight to ten weeks.

If and when global warming becomes a net harm to humanity, climate engineering may turn out to be a far more cost-effective way to mitigate it. If atmospheric carbon dioxide levels could rise without causing global warming then there would be benefits in the form of faster-growing crops and the spread of plants into areas that are currently deserts. Israeli scientists attribute the expansion of a forest into the Negev desert to higher atmospheric CO2 levels.

Update: Even without the formation of an international committee to choose climate engineering projects the impact of human activity on the climate is going to become a far more politically contentious subject. The reason is that advances in climate science will improve our ability to model and predict how each human activity will change the climate all over the world. So, for instance, it will become possible to know how much the burning of coal in China increases or decreases rainfall in Saharan Africa. It will become possible to know how much car exhaust fumes in the United States change temperatures in Europe in different seasons. It will become possible to know how cutting down rainforests in Brazil or Indonesia affects rainfall in Peru or Australia. As people become more aware of how activities by other people cause impacts in their own lives then it seems reasonable to expect animosities between countries to rise as a consequence.

If normal human activities in each country come to be seen as causing problems in other countries then it will not take the formation of an international climate engineering organization for climate changes to be the source of focused resentments and animosities. To the extent that science causes the weather to seem less like a consequence of acts of God and more like the consequence of acts of humans climate will become a source of political strife and possibly terrorism and war.

By Randall Parker 2004 January 26 12:52 PM  Engineering Large Scale
2004 January 25 Sunday
Aubrey de Grey Interview: Closing in on the Cure for Death

In a far-ranging interview with the Better Humans web site, biogerontologist Aubrey de Grey outlines the reasons why so few scientists are currently working on rejuvenation therapies even though biological science and biotechnology have advanced far enough for such work to begin in earnest.

The fatalism problem can be dissected into three separate problems that form a sort of triangular logjam, each perpetuating the next. The public thinks nothing can be done. So, the state only funds very unambitious work -- very reasonably they feel that to fund stuff that their constituency thinks is a pipedream would jeopardize re-election. (Parallel logic holds for shareholders and directors in industry.)

So, scientists -- also very reasonably -- don't even submit grants to do ambitious stuff, even if they want to (of which more in a moment), because it's a waste of time -- the grant will be turned down. So, when scientists go on the television to talk about their work, they talk about the cautious stuff that they're actually doing, not about the ambitious stuff that they're not doing, and indeed this encourages them to the mindset that they don't really want to do the ambitious stuff anyway.

So, the public -- again very reasonably -- continues to view curing aging as very, very far away, because the scientists with the best information are telling them that (not in as many words, but by what they're not saying). So each of these three communities is behaving very reasonably in its own terms, but the result is stasis.

Aubrey lays out his 7 major categories of therapies that will, once they become available, make it possible for humans to have youthful bodies for decades longer than is now possible. He believes concrete steps could be taken now to test versions of those therapies in mouse models, and that results from the mouse studies could be available within 10 years if $100 million per year were spent developing all of the major categories of therapeutic approaches. Therapies that would add years and perhaps even decades to human lives could be available by the 2020s if a big push were started now. Then more therapies introduced in the later 2020s and 2030s could so extend life that anyone still alive at that point who doesn't die from an accident will effectively be able to become young again.

There are enough multimillionaire and billionaire philanthropists that all the work could be done with private money if only enough wealthy people became interested. If you know any wealthy people then do us all a favor and send them Aubrey's interview and some of the articles from his web site.

Speaking of Aubrey's web site, if you haven't already been there be sure to visit Aubrey's home page for Strategies for Engineered Negligible Senescence (SENS) and read some of his articles about how to stop and reverse aging.

By Randall Parker 2004 January 25 10:49 PM  Aging Reversal
2004 January 23 Friday
UCSD Researchers Develop New Way To Look At 3-D Protein Structures

Advances in basic instrumentation and in techniques for characterizing the structure of biological molecules enable many other advances that produce results more directly usable in medicine and other fields. While the people who produce medical treatments tend to get most of the glory, the scientists who advance instrumentation and biological assays create the tools that make the more directly beneficial advances possible.

With this thought in mind you might be mildly excited to learn that researchers at UC San Diego have developed a new method for determining the structures of proteins that could not be solved by existing methods.

An innovative method that allows increased success and speed of protein crystallization – a crucial step in the laborious, often unsuccessful process to determine the 3-dimensional structure unique to each of the body’s tens of thousands of folded proteins – has been developed by researchers at the University of California, San Diego (UCSD) School of Medicine and verified in tests with the Joint Center for Structural Genomics (JCSG) at The Scripps Research Institute (TSRI) and the Genomics Institute of the Novartis Research Foundation in La Jolla, California.

Described in the Jan. 20, 2004 issue of the journal Proceedings of the National Academy of Sciences (PNAS)*, the method, which employs a UCSD invention called enhanced amide hydrogen/deuterium-exchange mass spectrometry, or DXMS, rapidly identifies small regions within proteins that interfere with their ability to crystallize, or form a compact, folded state. The investigators demonstrate that once these regions are removed by what amounts to “molecular surgery”, the proteins then crystallize very well.

“Although the sequencing of the human genome gave us the code for genes that are the recipes for proteins, we need to see and understand the folded shape taken by proteins to determine how they work as the fundamental components of all living cells,” said UCSD’s Virgil Woods, Jr., M.D., the inventor of DXMS, senior author of the PNAS article and an associate professor of medicine. “Definition of a protein folded structure is of great use in the discovery of disease-targeting drugs. Furthermore, when we’re able to identify incorrectly folded proteins in disease states, such as Alzheimer’s, cystic fibrosis and many cancers, we may then be able to design drugs that encourage proper folding or block the misshapen protein.”

The 3-dimensional structure of a protein is valuable knowledge for drug developers: it suggests what types of chemical compounds to build and test against the protein for binding affinity. Better protein structure determination tools will therefore speed drug development.

X-rays are used to gather the information from which the 3-dimensional structures of proteins are discovered. The problem is that x-ray crystallography requires that a protein first be induced to form crystals, and not all proteins can be made to do so.

Unfortunately, many proteins do not naturally form a single, compact state in solution and hence, they are often highly resistant to crystallization, which is required for the x-ray crystallographic process that determines their shape. X-ray crystallography works by bombarding x-rays off crystals of a protein that contain a 3-dimensional lattice, or array of the individual protein or of a protein complex. The scattered, or diffracted pattern of the x-ray beams is used to calculate a 3-dimensional structure of the protein.

Out of 24 proteins used to test this technique, 6 had already been crystallized, and DXMS correctly reproduced what was known about their structures, down to small unfolded regions. Of the 18 proteins that had defeated all previous crystallization attempts, DXMS rapidly pinpointed the unstructured regions in 15. "Molecular surgery" to cut out those regions was then tried on 2 of them, and both crystallized and yielded structures.

Of the 24 proteins provided by JCSG for DXMS analysis, six had already been crystallized and their structures determined. The results provided by DXMS matched the information on those six proteins, correctly identifying even small unfolded regions. The remaining 18 proteins provided by JCSG had all failed extensive prior crystallization attempts. In the new experiments, DXMS technology rapidly determined the unstructured regions in 15 of these proteins.

Two of the previously failed proteins were then subjected to “molecular surgery”, in which the DXMS-identified unstructured regions were selectively removed from the DNA that coded for the proteins. DXMS study of the resulting modified proteins demonstrated that the surgery had removed the unstructured regions without otherwise altering the shape of the originally well-folded regions. Each of the two resulting DXMS stabilized forms of the proteins were then found to crystallize well, while the original, unmodified proteins again failed to crystallize.

JCSG investigators were subsequently able to determine the 3-dimensional structures of these two proteins by x-ray analysis of the crystals resulting from DXMS-guided stabilization. One of the proteins that was successfully crystallized was found to have a unique shape or “fold”, not previously seen in proteins.
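In cartoon form, the DXMS triage works because amide hydrogens in floppy, unstructured regions swap with solvent deuterium quickly while hydrogens in well-folded regions are protected, so contiguous runs of fast-exchanging residues mark the segments to cut out before the next crystallization attempt. A toy illustration; the per-residue uptake values and the cutoff are invented:

```python
# Hypothetical per-residue deuterium uptake after a brief D2O exposure,
# as a fraction of maximum possible exchange (0 = protected, 1 = fully exchanged).
uptake = [0.1, 0.15, 0.1, 0.9, 0.95, 0.85, 0.9, 0.2, 0.1, 0.05]
DISORDER_THRESHOLD = 0.7  # invented cutoff

def disordered_runs(uptake, threshold):
    """Return (start, end) index ranges of contiguous fast-exchanging residues."""
    runs, start = [], None
    for i, u in enumerate(uptake):
        if u >= threshold and start is None:
            start = i
        elif u < threshold and start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(uptake) - 1))
    return runs

print(disordered_runs(uptake, DISORDER_THRESHOLD))  # [(3, 6)] -> candidate deletion
```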

So now a technique has been found that can determine the 3-dimensional structures of proteins which were previously beyond the reach of researchers. Science marches on.

By Randall Parker 2004 January 23 01:29 PM  Biotech Assay Tools
2004 January 21 Wednesday
REM And Slow-Wave Sleep Needed For Memory Consolidation

Duke University researchers studying rats have found that memory consolidation occurs in both the slow-wave and rapid eye movement (REM) stages of sleep. (same article here and shorter press release here)

In their study, the researchers placed about 100 infinitesimal recording electrodes in the brains of rats, in four regions involved in memory formation and sensory processing. Those brain areas included the hippocampus, which is widely believed to be involved in memory storage, and areas of the forebrain involved in rodent-specific behaviors. The scientists employed the same neural recording technology that Nicolelis and his colleagues used to enable monkeys to control a robot arm, an achievement announced in October 2003.

The researchers next exposed the rats to four kinds of novel objects in the dark, since largely nocturnal rodents depend on the sense of touch via their whiskers to investigate their environment. The four objects were a golf ball mounted on a spring, a fingernail brush, a stick of wood with pins attached and a tube that dispensed cereal treats.

The researchers recorded and analyzed brain signals from the rats before, during and after their exploration, for several days across natural sleep-wake cycles. Analyses of those signals revealed "reverberations" of distinctive brain wave patterns across all the areas being monitored for up to 48 hours after the novel experience.

According to Ribeiro, "We found that the activity of the brain when the animal is in a familiar environment does not 'stick' -- that is, the brain keeps moving from one state to another. In contrast, when the animal is exploring a novel environment, that novelty imposes a certain pattern of activity, which lingers in all the areas we studied. Also, we found that this pattern was much more prevalent in slow-wave sleep than in REM sleep."

Conversely, previous studies by Ribeiro and his colleagues demonstrated that the activation of genes able to effect memory consolidation occurs during REM sleep, not slow-wave sleep.

"Based on all these results, we're proposing that the two stages play separate and complementary roles in memory consolidation," he said. "Periods of slow-wave sleep are very long and produce a recall and probably amplification of memory traces. Ensuing episodes of REM sleep, which are very short, trigger the expression of genes to store what was processed during slow-wave sleep." In principle, this model explains studies such as those by Robert Stickgold and his colleagues at Harvard University, showing that both slow-wave and REM sleep have beneficial effects on memory consolidation, he said. According to Nicolelis, the new experiments remedy shortcomings of previous studies.

Of course what we all want is to be able to more easily store selected memories. Suppose more sophisticated sleeping drugs are developed that can selectively cause humans to spend more time in slow-wave and REM sleep. Will that boost memory formation? Suppose it did. I think one would want to exercise restraint in the use of such drugs. Do you want to remember the most boring details of your most boring days? I think not. It might even make sense to crowd your most important learning-intensive activities into particular days so that special, intense memory-formation sleeping sessions can lock in the new knowledge from those days.

The genes identified as up-regulated during the REM stage of memory storage are a point of particular interest. Another approach to boosting memory formation would be to induce the expression of genes involved in memory consolidation. The problem, though, is that a drug that upregulated their expression might be too broad in its effects, causing the genes to be expressed all the time rather than just in the phase of REM sleep when memories are normally consolidated.

Drugs capable of inducing specific sleep states, and of turning on genes used in particular sleep states, hold out the potential of creating hybrid states of mind between sleeping and waking. Whether those hybrid mental states would be useful in practice remains to be seen. Perhaps it will eventually be possible to use sleep-state-regulating drugs in the following way: study a subject really intensely for an hour or two and then hook oneself up to an automatic drug-dispensing device (which might even be implanted) that releases drugs to throw one quickly and successively into slow-wave and REM sleep states, consolidating the memories of what one was studying. Then the drugs would stop and another drug would bring you back awake with the last couple of hours of memories consolidated.

Using this approach one might cycle through several rounds of learning and sleeping states in a single day. This would let humans escape some of the limits imposed by our evolutionary legacy: long before Edison's invention of the electric light bulb, our ancestors were tuned to the 24-hour cycle of light and dark set by the rotation of the Earth.
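To make the arithmetic of that proposal concrete, suppose, purely hypothetically, that drugs could compress a full consolidation pass into well under an hour. With invented cycle lengths:

```python
study_h, sws_h, rem_h, wake_up_h = 2.0, 0.5, 0.2, 0.1   # invented durations, hours
cycle_h = study_h + sws_h + rem_h + wake_up_h           # 2.8 h per learn/sleep cycle

waking_day_h = 16.0
cycles = int(waking_day_h // cycle_h)
print(cycles, "cycles,", cycles * study_h, "hours of consolidated study")
# 5 cycles, 10.0 hours -- versus a single consolidation pass per night today.
```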

The full text of the article is available free on-line in the journal Public Library of Science Biology (PLoS Biology). First, a synopsis of the work:

Brain Activity during Slow-Wave Sleep Points to Mechanism for Memory

The research paper for the work:

Long-Lasting Novelty-Induced Neuronal Reverberation during Slow-Wave Sleep in Multiple Forebrain Areas
Sidarta Ribeiro, Damien Gervasoni, Ernesto S. Soares, Yi Zhou, Shih-Chieh Lin, Janaina Pantoja, Michael Lavine, Miguel A. L. Nicolelis

PloS Biology even has a list of news articles reporting on this paper here.

By Randall Parker 2004 January 21 07:11 PM  Brain Memory
2004 January 20 Tuesday
Stem Cell Therapy Replaces Missing Myelin In Mouse Brains

A team led by Steven Goldman, M.D., Ph.D., of the University of Rochester Medical Center has injected human progenitor cells, derived from both fetal and adult tissue, into mice genetically engineered to lack myelin, the insulating sheath that covers nerve fibers. The injected cells restored some of the missing myelin.

The team remyelinated the mice – restored the “insulation” to the brain cells– by injecting into the mice highly purified human “progenitor” cells, which ultimately evolve into the cells that make myelin. These cells are known as oligodendrocytes: While these and other types of glial cells aren’t as well known as information-processing brain cells called neurons, they are vital to the brain’s health.

“Neurons get all the press, but glial cells are crucial to our health,” says Goldman.

The team studied 44 mice that were born without any myelin wrapped around their brain cells. Within 24 hours of their birth, scientists injected cells that become oligodendrocytes –myelin-producing cells – into one precisely selected site in the mice.

Scientists found that the cells quickly migrated extensively throughout the brain, then developed into oligodendrocytes that produced myelin which coated or “ensheathed” the axons of cells in the brain.

“These cells infiltrate exactly those regions of the brain where one would normally expect oligodendrocytes to be present,” says Goldman. “As they spread, they begin creating myelin which wraps around and ensheaths the axons.”

Goldman says that while scientists have used other methods during the past two decades to remyelinate neurons in small portions of the brains of mice, the remyelination seen in the Nature Medicine paper is much more extensive. He estimates that about 10 percent of the axons in the mouse brains were remyelinated, compared to a tiny fraction of 1 percent in previous studies.

If the auto-immune attack on myelin could be stopped in Multiple Sclerosis (MS) sufferers then even repair of 10% of the damage would improve functionality. That might be enough of a difference to allow someone in a wheelchair to walk with the assistance of a walker, or to allow a person to feed himself.

Currently, demyelinating diseases are permanent, and problems worsen as time goes on because there is no way to fix the underlying problem – restoring the myelin around the axons of brain cells. Goldman is hopeful that infusion of cells like oligodendrocyte progenitors might be used to offer relief to patients.

“The implantation of oligodendrocyte progenitors could someday be a treatment strategy for these diseases,” says Goldman, a neurologist whose research was supported by the National Multiple Sclerosis Society and the National Institute of Neurological Disorders and Stroke. While the experiment provides hope for patients, Goldman says that further studies are necessary before considering a test in humans. Currently he’s conducting experiments in an attempt to remyelinate not just the brains but the entire nervous system of mice.

While it is widely known that Multiple Sclerosis (MS) is caused by an auto-immune attack that eats away at the myelin sheath, Goldman points out that many other diseases involve myelin damage. Myelin restoration would therefore help repair damage associated with many diseases that become more common as we age.

In addition to MS, many diseases affecting tens of millions of people in the United States involve myelin problems, Goldman says. These include widespread diseases like diabetes, heart disease and high blood pressure, where decreased blood flow can damage myelin and hurt brain cells, as well as strokes, which often destroy brain cells in part by knocking out the cells that pump out myelin. In addition, cerebral palsy is largely caused by a myelin problem in infants born prematurely.

Given that myelin, like everything else, deteriorates with age, the ability to even partially repair it with a stem cell therapy offers some prospect of improving cognitive function in the aged.

Nervous system repair is an especially important and especially difficult rejuvenation challenge. Eventually it will be possible to grow replacements for most organs. But the brain must be repaired in place and in ways that do not cause any damage to existing networks of nerves. Therapies that hold the prospect of repairing even a limited subset of all nervous system age-related damage are cause for excitement.

The team found that adult human cells were much more adept at settling into the brain, becoming oligodendrocytes and producing myelin than the fetal cells. After just four weeks, adult cells but not fetal cells were producing myelin. After 12 weeks, four times as many oligodendrocytes derived from adult cells were producing myelin – 40 percent, compared to 10 percent of the cells from fetal cells. In addition, adult cells were likely to take root and form oligodendrocytes, not other brain cells such as neurons or astrocytes, which are not necessary for myelin production. On average, each oligodendrocyte from an adult cell successfully remyelinated five axons, compared to just one axon for fetal cells.

“The adult-acquired cells not only myelinate much more quickly, but more extensively – they myelinate many more axons per cell, and they do so with much higher efficiency. The adult cells were far more efficient than fetal cells at getting the job done,” Goldman says.
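Those figures imply a large per-cell advantage for the adult-derived cells. Rough arithmetic from the quoted numbers, treating the percentages and the axons-per-cell counts as independent multipliers (which the paper may not strictly support):

```python
adult_fraction_myelinating = 0.40   # of engrafted adult-derived cells, at 12 weeks
fetal_fraction_myelinating = 0.10   # of engrafted fetal-derived cells
adult_axons_per_cell = 5.0          # axons myelinated per productive cell
fetal_axons_per_cell = 1.0

adult_yield = adult_fraction_myelinating * adult_axons_per_cell  # axons per engrafted cell
fetal_yield = fetal_fraction_myelinating * fetal_axons_per_cell
print(adult_yield / fetal_yield)    # ~20x advantage for the adult-derived cells
```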

The adult-acquired cells (a.k.a. adult stem cells) have a big advantage: they are already more specialized for the desired task. The biggest advantage of embryonic stem cells is also their biggest disadvantage: they can become any kind of cell. To develop a stem cell therapy for a particular disease one usually has to make stem cells become more specialized, so that they produce only one or a few final functional cell types (what biologists call differentiated cells). Embryonic stem cells delivered into a diseased organ that needs a particular cell type may turn into a number of different cell types, and many of those will not help in treating the disease being targeted for therapy.

This is not to say that adult stem cells are ideal in all respects. First, one needs to find a type of adult stem cell that is capable of becoming the target differentiated cell type. We do not know the adult stem cell type for every final differentiated type, and some adult stem cell types are hard to isolate. Also, adult stem cells taken from adults frequently act old: they grow more slowly. In fact, the aging of cells in adult stem cell reservoirs is a major contributor to general aging, and we need the ability to replenish those reservoirs with younger adult stem cells. For instance, it may be possible to avoid or delay atherosclerosis and heart disease by rejuvenating adult stem cell reservoirs. Whether that is best done by rejuvenating one's own adult stem cells or by turning embryonic stem cells into adult stem cells remains to be seen. But one advantage of rejuvenating one's own adult stem cells is that it avoids the immune rejection problems posed by embryonic stem cells that are not derived from one's own tissue.

There are many genetically caused myelin diseases.

The classic leukodystrophies include adrenoleukodystrophy, Krabbe's globoid cell, and metachromatic leukodystrophy, and a few other less well known entities. They have in common a genetic origin and involve the peripheral nerves as well as the central nervous system. Each is caused by a specific inherited biochemical defect in the metabolism of myelin proteolipids that results in abnormal accumulation of a metabolite in brain tissue. Progressive visual failure, mental deterioration, and spastic paralysis develop early in life, however, variants of these diseases have a more delayed onset and a less progressive course. The other primary white matter disorders include Alexander's disease, Canavan disease, Cockayne's syndrome, and Pelizaeus-Merzbacher's disease.

Tay-Sachs disease is among the genetically based myelin diseases.

When babies are born, many of their nerves lack mature myelin sheaths, so their movements are gross, jerky, and uncoordinated. The normal development of myelin sheaths is impaired in children born with certain inherited diseases, such as Tay-Sachs disease, Niemann-Pick disease, Gaucher's disease, and Hurler's syndrome. Such abnormal development can result in permanent, often extensive, neurologic defects.

By Randall Parker 2004 January 20 01:27 PM  Biotech Organ Replacement
Vitamin C, E In High Dose Combination May Protect Against Alzheimer's

Vitamins C and E in combination appear to reduce the incidence and prevalence of Alzheimer's Disease.

Peter P. Zandi, Ph.D., of The Johns Hopkins University Bloomberg School of Public Health, Baltimore, and colleagues examined the relationship between antioxidant supplement use and risk of AD.

The researchers assessed the prevalence of dementia and AD in 4,740 elderly (65 years or older) residents of Cache County, Utah in 1995 to 1997 and collected information about supplement use. These residents were followed-up in 1998 to 2000 for new cases of dementia or AD. The researchers identified 200 cases of AD (prevalent cases) between 1995 and 1997, and 104 new cases (incident cases) of AD during follow-up.

The researchers categorized participants as vitamin E users if they reported taking an individual supplement of vitamin E or a multivitamin containing more than 400 IU (international units) of vitamin E. Vitamin C users reported taking vitamin C supplements or multivitamins containing at least 500 milligrams of ascorbic acid. Individuals were classified as multivitamin users if they reported taking multivitamins containing lower doses of vitamin E or C.

The researchers found the greatest reduction in both prevalence and incidence of AD in participants who used individual vitamin E and C supplements in combination, with or without an additional multivitamin. "Use of vitamin E and C (ascorbic acid) supplements in combination reduced AD prevalence [by about 78 percent] and incidence [by about 64 percent]," the authors write.

The researchers also found "no appreciable association with the use of vitamin C alone, vitamin E alone, or vitamin C and multivitamins in combination," and prevalence of AD.

"The current… recommended daily allowance for vitamin E is 22 IU (15 micrograms), and for vitamin C (ascorbic acid), 75 to 90 micrograms," the researchers write. "Multivitamin preparations typically contain these approximate quantities of both vitamins E and C (more vitamin C in some instances), while individual supplements typically contain doses up to 1,000 IU of vitamin E and 500 to 1,000 micrograms or more of vitamin C (ascorbic acid). Our findings suggest that vitamins E and C may offer protection against AD when taken together in the higher doses available from individual supplements."

The combination of C and E works far better than either alone.

Antioxidant vitamin supplements, particularly vitamins E and C, may protect the aging brain against damage associated with the pathological changes of Alzheimer's disease, according to a study conducted by the Johns Hopkins Bloomberg School of Public Health and other institutions. The researchers believe antioxidant vitamin supplements may be an ideal prevention strategy for our aging population as they are relatively nontoxic and are thought to have wide-ranging health benefits. The study, "Reduced Risk of Alzheimer's Disease in Users of Antioxidant Vitamin Supplements" is published in the January 2004, issue of the journal Archives of Neurology.

Peter P. Zandi, PhD, lead author of the study and an assistant professor in the School's Department of Mental Health, said, "These results are extremely exciting. Our study suggests that the regular use of vitamin E in nutritional supplement doses, especially in combination with vitamin C, may reduce the risk of developing Alzheimer's disease."

The researchers examined data from the Cache County Study, which is a large, population-based investigation of the prevalence and incidence of Alzheimer's disease and other dementias. Residents who were 65 or older were assessed from 1996-1997 and again from 1998-2000. Study participants were asked at their first contact about vitamin usage. The researchers then compared the subsequent risk of developing Alzheimer's disease over the study interval among supplement users versus nonusers to come to their conclusions.

The doses of C and E that have the best prospects of working are fairly high.

Researchers believe the most effective doses were vitamin E in liquid capsules of 400 to 1,000 International Units and vitamin C in pill form of 500 to 1,500 milligrams.
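Since the study doses are quoted in IU while the RDA is usually given in milligrams, a conversion helps. This sketch assumes the standard factor for natural-source alpha-tocopherol (about 1.49 IU per mg; the synthetic form converts differently):

```python
MG_PER_IU_NATURAL = 1.0 / 1.49   # natural RRR-alpha-tocopherol

def vitamin_e_iu_to_mg(iu: float) -> float:
    return iu * MG_PER_IU_NATURAL

print(vitamin_e_iu_to_mg(22))    # ~15 mg: the RDA
print(vitamin_e_iu_to_mg(400))   # ~268 mg: low end of the studied supplement doses
print(vitamin_e_iu_to_mg(1000))  # ~671 mg: high end
```

So the supplement doses studied run very roughly 18 to 45 times the RDA.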

If you want to take vitamin E to reduce your risk of Alzheimer's Disease, be aware that it is best taken with oil and perhaps a fortified grain cereal for maximum absorption. (same article here)

The pill of 400 I.U. vitamin E taken with just a glass of milk, in theory should have provided more than 13 times the RDA of this nutrient. But, in fact, it raised the level of new vitamin E in the blood by only 3 percent. By comparison, the cereal fortified with 30 I.U. vitamin E raised the blood plasma level of new vitamin E five times higher than that, and the cereal fortified with 400 I.U. raised the new blood plasma level 30 times higher.

The effect of a pill of 400 I.U. taken with a serving of plain wheat cereal was inconsistent; some participants had a significant increase in blood plasma levels of vitamin E, others almost none. "This study clearly showed that applying vitamin E onto a grain cereal provided a huge and consistent increase in its bioavailability," said Scott Leonard, an LPI research assistant who conducted the study. "Even 30 I.U., the RDA for this vitamin, produced a large increase in new blood plasma levels."

Vitamin E with pasta and a pasta sauce with oil would probably be a great way to maximize absorption.

By Randall Parker 2004 January 20 02:18 AM  Brain Alzheimers Disease
Dogs Have Personality Types Obvious To Owners And Strangers

Owners and strangers were separately asked to classify individual dogs by personality traits and came to similar conclusions.

The traits, which are also found in humans, have positive and negative extremes - for example, dogs could be rated as energetic, slothful or somewhere in between. The other traits were affection-aggression, anxiety-calmness and intelligence-stupidity.

...

In total, 78 dogs of all shapes and sizes were tested. In general, owners and strangers agreed on an individual dog's personality. This suggests that the dog personalities are real, says Gosling.

Of course there is a biological basis to personality differences in humans and in dogs. Yes, even strangers can size up a dog and tell you how affectionate or calm or smart the dog is. But now this has been demonstrated scientifically.

Many scientists resist the idea that animals have distinct, individual personalities.

The research, conducted with the help of 78 dogs and owners who were recruited at a dog park in Berkeley, Calif., found the animals' personality traits could be judged with an accuracy comparable to judgments made about humans' personality traits.

"The findings ... suggest a conclusion not widely considered by either human-personality or animal-behavior researchers: Differences in personality traits do exist and can be measured in animals," says the research paper by Samuel D. Gosling, an assistant professor of psychology at the University of Texas at Austin; Virginia S.Y. Kwan of Princeton University; and Oliver John of the University of California, Berkeley.

Gosling says that many researchers are reluctant to believe that dogs have distinct personalities. The mind boggles. Do personality researchers as a group have an aversion to dogs? Have they no experience owning a variety of dogs with very distinct personalities? It is amazing what obvious truths have to be proved by science.

While some dogs and some human children behave poorly as a result of a lack of training or because of abuse, many others are just plain determined to be aggressive or defiant, or highly motivated to achieve some goal, regardless of any adult human supervision. That dogs have unique traits just as humans do is obvious to anyone with considerable experience of multiple dogs. Even within a breed there is considerable variation, though less than is found between breeds.

The demonstration that dog personalities can be classified is useful because it enables the search for genetic variations that influence personality. It strikes me, however, that the 4 traits used are inadequate for describing all genetically-based variations in dog behavior. For instance, dogs vary in their instinctive liking for water and for retrieving, and in many other ways that obviously have genetic bases. The wide range of dog personalities produced by the development of so many breeds for different purposes provides fertile ground for the search for genetic factors that influence personality and behavior.

By Randall Parker 2004 January 20 02:03 AM  Biological Mind
2004 January 16 Friday
Will Eternal Youthfulness Lead To Less Ambition?

In a post entitled "Would potential immortals be risk-averse?" Tyler Cowen of Marginal Revolution also explores the question of whether long-lived people will be less ambitious.

A related question is whether immortals would be less ambitious, since they might always feel they could accomplish their goals in a more distant future. As long as we are citing fiction, I recall seeing a television show about immortal beings. They were content to remain homeless and spent most of their time sitting around a campfire and talking. They accumulated few possessions. They never feared such a course of action would lead to death, and they always held the option of trying to do more.

In my own post on this topic, "Will Longer Lives Make People More Risk Averse?", I explored the question of whether medical treatments that offer eternal youthfulness will cause people to become extremely risk averse. Aubrey de Grey thinks people will go so far as to stop flying in airplanes and even stop driving cars. I don't think this will happen, for reasons very similar to why young people do dangerous things even though they have many decades to live: a lot of people are bored and want to get their kicks. Many (though by no means all) people are biologically wired to crave intense and dangerous thrills. Plus, human brains are not wired to measure risks accurately, so some people are bound to do things that are more dangerous than they believe them to be.

One factor tends to argue against the idea that eternal youthfulness will lead to low ambition: a young mind and body is an energetic mind and body, and an energetic mind and body finds ambitious undertakings much easier to perform. If something feels like it takes less effort, people are more likely to attempt it. My guess is that youthfulness will increase accomplishment by making work feel more effortless, and that this feeling of effortlessness will outweigh the lost sense of urgency that now comes from the need to accomplish things before getting too old. So external measures of ambition could rise even as internal feelings of ambition decline. People might actually end up feeling less ambitious even as they accomplish more, owing to the ease with which they will be able to exert themselves.

Whether future eternally youthful people will become less ambitious also depends on the answer to the same question I raised about eternal youthfulness and risk aversion: what kinds of personalities will people choose to give themselves once they are able to make enduring changes to their personalities? People could choose to give themselves hard-driving, highly motivated, goal-oriented personalities. In that case, people might use multi-thousand-year lifespans to carry out plans that take hundreds or thousands of years to execute. Or they might just keep finding new goals that take less time to accomplish.

The bigger question I have about personality engineering is this: are there personality types that people are less likely to change away from, so that people who periodically change their personalities eventually land on one of them? One might imagine each personality as having something akin to an energy state: the lower the energy state, the less likely a person, once in that state, is ever to decide to leave it. Perhaps once it becomes very easy to change personality types the human race will gradually settle into the "low energy state" personality types. There may be radically different personality types each of which leaves the people who hold it with no desire to become another type. Humanity might end up dividing into those types, whatever they turn out to be.
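The energy-state analogy can be made precise by treating personality types as states of a Markov chain, where a "low energy" type is one that people rarely choose to leave. Whatever the starting mix, probability mass piles up in the sticky states. A toy model with three invented types:

```python
# Rows: current type; columns: type after one "re-engineering" opportunity.
# Type C is "low energy": people who hold it rarely choose to leave it.
transition = [
    [0.70, 0.20, 0.10],   # from A
    [0.20, 0.60, 0.20],   # from B
    [0.02, 0.03, 0.95],   # from C (sticky)
]

population = [1 / 3, 1 / 3, 1 / 3]   # start evenly spread across types
for _ in range(200):                 # many rounds of optional personality changes
    population = [
        sum(population[i] * transition[i][j] for i in range(3))
        for j in range(3)
    ]

print([round(p, 3) for p in population])  # mass concentrates in type C (~0.75)
```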

By Randall Parker 2004 January 16 12:49 AM  Aging Debate
2004 January 14 Wednesday
Researchers Find Key Gene For Evolution Of Human Intelligence

Bruce Lahn and collaborators have discovered signs of strong selective pressure in primates on a gene that affects brain size.

The researchers, led by Howard Hughes Medical Institute (HHMI) investigator Bruce Lahn at the University of Chicago, reported their findings in an advance access article published on January 13, 2004, in the journal Human Molecular Genetics. Patrick Evans and Jeffrey Anderson in Lahn's laboratory were joint lead authors of the article.

“People have studied the evolution of the brain for a long time, but they have traditionally focused on the comparative anatomy and physiology of brain evolution,” said Lahn. “I would venture, however, that there really hasn't been any convincing evidence until now of any gene whose changes might have contributed to the evolution of the brain.”

In this study, the researchers focused on a gene called the Abnormal Spindle-Like Microcephaly Associated (ASPM) gene. Loss of function of the ASPM gene is linked to human microcephaly - a severe reduction in the size of the cerebral cortex, the part of the brain responsible for planning, abstract reasoning and other higher brain function. The discovery of this association by HHMI investigator Christopher A. Walsh and colleagues at Beth Israel Deaconess Medical Center is what prompted Lahn to launch an evolutionary study of the gene.

Lahn and his colleagues compared the sequence of the human ASPM gene to that from six other primate species shown genetically to represent key positions in the evolutionary hierarchy leading to Homo sapiens. Those species were chimpanzee, gorilla, orangutan, gibbon, macaque and owl monkey.

“We chose these species because they were progressively more closely related to humans,” said Lahn. “For example, the closest relatives to humans are chimpanzees, the next closest are gorillas, and the rest go down the ladder to the most primitive.”

For each species, the researchers identified changes in the ASPM gene that altered the structure of the resulting protein, as well as those that did not affect protein structure. Only those genetic changes that alter protein structure are likely to be subject to evolutionary pressure, Lahn said. Changes in the gene that do not alter the protein indicate the overall mutation rate - the background of random mutations from which evolutionary changes arise. Thus, the ratio of the two types of changes gives a measure of the evolution of the gene under the pressure of natural selection.

Lahn and his colleagues found that the ASPM gene showed clear evidence of changes accelerated by evolutionary pressure in the lineage leading to humans, and the acceleration is most prominent in recent human evolution after humans parted ways from chimpanzees.

“In our work, we have looked at evolution of a large number of genes, and in the vast number of cases, we see only weak signatures of adaptive changes,” said Lahn. “So, I was quite surprised to see that this one gene shows such strong and unambiguous signatures of adaptive evolution — more so than most other genes we've studied.”

By contrast, the researchers' analyses of the ASPM gene in the more primitive monkeys and in cows, sheep, cats, dogs, mice and rats, showed no accelerated evolutionary change. “The fact that we see this accelerated evolution of ASPM specifically in the primate lineage leading to humans, and not in these other mammals, makes a good case that the human lineage is special,” said Lahn.

According to Lahn, among the next steps in his research will be to understand how ASPM functions in the brain. Studies by Walsh and others hint that the protein produced by the gene might regulate the number of neurons produced by cell division in the cerebral cortex. Lahn and his colleagues plan functional comparisons of the ASPM protein among different species, to understand how this gene's function or regulation changes with evolution.

The acceleration of ASPM functional changes along the whole lineage suggests that there was evolutionary selective pressure for changes in cognitive function not just from the point where humans split off from chimpanzees but much earlier as well. Was that pressure consistent and continuing? Or did it happen periodically? One can only speculate at this point. Perhaps something about the shape of a primate makes higher intelligence more useful. If so, then the more that shape changes in certain directions the more the selective pressure increases. That pressure could come as a result of the types of habitats primates moved into, how they functioned in those habitats, what they used as food sources, or still other factors. There are a lot of possibilities.

One thing interesting about intelligence as an adaptive mutation is that it allows an animal to learn how to adapt itself to new environments. An animal that is in exactly the same environment its ancestors evolved in might be able to do well just by following instincts. But in a new environment a species either has to mutate into a new shape that better adapts it to that environment or it has to learn how to function there without changing shape. Look at human clothing. A species with fur that migrates into a colder environment might simply gradually develop metabolic changes for cold weather and fur thickness changes that adapt it to the colder environment. Humans whose ancestors lived in colder environments do have genetic variations in their mitochondria that allow them to stay warmer in cold weather. But humans also were smart enough to develop the ability to kill furry animals and use their pelts for clothing. So humans could use their intelligence to adapt themselves to cold climates more rapidly than specific mutational changes could happen to help in the adaptation. Humans spread out across all the continents and adapted themselves to a very wide range of environments even before the modern age of science and technology. What will be interesting to find out is whether specific types of ancient environments required greater cognitive abilities and, if so, what it was about those environments that placed greater demands on the brain.

You might be wondering how exactly scientists can detect selective pressure on a gene. Note how the article talks about mutations that are not functionally significant versus mutations that are functionally significant. Well, compare two related species for the ratio of functionally significant to functionally insignificant variations in the same gene. The higher the ratio the higher the selective pressure must have been.

Here's an intuitive example of why ratios of functionally significant to functionally insignificant mutations reveal the extent of past selective pressure: Suppose at some point in the past there was a species with only a million members. Suppose they had some gene we will call X. Suppose they all had only functionally insignificant mutations in X and that among the million animals of that species there were 20 different combinations of mutations in X. Then suppose a single animal in that species was born with a mutation in X that caused a functional change that made that animal more adaptive. Perhaps the mutation in X made the animal smarter and therefore more successful in finding food. Well, that animal with the "smart X" variation also had one of the existing 20 combinations of functionally insignificant mutations. The other 19 combinations existed only in animals that did not have the "smart X" intelligence-enhancing mutation. All the other animals of that species will therefore be less successful, on average, at reproducing. That will, over a period of generations, cause those other 19 combinations in the X gene to become far less common. Many of the combinations in X will likely disappear entirely as their carriers are outcompeted in the search for food and fail to reproduce successfully. The one combination of insignificant variations that occurs with the "smart X" mutation will become far more common and may become the only combination of insignificant variations in the X gene until new insignificant combinations start accumulating across generations as new mutations happen in animals that carry the "smart X" mutation.

The point is that a valuable mutation will improve the relative reproductive success of the first animal that gets it. But then any unimportant or less important mutations that animal also has will be propagated along with the important mutation. The amount of overall variation in that gene will go down in future generations as the animals that do not have the valuable mutation, but which have various functionally insignificant mutations, reproduce less successfully. Valuable mutations have the effect of reducing the number of functionally unimportant mutational variations found around the genes that carry them.
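
To make the method concrete: this ratio test is essentially the Ka/Ks (also called dN/dS) statistic. Here is a minimal Python sketch with invented counts; real analyses correct for multiple hits and codon structure with tools such as PAML, so treat this as an illustration of the idea only.

    def ka_ks(nonsyn_subs, nonsyn_sites, syn_subs, syn_sites):
        """Crude Ka/Ks estimate from substitution and site counts.

        Ka: protein-altering substitutions per protein-altering site.
        Ks: silent substitutions per silent site; this approximates the
        neutral background mutation rate.
        Ka/Ks well above 1 suggests adaptive (positive) selection; well
        below 1 suggests purifying selection; near 1 suggests neutrality.
        """
        ka = nonsyn_subs / nonsyn_sites
        ks = syn_subs / syn_sites
        return ka / ks

    # Hypothetical counts for one gene compared between two species:
    print(ka_ks(nonsyn_subs=30, nonsyn_sites=2000, syn_subs=5, syn_sites=700))
    # ~2.1: protein-altering changes accumulated faster than the neutral rate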

Update: Nicholas Wade of the New York Times has more details about the historical frequency of ASPM mutations.

"There has been a sweep every 300,000 to 400,000 years, with the last sweep occurring between 200,000 and 500,000 years ago," Dr. Lahn said, referring to a genetic change so advantageous that it sweeps through a population, endowing everyone with the same improved version of a gene.

By this measure humans may be due for another ASPM mutation. Perhaps there is some human out there walking around with the next intelligence-enhancing ASPM mutation.

Where Lahn talks about a mutation that "sweeps through a population" understand what that really means: all animals in the species that did not have the mutation were outcompeted and, over some generations, failed to reproduce. The mutation didn't just jump from one ape to another ape like a viral infection. The successive mutations were each so helpful for enhancing survival and reproduction that animals that didn't have them were outcompeted for food or for mates or in fights, and perhaps in all of those ways.
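
Here is a toy Wright-Fisher style simulation of that dynamic in Python. The population size, selective advantage, and 20 neutral backgrounds are all invented, and the beneficial allele is seeded at 5 percent rather than as a single copy (a brand-new single-copy mutant is usually lost to random drift before selection can act on it). What it illustrates is hitchhiking: when the "smart X" variant takes over, the one neutral background it arose on takes over with it.

    import random

    N = 1000            # population size (haploid, for simplicity)
    s = 0.05            # selective advantage of the "smart X" mutation
    GENERATIONS = 400

    # Each individual carries (has_smart_X, neutral_background_id). Twenty
    # functionally insignificant backgrounds exist; the beneficial mutation
    # arises on background 0.
    pop = [(True, 0) if i < 50 else (False, i % 20) for i in range(N)]

    for _ in range(GENERATIONS):
        # Reproduction: fitter individuals leave more offspring, with drift.
        weights = [1.0 + s if smart else 1.0 for smart, _ in pop]
        pop = random.choices(pop, weights=weights, k=N)

    freq = sum(smart for smart, _ in pop) / N
    surviving = sorted({bg for _, bg in pop})
    print(f"smart X frequency: {freq:.2f}")               # usually ~1.00
    print(f"surviving neutral backgrounds: {surviving}")  # usually just [0]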

Wade says at least 5 other genes cause microcephaly but they have not yet been identified. Once they are, expect evolutionary geneticists to repeat the same cross-species comparison that was done with ASPM. While few humans appear to have functional variations in ASPM (aside from victims of microcephaly) it is possible that some of these yet-to-be-discovered genes will turn out to vary between humans. Humans do vary in brain size and brain shape, and genetic variations in some genes must be causing this, though some of those variations might be occurring in genes that are not responsible for microcephaly.

By Randall Parker 2004 January 14 12:29 AM  Trends, Human Evolution
Entry Permalink | Comments(17)
2004 January 11 Sunday
UCLA Team Claims It Can Predict Earthquakes

A UCLA team claims it can predict, months in advance, that a major earthquake will strike within a window several months long.

Earthquakes can be predicted months in advance

Major earthquakes can be predicted months in advance, argues UCLA seismologist and mathematical geophysicist Vladimir Keilis-Borok.

"Earthquake prediction is called the Holy Grail of earthquake science, and has been considered impossible by many scientists," said Keilis-Borok, a professor in residence in UCLA's Institute of Geophysics and Planetary Physics and department of earth and space sciences. "It is not impossible."

"We have made a major breakthrough, discovering the possibility of making predictions months ahead of time, instead of years, as in previously known methods," Keilis-Borok said. "This discovery was not generated by an instant inspiration, but culminates 20 years of multinational, interdisciplinary collaboration by a team of scientists from Russia, the United States, Western Europe, Japan and Canada."

The team includes experts in pattern recognition, geodynamics, seismology, chaos theory, statistical physics and public safety. They have developed algorithms to detect precursory earthquake patterns.

In June of 2003, this team predicted an earthquake of magnitude 6.4 or higher would strike within nine months in a 310-mile region of Central California whose southern part includes San Simeon, where a magnitude 6.5 earthquake struck on Dec. 22.

In July of 2003, the team predicted an earthquake in Japan of magnitude 7 or higher by Dec. 28, 2003, in a region that includes Hokkaido. A magnitude 8.1 earthquake struck Hokkaido on Sept. 25, 2003.

Previously, the team made "intermediate-term" predictions, years in advance. The 1994 Northridge earthquake struck 21 days after an 18-month period when the team predicted that an earthquake of magnitude 6.6 or more would strike within 120 miles from the epicenter of the 1992 Landers earthquake — an area that includes Northridge. The magnitude 6.8 Northridge earthquake caused some $30 billion in damage. The 1989 magnitude 7.1 Loma Prieta earthquake fulfilled a five-year forecast the team issued in 1986.

Keilis-Borok's team now predicts an earthquake of at least magnitude 6.4 by Sept. 5, 2004, in a region that includes the southeastern portion of the Mojave Desert, and an area south of it.
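
A natural way to judge such forecasts is to ask how often a prediction with that magnitude cutoff, region size, and time window would score a hit by luck alone. Here is a back-of-envelope Poisson sketch in Python; the background rate below is invented for illustration, not a measured value for any of these regions.

    import math

    # Chance that at least one qualifying quake strikes the prediction region
    # during the window by luck alone, assuming quakes arrive as a Poisson
    # process.
    rate_per_year = 0.2    # hypothetical: one M>=6.4 event per 5 years in region
    window_years = 0.75    # a nine-month prediction window

    p_lucky_hit = 1 - math.exp(-rate_per_year * window_years)
    print(f"probability of a lucky hit: {p_lucky_hit:.2f}")  # ~0.14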

If this technique continues to return correct answers, how will Los Angeles or Bay Area residents respond if they are eventually told that a really big quake is coming their way?

Keilis-Borok apparently took on earthquake prediction to give himself something worthwhile to do in his old age. Incredible.

Still, not all seismologists are convinced. "Application of nonlinear dynamics and chaos theory is often counter-intuitive," Keilis-Borok said, "so acceptance by some research teams will take time. Other teams, however, accepted it easily."

Keilis-Borok, 82, has been working on earthquake prediction for more than 20 years. A mathematical geophysicist, he was the leading seismologist in Russia for decades, said his UCLA colleague John Vidale, who calls Keilis-Borok the world's leading scientist in the art of earthquake prediction. Keilis-Borok is a member of the National Academy of Sciences, and the American Academy of Arts and Sciences, as well as the Russian Academy of Sciences, and the European, Austrian and Pontifical academies of science. He founded Moscow's International Institute of Earthquake Prediction Theory and Mathematical Geophysics, and joined UCLA's faculty in 1999.

His research team has started experiments in advance prediction of destructive earthquakes in Southern California, Central California, Japan, Israel and neighboring countries, and plans to expand prediction to other regions.

Parenthetically, this report demonstrates the potential of life extension. Keilis-Borok's mind is probably aging more slowly than the average mind. Imagine what top scientists would accomplish if the aging of their minds could be delayed or avoided.

By Randall Parker 2004 January 11 09:46 AM  Dangers Natural General
Entry Permalink
2004 January 09 Friday
Stanford Researchers Find Evidence Of Memory Suppression Mechanism

Using functional magnetic resonance imaging (fMRI) researchers at Stanford University have found additional evidence that the brain has biological mechanisms for making specific memories harder to access.

Anderson first revealed the existence of such a suppression mechanism in the brain in a 2001 paper published in Nature titled "Suppressing Unwanted Memories by Executive Control." He took the research a step further at Stanford by using brain imaging scans to identify the neural systems involved in actively suppressing memory. The core findings showed that controlling unwanted memories was associated with increased activation of the left and right frontal cortex (the part of the brain used to repress memory), which in turn led to reduced activation of the hippocampus (the part of the brain used to remember experiences). In addition, the researchers found that the more subjects activated their frontal cortex during the experiment, the better they were at suppressing unwanted memories.

"For the first time we see some mechanism that could play a role in active forgetting," Gabrieli said. "That's where the greatest interest is in terms of practical applications regarding emotionally disturbing and traumatic experiences, and the toxic effect of repressing memory." The Freudian idea is that even though someone is able to block an unpleasant memory, Gabrieli said, "it's lurking in them somewhere, and it has consequences even though they don't know why in terms of their attitudes and relationships."

The experiment

Twenty-four people, aged 19 to 31, volunteered for the experiment. Participants were given 36 pairs of unrelated nouns, such as "ordeal-roach," "steam-train" and "jaw-gum," and asked to remember them at 5-second intervals. The subjects were tested on memorizing the word pairs until they got about three-quarters of them right -- a process that took one or two tries, Anderson said.

The participants then were tested while having their brains scanned using functional magnetic resonance imaging (fMRI) at Stanford's Lucas Center for Magnetic Resonance Spectroscopy. The researchers randomly divided the 36 word pairs into three sets of 12. In the first set, volunteers were asked to look at the first word in the pair (presented by itself) and recall and think about the second word. In the second set, volunteers were asked to look at the first word of the pair and not recall or think of the second word. The third set of 12 word pairs served as a baseline and was not used during the brain scanning part of the experiment. The subjects were given four seconds to look at the first word of each pair 16 times during a 30-minute period.

After the scanning finished, the subjects were retested on all 36 word pairs. The researchers found that the participants remembered fewer of the word pairs they had actively tried to not think of than the baseline pairs, even though they had not been exposed to the baseline group for a half-hour.

"People's memory gets worse the more they try to avoid thinking about it," Anderson said. "If you consistently expose people to a reminder of a memory that they don't want to think about, and they try not to think about it, they actually don't remember it as well as memories where they were not presented with any reminders at all."

While the unlocking of repressed memories has been viewed by Freudians as a worthwhile goal of therapy, it is not obvious that this should be the case. If someone can clearly recall a painful memory then the recall is bound to cause many of the same painful emotional reactions that the original incident caused. Why put oneself through the same experience again? Isn't there a lot of advantage to making painful memories hard to access in order to reduce the likelihood of being reminded of them?

Debriefing after a traumatic event has been called into question as a useful method for counseling victims. It may well be that debriefing doesn't work because it makes the mind go back over the traumatic events, strengthening memory formation and making later recall more painful.

Perhaps what is needed is the ability to place removable blocks on memories. I'm thinking of something along the lines of the ability to recall blocked memories when it becomes critical to do so. If any of you have read Roger Zelazny's Today We Choose Faces then imagine the ability to recall the memories of past clones without having to die in order to do so. One could even place tags on memories explaining why one might want to peer into them. For instance, a person thinking of remarrying might want to gain better access to memories of what went wrong in a previous marriage. Or on the anniversary of the death of a loved one, one might want to make the memories of time spent with that person more accessible.

The Stanford researchers were working with very new memories. But in rats the protein synthesis inhibitor anisomycin has been successfully used to erase 45-day-old memories. See the previous post Consolidated Memories Are Erasable In Rats.

By Randall Parker 2004 January 09 04:58 PM  Brain Memory
Entry Permalink | Comments(1)
Bush To Propose Moon Colony, Mars Trip, Ho Hum

President Bush wants humans to create a permanent Moon base in the late 2010s and to go to Mars in the 2020s.

President Bush will announce plans next week to send Americans to Mars and establish a permanent human presence on the moon, senior administration officials said Thursday night.

Bush won't propose sending Americans to Mars anytime soon; rather, he envisions preparing for the mission more than a decade from now, one official said.

If George W. Bush proposed a massive effort to develop enabling technologies and to work on basic scientific questions whose solution would provide the basis for enabling technologies for space exploration then I'd be thrilled. But of course that is not what he did. A trip to Mars is going to have all the long term impact of previous human trips to the Moon. The astronauts will go. They will plant the flag (or perhaps multiple flags from a multinational consortium). Then they will collect some rocks, do some tests, and eventually get back on their spacecraft and come home. Tens of billions will be spent and, while the Mars program will produce some advances, most of the effort will not go toward making big advances.

James Oberg says the Moon base will be useful for testing out technologies needed for a Mars trip.

A human presence on the Moon, says space expert James Oberg, would allow engineers to iron out the technical and medical challenges of a manned Mars mission, which require at least a year of space travel.

Oberg is right that a Moon base would serve as a useful test bed for trying out technologies necessary for a Mars trip. But a Moon base and a Mars trip are both very inefficient ways to advance space technology. One reason for this is that there is an inherent conservatism to any effort to send humans into space. Manned initiatives always run on a schedule, and technologies that might take too long to develop get axed in the planning stage. Also, the costs of manned programs are so large that most of the money has to be spent on approaches that are least risky and least likely to fail either in development or in use.

What ought to be driving NASA efforts is the goal of space colonization. In order to achieve that larger goal we need to strive for technological goals that are much more ambitious than the next manned mission or even the manned mission after that. Given a sufficiently ambitious set of technological goals, the priorities on what to fund and the overall approach taken toward manned spaceflight would undergo a radical change.

Do I hear you asking what should be the technological goals of NASA? Oh great, excellent question, glad you asked. Okay, here are some FuturePundit technological goals for the NASA manned spaceflight program:

  • Develop technologies that could be used to lower the cost of launch into space by orders of magnitude. The two main contenders are hypersonic ramjets or scramjets and nanotube-based space elevators.
  • Develop technologies that can cut the fuel weight needed for interplanetary travel and to increase trip speed. One leading contender to accomplish this is nuclear fission propulsion using either nuclear thermal or nuclear electric methods. Nuclear fusion and anti-matter are even more advanced approaches. Still further out on the horizon is the idea of laser beam propulsion.
  • Advance biological science and develop biotechnologies to produce the means to better adapt human bodies to zero gravity and lower gravity. For instance, pursue the development of gene therapies to retrain osteoclasts and osteoblasts to keep bones strong in low gravity. The research would produce valuable medical treatments for those suffering from osteoarthritis, osteoporosis, and other bone diseases.
  • Develop biotechnologies to produce food, drugs, and structures that would help to create self-sustaining human settlements on the Moon, Mars, and asteroids. Genetic engineering could be used to create bacteria that can produce a large assortment of drugs. Genetic engineering could also be used to create plants and algae that can be used to process waste and grow food.

The ability of humans to get into space, move in space, and live in space and on other planets is so incredibly primitive at this point that we ought to be concentrating on developing radically better technologies rather than spending tens of billions of dollars on space programs that utilize fairly small improvements on existing technologies. We are not going to be able to move out into the solar system and colonize other planets, moons, and asteroids with self-sustaining colonies until we make very large leaps in enabling technologies. Multi-billion dollar short visits to distant places by a small astronaut elite, viewable by the masses as Reality TV, may satisfy a lot of voyeurs. But voyeurism has never held much appeal to me. I don't want to watch astronauts on TV as they first step onto Mars for a brief visit. I want to be able to go there myself and live and work there for a period of years before moving on to Ganymede or to a radically reengineered Venus.

Update: In a column entitled "Mission to Nowhere" Anne Applebaum argues that the public is being deceived about just how far away we are from being able to move many humans out into space great distances.

If the average person on Earth absorbs about 350 millirems of radiation every year, an astronaut traveling to Mars would absorb about 130,000 millirems of a particularly virulent form of radiation that would probably destroy every cell in his body. "Space is not 'Star Trek,' " said one NASA scientist, "but the public certainly doesn't understand that." No, the public does not understand that. And no, not all scientists, or all politicians, are trying terribly hard to explain it either. Too often, rational descriptions of the inhuman, even anti-human living conditions in space give way to public hints that more manned space travel is just around the corner, that a manned Mars mission is next, that there is some grand philosophical reason to keep sending human beings away from the only planet where human life is possible.

It isn't impossible to sustain human life on Mars. It is just impossible to do so with the current level of technology. Make a trip to Mars go faster and the total amount of radiation absorbed en route would be much less. But a faster trip would require major strides to advance science and to develop many new technologies. Send robots ahead to burrow underground and build highly sheltered living quarters and a Mars colony would not receive such massive doses of ongoing radiation. Develop better shielding materials for the trip to Mars and for living on Mars and, again, the radiation exposure could be drastically reduced. But all this takes lots of advances in science and technology. If only the $100+ billion spent on the International Space Station had been spent to fund labs down here on Earth we'd be closer to the day when trips to Mars become possible. But NASA is not pursuing a long term strategy. Most of the space program amounts to a big reality TV production company producing footage that occasionally makes the nightly news and makes the public feel good that something is being done to get humanity into space. But most of the money spent is a waste.
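
The radiation point is just linear arithmetic: the cumulative dose en route is roughly the deep-space dose rate multiplied by trip time, so faster propulsion cuts the dose proportionally. A back-of-envelope Python sketch, assuming (hypothetically) that the 130,000 millirem figure corresponds to a conventional 30-month round trip:

    # Cumulative dose scales with time in deep space. Hypothetically assume
    # the 130,000 millirem figure corresponds to a conventional 30-month
    # round trip; the real exposure profile is messier than this.
    baseline_dose_mrem = 130_000
    baseline_months = 30
    dose_rate = baseline_dose_mrem / baseline_months  # ~4,333 mrem per month

    for months in (30, 12, 6, 3):
        print(f"{months:2d}-month mission: ~{dose_rate * months:,.0f} mrem")
    # Halve the transit time and you halve the en-route dose, which is the
    # case for faster propulsion (nuclear thermal, nuclear electric, etc.).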

By Randall Parker 2004 January 09 02:08 PM  Space Exploration
Entry Permalink | Comments(21)
2004 January 08 Thursday
Therapy Versus Drugs For Depression Compared Via Brain Scans

Cognitive Behavioral Therapy (CBT), which aims to train depressed patients not to think negative thoughts about themselves, causes a different pattern of changes in the brain than the changes caused by anti-depressant drugs.

Using positron emission tomography (PET) -- multi-colored imaging that pinpoints where maximum changes in brain metabolism occur -- Dr. Mayberg's team, led by CBT expert Zindel Segal, PhD, and graduate student Kimberly Goldapple, generated a detailed picture of what this self-correction looks like.

CBT has theoretically been considered a top-down approach because it focuses on the cortical (top) area of the brain -- associated with thinking functions -- to modulate abnormal mood states. It aims to modify attention and memory functions, affective bias and maladaptive information processing. In contrast, drug therapy is considered a bottom-up approach because it alters the chemistry in the brain stem and limbic (bottom) regions which drive more basic emotional and circadian behaviors resulting in eventual upstream changes in depressive thinking.

In this current study in Archives, 14 clinically-depressed adult patients underwent a full course of CBT. They each received 15 to 20 individualized outpatient sessions. None were on drug therapy. The patients' brains were scanned prior to beginning treatment and at the end of the full course of therapy.

Investigators found that CBT targets many of the same limbic and cortical regions affected by drug therapy, but in 'different directions'. With drug therapy, metabolism (blood flow) decreases in the limbic area and increases in the cortical area. With CBT, Mayberg and colleagues identified the reverse pattern: limbic increases (in the hippocampus, dorsal mid cingulate) and cortical decreases (in the dorsolateral, ventrolateral and medial orbital frontal; inferior temporal and parietal). Furthermore, each treatment showed changes in unique brain regions supporting the top-down, bottom-up theories.

What explains this reverse pattern? As CBT patients learn to turn off the thinking paradigm that leads them to dwell on negative thoughts and attitudes, activity in certain areas in the cortical (thinking, attention) region are decreasing as well.

"The challenge continues to be how to figure out 'how to best treat' for what the brain needs," says Dr. Mayberg. She suggests that brain scans may one day become a useful component of the treatment protocol for clinically depressed patients, helping doctors to determine in advance what treatment will be most efficacious, as well as monitor the effectiveness of a particular treatment strategy.

Both types of treatment work on only a subset of all depressed patients and the two different subsets only partially overlap. If patterns in the brains of depressed patients could be found that show how depressed patients differ from each other it might be possible to discover markers for which type of therapy is most likely to work. Some day depressed patients may have their brains scanned to determine what type of anti-depressant treatment has the best chance of working for each patient.

"This experiment lays the groundwork for looking for different markers that will help to optimize the treatment for a given individual; that's the really cool part," said Mayberg, a professor of psychiatry and neurology who conducted the study while at the University of Toronto but recently moved to Emory University in Atlanta.

Genetic testing will probably become even more common than brain scanning for the purpose of choosing the optimal therapy for treating depression. The genetic testing will be cheaper and easier to carry out. Also, genetic testing will be useful for identifying which anti-depressant drugs are more or less likely to work and more or less likely to cause side effects for each person.

Also, it may eventually become possible to automate much of the delivery of cognitive behavioral therapy. An interactive computer could be used to do part of the training in how to avoid thinking negative thoughts. It may also become possible to implant sensors and something like a hearing aid that would be triggered to tell a patient what positive thoughts to have when the sensors detect negative thoughts. Of course such a method of treatment would bring with it the potential for abuse as a means to control people.

By Randall Parker 2004 January 08 12:03 AM  Biological Mind
Entry Permalink | Comments(1)
2004 January 07 Wednesday
Does Reality TV Make People More Accepting Of Surveillance?

Mark Andrejevic argues that people are becoming less afraid of surveillance and some are even eagerly embracing it. (same article here)

Today's college students have none of the fear of "Big Brother" that marked their parents' post-McCarthy Cold War generation. In fact, their fascination with the notion of watching and being watched has fueled a dramatic shift in entertainment programming and ushered in the era of Reality Television.

Mark Andrejevic, an assistant professor of communication studies in the University of Iowa College of Liberal Arts and Sciences, says a number of factors including technology and economy paved the way for the rise of reality television, but none so much as a transformation of Americans' attitudes toward surveillance. He explores these factors and more in his new book, "Reality TV: The Work of Being Watched," (Rowman & Littlefield, 2004.)

...

Andrejevic believes that the interactivity of the Internet paved the way for reality TV mania. He interviewed producers of early reality programs such as MTV's The Real World who said that they initially had a hard time finding people willing to have their lives taped nearly 24 hours a day for several months. That was 1992. Now they hold auditions in college towns and thousands of young people form lines snaking for blocks just for the chance to audition.

"There are now more people applying to The Real World each year than to Harvard," Andrejevic says.

The key to that success is connected to people's increasing comfort with levels of surveillance that were once anathema in American society, Andrejevic says.

"In my book, I have attempted to think about the ways in which reality TV reconfigures public attitudes about surveillance," he says. "We're trained to make a split between private and public surveillance -- to be worried about government surveillance but not private, which is entertainment or gathering information to serve you better. We're moving into a period where that distinction starts to dissolve. Private surveillance is becoming so pervasive that it's time to start worrying about it as a form of social control."

That viewers of reality programming don't worry about surveillance or social control is testament to the power of television as a messenger, Andrejevic says.

"The cast members on these shows are constantly talking about how great the experience is, how much they have grown personally because of it," he says. "It connotes honesty -- you can't hide anything about yourself if you're on camera all day every day. It becomes a form of therapy or almost a kind of extreme sport -- how long can you withstand allowing yourself to be videotaped?"

There are many precedents for elements of today's reality TV shows, each of which introduced some part of what goes into them. Consider all the TV shows that broadcast pictures and video footage of celebrities trying to go about their private lives, and the TV shows dedicated to showing pictures of the houses, cars, clothes and other things that celebrities own. Sometimes the celebrities cooperate with the paparazzi photographers because the celebs want to promote themselves. Other times celebs get quite angry at having their privacy invaded, and yet viewers do not switch away from such shows in disgust at seeing someone's privacy invaded. But even this is nothing new because gossip columnists have been reporting on details of the private lives of public figures for decades and have found large ready audiences for their reports.

There is even a TV show called Cribs where celebrities allow camera crews in to film the insides of their houses. But Cribs is not an entirely novel idea. For decades there have been magazines containing picture spreads of the insides of especially stylish houses whose non-celebrity owners wanted to show off their tastes and affluence to the readers of such magazines.

New generations are growing up viewing television shows that let anyone see the lives of others recorded either voluntarily, as is the case with most reality TV, or involuntarily, as is the case with paparazzi celebrity stalking and with some reality shows like COPS where criminals are filmed being chased and arrested by police. The results of surveillance are increasingly seen as entertainment and as within the realm of the public's right to know. Perhaps the government can not watch us all but TV show producers can.

By Randall Parker 2004 January 07 01:26 PM  Surveillance Society
Entry Permalink | Comments(1)
2004 January 05 Monday
NIST Research On Tissue Engineering

A pair of press releases from the National Institute of Standards and Technology (NIST) draw attention to efforts of NIST researchers to come up with assays and instrumentation that will accelerate research on tissue engineering. Tissue engineering is a very important discipline for developing the means to grow replacement organs and to fix existing organs and other tissues. I especially love research and engineering on tools and techniques that enable other researchers to solve problems of direct benefit. The development of enabling tools and techniques can shrink the amount of time needed to do biomedical research by years and even decades and, in my view, it does not get the attention it deserves.

The first report is for a new technique for testing the biocompatibility of synthetic materials.

A new method for quantitatively measuring the compatibility of materials with living tissues has been developed by researchers at the National Institute of Standards and Technology (NIST). Described in a Dec. 11 presentation at the Tissue Engineering Society International's conference in Orlando, Fla., the technique should provide a more sensitive and reliable means to evaluate the biocompatibility of new materials for a wide range of applications from contact lenses to dental coatings to bone implants.

A paper outlining the new method has been accepted for publication in the Journal of Biomedical Materials Research.

The new method, which represents a novel application of existing bench-top scientific instruments, is a two-step process. The first step involves using a device called a polymerase chain reaction instrument to measure the levels of an organism's cytokines when exposed to a given material. Cytokines are signaling molecules released by white blood cells to protect the body from foreign materials. Higher levels of cytokine production generally indicate non-biocompatible materials have caused inflammation. The second step involves testing exposed cells for a specific protein in the cell membrane, the presence of which indicates cells are dying. This is a complementary test for more serious responses to materials because dying cells are often not capable of producing cytokines. The NIST tests were conducted on cultured mouse cells, which produce similar responses as whole tissues.

NIST post-doctoral researcher LeeAnn Bailey called the new method a "barometer" of biocompatibility.

Whereas current means of testing biocompatibility produce a yes/no result that a material is minimally biocompatible or not, the new analysis can tell which materials are more biocompatible than others. Industry and researchers should be able to use this method to produce new materials for dentistry and other medical applications that are even better matched to the human body.
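
As a sketch of what a graded readout might look like in software, consider the following toy Python function. The thresholds and decision order are invented; the real NIST method reports quantitative cytokine and cell-death measurements rather than labels.

    def biocompatibility_report(cytokine_fold_change, dying_cell_fraction):
        """Toy two-step readout loosely following the NIST scheme.

        Step 1: elevated cytokine expression suggests the material provoked
        inflammation. Step 2: a membrane marker of dying cells catches severe
        responses that step 1 misses, because dying cells often stop making
        cytokines. Both thresholds below are invented for illustration.
        """
        if dying_cell_fraction > 0.20:
            return "severe response: material is killing cells"
        if cytokine_fold_change > 2.0:
            return f"inflammatory: {cytokine_fold_change:.1f}x cytokine induction"
        return "biocompatible at the tested thresholds"

    print(biocompatibility_report(1.3, 0.05))  # passes both steps
    print(biocompatibility_report(4.8, 0.05))  # graded, not just yes/no
    print(biocompatibility_report(1.0, 0.45))  # caught by the cell-death step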

The second report is about a technique that combines an optical coherence microscope and a confocal fluorescence microscope to peer deep into tissue-engineering scaffolds and watch tissue growth.

In the November issue of Optics Express*, National Institute of Standards and Technology (NIST) scientists describe a novel combination of microscopes that can peer deep into tissue-engineering scaffolds and monitor the growth and differentiation of cells ultimately intended to develop into implantable organs or other body-part replacements.

The new dual-imaging tool provides a much needed capability for the emerging tissue engineering field, which aims to regenerate form and function in damaged or diseased tissues and organs. Until now, scrutiny of this complicated, three-dimensional process has been limited to the top-most layers of the scaffolds used to coax and sustain cell development.

Composed of biodegradable polymers or other building materials, scaffolds are seeded with cells that grow, multiply, and assemble into three-dimensional tissues. Whether the cells respond and organize as intended in this synthetic environment depends greatly on the composition, properties, and architecture of the scaffolds' porous interiors. Tools for simultaneously monitoring microstructure and cellular activity can help scientists to tease apart the essentials of this interactive relationship. In turn, such knowledge can speed development of tissue-engineered products ranging from skin replacements to substitute livers to inside-the-body treatments of osteoporosis.

NIST scientist Joy Dunkers and her colleagues paired an optical coherence microscope---a high-resolution probe of the scaffold interior---with a confocal fluorescence microscope---used to track cells stained with a fluorescent dye. The instruments provide simultaneous images that can be merged to create a comprehensive rendering of microstructure and cellular activity. By stacking the sectional images, they can create a top-to-bottom movie showing structural and cellular details throughout the scaffold's volume.

*J. P. Dunkers, M. T. Cicerone, and N. R. Washburn, "Collinear optical coherence and confocal fluorescence microscopies for tissue engineering," Optics Express, Vol. 11, No. 23, pp. 3074-3079. [http://www.opticsexpress.org].
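
The merge-and-stack step the NIST team describes is conceptually simple image processing. Here is a minimal Python/numpy sketch of the idea, assuming two co-registered grayscale stacks; the array shapes and the gray/green color scheme are my own illustration, not the instrument's actual output format.

    import numpy as np

    def merge_stacks(structure, fluorescence):
        """Merge two co-registered image stacks into one RGB overlay volume.

        structure:    (depth, h, w) optical coherence stack (scaffold interior)
        fluorescence: (depth, h, w) confocal stack (dye-labeled cells)
        Returns (depth, h, w, 3): structure rendered in gray, cells in green.
        Stepping through the depth axis gives the top-to-bottom "movie".
        """
        s = structure.astype(float) / structure.max()
        f = fluorescence.astype(float) / fluorescence.max()
        return np.stack([s, np.maximum(s, f), s], axis=-1)

    # Hypothetical stacks of 64 sections, each 256x256 pixels:
    ocm = np.random.rand(64, 256, 256)
    confocal = np.random.rand(64, 256, 256)
    volume = merge_stacks(ocm, confocal)
    print(volume.shape)  # (64, 256, 256, 3)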

If I could make one change in US government research funding policy I'd rechannel funding in biomedical sciences away from clinical trials and away from people who are trying to use existing tools and techniques and toward researchers who are developing new tools. So, for instance, some of the money spent sequencing genomes would be reallocated toward research in microfluidics and nanopore technology to develop new means of DNA sequencing that will eventually be orders of magnitude faster and cheaper. Also, I'd channel more money in the direction of people like the NIST researchers above who are working on enabling technologies for tissue engineering.

With the right set of tools any question can be answered very quickly. The fact that we are spending decades trying to develop cures for cancer, heart disease, degenerative neurological diseases and the like is a sign that our tools are inadequate for the problems which researchers are attempting to solve. We need better tools.

By Randall Parker 2004 January 05 03:45 PM  Biotech Organ Replacement
Entry Permalink | Comments(1)
Ritalin Exposure May Increase Risk Of Depression, Alter Reward Sensitivity

Methylphenidate, best known by the brand name Ritalin, has long term effects on the brains of rats when administered intravenously.

Three new studies conducted in animals, published in the December issue of the journal Biological Psychiatry, provide evidence that misuse of the stimulant methylphenidate (Ritalin) may have long-term effects on the brain and behavior. While methylphenidate and other stimulant medications are the recommended treatments for Attention Deficit Hyperactivity Disorder (ADHD), based on the more than 150 controlled studies demonstrating their safety and efficacy when used as prescribed, these three studies showed changes in the brains of young (adolescent or pre-adolescent) animals that persisted into adulthood. In both animals and humans, the brain continues to develop throughout adolescence. If the current studies are applicable to humans, they could have important implications for young people who use stimulants for recreational purposes.

In the first study, Dr. Cindy Brandon and her colleagues at the Finch University of Health Sciences/The Chicago Medical School examined how low doses of methylphenidate affect dopamine cells in the brains of adolescent rats. Dopamine is a brain chemical that has been implicated in natural rewards, such as food and sex, as well as in drug abuse and addiction. The study showed that the rats experienced brain cell changes that subsequently made them more sensitive to the rewarding effects of cocaine.

In the second study, Dr. William Carlezon, Jr., and his colleagues at Harvard Medical School and McLean Hospital in Belmont, Massachusetts, looked at how pre-adolescent exposure to methylphenidate affected certain behaviors in rats when they reached adulthood. They found that early exposure to twice-daily injections of methylphenidate actually reduced the sensitivity to cocaine reward, but increased other behaviors that could indicate depression. The timing of exposure to methylphenidate may be important — in this study the rats were exposed at an age corresponding to childhood, whereas in the study by Dr. Brandon et al., the rats were slightly older, more akin to adolescence.

In the third study, Dr. Carlos Bolaños and his colleagues at the University of Texas Southwestern Medical Center in Dallas assessed certain behaviors of adult rats given methylphenidate prior to adolescence. They found that compared to drug-naive rats, those chronically exposed to methylphenidate were less responsive to natural rewards, such as sugar and sex, and more sensitive to stressful situations. The methylphenidate-exposed animals also had increased anxiety-like behaviors, and enhanced blood levels of stress hormones.

Adolescent exposure to Ritalin may increase sensitivity to cocaine but pre-adolescent exposure may decrease cocaine sensitivity. That a drug can have different effects depending on the age of the patient is not surprising when we consider the amount of brain growth and reconfiguration that happens during adolescence. A drug is going to have a different effect on a rapidly growing nervous system than on a nervous system that is growing less rapidly or going through different kinds of changes.

One big caveat about these studies is that children with ADD (attention deficit disorder) or ADHD (attention deficit hyperactivity disorder) may be so different cognitively from other children that the long term effects of methylphenidate on ADD/ADHD children may be substantially different from those reported for what are considered to be "normal" rats. But it is unclear whether that means methylphenidate will have worse or better effects on ADD/ADHD children. Also, the rats were given the drug intravenously whereas children usually take it as a pill, which means the drug does not reach the brain as quickly, or necessarily even in the same chemical state.

What is amazing about this is the scale on which doctors and parents have embarked upon a massive experiment that may cause a variety of lasting changes in cognitive function. As of 1995, 2.8 percent of American children were on methylphenidate (Ritalin), a sharp increase from 1.2 percent in 1990. Methylphenidate use is also up in Canada and some other Western countries over about the same period.
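
For perspective, that increase from 1.2 percent to 2.8 percent over five years works out to roughly 18 percent compounded annual growth:

    # Prevalence growth of methylphenidate use among American children:
    p1990, p1995 = 0.012, 0.028
    annual_growth = (p1995 / p1990) ** (1 / 5) - 1
    print(f"{annual_growth:.1%} compounded per year")  # ~18.5%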

If anyone doubts whether, when it becomes possible to do so, humans will be willing to reengineer their minds or the minds of their offspring, consider the use of nervous system-altering drugs on children today. Look at how willing parents and authority figures are to embrace treatments that are not sufficiently well understood and which probably have a number of lasting effects on cognitive function throughout the rest of the lives of the children who are given methylphenidate and other nervous system drugs.

By Randall Parker 2004 January 05 01:15 PM  Brain Addiction
Entry Permalink | Comments(0)
2004 January 02 Friday
Foxg1 Gene Suppression Turns Cortical Cells Into Embryonic Nerve Cells

The Foxg1 gene is necessary for cellular differentiation to produce cerebral cortex cells.

Scientists have identified a gene in the cerebral cortex that apparently controls the developmental clock of embryonic nerve cells, a finding that could open another door to tissue replacement therapy in the central nervous system. In a new study, the researchers found that they could rewind the clock in young cortical cells in mice by eliminating a gene called Foxg1. The finding could potentially form the basis of a new method to push progenitor cells in the brain to generate a far wider array of tissue than is now possible.

The study, led by researchers at NYU School of Medicine, is published in the January 2, 2004 issue of Science magazine.

"What we found was a complete surprise," says Gordon Fishell, Ph.D., Associate Professor in the Department of Cell Biology at New York University School of Medicine. "No one had believed that it was possible to push back the birth date of a cortical neuron. There is this central tenet governing the process of brain development, which says that late progenitor cells [forerunners of mature cell types] cannot give rise to cell types produced earlier in development," he explains.

"Consequently, while some populations of stem cells exist in the adult brain, these cells are restricted to producing only a subset of cell types," notes Dr. Fishell. "If one's goal is to produce cells for replacement therapy, some method must be found to turn back the clock and allow adult stem cells to give rise to the wide variety of cells made during normal brain development."

Eseng Lai, Ph.D., of Merck & Co. and one of the study's co-authors, cloned the Foxg1 gene while he was working at Memorial Sloan-Kettering Cancer Center in New York. He also did seminal work in the late 1990s showing that when the gene is eliminated in embryonic mice, the brain's cerebral hemispheres barely develop. Subsequent work demonstrated that the gene played a role in the early phases of cortical development.

While the press release makes it sound like the researchers have converted more differentiated cells into less differentiated cells, my take on this is that what they have really done is found that by knocking out a gene they can prevent nerve stem cells from becoming more differentiated in the first place. They are not converting cells into a less differentiated state. (Someone correct me if I'm wrong.) They are instead blocking the process of differentiation. That is not nearly as exciting and yet it is a very useful piece of information. But would suppression of the Foxg1 gene in cells that are already more differentiated (more specialized toward becoming neurons of some specific later stage type) cause those cells to revert to a less differentiated state? I don't think these researchers have demonstrated that yet. I also don't think Foxg1 gene suppression in differentiated nerve cells will necessarily cause those cells to revert back into partially differentiated stem cells. The Foxg1 gene might cause methyl groups to be placed at spots on the genome that keep the cells differentiated even once Foxg1 is turned off. (Just speculating, but this is not an unreasonable speculation.)

Still, this is useful information. Every discovery of a gene that plays a role in cellular differentiation (the process by which cells become specialized for specific tasks) provides researchers with information that will help point the way toward experiments that may cause cells to change differentiation state, whether reverting to some type of stem cell or converting into a more specialized cell.

There have been a number of recent reports on promising techniques for producing adult stem cells for a variety of purposes. These other reports strike me as more advanced than this latest report excerpted above. See my previous posts Scripps Researchers Find Molecule That Turns Adult Cells Into Stem Cells, TriStem Claims Converts Blood Cells To Stem Cells, and MIT Technique To Produce Large Numbers Of Adult Stem Cells.

There is another angle to this latest report: it may turn out to be useful information for future techniques in intelligence enhancement. If Foxg1 and other related genes can be manipulated to cause more cerebral cortex cells to be made then it may be possible to improve reasoning ability or memory capacity. More generally, research on cellular differentiation and growth regulation of nerve cells will eventually yield information that will be useful for doing intelligence enhancement.

By Randall Parker 2004 January 02 01:05 PM  Biotech Organ Replacement
Entry Permalink | Comments(0)