2004 July 31 Saturday
DNA Research Data Handling Insufficient To Protect Privacy?

Current DNA privacy guidelines may not be sufficient to protect genetic privacy.

STANFORD, Calif. – In their exuberance over cracking the genetic code, scientists have paid too little attention to privacy issues, say researchers at the Stanford University School of Medicine. Their findings, published in the July 9 issue of Science, suggest that traditional means of ensuring confidentiality do not apply to genetic data and that additional safeguards are needed to protect patients from potential abuses.

“I am surprised that no one has looked at this problem before and asked, ‘Can we really release genome-wide information about individuals to the public,’” said Zhen Lin, a genetics graduate student who led the study. “Nobody did a careful calculation to find whether ‘anonymous’ patients could be identified from this data.”

Supposedly, stripping out non-DNA identifying information is enough to protect a patient's privacy.

A 1996 federal law that governs medical privacy requires that research data be stripped of identifying information such as names, addresses and even the last three digits of a patient’s ZIP code before it can be shared. But the law is essentially silent on the issue of DNA, and most researchers have interpreted this to mean that sharing sequence data linked to information from a patient’s medical history is safe.

Sift through a big chunk of DNA sequence from a published research paper and you can dig out enough data points to compare against a person's DNA obtained some other way and get a match.

“Traditionally people believe that if there is no identifier attached, then the sample is anonymous,” Lin said. “We found that’s really not true because the DNA code itself is an identifier.” To demonstrate this, the researchers looked at specific sites in DNA that commonly vary from person to person, accounting for many genetic differences. Each person has about 5 million such sites in their DNA. Using a statistical model, the researchers found that matching 100 of these sites would identify an individual to a high degree of certainty.
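A back-of-the-envelope version of that calculation (my own sketch, not the Stanford model: I am assuming independent biallelic sites with 50/50 allele frequencies and Hardy-Weinberg genotype proportions) shows why 100 sites are plenty:

```python
# Rough sketch of why ~100 variable DNA sites suffice for identification.
# Assumptions (mine, not the paper's): independent biallelic SNPs with
# 50/50 allele frequencies and Hardy-Weinberg genotype proportions.

p = q = 0.5
genotype_freqs = [p * p, 2 * p * q, q * q]           # AA, Aa, aa

# Probability two unrelated people share the same genotype at one site:
match_per_site = sum(f * f for f in genotype_freqs)  # 0.375

# Probability of a coincidental match at all 100 sites:
coincidence = match_per_site ** 100
print(f"per-site match probability: {match_per_site}")
print(f"chance of matching at all 100 sites: {coincidence:.3e}")
# That is astronomically smaller than 1 / (world population), so a
# 100-site match effectively pins down a single individual.
```

The exact per-site probability varies with real allele frequencies, but any reasonable values leave the 100-site coincidence probability far below one in the world's population.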

In theory, if a person collected a small amount of genetic information about a former research subject, he could match it to database material in the future to get personal medical information about the subject.

Very few people are research subjects on projects that publish a lot of DNA sequences. Most research projects that use DNA variations look at only a fairly small number of sections of DNA to find variations in particular genes thought to be involved in some disease or perhaps in producing differences in athletic performance or cognitive function. So this concern about published DNA sequences is not important (at least not yet) for most research trial subjects who get their DNA tested in some manner.

For research efforts that sequence larger chunks of DNA per test subject, more safeguards may be needed for handling the data. However, at this point the cost of personal DNA sequencing is so high that few people are getting enough of their total genome sequenced by researchers to allow identification of each person. It is not enough simply to have knowledge of 100 out of the 5 million sites which vary between people. Those sites have to be scattered across enough different chromosomes to provide coverage of all (or nearly all) of the chromosomes. My guess is that most research projects that are sequencing large sections are doing so on a limited number of chromosomes and so are not providing the kind of data needed to enable unique identification of each research subject.

In the long run I believe genetic privacy is going to become impossible to protect. So I'm pretty lackadaisical about this whole subject anyhow.

By Randall Parker 2004 July 31 10:35 PM  Biotech Privacy
Entry Permalink | Comments(6)
2004 July 30 Friday
Will Electric Hybrid Cars Be Used As Peak Electric Power Sources?

Future hybrid cars may be used as "Vehicle To Grid" or V2G power sources to meet peak electric power demand.

But if automakers were to make 1 million next-generation V2G vehicles by 2020, they could generate up to 10,000 megawatts of electricity - about the capacity of 20 average-size power plants, according to a 2001 study by AC Propulsion, the electric vehicle maker in San Dimas, Calif., that created the V2G Jetta.

While vehicles could generate plenty of power - studies show they sit idle 90 percent of the time - it would be far too costly to use as simple "base-load" power. Their main value would be in supplying spurts of peak and other specialty "ancillary" power for which utilities pay premium prices. It would be far cheaper for utilities to tap the batteries of thousands of cars, say, than the current practice of keeping huge turbines constantly spinning just to supply power at a moment's notice, studies show.
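The AC Propulsion figures quoted above imply a modest per-vehicle contribution. A quick sanity check (my arithmetic, not theirs):

```python
# Sanity check on the AC Propulsion figures quoted above (my arithmetic).
vehicles = 1_000_000
total_capacity_mw = 10_000

per_vehicle_kw = total_capacity_mw * 1_000 / vehicles  # MW -> kW
print(f"implied output per vehicle: {per_vehicle_kw:.0f} kW")  # 10 kW

# 10 kW is a plausible short-burst output for a hybrid's battery pack
# and inverter, which is why the value lies in peak-shaving rather
# than in continuous base-load generation.
```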

With hybrids it wouldn't be necessary to start the engines of parked cars in order to use them as electric power sources. The hybrids have lots of batteries in them. If they plug in when stopped, part of their battery charge could be drained whenever a distributed computer network decided to switch them onto the grid. The switching could be fairly automated and used to deal with quick spikes in electric power demand.

Next generation electric hybrids will have electric power generation costs that are too high to compete with large electric power plants for non-peak electric power uses. But advances in fuel cell technologies will eventually provide a way to generate electricity more cheaply than car internal combustion engines can. Whether those car fuel cells can ever compete with natural gas or coal fired electric power plants remains to be seen. For more on that possibility see my previous post Cars May Become Greater Electricity Generators Than Big Electric Plants.

Even if car electricity doesn't become cost competitive expect car electric power generators to become emergency back-up power sources. If a big grid failure happens 10 or 20 years from now we won't have to wait while the long distance electric power lines and switching stations are repaired. People will just plug houses into cars.

Aside about hybrids on the street: Quite a number of people in Santa Barbara California have Toyota Prius electric hybrid cars. I see them every day. One of the most curious aspects of the hybrids is that they are so quiet when running off of battery. As a result I've had to adjust my behavior a bit: you can't rely as much on the absence of engine sound to know that no car is coming when, say, doing a daily dog run up a street.

By Randall Parker 2004 July 30 02:20 AM  Energy Tech
Entry Permalink | Comments(17)
2004 July 29 Thursday
Video Cameras Spreading In Nursing Homes To Prevent Abuse

Called "granny-cams", video cameras placed in the rooms of elderly nursing home residents are in many cases being funded by families so that they can verify that their elderly relatives are not being abused or neglected by nursing home workers.

About a dozen state legislatures have granny-cam legislation under consideration. Earlier this year, New Mexico joined Texas in allowing nursing home residents or their representatives to install monitoring cameras in their rooms.

Under the laws, a resident must let nursing-home operators know ahead of time of the placement of the camera. If the operator is not notified or if the equipment is not open and obvious in the room, the camera is considered covert surveillance and illegal.

Use of such cameras is a positive step in reducing the potential for elderly abuse, Cottle, an editor at the journal, concluded. In particular, Web cameras hold the greatest potential for restoring public confidence in nursing homes by giving family members access to "real time" or to recently stored footage.

Commercial outlets now sell Web-camera systems to the elderly at prices from $629 to $1,584, depending on the specifications of each camera, plus a $20 monthly fee to access the server and $10 a month for a data-only line to upload images.

"Certainly some families have the financial means to provide this quality of technological protection, however the majority of Americans do not," Cottle wrote. To be effective and properly regulated, granny-cam technology should therefore be mandated for all nursing facilities.

In some cases family members are able to monitor their parents and grandparents by watching camera video streams remotely over the internet.

Cameras also could monitor many of the basics of resident care, such as drug administration and diaper changing. By linking the camera feed to the Internet, nursing homes could handle routine assignments more efficiently.

But because of understandable concerns over privacy, Cottle advocates placing the surveillance systems in the hands of independent companies, which would then monitor the equipment and be responsible for making the data available online.

"In this way, families can check on their loved ones and nursing homes can check on their residents, and everyone will sleep a little better at night knowing that the independent source is regulating and reviewing the tapes should any problems arise," Cottle wrote.

Many people are willing to give up privacy in exchange for security. Effectively the cameras provide a way for more trusted people to monitor the actions of less trusted people. The monitoring capability provided by electronic technology allows the role of trusted agent to be separated from the role of service provider. The cameras are monitored either by family members or by third party organizations. These organizations effectively serve to audit and monitor performance of nursing homes on behalf of family members or even on behalf of the elderly themselves.

Another way to think about video cameras used in security is that they allow a trusted agent to leverage their trust to enforce and monitor more transactions and facilities. This ability to separate out the role of trusted agent from the roles of providing various other services is a big underappreciated long term trend that is changing how societies are organized. It is going to affect the structure of governments in part by allowing outsourcing of various components of governance. For example, one can imagine how this could lead to situations where particularly corrupt governments agree to remote monitoring of a large range of transactions and facilities in exchange for international aid. A country like Finland with an incredibly low level of corruption could literally provide remote trust services for institutions in countries with high levels of corruption such as Moldova or Paraguay.

By Randall Parker 2004 July 29 02:50 PM  Surveillance Cameras
Entry Permalink | Comments(1)
2004 July 28 Wednesday
Will Intelligent Alien Life Be Discovered Within 20 Years?

Search for Extraterrestrial Intelligence Institute astronomer Seth Shostak argues that computers and radio telescopes will advance enough in the next two decades that within 20 years we will be able to scan all stars for radio transmissions that are the signs of intelligent life elsewhere.

If intelligent life exists elsewhere in our galaxy, advances in computer processing power and radio telescope technology will ensure we detect their transmissions within two decades. That is the bold prediction from a leading light at the Search for Extraterrestrial Intelligence Institute in Mountain View, California.

Seth Shostak, the SETI Institute's senior astronomer, based his prediction on accepted assumptions about the likelihood of alien civilisations existing, combined with projected increases in computing power.

Astronomer and Panspermia theorist Chandra Wickramasinghe thinks Shostak's prediction is reasonable.

"The criticism of this group has been to say that we've looked for intelligence for close on half a century and nothing has turned up, therefore there has to be nothing.

"I think that's an extremely false position to take.

"Forty years is too short a time to expect anything. We would be greedy if we expect the first hellos to come in the next 10 years.

“Twenty years is a more reasonable time to look forward to.”

Even if an alien civilization is discovered that is 1000 light years away (i.e. it would take light 1000 years to travel between here and there) there may be people alive right now who will live long enough to carry on conversations with aliens. Once we achieve engineered negligible senescence people will be able to live youthful lives for thousands of years. So while conversations with aliens may require thousands of years to conduct, such conversations could be carried out between individuals rather than between successive generations as representatives of civilizations.
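A quick tally (my own arithmetic, using the 1000-light-year separation from the example above) shows just how few question-and-answer exchanges even radically extended lives would allow:

```python
# How many question-and-answer exchanges fit into an extended lifespan
# when the other party is 1000 light years away? (my own arithmetic)
distance_ly = 1000
round_trip_years = 2 * distance_ly      # signal out plus reply back

for lifespan in (80, 800, 8000):
    exchanges = lifespan // round_trip_years
    print(f"{lifespan}-year life: {exchanges} complete exchanges")
# Even an 8000-year life allows only 4 complete exchanges, which is
# why such a conversation belongs to long-lived individuals rather
# than to any single human generation.
```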

I think the biggest problem with a conversation with aliens is that both we and the aliens would change so much between transmitting messages and receiving responses that we'd have little in the way of a meeting of the minds or a convergence of beliefs or mutual understanding.

Of course, if we could live for tens or hundreds of thousands of years then that opens up the possibility of travelling to meet aliens. But the trip would be so boring and even if the aliens sounded friendly we'd have no way of knowing how much they'd change before we travelled through space to the alien star system. By the time we reached their planet they might be gone or overrun by artificial nanotech creatures.

The other big problem with travelling to meet them is that either their microbes might be fatal to us or ours might be to them or to some aspect of their ecology. They might not want to risk having us as visitors.

By Randall Parker 2004 July 28 02:31 AM  Space Exploration
Entry Permalink | Comments(18)
2004 July 27 Tuesday
MIT Electron Microscope Offers Higher Resolution For Biological Molecules

Paul Matsudaira, a Whitehead Institute Member, professor of biology, and professor of bioengineering at MIT, has just set up a unique new electron microscope with the ability to image biological molecules at near-atomic resolution.

Deep in MIT’s labyrinthine campus, the Whitehead/MIT BioImaging Center, a collaboration launched a few years ago with seed funding from the W. M. Keck Foundation, and headed by Matsudaira, has set up its new digs. Right in the middle, sequestered in a specially designed, environmentally isolated room, is the Center’s prize possession: a $2M cryoelectron microscope, the JEOL 2200FS.

The first one of its kind in the world for biology problems, the microscope is designed to image the smallest biological molecules at near-atomic resolution, surpassing what most other microscopes can offer.

Like all electron microscopes, this one images electrons as they pass through an object. Placing the microscope in a climate-controlled room isolated from vibrations, magnetic fields, and even people—the microscope is operated remotely—helps stabilize these easily perturbed electrons, thus improving image quality. In spite of these environmental safeguards, some electrons lose energy simply by colliding with atoms, often clouding the image that the microscope detects. A built-in energy filter acts as a sort of funnel, collecting only the electrons that have not lost energy. Put another way, it only photographs electrons that are in focus. Knowing a protein’s shape is intrinsic to understanding its function, so this sort of imaging makes for more than just a pretty picture.

The energy filter makes this electron microscope unique.

Although a handful of other microscopes in the world are capable of imaging at such a resolution, the lack of an energy filter forces a reliance on computer applications to complete the images. Says Matsudaira, “This one just doesn’t have to work as hard as the others to get the same results.”

The tools keep getting better and so the rate of bioscientific and biotechnological progress keeps on accelerating.

By Randall Parker 2004 July 27 11:22 PM  Biotech Advance Rates
Entry Permalink | Comments(0)
2004 July 26 Monday
Humans Use A Fifth Of All Land Plant Materials

Dr. Marc L. Imhoff, a Principal Investigator in NASA's Carbon Cycle Science and Land Cover Land Use Change Programs, visiting scientist Lahouari Bounoua and colleagues have added up the human consumption of plant matter and found that globally humans are consuming 20% of the world's plant life.

NASA scientists working with the World Wildlife Fund and others have measured how much of Earth's plant life humans need for food, fiber, wood and fuel. The study identifies human impact on ecosystems.

Satellite measurements were fed into computer models to calculate the annual net primary production (NPP) of plant growth on land. NASA developed models were used to estimate the annual percentage of NPP humans consume. Calculations of domesticated animal consumption were made based on plant-life required to support them.

Marc Imhoff and Lahouari Bounoua, researchers at NASA's Goddard Space Flight Center (GSFC), Greenbelt, Md., and colleagues, found humans annually require 20 percent of NPP generated on land. Regionally, the amount of plant-based material used varied greatly compared to how much was locally grown.

Note that this analysis does not include the oceans. So human fishing activities as a percentage of all ocean-based biomass production are not included in the above analysis.

North America's lower latitudes and lower population density allow it to produce 9 times more carbon in plant matter than Europe. So even though North America consumes much more plant matter than Europe, humans in North America consume a smaller percentage of its local plant matter than do Europeans.

The effort resulted in a worldwide map of consumption that could be broken down to individual regions by population to compare how much an area such as North America, for example, consumes with the amount it can produce locally - about 24 percent of its annual plant production.

Highly populated Western Europe and South Central Asia, on the other hand, each consume 70 percent of the greenery they produce.
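The quoted percentages square with the ninefold production gap between the continents. A quick consistency check (my arithmetic, using the article's rounded figures):

```python
# Consistency check on the quoted NPP figures (my arithmetic, using the
# article's rounded numbers, with Europe's annual production as the unit).
europe_production = 1.0
na_production = 9.0 * europe_production   # NA grows ~9x Europe's plant carbon

na_consumption = 0.24 * na_production          # NA uses ~24% of local production
europe_consumption = 0.70 * europe_production  # Europe uses ~70% of its own

ratio = na_consumption / europe_consumption
print(f"NA consumes ~{ratio:.1f}x as much plant matter as Europe")
# North America can consume roughly three times as much in absolute
# terms while still drawing a far smaller share of what it grows locally.
```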

With a much larger population China is already consuming more plant matter per year than North America even though China consumes only a fourth the amount per capita. Continued industrialization therefore seems likely to greatly increase world human demand for plant matter. This seems likely to result in rising timber prices and a shift toward the use of other types of building materials.

The bigger problem with rising Chinese demand and overall rising world demand is that it will decrease the amount of biomass available for wild animal life.

The rising demand for biomass also argues against a big role for biomass materials as future energy sources. Granted, some waste biomass could be converted to energy. But even wastes are potential food sources for bacteria, fungi, insects, fish, and other life forms. A perfectly efficient human-managed biomass cycle is going to squeeze out other life forms that rely upon plants, dead animals, and wastes as food sources.

Also see Structures In United States Cover Area Equal To Ohio.

By Randall Parker 2004 July 26 01:50 PM  Energy Tech
Entry Permalink | Comments(6)
2004 July 25 Sunday
Eugenics Debated In Germany

Eugenics is considered in many circles to be morally repugnant. Among Germany's political elites this attitude is especially prevalent as a reaction to Nazi killings and sterilizations which were motivated in part by ridiculous Nazi genetic theories (though a ruthless tribalistic view of the other was a powerful motivation as well). As a reaction to Nazi era practices pre-implantation genetic diagnosis (PIGD) of defects in babies conceived in test tubes is against the law in Germany even though it is legal in almost all other Western countries. In spite of elite views most Germans favor the practice of PIGD to avoid defects in offspring.

The procedure, called pre-implantation genetic diagnostics (PGD), is forbidden in Germany but has been used in fertility clinics elsewhere since its invention in 1989.

The latest firestorm erupted last month at a Berlin conference on human reproduction, when researchers released a survey indicating that 4 in 5 Germans approve of PGD to prevent genetic diseases.


The findings seem to fly in the face of the consensus among politicians. A parliamentary commission reexamined the legality of PGD in 2002 - and unanimously decided to keep PGD strictly forbidden.

Pre-Implantation Genetic Diagnostics (also sometimes abbreviated PIGD) is legal and used in most Western countries. Therefore it can be argued that eugenics is already being widely practiced with little opposition by many people who are using in vitro fertilization to start pregnancies. Also, the genetic testing of couples before conception in order to provide advice about risks of having a baby amounts to eugenics as well. The use of knowledge of the genetics of prospective parents or embryos to decide whether to proceed with a pregnancy is eugenics. Eugenics is not defined as something only governments carry out. Whether individuals use genetic technology to alter the genetics of offspring or governments mandate the use of technology for eugenic purposes, either way the use of genetic knowledge to alter reproductive outcomes is a form of eugenics.

I expect the practice of eugenics to become more widespread as the cost of genetic testing drops, as the expanding body of genetic research allows us to derive increasing numbers of useful insights from genetic tests, and as it becomes possible to do gene therapy on eggs, sperm, and embryos. While most eugenic decisions in the West will be left to individuals I also expect to see laws passed to discourage or even to forbid the passing along of certain genetic variations - and not just variations that cause what are widely held to be defects. For instance, when genetic variations that make a person very likely to be highly violent are identified then I expect most people to eventually favor the outlawing of knowingly passing along those genetic variations to future generations.

As eugenics becomes something that larger numbers of individuals can practice for their own benefit the stigma associated with the term eugenics is going to fade. As it becomes possible for individuals and couples to make more decisions about the genetic make-up of their offspring it is going to become necessary to remove the general taboo associated with the term eugenics so that the costs and benefits to society as a whole for particular genetic variations can be debated. Some parents will inevitably select genetic variations that make their children more problematic for the rest of us (for example, by reducing the impulse to carry out altruistic punishment). Since I think it unlikely that most governments will ban eugenics entirely we will need to come up with criteria for which genetic variations are allowable.

By Randall Parker 2004 July 25 10:39 AM  Biotech Society
Entry Permalink | Comments(10)
2004 July 23 Friday
Fetuses Give Pregnant Women Stem Cell Therapy

Diana W. Bianchi, M.D. of the Tufts University Sackler School of Graduate Biomedical Sciences has found that during pregnancy cells from fetuses cross over into mothers, become a large assortment of specialized cell types in the mothers, and persist for years.

Bianchi and her colleagues retrieved cells from the tissue samples of 10 women who had male sons and compared them to tissue samples from 11 women who had never had male offspring. The reason the researchers chose women with male offspring is that it would be easy to detect cells from male offspring because male cells carry the Y chromosome, while female cells do not.

The tissue samples were from the thyroid, cervix, liver, lymph node, intestine, spleen and gallbladder. Skin samples were also collected from 11 women in a control group.

Bianchi said that not only did they find fetal cells present in the mothers' tissue samples, but that the fetal cells had taken on the characteristics of the mother's cells.

This result is very important for the stem cell debate. (same article here)

The findings could also affect the national debate over stem cells, she said, in that they raise the possibility of obtaining stem cells, which can change into many tissues of the body, without the ethical issues involved in creating or destroying human embryos. President Bush has sharply restricted federal funding for research on human embryonic stem cells to keep the government from supporting research that he believes destroys human life.

Pregnant or previously pregnant women could potentially be a source of pluripotent stem cells.

Likewise, the author of the Tufts study, Dr. Diana Bianchi, said another potential source of stem cells is women who have been pregnant.

"Studies have virtually ignored the role of pregnancy, but women who have been pregnant potentially have cells with therapeutic potential from their fetus," she said.

It is possible that many years after a pregnancy there are no longer cells in the mother's body that are fetal and capable of becoming all cell types. But a better point at which to try to catch fetal cells from the blood stream of women would be while they are still pregnant or perhaps shortly after giving birth. If fully pluripotent stem cells can be isolated from the blood of pregnant women then this may well provide a source for such cells that will not raise religious hackles.

Back in 1996 Bianchi first found fetal cells in mothers. But those cells were blood cells only. Her finding that fetal cells are becoming other cell types strongly suggests that fetal cells that are fairly undifferentiated are crossing over into women from fetuses and then differentiating (specializing) into various adult cell types. If those fetal cells that are crossing over are capable of converting into a large variety of specialized cell types then they are quite possibly the equivalent of pluripotent embryonic stem cells.

A confirmation of this result poses what seems to me an ethical problem for the religious opponents of embryonic stem cell research. If developing embryos effectively are donating human embryonic stem cells (hESC) to mothers and literally doing cell therapy on mothers then this natural process is doing something that at least some hESC therapy opponents consider to be morally repugnant.

It will be interesting to see where the various hESC research opponents come down on this result. Will they oppose the extraction of embryonic stem cells from a mother's blood while she is pregnant? If so, on what moral basis?

My guess is that a large fraction of the hESC research opponents will decide that extraction of hESC from a mother's blood is morally acceptable. No fetus will be killed by the extraction. The cells so extracted are not cells that would go on to become a complete new human life. If a sizable portion of the religious hESC opponents can be satisfied by this approach for acquiring hESC then Bianchi's research may well lead to a method to get hESC that will open the gates to a much larger effort to develop therapies based on hESC.

By Randall Parker 2004 July 23 01:24 PM  Biotech Organ Replacement
Entry Permalink | Comments(19)
2004 July 22 Thursday
Monster Waves Major Cause Of Large Ship Sinkings

Susanne Lehner, Associate Professor in the Division of Applied Marine Physics at the University of Miami, and Wolfgang Rosenthal of the GKSS Forschungszentrum GmbH research centre in Geesthacht, Germany, used synthetic aperture radar data of the oceans collected by two European Space Agency satellites to find that huge 25+ meter high waves are far more common than previously thought.

Once dismissed as a nautical myth, freakish ocean waves that rise as tall as ten-storey apartment blocks have been accepted as a leading cause of large ship sinkings. Results from ESA's ERS satellites helped establish the widespread existence of these 'rogue' waves and are now being used to study their origins.

Severe weather has sunk more than 200 supertankers and container ships exceeding 200 metres in length during the last two decades. Rogue waves are believed to be the major cause in many such cases.

Mariners who survived similar encounters have had remarkable stories to tell. In February 1995 the cruiser liner Queen Elizabeth II met a 29-metre high rogue wave during a hurricane in the North Atlantic that Captain Ronald Warwick described as "a great wall of water… it looked as if we were going into the White Cliffs of Dover."

In a three-week period looking at a small fraction of the ocean surface these scientists found 10 waves that were at least 25 meters (over 82 feet) high. So these waves are like 7-story buildings or even higher.

Previously many scientists thought waves of such heights were extremely rare.

Objective radar evidence from this and other platforms – radar data from the North Sea's Goma oilfield recorded 466 rogue wave encounters in 12 years - helped convert previously sceptical scientists, whose statistics showed such large deviations from the surrounding sea state should occur only once every 10000 years.
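That "once every 10,000 years" figure comes from linear wave theory, under which individual wave heights at a fixed spot follow roughly a Rayleigh distribution around the significant wave height. A sketch of that style of calculation (my own, with assumed values for the significant wave height and wave period, not the scientists' actual inputs):

```python
import math

# Why linear theory called 25-metre waves near-impossible (my sketch,
# assuming Rayleigh-distributed wave heights, a ~10-second average wave
# period, and a 7-metre significant wave height at one ocean location).
Hs = 7.0                     # significant wave height, metres (assumed)
h = 25.0                     # rogue wave height, metres
p_exceed = math.exp(-2 * (h / Hs) ** 2)   # Rayleigh exceedance probability

waves_per_year = 365.25 * 24 * 3600 / 10  # one wave every ~10 seconds
years_between = 1 / (p_exceed * waves_per_year)
print(f"expected wait at one spot: {years_between:.2e} years")
# The wait comes out to tens of thousands of years, which is why
# statisticians dismissed rogue waves before the satellite evidence.
```

The satellite and oil-platform observations showed waves at these heights turning up orders of magnitude more often, so the Rayleigh model badly understates the tail of the real distribution.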

The fact that rogue waves actually take place relatively frequently had major safety and economic implications, since current ships and offshore platforms are built to withstand maximum wave heights of only 15 metres.

This brings up an interesting question: Is the risk of dying in a trans-Atlantic or trans-Pacific crossing greater on a cruise liner or in a jumbo jet? I had assumed up until now that the risk was greater on an airplane. Now I'm not so sure. Anyone know if there are reliable numbers that can be used for calculating risks for ocean cruise ship crossings?

Aircraft and ships will become safer with time. One obvious strategy to adopt with ships is to develop technologies for spotting waves so that a ship's course can be altered to avoid them. Also, ships can be designed to be able to survive encounters with 30 or even 40 or 50 meter waves. But right now what is the safest way to travel?

Once it becomes possible to reverse aging and keep one's body in a permanently youthful state many people are going to become far more interested in reducing risks of accidental death. A risk that may seem low for an 80 year lifespan will seem much larger for an 800 or 8000 year lifespan. So the relative risks of driving, flying, and travelling on ships or trains are going to become a topic of much wider interest. Since the purpose of this web log is to think about issues that will be of increasing importance in the future it is not too early to start thinking about what is the safest way to cross oceans.
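How risk compounds with lifespan can be made concrete (the numbers here are illustrative, my own, not actual crossing statistics): with a per-trip death risk p accumulated over n trips, lifetime risk is 1 - (1 - p)^n.

```python
# How a small per-trip risk compounds over a radically extended life
# (illustrative numbers of my own, not real crossing statistics).
p_per_trip = 1e-6            # assumed chance of dying on one crossing
trips_per_year = 2

for lifespan_years in (80, 800, 8000):
    n = trips_per_year * lifespan_years
    lifetime_risk = 1 - (1 - p_per_trip) ** n
    print(f"{lifespan_years:5d}-year life: {lifetime_risk:.4%} lifetime risk")
# A risk that is negligible over 80 years (~0.016%) grows to ~1.6%
# over 8000 years, so the very long-lived will weigh per-trip safety
# far more heavily than we do today.
```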

By Randall Parker 2004 July 22 03:54 PM  Dangers Natural General
Entry Permalink | Comments(8)
2004 July 21 Wednesday
Caffeine Alertness Comes At Cost Of Word Recall

The increased level of alertness from using caffeine from coffee or tea comes at a cost: When asked a question unrelated to your chain of thought you'll be less likely to recall the correct word for the answer if you are on caffeine.

Your first cup of coffee each morning increases your alertness, but a new study suggests caffeine--potentially because of how it interacts with neurons in the brain--might actually hinder your short-term recall of certain words. That is, it may temporarily suppress access to information locked in your memory and unrelated to your current train of thought. For example, you might struggle to remember an acquaintance's name after meeting many new people at a morning meeting.

The findings come from a study in the latest issue of APA's Behavioral Neuroscience (Vol. 118, No. 3) in which researchers Steve Womble, PhD, and Valerie Lesk, both of the International School for Advanced Studies in Trieste, Italy, examined the tip-of-the-tongue (TOT) phenomenon--a form of memory retrieval failure in which someone knows an answer for certain, yet is at a particular moment unable to recall it. They sought to provide a potential neurological explanation of how caffeine affects short-term memory.

Caffeine helps for some types of word recall but hurts for other types.

Miss Lesk said: "In some conditions caffeine helps short-term memory and in others it makes it worse.

"It aids short-term memory when the information to be recalled is related to the current train of thought but hinders short-term memory when it is unrelated.

"If the word is unrelated then caffeine is still strengthening retrieval in the same way, but because it is unrelated to the word you want to find it is actually having a negative effect," she said.

Think of caffeine as a drug that reduces distractibility. Get on one chain of thought. Then encounter a distraction that requires you to shift to another chain of thought. Caffeine will inhibit your brain's ability to make that transition.

Obviously, this is a trade-off. The ability to resist distractions is a plus in some environments. But that is not always the case. Other environments require frequent attention shifting.

Caffeine is a tool. This is another piece of scientific evidence on how to use it. Also see my previous post: Scientists Demonstrate Best Way To Use Caffeine.

So what does this portend for the future? What we need (and I believe we will eventually find) are better pharmaceutical tools for shifting mental states to fit the types of work tasks we are doing. Imagine a safe, non-toxic, and fast-acting drug that reduces distractibility and then imagine another safe, non-toxic, and fast-acting drug that reverses the effect of the first drug. That would be a useful pair of drugs. Shift your mind into a distraction-resistant mode and work at the computer in your office. Then, when called out to an impromptu meeting to debate some issue, flip your mind into a mode that reacts well to handling the input of lots of other people who are all jumping around making competing points. Sound appealing?

Update: Another example of a situation where you wouldn't want to be on coffee is as a TV game show contestant. A player on Jeopardy is going to be hit by a series of questions on unrelated topics. One wouldn't want one's mind to be better at answering follow-up questions on the same topic at the cost of not being as good at answering questions on unrelated topics. But a medical student taking a test on body bones would probably do better on caffeine since the knowledge of all the bones would probably be stored together and memorized fairly recently before taking the test (caffeine works against recalling older memories too).

By Randall Parker 2004 July 21 03:26 PM  Brain Enhancement
Entry Permalink | Comments(0)
2004 July 20 Tuesday
Methamphetamine Addict Brain Scans Show Extensive Losses

Paul Thompson, Ph.D. of the UCLA Lab of Neuro-Imaging and Brain Mapping Division and a number of colleagues have published new research on the extensive brain damage caused by methamphetamine addiction.

A new UCLA Neuroscience imaging study shows for the first time the selective pattern of destruction to the brain's memory, emotion and reward systems experienced by chronic methamphetamine users. Color, three-dimensional visualizations created from magnetic resonance images vividly show the damage. The study reveals the mechanism by which drug abuse damages the brain and suggests potential targets for therapy in recovering drug users. The research appears in the June 30 online edition of the peer-reviewed Journal of Neuroscience. Authors Dr. Paul Thompson, associate professor of neurology, and Dr. Edythe London, professor at the UCLA Neuropsychiatric Institute, are available for interviews.

If you click through on that previous link you'll see a graphic showing the scale of the loss in different parts of the brain. Note that the legend puts the red color level of loss at 5% and most of the brain shows a 5% loss of volume. White matter swelling makes the brain larger overall due to inflammation. But obviously there is extensive cell death.

Brain regions involved in drug craving, emotion and reward, and hippocampal brain regions involved in learning and memory, lose up to 10% of their tissue. Red colors denote brain regions with greatest tissue loss, blue colors regions that remain relatively intact. Hippocampal volume reductions are linked with poorer memory performance in the methamphetamine users. At the same time, a 7% volume increase occurs in the brain's white matter, suggesting an inflammatory response to chronic drug use.

Dr. Paul Thompson tells the New York Times that the effect of methamphetamine abuse is akin to that of a forest fire.

The first high-resolution M.R.I. study of methamphetamine addicts shows "a forest fire of brain damage," said Dr. Paul Thompson, an expert on brain mapping at the University of California, Los Angeles. "We expected some brain changes but didn't expect so much tissue to be destroyed."

The actual research paper is available on the web and reports that the average length of time using methamphetamine (MA) was 10.5 years and brain volume losses ran as high as 10%.

The MA abusers had used the drug (primarily by smoking) for 10.5 years on average, beginning in their mid-twenties. They consumed ~3 gm MA per week, having used MA on most of the 30 d before entering the study. The groups reported similar alcohol use (Table 2). Most of the MA abusers but only two of the controls, however, smoked tobacco cigarettes. All analyses were run with and without six MA subjects who reported remarkable levels of marijuana use (more than one joint per week or a history of marijuana dependence, as determined by the SCID-I interview). Although p values changed slightly, this did not affect whether each result was statistically significant, so we present results for the full sample.

As South Park character Mr. Mackey says "Drugs are just bad, mmm'kay?"

You can even go hear Mr. Mackey tell you:

"Now as I was saying, drugs are bad. You shouldn't do drugs. If you do them, you're bad, because drugs are bad, mkay? It's a bad thing to do drugs, so don't be bad by doing drugs, mkay? That'd be bad, because drugs are bad, mkay?"

By Randall Parker 2004 July 20 12:39 PM  Brain Addiction
Entry Permalink | Comments(2)
2004 July 19 Monday
Brain Gray Matter Size Correlated To Intelligence

The sizes of gray matter areas of the brain correlate more strongly with IQ than does the overall size of the brain.

General human intelligence appears to be based on the volume of gray matter tissue in certain regions of the brain, UC Irvine College of Medicine researchers have found in the most comprehensive structural brain-scan study of intelligence to date.

The study also discovered that because these regions related to intelligence are located throughout the brain, a single “intelligence center,” such as the frontal lobe, is unlikely.

Dr. Richard Haier, professor of psychology in the Department of Pediatrics and long-time human intelligence researcher, and colleagues at UCI and the University of New Mexico used MRI to obtain structural images of the brain in 47 normal adults who also took standard intelligence quotient tests. The researchers used a technique called voxel-based morphometry to determine gray matter volume throughout the brain which they correlated to IQ scores. Study results appear on the online version of NeuroImage.

Previous research had shown that larger brains are weakly related to higher IQ, but this study is the first to demonstrate that gray matter in specific regions in the brain is more related to IQ than is overall size. Multiple brain areas are related to IQ, the UCI and UNM researchers have found, and various combinations of these areas can similarly account for IQ scores. Therefore, it is likely that a person’s mental strengths and weaknesses depend in large part on the individual pattern of gray matter across his or her brain.

“This may be why one person is quite good at mathematics and not so good at spelling, and another person, with the same IQ, has the opposite pattern of abilities,” Haier said.

While gray matter amounts are vital to intelligence levels, the researchers were surprised to find that only about 6 percent of all the gray matter in the brain appears related to IQ.

Attempts to deny the significance of IQ tests are being undermined by the results of physical measures of the brain using brain scanning technologies. The fact that IQ scores correlate with the amount of a particular type of brain tissue is a very strong indicator that IQ tests are measuring real physical differences in brain abilities.

The researchers found a curious result with the size of brain gray matter areas, age, and IQ correlation.

The findings also suggest that the brain areas where gray matter is related to IQ show some differences between young-adult and middle-aged subjects. In middle age, more of the frontal and parietal lobes are related to IQ; less frontal and more temporal areas are related to IQ in the younger adults.

The research does not address why some people have more gray matter in some brain areas than other people, although previous research has shown the regional distribution of gray matter in humans is highly heritable. Haier and his colleagues are currently evaluating the MRI data to see if there are gender differences in IQ patterns.

My guess is that the frontal areas are still developing in young adults and so are not capable of fully contributing to measured IQ until later in life. Though this article is too vague to tell what they mean by "young-adult".

Update: A January 2005 update on Haier's work shows that male and female brains differ greatly in their organization.

The study shows women having more white matter and men more gray matter related to intellectual skill, revealing that no single neuroanatomical structure determines general intelligence and that different types of brain designs are capable of producing equivalent intellectual performance.

“These findings suggest that human evolution has created two different types of brains designed for equally intelligent behavior,” said Richard Haier, professor of psychology in the Department of Pediatrics and longtime human intelligence researcher, who led the study with colleagues at UCI and the University of New Mexico. “In addition, by pinpointing these gender-based intelligence areas, the study has the potential to aid research on dementia and other cognitive-impairment diseases in the brain.”

Study results appear on the online version of NeuroImage.

In general, men have approximately 6.5 times the amount of gray matter related to general intelligence than women, and women have nearly 10 times the amount of white matter related to intelligence than men. Gray matter represents information processing centers in the brain, and white matter represents the networking of – or connections between – these processing centers.

This, according to Rex Jung, a UNM neuropsychologist and co-author of the study, may help to explain why men tend to excel in tasks requiring more local processing (like mathematics), while women tend to excel at integrating and assimilating information from distributed gray-matter regions in the brain, such as required for language facility.

The environmentalist Blank Slate view of the mind is becoming ever harder to defend.

By Randall Parker 2004 July 19 12:23 PM  Brain Intelligence
Entry Permalink | Comments(12)
2004 July 18 Sunday
Sun Energy Output At Over 1,000 Year Peak

Sami Solanki, Professor at the Federal Institute of Technology in Zurich, Switzerland, says the Sun has been burning more brightly over the last 60 years than over the previous 1,090 years.

“We have to acknowledge that the Sun is in a changed state. It is brighter than it was a few hundred years ago, and this brightening started relatively recently – in the last 100 to 150 years. We expect it to have an impact on global warming,” he told swissinfo.

The sun's brightness hasn't changed much over the last 20 years. But it has been brighter for the last 60 years than it has been at any time in the last 1,150 years.

According to scientists, the Sun’s radiance has changed little during this period. But looking back over 1,150 years, Solanki found the Sun had never been as bright as in the past 60 years.

The team studied sunspot data going back several hundred years. They found that a dearth of sunspots signalled a cold period - which could last up to 50 years - but that over the past century their numbers had increased as the Earth's climate grew steadily warmer. The scientists also compared data from ice samples collected during an expedition to Greenland in 1991. The most recent samples contained the lowest recorded levels of beryllium 10 for more than 1,000 years. Beryllium 10 is a particle created by cosmic rays that decreases in the Earth's atmosphere as the magnetic energy from the Sun increases. Scientists can currently trace beryllium 10 levels back 1,150 years.

Sunspots have been increasing in number as the Earth has been getting warmer.

Over the past few hundred years, there has been a steady increase in the numbers of sunspots, a trend that has accelerated in the past century, just at the time when the Earth has been getting warmer.

Variations in sunspot activity are probably behind the increases and decreases in solar radiation and consequent changes in Earth's climate.

During the Medieval maximum of 1000-1300 there was an extremely large Sunspot maximum which is believed to have warmed the Earth more than normal. There were no accurate measurements of the weather to call upon during this time but the discovery and colonization of Greenland by Eric the Red supports this hypothesis. Eric was exiled from Iceland for manslaughter and sailed west, discovering Greenland. He then led many ships, filled with people who wanted to make a fresh start, to this new land. For 300 years Greenland flourished: new communities settled, trade with other countries grew, and the population increased. Around 1325 the climate cooled down considerably and people started to abandon the northern settlements. By 1350 glaciers covered the northern settlements, and the southernmost settlements were dying out as well.

The Sporer minimum of 1400-1510 and the Maunder minimum of 1645-1715 were each known as a "little ice age." They were both droughts in Sunspot activity, linked to times of abnormally cold weather on Earth. In addition to finishing off the Greenland colonies, the Sporer minimum saw increased rates of famine in the world, and the Baltic Sea froze solid in the winter of 1422-23. Some of the more notable effects of the Maunder minimum included glaciers in the Alps advancing farther southward, the freezing of the North Sea, and in London the famous year without a summer, when it remained cold for 21 consecutive months.

The evidence supports the effect of Sunspot activity on the Earth's climate, but that is only one of many ways it affects us on Earth. On March 13, 1989 a large Sunspot ignited powerful flares that tripped the circuit breakers at a generator station. This started the collapse of the Quebec power system and left people without power for hours to days. These same flares damaged several man-made satellites and caused smaller outages all over the U.S. and Canada. There are countless other times when large Sunspots have inflicted similar damage on various electrical systems on Earth.

The Sun could start going through a down trend in sunspot activity at any time. We could find ourselves back in a state similar to the Maunder Minimum with decades of much colder weather. Or sunspot activity could increase to an even higher level and temperatures could rise more than the amount some models project as a consequence of higher atmospheric carbon dioxide.

My guess is that the chances are greater for a reduction in sunspot activity than for an increase. Why? Most of the time the planet Earth is in an ice age. This is suggestive of the possibility that the Sun just doesn't put out enough heat to keep the Earth out of ice ages most of the time. Also, the higher sunspot activity reported above is at the high end of an over 1,000 year period. Therefore the odds seem greater that we will have more future years with lower sunspot activity than with higher sunspot activity.

My further guess is that a reduction in sunspot activity would cause more harm to humans than a further increase in sunspot activity. A decrease could put large amounts of farm fields out of production and would reduce the useful length of the growing seasons for other fields. The freezing over of rivers and seas along with snows and ice would interfere with transportation more than higher temperatures would.

Also, my guess is that it would be easier to reflect away excessive sunlight than to try to replace the heat lost in another cold period like the Maunder Minimum. For example, to reduce the sunlight hitting the Earth during high sunspot periods we could genetically engineer plankton to produce more of the chemicals they generate to make clouds. We could also try to engineer more snowfall around glaciers to increase the areas covered by reflective white snow. We could also paint more human structures white to reflect back sunlight.

But imagine trying to generate enough energy to make up for a reduction in solar radiation during a period of low sunspot activity. We could take some steps to compensate for reduced solar radiation. For instance, we could paint all human structures black to make them absorb more light to raise ground temperatures. Also, we could try to develop some really large scale methods for coating ice sheets with dark coverings. It may also be possible to reduce cloud cover by seeding clouds to cause rains to fall in areas where the water is needed.

One option for a period of reduced sunspot activity would be to increase the release of greenhouse gases. But it is not clear that the planet contains enough fossil fuels to make that possible. We'd probably have to shift heavily toward the use of coal. But even that might not generate enough greenhouse gases to compensate for a period of no sunspots.

By Randall Parker 2004 July 18 01:18 PM  Climate Trends
Entry Permalink | Comments(30)
2004 July 16 Friday
Asteroid Collision Mission To Study Defense Against Asteroids

The European Space Agency has approved a mission proposal to collide a space probe with an asteroid in order to study techniques to deflect any large asteroid found to be on a collision course with Earth.

On 9 July 2004, the Near-Earth Object Mission Advisory Panel recommended that ESA place a high priority on developing a mission to actually move an asteroid. The conclusion was based on the panel’s consideration of six near-Earth object mission studies submitted to the Agency in February 2003.

Of the six studies, three were space-based observatories for detecting NEOs and three were rendezvous missions. All addressed the growing realisation of the threat posed by Near-Earth Objects (NEOs) and proposed ways of detecting NEOs or discovering more about them from a close distance.

A panel of six experts, known as the Near-Earth Object Mission Advisory Panel (NEOMAP) assessed the proposals. Alan Harris, German Aerospace Centre (DLR), Berlin, and Chairman of NEOMAP, says, “The task has been very difficult because the goalposts have changed. When the studies were commissioned, the discovery business was in no way as advanced as it is now. Today, a number of organisations are building large telescopes on Earth that promise to find a very large percentage of the NEO population at even smaller sizes than visible today.”

As a result, the panel decided that ESA should leave detection to ground-based telescopes for the time being, until the share of the remaining population not visible from the ground becomes better known. The need for a space-based observatory will then be re-assessed. The panel placed its highest priority on rendezvous missions, and in particular, the Don Quijote mission concept. “If you think about the chain of events between detecting a hazardous object and doing something about it, there is one area in which we have no experience at all and that is in directly interacting with an asteroid, trying to alter its orbit,” explains Harris.

The Don Quijote mission concept will do this by using two spacecraft, Sancho and Hidalgo. Both are launched at the same time but Sancho takes a faster route. When it arrives at the target asteroid it will begin a seven-month campaign of observation and physical characterisation during which it will land penetrators and seismometers on the asteroid’s surface to understand its internal structure.

Sancho will then watch as Hidalgo arrives and smashes into the asteroid at very high speed. This will provide information about the behaviour of the internal structure of the asteroid during an impact event as well as excavating some of the interior for Sancho to observe. After the impact, Sancho and telescopes from Earth will monitor the asteroid to see how its orbit and rotation have been affected.

The FuturePundit reaction? Finally a space agency is trying to do something in space that may yield a huge benefit to the human race. We could all die from an asteroid impact and yet little is done to develop defenses against this potential threat. Meanwhile billions are spent every year on the Space Shuttle and International Space Station with little return in scientific knowledge, technological advance, or improved safety for humans down here on Earth. An asteroid detection and deflection system capable of preventing all major asteroid threats to human life offers a far greater potential benefit for humanity than the vast bulk of the programs funded by government space agencies.

A small change in an asteroid's path could prevent a collision with Earth.

"It is just to test a technique: can we change their orbits by running a kinetic energy impactor?" said Matt Genge, an asteroid expert at Imperial College, London.

"Can we change its orbit by less than a centimetre per second? If we ever find an asteroid that is on collision course with Earth, at some point in the future, whether it is 10 orbits away, or 20 orbits away, just giving it a small nudge will make it miss the Earth."
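Genge's centimetre-per-second figure is easy to sanity-check. To first order, a velocity change applied years in advance accumulates into a large along-track displacement. The sketch below assumes a 1 cm/s nudge and a 10-year warning time:

```python
# Back-of-envelope miss distance from a small velocity change:
# displacement ~= delta_v * time, to first order. Real orbital
# mechanics tends to amplify the miss distance further, so this
# is a conservative estimate.
delta_v = 0.01                       # m/s (a 1 cm/s nudge)
warning_time = 10 * 365.25 * 86400   # 10 years, in seconds

displacement_km = delta_v * warning_time / 1000
print(f"Displacement after 10 years: {displacement_km:,.0f} km")
```

That works out to roughly 3,000 km, about half an Earth radius from drift alone, before counting the orbital phasing effects that make early, tiny nudges so effective.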

In the proposed mission one space probe would watch while another probe slammed into an asteroid.

Sancho would arrive first and orbit the asteroid for several months. It would deploy some penetrating probes to form a seismic network on the asteroid to examine its structure before and after its sister craft's smashing arrival.

Hidalgo would crash into the asteroid at about 22,370 mph (10 kilometers per second).

NASA's Deep Impact mission, which will slam into Comet Tempel 1 on July 4, 2005 (creating fireworks that will be visible from Earth btw), bears some similarity to the ESA mission. But while the Deep Impact probe will slam into Tempel 1 at a very similar speed, the mission does not appear aimed at gathering information about how to do asteroid deflection.

And how. The 770-pound (350-kilogram) probe will hit the comet at 22,300 miles (35,885 kilometers) per hour and penetrate 16 to 32 feet (5 to 10 meters). Much, but not all, of the probe will be vaporized.
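For scale, the impact energy can be estimated straight from the figures quoted above:

```python
# Kinetic energy of the Deep Impact probe, using the quoted figures:
# 350 kg striking at 22,300 mph (~10 km/s).
mass = 350.0                  # kg
speed = 22_300 * 0.44704      # mph -> m/s

energy_joules = 0.5 * mass * speed ** 2
tons_tnt = energy_joules / 4.184e9    # 1 ton of TNT = 4.184e9 J
print(f"{energy_joules:.2e} J, about {tons_tnt:.1f} tons of TNT")
```

That is on the order of 4 tons of TNT from a probe you could fit in a closet, which is why even a modest impactor can excavate a sizable crater and, on an asteroid, measurably shift its orbit.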

Still, it seems likely that the Deep Impact mission will yield information useful for doing asteroid deflection.

For more on the subject of asteroid defenses see my previous post We Should Develop Defenses Against Large Asteroids.

By Randall Parker 2004 July 16 12:06 PM  Dangers Natural General
Entry Permalink | Comments(5)
2004 July 15 Thursday
Stanford Team Develops New Way To Generate Potential Drug Compounds

A team in the Pehr Harbury lab at Stanford has developed a method that may allow the automated generation of a large number of organic compounds as drug candidates through molecular breeding.

Traditionally, developing small molecules for research or drug treatments has been a painstaking enterprise. Drugs work largely by binding to a target protein and modifying or inhibiting its activity, but discovering the rare compound that hits a particular protein is like, well, finding a needle in a haystack. With a specific protein target identified, scientists typically either gather compounds from nature or synthesize artificial compounds, then test them to see whether they act on the target.

The birth of combinatorial chemistry in the early nineties promised to revolutionize this laborious process by offering a way to synthesize trillions of compounds at a time. These test tube techniques have been refined to "evolve" collections of as many as a quadrillion different proteins or nucleic acids to bind a molecular target. These techniques are called molecular breeding, because like traditional livestock and crop breeding techniques, they combine sets of genotypes over generations to produce a desired phenotype. Molecular breeding has been restricted to selecting protein or nucleic acid molecules, which have not always been the best lead compounds for drugs. Conventional synthetic organic chemistry, which has traditionally been a better source of candidate drugs, has not been amenable to this type of high throughput molecular breeding.

But this bottleneck has potentially been overcome and is described in a series of three articles by David Halpin et al. in this issue of PLoS Biology. By inventing a genetic code that acts as a blueprint for synthetic molecules, the authors show how chemical collections of nonbiological origin can be evolved. In the first article, Halpin et al. present a method for overcoming the technical challenge of using DNA to direct the chemical assembly of molecules. In the second, they demonstrate how the method works and test its efficacy by creating a synthetic library of peptides (protein fragments) and then showing that they can find the "peptide in a haystack" by identifying a molecule known to bind a particular antibody. The third paper shows how the method can support a variety of chemistry applications that could potentially synthesize all sorts of nonbiological "species." Such compounds, the authors point out, can be used for drug discovery or as molecular tools that offer researchers novel ways to disrupt cellular processes and open new windows into cell biology. While medicine has long had to cope with the evolution of drug-resistant pathogens, it may now be possible to fight fire with fire.

The first, second, and third articles are available on-line for reading. All PLoS Biology articles are available to be read without any cost to the reader.

Peptides (which are just a sequence of amino acids and serve as components of larger protein molecules) and DNA are hard to get into the body because they tend to get broken down before absorption. Even if they are injected into the bloodstream they stand a pretty good chance of being broken down before they reach a desired target. Whereas many synthetic compounds can be absorbed and reach their targets more easily without getting broken down by enzymes. So the most interesting aspect of these papers is the claim (at least as far as I understand it) that this technique can be used to generate chemical compounds that are not DNA or peptides.

The punch line is in the third article.

Beyond the direct implications for synthesis of peptide–DNA conjugates, the methods described offer a general strategy for organic synthesis on unprotected DNA. Their employment can facilitate the generation of chemically diverse DNA-encoded molecular populations amenable to in vitro evolution and genetic manipulation.

The need they are trying to meet is the rapid generation of a large number of different compounds to test as potential antibiotics against bacteria. The Holy Grail would be a high-volume automated method of generating compounds, testing them against pathogens, and then feeding the results back into the generator mechanism to make more variations of the compounds that had the strongest effects against the pathogens. The hope is that when a new drug-resistant pathogen pops up, sheer brute force could try so many compounds against it so rapidly that in a relatively short period of time antibiotics effective against the new pathogen strain would be identified.
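That select-amplify-diversify loop can be caricatured in a few lines of code. Everything below is a toy stand-in for the real chemistry: the "compounds" are bit strings, and the "binding score" is a made-up fitness function standing in for an assay against a pathogen.

```python
import random

random.seed(1)
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]  # stand-in for the ideal binder

def binding_score(compound):
    # Toy assay: count positions matching the (in reality unknown) target.
    return sum(a == b for a, b in zip(compound, TARGET))

def mutate(compound, rate=0.1):
    # Flip each bit with small probability -- the "diversify" step.
    return [bit ^ (random.random() < rate) for bit in compound]

# Start from a random library, then repeat: screen, keep the best
# binders, breed mutated variants of them -- the molecular-breeding loop.
library = [[random.randint(0, 1) for _ in TARGET] for _ in range(50)]
for generation in range(20):
    library.sort(key=binding_score, reverse=True)
    survivors = library[:10]
    library = [mutate(random.choice(survivors)) for _ in range(50)]

best = max(library, key=binding_score)
print("best binding score:", binding_score(best))
```

The hard part the Stanford papers solve is not this loop but making it work on nonbiological molecules, by giving each compound a DNA "genotype" that directs its synthesis and can be amplified and mutated between rounds.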

By Randall Parker 2004 July 15 04:46 PM  Biotech Advance Rates
Entry Permalink | Comments(2)
2004 July 14 Wednesday
Will We Live To See Next Collapse Of Earth's Magnetic Field?

Civilization could be disrupted by a collapse and reversal of Earth's magnetic field.

The collapse of the Earth's magnetic field, which both guards the planet and guides many of its creatures, appears to have started in earnest about 150 years ago. The field's strength has waned 10 to 15 percent, and the deterioration has accelerated of late, increasing debate over whether it portends a reversal of the lines of magnetic force that normally envelop the Earth.

During a reversal, the main field weakens, almost vanishes, then reappears with opposite polarity. Afterward, compass needles that normally point north would point south, and during the thousands of years of transition, much in the heavens and Earth would go askew.

A reversal could knock out power grids, hurt astronauts and satellites, widen atmospheric ozone holes, send polar auroras flashing to the equator and confuse birds, fish and migratory animals that rely on the steadiness of the magnetic field as a navigation aid. But experts said the repercussions would fall short of catastrophic, despite a few proclamations of doom and sketchy evidence of past links between field reversals and species extinctions.

Note that with sufficient planning a lot of the electrical effects could be ameliorated by better shielding and back-ups. Even satellites could be built to be better shielded. But maintaining a human presence in low Earth orbit would become a much riskier proposition.

Suppose the flip comes to pass. Should we respond by creating new maps that show the Southern Hemisphere on top?

Consider just one practical problem: If we continue to call the "North" the "North" then all compasses will be wrong. But if we make new compasses then they will have to be labelled as "Post-Collapse" compasses or else someone could use a compass, not know whether it was built before or after the collapse, and go off in the wrong direction. More likely, the period of collapsed magnetic field would last so long before the field popped up firmly again that the use of compasses would have long been abandoned before their labelling became an issue.

Animals that have evolved to navigate by the magnetic field might be driven extinct.

When baby loggerhead turtles embark on an 8,000-mile trek around the Atlantic, they use invisible magnetic clues to check their bearings. So do salmon and whales, honeybees and homing pigeons, frogs and Zambian mole rats, scientists have found.

But within a hundred years we may well know all the species that have genetic adaptations to magnetic fields. We could use future advances in biotechnology to easily do genetic engineering to the most threatened of these species to adapt them to the change in magnetic fields. Therefore mass extinctions may be avoidable unless the collapse of the magnetic field causes holes in the ozone layer that cause extinctions via increases in UV radiation. Though even in that scenario we could save some of the species either by genetically engineering them to be more resistant to high UV or by doing climate engineering to create UV shields.

The flip of the magnetic poles may not happen for hundreds of thousands of years. But there may already be people alive today who will live to see the magnetic field collapse. Anyone who is still alive when Engineered Negligible Senescence (rejuvenation therapies that will make us young again) is achieved could conceivably live long enough to witness the magnetic field collapse. There may well already be people alive today who will live long enough to be around when rejuvenation becomes commonplace. Therefore some readers of this post may live through the future collapse of Earth's magnetic field.

The European Space Agency is going to launch the "Swarm" cluster of 3 satellites in 2009 to collect enough data to perhaps allow magnetic field forecasting in a fashion analogous to weather and climate forecasting.

The objective of the Swarm mission is to provide the best ever survey of the geomagnetic field and its temporal evolution, in order to gain new insights into the Earth system by improving our understanding of the Earth’s interior and climate. The mission is scheduled for launch in 2009. After release from a single launcher, a side-by-side flying lower pair of satellites at an initial altitude of 450 km and a single higher satellite at 530 km will form the Swarm constellation.

High-precision and high-resolution measurements of the strength, direction and variation of the magnetic field, complemented by precise navigation, accelerometer and electric field measurements, will provide the necessary observations that are required to separate and model various sources of the geomagnetic field. This results in a unique “view” inside the Earth from space to study the composition and processes in the interior.

My guess is that we will be so technologically advanced by the time a magnetic field collapse becomes severe that we will be able to easily compensate for its effects. Climate engineering, UV shields over human habitats, genetic engineering of other species, and heavy shielding of electronics will be among the methods we use to protect human civilization and other species.

By Randall Parker 2004 July 14 03:44 PM  Dangers Natural General
Entry Permalink | Comments(10)
2004 July 13 Tuesday
New MRI Technique Shows How Brain Connections Age

The last mental abilities we gain are the first to go.

UCLA neuroscientists using a new MRI analysis technique to examine myelin sheaths that insulate the brain's wiring report that as people age, neural connections that develop last degenerate first. The computer-based analysis method is unique in its ability to examine specific brain structures in living people at millimeter resolution.

Published online by the Neurobiology of Aging earlier this year and scheduled to appear in the August 2004 print edition of the peer-reviewed journal, the study offers new insights into the role of myelin in brain aging and its contribution to the onset of Alzheimer's disease. In addition, the success of the MRI analysis technique opens new opportunities for studying the impact of lifestyle on brain aging and for developing medications that could slow aging or prevent Alzheimer's disease.

"The study increases our understanding of the role of myelin in brain development and degeneration, and demonstrates the usefulness of this MRI method for examining the single most powerful risk for Alzheimer's disease by far — age," said Dr. George Bartzokis, the study's lead investigator and visiting professor of neurology at the David Geffen School of Medicine at UCLA. He also is director of the UCLA Memory Disorders and Alzheimer's Disease Clinic and clinical core director of the UCLA Alzheimer's Disease Research Center.

Myelin is a sheet of lipid, or fat, with very high cholesterol content — the highest of any brain tissue. The high cholesterol content allows myelin to wrap tightly around axons, speeding messages through the brain by insulating these neural "wire" connections.

As the brain continues to develop in adulthood and as myelin is produced in greater and greater quantities, cholesterol levels in the brain grow and eventually promote the production of a toxic protein that together with other toxins attacks the brain. This toxic environment disrupts brain connections and eventually also leads to the brain/mind-destroying plaques and tangles visible years later in the cortex of Alzheimer's patients.

"The brain is not a computer, it is much more like the Internet," Bartzokis said. "The speed, quality and bandwidth of the connections determine its ability to process information, and all these depend in large part on the insulation that coats the brain's connecting wires.

"The results of our study show that in older age, the myelin insulation breaks down, resulting in a decline in the speed and efficiency of our Internet. Myelin and the cells that produce it are the most vulnerable component of our brain — the human brain's Achilles' heel," he said. "This safe, non-invasive technology can assess the development and degeneration of the brain's insulation in specific regions. Now that we can measure how brain aging proceeds in vulnerable regions, we can measure what treatments will slow aging down and thus begin in earnest to look at preventing Alzheimer's disease."

The UCLA research team examined the deterioration of myelin in the brain's splenium and genu regions of the corpus callosum, which connects the two sides of the brain. Neural connections important to vision develop early in life in the splenium, while connections important to decision‑making, memory, impulse control and other higher functions develop later in the genu.

The team found that the brain connections deteriorated three times as fast in the genu compared to the splenium. The study also notes that myelin deterioration is far greater throughout the brain of patients with Alzheimer's disease than in healthy older adults. The late myelinating regions are much more vulnerable and may be why the highest levels of reasoning and new memories are the first to go when one develops Alzheimer's disease, while movement and vision are unaffected until very late in the disease process.

We need a way to reseed the brain with the cells that make myelin. We also need ways to remove the toxic compounds that accumulate in the intercellular spaces, as well as the toxic compounds (and junk that simply takes up increasing space) that accumulate inside cells. These are 3 of the 7 basic Strategies for Engineered Negligible Senescence (SENS), which you ought to read all about if you haven't already.

This result comes on the heels of another recent study that found that brains older than 40 show many signs of gene damage and increased expression of repair and inflammation enzymes.

Update: Note how once again a new technique that enables the measurement of phenomena which were not previously measurable has enabled new discoveries to be made. While this initial discovery is interesting the technique itself will be more important in the long run because it will enable many more future discoveries. Better scientific tools are more important than any particular discoveries made with the tools.

By Randall Parker 2004 July 13 03:25 PM  Brain Aging
Entry Permalink | Comments(0)
CD ELISA Medical Test Achieves First Success

A test commonly used in medicine and research may some day be done more quickly and cheaply on the surface of compact discs.

COLUMBUS, Ohio – Ohio State University engineers and their colleagues have successfully automated a particular medical test on a compact disc (CD) for the first time -- and in a fraction of the normal time required using conventional equipment.

The ELISA biochemical test -- one of the most widely used clinical, food safety, and environmental tests -- normally takes hours or even days to perform manually. Using a specially designed CD, engineers performed the test automatically, and in only one hour.

The patent-pending technology involves mixing chemicals inside tiny wells carved into the CD surface. The spinning of the CD activates the tests.

In a recent issue of the journal Analytical Chemistry, the engineers report that the CD successfully detected a sample of rat antibody -- a standard laboratory test -- using only one-tenth the usual amount of chemicals.

This first demonstration paves the way for CDs to be used to quickly detect food-borne pathogens and toxins, said L. James Lee, professor of chemical and biomolecular engineering at Ohio State. The same technology could one day test for human maladies such as cancer and HIV, using a very small cell sample or a single drop of blood.

Lee estimated that the first commercial application of the concept is at least two years away.

“This study shows that the technology is very promising, but there are challenges to overcome,” he said. “We have been working on designing special valves and other features inside the CD, and better techniques for controlling the chemical reactions.”

“When we work on the micro-scale, we can perform tests faster and using less material, but the test also becomes very sensitive,” he explained. As chemicals flow through the narrow channels and reservoirs carved in the CD, interactions between individual molecules become very important, and these can affect the test results.

These scientists are working on automating the ELISA test, a very widely used type of biological test.

ELISA, short for enzyme linked immunosorbent assay, is normally conducted in much larger reservoirs inside a microtiter plate -- a palm-sized plastic grid that resembles an ice cube tray.

Microtiter plates are standard equipment in chemical laboratories, and ELISA testing is a $10-billion-per-year industry. It is the most common test for HIV. Still, the test is tedious and labor-intensive, in part because of the difficulty in mixing chemicals thoroughly enough to get consistent results.

“Everyone working in the life sciences labs would fall in love with this revolutionary CD system for ELISA because it's easier, faster and cheaper to use,” said Shang-Tian Yang, professor of chemical and biomolecular engineering at Ohio State and collaborator on the project. Yang and Lee are founding a company to commercialize the CD technology. Until then, product development is being handled by Bioprocessing Innovative Company, Inc., a company in which Yang is a part owner.

Automated techniques that scale down the size of the devices that do the testing make tests easier, faster, and cheaper to run while at the same time making them more sensitive. This will speed up the rate of scientific progress while lowering the costs of doing science. It will also change the way medicine is practiced. Rather than one trip to the doctor to give blood and other samples followed by a second trip to get the results, the trend will be toward in-office testing: you will walk into a doctor's office, the doctor will decide which tests to run, and you will get the tests done and review the results with the doctor during the same visit. This will save both time and money.

For more on the use of CDs to do biological and medical tests see my previous posts CD Will Simultaneously Test Concentrations Of Thousands Of Proteins and CD Player Turned Into Bioassay Molecule Detection Instrument.

By Randall Parker 2004 July 13 02:35 PM  Biotech Advance Rates
Entry Permalink | Comments(0)
2004 July 11 Sunday
Nanotech Start-Ups Pursuing Cheaper Photovoltaic Solar Power

MIT's Technology Review has a good survey of some of the venture capital start-ups pursuing development of cheaper methods for producing photovoltaic solar cells.

At least one startup may beat Siemens to that goal. Konarka is now gearing up to manufacture its novel photovoltaic film, which it expects to start selling next year. Unlike Siemens’s, Konarka’s films don’t use buckyballs, instead relying on tiny semiconducting particles of titanium dioxide coated with light-absorbing dyes, bathed in an electrolyte, and embedded in plastic film. But like Siemens’s solar cells, Konarka’s can be easily and cheaply made.

The article also covers an interesting approach by a company called Nanosolar.

Down the road, researchers hope to boost nano solar cells’ power output and make them even easier to deploy, eventually spraying them directly onto almost any surface. Palo Alto, CA-based startup Nanosolar, which has raised $5 million in venture capital, is working on making this idea practical. The company is exploiting the latest techniques for automatically assembling nanomaterials into precisely ordered architectures—all with a higher degree of control than ever before possible.

Nanosolar’s approach is disarmingly simple. Researchers spray a cocktail of alcohol, surfactants (substances like those used in detergents), and titanium compounds on a metal foil. As the alcohol evaporates, the surfactant molecules bunch together into elongated tubes, erecting a molecular scaffold around which the titanium compounds gather and fuse. In just 30 seconds a block of titanium oxide bored through with holes just a few nanometers wide rises from the foil. Fill the holes with a conductive polymer, add electrodes, cover the whole block with a transparent plastic, and you have a highly efficient solar cell.

The ability to spray paint a surface with photovoltaics would allow the sides and roofs of buildings, signs, billboards, water towers, bridges, and numerous other structures to be turned into solar collectors. In the United States, human structures already cover an area equal to the size of Ohio, and that is more than enough area to meet current levels of electricity usage if photovoltaics could be made to cover all of the human-built structures.
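The Ohio-sized-area claim can be sanity-checked with a back-of-envelope calculation. Here is a quick sketch in Python; the area, insolation, efficiency, and demand figures are my own rough assumptions, not numbers from any of the quoted articles:

```python
# Back-of-envelope check of the "area the size of Ohio" claim.
# All inputs below are rough assumptions, not figures from the article.

ohio_area_m2 = 1.1e11          # Ohio is roughly 110,000 km^2
avg_insolation_w_m2 = 200      # rough 24-hour average for the continental US
efficiency = 0.10              # assumed cheap-cell conversion efficiency

avg_power_w = ohio_area_m2 * avg_insolation_w_m2 * efficiency
annual_twh = avg_power_w * 8760 / 1e12   # watt-hours per year -> TWh

us_electricity_twh = 4000      # approximate annual US consumption, early 2000s

print(f"Average output: {avg_power_w / 1e12:.1f} TW")
print(f"Annual energy:  {annual_twh:,.0f} TWh vs ~{us_electricity_twh:,} TWh US demand")
```

Even with these deliberately modest assumptions the output comes to several times annual US electricity consumption, which is consistent with the claim.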

An article on Konarka Technologies explains how Konarka's approach allows photovoltaics to be made at lower temperatures than current processes require.

The problem? Until now, PVCs have been made by heating the titanium crystals to 450 degrees Celsius and then coating them with a light-sensitive dye – a process known as “sintering.” That process was too expensive to make them a practical source of power. Tripathy and his researchers perfected a “cold-sintering” method that achieves the same result at temperatures of 150 degrees or lower.

Those cooler temperatures are critical to new uses for PVCs. When forged at higher temperatures, PVC material can only be coated onto glass, which makes for expensive, delicate product applications. Cold-sintering allows the PVC material to be coated onto plastics; in essence, a product’s outer shell becomes its power source.

And at those cooler temperatures, they can churn out large numbers of photovoltaic cells quickly and cheaply. The Konarka cell does not generate any more electricity than other power cells, nor does it do so more efficiently. Its appeal is that the cell can be manufactured far more cheaply, so Konarka can churn out a large supply and, the company hopes, put them into all sorts of devices.

The ideal process would require no elevated temperatures at all when the photovoltaics are applied. So if Nanosolar's process can be perfected it would open up greater potential by allowing easier conversion of existing surfaces into photovoltaic collectors. Still, the approach being pursued by Nanosys, incorporating photovoltaics into plastics to make roofing tiles, would certainly work for new structures and for the inevitable replacement roofs when old roofs wear out.

Update: Nobel Prize winner Richard Smalley has an opinion piece on Small Times arguing for a big research effort to develop new cleaner and cheaper energy technologies to end our reliance on oil.

Imagine by 2050 that every house, business and building has its own local electrical energy storage device, an uninterruptible power supply capable of handling the needs of the owner for 24 hours.

Today using lead-acid storage batteries, such a unit for a house to store 100-kilowatt hours of electrical energy would take up a small room and cost more than $10,000.

Through advances in nanotechnology, it may be possible to shrink an equivalent unit to the size of a washer and drop the cost to $1,000. Among the approaches being developed today are nanotubes, nanowires and nanocomposites for batteries.


America should take the lead. We should launch a bold New Energy Research Program. Just a nickel from every gallon of gasoline, diesel, fuel oil, and jet fuel would generate $10 billion a year. That would be enough to transform the physical sciences and engineering in this country.
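Smalley's home-storage comparison reduces to a simple cost-per-kilowatt-hour calculation. Here is that arithmetic sketched in Python, using only the dollar and capacity figures quoted above:

```python
# Cost-per-kilowatt-hour comparison from Smalley's home-storage example.
# The dollar and capacity figures are the estimates quoted above;
# everything else is arithmetic.

storage_kwh = 100           # 24-hour household supply in the example

lead_acid_cost = 10_000     # today's room-sized lead-acid unit
nano_cost = 1_000           # hoped-for washer-sized nanotech unit

lead_acid_per_kwh = lead_acid_cost / storage_kwh   # dollars per kWh stored
nano_per_kwh = nano_cost / storage_kwh

print(f"Lead-acid: ${lead_acid_per_kwh:.0f}/kWh, "
      f"nanotech target: ${nano_per_kwh:.0f}/kWh "
      f"({lead_acid_cost / nano_cost:.0f}x cheaper)")
```

So the envisioned nanotech unit amounts to a tenfold drop in storage cost, from $100/kWh to $10/kWh.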

You can read some Congressional testimony by Smalley advocating a big energy research effort at the bottom of this previous post on energy policy. I think Smalley is right that solar photovoltaics, batteries, fuel cells, and all sorts of other energy technologies are solvable problems. With enough research and development the obstacles holding back these approaches can all be overcome. This is not a question of if but rather one of when. The technologies can be made to work and made much cheaper than oil, natural gas, and coal. If we tried harder we could make them cost-effective much sooner.

Update II: A Stanford prof thinks organic photovoltaic nanoparticles can eventually be made an order of magnitude cheaper than current solar cells.

Right now, the efficiency rate--the amount of sunlight that gets turned into electricity--ranges from 3 percent to nearly 12 percent for various nanoparticles in different lab experiments. That could grow to 20 percent, said Michael McGehee, an assistant professor at Stanford in materials science and engineering. McGehee currently is conducting research on organic photovoltaic nanoparticles.


"It costs $300 per square meter now for crystalline solar cells. We think we can get this down to $30 a square meter," he said.
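McGehee's areal costs can be translated into the more familiar dollars-per-peak-watt metric. The sketch below assumes the standard 1000 W/m² test-condition insolation and illustrative efficiencies (15% for crystalline silicon, and 10% as a midpoint of the quoted 3-to-12-percent lab range); none of these conversions come from the article itself:

```python
# Rough $/peak-watt comparison implied by McGehee's areal-cost numbers.
# Peak insolation of 1000 W/m^2 is the standard test-condition assumption;
# the efficiencies are illustrative, not from the article.

PEAK_INSOLATION = 1000.0   # W/m^2 at standard test conditions

def cost_per_peak_watt(cost_per_m2, efficiency):
    """Dollars per peak watt for a cell of given areal cost and efficiency."""
    return cost_per_m2 / (PEAK_INSOLATION * efficiency)

crystalline = cost_per_peak_watt(300, 0.15)  # assumed ~15% crystalline silicon
organic = cost_per_peak_watt(30, 0.10)       # 10%: midpoint of the 3-12% lab range

print(f"Crystalline: ~${crystalline:.2f}/Wp, organic target: ~${organic:.2f}/Wp")
```

Under these assumptions the tenfold drop in areal cost, even at lower efficiency, still works out to a severalfold drop in cost per watt.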

That article has some good quotes by venture capitalists who see energy tech as a generally hot area for investment. The growth of venture capital involvement is reason for much more optimism about photovoltaics and other promising energy technologies. If various claims that the Saudis are exaggerating the size of their oil reserves turn out to be correct then we are going to need these new energy technologies sooner than most think.

By Randall Parker 2004 July 11 07:21 PM  Energy Solar
Entry Permalink | Comments(30)
Periodontal Ligament Stem Cell Type Found In Wisdom Teeth

Working with freshly extracted human third molars (wisdom teeth), scientists have been able to isolate stem cells that can turn into the ligament that holds teeth in place.

Scientists at the National Institute of Dental and Craniofacial Research (NIDCR), one of the National Institutes of Health, and their colleagues have isolated human postnatal stem cells for the first time directly from the periodontal ligament, the fibrous, net-like tendon that holds our teeth in their sockets.

The scientists also say these cells have "tremendous potential" to regenerate the periodontal ligament, a common target of advanced gum (periodontal) disease. This enthusiasm is based on follow up experiments, in which the researchers implanted the human adult stem cells into rodents, and most of the cells differentiated into a mixture of periodontal ligament — including the specific fiber bundles that attach tooth to bone — and the mineralized tissue called cementum that covers the roots of our teeth.

"The stem cells produced beautifully dense, regenerated tissue in the animals," said Dr. Songtao Shi, a senior author on the paper and an NIDCR scientist. "That was when we knew they had great potential one day as a treatment for periodontal disease, and we're continuing to follow up on this promise with additional animal work." The results are published in the current issue of The Lancet.

The isolated cells were able to form periodontal ligament.

After further validation of their findings, Shi said he and his colleagues decided to pursue the next big question: Could these stem cells actually form periodontal ligament and cementum when transplanted into mice?

Of the 13 transplants — each of which was derived from a distinct colony of stem cells cultured in the laboratory and loaded into a hydroxyapatite carrier — eight produced a dense mixture of cementum and periodontal ligament. Interestingly, the cells even produced fibrous structures similar to the so-called Sharpey's fibers, which insert into both cementum and bone to hold teeth in place. The other five transplants showed no signs of differentiation.

Shi said his group is now following up on this finding in larger animals. If successful, Shi said he would be eager to evaluate their regenerative ability in people with advanced periodontal disease, which can be extremely difficult to control with current treatments.

My guess is they want to extract similar cells from large non-human animals because for ethical and practical reasons it is easier to do most of the work toward developing therapies using animals before attempting trials in humans.

While the press release has just been issued to announce the publication of the results in The Lancet, it appears this work was done last year: a patent has already been filed on it, and the final confirming step involved putting the human cells into immunocompromised mice to form the specialized ligament cells.

The NIH announces a new technology wherein stem cells from the PDL have been isolated from adult human PDL. These cells are capable of forming cementum and PDL in immunocompromised mice. In cell culture, PDL stem cells differentiate into collagen fiber forming cells (fibroblasts), cementoblasts, and adipocytes. It is anticipated that these PDL stem cells will be useful for periodontal tissue regeneration to treat periodontal disease.

It is hard to guess when this work will translate into wide availability of human treatments. But the consensus of German stem cell researchers is that some stem cell therapies will be available within 10 years.

My guess is that there are many more sources of adult stem cells hiding in various locations of the human body waiting to be found. Expect to read many more reports of discoveries of types of adult stem cells. Each such discovery is helpful not just as a potential starting source of cells for cell therapies but also as a point of comparison with other cell types for developing a better understanding of how cells differentiate. The more cell types scientists have to compare, the better they will be able to figure out how cells control their differentiated states and how to intervene to alter cell types for therapeutic purposes.

By Randall Parker 2004 July 11 04:51 PM  Biotech Organ Replacement
Entry Permalink | Comments(11)
Adult Bone Marrow Stem Cells Boost Heart Capacity

Helmut Drexler of University of Freiburg, Germany and his colleagues treated sufferers of acute myocardial infarctions (i.e. heart attacks) with bone marrow stem cells and found that the bone marrow stem cells boosted the volume of blood pumped by the left ventricle of the heart.

60 patients who had undergone successful percutaneous coronary intervention (PCI; balloon angioplasty and coronary stenting) to restore coronary artery bloodflow took part in the study. Half were given bone marrow stem-cell transfer 5 days after PCI, the other half were given optimum medical therapy. Patients who had been given stem-cell transfer had around a 7% improvement in left-ventricular function compared with only a 0.7% increase for patients given medical therapy.

Drexler says longer term and larger scale trials are needed to verify that there is a real lasting benefit from this therapy.

But he added: "Larger trials are needed to address the effect of bone marrow cell transfer on clinical endpoints, such as the incidence of heart failure and survival."

While other studies have used adult stem cells to attempt to repair damaged hearts this is the first study done using adult stem cells and a proper control group of patients. (same article here)

"What makes this notable is it's the first controlled study where they actually have a control group," said Dr. Robert Bonow, chief of cardiology and professor of medicine at Northwestern University in Chicago and past president of the American Heart Association. "In previous studies, you didn't know whether the stem cells were responsible or if it was going to happen anyway."

Note that this research is being done in Germany. In the United States the US Food and Drug Administration (FDA) is throwing up roadblocks even for adult stem cell therapy. The FDA's stance has nothing to do with the debate about embryonic stem cells. Rather, it is part of the FDA's never-ending quest to protect people with fatal diseases from the risk that experimental therapies might harm them. In my view people with fatal diseases ought to be allowed to try experimental therapies and the FDA's position both slows the rate at which treatments are developed and unjustifiably takes away the individual's right to choose which treatment risks are worth taking.

By Randall Parker 2004 July 11 03:15 PM  Biotech Organ Replacement
Entry Permalink | Comments(2)
Adult Skin Cells Turned Into Neural Stem Cells

Better Humans reports on research by Siddharthan Chandran of the Cambridge Centre for Brain Repair at the University of Cambridge, UK, on the use of a mix of growth factors to successfully turn skin cells into neural stem cells.

This resulted in large numbers of nestin-positive neural precursors.

"The generation of almost limitless numbers of neural precursors from a readily accessible, autologous adult human source provides a platform for further experimental studies and has potential therapeutic implications," say the researchers.

The presence of nestin protein is generally seen as an indicator that a cell has become a neural stem cell. But that indicator alone is not a certain measure of success. Other cases of seeming success with stem cell transformation have been thrown into question by the use of more sensitive means of measuring cell state (though if you click through on that link you will see that even in that case the more sensitive means did not absolutely disprove the original result, because cell culture media vary too much in composition, in unknown and uncontrollable ways, between lots). Still, Chandran's team may really have succeeded in doing what they have reported.

My take on all this is that what Chandran is trying to do is at least theoretically possible. One does not always have to start with embryonic stem cells to get cells to differentiate (specialize) into any desired type. Systematic searches for compounds that turn more specialized cells into less specialized cells have turned up promising compounds for making stem cells from fully differentiated adult cells. Expect to see many more such reports.

By Randall Parker 2004 July 11 01:21 PM  Biotech Organ Replacement
Entry Permalink | Comments(0)
2004 July 08 Thursday
Is Aging A Medical Condition?

A UC Irvine scientist claims aging is not a medical condition.

James McGaugh, director of the Center for the Neurobiology of Learning and Memory at the University of California-Irvine, bristles at the notion of people with normal brains taking medication to boost their brainpower. After all, he says, no one regards the slowing down of the body with age as a medical condition.

I disagree! So do many others. Aging is a medical condition because an aged body does not function properly. A body that does not function properly has a disease. A disease is a medical condition.

(Related story: 'Smart pills' make headway)

"Does Michael Jordan have age-related physical impairment?" McGaugh asks.

Yes, of course he does. Jordan's problem with his knees is obviously age-related. His general slowing down and reduced endurance are age-related. It is not a coincidence that as the years passed and his body aged he became impaired compared to what he used to be able to do.

Just as Jordan may not be as agile on the basketball court as he used to be, McGaugh says, there's strong evidence that memory processing slows with age.

Yes, our brains age. The aging of our brains is a medical condition. Consider this recent report on gene expression, brain aging, and damaged genes.

To investigate age-associated molecular changes in the human brain, Dr. Bruce A. Yankner, professor in the Department of Neurology and Division of Neuroscience at Children's Hospital Boston and Harvard Medical School, and colleagues examined patterns of gene expression in postmortem samples collected from thirty individuals ranging in age from 26 to 106 years. Using a sophisticated screening technique called transcriptional profiling that evaluates thousands of genes at a time, the researchers identified two groups of genes with significantly altered expression levels in the brains of older individuals. A gene's expression level is an indicator of whether or not the gene is functioning properly.

"We found that genes that play a role in learning and memory were among those most significantly reduced in the aging human cortex," said Yankner. "These include genes that are required for communication between neurons."

In addition to a reduction in genes important for cognitive function, there was an elevated expression of genes that are associated with stress and repair mechanisms and genes linked to inflammation and immune responses. This is evidence that pathological events may be occurring in the aging brain, possibly related to gene damage.

The researchers then went on to show that many of the genes with altered expression in the brain were badly damaged and could not function properly. They showed that these genes also could be selectively damaged in brain cells grown in the laboratory, thereby mimicking some of the changes of the aging brain.

Is gene damage a medical condition? More generally, is brain damage a medical condition? Yes, of course. If you have something in your body that is damaged then you have a medical condition.

I am amazed to see scientists promoting the naturalistic fallacy that if some process is natural it must be normal and must not be treated. Imagine making that argument about, say, a troubled pregnancy: "Sorry ma'am, we can't intervene to save you or your baby from preeclampsia because in our view your illness is a natural result of an interaction between your genes and your environment." Or imagine saying this about a bacterial infection: "We can't give your daughter an antibiotic to kill the Group A streptococcal infection that is causing scarlet fever because infections are natural and have been happening for all of human history. So she'll just have to die since there is no medical condition here." You'd be thought either crazy or incredibly unethical if you said such things. But today too many scientists, doctors, and members of the public at large think of aging as an inevitability to be embraced as part of the natural order. Well, aging is not inevitable. It is one big medical condition that we need to cure. Aging reversal will some day become possible and we ought to be trying much harder to make that day come as soon as possible.

By Randall Parker 2004 July 08 10:55 AM  Aging Debate
Entry Permalink | Comments(26)
Ocean Iron Fertilization Viable To Remove Atmospheric Carbon Dioxide?

Some scientists are questioning the viability of ocean iron seeding as a means to sequester carbon from atmospheric carbon dioxide. (The Scientist requires free registration; it is an excellent publication that is worth the trouble of signing up for.)

The idea can be traced back to a Woods Hole Oceanographic Institution meeting in 1985, when John Martin, then director of the Moss Landing Marine Laboratory, boasted: "Give me half a tanker of iron and I'll give you an ice age." Martin's general hypothesis that iron seeding would create a photosynthetic bloom proved correct, although the idea has turned out to be far less economical than he expected. The breakeven point for sequestration programs is $10 per ton of carbon dioxide; models based on the iron-seeding experiments still put the cost at $100 or more. Many scientists involved in iron-seeding projects as well as those observing them from afar say that iron seeding for purposeful carbon sequestration just doesn't work. "In the beginning, the assumptions were that for every atom of iron, we could sink 500,000 atoms of carbon," says Ken Caldeira, an ocean carbon-cycle scientist at Lawrence Livermore National Laboratory in California, who helped to create computer simulations. Those estimates have since been revised downwards by hundreds of orders of magnitude, he says.
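To see why Martin's original 500,000-to-1 atom ratio sounded so attractive, here is the implied amount of iron needed per ton of carbon, sketched in Python (the atomic masses are standard values; only the 500,000 figure comes from the quote):

```python
# How little iron the original 500,000:1 carbon-to-iron atom ratio
# implied per ton of carbon sunk. Atomic masses are standard values;
# the ratio is the early estimate quoted in the article.

FE_MASS = 55.85            # g/mol, iron
C_MASS = 12.01             # g/mol, carbon
ATOMS_C_PER_FE = 500_000   # Martin-era estimate, since revised sharply downward

# grams of iron needed to sink one metric ton (1e6 g) of carbon
grams_fe_per_ton_c = (1e6 / C_MASS) / ATOMS_C_PER_FE * FE_MASS

print(f"~{grams_fe_per_ton_c:.1f} g of iron per ton of carbon at the original estimate")
```

At that ratio, under ten grams of iron would sink a ton of carbon, which is what made "half a tanker of iron" sound plausible; the much lower ratios observed since are why the economics no longer pencil out.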

The article quotes a variety of scientists on whether the latest Southern Ocean Iron Fertilization Experiment (SOFeX) provides good or bad news for the prospect of iron fertilization as a way to increase photosynthesis by marine plant organisms as a way to cheaply remove carbon dioxide from the atmosphere. Some scientists still hold that it is the cheapest method to remove atmospheric carbon dioxide found to date. Read the full article if the debate interests you. I lack sufficient knowledge to render any sort of opinion on the subject.
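To make the cost gap concrete, here is a rough back-of-envelope sketch of how much iron it would take to sink the carbon in a ton of carbon dioxide. The 500,000:1 carbon-to-iron ratio comes from the quote above; the revised 1,000:1 ratio is my own hypothetical stand-in for the downgraded estimates, so treat this as arithmetic, not a claim about ocean chemistry:

```python
# Back-of-envelope: iron required to sink the carbon in one metric ton of CO2,
# as a function of the assumed carbon-atoms-exported-per-iron-atom ratio.
FE_MOLAR_MASS = 55.85    # g/mol
C_MOLAR_MASS = 12.01     # g/mol
CO2_MOLAR_MASS = 44.01   # g/mol

def iron_needed_kg(tons_co2, c_atoms_per_fe):
    """Kilograms of iron needed, assuming each Fe atom drives the export
    of `c_atoms_per_fe` carbon atoms into the deep ocean."""
    grams_c = tons_co2 * 1e6 * (C_MOLAR_MASS / CO2_MOLAR_MASS)  # metric tons -> grams of C
    mol_c = grams_c / C_MOLAR_MASS
    mol_fe = mol_c / c_atoms_per_fe
    return mol_fe * FE_MOLAR_MASS / 1e3

# Original optimistic assumption: 500,000 C atoms per Fe atom.
print(f"optimistic: {iron_needed_kg(1.0, 500_000):.4f} kg Fe per ton CO2")
# A hypothetical revised ratio of 1,000:1 needs 500 times as much iron.
print(f"revised:    {iron_needed_kg(1.0, 1_000):.2f} kg Fe per ton CO2")
```

The point is simply that the iron requirement, and hence the cost, scales inversely with the assumed carbon-to-iron ratio, so even a few orders of magnitude of downward revision wrecks the economics.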

Also see my previous post on the SOFeX results that links to more optimistic assessments of the experiment's results: Iron Enriching Southern Ocean Pulls Carbon Dioxide From Atmosphere.

By Randall Parker 2004 July 08 02:46 AM  Engineering Environmental
Entry Permalink | Comments(5)
Will Youthful Societies Be More Or Less Political?

Steve Sailer thinks beautiful young women are averse to political passions because they are more interested in personal passions.

"Fahrenheit 9/11" Fails to Draw Hot Babe Market -- Having made $64 million in 14 days, Michael Moore's docucomedy is certainly a smash hit, as I learned on Saturday while trying to find a theatre that wasn't sold out. Nonetheless, one audience segment was conspicuous by its absence Saturday night in Westwood: the beautiful girl demographic. I fear that this transcends ideology: political passions increase as the hair turns gray and more personal passions diminish.

Steve is probably right. Currently Western populations are aging. This argues for a more politicized female population. But this trend will eventually reverse and go to an extreme of widespread youthfulness and beauty. Likely within a few decades we will be able to reverse aging and become young again. Those of us lucky enough to still be alive when biomedical science finally advances to the point that aging reversal becomes possible will be able to look young again. My guess is that anyone alive in America in 2035 will not have to grow old and die. But perhaps the switchover point will come sooner.

Rejuvenation therapies will definitely increase the babetudinous factor in the population. But universal youthfulness will not be the only cause of increased amounts of beauty. Advances in cell therapy and other techniques such as better methods of growing new bone in the face will replace today's methods of plastic surgery with cheaper, safer, less painful, and far more powerful techniques. Less attractive humans will bioengineer themselves into stunningly attractive creatures.

Along with youth and beauty people will also have enhanced ability to enjoy sex. Men already have Viagra, Levitra, and other drugs to help with impotence. But impotence treatment by itself is not always an aphrodisiac, and especially not for women. However, recent research on Palatin Technologies' drug PT-141, which is under development, looks promising. The most recent research abstract from the Proceedings of the National Academy of Sciences shows PT-141 increases sexual solicitation in female rats.

Here we report that PT-141, a peptide analogue of {alpha}-melanocyte-stimulating hormone that binds to central melanocortin receptors, selectively stimulates solicitational behaviors in the female rat. This occurs without affecting lordosis, pacing, or other sexual behaviors. PT-141 did not cause generalized motor activation, nor did it affect the perception of sexual reward. A selective pharmacological effect on appetitive sexual behavior in female rats has not been reported previously, and indicates that central melanocortin systems are important in the regulation of female sexual desire. Accordingly, PT-141 may be the first identified pharmacological agent with the capability to treat female sexual desire disorders.

PT-141 is also under development for male Erectile Dysfunction.

Palatin's research suggests that PT-141 works through activation of melanocortin receptors in the central nervous system (CNS) rather than acting directly on the vascular system. Activation of melanocortin receptors by PT-141 results in the specific stimulation of vasodilatation in the male and female genitalia. Based on PT-141's CNS site of activity, animal studies and early clinical work suggest that PT-141 may be effective in treating the desire as well as arousal components of FSD. PT-141 is also in development as a treatment for Erectile Dysfunction (ED). Clinical data from phase 2 studies indicates that PT-141 may be effective in treating a broad range of patients suffering from erectile dysfunction. The nasal formulation of PT-141 being developed is as convenient as oral treatments, is more patient friendly than invasive treatments for ED, such as injections and trans-urethral pellets, and appears to result in a more rapid onset of action than is seen following the oral administration of treatments for ED.

Expect to see many more drugs developed that modify and enhance sexual performance, sexual interest, and sexual pleasure. Youthful and beautiful bodies will have large sexual appetites.

Picture a future full of sexually aroused youthful beautiful people. Will they be as interested in politics as they are today? Interests compete. There are only so many hours in the day. The enthusiasms of youth will compete with involvement in politics. I expect to see a decrease in political interest among women. However, I do not expect to see as large of a decrease in political interest among men. Some men will be drawn to politics in order to gain power and status to make themselves more successful in competing with other men.

I have previously argued that desire for attractive objects of affection will be easier to satisfy than the desire for higher status. Though perhaps one partial solution to the need for humans to feel high status will be the creation of virtual realities where real humans can feel higher in status than simulated humans.

By Randall Parker 2004 July 08 02:29 AM  Future Youthful Society
Entry Permalink | Comments(3)
2004 July 06 Tuesday
Depressed People Have Larger Thalamus Area For Emotions

Depression appears to be caused by long-standing abnormalities in an area of the thalamus that controls emotion.

Individuals who suffer from severe depression have more nerve cells in the part of the brain that controls emotion, researchers at UT Southwestern Medical Center at Dallas have found.

Studies of postmortem brains of patients diagnosed with major depressive disorder (MDD) showed a 31 percent greater than average number of nerve cells in the portion of the thalamus involved with emotional regulation. Researchers also discovered that this portion of the thalamus is physically larger than normal in people with MDD. Located in the center of the brain, the thalamus is involved with many different brain functions, including relaying information from other parts of the brain to the cerebral cortex.

The findings, published in today's issue of The American Journal of Psychiatry, are the first to directly link a psychiatric disorder with an increase in total regional nerve cells, said Dr. Dwight German, professor of psychiatry at UT Southwestern.

"This supports the hypothesis that structural abnormalities in the brain are responsible for depression," he said. "Often people don't understand why mentally ill people behave in odd ways. They may think they have a weak will or were brought up in some unusual way.

"But if their brains are different, they're going to behave differently. Depression is an emotional disorder. So it makes sense that the part of the brain that is involved in emotional regulation is physically different."

I find it curious that an emotional disorder would seemingly require such a large change in the size of a part of the brain in order to manifest. Small changes in enormous computer programs can lead to large malfunctions. Yet with major depression, at least, incredibly small changes are apparently not sufficient to cause the disorder in most sufferers.

Note, however, that sufferers of bipolar depression do not have an increase in the size of the mediodorsal and anteroventral/anteromedial areas of the thalamus. So then is there an abnormality in the size of some other part of the brain of bipolars waiting to be discovered?

Researchers from UT Southwestern, working with a team from Texas A&M University System Health Science Center, used special computer-imaging systems to meticulously count the number of nerve cells in the thalamus.

Results showed an increase of 37 percent and 26 percent, respectively, in the number of nerve cells in the mediodorsal and anteroventral/anteromedial areas of the thalamus in subjects with MDD when compared with similar cells in those with no psychiatric problems. The number of nerve cells in subjects with bipolar disorder and schizophrenia was normal.

Researchers also found that the size of the affected areas of the thalamus in subjects with MDD was 16 percent larger than those in the other groups.

"The thalamus is often referred to as the secretary of the cerebral cortex – the part of the brain that controls all kinds of important functions such as seeing, talking, moving, thinking and memory," Dr. German said. "Most everything that goes into the cortex has to go through the thalamus first.

"The thalamus also contains cells that are not involved with emotion. Our studies found these portions of the thalamus to be perfectly normal. But the ones that are involved in emotion are the ones that were abnormal."

Researchers also looked at the effect of antidepressant medications on the number of nerve cells and found no significant difference among any of the subject groups – whether they had taken antidepressants or not – reinforcing the belief that abnormalities in brain development are responsible for depression.

Does this report suggest any avenues for the development of therapeutic treatments? Could a drug be developed that would inhibit some of the neurons in the mediodorsal and anteroventral/anteromedial areas of the thalamus? Would such a drug reduce the symptoms of depression? It will be interesting to see what develops from this report.
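As a quick sanity check on the figures quoted above, the 31 percent increase in neuron count combined with the 16 percent increase in region size implies the affected tissue is denser as well as larger (this sketch assumes the 16 percent figure refers to volume, which the press release does not state explicitly):

```python
# Implied neuron density change in the MDD thalamus, from the figures quoted
# above: 31% more cells packed into a region only 16% larger.
cell_increase = 0.31    # fractional increase in neuron count
volume_increase = 0.16  # fractional increase in region size

density_ratio = (1 + cell_increase) / (1 + volume_increase)
print(f"implied density change: {density_ratio - 1:+.1%}")  # roughly +13%
```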

By Randall Parker 2004 July 06 06:25 PM  Brain Disorders
Entry Permalink | Comments(3)
Life Expectancies Increased In Upper Paleolithic Period

By examining dental information derived from molar wear patterns, a pair of anthropologists has been able to show that human life expectancy increased during the Upper Paleolithic Period.

ANN ARBOR, Mich.---Researchers at the University of Michigan and the University of California at Riverside have discovered a dramatic increase in human longevity that took place during the early Upper Paleolithic Period, around 30,000 B.C.

In their study of more than 750 fossils to be published July 5 in the Proceedings of the National Academy of Sciences, anthropologists Rachel Caspari and Sang-Hee Lee found a dramatic increase in longevity among modern humans during that time: the number of people surviving to an older age more than quadrupled.


By calculating the ratio of old-to-young individuals in the samples from each time period, the researchers found a trend of increased survivorship of older adults throughout human evolution. It's not just how long people live that's important for evolution, but the number of people who live to be old, Caspari and Lee pointed out.

The increase in longevity that occurred during the Upper Paleolithic period among modern humans was dramatically larger than the increase identified during earlier periods, they found. "We believe this trend contributed importantly to population expansions and cultural innovations that are associated with modernity," they wrote.

A large number of older people allowed early modern humans to accumulate more information and to transmit specialized knowledge from one generation to another, they speculated. Increased adult survivorship also strengthened social relationships and kinship bonds, as grandparents survived to educate and contribute to extended families and others. Increased survivorship also promoted population growth, the authors explain, since people living longer are likely to have more children themselves, and since they also make major contributions to the reproductive success of their offspring.

"Significant longevity came late in human evolution and its advantages must have compensated somehow for the disabilities and diseases of older age, when gene expressions uncommon in younger adults become more frequent," the authors noted.

"There has been a lot of speculation about what gave modern humans their evolutionary advantage," Caspari said. "This research provides a simple explanation for which there is now concrete evidence: modern humans were older and wiser."

Here is my FuturePundit speculation on this report: the lengthening of lifespans created a selective pressure for higher intelligence. When people started living longer they accumulated more knowledge. The increase in available knowledge increased the value of having a high cognitive ability to sort through, analyze, and apply that knowledge. A smarter person can notice more and learn more useful lessons from an accumulation of life experiences than can a less intelligent person. So genetic mutations that lengthened lifespans may have led to selection for mutations that increased intelligence. Then the selection for higher intelligence likely increased the value of living even longer which would have fed back into selecting for longer lifespans.

But important questions remain unanswered: Did any Upper Paleolithic civilizations collapse from spiralling taxes enacted in a futile attempt to meet unfunded pension liabilities? Were massive human migrations across the continents driven by a desire to escape from old age pension taxes?

Update: Managing to live for 30 years was enough to classify someone as "old" in the Upper Paleolithic.

They judged the age of specimens by examining wear to teeth and classified "old" as twice the age of sexual maturity - roughly 30 years.
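The old-to-young ratio computation Caspari and Lee describe is simple to sketch. The counts below are invented for illustration, not their data; only the method follows the article (count "old" adults, those past roughly twice the age of sexual maturity, or about 30 years, and divide by the count of "young" adults):

```python
# Hypothetical fossil samples: counts of "young" adults (15-30) versus "old"
# adults (over ~30, i.e. twice the age of sexual maturity per the study's
# criterion). These numbers are made up to illustrate the calculation.
samples = {
    "Australopithecines": {"young": 100, "old": 30},
    "Early Homo":         {"young": 80,  "old": 35},
    "Neanderthals":       {"young": 60,  "old": 35},
    "Upper Paleolithic":  {"young": 40,  "old": 85},
}

def oy_ratio(counts):
    """Old-to-young ratio: number of old adults per young adult."""
    return counts["old"] / counts["young"]

for group, counts in samples.items():
    print(f"{group:20s} OY ratio = {oy_ratio(counts):.2f}")
```

A ratio above 1.0, as in the illustrative Upper Paleolithic row, would mean old adults outnumber young ones in the sample, which is the kind of jump in survivorship the study reports.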

By Randall Parker 2004 July 06 03:49 PM  Trends, Human Evolution
Entry Permalink | Comments(1)
2004 July 05 Monday
Bone Marrow Stem Cells Confirmed To Convert To Liver Cells

Diane Krause and colleagues at Yale report that bone marrow stem cells can differentiate into epithelial cells (which are found in skin and on the surfaces of inner body cavities).

New Haven, Conn. -- Epithelial cells derived from bone marrow cells can be a result of differentiation, not fusion, according to a study published in Science by Yale researchers who arrived at some of the earliest findings on non-blood cells derived from bone marrow.

Led by Diane Krause, associate professor of Laboratory Medicine and Pathology at Yale School of Medicine, the investigators transplanted marrow-derived cells from male mice into female mice. They followed the fate of the marrow-derived cells (male) by detecting the Y chromosome. If resulting epithelial cells were formed by cell-to-cell fusion, they should express green fluorescent protein (GFP) and not beta-galactosidase.

"Our results show that under normal circumstances, the green fluorescent protein was not expressed, which means that no fusion has occurred and that the marrow derived cells can become non-blood cells without fusing," said Krause, attending physician in Laboratory Medicine at Yale-New Haven Hospital.

Krause said they did find that when the tissues were damaged, there were some cells that expressed GFP and therefore were derived from donor cells fusing with recipient cells.

Several years ago Krause's laboratory published a study showing that bone marrow stem cells can differentiate into liver, lung, kidney, skin, muscle and other cells. Later studies published by other researchers postulated that the bone marrow derived cells had actually fused with epithelial cells.

Krause said the ramifications of these latest findings are still unclear. "The absence of fusion in this model does not necessarily imply that trans-differentiation, a change in phenotype of one mature cell type to that of another mature cell type, has occurred," she said. "In fact, we carefully refrain from using that terminology in this report to avoid making assumptions about the mechanism of the phenotypic change."

She said it may be that an as-yet-unidentified, multipotent epithelial precursor exists in the bone marrow or that a separate population of marrow precursors exhibit a gene expression pattern that can be reprogrammed to express markers of other cell types.

This result is important because it restores some of the luster of adult stem cells that was lost when it was found that in at least some cases adult stem cells were providing benefits by fusing with existing cells. In 2003 some researchers reported that bone marrow hematopoietic stem cells were repairing livers by fusing with existing liver cells rather than by differentiation into liver cells.

The phenomenon of cell fusion between adult stem cells and other cell types has been demonstrated with heart, brain, and liver cells.

In a study that calls into question the plasticity of adult stem cells, Howard Hughes Medical Institute (HHMI) researchers and colleagues at the University of California, San Francisco, have demonstrated that adult bone marrow cells can fuse with brain, heart and liver cells in the body.

The phenomenon of fusion would give the appearance that bone marrow stem cells are altering themselves to become mature cells in other tissues, when in fact they are not, according to one of the study's senior authors, HHMI investigator Sean J. Morrison at the University of Michigan.

Note that the fusion does appear to provide benefits. If, say, you have a fatal liver disease and you can get better by having adult stem cells fuse instead of having the stem cells convert into liver cells, are you going to turn down the treatment? Of course not. But the ability of adult stem cells to convert into various fully differentiated (specialized for particular functions) cells opens up the door for many more potential medical uses such as growth of replacement organs and replacement of lost cells such as happens with Parkinson's Disease and heart disease.

Krause's group is not the only team to recently report success at getting bone marrow stem cells to take on specialized liver cell functions without fusing with existing liver cells. A month ago a team at Johns Hopkins also reported success in using bone marrow stem cells to convert into liver cells without fusing with existing liver cells.

Bone marrow stem cells, when exposed to damaged liver tissue, can quickly convert into healthy liver cells and help repair the damaged organ, according to new research from the Johns Hopkins Kimmel Cancer Center.


There has been debate among the scientific community over whether these cells also can differentiate into other tissue types such as the liver, says Saul J. Sharkis, Ph.D., senior author of the study and a professor of oncology at the Johns Hopkins Kimmel Cancer Center. Some studies suggest that the bone marrow cells fuse with other types of cells, taking on those cells' properties. But in this study, the researchers found, through highly thorough analysis with a microscope and other tests, that the cells did not fuse, suggesting that "microenvironmental" cues from existing liver cells caused them to convert.

I've never viewed the obstacles to making adult stem cells more plastic (i.e. capable of changing into more cell types) as insurmountable. It is possible that it will be quicker to use embryonic stem cells for some purposes. But eventually techniques will be developed to allow adult stem cells to be converted into all cell types. As more adult stem cell sources are found in the body and as better techniques for growing them are discovered, the range of potential target cell types that adult stem cells will be able to make will steadily increase.

My point here is not to argue for or against embryonic stem cell research. My point is that even a complete ban on human embryonic stem cell research will only delay the development of cell therapies and organ growth methods. Granted, such a delay would result in human deaths that would otherwise be avoided. But legal obstacles can be literally programmed around once our knowledge of genetic programming advances far enough and we have the ability to change the epigenetic programming of cells to whatever state we desire.

By Randall Parker 2004 July 05 02:01 PM  Biotech Organ Replacement
Entry Permalink | Comments(2)
2004 July 02 Friday
Do Artificial Sweeteners Cause Obesity?

Consumption of artificial sweeteners may mistrain the brain, making it underestimate the number of calories in foods that contain real sugar.

Professor Terry Davidson and associate professor Susan Swithers, both in the Department of Psychological Sciences, found that artificial sweeteners may disrupt the body's natural ability to "count" calories based on foods' sweetness. This finding may explain why increasing numbers of people in the United States lack the natural ability to regulate food intake and body weight. The researchers also found that thick liquids aren't as satisfying - calorie for calorie - as are more solid foods.

Our attempts to fool ourselves are defeated because the parts of our brain that regulate appetite can monitor signals that provide an indication of how many calories were consumed.

"The body's natural ability to regulate food intake and body weight may be weakened when this natural relationship is impaired by artificial sweeteners," said Davidson, an expert in behavioral neuroscience. "Without thinking about it, the body learns that it can use food characteristics such as sweetness and viscosity to gauge its caloric intake. The body may use this information to determine how much food is required to meet its caloric needs."

Over the past 25 years, there has been a dramatic increase in the consumption of artificially sweetened foods and low viscosity, high-calorie beverages, said Swithers, a developmental psychobiologist.

"Incidence of overweight and obesity has also increased markedly during this period," she said. "Our hypothesis is that experience with these foods interferes with the natural ability of the body to use sweet taste and viscosity to gauge caloric content of foods and beverages. When you substitute artificial sweetener for real sugar, however, the body learns it can no longer use its sense of taste to gauge calories. So, the body may be fooled into thinking a product sweetened with sugar has no calories and, therefore, people overeat."

Swithers said that the loss of the body's ability to gauge caloric intake contributes to increased food intake and weight gain, especially when people do not count calories on their own. A similar dynamic is at work with foods' texture and thickness.

"Historically, we knew that our body learns that if the food is thick, such as whole milk, it tends to have more calories than compared to a thinner liquid such as skim milk," Swithers said. "Now, our research reinforces this and takes it one step further, showing that our bodies translate this information about perceived calories into a gauge to tell us when to stop eating."

Two studies on rats yielded results that led to this theory.

Davidson and Swithers' findings are based on two studies.

In the first study, two groups of rats were given two different sweet-flavored liquids. In the first group, both liquids were sweetened with natural high-calorie sweeteners so there was a consistent relationship between sweet taste and calories. For the second group, one of the flavored liquids was artificially sweetened with non-caloric saccharin so that the relationship between sweet taste and calories was inconsistent.

After 10 days of exposure to the flavors, the rats were allowed to eat a small amount of a sweet, high-calorie chocolate flavored snack. The researchers compared the two groups' ability to compensate for the calories contained in the chocolate snack. The rats that had experienced the inconsistent relationship between sweet taste and calories were less able to compensate for the calories contained in the snack and ate more than the rats that had experienced the consistent relationship between sweetness and caloric intake.

"This suggests that experience with the inconsistent relationship reduced the natural ability of the rats to use sweet taste to judge the caloric content of the snack," Swithers said.

In the second study, two groups of rats were given a high-calorie dietary supplement along with their regular food every day for 30 days. Although the supplements were identical in calories and nutritive content, they differed in viscosity. For one group the supplement had the consistency of thick chocolate pudding, whereas for the other group, the supplement was similar to chocolate milk. Davidson and Swithers found that over the course of the study, the rats given the milk-like supplement gained significantly more weight than the rats given the more viscous, pudding-like supplement.

"This finding indicates that rats are less able to estimate and compensate for the calories contained in liquids than in semi-solid foods," Davidson said. "If the body is less able to detect and compensate for calories contained in liquids, then intake of high-calorie beverages compared to semi-solid or solid foods could increase the tendency to gain weight."

There are a couple of obvious lessons here. First of all, do not use artificial sweeteners. Second, drink water and get all your calories from semi-solid and solid foods. This is wise advice for other reasons anyway. You are better off eating an apple than drinking apple juice because the solid apple has more vitamins, minerals, antioxidant compounds, and fiber. The same holds true for other fruits and vegetables versus their juices in the vast majority of cases.

By Randall Parker 2004 July 02 02:36 PM  Brain Appetite
Entry Permalink | Comments(5)
2004 July 01 Thursday
MIT Team Demonstrates Plant Protein Photovoltaic Cell

Marc Baldo and a collaborator at MIT have taken spinach chloroplast proteins and worked them into a photovoltaic solar cell that generates electricity.

Baldo's team isolated a variety of photosynthetic proteins from spinach and sandwiched them between two layers of conducting material. When light was shone on to the tiny cell, an electrical current was generated. Their discovery is reported in Nano Letters.

Plant chloroplasts normally capture photons to excite electrons to drive photosynthesis. The machinery is there in chloroplasts to cause electron flow in the presence of light. If that machinery could be massaged into a form that would create lasting photovoltaic cells then it holds the potential of providing a way to make cheap photovoltaics. This idea has been around for a while, and back in the late 1970s there was a short burst of funding for plant biochemists to work on chloroplast electrochemistry. The advocates of one proposed approach claim a chlorophyll-photovoltaic cell may be able to convert 72% of sunlight to electricity.

As well, in a chlorophyll-photovoltaic cell (nicknamed the “chlorovoltaic cell,” or “CVC”) it will be useful to employ multiple layers of synthetic chlorophyll, each specializing in absorbing certain wavelengths of light. For instance, layers of chlorophyll-a and chlorophyll-b (both synthesized from plants) will specialize in blue-violet-red and orange-blue wavelengths respectively. New layers can be engineered using developed technologies such that they are able to absorb the energy from ultraviolet, yellow-green, and infrared light. With each layer parallel to one another, incident light will be used to its maximum capability.

More here.

My take on chlorophyll-photovoltaic cells is that they will be feasible some day but it is hard to say when. Their potential advantage over more conventional biomass approaches to energy is that they would not need to be tended the way plants in fields or in vats must be. Their potential advantage over more conventional silicon photovoltaic cells is that they may some day be much cheaper to make. But one question that arises is whether the chloroplast proteins can be stabilized to last for long periods of time.
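The multi-layer absorption idea from the quote above can be illustrated with a toy model. The wavelength bands and absorption fractions below are made up for illustration; the sketch also assumes a flat photon flux per nanometer and non-overlapping bands, neither of which real spectra or real chlorophylls satisfy:

```python
# Toy model of stacking absorber layers that each capture a fraction of the
# light in a distinct wavelength band. Bands are (start_nm, end_nm, fraction
# absorbed); all numbers here are illustrative, not measured values.
layers = [
    (400, 500, 0.8),   # "chlorophyll-b-like" blue band
    (600, 700, 0.8),   # "chlorophyll-a-like" red band
    (500, 600, 0.5),   # hypothetical engineered yellow-green absorber
]

SPECTRUM = (400, 700)  # visible range considered, in nm

def captured_fraction(layers, spectrum):
    """Fraction of photons in `spectrum` captured, assuming flat photon
    flux per nm and non-overlapping absorption bands."""
    total = spectrum[1] - spectrum[0]
    captured = sum((end - start) * frac for start, end, frac in layers)
    return captured / total

print(f"captured: {captured_fraction(layers, SPECTRUM):.0%}")
```

Even this crude model shows the appeal of the scheme: any single band-limited layer here captures at most about a quarter of the photons, while the stack captures well over half.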

By Randall Parker 2004 July 01 06:09 PM  Energy Solar
Entry Permalink | Comments(2)