2004 October 31 Sunday
Toll Of Gravity Not Major Cause Of Facial Aging

Plastic surgeon Val Lambros, MD, says the major cause of change in facial appearance is the loss of fat, which deflates the skin inward toward the muscles and bones.

PHILADELPHIA – To the surprise of many people, the loss of fat and sun exposure play a bigger role than gravity in aging the face, according to a study presented today at the American Society of Plastic Surgeons (ASPS) Plastic Surgery 2004 conference in Philadelphia.

“People make assumptions about how the face ages because when they pull up on their facial skin, they look better,” said Val Lambros, MD, ASPS member and author of the study. “Actually the pull of gravity on facial tissues is not a significant component of facial aging. Instead, other factors, like the loss of facial fat and sun damage are more contributory in the complex process of aging.”

In addition, the nature of facial skin changes over time becoming thinner, most notably around the eyelids. These changes are often accelerated by sun exposure, which damages the skin.

“Plastic surgeons rejuvenate the aging face by pulling up and tightening the tissue, but treatment also requires a balance between tightening tissue and replacing lost facial fat with wrinkle fillers,” said Dr. Lambros. “The key is knowing how much of each to do.”

Dr. Lambros used perfectly aligned photographs taken at different times in people's lives to measure what moves and changes with age, and found that few facial features change.

Surprisingly, he said, only a few features shifted over time. The subjects' brows fell slightly and their upper lips thinned. Their jowls became more prominent, but they expanded rather than dropped. Every other feature in the photographs remained perfectly still.

Subcutaneous fat injections are now part of the repertoire of plastic surgeons. However, fat injections and collagen injections typically last only several months.

Results: The duration of the fat injections varies significantly from patient to patient. Though some patients have reported results lasting a year or more, the majority of patients find that at least half of the injected fullness disappears within 3-6 months. Therefore, repeated injections may be necessary.

What is needed is the ability to move intact fat cells in a way that allows a higher percentage of them to survive at the transplanted location. One report I found on injection of fat into penises to make them thicker (really!) mentions that perhaps 30% of transplanted cells live. So then would several transplants over a period of a couple of years eventually result in a large enough build-up of viable fat cells at the target location that further transplants would not be necessary? Anyone know?
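A crude back-of-the-envelope answer to my own question. The assumptions are mine, not from the report: each session injects the same volume, the 30% of cells that survive do so permanently, and survivors from earlier sessions are unaffected by later ones. Under those assumptions the build-up is simply linear:

```python
# Hypothetical model of repeated fat grafting. Assumptions (mine, not
# from the article): each session injects the same volume, a fixed
# fraction of cells survives permanently, and survivors from earlier
# sessions are unaffected by later injections.

def sessions_needed(target_ml, injected_ml_per_session, survival_rate=0.30):
    """Return how many sessions it takes for surviving fat to reach the target."""
    surviving = 0.0
    sessions = 0
    while surviving < target_ml:
        surviving += injected_ml_per_session * survival_rate
        sessions += 1
    return sessions

# Example: to accumulate 30 ml of viable fat, injecting 20 ml per session
# at 30% survival, you need 30 / (20 * 0.3) = 5 sessions.
print(sessions_needed(30, 20))  # -> 5
```

So if those assumptions held, a handful of sessions would indeed suffice; the open biological question is whether survivors really do persist indefinitely.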

By Randall Parker 2004 October 31 02:08 PM  Aging Studies
Entry Permalink | Comments(1)
Electric Current To Scalp Improves Speed Of Word Recall

Meenakshi Iyer, Ph.D., of the US National Institute of Neurological Disorders and Stroke Brain Stimulation Unit has found that a current of two thousandths of an ampere applied to the scalp with electrodes increases the speed of word recall.

A current of two thousandths of an ampere (a fraction of that needed to power a digital watch) applied for 20 minutes is enough to produce a significant improvement, according to data presented this week at the annual meeting of the Society for Neuroscience, held in San Diego.


The volunteers were asked to name as many words as possible beginning with a particular letter. Given around 90 seconds, most people get around 20 words. But when Iyer administered the current, her volunteers were able to name around 20% more words than controls, who had the electrodes attached but no current delivered. A smaller current of one thousandth of an amp had no effect.

This result is not a strong case for using electric currents to improve your brain's performance. First of all, I'm not convinced that this is entirely harmless. The current flows are causing events in the brain that otherwise would not occur. Are all the effects transitory? We do not know.

Also, the improvement in performance in one kind of test may very well decrease performance in other kinds of tests. There are precedents for this. For example, caffeine helps the brain stay on a chain of thought but does so at the expense of reducing word recall on unrelated subjects. Well, the ability to stay on a chain of thought is obviously useful in a lot of situations and so the trade-off is often worth it. But what is the trade-off from having the ability to recall more words that start with the same letter? Iyer's work needs to be repeated with a larger assortment of tests of cognitive function to see if there is any decay in the ability to perform other types of mental tasks.

Also, a wider range of cognitive tests run on people who take this treatment might turn up other benefits of this treatment. Perhaps the ability to recall more words improves the ability to find words to use to write better prose. People who do a lot of writing such as reporters, book authors, and Hollywood script writers might benefit from a little bit of electric juice. But another question that needs to be answered is just how long does the effect last?

My advice is to go jogging if you have hit a mental writing block. Exercise will increase brain performance while fat and table sugar will probably decrease brain performance.

Update: My point above about there often being trade-offs in cognitive enhancement is driven home by a new report out of Ohio State University: stress increases memory recall but decreases problem-solving ability.

Researchers at Ohio State University gave a battery of simple cognitive tests to 19 first-year medical students one to two days before a regular classroom exam – a period when they would be highly stressed. Students were also given a similar battery of tests a week after the exam, when things were less hectic.

While pre-exam stress helped students accurately recall a list of memorized numbers, they did less well on the tests that required them to consider many possibilities in order to come up with a reasonable answer. A week after the exam, the opposite was true.

"Other studies have suggested that elevated stress levels can actually improve some aspects of cognition, particularly working memory," said Jessa Alexander, a study co-author and a research assistant in neurology at Ohio State. "The results of the two problem-solving tests we administered suggested a decline in problem solving abilities that required flexible thinking."

She conducted the study with David Beversdorf, an assistant professor of neurology at Ohio State. The two presented their findings on October 25 in San Diego at the annual Society for Neuroscience conference.

By Randall Parker 2004 October 31 12:17 PM  Brain Enhancement
Entry Permalink | Comments(3)
2004 October 28 Thursday
1918 Killer Flu Virus Research Lab Containment Levels Too Low?

The 1918 influenza virus pandemic that killed tens of millions of people is being reconstructed in research laboratories using virus samples extracted from the long-frozen bodies of victims who were buried in very cold regions of far northern Europe. While some of the work on these virus samples was done in a top-level BioSafety Level 4 (BSL-4) laboratory, much of the work on the 1918 influenza is being done in lower-level BSL-3Ag and even lower-level BSL-3 labs.

Yet despite the danger, researchers in the US are working with reconstructed versions of the virus at less than the maximum level of containment. Many other experts are worried about the risks. “All the virologists I have spoken to have concerns,” says Ingegerd Kallings of the Swedish Institute for Infectious Disease Control in Stockholm, who helped set laboratory safety standards for the World Health Organization.

If a virus that killed 40 million people is not worthy of being handled at the top level of containment then what is? I guess an argument can be made that smallpox is an even greater threat and warrants an even higher level of caution. But I'm not even sure that is true. My guess is we are far better equipped to stop a smallpox outbreak than to stop an influenza outbreak.

By contrast, the team in Georgia, the first to experiment with genetically engineered 1918 viruses, did all its work at BSL-3Ag. Meanwhile, Michael Katze at the University of Washington at Seattle is planning to expose monkeys to aerosols of 1918-type viruses at BSL-3, a step down from BSL-3Ag. The recent SARS escapes were from BSL-3 labs.

Some scientists are less worried, doubting that the bioengineered viruses being worked with are that big of a potential threat.

The head of the World Health Organization's global influenza program said he isn't certain that work on the virus needs to be restricted to the most secure facilities, but the agency would be open to hosting a forum on the issue.

"What we mustn't forget is that what they're working on is not the 1918 virus," Dr. Klaus Stohr cautioned in an interview from Geneva.

The 1918 flu epidemic (sometimes called the Spanish flu, though it probably didn't originate in Spain) killed somewhere between 20 and 40 million people. The estimates of mortality are broad because there are no reliable statistics on deaths in much of the world and death rates differed greatly from region to region. But an assumption that about 2% of the world's population died is not unreasonable. My guess is that an influenza outbreak that happened today with similar mortality rates would probably kill between 60 million and 150 million people. Of course in the more developed countries public health measures such as quarantines and protective gear would reduce the death rates considerably. Yet most people do not live in developed countries. Also, after the first season of the outbreak it might be possible to produce a vaccine that would halt it in at least parts of the world.
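The 60-to-150-million guess is just the 1918 mortality rate scaled to today's population. A quick sketch, using my own rough population estimates (about 1.8 billion people in 1918, about 6.4 billion now):

```python
# Scale the 1918 death toll to a modern population at the same mortality
# rate. The population figures are rough illustrative estimates.

def scaled_deaths(deaths_1918, pop_1918=1.8e9, pop_now=6.4e9):
    """Apply the 1918 mortality rate to a larger population."""
    mortality_rate = deaths_1918 / pop_1918
    return mortality_rate * pop_now

low = scaled_deaths(20e6)   # roughly 71 million
high = scaled_deaths(40e6)  # roughly 142 million
print(f"{low/1e6:.0f} to {high/1e6:.0f} million deaths")
```

With those population figures the 20-to-40-million historical range maps to roughly 71 to 142 million deaths today, which is where my 60-to-150-million range comes from once the uncertainty in the inputs is allowed for.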

Currently a building boom in BSL-3 and BSL-4 labs is underway mainly in reaction to the 9/11 terrorist attack and the anthrax attack that followed it.

Altogether, the nation's combined total of BSL-4 lab space, 1,689 square meters, "is extremely limited and obviously insufficient," says Stephen Morse, director of Columbia University's Center for Public Health Preparedness. To remedy such shortages, NIAID awarded grants of about $120 million each in October 2003 to help pay for new BSL-4 labs at Boston University and a second, much larger lab at UTMB. Together they will add 3,925 square meters, more than twice the current amount of space.

In addition, NIAID is planning to build two BSL-4 labs for its employees, one in Hamilton, Montana, (which is now undergoing an environmental assessment) and the other in a new National Interagency Biodefense Campus at USAMRIID. They will add 2,879 square meters, almost twice as much as exists now. In addition, the CDC plans to build 1,275 square meters of new BSL-4 space in Atlanta.

But all those labs take 5 to 10 years to become available.

We need for scientists to figure out what made the 1918 strain so deadly. Another influenza strain could mutate to that level of lethality at any time. Would conventional influenza vaccine development techniques work against the 1918 strain? We need to find out. So the research work is very much worth doing.

An important aside to this report is that we need more rapid techniques for making influenza vaccines. The recent contamination of 48 million doses of Chiron Corporation's influenza vaccine highlights the problem with the current method of making influenza vaccines in specially produced chicken eggs. Influenza vaccine production takes 6 months and cannot easily be scaled up to handle a known large-scale outbreak.

The quaint system of producing flu vaccine based on seasonal egg-laying has harsh implications for what would happen if new batches had to be made in a hurry to fight a super-strain pandemic. At best, it would take half a year.

We need very rapid techniques for producing flu vaccine.

It is only a matter of time before a new highly deadly flu strain arises that rivals or surpasses the 1918 pandemic in deadliness.

Monica Schoch-Spana, a senior fellow at the Center for Biosecurity of the University of Pittsburgh Medical Center, said the likelihood of an upcoming flu pandemic is "not a matter of if, but when."

Also see my previous post Sequencing Of 1918 Spanish Flu DNA Increases Risk Of Bioterrorism.

By Randall Parker 2004 October 28 04:20 PM  Dangers Natural Bio
Entry Permalink | Comments(9)
2004 October 27 Wednesday
Will Cruise Ships Become Old Age Nursing Homes?

Markets can take rather unexpected turns. The cruise ship market could grow by leaps and bounds if millions of retirees move permanently onto ships.

Living on a cruise ship is a feasible and cost-effective option to assisted living facilities, and the services offered on a cruise ship parallel — even surpass — what is provided in senior care facilities, according to a study in the November issue of the Journal of the American Geriatric Society.

“Offering many amenities, such as three meals a day with escorts to meals, physicians on site and housekeeping/laundry services, a cruise ship could be considered a floating assisted living facility,” said Lee Lindquist, M.D., instructor of medicine at Northwestern University Feinberg School of Medicine.

“Seniors who enjoy travel, have good or excellent cognitive function and require some assistance with activities of daily living are the ideal candidates for cruise ship care,” Lindquist said.

Lindquist, who is also an attending physician in the divisions of geriatric and general internal medicine at Northwestern Memorial Hospital, compared costs over a 20-year life expectancy after moving to assisted living facilities, nursing homes and a cruise ship, including costs of treating acute illness, Medicare reimbursement and other factors.

She found that the net costs of cruise ship living were only about $2,000 higher ($230,000 vs. $228,000) than those associated with the assisted living facilities but resulted in higher quality over the 20-year period.

Lindquist’s plan would include integration with regular passengers, with seniors selecting a cabin to inhabit as home during their prolonged cruise, whereas other passengers would disembark as usual.

I picture David Brin's Earth novel with all the old folks wearing video cameras tied to the net. They'd get off the ships and all the locals would complain that every time an old folks' ship docks there's just no privacy in town.

I do not understand the cost totals. The per-year costs for nursing homes run into six figures. Perhaps the numbers above are averages per year?

One might expect a bigger price gap. But think about how the competitive environment differs for cruise ships versus nursing homes. Most nursing homes do not compete in a national market, let alone an international one, whereas each cruise ship probably faces many more competitors than a nursing home does. Also, because most cruise ship passengers do not stay on board very long and return business is important, cruise ships have to appeal to a much larger number of people to keep each cabin filled, and they need to satisfy each passenger to win repeat business.

Is there any way to extend on this idea to make medical care provision more competitive? Imagine surgery ships that ply a long coast all competing to provide the cheapest, safest, most comfortable, and effective hip replacements or knee replacements or plastic surgeries. No need to travel to Beverly Hills to get the best. If it is elective surgery you seek then you could just wait for the ship to dock that you believe has the best combination of reputation, service, and cost.

Update: Here is some comparative yearly data from The Economist:

A year in an “assisted-living facility” costs Americans, on average, around $28,500 a year. In large cities such as Chicago, costs are even higher, topping $40,000. Living in a dedicated cabin aboard the Royal Caribbean's Majesty of the Seas, on the other hand, rings in at a rather competitive $33,260 a year.
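Multiplying these yearly figures out over 20 years gives totals far above the $228,000-$230,000 quoted from the study, so the study's totals are presumably net of the Medicare reimbursement and other offsets it factored in. A quick check:

```python
# Gross 20-year totals implied by The Economist's yearly figures.
# These far exceed the study's net totals of $228,000-$230,000,
# which presumably subtract Medicare reimbursement and other offsets.

assisted_living_per_year = 28_500
cruise_ship_per_year = 33_260
years = 20

print(assisted_living_per_year * years)  # $570,000 gross for assisted living
print(cruise_ship_per_year * years)      # $665,200 gross for the cruise ship
print((cruise_ship_per_year - assisted_living_per_year) * years)  # $95,200 gap
```

Note that the gross yearly gap compounds to about $95,200 over 20 years, far more than the $2,000 net gap the study reports, which is another reason I suspect the study's totals involve substantial offsets.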

Update II: In the comments below Ted argues quite plausibly that the costs of the dedicated cabin on the ship come in at a low price point because people on the ship are pretty healthy. Make them all sick enough to require assisted living help and suddenly costs would balloon. Perhaps so. Though I wonder what the labor costs are on a ship. Do they use foreign workers who make less than the US minimum wage?

By Randall Parker 2004 October 27 10:16 PM  Aging Population Problems
Entry Permalink | Comments(28)
UK Researchers Find Workable Hydrogen Storage Nanomaterial

Some British researchers have found a way to store hydrogen without sustained high pressures.

A team from the Universities of Newcastle upon Tyne and Liverpool in the UK, who report their findings in the prestigious academic journal Science, have found a safe way of storing and releasing hydrogen to produce energy. They do this using nanoporous materials, which have tiny pores one hundred-thousandth (1/100,000th) the thickness of a sheet of paper.


The Liverpool and Newcastle researchers have found a workable method of injecting the gas at high pressure into the tiny pores - about ten to the minus nine metres (one nanometre) across - in specially-designed materials to give a dense form of hydrogen. They then reduce the pressure within the material in order to store the captured hydrogen safely. Heat can be applied to release the hydrogen as energy, on which a car could potentially run.

Professor Mark Thomas, of Newcastle University's Northern Carbon Research Laboratories in the School of Natural Sciences, a member of the research team, said:

"This is a proof of principle that we can trap hydrogen gas in a porous material and release it when required. However, if developed further, this method would have the potential to be applied to powering cars or any generator supplying power. Although hydrogen-powered cars are likely to be decades away, our discovery brings this concept a step towards becoming reality.

"Now that we have a mechanism that works, we can go on to design and build better porous framework materials for storing hydrogen, which may also be useful in industries that use gas separation techniques."

Professor Matt Rosseinsky, of the University of Liverpool's Department of Chemistry, said "Our new porous materials can capture hydrogen gas within their channels, like a molecular cat-flap.

"After allowing the hydrogen molecule – the 'cat - in, the structure closes shut behind it. The important point is that the hydrogen is loaded into the materials at high pressure but stored in them at a much lower pressure - a unique behaviour. This basic scientific discovery may have significant ramifications for hydrogen storage and other technologies that rely on the controlled entrapment and release of small molecules."

The ability to store hydrogen at high density but under low pressure without extreme cooling is the holy grail for making hydrogen storage in cars practical. But this one result probably doesn't solve that problem. The nanotech material used might be very expensive to manufacture (as is presently the case with many nanotech materials such as nanotubes). Or it might not work over a wide range of environmental conditions. Or it might not work over hundreds of recharges. Still, this report is reason to be hopeful that hydrogen storage is a solvable problem. My guess is that nanotechnology approaches will be where the solutions are found. This report is therefore a step in the right direction.

By Randall Parker 2004 October 27 05:18 PM  Energy Tech
Entry Permalink | Comments(5)
Word Memory Shifts From Sound To Meaning As We Age

Somewhere between the ages of 5 and 11, human minds shift from associating words primarily by similarity of sound to associating them primarily by similarity of meaning.

The study found evidence of an age-related, developmental shift in language, suggesting that younger children process words primarily on the basis of phonology, or sound, while older children and adults process words primarily on the basis of semantics, or meaning. The findings are presented in the article "False Memories in Children: Evidence for a Shift from Phonological to Semantic Associations," by Steve Dewhurst and Claire Robinson of Lancaster University, United Kingdom. The article will be published in the November issue of Psychological Science, a journal of the American Psychological Society.


To test whether children would make similar memory errors based on sound rather than semantics, the researchers used a version of this earlier experiment. They developed a list of words in which each word had at least one possible rhyme, then presented the list to children aged five, eight, or 11, who were asked to recall the words after hearing them. The results suggested a developmental correlation between age and language processes: The 11-year-olds performed in the same way as adults and falsely recalled words that were semantically related to the lists; the 8-year-olds were equally likely to falsely recall rhymes and semantic associates; and the 5-year-olds falsely recalled words that rhymed with those presented in the lists.

I think adults tend to forget and underestimate the intellectual difficulty of being a child because adults do not realize just how much contextual information they have built up and rely upon. For a child so much more of daily experience is novel and has no framework through which it can be sorted and organized.

For another demonstration of how minds develop better ways of classifying information as they age see my recent post Ferret Visual Cortex 80% Active Even In The Dark.

By Randall Parker 2004 October 27 04:11 PM  Brain Development
Entry Permalink | Comments(3)
2004 October 25 Monday
On Lies And Science Policy

The Bush and Clinton Administrations differ in their styles of lying.

Andrew G. Keeler, who until June 2001 was on the president's Council of Economic Advisers and has since returned to teaching at the University of Georgia, said the Clinton administration had also played with economic calculations of the costs of curbing carbon dioxide emissions, in its case to show that limiting emissions would not be expensive.

But it made available all of the assumptions that went into its analysis, he said; by contrast, the Bush administration drew contorted conclusions but never revealed the details.

"The Clinton administration got these lowest possible costs by taking every assumption that would bias them down," he said. "But they were very clear about what the assumptions were. Anybody who wanted to could wade through them."

This illustrates why I have a hard time feeling enthusiastic about major political figures. The Clinton Administration, personifying the very outgoing and brazen nature of its leader, was willing to lie in detail in public (yes, arbitrarily choosing every unprovable assumption to tip an argument in your favor is brazen lying). By contrast, the Bush Administration prefers to make its lies to the public in the form of simpler summary conclusions which seem aimed at shutting off discussion by providing little to discuss. In the first instance the advantage for critics of the Clintonites was of course that one could challenge each of the individual assumptions that went into building the big lie product. But it is as if the Clinton Administration operated under a "dishonesty in labelling" law (as distinct from a "truth in labelling" law) where they revealed all their deceptive ingredients. There is something more brazen about the Clinton Administration's choice because a detailed lie is a larger-scale effort that requires more work to produce. More people have to agree to lie when the lie is going to be a detailed economic or ecological model.

Detailed lies remind me of how Spock would tell Captain Kirk some impossibly precise number (Spock: It is difficult to be precise, Captain. I should say approximately 7824.7 to one.) to give the illusion of having greater knowledge about a matter than it was possible to have. Perhaps in Star Trek this was acceptable since it was fiction. But the fact that the deception created an illusion in the minds of many audience members demonstrates that the technique works. The offering of elaborate details and great mathematical precision in results can be (and too often is) used as a technique for deception.

By contrast, the Bush Administration just asserts that its announcements of the truth are miraculously what makes their preferred choices the best choices. Is this worse? The downside is that it provides no basis from which to start arguing their conclusions. It tends to discourage public scrutiny of government decisions and it amounts to a simple assertion of "trust me". It is an approach that probably has the effect of reducing the amount of time the public spends thinking about public policy issues. Or perhaps it just causes a shifting to other policy topics as the public spends less time thinking about public policy issues the government doesn't want to have attract so much attention.

But which approach allows for a greater level of deception? Which is more effective? Is the human mind more easily fooled by simple lies or by complex lies? Perhaps it depends on the mind. Perhaps the deceptions of the Bush Administration are, at least on average, being pitched to a different target demographic group or audience than the Clinton Administration's deceptions were aimed at.

Of course the government has no monopoly on public policy deception. Various factions fool themselves and others into believing they are the virtuous ones presenting the real truth of the matter on some complex issue of policy. The actual act of debating some policy issue - even with the most honest of intentions - inevitably ends up being deceptive in some manner. One has to select what one thinks to be relevant facts (and hopefully correct facts) to present. That act of selection can cause one to deceive both oneself and others.

On the bright side technological trends strike me as favoring more accurate public policy discussions on issues involving science. We can so much more easily find information because of the ever improving world wide web and search engines. Anyone who Googles and reads the better web logs regularly can become far better informed on some issue than was possible even a few years ago. One can read multiple news stories from different sources on the same subject. One can go back to more original sources from which news stories are written. One can even contact scientists and other figures and ask for clarifications whereas previously only journalists could do that.

My sense of how things are going is that the quality of available information is improving and it is becoming easier to get better informed and less partisan analysis on any topic. Though there is still the challenge of how to find the best people on each topic.

By Randall Parker 2004 October 25 05:16 PM  Policy Science
Entry Permalink | Comments(7)
Conservation With Colder Winter Offices Does Not Pay

Lowering the setting of an office thermostat during the winter is a false economy.

ITHACA, N.Y. -- Warm workers work better, an ergonomics study at Cornell University finds.

Chilly workers not only make more errors but cooler temperatures could increase a worker's hourly labor cost by 10 percent, estimates Alan Hedge, professor of design and environmental analysis and director of Cornell's Human Factors and Ergonomics Laboratory.

When the office temperature in a month-long study increased from 68 to 77 degrees Fahrenheit, typing errors fell by 44 percent and typing output jumped 150 percent. Hedge's study was exploring the link between changes in the physical environment and work performance.

"The results of our study also suggest raising the temperature to a more comfortable thermal zone saves employers about $2 per worker, per hour," says Hedge, who presented his findings this summer at the 2004 Eastern Ergonomics Conference and Exposition in New York City.

In the study, which was conducted at Insurance Office of America's headquarters in Orlando, Fla., each of nine workstations was equipped with a miniature personal environment-sensor for sampling air temperature every 15 minutes. The researchers recorded the amount of time that employees keyboarded and the amount of time they spent making error corrections. Hedge used a new research approach employing software that can synchronize a specific indoor environmental variable, in this case temperature, with productivity.

"At 77 degrees Fahrenheit, the workers were keyboarding 100 percent of the time with a 10 percent error rate, but at 68 degrees, their keying rate went down to 54 percent of the time with a 25 percent error rate," Hedge says. "Temperature is certainly a key variable that can impact performance."

One lesson of this study is that conservation should be done with better technology (e.g. better insulation) and not by making people suffer more extreme variations in temperature.

What is even more interesting here is the idea that there must be some optimal room temperature for productivity. Does anyone know whether psychometric studies of human intelligence have been conducted under a range of environmental conditions? Is there an optimal room temperature range for IQ? If so, what is that range?

By Randall Parker 2004 October 25 02:29 PM  Brain Enhancement
Entry Permalink | Comments(5)
2004 October 24 Sunday
Obesity Being Selected For In Modern Society?

Here is more evidence that Darwinian natural selection has not stopped operating on humanity as a result of medical advances and rising living standards. Lee Ellis of Minot State University in North Dakota and his student Dan Haman have just published a research paper providing evidence that natural selection is currently selecting for fatter people.

This study sought to determine if genetic factors might be contributing to the increases in the proportions of North Americans who are obese and overweight. The body mass index (BMI) for a large sample of two generations of United States and Canadian subjects was correlated with family fertility indicators. Small but highly significant positive correlations were found between the BMIs of family members and their reproduction rates, especially in the case of women. For instance, mothers in the sample (most of whom were born in the 1940s and 50s) who were in the normal or below normal range had an average of 4.3 siblings and 3.2 children, compared with 4.8 siblings and 3.5 children for mothers who were overweight or obese. When combined with evidence from twin and adoption studies indicating that genes make substantial contributions to obesity, this study suggests that recent increases in obesity are partially the result of overweight and obese women having more children than is true for average and underweight women.

Ellis and Haman speculate that medical advances are allowing obese and diabetic women to live longer to have more children. But that does not explain why overweight women would have more children than skinny women.

So what is going on here? There are a number of possibilities.

One possibility is that one or more of the many hormones released by fat cells are altering the brain, making women (or their spouses) more eager or more able to have children (possibly by causing girls to enter puberty at an earlier age), more eager to find a mate, or more likely to engage in other behavior that increases reproduction. The hormones from fat cells might even be increasing fertility.

The scientific view of fat cells has changed a lot in recent years and fat cells are now seen as exerting many influences on the rest of the body. There are plenty of hormones released by adipose tissue into the bloodstream.

“When we look at fat tissue now, we see it’s not just a passive depot of fat,” says Dr. Rudolph Leibel of Columbia University. “It’s an active manufacturer of signals to other parts of the body.”

The first real inkling that fat is more than just inert blubber was the discovery 10 years ago of the substance leptin. Scientists were amazed to find that this static-looking flesh helps maintain itself by producing a chemical that regulates appetite.

Roughly 25 different signaling compounds — with names like resistin and adiponectin — are now known to be made by fat cells, Leibel estimates, and many more undoubtedly will be found.

Another possibility is that the obesity is a side effect of a higher fat diet that also boosts hormones and thereby makes girls more fertile or eager to have sex or to have children. This is plausible because a higher fat diet in adolescence raises sex hormones and causes other endocrine changes.

Joanne F. Dorgan, Ph.D., of the Fox Chase Cancer Center in Philadelphia, and her colleagues conducted a study ancillary to the Dietary Intervention Study in Children to examine whether diet influences sex hormone levels during adolescence. The study involved 286 girls ages 8 to 10 who were randomly assigned to a low-fat dietary intervention group or to a group receiving usual care (e.g., educational materials available to the public). The researchers measured blood sex hormone levels at the start of the study and 1, 3, 5, and 7 years later.

After 5 years, girls in the intervention group had 29.8% lower estradiol, 30.2% lower non-sex hormone binding globulin-bound estradiol, 20.7% lower estrone, and 28.7% lower estrone sulfate levels during the first half of their menstrual cycles, and 27.2% higher testosterone levels during the second half of their menstrual cycles, compared with girls in the usual care group. After 7 years, girls in the intervention group had half the progesterone levels during the second half of their menstrual cycles as did girls in the usual care group.

All those hormonal differences must be having some effects on the brain and on the reproductive organs.

Another related possibility comes from the fact that the leptin hormone acts early in life to change the brain to lower inhibition from eating and might also lower the inhibition against having sex. This is just speculation on my part of course. But it is at least plausible.

Another possibility is that obesity is negatively correlated with intelligence and that it is lower intelligence that is responsible for the higher fertility. Of course there are quite smart fat people and plenty of dumb skinny people. But it seems likely that smarter people are, again on average, doing a better job than dumber people of consciously choosing foods and restricting foods in order to manage their weight. Well, if that is the case then selection for obesity may be coming as a side-effect of the existing selective pressures that are obviously selecting against higher levels of intelligence. My guess is that this possibility explains part of the difference in fertility between overweight and skinny people. However, the difference in fertility as a function of educational attainment (which is a decent though not perfect proxy for IQ) appears smaller than the difference as a function of weight. So IQ is probably only one contributing factor.

Another possibility is that overweight people have lower expectations about what they can achieve in looking for a mate and therefore they more quickly decide someone they have found is good enough to settle for. Therefore they start having children sooner and have more children than those who hold out for better mates. Skinnier people probably (and again on average) believe they have a wider selection of choices and may be willing to wait longer to hold out for a still better choice. The delay that comes from waiting may cause them to delay reproduction and therefore reduce the number of children that women in particular can hope to have. Some women wait so long that by the time they are ready to try for a kid they are not even able to start a pregnancy.

Update: Are obese people less bright to begin with? Does lower intelligence cause the risk of obesity to increase? Also, does the direction of causation also run in the opposite direction? Does the presence of obesity cause intellectual abilities to decay? Some researchers think obesity appears to interfere with cognitive function.

Elias, a member of the Statistics and Consulting Unit of BU’s Department of Mathematics and Statistics, and his co-investigators at the Framingham Heart Study are the first to show that long-term, early-onset obesity is an independent risk factor to cognitive dysfunction. This knowledge should help inform physician–patient decisions to treat this physical condition.


Analyses of these data by the researchers found that the combination of obesity and hypertension showed a statistically significant association with the cognitive functioning of men, but not of women. Among late middle-aged and elderly men, obesity and hypertension were associated with lowered cognitive functioning. Among all men, the effects of obesity and hypertension were found to be cumulative, with cognitive functioning lowered more when both conditions were present than when one or neither was a factor. The researchers speculated that obesity and hypertension may have similar physiological “paths” by which they affect cognitive functioning and that the different distribution of fat on men and women may help to explain the adverse effects of obesity in men compared to women.

Stay skinny for your brain. Stay skinny for your heart. Do it to reduce the odds of getting cancer too.

By Randall Parker 2004 October 24 05:33 PM  Trends, Human Evolution
Entry Permalink | Comments(19)
2004 October 22 Friday
Coke And Pepsi Advertising Effects Measurable In Brain Scans

Samuel M. McClure, now at Princeton University, Jian Li at Baylor College of Medicine, and a number of colleagues at Baylor have found that brand preferences are measurable using functional magnetic resonance imaging (fMRI) brain scans.

The preference for Coke versus Pepsi is not only a matter for the tongue to decide, Samuel McClure and his colleagues have found. Brain scans of people tasting the soft drinks reveal that knowing which drink they're tasting affects their preference and activates memory-related brain regions that recall cultural influences. Thus, say the researchers, they have shown neurologically how a culturally based brand image influences a behavioral choice.

These choices are affected by perception, wrote the researchers, because "there are visual images and marketing messages that have insinuated themselves into the nervous systems of humans that consume the drinks."

Even though scientists have long believed that such cultural messages affect taste perception, there had been no direct neural probes to test the effect, wrote the researchers. Findings about the effects of such cultural information on the brain have important medical implications, they wrote.

Advertising may be contributing to the obesity epidemic.

"There is literally a growing crisis in obesity, type II diabetes, and all their sequelae that result directly from or are exacerbated by overconsumption of calories. It is now strongly suspected that one major culprit is sugared colas," they wrote.

My prediction: Some day people will be able to elect to be put under brain scanners and shown a series of advertising images to discover which advertisers have done the best job of programming them to like their products. Then some drug combination or other therapy will be available to deliver in conjunction with an image of some product to cause the cancellation of the neural pattern that makes one favor that product.

Besides the health implications of studying soft drink preference, the researchers decided to use Coke and Pepsi because--even though the two drinks are nearly identical chemically and physically--people routinely strongly favor one over the other. Thus, the two soft drinks made excellent subjects for rigorous experimental studies.

In their study, the researchers first determined the Coke versus Pepsi preference of 67 volunteer subjects, both by asking them and by subjecting them to blind taste tests. They then gave the subjects sips of one drink or the other as they scanned the subjects' brains using functional magnetic resonance imaging (fMRI). In this widely used imaging technique, harmless magnetic fields and radio signals are used to measure blood flow in regions of the brain, with such flow indicating brain activity levels. In the experiments, the sips were preceded by either "anonymous" cues of flashes of light or pictures of a Coke or Pepsi can.

The experimental design enabled the researchers to discover the specific brain regions activated when the subjects used only taste information versus when they also had brand identification. While the researchers found no influence of brand knowledge for Pepsi, they found a dramatic effect of the Coke label on behavioral preference. The brand knowledge of Coke both influenced their preference and activated brain areas including the "dorsolateral prefrontal cortex" and the hippocampus. Both of these areas are implicated in modifying behavior based on emotion and affect. In particular, wrote the researchers, their findings suggest "that the hippocampus may participate in recalling cultural information that biases preference judgments."

The researchers concluded that their findings indicate that two separate brain systems--one involving taste and one recalling cultural influence--in the prefrontal cortex interact to determine preferences.

The ventromedial prefrontal cortex gets programmed by advertising.

We delivered Coke and Pepsi to human subjects in behavioral taste tests and also in passive experiments carried out during functional magnetic resonance imaging (fMRI). Two conditions were examined: (1) anonymous delivery of Coke and Pepsi and (2) brand-cued delivery of Coke and Pepsi. For the anonymous task, we report a consistent neural response in the ventromedial prefrontal cortex that correlated with subjects' behavioral preferences for these beverages. In the brand-cued experiment, brand knowledge for one of the drinks had a dramatic influence on expressed behavioral preferences and on the measured brain responses.

They found that the knowledge of the Coke brand exerted a more powerful effect upon the brain than knowledge of the Pepsi brand. Given that Coke is the bigger seller, that is to be expected. Dr. Read Montague, director of the Brown Human Neuroimaging Lab at Baylor, said the brain scans allowed him to predict preference before a sip was taken.

Functional magnetic resonance imaging allowed Montague to predict fairly accurately which people preferred Coke or Pepsi before they even took a sip.

“We were stunned by how easy this was,” Montague said. “I could tell what they were going to do by looking at their brain scans.”

A large portion of the market value of Coca-Cola is the result of patterns of neural network connections which Coke advertising has created in hundreds of millions of people.

Surely there are similar neural phenomena causing national loyalties, religious loyalties, and other preferences.

By Randall Parker 2004 October 22 04:04 PM  Brain Conditioning
Entry Permalink | Comments(18)
2004 October 21 Thursday
Arrogant Scientists And What Is A Rights-Possessing Being?

William Happer, a professor of physics at Princeton University and a George W. Bush supporter, says a lot of scientists are too stuck on their own intellectual superiority. (The magazine The Scientist requires free registration, which is well worth the time to sign up for.)

Happer, a member of the Homeland Security Science and Technology Advisory Panel, suggested that the charges from the UCS and Nobel Laureates are largely overblown and out of context. He said that some scientists, who've garnered a sort of "deity complex" based on their scientific achievements, take their role to be akin to Plato's "philosopher kings," wise advisors who would tell citizens how to live. "They're extremely upset when the Bush administration doesn't call in the philosopher kings to be told what to do," he said.

When I hear some of the scientists who are angry at Bush Administration restrictions on embryonic stem cell research, part of my reaction is that the scientists seem opposed to the idea that anyone besides scientists should be able to decide what is ethical in the areas where scientists work. There is something ultimately arrogant and condescending about their rhetoric. They can't see how anyone can legitimately disagree with them. Yet we face serious ethical issues with the ability to manipulate cells that have the potential to develop into full humans. It strikes me as immature to expect the public to all just jump and shift to the position held by most stem cell researchers just because the stem cell researchers are experts. Should we have a society which is ruled by experts?

Leave aside what you personally feel about ethical questions related to embryonic stem cell research. Look at the case of murder. Almost everyone agrees that murder is a bad thing and it should be outlawed. By contrast, there are sharp divisions in America and in many European countries over abortion. Why? It is hard to draw the line on what is a human now that we can intervene in areas we never had the ability to intervene in before. What principles should we use to guide us in making those decisions? Most scientists arguing for allowing the use of embryonic stem cells do not even try to provide an answer to that question. They just rag on Bush and those supposedly horrible fundamentalist Christians.

Religious folks do attempt to provide an answer for why they are opposed to both abortion and embryonic stem cell therapies: They think these procedures and manipulations kill spirits. Now, do we have spirits? Heck if I know. Hope so. Doubt it too. Is there a God who has set absolute rules for right and wrong? The scientists have no better idea of the answer to that question than do priests and pastors. Science is throwing up all sorts of cases where we have to decide on right and wrong where we never had to before because we couldn't create the conditions that produced the ethical problem in the first place.

It is a strain on the public to be faced with so many ethical issues on matters of such gravity. Scientists need to recognize this and to show some patience. That some people (whether for religious or non-religious reasons) tend to take a more expansive view of what is a human than some scientists desire is not a bad thing even if those people are wrong. Would you rather live in a society where the populace tends to draw too small a circle around what is human? Or would you rather live in a society where people err in the direction of greater protection? It is exceedingly unlikely you are going to live in a society where people make their judgements about rights with perfect precision and infinite wisdom. We are only humans after all.

It seems to me that a government has to be legitimate in the minds of its people and that legitimacy has to rest on a widely held set of beliefs about what is right and what is wrong. That need to come up with a consensus on moral questions cannot always be avoided by the oft-made claim that we can sidestep the need for consensus by letting each person decide whether, say, to make use of the ability to have an abortion or to use embryonic stem cells for therapy. The reason is that the debate is over the question of what is a rights-possessing entity. The answer to that question is by no means obvious. We hold now that babies are from the moment of birth rights-possessing entities. Killing a baby is murder. But back in the Roman Empire that was not the case. Down through time there have been many changes on where to draw the lines on what is a human and on what rights humans possess under different circumstances. So there is no obvious self-evident truth on what is murder or what is a human.

Scientific advances are going to create new situations to debate on where to draw the line and also provide information that will affect how we define where to draw the line. But science by itself can not provide ethical answers. Arrogant condescending assertions of what is right and wrong by academic biomedical researchers are no more helpful than similar assertions by their opponents. One can be a reasonable and well-informed person and disagree with either side. One can even be reasonable and well-informed and be deeply ambivalent about many of the questions that are arising as a result of biomedical advances.

What is a human? The stakes are incredibly high for how we answer that question. The stakes matter for not just abortion and embryonic stem cell research but also for genetic engineering of children, genetic engineering to increase the intelligence of other species, the development of artificial intelligence and human-computer interfaces, and the ability to keep alive brain-dead or extremely cognitively decayed humans. Most of the ramifications of the various choices we could make are hard to guess at. But at least some of those choices would be disastrous in my view. For instance, imagine if we granted full rights to any entity that can pass a Turing test. This would likely be an appealing criterion for some scientists even though most people don't even know what a Turing test is. But then genetically engineered psychopaths would be free to prey on people until they were caught committing a crime.

So far secular scientists have not advanced a compelling non-religious basis for deciding what is a human. Carl Sagan suggested drawing the line (if memory serves) at the point at which the cerebral cortex begins development. His argument was that the frontal lobes of the brain are what makes human minds unique. But most scientists do not try to engage the question at that level. They just assert that of course any reasonable person could not possibly believe that a single cell deserves legal protection.

What I find more worrying about this state of the debate is not the arrogance of scientists. The bigger problem in the longer run comes from the intellectual demands that will be placed on anyone trying to judge whether some product of science really is something we want to recognize as sufficiently human-like to deserve protection as a human. The level of cognitive ability and education needed even to understand the reasoning behind arguments for some moral positions about what is a human and what is a rights-possessing entity will be so great that those arguments will be inaccessible to a substantial fraction of the population. How can we have a moral consensus on the legitimacy of crucial laws regarding what is murder and what is a human life if the population can't even understand the laws and their justifications?

By Randall Parker 2004 October 21 03:06 PM  Policy Science
Entry Permalink | Comments(28)
2004 October 20 Wednesday
Carbon Coating Hiding Many Dangerous Comets?

Bill Napier, Chandra Wickramasinghe, and Wickramasinghe's daughter Janaki, a Cardiff University student, have proposed that there may be hundreds of comets in orbit around the Sun that are so dark that optical methods will fail to detect them before they collide with Earth.

Napier worked with Chandra Wickramasinghe, an astronomer at Cardiff University in Wales, to explain the comets' invisibility. Wickramasinghe has suggested that Sedna, the most distant body identified in our Solar System, could have an orbiting twin that is dark, fluffy and made of tarry carbon compounds (see "Sedna 'has invisible moon'").

As Sedna may be a member of the Oort cloud, Napier thinks that other members of the cloud could be equally dark. Once ejected, the tarry comets would simply suck up visible light, he says, remaining cloaked in darkness. "Photons go in, but they don't come out."

Infrared telescopes may be able to detect dark comets.

Because dark matter emits little light it will be invisible to optical telescopes, but it might emit infrared radiation and be able to be picked up by infrared telescopes.

As yet no such object has definitely been found in the solar system. But Prof Wickramasinghe believes that if there is one, there may well be hundreds, lurking beyond the outer planets of Neptune and Pluto.

Such comets may require defenses that can deflect them within fairly short distances from Earth.

Here we demonstrate that the surfaces of inactive comets, if composed of loose, fluffy organic material like cometary meteoroids, develop reflectivities that are vanishingly small in visible light. The near-Earth objects may therefore be dominated by a population of fast, multi-kilometre bodies too dark to be seen with current near-Earth object surveys. Deflection strategies that assume decades or centuries of warning before impact are inapplicable to this hazard.

If it were my decision to make I'd divert NASA money from either manned programs or space probe programs toward the detection of objects in orbit around the Sun that are dangerous enough to kill a lot of humans, and toward the development of methods for diverting such objects away from collision courses with Earth. What would be going through your mind if you had just heard on the radio a report that we were all going to die tomorrow due to a massive asteroid or comet just discovered to be on a collision course with Earth? I'd be thinking that we were total fools and idiots for failing to develop defenses against such a threat.

By Randall Parker 2004 October 20 03:51 PM  Dangers Natural General
Entry Permalink | Comments(5)
Proposal To Make Child With Three Genetic Parents

Some British scientists have proposed the use of in vitro fertilization techniques that will create a child with 3 genetic parents where one of the parents donates only mitochondrial DNA (mtDNA).

The research application from Doug Turnbull and Mary Herbert at the University of Newcastle will be decided upon by the UK's regulatory body, the Human Fertilisation and Embryology Authority, over the next few weeks. The procedure would involve fertilising a woman's egg by in-vitro fertilisation outside the body and transplanting the fertilised nucleus to an egg from another woman which has had its nucleus removed.

The resulting baby would have mitochondrial DNA from the woman who donated the egg. The nuclear DNA would be an even split between the two original parents who provided and fertilized the first egg. Since the human nucleus has about 2.9 billion DNA letters and the mitochondrial DNA has only 16,569 DNA letters, the amount of DNA contributed by the egg donor would be extremely small. Those 16,569 letters code for 13 genes that are involved in the mitochondrion's breaking down of sugar to produce the energy molecules NADH and ATP.
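A quick back-of-the-envelope computation, using only the figures quoted above, shows just how small the egg donor's contribution would be:

```python
# Fraction of the total DNA letters contributed by the mtDNA donor,
# using the figures cited above: ~2.9 billion nuclear letters,
# 16,569 mitochondrial letters.
nuclear_bp = 2_900_000_000
mito_bp = 16_569

fraction = mito_bp / (nuclear_bp + mito_bp)
print(f"mtDNA donor contributes {fraction:.8f} of the letters")
print(f"i.e. about 1 part in {round(1 / fraction):,}")
```

By letter count the egg donor supplies well under a thousandth of a percent of the child's DNA, which is why the technique is usually described as producing a child with "two-and-a-bit" genetic parents.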

The chief value of this technique is that it would allow women who have diseases caused by harmful mitochondrial DNA mutations to have offspring that do not suffer from their mother's mutation. Also, there is evidence that mitochondrial DNA variations have an influence on life expectancy. So one can imagine future parents wanting to select mtDNA that would add one or two decades to their child's life expectancy.

In a broader context this is one step down a much longer road where children will be born who have many genetic parents. In the future with more advanced techniques for manipulating cells (perhaps using microfluidics) the 23 pairs of individual nuclear chromosomes that make up a single cell's DNA complement could be taken from different people to combine in the nucleus of a single embryonic cell. That cell could then develop into a full adult. Once it becomes possible to extract and insert individual chromosomes the nuclear DNA for a single embryo could be built using chromosomes taken from 46 different people.

The ability to combine chromosomes from lots of different people is one of the ways that people will create kids who combine many of the most desired features into individual people. This will have the effect of speeding up human evolution: as desired features are more rapidly selected for, less desired features will be just as rapidly selected against.

In Western societies I expect women will be doing most of the selecting of DNA donors. In some other societies men will be doing more of the selecting. Given the differences in male and female ideals and the differences in ideals between societies it seems reasonable to expect a greater divergence in the genetically determined and influenced characteristics in people in different parts of the world. Though perhaps in some qualities there will be a convergence as, for instance, blond hair and blue eyes are popular in so many places.

My guess is that higher IQ will be universally popular. But in other cognitive characteristics I expect to see divergences between populations. For example, not all populations will place equal value on introversion versus extroversion. Similarly, I expect to see differences between societies in choices for genetic variations that influence the tendency to be faithful in marriage. Some societies will want more masculine men or feminine women than other societies.

By Randall Parker 2004 October 20 02:48 PM  Biotech Reproduction
Entry Permalink | Comments(4)
2004 October 19 Tuesday
Air Pollution From Trees Increasing Rapidly

A number of factors have combined to increase volatile organic compounds (VOCs) air pollution from trees faster than VOC pollution from humans has declined.

They calculated that vegetal sources of monoterpenes and isoprene rose by up to 17% from the 1980s to the 1990s – equivalent to three times the industrial reductions.

The three major contributing factors are the natural reversion of abandoned farm land to forested land, the invasion of sweetgum trees, and the growth of large forests of pine trees for lumber.

Princeton University postdoc Drew Purves got to the bottom of the tree pollution problem.

Further studies at Princeton and the federal Geophysical Fluid Dynamics Lab at Princeton are using sophisticated computer models to estimate the changes in ozone caused by the changes in tree-produced VOCs. Purves noted that interactions between VOCs, NOx and ozone are complex -- some may actually lower pollution -- so it would be premature to base environmental policy on studies of VOCs alone.

Purves, a postdoctoral fellow, wrote the article in collaboration with Stephen Pacala, professor of ecology and evolutionary biology at Princeton, as well as John Casperson of the University of Toronto, Paul Moorcroft of Harvard University and George Hurtt of the University of New Hampshire. The article is scheduled to be published later this fall in the journal Global Change Biology.

The scientists conducted the study by analyzing data collected by the U.S. Forest Service, which measured and cataloged 2.7 million trees on 250,000 plots of land across the country. They calculated the VOC emissions for each tree and each plot and used their findings to map VOC levels nationally. The scientists compared survey data taken in the 1980s with those taken in the 1990s to determine how levels were changing over time.
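The aggregation step described above can be sketched simply. Everything here (species names, emission factors, plot compositions) is hypothetical and invented for illustration; the real study used measured data for 2.7 million trees on 250,000 plots:

```python
# Per-plot VOC emissions aggregated from per-tree figures, then
# compared across survey decades. All numbers are hypothetical.
EMISSION_FACTOR = {  # hypothetical ug VOC per gram leaf biomass per hour
    "sweetgum": 35.0,
    "loblolly_pine": 25.0,
    "oak": 30.0,
    "maple": 2.0,
}

def plot_emissions(trees):
    """trees: list of (species, leaf_biomass_g) tuples for one plot."""
    return sum(EMISSION_FACTOR[species] * biomass for species, biomass in trees)

# One hypothetical plot surveyed in both decades: slow-growing hardwoods
# replaced by fast-growing sweetgum and pine.
plot_1980s = [("maple", 900.0), ("oak", 400.0)]
plot_1990s = [("sweetgum", 700.0), ("loblolly_pine", 800.0)]

e80, e90 = plot_emissions(plot_1980s), plot_emissions(plot_1990s)
print(f"change in plot VOC emissions: {100 * (e90 - e80) / e80:+.0f}%")
```

Summing such per-plot changes over the whole survey, weighted by plot area, is what yields the national-scale trend the study reports.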

They found that areas where farmland has been abandoned during the last century have early generations of trees that produce higher levels of VOCs than older growth forests. In the South, pine plantations used for their fast-growing supplies of timber have proven to be havens for sweetgum trees, which are major producers of VOCs. Indeed, virtually every tree that grows fast -- a desirable quality for forestry production -- is a heavy emitter of VOCs.

"It's just one of those biological correlations," said Purves. "What you want is a fast-growing tree that doesn't produce a lot of VOCs, but that doesn't seem to exist."

The truth is plain to see: Nature is dangerous and needs to be brought under greater human control so that we can have a safer and cleaner environment. This shouldn't be surprising. After all, where does typhoid come from? Nature. Where does the Ebola virus come from? Nature. Where do cholera, diphtheria, malaria, and tuberculosis come from? Or tidal waves? Earthquakes? Rattlesnakes? Tornadoes? Floods? Avalanches? Black widow spiders? The asteroids that probably wiped out the dinosaurs? You already know the answer to all those questions. And what about air polluting volcanoes? They aren't operated by the petroleum industry.

Consider the irony for construction: if you build with concrete, steel, plastic, and other less natural materials you reduce the demand for lumber, and therefore fewer polluting trees will be planted.

Polluting trees also call into question the idea of using various kinds of biomass as energy sources. If we grow more stuff then that disgusting and dirty (hey, plant roots have dirt all over them) plant matter is going to release all kinds of pollutants into the atmosphere.

The findings also could raise questions about potential strategies for developing "green" fuels. One idea for cutting greenhouse gas emissions is to create "biofuels" from renewable tree plantations; however, these plantations may lead to increased ozone levels, the authors note.

What to do? Technology can provide the answer: plants used for biomass and trees grown for lumber need to be genetically reengineered to be less polluting. If better engineering designs can make cars less polluting then why can't better engineering clean up trees and other natural polluters as well?

Ronald Reagan came in for a lot of criticism when he warned of the dangers of letting trees run amok and ruin our air.

Noting President Ronald Reagan's notorious 1980 reference to trees causing pollution (Reagan said: "Approximately 80 percent of our air pollution stems from hydrocarbons released by vegetation."), the authors conclude: "The results reported here call for a wider recognition that an understanding of recent, current and anticipated changes in biogenic VOC emissions is necessary to guide future air-quality policy decisions; they do not provide any evidence that responsibility for air pollution can or should be shifted from humans to trees."

But obviously Ronnie was on to something. Where others have been lured into looking at towering Redwoods and seeing ancient stately majestic beauties reaching serenely into the sky Ronnie saw right through them like he saw through communist fronts. While the real effects of trees were invisible to the rest of us Ronnie clearly saw that trees were waging a silent war on Western civilization.

Update: One other point: The older trees in older forests pollute less. Tree population aging is a good thing.

By Randall Parker 2004 October 19 03:08 PM  Pollution Natural
Entry Permalink | Comments(17)
2004 October 18 Monday
90 Day Mars Trip With Magnetic Sail Plasma Beam Propulsion?

90 days to Mars? I want to go!

A new means of propelling spacecraft being developed at the University of Washington could dramatically cut the time needed for astronauts to travel to and from Mars and could make humans a permanent fixture in space.

In fact, with magnetized-beam plasma propulsion, or mag-beam, quick trips to distant parts of the solar system could become routine, said Robert Winglee, a UW Earth and space sciences professor who is leading the project.

Currently, using conventional technology and adjusting for the orbits of both the Earth and Mars around the sun, it would take astronauts about 2.5 years to travel to Mars, conduct their scientific mission and return.

"We're trying to get to Mars and back in 90 days," Winglee said. "Our philosophy is that, if it's going to take two-and-a-half years, the chances of a successful mission are pretty low."

Mag-beam is one of 12 proposals that this month began receiving support from the National Aeronautics and Space Administration's Institute for Advanced Concepts. Each gets $75,000 for a six-month study to validate the concept and identify challenges in developing it. Projects that make it through that phase are eligible for as much as $400,000 more over two years.

Note that NASA's funding level for this concept is minuscule. Meanwhile billions per year are spent on the obsolete and flawed Space Shuttle. For a complete list of the 12 funded projects see here.

A space station beam generator would shoot ions at the spacecraft, which would use a magnetic sail to capture the momentum of the particles blowing at it, essentially riding an ion wind.

Under the mag-beam concept, a space-based station would generate a stream of magnetized ions that would interact with a magnetic sail on a spacecraft and propel it through the solar system at high speeds that increase with the size of the plasma beam. Winglee estimates that a control nozzle 32 meters wide would generate a plasma beam capable of propelling a spacecraft at 11.7 kilometers per second. That translates to more than 26,000 miles an hour or more than 625,000 miles a day.

Mars is an average of 48 million miles from Earth, though the distance can vary greatly depending on where the two planets are in their orbits around the sun. At that distance, a spacecraft traveling 625,000 miles a day would take more than 76 days to get to the red planet. But Winglee is working on ways to devise even greater speeds so the round trip could be accomplished in three months.
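Winglee's figures are easy to check. A quick sketch of the unit conversions, using only the numbers quoted in the article:

```python
KM_PER_MILE = 1.609344

v_km_s = 11.7                        # estimated beam-driven speed, km/s
mph = v_km_s * 3600 / KM_PER_MILE    # a bit over 26,000 miles per hour
miles_per_day = mph * 24             # a bit over 625,000 miles per day

avg_earth_mars = 48e6                # average Earth-Mars distance, miles
one_way_days = avg_earth_mars / miles_per_day   # just over 76 days one way
```

So the "more than 76 days" figure follows directly from the 11.7 km/s estimate and the average Earth-Mars distance.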

But to make such high speeds practical, another plasma unit must be stationed on a platform at the other end of the trip to apply brakes to the spacecraft.

"Rather than a spacecraft having to carry these big powerful propulsion units, you can have much smaller payloads," he said.

Winglee envisions units being placed around the solar system by missions already planned by NASA. One could be used as an integral part of a research mission to Jupiter, for instance, and then left in orbit there when the mission is completed. Units placed farther out in the solar system would use nuclear power to create the ionized plasma; those closer to the sun would be able to use electricity generated by solar panels.

The mag-beam concept grew out of an earlier effort Winglee led to develop a system called mini-magnetospheric plasma propulsion. In that system, a plasma bubble would be created around a spacecraft and sail on the solar wind. The mag-beam concept removes reliance on the solar wind, replacing it with a plasma beam that can be controlled for strength and direction.

A mag-beam test mission could be possible within five years if financial support remains consistent, he said. The project will be among the topics during the sixth annual NASA Advanced Concepts Institute meeting Tuesday and Wednesday at the Grand Hyatt Hotel in Seattle. The meeting is free and open to the public.

Winglee acknowledges that it would take an initial investment of billions of dollars to place stations around the solar system. But once they are in place, their power sources should allow them to generate plasma indefinitely. The system ultimately would reduce spacecraft costs, since individual craft would no longer have to carry their own propulsion systems. They would get up to speed quickly with a strong push from a plasma station, then coast at high speed until they reach their destination, where they would be slowed by another plasma station.

"This would facilitate a permanent human presence in space," Winglee said. "That's what we are trying to get to."

I've seen claims that the Space Shuttle has had as much as $100 billion spent on it and we have little to show for all that money. For a much smaller amount of money we could have a set of space highways running around the solar system moving spacecraft at faster speeds than any conventional propulsion system could achieve.

A test flight could be made within 5 years with proper funding.

“If we are solidly funded we could at least mount a test flight in five years,” says Winglee. A test flight alone would cost $1 million, while a trip to Mars would cost billions because it would require building a space station there.

Leonard David of Space.com has an article discussing other advanced propulsion concepts NASA is funding. But keep in mind that the funding level for most of these projects is still pretty low. If the Space Shuttle were retired and that money put into advanced propulsion concepts, we could greatly accelerate the rate of development of much more advanced approaches to space travel.

By Randall Parker 2004 October 18 03:37 PM  Airplanes and Spacecraft
Entry Permalink | Comments(34)
2004 October 16 Saturday
Brain Battles Between Short Term Emotions And Long Term Logic

When you give in to the impulse for an immediate reward over a larger long-term reward, the emotional regions of your brain have beaten out the logical regions.

You walk into a room and spy a plate of doughnuts dripping with chocolate frosting. But wait: You were saving your sweets allotment for a party later today. If it feels like one part of your brain is battling another, it probably is, according to a newly published study.

Researchers at four universities found two areas of the brain that appear to compete for control over behavior when a person attempts to balance near-term rewards with long-term goals. The research involved imaging people's brains as they made choices between small but immediate rewards or larger awards that they would receive later. The study grew out of the emerging discipline of neuroeconomics, which investigates the mental and neural processes that drive economic decision-making.

The study was a collaboration between Jonathan Cohen and Samuel McClure at Princeton's Center for the Study of Brain Mind and Behavior; David Laibson, professor of economics at Harvard University; and George Loewenstein, professor of economics and psychology at Carnegie Mellon University. Their study appears in the Oct. 15 issue of Science.

"This is part of a series of studies we've done that illustrate that we are rarely of one mind," said Cohen, also a faculty member at the University of Pittsburgh. "We have different neural systems that evolved to solve different types of problems, and our behavior is dictated by the competition or cooperation between them."

The researchers examined a much-studied economic dilemma in which consumers behave impatiently today but prefer/plan to act patiently in the future. For example, people who are offered the choice of $10 today or $11 tomorrow are likely to choose to receive the lesser amount immediately. But if given a choice between $10 in one year or $11 in a year and a day, people often choose the higher, delayed amount.

In classic economic theory, this choice is irrational because people are inconsistent in their treatment of the day-long time delay. Until now, the cause of this pattern was unclear, with some arguing that the brain has a single decision-making process with a built-in inconsistency, and others, including the authors of the Science paper, arguing that the pattern results from the competing influence of two brain systems.
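This inconsistency is commonly formalized with a quasi-hyperbolic (beta-delta) discounting model, in which any delayed reward takes a one-time extra penalty. A minimal sketch; the beta and delta values here are illustrative assumptions, not parameters from the study:

```python
def discounted_value(amount, delay_days, beta=0.7, delta=0.999):
    # Quasi-hyperbolic discounting: an immediate reward is valued at face
    # value; a delayed reward is scaled by beta once, plus delta per day.
    if delay_days == 0:
        return amount
    return beta * (delta ** delay_days) * amount

# Today vs. tomorrow: the immediate $10 beats the delayed $11.
prefer_now = discounted_value(10, 0) > discounted_value(11, 1)
# In a year vs. a year and a day: the delayed $11 wins.
prefer_later = discounted_value(10, 365) < discounted_value(11, 366)
```

With beta set to 1 the model collapses to standard exponential discounting and the preference reversal disappears, which is why a single consistent decision process has trouble explaining the observed pattern.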

The researchers studied 14 Princeton University students who were asked to consider delayed reward problems while undergoing functional magnetic resonance imaging (fMRI), a procedure that shows what parts of the brain are active at all times. The students were offered choices between Amazon.com gift certificates ranging from $5 to $40 in value and larger amounts that could be obtained only by waiting some period, from two weeks to six weeks.

The study showed that decisions involving the possibility of immediate reward activated parts of the brain influenced heavily by brain systems that are associated with emotion. In contrast, all the decisions the students made -- whether short- or long-term -- activated brain systems that are associated with abstract reasoning.

Most important, when students had the choice of an immediate reward but chose the delayed option, the calculating regions of their brains were more strongly activated than their emotion systems, whereas when they chose the immediate reward, the activity of the two areas was comparable, with a slight trend toward more activity in the emotion system.

The researchers concluded that impulsive choices or preferences for short-term rewards result from the emotion-related parts of the brain winning out over the abstract-reasoning parts. "There are two different brain systems and one of them kicks in as you get really proximate to the reward," McClure said.

What does it say about economists that it is only now a growing view among them that factors other than pure reasoning often drive people's decisions? They are just figuring this out? They didn't understand this, say, 20 years ago?

The finding supports the growing view among economists that psychological factors other than pure reasoning often drive people's decisions.

Do smarter people have a greater ability to control their emotions? Are brains that have more gray matter dedicated to abstract reasoning also brains whose abstract reasoning areas are better able to win out in a fight with the emotional areas? Or do the emotional areas typically scale up as much in size as the abstract reasoning areas in people who have larger abstract reasoning areas and bigger brains?

"Our emotional brain has a hard time imagining the future, even though our logical brain clearly sees the future consequences of our current actions," Laibson said. "Our emotional brain wants to max out the credit card, order dessert and smoke a cigarette. Our logical brain knows we should save for retirement, go for a jog and quit smoking. To understand why we feel internally conflicted, it will help to know how myopic and forward-looking brain systems value rewards and how these systems talk to one another."

The findings also may cast light on other forms of impulsive behavior and drug addiction.

This result explains what Spock had to do to become more Vulcan: He had to suppress the human dopaminergic circuits in his brain that caused him to feel emotions. But pure-blooded Vulcans have an easier job of mastering their emotions. The Vulcans must have either genetically engineered their species to down-regulate emotion producing areas of the brain or they engaged in generations of eugenic breeding practices ("Match maker, match maker, make me a logical match, catch me an unemotional catch") aimed at suppressing impulsiveness and emotional reactions to immediate stimuli. Certainly, as Sarek demonstrated, the Vulcans are not above the use of sophisticated reproductive technology to produce unusual offspring. Just because the Vulcans never have admitted to genetically engineering their species in any TV show episode that doesn't mean they didn't do so. After all, they hid Pon Farr as long as they could. They obviously have their secrets.

Dopaminergic circuits (neurons that release and accept dopamine neurotransmitter molecules) make us do what we ought not do.

"Our results help explain how and why a wide range of situations that produce emotional reactions, such as the sight, touch or smell of a desirable object, often cause people to take impulsive actions that they later regret," Loewenstein said. Such psychological cues are known to trigger dopamine-related circuits in the brain similar to the ones that responded to immediate rewards in the current study.

Concerning addiction, said Loewenstein, the findings help explain some aspects of the problem, such as why addicts become so focused on immediate gratification when they are craving a drug. The dopamine-related brain areas that dominated short-term choices among the study subjects also are known to be activated when addicts are craving drugs.

The researchers are now trying to pin down what kinds of rewards and how short a delay are needed to trigger the dopamine-related reaction. Their ultimate goal is to better understand how the emotion-related and calculating systems interact and to understand how the brain governs which system comes out victorious.

These results also suggest that Vulcans may be resistant to drug addiction. Vulcan neural physiology probably doesn't have dopaminergic neurons capable of responding much to opiates for example.

It would be interesting to take a large population of youths, test their tendency to choose smaller shorter term rewards over larger longer term rewards, and then follow them in a longitudinal study to see if the youths that have a stronger preference for shorter term rewards are more likely to become cigarette smokers, alcoholics, drug addicts, and convicted criminals.

It would also be interesting to test tendency toward choosing smaller shorter term versus larger longer term rewards as a function of IQ. Are people with 100 IQs more likely to go for the immediate reward than people with 130 IQs? Anyone know if that sort of experiment has been done? Or is IQ research too much of a taboo subject for that kind of work to be done by economists?

By Randall Parker 2004 October 16 12:49 PM  Brain Economics
Entry Permalink | Comments(20)
2004 October 14 Thursday
Can We Finally Retire The Space Shuttle?

Burt Rutan makes clear his disdain for the terrible Space Shuttle design.

Over the decades, Rutan said, despite the promise of the Space Shuttle to lower costs of getting to space, a kid’s hope of personal access to space in their lifetime remained in limbo.

“Look at the progress in 25 years of trying to replace the mistake of the shuttle. It’s more expensive…not less…a horrible mistake,” Rutan said. “They knew it right away. And they’ve spent billions…arguably nearly $100 billion over all these years trying to sort out how to correct that mistake…trying to solve the problem of access to space. The problem is…it’s the government trying to do it.”

It is my hope that the success of SpaceShipOne and the coming flights of SpaceShipTwo and other private spacecraft designs will allow the American public to get over their emotional attachment to the Space Shuttle. People no longer need to invest their hopes for space exploration in the Shuttle. We can relegate the Shuttle to history as an obsolete and flawed design. We have wasted enough money on the Shuttle and more billions continue to be thrown at it to little result. The Shuttle was a bad idea in 1980. It is just an expensive money sink today. We should focus on the new designs and innovations that can be developed in the future.

There are signs of hope with NASA. Most importantly, NASA is going to offer Centennial Challenges prizes for innovations in aerospace and space exploration.

Welcome to Centennial Challenges, NASA's program of prize contests to stimulate innovation and competition in solar system exploration and ongoing NASA mission areas. By making awards based on actual achievements, instead of proposals, Centennial Challenges seeks novel solutions to NASA's mission challenges from non-traditional sources of innovation in academia, industry and the public.

NASA is accepting Centennial Challenges ideas from the public. So if you have any ideas for prizes for the development of spacecraft and the advance of aerospace technology, do write them up and send them in. So far NASA has not announced even a single challenge prize. Since NASA operates in a very political environment and wants to please its political masters, it wouldn't hurt for you all to contact your elected representatives via their email address (enter your zip code at the top) and let those Congress critters know you want them to support NASA's move to offer cash prizes for aerospace achievements.

By Randall Parker 2004 October 14 03:54 PM  Airplanes and Spacecraft
Entry Permalink | Comments(11)
NHGRI Aims For $100,000 Genome Sequencing Cost In 5 Years

The US government's National Human Genome Research Institute (NHGRI) is allocating $38.4 million over the next few years to the development of cheaper DNA sequencing technologies (and FuturePundit thinks this is still too little, too late).

BETHESDA, Md., Thurs., Oct. 14, 2004 – The National Human Genome Research Institute (NHGRI), part of the National Institutes of Health (NIH), today announced it has awarded more than $38 million in grants to spur the development of innovative technologies designed to dramatically reduce the cost of DNA sequencing, a move aimed at broadening the applications of genomic information in medical research and health care.

NHGRI's near-term goal is to lower the cost of sequencing a mammalian-sized genome to $100,000, which would enable researchers to sequence the genomes of hundreds or even thousands of people as part of studies to identify genes that contribute to cancer, diabetes and other common diseases. Ultimately, NHGRI's vision is to cut the cost of whole-genome sequencing to $1,000 or less, which would enable the sequencing of individual genomes as part of medical care. The ability to sequence each person's genome cost-effectively could give rise to more individualized strategies for diagnosing, treating and preventing disease. Such information could enable doctors to tailor therapies to each person's unique genetic profile.

DNA sequencing costs have fallen more than 100-fold over the past decade, fueled in large part by tools, technologies and process improvements developed as part of the successful effort to sequence the human genome. However, it still costs at least $10 million to sequence 3 billion base pairs – the amount of DNA found in the genomes of humans and other mammals.

"These grants will open the door to the next generation of sequencing technologies. There are still many opportunities to reduce the cost and increase the throughput of DNA sequencing, as well as to develop smaller, faster sequencing technologies that meet a wider range of needs," said NHGRI Director Francis S. Collins, M.D., Ph.D. "Dramatic reductions in sequencing costs will lead to very different approaches to biomedical research and, eventually, will revolutionize the practice of medicine."

In the first set of grants, 11 teams will work to develop "near term" technologies that, within five years, are expected to provide the power to sequence a mammalian-sized genome for about $100,000. In the second set, seven groups will take on the longer-term challenge of developing revolutionary technologies to realize the vision of sequencing a human genome for $1,000 or less. The approaches pursued by both sets of grants have many complementary elements that integrate biochemistry, chemistry and physics with engineering to enhance the whole effort to develop the next generation of DNA sequencing and analysis technologies.

"These projects span an impressive spectrum of novel technologies – from sequencing by synthesis to nanopore technology. Many of these new approaches have shown significant promise, yet far more exploration and development are needed if these sequencing technologies are to be useful to the average researcher or physician," said Jeffery Schloss, Ph.D., NHGRI's program director for technology development. "We look forward to seeing which of these technologies fulfill their promise and achieve the quantum leaps that are needed to take DNA sequencing to the next level."

Note that to get from $10 million to $100,000 per genome in 5 years would be a two-order-of-magnitude drop, almost as large a drop as happened in the previous 10 years. But maybe the various research teams can pull it off.

To get to the $1,000 genome requires a further two-order-of-magnitude drop in costs. Note that the press release does not provide any indication of when that goal might be reached. Click through and read the full press release and you will notice that the bulk of the funding ($31.5 million by my calculations) is for achieving the short-term $100,000 genome goal. Much smaller amounts of money are allocated toward the development of technologies that will enable much more radical advances. This seems like a mistake to me.
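The two cost milestones are each a clean factor-of-100 step, which is easy to verify:

```python
import math

cost_today = 10_000_000      # ~$10M per mammalian-sized genome (2004)
near_term_goal = 100_000     # NHGRI's five-year target
long_term_goal = 1_000       # the eventual "$1,000 genome"

step1 = math.log10(cost_today / near_term_goal)      # 2 orders of magnitude
step2 = math.log10(near_term_goal / long_term_goal)  # 2 more orders of magnitude
```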

In my opinion too much money has been spent on using sequencing technologies and not enough on developing new sequencing technologies. Even this $38 million is not much for development of new sequencing technologies since on a per year basis it amounts to well under $20 million per year (it is hard to calculate an exact amount since some of the grants are 2 years and some are 3 years). When the federal government spends many hundreds of millions per year (I'm too lazy to look up NHGRI's total yearly budget but this is a very small fraction of it) using sequencing technologies that are orders of magnitude more expensive than what we could have in a few years, it seems obvious to me that the money spent over the last few years on sequencing should mostly have gone to develop cheaper technologies. The focus on short-term results by using current technologies is far from optimal.

Update: To put the spending for faster DNA sequencing techniques in perspective the National Human Genome Research Institute has a total budget of almost a half billion dollars.

Mr. Chairman, I am pleased to present the President's budget request for the National Human Genome Research Institute for fiscal year 2005, a sum of $492,670,000, which reflects an increase of $13,842,000 over the FY 2004 Final Conference appropriation.

The National Institutes of Health are spending over $28 billion per year.

President Bush yesterday (February 2) sent to Congress a $28.6 billion budget request for the National Institutes of Health (NIH) in fiscal year 2005, a 2.6% increase of $729 million over the current year's funding. The National Science Foundation (NSF) would receive a 2.5% increase of around $140 million to $5.7 billion, but the Centers for Disease Control and Prevention (CDC) would be cut by 8.9% to $4.3 billion, a reduction of $408 million.
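Putting the budget figures side by side shows just how small the technology-development grants are. Annualizing the grants over roughly 2.5 years is my own rough assumption, since the individual grants run 2 to 3 years:

```python
grants_total = 38.4e6     # new sequencing-technology grants, spread over 2-3 years
nhgri_budget = 492.67e6   # NHGRI FY2005 budget request, per year
nih_budget = 28.6e9       # NIH FY2005 budget request, per year

grants_per_year = grants_total / 2.5           # roughly $15M per year
grants_share = grants_per_year / nhgri_budget  # ~3% of NHGRI's yearly budget
nhgri_share = nhgri_budget / nih_budget        # NHGRI is ~1.7% of NIH
```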

Aside: As the baby boomers begin to retire and an enormous fiscal crisis erupts I expect total NIH spending will go down, not up. More money will go toward treating the already sick with existing technologies rather than doing the scientific research and technological research that could so revolutionize medicine that people will rarely get sick.

One reason that biomedical scientists ought to get on the aging-reversal rejuvenation SENS (Strategies for Engineered Negligible Senescence) bandwagon is that when the fiscal crisis erupts medical and biological researchers need to have a rosier future achievable by research to sell to the public. Nothing less than an incredibly rosy scenario of rejuvenation and the end of most diseases will be enough of an enticement to keep the research bucks flowing and growing when the strains on the US federal budget become enormous.

By Randall Parker 2004 October 14 02:06 PM  Biotech Advance Rates
Entry Permalink | Comments(3)
2004 October 13 Wednesday
Higher Temperature Superconducting Wire Continues To Advance

An article in the Christian Science Monitor reports that Sumitomo Electric Industries and American Superconductor Corp. (AMSC) are heading to market with next generation high temperature superconducting ceramic wire.

The wire, produced in Osaka, Japan, is narrower than the width of a pencil.

To develop the market, Sumitomo - Japan's biggest electric cablemaker - will offer the cable at competitive prices - about two to five times the price of conventional copper, Mr. Saeki says.

But Sumitomo will soon have competition. American Superconductor Corp. of Westborough, Mass., is working with the Oak Ridge National Laboratory on a more advanced version of the wire, which could be used as transmission lines for electric utilities.

This type of wire still needs to be cooled by liquid nitrogen to a range of -452 to -320 degrees Fahrenheit. So this stuff isn't going to be used as building wiring. But it could still be used for power lines and in motors for ships, trains, and other large pieces of equipment.
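Converting that range out of Fahrenheit makes the cooling requirement clearer. Note that the low end of the quoted range is actually liquid-helium territory; liquid nitrogen itself boils at about 77 K:

```python
def fahrenheit_to_kelvin(f):
    # Standard conversion: Fahrenheit -> Celsius -> Kelvin
    return (f - 32) * 5 / 9 + 273.15

low = fahrenheit_to_kelvin(-452)   # ~4.3 K, liquid-helium range
high = fahrenheit_to_kelvin(-320)  # ~77.6 K, liquid nitrogen's boiling point
```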

The article gives the impression that Sumitomo's next generation wire is coming to market right now. AMSC is still 3 or 4 years from hitting the market with its next generation wires, but it is already shipping existing designs and claims to be the world leader in higher temperature superconducting wire sales.

AMSC is the world’s leading developer and manufacturer of High Temperature Superconductor (HTS) wire. AMSC's first generation HTS wire, based on a multi-filamentary composite architecture, is capable of carrying over 140 times the power of copper wires of the same dimensions. It is the industry leader in both price and performance and is the product of choice in a variety of applications including power cables, motors, generators, and specialty magnets.

AMSC announced break-through results in September of 2002 of its second generation HTS wire beating the Department of Energy's benchmark for performance by 15 months. Second generation wire, when available in commercial quantities in the next three to four years, is expected to cost two to five times lower than first generation HTS wire and will significantly broaden the market for HTS-based products and applications. As a form-fit-function replacement for first generation wire, second generation will require no re-engineering of applications developed and commercialized using first generation wire.

What sort of future will higher temperature superconducting materials make possible? Jesse H. Ausubel, director of the Program for the Human Environment at The Rockefeller University in New York, has an article in The Industrial Physicist on one potential future application of higher temperature superconductors: the zero-emission power plant (ZEPP) and the Continental SuperGrid.

The ZEPP is a supercompact, superfast, superpowerful turbine putting out electricity and carbon dioxide (CO2) that can be sequestered. Investments by energy producers will make methane (natural gas) overtake coal globally as the lead fuel for making electricity over the next two to three decades. Methane tops the hydrocarbon fuels in heat value, measured in joules per kilogram, and thus lends itself to scaling up. Free of sulfur, mercury, and other contaminants of coals and oils, methane is the best hydrocarbon feedstock.

Ausubel quotes a source that expects ZEPP plants to boost methane-to-electric conversion efficiency from 55% to 70% and imagines a future of methane-fueled 5000 MW and 10,000 MW electric power plants fed by oxygen purified from the atmosphere using cryogenic separation. He envisions power plants operating under such enormous pressures that the carbon dioxide by-product of combustion comes out in liquid form for easy capture and transport to sequestration facilities. The whole article is pretty interesting. Though a competing argument can be made for the continued spread of smaller electric power generators for local generation and use of electricity. Knowledge Problem blogger Lynne Kiesling thinks distributed energy generation systems are a real possibility, especially if the regulatory environment can be changed to be more accommodating to them. As I've previously pointed out, this might ultimately lead all the way down to cars as distributed electric power generators.
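The efficiency jump Ausubel cites translates directly into fuel and CO2 savings per kilowatt-hour. A rough sketch; the ~50 MJ/kg heating value for methane is my assumption (a standard textbook figure), not a number from the article:

```python
LHV_METHANE = 50.0   # MJ per kg of methane, approximate heating value
MJ_PER_KWH = 3.6     # exact: 1 kWh = 3.6 MJ

def methane_per_kwh(efficiency):
    # kg of methane burned per kWh of electricity delivered
    return MJ_PER_KWH / efficiency / LHV_METHANE

current = methane_per_kwh(0.55)   # today's best gas plants
zepp = methane_per_kwh(0.70)      # projected ZEPP efficiency
fuel_savings = 1 - zepp / current # ~21% less fuel (and CO2) per kWh
```

Since CO2 output scales with fuel burned, the 55%-to-70% efficiency gain alone cuts emissions per kilowatt-hour by about a fifth, before any sequestration.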

By Randall Parker 2004 October 13 04:07 PM  Energy Tech
Entry Permalink | Comments(4)
2004 October 12 Tuesday
X Prize Foundation To Offer New Round Of Prizes

Hot on the heels of the successful flights of the Scaled Composites SpaceShipOne to win the $10 million Ansari X Prize, the X Prize Foundation and the World Technology Network have banded together to announce a series of X Prizes to accomplish things down on Earth.

According to the X Prize Foundation and the World Technology Network, examples of privately-funded solutions in scientific and social fields might include the following:

1. Transportation: Demonstration of a 4-seat vehicle able to achieve 200 miles per gallon in a cross country race

2. Nanotechnology: Construction of a pre-determined molecule by an assembler

3. Aging deceleration: Extension of mammal life, or demonstrated evidence of aging reversal

4. Education: Demonstration of a self-sufficient education facility able to operate independently and educate villagers anywhere on the planet

Note item 3 above. The Methuselah Foundation Methuselah Mouse Prize aims to provide a large cash award to the first scientific team to double the life of an ordinary lab mouse. That prize needs volunteers and donors.

The WTN X Prize team is accepting suggestions from the public for prize ideas. Go click on the WTN X Prize link and you will be presented with a form for submitting suggestions. What say we kick around some ideas for prizes in the comments of this post? Anyone have any ideas?

The X Prize success demonstrates that prize money can be a very effective tool for accelerating the advance of science and technology. I favor aging research prizes aimed at the development of effective rejuvenation treatments most of all. But another class of prizes that deserves support are prizes for achievements in developing new energy technologies. What would be useful milestones in the development of better energy technologies? Keep in mind that ideal milestones should be achievable by fairly small teams of engineers and scientists.

Dave Gobel of the Methuselah Foundation alerts me to the existence of a poorly publicized prize for a cheap DNA sequencer which is being offered by Craig Venter of Celera DNA sequencing fame. Venter is offering a half million dollars to the first team to produce a sequencer that can sequence an entire human genome for $1000 or less.

ROCKVILLE, MD (September 23, 2003). The J. Craig Venter Science Foundation announced today a $500,000 Genomic Technology Prize. The prize, to be awarded one time only, is aimed at stimulating the scientific and technology research community to significantly advance automated DNA sequencing so that a human genome can be sequenced for $1,000 or less as soon as possible. The prize was announced during New Frontiers in Sequencing Technology session at the 15th annual Genome Sequencing and Analysis Conference (GSAC) in Savannah, Georgia.

"Over the last decade there have been significant advances in the field of genomics. More than 150 genomes, including the human genome, have been sequenced. Despite this progress we need substantial improvement in technology so that genomics can be fully integrated into all of our lives. One such area is DNA sequencing," said J. Craig Venter, Ph.D., president and founder of The J. Craig Venter Science Foundation. "By continuing to reduce the cost and increase the accuracy and speed of DNA sequencing we will enable genomics to be more fully integrated into areas such as clinical medicine. It is the hope of the Venter Science Foundation that providing this challenge to the scientific community will enable us to reach the $1,000 genome sooner."

While sequencing costs continue to decline (currently approximately $300,000-$500,000 to sequence the gene and regulatory regions of a human genome, and on the order of $25 million for 5X coverage of the full genome), these costs must decrease significantly toward the $1,000 mark. Once this threshold has been reached it will be feasible for the majority of individuals to have their genome sequenced and encoded as part of their medical record.

Dave also says that it has always been the plan for the Methuselah Foundation to offer more prizes for more goals related to rejuvenation and anti-aging therapies. Their obstacle is the need to raise the funds. They accept contributions on a web page.

By Randall Parker 2004 October 12 01:28 PM  Worthy Causes
Entry Permalink | Comments(16)
2004 October 11 Monday
Breastfeeding Women Secrete Aphrodisiac Chemosignal

University of Chicago researcher Martha McClintock and colleagues have found that

Breastfeeding women and their infants produce a substance that increases sexual desire among other women, according to research at the University of Chicago.

"This is the first report in humans of a natural social chemosignal that increases sexual motivation," said Martha McClintock, the David Lee Shillinglaw Distinguished Service Professor in Psychology at the University, and the lead researcher in a team at the University's Institute for Mind and Biology. Chemosignals are substances that while not necessarily perceived as odors, nonetheless have an impact on mood and menstrual cycles when absorbed through the nose.

The researchers found that after being exposed to the breastfeeding compounds for two months, women with regular partners experienced a 24 percent increase in sexual desire as reported on a standard psychological survey. Women without partners experienced a 17 percent increase in sexual fantasies after exposure for the period.

Women in the control group with partners who were exposed to a neutral substance reported an insignificant decrease in sexual desire, while women without partners in the control group experienced a 28 percent decrease in fantasies.

The work on sexual desire is reported in the paper "Social Chemosignals from Breastfeeding Women Increase Sexual Motivation," being published in the latest issue of Hormones and Behavior.

Joining McClintock in writing the paper were Natasha Spencer, Sarah Sellergren, Susan Bullivant and Suma Jacob, researchers at the University of Chicago, and Julie Mennella, a scientist with the Monell Chemical Senses Center, in Philadelphia. The study was conducted both in Chicago and Philadelphia.

In Philadelphia, Mennella recruited 26 breastfeeding women, who were asked to eat a bland diet to avoid transmitting odors such as curry through the breast milk. The breastfeeding women wore pads in their nursing bras, where the saliva from their infants in addition to their own perspiration and milk was collected. They also wore pads secured by underarm shields to collect perspiration.

The pads were collected, cut in pieces and frozen. Other studies in the McClintock lab have shown that the procedure is effective in collecting chemosignals.

In Chicago, the researchers recruited about 90 women between the ages of 18 and 35 who had not borne a child. The women were divided into two groups, one group exposed to the pads with breastfeeding substances, and the other group exposed to pads with potassium phosphate, a substance that mimics the concentration of the sweat and breast milk.

"Because preconceived ideas about pheromones could potentially influence their responses, study participants were blind to the hypotheses and the source of the compounds," Spencer said. "The study was presented to the subjects as an examination of odor perception during the menstrual cycle."

Participants were given a set of pads on a regular basis and asked to swipe them under their noses in the morning and at night and any other time of the day in which they may have wiped their upper lips, showered or exercised.

The women with partners were asked about their moods and were asked to complete daily a survey with a scale indicating "the degree you felt desire today for sexual intimacy." They also recorded their sexual activity. Women without partners were also asked about their moods and reported whether they experienced "any fantasies/daydreams today of a sexual or romantic nature." Among women exposed to the breastfeeding substance, "The effect became striking during the last half of the menstrual cycle after ovulation when sexual motivation normally declines," McClintock said.

From the abstract:

Here, we demonstrate that natural compounds collected from lactating women and their breastfeeding infants increased the sexual motivation of other women, measured as sexual desire and fantasies. Moreover, the manifestation of increased sexual motivation was different in women with a regular sexual partner. Those with a partner experienced enhanced sexual desire, whereas those without one had more sexual fantasies.

Suppose this compound is identified. How it gets used will depend on how rapidly it works. If it works rapidly then expect guys to wear it as a perfume. If it takes a few hours to work then guys will want to go on longer dates to allow more time for it to take effect. If it takes days then it will be a lot harder for a single guy to use it for his own benefit. However men in longer term relationships or even men travelling with women on extended business trips would have obvious incentives to use it.

Will women want to use such a compound on themselves? That depends in part on whether it just enhances desire to have sex or does it also enhance the pleasurability of the sexual experience?

Also, some women will want to defend themselves against having their sex drive manipulated by someone else without their knowledge. One way to do that defense would be to develop compounds that block the effect of whatever compound(s) that will be isolated and found to be involved in this effect. However, another line of defense is detection. Imagine a chemical strip that is designed to react only to the aphrodisiac. A woman could wear such a strip as, perhaps, a ribbon tying up her hair or somewhere else inconspicuous and then she could check whether the strip changed color while she was sitting in a bar or restaurant.

A growing knowledge about what increases and decreases sexual drives is inevitably going to be used in the war between the sexes. Whether the net increase in knowledge will end up being used more by the offensive or the defensive or perhaps only under negotiated peace treaties remains to be seen. My guess is all of the above.

By Randall Parker 2004 October 11 03:59 PM  Brain Appetite
Entry Permalink | Comments(10)
2004 October 10 Sunday
Ferret Visual Cortex 80% Active Even In The Dark

While a popular myth holds that only about 10% of the neurons in our brains actually do anything, Michael Weliky, associate professor of brain and cognitive sciences at the University of Rochester, studied ferrets and found that young ferret brains may be less able to organize and make sense of visual stimuli, that young ferret brains are less busy in the dark than adult ferret brains, and that even in the dark the adult ferret visual cortex remains 80% as active as when viewing a scene.

The test was then to see if there was any relationship between the statistical motion of the movie and the way visual neurons in the ferrets fired. Each visual neuron is keyed to respond to certain visual elements, such as a vertical line, that appears in a specific area of the ferret’s vision. A great number of these cells combine to process an image of many lines, colors, etc. By watching the patterns of how these cells fired while watching The Matrix, Weliky could describe the pattern statistically, and match those statistics of how the ferret responded to the film with the statistics of the actual visual aspects of the film.

Weliky found two surprises. First, while the neurons of adult ferrets statistically seemed to respond similarly to the statistics of the film itself, younger ferrets had almost no relationship. This suggests that though the young ferrets are taking in and processing visual stimuli, they’re not processing the stimuli in a way that reflects reality.

“You might think of this as a sort of dyslexia,” explains Weliky. “It may be that in very young brains, the processing takes place in a way that’s not necessarily disordered, but not analogous to how we understand reality to be. It’s thought that dyslexia works somewhat like this—that some parts of the brain process written words in an unusual way and seem to make beginnings of words appear at their ends and vice versa. Infant brains may see the entire world the same way, as a mass of disparate scenes and sounds.” Weliky is quick to point out that whatever way infant brains may interpret the world, just because they’re different from an adult pattern of perception does not mean the infants have the wrong perception. After all, an adult interpreted the visual aspects of the film with our adult brains, so it shouldn’t be such a surprise that other adult brains simply interpret the visual aspects the same way. If an infant drew up the statistics, it might very well match the neural patterns of other infants.

The second, and more surprising, result of the study came directly from the fact that Weliky’s research is one of the first to test these visual neurons while the subject is awake and watching something. In the past, researchers would perhaps shine a light at an unconscious ferret and note which areas of the brain responded, but while that method narrowed the focus to how a single cell responds, it eliminated the chance to understand how the neural network of a conscious animal would respond. Accepting all the neural traffic of a conscious brain as part of the equation let Weliky get a better idea of the actual processing going on. As it turned out, one of his control tests yielded insight into neural activity no one expected.

When the ferrets were in a darkened room, Weliky expected their visual neurons to lack any kind of activity that correlated with visual reality. Neurologists have long known that there is substantial activity in the brain, even in darkness, but the pattern of that activity had never been investigated. Weliky discovered that while young ferrets displayed almost no patterns that correlated with visual reality, the adult ferrets’ brains were humming along, producing the patterns even though there was nothing to see. When watching the film, the adult ferrets’ neurons increased their patterned activity by about 20 percent.

“This means that in adults, there is a tremendous amount of real-world processing going on—80 percent—when there is nothing to process,” says Weliky. “We think that if you’ve got your eyes closed, your visual processing is pretty much at zero, and that when you open them, you’re running at 100 percent. This suggests that with your eyes closed, your visual processing is already running at 80 percent, and that opening your eyes only adds the last 20 percent. The big question here is what is the brain doing when it’s idling, because it’s obviously doing something important.”

Since the young ferrets do not display similar patterns, the “idling” isn’t necessary for life or consciousness, but since it’s present in the adults even without stimulus, Weliky suggests it may be in a sense what gives the ferret its understanding of reality. The eye takes in an image and the brain processes the image, but 80 percent of the activity may be a representation of the world replicated inside the ferret’s brain.

There's an obvious math error in how this press release is written. If the brain is operating at 80% of capacity in the dark and then increases to 100% capacity, then from the reference point of the brain's activity level in the dark the increase upon entering a richer visual environment is actually 25%, not 20%. But that is just a quibble.
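The quibble is easy to verify; here is a minimal sketch of the arithmetic (the 80 and 100 figures are the press release's percentages, the units otherwise arbitrary):

```python
# Distinguish percentage-point change from relative change.
dark = 80.0        # visual-cortex activity in the dark, per the press release
stimulated = 100.0  # activity while watching the film

absolute_increase = stimulated - dark                  # percentage points
relative_increase = (stimulated - dark) / dark * 100   # relative to dark baseline

print(absolute_increase)  # 20.0 points
print(relative_increase)  # 25.0 percent
```

The press release's "20 percent" is the percentage-point gap; measured against the dark baseline the jump is 25%.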

A more basic problem is with the 100% figure for what is implied to be some sort of maximum activity level. Is that supposed to be the absolute maximum level of activity of a ferret brain's visual cortex? Isn't it possible that there are conditions under which a ferret visual cortex might become twice as active as the highest level of activity that this researcher ever measured? There has to be some absolute maximum level of activity because there is a limit to how much oxygen the bloodstream can deliver to neurons. Also, some neurons are going to be less active because the suppression of some neurons combines with the excitation of other neurons to form a representation of any one image.

All of these results are from ferrets and it is possible that human brains do not exhibit similar behavior. But my guess is that while the absolute percentages may differ from the numbers reported above human brains probably do have similar differences between babies and adults. The ability of adult brain visual cortexes to stay so active in the dark likely is the result of the development of a fairly complex model of the visual world. That model is always running and mulling over older images even when no new images are being presented to it.

What this suggests about babies and children is especially interesting. They can't make as much sense of the world. They do not know as many logical relationships between objects in an image field and therefore can't create as many higher level meanings from what they are seeing. So the visual world must seem far more random and unpredictable to them. They may not even have as great an ability to track temporal order of changes in elements in a visual field.

By Randall Parker 2004 October 10 01:07 PM  Brain Development
Entry Permalink | Comments(2)
2004 October 08 Friday
Thousand Nuclear Reactors Could Hydrogen Power All Cars In America

Andrew Oswald, an economist at the University of Warwick, and his brother Jim, claim that a switch to hydrogen power for vehicles would require either covering half of California with wind turbines or building 1,000 nuclear reactors.

Converting every vehicle in the United States to hydrogen power would demand so much electricity that the country would need enough wind turbines to cover half of California or 1,000 extra nuclear power stations.

The Oswalds are making the argument that hydrogen isn't an easy solution to our energy problems. Fair enough. But could hydrogen play a role if we really thought we were better off ending our reliance on fossil fuels? Let us leave aside the fact that hydrogen has a lot of problems associated with it that its enthusiasts tend to ignore. Perhaps some day those problems will be solved. Or perhaps if we only had a non-fossil fuel based way to generate enough hydrogen to power our cars we could instead use the power to generate synthetic hydrocarbons or we could develop better battery technology. The more important question then is whether we could get that power from somewhere if we really wanted to.

While I would oppose the construction of so many wind turbines on esthetic grounds, some might disagree. I'm not sure what all those wind turbines would cost, but the 1,000 nuclear reactors are at least within the realm of the affordable. It is not clear what reactor size the Oswalds assumed in their calculation. But suppose they based it on the new and very large Westinghouse AP1000, an approximately 1,100 megawatt nuclear reactor. The cost for a pair is estimated to be about $2.2 to $2.7 billion. But if 1,000 of them were built it seems safe to assume there'd be considerable economies of scale. So let us suppose the reactors would cost $1 billion each. Well, that is only $1 trillion to build 1,000 of them.

Put that $1 trillion in perspective. The US burns about 20 million barrels of oil per day, which at $50 per barrel is $1 billion per day or about $365 billion per year. Though much of that is not for cars. Still, is that $1 trillion affordable if we really needed to switch to nuclear? The United States has an $11 trillion a year economy. For a cost equalling slightly more than one month's economic production we could drastically cut our use of fossil fuels. So when people say we have no choice but to use fossil fuels, well, that just isn't true.
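The back-of-the-envelope figures above can be checked in a few lines (the $1 billion per reactor and the $11 trillion GDP are this post's own assumptions, not official estimates):

```python
# Oil spending at the post's assumed price and consumption.
barrels_per_day = 20_000_000
price_per_barrel = 50  # dollars

daily_spend = barrels_per_day * price_per_barrel   # $1 billion per day
yearly_spend = daily_spend * 365                   # ~$365 billion per year

# Reactor build-out at the assumed economies-of-scale price.
reactor_cost = 1_000_000_000   # $1 billion each, assumed
reactors = 1_000
build_cost = reactor_cost * reactors               # $1 trillion

# Compare against annual US economic output (~$11 trillion in 2004).
us_gdp = 11_000_000_000_000
months_of_output = build_cost / us_gdp * 12
print(months_of_output)  # slightly more than one month
```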

Granted, we couldn't convert to a nuclear economy in a year. We'd have to develop a number of supporting technologies and deploy them. It would take a couple of decades to make the full transition. Yet it really could be done.

There are problems with going the nuclear route. Waste disposal is a problem and is a large cost too. Operations and fuels are additional costs but much lower than construction costs. Securing so many nuclear reactors against terrorist attacks would be another substantial problem. Plus, increased use of nuclear power throughout the world would raise the risk of nuclear materials falling into the hands of terrorists.

Also, implementation of a massive nuclear reactor building program might be premature. Pebble Bed Modular Reactor technology could first be developed to provide a safer and cheaper nuclear option. Then PBMR reactors could be built instead. But even the current cost of nuclear power demonstrates that we do not absolutely need fossil fuels in order to maintain a modern industrial economy with fairly high living standards.

Nuclear power is also not the only energy alternative available that could totally displace fossil fuels. Another option would be to construct massive arrays of space-based solar photovoltaic panels, usually referred to as Space Solar Power Satellites (SSPS). Though it is harder to estimate what the costs would be of such an undertaking it seems safe to assume that an effort of that scale would create enough demand for space launch capabilities that space launch technologies would advance as a consequence of the demand for launch services. In conjunction with a space solar power project giant reflectors could be built in space to prevent global warming.

Ironically, while Hoffert’s team recommends harnessing the Sun’s energy from space, they also suggest blocking some of it, either with giant translucent shields or mirrors. About 2 percent of the Sun’s energy would need to be blocked in order to correct for climate-warming gas production. Such an effort is called geoengineering.

"For this application a sunshield or solar parasol would have to be very large (thousands of kilometers in diameter), possibly very thin, and possibly fabricated from lunar materials," Hoffert said. "At this point, space mirrors are more of a thought experiment than a real option."

We could build space-based solar power collection systems or space-based reflectors to cool the Earth. So we could either eliminate our need for fossil fuels or neutralize the warming effects of the continuing increase in atmospheric carbon dioxide due to fossil fuel burning.

Rather than either government spending or government mandates for private spending on massive non-fossil fuel power systems, my own preference is for an increase in government funding of energy research, combined with government prizes offered for achievements in technologies that would help toward the development of alternative energy. Better to develop new technologies that the market will then choose to implement. Implementation of a mandated alternative power source in the United States would be more costly than current energy sources and would only reduce American demand for fossil fuels, while the demand of the rest of the world would grow to eventually far exceed today's world aggregate demand.

On the subject of prizes to advance energy technologies imagine, for example, a $1 million prize for every demonstrated single point increase in photovoltaic material conversion efficiency. The size of the prize per percentage point increase could even be scaled to provide larger prizes the higher the best existing efficiency becomes. So increasing from 25% to 26% conversion efficiency would not yield as big a prize as going from 50% to 51%.
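One way to sketch that scaled-prize rule in code; the $1 million base and the linear difficulty weighting are purely illustrative assumptions, not the terms of any actual prize:

```python
def prize_for_gain(old_eff, new_eff, base=1_000_000):
    """Pay `base` dollars per efficiency point gained, weighted by how
    hard further gains are (here: proportional to starting efficiency,
    normalized so a gain starting at 25% pays exactly the base amount)."""
    points = new_eff - old_eff
    difficulty = old_eff / 25.0
    return points * base * difficulty

print(prize_for_gain(25, 26))  # 1000000.0
print(prize_for_gain(50, 51))  # 2000000.0 (twice as large, per the text)
```

With this weighting a one-point gain at 50% pays double a one-point gain at 25%, matching the intent described above.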

The challenge with a prize system for advancing energy technologies would be to find a large and appropriate set of technological goals that would each have a prize offered for their attainment. For example, prizes to researchers for better batteries would have to reward higher energy/weight density, greater total energy capacity, smaller size, and a greater number of recharge cycles. There could not be just a prize for achieving a battery good enough to make electric cars feasible. We'd need lots of prizes to reward the reaching of intermediate points toward the ultimate goal.

The biggest problem with solar power is the cost. But to incentivize academic researchers to come up with better materials for making solar cells it would make more sense to leave aside the cost question and instead reward achievement of more scientific and technical goals such as new efficiency records for each of several different classes of materials. For example, separate rewards could be made for higher efficiencies of thin film carbon-based, silicon-based, nanotube-based, and other categories.

Update: Ergosphere blogger Engineer-Poet takes a look at the Oswald paper and argues that the Oswalds made some errors in their calculations and that the number of nuclear reactors needed to power cars isn't nearly as many as the Oswalds believe.

By Randall Parker 2004 October 08 11:28 PM  Energy Tech
Entry Permalink | Comments(28)
2004 October 07 Thursday
High Blood Pressure Harmful To Young Brains Too

High blood pressure contributes to a more rapid decline in cognitive function with age.

ORONO, Maine – High blood pressure in otherwise healthy adults between the ages of 18 and 83 is associated with a measurable decline in cognitive function, according to a report published today by University of Maine researchers in the pre-publication online edition of the journal Hypertension. The article will appear in the October issue of the printed journal.

While they characterize the decline as “relatively minor and manageable in terms of everyday functioning,” the authors say their findings underscore the importance of treatment for high blood pressure. In the study, younger individuals (18-47) performed at a higher level on cognitive function tests than did older individuals (48-83), but they, like older individuals, showed blood pressure-related decline in cognitive function over time.

There is a larger lesson here for younger people because this study fits into a larger pattern. Lots of factors that increase the risk of heart disease, cancer, cognitive decline, and other symptoms of aging characteristic of later life also degrade performance earlier in life. For example, exercise boosts cognitive function at any age. So lack of exercise when one is in one's 20s or 30s is reducing one's cognitive function below what it otherwise would be at those ages. It is never too early to start getting lots of exercise or eating an optimal diet.

An IQ test was used to show the decline in cognitive function due to high blood pressure.

Subjects in the study exhibited a normal range of cognitive functioning, as determined by the Wechsler Adult Intelligence Scale (WAIS). People suffering from dementia, diabetes, psychiatric illness, alcoholism, drug abuse or stroke were excluded.

In tests of four major areas of mental function, the researchers found that measurements of problem solving abilities under time constraints showed a statistically significant association with blood pressure in younger and older adults, aged 18-83.

Unfortunately, just as the rising rate of childhood obesity is raising the risk of heart disease and insulin-resistant diabetes, it is also increasing the incidence of early-onset high blood pressure.

"What we're finding is that with the current epidemic of overweight and couch-potato children, a higher percentage than ever before are in the hypertensive range," said Dr. Julie R. Ingelfinger, a pediatrics professor at Harvard Medical School.

Obesity has got to be the biggest health problem in the industrialized nations. It increases the risk of many different diseases. Obesity not only increases the risk of cancer but it also has recently been shown that the risk of dying from breast cancer among those who are diagnosed is greater if one is overweight. So obesity makes cancer not only more likely to happen but more deadly for those who get it.

On the bright side, if you take blood pressure lowering drugs you will probably be at less risk of osteoporosis and bone fractures.

They found that taking beta-blockers together with thiazide diuretics, which protect against bone loss, was linked to a reduced risk of fracture of 29%.

Using beta-blockers alone for around six months was linked to a 23% reduced risk. Taking thiazides alone was associated with a 20% reduced risk.

By Randall Parker 2004 October 07 06:04 PM  Aging Studies
Entry Permalink | Comments(2)
2004 October 06 Wednesday
Cell Phones And Shopping Networks Problematic For Human Nature

Andrew Monk at the University of York and coworkers found that what makes the overhearing of cell phone conversations so annoying is that you can only hear one side of them.

We also feel an innate need to listen when we can only hear one side of a conversation, the researchers say. Even if it's no louder than a regular two-way exchange, the fact that we can only hear half means that we instinctively tune in, almost as if we're expecting to join in to complete the conversation.

If this idea is correct, the researchers reason, then mobile phone chatter should be no more annoying than overheard conversations where both people are present but only one voice is audible. When Monk and his team tested their theory on railway passengers in Britain, that's exactly what they found.

The article reports that there are now silent carriages on British trains, which only came about after the introduction of cell phones. Humans were never sufficiently irritated by each other's conversations to demand such carriages before cell phones came along.

This strikes me as yet another example of how modern communications technologies create environments that do not mesh well with how humans were evolved to relate to each other. It was unusual historically to find oneself able to listen to only one half of conversations. It was also unusual to be able to listen but not be able to join in on a conversation. The mind is wired up to listen to complete conversations (and even to ignore them as a whole) and finds it irritating to hear only half a conversation. I've certainly found myself annoyed at having to listen to people talk on phones. This result shows this reaction is not uncommon.

A cable arts channel occasionally shows a 1908 documentary "Moscow clad in snow" which shows what life was like outside in Moscow in the winter. One of the striking things about the film is the one horse sleighs moving in long lines along busy streets so slowly that people could talk to each other (not that one could tell from the film whether they did so) as they passed by. The sleighs were all open and the horses were moving at a speed that would allow casual exchanges. Compare that to cars today on the road. People rarely can speak to each other and events happen much more quickly. Technology has created unnatural ways for people to interact and "road rage" should not be an entirely surprising result.

Another example of unnatural interactions is with TV and movies and the idolization of stars and imagined relationships with people that most people will never meet. Imagined relationships are a part of the TV shopping channel experience that increases sales.

In order to determine if viewers developed close relationships with program hosts, the participants were asked to rate on a scale how much they agreed with statements like “The hosts are almost like friends you see everyday.”

Impulse shopping was measured by how participants rated their agreement with statements like “I decide what to buy after I watch television shopping programs.”

It’s not surprising that viewers develop close relationships with hosts, Lennon said. The shopping channels actively encourage viewers to feel close to the hosts.

“The hosts and guests on these shopping programs use a variety of conversational techniques that may encourage pseudo-interactive responses on the part of viewers,” she said.

“The hosts focus on similarities between the viewers and themselves, in order to facilitate a relationship.”

In addition the hosts invite viewers to contact them, and often provide e-mail and postal addresses, as well as telephone numbers to contact the hosts.

“Viewers develop attachments to their favorite hosts, and we find that this encourages viewers to buy more impulsively without considering whether they need the clothing they are buying,” Lennon said.

We can create changes in our environments orders of magnitude more rapidly than humans can evolve to adapt to those changes. Rules such as bans on cell phone use in many situations are natural responses to unnatural and problematic changes wrought by technological advances. It is not ignorant Luddism to support such rules. Humans need to regulate changes in their environments to keep down stress and maladaptive responses to technological changes.

By Randall Parker 2004 October 06 04:45 PM  Brain Society
Entry Permalink | Comments(2)
2004 October 05 Tuesday
X Prize Shows Prizes Can Speed Technological Advances

The $10 million Ansari X Prize has been won by another successful SpaceShipOne flight.

SpaceShipOne, the sleek combination of rocket and glider designed by Burt Rutan and financed by the billionaire Paul G. Allen, reached a record altitude of 368,000 feet, or 69.7 miles, blasting past the 337,600-foot altitude reached by the same ship last week.

The prize will actually be paid by an insurance company.

The prize, which required two flights in two weeks, will be paid by a special "hole-in-one" insurance policy, a common method of financing prize contests in which an insurance company essentially bets against success. The premium for the policy was paid by Anousheh Ansari, a telecommunications entrepreneur in Texas and a board member of the X Prize Foundation; she said that it cost "in excess of a million" dollars.

A Bermuda insurance company has just lost millions of dollars on their bet against an X Prize winner.

And so Diamandis and his backers found a Bermuda insurance company that was willing to underwrite the prize as, essentially, a bet it expected never to be collected. Even then, it took a major contribution from Anousheh Ansari, a young engineer who made $180 million in the telecom boom, to get the premiums paid up.

My guess is that it will be a lot harder in the future to find insurance companies willing to insure against the achievement of aerospace prize goals.

The prize money will be shared by Paul Allen, Burt Rutan, and Scaled Composites employees.

Allen put up all the cash for developing the spacecraft, but said he'll share the prize with Rutan's company, Scaled Composites, which built it. Rutan, in turn, said the company will distribute its share of the winnings among employees.

But the more important story here is not about space flight or human exploration. The more important story is that prize money can very efficiently speed the rate of technological advance in targeted areas. If NASA's entire budget were shifted over into prize money it would do far more to accelerate the development of space technology than the current set of programs that NASA funds.

The success of the X Prize is spawning imitators. A $50 million prize may be offered for an orbital vehicle.

Bigelow Aerospace is reportedly on the verge of offering a $50 million American Space Prize to any private American company that can develop a reliable orbital vehicle. There's a good reason for that. Bigelow has been working on orbiting space habitats - and needs an orbital rocket for people to get to them.

The offering of prize money for flight achievements represents a return to a practice that has produced at least one great historical success.

Raymond Orteig emigrated to New York from France in 1912. He worked as a bus boy and café manager and eventually acquired two New York hotels which were popular with French airmen assigned to duty in the United States during the Great War. In 1919 Raymond Orteig offered a prize of $25,000 for the first nonstop aircraft flight between New York and Paris. By the mid-1920s, airplanes had finally developed enough to make such a flight possible.

The Orteig Prize in the 1920s spurred many teams to develop better aircraft.

The Orteig Prize stimulated not one, but nine separate attempts to cross the Atlantic. To initiate the flights, competitors raised and spent some $400,000, or 16 times the amount of the prize. As a result of these early aviation prizes, the world's $250 billion aviation industry was created. The ANSARI X PRIZE hopes to spur the creation of a vibrant commercial space industry through the $10M competition.

Some critics claim the X Prize only achieved a repeat of what the X-15 did over 45 years ago. But SpaceShipOne incorporates many technological advances that were unavailable when the X-15 was built.

Designer Burt Rutan said what makes his SpaceShipOne so robust is its lightweight graphite and epoxy materials (it weighs about 6,000 pounds and can be towed by a pickup), its safer rubber and nitrous oxide propulsion fuel, and its ability to fold and open its wings, which stabilizes the craft. With the exception of refueling its rocket motors, 97 percent of the spaceship was reused for the two X Prize flights.

But what is amazing in this story is the low cost of SpaceShipOne's development: estimates range between $20 million and $30 million. Prizes for cutting-edge technological achievements give America's and the world's many multimillionaires and billionaires entertaining and ego-gratifying ways to use their cash to push the envelope of what is technologically possible. Putting up technological goal posts and declaring contests with large cash prizes is a great way to spur incredibly cost-effective competition. We need more prizes for more technological goals.

The research and development area most in need of prize funding is the approach to aging rejuvenation therapies known as Strategies for Engineered Negligible Senescence (SENS). There is now one prize aimed at this topic, the Methuselah Mouse Prize. Its goal is to give researchers an incentive to develop biotechnologies that will double the life expectancy of lab mice from 3 years to 6 years. Currently the prize has a half million dollars in funding.

Note that academic researchers already have large funding agencies to which they can apply for grants, and those agencies are to some extent steered by which topics researchers choose to pursue. If top researchers in many fields started applying for research grants to explore the development of various SENS therapies then some of those grants would get funded and more SENS research would get done. Financial incentives in the form of prize money could sway a lot of existing research money in the direction of rejuvenation research. So prize money for the achievement of SENS research goals could potentially sway the allocation of literally orders of magnitude more money than was used to win the X Prize.

For more on the topic of SENS research and prize money the Fight Aging! blog has 3 posts on the X Prize, the Methuselah Mouse Prize and how the fight against aging can be accelerated with prize money. See here and here and here for more.

Update: The X Prize's multiplier effect on money spent is going to be much lower than was the case with the Orteig prize.

How much investment the X Prize has spurred won't be known until next year, after all the various teams have wrapped up their work on prototype space taxis for tourists. Most contenders have been scrimping along on shoestring budgets, so the X Prize isn't likely to reach the 16-to-1 payback ratio of the Orteig Prize. Rutan's Mojave (Calif.) company, Scaled Composites, chewed through some $25 million of the fortune Allen earned as a founder of Microsoft.

However, there is a positive spin to put on that news: the Orteig Prize stood for 8 years before its goal was achieved, giving a succession of teams plenty of time to come along, spend money, and fail. The X Prize was announced in 1996 but did not become fully funded until some time in 2001, when an insurance policy was negotiated to fund the prize for a limited period of time.

Bermuda-based insurer XL Capital took the wager. The firm required regular payments of $50,000 to $100,000 from Diamandis and a deadline in 2003 for someone to make it to space, a date that later was extended to Jan. 1, 2005.

It was a risky move. The contract with XL stipulated that if Diamandis missed even one premium payment, the deal was off and the firm got to keep whatever had been paid in.

Prizes with more certain funding and funding for much longer periods of time could produce much larger multiplier effects in terms of dollars spent. Also, prizes aimed at researchers who can write grant applications to get money to pursue prize goals could produce even larger multiplier effects.

By Randall Parker 2004 October 05 03:06 PM  Airplanes and Spacecraft
Entry Permalink | Comments(5)
2004 October 04 Monday
Study Participants Go Without Internet Access, Experience Withdrawal

Yahoo! Inc. and the OMD media agency sponsored a study of how 28 people reacted to being deprived of the internet for two weeks. People reported feelings of withdrawal and isolation.

It's a sign of the times that to get people to agree to the deprivation in the first place, researchers paid as much as $950 per household. In video diaries, participants talked about feeling "withdrawal" as they resisted the temptation to log on.

Neighbors? Such things exist in the physical world? But our virtual neighborhoods are so much more interesting. I can't find real neighbors nearly as interesting as my virtual neighbors.

Could a study like this be done as a reality TV show? Picture people being interviewed about their feelings of withdrawal from the internet. Picture some guy flipping between TV channels to try to recreate the experience of changing web pages. Or a girl could stop by her girlfriend's house to ask what topics are being discussed in instant-messaging chats with their other friends. Take away phone use and really watch the girl beg for information.

At least three-fourths said they spent more time talking on the phone, watching TV or movies, and reading newspapers. Some reported visiting their neighbors, playing games, and exercising more.

The internet makes people feel more secure and powerful. (same article here)

Internet users feel confident, secure and empowered. The Internet has become, to some, the ultimate symbol of modernity to the point that participants were hobbled without convenient access to routine information like maps and telephone numbers. The pervasive nature of the Internet is such that participants often forgot or lost the desire to use "old fashioned tools" like the phone book, newspapers and telephone-based customer service.

The loss of communications ability was felt more keenly than the loss of the ability to do research, look up information, or engage in commercial transactions.

"I haven't talked to people I usually talk to and have been tempted to go on instant-messenger because I feel out of the loop," said study participant Kristin S.

"I'm starting to miss emailing my friends -- I feel out of the loop," said study participant Penny C.

According to the research, communications figured most prominently in the withdrawal process, demonstrating a new social network paradigm. The study shows that the Internet affords people the ability to overcome time and distance and to manage communications with a larger social circle, thereby creating an effortless community. Participants in the study found they missed the ability to exercise control over the pace and content of communication with different layers of friends and families. As a result, during the deprivation period, participants' outer circle of relationships suffered.

One can end an online work break faster than a physical work break. That makes sense. It is probably harder to politely end a conversation in the hallway at work as compared to ending a messaging session.

"I miss the private space the Internet creates for me at work," said Kim V.

"I've been taking physical breaks instead of online breaks at work. The difference is that I can't get right back into what I was doing," said Ryan V.

Have you tried kicking the internet for a few weeks? Do you feel an emptiness if you go on vacation without it?

By Randall Parker 2004 October 04 01:10 PM  Comm Tech Society
Entry Permalink | Comments(14)
Stress And Violence Feed Back In Vicious Cycle

Hormonal and neural changes caused by stress and aggression feed back on and amplify each other.

WASHINGTON -- Scientists may be learning why it's so hard to stop the cycle of violence. The answer may lie in the nervous system. There appears to be a fast, mutual, positive feedback loop between stress hormones and a brain-based aggression-control center in rats, whose neurophysiology is similar to ours. It may explain why, under stress, humans are so quick to lash out and find it hard to cool down. The findings, which could point to better ways to prevent pathological violence, appear in the October issue of Behavioral Neuroscience, which is published by the American Psychological Association (APA).

In five experiments using 53 male rats, behavioral neuroscientists from the Netherlands and Hungary studied whether stimulating the brain's aggression mechanism raised blood levels of a stress hormone and whether higher levels of the same hormone led to the kind of aggression elicited by that mechanism. The results showed a fast-acting feedback loop; the mechanism works in both directions and raising one variable raises the other. Thus, stress and aggression may be mutually reinforcing, which could explain not only why something like the stress of traffic jams leads to road rage, but also why raging triggers an ongoing stress reaction that makes it hard to stop.

In the study, the scientists electrically stimulated an aggression-related part of the rat hypothalamus, a mid-brain area associated with emotion. The rats suddenly released the stress hormone corticosterone (very like cortisol, which humans release under stress) -- even without another rat present. Normally, rats don't respond like that unless they face an opponent or another severe stressor.

Says lead author Menno Kruk, PhD, "It is well known that these stress hormones, in part by mobilizing energy reserves, prepare the physiology of the body to fight or flee during stress. Now it appears that the very same hormones 'talk back' to the brain in order to facilitate fighting."

To study the hypothesized feedback loop from the other direction, the scientists removed the rats' adrenal glands to prevent any natural release of corticosterone. Then researchers injected the rats with corticosterone. Within minutes of injection, the hormone facilitated stimulation-evoked attack behavior.

Thus, in rapid order, stimulating the hypothalamic attack area led to higher stress hormones and higher stress hormones led to aggression – evidence of the feedback loop within a single conflict. Write the authors, "Such a mutual facilitation may contribute to the precipitation and escalation of violent behavior under stressful conditions."

They add that the resulting vicious cycle "would explain why aggressive behavior escalates so easily and is so difficult to stop once it has started, especially because corticosteroids rapidly pass through the blood-brain barrier." The findings suggest that even when stress hormones spike for reasons not related to fighting, they may lower attack thresholds enough to precipitate violent behavior. That argument, if extended in research to humans, could ultimately explain on the biological level why a bad day at the office could prime someone for nighttime violence toward family members.

The stress response is an evolutionary legacy that in modern conditions is mostly maladaptive. Most people who get angry or frustrated and therefore feel stress are not benefiting from it and are even being harmed by it. We need better biotechnological tools for suppressing the stress response. This would do more than reduce the incidence of acts of violence. Heart disease, general aging, depression, and other maladies would occur less frequently if stress responses happened less often.

Regarding my previous post on car cruise controls and automated driving: One of the benefits of being able to turn driving over to computers would be a reduction in feelings of stress. The stress of fighting commuter traffic comes on top of stresses associated with work and home life. Lower levels of stress made possible by automated driving computers would reduce both illness and violence.

By Randall Parker 2004 October 04 12:25 PM  Brain Violence
Entry Permalink | Comments(5)
2004 October 03 Sunday
Cameras On Cars Read Road Signs And Detect Obstacles

An Australian team has mounted video cameras on cars and fed the video signal into software that can decode signs to detect when a car is driving too fast or failing to respond to a stop sign.

The software scans the video pictures and detects road signs by recognising their symmetrical shapes: rectangles, diamonds, octagons or circles. Once a sign is detected, the image is compared to a list of signs stored in the computer’s memory. If it recognises a stop sign, the computer checks if the car is slowing down.

The computer uses a commercial package called FaceLab to analyse images from the stereoscopic cameras and work out where the driver is looking. If the driver appears not to have seen a sign, and the car’s speed does not change, an alert is issued, says Nick Barnes, one of the developers at NICTA.

Most such warnings wouldn't be popular with speeders. But an adjustment could be added that lets you specify how far above the speed limit you want to go before being notified. Plus, an "Off" switch would be really helpful - especially in an emergency if you are in a mad dash to take someone to a hospital emergency ward.
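The decision logic described above, including the speed-tolerance adjustment and off switch just suggested, could be sketched roughly like this. This is a toy illustration: the function names, inputs, and thresholds are my own assumptions, not NICTA's actual code, and the shape-based sign detection and FaceLab gaze tracking are stubbed out as plain inputs.

```python
def should_alert(sign_type, speed_kmh, prior_speed_kmh,
                 driver_saw_sign, tolerance_kmh=0.0, enabled=True):
    """Warn only when the driver appears to have missed a sign and the
    car's speed is not responding. tolerance_kmh lets the driver pick how
    far over the limit to go before being notified; enabled is the
    off switch for emergencies."""
    if not enabled or driver_saw_sign:
        return False
    if sign_type == "stop":
        # A stop sign was recognized: expect the car to be decelerating.
        return speed_kmh >= prior_speed_kmh
    if sign_type.startswith("limit:"):
        limit = float(sign_type.split(":", 1)[1])
        # Speed-limit sign: alert only past the driver-chosen margin.
        return speed_kmh > limit + tolerance_kmh
    return False

# Stop sign detected, driver never looked at it, car holding speed -> alert.
print(should_alert("stop", 60.0, 60.0, False))                          # True
# 58 km/h past a 50 sign, but the driver set a 10 km/h tolerance -> quiet.
print(should_alert("limit:50", 58.0, 58.0, False, tolerance_kmh=10.0))  # False
```

The off switch and tolerance are just extra parameters here; in a real system they would be driver-facing controls layered on top of the recognition pipeline.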

Radar-based adaptive cruise control with automated speed adjustment to avoid collisions first debuted on passenger cars in 1998 in an S-class Mercedes, or possibly a year earlier in a 1997 Toyota sold in Japan's domestic market. More recently some Spanish researchers have demonstrated optically-based adaptive obstacle avoidance.

A camera-based cruise control system that automatically slows a car down to avoid potential accidents has been developed by Spanish researchers.

The prototype system - which uses a single dashboard camera to monitor traffic ahead - has been installed and tested by Miguel Ángel Sotelo and colleagues at the University of Alcalá, Spain.

An optical system has the advantage that it could be combined with the sign-reading system described above and do other image processing to detect dangers that a radar-based system might miss (e.g. a deer running out of the cover of woods close to the road). But image processing is probably algorithmically harder and probably also requires greater computer processing power.

A cruise control that could read existing signs could detect when the speed limit changed and automatically adjust the car's speed up or down to the new limit. Further into the future, electronic bar codes, either optical or electromagnetic, could be embedded in roadways to be read automatically by passing cars.
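A speed-limit-following cruise control of that sort might look, in rough outline, like the following. The class and its interface are invented for illustration; no real automotive system is this simple.

```python
class SignAwareCruiseControl:
    """Hypothetical cruise control that retargets to each newly read
    speed-limit sign, plus a driver-chosen margin above the limit."""

    def __init__(self, tolerance_kmh=0.0):
        self.tolerance_kmh = tolerance_kmh
        self.target_kmh = None  # no target until a sign has been read

    def on_speed_limit_sign(self, limit_kmh):
        # Called by the sign-recognition pipeline on each new limit sign.
        self.target_kmh = limit_kmh + self.tolerance_kmh

    def throttle_command(self, current_kmh, gain=0.1):
        # Simple proportional controller: positive output speeds up,
        # negative slows down, toward the current target.
        if self.target_kmh is None:
            return 0.0
        return gain * (self.target_kmh - current_kmh)

cc = SignAwareCruiseControl(tolerance_kmh=5)
cc.on_speed_limit_sign(100)      # car passes a 100 km/h sign
print(cc.target_kmh)             # 105
print(cc.throttle_command(95))   # positive: accelerate toward 105
```

A proportional controller is the crudest possible choice; the point is only that a recognized sign becomes a new setpoint without any driver action.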

But the problem with all these computer-assisted driving gadgets is that they do not yet eliminate the need for drivers to pay attention. What we really need are cars that can drive themselves. The big savings in time will come when drivers no longer have to pay attention. Sometimes active driving can be fun and even pleasantly distracting from the rest of life. But lots of hours spent behind the wheel are just wasted time that could be used to perform various tasks. Once high-speed cellular modems are installed along most roadways one could, for example, shop on the internet or read internet news sites while being whisked home by a robotic car. Or one could carry on intellectually more demanding phone conversations for one's job without that distraction putting at risk the lives of everyone else on the road. Or one could simply do more sight-seeing.

I see these smaller incremental improvements in road condition sensing, automatic cruise control adjustment, and the like as helpful steps that will allow the testing and maturing of various technological components that will gradually let computers take over more of the work of driving. The first real step toward total automation will probably occur on special lanes on highways. The more complex environment presented by city and residential neighborhood driving will likely remain in the hands of humans for one or two decades longer.

By Randall Parker 2004 October 03 07:03 PM  Travel Tech
Entry Permalink | Comments(4)