"This model embraces the human brain as a high-speed Internet rather than a computer. The quality of the Internet's connections is the key to its speed, fidelity and overall capability," said Dr. George Bartzokis, the author and visiting professor of neurology at UCLA's David Geffen School of Medicine. He also is director of the UCLA Memory Disorders and Alzheimer's Disease Clinic and Clinical Core director of the UCLA Alzheimer's Disease Research Center.
As the brain continues to develop in adulthood and as myelin is produced in ever greater quantities, cholesterol levels in the brain grow and eventually promote the production of a toxic protein. That protein attacks myelin, disrupts message transfer through the axons and eventually leads to the brain/mind-destroying plaques and tangles visible years later in the cortex of Alzheimer's patients.
Bartzokis' analysis of magnetic resonance images and post-mortem tissue data suggests that genetic factors coupled with the brain's own developmental process of increasing cholesterol and iron levels in middle age help degrade the myelin. The papers describe how complex connections that take the longest to develop and allow humans to think at their highest level are among the first to deteriorate as the brain's myelin breaks down in reverse order of development.
"The body was designed to myelinate through the natural lifespan. Medical advances, however, have expanded the lifespan well beyond the brain's natural capacity to operate in a healthy, efficient manner," Bartzokis said. "The process of adult brain development and becoming 'wiser' has this downside that evolution could not anticipate."
This new model of brain development and degeneration suggests that the best time to address the inevitability of myelin breakdown is when it begins, in middle age. By the time the effects of Alzheimer's disease become apparent in a patient's 60s, 70s or 80s, it may be too late to reverse the course of the disease.
Preventive therapies worth investigating include cholesterol- and iron-lowering medications, anti-inflammatory medications, diet and exercise programs and possibly hormone replacement therapy designed to prevent menopause rather than simply ease the symptoms. In addition, education or other activities designed to keep the mind active may stimulate the production of myelin. Finally, there may be ways to address genetic and environmental factors that accelerate the degeneration process.
The brain will probably turn out to be the most difficult organ in the body for which to develop therapies that stop and reverse aging. For many organs the solution will simply be to grow replacements. But your brain is your identity. Swapping out the brain defeats the whole purpose of trying to delay aging and death.
Repairing the brain in place will be helped by the eventual ability to deliver replacement stem cells into the hippocampus to replace aged reservoirs of stem cells. But we also need gene therapies that can be delivered to neurons throughout the brain to repair individual cells. The development of those therapies may take 10, 20, or even 30 or more years.
In the meantime it would be helpful to have better ways to slow down brain aging. Lowering blood LDL cholesterol and raising blood HDL cholesterol would both likely slow brain aging. While there are drugs for lowering overall cholesterol, so far we have no good way to raise HDL cholesterol aside from exercise and healthier living (aside: having a dog that wants to be run every day has done more to make me exercise than anything else I've tried). The development of pharmaceuticals that would reduce the stress on the brain (both emotional stress and the stress caused by free radicals and other compounds in the blood) is another potential avenue of attack.
The ability to prevent brain deterioration would have enormous beneficial economic consequences. Brain work makes up an increasing fraction of the economy as more and more manual tasks are automated. A delay in the decay of the mental abilities of middle-aged and elderly people would boost their productivity and allow them to work for more years. Western countries faced with aging populations ought to spend more money on research aimed at slowing and reversing brain aging. The cost of the research would be paid back many times over through increased productivity, longer work lives, and avoidance of the costs of caring for mentally incapacitated elderly.
Berkeley - Disabling a set of genes in a strain of the tuberculosis bacteria surprisingly led to a mutant form of the pathogen that multiplied more quickly and was more lethal than its natural counterpart, according to a new study led by researchers at the University of California, Berkeley.
As early as two weeks after infection, researchers recovered significantly more bacteria from the organs of mice infected with the mutated tuberculosis (TB) bacteria than from mice infected with the unmodified, or "wild-type," strain. By 27 weeks, the mutant-infected mice started to die, while their counterparts infected with the wild-type strain survived until the end of the experiment at 41 weeks.
"These findings came as a complete surprise to us," said Dr. Lee Riley, professor of epidemiology and infectious diseases at UC Berkeley's School of Public Health and principal investigator of the study. "We thought we had made a mistake, so we repeated the test several times, and we always got the same result."
The researchers say the study, to be published Dec. 8 in Proceedings of the National Academy of Sciences, sheds light on the mechanisms used by a pathogen that now infects one-third of the world's population and kills 2 million people per year. According to the World Health Organization, which in 1993 declared TB a global emergency, an estimated 36 million people could die of TB by 2020 if the disease is not controlled.
The results were unexpected because prior studies pointed to the mce1 operon, the collection of genes that researchers disabled in the TB bacteria, as an important virulence factor that helped the organism invade cells. Researchers expected that mutating the mce1 genes would impair the pathogen's ability to infect the mice. Instead, the bacteria became more deadly.
"This is one of the very few hypervirulent organisms ever created," said Lisa Morici, a lead author of the study who received her Ph.D. in infectious diseases from UC Berkeley in May. "This breaks a long-standing assumption among scientists that disabling a potential virulence gene weakens a pathogen."
This is the second incident that I'm aware of where investigators were trying to bioengineer a less dangerous strain of a pathogen and instead inadvertently created a far more dangerous strain. In the previous case, reported in January 2001, researchers added the gene for IL-4 (interleukin-4, which is involved in immune response) to a mousepox virus. The result was a 100% fatal mousepox.
The mousepox discovery probably presents the greater potential danger if the IL-4 gene, engineered into human smallpox, would have the same effect. The reason for the greater danger from smallpox is that it is a virus, and it is very difficult to develop drugs that stop viruses. The most effective treatments for viruses to date are vaccines. If a bioengineered smallpox were essentially able to defeat the immune system then vaccines might turn out to be worthless as a means to protect against it.
This is not to say that a bioengineered TB would not be dangerous. Such a form of TB, released into a population, would kill a lot of people. But at least we'd have a fighting chance of coming up with antibiotics to save people who would be infected by it.
The other factor here is transmissibility. Would a bioengineered TB or smallpox be more transmissible? The TB might kill its victims so quickly that they would not be able to spread it very well. The genes that the scientists disabled may well have been selected for precisely because they allow the TB to infect without killing, so that the carrier could live long enough to transmit the disease to others. Low transmissibility, however, might make such a pathogen attractive to bioterrorists: if they could be assured that its release would kill few beyond those infected initially, then it would not eventually spread back to their own countries.
Would measures against transmission be easier or harder to implement against the TB or smallpox? Can anyone provide an educated guess?
Surely more such accidental discoveries are in store. What we need are more discoveries, accidental or otherwise, that point the way toward how to develop better defenses against both bioengineered and naturally occurring pathogens.
A fascinating article published in the American Journal of Psychiatry by Swedish medical researcher Lars Farde, M.D., Ph.D., and colleagues from the Karolinska Institute reports that the concentration of serotonin receptors in the brain correlates inversely with spirituality.
Jacqueline Borg, Bengt Andrée, Henrik Soderstrom, and Lars Farde
The Serotonin System and Spiritual Experiences
Am J Psychiatry 2003 160: 1965-1969.
METHOD: Fifteen normal male subjects, ages 20-45 years, were examined with PET and the radioligand [11C]WAY100635. Personality traits were assessed with the Swedish version of the Temperament and Character Inventory self-report questionnaire. Binding potential, an index for the density of available 5-HT1A receptors, was calculated for the dorsal raphe nuclei, the hippocampal formation, and the neocortex. For each region, correlation coefficients between 5-HT1A receptor binding potential and Temperament and Character Inventory personality dimensions were calculated and analyzed in two-tailed tests for significance.

RESULTS: The authors found that the binding potential correlated inversely with scores for self-transcendence, a personality trait covering religious behavior and attitudes. No correlations were found for any of the other six Temperament and Character Inventory dimensions. The self-transcendence dimension consists of three distinct subscales, and further analysis showed that the subscale for spiritual acceptance correlated significantly with binding potential but not with the other two subscales.

CONCLUSIONS: This finding in normal male subjects indicated that the serotonin system may serve as a biological basis for spiritual experiences. The authors speculated that the several-fold variability in 5-HT1A receptor density may explain why people vary greatly in spiritual zeal.
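The statistical method in the abstract is simple enough to sketch. Below is a minimal illustration of a two-tailed Pearson correlation test on 15 subjects, with invented data (the real binding potentials and questionnaire scores are not given in the abstract), constructed so that self-transcendence scores fall as binding potential rises:

```python
import numpy as np
from scipy.stats import pearsonr

# Invented data for 15 subjects; purely for illustration.
rng = np.random.default_rng(0)
binding_potential = rng.uniform(3.0, 6.0, size=15)  # stand-in 5-HT1A binding potentials
# Scores built to decline as binding potential rises, plus noise,
# mimicking the inverse correlation the study reports.
self_transcendence = 40.0 - 4.0 * binding_potential + rng.normal(0.0, 2.0, size=15)

# pearsonr returns the correlation coefficient and a two-tailed p-value,
# the same kind of test the abstract describes for each brain region.
r, p = pearsonr(binding_potential, self_transcendence)
print(f"r = {r:.2f}, two-tailed p = {p:.4f}")
```

With only 15 subjects, a correlation must be fairly strong to reach significance in a two-tailed test, which is worth keeping in mind when weighing the study's conclusions.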
Are there particular alleles in genes that cause different humans to have different numbers of serotonin receptors? Do more spiritual people, on average, have genetic variations that make them produce fewer serotonin receptors per nerve cell or fewer nerve cells that make serotonin receptors?
Currently, are spiritual people having more children than non-spiritual people? If so, are the alleles that increase serotonin receptor concentrations being selected against? Does the extent to which spirituality correlates with larger family size differ between societies? Are some societies, then, being selected to be more spiritual than others?
To reiterate an argument I've made in the past: Once it becomes possible to control which genetic variations people pass on to their offspring, and once variations that alter personality are discovered, the average personality types born to people of different regions, countries, occupations, economic classes, and religious beliefs will diverge. People will make decisions to make their children more like what they want ideal children to be. Imagine religious believers choosing to make their children have personalities that are highly spiritual while at the same time scientists and engineers choose to have children who are highly rational and skeptical. This could lead to genetic religious wars.
If people in some regions of the world decide to make their children more spiritual and other regions make their children more rational and skeptical then one can imagine wars being fought as a result of conflicts of values that flow from fundamental differences in brain wiring. One can also imagine wars fought to stop the people or governments of opposing countries from creating offspring that are either seen as a security threat (e.g. a highly willing deeply spiritual suicide martyr personality type) or as a blasphemy against god.
Scripps Research Institute researchers have found a molecule that will convert differentiated adult muscle cells into undifferentiated stem cells.
La Jolla, CA. December 22, 2003—A group of researchers from The Scripps Research Institute has identified a small synthetic molecule that can induce a cell to undergo dedifferentiation—to move backwards developmentally from its current state to form its own precursor cell.
This compound, named reversine, causes cells that are normally programmed to form muscles to undergo reverse differentiation—retreat along their differentiation pathway and turn into precursor cells. These precursor cells are multipotent; that is, they have the potential to become different cell types. Thus, reversine represents a potentially useful tool for generating an unlimited supply of such precursors, which subsequently can be converted to other cell types, such as bone or cartilage.
"This [type of approach] has the potential to make stem cell research more practical," says Sheng Ding, Ph.D. "This will allow you to derive stem-like cells from your own mature cells, avoiding the technical and ethical issues associated with embryonic stem cells."
Ding, who is an assistant professor in the chemistry department at Scripps Research, conducted the study—to be published in an upcoming issue of the Journal of the American Chemical Society—with Peter G. Schultz, Ph.D., who is a professor of chemistry and Scripps Family Chair of Scripps Research's Skaggs Institute of Chemical Biology, and their colleagues.
Stem cell therapy would be most effective if you could use your own stem cells, since using one's own cells would avoid potential complications from immune rejection of foreign cells. However, in general it has proven very difficult to isolate and propagate stem cells from adults. Embryonic stem cells (ESCs) offer an alternative, but face both practical and ethical hurdles associated with the source of cells as well as methods for controlling the differentiation of ESCs. A third approach is to use one's own specialized cells and dedifferentiate them.
This is excellent news. The ability to use one's own cells as a starting point for making stem cells to treat one's own illnesses would avoid the immune rejection problems that make stem cells from non-self sources so problematic.
This molecule reversine is not the only recently discovered molecule useful for controlling the differentiation state of cells. Currently human embryonic stem cells have to be grown on top of mouse cells to prevent the stem cells from differentiating; the mouse cells release compounds that block differentiation. The problem with doing this is that the human cells may get contaminated in some way that makes them risky to use in human therapy. Well, Ali Brivanlou of Rockefeller University has identified a compound extracted from sea snails that prevents stem cell differentiation.
Ali Brivanlou of Rockefeller University in New York says that he and his colleagues may have found a partial solution to these problems. Brivanlou treated ES cells with a chemical, nicknamed BIO, from a sea snail.
BIO stopped ES cells turning into specialized adult cells, Brivanlou and his colleagues found. BIO works by activating a set of protein signals - called the Wnt pathway - in the ES cells.
Slowly but surely more tools and techniques are being developed to make stem cell growth, differentiation, and dedifferentiation controllable.
Soot particles may be twice as bad as the greenhouse gas carbon dioxide in contributing to global warming, suggests a new study.
Grains of soot deposited in snow have also caused about one-quarter of the observed rise in global surface temperature since 1880, suggests the model by James Hansen and Larissa Nazarenko. The pair examined how soot particles affect the atmosphere when they darken snow and ice.
Note that when soot causes ice to melt, previously white surface areas become darker and hence absorb more sunlight and warm further. The melting of ice and packed snow thus raises temperatures and causes still more ice to melt: a positive feedback.
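To make the feedback concrete, here is a toy calculation of how much more solar energy a darkened or melted surface absorbs. The albedo values are rough illustrative figures, not numbers from the Hansen study:

```python
# Toy ice-albedo calculation with rough illustrative values.
solar_flux = 340.0          # W/m^2, approximate global-mean incoming sunlight
albedo_clean_snow = 0.80    # fresh snow reflects about 80% of sunlight
albedo_sooty_snow = 0.70    # soot-darkened snow reflects less
albedo_bare_ground = 0.20   # once the snow melts away entirely

def absorbed(albedo):
    """Solar energy absorbed by a surface with the given albedo (W/m^2)."""
    return solar_flux * (1.0 - albedo)

for name, a in [("clean snow", albedo_clean_snow),
                ("sooty snow", albedo_sooty_snow),
                ("bare ground", albedo_bare_ground)]:
    print(f"{name}: {absorbed(a):.0f} W/m^2 absorbed")
```

Even a modest darkening of the snow raises absorbed energy by half again, and full melt-out quadruples it, which is why the feedback amplifies the initial soot effect.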
Hansen, director of NASA's Goddard Institute for Space Studies, and Nazarenko, a staff associate there, found soot is twice as potent as carbon dioxide in changing global surface air temperatures in the Arctic and the Northern Hemisphere.
The NASA Goddard Space Flight Center press release page has nice graphs and animations that make it worth a visit.
Hansen and Larissa Nazarenko, both of the Goddard Institute and Columbia University's Earth Institute, found soot's effect on snow albedo (solar energy reflected back to space), which has been neglected in previous studies, may be contributing to trends toward early springs in the Northern Hemisphere, thinning Arctic sea ice, melting glaciers and permafrost. Soot also is believed to play a role in changes in the atmosphere above the oceans and land.
"Black carbon reduces the amount of energy reflected by snow back into space, thus heating the snow surface more than if there were no black carbon," Hansen said.
Soot's increased absorption of solar energy is especially effective in warming the world's climate. "This forcing is unusually effective, causing twice as much global warming as a carbon-dioxide forcing of the same magnitude," Hansen noted.
Hansen cautioned that although the role of soot in altering global climate is substantial, it does not alter the fact that greenhouse gases are the primary cause of climate warming during the past century. Such gases are expected to be the largest climate forcing for the rest of this century.
The researchers found that observed warming in the Northern Hemisphere was large in the winter and spring at middle and high latitudes. These observations were consistent with the researchers' climate model simulations, which showed some of the largest warming effects occurred when there was heavy snow cover and sufficient sunlight.
Hansen and Nazarenko used a leading worldwide-climate computer model to simulate effects of greenhouse gases and other factors on world climate. The model incorporated data from NASA spacecraft that monitor the Earth's surface, vegetation, oceans and atmospheric qualities. The calculated global warming from soot in snow and ice, by itself in an 1880-2000 simulation, accounted for 25 percent of observed global warming. NASA's Terra and Aqua satellites are observing snow cover and reflectivity at multiple wavelengths, which allows quantitative monitoring of changing snow cover and effects of soot on snow.
While this may not be immediately obvious this report seems like good news. Why? Because it is a lot cheaper to reduce soot emissions than to reduce carbon dioxide emissions. If a substantial source of warming can be cancelled out cheaply then that buys time (assuming it really is necessary to intervene in the first place) to develop technologies that will allow carbon dioxide emissions to be reduced at much lower cost.
Soot is not the only warming pollutant which would be cheaper to reduce than carbon dioxide. Methane also has warming effects and reduction of methane emissions could be done cheaply. A reduction in methane emissions would have the added benefit of reducing ozone at ground level.
Reduction in soot emissions would also yield substantial health benefits. In a recent paper published in the medical journal Circulation, C. Arden Pope of Brigham Young University and colleagues found that particulate pollutants increase the incidence of cardiopulmonary disease and ischemic heart attacks.
Tiny particles of pollutants emitted by automobiles, power plants and factories significantly increase the risk of dying from cardiovascular disease in the United States, according to a study led by Brigham Young University epidemiologist Arden Pope.
The research was published in "Circulation: Journal of the American Heart Association" on Dec. 15. Statistical links between air pollution and increased mortality were reported by Pope and others in the mid-1990s. In March of 2002 he and colleagues reported associations between air pollution and lung cancer, as well as the broad category of cardiopulmonary disease, which includes both heart and lung ailments.
The new study narrows the latter finding by identifying a strong link between particulate air pollution and ischemic heart disease (the type that causes heart attacks), and also a link between pollution and the combined category of irregular heart rhythms, heart failure and cardiac arrest. It also suggests general biological pathways through which pollution might cause these diseases that lead to death – increased inflammation and nervous system aberrations that change heart rhythm.
"Not only do we show a statistical link between particulate air pollution and these types of heart disease," Pope said, "but we see specific patterns that are consistent with mechanistic pathways that may help explain how air pollution causes those diseases. The study discusses recent advancements in cardiovascular medicine that have explored the role of inflammation in the development and progression of atherosclerosis. The study results are consistent with recent findings that air pollution may provoke low-grade pulmonary inflammation, accelerate the progression of atherosclerosis, and alter cardiac function. These results add biological plausibility that air pollution really is a risk factor for heart disease."
The EPA has declared that the annual average level of PM2.5 particles in the air should not exceed 15 micrograms per cubic meter. Pope's study showed that each 10 micrograms-per-cubic-meter increase in fine particulate air pollution is accompanied by an 18-percent increase in risk of death from ischemic heart disease and a 13-percent increase in risk of death from altered heart rhythm, heart failure or cardiac arrest.
Further analysis also showed higher risks associated with air pollution for former and current smokers – 26 percent and 94 percent, respectively. Pope notes that "smoking is clearly a much larger risk factor, but air pollution increases the risk of cardiovascular death in non-smokers and seems to add additional risk to smokers."
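Assuming the usual proportional-hazards reading of such figures, in which relative risk multiplies with each increment of exposure, the per-10-microgram numbers can be scaled to other changes in PM2.5. A minimal sketch:

```python
def relative_risk(delta_pm25, rr_per_10=1.18):
    """Relative risk of death from ischemic heart disease for a change in
    annual-mean PM2.5 (ug/m^3), given an 18% risk increase per 10 ug/m^3
    and assuming the multiplicative scaling typical of such models."""
    return rr_per_10 ** (delta_pm25 / 10.0)

# A 7 ug/m^3 decline in annual-mean PM2.5 would correspond under this
# model to roughly an 11% drop in ischemic heart disease death risk.
print(round(relative_risk(-7.0), 3))
```

The function and its multiplicative scaling are my gloss on the reported figures, not a calculation taken from the paper itself.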
Pope emphasized that the study's findings are "good news." Since the early 1980s, the annual U.S. average of PM2.5 has dropped from 21 to 14 micrograms per cubic meter.
If we consider both the uncertainties in the current climate models (witness the sudden discovery of the greater importance of soot outlined above) and the advantages both in costs and in health benefits for the reduction in warming pollutants other than carbon dioxide it seems foolish to rush into a major reduction in carbon dioxide emissions. There are easier pickings that provide more benefits. A reduction in particulates would reduce heart disease and cancer and make us healthier in other ways. A reduction in methane emissions would reduce ozone pollution and therefore also reduce harm to the lungs and other organs.
Now, almost 12 years later, Dr. Hansen says that too much emphasis has been placed on the effects of fossil fuel combustion. Instead, Hansen says that warming over the past century was mostly driven by gases such as methane and chlorofluorocarbons.
In a report in Proceedings of the National Academy of Sciences, he says, "We suggest that a strategy to slow global warming focus on reducing air pollution, especially tropospheric (ground level) ozone, methane and black carbon particles." The report notes that the growth rate of non-carbon greenhouse gases has "declined in the past decade."
Dr. Hansen states that global warming can be prevented "without any economically wrenching actions." He says that "human health and ecological costs of these pollutants are counted in billions of dollars in the United States and impacts are reaching devastating levels in the developing world. A strategy focused on reducing these pollutants, which are not essential to energy production, should unite the interests of developed and developing nations."
Update: See a previous work by Hansen and colleagues from May 2003. Note that soot emissions are high in areas that are economically less developed.
Black carbon or soot is generated from traffic, industrial pollution, outdoor fires and household burning of coal and biomass fuels. Soot is a product of incomplete combustion, especially of diesel fuels, biofuels, coal and outdoor biomass burning. Emissions are large in areas where cooking and heating are done with wood, field residue, cow dung and coal, at a low temperature that does not allow for complete combustion. The resulting soot particles absorb sunlight, just as dark pavement becomes hotter than light pavement.
Update II: On a related note Richard Muller discusses the limitations of current methods to estimate historical temperatures.
The disagreement is not political; most of it arises from valid issues involving physics and mathematics. First the physics. An accurate thermometer wasn’t invented until 1724 (by Fahrenheit), and good worldwide records didn’t exist prior to the 1900s. For earlier eras, we depend on indirect estimates called proxies. These include the widths of tree rings, the ratio of oxygen isotopes in glacial ice, variations in species of microscopic animals trapped in sediment (different kinds thrive at different temperatures), and even historical records of harbor closures from ice. Of course, these proxies also respond to other elements of weather, such as rainfall, cloud cover, and storm patterns. Moreover, most proxies are sensitive to local conditions, and extrapolating to global climate can be hazardous. Choose the wrong proxies and you’ll get the wrong answer.
When you hear some claim that a particular year is the hottest year on record for x many hundreds or thousands of years take it with a grain of salt. As Muller's article shows, there are unresolved questions about how to accurately estimate historical temperatures in previous centuries on planet Earth.
Boeing's Phantom Works is working on the problem of how to make aircraft that unskilled regular folks can drive.
NS: What's the big idea you're working on for the future?
PD: Boeing's philosophy in terms of commercial travel is focused on point-to-point travel. At Phantom Works we try to think further out, to the extreme version of point-to-point, which would be personal transportation vehicles where you can have this thing take off and land from your driveway. One thing we think very critical to that concept is the air traffic control (ATC).
NS: ATC in that environment sounds an unfeasible nightmare - but you think it might actually be possible?
PD: Yes. We think it could certainly be possible. What we are beginning to explore is what technologies you would want to deploy both in terms of the ATC and the flight controls on such a vehicle. Also, how they would inter-operate with one another so that we can have a safe and efficient air transportation system on a personal level.
If the past century was about winning military superiority and exploring the frontiers of flight, then the coming 100 years could be more about making flight more accessible to all.
Andrew Hahn thinks about that constantly. As a member of the Personal Air Vehicle project at NASA's Langley Research Center in Hampton, Va., his job, essentially, is to invent the Jetsons' car. Like many of his colleagues, Mr. Hahn's aim is to take the Technicolor dreams of past futurists and reshape them into something that could actually make its way into garages by 2103.
The Personal Air Vehicle concept makes perfect sense as the future of aviation for a very simple reason: direct point-to-point flights from local and much smaller airports would be much faster than trips to bigger airports followed by hops through hub sites and trips down long hallways to go to luggage unloading areas. Computers are going to get fast enough and sophisticated enough to take over much of the work currently done by pilots. Materials advances and fabrication technologies advances ought to eventually allow the construction of small fast aircraft at much lower cost.
What doomed the train as a means of passenger travel? One factor was the rise of much faster aircraft. But much train travel was done over shorter distances where aircraft didn't offer much advantage. Most train traffic was lost to cars rather than to aircraft. Why did cars displace trains? Because cars allowed direct point-to-point travel and therefore saved time. The same pattern is going to play out in the air as smaller highly automated aircraft that will allow more direct and faster trips start to displace larger aircraft.
Also see another New Scientist article on the future of air flight that covers the Defense Advanced Research Project Agency's pursuit of the use of advanced materials to create aircraft that can morph into different shapes for different missions and portions of missions.
NS: So who is doing what in the DARPA programme?
TW: NextGen is looking at a sliding skins idea, Raytheon at telescoping wings and Lockheed Martin at rotating and folding wings.
Telescoping wings would also help in allowing personal aircraft to fit into a car garage.
Small aircraft that could both fly and drive, carrying two to four passengers, are not a century away, but rather two or three decades, said Dennis Bushnell, the chief engineer at NASA's Langley Research Center in Virginia. These "personal air vehicles" could go 600 to 900 miles after a vertical takeoff that could transform the landscape much the way cars did over the last century.
"You won't need airports," he said. "Everyone can use them -- the aged, the infirmed, the young, the inebriated. You don't have the restriction that you need a pilot. It will be automated."
Horses are in some sense analogous to smart airplanes because a horse could take its master home even if the master was too drunk to drive.
VICHY, FRANCE, December 18, 2003-Consuming Concord grape juice significantly improved laboratory animals' short-term memory in a water maze test as well as their neuro-motor skills in certain of the coordination, balance and strength tests, according to preliminary research presented at the 1st International Conference on Polyphenols and Health recently held in Vichy, France.
"In the study we subjected 45 senescent rats-meaning they were mature animals approaching the end of their expected life spans-to a range of tests and challenges that are commonly accepted methods of measuring changes in short-term memory and neuro-motor skills," says James A. Joseph, Ph.D., Chief, Neurosciences Laboratory, USDA Human Nutrition Research Center on Aging and lead researcher in the study. "Concord grape juice appeared to reduce or reverse the loss of sensitivity of muscarinic receptors, thus enhancing cognitive and some motor skills in the test animals. In many of the tests we saw significant improvements or trends toward improvement."
The memory test was the Morris water maze, an age-sensitive challenge that requires animals to use spatial learning to find a platform submerged 2 cm below the surface of a pool of water. Rats fed a 10% solution of Concord grape juice found the platform in roughly 20% less time than the control group. Other tests measured the animals' ability to balance on a horizontal stationary rod; a rotating, slowly accelerating rod; and various sized planks, and their ability to hold onto a suspended wire and an inclined wire screen. Some of those tests saw improvements in either or both of the group consuming a 10% solution of Concord grape juice and the group consuming a 50% solution.
"The Concord grape juice findings are not surprising," explains Joseph. "We have seen similar effects in the work we've done in blueberries."
The researchers point to several factors as potential mechanisms of action, including increased dopamine production and a potent overall antioxidant effect. According to previously published USDA studies, Concord grape juice has the highest total antioxidants of any fruits, vegetables or juices tested.
Regarding the reference to previous USDA studies: note that spinach is especially efficacious for improving memory in aged rats, and that blueberries and blackberries have more antioxidant activity than red grapes. While raisins score high, that is due to dehydration; their antioxidant-to-calorie ratio is probably no better than that of undehydrated grapes.
On a related topic, if you have forgotten my previous post, see: Choline May Restore Middle Aged Memory Formation.
Scientists at Johns Hopkins have discovered the first direct evidence in mammals that a chemical intermediate in the production of fatty acids is a key regulator of appetite, according to a report in a recent issue of the Proceedings of the National Academy of Sciences.
Scientists have long known that hunger causes increases in some brain chemicals while lowering others. However, the root cause of hunger's effects -- the initial chemical trigger of appetite -- has been elusive.
In experiments with mice, the Johns Hopkins researchers showed that appetite is immediately and directly tied to amounts of a chemical called malonyl-CoA. In hungry mice, malonyl-CoA was almost undetectable in the brain. Once fasting mice were given food, however, amounts of the chemical increased to high levels within two hours. Furthermore, chemically reducing appetite by injecting a compound called C75 into the brain brought levels of malonyl-CoA up to those of mice given food, helping to explain C75's effects.
"From this work, it appears as though malonyl-CoA levels control appetite and levels of other brain chemicals that we know go up and down with hunger and feeding," says Dan Lane, Ph.D., professor of biological chemistry in Hopkins' Institute for Basic Biomedical Sciences. "There may be other contributors, but this is the first direct evidence that malonyl-CoA could be the body's primary appetite controller."
In previous work, Lane and his colleagues had shown that giving mice C75, which blocks conversion of malonyl-CoA into fatty acids, dramatically reduced animals' appetites. Subsequently, they found that C75 triggers levels of several known appetite signals (NPY, AgRP, POMC and others) to register "full" even when animals should have been hungry.
However, the new experiments, during which C75 was injected directly into the animals' brains, suggest that increasing levels of malonyl-CoA, caused by "blocking the dam" with C75, is the first step in the process that alters levels of those appetite signals.
"Fully understanding how appetite is regulated by the brain should reveal ways to control appetite," says Lane, who was studying how fat cells develop when he and colleagues discovered the appetite-suppressing effects of C75 a few years ago. "Because C75 was injected into the brain, rather than into the abdomen as in earlier experiments, we also now know that the compound's effects on appetite stem primarily from its effects on chemicals in the brain, not from effects it might have elsewhere in the body."
The scientists also discovered that preventing formation of malonyl-CoA by injecting a different substance (TOFA) into the brain partially reversed the appetite-suppressing effect of C75. Lane suggests that a better blocker of malonyl-CoA formation should more completely counteract C75's effects.
My guess is that C75 binds and blocks the activity of an enzyme that converts malonyl-CoA into something else. By preventing malonyl-CoA from being further metabolized, C75 causes the level of malonyl-CoA in brain cells to rise, and that rise probably causes malonyl-CoA to bind in places that trigger signals telling the brain it is sated. If hunger can be increased by blocking malonyl-CoA formation, and decreased by slowing malonyl-CoA breakdown or usage, then it might turn out to be fairly easy to control appetite and weight.
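To make the logic of that guess concrete, here is a toy steady-state sketch (my own illustration, not anything from the Hopkins work): if the malonyl-CoA level is roughly its synthesis rate divided by its consumption rate, then blocking consumption (as C75 does, by inhibiting conversion into fatty acids) raises the level, while blocking synthesis (as TOFA does) lowers it.

```python
# Toy steady-state sketch (my own illustration, not from the paper) of
# the malonyl-CoA "dam" logic: the level rises when its consumption is
# blocked (C75) and falls when its synthesis is blocked (TOFA).
def malonyl_coa_level(synthesis: float, consumption: float) -> float:
    """Relative steady-state level: production rate over removal rate."""
    return synthesis / consumption

baseline = malonyl_coa_level(1.0, 1.0)
with_c75 = malonyl_coa_level(1.0, 0.2)   # C75 blocks conversion to fatty acids
with_tofa = malonyl_coa_level(0.2, 1.0)  # TOFA blocks malonyl-CoA formation
assert with_c75 > baseline > with_tofa   # more malonyl-CoA -> less appetite
```

The rates here are arbitrary illustrative numbers; the point is only the direction of each change, which matches the reported results.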
A safe and effective pair of drugs for increasing and decreasing appetite would give humans easy conscious control of body weight. The means to consciously control appetite is one element of a larger toolset humans need in order to adapt ourselves to the lifestyles most common in industrialized societies, lifestyles to which our evolutionary past leaves us poorly suited. We also still need to develop the means to adapt ourselves to lower levels of exercise, less need for fear and anger, and other changes we have made in our environments.
Atsumu Ohmura has discovered that the lights are going out all over the world.
"It's an uncomfortable one," says Gerald Stanhill, who published many of these early papers and coined the phrase global dimming. "The first reaction has always been that the effect is much too big, I don't believe it and if it's true then why has nobody reported it before."
That began to change in 2001, when Stanhill and his colleague Shabtai Cohen at the Volcani Centre in Bet Dagan, Israel, collected all the available evidence and showed that, on average, the amount of solar radiation reaching the Earth's surface had declined by between 0.23 and 0.32% each year from 1958 to 1992.
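For a sense of scale (my arithmetic, not the authors'), compounding an annual decline of 0.23 to 0.32 percent over the 34 years from 1958 to 1992 amounts to a total loss of roughly 8 to 10 percent of surface solar radiation:

```python
# Back-of-the-envelope: cumulative solar dimming, 1958-1992, assuming
# the reported 0.23-0.32% annual decline compounds year over year.
def cumulative_decline(annual_rate: float, years: int) -> float:
    """Fraction of surface solar radiation lost after compounding."""
    return 1.0 - (1.0 - annual_rate) ** years

years = 1992 - 1958  # 34 years
low = cumulative_decline(0.0023, years)
high = cumulative_decline(0.0032, years)
print(f"Total dimming: {low:.1%} to {high:.1%}")  # roughly 7.5% to 10.3%
```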
The few experts who have studied the effect believe it's down to air pollution. Tiny particles of soot or chemical compounds like sulphates reflect sunlight and they also promote the formation of bigger, longer lasting clouds. "The cloudy times are getting darker," says Cohen, at the Volcani Centre. "If it's cloudy then it's darker, but when it's sunny things haven't changed much."
The explanation most popular among atmospheric scientists is that soot and other pollutants are blocking visible and infrared light from reaching the surface of the planet. But another possibility is that global warming is causing an increase in cloud cover by increasing the amount of airborne water or dust. If pollution is the cause then efforts to reduce particulate pollution may be starting to cause a reduction in the dimming effect. But it could be that the US and Europe will reduce their particulate pollution while South and East Asia increase theirs.
The reduction in light reaching the surface is probably reducing plant growth in areas closer to the poles. At the equator, carbon dioxide (CO2) and water are more likely to be the rate-limiting factors for plant growth. Note, though, that rising CO2 levels tend to let plants grow faster, both by increasing the amount of CO2 available and by reducing the amount of water plants must lose when they absorb CO2; higher CO2 therefore reduces plants' need for water.
Pollution regulations effectively could be used as a climate engineering tool. Mandate a more rapid reduction in particulate pollution and the effect will probably be to increase the amount of sunlight that reaches the Earth's surface. That could increase plant growth in regions closer to the poles while also probably increasing surface evaporation from the oceans and hence probably lead to an increase in precipitation. That would be beneficial in some areas but detrimental in other areas.
The scale of human activity has grown so large that we inevitably change the climate to some extent. We do not yet know just how much, because we do not know what the climate would be like in our absence. Since the human population is growing and parts of the world are rapidly industrializing, human influence on the climate looks set to grow even further. But because so many human activities have climate effects, and because some of those effects cancel each other out (at least to some extent), any effort to reduce a single pollutant, or the impact of a single way of modifying our environment, will strengthen the impact of the other things we do.
Real life is increasingly beginning to resemble the science fiction fantasy world of David Brin's novel Earth, in which old folks video recorded everything happening around them in order to protect themselves from criminals. But in the real-world equivalent, phone cameras are being used to surreptitiously take nude photographs and otherwise satisfy the desires of voyeurs.
The phones, with their discreet lens, tiny size and ability to immediately transmit images onto the Internet or other cell phones, are a voyeur's dream.
The phones first appeared on the market in early 2001, and for the last several months, media reports out of Asia have called attention to incidents such as nude photographs of unsuspecting victims turning up on the Internet.
As governments rush to pass laws restricting the use of cell phone cameras, prosecutions under the new laws are beginning to take place. Jack Le Vu, 20, of Sammamish, Washington, has been charged with pursuing his panty fetish by taking pictures up a woman's skirt while crouched down at a supermarket shelf.
A witness told investigators Mr. Vu pretended to scan the shelves July 10 as he followed a 26-year-old woman in a supermarket, crouched down with his cellphone extended beneath her skirt and then stood, punched a few buttons on the phone and looked at the screen.
Charged with voyeurism, a felony under state law, Vu pleaded not guilty Monday in what officials believe is the first case of its kind in King County.
Last spring, Hawaii passed legislation outlawing "upskirt" snapshots and video, but a First Amendment expert says such laws may be unconstitutional, according to the newspaper article.
That position has been supported by the Washington state Supreme Court, which last year overturned the convictions of two men who, in separate incidents, took "upskirt" photos with plans to sell them on the Internet.
Any legal experts reading this who care to comment?
Long a staple overseas, "cam phones" arrived here in 2002, promising sleek and cheap--under $100--fun with a voyeuristic twist. And they're taking off: 7 million of 72 million cell phones shipped in the U.S. have cameras; by 2007, 51 million out of over 110 million will have them, predicts research firm IDC.
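A quick check of the IDC shipment figures quoted above (keeping in mind that the 2007 total is given as "over 110 million", so the second number is a floor):

```python
# Quick arithmetic on the IDC camera-phone shipment figures cited above.
# The 2007 total is quoted as "over 110 million", so treat it as a floor.
def penetration(with_camera: float, total: float) -> float:
    """Fraction of shipped cell phones that include a camera."""
    return with_camera / total

now = penetration(7e6, 72e6)        # current U.S. shipments
by_2007 = penetration(51e6, 110e6)  # IDC forecast for 2007
print(f"{now:.0%} of shipments now, {by_2007:.0%} by 2007")
# roughly 10% now versus 46% by 2007
```

In other words, camera penetration would jump from about one phone in ten to nearly one in two in four years.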
"The evolution, the penetration, the spread of digital capture capabilities in phones is going to be so fast, so wide that it might be a losing battle ultimately," said analyst Alex Slawsby of IDC, a leading technology industry analysis firm.
Count me in the ranks of those who think privacy will erode regardless of what governments do about it.
Obviously digital cameras already allow pictures to be taken fairly easily for later download into a computer and posting to the internet. So what do mobile camera phones bring to the table? First of all, they offer greater ease of concealment. Most cameras are bulkier and easier to spot in use. Also, phones offer the ability to immediately send a picture. The result is that more people will use them to take more pictures to send to other people or to post on the internet.
And textamerica figures to cash in on this latest hotbed of digital technology.
The Rancho Santa Fe startup offers free moblog hosting to users around the world, and last month initiated a moblog where San Diegans could post photos of the wildfires, often taken before any firefighters or news media were on scene. (The textamerica service is free, but the user is charged by the carrier for sending the image.)
A 15-year-old boy foiled an apparent abduction attempt when he pulled out his cell phone camera and snapped photos of a man trying to lure him into a car, police said.
The teen also photographed the vehicle's license plate and gave the evidence to police, who arrested a suspect the next day.
An increasing portion of all the places we go to will have video devices recording whatever transpires. People will install them for security in their homes just as businesses and governments install them in offices, stores, busses, taxis, and other locations. Cell phone cameras are part of a much larger trend.
Many local governments in the United States are moving to restrict the use of cell phone cameras even as the quality of the camera pictures steadily improves.
Trying to distinguish between a camera phone and any other cellphone has also complicated matters. The Elk Grove Park District in suburban Chicago enacted a ban in November that covered the possession of any cellphone - not just camera phones - in park-owned restrooms, locker rooms and showers.
"There is no reason to have a cellphone while you're changing and showering," said Ron Nunes, one of the park district's commissioners. "I'd rather protect the children and the public more than someone who wants to call home and see what's for dinner." Fresh in the town's memory was a 2001 incident in which a man used a fiber-optic camera to secretly take pictures of children in a park shower.
Alex Slawsby, an analyst with IDC, said that by next year the typical camera phone sold in the United States would have a resolution of at least one megapixel, about three times the current average - doing wonders, no doubt, for the rendering of sloppy restaurant patrons.
More likely to gain prevalence are camera phones that make some kind of noise to alert bystanders of the possibility that their photo is being taken. In November, the South Korean government ordered manufacturers to install beeping sounds of at least 65 decibels on camera phones made and sold there, after officials received a flood of complaints about camera phone-wielding peeping toms.
In the future digital cameras will become smaller, cheaper, easier to conceal, and higher in resolution, will gain storage capacity, and will be integrated with electronics so that smart software can trigger recording when something of interest is seen. Wireless network bandwidth will increase by leaps and bounds. Technological gimmicks like the South Korean government's beeping cell phone requirement will at best slow the rate at which surreptitious picture taking spreads.
There is a new fad in web logging called the mobile weblog or moblog. A moblog is a web log which displays pictures taken with cell phone cameras. See, for example, the Gary Dann photojournal as well as Neutral Zone, Furry Felines!, Wallace, the pug, and countless others. I think "moblog" is a poorly formulated term. It sounds too much like "mob log", which might have something to do with the use of electronic communications to organize spontaneous mobs (which itself could easily spawn a type of photo web log to record the strange things mobs might be organized to do).
In a way what is happening is that the invasion of celebrity privacy by paparazzi photographers and video camera operators is being extended to include the invasion of privacy of non-celebrities as well. People who used to expect that their relative anonymity would allow them to conduct their daily activities free from surveillance and recording by others are at greater risk of being photographed. But there is a big rate-limiting factor in all this: there are not enough people to view all the pictures. Besides, most of the pictures are pretty boring anyhow.
A tiny nanowire sensor — smaller than the width of a human hair, 1,000 times more sensitive than conventional DNA tests, and capable of producing results in minutes rather than days or weeks — could pave the way for faster, more accurate medical diagnostic tests for countless conditions and may ultimately save lives by allowing earlier disease detection and intervention, Harvard scientists say.
In preliminary laboratory studies demonstrating the capability of the new sensor, the researchers showed that it has the potential to detect the gene for cystic fibrosis more efficiently than conventional tests for the disease. CF is the most common fatal genetic disease among people of European origin.
One of a growing number of promising diagnostic tools that are based on nanotechnology, the silicon sensor represents the first example of direct electrical detection of DNA using nanotechnology, according to the researchers. The sensor and the detection of the CF gene will be described in the Jan. 14 issue of the journal Nano Letters, a peer-reviewed publication of the American Chemical Society, the world's largest scientific society.
"This tiny sensor could represent a new future for medical diagnostics," says study leader Charles M. Lieber, Ph.D., a professor of chemistry at Harvard and one of the leading researchers in nanotechnology.
"What one could imagine," says Lieber, "is to go into your doctor's office, give a drop of blood from a pin prick on your finger, and within minutes, find out whether you have a particular virus, a genetic disease, or your risk for different diseases or drug interactions."
With its high sensitivity, the sensor could detect diseases never before possible with conventional tests, he says. And if all goes well in future studies, Lieber predicts that an array of sensors can ultimately be configured to a handheld PDA-type device or small computer, allowing almost instant test results during a doctor's visit or possibly even at home by a patient. It could potentially be used to screen for disease markers in any bodily fluid, including tears, urine and saliva, he says.
The sensor also shows promise for early detection of bioterrorism threats such as viruses, the researcher says.
A company called Nanosys is commercializing this nanotech sensor technology. Nanosys is pursuing a number of other applications of nanotech sensors.
Ultimately, the goal at Nanosys is to revolutionize sensors, nanoelectronics and optoelectronics by building products literally from the bottom up through molecular self-assembly that is cheaper, better and faster; uses less power; and basically delivers a lot more bang for the buck than today's most advanced devices. The technology that makes this possible is based on groundbreaking work in nanoelectronics by Dr. Charles Lieber, the Mark Hyman Professor of Chemistry at Harvard University.
Exactly how this process works is a closely guarded secret, but Lieber and his team have basically developed a way to make nanowires any way they want. They can control the size and shape of the wires, as well as the amount of impurities, or dopants, attached to the wires, thereby controlling the wires' conductive and photo-reactive characteristics, which, at the end of the day, dictate their usefulness.
"That was the real 'ah-ha' Charles Lieber came up with," Bock said. "That means that he can make devices much more quickly because he doesn't have to go look for the materials he wants each time, he just makes them."
Toto, pretty soon we are not going to be in Kansas any more.
The modified nanotubes have so far only been used to ferry a small peptide into the nuclei of fibroblast cells. But the researchers are hopeful that the technique may one day form the basis for new anti-cancer treatments, gene therapies and vaccines.
Cancer is a particularly difficult disease to treat because cancer cells are host cells. It is hard to develop methods that selectively kill some host cells without killing too many normal cells. Reports of novel approaches for doing so are therefore always worth noting. The first here is the use of stem cells that travel to where the cancer is located to deliver a treatment payload.
SAN DIEGO -- Genetically engineered stem cells can find tumors and then produce biological killing agents right at the cancer site, say researchers at The University of Texas M. D. Anderson Cancer Center, who have performed a number of successful "proof of concept" experiments in mice.
Their novel treatment, presented at the annual meeting of the American Society of Hematology (ASH), may offer the first gene therapy "delivery system" capable of homing in on and then attacking cancer that has metastasized -- wherever it is in a patient's body. And the stem cells will not be rejected, even if they are not derived from the patient.
The researchers have tested the system in mice with a variety of human cancers, including solid ones such as ovarian, brain, breast cancer, melanoma and even such blood-based cancer as leukemia. "This drug delivery system is attracted to cancer cells no matter what form they are in or where they are," says Michael Andreeff, M.D., Ph.D., professor in the Departments of Blood and Marrow Transplantation and Leukemia. "We believe this to be a major find."
M. D. Anderson has filed patent applications on the system, which uses human mesenchymal progenitor cells (MSC), the body's natural tissue regenerators. These unspecialized cells can migrate to an injury by responding to signals from the area. There they develop the kind of connective tissue that is needed to repair the wound, and can become any kind of tissue required.
Tumors are "never-healing wounds" which use mesenchymal stem cells to help build up the normal tissue that is needed to support the cancer, says Andreeff. "There is constant remodeling of tissue in tumors," he says. So researchers turned the tables on the cancer, taking advantage of a tumor's ability to attract the stem cells.
In their novel delivery system, researchers isolate a small quantity of MSC from bone marrow, and greatly expand the quantity of those cells in the lab. They then use a virus to deliver a particular gene into the stem cells. When turned on, this gene will produce an anti-cancer effect. When given back to the patient through an intravenous injection, the millions of engineered mesenchymal progenitor cells will engraft where the tumor environment is signaling them, and will activate the therapeutic gene.
In the study reported at ASH, the researchers examined whether MSC producing human interferon-beta can inhibit the growth of metastatic tumors in the lungs of mice that do not have a functioning immune system. They used an adenovirus vector to deliver the gene that expresses interferon-beta, which can prevent cell reproduction. Andreeff and his team found that when mice were treated with just four weekly injections, their lifespan doubled, on average. They also discovered that when treated cells were placed under the skin of the mice, there was no effect. "The cells need to be in the immediate environment of the tumor to work," which suggests that normal tissue will not be adversely affected, says Andreeff.
Other studies being reported by Andreeff that used different therapeutic "payloads" found a doubling of survival in mice with one kind of ovarian cancer and a cure rate of 70 percent in mice with a different kind of ovarian tumor. Another study demonstrated that when the gene therapy was injected into the carotid (neck) artery of mice with human brain cancer, the genes incorporated themselves into the cancer, not into normal brain tissue.
This is a very cool result. One question is whether the stem cells will stick around to cause problems by expressing their payload genes. But if the stem cells come from another host then the body's own immune system will probably eventually wipe out the foreign stem cells after they have done their job.
Another treatment approach exploits the fact that many cells in a tumor tend not to be well fed. A rapidly growing tumor usually has areas where cells are far from capillaries and are therefore oxygen-starved. Those cells actually have a survival advantage against many chemotherapeutic agents, either because the agents can't be given in dosages high enough to seep into those cells, or because the chemo works better on metabolically active cells (for example, by reacting with oxygen in order to activate). KuDOS Pharmaceuticals and Novacea are working with a compound that is activated by a metabolic pathway typically active only in cells that lack oxygen.
South San Francisco, Calif. and CAMBRIDGE, United Kingdom, Dec. 11, 2003 -- Novacea Inc. and KuDOS Pharmaceuticals announced today that Novacea has licensed from KuDOS the North American rights to develop and commercialize AQ4N, a novel proprietary hypoxic cell-activated agent with broad potential in a variety of cancers.
As a first-in-class hypoxic cell-activated anti-tumor therapy, AQ4N represents a new approach to cancer treatment. The drug is considered inactive when administered and is selectively converted into its active cytotoxic form, known as AQ4, once it reaches hypoxic tumor cells (cells that are oxygen starved), reducing potential systemic toxicity. AQ4 is a potent topoisomerase II inhibitor and DNA intercalator.
More than two million patients each year are estimated to present with tumors in the U.S. and Europe. The large majority of these tumors have hypoxic components, which are relatively resistant to standard anti-cancer treatment, including radiotherapy and chemotherapy. As a result, a specific agent like AQ4N that can treat the hypoxic fractions should enhance the overall efficiency of cancer cell killing and reduce tumor recurrence.
Preclinical data demonstrate that AQ4N markedly enhances the effects of radiation and chemotherapy when administered in combination with either treatment. Data further suggest anti-tumor activity as a monotherapy. The agent is currently being evaluated in a Phase 1 clinical trial in combination with radiation in esophageal cancer. Sixteen patients have been treated to date and AQ4N has been well tolerated, with no serious drug-related adverse events reported.
AQ4N was originally discovered by Prof. Lawrence Patterson of the School of Pharmacy, at University of London, working in collaboration with BTG International plc (BTG). KuDOS acquired a worldwide license for AQ4N from BTG in March 2001.
While AQ4N will at best kill only a subset of cancer cells, that targeted subset too often escapes death from current chemotherapeutic agents. So in combination with existing chemotherapeutic agents it might turn out to be a useful treatment.
Update: Another unusual anti-cancer therapy under development by Dr. William Wold of the Saint Louis University School of Medicine and his colleagues genetically engineers adenoviruses that cause common colds to instead selectively kill cancer cells.
Dr. Wold, chair of the department of molecular microbiology and immunology, and his colleagues Karoly Toth, Konstantin Doronin, Ann E. Tollefson, and Mohan Kuppuswamy have found a way to convert the relatively benign "adenovirus" that causes the common cold into an anti-cancer drug that attacks and destroys cancerous cells.
"Human cancer is currently treated with surgery, radiation therapy, or chemotherapy, depending on the cancer type," Wold said. "These treatments can be highly successful, but new therapies are required, especially for tumors that have become resistant to chemo- or radiation-therapy."
Wold's group has developed several new "adenovirus cancer gene therapy vectors," changing these genes so the virus will attack cancer cells.
"Some of our vectors are designed to destroy many different types of cancers, others are designed to be specific to colon or lung cancer. In preclinical testing these vectors were highly effective against cancerous tumors and did not harm normal tissues."
Wold and his colleagues have done this by modifying one gene so that the virus can grow in cancer cells but NOT normal cells and by boosting the activity of another gene that the virus normally uses to disrupt the cells it has infected. "When the virus infects cells, it takes the altered genes with it, and those genes attack cancer cells while leaving normal cells intact," Wold explained.
A U.S. patent (No. 6,627,190) was awarded this fall to Dr. Wold and his team of researchers. Pre-clinical testing is complete, and the work is expected to move soon into clinical trials.
Now the patent has been issued and the technology exclusively licensed to a company, Introgen Therapeutics, which made the announcement this morning. Introgen and VirRx, a biotechnology company founded by Wold with a primary interest in cancer gene therapy, are collaborating on new therapies for cancer and other diseases.
AUSTIN, Texas, Dec. 16 /PRNewswire-FirstCall/ -- A patent that covers an important class of replicating adenoviruses relating to Introgen Therapeutics' anti-cancer product candidate INGN 007 (VRX 007) has been issued and exclusively sub-licensed to Introgen, the company announced today. The United States patent, U.S. 6,627,190, emanates from research performed at VirRx, Inc. and Saint Louis University under the direction of Dr. William S.M. Wold, one of the world's leaders in replicating oncolytic virus technology. Introgen and VirRx are collaborating on new therapies for cancer and other diseases. VirRx, LLC was founded by Dr. Wold.
INGN 007 is an oncolytic virus product that over-expresses the ADP gene, whose protein product is responsible for the rapid disruption (oncolysis) of tumor cells -- an important therapeutic activity of oncolytic viruses. Oncolytic viruses are viruses that kill cancer cells by replicating at high levels and causing the cancer cell to break apart. In animal models, INGN 007 has demonstrated that it saturates the entire treated tumor and has shown it can eradicate cancer. Introgen and VirRx initiated their collaboration in order to develop a series of potential products emanating from VirRx and the Wold laboratory. Preclinical testing of INGN 007 is now being completed and the product is being readied for clinical development.
It isn't clear why this cell-death effect is specific to cancer cells. A couple of Journal of Virology abstracts from Wold and his colleagues here and here refer to transforming growth factor β1 (TGF-β1), but are they trying to turn it off or on in cancer cells, and why would the mechanism not also kill normal cells? Again, it is not clear. Anyone have any insights on this?
Astrobiologists disagree about whether advanced life is common or rare in our universe. But new research suggests that one thing is pretty certain – if an Earthlike world with significant water is needed for advanced life to evolve, there could be many candidates.
In 44 computer simulations of planet formation near a sun, astronomers found that each simulation produced one to four Earthlike planets, including 11 so-called "habitable" planets about the same distance from their stars as Earth is from our sun.
"Our simulations show a tremendous variety of planets. You can have planets that are half the size of Earth and are very dry, like Mars, or you can have planets like Earth, or you can have planets three times bigger than Earth, with perhaps 10 times more water," said Sean Raymond, a University of Washington doctoral student in astronomy.
Raymond is the lead author of a paper detailing the simulation results that has been accepted for publication in Icarus, the journal of the American Astronomical Society's Division for Planetary Sciences. Co-authors are Thomas R. Quinn, a UW associate astronomy professor, and Jonathan Lunine, a professor of planetary science and physics at the University of Arizona.
The simulations show that the amount of water on terrestrial, or Earthlike, planets could be greatly influenced by outer gas giant planets like Jupiter.
"The more eccentric giant planet orbits result in drier terrestrial planets," Raymond said. "Conversely, more circular giant planet orbits mean wetter terrestrial planets."
In the case of our solar system, Jupiter's orbit is slightly elliptical, which could explain why Earth is 80 percent covered by oceans rather than being bone dry or completely covered in water miles deep.
The findings are significant because of the discovery in recent years of a large number of giant planets such as Jupiter and Saturn orbiting other suns. The presence, and orbits, of those planets can be inferred from their gravitational interaction with their parent stars and their effect on light from those stars as seen from Earth.
It currently is impossible to detect Earthlike planets around other stars. However, if results from the models are correct, there could be planets such as ours around a number of other suns relatively close to our solar system. A significant number of those planets are likely to be in the "habitable zone," the distance from a star at which the planet's temperature will maintain liquid water on the surface. Liquid water is thought to be a requirement for life, so planets in a star's habitable zone are ideal candidates for life. It is unclear, however, whether those planets could harbor more than simple microbial life.
Suppose there are a lot of planets which are similar to Earth in size and in the amount of radiation they receive from their own suns. Even if some of them do not contain sentient lifeforms they still may have native life forms and those life forms may be incompatible with human life. Imagine pathogens that human immune systems couldn't even recognize let alone effectively fight. Or all native plant matter might be poisonous not only to humans but to any plants humans would bring to grow on such a planet.
It is incredibly common in science fiction movies and television shows for humans to mate and reproduce with aliens and to find edible food on distant planets. But if there is life on other planets both of these possibilities are very unlikely. Other lifeforms will probably use different combinations of compounds for genetic encoding and for building tissues. Species on other planets may use amino acids to build proteins, but probably not the exact same set of amino acids humans use. Ditto for sugars and other biological compounds.
The real tragedy is that even if humans and sentient species from other planets could get along, and even if those species lived under similar levels of gravity and atmospheric pressure and also breathed oxygen, it would probably be necessary to avoid direct physical contact for fear that pathogens would jump from one species to another with deadly results.
Now researchers have found that neophobic rats die an average of three months younger than their outgoing brothers — equivalent to ten years shaved off a human life.
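One plausible way to get from three rat-months to ten human-years is simple lifespan scaling. The rat and human lifespans below are assumptions for illustration, not figures from the study:

```python
# Hypothetical scaling behind "three months off a rat's life is like ten
# years off a human's" -- assumes a ~2-year rat lifespan and ~80-year human one.
rat_lifespan_months = 24      # assumed typical lab-rat lifespan
months_lost = 3               # reported early death of neophobic rats
fraction_of_life_lost = months_lost / rat_lifespan_months  # 0.125

human_lifespan_years = 80     # assumed
human_equivalent_years = fraction_of_life_lost * human_lifespan_years
print(human_equivalent_years)  # -> 10.0
```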
"It shows we need to consider personality traits and behavioural styles when trying to understand physiological mechanisms of health," says Sonia Cavigelli at the University of Chicago, Illinois, who conducted the study with her colleague Martha McClintock.
But way back in our evolutionary past the kinds of personality traits that might make us more depressed or anxious or stressed were probably evolutionarily adaptive for some of our ancestors.
"It was the going up and not coming down as fast," McClintock says. She likens it to hearing a sound in the middle of the night. Some people will wait and listen for a few minutes then fall back asleep. Others will worry for hours, alert for another noise.
"What we showed is that they recovered from that response more slowly," she adds. "We correlated that [with] an early death. It really suggests that we need to look at the effect of having a slow recovery to a stressor to being neophobic, where it happens over and over again on a daily basis."
So do people who fear the new and unknown need to have their personalities modified in order to live longer? Not necessarily. It may some day be possible to selectively suppress the stress response in the body when the brain becomes fearful, blocking the physiological response without altering the emotional one. If the stress response could be suppressed then the more neophobic personalities could go through life feeling fear or aversion toward new experiences with little resulting damage to their bodies.
The mechanism by which the lives of the rats are shortened is probably this: fear causes the brain to act on the hypothalamus and other glands, triggering the release of stress hormones which in turn cause changes that age the body more rapidly. The changes which shorten life expectancy probably include heightened immune and inflammation responses. Evidence is accumulating for the role of chronic inflammation in the development of a large variety of diseases. Inflammation may also wear out adult stem cell reservoirs by causing more cell division than would otherwise occur. The stem cells then more quickly reach the limit on the number of times they can divide, and once adult stem cells are no longer available to repair injuries the total amount of damage accumulates more rapidly.
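The stem-cell-depletion argument can be put in toy numerical form. Every number below is an assumption chosen for illustration (the division cap and turnover rates are hypothetical), not data from any study:

```python
# Toy illustration of the argument above: if adult stem cells can divide only
# a fixed number of times (a Hayflick-style cap), inflammation-driven extra
# divisions exhaust the reservoir sooner.
division_limit = 50                  # assumed cap on divisions per stem cell
baseline_divisions_per_year = 1.0    # assumed normal turnover rate
inflamed_divisions_per_year = 1.5    # assumed 50% extra divisions under chronic inflammation

years_until_exhaustion_normal = division_limit / baseline_divisions_per_year
years_until_exhaustion_inflamed = division_limit / inflamed_divisions_per_year
print(years_until_exhaustion_normal)               # -> 50.0
print(round(years_until_exhaustion_inflamed, 1))   # -> 33.3
```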
Still, there is an argument to be made for personality engineering in order to reduce the incidence of neurological disorders. For instance, psychological distress appears to contribute to the development of Alzheimer's Disease.
People who are prone to psychological distress are more likely to develop Alzheimer's disease, research suggests.
Researchers at Rush University in Chicago found that people plagued by negative emotions like depression and anxiety were at double the risk of more laid-back individuals.
But is this higher risk the result of emotional states in the brain acting directly on the brain? Inflammation may play an important role in the development of Alzheimer's. Anti-inflammatory drugs may be able to reduce the level of a peptide that is involved in the formation of amyloid plaques that are characteristic of Alzheimer's.
In a new study, researchers suggest that some anti-inflammatories reduce the levels of a protein fragment (or peptide) associated with Alzheimer's by interfering with two proteins that help manufacture the fragment in the first place. The proteins are called Rho and Rock.
The researchers inhibited Rho and Rock in mutant mice and significantly reduced the levels of the most dangerous peptide found in humans, called amyloid-β-42. Amyloid peptides form the plaques that are a signature of Alzheimer's disease. The findings are reported in Science.
“The take-home message from this study is that if non-steroidal anti-inflammatory drugs are effective, they may be effective through this [Rho-Rock] mechanism,” says Steven M. Paul of Lilly Research Laboratories in Indianapolis, who led the study. “The mechanism may be unrelated to what these drugs do as anti-inflammatories.”
Now at this point you might be thinking of the immortal lyric "Don't worry, be happy". But life is never that simple. A little stress may be good for you.
EVANSTON, Ill. -- We've often heard that red wine and dark chocolate in moderation can be good for you. Now it appears that a little stress may be beneficial, too.
Northwestern University scientists have shown that elevated levels of special protective proteins that respond to stress in a cell (known as molecular chaperones) promote longevity. Acute stress triggers a cascading reaction inside cells that results in the repair or elimination of misfolded proteins, prolonging life by preventing or delaying cell damage.
The findings are published online today (Dec. 10) by Molecular Biology of the Cell, a publication of the American Society for Cell Biology. The article will appear in print in the journal's February 2004 issue.
"Sustained stress definitely is not good for you, but it appears that an occasional burst of stress or low levels of stress can be very protective," said Richard I. Morimoto, John Evans Professor of Biology, who co-authored the paper with lead author James F. Morley, a graduate student in Morimoto's lab. "Brief exposure to environmental and physiological stress has long-term benefits to the cell because it unleashes a great number of molecular chaperones that capture all kinds of damaged and misfolded proteins."
Update: UCLA researchers have found that shy people infected with HIV do more poorly on anti-retroviral drug treatment than more outgoing and extroverted people.
"We found a strong linear relationship between personality and HIV replication rate in the body," Cole said. "Shy people with high stress responses possessed higher viral loads."
The researchers were surprised to find that the antiretroviral drugs barely made a dent in the shy patients' disease. Instead of showing lower viral loads, the immune systems of introverted subjects replicated the virus 10 to 100 times as fast as in other patients.
"Shy patients on drug therapy didn't experience even a 10-fold drop in their viral load," said Naliboff, co-director of the UCLA Center for Neurovisceral Sciences and Women's Health. "Doctors classify that as a treatment failure. The drugs should shrink HIV replication by at least 100‑fold."
"Our findings suggest that high nervous system activity helps the virus continue replicating," Cole said. "Patients with high-stress personalities continued to lose T-cells — even on the best drug therapy available. Stress sabotages their battle against this lethal disease."
"It looks as though sensitive people are simply wired to respond to stress more strongly than resilient people," Naliboff said. "How someone reacts to stress seems to be more important than the stress itself in explaining why one person gets sick and one person doesn't."
"This heightened stress response is the equivalent of waves striking a stone on the beach," Cole said. "One wave won't do much damage. But the constant pounding of waves eventually grinds that stone to sand. That's how continual stress response wears down the immune system."
Previously the UCLA team found that under stress the body releases a chemical called norepinephrine that leaves T-cells open to infection and accelerates HIV replication. The researchers' next step will be to try to change shy persons' physiological response to stress using drugs that block norepinephrine's impact on T-cells.
"Our current study suggests that the body's production of norepinephrine during stress makes a big difference in people trying to fight off infection," Cole said.
Note the part about how these UCLA researchers are going to try to block the physiological response that follows from the cognitive response to stressful situations. The ability to block the brain's ability to send the body into a stress state would be useful for healthy people as well since it would slow and delay the development of many degenerative diseases associated with aging. The stress response is an evolutionary legacy that has become maladaptive in modern industrial society.
Anger is another stressor that has harmful effects on the body. Anger and the lack of friends are both risk factors for periodontal disease.
If you constantly exhibit anger and are a social hermit, such stressors might put your oral health at risk, according to a study in this month's Journal of the American Dental Association. Results from the study reveal that men who reported being angry on a daily basis had a 43 percent higher risk of developing periodontitis (gum disease) compared with men who reported seldom being angry, wrote the Harvard University researchers. In addition, men who reported having at least one close friend had a 30 percent lower risk of developing periodontitis compared with those who did not.
The study authors cited stress as being associated with poor oral hygiene, increased glucocorticoid secretion that can depress immune function and increased insulin resistance. All of these mechanisms, they wrote, can potentially increase the risk of developing periodontitis.
Stress, whether caused by fear of the unknown, loneliness, anger, or other factors, is bad for your health and for your life expectancy.
For the first time, an international research program involving the Department of the Interior's U.S. Geological Survey has proven that it is technically feasible to produce gas from gas hydrates. Gas hydrates are a naturally occurring "ice-like" combination of natural gas and water that have the potential to be a significant new source of energy from the world's oceans and polar regions.
Today at a symposium in Japan, the successful results of the first modern, fully integrated production testing of gas hydrates are being discussed by an international gathering of research scientists. The international consortium, including the USGS, the Department of Energy, Canada, Japan, India, Germany, and the energy industry conducted test drilling at a site known as Mallik, in the Mackenzie Delta of the Canadian Arctic. This location was chosen because it has one of the highest concentrations of known gas hydrates in the world.
The United States is committed to participating in international research programs such as this one to advance the understanding of natural gas hydrates and the development of these resources. Even though gas hydrates are known to occur in numerous marine and Arctic settings, little was known before the Mallik project about the technology necessary to produce gas hydrates.
The successful results from this research form the world's most detailed scientific information about the occurrence and production characteristics of gas hydrates.
The estimated amount of natural gas in the gas hydrate accumulations of the world greatly exceeds the volume of all known conventional gas resources. While gas hydrates hold great potential as an "environmentally-friendly" fuel for the 21st Century, the technical challenges of realizing them as a resource are substantial. Additional research is required to understand and develop new techniques to quantify their distribution in nature.
Depressurization and thermal heating experiments at the Mallik site were extremely successful. The results demonstrated that gas can be produced from gas hydrates with different concentrations and characteristics, exclusively through pressure stimulation. The data supports the interpretation that the gas hydrates are much more permeable and conducive to flow from pressure stimulation than previously thought. In one test, the gas production rates were substantially enhanced by artificially fracturing the reservoir.
So how big a deal is this compared to other fossil fuel energy sources? Gas hydrate reserve estimates vary quite a bit, but some of them are pretty high.
The technology may take between 10 and 15 years to develop, but will help us tap gas hydrate reserves, estimated to be "more than double the known reserves of fossil fuel," said C.N.R. Rao, Founder and Honorary President of the Jawaharlal Nehru Centre for Advanced Scientific Research, and A. Kuznetsov, Director, Institute of Inorganic Chemistry, in Russia.
Interest in hydrate E&P has soared in recent years because of growing evidence that more hydrocarbon exists in hydrate deposits than the combined oil, gas and coal reserves worldwide. According to the U.S. Energy Information Administration in its just-released Natural Gas 1998: Issues and Trends, "Recovery of only 1% of hydrates would more than double the domestic gas resource base." A report from Ocean Drilling Program Leg 164, which investigated the huge Blake Ridge offshore the Carolinas, estimated U.S. methane hydrate reserves at 200,000 Tcf.
That really puts gas hydrates in the big leagues because there is an enormous amount of fossil fuel energy stored in coal.
The estimate was refined in 1997 to a more conservative 200,000 trillion cubic feet. Even this lower estimate is significant when compared to the 1,400 trillion cubic feet in the nation's conventional gas reserves. On a world-wide basis, it is estimated that methane hydrate reserves are 400 million trillion cubic feet, compared with 5,000 trillion cubic feet in known gas reserves.
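Taking the figures quoted above at face value (everything in trillion cubic feet, Tcf), the implied ratios work out as follows. Note that the worldwide ratio comes out implausibly large, which hints at a unit inconsistency somewhere in the quoted source:

```python
# U.S. figures quoted above, in trillion cubic feet (Tcf)
us_hydrates = 200_000
us_conventional = 1_400
print(round(us_hydrates / us_conventional))        # -> 143 (times conventional reserves)

# Worldwide figures as quoted: "400 million trillion" cubic feet vs 5,000 Tcf
world_hydrates = 400_000_000
world_conventional = 5_000
print(round(world_hydrates / world_conventional))  # -> 80000 (suspiciously large)
```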
Well, there may not be enough hydrocarbons in conventional fossil fuel reserves to bring on serious global warming. But if the technology to extract methane from gas hydrates can be made cost-effective then humanity may need to refrain from burning as much fossil fuel as it would then be able to.
In one science fiction novel whose title escapes me (anyone remember the story?) some event (nukes exploded on the ocean floor by accident or by terrorists?) caused all the gas hydrates to come to the surface, and this caused an enormous hothouse effect that melted all the ice and let loose massive hurricanes (or am I mixing up different science fiction novels? They all blend together after a while). The point here is that it would be a bad thing if all the gas hydrates came to the surface in an uncontrolled manner. They constitute a pretty large amount of hydrocarbons.
Update: MIT's Technology Review has an article covering pretty much the same ground as covered in the other links above.
As a source for natural gas, hydrate today is about where coal bed methane was 15 years ago, says Michael Max, a hydrate expert formerly with the Naval Research Laboratory in Washington, D.C. “Coal bed methane was a classic, unconventional gas play,” with more than a few doubters, Max says. “Now it supplies around eight percent of the U.S. natural gas supply. We think hydrate has a similar trajectory.”
Natural gas from hydrates may well become a much higher percentage of the total energy mix if oil field production starts to decline within 10 years as some predict.
Harvard School of Public Health Associate Professor of Society, Human Development, and Health Stephen Buka and colleagues have published a study that finds a link between maternal cigarette smoking and later nicotine dependence of offspring.
Participants in the study were the grown children of mothers enrolled in the Providence, Rhode Island site of the National Collaborative Perinatal Project (NCPP), a multi-site study that involved the observation and examination of more than 50,000 pregnancies through the first seven years of life. Participants for the NCPP were enrolled between 1959 and 1966 and were visited regularly by NCPP investigators. Beginning with the first prenatal meeting and in each subsequent meeting until delivery, the mothers in the study were asked if they smoked, and if so, the number of cigarettes per day. From these data the researchers were able to establish the maximum number of cigarettes smoked at any point during the pregnancy. More than 60 percent of the women smoked during their pregnancies; approximately 35 percent smoked more than a pack per day (20 cigarettes) and nearly 25 percent smoked less than a pack per day.
Offspring whose mothers reported smoking a pack or more of cigarettes per day during their pregnancy were significantly more likely to meet DSM criteria for lifetime nicotine dependence than offspring of mothers who never smoked during their pregnancy. Among offspring who tried cigarettes, the odds of progressing to nicotine dependence were almost twice as great for those whose mothers smoked heavily during pregnancy. In contrast, the use of marijuana was not increased among children whose mothers smoked cigarettes during pregnancy. Marijuana use among the adult offspring was of particular interest to the researchers because of its similar route of administration (inhalation) and because research has shown an association between cigarette smoking and marijuana use.
Stephen Buka, lead author of the study and an associate professor in the Department of Society, Human Development and Health at the Harvard School of Public Health said, “More than half a million infants each year are exposed to cigarette smoke before birth. In the short term, this increases the risk of low birthweight and birth defects, and in the long term, this adds to the likelihood that children will become heavy smokers, dependent on nicotine. Eliminating smoking during pregnancy and afterwards remains a critical challenge for clinicians and for public health practitioners.”
The exposure that altered the brain may not necessarily have come during pregnancy. Since women who smoked during pregnancy were likely to smoke after birth the babies might have had their brains modified by the exposure during the period of postnatal brain development.
The lack of an increase in marijuana smoking among children exposed to nicotine during fetal development is not too surprising. The mechanisms for causing craving for different kinds of drugs are not all identical. A predisposition to become addicted to nicotine may not necessarily translate into a predisposition to become addicted to opiates either.
It is very valuable to identify factors that cause brains to develop in ways that predispose people to become addicts of various sorts. For some types of addiction (e.g. heroin) the success rate of getting off and staying off is less than 50%. The kinds of changes that happen to brains of addicts to make them desire drugs are often very long lasting and possibly permanent in many cases. Techniques are sorely needed to be able to change brains in ways that reduce the cravings for addictive drugs.
In an interview, Buka explained that pregnant women were interviewed many years ago, when people were not yet aware of the health effects of smoking. Consequently, more than 60 percent of the women smoked, and around 35 percent smoked at least one pack on at least one day of their pregnancies.
These poisonous chemicals pose unique and real threats to the unborn child. Smoking during pregnancy is associated with low birth weight, premature delivery, placenta previa (a complication that could cause bleeding and become a medical emergency), miscarriage and post-delivery death. It has also been associated with a 50 to 70 percent higher chance of delivering babies with a cleft lip or palate, and this risk is believed to increase with the number of cigarettes smoked per day.
In a paper published in the scientific journal Climatic Change, Dr. William Ruddiman argues that humanity prevented an ice age that would otherwise have begun about 4,000 or 5,000 years ago.
Both should have continued declining through the present day, leading to lower temperatures, and a new ice age should have begun 4,000 to 5,000 years ago, Dr. Ruddiman said. Instead, levels of carbon dioxide reversed 8,000 years ago and started rising again. The decline in methane levels reversed 5,000 years ago, coinciding with the advent of irrigation rice farming.
If this argument is correct then humanity, by engaging in rice farming and deforestation, reversed a trend of decline in atmospheric carbon dioxide and methane and, by doing so, prevented a cooling trend that would have brought on another ice age. This is a strong argument in favor of climate engineering.
All throughout the natural history of planet Earth the development of new life forms has altered the climate. The only difference between human intervention and intervention by other life forms is that humans have a higher level of sentience and hence can develop an awareness of the effects of their actions. But that awareness by itself is not a sufficient reason to refrain from acting in ways that alter the climate.
There is no stable state of climate that humans are disturbing. The climate has gone through many large changes because of variations in solar output, gradual changes in Earth's orbit, volcanic eruptions, asteroid strikes, and a large assortment of other factors. It is likely to do so again.
Humans could develop the ability to engineer the climate. Attempts to do so run the risk of causing some huge unforeseen outcome. But the biggest argument against climate engineering on a global scale is that most changes would hurt some nations while benefiting others. Warm the planet and Russia and Canada become more livable places. But other places might get less of the rain they need or see crops fail due to shifting rain patterns or temperature changes. A global agreement on climate engineering would be hard to reach unless humanity was faced with a disaster that only climate engineering could prevent.
Update: BBC science correspondent Richard Black says Ruddiman's theory is plausible.
Professor Ruddiman, of the Department of Environmental Sciences at Virginia, believes this 10,000-year warming added almost a degree Celsius to the global average temperature.
This though is a radical departure from existing theories about climate change and will inevitably be debated by other researchers.
But there is supporting evidence, and it is consistent with what we know about deforestation and farming today.
The idea is likely to spark debate among climate scientists, but at least one sceptic is already changing his mind. "I hadn't fully appreciated the actual magnitude of the human disturbance," says Thomas Crowley, who works on global warming at Duke University in Durham, North Carolina. "I've been thinking more and more that Ruddiman is on to something."
Technological advances in coming decades will gradually increase the ability of humans to engage in intentional climate engineering. Instead of having climates engineered as an unintentional side effect it will become possible to modify regional and global climates with conscious intent. Climate changes naturally and our ability to predict how the climate will change without human intervention will increase along with our ability to predict how much and what sorts of changes will come as side effects of human interventions. Those who oppose all human intervention in climate trends will eventually be faced with computer climate models that will be able to show with fairly high probability what the world climate would be like right now or 50 years from now had humans not developed agriculture or any other technology. If the gap between what is and what would have been turns out to be really large the opponents of human-caused climate change are then going to have to explain why we shouldn't engineer the climate to be more like it would have been if we had never developed agriculture. Ice age anyone? If not, why not?
Steven M. Block, a professor of biological sciences and of applied physics at Stanford University, and his team have developed two-dimensional optical force clamps that can monitor the action of a single RNA polymerase (RNAP) enzyme.
In a new study in the journal Nature, Block and his colleagues present strong evidence to support this proofreading hypothesis. Their results -- based on actual observations of individual molecules of RNAP -- are posted on Nature's website: http://www.nature.com. In another set of experiments published in the Nov. 14 issue of the journal Cell, the researchers discovered that RNAP makes thousands of brief pauses as it pries open and copies the DNA double helix.
"Together these two papers push the study of single proteins to new limits," Block said. "We've been able to achieve a resolution of three angstroms -- the width of three hydrogen atoms -- in our measurements of the progress of this enzyme along DNA. In so doing, we've been able to visualize a backtracking motion of just five bases that accompanies RNAP error-correction or proofreading."
Both studies were conducted using two-dimensional optical force clamps -- unique instruments designed and built by the Block lab. Located in soundproofed and temperature-controlled rooms in the basement of Stanford's Herrin labs, these devices allow researchers to trap a single molecule of RNAP in a beam of infrared light, and then watch in real time as it moves along a single molecule of DNA.
"We've been able to reduce drift and noise in our instruments to such an extent that we can see the tiniest motions of these molecules, through distances that are less than their own diameters," Block explained. "Studying one macromolecule at a time, you learn so much more about its properties, but these kinds of experiments were just pipedreams 15 years ago."
This is an example of why the rate of advance in biological science is not constant. The development of instrumentation that can study components of biological systems down on the scale at which they operate will allow these systems to be figured out orders of magnitude more quickly. The biggest reason we still know only a small fraction of what there is to understand about cells and diseases is that we can't watch what happens down at the level at which events actually take place. Continued advances in the ability to build smaller devices and smaller sensors will make observable that which it has previously never been possible to observe.
Satellite cells are a type of adult stem cell that can become myocytes, adipocytes or osteocytes. By becoming myocytes, satellite cells help repair injured muscle. Satellite cells do not divide as rapidly in older animals, and as a result muscles do not heal as rapidly as humans and animals age. Stanford University School of Medicine researchers have discovered that a compound that mimics the effect of a satellite cell regulatory protein can cause satellite cells to repair older muscles more rapidly.
In previous work, Rando found that satellite cells spring into action when a protein on the cell surface called Notch becomes activated, much like flicking the cell’s molecular “on” switch. What flips the switch is another protein called Delta, which is made on nearby cells in injured muscle. This same combination of Delta and Notch also plays a role in guiding cells through embryonic development.
Having found this pathway, Rando and Conboy wondered whether slow healing in older muscles resulted from problems with signaling between Delta and Notch – failing either to make enough Delta or to respond to the Delta signal.
In their initial experiments, Rando and Conboy found that young, middle-aged and older mice all had the same number of satellite cells in their muscles and that these cells contained equivalent amounts of Notch.
“It doesn’t seem as if there’s anything wrong with the satellite cells or Notch in aged muscle,” Rando said. That left Delta as the suspect molecule.
To test whether older muscles produce normal amounts of Delta, the researchers looked at the amount of protein made by mice of different ages. Young and adult mice, equivalent to about 20- and 45-year-old humans, both had a large increase in Delta after an injury. Muscles in older mice, equivalent to a 70-year-old human, made much less Delta after an injury, giving a smaller cry for help to the satellite cells. In response, fewer satellite cells were activated to repair the muscle damage.
A further set of experiments showed that slow repair in older muscles can be overcome. When the team applied a molecule to young muscles that blocked Delta, those satellite cells failed to divide in response to damage. Conversely, when they applied a Delta-mimicking molecule to injured, older muscles, satellite cells began dividing much like those in younger muscle. The older muscles with artificially activated satellite cells had a regenerative ability comparable to that of younger muscle.
Although the studies focused on muscle regeneration after injury, Rando said similar problems with the interplay between Delta and Notch may cause the gradual muscle atrophy that occurs in older people, in astronauts or in people whose limbs are immobilized in a cast or from bed rest.
There might be cancer risks from taking a Delta-mimicking drug as a long-term treatment to avoid the muscle atrophy that comes with age. It is likely that the satellite cells really are aging and the down-regulation of Delta might be an evolutionary adaptation to reduce the risk that mutated and damaged pre-cancerous satellite cells might be stimulated to divide and become cancerous. This result does not eliminate the need to develop cell therapies to replace satellite cells with more youthful replacements.
The other reason that lower Delta activity with age might have been selected for as an evolutionary adaptation is also age related: conserving the cells by reducing the number of times they divide. The satellite cells probably can divide only a limited number of times. By reducing the production of Delta with age, the satellite cells might be conserved for higher priority uses. Upregulating Delta or delivering a Delta agonist might simply wear out the satellite cells too rapidly, providing a short-term benefit at the cost of a greater long-term harm.
What is needed is a process that can easily isolate aged adult stem cells from one's own body and basically refurbish and rejuvenate them. It is not too hard to see the broad outlines of what such a rejuvenation process might look like. One step has got to be a way to sort through different stem cells isolated from the body to choose ones that have little damage to their chromosomes and, in particular, little or no damage to genes that regulate cell growth. An accumulation of mutations in genes that regulate cell growth is what produces cancers, so rejuvenation of stem cells that are close to becoming cancerous would pose a substantial health risk. Gene therapy applied to carefully selected adult stem cells would elongate their telomeres and perhaps make other rejuvenating repairs. Then the rejuvenated cells would be grown up in large numbers and reinjected into the appropriate locations of the body from which they were originally isolated.
Writing in the journal Science, influenza experts Richard Webby and Robert Webster see the inevitable rise of a future mutant strain of influenza that could kill millions of people.
Nature's "on-going experiments" with influenza strains "may be the greatest bio-terror threat of all", with the world alarmingly unprepared for a global epidemic, researchers warn today.
Another pandemic of the kind that killed up to 40 million people worldwide in 1918 is inevitable - and could be imminent, according to a team from St Jude Children's Research Hospital in Memphis.
Using current technologies, it takes as long as six months to create flu vaccines.
"The world will be in deep trouble if the impending influenza pandemic strikes this week, this month, or even this year," write international flu experts Richard Webby and Robert Webster of St Jude Children's Research Hospital in Memphis.
The technology exists to make a flu vaccine more accurately and much more quickly through a technique called reverse genetics, but the method is not yet approved for use in humans. The current method uses chicken eggs as a sort of incubator for viral genes, a process that involves both guesswork and time. Reverse genetics relies less on both of these factors.
"The advantages of reverse genetics are that we can do it much more quickly and have exactly what we need," Webster said, sometimes in as little as two or three weeks. Clinical trials are needed, he added.
Two families of drugs are now available for the flu, amantadine and neuraminidase inhibitors (like Tamiflu and Relenza). Existing supplies could be wiped out in days, however, Webster cautioned. "It would take about 18 months to start from primary chemicals to make more antivirals," he said.
“If an influenza pandemic started tomorrow, we would not be able to head it off with vaccines because the production facilities available to produce them are grossly inadequate,” said Robert G. Webster, Ph.D., a member of the Infectious Diseases department and holder of the Rose Marie Thomas Chair at St. Jude. Webster is co-author of the Science article.
This ability to rapidly make hundreds of millions of vaccine shots is a capability that the industrialized nations ought to develop. It would be useful in an emergency response to both natural and man-made pathogen pandemics. There is also the problem of how to rapidly identify which antigens a DNA vaccine or other type of vaccine should code for. In the case of influenza outbreaks that usually can be done fairly rapidly. But for other pathogens the job can be much tougher (even taking many years), and therefore there is a need for the development of biotechnologies that could accelerate the process of identifying the most promising antigens to use in a vaccine.
As for the antiviral drugs: the existing antivirals used against influenza are of such limited efficacy that the FDA is criticized in some quarters for approving them for sale. When a real killer influenza pops up, the ability to very rapidly develop and manufacture a vaccine will likely be of far greater benefit than the antiviral drugs. But what would probably help the most is the rapid implementation of rules and the propagation of sound advice to reduce the risk of transmission. It would help to do more research ahead of time to identify the least disruptive techniques for reducing the rate of transmission of influenza. What would be the relative value of face masks, closure of schools, avoidance of touching handrails or door knobs, and other changes to daily routines? That would be great to know.
University of Wisconsin-Madison researchers have discovered that the same brain pathways involved in rewarding addictive drug use appear to reward compulsive running behavior.
The researchers studied changes in brain activity in two groups of rodents: typical laboratory mice and a special breed of mice selected over 29 generations for their affinity for voluntary wheel running.
"All mice run on wheels, and, therefore, have a motivation to run," says Justin Rhodes, a postdoctoral fellow at Oregon Health & Science University who completed the study while a graduate student at UW-Madison. But he adds that the specially bred mice have a genetic predisposition to run longer distances.
"They represent those few extreme individuals in the population with an intense desire or compulsion to run," he says.
To understand what drives these mice to run faster and farther than the average mouse, Rhodes and his colleagues at UW-Madison designed a study to measure changes in brain activity when both groups of mice were granted or denied access to the running wheel. For six days, they let all mice run as long as they wanted, and they recorded their distances. By and large, the high wheel running mice, compared to the other group, covered more ground in the same amount of time on their spinning treadmills. On the sixth day, for example, these mice averaged about six miles, compared to about two miles among the controls.
On the seventh day, the researchers blocked half the mice in each group from the wheel while giving free access to the other half. Five hours later, when the mice usually reach their running peak, the researchers compared brain activity in each mouse by measuring levels of Fos, a gene that's expressed in response to neuronal excitement.
"We thought we'd see more activity in the mice doing the running, but that's not what we saw at all," says Stephen Gammie, assistant professor of zoology at UW-Madison and senior author of the recent paper.
Instead, Gammie, Rhodes and their Wisconsin colleague Theodore Garland, Jr., (now at the University of California, Riverside) found that all the mice denied access showed higher levels of neuronal stimulation in 16 out of 25 brain regions.
Stimulation was even greater in mice that typically ran longer distances, showing a correlation between brain activity levels and average amount of wheel running.
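The reported relationship can be illustrated with a toy Pearson correlation between running distance and Fos counts. The numbers below are invented purely for the sketch; they are not the study's data:

```python
# Toy sketch: correlating average wheel-running distance with Fos-positive
# cell counts across mice. All values are made up for illustration.

def pearson_r(xs, ys):
    """Pearson correlation coefficient for two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# hypothetical miles run on day six vs. Fos counts when blocked from running
miles = [1.8, 2.1, 2.4, 5.5, 6.0, 6.3]
fos = [40, 45, 50, 90, 100, 110]

print(round(pearson_r(miles, fos), 3))
```

A strongly positive coefficient is the quantitative form of the finding that the mice bred to run longest showed the greatest neuronal activation when denied the wheel.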
"In the high-running mice, certain brain regions displayed extremely high levels of activity, more than normal," says Rhodes. "These were the same brain regions that become activated when you prevent rats from getting their daily fix of cocaine, morphine, alcohol or nicotine."
The researchers explain that blocking the running behavior in the mice bred to do more voluntary wheel spinning triggers a neuronal response - activation of brain regions involved in reward circuitry - that drives them to run. Explains Gammie, "These mice have run for six days. They want to run, and they're ready to run, but they can't. Change in brain activity is an indication of their motivation to run."
These findings then would suggest that all mice have the motivation to run, since each blocked mouse showed neuronal stimulation, but that some mice may actually crave it. After all, abstaining from their running regimen signals the same pathways involved in the craving for drugs of abuse, says Rhodes.
Whether these findings on exercise motivation hold true for humans remains to be studied. If they do, anecdotal evidence from Rhodes and Gammie would suggest that the two researchers have more in common with the study's control mice: while they bike or play ultimate Frisbee, neither one says he feels the compulsion to do it on a regular basis.
One big reason cited for pursuing research into addiction is that such research may eventually point the way toward ways to treat and stop addictions. Also, since addiction shows up as a form of compulsion, the study of addiction may lead to better treatments for Obsessive Compulsive Disorders (OCDs). But isn't it easy to imagine all sorts of Obsessive Compulsive Adaptations (let's call them OCAs) where the compulsion is to do things that you really think you ought to do? The most obvious example of this is running. But suppose you want to study math or clean the house but never seem to get a strong enough urge to do it. Wouldn't it be handy to be able to flip a neural switch to engage a compulsive desire for, say, 30 minutes and then have the desire automatically shut down?
Our real problem is not that we have addictions or compulsions. Our problem is that we have compulsions that are counterproductive and harmful. What we need is the ability to shape our compulsions to make them more adaptive. It would be very handy to be able to perform some behavior and, while doing it, send a signal to one's brain to reward that behavior and to record that a reward should be delivered whenever that behavior is done. One could set that up when faced with the need to do a large amount of fairly repetitive and boring but necessary work. This would probably dramatically enhance productivity for many tasks. Though it would also be necessary to be able to turn off a compulsion so that, in situations where it is not appropriate, one could turn one's attention to other activities.
Of course, with any new capability comes new ways for it to be abused. The ability to program in compulsions would open up the potential both for self-abuse (imagine a really depressed person with low self-esteem programming in a compulsion toward physical self-torture) and for abuse by others: a military could program in a compulsion to obey a particular person, or a pimp could program young girls to desire to be prostitutes. When technologies are developed that make the mind more reprogrammable, the issue of who does the programming and for what purposes may become one that wars are fought over. Still, I'd like to be able to shape and control the intensity of my compulsions in order to make myself more productive.
New research from the National Institute of Standards and Technology (NIST) suggests that next-generation, high-temperature superconductor (HTS) wire can withstand more mechanical strain than originally thought. As a result, superconductor power cables employing this future wire may be used for transmission grid applications. Projected to become available in three to four years, the advanced superconductor wire (known in the industry as second generation HTS wire) is expected to cost less than the HTS wire used in today's superconductor power cables. The NIST research is described in the Nov. 17 issue of Applied Physics Letters.
Superconductor power cables can carry three to five times the power of conventional copper cables. Compact, underground superconductor cables can be used to expand capacity and direct power flows at strategic points on the electric power grid and can be used in city centers where there is enormous demand, but little space under the streets for additional copper cables. One important challenge in using this next-generation HTS wire in such applications is the need for sufficient strength and resiliency to withstand the stretching and bending that occurs during power cable fabrication and installation.
Using superconductor ceramic coatings on metallic substrates fabricated by American Superconductor Corp. and Oak Ridge National Laboratory, the NIST researchers tested the material's electromechanical properties. According to lead author Najib Cheggour, they found that these advanced wires could stretch almost twice as much as previously believed without any cracking of the superconductor coating and with almost no loss in the coating's ability to carry electricity.
Moreover, the NIST team found that strain-induced degradation of the superconductors' ability to carry electricity is reversible up to a certain critical strain value. That is, the materials return to their original condition once the strain is relieved. The strain tolerance of this future HTS wire was found to be high enough for even the most demanding electric utility applications. The discovered reversible strain effect also opens new opportunities for better understanding of the mechanisms governing the conduction of electricity in this class of superconductors.
I just love it when better living is made possible by advances in materials science.
Some day we may be able to walk into a store and be completely alone, without a living person in sight. Imagine walking out holding the items you want and being billed instantly just as you leave the store. No confrontations, no customer service, no cute check-out girl. Isn't our future grand?
To entice you further, APS is offering $50 off to the first 100,000 registrants at the time of their first "chipping" procedure.
Does anyone remember James Coburn in that 1967 paranoid classic movie The President's Analyst? At one point Coburn's character is kidnapped by "The Phone Company" because "The Phone Company" wants Coburn to convince the President of the United States to authorize the implantation of embedded telephone devices in everyone's brains that would allow everyone to think a phone number and have a phone connection made instantly to that phone number. Well, this proposal is not quite as radical. But effortless totally automated and instantaneous shopping check-out certainly would take us in that general direction.
VeriChip is a subdermal, radio frequency identification (RFID) device that can be used in a variety of security, financial, emergency identification and other applications. About the size of a grain of rice, each VeriChip product contains a unique verification number that is captured by briefly passing a proprietary scanner over the VeriChip. The standard location of the microchip is in the triceps area between the elbow and the shoulder of the right arm. The brief outpatient “chipping” procedure lasts just a few minutes and involves only local anesthetic followed by quick, painless insertion of the VeriChip. Once inserted just under the skin, the VeriChip is inconspicuous to the naked eye. A small amount of radio frequency energy passes from the scanner energizing the dormant VeriChip, which then emits a radio frequency signal transmitting the verification number. In October 2002, the US Food and Drug Administration (FDA) ruled that VeriChip is not a regulated device with regard to its security, financial, personal identification/safety applications but that VeriChip's healthcare information applications are regulated by the FDA. VeriChip Corporation is a wholly owned subsidiary of Applied Digital Solutions.
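The flow described above is simple: the passive chip returns only a unique verification number, and everything else lives in a backend database keyed by that number. A minimal sketch of that lookup, with an invented registry and function names (not VeriChip's actual system), might look like this:

```python
# Hypothetical sketch of a scan-and-lookup flow: the chip carries only an
# ID; account and identity data are resolved server-side. The registry
# contents below are made up for illustration.

REGISTRY = {
    "1A2B3C4D": {"holder": "J. Doe", "account": "acct-0042"},
}

def scan_chip(verification_number: str):
    """Simulate a scanner reading a chip ID and resolving it to a record."""
    record = REGISTRY.get(verification_number)
    if record is None:
        return None  # unknown chip: no account linked
    return record

print(scan_chip("1A2B3C4D"))
```

Keeping all sensitive data off the chip is what makes the device itself nearly worthless to steal; the security burden shifts to the backend database and to verifying that the chip is still in its rightful owner.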
Recall that one reason carjackings have become popular is that it is much harder for criminals to steal unattended cars that have more technologically advanced anti-theft features. So attempts to steal a car from a person who has the key in hand or in the ignition are a response to technological advances in anti-theft technology. Well, it is easy to imagine some of the ways that criminals might respond to embedded credit cards:
Biometric tests combined with an embedded chip would eliminate the value of just taking the chip out of the body. A really advanced biometric test could even check the body temperature or iris response to light in order to verify that a person is alive and conscious. Another possible counter would be to put sensors on the device that check via various means whether it is still in the target host body and whether that body is still alive and free of trauma. One can even imagine an embedded cell phone technology where the device would phone for help in event that it is either removed from its host or the host is significantly harmed. So each counter the criminals might develop could be met by still more technological counters.
Yet this is precisely why Katherine Albrecht, the founder of the consumer advocacy group CASPIAN, finds Veripay frightening: "It's a lot easier to cancel a credit card account than it is to gouge a chip out of your arm." She worries that the chips will provide tracking opportunities for advertisers wishing to know the intimate shopping habits of particular consumers.
If the idea of this device seems too creepy, keep in mind that its use is voluntary. Will embedded credit cards take off in popularity? Or will some other first application better break through popular resistance? For instance, I'd expect embedded devices that could identify a person's location to catch on with less resistance than embedded credit cards, since many parents would be strongly attracted to the idea of being able to rapidly find a kidnapped child. Another target market for embedded devices that will meet with less resistance is devices for health problems. An embedded device that could place a cell phone call to alert that a person is having an epileptic fit or a heart attack would be attractive to many people. Also, for Alzheimer's patients, the ability to find them if they wandered off, or for law enforcement personnel to scan them to figure out who they are and where to return them, would be of some value.
Reader Jonathan Swerdloff has brought to my attention what strikes me as a worthy cause: Arthur J. Olson's laboratory at the Scripps Research Institute has a distributed computing program runnable on home computers with internet connections for screening drug compounds against HIV viral proteins to discover possible HIV treatments.
Go to battle against AIDS with your computer!
"So what is FightAIDS@Home?"
You can help!
FightAIDS@Home is the first biomedical distributed computing project ever launched. It is run by the Olson Laboratory at The Scripps Research Institute, and uses your computer to assist fundamental research to discover new drugs, using our growing knowledge of the structural biology of AIDS.
"Why should I join?"
About 42 million people are living with HIV or AIDS around the world. HIV mutates and evolves very quickly. Drug resistance is on the rise. If there is any "bioterrorism" in the world, it comes from Nature itself, in the form of HIV, and we need to fight this very real and long-standing problem now - more than any other threat to humanity.
So every computer counts! Your CPU helps to screen millions of candidate drug compounds computationally against detailed models of evolving AIDS viruses—an accomplishment previously impossible without expensive supercomputers. FightAIDS@Home accelerates AIDS research by connecting you to a global "grid" of distributed computing power.
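The distributed-screening idea can be sketched as a work-unit loop: each volunteer machine pulls a batch of candidate compounds, scores each against the protein target, and reports the best scorers back to the server. The scoring function below is a stand-in (real projects use docking software such as AutoDock), and all names and numbers are illustrative:

```python
# Toy sketch of a distributed virtual-screening work unit. dock_score is a
# deterministic stand-in for an expensive docking computation; real scores
# would come from molecular docking against an HIV protease model.

import random

def dock_score(compound_id: int) -> float:
    """Stand-in docking score (lower = stronger predicted binding)."""
    rng = random.Random(compound_id)  # deterministic per compound
    return rng.uniform(-12.0, 0.0)   # pretend binding energy, kcal/mol

def process_work_unit(compound_ids, keep=3):
    """Score a batch locally and return only the top hits to the server."""
    scored = [(dock_score(c), c) for c in compound_ids]
    scored.sort()  # most negative (best) scores first
    return scored[:keep]

hits = process_work_unit(range(1000))
print(len(hits))
```

The key design point is that each work unit is independent, so millions of compounds can be split across thousands of home computers with no communication beyond handing out batches and collecting the short list of hits.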
Together, we are making a difference!
Your donation of spare computer cycles helps us in our entirely non-profit, scientific endeavours. Entropia helped to launch the FightAIDS@Home project, and we are grateful for their help and donated efforts, but as of May 2003, FightAIDS@Home is no longer associated with Entropia.
Professor Olson leads a large program project funded by the National Institutes of Health to develop new approaches to discover novel AIDS therapeutics based upon our ever-increasing knowledge of the structural biology of HIV.
We are working together with other laboratories here at Scripps and elsewhere, to design, synthesize and test new HIV protease inhibitors that are better than existing drugs in defeating the virus's ability to develop drug resistance. Our collaborators include:
The Elder Laboratory - Virology
The Olson Laboratory - Computational Chemistry
The Sharpless Laboratory - Synthetic Chemistry
The Stout Laboratory - Xray Crystallography
The Torbett Laboratory - Cell Biology
The Wlodawer Laboratory - Xray Crystallography
The Wong Laboratory - Synthetic Chemistry
If there are other worthy biomedical research distributed computing projects that anyone wants to bring to my attention then please post in the comments to this post or send me an email. I'd like to build up a category archive collection of posts linking to such projects.