Whitehead Institute Fellow Alan Jasanoff at MIT is developing new techniques to use functional Magnetic Resonance Imaging (fMRI) to study electrical activity in the brain.
Identifying such networks is a goal that drives Jasanoff, who is pioneering new fMRI techniques that go beyond blood flow to expose the brain’s electrical activity—a series of impulses that transmits messages between neurons. The techniques are still experimental, so Jasanoff works with laboratory animals to isolate neural circuits involved in simple behaviors. “What we learn about simple behaviors in animals guides us toward an understanding of more complex behaviors in humans,” Jasanoff says. “Our findings can influence the direction of human research.”
Researchers trying to “get inside the brain” during experimental research traditionally have relied on electrodes wired directly into neural tissue. This process is not only invasive and cumbersome, it’s also limited in terms of its spatial coverage—electrodes gather data only from the area to which they are attached. Jasanoff’s research is offering another option, namely, a set of MRI contrast, or imaging, agents that can be selectively activated by the brain’s electrical currents. “My approach will provide a direct assay for neural activity deep within the brain,” Jasanoff says. “This is unlike anything that is currently available.”
To date, Jasanoff’s focus has been on establishing a way to test imaging agents for fMRI in single brain cells of an oversized housefly called a “blowfly.” He presented the blowfly brain imaging approach in a 2002 article in the Journal of Magnetic Resonance, and demonstrated an oxygen imaging application using the setup in a 2003 article in the journal Magnetic Resonance in Medicine. Now Jasanoff is completing work on two new brain imaging agents, and intends to adapt the agents so they can be used safely in higher organisms, for instance, rodents. Studies in animals are necessary before the agents can be used in experiments with human subjects, a step in the research that Jasanoff notes is many years away.
The current use of fMRI to measure blood flow limits how much information can be discovered. Imaging agents that can detect actual patterns of electrical activity would let fMRI measure brain activity far more directly than it does now. So Jasanoff's work is potentially very important for brain studies.
While this work is years away from being applied to humans, it illustrates a larger trend familiar to long-time FuturePundit readers: biological assay tools are becoming more powerful. Our ability to measure biological structures and activity is steadily increasing. Phenomena that are now difficult to study and to manipulate will become steadily easier to watch and to change.
The National Human Genome Research Institute (NHGRI) portion of the US National Institutes of Health (NIH) has awarded a Centers of Excellence in Genomic Science (CEGS) grant to a team at Harvard Medical School to develop cheaper and faster DNA sequencing technologies.
At Harvard, a team led by George Church, Ph.D., will address the biomedical research community's need for better and more cost-effective technologies for imaging biological systems at the level of DNA molecules (genomes) and RNA molecules (transcriptomes). The center will receive $2 million annually in CEGS funding for five years.
Specifically, the Harvard center plans to further develop polymerase colony sequencing technologies for studying sequence variation in biological systems. In this highly parallel method of nucleic acid analysis, a sample of DNA is dispersed as many short fragments in a polyacrylamide gel affixed to a microscope slide. Researchers then add an enzyme called DNA polymerase, which copies each DNA fragment repeatedly, forming tiny, localized sets of identical fragments. These sets of fragments are embedded in the gel in a manner reminiscent of bacterial colonies, which has prompted scientists to refer to them as "polonies."
Next, the polonies are exposed sequentially to free DNA bases tagged with fluorescent markers in the presence of a different enzyme, and the incorporation of those bases into the polonies is monitored with a scanning machine. This produces a read-out of the DNA sequence from each polony. A computer program then assembles the DNA sequences from the individual polonies into an order that reflects the complete sequence of the original DNA sample. The ordering process is accomplished by aligning the sequences of the individual polonies with a reference DNA sequence, such as the sequence produced by the Human Genome Project. In addition to its application in DNA sequencing, polony technology can be used to study the transcriptome (RNA content) of cells and to determine differences in genome sequence between different individuals (genotypes and haplotypes).
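The alignment step described above, ordering the sequence read out of each polony against a reference, can be sketched in a toy form. Everything here (the reference string, the reads, the exact-match alignment) is an invented miniature for illustration; real polony sequencing images millions of spots in parallel and uses far more sophisticated alignment.

```python
# Toy sketch of the polony read-out and assembly step described above.
# The reference and reads are invented; real data is vastly larger.

REFERENCE = "ACGTTGACCTGAAGTCCATG"  # stands in for a known reference sequence

# Short reads, one per polony, as the fluorescence scanner might report them.
polony_reads = ["CCTGA", "ACGTT", "AGTCC", "TGACC", "CCATG"]

def align(read, reference):
    """Return the offset where `read` exactly matches `reference`, or None."""
    idx = reference.find(read)
    return idx if idx >= 0 else None

# Order the polony reads by their position against the reference,
# mirroring the step that reconstructs the original sample's sequence.
placed = sorted(
    (align(r, REFERENCE), r)
    for r in polony_reads
    if align(r, REFERENCE) is not None
)

for offset, read in placed:
    print(f"offset {offset:2d}: {read}")
```

The key design point mirrored here is that each polony contributes only a short local read, and the global order is recovered by alignment to a reference rather than by assembling the reads against each other.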
The technology developed by Church's team currently can read a slide with 10 million polonies in about 20 minutes, making it one of the swiftest DNA sequencing methods now available. With the further development planned at the center, the technology has the potential to lead to quicker, more cost-effective ways of sequencing individual genomes for use in research or clinical settings. Producing a high-quality draft of a mammalian-sized genome currently costs about $20 million, but NHGRI's aim is to dramatically reduce that cost to $1,000 over the next 10 years.
"In order to reach that ambitious goal, we will need to develop a completely integrated system that requires very small volumes and utilizes very inexpensive instruments. Ideally, the system would cost no more than a good desktop computer," said Dr. Church.
Cheap fast DNA sequencing will allow individuals to have their DNA sequenced. That ability will usher in a new era where the knowledge of a given person's DNA sequences will lead to many changes. Among the practices that will become commonplace once personal DNA sequencing becomes cheap:
Scientists at the University of Florence in Italy found that when youngsters were deprived of their TV sets, computers and video games, their melatonin production increased by an average 30 per cent.
“Girls are reaching puberty much earlier than in the 1950s. One reason is due to their average increase in weight; but another may be due to reduced levels of melatonin,” suggests Roberto Salti, who led the study. “Animal studies have shown that low melatonin levels have an important role in promoting an early onset of puberty.”
Don't be too surprised if boys embarrassed by a lack of pubic hairs and girls wanting to grow breasts start watching more TV.
Alessandra Graziottin, director of the Centre for Gynaecology and Medical Sexology in Milan and a former president of the International Society for the Study of Women's Sexual Health, said the results were "very interesting and plausible".
She told the newspaper La Repubblica: "Studies in the US have shown that the greater the exposure to television the greater the number of early sexual experiences, including teen pregnancies."
I've previously argued for the delay of puberty in order to help kids be less distracted by sexual desires and more capable of learning. But the question hanging over this proposal is whether the delay of puberty will also delay brain changes that increase cognitive ability. We literally become smarter in our teen years and in recognition of this fact IQ tests are routinely normalized for age. We need psychometric research that tracks IQ changes as a function of the onset of puberty.
To get a sense of just how much the brain changes during adolescence see my post Adolescence Is Tough On The Brain.
Julie Daniels, an epidemiologist at the University of North Carolina at Chapel Hill School of Public Health, has found in a population of English children that consumption of fish by mothers during pregnancy is positively correlated with cognitive development after controlling for educational levels of the mothers and some other factors.
CHAPEL HILL -- When fish is not contaminated, moderate consumption of the protein-rich food source by pregnant women and young children appears to boost the children’s neurological development, a new study shows.
"Our research adds to the literature suggesting that fish contains nutrients that may enhance early brain development," said Dr. Julie Daniels, assistant professor of epidemiology at the University of North Carolina at Chapel Hill School of Public Health. "We can not say that we have proven that eating fish will have long-lasting effects in making people smarter since we have only looked at early development markers through an observational study."
More research is needed to corroborate the findings, Daniels said.
A report on the study appears in the July issue of the journal Epidemiology. Besides Daniels, authors are Drs. Matthew P. Longnecker of the National Institute of Environmental Health Sciences, Andrew S. Rowland of the University of New Mexico’s family and community medicine department and Jean Golding of the University of Bristol Institute of Child Health’s ALSPAC Study Team.
Conducted in Bristol, England, the research involved evaluating the association between mothers’ fish intake during pregnancy and their offspring’s early development of language and communication skills, Daniels said.
The team evaluated 7,421 English children born in 1991 and 1992. They studied the children since much has been learned about contaminants in fish, but little research has been done on the potential developmental benefits of eating fish, she said.
"We measured mothers’ and children’s fish intake by questionnaire," Daniels said. "Later, we assessed each child’s cognitive development using adaptations of the MacArthur Communicative Development Inventory at 15 months and the Denver Developmental Screening Test at 18 months."
Researchers also measured mercury levels in umbilical cord tissue for a subset of 1,054 children.
"We found total mercury concentrations to be low and not associated with neurodevelopment," she said. "Fish intake by mothers during pregnancy, and by infants after birth, was associated with higher mean developmental scores. For example, the adjusted mean MacArthur comprehension score for children whose mothers consumed fish four or more times a week was 72 compared with 68 among those whose mothers did not consume fish. While this may not be a major difference clinically, the statistically significant results were consistent across related subtests, which could be important across a large population."
Scientists found that there was a subtle but consistent link between eating fish during pregnancy and children’s subsequent test scores, even after adjusting for factors such as the age and education of the mother, whether she breastfed and the quality of the home environment.
The largest effect was seen in a test of the children’s understanding of words at age 15 months. Children whose mothers ate fish at least once a week scored 7 percent higher than those whose mothers never ate fish.
A similar pattern, although less marked, was seen in tests measuring social activity and language development. Developmental scores were also higher among children who also ate fish at least once a week before their first birthdays.
The study suggests that if a woman eats moderate quantities of fish -- about two to three servings per week, or 12 ounces, of non-contaminated species -- her child might benefit, the scientist said. There is no evidence that the more fish a woman eats, the higher that benefit would be.
"Women should definitely avoid shark, swordfish, king mackerel and tilefish, according to the U.S. Environmental Protection Agency and Food and Drug Administration," Daniels said. "Those fish are higher on the food chain and have greater accumulation of pollutants."
Depending on the region where they are caught, many of the most commonly eaten fish are low in pollutants while still being high in critical long-chain fatty acids and other nutrients, she said. They include salmon, herring, pollock, canned light tuna and sardines.
Daniels said she is pursuing similar work in a group of U.S. children to confirm the results in other populations.
"We also need to follow the children longer to determine whether any benefits from fish intake are permanent or transient," she said.
Fish intake during pregnancy has the potential to improve fetal development because it is a good source of iron and long chain omega fatty acids, which are necessary for proper development and function of the nervous system, Daniels said. Fish, especially oily fish, is a dietary source of eicosapentaenoic and docosahexaenoic acids (DHA), which are important in the structural and functional development of the brain before birth and through a child’s first year. The concentration of DHA in fetal brain increases rapidly during the last three months in the womb.
The Avon Longitudinal Study of Parents and Children, or ALSPAC, (also known as Children of the 90s) is a continuing research project based at the University of Bristol. It enrolled 14,000 mothers during pregnancy in 1991-2 and has followed the children and parents in minute detail ever since.
Support for the study came from the Medical Research Council, the Wellcome Trust, the Department of Health, the Department of the Environment, DfEE, Nutricia and other companies, all in the United Kingdom.
The US Food and Drug Administration has a very handy page of mercury levels in fish. See the chart Mercury Levels in Commercial Fish and Shellfish and follow down the column labelled "MEAN" to see how the various types of fish compare. Women especially would do well to avoid the higher-scoring fish. You have to decide for yourself where you want to draw the line. Personally, I do not eat anything above 0.10 PPM and usually choose fish well below even that limit.
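The 0.10 PPM screen amounts to a simple filter over the FDA chart's "MEAN" column. A minimal sketch follows; the mean mercury values below are rough approximations of that chart and may be out of date, so look up the current figures yourself before relying on them.

```python
# Illustrative sketch of the 0.10 PPM screen described above. The mean
# mercury figures are rough approximations of the FDA chart, not
# authoritative data; consult the chart's "MEAN" column directly.

approx_mean_mercury_ppm = {
    "swordfish": 0.97,
    "king mackerel": 0.73,
    "canned albacore tuna": 0.35,
    "canned light tuna": 0.12,
    "pollock": 0.04,
    "sardine": 0.02,
    "wild salmon": 0.01,
}

LIMIT_PPM = 0.10  # the personal cutoff mentioned in the text

# Keep only the fish at or below the cutoff.
acceptable = sorted(
    name for name, ppm in approx_mean_mercury_ppm.items() if ppm <= LIMIT_PPM
)
print(acceptable)
```

Note that even canned light tuna, often treated as a low-mercury choice, sits just above a 0.10 PPM cutoff in these approximate figures, which is why the exact placement of the line is a personal judgment call.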
Some commentators recommend avoiding fatty fish because more PCBs, DDT, dioxin, and other chemical compounds will be found in the fat. But the big health benefit from fish comes from the omega-3 fatty acids found in fish. So the advice to avoid fatty fish ends up defeating the purpose of eating fish in the first place.
I think a smarter approach is to avoid the types of fish that have been shown to have the most chemical contamination and to eat fish which have omega-3 fatty acids as a high fraction of total fats. That way you can limit the total amount of fish fat you need to consume in order to get the omega-3 fats. I prefer ocean fish to fresh water fish since the chemical pollution has become far more concentrated in lakes and rivers than in the oceans.
Salmon has very low levels of mercury, and wild salmon has lower levels of PCBs than beef. It also has a relatively good ratio of omega-3 fatty acids to omega-6 fatty acids. However, the Environmental Working Group has found that most farmed salmon has much higher levels of PCBs along with greater amounts of non-omega-3 fats.
Seven of ten farmed salmon purchased at grocery stores in Washington DC, San Francisco, and Portland, Oregon were contaminated with polychlorinated biphenyls (PCBs) at levels that raise health concerns, according to independent laboratory tests commissioned by Environmental Working Group.
These first-ever tests of farmed salmon from U.S. grocery stores show that farmed salmon are likely the most PCB-contaminated protein source in the U.S. food supply. On average farmed salmon have 16 times the dioxin-like PCBs found in wild salmon, 4 times the levels in beef, and 3.4 times the dioxin-like PCBs found in other seafood. The levels found in these tests track previous studies of farmed salmon contamination by scientists from Canada, Ireland, and the U.K. In total, these studies support the conclusion that American consumers nationwide are exposed to elevated PCB levels by eating farmed salmon.
However, not all farmed salmon has higher chemical contamination. Environmental Working Group did find 2 farmed salmon companies which use feeds that keep PCB levels down as low as those found in wild salmon.
Some farmed salmon companies -- Black Pearl and Clare Island Sea Farm -- are producing salmon that have very low PCB levels similar to those of wild salmon, Green said. These producers use herring and sardine fish meal, canola oil, soya and other uncontaminated ingredients.
Seek out the Black Pearl and Clare Island Sea Farm brands if you can find them.
Whether the higher levels of PCBs pose a health risk is not clear. Contaminant risks are typically greater for developing fetuses than for adults because the fetuses are going through complex changes which, if disrupted by chemical toxins, can cause improper development with very lasting and even permanent results. Pregnant women therefore need to be more conservative when evaluating risks.
Keep in mind when evaluating the risks of fish that fish consumption probably will reduce your risk of heart disease and other diseases in which inflammation mechanisms are implicated (e.g. arthritis and perhaps some of the neurodegenerative diseases). So the risks of chemical contaminants have to be weighed against the health benefits of eating foods high in omega-3 fatty acids. In the case of wild salmon the mercury and chemical contaminant levels are so low that a strong argument can be made for eating it. My guess is that even farmed salmon is a strong net health benefit for the vast majority of people, and in the case of farmed salmon raised on feed with low levels of contaminants the benefit is almost as big as the benefit from eating wild salmon.
The Lancaster University research found that if you are journeying from Edinburgh to London by standard Intercity train with all the seats taken, you will be using slightly more fuel per passenger - about 11 litres of fuel per passenger compared to about ten litres - than you would if you made the same journey by car, with all the seats occupied.
By using more fuel, you are causing more damage to the environment through emissions not to mention through using up more of the planet’s natural resources.
And if you choose to travel on one of the soon-to-be-launched higher speed trains, you would be using slightly more fuel than a plane - about 22 litres of fuel per passenger compared to about 20 litres - and more than twice as much fuel per head as the car, at ten litres.
Of course the average train over that particular journey probably has a higher load factor than the average car travelling between that same pair of cities.
Regulations and higher speeds are making British trains less fuel efficient. Roger Ford, of Modern Railways magazine, provides the awful truth to environmentalists.
The introduction of crumple zones, disabled lavatories and seating rules for trains travelling over 100mph had added weight and reduced capacity.
"I know this will generate howls of protest, but at present a family of four going by car is about as environmentally friendly as you can get."
Of course the argument can be made that the average car on a long trip does not have most of its seats occupied. But at what occupancy does the average long-range train in America, Britain, or Europe operate? I've ridden across the United States on an Amtrak train that was at maybe 5% or 10% occupancy. Heck, that is a lower occupancy rate than a car can manage. One would need a 10-seat vehicle to get down to 10% occupancy with just a driver and a 20-seat vehicle to get to 5% occupancy with only the driver in the vehicle. Still, in Europe train load factors are probably higher than car load factors on average. But if you are going to load up a whole family for a trip and you have concerns about the environment, the good news is that you don't have to feel any worse for taking the car.
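The load-factor arithmetic above can be made concrete. The per-passenger figures for full vehicles (about 11 litres per passenger for the train, 10 for the car) come from the article; the seat counts and total-fuel numbers derived from them are my own illustrative assumptions.

```python
# Back-of-the-envelope sketch of the load-factor arithmetic. Seat counts
# and derived fuel totals are assumptions; only the full-vehicle
# per-passenger figures (11 L train, 10 L car) come from the article.

def fuel_per_passenger(total_fuel_litres, seats, load_factor):
    """Litres of fuel per passenger for a vehicle at a given occupancy."""
    passengers = seats * load_factor
    return total_fuel_litres / passengers

CAR_SEATS, TRAIN_SEATS = 5, 400
car_total = 10 * CAR_SEATS      # 50 L for the whole trip, car full
train_total = 11 * TRAIN_SEATS  # 4400 L for the same trip, train full

# Full vehicles reproduce the quoted numbers...
print(fuel_per_passenger(car_total, CAR_SEATS, 1.0))      # 10.0
print(fuel_per_passenger(train_total, TRAIN_SEATS, 1.0))  # 11.0

# ...but occupancy dominates the comparison: a train at 10% occupancy
# burns far more per head than a car carrying only its driver (one
# occupant in five seats is 20% occupancy).
print(fuel_per_passenger(train_total, TRAIN_SEATS, 0.10))  # 110.0
print(fuel_per_passenger(car_total, CAR_SEATS, 0.20))      # 50.0
```

The design point is that per-passenger fuel scales as the inverse of load factor, so the train-versus-car ranking can flip entirely depending on how full each vehicle actually runs.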
The UK Daily Telegraph's editors say take a car, save the planet.
May we make a modest suggestion? "Save the planet. Jump into your car."
However, there is still a reason to take the train: the death rate per hundred miles travelled is probably much lower. Plus, you can get up and walk around. Plus, there are some train routes that go through some breathtaking scenery where there are no roads. I'm told the Chicago to San Francisco Amtrak goes through some such scenery in the Rockies and it is on my list of trips to take.
Instapundit megablogger Glenn Reynolds interviews Aubrey de Grey for Tech Central Station on the subject of our future ability to reverse aging.
Q: Some people regard aging research, and efforts to extend lifespan, with suspicion. Why do you think that is? What is your response to those concerns?

A: I think it's because people don't think extending healthy lifespan a lot will be possible for centuries. Once they realise that we may be able to reach escape velocity within 20-30 years, all these silly reasons people currently present for why it's not a good idea will evaporate overnight. People don't want to think seriously about it yet, for fear of getting their hopes up and having them dashed, and that's all that's holding us back. Because of this, my universal response to all the arguments against curing aging is simple: don't tell me it'll cause us problems, tell me that it'll cause us problems so severe that it's preferable to sit back and send 100,000 people to their deaths every single day, forever. If you can't make a case that the problems outweigh 100,000 deaths a day, don't waste my time.
By "escape velocity" Aubrey means the point at which we will be able to repair the damage of aging faster than it accumulates so that the odds of dying decrease rather than increase each year. As it stands now a 50 year old has a higher chance of dying than a 49 year old in the course of a year and a 51 year old has a higher chance of dying in a year's time than a 50 year old. As our bodies get older the odds go up of anything going wrong badly enough to kill us in the space of a year. Aubrey thinks we may reach the "escape velocity" point of aging reversal treatments in the 2020s or 2030s. I share this view and one reason I share it is that the rate of advance of the biological sciences and biotechnology is accelerating. In fact, the reason I have a category archive entitled Biotech Advance Rates is to demonstrate that we cannot use past rates of advance as an indicator of how fast we will advance in the future.
Aubrey recommends reading a fable written by Nick Bostrom, a British Academy Research Fellow at Oxford University, about aging called The Fable of the Dragon-Tyrant which is about to be published in The Journal of Medical Ethics.
Next to speak was the king’s chief advisor for morality, a short and shriveled man with a booming voice that easily filled the auditorium:

“Let us grant that this woman is correct about the science and that the project is technologically possible, although I don’t think that has actually been proven. Now she desires that we get rid of the dragon. Presumably, she thinks she’s got the right not to be chewed up by the dragon. How willful and presumptuous. The finitude of human life is a blessing for every individual, whether he knows it or not. Getting rid of the dragon, which might seem like such a convenient thing to do, would undermine our human dignity. The preoccupation with killing the dragon will deflect us from realizing more fully the aspirations to which our lives naturally point, from living well rather than merely staying alive. It is debasing, yes debasing, for a person to want to continue his or her mediocre life for as long as possible without worrying about some of the higher questions about what life is to be used for. But I tell you, the nature of the dragon is to eat humans, and our own species-specified nature is truly and nobly fulfilled only by getting eaten by it...”
This advisor for morality sounds like George W. Bush's advisor Leon Kass.
Here's a point I emphatically agree with: Glenn Reynolds thinks there is nothing beautiful about aging and dying.
I've watched people I love age and die, and it wasn't "beautiful and natural." It sucked. Aging is a disease. Cataracts and liver spots don't bring moral enlightenment or spiritual transcendence. Death may be natural -- but so are smallpox, rape, and athlete's foot. "Natural" isn't the same as "good."

As far as I'm concerned, I'd rather see my tax dollars spent on longevity research than, well, most of the other things they're spent on. I wonder how many other people feel that way.
Looking at how things have worked out in American society, I'm not too worried. The tendency in America seems to be toward more turnover, not less, in major institutions, even as lifespans grow. CEOs don't last nearly as long as they did a few decades ago. University presidents (as my own institution can attest) also seem to have much shorter tenures. Second and third careers (often following voluntary or involuntary early retirements) are common now. As a professor, I see an increasing number of older students entering law school for a variety of reasons. And we've seen all of this in spite of the abolition of mandatory retirement ages by statute over a decade ago. It's more dynamism, not less.

Of course, that may not be true everywhere. In societies that are already stagnant, like the Egypt of the Pharaohs, or the Central Committee of Leonid Brezhnev's time, death is the main source of dynamism, and the young (and middle-aged) often do wind up in sour apprenticeships waiting for their elders to die. In capitalist democracies, other forces play a far greater role. So it seems to me that we have little to fear from extending human lifespans in our own society. And to the extent that lifespan-extension robs dictatorships of what little dynamism they possess, it probably makes them less dangerous, too.
I certainly agree with him about free societies. Though imagine a Joseph Stalin or a Mao Tse Tung given eternal youth. There are countries that have begun to go down the path away from totalitarianism because their dictator died from old age. Still, we shouldn't all be forced to grow old and die in every country of the world just in order to cause the death of a Stalin or a Pol Pot. The greatest murderers in history have killed only a tiny fraction of the number of people that aging has killed.
For more on Aubrey and the prospects for reversing aging see my previous posts Aubrey de Grey Decries Entrenched Timidity Of Aging Research Funding, Aubrey De Grey: We Could Triple Mouse Lives In 10 Years, Aubrey de Grey: First Person To Live To 1000 Already Alive, Wanted: Half Billion Dollars To Jumpstart Eternal Youthfulness Research and my entire Aging Reversal category archive.
Update: Writing in PLoS Biology Aubrey de Grey has a review of Coping With Methuselah: The Impact of Molecular Biology on Medicine and Society where he discusses the potential nearness of the point where we will reach 'actuarial escape velocity' (AEV) and become less likely to die from one year to the next.
Unfortunately, they didn't discuss what would happen if age-specific mortality rates fell by more than 2% per year. An interesting scenario was thus unexplored: that in which mortality rates fall so fast that people's remaining (not merely total) life expectancy increases with time. Is this unimaginably fast? Not at all: it is simply the ratio of the mortality rates at consecutive ages (in the same year) in the age range where most people die, which is only about 10% per year. I term this rate of reduction of age-specific mortality risk ‘actuarial escape velocity’ (AEV), because an individual's remaining life expectancy is affected by aging and by improvements in life-extending therapy in a way qualitatively very similar to how the remaining life expectancy of someone jumping off a cliff is affected by, respectively, gravity and upward jet propulsion (Figure 1).
The escape velocity cusp is closer than you might guess. Since we are already so long lived, even a 30% increase in healthy life span will give the first beneficiaries of rejuvenation therapies another 20 years—an eternity in science—to benefit from second-generation therapies that would give another 30%, and so on ad infinitum. Thus, if first-generation rejuvenation therapies were universally available and this progress in developing rejuvenation therapy could be indefinitely maintained, these advances would put us beyond AEV.
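The AEV argument is quantitative and can be illustrated with a minimal simulation, assuming a simple Gompertz-style mortality law in which risk rises about 10% per year of age (the consecutive-age ratio Aubrey cites). The baseline risk, the improvement rates compared, and the 200-year horizon are all illustrative assumptions, not fitted to real life tables.

```python
# Sketch of actuarial escape velocity under an assumed Gompertz-style
# mortality law. All parameters are illustrative, not real actuarial data.

M0 = 0.0005        # assumed annual death risk at age 0
AGE_RATIO = 1.10   # risk rises ~10% per year of age (the AEV threshold)

def mortality(age, year, annual_improvement):
    """Age-specific annual death risk in a given calendar year, with
    therapies cutting all risks by `annual_improvement` per year."""
    risk = M0 * AGE_RATIO ** age * (1 - annual_improvement) ** year
    return min(risk, 1.0)

def remaining_life_expectancy(age, year, annual_improvement):
    """Expected further years of life, simulated one year at a time."""
    alive, expectancy = 1.0, 0.0
    for k in range(200):  # 200-year horizon is ample for this sketch
        q = mortality(age + k, year + k, annual_improvement)
        expectancy += alive * (1 - q)
        alive *= (1 - q)
    return expectancy

# Follow one person from age 50 to age 60 over ten calendar years.
# Below AEV (5%/yr improvement) their remaining expectancy shrinks as
# they age; above AEV (12%/yr) it grows even as they get older.
for rate in (0.05, 0.12):
    e_at_50 = remaining_life_expectancy(50, 0, rate)
    e_at_60 = remaining_life_expectancy(60, 10, rate)
    print(f"improvement {rate:.0%}: at 50 -> {e_at_50:.1f} yrs remaining, "
          f"at 60 -> {e_at_60:.1f} yrs remaining")
```

The crossover falls exactly where Aubrey places it: remaining expectancy grows whenever the yearly therapeutic improvement exceeds the roughly 10% consecutive-age rise in risk, since the product of the two factors then falls below one.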
Aubrey believes that policymakers may well try to accelerate the development of rejuvenation therapies once they see that such therapies will provide a way to escape from the crushing burden of retirement benefits. I also have argued that rejuvenation therapies would solve demographic problems including the financial burdens of an aging population.
Reason of the Fight Aging! blog has additional commentary on Aubrey's PLoS Biology review. But be sure to read Aubrey's article first. He makes a number of excellent points and I had a hard time choosing what to excerpt.
La Jolla, CA. June 21, 2004—Scientists at The Scripps Research Institute have designed a potentially valuable tool for treating cocaine addiction by creating a modified "phage" virus that soaks up the drug inside the brain.
They coated the virus with an antibody that binds to molecules of cocaine and helps to clear the drug from the brain, which could suppress the positive reinforcing aspects of the drug by eliminating the cocaine high.
"Typically one would think of a virus as a bad entity," says principal investigator Kim D. Janda, Ph.D., who holds the Ely R. Callaway, Jr. Chair in Chemistry and is an investigator in The Skaggs Institute for Chemical Biology at Scripps Research. "But we are taking advantage of a property it has—the ability to get into the central nervous system."
The structure and design of the virus and its effect in rodent models are described in an article that will be published in an upcoming issue of the Proceedings of the National Academy of Sciences.
Note this virus is not acting like a typical vaccine. It is not being injected in order to cause an immune response by the body against cocaine. The actual virus contains the antibodies that bind to the cocaine.
The economic and human toll of cocaine is quite high.
Americans spend more on cocaine, a chemical extracted from the leaf of the Erythroxylaceae coca plant, than on all other illegal drugs combined, says a White House Office of National Drug Control Policy study that came out in the mid-1990s. The study estimates that $38 billion was spent on cocaine in the years 1988 to 1995 alone.
Cocaine's secondary costs to society due to cocaine treatment and prevention programs, emergency room visits and other healthcare costs, lost job productivity, lost earnings, cocaine-related crime, and social welfare are estimated to be in the billions of dollars annually—not to mention the drug's human toll. According to the National Institute on Drug Abuse (NIDA), about 1.7 million people regularly use cocaine in the United States—a population larger than that of the city of Philadelphia—and cocaine is the leading cause of heart attacks and strokes for people under 35.
Sounds like this could be delivered as a nasal spray.
A few years ago, Janda and his graduate students Rocio Carrera and Gunnar Kaufmann decided they wanted to target the cocaine antibodies into the brain. That's when they set out to create a new form of virus. This was done with collaborators Jenny Mee and Michael Meijler in the Department of Chemistry and Professor George Koob in the Department of Neuropharmacology and the Pearson Center For Alcoholism And Addiction Research at Scripps Research.
The researchers used filamentous phage—a type of virus that infects bacteria—for the study. They inserted DNA encoding an antibody that binds cocaine into the phage's genetic code. When the modified phage were grown, they had hundreds of these antibodies displayed on their surfaces.
Phage particles, like many types of viruses, have the ability to enter the brain through the intranasal pathway. Janda, Carrera, and Kaufmann used this ability to deliver their antibody into the central nervous system. The current study demonstrates the ability of the antibody-displaying phage to reduce one effect of cocaine in rodent models (increased locomotion).
She hopes that addicts who want to quit could eventually be given the treatment. “It’s for weak moments,” she says. The virus lingers in the brain for around two weeks, so although they might relapse once, the absence of any euphoric feeling would then discourage them from taking the drug again.
Lead researcher Professor Kim Janda told BBC News Online: "This would be used in conjunction with abstinence programmes and maybe in conjunction with other vaccines that only treat the peripheral sites - like a one-two punch."
The need to take this treatment once every week or two is problematic. Many addicts who are trying to quit and who take cocaine only in weak moments will be willing to take boosters. But some will decide at some point that they want to get back on cocaine, stop taking the nasal spray booster, and then just wait a couple of weeks before using again. Though in some cases boosters delivered in front of a parole officer or doctor could be compelled by court order.
We need better methods to control addiction. A lot of addicts want to kick their habits but the compulsion to use can be very strong. An ideal treatment, however, would stop the craving and even repair or replace damaged and destroyed neurons. Plus, given that some people are more prone to addiction an ideal treatment would change their brains in ways that make them permanently less likely to get addicted again.
These kidneys, grown inside the rats from transplanted embryonic tissue, were not perfect and extended life for only several days. However, one of the scientists likens this first successful attempt to grow kidneys in a mammal to the Wright brothers' short first flight at Kitty Hawk.
St. Louis, June 21, 2004 -- Growing new organs to take the place of damaged or diseased ones is moving from science fiction to reality, according to researchers at Washington University School of Medicine in St. Louis.
Scientists have previously shown that embryonic tissue transplants can be used to grow new kidneys inside rats. In their latest study, though, they put the new kidneys to an unprecedented and critical test, removing the rat's original kidneys and placing the new kidneys in position to take over for them. The new kidneys were able to successfully sustain the rats for a short time.
"We want to figure out how to grow new kidneys in humans, and this is a very important first step," says Marc R. Hammerman, M.D., the Chromalloy Professor of Renal Diseases and leader of the study. "These rats lived seven to eight days after their original kidneys were removed, long enough for us to know that their new kidneys worked."
The study will appear in the July/August issue of Organogenesis, a new scientific journal. It is also available online.
Hammerman is a leader in the burgeoning field of organogenesis, which focuses on growing organs from stem cells and other embryonic cell clusters known as organ primordia. Unlike stem cells, organ primordia cannot develop into any cell type--they are locked into becoming a particular cell type or one of a set of cell types that make up an organ.
My guess is they are getting these organ primordia cells by letting an embryo develop further than the initial stem cell stage. Using human organ primordia cells would likely elicit much stronger ethical objections than using embryonic stem cells, since the fetus would need to develop to a later stage before being aborted to use its tissue for this purpose. Still, even if this approach were not allowed in practice, this research is going to yield important information for other approaches.
"Growing a kidney is like trying to construct an airplane--you can't just make a single part like a propeller, you have to build several different parts and systems and get them all working together properly," Hammerman explains. "Fortunately, kidney primordia already know how to grow different parts and self-assemble into a kidney--we just have to give them the right cues and a little assistance at various points."
For the study, Hammerman and coauthor Sharon Rogers, research instructor in medicine, gave renal primordia transplants to 5- and 6-week-old rats. Prior to insertion, scientists soaked the transplant tissue in a solution that included several human growth factors, proteins and hormones. One of the rats' original kidneys was removed at the same time.
Three weeks after the transplant, researchers connected the new kidneys to the bladder and administered a second dose of growth factors.
Approximately five months after the transplants, scientists removed the remaining original kidney in control and experimental rats. To help resolve uncertainty about which kidney functions are critical to sustaining life, scientists cut the connections between the bladder and the new kidneys in a subset of the experimental rats.
Rats with no new kidneys lived for two to three days, and rats whose new kidneys were disconnected from their bladders lived no longer. However, the rats with new kidneys connected to their bladders lived seven to eight days.
"This tells us that the urine-producing functions of the kidney are key to preservation of life," says Rogers.
"Seven to eight days may not seem like a long time," adds Hammerman. "However, what we have done is akin to building the first airplane and showing that it can fly, if only for a few minutes. It's just as revolutionary."
Hammerman's goal of using pig cells to grow kidneys in humans would sidestep ethical opposition to the use of human stem cells and human embryos.
Hammerman, who is director of the Renal Division at the school's affiliate Barnes-Jewish Hospital, hopes to use animal-to-human transplants, known as xenotransplants, as a solution for chronic organ donation shortages.
"Every year, approximately 10,000 kidneys become available for transplant into patients with end-stage kidney disease," Hammerman says. "But the waiting lists for kidney transplants can run as high as 100,000 individuals, and most patients die of the disease before an organ becomes available."
Kidney function in pigs is similar to that in humans, and Hammerman's eventual goal is to use embryonic pig tissue transplants to help renal failure patients live longer.
Would Jews or Muslims who will not eat pork also object to having a pig kidney grown inside them?
The vascular system that grows for these embryonic kidneys appears to be partially or fully made from host cells that serve as precursors for the formation of blood vessels, which then grow into the shape needed for the kidneys. That explains why the use of early stage transplants would avoid immune rejection via an attack on the vascular system. The vascular system is not foreign tissue even though the actual kidney cells are foreign.
Working with embryonic tissues that grow into organs inside the patient lets Hammerman avoid hyperacute and acute vascular rejection, two immune system responses that can destroy xenotransplants. In both of these responses, the body's immune system recognizes the blood vessels of transplanted tissue as foreign and attacks them.
"Those two types of rejection have so far made it impossible to xenotransplant fully grown kidneys," Hammerman explains. "However, we can avoid this by transplanting embryonic kidneys before blood vessels develop."
The primordia are small enough that survival can be maintained after transplantation through diffusion of oxygen and nutrients. The transplanted cells attract the growth of new blood vessels from the host as they grow into a mature organ.
This approach of using organ primordia cells may avoid the immune system rejection of the vascular system in the kidneys but will it avoid the immune system's eventual rejection of the pig kidney cells? If one could grow kidneys from one's own adult stem cells then the immune rejection problem could be avoided.
We really need a number of different organ growth capabilities. The ability to grow an organ ahead of time outside of the human that can be put into any body would allow replacement organs to be used in acute emergencies (e.g. after a bullet has shredded a heart). Such an organ would not necessarily have to be perfectly immune compatible. In an emergency immune suppression drugs could suppress the immune response while a more immuno-compatible replacement organ was grown.
The approach reported above both requires the host to grow the organ and may not yield a perfectly immune compatible organ. Still, artificial kidneys have deficiencies that eventually result in death, and with time we will have better techniques for coaxing a body's immune system to treat an organ as compatible. This approach might end up working for many purposes until a more ideal approach is developed.
Sad people are nice. Angry people are nasty. And, oddly enough, happy people tend to be nasty, too.
Such (allowing for a little journalistic caricature) were the findings reported in last month's issue of Psychological Science. Researchers found that angry people are more likely to make negative evaluations when judging members of other social groups. That, perhaps, will not come as a great surprise. But the same seems to be true of happy people, the researchers noted. The happier your mood, the more liable you are to make bigoted judgments -- like deciding that someone is guilty of a crime simply because he's a member of a minority group. Why? Nobody's sure. One interesting hypothesis, though, is that happy people have an "everything is fine" attitude that reduces the motivation for analytical thought. So they fall back on stereotypes -- including malicious ones.
My assumption is that people will genetically engineer themselves and their children to be happier. Genetic variations that create propensities toward sadness and depression will be excised. So then will people become nastier and more judgemental as a result?
Another way that people may change in the future is they may become less pain sensitive. When men choose to boost their testosterone levels they are probably lowering their pain sensitivity.
"If men are less sensitive to pain, there is more willingness to fight and participate in further fights," says Michaela Hau, an animal physiology and behaviour scientist at Princeton University, New Jersey, and lead author of the study.
The research team gave testosterone implants to male sparrows and measured their reaction times to pain. Testosterone allowed the birds to tolerate discomfort for longer periods, suggesting that the hormone somehow disguises pain.
It is likely that lowered pain sensitivity is not the only way that testosterone boosts will change the brain and hence change behavior. Look at 'roid rage reports of weightlifters who become extremely angry and aggressive as a consequence of taking steroids. Imagine a future of happy people, more prone to anger, and who feel less pain. They will be nasty, judge others more harshly, and be more aggressive. That doesn't sound like a recipe for either neighborhood peace or world peace, does it?
Another worry about how human brains may come to be different in the future is that people may genetically engineer their children to be less prone to engage in altruistic punishment. Think of the impulse that drives a person to report or testify about a crime that they see committed against someone else. Imagine that impulse just wasn't there. A reduction in that impulse would reduce the motivation of police and prosecutors as well. A future full of happy nasty people with a lower propensity to dole out altruistic punishment brings to mind the Oingo Boingo song "Nothing Bad Ever Happens to Me".
WASHINGTON - If all the highways, streets, buildings, parking lots and other solid structures in the 48 contiguous United States were pieced together like a giant jigsaw puzzle, they would almost cover the state of Ohio. That is the result of a study by Christopher Elvidge of the National Oceanic and Atmospheric Administration's National Geophysical Data Center in Boulder, Colorado, who along with colleagues from several universities and agencies produced the first national map and inventory of impervious surface areas (ISA) in the United States.
As calculated by the researchers, the total impervious surface area of the 48 states and District of Columbia is approximately 112,610 square kilometers [43,480 square miles], and, for comparison, the total area of the state of Ohio is 116,534 square kilometers [44,994 square miles].
The new map is important, because impervious surface areas affect the environment. The qualities of impervious materials that make them ideal for construction also create urban heat islands, by reducing heat transfer from Earth's surface to the atmosphere. The replacement of heavily vegetated areas by ISA reduces sequestration of carbon, which plants absorb from the atmosphere, Elvidge says in the 15 June issue of Eos, published by the American Geophysical Union. Both of these effects can play a role in climate change.
In watersheds, impervious surface areas alter the shape of stream channels, raise the water temperature, and sweep urban debris and pollutants into aquatic environments. These effects are measurable once ten percent of a watershed's surface area is covered by ISA, Elvidge writes. The consequences of increased ISA include fewer fish and fewer species of fish and aquatic insects, as well as a general degradation of wetlands and river valleys. The impervious surface area of the contiguous United States is already slightly larger than that of its wetlands, which is 98,460 square kilometers [38,020 square miles].
Some argue that the use of photovoltaics as a power source will require the covering of too much surface area. Well, let's start with that area, 112,610 square kilometers [43,480 square miles], which is currently covered by human structures, and look at what we would need to get enough power to use photovoltaics as our sole power source.
To put those numbers in perspective recall a previous post where I reported on a calculation by Dr. David Goodstein, Vice Provost and Professor of Physics and Applied Physics at Caltech about the surface area needed to be covered by 10% efficient photovoltaics to provide enough energy for the whole world at current consumption rates.
Solar energy will be an important component, an important part of the solution. If you want to gather enough solar energy to replace the fossil fuel that we’re burning today—and remember we’re going to need more fossil fuel in the future—using current technology, then you would have to cover something like 220,000 square kilometers with solar cells. That’s far more than all the rooftops in the country. It would be a piece of land about 300 miles on a side, which is big but not unthinkable.
Dr. Goodstein was kind enough to provide me with some of the basic facts that went into those figures. The energy that would be collected by a 300-by-300-mile area is for the whole world, and he's assuming a current world total fossil fuel burn of 10 TW (ten trillion watts). He's also assuming a 10% conversion efficiency for the photovoltaics.
We would need an area not quite twice the size of Ohio to get enough power for the entire human race at current rates of energy consumption. Of course, energy consumption is growing and so the area needed is going to grow. But the 10% conversion efficiency assumption is rather low. Groups at Lawrence Berkeley and Los Alamos National Laboratories are pursuing two different methods for boosting photovoltaic conversion efficiency to over 50%. The development of very high conversion efficiency photovoltaics is a matter of when, not if. So the surface area needed to collect enough photovoltaic electric power for the whole world is likely to be less than two Ohios.
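To make that arithmetic concrete, here is a minimal back-of-envelope sketch of the area calculation. The average insolation figure is my assumption, a round number chosen so that the 10% efficiency case roughly reproduces Goodstein's 220,000 square kilometer estimate; it is not a value taken from his calculation.

```python
# Back-of-envelope solar collector area calculation (a sketch; the
# insolation value is an assumed round number, chosen to roughly
# reproduce Goodstein's 220,000 sq km figure, not taken from his work).

WORLD_POWER_W = 10e12        # 10 TW world fossil fuel burn (Goodstein's figure)
INSOLATION_W_PER_M2 = 450.0  # assumed average insolation striking the panels

OHIO_KM2 = 116_534           # area of Ohio, from the article
US_ISA_KM2 = 112_610         # US impervious surface area, from the article

def area_km2(efficiency):
    """Collector area in sq km needed to supply WORLD_POWER_W."""
    area_m2 = WORLD_POWER_W / (INSOLATION_W_PER_M2 * efficiency)
    return area_m2 / 1e6     # 1 sq km = 1e6 sq m

for eff in (0.10, 0.50):
    a = area_km2(eff)
    print(f"{eff:.0%} efficiency: {a:,.0f} sq km "
          f"({a / OHIO_KM2:.1f} Ohios, {a / US_ISA_KM2:.1f}x US paved area)")
```

Under these assumptions the 10% case comes out near Goodstein's two-Ohio figure, while at 50% efficiency the required area drops well below the surface the US has already paved over, which is the point of the argument above.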
A related interesting question: just how much of the surface area covered by human structures will be available to be converted to solar power collectors? That depends on what types of materials can be made into photovoltaics. If we could discover photovoltaic materials strong enough to use as road covering, then roads could be made into energy collectors. While the development of such materials may be a more distant prospect, there are lots of other areas covered by human activities, aside from the obvious rooftops, that could be covered by solar collectors. Take, for example, train tracks. All the train weight comes down on the rails, not on most of the surface of the railroad ties or the gaps between the ties. So one could imagine lengths of railroad track turned into solar power collectors.
There are other ways that road and parking lot surfaces could be made more usable for photovoltaics. One obvious way would be to develop devices for moving cars to parking spaces in ways that leave most of the surface area untouched by car tires. Then a less strong photovoltaic material could cover all the areas aside from the tracks that the car-moving devices use and the spots in parking spaces where the car tires would be placed.
One obvious point about parking lots: If a parking lot is covered and the top is not itself a parking lot then the roof of the parking lot could be covered with photovoltaics. Though I'm not optimistic about the economics of covering Walmart parking lots with roofs to convert them into solar collectors. Still, advances in the development of cheaper materials might make this economically justifiable at some point in the future.
One esthetic problem with high conversion efficiency photovoltaics is that they will likely be dark in color since they will capture most of the photons that hit them. So roofs and sidings of houses covered with layered nanotube photovoltaics would be dark. Most people would probably find that acceptable for roofs but perhaps less so for sidings. One solution might be the development of less efficient photovoltaics that absorb only some colors of light. Such selective frequency absorbing photovoltaics would then allow houses to be different colors. This is not as far-fetched as it sounds. A physicist at Virginia Polytechnic Institute (sorry, no cite, this is from a couple-year-old memory) has argued that by controlling the spacing between nanotubes it is possible to control what light frequencies they absorb. To take this even further, imagine nanotubes that are repositionable. One could perhaps send a pulsed current through them to order them to change their spacing to change their color. This might even provide a way to absorb more photons to provide more electricity during peak periods.
The lesson I'd like you to take away from this post is pretty simple: There is enough surface area already being used by humans that if we just use that surface area for photovoltaics we can get enough power for the human race's needs for many years to come.
Andreas Bartels and Semir Zeki of the Wellcome Department of Imaging Neuroscience, University College London have found using Functional Magnetic Resonance Imaging (fMRI) that love turns down activity in some areas of the brain in part so that we will not see flaws in the object of our affections.
However the key result was that it's not just that certain shared areas of the brain are reliably activated in both romantic and maternal love, but also particular locations are deactivated and it's the deactivation which is perhaps most revealing about love.
Among other areas, parts of the pre-frontal cortex – a bit of the brain towards the front and implicated in social judgment – seem to get switched off when we are in love and when we love our children, as do areas linked with the experience of negative emotions such as aggression and fear, as well as planning. The parts of the brain that are deactivated form a network implicated in evaluating the trustworthiness of others and in critical social assessment generally.
The scientists recruited mothers and used pictures of their children as well as pictures of other people and watched how the women responded to the pictures. The researchers also reanalysed data they had previously collected for previously published research involving women in love.
He said: "Our research enables us to conclude that human attachment employs a push-pull mechanism that overcomes social distance by deactivating networks used for critical social assessment and negative emotions, while it bonds individuals through the involvement of the reward circuitry explaining the power of love to motivate and exhilarate."
Bartels has the full text of the research paper (PDF format) on his web site. When we fall in love we become blinded to faults and at the very same time we become flooded with rewarding feelings.
Romantic and maternal love are highly rewarding experiences. Both are linked to the perpetuation of the species and therefore have a closely linked biological function of crucial evolutionary importance. Yet almost nothing is known about their neural correlates in the human. We therefore used fMRI to measure brain activity in mothers while they viewed pictures of their own and of acquainted children, and of their best friend and of acquainted adults as additional controls. The activity specific to maternal attachment was compared to that associated to romantic love described in our earlier study and to the distribution of attachment-mediating neurohormones established by other studies. Both types of attachment activated regions specific to each, as well as overlapping regions in the brain’s reward system that coincide with areas rich in oxytocin and vasopressin receptors. Both deactivated a common set of regions associated with negative emotions, social judgment and ‘mentalizing’, that is, the assessment of other people’s intentions and emotions. We conclude that human attachment employs a push–pull mechanism that overcomes social distance by deactivating networks used for critical social assessment and negative emotions, while it bonds individuals through the involvement of the reward circuitry, explaining the power of love to motivate and exhilarate.
Maternal and romantic love share a common and crucial evolutionary purpose, namely the maintenance and perpetuation of the species. Both ensure the formation of firm bonds between individuals, by making this behavior a rewarding experience. They therefore share a similar evolutionary origin and serve a similar biological function. It is likely that they also share at least a core of common neural mechanisms. Neuro-endocrine, cellular and behavioral studies of various mammalian species ranging from rodents to primates show that the neurohormones vasopressin and oxytocin are involved in the formation and maintenance of attachment between individuals, and suggest a tight coupling between attachment processes and the neural systems for reward (Carter, 1998; Insel and Young, 2001; Kendrick, 2000; Pedersen and Prange, 1979). This is confirmed by lesion, gene expression and behavioral studies in mammals (Numan and Sheehan,
Perhaps it is not a coincidence that many lovers call each other "babe" and there is a great deal of overlap between the brain's feelings of romantic and maternal love.
Note that regions rich with vasopressin receptors are involved in maternal and romantic love. This brings us to another recent report where scientists have found that gene therapy to deliver vasopressin receptor genes into the ventral pallidum part of the brain made male meadow voles become uncharacteristically monogamous.
ATLANTA -- Researchers at the Yerkes National Primate Research Center of Emory University and Atlanta's Center for Behavioral Neuroscience (CBN) have found transferring a single gene, the vasopressin receptor, into the brain's reward center makes a promiscuous male meadow vole monogamous. This finding, which appears in the June 17 issue of Nature, may help better explain the neurobiology of romantic love as well as disorders of the ability to form social bonds, such as autism. In addition, the finding supports previous research linking social bond formation with drug addiction, also associated with the reward center of the brain.
In their study, Yerkes and CBN post-doctoral fellow Miranda M. Lim, PhD, and Yerkes researcher Larry J. Young, PhD, of the Department of Psychiatry and Behavioral Sciences at Emory University's School of Medicine and the CBN, attempted to determine whether differences in vasopressin receptor levels between prairie and meadow voles could explain their opposite mating behaviors. Previous studies of monogamous male prairie voles, which form lifelong social or pair bonds with a single mate, determined the animals' brains contain high levels of vasopressin receptors in one of the brain's principal reward regions, the ventral pallidum. The comparative species of vole, the promiscuous meadow vole, which frequently mates with multiple partners, lacks vasopressin receptors in the ventral pallidum.
The scientists used a harmless virus to transfer the vasopressin receptor gene from prairie voles into the ventral pallidum of meadow voles, which increased vasopressin receptors in the meadow vole to prairie-like levels. The researchers discovered, just like prairie voles, the formerly promiscuous meadow voles then displayed a strong preference for their current partners rather than new females. Young acknowledges many genes are likely involved in regulating lifelong pair bonds between humans. "Our study, however, provides evidence, in a comparatively simple animal model, that changes in the activity of a single gene profoundly can change a fundamental social behavior of animals within a species."
According to previous research, vasopressin receptors also may play a role in disorders of the ability to form social bonds, such as in autism. "It is intriguing," says Young, "to consider that individual differences in vasopressin receptors in humans might play a role in how differently people form relationships."
And, Lim adds, past research in humans has shown the same neural pathways involved in the formation of romantic relationships are involved in drug addiction. "The brain process of bonding with one's partner may be similar to becoming addicted to drugs: both activate reward circuits in the brain."
The researchers' next step is to determine why there is extensive variability in behaviors among individuals within a species in order to better understand the evolution of social behavior.
Well, consider the possibilities. Want to solve the soaring divorce rate problem? Bioengineer a virus to infect the population to deliver the vasopressin gene into the ventral pallidum at the base of the brain. After years of ineffective moralizing and countless social science studies the problem of disintegrating marriages would be solved.
Another possibility would be the use of such a gene therapy by someone who is in love to make the object of their affections primed to fall in love. Of course, the lover surreptitiously treated with emotion-engineering gene therapy might fall in love with the next person they accidentally bump into in the supermarket. So such a gene therapy would not be foolproof once it becomes feasible.
But since love causes brain changes that have some similarities to what addictive drugs do to the brain an argument can be made for the proposition that love is just another form of addiction for which humans need an effective treatment that will end the craving.
In their research, funded by the National Institute of Mental Health, Larry Young, PhD., associate professor of psychiatry and behavioral sciences at Emory University School of Medicine and an affiliate scientist at Yerkes National Primate Research Center; graduate student Miranda Lim; and Anne Murphy, PhD., associate professor of biology at Georgia State University, examined the distribution in the ventral forebrain of monogamous prairie voles of two brain receptors that have previously been tied to pair bond formation: the oxytocin receptor (OTR) and the vasopressin V1a receptor (V1aR). Using receptor autoradiographic techniques, the scientists found that these receptors are confined to two of the brain's reward centers, the nucleus accumbens and the ventral pallidum. V1aR receptors, which are thought to be activated in the male vole brain during pair bond formation, were confined largely to the ventral pallidum. OTR receptors, which play a crucial role in pair bond formation in females, were found mainly in the nucleus accumbens.
Perhaps a person with more oxytocin and vasopressin receptors finds life to be more rewarding in general. But are they more or less prone to drug addiction?
As people live physically longer and healthier lives, mental health will become the preeminent social and political issue of our time. Living longer physically does not mean living in better mental health. Mental health is the springboard of thinking, communication skills, learning, emotional growth, resilience, and self-esteem.
With longer life spans, the potential for mental illness follows. For example, dementia, the loss of function in multiple cognitive domains, increases with age. The largest number of persons with dementia occurs in people in their early eighties. As the number of people living over 80 years explodes to over 20% of the US population by 2040, dementia will take over as the leading cause of disability. That is, if appropriate tools for stemming cognitive decline, cogniceuticals, don't materialize.
Well, I'd put WMD proliferation, inter-civilizational conflicts, robot take-overs, nanotech goo, and a few other issues up there in competition for preeminent social and political issue going into the future. However, I think Zack is right and perhaps for more reasons than he intends.
First of all, as Zack points out, the aging of populations is causing a much higher incidence of Alzheimer's Disease, vascular dementia, and other neurodegenerative diseases. Worse still, I believe that it will be easier to stop and reverse aging in other parts of the body than in the brain. So we may find ourselves getting younger in other parts of our bodies as our brains continue to age. Why? Because we can replace bigger parts elsewhere in the body, whereas in the brain we cannot replace whole subsystems without losing a part of ourselves, and we cannot replace our whole brain without completely wiping out our identity.
We will be able to grow replacement hearts, livers, kidneys, and other organs. By growing a replacement we can restore an organ's functionality to youthful levels. To make our brains young again we will need to repair them in situ with gene therapy and other highly targeted therapies that repair existing neurons and remove wastes from around and within cells. Certainly such therapies will be developed, and those therapies will also be used on other parts of the body as well. But other parts of the body will be repairable by a wider range of techniques, and some of those techniques will very likely be developed faster than the smaller set of techniques that will be usable in the brain.
Stem cell therapies have some uses in the brain for rejuvenation. For instance, hippocampal stem cell reservoirs will need to be replenished with youthful adult stem cells. Also, stem cells will be useful for repairing some of the damage caused by Parkinson's Disease. However, stem cells are not the right solution for Alzheimer's Disease, where the real need is to prevent large scale neuronal cell death in the first place.
Removal of amyloid plaques via immunotherapies and other therapies may turn out to be the trick that prevents Alzheimer's. But that will not make neurons young again. Our brains will still age, and a slower rate of cell death plus the accumulation of cells that are senescent or otherwise impaired will still gradually reduce our intellectual capacity.
So then do we face a future of older brains in younger bodies? Perhaps, but probably only as a transitional phase. Still, this transitional phase will be a serious enough problem that efforts to develop brain rejuvenation therapies should be a high priority in anti-aging research. Many of those therapies will have uses in other parts of the body as well. But the really big win from brain rejuvenation therapies will come from increased worker productivity. An increasing portion of all work is mental work and rejuvenated brains would do more to increase economic productivity than rejuvenated bodies.
There is yet another reason why mental health is going to be more important in the future: Technological advances are going to make individual humans capable of greater acts of destruction, and so the individual urges for aggression and destruction are going to become more dangerous to the human race as a whole. Of course this problem is more than just a mental health issue, and I do not mean to trivialize all political conflicts by labelling them as cases of mass mental illness. In fact, let me go on record as stating my opposition to the tendency of labelling all anti-social behavior as signs of mental health problems. There are a lot of other factors to consider and we shouldn't medicalize all human behavior. Still, mental health problems really are going to become politically more important as humans become more powerful as a result of technological advances.
A recurring theme of this blog is that automation and miniaturization of laboratory devices are speeding up the rate of advance of biological science and biotechnology. Miniaturization is being done in conjunction with parallelization to enable faster and cheaper testing and manipulation of cells and materials. This trend is analogous to the acceleration of computers by making their parts ever smaller, and many technologies developed by the computing industry are helping to enable the acceleration of biological advances. Well, yet another example of this trend is an MIT report on the development of miniaturized arrays for growing and testing embryonic stem cells.
CAMBRIDGE, Mass.--An MIT team has developed new technology that could jump-start scientists' ability to create specific cell types from human embryonic stem cells, a feat with implications for developing replacement organs and a variety of other tissue engineering applications.
The scientists have already identified a simple method for producing substantially pure populations of epithelial-like cells from human embryonic stem cells. Epithelial cells could be useful in making synthetic skin.
Human embryonic stem cells (hES) have the potential to differentiate into a variety of specialized cells, but coaxing them to do so is difficult. Several factors are known to influence their behavior. One of them is the material the cells grow upon outside the body, which is the focus of the current work.
"Until now there has been no quick, easy way to assess how a given material will affect cell behavior," said Robert Langer, the Germeshausen Professor of Chemical and Biomedical Engineering. Langer is the senior author of a paper on the work that will appear in the June 13 online issue of Nature Biotechnology.
The new technique is not only fast; it also allows scientists to test hundreds to thousands of different materials at the same time. The trick? "We miniaturize the process," said Daniel G. Anderson, first author of the paper and a research associate in the Department of Chemical Engineering. Anderson and Langer are coauthors with Shulamit Levenberg, also a chemical engineering research associate.
The team developed robotic technology to deposit more than 1,700 spots of biomaterial (roughly 500 different materials in triplicate) on a glass slide measuring only 25 millimeters wide by 75 long. Twenty such slides, or microarrays, can be made in a single day. Exposure to ultraviolet light polymerizes the biomaterials, making each spot rigid and thus making the microarray ready for "seeding" with hES or other cells. (In the current work, the team seeded some arrays with hES and some with embryonic muscle cells.)
Each seeded microarray can then be placed in a different solution, including such things as growth factors, to incubate. "We can simultaneously process several microarrays under a variety of conditions," Anderson said.
Another plus: the microarrays work with a minimal number of cells, growth factors and other media. "That's especially important for human embryonic stem cells because the cells are hard to grow, and the media necessary for their growth are expensive," Anderson said. Many of the media related to testing the cells, such as antibodies, are also expensive.
In the current work, the scientists used an initial screening to find especially promising biomaterials for the differentiation of hES into epithelial cells. Additional experiments identified "a host of unexpected materials effects that offer new levels of control over hES cell behavior," the team writes, demonstrating the power of quick, easy screenings.
My guess is that this technique is also usable for testing adult stem cells and more differentiated cell types.
This report brings to mind a pair of recent reports by UCSD researchers where they tested tens of thousands of molecules to find a molecule called cardiogenol C that will turn embryonic stem cells into heart muscle cells and a molecule called reversine that will turn adult muscle cells into stem cells. In each case the ability to develop screening methods to test the effects of large numbers of molecules on cells in culture enabled the discovery of useful molecules.
The development of the enabling technologies that accelerate by orders of magnitude the search for compounds that change the internal regulatory state of cells is more important than any particular discovery made with the tools. There are many useful discoveries that could be made sooner if only ways could be developed to do tests more cheaply, more rapidly, more accurately, and with greater sensitivity.
Scientists have invented an efficient way to coat cotton cloth with tiny particles of titanium dioxide. These nanoparticles are catalysts that help to break down carbon-based molecules, and require only sunlight to trigger the reaction. The inventors believe that these fabrics could be made into self-cleaning clothes that tackle dirt, environmental pollutants and harmful microorganisms.
Hong Kong Polytechnic University researchers Walid Daoud and John Xin baked 20 nanometer titanium dioxide particles into clothes for 15 minutes to create cloth that can be cleaned by standing in the sunlight. I like efforts to achieve better living through materials science. I'm not sure this particular approach is wise, though.
The first potential problem here is that when hit by photons the materials generate oxygen free radicals. Well, do you want to wear clothing that generates oxygen free radicals? It might accelerate the aging of your skin. Also, some of the oxygen radicals may react with O2 to form O3 ozone gas. It would be like walking around in a microenvironment that is like Los Angeles on a bad day (or Bakersfield or Fresno even more often). You wouldn't want to stand still in one spot too long, in order to avoid your own clothing's pollution.
The other problem with this approach is that the oxygen radicals would probably damage the cotton fibers. Cotton is made from cellulose which is a polymer of glucose sugar. That polymer can be damaged by free radicals. So the clothing might wear out faster.
Self-cleaning clothes will be developed eventually. Though it is not clear that this approach is the one that will ultimately succeed.
Serendipity, a form of next-generation networking, was developed by Nathan Eagle, a graduate student and Media Lab Europe Fellow working with Alex (Sandy) Pentland, the Toshiba Professor of Media Arts and Sciences in the Media Lab’s Human Dynamics group.
The system uses Bluetooth, an RF (radio frequency) protocol that works like a low-power radio in most cell phones, sending out a short-range beacon. “Think of it as each person having a 16-foot bubble around them, blinking out a unique ID," Eagle said. “When two or more people running Serendipity come into the same ‘bubble,' their IDs are sent to our server, which looks for their profiles. If there’s a match, each gets the other’s name, thumbnail photo and common interests on his or her cell phone." Then it’s only a matter of introductions.
And it’s quick. The server scans for IDs every 60 seconds and only takes about five seconds to find a match, so the whole sequence takes about a minute at the most.
How does the server know about your interests? Just like web-based social network systems like Friendster or match.com, Serendipity depends on profiles that users write about themselves. But Serendipity is unique because it allows the user to “weight” his or her profile to emphasize interests that are of greatest importance to the user’s current social situation.
Another possible interesting application would be to manage affinity groups. Imagine a traveller who is cruising down a road trying to decide which night club to try out. If people registered with an affinity tracking service then a traveller could choose a club or restaurant whose currently present patrons fit some desired demographic profile. One obvious problem with such a service is that just because one person likes a particular type of person doesn't mean that most who fit a desired profile will like that person in return. Look at celebrities for example. They are loved by all sorts of people who the celebrities would very much like to avoid. So a service would need to develop eligibility criteria that require matching of preferences in both directions before that person driving down the street would get a flashing light on their car LCD pointing them to a particular bar or night club.
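The mutual-match requirement sketched above can be expressed in a few lines of code. This is purely my own illustration (the function names, profile format, and threshold are hypothetical, not part of Serendipity): an affinity service would only flag a match when each person's weighted interest profile is satisfied by the other, in both directions.

```python
def weighted_overlap(weights_a, interests_b):
    """Score how well B's interests satisfy A's weighted profile, from 0 to 1."""
    total = sum(weights_a.values())
    if total == 0:
        return 0.0
    matched = sum(w for interest, w in weights_a.items() if interest in interests_b)
    return matched / total

def mutual_match(profile_a, profile_b, threshold=0.5):
    """Alert both parties only if each clears the other's threshold.

    A one-directional match (celebrity and fan) is deliberately rejected.
    """
    a_likes_b = weighted_overlap(profile_a["weights"], profile_b["interests"])
    b_likes_a = weighted_overlap(profile_b["weights"], profile_a["interests"])
    return a_likes_b >= threshold and b_likes_a >= threshold

# Hypothetical profiles: interests a person has, and weights on what they seek.
alice = {"interests": {"jazz", "biotech"}, "weights": {"jazz": 3, "hiking": 1}}
bob = {"interests": {"jazz", "hiking"}, "weights": {"biotech": 2, "jazz": 2}}
print(mutual_match(alice, bob))
```

The key design choice is that the score is asymmetric by construction, so requiring both directions to clear the threshold filters out exactly the celebrity problem described above.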
Now I'm actually expecting this sort of thing to really be implemented and to become widely used. For bar scenes one of the difficult challenges will be the development of image processing software that can analyse the image of a person you haven't even seen yet to decide whether you might find that person attractive. You could just drive through downtown and be told where to stop. In a bar situation the algorithm would have to be fairly sophisticated and use not just images of a person and background info but also your degree of inebriation (higher levels mean lower standards - imagine an embedded nanotech sensor reporting blood alcohol to your cell phone), the time of night (later means lower standards), how long it has been since you last hooked up, and perhaps similar information from the other person to factor in whether you both ought to be told by your avatars to seek each other out.
Heck, the avatar might even tell you how many drinks you'll have to drink to be able to feel that your realistic choices are acceptable. In the longer term, as neurobiology and neurochemistry become more advanced, you will be able to have embedded implants installed that will release compounds to make you find many more people attractive than you would naturally. Of course in the longer term gene therapies, stem cell therapies, and other therapies will raise average attractiveness so that this will be far less of a problem anyhow. As I've argued previously, the female desire for high status males is going to be harder to solve than the male desire for more attractive females.
Of course, some statistical outliers will actually use this technology to meet more interesting people. I'm not trying to argue that the only application for this technology is meeting people for sexual hook-ups. But my guess is it will find wider use for sexual purposes than for intellectual ones.
To investigate age-associated molecular changes in the human brain, Dr. Bruce A. Yankner, professor in the Department of Neurology and Division of Neuroscience at Children's Hospital Boston and Harvard Medical School, and colleagues examined patterns of gene expression in postmortem samples collected from thirty individuals ranging in age from 26 to 106 years. Using a sophisticated screening technique called transcriptional profiling that evaluates thousands of genes at a time, the researchers identified two groups of genes with significantly altered expression levels in the brains of older individuals. A gene's expression level is an indicator of whether or not the gene is functioning properly.
"We found that genes that play a role in learning and memory were among those most significantly reduced in the aging human cortex," said Yankner. "These include genes that are required for communication between neurons."
In addition to a reduction in genes important for cognitive function, there was an elevated expression of genes that are associated with stress and repair mechanisms and genes linked to inflammation and immune responses. This is evidence that pathological events may be occurring in the aging brain, possibly related to gene damage.
The researchers then went on to show that many of the genes with altered expression in the brain were badly damaged and could not function properly. They showed that these genes also could be selectively damaged in brain cells grown in the laboratory, thereby mimicking some of the changes of the aging brain.
"Our findings suggest that these genes are unusually vulnerable to damage from agents such as free radicals and toxins in the environment," said Yankner. "The brain's ability to cope with these toxic insults and repair these genes declines with age, leading to their reduced expression. It will now be important to learn how to prevent this damage, and to understand precisely how it impacts brain function in the elderly."
According to Yankner, "If you examine brain gene patterns among young adults, they are quite similar. In very old adults, there is some increased variability, but there is still similarity between individuals. In contrast, individuals in the middle age population between 40 and 70 years of age are much more variable. Some middle-aged individuals exhibit gene patterns that look more like the young group, whereas others show gene patterns that look more like the old group."
This is evidence that people may age differently during middle age. It will now be of great interest to understand what it is that makes some people age more rapidly than others.
These findings raise the exciting possibility that treatments or lifestyles that reduce gene damage in young adults may delay cognitive decline and the onset of brain diseases in later years. However, more research is needed.
"We can repair these aging genes in the laboratory, but that is a far cry from the human brain. This is only a first step," cautions Yankner.
The brain is going to be the hardest organ to repair and rejuvenate. With most organs we are going to be able to just grow replacements or at least replace pieces. But each of your neurons involved in memory or personality encodes a part of who you are. Killing off old neurons kills off some small part of who you are. We need to be able to repair individual aged cells. That is going to be hard to do.
The hardest problem is going to be the development of gene therapy techniques for delivering genes into brain cells. The removal of extracellular accumulated junk is probably a more solvable problem, as vaccines have already shown considerable promise in removing the beta amyloid plaques involved in Alzheimer's Disease. It is possible that the removal of at least some of the intracellular junk may not turn out to require gene therapy. But my guess is that part of the intracellular junk that accumulates in lysosomes will require gene therapies developed to provide enzymes that can break down that junk.
We need much greater funding and effort aimed at developing brain rejuvenation therapies. We also need a lot more research aimed at discovering why some brains age more rapidly than others. However, it is very likely that just improving blood lipid and cholesterol profiles will reduce the oxidative load on the brain and therefore reduce the rate of brain aging. The sorts of dietary and exercise advice aimed at avoiding heart disease and cancer will very likely slow brain aging as well.
If this report of middle aged signs of brain aging makes the problem more real to you and makes you feel you ought to do something to protect your brain then consider the "ape" diet to improve blood lipids and cholesterol. The ape diet uses a combination of nuts, foods high in soluble fiber (e.g. eggplant), margarines fortified with plant sterols, soy, and vegetables. The ape diet lowers cholesterol, C Reactive Protein (an inflammation indicator), and triglycerides.
There is now a large accumulation of studies which show that cholesterol-lowering statin drugs reduce the risk of Alzheimer's Disease and vascular dementia. Though be aware that some statin drug users experience acute memory problems. Still, you can always stop taking a statin drug or switch to a different one if you experience harmful cognitive effects. But if you are not going to take statins then for the sake of your brain at least consider being like an ape man when you eat and lower your cholesterol that way. Ray Davies had it all figured out years ago when he sang "I'm an ape man, I'm an ape ape man, oh I'm an ape man. I'm a King Kong man, I'm a voodoo man, oh I'm an ape man".
Also see my previous posts Brain Aging Studied With Gene Microarrays, Myelin Cholesterol and Iron Build-Up Leads To Alzheimer's, GABA Neurotransmitter Rejuvenated Aged Monkey Brains, Vitamin C, E In High Dose Combination May Protect Against Alzheimer's.
First of all, why are Third World fertility statistics a topic for FuturePundit? Well, at the risk of boring you by stating and then answering an obvious question: demographics is destiny. Gene Expression blogger Razib just got back to Oregon from visiting with his extended family in Bangladesh and along with his many other interesting comments on his trip (with more trip reports in the pipeline) he reports that an economist relative claims the Bangladeshi government is exaggerating the decline in fertility of women in Bangladesh.
Oh, about Bangladesh's drop in total fertility in the past 10 years, my economist relative told me that a lot of it was a paper drop, as functionaries cooked the books. Some change has occurred, but it is inflated.
How widespread is the practice of cooking the books on human reproduction in less developed countries? If this is a widespread phenomenon then assorted projections of future human population trends are substantially in error and the world's population is going to grow much larger than currently forecast. That is a great tragedy. An increase in population of a country like Bangladesh is a net harm to both Bangladesh and the world at large.
This reminds me of a South African correspondent who tells me that we can't trust the crime statistics and migration statistics coming out of South Africa. The government takes years longer to produce the statistics than it did in past years and there is no reason for the delay since the government can churn through the input datasets to produce the statistics quite rapidly. This correspondent says that the government cooks the crime rate statistics to make South Africa look better than it is and that it does not even report black-on-white crime any more even though that is the category that is rising most rapidly (adjusted of course for the fact that the white population is dropping and the extent of that drop is hidden as well).
This all leads to a more general question: what important demographic trends are being covered up or exaggerated by which governments? Also, what kinds of sampling methods could social scientists employ to spot check and look for indications of systematic deception?
Since there is such a huge quantity of statistical data produced by a large assortment of sources we need some rules to inform our suspicions. For instance, one can expect governments to usually have an incentive to underreport crime statistics (though occasionally desires for larger budgets probably cause some law enforcement agencies to exaggerate threats). Also, in a place like South Africa where the crime rates have been rising the populace is going to tend to stop reporting many types of crime when they see that reporting does no good. So we have to also look for signs that entire populaces may be facing changing incentives with regard to whether to report pertinent information.
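One classic spot-check that could inform such suspicions (my own illustration, not from the post or from Razib): fabricated counts often fail a first-digit test against Benford's law, under which the leading digit d of naturally occurring figures appears with frequency log10(1 + 1/d). A large deviation does not prove fabrication, but it flags a data series as worth auditing. A minimal sketch, assuming the input is a list of positive reported counts:

```python
import math
from collections import Counter

def benford_deviation(values):
    """Mean absolute deviation between observed first-digit frequencies
    and the Benford expectation log10(1 + 1/d), for d = 1..9.
    Larger values suggest the series deserves closer auditing;
    this is a screening heuristic, not proof of fabrication."""
    # Extract the first significant digit of each nonzero value.
    digits = [int(str(abs(v)).lstrip("0.")[0]) for v in values if v]
    counts = Counter(digits)
    n = len(digits)
    expected = {d: math.log10(1 + 1 / d) for d in range(1, 10)}
    return sum(abs(counts.get(d, 0) / n - expected[d])
               for d in range(1, 10)) / 9

# Multiplicative growth series (like census counts) track Benford closely;
# uniformly distributed figures deviate noticeably.
print(benford_deviation([2 ** k for k in range(30)]))
print(benford_deviation(list(range(100, 1000))))
```

A real audit would use a chi-squared test against the Benford distribution rather than a raw deviation score, but the idea is the same: cheap statistical screens narrow down which official figures merit expensive on-the-ground sampling.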
Razib also says that if reports from his own extended family are indicative (and he comes from an unusually highly educated family) many of the most skilled Bangladeshis are living abroad. One thing I wonder about that is whether there are nations whose outward migration patterns of skilled workers have gotten large enough to lower average national IQ. That seems plausible for Middle Eastern nations that have little or no oil wealth. Ditto for sub-Saharan African nations suffering from brain drain of their most skilled.
Razib also reports that employees of Non Governmental Organizations (NGOs) are living high on the hog by Bangladeshi standards. NGOs are funded by international agencies and by aid programs of wealthier nations.
Many of the people who work at NGOs or "own" them drive posh cars. 10% are really making out from foreign aid, while 90% are unaffected. Of course, if the money was given directly to the government, 1% would benefit. My economist uncle is working on "microdevelopment." Don't really know what it is, but sounds like getting illiterates to behave in a less stupid and exploitable fashion. I'm skeptical.
Are these NGOs providing a net benefit to Bangladesh or are they just creating a privileged class?
ITHACA, N.Y. -- A study by researchers at Cornell University suggests that higher-than-normal amounts of a selenium-containing enzyme could promote type 2 diabetes. The researchers found that mice with elevated levels of the antioxidant enzyme develop the precursors of diabetes at much higher rates than did control mice.
Selenium, a common dietary supplement, is an antioxidant, one of a class of materials that help mop up harmful free radicals, molecules that can damage cell membranes and genetic material and contribute to the development of cancer and heart disease. Many of the benefits of selenium are related to its role in the production of glutathione peroxidase (GP), an antioxidant enzyme that helps detoxify the body.
"Although free radicals are known to be harmful and antioxidants helpful, our study suggests that we actually need some free radicals to regulate insulin sensitivity," says Xingen Lei, associate professor of animal science at Cornell and an author of the study, published in the June 15 issue of the Proceedings of the National Academy of Sciences, and now available online. The lead author is James McClung, who received his Ph.D. from Cornell this spring and is now a diabetes researcher at a U.S. Army laboratory in Boston.
McClung notes that high levels of GP appear to promote diabetes by mopping up too many free radicals, which are needed to help switch insulin signaling on and off in glucose (blood sugar) metabolism.
"Most people believe that both selenium and the selenium-containing enzyme GP are good for health by protecting cells and tissues from oxidation. However, this study suggests that they are a double-edged sword," says Lei. "Antioxidants can be harmful by neutralizing too many free radicals and interfering with insulin signaling, which results in promoting obesity, insulin resistance and possibly diabetes."
He points out that these findings are consistent with a recent study of pregnant women that reported on a link between high levels of GP, insulin resistance and gestational diabetes.
"Before people blindly supplement their diets with antioxidants, such as selenium and vitamins E and C, more research is needed," he concludes. Next, Lei plans to put the obese mice from this study on a diet to see if weight loss and fat loss can prevent or improve the mice's insulin sensitivity.
Diabetes, both type I and type II, effectively accelerates aging in a number of ways. Therefore anything that might contribute to the incidence of type II diabetes should be seen as a potential aging accelerator. Obesity is the largest risk factor for type II diabetes, and the rising incidence of obesity is contributing to a rising incidence of type II (so-called adult onset or insulin insensitive) diabetes.
What is unanswered by this current study is whether a higher level of selenium in the diet or through supplementation will cause an unhealthily high level of glutathione peroxidase (GP) activity. Maybe not. In this experiment the scientists made the mice produce more GP. Just taking more selenium may not up GP activity to a level that will be unhealthy since the supply of GP will become a rate-limiting factor once all GP enzymes have selenium in their active centers. Still, the fact that excessive antioxidant activity can contribute to type II diabetes is important news.
Denham Harman, the original developer of the free radical theory of aging, has argued that, yes, there is such a thing as too much antioxidant intake. He found that taking too many antioxidant vitamins made him feel sluggish. This makes sense. Starting in the 1970s, scientists have discovered that free radicals are involved in an increasing number of signalling pathways in the body. If all our free radicals were quenched by extremely powerful antioxidants we'd literally die, and rather quickly.
There are very likely optimal levels and ratios of antioxidant vitamins and minerals. But unfortunately at this point we do not know what those levels are. This latest study suggests that useful insights could be gotten from the measurement of precursors to type II diabetes in animals and people taking large amounts of various types of antioxidants.
As long as humans do not mess it up, the Earth's climate is set at fair for the next 15,000 years. That is according to information extracted from the oldest ice core ever drilled.
That already shows that the amount of CO2 in the air during stage 11 was similar to our own pre-industrial level. As Earth's orbit was much the same then, this suggests that stage 11 was very like our present balmy interglacial, and that without the effects of global warming we would probably have to wait some 16,000 years for another ice age. "It's a wonderful window into the future as well as the past," says Raynaud.
While some scientists are interpreting these new results to mean that a new ice age is still a long way off, one ice-core scientist from the University of Colorado sounds less certain about our ability to predict the next ice age.
Recovery of a new ice core in Antarctica that extends back 740,000 years -- nearly twice as long as any other ice core record -- is extremely important and will help scientists better understand the Earth's climate and issues related to global warming, according to a University of Colorado at Boulder professor.
The new ice core, announced June 9 by the European Project for Ice Coring in Antarctica, or EPICA, reaches far enough back in time to give scientists "a first shot at looking at climate and greenhouse gases during interglacial periods when humans had nothing to do with climate change," said geological sciences Professor James White.
"This has the potential to separate the human-caused impacts from the natural and place it in a much clearer context," he said.
A commentary on the new ice core written by White will appear in the June 11 edition of the journal Science. One of the world's leading ice-core scientists, White has conducted dozens of studies of ice cores from both Antarctica and Greenland. He is director of the CU-Boulder Environmental Studies program and a fellow of the CU-Boulder Institute of Arctic and Alpine Research.
Concern about the effects of human-caused greenhouse gas releases into the atmosphere has led many scientists to conclude that humans are changing the atmosphere. Drilling deep into polar regions and extracting ice cores can tell scientists about the Earth's climate during the distant past, before technology and farming became a factor in releasing such gases.
Ice cores can tell scientists about concentrations of greenhouse gases such as carbon dioxide and methane, dust levels in the atmosphere, volcanic eruptions and estimates of temperatures and precipitation.
"We're living in an unusual time," White said. "In the past 430,000 years, the percentage of time that the climate was as warm as it is today is quite small, about 5 percent to 10 percent, and before that time, it appears to never have been that warm.
"Humans have been active in messing with the carbon cycle for a long period of time," he said. "Here we are warming the planet, while at the same time, climatologists will tell us that we are perhaps long overdue for a glacial period."
The average number of years spent in a warm period between ice ages -- like our current climate -- has been about 6,000 years, White said. But the current interglacial period has lasted for 12,000 years. Only one other interglacial period has exceeded that length of time -- it lasted for about 28,000 years -- and it happened about 450,000 years ago.
The EPICA core will provide the first complete record of that period and will allow scientists to study it in more detail than ever before, White said.
"Ice cores are the ultimate preservation tool for information about past environments," White said. "Whatever happens on the Earth that changes the atmosphere -- the big events -- is recorded in the ice and it stays there."
Another exciting aspect of the EPICA ice core is that at 740,000 years, scientists have not yet reached the bottom of the ice sheet, he said. "The possibility of a million-year ice core is out there and a million years ago is a really significant period in the Earth's climate history."
Prior to a million years ago, there was no large-scale glacial/interglacial pattern and the Earth had a more steady climate controlled by the sun, he said. Something happened at around that time to cause the Earth to have larger variations in its climate.
"One of our biggest scientific questions is: Is glaciation overdue?" White said. "For our future it is very important that we understand how these huge glaciers start."
The fact that we are overdue, by historical standards, for another ice age may be due to human intervention. See my previous post Farming And Forest Destruction Prevented Ice Age 5000 Years Ago.
The more we understand the climate the better we are going to be able to predict both naturally-caused and human-caused future changes in the climate. At the same time, our ability to intervene in the climate to engineer different outcomes will increase. Along with our increasing understanding and ability to intervene will come increased demands for climate engineering to either reverse the effects of other human interventions or to prevent natural trends from playing out.
If the theory that human farming practices have already prevented an ice age is correct, then this will have a profound effect upon the debate about human influences on climate. A radical program to neutralize human influences on the climate would literally bring on an ice age. Only a small minority of environmental radicals are going to support an effort to produce that outcome.
Giulio Tononi of the University of Wisconsin-Madison and his colleagues measured electrical brain signals in subjects who learned a simple computer game before going to sleep.
The kind of activity that occurs during sleep was increased in a penny-sized region in the brains of slumbering subjects who had learned the game. Just playing the game did not have this effect. The researchers conclude that sleep falls on brain circuits that have been changed, not just used, during the day.
Subjects with more of this activity in the area, which is in the top right hemisphere, tended to perform better in the morning, the researchers report in a paper published online by Nature.
This study brings up an interesting question: Is one better off learning things right before bedtime rather than earlier in the day? Is new learning more likely to be translated into lasting changes in brain wiring if the learning episode is closer to the time one goes to sleep? The idea seems plausible because mice delayed from getting to sleep after learning have their learning blocked. So evening is probably the better time to study.
Think of each sleep episode as a chance to learn more information. Looked at in that light it may make more sense to spread learning out over longer periods by studying every day rather than concentrating a larger amount of learning into a smaller number of days. Also, it might make more sense to learn a lot on days when you know you'll be able to get a full night's sleep.
I've previously argued that the development of drugs that would allow more rapid cycling through sleep and wakefulness might allow accelerated learning. Also see my previous post Long Term Memories Processed By Anterior Cingulate.
It's readily apparent that handling two things at once is much harder than handling one thing at a time. Spend too much time trying to juggle more than one objective and you'll end up wanting to get rid of all your goals besides sleeping. The question is, though, what makes it so hard to process two things at once?
Two theories try to explain this phenomenon: "passive queuing" and "active monitoring." The former says that information has to line up for a chance at being processed at some focal point of the brain, while the latter suggests that the brain can process two things at once – it just needs to use a complicated mechanism to keep the two processes separate. Recent research from MIT points to the former as an explanation.
Yuhong Jiang, Rebecca Saxe and Nancy Kanwisher, in a study to be published in the June issue of Psychological Science, a journal of the American Psychological Society, examined the brain activity involved in multitasking. They gave people two simple tasks. Task one was identifying shapes; for some subjects, task two was identifying letters, for others it was identifying colors. The subjects were forced to switch from one task to the other in either one and a half seconds or one tenth of a second. When they had to switch faster, subjects would take as much as twice as long to respond as when switching more slowly.
Using MRI technology, Jiang, Saxe and Kanwisher examined subjects' brain activity while performing these tasks. They observed no increase in the sort of activity that would be involved in keeping two thought processes separate when subjects had to switch faster. This suggests that there are no complicated mechanisms that allow people to perform two tasks at once. Instead, we can begin the next task only after the last one is finished.
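The passive-queuing idea can be illustrated with a toy simulation. This is my own sketch, not the authors' model; the stage durations are made-up numbers. The key assumption is a single central stage that handles one task at a time, so when the gap between the two tasks is short, the second task's central processing must queue behind the first, inflating its response time.

```python
def response_times(gap, stage1=0.2, central=0.3, stage2=0.2):
    """Toy single-bottleneck (passive queuing) model of dual-task costs.

    Each task needs a perceptual stage, a central stage, and a response
    stage; only the central stage is a bottleneck that serves one task
    at a time.  `gap` is the delay between the onsets of task 1 and
    task 2 (all times in seconds, all durations hypothetical).
    """
    # Task 1 runs unimpeded.
    t1_central_done = stage1 + central
    rt1 = stage1 + central + stage2

    # Task 2's central stage must queue until task 1's central stage is done.
    t2_ready = gap + stage1                       # task 2's perceptual stage finished
    t2_central_start = max(t2_ready, t1_central_done)
    rt2 = (t2_central_start + central + stage2) - gap   # measured from task 2 onset
    return rt1, rt2

# A long gap (1.5 s) lets task 2 enter the central stage immediately;
# a short gap (0.1 s) forces it to wait, inflating its response time.
print(response_times(1.5))
print(response_times(0.1))
```

The point of the sketch is that no extra "monitoring" machinery is needed to produce the slowdown at short gaps: a plain first-come, first-served queue at the central stage does it on its own, which is what the MIT imaging data favor.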
I am looking forward to the day when it becomes possible to genetically engineer minds to have bigger working memories and other cognitive enhancements. Given that some people have larger working memories than others, once we find out the cause of that difference we will probably be able to genetically engineer offspring to have bigger working memories and perhaps to do the same for ourselves. Abilities that do not already exist (such as some types of parallel processing) will be more difficult to add. But if enhancements for parallel processing could be developed they would be very handy. The ability to do productive work while carrying on a demanding conversation would be particularly useful.
Researchers at Northwestern University have developed a technique that delivers into rat brains genes that can be switched off with the use of an antibiotic.
Northwestern University neuroscientists have overcome a major obstacle in gene therapy research. They've devised a method that will safely deliver and regulate expression of therapeutic genes introduced into the central nervous system to treat Parkinson's disease and other neurodegenerative diseases.
The method, developed by Martha C. Bohn and colleagues, is described in the June issue of the journal Gene Therapy. Bohn is Medical Research Institute Council Professor of Pediatrics at the Children's Memorial Institute for Education and Research and professor of pediatrics and of molecular pharmacology and biological chemistry at Northwestern University Feinberg School of Medicine.
Jiang Lixin, a post-doctoral fellow in Bohn's laboratory, created three different viral vectors -- carrier molecules -- that used human fluorescent green protein to track gene delivery and expression in cells. The vectors, made with the harmless adeno-associated virus (AAV), carried the "tet-off" system, in which the introduced gene is continually expressed or "on" but can be temporarily "turned off" when a small dose of the tetracycline antibiotic derivative doxycycline is administered.
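The "tet-off" logic described above can be summarized as a simple rule: the introduced gene is expressed by default and is silenced only while doxycycline is present. As a rough illustration only (my own toy model, not anything from the paper, and it ignores dose, timing, and leakiness), the control logic looks like this:

```python
class TetOffSwitch:
    """Toy model of 'tet-off' gene regulation: the introduced gene is
    expressed ('on') by default and is silenced only while the
    tetracycline derivative doxycycline is present."""

    def __init__(self):
        self.doxycycline_present = False

    def give_doxycycline(self):
        # Administering the drug temporarily silences the transgene.
        self.doxycycline_present = True

    def clear_doxycycline(self):
        # Once the drug is cleared, expression resumes on its own.
        self.doxycycline_present = False

    @property
    def expressing(self):
        return not self.doxycycline_present

switch = TetOffSwitch()
print(switch.expressing)   # on by default
switch.give_doxycycline()
print(switch.expressing)   # silenced while the drug is present
switch.clear_doxycycline()
print(switch.expressing)   # back on after the drug clears
```

The notable property is the default state: with no drug on board, the therapeutic gene stays on, and the physician intervenes only to shut it off.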
One vector, known as rAAVS3, displayed particularly tight regulation in neurons when gene expression was measured at the protein and molecular RNA levels.
To assess regulation in the brain, the researchers injected the vector into the striatum of rats, the area in the brain where the neurotransmitter dopamine activates the nerve cells that control motor coordination.
In their experiments, Bohn and co-researchers found that up to 99 percent of the vector-introduced gene was turned off when the rats were given small doses of doxycycline. In Parkinson's disease, dopamine-producing neurons degenerate, resulting in gait problems, muscle rigidity and tremors.
Several years ago Bohn's laboratory group discovered that glial cells in the embryonic brain stem secrete factors, or proteins, that promote survival and differentiation of dopamine neurons.
One of these proteins, called glial cell line-derived neurotrophic factor (GDNF), is a potent factor that promotes growth of not only dopamine neurons, but also motor neurons and several other types of neurons. GDNF may have therapeutic potential for several neurodegenerative diseases, including Parkinson's disease and Lou Gehrig's disease.
Bohn's laboratory was the first to show that introduction of a GDNF gene in a rodent model of Parkinson's disease halts the disease process.
"GDNF gene therapy has exciting potential to 'cure' Parkinson's disease, but since putting a gene into the brain may lead to expression and increased levels of GDNF protein for years, it will be important to have some way to turn off gene expression to arrest unanticipated side effects," Bohn said.
Bohn and her colleagues have been developing viral vectors that offer a safe means to deliver GDNF, as well as other therapeutic genes. The AAV vector that the researchers used in these experiments is safe and approved for use in several clinical trials in the brain of humans; however, no vector in which the gene can be turned off is yet approved for use in clinical trials.
"A crucial piece of our research is related to safety," Bohn said. "We were excited to find the right mechanism to deliver the gene into the nervous system and tightly control its expression using doxycycline, a drug already approved by the Food and Drug Administration and found to have no side effects."
Bohn cautioned that thorough safety and toxicity studies of the new vector are needed and that her laboratory group is not ready to assess its use in humans.
The mechanism these researchers are using delivers a gene that expresses unless doxycycline is delivered. But for some therapeutic applications what will be needed is the ability to deliver a gene that by default does not express. Then delivery of a drug would be used to turn it on for some finite period of time. In some cases one can imagine why it would be helpful to turn on a gene periodically. We really need a large variety of types of gene switches, so that a gene delivered by gene therapy can be flipped on to stay on, flipped off to stay off, turned on temporarily, or turned off temporarily. Plus, we need better ways to deliver genes only into specific desired target cell types.
One might think that the biggest challenge of gene therapy is developing the gene or genes to deliver. But so far the biggest challenge has been in various aspects of delivery. We need better ways to get genes into cells, into only specific cell types, in amounts no higher or lower than desired, and in ways that do not cause damage to the normal DNA in each cell. It is hard to guess at the rate gene therapy will advance because really good delivery mechanisms have turned out to be very difficult to develop.
Once genes with control switches can be easily and reliably delivered into brain neurons, consider the James Bond angle: brains could be surreptitiously programmed to alter their behavior in response to some environmental exposure. Imagine a scent that carries a chemical that flips genes in the brain on or off. A guy goes into a room, smells the perfume that Sydney Bristow of Alias is innocently (or not so innocently) wearing, and suddenly he goes berserk and starts trying to kill someone that the Covenant knows he hates.
There is also the large group control aspect. A country that needs killer soldiers may need the soldiers to be mild-mannered civilians between wars but homicidal killers when sent on missions. Well, flip a few switches and suddenly the special forces are champing at the bit to inflict suffering and death. Or an entire society could be rendered docile during a coup attempt by putting something into the water supply to flip genetic switches that were gradually installed, via insect-borne vectors, in all the brains in a capital city without being noticed.
One of the great challenges of adapting human bodies to the modern era is to get control of appetite. Humans tend to eat too much when food is cheap, easily available, and prepared to appeal to their cravings. As a result, the incidence of obesity is rising rapidly in the industrialized countries and this is causing a higher incidence of a large range of obesity-related diseases including joint problems, heart disease, and even an increased risk of cancer. Therefore research into factors that affect appetite and the tendency to gain weight is very important for finding ways to make humans better adapted to environments much different from the ancestral environments that selected for our food cravings.
In America trade restrictions on sugar imports under a tariff rate quota system have increased the cost of the common table sugar sucrose (a disaccharide of glucose and fructose covalently bonded to each other). The result has been such high costs for table sugar that beverage makers responded by switching to high fructose corn syrup as a sweetener. Your FuturePundit has been annoyed about this for a long time and has been aware of theories of potential harm from excess fructose consumption for years. Well, science has begun to confirm at least one of those theories with a new report which shows that fructose consumption in place of glucose consumption causes unfavorable blood changes which may lead to obesity.
Philadelphia, PA -- Researchers at the Monell Chemical Senses Center, the University of California, Davis and other collaborating colleagues report that drinking beverages containing fructose, a naturally-occurring sugar commonly used to sweeten soft drinks and other beverages, induces a pattern of hormonal responses that may favor the development of obesity.
It is estimated that consumption of fructose has increased by 20-30% over the past three decades, a rate of increase similar to that of obesity, which has risen dramatically over the same time span. Data from the present study suggest a mechanism by which fructose consumption could be one factor contributing to the increased incidence of obesity.
In the study, reported in the June 4 issue of the Journal of Clinical Endocrinology and Metabolism, 12 normal-weight women ate standardized meals on two days. The meals contained the same number of calories and the same distribution of total carbohydrate, fat and protein. On one day the meals included a beverage sweetened with fructose. On the other day, the same beverage was sweetened with an equal amount of glucose, another naturally-occurring sugar that is used by the body for energy.
Following meals accompanied by the fructose-sweetened beverage, circulating levels of insulin and leptin were decreased compared to when the women ate the same meals accompanied by the glucose-sweetened beverage. Lower levels of insulin and leptin, hormones that convey information to the brain about the body's energy status and fat stores, have been linked in other studies to increased appetite and obesity.
In addition, levels of ghrelin, a hormone thought to trigger appetite that normally declines following a meal, decreased less after meals on the day the women drank the fructose-sweetened beverage. And, the fructose also resulted in a long-lasting increase of triglycerides, fatty molecules in the blood that are indicators of risk for cardiovascular disease.
Together, the hormonal responses observed after drinking beverages sweetened with fructose suggest that prolonged consumption of diets high in energy from fructose could lead to increased caloric intake and contribute to weight gain and obesity. Lead author Karen Teff, Ph.D., a physiologist at Monell, comments, "Fructose consumption results in a metabolic profile of hormones which would be predicted to increase food intake, thereby contributing to obesity in susceptible populations."
Teff notes that this pattern of hormonal responses is similar to that observed after consuming a high-fat meal, and continues, "Based on our previously published work, this metabolic profile resembles that of fat consumption. Thus, despite the fact that fructose is a sugar, metabolically the responses are similar to those seen following fat ingestion." The elevated levels of plasma triglycerides observed after fructose consumption further suggest that frequent fructose consumption could also contribute to the development of atherosclerosis and cardiovascular disease.
According to co-author Dr. Peter Havel, a research endocrinologist at the University of California, Davis, "Although this short-term experiment provides important new data, additional research is needed to investigate the long-term impact of consuming fructose in humans, particularly its effects on lipid metabolism and on endocrine signals involved in body weight regulation. New studies should also be conducted in subjects who are at increased risk for metabolic diseases such as type-2 diabetes and cardiovascular disease and who may be more susceptible to the adverse effects of overconsuming fructose".
Who do we have to blame for all the fructose in Coca Cola and Pepsi? Well, leaving aside an apathetic public that shouldn't allow its politicians to be bought off, and the politicians themselves, how about the Fanjul sugar family, formerly of Cuba and now of Florida?
The Fanjuls are formidable adversaries. They control about 40 percent of Florida’s sugar crop, and last year they made contributions to 31 political candidates, giving more than any other sugar power. They deeply resent their nickname: the first family of corporate welfare. Little known to the American public, Pepe and Alfy Fanjul operate within the hidden world of implicit linkage, the grand club of the country’s power brokers, who routinely trade favors like baseball cards. "There is a rule to understanding life in South Florida," author and Miami Herald columnist Carl Hiaasen tells me. "Alligators don’t give to political campaigns, and the Fanjuls do." Last year the Fanjuls and Florida Crystals gave $486,000 to Democratic candidates and $279,000 to Republicans. (Alfy, who co-chaired Clinton’s Florida campaign in 1992, is the family’s Democrat; Pepe, who was on Bob Dole’s finance committee in 1996, is the family’s Republican.) "The most telling thing about Alfy Fanjul is that he can get the president of the United States on the telephone in the middle of a blow job. That tells you all you need to know about their influence," Hiaasen says.
Read Marie Brenner's full article from that last link if you are in the mood for some moral outrage.
There is some obvious personal take-home advice here: avoid refined fructose, and consider avoiding the sweetest fruits as well. If you are having a problem with obesity, avoiding fructose is probably even more important.
Obesity is a serious problem. The Scientist has an excellent survey of the costs and the scientific advances being made in appetite and obesity research. (requires free registration and it is an excellent site to register for access)
Obesity is also about money. Last year, the United States spent $75 billion on medical expenditures attributable to obesity; about one-half of this money came from public coffers. The US Federal Trade Commission estimates that North Americans spend $35 billion per year on weight-loss products and programs. "Industry has recognized that this is the largest possible market worldwide ever," says Tschöp. "This is a large amount of people that will have to take a drug until the end of their life, and those people have money." Seckl, whose team identified 11beta-hydroxysteroid dehydrogenase, an enzyme linked to obesity, says that one drug company recently sold an inhibitor for $86 million.
My guess is those numbers underestimate the real costs of obesity because it increases so many disease risks. A major reason why obesity is harmful to you is that fat cells secrete hormones and when a person is obese the doses of those secreted hormones become so great that they cause toxic effects on the body.
The problem is the volume of chemicals these oversize cells churn out, says Dr. George Bray of Louisiana State University. "The big cell secretes more of everything that it secreted when it was small. When you get more of these things, they are not good for you."
The future achievement of control over human appetite will therefore be an enormous health benefit.
A PET imaging study conducted at the UCLA Neuropsychiatric Institute indicates the neurobiology of America's estimated 1 million compulsive hoarders differs significantly from people with other obsessive-compulsive disorder (OCD) symptoms. The findings indicate that different medications could improve treatment success.
Detailed in the June 4 edition of the peer-reviewed American Journal of Psychiatry, the study is the first to examine the neurobiology of people with compulsive hoarding and saving, one of several symptom clusters associated with OCD.
The study identified lower brain activity in the anterior cingulate gyrus of compulsive hoarders, compared with other OCD patients. This brain structure helps govern decision-making, focused attention, motivation and problem-solving, cognitive functions that are frequently impaired in compulsive hoarders. The study also found a correlation between severity of hoarding symptoms and lower brain activity in the anterior cingulate gyrus across all of the study subjects with OCD.
In addition, the hoarding group showed decreased brain activity in the posterior cingulate gyrus compared to healthy control subjects who had no OCD symptoms. The posterior cingulate gyrus is involved in spatial orientation and memory. The decreased activity in hoarders may explain why they have difficulty with excessive clutter and fear of losing belongings.
The findings also demonstrate how neurobiological testing could improve diagnosis and treatment of psychiatric disorders. Lower activity in the anterior and posterior cingulate areas may not only underlie compulsive hoarding symptoms, but also their poor response to standard treatments for OCD. The results suggest cognitive-enhancing medications commonly used in patients with age-related dementia may be more effective at treating compulsive hoarding behaviors than standard OCD medications such as serotonin reuptake inhibitors.
"Our work shows that hoarding and saving compulsions long associated with OCD may spring from unique, previously unrecognized neurobiological malfunctions that standard treatments do not necessarily address," said Dr. Sanjaya Saxena, lead author and director of the UCLA Neuropsychiatric Institute's OCD Research Program.
"In addition, the results emphasize the need to rethink how we categorize psychiatric disorders. Diagnosis and treatment should be driven by biology rather than symptoms. Our findings suggest that the compulsive hoarding syndrome may be a neurobiologically distinct variant of OCD," said Saxena, an associate professor-in-residence of psychiatry and biobehavioral sciences at UCLA's David Geffen School of Medicine.
Hoarding and saving behaviors are associated with a number of psychiatric disorders, including age-related dementia and cognitive impairment, but they are most commonly associated with OCD. An estimated 7 million to 8 million people in the United States suffer from OCD, with compulsive hoarding present in up to one-third. Compulsive hoarding is the primary source of impairment in 10 percent to 20 percent of OCD patients.
Compulsive hoarding is one of several symptom clusters associated with OCD. Others include contamination fears that lead to cleaning compulsions, aggressive and harm-related obsessions that lead to doubt and checking, and symmetry and order concerns. Each of these symptom clusters may be associated with a distinct pattern of brain activity. Standard OCD treatments, including serotonin reuptake inhibitor medications, typically are less effective in OCD patients with prominent compulsive hoarding behaviors.
The UCLA Neuropsychiatric Institute study involved 62 adults: 12 with OCD who had prominent compulsive hoarding behaviors, 33 with OCD who had mild or no symptoms of hoarding, and 17 control subjects who had no OCD symptoms. The researchers used positron emission tomography (PET) to measure brain glucose metabolism, a marker of regional brain activity, in each subject and compared the results.
Upcoming studies at the UCLA Neuropsychiatric Institute will use both PET and magnetic resonance imaging scanning to look for structural and functional abnormalities in the brains of subjects with compulsive hoarding and other types of OCD as the team seeks to further refine and understand these differences. The research team also will examine the effectiveness of newer medications that better address the unique brain activity found in subjects with compulsive hoarding behaviors.
Note that results from brain scans are obviously causing neuroscientists to reorganize the way they categorize and sort various mental disorders. This is analogous to the way that DNA sequencing results have been causing a recategorization of the relationships between species, with species being shifted between genera and other higher level categories of taxonomy. Systems of classification based on intuitive judgements of outwardly visible qualities are being replaced by systems based on qualities measured at the cellular and molecular level. Reductionism marches onward.
Note also that the ability of brain scanner instruments to measure what is going on is providing both a more accurate method of diagnosis and yielding useful hints about what drugs may be most effective to treat each person. The reliance on the psychiatrist's intuitive judgement based on interviews and observation of visible behavior to form a diagnosis is being at least partially supplanted by direct internal observation of what is happening in the brain. Also, the ability to observe what is happening in the brain is pointing toward the potential of courses of treatment that otherwise may never have been considered.
Advances in medical instrumentation are making medical diagnosis more accurate and in the process are removing subjective judgements from medicine. The removal of subjective judgement creates the potential for far greater automation of diagnosis and treatment delivery.
Reports of advances in gene therapy research have not been coming anywhere near as fast as reports about stem cell research advances. However, some University of Wisconsin researchers have achieved exciting successes in delivering genes into laboratory animals.
One group, consisting of researchers from the Medical School, the Waisman Center and Mirus Bio Corporation, now reports a critical advance relating to one of the most fundamental and challenging problems of gene therapy: how to safely and effectively get therapeutic DNA inside cells.
The scientists have discovered a remarkably simple solution. They used a system that is virtually the same as administering an IV (intravenous injection) to inject genes and proteins into the limb veins of laboratory animals of varying sizes. The genetic material easily found its way to muscle cells, where it functioned as it should for an extended period of time.
“I think this is going to change everything relating to gene therapy for muscle problems and other disorders,” says Jon Wolff, a gene therapy expert who is a Medical School pediatrics and medical genetics professor based at the Waisman Center. “Our non-viral, vein method is a clinically viable procedure that lets us safely, effectively and repeatedly deliver DNA to muscle cells. We hope that the next step will be a clinical trial in humans."
Wolff conducted the research with colleagues at Mirus, a biotechnology company he created to investigate the gene delivery problem. He will be describing the work on June 3 at the annual meeting of the American Society of Gene Therapy in Minneapolis, and a report will appear in a coming issue of Molecular Therapy. The research has exciting near-term implications for muscle and blood vessel disorders in particular.
Love that bit about "near-term implications". This technique could be used to treat Duchenne’s muscular dystrophy and a number of other diseases.
Duchenne’s muscular dystrophy, for example, is a genetic disease characterized by a lack of muscle-maintaining protein called dystrophin. Inserting genes that produce dystrophin into muscle cells could override the defect, scientists theorize, ensuring that the muscles with the normal gene would not succumb to wasting. Similarly, the vein technique can be useful in treating peripheral arterial occlusive disease, often a complication of diabetes. The disorder results in damaged arteries and, frequently, the subsequent amputation of toes.
What’s more, Wolff says, with refinements the technique has the potential to be used for liver diseases such as hepatitis, cirrhosis and PKU (phenylketonuria).
In the experiments, the scientists did not use viruses to carry genes inside cells, a path many other groups have taken. Instead, they used “naked” DNA, an approach Wolff has pioneered. Naked DNA poses fewer immune issues because, unlike viruses, it does not contain a protein coat (hence the term “naked”), which means it cannot move freely from cell to cell and integrate into the chromosome. As a result, naked DNA does not cause antibody responses or genetic reactions that can render the procedure harmful.
Researchers rapidly injected “reporter genes” into a vein in laboratory animals. Under a microscope, these genes brightly indicate gene expression. A tourniquet high on the leg helped keep the injected solution from leaving the limb.
“Delivering genes through the vascular system lets us take advantage of the access blood vessels have — through the capillaries that sprout from them — to tissue cells,” Wolff says, adding that muscle tissue is rich with capillaries. Rapid injection forced the solution out of the veins into capillaries and then muscle tissue.
The injections yielded substantial, stable levels of gene activity throughout the leg muscles in healthy animals, with minimal side effects. “We detected gene expression in all leg muscle groups, and the DNA stayed in muscle cells indefinitely,” notes Wolff.
In addition, the scientists were able to perform multiple injections without damaging the veins. “The ability to do repeated injections has important implications for muscle diseases since to cure them, a high percentage of therapeutic cells must be introduced,” he says.
The researchers also found that they could use the technique to successfully administer therapeutically important genes and proteins. When they injected dystrophin into mice that lacked it, the protein remained in muscle cells for at least six months. Similar lasting power occurred with the injection of erythropoietin, which stimulates red blood cell production.
Furthermore, in an ancillary study, the researchers learned that the technique could be used effectively to introduce molecules that inhibit — rather than promote — gene expression, a powerful new procedure called RNA interference.
Given a way to reliably and safely deliver genes there are literally thousands of different diseases that potentially could be treated with gene therapy. The lack of good mechanisms for delivery of genes has been the major factor holding back the development of gene therapies. Whether this mechanism will turn out to be safe remains to be seen. For a gene therapy technique to work one has to have ways to avoid causing gene expression in the wrong kinds of cells other than the desired target type. One also has to avoid overexpression in the desired target cell type and achieve fairly even distribution to all cells of the target type. Plus, there is also a very real worry about risk of damage to genomes that could cause cancer. Whether this latest therapy technique will get tripped up by one or more of these problems remains to be seen.
The International Atomic Energy Agency claims that the risk of terrorist use of radioactive dirty bombs is growing.
The IAEA's records, which it has released to New Scientist, show a dramatic rise in the level of smuggling of radiological materials, defined as radioactive sources that could be used in dirty bombs but not nuclear bombs.
In 1996 there were just eight of these incidents but last year there were 51.
As more of the world industrializes, the use of highly radioactive materials such as cobalt-60 for medical purposes will increase, and hence there will be radioactive material at more sites around the globe and greater chances for diversion of the material for nefarious purposes. Though in the longer run advances in biotechnology ought to produce better treatments and medical tests that do not involve the use of toxic radiation. For example, a recent study found that a test for a blood protein could be used to eliminate the need for as much as 60% of lung angiogram CT scans.
David Stipp of the business magazine Fortune has an article about biogerontologist Aubrey de Grey and his radical views about the feasibility of halting and reversing aging.
Even if he's right, de Grey is well aware that scientific feasibility doesn't equal political will. In fact, he says his own starting point in gerontology was his recognition in the mid-1990s of an institutional "fatalism logjam." Since there have been few signs of progress in the quest for anti-aging therapies, funding agencies generally dismiss such work as a waste of resources, or worse, as attempts to brew up snake oil. They won't pay for research, so no progress is made—which, in turn, keeps the impression of intractability in place. Thus, serious scientists have long avoided the pursuit of anti-aging therapies for fear of being labeled flaky dreamers or aspiring charlatans. The closest approach to such work is the relatively modest quest for medicines that prolong good health during old age. This entrenched timidity "just makes me spit," says de Grey. Many researchers on aging privately agree, he adds, but can't afford to be as outspoken as he is because it might hurt their chances to get grants. (A problem he doesn't have, thanks to his genetics job.) Breaking the vicious circle, he adds, will require a big, bold stroke.
It is great that a mainstream business magazine is publicizing these ideas. As anyone who has been reading FuturePundit for a while must know by now, I share Aubrey's views about what is possible to achieve in human rejuvenation. Also, he is right to argue that we are not trying anywhere near as hard as we should to develop rejuvenation therapies given the excellent prospects for success within the lifetimes of many people now alive. So big is the potential pay-off that the failure to make the big push for rejuvenation is surely the biggest mistake in science policy now being made by the United States and the other developed countries.
On the bright side, some of the problems being worked on with the goal of treating various diseases are going to contribute toward the set of therapies that Aubrey has outlined as Strategies for Engineered Negligible Senescence. For instance, all the work on stem cells and tissue engineering builds toward the ability to grow replacement organs and to send in stem cells to replace cells lost from the accumulation of damage that comes with aging. Also, the continued development of a large range of technologies that accelerate the rate of advance of biological science and biotechnology is making it easier to develop rejuvenation therapies. So there are rays of hope in spite of the pessimistic and obviously wrong conventional wisdom that still guides biomedical research funding policy in the United States and other developed countries.
Aubrey is arguing for $100 million per year for a 10 year project to triple the life expectancies of bioengineered mice as a way to test out rejuvenation therapies for humans. To put that amount in perspective, the US National Institutes of Health (NIH) is currently funded at $28 billion for Fiscal Year 2004. We are failing to spend even chump change amounts to pursue rejuvenation treatments that would obviate the need for the development of most disease treatments. Most disease is the result of general aging. Parts wear out and begin to act in ways that cause symptoms of disease. If the parts could be rejuvenated, if they could be replaced, if built-up toxins could be removed, then the vast bulk of diseases would never develop in the first place.
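To see just how small the proposed spend is relative to the NIH budget, here is a back-of-envelope check using only the two figures quoted above (a sketch, not an official budget analysis):

```python
# Figures from the post: $100 million/year proposed vs. $28 billion NIH FY2004 budget.
nih_budget_fy2004 = 28e9        # US NIH budget, Fiscal Year 2004, in dollars
mouse_project_per_year = 100e6  # proposed annual spend on the mouse rejuvenation project

fraction = mouse_project_per_year / nih_budget_fy2004
print(f"Proposed project is {fraction:.2%} of the annual NIH budget")
```

The proposal works out to roughly a third of one percent of the NIH budget, which is what makes the "chump change" characterization hard to argue with.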
Update: The Fight Aging blog has a post with additional commentary about the Fortune article and mentions the Methuselah Mouse Prize which Aubrey and Dave Gobel have organized to provide incentives to researchers to develop longer lived mice.
Picture a mannequin. Siemens has made an automated shirt-ironing device shaped like the chest and arms of a mannequin (see the picture here) that one places a shirt on. The mannequin then inflates and applies hot steam to the shirt, taking out the wrinkles. The German sales price of €1,000 will probably translate into a US price one or two hundred dollars above a thousand dollars when it reaches the US. (same article here)
The main objective of the Dressman robot is to dry and press shirts. On placing a damp shirt on the ironing figure, this dummy inflates with hot air in its interior, and thus puffs the shirt up, removing creases and drying the garment (it has to have been previously washed and spin-dried in a washing machine). The device has a heater box inside with a number of different resistance elements. While we are placing the shirt on it, this box stores up heat in such a way that, when the garment is positioned and we press the start button, the whole ironing dummy fills with hot air which presses and dries the shirt. Moreover, the device has an air filter which prevents dirt entering the ironing dummy.
Since it isn't really a mechanical device calling it a robot may seem to stretch the definition of robot. After all, a dishwashing machine automates a human task and yet we do not think of it as a robot. Still, it does something that many may have expected would require a more complex robot to perform.
Stressed-out homemakers can now take a break and leave the iron in the closet. A new product from Siemens called "dressman" will soon at least take over the chore of ironing shirts. An Emnid survey confirms something we already know from personal experience: Ironing is one of the household chores that people hate the most. It also eats into precious free time, for even experienced ironers need about eight minutes to press a shirt. This new ironing assistant promises to deliver perfectly ironed shirts in no time. In Germany Siemens sold about 4,000 units within a few months. Now the company is starting to market the device in other countries. The equipment looks like the upper body of the mannequins you see in store windows. A freshly washed shirt is simply pulled over the device, and any wrinkles are smoothed out. Twelve fully automatic programs for various types of shirts and materials take care of the rest: The shell made of balloon silk literally inflates itself with hot air and gets the shirts into shape. And the process is easy on the shirts because it uses low temperatures. Broken buttons and unsightly stains will also become things of the past, and additional functions can dry wet jackets or air out sports coats. Up to now, such automatic ironing systems have been available only for professional cleaners and laundries. These use high pressure and are hard on the material as a result. They are also big and expensive. The dressman, which costs about €1,000, is not exactly inexpensive, but it works very economically. The operating costs amount to only five cents per shirt. By comparison, it costs about €2 at the cleaners — not including the cost of getting there. (IN 2004.02.6)
4,000 of these puppies have already been sold in Germany and from a press release date it appears it went on sale in February 2004 there. Are there any German readers who have one who can comment on how well they work? Siemens is starting to introduce this device in other countries. Anyone outside of Germany seen one for sale yet?
This leads to the obvious question: Is Siemens going to produce a Pantsman for ironing pants? One complication there is the crease that we expect pants to have ironed into them. Anyone have a home pants pressing machine?
Take home lesson? It is possible to automate additional common household tasks without waiting for the development of artificially intelligent robots.
When steam is used to turn a generator, it must be pressurised and raised to around 650 °C. Below 450 °C, the process no longer operates efficiently because the steam pressure drops too low. This means that the heat in flue gases below 450 °C cannot be used to generate electricity, and so is lost to the atmosphere.
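One way to see why cooler flue gases are so hard to exploit is the Carnot limit, which caps the fraction of heat any engine can convert to work between two temperatures. The short sketch below is my own illustration (the article does not give this calculation); it assumes a roughly room-temperature heat sink:

```python
# Illustrative only: the Carnot limit shows how the theoretical ceiling on
# heat-to-work conversion falls as the heat source cools.
# Efficiency limit = 1 - T_cold / T_hot, with temperatures in kelvin.

def carnot_efficiency(t_hot_c: float, t_cold_c: float = 25.0) -> float:
    """Maximum fraction of heat convertible to work between two temperatures (°C inputs)."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

for t in (650, 450, 300):
    print(f"{t} °C source: theoretical ceiling {carnot_efficiency(t):.0%}")
```

Note the theoretical ceiling at 300 °C is still substantial; the practical problem the article describes is that a steam cycle specifically cannot operate well there because the steam pressure drops too low, which is the niche a lower-boiling working fluid targets.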
Two engineers have developed a mechanism using a pair of heat exchangers and propane cycling between liquid and vapor states to drive a turbine to generate more electricity from heat that is currently wasted.
But now Daniel Stinger, a turbine engineer, and Farouk Mian, a petroleum engineer, have developed a surprisingly simple way to harness almost all this waste heat. They calculate that a second turbine, driven by the waste heat from the first, would capture almost all the remaining energy. The first turbine's waste heat would vaporise and pressurise still more propane to drive the second (see diagram).
Daniel Stinger and Farouk Mian have founded Wow Energies and have a patent pending on their invention.
A new patent pending technology is available which replaces the steam turbine system with a Cascading Closed Loop Cycle (CCLC) system producing an increase in MW output of 150% to 600% over a steam turbine system operating at the same heat source temperatures. Click here for comparison chart. The CCLC can also be installed to operate in conjunction with an existing steam turbine system to increase the output by 100% without using additional fuel, the Super CCLC system. If the CCLC turbine system had been installed in place of or with the steam turbines in use today, it is estimated that the U.S. economy could save $100 billion in fuel costs annually. The savings to the economy if the CCLC technology is used to retrofit existing units is conservatively estimated at $35 billion annually.
Other industry processes are equally inefficient. Industries that depend on burning fossil fuels in boilers, furnaces, ovens, kilns, gas turbines, internal combustion (IC) engines, fuel cells, nuclear power plants, etc. all produce equivalent losses in the form of waste heat exhausted to the atmosphere. Major industries, in addition to the power generation industry, which can benefit from the CCLC technology include refining, petrochemical, transportation, cement, pulp & paper, metals and pharmaceutical.
Even more dramatic are the corresponding environmental benefits of conservation of non-renewable fuel resources and dramatic reductions or elimination of emissions when installed on an existing waste heat source.
The CCLC system uses off-the-shelf components. The three (3) major components are a pump, heat exchanger and turbo-expander (turbines) that are readily available from numerous suppliers. For example, both axial and centrifugal turbo-expanders are used extensively in the petrochemical and oil & gas industries and are readily available from suppliers such as GE, Atlas Copco, Mafi Trench, Mitsubishi, Siemens and MAN.
Less fossil fuels burned to generate electric power translates into less pollutants released across the board. But there is an additional benefit of their approach. By converting more of the heat into electricity they lower the temperature of exhaust air and that causes many pollutants to condense into liquids and solids instead of being released into the atmosphere.
The CCLC system is so efficient that during the process of converting waste heat to power, it reduces the flue gas temperature to near ambient where conditions are favorable for elimination of pollutants. At these temperatures, vaporized pollutants such as Mercury, Vanadium, Lead, Cadmium, as well as Vaporized Organic Compounds (VOC), can no longer exist in a vaporized state and are “forced” to condense out of the flue gas as a liquid or solid. The remaining SOx and NOx can be removed using a low temperature Final Flue Gas Cleanup (FFGC) system by circulating a dilute water solution of sodium hydroxide and hydrogen peroxide in a scrubber that reacts with any remaining SOx and NOx to form stable salt solutions. The dilute solution also serves to remove PM2.5 and PM10 particulates, returning the flue gas to the environment in a pristine state. Low temperature scrubbers are commonly used in the petrochemical and pharmaceutical industries where they must totally prevent far more dangerous pollutants from entering the environment. Any pollutants escaping their plants would be instantly destructive, whereas the pollutants noted above only slowly but surely damage the environment and destroy our health.
Lower costs and less pollution are double wins for their invention.
The CCLC system uses off-the-shelf components to generate electricity by recovering the trillions of BTUs discharged hourly to the environment in the form of 300 °F to 700 °F waste heat. Instead of vaporizing water to produce steam to drive a steam turbine, the CCLC process vaporizes propane to drive turbo-expanders in a sealed closed loop system. The propane is identical to that used in back-yard grills for cooking, stored in tanks for heating homes, and as a clean fuel for cars, trucks and other vehicles. Propane is not consumed in the process and serves only as the medium to convert thermal energy to mechanical energy; requiring only 130 Btu/lb to vaporize versus 1000 Btu/lb for water. More importantly, propane will vaporize and absorb superheat at low ambient temperatures – not possible with water. Turbo-expanders have been used for decades throughout industry to expand vaporized hydrocarbons, including propane, to produce electrical power. The uniqueness of the CCLC patent pending system is the use of twin turbo-expanders and multiple heat exchangers, in a parallel/series arrangement, resulting in conversion of nearly all the temperature from the heat source to electrical power.
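The vaporization figures quoted above are the heart of the argument: each pass through the cycle first has to vaporize the working fluid, so a fluid with a much lower heat of vaporization can be driven by much cooler waste heat. A quick sketch using only the two quoted numbers (this is my own arithmetic, not Wow Energies' model):

```python
# Quoted figures: propane needs ~130 Btu/lb to vaporize, water ~1000 Btu/lb.
LATENT_HEAT_BTU_PER_LB = {"water": 1000.0, "propane": 130.0}

def lb_vaporized_per_mmbtu(fluid: str) -> float:
    """Pounds of working fluid vaporized by one million Btu of waste heat."""
    return 1_000_000 / LATENT_HEAT_BTU_PER_LB[fluid]

ratio = lb_vaporized_per_mmbtu("propane") / lb_vaporized_per_mmbtu("water")
print(f"Propane: {lb_vaporized_per_mmbtu('propane'):,.0f} lb per MMBtu")
print(f"Water:   {lb_vaporized_per_mmbtu('water'):,.0f} lb per MMBtu")
print(f"Propane vaporizes roughly {ratio:.1f}x more mass per unit of heat")
```

More circulating vapor mass per unit of heat is what lets the turbo-expanders extract useful work from flue gas that is far too cool to raise high-pressure steam.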
If this turns out to work then consider the implications. Rather than building new electric power generation plants, existing plants could be outfitted to generate more electricity from the same amount of fuel. Even nuclear plants could have their electric power output boosted. Plus, the CCLC system could be hooked up to all sorts of industrial processes used in other industries to provide yet more sources of electric power and less pollution to boot.
Duke University researchers have demonstrated the ability to extract stromal cells from mouse fat tissue and convert them into nerve cells.
DURHAM, N.C. -- Two years after transforming human fat cells into what appeared to be nerve cells, a group led by Duke University Medical Center researchers has gone one step further by demonstrating that these new cells also appear to act like nerve cells.
The team said that the results of its latest experiments provide the most compelling scientific evidence to date that researchers will in the future be able to take cells from a practically limitless source -- fat -- and retrain them to differentiate along new developmental paths. These cells, they said, could then be used to possibly treat a number of human ailments of the central and peripheral nervous systems.
The results of the team's latest experiments were published June 1, 2004, in the journal Experimental Neurology.
Using a cocktail of growth factors and induction agents, the researchers transformed cells isolated from mouse fat, also known as adipose tissue, into two important nerve cell types: neurons and glial cells. Neurons carry electrical signals from cell to cell, while glial cells surround neurons like a sheath.
"We have demonstrated that within fat tissue there is a population of stromal cells that can differentiate into different types of cells with many of the characteristics of neuronal and glial cells," said Duke's Kristine Safford, first author of the paper. "These findings support more research into developing adipose tissue as a viable source for cellular-based therapies."
Over the past several years, Duke scientists have demonstrated the ability to reprogram these adipose-derived adult stromal cells into fat, cartilage and bone cells. All of these cells arise from mesenchymal, or connective tissue, parentage. However, the latest experiments have demonstrated that researchers can transform these cells from fat into a totally different lineage.
Earlier this year, Duke researchers demonstrated that these adipose-derived cells are truly adult stem cells. As a source of cells for treatment, adipose tissue is not only limitless, it does not carry the potentially charged ethical or political concerns of other stem cell sources, the researchers said.
"This is a big step to take undifferentiated cells that haven't committed to a particular future and redirect them to develop down a different path," said Duke surgeon Henry Rice, M.D., senior member of the research team. "Results such as these challenge the traditional dogma that once cells become a certain type of tissue they are locked into that destiny. While it appears that we have awakened a new pathway of development, the exact trigger for this change is still not known."
For their latest experiments, the researchers demonstrated that the newly transformed adipose cells expressed many of the same cellular proteins as normal nerve and glial cells. Furthermore, they showed that the function of these cells is similar to that of nerves.
The problem of how to change differentiated cells (cells specialized to perform particular functions) into less differentiated cells is obviously very solvable. Differentiation of cells into specialized types is not a one way street. This should not be too surprising. Cells are made up of matter and matter is malleable. The arrangement of the cellular matter that determines cellular type (known as epigenetic information) is becoming steadily more malleable with each discovery of how to manipulate cells. Recently Scripps researchers found a compound they labelled reversine that converts differentiated cells into stem cells. They had to search through only 50,000 compounds to find one that would do that. Surely there are huge numbers of other compounds waiting to be discovered that will dedifferentiate (i.e. despecialize) cells to turn them back into stem cells and even turn them all the way back into the equivalent of embryonic stem cells.
Embryonic stem cells may turn out to provide a starting point for therapy development that allows the more rapid development of some types of cell therapies. But there is no treatment that can be developed from embryonic stem cells that won't also eventually be achievable using adult stem cells or fully differentiated adult cells as starting points. Of course, in the short term one can understand why those who have no moral qualms about using embryonic stem cells want to see them used to develop therapies. Embryonic stem cells may save some lives. But for those who will need cell therapy-based treatments in the medium to long term the debate about embryonic stem cell therapy will probably have no impact on the availability of treatments.