Professor Jaroslav Flegr of Charles University in Prague has discovered evidence that infection by the intracellular protozoan parasite Toxoplasma gondii (T. gondii) causes changes in human personalities.
He found that women infected with toxoplasma spent more money on clothes and were consistently rated as more attractive. “We found they were more easy-going, more warm-hearted, had more friends and cared more about how they looked,” he said. “However, they were also less trustworthy and had more relationships with men.”
By contrast, the infected men appeared to suffer from the “alley cat” effect: becoming less well-groomed, undesirable loners who were more willing to fight. They were more likely to be suspicious and jealous. “They tended to dislike following rules,” Flegr said.
Why the cat parallels? The parasite infects cats and is passed on to rats via cat feces. In rats it creates the proverbial fatal attraction.
LONDON - Scientists have discovered a parasite that inhabits rats and makes them feel a suicidal attraction for cats. The parasite, which infects as many as one in five rats, can also affect humans.
The parasite, nicknamed the love bug but scientifically known as Toxoplasma gondii, an intracellular protozoan, infects the rodent's brain, inducing an effect similar to Prozac so it becomes less fearful of cats.
It might be too late to get rid of Fluffy. University College London T. gondii researcher Dr Dominique Soldati says that once infected you have it for life and it gradually grows. “Once you are infected you cannot get rid of this parasite and the numbers of them slowly grow over the years,” she said. “It’s not a nice thought.”
The scientists set up a wild enclosure for rats, with different smells in each corner. Rats infected by the parasite were attracted to the smell of cat urine.
The minds of infected rats are subtly altered so that they become less able to avoid being captured and eaten by cats. The parasite spreads when rats eat cat feces; the infected rats are then more easily caught and eaten by cats, completing the cycle.
As this review of the molecular biology of T. gondii demonstrates, scientists are looking for ways to stop and destroy this parasite once it has infected humans.
T. gondii has evolved a remarkable ability to survive in its host, typically for long periods of time, with minimal pathogenicity and in a variety of cell types. However, the mechanisms by which this obligate intracellular parasite becomes a master at manipulating the structures and pathways of the host cell for its own nefarious purposes to create a hospitable environment remain difficult to analyse in the background of a nucleated host cell.
T. gondii infection is especially dangerous for children born to women who become infected during pregnancy.
Toxoplasma, mainly transmitted by consumption of contaminated meat or by cat faeces, chronically infects half the world's population. The pathogen is a leading cause of neurological birth defects in children born to mothers who contract the disease during pregnancy and can cause fatal toxoplasmosis encephalitis in immunosuppressed patients.
Scientists hope that understanding the gene's function will aid efforts to develop drugs that target and block the way Apicomplexa parasites penetrate host cells.
Women who want to have children should probably give away Fluffy to post-menopausal women who show signs of promiscuity and large tasteful wardrobes.
But what about the threat to Western Civilization? Cats are making our women less trustworthy and more superficial while they are making men into scruffy loners who are unwilling to follow rules. If some terrorist group were releasing pathogens that had this effect we'd be hunting them down and killing them without mercy (assuming the FBI and CIA could find them - the anthrax mail case may never be solved). But since kitties are fluffy, make cute purring sounds, and occasionally rub up against people's legs they are considered adorable by many. This leaves them free to operate in plain sight to undermine Western Civilization while every single one of them affects an air of total indifference and disinterest.
Update: Christopher Genovese has taken the time to read some of Jaroslav Flegr's research papers and presents an excellent analysis of the question of whether Flegr's work has discovered a real effect on humans. My take is that while it isn't clear that Flegr has proved his case, it is plausible, in part because the human domestication of cats happened fairly recently (in ancient Egypt, if memory serves) in human evolution. So humans would not have had time to evolve an effective response to a deleterious cat parasite. By contrast, the likelihood of getting harmful parasites from dogs would seem lower since humans have been living with dogs for a much longer period of time.
Writing for The Christian Science Monitor Michelle Thaller reports on the Brane Theory for an eleven dimensional universe.
There are some theoretical reasons to believe that there are other branes out there besides our own, separated from us by a dimension we can't travel in. Cosmologists are getting pretty excited about a new model of how the universe began, with one or more branes interacting with each other. There may even be observational evidence of this in the microwave background radiation, leftover heat from the very beginning of our universe. The implications of this theory are staggering. Not only is the door left wide open to the possibility of entire parallel universes existing out there in the Bulk, but now we have the real possibility that gravity may allow us to explore them, to a very limited degree.
Brane is short for membrane and is a reference to the idea that our 3-dimensional universe might be enclosed in a higher dimensional membrane of some sort.
While hurricane frequency follows a cycle a few decades long, there does not appear to be a longer-term trend toward more or stronger hurricanes.
"It does confirm there are cycles of activity, rather than long-term trends towards more or stronger storms," says Landsea. That database also reveals that states such as Georgia that were largely spared during the 20th century remain at risk.
The problem we face now is that the cycle is moving into the higher frequency period. This comes after a lot of population growth in coastal regions. So the economic and human cost is going to be much greater in the coming higher hurricane activity period.
"The results indicate that early nicotine exposure can leave a lasting imprint on the brain," said Edward Levin, Ph.D., professor in the psychiatry and behavioral sciences department at Duke University Medical Center and a researcher at Duke's Nicotine Research Center. The study was supported by grants from the National Institute on Drug Abuse and the National Institute of Mental Health.
Most tobacco use begins during adolescence, Levin pointed out. Among smokers in the United States, 88 percent smoked their first cigarette before the age of 18 and 60 percent before age 14. Adolescence is also a crucial period for the brain, he said, in which the final phase of neuron development occurs.
If human minds react similarly to those of rats, the effect of the early exposure is quite deleterious:
To clarify the basis of early nicotine addiction, Levin and colleagues tested for a link between the age of initial nicotine use and addiction in female rats in the laboratory. The researchers provided some rats with nicotine at 40 to 46 days of age, while others were provided nicotine only after 70 to 76 days, once they had reached adulthood. Rats could self-administer a dose of nicotine by pressing a lever.
The adolescent rats self-administered significantly more nicotine than did adults, the researchers found. In a test for chronic nicotine use in the rats during a period of four weeks, animals that began using nicotine during adolescence continued to use more of the drug even after they became adults.
The results suggest that people who begin using nicotine during adolescence may be at greater risk for long-lasting addiction, the team reports.
"The brain continues to develop throughout the teenage years," Levin said. "Early nicotine use may cause the wiring of the brain to proceed inappropriately. In essence, the brains of adolescents who use tobacco may be sculpted around an addiction to nicotine."
This is not an entirely surprising result and, in my view, will eventually be confirmed in humans. The minds of adolescents are undergoing a lot of changes. This suggests a higher degree of plasticity that probably means younger growing minds will change more in response to exposure to drugs. Allowing adolescents easy access to drugs will result in changes to their minds that will last for many years and perhaps for their lifetimes.
Update: The rats used for this experiment were the age equivalent of 14 year old girls.
Update II: Drugs vary in their addictiveness.
According to the Institute of Medicine of the National Academy of Science, 32 percent of people who try tobacco become dependent, as do 23 percent of those who try heroin, 17 percent who try cocaine, 15 percent who try alcohol and 9 percent who try marijuana.
Dr. Cami and Dr. Farré observed that personality traits like risk-taking and novelty-seeking tendencies, as well as mental disorders, are "major conditioning factors in drug addiction."
It would be interesting to see how those figures break out by age, race, and sex. If the results of exposing rats to nicotine at different ages are a measure of a general phenomenon then we'd expect to see higher rates for those who try a given drug in adolescence as compared to trying it at a later age.
Update III: Technological advances are reducing the need to smuggle drugs as synthetic drug manufacture can be done close to the point of consumption and synthetic drug abuse has now surpassed cocaine and heroin drug abuse.
Ecstasy abuse grew by 70 percent and abuse of amphetamines, such as speed, by 40 percent between 1995 and 2001. By contrast, cocaine and heroin abuse worldwide grew less than one percent each.
Addictive drugs that alter and damage the mind are going to become easier to make and their use is likely to grow.
Dog Genome Published by Researchers at TIGR, TCAG New technique, partial shotgun-genome sequencing at 1.5X coverage (6.22 million reads) of genome, provides a useful, cost-effective way to increase number of large genomes analyzed
Analysis reveals that 650 million base pairs of DNA are shared between dogs and humans including fragments of putative orthologs for 18,473 of 24,567 annotated human genes; Data provide necessary tools for identifying many human and dog disease genes
Since not all of the dog genome has been sequenced this surely represents a minimum estimate of the percentage of genes held in common. Also, not all human genes have been identified and some of the undiscovered ones might turn out to be shared with dogs as well.
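As a sanity check, the 18,473 of 24,567 figure in the announcement above works out to just over 75 percent of annotated human genes having a putative dog ortholog, matching the "75% of known human genes" claim later in the release:

```python
# Figures from the TIGR/TCAG announcement
orthologs_found = 18_473        # human genes with a putative dog ortholog
annotated_human_genes = 24_567  # annotated human genes at the time

fraction = orthologs_found / annotated_human_genes
print(f"{fraction:.1%}")  # prints "75.2%"
```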
September 25, 2003
Rockville, MD - Researchers at The Institute for Genomic Research (TIGR) and The Center for the Advancement of Genomics (TCAG) have sequenced and analyzed 1.5X coverage of the dog genome. The research, published in the September 26th edition of the journal Science, asserts that a new method of genomic sequencing, partial shotgun sequencing, is a cost-effective and efficient method to sequence and analyze many more large eukaryotic genomes now that there are a number of reference genomes available with which to compare. This important new study was funded by the J. Craig Venter Science Foundation.
The TIGR/TCAG team assembled 6.22 million sequences of dog DNA for nearly 80% coverage of the genome. Comparing the dog sequence data with current drafts of the human and mouse genome sequences showed that the dog lineage was the first to diverge from the common ancestor of the three species and that the human and dog are much more similar to each other at the genetic level than to the mouse. The group also identified 974,400 single nucleotide polymorphisms (SNPs) in the dog. These genetic variations are important in understanding the genes that contribute to diseases and traits among various breeds of dogs.
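The "nearly 80%" of the genome covered by only 1.5X redundancy is what the standard Lander-Waterman model predicts: if reads land at random, the expected fraction of a genome hit by at least one read at c-fold redundancy is 1 - e^(-c). A quick sketch:

```python
import math

def expected_covered_fraction(redundancy: float) -> float:
    """Lander-Waterman estimate: fraction of the genome hit by at least one read."""
    return 1.0 - math.exp(-redundancy)

# The TIGR/TCAG project sequenced the dog genome at 1.5X redundancy
print(f"{expected_covered_fraction(1.5):.1%}")  # prints "77.7%", i.e. nearly 80%
```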
The identified SNPs are probably only a fraction of the total number of SNPs that dogs have. If humans are a reliable indicator then we can expect that eventually millions of dog SNPs will be found. So, yes, your dog really is unique.
The dog genome sequencing project, led by Ewen Kirkness, Ph.D., investigator at TIGR, revealed that more than 25% or 650 million base pairs of DNA overlap between human and dog. The sequence data was used to identify an equivalent dog gene for 75% of known human genes. In addition to this core set of common genes, the sequence data has revealed several hundred gene families that have expanded or contracted since divergence of the dog lineage from our common ancestor. For example, the dog genome is predicted to encode a much greater diversity of olfactory receptors than we find in humans - which may contribute to their keen sense of smell.
"In little more than a decade genomics has advanced greatly and we now have approximately 150 completed genomes, including the human, mouse and fruit fly, in the public domain," said J. Craig Venter, Ph.D., president, TCAG. "With each sequenced genome the level of information gleaned through comparative genomics is invaluable to our understanding of human biology, evolution, and basic science research. Our new method is an efficient and effective way of sequencing that will allow more organisms to be analyzed while still providing significant information."
Most of those 150 completed genomes are for bacteria and other species that have smaller genomes. So that is not as impressive as it first sounds.
Conservation of the dog and human genome sequences is not restricted to genes, but includes an equally large fraction of the genomes for which functions are not yet known. "Understanding why these sequences are so highly conserved in different species after millions of years of divergent evolution is now one of the most important challenges for mammalian biologists," says Kirkness.
Comparing genomes between species is a great way to find functionally important regions. A region that is conserved across species is unlikely to be unused junk DNA.
The first rough draft sequence of the dog genome was done using DNA from Craig Venter's standard poodle, Shadow. It shows that even though the dog lineage diverged from our common ancestor earlier, we are genetically more similar to dogs than to mice.
The study confirms that, while dogs and wolves diverged from the common ancestor of all mammals long before early humans and mice did, dogs are much more closely related to humans than mice are.
"The wolf line diverged a little earlier, but the mouse is evolving faster," Venter said.
One likely reason for the more rapid divergence is that mice have shorter lifespans and shorter reproductive cycles. So mice have gone through more generations since they split off than have dogs or humans. Another reason could be that their ecological niches exerted more selective pressure on them.
Because so many people love their doggies a great deal of medical knowledge has been amassed about dogs.
"Dogs enjoy a medical surveillance and clinical literature second only to humans, succumbing to 360 genetic diseases that have human counterparts," comment O'Brien and Murphy. "Dogs have been beneficial for standard pharmaceutical safety assessment and also for ground-breaking gene therapy successes."
Dogs get a lot of the same symptoms for many disorders. Though unfortunately they aren't as good at explaining how they feel.
This study demonstrates the extent to which DNA sequencing technology has become faster.
It is a point echoed by Dr Stephen O'Brien from the US National Cancer Institute: "NHGRI recently estimated that in the next four years, US sequencing centres alone could produce 460 billion bases - the equivalent of 192 dog-sized genomes at [just under the Tigr/TCAG] coverage."
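O'Brien's arithmetic can be unpacked: 460 billion bases spread over 192 genomes comes to about 2.4 billion bases apiece, roughly one-fold redundancy over a dog-sized genome (the ~2.4 gigabase genome size is my assumption, not stated in the quote):

```python
total_bases = 460e9      # NHGRI's projected 4-year output of US sequencing centers
genomes = 192            # "dog-sized genomes" in O'Brien's quote
dog_genome_size = 2.4e9  # approximate dog genome size in bases (assumption)

bases_per_genome = total_bases / genomes
redundancy = bases_per_genome / dog_genome_size
print(f"{bases_per_genome:.2e} bases per genome, about {redundancy:.1f}X redundancy")
```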
Since there are 4,600 mammalian species we are still some way away from having a complete sequencing library of all mammals. Plus, the genetic variations are as important as the basic sequences and we still do not know what all the genetic variations are for a single species let alone what they all mean.
In about 10 or 20 years the cost of DNA sequencing will fall so far and the speed of DNA sequencing machines will increase so much that the sequencing of a genome will be doable in less than a day. At that point what seems like an amazing accomplishment today will seem pretty commonplace.
Said conference organizer Bryan E. Laubscher of the Los Alamos Space Instrumentation and System Engineering Group, "With the discovery of carbon nanotubes and their remarkable strength properties, the time for the space elevator is at hand."
"The promise of inexpensive access to space is so important to the human race that we are ready to meet these challenges head on. Viewed in one way, the space elevator will be the largest civil engineering project ever attempted," Laubscher said.
For online information, visit http://www.isr.us/spaceelevatorconference.
"In order to be ready with the required technologies, those scientists and engineers interested in the space elevator must begin now to identify and solve the technical challenges involved in constructing and operating a space elevator. The Second Annual Space-Elevator Conference is being held to discuss these challenges and their solutions."
NASA's Institute for Advanced Concepts (NIAC) granted funds to Dr. Bradley Edwards, ISR's director of research, to investigate the feasibility of designing and building a space elevator. Once relegated to the realm of science fiction, the space elevator is now the subject of scientific research by ISR. The discovery of carbon nanotubes and ongoing work to incorporate them into composites are the key to making the space elevator viable in the future.
Researchers estimate that a space elevator capable of lifting 5-ton payloads every day to low Earth orbits could be operational in 15 years. From this first orbit, the costs to go on the moon, Mars, Venus, or the asteroids would be reduced dramatically. The first space elevator is projected to reduce lift costs immediately to $100 per pound, as compared to current launch costs of $10,000-$40,000 per pound, depending upon destination and choice of rocket-launch system. Additional and larger elevators, built utilizing the first, would allow large-scale manned and commercial activities in space and reduce lift costs even further.
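The projected savings are easy to quantify: at $100 per pound, the elevator would undercut the quoted range of current rocket launch costs by a factor of 100 to 400:

```python
elevator_cost = 100                       # projected $/lb for the first space elevator
rocket_low, rocket_high = 10_000, 40_000  # current launch costs in $/lb, per the article

print(f"{rocket_low // elevator_cost}x to {rocket_high // elevator_cost}x cheaper")
# prints "100x to 400x cheaper"
```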
With so much orbiting clutter, including spent rocket stages, dead or dying satellites, zipping around Earth all the way up to stationary orbit, damage to the space elevator is a worry, Clarke said.
There is also concern, Clarke added, that the heavenly elevator is sure to become a target for terrorism. "We need to remove economic and other grudges. But, of course, you could never cope with total lunatics that could do anything."
It would seem to be a relatively easy thing to damage. Once it is built the first priority ought to be to send up more nanotube ribbon fiber to give the top-end enough material to send down repair ribbon. It might even be wise to have stopping off points where repair ribbon is warehoused to send down more quickly to repair damaged pieces. But if a complete cut is made either intentionally or by a piece of fast-moving space debris then everything below that point is going to come flying down. How big of a splash would that make in the ocean?
All the necessary underlying technology exists, Dr. Edwards said, except the material for the ribbon. (The longest nanotube to date is just a few feet long.) But he said he expected that scientists would develop a strong enough nanotube-polymer composite in a few years.
The initial elevator will carry only cargo. A week-long trip up the elevator would require a special capsule designed to support humans, and the capsule would need to carry a lot of food and other supplies to keep the humans protected. Plus, there are concerns that pockets of high energy particles trapped in the Earth's magnetic field could deliver harmful doses of radiation. Human capsules would need to be shielded and move faster to deal with this problem.
The first big unknown is when nanotube fabrication technology will advance far enough to make the space elevator buildable. The second big unknown is how quickly funding will become available to build it once construction becomes possible.
New ways to identify cells in a precancerous state well before they become numerous and metastasize hold the potential to prevent many cases of cancer which now are not discovered until they reach a fatal state of development. Researchers at MIT's George R. Harrison Spectroscopy Laboratory in the School of Science have just received a $7.2 million grant from the National Institutes of Health (NIH) to develop a method using optical fibers to detect precancerous lesions more accurately, cheaply, quickly, and easily.
Clinical screening programs for cervical and oral precancer are multibillion-dollar industries which currently rely on visual detection of suspicious areas followed by invasive biopsy and microscopic examination. Given that visually identified suspicious areas do not always correspond to clinically significant lesions, spectroscopic imaging and diagnosis could prevent unnecessary invasive biopsies and potential delays in diagnosis.
Furthermore, real-time detection and diagnosis of lesions could pave the way for combined diagnosis and treatment sessions, thus preventing unnecessary follow-up visits.
Michael S. Feld, professor of physics and director of the Spectroscopy Lab, says the laboratory has developed a portable instrument that delivers weak pulses of laser light and ordinary white light from a thin optical fiber probe onto the patient’s tissue through an endoscope. This device analyzes tissue over a region around 1 millimeter in diameter and has shown promising results in clinical studies. It accurately identified invisible precancerous changes in the colon, bladder and esophagus, as well as the cervix and oral cavity.
The second device, which has not yet been tested on patients, can image precancerous features over areas of tissue up to a few centimeters in diameter.
The researchers hope that these new methods, which can provide accurate results in a fraction of a second, may one day replace tissue biopsies in diagnosing certain types of cancers.
Feld predicted that in a couple of years, these devices will lead to a new class of endoscopes and other diagnostic instruments that will allow physicians to obtain high-resolution images. These easy-to-read images will map out normal, precancerous and cancerous tissue the way a contour map highlights elevations in reds, yellows and greens.
The optical fiber probe instrument employs a method called trimodal spectroscopy, in which three diagnostic techniques—light-scattering spectroscopy (LSS), diffuse reflectance spectroscopy (DRS) and intrinsic fluorescence spectroscopy (IFS)—are combined.
IFS provides chemical information about the tissue, LSS provides information about the cell nuclei near the tissue surface and DRS provides structural information about the underlying tissue. The information provided by the three techniques is complementary and leads to a combined diagnosis, though the imaging technique is based on LSS alone.
This brings to mind a different effort aimed at making cancer cells show up with greater contrast versus normal cells. Shuming Nie at Georgia Tech is doing work to develop quantum dot labelling techniques for cancer cells.
Cancer cells have certain characteristics or markers. After targeting and labeling these markers with color-coded quantum dots, Nie's computer-based algorithm converts the optical information into biological data. He then knows which markers are present, as well as their distribution over the surface of a cell. The patterns formed by the optical information may indicate the presence of cancer.
One can imagine how a liquid or paste containing quantum dots could be spread on a target tissue surface such as a cervix as a preparation to enhance the contrast for the spectroscopic device being developed by MIT.
Paul Gourley and colleagues at the US Department of Energy's Sandia National Laboratories have developed a nanolaser technique that can be used to accelerate screening for drugs that protect mitochondria and neurons.
“Our goal is make the brain less susceptible to diseases like Lou Gehrig’s,” says Sandia researcher Paul Gourley, a physicist who grew up in a family of doctors.
Preliminary work thus far has shown the biolaser (which recently won first place in the DOE’s annual Basic Energy Sciences’ competition for using light to quantify characteristics of anthrax spores) is able to measure mitochondrial size through unexpected bursts of light given up by each mitochondrion. The laser, using the same means, can also measure the swelling effect caused by the addition of calcium ions — the reaction thought to be the agent of death for both mitochondria and their host cells. The researchers expect to introduce neuroprotectant drugs into experiments this month, and be able to test hundreds of possible protective substances daily instead of two or three formerly possible.
“If we can use this light probe to understand how mitochondria in nerve cells respond to various stimuli, we may be able to understand how all cells make life or death decisions — a step on the road, perhaps, to longer lives,” says Gourley.
To do that, he says, scientists must understand how a cell self-destructs, which means understanding how mitochondria send out signals that kill cells as well as energize them.
If compounds can be found that protect mitochondria those compounds may protect neurons from the effects of many kinds of mitochondrial dysfunction and perhaps slow the accumulation of damage in mitochondria that occurs with age.
“Cyclosporin protects mitochondria better than anything else known, but it is not a perfect drug,” says Keep. “It has side effects, like immunosuppression. Unrelated drugs may have a similar protective effect on mitochondria. Gourley’s device will lead to a rapid screening device for hundreds of cyclosporin derivatives or even of chemical compounds never tested before.”
While testing with conventional methods would take many people and many batches of mitochondria, says Keep, the nanolaser requires only tiny amounts of mitochondria and drug to test.
“With one tube on the left flowing in a number of mitochondria per second, and microliters of different drugs in different packets flowing in to join them on the right, we could rapidly run through hundreds of different compounds. Each mitochondrion scanned through the analyzer would show if there were a change in its lasing characteristics. That would determine the effectiveness of chemical compounds and identify new and even better neuroprotectants.”
Of course this is just one step in a drug development process. All the compounds to be screened would still have to be synthesized. Also, any compounds which look good using this assay method would still need to be tested in whole cells, lab animals, and eventually humans. But this report is a sign of the times. Techniques that use micro-scale and nano-scale components to speed various laboratory procedures by orders of magnitude represent the future of biological science and biotechnological development.
Dr. Neil A. Williams of the University of Bristol in Britain and colleagues are working on a vaccine, made from a protein found in a strain of E. coli bacteria, that trains the immune system to stop auto-immune responses.
In a study of a strain of mice that naturally develop diabetes, the vaccine, which is being developed with the backing of British biotech company Hunter-Fleming Ltd, reduced the occurrence of the illness from 80 to 15 percent.
Auto-immune responses play a role in rheumatoid arthritis, type I diabetes, multiple sclerosis, allergies, asthma, and a number of other disorders.
The Escherichia coli enterotoxin B subunit that makes up this vaccine is known as ETxB.
The team is using just a transport component of the bacterium's toxin molecule. Called ETxB, the component is separated off from the rest of the protein so there is no chance of a vaccine causing stomach upsets in patients.
The vaccine reduced the incidence of type 1 diabetes in mice strains prone to the disease from 80 per cent to 15 per cent. Arthritic mice show similar benefits. Williams is now collaborating with the pharmaceutical company Hunter Fleming to conduct the first human trials, which should begin in six months.
A model of how ETxB stops auto-immune response is available here.
This is pretty special. If a single vaccine could eliminate or even just substantially reduce the frequency of a wide range of auto-immune disorders the benefits would be enormous.
A ceramic material reinforced with carbon nanotubes has been made by materials scientists at UC Davis. The new material is far tougher than conventional ceramics, conducts electricity and can both conduct heat and act as a thermal barrier, depending on the orientation of the nanotubes.
Ceramic materials are very hard and resistant to heat and chemical attack, making them useful for applications such as coating turbine blades, said Amiya Mukherjee, professor of chemical engineering and materials science at UC Davis, who leads the research group. But they are also very brittle.
The researchers mixed powdered alumina (aluminum oxide) with 5 to 10 percent carbon nanotubes and a further 5 percent finely milled niobium. Carbon nanotubes are sheets of carbon atoms rolled up into tiny hollow cylinders. With diameters measured in nanometers -- billionths of a meter -- they have unusual structural and conducting properties.
The researchers (postdoctoral scholar Guodong Zhan, graduate students Joshua Kuntz and Javier Garay, and Mukherjee) treated the mixture with an electrical pulse in a process called spark-plasma sintering. This process consolidates ceramic powders more quickly and at lower temperatures than conventional processes.
The new material has up to five times the fracture toughness -- resistance to cracking under stress -- of conventional alumina.
"It's a lot more forgiving under service application when you have a dynamic load," said Mukherjee.
The material shows electrical conductivity ten trillion times greater than pure alumina, and seven times that of previous ceramics made with nanotubes. It also has interesting thermal properties, conducting heat in one direction, along the alignment of the nanotubes, but reflecting heat at right angles to the nanotubes, making it an attractive material for thermal barrier coatings, Mukherjee said.
The work is published in the August issue of Applied Physics Letters.
Certainly newer and better materials can be expected to lower the costs of building or operating a variety of types of equipment, structures, and means of transportation. But what seems more exciting to this commentator is the question of what structures and new ground, air, or space vehicles each new materials advance might make possible. Can carbon nanotubes make ceramics and other materials strong enough to make hypersonic scramjet space launch vehicles feasible some day? They are tougher and better at conducting and reflecting heat. Perhaps they will help.
Paul Allen is ponying up $100 million to map all the genes that are activated in mouse brain cells in 3 to 5 years.
Microsoft co-founder Paul Allen has donated $100 million to launch a private research organization in Seattle devoted to deciphering the links between our genes and our brain.
Dr. Thomas Insel, director of the National Institute of Mental Health, says that approximately 6,000 genes are thought to be expressed only in brain with many more that are expressed in the brain and also in other parts of the body as well.
Insel and his team at the NIH have been working on a gene map of the mouse brain since 1999 and hope to soon publish the location of several hundred genes. Allen's project, Insel noted, is on a much larger scale aiming to identify 10,000 genes a year.
Note that Allen is therefore accelerating work on identifying genes active in the brain by at least an order of magnitude over that of the NIMH Brain Molecular Anatomy Project.
While the anatomy project can analyze 600 to 800 genes a year, Dr. Boguski's team is shooting for about 10,000 genes a year.
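The claimed speedup is easy to sanity-check. A quick back-of-the-envelope calculation using the figures quoted above (taking the midpoint of the 600-to-800 genes/year range is my own simplification):

```python
# Rough comparison of gene-mapping rates from the figures in the article.
nimh_rate = 700      # genes/year, midpoint of the NIMH project's 600-800 range
allen_rate = 10_000  # genes/year, the Allen Brain Atlas target

speedup = allen_rate / nimh_rate
print(f"Allen project speedup: ~{speedup:.0f}x")  # roughly an order of magnitude
```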
Because mice and humans have 99 percent of the same genes, scientists hope the map of the mouse brain will provide a template for comparison with the human brain.
Insel says these genes would all have been identified over the next couple of decades, but that Allen's money will compress the discovery time down to only a few years. Well then, hurray for Paul Allen!
Once all the genes expressed in the brain are identified, the bigger job of figuring out how each affects the brain will still remain.
"It's like opening a box filled with parts to build two tables and there are 30,000 parts and no instructions. There is no map," says Mark Boguski, a longtime genomics researcher who is the senior director of the Allen Brain Atlas team. "We have to figure out which are for the brain, and then we have to figure out how they are put together or what they do."
Still, just knowing which genes are expressed in the brain will let the next step proceed much more quickly. Being able to compare people who carry different variations of brain genes will lead to much more rapid identification of the genetic variations that affect intelligence, personality type, tendencies toward specific forms of behavior, and susceptibility to a large assortment of neurological and mental disorders such as Alzheimer's Disease, Parkinson's Disease, depression, and anxiety.
This first effort by the Allen Institute for Brain Science is known as the Allen Brain Atlas project.
The first endeavor of the Allen Institute for Brain Science is the Allen Brain Atlas project, the planning for which has been underway for two years. For decades, scientists have been eager for an intense, focused effort to develop a compendium of information that could serve as a foundation for general brain research. Instead of researching genes one at a time, the Allen Brain Atlas project will give scientists an unprecedented view of that portion of the genome that is active in the brain.
Why build a brain atlas? The human brain has been an object of mystery and wonder since antiquity. It defines who we are as a species and as individuals—our emotions, thoughts and desires—and controls many of the body’s essential, but unconscious functions, such as breathing and heart rate.
Our understanding of how the brain is organized and how it works is still in the very early stages. Basic processes of memory and cognition remain a mystery. While it is estimated that the human brain contains a trillion different nerve cells or neurons, capable of making up to a thousand different connections each, scientists don’t know how many subtypes of neurons exist, how they are linked up in circuits, or how they work.
Despite more than a century of research, classical neuroanatomists still cannot agree on the boundaries of different brain regions or even their names. In some regions of the brain, there is such fundamental disagreement about mapping regional boundaries that it is almost like comparing maps of Western Europe from 100 years ago to today. An accurate, definitive map is of utmost importance if we want to develop new therapies for neurological disorders such as Alzheimer’s, schizophrenia, depression, and addiction, or simply to understand the essence of what makes us human.
This project demonstrates the value of "Big Science" funding in molecular biology and genetics to achieve major goals. Note that DNA double helix co-discoverer James Watson is serving as one of Allen's advisors on this Brain Atlas project. Watson is advocating a Manhattan Project style effort to map all the genes expressed in each type of cancer in order to rapidly develop far more effective treatments for cancer. Watson thinks his proposed project could be done for a few hundred million dollars. The instrumentation and techniques for DNA sequencing and for measuring gene expression with gene arrays have become fast enough that such ambitious projects can now be completed within a few years and deliver substantial benefits fairly rapidly.
Princeton University researchers have developed techniques that may finally make organic photovoltaics cheaper than existing silicon-based photovoltaic solar cells.
PRINCETON, N.J. -- Princeton electrical engineers have invented a technique for making solar cells that, when combined with other recent advances, could yield a highly economical source of energy.
The results, reported in the Sept. 11 issue of Nature, move scientists closer to making a new class of solar cells that are not as efficient as conventional ones, but could be vastly less expensive and more versatile. Solar cells, or photovoltaics, convert light to electricity and are used to power many devices, from calculators to satellites.
The new photovoltaics are made from "organic" materials, which consist of small carbon-containing molecules, as opposed to the conventional inorganic, silicon-based materials. The materials are ultra-thin and flexible and could be applied to large surfaces.
Organic solar cells could be manufactured in a process something like printing or spraying the materials onto a roll of plastic, said Peter Peumans, a graduate student in the lab of electrical engineering professor Stephen Forrest. "In the end, you would have a sheet of solar cells that you just unroll and put on a roof," he said.
Peumans and Forrest cowrote the paper in collaboration with Soichi Uchida, a researcher visiting Princeton from Nippon Oil Co.
The cells also could be made in different colors, making them attractive architectural elements, Peumans said. Or they could be transparent so they could be applied to windows. The cells would serve as tinting, letting half the light through and using the other half to generate power, he said.
Because of these qualities, researchers have pursued organic photovoltaic films for many years, but have been plagued with problems of efficiency, said Forrest. The first organic solar cell, developed in 1986, was 1 percent efficient -- that is, it converted only 1 percent of the available light energy into electrical energy. "And that number stood for about 15 years," said Forrest.
Forrest and colleagues recently broke that barrier by changing the organic compounds used to make their solar cells, yielding devices with efficiencies of more than 3 percent. The most recent advance reported in Nature involves a new method for forming the organic film, which increased the efficiency by 50 percent.
Researchers in Forrest's lab are now planning to combine the new materials and techniques. Doing so could yield at least 5 percent efficiency, which would make the technology attractive to commercial manufacturers. With further commercial development, organic solar devices would be viable in the marketplace with 5 to 10 percent efficiency, the researchers estimated. "We think we have a pathway for using this and other tricks to get to 10 percent reasonably quickly," Forrest said.
By comparison, conventional silicon chip-based solar cells are about 24 percent efficient. "Organic solar cells will be cheaper to make, so in the end the cost of a watt of electricity will be lower than that of conventional materials," said Peumans.
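The efficiency numbers quoted above fit together neatly. Applying the 50 percent relative improvement from the new film method to the roughly 3 percent baseline (the article does not state the combined figure explicitly, so this is my own arithmetic):

```python
# Efficiency milestones quoted in the article.
first_cell = 1.0          # percent, the 1986 organic cell
new_compounds = 3.0       # percent, after switching organic compounds
film_method_gain = 0.50   # 50% relative improvement from the new film-forming method

estimated = new_compounds * (1 + film_method_gain)
print(f"Estimated combined efficiency: ~{estimated:.1f}%")  # ~4.5%, near the 5% target

silicon = 24.0            # percent, conventional silicon cells
print(f"Fraction of silicon-cell efficiency: {estimated / silicon:.2f}")  # ~0.19
```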
The technique the researchers discovered also opens new areas of materials science that could be applied to other types of technology, the researchers said. Solar cells are made of two types of materials sandwiched together, one that gives up electrons and another that attracts them, allowing a flow of electricity. The Princeton researchers figured out how to make those two materials mesh together like interlocking fingers so there is more opportunity for the electrons to transfer.
The key to this advance was to apply a metal cap to the film of material as it is being made. The cap allowed the surface of the material to stay smooth and uniform while the internal microstructure changed and meshed together, which was an unexpected result, said Forrest. The researchers then developed a mathematical model to explain the behavior, which will likely prove useful in creating other micromaterials, Forrest said.
"We've shown a very new and general process for reorganizing the morphology of materials and that was really unanticipated," Forrest said.
The research was supported by grants from the Air Force Office of Scientific Research, the National Renewable Energy Laboratory and the Global Photonic Energy Corp.
Some day advances in fabrication techniques will lower the cost of making photovoltaic solar cells enough that it will become cost-effective to use them to generate a substantial portion of our electric power. The big question is: when will this happen?
Lehigh University environmental engineer Wei-xian Zhang has developed techniques to use iron nanoparticles to destroy dangerous organic compounds in soil and to neutralize toxic heavy metals in soil.
Iron's cleansing power stems from the simple fact that it rusts, or oxidizes, explains Zhang. Ordinarily, of course, the only result is the familiar patina of brick-red iron oxide. But when metallic iron oxidizes in the presence of contaminants such as trichloroethene, carbon tetrachloride, dioxins, or PCBs, he says, these organic molecules get caught up in the reactions and broken down into simple carbon compounds that are far less toxic.
Likewise with dangerous heavy metals such as lead, nickel, mercury, or even uranium, says Zhang: The oxidizing iron will reduce these metals to an insoluble form that tends to stay locked in the soil, rather than spreading through the food chain. And, iron itself has no known toxic effect--just as well, considering the element is abundant in rocks, soil, water, and just about everything else on the planet. Indeed, says Zhang, for all those reasons, many companies now use a relatively coarse form of metallic iron powder to purify their industrial wastes before releasing them into the environment.
Unfortunately, says Zhang, these industrial reactors aren't much help with the pollutants that have already seeped into the soil and water. That's the beauty of the nanoscale iron particles. Not only are they some 10 to 1000 times more reactive than conventional iron powders, because their smaller size collectively gives them a much larger surface area, but they can be suspended in a slurry and pumped straight into the heart of a contaminated site like an industrial-scale hypodermic injection. Once there, the particles will flow along with the groundwater to work their decontamination magic in place--a vastly cheaper proposition than digging out the soil and treating it shovelful by shovelful, which is how the worst of the Superfund sites are typically handled today.
In that sense, says Zhang, nanoscale iron is similar to in situ biological treatments that use specialized bacteria to metabolize the toxins. But unlike bacteria, he says, the iron particles aren't affected by soil acidity, temperature, or nutrient levels. Moreover, because the nanoparticles are between 1 and 100 nanometers in diameter, which is about ten to a thousand times smaller than most bacteria, the tiny iron crystals can actually slip in between soil particles and avoid getting trapped.
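The reactivity claim follows from simple geometry: for uniform spheres, surface area per unit mass scales inversely with diameter, so shrinking the particles a thousandfold multiplies the reactive surface a thousandfold. A quick sketch (the specific 50 nm and 50 micron diameters are illustrative assumptions, not figures from the article):

```python
# Specific surface area of spherical iron particles scales as 1/diameter,
# which is why nanoscale powders are so much more reactive per gram.
RHO_IRON = 7874.0  # density of iron, kg/m^3

def specific_surface_area(diameter_m: float) -> float:
    """Surface area in m^2 per kg of iron, assuming uniform spheres."""
    return 6.0 / (RHO_IRON * diameter_m)

nano = specific_surface_area(50e-9)    # a 50 nm nanoparticle
coarse = specific_surface_area(50e-6)  # a 50 micron conventional powder grain
print(f"nano: {nano:.0f} m^2/kg, coarse: {coarse:.2f} m^2/kg")
print(f"ratio: {nano / coarse:.0f}x more surface per gram")
```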
Laboratory and field tests have confirmed that treatment with nanoscale iron particles can drastically lower contaminant levels around the injection well within a day or two, and will all but eliminate them within a few weeks--reducing them so far that the formerly polluted site will now meet federal groundwater quality standards. The tests also show that the nanoscale iron will remain active in the soil for 6 to 8 weeks, says Zhang, or until what's left of it dissolves in the groundwater. And after that, of course, it will be essentially undetectable against the much higher background of naturally occurring iron.
Finally, says Zhang, the cost of the nanoscale iron treatments is not nearly as big a barrier as it was in 1995, when he and his colleagues first developed a chemical route for making the particles. Then the nanoscale iron cost about $500 a kilogram; now, it's more like $40 to $50 per kilogram. (Decontaminating an area of about 100 square meters using a single injection well requires 11.2 kilograms.)
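At the quoted prices, the material cost for a single injection well works out to a few hundred dollars (my own arithmetic from the article's figures):

```python
# Material cost for one injection well, using the figures quoted in the article.
iron_needed_kg = 11.2  # kg of nanoscale iron per ~100 square meter site
price_1995 = 500.0     # $/kg when the particles were first developed
price_low, price_high = 40.0, 50.0  # $/kg now

cost_1995 = iron_needed_kg * price_1995
print(f"1995 material cost: ${cost_1995:,.0f}")  # $5,600
print(f"current material cost: ${iron_needed_kg * price_low:,.0f}"
      f"-${iron_needed_kg * price_high:,.0f}")   # $448-$560
```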
United States federal "Superfund" clean-up costs for polluted sites run over $1 billion per year and additional money is spent by state governments and private interests. Other countries face similar problems. Superfund costs are expected to continue for years to come. This technique holds the promise of much lower cost and even more effective clean-up of polluted sites.
What would be more exciting and potentially much more beneficial for human health is a way to clean up organic pollutants that concentrate in fish. In particular, I'd love to see a nanotech solution to the problem of PCB build-up in farmed salmon.
Seven of ten farmed salmon purchased at grocery stores in Washington DC, San Francisco, and Portland, Oregon were contaminated with polychlorinated biphenyls (PCBs) at levels that raise health concerns, according to independent laboratory tests commissioned by Environmental Working Group.
These first-ever tests of farmed salmon from U.S. grocery stores show that farmed salmon are likely the most PCB-contaminated protein source in the U.S. food supply. On average farmed salmon have 16 times the dioxin-like PCBs found in wild salmon, 4 times the levels in beef, and 3.4 times the dioxin-like PCBs found in other seafood. The levels found in these tests track previous studies of farmed salmon contamination by scientists from Canada, Ireland, and the U.K. In total, these studies support the conclusion that American consumers nationwide are exposed to elevated PCB levels by eating farmed salmon.
The problem is coming from their food. I'm guessing that iron nanoparticles would be both too expensive and too indiscriminately destructive if applied to the feedstock used for farmed salmon. PCB concentration may even be a problem for some wild Sockeye salmon.
The farmed fish industry needs to grow because ocean fish are being depleted even as the demand for fish looks set to grow enormously as the health benefits of omega-3 fatty acids become more widely known. As fish go, salmon is otherwise an attractive choice: salmon are an excellent omega-3 fatty acid source and do not appear to concentrate mercury. So a cheap way to eliminate PCBs from farmed salmon feedstock would be great.
The problem is bioaccumulation - the build-up of contaminants in creatures at the top of the food chain. The North Pacific contains about 1 nanogram of PCBs per litre. By the time the average salmon has finished bulking up for its journey, its fat contains about 160 micrograms, Blais and co-workers report.
Incredibly low concentrations of a pollutant in the environment can be concentrated enormously by the food chain.
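The concentration factor implied by the quoted numbers is striking. The article gives the seawater figure per litre and the fat figure as simply "160 micrograms"; assuming that means per kilogram of fat (an assumption on my part, since the excerpt does not say) and treating a litre of seawater as roughly a kilogram:

```python
# Implied PCB bioaccumulation factor from the quoted figures.
# ASSUMPTION: the 160 microgram figure is per kg of fat; the article excerpt
# does not state the denominator.
seawater_ng_per_kg = 1.0   # 1 ng/L of seawater, with 1 L ~ 1 kg
fat_ng_per_kg = 160e3      # 160 micrograms/kg = 160,000 ng/kg

factor = fat_ng_per_kg / seawater_ng_per_kg
print(f"Implied concentration factor: ~{factor:,.0f}x")
```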
In the January 2003 issue of Pediatrics researchers from the Institute for Nutrition Research of the University of Oslo in Norway reported that supplementation with the omega-3 fatty acid docosahexaenoic acid (DHA) boosted the intelligence of infants.
We received dietary information from 76 infants (41 in the cod liver oil group and 35 in the corn oil group), documenting that all of them were breastfed at 3 months of age. Children who were born to mothers who had taken cod liver oil (n = 48) during pregnancy and lactation scored higher on the Mental Processing Composite of the K-ABC at 4 years of age as compared with children whose mothers had taken corn oil (n = 36; 106.4 [7.4] vs 102.3 [11.3]). The Mental Processing Composite score correlated significantly with head circumference at birth (r = 0.23), but no relation was found with birth weight or gestational length. The children's mental processing scores at 4 years of age correlated significantly with maternal intake of DHA and eicosapentaenoic acid during pregnancy. In a multiple regression model, maternal intake of DHA during pregnancy was the only variable of statistical significance for the children's mental processing scores at 4 years of age. CONCLUSION: Maternal intake of very-long-chain n-3 PUFAs during pregnancy and lactation may be favorable for later mental development of children.
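To put the 106.4 vs 102.3 score difference in perspective, a standard effect-size calculation (Cohen's d, using the means, standard deviations, and group sizes quoted in the abstract; the calculation itself is mine, not the authors'):

```python
# Cohen's d for the K-ABC Mental Processing Composite scores quoted above:
# cod liver oil group 106.4 (SD 7.4, n=48) vs corn oil group 102.3 (SD 11.3, n=36).
import math

m1, s1, n1 = 106.4, 7.4, 48   # cod liver oil group
m2, s2, n2 = 102.3, 11.3, 36  # corn oil group

pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
d = (m1 - m2) / pooled_sd
print(f"Cohen's d ~ {d:.2f}")  # ~0.44, a small-to-moderate effect
```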
This result supports a previous report arguing for a link between breast-feeding as a source of DHA and intelligence.
The importance of omega-3 fatty acids for brain development is getting support from a variety of quarters. A just-announced study by USC psychology professor Adrian Raine and colleagues found that supplementing the diets of poor children in Mauritius with higher quality food, including fish for omega-3 fatty acids, reduced their rate of committing crimes when they got older.
The research, published in this month's American Journal of Psychiatry, involved 100 Mauritian children and a control group of around 350 not put through the programme. EEGs - scans of brain electrical activity - taken at age 11 found heightened activity compared to their peers, and in later follow-up the treated children were less likely to have criminal records and 35 per cent less likely to report having engaged in some criminal activity and got away with it.
The most striking effects were observed in those most malnourished when they started the programme, Raine said, suggesting that the diet - unusually rich in fish - could be the crucial element.
Raine thinks the omega-3 fatty acids may be responsible for the difference in behavior. Note that he also foresees a day when surgery might be used to correct prefrontal lobe defects that prevent people from controlling their impulses that cause them to commit crime!
Raine also says that we can't ignore biological and genetic causes of mental illness.
Raine cautioned, however, that there does appear to be a strong genetic component to schizophrenia that shouldn't be discounted. "Pushing biology and genetic issues under the carpet isn't going to help society in the long run," he said. The good nutrition and educational programs early in life might at least delay the onset of mental illness in some people, he added.
Raine was involved in earlier research that found less grey matter in the prefrontal lobes of violent criminals.
Researchers writing in this month’s Archives of General Psychiatry have found that men with antisocial personality disorder, a condition characterized by violence and criminal behavior, have 11 percent less gray matter than normal men in a part of the brain called the prefrontal cortex.
First, the prefrontal cortex is responsible for self-restraint and deliberate foresight. If this part of the brain were damaged, one effect would be a tendency to act on every impulse without thinking ahead or considering the consequences. Second, the prefrontal cortex is important for learning conditioned responses. This area of the brain has been thought to be central to a child's ability to learn remorse, conscience, and social sensitivity (7). If the prefrontal cortex were to function abnormally, how is the child supposed to learn to have a conscience? For example, one study reported that children who suffered damage to their prefrontal cortex before the age of seven developed abnormal social behavior, characterized by an inability to control their aggression and anger (2). Finally, Raine suggests that if prefrontal deficits underlie the APD group's low levels of autonomic arousal, these people may unconsciously be trying to compensate through stimulation-seeking (5).
So does a diet deficient in omega-3 fatty acids lead to poor development of the prefrontal lobe and hence to both lower intelligence and more criminality?
The results about the Mauritian children follow on the heels of another recent study that found improving the nutrition of prisoners decreased prison violence.
A few months ago, C. Bernard Gesch of Oxford University and coworkers reported in the British Journal of Psychiatry that vitamin-mineral-essential fatty acid supplements appeared capable of dampening violence in a prison population (Psychiatric News, October 2, 2002). However, J.S. Zil, M.D., J.D., chief forensic psychiatrist of the State of California Department of Corrections, told Psychiatric News that he was skeptical of their results. To which Gesch replied: "I don’t feel that Dr. Zil’s cynicism is a problem. It’s only natural to be cautious about such provocative findings."
"About one in a million T-cells holds latent HIV that the antiretroviral drugs can't touch," said Zack, a professor of medicine and vice chair of microbiology, immunology and molecular genetics at the David Geffen School of Medicine at UCLA. "Our challenge was to make latent HIV vulnerable to treatment without harming healthy cells."
The UCLA researchers created a model using mice specially bred without immune systems. The team implanted the mice with human thymus tissue and then infected the tissue with HIV. The mice responded by producing human T-cells infected with latent HIV.
Zack and Brooks next used a two-step approach to expose and destroy latent HIV. First, they stimulated the T-cells strongly enough to prompt the cell to express latent virus but not to trigger other cellular functions. This revealed the hidden HIV.
Second, they used a new weapon called an immunotoxin — an anti-HIV antibody genetically fused with a bacterial toxin — to target and kill only the T-cells infected with HIV.
"The immunotoxin functions like a smart bomb — the antibody is the missile guidance system and the toxin is the explosive," Zack said. "When the T-cell switches on and starts expressing virus, the antibody binds to the surface of the T-cell, forcing the toxin into the cell and killing it. This prevents the cell from making more virus."
"The beauty of this approach is that it doesn't destroy healthy T-cells — only the ones hiding virus," Brooks said.
Prior to the UCLA discovery, scientists needed to over-stimulate T-cells to force them to express latent virus. This ran the risk of harming the patient by impairing the entire immune system. In contrast, the UCLA model exposed and killed hidden HIV without affecting the rest of the immune system. The T-cells in the UCLA model also did not divide, indicating that they were able to produce virus without behaving as if they were confronting a foreign particle.
"In our mouse model, the two-step approach cleared out nearly 80 percent of the latently infected T-cells," said Zack. "No one has ever been able to achieve this before. We hope that the strategy we've proven effective in the lab will show similar success in people."
This technique still must undergo a lot of development before it is ready for use in humans. One difficulty will be calibrating exactly how strongly to stimulate human T-cells so as to get only the desired response. Given that lab animals are less genetically variable than humans, finding the correct level of stimulation may have to be done separately for each patient. There are also plenty of other factors that could make the results harder to duplicate in humans. Still, this is a clever technique.
Humans suffer from a number of other chronic viral infections including oral and genital herpes and various forms of viral hepatitis. The ability to eliminate chronic viral infections would be great for the wider population as well. However, this model doesn't really work for them since this model is specific to T-cells which HIV infects. Still, it does not seem unreasonable to expect that ways will also be found to bring viruses out of hiding in other cell types.
Dr. Linda Partridge and colleagues at University College London have discovered that the life-extending effects of calorie restriction in fruit flies are remarkably short-lived.
In a detailed demographic analysis of life and death among 7,492 fruit flies, published today in Science magazine, Dr. Partridge and her colleagues discovered that the protective effect of dieting snaps into place within 48 hours, whether the diet starts early in life or late. Flies that dieted for the first time in middle age were the same as flies that had been dieting their whole lives. But the effect can be lost just as quickly. Flies that dieted their entire lives and then switched, as adults, to eating their fill were the same two days later as flies that had never dieted.
It had been thought that calorie restriction must increase life expectancy by slowing the gradual accumulation of damage, so the length of time on the diet was expected to determine its total life-extending effect. Yet at least in fruit flies there appear to be no lasting benefits once the flies are taken off the diet. This is surprising.
"If this works in humans, then it means that from the time a person starts on a restricted diet, they'll be like individuals of the same age who were always on that diet. Their prospects of survival are the same."
Researcher Robert Plomin has found, from a study of twins in Britain, that the gene variants that cause learning disabilities are the same variants responsible for normal variation in intelligence.
Research from the largest study of twins ever conducted in the UK shows that genetic influences on common learning disabilities are not specific to each disorder.
Professor Robert Plomin of the Institute of Psychiatry, King’s College London today presents evidence on common learning disabilities to the BA Festival of Science at the University of Salford, Greater Manchester.
The Twins Early Development Study (TEDS) compares identical and non-identical twins born in England in 1994-6 and latest findings are from year-long assessments by teachers of reading and maths at age seven.
Researchers found that genes involved in common learning disabilities are generalists in three ways. First, the genes that affect these disabilities are the same genes responsible for normal variation in learning abilities. Second, genes are not specific to one aspect of a learning disability but are general to many aspects of the disorder. Third, genes affecting one learning disability also affect others.
Professor Plomin says: ‘Although simple genetic anomalies can lead to specific syndromes, most common problems such as language and reading problems are caused by a range of genetic and environmental risk factors. Many of these causal factors overlap in their effects on different disorders.’
What does this mean? Many learning disabilities may simply be the result of inheriting too many intelligence lowering variations of different genes that contribute to determining mental abilities.
The study also shows that genes that affect common learning disabilities are also responsible for normal variation in learning abilities. "The abnormal is normal - what we call abnormal is merely the low end of the same genetic and environmental factors responsible for normal variation."
"We found the same genes responsible for ability and disability," he said. "In one sense abnormal is normal. There are no disabilities, just distributions of genes."
This result supports the argument that the general measure of intelligence known as 'g' has a biological foundation. Many of the genes that affect intellectual ability affect ability throughout the brain.
"There is a general set of genes that operates in the brain to affect all learning processes," he said.
University of Colorado aging researcher Tom Johnson takes exception to U Cambridge biogerontologist Aubrey de Grey's argument that life extension will make people more risk averse.
"Look at who dies in accidents now," he said. "It's people in their twenties, who already have the most to lose."
That's because people don't become cautious until they feel the first tinges of mortality in their joints, he reasons.
"And if you feel fairly youthful at the age of 100, you're more likely to go bungee jumping and sky-diving," he said.
But is this simply because those twenty somethings feel young? Or are they lacking in the experience and wisdom that comes with age that teaches people not to act so crazy?
Back in 1999 I predicted that, once we cure aging, driving (even on the ground!) will be outlawed as too dangerous to others. Remember also that when we have so many more years ahead of us, we won't need to be in such a hurry all the time, so flying cars would only be for recreation anyway.
My own guess is that Aubrey is atypical in the intensity of his desire to avoid death. If he weren't, there would be a lot more advocates of a massive effort to reverse aging than there currently are, and more people would make it the chief goal of their lives. Though perhaps many would rethink their views if they knew they could entirely avoid aging and achieve engineered negligible senescence.
So will people choose to live less risky lives once we can stop and reverse aging? That depends on human nature. Some people are thought to have an innate urge for sensation-seeking and to be risk-takers by nature. One possible explanation involves differences in cortisol levels. Another points to variations in the dopamine receptor genes DRD2 and DRD4 as a cause of dangerous thrill-seeking behavior. However, that report has been contradicted by later studies that failed to confirm the link.
We are still in the early days for discovering the genetic factors that affect behavior. But it seems likely that there are underlying genetic causes of differences in the desire to engage in highly risky and thrilling behaviors. Once those causes are discovered it is almost a certainty that drugs and other therapies will be devised for modifying human personalities to make a person have a greater or lesser desire to engage in dangerous activities. So this brings us back to the question of what people will do once they have youthful life expectancies that are, for all intents and purposes, of an indefinitely long duration. Whether those who currently are risk-averse will become even more risk-averse and whether the risk-takers will become risk averse depends heavily on this basic question: What kinds of personalities will people choose to give themselves once they are able to make enduring changes to their personalities?
Your guess is as good as mine. What do you think? Will people choose to become risk averse and give up driving and flying? Or will the timid chartered accountants of the world decide to fulfill their dreams to become lion-tamers by having their personalities altered so that they can be fearless in the face of a dozen lions propped up on circus stands in the big top?
But to me, the interesting question is this: assume you can make state-sponsored terrorism extinct. Then what kind of terrorism will survive?
I think that survivors will be splinter groups, rogue operations, and individuals. As of now, that is less of a threat than a large network. But the power of the individual keeps increasing as technology increases. Eventually, we are going to have to develop the capability to identify and thwart a lone terrorist with no connections to anyone.
A lot of civil libertarians see an increasing danger from technological advances that enable greater surveillance of people by their governments. What they fail to address is the problem that Arnold Kling alludes to: the danger from the lone individual who will be able to use advances in technology to kill increasingly larger numbers of people in a single act.
As I've argued in the past, a basic question about technological advances in the future is whether technological advances will favor the defensive or the offensive under scenarios where the attackers are small groups of people or individuals.
The basic question that any debate about the future dangers of technology has to answer is whether the net effect of likely technological advances in the 21st century will favor the offensive or the defensive. Optimists assume that the kinds of dangers generated by technological advances will be offset by even greater abilities to create systems to protect us from these dangers. But that assumption cannot be proven, and there are very plausible arguments against it.
In his excellent 1984 book The Pursuit of Power: Technology, Armed Force and Society Since A.D. 1000 historian William H. McNeill explored the history of technological changes as they affected the ability to conduct offensive and defensive operations. At different periods of history a succession of technological advances shifted the balance between offensive and defensive and in the process changed the nature of warfare and the structure of societies. In the 21st century we are facing technological changes that will dwarf in their effects all previous technological changes put together. It is worth asking whether the coming technological advances will have a net effect of making civilization easier or harder to defend. My own view is that these advances will make civilization harder to defend.
If we are going to be faced with growing threats from terrorism due to technological advances that make it easier to launch terrorist attacks of enormous lethality, is there anything we can do about it? As I see it, there are only two major counters that can be used to sustain a defense in the long run:
Either offensive actions have to be watched for at the individual level, just as governments now watch each other, or we have to change human motives using biotech so that there will be no outliers who have a desire to kill large numbers of people.
Such extreme measures are neither necessary nor possible today. Rather, less extreme measures (e.g. the overthrow of the North Korean regime) can buy us a couple of decades of delay before the risk becomes much greater. But eventually technological advances will make it too easy for lone individuals or small groups to make and deliver weapons of mass destruction.
A couple of years ago two Brookhaven National Laboratory scientists developed bacteria to recover methane from coal in a more environmentally friendly manner.
NEW YORK, NY — Scientists at the U.S. Department of Energy’s Brookhaven National Laboratory are exploring the use of bacteria to increase the recovery of methane, a clean natural gas, from coal beds, and to decontaminate water produced during the methane-recovery process.
Methane gas, which burns without releasing sulfur contaminants, is becoming increasingly important as a natural gas fuel in the U.S. But the process of recovering methane, which is often trapped within porous, unrecovered or waste coal, produces large amounts of water contaminated with salts, organic compounds, metals, and naturally occurring radioactive elements. “Our idea is to use specially developed bacteria to remove the contaminants from the wastewater, and also help to release the trapped methane,” says Brookhaven chemist Mow Lin.
Lin’s team has developed several strains of bacteria that can use coal as a nutrient and adsorb or degrade contaminants. They started with natural strains already adapted to extreme conditions, such as the presence of metals or high salinity, then gradually altered the nutrient mix and contaminant levels and selected the most hardy bugs (see details).
In laboratory tests, various strains of these microbes have been shown to absorb contaminant metals, degrade dissolved organics, and break down coal in a way that would release trapped methane. The use of such microbe mixtures in the field could greatly improve the efficiency and lower the associated clean-up costs of coal-bed methane recovery, Lin says.
This latest report suggests these scientists are still pursuing this line of work. The potential benefits are considerable. The United States has more energy in coal than Saudi Arabia has in oil.
Over half of the electricity produced in the United States is generated by coal-based power plants. Coal is affordable. Supplies are plentiful. And, the United States possesses 275 billion tons of recoverable coal reserves, or about one-fourth of the world's total.
U.S. coal reserves are equivalent to four times the oil of Saudi Arabia, 1.3 times the oil of OPEC and equal to all the world's proved oil reserves.
The development of environmentally friendly and cheaper ways to use coal for more purposes holds out the hope of considerably reducing US dependence on Middle Eastern oil and, by doing so, improving America's strategic position in a number of ways. A complete elimination of US dependence on foreign oil would provide a number of benefits for the United States.
Methods to more cheaply extract oil from US oil shale or Canadian oil sands would have most of the same set of benefits though in the case of the Canadian oil sands some of the economic benefits would of course flow to Canada rather than to the US. Still, the resulting lower world oil prices and reduction in the need for defense spending would yield substantial benefits for the US economy as well.
In my view there are compelling reasons of grand national strategy for the US government to push the development of a broad range of technologies to provide cost-competitive replacements for oil. That the US government has been and continues to be willing to spend hundreds of billions per year on national security and yet so little on meaningful energy research seems unwise when we consider that a substantial portion of defense and even foreign aid spending is due to the presence of so much oil in the Middle East.
Look at it this way: some day methods to extract energy from coal, oil shale, and oil sands will be found. Why not make that day come sooner? Some day methods to make orders of magnitude cheaper photovoltaics by using nanotechnology fabrication methods and materials will be developed. Why not make that day come sooner too? Some day we will have lithium polymer batteries light enough and sufficiently long lasting to use for powering cars. Again, why not make that day come sooner as well? Similar arguments could be made for new nuclear reactor designs that would be cheaper and safer and that would produce far less nuclear waste and far less material useful for making nuclear bombs. Ditto for a wide range of other energy-related technologies. US national security and US living standards would be improved by the development of these technologies and the development costs would be repaid many times over.
Update: Some may wonder whether we should look for ways to shift to coal for a greater portion of our energy consumption, given that coal burning generates more carbon dioxide per unit of energy generated than other fossil fuel energy sources. It is still debatable whether the build-up of carbon dioxide in the atmosphere will be a net detriment or benefit to humanity. But even if it becomes clear at some point in the future that the build-up will have to be stopped and perhaps even reversed, this does not mean that fossil fuel consumption will necessarily have to be stopped. Dan Giammar, Ph.D., assistant professor of civil engineering at Washington University in Saint Louis, is studying ways to sequester carbon dioxide deep underground by bonding it with silicate minerals in solid form.
"If you make more of it (carbon dioxide), you're going to have to do something with it," said Giammar. "Storing and sequestering is a good option."
Giammar's research may lead to not only storage but also permanent sequestration of carbon dioxide. He has found that when combined with silicate minerals containing either calcium, magnesium, or iron, carbon dioxide will precipitate, or change, into a carbonate solid.
"If you just have gaseous carbon dioxide stored underground, it becomes problematic when you think about leakage. But the carbonate mineral is a solid. It can't leak."
If carbon dioxide were injected into deep saline aquifers, several reactions would occur. The minerals would begin to dissolve as the pH of the saltwater became more acidic. The porosity of the rock would increase, allowing for the addition of more carbon dioxide. Eventually, carbonate solids would precipitate. This last phase is the most important in this model.
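The sequence described above can be sketched with illustrative reactions. Forsterite, a magnesium silicate, is used here only as a stand-in; the actual mineralogy at a given injection site would determine the specific species involved:

```latex
% CO2 dissolves into the brine and acidifies it:
\mathrm{CO_2 + H_2O \rightarrow H_2CO_3 \rightarrow H^+ + HCO_3^-}
% The acid dissolves silicate minerals, releasing metal cations:
\mathrm{Mg_2SiO_4 + 4\,H^+ \rightarrow 2\,Mg^{2+} + SiO_2 + 2\,H_2O}
% The cations eventually precipitate with carbonate as a stable solid:
\mathrm{Mg^{2+} + HCO_3^- \rightarrow MgCO_3\downarrow + H^+}
```

The last step is the one Giammar emphasizes: once the carbon is locked into a carbonate mineral it cannot leak the way gaseous CO2 can.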
"Reactive transport models now make assumptions based on calculations that carbonates will precipitate at a certain time," said Giammar. "If that 's not what is really happening in the environment, we should know that. If we can understand this process, potentially it could give us the ability to control when and where these minerals form."
Carbon dioxide sequestration is still in its infancy. Giammar began his work on the project as part of the Carbon Mitigation Initiative at Princeton University. The United States Department of Energy (DOE) currently is planning a heavily monitored system to inject carbon dioxide into a sandstone aquifer on the Texas Gulf Coast. Another project in the North Sea has been storing carbon dioxide in an aquifer beneath the ocean for several years. And most recently, drilling began in July 2003 on a 10,000-foot well to evaluate underground rock layers in New Haven, W. Va., as part of a DOE carbon sequestration research project now underway at the American Electric Power Mountaineer plant there.
A “Batcane,” developed by Sound Foresight Ltd. and Cambridge Consultants Ltd., directly mimics bats' echolocation by emitting ultrasonic pulses of sound (beyond the reach of human hearing) and analyzing the echoes that bounce back from nearby objects.
The cane navigates by bouncing ultrasonic signals off objects that lie in its path and feeding the information back to the user. This makes it possible to avoid obstacles with confidence - even obstacles at head height. No other primary aid can do this effectively. The batcane is now being developed for manufacture and will be launched at the start of 2004.
It also picks up the reflections of these waves to map obstacles up to three metres away in three dimensions. Buttons on the cane's handle vibrate gently to warn a user to dodge low ceilings and sidestep objects blocking their path.
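To give a feel for the arithmetic behind sonar ranging, here is a minimal sketch in Python. This is not the Batcane's actual firmware; the three-metre range comes from the article above, and the rest is standard echo-timing physics:

```python
# Sonar ranging sketch: an ultrasonic pulse travels out to an obstacle
# and back, so the one-way distance is half the round-trip path.

SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def echo_distance_m(echo_delay_s: float) -> float:
    """Distance to an obstacle, given the round-trip echo delay in seconds."""
    return SPEED_OF_SOUND_M_S * echo_delay_s / 2.0

def in_range(echo_delay_s: float, max_range_m: float = 3.0) -> bool:
    """The cane reportedly maps obstacles out to about three metres."""
    return echo_distance_m(echo_delay_s) <= max_range_m

# A 10 millisecond round trip puts the obstacle roughly 1.7 metres away,
# close enough for the handle to start vibrating; a 20 ms echo does not.
print(echo_distance_m(0.010), in_range(0.010), in_range(0.020))
```

Distinguishing a head-height branch from a kerb, as the cane does, additionally requires multiple transducers so that echoes can be resolved in three dimensions, not just by distance.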
But will this remain a technology only for blind people? Think about it. Bats use sonar. Their sonar transmitters and receivers must be very small because, well, bats are very small. Wouldn't it be handy to have built-in sonar, perhaps located behind your ears and under your chin? If it could be made to blend in it might not affect your appearance. You'd get a warning if you were about to, say, bump your head or trip over a chair in the dark. Sonar could warn you if you are about to walk off a steep cliff or become entangled in brush.
Then there is the comic book and movie superhero angle. Batman should certainly have at least a Batcane for sneaking up on bad guys at night. His Batcar should have sonar as well. Infrared isn't adequate for cold objects. But if he's going to be a superhero then he ought to have genetic engineering to give him supersenses befitting his exalted role.
Joel Kotkin, author of The New Geography: How the Digital Revolution Is Reshaping the American Landscape, has an article in the Washington Post on how the digital communications revolution is undermining the position of the old large cities and leading to a shift of skilled workers to smaller, lower cost, more comfortable, and safer cities.
But the most recent demographic trends show a massive exodus from these same centers. Between 2000 and 2002, for example, more than 300,000 more Americans left New York City than moved in, among the highest rates of outflow in the nation. (Though the city's population saw a small uptick in 2002, this was due chiefly to immigration and births.) Even worse was the outflow from San Francisco, which was nearly 50 percent higher, adjusting for the city's smaller population, than New York's. This is part of a broader trend; in 2002 migration out of large metropolitan areas reached the highest level since the mid-1990s, driven largely by the escalating cost of housing.
The greatest beneficiaries of the demographic shift have been the cities of the South and West, such as Phoenix and San Antonio. But a surprising development has been the gradual slowing, and even reversal, of flight from the Midwest, which was a virtual torrent several decades ago. Today more Americans are moving into cities in the heartland -- such as Fargo, Des Moines, Columbus and Indianapolis -- than are moving out. Even cities like St. Louis, which people have been leaving in massive numbers since the 1960s, are now approaching an equilibrium among domestic migrants.
Individuals who can work wherever they can get a fast internet connection are going to choose their residences increasingly based on lifestyle choices. Companies that have more connections to companies all over the world than they do to other companies in any one city are going to choose their headquarters locations more based on costs and the appeal of the locations to prospective workers and less based on the size or types of industry in any city.
The major remaining advantage of a large metropolitan area for an employer is the labor force. But if a company can find an appropriate labor force in a smaller city, or if it can recruit people who are willing to move to the smaller city, then the need for the large metropolitan area declines even further. Plus, if the company is outsourcing functions to India or to other businesses then it has less of a need for the large assortment of workers with different specialty skills found in the biggest metropolitan areas. Even if a particular function is kept in-house, the need to co-locate all the functions that used to be associated with a head office has declined with the declining cost of communication and transportation. The marketing department doesn't need to be in the same city as the information technology department. Even the IT department can make use of specialists located in other cities and towns who log on remotely when their specialty skills are needed.
You can find out more about Kotkin's book at his NewGeography.com website.
Scientists at the Department of Energy's Brookhaven National Laboratory are investigating metal catalysts that use energy absorbed from photons to convert carbon dioxide to carbon monoxide.
NEW YORK, NY — Scientists studying the conversion of carbon dioxide (CO2) to carbon monoxide (CO) — a crucial step in transforming CO2 to useful organic compounds such as methanol — are trying to mimic what plants do when they convert CO2 and water to carbohydrates and oxygen in the presence of chlorophyll and sunlight. Such “artificial photosynthesis” could produce inexpensive fuels and raw materials for the chemical industry from renewable solar energy. But achieving this goal is no simple task.
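As a rough sketch of what is being mimicked, the overall reactions can be written as follows. The methanol step is a conventional, separate synthesis, included only to show where the CO would go:

```latex
% Natural photosynthesis (overall):
\mathrm{6\,CO_2 + 6\,H_2O \xrightarrow{h\nu,\ \text{chlorophyll}} C_6H_{12}O_6 + 6\,O_2}
% Photocatalytic reduction of CO2 to CO (the step under study):
\mathrm{CO_2 + 2\,H^+ + 2\,e^- \xrightarrow{h\nu,\ \text{catalyst}} CO + H_2O}
% CO as a feedstock for methanol:
\mathrm{CO + 2\,H_2 \rightarrow CH_3OH}
```

The hard part is the middle step: driving the CO2 reduction with absorbed light rather than with externally supplied electrical or chemical energy.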
“Nature has found a way to do this over eons,” says Etsuko Fujita, a chemist at the Department of Energy’s Brookhaven National Laboratory. “It’s very complicated chemistry.”
Nature uses chlorophyll as a light absorber and electron-transfer agent. However, chlorophyll does not directly react with CO2. If you take it out of the plant and place it in an artificial system, it decomposes rather quickly, resulting in only a small amount of CO production.
So Fujita and others trying to mimic photosynthesis have turned to artificial catalysts made from robust transition metal complexes such as rhenium complexes. These catalysts absorb solar energy and transfer electrons to CO2, releasing CO. But until now, no one had explained how these processes work in detail. By studying these reactions over very short and long timescales (ranging from 10^-8 seconds to hours), Fujita and her colleagues at Brookhaven have discovered an important intermediate step. A most intriguing result is the involvement of two energetic metal complexes to activate one CO2 molecule. Without CO2, the complexes dimerize much more slowly than expected.
The Brookhaven scientists’ work, incorporating a combined experimental and theoretical approach, may help to explain why the reaction proceeds so slowly, which may ultimately contribute to the design of more efficient catalysts.
This work is nowhere near ready for practical application. But in my view this is a direction of research that attracts too little attention. As an energy storage form hydrogen has problems. Liquid hydrocarbon fuels have a lot of advantages. They are fairly compact and existing infrastructure can distribute them. Plus, almost all the vehicles on the road can burn liquid hydrocarbons. A technology that could cheaply convert photon energy from sunlight into liquid hydrocarbons by using the energy to fix CO2 and water into hydrocarbons would be very useful.
Sufferers of cataracts, and just about anyone over the age of 40 suffering from presbyopia (the age-related loss of the ability to focus on close-up objects), may be helped by a new replacement gel material for eye lenses that restores the lens flexibility lost with age.
"The gel material is soft to the touch, and it has elastic properties similar to those found in the natural human lens," Fetsch says. "It also looks as if it has the potential to be injectable, which would mean it could be deliverd with less invasive surgery."
Ravi and Fetsch say that using molecular techniques, it's possible to change the artificial lens material from a gel to a liquid. That liquid then can become a gel again in the presence of oxygen in the body after it is injected into the capsular bag. The hope would be that only a very small injection hole would be required during cataract or other lens replacement surgery so that patients undergoing the operation would not require stitches.
The researchers expect to begin animal testing early next year. What they reported to the American Chemical Society was work that involved mechanical and physical testing of the hydrogel in the laboratory. Before testing the hydrogel in animals, the researchers also hope to improve the material's refractive index — the degree to which it refracts light — a key to how well the eye can focus once the material is implanted.
"Currently, in this particular system, the refractive index has been a little low," Fetsch says. "It's not good enough to be able to provide much more than blurry vision."
But other researchers in Ravi's group, particularly research associates Hyder Ali Aliyar, Ph.D., and Paul Hamilton, Ph.D., have successfully formed several soft gels with the appropriate refractive index. "It's a very significant breakthrough," Ravi says.
The researchers admit there is still much work ahead before an injectable lens could be used in human patients, but Fetsch and Ravi expect it would be introduced into cataract patients first.
This latest report from a Washington University in St. Louis research team led by Nathan Ravi MD PhD follows on the heels of an Australian group's report of the development of a competing material that holds promise for the same purpose. It seems very likely that within 10 years effective treatments for reversing age-related presbyopia will be available.