2003 March 28 Friday
Earth Core Nuclear Reactor May Run Out Of Nuclear Fuel

Tired of the war? Not sufficiently scared by the spread of SARS? Want something different, more dramatic, and larger scale to worry about? How about the running down of the nuclear reactor supposedly at the Earth's core?

Geophysicist J. Marvin Herndon argues that the core of the Earth is really a 5 mile (or 8 kilometer) uranium ball that operates as a natural nuclear reactor. He says some day the reactor will exhaust its supply of radioactive material and that when it does the Earth's magnetic field will collapse with disastrous consequences.

SAN DIEGO, March 27 (UPI) -- New government laboratory test results are fueling a controversial contention that a giant natural nuclear reactor at the center of the Earth powers the planet's life-protecting magnetic field -- but it might be running out of gas, scientists told United Press International.

Herndon also happens to have served as a technical consultant for the new disaster movie "The Core", which is based on the idea that the Earth's core will stop spinning.

J. Marvin Herndon of Transdyne Corp. in San Diego, who worked as an advisor to Paramount Pictures in the creation of the new science thriller, maintains that a nuclear "georeactor" provides most of the heat in the Earth's spinning core.

Unfortunately, it's probably impossible for real-life terranauts to travel down to the Earth's core and fix it if the core starts running out of nuclear fuel.

A new research paper published in the Proceedings of the National Academy of Sciences provides supporting evidence for the theory.

Computer simulations of a nuclear reactor in the Earth's core, conducted at the prestigious Oak Ridge National Laboratory, reveal evidence, in the form of helium fission products, which indicates that the end of the georeactor lifetime may be approaching.

Most geophysicists do not believe Herndon's theory.

Dr. Fred Vine laid the foundations for many of Herndon's theories in the 1970s. Vine, however, believes that the Earth's magnetic field reverses roughly every 400,000 years.

The August 2002 issue of Discover has a fairly lengthy write-up of Herndon's theory.

In Herndon's view, these polarity flip-flops make no sense if the magnetic field is powered, as traditionalists contend, by heat from the crystallization of molten iron and nickel from the fluid core or from the decay of isolated radioactive isotopes. "Those are both gradual, one-way processes," he says. But if the field's energy results from a mass of uranium and plutonium acting like a natural nuclear reactor, Herndon says, such variations in the field's strength would be almost mandatory.

In this theory the energy comes from the splitting of uranium-235 atoms, the same process that occurs in nuclear power plants.

"It's a self-sustaining critical reaction," said nuclear engineer Daniel F. Hollenbach of Oak Ridge National Laboratory, a longtime collaborator of Herndon's until the two parted ways last year. "Depending on how much it fissions, that's the power."

There are a few separate issues here. One is whether the Earth's core is a large nuclear reactor that drives the Earth's magnetic field. There is no consensus among geophysicists that this is the case; Herndon is definitely in a small minority with his theory. However, it is widely accepted that every few hundred thousand years the Earth's magnetic field weakens, collapses, and then re-establishes itself with reversed polarity. So a magnetic field collapse could still happen as part of that periodic reversal process even if Herndon's theory is wrong.

This leads to the important question: Could the Earth's magnetic field reverse today? The British Geological Survey weighs in on the odds.

Measurements have been made of the Earth's magnetic field more or less continuously since about 1840. Some measurements even go back to the 1500s, for example at Greenwich in London. If we look at the trend in the strength of the magnetic field over this time (for example the so-called 'dipole moment') we can see a downward trend. Indeed projecting this forward in time would suggest zero dipole moment in about 1500-1600 years' time. This is one reason why some people believe the field may be in the early stages of a reversal. We also know from studies of the magnetisation of minerals in ancient clay pots that the Earth's magnetic field was approximately twice as strong in Roman times as it is now.

Even so, the current strength of the magnetic field is as high as it has been in the last 50,000 years, even if it is nearly 800,000 years since the last reversal. Also, bearing in mind what we said about 'excursions' above, and knowing what we do about the properties of mathematical models of the magnetic field, it is far from clear we can easily extrapolate to 1500 years hence.
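
The extrapolation described in that passage is simple linear arithmetic. Here is a minimal sketch; the dipole moment and decay rate are illustrative round numbers chosen to land near the quoted 1500-1600 year figure, not BGS data.

    # Naive linear extrapolation of the dipole moment to zero, as in the quote.
    # Both numbers are illustrative round values, not BGS data.
    dipole_moment = 8.0e22  # A m^2, roughly the present-day dipole moment
    decay_rate = 5.0e19     # A m^2 lost per year, assumed constant

    years_to_zero = dipole_moment / decay_rate
    print(f"zero dipole moment in about {years_to_zero:.0f} years")  # 1600 years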

The British Geological Survey does not see a big threat to human life if an Earth's magnetic field reversal should start happening in earnest today.

Is there any danger to life?

Almost certainly not. The Earth's magnetic field is contained within a region of space, known as the magnetosphere, by the action of the solar wind. The magnetosphere deflects many, but not all, of the high-energy particles that flow from the Sun in the solar wind and from other sources in the galaxy. Sometimes the Sun is particularly active, for example when there are many sunspots, and it may send clouds of high-energy particles in the direction of the Earth. During such solar 'flares' and 'coronal mass ejections', astronauts in Earth orbit may need extra shelter to avoid higher doses of radiation. Therefore we know that the Earth's magnetic field offers only some, rather than complete, resistance to particle radiation from space. Indeed high-energy particles can actually be accelerated within the magnetosphere.

At the Earth's surface, the atmosphere acts as an extra blanket to stop all but the most energetic of the solar and galactic radiation. In the absence of a magnetic field, the atmosphere would still stop most of the radiation. Indeed the atmosphere shields us from high-energy radiation as effectively as a concrete layer some 13 feet thick.

Human beings have been on the Earth for a number of million years, during which there have been many reversals, and there is no obvious correlation between human development and reversals. Similarly, reversal patterns do not match patterns in species extinction during geological history.

The fact that Earth's atmosphere provides as much protection from radiation as 13 feet of concrete has interesting ramifications for space exploration and planet colonization. The ability to create a thick atmosphere on Mars would be enormously valuable not just for allowing people to go out and breathe the air. It would also greatly reduce the amount of radiation Mars colonists would be exposed to.

You can read more about Herndon's nuclear core theory and its ramifications on the NuclearPlanet website.

By Randall Parker 2003 March 28 02:09 PM  Dangers Natural General
Entry Permalink | Comments(10)
Smart Dust Sensors To Be Cheap, Ubiquitous

Smart Dust will allow cheap and widespread distribution of sensor systems.

"Smart dust" devices are tiny wireless microelectromechanical sensors (MEMS) that can detect everything from light to vibrations. Thanks to recent breakthroughs in silicon and fabrication techniques, these "motes" could eventually be the size of a grain of sand, though each would contain sensors, computing circuits, bidirectional wireless communications technology and a power supply. Motes would gather scads of data, run computations and communicate that information using two-way band radio between motes at distances approaching 1,000 feet.

That dust you got on your shoes in the company parking lot may be spy sensors planted by a competitor who wants to listen in on company meetings. Or perhaps the dust in your pet's hair was put there by your ex-spouse who wants to find out who you are spending your time with.

When sensor systems become cheap and as small as dust particles it is going to become much easier to lay out sensor networks for a large variety of reasons. Of course this will inevitably lead to microscopic sensors that are designed to detect other types of sensors.
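
To get a rough feel for deployment scale, here is a minimal sketch using the roughly 1,000-foot radio range quoted above; the half-range coverage disk is a deliberately crude assumption.

    import math

    # Crude coverage estimate: assume each mote usefully serves a disk of half
    # its ~1,000-foot radio range (the overlap is left over for mesh routing).
    RADIO_RANGE_FT = 1000
    effective_radius_ft = RADIO_RANGE_FT / 2
    mote_coverage_sqft = math.pi * effective_radius_ft ** 2

    square_mile_sqft = 5280 ** 2
    motes_per_square_mile = square_mile_sqft / mote_coverage_sqft
    print(f"about {motes_per_square_mile:.0f} motes per square mile")  # ~36

On that crude assumption a few dozen motes blanket a square mile, which is why dirt-cheap sensors change the economics of surveillance.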

By Randall Parker 2003 March 28 02:45 AM  Surveillance Society
Entry Permalink
2003 March 26 Wednesday
Cell Phones Will Track Traffic Flow And Crowd Density

Here's another sign of how many ways location identification technology will be used to track groups of people.

Finnish mobile operator Radiolinja Oy has developed technology to monitor traffic by tracking cell phones in cars without identifying the owner. The technology, developed as an alternative to video-monitoring systems, could also be used to monitor the flow of crowds at public events or the number of cars passing roadside billboards as a tool for advertisers.

If the resolution of the locations is sufficiently fine then this technology could also be used by stores to track the flow of people in a store and get an idea of what movement patterns shoppers follow. Since a person could be tracked to the check-out stand it may even become possible to associate movement patterns with purchase patterns.

I predict that parents who are afraid of child abduction will have location detectors embedded in their children. Then the parents will move on from using it for emergencies to using it for routine tracking of their children. The same will be done with pets that have a tendency to run away.

By Randall Parker 2003 March 26 01:21 PM  Surveillance Society
Entry Permalink | Comments(1)
2003 March 25 Tuesday
Multilayer Photovoltaic Design Promises To Lower Costs

There are many approaches being pursued by various research groups searching for cheaper ways to make photovoltaic solar energy cells for the generation of electricity. One reason that photovoltaics are expensive is that it is expensive to make the highly purified silicon semiconductor material that most existing photovoltaic cells use. Eric McFarland at UC Santa Barbara is pursuing the development of a two-layer approach that allows the use of a less pure and therefore much cheaper titanium dioxide semiconductor.

The researchers' prototype suggests that the devices would be much less expensive to manufacture than today's solar cells and can be improved to be nearly as efficient. "It's enormously cheaper... more than a factor of 10," said Eric McFarland, a professor of chemical engineering at the University of California at Santa Barbara.

The titanium dioxide serves only as the lower-layer charge carrier while a dye serves as the light-absorbing layer.

To overcome this problem, McFarland and Jing have developed a multi-layer device that separates the light-absorption and charge-carrier transport processes.

The efficiency of this new type of photovoltaic is low at this stage of development. But McFarland thinks he can raise the efficiency, and a design that allows the use of much cheaper materials is a promising way to make photovoltaics cost-competitive.

Electrons excited in the dye shoot through the gold and are collected in the titanium dioxide. Missing electrons in the dye are replaced from the metal. Because the semiconductor does not have to absorb light itself, inexpensive semiconductors will do the job.
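
A back-of-the-envelope calculation shows why a much cheaper cell can win even at lower efficiency. All module costs and efficiencies below are illustrative assumptions, not figures from McFarland's group.

    # Back-of-the-envelope cost-per-watt comparison for solar cells.
    # All costs and efficiencies are illustrative assumptions, not
    # figures from McFarland's group.
    SOLAR_CONSTANT_W_PER_M2 = 1000  # standard test-condition insolation

    def cost_per_watt(module_cost_per_m2, efficiency):
        """Dollars per peak watt for a module of given areal cost and efficiency."""
        return module_cost_per_m2 / (SOLAR_CONSTANT_W_PER_M2 * efficiency)

    # Hypothetical purified-silicon cell: costly material, ~15% efficient.
    silicon = cost_per_watt(module_cost_per_m2=300.0, efficiency=0.15)
    # Hypothetical dye/TiO2 cell: ~10x cheaper material, only ~5% efficient.
    dye_tio2 = cost_per_watt(module_cost_per_m2=30.0, efficiency=0.05)

    print(f"silicon:  ${silicon:.2f}/W")   # $2.00/W
    print(f"dye/TiO2: ${dye_tio2:.2f}/W")  # $0.60/W

On those made-up numbers the cheap cell delivers a peak watt at less than a third the cost of the silicon cell despite converting only a third as much of the incoming light.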

By Randall Parker 2003 March 25 08:38 PM  Energy Solar
Entry Permalink | Comments(1)
NASA To Extend Space Shuttle Life To 2020 Or Later

NASA director of shuttle and space station programs Michael C. Kostelnik says NASA plans to keep operating the shuttle at least until 2020. Since the Shuttle will continue to be an inherently dangerous design, one way to reduce the casualty rate will be to automate the shuttle's operation so that astronauts do not have to die when it crashes.

Reducing the risk might require eliminating the crew altogether, Kostelnik said. The shuttle will be needed as a "workhorse," he said, but it might not need to carry people. "Perhaps even flying a robotic shuttle in those out years would not be out of the question," he said.

Switching the Shuttle over to an automated robotic cargo delivery system will require the development of the orbital space plane to carry astronauts into space.

Kostelnik said that in the future -- once an orbital space plane is a reality and can ferry astronauts to and from the international space station -- NASA would have the option of flying the shuttle only as a cargo vessel.

The problem is that the orbital space plane, which is supposed to carry astronauts more safely and cheaply (at least if they don't botch the design job), will not be ready for at least 12 years.

The space agency's long-term plans call for the shuttle fleet to be active at least until a "next- generation launch technology" -- which is in the earliest stages -- makes its first flight some 12 years from now.

NASA does not want to give up operating the Shuttle because NASA does not want to give up operating the International Space Station. Of course, operating the ISS and the Space Shuttle (especially since the Shuttle will now need to have a lot more spent on it to improve its safety) eats up so much budget money that what is left over for the Orbital Space Plane won't be enough to fund its rapid development.

The Orbital Space Plane is not all that radical anyway. It will still be boosted into space via chemical rockets. Space launch of humans will remain very expensive even once the Orbital Space Plane is operational. Do not expect NASA to revolutionize space flight. It has some big white elephants to tend and feed and doesn't have budget allocations for radical advances.

The upshot of all this is that NASA is going to continue to be irrelevant to the future of humanity in space. Will that always be the case? There is one scenario under which that could change. China could get such a big space program going that the US might decide there's a serious national security issue at stake and that something big ought to be done about it quickly. NASA could be funded to do more radical work on much more ambitious projects. If this sounds far-fetched just remember that some day pigs will fly with the help of genetic engineering.

Another way that NASA could regain its relevancy would be if it lost its remaining shuttles in accidents. Then it might be pressured into pursuing bigger steps forward in space launch designs. Of course, if the other shuttles were lost NASA still might just decide to use available technology to design a safer but still very expensive Shuttle replacement.

Ultimately human space travel is going to be enabled by advances in nanotechnology and biotechnology. Nanotech advances made for other reasons will provide stronger and cheaper materials for building space launch vehicles. Biotech will make it easier to modify human bodies to live in low gravity and to grow food and structures for Mars colonies. The broader economy will drive the development of enabling technologies for space travel and space exploration far more than anything NASA is likely to do.

By Randall Parker 2003 March 25 04:58 PM  Airplanes and Spacecraft
Entry Permalink | Comments(1)
2003 March 24 Monday
Is Hydrogen The Energy Of The Future?

The April 2003 issue of Wired has an article written by Peter Schwartz and Doug Randall advocating an accelerated conversion to a hydrogen economy. After discussing the problems inherent to storing hydrogen in gaseous and liquid forms they argue that solid materials as hydrogen sponges will be the best long term solution.

In the long run, the most promising approach is to fill the tank with a solid material that soaks up hydrogen like a sponge at fill-up and releases it during drive time. Currently, the options include lithium hydride, sodium borohydride, and an emerging class of ultraporous nanotech materials. Unlike gaseous hydrogen, these substances can pack a lot of power into a small space of arbitrary shape. And unlike liquid hydrogen, they can be kept at room temperature. On the other hand, energy is required to infuse the solid medium with hydrogen, and in some cases very high temperatures are required to get the fuel back out, exacting a huge toll in efficiency. Also, filling the tank can take far more time than pumping gasoline. Government money could bridge the gap between today's experiments and a viable solution.

But will the problems involved in solid hydrogen storage be any more tractable and yield to any better solution than the problems with gaseous or liquid storage? Will the solid material needed to store the hydrogen end up weighing as much as a battery that contains the same amount of energy? The authors provide no indication as to why their preferred approach will turn out to be so advantageous.

The bigger problem with the article is that it does not explain why the use of hydrogen will allow us to reduce and eventually eliminate the use of fossil fuels. Hydrogen is not a source of energy. It would be more accurate to say that hydrogen is a way to store, transport, and use energy. Therefore it competes with other forms of stored energy. In cars and other vehicles hydrogen could be burned in fuel cells. But energy is needed to produce the hydrogen in the first place. To be a better automotive fuel hydrogen would somehow have to reduce the total usage of fossil fuels and do that better than other approaches that could be pursued.

Fossil fuels are a major source of energy today. Fossil fuels could be converted to hydrogen. But hydrogen advocates have not made a clear case for why hydrogen as an intermediate storage and end use form of energy is a more efficient way to use fossil fuels. There are too many unsolved problems and questions. Again, hydrogen does not really compete against other types of originating fuels. Rather, it relies on other types of originating fuels because it has to be produced using these other fuels.

If hydrogen is produced from electricity then the electricity must first be generated. But most electricity is generated by burning coal or natural gas. Hydro and nuclear also produce small fractions of the total electric supply. We've pretty much harnessed the available hydroelectric sources and hydroelectric is a pretty small fraction of total electric generation. The other big current alternative is nuclear energy. But for electricity generation nuclear power costs more than burning fossil fuels. There is no big economic incentive on a global scale to drive the building of massive numbers of nuclear power stations to cause a conversion to a nuclear-hydrogen economy. Also, widespread use of nuclear power on a global scale would so increase the availability of enriched uranium and plutonium that it would cause unacceptable risks of nuclear and radiological weapons proliferation.

The economic case for the use of nuclear power looks even worse than current fossil fuel prices suggest. The marginal cost of oil production (in some fields it is about $3/barrel) in the Middle East is much lower than current oil prices. Therefore nuclear power can not displace the use of Middle Eastern fossil fuels unless nuclear power becomes much cheaper than it is now.

Fossil fuels could be used to generate hydrogen. Would this be a more efficient way to use fossil fuels for transportation purposes? Keep in mind that each step in the use of hydrogen would produce an energy loss. The efficiency of the energy conversion of fossil fuels to hydrogen would be less than 100%. The hydrogen would then have to be piped (or driven) to what are now gasoline stations. If liquid hydrogen were used in cars then the hydrogen would have to be cooled to liquid form first. Keeping it cool would require a great deal of insulation and probably ongoing additional cooling. Therefore a car just sitting in a parking lot would consume energy at some low rate. As the Wired article points out, even a solid storage method may require energy in order to get the hydrogen into the solid and to get it back out again. Meanwhile, there are an assortment of ways to make the old internal combustion vehicle more fuel efficient. Therefore hydrogen is not just competing against today's internal combustion engine transportation systems. It is also competing against tomorrow's.
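
To make those compounding losses concrete, here is a minimal sketch that multiplies out a hypothetical fossil-fuel-to-wheels hydrogen chain. Every efficiency figure is an assumption for illustration, not measured data.

    # Multiply out the losses in a hypothetical fossil-fuel-to-wheels hydrogen
    # chain. Every efficiency below is an illustrative assumption.
    hydrogen_chain = [
        ("reform fossil fuel to hydrogen", 0.75),
        ("compress, liquefy, and distribute", 0.80),
        ("store in and release from the tank", 0.90),
        ("fuel cell converts H2 to electricity", 0.55),
        ("electric motor drives the wheels", 0.90),
    ]

    overall = 1.0
    for step, efficiency in hydrogen_chain:
        overall *= efficiency
        print(f"{step:<38} {efficiency:.0%}  cumulative {overall:.1%}")

    # With these assumptions only about 27% of the original fuel energy reaches
    # the wheels, a bar an improved internal combustion drivetrain could clear.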

Hydrogen would most likely propel vehicles by being burned in a fuel cell. In theory fuel cells are a more efficient means of converting a liquid or gaseous fuel to mechanical power than the internal combustion engine. But hydrogen is not the only energy form that can be burned in fuel cells. There are fuel cell designs that will burn methane gas, for instance. In fact, due to the greater efficiency of fuel cells for the conversion of fossil fuels to electricity, fuel cells will become widely used for electric power generation from fossil fuels before they become used in transportation.

Is hydrogen the only viable candidate as an energy storage form to replace gasoline and diesel fuel in vehicles? In a word, no. Lead acid batteries have an energy storage density of 35 watt-hours per kilogram (Wh/kg). This leads to electric cars that weigh too much and have too short a range between recharges. MIT professor Donald R. Sadoway believes lithium polymer batteries can be developed that will have over an order of magnitude greater energy density than lead acid batteries.

Niels Bohr, the Danish physicist and Nobel Laureate, once cautioned that prediction is always dangerous, especially when it is about the future. With this disclaimer, then, we speculate on what is in store for rechargeable lithium batteries. In the near term, expect the push for all-solid-state, flexible, thin-film batteries to continue. This is driven by the desire to maximize the electrode–electrolyte interfacial area while minimizing diffusion distances within the electrodes themselves, in order to combine high capacity with high rate capability. Recent results from our laboratory indicate that in a multi-layer configuration comprising an anode of metallic lithium, a solid polymer electrolyte, and a cathode of dense, thin-film vanadium oxide, it is possible to construct a battery with projected values of specific energy exceeding 400 Wh/kg (700 Wh/l) and specific power exceeding 600 W/kg (1000 W/l). Another trend is distributed power sources as opposed to a single central power supply. This allows for miniaturization (e.g., the microbattery). Expect also the integration of energy generation with energy storage, for example, a multilayer laminate comprising a photovoltaic charger and a rechargeable battery. Ultimately, if scientific discoveries prove to be scalable and cost-effective, we should witness the large-scale adoption of electric vehicles.
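
A quick worked comparison shows what such an energy-density jump would mean for vehicle weight. The 60 kilowatt-hour pack size below is a hypothetical figure chosen for illustration.

    # How heavy is a battery pack at each energy density?
    PACK_ENERGY_WH = 60_000  # hypothetical pack size, chosen for illustration

    densities_wh_per_kg = {
        "lead acid": 35,                   # figure cited above
        "projected lithium polymer": 400,  # Sadoway's projected specific energy
    }

    for chemistry, density in densities_wh_per_kg.items():
        mass_kg = PACK_ENERGY_WH / density
        print(f"{chemistry:<26} {mass_kg:7.0f} kg")

    # lead acid: ~1714 kg, heavier than many entire cars;
    # lithium polymer: 150 kg, a practical pack mass.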

When the cost of photovoltaics is lowered far enough to compete with fossil fuels, a combination of photovoltaics and lithium polymer batteries may well be the pair of technologies that leads to the phase-out of fossil fuels as vehicle power sources.

The article co-authored by Donald Sadoway and Anne Mayes is from the August 2002 issue of MRS Bulletin dedicated to lithium batteries.

By Randall Parker 2003 March 24 02:45 AM  Energy Tech
Entry Permalink | Comments(9)
2003 March 21 Friday
Sun Solar Radiation Has Been Increasing 0.05 Percent Per Decade

What have we done to anger the Sun God? Helios is getting hotter every decade.

Since the late 1970s, the amount of solar radiation the sun emits, during times of quiet sunspot activity, has increased by nearly .05 percent per decade, according to a NASA funded study.

"This trend is important because, if sustained over many decades, it could cause significant climate change," said Richard Willson, a researcher affiliated with NASA's Goddard Institute for Space Studies and Columbia University's Earth Institute, New York. He is the lead author of the study recently published in Geophysical Research Letters.

"Historical records of solar activity indicate that solar radiation has been increasing since the late 19th century. If a trend, comparable to the one found in this study, persisted throughout the 20th century, it would have provided a significant component of the global warming the Intergovernmental Panel on Climate Change reports to have occurred over the past 100 years," he said.

NASA's Earth Science Enterprise funded this research as part of its mission to understand and protect our home planet by studying the primary causes of climate variability, including trends in solar radiation that may be a factor in global climate change.

The solar cycle occurs approximately every 11 years when the sun undergoes a period of increased magnetic and sunspot activity called the "solar maximum," followed by a quiet period called the "solar minimum."

Although the inferred increase of solar irradiance in 24 years, about 0.1 percent, is not enough to cause notable climate change, the trend would be important if maintained for a century or more. Satellite observations of total solar irradiance have obtained a long enough record (over 24 years) to begin looking for this effect.

Total Solar Irradiance (TSI) is the radiant energy received by the Earth from the sun, over all wavelengths, outside the atmosphere. TSI interaction with the Earth's atmosphere, oceans and landmasses is the biggest factor determining our climate. To put it into perspective, decreases in TSI of 0.2 percent occur during the weeklong passage of large sunspot groups across our side of the sun. These changes are relatively insignificant compared to the sun's total output of energy, yet equivalent to all the energy that mankind uses in a year. According to Willson, small variations, like the one found in this study, if sustained over many decades, could have significant climate effects.
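
As a sanity check on the quoted numbers, 0.05 percent per decade does compound to roughly the 0.1 percent figure over the 24-year satellite record:

    # Check that 0.05% per decade over 24 years gives roughly the quoted 0.1%.
    rate_per_decade = 0.0005   # 0.05 percent
    decades = 24 / 10

    # Linear and compounded versions agree to rounding at rates this tiny.
    linear = rate_per_decade * decades
    compounded = (1 + rate_per_decade) ** decades - 1

    print(f"linear:     {linear:.4%}")      # 0.1200%
    print(f"compounded: {compounded:.4%}")  # ~0.1200%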

Perhaps we have gradually been angering the god Helios (a.k.a. Sol Invictus, Mithra, Ra, Dazhbog, and assorted other names for Sun and Light gods). Perhaps Helios is getting hotter under the collar as his anger builds.

Of course our prehistoric ancestors might intentionally have set out to do something that would gradually increase the anger of Helios and make him hot under the proverbial collar because they were freezing their buns in the Ice Age. Helios, being a God, may not react in the same time frame in which ephemeral mortals respond.

By Randall Parker 2003 March 21 04:16 PM  Dangers Natural General
Entry Permalink | Comments(2)
People Are Rushing To Embrace The Surveillance Society

MIT's Technology Review has an excellent article entitled Surveillance Nation.

It’s not all about Big Brother or Big Business, either. Widespread electronic scrutiny is usually denounced as a creation of political tyranny or corporate greed. But the rise of omnipresent surveillance will be driven as much by ordinary citizens’ understandable—even laudatory—desires for security, control, and comfort as by the imperatives of business and government. “Nanny cams,” global-positioning locators, police and home security networks, traffic jam monitors, medical-device radio-frequency tags, small-business webcams: the list of monitoring devices employed by and for average Americans is already long, and it will only become longer. Extensive surveillance, in short, is coming into being because people like and want it.

As surveillance systems become steadily cheaper and easier to use their use will skyrocket. Personal usage of surveillance systems will be just as extensive as government and corporate usage. For instance, I predict that within 10 or at most 20 years most houses will have cameras installed in them tied to the internet and sending out encrypted feeds so that their owners can look at what is happening in their houses when they are not home. Parents will embed surveillance systems into the cars they let their teenage kids drive so that the parents can know where the kids go, who rides with them, and whether the kids drive dangerously.

Imagine nanotech that allows instant testing for drug use. Governments and employers will not be the only users of cheap and miniaturized drug testing technology. It is easy to imagine a form of nanotech drug detector that can be unknowingly swallowed and absorbed into the body. Then when Mom gets suspicious that Junior is smoking pot she can give him a slice of cake that contains nanosensors and then Dad can secretly install a sensor on the front door that will interrogate the nanosensors every time Junior comes home.

Or how about nanosensors installed in the upholstery of a car that can detect marijuana or cigarette smoke or even beer fumes? Dad could check Junior's car by passing a small hand-held detector near it to interrogate its embedded sensors.

One way sensor usage may evolve at a personal level will be the sharing of sensor feeds among friends to allow people to organize into groups to help each other. For instance, imagine a group of people who are close friends letting each other watch the video feeds of their homes when they are not there. This could be done for security reasons or to track what their children are up to. If one person in the group has a job that gives her a lot of time to look at a video display then she could spend time watching what is going on in her own home and the homes of a few of her friends.

One can imagine neighborhood cooperation for detecting the movement of children. Every child could have embedded location detectors and many houses could have electronics for interrogating such detectors. Then a sharing of detector feeds could allow parents to detect whether their kids are still in the neighborhood. Automated software could even inform parents when their children are moving out of the area where they are allowed to roam.

Sharing of data feeds by government agencies, companies, and individuals will lead to much greater scrutiny of the actions of individuals. There are many motives for the greater sharing of data streams (e.g. detecting fraud, terrorists, robbery attempts, and bad credit risks, or looking for changing patterns of product demand). At the same time, the costs of collecting, sharing, and processing data will all decline as the ease of sharing steadily increases.

By Randall Parker 2003 March 21 01:00 AM  Surveillance Society
Entry Permalink | Comments(2)
2003 March 19 Wednesday
Hunt For Cause Of Severe Acute Respiratory Syndrome

The Scientist web site has a good article on the hunt for the cause of SARS. The web site requires registration to access its articles but the registration is free and the site has generally good quality articles. Donald Lowe, head of the Department of Microbiology at Mount Sinai Hospital in Toronto, thinks the hunt for the cause will succeed rather more quickly than similar hunts in the past.

But it took six months to identify the Legionella pneumophila bacterium as the cause of Legionnaire's disease, the pneumonia that attacked an old soldiers' convention in Philadelphia in 1976. Could it take just as long to identify the cause of SARS?

Unlikely, Lowe thinks. "Since Legionella, the molecular world has changed dramatically — if we have any evidence of an infectious particle, we can amplify the DNA or RNA, sequence it, and therefore be able more rapidly and accurately to define an agent. Unless we are dealing with a virus which is difficult to grow."

The New Scientist reports that a Canadian lab has ruled out 250 types of pathogens by testing samples from Canadian SARS victims.

Suspicions are continuing to focus on a paramyxovirus.

Teams in Hong Kong and Germany said they had found evidence of a virus known as a paramyxovirus in at least some of the patients with the illness, called severe acute respiratory syndrome.

They stressed that more tests are needed before the virus is pinned down as the culprit, but said it is the best clue yet about the cause of the syndrome, which may have killed as many as 14 people and sickened hundreds more.

One of the Hong Kong scientists involved in the search for the pathogen is confident that his group has isolated the virus and that it looks like a paramyxovirus.

"We've identified the virus," said Dr. John Tam, a microbiologist at the Chinese University of Hong Kong, at a news conference late Tuesday. "We used an electronic microscope and found the virus in patient samples."

A similar virus found previously in Holland was slow moving.

John Oxford, professor of Virology at Queen Mary's School of Medicine, said a similar virus had been discovered in Holland last year.

"It is rather slow-moving, rather restricted to families and hospitals, not a rip-roaring affair, but still very nasty.

The World Health Organization says the presence of a paramyxovirus in SARS victims might just be a coincidence.

Viruses in the Paramyxoviridae family include many common, well-known agents associated with respiratory infections, such as respiratory syncytial virus, and childhood illnesses, including the viruses that cause mumps and measles. Some of these viruses are widespread, particularly during the winter season. Screening of specimens could therefore be expected to detect particles of these common viruses. At this point, it cannot be ruled out entirely that tests for the SARS agent are detecting such “background” viruses rather than the true causative agent.

The Paramyxoviridae family also includes two recently recognized pathogens, Hendra virus and Nipah virus. These related viruses are unusual in the family in that they can infect and cause potentially fatal disease in a number of animal hosts, including humans. Most other viruses in the family tend to infect a single animal species only.

Nipah virus first began to cause deaths in humans in Peninsular Malaysia in 1998 in persons in close contact with pigs. The outbreak caused 265 cases of human encephalitis, including 105 deaths. Two separate outbreaks of Hendra virus, associated with severe respiratory disease in horses, caused two human deaths in Australia in 1994 and 1995. No human-to-human transmission was documented in either outbreak. No treatment was available for cases caused by either of these two viruses. Human-to-human transmission did not occur.

Even if the virus seen by the Hong Kong and German scientists via electron microscope is the infectious agent causing this disease and is from the Paramyxoviridae family of viruses, it is probably a previously unknown pathogen and not one of the known existing members of the family.

Still, Dr. Klaus Stöhr, a virologist and epidemiologist who is leading the health organization's scientific team investigating the illness, said that none of those viruses had caused a disease like the one under investigation, which doctors are calling severe acute respiratory syndrome, or SARS. Instead, the findings suggest that the virus might be a hitherto unknown member of the paramyxoviridae family.

If the SARS illness is positively identified as being caused by the paramyxovirus which the Hong Kong and German researchers have isolated, then the only existing antiviral drug that might work against it is Ribavirin. Anti-viral drug development has historically been much more difficult than anti-microbial drug development. To date vaccines have been far more effective than anti-virals and this is likely to continue to be the case (though HIV is an exception to this pattern since HIV vaccine development has proven so hard to do). Other viruses of the Paramyxoviridae family include measles and mumps, and of course vaccines do exist for them. It therefore seems reasonable to expect that if the SARS illness is being caused by a paramyxovirus the development of a vaccine should be possible.

Drug development time and vaccine development time are both usually measured in years. An exception is influenza: vaccines for new strains are usually developed within months of when the new strains are first detected each year. However, if this SARS pathogen is a paramyxovirus it is probably a species unknown to scientists and perhaps even from a heretofore unknown genus. Therefore developing a vaccine for it will require a lot more work to first understand it in the level of detail that would make vaccine development possible.

In the short to medium term the best line of defense against the SARS pathogen is likely to remain the rapid isolation of its victims. But the work to isolate and identify the virus is important because it will lead to tests for its presence. The ability to test and detect it will lead to a much more rapid ability to identify and isolate those who are infected by the virus. That, in turn, will help to prevent its spread. Therefore the race going on to identify and characterize the SARS pathogen could lead to a fairly rapid benefit for the public health.

The CDC has a web page where all CDC SARS press releases are published. The World Health Organization puts out all the WHO SARS press releases on their general press release page.

Update: The SARS illness spread from China. It started there in November 2002. The Chinese government was very irresponsible for not telling the rest of the world about the illness until after it had spread outside of China.

Although the outbreak in Guangdong province started in November, health officials said almost nothing publicly for months afterward. Chinese officials gave their first report of the outbreak in Guangdong province to the World Health Organization on Sunday, saying that the outbreak was abating on its own. The report raised hopes at the World Health Organization that it would burn itself out elsewhere as well.

Because millions of poor Chinese people live in close proximity with chickens and pigs new strains of influenza sometimes jump across from other species into humans. These strains have a much greater potential to be very virulent in humans than the strains that emerge from within the human population. Therefore China is the most probable origin of an influenza whose lethality could rival the so-called Spanish flu of the 1918 pandemic. For China to be slow about notifying the rest of the world about a newly emerging illness is therefore an extremely dangerous practice. Governments and public health officials around the world should be sharply critical of China's lapse in its responsibility in this matter.

Update II: For more on SARS see Charles Murtaugh's post on SARS and my previous post on SARS.

Update III: The extent of the Chinese government's irresponsibility in failing to inform the rest of the world of the emergence of the SARS disease before it escaped from Guangdong province is brought into sharp relief by the revelation of how SARS crossed over into Hong Kong. The carrier is now believed to have been a 64-year-old Guangdong doctor who stayed in a Hong Kong hotel and infected 6 other people staying on the same floor.

Director Margaret Chan said the source of the outbreak appeared to be a Guangzhou doctor who stayed at the Metropole Hotel in Waterloo Road last month and infected six others.

Had the Hong Kong medical establishment been properly informed of the threat it is very likely that hospitals would have been operating with a far higher degree of caution when examining new admittees and those who became ill with this disease would have been diagnosed and isolated much more rapidly. The Chinese government deserves a serious loud dose of criticism for its handling of this matter.

Update IV: SARS is now suspected of being caused by a coronavirus. Gamma globulin extracted from the blood serum of recovered victims has been injected into patients with very severe SARS, and these gamma globulin treatments have been very effective.

"Facts have proven that in at least 20 of our patients who went through very smooth recovery, their serum has been used to treat very severe sufferers and that has been very successful," said Leung Ping-chung, a professor and surgeon at the Prince of Wales Hospital in Hong Kong, considered ground zero of the outbreak.

Also see my more recent post Fears Grow That SARS May Spread Into Pandemic.

By Randall Parker 2003 March 19 10:26 PM  Dangers Natural Bio
Entry Permalink | Comments(6)
Los Alamos Develops Muon Nuclear Weapons Detector

Cosmic subatomic particles called muons strike the Earth continuously. Scientists at Los Alamos National Laboratory in New Mexico have demonstrated that the scattering of muons through different types of materials can be used to detect smuggled nuclear weapons.

The high-energy particles, called muons, scatter in a highly predictable pattern when they strike dense materials like uranium or the lead used in heavy shielding, and that scattering could be picked up by a special detector, the scientists said.

Unlike X-rays, muons can penetrate dense objects and produce three-dimensional images.

In contrast, muons are highly penetrating - a typical cosmic-ray muon can pass through more than 10 metres of water - and could be used to produce radiographic images of medium-to-large objects in a short exposure time.

Bill Priedhorsky of LANL says the muon detectors are pretty simple to build.

The muon detectors are little more than "extruded aluminium, stainless steel wires and argon gas" and the device needs no radiation source. Furthermore, unlike the X-ray and gamma-ray scanners, there is no health risk associated with the radiation dose.

Konstantin Borozdin of LANL says this method will be useful for examining large objects.

"This method shows promise as an inexpensive, harmless probe for medium to large objects, such as commercial trucks, passenger cars or sea containers, using only the natural flux of muons," Borozdin said.

As the technologies needed to develop nuclear weapons spread more widely there is an increasing need to detect attempts by terrorists to smuggle radiological and nuclear weapons.
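
The physics behind the technique is multiple Coulomb scattering: a muon's RMS deflection grows as the radiation length of the material it crosses shrinks, and uranium has one of the shortest radiation lengths of any common material. Here is a minimal sketch using the standard Highland formula; the 3 GeV muon momentum and 10 cm thickness are illustrative assumptions.

    import math

    # RMS multiple-scattering angle (Highland formula) for a singly charged,
    # relativistic (beta ~ 1) muon of momentum p_mev (MeV/c) crossing a
    # thickness x_cm of material with radiation length x0_cm.
    def rms_scatter_mrad(p_mev, x_cm, x0_cm):
        t = x_cm / x0_cm  # thickness in radiation lengths
        theta_rad = (13.6 / p_mev) * math.sqrt(t) * (1 + 0.038 * math.log(t))
        return theta_rad * 1000.0

    # Standard radiation lengths; the 3 GeV momentum and 10 cm thickness
    # are illustrative assumptions for this sketch.
    radiation_length_cm = {"water": 36.1, "iron": 1.76, "lead": 0.56, "uranium": 0.32}

    for material, x0 in radiation_length_cm.items():
        print(f"{material:<8} {rms_scatter_mrad(3000, 10, x0):5.1f} mrad")

    # Uranium deflects muons roughly 2.5x harder than iron and more than 12x
    # harder than water, which is what lets crossed detector planes above and
    # below a cargo container flag unusually dense objects.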

By Randall Parker 2003 March 19 01:29 PM  Dangers Tech General
Entry Permalink | Comments(0)
2003 March 18 Tuesday
Sub-Kilometer Asteroids Will Not Produce Tsunamis

We don't need to worry about small asteroid strikes causing tsunamis that wipe out coastal regions.

The idea that asteroids as small as 100 meters across pose a serious threat to humanity because they create great, destructive ocean waves, or tsunamis, every few hundred years was suggested in 1993 at a UA-hosted asteroids hazards meeting in Tucson.

At that meeting, a distinguished Leiden Observatory astrophysicist named J. Mayo Greenberg, who since has died, countered that people living below sea level in the Netherlands for the past millennium had not experienced such tsunamis every 250 years as the theory predicted, Melosh noted.

But scientists at the time either didn't follow up or they didn't listen, Melosh added.

While on sabbatical in Amsterdam in 1996, Melosh checked with Dutch geologists who had drilled to basement rock in the Rhine River delta, a geologic record of the past 10,000 years. That record shows only one large tsunami at 7,000 years ago, the Dutch scientists said, but it coincides perfectly in time to a giant landslide off the coast of Norway and is not the result of an asteroid-ocean impact.

In addition, Melosh was highly skeptical of estimates that project small asteroids will generate waves that grow to a thousand meters or higher in a 4,000-meter deep ocean.

Concerned that such doubtful information was -- and is -- being used to justify proposed science projects, Melosh has argued that the hazard of small asteroid-ocean impacts is greatly exaggerated.

Melosh mentioned it at a seminar he gave at the Scripps Institution of Oceanography a few years ago, which is where he met tsunami expert William Van Dorn.

Van Dorn, who lives in San Diego, had been commissioned in 1968 by the U.S. Office of Naval Research to summarize several decades of research into the hazard posed by waves generated by nuclear explosions. The research included 1965-66 experiments that measured wave run-up from blasts of up to 10,000 pounds of TNT in Mono Lake, Calif.

The experiments indeed proved that wave run-up from explosion waves produced either by bombs or bolides (meteors) is much smaller relative to run-up of tsunami waves, Van Dorn said in the report. "As most of the energy is dissipated before the waves reach the shoreline, it is evident that no catastrophe of damage by flooding can result from explosion waves as initially feared," he concluded.

The discovery that explosion waves or large impact-generated waves will break on the outer continental shelf and produce little onshore damage is a phenomenon known in the defense community as the "Van Dorn effect."

But Van Dorn was not authorized to release his 173-page report when he and Melosh met in 1995.

However, sub-kilometer asteroids can still cause a lot of damage.

The asteroid that exploded over the Tunguska River in 1908 is estimated to have been 160 to 180 feet in diameter. And a similar sized asteroid is believed to have exploded over Kazakhstan in the late 1940s.

The 1908 Tunguska Siberia asteroid was fairly small and yet devastated a large area when it exploded.

A notorious example occurred in 1908 when an asteroid in this size range is believed to have exploded above the uninhabited Tunguska region of Siberia, leveling trees for some 800 square miles (2,000 square kilometers) around. Astronomers have for a decade or so said so-called Tunguska events probably occur about once every hundred years, leading some to speculate that we're about due for another.

So the take-home lesson is that you can still worry about getting killed by smaller asteroids that could hit closer to where you are. You just can't expect to be killed by a sub-kilometer asteroid that hits the ocean thousands of miles away from land.

By Randall Parker 2003 March 18 01:22 PM  Dangers Natural General
Entry Permalink | Comments(0)
2003 March 17 Monday
Male Sweat Brightens, Soothes Females

If your woman is unhappy you need to work up a sweat more often.

PHILADELPHIA -- Scientists at the University of Pennsylvania and the Monell Chemical Senses Center in Philadelphia have found that exposure to male perspiration has marked psychological and physiological effects on women: It can brighten women's moods, reducing tension and increasing relaxation, and also has a direct effect on the release of luteinizing hormone, which affects the length and timing of the menstrual cycle.

The results will be published in June in the journal Biology of Reproduction and currently appear on the journal's Web site.

"It has long been recognized that female pheromones can affect the menstrual cycles of other women," said George Preti, a member of the Monell Center and adjunct professor of dermatology in Penn's School of Medicine. "These findings are the first to document mood and neuroendocrine effects of male pheromones on females."

In a study led by Preti and colleague Charles J. Wysocki, extracts from the underarms of male volunteers were applied to the upper lip of 18 women ages 25 to 45. During the six hours of exposure to the compound, the women were asked to rate their mood using a fixed scale.

"Much to our surprise, the women reported feeling less tense and more relaxed during exposure to the male extract," said Wysocki, a member of the Monell Center and adjunct professor of animal biology in Penn's School of Veterinary Medicine. "This suggests that there may be much more going on in social settings like singles bars than meets the eye."

After the women's exposure to the underarm extract, further testing revealed a shift in blood levels of luteinizing hormone. Levels of this reproductive hormone, produced in pulses by the pituitary gland, typically surge right before ovulation but also experience hundreds of smaller peaks throughout the menstrual cycle.

Preti and Wysocki found that application of male underarm secretions hastened onset of these smaller pulses. Duration to the next pulse of luteinizing hormone was shortened by an average 20 percent, from 59 to 47 minutes.

Headed for a pick-up bar? How about a work-out or a trip to the sauna first?

The scientists who did this work are currently trying to identify which compound(s) in sweat cause the reported responses. Cologne and after-shave makers will no doubt rush to incorporate any compound that is found to play a role in causing these effects.

By Randall Parker 2003 March 17 11:59 PM  Brain Sex Differences
Entry Permalink | Comments(0)
2003 March 15 Saturday
Severe Acute Respiratory Syndrome Causing Concerns

Update VII: SARS now looks to be much more infectious than previously thought. See my later post Fears Grow That SARS May Spread Into Pandemic. Also read below for historical context about previous epidemics.

A new deadly strain of some kind of pathogen (it's not yet clear whether it's an influenza, but it is suspected to be a virus of some kind) has been making people sick, and it appears to have originated in Guangdong province, China. The illness is being called atypical pneumonia or Severe Acute Respiratory Syndrome (SARS). The nature of the illness has started to raise alarms.

Before we get to that, let's put it in perspective by taking a brief look at the most famous and deadly influenza outbreak recorded in modern history. The lethality of that outbreak explains why public health officials become very worried when new strains of pathogens with increased lethality are reported.

During World War I a virulent influenza known as Spanish flu swept the world and killed tens of millions.

The Epidemic spread quickly around the earth. In all, some 525 million people were infected by the virus, with about 21 million people dying. That was more than twice the number who had been killed during the Great War. In many countries public gatherings were forbidden. The Flu was especially devastating on many people as they welcomed back their men from the war, overjoyed that they had managed to survive the slaughter that was the war. But their joy soon turned to grief when they found out that their men had brought the virus back with them, and it would not only kill them but also other family members.

The flu may really have originated in Tibet but the first known concentration of deaths from it occurred in Spain, hence it's called the Spanish Flu. Estimates of how many died from it range from 20 to 40 million. Spanish Flu struck down people in the prime of life.

Spanish Influenza swept the entire globe in the years 1918-1920, leaving a billion people sick, more than half of the world's population at that time. It killed at least 30 million people, three times the death toll of World War I (Wilton 1993). A study for Norway has recently resulted in an upward revision of the death toll. The suggested estimate is 14 676, twice as high as the most frequently cited figure (Mamelund 1998a). The socio-economic impact of the flu was also considerable. One reason for this is that the flu took its greatest toll among people in their most productive ages (20-40 years, especially men), i.e. that part of life when people tend to marry and have children (Mamelund 1998a).

Spanish Flu killed far more than World War I.

The 1918 Spanish flu was one of the most contagious viruses ever known. It killed as many as 40 million people in the winter of 1918 and 1919, more than died in the First World War.

Flu strains vary considerably in their lethality. Spanish flu belongs to the Type A strain. Type A strains are usually more lethal than other types. See here and here for information about influenza types and how they mutate to form new strains. A later type A influenza strain killed hundreds of thousands in 1968:

Hong Kong Flu - Common name for the influenza A strain that killed nearly 750,000 people around the world in the 1968 pandemic

Influenza mutates. Some mutations are more lethal than others. As Spanish flu demonstrated, some can be incredibly lethal. Should we be worried that a new influenza strain might pop up and kill millions including members of our families and circles of friends? Well, on one hand we have more advanced medical technologies. You might think we could handle a new strain as harmful as the Spanish flu much better. But keep in mind that if a significant portion of the population gets sick, all the hospital beds will fill up and there won't be enough respirators and other modern medical equipment to go around. Also, we don't have highly effective treatments to use against viruses that compare to the antibiotic drugs that are effective against most bacteria (though the rise of drug-resistant bacterial strains is making bacterial infections a growing concern).

Another problem we have is that cars and airplanes move more people around the globe and much faster than was the case over 80 years ago. So new disease strains can spread rather rapidly and can reach even remote places.

Still, it's not all doom and gloom. The best way to avoid dying or getting very sick from a disease is to avoid exposure in the first place. The biggest advantage we have is far better ways to isolate ourselves from sick people. For instance, we live in less dense housing. Accounts of the 1918 epidemic describe immigrant families living in crowded New York City tenements where lots of people breathed each other's air. A family just had to have one member come home with the disease and soon ten or twenty others were all exposed to it, and likely other people walking up and down the same staircases were exposed as well. By contrast, today we have smaller families, on average a much lower number of people living in each dwelling, and more square feet of living space per person.

If it was suspected that some deadly disease on the order of the 1918 flu epidemic was on the scene then the most rational response would be to quickly and calmly reorder society in ways that would reduce risks of exposure. With this in mind, one of the strangest (at least to my American eyes) things I saw riding subways and trains in Japan was people wearing surgical masks. Either they were sick and didn't want to pass their illness on to others or they wanted to avoid breathing in particles of influenza and cold viruses coughed into the air by others. Such a practice is easy to adopt. The inconvenience would be fairly minor and it beats dying. Even if there were a shortage of surgical masks, all manner of cloth can be adapted to that purpose.

A reduction of exposure between humans can be accomplished in many other ways. People who go shopping can go less often, buy more per trip, and not go during rush hours when the aisles are crowded. Optional activities such as vacations, club meetings, movie outings, concert attendance, and the like can be cancelled. People who are able to work from home can stop going into the office. Another simple measure is to avoid touching surfaces in public places. If you do, then wash your hands quickly (perhaps with a bottle of antiseptic fluid). Better yet, wear gloves and avoid touching surfaces in public places. Also, when out in public avoid touching your face with your hands unless you've recently washed your hands. Even the people who can not change their daily routine will be at less risk if all those who can change their daily routine do so to the extent that they can.
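
The logic behind these precautions can be made quantitative with a toy SIR epidemic model: cutting the rate of risky contacts dramatically shrinks the epidemic's peak. The transmission and recovery parameters below are illustrative assumptions, not estimates for SARS.

    # Toy SIR epidemic model: cutting the contact rate shrinks the peak.
    # beta (infections per infective per day) and gamma (recovery rate per day)
    # are illustrative assumptions, not SARS estimates.
    def peak_infected_fraction(beta, gamma=0.1, days=400):
        s, i = 0.999, 0.001  # start with 0.1% of the population infected
        peak = i
        for _ in range(days):  # one-day Euler steps
            new_infections = beta * s * i
            recoveries = gamma * i
            s -= new_infections
            i += new_infections - recoveries
            peak = max(peak, i)
        return peak

    for beta in (0.30, 0.20, 0.12):  # progressively fewer risky contacts per day
        print(f"beta={beta:.2f}: peak infected fraction {peak_infected_fraction(beta):.1%}")

    # Roughly 30% of the population is sick at once at beta=0.30, about 15%
    # at 0.20, and under 2% at 0.12: modest behavior changes compound.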

Okay, so lethal epidemics can still happen. But we have lots of things we can do to reduce our risks of getting seriously ill or killed in such an epidemic. With all this in mind, let's look at a recent development that has health officials thinking some pretty worried thoughts.

WHO issues a global alert about cases of atypical pneumonia

12 March 2003 | GENEVA -- Since mid February, WHO has been actively working to confirm reports of outbreaks of a severe form of pneumonia in Viet Nam, Hong Kong Special Administrative Region (SAR), China, and Guangdong province in China.

In Viet Nam the outbreak began with a single initial case who was hospitalized for treatment of severe, acute respiratory syndrome of unknown origin. He felt unwell during his journey and fell ill shortly after arrival in Hanoi from Shanghai and Hong Kong SAR, China. Following his admission to the hospital, approximately 20 hospital staff became sick with similar symptoms.

The signs and symptoms of the disease in Hanoi include initial flu-like illness (rapid onset of high fever followed by muscle aches, headache and sore throat). These are the most common symptoms. Early laboratory findings may include thrombocytopenia (low platelet count) and leucopenia (low white blood cell count). In some, but not all cases, this is followed by bilateral pneumonia, in some cases progressing to acute respiratory distress requiring assisted breathing on a respirator. Some patients are recovering but some patients remain critically ill.

Today, the Department of Health Hong Kong SAR has reported on an outbreak of respiratory illness in one of its public hospitals. As of midnight 11 March, 50 health care workers had been screened and 23 of them were found to have febrile illness. They were admitted to the hospital for observation as a precautionary measure. In this group, eight have developed early chest x-ray signs of pneumonia. Their conditions are stable. Three other health care workers self-presented to hospitals with febrile illness and two of them have chest x-ray signs of pneumonia.

World Health Organization issues emergency travel advisory

15 March 2003 | GENEVA -- During the past week, WHO has received reports of more than 150 new suspected cases of Severe Acute Respiratory Syndrome (SARS), an atypical pneumonia for which cause has not yet been determined. Reports to date have been received from Canada, China, Hong Kong Special Administrative Region of China, Indonesia, Philippines, Singapore, Thailand, and Viet Nam. Early today, an ill passenger and companions who travelled from New York, United States, and who landed in Frankfurt, Germany were removed from their flight and taken to hospital isolation.

Due to the spread of SARS to several countries in a short period of time, the World Health Organization today has issued emergency guidance for travellers and airlines.

“This syndrome, SARS, is now a worldwide health threat,” said Dr. Gro Harlem Brundtland, Director General of the World Health Organization. “The world needs to work together to find its cause, cure the sick, and stop its spread.”

There is presently no recommendation for people to restrict travel to any destination. However in response to enquiries from governments, airlines, physicians and travellers, WHO is now offering guidance for travellers, airline crew and airlines. The exact nature of the infection is still under investigation and this guidance is based on the early information available to WHO.

Countries are starting to discourage their citizens from travelling to areas where SARS has been reported. Thailand joins Singapore and Taiwan in urging their citizens not to go to Hanoi or southern China.

Passengers are being required to fill out health forms indicating whether they had been to the affected areas, and airlines have been instructed to report immediately if any passengers begin exhibiting symptoms.

The announcement follows similar moves by Singapore and Taiwan, which have both urged their citizens not to travel to Hanoi in Vietnam or southern China "unless absolutely necessary".

The experts do not know the type of disease causing SARS.

"It is either a new germ which hasn't caused a disease before or is a more common germ which has undergone a large change," David Bell, a public health physician at the Manila-based WHO Western Pacific office, said.

"If it is a new organism -- which has undergone significant change -- it may be more difficult to identify," warned Rob Condon, a WHO epidemiologist at the same office.

World Health Organization official David Heymann is clearly worried.

"It is not a very good situation," said Dr. David L. Heymann, a top expert in communicable diseases at the health agency. "It is a very difficult disease to figure out, and this has been going on for the last 10 days to two weeks."

Influenza has not been ruled out as the cause.

Among the survivors, "no one has gotten well yet," Dr. Heymann said in an interview. "It is not clear what is going on, and it is not clear what the extent of spread will be," particularly because "these are areas where there is a lot of international travel," he added.

The WHO spokesman sounds rather concerned.

Dick Thompson, a WHO spokesman in Geneva, could recall no such emergency travel advisory being issued in recent memory.

"Until we can get a grip on it, I don't see how it will slow down," said Thompson. "People are not responding to antibiotics and antivirals, it's a highly contagious disease and it's moving around by jet. It's bad."

Now that so many health authorities and medical doctors are attempting to identify victims of this disease it may be possible to contain it. One advantage we have today that didn't exist over 80 years ago is that information travels even more quickly than people do. A lot of the initial victims were hospital workers. One would expect the rate of infection among health care workers to drop as they learn to recognize the disorder more quickly and take more drastic measures to avoid exposure from infected patients. One concern, however, is that health care workers in less developed countries may lack the facilities and supplies needed to reduce their own risk. They may therefore end up either spreading the disease themselves or turning away the sick, leaving the sick improperly isolated. Whether efforts at containment will work remains to be seen.

Will this disease spread and kill massive numbers of people? Don't know. It's certainly a story to watch very carefully.

Update: Encouraging news about SARS comes from a CDC press conference. Dr. Julie Gerberding of the CDC says the pattern of transmission so far has been through close personal contact.

QUESTION: And also, do we know how contagious? I mean if I was on a subway car with someone who was ill, could I get it from them, or do you need to have that close like I'm-taking-care-of-me kind of contact.

DR. GERBERDING: What we know so far from the investigations in progress are that it's very close personal contact of the type defined by WHO as having cared for, having lived with, or having had direct contact with respiratory secretions and body fluids of a person with the diagnosis. So there is no evidence to suggest that this can be spread through breath contact or through assemblages of large people; it really seems to require a fairly direct and sustained contact with a symptomatic individual.

If this pattern of transmission continues then the chances of containing the disease will be much better. If it becomes as easy to transmit as a cold or regular influenza then containment would be much more difficult and perhaps impossible. So far there has been no indication whether the disease is transmissible during the asymptomatic incubation period. Whether it is will also affect the ability to slow or stop its spread.
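
The containment logic can be made concrete with a toy branching-process model. In the sketch below each case infects a binomially distributed number of new cases, so the basic reproduction number R0 is the contact count times the per-contact transmission probability. The parameter values are made up for illustration, not fitted to SARS data: when R0 stays below 1 (close-contact-only spread) chains of infection tend to sputter out, while above 1 (cold-style casual spread) case counts compound.

```python
import random

def simulate_outbreak(contacts, p_transmit, generations=8, seed=1):
    """Toy branching-process outbreak model.

    Each active case exposes `contacts` people per generation and
    infects each with probability `p_transmit`, so R0 = contacts *
    p_transmit. Susceptibles are never depleted, so this describes
    only the early phase of an outbreak.
    """
    rng = random.Random(seed)
    active, cumulative = 1, 1
    history = [cumulative]
    for _ in range(generations):
        active = sum(1 for _ in range(active * contacts)
                     if rng.random() < p_transmit)
        cumulative += active
        history.append(cumulative)
        if active == 0:
            break
    return history

# Close-contact-only spread (R0 = 4 * 0.2 = 0.8) vs.
# casual cold/flu-style spread (R0 = 15 * 0.2 = 3.0).
print("R0 0.8:", simulate_outbreak(contacts=4, p_transmit=0.2))
print("R0 3.0:", simulate_outbreak(contacts=15, p_transmit=0.2))
```

Transmission during the asymptomatic incubation period matters for the same reason: it effectively raises the number of contacts each case makes before anyone thinks to isolate them.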

Update II: Even if SARS doesn't turn out to be a massive killer plague (and its pattern of transmission suggests it will not be) we are still vulnerable to being killed off by a pathogen that hops into the human population from another species and mutates into a virulent form. We have had close calls that were contained, such as the 1997 chicken influenza, which was extremely deadly in humans.

In 1997 epidemiologists and public health officials from around the world got their first glimpse of an entirely new variety of human influenza. Known as subtype H5N1 for the surface proteins which the virus carries, the new strain had only ever previously been observed in birds. Ominously, the effect of H5N1 on poultry had earned it the evocative title of "Chicken Ebola." And when it surfaced in the human population of Hong Kong last year it proved to be almost as deadly.

How deadly? Even with the advantages of intensive-care treatment, fully one third of the first 18 confirmed cases never recovered. They died.

Some additional accounts of the 1918 Spanish Flu Pandemic here and here provide a sense of how rapidly it spread and how difficult it would be to fight a similar outbreak today.

What is needed is the ability to develop and produce vaccines more rapidly. Some researchers hold out DNA vaccines as promising faster and lower-cost manufacture with fewer side effects than many conventional vaccines. However, another promising approach is to use bacterial viruses known as bacteriophages to produce vaccines very rapidly and cheaply.

BALTIMORE – March 10, 2003 – Genetically altered bacterial viruses appear to be more effective than naked DNA in eliciting an immune response and could be a new strategy for a next generation of vaccines that are easy to produce and store, say researchers from Moredun Research Institute in the United Kingdom.

"In theory, millions of doses can be grown within a matter of days using simple equipment, media and procedures," says John March, one of lead researchers presenting findings at the American Society for Microbiology's Biodefense Research Meeting.

Bacteriophages are viruses that infect bacteria but not humans. In this particular study, March and his colleagues used a bacteriophage as a vehicle for genes from hepatitis B virus in mice and compared its ability to elicit a protective immune response with a vaccine made of naked DNA. They found that not only could the bacteriophage induce an immune response, the number of bacteriophage they needed was less than 1 percent of the number of pieces of naked DNA required to mount an effective immune response.

Using bacteriophages to deliver vaccine components offers several advantages over vaccination with naked DNA, says March. The DNA is protected inside the protein shell of the virus making it longer lasting and easier to store. In addition, bacteriophages have a large cloning capacity, making large-scale production cheap, easy and extremely rapid – important attributes considering the current bioterrorism threat when sudden demands may be placed on vaccine stocks.

Before vaccines can be produced they first must be developed: one has to have a design for a vaccine. Therefore the other needed element of a fast-response strategy for new strains of influenza, and even for entirely new kinds of pathogens, is a set of high-isolation labs equipped to rapidly take apart a pathogen and develop vaccines against it. See my recent post United States Lacks Sufficient Biodefense Lab Space. What is needed is not just ultra-secure and ultra-isolated lab space. The labs would need to be equipped with, or be located near, labs that have the capability to do DNA sequencing, protein sequencing, protein structure determination, and other relevant work. Properly designed and equipped, such labs could work on longer term problems between crises. But when a deadly naturally occurring or man-made pathogen threatened to cause massive numbers of fatalities, the best microbiologists and virologists could staff them and work to develop vaccines and drugs.

Update III: The illness may be caused by a paramyxovirus.

There is a long list of other candidates, with a family of microbes called the paramyxoviruses "certainly ranking on the top of most people's thoughts," said Klaus Stohr, a WHO virologist and epidemiologist who is helping to direct the investigation.

Update IV: The CDC has a web page where all CDC SARS press releases are published. The WHO puts out all the WHO SARS press releases on their general press release page.

Update V: To reiterate for those who are worried that SARS could become an enormous killer: It is spreading slowly. It appears to require fairly close contact to catch it. It does not appear to be as easily transmitted as many cold and influenza viruses. In a March 17, 2003 press conference Dr. Julie Gerberding, Director of the Centers for Disease Control and Prevention, says SARS is not being transmitted by casual contact. (bold emphases mine)

We know that the disease is so far limited to people who have had very close contact with cases. Most of the individuals are health care personnel who have been in direct contact with either the patient or body fluids from the patient. We also know that household contacts are at risk, particularly if they've had direct and sustained contact with sick individuals.

So far the cases are limited, as Secretary Thompson said, to individuals who have either lived in parts of Asia that are affected, or who have recently traveled from those areas.

We believe the incubation period is approximately 2 to 7 days, although as new information unfolds, that may be updated. So the travel advisories that have been issued stipulate that individuals returning from those areas with fever and respiratory symptoms within 7 days of their departure should seek medical attention to be sure that they are not in the early stages of this syndrome.

We also know that there is no evidence so far that persons not in direct contact with suspect cases are at risk. We have not identified any people with casual contact or indirect contact. I think we were reassured by the investigation here in Georgia, where there was an individual who acquired this infection presumably from family members, was here in this city while sick, was involved in activities that involved exposure to others in a workplace setting, and there is no evidence of spread from that kind of contact in the workplace.

Nevertheless, I stress again this is an ongoing investigation. We certainly don't have all the information we need to know to have certainty about any of these issues, and we will just simply have to update you as we go forward.

The most important thing that we need to do is to prevent spread of this infection, and I'll tell you some of the things we're doing about that right now. But the second most important thing is to figure out what's causing. This appears to be a contagious infectious disease, and as I said, limited to health care personnel and close household contacts. That suggests spread by the droplet route, and that's why our infection control precautions emphasize prevention of droplet spread through the use of face shields and gowns and gloves.

Update VI: The disease is increasingly looking like it is not spreading. Victims are popping up in more countries but the vast bulk of them were infected in China or Vietnam. Therefore a general outbreak all around the world is looking less likely. Now that knowledge of its symptoms has been widely disseminated, victims are rapidly isolated and health care workers are protected from exposure. Hopefully this trend will continue and the disease will be contained. The disease increasingly looks like it is caused by a new pathogen.

"As time goes by that is increasingly likely, simply because so many people have run so many tests," said Iain Simpson, a spokesman for the World Health Organisation (WHO).

"If it is something we already knew about we would almost certainly have identified it," he told Reuters.

While this disease looks like it is going to be successfully contained, it should serve as a wake-up call that we are ill-equipped to deal with a deadly disease that spreads easily and breaks out into the general population. Some day a much more deadly influenza strain will cross over from fowl or swine into humans, and as yet we are unprepared to deal effectively with that eventuality.

Update VII: SARS now looks to be much more infectious than previously thought. See my later post Fears Grow That SARS May Spread Into Pandemic.

By Randall Parker 2003 March 15 05:45 PM  Dangers Natural Bio
Entry Permalink | Comments(19)
2003 March 14 Friday
World Population May Shrink Later In 21st Century

Ben Wattenberg says UN demographers have finally accepted the extent of declining birth rates.

Now, in a new report, United Nations demographers have bowed to reality and changed this standard 2.1 assumption. For the last five years they have been examining one of the most momentous trends in world history: the startling decline in fertility rates over the last several decades. In the United Nations' most recent population report, the fertility rate is assumed to be 1.85, not 2.1. This will lead, later in this century, to global population decline.

Wattenberg and the UN demographers are almost certainly wrong in predicting a decline in the human population. If robots do not take over and if nanotech replicators do not run amuck and wipe us out then the human population will increase even late in the 21st century. Why? Aging rejuvenation therapies will cause a radical increase in life expectancy.

Increased life expectancy will increase the population in two ways. The most obvious is of course that people who live longer will not cause a population decline by dying off. The less obvious cause is that aging rejuvenation therapy will allow female reproductive organs to stay functional for a longer period of time. Women will be able to have children in their 50s, 60s, and later. Gene therapies will help repair aging cells and cell therapies will replace aging stem cell reservoirs. Also, the development of the ability to grow new organs will eventually include the ability to grow new ovaries and other female reproductive organs.

Another future contributing factor to the growth in human population will be the development of artificial wombs. This will increase the fertility rate even among younger women who are too busy with their careers to want to be slowed down by a pregnancy. Consider the Hollywood actresses who can't get pregnant without putting their careers on hold. This is especially true for TV actresses who star in their shows. Filming goes on for too large a portion of the year to allow a pregnancy to happen. When artificial wombs become reliable many will opt instead to have their babies grown from their own cells but not in their own bodies. An artificial womb will be more trustworthy than a surrogate mother because women won't have to worry about the artificial womb doing drugs, drinking alcohol, smoking, eating poorly, or getting an infection.

Natural selection is also going to eventually cause reproduction rates to rise. In a study of twins in Australia evidence was found for natural selection for genetically influenced traits that increase fertility.

University-educated women have 35% lower fitness than those with less than seven years education, and Roman Catholic women have about 20% higher fitness than those of other religions. Although these differences were significant, education and religion only accounted for 2% and 1% of variance in fitness, respectively. Using structural equation modeling, we reveal significant genetic influences for all three life-history traits, with heritability estimates of 0.50, 0.23, and 0.45, respectively. However, strong genetic covariation with reproductive fitness could only be demonstrated for age at first reproduction, with much weaker covariation for age at menopause and no significant covariation for age at menarche. Selection may, therefore, lead to the evolution of earlier age at first reproduction in this population.

Current highly visible trends are useful for predicting the future one or two decades in advance. But the further out a prediction is made the more other factors need to be considered. Natural selection happens more slowly than changes caused by industrialization. But natural selection does happen and it is exerting selective pressure that is changing the behavioral and other characteristics of humans. At the same time, the ways in which technology is changing society today are not the only ways technology will reshape human society tomorrow.

The two biggest wild cards that make future prediction extremely difficult for the 21st century will be the development of machine intelligence and the development of genetic engineering techniques for boosting human intelligence. These two developments will cause such huge changes in human society that predictions of demographers about human reproductive patterns 50 or 100 years from now are almost certainly very far from what will really happen.

It is still possible that the human population on Earth will decrease by the end of the 21st century. But it is unlikely to do so as a result of trends that demographers can now measure. Humans could migrate off planet in large numbers when space flight becomes widely affordable. Or a world government, responding to the development of biotechnologies that can rejuvenate and dramatically extend life, could enforce strict reproductive limits in order to prevent a rise in population. Or perhaps humans may all become part of a Borg mind that suppresses reproductive instincts in the individual nodes. Or a virus could be released into the population that makes genetic alterations that in turn cause cognitive changes in human minds that make child-rearing unappealing. There are a lot of reasons human population could rise or fall in the latter part of the 21st century. The most important factors that will determine the outcome depend on technological developments that will happen as the century progresses. We cannot know with sufficient precision how all these changes will play out. Therefore predictions of world population change become increasingly inaccurate the further out they are made.
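
To get a feel for how sensitive century-scale projections are to the fertility assumption alone, here is a minimal sketch. It simply multiplies the population by TFR/2.1 once per generation; the generation length, starting population, and the decision to ignore mortality trends, age structure, migration, and any life-extension biotech are all simplifying assumptions for illustration, not anyone's actual demographic model.

```python
def project_population(start_billions, tfr, replacement=2.1,
                       years=100, generation_years=28):
    """Crude projection: each generation multiplies the population by
    tfr / replacement. Ignores mortality trends, age structure,
    migration, and any radical life extension."""
    generations = years / generation_years
    return start_billions * (tfr / replacement) ** generations

# World population was roughly 6.3 billion in 2003.
for tfr in (1.85, 2.1, 2.35):
    print(f"TFR {tfr}: ~{project_population(6.3, tfr):.1f} billion after 100 years")
```

A swing of a quarter of a child per woman moves the 100-year outcome from roughly 4 billion to over 9 billion, which is exactly why an unmodeled factor like rejuvenation therapy can swamp the whole projection.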

By Randall Parker 2003 March 14 09:09 AM  Trends Demographic
Entry Permalink | Comments(3)
2003 March 13 Thursday
Ritalin For Children Reduces Later Alcohol and Drug Abuse

Some people argue about whether Attention Deficit Hyperactivity Disorder (ADHD) is overdiagnosed. The use of Ritalin on children is linked to a larger debate on whether the mind's function can be explained as a bunch of biochemistry and electrical patterns. The argument against evolutionary psychology on the grounds that evolutionary psychology relies upon "an evolutionary past which is permanently inaccessible to empirical research" is not persuasive because the evolutionary past really is scientifically accessible in a number of ways. For instance, comparative DNA sequence analysis within and across species combined with measures of various attributes can yield a great deal of useful information about selective pressures that must have acted on humans and other species (e.g. mutations that provide resistance to particular illnesses are found in people from parts of the world where those diseases are endemic).

While the debate continues about whether various aspects of human nature are genetically specified, the reductionist neurobiologists continue to find ways to manipulate the mind biochemically. While the rate of occurrence of ADHD is debated, the use of Ritalin has recently been found to have long term effects on behavior.

A study by researchers at Harvard University has provided more evidence that using stimulant medications such as methylphenidate to treat children with attention-deficit/hyperactivity disorder (ADHD) may reduce their risk of developing drug and alcohol use disorders later in life.

Dr. Timothy Wilens, lead investigator, and colleagues used a statistical method called meta-analysis (an examination of whether data compiled from multiple scientific studies provides evidence for statistical significance) to evaluate the relationship between stimulant therapy and subsequent substance use disorders (SUD) in youths with ADHD. After searching the literature for studies of children, adolescents, and adults with ADHD that had information on childhood exposure to stimulant therapy and later SUD outcomes, the researchers applied meta-analyses to six long-term studies. Two studies followed patients into adolescence and four followed patients into young adulthood. These studies comprised data from 674 youths receiving medication therapy for ADHD and 360 unmedicated youths with ADHD. Of those receiving medications, 97 percent were taking the stimulants methylphenidate or amphetamine.

From the compiled data, researchers found that youths with ADHD who were treated with stimulants had an almost two-fold reduction in the risk for developing SUD when compared with youths with ADHD who did not receive stimulants. Examination of each study individually suggested that stimulant medications might have a protective effect against the development of SUD.

Analysis of studies that reported follow-up into adolescence revealed that youths treated with stimulants were 5.8 times less likely to develop SUD than those not treated. However, analysis of studies that followed subjects into adulthood found that those treated with stimulants were about 1.5 times less likely to develop SUD. The researchers say that the less robust effect during adulthood may have occurred because the patients discontinued stimulant treatment when they reached a certain age or that parents may closely monitor the medications of youths with ADHD.

Overall, treating ADHD pharmacologically appears to reduce the risk of substance abuse by half. Untreated, ADHD is associated with a two-fold increased risk for developing a substance abuse disorder. Hence, while not truly immunizing against substance abuse, treating ADHD pharmacologically reduces the risk for drug and alcohol abuse and addiction to the level of risk faced by the general population. The report's findings are among the most robust in child psychiatry demonstrating a protective effect of pharmacological treatment on reducing the risk for later substance abuse.

The study, funded by the National Institute on Drug Abuse (NIDA), is published in the January 2003 issue of Pediatrics.

Think about some of the implications if this report turns out to be correct. A drug has been identified that will affect the development of the mind in such a way that it produces behavior which is more adaptive. Surely this will not be the last such drug found.

It may turn out that gene therapy will not be necessary in order to cause children to develop different personalities or higher intelligence. Surely gene therapy will turn out to be a more powerful technique than drug use. But if drug use alone can affect cognitive development in a way that is not damaging then engineering of personality types may become more widespread more quickly.

It is possible that Ritalin's effect works for only as long as the drug is taken. It may block pleasure caused by other drugs or may provide some of the same pleasure and therefore reduce the size of the increase in pleasure caused by recreational drugs.

The reason it is plausible that Ritalin may have enduring effects is that during adolescence the human mind undergoes a lot of growth and reorganization. Drugs taken during that time that affect mental state likely affect the pattern of connections that form and hence should have lasting effects. Also, an injectable protein has already demonstrated the ability to enhance learning in rats. It should be possible to develop drugs that will affect gene expression of assorted proteins involved in nerve growth and therefore to change the course of brain development during adolescence.

By Randall Parker 2003 March 13 03:14 PM  Brain Addiction
Entry Permalink | Comments(9)
2003 March 12 Wednesday
Silicon Chip May Be Brain Hippocampus Replacement

Damage to the hippocampus at the base of the brain can leave a person unable to form new memories. One solution to the problem, now nearing testing, is to build a chip that performs all the functions of the hippocampus.

The world's first brain prosthesis - an artificial hippocampus - is about to be tested in California. Unlike devices like cochlear implants, which merely stimulate brain activity, this silicon chip implant will perform the same processes as the damaged part of the brain it is replacing.

The prosthesis will first be tested on tissue from rats' brains, and then on live animals. If all goes well, it will then be tested as a way to help people who have suffered brain damage due to stroke, epilepsy or Alzheimer's disease.

A team led by Theodore W. Berger of USC spent 10 years building a mathematical model of the hippocampus and then programming it into a silicon chip.

Slices of rat hippocampus were stimulated with electrical signals millions of times, until scientists could be sure which input produced a corresponding output.

Putting the information from each slice together, the researchers were able to devise a mathematical model of a whole hippocampus.

The model was then programmed on to a chip.
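
What this describes is, in outline, black-box system identification: drive the tissue with many input patterns, record the outputs, and fit a function that reproduces the input/output mapping. Below is a minimal sketch of that idea using a linear model and a synthetic stand-in for the tissue; the real hippocampal models are nonlinear and vastly more elaborate, so every detail here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the unknown tissue: its response is a weighted
# sum of the last three input pulses, plus recording noise.
true_kernel = np.array([0.5, 0.3, 0.1])

def tissue(stimulus):
    clean = np.convolve(stimulus, true_kernel)[:len(stimulus)]
    return clean + rng.normal(scale=0.05, size=len(stimulus))

# Step 1: stimulate many times and record the input/output pairs.
stimulus = rng.integers(0, 2, size=5000).astype(float)
response = tissue(stimulus)

# Step 2: fit a model that predicts each output from recent inputs.
lag = 3
X = np.column_stack([np.roll(stimulus, k) for k in range(lag)])
kernel_est, *_ = np.linalg.lstsq(X[lag:], response[lag:], rcond=None)

print("estimated kernel:", np.round(kernel_est, 2))  # approx [0.5 0.3 0.1]
```

Once the fitted function predicts the tissue's outputs well enough, it can be baked into hardware, which is the role the VLSI chip plays in Berger's project.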

From the University of Southern California web site of team leader Theodore W. Berger:

The research of Dr. T.W. Berger involves the complementary use of experimental and theoretical approaches to developing biologically constrained mathematical models of mammalian neural systems. The focus of the majority of current research is the hippocampus, a neural system essential for learning and memory functions. The goal of this research is to address three general issues: (1) the relation between cellular/molecular processes, systems-level functions, and learned behavior; (2) the extent of which the functional dynamics of neural systems are altered by activity-dependent synaptic plasticity; (3) the extent to which the essential functions of a neural system can be incorporated within a hardware representation (e.g., VLSI circuitry).

Experimental studies involve the use of extracellular, intracellular, and whole-cell electrophysiological recording techniques, applied in vivo using anesthetized and chronically implanted animals, and in vitro using hippocampal slice preparations. A number of neurobiological issues are being investigated, including: (1) quantifying the signal processing capabilities of hippocampal neurons and the extent to which these capabilities reflect regulation due to feedforward and feedback circuitry vs. intrinsic neuronal mechanisms, such as voltage-dependent conductances or second messenger biochemical systems; (2) the spatio-temporal distribution of activity in neural networks and its dependence on input pattern and network connectivity; (3) the cellular mechanisms underlying changes in the strength of connections among neurons, i.e., synaptic plasticity, and the influence of synaptic plasticity on signal processing characteristics of neurons and the spatio-temporal distributions of activity in networks.

These and other experimental studies are used in conjunction with several different theoretical approaches to develop models of: (1) the nonlinear, input/output properties of single hippocampal neurons and circuits composed of several populations of hippocampal neurons (in collaboration with Dr. V. Marmarelis, Biomedical Engineering, USC), (2) the hierarchical relationship between synaptic and neuronal events (in collaboration with Dr. G. Chauvet, Institute for Theoretical Biology, University of Angers, France), (3) the kinetic properties of glutamatergic receptor subtypes, and (4) adaptive properties expressed by the "hippocampal-like" neural networks implemented with analog VLSI technology (in collaboration with Dr. B. Sheu, Electrical Engineering, USC).

Suppose the initial tests on rats are successful and the group wants to move on to trying it in humans. There seems to be a problem with how to get patient consent. People who can't form new long term memories may be unable to have the treatment explained to them well enough to be able to evaluate the risks and potential benefits.

Still, this is weird wild stuff. If a chip can be made to emulate the hippocampus can the chip's algorithms be improved upon to make it better than the hippocampus? Could it be turned up to stimulate learning when one is studying material one needs to remember?

By Randall Parker 2003 March 12 08:52 PM  Cyborg Tech
Entry Permalink | Comments(8)
On Parenthood and Genetic Engineering

Stanley Kurtz reports on a lesbian couple who want to have more than two people declared parents of a child.

A lesbian couple from London, Ontario has asked a Canadian court to simultaneously recognize the two of them (the biological mother and her partner), as well as the biological father, as legal parents of a young boy. Rather than turn to an anonymous sperm donor, the women in question asked a friend to father their child. The father does not live with the couple and child, but is nonetheless treated as a member of the household.

Group Marriage Would Be Problematic

Kurtz sees this development as a threat to the existing institution of marriage and to the two parent family. As he sees it, gay marriage will find its justification in homosexual partners sharing legal parent status of the same child. If more than two adults can be legal parents of a child then that will be used to justify group marriages.

Group marriages would be hopelessly unwieldy. With more than two members there are far more ways to disagree. One person could decide to divorce all the other marriage members. Or one person might try to force another person out of the marriage while other members either oppose the move or are ambivalent. Or two groups within the marriage could split off and divorce the members of the other group. There could even be legal fights over who ends up in the two new subgroups, with some member unwilling to choose one subgroup or the other.

Is it fair to the public at large to inflict group marriage and all its problems on society? Companies could easily face lawsuits because they do not extend spousal benefits to more than one spouse. The legal costs of divorce court and of police calls for domestic dispute situations would be much higher.

Two Competing Legal Claims On A Child Are Bad Enough

Then there are the children. There are a lot of already-occurring situations involving biological and non-biological parents. In the case of the lesbian couple and the biological father there are only two biological parents involved but three people all willing to share custody. In adoption cases where a couple adopts a child there is no biological parent involved. In cases where one biological parent marries a person unrelated to their child, the unrelated person may opt to formally adopt the child so that both marriage partners are legal parents of the child. Though in such a case the other biological parent could assert rights and the adoption by the non-biological partner could be challenged.

Already children shuffle back and forth to live parts of their time with two different parents. It would seem cruel to make a child move around to five or six different residences after the break-up of a group marriage.

What If The Child Doesn't Have Exactly Two Biological Parents?

In all of those cases there are still exactly two biological parents. This will not always be the case. Human cloning will some day be perfected. This will allow for there to be just one biological parent. Parenthetically, there will be just two grandparents and they will have the same genetic relationship to the clone as they have to their child who is the parent of the clone. This could conceivably lead to legal battles in some cases where the grandparents seek to gain legal recognition of their status as parents.

Cloning throws up a fairly simple case of untraditional genetic parenthood relationships. But that is not the only case that will happen. Advances in biotechnology promise to make the genetic (and hence legal) parenthood issue a much more complicated question. Eventually it will be possible to use chromosomes from more than two people in order to put together the chromosome complement for a new human. Each person has 23 pairs of chromosomes for 46 chromosomes total. It will one day be possible to construct an embryo that contains chromosomes donated by 46 different people. Well, who should be eligible for status as custodial parent in such a case? In this hypothetical case each chromosome donor could claim to have donated approximately 2% of the genetic complement of a child. Since some chromosomes are bigger and some smaller, it's not even the case that all 46 donors will have made an equal contribution.
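
To put rough numbers on that inequality: human chromosomes differ in size by more than a factor of five, so single-chromosome donors would hold very different shares of the genome. A quick sketch using approximate, rounded chromosome sizes (the 46-donor scenario itself is of course hypothetical):

```python
# Approximate sizes of a few human chromosomes, in megabase pairs.
chromosome_mbp = {"chr1": 250, "chr2": 240, "chr13": 115,
                  "chr21": 47, "chr22": 51}
DIPLOID_TOTAL_MBP = 6200  # ~3.1 billion base pairs x 2 copies

for name, size in chromosome_mbp.items():
    print(f"{name}: one copy is ~{size / DIPLOID_TOTAL_MBP:.1%} of the genome")

# Baseline if all 46 donors counted equally:
print(f"equal split would be {1 / 46:.1%} per donor")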

Stolen Chromosomes And Offspring

It gets weirder. Some day an embryo may be constructed by secretly taking DNA samples from people without their knowledge. Imagine a groupie of rock stars or movie stars who saves sperm samples from one-night stands and by doing so builds up a large collection of chromosomes from which to select an embryo's genetic complement. A child born from such deception would not even be able to find out who any of their biological parents (if chromosome donors can properly be called biological parents) were.

Biological parenthood would become fairly meaningless if a baby was born with chromosomes taken from dozens of people. Even if a single chromosome's sequence could be matched with a particular adult that does not mean that adult was the source of the chromosome. A chromosome with very nearly identical sequence which was functionally equivalent could have come from that person's parents, siblings, or others more distantly related.

Sperm Banks Allow One Biological Parent To Have No Obligations

Keep in mind that it's already pretty weird. When donor sperm from a sperm bank is used, the source of the paternal DNA is essentially free of obligation to the resulting offspring. Also, the offspring frequently can't find out who the father was.

There is also de facto parental escape from responsibility for offspring via giving babies up to the state for placement in foster homes, with the rest of the populace picking up the expense. There is also widespread abandonment of parental responsibility by fathers, as well as situations where the mother can't figure out who the father might have been (e.g. mom never saw the guy after the one-night stand and doesn't know where he lives or his last name). There is also adoption, where again the biological parents escape responsibility but at least the state doesn't have to pick up the tab.

She Did Gene Therapy So It's Not My Kid

It gets weirder still. Gene therapy on embryos will allow the introduction of sequence variations and even genes that do not occur in either of the two biological parents. Imagine a situation where a woman and a man have a one-night stand, she gets pregnant, doesn't tell him, and she goes off and has genetic engineering done on the embryo. She might conceivably sue the biological father for child support. But he might argue that since she tweaked many of the genes that she got from him he should not be forced to have legal obligations toward the kid.

The argument that gene therapy makes a kid not one's own is not unreasonable. The mother might choose to introduce characteristics that make the child's personality and appearance totally unlike the man whose sperm started the pregnancy (I hesitate to call him the biological father). The man may feel that the child is not really his if he can't honestly say that the kid is "a chip off the old block".

Children Shouldn't Be Victimized By Lifestyles Of Crazy Adults

In deciding what is reasonable to allow in marriage, the emotional, intellectual, and physical needs of children should weigh most heavily. Too many arguments about alternative forms of marriage are framed in terms of the rights and needs of adults. Well, adults can take care of themselves. It's children who are most vulnerable and it is children who are the biggest justification for society's support for the institution of marriage in the first place. Society's paramount interest in marriage is to see that children are properly cared for. Demands for support of new forms of marriage should be balanced against the interest society has in having workable forms of marriage for seeing to the care of offspring.

By Randall Parker 2003 March 12 06:50 PM  Biotech Society
Entry Permalink | Comments(1)
United States Lacks Sufficient Biodefense Lab Space

The United States has too few ultra-safe biological research labs for working with the most dangerous pathogens.

Virologists call the world's most lethal disease organisms Level 4 pathogens. Experiments with them are confined to Biosafety Level 4, or BSL-4, labs. Those labs need to be sealed, pressurized areas designed to prevent pathogens from escaping, even in a nuclear blast.

Lab capacity is needed for testing to respond when an actual terrorist attack is suspected or known to be occurring.

While the funding includes money for two new BSL-4 labs, in Maryland and Montana, neither will be open for at least three years. Meanwhile, the four existing BSL-4 labs in the United States do not offer nearly enough space. If a biological attack occurred, creating the need for sample testing, the labs' capacity would be stretched past the breaking point, researchers say.

The other reason this type of lab space is needed is to test new vaccines and drug treatments against pathogens that would likely be used in a bioweapons attack. The lack of lab space discourages researchers from working on defenses against bioterrorism.

To deal with a bioterrorism attack the United States really needs highly secure labs scattered throughout the country and located near major biomedical research centers. The labs also need an associated larger living area that scientists could move into to isolate themselves from the larger society if a major epidemic were raging. Boston, San Diego, and other cities with large concentrations of biomedical researchers should have facilities that would allow them to put a large number of top researchers to work on countering a plague introduced by terrorists.

By Randall Parker 2003 March 12 01:11 AM  Dangers Tech General
Entry Permalink | Comments(1)
2003 March 10 Monday
Leon Kass Doesn't Like Prospect Of Aging Rejuvenation

A March 2003 staff working paper of the US President's Council on Bioethics reflects its chairman Leon Kass's lack of enthusiasm for the prospect of preventing and reversing the aging process.

4. Attitudes toward Death and Mortality: An individual committed to the scientific struggle against aging and decline may be the least prepared for death, and the least willing to acknowledge its inevitability. Therefore, given that these technologies would not in fact achieve immortality, but only lengthen life, they would in effect make death even less bearable, and make their beneficiaries even more terrified of it and, in a sense, obsessed with it. The fact that we might die at any time could sting far more if we were less attuned to the fact that we must die at some time. In an era of age-retardation, we might, in practice, therefore live under an even more powerful preoccupation with death, but not one that leads us to commitment, engagement, urgency and renewal.

5. The Meaning of the Life Cycle: There is also more to the question of aging than the place of death and mortality in our lives. Not just the specter of mortality, but also the process of aging itself affects our lives in profound ways. Aging, after all, is a process that mediates our passage through life, and that gives shape to our sense of the passage of time and our own maturity and relations with others. Age-retardation technologies at once both make aging more manipulable and controllable as explicitly a human project, and sever age from the moorings of nature, time, and maturity. They put it in our hands, but make it a less intelligible component of our full human life. In the end, they could leave the individual unhinged from the life-cycle. Without the guidance of our biological life-cycle, we would be hard-pressed to give form to our experiential life-cycle, and to make sense of what time, age, and change should mean to us.

Kass and company apparently believe that if our bodies don't grow old we will become even more fearful of death. He also thinks we will feel unhinged and lack the sense of purpose that supposedly comes with growing old. I don't personally derive a sense of meaning and purpose from growing old (except that as more years go by I try harder and harder to encourage others to support anti-aging research - so maybe he's right). Aging seems like an entirely undesirable process. Wisdom and understanding would come with the passing of the years even if one didn't grow old.

What would be wrong with having many generations at the prime of their lives for many decades? These ethicists are arguing as if we need the really old and the children around to give the middle-aged people someone to boss around. Oh great. Couldn't this need be satisfied by getting really obedient dogs, border collies perhaps?

1. Generations and families: Family life and the relations between the generations are, quite obviously, built around the shape of the life cycle. A new generation enters the world when its parents are in their prime. With time, as parents pass the peak of their years and begin to make way and assist their children in taking on new responsibilities and powers, the children begin to enter their own age of maturity, slowly taking over and learning the ropes. In their own season, the children bring yet another generation into the world, and stand between their parents and their children, helped by the former in helping the latter. The cycle of succession proceeds, and the world is made fresh with a new generation, but is kept firmly rooted by the experience and hard-earned wisdom of the old. The neediness of the very young and the very old put roughly one generation at a time at the helm, and charge it with caring for those who are coming, and those who are going. They are given the power to command the institutions of society, but with it the responsibility for the health and continuity of those institutions. In a society reshaped by age-retardation, generation after generation would reach and remain in their prime for many decades. Sons would not surpass their fathers in vigor just as they prepared to become fathers themselves. One generation would have no obvious reason to make way for the next as the years passed. The succession of generations would be obstructed by a glut of the able. The old would think less of preparing their replacements, and the young would see before them only layers of their elders blocking the path, and no great reason to hurry in building families or careers. Families and generational institutions would surely reshape themselves to suit the new demographic form of society, but would that new shape be good for the young, the old, the familial ties that bind them, the society as a whole, or the cause of well-lived human lives?

2. Innovation and change: The same glut would likely affect other institutions, private and public. From the small business to the city council, from the military to the Fortune 500 corporation, generational succession would be disrupted, as the rationale for retirement diminished. With the slowing of succession cycles might well also come the slowing of the cycles of innovation and adaptation in these institutions. Innovation is often the function of a new generation of leaders, with new ideas to try and a different sense of the institution’s mission and environment. Waiting decades for upper management to retire would surely stifle this renewing energy and slow the pace of innovation—with costs for the institutions in question and society as a whole.

They also bring up a fallacy about a loss of the ability to innovate with the passing of the years. If a person's mind didn't age and stayed as keen as a young mind then the person would have a longer run at being creative. With no loss in the ability to concentrate or to form new memories would come a greater ability to sustain creative output in many fields.

If there are too many people in a corporation at the top who never retire then one can just change jobs to a company that is growing and promoting people. Or one can become self-employed. Most people aren't going to become senior managers anyway and yet the failure to reach senior management level does not rob a life of meaning.

Along with aging reversal therapies will come the ability to boost intelligence. With youthful smarter minds and energetic bodies people will become far more creative. The result will be a cultural renaissance.

People who think like Leon Kass are fighting a losing battle. Biotechnology will continue to advance and its rate of advance will accelerate. The only question is how long will we have to keep ourselves alive before the technology becomes available to make our bodies young again? Will the technology come soon enough to help those who otherwise will die of old age in 20 years? Or do you need to make it another 30 or 40 years to survive to see the day when it becomes possible to have one's body rejuvenated and returned to a state of youthfulness?

Update: The President's Council on Bioethics is arguing that aging is not a disease.

The ethics of using biotech enhancements to slow the aging process were a focus of the Council's March 6 meeting. "Is it reasonable to think that the biological processes of aging are rightly regarded as analogous to a model of disease, to be studied and modified?" chairman Leon Kass asked to launch the topic.

Members chewed over his question and most agreed that aging is a natural part of the life cycle, not a disease.

The problem with this line of reasoning is that many conditions that are now called diseases are essentially the product of aging processes. For instance, what is heart disease? If cells in a heart are very aged the heart will show the symptoms of heart disease.

In fact, as University of Idaho gerontology researcher Steven Austad explained to this bioethics panel the vast bulk of diseases increase in incidence with age.

And the last point is that slowing aging is really a much more effective approach to preserving health, than is the treatment of individual diseases, and I'll give you the rationale for that in this slide here, which shows that these are major causes of death. And you can see that virtually all of them increase exponentially with age. And one of the consequences of the analyses that Jay Olshansky will, no doubt, talk about later, is that curing each of these individual diseases has a surprisingly small impact on life expectancy. But more important, curing one of those diseases does not take care of all of the other disabilities that may be associated with aging, because of other disabilities, such as chronic arthritis, the decline in sensory capacity. These things also increase exponentially in aging, getting rid of one cause at a time, basically leave people who may be alive, but may be very disabled. By slowing down the aging rate, we basically delay the onset and the progression of a whole host of mortal and debilitating diseases.

Slow aging and the onset of a large number of illnesses will be delayed. Reverse aging and the onset of many illnesses will be entirely avoided. Can a biological process lead to disease and yet not be a disease process itself? One can debate the question philosophically, but regardless of whether aging is classified as a disease the most effective way to prevent most diseases is to slow and reverse aging.
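
Austad's argument can be made quantitative with the classic Gompertz model, in which the death rate rises exponentially with age. The sketch below integrates the survival curve numerically; the parameter values are ballpark figures chosen to give a realistic baseline life expectancy, not numbers fitted to actual life tables.

```python
import math

def life_expectancy(a=1e-4, b=0.085, cause_cut=0.0, aging_slowdown=1.0,
                    max_age=130.0, dx=0.1):
    """Life expectancy at birth under Gompertz mortality
    mu(x) = a * exp(b * aging_slowdown * x), with an optional
    proportional cut in mortality from curing one disease class."""
    survival, total, x = 1.0, 0.0, 0.0
    while x < max_age:
        mu = (1.0 - cause_cut) * a * math.exp(b * aging_slowdown * x)
        total += survival * dx
        survival *= math.exp(-mu * dx)
        x += dx
    return total

print(f"baseline:                {life_expectancy():.1f} years")
print(f"eliminate 30% of deaths: {life_expectancy(cause_cut=0.3):.1f} years")
print(f"slow the aging rate 10%: {life_expectancy(aging_slowdown=0.9):.1f} years")
```

With these assumptions, wiping out nearly a third of all mortality (roughly, curing a major disease class) adds about four years of life expectancy, while merely slowing the exponential aging rate by 10 percent adds more, which is the case for retarding aging instead of attacking diseases one at a time.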

University of Utah aging researcher Richard Cawthon makes a similar argument.

According to some estimates, slowing the rate of aging just enough to postpone the age of onset of multiple age-related chronic diseases by two to three years would save hundreds of billions of dollars in health care costs. Furthermore, lowering age-specific mortality rates from multiple causes by slowing the rate of aging may be easier to achieve than lowering them to the same extent by developing a separate, more specific intervention for each of a multitude of age-related life-threatening diseases of which atherosclerotic heart disease, cancer, stroke, lung infections, and chronic obstructive pulmonary disease are among the most common.

By Randall Parker 2003 March 10 12:46 AM  Aging Reversal
Entry Permalink | Comments(9)
2003 March 07 Friday
Cheaper Way To Make Carbon Nanoscrolls Discovered

One of the biggest obstacles to the use of nanotechnology is the cost of manufacture. Scientists working in labs come up with all sorts of interesting nanomaterials that have qualities superior to existing materials for many applications. These discoveries regularly receive glowing media reports. But too many such discoveries go unused for lack of ways to make the nanomaterials cheaply in bulk. Nanotubes are a great example: they are considered to have enormous promise, but in spite of the interest they have attracted no team has found a cheap way to make them. Therefore reports of advances that cut nanotech production costs are important.

UCLA chemists report in the Feb. 28 issue of Science a room-temperature chemical method for producing a new form of carbon called carbon nanoscrolls. Nanoscrolls are closely related to the much touted carbon nanotubes -- which may have numerous industrial applications -- but have significant advantages over them, said Lisa Viculis and Julia Mack, the lead authors of the Science article and graduate students in the laboratory of Richard B. Kaner, UCLA professor of chemistry and biochemistry.

"If nanotubes can live up to all their predicted promise, then we believe that we have a method for making analogous materials for a fraction of the cost," Mack said.

Nanotubes are pure carbon sheets in a tubular form, capped at each end. Viculis and Mack's carbon nanoscrolls are also pure carbon but the sheets are curled up, without the caps on the ends, potentially allowing access to significant additional surface area. While nanotubes are normally made at high temperatures, nanoscrolls can be produced at room temperature.

"Our method involves scrolling sheets of graphite, which could give us a much higher surface area," Viculis said.

"If we can access the entire surface area on both sides of the carbon sheets -- unlike with carbon nanotubes, where only the outside surface is accessible -- then we could adsorb twice the amount of hydrogen -- an enormous increase," Mack said, "improving on hydrogen storage for fuel (an alternative to fossil fuels)."

"Nanoscrolls can be made by a relatively inexpensive and scalable process at low temperatures," Mack said. "Our starting materials are just graphite and potassium metal. The idea is beautiful in its simplicity."

"Carbon surfaces are known to adsorb hydrogen. A difficulty with using hydrogen as a fuel source for cars, instead of gas, is obtaining a material capable of storing enough hydrogen to make the approach feasible," Viculis said.

"Carbon nanoscrolls could make pollution-free, hydrogen-powered cars better than they would otherwise be," said Kaner, the third co-author on the Science paper. "This research is a good start. We have a long way to go. For this approach to work well, we need to get down to individual carbon layers, and we are not there yet. On average, the nanoscrolls are 40 layers thick. We have not yet realized the full surface area or all the properties we are after. The challenge is to reduce the nanoscrolls to individual layers. We have many good leads, and have started new collaborations."

The research may lead to numerous applications.

"For electronic applications, nanotubes may work well," Kaner said. "For applications where high surface area is important -- such as hydrogen storage, or energy storage in super-capacitors -- these nanoscrolls may be better."

Other possible applications for nanoscrolls, Kaner said, include lightweight but strong materials for planes and cars, and improved graphite-based tennis rackets and golf clubs.

The use of nanoscrolls for energy storage is especially interesting. Liquefying hydrogen requires considerable energy expenditure to cool it and also requires extremely well insulated tanks to hold it. But gaseous hydrogen takes up too much space. If nanoscrolls could be used either to store hydrogen densely at room temperature or to make a better kind of battery then they'd be very attractive.
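
The surface-area arithmetic behind Kaner's "40 layers" remark can be sketched roughly. A fully exfoliated graphene sheet has a theoretical specific surface area of about 2,630 m²/g counting both faces; if only the two outer faces of an N-layer stack are accessible, the usable area falls off as 1/N. That 1/N model is a simplification I am assuming here (it ignores edges, interlayer spacing, and partial scroll openness):

```python
GRAPHENE_SSA_M2_PER_G = 2630.0  # theoretical value for a single sheet

def accessible_area(layers):
    """Assume only the 2 outer faces of a stack of `layers` sheets are
    exposed, out of 2 * layers faces total: usable area ~ total / N."""
    return GRAPHENE_SSA_M2_PER_G / layers

for n in (1, 2, 10, 40):
    print(f"{n:>2} layers: ~{accessible_area(n):4.0f} m^2/g accessible")
```

On this crude accounting, today's 40-layer scrolls expose only a few percent of the area that fully separated layers would, which is why getting down to individual layers matters so much for hydrogen adsorption.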

By Randall Parker 2003 March 07 10:57 AM  Nanotech Advances
Entry Permalink | Comments(3)
2003 March 06 Thursday
22 Billion Years From Now Big Rip May Wreck Universe

Something called phantom energy is as yet unproven to exist, but if it does exist then the expansion of the universe will keep accelerating until even atoms are pulled apart.

"Until now we thought the Universe would either re-collapse to a big crunch or expand forever to a state of infinite dilution," says Robert Caldwell of Dartmouth College, New Hampshire. "Now we've come up with a third possibility - the 'big rip'."

It's slated for 22 billion years from now, so we can relax for the time being.

The question Caldwell and his colleagues posed is, what would happen if the rate of acceleration increased?

Their answer is that the eventual, phenomenal pace would overwhelm the normal, trusted effects of gravity right down to the local level. Even the nuclear forces that bind things in the subatomic world will cease to be effective.

We have more pressing problems that need to be solved first. For instance, in a mere 1 billion years increased light output from the Sun will make Earth too hot for human life. Therefore, we will need to move Earth. Some may already be living on Mars and will welcome the increased sunlight. But Mars is smaller and will be too crowded to hold everyone.

Of course, by then the human race may have been wiped out either by nanogoo or by robots.

By Randall Parker 2003 March 06 12:00 PM  Dangers Natural General
Entry Permalink | Comments(0)
2003 March 04 Tuesday
Hydrogen Economy Cost Calculations

Harry Braun, Chairman of the Hydrogen Political Action Committee, has written an article laying out some costs and arguing for the use of windpower to generate hydrogen for a hydrogen economy.

With state-of-the-art electrolyzers, about 55 kWh will be needed to manufacture the energy content of a gallon of gasoline in the form of gaseous hydrogen. Assuming electricity costs of 3 cents/kWh, the electricity costs alone would be $14.00/mBtu, which is equivalent to gasoline costing $1.60 per gallon. The cost and maintenance of the electrolyzer and related hydrogen storage and pumping system also needs to be factored into the equation.
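
A quick check of that arithmetic, using only Braun's own figures (55 kWh per gasoline-gallon-equivalent, 3 cents/kWh, and the 115,000 Btu per gallon of gasoline he cites below), lands slightly above his rounded numbers:

    # Sanity check of Braun's electrolysis cost figures.
    kwh_per_gge = 55          # kWh of electricity per gasoline-gallon-equivalent
    dollars_per_kwh = 0.03    # Braun's assumed electricity price
    btu_per_gallon = 115_000  # energy content of a gallon of gasoline

    cost_per_gge = kwh_per_gge * dollars_per_kwh           # $1.65/gallon-equivalent
    cost_per_mbtu = cost_per_gge / (btu_per_gallon / 1e6)  # $14.35/million Btu

    print(f"${cost_per_gge:.2f}/gallon-equivalent, ${cost_per_mbtu:.2f}/million Btu")

The $14.00 per million Btu and $1.60 per gallon in the excerpt look like rounded-down versions of these results.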

One problem that Braun brings up about liquid hydrogen as a transportation fuel source is that while a gallon of gasoline has 115,000 Btus of energy, a gallon of liquid hydrogen has only 30,000 Btus. Therefore liquid hydrogen tanks would need to be much larger and at the same time stronger and insulated in order to hold the extremely cold liquid hydrogen. Not exactly an appealing prospect. Also, liquefying the hydrogen itself takes energy, which boosts the costs by nearly a quarter.
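
The tank-size penalty follows directly from those two energy figures:

    # Volume penalty of liquid hydrogen implied by the Btu figures above.
    btu_gasoline = 115_000   # per gallon
    btu_liquid_h2 = 30_000   # per gallon

    ratio = btu_gasoline / btu_liquid_h2
    print(f"A liquid hydrogen tank needs ~{ratio:.1f}x the volume of a "
          f"gasoline tank for the same energy, before adding insulation.")

That works out to roughly 3.8 times the volume, and the cryogenic insulation only makes the packaging problem worse.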

Even if we accept his assumptions about how far windpower costs could drop with mass production, his calculations take little account of the infrastructure costs of the huge transition to hydrogen that he envisions. Also, windpower seems a worse long-term choice than photovoltaics for the United States, in part because wind farms have to be built where the wind is, whereas the migration of the US population toward the Southern parts of the country is moving people toward where there is more solar power to be tapped. Eventually (eventually? how long is eventually? er, I don't know) thin film photovoltaics will allow electric power to be generated much closer to where it is used.

Given the drawbacks of hydrogen as a power source, it still seems possible that a big advance in battery technology could make batteries a viable alternative to hydrogen fuel cells.

An EE Times article from 2001 surveyed the field of battery development; the experts it quotes think batteries viable as automotive power sources are still years away.

Similar efforts are in progress at Massachusetts Institute of Technology (MIT), where researchers have developed a competing lithium-polymer battery that could ultimately achieve energy densities of 300 W-hr/kg, according to its developers. The technology, which uses a multiple-layer configuration of polymer and metal resembling a potato chip bag, is funded by the Office of Naval Research and is said to be 5 to 10 years from commercialization.

That article does a good job of describing how far battery technology would have to advance to become competitive for automotive applications. The MIT effort, if successful, would create batteries with about 4 times the energy density of the nickel-metal hydride batteries found in the most expensive, uncompetitive electric vehicles (whose market prices are way below manufacturing costs, btw). That would make the batteries dense enough. The cost is a question though.
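
To see what a 4x jump in energy density buys, here is an illustrative pack-mass comparison. The 300 W-hr/kg figure is from the excerpt, and the roughly 75 W-hr/kg for nickel-metal hydride follows from its 4x comparison; the 0.25 kWh per mile vehicle consumption is my own assumption for a typical sedan, not a number from these sources.

    # Battery pack mass needed for a 300 mile range at an assumed
    # 0.25 kWh/mile vehicle consumption.
    target_range_miles = 300
    kwh_per_mile = 0.25  # assumed, not from the article
    pack_kwh = target_range_miles * kwh_per_mile  # 75 kWh

    for name, wh_per_kg in [("NiMH, ~75 Wh/kg", 75),
                            ("MIT Li-polymer, 300 Wh/kg", 300)]:
        print(f"{name}: ~{pack_kwh * 1000 / wh_per_kg:.0f} kg pack")

On these assumptions the nickel-metal hydride pack weighs about a tonne while the lithium-polymer pack comes in around 250 kg, which is at least in the realm of the practical.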

In a more recent article Donald Sadoway and John Heywood (director of MIT's Sloan Automotive Lab) are noticeably lacking in enthusiasm for hydrogen as an automotive power source.

“Their state of development is oversold,” said Heywood. Sadoway put it another way: “In the context of portable power, fuel cells are not a technology, they’re a laboratory curiosity.”

Among other things, fuel cells are now far too expensive for use in mainstream applications like cars. That’s because they’re made partly of platinum, the same metal used in expensive jewelry. And an alternative to platinum will be difficult to discover, said Sadoway; “that’s Nobel Prize-winning work.”

Another key challenge: “How are we going to produce, distribute and store the hydrogen” for fuel cells, asked Heywood. He pointed out that the production of hydrogen itself involves generating various greenhouse gases. “So when people argue that the fuel cell produces only water vapor, that’s deceptive in the context of a complete transportation system,” he said.

Battery technology is appealing from an infrastructure standpoint because batteries could be recharged at night, when existing electric power plants run well below maximum capacity. Then, once photovoltaics become cost-effective, vehicles could be recharged during the day as well.

Stationary applications for alternative power sources are an easier problem than transportation because weight and volume matter less. There are lots of future possibilities for better ways to get energy for stationary uses. Some NASA researchers think thin film batteries and thin film photovoltaic cells could be integrated into roof tiles that would both collect and store electrical energy.

There are also numerous applications that could exploit integrated power devices. Examples of these include: battery and solar cell devices integrated into a roof tile to provide a total power system for homes, or solar-rechargeable power systems for the military, for recreational vehicles, for cell phones or for other consumer products designed to be used in remote locations. In summary, the same considerations that provide performance edges for space applications make these power technologies applicable for terrestrial needs both individually or used in tandem.

By Randall Parker 2003 March 04 05:33 PM  Energy Tech
Entry Permalink | Comments(1)
2003 March 03 Monday
Virginia Collects DNA Samples From All Felons

Starting on January 1, 2003, Virginia became the first US state to take DNA samples not just from convicted felons but from anyone arrested for a violent felony.

Beginning Jan. 1, the state expanded its sampling to include anyone arrested for a violent felony. The DNA database, which already contains roughly 200,000 biological profiles, is expected to grow by a third.

Britain already requires DNA samples to be taken from all people who are arrested. Since not all those arrested are convicted, that is an even broader approach.

Expect to see some countries adopt a policy of taking DNA samples from all babies at birth and from all of the adult population. Already many US states require a thumbprint to be taken as part of the driver's license application process. DNA sampling is just another way to uniquely identify a person, so the thumbprint requirement is a precedent for DNA sampling of the general public.

By Randall Parker 2003 March 03 01:47 AM  Biotech Society
Entry Permalink | Comments(0)
2003 March 02 Sunday
Hydraulic Energy Storage Hybrid Engine Developed For Trucks

The business case for electric hybrid cars is pretty weak because so many heavy batteries are required and replacing them when they wear out is expensive. Another approach to energy storage is compressed air, and Ford is pursuing compressed air hybrids. Yet another approach uses hydraulic fluid compression to store energy.

Chandler noted that, "This smaller, lighter version of Permo-Drive's Regenerative Drive System (RDS) offers significant fuel savings, reduced emissions and improves brake life to the trucking industry and major fleet operators, including the U.S. military."

A U.S. Army vehicle equipped with the Permo-Drive system recently underwent three weeks of intensive testing. Preliminary results show a 27 percent improvement in fuel economy, a 36 percent jump in rapid-acceleration or "dash" capability and a 60 percent improvement in deceleration when comparing hydraulic-system deceleration rates to engine-braking.
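
One caution when reading the 27 percent figure: an improvement in fuel economy (miles per gallon) translates into a smaller reduction in fuel actually burned.

    # A 27% gain in miles per gallon cuts fuel burned per mile by 1 - 1/1.27.
    economy_gain = 0.27
    fuel_saved = 1 - 1 / (1 + economy_gain)
    print(f"Fuel burned per mile drops by ~{fuel_saved:.0%}")  # ~21%

So the Army's test result, if it holds up, means roughly a fifth less fuel hauled to the battlefield per mile driven.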

Dennis J. Wend, executive director of the U.S. Army National Automotive Center, recently noted that, "In our modeling and simulation work to date, parallel hybrid-hydraulic systems show the potential to provide significant fuel-economy savings for future generations of trucks."

The US Army has a greater incentive than private industry to boost the fuel efficiency of its vehicles because of the logistical cost and enormous difficulty of delivering fuel to remote battlefields. Therefore it is not surprising that the Army would be testing this technology.

Chandler pointed out that the company's hybrid hydraulic system also has been tested on commercial vehicles in Australia, where it achieved fuel economy gains of 33 percent or more. Permo-Drive's system captures normally wasted energy generated during braking, then releases it back into the vehicle's driveline when additional power is needed.

RDS technology can be applied to new or existing trucks. Key design features include an innovative inline axial-piston pump/motor, high-pressure accumulator energy-storage devices that utilize special composite materials, ultra-light-weight metals and advanced hydraulic and electronic engineering. The Permo-Drive system integrates vehicle dynamics, hydraulics, mechanical engineering, accumulator technology, material science, computer telemetry and electronics.

Advances in computer control combined with advances in materials are probably both essential enabling technologies for this type of hybrid design.

The Permo-Drive RDS storage system includes two hydraulic fluid "accumulators" -- a high-pressure tank (up to 5,000 PSI) and a low-pressure reservoir. As braking takes place, energy is captured with the flow of oil from the low-pressure tank to the high-pressure accumulator. A central processor later controls the release of the oil during acceleration to enhance overall fuel economy and reduce emissions.
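
The energy capacity of such an accumulator can be roughly estimated by treating the nitrogen precharge gas as being compressed isothermally, so the stored work is W = P0 * V0 * ln(Pmax/P0). The 5,000 PSI ceiling is from the excerpt; the 50 liter gas volume and 2,000 PSI precharge below are illustrative assumptions on my part, not Permo-Drive specifications.

    import math

    # Rough energy stored in a gas-charged hydraulic accumulator,
    # modeled as isothermal compression of the nitrogen precharge.
    PSI_TO_PA = 6894.76
    p0 = 2000 * PSI_TO_PA     # precharge pressure (assumed)
    p_max = 5000 * PSI_TO_PA  # maximum working pressure (from the article)
    v0 = 0.050                # gas volume at precharge, m^3 (assumed)

    energy_j = p0 * v0 * math.log(p_max / p0)
    print(f"~{energy_j / 1e6:.2f} MJ (~{energy_j / 3.6e6:.2f} kWh) per cycle")

On these assumed numbers that works out to roughly 0.6 MJ, about the kinetic energy of a loaded 20-tonne truck at 30 km/h, which fits a system aimed at capturing stop-and-go braking energy rather than storing propulsion energy for the long haul.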

Technologies that capture braking energy would be useful as a way to decrease fuel usage even in vehicles powered by fuel cells or batteries.

By Randall Parker 2003 March 02 11:25 PM  Energy Transportation
Entry Permalink | Comments(7)