2003 February 28 Friday
James Watson Calls For Curing A Disease Called Stupidity

James Watson, president of Cold Spring Harbor Laboratory and co-discoverer along with Francis Crick of the DNA double helix structure, calls for treating stupidity as a disease that should be cured with biotechnology.

“If you really are stupid, I would call that a disease,” Dr Watson said. “The lower 10 per cent who really have difficulty, even in elementary school, what’s the cause of it? A lot of people would like to say, ‘Well, poverty, things like that.’ It probably isn’t. So I’d like to get rid of that, to help the lower 10 per cent.

“It seems unfair that some people don’t get the same opportunity. Once you have a way in which you can improve our children, no one can stop it. It would be stupid not to use it because someone else will. Those parents who enhance their children, then their children are going to be the ones who dominate the world.” Genes that influence beauty could also be engineered. “People say it would be terrible if we made all girls pretty. I think it would be great.”

I'm with James on this one: More smarties and more pretty girls. What a wonderful world it would be.

Nervous Nellies think Watson's comments are dangerous.

Tom Shakespeare, a bio-ethicist at Britain's University of Newcastle, criticized Watson's remarks.

"He is talking about altering something that most people see as part of normal human variation, and that I think is wrong.... I am afraid he may have done more harm than good, his leadership of the Human Genome Project and his discovery of 1953 notwithstanding."

Countries where the people do not enhance the intelligence of their offspring are countries that will be left behind. "Naturalists" who do not want to see genetic enhancement of humans are going to be on the losing side in history.

Others think that beauty cannot be genetically engineered because it is subjective.

Geneticist Steve Jones, at University College London, dismisses Watson's comments about beauty as "daft". "The concept of beauty is a subjective one," he told New Scientist.

This claim flies in the face of everyday experience. Why do pretty girls get elected homecoming queens in high school? Why is there little controversy over who the top contenders are? Why do certain TV and movie stars become the predictable heartthrobs of millions all over the world?

Social science research finds there is a large amount of agreement on what constitutes attractive appearance. (also see this link for the same article)

Men and women generally agree about how attractive another person is, and are often quite accurate in predicting how others will rate their own appearance, new study findings show.

People are going to genetically engineer their kids. They will do this for intelligence, personality, looks, health, and other characteristics. There may well be some countries that pass and strictly enforce laws forbidding this sort of thing. But other countries won't pass such laws or won't make a concerted effort to enforce them. The incentives for genetically engineering progeny will be so great that people will find a way to do it regardless of what governments say or do.

Update: It may seem hard to believe but some scientists and medical doctors still deny genetics can control the level of intelligence.

Australian Medical Association's ethics committee chairman Dr Trevor Mudge said it was not yet known if intelligence was determined by genetic or environmental factors.

For ideological reasons some highly educated people do not want to admit that genetics is more important than environment in determining intelligence. Obviously toxins and malnutrition can prevent proper brain development. Obviously a mind's development can be messed up by putting a child in an environment in which it cannot develop intellectually. But to suppose that environment is the only or even the main factor separating the average 90 IQ person from the average 150 IQ person is ludicrous. Yet as the previous article demonstrates it is still possible to go around and find people with impressive sounding titles and credentials who will deny the role of genetic variations in determining intelligence.

This debate will end when it becomes possible to genetically engineer offspring to be smarter (i.e. probably in about 10 or 20 years). The people who are willing to do genetic engineering to their offspring will stampede to use the biotechnology that does so. Many people who, if asked today, would say they oppose offspring genetic engineering will be first in line to use it when it becomes possible to do so. Many will decide it is so important to give one's own children every advantage that they will place this feeling ahead of whatever argument they advance today in opposition to germ line genetic engineering.

By Randall Parker 2003 February 28 03:01 PM  Biotech Reproduction
Entry Permalink | Comments(6)
2003 February 27 Thursday
GPS and Other Tech Increases Spying and Tracking

Global Positioning System (GPS) devices are being used to track the movement of spouses and workers.

Spouses who believe mates are having affairs, employers who suspect workers are misusing company vehicles or parents who wonder if their children are where they are supposed to be are among those using devices tied to the global positioning system of satellites.

The ability to track children with GPS will likely eventually lead to the embedding of GPS devices inside the bodies of children. Then parents who are worried about kidnapping will be able to rest assured that they will be able to find their children if they are kidnapped. Of course, a very sophisticated kidnapper will be able to shield or remove locator devices. But the devices would at least be an obstacle for most kidnappers and would allow quicker location of children who are murdered by perverted killers.

GPS is used to track stolen cars and find accident locations.

Working with police, Avis turns on the system only if the car appears to have been stolen, Deutsch says. The system will also automatically activate if the air bag inflates, indicating a possible accident. In that event, Deutsch says, you may hear a voice through your radio ask, "Are you OK? Are you all right?" The system indicates your location for emergency aid.

GPS is used as a substitute for criminal incarceration.

GPS monitoring gives local governments a cheap alternative to incarceration and allows offenders an opportunity to continue working and living at home. Law enforcement agencies can create "electronic fences" around areas that are off-limits to offenders. The GPS system can be programmed to alert police if a pedophile enters a schoolyard, for example.

Kenosha, Wisconsin police charged Paul Anthony Seidler with using GPS to stalk his ex-girlfriend.

Kenosha police allege that Seidler placed a Global Positioning System tracking device under the hood of the woman's car and began monitoring her movements. Charged with stalking, burglary, disorderly conduct, and reckless endangerment...

GPS is being studied for car insurance billing by the mile driven.

With new technology on the scene to accurately record mileage, the time is also right. Traditionally, some insurers have worried over how they would record mileage accurately with such a system. Progressive Insurance, the fourth-largest US car insurer, has pioneered this area. For two years, it has been testing "smart" insurance in Texas, installing miniprocessors that use GPS technology to record distance and time driven. Pleased with consumer response, Progressive is considering a national rollout of the policy.

One can easily imagine how this technology could be enhanced to bill at different rates in different driving conditions. For instance, driving in rain or snow, at night, or in densely populated areas with higher accident rates could be billed at higher rates.
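
To make the idea concrete, here is a minimal sketch of how condition-based per-mile billing might be computed. The rates, condition categories, and risk multipliers are invented for illustration and are not figures from Progressive or any actual insurer.

# Hypothetical sketch of condition-based per-mile insurance billing.
# All rates and multipliers are made-up illustrative values.

BASE_RATE_PER_MILE = 0.05  # dollars per mile in baseline conditions

# Multipliers for higher-risk driving conditions (illustrative only)
CONDITION_MULTIPLIERS = {
    "night": 1.5,
    "rain_or_snow": 1.8,
    "dense_urban": 1.3,
}

def trip_charge(miles, conditions):
    """Compute the premium for one trip given its mileage and conditions."""
    multiplier = 1.0
    for condition in conditions:
        multiplier *= CONDITION_MULTIPLIERS.get(condition, 1.0)
    return miles * BASE_RATE_PER_MILE * multiplier

# Example: a 12-mile commute at night in the rain
print(round(trip_charge(12, ["night", "rain_or_snow"]), 2))  # 1.62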

The Oregon Road User Fee Task Force has proposed billing in-state cars for mileage driven in-state to compensate for new car technologies that reduce the amount of gasoline fuel needed and hence the amount of gasoline tax collected.

“We also have to have a way to track mileage only within the state,” Whitty said. This rules out basing the fee on odometer readings, which would include out-of-state driving.

“Technology has improved to the degree that this can be done, with an electronic device,” he said. The device, in a car, would be linked to the Global Positioning Satellite or GPS system, which allows pinpoint navigation by bouncing signals off satellites.

A British government advisory panel has proposed nationwide use of GPS to implement road use taxes.

All cars will be fitted with a 'big brother' satellite tracking meter to charge drivers up to 45p a mile for every journey taken under radical plans to slash congestion on British roads.

The scheme, proposed by the Government's independent transport advisers, would see drivers handed monthly bills charging them for every single journey.

GPS is even coming to cell phones so that callers to emergency numbers can be located.

Many rental fleets and trucking companies already use satellite positioning systems to track cars and cargo. Companies promote similar products for keeping tabs on kids, Alzheimer's patients or cheating spouses. Washington is also promoting locator technology. By October, the Federal Communications Commission wants cell phones equipped with locator technology to help emergency responders find callers.

A large assortment of other technologies are being used to help catch cheating spouses.

Whatever happened to the lipstick stain on the collar? In the old days of freewheeling adultery, a hang-up call in the middle of the night was the worst a philandering rogue had to worry about. Now there are itemized cell-phone bills, call-display screens, automobile tracking devices, Internet history folders, stealth-mode keystroke-recording software programs and spray-on sperm detectors all waiting to trip you up.

The spy business is growing.

The spy business is a $3 billion a year industry in the United States, and spouses are leading the way, employing a range of techniques to catch their mates at adultery.

Lots of other technologies can be used to track and monitor people and to detect types of behavior. Semen detection tests available for order on the internet are used to detect spousal sexual activity with other people.

The Original CheckMate Semen Detection Test Kit will quickly and easily monitor your spouse's sexual activity outside of the relationship by detecting invisible traces of dried semen that is left in their undergarments after sex...

In spite of the rise of DNA testing, GPS tracking, and phone conversation recording, hidden cameras are still the most popular sellers.

“Spy cameras are definitely our No. 1 seller,” says Ursula Lebana, owner of Spy Tech in Toronto, Canada. “The cameras have become so small that they will fit into anything. People bring us their own items—lamps, music boxes, humidifiers—and we install cameras in them. You could be on camera anywhere. If you’re not doing anything wrong, then it should make you feel safer.”

The total surveillance society of the future will not be one where only the government is watching. Even business surveillance of employees and customers is only a part of the larger phenomenon. Individuals will increasingly track the movements, conversations, electronic messages, and activities of others in a growing number of ways. Spouses will surveil each other. Parents will track the movements and activities of their children. Portable automated chemical assay devices will make it easy for parents or others to rapidly and easily check for drug use.

Imagine the possibilities that will be opened up by steadily higher density recording media. Gifts of jewelry which have hidden audio storage capacity will provide a way to record the conversations of someone for romantic or business reasons. Nanotech electronics will likely eventually allow the recording media to be the jewelry itself. Detection of an embedded piece of nanoelectronics may turn out to be extremely difficult to do.

Automated processing of video, sound, scent, motion, location, other types of sensors, and electronic information will make it easier to sort through the growing number of sensor feeds that individuals, companies, and governments will monitor. David Brin argues our only choice is between limiting powerful surveillance technologies solely to government use and allowing everyone to use them. Privacy is inevitably going to decrease. There is no feasible way to stop a large decrease in privacy.

Some types of technology are so easy to move around on a black market that restrictions on their use by the general public will have the effect of allowing only criminals and governments to use them. Some types of information will be so widely desired that otherwise law-abiding citizens will opt to use these technologies even if their use is illegal. Miniaturization of electronic monitoring devices and the ability to embed them in common items will make it very hard to detect or control their use.

One's privacy is not just a matter of where one goes or what one says or does. It includes financial data as well as medical details about oneself such as health records and even details of one's very structure. One crucial set of details is one's personal DNA code. As I've argued previously, in the long run DNA sequence privacy is going to be impossible to protect. It will simply be too easy to get a sample of someone's DNA. Note how the semen detector service is a viable business because samples of biological material, even of a spouse's lover, are easy to get without the spouse or the lover knowing that one has done so. Once DNA sequencing machines become sufficiently fast, sensitive, and cheap, that biological material will surely be usable to find out the DNA sequence of a spouse's lover. One way that information will be usable would be to predict the approximate physical appearance of the lover so that a private detective could more easily spot the lover as part of an investigation into a spouse's cheating. Once the results of DNA sequencing can be used to predict the approximate physical appearance of a person, the ability to do DNA sequencing on saliva, blood, skin, semen, and other biological material will also be used by police, private investigators, and intelligence agencies to develop profiles of suspects.

The widespread embrace of the use of surveillance technology by the general public demonstrates a popular willingness to watch and track other people. This trend looks set to continue to grow with no end in sight.

By Randall Parker 2003 February 27 03:47 PM  Surveillance GPS
Entry Permalink | Comments(4)
2003 February 25 Tuesday
High Intelligence Sperm and Egg Donor Prices Rising

A recent ad quoted in this article offered $15,000 for a Stanford sperm donor who matched the desired set of qualifications. That's an astounding amount of money for a sperm donation. Egg donations are much harder to make and are riskier for the donors. Hence it is not surprising that prices for Ivy League egg donations can range as high as $50,000. A strong demand for the most preferred kinds of sperm and eggs is driving up prices.

Do a search on the internet and you'll find that there is a significant market for Ivy League egg and sperm donors. One company, Tiny Treasures, located in Massachusetts, offers "Extraordinary Ivy League [Egg] Donors". These are women who are recent Ivy League graduates offering their eggs for $8,000 to $15,000.

"I think it's unbelievable, and kind of strange, although hard to compare," said Jessica Lucent spokesperson for the New England Cyrobank (sic) Center, located in Cambridge, Mass., when asked to comment on the recent trend of high priced offers for donations.

These prices also provide an indication of how much affluent fertile people will be willing to pay to enhance the DNA that they pass along to their own genetic offspring. If people are willing to pay more to increase the odds of having smarter children when egg and sperm donors are used then they will certainly be willing to do it when it's their own chromosomes that they are passing along.

Keep in mind that while the use of high intelligence sperm and egg donors certainly raises the odds of having smarter children, it does not guarantee that outcome. Each person has two copies of each chromosome. When they donate chromosomes via a sperm or egg to make an embryo, there is as yet no way to control which of each pair of chromosomes will get donated (except in the case of X and Y chromosomes for sex determination). Once technology is developed to control which of each pair of chromosomes gets donated, much of the uncertainty can be eliminated.
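
A toy simulation makes the point. The sketch below models only the random choice of one chromosome from each of the 22 autosomal pairs and deliberately ignores crossover and recombination, so it understates the real complexity; the numbers are purely illustrative.

import random

# Toy model of which chromosome of each pair a sperm or egg donor passes along.
# Real meiosis also involves crossover within chromosomes, which this ignores.

PAIRS = 22  # autosomal chromosome pairs

def gamete():
    """Pick one chromosome (0 or 1) at random from each of the 22 pairs."""
    return [random.randint(0, 1) for _ in range(PAIRS)]

# Suppose copy 0 of each pair is the donor's "better" chromosome. The chance
# a single gamete carries the better copy of every pair is (1/2)**22, about
# one in four million, which is why a high-IQ donor shifts the odds for the
# child without guaranteeing the outcome.
print(f"Probability of an all-preferred gamete: {0.5 ** PAIRS:.2e}")

# On average a gamete carries only about half of the donor's preferred copies.
trials = 10_000
avg_preferred = sum(g.count(0) for g in (gamete() for _ in range(trials))) / trials
print(f"Average preferred chromosomes per gamete: {avg_preferred:.1f} of {PAIRS}")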

The technology required to do genetic modification to human embryos is being developed. The other part of the puzzle that is still missing is the knowledge of exactly which genes influence intelligence and which alleles of each of those genes will boost intelligence.

On the question of which genes have variations that influence intelligence a lot of scientists are looking for candidates. Here's the abstract to a recent paper by David Comings et al. of the City of Hope Medical Center that looks at a cholinergic pathway gene whose alleles seem to correlate with intelligence differences.

Cholinergic pathways have been widely implicated in cognition and memory, making the respective genes excellent candidate markers for cognitive abilities. Identification of a possible role of cholinergic receptor genes in humans has been hampered by the lack of reported polymorphisms. The authors identified a common AT 1890 polymorphism in the 3'UTR of the CHRM2 gene. To determine if it was associated with IQ, the authors examined 358 adult males and 470 adult females for a total of 828 adults. The subjects were the parents of twins from the Minnesota Twin and Family Study, a long-term study of the genetics and environmental factors in substance abuse. All subjects in the CHRM2 study were of Caucasian ancestry. All were given the Wechsler Adult Intelligence Scale-Revised (Vocabulary, Information, Block Design, and Picture Arrangement) test. The study was approved by the internal review boards of both the University of Minnesota and the City of Hope Medical Center and all subjects gave written informed consent. Using the SSCP technique, the authors identified a common single nucleotide polymorphism, A 1890T in the 3'UTR of the CHRM2 gene based on accession No. M16404. To assess which variable was more closely associated with the CHRM2 gene, the authors performed a MONOVA using both total IQ and years of education as the dependent variables and the CHRM2 gene as the independent variable for the total set. The total MANOVA (Wilks) was significant at P<0.009. The F-ratio for IQ was 4.12, P<0.017, and for years of education the F-ratio was 5.86, P<0.003. The authors have replicated these findings using a quantitative TDT method developed by Abecasis et al. in 230 parent-child trios from the MTFS. While a marginally significant association was found between CHRM2 and total IQ, after stratifying parental origin of transmission, there was a highly significant association for paternal transmission (P=0.007). Although in need of replication, the authors believe these preliminary results are consistent with a role of the CHRM2 gene in cognitive processes in humans, as assessed by both total IQ and years of education.

As the cost of doing DNA sequencing and SNP testing declines (and see the Biotech Advance Rates category archive for technologies on the horizon that will radically lower costs), the search for genetic variations that influence intelligence will become much easier to do. Because of the rate at which DNA sequencing costs are likely to fall, it will surprise me if within the next 10 years we do not know most of the genetic variations that influence intelligence.
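
As a rough illustration of why the timing matters, here is a simple cost projection. The starting price and halving time are made-up round numbers chosen for illustration, not actual sequencing prices.

# Illustrative projection of DNA sequencing cost decline.
# The starting cost and halving time are assumed round numbers, not real prices.

START_COST_DOLLARS = 10_000_000  # assumed cost to sequence a genome in year 0
HALVING_TIME_YEARS = 2           # assumed interval over which the cost halves

def projected_cost(years_out):
    """Cost after a given number of years of steady exponential decline."""
    return START_COST_DOLLARS * 0.5 ** (years_out / HALVING_TIME_YEARS)

for year in (0, 5, 10):
    print(f"Year {year}: ${projected_cost(year):,.0f}")
# A steady halving every two years cuts the cost about 32-fold in a decade,
# the sort of drop that makes large gene-hunting studies practical.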

Once the cost of DNA sequencing declines and many genetic variations that influence intelligence and personality are identified, the market for sperm and eggs will shift away from using attendance at an elite school as a proxy for genetic alleles that code for higher intelligence. Customers will be able to choose based on detailed knowledge of the genetic endowment of each potential sperm or egg donor. A competitive market with more available information will produce choices of donors that more closely match the exact preferences of the customers.

For a review of work in the area of genetic influences on intelligence see Robert Plomin's recent editorial in Molecular Psychiatry entitled "Genetics, genes, genomics and g".

By Randall Parker 2003 February 25 02:37 PM  Biotech Reproduction
Entry Permalink | Comments(1112)
2003 February 24 Monday
GM: Production Fuel Cell Vehicles by 2010

General Motors is optimistic that it can go into production with fuel cell vehicles by 2010.

Fuel cell-powered vehicles could be widely available by 2010, not 2020 as President Bush has suggested, General Motors said on Monday.

The White House said last week it hopes experts will be able to decide by 2015 whether hydrogen-powered fuel cells are commercially viable. And Energy Secretary Spencer Abraham said the Bush administration believes automakers could bring fuel cell vehicles to showrooms by 2020.

But Larry Burns, the head of GM's research and development, said his company plans to keep its 2010 timetable. "You've got to put it out there because the main message is if you're not driving to make this viable on a high volume, profitable, affordable basis, you shouldn't be doing it," Burns said.

New York Times writer Nicholas Kristof was able to drive GM's $5 million fuel cell prototype Hy-wire.

It's called Hy-wire, and it's a one-of-a-kind prototype: a four-door sedan fueled by hydrogen, capable of speeds of 100 miles an hour, whisper-quiet, and emitting no pollution at all — only water vapor as exhaust. It looks like a spaceship, with glass all around and no pedals or steering wheel.

Jeff Wolak, the engineer who travels with Hy-wire and mothers it, explained that it is drive-by-wire, controlled by electronics and computers rather than cables and hydraulics. To accelerate, you rotate the handgrips. To steer, you move the grips up or down.

The automotive equivalent of aircraft fly-by-wire comes to cars. This also brings us closer to the day of automated computer driving. No physical human force would be needed to operate any of the controls. One can imagine hybrid steps where, say, a single car on a freeway is networked to a line of cars behind it and the driver of the front car chooses a path and speed that all cars behind also follow.

Some argue for interim use of hybrid vehicles with a gasoline engine plus electric motor and battery in combination. The problem holding back electric powered cars has always been the weight and cost of the batteries. Hence the need for a hybrid design in order to use electric power at all. Batteries are not the only way to store energy, however. French company Moteur Developpement International (MDI) has developed a prototype vehicle that runs on the energy stored in compressed air. Compressed air storage runs up against capacity problems similar to those of batteries. This has led Ford Motor Company and some collaborators to propose compressed air hybrid vehicles.

A soon-to-be-released study projects that an air hybrid engine could improve fuel economy 64 percent in city driving and 12 percent in highway driving. Scientists from the University of California, Ford Motor Company and consultant Michael M. Schechter will present their findings during the SAE 2003 World Congress, March 3 - 6, Cobo Center, Detroit, Michigan, USA.

Unless a really high energy density battery or a more efficient means of compressed air storage can be developed, the future of vehicle propulsion is probably going to belong to fuel cells.

In spite of GM's aggressive schedule to begin using fuel cells in production cars, it still seems more likely that fuel cells will be used as stationary power sources before they are used in transportation. Late in 2002 some researchers at Lawrence Berkeley National Laboratory announced development of an alloy that will dramatically decrease the cost of stationary fuel cells.

The alloy is manufactured using the same process used to make metal filters that work in high temperature applications. Powdered steel is fired in an oxygen-free environment, which creates a porous metal. This stainless steel alloy is much stronger than ceramic, and unlike ceramic, it can be welded, brazed, hammered, and crimp-sealed. This translates to increased design flexibility and reduced manufacturing costs. Furthermore, the cost of stainless steel is approximately $2 per pound, while zirconia is between $30 and $60 per pound.

Alloy construction offers other advantages. A stable, high performance cathode can be operated at between 600 and 800 degrees Celsius. Efficiency loss due to current collection is minimized. And the alloy increases a fuel cell's strength as well as its electronic and thermal conductivity.

But does the design meet the $400 per kilowatt target? First, there's more to a fuel-cell-based generator than fuel cells. Roughly speaking, one-third of a generator's cost lies in the actual fuel cell stack, the other two-thirds lies in external "plumbing" such as insulation and a DC-to-AC inverter. This means the fuel cell stack can't exceed $130 per kilowatt if the entire unit is to meet the $400 per kilowatt target. No problem there: the raw materials for the Berkeley Lab stainless steel-based fuel cell are only $37 per kilowatt.

"The low cost of a metal-based SOFC's raw materials, and its design flexibility, should allow a stack to be manufactured below the $130 fuel cell target," Visco says.

To meet the $400 generator target, the Berkeley Lab fuel cell must now be developed into planar and tubular stack designs, and paired with a low-cost inverter and other supporting technology.

Ultimately, such technology could play a key role in meeting the nation's growing demand for power without incurring a proportional jump in air pollution. According to a recent Department of Energy report, annual energy demand will increase from a current capacity of 363 million kilowatts to 750 million kilowatts by 2020.

Fuel cells will provide many benefits. They will provide more efficient ways to convert fossil fuels into electricity. They will do so with less pollution as well. But they are also an enabling technology for other energy technologies. When photovoltaics drop far enough in price, fuel cells will enhance the value of photovoltaics. Photovoltaics will be used to generate electricity when the sun is shining and the electricity will be storable by using electrolysis to generate hydrogen gas from water. Then the hydrogen can be converted back into water in a fuel cell when stationary or automotive electricity is needed.
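
A back-of-envelope round trip shows what such a solar-hydrogen-fuel cell cycle might look like. The efficiency figures below are rough assumptions chosen for illustration, not measured values for any particular electrolyzer or fuel cell.

# Back-of-envelope round trip: solar electricity -> electrolysis -> hydrogen
# -> fuel cell -> electricity. Efficiencies are assumed illustrative values.

ELECTROLYZER_EFFICIENCY = 0.70   # assumed fraction of electricity stored as hydrogen
FUEL_CELL_EFFICIENCY = 0.50      # assumed fraction of hydrogen energy returned as electricity

def round_trip(kwh_from_solar):
    """Electricity recovered after storing solar output as hydrogen."""
    return kwh_from_solar * ELECTROLYZER_EFFICIENCY * FUEL_CELL_EFFICIENCY

print(round_trip(100.0))  # 100 kWh of solar output -> about 35 kWh back out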

Fuel cells have a very bright future.

By Randall Parker 2003 February 24 03:52 PM  Energy Transportation
Entry Permalink | Comments(1)
Ultrasound Microbubble Gene Therapy

The use of ultrasound microbubble gene therapy to successfully deliver genes into muscles in mice raises hopes of use of this technique to treat muscular dystrophy.

Scientists at the Hammersmith Hospitals NHS Trust, Imperial College London and the Medical Research Council have pioneered a new way of delivering gene therapy, using an innovative combination of ultrasound and microbubbles. Research published today (20th February 2003) in Gene Therapy shows how this new delivery technique not only improves the efficiency of modifying genes, but may offer safety advantages over other methods.

Gene therapy has the potential to treat or cure many diseases where there is an underlying genetic cause, but its progress has been severely hampered by concerns over the way in which genes are delivered. There are a number of safety and other issues surrounding the use of viruses and existing non-viral techniques have proved to be less effective.

Dr Martin Blomley (Senior Lecturer in Radiology) and Dr Qi-Long Lu (Senior Research Scientist), together with Dr Haidong Liang (Research Associate) and Professor Terry Partridge at the Hammersmith Hospitals NHS Trust, Imperial College London, and the Medical Research Council Clinical Sciences Centre, have been developing this new gene delivery technique. They have been studying skeletal muscle in mice, which gives insight into how we might use gene therapy to treat muscular dystrophy in children.

Microbubbles are already in use around the world to improve patient ultrasound scans in the heart, liver and many other areas and are known to be both safe and effective. They are tiny gas bubbles measuring about 3 microns, and are usually injected intravenously to boost ultrasound signals. There is evidence that when ultrasound is applied to microbubbles the microbubbles are disrupted (or "pop") and this can cause small perforations in the target cells, which allows the DNA to enter. This could allow for a "point and shoot" approach, as ultrasound can be pointed at a particular target area.

The Hammersmith Hospital, together with Imperial College, is a leading international centre for the use of microbubbles in imaging and is also at the forefront of research into gene therapy.

The researchers mixed a commercial microbubble, already used by doctors for scanning patients, with DNA that coded for a "reporter gene" and injected it into the skeletal muscle of mice of different ages. The trial showed that the microbubbles and ultrasound helped in delivering DNA, and the efficiency of gene therapy was improved by about ten times. They also observed that even when the microbubbles were used without ultrasound, an improvement in efficiency could be seen, especially in younger mice. In younger mice, no additional improvement in efficiency was conferred by using ultrasound. In addition, in experiments where microbubbles were used, the amount of inflammation and damage associated with the injection was reduced.

Overall results from the trial, which was supported by the Medical Research Council, showed:

  • Microbubble ultrasound improved the delivery of DNA to the muscle
  • The microbubbles have some effect intrinsically and may reduce local inflammation

As non-viral methods are not usually very efficient, viruses have been used in many gene therapy applications. Although efficient at actually delivering genes into target cells, there are problems associated with their use including infection of non-target tissues and dangerous immune responses.

Dr Martin Blomley, Consultant Radiologist at the Hammersmith Hospitals NHS Trust and Senior Lecturer in the Imaging Sciences Department of Imperial College London, commented:

"What we’ve found here seems a promising lead into a new, safe and effective way of delivering genes into target cells – in this case muscle tissue. The combination of microbubbles and ultrasound may offer a targeted approach to gene therapy. In addition, the microbubbles alone have some effect, and we are exploring why this is in further work.

Gene therapy holds great promise in future for curing and ultimately preventing serious diseases but is still in its infancy as a clinical tool. This promising study suggests that there may be a less invasive and more efficient, safe and accurate technique for targeting tissue, than those currently in use.

Now we’ve found a good delivery system, we need to build on the research to improve the technique and assess the possible impact it could have on diseases such as muscular dystrophy."

This technique is much more effective.

The technique proved to be 10 times more effective than more conventional methods.

While some of the news reports on this story are phrased in a way that suggests this is a brand new breakthrough, cardiovascular ultrasound microbubble gene therapy was reported in 2000.

Progress in cardiovascular gene therapy has been hampered by concerns over the safety and practicality of viral vectors and the inefficiency of current nonviral transfection techniques. We have previously reported that ultrasound exposure (USE) enhances transgene expression in vascular cells by up to 10-fold after naked DNA transfection, and enhances lipofection by up to three-fold. We report here that performing USE in the presence of microbubble echocontrast agents enhances acoustic cavitation and is associated with approximately 300-fold increments in transgene expression after naked DNA transfections.

The latest report used muscle as a target and so it is valuable for its demonstration of the potential value of the technique for muscle targets. One downside of this approach is that it is likely to deliver genes into other tissue that is near the muscles. For instance, blood vessel cells would likely receive some of the genes. Delivery of a muscle gene into blood vessels or nerves or other tissue could cause problems if that gene started being expressed in one of those cell types. Therefore it's not clear that this technique will turn out to work well in practice. Whether it does turn out to be useful might depend on the gene being delivered and the type of tissue it is being delivered to. In some cases the gene's regulatory region may prevent it from being activated in tissue that is not the desired target tissue type.

The exact mechanism of action of microbubble gene therapy is not understood.

A new trend in bubble medicine is to use the same kind of microbubbles for therapy, in which the bubbles can act as vectors for directed drug delivery and gene transfection into living cells. The permeability of cell walls for large molecules (both drugs and genes) is dramatically increased in the presence of ultrasound and microbubbles. The nature of the mechanism behind this phenomenon is not yet understood. Jet formation, induced by collapsing bubbles, is one of the candidates for enhancing cell-wall permeation: Electron micrographs of insonated leukemia cells show conspicuous holes in their walls. Jet cavitation damage and cell-wall permeation could thus be two manifestations of the same process. However, other high-energy processes besides jets are associated with the bubble collapse and could be important: Shear and pressure forces, sound waves, and shock waves also provide significant mechanical interactions between bubble and cell.

By Randall Parker 2003 February 24 01:49 PM  Biotech Therapies
Entry Permalink | Comments(2)
Pluripotent Stem Cells Made From Blood Cells

Adult monocyte white blood cells can be converted into stem cells that can become many other cell types.

The particularly powerful – and very scarce – flexible forms of stem cells needed for medical research and treatment may now be both plentiful and simple to produce, with a new technology developed at the U.S. Department of Energy’s Argonne National Laboratory – and the source is as close as your own bloodstream.

These flexible stem cells, able to morph into a variety of cell types, are called “pluripotent,” and before this Argonne research, they have been found only in fetal tissue, which is limited, and in bone marrow, which is difficult to collect. Pluripotent stem cells are important because they can generate all types of tissues found in the body, and the Argonne-developed technology can produce them from adult blood cells.

The finding may eventually offer researchers a practical alternative to the use of embryonic stem cells for research, drug discovery, and transplantation.

Argonne scientist Eliezer Huberman and his colleagues, Yong Zhao and David Greene, examined adult monocytes, a type of white blood cells that act as precursors to macrophages. The researchers found that when monocytes were exposed to a growth factor, they created a set of pluripotent stem cells. After cultivating the stem cells, the scientists were able to make the cells “differentiate” into nerve, liver, and immune system tissue by delivering more growth factors.

“Because of its great promise in medicine, I’m prouder of this work than of anything else I’ve done,” Huberman said.

The research is being published in the Proceedings of the National Academy of Sciences.

Storing the precursor cells in liquid nitrogen had no effect on their differentiation later. Because monocytes can be easily gathered from a patient's own blood supply, the researchers suggest that treating disease with a genetic match to prevent rejection may be possible in the future.

This means that the material should produce valuable candidates for transplantation therapy, useful to replenish immune cells that have been eradicated by cancer therapy or to replace neuronal tissue damaged during spinal cord injury, stroke, Alzheimer’s or Parkinson’s disease.

Funding for the research is from the National Institutes of Health. The researchers have applied for a patent on the new technology.

This is a very exciting development. By avoiding cloning or the use of embryos or fetal tissue, this technique may not provoke as much opposition on ethical grounds. There is still a possibility that some will ethically object. If these cells turn out to be capable of developing into a fetus, some might argue that application of the growth factor is essentially creating an embryo.

It is premature to conclude that these cells are pluripotent and as fully useful as embryonic stem cells. Other labs are going to have to confirm that the cells can become many more cell types. But if the cells can do that then the next question that needs to be asked is whether they are as youthful as stem cells taken from an embryo or umbilical cord. Tests need to be done to measure telomere lengths, ability to go through many cell divisions, speed of cell division, and other indicators.

By Randall Parker 2003 February 24 12:47 PM  Biotech Manipulations
Entry Permalink | Comments(0)
2003 February 22 Saturday
Smallpox Vaccine Risks Put Into Perspective

Jonathan Rauch puts the risk of smallpox vaccination into perspective by comparing it to car driving risks.

But the risk of a potentially life-threatening reaction to the smallpox vaccine is between 14 and 52 per million inoculations, according to the Department of Health and Human Services, and the odds of death are one to two per million. By comparison, the chance of dying behind the wheel of a car is about 24 per million drivers per year. In other words, the fatality risk you would assume by taking the smallpox vaccine is about a 10th the risk you assume by driving around, and the reason for being vaccinated seems somewhat more compelling than, say, the need for a Slurpee.

A more complex breakdown of smallpox vaccination risks puts the risk of death from smallpox vaccination at 5 per million for babies under the age of 1 but a tenth that amount after the age of 1, though there is not enough data to determine the risks for those over age 20.

There is a risk after vaccination of passing vaccinia on to someone who has a compromised immune system. But a car driver has a 100 times greater risk of killing a pedestrian or other nonrider than a vaccinated person has of killing someone through "contact vaccinia."

Working from data that Neff and three of his colleagues recently published in the Journal of the American Medical Association, I figure the odds of dying from "contact vaccinia," as it's called, at two to four per 10 million inoculations. In 2001, by way of comparison, every 10 million licensed drivers caused the deaths of about 300 pedestrians and other nonriders -- people who had not voluntarily assumed the risk of getting into an automobile.
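
Putting the quoted per-million figures side by side makes the comparison easy to check. The only number not taken directly from the excerpts above is the use of the midpoint of the two-to-four-per-ten-million contact vaccinia range.

# Restating the per-million figures quoted above as a quick comparison.

vaccine_death_risk = 2 / 1_000_000        # upper end of the 1-2 per million range
driving_death_risk = 24 / 1_000_000       # per driver per year
contact_vaccinia_risk = 3 / 10_000_000    # midpoint of the 2-4 per 10 million range
pedestrian_death_rate = 300 / 10_000_000  # per 10 million licensed drivers per year

print(f"Driving is ~{driving_death_risk / vaccine_death_risk:.0f}x riskier than being vaccinated")
print(f"Drivers kill ~{pedestrian_death_rate / contact_vaccinia_risk:.0f}x more bystanders than contact vaccinia does")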

There has been a lot of debate about whether getting vaccinated for smallpox is worth it. The problem with the debate is that we can't know what the Iraqi and North Korean regimes and other potential possessors of smallpox are capable of. We do not know with certainty the identity of every government that has smallpox or how well guarded their smallpox stock is. Therefore we are stuck comparing a precisely calculable and known risk of vaccination with an alternative which has risk probabilities that are not known.

Having already been inoculated for smallpox once and having emerged unscathed I'd be inclined to get inoculated again if the opportunity to do so was made available to the general public. I have a rather pessimistic view of what terrorists and nasty regimes are capable of.

It is possible to develop a safer form of smallpox vaccine using DNA vaccines. DNA vaccines cause the body to make antibodies against the proteins they encode, while the inability of the DNA vaccines to replicate eliminates the risk of infection. The problem with such an alternative vaccine is that it would take years to develop and so would be of no help in reducing either the risks of vaccination or the bioterrorist threat in the short term.

DNA vaccines have an additional benefit: They can be much more rapidly modified to deal with bioengineered weaponized versions of pathogens that are immunologically different from naturally occurring versions. The US Navy's Naval Medical Research Center has been working on DNA vaccines.

Building on the innovative DNA vaccine models developed by Carucci and his fellow Navy researchers, the three captains and their colleagues have quietly worked in laboratories at NMRC to develop the next generation of vaccines against deadly diseases, whether they are naturally occurring or bio-engineered weapons.

Traditional vaccines have saved countless millions, but have their limitations. They take years to develop and can be difficult and costly to manufacture. They need constant refrigeration, and generally cannot be mixed to inoculate against more than one disease at a time. And there's always the danger of side effects.

But now, Carucci, Mateczun, Galloway and their colleagues may have taken the first steps to a potential new generation of vaccines, which is expected to be safer, cheaper, stable, have fewer side effects, be more effective against a wider variety of diseases and easier to administer.

They are expected to have what the researchers call "agility" -- that is, they can be retailored quickly to become "just-in-time" inoculations against bacteria, viruses or other pathogens that have emerged or re-engineered to make existing vaccines ineffective.

"One of the potential advantages of this agile vaccine technology, which the Navy is a leader in developing, is that production from start to finish might take a matter of months, not years," said Rear Adm. Steven Hart, MC, head of the Navy's medical research programs.

Even months is still too long a time. What the US and other Western nations need is the ability to sequence a new version of a pathogen and manufacture a new version of a DNA vaccine in a matter of days.

By Randall Parker 2003 February 22 08:36 PM  Dangers Tech General
Entry Permalink | Comments(0)
2003 February 21 Friday
Aged Blood Stem Cells Indicator For Cardiovascular Disease Risk

The levels of a type of adult stem cells called endothelial progenitor cells are inversely correlated with cardiovascular disease risk.

Levels of a type of adult stem cell in the bloodstream may indicate a person's risk of developing cardiovascular disease, according to a study supported by the National Heart, Lung, and Blood Institute (NHLBI), part of the National Institutes of Health in Bethesda, MD.

The study looked at the blood level of endothelial progenitor cells, which are made in the bone marrow and may help the body repair damage to blood vessels. Scientists from NHLBI and Emory University Hospital in Atlanta, GA, found that cardiovascular disease risk was higher in persons with fewer endothelial progenitor cells. The cells of those at higher risk also aged faster than those at lower risk, as determined by the Framingham Heart Study risk factor score, a standard measurement of cardiovascular risk. Additionally, the study found that blood vessels were much less likely to dilate and relax appropriately in persons with low levels of the cells.

Results of the study, which involved 45 healthy men aged 21 and older, some of whom had standard cardiovascular risk factors, appear in the February 13, 2003, issue of The New England Journal of Medicine. The two main forms of cardiovascular disease are heart disease and stroke. Standard heart disease risk factors are age, family history of early heart disease, smoking, high blood pressure, high blood cholesterol, overweight/obesity, physical inactivity, and diabetes.

"Past research on cardiovascular disease has often focused on what causes the damage to the blood vessels," said Dr. Toren Finkel, chief of NHLBI's Cardiology Branch and coauthor of the study. "We looked at the other part of the equation: How does the body repair damaged blood vessels? What does that tell us about the cause of the disease?

"We believe that these endothelial progenitor cells patch damaged sites in blood vessel walls," he continued. "When the cells start to run out, cardiovascular disease worsens. We don't yet know what causes their depletion but it may be related to the fact that the risk of cardiovascular disease increases as people age. For instance, the cells may be used up repairing damage done by other risk factors or those risk factors could directly affect the survival of the endothelial cells themselves.

"Much more research needs to be done to better understand this finding," Finkel added. "But it's possible that, some day, doctors may be able to test a person's risk of cardiovascular disease by taking a blood sample and measuring these cells. If the level is too low, an injection of endothelial cells might boost the body's ability to repair itself and prevent more blood vessel damage."

The decline in endothelial progenitor cells may be due to an aging process that has left those stem cells less able to divide and make new cells for blood vessel repair.

In order to test their hypothesis that endothelial progenitor cells age prematurely in individuals with higher cardiovascular risk factors, the investigators studied endothelial progenitor cells from subjects with either high or low Framingham risk scores. After seven days in culture, a significantly higher number of cells from the high-risk subjects had characteristics of senescence, or aging.

"Cardiovascular health is dependent on the ability of the blood vessels to continually repair themselves," says Arshed Quyyumi, MD, professor of medicine at Emory University School of Medicine, formerly of the NHLBI, and a member of the research team. "Evidence has shown that cardiovascular risk factors ultimately lead to damage to the endothelial layer of blood vessels. We can now speculate that continuing exposure to cardiovascular risk factors not only damages the endothelial layer, but may also lead to the depletion of circulating endothelial progenitor cells. Thus, the net damage to blood vessels and hence the risk of developing atherosclerosis depends not only on the exposure to risk factors, but also on the ability of the bone marrow-derived stem cells of endothelial origin to repair the damage.

"We will need larger studies to determine a definite cause and effect relationship between a decrease in these cells and adverse cardiovascular events. Our study did demonstrate, however, a correlation between endothelial progenitor cells, cardiovascular risk factors, increased senescence of endothelial progenitor cells, or stem cells, and vascular function. We are hopeful that further research will show that endothelial progenitor cells are a useful marker for cardiovascular disease risk."

Here's part of what might be going on: in someone who has cardiovascular risk factors, the endothelial progenitor cells (which are a type of non-embryonic stem cell) may need to divide at a faster rate in order to repair the damage being done to cells in the endothelial layer of blood vessels. The need to divide at a faster rate may be causing the endothelial progenitor cells to age more rapidly.

If the endothelial progenitor cells divide more rapidly in response to damage caused by cardiovascular risk factors then the telomeres on their chromosomes shrink more rapidly and the cells will lose the ability to divide sooner. Short telomeres are a marker for increased risk of mortality. This latest result suggests one reason why: blood vessels cannot be repaired as well, and this increases the risk of heart disease and stroke.

Another reason why some people have lower levels of endothelial progenitor cells may be that they started life with shorter telomeres. Whether the telomeres are shrinking more rapidly or starting out shorter, it is likely that short telomeres are at least one of the causes of the senescence of endothelial progenitor cells. It would be very interesting to test the cells of those with lower and higher Framingham heart disease risk scores and see if those with higher risk scores have shorter telomeres in their endothelial progenitor cells and in other blood cell types.
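
A toy model of that hypothesis: cells that must divide more often to repair vessel damage hit their replicative limit sooner. The telomere lengths, loss per division, and division rates below are illustrative guesses, not measured values for endothelial progenitor cells.

# Toy model: faster division -> faster telomere loss -> earlier senescence.
# All numbers are illustrative assumptions, not measured biology.

START_TELOMERE_BP = 10_000    # assumed starting telomere length (base pairs)
LOSS_PER_DIVISION = 100       # assumed base pairs lost per division
SENESCENCE_THRESHOLD = 4_000  # assumed length below which the cell stops dividing

def years_until_senescence(divisions_per_year):
    """Years of repair activity before the cell line stops dividing."""
    length, years = START_TELOMERE_BP, 0
    while length > SENESCENCE_THRESHOLD:
        length -= LOSS_PER_DIVISION * divisions_per_year
        years += 1
    return years

print(years_until_senescence(1))  # low-risk subject: 60 years in this toy model
print(years_until_senescence(3))  # subject dividing 3x as often: 20 years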

What can be done about this? A direct approach would be to remove stem cells, lengthen their telomeres, and then return those cells to the body. This approach might increase the risk of cancer unless accompanied by other techniques to assure that the cells so treated do not have any mutational damage that makes them prone to become cancerous. Another approach would be to use stem cells from embryos to replace host blood stem cells. But that approach elicits strong ethical objections from some quarters. It's not clear that therapeutic cloning or the harvesting of stem cells from aborted embryos will ever be allowed in the United States.

This latest result is further evidence for the idea that reseeding non-embryonic stem cell reservoirs with more youthful stem cells will be an essential technique for reversing aging.

Update: Statins may provide part of their benefit by boosting endothelial progenitor cells.

Quyyumi noted a class of drugs called statins, used to lower high cholesterol levels and reduce the risk of developing heart disease, have been shown to triple the levels of these stem cells.

This is an interesting twist. If telomere shortening was causing the stem cells to become senescent then one wouldn't expect a drug that lowers cholesterol to boost the levels of these cells.

By Randall Parker 2003 February 21 07:37 PM  Aging Reversal
Entry Permalink | Comments(0)
Human Embryonic Stem Cell Gene Insertion and Deletion

Scientists at U Wisc Madison have developed a technique to add or delete genes in human embryonic stem cells. (bold emphases added)

MADISON - The technique that helped revolutionize modern biology by making the mouse a crucible of genetic manipulation and a window to human disease has been extended to human embryonic stem (ES) cells.

In a study published today (Feb. 10) in the online editions of the journal Nature Biotechnology, a team of scientists from the University of Wisconsin-Madison reports that it has developed methods for recombining segments of DNA within stem cells.

By bringing to bear the technique, known in scientific parlance as homologous recombination, on DNA in human embryonic stem cells, it is now possible to manipulate any part of the human genome to study gene function and mimic human disease in the laboratory dish.

"Indeed, homologous recombination is one of the essential techniques necessary for human ES cells to fulfill their promise as a basic research tool and has important implications for ES cell-based transplantation and gene therapies," write Wisconsin researchers Thomas P. Zwaka and James A, Thomson, the authors of the new study.

The technique has long been used in the mouse and is best known in recent years for its use to generate mice whose genomes have been modified by eliminating one or more genes. Known as 'knockouts,' genetically altered mice have become tremendously important for the study of gene function in mammals, and have been used to explore everything from the underlying mechanisms of obesity and other conditions to the pinpointing of genes that underpin many different diseases.

Significant differences between mouse and human embryonic stem cells have, until now, hampered the application of the technique to human ES cells, according to Zwaka, the lead author of the Nature Biotechnology report and a research scientist working in the laboratory of James Thomson. Thomson was the first to isolate and culture human embryonic stem cells nearly five years ago.

"This is a big benefit for the human ES cell field," Zwaka said. "It means we can simulate all kinds of gene-based diseases in the lab - almost all of them."

To demonstrate, the team led by Zwaka and Thomson were able to remove from the human genome the single gene that causes a rare genetic syndrome known as Lesch-Nyhan, a condition that causes an enzyme deficiency and manifests itself in its victims through self-mutilating behavior such as lip and finger biting and head banging.

The study of genes derived from human ES cells, as opposed to those found in mice, is important because, while there are many genetic similarities between mice and humans, they are not identical. There are human genes that differ in clinically significant ways from the corresponding mouse genes, said Zwaka. The gene that codes for Lesch-Nyhan is such a gene, as mice that do not have the enzyme do not exhibit the dramatic symptoms of the disease found in humans whose genes do not make the enzyme.

Another key aspect of the new work is that it may speed the effort to produce cells that can be used therapeutically. Much of the hype and promise of stem cells has centered on their potential to differentiate into all of the 220 kinds of cells found in the human body. If scientists can guide stem cells - which begin life as blank slates - down developmental pathways to become neurons, heart cells, blood cells or any other kind of cell, medicine may have access to an unlimited supply of tissues and cells that can be used to treat cell-based diseases like Parkinson's, diabetes, or heart disease. Through genetic manipulation, 'marker' genes can now be inserted into the DNA of stem cells destined for a particular developmental fate. The presence or absence of the gene would help clinicians sort cells for therapy.

"Such 'knock-ins' will be useful to purify a specific ES-cell derived cell type from a mixed population," Zwaka said. "It's all about cell lineages. You'll want dopamine neurons. You'll want heart cells. We think this technique will be important for getting us to that point."

Genetic manipulation of stem cells destined for therapeutic use may also be a route to avoiding transplant medicine's biggest pitfall: overcoming the immune system's reaction to foreign cells or tissues. When tissues or organs are transplanted into humans now, drugs are administered to suppress the immune system and patients often need lifelong treatment to prevent the tissue from being rejected.

Through genetic manipulation, it may be possible to mask cells in such a way that the immune system does not recognize them as foreign tissue.

This press release is as notable for what it doesn't say as for what it does say. First let's review what it does say the technique will be useful for.

Yes, this technique will be useful for doing cloning to create cell lines that have knock-outs of genes in order to study the effects of eliminating individual genes. This is routinely done with mouse cell lines. It can even be used with reproductive cloning to find out whether a mouse can still live without a gene and to see what effects the absence of the gene has on whole organisms. Of course it is unlikely (at least in Western countries) that scientists are going to do reproductive cloning on humans with gene knock-outs to discover what effects a gene knock-out has on full human organisms.

The technique will probably be useful in helping to guide cellular differentiation and to solve immuno-compatibility problems in order to create replacement organs.

The press release doesn't mention the possibility that this technique will also probably be useful for the creation of non-embryonic stem cell lines in order to do replenishment of non-embryonic stem cell reservoirs. That will be useful for treating some genetic diseases (e.g. sickle cell anemia) and some types of cancer (e.g. leukemia). But the biggest benefit to come from making non-embryonic stem cells is as a rejuvenation therapy to partially reverse aging.

The biggest potential use of this technique that the press release doesn't mention is like the huge elephant in the room that everyone pretends not to notice: This technique will be useful for human germ line genetic engineering for the purpose of creating genetically engineered human offspring. The ability to delete and insert genes means the ability to replace one version of a gene with a different version of the same gene. It also means the ability to add new genes. Also, additional copies of existing genes could be added in order to get more expression of those genes in the embryo, child, or adult human.

A technique for changing embryonic stem cell genes is unlikely to be limited to embryonic stem cells produced in only one way. It seems likely that the technique will work on embryos produced by cloning, by in vitro fertilization of an egg, or by ordinary sexual reproduction where the embryonic cells are removed from a woman's womb. Therefore, regardless of how a viable embryo is created, it will be possible to genetically engineer it before it develops into a fetus and baby.

By Randall Parker 2003 February 21 03:24 PM  Biotech Manipulations
Entry Permalink | Comments(6)
2003 February 20 Thursday
Quantum Nucleonics May Power USAF UAVs

Picture Global Hawk Unmanned Aerial Vehicles (UAVs) that can fly continuously for months.

The AFRL now has other ideas, though. Instead of a conventional fission reactor, it is focusing on a type of power generator called a quantum nucleonic reactor. This obtains energy by using X-rays to encourage particles in the nuclei of radioactive hafnium-178 to jump down several energy levels, liberating energy in the form of gamma rays. A nuclear UAV would generate thrust by using the energy of these gamma rays to produce a jet of heated air.

A tutorial on quantum nucleonics provides some details on the science involved.

A friend who knows a fair amount of physics thinks that pumping the hafnium nuclei up to a high energy state to make them into a suitable fuel would be a fairly inefficient process, because most of the X-ray energy would not hit the nuclei to cause the needed energy jump. Many times more energy would be needed to pump the nuclei up into the higher energy state than would be given back when they jump down to a lower state and release their energy. Therefore quantum nucleonics is not an approach suitable for large-scale energy storage.

The advantage over batteries or even hydrogen fuel is that quantum nucleonics can be used to produce a material with a much higher energy density. In specialty applications such as military UAVs the energy costs may be worth it because the high energy density would allow continuous operation of a UAV for a long period of time.
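
For a rough sense of the energy density at stake, here is a back-of-envelope calculation in Python using the roughly 2.45 MeV of stored excitation energy commonly cited for the Hf-178m2 isomer (the figures are approximations I am supplying, not numbers from the article):

    # Rough energy-density estimate for the Hf-178m2 nuclear isomer.
    # Assumed figures (approximate): ~2.45 MeV stored per excited nucleus,
    # molar mass ~178 g/mol.
    AVOGADRO = 6.022e23        # nuclei per mole
    MEV_TO_J = 1.602e-13       # joules per MeV
    excitation_mev = 2.45      # energy released per nucleus, MeV
    molar_mass_g = 178.0

    joules_per_gram = AVOGADRO / molar_mass_g * excitation_mev * MEV_TO_J
    print(f"Hf-178m2: ~{joules_per_gram / 1e9:.1f} GJ per gram")  # ~1.3 GJ/g

    # Approximate chemical comparisons: hydrogen ~120 kJ/g, gasoline ~46 kJ/g,
    # so the isomer stores something like 10,000 times more energy per gram.

Even if only a small fraction of that stored energy could be triggered on demand, the per-gram figure dwarfs any chemical fuel, which is why the idea keeps attracting military interest despite the efficiency objection above.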

The technology seems like it would be more attractive for UAV use over oceans than over land. A UAV that crashed into an ocean would likely sink to the bottom, taking its radioactive material far beyond the reach of humans. There are plenty of naval military applications for UAVs, such as tracking enemy fleets, surveilling ships that might be carrying WMD for terrorists, and search and rescue. There are also civilian scientific applications such as continuous data collection for weather and environmental monitoring, and search and rescue is a civilian need as well. One can even imagine long duration UAVs being used as lower altitude equivalents of communications satellites.

Update: Back in August 2001 some Lawrence Livermore National Laboratory scientists published a result that argues against the possibility of using hafnium-178 as an energy storage material.

LIVERMORE, Calif.—Physicists from the Lawrence Livermore National Laboratory, in collaboration with scientists at Los Alamos and Argonne national laboratories, have new results that strongly contradict recent reports claiming an accelerated emission of gamma rays from the nuclear isomer 31-yr. hafnium-178, and the opportunity for a controlled release of energy. The triggering source in the original experiment was a dental X-ray machine.

Using the Advanced Photon Source at Argonne, which has more than 100,000 times higher X-ray intensity than the dental X-ray machine used in the original experiment, and a sample of isomeric Hf-178 fabricated at Los Alamos, the team of physicists expected to see an enormous signal indicating a controlled release of energy stored in the long lived nuclear excited state. However, the scientists observed no such signal and established an upper limit consistent with nuclear science and orders of magnitude below previous reports.

By Randall Parker 2003 February 20 11:34 AM  Airplanes and Spacecraft
Entry Permalink | Comments(4)
2003 February 18 Tuesday
Biological Journals Start Withholding Dangerous Information

In a sign of the times the American Society for Microbiology has instituted a system for review of research articles that contain potentially dangerous information.

As a publisher of 11 peer-reviewed journals in the microbiological sciences, the ASM is on the front lines in dealing with publication of information that could be misused, Atlas pointed out. For this reason, the ASM Publication Board has adopted policies and procedures for dealing with any manuscript that may describe misuse of microbiology or of information derived from microbiology. Reviewers alert editors, who then alert the Editor in Chief. The Editor in Chief contacts the Chair of the ASM Publications Board, and the entire board may be involved in the disposition of the manuscript.

ASM publication policy also requires that research articles must contain sufficient detail to permit the work to be repeated by others, and authors must agree to supply materials in accordance with laws and regulations governing the shipment, transfer, possession, and use of biological materials and that such supply be for legitimate research needs.

During the period 2001-2002, 14,000 manuscripts were submitted to the ASM journals. Of these, 224 dealt with select agents. Of these, 90 were rejected—57 with non-US authors. There were 134 accepted—58 with non-US authors. Among these, 2 (<0.015%) elicited elevated concern. Each was considered by the entire Publications Board and they are to be published with modification, Atlas reported.
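
For orientation, here is how those counts fit together; the <0.015% figure apparently takes all 14,000 submissions as its denominator rather than the 224 select-agent papers (a small Python check):

    # Sanity-check the quoted ASM manuscript counts.
    total_submitted = 14_000
    select_agent = 224
    rejected, accepted = 90, 134
    elevated_concern = 2

    assert rejected + accepted == select_agent
    print(f"{elevated_concern / total_submitted:.4%} of all submissions")   # ~0.0143%
    print(f"{elevated_concern / select_agent:.2%} of select-agent papers")  # ~0.89%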

While this sort of measure will provide some benefit, the real problem is that as technology advances it becomes easier to manipulate matter into whatever form is desired. Most of the technological advances that will make it easier to develop biological weapons will not be advances made specifically in order to make nasty pathogens. Advances that enhance general abilities to study and manipulate biological materials will make it easier to make bioweapons.

Update: The rules for when ASM journals will withhold information are weaker than the statement above makes them appear. The rule seems to refer only to misuse of microbiology by the researcher himself.

Ask ALL reviewers to advise the Editor, by use of the Confidential Comments section of the review form and the appropriate check-off box when it becomes available, if, in their opinion, the manuscript under review describes misuses of microbiology or of information derived from microbiology.

If a researcher has legitimate reasons to study dangerous pathogens and isn't trying to make a bioterror weapon, it doesn't sound like this rule would be invoked to restrict a paper from publication, even if the paper contained information directly applicable to the manufacture of bioweapons. A researcher could, for instance, describe which variations of some viral gene make that virus more or less virulent. There are certainly legitimate reasons for wanting to acquire that type of information. But that information could easily be misused by someone else who otherwise would not be able to figure it out.

The vast bulk of all researchers are not trying to misuse microbiological research techniques. But they are discovering information that is useful for those who do wish to use biological science to harm others.

By Randall Parker 2003 February 18 12:03 PM  Dangers Tech General
Entry Permalink | Comments(0)
Nanotech Sensors For Microfluidics Devices

Microfluidics devices will be enhanced by embedded carbon nanotube sensors.

San Jose, Calif.--February 3, 2003--Cutting edge research is setting the stage for the practical deployment of carbon nanotubes as flow sensors. Studies drawing on both electrokinetic phenomena and slip boundary conditions are offering in-depth understanding of microfluid flow in restricted microchannels.

Complex experiments have now demonstrated that the Coulombic effect, involving direct scattering of free charge carriers from fluctuating Coulombic fields of ions or polar molecules in the flowing liquid, is stronger than the phonon drag effect in generating electric current/voltage.

The outcome has been the emergence of a model for a practical flow sensor, capable of being downsized to small dimensions as short as the nanotubes.

A new avenue has thereby been created to gauge flow in tiny liquid volumes, with high sensitivity at low velocities and exceptionally rapid response times.

Microfluidics will accelerate the rate of advance of biological science and technology. Microfluidic devices will certainly need a variety of built-in sensors. One application for microfluidic devices will be automated mini test labs to allow blood tests to be done right in a doctor's office or even at home.

Of course all technologies have their downsides, and we need to learn to look at every technology and ask how it might (or, rather, will) be abused. In the case of microfluidics, one method of abuse would be to use it to make biowarfare agents. A really complex microfluidic device ought to be able to synthesize a viral pathogen. This could even be used to carry out assassinations: make a pathogen and put it on the surface of something the target is about to touch.

By Randall Parker 2003 February 18 11:27 AM  Nanotech for Biotech
Entry Permalink | Comments(0)
Mitochondrial DNA Mutation Common In Centenarians

A mutation in mitochondrial DNA appears to be linked to longer life expectancy.

Mitochondrial DNA is the portion of the cell DNA that is located in mitochondria, the organelles which are the "powerhouses" of the cell. These organelles capture the energy released from the oxidation of metabolites and convert it into ATP, the energy currency of the cell. Mitochondrial DNA passes only from mother to offspring. Every human cell contains hundreds, or, more often, thousands of mtDNA molecules.

It's known that mtDNA has a high mutation rate. Such mutations can be harmful, beneficial, or neutral. In 1999, Attardi and other colleagues found what Attardi described as a "clear trend" in mtDNA mutations in individuals over the age of 65. In fact, in the skin cells the researchers examined, they found that up to 50 percent of the mtDNA molecules had been mutated.

Then, in another study two years ago, Attardi and colleagues found four centenarians who shared a genetic change in the so-called main control region of mtDNA. Because this region controls DNA replication, that observation raised the possibility that some mutations may extend life.

Now, by analyzing mtDNA isolated from a group of Italian centenarians, the researchers have found a common mutation in the same main control region. Looking at mtDNA in white blood cells of a group of 52 Italians between the ages of 99 and 106, they found that 17 percent had a specific mutation called the C150T transition. That frequency compares to only 3.4 percent of 117 people under the age of 99 who shared the same C150T mutation.

To probe whether the mutation is inherited, the team studied skin cells collected from the same individuals between 9 and 19 years apart. In some, both samples showed that the mutation already existed, while in others, it either appeared or became more abundant during the intervening years. These results suggest that some people inherit the mutation from their mother, while others acquire it during their lifetime.
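
As a rough sanity check on the quoted frequencies (17 per cent of 52 centenarians versus 3.4 per cent of 117 younger controls), the counts below are rounded from those percentages and the snippet assumes SciPy is available:

    # Is the centenarian/control difference in C150T frequency plausibly chance?
    # Counts are rounded from the percentages quoted above.
    from scipy.stats import fisher_exact

    centenarian_carriers, centenarian_noncarriers = 9, 43   # ~17% of 52
    control_carriers, control_noncarriers = 4, 113          # ~3.4% of 117

    odds_ratio, p_value = fisher_exact(
        [[centenarian_carriers, centenarian_noncarriers],
         [control_carriers, control_noncarriers]])
    print(f"odds ratio ~{odds_ratio:.1f}, p = {p_value:.3f}")

With those rounded counts the p-value comes out well below 0.05, so the association is unlikely to be a sampling fluke, though 52 centenarians is still a small group.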

The mitochondria contain DNA that codes for a subset of the proteins used in mitochondria. Most of the genes for mitochondrial proteins are encoded in the nucleus. Those genes that are part of the mitochondrial DNA (mtDNA) are more vulnerable to oxidative damage as cells age because the mitochondria produce a lot of free radicals as a side effect of how they carry out energy metabolism. The presence of mtDNA within mitochondria is rather like an Achilles' heel in the design of eukaryotic organisms. Therefore a mutation in the mtDNA that affects longevity is not surprising.

Is news of this mutation useful for devising anti-aging therapies? Possibly. It might point to a way to slow aging by developing pharmaceutical means of enhancing mtDNA replication. However, the optimal way to deal with the accumulation of damage to mtDNA would be a gene therapy that moved the mtDNA genes into the nucleus. Such a therapy would essentially move the mtDNA genes out of harm's way, and the genetic variations that help those genes survive better in mitochondria would then become irrelevant.

The ability to do gene therapy on a large portion of the cells in the body is probably the capability most needed to turn back the biological clock on aged cells. While some cells and organs will some day be replaced via cell therapy and organ replacements, there are parts of the body where replacement is really not a good idea. Most notably, the central nervous system defines who we are. Even if brain replacement were possible, it would replace who we are with someone else. Gene therapy is most needed for brain rejuvenation so that the cells that constitute our brains can be made youthful again.

The abstract for the PNAS research paper that reports these results is available online.

By Randall Parker 2003 February 18 02:02 AM  Aging Reversal
Entry Permalink | Comments(2)
2003 February 17 Monday
Terrorist Threat From Nuclear Spent Fuel Rod Storage

The threat of terrorism makes dealing with nuclear waste storage problems an urgent priority.

A space-saving method for storing spent nuclear fuel has dramatically heightened the risk of a catastrophic radiation release in the event of a terrorist attack, according to a study initiated at Princeton.

Terrorists targeting the high-density storage systems used at nuclear power plants throughout the nation could cause contamination problems "significantly worse than those from Chernobyl," the study found.

The study authors, a multi-institutional team of researchers led by Frank von Hippel of Princeton, called on the U.S. Congress to mandate the construction of new facilities to house spent fuel in less risky configurations and estimated a cost of $3.5 billion to $7 billion for the project.

The paper is scheduled to be published in the spring in the journal Science and Global Security.

Strapped for long-term storage options, the nation's 103 nuclear power plants routinely pack four to five times the number of spent fuel rods into water-cooled tanks than the tanks were designed to hold, the authors reported. This high-density configuration is safe when cooled by water, but would likely cause a fire -- with catastrophic results -- if the cooling water leaked. The tanks could be ruptured by a hijacked jet or sabotage, the study contends.

The consequences of such a fire would be the release of a radiation plume that would contaminate eight to 70 times more land than the area affected by the 1986 accident in Chernobyl. The cost of such a disaster would run into the hundreds of billions of dollars, the researchers reported.

Society is going to have to be gradually restructured to adjust for the danger posed by small groups waging asymmetric warfare. Technologies that are inherently less usable by terrorists should be preferred over technologies that are more easily turned against the society that uses them.

By Randall Parker 2003 February 17 01:14 AM  Dangers Tech General
Entry Permalink | Comments(1)
2003 February 16 Sunday
What Biotech Could Do For Space Travel

When most people think of space travel they typically think of rockets, spaceships, propulsion systems, spacesuits, and structures to ship to Moon or Mars colonies to live in. The key role that biotechnology could play in enabling space travel and colonization is too often ignored. I'd like to bring up a number of ways that biotechnological advances could enable space travel and colonization.

One big problem with space travel is that the costs per pound or kilogram sent are incredibly high. Spacefarers need ways to make consumables en route and once they have arrived at their destinations. A number of problems need to be solved to make space travel and space colonies feasible. Some of those problems must be solved with a biological approach. Others, while they could be solved with biotech, may be solvable using other approaches as well.

Human Hibernation

This is an approach for reducing consumables on a long space voyage and also for reducing the psychological strain of long journeys in small spaces.

One approach is to try to replicate the state that hibernating animal species enter. The study of the molecular biology of hibernation may yield valuable information. It may not be possible to safely put a human body into a hibernation-like state for weeks or months at a time. Species that hibernate may have metabolic differences so drastic that adjusting humans to be able to hibernate for a long time might be very difficult. One objective of hibernation research should be to discover how extensive the metabolic changes of hibernation are in order to determine whether hibernation is an approach worth pursuing.

A more limited adjustment that increases the number of hours slept per day could probably be achievable with much less modification of human physiology.

Another approach would be to slow the metabolism down into a state that mimics the state achieved by those practicing calorie restriction. Drugs that reduce appetite and slow down metabolism would also decrease the rate of consumption of food. This approach would not provide as much relief from the strain of extended confinement.

Adjust Human Bodies to Low and Zero Gravity

Techniques for manipulating human metabolism to adapt it to space travel and to low gravity Mars and Moon colonies are probably not optional for colonization. Mars has only 0.377 of Earth's gravity and the Moon is even worse with only 0.166 of Earth's gravity. It is likely that extended living in such low gravity environments will cause problems for human health. Bones may weaken so much that return to Earth becomes impossible or extremely difficult. Muscles will similarly atrophy. Plus, the lower demand for blood circulation may cause inflammation and atherosclerosis. There are likely other longer-term effects of low and zero gravity living that will need to be addressed.

Centrifugal spaceships can only solve the problem that low gravity poses for human health during the trips to and from moons and planets. But since moons and some planets have lower gravity than Earth, the problem needs a more general solution. That solution must be a method of manipulating human metabolism to adapt it to low gravity living.

A Closed Biosphere

Make a closed-cycle biosphere for space voyages in order to reduce the weight in consumables that must be sent with a human crew. Microorganisms could be genetically engineered to recycle waste and produce food. If a spaceship is nuclear powered then it will have enough energy to warm and illuminate microorganisms genetically engineered to break down human waste, which would in turn feed still other engineered cells that create food.

The ability to run a closed biosphere implies the ability to grow food. This will be useful not just for reducing weight requirements for food eaten during long journeys to colonies but also for the food eaten at the destinations. Closed biosphere research is probably the most important area where work is needed to support colonization.

Structure Producing Plants and Microorganisms

Mars colonists will need materials suitable for building structures. Chairs, bedframes, baby cribs, walls, and ceilings are just a few of the types of structures they will need to be able to build. Trees grow too slowly and take up too much space. What is needed is a way to use energy from a nuclear power plant to create organic materials to feed organisms that can create materials with wood-like qualities.

Textile Fiber Producing Microorganisms

Mars colonists will need clothes. They'll need bedsheets, pillow cases, napkins, towels, rags, and materials for furniture covering. Any need for textiles that exists on Earth likely will exist on Mars as well. Genetic engineering could produce plants capable of making fibers suitable for textile production. Either the genes for making silk could be genetically engineered into microorganisms or something similar could be done with cotton plant genes. It might even be possible to use cotton plant cells but engineer them to make cotton fibers without being attached to a full plant.

Medicine and Vaccine Producing Plants and Microorganisms

This is a hard one to solve because there are so many drugs that would eventually be useful on Mars. Each drug requires its own series of synthesis steps.

Some vaccine producing plants are under development. But they are less useful on Mars, in part because there won't be as many diseases to contend with. Colonists won't exactly have to worry about getting malaria from mosquitoes. Their numbers will be so low initially that diseases passed from person to person won't have a way to be maintained. Plus, all the colonists can be vaccinated before they leave Earth.

Another reason vaccine creation on Mars will probably not rely on plants or microorganisms is that a single device capable of making DNA vaccines could make all the types of DNA vaccines needed. While drugs each need their own unique set of chemical synthesis steps, DNA vaccines will all be built from the same 4 chemical letters (the bases adenine, cytosine, thymine, and guanine). A single general purpose DNA synthesizer that can be programmed to make any DNA sequence could be used to make all types of DNA vaccines. A lot of groups are working on DNA vaccines and it is reasonable to expect that the optimal DNA sequences for a wide range of DNA vaccines will be available in 10 or 20 years.
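
As a toy illustration of that point, a DNA vaccine "recipe" is just a text string over the four bases that a programmable synthesizer could be fed; the sequence below is made up for illustration, not a real vaccine insert:

    # Illustration only: a DNA vaccine "recipe" is just a string over A, C, G, T.
    # The sequence below is a made-up placeholder, not a real vaccine insert.
    candidate = "ATGGCTTACCGATCGGTAACCTAG"

    assert set(candidate) <= set("ACGT"), "only the four DNA bases are allowed"
    gc = (candidate.count("G") + candidate.count("C")) / len(candidate)
    print(f"{len(candidate)} bases, GC content {gc:.0%}")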

Priorities for Research and Development

Most of the items above would have plenty of commercial uses here on planet Earth. Most of the advances needed will be done for other reasons.

The effects of low gravity on the human body have got to be the biggest set of problems. Progress will be made on these problems due to biomedical research efforts aimed at problems humans have here on Earth. Scientists will figure out how weight is used to signal bones to grow, and the mechanisms by which muscles are signalled to grow will be elucidated as well. The knowledge gained from such research will be useful in treating aging-related changes and for injury healing. While that research will provide a firm foundation upon which to develop drugs, gene therapies, and other techniques to deal with extended living in low gravity environments, a substantial amount of research will still have to be done for that specific purpose. Humans are adapted to the force of one Earth gravity.

Beyond adapting humans to low gravity environments the biggest need is to be able to produce consumables for longer term living. The spaceships used to travel to Mars or the Moon will provide some shelter. Clothing made of long-lasting materials can last for years. So there shouldn't be much need to produce new clothing for the first few years. Methods to grow food and to maintain a closed biosphere would address another really big need.

We do not just need bigger and better rockets and spaceships in order to set up space colonies on the Moon or Mars. There are difficult problems in biology that must be solved. The biggest set of problems concern the human body. We are designed to live in a very narrow range of conditions. Even if we could cheaply go to other places we could not sustain human settlements under conditions for which we are not adapted. Until the basic problems are solved we can only visit other places and then only at great expense.

By Randall Parker 2003 February 16 11:03 PM  Space Exploration
Entry Permalink | Comments(18)
Spheral Solar Will Start Production In 2004

The flexible Spheral Solar Power photovoltaic panels will be usable in house construction and will give buildings a blue denim look.

Buildings of the future could be "clothed" in a flexible, power-generating material that looks like denim. The Canadian company developing the material says it can be draped over just about any shape - greatly expanding the number of places where solar power can be generated.

One big advantage that Spheral Solar has with their process is that they can use a lower grade of silicon. They also use less silicon.

There are no rare materials utilized in the manufacturing of Spheral Solar™ products. Silicon supply has already become a problem for wafer-based technologies that all use the same semiconductor silicon waste stream. Silicon wafer based technologies utilize 18-24 tons of silicon per megawatt of solar cells produced. SSP is expected to utilize 9 tons per megawatt. As a result of the continuous improvement plan already underway, SSP silicon utilization could improve to below 2 tons per megawatt. Because of its inherently low gram/watt consumption of silicon and its ability to utilize many grades of silicon, silicon supply is not a major concern for Spheral Solar™ technology.
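
Putting the quoted figures on the gram-per-watt scale the press release itself mentions is a trivial unit conversion (assuming metric tonnes):

    # Tonnes of silicon per megawatt is numerically the same as grams per watt,
    # since 1 tonne = 1e6 g and 1 MW = 1e6 W.
    for label, t_per_mw in [("wafer-based (low)", 18), ("wafer-based (high)", 24),
                            ("Spheral Solar now", 9), ("Spheral Solar target", 2)]:
        g_per_w = t_per_mw * 1_000_000 / 1_000_000
        print(f"{label}: {g_per_w:.0f} g of silicon per watt")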

See this previous post for more on Spheral Solar.

Also see this brief and interesting history of how Spheral Solar Power came to be. What is amazing is how Texas Instruments and Ontario Hydro failed for many years to push the development of this promising technology.

By Randall Parker 2003 February 16 01:01 PM  Energy Solar
Entry Permalink | Comments(15)
2003 February 14 Friday
Cloned Sheep Dolly Dead At Premature Age 6

Sheep normally live to be 11 or 12. Dolly was 6 and a half and lame with arthritis. She was euthanised due to a lung condition.

The institute's Dr. Harry Griffin said Dolly had suffered from a virus-induced lung cancer that was also diagnosed in the past few months in other sheep housed with Dolly.

She might not have gotten the infection and cancer as a result of being a clone, and it is not clear that being a clone contributed to her early death. However, one scientist who was involved in research on Dolly says her clone status did contribute.

Professor Rudolf Jaenisch, who in March 2001 co-wrote an article with Dolly's creator Professor Ian Wilmut for the journal Science titled "Don't Clone Humans!" said that the death "is exactly what was expected: clones will die early".

How could her status as a clone contribute to her early death? Dolly had short telomere caps on her chromosomes.

The researchers found that Dolly's telomeres were shorter than other 3-year-old sheep, suggesting she is genetically older than her birth date.

The telomeres of chromosomes shrink with age, losing 100-200 base pairs in length every time a cell divides. Shorter telomeres have recently been demonstrated to correlate with shorter life expectancy in humans. This strongly suggests that Dolly's problems stemmed from her shorter telomeres. Certainly her immune system would be less vigorous as a result of shortened telomeres.
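
For a rough illustration of why being born with already-shortened telomeres matters, assume a newborn's telomeres run roughly 10 to 15 kilobases and that cells get into trouble somewhere around 5 kilobases (approximate figures I am supplying, not from the articles):

    # Rough estimate of how many divisions telomere attrition allows.
    # Assumed figures: start ~10,000-15,000 bp, trouble near ~5,000 bp,
    # loss of 100-200 bp per division (as noted above).
    def divisions_left(start_bp, critical_bp, loss_per_division_bp):
        return (start_bp - critical_bp) // loss_per_division_bp

    print(divisions_left(15_000, 5_000, 100))   # ~100 divisions, best case
    print(divisions_left(10_000, 5_000, 200))   # ~25 divisions, worst case

A clone that starts out with the shortened telomeres of its adult donor has already spent part of that budget at birth.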

However, as scientists cloning cows at Advanced Cell Technology have demonstrated, cloning of cells from an older organism does not always lead to shorter telomeres in the cloned offspring.

Moreover, when the scientists examined skin cells from the clones, they found that the telomeres were longer than those of the original senescent cells and even longer than those of typical newborn calves. One cloned cow that's now 2 years old has the telomeres of a calf, says Lanza.

ACT might be using a technique that is part of their proprietary cloning technology to achieve better cloning outcomes.

Worcester, MA, November 22, 2001 – Advanced Cell Technology, Inc. (ACT) and its subsidiary Cyagra, Inc. today reported that its proprietary cloning technology has been used to produce healthy and normal adult animals. ACT evaluated 30 cattle cloned from proliferating skin cells. Twenty-four (80%) of the clones were vigorous and remained alive and healthy one to four years later (by comparison, survival to adulthood normally ranges from 84% to 87%). Results of general health screens, physical examination and immune function were normal for all clones, including laboratory analysis on blood and urine, biochemistry, and behavioral responses. "We haven't observed any of the genetic defects, immune deficiencies or other abnormalities reported in the popular or scientific press," said Robert Lanza, M.D., Vice-President of Medical & Scientific ACT. "All of the data collected reinforce the view that these animals were clinically and phenotypically normal." The report will be published in next week's issue of SCIENCE (November 30, 2001), titled "Cloned Cattle Can Be Healthy and Normal" by ACT and its collaborators at the Mayo Clinic, Trans Ova Genetics, Em Tran and the University of Pennsylvania.

ACT has also done therapeutic cloning experiments where they cloned old cows, extracted cells from the embryo, and then injected the embryo cells back into the old cattle. These cells became major sources of blood immune system cells in the old cattle.

He said that 170 days later the injected cells had survived and were thriving in the blood of the cattle. When put into lab dishes, they grew abundantly, much as young fetal cells do.

This is an important experiment because ACT was able to take cells from an old cow and from them make cells that were much younger that could be injected back into the same cow to rejuvenate the cow's aged immune system. The problem with this approach is that since it involves the creation of an embryo it is considered morally objectionable by many and quite possibly will be outlawed by the US Congress.

See this previous post for more on the importance of the ACT work with cow immune system rebooting with stem cells. Also see this post entitled Aubrey de Grey On Stem Cell Reseeding For Aging.

An experiment with fetal cells was recently done in the UK where, instead of using cloning to create a source of cells, aborted fetuses were used. The fetuses provided eye stem cells that improved the eyesight of sufferers of retinitis pigmentosa.

Transplants of fetal eye tissue seem to have improved the vision of two out of four people with a degenerative eye disease. It is too early to be sure the improvements are real and lasting, but on the strength of the results the team pioneering the surgery has asked regulators for permission to carry out further operations.

The ACT therapeutic cloning experiment to rejuvenate aged cow immune systems and the experiment with aborted fetus eye stem cells both demonstrate the therapeutic potential of cells derived from embryos and fetuses (whether pluripotent or not). Work on adult stem cells so far has not produced cell lines with all the same advantages. On the bright side, these experiments demonstrate the enormous future therapeutic potential of cell therapies. However, regardless of how one feels personally about the use of therapeutic cloning or of aborted fetuses as stem cell sources, enough people have ethical objections to these approaches that the scientifically quickest route to many useful cell therapies is likely to be legally blocked in at least some countries.

By Randall Parker 2003 February 14 07:34 PM  Aging Reversal
Entry Permalink | Comments(10)
New Generation Bomb Detectors Under Development

Fortune has a nice survey of a wide range of chemical and nuclear weapons detectors under development.

The neutron-scanning leader is Ancore Corp., a small company in Santa Clara, Calif., that was recently acquired by OSI, a California inspection-systems firm. Ancore's president, physicist Tsahi Gozani, who co-invented the technology in the late 1980s, has fought an uphill battle ever since to convince potential buyers that it can be cost-effective. His work is cut out for him: Among other precious parts, Ancore's $10 million scanner includes a custom-built, 30-foot-long atom smasher to generate neutron beams.

What is especially curious about this is how technologies previously used only by particle physicists for basic research are being adapted to detect different types of materials.

By Randall Parker 2003 February 14 02:16 PM  Dangers Tech General
Entry Permalink | Comments(0)
2003 February 12 Wednesday
Cornell Group Can Watch One Molecule At A Time

Cornell University scientists have developed the means to optically watch a single biological molecule at a time.

Until now, researchers were constrained from seeing individual molecules of an enzyme (a complex protein) interacting with other molecules under a microscope at relatively high physiological concentrations -- their natural environment -- by the wavelength of light, which limits the smallest volume of a sample that can be observed. This, in turn, limits the lowest number of molecules that can be observed in the microscope's focal spot to more than 1,000. Internal reflection microscopes have managed to reduce the number of molecules to about 100. But because this number is still far too high to detect individual molecules, significant dilution of samples is required.

The researchers have discovered a way around these limitations, and in the process reduced the sample being observed 10,000-fold to just 2,500 cubic nanometers (1 nanometer is the width of 10 hydrogen atoms, or 1 billionth of a meter), by creating a microchip that actually prevents light from passing through and illuminating the bulk of the sample. The microchip, engineered from aluminum and glass in the Cornell Nanoscale Science and Technology Facility, a NSF-funded national center, contains 2 million holes (each called a waveguide), some as tiny as 40 nanometers in diameter, or one-tenth of the wavelength of light.

Small droplets of a mixture containing enzymes and specially prepared molecules were pipetted into wells on the microchip, and the chip was placed in an optical microscope. Each of the chip's holes is so tiny that light from a laser beam is unable to pass through and instead is reflected by the microchip's aluminum surface, with some photons "leaking" a short distance into the hole, on the bottom of which an enzyme molecule is located.

These few leaking photons are enough to illuminate fluorescent molecules, called fluorophores, attached as "tags" to nucleotides (molecules that make up the long chains of DNA) in the sample. In this way, the researchers were able to observe, for the first time, the interaction between the ligand (the tagged nucleotide) and the enzyme in the observation volume (the region of the mixture that can be seen).

The problem until now has been seeing exactly how long an interaction between a biological molecule and an enzyme takes and how much time elapses between these interactions. This is complicated by the need to distinguish those molecules interacting with the protein and those just passing by. "A freely moving molecule will come in and out of the observation volume very quickly -- on the order of a microsecond. But if it interacts with the enzyme it will sit there for a millisecond," says Levene. "There are three orders of magnitude difference in the length of time that we see this burst of fluorescence. So now it's very easy to discriminate between random occurrences of one ligand and a ligand interacting with the enzyme."

Says Webb: "We see only one fluorescent ligand at a time, so we can now follow the kinetics [movement and behavior] in real time of individual reactions." He adds, "We can actually see the process of interaction."

This is pretty impressive. The problem with biology has always been that the most important mechanisms are shaped and operate on such a small scale that it is hard to figure out exactly how biological systems function. Anything that makes it easier to watch smaller scale phenomena can be very beneficial in speeding up the rate at which biological systems can be taken apart and figured out.

A logical extension of this technique would probably be to use quantum dots in place of the fluorophores. Quantum dots last longer and they can be tuned to emit light at many different frequencies. That way different molecules emitting at different frequencies could be watched at the same time.

By Randall Parker 2003 February 12 05:10 PM  Biotech Advance Rates
Entry Permalink | Comments(0)
Light Bulb Replacements To Make Driving Safer

Diode lights will reduce energy costs and provide greater control over lighting color and lighting locations. Aside from the obvious expected benefits, their ability to light up faster will reduce car accidents.

Buses, trucks and autos have diodes in brake lights and interior lighting. Styling and maintenance benefits are driving the trend, but there are safety benefits, too. Because the diodes light up fractions of a second faster than do incandescent lights when a driver hits the brakes, anyone trailing a vehicle at 65 miles an hour is able to stop about 19 feet sooner, according to a study at the University of Michigan Transportation Research Institute.
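
The quoted 19 feet is consistent with simple arithmetic if one assumes the roughly 0.2 second rise-time advantage usually cited for LEDs over incandescent bulbs (the 0.2 s figure is my assumption, not from the article):

    # Stopping-distance advantage from brake lights that illuminate sooner.
    speed_mph = 65
    led_advantage_s = 0.2                      # assumed rise-time advantage

    speed_ft_per_s = speed_mph * 5280 / 3600   # ~95.3 ft/s
    print(f"{speed_ft_per_s * led_advantage_s:.0f} feet shorter stopping distance")  # ~19 ft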

For more on the future of light bulb replacement technologies see this and this previous post.

By Randall Parker 2003 February 12 04:12 PM  Materials Advances
Entry Permalink | Comments(1)
2003 February 11 Tuesday
Space Shuttle Replacements And Space Exploration

On the NuclearSpace.com website Robert Zubrin argues that the Space Shuttle is a very inefficient way to put people into space.

In truth, the shuttle is not a space lift vehicle at all; rather, it is a self-launching space station. It is not a truck with a heavy hauling capability, it is a Winnebago whose primary function is to move itself. The shuttle at lift off has the same thrust as a Saturn V moon rocket, yet it has only 15 percent of the payload, because 85 percent of the mass it delivers to orbit is that of the orbiter itself. This is why it is the least efficient payload delivery system ever flown.

Zubrin argues that the Shuttle's rockets could be used more productively with an unmanned upper stage to put payloads into orbit as big as what the Saturn V could launch. While this might be a good idea given where we stand right now, it demonstrates just how little we have progressed since the Apollo program was cancelled. If we make the right decision we can have as much launch capability as the Saturn V provided. Oh geez, why am I not excited?

Zubrin also argues for the creation of a new human-carrying spaceplane that would not try to carry cargo along with the humans. It would therefore be small enough to sit at the top of expendable rockets (Delta or Atlas) and would be able to fly back to Earth. It would be able to fly itself away from a Delta or Atlas that failed and, since it would sit entirely above the rocket, would not be susceptible to damage from pieces falling off the rocket. This is not a new idea. As John Pike points out in a New York Times article, the idea was under discussion in the 1960s.

Mr. Pike said the concept of a reusable plane on top of an expendable rocket dates from the 1960's, before NASA decided on the shuttle. "When you sit down and do the math, if all you're trying to do is get people back and forth from a space station, that's what you want," he said. "That's the appropriate degree of reusability. After four and a half decades of the space age, the technology to do that is readily at hand. There's essentially no research required. It's literally off the shelf."

At this point it seems likely that NASA will use existing technology to build the kind of design they should have pursued 40 years ago. They will probably build something better than what they could have built back then because materials science has advanced in the meantime and computers can test and change designs more rapidly than humans could in the pre-CAD/CAE era. But they won't push the envelope of what is possible when they build that spaceplane.

It is probably true that with current technology we could mount a human mission to Mars. Some proponents of a Mars mission argue that since it is technically possible now we should do it, because it would be a huge step forward in human exploration. But while it would be a huge step forward in terms of the uniqueness of the accomplishment, would it be an enabling step toward later steps? I think we ought to stop, step back, and look at what happened as a result of the Apollo missions to the Moon. Once the trip had been made people quickly lost enthusiasm, because it was expensive and there was no way (absent even greater on-going expense) to maintain a permanent human settlement on the Moon. The Apollo program did not produce technology that made human presence in space or on the Moon an economically viable proposition.

As long as the human presence in space is so expensive that it requires widespread public support to get tax money to fund it we are not going to go into space in any sustained fashion. We might be able to get public support to a high enough level at some point to do a Mars mission. But is it wise to do so? After it is over and the astronauts have returned to have their tickertape parade we could find ourselves back in the same position we were in as the Apollo program wound down. If we do not make technological advances that make a human presence cost-effective to maintain then a human presence isn't going to be maintained, let alone grow.

If we are to move into space in large numbers and sustain a human presence there, then we should put the development of new enabling technologies ahead of building hardware to execute large missions with existing technology. Building hardware and running missions with existing technology does not move us any closer to the creation of permanent self-supporting human settlements on the Moon or Mars. What it does is delay the development of those settlements, because it burns through money doing things that do not push the technological envelope very far. If we compare where we are technologically to where we need to be to make self-sustaining settlements, it is clear that there is a very large gap. Closing that technological gap should be our highest funding priority. Among the technologies we should pursue toward that end:

  • Nuclear propulsion for human spacecraft.
  • Scramjet launch vehicles. Advances would need to be made in both materials science and in our understanding of the physics of the intake air in hypersonic flight.
  • Nanotechnology for making a cable to reach up into orbit.
  • Biological advances to discover how to prevent muscle and bone loss in low and zero gravity environments.
  • Biotechnological advances to grow foods and medicines for a remote colony.
  • Biotechnological advances to grow structures and fibers for a remote colony.

If there is to be a government-funded space program then it should pursue the achievement of longer term goals. The pursuit of shorter term goals has plagued the space program from its inception. The result after over 40 years of human space flight is a very expensive, unreliable and dangerous set of technologies for supporting human activity in space. It is time to learn from our mistakes and commit to working on the hard technical problems that must be solved to enable permanent self-sustaining human settlements off of planet Earth.

Update: Paul Krugman calls for an end to manned spaceflight using current rocket technology.

Does that mean people should never again go into space? Of course not. Technology marches on: Someday we will have a cost-effective way to get people into orbit and back again. At that point it will be worth rethinking the uses of space. I'm not giving up on the dream of space colonization. But our current approach -- using hugely expensive rockets to launch a handful of people into space, where they have nothing much to do -- is a dead end.

At the risk of sounding repetitive: We should work on making the large leaps in technology that would enable space travel and colonization to be done on a larger scale and on a more sustainable economically self-supporting basis. Money spent operating current technology is money poured down the drain.

By Randall Parker 2003 February 11 04:22 PM  Space Exploration
Entry Permalink | Comments(40)
2003 February 10 Monday
Richard Muller On Benefits Of Unmanned Space Efforts

UC Berkeley Physics Professor Richard Muller argues that the biggest NASA achievements in space in the last two decades did not involve manned missions.

Hubble aside, what would you name as the really glorious achievements of NASA in the last 20 years? My favorite: the discovery that every moon of every planet is significantly different from every other moon, a result completely unanticipated and still not understood. One might also pick the amazing success of weather satellites. Or the remarkable pictures you get from your satellite TV system. Those in the know might pick our space spy systems. Then there’s GPS—the Global Positioning System, used to guide airplanes, boats, hikers, automobiles as well as soldiers and smart weapons. These projects have one thing in common: they were all unmanned.

Note that some of the achievements Muller lists were not done by NASA. GPS was developed by the military. Weather satellites are similarly funded by a different government agency (NOAA? National Weather Service? one of those). For the amount of money that has been spent on manned space trips over the last 20 years we could have funded an enormous amount of space science as well as a great deal of development of radically more advanced space launch and space travel technologies.

By Randall Parker 2003 February 10 01:42 PM  Space Exploration
Entry Permalink | Comments(1)
Wind Power Comes Wrong Time Of Year

Wind power isn't there when it's most needed.

"We do want to include wind in the mix. Wind is becoming more price competitive at 3 to 5 cents per kilowatt," said Koszyk, noting coal generation costs about 3 cents per kilowatt and nuclear, 3 to 5 cents. "But the size of the farms is of concern to us. Electricity cannot be stored. We need the right amount of energy at the right time."

He also noted winds typically blow hardest at times of lower energy demand - spring and fall. Peak usage occurs in summer and winter.

Wind power needs to become much cheaper to compensate for its inconsistent availability. Since it's not always there when it is most needed, additional non-wind generating plants must be built and maintained for use when the wind isn't blowing. If wind power fell enough in price then it might become justifiable to convert electricity to another form of energy and then convert it back to electricity when needed. The development of cheap methods of storing mass quantities of hydrogen would be an obvious enabling technology for wind power. Electricity could be used to generate hydrogen by the electrolysis of water. Then the hydrogen could be used to run hydrogen fuel cells. Of course, that approach would also require a reduction in the cost of hydrogen fuel cells.
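
To get a rough sense of the cost of that detour through hydrogen, here is a sketch using illustrative efficiency figures (my assumptions, not measured values):

    # Rough round-trip efficiency of storing wind electricity as hydrogen.
    # Illustrative assumptions: electrolysis ~70%, compression and storage ~90%,
    # fuel cell conversion back to electricity ~50%.
    electrolysis, storage, fuel_cell = 0.70, 0.90, 0.50

    round_trip = electrolysis * storage * fuel_cell
    print(f"round-trip efficiency: {round_trip:.0%}")              # ~32%
    print(f"wind kWh in per kWh delivered: {1 / round_trip:.1f}")  # ~3.2

Even with generous assumptions, roughly three wind-generated kilowatt-hours go in for every one that comes back out, which is why the wind electricity itself has to get very cheap before this makes sense.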

When hydrogen fuel cells become cheap and dependable enough for vehicle use then wind power could be used to generate hydrogen to power cars. This would be a more attractive proposition than the use of wind power to generate home and industrial electricity since the vehicles would need stored hydrogen anyhow.

By Randall Parker 2003 February 10 01:11 PM  Energy Wind
Entry Permalink | Comments(1)
2003 February 08 Saturday
Start-Up Companies Chasing Suborbital Tourist Market

Technology Review surveys the burgeoning field of privately funded space launch start-ups.

Of course, people have tried for decades to realize the vision of a reusable rocket plane, with little success. “Rocket science has become synonymous with advanced technology, but the fact of the matter is that there has been very little in the way of new development of rockets since the early 1960s,” says Xcor Aerospace president Jeff Greason, a former Intel executive. What’s different now, he and others say, is that even before Columbia broke apart on February 1, people were actually starting to build and test new designs. Indeed, more than two dozen companies worldwide, not to mention NASA and other national space agencies, are actively developing rocket planes. And with the loss of the Columbia, deaths of seven astronauts, and subsequent grounding of the remaining shuttles, both the number of developers and the urgency of their task are likely to grow. “The need to find some way to get new technologies and new approaches to space transportation is probably a lot clearer than it was before,” Greason says.

XCOR has test flown their EZ-Rocket prototype.

Mojave, CA, Friday, July 12, 2002: XCOR Aerospace announced yesterday that its EZ-Rocket flew twice in one day. The flights were in preparation for the first air show flight of the EZ-Rocket at EAA AirVenture 2002 in Oshkosh, WI later this month. In addition to flying twice in one day, the EZ-Rocket performed two mid-flight engine restarts during each flight, another first for the EZ-Rocket.

The plane took off at 8:00 AM and performed a series of steep climbs while making multiple passes over the Mojave Airport. After the morning flight the EZ-Rocket was brought back to the hanger for refueling. "We were able to reload propellants quickly by cryogenically chilling our helium that is used to pressurize the propellant tanks," said XCOR Rocket Engineer Doug Jones. "Typically, as we load helium, its temperature rises through compression heating. Chilling the helium during loading negates this heating and allows us to get a full load onto the EZ-Rocket quickly."

At 1:15 PM the EZ-Rocket was rolled back out to the runway and made its second flight of the day with Dick Rutan at the controls. The EZ-Rocket performed another series of tight turns, steep climbs, and a wingover maneuver. "It was a zero-defects flight," said test pilot Dick Rutan. "It's quite pleasant to fly with that much power. It reminded me of my days in the military flying high performance jet fighters with afterburners. I am confident we'll be able to put on a great show at Oshkosh."

The Rutan brothers are involved with XCOR.

The EZ-Rocket, a modified Long EZ plane piloted by retired Lt. Col. Dick Rutan, flew two flights in one day earlier this month. Rutan's brother Burt Rutan, of Scaled Composites, designed the Long EZ plane and is also developing a separate reusable vehicle as part of the $10 million X-Prize competition to put three people in space and return them safely.

XCOR expects to be able to extend their upcoming Xerus design to make it capable of delivering microsatellites into low Earth orbit.

In this configuration, our suborbital vehicle would function as a reusable first stage that carries an expendable upper stage. Our vehicle releases the upper stage, which has its own rocket engine and is capable of putting a microsatellite into low Earth orbit. This vehicle will service the current small payload market as well as customers who today are not in the satellite launch market for reasons of expense and lead time.

Currently existing satellite launch vehicles do not allow for quick turnaround experiments. Not even the military has rapid and responsive access to space. Microsatellites almost always are launched as secondary payloads that are tied to the schedule of the larger primary payload. Although this is currently a small market, we think it has a large potential for development. Electronics miniaturization and diminishing size and power needs means that the next generation of certain kinds of satellites need not be as large as current models.

Pioneer Rocketplane is working on a spaceplane that can be used to deliver satellites into low Earth orbit.

The Pathfinder is a sub-orbital propellant-transfer spaceplane. The configuration is a two-seat fighter-bomber-sized aircraft powered by two turbofan engines and one kerosene/oxygen-burning RD-120 rocket engine. The Pathfinder aircraft is designed to take off with its turbofan engines, and climb to approximately 30,000 feet where it meets a tanker aircraft. The tanker then transfers about 150,000 pounds of liquid oxygen to the Pathfinder spaceplane. After disconnecting from the tanker, the spaceplane starts its rocket engine and climbs to 70 mile altitude and Mach 15. By this time, the spaceplane is outside the atmosphere and can open its payload bay doors, releasing the payload with a liquid rocket upper stage, which delivers the payload to its intended low-earth orbit. The doors are then closed and the Pathfinder aircraft reenters the atmosphere. After slowing down to subsonic speeds, the turbofan engines are restarted and the aircraft is flown to a landing field.

The efforts of these private companies are more reason for optimism about the future of space travel than anything that NASA is currently doing. These companies are free of the complex bureaucracies of NASA and of the big aerospace firms. Their engineers can make purely engineering judgements. Lots of companies means lots of new designs get tried. And if any of these companies produces a working production craft, the sub-orbital space tourist market will provide revenue from its first generation design to fund development of successive generations.

By Randall Parker 2003 February 08 07:48 PM  Airplanes and Spacecraft
Entry Permalink | Comments(3)
Space Shuttle Economics

Jay Manifold has a great (or infuriating - actually both) post up on the economics of the Space Shuttle and ISS. Skylab wasn't as pretty looking but it was way more cost effective.

While I'm making comparisons, consider that Skylab had a total habitable volume of 361 m3, and cost less than $100 million (see page 5); for comparison, the ISS has a habitable volume of 425 m3, for a cost approaching $100 billion. In the Encyclopedia Astronautica Skylab entry referenced above, Mark Wade concludes that a second Skylab/Apollo-Soyuz could have been launched in the mid-1970s, "an International Space Station, at a tenth of the cost and twenty years earlier." I'd say more like less than 1% the cost and thirty years earlier ...
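
Running the quoted numbers (taking the rough cost figures at face value) backs that up:

    # Cost per cubic meter of habitable volume, from the figures quoted above.
    skylab_cost, skylab_volume_m3 = 100e6, 361   # < $100 million, 361 m^3
    iss_cost, iss_volume_m3 = 100e9, 425         # ~ $100 billion, 425 m^3

    skylab_per_m3 = skylab_cost / skylab_volume_m3
    iss_per_m3 = iss_cost / iss_volume_m3
    print(f"Skylab: ~${skylab_per_m3 / 1e6:.1f} million per cubic meter")   # ~$0.3M
    print(f"ISS:    ~${iss_per_m3 / 1e6:.0f} million per cubic meter")      # ~$235M
    print(f"ratio:  ~{iss_per_m3 / skylab_per_m3:.0f}x")                    # ~850x

A factor of roughly 850 in cost per unit of habitable volume is consistent with the "less than 1% the cost" estimate above.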

NASA is being run as a Congressional district and aerospace industry jobs program. It has made itself irrelevant to the future of humanity in space. Will the latest tragedy be enough to convince Congress that a radical change in course is necessary?

The first step toward a more productive space program would be to announce the permanent grounding and retirement of the Space Shuttle. Relegate it to history and move on. Take its funding and use it to develop nuclear propulsion, a large variety of experimental space launch vehicles, research on biological problems with space travel (zero gravity effects, growing food, growing drugs, and even growing structures on Mars and the Moon), and nanotechnological research on materials fabrication for the special requirements for rocket engines, hypersonic ramjets and other demanding space applications.

By Randall Parker 2003 February 08 02:48 PM  Airplanes and Spacecraft
Entry Permalink | Comments(1)
Gregory Benford: Beyond the Shuttle

Science fiction writer and physics professor Gregory Benford has an excellent article up about NASA and the future of manned space flight.

Perhaps the only good thing about this disaster is that it will prompt NASA to rethink the design of manned spacecraft from first principles. Foremost is that the more complex a spacecraft is, the more things can go wrong.

The safest manned descent module was also the simplest: the Soviet "sharik" descent capsule, which was used by Vostok and Voskhod craft, and also in many unmanned missions since. It was just a sphere with the center of gravity on the side with the thickest ablative thermal shielding, so it was self-stabilizing. Even if the retrorockets failed to separate, it could re-enter safely. Simple ballistic craft that do not fly are also (relatively) simple.

With a spaceplane like the shuttle, however, you are not only committed to a complex shape, you are also committed to using brittle ceramic materials for thermal shielding. The first item on NASA's agenda will be to revisit the tiles issue.

There is the old KISS principle of engineering: Keep It Simple Stupid. NASA's Shuttle design violates that principle in a big way and the result is an expensive, unreliable, and unsafe spacecraft. Benford argues for inherently more reliable designs that do not rely on so many things to go right in order to work.

Benford argues for the development of a centrifuge in space because it is needed for human health during extended periods in zero gravity. That seems like the wrong solution to the problem. It makes much more sense to fund basic research into how muscle and bone growth is regulated. If control could be achieved over those processes then humans could be adapted to zero gravity living. This would be beneficial for more than just zero gravity conditions. As Benford points out, Mars has only 0.377 of Earth's gravity. But the problem of insufficient gravity for human health is even worse for a Moon base with the Moon having only 0.166 of Earth's gravity. Plus, a space hotel at the L1 Earth-Moon Lagrange point would be cheaper to build and operate if it didn't have to be a large centrifuge.

An even more compelling reason to solve the human gravity problem with a biological approach is that the research would surely produce valuable information for the treatment of osteoporosis as well as for healing bone and tissue injuries. For years one justification offered up for space exploration has been that it will yield valuable technological spin-offs that benefit us down here on Earth. A biological approach to solving the gravity problem would produce medically valuable research results.

Benford also argues for developing a closed biosphere. Certainly permanent Moon and Mars bases should have the ability to grow their own food. This problem, too, would best be solved through biological research. Tissue engineering techniques could be used to develop cell lines that can grow edible steak meat and chicken meat. Plant and animal cell lines could be developed to produce optimal quantities of vitamins and other nutrients. This is another avenue of research that could be pursued to enable space exploration while generating technological spin-offs with commercially valuable Earth-bound applications.

Biotechnological approaches could address many other problems that would need to be solved in order to maintain human populations on permanent Moon and Mars bases. One problem is medical. One could take along as much of each type of drug as might conceivably be needed, but there are too many drugs and it would be difficult to predict needs. One approach to solving this problem would be to genetically engineer strains of bacteria, yeast or other organisms to produce a large variety of drugs. One would need to take along frozen samples of each strain that produces a given type of drug. Then when the need for a drug arose the relevant strain could rapidly be cultured to produce the needed quantity.

We should not rush to make a trip to Mars. We should instead identify all the technological problems that need to be solved in order to make a Mars trip and permanent establishment of a Mars base safe and affordable. We should not push out into space using barely adequate technology. We should put technology development first. Nuclear propulsion for much faster interplanetary travel, biological techniques to adapt to zero gravity, and biological technologies for growth of food and drugs, are just a few of the areas that a forward thinking space program would fund.

By Randall Parker 2003 February 08 02:30 PM  Space Exploration
Entry Permalink | Comments(5)
2003 February 07 Friday
Space Travel Will Be Enabled By Non-Space Technologies

Rand Simberg argues that we need low cost reusable space launch vehicles.

We need to recognize that we have a chicken and egg problem. We will only get low costs and reliability with high activity levels, and we will only get high activity levels with vehicles designed to sustain them, at low cost (and that means not throwing them away).

In the comments section of that post Michael Mealling argues that only a business approach to space will make space development happen.

IMHO, there are two methods: 1) we all build businesses unrelated to space and create enough wealth among us that we can pay to have that value network built for us (there is empirical evidence that this works) 2) we figure out disruptive technologies/products/business methods that change the underlying assumptions about space and its relationship to people on the planet. The first one is tractable and relatively easy. The second is much more fun and potentially paradigm changing but extremely hard.

Let me argue a different viewpoint: The vast bulk of the technologies that will eventually enable significant human movement into space will come from outside the aerospace industry and will not come from people whose motive is to develop cheap, safe spaceflight. The US Department of Defense will have an FY2004 budget of around $379 billion. In spite of this the DOD increasingly looks for ways to more rapidly incorporate civilian technologies into military weapons systems. NASA, with a budget of only $15 billion (little of which goes to the development of new space launch technology), is even more in the position of user of the best new private sector technologies (and then only when it gets around to designing something new).

NASA has been locked for years into supporting the continued use of old technologies to produce sentimentally appealing human space missions in the short term. Whether the fault for this lies with NASA or Congress or Presidents or the American people is really beside the point. Because of NASA's continued inability to focus on long term technological development, the technological advances that will some day enable the economic development and colonization of space will not come from NASA funding.

It makes sense for NASA to abandon the Space Shuttle and ISS in order to focus on new technology development. But my own prediction is that the only way that is going to happen in the short term is if the loss of the Columbia is found to have been due to a design flaw in the Space Shuttle that can't easily be fixed. NASA and Congress are too committed to the Space Shuttle and ISS. The film clips the Shuttle missions create are seen as glorious by too much of the public. Political leaders are not at all eager to educate the public to see the Shuttle and ISS as big mistakes (after all, who made those mistakes?). Nor are they going to tell the public that the deaths of the astronauts who die on Shuttle flights do not contribute to the advance of our ability to move out into space (even though that is obviously the case). In the face of the widespread belief in myths about what our current human space flight program accomplishes it seems unlikely that NASA will be ordered to abandon the Shuttle. It seems even more unlikely that NASA will instead be assigned, as its top priority, the development of new space-enabling technologies.

Given that NASA is unlikely to become more effective and that other national space programs are less well-funded and even less ambitious, where does that leave the future of manned space travel? We need to make very large technological strides in order to get out of our current rut of high costs and low safety and reliability for human space launch. But until future Shuttle losses eventually end the Space Shuttle program by attrition, NASA is not going to put much effort into radical technological advances. Even when NASA gets around to developing a new type of shuttle it will do so in such a hurry to meet an immediate need (yet another Shuttle loss being the most likely proximate cause) that the new design will just incorporate the best technologies available at that point. Therefore NASA will not try out many experimental design concepts as prototypes and will instead opt to pursue a fairly conservative design utilizing existing knowledge.

Luckily there is a silver lining in this pessimistic story. The overall rate of scientific and technological advance is accelerating. While the Moore's Law rate of increase in processor speeds may slow down, advances in computer microprocessors (eventually using quantum computing or biomolecular computing) will still produce computers that are orders of magnitude faster within the next few decades. Also, fiber optics and mass storage will continue their own rapid rates of advance. All of these technologies, along with advances in mathematical algorithms for simulating designs and physical phenomena, will combine to provide better computer aided design and engineering tools. Therefore future spacecraft development efforts will be able to produce much more optimized designs.

General physics, chemistry, and biology continue to advance. Advances in materials science and nanotechnology will provide many new materials and fabrication techniques for use in space launcher design. New types of structural and sensor materials will enable the implementation of spacecraft whose performance greatly exceeds that of the best spacecraft that could be built today. Computer advances, combined with sensor advances that make new kinds of control systems possible, will enable the creation of designs that would otherwise not be possible.

The development of a significant human presence in space could in theory be accelerated by a focused attempt to develop enabling technologies specific to spaceflight. Even before the advent of computers with enough throughput to simulate the performance of advanced supersonic ramjets and other advanced designs, it would be possible to build many prototype concepts and to try many candidate materials in experimental spacecraft. Such an effort, while risky, might produce a much better design. But the political environment argues strongly against the pursuit of such a high-risk high-payoff approach. Instead, advances in space launch technology will have to await the creation of a large range of enabling technologies which will originally be developed for other purposes.

Space enthusiasts who do not like this prognosis do have one option: promote arguments to the general public and to opinion leaders about the benefits of pursuing a more radical path for the development of space technologies. A reasonable component of such an argument would be to advocate the split of NASA to put its scientific space studies work (i.e. studying planets, asteroids, stars and all other stuff up there) into an agency dedicated to that purpose. Then another agency should be dedicated to the development of science and prototype technologies focused on lower cost launchers and human space travel.

By Randall Parker 2003 February 07 03:44 PM  Space Exploration
Entry Permalink | Comments(8)
2003 February 05 Wednesday
Aircraft and Space Shuttle Accident Rates

We have now lost both the Challenger and the Columbia. That's 40% of the Shuttle fleet. It's time to seriously reexamine the US space program. Should the Shuttle continue to be operated? Should a new kind of shuttle be designed? What should be the criteria used to answer these questions? The debate about the future of the US Space Shuttle should be a debate about how we can make space travel much safer, more reliable, and lower in cost. These are interrelated goals. Unreliable launchers and passenger carrier spacecraft are more likely to be lost. Loss of a launcher is both fatal for the crew and incredibly costly. Higher reliability equipment is safer and less costly to maintain.

Let's compare aircraft safety to Space Shuttle safety.

The 1995 fatal accident rate per million miles flown for these large scheduled airlines declined to 0.0004 from 0.0008 the year before. Based on 100,000 departures, the fatal rate was 0.024, down from 0.050 in 1994.

Scheduled commuter or regional airline fatalities dropped to 9 persons from 25 in 1994 for the lowest level since 1990. The fatal accident rate fell both in terms of million miles flown to 0.003 from 0.005 in 1994, and from 0.083 to 0.057 in terms of 100,000 departures. It was the fourth consecutive annual decline in the fatal accident rates.

In 1995 the fatal accident rate per 100,000 departures for the airlines flying the smaller aircraft was 0.057; impressively, the rate for the big jets operated by the majors was less than half that. Assuming these figures count fatal accidents rather than how many people died in each accident (anyone know?), we can compare them to the Space Shuttle record of 2 fatal accidents in 113 flights. That works out to about 1,770 fatal accidents per 100,000 flights. Dividing 1,770 for the Shuttle by 0.057 for smaller commercial aircraft works out to an accident rate that is over 31,000 times greater for the Shuttle than for small craft commercial aviation. When compared to large craft commercial aviation, again using 1995 as a comparison point (note that there is fluctuation from year to year because there can be clusters of accidents in a year just by chance, but the trend in commercial aviation is toward ever lower accident rates), the Shuttle is over 73,000 times more dangerous.
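The arithmetic behind those ratios is simple enough to reproduce. Here is the calculation in Python, using the 1995 per-100,000-departure figures cited above:

    # Fatal accidents per 100,000 departures.
    shuttle_rate = 2.0 / 113 * 100000     # two losses in 113 flights -> ~1770
    commuter_rate = 0.057                 # 1995, smaller scheduled aircraft
    major_rate = 0.024                    # 1995, large scheduled airlines

    print("Shuttle: %.0f fatal accidents per 100,000 flights" % shuttle_rate)
    print("vs commuter aviation: %.0fx worse" % (shuttle_rate / commuter_rate))
    print("vs major airlines: %.0fx worse" % (shuttle_rate / major_rate))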

James Dunnigan has a table showing failure rates of launchers that have been used more than 100 times each. The US Space Shuttle has a lower failure rate than the other launchers. The failure rates range from 5% for the Russian R-7 Soyuz and European Ariane 1-4 to 14% for the US Atlas. In his article Dunnigan argues that the International Space Station (ISS) is the major justification for the US Space Shuttle. But before we get to that let's think about what these failure rates mean.

If the best space launch vehicle in existence has a failure rate of 2% and the rest are worse, this argues that achieving an acceptable level of spacecraft passenger safety can not be done by making small incremental improvements to current launch vehicle technology. One option is to design passenger carrying spacecraft to integrate with the launchers in a way that allows the passengers to survive launch failures. Such a technology was built into Apollo (the Apollo Escape Tower for pulling the CM away from a failing rocket). Another, and not mutually exclusive, approach is to develop a launcher technology that is inherently more reliable than current technology. We can't expect to implement either of these approaches with the Space Shuttle. Technologies that would offer greater than order-of-magnitude improvements in safety and reliability can not be retrofitted into existing designs. Limitations inherent in the original design of the Space Shuttle make it totally inappropriate as a target for attempts to make big strides in the reliability and safety of human space travel. We need to start over from scratch.
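To see why a 2% per-flight failure rate is nowhere near passenger-grade safety, consider the cumulative odds over a modest flight program. This simple sketch assumes each flight fails independently with the same probability, which real launch statistics only roughly satisfy:

    # Probability of at least one launch failure over n flights,
    # assuming independent failures with per-flight probability p.
    def prob_of_a_loss(p, n):
        return 1 - (1 - p) ** n

    for p in (0.02, 0.05, 0.14):   # best current launcher ... worst (US Atlas)
        print("p = %.0f%%: %.1f%% chance of at least one loss in 100 flights"
              % (p * 100, prob_of_a_loss(p, 100) * 100))

With a 2% failure rate there is roughly an 87% chance of losing a vehicle somewhere in a 100-flight program.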

In 1950 there were 2,482 thousand aircraft departures, 19,102,905 passengers carried, and 6 fatal accidents. In 1997 there were 8,157 thousand departures, 598,895,000 passengers carried, and 3 fatal accidents. Fatal accidents per million aircraft miles flown dropped from 0.0126 to 0.0005. The number of fatal accidents per million miles flown was about 25 times greater in 1950 than in 1997. This is the standard against which spacecraft should be compared. The Space Shuttle is at least 3 orders of magnitude more dangerous than passenger aircraft from 1950. Could the aircraft in the fleet of 1950 have been continually modified to make them as safe as a passenger aircraft manufactured 20 or 30 years later? Of course not. Better design and fabrication techniques produced later designs that were inherently more reliable.
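The same kind of arithmetic, again just rearranging the figures quoted above, puts the Shuttle's per-departure fatal accident rate thousands of times above even the 1950 airline fleet:

    # 1950 airline fatal accident rate per 100,000 departures vs. the Shuttle.
    rate_1950 = 6.0 / 2482000 * 100000    # ~0.24 fatal accidents per 100,000 departures
    shuttle_rate = 2.0 / 113 * 100000     # ~1770

    print("1950 airlines: %.2f fatal accidents per 100,000 departures" % rate_1950)
    print("Shuttle is roughly %.0fx worse than the 1950 fleet" % (shuttle_rate / rate_1950))
    print("1950 vs 1997, per million miles: %.0fx improvement" % (0.0126 / 0.0005))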

But let's go back even further to look at aircraft safety in 1938. That's when the Civil Aeronautics Authority was created and started tracking aircraft safety. It is not clear from the table what kind of fatal accident rate measure they were using. But compare the 1938 rate of 11.9 to the 1950 rate of 5.0 and the year 2000 rate of 1.1. The 1938 rate of fatal accidents was about an order of magnitude higher than it is now. But it's still more than two orders of magnitude lower than the fatal accident rate of the Space Shuttle.

1938 was 35 years after the first aircraft flight of Orville and Wilbur Wright on December 17, 1903, at Kitty Hawk, North Carolina. Manned space travel began on April 12, 1961 when a Soviet air force pilot, Major Yuri A. Gagarin, made an orbit of the Earth. So manned space travel is over 40 years old. Space travel into Earth orbit is orders of magnitude more dangerous after 40 years than aircraft travel was when it was only 35 years old.

Aside: If anyone has aircraft safety data that goes back to the era of biplanes in WWI (leaving aside casualties from war) then please pass it along. It seems quite possible that aircraft travel was never as dangerous as spacecraft travel is now, or was only that dangerous for a relatively short period of time.

Is the safety of spacecraft travel going to improve? Don't look to NASA Space Shuttle contractor Boeing for leadership in spacecraft safety improvements.

"I expect the shuttle will fly another 20 years," said Rick Stephens, vice president and general manager of Boeing's Homeland Security and Services and Integrated Defense Systems.

Imagine another 20 years of space travel that is 4 orders of magnitude more dangerous than air travel. Boeing would be happy to keep getting paid to maintain the dangerous Space Shuttle for that long. Any reason to develop a more technologically advanced, cheaper, more reliable, and safer alternative? Why do that as long as the current dangerous, unreliable, obsolete system is generating a large revenue stream?

But Stephens, who has headed up operations in Boeing's Space and Communications Services and Reusable Space Systems, said he did not think the tragedy would speed up the search for alternatives to the shuttles.

How about at least saying that the latest tragedy should be a wake-up call that we should start working on a better design? The Shuttle should be treated as a means to an end rather than a glorious end in itself. That end should be a continuing improvement in the ability to move humans into space. The Space Shuttle is irrelevant to that goal. An extremely dangerous, unreliable, expensive (all by the standards of the commercial aircraft industry of 65 years ago) launch system built with early 1970s technology that costs $500 million per trip is going to ensure that human presence in space remains a rarity.

Why have the Space Shuttle? What do we need it for? NASA says we need it for the International Space Station. But the International Space Station has been so scaled back in capabilities that it can do very little science. Without the ISS to give the Shuttle a purpose is the Shuttle worth operating?

The Shuttle has been used for upgrades to Hubble. But absent the Shuttle as a repair device a replacement for the Hubble could have been built. That would have cost more than sending up the Shuttle to do a repair. But at $500 million per Shuttle mission a repair flight is not cheap either. The money saved by occasionally doing Hubble repairs and upgrades is hardly justification for keeping the Shuttle around. If no money was spent on the Shuttle at all then the amount saved could pay for many Hubbles.

There is also the age of the Shuttles from the standpoint of on-going maintenance. The fleet is way older than its designers expected it to get. Parts are hard to find.

The fleet - 22 years old - has now been flying for twice as long as its builders first envisioned. Some parts were made so long ago that they are no longer available. Shuttle engineers have had to turn to Internet auction site eBay for desperately needed hardware and electronics.

Where should we go from here? Science fiction writer Jerry Pournelle has a discussion going about Single Stage To Orbit (SSTO) vehicles. Jerry's message is pretty simple. Instead of operating old technology we should build lots of experimental designs to test out various concepts and see what works, what doesn't, and why.

Two stages to orbit, or one stage and a flyable zero which may well be a ring of jet engines, is another possibility: again the operations penalties are not insignificant. The operational penalties are not small: imagine if every time you wanted to fly across the Atlantic, you had to have a second airplane that did nothing but get your plane aloft. It may be required, but it's not desirable.

So: let me sum it up. We need to build more rocket ships. We need to fly more rocket ships. We need better data. These were conclusions we sent to the President in 1983, and repeated to a different President in 1989. They haven't changed. We need X programs. Real ones, not corporate welfare programs like the "X"-33.

What is more important? Is it more important to use expensive old technology so that we can have humans in space in the short to medium term? Or is it more important to experiment and try out lots of engineering approaches to spacecraft design? Do we want to innovate? Do we want to advance? For the amount of money that is going into keeping the old tech Space Shuttle going and into building an incredibly expensive low scientific value space station we could be designing and trying out many innovative spacecraft designs.

Our old space launch technology is woefully inadequate. A comparison with aircraft technology makes clear just how unsafe, unreliable, and costly our space launch technology really is. Lots of incremental improvements to an old design will not get us very far. If we want to go into space in a serious manner then we need to admit our mistake in funding the old technology for as long as we have. It's time to move on. It's time to let go of the past. Start concentrating on finding the technologies we need for the future.

Update: Jay Manifold has a bit more info. Also, an earlier post of his points out the low science value of both the Space Shuttle and the ISS. This is something I intend to explore at greater length in future posts. Look at the NSF budget and what it buys in the way of scientific advance. The Jerry Pournelle link above has a quote that the NSF budget at 5 billion dollars per year is half what the Shuttle plus ISS are going to cost per year to operate (supposedly $10 billion - need to search out some details on this). The ISS is going to have one guy on it doing science part time. The amount of science that could be accomplished if that money was spent in other ways (e.g. just give it to the NSF and thereby triple the NSF's budget) could be enormous.

The ISS is not for science. Its purpose is to make people feel good that there are people up there working in space. It accomplishes very little beyond that except for giving aerospace companies multi-billion dollar contracts that stretch out for decades. If we want to do science in space then the money would be far better spent on unmanned probes and satellites. If we want to do human exploration of space then the money would be far better spent on developing next generation launch vehicles and also nuclear thermal propulsion systems. NASA's manned space budget of billions of dollars per year is yielding precious little in science and precious little in technological advances that could lower the cost and increase the safety of human space travel.

If it hasn't yet become clear that I think that the International Space Station and Space Shuttle are monumentally stupid wastes of money then let me make it clear: the International Space Station and Space Shuttle are monumentally stupid wastes of money.

By Randall Parker 2003 February 05 11:16 PM  Space Exploration
Entry Permalink | Comments(11)
Girl Shortage Causes Wife Buying In India

The use of amniocentesis and ultrasound to guide sex selective abortions is leading to a shortage of females in India.

The villages are full of frustrated bachelors. In Haryana, a quarter of the female population has simply disappeared.

Many now see buying wives from outside as their only option.

A study of sex ratio trends in India from 1981 to 1991 predicted the sale of women.

The adverse sex ratio has not increased the value of women by decreasing the supply. India’s population sex ratio worsened from 972 females per 1000 males in 1901 to 929 per 1000 in 1991. At the same time, women's status steadily eroded despite gains in some sectors by some groups. A ‘shortage’ of women does not lead to their increased value, but to greater restrictions and control placed over them. In China, practices such as kidnapping and sale of women, organized import of wives from other countries, etc., have been noted as a result of the shortage of women there. The same might be predicted for India.

Still, one reason many families abort the females is to avoid expensive dowries.

On the other hand, there is a great deal of public support in India from pro-sex selective abortion advocates who feel that these tools are helping families to cope with intransigent problems, especially dowry. Health clinics, buoyed by record profits, are aggressively selling their wares. One clever economic pitch blares from tens of thousands of billboards through the country--"Pay five hundred rupees [US$14.00] now rather than five lakhs [Rs500,000 or $14,000] later." Poor families, fearing expensive dowries that can cripple a family, willingly undergo the tests.

Shouldn't the female shortage cause an end to the dowry practice or shift it in the opposite direction? Will the market apply corrective forces? Laws against sex selection are not working.

In the midst of such strong public support of these tests, criminalization has not noticeably reduced their use. Even with the passage of the Prenatal Diagnostic Techniques (Regulation and Prevention of Misuse) Act of 1988 in Maharashtra, and similar acts in Haryana, Punjab and Gujarat, sex determination practices could not be stemmed.

This article makes a claim that is a surprise to me: dowry only dates back to the 19th century.

However, in approximately the 19th century, the loving practice of stri dana was joined by the very much different concept of dowry. Dowry became first an expected, then a demanded, offering given by the bride's family to the groom's family at the time of marriage.

Dowry is far from the only factor leading to sex selection in favor of males. The high rate of sex selection in China demonstrates that sex selection happens in societies which do not have the dowry custom.

"The male sex preference in China is clearly established with 118 or 119 male birth for 100 female births," said Caroline Hoy, a demographer from Dundee University. She said a survey of migrant workers carried out in Beijing in 1994 put the skew higher, at an average 139 male births for 100 female births.

In rural areas of China the status of women is so low that their existence is under-reported. But the ratio of males to females is still very high.

However population statistics show that the sex ratio in China is unique among all the large countries of the world. Artificial sex selection is the cause of this. In recent years, the use of ultrasound equipment, the traditional Chinese method of feeling pulse, folk remedies, and the crippling of girl babies occur frequently in some areas. These practices have aggravated the imbalance in the Chinese sex ratio. In some areas the male : female sex ratio has reached 130:100.

An in-depth study in a small area confirmed the widespread use of prenatal sex selection and abortion.

Chu designed a questionnaire to assess the prevalence of prenatal sex selection and abortion. The survey started in one village and spread to over 100 villages in five townships of the county. In-depth interviews took place in six villages in three townships of the county.

The 820 women surveyed reported 301 induced abortions. Of these, 36 percent (109) were acknowledged to be female sex-selective abortions. The sex ratio of births reported by the women in the survey was much higher than the biological norm of about 105-107 males per 100 females. The sex ratio of children ever born was 125.9 and of living children it was 126.1. "Prenatal sex selection was probably the primary cause, if not the sole cause, for the continuous rise of the sex ratio at birth in the study area in the past decade," Chu maintains.

According to Chu, female infant abandonment or infanticide is extremely rare in the county she studied. "Rural families believe in fate: if they do something horrible, they will be punished by unseen forces. Besides, it is easy to arrange for the adoption of unwanted girls," she says.
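To put those sex ratios in perspective, here is a toy calculation of what fraction of expected girls are missing at various birth ratios. It assumes a natural ratio of about 105 boys per 100 girls and that male births are unaffected, which is a simplification:

    # Rough share of "missing" girls implied by a skewed sex ratio at birth.
    def missing_girls_fraction(observed_ratio, natural_ratio=105.0):
        # 1 - (girls per boy actually born) / (girls per boy expected)
        return 1 - natural_ratio / observed_ratio

    for ratio in (118, 126, 139):   # China overall, Chu's survey area, Beijing migrant survey
        print("%d boys per 100 girls -> about %.0f%% of expected girls missing"
              % (ratio, missing_girls_fraction(ratio) * 100))

At the ratio of about 126 reported in Chu's survey area, roughly one in six girls who would otherwise have been born is missing from each birth cohort.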

Think about the implications of these reports. Tens or hundreds of millions of people are willing to abort fetuses in order to choose a characteristic in their offspring. Is there any doubt that once genetic engineering of embryos becomes possible that it will pass into widespread use?

Francis Fukuyama claims the government of South Korea successfully stopped the practice of sex selection.

Governments can intervene successfully to correct individual choices like these. The severe sex ratio imbalance in Korea that emerged in the early 1990s was noticed, and the government took measures to enforce existing laws against sex selection so that today the ratio is much closer to 50-50. If the government of a young democracy like Korea can do this, I don't see why we can't.

This report from 1999 throws doubt on Fukuyama's claim.

Hunger for girl tots is anomalous on the Asian continent, where abortions of XX-chromosome fetuses are widespread. Significant gender imbalance has resulted in many nations: China has 118 boys per 100 girls under age 5, Korea has 117 to 100, and Taiwan is 110 to 100.

How is sex selection done in South Korea in spite of its illegality? It's pretty simple. In any country that allows abortion, if ultrasound is available for legal purposes then there is an easy and unprovable way to do sex selection.

It is a possibility that a particular sex will be preferred and for this reason sex selection could result. An example of this is currently being seen in Korea. Korea currently predicts a 10% increase in the male to female ratio within the next 30 years. Sex selection is not legal in Korea, but doctors give unspoken results through the amount of enthusiasm they show the mothers. If the doctor lacks enthusiasm upon the test results, the mother often calls for an abortion, knowing that the fetus is female (Wolf, 1996).

It is currently legal to use sex selection techniques in the United States, though the practice hasn't yet become widespread. The MicroSort technique of pre-fertilization sex selection removes the need for abortion, but it costs a few thousand dollars and, while it pushes the odds heavily toward one sex or the other, its success rate is less than perfect.

Here's how Fortune magazine recently summed up at least the potential market for MicroSort alone: "Each year, some 3.9 million babies are born in the U.S. In surveys, a consistent 25 percent to 35 percent of parents and prospective parents say they would use sex selection if it were available. If just two percent of the 25 percent were to use MicroSort, that's 20,000 customers . . . [and] a $200-million-a-year business in the U.S. alone."
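Fortune's back-of-envelope numbers are easy to reproduce. Only the figures in the quote are used here; the implied revenue per customer is my own inference from them:

    # Reproducing Fortune's market estimate from the figures quoted above.
    us_births = 3900000
    would_use = 0.25      # low end of the 25 to 35 percent survey figure
    adoption = 0.02       # "just two percent of the 25 percent"

    customers = us_births * would_use * adoption
    print("Customers per year: %.0f" % customers)                       # ~19,500
    print("Implied revenue per customer: $%.0f" % (200e6 / customers))  # ~$10,000

That implies roughly $10,000 per customer, which fits a technique that costs a few thousand dollars per attempt and may take more than one attempt.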

Curiously, there have been historical circumstances in which females have had survival advantages.

Eckart Voland of Göttingen University has studied church records of births, deaths and marriages from Ostfriesland, along the northern coastal region of Germany, as these records also show interesting sex differences in survival. The analysis of records from the eighteenth and nineteenth centuries turned up a couple of hundred cases where one spouse died, leaving a number of living children. In this monogamous, Christian, agricultural society, if a young wife lost her husband she almost invariably became very poor. Any sons that she had were unlikely to compete adequately with boys from richer families, but her daughters, like the daughters of low ranking hinds in Clutton-Brock's studies, always had some chance of marrying up the social ladder. An investment in daughters in this particular social situation had an adaptive biological significance and, as evolutionary biology would predict, the sons of widows were 36 per cent more likely to die in infancy and childhood than the daughters. When men lost their wives, however, their economic status did not change and they often had the opportunity to remarry; the sons of widowers were no more likely to die young than their daughters.

There is one modern society where female children are preferred: Japan.

ISEHARA, Japan - In a surprising repudiation of the traditional Asian values that for centuries have put a premium on producing male heirs, surveys show that up to 75 percent of young Japanese parents now prefer baby girls. Daughters are seen as cuter, easier to handle, more emotionally accessible and, ever more important in this fast-aging society, more likely to look after their elderly parents.

By Randall Parker 2003 February 05 11:31 AM  Biotech Society
Entry Permalink | Comments(23)
RNA Interference Speeds Discovery Of Purposes Of Genes

RNA-mediated interference (RNAi) is being used as a technique to more easily turn genes off in order to discover their purposes. Caenorhabditis elegans (or C. elegans) is a perfect organism to use for RNAi experiments.

A quirk of the physiology of C. elegans means that such gene inactivation can occur simply if the RNAi molecule is eaten by the worm. And luckily for the researchers, the preferred diet of this little worm is the bug that for decades has been used in thousands of lab experiments - the bacterium E coli. Simply inserting the RNAi sequences into E coli and allowing the worms to feed resulted in the chosen gene being knocked out.

The technique is remarkably fast. "It used to take a year to knock out a gene, now with RNAi one person can knock-out every gene in just a few months," says Ahringer.

To support this work the scientists had to develop a way to grow bacterial strains, each able to make a different RNAi aimed at knocking out a different target gene.

"The worms eat the bacteria ... silencing the gene in the worm and her progeny," Julie Ahringer, of the Wellcome Trust/Cancer Research UK Institute of Cancer and Developmental Biology at the University of Cambridge in England, told UPI. "We optimized this ... technique and then worked out methods to efficiently engineer the large number of bacterial strains needed (one for each gene)."

Since it is so easy to deliver RNAi molecules into C. elegans, it's being used for experiments to rapidly discover what many genes do. This has sped up experiments that rely on knocking out specific genes by orders of magnitude. Recently the use of RNA interference led to the discovery of 400 genes in the C. elegans worm that affect fat storage.

Scientists at Massachusetts General Hospital (MGH) and their colleagues have scoured thousands of genes in the C. elegans worm and have come up with hundreds of promising candidates that may determine how fat is stored and used in a variety of animals. The findings, published in the Jan. 16 issue of Nature, represent the first survey of an entire genome for all genes that regulate fat storage.

The research team led by Gary Ruvkun, PhD, of the MGH Department of Molecular Biology, and postdoctoral fellow Kaveh Ashrafi, PhD, identified about 400 genes encompassing a wide range of biochemical activities that control fat storage. These studies were conducted using the tiny roundworm Caenorhabditis elegans, an organism that shares many genes with humans and has helped researchers gain insights into diseases as diverse as cancer, diabetes, and Alzheimer's disease.

Many of the fat regulatory genes identified in this study have counterparts in humans and other mammals. "This study is a major step in pinpointing fat regulators in the human genome," says Ruvkun, who is a professor of Genetics at Harvard Medical School. "Of the estimated 30,000 human genes, our study highlights about 100 genes as likely to play key roles in regulation of fat levels," he continued. Most of these human genes had not previously been predicted to regulate fat storage. This prediction will be tested as obese people are surveyed for mutations in the genes highlighted by this systematic study of fat in worms.

In addition, this study points to new potential therapies for obesity. Inactivation of about 300 worm genes causes worms to store much less fat than normal. Several of the human counterparts of these genes encode proteins that are attractive for the development of drugs. Thus, the researchers suggest that some of the genes identified could point the way for designing drugs to treat obesity and its associated diseases such as diabetes.

Of the 400 genes which RNAi-based screening identified as affecting fat metabolism, about half have known human counterparts.

To discover this treasure trove of fat regulators, the researchers inactivated genes one at a time and looked for increased or decreased fat content in the worms. Through this time-consuming process, they identified about 300 worm genes that, when inactivated, cause reduced body fat and about 100 genes that cause increased fat storage when turned off. The identified genes were very diverse and included both the expected genes involved in fat and cholesterol metabolism as well as new candidates, some that are expected to function in the central nervous system.

About 200 of the 400 fat regulatory worm genes have counterparts in the human genome. "A number of these worm genes are related to mammalian genes that had already been shown to be important in body weight regulation. But more importantly, we identified many new worm fat regulatory genes, and we believe that their human counterparts will play key roles in human fat regulation as well," says lead author Ashrafi. "The work was done in worms because you can study genetics faster in worms than in other animal models, such as mice," says Ashrafi. "The model is a great tool for discovering genes."

RNAi allowed the relevant genes to be identified out of a much larger set of genes.

The work was dependent on the use of an RNA-mediated interference (RNAi) library constructed by the MGH team's collaborators at the Wellcome/Cancer Research Institute in England. The library consists of individual genetic components that each disrupt the expression of one particular gene. With this tool, the researchers were able to systematically screen almost 17,000 worm genes for their potential roles in fat storage.
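The logic of such a screen is simple even if the lab work is not: knock down one gene per strain, score a phenotype, and keep the outliers. Here is a toy sketch of that bookkeeping, with invented gene names and made-up fat measurements standing in for real assay data:

    # Toy sketch of a genome-wide RNAi screen.  All data here is simulated.
    import random

    random.seed(0)
    genome = ["gene_%05d" % i for i in range(17000)]   # ~17,000 worm genes screened
    control_fat = 1.0                                  # normalized wild-type fat level

    def fat_after_knockdown(gene):
        # Stand-in for feeding worms the RNAi bacterial strain for this gene
        # and scoring fat storage with a dye; real measurements replace this.
        return random.gauss(1.0, 0.15)

    fat_down, fat_up = [], []
    for gene in genome:
        fat = fat_after_knockdown(gene)
        if fat < 0.6 * control_fat:
            fat_down.append(gene)     # knockdown reduces fat storage
        elif fat > 1.4 * control_fat:
            fat_up.append(gene)       # knockdown increases fat storage

    print("%d fat-reducing hits, %d fat-increasing hits" % (len(fat_down), len(fat_up)))

In the real screen the hits were the roughly 300 genes whose knockdown reduced fat and the roughly 100 whose knockdown increased it.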

Now that bacterial strains have been created to make each type of RNAi for C. elegans, many other effects of genes can be examined. The original researchers have already used this technique to look at genes that affect longevity.

In another paper, Dr. Ruvkun and Dr. Ahringer have used the RNA method to screen the worm's genome for genes that increase longevity. With two of the six chromosomes tested, they have found that genes in the mitochondria, the energy-producing structure, are particularly important in determining life span.

This result demonstrates how RNAi can support massive, rapid screening of a large number of genes in order to identify a relevant subset for a particular purpose. It is not the only recent result of this nature.

RNAi is being used to control the expression of the gene for p53, which encodes a protein crucial for regulating cell proliferation. Mutations in areas of the genome that control p53 expression are known to be crucial in the development of some types of cancer.

The study showed that establishing different levels of p53 in B-cells by RNAi produces distinct forms of lymphoma. Similar to lymphomas that form in the absence of p53, lymphomas that formed in mice with low p53 levels developed rapidly (reaching terminal stage after 66 days, on average), infiltrated lung, liver, and spleen tissues, and showed little apoptosis or "programmed cell death."

In contrast, lymphomas that formed in mice with intermediate p53 levels developed less rapidly (reaching terminal stage after 95 days, on average), did not infiltrate lung, liver, or spleen tissues, and showed high levels of apoptosis. In mice with high B-cell p53 levels, lymphomas did not develop at an accelerated rate, and these mice did not experience decreased survival rates compared to control mice.

The study illustrates the ease with which RNAi "gene knockdowns" can be used to create a full range of mild to severe phenotypes (something that geneticists dream about), as well as the potential of RNAi in developing stem cell-based and other therapeutic strategies.

Along with a recent study by Hannon and his colleagues that demonstrated germline transmission of RNAi, the current study establishes RNAi as a convenient alternative to traditional, laborious, and less flexible homologous recombination-based gene knockout strategies for studying the effects of reduced gene expression in a wide variety of settings.

RNA interference will be used in a newly announced effort to examine 10,000 genes for roles relevant to understanding the causes of cancer.

This has been made possible by the discovery of a process called RNA interference which is used by the body to switch off individual genes while leaving all others unaffected.

The charity Cancer Research UK and the Netherlands Cancer Institute plan to join forces to exploit this knowledge to inactivate almost 10,000 genes one at a time in order to find out precisely what they do - and how they might contribute to cancer's development.

RNAi is being used as a tool to study the effects of shutting down a gene that causes soybean allergies.

Last September, for example, Anthony J. Kinney, a crop genetics researcher at DuPont Experimental Station in Wilmington, Del., and his colleagues reported using a technique called RNA interference (RNAi) to silence the genes that encode p34, a protein responsible for causing 65 percent of all soybean allergies. RNAi exploits the mechanism that cells use to protect themselves against foreign genetic material; it causes a cell to destroy RNA transcribed from a given gene, effectively turning off the gene.

Here is how the double-stranded RNA gets used by the cell to turn off genes.

When double-strand RNA is detected, an enzyme called dicer, discovered at the Cold Spring Harbor Laboratory on Long Island, chops the double-strand RNA into shorter pieces of about 21 to 23 bases. The pieces are known as small interfering RNAs or siRNAs. Each short segment attracts a phalanx of enzymes.

Together, they seek out messenger RNA that corresponds to the small RNA and destroy it. In plants and roundworms, the double-strand RNA can spread through the organism like a microscopic Paul Revere.

The cell's reaction to double-stranded RNA in this manner may have evolved as a defense mechanism against double-stranded RNA virus invaders.
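A toy model makes the sequence dependence of this mechanism easy to picture. The sequences below are invented, and real siRNA targeting involves much more than exact string matching, so treat this purely as an illustration:

    # Toy model of RNA interference: chop a double-stranded RNA trigger into
    # ~21-base siRNAs, then flag any messenger RNA containing a matching stretch.
    def dice(dsrna, size=21):
        # cut the trigger into overlapping siRNA-sized pieces
        return set(dsrna[i:i + size] for i in range(len(dsrna) - size + 1))

    def silenced(mrna, sirnas):
        # an mRNA is marked for destruction if it contains a matching stretch
        return any(s in mrna for s in sirnas)

    trigger = "AUGGCUACGUUAGCCGAUAACGUUCGA"         # dsRNA matching the target gene
    target_mrna = "GGG" + trigger + "UUUAAACCC"     # transcript of the targeted gene
    other_mrna = "AUGCCCGGGAAAUUUCCCGGGAAAUUUCCC"   # unrelated transcript

    sirnas = dice(trigger)
    print("target degraded:", silenced(target_mrna, sirnas))   # True
    print("other degraded: ", silenced(other_mrna, sirnas))    # False

The point is the specificity: only messages carrying a stretch that matches one of the short pieces get destroyed, which is why a given double-stranded RNA silences one gene and leaves the rest alone.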

This page has links to some published papers that involve working with RNAi.

RNA interference (RNAi) is the process where the introduction of double stranded RNA into a cell inhibits gene expression in a sequence dependent fashion. RNAi is seen in a number of organisms such as Drosophila, nematodes, fungi and plants, and is believed to be involved in anti-viral defence, modulation of transposon activity, and regulation of gene expression.

By Randall Parker 2003 February 05 12:59 AM  Biotech Advance Rates
Entry Permalink | Comments(3)
2003 February 02 Sunday
What NASA Ought To Do

NASA should cancel the Space Shuttle. It is old technology. It is very expensive to operate. It has many safety and reliability problems that are inherent in its design. It is not the future. Its main advantage is that it lets NASA put people up into space now. It's a short-term photo op generator. It lets current generation astronauts go into space. But the Shuttle does not accelerate the human migration into space. By sucking money away from the development of newer enabling technologies the Shuttle slows the human movement into space. If the Shuttle had been cancelled after the first accident and if the money spent on it had been spent on new space technologies we'd be much farther along than we are now.

Along with a cancellation of the Shuttle NASA should mothball the space station. Send up a rocket to put the ISS into a higher mothball orbit where it won't decay and enter the Earth's atmosphere for years. Then go back to the basics of working on next generation space travel technology.

NASA has spent the last couple of decades using a launch technology that was a major compromise over initial shuttle design goals. The compromise was adopted because a more ambitious design was going to cost more than Congress would allocate. Instead of delaying or taking longer to develop a great shuttle NASA chose to develop a lousier shuttle. We've now spent two decades funding its higher operating costs and suffering the consequences of its less safe and less dependable design.

NASA should design and build a next generation shuttle. That next generation shuttle ought to be launchable from more than one existing rocket design. Uncouple the shuttle design from the rocket launch design. Also, it should be extremely safe. Instead of a cheaper tile design, an inherently tougher cast metal alloy or newer material should be pursued. The next generation shuttle should be an inherently safer design. It should be capable of saving the passengers even if the booster rocket launching it fails. It also should be capable of landing on and floating in the ocean if a rocket launcher fails.

NASA should work on a faster way to move between planets. Therefore NASA should develop nuclear thermal propulsion. A trip to Mars with chemical rocket technology is a bad idea because it would take too long, be too risky, and cost too much. Its long term effect would be similar to that of the Apollo program. The Apollo moon program was a stunt pursued in a way that did not lay firm technological foundations that would lower the longer term costs of going back to the moon repeatedly. So once the stunt had been done people lost interest in it and the money needed to keep using its high cost method of getting to the moon dried up. To repeat that same pattern with a Mars shot would be a similar waste of resources. Development of enabling technologies should be placed ahead of performing stunts.

The focus of NASA should shift away from generating short-term results and toward advancing our underlying technologies for going into and operating in space. NASA should not send people into orbit just to have people in orbit. NASA should not try to go to another planet just to be the first to get there. The desire to do manned expeditions should take a backseat to the need to develop technologies that make manned expeditions easier to do.

Update: Jim Miller links to some articles about why the Space Shuttle ought to be cancelled. Says Jim:

The shuttle is too large for people, too small for cargo, underpowered for many tasks, far too expensive, and too dangerous for routine use. The flaws are not fixable with minor design changes, since the basic system design is wrong.

In the face of these obvious truths about what is unfixably wrong with the Shuttle, the Shuttle program has been kept running for decades. It is time to stop being sentimental about the Space Shuttle just because it takes humans into space. It kills people. It's unreliable. It's extremely costly. It's doing precious little to advance space science and space technology. It takes money away from the development of approaches that could really advance our abilities to do things in space.

While Gregg Easterbrook gets some technical facts wrong he's right in arguing that the Shuttle has been kept alive by lobbying of aerospace companies and Congresscritters protecting jobs for their districts. We should not be fooled and let patriotic emotional appeals blind us to the economic and political interests that work to protect an economic and technological albatross.

Switching to unmanned rockets for payload launching and a small space plane for those rare times humans are really needed would cut costs, which is why aerospace contractors have lobbied against such reform. Boeing and Lockheed Martin split roughly half the shuttle business through an Orwellian-named consortium called the United Space Alliance. It's a source of significant profit for both companies; United Space Alliance employs 6,400 contractor personnel for shuttle launches alone. Many other aerospace contractors also benefit from the space-shuttle program.

Easterbrook is quite right when he argues that we should abandon the International Space Station and stop putting humans up into orbit while we develop better technologies for doing so. The ISS costs billions and produces precious little in the way of scientific advances. The $35 billion spent on it so far would have paid for a lot of nanotechnology research. Nanotech promises to reduce the costs of manufacturing space vehicles by orders of magnitude. We should stop pouring money down holes and instead work on making the advances that will make a future in space possible.

Update II: I've previously posted links to articles that claimed nuclear electric propulsion was suitable for space probes but that nuclear thermal propulsion would be better for human space travel. But Jay Manifold says nuclear electric is the way to go. Also see this previous post by Jay. Jay knows a lot more about this than I do and I take his word for it. In any case, some form of nuclear propulsion is what we need to develop for human travel between planets in the solar system. The development of nuclear propulsion technology is just one of the things that would become possible if the money now going for the Space Shuttle and ISS was rechanneled toward developing new technologies for space launch and space travel.

By Randall Parker 2003 February 02 08:39 PM  Space Exploration
Entry Permalink | Comments(5)
2003 February 01 Saturday
Weather Radar Track Of Columbia Shuttle Debris

Blogger John Moore has posted a link to a weather radar track of the shuttle debris.

I hope this tragic loss causes a reassessment of the shuttle program. It's a lousy old-tech design that was a poor choice to begin with. We need a radically newer human launch vehicle design that is inherently much safer and lower maintenance.

Here's a Google News news article cluster on the loss of the Columbia. Here are two more here and here.

Update: A friend points out that the Shuttle could have been damaged in orbit by collision with a small fragment of space debris. Space debris is a growing problem. Tethers that ride magnetic fields to slowly change orbits have been proposed as a way to clean up space debris. But one has to ask: Even if such a system were launched, can really small orbiting fragments be identified in the first place? Surely larger sized pieces can be tracked. But can the smallest fragment that can cause lethal damage to a shuttle be identified with radar or optical sensor systems? I'm guessing that the answer is no. Anyone know?

By Randall Parker 2003 February 01 10:44 AM  Space Exploration
Entry Permalink | Comments(1)