2007 December 30 Sunday
Pregnancy Surrogacy Outsourced To India

Picture groups of women in India living - and, by doing so, working - in a sort of baby factory.

ANAND, India - Every night in this quiet western Indian city, 15 pregnant women prepare for sleep in the spacious house they share, ascending the stairs in a procession of ballooned bellies, to bedrooms that become a landscape of soft hills.

A team of maids, cooks and doctors looks after the women, whose pregnancies would be unusual anywhere else but are common here.

While much of the surrogacy is for local women, some of it is for foreign women who can't carry a baby to term.

More than 50 women in this city are now pregnant with the children of couples from the United States, Taiwan, Britain and beyond.

They use eggs and sperm from the prospective parents, perform in vitro fertilization (IVF), and then implant the resulting embryo in an Indian woman who accepts payment for carrying the baby to term. The article states an Indian woman can earn the equivalent of 15 years' salary by carrying just one pregnancy.

This is all legal in India.

Decades ago Honeywell used to run TV ads about computers for controlling commercial building heating and cooling, and they'd end each ad with a guy saying "The future is today at Honeywell". Well, I increasingly feel that way about the whole world. There's an increasingly science-fiction quality to aspects of everyday life. We are getting far enough away from the limitations of our primitive past that the future of our imaginings doesn't seem as distant and unreachable as the future once seemed.

Fifty years ago science fiction writers could write all sorts of plot elements into a story secure in the knowledge that whatever they described would seem distantly futuristic. Now it seems harder to come up with ideas about the future that sit way off in the distant future. What is theoretically physically possible that is unlikely to happen in this century?

By Randall Parker 2007 December 30 10:49 PM  Bioethics Reproduction
Entry Permalink | Comments(23)
2007 December 29 Saturday
Most IVF Embryos Destroyed

Most embryos created during in vitro fertilization (IVF) are eventually discarded.

MORE than 1m embryos created for fertility treatment in British clinics have been destroyed over the past 14 years, government figures have shown.

The Department of Health data show that 2,137,924 embryos were created using IVF between 1991 and 2005, but about 1.2m were never used.

While political opposition to the creation of embryos to extract embryonic stem cells remains strong, the rate of destruction of embryos is already high without the use of embryos for that purpose. More embryos get created than used for a few reasons. First, women trying to start a pregnancy using IVF get extra embryos created because they don't know how many attempts will be needed or even how many fertilizations will produce viable embryos.

Also, the development of tests (genetic and otherwise) for checking on the health of embryos leads to the identification of unhealthy embryos before implantation. These tests are becoming more powerful, and as a result many embryos can be judged either unlikely to start a pregnancy or likely to result in birth defects.

The development of increasingly powerful genetic tests for pre-implantation genetic diagnosis (PIGD or PGD) of embryos will lead prospective parents to become much more selective in choosing embryos. As the significance of more genetic variations becomes known, people will have far more reasons to choose between different embryos. The trend is going to be toward the creation of far larger numbers of embryos so as to increase the odds that an embryo will be found which contains the best combination of genes from the two parents. Basically, people will throw the genetic dice more times in order to better their odds.
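
To see the dice-throwing arithmetic concretely, here is a toy probability sketch (illustrative numbers of my own, not from any study): if some fraction p of a couple's embryos carries a sought-after combination of genes, the chance that at least one of n embryos has it is 1 - (1 - p)^n, which climbs quickly with n.

```python
# Toy illustration: p is the assumed fraction of embryos carrying the
# desired gene combination; n is how many embryos the couple creates.
def odds_of_hit(n, p):
    # Probability that at least one of n embryos has the combination.
    return 1 - (1 - p) ** n

# Suppose 1 embryo in 20 (p = 0.05) has the desired combination.
for n in (3, 10, 30, 100):
    print(f"{n:3d} embryos -> {odds_of_hit(n, 0.05):.0%} chance of at least one")
# 3 -> 14%, 10 -> 40%, 30 -> 79%, 100 -> 99%
```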

The era of large scale excess embryo creation will last a few decades at most. That era will end with the development of nanotechnological tools that will provide the means to select each chromosome to put in an egg, sperm, or embryo. Rather than throw the dice many times we'll gain the ability to basically put down the genetic dice with the combination we desire.

By Randall Parker 2007 December 29 10:53 PM  Bioethics Reproduction
Entry Permalink | Comments(0)
2007 December 27 Thursday
Liver Fibrosis Stoppable In Mice

Lots of liver diseases kill by causing accumulation of scar tissue. Well, at least in mice the scarring process can be stopped and partially reversed with an inhibitor peptide.

University of California, San Diego researchers have proven in animal studies that fibrosis in the liver can be not only stopped, but reversed. Their discovery, to be published in PLoS Online on December 26, opens the door to treating and curing conditions that lead to excessive tissue scarring such as viral hepatitis, fatty liver disease, cirrhosis, pulmonary fibrosis, scleroderma and burns.

Six years ago, the UC San Diego School of Medicine research team discovered the cause of the excess fibrous tissue growth that leads to liver fibrosis and cirrhosis, and developed a way to block excess scar tissue in mice. At that time, the best hope seemed to be future development of a therapy that would prevent or stop damage in patients suffering from the excessive scarring related to liver or lung disease or severe burns.

In their current study, Martina Buck, Ph.D., assistant professor of medicine at UCSD and the Veterans Affairs San Diego Healthcare System, and Mario Chojkier, M.D., UCSD professor of medicine and liver specialist at the VA, show that by blocking a protein linked to overproduction of scar tissue, they can not only stop the progression of fibrosis in mice, but reverse some of the cell damage that already occurred.

We have been watching bioscience and biotechnological advances for many years. Isn't it about time this progress starts to translate into a whole bunch of disease cures? It is all well and good to watch the progress and marvel at the cleverness of the researchers who find ways to tease out the secrets of biological systems. But getting down to some curing treatments would be great. You might want to see cancer or heart disease cured first. But I'd be happy to see an end to death by liver cirrhosis as a starter.

Inhibition of a protein that activates growth of a type of cell involved in collagen production did the trick.

In response to liver injury – for example, cirrhosis caused by alcohol – hepatic stellate cell (HSC) activated by oxidative stress results in large amounts of collagen. Collagen is necessary to heal wounds, but excessive collagen causes scars in tissues. In this paper, the researchers showed that activation of a protein called RSK results in HSC activation and is critical for the progression of liver fibrosis. They theorized that the RSK pathway would be a potential therapeutic target, and developed an RSK inhibitory peptide to block activation of RSK.

The scientists used mice with severe liver fibrosis – similar to the condition in humans with cirrhosis of the liver – that was induced by chronic treatment with a liver toxin known to cause liver damage. The animals, which continued on the liver toxin, were given the RSK-inhibitory peptide. The peptide inhibited RSK activation, which stopped the HSC from proliferating. The peptide also directly activated the caspase or “executioner” protein, which killed the cells producing liver cirrhosis but not the normal cells.

“All control mice had severe liver fibrosis, while all mice that received the RSK-inhibitory peptide had minimal or no liver fibrosis,” said Buck.

Researchers probably had to spend many years teasing out the connection between the RSK protein, hepatic stellate cells, collagen production and scar tissue accumulation. But now they have something really powerful to show for it. Hurray.

But how many years will it take for a human treatment to make it to market?

By Randall Parker 2007 December 27 09:09 PM  Biotech Therapies
Entry Permalink | Comments(6)
Diesel Car Sales Expected To Grow On High Energy Costs

A series of Popular Mechanics articles reviews the fuel efficiency of diesels in very-high-fuel-cost Europe and looks at diesels headed for sale in American showrooms. Like other makers, Ford sells a lot of fuel efficient diesel cars in Europe.

FRANKFURT — Ford of Europe unveiled three new alt-fuel cars here, the first of which we’ll see is the Focus ECOnetic model in 2007. It combines the latest common-rail diesel powertrain and other engineering features to reduce CO2 emissions to the absolute minimum. Powered by a 109-hp 1.6-liter Duratorq common-rail turbodiesel engine with a diesel particulate filter, the ECOnetic is gunning for around 54 mpg.

We can see in the European car market how the United States could handle another doubling or tripling of oil costs. Smaller diesel cars would allow Americans to drive just as far to jobs and for fun. Granted, the bigger cars have advantages. But we don't have to abandon cars in order to double or triple our fuel efficiency. That we can afford to drive such big cars at today's gasoline prices means that we can still afford to get around (albeit in smaller and more efficient cars) once the world production of oil starts declining.
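
The arithmetic behind that claim is simple. Here's a sketch with illustrative prices (my numbers, not the article's): cost per mile is fuel price divided by fuel economy, so a doubled fuel price offset by doubled mpg leaves the cost of commuting unchanged.

```python
# Cost per mile is just price per gallon divided by miles per gallon.
def cost_per_mile(price_per_gallon, mpg):
    return price_per_gallon / mpg

print(cost_per_mile(3.00, 25))  # $0.120/mile: ~2007 US gasoline, typical sedan
print(cost_per_mile(6.00, 50))  # $0.120/mile: doubled fuel price, 50 mpg diesel
print(cost_per_mile(9.00, 65))  # ~$0.138/mile: tripled price, Polo-class diesel
```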

The Prius is getting unseated as a fuel efficiency leader. The 2009 Jetta diesel is expected to get a combined 50 mpg city/highway when it goes on sale in the spring of 2008.

The new era of clean diesel in America will officially be ushered in by the new VW Jetta TDi when it goes on sale in a few months. Powered by a 2.0-liter four-banger that produces 140 hp and 236 lb.-ft. of torque, it will be the first automobile to meet the world’s most stringent emission control standards, California’s Tier II, Bin 5.

Those California emissions standards are one reason why we see fewer diesels in America than in Europe. The Europeans lag on auto emissions standards, and so the European car makers find it easier to create diesels that can pass European emissions regulations. The lowering of sulfur in US diesel fuel (to meet a US Environmental Protection Agency regulation) has made it possible to design diesel exhaust systems that can meet the tougher US emissions standards.

A Popular Mechanics writer was more impressed by the Europe-only VW Polo which gets 60 to 70 mpg.

It’s also not the car that most impressed me. Nope, that honor goes to the Euro-only Polo, a Rabbit-like hatchback—only smaller—with plenty of room for four adults, a modest hatch that could swallow a weekend’s worth of gear, and a 1.4-liter, turbocharged diesel under the hood. Oh yeah, and a five-speed manual transmission.

Here’s the kicker: The Polo gets 60 to 70-plus mpg. And it’s really fun to drive. It’s got a good bit of turbo lag, so you need to keep the revs up for serious power, but once the turbo kicks in, acceleration is frisky.

The coming world decline in oil production, once started, will go on for decades and each year we'll see less oil produced than was produced the year before. As oil production declines liquid fuels will become more and more expensive. Therefore the use of diesels for commuting will be a transitional phase. In the long run I expect diesel cars to be used almost solely for longer trips and for freight hauling. For people who do average commutes (less than 40 miles per day) I expect rechargeable hybrid electric and pure electric cars to become the mainstays.

The wild cards in all this are methods to create liquid hydrocarbons. We can't get very far with biomass grain crops due to lack of land. But maybe other biological approaches such as genetically engineered algae for making diesel will become cost competitive. Though the capital costs of such an approach seem too high. Or perhaps nuclear reactors to produce hydrogen to then bind it to carbon will become cost competitive. Another possibility is that solar photovoltaics will become so cheap (and Nanosolar might make it happen) that solar electric could some day produce electric power cheaply enough to run processes to synthesize diesel fuel.

By Randall Parker 2007 December 27 08:40 PM  Energy Transportation
Entry Permalink | Comments(14)
2007 December 26 Wednesday
Evidence For Omega 3 Fatty Acid Against Alzheimer's Disease

More evidence that fish oils really are good for you.

Many Alzheimer's researchers have long touted fish oil, by pill or diet, as an accessible and inexpensive "weapon" that may delay or prevent this debilitating disease. Now, UCLA scientists have confirmed that fish oil is indeed a deterrent against Alzheimer's, and they have identified the reasons why.

Reporting in the current issue of the Journal of Neuroscience, now online, Greg Cole, professor of medicine and neurology at the David Geffen School of Medicine at UCLA and associate director of UCLA's Alzheimer Disease Research Center, and his colleagues report that the omega-3 fatty acid docosahexaenoic acid (DHA) found in fish oil increases the production of LR11, a protein that is found at reduced levels in Alzheimer's patients and which is known to destroy the protein that forms the "plaques" associated with the disease.

The plaques are deposits of a protein called beta amyloid that is thought to be toxic to neurons in the brain, leading to Alzheimer's. Since having high levels of LR11 prevents the toxic plaques from being made, low levels in patients are believed to be a factor in causing the disease.

I ate a large piece of salmon on Christmas Day and now I feel even better about it.

DHA also boosted LR11 in human cells grown in culture.

"We found that even low doses of DHA increased the levels of LR11 in rat neurons, while dietary DHA increased LR11 in brains of rats or older mice that had been genetically altered to develop Alzheimer's disease," said Cole, who is also associate director of the Geriatric Research Center at the Veterans Affairs Medical Center.

To show that the benefits of DHA were not limited to nonhuman animal cells, the researchers also confirmed a direct impact of DHA on human neuronal cells in culture as well. Thus, high levels of DHA leading to abundant LR11 seem to protect against Alzheimer's, Cole said, while low LR11 levels lead to formation of the amyloid plaques.

Since the oceans don't have enough fish in them, we really need land crops genetically engineered to have high levels of omega 3 fatty acids.

Not convinced yet? See my previous posts Omega 3 Fatty Acids In Fish Delay Alzheimer's In Mice, Fish In Diet Slows Rate Of Cognitive Decline, Omega 3 Fatty Acids Might Slow Alzheimers Disease, and Omega 3 Fatty Acids Protect Against Parkinsons Disease? for more on brain aging benefits of omega 3 fatty acids.

We need to slow the aging of our brains for the day when brain rejuvenation therapies become available. The more neurons we have left at that point the more that can be repaired and restored.

By Randall Parker 2007 December 26 09:38 PM  Aging Diet Brain Studies
Entry Permalink | Comments(10)
2007 December 25 Tuesday
Has Earth Climate Warming Trend Stopped?

Former BBC science journalist and astrophysicist Dr. David Whitehouse says that in spite of rising atmospheric CO2 the average temperature on planet Earth is not rising.

With only a few days remaining in 2007, the indications are the global temperature for this year is the same as that for 2006 – there has been no warming over the 12 months.

But is this just a blip in the ever upward trend you may ask? No.

The fact is that the global temperature of 2007 is statistically the same as 2006 as well as every year since 2001. Global warming has, temporarily or permanently, ceased. Temperatures across the world are not increasing as they should according to the fundamental theory behind global warming – the greenhouse effect. Something else is happening and it is vital that we find out what or else we may spend hundreds of billions of pounds needlessly.

Whitehouse is not making a radical claim. He's just not putting the same spin on the facts that you'll find in most media reports about temperature trends. A recent BBC report (not by Whitehouse) has a chart showing that 1998 was warmer than any year since and that 6 years in that period were slightly warmer than 2007. Their spin is that the 2007 temperature shows that global warming is a confirmed trend. Um, well, on one hand 2007 didn't return the world to the cooler temperature levels of earlier decades. But on the other hand the amount of carbon dioxide in the atmosphere has gone up a lot since 1998. So why hasn't the average global temperature for 2007 easily beaten the 1998 number? (not trying to imply an answer btw - I'm just full of questions)

Dr. Whitehouse says the world might be cooling due to reduced solar energy output.

Something is happening to our Sun. It has to do with sunspots, or rather the activity cycle their coming and going signifies. After a period of exceptionally high activity in the 20th century, our Sun has suddenly gone exceptionally quiet. Months have passed with no spots visible on its disc. We are at the end of one cycle of activity and astronomers are waiting for the sunspots to return and mark the start of the next, the so-called cycle 24. They have been waiting for a while now with no sign it's on its way any time soon.

So maybe atmospheric carbon dioxide (CO2) buildup really has a warming effect. But that warming effect is getting offset by a cooling effect caused by less solar radiation.

But recently the Sun's internal circulation has been failing. In May 2006 this conveyor belt had slowed to a crawl – a record low. Nasa scientist David Hathaway said: "It's off the bottom of the charts... this has important repercussions for future solar activity." What's more, it's not the only indicator that the Sun is up to something.

Back during the Little Ice Age era (starting perhaps as early as the 13th century and ending in the 19th century) the Earth experienced periods of reduced sunspot activity, including the Sporer Minimum (1450–1540) and the Maunder Minimum (1645–1715). That period featured a Thames that froze over in winters and lots of hunger and death from food shortages in Europe. Another Little Ice Age would cause problems on a scale rivaling or exceeding some of the problems predicted from global warming.

Reduced sunspot activity isn't necessarily a reason for complacency about atmospheric CO2 buildup. Even if our pollution is buffering the effects of reduced solar output, at some point the sun will probably kick back up again and the CO2 will still be there. And if the Sun causes huge climate changes (and that appears to be the case) then we need to develop the means to rapidly dial the greenhouse effect up and down in order to reduce the size of climate swings caused by solar output fluctuations.

Another possibility: Maybe increased sulfur aerosol pollution from China burning more coal is generating a cooling effect that is partially canceling the warming effect of CO2 buildup. This seems plausible at least. China's rate of expansion has caused a huge increase in a wide range of emissions and not just CO2 emissions.

Along with aluminum and cement, steel is the biggest reason China added 90 gigawatts of power generation capacity this year, the third year in a row in which it will increase its power output by more than the total capacity of Britain. About 85 percent of those new power plants burn coal.

The International Energy Agency, an energy policy and research group in Paris, had predicted as recently as a few years ago that China's carbon emissions would not reach those of the United States until 2020. But industrial production and coal use have grown so much faster than estimated that the agency now thinks China took the lead this year.

Production which has been shifted from the West to China (many economists call this "free trade") is cheaper in China in part because China tolerates far more pollution per unit of production.

A study by researchers at Carnegie Mellon University found that if all the goods that the United States imported between 1997 and 2004 had been produced domestically, America's carbon emissions would have been 30 percent higher.

A separate study for the European Parliament examined the transfer of steel production to China from Germany. It found that China's less efficient steel mills, and its greater reliance on coal, meant that it emitted three times as much carbon dioxide per ton of steel as German steel producers.

Pollution has not only shifted to China, in other words, but intensified even faster than the country's rapidly expanding output.

So types of pollutants that reflect away the sun's energy are another possible explanation for the seeming end of the warming trend in Earth average temperature. Aside: Britain is also in the ranks of countries that have basically exported a lot of their pollution to China.

Update: What I want to know: How noisy is the data for measuring the average temperature of the Earth's atmosphere? Could noise in the data make a real warming trend seem to stop? Given that temperature varies a great deal naturally over a period of centuries, one should expect natural trends to sometimes work with and sometimes against human-caused trends, and therefore to make human-caused trends harder to detect and confirm. There are real limits on our ability to know what is going on.
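
One way to build intuition about the noise question is a toy simulation (made-up numbers, not real climate data): bury a steady warming trend in year-to-year noise of roughly the scale seen in the global record and count how often short windows look flat.

```python
import random

# Toy model: a steady 0.02 C/yr trend plus ~0.1 C of year-to-year noise.
random.seed(1)
TREND, NOISE = 0.02, 0.1
temps = [TREND * yr + random.gauss(0, NOISE) for yr in range(100)]

# Count 7-year windows (like 2001-2007) whose last year is no warmer
# than the first, despite the underlying upward trend.
flat = sum(1 for i in range(len(temps) - 7) if temps[i + 7] <= temps[i])
print(f"{flat} of {len(temps) - 7} seven-year spans show no apparent warming")
```

With these settings roughly one seven-year span in six comes out flat, so a handful of flat years by itself tells us little either way.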

Update II: See the comments section for a comment about how volcanic eruptions make the temperature data noisy. Also, Peak Coal might end the whole fossil fuels emissions debate in a couple of decades. Peak Oil and Peak Natural Gas will probably happen sooner. For more on Peak Coal see here and here and here and here.

By Randall Parker 2007 December 25 09:43 PM  Climate Trends
Entry Permalink | Comments(23)
Image Processing System Identifies Bicycle Thieves

If a person taking a bicycle doesn't match an image of who parked it then a theft might be in progress.

PhD student Dima Damen, from the University’s Faculty of Engineering has developed a computer system that detects individuals parking their bicycles and can automatically warn security staff if it appears that someone other than the owner retrieves the vehicle.

...

Currently at prototype stage, Damen’s system takes colour information from CCTV images when a bike is parked and stores it until the bike is retrieved. It then marries the stored information with the new image and where there are significant differences, it can raise an alert to CCTV operators. In initial tests using a camera located above a bike rack at the University of Leeds, eleven out of thirteen simulated thefts were detected.

This approach seems like it might work for cars as well. Extended further, cameras trained on a street with image processing algorithms could alert humans when someone enters a building who has never been recorded entering that building before.
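
To make the colour-matching step concrete, here is a minimal sketch of the general idea (my own illustration, not Damen's actual algorithm): coarsely bucket the pixel colours from each image into histograms and alert when the two histograms overlap too little.

```python
from collections import Counter

def color_histogram(pixels, bucket=64):
    # Coarse histogram of (r, g, b) pixel triples, bucketed to tolerate
    # lighting changes between the parking and retrieval images.
    return Counter((r // bucket, g // bucket, b // bucket) for r, g, b in pixels)

def similarity(h1, h2):
    # Histogram intersection: fraction of pixel mass the two images share.
    shared = sum(min(h1[k], h2[k]) for k in h1.keys() & h2.keys())
    return shared / max(sum(h1.values()), sum(h2.values()))

def maybe_theft(parked_pixels, retrieving_pixels, threshold=0.5):
    # Raise an alert for a CCTV operator when the person retrieving the
    # bike looks too different from the person who parked it.
    return similarity(color_histogram(parked_pixels),
                      color_histogram(retrieving_pixels)) < threshold
```

A real system would also have to segment out the person and cope with viewpoint changes; this only shows the shape of the comparison.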

I am reminded of science fiction movies where small flying police monitor cameras watch people. When it comes to suspected thieves such mini flying cameras could be dispatched to record higher resolution images and even to follow a driver of a car or bicycle to track where they go.

By Randall Parker 2007 December 25 05:33 PM  Surveillance Society
Entry Permalink | Comments(0)
2007 December 23 Sunday
South Korean Boy Baby Preference Declines

Girls provide better care for their parents in old age. Girls also are less likely to run afoul of the law. Girls are less extreme. Yet in many societies (e.g. China and India) ultrasound, selective abortion, and other reproductive technologies are getting used to tilt live births toward boys. However, in South Korea the preference for boys seems to be ending.

In South Korea, once one of Asia’s most rigidly patriarchal societies, a centuries-old preference for baby boys is fast receding. And that has led to what seems to be a decrease in the number of abortions performed after ultrasounds that reveal the sex of a fetus.

According to a study released by the World Bank in October, South Korea is the first of several Asian countries with large sex imbalances at birth to reverse the trend, moving toward greater parity between the sexes. Last year, the ratio was 107.4 boys born for every 100 girls, still above what is considered normal, but down from a peak of 116.5 boys born for every 100 girls in 1990.

Rising status and more career opportunities for women helped reduce the desire for boys.

In some Asian countries the preference for boys is still quite strong.

In China in 2005, the ratio was 120 boys born for every 100 girls, according to the United Nations Population Fund. Vietnam reported a ratio of 110 boys to 100 girls last year. And although India recorded about 108 boys for every 100 girls in 2001, when the last census was taken, experts say the gap is sure to have widened by now.

In some Indian states the male to female ratio is much higher.

NEW DELHI: The sex ratio has further declined in the five northern States with Punjab showing the worst results ­ there were only 527 girls for every 1,000 boys in 2005 as against 754 girls as per the 2001 Census.

I see one big benefit: slower population growth. Girls can make babies. Fewer girls means fewer babies. So Punjab 15 to 20 years from now will show less population growth than other parts of India where selective abortion is less practiced. One of the biggest problems in the world is too many people. If we had fewer people, habitat destruction would be far less, extinctions would be fewer, fossil fuels depletion would be slower, and many other problems would be smaller.

South Korea, with a fertility rate of just 1.15, doesn't really need a sex ratio imbalance to control population. Whereas China and especially India would benefit from smaller populations. Even more so, large parts of Africa and Afghanistan with disastrously high fertility rates (and see this human fertility rate map) would benefit from sex ratio imbalances or anything else that would lower their fertilities. These places are stuck in a Malthusian Trap where any increase in capacity to grow food gets used up by population growth. The people suffer. The wildlife shrinks as their habitats get shifted into human habitats. This is horrible.

I see another benefit from the sex ratio imbalance: higher male competition for females might boost average IQ because dumber guys will probably lose the competition at higher rates than smarter guys. It is a politically incorrect truth (and therefore ignored or denounced) that the dummies are breeding faster than the smarties (demonstrated by smart South Korea's pathetic fertility rate). Any selective pressure for higher IQ is a welcome trend.

But there are potential downsides to the boy surplus. The high ratio of boys to girls can be expected to increase violence and crime. Also, large numbers of sexually frustrated young single men could rise up and rebel against their governments.

Though again on the bright side, the Chinese sex ratio imbalance might bring down the North Korean regime as Chinese men purchase North Korean girls as wives.

By Randall Parker 2007 December 23 09:58 PM  Trends Demographic
Entry Permalink | Comments(7)
2007 December 22 Saturday
$350,000 Full Personal Genome Sequencing Service

At least the upper class can now get their full genomes sequenced.

CAMBRIDGE, Massachusetts — Nov. 29, 2007 — Knome, a personal genomics company, today announced the launch of the first commercial whole-genome sequencing and analysis service for individuals.

Knome does genetic sequencing and not just genetic testing. The latter usually involves testing a number of predetermined locations where genes are known to vary between individuals and groups. The former, what Knome is offering, involves the much harder task of going through and reading every letter in your genome. When done well, full genetic sequencing can identify rarer single letter differences that the cheaper genetic testing techniques won't identify. Also, full sequencing can detect what are called copy number variations, in which the number of copies of a gene differs between individuals.

Whole-genome sequencing decodes the 6 billion bits of information that make up an individual’s genome. Unlike existing genome scanning or “SNP chip” technologies that provide useful but limited information on approximately 20 conditions, whole-genome sequencing allows for the analysis of up to 2,000 common and rare conditions, and over 20,000 genes – numbers that are rapidly growing.

“Whole-genome sequencing is the endgame,” according to Mr. Conde. “It will enable us to look at nearly 100% of your genetic code compared to the less than 0.02% currently available on SNP chips. This is the approach that most fully reveals what our genomes can tell us about ourselves.”

Pricing for Knome’s service will start at $350,000, including whole-genome sequencing and a comprehensive analysis from a team of leading geneticists, clinicians and bioinformaticians. This team will also provide continued support and counseling.

But if you don't have a spare $350k but can scrape up a thousand or two you can still get fairly cheap genetic testing of some large subset of known genetic differences.

Two rival firms have just unveiled services that will allow people to scrutinize their own genomes for $1,000.

The first was deCODE genetics, an Icelandic firm that has already developed genetic tests for several diseases. On Nov. 16 it announced an Internet-based service, called deCODEme.

...

Then, on Nov. 19, 23andMe, a start-up based in California's Silicon Valley, announced a similar service.

...

Navigenics, another Californian firm, says it will unveil a more medically oriented service, priced at around $2,500, in January.

The X Prize Foundation has an Archon prize to encourage the development of faster full genome sequencing technology.

If the X Prize Foundation has its way, it will soon be possible to sequence a genome in hours. To make that happen, the foundation, perhaps better known for its spaceflight prize, is offering the Archon genomics prize. This will be worth $10m to the first team able to sequence 100 human genomes accurately in ten days or less. (The prize is sponsored by Stewart Blusson, a philanthropist who is president of Archon Minerals, a mining company based in Vancouver.)

Faster sequencing is usually cheaper sequencing. So the effect of this prize is to create incentives to develop cheaper DNA sequencing technologies.

Expect to see huge price drops for DNA sequencing and testing services. Also, as the underlying technologies become cheaper, the resulting flood of genetic information will allow scientists to discover orders of magnitude more information about what each genetic difference means. So genetic test results will tell us far more useful information than they can today.

I expect the biggest impact of genetic testing to occur with mating practices. People will use genetic testing to select suitable partners (or donors) for reproduction. Also, they will use gene testing to select among embryos with in vitro fertilization. They will choose among embryos based on what genetic test results indicate about looks, intelligence, personality qualities, athletic abilities, health risks, and other qualities. These genetic testing companies are going to usher in huge shifts in the directions of human evolution.

By Randall Parker 2007 December 22 10:48 PM  Biotech Assay Tools
Entry Permalink | Comments(8)
2007 December 20 Thursday
Scientists Consider Climate Engineering Option

Atmospheric CO2 concentrations are rising more rapidly than in previous decades. In the face of this trend a number of scientists are looking at the risks and benefits of climate engineering.

Govindasamy Bala, an atmospheric scientist at the East Bay's Lawrence Livermore National Laboratory, discussed a climate model he recently completed. By putting aerosols in the stratosphere to reflect sunlight, he found the amount of sunlight that reaches the Earth's surface could be reduced by 2 percent - enough to counterbalance the doubling of carbon dioxide. On the other hand, he emphasized, the climate reacted more strongly to the aerosols than the carbon dioxide, resulting in less global average rainfall.

"I don't think our understanding of the climate system is now complete in order for us to start with geoengineering," Bala said.

Could another technique combined with the aerosols boost precipitation? If so, what would do it? Cloud seeding? Methods to spray water into the atmosphere to get more water to evaporate? How to do that without using fossil fuels energy? Floating windmills to power water spray pumps?
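
On Bala's 2 percent figure: a back-of-the-envelope zero-dimensional energy balance (textbook constants and my arithmetic, not his climate model) shows why that amount of dimming is in the same ballpark as a CO2 doubling, for which a forcing of roughly 3.7 watts per square meter is usually quoted.

```python
# Zero-dimensional Earth energy balance with standard textbook values.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
SOLAR = 1366.0    # solar constant at top of atmosphere, W/m^2
ALBEDO = 0.3      # fraction of sunlight reflected back to space

def effective_temp(solar):
    absorbed = solar * (1 - ALBEDO) / 4  # averaged over the whole sphere
    return (absorbed / SIGMA) ** 0.25

dF = SOLAR * (1 - ALBEDO) / 4 * 0.02                       # forcing removed
dT = effective_temp(SOLAR) - effective_temp(0.98 * SOLAR)  # resulting cooling
print(f"2% dimming removes ~{dF:.1f} W/m^2 and cools ~{dT:.2f} K")
# ~4.8 W/m^2 and ~1.3 K -- comparable to the ~3.7 W/m^2 of a CO2 doubling
```

The match is rough since all feedbacks are ignored, but it shows the 2 percent number is not arbitrary.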

But aerosols aren't the only way to address the problem. Ken Caldeira opposes using sulphate aerosols to cool the Earth, but Caldeira and Govindasamy think reflecting light away with a physical reflective material might work better than an aerosol.

A few years ago, Dr Caldeira set out to disprove an idea put forward by Livermore physicists Lowell Wood and Edward Teller to cool the Earth with a sheet of superfine reflective mesh - similar in concept to orbiting mirrors.

In a computer model, Dr Caldeira and colleague Bala Govindasamy simulated the effects of diminished solar radiation.

"We were originally trying to show that this is a bad idea, that there would be residual regional and global climate effects," explains Dr Caldeira.

"Much to our chagrin, it worked really well."

But mirrors and other physical surfaces for reflecting light might also reduce precipitation.

My problem with the light reflection schemes is that they don't prevent the higher concentrations of CO2 from dissolving into the oceans and forming a mild acid that acidifies them. If that acidification is a problem then light reflection by itself doesn't address what might be the biggest problem with atmospheric CO2 build-up.

Pulling CO2 out of the atmosphere by seeding oceans with iron is another option. The iron would allow more algae to grow and the algae would convert dissolved CO2 into organic material, a portion of which would sink to the ocean floor. Dutch aquatic microbiologist Jef Huisman says iron fertilization to extract CO2 from the atmosphere seems a risky way to do climate engineering.

Asked about the research about to be conducted by Planktos, Professor Huisman said: "I think it's an interesting idea as well as a dangerous idea. Interesting because we know that if we can increase the primary production there will be a larger intake of carbon dioxide into the ocean.

"But is also dangerous. Just as you fertilize on land you will change the eco-system. Whereas we have experience of what happens in a meadow, we have no experience of what would happen with the eco-system species composition in the ocean. What happens if you do large scale iron fertilization? We have no idea which species are going to profit or whether it will cause harmful algal blooms.

Professor Huisman predicts that once the iron goes into the ocean there will be a strong increase in phytoplankton species.

"I would expect the small phytoplankton species -- that have a fast growth rate - will be there first," he said. "Secondly you would have slow plankton species that would catch up and start grazing on the phytoplankton species."

But you have to weigh risks against other risks. The Chinese and Indians aren't going to stop their rising consumption of fossil fuels unless either fossil fuels reserves start running out or we find cheaper alternative sources of energy. I am expecting oil and natural gas production to peak in the next decade. But coal reserves might be large enough (it is not clear) to melt the polar ice caps.

Big phytoplankton blooms could be harnessed in aquaculture. Create enclosed areas in parts of the ocean where iron shortages prevent phytoplankton growth. Seed with iron. Put in fish. Let them eat the phytoplankton. Harvest the fish. Feed an unfortunately growing world population and extract CO2 from the atmosphere at the same time.

The way I see it we need replacements for fossil fuels regardless of whether Peak Coal is nearing. If Peak Coal is a distant prospect then we need alternatives because coal is a big conventional source of pollution (and I really wish the harm from conventional pollution got half the press that global warming receives since particulates and mercury really are bad for you). Cheaper alternatives to coal would displace coal and we'd get cleaner air and water. If Peak Coal is coming soon along with Peak Oil and Peak Natural Gas then we need some other way to power civilization or else our living standards will plummet. Either way we need cheaper cleaner sources of energy.

Well, readers Alex and Brock both draw my attention to one possibility: 200 kilowatt Toshiba micro nuclear plants might bring cheap nuclear power to small communities, big buildings or city blocks.

By Randall Parker 2007 December 20 10:41 PM  Climate Engineering
Entry Permalink | Comments(22)
2007 December 19 Wednesday
Is Time Slowing Down? Will Time Stop Some Day?

Weird wild stuff.

The idea that time itself could cease to be in billions of years - and everything will grind to a halt - has been set out by Professor José Senovilla, Marc Mars and Raül Vera of the University of the Basque Country, Bilbao, and University of Salamanca, Spain.

These scientists propose this theory as an explanation for a known phenomenon: distant stars seem to be moving faster. Since images from distant stars come from further back in time, these scientists suggest that in the past time ran more rapidly. So things moved more rapidly.

A decade ago, astronomers noticed that distant supernovae - exploding stars on the very fringes of the universe - seemed to be moving faster than those nearer to the centre, suggesting that they were accelerating as they shot through space.

They think their idea makes more sense than the hidden dark energy that sits at the center of the standard alternative explanation for how the distant stars appear in telescopes.

My guess is that we are not in the only universe. We need to find a way to travel to other universes so we can escape each universe as time in it slows down or all the matter converts to diffuse energy or the universe otherwise gets used up.

Of course, first we need to develop rejuvenation treatments that will reverse aging. Only once we accomplish that goal will we have the luxury of worrying about our slowing-down or running-down universe.

By Randall Parker 2007 December 19 10:42 PM  Dangers Natural General
Entry Permalink | Comments(14)
Lawsuits Coming Over Genetic Inheritance?

Some members of the British House of Lords argue that the use of donor eggs and sperm to create offspring should not be kept secret from those offspring.

Children born from donor eggs or sperm could have the information recorded on their birth certificates.

An influential group of peers yesterday called for a law change to force parents to reveal donor conceptions.

Under the proposals, a special mark next to a child's name would reveal whether he or she was conceived naturally or with the help of a donor.

Parents who tried to hide the truth from their children could be fined or imprisoned.

What do we have a right to know? Should parents get to know that their kids are not genetically from them while the kids get kept in the dark? Deceived kids can pay a price as a result of the deception: when they look at their parents to judge their own potential capabilities they will tend to guess those capabilities incorrectly. In particular, kids who have genes that make them more capable than their legal parents will tend to underestimate their own capabilities.

But if the goal of knowing about your genetic parents is to provide useful insights about yourself then the rapid decline in the cost of genetic sequencing and testing will provide a much better way to do that by the time babies born today reach adulthood.

But knowing your genetic inheritance is less important than what is in your genetic inheritance. I expect some people to object to what they've been given as their genetic inheritance. After all, what you get in your genes has huge consequences. Why won't some unhappy and angry children sue?

I see lawsuits over genetic inheritance probably in about 35 years. Once people gain the ability to choose genetic variations for their children those children will grow up, see what decisions were made for them, and sue their legal parents over the genetic choices of their parents.

By Randall Parker 2007 December 19 09:42 PM  Bioethics Reproduction
Entry Permalink | Comments(7)
2007 December 18 Tuesday
Tunguska Simulation Shows Higher Risk From Smaller Asteroids

Smaller asteroids are more dangerous than previously thought.

ALBUQUERQUE, N.M. — The stunning amount of forest devastation at Tunguska a century ago in Siberia may have been caused by an asteroid only a fraction as large as previously published estimates, Sandia National Laboratories supercomputer simulations suggest.

“The asteroid that caused the extensive damage was much smaller than we had thought,” says Sandia principal investigator Mark Boslough of the impact that occurred June 30, 1908. “That such a small object can do this kind of destruction suggests that smaller asteroids are something to consider. Their smaller size indicates such collisions are not as improbable as we had believed.”

Because smaller asteroids approach Earth statistically more frequently than larger ones, he says, “We should be making more efforts at detecting the smaller ones than we have till now.”

We need to search harder for the larger number of smaller asteroids. Our risk of death and destruction from asteroids is larger than previously believed.

The Tunguska blast had a downward direction that amplified its destructive effect on the surface of the Earth.

Simulations show that the material of an incoming asteroid is compressed by the increasing resistance of Earth’s atmosphere. As it penetrates deeper, the more and more resistant atmospheric wall causes it to explode as an airburst that precipitates the downward flow of heated gas.

Because of the additional energy transported toward the surface by the fireball, what scientists had thought to be an explosion between 10 and 20 megatons was more likely only three to five megatons. The physical size of the asteroid, says Boslough, depends upon its speed and whether it is porous or nonporous, icy or waterless, and other material characteristics.
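
That dependence is easy to see with rough kinetic-energy arithmetic (my assumed inputs, not Sandia's simulation): airburst yield is just the impactor's kinetic energy, which scales with diameter cubed, density, and velocity squared, so the size inferred from a given blast shrinks as the assumed speed or density rises.

```python
import math

MEGATON_J = 4.184e15  # joules per megaton of TNT

def yield_megatons(diameter_m, density_kg_m3, speed_m_s):
    # Kinetic energy of a spherical impactor, expressed in megatons.
    radius = diameter_m / 2
    mass = density_kg_m3 * (4 / 3) * math.pi * radius ** 3
    return 0.5 * mass * speed_m_s ** 2 / MEGATON_J

# A porous ~50 m stony object at 15 km/s lands in the new 3-5 megaton range...
print(f"{yield_megatons(50, 2000, 15_000):.1f} Mt")   # ~3.5 Mt
# ...while a denser, faster ~65 m object reaches the older 10-20 megaton estimates.
print(f"{yield_megatons(65, 2500, 20_000):.1f} Mt")   # ~17 Mt
```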

We should try much harder to identify the asteroids that are going to collide with the Earth in the future. The sooner we identify them the easier it will be to deflect them from their paths.

The Tunguska event felled over 80 million trees and is a topic of active research which has generated lots of cool photographs.

By Randall Parker 2007 December 18 10:42 PM  Dangers Asteroids
Entry Permalink | Comments(2)
Stanford Scientists Claim 10 Times Better Batteries

Silicon nanowires will improve lithium ion battery capacity by an order of magnitude?

Stanford researchers have found a way to use silicon nanowires to reinvent the rechargeable lithium-ion batteries that power laptops, iPods, video cameras, cell phones, and countless other devices.

The new version, developed through research led by Yi Cui, assistant professor of materials science and engineering, produces 10 times the amount of electricity of existing lithium-ion, known as Li-ion, batteries. A laptop that now runs on battery for two hours could operate for 20 hours, a boon to ocean-hopping business travelers.

"It's not a small improvement," Cui said. "It's a revolutionary development."

If this works out it really is revolutionary. Will the batteries last through many rechargings? Will they be manufacturable?
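
For a sense of where a factor of ten at the anode could come from, here are commonly cited theoretical figures (not numbers from the Stanford paper): graphite, the standard Li-ion anode material, holds about 372 mAh per gram, while fully lithiated silicon is usually quoted near 4,200 mAh per gram. Whole-cell gains would be smaller since the cathode and packaging are unchanged.

```python
# Commonly cited theoretical gravimetric capacities (not from the paper).
GRAPHITE_MAH_G = 372   # LiC6, the standard Li-ion anode
SILICON_MAH_G = 4200   # fully lithiated silicon, ~Li22Si5

print(f"anode-level gain: ~{SILICON_MAH_G / GRAPHITE_MAH_G:.0f}x")  # ~11x
```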

Cui thinks these batteries will work in electric cars and as a way to store solar photovoltaic electric power.

The breakthrough is described in a paper, "High-performance lithium battery anodes using silicon nanowires," published online Dec. 16 in Nature Nanotechnology, written by Cui, his graduate chemistry student Candace Chan and five others.

The greatly expanded storage capacity could make Li-ion batteries attractive to electric car manufacturers. Cui suggested that they could also be used in homes or offices to store electricity generated by rooftop solar panels.

"Given the mature infrastructure behind silicon, this new technology can be pushed to real life quickly," Cui said.

The future is electric. The sooner we can make the shift from oil to non-fossil-fuel ways of generating electric power, the better off we'll be.

By Randall Parker 2007 December 18 10:26 PM  Energy Batteries
Entry Permalink | Comments(10)
2007 December 17 Monday
Immune Naive T Cell Aging Cuts Their Numbers

Because immune cells circulate in the blood, they strike me as great early candidates for the development of rejuvenating stem cell therapies. The cells in the blood are more accessible and replaceable than cells living in complexly shaped organs. A great target for youthful cell development is the population of naive T cells, which age and become less able to divide.

PORTLAND, Ore. – Researchers at Oregon Health & Science University have uncovered new information about the body’s immune system in a study that suggests new strategies may be in order for protecting the country’s aging population against disease. The research is published in the current edition of the Proceedings of the National Academy of Science.

The research focused on an important component of the body’s immune system, a certain type of white blood cell called naïve T-cells. These cells are called naive because they have no experience of encountering germs. However, once they encounter germs, they learn and adapt to become strong defenders of the organism. The cells play an important role in the vaccination process because vaccines, which contain either weakened or dead viruses, teach naïve T-cells how to recognize germs and prepare the body for fighting infectious diseases at a later date. Previous research shows that an individual’s supply of naïve T-cells diminishes over their lifetime, meaning that in old age a person is more susceptible to infections such as the flu.

“Our research identified one actual process by which naïve T-cells are lost later in life,” explained Janko Nikolich-Zugich, Ph.D., a senior scientist at the OHSU Vaccine and Gene Therapy Institute and the Oregon National Primate Research Center and a professor of molecular microbiology and immunology in the OHSU School of Medicine.

“Throughout our lives, naïve T-cells divide very slowly in our bodies. This helps maintain sufficient numbers of naïve T-cells while we are young. As we age, naïve T-cells are lost and the remaining ones speed up their division to make up for the losses in their numbers. Interestingly, after a certain point, this actually causes the numbers of naïve T-cells to dwindle over time. Our data shows that once the number of naïve T-cells drops below a critical point, the rapidly dividing naïve cells are very short lived. Based on this finding and other information, research suggests that some of the aging Americans may be better protected against disease by finding a way to jumpstart production of new naïve T-cells instead of through revaccination.”

Infectious diseases kill a lot of elderly people because their immune systems become too weak to hold off infections. But that is not the only way that immune system aging costs us. Aged immune systems are less able to fight off cancer, and immune system aging might even be the biggest cause of the increasing incidence of cancer seen as people age.

We need to find a way to create youthful naive T cells to inject into us. Such cells would at least partially rejuvenate our immune systems and by doing so reduce our risk of cancer and infectious diseases.

By Randall Parker 2007 December 17 10:27 PM  Aging Mechanisms
Entry Permalink | Comments(2)
Aging Of Brain Thirst Area Cuts Water Consumption

Here is yet another reason brain aging is something we should figure out how to stop and reverse. The part of the brain that regulates thirst becomes inaccurate and underestimates water needs as we age.

Florey researchers Dr Michael Farrell, A/Prof Gary Egan and Prof Derek Denton discovered that a region in the brain called the mid cingulate cortex predicts how much water a person needs, but this region malfunctions in older people.

Dr Farrell said they infused old (age 65 to 74) and young (age 21 to 30) research participants with salty water to make them thirsty and then allowed them to drink as much water as they wanted.

“Although all participants had the same level of thirst, the older people only drank half as much water as the younger subjects,” Dr Farrell said.

“Using PET imaging we found in the older people, the mid cingulate cortex was ‘turned off’ much earlier by drinking small volumes.”

“This discovery helps explain why the elderly can become easily dehydrated,” he said.

As you age many processes in your brain start going awry. We need to develop biotechnologies to rejuvenate our brains. We'd become more productive, happier, and less hobbled by assorted maladies.

By Randall Parker 2007 December 17 10:18 PM  Aging Mechanisms
Entry Permalink | Comments(1)
2007 December 16 Sunday
Carbon Dioxide In Oceans Threatens To Kill Coral Reefs

I do not see global warming as an unsolvable problem or as a reason to stop using oil (especially since I think we are running out of oil anyway). We can use one cheap way to do climate engineering or yet another to keep down world temperatures. But as I've stated on previous occasions, CO2 build-up in the oceans seems like it might be the reason to worry about atmospheric CO2 build-up.

Since I think we are running out of oil and natural gas the question I most want answered with regard to the environment is how much coal does the world really have left that is accessible to extract and burn? American coal reserves and world coal reserves might be smaller than commonly thought. However, if the amount of accessible coal is large then CO2 emitted by burning coal for electricity and other purposes could acidify the oceans and kill all the coral reefs.

Stanford, CA — Carbon emissions from human activities are not just heating up the globe, they are changing the ocean’s chemistry. This could soon be fatal to coral reefs, which are havens for marine biodiversity and underpin the economies of many coastal communities. Scientists from the Carnegie Institution’s Department of Global Ecology have calculated that if current carbon dioxide emission trends continue, by mid-century 98% of present-day reef habitats will be bathed in water too acidic for reef growth. Among the first victims will be Australia’s Great Barrier Reef, the world’s largest organic structure.

Chemical oceanographers Ken Caldeira and Long Cao are presenting their results in a multi-author paper in the December 14 issue of Science* and at the annual meeting of American Geophysical Union in San Francisco on the same date. The work is based on computer simulations of ocean chemistry under levels of atmospheric CO2 ranging from 280 parts per million (pre-industrial levels) to 5000 ppm. Present levels are 380 ppm and rapidly rising due to accelerating emissions from human activities, primarily the burning of fossil fuels.
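
For rough timing context, a simple extrapolation (my own assumptions, not the Carnegie group's model): start at the quoted 380 ppm in 2007 and let the roughly 2 ppm-per-year increment itself grow a couple percent a year to mimic accelerating emissions.

```python
# When does atmospheric CO2 pass 550 ppm under simple compound growth?
ppm, year, increment = 380.0, 2007, 2.0  # 2007 level and ~2 ppm/yr rise
while ppm < 550:
    ppm += increment
    increment *= 1.02  # assume the annual increment grows 2% per year
    year += 1
print(year)  # 2058 under these assumptions -- roughly the mid-century the paper warns about
```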

By the time we reach 550 ppm all the coral reefs are dead. Likely other ocean species will bite the dust as well. Time to switch to nuclear power. But in case we don't make that move, well, I've always wanted to see Australia's coral reefs. So I guess I need to fly down there on a fossil fuel burning, CO2 emitting jumbo jet to see the Great Barrier Reef in all its glory before everyone else uses so much fossil fuel that the reefs are dead. Some of you are thinking "what a twisted guy that FuturePundit is to think that". Yes, I'm pretty twisted in my thinking. But in this case that idea did not come from my imagination. Nope, I read it in the New York Times. There's a growing travel industry in taking people to see what humanity is ruining and wrecking.

From the tropics to the ice fields, doom is big business. Quark Expeditions, a leader in arctic travel, doubled capacity for its 2008 season of trips to the northern and southernmost reaches of the planet. Travel agents report clients are increasingly requesting trips to see the melting glaciers of Patagonia, the threatened coral of the Great Barrier Reef, and the eroding atolls of the Maldives, Mr. Shapiro said.

Meet humanity. Why do some people say how wonderful it is?

So what should we do? Even if industrialized countries kick the carbon habit, Asia and other places are on course to boost atmospheric CO2 levels.

Richard Richels, an economist at the Electric Power Research Institute, helped produce an ominous forecast: even if the established industrial powers turned off every power plant and car right now, unless there are changes in policy in poorer countries the concentration of carbon dioxide in the atmosphere could still reach 450 parts per million — a level deemed unacceptably dangerous by many scientists — by 2070. (If no one does anything, that threshold is reached in 2040.)

In my view this tells us that we need to develop cheaper alternatives to fossil fuels. We need to develop clean energy sources cheap enough that the developing countries will be lured away from coal to these alternatives.

Energy usage to manufacture goods for export is a major source of CO2 emissions in China.

Yet one of the biggest is the enormous increase in China’s production of manufactured goods for export. Indeed, a study by the Tyndall Center for Climate Change Research in Britain estimated that in 2004, net exports accounted for 23 percent of Chinese greenhouse gas emissions.

Think George W. Bush is an enemy of the environment? He's an environmentalist compared to the Chinese. China is passing the US as the biggest CO2 emitter and probably is already the biggest emitter of conventional pollutants. Yet China is just getting started. Their emissions are going to get far worse (and not just on CO2) before they get better. More mercury. More particulates. More pollutants in rivers and the oceans. China's industrialization is a disaster for the world's environment.

A team of economists led by Dieter Helm at Oxford University claims that Britain's decrease in CO2 emissions is an illusion caused in part by importing products whose domestic manufacture used to cause domestic CO2 emissions.

The analysis says pollution from aviation, shipping, overseas trade and tourism, which are not measured in the official figures, means that UK carbon consumption has risen significantly over the past decade, and that the government's claims to have tackled global warming are an "illusion".

...

Under Kyoto, Britain must reduce its greenhouse gas output to 12.5% below 1990 levels by 2012. According to official figures filed with the UN, Britain's emissions are currently down 15% compared with 1990.

But the new report says UK carbon output has actually risen by 19% over that period, once the missing emissions are included in the figures.

Britain has basically exported some of its fossil fuel-using industries (as have the United States and other Western countries) to countries like China, whose leaders think nothing of setting new records in rates of pollution emissions. Again, doesn't this argue for a much more rapid development of technologies for cleaner energy to make those cleaner sources cheaper? We can't appeal to altruism or enlightened self interest about long term costs. Such arguments aren't going to work with China or India. They haven't even worked with Canada, which signed Kyoto and then, under a left-of-center government, went on to greatly increase CO2 emissions. Japan and other Kyoto signatories didn't meet their treaty obligations either.

Governments around the world aren't willing to impose much hardship on their populaces to reduce fossil fuels use. Some talk a good game. But coal mines are getting reopened in Germany and Britain.

A group of prominent scientists agree that a big increase in research funding is needed to solve our energy and environment problems.

The letter, sent Sunday, calls for at least $30 billion a year in spending to promote sustained research akin to the Apollo space program or the Manhattan Project.

It was drafted by Martin I. Hoffert, an emeritus physics professor at New York University; Kenneth Caldeira, a Carnegie Institution scientist based at Stanford University; and John Katzenberger, director of the Aspen Global Change Institute, a private research group. Other signers include Nobel Prize winners in chemistry, economics, and medicine.

Gregory Benford, Lowell Wood, and Nobelist Paul Crutzen are among the signers. You can read the full letter (PDF format). Note the graph showing types and levels of research funding from 1955 till today.

Update: Also see Andrew Revkin's article from a year ago: Budgets Falling in Race to Fight Global Warming.

By Randall Parker 2007 December 16 09:16 PM  Trends Habitat Loss
Entry Permalink | Comments(13)
2007 December 15 Saturday
Regulatory Fears Cut Coal Electric Plans In United States

A Bloomberg article mostly about prospects of increased sales by GE of natural gas electric generator turbines highlights a shift away from new coal electric plants due to fears of carbon emissions regulations.

Concern that climate-change legislation could render coal-fueled plants obsolete prompted the cancellation of about 13 this year. Coal plants capable of generating 12,000 megawatts, enough power for 9.6 million average U.S. homes, were proposed for construction in 2005. Only 329 megawatts, enough for 263,200 homes, were built, according to U.S. Energy Department data.
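
Those homes-per-megawatt figures are internally consistent, implying an average draw of 1.25 kilowatts per home. A minimal sketch of the arithmetic (the per-home figure is derived from the excerpt's own numbers, not stated in it):

    # 12,000 MW proposed in 2005 was said to cover 9.6 million US homes.
    avg_kw_per_home = 12000e3 / 9.6e6             # implied average draw: 1.25 kW
    homes_from_built = 329e3 / avg_kw_per_home    # 329 MW actually built
    print(avg_kw_per_home, int(homes_from_built)) # 1.25 263200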

Fears of global warming (now widely relabeled as "climate change," perhaps to make the assertion less disprovable?), whether realistic or not, are serving a constructive purpose by cutting back on coal electric plants. We get less conventional pollution as a result. Both nuclear and wind gain from this turn of events. Coal gets used for base load demand. But wind can contribute little to reliable base load demand. Therefore power companies need to build either nukes or a combination of wind with natural gas back-up.

If nuclear power plant builders can get cost overruns below the 25% overrun of the Olkiluoto-3 plant in Finland (and some of the mistakes seem avoidable next time) then I expect nuclear to become much more competitive against the wind/natural gas combination - especially once natural gas production starts declining. Parts of Europe (France excepted) might go with the more expensive wind approach. Offshore wind costs more than onshore wind or nuclear. Yet the British government has just decided on a big offshore wind push. On the bright side, when natural gas prices skyrocket at least offshore wind will be cheaper than natural gas for electric power generation.

Update: Regulatory obstacles to new coal electric plants might work in our favor for reasons unrelated to pollution. More nuclear and wind facilities will be developed and therefore the coming peak in world coal production will fall less hard on countries that are forced away from coal by environmental opposition. The Energy Watch Group released a report in October 2007 which argued that measured by energy content US coal production already peaked in 2002.

The USA, being the second largest producer, have already passed peak production of high quality coal in 1990 in the Appalachian and the Illinois basin. Production of subbituminous coal in Wyoming more than compensated for this decline in terms of volume and – according to its stated reserves – this trend can continue for another 10 to 15 years. However, due to the lower energy content of subbituminous coal, US coal production in terms of energy has already peaked 5 years ago – it is unclear whether this trend can be reversed. Also specific productivity per miner is declining since about 2000.

The Energy Watch Group expects world coal production to peak around 2025.

Global coal reserve data are of poor quality, but seem to be biased towards the high side. Production profile projections suggest the global peak of coal production to occur around 2025 at 30 percent above current production in the best case.

In the United States coal provides about half of all electric power. A decline in coal production in the US means higher electricity prices and an inability to migrate current oil uses over to electric power. We need a lot more nuclear and wind power. We also need accelerated research and development into ways to make photovoltaics cost competitive.

Caltech professor David Rutledge also expects a coal peak much sooner than previously projected.

By Randall Parker 2007 December 15 09:00 PM  Energy Electric Generators
Entry Permalink | Comments(16)
2007 December 13 Thursday
David Levy Sees Sexbots In Our Future

A British artificial intelligence researcher expects that within a few decades robots will become highly desirable partners in relationships.

Levy is an expert in artificial intelligence. He is fascinated with the idea of "love and sex with robots," and his visions of the future include "malebots" and "fembots" as lovers and life partners. A chess champion and the president of the International Computer Games Association, Levy, 62, has just published a book, "Love and Sex with Robots: The Evolution of Human-Robot Relationships" -- that is provocative in the truest sense of the word. He is convinced that human beings will be having sex with robots one day. They will show us sexual practices that we hadn't even imagined existed. We will love them and respect them, and we will entrust them with our most intimate secrets. All of this, says Levy, will be a reality in hardly more than 40 years from now.

I can understand how relationships with artificial intelligences could be very satisfying. An artificial intelligence could be programmed to have similar values and interests and to be very patient and engaging. But I do not expect robots to become sexually attractive unless materials for robot construction become so advanced that robots can look very much like humans.

But will robots find us attractive as companions?

"The mere concept of an artificial partner, husband, wife, friend or lover is one that, for most people at the start of the 21st century, challenges their notion of relationships," says Levy. "But my thesis is this: Robots will be hugely attractive to humans as companions because of their many talents, senses and capabilities." Given rapid developments in technology, Levy believes that it is only a matter of time before machines will be capable of offering human-like traits. According to Levy, "love and sex with robots on a grand scale are inevitable."

Also, will robots be autonomous or will they be part of large collective minds? Will the machines moving around have their own self-contained artificial intelligences and autonomy? Or will they be so closely connected into massive stationary AI computers that there won't be all that many distinct separate artificial intelligences to have relationships with?

I'm thinking humans will find their ideal mates by genetically engineering them.

By Randall Parker 2007 December 13 09:40 PM  Transhumans Posthumans
Entry Permalink | Comments(15)
2007 December 12 Wednesday
Scientists Simulate DNA Nanopore Sequencer

The trend of using computer semiconductor technologies to manipulate biological material promises to revolutionize biological science and biotechnology. Orders of magnitude cost reductions become possible when very small devices are fabricated to manipulate cells and components of cells. Researchers at University of Illinois have created a simulated design for a nanopore-based DNA sequencer that could drastically cut DNA sequencing costs.

CHAMPAIGN, Ill. — Using computer simulations, researchers at the University of Illinois have demonstrated a strategy for sequencing DNA by driving the molecule back and forth through a nanopore capacitor in a semiconductor chip. The technique could lead to a device that would read human genomes quickly and affordably.

Being able to sequence a human genome for $1,000 or less (which is the price most insurance companies are willing to pay) could open a new era in personal medicine, making it possible to precisely diagnose the cause of many diseases and tailor drugs and treatment procedures to the genetic make-up of an individual.

“Despite the tremendous interest in using nanopores for sequencing DNA, it was unclear how, exactly, nanopores could be used to read the DNA sequence,” said U. of I. physics professor Aleksei Aksimentiev. “We now describe one such method.”

Cheap DNA sequencing is going to most dramatically change reproductive practices. Once embryos can be fully DNA tested and the meaning of all genetic variations becomes known, a substantial fraction of the population will use in vitro fertilization and pre-implantation genetic diagnosis (PIGD or PGD) to select embryos to start pregnancies with. That act of selection will speed up human evolution by orders of magnitude even before we start introducing genetic variations with genetic engineering.

By Randall Parker 2007 December 12 11:05 PM  Biotech Advance Rates
Entry Permalink | Comments(17)
Electricity Dynamic Pricing Spreads

Dynamic pricing based on short-term changes in supply and demand is a spreading experiment.

Meters that can read prices every hour are also the centerpiece of aggressive conservation efforts in Virginia and Maryland, where Gov. Martin O'Malley (D) has pledged to reduce the state's electricity consumption by 15 percent by 2015. Fluorescent light bulbs that outlast traditional incandescent ones, rebates on energy-efficient appliances, free energy audits -- all are on the table for customers of Pepco, Virginia Dominion Power and BGE, among others.

"At the end of the day, people want to understand what their electricity is costing them and what they are getting for it," said Steven B. Larsen, chairman of the Maryland Public Service Commission, the state's utility regulator. "The basic concept is that technology can help save us money."

Smart meters have not been mandated, but they are being used in several states. The Illinois legislature has required the expansion of peak pricing programs, and Florida and California are among those conducting pilot programs.

This is a necessary development. Two of our big prospects for future electric generation, wind and solar, are not dependable. To use more wind and solar we need to adjust electric prices based not just on demand but also on available supply. Charge less when the sun shines and the wind blows. Charge more at night and on overcast and winter days. Charge more as well when the breezes die down.

Dynamic pricing also helps nuclear power because it shifts more demand away from peak periods. Nuclear works best as baseload power that is running constantly 24 hours a day and 365 days a year. Nuclear capital costs are too high to operate a nuclear power plant only during hot afternoons. Dynamic pricing will partially flatten demand and by doing so make nuclear able to supply a larger fraction of total used electric power.
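
To see why capacity factor dominates the economics of a capital-heavy plant, here is a minimal sketch with assumed numbers — roughly $4,000/kW overnight cost and a 10% annual fixed charge rate, neither figure from this post:

    # Levelized capital cost per kWh as a function of capacity factor.
    annual_capital_cost = 4000.0 * 0.10      # $/kW-year, illustrative
    for cf in (0.90, 0.50, 0.20):            # baseload, mid-merit, peaking duty
        cents_per_kwh = 100 * annual_capital_cost / (8760 * cf)
        print(f"CF {cf:.0%}: {cents_per_kwh:.1f} cents/kWh in capital alone")
    # CF 90%: 5.1, CF 50%: 9.1, CF 20%: 22.8 -- nuclear has to run flat out.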

By Randall Parker 2007 December 12 10:35 PM  Energy Policy
Entry Permalink | Comments(5)
Genetically Engineered Organisms To Convert Cellulose To Diesel

MIT's Technology Review has placed David Berry of biotech energy start-up LS9 on a list of top 35 innovators for 2007 for an on-going attempt to genetically engineer organisms to feed on plant cellulose and produce gasoline or diesel fuel.

Berry took the lead in designing a system that allowed LS9 researchers to alter the metabolic machinery of microorganisms, turning them into living hydrocarbon refineries. He began with biochemical pathways that microbes use to convert glucose into energy-storing molecules called fatty acids. Working with LS9 scientists, he then plucked genes from various other organisms to create a system of metabolic modules that can be inserted into microbes; in different combinations, these modules induce the microbes to produce what are, for all practical purposes, the equivalents of crude oil, diesel, gasoline, or hydrocarbon-based industrial chemicals.

...

Nonetheless, LS9 has no products so far and many hurdles to surmount. Berry's system, for example, is designed to exploit glucose-based feedstocks such as cellulose. Berry says he is "agnostic" about what source of cellulose might drive the LS9 system on an industrial scale; he lists switchgrass, wood chips, poplar trees, and Miscanthus, a tall grass similar to sugarcane, as potential sources of biomass. But a cost-effective and efficient source of cellulose is one of the more significant bottlenecks in the production of any biofuel.

Producing gasoline or diesel has a lot of advantages over producing ethanol. Unlike ethanol both gasoline and diesel can be transported via pipelines. They also let you go much further between fill-ups than ethanol. So a reader asked whether this approach can work. I can't say for sure but the question should be considered in parts:

1) Can they genetically engineer the sorts of organisms that they want to genetically engineer to perform the way they want those organisms to perform?

2) Will the resulting process be cost competitive?

3) If they manage to create a cost competitive process will the result be a good thing?

I'm much more optimistic on the first point than on the second point. Worse, I'm more optimistic on the second point than on the third.

On the first point: Sure, with enough genetic engineering talent and time you can modify organisms to eat cellulose and produce diesel. Will this particular crew succeed? Hard to know.

But if they succeed in the genetic engineering task will the resulting fuel be cheap enough? They have going for them the rising cost of corn driven by both corn ethanol subsidies and rising world demand for food. But their approach starts with an inefficiency: They first grow plants to produce cellulose. Therefore some of the initial plant energy gets lost as they feed the cellulose to genetically engineered organisms. The conversion from cellulose to diesel fuel will have some inefficiency associated with it. Will the conversion process cost more than half the cellulose energy?

A larger fraction of the sun's energy would get converted to diesel fuel if they genetically engineered organisms to convert the sun's energy directly into diesel fuel rather than first into cellulose. But that approach would raise costs since the plants would then need to be grown in elaborate diesel fuel collection systems. Far easier to harvest existing trees, bushes, and grasses to get cellulose.

Whether the process turns out cost competitive depends on the cost of the cellulosic material, the efficiency of converting it into less oxidized hydrocarbons (out with the oxygens and in with the hydrogens), and the cost of vats and other equipment.

How much of the existing biomass out there in nature will get diverted to this purpose? The human footprint is already much too big and growing. The other species are already too squeezed.

I'm skeptical of this approach because it seems inefficient. Plants are inefficient converters of sunlight into chemical energy. Then there's the additional step of harvesting the plants, transporting them to vats, and using the cellulose to feed microorganisms, where part of the energy gets lost running the metabolism of those microorganisms.

Most estimates I've come across on the efficiency of conversion of light energy into chemical energy by plants end up with a conversion efficiency of 1% or less. Keep in mind that most photons are at frequencies that plant chloroplasts can't use. Plus, seasonal plants aren't even alive part of the year to absorb photons and convert them into chemical energy. According to this report sugarcane is the most efficient converter of light energy into chemical energy.

Tropical grasses that are C4 plants include sugarcane, maize, and crabgrass. In terms of photosynthetic efficiency, cultivated fields of sugarcane represent the pinnacle of light-harvesting efficiency. Approximately 8% of the incident light energy on a sugarcane field appears as chemical energy in the form of CO2 fixed into carbohydrate. This efficiency compares dramatically with the estimated photosynthetic efficiency of 0.2% for uncultivated plant areas. Research on photorespiration is actively pursued in hopes of enhancing the efficiency of agriculture by controlling this wasteful process. Only 1% of the 230,000 different plant species known are C4 plants; most are in hot climates.

Since the conversion efficiency of sunlight into cellulose chemical energy is so low in the first place, the harvesting, transportation, and conversion of cellulose to diesel or gasoline make a low initial efficiency even lower by the time the final usable chemical product comes out of the conversion process. That means that if we shift to biomass energy to push our vehicles around, more land must get shifted to providing energy for humans.
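
To make the compounding losses concrete, here is a minimal sketch of the sunlight-to-diesel chain. The 1% photosynthesis figure comes from the estimates cited above; the harvest and conversion fractions are my guesses, not LS9 data:

    # Rough sunlight-to-diesel efficiency chain (illustrative numbers).
    photosynthesis = 0.01        # sunlight -> cellulose, per estimates above
    harvest_transport = 0.90     # assumed fraction kept after harvest/hauling
    microbial_conversion = 0.50  # assumed cellulose -> diesel efficiency
    overall = photosynthesis * harvest_transport * microbial_conversion
    print(f"{overall:.2%} of incident sunlight ends up as fuel")  # 0.45%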

By Randall Parker 2007 December 12 10:22 PM  Energy Biomass
Entry Permalink | Comments(2)
One Day Per Month Fasting Good For Arteries

Fasting for one day each month substantially cuts artery disease risk.

Mormons have less heart disease — something doctors have long chalked up to their religion's ban on smoking. New research suggests that another of their "clean living" habits also may be helping their hearts: fasting for one day each month.

A study in Utah, where the Church of Jesus Christ of Latter-Day Saints is based, found that people who skipped meals once a month were about 40 percent less likely to be diagnosed with clogged arteries than those who did not regularly fast.

Thanks to James Bowery for the tip.

Even non-Mormons benefited.

Though more than 90% of the people studied were Mormons, the findings held true even in those who had a different religious preference, says Benjamin D. Horne, PhD, director of cardiovascular and genetic epidemiology at Intermountain Medical Center in Salt Lake City.

Yes friends, even you can benefit. Step right up and choose your day to go hungry. I'm thinking the fasting day needs to be a day where there's a whole lot of constant distraction so you don't have to spend the day thinking how hungry you are. Maybe we need fasting amusement parks where you spend all day riding roller coasters.

The research was conducted at LDS Hospital using the Intermountain Heart Collaborative Study registry, made up of patients who had heart angiography between 1994 and 2002. The researchers focused on patients who are LDS to see if other church-dictated practices besides not smoking had an impact, said Benjamin Horne, director of cardiovascular and genetic epidemiology, now at the new Intermountain Medical Center. He's also an adjunct professor at the University of Utah.

Researchers looked at data from more than 4,600 people, average age 64, who had come through the cardiac cath lab, to see the degree of risk for someone who was LDS compared to others. They focused on those with obvious coronary artery disease (CAD), defined as 70 percent narrowing or blockage in at least one artery, and those who had little or no CAD (less than 10 percent narrowing). They found that while 66 percent of others had CAD, only 61 percent of LDS members did.

I am going to found a church which preaches a high vegetable, high fruit, and low glycemic index diet. Maybe call it the SENS Church, for Strategies for Engineered Negligible Senescence. I need to find a barrel to look into to receive divine messages. What are my prospects for success?

SENS believers, you've got to suffer hunger pangs once a month to survive until the redemption of rejuvenation therapies. Once we receive the rejuvenation therapies we will enter the promised land of harmless daily Roman-style feasts.

By Randall Parker 2007 December 12 08:06 PM  Aging Diet Studies
Entry Permalink | Comments(8)
2007 December 10 Monday
Cochran And Harpending See Human Evolution Acceleration

Evolutionary theorist Greg Cochran and genetic anthropologist Henry Harpending have teamed up again, this time with John Hawks, Eric Wang, and Robert Moyzis, to argue that human evolution has greatly accelerated in the last 10,000 years and that the human race is diverging.

Dec. 10, 2007 - Researchers discovered genetic evidence that human evolution is speeding up - and has not halted or proceeded at a constant rate, as had been thought - indicating that humans on different continents are becoming increasingly different.
 
"We used a new genomic technology to show that humans are evolving rapidly, and that the pace of change has accelerated a lot in the last 40,000 years, especially since the end of the Ice Age roughly 10,000 years ago," says research team leader Henry Harpending, a distinguished professor of anthropology at the University of Utah.
 
Harpending says there are provocative implications from the study, published online Monday, Dec. 10 in the journal Proceedings of the National Academy of Sciences:

  • "We aren't the same as people even 1,000 or 2,000 years ago," he says, which may explain, for example, part of the difference between Viking invaders and their peaceful Swedish descendants. "The dogma has been these are cultural fluctuations, but almost any Temperament trait you look at is under strong genetic influence."

  • "Human races are evolving away from each other," Harpending says. "Genes are evolving fast in Europe, Asia and Africa, but almost all of these are unique to their continent of origin. We are getting less alike, not merging into a single, mixed humanity." He says that is happening because humans dispersed from Africa to other regions 40,000 years ago, "and there has not been much flow of genes between the regions since then." 

"Our study denies the widely held assumption or belief that modern humans [those who widely adopted advanced tools and art] appeared 40,000 years ago, have not changed since and that we are all pretty much the same. We show that humans are changing relatively rapidly on a scale of centuries to millennia, and that these changes are different in different continental groups."
 
The increase in human population from millions to billions in the last 10,000 years accelerated the rate of evolution because "we were in new environments to which we needed to adapt," Harpending adds. "And with a larger population, more mutations occurred."
 
Study co-author Gregory M. Cochran says: "History looks more and more like a science fiction novel in which mutants repeatedly arose and displaced normal humans - sometimes quietly, by surviving starvation and disease better, sometimes as a conquering horde. And we are those mutants."
 
Harpending conducted the study with Cochran, a New Mexico physicist, self-taught evolutionary biologist and adjunct professor of anthropology at the University of Utah; anthropologist John Hawks, a former Utah postdoctoral researcher now at the University of Wisconsin, Madison; geneticist Eric Wang of Affymetrix, Inc. in Santa Clara, Calif.; and biochemist Robert Moyzis of the University of California, Irvine.

Using data from the International Haplotype Map Project on single nucleotide polymorphisms (SNPs, which are single-letter genetic differences):

Harpending and colleagues used a computer to scan the data for chromosome segments that had identical SNP patterns and thus had not broken and recombined, meaning they evolved recently. They also calculated how recently the genes evolved. A key finding: 7 percent of human genes are undergoing rapid, recent evolution.

So we are becoming less alike due to adaptations to local environments. My guess is this trend will accelerate when offspring genetic engineering becomes possible. People in different cultures, religions, climates, occupations, social classes, and regulatory environments will make different decisions on which genetic variations to give their offspring. As a result groups will become less alike. Some groups will choose genes that enhance analytical ability and mathematical skills. Some will emphasize genes that boost ambition and perhaps even ruthlessness. Others will go for genetic variations that increase moral motivation and spirituality.

Update: The divergence of human genomes is a result of growing human populations moving into lots of different habitats that each exert different selective pressures. The selective pressures operated on immune systems, musculature, hair, skin, brains, and many other aspects of human shape and physiology.

It says something about the adaptive value of specific temperaments to specific habitats that these researchers report big selective pressures on genes that control temperament. That makes sense if you think about it intuitively. A hunter probably needs a different temperament than a goat herder (who experiences a lot of solitude) who needs a different temperament than a merchant (who interacts with many other humans and needs to enjoy sizing them up quickly). Some tasks are far more cognitively demanding than others. Some tasks require much more hand-eye coordination or better balance or more strength or endurance. Humans working at different tasks to survive in different environments got selected to be shorter or taller, better sprinters or better long distance runners, more muscular or fatty or skinny, and many other attributes. This is akin to specialization of labor.

See the John Hawks introduction to the paper on his blog. Also see his "Acceleration rarely asked questions" about the research. Hawks says the selective pressures acting on human genomes have been so strong in recent history that the signal they are measuring is larger than the biases one might expect would make the data hard to interpret.

In the earliest studies, when people were finding that 3 or 4 percent of a sample of genes had signs of recent selection, those numbers were already extremely high. They got even higher, as more and more powerful methods of detecting selection came online. Our current estimate is the highest yet, but even this very high number is perfectly consistent with theoretical predictions coming from human population numbers.

At one level, the mathematical answer is as simple as "more people means more mutations." But more deeply, we can predict a linear response of new selected alleles to population size, and we can model this response with respect to a particular frequency range. The genome is a complicated place -- with different mutations originating at different times, selected at different strengths, consequently with different fixation probabilities and different current frequencies. For some reason, nobody really tried to describe this mathematically before.

Now, our model is extremely simple -- it can be challenged on several specific bases. For instance, population increase was not a simple exponential -- it grew in fits and starts, with some significant crashes. The average strength of selected mutations probably changed over time, and the distribution of the strength of selection may have departed from our assumptions. Even the adaptive mutation rate may have changed over time.

Still, the general prediction is quite clear: the population has grown, its conditions of existence have changed, and as a result selection on new mutations should have accelerated. And the observed data fit our theoretical prediction exceptionally well. Certainly we could do better if we made a more detailed model, and we will be doing some of that in future papers. But mathematical simplicity has a great virtue: we can see precisely why human historical changes should have accelerated this aspect of our evolution, and we can see the magnitude of the response. That magnitude greatly outweighs all potential biases.
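
The "more people means more mutations" point reduces to simple arithmetic. A minimal sketch of the standard population-genetics reasoning — the mutation rate and selection coefficient below are placeholder values, not numbers from the paper:

    # New copies of a mutation per generation scale with 2N gene copies,
    # and a new beneficial allele escapes drift with probability ~2s
    # (Haldane), so the influx of new adaptive alleles is linear in N.
    def adaptive_alleles_per_gen(n, mu_beneficial=1e-9, s=0.01):
        return (2 * n * mu_beneficial) * (2 * s)

    for n in (1e6, 60e6, 6e9):   # rough Ice Age, Bronze Age, present-day sizes
        print(f"N = {n:.0e}: {adaptive_alleles_per_gen(n):.1e} per locus per gen")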

Go read the full Hawks post. It is all worth reading.

Razib discusses the research and surveys the media reaction to this important report.

March 2009 Update: This work has since become the basis for an excellent book by Cochran and Harpending entitled The 10,000 Year Explosion: How Civilization Accelerated Human Evolution.

By Randall Parker 2007 December 10 10:34 PM  Trends, Human Evolution
Entry Permalink | Comments(20)
2007 December 09 Sunday
British Government Decides On Massive Wind Farms

The Labour government of Gordon Brown has decided to strengthen their green bona fides in a big way.

Britain is to embark on a wind power revolution that will produce enough electricity to power every home in the country, ministers will reveal tomorrow.

The Independent on Sunday has learnt that, in an astonishing U-turn, the Secretary of State for Business, John Hutton, will announce that he is opening up the seas around Britain to wind farms in the biggest ever renewable energy initiative. Only weeks ago he was resisting a major expansion of renewable sources, on the grounds that it would interfere with plans to build new nuclear power stations.

But what will it cost?

Combined with almost 1 GW of existing capacity the proposed and planned wind farms will add up to about 34 GW of capacity.

Mr Hutton's announcement, which will be made at a conference in Berlin tomorrow, will identify sites in British waters for enough wind farms to produce 25 gigawatts (GW) of electricity by 2020, in addition to the 8GW already planned – enough to meet the needs of all the country's homes.

But since this uses wind that does not always blow, are they talking about max output? If so, then assuming 32% average operating capacity (guessing based on reports about existing wind farms) a more reasonable output estimate would be maybe 11 GW. They could accomplish the same goal of avoiding carbon dioxide emissions by building 8 GE ESBWR nuclear reactors (assuming 90% uptime). I wonder whether 7000 wind turbines, deep ocean towers, and cables to bring the power to shore will cost more or less than 8 nukes. Also, the wind towers will require a lot of gas or coal fired back-up electric power plants for when the wind does not blow. That's an added cost the nukes wouldn't have.
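
The arithmetic behind that comparison, as a minimal sketch — the 32% wind capacity factor is my guess from above, and the ~1.55 GW net rating per ESBWR is my assumption, since only the reactor count is given:

    # Average (not nameplate) output of the two options.
    wind_avg_gw = 34 * 0.32        # 34 GW offshore wind at 32% capacity factor
    nuke_avg_gw = 8 * 1.55 * 0.90  # 8 ESBWRs, ~1.55 GW each, 90% uptime
    print(round(wind_avg_gw, 1), round(nuke_avg_gw, 1))  # 10.9 vs 11.2 GW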

You might expect me to think this proposal is dumb because politicians didn't realistically weigh costs and nuclear power might be cheaper. But I'm looking at a bigger picture: Even the second cheapest substitute for fossil fuels for generating electricity is still an improvement over using fossil fuels to generate electricity. Now some of you who are more skeptical about global warming are thinking I've gone soft and sentimental. Not to worry. I'm still really worried about Peak Oil and I'm thinking more and more that we need to reserve natural gas and coal for transportation, fertilizer, and plastics. That'll still leave some libertarians among you unsatisfied. But sorry, I think a world of sovereign national oil companies in control of most of the remaining oil and hiding their real reserves is not a very efficient market. Plus, I think the market is making a massive mistake on energy.

To put that 34 GW number in perspective, the United States currently has 13 GW of installed wind capacity. The US had only half that capacity 4 years ago. So a tripling of capacity before 2020 seems quite possible and perhaps even likely. I'm not sure Britain will ever become the biggest producer of wind power. Right now Texas alone exceeds Britain in wind energy production.

The Scots don't want wind turbines they can see.

Up to 7,000 turbines could be installed off the UK's coastline in a bid to boost the production of wind energy 30-fold by 2020. The plans are likely to see a huge increase in wind farms off the coast of Scotland, although plans to situate new farms within 12 miles of the Scottish shore have been shelved.

Instead, the new farms will most likely be in deep-water locations up to 200 nautical miles offshore.

There's a growing movement in Britain against land-based and near shoreline ocean-based wind towers. The opponents share my esthetic reaction. Wind towers might be neat to go look at in a few places. But I want most countryside to remain more natural looking.

The British government might also pursue tidal power.

Mr Brown and his environment secretary, Hilary Benn, are expected to announce a range of measures including a tighter renewables obligation on electricity companies, a commitment to the Severn tidal barrage and an offshore Thames estuary wind farm capable of supplying a quarter of London's electricity with 341 wind turbines.

The UK's outstanding tidal resources could provide at least 10% of the country's electricity, the government's sustainable development commission has insisted.

What I'd like to know: So then is Brown's government going to abandon their flirtation with a revival of nuclear power? Or are they going to do nuclear and wind? If they do both they could save future dwindling supplies of natural gas for other uses.

By Randall Parker 2007 December 09 11:41 PM  Energy Wind
Entry Permalink | Comments(6)
New York Times Notices Oil Export Land Model Problem

When a tree falls in a forest where no human will hear it, does it make a sound? Yes, but some like to pretend otherwise in order to make the point that without observers sounds might as well be absent. Well, previously I've highlighted work by Peak Oil theorists "Khebab" and "westexas" on how rapidly rising internal consumption is going to cut oil exports by big oil exporters. But the mainstream media hasn't paid much attention to this problem until now. So the writings of Peak Oil theorists have until now resembled trees falling in empty forests. Finally the trees are falling within earshot of people who matter. The New York Times has a story entitled Oil-Rich Nations Use More Energy, Cutting Exports:

The economies of many big oil-exporting countries are growing so fast that their need for energy within their borders is crimping how much they can sell abroad, adding new strains to the global oil market.

That crimping is going to get much worse.

Experts say the sharp growth, if it continues, means several of the world’s most important suppliers may need to start importing oil within a decade to power all the new cars, houses and businesses they are buying and creating with their oil wealth.

I like the "if it continues". Well, okay, we could get hit by a massive asteroid next year, wiping out the human race and ending the trend of domestic oil consumption growth by big oil producers. Or aliens might land and give us technology for making fusion energy workable for cheap. Or aliens might attack us and wipe us out. So the "if" part can be defended. But I think it safe to say those are pretty low probability events (and anyone with good outer space alien contacts please correct me in the comments).

We are effectively already at Peak Oil for the non-exporting countries. Think I'm just some crazy extreme loon nutjob? Well, maybe. But I've got the company of 7 NY Times reporters who contributed to their story:

Internal oil consumption by the five biggest oil exporters — Saudi Arabia, Russia, Norway, Iran and the United Arab Emirates — grew 5.9 percent in 2006 over 2005, according to government data. Exports declined more than 3 percent. By contrast, oil demand is essentially flat in the United States.

Cheap prices have been driving consumption increases in oil producing nations.

Saudis, Iranians and Iraqis pay 30 to 50 cents a gallon for gasoline. Venezuelans pay 7 cents, and demand is projected to rise as much as 10 percent this year.

Fatih Birol of the International Energy Agency says between now and 2015 world oil production might increase 1.1 million barrels a day. That increase will get eaten up in producer countries while demand from India and China will grow by large amounts. The IEA is assuming 25 million barrels per day (mbd) of new production to offset declines of 23.9 mbd in existing fields. But that decline rate is probably optimistic. Some of the OPEC countries are hiding their real capabilities and painting an excessively rosy picture.
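
Exports are the difference between two curves, which is why the Export Land Model predicts they collapse far faster than production does. A minimal sketch using illustrative rates of the sort Khebab and westexas use in their hypothetical "Export Land" (not any specific country's data):

    # Export Land Model: exports = production - domestic consumption.
    production, consumption = 2.0, 1.0   # mbd, illustrative starting point
    year = 0
    while production > consumption:
        print(year, round(production - consumption, 2), "mbd exported")
        production *= 0.95       # 5%/yr production decline
        consumption *= 1.025     # 2.5%/yr domestic consumption growth
        year += 1
    # By year 10 production is down ~40%, but exports are down 100%.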

Oil consumption in the United States and other Western industrial countries will start declining before world oil production starts declining. Reduced exports by big producers will combine with increasing demand from India and China to push up oil prices and cut US, European, Australian, Canadian, and Japanese demand.

What should we do about it? Electrify everything. We do not face a shortage of fuels for generating electricity. To the extent that any activity can be shifted over to electrical power we need to find ways to do it. Liquid fuels are too valuable to be wasted on, for example, heating homes and commercial buildings. Oil for heating should be replaced with electrically driven ground source heat pumps which will actually lower the cost of heating. Vehicles that go shorter distances should be shifted to electric motors and battery power. We need better batteries to power vehicles for longer distances.

Update: Oil geologist Jeffrey "westexas" Brown was consulted by the NY Times writer who wrote their story.

Cliff and I had several conversations, but I am in no way taking any kind of credit for this story. This is his work, and I think that he did a very good job. I don't know what kind of discussions went on behind the scenes at the Times, but my guess is that trying to discuss the mathematical models of future exports was too complicated for an introductory article, and perhaps too scary.

My only real complaint is that I think that the MSM guys should reference the fact that Yergin's price and production projections have so far been way off the mark.

The NY Times should follow up with articles about expected rates of export decline for various oil producers.

Update II: Also see my post Wall Street Journal Takes Peak Oil Seriously. That's another sign that Peak Oil problems are entering mainstream discussion.

Update III: We need to build up nuclear plants and wind towers rapidly so that we can stop using natural gas and coal to generate electricity. This would free up the natural gas for fleet vehicle fuel and the coal for conversion into liquid fuels to power vehicles as well. This is not politically possible yet because Peak Oil is not yet an accepted event. Once it becomes accepted conventional wisdom remember that we need to reserve fossil fuels for transportation and plastics. Wasting them on heat and electric power generation is stupid.

By Randall Parker 2007 December 09 12:40 PM  Energy Fossil Fuels
Entry Permalink | Comments(19)
2007 December 07 Friday
Pebble Bed And Other Gen IV Nuclear Reactor Designs

An article in Popular Mechanics examines generation IV nuclear reactor designs now under development and reports that pebble bed reactor designs look likely to get built before other Gen IV designs.

Kevan Weaver, like most of the lab's 3500 employees, works in a sprawling group of campus-like buildings on the outskirts of Idaho Falls. Standing in his third-floor office, the fresh-faced nuclear engineer holds what could be the future of nuclear power in his hand: a smooth graphite sphere about the size of a tennis ball. It could take years to weigh the pros and cons of all six Gen IV designs, Weaver says, but Congress can't wait that long. In addition to replacing the aging fleet of Generation II reactors, the government wants to make progress on another front: the production of hydrogen, to fuel the dream of exhaust-free cars running independent of foreign oil.

As a result, the frontrunner for the initial $1.25 billion demonstration plant in Idaho is a helium-cooled, graphite-moderated reactor whose extremely high outlet temperature (1650 to 1830 F) would be ideal for efficiently producing hydrogen. There are a couple of designs that could run that hot, but the “pebble bed,” so named for the fuel pebble that Weaver holds, is attracting particularly intense interest.

A typical pebble-bed reactor would function somewhat like a giant gumball machine. The design calls for a core filled with about 360,000 of these fuel pebbles--"kernels" of uranium oxide wrapped in two layers of silicon carbide and one layer of pyrolytic carbon, and embedded in a graphite shell. Each day about 3000 pebbles are removed from the bottom as fuel becomes spent. Fresh pebbles are added to the top, eliminating the need to shut down the reactor for refueling. Helium gas flows through the spaces between the spheres, carrying away the heat of the reacting fuel. This hot gas--which is inert, so a leak wouldn't be radioactive--can then be used to spin a turbine to generate electricity, or serve more exotic uses such as produce hydrogen, refine shale oil or desalinate water.
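
One number implicit in that description: with about 360,000 pebbles in the core and about 3,000 discharged per day, a pebble spends roughly 120 days per trip through the core (in practice pebbles get recirculated several times before being fully spent — a detail not in the excerpt):

    core_pebbles = 360_000
    pebbles_out_per_day = 3_000
    print(core_pebbles / pebbles_out_per_day, "days per pass")  # 120.0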

The ability to make hydrogen more efficiently will only matter if and when we find better ways to store hydrogen. Since Gen IV reactor designs are easily a decade away from initial use in commercial reactor construction, better methods for storing hydrogen may become available by then.

The biggest promise of pebble bed is much more rapid construction. A substantial part of the cost of nuclear power is the interest cost of reactors when they are only partially constructed. If a reactor takes 5 years to build then the portion of the cost spent in the first year doesn't start earning back on its investment for over 4 years. That period during which the capital equipment is sitting idle while the rest of the plant gets constructed makes nuclear power far more expensive.
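
A minimal sketch of that idle-capital penalty, under assumed terms — a $4 billion overnight cost, spending spread evenly across the build, and an 8% cost of capital, none of which come from the article:

    # Interest accrued on construction capital before the plant earns anything.
    def interest_during_construction(total_busd, years, rate=0.08):
        tranche = total_busd / years  # spent mid-year, compounds to completion
        return sum(tranche * ((1 + rate) ** (years - t - 0.5) - 1)
                   for t in range(years))

    print(round(interest_during_construction(4.0, 5), 2))  # ~0.82 ($ billions)
    print(round(interest_during_construction(4.0, 2), 2))  # ~0.32
    # Cutting the build from 5 years to 2 cuts financing costs roughly 60%.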

The article claims a new demonstration reactor build decision won't be taken until 2014. So the development of new nuclear reactor designs seems really slow. Does it have to take that long?

By Randall Parker 2007 December 07 12:00 AM  Energy Nuclear
Entry Permalink | Comments(11)
2007 December 06 Thursday
Higher Diabetes Risk With Too Much Or Little Sleep

Don't get too much or too little sleep.

The study, authored by James E. Gangwisch, PhD, of Columbia University in New York, explored the relationship between sleep duration and the diagnosis of diabetes over an eight-to-10-year follow-up period between 1982 and 1992 among 8,992 subjects who participated in the Epidemiologic Follow-Up Studies of the first National Health and Nutrition Examination Survey. The subjects’ ages ranged from 32 to 86 years.

According to the results, subjects who reported sleeping five or fewer hours and subjects who reported sleeping nine or more hours were significantly more likely to have incident diabetes over the follow-up period than were subjects who reported sleeping seven hours, even after adjusting for variables such as physical activity, depression, alcohol consumption, ethnicity, education, marital status, age, obesity and history of hypertension.

The effect of short sleep duration on diabetes incidence is likely to be related in part to the influence of short sleep duration upon body weight and hypertension, said Dr. Gangwisch. Experimental studies have shown sleep deprivation to decrease glucose tolerance and compromise insulin sensitivity by increasing sympathetic nervous system activity, raising evening cortisol levels and decreasing cerebral glucose utilization. The increased burden on the pancreas from insulin resistance can, over time, compromise β-cell function and lead to type 2 diabetes, warned Dr. Gangwisch.

Too little sleep accelerates your aging.

Knowledge about how to slow your aging only helps if you act on it. Anyone going to change their sleep habits as a result of reading this?

By Randall Parker 2007 December 06 08:49 PM  Aging Studies
Entry Permalink | Comments(2)
Biomass Energy Push Making Diets Less Healthy?

Want another argument against biomass energy? It will make vegetables more expensive. Lower calorie foods such as vegetables are generally healthier and yet their prices are rising most rapidly.

As food prices rise, the costs of lower-calorie foods are rising the fastest, according to a University of Washington study appearing in the December issue of the Journal of the American Dietetic Association. As the prices of fresh fruit and vegetables and other low-calorie foods have jumped nearly 20 percent in the past two years, the UW researchers say, a nutritious diet may be moving out of the reach of some American consumers.

UW researchers Dr. Adam Drewnowski, director of the Center for Public Health Nutrition, and Dr. Pablo Monsivais, a research fellow in the center, studied food prices at grocery stores around the Seattle area in 2004. They found that the foods which are less energy-dense -- generally fresh fruits and vegetables -- are much more expensive per calorie than energy-dense foods -- such as those high in refined grains, added sugars, and added fats.

When the researchers surveyed prices again in 2006, they found that the disparity in food prices only worsened with time. Lower-calorie foods jumped in price by about 19.5 percent in that two-year period, while the prices of very calorie-rich foods stayed stable or even dropped slightly, the researchers found. The general rate of food price inflation in the United States was about 5 percent during that period, according to the U.S. Department of Labor.

"That the cost of healthful foods is outpacing inflation is a major problem," said Drewnowski. "The gap between what we say people should eat and what they can afford is becoming unacceptably wide. If grains, sugars and fats are the only affordable foods left, how are we to handle the obesity epidemic""

The demand for land to grow grains will squeeze out the growth of vegetables. Industrializing Asians and affluent people driving big SUVs are both pushing up the costs of fruits and vegetables. This is happening both due to rising affluence and the big push for corn ethanol and other biomass sources of energy.

World cereal and energy prices are linked according to the International Food Policy Research Institute.

World cereal and energy prices are becoming increasingly linked. Since 2000, the prices of wheat and petroleum have tripled, while the prices of corn and rice have almost doubled (Figure 6). The impact of cereal price increases on food-insecure and poor households is already quite dramatic. For every 1-percent increase in the price of food, food consumption expenditure in developing countries decreases by 0.75 percent (Regmi et al. 2001). Faced with higher prices, the poor switch to foods that have lower nutritional value and lack important micronutrients.
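
Plugging the quoted elasticity into the UW price data above gives a feel for the squeeze. A minimal sketch — the pairing of the two studies' numbers is mine, and linear extrapolation is rough for a move this large:

    # A 1% food price rise cuts food expenditure 0.75% (Regmi et al. 2001).
    elasticity = -0.75
    produce_price_rise = 0.195   # ~19.5% two-year rise in low-calorie foods
    print(f"{elasticity * produce_price_rise:.1%}")   # about -14.6%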

Birds and lions and tigers and bears (oh my) are all getting squeezed out of habitats by population growth, industrialization, oil reserves depletion, and the push for biomass energy. We have too many people, worsening resource limitations, and politicians who are compounding the problem with dumb energy policies aimed at raising incomes of farmers first and foremost. We need fewer babies, more nuclear power, and some breakthroughs in the cost of photovoltaics.

By Randall Parker 2007 December 06 12:07 AM  Energy Biomass
Entry Permalink | Comments(3)
2007 December 05 Wednesday
Our Brains Become Less Synchronized As We Age And This Is Bad

Out of all the aspects of aging I hate brain aging most of all. My brain is who I am. I do not want to lose the intellectual abilities I currently possess. In fact, I want more brain power, not less. Well, a group of researchers at Harvard, Washington University in St. Louis, and the University of Michigan have found that as we age different parts of the brain become less in sync with each other.

The researchers assessed brain function in a sample of adults ranging in age from 18 to 93 and comprising 38 young adults and 55 older adults. They did so using functional magnetic resonance imaging (fMRI), which uses harmless radio waves and magnetic fields to measure blood flow in brain regions, which in turn reflects activity.

To assess the integrity of functional connections between brain areas, the researchers used fMRI to measure spontaneous low-frequency fluctuations known to reflect the activity of such connections. The researchers concentrated on large-scale connections between frontal and posterior brain regions that are associated with high-level cognitive functions such as learning and remembering.

The researchers reported a “dramatic reduction” in functional connections when they compared the younger and older groups.

I do not want to undergo a “dramatic reduction” in my brain's functional connections. If anyone doubts the desirability of development of rejuvenation therapies ask yourself whether you want to gradually lose parts of your brain and to become less able to think as you age. What aging costs you is far more than a less pretty appearance or a reduction in athletic ability. You aren't just gaining more aches and pains. You are losing parts of your mind.

The researchers also used an MRI technique called “diffusion tensor imaging” to measure the integrity of white matter in the brains of the subjects. This technique reveals details of the structure of brain tissue. Their analysis revealed that the reduced functional connection they detected in brain areas of the older subjects was correlated with decreased white matter integrity.

When the researchers tested the subjects’ cognitive function, they found that “Those individuals exhibiting the lowest functional correlation also exhibited the poorest cognitive test scores.”

The cognitive test score results tend to validate the use of the MRI techniques to measure brain conditions. Granted measured levels of brain activity are open to interpretation. But note that they didn't just measure brain activity. They measured quantity of white matter and connections.
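
For readers wondering what a "functional connection" is operationally: it is typically just the correlation between two regions' slow BOLD signal fluctuations. A minimal sketch, assuming you already have motion-corrected, band-pass-filtered time series for a frontal and a posterior region (the toy data below only illustrates the computation):

    import numpy as np

    def functional_connectivity(region_a, region_b):
        # Pearson correlation of two regions' low-frequency BOLD series;
        # values near 1.0 mean the regions fluctuate in sync.
        return np.corrcoef(region_a, region_b)[0, 1]

    rng = np.random.default_rng(0)
    shared = rng.standard_normal(200)              # a common slow fluctuation
    in_sync = functional_connectivity(shared + 0.3 * rng.standard_normal(200),
                                      shared + 0.3 * rng.standard_normal(200))
    out_of_sync = functional_connectivity(rng.standard_normal(200),
                                          rng.standard_normal(200))
    print(round(in_sync, 2), round(out_of_sync, 2))  # e.g. ~0.9 vs ~0.0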

I want my brain to stay in sync.

Among the older individuals, some of the subjects’ brains systems were correlated, and older individuals that performed better on psychometric tests were more likely to have brain systems that were in sync. These psychometric tests, administered in addition to the fMRI scanning, measured memory ability, processing speed and executive function.

Among older individuals whose brain systems did not correlate, all of the systems were not affected in the same way. Different systems process different kinds of information, including the attention system, used to pay attention, and the default system, used when the mind is wandering. The default system was most severely disrupted with age. Some systems do remain intact; for example, the visual system was very well preserved. The study also showed that the white matter of the brain, which connects the different regions of the brain, begins to lose integrity with age.

My guess is the decay in white matter is the cause of the decay in linkage between the parts of the brain.

The researchers used PET scans to measure amyloid plaque build-up in order to screen out people who were developing Alzheimer's Disease. They wanted to see how aging changes the brains of people who are not developing Alzheimer's. So these results apply to us as we age even if we don't develop Alzheimer's.

The back and front of the aging brain become less well linked.

They focused on the links within two critical networks, one responsible for processing information from the outside world and one, known as the default network, which is more internal and kicks in when we muse to ourselves. For example, the default network is presumed to depend on two regions of the brain linked by long-range white matter pathways. The new study revealed a dramatic difference in these regions between young and old subjects. “We found that in young adults, the front of the brain was pretty well in sync with the back of the brain,” said Andrews-Hanna. “In older adults this was not the case. The regions became out of sync and they were less correlated with each other.” Interestingly, the older adults with normal, high correlations performed better on cognitive tests.

We need rejuvenation therapies. But first we need a society-wide awareness and acceptance of potential and need to develop rejuvenation therapies. Our minds are at stake.

By Randall Parker 2007 December 05 11:40 PM  Brain Aging
Entry Permalink | Comments(12)
2007 December 04 Tuesday
Mice With Increased Energy Burn Live Longer

Genetic engineering of a mitochondrial gene in mice to generate more heat causes the mice to live longer.

By making the skeletal muscles of mice use energy less efficiently, researchers report in the December issue of Cell Metabolism, a publication of Cell Press, that they have delayed the animals’ deaths and their development of age-related diseases, including vascular disease, obesity, and one form of cancer. Those health benefits, driven by an increased metabolic rate, appear to come without any direct influence on the aging process itself, according to the researchers.

The mitochondria powering the mouse muscles were made inefficient by increasing the activity of so-called uncoupling protein 1 (UCP1). UCP1 disrupts the transfer of electrons from food to oxygen, a process known as mitochondrial respiration, which normally yields the energy transport molecule ATP. Instead, the energy is lost as heat.

“When you make the mitochondria inefficient, the muscles burn more calories,” a metabolic increase that could be at least a partial substitute for exercise, said Clay Semenkovich of Washington University School of Medicine in St. Louis. “There are a couple of ways to treat obesity and related diseases,” he continued. “You can eat less, but that’s unpopular, or you could eat what you want as these animals did and introduce an altered physiology. It’s a fundamentally different way of addressing the problem.”

This result suggests that the development of drugs to cause the same effect in humans might increase human longevity.

This genetic alteration produced many beneficial effects.

In the new study, Semenkovich’s group used these mice to determine whether respiratory uncoupling in skeletal muscle—a tissue that adapts to altered heat production and oxygen consumption during exercise—can affect age-related disease. They found that animals with increased UCP1 only in skeletal muscle lived longer. Altered female animals also developed lymphoma, a type of cancer that originates in white blood cells called lymphocytes, less frequently. In mice genetically predisposed to vascular disease, the increase in UCP1 led to a decline in atherosclerosis in animals fed a “western-type” high-fat diet. Likewise, mice predisposed to developing diabetes and hypertension were relieved of those ailments by increased UCP1 in skeletal muscle. The “uncoupled mice” also had less body fat (or adiposity) and higher body temperatures and metabolic rates, among other biochemical changes.

I would rather have a version of UCP1 that I could switch between different levels of efficiency. Before going on a hike or after an accident or natural disaster it might make sense to shift UCP1 into a more efficient form. Basically, burn off excess energy when you can afford to do so but put your body into a high efficiency mode of operation when the need arises.

The development of drugs that reduce appetite should eventually reduce the benefit of turning UCP1 into a less efficient form. No need to burn off excess sugars and fats if you can make your brain not crave calories in the first place.

By Randall Parker 2007 December 04 11:57 PM  Aging Genetics
Entry Permalink | Comments(6)
People Who Imagine Body Distortions Too Analytical With Images

People with body dysmorphic disorder (BDD) see their bodies as more disfigured and ugly than they really are. Some BDD sufferers disfigure themselves with pointless plastic surgery. Okay, so what's with them? BDD sufferers look at images with more activity in the analytical left side of their brains.

For the first time, functional magnetic resonance imaging (fMRI) was used to reveal how the patients’ brains processed visual input. The UCLA team outfitted 12 BDD patients with special goggles that enabled them to view digital photos of various faces as they underwent a brain scan.

Each volunteer viewed three types of images. The first type was an untouched photo. The second type was a photo altered to eliminate facial details that appear frequently, such as freckles, wrinkles and scars. This “low frequency” technique blurred the final image.

The third type of image essentially subtracted the blurred second image from the untouched photo. This “high frequency” technique resulted in a finely detailed line drawing.

Feusner’s team compared the BDD patients’ responses to 12 control subjects matched by age, gender, education and handedness. What the scientists observed surprised them.

“We saw a clear difference in how the right and left sides of the brain worked in people with BDD versus those without the disorder,” noted Feusner.

There are situations where being really analytical will get you into trouble.

BDD patients more often used their brain’s left side -- the analytic side attuned to complex detail -- even when processing the less intricate, low-frequency images. In contrast, the left sides of the control subjects’ brains activated only to interpret the more detailed high-frequency information. Their brains processed the untouched and low-frequency images on the right side, which is geared toward seeing things in their entirety.

“We don’t know why BDD patients analyze all faces as if they are high frequency,” said Feusner. “The findings suggest that BDD brains are programmed to extract details -- or fill them in where they don’t exist. It’s possible they are thinking of their own face even when they are looking at others.”

There's the capability to do analysis. Separately, there's the question of which stimuli in your environment your mind focuses that analysis on. If you focus your processing on the wrong stimuli you can become pretty dysfunctional.

But does this tendency to focus analysis on facial and body shapes ever work to the benefit of some people? Do any painters or filmmakers do better jobs because their minds intensely analyze body shapes? Is the real problem with BDD the focus on one's own body rather than on the bodies of other people? I suspect so.

By Randall Parker 2007 December 04 12:02 AM  Brain Disorders
Entry Permalink | Comments(3)
2007 December 02 Sunday
Implanted Microchip Dispenses Drugs

Some day most people will have microelectromechanical systems (MEMS) drug-dispensing devices embedded in them.

After nearly a decade of working on microelectromechanical systems (MEMS) for medical implants, a startup based in Bedford, MA, called MicroChips has prototypes for its first commercial products. By the beginning of January, the company plans to start animal trials of a device for healing bones damaged by osteoporosis. In a year and a half, it hopes to begin human trials on an implant for monitoring glucose levels in diabetics.

The first product, a device for delivering an anti-osteoporosis drug automatically, could allow patients to replace 500 daily injections with a single outpatient implant procedure. The glucose sensor, by continuously monitoring glucose levels, could reveal spikes in blood-sugar levels that go undetected using conventional sensors. Such spikes, if not treated, can contribute to organ damage, including blindness.

MEMS devices will become fancier with more built-in sensors and more elaborate algorithms for dispensing drugs. Eventually external instructions sent via radio signals will be able to control drug dispensing.
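Here is a toy sketch of the kind of sense-and-dispense loop such an implant might eventually run. The glucose threshold, dose accounting, and remote-hold flag below are all hypothetical illustrations, not MicroChips' actual algorithm or API.

    # Toy closed-loop dispensing controller for an implanted MEMS device.
    HIGH_GLUCOSE_MG_DL = 180.0  # assumed treatment threshold

    class Dispenser:
        def __init__(self, reservoir_doses):
            self.reservoir_doses = reservoir_doses  # sealed drug wells left

        def control_step(self, glucose_mg_dl, remote_hold=False):
            """Return True if one drug well should be opened this cycle.

            remote_hold models an externally radioed instruction to
            suspend dispensing, as speculated above."""
            if remote_hold or self.reservoir_doses <= 0:
                return False
            if glucose_mg_dl > HIGH_GLUCOSE_MG_DL:
                self.reservoir_doses -= 1  # open one well, release its dose
                return True
            return False

    # Example: a glucose spike above threshold triggers a single dose.
    implant = Dispenser(reservoir_doses=100)
    print(implant.control_step(glucose_mg_dl=240.0))  # True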

There's a future genetic engineering step that goes beyond MEMS and will reduce the need for it: livers and other tissues will be genetically engineered into synthesis factories for drugs. That will eliminate the need for periodic replacement of MEMS chips. But genetic engineering won't be appropriate for delivering chemicals whose complex synthesis steps aren't easily replicated inside biological cells. So MEMS will have a future even once gene therapy becomes very powerful.

By Randall Parker 2007 December 02 08:43 PM  Biotech Embedded Devices
Entry Permalink | Comments(0)
2007 December 01 Saturday
Political Drive For Thorium Nuclear Reactor Development

Some politicians want to push thorium nuclear reactors. They are doing this for fairly parochial reasons. But there are potentially much wider benefits if they manage to kickstart thorium nuclear power.

Senators representing several Western states, including Utah's Orrin Hatch and Senate Majority leader Harry Reid, of Nevada, are working on legislation to promote thorium. They say it's a cleaner-burning fuel for nuclear-power plants, with the potential to cut high-level nuclear-waste volumes in half.

"They're concerned about the spent fuel from nuclear reactors ending up in their states," says Seth Grae, president of thorium-fuel technology developer Thorium Power, based in McLean, VA.

This method of fueling reactors can work with existing reactors modified to use a mix of uranium and thorium fuel rods. Neutrons from Uranium-235 are captured by Thorium-232, which then decays in two steps into Uranium-233. The Uranium-233 is fissile (its nuclei can be split to release the energy that drives electric power generation). The Wikipedia Thorium page says that thorium as a nuclear fuel requires solving problems related to fuel fabrication and reprocessing.
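Written out, the breeding chain runs through two short-lived intermediates (the half-lives are standard published nuclear data, not figures from the article):

\[
{}^{232}\mathrm{Th} + n \;\longrightarrow\; {}^{233}\mathrm{Th} \;\xrightarrow{\beta^-,\ t_{1/2}\,\approx\,22\ \text{min}}\; {}^{233}\mathrm{Pa} \;\xrightarrow{\beta^-,\ t_{1/2}\,\approx\,27\ \text{days}}\; {}^{233}\mathrm{U}
\]

The 27-day protactinium step is one complication often cited for the thorium cycle's fuel fabrication and reprocessing problems.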

In theory thorium delivers a few benefits. First off, the waste is not as difficult to dispose of, in part because thorium rods stay in reactors longer than uranium rods, so fewer spent rods come out needing disposal. The greater ease of disposal motivates the US Senators from Western states to support it, since they oppose the use of sites in their states (e.g. Yucca Mountain in Nevada) for disposal.

Thorium's fuel cycle also poses less risk of nuclear proliferation, a benefit that will matter more as the coming decline in world oil production drives a big push to develop nuclear power around the world. The ability to put thorium reactors into less developed countries would reduce the use of uranium in places which aren't full of peace, love, and understanding.

Combining uranium with thorium would also basically stretch the supply of uranium. Whether we really need to do that is much debated. The Japanese process for uranium extraction from the oceans might make uranium reserves depletion a non-problem. But thorium at least might lower total nuclear fuel costs.
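As a rough sanity check on the seawater option, using commonly cited figures of about 3.3 parts per billion of uranium in seawater and roughly 1.3 × 10^21 liters of ocean water (my numbers, not the article's):

\[
3.3\times10^{-6}\ \tfrac{\text{g U}}{\text{L}} \;\times\; 1.3\times10^{21}\ \text{L} \;\approx\; 4.3\times10^{15}\ \text{g} \;\approx\; 4\ \text{billion tonnes of uranium}
\]

That is several hundred times known conventional reserves; the open question is whether extraction can be made cheap enough.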

If you are curious about thorium as an energy source, Kirk Sorensen writes a web log, Thorium Energy, dedicated to the topic.

Update: Thorium Power will test thorium in a Russian nuclear reactor in 2010 (PDF format).

Lead test assemblies of thorium fuel are planned to be loaded into one of the VVER-1000 reactors at Kalinin near Moscow in 2010 as part of a multi-year demonstration program, Ernie Kennedy, a member of US company Thorium Power Ltd.’s technical advisory board, told a London conference October 31. He said the idea is to demonstrate the new fuel, which consists of a central “seed” assembly surrounded by a thorium blanket, in a VVER and “then expand to other PWRs and then perhaps BWRs,” for which the thorium fuel design is more difficult.

Thorium Power says thorium as a fuel reduces nuclear proliferation risks in a few ways.

Charles said spent thorium seed-and-blanket fuel would be “very difficult” to reprocess because of gamma radiation, and “wouldn’t be worth it” because the seed assemblies would contain very little fissile material and a lot of minor actinides. In the seed-and-blanket assembly, a central metallic “seed” consisting of either uranium-zirconium or plutonium-zirconium fuel rods is surrounded by a thorium-uranium dioxide blanket.

Kennedy said the thorium in the blanket reduces the proliferation risk of fissile materials in the spent fuel because, under irradiation, the thorium is converted to fissile U-233, which is burned in-situ over the life of the fuel assembly. Also, the thorium fuel cycle leads to the production of only small amounts of plutonium and the isotopic content of that plutonium makes it more unsuitable for weapons than normal reactor-grade plutonium.

For countries that want to consume excess plutonium, plutonium in the seed of the thorium fuel assembly can be burned “about three times faster and at somewhere between a third and half the cost of the mixed-oxide process,” he said, referring to more conventional uranium-plutonium oxide fuel now used in LWRs.

By Randall Parker 2007 December 01 12:33 PM  Energy Nuclear
Entry Permalink | Comments(19)