2011 March 31 Thursday
Rare Genetic Variants Make Biggest Health Impact

Rare genetic variants, not the common ones, contribute the most to disease risks.

DURHAM, N.C. – New genomic analyses suggest that the most common genetic variants in the human genome aren't the ones most likely causing disease. Rare genetic variants, the type found most often in functional areas of human DNA, are more often linked to disease, genetic experts at Duke University Medical Center report.

We all carry at least hundreds of rare genetic variants. So read "rare" to mean that each individual variant is carried by only a small number of people, even though the total number of rare variants is very high.

These results make sense because any genetic variant that makes a big negative impact on health will usually get selected out of a population before the variant spreads to large numbers of progeny.

The study was published in the American Journal of Human Genetics on March 31.

"The more common a variant is, the less likely it is to be found in a functional region of the genome," said senior author David Goldstein, Ph.D., director of the Duke Center for Human Genome Variation. "Scientists have reported this observation before, but this study is the most comprehensive effort to date using annotations of the functional regions of the human genome and fully sequenced genomes."

Goldstein said that "the magnitude of the effect is dramatic and is consistent across all frequencies of variants we looked at." He also said he was surprised by the notable consistency of the finding. "It's not just that the most rare variants are different from the most common, it's that at every increase in frequency, a variant is less and less likely to be found in a functional region of the DNA," Goldstein said. "This analysis is consistent with what appears to be a growing consensus that common variants are less important in common diseases than many had originally thought."
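
To make the reported pattern concrete, here is a toy sketch in Python of the kind of analysis described above, using entirely made-up variant data (a real analysis would use fully sequenced genomes and genome-wide functional annotations): bin variants by allele frequency and ask what fraction of each bin falls in a functional region.

    from collections import defaultdict

    # Hypothetical variant records: (allele frequency, is_in_functional_region).
    variants = [
        (0.001, True), (0.002, True), (0.004, False), (0.01, True),
        (0.03, False), (0.05, True), (0.10, False), (0.15, False),
        (0.25, False), (0.40, False), (0.45, True), (0.48, False),
    ]

    # Frequency bins from rarest to most common.
    bins = [(0.0, 0.01, "rare"), (0.01, 0.05, "low frequency"), (0.05, 0.5, "common")]

    counts = defaultdict(lambda: [0, 0])  # bin label -> [functional, total]
    for freq, functional in variants:
        for lo, hi, label in bins:
            if lo <= freq < hi:
                counts[label][0] += int(functional)
                counts[label][1] += 1

    for _, _, label in bins:
        functional, total = counts[label]
        if total:
            share = 100.0 * functional / total
            print(f"{label}: {functional}/{total} in functional regions ({share:.0f}%)")

With these invented numbers the rare bin comes out the most functional and the common bin the least, mirroring the trend Goldstein describes.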

This makes the discovery of the genetic variants that impact disease risks much harder to do. The number of disease-risk influences is much larger, and each variant is carried by very few people. So finding any single harmful variant benefits far fewer people.

This result is an argument for allowing the general population to spend their own money to get themselves genetically tested and sequenced. They can then submit their own genetic test results and disease history to medical genetics research groups for study. To discover all the genetic variants that influence disease risks will require most of the population to get themselves genetically tested and then to share their genetic test results with medical genetics researchers.

By Randall Parker 2011 March 31 11:38 PM  Biotech Genetic Diseases
Entry Permalink | Comments(4)
Low Wind Power Output Too Frequent In Britain

A study from the John Muir Trust finds that British wind power output sometimes falls to less than 5% of peak (nameplate) capacity.

The report, Analysis of UK Wind Generation, is the result of detailed analysis of windfarm output in Scotland over a 26-month period between November 2008 to December 2010 using data from the BMRS (Balancing Mechanism Reporting System). It's the first report of its kind, and drew on data freely available to the public. It challenges five common assertions made regularly by wind industry and the Scottish Government:

1. 'Wind turbines will generate on average 30% of their rated capacity over a year'
In fact, the average output from wind was 27.18% of metered capacity in 2009, 21.14% in 2010, and 24.08% between November 2008 and December 2010 inclusive.

2. 'The wind is always blowing somewhere'
On 124 separate occasions from November 2008 to December 2010, the total generation from the windfarms metered by National Grid was less than 20MW (a fraction of the 450MW expected from a capacity in excess of 1600 MW). These periods of low wind lasted an average of 4.5 hours.

3. 'Periods of widespread low wind are infrequent.'
Actually, low wind occurred every six days throughout the 26-month study period. The report finds that the average frequency and duration of a low wind event of 20MW or less between November 2008 and December 2010 was once every 6.38 days for a period of 4.93 hours.

4. 'The probability of very low wind output coinciding with peak electricity demand is slight.'
At each of the four highest peak demand points of 2010, wind output was extremely low at 4.72%, 5.51%, 2.59% and 2.51% of capacity at peak demand.

5. 'Pumped storage hydro can fill the generation gap during prolonged low wind periods.'
The entire pumped storage hydro capacity in the UK can provide up to 2788MW for only 5 hours then it drops to 1060MW, and finally runs out of water after 22 hours.

What I wonder: over how big a geographic area would wind farms need to be built and connected up via long distance transmission lines to allow the wind farms to provide back-up for each other? Britain does not cover a large area compared to, for example, the North American continent. Would wind power output be sufficiently uncorrelated over a couple of thousand miles to allow a much higher worst-case power output? California alone suffers very low minimum outputs across its wind farms.
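
One rough way to frame that question: take hourly capacity-factor series from two regions and compare their correlation and their worst combined hour. The sketch below uses invented numbers purely to show the calculation; real inputs would come from metered data such as the BMRS figures used in the report.

    # Hypothetical hourly capacity factors (fraction of nameplate) for two regions,
    # assumed here to have equal installed capacity.
    region_a = [0.45, 0.30, 0.05, 0.02, 0.10, 0.35, 0.50, 0.20]
    region_b = [0.10, 0.25, 0.40, 0.35, 0.05, 0.15, 0.30, 0.45]

    def mean(xs):
        return sum(xs) / len(xs)

    def correlation(xs, ys):
        # Pearson correlation of the two output series.
        mx, my = mean(xs), mean(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var_x = sum((x - mx) ** 2 for x in xs)
        var_y = sum((y - my) ** 2 for y in ys)
        return cov / (var_x * var_y) ** 0.5

    combined = [(a + b) / 2 for a, b in zip(region_a, region_b)]

    print("correlation between regions:", round(correlation(region_a, region_b), 2))
    print("worst single-region hour:   ", min(min(region_a), min(region_b)))
    print("worst combined hour:        ", min(combined))

The lower the correlation between regions, the higher the combined worst-case floor tends to be, which is exactly the property long transmission lines would have to buy.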

Political opposition to long distance electric power transmission lines already makes it hard to sell wind electric power hundreds or thousands of miles from where it is generated. So wide geographic distribution of wind farms does not currently enable distant wind farms to back each other up as electric power sources. The costs of long distance lines might also argue against trying to use wind as base load power. If wind can't work as base load power it will hit a wall on how much its use can grow.

By Randall Parker 2011 March 31 11:18 PM  Energy Wind
Entry Permalink | Comments(9)
2011 March 30 Wednesday
Onagawa Nuclear Plant Becomes Refugee Site

Can a nuclear power plant be designed to survive a tsunami? Tohoku Electric Power could teach Tepco lessons in nuclear power plant site construction.

Tohoku Electric Power Co.’s Onagawa nuclear power plant was about 75 kilometers closer to the epicenter of the quake, and suffered no critical damage because it was built 15 meters above sea level, spokesman Yoshitake Kanda said.

What to do with a nuclear power plant after a tsunami? Silly question. Turn it into a refugee center of course. 240 residents of Onagawa are now living at their nuclear power plant.

ONAGAWA, Japan — As a massive tsunami ravaged this Japanese fishing town, hundreds of residents fled for the safest place they knew: the local nuclear power plant.

The Onagawa nuke offers excellent accommodations for a tsunami refugee.

"I'm very happy here, everyone is grateful to the power company," said Mitsuko Saito, 63, whose house was leveled in the tsunami. "It's very clean inside. We have electricity and nice toilets."

By Randall Parker 2011 March 30 10:28 PM  Energy Nuclear
Entry Permalink | Comments(5)
2011 March 29 Tuesday
Genes For Fast High Altitude Adjustment

The US military is funding research to predict who will get sick when suddenly transported to high altitude locations (e.g. by parachuting onto a mountain). The latest round of research will try to verify an earlier round that identified 6 genetic variants that appear to predict who will do worse at high altitude.

Robert Roach, who directs the Altitude Research Center at the University of Colorado, performed a similar test last year, taking 28 research subjects to a simulated altitude of 16,000 feet by putting them in a special chamber that mimics the effect of a low-oxygen environment. A blood test, screening for those six genetic elements, was able to predict with 96% accuracy which of the 28 would fall ill.
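
For illustration only, here is a toy sketch of what such a screen amounts to computationally: score each subject on a handful of marker genotypes and flag anyone above a cutoff. The markers, weights, and threshold below are invented; the article does not publish the six genetic elements or the model behind the 96% figure.

    # Each subject: count of risk alleles (0, 1 or 2) at six hypothetical markers.
    subjects = {
        "subject_01": [0, 1, 0, 2, 1, 0],
        "subject_02": [2, 2, 1, 1, 2, 1],
        "subject_03": [0, 0, 1, 0, 0, 1],
    }

    # Invented per-marker weights standing in for effect sizes from a trained model.
    weights = [0.8, 1.2, 0.5, 0.9, 1.1, 0.4]
    THRESHOLD = 3.0  # invented cutoff separating predicted sick from predicted well

    def risk_score(genotypes):
        return sum(w * g for w, g in zip(weights, genotypes))

    for name, genotypes in subjects.items():
        score = risk_score(genotypes)
        verdict = "likely to fall ill at altitude" if score >= THRESHOLD else "likely fine"
        print(f"{name}: score {score:.1f} -> {verdict}")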

The researchers hope their research will ultimately lead to the development of drugs that cause human metabolisms to adapt to high altitudes. But drug development typically takes many years and many hundreds of millions of dollars. Not all drug development efforts succeed and many drugs on the market have side effects that one should avoid risking unless absolutely necessary.

Since drug development is a distant prospect, genetic testing to select rapid adapters seems a more practical way to apply the knowledge garnered from this research. If genetic testing can predict which soldiers have the best prospects to hit the ground running (and shooting) then for high altitude jobs why not just send the soldiers whose genetic profiles indicate they can handle it?

Some groups (e.g. Tibetans) carry genetic variants that adapt them especially well to high altitude living. Those variants are probably rare outside of groups that have lived at high altitudes for many centuries. But once they are identified, gene therapies, cell therapies, and tissue engineering techniques will provide ways to do what amount to software upgrades.

Adjustment to altitude is just one of many types of genetically determined capabilities that will get genetically screened for in the future. Where it will get especially interesting: offspring genetic engineering. This isn't just about muscles, beauty, and intelligence. As this report above demonstrates, many other genetic adaptations are possible. So will parents choose genes optimized for high or low altitude living? For cold or warm or moderate temperature? For low tendency for distraction to allow more sustained abstract reasoning or for higher tendency for distraction to be more responsive to other people or to dangers?

By Randall Parker 2011 March 29 11:54 PM  Human Population Genetics
Entry Permalink | Comments(7)
2011 March 28 Monday
Japan Nuclear Establishment Ignored Warnings

Japanese Communist Party legislator Hidekatsu Yoshii warned the Japanese parliament that nuclear reactor backup systems could fail due to natural disaster and lead to core meltdown.

TOKYO—A Japanese lawmaker last year raised in Parliament the possibility that a natural disaster could wipe out a nuclear reactor's backup systems, leading to melting in the core, but the country's top nuclear regulator responded that such a scenario was "practically impossible."

In 2006 Yoshii-san said a tsunami could knock out the diesel back-up generators. If a legislator could figure out the obvious, what's the excuse for Tepco and the regulators? Had Yoshii-san been listened to in 2006, preparations to help the back-up generators survive a tsunami could have been carried out. Instead, the Fukushima Dai-Ichi reactors are now worthless, causing huge economic damage, and will take years and huge sums to clean up.

Japanese regulators were very slow to recognize tsunamis as a serious risk to nuclear reactors.

TOKYO — In the country that gave the world the word tsunami, the Japanese nuclear establishment largely disregarded the potentially destructive force of the walls of water. The word did not even appear in government guidelines until 2006, decades after plants — including the Fukushima Daiichi facility that firefighters are still struggling to get under control — began dotting the Japanese coastline.

The guy who was in charge of Fukushima Daiichi in the late 1990s says the idea of a tsunami never crossed his mind. Given the amount of attention Japan has given to tsunami warning systems and facilities to protect civilians from tsunamis this inattention seems inexcusable. Many critics are pointing to the dual role that Japan's Ministry of Economy, Trade and Industry plays as both promoter and regulator of Japan's nuclear power industry. This is reminiscent of the dual role America's Atomic Energy Commission used to play for the nuclear power industry until it was broken up in the mid-1970s, with its regulatory mission going to the Nuclear Regulatory Commission. The same argument is made about India's nuclear power promotion and regulation. But METI is probably worse because lots of top METI officials retire from METI into top electric power industry positions. The relationship between regulator and regulated is too cozy and familiar.

I worry about complacent nuclear regulatory agencies lacking in imagination and captured by industry. Nuclear power requires sustained highly competent regulation. Are governments even capable of the needed level of competence? Seriously.

A move toward newer and much safer reactor designs will be slowed by the Fukushima failures. Regulators will take longer to approve new designs and will spend a lot of time examining existing reactors. Three Mile Island and Chernobyl caused higher nuclear reactor construction costs. Whether that will happen this time around is less obvious. Newer designs aim to lower costs and boost safety margins at the same time.

Do you know what Japanese CEOs do during a crisis? They disappear. The CEO of Tepco has basically gone missing. BP's former CEO Tony Hayward is a champ compared to these guys.

By Randall Parker 2011 March 28 08:34 PM  Energy Nuclear
Entry Permalink | Comments(3)
2011 March 27 Sunday
Face Aging Simulation Increases Retirement Savings

This Wall Street Journal article is worth reading in full. Stanford researchers find that young people shown what their faces will look like in their 60s become more willing to save for retirement.

In one experiment, young people who saw their elderly avatars reported they would save twice as much as those who didn't. In another, students averaging 21 years of age viewed avatars of themselves that smiled when they saved more and frowned when they saved less. Those whose avatars were morphed to retirement age said they would save 30% more than those whose avatars weren't aged.

The potential real-world applications of the Stanford research are promising. "An employee's ID photo could be age-morphed and placed on the benefits section of the company's website," says Dan Goldstein of London Business School, another psychologist who worked on the project.

The thinking is that people who can see what they'll look like when old become better able to identify with their older selves. When the future self becomes less of a stranger, people perhaps grow more sympathetic toward that future person and more eager to take steps to help that person out.

To use this capability only to get people to save more would be a waste. What's needed is a tie-in with efforts to develop rejuvenation therapies. Software that shows us aging to a point and then reversing aging could help build up support for the development of such therapies. Imagine watching a time-lapsed video of your aging face up to, say, age 60. Then the software would show different kinds of rejuvenation therapies applied. For example, one kind of cell therapy could restore areas that droop due to less collagen or other specific changes. Another kind of cell therapy could regrow receded gums on teeth.

Does any open source software for facial aging exist? It would be worth taking a crack at developing this capability for public web sites.

By Randall Parker 2011 March 27 04:58 PM  Aging Appearances
Entry Permalink | Comments(8)
2011 March 26 Saturday
Modest Proposal: Open Source Words And Ilg Blerps

An article in the New York Times takes a look at the legal battles between major computer companies over use of pairs of common words.

Microsoft is suing Apple, and Apple is suing Amazon, all over the right to use a simple two-word phrase: “app store.” Apple got there first, introducing its App Store in July 2008 as a marketplace for mobile applications. In January, Microsoft disputed Apple’s trademark claim, arguing that “app store” had already become a generic expression. And last week, Amazon announced its own “Appstore” for Google’s Android devices, prompting an infringement suit from Apple.

"Facebook" strikes me as a less obvious use of a pair of words than "App store". But Facebook has actually filed trademarks for "like" and "wall" among other words. See the article for details. This all seems like a big problem that will probably grow much worse. What to do?

Modest proposal: Make up new words to replace commonly used words and then create the equivalent of an open source license for each word where the word can be used by anyone without fear of trademarks. If we created enough different words and also extended the license to include all combinations of open source words then we could have things like App Stores (perhaps renamed Ilg Blerps) where lots of companies could name their similar things by the same name. We'd all know the Amazon Ilg Blerp, the Google Ilg Blerp and the RIM Ilg Blerp sell the same kinds of things. That strikes me as incredibly handy.

The licensing model should probably be tweaked to allow use of open source words in copyrighted novels and news stories. If one uses 100 or 200 or 1000 (take your pick, we can debate) words in a row then that combination could be copyrightable. Though we'd want to avoid copyright on word orders derivable from sorts or other mathematical manipulations of words.

What do you think of this idea? Do you want the ability to use the same word pair to describe similar things sold by or owned by different companies? Do you see this as a valuable innovation in human languages?

By Randall Parker 2011 March 26 06:36 PM  Comm Tech Languages
Entry Permalink | Comments(22)
2011 March 24 Thursday
Energy Shortages In Japan

We all take electric power for granted (survivalists excepted). But as the Japanese are finding, we are one disaster away from electric power shortages.

The first pitch of Japan's baseball season has been pushed back so that people don't waste gasoline driving to games. When the season does start, most night games will be switched to daytime so as not to squander electricity. There'll be no extra innings.

Tokyo's iconic electronic billboards have been switched off. Trash is piling up in many northern Japanese cities because garbage trucks don't have gasoline. Public buildings go unheated. Factories are closed, in large part because of rolling blackouts and because employees can't drive to work with empty tanks.

Just what disasters could cause severe power shortages varies by country. Nations differ greatly in the extent of their vulnerability. But a solar Carrington event would leave many countries short on electric power.

TEPCO (Tokyo Electric Power Company) used to have about 51 million kilowatts of generating capacity available but now tops out at 35 million kilowatts. Summertime needs run at about 60 million kilowatts. Living in Tokyo this summer won't be pleasant.

Since Japan is divided between 60 Hz and 50 Hz grids with little interconnection between them, the western 60 Hz grid has plenty of power while the eastern 50 Hz grid (which includes Fukushima and Tokyo) has a severe electric power shortage. The rolling blackouts make electric train commuting intermittent, which disrupts work schedules.

In order to rebuild areas damaged by earthquakes and the tsunami Japan's steel mills need to ramp up production. But while the mills suffered little damage they lack sufficient electric power to operate at maximum capacity. Since the Japanese are steel exporters they can probably compensate by exporting less steel.

You might think the Japanese could save electric power by rapidly switching to more energy efficient appliances. But their standards for appliance efficiency are already very high (higher than in the United States) and they've already made many of the easy changes for boosting energy efficiency. High energy efficiency has a curious side effect: if a single unit of energy enables production of a large quantity of goods and services, then eliminating that unit of energy causes a larger reduction in total output than it would in a less energy efficient economy.
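
A toy calculation of that side effect, with made-up numbers: the more output each kilowatt-hour supports, the more output disappears when that kilowatt-hour is lost.

    # Hypothetical economies: dollars of output produced per kWh of electricity.
    economies = {
        "high-efficiency economy": 12.0,
        "low-efficiency economy": 6.0,
    }

    kwh_lost = 1_000_000  # a hypothetical shortfall of one million kWh

    for name, output_per_kwh in economies.items():
        lost_output = output_per_kwh * kwh_lost
        print(f"{name}: losing {kwh_lost:,} kWh removes ${lost_output:,.0f} of output")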

By Randall Parker 2011 March 24 11:46 PM  Energy Shortages
Entry Permalink | Comments(7)
2011 March 23 Wednesday
Tsunami Risk To Nuclear Plants Foreseen And Ignored

Of course a scientist warned of the tsunami risk to nuclear plants and was ignored.

Yukinobu Okamura, a prominent seismologist, warned of a debilitating tsunami in June 2009 at one of a series of meetings held by the Nuclear and Industrial Safety Agency to evaluate the readiness of Daiichi, as well as Japan’s 16 other nuclear power plants, to withstand a massive natural disaster. But in the discussion about Daiichi, Okamura was rebuffed by an executive from the Tokyo Electric Power Co., which operates the plant, because the utility and the government believed that earthquakes posed a greater threat.

Read the full article. One defense of Tepco and Japanese regulators was that the recent tsunami was not foreseeable. But that's not the case.

Back in 869 AD a similar tsunami occurred.

“The 869 Jōgan earthquake and tsunami struck the area around Sendai in the northern part of Honshu on the 13 July. The earthquake had an estimated magnitude of 8.6 on the surface wave magnitude scale. The tsunami caused widespread flooding of the Sendai plain, with sand deposits being found up to 4 km from the coast.”

While the nukes have gotten enormous attention, the deaths were caused directly by the tsunami. Those thousands who died might have been saved if the Japanese had paid more attention to their geological researchers. So what risks are we under today that do not get the attention they warrant?

The 869 earthquake and tsunami were part of a longer run pattern.

Their results, published in the Journal of Natural Disaster Science, indicated that the medieval tsunami was probably triggered by a Magnitude 8.3 offshore quake and that waters spread more than 4km from the shore.

They also found evidence of two earlier tsunamis on the scale of the Jogan disaster, leading them to conclude that there had been three massive events in the last 3,000 years.

Or maybe the cycle is every 500 years?

"[Tsunamis] have accompanied earthquakes off the Sanriku Coast over a 500-year cycle. There was concern [such a massive quake] would occur in the near future," said Active Fault and Earthquake Research Center chief Yukinobu Okamura.

Smaller tsunamis have hit more often - and will continue to do so.

Large offshore earthquakes have occurred in the same subduction zone in 1611, 1896 and 1933 that each produced devastating tsunami waves on the Sanriku coast of Pacific NE Japan.

That coastline is particularly vulnerable to tsunami waves because it has many deep coastal embayments that amplify tsunami waves and cause great wave inundations.

A 2007 paper by a group of Japanese researchers predicted tsunami recurrences at about 1000 year intervals. So they were predicting what happened.

In Sendai plain, the tsunami deposits extend about 1 to 3 km from the coast line at that time, which is estimated as about 1 km inland of the present coast. In Ishinomaki plain, the tsunami deposits extend > 3 km from the estimated coast line, which is about 1-1.5 km inland of the present coast. Multiple sand layers indicate recurrence of such unusual tsunamis with approximately 1,000 yr interval. We computed tsunami inundation in both plains from several types of tsunami source models such as outer-rise normal fault, tsunami earthquakes (narrow fault near trench axis), interplate earthquakes with fault widths of 50 and 100 km. Comparison of the computed inundation area with the distribution of tsunami deposits indicates that only an interplate earthquake source with 100 km width (depth range of 20 to 50 km) can reproduce the observed distribution of tsunami deposits in both Sendai and Ishinomaki plains. This source (Mw=8.1 to 8.3) is much larger than the anticipated Miyagi-oki earthquake (M~~7.5) with 99% probability in the next 30 years.
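
To put a recurrence interval like that in perspective, treat the big tsunamis as a Poisson process, which is a simplifying assumption and only as good as the estimated interval. The chance of at least one event in a given planning window is then 1 - exp(-window / interval):

    import math

    def prob_at_least_one(window_years, mean_interval_years):
        """P(at least one event in the window), assuming a Poisson process."""
        return 1.0 - math.exp(-window_years / mean_interval_years)

    for interval in (500, 1000):    # candidate recurrence intervals mentioned above
        for window in (30, 100):    # planning horizons in years
            p = prob_at_least_one(window, interval)
            print(f"{interval} year interval, {window} year window: {100 * p:.1f}% chance")

Even a 1,000 year event has roughly a 3% chance of showing up in any given 30 year window, which is not a risk a power plant sited on the coast can safely round down to zero.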

What other predictable disasters await us?

By Randall Parker 2011 March 23 11:00 PM  Dangers Natural Geological
Entry Permalink | Comments(10)
2011 March 22 Tuesday
George Monbiot: Now Hard Core Nuclear Power Supporter

Writing in Britain's Guardian, George Monbiot makes a great point as he comes out for nuclear power in the wake of the failures of the Fukushima reactors: in spite of a very rare combination of severe geological events followed by mistakes on the part of reactor site workers and higher management, yes, in spite of all that, what happened? With a reactor designed with 40 year old technology the result was far less than the worst case outcome scenarios.

A crappy old plant with inadequate safety features was hit by a monster earthquake and a vast tsunami. The electricity supply failed, knocking out the cooling system. The reactors began to explode and melt down. The disaster exposed a familiar legacy of poor design and corner-cutting. Yet, as far as we know, no one has yet received a lethal dose of radiation.

I am not entirely persuaded by the point. But I want to be persuaded because the world needs every major energy source that exists. Look at the price of oil. Look at the rising difficulties with extracting oil. Look at the gray skies of Chinese cities. We need cleaner energy sources and can't afford to lose one.

Monbiot goes on to point out how inappropriate solar power is for a country like Britain that lies so far north. The densely populated European countries can't use much wind power without going offshore, and that's more expensive. Solar power in North Africa delivered by cables into Europe is one discussed option. But with a civil war raging in Libya that option is looking dimmer. Does Europe want to expose itself to that much more political instability?

Most reactors do not sit near a subduction zone where one tectonic plate is being forced under another. So most reactors aren't located where 9.0 quakes or 7 meter tsunamis are possible. Most reactors are not built so close to sea level. Newer reactors have better safety features. Plus, this accident in Japan, rather like airplane crashes, will get heavily picked over by engineers to learn how to prevent even the level of failure we saw at Fukushima. Even better, the newest reactor designs have much stronger passive safety features that make them less vulnerable to failures.

As I've already pointed out, the nuclear power industry could develop many tools and capabilities to rapidly deliver to a reactor site should multiple pieces of equipment fail. Even if all of a reactor's power generation and cooling systems fail, off-site equipment should be available for delivery within hours of the start of an incident.

Steve LeVine argues that our energy sources face multiple problems. There's not an easy solution.

What's going on is economic fear, but also a global energy system under severe stress. Over the last several months, we've learned the hard way in incredibly coincidental events that we are in firm control of almost none of our major sources of power: Deep-water oil drilling can be perilous if the company carrying it out cuts corners. Because of chronically bad governance by petrostates, we can't necessarily rely on OPEC supplies either. Shale gas drilling may result in radioactive contamination of water, though who knows since many of the companies involved seem prepared to risk possible ignominy and lawsuits later rather than proactively straighten out their own bad actors. As for much-promoted nuclear power, we know now that big, perfect-storm, black-swan natural disasters can come in twos.

Why do oil companies drill for oil in deep water with half-billion-dollar drilling platforms? Because that's where most of the new oil fields are going to be found. Offshore oil could make up 40% of world oil production by 2015. We are getting energy from politically, geologically, and technologically challenging sources because those are the sources that are left. The cost of solar power isn't dropping fast enough and solar and wind have big problems with intermittency. There is no one clear great solution for our energy needs.

By Randall Parker 2011 March 22 11:22 PM  Energy Nuclear
Entry Permalink | Comments(14)
Biochip Does Blood Sample Analysis

The field of microfluidics holds the promise of orders-of-magnitude cheaper biological assays of blood and other samples. Plus, it will enable fast testing without sending samples off to a lab. Well, an international group of researchers has developed an autonomous lab-on-a-chip.

BERKELEY — A major milestone in microfluidics could soon lead to stand-alone, self-powered chips that can diagnose diseases within minutes. The device, developed by an international team of researchers from the University of California, Berkeley, Dublin City University in Ireland and Universidad de Valparaíso Chile, is able to process whole blood samples without the use of external tubing and extra components.

The researchers have dubbed the device SIMBAS, which stands for Self-powered Integrated Microfluidic Blood Analysis System. SIMBAS appeared as the cover story March 7 in the peer-reviewed journal Lab on a Chip.

“The dream of a true lab-on-a-chip has been around for a while, but most systems developed thus far have not been truly autonomous,” said Ivan Dimov, UC Berkeley post-doctoral researcher in bioengineering and co-lead author of the study. “By the time you add tubing and sample prep setup components required to make previous chips function, they lose their characteristic of being small, portable and cheap. In our device, there are no external connections or tubing required, so this can truly become a point-of-care system.”

The FDA and AMA line up to restrict your freedom to get personal genetic testing. Yet the technological trends are running very rapidly toward the development of very powerful hand-held medical testing devices. You can already buy home triglycerides and cholesterol test kits. I do not want to see governments restrict their availability. Rather, we should some day each be able to check a very large list of chemicals in our blood by pulling a little medical testing lab out of a pants pocket.

Instant real-time medical testing any time of the day anywhere you are can revolutionize medicine and medical research. The number of people generating their own personal medical records can explode with cheap lab-on-a-chip devices. These gadgets will some day take your medical test results, send them to a medical expert system web server, and report back to you about whether you have any looming or current problems.
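
Here is a minimal sketch of that loop. The device readings, server URL, and response fields are all invented for illustration; no such public API is implied.

    import json
    import urllib.request

    # Hypothetical readings from a pocket lab-on-a-chip device.
    readings = {
        "patient_id": "anon-12345",
        "timestamp": "2011-03-22T08:15:00Z",
        "triglycerides_mg_dl": 140,
        "total_cholesterol_mg_dl": 185,
        "glucose_mg_dl": 92,
    }

    # Hypothetical medical expert system web service.
    ENDPOINT = "https://example-medical-expert-system.invalid/api/v1/analyze"

    request = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(readings).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

    # The imagined response: a list of flagged values with plain-language advice.
    with urllib.request.urlopen(request) as response:
        report = json.loads(response.read().decode("utf-8"))

    for flag in report.get("flags", []):
        print(flag["measure"], "->", flag["advice"])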

By Randall Parker 2011 March 22 12:11 AM  Biotech Assay Tools
Entry Permalink | Comments(5)
2011 March 20 Sunday
Tepco Slow On Reactor Decisions After Earthquake

The Wall Street Journal has an excellent article outlining some of the mistakes made in the aftermath of the earthquake and tsunami. The top management of Tokyo Electric Power Company (Tepco), which operates the Fukushima reactors, was too slow to accept the necessity of drastic measures.

TOKYO—Crucial efforts to tame Japan's crippled nuclear plant were delayed by concerns over damaging valuable power assets and by initial passivity on the part of the government, people familiar with the situation said, offering new insight into the management of the crisis.

Tepco did not want to lose the reactors as productive assets. Therefore Tepco hesitated too long before injecting sea water, whose salt would corrode the reactors so much as to make them unusable in the future. Tepco also initially had very poor communication with the reactor site due to communications damage from the earthquake and tsunami.

"This disaster is 60% man-made," said one government official.

Workers at the reactor site also made mistakes such as running out of fuel for a pump and setting a valve to a wrong position. Not surprising given the pressure they were working under.

We should not take comfort from the fact that most nuclear reactors aren't situated near offshore subduction zones. Nukes have plenty of ways to fail. Even when working at a more sedate pace some of the mistakes made by nuclear plant operators do not inspire confidence. For almost 18 months Diablo Canyon's emergency cooling water valves were broken without the plant operators being aware of it.

At the Diablo Canyon nuclear plant, operators found themselves unable to open the valves that provide emergency cooling water to the reactor core and containment vessel, during a test on October 22, 2009.

A misguided fix of an earlier problem had prevented the emergency valves from opening, the NRC team sent to investigate found.

With nuclear reactors capable of going very wrong very quickly what's needed? Human decision loops that are fast enough. There's an analogy here with legendary fighter pilot John Boyd's OODA loop (Observe, Orient, Decide, and Act). The problem with nukes is that for months, years, even decades nothing might happen that requires an extremely fast decision loop. But suddenly the calamity strikes (earthquake followed by tsunami that knocks out back-up generators and communications) and people accustomed to a far more sedate pace of decision-making can't speed up fast enough. Even worse, the kinds of people who love to make fast decisions and who are good at making fast decisions (e.g. fighter pilots) won't take jobs that require them to spend decades staring at nuclear power plant consoles waiting for something to happen.

One of the challenges is to know when Business As Usual has ended and fast extreme decisions are required. With the Three Mile Island reactor incident there was not so dramatic a starting point as a 9.0 earthquake that lasted for 5 minutes followed by a tsunami. The TMI guys weren't as abruptly shaken into a heightened state. The Fukushima incident did have 2 big wake-up calls before the reactors started overheating. So the Fukushima reactor operators and their managers higher up in Tepco at least had big prods toward getting into faster decision loops. But decades of complacency left them ill-equipped to step it up.

Another need: A bigger tool box for dealing with nuclear reactors gone awry. Off-site from any reactor many tools should be available for very rapid deployment. For example:

  • Portable large electric power generators and water pumps transportable by helicopter or truck.
  • Capability to run power lines over miles in well less than a day. Lost local power? Bring in power from somewhere else in a hurry. It is a solvable problem. Might require choppers with crews to deposit towers and run lines across farm fields.
  • Lots of satellite phones and other communications gear deployable wherever communications have been lost.
  • Robots that can substitute for humans in many more tasks such as for bringing fire hoses to bear, for looking into reactors and other radioactively hot areas of a site, and for patching holes to fix spent fuel cooling pools that spring a leak.
  • Robotic autonomous vehicles, both on the ground and in the air.
  • Rapidly deployable sensor nets. Find out what is going on in areas where sensors have been disabled or destroyed.
  • Better tools for protecting humans from radiation. Lead-coated vehicles, lead-covered small control rooms within reactor complexes, portable HEPA filters for working in areas with contaminated air, lead chairs, lead-lined clothing, and other tools for protecting humans.

Most of these tools do not need to be located at every single reactor site. They just have to be transportable to reactor sites within hours of the start of an incident. The deeper need is an acceptance that things will go wrong (take off those Panglossian rose-colored sunglasses) and that therefore additional tools and capabilities should be available for deployment. We need a strong commitment by the nuclear power industry and governments to develop the tools to handle each failure. The next nuclear accident should not require heroic workers getting themselves radioactively damaged. The response time and tools available should allow problems to get stopped at much earlier stages and at far less cost in assets and lives.

There are upsides to the development of the sort of tool sets I describe above: The tools would have other uses. For example, robotic firefighting equipment can save human lives in conventional fires and robotic vehicles can work in other types of disaster zones.

Update: Also see an opinion piece by Christopher Stephens in the WSJ about regulatory oversight in the nuclear power industry.

By Randall Parker 2011 March 20 04:18 PM  Energy Nuclear
Entry Permalink | Comments(28)
Facebook Attracts Narcissists

Not all personality types are equally attracted to Facebook. Okay, not everyone on Facebook is a narcissist. So all you narcissists out there: Not to worry, you've still got an audience.

“Facebook users tend to be more extroverted and narcissistic, but less conscientious and socially lonely, than non-users,” Tracii Ryan and Sophia Xenos of RMIT University in Melbourne write in the journal Computers in Human Behavior.

This should be seen as a societal benefit of Facebook. By segregating themselves onto Facebook, the narcissists remove themselves from venues where non-narcissists find extroverted narcissists to be overbearing and abusive. If the narcissists do not entirely remove themselves from other venues they at least reduce their presence in other settings. Only so many hours in a day after all.

Here's the challenge: How to create social media web sites where underrepresented people (e.g. the shy and highly conscientious) can relate to each other without the extroverts crashing in? Creating an environment for the shyest people seems especially problematic. They will be too shy to friend each other. They'll be too shy to even sign up, except maybe with a pseudonym and a fake personality.

Are you attracted to Facebook? Do you consider yourself a narcissist? Or just an extrovert? Or is Facebook a place where your shy self feels less inhibited? Or do you really like reconnecting with kids you knew in grade school?

By Randall Parker 2011 March 20 01:17 PM  Society Virtual
Entry Permalink | Comments(13)
2011 March 18 Friday
Cheap Tsunami Survival Ideas

Suppose you are going to choose to live in a zone at risk for a massive earthquake in an offshore subduction zone. Given that a tsunami can travel at 400 mph, if the quake occurs within 200 miles of your home you will not have a lot of time to evacuate. For the 2004 quake off of Sumatra the massive wave took just 15 minutes to reach shore, and for the quake off of Japan the arrival took just 30 minutes. Not a lot of time to evacuate on damaged roads full of other evacuees.
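
The arithmetic behind those warning times, for a few hypothetical epicenter distances (deep-water wave speed only; waves slow as they near shore, so treat these as rough figures):

    wave_speed_mph = 400  # deep-water tsunami speed cited above

    for distance_miles in (100, 200, 400):
        minutes = distance_miles / wave_speed_mph * 60
        print(f"quake {distance_miles} miles offshore -> roughly {minutes:.0f} minutes to shore")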

If you can not evacuate over land then what are your choices? I see 4:

  • Boating: Head out to sea in a speed boat. That means: after a severe earthquake get to a marina where you keep a speedboat and hope you can get it launched into the ocean before the water recedes in advance of the tsunami. Then get into deep water where the wave won't be as high. Unless you've got hours of warning this approach does not seem viable.
  • Duck and cover: Create an underground shelter under your home that is waterproof. One danger in this approach is that even if your shelter can survive the tsunami your path back up to the surface could be blocked. So how to guarantee you'll be able to escape?
  • Skyward: Up, up, and away in your beautiful balloon. Buy and store a hot air balloon where you live and/or where you work. But a cruise thru eBay turned up a 12 year old hot air balloon for $12,500.00. Pretty expensive. Plus, how long does it take to get one airborne? What if the prevailing wind is toward the sea? And how long can you store the fuel?
  • A life raft sphere: Think of the Boston Whaler boats that can't be sunk because of their use of foam in their seats and hull. Well, imagine something more spherical or in a shape similar to an Apollo command module and with the ability to seal it from the inside. Situated on a roof top, the life raft might hold together when the tsunami sweeps through. Then you'll ride the wave inland.

The speed boat seems the most expensive option. It is not clear to me the relative costs of the hot air balloon, underground shelter, or specialized life raft.

Both the hot air balloon and the underground shelter get out of the way of the high speed mass, either above or below it. After the quake happens the hot air balloon seems the safest approach if it can be executed fast enough. But keeping the fuel for it stored nearby poses a long term risk and you might not have time enough to pull it out and get it inflated. Plus, the basket where the humans stand had better float, because you are likely to land on the water.

The underground shelter does not work for apartment dwellers or those who do not plan to live where they are living long term. It is a considerable investment.

I find the life raft idea most appealing because of its relatively lower cost and ease of use. It does not guarantee survival. The wave could easily push along debris that would pierce the enclosed life raft right as the water hits it. Also, the acceleration of the life raft as the water hits could be so severe as to kill any occupants. But if every person in the affected area of Japan had been in such an enclosure when the wave hit most who died probably would have lived. Putting these special life rafts on roofs would increase the odds of getting a good lift-off that keeps you riding on top of the advancing wave.

The simplest version of the life raft could be pretty cheap. Imagine a big ball made from hard thick plastic with a heavy bottom for ballast and with foam seats built into it.

Got any other ideas on how to survive a tsunami? Or insights into the practicality of the ideas above?

Update: Tall buildings: In the comments Bruce and LAG suggest using tall buildings. It seems doable given building codes aimed at making the buildings strong enough to survive tsunami waves. In the Japanese town of Minamisanriku even the 4th floor of the hospital was flooded and only those who went up to the 5th floor survived. You can see at that link how few buildings survived. It appears only bigger buildings survived. So in theory fishing villages could be rebuilt with 6 story tall and extremely well built structures. Then people could move up to the 6th floor.

Update II: Off-road bicycles or motorcycles: Looking at a map of the extent of the flooding (see also here) it looks like the speed of a bicycle or off-road motorcycle (an on-road motorcycle would probably work in a pinch) would let you cover enough distance to outrun a wave even if the roads are in a shambles and jammed with cars.

The off-road motorcycle runs the risk of not starting when you most need it. So an off-road bike could work as a back-up plan. Two wheel escape plans seem best out of the ideas so far. But for larger groups of people either extremely strong 6 story buildings or spherical escape pod lifeboats would be better choices.

Update III: Floatable houses: If a house not designed for floating can float out to sea with much of the house above water then imagine how well a house could do if it was designed to float. For example, use foam insulation and include foam insulation in the floor. Before you laugh click thru and look at the picture. Someone could have survived in that house's attic.

By Randall Parker 2011 March 18 10:50 PM  Disaster Survival
Entry Permalink | Comments(63)
2011 March 16 Wednesday
Empty Store Shelves As Japanese Hoard

You know all that advice you hear about stocking up before a disaster? Events in Japan demonstrate the wisdom of advance preparation. Even in Tokyo, Japanese shoppers are cleaning out grocery stores by buying everything.

People in the capital, home to 12 million, snapped up radios, torches, candles, fuel containers and sleeping bags, while for the fourth day there was a run on bread, canned goods, instant noodles, bottled water and other foodstuffs at supermarkets.

This is highly advanced, affluent, and civilized Japan. People in Tokyo fear a full reactor meltdown followed by winds blowing radioactivity into the city. They want to have supplies if the stores stop getting deliveries.

Think where you live makes you immune to, say, nuclear reactor failures triggered by tsunamis? Okay, but that doesn't take you off the hook. The 1970s oil crisis triggered panic buying in Japan.

Retailers said the panic buying was reminiscent of the oil crisis in the 1970s.

So imagine what a revolution in the Persian Gulf would do for your supply of gasoline and food when your neighbors start panic buying. Sound far fetched? Sure. North Africa isn't going to be convulsed by revolutions and civil wars either. Oh wait.

Even emergency supplies available to be sent to the disaster area aren't getting thru due to lack of fuel and damaged roads. Therefore Sendai is short of food, fuel, and water.

The panic buying isn't restricted to Japan. In the United States a run on potassium iodide pills has suppliers running out of pills. Don't wait till the radioactive fall-out threat becomes imminent before trying to get some iodine. Though if you live near an ocean you could go and get some kelp to eat.

In the comments of another recent post SkippyTony of Christchurch New Zealand discussed what he found useful for survival when Christchurch was recently struck by a strong earthquake. Sounds like he discovered too late that he needed to bolt his wine rack to the wall. All you people living within a few hundred miles of the New Madrid fault should take note. He also has a sobering set of Christchurch before and after photos.

One thing to note about supplies: You'll die of thirst weeks or months sooner than you'll die of starvation. So stock up on water if an earthquake or other failure can cut off your water supply for days. If you can lose water for longer then think about means for filtering and purifying dirty water.

If you lose power for an extended period of time and have no way to cook frozen or refrigerated food without utility power, then you lose a way to use up perishable food before it goes bad. A big water supply and a camping stove with fuel seem like useful things to stock. Though if you have wood and a suitable place to burn it you could get your cooking heat that way.

Imagine your government some day giving this advice:

“Please do not go outside. Please stay indoors,” Chief Cabinet Secretary Yukio Edano urged the public. “Please close windows and make your homes airtight. Don’t turn on ventilators. Please hang your laundry indoors.”

With a tightly sealed home what do you do for oxygen as it gets depleted and the CO2 builds up? A HEPA air purifier purchased in advance would let you bring in filtered outside air - at least as long as you have electric power. Want to go all survivalist? Get photovoltaic panels on your roof so you can use the electric power to run outside air thru HEPA filters during the day to get fresh filtered air.

By Randall Parker 2011 March 16 11:00 PM  Disaster Survival
Entry Permalink | Comments(7)
Medical Info Overload And Direct To Consumer Genetic Testing

Razib Khan points out that the flood of genetic and medical data has reached the point where individual medical practitioners can't grasp the whole big picture.

And yet something else looming in the background right now is the way medicine is practiced in the world today is changing, and has to change. I accept the proposition that from Galen to the 20th century medical doctors generally caused more harm than benefit (much of it due to the fact that they spread disease amongst their patients). Modern medicine is exceptional in that it actually works on a biophysical level. But a lot of the “low hanging fruit” has been picked, and due to the nature of medical research much of the “cutting edge consensus” is wrong. Medicine, like many fields, has been subject to information overload, and I’m skeptical of the ability of any human to keep up. The practice of medicine needs to be augmented by computational analytic tools, as well as a deeper understanding of the natural distortions which occur because of the nature of funding and the institutional framework of biomedical research in the United States, which exhibits an unfortunate trend toward careerism. Add on top of it the political, legal, and ethical variables, and medicine is a tangle which is far more than just applied human biology.

Those computational tools ought to allow us to directly receive biological information about ourselves from sophisticated software in cloud servers. Since computers will be needed to make sense of the flood of data why go to a doctor's office to have a doctor interpret what the computer screen says when you can read it yourself?

In another post Razib draws attention to websites that enable people to engage in Do It Yourself (DIY) biological science. It is just this sort of (rapidly growing) way of doing bottom-up genetics and health research that FDA regulation threatens.

In light of my last post, I want to point to some groups attempting to create some “bottom-up” biological science in the real world. In the Los Angeles area you have SoCal DIY Bio, and in northern California you have BioCurious. And you also have the DIYgenomics website. Apparently the Gene Sequencer for the SoCal DIY Bio needs to be repaired, so I thought I’d pass word on.

Imagine a future in which people use home medical testing devices and online genetic and medical testing services to collect information that they use to enroll themselves as long term research projects. Plummeting costs of full genome sequencing and advances in microfluidics promise to make genetic and other biological testing very cheap and widely available - if only regulators stay out of the way. But the threat of a large scaling up of regulatory restrictions on direct-to-consumer (DTC) genetic testing and medical testing could put the kibosh on all that.

I see the issue of DTC genetic testing as important for a few main reasons:

  • The freedom to know yourself. Know your genetic ancestry, disease risks, and details about your body's genetically-driven design.
  • The potential to save money by cutting out the professional highly credentialed gatekeepers. Medical tests have become unaffordable to many.
  • The potential for bottom-up, crowd-sourced medical research. This will be a form of democratization of science.

The bottom-up research holds great potential. The genetic testing company 23andme is already conducting 400 genetics research projects using customer genetic testing data. Imagine a world where many different teams (including people who are amateurs in genetics research) develop software to search for genetic correlations with diseases, behavior, or assorted traits and then ask for volunteers. This effectively democratizes research. The research that advances most quickly will be the research that the most people decide to help, by contributing genetic and medical test results and by filling out web forms about their histories, preferences, habits, and abilities.
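
As a bare-bones illustration of the kind of analysis such volunteer data would feed, here is a sketch using an invented variant and invented self-reported counts: build a 2x2 table of carriers versus non-carriers by disease status and compute an odds ratio.

    # Hypothetical counts from volunteers who shared a genotype and a disease history.
    carriers_with_disease = 40
    carriers_without_disease = 160
    noncarriers_with_disease = 55
    noncarriers_without_disease = 745

    odds_carriers = carriers_with_disease / carriers_without_disease
    odds_noncarriers = noncarriers_with_disease / noncarriers_without_disease
    odds_ratio = odds_carriers / odds_noncarriers

    print(f"odds of disease among carriers:     {odds_carriers:.2f}")
    print(f"odds of disease among non-carriers: {odds_noncarriers:.2f}")
    print(f"odds ratio: {odds_ratio:.2f} (well above 1 suggests the variant deserves a closer look)")

A real project would also need proper statistics (confidence intervals, multiple-testing correction) and careful attention to self-report bias, but the core computation is within reach of amateurs.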

In the future a large chunk of medical research will get done by millions of people who will pay their own money to get themselves and their friends and family tested. They'll pay this money in order to provide scientists the raw test results needed to do analyses. People with a malady will pay for genetic and other testing that will provide researchers with data that would otherwise be too expensive to collect.

By Randall Parker 2011 March 16 12:30 AM  Policy Medical
Entry Permalink | Comments(6)
2011 March 14 Monday
New Designs Would Avoid Japan Reactor Failures?

In Technology Review Kevin Bullis highlights nuclear reactor design improvements that would cut the risk of cooling system failures.

The latest nuclear reactor designs could help avoid the overheating and explosions that have occurred at the Fukushima Daiichi nuclear plant in Japan following the powerful earthquake and tsunami that struck on Friday. Newer reactor designs propose the use of passive cooling systems that would not fail after a power outage, as happened in Japan, as well as other novel approaches to managing reactor heat.

Passive systems are key in my view. Human operators make mistakes and active systems can get damaged when you most need them. According to a Bloomberg report the Japanese reactors had back-up generators designed to withstand 6.3 meter waves but the plant was hit by 7 meter waves. So for want of an additional 0.7 meters of protection the reactors have undergone partial meltdowns. Ouch.

On the bright side, big nuclear reactor failures are like big passenger aircraft accidents: They get heavily picked over and analyzed by large numbers of skilled engineers. We learn from failure. Unfortunately, the failure of nuclear plants in Japan is starting to look more like Chernobyl and less like Three Mile Island in terms of the scale of the disaster.

An LA Times story points to mistakes made by Japanese engineers that have exacerbated the problems.

Engineers had begun using fire hoses to pump seawater into the reactor — the third reactor at the Fukushima No. 1 complex to receive the last-ditch treatment — after the plant's emergency cooling system failed. Company officials said workers were not paying sufficient attention to the process, however, and let the pump run out of fuel, allowing the fuel rods to become partially exposed to the air.

Once the pump was restarted and water flow was restored, another worker inadvertently closed a valve that was designed to vent steam from the containment vessel. As pressure built up inside the vessel, the pumps could no longer force water into it and the fuel rods were once more exposed.

One can guess these reported mistakes are not the only mistakes that have been made so far. Under intense pressure in a crisis situation people will make mistakes. Emergency handling can not depend on near-perfect decision making.

The Diablo Canyon nuclear power plant at Avila Beach, California is designed to handle only a magnitude 7.5 earthquake. San Onofre is designed for a magnitude 7.0 quake and has a 25 foot high wall to protect against tsunamis. A fault 5 miles offshore could let loose some day. The absence of a subduction zone off of SoCal is supposed to put us at much lower risk of a tsunami. Also, SoCal Edison claims San Onofre has more safety layers than the older reactors in Japan. Plus, it has an emergency cooling source that is gravity driven.

But what about the soundness of the probabilities of geological risks and systems reliability that are fed into models for choosing nuclear reactor sites and designs? As Joel Achenbach wrote in a WaPo piece a few days ago, Japanese scientists were expecting the Big One to occur south of Tokyo, not north of it. Japan's preparations were oriented in the wrong direction. The coast off of Sendai hadn't had a huge quake for at least 1,000 years. What other geological surprises lie in store?

If an earthquake followed by a large tsunami is "beyond what anyone could expect" then how can the nuclear power industry claim it can choose sites and designs that will avoid events like the ones happening at the Fukushima nuclear power site?

Richard Meserve, a physicist and former NRC chairman from 1999 to 2003, said the Japanese reactors experienced a "one-two punch of events beyond what anyone could expect or what was conceived."

What is so unlikely about a big earthquake followed by a big tsunami along a Pacific Rim subduction zone?

Update: Diablo Canyon is 85 feet above the ocean. Plus, it has a gravity fed back-up water reserve. So it looks like it is at much lower risk of a tsunami.

What the Japanese ought to do for their remaining undamaged nukes: Build back-up generator buildings that can survive a tsunami flooding over them. Be able to ride out the worst that nature can throw at you and keep on going. Earthen berms and lots of concrete would do the trick. Another idea: Build back-up generators that are well inland and run underground cables to the nukes near the coast.

Any other spectators want to tell the nuclear plant engineers how to make their nukes safe from tsunamis?

By Randall Parker 2011 March 14 10:59 PM  Dangers Complex Engineering
Entry Permalink | Comments(59)
2011 March 13 Sunday
Lessons From Japan For US West Coast

Planet Earth is dangerous. Those of us on the US, Canadian, and Central American West Coast should think seriously about what we can learn from the Japanese earthquake, tsunami, and nuclear reactor failures.

Japan’s massive earthquake and tsunami is alerting the US west coast that the same kind of thing could happen here. In fact, say experts who study the earth’s shifting crust, the “big one” may be past due.

The Pacific Northwest is especially vulnerable and could experience a 9.0 earthquake either onshore or offshore. If offshore, the time to get to higher ground would be on the order of 15 minutes. The Cascadia subduction zone could shake and cause offshore landslides that would generate massive wave movement.

The Cascadia earthquake of 1700 was previously thought to be part of a pattern of earthquakes recurring at roughly 500-year intervals. But more recent research puts the average interval at 240 years. So we are about 71 years past the average Cascadia earthquake interval.
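To make the arithmetic behind "71 years past due" explicit, here is a back-of-the-envelope sketch. The 1700 date and the 240-year and 500-year intervals come from the research cited above; the evaluation year 2011 is simply the present.

    # Rough check of how far past the average Cascadia recurrence interval we are.
    last_quake = 1700        # year of the last great Cascadia earthquake
    current_year = 2011
    elapsed = current_year - last_quake   # 311 years since the 1700 event

    for label, interval in (("older 500-year estimate", 500), ("newer 240-year estimate", 240)):
        overdue = elapsed - interval
        if overdue > 0:
            print(f"{label}: {overdue} years past the average interval")
        else:
            print(f"{label}: {-overdue} years short of the average interval")
    # newer 240-year estimate: 311 - 240 = 71 years past the average

By the older 500-year figure we would still have nearly two centuries of cushion; by the newer 240-year figure the cushion is gone.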

What about California? A Hayward fault quake could devastate the Bay Area forcing 200,000 out of their homes. SoCal is overdue for a Carrizo Plain earthquake. Risks come from other faults as well.

The US has several big earthquake risks, including the New Madrid fault, which last let loose in a major way in 1811 and 1812. A replay of the especially severe 19th century natural disasters would make the earthquake in Japan look like small stuff in comparison.

What I'd like to know: How at risk are the San Onofre and Diablo Canyon nuclear power plants from a tsunami and/or strong earthquake? Should they be made safer from tsunami or earthquake risks? The take-home lesson from the Japanese nuclear power plant failures is that equipment and designs for maintaining sufficient reactor coolant water must be capable of handling severe earthquakes. The need for active systems (as distinct from passive systems) to cool nuclear reactors is a very unfortunate aspect of most (all?) operating nuclear power plants today.

Diablo Canyon is designed to handle 20 foot tsunami waves. Can even bigger tsunami waves strike there?

DCPP is designed for storm surge waves of 36 feet and tsunami waves of 20 feet. In 1981, DCPP experienced a 31-foot storm surge. Because of the location and relative geometry of DCPP and the Cascadia (Washington-Oregon) earthquake, there would be no significant tsunami wave action at DCPP, particularly compared to the storm surge that has already been experienced at the plant. Waves from Alaska and Chile could be expected to reach DCPP in five and 13 hours, respectively.
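Those travel times are roughly what you would get from the standard shallow-water wave speed formula, c = sqrt(g × depth). Here is a minimal sanity-check sketch, assuming an average ocean depth of about 4,000 meters and very rough great-circle distances; both numbers are my assumptions, not figures from the report.

    import math

    def tsunami_travel_hours(distance_km, avg_depth_m=4000.0):
        """Estimate open-ocean tsunami travel time from the shallow-water
        wave speed c = sqrt(g * depth)."""
        g = 9.81                                       # gravitational acceleration, m/s^2
        speed_kmh = math.sqrt(g * avg_depth_m) * 3.6   # ~198 m/s, roughly 713 km/h
        return distance_km / speed_kmh

    # Very approximate source-to-site distances (my assumptions):
    print(round(tsunami_travel_hours(3800), 1))   # Gulf of Alaska -> central CA coast: ~5.3 hours
    print(round(tsunami_travel_hours(9500), 1))   # central Chile -> central CA coast: ~13.3 hours

So the five and thirteen hour figures are consistent with basic wave physics, which means plenty of warning time for distant-source tsunamis but none for a local fault.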

Practical advice: Got enough water to last a couple of weeks? Got enough batteries? Warm clothing if you lose electric power and natural gas?

By Randall Parker 2011 March 13 11:25 PM  Dangers Natural Geological
Entry Permalink | Comments(26)
2011 March 12 Saturday
The Problems With Passenger Rail

A New York Times story looks at why the Tampa-to-Orlando high speed rail project lost political support.

The story of the line’s rise and fall shows how it was ultimately undone by a tradeoff that was made when the route was first selected.

The Tampa-to-Orlando route had obvious drawbacks: It would have linked two cities that are virtually unnavigable without cars, and that are so close that the new train would have been little faster than driving. But the Obama administration chose it anyway because it was seen as the line that could be built first. Florida had already done much of the planning, gotten many of the necessary permits and owned most of the land that would be needed.

These cities were too close together to have air service between their airports. The train would have stopped many times, so the time savings over driving would have been small. And upon arriving in either city the need for a car would have been so great that a rental would have been necessary.

The fantasy for passenger rail advocates is Europe. But even there the fantasy does not hold up. In much more densely populated Europe the governments encourage mass transit use with high gasoline taxes and large government subsidies for passenger rail and other mass transit. In spite of these conditions cars still account for most miles traveled. A 2007 UK government report on transportation, "Are we there yet? A comparison of transport in Europe", contains a chart in chapter 2, "Figure 3: Overall mode share of distance travelled (%) in 2003", that speaks volumes about mass transit in Europe:

Figure 3: Overall mode share of distance travelled (%) in 2003

Only in Switzerland is more than 10% of passenger miles from rail. In the far less densely populated USA we can't get anywhere near those levels of mass transit penetration.

I like railroads. My first trip between America's coasts was on an Amtrak train. Trains are cool. I love to watch them lumbering by. I used to live next to a train track and did not mind the sounds of their passing. But if one's goal is to reduce reliance on oil (and that need seems urgent given fairly stagnant world oil production and large non-OECD oil consumption growth since 2000), then one has to compare the marginal costs of cutting oil demand across all the ways it could be cut (e.g. more hybrids, lighter-weight materials in cars, bikeways, technology to allow trucks to run automatically in groups on highways to cut wind resistance).

Multi-billion dollar passenger rail projects should not be undertaken just because they've got all their permits lined up and a few politicians and passenger rail enthusiasts are excited. Resource limitations and a $1.6 trillion US budget deficit call for setting a high bar for expected benefits from taxpayer-funded transportation spending.

Even in the realm of rail policy other options loom more tempting. The most obvious: policies aimed at shifting more freight traffic to rail. Rail in the US saves oil by pulling freight away from less energy-efficient trucks (while saving lives just as passenger rail can). A 2009 study for the Federal Railroad Administration found that trains are 1.9 to 5.5 times more fuel efficient for freight movement than trucks.

For all movements, rail fuel efficiency is higher than truck fuel efficiency in terms of ton-miles per gallon. The ratio between rail and truck fuel efficiency indicates how much more fuel efficient rail is in comparison to trucks. As illustrated in Exhibit 1-1, rail fuel efficiency varies from 156 to 512 ton-miles per gallon, truck fuel efficiency ranges from 68 to 133 ton-miles per gallon, and rail-truck fuel efficiency ratios range from 1.9 to 5.5.
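The ton-miles-per-gallon metric is simple: tons of freight times miles moved, divided by gallons of fuel burned. Here is a minimal sketch with made-up movement numbers chosen so they happen to fall inside the study's reported ranges; the specific tonnages and fuel figures are illustrative, not from the report.

    # Ton-miles per gallon: how much freight moved how far per gallon of fuel.
    def ton_miles_per_gallon(tons, miles, gallons):
        return tons * miles / gallons

    # Illustrative (made-up) movements over the same 500-mile lane:
    rail_tmpg = ton_miles_per_gallon(tons=3000, miles=500, gallons=4000)   # 375 ton-mi/gal
    truck_tmpg = ton_miles_per_gallon(tons=20, miles=500, gallons=80)      # 125 ton-mi/gal

    print(rail_tmpg, truck_tmpg)    # both fall inside the study's 156-512 and 68-133 ranges
    print(rail_tmpg / truck_tmpg)   # ratio = 3.0, within the reported 1.9-5.5 spread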

That link contains more about the causes of differences in rail and truck fuel efficiency than most of you want to know. One factor influencing train fuel efficiency is whether a train route allows double stacking. Well, if the US government wanted to shift more freight traffic to trains it could offer to pay part of the costs of raising bridges or reworking tunnels (e.g. through accelerated depreciation of investment costs) to accelerate the trend toward more double-stacking. Also, more crossings could be reworked so that trains and cars are separated by bridges. Doing this would speed up freight rail while also saving lives. Faster speeds would both cut rail freight delivery times and increase the total shipping capacity of rail lines. That would shift more freight to rail. Not as fun as a high speed train ride. But probably far more cost effective as a way to cut both oil usage and highway deaths.

Many passenger rail advocates are uninterested in trade-offs between different ways to spend taxpayer dollars. The cognitive deficiencies that lead them to their way of looking at things are probably not tractable without decades more of advances in genetic engineering and nanotechnology. But there's another approach that might work with a subset of them: passenger rail's role as an energy saver is far from clear.

When Amtrak compares its fuel economy with automobiles (see p. 19), it relies on Department of Energy data that presumes 1.6 people per car (see tables 2.13 for cars and 2.14 for Amtrak). But another Department of Energy report points out that cars in intercity travel tend to be more fully loaded — the average turns out to be 2.4 people.

“Intercity auto trips tend to [have] higher-than-average vehicle occupancy rates,” says the DOE. “On average, they are as energy-efficient as rail intercity trips.” Moreover, the report adds, “if passenger rail competes for modal share by moving to high speed service, its energy efficiency should be reduced somewhat — making overall energy savings even more problematic.”
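The occupancy assumption does most of the work in that comparison, since energy per passenger-mile is just energy per vehicle-mile divided by occupancy. Here is a minimal sketch of the arithmetic; the BTU-per-vehicle-mile figure below is a round illustrative number of my own, not the DOE table value.

    # Energy per passenger-mile = energy per vehicle-mile / number of occupants.
    def btu_per_passenger_mile(btu_per_vehicle_mile, occupancy):
        return btu_per_vehicle_mile / occupancy

    car_btu_per_vehicle_mile = 5500.0   # illustrative round number, not the DOE figure

    amtrak_assumption = btu_per_passenger_mile(car_btu_per_vehicle_mile, 1.6)   # ~3438 BTU/passenger-mile
    intercity_actual = btu_per_passenger_mile(car_btu_per_vehicle_mile, 2.4)    # ~2292 BTU/passenger-mile

    print(amtrak_assumption, intercity_actual)
    # Raising assumed occupancy from 1.6 to 2.4 cuts the car's per-passenger
    # energy use by a third, which is how rail's apparent advantage evaporates.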

Add in regulatory demands for higher car fuel efficiency and rail's energy efficiency advantage for moving people becomes even less clear once Prius-level fuel efficiency becomes the norm. Another source finds poor energy efficiency from light rail.

Since I see reduction in oil usage as far more urgent than reduction in overall energy usage, electrified passenger rail could still provide an advantage over gasoline-powered cars. But how long would it take for a passenger rail system to pay back the energy that went into its construction? Also, it is not clear in 2011 whether car battery costs will come down fast enough to erase that advantage for electrified passenger rail. My guess is that electrifying freight rail makes more sense than building out a massive passenger rail infrastructure in a country with a fairly low population density.

By Randall Parker 2011 March 12 05:18 PM  Energy Transportation
Entry Permalink | Comments(62)
2011 March 10 Thursday
John Hawks A Genetic Libertarian

Many genetics bloggers are responding to the very real threat of stifling US FDA regulations of the genetics direct-to-consumer (DTC) testing industry. Researcher John Hawks proclaims himself a genetic libertarian. So am I. How about you?

Much news coming out of the FDA public meeting on direct-to-consumer (DTC) genetics. Dan Vorhaus was at the proceedings and reports on them ("Looking Ahead After the FDA’s DTC Meeting").

I believe that I have a fundamental right to my own biological information. What I mean is that, if anybody has biological information about me, I should be able to access and use it. Additionally, I think it is immoral for anyone to charge me excessive rates to access my own information. So that's where I'm coming from. I'm a genetic libertarian.

As Razib Khan has pointed out, it is not credible to argue that medical doctors know the details of genetic research on humans. The threat of FDA regulation of DTC genetic testing is all about the FDA empowering itself and the medical doctors. It is not that they think we can't handle the truth. It is more likely they do not want to allow us to handle the truth. The idea that we are going to stress out over genetic testing results does not stand up to scrutiny. People who get themselves genetically tested are not psychologically harmed by the results. People really can handle the truth.

Joe Pickrell says genetic research can be accelerated by people who have a personal interest in discoveries about diseases or traits that they or people close to them have.

You can think what you want about the value of the research done to date by 23andme [1], but in my mind, there’s one simple reason why the sorts of participant-driven research they’re doing can only be a good thing: all research is driven by curiosity, and the people most curious about a disease or trait are those who have it. While people may think of the academic research community as a machine with endless resources and limitless motivation, it’s not. People work on things they think are interesting; they sometimes follow “trendy” topics, or move into fields with more grant money, or get bored of a given problem and move on. So if the research in the trait you’re most interested in isn’t moving fast enough for you, well, tough luck.

We already have examples of individuals who have spearheaded discoveries for genetic diseases they or family members suffered from. The FDA's regulatory ambitions for DTC genetic testing are an obstacle for this sort of research.

Recall that one of the key players in the discovery of the gene for Huntington’s disease was a foundation started by a man whose wife had the disease (startlingly, the current president of the foundation apparently accused DTC companies of “raping” the human genome during the present FDA hearing). Recall also that James Lupski, curious about the cause of his Charcot-Marie-Tooth disease, simply sequenced his own genome to find it.

In the comments of Joe Pickrell's post "Nick" says the ability to contribute to the growth of genetic knowledge is one reason he chose to get himself tested by 23andme.

As a recent 23andme customer I think it’s fair to say that this aspect of the the 23andWe community was probably the biggest single factor in my decision to submit a sample for testing. I’m realistic to know that the current state of genomics knowledge can at best give indications of marginal risks for the various health conditions, and the fact that my results show a large set of common alleles (‘common things are common’ is one of the medical doctrines that applies equally well here) adds to the feeling that the ‘traits’ data, and the contribution to a developing field of science is at least as much part of the value of 23andme’s product as the medical report.

As long as people are free to get themselves genetically tested and sequenced, volunteer efforts to crowdsource genetic information to discover the causes of diseases and traits can make a substantial and rapidly growing contribution to the rate of genetic discovery. The FDA is an obstacle to progress. It should get out of the way.

By Randall Parker 2011 March 10 10:28 PM  Policy Medical
Entry Permalink | Comments(5)
2011 March 09 Wednesday
FDA's Jeffrey Shuren: Perjury Against Genetic Testing Freedom?

Michael Lee of the FDABlog draws my attention to his post on FDA deception as part of its attempt to regulate direct-to-consumer genetic tests. Check out his video Did FDA's Jeffrey Shuren lie under oath about Google-backed 23andMe?

Michael's editorial text interspersed with the video makes great points. The worst statement from Shuren comes at the end where he basically calls for heavy regulation and a market dominated by large cap companies. Large regulatory agencies always end up getting captured by the big players and they work together to protect each other. But it is the small companies that are going to do the most innovation in new areas.

Anne Wojcicki's views are much closer to my own. We can do original and highly valuable science by crowdsourcing. We can get together as groups, pay to collect our own genetic and medical data, and submit the data to data chomping teams that organize on web sites. This approach is a challenge to the traditional gatekeepers and data owners.

The American Medical Association has predictably come out against allowing you to directly go and get yourself genetically tested. They want you to go thru a gatekeeper that has M.D. at the end of their name. I find this infuriating. I believe I have a right to get my full genome sequenced, to own the resulting data, and to get that data analyzed by whoever I want to pay to analyze it. Do you think you have the right to your own design? If so, oppose the FDA, the AMA, and Henry Waxman as they work to take that right away.

Razib Khan shares my fury on this, saying "This is a power grab, this is not about safety or ethics."

In the very near future you may be forced to go through a “professional” to get access to your genetic information. Professionals who will be well paid to “interpret” a complex morass of statistical data which they barely comprehend. Let’s be real here: someone who regularly reads this blog (or Dr. Daniel MacArthur or Misha’s blog) knows much more about genomics than 99% of medical doctors. And yet someone reading this blog does not have the guild certification in the eyes of the government to “appropriately” understand their own genetic information. Someone reading this blog will have to pay, either out of pocket, or through insurance, someone else for access to their own information. Let me repeat: the government and professional guilds which exist to defend the financial interests of their members are proposing that they arbitrate what you can know about your genome. A friend with a background in genomics emailed me today: “If they succeed in ramming this through, then you will not be able to access your own damn genome without a doctor standing over your shoulder.” That is my fear. Is it your fear? Do you care?

In the medium term this is all irrelevant. Sequencing will be so cheap that it will be impossible for the government and well-connected self-interested parties to prevent you from gaining access to your own genetic information. Until then, they will slow progress and the potential utility of this business. Additionally, this sector will flee the United States and go offshore, where regulatory regimes are not so strict. BGI should give glowing letters of thanks to Jeffrey Shuren and the A.M.A.! This is a power play where big organizations, the government, corporations, and professional guilds, are attempting to squelch the freedom of the consumer to further their own interests, and also strangle a nascent economic sector of start-ups as a side effect.

You are so much more than your genes. So much more than that 3 billion base pairs. But they are a start, a beginning, and how dare the government question your right to know the basic genetic building blocks of who you are. This is the same government which attempted to construct a database of genetic information on foreign leaders. We know very well then who they think should have access to this data. The Very Serious People with a great deal of Power. People with “clearance,” and “expertise,” have a right to know more about your own DNA sequence than you do.

Click thru and read what Razib is going to do about this. Also, read Daniel MacArthur and Misha Angrist.

Also see my post Daniel MacArthur On Freedom Of Genetic Info And Paternalism.

Fellow bloggers: tell your readers about this!

Update: Michael Lee has more:

"The statement, for 23andMe in particular, that 'they are not doing their own research on the genetic profiles,'" said Stanford's Serafim Batzoglou, "is patently false." "Clearly this is false," said Russ Altman, also at Stanford. "I am reviewing [23andMe's paper] in my annual review of translational bioinformatics."

The FDA wants to discredit 23andme in order to make it easier to ban direct-to-consumer genetic testing. The FDA wants to keep us chained to the gatekeepers.

Update II: Here's a letter from genetic anthropologist and US National Academy of Sciences member Henry Harpending telling the FDA we do not need the paternalism of the AMA when it comes to genetic information.

Dear FDA: I am writing to comment on the meeting to be held March 8-9 about direct to consumer (DTC) genetic testing (Docket FDA-2011-N-0066). I am especially motivated to write after reading the plea to you by the AMA that any DTC results of possible medical interest be censored to consumers. Their letter reflects an appalling paternalistic arrogance that would violate basic freedoms and impede public scientific understanding. I presume that if they could they would have you ban bathroom scales on the grounds that body weight must only be revealed in consultation with a “qualified medical professional.”

The AMA submission has two main themes. The first is that citizens are unable to understand the risks and predicted outcomes that might be reported and that experts are vital to provide guidance. My own experience is that I am perfectly capable of finding empirical risks from current literature, I expect I can do a much better and more thorough job than my personal physician, and even my teenage son can do it with no trouble. My own experience, again, is that only about 1 in 5 medical students know what Bayes’ Theorem is.

The second theme is that knowledge of potentially medically relevant genotypes can do some unspecified harm to customers. I have spent a total of six or so years on university IRBs, and this kind of worry is ever present. While there is much public loose talk about psychological harm and the like, within the committee room we all understand that the practice of witholding any data from subjects about themselves is nothing but protection from lawyers. I am perfectly free to refuse to participate in research and in clinical trials but I am not free to refuse to participate in federal censorship of knowledge of my own genotype.

I would urge you to keep freedom of information for consumers at the center of the table when you discuss regulation of the DTC genetic testing industry.

The AMA wants to stand between you and your genetic information. I am opposed. How about you? Writing letters to your elected representatives and to the FDA is one way to fight this.

By Randall Parker 2011 March 09 09:34 PM  Policy Medical
Entry Permalink | Comments(8)
2011 March 08 Tuesday
Grown Replacement Urethras Work In Kids

Regeneration and rejuvenation will become possible as a result of tissue engineering research aimed at growing replacement parts.

WINSTON-SALEM, NC – March 7, 2011 – Researchers at the Institute for Regenerative Medicine at Wake Forest University Baptist Medical Center and colleagues reported today on a new advance in tissue engineering. The team is the first in the world to use patients’ own cells to build tailor-made urinary tubes and successfully replace damaged tissue.

In an article published Online First by The Lancet, the research team reports replacing damaged segments of urinary tubes (urethras) in five boys. Tests to measure urine flow and tube diameter showed that the engineered tissue remained functional throughout the six-year (median) follow-up period.

“These findings suggest that engineered urethras can be used successfully in patients and may be an alternative to the current treatment, which has a high failure rate,” said Anthony Atala, M.D., senior author, director of the Wake Forest Institute for Regenerative Medicine and a pediatric urologic surgeon. “This is an example of how the strategies of tissue engineering can be applied to multiple tissues and organs.” 

Humans differ from cars in that cars can have their parts replaced when the parts wear out. When crucial human parts wear out we get sick and eventually die. When scientists succeed in growing replacement parts for all of our bodies (except our brains), then death due to aging will become avoidable as long as brain rejuvenation techniques can be made to work.

Atala's team has previously succeeded in growing replacement bladders that work in humans. Atala's team is also working on development of tissue engineering techniques to repair the bodies of damaged soldiers. More successes from his team and other labs will keep getting reported. Many body parts will be replaceable in 10 years.

By Randall Parker 2011 March 08 11:29 PM  Biotech Tissue Engineering
Entry Permalink | Comments(3)
2011 March 07 Monday
Mediterranean Diet Meta-Analysis Finds Benefits

A meta-analysis finds the Mediterranean diet has proven benefits.

The Mediterranean diet has proven beneficial effects not only regarding metabolic syndrome, but also on its individual components including waist circumference, HDL-cholesterol levels, triglycerides levels, blood pressure levels and glucose metabolism, according to a new study published in the March 15, 2011, issue of the Journal of the American College of Cardiology. The study is a meta-analysis, including results of 50 studies on the Mediterranean diet, with an overall studied population of about half a million subjects.

Here's the diet in broad outline:

The Mediterranean diet is a dietary pattern characterized by high consumption of monounsaturated fatty acids, primarily from olives and olive oils; daily consumption of fruits, vegetables, whole grain cereals, and low-fat dairy products; weekly consumption of fish, poultry, tree nuts, and legumes; a relatively low consumption of red meat; and a moderate daily consumption of alcohol, normally with meals.

Out of that list what are the good foods versus the less bad foods? Are whole grains beneficial or less bad? Are low-fat dairy products beneficial or less bad? A previous study provides clues to these questions.

Curiously the admonitions against eating lots of red meat put the Mediterranean diet at odds with most versions of the Paleo Diet. I'd like to see the Mediterranean and Paleo diets compared by blood triglycerides and blood sugar.

By Randall Parker 2011 March 07 10:56 PM  Aging Diet Studies
Entry Permalink | Comments(11)
Diverse Plant Communities Contain More Biomass

Reducing the number of plant species in an area reduces the productivity of that area.

An international team of researchers including professor Emmett Duffy of the Virginia Institute of Marine Science has published a comprehensive new analysis showing that loss of plant biodiversity disrupts the fundamental services that ecosystems provide to humanity.

This makes intuitive sense because different plants occupying the same niche each bring their own specializations of function that enable them to exploit different parts of that niche.

The team’s analysis shows that plant communities with many different species are nearly 1.5 times more productive than those with only one species (such as a cornfield or carefully tended lawn), and ongoing research finds even stronger benefits of diversity when the various other important natural services of ecosystems are considered. Diverse communities are also more efficient at capturing nutrients, light, and other limiting resources.

The analysis also suggests, based on laboratory studies of algae, that diverse plant communities generate oxygen—and take-up carbon dioxide—more than twice as fast as plant monocultures.

As humans shift more and more land into human uses, the total biomass on the planet could substantially decline. In fact, that has probably already happened given the large areas now under human control. Though farming in more naturally barren regions could deliver the opposite effect, adding biomass where extensive irrigation enables conversion of deserts into farm land.

By Randall Parker 2011 March 07 10:31 PM  Trends Agriculture
Entry Permalink | Comments(5)
2011 March 06 Sunday
Tiger Blood And Neuropeptide Y: Calm Under Fire

A Slate article took Charlie Sheen's (very entertaining IMO) comments about his tiger blood as an occasion to look at the science behind people who can respond very calmly and adaptively when in danger and under pressure.

Yale psychiatrist Andy Morgan, for example, has studied elite Special Forces recruits as they undergo "Survival, Evasion, Resistance, and Escape" training, a three-week course designed to simulate the tortures of enemy capture. The program is brutally stressful, yet many recruits preserve an amazing amount of mental clarity in the midst of it. When Morgan examined the poised trainees' blood tests, he saw that they were producing significantly more of "a goofy little peptide called neuropeptide Y" than other, more rattled recruits. The extra NPY was like a layer of stress-deflecting mental Kevlar; its effects are so pronounced that Morgan can tell whether a soldier has made it into the Special Forces or not just by looking at a blood test.

I've long argued that we will see the development of drugs that allow us to customize our minds and bodies to different work environments, home environments, and other conditions. Shift into hyperfocus mode, shift into social conversational mode, and other states of mind as needed. One can see why militaries would want this capability. Turn soldiers into calmly functioning machines when they go onto the battlefield. When first entering a dangerous battle, avoid having a significant percentage of soldiers either totally crack and run or become paralyzed zombies.

Turns out this is not new news. A 2004 report finds special forces have more NPY than normal soldiers.

Special forces soldiers, who had thirty-three percent higher plasma levels of neuropeptide Y than general troop soldiers, were found to possess clearer minds and to have out-performed other soldiers under stress. In a related study, Morgan and colleagues also discovered that soldiers in Combat Dive training who released more NPY during stress excelled in underwater navigation, and that hostage rescue team members with higher NPY levels during stress performed better.

Whether the Special Forces guys developed the ability to respond better to stress or have it as an innate ability is not clear, though my guess is more the latter. Special Forces also have their NPY drop faster after a stressful situation. So they adjust more rapidly both to stressful and to normal situations. They aren't called "special" for nothing.

Alterations in how our minds and bodies respond come with costs. Some evidence suggested that genetic alleles (variations) that boost NPY levels also boost atherosclerosis risk. So does calm under fire really mean that the stress is just taking a different kind of toll on your body? I am surprised by that result because other reports claim NPY reduces the stress response. Though another report found NPY injection causes obesity in female rats.

Once scientists develop a much clearer and far more detailed picture of stress response pathways it will become possible to tweak hormones and other biochemicals to fine tune our metabolic and cognitive responses to a great many environments and conditions.

By Randall Parker 2011 March 06 09:49 PM  Brain Emotion Alteration
Entry Permalink | Comments(16)
Even Legal Mental Work Getting Automated

Even lawyers are getting automated out of jobs. Back in 1978 legal discovery costs could run into the millions for large numbers of workers sifting thru documents.

When five television studios became entangled in a Justice Department antitrust lawsuit against CBS, the cost was immense. As part of the obscure task of “discovery” — providing documents relevant to a lawsuit — the studios examined six million documents at a cost of more than $2.2 million, much of it to pay for a platoon of lawyers and paralegals who worked for months at high hourly rates.

The world has radically changed due to advances in computer hardware and software. Computers now replace lots of legal brain power.

But that was in 1978. Now, thanks to advances in artificial intelligence, “e-discovery” software can analyze documents in a fraction of the time for a fraction of the cost. In January, for example, Blackstone Discovery of Palo Alto, Calif., helped analyze 1.5 million documents for less than $100,000.

Was the $100k the total cost? It is not clear. But an inflation calculator shows $2.2 million in 1978 is $7.4 million in 2011.
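The inflation adjustment is just a ratio of consumer price index values. Here is a minimal sketch using approximate annual-average CPI-U figures; the index values are my approximations, so the result lands near, not exactly on, the $7.4 million quoted above.

    # Convert a 1978 dollar amount to 2011 dollars using the CPI-U ratio.
    CPI_1978 = 65.2    # approximate annual average (my assumption)
    CPI_2011 = 224.9   # approximate annual average (my assumption)

    def to_2011_dollars(amount_1978):
        return amount_1978 * (CPI_2011 / CPI_1978)

    print(round(to_2011_dollars(2_200_000)))   # ~7,588,000 -- roughly $7.6 million in 2011 dollars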

Imagine a robot judge accepting filings from robot prosecutors and robot defense attorneys. The debates in many cases would proceed at speeds too fast for humans to follow. Still, a slower legal process could result from the computational costs of preparing arguments and analyzing opposing arguments. For example, imagine that the defense wants to prove its client was somewhere else at the time of a crime. The defense could file for a delay in order to have more time to run algorithms that filter thru video camera feeds looking for indications that some camera caught the defendant somewhere else.

The defense could also try to come up with legal theories that are too computationally expensive to disprove. The prosecution could then complain that the defense has unfairly asserted a position that can't be disproved because generating a disproof would require solving a problem in the NP-complete complexity class (intractable in practice today and possibly always).

Computational complexity will become a subject of legal rulings. Should reasonable doubt be allowed to rest on (seemingly improbable) interpretations of events that are computationally impossible to prove or disprove? Will defense teams (human or otherwise) be allowed to search for theories of events that can't be proved or disproved due to the enormous computational complexity of algorithms needed to test their theories?

Legal automation will proceed apace regardless of how these questions are resolved. Let us take a look at the big picture: how far will computer automation of human jobs go? Some people think that regardless of cognitive difficulty repetitive jobs will get automated but non-repetitive jobs won't. But as Paul Krugman notes at that link, even medical diagnosis stands a good chance of getting automated. So is it repetitive? That page also shows truck driving as a non-repetitive job. But wait a second. Google has automated guidance systems racking up many tens of thousands of miles driving cars on California highways. So how can truck driving be safe from automation? If truck driving isn't safe from automation then taxi driving isn't either.

Delivery truck driving might be safe for humans for a while longer just because humans have to hop out and deliver packages to the front door of a house or business. But suppose robotic delivery devices could deposit items into special delivery boxes out on streets? That'd allow automated delivery from not only online stores but also grocery stores.

What about the grocery stores? Well, no need for human check-out if robots get the food off the shelves. Kiva warehouse robots cut out human labor. These robots are going to enable automated local warehouses. So imagine grocery stores replaced with automated warehouses loading automated delivery vehicles to deliver groceries to houses. Deliveries could be scheduled for when you are at home so that getting perishables into the fridge in a timely manner won't be a problem.

New York Times writer Ron Lieber argues that online ordering from Amazon does not have to become as cheap as in-person shopping at Costco for Amazon to become preferred, given the time savings of online shopping. This has implications for automation. If you order stuff rather than buy it in person then its shipment to you is much more amenable to automation. As robotic delivery trucks hit the road and warehouse robots fall in cost, grow more powerful, and become easier to manage, the distance between home and warehouse will shrink. So the timeliness advantage of the local store will decline. Add in robotic delivery vehicles and why spend a couple of hours on a trip to a big box store?

Then there's work supervision. Supposedly non-repetitive and done by humans only. Why? The theory is that humans need human supervision. But if the bottom-level tasks get automated then human supervisors aren't needed for the robots. And for some kinds of human tasks computers will eventually do much more of the monitoring and directing. This is already happening. For example, humans who package up orders (whether in restaurant kitchens or warehouses) are in many cases just reading order lists off a computer screen. In many of those cases no human supervisor chose which list of foods or ordered goods to put on each screen. Humans are already under computer supervision. That'll happen even more in the future. We will enter our restaurant orders on a touch screen and no waitress will see the order before the cook (human or robotic) sees it.

What other jobs are under threat from computers? Martin Ford sees radiology as a prime candidate for computer automation. Given that rising medical costs are stagnating living standards I'd rather see that happen sooner than later.

News reporting is at risk of automation too. The companies running low-quality web content farms (whose content Google is trying to detect and avoid) are trying to develop automated news-writing software that produces higher-quality output. On the battlefield in Afghanistan, a ratio of 1 robot per 50 soldiers shows the trend toward robotic soldiers. So web wars and physical wars are both spurring the development of more computer automation.

By Randall Parker 2011 March 06 04:03 PM  Robotics Trends
Entry Permalink | Comments(16)
2011 March 04 Friday
Bill Ford: Pace Of Electric Car Development Big Unknown

Ford Motor Company Chairman Bill Ford does not know how fast electric car technology will develop and doesn't believe anyone else knows either.

"We still don't know what the winning technology is going to be...

Ford continued: "We've made a big bet on electric... but the pace at which that develops, I think anyone who can tell you that is lying."

I'm with Bill Ford on this one: We do not know. One can certainly find confident claims of rapid electric battery cost reductions. Even the White House makes claims of coming rapid battery cost reductions. But the people who make the most confident statements are too often those who know the least or have motivations to deceive. How about the year 2020? Predictions are all over the map.

Skeptics include some major car companies and researchers.

Alex Molinaroli, president of Johnson Controls Inc.'s battery division, is confident it can reduce the cost of producing batteries by 50% in the next five years, though the company won't say what today's cost is. The cost reduction by one of the world's biggest car-battery makers will mostly come from efficient factory management, cutting waste and other management-related expenses, not from any fundamental improvement of battery technology, he said.

But researchers such as Mr. Whitacre, the National Academies of Science and even some car makers aren't convinced, mainly because more than 30% of the cost of the batteries comes from metals such as nickel, manganese and cobalt. (Lithium makes up only a small portion of the metals in the batteries.)

Governments are currently subsidizing electric car purchases. The hope is these subsidies will lead to economies of scale and incentives for faster rates of innovation. But note that similar subsidies for many years have not yet made photovoltaics competitive with other means of generating electricity.

By Randall Parker 2011 March 04 09:15 AM  Energy Electric Cars
Entry Permalink | Comments(12)
2011 March 02 Wednesday
Surrogacy, Donated Egg, Who Is Mom?

In a legal case in Camden County, New Jersey, a couple used the husband's sperm and a donated egg to have a baby carried to term by a surrogate. Now the wife wants her name on the birth certificate and the state Registrar is balking, asserting that the wife has no legal standing to claim maternity.

But the state Registrar, an office that records birth certificates, asserted after the child's birth the wife had no legal grounds to claim maternity. Her only option, it said, is stepparent adoption.

The Family Court judge in Camden agreed with the Registrar, which wants to issue a second birth certificate -- this one with the mother's name left blank. And now, a three-judge appellate panel has upheld the Registrar's view.

Since the wife contributed neither a womb nor an egg, she really wasn't the mother at the moment of birth. She can only establish a mother's role by acting as one.

What will get more interesting: Embryos constructed using DNA from several donors with the resulting embryo implanted in a future artificial womb. When that baby emerges from the artificial womb who is Mom for the birth certificate? There won't be an obvious candidate.

By Randall Parker 2011 March 02 11:31 PM  Bioethics Reproduction
Entry Permalink | Comments(15)
US Southwest At Risk For Megadrought?

Megadrought. What a great word. Not tinny at all. Looking at the previous interglacial most like our current Holocene era, some climate history researchers found that the US southwest can undergo a megadrought during a warmer interglacial such as the one we are currently in.

In a letter published recently in the journal Nature, Los Alamos National Laboratory researchers and an international team of scientists report that the Southwest region of the United States undergoes "megadroughts"—warmer, more arid periods lasting hundreds of years or longer. More significantly, a portion of the research indicates that an ancient period of warming may be analogous to natural present-day climate conditions. If so, a cooler, wetter period may be in store for the region, unless it is thwarted by increased concentrations of greenhouse gasses in the atmosphere that could warm the planet.

In a previous interglacial known as Marine Isotopic Stage 11 the southwest experienced a drought lasting thousands of years.

The oldest warm period in MIS 11 appears somewhat analogous to the present-day Holocene interglacial period, which has been ongoing for about the past 10,000 years. During MIS 11, the ancient climate warmed dramatically by about 14 degrees Fahrenheit. This warming in the wake of a preceding period of cold gave rise to an abundance of plant life and seasonally wet conditions. As warming continued, grasses and shrubs died off and lakes dried up. The ensuing drought lasted thousands of years before ending abruptly with a cooler, wetter period.

The world's human population has exploded during this interglacial. Humans have spread into regions that are more hospitable now than they were during at least part of previous interglacials. If only we had a better understanding of climate we might be able to predict our risk of megadroughts in regions around the world. Such prediction would give us more time to prepare.

By Randall Parker 2011 March 02 10:23 PM  Climate History
Entry Permalink | Comments(20)
2011 March 01 Tuesday
Happier People Live Longer

Pessimists do not stick around as long.

“We reviewed eight different types of studies,” Diener said. “And the general conclusion from each type of study is that your subjective well-being – that is, feeling positive about your life, not stressed out, not depressed – contributes to both longevity and better health among healthy populations.”

A study that followed nearly 5,000 university students for more than 40 years, for example, found that those who were most pessimistic as students tended to die younger than their peers. An even longer-term study that followed 180 Catholic nuns from early adulthood to old age found that those who wrote positive autobiographies in their early 20s tended to outlive those who wrote more negative accounts of their young lives.

Of course there's the question of the direction of cause and effect. People with better health will be happier. Ditto smart people who see their way to success. A healthier and more capable body and mind has greater odds of having what it takes to go the distance.

There were a few exceptions, but most of the long-term studies the researchers reviewed found that anxiety, depression, a lack of enjoyment of daily activities and pessimism all are associated with higher rates of disease and a shorter lifespan.

Of course you might be thinking defeatist thoughts at this point: "My pessimistic outlook means I shouldn't even try to exercise and eat better since I'll die young regardless".

The message here is clear: Always Look On The Bright Side of Life.

Also, of course: Don't Worry Be Happy.

What other happy long-life songs am I missing?

Update: Employees who are expected to smile at work are better off fantasizing to feel good rather than faking smiles.

A new study led by a Michigan State University business scholar suggests customer-service workers who fake smile throughout the day worsen their mood and withdraw from work, affecting productivity. But workers who smile as a result of cultivating positive thoughts – such as a tropical vacation or a child’s recital – improve their mood and withdraw less.

So think positive thoughts about a tropical island or a road trip or a night on the town.

By Randall Parker 2011 March 01 11:30 PM  Aging Studies
Entry Permalink | Comments(11)