2010 February 28 Sunday
Alien Star Clusters In Milky Way Galaxy

Think about the implications: Intelligent alien species could have migrated into our galaxy from an older galaxy riding along with a star cluster.

KINGSTON, ON – As many as one quarter of the star clusters in our Milky Way – many more than previously thought – are invaders from other galaxies, according to a new study. The report also suggests there may be as many as six dwarf galaxies yet to be discovered within the Milky Way rather than the two that were previously confirmed.

"Some of the stars and star clusters you see when you look into space at night are aliens from another galaxy, just not the green-skinned type you find in a Hollywood movie. These 'alien' star clusters made their way into our galaxy over the last few billion years," says Terry Bridges, an astronomer at Queen's University in Kingston, Canada.

Six dwarf galaxies in the Milky Way: Are the aliens in them short in stature?

Galaxies collide with other galaxies and either merge into a single galaxy or exchange some mass. Did these alien star clusters come from accidental collisions? Or did some aliens contrive to eject their star clusters from their own galaxies aimed at the Milky Way billions of years ago? Maybe they were losing factions in battles in their own galaxies and escaped from otherwise certain destruction by riding away in star clusters.

Anyone understand the politics of alien star cluster escapes?

By Randall Parker 2010 February 28 04:35 PM  Space Alien Intelligence
Entry Permalink | Comments(17)
2010 February 26 Friday
Embryonic Epigenetic State Mapped

A group of scientists has systematically mapped state changes in embryonic cells as they turn into the various types of cells needed to form organs and a complete organism. This information is needed, for example, to figure out how to coax stem cells into forming replacement organs.

LA JOLLA, CA – February 2, 2010 –– Scientists at The Scripps Research Institute and The Genome Institute of Singapore (GIS) led an international effort to build a map that shows in detail how the human genome is modified during embryonic development. This detailed mapping is a significant move towards the success of targeted differentiation of stem cells into specific organs, which is a crucial consideration for stem cell therapy.

The study was published in the genomics journal Genome Research on February 4, 2010.

"The cells in our bodies have the same DNA sequence," said Scripps Research Professor Jeanne Loring, who is a senior author of the paper with Chia-Lin Wei of the Genome Institute of Singapore and the National University of Singapore and Isidore Rigoutsos of IBM Thomas J. Watson Research Center. "Epigenetics is the process that determines what parts of the genome are active in different cell types, making a nerve cell, for example, different from a muscle cell."

Making stem cells into useful therapy amounts in large part to getting control of methylation patterns on the DNA and manipulating other aspects of epigenetic state. To put it another way: Scientists need the ability to measure and manipulate the regulatory state of cells.

The genome is a few billion base pairs. Lots of methyl groups and proteins are basically parked at precise locations all over the DNA preventing some parts of the DNA from getting activated while allowing other parts to get read and used to operate the cell. How hard will it turn out to be to get all that regulatory state just right for cell therapies? If things do not go just right the risks include cancer, creation of the wrong cell types in the wrong places, and incomplete repair.

By Randall Parker 2010 February 26 05:27 PM  Biotech Stem Cells
Entry Permalink | Comments(1)
2010 February 25 Thursday
High Status Means More Brain Striatum Dopamine Receptors

If you have high social status you are more likely to enjoy rewards and feel motivated due to higher dopamine receptor concentrations.

Philadelphia, February 3, 2010 - People have typically viewed the benefits that accrue with social status primarily from the perspective of external rewards. A new paper in the February 1st issue of Biological Psychiatry, published by Elsevier, suggests that there are internal rewards as well.

Dr. Martinez and colleagues found that increased social status and increased social support correlated with the density of dopamine D2/D3 receptors in the striatum, a region of the brain that plays a central role in reward and motivation, where dopamine plays a critical role in both of these behavioral processes.

The researchers looked at social status and social support in normal healthy volunteers who were scanned using positron emission tomography (PET), a technology that allowed them to image dopamine type 2 receptors in the brain.

This data suggests that people who achieve greater social status are more likely to be able to experience life as rewarding and stimulating because they have more targets for dopamine to act upon within the striatum.

What's the direction of cause and effect here? Might the higher dopamine receptor concentration motivate people to do more to raise their status? Or does the brain grow more dopamine receptors once higher status has been achieved?

By Randall Parker 2010 February 25 11:17 PM  Brain Economics
Entry Permalink | Comments(7)
2010 February 24 Wednesday
Bigger Brain Reaction For Poor Getting Money

Humans have an instinctive desire to redistribute some of the wealth?

PASADENA, Calif.—The human brain is a big believer in equality—and a team of scientists from the California Institute of Technology (Caltech) and Trinity College in Dublin, Ireland, has become the first to gather the images to prove it.

Specifically, the team found that the reward centers in the human brain respond more strongly when a poor person receives a financial reward than when a rich person does. The surprising thing? This activity pattern holds true even if the brain being looked at is in the rich person's head, rather than the poor person's.

These conclusions, and the functional magnetic resonance imaging (fMRI) studies that led to them, are described in the February 25 issue of the journal Nature.

"This is the latest picture in our gallery of human nature," says Colin Camerer, the Robert Kirby Professor of Behavioral Economics at Caltech and one of the paper's coauthors. "It's an exciting area of research; we now have so many tools with which to study how the brain is reacting."

I always wonder how much people hold some opinion or preference due to reasoning they claim to have used to arrive at their conclusion. When is logical reasoning the cause of an opinion versus a rationalization produced after conscious awareness of the preference? My guess is that most of the time we are not aware of our rationalizing and we tend to want to believe that our conscious reasoning brought us to some point of view.

By Randall Parker 2010 February 24 10:35 PM  Brain Economics
Entry Permalink | Comments(17)
2010 February 23 Tuesday
Fishing Bans Protect Coral Reefs

Australia’s Great Barrier Reef is benefiting from marine reserve areas.

Australia’s Great Barrier Reef (GBR) is showing an extraordinary range of benefits from the network of protected marine reserves introduced there five years ago, according to a comprehensive new study published in the Proceedings of the US National Academy of Sciences.

The scientific team, a ‘who’s-who’ of Australian coral reef scientists, describe the findings as “a globally significant demonstration of the effectiveness of large-scale networks of marine reserves”.

“Our data show rapid increases of fish inside no-take reserves, in both reef and non-reef habitats,” says Professor Terry Hughes, Director of the ARC Centre of Excellence for Coral Reef Studies, speaking today at the American Association for the Advancement of Science meeting in San Diego, California.

“Critically, the reserves also benefit overall ecosystem health and resilience”, says lead author Dr Laurence McCook of the ARC Centre of Excellence for Coral Reef Studies and Great Barrier Reef Marine Park Authority.

“Outbreaks of coral-eating, crown-of-thorns starfish are less frequent on no-take reefs, which consequently have a higher abundance of healthy corals after outbreaks.”

I'd like to see a lot more fishing bans to give heavily depleted areas a chance to recover. Humanity is overfishing the oceans and the result is declining catches in many areas.

By Randall Parker 2010 February 23 10:51 PM  Trends Bio Resource Usage
Entry Permalink | Comments(2)
Mid Life Overweight Means Faster Mental Decline

Keep off the pounds or lose your mind more rapidly.

The adverse effects of being overweight are not limited to physical function but also extend to neurological function, according to research in the latest issue of The Journals of Gerontology Series A: Biological and Medical Sciences (Volume 65A, Number 1).

The publication presents a collection of ten articles highlighting new findings related to obesity in older persons.

"One of the unanticipated consequences of improved medical management of cardiovascular disease is that many obese individuals reach old age,” said Journal of Gerontology: Medical Sciences Editor Luigi Ferrucci, MD, PhD, of the National Institute on Aging. “We need a better understanding of the causes and consequences of obesity in older individuals — especially when obesity is associated with sarcopenia.”

A study headed by Anna Dahl, MS, of Sweden’s Jönköping University, found that individuals with higher midlife body mass index (BMI) scores had significantly lower general cognitive ability and significantly steeper decline than their thinner counterparts over time. These statistics were compiled from a study of Swedish twins that took place over the course of nearly 40 years, from 1963 to 2002; the results were the same for both men and women.

The press release also describes how people with a history of cyclical weight loss and gain have more problems in their old age. Try to lose weight in a way that you can keep it off. Of course, that's much easier to say than do.

We need great weight loss drugs that adjust the metabolism to keep weight off without side effects.

By Randall Parker 2010 February 23 10:45 PM  Aging Diet Brain Studies
Entry Permalink | Comments(11)
2010 February 22 Monday
Brain Genes Key For Facial Recognition

Your ability to recognize faces comes from your genes.

The ability to recognise faces is largely determined by your genes, according to new research at UCL (University College London).

Published today in the Proceedings of the National Academy of Sciences, scientists found that identical twins were twice as similar to each other in terms of their ability to recognise faces, compared to non-identical twins.

Researchers also found that the genetic effects that allow people to recognise faces are linked to a highly specific mechanism in the brain, unrelated to other brain processes such as the ability to recognise words or abstract art.

The researchers used the Cambridge Face Memory Test in this study. You can take the Cambridge Face Memory Test online.

It is going to be interesting to see which forms of cognitive ability are not part of the g-factor type of general intelligence. Once genetic trade-offs between different types of cognitive abilities become known prospective parents will face difficult choices. Which types of intellectual ability to favor? Abilities that enhance different types of athletic performance? Abilities that make someone a top lawyer? Or a combination of intellectual abilities, coordination, and stamina that makes for a top surgeon?

By Randall Parker 2010 February 22 11:03 PM  Brain Intelligence
Entry Permalink | Comments(0)
Intelligence Tracked To Brain Regions

Spearman's g-factor comes from a distributed set of brain regions.

PASADENA, Calif.—A collaborative team of neuroscientists at the California Institute of Technology (Caltech), the University of Iowa, the University of Southern California (USC), and the Autonomous University of Madrid have mapped the brain structures that affect general intelligence.

The study, to be published the week of February 22 in the early edition of the Proceedings of the National Academy of Sciences, adds new insight to a highly controversial question: What is intelligence, and how can we measure it?

The research team included Jan Gläscher, first author on the paper and a postdoctoral fellow at Caltech, and Ralph Adolphs, the Bren Professor of Psychology and Neuroscience and professor of biology. The Caltech scientists teamed up with researchers at the University of Iowa and USC to examine a uniquely large data set of 241 brain-lesion patients who all had taken IQ tests. The researchers mapped the location of each patient's lesion in their brains, and correlated that with each patient's IQ score to produce a map of the brain regions that influence intelligence.

Of course, if IQ differences can be traced down to physical differences in brain regions then IQ is a product of physical qualities of brains.

Connections between the brain regions matter too.

"One of the main findings that really struck us was that there was a distributed system here. Several brain regions, and the connections between them, were what was most important to general intelligence," explains Gläscher.

Once the genetic causes of intelligence differences become known and DNA testing becomes ultra-cheap the dating and mating game will change quite drastically. Equally intelligent people won't have equal odds at making smart babies because some will have some IQ-boosting genes on only one out of a chromosome pair and others will have the boosting genes on both chromosomes. The latter will make the most attractive mates for those who want smart babies. Also, in vitro fertilization with genetic testing to select embryos will become the rage for those most ambitious about their children.

By Randall Parker 2010 February 22 10:34 PM  Brain Intelligence
Entry Permalink | Comments(14)
Lower Cost Hospitals Achieve Equivalent Quality

W. Edwards Deming would not be surprised. Lower cost hospitals are not lower quality - at least for congestive heart failure treatment.

The costs that hospitals incur in treating patients vary widely and do not appear to be strongly associated either with the quality of care patients receive or their risk of dying within 30 days, according to a report in the February 22 issue of Archives of Internal Medicine, one of the JAMA/Archives journals.

"Hospitals face increasing pressure to lower cost of care while improving quality of care," the authors write as background information in the article. However, critics have expressed concerns about the trade-off between the two goals. "In particular, might hospitals with lower cost of care and lower expenditures devote less effort to improving quality of care? Might the pursuit of lower cost of care drive hospitals to be 'penny wise and pound foolish,' discharging patients sooner, only to increase re-admission rates and incur greater inpatient use over time?"

Lena M. Chen, M.D., M.S., of the University of Michigan, Ann Arbor, and colleagues conducted a national study of hospitals that discharged Medicare patients who were hospitalized for congestive heart failure or pneumonia in 2006. For each condition, the researchers used data from national databases to examine the association between hospital cost of care and several variables: 30-day death rates, readmission rates, six-month inpatient cost of care and a quality score based on several performance indicators for each condition.

Costs of care for each condition varied widely. Care for a typical patient with congestive heart failure averaged $7,114 and could range from $1,522 to $18,927, depending on which of the 3,146 hospitals discharged the patient. Cost of care for a typical patient with pneumonia averaged $7,040 and varied from $1,897 to $15,829 per hospitalization among 3,152 facilities.

"Compared with hospitals in the lowest-cost quartile [one-fourth] for congestive heart failure care, hospitals in the highest-cost quartile had higher quality-of-care scores (89.9 percent vs. 85.5 percent) and lower mortality [death] for congestive heart failure (9.8 percent vs. 10.8 percent)," the authors write. "For pneumonia, the converse was true. Compared with low-cost hospitals, high-cost hospitals had lower quality-of-care scores (85.7 percent vs. 86.6 percent) and higher mortality for pneumonia (11.7 percent vs. 10.9 percent)."

This makes sense in the United States especially because providers and consumers of health care services both lack sufficient incentives for lower costs. The payers aren't the receivers of medical treatments. The overall system encourages massive spending. So there's lots of potential for cost cutting. Also, the pursuit of higher quality can cut costs in lots of ways. Achieving higher quality requires better understanding of a process and correction of flaws in the process. In a hospital that includes cutting infections, cutting surgical mistakes, cutting drug dosage mistakes, and other improvements that lead to better outcomes at lower cost.

Better practices for handling catheters developed at Johns Hopkins eliminate almost all bloodstream infections in ICUs.

The state of Michigan, which used a five-step checklist developed at Johns Hopkins to virtually eliminate bloodstream infections in its hospitals' intensive care units, has been able to keep the number of these common, costly and potentially lethal infections near zero — even three years after first adopting the standardized procedures. A report on the work is being published in the February 20 issue of BMJ (British Medical Journal).

Peter Pronovost, M.D., Ph.D., a professor of anesthesiology and critical care medicine at Johns Hopkins University School of Medicine and a patient safety expert, says the widely heralded success in Michigan — the first state system to tackle in a wholesale fashion infections in central-line catheters ubiquitous in intensive-care units — has significantly changed the way physicians think about these infections.

A checklist for inserting catheters saves money and lives. This is not high tech.

The checklist contains five basic steps for doctors to follow when placing a central-line catheter: wash their hands; clean a patient's skin with chlorhexidine; wear a mask, hat, gown, and gloves and put sterile drapes over the patient; avoid placing a catheter in the groin, where infection rates are higher; and remove the catheter as soon as possible, even if there's a chance it might be needed again at some point.

Central lines are used regularly for patients in the ICU to administer medication or fluids, obtain blood tests, and directly gauge cardiovascular measurements such as central venous blood pressure. Each year roughly 80,000 patients become infected and 30,000 to 60,000 die at a cost of $3 billion nationally. Before heading to Michigan, Pronovost tested the checklist at Johns Hopkins Hospital, where catheter infections have also been virtually eliminated.

That these practices aren't already standard in every hospital tells us there's probably still plenty of low hanging fruit in the area of medical quality improvement. Oh, and Johns Hopkins is also trying to boost herd immunity among its employees. Good idea.

One can try to spend larger sums of money to prevent very few deaths (more here). But if the goal is really to save lives then a relentless pursuit of process improvements to achieve higher quality will save more lives and save money at the same time.

By Randall Parker 2010 February 22 10:07 PM  Policy Medical
Entry Permalink | Comments(0)
2010 February 21 Sunday
Cheap Solar Cells Alone Do Not Make Cheap Solar Power

Writing at greentechmedia.com, Craig Hunter argues that even if solar cells become almost free, the current PV panel approach carries so many other costs that competitive PV electric power remains a distant prospect.

Framed in this way, the litmus test for any solar energy technology is its ability in the next 10-20 years to be deployed in hundreds of gigawatts per year, delivering electricity at $.05-.07/kWh, even in areas that aren't very sunny. Given the load factors of PV installations, not to mention the possible need for storage, we need to consider a target installed system price of no more than $1/Wp.

Panel prices have indeed come down significantly, but the PV "experience curve" (15% cost reduction for each doubling of production) is too slow, requiring us to get to 40-80GW per year production just to reach sub-$1/Wp panel pricing. And unless we see disruptive improvements in conversion efficiency, the "balance-of-system" costs (i.e., all the system costs other than the solar panel itself) will make it impossible to achieve a $1/Wp system price even if the panels are nearly free.

So are PV's prospects really that bad? On homes, will PV integrated into roofing tiles provide a way to avoid most of the physical packaging costs of solar panels? I would expect PV tiles to avoid a lot of labor costs - at least on new roofs, where labor is already being paid to put up roofing tiles anyway. Even if that happens, how much of the total cost is the panel packaging? Will grid-tie inverter costs come down far enough to enable a sub-$1/Wp total system cost?
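
Hunter's experience-curve arithmetic is easy to reproduce. Here is a minimal sketch in Python; the $2/Wp starting price and 10 GW/yr starting production are my own illustrative assumptions, not figures from his article:

```python
import math

def panel_price(p0, prod0, prod, progress=0.85):
    """Experience-curve price: each doubling of production
    multiplies the price by the progress ratio (0.85 = 15% cut)."""
    doublings = math.log2(prod / prod0)
    return p0 * progress ** doublings

# Doublings needed to halve the panel price at a 15% learning rate:
halvings = math.log(0.5) / math.log(0.85)  # about 4.3 doublings

# Illustrative: $2/Wp panels at 10 GW/yr, production scaled to 80 GW/yr
price_at_80gw = panel_price(2.0, 10, 80)   # three doublings -> ~$1.23/Wp
```

Under these assumed starting numbers even three doublings of production only get panels down to about $1.23/Wp, which illustrates why a 15% learning rate is "too slow": it takes more than four doublings to halve the price, and the balance-of-system costs don't ride the same curve.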

By Randall Parker 2010 February 21 10:21 PM  Energy Solar
Entry Permalink | Comments(51)
We Can Easily Prevent Ice Ages

Assuming humans do not go extinct we should be able to prevent future ice ages. NASA Goddard Institute director and climate scientist James Hansen's Storms of My Grandchildren: The Truth About the Coming Climate Catastrophe and Our Last Chance to Save Humanity has a relevant passage. He says the Earth won't experience ice ages as long as humans live here.

The size of continental-scale ice sheets is mind-boggling. Although thinner toward the edges, ice over New York towered several times higher than the Empire State building--thick enough to crush everything in today's New York City to smithereens. But not to worry--even though we sometimes hear geoscientists talk as if ice ages will occur again, it won't happen--unless humans go extinct. Forces instigating ice ages, as we shall see, are so small and slow that a single chlorofluorocarbon factory would be more than sufficient to overcome any natural tendency toward an ice age. Ice sheets will not descend over North America and Europe as long as we are around to stop them.

So stopping ice ages is no big deal. Those of us who live to the day when rejuvenation therapies stop the aging process will not have to worry about a new ice age several thousand years from now.

What I want to know: Are there chlorofluorocarbons that do not cause ozone layer damage? Or perhaps are there chlorofluorocarbons so potent as greenhouse gases that they can be used in concentrations too low to do much damage to the ozone layer?

What we will probably need to figure out much sooner: What's the safest way to cool the planet? Enhanced bright cloud formation? Sulfur aerosols? Silicon dioxide aerosols? A space-based approach with satellites to reduce sunlight to the Earth would cost orders of magnitude more than silicon dioxide. But satellites would allow for much more rapid and precise control.

By Randall Parker 2010 February 21 10:11 PM  Climate Engineering
Entry Permalink | Comments(59)
New Wind Power Maps For United States

Wired points to a new report on wind power availability in the United States.

The amount of wind power that theoretically could be generated in the United States tripled in the newest assessment of the nation’s wind resources.

Current wind technology deployed in nonenvironmentally protected areas could generate 37,000,000 gigawatt-hours of electricity per year, according to the new analysis conducted by the National Renewable Energy Laboratory and consulting firm AWS Truewind. The last comprehensive estimate came out in 1993, when Pacific Northwest National Laboratory pegged the wind energy potential of the United States at 10,777,000 gigawatt-hours.
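
A quick sanity check on these numbers; the 4,000,000 GWh/yr figure for total US electricity consumption below is my own rough round figure, not from the report:

```python
new_est = 37_000_000        # GWh/yr, 2010 NREL / AWS Truewind estimate
old_est = 10_777_000        # GWh/yr, 1993 PNNL estimate
us_consumption = 4_000_000  # GWh/yr, rough assumed figure

ratio = new_est / old_est                # ~3.4x, consistent with "tripled"
avg_power_gw = new_est / 8760            # ~4,200 GW of average output
times_demand = new_est / us_consumption  # roughly 9x US consumption
```

So the theoretical resource is several times total US electricity demand; the binding constraints are cost, transmission, and intermittency rather than raw potential.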

Huge potential for lots of energy. Ah, but at what cost?

Check out an accompanying PDF document, which makes the argument that wind could supply 20% of the electric power in the United States by 2030. This document (basically a PDF slide show you can go through quickly - so don't be bashful about clicking on it) makes a number of interesting claims. Especially see page 20 for comparisons of costs of electricity from wind (at different quality levels depending on where sited), nuclear, natural gas, and coal with different levels of emissions control.

If these claims are correct then onshore wind at the highest quality locations is almost the same price as new coal plants using older style and dirtier combustion (at least without a carbon tax). Also, coal with integrated gasification combined cycle (IGCC, the cleanest way to burn coal) is more expensive than wind or natural gas, and without a carbon tax coal IGCC equals nuclear in cost. The fact that IGCC is so expensive explains why we do not see cleaner coal. Oh, and when we hear politicians tout "Clean Coal" without also stating that they want to require all new coal plants to use IGCC, we can know those politicians aren't being sincere.

Since coal burned with IGCC without Carbon Capture and Storage (CCS) already costs as much as nuclear power, and IGCC emits CO2 whereas nuclear does not, I do not see the point of "Clean Coal". Why not just use nuclear? Coal IGCC+CCS costs way more than nuclear. Again, see page 20. If CCS has a future it is probably with natural gas plants, chemical plants, and refineries.

Carbon taxes are still politically impossible in the United States. Given that fact, and given that Obama believes CO2 emissions reduction is necessary in order to prevent global warming, Obama's support for nuclear reactor loan guarantees in spite of anger from environmentalists is understandable. Even the Waxman-Markey climate bill in Congress makes only small steps with regard to coal CO2 emissions. Obama can't raise the cost of dirtier coal. So he's trying to lower the cost of cleaner nuclear. The US government has been doing the same for years with the Production Tax Credit for wind power. Wind and nuclear costs are being subsidized in order to lower their prices and make them competitive with dirty coal. This is politically easier than taxing pollution from coal electric plants.

You might think from looking at page 20 that wind is cheaper than nuclear and so why not just go with wind? Well, see page 50 for the problem with that line of thinking. In a nutshell: there's not enough of the cheaper higher quality wind from classes 4, 5, 6, and 7. We can't displace most of the fossil fuel electricity without getting into lower quality (more intermittent and slower) and therefore more expensive class 3 wind. But class 3 wind is very close to nuclear in price. Plus, nuclear has the advantage of working 24 hours per day and also nuclear works in areas where the winds are weak. The southeastern part of the United States has especially weak winds:

A wind resource map of the United States.

Note to Old South congressmen: Your future is nuclear if it is not coal.

There's strong wind offshore. But look again at page 50 and see the problem with offshore wind. Those blue bands on the right for offshore wind have really high prices. Offshore wind at 12 to 14 cents per kilowatt-hour is even more expensive than coal IGCC+CCS. Nuclear is much cheaper than either of them.

Suppose the US government puts in place policies that lead to a huge ramp in wind production. What happens to US CO2 emissions? See page 60. Even if wind rises to 20% of total US electricity production in 2030 total CO2 emissions from the electric power sector will still rise. Without either stronger measures to curtail demand, a big build of nuclear, or an embrace of much more expensive coal IGCC+CCS plants, CO2 production from burning coal will continue to increase. Again, in this light Obama's embrace of nuclear power in the face of howls from nuclear opponents is not surprising.

Currently coal provides 50% of US electric power. Coal IGCC+CCS to make coal into a carbon-free electric power source seems far too expensive. Why add 4 cents per kilowatt-hour to make coal clean when we can switch to nuclear and wind instead? Also, solar's cost continues to fall (unlike wind, btw - see page 16) and solar will become competitive as an afternoon peaking power source in the US southwest probably by the middle of this decade, and in more areas each year beyond. Nuclear, wind, and eventually solar make sense as replacements for dirty coal. The much hyped "clean coal" might never make sense. What are the realistic prospects for lowering the costs of coal IGCC+CCS? I have greater hope for 4th gen nuclear reactors (or perhaps small modular reactors) to lower the costs for nuclear than for IGCC+CCS to mature into a competitive alternative.

By Randall Parker 2010 February 21 12:08 PM  Energy Wind
Entry Permalink | Comments(16)
2010 February 20 Saturday
Ibuprofen Cuts Parkinson's Disease Risk

Ibuprofen (brands include Advil, Motrin, Midol, Nuprin) cuts the risk of developing Parkinson's disease.

ST. PAUL, Minn. – New research shows people who regularly take ibuprofen may reduce their risk of developing Parkinson's disease, according to a study released today that will be presented at the American Academy of Neurology's 62nd Annual Meeting in Toronto April 10 to April 17, 2010.

What I want to know: What's the effect of long term ibuprofen on all-cause mortality? Does the risk of stomach bleeding from taking ibuprofen outweigh the risk reduction from avoiding Parkinson's?

The research involved 136,474 people who did not have Parkinson's disease at the beginning of the research. Participants were asked about their use of non-steroid anti-inflammatory drugs (NSAIDs), including aspirin, ibuprofen and acetaminophen. After six years, 293 participants had developed Parkinson's disease.

The protective effect appears to be dose dependent.

The study found regular users of ibuprofen were 40 percent less likely to develop Parkinson's disease than people who didn't take ibuprofen. Also, people who took higher amounts of ibuprofen were less likely to develop Parkinson's disease than people who took smaller amounts of the drug. The results were the same regardless of age, smoking and caffeine intake.

The other studied NSAIDs did not deliver this benefit.

"Ibuprofen was the only NSAID linked to a lower risk of Parkinson's," said Xiang Gao, MD, with Harvard School of Public Health in Boston. "Other NSAIDs and analgesics, including aspirin and acetaminophen, did not appear to have any effect on lowering a person's risk of developing Parkinson's. More research is needed as to how and why ibuprofen appears to reduce the risk of Parkinson's disease, which affects up to one million people in the United States."

Since long term use of ibuprofen carries with it a risk of stomach bleeding (and the risk is greater above age 60), you shouldn't start taking it without a compelling reason that convinces you (and preferably your doctor too) that the benefits will outweigh the risks.

When many more genetic risk factors for Parkinson's become known it might become possible to narrow down to a much smaller population of people who are likely to cut their Parkinson's risk. Also, neurological or blood tests in middle age might some day help identify those at greater risk. Since most people do not get Parkinson's even in their old age most people won't experience a risk reduction from taking ibuprofen. The problem is that we do not know which people who take ibuprofen will benefit by either delaying or avoiding the development of Parkinson's.

Similarly, better genetic and other testing might identify those most at risk of bleeding when taking ibuprofen. The ideal group to take ibuprofen would be those at the least risk of stomach bleeding who are also at higher risk of getting Parkinson's. Perhaps in another 5-10 years we'll have enough genetic testing capability to identify who ought to take ibuprofen.

By Randall Parker 2010 February 20 09:40 PM  Brain Disorders
Entry Permalink | Comments(0)
Deep Ocean Pumps For Planet Cooling Problematic

One way to remove carbon dioxide from the atmosphere involves pumping deep, nutrient-rich water up to the surface, where the nutrients boost algae growth and thereby pull lots of carbon from the atmosphere as photosynthesis fixes carbon into sugars, fats, and proteins. But some researchers in Germany find that if the pumping system ever stops, the rebound would be worse than never intervening in the first place.

Pumping nutrient-rich water up from the deep ocean to boost algal growth in sunlit surface waters and draw carbon dioxide down from the atmosphere has been touted as a way of ameliorating global warming. However, a new study led by Professor Andreas Oschlies of the Leibniz Institute of Marine Sciences (IFM-GEOMAR) in Kiel, Germany, pours cold water on the idea.

"Computer simulations show that climatic benefits of the proposed geo-engineering scheme would be modest, with the potential to exacerbate global warming should it fail," said study co-author Dr Andrew Yool of the National Oceanography Centre, Southampton (NOCS).

Hundreds of millions of pipes would be needed. I wonder about the energy costs of such a scheme. The scheme at best would capture less than a tenth of annual carbon emissions.

A previous study, of which Yool was lead author, used an ocean general circulation model to conclude that literally hundreds of millions of pipes would be required to make a significant impact on global warming. But even if the technical and logistical difficulties of deploying the vast numbers of pipes could be overcome, exactly how much carbon dioxide could in principle be sequestered, and at what risk?

In the new study, the researchers address such questions using a more integrated model of the whole Earth system. The simulations show that, under most optimistic assumptions, three gigatons of carbon dioxide per year could be captured. This is under a tenth of the annual anthropogenic carbon dioxide emissions, which currently stand at 36 gigatons per year. A gigaton is a million million kilograms.
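As a quick sanity check on the quoted figures, the best-case capture fraction does come out under a tenth of emissions. A back-of-the-envelope sketch (assuming, as the press release implies, that both figures are gigatons of CO2):

```python
# Back-of-the-envelope check of the capture fraction, using the figures
# quoted above (assumed: both numbers are gigatons of CO2 per year).
captured_gt = 3.0    # best-case capture, gigatons per year
emissions_gt = 36.0  # annual anthropogenic CO2 emissions, gigatons

fraction = captured_gt / emissions_gt
print(f"Fraction of emissions captured: {fraction:.1%}")  # about 8.3%

# A gigaton is a billion tonnes, i.e. a million million (10^12) kilograms.
kg_per_gigaton = 1e9 * 1000
print(f"Kilograms per gigaton: {kg_per_gigaton:.0e}")
```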

The biggest benefit would come from the cold water cooling nearby land masses.

One surprising feature of the simulations was that the main effect occurred on land rather than the ocean. Cold water pumped to the surface cooled the atmosphere and the land surface, slowing the decomposition of organic material in soil, and ultimately resulting in about 80 per cent of the carbon dioxide sequestered being stored on land. "This remote and distributed carbon sequestration would make monitoring and verification particularly challenging," write the researchers.

Stopping the pumps would result in higher CO2 than would have happened without the pumps.

More significantly, when the simulated pumps were turned off, the atmospheric carbon dioxide levels and surface temperatures rose rapidly to levels even higher than in the control simulation without artificial pumps. This finding suggests that there would be extra environmental costs to the scheme should it ever need to be turned off for unanticipated reasons.

"All models make assumptions and there remain many uncertainties, but based on our findings it is hard to see the use of artificial pumps to boost surface production as being a viable way of tackling global warming," said Yool.

I expect anyone still alive 50 years from now (and quite possibly sooner) will witness large scale efforts at climate engineering. The human impact on climate will continue to increase because human discount rates for the future are so high, and many trends in land use amount to a big tragedy of the commons on a planetary scale. I hope the side effects of sulfur aerosols or silicon dioxide used for cooling are minimal.

By Randall Parker 2010 February 20 03:12 PM  Climate Engineering
Entry Permalink | Comments(3)
2010 February 18 Thursday
Dolphins Smart Enough To Deserve Better?

Are dolphins sufficiently self-aware to deserve more ethical treatment from humans?

Emory University neuroscientist Lori Marino will speak on the anatomical basis of dolphin intelligence at the American Association for the Advancement of Science conference (AAAS) in San Diego, on Sunday, Feb. 21 at 3:30 p.m.

"Many modern dolphin brains are significantly larger than our own and second in mass to the human brain when corrected for body size," Marino says.

A leading expert in the neuroanatomy of dolphins and whales, Marino will appear as part of a panel discussing these findings and their ethical and policy implications.

Some dolphin brains exhibit features correlated with complex intelligence, she says, including a large expanse of neocortical volume that is more convoluted than our own, extensive insular and cingulate regions, and highly differentiated cellular regions.

"Dolphins are sophisticated, self-aware, highly intelligent beings with individual personalities, autonomy and an inner life. They are vulnerable to tremendous suffering and psychological trauma," Marino says.

The growing industry of capturing and confining dolphins to perform in marine parks or to swim with tourists at resorts needs to be reconsidered, she says.

"Our current knowledge of dolphin brain complexity and intelligence suggests that these practices are potentially psychologically harmful to dolphins and present a misinformed picture of their natural intellectual capacities," Marino says.

Marino worked on a 2001 study that showed that dolphins can recognize themselves in a mirror – a finding that indicates self-awareness similar to that seen in higher primates and elephants.

By Randall Parker 2010 February 18 11:31 PM  Bioethics Humanity Definition
Entry Permalink | Comments(43)
Permafrost Line Moves North In Canada

In an area around the southern end of Hudson Bay in Canada the permafrost line has moved 130 kilometers (80 miles) northward in the last 50 years.

Quebec City, February 17, 2010–The southern limit of permanently frozen ground, or permafrost, is now 130 kilometers further north than it was 50 years ago in the James Bay region, according to two researchers from the Department of Biology at Université Laval. In a recent issue of the scientific journal Permafrost and Periglacial Processes, Serge Payette and Simon Thibault suggest that, if the trend continues, permafrost in the region will completely disappear in the near future.

The researchers measured the retreat of the permafrost border by observing hummocks known as "palsas," which form naturally over ice contained in the soil of northern peat bogs. Conditions in these mounds are conducive to the development of distinct vegetation—lichen, shrubs, and black spruce—that make them easy to spot in the field.

I would be curious to know how much methane is being released in the area that ceased to have permafrost. The resumption in the rise of atmospheric methane is cause for concern. A potential source of positive feedback in warming is the release of methane from formerly permafrost ground, since methane is a much more potent greenhouse gas than carbon dioxide. So far arctic methane does not amount to much of total global methane emissions:

They found that just over half of all methane emissions came from the tropics, with some 20m tonnes released from the Amazon river basin each year, and 26m tonnes from the Congo basin. Rice paddy fields across China and south and south-east Asia produced just under one-third of global methane, some 33m tonnes. Just 2% of global methane comes from Arctic latitudes, the study found, though the region showed the largest increases. The 31% rise in methane emissions there from 2003-07 was enough to help lift the global average increase to 7%.

By Randall Parker 2010 February 18 10:01 PM  Climate Trends
Entry Permalink | Comments(17)
2010 February 17 Wednesday
Oxytocin Improves Social Skills Of Autistics

Social skills come in a nasal spray.

Autism is a disease characterized by difficulties in communicating effectively with other people and developing social relationships. The team led by Angela Sirigu at the Centre de Neuroscience Cognitive (CNRS) has shown that the inhalation of oxytocin, a hormone known to promote mother-infant bonds and social relationships, significantly improved the abilities of autistic patients to interact with other individuals. To achieve this, the researchers administered oxytocin to 13 autistic patients and then observed their social behavior during ball games and during visual tests designed to identify ability to recognize faces expressing different feelings. Their findings, published in PNAS on 15 February 2010, thus reveal the therapeutic potential of oxytocin to treat the social disorders from which autistic patients suffer.

Blood tests for oxytocin levels might indicate who could benefit from extra oxytocin.

Because previous research has indicated that some people with autism might have abnormally low levels of oxytocin, conducting tests to identify those people and administering them the hormone might help as well, said Karen Parker, an assistant professor of psychiatry at Stanford University School of Medicine.

By Randall Parker 2010 February 17 10:48 PM  Brain Altruism
Entry Permalink | Comments(21)
Poor Teen Sleeping Due To Lack Of Blue Light?

Our technological civilization is depriving kids of the morning short-wavelength light they need to keep melatonin production, and hence their sleep cycle, properly timed.

In the study just published in Neuroendocrinology Letters, Dr. Figueiro and LRC Director Dr. Mark Rea found that eleven 8th grade students who wore special glasses to prevent short-wavelength (blue) morning light from reaching their eyes experienced a 30-minute delay in sleep onset by the end of the 5-day study.

If you want to go to bed later and wake up later, wear glasses that block the blue light frequencies in the morning.

"If you remove blue light in the morning, it delays the onset of melatonin, the hormone that indicates to the body when it's nighttime," explains Dr. Figueiro. "Our study shows melatonin onset was delayed by about 6 minutes each day the teens were restricted from blue light. Sleep onset typically occurs about 2 hours after melatonin onset."
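Dr. Figueiro's numbers fit together arithmetically. A small sketch (the 9:00 pm baseline melatonin onset here is hypothetical, chosen only for illustration):

```python
# Timing arithmetic implied by the quoted numbers (assumed: a steady
# 6-minute melatonin-onset delay per day over the 5-day study).
delay_per_day_min = 6
study_days = 5

total_melatonin_delay = delay_per_day_min * study_days
print(total_melatonin_delay)  # 30 minutes, matching the reported sleep-onset delay

# Sleep onset typically follows melatonin onset by about 2 hours, so a
# delayed melatonin onset pushes bedtime back by the same amount.
melatonin_onset_hour = 21.0  # hypothetical baseline: 9:00 pm
sleep_onset_hour = melatonin_onset_hour + 2 + total_melatonin_delay / 60
print(sleep_onset_hour)  # 23.5, i.e. 11:30 pm
```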

The kids need more blue light in their classrooms in the morning to get their melatonin production and circadian cycle working correctly.

The problem is that today's middle and high schools have rigid schedules requiring teenagers to be in school very early in the morning. These students are likely to miss the morning light because they are often traveling to and arriving at school before the sun is up or as it's just rising. "This disrupts the connection between daily biological rhythms, called circadian rhythms, and the earth's natural 24-hour light/dark cycle," explains Dr. Figueiro.

In addition, the schools are not likely providing adequate electric light or daylight to stimulate this biological or circadian system, which regulates body temperature, alertness, appetite, hormones and sleep patterns. Our biological system responds to light much differently than our visual system. It is much more sensitive to blue light. Therefore, having enough light in the classroom to read and study does not guarantee that there is sufficient light to stimulate our biological system.

"According to our study, however, the situation in schools can be changed rapidly by the conscious delivery of daylight, which is saturated with short-wavelength, or blue, light," reports Dr. Figueiro.

Anyone use a sort of light alarm clock where a really bright light comes on in your bedroom to help you wake up in the morning? I've thought of trying one. Might help to get going in the morning, especially in winter.

By Randall Parker 2010 February 17 07:10 PM  Brain Sleep
Entry Permalink | Comments(16)
2010 February 16 Tuesday
Caltech Scientists Hit New Solar Conversion Efficiency

Caltech scientists used thin silicon wires to achieve a previously unheard-of efficiency in converting photons into electric current.

PASADENA, Calif.—Using arrays of long, thin silicon wires embedded in a polymer substrate, a team of scientists from the California Institute of Technology (Caltech) has created a new type of flexible solar cell that enhances the absorption of sunlight and efficiently converts its photons into electrons. The solar cell does all this using only a fraction of the expensive semiconductor materials required by conventional solar cells.

"These solar cells have, for the first time, surpassed the conventional light-trapping limit for absorbing materials," says Harry Atwater, Howard Hughes Professor, professor of applied physics and materials science, and director of Caltech's Resnick Institute, which focuses on sustainability research.

The light-trapping limit of a material refers to how much sunlight it is able to absorb. The silicon-wire arrays absorb up to 96 percent of incident sunlight at a single wavelength and 85 percent of total collectible sunlight. "We've surpassed previous optical microstructures developed to trap light," he says.

This is an amazing accomplishment. At a competitive price such high efficiency in photovoltaic material would allow a much smaller footprint of land area to supply a very large fraction of all the energy we use. Now, if they can manage to do this cheaply it'll be a game changer. See below for reasons why this approach has big cost reduction potential.

They don't waste precious photons.

The silicon wire arrays created by Atwater and his colleagues are able to convert between 90 and 100 percent of the photons they absorb into electrons—in technical terms, the wires have a near-perfect internal quantum efficiency. "High absorption plus good conversion makes for a high-quality solar cell," says Atwater. "It's an important advance."

The key to the success of these solar cells is their silicon wires, each of which, says Atwater, "is independently a high-efficiency, high-quality solar cell." When brought together in an array, however, they're even more effective, because they interact to increase the cell's ability to absorb light.

Is a low price possible? They use far less silicon than in a conventional silicon PV cell. Plus, the flexibility of the material lends itself to lower cost manufacturing techniques.

Each wire measures between 30 and 100 microns in length and only 1 micron in diameter. "The entire thickness of the array is the length of the wire," notes Atwater. "But in terms of area or volume, just 2 percent of it is silicon, and 98 percent is polymer."

In other words, while these arrays have the thickness of a conventional crystalline solar cell, their volume is equivalent to that of a two-micron-thick film.

Since the silicon material is an expensive component of a conventional solar cell, a cell that requires just one-fiftieth of the amount of this semiconductor will be much cheaper to produce.
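The press release's figures are internally consistent. A quick sketch, assuming the 100-micron upper bound for wire length (and hence array thickness):

```python
# Checking the equivalent-film-thickness claim from the quoted figures
# (assumed: 100-micron wires, 2% silicon by volume).
array_thickness_um = 100   # wire length = array thickness, upper bound quoted
silicon_fraction = 0.02    # 2% of the array volume is silicon

equivalent_film_um = array_thickness_um * silicon_fraction
print(equivalent_film_um)  # 2.0 microns, matching the "two-micron-thick film"

# 2% silicon by volume also means one-fiftieth the semiconductor material:
print(1 / silicon_fraction)  # 50.0
```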

The composite nature of these solar cells, Atwater adds, means that they are also flexible. "Having these be complete flexible sheets of material ends up being important," he says, "because flexible thin films can be manufactured in a roll-to-roll process, an inherently lower-cost process than one that involves brittle wafers, like those used to make conventional solar cells."

Technology Review has a very good write-up on this advance.

By Randall Parker 2010 February 16 11:18 PM  Energy Solar
Entry Permalink | Comments(8)
100 Million Years Ago Volcanoes Cut Oxygen Supply

The natural history of the planet Earth is full of extreme events that would wipe out humans if they occurred today.

Geoengineering -- deliberate manipulation of the Earth's climate to slow or reverse global warming -- has gained a foothold in the climate change discussion. But before effective action can be taken, the Earth's natural biogeochemical cycles must be better understood.

Two Northwestern University studies, both published online recently by Nature Geoscience, contribute new -- and related -- clues as to what drove large-scale changes to the carbon cycle nearly 100 million years ago. Both research teams conclude that a massive amount of volcanic activity introduced carbon dioxide and sulfur into the atmosphere, which in turn had a significant impact on the carbon cycle, oxygen levels in the oceans and marine plants and animals.

For a planet that is 4.5 billion years old, 100 million years ago is recent geological history. What I want to know: Can Earth do this level of volcanic eruption again?

The volcanic eruptions cut ocean oxygen so much that one-third of marine life died.

Both teams studied organic carbon-rich sediments from the Western Interior Seaway, an ancient seabed stretching from the Gulf of Mexico to the Arctic Ocean, to learn more about a devastating event 94.5 million years ago when oxygen levels in the oceans dropped so low that one-third of marine life died.

The authors of the first paper, titled "Volcanic triggering of a biogeochemical cascade during Oceanic Anoxic Event 2," reveal that before oxygen levels dropped so precipitously there was a massive increase in oceanic sulfate levels. Their conclusion is based on analyses of the stable isotopes of sulfur in sedimentary minerals from the central basin of the Western Interior Seaway, located in Colorado.

A sulfate spike in the oceans increased phosphorus availability (how?) and phytoplankton went wild and created massive dead zones.

The researchers theorize that a massive amount of volcanic activity caused this sulfate spike, which triggered a cascade of biogeochemical events. More sulfate led to an abundance of the nutrient phosphorus, which allowed phytoplankton populations in the oceans to multiply. The phytoplankton thrived and then died. Their decomposing bodies depleted oxygen levels in the oceans, leading to the widespread death of marine animals.

We see a similar phenomenon on a smaller (albeit still large) scale today at the mouths of major rivers. Fertilizer run-off from farms causes massive dead zones. We need to restore wetlands that can serve as cleaners of rivers and also reduce agricultural run-off.

By Randall Parker 2010 February 16 10:30 PM  Climate Biodiversity
Entry Permalink | Comments(18)
2010 February 15 Monday
Embryonic Stem Cells Still Beat Induced Pluripotent Stem Cells

Some University of Wisconsin-Madison researchers compared induced pluripotent stem cells (iPS cells, made from reprogrammed adult cells) to embryonic stem cells and found that embryonic stem cells become other cell types more efficiently. Techniques for making iPS cells still need improvement.

MADISON — The great promise of induced pluripotent stem cells is that the all-purpose cells seem capable of performing all the same tricks as embryonic stem cells, but without the controversy.

However, a new study published this week (Feb. 15) in the Proceedings of the National Academy of Sciences comparing the ability of induced cells and embryonic cells to morph into the cells of the brain has found that induced cells — even those free of the genetic factors used to program their all-purpose qualities — differentiate less efficiently and faithfully than their embryonic counterparts.

The finding that induced cells are less predictable means there are more kinks to work out before they can be used reliably in a clinical setting, says Su-Chun Zhang, the senior author of the new study and a professor in the University of Wisconsin-Madison School of Medicine and Public Health.

"Embryonic stem cells can pretty much be predicted," says Zhang. "Induced cells cannot. That means that at this point there is still some work to be done to generate ideal induced pluripotent stem cells for application."

The biggest advantage of iPS cells is that they can be made from each person's adult cells. So they avoid immuno-rejection problems. Plus, they just feel less alien. They are your own cells. I also think there's another practical benefit I've yet to see mentioned: iPS cells made into neural stem cells are less likely to change your personality by producing neurons that'll behave differently for genetic reasons.

Dr. Zhang does not expect this advantage of embryonic stem cells to last for long.

Despite their unpredictability, Zhang notes that induced stem cells can still be used to make pure populations of specific types of cells, making them useful for some applications such as testing potential new drugs for efficacy and toxicity. He also noted that the limitations identified by his group are technical issues likely to be resolved relatively quickly.

"It appears to be a technical issue," he says. "Technical things can usually be overcome."

Many researchers are quite busy working on improved techniques for making iPS cells. For example, some Stanford researchers have just developed a safer and easier way to make induced pluripotent stem cells.

STANFORD, Calif. - Tiny circles of DNA are the key to a new and easier way to transform stem cells from human fat into induced pluripotent stem cells for use in regenerative medicine, say scientists at the Stanford University School of Medicine. Unlike other commonly used techniques, the method, which is based on standard molecular biology practices, does not use viruses to introduce genes into the cells or permanently alter a cell's genome.

It is the first example of reprogramming adult cells to pluripotency in this manner, and is hailed by the researchers as a major step toward the use of such cells in humans. They hope that the ease of the technique and its relative safety will smooth its way through the necessary FDA approval process.

"This technique is not only safer, it's relatively simple," said Stanford surgery professor Michael Longaker, MD, and co-author of the paper. "It will be a relatively straightforward process for labs around the world to begin using this technique. We are moving toward clinically applicable regenerative medicine."

Lowly fat cells harnessed to a higher purpose.

The Stanford researchers used so-called minicircles - rings of DNA about one-half the size of those usually used to reprogram cells - to induce pluripotency in stem cells from human fat.

By Randall Parker 2010 February 15 09:59 PM  Biotech Stem Cells
Entry Permalink | Comments(9)
Progress On Nanotech For DNA Sequencing

The ultimate DNA sequencing devices will process individual strands of DNA, one letter at a time, through measuring gates. Arizona State University researcher Stuart Lindsay leads a team using nanotech to read strands of DNA.

Lindsay's team relies on the eyes of nanotechnology, scanning tunneling (STM) and atomic force (AFM) microscopes, to make their measurements. The microscopes have a delicate electrode tip that is held very close to the DNA sample. In their latest innovation, Lindsay's team made two electrodes, one on the end of a microscope probe and another on the surface, that had their tiny ends chemically modified to attract and catch the DNA between a gap like a pair of chemical tweezers. The gap between these functionalized electrodes had to be adjusted to find the chemical bonding sweet spot, so that when a single chemical base of DNA passed through the tiny, 2.5 nanometer gap between two gold electrodes, it momentarily stuck to the electrodes and a small increase in the current was detected. Any smaller, and the molecules would be able to bind in many configurations, confusing the readout; any bigger, and smaller bases would not be detected.

"What we did was to narrow the number of types of bound configurations to just one per DNA base," said Lindsay. "The beauty of the approach is that all the four bases just fit the 2.5 nanometer gap, so it is one size fits all, but only just so!"

At this scale, which is just a few atomic diameters wide, quantum phenomena are at play where the electrons can actually leak from one electrode to the other, tunneling through the DNA bases in the process. Each of the chemical bases of the DNA genetic code, abbreviated A, C, T or G, gives a unique electrical signature as they pass between the gap in the electrodes. By trial and error, and a bit of serendipity, they discovered that just a single chemical modification to both electrodes could distinguish between all 4 DNA bases.

"We've now made a generic DNA sequence reader and are the first group to report the detection of all 4 DNA bases in one tunnel gap," said Lindsay. "Also, the control experiments show that there is a certain (poor) level of discrimination with even bare electrodes (the control experiments) and this is in itself, a first too."

The computer revolution has been driven by rapidly making devices of ever smaller sizes. Smaller is cheaper and faster. The same drive to smaller scale in biotechnology makes the rate of advance in biotech look more like the very fast rate of advance in computing. I am optimistic about the prospects for developing rejuvenation therapies mainly because of this drive to attack biological problems at much smaller scale which enables greater precision, finer control, more automation, and therefore much lower costs.

By Randall Parker 2010 February 15 05:37 PM  Biotech Assay Tools
Entry Permalink | Comments(1)
2010 February 14 Sunday
Highway Pollution Speeds Artery Wall Hardening

Living near freeways is bad for your health. Of course, you get exposed to particulates when driving on freeways as well.

Researchers at the Keck School of Medicine of the University of Southern California (USC), in collaboration with international partners in Spain and Switzerland and colleagues in California, have found that exposure to air pollution accelerates the thickening of artery walls that leads to cardiovascular disease.

The study, published this week in the journal PLoS ONE, is the first to link outdoor air quality and progression of atherosclerosis in humans. Researchers found that artery wall thickening among people living within 100 meters (328 feet) of a Los Angeles highway progressed twice as quickly as among those who lived farther away.

This result is an argument for electric cars, tougher regulations on coal electric plants, and accelerated retirement of old belching diesel trucks that were manufactured under easier regulations.

Here's an excerpt from the paper, which you can read in full.

We examined data from five double-blind randomized trials that assessed effects of various treatments on the change in CIMT. The trials were conducted in the Los Angeles area. Spatial models and land-use data were used to estimate the home outdoor mean concentration of particulate matter up to 2.5 micrometer in diameter (PM2.5), and to classify residence by proximity to traffic-related pollution (within 100 m of highways). PM2.5 and traffic proximity were positively associated with CIMT progression. Adjusted coefficients were larger than crude associations, not sensitive to modelling specifications, and statistically significant for highway proximity while of borderline significance for PM2.5 (P = 0.08). Annual CIMT progression among those living within 100 m of a highway was accelerated (5.5 micrometers/yr [95%CI: 0.13–10.79; p = 0.04]) or more than twice the population mean progression. For PM2.5, coefficients were positive as well, reaching statistical significance in the socially disadvantaged; in subjects reporting lipid lowering treatment at baseline; among participants receiving on-trial treatments; and among the pool of four out of the five trials.

Moving more freight to trains would also cut pollution since trains use much less diesel fuel per ton of freight moved. Also, trains lend themselves to electrification. A fully electrified train system with major bottlenecks sped up would move freight with far less harm to human health both due to less pollution and fewer deaths and injuries in accidents.

By Randall Parker 2010 February 14 06:38 PM  Pollution Health
Entry Permalink | Comments(9)
2010 February 12 Friday
Nissan Leaf Coming December 2010

Chuck Squatriglia in Wired reports on how to get a Nissan Leaf all-electric car and the likely cost. Nissan has given up on separately leasing the battery.

Nissan won’t say what the car costs until April, but it is shooting for a price in the $26,000 to $33,000 ballpark. The latest word is the car could be in the mid-20s after the $7,500 federal EV tax credit. That would seriously undercut the Volt, which General Motors is widely believed to be trying to keep under $40,000 before the tax credit, and make it competitive with the Toyota Prius hybrid.

Suppose it costs $33k before tax break. Will Nissan sell at a profit or a loss? To put it another way: What's Nissan's initial cost of production and, most important question of all, what are the prospects for lowered manufacturing costs for electric cars in a few years time?

Since I believe we are close to world Peak Oil, I see a lot riding on the cost of electric cars over the next 10 years. Governments cannot afford to subsidize large volumes of electric (e.g. Nissan Leaf) or pluggable hybrid electric (e.g. Chevy Volt) vehicles. 1 million vehicles at $7,500 each would cost $7.5 billion to subsidize. Industrialized countries have too much sovereign debt and cannot sustain big incentives.
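The subsidy arithmetic is simple. A sketch, assuming every buyer claims the full US federal credit:

```python
# Rough total cost of the US federal EV tax credit at scale, using the
# numbers above (assumed: every buyer claims the full $7,500 credit).
credit_per_vehicle = 7_500
vehicles = 1_000_000

total_cost = credit_per_vehicle * vehicles
print(f"${total_cost / 1e9:.1f} billion")  # $7.5 billion
```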

Nissan says Leaf production begins in Japan in October 2010, whereas GM says Chevy Volt production starts November 1, 2010. Given the time required to ship Leafs from Japan, the initial Volts might hit dealers first. But the Volt's price tag is likely to be close to $40k before the $7,500 tax credit in the US.

Given a choice between a Nissan Leaf or a Chevy Volt which is more appealing? The pure electric Leaf with about 100 miles range or the Volt with perhaps 40 miles range on electric power and a few hundred miles on gasoline power? Could you satisfy all your driving needs short of vacation trips with the Leaf?

Given the house or apartment you live in is it practical for you to recharge a car at home? How big a premium would you pay to get either a pure electric or pluggable hybrid?

By Randall Parker 2010 February 12 07:30 PM  Energy Electric Cars
Entry Permalink | Comments(14)
2010 February 11 Thursday
3 Genetic Causes Of Stuttering Found

3 of the many genetic causes of stuttering have been identified.

Feb. 10, 2010 -- Researchers with the National Institutes of Health (NIH) have identified three genes that may predispose people to stuttering -- a condition that affects 3 million Americans and 5% of young children.

Because stuttering tends to run in families, it has long been suspected that genes play a role in the speech disorder.

Many genetic mutations are suspected of causing stuttering.

The GNPTAB, GNPTG, and NAGPA variants were found in only a small proportion of cases, together accounting for 21 of 393 cases in unrelated, affected subjects — a finding that is consistent with the genetic heterogeneity that underlies stuttering. Causative factors in the remaining 95% of cases remain to be elucidated. Because the identified variants are rare, they would have escaped detection on a standard genomewide association screen.
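The quoted fractions check out. A quick sketch using the paper's case counts:

```python
# Checking the fraction of stuttering cases explained by the three gene
# variants, using the case counts quoted above.
explained = 21
total_cases = 393

fraction_explained = explained / total_cases
print(f"{fraction_explained:.1%} explained")        # about 5.3%
print(f"{1 - fraction_explained:.0%} unexplained")  # about 95%, as the paper says
```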

Massive cheap genetic scanning is needed to tease out the many genetic causes of stuttering. The same is true for many other genetic disorders. Fortunately the cost of genetic testing and sequencing has fallen by orders of magnitude in recent years, and the declines continue. That's why we are seeing more reports of discoveries of the causes of genetic disorders. I predict we will soon see a lot of reports about genetic causes of personality and IQ differences because the tools are finally available to collect sufficient amounts of genetic data to discover the causes.

Mutations in the GNPTAB and GNPTG genes cause stuttering; mutations in those same genes cause lysosomal storage diseases when people inherit two copies.

GNPTAB encodes its enzyme with the help of another gene called GNPTG. In addition, a second enzyme, called NAGPA, acts at the next step in this process. Together, these enzymes make up the signaling mechanism that cells use to steer a variety of enzymes to the lysosome to do their work. Because of the close relationship among the three genes in this process, the GNPTG and NAGPA genes were the next logical place for the researchers to look for possible mutations in people who stutter. Indeed, when they examined these two genes, they found mutations in individuals who stutter, but not in control groups.
The GNPTAB and GNPTG genes have already been tied to two serious metabolic diseases known as mucolipidosis (ML) II and III. MLII and MLIII are part of a group of diseases called lysosomal storage disorders because improperly recycled cell components accumulate in the lysosome. Large deposits of these substances ultimately cause joint, skeletal system, heart, liver, and other health problems as well as developmental problems in the brain. They are also known to cause problems with speech.
“You might ask, why don’t people with the stuttering mutations have more serious complications? Why don’t they have an ML disease?” posed Dr. Drayna, senior author of the paper. “ML disorders are recessive. You need to have two copies of a defective gene in order to get the disease. Nearly all of the unrelated individuals in our study who stuttered had only one copy of the mutation. Also, with stuttering, the protein is still made, but it’s not made exactly right. With ML diseases, the proteins typically aren’t made at all. Still, there are a few complexities remaining to be understood, and we’d like to learn more about them.”

By Randall Parker 2010 February 11 10:26 PM  Brain Genetics
Entry Permalink | Comments(0)
Bees Caffeine And Nicotine Fiends

Some innocent-looking flowers are really dope pushers who hook bees on drugs. Bees prefer nectar that contains caffeine and nicotine.

Bees prefer nectar with small amounts of nicotine and caffeine over nectar that does not comprise these substances at all, a study from the University of Haifa reveals. “This could be an evolutionary development intended, as in humans, to make the bee addicted,” states Prof. Ido Izhaki, one of the researchers who conducted the study.

Do bees get a buzz from the drugged nectar?

The innocent and sweet portions of the plant and animal kingdoms are never as sweet and innocent as natural mythologies would have us believe. The never-ending competition for survival and reproductive success ensures that the struggle is brutal, relentless, and amoral.

Update: Don't start feeling sorry for those drug addicted bees. Oh no. Honey bees are no more innocent than drug pusher flowers. Those supposedly innocent natural honey bees fight bees from rival colonies for food. It is a Malthusian bee-eat-bee world out there.

A biologist at UC San Diego has discovered that honey bees warn their nest mates about dangers they encounter while feeding with a special signal that's akin to a "stop" sign for bees.

The discovery, detailed in a paper in the February 23 issue of the journal Current Biology, which appears online today, resulted from a series of experiments on honey bees foraging for food that were attacked by competitors from nearby colonies fighting for food at an experimental feeder. The bees that were attacked then produced a specific signal to stop nest mates who were recruiting others for this dangerous location. Honey bees use a waggle dance to communicate the location of food and other resources. Attacked bees directed "stop" signals at nest mates waggle dancing for the dangerous location.

By Randall Parker 2010 February 11 05:27 PM  Brain Addiction
Entry Permalink | Comments(11)
2010 February 10 Wednesday
Habitat Loss And Poaching Work Against Tigers

Poaching and habitat loss continue to tilt the odds against survival for wild tigers.

WASHINGTON, DC, February 10, 2010 – As many Asian countries prepare to celebrate Year of the Tiger beginning February 14, World Wildlife Fund (WWF) reports that tigers are in crisis around the world, including here in the United States, where more tigers are kept in captivity than are alive in the wild throughout Asia. As few as 3,200 tigers exist in the wild in Asia where they are threatened by poaching, habitat loss, illegal trafficking and the conversion of forests for infrastructure and plantations.

WWF is releasing a new interactive map of the world’s top 10 tiger trouble spots and the main threats against tigers. WWF is also launching a campaign: Tx2: Double or Nothing to support tiger range states in their goal of doubling wild tiger numbers by the next Year of the Tiger in 2022.

The issues highlighted in the trouble spots map (www.worldwildlife.org/troublespots) include:

  • Pulp, paper, palm oil and rubber companies are devastating the forests of Indonesia and Malaysia, home to two endangered tiger sub-species;
  • Hundreds of new or proposed dams and roads in the Mekong region will fragment tiger habitat;
  • Illegal trafficking in tiger bones, skins and meat feeds a continued demand in East and Southeast Asia;
  • More tigers are kept in captivity in the U.S. than are left in the wild -- and there are few regulations to keep these tigers from ending up on the black market. The largest numbers of captive tigers are in Texas (an estimated 3,000+), but they are also kept in other states;
  • Poaching of tigers and their prey, along with a major increase in logging is taking a heavy toll on Amur, or Siberian, tigers;
  • Tigers and humans are increasingly coming into conflict in India as tiger habitats shrink;
  • Climate change could reduce tiger habitat in Bangladesh’s Sundarbans mangroves by 96 percent.

I expect the habitat loss to continue due to growing human populations and industrialization. Biotechnological innovations that increase the uses of land to make agricultural products (most notably biofuels) might well accelerate this process.

Tigers have already lost 93% of their historic range.

Three tiger sub-species have gone extinct since the 1940s and a fourth one, the South China tiger, has not been seen in the wild in 25 years. Tigers occupy just seven percent of their historic range. But they can thrive if they have strong protection from poaching and habitat loss and enough prey to eat.

Lions and tigers and bears. Oh my, where did they all go?

By Randall Parker 2010 February 10 10:33 PM  Trends Habitat Loss
Entry Permalink | Comments(4)
IBM Uses Cheap Materials For Solar Cells

Decent-efficiency photovoltaic (PV) solar cells can be made from cheap copper, zinc, tin, and sulfur plus the rarer element selenium. The efficiency is close to that of commercial thin film PV.

Researchers at IBM have increased the efficiency of a novel type of solar cell made largely from cheap and abundant materials by over 40 percent. According to an article published this week in the journal Advanced Materials, the new efficiency is 9.6 percent, up from the previous record of 6.7 percent for this type of solar cell, and near the level needed for commercial solar panels. The IBM solar cells also have the advantage of being made with an inexpensive ink-based process.

Even with selenium this type of cell has materials cost advantages over First Solar's existing commercial thin films made of cadmium telluride. Also, this cell has advantages over the CIGS (copper indium gallium selenide) cells of the newer thin film manufacturers since indium and gallium cost more and CIGS also uses selenium.

But currently capital costs are more important than materials costs - at least among the thin film makers. The thin film makers are managing to find ways to cut their costs without switching to cheaper elements. Some CIGS PV makers expect to get their costs down to 50 cents per watt in 2010 versus current low cost leader First Solar's 85 cents a watt in 2009. Also, silicon polycrystal prices have fallen so far - and have the potential to fall further - that silicon PV makers shouldn't be counted out of the race to lower costs.
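The efficiency and cost figures above check out with simple arithmetic (nothing here beyond the numbers already quoted):

```python
# CZTS-type cell efficiency: 9.6% now vs. the previous 6.7% record.
old_eff, new_eff = 6.7, 9.6
gain = (new_eff - old_eff) / old_eff
print(f"Relative efficiency gain: {gain:.0%}")  # 43%, i.e. "over 40 percent"

# The thin film cost race: CIGS makers' 2010 target vs. First Solar's 2009 cost.
first_solar, cigs_target = 0.85, 0.50  # dollars per watt
saving = 1 - cigs_target / first_solar
print(f"The CIGS target undercuts First Solar by {saving:.0%}")  # 41%
```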

Price declines in PV, bigger incentives by state and federal governments, and state requirements for more power from renewables are combining to cause a possible doubling in the amount of PV installed in the United States in 2010 as compared to 2009.

By Randall Parker 2010 February 10 10:09 PM  Energy Solar
Entry Permalink | Comments(11)
2010 February 09 Tuesday
Subtropics In Danger Of Further Desertification

38 percent of the world's land surface is in danger of desertification.

"Despite improvements in the LCA, it has a methodological weakness, which is a lack of environmental impact categories to measure the effect of human activities such as cultivation or grazing on the soil", Montserrat Núñez, lead author and a researcher at the Institute of Agro Food Research and Technology (IRTA), tells SINC.

The research, published in the latest issue of the International Journal of Life Cycle Assessment, is the first study in the world to include the impact of desertification in the LCA, based on classifying 15 natural areas or "eco-regions" according to their degree of aridity. By simultaneously using the LCA and a Geographic Information System (GIS), the researchers have shown that eight of these 15 areas can be classified as at risk of desertification, representing 38% of the land surface of the world.

Excessive pumping of aquifers (a huge problem btw), farming practices, and other factors contribute to the risk.

The subtropical regions are most at risk.

The eight natural areas at risk are coastal areas, the Prairies, the Mediterranean region, the savannah, the temperate Steppes, the temperate deserts, tropical and subtropical Steppes, and the tropical and subtropical deserts.

"The greatest risk of desertification (7.6 out of 10 on a scale produced using various desertification indicators) is in the subtropical desert regions – North Africa, the countries of the Middle East, Australia, South West China and the western edge of South America", the scientist explains.

Sounds like strange days have come.

By Randall Parker 2010 February 09 09:22 PM  Trends Habitat Loss
Entry Permalink | Comments(3)
Soy Isoflavones Not Protective For Bones?

Not looking so good for isoflavone benefits.

AMES, Iowa -- A previous six-month study by Iowa State University researchers had indicated that consuming modest amounts of soy protein, rich in isoflavones, lessened lumbar spine bone loss in midlife, perimenopausal women. But now an expanded three-year study by some of those same researchers does not show a bone-sparing effect in postmenopausal women who ingested soy isoflavone tablets, except for a modest effect at the femoral (hip) neck among those who took the highest dosage.

The multi-center clinical trial of 224 postmenopausal women -- led by D. Lee Alekel, professor of nutrition and interim associate director of the Nutrition and Wellness Research Center (NWRC) at Iowa State, and supported by the National Institute of Arthritis, Musculoskeletal and Skin Diseases, one of the research institutes of the National Institutes of Health (NIH) -- was the longest ever conducted on the effects of soy isoflavones on bone mineral density (BMD). It compared the effects of either ingesting daily 80-mg daily or 120-mg soy isoflavone tablets, compared to placebo tablets on BMD and other health outcomes.

The hope was that the isoflavones would act like hormone substitutes and therefore help reduce bone loss. There might have been a small benefit at the femoral (hip) neck.

While the 120-mg dose soy isoflavones did reveal a small protective effect on femoral neck bone BMD, researchers found no significant effect of treatment on lumbar spine, total hip, or whole-body BMD.

There's still exercise, vitamin D, vitamin K, and of course beer for the silicon.

By Randall Parker 2010 February 09 09:08 PM  Aging Diet Bone Studies
Entry Permalink | Comments(1)
Urbanization And Exports Drive Deforestation

The hope in some circles was that urbanization would decrease pressures on forests. A new study finds that urbanization does not help prevent loss of forests.

The drivers of tropical deforestation have shifted in the early 21st century to hinge on growth of cities and the globalized agricultural trade, a new large-scale study concludes. The observations starkly reverse assumptions by some scientists that fast-growing urbanization and the efficiencies of global trade might eventually slow or reverse tropical deforestation. The study, which covers most of the world’s tropical land area, appears in this week’s early edition of the journal Nature Geoscience.

I am not surprised that exports drive deforestation. Affluent countries and developing countries like China can afford to buy growing amounts of timber and crops. So, for example, parts of the Amazon get cut down to expand Brazilian agricultural output for international markets. But the result with urbanization is surprising to me.

Large industrial farms are replacing rural dwellers and pushing into forests.

Deforestation has been a rising concern in recent decades, especially with the recognition that it may exacerbate climate change. Studies in the late 20th century generally matched it with growing rural populations, as new roads were built into forests and land was cleared for subsistence agriculture. Since then, rural dwellers have been flooding into cities, seeking better living standards; 2009 was recorded as the first year in history when half of humanity lived in urban areas. Large industrial farms have, in turn, taken over rural areas and expanded further into remaining forests, in order to supply both domestic urban populations and growing international agricultural markets, the study suggests.

I read news reports of big investments in African farm operations by business interests in Saudi Arabia, China, and other countries. I expect this trend to continue. As people become more affluent they eat higher on the food chain. Instead of living directly on grains they get more of their calories from meat and milk. Of course this requires much more grain to feed livestock.

“The main drivers of tropical deforestation have shifted from small-scale landholders to domestic and international markets that are distant from the forests,” said lead author Ruth DeFries, a professor at the Earth Institute’s Center for Environmental Research and Conservation. “One line of thinking was that concentrating people in cities would leave a lot more room for nature. But those people in cities and the rest of the world need to be fed. That creates a demand for industrial-scale clearing.”

DeFries and her colleagues analyzed remote-sensing images of forest cover across 41 nations in Latin America, Africa and Asia from 2000-2005, and combined these with population and economic trends. They showed that the highest forest losses were correlated with two factors: urban growth within countries; and, mainly in Asia, growth of agricultural exports to other countries. Rural population growth was not related.

Since the world's population is headed toward 9 billion and much of Asia is industrializing, much more of the remaining rain forests will go under the plow.

By Randall Parker 2010 February 09 08:56 PM  Trends Habitat Loss
Entry Permalink | Comments(4)
2010 February 08 Monday
Genetic Variant Influences Aging Rate?

A genetic variant appears to influence the length of telomeres, the protective caps on the ends of chromosomes. These caps shorten as cells divide, and telomere length is considered a marker of biological aging.

Scientists announced today (7 Feb) they have identified for the first time definitive variants associated with biological ageing in humans. The team analyzed more than 500,000 genetic variations across the entire human genome to identify the variants which are located near a gene called TERC.

It is important to note that telomere shortening might offer a life-extending benefit: shorter telomeres might stop at least some cancer cells from dividing too many times. The genetic variant that shortens telomeres might therefore have been selected for in some humans.

Shorter telomeres are linked to some diseases of old age.

"What we studied are structures called telomeres which are parts of one's chromosomes. Individuals are born with telomeres of certain length and in many cells telomeres shorten as the cells divide and age. Telomere length is therefore considered a marker of biological ageing.

"In this study what we found was that those individuals carrying a particular genetic variant had shorter telomeres i.e. looked biologically older. Given the association of shorter telomeres with age-associated diseases, the finding raises the question whether individuals carrying the variant are at greater risk of developing such diseases"

Possibly the genetic variant causes telomeres to shorten more rapidly.

Professor Tim Spector from King's College London and director of the TwinsUK study, who co-led this project, added:

"The variants identified lie near a gene called TERC which is already known to play an important role in maintaining telomere length. What our study suggests is that some people are genetically programmed to age at a faster rate. The effect was quite considerable in those with the variant, equivalent to between 3-4 years of 'biological aging' as measured by telomere length loss. Alternatively genetically susceptible people may age even faster when exposed to proven 'bad' environments for telomeres like smoking, obesity or lack of exercise – and end up several years biologically older or succumbing to more age-related diseases."

A cancer cure with minimal side effects would open the door for a variety of rejuvenation therapies. Treatments to lengthen telomeres would carry less risk if cancer were easily curable.

Stem cell therapies hold out the hope of working around the telomere shortening problem. Old cells can accumulate dangerous genetic mutations that can lead to cancer. But in the future stem cell lines will be selected to have few harmful mutations, and then stem cells with long telomeres can be inserted into the body to grow and do repairs that cells with short telomeres are unable to do.

By Randall Parker 2010 February 08 11:21 PM  Aging Genetics
Entry Permalink | Comments(3)
Beer Silicon Against Osteoporosis

I know how dutiful you all are about your health and I'm sure many of you will do the responsible thing and drink beer for your bones.

A new study suggests that beer is a significant source of dietary silicon, a key ingredient for increasing bone mineral density. Researchers from the Department of Food Science & Technology at the University of California, Davis studied commercial beer production to determine the relationship between beer production methods and the resulting silicon content, concluding that beer is a rich source of dietary silicon. Details of this study are available in the February issue of the Journal of the Science of Food and Agriculture, published by Wiley-Blackwell on behalf of the Society of Chemical Industry.

"The factors in brewing that influence silicon levels in beer have not been extensively studied" said Charles Bamforth, lead author of the study. "We have examined a wide range of beer styles for their silicon content and have also studied the impact of raw materials and the brewing process on the quantities of silicon that enter wort and beer."

Silicon is present in beer in the soluble form of orthosilicic acid (OSA), which yields 50% bioavailability, making beer a major contributor to silicon intake in the Western diet. According to the National Institutes of Health (NIH), dietary silicon (Si), as soluble OSA, may be important for the growth and development of bone and connective tissue, and beer appears to be a major contributor to Si intake. Based on these findings, some studies suggest moderate beer consumption may help fight osteoporosis, a disease of the skeletal system characterized by low bone mass and deterioration of bone tissue.

They tested 100 commercial beers. Does anyone have access to this journal and want to tell us which beers are best?

The lighter beers are better.

The researchers examined a variety of raw material samples and found little change in the silicon content of barley during the malting process. The majority of the silicon in barley is in the husk, which is not affected greatly during malting. The malts with the higher silicon contents are pale colored which have less heat stress during the malting process. The darker products, such as the chocolate, roasted barley and black malt, all have substantial roasting and much lower silicon contents than the other malts for reasons that are not yet known.

Partly for the sake of my bones I also take vitamin D, calcium and biweekly high potency vitamin K (as K2 fwiw).

By Randall Parker 2010 February 08 11:53 AM  Aging Diet Bone Studies
Entry Permalink | Comments(4)
Richard Branson: Oil Crunch In 5 Years

Virgin Group boss and billionaire Richard Branson joins the Peak Oil crowd.

"The next five years will see us face another crunch – the oil crunch. This time, we do have the chance to prepare. The challenge is to use that time well," Branson will say.

"Our message to government and businesses is clear: act," he says in a foreword to a new report on the crisis. "Don't let the oil crunch catch us out in the way that the credit crunch did."

One wonders what Branson thinks this means for Virgin Atlantic. Is he still buying airplanes? What's he planning to fuel them with?

Other notable British CEOs join Branson in citing the looming threat.

Other British executives who will support the warning include Ian Marchant, chief executive of Scottish and Southern Energy group, and Brian Souter, chief executive of transport operator Stagecoach.

Their call for urgent government action comes amid a wider debate on the issue and follows allegations by insiders at the International Energy Agency that the organisation had deliberately underplayed the threat of so-called "peak oil" to avoid panic on the stock markets.

Branson's an optimist compared to Jose S. Gabrielli de Azevedo, CEO of Brazilian oil company Petrobras. Gabrielli doesn't see how world oil production can be maintained at current levels after 2010. Optimists cite offshore Brazilian fields such as Tupi as reasons not to worry about Peak Oil. But Petrobras's ambition to increase production by a couple million barrels per day won't replace much larger production declines in existing fields around the world. You might want to take your last long road trip fling this year.

I hope Gabrielli is off by a couple of years. I want to do a road trip up the Alaskan Highway thru Canada before oil prices skyrocket.

Update: Lest you think Gabrielli is an outlier among major oil company CEOs, ConocoPhillips CEO James Mulva said in 2007 that world oil production will never hit 100 million barrels per day. So we are within at most 15% of world peak.

"Demand will be going up, but it will be constrained by supply," Mulva said. " I don't think we are going to see the supply going over 100 million barrels a day and the reason is: Where is all that going to come from?"
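The "within at most 15%" figure falls out of comparing current output to Mulva's ceiling. A sketch, assuming world production of roughly 85 million barrels per day in 2009 (my approximation, not a figure from the article):

```python
# Mulva's ceiling vs. assumed current world oil production.
current_mbd = 85.0   # million barrels/day, approximate 2009 output (assumed)
ceiling_mbd = 100.0  # Mulva: production will never exceed this level
gap = 1 - current_mbd / ceiling_mbd
print(f"Current output is within {gap:.0%} of the ceiling")  # 15%
```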

By Randall Parker 2010 February 08 10:19 AM  Energy Fossil Fuels
Entry Permalink | Comments(23)
2010 February 07 Sunday
Nanomaterial Promotes Cartilage Growth

Tell those stem cells to get off their duffs and fix things!

EVANSTON, Ill. --- Northwestern University researchers are the first to design a bioactive nanomaterial that promotes the growth of new cartilage in vivo and without the use of expensive growth factors. Minimally invasive, the therapy activates the bone marrow stem cells and produces natural cartilage. No conventional therapy can do this.

The results will be published online the week of Feb. 1 by the Proceedings of the National Academy of Sciences (PNAS).

"Unlike bone, cartilage does not grow back, and therefore clinical strategies to regenerate this tissue are of great interest," said Samuel I. Stupp, senior author, Board of Trustees Professor of Chemistry, Materials Science and Engineering, and Medicine, and director of the Institute for BioNanotechnology in Medicine. Countless people -- amateur athletes, professional athletes and people whose joints have just worn out -- learn this all too well when they bring their bad knees, shoulders and elbows to an orthopaedic surgeon.

I know people who experience daily pain from worn knee joints. Some started feeling this pain in their 30s. That's a long time to go thru life with a disability that, absent a treatment such as this one, will mean only worsening pain to look forward to.

In an animal model the nanofiber gel, injected into the injured joint, concentrates a key growth factor and stimulates stem cells to produce the desired type II collagen.

"Our material of nanoscopic fibers stimulates stem cells present in bone marrow to produce cartilage containing type II collagen and repair the damaged joint," Shah said. "A procedure called microfracture is the most common technique currently used by doctors, but it tends to produce a cartilage having predominantly type I collagen which is more like scar tissue."

The Northwestern gel is injected as a liquid to the area of the damaged joint, where it then self-assembles and forms a solid. This extracellular matrix, which mimics what cells usually see, binds by molecular design one of the most important growth factors for the repair and regeneration of cartilage. By keeping the growth factor concentrated and localized, the cartilage cells have the opportunity to regenerate.

Anyone know what the obstacles are to trying this in humans? In the United States is FDA approval needed?

By Randall Parker 2010 February 07 08:07 PM  Biotech Repair Joints
Entry Permalink | Comments(1)
2010 February 06 Saturday
Water-Stressed Trees Convert Less CO2 Into Biomass

Warming will reduce winter snow pack and therefore reduce tree growth in summer due to lack of water.

Contrary to conventional belief, as the climate warms and growing seasons lengthen subalpine forests are likely to soak up less carbon dioxide, according to a new University of Colorado at Boulder study.

As a result, more of the greenhouse gas will be left to concentrate in the atmosphere.

"Our findings contradict studies of other ecosystems that conclude longer growing seasons actually increase plant carbon uptake," said Jia Hu, who conducted the research as a graduate student in CU-Boulder's ecology and evolutionary biology department in conjunction with the university's Cooperative Institute for Research in Environmental Sciences, or CIRES.

The study will be published in the February edition of the journal Global Change Biology.

Working with ecology and evolutionary biology professor and CIRES Fellow Russell Monson, Hu found that while smaller spring snowpack tended to advance the onset of spring and extend the growing season, it also reduced the amount of water available to forests later in the summer and fall. The water-stressed trees were then less effective in converting CO2 into biomass. Summer rains were unable to make up the difference, Hu said.

While not mentioned in this press release, outright drought is especially problematic for use of trees to capture CO2. Pine needs water in order to produce resin that protects against beetle infestation (yes, trees have active defensive mechanisms against pests). Without enough water pine trees will get killed by beetles. Obviously dead trees release CO2 rather than absorb it. Warmer winters make this problem worse by reducing snow pack and also by not killing the beetles. More beetles survive warmer winters and cause more damage to trees.

While longer seasons and higher CO2 will increase plant growth in many areas, that won't happen where drought and reduced summer water run-off make water a bigger limiting factor than CO2 for plant growth.

Why do I assume CO2 will cause warming? Absorption spectra (with a decent explanation here and here). I respect physics.

By Randall Parker 2010 February 06 09:30 PM  Climate Feedbacks
Entry Permalink | Comments(70)
Phase Change Material Could Cool Houses

MIT's Technology Review reports on paraffin wax capsules that could use the cold of evening to cool rooms during the day.

Building materials that absorb heat during the day and release it at night, eliminating the need for air-conditioning in some climates, will soon be on the market in the United States. The North Carolina company National Gypsum is testing drywall sheets--the plaster panels that make up the walls in most new buildings--containing capsules that absorb heat to passively cool a building. The capsules, made by chemical giant BASF, can be incorporated into a range of construction materials and are already found in some products in Europe.

This won't help much where the difference between day and night temperatures is small. But desert areas get very cool at night, so this approach would work well there. What I wonder: Does the paraffin increase the flammability of the walls in a fire?

One could also use a similar approach to make use of lower night rates for electricity. Run an air conditioner or ground sink heat pump at night and use it to cool a compound from liquid to solid phase at night. Then blow home air over the solid during the day to cool it.

Pricing electricity by time of day, and even by level of demand, would provide more incentive to implement storage systems for heat and cold. Regulatory changes that let electricity prices track supply and demand would encourage greater use of such storage materials.
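A rough sizing sketch for the night-charged storage idea. The paraffin latent heat of around 200 kJ/kg and the 10 kWh daytime cooling load are illustrative assumptions on my part, not figures from the article:

```python
# How much phase-change material would one day's cooling take?
latent_heat_j_per_kg = 200e3   # J/kg, assumed typical for paraffin
cooling_load_kwh = 10          # assumed daytime cooling load to absorb
cooling_load_j = cooling_load_kwh * 3.6e6  # 1 kWh = 3.6 MJ
wax_kg = cooling_load_j / latent_heat_j_per_kg
print(f"Wax needed: {wax_kg:.0f} kg")  # 180 kg
```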

By Randall Parker 2010 February 06 08:03 PM  Energy Heating
Entry Permalink | Comments(11)
2010 February 04 Thursday
Loan Guarantees For 7 More Nuclear Reactors

More loan guarantees mean more nuclear power plants.

President Obama's proposed 2011 budget could provide a significant boost to the U.S. nuclear power industry, which has been stalled for decades. If approved by Congress, the budget would provide $36 billion in loan guarantees for nuclear power plants, opening the way for around seven new nuclear power plants, depending on the final cost of each. The new guarantees are in addition to $18.5 billion in guarantees provided for in a 2005 energy bill.

That's about $5.1 billion per nuke. What I'd like to know: What size of nukes? 1.6 GW each? The US currently has 104 nukes operating and they deliver 20% of the electric power used in the United States. Coal delivers about half the electric power. So we'd need to add two and a half times the current amount of nuclear power in order to displace dirtier coal electric power.

Nuclear power plants are the most capital intensive way to generate electricity. But they have the lowest fuel costs. Without loan guarantees bond interest rates make nuclear power too expensive to compete with coal and natural gas for electric power generation.

According to one recent analysis, the cost of building nuclear power plants has approximately doubled in the last seven years (due to things such as increasing materials costs). As it stands, this means that the cost of electricity from new plants would be around 8.4 cents per kilowatt hour, compared to about 6 cents per kilowatt hour for conventional fossil fuel plants.

A 2.4 cent per kwh price gap isn't large. If we had to pay 2.4 cents per kwh more for electricity the effects on our living standards would be pretty small. The existing differences in electricity prices between states are several times larger than that. The people in Connecticut pay about 10 cents more per kwh than the people in Minnesota, for example. At current prices the people in Connecticut would pay less if all their base load electric power came from nukes. Ditto for the rest of the US northeast.
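To put that gap in household terms (the roughly 11,000 kWh/year consumption figure is a typical-US-household assumption on my part, not a number from the article):

```python
# Annual cost of the nuclear-vs-fossil price gap for one household.
gap_per_kwh = 0.084 - 0.060  # dollars, from the quoted cost estimates
annual_kwh = 11_000          # assumed typical US household usage
extra_cost = gap_per_kwh * annual_kwh
print(f"Gap: {gap_per_kwh * 100:.1f} cents/kwh -> about ${extra_cost:.0f}/year")
```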

A major reason for the higher interest rates is that even today new nuclear power plant construction projects in other countries are experiencing unexpected delays. Delays push up total costs because a partially completed plant has a lot of embedded costs without revenue flowing in to pay the interest on bonds.

Loan guarantees are effectively a way for the political system to support nukes without raising prices on nuclear power's dirtier competitors. For nuclear power to grow into a much higher percentage of total electric power generation one or more of several things need to happen:

  • The next round of power plant construction has to go so well that the perceived risk for new nuke construction goes down and therefore interest rates on nuke bonds go down.
  • Manufacture of small nukes on assembly lines substantially lowers the cost of 3rd generation nuclear designs.
  • 4th generation nuclear power plants substantially lower the risks and capital requirements for new nuclear power plants.
  • Carbon taxes raise the costs of coal and natural gas electricity high enough to make nukes more competitive.
  • Regulations on conventional pollutants (e.g. soot and oxides of nitrogen) become much stricter and raise the costs especially of coal electric power.

To put it another way, to make nukes competitive either nuclear power costs need to come down or fossil fuel power costs need to go up. Will either of these developments happen?

By Randall Parker 2010 February 04 11:40 PM  Energy Nuclear
Entry Permalink | Comments(10)
2010 February 03 Wednesday
Advanced Persistent Threats In Computer Networks

What you can not hear is the massive silent sucking sound of Western corporate secrets flowing into servers in China.

“The scope of this is much larger than anybody has ever conveyed,” says Kevin Mandia, CEO and president of Virginia-based computer security and forensic firm Mandiant. “There [are] not 50 companies compromised. There are thousands of companies compromised. Actively, right now.”

Mandia claims these intrusions are persistent and used for industrial espionage on a massive scale.

Called Advanced Persistent Threats (APT), the attacks are distinctive in the kinds of data the attackers target, and they are rarely detected by antivirus and intrusion programs. What’s more, the intrusions grab a foothold into a company’s network, sometimes for years, even after a company has discovered them and taken corrective measures.

I do not know whether the threat is this large. Are Chinese hackers really sucking massive amounts of proprietary design and business plan data from American, Japanese, and European corporations?

If the infiltrations really are persistent and on a large scale I have some practical suggestions on how to cut them down by orders of magnitude. Analogies with biological systems come to mind. Biological RNA and DNA viruses work only because virus and host use the same codon-to-amino-acid mappings. The same 3-letter DNA and RNA sequences map to the same amino acids in just about all living organisms on this planet. An organism that used a very different set of mappings would likely be immune to existing viruses.

This description is about to get too technical for most people who aren't computer architects or software developers. Sorry about that.

In computing the problem stems from the universal use of the same operating systems, scripting languages, networking protocols, and CPU op codes. The obvious solution: generate custom instruction sets with different orderings of bits in op codes. The same compilers (e.g. gcc) could be used with back-end code generators that would read in tables specifying how to map to specialized bit orderings of existing processor instruction sets.

Take a microprocessor instruction set like some level of the ARM instruction set. Create a description of an ARM processor in, say, VHDL. Enhance the description so that as instructions get fetched their op code bits get swapped around from the ordering out in memory to the ordering that the CPU understands. The CPU could execute op codes laid out like any conventional ARM processor. But it could fetch from memory in a secret format which the secret version of the gcc back-end would know how to generate code for.

Alternatively, the CPU could execute the secret op code layout. At each site the VHDL (or Verilog or other logic description language) could be transformed into a different unique op code layout. Then the compiled processor architecture could be loaded into an FPGA for execution.

Each super-secure site would generate a different secret bit ordering. The odds of a binary code virus getting into the facility and invading servers would be extremely low because the virus writers wouldn't know how to generate legal op codes.
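Here's a toy Python sketch of the bit-shuffling idea. The 8-bit op code width, the "ADD" encoding, and the seed are all made up for illustration; a real implementation would live in the gcc back-end and in the fetch stage of the processor description:

```python
import random

# Toy model of per-site secret op-code bit shuffling. The "compiler
# back-end" scrambles op codes on the way out; the "fetch stage"
# unscrambles them on the way in.

def make_secret_permutation(seed):
    """Generate this site's secret ordering of the 8 bit positions."""
    rng = random.Random(seed)
    perm = list(range(8))
    rng.shuffle(perm)
    return perm

def scramble(opcode, perm):
    """Compiler side: emit the op code in the secret bit ordering."""
    return sum(((opcode >> src) & 1) << dst for dst, src in enumerate(perm))

def unscramble(opcode, perm):
    """Fetch stage: restore the conventional ordering the core executes."""
    return sum(((opcode >> dst) & 1) << src for dst, src in enumerate(perm))

perm = make_secret_permutation(seed=42)   # each site uses its own seed

ADD = 0b10110001                          # hypothetical conventional op code
stored = scramble(ADD, perm)              # what actually sits in memory
assert unscramble(stored, perm) == ADD    # the CPU still sees a normal ADD

# A virus compiled for the conventional encoding lands in memory
# unscrambled, so after the fetch-stage swap it almost never decodes
# to the instructions the attacker intended.
print(f"stored={stored:08b}, misdecoded intruder byte={unscramble(ADD, perm):08b}")
```

The point of the sketch is that the scramble/unscramble pair is a pure bit permutation: cheap to do in hardware at fetch time, trivial for a trusted compiler to apply, but unguessable by a remote attacker who only knows the public instruction set.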

This same approach could be applied to interpreted scripting languages. Developers could still write and debug in, say, Python or Ruby or Perl. But their source code could be translated into a very different looking interpreted language using a secure (not on a network) computer that would read in, say, Python and spit out a different secret scripting language whose interpreter could be derived from the open source public Python interpreter engine.
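A toy sketch of that translation step, assuming a simple keyword-renaming scheme. The secret tokens here are invented, and a real tool would use the language's tokenizer rather than word-boundary substitution:

```python
import re

# Toy "secret scripting dialect" translator: mechanically rename a
# language's keywords into site-specific secret tokens. Developers write
# ordinary Python; a trusted offline tool applies the mapping before
# deployment, and a matching modified interpreter accepts only the
# secret dialect.

SECRET_KEYWORDS = {            # per-site secret mapping, invented here
    "def": "zqfn",
    "return": "zqret",
    "if": "zqif",
    "else": "zqelse",
}

def to_secret_dialect(source):
    """Offline tool: public Python -> site's secret dialect."""
    pattern = r"\b(" + "|".join(SECRET_KEYWORDS) + r")\b"
    return re.sub(pattern, lambda m: SECRET_KEYWORDS[m.group(1)], source)

def to_public_python(source):
    """Inverse mapping, as the secret interpreter's front end would apply."""
    reverse = {v: k for k, v in SECRET_KEYWORDS.items()}
    pattern = r"\b(" + "|".join(reverse) + r")\b"
    return re.sub(pattern, lambda m: reverse[m.group(1)], source)

program = "def double(x):\n    return x * 2\n"
secret = to_secret_dialect(program)
assert "zqfn" in secret and "def" not in secret
assert to_public_python(secret) == program
```

An injected script written in ordinary Python would be full of tokens the secret interpreter rejects, while legitimate code translated by the offline tool runs unchanged.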

The key to this approach is to develop microprocessor descriptions and interpreted languages that lend themselves to automated transformation into functionally equivalent but different looking instruction execution machines.

In a nutshell: automate the generation of obscure execution languages and op code architectures.

Desktops are a harder nut to crack. One way to handle them is to make desktops akin to X servers. Run the real word processor, spreadsheet, or browser on the secret server's instruction set architecture. Of course, then Open Office and Mozilla Firefox would need to be compiled for each server. This approach is easier to do with open source.

By Randall Parker 2010 February 03 10:15 PM  Computing Security
Entry Permalink | Comments(13)
2010 February 02 Tuesday
White Roofs For Cooler Cities In Summer

Painting roofs white in order to cool the planet has been proposed previously. Now some scientists do some computer modeling of the effects of more reflective roofing in cities.

BOULDER—Painting the roofs of buildings white has the potential to significantly cool cities and mitigate some impacts of global warming, a new study indicates. The new NCAR-led research suggests there may be merit to an idea advanced by U.S. Energy Secretary Steven Chu that white roofs can be an important tool to help society adjust to climate change.

But the study team, led by scientists at the National Center for Atmospheric Research (NCAR), cautions that there are still many hurdles between the concept and actual use of white roofs to counteract rising temperatures.

Whiter buildings in cities are of special interest because cities are warmer (especially in the summer) than surrounding regions. The buildings and roads of cities absorb more sunlight than the same areas absorbed before humans built the cities. "Hot time, summer in the city."

White roofs would make a substantial difference.

"Our research demonstrates that white roofs, at least in theory, can be an effective method for reducing urban heat," says NCAR scientist Keith Oleson, the lead author of the study. "It remains to be seen if it's actually feasible for cities to paint their roofs white, but the idea certainly warrants further investigation."

The study is slated for publication later this winter in Geophysical Research Letters. It was funded by the National Science Foundation, NCAR's sponsor.

One third of the urban heat island effect could be eliminated by painting all city roofs white (or by using white materials to make the roofs). Okay, so what would it take to remove the other two thirds of the effect?

Cities are particularly vulnerable to climate change because they are warmer than outlying rural areas. Asphalt roads, tar roofs, and other artificial surfaces absorb heat from the Sun, creating an urban heat island effect that can raise temperatures on average by 2-5 degrees Fahrenheit (about 1-3 degrees Celsius) or more compared to rural areas. White roofs would reflect some of that heat back into space and cool temperatures, much as wearing a white shirt on a sunny day can be cooler than wearing a dark shirt.

The study team used a newly developed computer model to simulate the amount of solar radiation that is absorbed or reflected by urban surfaces. The model simulations, which provide scientists with an idealized view of different types of cities around the world, indicate that, if every roof were entirely painted white, the urban heat island effect could be reduced by 33 percent. This would cool the world's cities by an average of about 0.7 degrees F, with the cooling influence particularly pronounced during the day, especially in summer.
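A quick consistency check of the article's numbers, treating white roofs as removing one third of a 2-5 degree F heat island effect:

```python
# One third of the urban heat island effect, per the NCAR simulations.

heat_island_f = (2.0, 5.0)     # typical urban-rural gap, deg F (from article)
white_roof_fraction = 0.33     # share of the effect white roofs remove

cooling = [round(t * white_roof_fraction, 2) for t in heat_island_f]
print(f"Implied cooling range: {cooling[0]}-{cooling[1]} deg F")
```

That range brackets the roughly 0.7 degree F average cooling the simulations report, which makes sense if many of the modeled cities sit near the low end of the heat island range.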

I'm thinking white streets would get us part of the way there. Is it possible to develop a whiter concrete? Would such a concrete make streets too bright at midday?

The urban heat effect on temperature monitoring stations that start out rural but cease to be rural as economic development grows up around them is a problem for studies that attempt to track long term temperature changes. This problem became an issue in an important case of researchers accused of misrepresenting data from Chinese temperature monitoring stations (and see here for more details). Those two articles from The Guardian in England are worth a read.

By Randall Parker 2010 February 02 11:48 PM  Climate Engineering
Entry Permalink | Comments(20)
Fish Oil Might Keep You Sane

Go crazy for fish so you don't just go crazy.

Individuals at extremely high risk of developing psychosis appear less likely to develop psychotic disorders following a 12-week course of fish oil capsules containing long-chain omega-3 polyunsaturated fatty acids, according to a report in the February issue of Archives of General Psychiatry, one of the JAMA/Archives journals.

"Early treatment in schizophrenia and other psychoses has been linked to better outcomes," the authors write as background information in the article. "Given that subclinical psychotic symptoms may predict psychotic disorder and psychosis proneness in a population may be related to the rate of psychotic disorder, intervention in at-risk individuals holds the promise of even better outcomes, with the potential to prevent full-blown psychotic disorders."

By Randall Parker 2010 February 02 10:27 PM  Brain Nutrition
Entry Permalink | Comments(11)
Counsyl Genetic Tests For Prospective Parents

Prospective parents can now check whether they carry any of more than 100 potentially dangerous genetic variants.

Counsyl, a Stanford startup based in Redwood City, CA, has developed a genetic test for prospective parents that determines their risk for passing more than 100 different genetic diseases on to their child. The test, which costs $349 and is already covered by some major insurers, could rapidly expand preconception screening for rare inherited conditions.

Here is a map of 100 medical centers offering this test.

You can bet that the list of testable genetic diseases will grow each year and the general usefulness of pre-pregnancy genetic screening will grow along with the list of testable genes.

The big recent cost declines for genetic testing and genetic sequencing don't just make a test such as this cheaper. Lower costs also enable scientists to engage in much larger scale searches for biologically significant genetic variants. As a result the number of known ways that genetic variants cause human differences is going to grow by orders of magnitude in the next 10 years.

Most (all?) of the genetic variants mentioned above only cause disease if inherited from both parents. Test results for a couple can influence their decision on whether to start a pregnancy naturally or via IVF with pre-implantation genetic diagnosis or whether to avoid reproduction entirely. If you are thinking about making a baby then $349 to assess your genetic risks seems like a small price to pay as compared to the total costs (which can run into the hundreds of thousands of dollars) to raise a child to adulthood.

Looking down the line 10 or 20 years I expect to see online dating services match people up based on avoidance of shared harmful recessive genes. Searchers for Mr. and Mrs. Right will get steered toward prospective mates with whom they can pretty safely make babies.

By Randall Parker 2010 February 02 06:41 PM  Biotech Reproduction
Entry Permalink | Comments(7)
2010 February 01 Monday
Global Warming Speeding Tree Growth?

Faster tree growth in eastern US forests in recent years.

Speed is not a word typically associated with trees; they can take centuries to grow. However, a new study to be published the week of Feb. 1 in the Proceedings of the National Academy of Sciences has found evidence that forests in the Eastern United States are growing faster than they have in the past 225 years. The study offers a rare look at how an ecosystem is responding to climate change.

For more than 20 years forest ecologist Geoffrey Parker has tracked the growth of 55 stands of mixed hardwood forest plots in Maryland. The plots range in size, and some are as large as 2 acres. Parker's research is based at the Smithsonian Environmental Research Center, 26 miles east of the nation's capital.

Parker's tree censuses have revealed that the forest is packing on weight at a much faster rate than expected. He and Smithsonian Tropical Research Institute postdoctoral fellow Sean McMahon discovered that, on average, the forest is growing an additional 2 tons per acre annually. That is the equivalent of a tree with a diameter of 2 feet sprouting up over a year.

If this trend continues the amount of biomass tied up in these forests will continue to increase.

The researchers suspect higher temperatures, longer growing seasons, and more CO2 (which is nutritious for a plant) as causes.

It was not enough to document the faster growth rate; Parker and McMahon wanted to know why it might be happening. "We made a list of reasons these forests could be growing faster and then ruled half of them out," said Parker. The ones that remained included increased temperature, a longer growing season and increased levels of atmospheric CO2.

During the past 22 years CO2 levels at SERC have risen 12%, the mean temperature has increased by nearly three-tenths of a degree and the growing season has lengthened by 7.8 days. The trees now have more CO2 and an extra week to put on weight. Parker and McMahon suggest that a combination of these three factors has caused the forest's accelerated biomass gain.

Ecosystem responses are one of the major uncertainties in predicting the effects of climate change. Parker thinks there is every reason to believe his study sites are representative of the Eastern deciduous forest, the regional ecosystem that surrounds many of the population centers on the East Coast. He and McMahon hope other forest ecologists will examine data from their own tree censuses to help determine how widespread the phenomenon is.

Some plants benefit from more CO2 because they open their stomata to let in CO2 for shorter periods of time. This reduces moisture loss. Of course, if warming causes a drought in an area then the net effect on plant growth from warming will be negative.

By Randall Parker 2010 February 01 11:43 PM  Climate Biosphere
Entry Permalink | Comments(11)
Older Brains Need Less Sleep?

As we age we sleep less without an increase in sleepiness.

WESTCHESTER, Ill. — A study in the Feb. 1 issue of the journal SLEEP suggests that healthy older adults without sleep disorders can expect to have a reduced "sleep need" and to be less sleepy during the day than healthy young adults.

Results show that during a night of eight hours in bed, total sleep time decreased significantly and progressively with age. Older adults slept about 20 minutes less than middle-aged adults, who slept 23 minutes less than young adults. The number of awakenings and the amount of time spent awake after initial sleep onset increased significantly with age, and the amount of time spent in deep, slow-wave sleep decreased across age groups. Yet even with these decreases in sleep time, intensity and continuity, older adults displayed less subjective and objective daytime sleep propensity than younger adults.

Furthermore, two additional nights involving experimental disruption of slow-wave sleep led to a similar response in all age groups. Daytime sleep propensity increased, and slow-wave sleep rebounded during a night of recovery sleep. According to the authors, this suggests that the lack of increased daytime sleepiness in the presence of an age-related deterioration in sleep quality cannot be attributed to unresponsiveness to variations in homeostatic sleep pressure. Instead, healthy aging appears to be associated with reductions in the sleep duration and depth required to maintain daytime alertness.

Does the decline in sleeping come about as a result of a real reduction in the need for sleep? Or does the mechanism that causes us to sleep become more faulty as we age?

Perhaps the brain and the rest of the body are less metabolically active as we age and therefore there's less need for sleep to do repairs and process information gathered during waking hours?

If we made ourselves continue to sleep as much in our later years as we did when we were younger would we derive any benefit?

By Randall Parker 2010 February 01 11:32 PM  Aging Brain Studies
Entry Permalink | Comments(13)