2009 September 30 Wednesday
Dutch Study Finds No Fish Heart Benefit

A prospective study does not find a heart benefit from fish.

With heart failure treatments often limited to palliative care, much rests on prevention. This latest report from the Rotterdam Study investigated whether intake of the long-chain n-3 polyunsaturated fatty acids (PUFAs) found in fish confers protection against heart failure, as it appears to do against coronary heart disease.(3)

The analysis comprised 5299 subjects (41% men, mean age 67.5 years) who were free from heart failure and for whom dietary data were available. During 11.4 years of follow-up, 669 subjects developed heart failure. Their habitual diet had been assessed at baseline (in a self-reported checklist and by expert interview), with subjects specifically asked to indicate the frequency, amount, and kind of fish they had eaten, either as a hot meal, on a sandwich, or between meals.

Results showed that the dietary intake of fish was not significantly related to heart failure incidence. This relative risk was measured according to five levels of fish consumption as reflected in intake of two long chain n-3 PUFAs (eicosapentaenoic acid [EPA] and docosahexaenoic acid [DHA]), both of which have been shown to exert some cardiovascular benefit via anti-inflammatory mechanisms, anti-arrhythmic effects and/or a reduction in serum triglycerides, blood pressure, and heart rate.

I am pretty confident that we should eat lots of fruits and vegetables and when eating grains they should be whole grain and high in fiber. But just how much omega 3 fatty acids to consume?

One of the problems is that genetic variability helps determine the ideal diet for each person. We need a level of understanding of nutritional genomics that does not exist yet. Though with the big drops in DNA sequencing costs I'm hopeful the alleles relevant to dietary guidance will be identified within a few years.

Update: Note that the Dutch study might have had a dosing problem: not enough people in the sample may have eaten enough fish for enough time to deliver a benefit. This study contradicts a lot of other studies, so you can't conclude from it alone that eating fish doesn't help.

Also, omega 3 fatty acid concentrations vary greatly between species of fish. For example, you are going to get a lot more DHA and EPA from salmon than from tuna. Take a look at the table at the bottom of this page for omega 3 in various fish. Salmon, anchovy, and sardines have the most omega 3 fats.

By Randall Parker 2009 September 30 11:08 PM  Aging Diet Heart Studies
Entry Permalink | Comments(12)
Heart Risk Factors Rising In Europe And United States?

Progress is not inevitable.

“Over the next few years it's likely that this observed decline in the proportion of people with low cardiovascular risks will translate into increased cardiovascular disease,” said Professor De Backer, a former chair of the European Society of Cardiology (ESC) Joint Prevention Committee. “This paper should act as a wake-up call in Europe as well as the US, since overall European risk factors are not so different. While obesity may be higher in the US, Europe has been less successful in reducing smoking and cutting blood pressure.”

Indeed, the EuroAspire survey(2), which reviewed risk factors in patients with established coronary heart disease from 22 European countries, found that only 6% of men and 4% of women were achieving lifestyle, risk factor and therapeutic targets for prevention.

“Health surveillance is essential for the development of good health policy. We need to know exactly what are the problems we are facing to determine the best ways of counteracting them,” said Professor De Backer.

The people in some European countries have porked out beyond US levels of porkiness. A whole lot of porking going on. Oink, oink. Plus, Europe has a lot more nicotine fiends. Gotta say I like California's legal hostility to cigarette smokers in workplaces.

Of course, if you are a skinny non-smoker who gets a lot of exercise and eats a lot of fruits and vegetables your odds look a whole lot better. Drugs might boost your odds of a longer life even higher.

By Randall Parker 2009 September 30 10:59 PM  Aging Cardiovascular Studies
Entry Permalink | Comments(0)
Distant Earthquakes Weaken San Andreas Fault

Live near a major fault? When you read about distant earthquakes brace for the possibility of a big local one as a result.

HOUSTON -- (Sept. 30, 2009) -- U.S. seismologists have found evidence that the massive 2004 earthquake that triggered killer tsunamis throughout the Indian Ocean weakened at least a portion of California's famed San Andreas Fault. The results, which appear this week in the journal Nature, suggest that the Earth's largest earthquakes can weaken fault zones worldwide and may trigger periods of increased global seismic activity.

"An unusually high number of magnitude 8 earthquakes occurred worldwide in 2005 and 2006," said study co-author Fenglin Niu, associate professor of Earth science at Rice University. "There has been speculation that these were somehow triggered by the Sumatran-Andaman earthquake that occurred on Dec. 26, 2004, but this is the first direct evidence that the quake could change the strength of a fault remotely."

Live in an earthquake zone? I do. Want to be prepared? Look at the building you live in and the building you work in and ask yourself whether you are likely to die in either structure in event of an earthquake. If so, change jobs or move as appropriate. A really big quake in SoCal will take out water supplies in some areas. Think about putting in some big water storage bottles (appropriately padded and braced to prevent breakage) so you can keep drinking water after the Big One.

Last night I was watching a History Channel show about the odds of a really big earthquake in Southern California. One of the people on the show said the scientific consensus is a 99% probability of a major quake in the next 30 years. If you live in SoCal you really ought to prepare for it.

By Randall Parker 2009 September 30 10:30 PM  Dangers Natural Geological
Entry Permalink | Comments(3)
Women Prefer Taken Guys

Oklahoma State University researchers Melissa Burkley and Jessica Parker demonstrate an aspect of female desire that I've certainly experienced: women prefer taken guys.

Unknown to the participants, everyone was offered a fictitious candidate partner who had been tailored to match their interests exactly. The photograph of "Mr Right" was the same for all women participants, as was that of the ideal women presented to the men. Half the participants were told their ideal mate was single, and the other half that he or she was already in a romantic relationship.

"Everything was the same across all participants, except whether their ideal mate was already attached or not," says Burkley.

The most striking result was in the responses of single women. Offered a single man, 59 per cent were interested in pursuing a relationship. But when he was attached, 90 per cent said they were up for the chase.

You might think these women are unethical.

Roissy would not find these results surprising. Lots of us have had the experience of getting hit on by more women after they've seen a really attractive woman hanging on us. That's been my experience.

So what's going on here? Women go with the herd. If some woman likes a guy enough to want to be attached to him this becomes evidence for other women that he's worth going after. For a guy a beautiful woman can serve as a powerful demonstration of higher value. I think there's a potential business here for a specialized escort service where women sell time with them in public places to guys who want to advertise their desirability.

By Randall Parker 2009 September 30 09:58 PM  Brain Sex Differences
Entry Permalink | Comments(17)
2009 September 29 Tuesday
Amyloid Beta Builds Up While Awake And Declines In Sleep

A protein implicated as a cause of Alzheimer's Disease increases in mice while they are awake and declines while they are asleep. The implication here is that people who do not get enough sleep may be at increased risk of Alzheimer's.

While the occasional all-nighter to cram for exams or finish a grant proposal may seem like no big deal, losing sleep night after night could take its toll on brain health in later life, two new studies suggest. Based on microdialysis experiments in live mice, Dave Holtzman, Washington University, St. Louis, Missouri, and colleagues report in the current issue of Science that extracellular amyloid-beta levels in the brain fall during slumber and rise with wakefulness. They discovered that these Abeta dynamics rely on the hormone orexin, and that forcing animals to sleep or stay awake decreases or increases Abeta plaque formation accordingly in a mouse model for Alzheimer disease.

Lack of sleep increases inflammation and inflammation is also implicated in Alzheimer's. Lack of sleep also puts on the weight and increases obesity. This accelerates aging. So get lots of sleep. It is good for your brain.

By Randall Parker 2009 September 29 11:39 PM  Brain Alzheimers Disease
Entry Permalink | Comments(3)
1930s Economic Depression Boosted Life Expectancy

Good news for Peak Oil doomsters: The Great Depression was accompanied by a rapid rise in life expectancies. So when oil production starts declining every year and most of us lose our jobs we'll live longer?

ANN ARBOR, Mich.—The Great Depression had a silver lining: During that hard time, U.S. life expectancy actually increased by 6.2 years, according to a University of Michigan study published in the current issue of the Proceedings of the National Academy of Sciences.

Life expectancy rose from 57.1 in 1929 to 63.3 years in 1932, according to the analysis by U-M researchers José A. Tapia Granados and Ana Diez Roux. The increase occurred for both men and women, and for whites and non-whites.

"The finding is strong and counterintuitive," said Tapia Granados, the lead author of the study and a researcher at the U-M Institute for Social Research (ISR). "Most people assume that periods of high unemployment are harmful to health."

Whereas mortality increased during economic expansions.

For the study, researchers used historical life expectancy and mortality data to examine associations between economic growth and population health for 1920 to 1940. They found that while population health generally improved during the four years of the Great Depression and during recessions in 1921 and 1938, mortality increased and life expectancy declined during periods of strong economic expansion, such as 1923, 1926, 1929, and 1936-1937.

We are presented with such images of poverty from the Great Depression. From a distance one might expect unemployed people to have starved to death. But that's hard to square with a rise in life expectancies. Okay, why this result?

"Working conditions are very different during expansions and recessions," Tapia Granados said. "During expansions, firms are very busy, and they typically demand a lot of effort from employees, who are required to work a lot of overtime, and to work at a fast pace. This can create stress, which is associated with more drinking and smoking.

"Also, new workers may be hired who are inexperienced, so injuries are likely to be more common. And people who are working a lot may also sleep less which is known to have implications for health. Other health-related behaviors such as diet may also change for the worse during expansions."

In recessions, Tapia Granados noted, there is less work to do, so employees can work at a slower pace. There is more time to sleep, and because people have less money, they are less likely to spend as much on alcohol and tobacco.

In addition, economic expansions are also associated with increases in atmospheric pollution which has well-documented short-term effects on cardiovascular and respiratory mortality. Other reasons that periods of economic expansion may be bad for health could include increases in social isolation and decreases in social support that typically occur when people are working more.

What I wonder: Has this pattern held up in recent years?

Workplaces have become a lot safer since the 1920s. Also, hours worked are shorter now than back then. So people working longer hours during an upturn now are probably still working less than, say, workers in the 1920s. Also, economic upturns are less associated with pollution (at least in the United States, though obviously not in China) than was the case in the 1920s and 1930s. So has economic growth become relatively safer today?

By Randall Parker 2009 September 29 10:07 PM  Aging Studies
Entry Permalink | Comments(9)
Accidental IVF Implantation Produces Baby

A pair of reports underscores how in vitro fertilization is creating a strange new world. 40-year-old Carolyn Savage of Ohio was accidentally impregnated with another couple's embryo and, due to religious beliefs, carried the baby to term before handing the baby boy over to his biological parents.

Imagine this woman hadn't been willing to give up the baby. Well, how would a court rule? Would different courts render different decisions?

A 26-year-old transsexual who started out as a female is going to try for another IVF pregnancy after miscarrying the first time. Her/his 43-year-old girlfriend is too old for kids.

Ruben Noe Coronado Jimenez, 26, sparked a debate in Spain about the ethical use of reproductive technology, when he revealed earlier this year that he was carrying twins following IVF treatment.

The babies were due to be born this month but Mr Coronado, from Jaen in Andalucia, suffered a miscarriage in May during the 18th week of pregnancy.

So he's shifting back toward being more of a she in order to carry a baby to term.

A British fertility doctor says some women are literally dying to have a baby. The instinct to reproduce runs strongly in some.

Women are risking death and bankruptcy in their desperation to become mothers, according to Professor Sammy Lee, one of the country's leading experts on infertility.


"I have treated young women with cancer who have refused to have treatment for their illness until they have got pregnant and given birth, knowing they are risking their lives," added Lee, who has helped some couples through 12 cycles of IVF. The maximum number of treatments provided on the NHS is three. "Some of these women do, indeed, go on to die [from the cancer], but they die happy, feeling that they have achieved something greater than their own continued existence."

How to approach these issues? Women putting their lives at risk, really old women having babies they can't live long enough to raise, accidental implantation of wrong embryos, people shifting their bodies toward the feminine side of the ledger to have a baby. Weird wild stuff. I'm struck by the need for standards aimed at protecting the future babies. Prospective parents really just represent themselves and their own desires as they try to create babies who will have to live with the consequences of these decisions. Who represents the interests of the babies?

By Randall Parker 2009 September 29 12:23 AM  Bioethics Reproduction
Entry Permalink | Comments(6)
Dogs Better Exercise Companions Than Humans

The University of Missouri College of Veterinary Medicine Research Center for Human-Animal Interaction (ReCHAI) finds dogs do a better job of getting older adults out for exercise.

ReCHAI sponsors several projects that attempt to further the understanding and value of the relationship between humans and animals. In 2008, ReCHAI sponsored the “Walk a Hound, Lose a Pound and Stay Fit for Seniors.” In the preliminary program, a group of older adults were matched with shelter dogs, while another group of older adults were partnered with a human walk buddy. For 12 weeks, participants were encouraged to walk on an outdoor trail for one hour, five times a week. At the end of the program, researchers measured how much the older adults’ activity levels improved.

“The older people who walked their dogs improved their walking capabilities by 28 percent,” Johnson said. “They had more confidence walking on the trail, and they increased their speed. The older people who walked with humans only had a 4 percent increase in their walking capabilities. The human walking buddies tended to discourage each other and used excuses such as the weather being too hot.”

This bit about humans discouraging each other contrasts with my own experience with dog walking. Dogs are not interested in excuses. They want to go galloping up the road. They think exercise is just plain great. A good dog is a great personal trainer.

By Randall Parker 2009 September 29 12:04 AM  Aging Exercise Studies
Entry Permalink | Comments(3)
2009 September 28 Monday
Corn Ethanol Increases Soil Run-Off

Hey, it has been far too long since I bashed corn ethanol as a product of bad US federal energy policy. Higher demand for corn to produce ethanol causes more run-off of soil, pesticides, and fertilizer due to less crop rotation.

WEST LAFAYETTE, Ind. - More of the fertilizers and pesticides used to grow corn would find their way into nearby water sources if ethanol demands lead to planting more acres in corn, according to a Purdue University study.

The study of Indiana water sources found that those near fields that practice continuous-corn rotations had higher levels of nitrogen, fungicides and phosphorous than corn-soybean rotations. Results of the study by Indrajeet Chaubey, an associate professor of agricultural and biological engineering, and Bernard Engel, a professor and head of agricultural and biological engineering, were published in the early online version of The Journal of Environmental Engineering.

"When you move from corn-soybean rotations to continuous corn, the sediment losses will be much greater," Chaubey said. "Increased sediment losses allow more fungicide and phosphorous to get into the water because they move with sediment."

Corn ethanol is a bad idea with incentives more aimed at subsidizing farmers than at doing anything useful for our energy problems. The energy returned on energy invested (EROEI) isn't high enough to be worth the costs and it can't scale due to limited availability of good soil. Plus,

Biofuel crops are seen as boosting the dead zone in the Gulf of Mexico caused by oxygen depletion as a result of fertilizer run-off.

Scientists in Pennsylvania report that boosting production of crops used to make biofuels could make a difficult task to shrink a vast, oxygen-depleted "dead zone" in the Gulf of Mexico more difficult. The zone, which reached the size of Massachusetts in 2008, forms in summer and threatens marine life and jobs in the region. Their study is scheduled for the Oct. 1 issue of ACS' semi-monthly journal Environmental Science & Technology.

Christine Costello and W. Michael Griffin and colleagues explain that the zone forms when fertilizers wash off farm fields throughout the Mississippi River basin and into the Gulf of Mexico. The fertilizers cause the growth of algae, which eventually depletes oxygen in the water and kills marine life. Government officials hope to reduce fertilizer runoff and shrink the zone to the size of Delaware by 2015. But that goal could be more difficult to reach due to federally-mandated efforts to increase annual biofuel production to 36 billion gallons by 2022, the study says.

Maybe genetically engineered microbes for biofuels production will prove useful. But using food crops to produce energy is a bad idea. Even without the demand for crops to make biofuels rain forests are getting shifted into agriculture due to population growth and Asian economic growth. We shouldn't make this problem worse with bad energy policy.

By Randall Parker 2009 September 28 11:48 PM  Energy Biomass
Entry Permalink | Comments(1)
Cheaper To Remove CO2 From Atmosphere?

Regardless of whether you think atmospheric carbon dioxide (CO2) is a problem, assume for the moment you want to reduce the amount of CO2 in the atmosphere. How best to do it? Prevent CO2 from being released in the first place, or remove it once it is there? Maybe the latter approach is cheaper.

Governments are doing practically nothing to study the removal of carbon dioxide directly from the atmosphere, but this technology could be a much cheaper form of climate protection than photovoltaic cells and other approaches getting lavish support, according to an article published today in Science.

David W. Keith, a physicist at the University of Calgary, reviews some of the technologies for air capture of carbon and notes that there is not a single government program devoted specifically to that purpose. Dr. Keith estimates that less than $3 million per year in public money is currently being spent on related research, even though it could potentially be a bargain.

The example cited by Dr. Keith as a very expensive way to avoid CO2 emissions is the use of photovoltaic panels. But I wonder if PV today is worse than expensive: how much CO2 gets released in the production of PV in the first place? I ask because expensive products on average require more energy, and hence more fossil fuels, to manufacture than cheaper products do. In the case of PV that energy goes into mining materials, purifying them, transporting them, running production lines, and then transporting and installing the finished products. And it isn't just for the PV cells but also for the glass covering, aluminum and other rack material, grid tie inverters, and other electrical equipment.

Now, I'm not arguing against the PV industry. Since demand for PV made with current technology provides revenue to develop newer and cheaper production methods, I expect the energy return on energy invested (EROEI) of PV manufacture to improve and become very favorable even if it isn't favorable now. So the PV industry is still going to make a huge contribution toward cutting fossil fuels usage eventually. But given the high (albeit rapidly falling) cost of PV it is not clear to me that its EROEI is much above break-even today.
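To make the EROEI question concrete, here is a minimal sketch of the ratio in question. All the numbers are hypothetical placeholders for illustration, not measured values for any real PV system:

```python
def eroei(lifetime_output_kwh, embodied_energy_kwh):
    """Energy returned on energy invested: lifetime energy delivered
    divided by the energy consumed to manufacture, transport, and install."""
    return lifetime_output_kwh / embodied_energy_kwh

# Hypothetical numbers for a small residential PV array:
# ~1,500 kWh/year of output over a 25-year life, against an assumed
# embodied energy covering panels, glass, racking, and inverters.
lifetime_output = 1500 * 25   # 37,500 kWh delivered (assumed)
embodied = 10_000             # kWh to make and install (assumed)

print(round(eroei(lifetime_output, embodied), 2))  # 3.75 with these assumed inputs
```

Whether the real ratio is comfortably above 1 is exactly the open question; the formula just shows what would need to be measured.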

Another point: We can also cut back on fossil fuels usage by using energy more efficiently. For example, better home design to lower heating and cooling needs will reduce fossil fuels usage. But in a world where China has surpassed the United States in CO2 emissions we might derive real benefit from the development of ways to remove CO2 that is already in the atmosphere. What I'd like to know: Could a massive program of tree planting, harvesting, and submersion in deep lakes remove CO2 at a lower cost than the cost of carbon capture and storage on coal electric plants?

By Randall Parker 2009 September 28 12:13 AM  Climate Policy
Entry Permalink | Comments(39)
2009 September 27 Sunday
US Life Expectancy Lags Due To Cigarettes

In political debates over health care the fact that the United States lags many other industrialized countries in average life expectancy is sometimes blamed on how health care is funded in the US. But John Tierney of the New York Times reports that once the lifestyles of Americans are adjusted for, America's health care system comes out looking pretty good in terms of its effects on longevity.

But a prominent researcher, Samuel H. Preston, has taken a closer look at the growing body of international data, and he finds no evidence that America’s health care system is to blame for the longevity gap between it and other industrialized countries. In fact, he concludes, the American system in many ways provides superior treatment even when uninsured Americans are included in the analysis.

So why does America lag in life expectancy? Past heavy usage of the demon tobacco.

For four decades, until the mid-1980s, per-capita cigarette consumption was higher in the United States (particularly among women) than anywhere else in the developed world. Dr. Preston and other researchers have calculated that if deaths due to smoking were excluded, the United States would rise to the top half of the longevity rankings for developed countries.

I see this report as both good news and bad news. First off, Europe is now substantially lagging the United States in turning away from the demon weed. So the good news for Americans is that in future years US life expectancy should improve faster than in some of the heavier-smoking European countries like Greece, Estonia, Slovakia, Germany, and Hungary. The bad news? We do not have a big potential for longer average life expectancy via changes in the funding of health care. We need to eat better food, get more exercise, and make other lifestyle changes. If you still smoke you are accelerating your aging process by about 10 years. So stop doing that!

What's going to matter most for life expectancy in the long run: The rate of advance of biomedical science and the rate of development of new drugs and other treatments. What worries me: The current debate about medical care delivery is focused on short term goals and the effects of proposed policies on long term incentives get short shrift. Yet for the vast majority of us our potentially fatal diseases lie years or decades in the future.

Also see Tierney's posts Is U.S. Health Care System Not the Culprit? and Debating the Longevity Gap for more background on this research and the larger debate on this issue.

By Randall Parker 2009 September 27 11:49 PM  Aging Studies
Entry Permalink | Comments(16)
Silicon Nanotubes For 10 Times Better Batteries

A report in MIT's Technology Review bodes well for the future of electric cars. This could be a game changer.

In an advance that could help electric vehicles run longer between charges, researchers have shown that silicon nanotube electrodes can store 10 times more charge than the conventional graphite electrodes used in lithium-ion batteries.

Better anodes can absorb more lithium and so hold more charge.

Researchers at Stanford University and Hanyang University in Ansan, Korea, are developing the nanotube electrodes in collaboration with LG Chem, a Korean company that makes lithium-ion batteries, including those used in the Chevy Volt. When such a battery is charged, lithium ions move from the cathode to the anode. The new battery electrodes, described online in the journal Nano Letters, are anodes and can store much more energy than conventional graphite electrodes because they absorb much more lithium when the battery is charged.

We need a path of migration away from fossil-fuel powered cars. There's not enough cheaply accessible oil left, and rising Asian demand combined with the coming of Peak Oil looks set to drive gasoline prices far higher than they are today. The costs of substitutes will determine how high the price of gasoline goes. Cheap high-capacity batteries for long-range electric cars would let most driving be transitioned to electric power. The incremental increase in electric demand could then come from nukes, wind, solar, and other non-fossil energy sources.

By Randall Parker 2009 September 27 12:41 PM  Energy Batteries
Entry Permalink | Comments(15)
2009 September 26 Saturday
Vitamin D Lack, Fructose Excess Linked To High Blood Pressure

Among women enrolled in the Michigan Bone Health and Metabolism Study high blood pressure developed at 3 times the rate in women who were vitamin D deficient before menopause. Do not wait until you get older before starting to take nutrition seriously. If you wait the damage will already be done before you act.

Women who have vitamin D deficiency in the premenopausal years are at three times increased risk of developing high blood pressure in mid-life.

Hypertension rose from 6 percent to 25 percent over 15 years in this study population of women average age 38.

The age range was 24 to 44 at the start of the study in 1992. So the oldest at the end of the study were 59. A 25% overall high blood pressure rate seems pretty high.

You can get a blood test on your vitamin D to see where you stand. A blood test in winter will be especially telling due to shorter days and less sunshine.

Vitamin D deficiency was defined as less than 80 nanomoles per liter (nmol/L), while normal levels were considered more than 80 nmol/L. Experts in the medical community generally agree that vitamin D deficiency among women is widespread. Some researchers report many women don’t get enough sunlight exposure to help keep vitamin D levels near to normal, nor do they have diets or practice supplementation that support normal levels of vitamin D, Griffin said. Vitamin D is either synthesized in the skin through exposure to ultraviolet B rays in sunlight or ingested as dietary vitamin D.

Another cause of higher blood pressure: large amounts of fructose.

CHICAGO, Sept. 23, 2009 — A high-fructose diet raises blood pressure in men, while a drug used to treat gout seems to protect against the blood pressure increase, according to research reported at the American Heart Association’s 63rd High Blood Pressure Research Conference.

“This is the first evidence of a role of fructose in raising blood pressure and a role for lowering uric acid to protect against that blood pressure increase in people,” said Richard Johnson, M.D., co-author of the study and professor and head of the division of Renal Diseases and Hypertension at the University of Colorado–Denver medical campus in Aurora, Colo.

They used 200 grams of fructose per day. Since grapes appear to be about 8% fructose by weight, you'd have to eat about 5.5 lbs of grapes per day to get 200 grams of fructose from grapes.
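A quick back-of-the-envelope check of that figure, using the roughly 8% fructose fraction quoted above:

```python
GRAMS_PER_POUND = 453.6

def grapes_needed_lbs(fructose_target_g, fructose_fraction=0.08):
    """Pounds of grapes supplying a given mass of fructose,
    assuming the stated fructose fraction by weight."""
    grams_of_grapes = fructose_target_g / fructose_fraction
    return grams_of_grapes / GRAMS_PER_POUND

# 200 g of fructose at 8% fructose by weight:
print(round(grapes_needed_lbs(200), 1))  # 5.5
```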

Johnson and co-author Santos Perez-Pozo, M.D., a nephrologist at Mateo Orfila Hospital in Minorca, Spain who led the study, evaluated 74 adult men, average age 51, who consumed a diet that included 200 grams (g) of fructose per day in addition to their regular diet. The amount is much higher than the estimated U.S. daily intake of 50 g to 70 g of fructose consumed by most U.S. adults. Half of the men were randomly assigned to get the gout drug allopurinol and the other half acted as controls.

After only two weeks on the diet, the high-fructose plus placebo group experienced significant average blood pressure increases of about 6 millimeters of mercury (mm Hg) in systolic blood pressure (the pressure when the heart beats) and about a 3 mm Hg rise in diastolic blood pressure (the pressure between heartbeats). They were measured with strap-on monitors that record blood pressure periodically around the clock.

The gout drug allopurinol blocked this effect. The main threat comes from high fructose corn syrup used in soda and processed foods.

Fruits contain good stuff that isn't present in high fructose corn syrup.

Fruit, which has just 4 g to 10 g of fructose per serving, also contains many beneficial substances including antioxidants, vitamin C, potassium and fiber that are believed to counter the effects of fructose alone. The main risk for excessive fructose consumption in the Western diet comes from sweetened drinks and foods rich in sugar or high fructose corn syrup, he said.

I'd really like to know if there's a fructose health threat from eating lots of fruits. I happen to eat at least 1 apple a day and lots of grapes daily as well. I've come across a number of scientific reports showing a positive correlation between fruit consumption and health. So I'm still eating lots of fruits. Curiously, the ratio of fructose to total sugar varies greatly between fruits. Note in table 1 how apricot and peach have especially low ratios of fructose to total sugar.

By Randall Parker 2009 September 26 02:50 PM  Aging Diet Heart Studies
Entry Permalink | Comments(7)
2009 September 25 Friday
Doing Tasks Depletes Willpower

Most of us have a finite supply of willpower.

HAMILTON, Ont. September 24, 2009—Have you ever sat down to work on a crossword puzzle only to find that afterwards you haven't the energy to exercise? Or have you come home from a rough day at the office with no energy to go for a run?

A new study, published today in Psychology and Health, reveals that if you use your willpower to do one task, it depletes you of the willpower to do an entirely different task.

Do some people not experience this depletion of willpower?

Regulating your emotions can deplete your willpower. So if someone is giving you a hard time and you are suppressing your desire to strangle them, you are depleting your will to exercise. Stay away from people who force you to exert more effort regulating your emotions.

"Cognitive tasks, as well as emotional tasks such as regulating your emotions, can deplete your self-regulatory capacity to exercise," says Kathleen Martin Ginis, associate professor of kinesiology at McMaster University, and lead author of the study.

Martin Ginis and her colleague Steven Bray used a Stroop test to deplete the self-regulatory capacity of volunteers in the study. (A Stroop test consists of words associated with colours but printed in a different colour. For example, "red" is printed in blue ink.) Subjects were asked to say the colour on the screen, trying to resist the temptation to blurt out the printed word instead of the colour itself.

"After we used this cognitive task to deplete participants' self-regulatory capacity, they didn't exercise as hard as participants who had not performed the task. The more people 'dogged it' after the cognitive task, the more likely they were to skip their exercise sessions over the next 8 weeks. You only have so much willpower."

Avoid situations that deplete your willpower unnecessarily. What situations or tasks deplete your willpower? Introspect and see if you can identify what does it to you and which depleters you can avoid.

By Randall Parker 2009 September 25 12:01 AM  Brain Performance
Entry Permalink | Comments(34)
2009 September 24 Thursday
Statin Cuts Heart Risk With High CRP

People with high sensitivity C-reactive protein (an inflammation marker) benefit from a cholesterol-lowering statin even if their cholesterol is low.

DALLAS, Sept. 22, 2009 — Statin therapy may be as effective in reducing heart attack, stroke, the need for artery-opening procedures, or heart-related death in people with normal or even low cholesterol but elevated high sensitivity C-reactive protein (hsCRP) as in patients with high cholesterol, according to research reported in Circulation: Cardiovascular Quality and Outcomes, a journal of the American Heart Association.

Note the reference to high sensitivity C-reactive protein (hsCRP).

Doctors would need to treat approximately 20 patients with high hsCRP (a sign of inflammation) and normal cholesterol levels with a statin for five years to avoid one incident of the primary end points of heart attack, stroke, percutaneous coronary intervention (a catheter-based procedure to reopen blocked arteries), or one cardiovascular-related death, researchers said.

The number needed to treat (NNT) value — 20 patients in this case — is a commonly used metric that helps doctors evaluate therapies.

The benefit is greater than that seen with statins taken to lower high cholesterol.

“Those NNT values are comparable or even superior to NNT values we already consider acceptable to prevent cardiovascular disease with statins in people with high cholesterol levels, where the 5-year NNT values range from 44 to 63,” said Paul M. Ridker, M.D., lead author of the study and director of the Center for Cardiovascular Disease Prevention at Brigham and Women’s Hospital in Boston, Mass.
The lower the NNT the better, because it means fewer patients would need to be treated to reap a benefit, he said.
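The NNT arithmetic is simple enough to sketch in a few lines. The event rates below are hypothetical numbers chosen only to reproduce the NNT values quoted above, not rates reported in the study.

```python
def nnt(control_event_rate: float, treated_event_rate: float) -> float:
    """Number needed to treat = 1 / absolute risk reduction."""
    return 1.0 / (control_event_rate - treated_event_rate)

# A 5-year NNT of 20 corresponds to a 5-percentage-point absolute risk
# reduction, e.g. hypothetical event rates of 8% untreated vs 3% treated.
print(round(nnt(0.08, 0.03)))   # 20

# An NNT of 50 (mid-range of the 44-63 cited for high cholesterol)
# corresponds to a 2-percentage-point absolute risk reduction.
print(round(nnt(0.05, 0.03)))   # 50
```

This is why a lower NNT is better: the same number of treated patients yields more averted events.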

So short of taking Lipitor or Crestor can you bring down your CRP and cut your heart disease, stroke, and other heart risks? Sure. Plant sterols, soy protein, foods high in viscous fiber and a few other dietary changes will lower cholesterol and CRP. You want to go for an inflammation-lowering diet. Generally speaking, dietary factors known to lower cholesterol also lower CRP. In your diet make like an ape man.

By Randall Parker 2009 September 24 12:12 AM  Aging Cardiovascular Studies
Entry Permalink | Comments(4)
2009 September 23 Wednesday
Gregg Easterbrook Argues For Ethical Acceptability Of Cloning

Gregg Easterbrook argues cloning is not unnatural.

Others argue that cloning is "unnatural." But nature wants us to pass on our genes; if cloning assists in that effort, nature would not be offended. Moreover, cloning itself isn't new; there have been many species that reproduced clonally and a few that still do. And there's nothing intrinsically unnatural about human inventions that improve reproductive odds—does anyone think nature is offended by hospital delivery made safe by banks of machines?

Do you oppose allowing cloning of humans to make nearly genetically identical copies? If so, why?

Update: My own take: I would want to create a clone based on a genetically patched, fixed, and improved version of my current DNA. I would not want to exactly clone myself. I'd want to do genetic fixes basically like software bug fixes and only then create a sort of clone 2.0.

We are all born with hundreds of genetic mutations that are harmful without any benefit. Once we know enough about the functional significance of most mutations we'll know many thousands of genetic variations that are purely harmful. Well, a more perfected copy of myself seems like a better thing to create than another copy of my current flawed self.

When gene therapies and cell therapies become safe, cheap, and readily available we'll gain the ability to do some of those genetic software fixes to ourselves. But fixes done to our fully developed adult bodies will not be anywhere near as thorough as fixes done to a clone, since the clone will carry the fixes in every single cell of its body.

With organ replacements grown from genetically improved versions of our DNA we will some day be able to insert replacement organs that will be free of our harmful genetic variations. So at least parts of us will be perfectible.

By Randall Parker 2009 September 23 12:27 AM  Bioethics Reproduction
Entry Permalink | Comments(22)
Genetic Testing To Enable Personalized Learning Strategies

An article in New Scientist takes a look at recent neuroscience research on learning. Among the topics covered: the COMT gene, which is involved in dopamine metabolism, has a version that improves the ability to pay attention.

Education before school can have benefits further down the track, Posner says. The neurotransmitter dopamine has been shown to play an important role in the function of the anterior cingulate gyrus, and genetic variations in the dopamine system seem to interact with parenting quality to affect executive function. Posner found that children between 18 and 21 months old with a particularly active variant of the COMT gene, which leads to less dopamine transmission, showed improved attention compared with those carrying other variants. The children also responded especially well to high-quality parenting (Neuroscience, DOI: 10.1016/j.neuroscience.2009.05.059).

The article discusses how individual genetic profiles could lead to personalized methods to optimize learning. In my view the use of such genetically guided teaching strategies will increase differences in educational outcome. Look at the COMT variant mentioned above. Kids who have it will benefit more from high-quality parenting. Okay, so kids identified from genetic testing as having greater capacity to pay attention will get taught stuff faster and more intensely because they'll be recognized as more able to stay focused and absorb information from longer stretches of learning. They'll rise above their peers that much faster.

Advances in methods of teaching will, on average, amplify the effects of differences in abilities. Only gene therapies, cell therapies, and other methods for changing brain metabolism can enable the cognitively less well endowed to close some of the gap with the cognitively most able.

By Randall Parker 2009 September 23 12:07 AM  Brain Genetics
Entry Permalink | Comments(0)
2009 September 22 Tuesday
Black Holes Pierce Stars To Cause Gamma Ray Bursts

If we could even detect the black hole's approach is there any way to divert it from hitting our Sun?

Black holes are invading stars, providing a radical explanation to bright flashes in the universe that are one of the biggest mysteries in astronomy today.

The flashes, known as gamma ray bursts, are beams of high energy radiation – similar to the radiation emitted by explosions of nuclear weapons – produced by jets of plasma from massive dying stars.

The orthodox model for this cosmic jet engine involves plasma being heated by neutrinos in a disk of matter that forms around a black hole, which is created when a star collapses.

But mathematicians at the University of Leeds have come up with a different explanation: the jets come directly from black holes, which can dive into nearby massive stars and devour them.

What I want to know: This far out on our spiral arm of the Milky Way Galaxy what are the odds that some black hole will come flying thru our solar system and into our sun? We'd all die if that happened.

If it happened to us it would all be over in 10,000 seconds. But we'd be dead before that.

Their theory is based on recent observations by the Swift satellite which indicates that the central jet engine operates for up to 10,000 seconds - much longer than the neutrino model can explain.

By Randall Parker 2009 September 22 08:23 PM  Dangers Natural General
Entry Permalink | Comments(4)
2009 September 21 Monday
Lower Vitamin D And Higher Heart Death Rates

Older people with lower levels of vitamin D die from heart disease at higher rates.

A new study by researchers at the University of Colorado Denver and Massachusetts General Hospital (MGH) shows vitamin D plays a vital role in reducing the risk of death associated with older age. The research, just published in the Journal of the American Geriatrics Society, evaluated the association between vitamin D levels in the blood and the death rates of those 65 and older. The study found that older adults with insufficient levels of vitamin D die from heart disease at greater rates than those with adequate levels of the vitamin.

"It's likely that more than one-third of older adults now have vitamin D levels associated with higher risks of death and few have levels associated with optimum survival," said Adit Ginde, MD, MPH, an assistant professor at the University of Colorado Denver School of Medicine's Division of Emergency Medicine and lead author on the study. "Given the aging population and the simplicity of increasing a person's level of vitamin D, a small improvement in death rates could have a substantial impact on public health."

Older adults are at high risk for vitamin D deficiency because their skin has less exposure to the sun due to more limited outdoor activities as well as reduced ability to make vitamin D.

The skin synthesizes vitamin D in the presence of sunlight, but aged skin synthesizes less of it. Old folks also engage in fewer outdoor activities and so get less sun exposure. Of course, there's the possibility that the lower level of outside activity is due to underlying illnesses, and therefore that the lower levels of vitamin D are a result of illness which also is causing the higher death rate.

What is needed: a large prospective study on whether vitamin D supplementation lowers death rates.

By Randall Parker 2009 September 21 10:28 PM  Aging Diet Heart Studies
Entry Permalink | Comments(5)
Green Tea Good For Bones?

Some researchers in Hong Kong find chemicals in green tea appear to boost bone growth.

In the new study, Ping Chung Leung and colleagues note that many scientific studies have linked tea to beneficial effects in preventing cancer, heart disease, and other conditions. Recent studies in humans and cell cultures suggest that tea may also benefit bone health. But few scientific studies have explored the exact chemicals in tea that might be responsible for this effect.

The scientists exposed a group of cultured bone-forming cells (osteoblasts) to three major green tea components — epigallocatechin (EGC), gallocatechin (GC), and gallocatechin gallate (GCG) — for several days. They found that one in particular, EGC, boosted the activity of a key enzyme that promotes bone growth by up to 79 percent. EGC also significantly boosted levels of bone mineralization in the cells, which strengthens bones. The scientists also showed that high concentrations of EGC blocked the activity of a type of cell (osteoclast) that breaks down or weakens bones. The green tea components did not cause any toxic effects to the bone cells, they note.

One can't conclude from this one report that green tea will lessen your risks of osteoarthritis or osteoporosis.

Check out this USDA document (PDF format) of flavonoids in foods. Some other foods have catechins such as apples, apricots, blackberries, broad beans, chocolate, raspberries, and tea (both black and green).

By Randall Parker 2009 September 21 10:14 PM  Aging Diet Bone Studies
Entry Permalink | Comments(3)
2009 September 20 Sunday
Death Of Norman Borlaug And World Hunger

Norman Borlaug, who won a Nobel Peace Prize for his work developing new plant strains to boost crop output and reduce world hunger, died recently. This has occasioned many essays about his legacy. In the past I've read arguments that Borlaug and the Green Revolution showed that we do not have to worry about world hunger as long as science, technology, and free markets are allowed to flourish. I am skeptical of that line of argument. It is interesting to note that Borlaug did not believe innovations in food production eliminated the need to control human reproduction.

Borlaug was not naive on these issues, though. In his Nobel acceptance speech, he recognised that "we are dealing with two opposing forces, the scientific power of food production and the biologic power of human reproduction":

There can be no permanent progress in the battle against hunger until the agencies that fight for increased food production and those that fight for population control unite in a common effort. Fighting alone, they may win temporary skirmishes, but united they can win a decisive and lasting victory to provide food and other amenities of a progressive civilization for the benefit of all mankind.

Borlaug said this in 1970 when the global human population stood at 3.7 billion. Today, it is fast approaching seven billion. Modern farming has won the "battle" with population control convincingly.

Think we've made great strides in eliminating hunger? In fact, advances that boosted food production have enabled the human population to grow so large that far more people are hungry now than was the case 200 years ago. Every day 1 billion people go hungry.

19 June 2009, Rome - World hunger is projected to reach a historic high in 2009 with 1 020 million people going hungry every day, according to new estimates published by FAO today.

The world's population hit 1 billion in the year 1804. So more humans are hungry today than were hungry in 1804 (since not all humans in 1804 were hungry). I am curious to know whether one could somehow calculate the percentage of people who were hungry in 1804 and later in the 1800s. How big did the human population grow until 1 billion people were hungry? To 2 billion? 3 billion?
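The back-of-envelope comparison above can be sketched. The population figures are round approximations (about 6.8 billion in 2009, about 1 billion in 1804), and the 15% hungry share applied to 1804 is purely hypothetical, since we lack the historical data.

```python
# Rough share of humanity going hungry today vs. an 1804 benchmark.
hungry_2009 = 1.02e9          # FAO estimate quoted above
population_2009 = 6.8e9       # approximate 2009 world population
population_1804 = 1.0e9       # world population hit 1 billion in 1804

share_2009 = hungry_2009 / population_2009
print(f"{share_2009:.0%}")    # about 15% of humanity hungry in 2009

# Even if the same 15% share had applied in 1804, that would be only
# 150 million hungry people -- far fewer than the 1.02 billion today.
print(int(0.15 * population_1804))
```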

I expect the hunger problem to worsen as declining oil production in the 2010s causes economic contraction even as populations grow. Higher costs for fertilizer, fuel for tractors, and other energy-dependent inputs will reduce per capita food availability.

By Randall Parker 2009 September 20 08:01 PM  Trends Demographic
Entry Permalink | Comments(18)
California Energy Efficiency Transparency In Buildings

Prospective buyers and renters will have more visibility into future building heating, cooling, and lighting costs for commercial buildings sold in California.

Assembly Bill 1103, signed into law in 2007 and set to start taking effect in 2010, will require owners to provide 12 months’ worth of comparable energy-use information to prospective buyers or full-building tenants as well as financiers. It was originally scheduled to take effect for all properties Jan. 1, but draft implementation regulations from the California Energy Commission last month proposed a three-year phase-in period, starting with the largest buildings in July 2010.

I see this as a positive step since it increases market transparency. This can lead to higher energy efficiency in a few ways. First off, current owners will have incentives to look for ways to increase efficiency at least a year before a sale. Second, buyers with better skills at boosting energy efficiency could conceivably hunt around for buildings that have poor energy efficiency and hence lower resale value so that the buyers can buy inefficient buildings and improve efficiency for a profit. Third, companies constructing buildings will have more incentive to use more energy efficient designs since efficiency will play a larger role in determining eventual sales price.

One of the ways the market fails with building energy efficiency is with rental apartments where the renter pays all the utility bills. The renter typically doesn't own the refrigerator or heater or air conditioner. The owner doesn't have as much incentive to put in more efficient equipment since it is the tenant paying the bills. The tenants typically do not get to see the utility bills of the previous tenants. So the tenants can't choose more efficient apartments.

I'd like to see more policy changes for houses and apartments that provide better incentives for higher efficiency. We need these improvements before more Peak Oil price shocks make people too poor to do lots of retrofitting for efficiency.

By Randall Parker 2009 September 20 04:04 PM  Energy Policy
Entry Permalink | Comments(0)
Tesla Roadster Battery Charging And Efficiency

Dr. Robert Wilder describes the charging of his Tesla Roadster's battery.

But before you knock the Roadster for increasing our energy demand, remember: We're not paying a penny for gasoline. And the Roadster has supercar performance and a correspondingly large battery. This battery holds 54 kWh, giving this car great speed and a good range but therefore needing much (solar) 'juice' -- certainly more than a smaller EV that might be used mainly for short trips or inter-city commuting and errands.

Due to cooling and other losses in charging, filling from empty takes about 68 kWh, or 26% more than the 54 kWh the battery holds. This 68 kWh is the seminal amount; it quantifies how much truly is needed. We'll reference this number to determine how far we can go from power of the sun alone.

What I find most interesting here: charging up the Tesla requires 26% more electricity than the battery holds. That extra 14 kWh gets wasted. Batteries heat up as they charge, and that heat is waste.
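A quick sketch of the charging arithmetic, using only the 54 kWh and 68 kWh figures quoted above:

```python
battery_kwh = 54.0   # usable pack capacity quoted above
wall_kwh = 68.0      # electricity drawn from the wall for a full charge

overhead = wall_kwh / battery_kwh - 1.0   # extra energy relative to pack size
efficiency = battery_kwh / wall_kwh       # fraction that ends up in the pack
loss_kwh = wall_kwh - battery_kwh

print(f"{overhead:.0%} more than the pack holds")   # 26%
print(f"{efficiency:.0%} charging efficiency")      # 79%
print(f"{loss_kwh:.0f} kWh lost, mostly as heat")   # 14 kWh
```

Note the two ways to state the same loss: 26% extra relative to the battery's capacity, or about 21% of the electricity drawn from the wall.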

What I want to know: Will the Chevy Volt and other pluggable hybrids and pure electric cars have similar amounts of electricity waste when charging their batteries? Does anyone reading this have some data on battery charging efficiency for other lithium battery chemistries?

Wilder charges his Tesla at night when electric rates are cheaper. But he lives in an area where electric power prices are quite high.

Crucially, we do all EV charging overnight because with Time Of Use (TOU) meter rates, the cost here is 'only' 18 cents/kWh during off-peak hours at night.

By contrast, a peak rate is far higher at 30 cents/kWh from 11 a.m. to 6 p.m., when our PV makes surplus power from the sun and sells it back to the utility, giving us a credit on our bill.

So even though he has photovoltaic panels he charges his Tesla at night since his daytime electricity is worth more to sell to his local electric utility.

In sunnier areas at sunnier times of the year really cheap PV could eventually make late morning the cheapest time to buy electricity. The big spike in demand happens in the late afternoon in warmer climes. If the price declines in PV continue then eventually this trend might cause a decline in electric power prices in the morning and a sharper spiking of electricity prices in the late afternoon and early evening.

Of course, given enough electric cars and sufficient battery longevity the late afternoon electric power price spike could be dampened by selling electric power from car batteries out onto the grid.

The range on a Tesla depends heavily on how fast you drive. You can go over 200 miles if you drive slowly enough. A blog post by Tesla CTO JB Straubel shows how quickly drag rises, and how electric power usage climbs far faster than speed, as the Roadster accelerates.

To cruise at 60 mph takes about 15kW. However, if you double that to 30kW you will only accelerate to about 80mph — far less than twice as fast. And if you double it again to 60kW you will accelerate to about 107 mph using 4 times as much power as you did at 60mph, yet you’d only travel about 1.8 times as fast.

Check out the first graph at that page. The Tesla uses slightly over 250 Watt-hours per mile at 60 mph, but at 30 mph that drops to only about 150 Wh per mile, bottoming out at about 135 Wh per mile around 17 mph. So the big losses in efficiency occur above 60 mph.
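At steady cruise, energy per mile is just power divided by speed, which reproduces the numbers above. The 15 kW at 60 mph figure comes from the quoted post; the 54 kWh pack size is from the earlier quote.

```python
def wh_per_mile(power_kw: float, speed_mph: float) -> float:
    """Steady-state energy use: power divided by speed."""
    return power_kw * 1000.0 / speed_mph

# 15 kW at 60 mph matches the ~250 Wh/mile read off the graph.
print(wh_per_mile(15, 60))            # 250.0

# Implied range for the 54 kWh pack at that consumption rate.
print(54_000 / wh_per_mile(15, 60))   # 216.0 miles
```

That 216-mile figure is consistent with the "over 200 miles if you drive slowly enough" claim.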

By Randall Parker 2009 September 20 03:22 PM  Energy Electric Cars
Entry Permalink | Comments(11)
Arctic Has 13% Of Remaining Undiscovered Oil?

An article about the potential for oil discovery in the Arctic gives a sense of how much oil is left to be discovered.

The other man who knows interesting things about oil deposits in the Arctic is Donald Gautier, who works for the United States Geological Survey (USGS) in Menlo Park in the heart of Silicon Valley.


Comparing similar areas helps determine possible amounts of oil in other parts of the world. So where are structures similar to those found in the Arctic region? "In the case of northeast Greenland, you can say, for example, that the area is very similar to western Norway and the northern part of the North Sea," says Gautier. Since there is already significantly more data on these analogous areas than for the Arctic, researchers can use this information for modeling:

A total of 17 surveyed sites promise significant finds. There could be up to 90 billion barrels of undiscovered oil in the Arctic, representing 13 percent of the world's as yet undiscovered reserves.

Does 90 billion barrels sound like a lot? It is only 3 years worth of oil at the world's current burn rate. If that's 13% of the world's undiscovered oil then about 692 billion barrels are waiting to be discovered, and that's only 23 years at the current burn rate. But even if the world's current oil production rate could be maintained (and it can't) Westerners would see much less of that oil in coming years. The Chinese, with over 4 times the population of the United States, are now buying cars at a faster rate than Americans. China is setting up for a huge surge in oil demand growth. Currently China's oil consumption per capita is a tenth of US oil consumption per capita. The potential for demand growth in China is enormous. India, with an even larger and more rapidly growing population, is also industrializing. The $2500 Tata Nano is bringing car ownership within the reach of many more Indians and will help feed growing oil demand in India.
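A back-of-envelope sketch of those numbers. The roughly 84 million barrels per day world consumption rate is my assumption for 2009, not a figure from the article.

```python
arctic_bbl = 90e9            # estimated undiscovered Arctic oil, barrels
arctic_share = 0.13          # Arctic share of world undiscovered oil
world_bbl_per_day = 84e6     # assumed ~2009 world consumption rate

world_undiscovered = arctic_bbl / arctic_share
yearly_burn = world_bbl_per_day * 365

print(f"{world_undiscovered / 1e9:.0f} billion barrels undiscovered")  # 692
print(f"{arctic_bbl / yearly_burn:.1f} years from the Arctic alone")   # 2.9
print(f"{world_undiscovered / yearly_burn:.0f} years total")           # 23
```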

Even if the 90 billion barrel estimate for the Arctic is too low by a factor of 2 or 3 it doesn't much change our problem with dwindling oil reserves. North American crude production peaked in 1985 and I do not expect Canadian and Alaskan Arctic oil production to enable a new North American oil production peak.

World oil discovery peaked in 1965, give or take a year. The consumption rate surpassed the discovery rate in 1981. High oil prices in recent years haven't caused discovery to approach the rate of consumption. We'd need a discovery rate higher than the current consumption rate to accommodate rising Asian demand while still maintaining Western consumption levels. Not going to happen. My advice: make choices that will lower your rate of oil consumption. Future high costs won't hit you as hard if you start adjusting now.

By Randall Parker 2009 September 20 12:34 PM  Energy Fossil Fuels
Entry Permalink | Comments(7)
2009 September 19 Saturday
Induced Stem Cells Retain Some Original Cell Type Memory

A continuing series of improvements in how to make cells revert to a pluripotent (highly flexible) state opens up the possibility of stem cell therapies for a large assortment of disorders and diseases. But a group that has developed a new safer method for reverting cells to the pluripotent state finds that the converted cells still show signs of their original differentiated state.

A team of researchers from the University of California, San Diego School of Medicine and the Salk Institute for Biological Studies in La Jolla have developed a safe strategy for reprogramming cells to a pluripotent state without use of viral vectors or genomic insertions. Their studies reveal that these induced pluripotent stem cells (iPSCs) are very similar to human embryonic stem cells, yet maintain a "transcriptional signature." In essence, these cells retain some memory of the donor cells they once were.

This "transcriptional signature" they speak of is the pattern of gene expression into messenger RNAs and other RNA pieces that DNA sections get used to generate. Basically, DNA gets read to create matching RNA and then the RNA gets used to guide the creation of proteins.

That a cell can be induced to become more like embryonic cells and yet still retain characteristics of, say, skin or fat cells is problematic for the desire to create stem cell therapies. Ideally one wants to convert cells back to an embryonic-like state and get them to turn off all genes that are specific to being, for example, a liver or fat or kidney cell. This report suggests that inducing cells to become pluripotent does not, by itself, make them into the ideal starting point for stem cell therapies. The job of resetting cells back to a truly embryonic state is trickier than that, and scientists haven't yet figured out how to fully manage that trick.

On the bright side, the scientists did identify a single gene that is enough to convert a cell into the pluripotent state.

The study, led by UCSD Stem Cell Program researcher Alysson R. Muotri, assistant professor in the Departments of Pediatrics at UCSD and Rady Children's Hospital and UCSD's Department of Cellular and Molecular Medicine, will be published online in PLoS ONE on September 17.

"Working with neural stem cells, we discovered that a single factor can be used to re-program a human cell into a pluripotent state, one with the ability to differentiate into any type of cell in the body" said Muotri. Traditionally, a combination of four factors was used to create iPSCs, in a technology using viral vectors – viruses with the potential to affect the transcriptional profile of cells, sometimes inducing cell death or tumors.

The researchers used familiar genes for inducing pluripotency: Oct4 and Nanog (this link goes to the full paper).

Genetic reprogramming of somatic cells to a pluripotent state (induced pluripotent stem cells or iPSCs) by over-expression of specific genes has been accomplished using mouse and human cells. However, it is still unclear how similar human iPSCs are to human Embryonic Stem Cells (hESCs). Here, we describe the transcriptional profile of human iPSCs generated without viral vectors or genomic insertions, revealing that these cells are in general similar to hESCs but with significant differences. For the generation of human iPSCs without viral vectors or genomic insertions, pluripotent factors Oct4 and Nanog were cloned in episomal vectors and transfected into human fetal neural progenitor cells. The transient expression of these two factors, or from Oct4 alone, resulted in efficient generation of human iPSCs. The reprogramming strategy described here revealed a potential transcriptional signature for human iPSCs yet retaining the gene expression of donor cells in human reprogrammed cells free of viral and transgene interference. Moreover, the episomal reprogramming strategy represents a safe way to generate human iPSCs for clinical purposes and basic research.

My guess: We need a far more detailed understanding of genetic regulation in order to create at least some types of cell therapies and for the growth of replacement organs. But for a lot of fatal diseases a cell therapy that is less than perfect might still extend life.

By Randall Parker 2009 September 19 10:29 PM  Biotech Stem Cells
Entry Permalink | Comments(0)
Inkjet Boosts Silicon Solar Cells Efficiency

Another step toward lower photovoltaic (PV) solar power prices.

A California company is using silicon ink patterned on top of silicon wafers to boost the efficiency of solar cells. The Sunnyvale, CA, firm Innovalight says that the inkjet process is a cheaper route to more-efficient solar power. Using this process, the company has made cells with an efficiency of 18 percent.

The efficiency increase was from a starting efficiency of 16.5-17% up to 18%. So the silicon ink layer captures an additional 1 to 1.5 percentage points of the sunlight falling on the silicon PV cells. This by itself isn't going to close the cost gap between silicon PV and cheaper thin film PV from the likes of First Solar, Solyndra, and Nanosolar. But it is another reason why PV prices are going to stay down even as demand surges in China and the United States.
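A quick look at that gain in both percentage-point and relative terms. The 16.5-17% baseline and 18% figures come from the report; the framing in relative output per cell is mine.

```python
base_low, base_high, improved = 16.5, 17.0, 18.0   # cell efficiencies, %

# Absolute gain: percentage points of sunlight converted.
point_gain_low = improved - base_high    # 1.0 point
point_gain_high = improved - base_low    # 1.5 points

# Relative output gain per cell, which is what matters for cost per watt.
relative_gain = improved / base_high - 1.0

print(f"{point_gain_low:.1f}-{point_gain_high:.1f} points")  # 1.0-1.5
print(f"{relative_gain:.0%} more power per cell")            # 6%
```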

This report reminds me of how 1366 Technologies is also going to sell an efficiency-boosting technique to silicon PV makers. The existing manufacturers are now so big that new entrants develop technology to sell to them rather than directly going into manufacturing themselves. The scale of existing manufacturers makes the barriers to entry too big for most innovators. There are exceptions to this such as Nanosolar, which looks like it is coming out with manufacturing technology so revolutionary that it might be entering the market as the new low cost leader. Impressive achievement if so. Sure looks that way given their order book.

No reason for continued gloom about high solar power prices. The market is turning up lots of innovators. Costs are falling.

By Randall Parker 2009 September 19 09:15 AM  Energy Solar
Entry Permalink | Comments(7)
2009 September 18 Friday
Tightwads And Spendthrifts Attracted To Each Other

Scott Rick of Michigan's Ross School of Business finds that tightwads and big spenders are attracted to each other and then make unhappy marriages.

Rick and colleagues Deborah Small of the University of Pennsylvania and Eli Finkel of Northwestern University surveyed more than 1,000 married and unmarried adults in three separate studies to find out whether feelings toward spending money predict who people will marry and whether spousal differences in feelings toward spending money influence marital well-being.

They found that both tightwads and spendthrifts are unhappy with their emotional reactions toward spending money—and the more dissatisfied they are, the more likely they are to be attracted to people with opposing views toward spending.

"However, this complementary attraction ultimately appears to hurt marriages, as it is associated with greater conflicts over money and diminished marital well-being," Rick said. "The more spouses differ on the tightwad-spendthrift dimension, the more likely they are to argue over money and the less satisfied they are with the marriage.

"This remains true even when income, debt and savings are controlled for. That is, even though a spendthrift will have greater debt when married to another spendthrift than when married to a tightwad, the spendthrift is still less likely to argue about money with the other spendthrift."

No wonder so many marriages end in divorce. People enter into marriage with incompatible desires about money. But they choose the incompatibility.

Not married yet? If you are a tightwad then marry a fellow tightwad. If you are a spendthrift then I do not know what to advise. You are headed for financial disaster.

By Randall Parker 2009 September 18 10:48 AM  Brain Economics
Entry Permalink | Comments(14)
California Regulations On TV Energy Efficiency

The state of California is going to regulate television efficiency and knock some of the biggest TVs out of the market.

The rules, which took more than a year to develop, are designed to shave $8.1 billion off Californians' electricity bills over a 10-year-period. That works out to $30 per set per year, according to commission officials.

It will also help California utilities head off the need to build more power plants just so residents can watch "American Idol" and other shows. TVs already account for 10% of residential energy use in California, driven largely by surging demand for large-screen TVs.
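A quick back-of-envelope check on the commission's numbers, using only the figures quoted above, shows how many TV sets their savings estimate implies:

```python
# Sanity check: $8.1 billion in savings over 10 years at $30 per set
# per year implies a certain number of affected TV sets. All inputs
# are the article's figures; the division is mine.
total_savings = 8.1e9        # dollars over ten years
years = 10
savings_per_set_year = 30.0  # dollars per set per year

implied_sets = total_savings / (years * savings_per_set_year)
print(f"Implied TV sets: {implied_sets:,.0f}")  # 27,000,000
```

Roughly 27 million sets, which is plausible for a state of California's size.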

The first line of this paragraph is a hoot.

Yet California's energy needs are so vast, it still must import about 30% of its electricity from out of state. Continued conservation, officials say, is critical to ensure California has enough electricity to keep its economy growing and healthy.

A geographically huge state can't generate all the electricity it uses? Why? The problem is not the vastness of California's needs. Let me reword: California's NIMBY regulations are so vast that the state prevents sufficient electricity generating capacity from being built within its borders. While I'm at it: California's regulatory restrictions increase transmission line losses by forcing generation capacity to be built far from its population centers, and they increase the odds of power outages due to failures in long distance transmission lines.

My guess is that would-be constructors of new electric power generating capacity also believe that once a generating plant is built in Nevada or Arizona it is at less future political risk from new regulatory and legislative decisions.

Plasma TVs are most threatened by this regulation. LCD TVs are substantially more energy efficient, by some reports as much as a factor of 2 to 4. But even without regulatory pressures the plasma TV makers already have incentives to increase TV efficiency:

Plasma manufacturers are trying to avoid being edged out of the HDTV market by LCD, so putting any money into research in this area will likely bring a huge payoff for them. For one, better luminous efficiency will mean fewer parts needed to put the TV together. The power supply in a 42-inch 720p plasma TV accounts for 9 percent of the manufacturing cost, for example. It's only 3 percent of the cost of a comparable LCD TV. By increasing a plasma's efficiency to 5 lumens per watt, the cost of producing the TV could become equivalent to LCD, Young argues, which will allow plasma manufacturers to simply focus on improving the panel technology. And every dollar counts in the TV market, where margins are razor thin.

While plasma TVs have picture quality advantages, LCDs weigh less, take up less space, make less noise, last longer, and use less energy. I expect the advantages to narrow but not disappear.

If you are curious about energy efficiency ratings of TVs you can go to the US government's Energy Star TV rating web page and do searches on types of TVs. Not all plasma TVs are so bad. For example, here are Energy Star ratings for plasma TVs bigger than 36 inches. They estimate that a Panasonic TC-P42U1 42" Viera U1 Series Plasma 1080p HDTV will use 261 kwh per year. At 11 cents per kwh (and you may pay more or less depending where you live) that's less than $30 per year. Not much. But a Sharp 40 inch LCD TV uses only 140 kwh per year.
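The annual running cost calculation above is simple enough to generalize to any TV on the Energy Star list. A minimal sketch, using the 11 cents/kwh rate from this post (your local rate will differ):

```python
# Annual running cost of a TV from its Energy Star kWh/year estimate.
# The 0.11 $/kWh default is the rate used in the post, not a universal rate.
def annual_cost(kwh_per_year, rate_per_kwh=0.11):
    return kwh_per_year * rate_per_kwh

plasma = annual_cost(261)  # Panasonic TC-P42U1 42" plasma
lcd = annual_cost(140)     # Sharp 40" LCD
print(f"plasma ${plasma:.2f}/yr, LCD ${lcd:.2f}/yr")
```

So the plasma runs about $28.71 a year versus about $15.40 for the LCD: a real difference, but small in absolute dollars.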

By Randall Parker 2009 September 18 10:18 AM  Energy Appliances
Entry Permalink | Comments(13)
2009 September 17 Thursday
VW Diesel Hybrid Gets 170 MPG

Volkswagen has developed a concept car that will get 170 miles per gallon. It probably wouldn't pass current US safety standards. But it would probably be safer than a motorcycle.

Volkswagen is redefining the automobile with the L1, a bullet-shaped diesel hybrid that weighs less than 900 pounds, gets an amazing 170 mpg and might see production within four years.

The L1 concept car unveiled at the Frankfurt auto show pushes the boundaries of vehicle design and draws more inspiration from gliders than conventional automobiles.

In the United States 95% of energy used for transportation comes from oil. The approach of Peak Oil poses a big problem for our lifestyles and living standards. But since our current cars are so big we have plenty of room for downshifting into smaller and more efficient vehicles.

When oil production starts dropping every year cutting industrial uses of oil will prove more problematic than cutting personal transportation uses. Why? There's more room for improved efficiency in personal uses of oil than in commercial uses because industry places higher importance on efficiency already. Industry uses trucks and trains that are much closer to max efficiency than personal cars. Decisions in industry are driven more by cost and less by desire for status or comfort. So, for example, trucks have far less room to increase efficiency than cars do.

By Randall Parker 2009 September 17 11:32 PM  Energy Transportation
Entry Permalink | Comments(5)
Gene Therapy Fixes Color Blindness In Squirrel Monkeys

All male squirrel monkeys are naturally red-green color blind. Gene therapy has successfully restored color vision in two male squirrel monkeys.

Researchers have used gene therapy to restore colour vision in two adult monkeys that have been unable to distinguish between red and green hues since birth — raising the hope of curing colour blindness and other visual disorders in humans.

The problem with gene therapy is cancer risk. Whenever scientists figure out how to deliver gene therapy safely lots of diseases will become treatable. But when will that happen? Seems like a real hard problem. The Nature article above reports on three phase 1 gene therapy trials underway in humans for retinal regeneration. I'd be curious to know what the scientists involved in these trials see as risks.

The fact that pigment expression was enough to fix the problem opens up the prospect that the same would work for humans.

Most striking, says Ali, is the discovery that the brains and retinas of the adult monkeys weren't too "hard-wired" or fixed to respond to the treatment. "What's so exciting about this study is that it demonstrates there's more plasticity in the brain and cone cells than we thought," says Ali. "It forces us to reconsider our assumptions, and opens up more possibilities than we thought for treating blindness."

What I want to know: If gene therapy was used to add a 4th and 5th pigment could humans gain the ability to see a wider range of colors? Just what would the additional colors look like?

By Randall Parker 2009 September 17 07:34 AM  Biotech Gene Therapy
Entry Permalink | Comments(5)
2009 September 16 Wednesday
Too Much Radiation For Humans In Mars Trip

Dreams of a human trip to Mars run up against limits to allowable human radiation exposure.

But calculations by Cucinotta and his colleagues suggest the trip would not meet NASA's existing rules, which aim to keep each astronaut's lifetime risk of fatal cancer from space radiation below 3 per cent.

For journeys outside Earth's magnetic field, astronauts could reach that limit in less than 200 days in a spacecraft with aluminium walls nearly 4 centimetres thick, according to worst-case scenario estimates (Radiation Measurements, DOI: 10.1016/j.radmeas.2006.03.011).

But a trip to Mars and back would take over 2 years. Two potential solutions:

  • Travel much faster.
  • Use several times the amount of mass as would otherwise be used.

Of course both of these approaches require far more energy. The faster trip is especially problematic because more energy would be available to launch a space ship toward Mars than to launch it back toward Earth. Getting a ship to move fast enough on the return trip would be a big challenge.
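A rough sense of the scale of the problem, from the numbers above. This assumes a uniform dose rate outside Earth's magnetic field (my simplification, not the researchers'), with the trip length taken as a lower bound:

```python
# Illustrative only: the 3% lifetime-risk limit can be reached in as
# little as 200 days in a 4 cm aluminium-walled craft (worst case, per
# the cited paper). A Mars round trip takes over 2 years; treating the
# dose rate as uniform gives the ratio to the career limit.
limit_days = 200
trip_days = 2 * 365  # "over 2 years", taken as a lower bound

dose_ratio = trip_days / limit_days
print(f"Worst-case dose: {dose_ratio:.2f}x the career limit")  # 3.65x
```

Even under this conservative trip length, the crew blows past the limit several times over, which is why the fixes amount to going much faster or carrying much more shielding mass.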

One way to get a ship to Mars that would have lots of chemical rocket mass to propel a return trip: send two ships. First send one slowly, carrying a lot of fuel. That fuel would enter Mars orbit before humans even left Earth. Then humans could leave Earth on a fast ship and arrive to find the return fuel already waiting for them in Mars orbit.

Part of the radiation exposure would come while humans are on Mars. How to reduce that exposure? Send robots ahead of time that would burrow down underground to create living quarters in several places that would be within driving distance of each other. The astronauts could move from underground shelter to underground shelter.

Of course, all this requires huge amounts of money and resources. Could other approaches work? I can imagine beam technology for pushing spaceships to faster speeds with power sources on stations in orbit around Earth and Mars.

What else? Think small. Methods to cure cancer or prevent cancer would reduce the scale of the problem. Nanobots could repair astronaut bodies as the damage occurred. Or nanobots could kill cancer. So we can wait 30-40 years to go to Mars until we have the biotechnology and nanotechnology sufficient to reduce the risks from higher radiation exposure.

I do not see the point of going to Mars with today's technology. Better to first push the edge of what is possible before sending humans on a trip that would put people on another planet for a pretty limited period of time. Humans went to the moon and all we got were some cool videos.

By Randall Parker 2009 September 16 10:12 PM  Space Exploration
Entry Permalink | Comments(20)
Human Brains React To Tool Use Unlike Rhesus Monkeys

An area of the brain called the anterior supramarginal gyrus (aSMG) lights up with activity in humans when humans watch tool use. Rhesus monkeys do not show similar reactions in their brains when scanned with fMRI (functional magnetic resonance imaging).

Forty-seven people and five rhesus monkeys participated in the experiments. Two of the monkeys had been trained to obtain rewards beyond their reach by using either a rake or a pair of pliers.

Exactly the same areas of the brain became active in people and monkeys when they watched footage of hands simply grasping tools.

But when they watched videos of tools actually being used, the aSMG became active in the humans alone. It was silent even in the two trained monkeys.

Do people who study mechanical engineering show more activity in the aSMG when they watch tool use? Do women show less aSMG activity while watching tool use? Are there genetic variations within human populations that increase and decrease aSMG activation when watching tool use?

Imagine a sort of aptitude test where one's brain gets scanned while one looks at and tries various forms of activity. Such a test might be able to reveal what one would most enjoy doing in the long run.

By Randall Parker 2009 September 16 09:52 PM  Brain Evolution
Entry Permalink | Comments(0)
2009 September 15 Tuesday
Air Pollution Raises Blood Pressure

The quality of the air you breathe plays a role in determining whether you get high blood pressure.

ANN ARBOR, Mich. – It’s well known that measures such as exercise, a healthy diet and not smoking can help reduce high blood pressure, but researchers at the University of Michigan Health System have determined the very air we breathe can be an invisible catalyst to heart disease. Inhaling air pollution over just two hours caused a significant increase in diastolic blood pressure, the lower number on blood pressure readings, according to new U-M research.

People placed in air similar to that near an urban roadway experienced higher blood pressure. This is, parenthetically, an argument against doing long commutes in urban environments. The air you breathe in your car is bad for your health.

Eighty-three people in Ann Arbor and Toronto were involved in testing and breathed air pollution, concentrated by a mobile air quality research facility, that was similar to what would be found in an urban environment near a roadway.

“We looked at their blood vessels and then their responses before and after breathing high levels of air pollution,” explains Robert Bard, M.S., co-author and clinical research manager at U-M.

Ozone gases, a well-known component of air pollution, were not the biggest culprit. Rather, small microscopic particles about a 10th of the diameter of a human hair caused the rise in blood pressure and impaired blood vessel function, tests showed. The blood pressure increase was rapid and occurred within 2 hours, while the impairment in blood vessel function occurred later but lasted as long as 24 hours.

I'd like to know more about indoor air pollution and background air pollution for those who do not necessarily live near a highway. Would a home HEPA filter deliver real health benefits for most people? If you live in a city or near a busy street or freeway the argument for filtering one's air is more compelling.

By Randall Parker 2009 September 15 09:28 PM  Health Pollution Harm
Entry Permalink | Comments(5)
2009 September 14 Monday
Autonomous Robot Development For Space Missions

NASA is working on software to allow future Mars robots to make more decisions.

SOMETHING is moving. Two robots sitting motionless in the dust have spotted it. One, a six-wheeled rover, radios the other perched high on a rocky slope. Should they take a photo and beam it back to mission control? Time is short, they have a list of other tasks to complete, and the juice in their batteries is running low. The robots have seconds to decide. What should they do?

Today, mission control is a mere 10 metres away, in a garage here at NASA's Jet Propulsion Laboratory (JPL) in Pasadena, California. Engineers can step in at any time. But if the experiment succeeds and the robots spot the disturbance and decide to beam the pictures back to base, they will have moved one step closer to fulfilling NASA's vision of a future in which teams of smart space probes scour distant worlds, seeking out water or signs of life with little or no help from human controllers.

The 20 minute lag time for transmissions to Mars combined with the limited bandwidth available for sending pictures and other sensor data make greater autonomous capability highly desirable. Greater autonomy requires better image processing algorithms. For example a rover needs the ability to recognize obstacles and cruise around them.

Money spent on developing robotic technologies will do more to enable space exploration than far larger sums spent on rockets capable of launching humans into space. Plus, the development of robotic technology for this purpose could produce some technological spin-offs for automating work down here on Earth. In my view we should put more effort into robotic exploration while we wait for nanotubes that will enable the creation of a "space beanstalk" approach to getting into space. The costs are too high for launching all the equipment and supplies needed to support human life on a trip to Mars or for a colony on the moon.

By Randall Parker 2009 September 14 10:54 PM  Robots Space Exploration
Entry Permalink | Comments(2)
Half Of Eyewitnesses Can Be Fooled By Doctored Video

Human memory and human judgment can not be trusted. Doctored video caused people to believe they saw something they never saw.

Associate Professor Dr Kimberley Wade from the Department of Psychology led an experiment to see whether exposure to fabricated footage of an event could induce individuals to accuse another person of doing something they never did.

In the study, published in Applied Cognitive Psychology, Dr Wade found that almost 50% of people shown fake footage of an event they witnessed first hand were prepared to believe the video version rather than what they actually saw.

Scary thought: You get arrested for something you didn't do and then humans are called to testify as witnesses. Not Vulcans, oh no. Lowly, flawed, mistaken, gullible, foolish humans. The very thought is enough to make me ill. Lesson to take home: Innocent people should move to planet Vulcan lest they get convicted of crimes they didn't do.

You can't trust half the eyewitnesses.

In a game involving the use of fake money participants were deceived into believing that a person sitting next to them cheated. Shown a doctored video people were willing to claim they saw the cheating when it supposedly first occurred - even though the cheating never happened.

One third of the subjects were told that the person sat next to them was suspected of cheating. Another third were told the person had been caught on camera cheating, and the remaining group were actually shown the fake video footage. All subjects were then asked to sign a statement only if they had seen the cheating take place.

Nearly 40% of the participants who had seen the doctored video complied. Another 10% of the group signed when asked a second time by the researchers. Only 10% of those who were told the incident had been caught on film but were not shown the video agreed to sign, and about 5% of the control group who were just told about the cheating signed the statement.

Think about that. 5% didn't even need to see doctored video to agree to something they never saw.

By Randall Parker 2009 September 14 10:10 PM  Brain Limits
Entry Permalink | Comments(4)
Nanosolar Steps Out With Lower Cost Solar Cells

Nanosolar has announced $4.1 billion of orders for their thin film photovoltaic solar cells even though they are just starting up production.

On Wednesday, Nanosolar pulled back the curtain on its thin-film photovoltaic cell technology — which it claims is more efficient and less expensive than that of industry leader First Solar — and announced that it has secured $4.1 billion in orders for its solar panels.

CEO Martin Roscheisen claims lower costs than cost leader First Solar. Great news if true.

Nanosolar has an unusual manufacturing approach using thin aluminum foil and a wet chemistry for depositing a CIGS (copper indium gallium selenide) thin film on the foil as the foil moves rapidly. This continuous flow approach holds the prospect of much lower cost of manufacture. Their new plant in Germany can produce 640MW per year when operated 24x7.

Today Nanosolar demonstrated the completion of its European panel-assembly factory as part of an inauguration event attended by Germany's Minister of the Environment, the Governor of the State of Brandenburg, and a host of other leading public officials. Located in Luckenwalde near Berlin, the fully-automated factory processes Nanosolar cells into finished Nanosolar panels using innovative high-throughput manufacturing techniques and tooling developed by Nanosolar and its partners.

The panel factory is automated to sustain a production rate of one panel every ten seconds, or an annual capacity of 640MW when operated 24x7. Nanosolar also today announced that serial production in its San Jose, California, cell production factory commenced earlier this year.

That 640MW is probably in practice equivalent to about a quarter that amount on average - depending heavily on just where the panels are installed. I'm guessing roughly 10 such factories would produce the equivalent of a new nuclear power plant every year. If they can get their costs low enough then raising the capital for a big capacity build will be easy to do. They claim their process makes cost reduction much easier than competing approaches.
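The factory-to-nuclear-plant comparison works out like this. The 25% capacity factor and the ~1.6 GW plant size are my round-number assumptions; real PV capacity factors depend heavily on where the panels go:

```python
# Rough conversion from nameplate PV output to average delivered power.
# capacity_factor = 0.25 is an assumed round number; utility PV sites
# commonly run lower. The 1600 MW nuclear plant size is also a round number.
nameplate_mw = 640        # one factory's annual panel output, nameplate
capacity_factor = 0.25    # assumed average output as fraction of nameplate

avg_mw_per_factory_year = nameplate_mw * capacity_factor
factories_per_nuke = 1600 / avg_mw_per_factory_year
print(avg_mw_per_factory_year, factories_per_nuke)  # 160.0 10.0
```

So each year of one factory's output delivers about 160 MW average, and ten such factories roughly match one large nuclear plant per year.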

Nanosolar claims the electrical characteristics of their panels make them especially well suited for large utility installations.

Electrically, it is the industry’s highest-current thin panel, by as much as a factor of six. It is also the industry’s first photovoltaic module certified by TUV for a system voltage of 1500V, or 50% higher than the previously highest certified. Together this enables utility-scale panel array lengths and results in a host of substantial cost savings during the deployment of solar power plants.

Nanosolar's foil PV has produced electricity at 16.4% efficiency.

Our lab and production teams have managed to make more progress on efficiency than we had planned on in any of our business plans. Recall that we print CIGS onto inexpensive metal foil, that is, something that some have been skeptical can work while others have been wondering whether it can deliver efficient cells.

So we are pleased to announce that our low-cost printed-CIGS-on-metal-foil cell stack and process produces quite efficient cells: Earlier this year, NREL independently verified several of our cell foils to be as efficient as 16.4%.

In production, efficiency is running at only 11%. If Nanosolar can get their production cell efficiency up to 16.4%, that would substantially cut their costs.
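Here is why the efficiency gap matters so much for cost. If per-panel costs are dominated by area (foil, printing, assembly), then cost per watt scales roughly as the inverse of efficiency. That scaling assumption is mine, not Nanosolar's, but it gives a feel for the size of the prize:

```python
# Under an area-dominated cost model (my assumption), cost per watt
# scales as 1/efficiency, so raising production efficiency from 11%
# to the NREL-verified 16.4% cuts cost per watt proportionally.
production_eff = 11.0  # current production efficiency, percent
verified_eff = 16.4    # NREL-verified cell efficiency, percent

cost_reduction = 1 - production_eff / verified_eff
print(f"~{cost_reduction:.0%} lower cost per watt")  # ~33%
```

Roughly a third off the cost per watt, just from closing the gap between production and lab efficiency.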

PV costs are definitely on a downhill slope after several years of stagnant prices. Now we have two main competitors driving costs below $1 per watt. Good news. Will costs of supporting equipment such as grid tie inverters also drop? Or will the development of DC (direct current) appliances avoid the need for grid tie inverters for home PV installations?

Michael Graham Richard has some cool pictures of their manufacturing process.

Update: Be sure to read Chris Nelder's more detailed look at Nanosolar's cost advantages. They cut interconnect costs, reduce the need for aluminum frames, and cut costs in other ways relating to installation. They might end up far cheaper than First Solar, the current cost leader. Solyndra has been lauded in the press as the potential big rival to First Solar. But if the claims about Nanosolar are true then Nanosolar might beat both of them. Sure looks like PV costs are headed for a big fall.

By Randall Parker 2009 September 14 09:27 PM  Energy Solar
Entry Permalink | Comments(14)
2009 September 13 Sunday
New Silicon Solar Cell Design Cuts Costs

Yet another report about lowered costs for making photovoltaic solar cells.

Improvements to conventional solar cell manufacturing that could significantly increase the efficiency of multicrystalline silicon cells and bring down the cost of solar power by about 20 percent have been announced by startup 1366 Technologies of Lexington, MA.

They claim they can boost light-to-electron conversion efficiency for a very small increase in manufacturing cost. This helps the silicon cells compete against the currently lower cost thin film cells made by First Solar.

Such cost reduction would make solar power more competitive with conventional sources of electricity. In sunny environments, this could bring the cost of solar down to about 15 or 16 cents per kilowatt hour, says Craig Lund, 1366 Technologies's director of business development. That's cheaper than some conventional sources of electricity, especially those used during times of peak electricity demand.

The company is going to sell manufacturing equipment to PV makers. So this tech will drive down costs for multiple suppliers. Also, the efficiency boost will increase the amount of power you can get from the same roof area.

The company’s ultimate business will be to make and sell texturing and metallization machines that solar cell manufacturers can incorporate into their existing assembly lines. “The big news for us is that we’re going into commercial production with equipment that delivers an 18-percent multicrystalline cell,” Lund says.

For years progress in lowering photovoltaic costs seemed slow to non-existent. But rapidly growing demand (mostly caused by government policies) was masking progress in cutting costs. Many companies are working on practical innovations to lower costs and all this effort is beginning to show real results in the marketplace. I've become optimistic that solar power is going to become much cheaper.

What I'd like to know: can we expect grid tie inverter costs to fall as much as PV costs? Also, how practical is it to use some of the PV power directly in DC appliances? Or will DC (as distinct from AC) appliances remain too rare for PV DC power to bypass grid tie inverters and power appliances more cheaply?

By Randall Parker 2009 September 13 10:54 PM  Energy Solar
Entry Permalink | Comments(11)
2009 September 12 Saturday
Half Of Consumed Fish From Aquaculture

On the surface this sounds like good news since you might expect aquaculture fish to reduce the pressure to over-harvest wild fish. But no.

Aquaculture, once a fledgling industry, now accounts for 50 percent of the fish consumed globally, according to a new report by an international team of researchers. And while the industry is more efficient than ever, it is also putting a significant strain on marine resources by consuming large amounts of feed made from wild fish harvested from the sea, the authors conclude. Their findings are published in the Sept. 7 online edition of the Proceedings of the National Academy of Sciences (PNAS).

"Aquaculture is set to reach a landmark in 2009, supplying half of the total fish and shellfish for human consumption," the authors wrote. Between 1995 and 2007, global production of farmed fish nearly tripled in volume, in part because of rising consumer demand for long-chain omega-3 fatty acids. Oily fish, such as salmon, are a major source of these omega-3s, which are effective in reducing the risk of cardiovascular disease, according to the National Institutes of Health.

The problem: salmon and other aquaculture fish are fed with wild fish.

In 2006, aquaculture production was 51.7 million metric tons, and about 20 million metric tons of wild fish were harvested for the production of fishmeal. "It can take up to 5 pounds of wild fish to produce 1 pound of salmon, and we eat a lot of salmon," said Naylor, the William Wrigley Senior Fellow at Stanford's Woods Institute for the Environment and Freeman Spogli Institute for International Studies.

The amount of fish oil in the aquaculture salmon diet could be lowered. But this seems like an inadequate response in the face of rising demand.

One way to make salmon farming more environmentally sustainable is to simply lower the amount of fish oil in the salmon's diet. According to the authors, a mere 4 percent reduction in fish oil would significantly reduce the amount of wild fish needed to produce 1 pound of salmon from 5 pounds to just 3.9 pounds. In contrast, reducing fishmeal use by 4 percent would have very little environmental impact, they said.

What is really needed: genetic engineering to get land-based crops to produce the long chain omega 3 fatty acids DHA and EPA. Note that flax oil does not contain these long chain omega 3s. Flax contains alpha-linolenic acid, a shorter chain omega 3.

Update: Some people in the salmon aquaculture industry say that they use 5 lb of feed, of which only 1.5 lb is fishmeal, to produce 1 lb of salmon. That's not as bad as Naylor's 5 lb of fish to create 1 lb of salmon. Still, the ratio of fishmeal to salmon has got to get well below 1-to-1 in order to take pressure off the oceans.

By Randall Parker 2009 September 12 07:54 PM  Trends Resource Depletion
Entry Permalink | Comments(24)
2009 September 10 Thursday
$20000 Per Genome Sequencing For 8 At A Time

Just a month ago Stephen Quake sequenced his genome for $50000. That represents a drop of 80% from the $250k cost of a year ago and is orders of magnitude lower than the cost 10 years ago. But if you go out and pay $50k to get your genome sequenced you are probably spending too much. MIT's Technology Review reports a company called Complete Genomics has dropped the cost of genome sequencing even further.

And CEO Clifford Reid says the company will soon start charging $20,000 per genome for an order of eight genomes or more, and $5,000 apiece for an order of 1,000 or more, with variable pricing in between.
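The announced tier structure can be sketched as a simple price schedule. Only the two stated tiers are covered; the "variable pricing in between" is unspecified, so no curve is assumed for intermediate order sizes:

```python
# Sketch of the announced tiered pricing: $20k/genome at 8+ genomes,
# $5k/genome at 1,000+. Pricing below 8 and between the tiers was not
# stated, so those cases return None here.
def price_per_genome(order_size):
    if order_size >= 1000:
        return 5_000
    if order_size >= 8:
        return 20_000
    return None  # not stated in the report

print(price_per_genome(8) * 8)        # 160000 for a batch of eight
print(price_per_genome(1000) * 1000)  # 5000000 for a thousand
```

So a minimum eight-genome order runs $160,000 total, while a thousand-genome order runs $5 million, bringing the per-genome price down by 75%.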

How low does the price have to get for you to pay to get your genome sequenced?

The biggest problem at this point is just what do you do with the information? We are going to go thru a period where genome sequencing is really cheap but the information about your DNA letter sequence isn't of much help to most people. We need to know what all the differences mean.

Once the significance of lots of genetic sequence information becomes known how useful will it be in daily life? We'll certainly know ourselves better. But if, say, you've got some genetic variants that increase your risk of cancer, what do you do with this information? Perhaps get colonoscopies more often if your risk of colon cancer is elevated. But not all cancers lend themselves to preventative testing and not all tests are easy to do.

I expect understanding of genetic variations will play a big role in changing mating choices and in embryo selection. Knowledge of genetic variants will help some in dietary choices too. Got any ideas on how detailed knowledge of your genetic variations will be useful to you?

By Randall Parker 2009 September 10 08:41 PM  Biotech Advance Rates
Entry Permalink | Comments(3)
Snort Stem Cells, Not Cocaine

New Scientist reports from the latest Strategies for Engineered Negligible Senescence (SENS - all about reversing the aging process) conference in Cambridge UK and reveals some scientists find they can deliver stem cells into mouse brains with nose drops.

Since proteins, bacteria and viruses can enter the brain this way, Lusine Danielyan at the University Hospital of Tübingen in Germany, and her colleagues, wondered if stem cells would also migrate into the brain through the cribriform plate.

To test their idea, they dripped a suspension of fluorescently labelled stem cells into the noses of mice. The mice snorted them high into their noses, and the cells migrated through the cribriform plate. Then they travelled either into the olfactory bulb - the part of the brain that detects and deciphers odours - or into the cerebrospinal fluid lining the skull, migrating across the brain. The stem cells then moved deeper into the brain.

Now all we need are stem cells suitably programmed to, for example, replace aged neurons, aged glial cells, and even aged cells in brain arteries and veins. Then snort up.

Improvements in methods to create induced pluripotent stem cells (most recently at Stanford using fat cells) create the prospect of stem cells made from one's own body. No need to worry about immune rejection. We probably still need further improvements to reduce the risk that the stem cells will turn cancerous. Then we need improvements in methods to turn pluripotent stem cells into whatever stem cell types we most want to send into the brain.

Okay, suppose we jump ahead 10 or 20 years and those problems are solved. Time to snort stem cells to improve our brains. That'll help aging brains. But where it gets really interesting is when we discover which genetic variations contribute to higher intelligence. Can stem cells genetically engineered for high IQ genes boost our intelligence? If so, snorting stem cells could make a whole society smarter. In 20 years will that become possible?

By Randall Parker 2009 September 10 07:47 PM  Brain Disorder Repair
Entry Permalink | Comments(1)
2009 September 09 Wednesday
Pretty Women Make Guys Dumber

Guys should avoid taking classes that have good-looking women in them if the goal is to learn anything. The cognitive performance of men declines after interacting with attractive women.

The present research tested the prediction that mixed-sex interactions may temporarily impair cognitive functioning. Two studies, in which participants interacted either with a same-sex or opposite-sex other, demonstrated that men’s (but not women’s) cognitive performance declined following a mixed-sex encounter. In line with our theoretical reasoning, this effect occurred more strongly to the extent that the opposite-sex other was perceived as more attractive (Study 1), and to the extent that participants reported higher levels of impression management motivation (Study 2). Implications for the general role of interpersonal processes in cognitive functioning, and some practical implications, are discussed.

Spend your cognitive resources wisely. Guys, trying to impress women is often counterproductive. Plus, it diminishes your ability to do other things.

Being in a relationship didn't help guys maintain their ability to think about mentally challenging tasks.

In their second study, the researchers had 53 male and 58 female college participants interact with each other, instead of using a confederate for the interactions (like they did in the first study). Men (but not women), likewise, displayed a decline in performance on a different, very cognitively demanding task, requiring both task-switching and inhibition. Also, just like the first study, this effect held independent of whether the participant was currently in a relationship.

Curiously, that subset of women who admitted they were trying to impress the guy also experienced declines in cognitive performance. So fewer women than men try to impress. But those who do become dumber just like men do.

This all demonstrates how we are slaved to our genes.

Psychologist Dr George Fieldman, a member of the British Psychological Society, said the findings reflect the fact that men are programmed to think about ways to pass on their genes.

'When a man meets a pretty woman, he is what we call 'reproductively focused'.

Of course all that reproductive focus is defeated by birth control. Nature is foiled until natural selection creates future generations with stronger desire to make babies.

By Randall Parker 2009 September 09 11:12 PM  Brain Sex Differences
Entry Permalink | Comments(2)
Big Differences Between European Countries In Heart Risks

The big smoker countries in Europe have much higher rates of heart disease death under age 65.

While heart disease remains the leading cause of death in Europe, mortality rates are falling in most (but not all) countries, according to new findings released by the EuroHeart mapping project.(1) However, this detailed research, part of a three-year programme to analyse cardiovascular health and prevention policies in 16 European countries, also reveals huge inequalities among countries both in the rate of cardiovascular mortality and in national prevention programmes.

  • Highest rates of mortality from coronary heart disease (CHD) in men under 65 were found in Hungary (105 per 100,000 population), Estonia (104), Slovakia (74), Greece (50), Finland (48) and UK (44).
  • Highest rates for women under 65 were found in Hungary (28), Estonia (20), Slovakia (19), UK (11), Greece (10) and Belgium (9).
  • Lowest rates for men under 65 were found in France (17), Netherlands (22), Italy (25) and Norway (27).
  • Lowest rates for women under 65 were found in Iceland (3), France (3), Slovenia (5) and Italy (5).
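One striking pattern in those figures is the sex gap. A quick sketch using only the rates quoted above shows men in the worst-hit countries dying of CHD before 65 at roughly four to five times the female rate:

```python
# CHD mortality rates (deaths per 100,000, under age 65) from the EuroHeart figures above.
male = {"Hungary": 105, "Estonia": 104, "Slovakia": 74, "Greece": 50, "UK": 44}
female = {"Hungary": 28, "Estonia": 20, "Slovakia": 19, "Greece": 10, "UK": 11}

for country in male:
    ratio = male[country] / female[country]
    print(f"{country}: men die of CHD at {ratio:.1f}x the female rate")
```

The ratios run from about 3.8 (Hungary) to 5.2 (Estonia), so the male excess holds in both the highest- and lower-mortality countries on the list.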

This pattern was also reflected (though not exactly mirrored) in risk factor prevalence, where, for example, Greece (46%), Estonia (42%), Slovakia (41%), Germany (37%) and Hungary (37%) had the highest rates of cigarette smoking.

Hungary and Estonia have a lot of unhealthy people. The Greeks need to stop smoking themselves to death.

The Finns have made big improvements in lowering coronary heart disease (CHD) mortality, whereas Greece is losing ground. My guess: part of the Greek trend is due to fewer Greeks eating the traditional Mediterranean diet.

There are also noticeable differences in trends in CHD mortality; in Finland mortality rates from CHD declined by 76% from 1972 to 2005; in the same period in Greece, mortality rates for CHD increased by 11%. In nine of the 16 EuroHeart countries, the trends in CHD death rates in women show that they have declined less than in men.

Europe is clearly lagging the United States in turning away from the demon weed. Kentucky had the highest US incidence of smoking in 2007 at 28.3%, while Utah's Mormons lived a pure life at only 11.7%. But the US Virgin Islands, at 8.7%, beat even Utah. Good for those Virgin Islanders. The Virgin Islands, center of heart-healthy living. Who knew?

In 2007, the median prevalence of adult current smoking in the 50 states and DC was 19.8%. Among states, current smoking prevalence was highest in Kentucky (28.3%), West Virginia (27.0%), and Oklahoma (25.8%); and lowest in Utah (11.7%), California (14.3%), and Connecticut (15.5%). Smoking prevalence was 8.7% in USVI, 12.2% in PR, and 31.1% in Guam. Median smoking prevalence among the 50 states and DC was 21.3% (range: 15.5%-28.8%) for men and 18.4% (range: 8.0%-27.8%) for women. Men had a significantly higher prevalence of smoking than women in 30 states, DC, and all three territories.

National smoking numbers for the United States show a higher rate for ages 18-24 than for the adult population overall. So then has the decline in smoking bottomed out?

  • An estimated, 20.8% of all adults (45.3 million people) smoke cigarettes in the United States.4
  • Cigarette smoking estimates by age are as follows: 18–24 years (23.9%), 25–44 years (23.5%), 45–64 years (21.8%), and 65 years or older (10.2%).4
  • Cigarette smoking is more common among men (23.9%) than women (18.0%).4
  • Prevalence of cigarette smoking is highest among American Indians/Alaska Natives (32.4%), followed by African Americans (23.0%), whites (21.9%), Hispanics (15.2%), and Asians [excluding Native Hawaiians and other Pacific Islanders] (10.4%).4
  • Cigarette smoking estimates are highest for adults with a General Education Development (GED) diploma (46.0%) or 9–11 years of education (35.4%), and lowest for adults with an undergraduate college degree (9.6%) or a graduate college degree (6.6%).4
  • Cigarette smoking is more common among adults who live below the poverty level (30.6%) than among those living at or above the poverty level (20.4%).4

Smokers need another drug that gives them as much nerve calming but without the cardiovascular damage.

By Randall Parker 2009 September 09 08:20 PM  Aging Cardiovascular Studies
Entry Permalink | Comments(6)
2009 September 08 Tuesday
Brain Mistake Signal Stronger In Better Academic Performers

The error-related negativity (ERN) signal is stronger when better students make mistakes.

In the first study ever to link academic performance to a neural signal, participants performed a Stroop task – a well-known test of cognitive control – while hooked up to EEG electrodes that measured their brain activity.

U of T researchers monitored a brain signal known as the error-related negativity (ERN) in each participant's brain while they completed the task. ERN signals are observed approximately 100 milliseconds after a mistake is made, and are involved in cognitive control and self-regulation. Large ERN signals indicate a participant is responding strongly when they've made a mistake; smaller ERN signals indicate they are less responsive to their mistakes.

The researchers then compared the size of each participant's ERN signals to their official university transcript grades.

"Those students with larger ERN signals did significantly better in school, showing that these neural signals have important real world implications," says doctoral researcher Jacob Hirsh.

Did higher academic performers do better only because their brains could recognize more mistakes? Or did they also do better because their brains more loudly signaled a mistake? Could a lower performing person improve their performance by listening more carefully to their doubts?

I'd like to know how strongly the ERN signal's strength correlates with IQ. Does use of ERN signal in combination with IQ predict academic performance more accurately than using either of these measures alone? Not surprisingly, half the ERN signal's strength is down to your genes.

Because the size of the ERN is only 50 per cent determined by genetics, though, Hirsh says students may be able to improve their ERN signals by attending to their mistakes, thereby helping to improve their academic performance. "The ERN is not set in stone," he says.

It is not obvious to me that most people can become better at recognizing when they've made mistakes.

By Randall Parker 2009 September 08 11:44 PM  Brain Performance
Entry Permalink | Comments(5)
Inflammation And Infections Accelerate Alzheimer's Disease?

More evidence that inflammation contributes to the development of Alzheimer's.

The study found that people who had respiratory, gastrointestinal or other infections or even bumps and bruises from a fall were more likely to have high blood levels of tumor necrosis factor-α, a protein involved in the inflammatory process, and were also more likely to experience memory loss or other types of cognitive decline than people who did not have infections and who had low levels of the protein.

The blood levels and cognitive abilities of 222 people with Alzheimer's disease with an average age of 83 were measured at the beginning of the study and three more times over six months. Caregivers were interviewed to determine whether the participants had experienced any infections or accidental injury that could lead to inflammation.

A total of 110 people experienced an infection or injury that led to inflammation during the study. Those people experienced memory loss at twice the rate of those who did not have infections or injuries.

People who had high levels of the protein in their blood at the beginning of the study, which may indicate chronic inflammation, had memory loss at four times the rate of those with low levels of the protein at the start of the study. Those who had high levels of the protein at the start of the study who also experienced acute infections during the study had memory loss at 10 times the rate of those who started with low levels and had no infections over the six-month period.
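The quoted rate multipliers are roughly multiplicative: acute inflammation alone doubled the rate of memory loss, chronic inflammation alone quadrupled it, and the combination came in slightly above the product of the two. A sketch of that reasoning, using only the relative rates quoted above:

```python
# Relative rates of memory decline from the study quoted above (baseline = 1).
acute_only = 2    # infection or injury during the study
chronic_only = 4  # high TNF-alpha at the start of the study
both = 10         # both factors together

# If the two risks combined independently (multiplicatively) we'd expect 2 * 4 = 8;
# the observed 10x is at least multiplicative, hinting at some synergy.
expected_if_independent = acute_only * chronic_only
print(expected_if_independent, both)  # prints: 8 10
```

That pattern is consistent with chronic and acute inflammation compounding each other rather than merely adding.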

How to make use of this info? Eat foods that lower inflammation. The Mediterranean diet, plenty of sleep, and omega 3 fatty acids will likely all reduce your risk of Alzheimer's and dementia.

By Randall Parker 2009 September 08 11:34 PM  Brain Alzheimers Disease
Entry Permalink | Comments(0)
ADHD Driven By Dopamine Receptor Concentrations?

PET scans show brain receptor differences in people with attention deficit hyperactivity disorder (ADHD).

UPTON, NY — A brain-imaging study conducted at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory provides the first definitive evidence that patients suffering from attention deficit hyperactivity disorder (ADHD) have lower-than-normal levels of certain proteins essential for experiencing reward and motivation.

"These deficits in the brain's reward system may help explain clinical symptoms of ADHD, including inattention and reduced motivation, as well as the propensity for complications such as drug abuse and obesity among ADHD patients," said lead author Nora Volkow, Director of the National Institute on Drug Abuse and a long-time collaborator on neuroimaging research at Brookhaven Lab.

The study, published in the September 9, 2009, issue of the Journal of the American Medical Association, also has important implications for treatment. "Finding ways to address the underlying reward-system deficit could improve the direct clinical outcome of ADHD, and potentially reduce the likelihood of other negative consequences of this condition," said study co-author Gene-Jack Wang, chair of Brookhaven's medical department.

Can't pay attention? It is all down to your dopamine receptors. You don't have enough of them to get rewarded for paying attention.

The scientists used positron emission tomography (PET) to measure two markers of the dopamine system — dopamine receptors, to which the chemical messenger binds to propagate the "reward" signal, and dopamine transporters, which take up and recycle excess dopamine after the signal is sent.

Lying in a PET scanner, each patient was injected with a minute amount of a "radiotracer" compound — a chemical labeled with a radioactive form of carbon and designed to bind specifically to one of the targets. Different tracers were used for each target, and patients were scanned for each at separate times. By detecting the signal from the radiotracers, the PET machine can measure the receptor and transporter locations and concentrations in various parts of the brain.

The results clearly showed that, relative to the healthy control subjects, the ADHD patients had lower levels of dopamine receptors and transporters in the accumbens and midbrain — two key regions of the brain directly involved in processing motivation and reward. In addition, the measurements of dopamine markers correlated with measures of behavior and clinical observations of ADHD symptoms, such as reduced levels of attention as measured by standard psychological tests.

A drug that would stimulate neurons to increase the synthesis of dopamine receptors would probably improve the ability of people to pay attention for longer periods of time. Do any existing drugs increase dopamine receptor synthesis?

Think about the continuing stream of discoveries about how neurotransmitter receptor concentrations alter behavior and mood. These discoveries are building up a foundation of knowledge needed to develop drugs that will alter mood, motivation, and intellectual performance. We will become more pharmacologically malleable as a result of drug development guided by these discoveries. Those new drugs might be 10 or 20 years off. But they'll come eventually.

By Randall Parker 2009 September 08 11:25 PM  Brain Performance
Entry Permalink | Comments(1)
2009 September 06 Sunday
Solar Photovoltaic Price Drops Continue

An article in Digitimes claims the price of polysilicon-based solar photovoltaic (PV) solar cells has dropped by more than half since 3Q 2008 and may drop to a quarter of the current price by 2011.

The price per watt has now dropped to US$1.80 for polysilicon-based products, which is lower than the US$1.85 level The Information Network previously thought the industry would see at the end of 2009. By way of comparison, the average selling price in the third quarter of 2008 was US$4.05 per watt.
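The price-drop claims in the article can be checked with quick arithmetic: $4.05/W down to $1.80/W is a decline of just over half, and "a quarter of the current price" would put 2011 modules around $0.45/W.

```python
q3_2008 = 4.05  # average selling price, US$/W, from the quote above
now = 1.80      # current polysilicon PV price, US$/W

decline = 1 - now / q3_2008
print(f"decline since 3Q 2008: {decline:.0%}")  # about 56%

projected_2011 = now / 4  # "a quarter of the current price"
print(f"implied 2011 price: ${projected_2011:.2f}/W")
```

So the "more than half" claim checks out, and the 2011 projection implies modules well under the dollar-a-watt threshold discussed below.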

Whether the rapid decline in prices will continue remains to be seen. Currently I doubt any manufacturer aside from First Solar can make a profit below $1 per watt. Anyone have insights into the costs of China's biggest PV makers?

Also, anyone have insights into how much prices for residential PV installations are declining?

The decline in demand as a result of the recession, combined with a big ramp-up of manufacturing capacity (itself caused by the high energy prices that helped trigger the recession), created the glut. But China now is aiming to become the dominant maker of PV. Suntech Power of China is selling product in the US at a loss in order to gain market share.

Chinese companies have already played a leading role in pushing down the price of solar panels by almost half over the last year. Shi Zhengrong, the chief executive and founder of China’s biggest solar panel manufacturer, Suntech Power Holdings, said in an interview here that Suntech, to build market share, is selling solar panels on the American market for less than the cost of the materials, assembly and shipping.

But even in the solar industry, many worry that Western companies may have fragile prospects when competing with Chinese companies that have cheap loans, electricity and labor, paying recent college graduates in engineering $7,000 a year.

But if Suntech Power is already selling at a loss at today's prices I fail to see how it can afford to sell at a half or a quarter of today's prices. I doubt manufacturing costs will drop that quickly.

Manufacturing capacity looks set to continue growing very rapidly.

DisplaySearch, a unit of research firm NPD Group that focuses on the display and solar markets, reports today that global solar cell manufacturing capacity is expected to grow 56% in 2009 to 17 gigawatts, with further growth at a 49% compounded rate to more than 42 GW in 2013.

Might China's planned very rapid growth in use of solar power prevent the steepest PV price declines?

The country may raise its solar-power capacity to 2,000 megawatts by 2011 and 20,000 megawatts by 2020, from 150 megawatts at the end of last year, Cui Rongqiang, head of the Shanghai Solar Energy Society, said by telephone today.
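The implied growth rates behind those targets are worth working out: getting from 150 MW to 2,000 MW by 2011 requires more than doubling capacity every year, while the 2011-to-2020 leg is a much gentler ramp. A sketch, assuming end-of-year figures:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by a start value, end value, and span."""
    return (end / start) ** (1 / years) - 1

# China's solar capacity targets quoted above (megawatts).
phase1 = cagr(150, 2_000, 3)     # end of 2008 -> 2011
phase2 = cagr(2_000, 20_000, 9)  # 2011 -> 2020
print(f"2008-2011: {phase1:.0%}/yr, 2011-2020: {phase2:.0%}/yr")
```

The first phase implies roughly 137% annual growth, the second about 29% per year, so the near-term target is by far the more aggressive one.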

I do not come across articles on China's use of concentrating solar power (CSP). Yet at least in the United States CSP looks cheaper than PV. CSP also makes it much easier to shift production from afternoon into evening via heat storage in molten salts.

But as a percentage of total energy used the contributions from renewable energy will remain small.

Renewable energy will account for 4 percent of the city’s total consumption by next year and 6 percent by 2020, the Beijing Municipal Development and Reform Commission said in a statement handed out to reporters at a briefing today.

By 2020 renewables might account for 15% of China's energy consumption.

The central government aims to meet 15% of its energy needs through renewable sources by 2020. Beijing hopes to triple its solar heater capacity by the same year, according to Greenpeace China.

Can Suntech or another Chinese PV maker catch up with First Solar on costs? Can PV catch up with CSP on costs? Anyone got insights on these questions?

By Randall Parker 2009 September 06 01:29 PM  Energy Solar
Entry Permalink | Comments(19)
China To Double Nuclear Power Plant Build Rate

Bloomberg reports on an interview with the President of Japan Steel Works indicating that China will build more than double the number of nuclear reactors previously estimated. 132 units will take China way past the US (at 104 units, and probably a smaller average reactor size) in total nuclear reactor capacity.

The country may build about 22 reactors in the five years ending 2010 and 132 units thereafter, compared with a company estimate last year for a total 60 reactors, President Ikuo Sato said in an interview. Japan Steel Works has the only plant that makes the central part of a large-size nuclear reactor’s containment vessel in a single piece, reducing radiation risk.

More nukes means a slower growth rate in coal electric power plant construction. The total amount of CO2 emissions from Chinese plants will continue to rise. But it won't rise as fast or as far as previously projected.

That high build rate should bring down costs and make China the low cost leader in nuclear power plant construction.

By Randall Parker 2009 September 06 01:10 PM  Energy Nuclear
Entry Permalink | Comments(4)
Natural Arctic Cooling Reversed Starting 1900

A long term climate trend in the Arctic has reversed course.

Warming from greenhouse gases has trumped the Arctic's millennia-long natural cooling cycle, suggests new research. Although the Arctic has been receiving less energy from the summer sun for the past 8,000 years, Arctic summer temperatures began climbing in 1900 and accelerated after 1950.

The decade from 1999 to 2008 was the warmest in the Arctic in two millennia, scientists report in the journal Science. Arctic temperatures are now 2.2 F (1.2 C) warmer than in 1900.

To track Arctic temperatures 2,000 years into the past, the research team analyzed natural signals recorded in lake sediments, tree rings and ice cores. The natural archives are so detailed the team was able to reconstruct past Arctic temperatures decade by decade.

As part of a 21,000-year cycle, the Arctic has been getting progressively less summertime energy from the sun for the last 8,000 years. That decline won't reverse for another 4,000 years.

The new research shows the Arctic was cooling from A.D. 1 until 1900, as expected. However, the Arctic began warming around 1900, according to both the natural archives and the instrumental records.

What I want to know: If we wanted to select a temperature at which to stabilize the world's climate (since, after all, it naturally warms and cools), at what temperature would we need to stabilize it to stop average global ice melting? My guess is that temperature is cooler than it is today. Is that temperature also cooler than it was in 1900?

If I understand this correctly, the 1.2 C warming since 1900 reverses the last 6 thousand years of cooling (1.2 C divided by 0.2 C per thousand years).

The analysis shows that summer temperatures in the Arctic, in step with reduced energy from the sun, cooled at an average rate of about 0.36 F (0.2 C) per thousand years -- until the 20th century.
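The back-of-envelope above checks out against the quoted figures: at 0.2 C of cooling per thousand years, 1.2 C of warming undoes six millennia of the natural trend.

```python
warming_since_1900 = 1.2  # degrees C, from the quoted study
cooling_rate = 0.2        # degrees C per thousand years, from the quote above

millennia_reversed = warming_since_1900 / cooling_rate
print(f"{millennia_reversed:.0f} thousand years of natural cooling reversed")  # 6

# Sanity check on the unit conversion quoted earlier: 1.2 C should be about 2.2 F.
print(f"{warming_since_1900 * 9 / 5:.1f} F")  # 2.2 F (2.16 before rounding)
```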

Naturally we should be cooling as we head back into the next ice age. But do you want the natural cycle to continue? Or do you want whatever climate we get as a side effect of continued industrialization? Or do you want to select a climate target and manipulate climate policy to achieve that target?

By Randall Parker 2009 September 06 09:41 AM  Climate Trends
Entry Permalink | Comments(9)
Sea Ice Down 53% Since 1980

US Navy submarine upward-looking sonar profiles of the ice during the Cold War provide a longer term view into sea ice thickness changes.

While satellites provide accurate and expansive coverage of ice in the Arctic Ocean, the records are relatively new. Satellites have only monitored sea ice extent since 1973. NASA's Ice, Cloud, and land Elevation Satellite (ICESat) has been on the task since 2003, allowing researchers to estimate ice thickness as well.

To extend the record, Kwok and Drew Rothrock of the University of Washington, Seattle, recently combined the high spatial coverage from satellites with a longer record from Cold War submarines to piece together a history of ice thickness that spans close to 50 years.

Analysis of the new record shows that since a peak in 1980, sea ice thickness has declined 53 percent. "It's an astonishing number," Kwok said. The study, published online August 6 in Geophysical Research Letters, shows that the current thinning of Arctic sea ice has actually been going on for quite some time.

Average sea ice thickness declined from 3.64 to 1.89 meters. But if 1980 was a peak then what was the ice like in the 1960s and 1970s? Anyone seen the paper?

During the Cold War, the submarines collected upward-looking sonar profiles, for navigation and defense, and converted the information into an estimate of ice thickness. Scientists also gathered profiles during a five-year collaboration between the Navy and academic researchers called the Scientific Ice Expeditions, or "SCICEX," of which Rothrock was a participant. In total, declassified submarine data span nearly five decades—from 1958 to 2000—and cover a study area of more than 1 million square miles, or close to 40 percent of the Arctic Ocean.

Kwok and Rothrock compared the submarine data with the newer ICESat data from the same study area and spanning 2003 to 2007. The combined record shows that ice thickness in winter of 1980 averaged 3.64 meters. By the end of 2007, the average was 1.89 meters.
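Note that the winter averages quoted here (3.64 m down to 1.89 m) work out to about a 48% decline, a bit less than the 53% headline figure; the paper's 53% may be computed over a different seasonal mix or study region, so treat this as a rough cross-check:

```python
thickness_1980 = 3.64  # meters, winter average, from the quote above
thickness_2007 = 1.89  # meters, end of 2007, from the quote above

decline = 1 - thickness_2007 / thickness_1980
print(f"decline in winter average thickness: {decline:.0%}")  # about 48%
```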

How good is the instrumentation of the Earth's climate at this point? How many more sensors would be needed to measure heat flows accurately enough to adjust for major influences and create better future projections?

By Randall Parker 2009 September 06 08:56 AM  Climate Trends
Entry Permalink | Comments(4)
2009 September 05 Saturday
100 To 200 New Mutations Per Person

We've each got our own unique genetic mutations. Each person has 100-200 new genetic mutations that their parents did not have.

Scientists at the Wellcome Trust Sanger Institute and colleagues have made the first direct measurement of the general rate of genetic mutation in humans.

They calculated that there are 100-200 new DNA mutations (single base changes in our DNA sequence that are different from the sequence inherited from our parents) from generation to generation. Almost all were harmless, with no apparent effect on our health or appearance, and only four mutations accumulated over 13 generations.

The findings and method developed by the researchers furthers our understanding of mutation rates and could help us test ways to help reduce mutations. Mutation is the source of genetic variation, which can lead to diseases such as cancer. They also provide a ‘molecular clock’ for measuring evolutionary timescales.

I expect the use of in vitro fertilization (IVF) combined with pre-implantation genetic testing to reduce the frequency of new functionally significant mutations in offspring. As we learn more about the functional meaning of genetic variations (driven by huge declines in DNA sequencing costs) people will increasingly use the discoveries to guide embryo choices with IVF.

The researchers looked at Y chromosomes. This means they missed the accumulation of lethal recessive mutations that can only occur on chromosomes other than the Y chromosome. But they were still able to get a good picture of the rate of mutation. Keep in mind their sample set was small. One would need to repeat this experiment with more people to get a more precise idea of the range of rates of mutation accumulation.

In the new study, the researchers looked at the Y chromosomes of two Chinese men born 13 generations apart. The Y chromosome is passed unchanged from father to son, so mutations accumulate slowly over generations.

The researchers sequenced the chromosomes and compared them with the reference sequence from the original human genome project to find single base pair differences in the sequence.

They found four significant mutations between the Y chromosomes of the two men, despite the many generations of separation. They then calculated that the rate of mutation is equivalent to one mutation in every 15-30 million nucleotides.

“These four mutations gave us the exact mutation rate - one in 30 million nucleotides each generation - that we had expected,” said Dr Tyler-Smith.
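The quoted per-nucleotide rate squares with the 100-200 new mutations per person headline. A rough sketch (the 3.2-billion-base haploid genome size is my assumption, not a figure from the press release):

```python
haploid_genome = 3.2e9  # bases; approximate human haploid genome size (assumed)

# Rate quoted above: one mutation per 15-30 million nucleotides per generation.
for bases_per_mutation in (30e6, 15e6):
    per_haploid = haploid_genome / bases_per_mutation
    print(f"~{per_haploid:.0f} new mutations per haploid genome per generation")
# Roughly 107-213, bracketing the 100-200 figure in the press release.
```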

The steady generation of new mutations eventually produces functionally significant ones. Of those which have functional significance, most are harmful. Some cause death during fetal development or at a young age after birth. Others make people debilitated in various ways. A much smaller number enhance or alter performance in ways that are advantageous in some environments.

As humans spread out across the planet and encountered different environments, diseases, dangers, and food sources, selective pressures on humans changed and human evolution accelerated to adapt us to local niches. As we developed early technologies and civilizations we exerted even stronger selective pressures on ourselves. In their book The 10,000 Year Explosion: How Civilization Accelerated Human Evolution Gregory Cochran and Henry Harpending explain how our own civilizations accelerated our evolution. If you look around you can spot signs that selective pressures are at work today.

By Randall Parker 2009 September 05 02:35 PM  Trends, Human Evolution
Entry Permalink | Comments(2)
2009 September 04 Friday
Financially Impulsive Are Generally Impulsive

People who are impulsive show impulsive tendencies in multiple areas.

The study, conducted through the BBC website with over 40,000 participants, measured people's financial impulsivity by asking whether they would prefer to receive £45 in three days or £70 in three months. The survey asked a related series of questions about other behaviours. Nearly half of those who responded preferred the smaller-sooner sum of money, and these people were more likely to show a raft of other impulsive behaviours.

Dr Stian Reimers, ESRC Centre for Economic Learning and Social Evolution at UCL, says: "One of the big questions about people's financial planning is whether decisions to spend or save come from personal knowledge and experience of money matters or whether they reflect someone's personality more generally.

"Our research shows that people with an impulsive money-today attitude ignore the future in other ways. For example, they are more likely to smoke and more likely to be overweight, which may reflect a preference for immediate pleasure of nicotine and food over long-term good health. People who chose to take the smaller-sooner amount of money were also more likely to admit to having had an affair in recent years, suggesting another manifestation of desire for immediate gratification."

So impulsiveness tends to express itself in many ways in the same person. I wonder, though, whether some readers would admit to impulsiveness in only very narrow areas of behavior. Anyone have atypical impulsiveness?

Someone who takes the £45 now is turning down a 56% return in just 3 months. You can't expect such a person to save for their rent, let alone for their retirement. The rest of us end up paying through taxes and other means to support the most impulsive.
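To put that 56% in perspective: compounded over a year, turning down £70 in three months for £45 today means passing up an annualized return of nearly 500%.

```python
sooner, later = 45, 70  # pounds, the two options from the study
months = 3

period_return = later / sooner - 1
annualized = (later / sooner) ** (12 / months) - 1
print(f"3-month return: {period_return:.0%}")  # about 56%
print(f"annualized: {annualized:.0%}")         # about 486%
```

No legitimate investment offers anything close, which is what makes the smaller-sooner choice such a clean marker of impulsivity.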

Imagine credit card issuers were allowed to somehow measure impulsiveness. What would they do with the results? On the one hand, impulsive people are more likely to run up debts they can not pay. On the other hand, credit card companies make money by charging high interest rates to people who insist on immediate gratification. Credit card issuers would probably ideally like to prey on people who have moderate impulsiveness and substantial earnings capability.

The most surprising thing I've read lately related to impulsiveness is that men who have had fewer sexual partners have more babies. I take that to mean that at least monogamy is being selected for. Possibly lower impulsiveness is also getting selected for. I'd like to see an impulsiveness study on middle aged men and women where they are questioned about their offspring. Do less impulsive people have more babies? That'd be good news if so.

By Randall Parker 2009 September 04 02:17 PM  Brain Economics
Entry Permalink | Comments(3)
Lowest Cost Solar Power At Coal Plants

Okay, this is really strange. But if you integrate concentrating solar power mirrors with the existing electric generating equipment of a coal electric plant, the result is the cheapest way to generate electric power from the sun.

A project that will add solar power to a coal-fired power plant could reduce the amount of coal required to generate electricity and dramatically cut the cost of solar power.

Effectively this avoids building electric generating capital equipment that would sit idle at night.

"The thing that's attractive about this is you only have to buy the solar field portion of the plant, which is 50 to 60 percent of the cost of the plant," says Hank Price, director of technology at Abengoa Solar. That could effectively make solar-thermal power about 30 to 50 percent cheaper, according to various estimates. That would equate to a range of about six to 12 cents per kilowatt-hour, which is competitive with many conventional sources of electricity. "It's potentially the most cost-effective way to get significant solar power on the grid," he says.
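The arithmetic behind the quoted savings is straightforward: if the solar field is 50 to 60 percent of a standalone plant's cost and it is the only part you buy, the upper bound on the saving is the 40 to 50 percent you skip (the quoted "30 to 50 percent cheaper" presumably nets out integration costs). A sketch:

```python
# Share of a standalone solar-thermal plant's cost that is the solar field,
# per the quote above.
for field_share in (0.50, 0.60):
    savings = 1 - field_share  # the power block you no longer have to buy
    print(f"field = {field_share:.0%} of plant cost -> up to {savings:.0%} saved")
```

That 40-50% ceiling brackets the article's 30-50% estimate, with the gap plausibly covering the cost of tying the solar field into the coal plant.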

There are obvious limits to this approach. Coal plants need to be in sunny areas and to have land available near them. Plus, there must be some physical distance limitation for how far the heated liquid can be transported to the coal plant. So the percentage of total electricity coming from concentrating solar at an existing coal plant is limited to 10-15% according to the article.

By Randall Parker 2009 September 04 08:53 AM  Energy Solar
Entry Permalink | Comments(3)
2009 September 03 Thursday
Higher Density Housing To Do Little To Cut Fossil Fuels

A National Academy of Sciences report throws cold water on urbanism as a way to cut carbon dioxide emissions.

Urban planners hoping to help mitigate CO2 emissions by increasing housing density would do better to focus on fuel-efficiency improvements to vehicles, investments in renewable energy, and cap and trade legislation now being voted on in Congress, according to the study, released Tuesday. It concludes that increasing population density in metropolitan areas would yield insignificant CO2 reductions.

Plus it amounts to trying to get people to do something they clearly don't want to do: live closer to each other.

Even if 75 percent of all new and replacement housing in America were built at twice the density of current new developments, and those living in the newly constructed housing drove 25 percent less as a result, CO2 emissions from personal travel would decline nationwide by only 8 to 11 percent by 2050, according to the study.

I find it curious that a doubling of density would cut miles driven by only 25%. I would expect a doubling of density to cut miles driven in half. Why isn't this the case? Zoning laws that keep housing away from commercial buildings?
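Working backward from the study's numbers shows what its 8 to 11 percent range implicitly assumes. The housing turnover fraction is the unknown solved for here; the 75% and 25% figures come from the study as quoted.

```python
# Back-of-envelope check on the study's 8-11% figure.
# Nationwide driving cut = (share of 2050 housing that is new/replacement)
#                        x (75% of it built at double density)
#                        x (25% less driving per affected household).
dense_share = 0.75   # of new and replacement housing, per the study
driving_cut = 0.25   # per affected household, per the study

for nationwide_cut in (0.08, 0.11):
    implied_turnover = nationwide_cut / (dense_share * driving_cut)
    print(f"{nationwide_cut:.0%} nationwide cut implies ~{implied_turnover:.0%} "
          f"of 2050 housing stock is new or replacement")
```

So the study's range implies that roughly 43 to 59 percent of the 2050 housing stock is newly built or replaced, which is why even a big change in building practice yields only a modest nationwide cut.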

I see a fundamental flaw in attempts to cut oil consumption to cut CO2 emissions: The oil is going to get burned no matter what any one country or group of countries does with their energy policies. The uses of oil and users of oil are so many that an attempt to cut demand in one area will just free up oil to be used elsewhere. Oil production will peak and decline for reasons unrelated to global warming.

It makes more sense to me to focus on shifting from oil to nuclear, solar and wind for electric power generation. It seems more within the realm of the doable to cut global coal demand than to cut global oil demand.

Make car transportation expensive enough and people will live closer to work even without moving into higher density housing. People will effectively swap jobs and houses to live closer to work. Peak Oil will do that more effectively than any government policy.

Currently in the United States 95% of all transportation energy comes from oil and 71% of consumed oil gets used for transportation. Want to decrease the amount of oil burned in trucks, cars, trains, ships, and airplanes? Develop ways to use other energy sources in transportation. Most notably, better and cheaper batteries would allow most commuting to be done under electric power. A build-up of more nuclear power plants along with some wind and solar would then cut emissions from fossil fuel burning.
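A rough sketch shows how much oil electrified commuting could plausibly displace. The 71% figure is from the paragraph above; the light-duty and commuting shares below are my illustrative assumptions, not sourced numbers.

```python
# Rough sketch of how much US oil electrified commuting could displace.
oil_to_transport = 0.71   # share of US oil consumed by transportation (from the post)
light_duty_share = 0.60   # ASSUMED share of transport oil used by cars and light trucks
commute_share = 0.50      # ASSUMED share of light-duty miles that are commuting

displaced = oil_to_transport * light_duty_share * commute_share
print(f"Electrifying commuting could displace ~{displaced:.0%} of total US oil use")
```

Even with these guessed shares the answer comes out around a fifth of total oil use, which is why batteries look like a bigger lever than urban form.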

I do not expect this report will have much impact on the urban enthusiasts who want us all to move into multi-story apartment buildings and ride on subways. They'll eventually get some satisfaction for their dreams when Peak Oil really starts to bite. But that crisis will come on too fast for many to move into cities as a way to adjust. I expect electric cars and electric bicycles to do more, and do it faster, than a big surge in urban construction.

By Randall Parker 2009 September 03 10:15 PM  Climate Policy
Entry Permalink | Comments(13)
2009 September 02 Wednesday
150th Anniversary Of Solar Carrington Event

Named after English astronomer Richard Carrington, the solar eruption (coronal mass ejection) of September 2, 1859 caused such an intense geomagnetic event that telegraph lines operated from currents induced by geomagnetism. Such an event today would melt key transformers in our electric grids and throw us into an unelectrified society for months or years. Massive famine would result.

On Sept. 2, 1859, at the telegraph office at No. 31 State Street in Boston at 9:30 a.m., the operators’ lines were overflowing with current, so they unplugged the batteries connected to their machines, and kept working using just the electricity coursing through the air.

In the wee hours of that night, the most brilliant auroras ever recorded had broken out across the skies of the Earth. People in Havana and Florida reported seeing them. The New York Times ran a 3,000 word feature recording the colorful event in purple prose.

For far less than the cost of a Middle Eastern war, or of an economic stimulus against a recession, we could protect ourselves from the worst of another Carrington event.

See my post Solar Carrington Event Repeat Today Would Collapse Civilization.

Update: Some skeptical readers doubt that we are vulnerable to a coronal mass ejection (CME). A NASA web page about CMEs summarizes a US National Academy of Sciences report about our vulnerabilities to severe space weather events.

According to the report, power grids may be more vulnerable than ever. The problem is interconnectedness. In recent years, utilities have joined grids together to allow long-distance transmission of low-cost power to areas of sudden demand. On a hot summer day in California, for instance, people in Los Angeles might be running their air conditioners on power routed from Oregon. It makes economic sense—but not necessarily geomagnetic sense. Interconnectedness makes the system susceptible to wide-ranging "cascade failures."

To estimate the scale of such a failure, report co-author John Kappenman of the Metatech Corporation looked at the great geomagnetic storm of May 1921, which produced ground currents as much as ten times stronger than the 1989 Quebec storm, and modeled its effect on the modern power grid. He found more than 350 transformers at risk of permanent damage and 130 million people without power. The loss of electricity would ripple across the social infrastructure with "water distribution affected within several hours; perishable foods and medications lost in 12-24 hours; loss of heating/air conditioning, sewage disposal, phone service, fuel re-supply and so on."

"The concept of interdependency," the report notes, "is evident in the unavailability of water due to long-term outage of electric power--and the inability to restart an electric generator without water on site."

You can read this report: Severe Space Weather Events--Understanding Societal and Economic Impacts: A Workshop Report (2008):

Severe space weather has the potential to pose serious threats to the future North American electric power grid. Recently, Metatech Corporation carried out a study under the auspices of the Electromagnetic Pulse Commission and also for the Federal Emergency Management Agency (FEMA) to examine the potential impacts of severe geomagnetic storm events on the U.S. electric power grid. These assessments indicate that severe geomagnetic storms pose a risk for long-term outages to major portions of the North American grid. John Kappenman remarked that the analysis shows “not only the potential for large-scale blackouts but, more troubling, … the potential for permanent damage that could lead to extraordinarily long restoration times.” While a severe storm is a low-frequency-of-occurrence event, it has the potential for long-duration catastrophic impacts to the power grid and its users. Impacts would be felt on interdependent infrastructures, with, for example, potable water distribution affected within several hours; perishable foods and medications lost in about 12-24 hours; and immediate or eventual loss of heating/air conditioning, sewage disposal, phone service, transportation, fuel resupply, and so on. Kappenman stated that the effects on these interdependent infrastructures could persist for multiple years, with a potential for significant societal impacts and with economic costs that could be measurable in the several-trillion-dollars-per-year range.

Electric power grids, a national critical infrastructure, continue to become more vulnerable to disruption from geomagnetic storms. For example, the evolution of open access on the transmission system has fostered the transport of large amounts of energy across the power system in order to maximize the economic benefit of delivering the lowest-cost energy to areas of demand. The magnitude of power transfers has grown, and the risk is that the increased level of transfers, coupled with multiple equipment failures, could worsen the impacts of a storm event.

Kappenman stated that “many of the things that we have done to increase operational efficiency and haul power long distances have inadvertently and unknowingly escalated the risks from geomagnetic storms.” This trend suggests that even more severe impacts can occur in the future from large storms. Kappenman noted that, at the same time, no design codes have been adopted to reduce geomagnetically induced current (GIC) flows in the power grid during a storm. Operational procedures used now by U.S. power grid operators have been developed largely from experiences with recent storms, including the March 1989 event. These procedures are generally designed to boost operational reserves and do not prevent or reduce GIC flows in the network. For large storms (or increasing dB/dt levels) both observations and simulations indicate that as the intensity of the disturbance increases, the relative levels of GICs and related power system impacts will also increase proportionately. Under these scenarios, the scale and speed of problems that could occur on exposed power grids have the potential to impact power system operators in ways they have not previously experienced. Therefore, as storm environments reach higher intensity levels, it becomes more likely that these events will precipitate widespread blackouts in exposed power grid infrastructures.
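The "proportionate" scaling Kappenman describes follows from basic electromagnetism: the geoelectric field that drives GICs scales with the rate of change of the magnetic field, and the quasi-DC current in a grounded network is, to first order, linear in that field. A minimal sketch (the network coefficients a and b are illustrative placeholders, not values from the report):

```latex
% Faraday's law: a time-varying geomagnetic field B induces a geoelectric field E
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}
% So the surface field scales with dB/dt, and the geomagnetically induced
% current at a grounded transformer is (to first order) linear in the
% horizontal field components:
\mathrm{GIC} = a\,E_x + b\,E_y
```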

So we really are extremely vulnerable to a CME, and a 1921-style CME - or even worse, an 1859-style CME - would cause months-long blackouts in some areas.

By Randall Parker 2009 September 02 07:13 PM  Dangers Natural General
Entry Permalink | Comments(12)
2009 September 01 Tuesday
More Sensitive Cancer Breathalyzer Developed

Higher levels of volatile organic compounds in the breath serve as markers for lung cancer.

Lung cancer is a brutal disease, often not caught until it's too late for treatment to do much good. Now researchers are building an electronic nose that could help physicians detect the disease during its initial stages. Using gold nanoparticles, scientists at the Israel Institute of Technology in Haifa have created sensors with an unprecedented sensitivity for sniffing out compounds present in the breath of lung-cancer patients.

Many of these highly sensitive disease-detection assay devices will end up portable for use at home or work. People will test themselves every day or week as they are inclined. For some tests there won't even be a need for a person to initiate testing. One's bed stand or sink or toilet will have sensors built in that constantly scan for telltale signs.

What I want: something that tells me when my sleep deficit is getting too big. Though such a device would probably have the effect of prodding me to go to sleep rather than write that last blog post of the day.

By Randall Parker 2009 September 01 10:42 PM  Biotech Assay Tools
Entry Permalink | Comments(1)
Genes Found Unique To Humans

Three genes have been identified that appear unique to humans, and we might have a total of 18 genes unique to us.

In this work, David Knowles and Aoife McLysaght of the Smurfit Institute of Genetics at Trinity College Dublin undertook the painstaking task of finding protein-coding genes in the human genome that are absent from the chimp genome. Once they had performed a rigorous search and systematically ruled out false results, their list of candidate genes was trimmed down to just three. Then came the next challenge. "We needed to demonstrate that the DNA in human is really active as a gene," said McLysaght.

The authors gathered evidence from other studies that these three genes are actively transcribed and translated into proteins, but furthermore, they needed to show that the corresponding DNA sequences in other primates are inactive. They found that these DNA sequences in several species of apes and monkeys contained differences that would likely disable a protein-coding gene, suggesting that these genes were inactive in the ancestral primate.

The authors also note that because of the strict set of filters employed, only about 20% of human genes were amenable to analysis. Therefore they estimate there may be approximately 18 human-specific genes that have arisen from non-coding DNA during human evolution.

One wonders whether these regions became active with an initial benefit, or whether the functional value of the translated regions came later.

These results may seem far-fetched. But a mutation that would cause a previously unused part of the genome to start getting translated into protein might happen in an area which originally came from a viral infection. A portion of viral DNA might have been incorporated into the genome. Also, many more inactive areas of the genome have gotten mutated into activity that caused no benefit or even harm. These few regions that went on to become useful genes arose out of a background of a much larger number of mutations that didn't produce anything useful.

Once we start genetically engineering ourselves the initial changes will involve adding sequences that some people already have. For example, women will want to genetically engineer their melanocytes to produce red or blond hair. Also, some men will get genetic engineering for more muscle. These will involve existing genetic sequences already present in the human population.

When things will get really interesting: A much deeper understanding of the functional purposes of genes from other species will turn up many features we do not have that some people will want for themselves or their offspring. When the first genetic transplant from another species into humans is done, what will it be done for?

By Randall Parker 2009 September 01 10:23 PM  Trends, Human Evolution
Entry Permalink | Comments(15)
Emotional Music For Tamarin Monkeys

Mozart doesn't speak to them. Beethoven didn't have them in mind. Bach was writing to a more ascended audience. Go here and listen to music composed for tamarin monkeys.

What I wonder: would another species on some other planet which has achieved industrial civilization (assuming such a species exists) find our greatest compositions appealing? Would they find Elvira Madigan or the Emperor Concerto or perhaps Capriccio Espagnol enjoyable or uplifting?

By Randall Parker 2009 September 01 09:47 PM  Brain Innate
Entry Permalink | Comments(1)
Optimal DHA Daily Dosage?

Can blood and urine tests point toward the ideal daily dosage of the omega 3 fatty acid DHA?

A team of French scientists have found the dose of DHA (docosahexaenoic acid) that is "just right" for preventing cardiovascular disease in healthy men. In a research report appearing in the September 2009 print issue of The FASEB Journal (http://www.fasebj.org), the scientists show that a 200 mg dose of DHA per day is enough to affect biochemical markers that reliably predict cardiovascular problems, such as those related to aging, atherosclerosis, and diabetes. This study is the first to identify how much DHA is necessary to promote optimal heart health.

"This study shows that regularly consuming small amounts of DHA is likely to improve the health status of people, especially in regards to cardiovascular function," said Michel Lagarde, co-author of the study.

Now, the press release above is not clear. Do they see the optimal response at 200 mg of DHA per day? Does the response stay the same, get worse, or get better above 200 mg? Well, okay, so let's move on to the abstract.

Twelve healthy male volunteers (aged 53–65 yr) were assigned to consume an intake of successively 200, 400, 800, and 1600 mg/d DHA, as the only omega-3 fatty acid, for 2 wk each dose. Blood and urine samples were collected before and after each dose of DHA and at 8 wk after arrest of supplementation. DHA was incorporated in a dose-response fashion in platelet phospholipids. After supplementation with 400 and 800 mg/d DHA, platelet reactivity was significantly decreased. Platelet vitamin E concentration increased only after 200 mg/d DHA, while p38 MAP kinase phosphorylation decreased. Urinary isoprostane was also significantly lowered after 200 mg/d DHA but was increased after 1600 mg/d. Therefore, supplementation with only 200 mg/d DHA for 2 wk induced an antioxidant effect.

Okay, that's still not perfectly clear. I can imagine why vitamin E would drop at higher DHA levels: as more DHA gets oxidized, vitamin E gets depleted reducing it. What happened with isoprostane between 200 and 1600 mg? What is urinary isoprostane? It turns out isoprostanes are an indication of free radicals reacting with fats. Higher isoprostanes probably mean more bad stuff going on.

Isoprostanes are prostaglandin-like compounds produced primarily from esterified arachidonic acid in tissues by non-enzymatic reactions catalysed by free radicals in vivo.

Higher circulating concentrations of F2-isoprostanes are associated with coronary artery calcification (CAC) and CAC is definitely a bad thing.
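Laying the abstract's findings out as data makes the dose-response pattern easier to see. The table below just re-encodes what the abstract reports, with my own labels; it is not additional data from the paper.

```python
# Markers reported in the FASEB abstract, by DHA dose (mg/day).
# "down" is favorable for platelet reactivity, p38 phosphorylation, and
# urinary isoprostane; entries are qualitative, straight from the abstract.
effects = {
    200:  {"platelet_reactivity": "unchanged", "vitamin_E": "up",
           "p38_phosphorylation": "down", "isoprostane": "down"},
    400:  {"platelet_reactivity": "down"},
    800:  {"platelet_reactivity": "down"},
    1600: {"isoprostane": "up"},   # oxidative marker worsens at the top dose
}

# Only one dose shows the antioxidant pattern (isoprostane down):
antioxidant_doses = [d for d, e in effects.items() if e.get("isoprostane") == "down"]
print(antioxidant_doses)  # → [200]
```

That lone 200 mg/d entry, with isoprostane rising again at 1600 mg/d, is what makes the low dose look like the antioxidant sweet spot.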

So maybe there's an ideal dose level of DHA and maybe it is in the hundreds of milligrams per day. I emphasize the maybe.

By Randall Parker 2009 September 01 09:00 PM  Aging Diet Heart Studies
Entry Permalink | Comments(1)