Your friends getting along too well? Feeling like the group is stuck in a rut of conformity? Testosterone could give you the edge you need to break away and strike out on your own. Testosterone makes us place more value on our own opinions. Whether that is a good or bad thing depends on the quality of your opinions versus the opinions of those around you.
Testosterone makes us overvalue our own opinions at the expense of cooperation, research from the Wellcome Trust Centre for Neuroimaging at UCL (University College London) has found. The findings may have implications for how group decisions are affected by dominant individuals.
Problem solving in groups can provide benefits over individual decisions as we are able to share our information and expertise. However, there is a tension between cooperation and self-oriented behaviour: whilst groups may benefit from a collective intelligence, collaborating too closely can easily lead to an uncritical groupthink ending in decisions that are bad for all.
Attempts to understand the biological mechanisms behind group decision making have tended to focus on the factors that promote cooperation. Research has shown that people given a boost of the hormone oxytocin tend to be cooperative. Now, in a study published today in the journal Proceedings of the Royal Society B, researchers have shown that the hormone testosterone has the opposite effect – in other words, it makes people act less cooperatively and more egocentrically.
But will a lower willingness to cooperate always result in worse outcomes? Those with less need to go along with a group consensus have greater latitude to innovate in areas where the conventional wisdom is blocking development of different and much better approaches to a problem.
Women can be made less cooperative with testosterone. One wonders whether the marginal impact on males would be as great.
Dr Nick Wright and colleagues at the Wellcome Trust Centre for Neuroimaging at UCL carried out a series of tests using seventeen pairs of female volunteers who had previously never met. The test took place over two days, spaced a week apart. On one of the days, both volunteers in each pair were given a testosterone supplement; on the other day, they were given a placebo.
During the experiment, both women sat in the same room and viewed their own screen. Both individuals saw exactly the same thing. First, in each trial they were shown two images, one of which contained a high contrast target – and their job was to decide individually which image contained the target. If their individual choices agreed, they received feedback and moved on to the next trial. However, if they disagreed then they were asked to collaborate and discuss with their partner to reach a joint decision. One of the pair then input this joint decision.
I wonder whether it makes sense to have more and less cooperative states of mind at different times of the day and week in order to get combined benefits of both cooperation and independent thinking.
It is a mistake to allow import of animals that have the potential to wipe out lots of species. Pythons are making major inroads in south Florida.
When researchers struck out to count animals along a main road that runs to the southernmost tip of the park, more than 99 percent of raccoons were gone, along with nearly the same percentage of opossums and about 88 percent of bobcats. Marsh and cottontail rabbits, as well as foxes, could not be found.
Look at the Asian carp spreading up and down the Mississippi and tributary rivers. That's an even bigger mistake in my view.
I am curious to know whether in a few decades cheap and very cheap electronic monitoring systems will make it possible to wipe out some of the invasive species. If we can just watch a large area at a fine enough level of granularity then it might become possible to track down every member of an invading species and wipe it out even if it is well concealed most of the time.
A new University of Colorado Boulder-led study appears to answer contentious questions about the onset and cause of Earth's Little Ice Age, a period of cooling temperatures that began after the Middle Ages and lasted into the late 19th century.
According to the new study, the Little Ice Age began abruptly between A.D. 1275 and 1300, triggered by repeated, explosive volcanism and sustained by a self-perpetuating sea ice-ocean feedback system in the North Atlantic Ocean, said CU-Boulder Professor Gifford Miller, who led the study. The primary evidence comes from radiocarbon dates on dead vegetation emerging from rapidly melting icecaps on Baffin Island in the Canadian Arctic, combined with ice and sediment core data from the poles and Iceland and from sea ice climate model simulations, said Miller.
Low sun spot activity to blame? Nope. The big cooling was all about cooling aerosols ejected by volcanoes.
Most scientists think the Little Ice Age was caused either by decreased summer solar radiation, erupting volcanoes that cooled the planet by ejecting shiny aerosol particles that reflected sunlight back into space, or a combination of both, said Miller.
The new study suggests that the onset of the Little Ice Age was caused by an unusual, 50-year-long episode of four massive tropical volcanic eruptions. Climate models used in the new study showed that the persistence of cold summers following the eruptions is best explained by a sea ice-ocean feedback system originating in the North Atlantic Ocean.
"This is the first time anyone has clearly identified the specific onset of the cold times marking the start of the Little Ice Age," said Miller. "We also have provided an understandable climate feedback system that explains how this cold period could be sustained for a long period of time. If the climate system is hit again and again by cold conditions over a relatively short period -- in this case, from volcanic eruptions -- there appears to be a cumulative cooling effect."
When will nature throw another really big curve ball at us?
Two scientists, including David King, formerly chief scientific advisor to the British government, have come out with a paper in Nature arguing that we have a greater need to reduce fossil fuel use due to oil shortages than due to global warming. They see 2005 as the end of oil supply growth.
Stop wrangling over global warming and instead reduce fossil-fuel use for the sake of the global economy.
That's the message from two scientists, one from the University of Washington and one from the University of Oxford in the United Kingdom, who say in the current issue of the journal Nature (Jan. 26) that the economic pain of a flattening oil supply will trump the environment as a reason to curb the use of fossil fuels.
"Given our fossil-fuel dependent economies, this is more urgent and has a shorter time frame than global climate change," says James W. Murray, UW professor of oceanography, who wrote the Nature commentary with David King, director of Oxford's Smith School of Enterprise and the Environment.
The "tipping point" for oil supply appears to have occurred around 2005, says Murray, who compared world crude oil production with world prices going back to 1998. Before 2005, supply of regular crude oil was elastic and increased in response to price increases. Since then, production appears to have hit a wall at 75 million barrels per day in spite of price increases of 15 percent each year.
"As a result, prices swing wildly in response to small changes in demand," the co-authors wrote. "Others have remarked on this step change in the economies of oil around the year 2005, but the point needs to be lodged more firmly in the minds of policy makers."
I like it when prominent scientists come around to a point of view I've held for years. But my satisfaction is rather short-lived because high level recognition of the Peak Oil problem is coming too late. The coming decline in oil production is going to cause economic contraction and declining living standards until the development of new technologies makes a migration away from oil more practical.
Production at existing fields is declining sharply. So new fields have to be found and brought online. But the rate of new field discovery is too slow.
For those who argue that oil reserves have been increasing, that more crude oil will be available in the future, the co-authors wrote: "The true volume of global proved reserves is clouded by secrecy; forecasts by state oil companies are not audited and appear to be exaggerated. More importantly, reserves often take 6 - 10 years to drill and develop before they become part of the supply, by which time older fields have become depleted." Production at oil fields around the world is declining between 4.5 percent and 6.7 percent per year, they wrote.
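Some back-of-envelope arithmetic makes the replacement problem concrete. This sketch uses only the figures quoted above (75 million barrels per day of output, and the midpoint of the 4.5 to 6.7 percent decline range); the exact numbers are illustrative, not a forecast:

```python
# Rough arithmetic on the replacement problem: if existing fields decline
# roughly 5% per year, how much new capacity must come online each year
# just to hold world output flat?
production_mbd = 75.0   # world crude output, million barrels/day
decline_rate = 0.05     # midpoint of the 4.5-6.7% annual decline range

# New supply needed each year just to stand still: about 3.75 mbd.
lost_per_year = production_mbd * decline_rate

# Compounding the decline over a decade with no new fields at all
# would cut output to roughly 44.9 mbd.
after_10_years = production_mbd * (1 - decline_rate) ** 10
```

With fields taking 6 to 10 years from discovery to production, most of the capacity needed to offset the next decade of decline would have to be in the development pipeline already.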
The result is a several-year period of high oil prices (with dips during economic downturns). The high prices put a ceiling on economic growth.
Also see Scientific American coverage of this story.
An Australian government agency thinks oil supplies might peak in 2017. The Australian government tried to hide this report. For Western countries peak oil availability has already happened. Oil consumption is growing rapidly in oil exporters. So their exports have stopped growing and for many of them exports have already dropped from peak. At the same time, growing Asian demand reduces the affordability of the remaining oil for the Western countries.
See my November 2010 post Dr. James Schlesinger: The Peak Oil Debate Is Over. Schlesinger was the first US Secretary of Energy and he's also served as Defense Secretary and head of the CIA, all back in the 1970s.
The rate of advance of biomedical research is a matter of life and death for all of us. Therefore we should be concerned that biomedical research funding is under the control of older scientists for whom younger scientists spend much of their careers working.
A new study by Rice University's Baker Institute for Public Policy illustrates a disconnect between government funding of biomedical research by young investigators and a novel standard by which to judge it: the Nobel Prize.
The study found the average age of biomedical researchers getting their first grant from the National Institutes of Health (NIH) in 2008 was 42. Over the past 30 years, the average age of Nobel winners when they performed their groundbreaking research was 41.
So the younger scientists don't start getting their own direct funding until they are past their peak years of research productivity. That's dumb. They spend their younger years as grad students and (poorly paid) post docs. This puts their research directions much more under the control of (older) professors who run labs and have grants flowing to them. The young turk with a hunch does not always have the freedom to follow that hunch. Not good.
Labs can present much more credible research proposals if they've already done part of the work. Established labs with existing funding and a body of work come across as lower risk when they submit proposals. What's needed: set aside a substantial portion of research funding for younger scientists. A bucket of money for under 35s would help matters.
That should trouble those concerned about the United States' standing as a biomedical powerhouse, said Kirstin Matthews, first author of a paper published in the open-access journal PLoS ONE by the nonprofit Public Library of Science. As older scientists retire over the next two decades, the nation needs to support the next generation of researchers or risk losing them to more sustainable careers, wrote Matthews and co-authors Vivian Ho, the James A. Baker III Institute Chair in Health Economics and a professor of economics; recent alumna Kara Calhoun and senior Nathan Lo, all of Rice.
"This is a bit controversial at a time when we're encouraging more people to get into science," said Matthews, a fellow in science and technology policy at the Baker Institute. "The gist is that we're dealing with a shrinking NIH budget; at best, we'll get the same budget year to year, adjusted for inflation. So how can we use the money most effectively?"
The trend is toward a higher age at first funding. This trend is probably reducing the amount of highly innovative research.
In the past 30 years, the average age of biomedical researchers has steadily increased. The average age of an investigator at the National Institutes of Health (NIH) rose from 39 to 51 between 1980 and 2008. The aging of the biomedical workforce was even more apparent when looking at first-time NIH grantees. The average age of a new investigator was 42 in 2008, compared to 36 in 1980. To determine if the rising barriers at NIH for entry in biomedical research might impact innovative ideas and research, we analyzed the research and publications of Nobel Prize winners from 1980 to 2010 to assess the age at which their pioneering research occurred. We established that in the 30-year period, 96 scientists won the Nobel Prize in medicine or chemistry for work related to biomedicine, and that their groundbreaking research was conducted at an average age of 41—one year younger than the average age of a new investigator at NIH. Furthermore, 78% of the Nobel Prize winners conducted their research before the age of 51, the average age of an NIH principal investigator. This suggested that limited access to NIH might inhibit research potential and novel projects, and could impact biomedicine and the next generation scientists in the United States.
Again, the rate of advance of biomedical research is a matter of life and death for us all. We need the funding to go to people most likely to make big breakthroughs. Those people are younger than the people who currently get the research bucks.
Does depression help people do art? Does autism help people do science? Does the normal mind without enough of either come up short in creativity? Not sure. But relatives of those with depression have different intellectual interests than relatives of those who have autism.
A hallmark of the individual is the cultivation of personal interests, but for some people, their intellectual pursuits might actually be genetically predetermined. Survey results published by Princeton University researchers in the journal PLoS ONE suggest that a family history of psychiatric conditions such as autism and depression could influence the subjects a person finds engaging.
Although preliminary, the findings provide a new look at the oft-studied link between psychiatric conditions and aptitude in the arts or sciences. While previous studies have explored this link by focusing on highly creative individuals or a person's occupation, the Princeton research indicates that the influence of familial neuropsychiatric traits on personal interests is apparently independent of a person's talent or career path, and could help form a person's basic preferences and personality.
Princeton researchers surveyed nearly 1,100 students from the University's Class of 2014 early in their freshman year to learn which major they would choose based on their intellectual interests. The students were then asked to indicate the incidence of mood disorders, substance abuse or autism spectrum disorder (ASD) in their family, including parents, siblings and grandparents.
Students interested in pursuing a major in the humanities or social sciences were twice as likely to report that a family member had a mood disorder or a problem with substance abuse. Students with an interest in science and technical majors, on the other hand, were three times more likely to report a sibling with an ASD, a range of developmental disorders that includes autism and Asperger syndrome.
I suspect moderate doses of genes for autism make the mind much more capable of handling the rigor required to do math and science. The "normal" human mind is not as well suited to understand the world scientifically.
What I wonder: does maladaptive autism exist further out on a spectrum from adaptive autistic traits? Or is maladaptive autism caused by other genetic variants beyond those that cause more adaptive forms of autism?
Here's the abstract and full paper.
From personality to neuropsychiatric disorders, individual differences in brain function are known to have a strong heritable component. Here we report that between close relatives, a variety of neuropsychiatric disorders covary strongly with intellectual interests. We surveyed an entire class of high-functioning young adults at an elite university for prospective major, familial incidence of neuropsychiatric disorders, and demographic and attitudinal questions. Students aspiring to technical majors (science/mathematics/engineering) were more likely than other students to report a sibling with an autism spectrum disorder (p = 0.037). Conversely, students interested in the humanities were more likely to report a family member with major depressive disorder (p = 8.8×10−4), bipolar disorder (p = 0.027), or substance abuse problems (p = 1.9×10−6). A combined PREdisposition for Subject MattEr (PRESUME) score based on these disorders was strongly predictive of subject matter interests (p = 9.6×10−8). Our results suggest that shared genetic (and perhaps environmental) factors may both predispose for heritable neuropsychiatric disorders and influence the development of intellectual interests.
Since genetic sequencing costs have crashed over the last decade and especially in the last few years we are about to get the flood of DNA sequencing data needed to identify large numbers of genetic variants that contribute to cognitive traits. The picture will be much, much clearer in just 4 or 5 years. I'm awaiting the discoveries with great interest.
Before car computers totally take over the task of driving they will continue to gain more capabilities for accident avoidance. The latest: use of wireless technology so that car computers can know distances and velocity and collision risk of nearby cars.
Vehicle-to-vehicle communication—known as "V2V" in the industry—is eagerly anticipated because it could help reduce crashes. The Wi-Fi signals, which go out in all directions, would act like an alert passenger, warning the driver that another car is about to run a red light or that there's a motorcycle in the blind spot. U.S. government researchers estimate that V2V would let drivers avoid, or reduce the severity of, around 80 percent of collisions.
Just like anti-lock brakes (ABS) and other automotive risk-reduction technologies, anything that works well enough will eventually be mandated by regulatory agencies. So perhaps in 10 or 15 years all cars will come with Wi-Fi transmitters and receivers that work to detect dangerous approaches between cars.
One can imagine street lights continuously broadcasting not just their current settings but also when they will change. This will enable better calculations of collision risks.
I expect another benefit: cars approaching a street light will signal to ask for a green. If the street light doesn't get similar signals from the cross street it will change to give the green to the approaching car. Newer cars would then gain an advantage in getting through street lights.
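The core collision-risk calculation a V2V receiver could run on broadcast position and velocity data is simple. Here is a minimal sketch of one common approach, time-to-collision; the names, data layout, and 3-second alert threshold are my own illustrative assumptions, not any actual V2V standard:

```python
# Hypothetical sketch of collision-risk detection from broadcast V2V data.
# VehicleState, warn(), and the threshold are illustrative, not a real spec.
from dataclasses import dataclass

@dataclass
class VehicleState:
    x: float   # position along the road, meters
    v: float   # velocity, meters/second (positive = forward)

def time_to_collision(lead: VehicleState, follower: VehicleState) -> float:
    """Seconds until the follower closes the gap to the lead vehicle.

    Returns infinity if the gap is not closing on the current course.
    """
    gap = lead.x - follower.x
    closing_speed = follower.v - lead.v
    if closing_speed <= 0:
        return float("inf")   # follower is not gaining; no collision risk
    return gap / closing_speed

def warn(lead: VehicleState, follower: VehicleState,
         threshold_s: float = 3.0) -> bool:
    """Alert the driver when time-to-collision drops below the threshold."""
    return time_to_collision(lead, follower) < threshold_s

# A car 40 m ahead doing 20 m/s, tailed by one doing 30 m/s:
# the 10 m/s closing speed gives a 4-second time-to-collision.
```

A real system would also have to fuse bearing, lane geometry, and acceleration, but the appeal of V2V is that all of those inputs arrive by radio rather than line-of-sight sensors.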
Rising atmospheric carbon dioxide might cause so much heating that some countries will respond by releasing sulfate aerosols to reflect sunlight and cool the planet. If that happens the question arises: Will the net effect of less sunlight (which would reduce energy flowing to plants) be outweighed by the plant-boosting effects of lower temperatures and higher CO2? Some climate scientists did modeling that suggests globally crop yields would rise overall but fall in some areas.
Although scientists know that climate change in recent decades has negatively impacted crop yields in many regions, the study by Pongratz and colleagues is the first to examine the potential effect of geoengineering on food security. Pongratz's team, which included Carnegie's Ken Caldeira and Long Cao, as well as Stanford University's David Lobell, used models to assess the impact of sunshade geoengineering on crop yields.
Using two different climate models, they simulated climates with carbon dioxide levels similar to what exists today. A second set of simulations doubled carbon-dioxide levels – levels that could be reached in several decades if current trends in fossil-fuel burning continue unabated. A third set of simulations posited doubled carbon dioxide, but with a layer of sulfate aerosols in the stratosphere deflecting about 2% of incoming sunlight away from the Earth. The simulated climate changes were then applied to crop models that are commonly used to project future yields.
The team found that, in the model, sunshade geoengineering leads to increased crop yields in most regions, both compared with current conditions and with the future projection of doubled carbon dioxide on its own. This is because deflecting sunlight back to space reduces temperatures, but not CO2. "In many regions, future climate change is predicted to put crops under temperature stress, reducing yields. This stress is alleviated by geoengineering," Pongratz said. "At the same time, the beneficial effects that a higher CO2 concentration has on plant productivity remain active."
Of course this work was done with a model that is much simpler than the real planet. Future improvements in model quality could yield different results. Still, an interesting result.
The researchers point out that sulfate aerosols would not reverse the ocean acidification caused by more CO2 dissolving into the oceans. Speaking of ocean acidification, rising atmospheric CO2 will increase the acidification of the oceans so much that the coral reefs would drastically shrink in extent.
"In some regions, the man-made rate of change in ocean acidity since the Industrial Revolution is hundred times greater than the natural rate of change between the Last Glacial Maximum and pre-industrial times," emphasizes Friedrich. "When Earth started to warm 17,000 years ago, terminating the last glacial period, atmospheric CO2 levels rose from 190 parts per million (ppm) to 280 ppm over 6,000 years. Marine ecosystems had ample time to adjust. Now, for a similar rise in CO2 concentration to the present level of 392 ppm, the adjustment time is reduced to only 100 – 200 years."
On a global scale, coral reefs are currently found in places where open-ocean aragonite saturation reaches levels of 3.5 or higher. Such conditions exist today in about 50% of the ocean – mostly in the tropics. By end of the 21st century this fraction is projected to be less than 5%. The Hawaiian Islands, which sit just on the northern edge of the tropics, will be one of the first to feel the impact.
If necessary we could use a variety of climate engineering techniques to prevent the melting of Antarctica and Greenland. But I've yet to come across a proposal for how to prevent a shift in ocean acidity as atmospheric CO2 continues to rise.
Why not a higher autism correlation between twins? Birth weight is a key factor in determining autism risk.
EVANSTON --- Although the genetic basis of autism is now well established, a growing body of research also suggests that environmental factors may play a role in this serious developmental disorder affecting nearly one in 100 children. Using a unique study design, a new study suggests that low birth weight is an important environmental factor contributing to the risk of autism spectrum disorder (ASD).
“Our study of discordant twins -- twin pairs in which only one twin was affected by ASD -- found birth weight to be a very strong predictor of autism spectrum disorder,” said Northwestern University researcher Molly Losh. Losh, who teaches and conducts research in Northwestern’s School of Communication, is lead author of the study that will be published in the journal “Psychological Medicine” and is now available online.
But let me quibble about terminology: For millions of people on the autistic spectrum their autistic minds are not a disorder. Many are quite high functioning and happy to be on the autistic spectrum with the intellectual advantages they gain from their modes of thought. I think the problem is that too much cognitive variation has gotten subsumed under the autism label.
Still, this finding on birth weight is important.
The researchers found that lower birth weight more than tripled the risk for autism spectrum disorder in identical twin pairs in which one twin had ASD and the other did not.
My guess (and I expect this to become crystal clear in under 10 years due to cheap DNA sequencing technology) is that many genes that cause some aspects of autism were selected for. There was an adaptive advantage to a sort of specialization of cognitive labor. Genetic influences on a phenomenon that occurs at a fairly high rate of incidence usually point to something that was selected for rather than developmental error or relatively rarer purely harmful genetic mutations. Highly maladaptive traits rarely get selected for.
There's some overlap between autism as a result of genes selected due to their reproductive advantages versus events that go wrong in brain development. That overlap has made clear thinking about autism harder to do. Given that clear thinking about the mind is hard to do in general that's not too surprising. But on the bright side, advances in psychometrics, neuroscience, and genetics are going to usher in a new age of understanding of the human mind and along with it will come a much more nuanced view of autism.
Researchers funded by the Biotechnology and Biological Sciences Research Council (BBSRC) have found that vitamin D reduces the effects of ageing in mouse eyes and improves the vision of older mice significantly. The researchers hope that this might mean that vitamin D supplements could provide a simple and effective way to combat age-related eye diseases, such as macular degeneration (AMD), in people.
The research was carried out by a team from the Institute of Ophthalmology at University College London and is published in the current issue of the journal Neurobiology of Ageing.
The retina's cells are very heavy energy users, heavier energy users than any other cell type in the body. I did not know that.
Professor Glen Jeffery, who led the work, explains "In the back of the eyes of mammals, like mice and humans, is a layer of tissue called the retina. Cells in the retina detect light as it comes into the eyes and then send messages to the brain, which is how we see. This is a demanding job, and the retina actually requires proportionally more energy than any other tissue in the body, so it has to have a good supply of blood. However, with ageing the high energy demand produces debris and there is progressive inflammation even in normal animals. In humans this can result in a decline of up to 30% in the numbers of light receptive cells in the eye by the time we are 70 and so lead to poorer vision."
We need cell therapies to replace tired eye blood vessels, rod cells, cone cells, and other retinal cells. The inflammation and decline in number of light receptive cells with age could be reversed and some day will be reversed. Faster please, as Glenn Reynolds likes to say.
The vitamin D reduced both inflammation and amyloid beta.
The researchers found that when old mice were given vitamin D for just six weeks, inflammation was reduced, the debris partially removed, and tests showed that their vision was improved.
The researchers identified two changes taking place in the eyes of the mice that they think accounted for this improvement. Firstly, the number of potentially damaging cells, called macrophages, were reduced considerably in the eyes of the mice given vitamin D. Macrophages are an important component of our immune systems where they work to fight off infections. However in combating threats to the aged body they can sometimes bring about damage and inflammation. Giving mice vitamin D not only led to reduced numbers of macrophages in the eye, but also triggered the remaining macrophages to change to a different configuration. Rather than damaging the eye the researchers think that in their new configuration macrophages actively worked to reduce inflammation and clear up debris.
The second change the researchers saw in the eyes of mice given vitamin D was a reduction in deposits of a toxic molecule called amyloid beta that accumulates with age. Inflammation and the accumulation of amyloid beta are known to contribute, in humans, to an increased risk of age-related macular degeneration (AMD), the largest cause of blindness in people over 50 in the developed world. The researchers think that, based on their findings in mice, giving vitamin D supplements to people who are at risk of AMD might be a simple way of helping to prevent the disease.
This result is consistent with earlier research on humans that found that higher blood vitamin D levels are associated with a lower risk of age-related macular degeneration. Also, higher vitamin D is associated with healthier blood vessels.
Some evidence exists that too much vitamin D increases the risk of atrial fibrillation. But the levels associated with healthier blood vessels and lower AMD risk in humans are below those found to increase atrial fibrillation. So this makes me think there's some value in getting one's blood vitamin D level tested and then supplementing as necessary to keep it in the normal (41-80 ng/ml) range.
People who aren't made poorer by recessions spend less on status signaling when others have to cut back on their own status-driven spending. If others can't flash as many status signals then people feel less need to do their own spending to signal higher status.
"Even when their consumption budget is unaffected by a recession, consumers will change their expenditure patterns because some of these expenses depend on social standards that shift with economic conditions," write authors Wagner A. Kamakura (Duke University) and Rex Yuxing Du (University of Houston).
So a recession allows some people to voluntarily take a breather from status spending.
The authors analyzed U.S. household expenditure data for more than two decades, using a model that allowed them to separate budget and positionality effects. "As one would expect, we find that the share of consumption budget devoted to nonessentials (apparel, jewelry and watches, recreation, traveling) drops, while shares devoted to essentials (food at home, housing, utilities) increase during a recession due to the budget effect," the authors write.
Wealthy consumers don't necessarily spend less out of empathy for those who are less well off. Instead, they perceive a reduction in others' expenditures on positional goods and services and feel they don't need to spend as much to maintain the same status relative to their peers, the authors explain.
During hard times, visible luxuries are hit twice, because people generally have less to spend and those who can consume feel less compelled to show off. "Keeping up with the Joneses is less onerous when they are not keeping up," the authors conclude.
People with a lower instinctive (or even practical) need to flash status symbols are relatively more free to live the way they want to live. I am reminded of a piece that Marty Cortland wrote about 4 years ago on how he had to buy a Lexus because his wife thought they weren't rich enough to drive around in a mere Buick. The tyranny of upper middle classness: you've got to flash status symbols because you aren't known as a billionaire.
“I’m not driving a Buick,” she declared. “There is no way I’m showing up at playgroup at Brook Hollow in a Buick.”
“But what about Ross Perot?” I argued. “He drives a Crown Vic.”
“Ross Perot is a billionaire,” she shrieked. “He can afford to drive anything he wants!”
I bet there are genetic variants that cause different levels of desire to have status. A combination of a low desire for status and a high desire for savings would seem the best combination for a lower stress life. Though one might still feel stress about not having saved enough.
Polysilicon crystal is an input into making silicon-based photovoltaics. After the price peaked at over $400 per kilogram in 2008 due to rapidly rising demand, big capital investments in polysilicon manufacturing plants led to a glut. Polysilicon crystal has since fallen in price by an order of magnitude. The good news: according to that link the manufacturing cost (at least for lower cost producers) is still below the market price. So the current lower market price is sustainable and will lead to lower silicon PV prices as new contracts for polysilicon are negotiated.
However, oversupply in the polysilicon market pushed the spot price of silicon down from $80 per kilogram in late March 2011 to under $30 per kilogram in December, representing more than a 60 percent drop.
For the future of silicon PV what we need to know: how much further can polysilicon manufacturing costs drop? Is energy the biggest cost in that process? Are we near the floor for long term polysilicon prices? Can silicon PV continue to drop in cost as fast as thin films?
As I've written previously, the manufacturing and installation cost trends are what we should watch when it comes to the future of renewables. Market prices can be going up and down independent of manufacturing costs.
Update: In the comments Ronald Brakels points to a report on research to lower the energy cost of making polysilicon crystals. This has the potential to raise the energy return on energy invested (EROEI) of silicon PV.
At least among the overweight, eating carbs that digest slowly lowers markers of inflammation. Choose your carbohydrates for a lower glycemic index.
SEATTLE – Among overweight and obese adults, a diet rich in slowly digested carbohydrates, such as whole grains, legumes and other high-fiber foods, significantly reduces markers of inflammation associated with chronic disease, according to a new study by Fred Hutchinson Cancer Research Center. Such a "low-glycemic-load" diet, which does not cause blood-glucose levels to spike, also increases a hormone that helps regulate the metabolism of fat and sugar. These findings are published online ahead of the February print issue of the Journal of Nutrition.
The controlled, randomized feeding study, which involved 80 healthy Seattle-area men and women – half of normal weight and half overweight or obese – found that among overweight and obese study participants, a low-glycemic-load diet reduced a biomarker of inflammation called C-reactive protein by about 22 percent.
"This finding is important and clinically useful since C-reactive protein is associated with an increased risk for many cancers as well as cardiovascular disease," said lead author Marian Neuhouser, Ph.D., R.D., a member of the Cancer Prevention Program in the Public Health Sciences Division at the Hutchinson Center. "Lowering inflammatory factors is important for reducing a broad range of health risks. Showing that a low-glycemic-load diet can improve health is important for the millions of Americans who are overweight or obese."
The diets had the same amounts of total carbohydrates.
Study participants completed two 28-day feeding periods in random order – one featuring high-glycemic-load carbohydrates, which typically are low-fiber, highly processed carbs such as white sugar, fruit in canned syrup and white flour; and the other featuring low-glycemic-load carbohydrates, which are typically higher in fiber, such as whole-grain breads and cereals. The diets were identical in carbohydrate content, calories and macronutrients. All food was provided by the Hutchinson Center's Human Nutrition Laboratory, and study participants maintained weight and physical activity throughout.
Check out Rick Mendosa's tables on glycemic index and glycemic load. Become familiar with higher glycemic index foods and avoid them.
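The glycemic load numbers in those tables come from a simple formula: glycemic index times the grams of available carbohydrate per serving, divided by 100. Here is a minimal sketch of that calculation; the GI and carb-gram values below are rough illustrative figures I've plugged in, not numbers from Mendosa's tables, so look up the real values before relying on them.

```python
def glycemic_load(glycemic_index, carb_grams):
    """Glycemic load = GI x available carbohydrate (grams) / 100."""
    return glycemic_index * carb_grams / 100.0

# Rough illustrative values (GI, grams of available carbs per serving):
foods = {
    "white rice": (72, 36),   # high GI and plenty of carbs: high GL
    "lentils":    (29, 18),   # low GI: low GL
    "watermelon": (72, 6),    # high GI but little carb per serving: low GL
}
for name, (gi, carbs) in foods.items():
    gl = glycemic_load(gi, carbs)
    # Common rule of thumb: GL >= 20 high, 11-19 medium, <= 10 low.
    label = "high" if gl >= 20 else ("medium" if gl >= 11 else "low")
    print(f"{name}: GL = {gl:.1f} ({label})")
```

The watermelon row shows why glycemic load matters more than glycemic index alone: a high-GI food with few carbs per serving can still be a low-GL food.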
A third of your gasoline goes toward overcoming friction. Over half of that friction might become avoidable within 15 to 25 years.
No less than one third of a car's fuel consumption is spent in overcoming friction, and this friction loss has a direct impact on both fuel consumption and emissions. However, new technology can reduce friction by anything from 10% to 80% in various components of a car, according to a joint study by VTT Technical Research Centre of Finland and Argonne National Laboratory (ANL) in USA. It should thus be possible to reduce car's fuel consumption and emissions by 18% within the next 5 to 10 years and up to 61% within 15 to 25 years.
There are 612 million cars in the world today. The average car clocks up about 13,000 km per year, and in the meantime burns 340 litres of fuel just to overcome friction, costing the driver EUR 510 per year.
Electric cars lose far less of their energy to friction, so they have less to gain from friction reduction. Reduced friction loss will therefore improve the relative position of cars burning liquid hydrocarbon fuels versus electric cars.
Of the energy output of fuel in a car engine, 33% is spent in exhaust, 29% in cooling and 38% in mechanical energy, of which friction losses account for 33% and air resistance for 5%. By comparison, an electric car has only half the friction loss of that of a car with a conventional internal combustion engine.
Annual friction loss in an average car worldwide amounts to 11,860 MJ: of this, 35% is spent in overcoming rolling resistance in the wheels, 35% in the engine itself, 15% in the gearbox and 15% in braking. With current technology, only 21.5% of the energy output of the fuel is used to actually move the car; the rest is wasted.
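A quick back-of-envelope check on the numbers above. Note that the EUR 1.50 per litre fuel price is implied by the article's figures rather than stated directly; everything else comes straight from the quoted breakdown.

```python
# Implied fuel price from the article's figures:
litres_for_friction = 340   # litres burned per year just overcoming friction
cost_eur = 510              # stated annual cost to the driver
price_per_litre = cost_eur / litres_for_friction
print(f"implied fuel price: EUR {price_per_litre:.2f}/litre")  # EUR 1.50/litre

# Worldwide average friction loss per car, split as the study reports:
friction_mj = 11_860
shares = {"rolling resistance": 0.35, "engine": 0.35,
          "gearbox": 0.15, "braking": 0.15}
for component, share in shares.items():
    print(f"{component}: {share * friction_mj:,.0f} MJ/year")
```

The shares sum to 100%, so the component figures partition the 11,860 MJ exactly; rolling resistance and the engine each waste about 4,150 MJ per car per year.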
One thought: on shorter trips it should be possible to avoid running the air conditioner off the engine and losing gasoline power to it. Imagine if, when parked, you could plug in the car and electric power would run a condenser to super-cool some liquid. That frozen material could then provide a source of cooling for, say, an hour or two.
Newer materials can cut friction. But what about the costs?
A recent VTT and ANL study shows that friction in cars can be reduced with new technologies such as new surface coatings, surface textures, lubricant additives, low-viscosity lubricants, ionic liquids and low-friction tyres inflated to pressures higher than normal.
Friction can be reduced by 10% to 50% using new surface technologies such as diamond-like carbon materials and nanocomposites. Laser texturing can be employed to etch a microtopography on the surface of the material to guide the lubricant flow and internal pressures so as to reduce friction by 25% to 50% and fuel consumption by 4%. Ionic liquids are made up of electrically charged molecules that repel one another, enabling a further 25% to 50% reduction in friction.
The payback will come faster in commercial vehicles that travel great distances every year. So are diamond-like carbon materials getting designed into engines or transmissions of any long distance trucks today?
The Norwegian Institute of Public Health says for healthier air light your wood burning heaters from the top, not the bottom. An added benefit: more complete burning means higher efficiency. You get more heat from the wood.
Before anyone states "this is obvious": I've seen wood fires lit from the bottom many times. It is usually seen as easier to light from the bottom. Kindling can light bigger logs above it.
But it makes sense that a fire on top can burn the gases that get released from warming wood underneath them. So try to light the top.
Update: "pond" points to instructions on how to easily light from the top. What's still missing: How to easily load new logs into a burning fire bottom-up?
Every time most cells divide, their telomere chromosome caps get shorter. When the telomere caps get very short, cellular division is inhibited. Cells that can't divide cannot repair damaged tissue. It is not a coincidence that cells around damaged arthritic joints have short telomeres.
Telomeres, the very ends of chromosomes, become shorter as we age. When a cell divides it first duplicates its DNA and, because the DNA replication machinery fails to get all the way to the end, with each successive cell division a little bit more is missed. New research published in BioMed Central's open access journal Arthritis Research & Therapy shows that cells from osteoarthritic knees have abnormally shortened telomeres and that the percentage of cells with ultra short telomeres increases the closer to the damaged region within the joint.
While the shortening of telomeres is an unavoidable side effect of getting older, telomeres can also shorten as a result of sudden cell damage, including oxidative damage. Abnormally short telomeres have been found in some types of cancer, possibly because of the rapid cell division the cells are forced to undergo.
The question: are the short telomeres a result of osteoarthritis? Or are the short telomeres a cause of osteoarthritis? Does the inflammation associated with osteoarthritis accelerate cell division and thereby cause short telomeres? Or do joints wear down and become osteoarthritic once few cells remain that can do repairs on them?
A Danish team developed a better assay to measure telomere length. Better assays speed up scientific discovery.
There has been some evidence from preliminary work done on cultured cells that the average telomere length is also reduced in osteoarthritis (OA). A team of researchers from Denmark used newly developed technology (Universal single telomere length assay) to look in detail at the telomeres of cells taken from the knees of people who had undergone joint replacement surgery. Their results showed that average telomere length was, as expected, shortened in OA, but that also 'ultra short' telomeres, thought to be due to oxidative stress, were even more strongly associated with OA.
Maria Harbo who led this research explained, "We see both a reduced mean telomere length and an increase in the number of cells with ultra short telomeres associated with increased severity of OA, proximity to the most damaged section of the joint, and with senescence. Senescence can be most simply explained as biological aging and senescent cartilage within joints is unable to repair itself properly."
Cartilage damage and telomere shortening are both contributing to the development of osteoarthritis.
She continued, "The telomere story shows us that there are, in theory, two processes going on in OA. Age-related shortening of telomeres, which leads to the inability of cells to continue dividing and so to cell senescence, and ultra short telomeres, probably caused by compression stress during use, which lead to senescence and failure of the joint to repair itself. We believe the second situation to be the most important in OA. The damaged cartilage could add to the mechanical stress within the joint and so cause a feedback cycle driving the progression of the disease."
Lots of researchers investigate a large assortment of diseases of old age. But many of these diseases have a common cause: loss of ability of the body to do repairs. So while the diseases manifest in different ways with different symptoms they could be reversed with a common strategy: restore the body's ability to do repairs on itself. Cell therapies to deliver youthful cells are a key part of a larger strategy to reverse the aging process and repair aged tissues.
The Technology Review Arxiv blog has an interesting post about how a Russian serial killer's frequency of killing fits a power law that suggests a pattern of neuronal recharge after killings.
On 20 November 1990, Andrei Chikatilo was arrested in Rostov, a Russian state bordering the Ukraine. After nine days in custody, Chikatilo confessed to the murder of 36 girls, boys and women over a 12 year period. He later confessed to a further 20 murders, making him one of the most prolific serial killers in modern history.
Today, Mikhail Simkin and Vwani Roychowdhury at the University of California, Los Angeles, release a mathematical analysis of Chikatilo's pattern of behaviour. They say the behaviour is well characterised by a power law and that this is exactly what would be expected if Chikatilo's behaviour is caused by a certain pattern of neuronal firing in the brain.
Click thru and read it to see if you find their conclusion likely. It seems plausible to me. Here's the abstract.
What I'd like to know: Short of committing a real murder how to drain the neural excitation that initiates a serial killer's desire to kill? On a related note: Has the playing of violent video games reduced the frequency of killing by serial killers? Could we detect the draining effects of simulated kills in video games by looking for signs of lower frequency of serial killing over the last 20 years?
What would really help: better ways to detect serial killers. What sorts of measurements could detect a person's propensity to become a serial killer? I've been reading a number of books lately about psychological research (e.g. Daniel Kahneman's excellent Thinking, Fast and Slow) and am struck by the tricky means psychologists have devised for measuring cognitive phenomena. For example, pupil dilation occurs when the conscious brain is thinking hard and can be usefully measured. Can any existing tools of psychological and neurobiological research show differences between serial killers and the population at large?
We need the ability to detect serial killers in advance. How to get from the idea that neurons charge up to activate serial killing circuitry to a way to more rapidly detect serial killers in our midst? Ideas?
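To get a feel for why threshold-crossing neuronal models produce power-law waiting times, here is a toy sketch. This is my simplification for illustration, not Simkin and Roychowdhury's actual model: "excitation" performs a simple random walk, an event fires whenever the walk returns to its starting threshold, and first-return times of such walks are famously heavy-tailed (the survival function falls off roughly as t to the -1/2).

```python
import math
import random

random.seed(1)

def first_return_times(n_samples, max_steps=100_000):
    """Sample first-return times of a simple +/-1 random walk to the origin."""
    times = []
    for _ in range(n_samples):
        x = 0
        for t in range(1, max_steps + 1):
            x += random.choice((-1, 1))
            if x == 0:          # back at threshold: event fires
                times.append(t)
                break
    return times

times = first_return_times(2000)

def survival(ts, t):
    """Empirical fraction of inter-event times longer than t."""
    return sum(1 for x in ts if x > t) / len(ts)

# Log-log slope of the survival function between t=2 and t=200;
# theory predicts roughly -0.5 for a simple random walk.
slope = (math.log(survival(times, 200)) - math.log(survival(times, 2))) \
        / (math.log(200) - math.log(2))
print(f"estimated tail slope: {slope:.2f}")
```

A straight line on a log-log plot of the survival function is the signature of a power law, which is the kind of fit the authors report for Chikatilo's inter-murder intervals.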
Check out this piece in the Miami Herald. Plastic surgeons and dermatologists are injecting fat cells and stem cells into facial regions to plump up faces for a more youthful look.
But some South Florida plastic surgeons and dermatologists are expanding their repertoire to include new options, such as injecting fat or stem cells into the face, as well as using ultrasound technology to tighten sagging skin. “I’m doing a lot of fat,” said Dr. Constantino Mendieta, a Miami plastic surgeon and spokesman for the American Society of Aesthetic Plastic Surgery, who is using fat as a natural way to add volume to the face. As we age, we tend to lose fat from the face, as well as from the hands, derriere and breasts, Mendieta said.
I've been expecting the plastic surgeons to take a more aggressive approach to the use of cell therapies than most medical specialties. Plastic surgery is one of the most free market-oriented areas of medicine. Usually patients pay with their own money. The potential pool of patients is far larger than the number who currently use plastic surgeons. Better treatments will pull in more customers and generate bigger incomes for the surgeons. The plastic surgeons already have a history of innovation.
The good news as I see it: If people start spending large amounts of money on appearance-improving cell therapies their money will fund a biotech industry for stem cell manipulation that will use part of their revenue to improve their capabilities for manipulating and improving stem cells. Consumer dollars spent on cell therapies will accelerate the eventual development of real rejuvenation therapies that extend life.
Researchers at the Dana-Farber Cancer Institute in Boston find a hormone that makes a high-fat diet less damaging. The hormone has effects similar to that of exercise - but without all that effort and time spent exercising.
Mice given irisin lost a few grams in the first 10 days after treatment, the study shows, and certain genes involved in powering the cell were turned on. Irisin also appeared to reduce the damage done by a high-fat diet, protecting mice against diet-induced obesity and diabetes, according to the paper, whose first author is postdoctoral fellow Pontus Boström.
Drug delivery is a serious problem in this case. Also, it isn't clear this hormone will deliver all the benefits of exercise.
Internet addiction disorder may be associated with abnormal white matter structure in the brain, as reported in the Jan. 11 issue of the online journal PLoS ONE. These structural features may be linked to behavioral impairments, and may also provide a method to study and treat the disorder.
Previous studies of internet addiction disorder (IAD), which is characterized by an individual's inability to control his or her Internet use, have mostly focused on psychological questionnaires. The current study, on the other hand, uses an MRI technique to investigate specific features of the brain in 18 adolescents suffering from IAD.
The researchers, led by Hao Lei of the Chinese Academy of Sciences in Wuhan, found that IAD is characterized by impairment of white matter fibers connecting brain regions involved in emotional generation and processing, executive attention, decision making, and cognitive control, and suggest that IAD may share psychological and neural mechanisms with other types of impulse control disorders and substance addiction.
As a heavy internet user I suddenly want to know: is my brain white matter tweaked?
Significantly negative correlations were found between FA values in the left genu of the corpus callosum and the Screen for Child Anxiety Related Emotional Disorders, and between FA values in the left external capsule and the Young's Internet addiction scale.
If you have to feel anxiety to have this abnormality then I'm okay. Anxiety is an emotion that I don't much feel. Okay, do I have, like, anxiety deficit syndrome? It's always something.
As part of the Whitehall II cohort study, medical data was extracted for 5,198 men and 2,192 women, aged between 45 and 70 at the beginning of the study, monitored over a 10-year period. The cognitive functions of the participants were evaluated three times over this time. Individual tests were used to assess memory, vocabulary, reasoning and verbal fluency.
The results show that cognitive performance (apart from the vocabulary tests) declines with age and more rapidly so as the individual's age increases. The decline is significant in each age group.
For example, during the period studied, reasoning scores decreased by 3.6 % for men aged between 45 and 49, and 9.6 % for those aged between 65 and 70. The corresponding figures for women stood at 3.6% and 7.4% respectively.
Brain aging is a tremendous waste of resources and we should support research aimed at reversing it.
On a related note surgeons peak between ages 35 and 50. Not surprising since training takes many years and their nervous systems are aging.
Surgeons aged between 35 and 50 years provide the safest care compared with their younger or older colleagues, finds a study published on bmj.com today.
The findings raise concerns about ongoing training and motivation of surgeons during their careers.
Typically, experts reach their peak performance between the ages of 30 and 50 years or after about 10 years' experience in their specialty, but few studies have measured the association between clinicians' experience and performance.
It would make sense to train medical doctors, especially surgeons, starting about 5 years earlier. They would then be able to enter their peak performing years at younger ages and spend more time at peak performance before nervous system aging starts to take its toll. That is true as well for many other occupations where high cognitive performance is essential. Teens should have access to online courses and tests to allow them to study and take tests all year round and at all hours of every day and night.
Why haven't the space aliens shown themselves to us? Gone extinct? Or headed our way in massive invasion armadas as soon as they detected our electromagnetic signals? Over a hundred billion planets in our galaxy waiting to be visited.
Six years of observations of millions of stars now show how common it is for stars to have planets in orbits around them. Using a method that is highly sensitive to planets that lie in a habitable zone around the host stars, astronomers, including members from the Niels Bohr Institute, have discovered that most of the Milky Way's 100 billion stars have planets that are very similar to the Earth-like planets in our own solar system – Mercury, Venus, Earth and Mars, while planets like Jupiter and Saturn are more rare. The results are published in the prestigious scientific journal, Nature.
"Our results show that planets orbiting around stars are more the rule than the exception. In a typical solar system approximately four planets have their orbits in the terrestrial zone, which is the distance from the star where you can find solid planets. On average, there are 1.6 planets in the area around the stars that corresponds to the area between Venus and Saturn" explains astronomer Uffe Gråe Jørgensen, head of the research group Astrophysics and Planetary Science at the Niels Bohr Institute at the University of Copenhagen.
If aliens are out there and warp drives are possible I would expect they would be here already - or their robotic explorer ships would be. Have all the surviving intelligent life forms survived by hiding? Are intelligent species so dangerous to each other that most of them learn to hide?
If biological species eventually get wiped out by artificially intelligent machines then I'd expect the robots to build up massive machine civilizations and perhaps go around snuffing out biological species. So where are the android invaders?
Even more fun: millions of planets in the Milky Way orbit two suns.
Astronomers using NASA's Kepler mission have discovered two new circumbinary planet systems – planets that orbit two stars, like Tatooine in the movie Star Wars. Their find, which brings the number of known circumbinary planets to three, shows that planets with two suns must be common, with many millions existing in our Galaxy.
"Once again, we're seeing science fact catching up with science fiction," said co-author Josh Carter of the Harvard-Smithsonian Center for Astrophysics.
Is that cool or what?
Faster than expected. Life Technologies of Carlsbad, California has announced a $149,000 genome sequencing machine that, with a chip upgrade coming before the end of 2012, will sequence an entire human genome in less than a day. The machine is cheap compared to other products in this market and it is faster.
While the claim of a $1,000 cost for sequencing a genome may be a premature exaggeration even for the end of 2012, the company is getting pretty close. The computer chip, which can be used for only one genome, and its associated biochemicals together cost $1,000. So the machine's price tag is in addition to the per genome cost. Also, a company using this machine to sequence a genome will have added labor, marketing, and other costs on top of a profit margin. Still, we are probably going to be at least below $5,000 per genome by the end of 2012.
An IEEE Spectrum Tech Talk blog post argues Moore's Law drove down the cost of DNA sequencing just as it has the cost of computer power. Since the Life Technologies Ion Proton Sequencer uses transistors in wells at a small scale made possible by the semiconductor industry this seems like a correct analysis.
Since current market leader Illumina also just announced an upgrade to their existing sequencer that will sequence a genome in a day the cost reductions are happening in a competitive market and should translate into price reductions as well.
Illumina, Life Technologies, and other competitors will continue to find ways to cut costs. So I'm thinking 2013 looks like the year to get my full genome sequenced. We will need to look at not just price but also accuracy and extent of the sequencing done. How will each service compare a year from now in terms of error rates and thoroughness? Will they detect large copy number variation? Once prices go below $1,000 it might make sense to get yourself sequenced by a couple of competing services and compare the results.
These low prices are going to drive up the rate of full genome sequencing. Therefore expect to see an explosion of discoveries on what the many genetic variants mean. What I'm especially looking forward to: genetically derived personal advice on ideal diet, exercise, sleep, and other lifestyle choices.
In an interview with Technology Review Mark Perry, Nissan America director of product planning, discusses how Nissan is scaling up their Smyrna Tennessee factory to make 150,000 cars and 200,000 battery packs.
We have a complete assembly line in Osaka, Japan, built up from scratch, especially for the electric motor. The battery construction is done in a clean room—that's also new for an automotive factory.
We're now re-creating all that here in the United States, in Tennessee. It will be the world's largest battery assembly plant—our engine plant will actually be winding away electric motors this time next year. And at full capacity it'll be capable of putting out 200,000 battery packs a year.
What I'm wondering: what price point do they think they need to reach to get 150,000 units of demand for electric vehicles? My guess is they are counting on CAFE standards to push up the prices of pure gasoline and even hybrid vehicles enough to make EVs competitive. In other words, personal transportation is going to cost more.
Another possibility: Peak Oil will drive up the price of gasoline so high that EVs will seem a bargain due to lower fuel cost. But again personal transportation becomes more expensive.
So what are the odds of a substantial reduction in vehicle battery costs? When? We are in 2012. Where are the signs that EV batteries cost less now than a year ago? Anyone seen indications that EV battery costs are coming down?
The Edison Foundation's Institute for Electric Efficiency reports that just from 2009 to 2010 the incremental cost of saving electric power with more energy efficiency technologies went up by about 10%.
That adds up to a cost of about 3.5 cents per kWh, as cheap or cheaper than almost every source of power generation out there, including natural gas and coal. But that cost has also gone up in the past few years, according to Lisa Wood, IEE’s executive director. In 2009, energy efficiency cost 3.2 cents per kilowatt hour, according to IEE figures.
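A quick check of the "about 10%" figure from the IEE numbers quoted above:

```python
# Incremental cost of saved electricity, in cents per kWh (IEE figures):
cost_2009 = 3.2
cost_2010 = 3.5
increase = (cost_2010 - cost_2009) / cost_2009
print(f"incremental efficiency cost rose {increase:.1%}")  # 9.4%, i.e. "about 10%"
```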
Makes sense that the low hanging fruit gets harvested first and incremental costs for additional efficiency rise. While conservation is still fairly cheap we can't count on it remaining so.
Since utilities do not try to tackle every form of energy waste there's probably more potential for increased efficiency on topics they don't try to tackle. Also, many customers don't take up utilities on deals they offer. But it seems reasonable to expect rising electricity prices in some regions to cause more people to take steps to cut their energy usage. The state governments that are requiring more renewable energy usage are pushing up electric power costs. So in those states I expect more conservation measures as a response to expensive solar and wind power driving up utility bills.
The US Energy Information Administration web site has recently had a make-over which I'm exploring. Check out the main page for electricity. Their scrolling list of charts shows some charts of interest. One of the more interesting charts for energy geeks: Electricity tends to flow south in North America.
NIMBY California imports 25% of its (expensive) electricity.
One reason for these flows: Cheap hydro's mostly in the north and southern states will pay for importing it.
The map above shows that electricity tends to flow south in North America. The numbers on the map reflect average net power flows—metered hourly—between electric systems aggregated by regions for the year 2010. Most electric power demand is served by local generators. Net interregional trade accounted for less than 1% of delivered power in 2010. However, excess, low-cost power—primarily from hydroelectric generators in the Pacific Northwest, Manitoba, and Quebec—supplied higher-cost markets to the south.
Another page worth a look: ranges of wholesale electric prices by region. The northeast and Texas have very large swings in wholesale electric power prices. Also, New England, California, and Alaska have the highest electric power prices. California's prices will rise much higher to meet the state legislated requirement to get 33% of electric power from renewables by 2020.
A new paper coming soon in Nature Geoscience presents evidence that our current warm period would end 1500 years from now if only the industrial age hadn't unleashed massive releases of carbon dioxide into the atmosphere.
The research, led by Chronis Tzedakis of University College, London, examined similarities between the current warm interval between ice ages and a particular point, around 780,000 years ago, during a past warm period known as Marine Isotope Stage 19. Using a variety of methods, the authors conclude that the onset of a new ice age would likely begin about 1,500 years from now, if the concentration of carbon dioxide was back below the levels produced since the Industrial Revolution.
Anticipating some comments: Yes, if you think that CO2 isn't a greenhouse gas, that its infrared absorption spectrum (see figure 6) is of little consequence, then it's Ice Age for you in 1500 years. At least that'll be the case if you live long enough to get rejuvenation therapies that turn back your aging clock.
Which leads me to another subject: Once we get rejuvenation therapies if I'm still around I am going to try to get large numbers of people to publicly record their beliefs and derived predictions. Let us get people on record saying the ice caps will melt or not and ditto for many other contentious issues. Then after a century or two people can see what mistakes they made. All the proven wrong predictions are needed to make people humble a thousand years from now. Otherwise some of us are going to develop absolutely insufferable unjustified confidence in our delusional beliefs about the future.
Also see Andrew Revkin's post The Next Ice Age and the Anthropocene.
Due to both technological advances and regulatory and economic pushes toward greater energy efficiency, decisions over light bulb choices have gotten much more complex - unless you just ignore the complexity and grab something off the shelf. But if you'd like to maximize the quality of your light bulb buying decisions then a review of light bulbs by Bob Tedeschi in the New York Times is a good place to start.
The quality of L.E.D. light, even the “soft white” types, is noticeably cooler than that of halogens or C.F.L.’s. And because most L.E.D.’s are unidirectional, they work well for recessed lights or lamps that spotlight artwork. But this single-focus nature is a problem for standard shaded lamps. The packaging of Sylvania’s Ultra A-Line L.E.D. suggests that it’s suitable for a shaded lamp, but when I tried it in a lamp in my living room, the top half was lit, while the bottom saw little light.
However, Sylvania will release an omnidirectional L.E.D. this winter, and two manufacturers are now making them. When I tried them — G.E.’s Energy Smart L.E.D. and the Philips AmbientLED — they lighted up both the top and bottom of my lamp. The Philips bulb was softer than G.E.’s — so much so that I now have two of them gracing my living room.
LEDs, while still expensive, have come down far enough in price that their life expectancies (20+ years in most cases) make them an attractive choice. It is always good to find more areas of one's life where one can basically deal with a problem once and then not think about it for a long time.
The article is not a sales job for uniform use of LEDs. Tedeschi does a good job of explaining the trade-offs for different rooms of the house and purposes. Worth a read if you want to make smarter lighting decisions.
One interesting factoid from the article brings up a new way (at least I haven't thought of it) to save energy: rejuvenate eyes.
“Fifty-year-olds need twice as much light to read something as well as a 20-year-old,” Mr. Bernecker said. “It’s a sad story.”
Think about it. If we could replace aged lenses and send in stem cells and gene therapies to repair eyes we would save substantially on electric lighting costs. Plus, we'd save even more on the costs of optometry and eyeglasses. The gain in convenience (no more "where are my glasses?") would be considerable as well.
In MIT's Technology Review Phil McKenna reports on a potentially revolutionary advance in zinc-air batteries to enable cheap grid storage.
Battery developer Eos Energy Storage claims to have solved key problems holding back a battery technology that could revolutionize grid energy storage. If the company is right, its zinc-air batteries will be able to store energy for half the cost of additional generation from natural gas—the method currently used to meet peak power demands.
Most of the non-fossil fuels based energy sources (hydro excepted) do not have the ability to load follow. In other words, they can't boost production when demand is higher. That's not just true for wind and solar. Look at nuclear and geothermal. They run continuously. Turning them down when there's less demand idles very expensive capital and therefore reduces return on investment. So a cheap way to allow electric power generated at one time to be used at another time would make solar, wind, nuclear, and geothermal all more cost competitive versus natural gas peaking generators.
They also claim twice the energy density of lithium ion batteries. That opens up the possibility of electric cars with twice the range.
While potential cost-cutting advances in wind power (e.g. bigger turbines, floating turbines), solar (e.g. thin films), and nuclear (e.g. LFTR and fusion) get a lot of attention, we really need advances in storage technology even more. The widespread adoption of electric vehicles depends on cheaper and higher energy density batteries. Extensive displacement of fossil fuels for electric power generation depends on the ability to use clean baseload (geothermal, nuclear) and unreliable renewables (wind, solar) when demand for electric power is highest. That means cheap and long lasting batteries.
Note that the use of concentrating solar power to store heat energy for electric power generation in the evening loses its advantage over photovoltaics and other electric power generation sources if storage battery tech becomes cheap enough. Big solar projects are switching from thermal to PV and a solar concentrator maker has been driven out of business. So can solar concentrators compete in the long term against PV and cheaper battery storage?
Update: A recent study by a Nevada electric utility finds that fossil fuel back-up to solar power for cloudy days will add 3 to 8 cents per kilowatt-hour for solar power. That's a big hit for an electric power source that is already more expensive than coal, natural gas, nuclear, and wind. To put it in perspective, the average cost to US residential customers for electric power in 2009 was 11.51 cents per kwh. Imagine your electric power bill going up by about 50%. Not a pleasant thought.
The study above was done in Nevada, certainly not the cloudiest part of the United States. Would the cost be even higher in the east or midwest? How about cloudy, rainy Seattle?
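To sanity-check the "about 50%" figure, here is the arithmetic using the numbers quoted above (3 to 8 cents of added back-up cost against the 11.51 cents per kWh 2009 US residential average):

```python
# Rough check of the ~50% bill-increase figure quoted above.
avg_rate = 11.51                      # cents/kWh, 2009 US residential average
backup_low, backup_high = 3.0, 8.0    # added cents/kWh for fossil fuel back-up

increase_low = backup_low / avg_rate * 100    # ~26% increase at the low end
increase_high = backup_high / avg_rate * 100  # ~70% increase at the high end
midpoint = (increase_low + increase_high) / 2 # ~48%, i.e. roughly "about 50%"
```

So the 3-8 cent range implies a 26% to 70% increase, which averages out near the 50% mentioned.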
Liberals are Panglossians. Conservatives expect bad things. Conservatives stare more at wounds while liberals stare more at fluffy bunnies.
From cable TV news pundits to red-meat speeches in Iowa and New Hampshire, our nation's deep political stereotypes are on full display: Conservatives paint self-indulgent liberals as insufferably absent on urgent national issues, while liberals say fear-mongering conservatives are fixated on exaggerated dangers to the country.
A new study from the University of Nebraska-Lincoln suggests there are biological truths to such broad brushstrokes.
In a series of experiments, researchers closely monitored physiological reactions and eye movements of study participants when shown combinations of both pleasant and unpleasant images. Conservatives reacted more strongly to, fixated more quickly on, and looked longer at the unpleasant images; liberals had stronger reactions to and looked longer at the pleasant images compared with conservatives.
"It's been said that conservatives and liberals don't see things in the same way," said Mike Dodd, UNL assistant professor of psychology and the study's lead author. "These findings make that clear – quite literally."
Political leanings have a large innate component. This makes continued political conflict inevitable.
Beach balls and bunnies? Or open wounds and crashed cars?
To gauge participants' physiological responses, they were shown a series of images on a screen. Electrodes measured subtle skin conductance changes, which indicated an emotional response. The cognitive data, meanwhile, was gathered by outfitting participants with eyetracking equipment that captured even the most subtle of eye movements while combinations of unpleasant and pleasant photos appeared on the screen.
While liberals' gazes tended to fall upon the pleasant images, such as a beach ball or a bunny rabbit, conservatives clearly focused on the negative images – of an open wound, a crashed car or a dirty toilet, for example.
Both conservatives and liberals showed stronger physiological responses to images of Democrats than to images of Republicans.
Consistent with the idea that conservatives seem to respond more to negative stimuli while liberals respond more to positive stimuli, conservatives also exhibited a stronger physiological response to images of Democratic politicians – presumed to be a negative to them – than they did to pictures of well-known Republicans. Liberals, on the other hand, had a stronger physiological response to the Democrats – presumed to be a positive stimulus to them – than they did to images of the Republicans.
Do these groups balance each other out in constructive ways? Or do they just make each other infuriated? Break the country up into conservative and liberal countries? Or keep them together in eternal political battle?
In the past when I've written posts about innate political differences inevitably someone has commented basically "so that's what's wrong with the other side". So I'm not optimistic that people are going to develop greater political tolerance by learning that political differences are partly innate.
Rather than believing those with opposite political views are uninformed or willfully obtuse, the authors said, political tolerance could be enhanced if it was widely understood that political differences are based in part on our physiological and cognitive differences.
Perhaps these two leanings were specializations of labor back in tribal days. That would have enabled them to remain evolutionarily stable and coexist in the gene pool.
Also see my previous posts Political Orientation Tied To Biological Reaction To Threats and Greater Disgust Response Associated With Political Right?
Cancer death rates declined much more rapidly than cancer incidence. One possible interpretation: treatments are becoming more effective.
ATLANTA – January 4, 2012 – The American Cancer Society's annual cancer statistics report shows that between 2004 and 2008, overall cancer incidence rates declined by 0.6% per year in men and were stable in women, while cancer death rates decreased by 1.8% per year in men and by 1.6% per year in women.
Progress is slowly being made across a range of different cancers.
Death rates continue to decline for all four major cancer sites (lung, colorectum, breast, and prostate), with lung cancer accounting for almost 40% of the total decline in men and breast cancer accounting for 34% of the total decline in women.
One of the next weapons against cancer: whole genome sequencing. The hope is that anti-cancer treatments can be customized to aim at identifying and then counteracting the combination of mutations that enable each specific cancer. Multiple research efforts are each sequencing hundreds of cancer genomes. A company called Complete Genomics will sequence cancer and normal genomes of a cancer patient for $12,000 and already have hundreds of customers.
A total of about 30,000 human genomes were sequenced in 2011, an order of magnitude more than were sequenced in 2010. This is due to the very rapid rate of decline in costs of sequencing DNA. So we are just at the beginning of a huge flood of genetic sequencing data.
Since some (if not all) cancer happens due to genetic mutations the flood of genetic data ought to provide major clues on how to defeat cancer. Since each cancer has many unique mutations sorting thru them is very non-trivial. Even once more cancer-enabling mutations are identified developing treatments that target them will take years. So I'm not expecting a big short-term payoff.
Mitochondria are sub-cellular organelles that break down sugar to make energy for the cell. Our mitochondrial DNA accumulates mutations and mitochondria become less functional as a result. Possibly other mechanisms are at work causing mitochondrial aging as well. A new report finds mitochondrial damage accumulation in stem cells has an especially large impact on overall aging.
Aging-related tissue degeneration can be caused by mitochondrial dysfunction in tissue stem cells. The research group of Professor Anu Suomalainen Wartiovaara in Helsinki University, with their collaborators in Max Planck Institute for Biology of Aging, Karolinska Institutet and University of Wisconsin reported on the 3rd January in Cell Metabolism their results on mechanisms of aging-associated degeneration.
Stem cells are called the spare parts for tissues, as they maintain and repair tissues during life. They are multipotent and can produce a variety of different cell types, from blood cells to neurons and skin cells. Mitochondria are the cellular engine: they transform the energy of nutrients to a form that cells can use, and in this process they burn most of the inhaled oxygen. If this nutrient 'burning' is inefficient, the engine will produce exhaust fumes, oxygen radicals, which damage cellular structures, including the genome. Antioxidants target to scavenge these radicals.
Already in 2004 and 2005 a research model was created in Sweden and USA, which accumulated a heavy load of mitochondrial genome defects. This led to symptoms of premature aging: thin skin, graying of hair, baldness, osteoporosis and anemia.
In the current publication, scientist Kati Ahlqvist in Professor Suomalainen Wartiovaara's group showed that these symptoms were partially explained by stem cell dysfunction. The number of stem cells did not reduce, but their function was modified: the progeny cells in blood and the nervous system were dysfunctional. The researchers also found out that these defects could be partially prevented by early antioxidant treatment.
Stem cells are needed to create replacements for damaged cells that die off or cease to do their jobs. Damaged stem cells are unable to perform their function. So less repair gets done as our stem cells accumulate damage and become dysfunctional with age. Biotechnology that would enable us to replace our old stem cells with younger ones would go far to slow and partially reverse aging.
Another research team found that in mice bred to age rapidly stem cell injections slowed aging and enabled the mice to live longer.
PITTSBURGH, Jan. 3 – Mice bred to age too quickly seemed to have sipped from the fountain of youth after scientists at the University of Pittsburgh School of Medicine injected them with stem cell-like progenitor cells derived from the muscle of young, healthy animals. Instead of becoming infirm and dying early as untreated mice did, animals that got the stem/progenitor cells improved their health and lived two to three times longer than expected, according to findings published in the Jan. 3 edition of Nature Communications.
Previous research has revealed stem cell dysfunction, such as poor replication and differentiation, in a variety of tissues in old age, but it's not been clear whether that loss of function contributed to the aging process or was a result of it, explained senior investigators Johnny Huard, Ph.D., and Laura Niedernhofer, M.D., Ph.D. Dr. Huard is professor in the Departments of Orthopaedic Surgery and of Microbiology and Molecular Genetics, Pitt School of Medicine, and director of the Stem Cell Research Center at Pitt and Children's Hospital of Pittsburgh of UPMC. Dr. Niedernhofer is associate professor in Pitt's Department of Microbiology and Molecular Genetics and the University of Pittsburgh Cancer Institute (UPCI).
"Our experiments showed that mice that have progeria, a disorder of premature aging, were healthier and lived longer after an injection of stem cells from young, healthy animals," Dr. Niedernhofer said. "That tells us that stem cell dysfunction is a cause of the changes we see with aging."
Stem cells from young healthy mice enabled progeria mice (i.e. mice bred to age more rapidly) to live longer.
Their team examined a stem/progenitor cell population derived from the muscle of progeria mice and found that compared to those from normal rodents, the cells were fewer in number, did not replicate as often, didn't differentiate as readily into specialized cells and were impaired in their ability to regenerate damaged muscle. The same defects were discovered in the stem/progenitor cells isolated from very old mice.
"We wanted to see if we could rescue these rapidly aging animals, so we injected stem/progenitor cells from young, healthy mice into the abdomens of 17-day-old progeria mice," Dr. Huard said. "Typically the progeria mice die at around 21 to 28 days of age, but the treated animals lived far longer – some even lived beyond 66 days. They also were in better general health."
The symptoms which old mice suffer from serve as a reminder of why we need rejuvenation therapies. Do you want to hunch over, tremble, or move slowly and awkwardly? I think not.
As the progeria mice age, they lose muscle mass in their hind limbs, hunch over, tremble, and move slowly and awkwardly. Affected mice that got a shot of stem cells just before showing the first signs of aging were more like normal mice, and they grew almost as large. Closer examination showed new blood vessel growth in the brain and muscle, even though the stem/progenitor cells weren't detected in those tissues.
Once rejuvenating stem cell therapies become available I expect people will start using them while still at fairly young ages. Starting in one's 20s doesn't seem too soon.
A diagnosis of pancreatic cancer basically represents a notice you're going to be checking out of the Life Hotel. Bad cancer. So it is very desirable to find ways to lower the odds of getting pancreatic cancer. A new report finds some trace elements raise and lower the risk of pancreatic cancer.
A new study has found that high bodily levels of the trace elements nickel and selenium may be associated with reduced risk for pancreatic cancer, and that high levels of arsenic, cadmium and lead may increase the risk.
Avoiding toxins is usually a good idea unless you are training your body to detoxify some toxin because you expect someone to try to poison you.
I did some poking around about arsenic. In a fairly small number of areas arsenic in the water supply is a problem. Odds are you aren't in one of those areas. Arsenic comes into the diet in quite a few different ways. For example, arsenic was used to kill boll weevils in the Old South in the United States. So rice from some areas of the US has much higher arsenic in it. Louisiana rice appears to be worst with California rice as best. Imported Basmati and Jasmine rice have the lowest arsenic. But there's some dispute over how much of the arsenic in rice is of the more toxic inorganic kind. I'd like to know where some of the brand name rices come from btw. Anyone know?
Fish is the main source of arsenic in the UK diet.
I do not have details on which types of fish are especially high in arsenic. However, sounds like rainbow trout have pretty low arsenic levels.
The main sources of inorganic arsenic intake are cereal grains and cereal based products, food for special dietary uses (e.g. algae), bottled water, coffee and beer, rice and rice-based products, fish and vegetables.
Next we come to chickens fed arsenic. It is not clear to me how much Pfizer's pulling of Roxarsone from the market cut the use of arsenic in chicken feed. Did other arsenic suppliers just replace Pfizer's product? My advice: do not eat chicken liver (where the arsenic concentrates) unless you can be certain the chicken you eat wasn't fed arsenic.
It is not clear to me which foods to cut from the diet for the easiest reduction in consumed arsenic.
Ever looked at a big book on the shelf and thought you wished you had the time to read it? Ray Sawhill (who spent many years covering the book publishing industry for Newsweek) frequently tells me books are too long. Publishers think they need high page counts to justify prices. But since books frequently have too much filler, repetition, and detail we aren't interested in we read fewer books and get less benefit per book than would be the case if books were written and formatted to optimize reader benefit.
Ray tells me he's long wanted publishers to include a 40 page short version at the front of a book. Read it for the gist. Then see if any sections seem worth reading in greater detail.
Now that a large and growing percentage of books get delivered digitally we ought to be able to do something even better. For some years now I've thought the publishing industry should agree to a standard way to provide multiple views into an ebook filtered for different reading purposes. Imagine your ebook reader allowed you to select between 10 page, 40 page, 100 page, and 500 page views of the same book. Writers and editors would need to decide what went into each view. Links in the shorter views could allow one to read a particularly interesting section in greater detail.
With a standard way to call out book subsets it would even be possible for 3rd parties to make and publish their own filtered views of a book. An astute reviewer could select the subsections of a book that are most interesting and then publish a special filter file that enabled owners of the full ebook to look at a subset of that ebook as chosen by the reviewer. The subset files would not need to contain the actual book text. For example, in the simplest version a subset file could contain just sentence numbers or some other unique tag for each sentence (so all books would need a sentence numbering system - and that system would need to support fixes where publishers add or delete sentences to correct errors in the original published work). So there'd be no copyright issue with distributing the reviewer's filter.
The filter format should include support for added commentary by reviewers that goes with it. So one could get a filter that basically is both a fast way to read the book and a review of the book that gets into detail tied to specific sections of a book. Viewing software should even be powerful enough to show the subset chosen by reviewer A along with any additional sections which reviewer B wrote specific commentary about. So you wouldn't need to see all the sections reviewer B made visible. But you could still see the sections that reviewer B commented on.
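A minimal sketch of how such a filter file might work, assuming a hypothetical stable sentence-ID scheme (the book text, filter format, and function names here are all made up for illustration). Note the filter distributes only sentence IDs and reviewer commentary, never the copyrighted book text:

```python
# The full ebook, keyed by stable sentence ID. Only the book's owner has this.
book = {
    1: "Sorting algorithms order a collection of items.",
    2: "Bubble sort is simple but slow.",
    3: "Quicksort is fast on average.",
    4: "Merge sort guarantees O(n log n) time.",
}

# A reviewer's filter file: which sentences to show, plus optional
# commentary tied to specific sentences. Contains no book text.
reviewer_filter = {
    "title": "Reviewer A's short view",
    "sentences": [1, 3, 4],
    "comments": {3: "The key practical takeaway."},
}

def render_view(book, flt):
    """Render the reviewer's filtered view, interleaving commentary."""
    lines = []
    for sid in flt["sentences"]:
        lines.append(book[sid])
        if sid in flt["comments"]:
            lines.append("[Reviewer: %s]" % flt["comments"][sid])
    return "\n".join(lines)
```

A reader who owns the full ebook could apply any number of such filters, and viewing software could union the sentence sets of two reviewers, or show only the sentences one reviewer commented on.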
Of course chopping out sentences might require adding bridge sentences. It is harder for a 3rd party to do that without potentially violating copyright. But bridge sentences are still a useful feature even if only publishers could add them.
An optional feature: the ability to link between books. Be able to basically create a view that interleaves 2 or 3 or more books. Anyone who owns all of the books could read them in a way that interleaves different treatments by different authors of the same topic. So, for example, a few books on algorithms could have their sorting algorithm sections linked together. One could cycle between the books to read different treatments of the same topic.
So I'm really proposing a few things here: a standard sentence numbering scheme for ebooks; publisher-supplied views of the same book at multiple lengths; a standard filter file format so third parties can publish their own subsets and commentary without copyright problems; bridge sentences to smooth over the cuts; and optional links between books so related sections can be interleaved.
Any book readers have some thoughts on this proposal?
Despite current public and expert opinion to the contrary, having the neurological condition epilepsy is not directly associated with an increased risk of committing violent crime. However, there is an increased risk of individuals who have experienced previous traumatic brain injury going on to commit violent crime, according to a large Swedish study led by Seena Fazel from the University of Oxford, UK, and colleagues at the Karolinska Institutet, Sweden, and the Swedish Prison and Probation Service, published in this week's PLoS Medicine.
The authors say: "The implications of these findings will vary for clinical services, the criminal justice system, and patient charities."
In their study, the authors identified all people with epilepsy and traumatic brain injury recorded in Sweden between 1973 and 2009 and matched each case with ten people without these brain conditions from the general population. The investigators linked these records to subsequent data on all convictions for violent crime using the personal identification numbers that identify Swedish residents in national registries.
Sweden's extensive record keeping on its populace sure comes in handy for scientific research.
I don't think these results mean what these researchers think they mean. Just because other non-epileptic family members of epileptics have the same increased risk of violent crime does not mean that epileptics aren't at increased risk of committing violent crime.
Using these methods, the authors found that 4.2% of people with epilepsy had at least one conviction for violence after their diagnosis compared to 2.5% of the general population. However, after controlling for the family situation (in which individuals with epilepsy were compared with their unaffected siblings), the association between being diagnosed with epilepsy and being convicted for violent crime disappeared.
A plausible explanation: genetic variants that increase the risk of developing epilepsy also boost the risk of crime regardless of whether they cause epilepsy in their carriers. Though even if some genetic variants contribute to epilepsy risk and violence that does not mean that all genetic variants that contribute to one also contribute to the other. Also, even if some genetic variant contributes to both epilepsy and violence that does not mean that variant increases the risk of violent crime in all carriers. Other genetic variants elsewhere in the genome could cancel out violent crime risks in some carriers. The interactions between human brain genes are going to be enormously difficult to discover and model.
Traumatic brain injury is linked to higher risk of committing violent crime. There are plausible mechanisms of causation in both directions. Obviously brain damage could harm brain circuitry that restrains behavior, or it could enhance the anger people feel when insulted, or it could work via some other mechanism to make violent acts more likely. Though people who are violent in the first place are at greater risk of getting their brains beaten on in a fight or damaged in a road rage incident. The full research paper (linked in the first paragraph) even mentions that brain trauma is common in prison.
In contrast, the authors found that after controlling for substance abuse or comparing individuals with brain injury to their unaffected siblings, there remained an association between experiencing a traumatic brain injury and committing a violent crime.
Given the high incidence of traumatic brain injury from improvised explosive devices (IEDs) among US military veterans who served in Iraq and Afghanistan, some of these veterans might be at greater risk of violence as a result.
The full paper even mentions that some subtypes of epilepsy are associated with a reduced risk of violent crime.
Somehow I missed this study a month ago. Consuming below 3 grams (3,000 milligrams) of sodium per day is linked to a higher incidence of congestive heart failure.
For years doctors have warned that too much salt is bad for your heart. Now a new McMaster University study suggests that both high and low levels of salt intake may put people with heart disease or diabetes at increased risk of cardiovascular complications.
This is good news for olive lovers. Looking at a 19 oz bottle of pitted Greek Kalamata olives I see it has a total of 8.4 grams of sodium in the 140 olives, presumably more in the brine. For someone who eats very little in the way of processed foods and who has few other sources of sodium in their diet it seems quite safe to eat, say, 20 olives a day, since that will add only 1.2 grams of sodium. This study found that eating below 3 grams of sodium per day appears to increase the risk of congestive heart failure. Since I don't even manage to eat 20 olives per day I'm thinking more ketchup might be called for. Gotta get that sodium somehow. A quick check shows that the dark chocolate from Trader Joe's has no sodium in it. So can't get it that way.
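A quick sanity check of the olive arithmetic, using the jar figures quoted above:

```python
# 8.4 grams of sodium across the 140 olives in the jar.
total_sodium_g = 8.4
olives_in_jar = 140
per_olive_g = total_sodium_g / olives_in_jar   # 0.06 g of sodium per olive

daily_olives = 20
daily_sodium_g = per_olive_g * daily_olives    # 1.2 g/day, well under the
                                               # 3 g/day low-risk threshold
```

So even a 20-olive daily habit leaves plenty of headroom below the 3 gram level this study flags.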
Moderation is best. Yet again. How frustrating for the extremists among us.
The study, published in the Journal of the American Medical Association (JAMA) today, found that moderate salt intake was associated with the lowest risk of cardiovascular events, while a higher intake of sodium was associated with an increased risk of stroke, heart attack and other cardiovascular events and a low intake was associated with an increased risk of cardiovascular death and hospitalization for congestive heart failure.
Most people still probably eat too much processed food and therefore this report isn't a license for most people to eat more sodium. But if you do manage to cut back your fast food and other processed food consumption it might be time for some supplementary olive eating or use of ketchup on baked potatoes.
So are current guidelines too low? One can't be sure from a single study. But the case against moderate sodium consumption has been weakened by this result.
Compared with moderate sodium excretion (between 4 to 5.99 grams per day), the researchers found that sodium excretion of greater than seven grams per day was associated with an increased risk of all cardiovascular events, and sodium excretion of less than three grams per day was associated with an increased risk of cardiovascular death and hospitalization for congestive heart failure.
The findings call into question current guidelines for salt intake, which recommend less than 2.3 grams (or 2,300 mg) per day. The guidelines are mostly based on previous clinical trials that found blood pressure is lowered modestly when sodium intake is reduced to this level (which was also found in the present study). However, there are no large studies looking at whether such low levels of sodium intake reduce the incidence of heart attacks and stroke.
Got any good ideas for higher sodium but healthy foods?