Good news! Cocoa really does help you live longer.
A study of elderly Dutch men indicates that eating or drinking cocoa is associated with lower blood pressure and a reduced risk of death, according to an article in the February 27 issue of Archives of Internal Medicine, one of the JAMA/Archives journals.
Cocoa has been linked to cardiovascular health benefits since at least the 18th century, but researchers are just beginning to collect scientific evidence for these claims, according to background information in the article. Cocoa is now known to contain chemicals called flavan-3-ols, which have been linked to lower blood pressure and improved function of the cells lining the blood vessels.
Brian Buijsse, M.Sc., of the National Institute for Public Health and the Environment, Bilthoven, the Netherlands, and colleagues examined cocoa's relationship to cardiovascular health in 470 Dutch men aged 65 to 84 years. The men underwent physical examinations and were interviewed about their dietary intake when they enrolled in the study in 1985 and at follow-up visits in 1990 and 1995. The researchers then placed them into three groups based on their level of cocoa consumption. Information about their subsequent illnesses and deaths was obtained from hospital or government data.
Over the next 15 years, men who consumed cocoa regularly had significantly lower blood pressure than those who did not. Over the course of the study, 314 men died, 152 due to cardiovascular diseases. Men in the group with the highest cocoa consumption were half as likely as the others to die from cardiovascular disease. Their risk remained lower even when considering other factors, such as weight, smoking habits, physical activity levels, calorie intake and alcohol consumption. The men who consumed more cocoa were also less likely to die of any cause.
Although blood pressure is usually linked with risk of cardiovascular death, that was not the case in this study. "The lower cardiovascular mortality risk associated with cocoa intake could not be attributed to the lower blood pressure observed with cocoa use," the authors write. "Our findings, therefore, suggest that the lower cardiovascular mortality risk related with cocoa intake is mediated by mechanisms other than lowering blood pressure." The benefits associated with flavan-3-ols may play a role.
Of course, some people are determined to believe that nothing you love could possibly be good for you. Cathy Ross, medical spokesperson for the British Heart Foundation, still discourages chocolate consumption.
"Cocoa is rarely tolerable in large amounts in its raw state and therefore to consume the suggested therapeutic amount you would have to have 100g of dark chocolate per day.
"This would mean an average intake of 500 calories per 100g and an average 30% of fat. Eating less did not produce the same effect.
"We are certainly not suggesting people never eat chocolate - everyone can enjoy a treat from time to time.
"But there are much better ways of improving your heart health."
If you are going to eat chocolate then the darker and the lower in sugar the better. I personally get semi-sweet cooking chocolate, though I'd like to find something lower in sugar and cocoa butter. To keep the calories down, as an alternative I've started eating cocoa powder straight on occasion. I might try it in apple sauce and eat it more regularly instead of dark chocolate.
Not all dark chocolates are the same. Processing removes some of the antioxidants. Mars dark chocolate retains more of the antioxidants than the average chocolate. But I'm not clear on how big the differences are between the various dark chocolates. I'd really like to find a cocoa powder that has had the fewest flavonoids and flavanols removed. Such a cocoa powder might taste more bitter, though.
What I want to know: Does cocoa keep down blood pressure by increasing nitric oxide synthesis? If so, it might be a mild aphrodisiac as well.
Researchers at MIT have developed a new type of lithium battery that could become a cheaper alternative to the batteries that now power hybrid electric cars.
Until now, lithium batteries have not had the rapid charging capability or safety level needed for use in cars. Hybrid cars now run on nickel metal hydride batteries, which power an electric motor and can rapidly recharge while the car is decelerating or standing still.
But lithium nickel manganese oxide, described in a paper to be published in Science on Feb. 17, could revolutionize the hybrid car industry -- a sector that has "enormous growth potential," says Gerbrand Ceder, MIT professor of materials science and engineering, who led the project.
"The writing is on the wall. It's clearly happening," said Ceder, who said that a couple of companies are already interested in licensing the new lithium battery technology.
Their success came from giving the material a more ordered crystalline structure.
Lithium ions carry the battery's charge, so to maximize the speed at which the battery can charge and discharge, the researchers designed and synthesized a material with a very ordered crystalline structure, allowing lithium ions to freely flow between the metal layers.
The battery still costs too much to manufacture.
A battery made from the new material can charge or discharge in about 10 minutes -- about 10 times faster than the unmodified lithium nickel manganese oxide. That brings it much closer to the timeframe needed for hybrid car batteries, Ceder said.
Before the material can be used commercially, the manufacturing process needs to be made less expensive, and a few other modifications will likely be necessary, Ceder said.
Unfortunately the press release provides no indication of how the storage density compares to the nickel metal hydride (NiMH) batteries currently used in cars.
Note that this MIT group is not the only team pursuing better lithium batteries for hybrids. President Bush's recent big speech on energy policy was made at Johnson Controls in Milwaukee. Well, Johnson Controls is pursuing lithium ion batteries for hybrid vehicles.
MILWAUKEE, WISCONSIN (September 28, 2005) – Johnson Controls today launched an advanced lithium-ion battery development laboratory in Milwaukee, to create advanced power-storage solutions for near-future, hybrid-electric vehicles (HEVs). The facility – located at the company’s Battery Technology Center – features a “dry room” and an array of highly specialized tools and equipment for designing, developing and testing power-storage and power-management concepts based on lithium-ion technology.
The new laboratory facility and development equipment were installed at a cost of approximately $4 million.
Johnson Controls, the world’s largest manufacturer of automotive original equipment and aftermarket batteries, has been at the forefront of research and development activities to create enhanced batteries for future-generation HEVs. The company operates battery technology centers in the United States and Europe.
For more than a decade, Johnson Controls has supplied nickel-metal-hydride batteries for hybrid-vehicle applications in Europe. The company believes lithium ion technology is likely to replace nickel-metal-hydride as the battery technology of choice in hybrid-electric and electric vehicles in the future.
With the big players in the auto industry all pursuing hybrid vehicle development, a lot of money is flowing into the development of better hybrid vehicle batteries. The shift to hybrids is driving battery technology advances that will eventually culminate in pure electric vehicles. We'll go to hybrids, then to pluggable hybrids, and then to pure electric vehicles.
My guess is that the money flowing through the auto industry for battery development means that batteries will be ready for storing solar electric power long before photovoltaics become cheap enough to provide a significant source of electricity that needs storage.
In a major step towards understanding prostate disease, Melbourne scientists have grown a human prostate from embryonic stem cells.
A study published in the March edition of Nature Methods describes how human embryonic stem cells were developed into human prostate tissue equivalent to that found in a young man, in just 12 weeks.
Hey, I want to grow a new prostate and replace mine before mine gets old enough to cause really serious problems. The idea of getting youthful replacement parts becomes more appealing with every passing year.
We should be able to get replacement parts that last longer than the originals. Not every man gets BPH or prostate cancer. Once DNA sequencing is cheap and large numbers of people have their medical histories compared with their DNA sequences, scientists will identify large numbers of genetic variations that contribute to disease risk. My guess is DNA sequencing will get cheap before replacement organs become feasible, so we'll know what to change. One's own DNA might still be used for making replacement parts (which reduces immuno-rejection problems). But gene therapy to patch our DNA (rather like software patches) with the best sequences for reducing specific disease risks would make each replacement part last longer.
For making organs last longer there's a step beyond using the best existing genetic variations: develop genetic sequences that are better than any that have evolved naturally. For example, move all the genes now found in the mitochondria (whose genome has about 16,500 genetic letters of code) into the nucleus, where they will be better protected from free radical damage.
The researchers expect a more immediate benefit from their work in the form of prostate cells that can be studied to show how prostates deteriorate and become diseased.
The study was co-authored by Dr Renea Taylor from Monash University's Immunology and Stem Cell Laboratories, PhD researcher Ms Prue Cowin from the Monash Institute of Medical Research and other Australian and US researchers.
Dr Taylor said the discovery would allow scientists to monitor the progression of the prostate from a normal to a diseased state.
"We need to study healthy prostate tissue from 15-25 year old men to track this process," she said. "Understandably, there is a lack of access to samples from men in this age group, so to have found a way we can have an ongoing supply of prostate tissue is a significant milestone.
"As nearly every man will experience a problem with their prostate, we're very excited about the impact our research will have."
Although prostate cancer is the most common cancer in men, the impact of benign prostatic hyperplasia (BPH) is equally significant - up to 90 percent of men will have BPH by the time they're 80. BPH is not usually life-threatening, but it has a dramatic impact on quality of life.
My guess is that a lot of men with ambivalent feelings about embryonic stem cells who have benign prostatic hyperplasia (i.e. difficulty urinating, which also increases the odds of kidney failure, by the way) are going to resolve those ambivalent feelings in favor of embryonic stem cells if they are offered replacement prostates that solve their problem.
The cells were planted into mice after being treated in ways that instructed the cells to become prostate tissue.
"We grew the prostate tissue by 'telling' the embryonic stem cells how to become a human prostate gland. We then implanted the cells into mice, where they developed into a human prostate, secreting hormones and PSA; the substance in the blood used to diagnose prostate disease,'' Ms Cowin said.
We really need the ability to grow whole replacement organs. These researchers have taken a big step in that direction. But additional problems must be solved to grow organs in the right three-dimensional shape, and ideally to do that in a human body. Mice are obviously too small to grow replacement organs for humans, and they present problems with disease transfer as well.
An ICM poll of Brits for The Guardian appears to show a widespread willingness to sacrifice and reduce energy consumption to prevent global warming.
About a third of the UK's greenhouse gas pollution comes from domestic heating, and the poll reveals that people would be willing to spend an average of £331 to make their homes more environmentally friendly, even if the move brought them no direct cost saving. Only 16% said they would not pay anything, with 32% willing to invest over £100 and 8% more than £1,000. More than half (51%) said they or their family had boycotted a company because its products damage the environment.
Excuse me for asking a rude question, but if these folks are so willing to spend for the environment why haven't they already done so? How can they be willing to spend an average of £331 (about $577)? They've had plenty of time (years, decades) to do that spending already. I doubt they are spending £331 per year on home insulation and just haven't gotten around to making their 2006 expenditure on the next triple-paned argon-filled window for another room in the house.
The poll suggests that voters do not share the prime minister's view that policies to drive the economy forward should take precedence over those to address climate change. Asked which two areas should be priorities for the government, 28% highlighted action to tackle climate change and 16% wanted the economy to grow faster. The signal from those aged 18-24 was clearer: 35% picked climate change and 9% the economy.
People want lots of stuff done by other people which they do not have to pay for themselves.
Oh, but look at what sacrifices we are making.
Some 82% of households said they had turned the central heating down, 75% had installed low energy lightbulbs, 25% had cycled at least one journey instead of using the car and 24% said they had decided against a holiday that involved flying.
Did they cycle for the exercise? To lose weight? To see the scenery? Or maybe to save money for that trip to Thailand next winter?
Do these poll results represent a strong willingness on the part of the British people to sacrifice for the environment? No, of course not. How did British aircraft emissions rise by 12% in one year if the British people are willing to curb their own fossil fuel consumption? Foreign tourists? I doubt it.
The text of the draft "open skies" treaty, obtained by the Guardian, is likely to alarm environmental activists who argue that the seemingly unstoppable growth in air travel is among the main contributory factors to global warming. Aviation emissions rose by 12% last year and now account for about 11% of Britain's total greenhouse gas emissions - the fastest growing sector. The government's chief scientific adviser, Sir David King, has described global warming as a bigger threat to the world than global terrorism.
Who is using all that aviation fuel? Basically anyone who can afford it. Why does the US use more fuel per capita than Europe? Higher average per capita GDP. More people can afford more airplane trips, bigger houses, bigger cars.
I'd love to see installation of R80-level building insulation become enviro-chic the way Toyota Priuses are in my neighborhood. But the Priuses are just so much more visible as a way to make a statement. No one can see your insulation, and few will notice whether your window panes have fresh putty to prevent air leaks. Besides, the money they save on fuel helps to pay for heating the hot tub, and Priuses are cheaper than a big SUV.
Update: Governments that want to encourage conservation are missing the boat by not promoting improved building efficiency more loudly. Trying to force people into smaller vehicles runs up against the human desire to live the high life. But improved building efficiency doesn't face the obstacle of conflicting with basic human desires.
Governments could make building efficiency differences more visible in the market through inspections when houses are sold, where each house would get rated for efficiency. Also, for new house construction, in addition to a basic minimum ordinance for insulation, local governments could adopt a standard for rating buildings on scales that quantify how much a house exceeds the minimum. Then when houses go up for sale the energy efficiency of a house could be a selling point, and the market would reward more efficient houses with higher valuations. Market incentives would produce more efficient housing.
The latest 454 Life Sciences DNA sequencer might lower the cost of sequencing a complete human genome to below $10 million.
454’s Genome Sequencer 20 ($500,000) uses an on-bead sequencing-by-synthesis approach to generate some 40 million bases of raw data per four-hour run, meaning I could squeeze out a human genome in just under a month. With per-run reagent costs of $6,000, my genome would cost a mere $900,000.
But that’s just one sequence pass, and according to vice president for molecular biology Michael Egholm, “It’s simply ludicrous to say you can sequence a human genome with 1x coverage.” He suggests 8x or 15x coverage, which would boost my costs to between $7.2 million and $13.5 million, and increase my sequencing time to about a year.
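The quoted figures hang together if you do the arithmetic, but only if the article is counting the full diploid genome of roughly 6 billion bases rather than the haploid 3 billion - an assumption on my part, chosen because it reproduces the quoted numbers:

```python
# Rough check of the 454 sequencing arithmetic quoted above.
# Assumption (mine): costs are figured against the full diploid
# genome, ~6 billion bases, which reproduces the quoted numbers.
GENOME_BASES = 6e9
BASES_PER_RUN = 40e6   # raw bases per four-hour run
COST_PER_RUN = 6_000   # reagent cost per run, in dollars
HOURS_PER_RUN = 4

runs_per_pass = GENOME_BASES / BASES_PER_RUN        # 150 runs
cost_per_pass = runs_per_pass * COST_PER_RUN        # $900,000 for one pass
days_per_pass = runs_per_pass * HOURS_PER_RUN / 24  # 25 days, "just under a month"

for coverage in (8, 15):
    print(f"{coverage}x coverage: ${coverage * cost_per_pass / 1e6:.1f} million")
```

This reproduces the $7.2 million to $13.5 million range Egholm gives for 8x to 15x coverage.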
The whole article is worth reading and covers some of the academic and commercial efforts to drive down the cost of DNA sequencing.
While some rich people could afford to get their DNA sequenced now, they'd derive very little benefit from doing so. The meaning of only a small number of genetic variations is known, and those can be tested for individually.
Church knew that a key to making gene sequencing fast and affordable lay in miniaturizing the process. He coats a slide with millions of microscopic beads, each impregnated with chemicals that light up when exposed to DNA base pairs. A digital camera fitted to a microscope photographs the pattern, and software decodes the results. His process is more than 250 times faster than conventional technology. In short, rather than take seven years to sequence the human genome, Church's machines can theoretically do it in less than a week. He says "theoretically" because he and his students have only decoded the DNA of E. coli, which is 1/1000th the size of the human genome. Based on his current costs, he thinks he could decode a human genome for about $2.2 million.
On Church's $2.2 million estimate see my August 2005 post "Harvard Group Lowers DNA Sequencing Cost Order Of Magnitude".
The PGP is an offshoot of the Human Genome Project, the massive government effort to read and put in proper sequence all 3 billion bits of human DNA. The project was completed in 2003 at about $3 billion - about $1 for each of the tiny chemical units, called bases, that make up the human genome.
Since then, better technology and greater efficiency have brought down the cost to $10 million - less than a penny per base - for a complete DNA sequence, according to Jeffery Schloss, the director of technology development at the National Human Genome Research Institute, a federal agency in Bethesda, Md.
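The per-base arithmetic in those two paragraphs checks out, using the haploid genome size of 3 billion bases:

```python
# Cost per base, before and after, using the figures quoted above.
GENOME_BASES = 3e9  # haploid human genome

original = 3e9 / GENOME_BASES   # Human Genome Project: $3 billion total
current = 10e6 / GENOME_BASES   # complete sequence today: $10 million

print(f"original: ${original:.2f} per base")            # $1.00 per base
print(f"current: {current * 100:.2f} cents per base")   # about a third of a cent
```

So "less than a penny per base" is actually about a third of a cent.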
The institute is financing a campaign to cut the cost of sequencing a genome to $10,000 by 2009 and drive it all the way down to $1,000 by 2014. An affordable $1,000 genome is biology’s next dream.
The cost of sequencing fell throughout the first sequencing of the human genome, so most of the $3 billion total reflects a per-base cost higher than the cost that had been achieved by the time the sequence was completed.
Church is trying to recruit people into his Personal Genome Project at the Harvard Medical School where the recruits would allow comparison of their DNA sequences with lots of other information about them. We need this sort of research and on a massive scale with tens of thousands or even hundreds of thousands of enrollees. People should be enrolled long before DNA sequencing becomes cheap because some connections between genes and other characteristics will require longitudinal studies (i.e. studies that follow people for years and even decades).
Update: I see that we are on the cusp of a big change as a result of dropping DNA sequencing costs. 10 or 15 years from now people will use genetic analyses to formulate custom diet advice (nutritional genomics) to reduce the risk of diseases or to choose the best diet for losing weight or putting on muscle. They will surreptitiously get DNA samples from their romantic interests to decide whether the other person has a genetic profile good enough to warrant trying to marry (genetic tendency to cheat? intelligence? disease risks? violent tendencies? genetic tendencies toward laziness or conscientiousness?). Surreptitious DNA sample collection will find use for other purposes too. Some sharp smaller employers will collect DNA from job interviewees (e.g. from a coffee cup) to analyze it for personality tendencies and approximate level of intelligence.
Drugs will get developed for specific genetic profiles. Some will get preventative genetic therapies based on their genetic profiles to avoid diseases before they get sick.
Of course, mate selection is a pretty slow way to get the genes that you desire for your offspring and most will not be able to secure genetically ideal mates. More women will turn toward using sperm donors to get exactly what they want. Eventually gene therapies on sperm, eggs, and embryos will replace much of the coming increased use of sperm donors. But my guess is there will be a 5 to 15 year period during which use of sperm donors will soar before gene therapies provide better alternatives.
US President George W. Bush recently spoke at a Johnson Controls battery technology development facility in Milwaukee, Wisconsin, and he is showing signs of taking seriously his stated support for accelerating the development of new energy technologies across a wide range of fields (which is not to say I agree with him on all points).
Secondly, government can help. Government provides about a third of the dollars for research and development. Two-thirds come from the private sector, one-third comes from the government. And so I propose to double the federal commitment to the most critical basic research programs in the physical sciences over the next decade.
He is starting to put some money where his mouth is.
Now, I laid out what's called an Advanced Energy Initiative. And a cornerstone of the initiative is a 22 percent increase in funding for clean energy research at the Department of Energy. And it's got two major goals, or two objectives. First, to transform the way we power our cars and trucks. And, secondly, to transform the way we power our homes and offices.
We can't move much of the car market to hybrids until battery costs fall and battery storage capacities rise. But big improvements in building energy efficiency are achievable using already existing technologies. I think improvements in building efficiency are easier to achieve than improvements in vehicle efficiency because people resist riding in smaller cars with less powerful engines, so we need new tech to improve vehicle efficiency. Better building insulation does not clash as much with lifestyle desires. Though some people want lots of windows, and that does reduce the level of insulation in buildings, there's a lot of room for building efficiency improvements.
So let me talk to you about the first one. Our nation is on the threshold of some new energy technologies that I think will startle the American people. It's not going to startle you here at Johnson Controls because you know what I'm talking about. (Laughter.) You take it for granted. But the American people will be amazed at how far our technology has advanced in order to meet an important goal, which is to reduce our imports from the Middle East by 75 percent by 2025, and eventually getting rid of our dependence totally.
The first objective is to change the way we power our cars and trucks. Today's cars and trucks are fueled almost exclusively by gasoline and diesel fuel, which, of course, comes from oil. To transform the way we power the vehicles, we have got to diversify away from oil. I just gave you a reason from a national security perspective, as well as economic security perspective why reliance upon oil is not good for the United States.
And so here are three ways that we can do that, change our reliance from oil. First, invest in new kinds of vehicles that require much less gasoline. It's a practical thing to do. Secondly, find new fuels that will replace gasoline and, therefore, dependence on oil. And, finally, develop new ways to run a car without gasoline at all.
The most promising ways to reduce gasoline consumption quickly is through hybrid vehicles. Hybrid vehicles have both a gasoline-powered engine and an electric battery based on technologies that were developed by the Department of Energy. In other words, this technology came to be because the federal government made a research commitment. That's why I think it's double -- important to double research as we go down the next decade. The gasoline engine charges the battery, which helps drive the vehicle. And the twin sources of power allow hybrid cars and trucks to travel about twice as far on a gallon of fuel as gasoline-only vehicles. That is a good start when something that can go twice as far on a gallon of gasoline than the conventional vehicle can.
Bush is getting over the original obsession of his Administration on hydrogen and seems to be realizing that development of better batteries is a highly desirable and achievable goal. Well, better that political leaders learn late than never.
Bush even seems to be aware that switch grass would be better than corn as a biomass source of energy. We need better technology for converting the cellulose in the switch grass into more usable sugars. But that's a solvable problem.
Now, we're on the edge of advancing additional ethanol production. New technology is going to make it possible to produce ethanol from wood chips and stalks and switch grass, and other natural materials. Researchers at the Energy Department tell me we're five or six years away from breakthroughs in being able to produce fuels from those waste products. In other words, we're beginning to -- we're coming up with a way to make something out of nothing. And this is important because it's -- economics are such that it's important to have your ethanol-producing factories or plants close to where the product is grown.
That's why E85 has spread throughout the Midwest, that's where you're growing the corn. Pretty soon, you know, if you're able to grow switch grass and convert that into ethanol, then you're going to have availability for ethanol in other parts of the country.
E85 from corn ethanol is spreading because politicians subsidize it. Corn ethanol is not going to solve our transportation energy problem. Corn would be better used for heating. But corn doesn't scale. It requires too much land in order to make a serious dent in energy needs.
This is a long speech on energy and I'm skipping over his coal and nuclear comments. In a nutshell, he's for development of cleaner ways to burn coal. But he's not forcing coal burners to rapidly clean up their acts since that'd cost real money. He's also for a resumption of nuclear power plant construction in a big way.
He is for solar.
Another electricity source with enormous potential is solar power. Today Americans use small amounts of solar power, mainly to heat water or to power small consumer products like outdoor lights. After spending some time with you all here, I'm going over to Michigan to go to a company that manufactures thin film, photovoltaic cells. That's kind of a fancy word for cells that can generate electricity directly from sunlight.
The technology -- solar technology has the potential to change the way we live and work, if you really think about it. For example, roof makers will one day be able to create a solar roof that protects you from the elements and, at the same time, powers your house. And that's what these folks are working on.
The vision is this: that you will have -- that the technology will become so efficient that you'll become a little power generator in your home, and that if you don't use the energy you generate, you'll be able to feed it back into the electricity grid. The whole purpose of spending money on solar power -- and we intend to spend $150 million next year in funding for both government and private research -- is to bring to market as quickly as possible this important and impressive technology. It's really going to help change the way we live, we think, and we want solar power to become competitive by 2015.
The $150 million per year for solar is chump change. Some estimates place the wasteful corn ethanol subsidy at $3 billion per year. 20 years from now I bet we will be getting 10 or 20 or more times as much power from photovoltaics as from corn ethanol.
He's also for wind. Go read the full speech if you are interested.
Update: I do not think Bush's recent speeches on energy are a huge step forward. A huge step forward would put a couple billion dollars a year into solar research, a couple billion into batteries, maybe a billion into accelerating pebble bed nuclear reactors or other advanced reactor concepts, and still other initiatives. These initiatives should be on a scale similar to the corn ethanol boondoggle but in productive directions rather than aimed at satisfying farmers and Archer Daniels Midland.
A good step forward for federal energy policy would include an initiative to make all new and existing federal government buildings extremely well insulated and energy efficient. This is called "leading by example". Bush could also call on local governments to raise standards for insulation in building codes. Also, federal education funding could be diverted toward insulating schools and installing passive solar water and space heating systems. The money spent that way would cut fuel bills and make more local money available for education.
Still, I think it very helpful for Bush to state that elimination of US energy imports from the Middle East is a desirable goal and that energy is a national security issue. Yet here's the bottom line: Bush's actual policies on energy fall far short of his rhetoric on energy. The US government could very productively spend several billions a year more on a large range of energy research initiatives. The federal government could also lead by example and implement a lot of conservation measures for buildings and its vehicle fleet using existing technology.
Conscious analysis of problems works best for simpler problems. But for more complex problems it may be best to absorb the facts and then distract your conscious mind while giving the subconscious time to work out the best choice.
One group was given four minutes to pick a favourite car from a list having weighed up four attributes including fuel consumption and legroom.
The other group was given a series of puzzles to keep their conscious selves busy before making a decision.
The conscious thought group managed to pick the best car based on four aspects around 55% of the time, while the unconscious thought group only chose the right one 40% of the time.
But when the experiment was made more complex by bringing in 12 attributes to weigh up, the conscious thought group's success rate fell to around 23% as opposed to nearly 60% for the unconscious thought group.
Instead, the scientists conclude, the best strategy is to gather all of the relevant information -- such as the price, the number of bathrooms, the age of the roof -- and then put the decision out of mind for a while.
Then, when the time comes to decide, go with what feels right. ''It is much better to follow your gut," said Ap Dijksterhuis, a professor of psychology at the University of Amsterdam, who led the research.
For relatively simple decisions, he said, it is better to use the rational approach. But the conscious mind can consider only a few facts at a time. And so with complex decisions, he said, the unconscious appears to do a better job of weighing the factors and arriving at a sound conclusion.
Dijksterhuis and his team also propose that, although we are unaware of it, our brains are churning through the mass of information involved in a complex decision and sifting out the best option.
The study ties in with a growing trend in psychology research over the past 15 years, suggesting that our unconscious mind is more important than we once thought. "A lot of complicated processes occur without our being aware of it," says Daniel Kahneman, an authority on decision making at Princeton University, New Jersey.
I wonder whether people with higher levels of intelligence have higher thresholds of complexity of problems before it makes sense to let their subconscious handle a problem.
Prof Dijksterhuis said: "Your brain is capable of juggling lots of facts and possibilities at the same time when you let it work without specifically thinking about the decision. But when you are specifically thinking about a problem, your brain isn't able to weigh up as much information. I sit on things and rely on my gut."
Whether the conscious mind does best will also depend on the nature of the problem. For example, I doubt the subconscious can compete with the conscious mind when a problem requires mathematical analysis.
As previously reported here, the news is once again full of reports that Kyoto signatories are not meeting their emissions reduction targets. Greenhouse gas emissions are rising in Britain, and hitting Kyoto emissions reduction targets is looking less likely.
A DTI spokeswoman said the UK's total carbon dioxide emissions, including the contribution from homes, cars and air travel, was now expected to total some 529 million tons by 2010.
That is 10.6 per cent below the 1990 level. But compared with the Government's own target of a 20 per cent cut - or even the 12 per cent reduction required to meet Kyoto - the UK is falling short.
In 2004, the projection for total CO2 emissions in 2010 was 518 million tons, suggesting the UK is getting further and further away from meeting its targets.
The Brits would probably have to stop their economy from growing if they wanted to meet that target. Either that or they'd have to build a large number of nuclear power plants and wind farms in a hurry.
In the U.S., figures released by the Energy Information Administration at the end of 2004 showed that emissions had risen by 13.4 percent from 1990 levels.
But according to 2003 figures cited by Friends of the Earth Europe this week, some countries which, unlike the U.S., do have legally binding Kyoto targets are doing as badly, or even worse.
For instance, Austria was set a Kyoto target of -13 percent, but emissions are running at +16.6 percent. Italy's target was -6.5 percent, and its actual emissions are +11.6 percent. Others that are off target include Belgium, the Netherlands and Spain, while France, Britain and Germany are nearer to being on track.
Compared to the aggregate -8 percent target for the E.U.'s then 15 member states, the actual figure is -1.7 percent.
"If current trends continue, Europe will not meet its Kyoto target," the green group said, adding that "if emission levels continue to develop as they did over the last three years, the [15 E.U. members'] emissions in 2010 will be +2.8 percent above of what they were in 1990."
The cost of emissions reduction rises with each additional step. The easier reductions come first. I expect bigger industry and consumer opposition for each additional attempt to cut emissions. Governments will favor imposing more regulations on industry than on consumers since the owners of capital are a much smaller number of voters than the employees. But the cost per quantity of emissions reduction is probably cheaper in the home than in the workplace.
Like other industrialized countries, Japan has committed itself to reducing its carbon emissions substantially by the year 2010 - in Japan's case to 6% less than 1990 levels.
But despite its good intentions, Japan's performance has been embarrassingly weak - carbon emissions have actually increased by nearly 8%.
Japan's problem is that it has already tried hard to make its economy energy efficient, and so the Japanese have already adopted the energy-efficient practices and technologies that were relatively cheap to put in place.
One of Japan's difficulties is that it was already very energy-efficient at the time of the Kyoto treaty. The country has few natural energy sources of its own, making its vital manufacturing industries highly dependent on imported fuel.
So when the two oil shocks of the 1970s pushed up prices, Japan set about using its technological ingenuity to cut down on its fuel consumption.
There was a huge investment in nuclear power stations; Japan relies on nuclear power for one third of its electricity production.
That's the problem with conservation. It does not serve as a substitute for replacing fossil fuels with non-fossil fuels and there's a limit to how energy efficient an economy can become without a big hit in living standards. See my previous post on just how far the Japanese have already gone in shifting toward adopting conservation technologies.
Prices now average $5.63 to $5.90 a ton, according to the World Bank and think tanks. Futures for 2010 worldwide are averaging between $10.96 and $23.30, with highs above $30.
Estimates by the World Bank and private think tanks say Western Europe, Japan and Canada together may need somewhere between 2.5 billion and 3.0 billion tons of credits in the five years through 2012 to meet their commitments under the Kyoto Protocol. That comes to between 300 million and 800 million tons per year.
"Even 40 percent of the amount needed is going to be hard to reach," said Hitoshi Kurihara, manager of the public-private emissions investment team Japan Carbon Finance.
He estimates that the Kyoto Protocol could cost Japan as much as 2 trillion yen in carbon credits.
If the cost goes back up to $30 per ton and the needed tons goes to 800 million per year that'd be $24 billion per year. That's not a large figure compared to the sizes of the national economies in question.
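That upper-bound figure can be checked with the numbers quoted above (the $30/ton futures high and the 800-million-ton annual credit need):

```python
# Upper-bound annual cost of Kyoto carbon credits, using the figures cited above.
price_per_ton = 30.0      # dollars per ton, high end of 2010 futures
tons_per_year = 800e6     # high end of the estimated annual credit need

annual_cost = price_per_ton * tons_per_year
print(f"${annual_cost / 1e9:.0f} billion per year")  # $24 billion per year
```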
The US economy has also become much more energy efficient in the last 15 years. But economic growth was faster than the rate of increase in energy efficiency in the US.
Although overall net GHG emissions have increased more than 20% during the last 15 years, the economy as represented by the GDP grew 46%.
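Those two growth figures imply a sizable drop in emissions per dollar of output. A quick sketch, taking the quoted +20% emissions and +46% GDP growth at face value:

```python
# Implied change in US emissions intensity (emissions per dollar of GDP)
# from the figures quoted above: GHG emissions +20%, GDP +46% over 15 years.
emissions_growth = 1.20
gdp_growth = 1.46

intensity_change = emissions_growth / gdp_growth - 1
print(f"Emissions per dollar of GDP: {intensity_change:.1%}")  # roughly -17.8%
```

So even with total emissions rising, the economy got meaningfully less carbon-intensive per unit of output.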
Still, there's one area where the US could do much more to make the US economy more energy efficient: Building codes. Require new construction to be built to higher insulation standards. That'd be a pretty cheap approach to take to increase energy efficiency.
Also, I favor making building energy efficiency a standard part of reports provided to home and commercial building buyers. Buildings could get constructed to some minimum level of efficiency. But an efficiency rating system could capture information about ways that a builder exceeded the regulatory minimum so that this information could be used as part of advertisements when buildings go on the market. Why not be able to easily find out that, say, a building's walls and roof have R70 insulation or triple-paned argon glass windows or an orientation that increases solar heating?
Canada has consistently failed to meet its Kyoto targets and currently exceeds greenhouse gas emission targets by about 25 per cent.
Greenie commentators in many of the Kyoto Accord signatories point to the United States as irresponsible for failing to join in the Kyoto restrictions on CO2 and other greenhouse gases. But the Kyoto signatories are mostly failing to reach their emissions targets and some like Canada (whose press has conditioned much of its populace to look down on the United States) have done next to nothing to meet their treaty obligations. I see this moral posturing as a distraction from the discussion that really ought to take place: How best to accelerate scientific and technological developments in order to technologically obsolesce fossil fuels?
Canada's failure to meet its Kyoto limits flows in large part from a healthy rate of economic growth. Whereas Japan's smaller failure is a reflection of both a slower rate of economic growth and previous efforts to increase energy efficiency. The United States has performed similarly to Canada partly due to immigration driving up the size of the US population (more people consuming energy at US per capita rates of energy consumption) and faster economic growth than Europe.
What would be far more interesting than these country comparisons versus their Kyoto percentage reductions and increases since 1990 would be time graphs showing ratios of energy usage to per capita GDP adjusted to purchasing power parity (PPP).
A U.S. focus on developing cleaner technologies for the future was not enough to tackle the immediate threat from global warming, EU Environment Commissioner Stavros Dimas said.
"The U.S. still thinks that technology will find the answer," Dimas said, "but we know we need reductions" in fuel emissions.
Planet Earth to Stavros Dimas: carbon dioxide emissions will not decline until technological advances provide ways to more cheaply make non-fossil fuel energy. Technology is the answer. The political will does not exist - even in the Kyoto signatory nations - to pay a big price to reduce CO2 emissions. In late 2005 Tony Blair finally admitted that countries will not pay a high price for emissions reductions aimed at preventing global warming.
The second thing, though, is that I think – and I would say probably I’m changing my thinking about this in the past two or three years. I think if we are going to get action on this, we have got to start from the brutal honesty about the politics of how we deal with it. The truth is no country is going to cut its growth or consumption substantially in the light of a long-term environmental problem. What countries are prepared to do is to try to work together cooperatively to deal with this problem in a way that allows us to develop the science and technology in a beneficial way.
He's a slow learner. But at least he learned. Some are still in denial about the obvious.
Eyeing a successor treaty to the Kyoto Protocol, due to expire in 2012, Blair said despite U.S. concerns, there would have to be more decisive action to cut emissions.
"In my view, this can only be done if you have a framework that in the end has targets within it," he told a committee of senior parliamentarians. "If you don't get to that point...the danger is you never have the right incentives to invest heavily in clean technology."
U.S. President George W. Bush has rejected a targets-based approach in favor of developing clean technologies to curb greenhouse emissions. Remarks by Blair last year were interpreted as a sign he was moving toward the U.S. position, but Tuesday's comments are an apparent reassertion of his commitment to the Kyoto Protocol and a successor treaty.
China alone was responsible for almost half of the growth in greenhouse gas emissions in 2005. Probably it will account for more than half of that growth in 2006 and an increasing share in future years. China is not going to sign up for emissions reductions. Only technologies that obsolesce fossil fuels will cut Chinese emissions.
What Stavros Dimas ought to say is that "if the Bush Administration thinks that technological advances are the answer then why is the Bush Administration doing so little to accelerate the rate of advance in energy technologies?" George W. Bush doesn't walk the talk. Dimas ought to call him on it. Similarly, Tony Blair ought to say "President Bush, I agree with your stated approach to energy. So why aren't you implementing it as policy?"
I don't know whether global warming will happen, to what degree, or with what trade-offs in costs and benefits (and there would certainly be benefits such as longer growing seasons and milder winters in the colder regions). But I continue to be interested in phasing out fossil fuels because we have other compelling reasons to develop replacements for fossil fuels. One such compelling reason is that fossil fuels produce conventional pollutants that hurt us down here on the ground much sooner than the theorized global warming. Another reason is that money sent to the Middle East causes problems for the rest of the world with Islam and terrorism and increases defense and security costs. Another reason for the United States especially is that imported oil is making a huge trade deficit even worse (now at 5.8 percent of US GDP).
Last but not least, cheaper alternatives would really be cheaper. Lower costs from the development of better energy technologies would enable much more rapid economic growth and higher living standards the world over.
There are 34 new plants under construction, according to the Renewable Fuels Association, an industry trade group in the District. Eight of the 95 existing plants are expanding. And 150 more new plants or expansions are in the planning stages.
The ethanol market is a product of politics. The US federal government is the cause of the increased demand for ethanol.
The ethanol industry also is being boosted by federal energy legislation approved last year that requires an increasing amount of the additive to be used.
Ethanol's subsidy from the US federal government might be about $0.75 per gallon.
Some studies peg the federal ethanol subsidy to producers at $3 billion per year.
The United States last year consumed an estimated 4 billion gallons of ethanol, compared with 140 billion gallons of gasoline.
In spite of federal subsidies and the high price of oil, E85 fuel (85% ethanol, 15% gasoline) costs more per gallon and carries cars fewer miles.
Plus, stations charge more for E85 than gasoline, even though it carries cars fewer miles.
Corn as the great liquid fuel alternative still doesn't seem convincing to me.
The March natural gas contract gained 6.8¢ to $7.13/MMbtu after falling for more than a week to a 7-month low.
To put that in perspective the spot price of natural gas hit a peak of over $15/MMbtu in late 2005. But natural gas was about a dollar cheaper a year ago than it is today. Note that in many areas of the world where natural gas is produced it is much cheaper. The US would have lower natural gas prices if it had more liquified natural gas (LNG) terminals. But local opposition to LNG terminals keeps US prices well above world market prices. Declining US production and delays in LNG terminal construction strike me as reasons to expect continued high natural gas prices.
So how does this lower price for natural gas affect natural gas's competitiveness with corn? Corn is around $2 per bushel though it might drop lower.
The projected 2005/06 price range for corn is $1.60 to $2.00 per bushel, down 5 cents on each end from last month, compared with $2.06 for 2004/05.
Dennis Buffington's Energy Strategies website puts the useful energy in corn at 6,808 BTU per pound and 56 pounds per bushel. So from that one would expect 147 pounds of corn to be the equivalent energy of 1 million BTU of natural gas. But Buffington also states that 170 pounds of corn has energy equivalent to 1 million BTU of natural gas. My guess is he might be accounting for burning efficiency.
Taking the 170 pounds of corn figure for 1 million BTUs and dividing by 56 pounds per bushel times $2 per bushel one gets $6.07 per million BTUs (MMbtu) for corn. Natural gas at $7.13 is not that much more expensive. But if corn fell to $1.60 per bushel then it would cost $4.86/MMbtu.
Corn as a heat source is a lot more compelling if your only fossil fuel alternative is oil. A gallon of #2 heating oil (basically diesel) has the energy equivalent of 22 pounds of corn. So a $2 bushel of corn has the heat content of 2.55 gallons of heating oil.
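The arithmetic above can be checked directly, using only the figures cited here (Buffington's 6,808 BTU/lb and efficiency-adjusted 170 lb/MMBtu, 56 lb per bushel, 22 lb of corn per gallon of #2 heating oil):

```python
# Cost of heat from corn, using the figures cited above.
LBS_PER_BUSHEL = 56
LBS_PER_MMBTU = 170       # Buffington's efficiency-adjusted figure
BTU_PER_LB = 6808         # raw useful energy in corn

def corn_cost_per_mmbtu(price_per_bushel):
    """Dollars per million BTU of corn heat at a given corn price."""
    bushels_needed = LBS_PER_MMBTU / LBS_PER_BUSHEL
    return bushels_needed * price_per_bushel

# Raw (pre-efficiency) pounds of corn per million BTU:
print(f"{1e6 / BTU_PER_LB:.0f} lb of corn per MMBtu, raw")        # 147 lb

print(f"${corn_cost_per_mmbtu(2.00):.2f}/MMBtu at $2.00 corn")    # $6.07
print(f"${corn_cost_per_mmbtu(1.60):.2f}/MMBtu at $1.60 corn")    # $4.86

# Heating-oil equivalence: 22 lb of corn per gallon of #2 heating oil.
gallons_per_bushel = LBS_PER_BUSHEL / 22
print(f"{gallons_per_bushel:.2f} gallons of heating oil per bushel")  # 2.55
```

At $2 corn, the $6.07/MMBtu result sits just under the $7.13/MMBtu natural gas price quoted above, which is why the comparison is so close.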
Corn production costs will fall in the future as agricultural technology advances. But what will happen to natural gas prices? Corn's price probably has less upside risk. For someone choosing between natural gas and corn as a heating energy source for a new building, if the corn feeder can be built large enough to allow infrequent corn deliveries, corn might be the more economic choice.
But can corn make much of a dent in satisfying US energy needs? In the comments section of a previous post I estimated that if yield per acre could be maintained then it would take 36% of the US land mass to produce enough energy from corn to replace US consumption of oil and natural gas. That rough calculation ignored energy conversion losses to make ethanol. The calculation also ignored the fact that corn can not grow with as high a yield per acre in the areas where it is not grown. In some areas it can't be grown at all (e.g. where would the water come from?). Plus, what about nature? Massive biomass production would destroy large areas of habitats. Corn for biomass energy can not scale up to become a big energy source.
My take on corn: For individuals looking to switch away from expensive heating oil, if you can solve the corn delivery problem to your satisfaction, then corn heat will cost less. Comparing corn to natural gas as a heat source, the choice is less clear.
At the level of national energy policy corn has at best a small role to play. Corn for liquid transportation fuel is a politically driven mistake. If corn must be used for political reasons then better to promote it as a heat source.
My fear with corn is that biotechnology will so lower the cost of corn production that a big shift from natural gas to corn will result in large scale habitat destruction as more land gets shifted to corn production. I'd prefer cost breakthroughs in nuclear and solar energy as more environmentally agreeable energy solutions.
Cassman told the Nebraska Ethanol Board that, when considering the 11 ethanol plants in production, seven plants that will be producing by 2007, and five plants that are in the planning stages, 1.31 billion gallons of ethanol could be produced in Nebraska.
That scale of production would use 580 million bushels of corn, which is only 50 percent of Nebraska's total corn crop, Cassman said.
580 million bushels of corn will be used to produce 1.31 billion gallons of ethanol. That's a ratio of 2.26 gallons per bushel. Scale that up to the entire 11 billion bushel per year US corn crop and dedicate it all to ethanol and the result would be only 24.85 billion gallons of ethanol. But ethanol has only 67.5% as much energy per gallon as gasoline. So all the US corn production diverted to ethanol would yield the equivalent of only 16.77 billion gallons of gasoline as compared to the 140 billion gallons consumed per year. But corn production uses energy. So the picture for corn ethanol is even worse once energy inputs are considered.
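Here is that scaling calculation spelled out, using the Nebraska figures and the 67.5% energy ratio cited above:

```python
# Scaling Nebraska's ethanol yield to the entire US corn crop,
# using the figures cited above.
GALLONS_PER_BUSHEL = 1.31e9 / 580e6   # ~2.26 gallons of ethanol per bushel
US_CORN_CROP = 11e9                   # bushels of corn per year
ETHANOL_ENERGY_RATIO = 0.675          # ethanol's energy per gallon vs gasoline

ethanol_gallons = US_CORN_CROP * GALLONS_PER_BUSHEL
gasoline_equiv = ethanol_gallons * ETHANOL_ENERGY_RATIO

print(f"{ethanol_gallons / 1e9:.2f} billion gallons of ethanol")     # ~24.84
print(f"{gasoline_equiv / 1e9:.2f} billion gallons gasoline-equiv")  # ~16.77
print("vs. 140 billion gallons of gasoline consumed per year")
```

So even before subtracting the energy inputs to grow and distill the corn, the entire crop would displace only about 12% of US gasoline consumption.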
Out of about 11 billion bushels of corn grown in the United States per year Nebraska grows over 11 percent of it.
The 2005 Nebraska corn crop is the second-largest on record, according to the release. It was 1.27 billion bushels.
So about 6 percent of US corn production is going toward ethanol production in Nebraska alone.
Dan Kammen and Alex Farrell of the Energy and Resources Group at UC Berkeley, with their students Rich Plevin, Brian Turner and Andy Jones along with Michael O'Hare, a professor in the Goldman School of Public Policy, deconstructed six separate high-profile studies of ethanol. They assessed the studies' assumptions and then reanalyzed each after correcting errors, inconsistencies and outdated information regarding the amount of energy used to grow corn and make ethanol, and the energy output in the form of fuel and corn byproducts.
Once these changes were made in the six studies, each yielded the same conclusion about energy: Producing ethanol from corn uses much less petroleum than producing gasoline. However, the UC Berkeley researchers point out that there is still great uncertainty about greenhouse gas emissions and that other environmental effects like soil erosion are not yet quantified.
The UC Berkeley team has made its model, the Energy and Resources Group Biofuels Meta Model (EBAMM), available to the public on its Web site.
"It is better to use various inputs to grow corn and make ethanol and use that in your cars than it is to use the gasoline and fossil fuels directly," said Kammen, who is co-director of the Berkeley Institute of the Environment and UC Berkeley's Class of 1935 Distinguished Chair of Energy.
Despite the uncertainty, it appears that ethanol made from corn is a little better - maybe 10 or 15 percent - than gasoline in terms of greenhouse gas production, he said.
"The people who are saying ethanol is bad are just plain wrong," he said. "But it isn't a huge victory - you wouldn't go out and rebuild our economy around corn-based ethanol."
But they are defining "benefit" as greenhouse gas emissions reduction, and that benefit is small. How about the effects of putting much larger areas under till for farming? The smallness of the greenhouse gas benefit also suggests the net energy gain is small.
The UC Berkeley researchers think the ticket to making biomass more competitive is to develop better ways to convert cellulose sugar polymers into simpler sugars which could then be used to produce ethanol.
The transition would be worth it, the authors point out, if the ethanol is produced not from corn but from woody, fibrous plants: cellulose.
"Ethanol can be, if it's made the right way with cellulosic technology, a really good fuel for the United States," said Farrell, an assistant professor of energy and resources. "At the moment, cellulosic technology is just too expensive. If that changes - and the technology is developing rapidly - then we might see cellulosic technology enter the commercial market within five years."
Some grasses produce over 3 times as much energy per acre as corn. In theory producing ethanol from such grasses could be far more favorable in terms of both the ratio of energy out to energy in and also in terms of the size of the amount of land needed.
Still, I remain unenthusiastic even about greatly improved biomass. If one really wants to reduce greenhouse gases then nuclear, solar, and wind energy are clearly the ways to go. They each would require far less land area than biomass. But all 3 are in need of technological advances for cost reductions. We also need far better battery technology since all 3 deliver energy as electricity.
For the first time, researchers have seen in action how the "hot" emotional centers of the brain can interfere with "cool" cognitive processes such as those involved in memory tasks. Their functional magnetic resonance imaging (fMRI) images of human volunteers exposed to emotional distraction revealed a "see-saw" effect, in which activation of emotional centers damped activity in the "executive" centers responsible for such processing.
The findings of the Duke University Medical Center researchers provide insight into the basic brain mechanisms responsible for the distraction caused by emotional stimuli that are irrelevant to a task. Moreover, they said, the findings offer a new approach to understanding how people with depression and post-traumatic stress disorder cope with traumatic events and memories. It is known that people with such problems are far more affected by emotional distraction.
The researchers compared the effects of three different kinds of distracters on the ability to memorize faces.
In their experiments, the researchers asked volunteer subjects to memorize sets of images of three human faces. Next, they exposed the subjects to one of three types of distracters -- emotional images such as injured people or aggressive behavior; neutral images such as people shopping or working; and scrambled images that meant nothing. The subjects were then shown a face image and asked to determine whether it was one of the original "to-be-memorized" faces or a new face.
Throughout the tests, the subjects' brains were scanned using fMRI. This widely used technique involves using harmless magnetic fields and radio waves to scan the brain to detect levels of blood flow, which indicates increased or decreased brain activity.
Stimuli that evoke an emotional reaction not only activate the ventral system of the brain but also reduce activity in the dorsal regions involved in rational thinking.
In earlier studies, the researchers had found that emotional images activated a "ventral affective system" in the brain that encompasses regions involved in emotional processing. In contrast, they found, cognitive tasks involving memory processes activated a "dorsal executive system." To their surprise, the researchers also found that the emotional distracters not only activated the ventral system, but also deactivated the dorsal regions.
In the new study, the researchers observed the same patterns of activation and deactivation of the regions. The emotional images produced greater activation of the ventral system and deactivation of the dorsal system than did the neutral or scrambled images, they found.
But most importantly, they found graded behavioral effects of the images. The emotional distracters produced the most detrimental effect on memory performance, the neutral distracters impaired performance to a lesser extent; and the scrambled images impaired performance very little. "Along with the fMRI results, these findings provide the first direct evidence concerning the neural mechanisms mediating cognitive interference by emotional distraction," said Dolcos.
Emotional distracter: That sounds like a technical term for "girlfriend".
People who could inhibit their emotional response were less distracted.
The researchers also found individual differences among the subjects in their response to the images. Those people who showed greater activity in a brain region associated with the inhibition of response to emotional stimuli rated the emotional distracters as less distracting. Said Dolcos, "One interpretation of this finding is that, because this region is associated with inhibitory process, people who engage that region more could cope better with distracting emotions."
I bet that genetic variations are partially responsible for people differing in their abilities to inhibit their emotional responses. For some, inhibiting their emotions comes easily, and surely the ability exists on a sliding scale. Also, there's probably not a single ability to inhibit all emotions. Some can probably more easily inhibit sadness, others anger, and so on. If there is a particular emotion you have a hard time inhibiting, then when you need to think clearly you are best off avoiding situations that will present stimuli that'll trigger that emotion.
This report of how emotional stimuli shut down areas of the brain involved in rational thought reminds me of another recent post of mine: "Political Partisans Addicted To Irrational Defense Of Their Tribes". This latest report throws more light on that previous report. People who are emotionally worked up about politics have a hobbled ability to think rationally.
Here is some news you can use to keep off excess pounds. Opaque rather than clear containers make a big difference in the amount of candy consumed from nearby containers.
When it comes to candy, it is out of sight, out of the mouth, a Cornell University researcher finds.
The study finds that women eat more than twice as many Hershey Kisses when they are in clear containers on their desks than when they are in opaque containers on their desks -- but fewer when they are six feet away.
"Interestingly, however, we found that participants consistently underestimated their intake of the candies on their desks yet overestimated how much they ate when the candies were farther away," said Brian Wansink, the John S. Dyson Professor of Marketing and of Applied Economics at Cornell.
The study -- one of the few experiments to quantify the "temptation factor" -- was presented at the Obesity Society meeting of the North American Association for the Study of Obesity in September in Vancouver, Canada. It is published online and will be published in an upcoming February issue of the International Journal of Obesity.
Wansink and his co-authors, James E. Painter and Yeon-Kyung Lee, assistant professor and visiting scholar, respectively, in food science at the University of Illinois-Champaign, gave 40 university female staff and faculty members 30 chocolate Kisses in either clear or opaque candy jars on their desks or six feet away. Each night, the researchers counted how many candies were eaten and refilled the jars.
"Not surprisingly, the participants ate fewer candies when the Kisses were in opaque rather than clear candy jars on their desks and even fewer when the opaque jars were six feet away from their desks," Wansink said. "The less visible and less convenient the candy, the less people thought about it and were tempted."
Specifically, participants ate an average of 7.7 Kisses each day when the chocolates were in clear containers on their desks; 4.6 when in opaque containers on the desk; 5.6 when in clear jars six feet away; and 3.1 when in opaque jars six feet away.
This squares with my everyday experience at my work desk. If I put food in a cabinet and close the cabinet door I'm less likely to munch during the day.
I've also noticed I eat more dried cranberries than dried Montmorency cherries at similar sweetness levels. I like the Montmorency cherries better. But they are so rich in flavor compared to the cranberries that I eat them more slowly since each one provides more flavor experience than a similar quantity of cranberries. So maybe foods with more intense flavors (and by flavors I'm not referring to either sweetness or fattiness) could reduce food consumption.
CARSON - BP and Edison International said Friday they plan to team up on a $1 billion hydrogen-fueled power plant in southern California.
The plant, near the BP refinery in Carson 20 miles (32 km) south of Los Angeles, would come online by 2011 and generate 500 megawatts of electricity, about enough to power 325,000 homes.
Gov. Arnold Schwarzenegger said the plant would be the first in America to use a new chemical process to produce clean-burning hydrogen from petroleum coke, a residue from refining crude oil.
Note that contrary to some news reports this is not a done deal. The costs are higher on this approach and BP wants government subsidies before going ahead with it.
While some articles on this story claim the petroleum coke is currently worthless this article says the petroleum coke from refineries is currently sold to Asia and used as an energy source.
Refineries in the South Bay and Harbor Area create about 17,000 tons of petroleum coke a day during the production of gasoline, diesel and jet fuel, officials said. BP Carson, which makes the Arco brand, alone creates about 4,000 tons a day.
The coke is not thrown away. It is often shipped to Asia, where it is simply burned as a fuel. It can also be used in the production of aluminum.
The BP-Edison project would consume about 5,000 tons per day, according to Ted Craver, CEO of the Edison Mission Group, the Edison subsidiary which will work on the project.
The proposed Carson project would combine a number of existing industrial processes to provide a new option for generating electricity without significant CO2 emissions. Petroleum coke produced at California refineries would first be converted to hydrogen and CO2 gases and around 90 percent of the CO2 captured and separated.
The hydrogen gas stream would be used to fuel a gas turbine to generate electricity. The captured CO2 would be transported by pipeline to an oilfield and injected into reservoir rock formations thousands of feet underground, both stimulating additional oil production and permanently trapping the CO2.
BP is hoping for tax incentives and regulatory incentives so that customers will buy the electricity which will be more expensive to produce.
Final project investment decisions will follow further study by the partners and review by the California Energy Commission and the South Coast Air Quality Management District. BP and EMG are beginning project discussions with state and federal government agencies and local stakeholders and are exploring options for selling the electricity the plant would generate. BP is in discussions with Occidental Petroleum to develop options for sequestering the CO2 in Occidental’s California oilfields.
The costs of hydrogen power are higher than those of traditional power plant fuels. As a result, the project will depend, in part, on incentives provided in the Federal Energy Policy Act of 2005 for advanced gasification technologies. In addition, continued progress on the California Public Utilities Commission's electricity "resource adequacy" procurement policies will encourage this first-of-its-kind facility.
Sounds like they want to extract hydrogen from the petroleum coke and in the process extract the carbon as carbon dioxide. Then they intend to pipe or otherwise ship the CO2 to oil fields while burning the hydrogen to generate electricity. My guess is that if the price of oil keeps going up the price of petroleum coke will rise as well. Perhaps in 2 years the unprofitability problem will have grown worse and this proposed facility won't get built.
I wonder how the cost of generating electricity using this method compares to the costs of generating electricity by burning corn. No need to sequester the CO2 from burning corn since corn gets its carbon from the atmosphere in the first place.
A new study from Imperial College London shows that robot assisted knee surgery is significantly more accurate than conventional surgery.
The team of surgeons tested whether Acrobot, a robotic assistant, could improve surgical outcomes for patients undergoing partial knee replacement. Acrobot works by helping the surgeon to line up the replacement knee parts with the existing bones.
The surgeons looked at 27 patients undergoing unicompartmental knee replacement. The patients were separated into two groups as part of a randomised controlled trial, with 14 having conventional surgery, and the remaining 13 having robot assisted surgery.
Although the operations took a few minutes longer using the robotic assistant, the replacement knee parts were more accurately lined up than in conventional surgery. All of the robotically assisted operations lined up the bones to within two degrees of the planned position, but only 40 percent of the conventionally performed cases achieved this level of accuracy.
The team found there were no additional side effects from using robot assisted surgery, and recovery from surgery was quicker in most cases.
The quicker recovery reflects the more accurate work. Since robots can improve upon what humans can do, robotics could greatly improve surgical outcomes by reducing error rates and reducing collateral damage to other tissue.
Yes, these robots are not designed to take over the operation entirely. But the big cost savings will come from totally robotic surgery.
Professor Justin Cobb, from Imperial College London, who led the research team, said: "These robots are designed to hold the surgeon's hand in the operating theatre, not take over the operation. This study shows they can be an enormous help, preventing surgeons from making mistakes. More importantly, by showing how the increased accuracy makes a difference to how well a knee works after surgery, we will be able to develop a new generation of less invasive procedures without the risks of error, providing faster recovery and better functional outcomes for patients."
The study involved both surgeons and engineers from Imperial College, with medical robotics engineers designing the Acrobot prototype, and surgeons testing it.
Professor Cobb added: "This study could have important implications for not just surgery, but also for health economics. By improving the accuracy of surgery, and ultimately improving the outcome for patients, we can make sure the knee replacements work better and last longer, preventing the need for additional surgery."
The study was funded by The Acrobot Co. Ltd., a spin-out of Imperial College London.
The big cost savings would come from speeding up surgery and reducing human involvement.
Some day genetically engineered pigs will produce organs that will be robotically removed from the pigs and robotically transplanted into humans. Also, automated systems will grow and transfer stem cells into people as part of rejuvenation therapies.
I would prefer development of better medical therapies such as gene therapies and cell therapies that reduce the need for surgery in the first place. But surgery will be necessary for a long time to come and automation of surgery could save trillions of dollars.
Adopting a low-fat diet in later life and following such a regimen for nearly a decade does not appear to have a significant impact on reducing the overall risk of breast cancer, colorectal cancer or heart disease, according to a Women's Health Initiative study that involved nearly 50,000 postmenopausal women across the United States. The results of the federally funded dietary modification study will be published in a series of three papers – two with lead authors at Fred Hutchinson Cancer Research Center and all three involving co-authors from the Hutchinson Center – in the Feb. 8 issue of the Journal of the American Medical Association, or JAMA.
The study – the first attempt to test the health impact of a low-fat diet in a randomized, controlled trial, considered the gold standard of clinical and public-health study design – did, however, uncover some encouraging trends, according to Hutchinson Center biostatistician Ross L. Prentice, Ph.D., lead author of the JAMA paper that describes the impact of a low-fat diet on breast-cancer risk, one of the primary goals of the study.
"Women in the low-fat-diet group reduced their overall rate of breast cancer by about 9 percent as compared to the women who didn't change their eating patterns, but that difference was not statistically significant; it could have been due to chance. So at this point we're not able to say with certainty that a low-fat diet reduces the risk of breast cancer," said Prentice, member and former director of the Hutchinson Center's Public Health Sciences Division. A 9 percent reduction in breast-cancer incidence means that, out of 10,000 women, 42 in the low-fat-diet group and 45 in the comparison group developed breast cancer each year.
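The raw annual rates quoted above can be checked directly. Note that the crude arithmetic gives a slightly smaller number than the study's 9 percent, which is a model-adjusted estimate:

```python
# Crude relative-risk arithmetic from the annual rates quoted above.
# (The study's 9 percent figure comes from its adjusted analysis; the
# raw rates alone give a slightly smaller reduction.)
low_fat_cases_per_10k = 42
comparison_cases_per_10k = 45

relative_reduction = 1 - low_fat_cases_per_10k / comparison_cases_per_10k
print(f"Crude annual risk reduction: {relative_reduction:.1%}")

absolute_reduction = comparison_cases_per_10k - low_fat_cases_per_10k
print(f"Absolute: {absolute_reduction} fewer cases per 10,000 women per year")
```

The absolute framing matters here: 3 fewer cases per 10,000 women per year is a small enough effect that chance cannot be ruled out with this sample size.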
Prentice and colleagues did find, however, that a low-fat diet was associated with a statistically significant 15 percent reduction in estradiol, a form of blood estrogen that increases the risk of breast cancer.
My guess is that if people could manage to stay on diets for decades then we'd see larger differences in results.
The 30% reduction in a single subtype of breast cancer suggests they really did find an effect from lower-fat diets, but that only some of the mechanisms by which cancer develops are much affected by the fat level in the diet.
Women in the low-fat group also experienced a 30 percent risk reduction for a certain subtype of breast cancer: tumors that were progesterone-receptor negative. "This finding provides an interesting hypothesis for further development and reinforces that breast cancer is multifaceted; it is not a single disease," Prentice said. PR-negative tumors, while relatively rare, are difficult to treat and associated with a higher mortality rate because they are unresponsive to hormone-blocking drugs such as tamoxifen.
The larger the decrease in fat in the diet (because of starting with a higher percentage of fat to begin with), the bigger the measured benefit:
Significant results were seen also among women in the low-fat-diet group who began the study with the highest baseline fat consumption and among women who most strictly adhered to the study's dietary-fat goals. Women in these categories experienced a 15 percent to 20 percent overall reduction in breast-cancer incidence.
"The bottom line is that changing to a low-fat diet may reduce breast-cancer risk, especially among women who have a relatively high-fat diet to begin with, but we don't view our data as strong enough at this time to make a broad recommendation that all women initiate a low-fat diet for that purpose," Prentice said. "Additional follow up with these women may yield a stronger, statistically significant conclusion."
With regard to colorectal cancer, the study did not reveal a reduction of cancer incidence overall, but it did show a modest 9 percent decrease in self-reported colon polyps – a precursor to colon cancer – among the women in the low-fat intervention group, according to Shirley A.A. Beresford, Ph.D., lead author of the paper describing the colorectal-cancer findings.
"It is important to remember that cancers often take decades to develop, and we may only be seeing the early stages of the impact of a low-fat diet intervention on the risk of colorectal cancer and other diseases," said Beresford, a member of the Hutchinson Center's Public Health Sciences Division and a professor of epidemiology at the University of Washington School of Public Health and Community Medicine. "The reduction in polyps suggests a possible reduction in colorectal-cancer risk could emerge over a longer time period."

No significant reduction in heart disease emerged among the women in the low-fat intervention group, who achieved only a 2.4 percent reduction in low-density lipoprotein, or LDL, the so-called "bad" cholesterol, and a 3 percent lower rate of heart disease.
The study did, however, find trends toward reduction in heart-disease risk among the subset of women in the low-fat-diet group who made the greatest reduction in consumption of saturated fat and trans fat, both of which can raise the risk of heart disease because they increase production of LDL cholesterol.
"For heart-disease prevention, the data suggests that a greater emphasis on reduction of saturated and trans fats will be needed to have a major difference," Prentice said. Barbara V. Howard, Ph.D., president of MedStar Research Institute/Howard University in Washington, D.C., was the lead author of the heart-disease paper.
A reduction of dietary fat to only 20% of calories still falls well short of a Pritikin style diet where one might get only 10% of calories from fat. If they'd tried for a larger fat reduction they might have picked up a stronger signal in the data.
In my mind I hear Joe Jackson singing.
Everything gives you cancer
Everything gives you cancer
There’s no cure, there’s no answer
Everything gives you cancer
Forty percent of the participants were assigned to the low-fat diet, in which they were asked to reduce their fat intake to 20 percent of their total calories while eating five or more daily servings of vegetables and fruits and six servings of grains. The remaining 60 percent served as a comparison group and did not change their diet.
Although the primary hypotheses of protective effects of a low-fat diet on breast and colorectal cancer failed the test, the WHI researchers pointed out that the majority of women assigned to the low-fat diet didn't meet the 20 percent fat goal: On average, the women reduced their fat intake to 24 percent in the first year, but slowly increased their fat intake to 29 percent by the eighth year.
Furthermore, the study showed that women who had the highest fat intake at the study's outset showed greater evidence for reducing their breast cancer risk on the diet program. There was also a suggestive trend of breast cancer risk reduction for women who initially had the lowest consumption of vegetables and fruits and then increased their intake by one serving per day as part of the diet.
So the study ran for only 8 years, and yet our risk of cancer is the result of decades of accumulated damage to our bodies. Also, the women didn't stay on their diets, just as people fail to stick with most other diets they go on. Expecting people to stick with a diet for years isn't realistic.
A final point: I predict that in 10 years time we'll have plenty of genetic tests that tell us not only our genetic predisposition to various cancers (we already have some such tests) but also how much our individual risk will be modified by various dietary changes. Some people will be told that lowering their dietary fat won't matter much. Others will be told that lowering their dietary fat (or perhaps just specific kinds of fat) will have a big impact on their cancer risks. I hope the researchers who ran the study above took and stored DNA samples from the participants. Ten or so years from now, when DNA testing will be orders of magnitude cheaper, testing the DNA sequences of study participants could yield very useful information on how DNA variations and diet interact to influence cancer and heart disease risks.
Medical research will pay off with cures to diseases and eventually full body rejuvenation within the lifetimes of some of us alive right now, and yet George W. Bush wants to freeze the NIH budget.
In stark contrast to his initiative for physical sciences [ScienceNOW, 1 February and 3 February], President Bush today proposed a budget freeze for the National Institutes of Health (NIH) in 2007, holding its funding steady at $28.6 billion. The proposal, part of the President's overall budget request to Congress, is drawing concern and even outrage from biomedical research advocacy groups, who worry that NIH is losing ground after its budget was doubled from 1999 to 2003. Now the budget proposal, which curbs domestic discretionary spending while boosting funding for national defense, must wind its way through Congress before being approved in some form later this year.
This is penny wise and pound foolish. A freeze is really a cut by whatever the rate of inflation turns out to be, so medical research is getting cut 2% or 3% in real terms. Yet medical research is, in my opinion, the best value per dollar of government spending.
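To put numbers on that erosion, here is the frozen $28.6 billion budget deflated at a couple of illustrative inflation rates (the 2-3% range mentioned above):

```python
# What a nominal freeze does to the NIH budget in real terms.
# Inflation rates are illustrative, matching the 2-3% range above.
nih_budget_billion = 28.6

for inflation in (0.02, 0.03):
    real_value = nih_budget_billion / (1 + inflation)
    effective_cut = nih_budget_billion - real_value
    print(f"At {inflation:.0%} inflation: worth ${real_value:.2f}B next year, "
          f"an effective cut of ${effective_cut:.2f}B")
```

Half a billion to over eight hundred million dollars of research buying power lost in a single frozen year.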
An article in the Wall Street Journal explores how technology is making the bathroom into a mini-office for many hard chargers.
With a BlackBerry, two mobile phones, three office computers and wireless Internet for his car, Greg Shenkman is never far from his work. But recently the CEO of San Francisco-based Exigen Group eked out more productivity by wiring the final frontier: his bathroom.
When Mr. Shenkman answers the speaker-phone in his shower, the water automatically shuts off. He can open the front door for deliveries while shaving. He's also put the finishing touches on a waterproof computer that will let him answer emails from his sauna. "I took Gates a little too literally," he says. "The flow of information never stops."
I want flat screens embedded in various room walls that would let me quickly check outside cameras to see why the dog is barking. The display software ought to use motion sensors to first show me the cameras most likely to explain the dog's barking. Ideally the dog's collar would report which direction he is facing, so that camera could be brought up on, say, the shower wall's flat panel.
How about a vanity mirror that doubles as an LCD display?
Manufacturer Acquinox of New York says sales of its steam shower/whirlpool units -- a hands-free phone is standard in each -- nearly tripled last year to 14,800 modules. Wisconsin-based Seura, meanwhile, reports rising sales of its vanity mirrors, which feature LCD screens in the glass. The mirrors, starting at $2,400, let users check their tie-knot, then flip a switch to watch the embedded TV.
Another trend I'm expecting is the development of sensor technology that turns the bathroom into the most intensely sensor-equipped automated medical diagnostic center of the house. The whole house will become instrumented with medical diagnosis sensors. But the bathroom, since it gets access to bodily fluids (and also skin and hair in the bathtub), is a great place to put lots of nanosensors that will detect disease, malnutrition, and other health problems. Further down the line most of us will eventually instrument our bodies with nanotech devices floating around the bloodstream. Those in-body sensors will send reports to the house computer network. That LCD display in the bathroom mirror will pop up your lipid profile, a body pathogen report, how far behind you are in sleep, and recommendations for foods that will address developing nutrient deficiencies.
My own fairly low tech method to save time in the bathroom is to use two Norelco shavers at once while shaving, one in each hand. I want to get some sort of mounting bracket built for a pair of shavers to make it possible to operate two shavers in each hand and shave that much more rapidly. Ideally I'd prefer a hands free full face shaver that could shave just about all the regularly shaved area at once. If I could free up my hands then I could type at a computer (or perhaps brush my teeth or button my shirt) and shave at the same time.
Think about all the 1 or 3 or 5 minute tasks you do each day at home. If you could parallelize some of those tasks the times savings would add up. 3 minutes saved per day is an hour and a half per month or 18 hours per year. 12 minutes saved per day would be 6 hours per month or 72 hours per year. Got any ideas for how to automate home life? I'm always looking for ways to save time.
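The arithmetic above generalizes easily. A small Python helper for tallying up how a daily saving compounds over a year:

```python
# Hours per year recovered by shaving minutes off a daily routine,
# generalizing the 3-minute and 12-minute examples above.
def yearly_hours_saved(minutes_per_day: float, days: int = 365) -> float:
    """Convert a daily time saving in minutes to hours per year."""
    return minutes_per_day * days / 60

for minutes in (1, 3, 5, 12):
    print(f"{minutes:>2} min/day -> {yearly_hours_saved(minutes):5.1f} hours/year")
```

Three minutes a day comes to just over 18 hours a year; twelve minutes a day comes to 73 hours, close to the rounder figures quoted above.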
Can an easy way to tap fusion energy be developed? A new study on a way to create fusion with waves has found neutrons were generated by this approach.
Troy, N.Y. — A team of researchers from Rensselaer Polytechnic Institute, Purdue University, and the Russian Academy of Sciences has used sound waves to induce nuclear fusion without the need for an external neutron source, according to a paper in the Jan. 27 issue of Physical Review Letters. The results address one of the most prominent questions raised after publication of the team’s earlier results in 2004, suggesting that “sonofusion” may be a viable approach to producing neutrons for a variety of applications.
By bombarding a special mixture of acetone and benzene with oscillating sound waves, the researchers caused bubbles in the mixture to expand and then violently collapse. This technique, which has been dubbed “sonofusion,” produces a shock wave that has the potential to fuse nuclei together, according to the team.
But other scientists were skeptical of the results from that earlier round of sonofusion experiments.
In response to earlier criticisms, this group of scientists tried a sonofusion approach that did not use an external source of neutrons.
The telltale sign that fusion has occurred is the production of neutrons. Earlier experiments were criticized because the researchers used an external neutron source to produce the bubbles, and some have suggested that the neutrons detected as evidence of fusion might have been left over from this external source.
“To address the concern about the use of an external neutron source, we found a different way to run the experiment,” says Richard T. Lahey Jr., the Edward E. Hood Professor of Engineering at Rensselaer and coauthor of the paper. “The main difference here is that we are not using an external neutron source to kick the whole thing off.”
In the new setup, the researchers dissolved natural uranium in the solution, which produces bubbles through radioactive decay. “This completely obviates the need to use an external neutron source, resolving any lingering confusion associated with the possible influence of external neutrons,” says Robert Block, professor emeritus of nuclear engineering at Rensselaer and also an author of the paper.
The experiment was specifically designed to address a fundamental research question, not to make a device that would be capable of producing energy, Block says. At this stage the new device uses much more energy than it releases, but it could prove to be an inexpensive and portable source of neutrons for sensing and imaging applications.
To verify the presence of fusion, the researchers used three independent neutron detectors and one gamma ray detector. All four detectors produced the same results: a statistically significant increase in the amount of nuclear emissions due to sonofusion when compared to background levels.
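What does a "statistically significant increase" over background mean for a neutron counter? Detector counts follow Poisson statistics, so the significance of an excess can be judged against the square root of the background. The counts below are hypothetical, just to illustrate the method; the paper's actual numbers are not quoted here:

```python
# Poisson counting statistics: judging an excess over background.
# The counts are hypothetical, for illustration only.
import math

background_counts = 10_000   # hypothetical counts in a background-only run
signal_counts = 10_450       # hypothetical counts with sonofusion running

excess = signal_counts - background_counts
# For Poisson-distributed counts, the fluctuation scale is sqrt(N).
sigma = math.sqrt(background_counts)
print(f"Excess of {excess} counts is {excess / sigma:.1f} standard deviations")
```

An excess several standard deviations above background, seen independently on multiple detectors as described above, is what makes a counting result hard to dismiss as noise.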
A way to produce energy from fusion would put a pretty permanent end to our energy woes. But using more conventional approaches to creating the conditions under which fusion happens looks like it will take decades before fusion reactors are a reality. Can a less conventional approach provide a cost effective solution much sooner? If it did the implications would be staggering. Energy is the only resource whose limits matter for what humanity can accomplish. There is no single mineral whose short supply would stop or reverse economic growth. With enough energy we can mold many types of matter into states that would allow those types of matter to substitute for whatever is in short supply.