Chinese and English speakers both use the inferior parietal cortex when doing math, but they recruit different additional brain regions for calculating.
“But native English speakers rely more on additional brain regions involved in the meaning of words, whereas native Chinese speakers rely more on additional brain regions involved in the visual appearance and physical manipulation of numbers,” says Eric Reiman of the Banner Good Samaritan Medical Center in Phoenix, Arizona, US, one of the team.
Specifically, Chinese speakers had more activity in the visual and spatial brain centre called the visuo-premotor association network. Native English speakers showed more activity in the language network known as perisylvian cortices in the left half of the brain.
Reiman and his colleagues suggest that the Chinese language’s simple way of describing numbers may make native speakers less reliant on language processing when doing maths. For example, “eleven” is “ten one” in Chinese, and “twenty-one” is “two ten one”.
Note that the native English speakers used in the study probably were not ethnically Chinese, so this study does not control for genetic factors. I'd like to see this study repeated in an English-speaking country with ethnic Chinese who were raised to speak English from birth. Also, a comparison with other groups and with more languages would provide more controls.
The difference "may mean that Chinese speakers perform problems in a different manner than do English speakers," said lead author Yiyuan Tang of Dalian University of Technology in Dalian, China.
"In part that might represent the difference in language. It could be that the difference in language encourages different styles of computation and this may be enhanced by different methods of learning to deal with numbers," Tang said in an interview via e-mail.
More use of some part of the brain to do computations might reduce the availability of that part of the brain for other uses. That, in turn, probably changes how the mind models the world.
This report is consistent with previous research which found differences in which parts of the mind process language. See Mandarin Language Uses More Of The Brain Than English.
I'd also like to see brain scan comparisons done of people in different occupations (e.g. physicists, mathematicians, truck drivers, lawyers, reporters) for how they do mathematics. Do they differ between occupations as much as English and Chinese speakers differ?
The researchers say the eye pictures were probably influential because the brain naturally reacts to images of faces and eyes. It seems people were subconsciously cooperating with the honesty box when it featured pictures of eyes rather than flowers.
They also say the findings show how people behave differently when they believe they are being watched because they are worried what others will think of them. Being seen to co-operate is a good long-term strategy for individuals because it is likely to mean others will return the gesture when needed.
Details of the experiment, believed to be the first to test how cues of being watched affect people's tendency for social co-operation in a real-life setting, are published today, Wednesday June 28, in the Royal Society journal Biology Letters.
An honesty box is a system of payment which relies on people's honesty to pay a specified price for goods or services - there is no cashier to check whether they are doing so.
For this experiment, lead researcher Dr Melissa Bateson and her colleagues Drs Daniel Nettle and Gilbert Roberts, of the Evolution and Behaviour Research Group in the School of Biology and Psychology at Newcastle University, made use of a long-running 'honesty box' arrangement.
This had been operating as a way of paying for hot drinks in a common room used by around 48 staff for many years, so users had no reason to suspect an experiment was taking place.
An A5 poster was placed above the honesty box, listing prices of tea, coffee and milk. The poster also featured an image banner across the top, and this alternated each week between different pictures of flowers and images of eyes.
The eye pictures varied in the sex and head orientation but were all chosen so that the eyes were looking directly at the observer.
Each week the research team recorded the total amount of money collected and the volume of milk consumed as this was considered to be the best index available of total drink consumption.
The team then calculated the ratio of money collected to the volume of milk consumed in each week. On average, people paid 2.76 times as much for their drinks on the weeks when the poster featured pictures of eyes.
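The comparison above boils down to a weekly ratio of money collected to milk consumed. A minimal sketch of that calculation in Python, using hypothetical weekly figures (the study's actual raw numbers are not given in this summary):

```python
# Illustrative only: the (pence collected, litres of milk) pairs below are
# hypothetical stand-ins, not the study's actual weekly data.
eyes_weeks = [(200, 4.0), (180, 3.5), (220, 4.2)]
flower_weeks = [(70, 4.1), (65, 3.8), (75, 4.0)]

def mean_ratio(weeks):
    """Average pence paid per litre of milk consumed across the given weeks."""
    return sum(money / milk for money, milk in weeks) / len(weeks)

# The headline figure is simply the ratio of these two averages.
effect = mean_ratio(eyes_weeks) / mean_ratio(flower_weeks)
print(f"eyes weeks paid {effect:.2f}x as much per litre of milk")
```

With the study's real data this ratio came out at 2.76; the hypothetical numbers here just show the shape of the computation.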
Lead author of the study, Melissa Bateson, a Royal Society research fellow based at Newcastle University, said: "Our brains are programmed to respond to eyes and faces whether we are consciously aware of it or not.
"I was really surprised by how big the effect was as we were expecting it to be quite subtle but the statistics show that the eyes had a strong effect on our tea and coffee drinkers."
Those nations with massive posters of dictators staring down on every street probably have lower crime rates as a result.
This result seems to have all sorts of obvious immediate applications. Parents could put posters of eyes in rooms where their kids might be tempted to misbehave when the parents are not around. Posters of eyes could be put up in bus and train stations to see if they deter pickpockets. Posters of eyes in workplaces might make people less likely to slack off.
One obvious direction for further research would be to try different kinds of faces and facial expressions to see if some faces make people work harder, treat callers more politely on technical support calls, or otherwise perform better and more honestly in work situations.
Would people in workplaces feel more stressed when eyes in posters look down upon them?
In public places such as town squares, train stations, and airports which have video surveillance cameras (aka CCTVs) would the cameras be more effective in deterring crime if combined with a poster of eyes mounted above them to emphasize to people that they are being watched?
Two companies plan to market the first lie-detecting devices that use magnetic resonance imaging (MRI) and say the new tests can spot liars with 90% accuracy.
No Lie MRI plans to begin offering brain-based lie-detector tests in Philadelphia in late July or August, says Joel Huizenga, founder of the San Diego-based start-up. Cephos Corp. of Pepperell, Mass., will offer a similar service later this year using MRI machines at the Medical University of South Carolina in Charleston, says its president, Steven Laken.
Both rely in part on recent research funded by the federal government aimed at producing a foolproof method of detecting deception.
Lie detection will become a huge market. It will change personal relationships, marriages, the criminal justice system (I love tools that can exonerate the innocent), the hunt for terrorists, and raise honesty in business dealings.
Want to settle an argument where one party does not trust the other's claims? Even better, how about those arguments where both sides say the other is lying? The solution (assuming you don't mind the 90% accuracy rate) is quite affordable.
No Lie MRI plans to charge $30 a minute to use its device. Cephos has not yet set a price.
Have any disagreements with suspected liars that would be worth at least $30 to verify truth or dishonesty?
No Lie MRI will debut its services this July in Philadelphia, where it will demonstrate the technology to be used in a planned network of facilities the company is calling VeraCenters. Each facility will house a scanner connected to a central computer in California. As the client responds to questions using a handheld device, the imaging data will be fed to the computer, which will classify each answer as truthful or deceptive using software developed by Langleben's team.
Temple University radiologist Scott Faro sees lie detectors as great money savers.
"People say fMRI is expensive," Faro continues, "but what's the cost of a six-month jury trial? And what's the cost to America for missing a terrorist? If this is a more accurate test, I don't see any moral issues at all. People who can afford it and believe they are telling the truth are going to love this test."
The more parties to a disagreement, the less of a problem the 90% accuracy rate becomes. Ask several employees in a company or suspected members of a terrorist ring some hard questions. See where they all line up in terms of their answers and the fMRI machine's assessments.
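One way to see why more parties helps: treat each scan as an independent test that is right 90% of the time, and take a majority vote. A quick sketch of the arithmetic (the independence assumption is mine for illustration, not a claim by the companies):

```python
from math import comb

def majority_correct(n, p=0.9):
    """Probability that a majority of n independent tests, each correct
    with probability p, gives the right answer (n odd)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# A single 90%-accurate test is wrong 1 time in 10, but a majority vote
# over five independent tests is wrong less than 1 time in 100.
print(majority_correct(1))  # 0.9
print(majority_correct(5))  # ~0.991
```

Real answers from members of a conspiracy are not truly independent, so this overstates the gain, but the direction of the effect holds: cross-checking many parties washes out much of the single-test error rate.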
The US federal government prevents private companies from using the cost savings of lie detection. This'll become an incentive to move work offshore when business needs place a very high value on honesty and trustworthiness.
No Lie MRI's plans to market its services to corporations will likely run afoul of the 1988 Employee Polygraph Protection Act, which bars the use of lie-detection tests by most private companies for personnel screening. Government employers, however, are exempt from this law, which leaves a huge potential market for fMRI in local, state, and federal agencies, as well as in the military.
I wonder if lie detection will be allowed in divorce cases? "Have you disclosed all your sources of income and all assets?" Or how about "Have you ever done illegal drugs while you had custody of the kids?"
With Hurricane Katrina people had a couple of days notice that something highly destructive was coming their way. When the big quake comes to SoCal and LA gets wrecked we'll find out about it right when the big ride starts. The last really big SoCal earthquake was in 1690. The San Andreas Fault has been building up unreleased tension for at least 300 years.
A researcher investigating several facets of the San Andreas Fault has produced a new depiction of the earthquake potential of the fault's southern, highly populated section. The new study shows that the fault has been stressed to a level sufficient for the next "big one"—an earthquake of magnitude seven or greater—and the risk of a large earthquake in this region may be increasing faster than researchers had believed, according to Yuri Fialko of Scripps Institution of Oceanography at the University of California, San Diego.
Historical records show that the San Andreas Fault experienced massive earthquakes in 1857 at its central section and in 1906 at its northern segment (the San Francisco earthquake). The southern section of the fault, however, has not seen a similar rupture in at least 300 years.
Although seismologists have not been able to predict when a great earthquake will occur on the southern San Andreas, most believe such an event is inevitable. Fialko has produced the clearest evidence to date of the strain buildup that will ultimately result in a large earthquake along the southern San Andreas Fault, a 100-mile segment that cuts through Palm Springs and a number of other cities in San Bernardino, Riverside and Imperial counties. Such an event would be felt throughout much of Southern California, including densely populated areas of metropolitan Los Angeles and San Diego.
If you are a SoCal resident now might be the time to start thinking about a really extended trip to some other part of the world.
"All these data suggest that the fault is ready for the next big earthquake but exactly when the triggering will happen and when the earthquake will occur we cannot tell. It could be tomorrow or it could be 10 years or more from now," said Fialko.
Bonds have maturities measured in decades. Earthquakes are a substantial risk factor. Hey there bond investors, you might want to think twice before buying bonds of SoCal governments.
Fialko found evidence that the southern San Andreas is mostly locked and continues to accumulate significant amounts of strain. He calculated the rate at which the fault is moving and estimated the "fault slip rate," the pace of the plate movement at the fault, at about an inch per year. According to Fialko, this means that during the last 300 dormant years the fault has accumulated approximately six to eight meters of slip "deficit," which will be released in future big earthquakes. If all the inferred deficit is released in a single event, it would result in a magnitude eight earthquake, roughly the size of the 1906 San Francisco earthquake.
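Fialko's slip-deficit figure is easy to sanity-check: an inch a year of locked plate motion over roughly 300 years accumulates to about the six to eight meters he cites. The conversion, spelled out (the inch-per-year rate and 300-year dormancy come from the article; the rest is just unit arithmetic):

```python
SLIP_RATE_M_PER_YR = 0.0254  # "about an inch per year", converted to meters
DORMANT_YEARS = 300          # approximate time since the last big southern rupture

deficit_m = SLIP_RATE_M_PER_YR * DORMANT_YEARS
print(f"accumulated slip deficit: {deficit_m:.2f} m")  # ~7.62 m
```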
"In the earthquake business, the past is a key to understanding the present and by comparing data on the timing of past earthquakes on the fault with what we have measured over the last 10 years, we can say with some certainty that the fault is approaching the end of its loading period," said Fialko.
When it does rupture on the San Andreas, such a quake could be as deadly as the 1994 Northridge quake, which struck on a previously unsuspected hidden fault, now called the Northridge or Pico fault, and killed 51 people, injured 9,000 and caused $44 billion in damages.
If you are in a wood-frame house that is not perched precariously on a hillside, your odds of getting killed are quite low. But if you are on an elevated roadway, in some old high building, or downstream of some dodgy old dam filled with water, then your risks go up. Personally, I'm more worried about the economic disruption (e.g. the need for electricity and internet access to do work).
If someone, somewhere hadn't thought to make team uniforms the same color, we might be stuck watching NBA finals or World Cup soccer matches with only two players and a ref.
It is that color coding, Johns Hopkins University psychologists have now demonstrated, that allows spectators, players and coaches at major sporting events to overcome humans' natural limit of tracking no more than three objects at a time.
"We've known for some time that human beings are limited to paying attention to no more than three objects at any one time," said Justin Halberda, assistant professor of psychological and brain sciences in the university's Zanvyl Krieger School of Arts and Sciences.
"We report the rather surprising result that people can focus on more than three items at a time if those items share a common color," he said. "Our research suggests that the common color allows people to overcome the usual limit, because the 'color coding' enables them to perceive the separate individuals as a single set."
Employers could increase the productivity of employees by reducing the cognitive overload of needless distractions. Instead, office workers have to listen to more than 3 conversations at a time over cubicle walls.
If you know which color you are supposed to keep track of, your mind can focus on that quite well.
Knowing that color is the key to making sense of large numbers of objects "informs our understanding of the structure of visual cognition and reveals that humans rely on early visual features to attend large sets in parallel," Halberda said. "Ongoing work in our lab is revealing which other features humans might use."
Halberda and Feigenson reached their conclusion by asking Johns Hopkins undergraduate volunteers to view a series of colored dots flashing onto a black computer screen. The subjects were asked to estimate the number of dots in one randomly selected set on each trial.
Half the time, the subjects were told in advance whether to pay attention to, say, just the red dots or just the green ones. Otherwise, the subjects were required to store as much information as possible in visual memory from what they saw briefly onscreen.
Some sets contained as many as 35 dots and subjects viewed the sets for less than one half second, which Halberda points out "is too short to allow the subjects to actually count the dots." Subjects were very accurate when told in advance which set to pay attention to, regardless of how many different colors were present, revealing that humans are able to select a set that shares a common color. Subjects were also very accurate at enumerating a color subset when asked after the flash of dots so long as the flash contained three or fewer colors.
"We found that humans are unable to store information from more than three sets at once," Halberda said. "This places an important constraint on how humans think about and interact with sets in the world."
Just forget about spectator sports with 4 teams playing against each other at once. 3 teams competing at once are within the limits of what human spectators can track. But more than that does not work.
Maybe the real limit is 2 teams because people also have to keep track of the set of things they deem necessary for watching sports. There'd be no mental room for beer and food if 3 teams competed.
So then do space aliens with greater mental capacities routinely watch sports involving a dozen teams? Also, when human minds get genetically engineered to track more than 3 sets of things at once will human sports also change to bring more teams onto the field?
The use of thorium to power nuclear reactors holds out the prospect of a huge reduction in nuclear wastes, a nuclear fuel cycle that is much more proliferation resistant, lower costs, and a fuel that is many times more plentiful than uranium. Australian science writer Tim Dean examines the prospects for thorium reactors in a recent article and finds two avenues of technological advance that might make thorium powered nuclear reactors feasible. The more immediately promising approach uses a mixture of thorium with other radioactive materials.
The main stumbling block until now has been how to provide thorium fuel with enough neutrons to keep the reaction going, and do so in an efficient and economical way.
In recent years two new technologies have been developed to do just this.
One company that has already begun developing thorium-fuelled nuclear power is the aptly named Thorium Power, based just outside Washington DC. The way Thorium Power gets around the sub-criticality of thorium is to create mixed fuels using a combination of enriched uranium, plutonium and thorium.
At the centre of the fuel rod is the 'seed' for the reaction, which contains plutonium.
Wrapped around the core is the 'blanket', which is made from a mixture of uranium and thorium. The seed then provides the necessary neutrons to the blanket to kick-start the thorium fuel cycle. Meanwhile, the plutonium and uranium are also undergoing fission.
The primary benefit of Thorium Power's system is that it can be used in existing nuclear plants with slight modification, such as Russian VVER-1000 reactors. Seth Grae, president and chief executive of Thorium Power, and his team are actively working with the Russians to develop a commercial product by the end of this decade. They already have thorium fuel running in the IR-8 research reactor at the Kurchatov Institute in Moscow.
The potential to use existing reactors to burn thorium lowers the barrier to use of thorium. Success in existing reactors could catalyze the construction of new reactors designed to use thorium from their start.
He also goes over Carlo Rubbia's proposal to use a particle accelerator to shoot a stream of protons into a thorium reactor.
AN ALTERNATIVE DESIGN does away with the requirements for uranium or plutonium altogether, and relies on thorium as its primary fuel source. This design, which was originally dubbed an Energy Amplifier but has more recently been named an Accelerator Driven System (ADS), was proposed by Italian Nobel physics laureate Carlo Rubbia, a former director of one of the world's leading nuclear physics labs, CERN, the European Organisation for Nuclear Research.
An ADS reactor is sub-critical, which means it needs help to get the thorium to react. To do this, a particle accelerator fires protons at a lead target. When struck by high-energy protons the lead, called a spallation target, releases neutrons that collide with nuclei in the thorium fuel, which begins the fuel cycle that ends in the fission of U-233.
Governments should accelerate research into new nuclear reactor designs that promise to lower wastes and reduce costs.
David Morris, vice president of the Institute for Local Self-Reliance, says the current US federal 51 cents per gallon subsidy for ethanol should be replaced by a program that protects against price falls. (A shorter New York Times version of the article is here.)
It will be difficult, if not impossible, to politically justify a 51 cent per gallon incentive for ethanol if there is an ethanol mandate for 10 or 20 or 30 billion gallons and oil prices remain high.
Ethanol needs no financial incentives to compete when crude oil prices are over $65 a barrel. However, history demonstrates the volatile nature of oil prices. The price of oil in the last 10 years has dropped below $20 per barrel several times. Moreover, the cost of ethanol production is highly dependent on the cost of its feedstock, and although corn prices have not varied nearly as dramatically as oil prices, they do vary. Corn prices briefly topped $4 per bushel in the late 1990s, although they have held fairly steady at about $2 per bushel for most of the last 20 years.
How might we redesign the federal incentive to honor the nation’s commitment to both farmers and taxpayers? The incentive needs to be structured to protect the farmer-producer if the price of oil plunges or the price of the feedstock (corn, soybeans, cellulose) jumps. The taxpayer must be protected from having to underwrite handsome subsidies when the biofuels industry no longer needs them.
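A counter-cyclical incentive is one way to meet both constraints: the per-gallon payment shrinks to zero when oil is expensive and grows when oil is cheap or feedstock is dear. A minimal sketch of such a rule (the threshold prices and the 51-cent cap are illustrative choices of mine, not Morris's actual proposal):

```python
def ethanol_incentive(oil_price, corn_price,
                      oil_floor=45.0, corn_ceiling=2.50, cap=0.51):
    """Hypothetical counter-cyclical subsidy in dollars per gallon.

    Pays nothing while oil stays above oil_floor ($/barrel) and corn stays
    below corn_ceiling ($/bushel); scales up toward the cap as oil falls
    or corn rises.
    """
    oil_gap = max(0.0, (oil_floor - oil_price) / oil_floor)
    corn_gap = max(0.0, (corn_price - corn_ceiling) / corn_ceiling)
    return round(min(cap, cap * (oil_gap + corn_gap)), 2)

print(ethanol_incentive(65, 2.00))  # high oil, cheap corn -> 0.0
print(ethanol_incentive(18, 2.00))  # oil price crash -> subsidy kicks in
```

The point of the design is that the taxpayer pays nothing in years like the present one (oil over $65), while the farmer-producer is protected in years when oil drops below $20.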
Farmers have pushed for years to get more people using gasoline mixed with ethanol made from corn kernels, but so far such ethanol has replaced only about 3 percent of the nation's gasoline, and by most estimates, the country would never be able to grow enough corn to replace more than 10 or 12 percent of its fuel supply.
But high oil prices make plant biomass competitive.
"If you think we're heading towards a future where oil prices are going to stay relatively high, $50-plus a barrel, then the energy cost delivered in plant biomass is much, much less than the energy cost delivered in oil," said Bruce E. Dale, head of the Biomass Conversion Research Laboratory at Michigan State University. "I'm completely convinced that this industry is going to happen on economic grounds alone. The demand for liquid fuels is so high and rising that we're going to convert an awful lot of stuff to liquid fuels."
Scientists have projected that in the long run, ethanol made from biomass could be cheaper than gasoline or corn ethanol, costing as little as 60 cents a gallon to produce and selling for less than $2 a gallon at the pump. But right now it would be more expensive than gasoline, and the low prices are likely to be achieved only after large plants have been built and technical breakthroughs achieved in operating them.
The most promising future for biomass energy comes from developments to break down cellulose (often referred to as "cellulosic technology") so that all of a plant could be converted into ethanol and many more types of plants could serve as inputs to ethanol production.
Cellulose, like starch, is made up of glucose molecules, but packed so tightly they're extremely hard to break apart. Plants use cellulose chiefly as a structural material -- it helps trees and grasses stand upright. If efficient ways were developed to break open the molecules, a wide variety of agricultural wastes or specially planted energy crops could feed the new industry.
Scientific progress has been slow, but now it seems to be accelerating. Enzymes needed for the process used to cost more than $5 per gallon of ethanol, but biotechnology companies, under government research contracts, have reduced that to 30 cents per gallon. A handful of small companies, exploiting the drop, are already making small amounts of ethanol from biomass, and claim that they are close to doing so at competitive prices. Not only are they shopping for locations for bigger plants, they are also signing contracts with farmers to supply raw material.
A lot of environmentalists are thrilled at the prospect of cellulosic technology. But picture countries with 10 or more times the population density of the United States (e.g. India) shifting heavily toward biomass to power a growing economy with eventually hundreds of millions more cars. What would happen to the already shrinking habitats where animals live? They'd be converted into fields to grow plants for cellulose.
I prefer accelerating research and development of photovoltaics, batteries, and nuclear power. Advances on these fronts will enable a shift toward electric power for cars and reduce the ecological footprint of our methods for generating energy.
David Gobel, one of the folks behind the Methuselah Mouse Prize, points me to news of an experiment in which regular muscle tissue was isolated from rats and shaped into heart tissue that can conduct electrical impulses.
Patients with complete heart block, or disrupted electrical conduction in their hearts, are at risk for life-threatening rhythm disturbances and heart failure. The condition is currently treated by implanting a pacemaker in the patient's chest or abdomen, but these devices often fail over time, particularly in infants and small children who must undergo many re-operations. Researchers at Children's Hospital Boston have now taken preliminary steps toward using a patient's own cells instead of a pacemaker, marking the first time tissue-engineering methods have been used to create electrically conductive tissue for the heart. Results appear in the July issue of the American Journal of Pathology (published online on June 19).
The goal of this research is to develop living replacements for artificial pacemakers.
The scientists isolated myoblasts, a type of stem cell, from muscle. Then they grew those cells on a scaffold made from collagen. The resulting tissue formed a structure that could be implanted on hearts.
Cowan's team, including first author Yeong-Hoon Choi in Children's Department of Cardiac Surgery, obtained skeletal muscle from rats and isolated muscle precursor cells called myoblasts. They "seeded" the myoblasts onto a flexible scaffolding material made of collagen, creating a 3-dimensional bit of living tissue that could be surgically implanted in the heart.
The cells distributed themselves evenly in the tissue and oriented themselves in the same direction. Tested in the laboratory, the engineered tissue started beating when stimulated electrically, and its muscle cells produced proteins called connexins that channel ions from cell to cell, connecting the cells electrically.
The bioengineered tissue created pathways for conducting electricity.
When the engineered tissue was implanted into rats, between the right atrium and right ventricle, the implanted cells integrated with the surrounding heart tissue and electrically coupled to neighboring heart cells. Optical mapping of the heart showed that in nearly a third of the hearts, the engineered tissue had established an electrical conduction pathway, which disappeared when the implants were destroyed. The implants remained functional through the animals' lifespan (about 3 years).
"The advantage of using myoblasts is that they can be taken from skeletal muscle rather than the heart itself--which will be important for newborns whose hearts are so tiny they cannot spare any tissue for the biopsy--and that they're resistant to ischemia, meaning they can go without a good blood supply for a relatively long period of time," Cowan says.
The researchers are now working toward experiments with larger animals.
I like tissue engineering experiments that try to take existing tissue and rearrange it to solve problems. We certainly need advances in understanding of how embryonic stem cells go through a series of steps to become much more specialized. However, as demonstrated by this report, not all applications of stem cells require an enormous amount of understanding of how to control the differentiation (specialization) of stem cells through many steps. Existing non-embryonic cells have the potential to be rearranged and grown into the cells needed to solve many medical problems.
New research suggests one reason vegetables may be so good for us – a study in mice found that a mixture of five common vegetables reduced hardening of the arteries by 38 percent compared to animals eating a non-vegetable diet. Conducted by Wake Forest University School of Medicine, the research is reported in the current issue of the Journal of Nutrition.
“While everyone knows that eating more vegetables is supposed to be good for you, no one had shown before that it can actually inhibit the development of atherosclerosis,” said Michael Adams, D.V.M., lead researcher. “This suggests how a diet high in vegetables may help prevent heart attacks and strokes.”
The study used specially bred mice that rapidly develop atherosclerosis, the formation on blood vessel walls of fatty plaques that eventually protrude into the vessel’s opening and can reduce blood flow. The mice have elevated low-density lipoprotein (LDL), or “bad” cholesterol, which is also a risk factor for atherosclerosis in humans.
Half of the mice in the study were fed a vegetable-free diet and half got 30 percent of their calories from a mixture of freeze-dried broccoli, green beans, corn, peas and carrots. These five vegetables are among the top-10 vegetables in the United States based on frequency of consumption.
After 16 weeks, the researchers measured two forms of cholesterol to estimate the extent of atherosclerosis. In mice that were fed the vegetable diet, researchers found that plaques in the vessel were 38 percent smaller than those in the mice fed vegetable-free diets. There were also modest improvements in body weight and cholesterol levels in the blood.
The estimates of atherosclerosis extent involved measuring free and ester cholesterol, two forms that accumulate in plaques as they develop. The rate of this accumulation has been found to be highly predictive of the actual amount of plaque present in the vessels.
Most people do not eat an optimal amount of vegetables. Another report providing yet more evidence on the benefits of vegetables won't cause many to alter their diets. Maybe what we need is some sort of Pop-Tart that is mostly vegetables but with flavoring designed to hide the vegetable taste.
Veggies reduce inflammation.
He said that a 37 percent reduction in a certain marker of inflammation in mice suggests that vegetable consumption may inhibit inflammatory activity.
“It is well known that atherosclerosis progression is intimately linked with inflammation in the arteries,” Adams said. “Our results, combined with other studies, support the idea that increased vegetable consumption inhibits atherosclerosis progression through antioxidant and anti-inflammatory pathways.”
Since mouse studies can be done more rapidly and cheaply than human studies I'd love to see this study repeated with different vegetables diluted with conventional mouse chow to see which veggies are most potent. Potent veggies could be used in smaller doses in veggie Pop-Tarts. High potency veggie Pop-Tarts are our only hope.
We need high potency veggie Pop-Tarts to buy us some extra time while we wait for the development of SENS technologies.
One of the problems with using vaccines to stop a flu pandemic is that it takes many months to develop and manufacture vaccines against a new flu strain. Even worse, the manufacturing capacity for making vaccines is woefully inadequate for the case of a global pandemic. In a pandemic the need for vaccine would go up by over an order of magnitude, and the current process for making flu vaccine is hard to scale up. One way to partially solve this problem would be to manufacture vaccines in advance using flu strains that are not exact matches for an eventual pandemic strain. Support for pre-pandemic vaccine production is building.
"People are taking pre-pandemic vaccination seriously," says Derek Smith at the University of Cambridge. In May, a meeting of scientists and manufacturers at the World Health Organization in Geneva, Switzerland, recommended the development of vaccines that could be used to inoculate people before a pandemic takes hold. These, they said, must have long-lasting effects, and be "broad-spectrum" enough to work against whatever pandemic virus emerges. Several novel vaccines that do both are now close to testing in humans. They include the addition of immunity-stimulating chemicals called adjuvants, vaccines made of DNA instead of the virus itself, and perhaps the ultimate - a vaccine that protects against every kind of flu.
While there is no way of knowing before a pandemic starts exactly how well the vaccine will work, the risks of doing nothing could be far greater. "Stockpiling pre-pandemic vaccines is more valuable than people realise," Robert Webster of St Jude Children's Research Hospital in Memphis, Tennessee, told a flu conference in Singapore last month. "It may not necessarily protect you from infection, but it will probably stop you dying."
I've been in favor of this idea for years and continue to think that movement in the direction of developing pre-pandemic vaccines is too slow. The problem with pre-pandemic vaccines is that they won't be an exact match for whatever strain of influenza eventually becomes pandemic. But if an H5N1 avian flu strain mutates into a human pandemic strain then even a vaccine made from a different H5N1 strain will provide partial immunity to the pandemic strain. That partial immunity might some day save millions of lives.
The article reports on promising advances in DNA-based vaccines and in adjuvants (which amplify immune system response to vaccines). Production of DNA vaccines could be scaled up much more rapidly than the current chicken egg-based vaccine manufacturing process.
Using electroencephalogram (EEG) measurements of reactions to a large assortment of images shown for several seconds, researchers found that the minds of women react most strongly to erotic images.
Researchers at Washington University School of Medicine in St. Louis measured brainwave activity of 264 women as they viewed a series of 55 color slides that contained various scenes from water skiers to snarling dogs to partially-clad couples in sensual poses.
What they found may seem like a "no brainer." When study volunteers viewed erotic pictures, their brains produced electrical responses that were stronger than those elicited by other material that was viewed, no matter how pleasant or disturbing the other material may have been. This difference in brainwave response emerged very quickly, suggesting that different neural circuits may be involved in the processing of erotic images.
"That surprised us," says first author Andrey P. Anokhin, Ph.D., research assistant professor of psychiatry. "We believed both pleasant and disturbing images would evoke a rapid response, but erotic scenes always elicited the strongest response."
As subjects looked at the slides, electrodes on their scalps measured changes in the brain's electrical activity called event-related potentials (ERPs). The researchers learned that regardless of a picture's content, the brain acts very quickly to classify the visual image. The ERPs begin firing in the brain's cortex long before a person is conscious of whether they are seeing a picture that is pleasant, unpleasant or neutral.
But when the picture is erotic, ERPs begin firing within 160 milliseconds, about 20 percent faster than occurred with any of the other pictures. Soon after, the ERPs begin to diverge, with processing taking place in different brain structures for erotic pictures than those that process the other images.
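A quick sanity check on the timing arithmetic: if the 160 millisecond onset for erotic images is about 20 percent faster than for the other pictures, the implied baseline onset is around 200 milliseconds. The 200 ms figure is my back-of-envelope inference, not a number from the study:

```python
# ERP onset latency for erotic images, per the article (milliseconds)
erotic_onset_ms = 160

# "about 20 percent faster" implies the onset latency for other images was:
speedup = 0.20
baseline_onset_ms = erotic_onset_ms / (1 - speedup)

print(baseline_onset_ms)  # 200.0 ms implied for non-erotic images
```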
I wonder if there is any trend as a function of age where the minds of older women might react less strongly to erotic images. I also wonder whether there is a difference with women on hormone suppressing therapies or who have had hysterectomies without replacement hormones.
Women rate erotic material as less appealing than men do. But their brains react as much to it as do the brains of men.
A great deal of past research has suggested that men are more visual creatures than women and get more aroused by erotic images than women. Anokhin says the fact that the women's brains in this study exhibited such a quick response to erotic pictures suggests that, perhaps for evolutionary reasons, our brains are programmed to preferentially respond to erotic material.
"Usually men subjectively rate erotic material much higher than women," he says. "So based on those data we would expect lower responses in women, but that was not the case. Women have responses as strong as those seen in men."
I would expect people with stronger sex drives to react more strongly to erotic materials. I wonder if one could use the EEG results like a biofeedback machine and train oneself to react more or less strongly to erotic materials. If one could train oneself to react more strongly would it provide any benefit for people who feel frigid or who get little pleasure from sex?
I'd also be curious to know whether a fast IQ test delivered after being shown images that elicit a greater brain response would show higher or lower mental ability. Do images that evoke stronger responses make the mind work quicker in general? Or do they divert mental resources away from problem solving?
By comparing how quickly human facial expressions of different types are detected in a crowd of neutral faces, researchers have demonstrated that male angry faces are a priority for visual processing – particularly for male observers. The findings are reported by Mark Williams of the Massachusetts Institute of Technology and Jason Mattingley of the University of Melbourne, Australia, and appear in the June 6th issue of Current Biology.
In evolutionary terms, it makes sense that our attention is attracted by threat in the environment. It has long been hypothesized that facial expressions that signal potential threat, such as anger, may capture attention and therefore "stand out" in a crowd. In fact, there are specific brain regions that are dedicated to processing threatening facial expressions. Given the many differences between males and females, with males being larger and more physically aggressive than females, one might also suspect differences in the way in which threat is detected from individuals of different genders.
In the new work, Williams and Mattingley show that angry male faces are found more rapidly than angry female faces by both men and women. In addition, men find angry faces of both genders faster than women, whereas women find socially relevant expressions (for example, happy or sad) more rapidly. The work suggests that although males are biased toward detecting threatening faces, and females are more attuned to socially relevant expressions, both sexes prioritize the detection of angry male faces; in short, angry men get noticed. The advantage for detecting angry male faces is consistent with the notion that human perceptual processes have been shaped by evolutionary pressures arising from the social environment.
Angry males are a greater potential threat than angry females. So it makes sense that natural selection would favor a wiring of human brains that makes angry male faces more easily recognized.
There's a security angle here: Secret Service and other professional bodyguard outfits that need to recognize angry male would-be assassins might do that job better with male agents. However, do assassins look and feel angry? Or are some feeling thrills at what they are about to do? If assassins express other kinds of emotions when preparing to kill then maybe women would be better at recognizing them.
What I wonder: Just how many distinct adaptations and abilities has natural selection wired into human brains? How many of those abilities are trade-offs with other abilities? For example, in the case above while males have an advantage recognizing angry faces females have an advantage in decoding the meaning of other facial expressions.
Also, once scientists identify which genetic variations make those abilities more or less pronounced which abilities will people choose to give their offspring? I think the question of how people will genetically engineer their offspring is one of the most important questions we face for the future.
“To make the drug-cues video, we worked with addicts who advised us on how to make it as realistic as possible while simulating scenes involving smoking or snorting cocaine,” said Wang. The scientists also asked the subjects to rate their level of craving while watching both videos, and assessed the severity of their addiction using a standard cocaine craving scale.
Dopamine levels were measured indirectly using positron emission tomography (PET) scanning at Brookhaven’s Center for Translational Neuroimaging. Each subject was injected with a radiotracer designed to bind to dopamine receptors in the brain. During scanning, the PET camera picks up the signal from any bound radiotracer so that levels of tracer bound to receptors can be compared with levels in the blood. As the body’s natural dopamine levels rise, this “endogenous” dopamine competes with the tracer for binding sites, so less radiotracer can bind to the receptors. Therefore, the lower the bound tracer signal, the higher the concentration of endogenous dopamine.
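The competitive-binding logic behind the PET measurement can be captured in a toy occupancy model. This is only an illustrative sketch; the dissociation constants and concentrations below are made-up numbers, not values from the Brookhaven study:

```python
def bound_tracer(tracer_nM, dopamine_nM, kd_tracer_nM=1.0, kd_dopamine_nM=100.0):
    """Fraction of receptors occupied by the radiotracer when the tracer
    and endogenous dopamine compete for the same binding sites.
    Standard competitive-binding occupancy: T/Kd_T / (1 + T/Kd_T + D/Kd_D)."""
    t = tracer_nM / kd_tracer_nM
    d = dopamine_nM / kd_dopamine_nM
    return t / (1.0 + t + d)

baseline = bound_tracer(tracer_nM=0.1, dopamine_nM=50)   # resting dopamine
surge = bound_tracer(tracer_nM=0.1, dopamine_nM=500)     # cue-induced surge

# More endogenous dopamine displaces the tracer, so the PET signal drops:
assert surge < baseline
```

The study's key inference runs this relationship backwards: a weaker bound-tracer signal is read as a higher endogenous dopamine concentration.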
Compared with the neutral video, the cocaine-cues video triggered a significant increase in dopamine in the dorsal striatum, a part of the brain involved in experiencing desire or motivation. The changes in dopamine were associated with the level of craving reported by the subjects and were largest in the most severely addicted subjects.
This finding is consistent with previous animal studies that have suggested a role for the dorsal striatum in cue-induced craving. In those studies, neutral stimuli such as a particular cage environment that had been paired with a drug during “training” sessions later triggered a dopamine increase in both the nucleus accumbens and the dorsal striatum, a response that was correlated with drug-seeking behaviors in the animals.
Frustrated desires for food also cause a rise in brain dorsal striatum dopamine levels.
The finding is also consistent with earlier Brookhaven research documenting dopamine increases in the dorsal striatum induced by exposure to food (see this release). In that study, healthy subjects were allowed to observe and smell their favorite foods, but not eat them; the more the subjects desired the foods, the higher their dopamine levels went.
“Finding this same association between dorsal striatum dopamine levels and cravings for food and drugs suggests that, in the human brain, drug addiction engages the same neurobiological processes that motivate food-seeking behaviors triggered by food-conditioned cues,” Volkow said. This research suggests that compounds that could inhibit cue-induced striatal dopamine increases would be logical targets for medication development to treat cocaine addiction.
These findings suggest to me that compounds which inhibit or reduce the desire for cocaine might also reduce cravings for food. A drug developed to treat coke addicts might also help people to lose weight.
Also, since the vast bulk of us experience food cravings, we non-drug-addicts probably understand the cravings that drive drug addicts better than many of us realize. Obese people who look down their noses with disapproval at drug addicts ought to go look in the mirror at the signs that they have their own very similarly caused cravings which they cannot control.
Some day we will gain the ability to tune our desires to better align our daily behavior with our longer term goals. Research into drug addiction, obesity, and other problems with human minds will produce much more than just treatments to suppress desires for food and drugs. We will also gain the ability to mold what causes our minds to feel satisfied, frustrated, impatient, happy, and sad. People will adjust their emotional reactions to make them better able to do tedious work and to pursue longer term goals.
In a paper published in PLoS Medicine, a team of researchers found that rises in the use of Prozac appear negatively correlated with suicide rates.
What Did the Researchers Do and Find?
They looked at annual suicide rates between 1960 and 1988 and compared them with annual rates in the period 1988 to 2002. They used several sources of data, including the Centers for Disease Control and the US Census Bureau. The researchers found that from the early 1960s until 1988, in the entire US population, between 12.2 and 13.7 people in every 100,000 committed suicide each year. After that time, the numbers of suicides gradually declined, with the lowest figure (10.4 people per 100,000) reached in 2000. The researchers did mathematical tests, which demonstrated that the steady decline was statistically associated with the increased number of fluoxetine prescriptions—that is, the more prescriptions there were, the fewer suicides there were. (There were around two-and-a-half million prescriptions of the drug in 1988, increasing to over 33 million in 2002.)
What Do These Findings Mean?
In all scientific research, it is an important principle that finding an association between two events does not prove that one caused the other to occur. However, the authors of this paper suggest that the use of this drug could have contributed to the reduction of suicide rates in the US in the period 1988 to 2002. Several other SSRIs are also now in common use, but they were not considered in this study, nor were other antidepressants, or other treatments for depression.
Prozac belongs to a class of antidepressants known as Selective Serotonin Reuptake Inhibitors (SSRIs). They work by blocking transporter proteins on nerve cells that carry serotonin back into the neurons. Serotonin concentrations therefore rise in the gap regions between nerves, where the serotonin instead binds to receptors, increasing the signaling that (at least in theory) will lighten moods.
Methods and Findings
Sources of data included Centers of Disease Control and US Census Bureau age-adjusted suicide rates since 1960 and numbers of fluoxetine sales in the US, since its introduction in 1988. We conducted statistical analysis of age-adjusted population data and prescription numbers. Suicide rates fluctuated between 12.2 and 13.7 per 100,000 for the entire population from the early 1960s until 1988. Since then, suicide rates have gradually declined, with the lowest value of 10.4 per 100,000 in 2000. This steady decline is significantly associated with increased numbers of fluoxetine prescriptions dispensed from 2,469,000 in 1988 to 33,320,000 in 2002 (rs = −0.92; p < 0.001). Mathematical modeling of what suicide rates would have been during the 1988–2002 period based on pre-1988 data indicates that since the introduction of fluoxetine in 1988 through 2002 there has been a cumulative decrease in expected suicide mortality of 33,600 individuals (posterior median, 95% Bayesian credible interval 22,400–45,000).
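The rs = −0.92 figure is a Spearman rank correlation, i.e. the correlation of the yearly ranks of prescriptions and suicide rates rather than of the raw values. Here is a minimal sketch of that calculation, using hypothetical yearly figures shaped like the study's trend (the real analysis used the actual yearly data from 1988 to 2002):

```python
def spearman(xs, ys):
    """Spearman rank correlation for tie-free data:
    rs = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), d_i = rank difference."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical yearly figures: prescriptions (millions) rise steadily
# while suicide rates (per 100,000) fall steadily.
prescriptions = [2.5, 5.0, 9.0, 14.0, 21.0, 28.0, 33.3]
suicide_rate = [12.9, 12.5, 12.0, 11.5, 11.0, 10.7, 10.4]

print(spearman(prescriptions, suicide_rate))  # -1.0 (ranks perfectly reversed)
```

With real, noisier data the ranks do not reverse perfectly, which is why the study reports −0.92 rather than −1.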
The introduction of SSRIs in 1988 has been temporally associated with a substantial reduction in the number of suicides. This effect may have been more apparent in the female population, whom we postulate might have particularly benefited from SSRI treatment. While these types of data cannot lead to conclusions on causality, we suggest here that in the context of untreated depression being the major cause of suicide, antidepressant treatment could have had a contributory role in the reduction of suicide rates in the period 1988–2002.
As the authors acknowledge, suggestions that there may be a causal relationship between fluoxetine prescription and suicide rates would represent an overinterpretation of the results. In a study like this, it is also important to consider other potential explanations for the fall in suicide rates, such as improvements in the economy or improved management of depression by primary-care providers. Moreover, as the study did not include people above 65 years of age, who are known to have an increased risk of suicide (especially in men) compared with younger people, the findings are limited to adults up to 65 years of age.
Another limitation of this study was the use of fluoxetine as a model of SSRI use. Several effective SSRIs have been introduced since the arrival of fluoxetine, and these newer SSRIs may have had an additional potential impact on suicide rates. Finally, although the authors used the best available data on the number of prescriptions of fluoxetine, these estimations are not very accurate in terms of actual intake of antidepressants. As there are no reliable figures available on adherence to drug prescriptions at the population level, the real effect of antidepressants on suicide rates is difficult to estimate.
Even if this result holds up and the SSRIs have prevented 33,600 deaths, that is probably small potatoes compared to the benefits in reduced mortality that have flowed from use of statin drugs to lower cholesterol. However, if SSRIs are brightening moods then they are probably making a big economic impact by reducing the lethargy that comes with depression.
OTTAWA, June 13 — In an effort to revive a nuclear energy program that has been marred by billions of dollars in debt, cost overruns and disappointing performance, the province of Ontario on Tuesday announced a plan to spend about 20 billion Canadian dollars ($18 billion) to build reactors and refurbish some current units.
The plan also includes about 20 billion Canadian dollars for renewable energy projects and 6 billion Canadian dollars ($5.3 billion) for power conservation.
They will spend big money on renewables and conservation as well. Yet in spite of putting up big money for these purposes they obviously do not think big efforts in those areas will solve all their energy problems. So they were faced with a choice between coal and nuclear.
At least 2 new nuclear power plants will be built.
The project will initially involve at least two units at a cost of about 2 billion Canadian dollars each. But that number is expected to rise after an analysis by the government-owned Ontario Power Generation on the feasibility and cost effectiveness of renewing current stations.
The 2 billion Canadian dollars works out to about $1.8 billion in US dollars per reactor.
I've stated that we face a choice between coal and nuclear. The Ontario government saw the same choice, chose nuclear power, and has decided to move away from coal.
Energy Minister Dwight Duncan directed the Ontario Power Authority (OPA) today to proceed with its recommended 20-year electricity supply mix plan, with some revisions.
The plan achieves a healthy balance by moving away from coal in favour of new nuclear power and renewable energy. The government has set targets that will double energy efficiency through conservation and double the amount of energy from renewables by 2025.
The government has directed Ontario Power Generation (OPG) to undertake feasibility studies for refurbishing units at the Pickering and Darlington sites. OPG has also been directed to begin the work needed for an environmental assessment for the construction of new units at an existing nuclear facility. Nuclear is expected to continue to be the single-largest source for Ontario's electricity in 2025.
The Association of Power Producers of Ontario (APPrO) welcomed today's announcement on Ontario's supply mix, saying that they are confident the power generation industry can bring forward the kind of supply contemplated by the government, on time, on budget and for a reasonable cost, said APPrO President David Butters. He added that development of this new generation will mean billions of dollars worth of new investment and jobs in Ontario, bringing environmentally sustainable new technologies and innovation, along with new jobs and a host of economic opportunities.
I bet this Canadian decision will have some influence on the energy debate in the United States and in Britain. The British press has been reporting that the government is shifting toward a more pro-nuclear stance. The Ontario government decided they didn't want the pollution that more coal plants would bring. That choice will appeal to some people in the United States.
Now, as temperatures creep up in much of the country and the peak air-conditioning season begins, it's worth noting that from an energy perspective, there's much good happening in California. More than 30 new power plants have come online in the past six years, generating 12,000 megawatts. The California Energy Commission estimates that it will have generation reserves of more than 20% this August, nearly three times what's required should power usage spike.
The better story, though, lies on the demand side of the equation, or what the state's fitness-focused governor might call portion control. Since California began aggressively pursuing energy efficiency in the mid-1970s, the state's per-capita electricity usage has remained flat at around 6,500 kilowatt-hours per person. In the rest of the country, consumption has risen from 8,000 to 12,000 kilowatt-hours in the same time frame. In terms of carbon emissions, that's the equivalent of keeping 12 million cars off the road.
Click through and you can read about all the ways the state government of California has managed to keep demand for electricity lower. However, one way is not mentioned in the article: electricity costs more in California than in most states. California's electricity runs about 12 cents per kilowatt-hour (kWh), New England about 13 cents per kWh, and New York 16.45 cents per kWh (wow!), versus a US national average of 9.67 cents, 8.32 for the Mountain states, and 6.97 for Wyoming. Not coincidentally, a chart of per capita electricity usage by state shows Hawaii (22.83 cents per kWh), Rhode Island (14.84), New York (16.84), and California (12.98) in the 47th through 50th spots. At the top of the list, cheap-electricity Wyoming (6.97) has over three times the per capita electricity usage of California and about half the electric cost of the four lowest per capita usage states. The next biggest electricity-using states are Kentucky (6.44), South Carolina (8.75), and Alabama (7.99). Prices have powerful effects on demand: states with cheap electricity use more electricity. No surprise there.
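The price/usage relationship described above can be put into a single number with a simple correlation. The prices (cents/kWh) below are the ones quoted in this post; the per-capita usage figures are hypothetical stand-ins invented for illustration, since the post only gives relative comparisons:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# States: HI, NY, RI, CA, SC, AL, WY, KY -- prices from the post,
# usage numbers (kWh/person/yr) invented to match the stated pattern.
price = [22.83, 16.84, 14.84, 12.98, 8.75, 7.99, 6.97, 6.44]
usage = [6500, 6800, 7000, 6500, 15000, 16000, 21000, 17000]

r = pearson(price, usage)
assert r < -0.7  # expensive electricity goes with lower per-capita usage
```

A real analysis would of course use actual per-capita usage figures and control for climate and income, as the next paragraphs suggest.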
How much of California's lower electric usage is due to higher electric prices? How much is due to weather that provides more natural lighting and the opportunity to spend more days outside? How much is due to government policies aimed at encouraging conservation and more efficient energy usage?
A historical analysis which compared California and national electric prices and per capita electricity usage along with per capita income (affluent people can afford to spend more to heat the jacuzzi and use air conditioning) might be able to tease out the effects of government policies versus prices. Of course, electric price differences are also a product of government policies where some regions put up bigger obstacles for coal and nuclear plants and new electric plants in general. Also, California and other states that mandate increased use of renewable electric energy sources are driving up electric prices and thereby discouraging electric usage.
George W. Bush is not a believer in many government efforts to improve energy efficiency. The Bush Administration continues to cut energy conservation programs.
If Congress accepts the Energy Department's proposed 2007 budget, it will cut $152 million - some 16 percent - from this year's budget for energy-efficiency programs. Adjusting for inflation, it would mean the US government would spend 30 percent less on energy efficiency next year than it did in 2002, the ACEEE says.
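The figures in that paragraph imply the size of the current budget. Since $152 million is described as "some 16 percent" of this year's energy-efficiency budget, the current budget works out to roughly $950 million. This is my back-of-envelope inference from the article's figures, not a number stated in the article:

```python
cut = 152e6          # proposed cut, dollars
cut_fraction = 0.16  # "some 16 percent" of this year's budget

current_budget = cut / cut_fraction    # implied current budget
proposed_budget = current_budget - cut

print(round(current_budget / 1e6))   # 950  (million dollars)
print(round(proposed_budget / 1e6))  # 798  (million dollars)
```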
One energy-efficiency program on the chopping block is the Heavy Vehicle Propulsion and Ancillary Subsystems. It helps improve the fuel efficiency of heavy-duty trucks, one of the nation's biggest oil consumers. That program is "zeroed out" in the 2007 budget request.
The same fate awaits the $4.5 million Building Codes Implementation Grants program. It helps states adopt more energy-efficient requirements for new buildings, the nation's largest consumer of electricity and natural gas.
The $8 million Clean Cities program has helped clean-fuel technologies, like buses that run on compressed natural gas, get to market. But it's slated for a $2.8 million cut.
The article lists other programs that will be cut and includes conflicting views on the efficacy of all these programs. I suspect that the program for raising building code standards for energy efficiency is money very well spent. Big improvements in building efficiency are achievable with existing technology and can be made fairly cheaply on new construction. Best to make sure new buildings are energy efficient, since the average building lasts for many decades.
Production in Alberta is up 61 percent over the past four years. This year, Alberta's oil sands are expected to produce 1.2 million barrels a day, roughly equal to the production of Texas.
However it's extracted, all bitumen has to be transformed into oil in a process called upgrading. There are several different steps in upgrading, all of them using a lot of energy, usually natural gas. It costs $23 to $26 a barrel - depending on the project - to produce light oil from the sticky goo of the oil sands.
CALGARY, Alberta - A massive rise in crude production from Canada's oil sands region over the next decade will nearly triple the area's call on strained natural gas supplies, Canada's national energy regulator said Thursday.
Production from the oil sands of northern Alberta is expected to rise to more than 3 million barrels a day by 2015, according to a study by the National Energy Board, triple last year's output.
The Canadian Association of Petroleum Producers’ forecast two weeks ago was higher than NEB’s at 3.5 million bpd by 2015 and 4.9 million bpd by 2020. Both said getting the increased oil production to markets must keep pace.
Sounds like a lot, right? Well, world oil production is currently about 81 million barrels per day, and most of the fields have peaked or will have peaked by 2020. Conventional Canadian production is declining just like American conventional production. So even the optimistic forecast of an increase of almost 4 million barrels a day is not enough to make much of a dent in total world oil supply.
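The arithmetic behind that "not much of a dent" claim, using the figures above (1.2 million bpd today, the optimistic 4.9 million bpd CAPP forecast for 2020, and 81 million bpd of current world production):

```python
world_bpd = 81e6        # current world oil production, barrels per day
oilsands_now = 1.2e6    # Alberta oil sands output today
oilsands_2020 = 4.9e6   # CAPP's optimistic forecast for 2020

added = oilsands_2020 - oilsands_now  # 3.7 million bpd of new supply
share_of_world = added / world_bpd

print(f"{share_of_world:.1%}")  # 4.6% of current world production
```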
I'm curious about production costs of alternatives for conventional oil since so many oil fields are in decline and more are peaking every year. Coal-to-liquid (CTL) looks like the most likely alternative for liquid fuels at perhaps $40 to $45 per barrel. Biomass ethanol is another possibility that will become more competitive once cellulosic technologies become cheap enough to use to break down the cellulose in trees and bushes. Oil shale is another possibility, and oil shale might turn out to be only slightly more expensive than tar sands oil extraction.
My guess is that CTL can scale much higher than tar sands oil, and eventually oil shale might surpass tar sands in daily production as well.
CHICAGO -- More than 130 wind turbines are proposed for the hilltops of central Wisconsin, but that project and at least 11 others have been halted by the Defense Department as it studies whether the projects could interfere with military radar.
Some people do not want to see views around the country ruined by wind farms. Others think wind farms are cool things to look at. My attitude is that having a few of them will provide some neat things to go look at but the operative word here is few. I want to look at mountains and just see mountains. I want to look at coasts and just see birds and perhaps the occasional passing ship. I realize this is just a personal esthetic preference. But I'm hardly alone in this preference.
The regulatory obstacles affecting so many projects are a side effect of efforts by Senator Ted Kennedy and allies to prevent a single wind farm from ruining views from Cape Cod and Nantucket. Whoever said the upper classes aren't powerful?
They say their wind turbines are victims of the ongoing dispute between Cape Cod residents and developers of the proposed Cape Wind farm in Nantucket Sound. The Defense Department study was put in the 2006 Defense Authorization Act -- inserted, say wind farm developers, by senators who want to block Cape Wind.
"This legislation was intended to derail Cape Wind, but it had a boomerang effect and affected a lot of projects around the country," said Michael Skelly of Horizon Wind Energy, a Texas company constructing the country's largest wind farm near Bloomington, Ill.
Tell the peasants to eat cake. We don't want to ruin the view from our country houses at Versailles.
The rate of wind turbine permit applications to the US Federal Aviation Administration has more than quadrupled since 2004.
The FAA has received more than 4,100 wind turbine applications so far this year, compared with about 4,300 in 2005 and 1,982 in 2004.
Not to worry wind power fans. Big money is lining up to invest in wind turbines. So the upper classes will be represented on both sides of the battle with probably more money lined up for the spread of wind farms than against it.
How about a market in views? Then the Cape Cod folks could refuse to sell their views. Others could sell theirs.
I happen to like scenic vistas myself and would prefer we accelerate research on solar photovoltaics in order to come up with more visually preferable alternatives. I figure shingles and siding made from nanotech materials that aren't even recognizable as photovoltaics will be the best solution.
I also think we ought to accelerate the development of next generation nuclear reactor designs.
As the costs of natural gas and oil remain high, we are going to start using more alternatives. Coal is going to be the biggest winner. We are also going to see farms expand into natural areas to grow more crops for biomass. Plus, many more wind farms will get constructed. I prefer accelerating technological developments to lower the costs of alternatives that have fewer environmental impacts. But not enough people hold this view with enough intensity for it to have much influence. So more coal and more wind towers are in our future.
High energy prices are focusing more minds on energy policy. A big push is on to set a national goal of 25% of energy from renewables by 2025.
In Washington this week, a bipartisan group of lawmakers, industry leaders (including the three Detroit automakers), farm groups, governors, county officials, and environmentalists launched an effort to have the nation get 25 percent of its total energy from renewable sources by 2025.
This ambitious proposal - dubbed "25x'25" - goes well beyond what Congress and the White House have enacted so far, and it's likely to encounter environmental and economic speed bumps along the way.
The goal of securing one-fourth of the nation's total energy from renewable sources such as wind, solar, biomass, and biogas by 2025 was introduced this week as a concurrent resolution in both houses of Congress. So far, it has at least 30 cosponsors with the number growing daily.
The big downside of such a coalition is that they will make biomass for vehicle ethanol a big component of that drive. Whether that would yield a net environmental benefit is arguable. On the bright side, the ratio of energy out to energy in will rise due to the development of cellulosic technologies and a shift toward switchgrass and other biomass energy sources instead of corn. But the amount of land that would need to move into farm production will still increase substantially. This coalition ought to state how much additional land they expect to use to reach their goal.
At an international level a move toward biomass would be even more problematic for the environment. Picture densely populated tropical countries cutting down rainforests to plant fields for biomass. Not a pretty picture.
The supporters of this initiative think attitudes are shifting in their direction. I tend toward the "Seen one way" view in the first paragraph quoted below.
Seen one way, this new energy effort is a coalition of well-known special interests like ethanol producers, tree farmers, and solar equipment manufacturers. But boosters believe a critical mass of public support has developed that puts a strong political wind at their backs.
One example: The kickoff session for the annual meeting of the Western Governors Association this weekend in Sedona, Ariz., focuses on clean energy. The WGA, whose 18 state executives (11 of whom are Republicans) oversee the fastest growing states in terms of population and energy consumption, will propose the development of 30,000 megawatts of "clean and diverse energy" across the American West by 2015 while increasing energy efficiency 20 percent by 2020.
The coalition has a website at 25x25.org.
“Today we have Republicans and Democrats, rural and urban interests, and representatives from over 140 different farm, forestry and environmental organizations coming together behind a common energy goal for the nation,” said 25x’25 Steering Committee Co-Chair Bill Richards. “This introduction is truly unprecedented.”
Lead sponsors include: Sens. Charles Grassley (R-Iowa), Ken Salazar (D-Colo.), Dick Lugar (R-Ind.) and Tom Harkin (D-Iowa), and Reps. Bob Goodlatte (R-Va.), Collin Peterson (D-Minn.), Marilyn Musgrave (R-Colo.) and Mark Udall (D-Colo.).
Does a forest produce more cellulose and more energy per year than a field growing switchgrass? After all, the forest is there 365 days a year, whereas most agricultural plants grow for only a much shorter season. How much more energy per acre could be harvested on a tree farm that grows for, say, 10 or 20 years as compared to the same land used for seasonal agriculture? Anyone know?
Support has been building for the 25x'25 initiative from all across the country. Over 100 organizations have endorsed the vision, including broad-based farm organizations like the American Farm Bureau Federation, the National Farmers Union, and companies like Deere & Company; as well as environmental groups like NRDC, Environmental Defense and the National Wildlife Federation. In addition, Governors Jeb Bush (R-Fla.), Dave Heineman (R-Neb.), Tim Pawlenty, (R-Minn.), Brian Schweitzer (D-Mont.), Ed Rendell (D-Pa.) and Mitch Daniels (R-Ind.) have endorsed the goal, as have the state legislatures of Colorado, Nebraska, Kansas and Vermont.
What do the NRDC and National Wildlife Federation think of the idea of shifting millions more acres of land into agricultural uses even as more land gets shifted into residential usage due to largely immigrant-driven population growth? Any lights on in the environmental movement?
How much biofuel can we produce?
Producing energy from America's abundant farm and forest lands is an idea whose time has come. In the State of the Union address this year, President Bush set a goal of replacing 75% of our oil imports from the Middle East by 2025 - a quantity very similar to 25x'25 (because most of our oil comes from other regions). Oak Ridge National Laboratory reports that we have more than 1 billion tons of unused raw materials each year that could be used to make biofuels. In fact, one of America's leading venture capitalists says 25x'25 is too conservative a goal, and that we can shoot higher and move faster.
A lot of those "unused raw materials" currently serve as food for a large variety of plant and animal species. I'm guessing the "leading venture capitalist" mentioned is Vinod Khosla.
How do we produce that much biofuel?
To get to 25x'25, we will need to use all kinds of plants for biofuels. Today ethanol is made from corn, sugar cane, and sweet sorghum. Biodiesel is made from oil seeds like soybeans and canola and from nuts like coconut, palm, and jatropha. Advanced biofuels can be made from the "cellulose" in trees, grass, agricultural residue (corn stalks, cotton gin, rice hulls), and municipal solid waste. Cellulose makes up the majority of a plant's structure and can be broken down into sugars, which can then be fermented and made into ethanol. The President vowed in the State of the Union to make advanced ethanol available by 2011. Once commercialized, advanced ethanol will be competitive with $35 per barrel oil. Studies indicate that the U.S. can produce 50 billion gallons of cellulosic ethanol using only agricultural residue.
50 billion gallons sounds like a lot, right? Well, divide it by 300 million people. That's only 167 gallons per person. Note that ethanol has a much lower energy density than gasoline. Plus, energy will be used to collect the agricultural residue and operate the ethanol production plants. So the gain is even smaller than the gallons-per-person figure suggests. We'd be better off accelerating the development of high conversion efficiency photovoltaics and better battery technologies. The photovoltaics would use far less land and have less environmental impact. Nukes would use far less land as well.
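The per-person arithmetic can be checked directly. The energy ratio of ethanol to gasoline used below (roughly two thirds per gallon) is an approximate figure I am assuming, not one from the report:

```python
# Rough per-capita check of the 50 billion gallon cellulosic ethanol figure.
total_gallons = 50e9
population = 300e6
ethanol_vs_gasoline_energy = 0.66  # approximate energy-per-gallon ratio (assumed)

per_person = total_gallons / population
gasoline_equivalent = per_person * ethanol_vs_gasoline_energy

print(round(per_person))           # ~167 gallons of ethanol per person
print(round(gasoline_equivalent))  # ~110 gasoline-equivalent gallons per person
```

So even before counting the energy spent gathering residue and running the plants, the effective per-person yield is closer to 110 gasoline-equivalent gallons than 167.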
Genetic causes of behavior matter because they influence us right now. But they will matter even more in the future when offspring genetic engineering becomes a reality. I think it unlikely that people will consciously choose the same frequencies of genetic variations for their offspring as occur naturally. Every human trait that has some genetic causes is going to become either more or less frequent when people can choose which genetic variations to give their offspring. Hence every report about genetic causes of some human behavior is a report about something humans do that they'll become either more or less inclined to do in the future. Will parents choose to use genetic engineering to make their kids more entrepreneurial?
Scott Shane, the Mixon Professor of Entrepreneurial Studies at Case Western Reserve University's Weatherhead School of Management; Nicos Nicolaou, a lecturer in entrepreneurship at the Tanaka School of Business of Imperial College London; and Janice Hunkin, Lynn Cherkas, and Tim Spector of the Twin Research & Genetic Epidemiology Unit at St Thomas' Hospital in London, home of the UK Twin registry of over 10,000 twins, collaborated in this unique study. They compared rates of entrepreneurship between and among more than 1,200 pairs of identical and fraternal twins in the U.K. and conclude that nearly half—48 percent—of an individual's propensity to become self-employed is genetic.
The authors studied self-employment among 609 pairs of identical twins, and compared it to self-employment among 657 pairs of same-sex fraternal twins in the U.K. Identical twins share 100% of their genetic composition, while fraternal twins share about 50%, on average. Thus differences in the rates at which pairs of identical twins both become entrepreneurs and the rates at which both members of fraternal twins both become entrepreneurs are attributable to genetics. "One can look at the patterns of concordance (the numbers of pairs of twins in which both members are or are not entrepreneurs) and reasonably infer that genetic factors account for the differences," says Shane.
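As a rough sketch of the logic behind that inference: the classic Falconer estimator puts heritability at twice the gap between identical-twin and fraternal-twin similarity, since identical twins share twice the genetic overlap. The correlation values below are hypothetical numbers chosen only to illustrate the method, not the study's actual figures:

```python
# Falconer's rough heritability estimate from twin similarity:
#   h^2 = 2 * (r_identical - r_fraternal)
# The correlations here are hypothetical illustrations, not data from the study.
def falconer_heritability(r_mz, r_dz):
    return 2 * (r_mz - r_dz)

h2 = falconer_heritability(r_mz=0.54, r_dz=0.30)
print(round(h2, 2))  # 0.48, i.e. ~48% of variance attributed to genes
```

The actual paper likely used more sophisticated variance-decomposition modeling, but the intuition is the same: if identical twins match each other much more often than fraternal twins do, the excess similarity is attributed to genes.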
The authors propose several methods by which genetic factors might influence people's tendency to become entrepreneurs. For example, genes may predispose an individual to develop traits such as being sociable and extroverted, which in turn facilitate skills such as salesmanship, which are vital to entrepreneurial success.
In addition, genes have been shown to affect the level of education an individual receives, and more highly educated people are likelier to become entrepreneurs because they are better able to recognize new business opportunities when they arise.
It is likely that entrepreneurship comes as a result of other qualities as mentioned above. Will parents choose those qualities based on a desire to make their kids self-employed? Or will they choose those qualities mainly for other reasons and will the effect on entrepreneurial behavior come as a side effect of choices made for other reasons?
People in different cultures, economic classes, occupations, religions, and with different genetically determined qualities for their own minds will make different choices on average. Will this tend to make the human race diverge? Or will there be a wide consensus on all the important genetically controlled qualities of the mind and will humanity tend to converge?
One split I expect: I predict some religious folks will choose genetic qualities that make their kids more inclined to have faith. Whereas more empirically minded folks will choose genetic qualities that make their kids highly skeptical, critical, and empirical. Though some of a more socialistic bent might choose qualities that make kids turn out more altruistic and group-oriented.
ARLINGTON HEIGHTS, Ill. – Want to know a person's real age? Just look at their hands, reports a study in the June issue of Plastic and Reconstructive Surgery®, the official medical journal of the American Society of Plastic Surgeons (ASPS). According to the study, most people can accurately tell a person's age by viewing only their hands.
"A primary motivation to have plastic surgery is to look and feel better, often by seeking a younger looking appearance. However, looking younger after your facelift or eyelid surgery can conflict with aged hands that simply do not match the face," said Roxanne Guy, MD, ASPS president-elect. "After the face, hands are the second most visible, tell-tale sign of one's age. If your goal is to look more youthful or you are bothered by the appearance of your hands, you may seriously want to consider hand rejuvenation."
In the study, people examined unaltered photographs of female hands and were asked to estimate the women's ages, i.e., younger than 20 years, 20 to 30 years, 30 to 40 years, etc. In the majority of cases, participants were able to accurately estimate the age of each woman in the unaltered photographs.
Participants were also asked to compare digitally altered photographs of female hands – blemishes and hand veins were removed or jewelry and nail polish were added – to unaltered photographs to assess which hands looked younger. The majority of participants felt that the altered photos of women's hands appeared younger. However, alterations to photos of very elderly hands – characterized by thin skin, age spots, wrinkles, deformity, veins and prominent joints – did not change the participants' ability to distinguish the person's age.
Overall, the physical characteristic which most commonly gave away age was prominent hand veins.
This being a press release from a society of plastic surgeons they went on to explain how plastic surgeons can do many things to hands to make them look younger. But SENS therapies that make our bodies younger through and through would be far more appealing.
Do people who have younger looking hands live longer on average? Does the outwardly visible condition of one's hands serve as a pretty good proxy for one's general rate of aging?
I'm expecting rejuvenation for skin to come much sooner than rejuvenation for some other parts of the body. The skin is very accessible for delivering treatments whether those treatments be gene therapies, cell therapies, or drugs that break age-related crosslinks between cells and fibers. Plus, people are willing to take risks in order to look better. Also, people spend their own money on plastic surgery rather than insurance money and are willing to pay for experimental treatments that will make them look better.
A little-known mental disorder marked by episodes of unwarranted anger is more common than previously thought, a study funded by the National Institutes of Health's (NIH) National Institute of Mental Health (NIMH) has found. Depending upon how broadly it's defined, intermittent explosive disorder (IED) affects as many as 7.3 percent of adults — 11.5-16 million Americans — in their lifetimes. The study is based on data from the National Comorbidity Survey Replication, a nationally representative, face-to-face household survey of 9,282 U.S. adults, conducted in 2001-2003.
I bet these people with IED will eventually be identifiable with brain scans. Suppose a treatment to stop IED is developed. Will future societies support mandatory delivery of therapies that prevent violent episodes?
People with IED have other mental problems.
People with IED may attack others and their possessions, causing bodily injury and property damage. Typically beginning in the early teens, the disorder often precedes — and may predispose for — later depression, anxiety and substance abuse disorders. Nearly 82 percent of those with IED also had one of these other disorders, yet only 28.8 percent ever received treatment for their anger, report Ronald Kessler, Ph.D., Harvard Medical School, and colleagues. In the June, 2006 Archives of General Psychiatry, they suggest that treating anger early might prevent some of these co-occurring disorders from developing.
To be diagnosed with IED, an individual must have had three episodes of impulsive aggressiveness "grossly out of proportion to any precipitating psychosocial stressor," at any time in their life, according to the standard psychiatric diagnostic manual. The person must have "all of a sudden lost control and broke or smashed something worth more than a few dollars…hit or tried to hurt someone…or threatened to hit or hurt someone."
People who had three such episodes within the space of one year — a more narrowly defined subgroup — were found to have a much more persistent and severe disorder, particularly if they attacked both people and property. The latter group caused 3.5 times more property damage than other violent IED sub-groups. Affecting nearly 4 percent of adults within any given year — 5.9-8.5 million Americans — the disorder leads to a mean of 43 attacks over the course of a lifetime and is associated with substantial functional impairment.
Evidence suggests that IED might predispose toward depression, anxiety, alcohol and drug abuse disorders by increasing stressful life experiences, such as financial difficulties and divorce.
Once we achieve the ability to reverse aging using SENS techniques the existence of people who are capable of explosive anger and physical attacks will be viewed as a much bigger threat due to the length of individual lives. A person who goes overboard in their reactions when they get angry is more likely to kill someone if they have centuries more to do so. If we live for thousands of years each of us will face a much larger risk of eventually getting murdered.
In an era when aging becomes fully reversible I expect we will witness movements to create new nations that have extremely selective immigration policies designed to keep out people with violent tendencies. New polities will be created by long livers who want to minimize their risk of death. Such polities will also implement very high safety standards.
Boston-- Age-related macular degeneration (AMD) is one of the leading causes of vision loss in older adults and a person's risk may partly depend upon diet. When it comes to carbohydrates, quality rather than quantity may be more important, according to new research by Allen Taylor, PhD, director of the Laboratory for Nutrition and Vision Research at the Jean Mayer USDA Human Nutrition Research Center on Aging (HNRCA) at Tufts University, and colleagues. Their findings were reported in the April 2006 issue of the American Journal of Clinical Nutrition.
Taylor and colleagues analyzed data from a sub-group of participants in the Nurses' Health Study (NHS) who were enrolled in the Nutrition and Vision Program. The researchers looked at the total amount of carbohydrates consumed over 10 years and the dietary glycemic index, which is a measure of the quality of overall dietary carbohydrate.
"Women who consumed diets with a relatively high dietary glycemic index had greater risk of developing signs of early age-related macular degeneration when compared with women who consumed diets with a lower dietary glycemic index," says lead author Chung-Jung Chiu, DDS, PhD, scientist in the Laboratory for Nutrition and Vision Research at the HNRCA and an assistant professor at Tufts University School of Medicine. High total carbohydrate intake, however, did not significantly increase the risk factor for AMD.
"In other words, the types of carbohydrates being consumed were more important than the absolute amount," explains Taylor, senior author. A high-glycemic-index diet is one that is rich in high-glycemic-index foods, which are converted more rapidly to blood sugar in the body than are low-glycemic-index foods.
You can lower your average dietary glycemic index in all sorts of ways. For example, the sticky rice served in Chinese restaurants has a very high glycemic index (i.e. it gets digested and the sugar in it passes into your bloodstream very quickly). Whereas Basmati rice is much lower and Uncle Ben's converted rice is lower still. Similarly, the types of wheat used to make bread have much higher glycemic index than the types of wheat used to make pasta, and whole grain generally is lower than white bread. The idea here is that you don't have to give up a major grain. You can just shift toward subtypes that have lower glycemic index.
Rick Mendosa has a great online list of foods and their glycemic indexes. Go study it. Also, read his introduction to glycemic index which offers all sorts of insights about why foods vary in glycemic index. Note that in the scaling he uses 100 is the index for glucose. The low range starts at 55 and goes down into the 30s and 40s for some grains and beans. Higher amylose grains have lower glycemic indexes because amylose starch is broken down fairly slowly in the digestive tract. I wish rices came with an amylose rating on the bag. Then purchase of a rice with 27% or 28% amylose would assure you are getting a low glycemic index variety. Note how he lists a low amylose corn muffin with a glycemic index of 102 and a high amylose corn muffin with a glycemic index of 49. Huge difference. Though the latter number might be due to rolled oats in the recipe.
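One handy way to reason about such substitutions is glycemic load, which scales a food's glycemic index by the carbohydrate in an actual serving. The GI values below are approximate illustrations of the sort found in published tables, not authoritative numbers:

```python
# Glycemic load = glycemic index * grams of carbohydrate per serving / 100.
# GI values here are approximate illustrations, not authoritative figures.
def glycemic_load(gi, carb_grams):
    return gi * carb_grams / 100

# Same 45 g carbohydrate serving, different rice varieties:
sticky_rice = glycemic_load(gi=98, carb_grams=45)   # low amylose, high GI
basmati_rice = glycemic_load(gi=58, carb_grams=45)  # higher amylose, lower GI
print(sticky_rice, basmati_rice)  # 44.1 26.1
```

Swapping one rice variety for another at the same serving size cuts the glycemic load nearly in half, which is the "shift toward subtypes" idea in numerical form.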
The Wikipedia Glycemic Index page is worth a read as well.
Operating commercial buildings consumes a sixth of all the energy used in the Western world. Getting rid of air conditioning could cut that consumption by as much as a third -- but people don't like to work in sweltering heat.
So MIT researchers are making computer-based tools to help architects design commercial buildings that cool occupants with natural breezes.
Buildings can be designed to encourage airflow and maintain temperatures that minimize or eliminate the need for conventional air-conditioning systems. "That approach improves air quality, ensures good ventilation and saves both energy and money," said Professor Leon R. Glicksman, director of MIT's Building Technology Program. Indeed, studies have shown that people generally feel more comfortable in a naturally ventilated building than in an air-conditioned one.
The researchers studied a building in Luton, Britain, which cools using natural ventilation and built a computer model to simulate how the building's air circulates. They were able to find ways to improve natural ventilation designs and think they can cut air conditioning costs in half.
"We found what we initially thought were some strange results when we did the full-scale-building tests," said Glicksman. "But using the computer model, we now understand the physics of it, first of all confirming that it's a real effect and second, why it occurred." Such effects can be corrected by building in automatic control systems that, for example, turn on the vent fans when needed to ensure the continuous flow of fresh air.
Based on these findings, the MIT team is formulating a simple, user-friendly computer tool that will help architects design for natural ventilation. They plan to incorporate the tool into their "Design Advisor," a web site (designadvisor.mit.edu) that lets architects and planners see how building orientation, window technology, and other design choices will affect energy use and occupant comfort.
Natural ventilation does, of course, have its limits. For example, during hot summers in Hong Kong or even Boston, conventional air conditioning would still be needed. But just using natural ventilation during spring and fall in Boston, for example, could save at least half the energy now used for year-round air conditioning, the researchers estimate.
Most popular discussions about energy costs tend to revolve around cars and other vehicles. But boosting efficiency of new building designs seems to me an easier goal to achieve and does not require so many basic breakthroughs in science and technology.
COLUMBUS , Ohio – Researchers found that they could eliminate the rewarding effect of cocaine on mice by genetically manipulating a key target of the drug in the animal's brain.
While the researchers aren't suggesting that these genetic modifications be made in humans, the work brings to light the key protein that controls cocaine's effects in the body, which may help scientists develop medications that achieve the same results and therefore help addicts overcome their dependence.
Humans are not evolutionarily adapted to handle recreational drugs. But some day with genetic engineering our offspring might be adapted to resist drug and alcohol abuse.
Howard Gu and colleagues at Ohio State University showed they could genetically engineer mice to be resistant to the effects of cocaine.
He and his colleagues raised laboratory mice with genetic alterations in the gene that codes for the dopamine transporter.
“By doing so we created a dopamine transporter that resists cocaine but also retains its function of taking up dopamine and carrying it back to the neurons,” Gu said.
I am not surprised that an alteration of a brain protein could produce a different reaction to a drug while at the same time retaining normal function. But what is amazing is that these scientists - using 2006 biotechnology - were able to find an alteration that produces this outcome.
The behavior of the mice with genetically engineered dopamine transporters suggests that they did not get a high off of cocaine.
“The normal mice spent more time in the compartment where they had received the cocaine injections,” Gu said. “These animals were seeking more cocaine. However, the mice with the modified transporters showed no preference for either test compartment within the box.”
The researchers used the video footage to measure each animal's activity level after a cocaine injection. The normal mice on cocaine covered roughly five times the distance of the control mice injected with saline (6 meters vs. 1 meter). In contrast, the cocaine-injected mice with the modified dopamine transporters covered about half the distance that the saline-only injected mice covered (roughly 1.5 meters vs. 3 meters).
“After the cocaine injections, the normal mice ran all over the place, sniffing and checking everything out in the box over and over again, until we took them out of the box,” Gu said. “But cocaine seemed to calm the modified mice, as they sat in a corner for long periods of time.”
“To the modified mice, cocaine appears to be a suppressant, not a stimulant,” Gu said.
Some people argue it will be hard to discover new ways to enhance cognitive function. But using today's biotechnology these Ohio State scientists found a way to reduce the ability of a drug to alter cognitive function. Imagine what tools will be available 20 years from now to use to search for ways to alter functionality in brain proteins.
At the request of readers I've been out looking for information about whether methylphenidate (Ritalin) and dextroamphetamine (Dexedrine; Adderall) boost IQ and SAT scores. Haven't come up with anything quantitative yet. But in the process of looking I came across some interesting reports on biofeedback treatments for ADD (attention deficit disorder)/attention deficit hyperactive disorder (ADHD). A recent Stanford symposium, entitled "Brainwave Entrainment to External Rhythmic Stimuli: Interdisciplinary Research and Clinical Perspectives", surveyed methods of using rhythmic stimuli as cognitive therapy. Maybe Janet Jackson has been delivering cognitive therapy.
Harold Russell, a clinical psychologist and adjunct research professor in the Department of Gerontology and Health Promotion at the University of Texas Medical Branch at Galveston, used rhythmic light and sound stimulation to treat ADD (attention deficit disorder) in elementary and middle school boys. His studies found that rhythmic stimuli that sped up brainwaves in subjects increased concentration in ways similar to ADD medications such as Ritalin and Adderall. Following a series of 20-minute treatment sessions administered over several months, the children made lasting gains in concentration and performance on IQ tests and had a notable reduction in behavioral problems compared to the control group, Russell said.
But the article does not quantify these gains.
The frequency of the delivered light and sound is controlled using biofeedback to measure brain activity.
"For most of us, the brain is locked into a particular level of functioning," the psychologist said. "If we ultimately speed up or slow down the brainwave activity, then it becomes much easier for the brain to shift its speed as needed."
Russell, whose study was funded by the U.S. Department of Education and included 40 experimental subjects, hopes to earn approval from the Food and Drug Administration to use the brainwave entrainment device as a treatment for ADD. The device uses an EEG to read brainwaves and then presents rhythmic light and sound stimuli through special eyeglasses and headphones at a slightly higher frequency than the brain's natural rhythm.
Thomas Budzynski, an affiliate professor of psychology at the University of Washington, conducted similar experiments with a small group of underachieving college students at Western Washington University. He found that rhythmic light and sound therapy helped students achieve a significant improvement in their grades.
Again, the article does not quantify the gains. Still, interesting.
Budzynski also found that rhythmic therapy could improve cognitive functioning in some elderly people by increasing blood flow throughout the brain. "The brain tends to groove on novel stimuli," Budzynski explained. "When a novel stimulus is applied to the brain, the brain lights up and cerebral blood flow increases." To maintain the high blood flow, Budzynski used a random alternation of rhythmic lights and sounds to stimulate the brains of elderly people. The result: Many of the seniors improved performance on an array of cognitive tests.
Wouldn't you like to try some of these methods of biofeedback to see if you could enhance your own cognitive function?
Jacques Duff, the Australian president-elect of the International Society of Neuronal Regulation, runs a centre in Melbourne that has treated more than 1000 people. He believes the treatment is so effective the need for medication can sometimes be eradicated.
"In the case of ADHD, within 20 sessions the effect is similar to Ritalin, with the effects being permanent," Duff says.
Biofeedback-based therapy strikes me as probably lower risk than drugs.
The IQ score boost of ADHD kids is probably much higher than would be seen with people who do not have ADHD. Also, the extent of the benefit might be different depending on whether one's problem is more distractibility versus hyperactivity.
As the technique works on strengthening brainwaves, just about anyone can benefit from it, with students and athletes attending clinics. However, Duff warns that budding Einsteins will be disappointed.
"There is an optimum set-point for the brain. You can't, for example, keep making someone smarter. On average though, an IQ increase of 15 points is seen in children with ADHD and learning difficulties."
The use of biofeedback for ADD/ADHD is nothing new. You can find lots of articles on it going back decades if you search on it.
Results: BASC Monitor and TOVA scores indicated similar significant improvements in both groups. No significant difference in treatment change was seen in between-group comparisons. Parents' subjective appraisal of treatment effect on ADHD was more positive for the videogame group. The videogame treatment was rated significantly more enjoyable by both parents and children. Trends on pre-post QEEG change maps indicated that the videogame training may have advantages in creating more quantitative EEG effect in the therapeutic direction.
Conclusions: We conclude that the videogame biofeedback technology, as implemented in the NASA prototype tested, produced equivalent results to standard neurofeedback in effects on ADHD symptoms. Both the videogame and standard neurofeedback improved the functioning of children with ADHD substantially above the benefits of medication. The videogame technology provided advantages over standard neurofeedback treatment in terms of enjoyability for the children and positive parent perception, and possibly has stronger quantitative post-treatment effects on EEG.
I'd love to see large scale controlled tests of the cognitive performance effects of ADHD drugs with SAT, IQ, and other tests delivered before and after administration of drugs to people with and without ADHD and to people with a wide range of IQ levels.
Also see my post ADHD Drugs In Vogue For Boosting College Test Scores.
London School of Economics philosophy professor Luc Bovens speculates that the rhythm method of birth control may lead to many more conceptions occurring as eggs are becoming less viable, so that the resulting embryos do not survive.
It is believed that the method works because it prevents conception from occurring. But says Professor Bovens, it may owe much of its success to the fact that embryos conceived on the fringes of the fertile period are less viable than those conceived towards the middle.
We don’t know how much lower embryo viability is outside this fertile period, contends Professor Bovens, but we can calculate that two to three embryos will have died every time the rhythm method results in a pregnancy.
Is it not just as callous to organise your sex life to make it harder for a fertilised egg to survive, using this method, as it is to use the coil or the morning after pill, he asks?
Professor Bovens cites Randy Alcorn, a US pro-life campaigner, who has equated global oral contraceptive use to chemical abortion that is responsible for tens of thousands of deaths of embryos, or unborn children, every year.
But says Professor Bovens: if all oral contraceptive users converted to the rhythm method, then they would be effectively causing the deaths of millions of embryos.
Similarly, regular condom users, whose choice of contraception is deemed to be 95% effective in preventing pregnancy, would “cause less embryonic deaths than the rhythm method,” he says.
“…the rhythm method may well be responsible for massive embryonic death, and the same logic that turned pro-lifers away from morning after pills, IUDs, and pill usage, should also make them nervous about the rhythm method,” he contends.
If embryo death is morally the equivalent of the death of a full human then practices that lead to more embryo deaths should be seen as morally undesirable.
I do not see a clear dividing line between what is human and what is not human. This problem is going to become more obvious to the general population when biotechnology allows the creation of beings that are sentient and yet very unlike the average human.
Also (and I'm digressing here) as another sign that I'm a thorough heretic from secular liberal dogma: I do not see how all humans can be classified as having equal human rights. Humans do not have equal capacities to respect the rights of others (don't believe me? want your kids to live next to a pedophile?). So how can they have equal rights? Seems to me that rights flow from the capacity to respect rights. Seems to me one has to embrace a supernatural belief (God loves us all and we all have spirits) or become thoroughly unempirical about the nature of this world in order to believe we all should have equal rights.
I've only glanced through it but here's the paper (PDF format) which is getting published in the Journal of Medical Ethics.