A researcher at the University of Alberta has shown that parents give better care and pay closer attention to good-looking children than to unattractive ones. Dr. Andrew Harrell presented his findings recently at the Warren E. Kalbach Population Conference in Edmonton, Alberta.
Harrell's findings are based on an observational study of children and shopping cart safety. With the approval of management at 14 different supermarkets, Harrell's team of researchers observed parents and their two- to five-year-old children for 10 minutes each, noting whether the child was buckled into the grocery-cart seat and how often the child wandered more than 10 feet away. The researchers independently graded each child for attractiveness on a scale of one to 10.
Findings showed that 1.2 per cent of the least attractive children were buckled in, compared with 13.3 per cent of the most attractive youngsters. The observers also noticed the less attractive children were allowed to wander farther away and more often from their parents. In total, there were 426 observations at the 14 supermarkets.
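Taken at face value, the buckling percentages quoted above imply a very large relative gap. A quick arithmetic sketch, using nothing beyond the two reported figures, makes the size of the disparity explicit:

```python
# Buckling rates reported in the study (per cent of children buckled in)
least_attractive_pct = 1.2   # least attractive children
most_attractive_pct = 13.3   # most attractive children

# Relative disparity: the most attractive children were buckled in
# roughly eleven times as often as the least attractive ones.
ratio = most_attractive_pct / least_attractive_pct
print(round(ratio, 1))  # roughly 11.1
```

An eleven-fold difference is striking even for an observational study, which is why the confounds discussed below matter.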
Harrell, who has been researching shopping cart safety since 1990 and has published a total of 13 articles on the topic, figures his latest results are based on a parent's instinctive Darwinian response: we're unconsciously more likely to lavish attention on attractive children simply because they're our best genetic material.
"Attractiveness as a predictor of behaviour, especially parenting behaviour, has been around a long time," said Harrell, a father of five and a grandfather of three. "Most parents will react to these results with shock and dismay. They'll say, 'I love all my kids, and I don't discriminate on the basis of attractiveness.' The whole point of our research is that people do."
Another possible interpretation is that the parents of less attractive children have genetic sequences that make them more lackadaisical toward their children, less concerned about risks, or more worried about other things (e.g., having enough money to buy the food). Perhaps the parents of less attractive children are less intelligent on average. One could adjust for this by watching parents who have multiple children of differing attractiveness. One could also measure parental attractiveness and look for markers of parental economic status (e.g., the value and type of the car the parents drive away from the supermarket). Still, I think the researchers are right that parents react more favorably toward more attractive children.
I'd like to see a study of this sort take pictures of parents and children and then measure their symmetry. Symmetry is one quality that enhances perceived attractiveness. More symmetrical parents are probably more likely to have more symmetrical children because the parents have genetic sequences that code for better embryonic development. My guess is that more symmetrical parents will, on average, be more inclined to safeguard their children.
This is all an argument for genetic engineering by the way. How? If children are all genetically engineered to be beautiful then parents would be more likely to protect and less likely to abuse their children. Therefore in the future when genetic engineering is routinely used to improve physical appearances we can expect child injuries from accidents and abuse to become less frequent. Anyone like Leon Kass who opposes germ line genetic engineering is effectively opposing measures that will reduce child accidents and child abuse.
Hermundur Sigmundsson of the Norwegian University of Science and Technology in Trondheim, Norway, found that in simulated driving conditions dyslexics took longer to react to flashing signs.
The six dyslexic drivers took on average 0.13 seconds longer to react during the rural drive than the non-dyslexic controls and were 0.19 seconds slower in the city, where the simulated environment was more complex. In both tests the controls took around 0.6 seconds to respond, so the dyslexic drivers were experiencing a delay of 20 to 30 per cent (Brain and Cognition, DOI: 10.1016/j.bandc.2004.11.007).
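The "20 to 30 per cent" figure follows directly from the reported times. A quick check, using only the numbers quoted above:

```python
# Reaction times reported in the study (seconds)
baseline = 0.6        # approximate reaction time of the control drivers
rural_delay = 0.13    # extra delay for dyslexic drivers on the rural drive
city_delay = 0.19     # extra delay on the more complex city drive

# Express each delay as a percentage of the control baseline
rural_pct = rural_delay / baseline * 100
city_pct = city_delay / baseline * 100
print(round(rural_pct), round(city_pct))  # roughly 22 and 32 per cent
```

So the rural figure actually comes out slightly above 20 per cent and the city figure slightly above 30, consistent with the article's rounded range.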
The article says this level of delay is worse than what happens as a result of moderate drinking.
Dr John Rack, of the Dyslexia Institute, said: "It's a small study and it's an overgeneralisation and oversimplification of what dyslexia is.
"It could really be quite offensive, I feel, to the many, many dyslexic people who are actually quite talented and skilled in those areas."
Perhaps some will be offended. But is it true? If this result is true, is it true for all dyslexics? Are there dyslexics who can react very rapidly to sudden changes on roads? Also, does the average person who gets into a car accident have slower reaction times than the average person who does not?
In the future wouldn't it make sense to use objective tests of speed of responses to simulated road events to measure how well or poorly each prospective driver will do on the road? Even if dyslexics have slower reaction times on average surely there are non-dyslexics who naturally have reaction times that are worse than the average.
Also, imagine devices installed in the cars of marginal drivers or alcoholics that would test reaction times and accuracy every time a bad driver wanted to start the car and go somewhere. Anyone unable to react in a timely manner could be denied the ability to start the engine.
The more that different categories of people are compared the more differences will be found between those categories. This is inevitably going to lead to calls for rule changes that take into account the knowledge of these differences.
A study on brain development shows another example of differences in how different brains react to road situations. Full development of a brain area involved in the tendency toward behavioral inhibition in the face of risks does not occur until age 25.
A National Institutes of Health study suggests that the region of the brain that inhibits risky behavior is not fully formed until age 25, a finding with implications for a host of policies, including the nation's driving laws.
"We'd thought the highest levels of physical and brain maturity were reached by age 18, maybe earlier -- so this threw us," said Jay Giedd, a pediatric psychiatrist leading the study, which released its first results in April. That makes adolescence "a dangerous time, when it should be the best."
Suppose it becomes possible to measure brains to show that some people never fully develop the part of their brain that causes them to inhibit risky behavior. Should those adults be kept off the road just because they are too prone to risks when behind the wheel? Or if some people develop tendencies to risk-aversion earlier, should they be granted driver's licenses earlier than the majority of the population?
The problem with judging people only by what they do is that, for many actions, waiting for people to do undesired things (like cause a car accident) allows too much damage. We are going to gain greater abilities to predict what people might do. How do we reconcile these coming abilities to predict human behavior with the legal ideal of treating all equally before the law?
Imperial College London psychologist John Gruzelier gave orders under hypnosis to 12 subjects resistant to hypnosis and 12 susceptible to it while watching their brains with functional magnetic resonance imaging (fMRI). The hypnotically susceptible showed a difference in brain activity in the anterior cingulate gyrus and in the left prefrontal cortex of the brain.
But under hypnosis, Gruzelier found that the highly susceptible subjects showed significantly more brain activity in the anterior cingulate gyrus than the weakly susceptible subjects. This area of the brain has been shown to respond to errors and evaluate emotional outcomes.
The highly susceptible group also showed much greater brain activity on the left side of the prefrontal cortex than the weakly susceptible group. This is an area involved with higher level cognitive processing and behaviour.
Gruzelier also suspects that hypnotism may interfere with subjects' evaluation of future emotions such as embarrassment. A region in the brain's medio-frontal cortex, close to the anterior cingulate, governs our perception of how we will feel if we take a certain course of action, he says. If connections between the two regions are impaired, stage volunteers might happily act without thinking.
Professor John Gruzelier, from Imperial College London, said: “We have a magnificent therapeutic tool which is being ignored because there’s no evidence of the mechanism involved. Now we’re getting evidence of the mechanism and we hope people will take it more seriously.”
Last year, Stanford University psychiatric researcher David Spiegel used positron emission tomography (PET) scans to watch changes in brain function in volunteers who were highly hypnotizable.
The hypnotized volunteers were told to see colour. Then, regardless of whether or not the researchers showed them colour, the areas of the visual cortex that register colour would fire. When the researchers told them to see "grey" objects, the volunteers had less activity in the colour zones of the brain.
Of course the development of a greater understanding of the mechanisms underlying hypnosis will inevitably lead to more powerful and reliable techniques for invoking hypnotic states, even in people who cannot now be hypnotized. That, in turn, will inevitably lead to abuses of the techniques. Imagine a police state that brings in its citizens, invokes hypnotic states, and then feeds them all sorts of suggestions about what they should believe and do. Imagine brain implants that can be used to remotely trigger a hypnotic state. Technology enhances the ability to carry out both good and evil acts.
Psychologists Elaine Duncan of Glasgow Caledonian University and David Sheffield of Staffordshire University compared a group of people who kept regular diaries to a group that did not and found that diarists are more socially awkward and suffer more headaches, digestive problems, and other ailments.
Statistically, the diarists scored much worse on health measures than the non-diarists. And worst affected of all were those who had written about trauma. “They were most susceptible to headaches and the like,” says Duncan.
Are those who decide to write diaries more prone to mental and physical unhealthiness in the first place? The fact that diarists who have written about trauma do worse than those who haven't suggests that it is the diary writing that is causing the health effects.
This result reminds me of the controversy over whether post-trauma debriefing by counselors is beneficial for trauma victims. The results of a number of studies have been mixed. At best, debriefing, in which victims are made to think through and discuss traumatic events, probably has no value for most victims. At worst it may be causing the painful memories to have an even greater harmful effect upon mental health.
Malachy Corrigan, the director of the Counseling Service Unit of the New York City Fire Department, was once a proponent of debriefing—but months before the September 11th attacks he decided that it was generally not a beneficial technique. “Sometimes when we put people in a group and debriefed them, we gave them memories that they didn’t have,” he told me. “We didn’t push them to psychosis or anything, but, because these guys were so close and they were all at the fire, they eventually convinced themselves that they did see something or did smell something when in fact they didn’t.” For the workers in the pit at Ground Zero, Corrigan enlisted other firefighters to be “peer counsellors” and to provide moral support and educational information about the possible mental-health impact of sustained trauma.
We are probably better off letting traumatic memories fade. If the memories are clear we can recall them and relive them and suffer pain from thinking about them.
These results bring up an interesting possibility: Will future biotechnologies that enhance memory formation increase the incidence of mental health problems as people become more able to recall painful experiences? Of course, if that turns out to be the case then there is an obvious counter: selective memory erasure. While I argue that for practical reasons we can't be allowed to have an unlimited right to memory erasure, there could be considerable therapeutic benefit from the erasure of particularly traumatic memories. Some scientists argue that after initial memory consolidation takes place it may be possible to recall memories and then interfere with their reconsolidation as a way to erase recalled memories. Until we develop that capability it is probably wisest to avoid dwelling excessively on painful memories. Perhaps diaries should only be written in on happier days. Also, live in ways that reduce your odds of having traumatic or otherwise unhappy experiences so that bad memories don't need to be forgotten in the first place. Don't worry. Be happy.
Psychologist Alan Slater of Exeter University showed babies, on average two and a half days old, pictures of faces that adults had rated as more or less attractive, and found that the babies invariably stared longer at the faces adults had rated as more attractive.
Babies are born with an eye for beauty. Infants only hours old will choose to stare at an attractive face rather than an unattractive one - and they also prefer to listen to Vivaldi straight, rather than Vivaldi backwards.
According to Alan Slater, a developmental psychologist at the University of Exeter, humans may have a biologically ingrained preference for beauty.
As Steven Pinker has argued we are not blank slates! Here's yet another report that is a nail in the coffin of the blank slate myth. Attraction to beauty is an instinctive trait that is present from birth.
Dr Alan Slater, a psychologist at Exeter University, said this proved that attraction to better-looking people was an instinctive human trait, something we are all born with.
"It used to be thought that new-born babies came into the world as a totally blank sheet of paper on which experience will then write," he said yesterday. "But what we are finding more and more is that babies are born with a number of in-built mechanisms that help them to organise and make sense of their newly-perceived world - and one of these is that they display an attractiveness effect."
Another preference present from birth is the ability to appreciate Vivaldi played forward. But then why is it that I didn't come to recognize and appreciate his Four Seasons until I was well into adulthood?
In a baby's mind, these beautiful faces may represent the stereotypical human face, says Slater, which they have evolved to recognise. Such built-in information helps babies learn quickly, including learning to pair faces with sounds such as voices.
"Attractiveness is not simply in the eye of the beholder, it is in the brain of the newborn infant right from the moment of birth and possibly prior to birth," the University of Exeter researcher said.
In spite of the huge number of nails now splintering the Blank Slate coffin into toothpicks, some diehard defenders keep soldiering on, fighting for the myth that humans are only products of their social environments. See Godless Capitalist's most recent encounter with one of these intellectual dinosaurs in GC's post In which a Lewontinite is introduced to the 21st century. Secular faiths can be just as strongly held as religious ones.
MADISON - Everyone knows not to get between a mother and her offspring. What makes these females unafraid when it comes to protecting their young may be low levels of a peptide, or small piece of protein, released in the brain that normally activates fear and anxiety, according to new research published in the August issue of Behavioral Neuroscience.
"We see this fierce protection of offspring is so many animals," says Stephen Gammie, a University of Wisconsin-Madison assistant professor of zoology and lead author of the recent paper. "There are stories of cats rescuing their kittens from burning buildings and birds swooping down at people when their chicks are on the ground."
In terms of biology, it makes sense that mothers would lay down their own lives to protect their offspring, especially if it means the parents' genes will be passed down to the next generation, says Gammie. But he adds that despite all the observations and the theories explaining why mothers display this behavior - commonly known as maternal aggression - very little research has investigated the biological mechanisms that turn on this trait in new mothers.
"We've known for a long time that fear and anxiety decrease with lactation," explains Gammie. "Maybe it's this decrease that allows mothers to attack during a situation that normally would evoke a fear response."
Testing this hypothesis, the Wisconsin professor and his colleagues studied the link between maternal aggression in mice and levels of corticotropin-releasing hormone (CRH), a peptide that acts on the brain to control behavior.
About six days after a group of mice gave birth, the new mothers received injections containing either one of three doses of CRH or a saline solution with no amount of the peptide. Following each injection, which was given once a day for four consecutive days, the researchers returned the mother mice to their pups. Twenty-eight minutes later, the researchers removed the pups and introduced a male intruder.
Under normal conditions, female rodents will fiercely attack the males, says Gammie, noting that the males sometimes eat pups and that "the best defense for the mom is the offense."
For the study, only the mice that received either no dose or a low dose of the peptide displayed the expected behavior. As the levels of CRH increased, the number and duration of attacks dramatically decreased.
The results show, for example, that while the mice with the lowest levels of CRH attacked more than 20 times for the duration of about 45 seconds, the mice with moderate levels of the peptide attacked about six times over about eight seconds. Mice with the highest levels of CRH didn't attack at all.
"When we put the male in the cage, some moms would just sit there. They weren't protective at all. If anything they were skittish. They showed a fear response," says Gammie.
The researchers note that altering the levels of the peptide appeared to affect only maternal aggression; normal maternal behaviors, such as nursing, were observed in all mothers both before and after the encounters with male mice.
Based on the results, Gammie says, "Low CRH levels appear to be a necessary part of maternal aggression. If you don't keep them low, you won't see this fiercely protective behavior."
High CRH levels in some women suffering postpartum depression may explain child neglect and child abuse.
He adds that this finding - some of the first evidence suggesting a biological mechanism that enables parents, regardless of the potential danger, to defend their offspring - may also begin to explain why mothers occasionally neglect or harm their offspring.
"Postpartum depression in some individuals has been linked to higher levels of CRH release and an overly active stress response," explains Gammie. "If CRH needs to be low to see maternal protection of offspring, as our work suggests, then it explains why moms with high postpartum depression and high CRH not only may neglect, but also may abuse, their children."
Although the link between low levels of CRH and increased maternal aggression seems straightforward, Gammie admits that researchers do not know exactly how the process works. Perhaps lactation stimulates the brain to produce less CRH, he suggests. Or maybe the brain's cells simply become less responsive to the hormone.
It would be interesting to know whether nursing human mothers have lower CRH than mothers who bottle-feed formula, and whether nursing mothers are less prone to postpartum depression, child neglect, and child abuse.
Perhaps at some point in the future women who neglect or abuse their kids will be given the option of either losing their children or taking drugs that make them feel more protective and caring toward their children.
“In war, soldiers are under high stress constantly,” says Tracy Bale, who works on CRH and depression at the University of Pennsylvania. “In those cases, a CRH blocker might help.”
Given that the connection between soldiering and national security is more widely accepted than the connection between reproduction and national security, it seems a safe bet that, while the use of drugs to reduce child neglect might be placed beyond the pale, future soldiers will routinely be given drugs that control emotions and stress response on the battlefield. The benefits for soldiers will be more clearly understood, and governments will be more motivated to tune the emotions of soldiers than of mothers. Drugs will be used to place upper limits on levels of stress response, but when stress response is appropriate drugs may be used to heighten stress in order to increase alertness and motivation. Also, feelings of aggressiveness will be dialed up and down according to circumstances.
This reminds me: There is an ongoing controversy about the use of drugs and other biotechnologies to enhance the performance of athletes. Well, the use of biotechnologies to enhance physical performance is yet another area where soldiers are going to have the edge on other potential users of human-altering tech. While sports organizations hold meetings and pursue increasingly advanced methods to screen for forbidden biotech, military organizations will rush to embrace any technology that improves the performance of soldiers.
A PET imaging study conducted at the UCLA Neuropsychiatric Institute indicates the neurobiology of America's estimated 1 million compulsive hoarders differs significantly from people with other obsessive-compulsive disorder (OCD) symptoms. The findings indicate that different medications could improve treatment success.
Detailed in the June 4 edition of the peer-reviewed American Journal of Psychiatry, the study is the first to examine the neurobiology of people with compulsive hoarding and saving, one of several symptom clusters associated with OCD.
The study identified lower brain activity in the anterior cingulate gyrus of compulsive hoarders, compared with other OCD patients. This brain structure helps govern decision-making, focused attention, motivation and problem-solving, cognitive functions that are frequently impaired in compulsive hoarders. The study also found a correlation between severity of hoarding symptoms and lower brain activity in the anterior cingulate gyrus across all of the study subjects with OCD.
In addition, the hoarding group showed decreased brain activity in the posterior cingulate gyrus compared to healthy control subjects who had no OCD symptoms. The posterior cingulate gyrus is involved in spatial orientation and memory. The decreased activity in hoarders may explain why they have difficulty with excessive clutter and fear of losing belongings.
The findings also demonstrate how neurobiological testing could improve diagnosis and treatment of psychiatric disorders. Lower activity in the anterior and posterior cingulate areas may not only underlie compulsive hoarding symptoms, but also their poor response to standard treatments for OCD. The results suggest cognitive-enhancing medications commonly used in patients with age-related dementia may be more effective at treating compulsive hoarding behaviors than standard OCD medications such as serotonin reuptake inhibitors.
"Our work shows that hoarding and saving compulsions long associated with OCD may spring from unique, previously unrecognized neurobiological malfunctions that standard treatments do not necessarily address," said Dr. Sanjaya Saxena, lead author and director of the UCLA Neuropsychiatric Institute's OCD Research Program.
"In addition, the results emphasize the need to rethink how we categorize psychiatric disorders. Diagnosis and treatment should be driven by biology rather than symptoms. Our findings suggest that the compulsive hoarding syndrome may be a neurobiologically distinct variant of OCD," said Saxena, an associate professor-in-residence of psychiatry and biobehavioral sciences at UCLA's David Geffen School of Medicine.
Hoarding and saving behaviors are associated with a number of psychiatric disorders, including age-related dementia and cognitive impairment, but they are most commonly associated with OCD. An estimated 7 million to 8 million people in the United States suffer from OCD, with compulsive hoarding present in up to one-third. Compulsive hoarding is the primary source of impairment in 10 percent to 20 percent of OCD patients.
Compulsive hoarding is one of several symptom clusters associated with OCD. Others include contamination fears that lead to cleaning compulsions, aggressive and harm-related obsessions that lead to doubt and checking, and symmetry and order concerns. Each of these symptom clusters may be associated with a distinct pattern of brain activity. Standard OCD treatments, including serotonin reuptake inhibitor medications, typically are less effective in OCD patients with prominent compulsive hoarding behaviors.
The UCLA Neuropsychiatric Institute study involved 62 adults: 12 with OCD who had prominent compulsive hoarding behaviors, 33 with OCD who had mild or no symptoms of hoarding, and 17 control subjects who had no OCD symptoms. The researchers used positron emission tomography (PET) to measure brain glucose metabolism, a marker of regional brain activity, in each subject and compared the results.
Upcoming studies at the UCLA Neuropsychiatric Institute will use both PET and magnetic resonance imaging scanning to look for structural and functional abnormalities in the brains of subjects with compulsive hoarding and other types of OCD as the team seeks to further refine and understand these differences. The research team also will examine the effectiveness of newer medications that better address the unique brain activity found in subjects with compulsive hoarding behaviors.
Note that results from brain scans are causing neuroscientists to reorganize the way they categorize and sort various mental disorders. This is analogous to the way DNA sequencing results have been causing a recategorization of the relationships between species, with species being shifted between genera and other higher-level categories of taxonomy. Systems of classification based on intuitive judgements of outwardly visible qualities are being replaced by systems based on measurements of phenomena at the cellular and molecular level. Reductionism marches onward.
Note also that the ability of brain scanning instruments to measure what is going on is providing both a more accurate method of diagnosis and useful hints about which drugs may be most effective for each person. The reliance on the psychiatrist's intuitive judgement, based on interviews and observation of visible behavior, is being at least partially supplanted by direct internal observation of what is happening in the brain. The ability to observe what is happening in the brain is also pointing toward courses of treatment that might otherwise never have been considered.
Advances in medical instrumentation are making medical diagnosis more accurate and, in the process, removing subjective judgements from medicine. The removal of subjective judgement creates the potential for far greater automation of diagnosis and treatment delivery.
About 8 percent of domestic rams display preferences for other males as sexual partners. Scientists don't believe it's related to dominance or flock hierarchy; rather, their typical motor pattern for intercourse is merely directed at rams instead of ewes.
"They're one of the few species that have been systematically studied, so we're able to do very careful and controlled experiments on sheep," Roselli said. "We used rams that had consistently shown exclusive sexual preference for other rams when they were given a choice between rams and ewes."
The study examined 27 adult, 4-year-old sheep of mixed Western breeds reared at the U.S. Sheep Experiment Station. They included eight male sheep exhibiting a female mate preference – female-oriented rams – nine male-oriented rams and 10 ewes.
OHSU researchers discovered an irregularly shaped, densely packed cluster of nerve cells in the hypothalamus of the sheep brain, which they named the ovine sexually dimorphic nucleus or oSDN because it is a different size in rams than in ewes. The hypothalamus is the part of the brain that controls metabolic activities and reproductive functions.
The oSDN in rams that preferred females was "significantly" larger and contained more neurons than in male-oriented rams and ewes. In addition, the oSDN of the female-oriented rams expressed higher levels of aromatase, a substance that converts testosterone to estradiol so the androgen hormone can facilitate typical male sexual behaviors. Aromatase expression was no different between male-oriented rams and ewes.
The study was the first to demonstrate an association between natural variations in sexual partner preferences and brain structure in nonhuman animals.
The Endocrinology study is part of a five-year, OHSU-led effort funded through 2008 by the National Center for Research Resources, a component of the National Institutes of Health. Scientists will work to further characterize the rams' behavior and study when during development these differences arise. "We do have some evidence the nucleus is sexually dimorphic in late gestation," Roselli said.
They would also like to know whether sexual preferences can be altered by manipulating the prenatal hormone environment, such as by using drugs to prevent the actions of androgen in the fetal sheep brain.
I predict that some day it will be possible to alter this structure in adult humans. Will more people at that point choose to switch from heterosexual to homosexual orientation, or vice versa? There are more heterosexuals available to make the switch, which tilts the odds in favor of hetero-to-homo transitions. On the other hand, the stigma still associated with homosexuality provides an incentive to switch in the other direction.
If a test on fetuses for sexual orientation can be developed, and if a treatment for altering fetal sexual orientation can also be developed, then that would probably favor a net shift toward heterosexuality, since most parents would choose to guarantee their children will be heterosexual. Even without such a test, if it becomes possible to control the genetic and environmental factors that influence the development of the part(s) of the brain that determine sexual orientation, then many parents will opt, metaphorically speaking, to tilt the playing field even further toward heterosexuality in their offspring. In other words, it seems reasonable to expect that most parents will avail themselves of medical treatments that ensure their kids turn out to be heterosexual.
Whether the ability to alter sexual orientation at the fetal and adult stages will cause a net change in the balance of the population in a more homosexual or heterosexual direction is hard to predict. It seems likely that males and females will, on average, make different decisions. So the ratio of male to female homosexuality could either increase or decrease once sexual orientation becomes malleable. Also, the ratio will likely diverge between cultures and population groups as different groups make different choices on average.
Hello there little pup, I'm Big Gay Al. [Sparky looks at him] Have you been outcast? [Sparky pants an affirmative] Well, then I'm so glad you found my Big Gay Animal Sanctuary. We're all big gay friends here. Would you like to live with us? [Sparky pants an affirmative] Come on in little fellow, nobody will ever oppress you here.
The finding is the first in humans to show that a receptor, which is pivotal to the action of widely prescribed anti-anxiety medications, may be abnormal in the disorder and help to explain how genes might influence vulnerability.
In the study, positron emission tomography (PET) determined that three brain areas of panic disorder patients are lacking in a key component of a chemical messenger system that regulates emotion, says Alexander Neumeister, MD, of the National Institute of Mental Health (NIMH). Brain scans revealed that the component, a type of serotonin receptor, is reduced by nearly a third in three structures straddling the center of the brain, according to the report in the current issue of The Journal of Neuroscience.
“This is the first time anyone has shown, in vivo, a decrease in serotonin binding in panic disorder patients. Eventually, this work could lead to new, more selective pharmacological treatments that would specifically target this receptor,” says Michael Davis, PhD, of Emory University, who studies anxiety disorders. “Clinical studies like this are extremely important for guiding basic research in animals to understand more fully the role of these receptors in anxiety.”
Each year, panic attacks strike about 2.4 million American adults “out of the blue,” with feelings of intense fear and physical symptoms sometimes confused with a heart attack. Unchecked, the disorder often sets in motion a debilitating psychological sequel: agoraphobia, the avoidance of public places. Panic disorder runs in families, and researchers have long suspected a genetic component.
In the study, Neumeister and his colleagues used PET scans to visualize serotonin 5-HT1A receptors in the brains of 16 panic disorder patients – seven of whom also suffered from major depression – and 15 matched healthy controls. In the panic disorder patients, including those who also had depression, receptors were reduced by an average of nearly a third in the anterior cingulate in the front middle part of the brain, the posterior cingulate, in the rear middle part of the brain, and in the raphe, in the midbrain.
Unfortunately, it doesn't sound like these researchers had the 5-HT1A gene sequenced in this group of patients. Even if they had, it is possible that such a test wouldn't find the genetic difference causing this difference in receptor concentration. The relevant variation may lie at a different site in the genome, in a gene that codes for a regulatory protein or a piece of regulatory RNA (e.g. interfering RNA) that controls this gene's expression.
Because the disorder can run in families, experts have suspected that certain genetic variations might make people more vulnerable to developing it. The new research gives weight to that idea.
"This is the first study that shows a very clear biological difference in patients and controls," Neumeister said.
The illness, which most commonly begins between late adolescence and the mid-30s, is just one of a group of relatively widespread anxiety disorders. About 19 million Americans are afflicted by one of them; obsessive-compulsive disorder, post-traumatic stress disorder and specific phobias are among the better known.
It is interesting to note that a genetic variation of the 5-HT1A receptor gene is correlated with depression. Differences in the same receptor have been found to also correlate with differences in beliefs about spirituality.
The traits, which are also found in humans, have positive and negative extremes - for example, dogs could be rated as energetic, slothful or somewhere in between. The other traits were affection-aggression, anxiety-calmness and intelligence-stupidity.
In total, 78 dogs of all shapes and sizes were tested. In general, owners and strangers agreed on an individual dog's personality. This suggests that the dog personalities are real, says Gosling.
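Owner-stranger agreement of this sort is typically quantified by correlating the two sets of ratings across dogs on each trait. Below is a minimal sketch using a Spearman rank correlation; the ratings are made-up illustrative numbers, not the study's data, and the study's actual statistical method is not specified in the article.

```python
def ranks(values):
    """Rank values from 1..n, averaging the ranks of tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    rank = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Ties are adjacent in sorted order; find the end of the tied run.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank for the tied run
        for k in range(i, j + 1):
            rank[order[k]] = avg
        i = j + 1
    return rank

def spearman(xs, ys):
    """Spearman rank correlation: the Pearson correlation of the ranks."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical 1-10 ratings of eight dogs on the energy-slothfulness trait.
owner_ratings =    [8, 3, 9, 5, 2, 7, 6, 4]
stranger_ratings = [7, 4, 9, 6, 1, 8, 5, 3]

agreement = spearman(owner_ratings, stranger_ratings)
```

A value of `agreement` near 1 would indicate that owners and strangers rank the dogs in nearly the same order, which is the kind of convergence the researchers report.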
Of course there is a biological basis to personality differences in humans and in dogs. Yes, even strangers can size up a dog and tell you how affectionate or calm or smart the dog is. But now this has been demonstrated scientifically.
The research, conducted with the help of 78 dogs and owners who were recruited at a dog park in Berkeley, Calif., found the animals' personality traits could be judged with an accuracy comparable to judgments made about humans' personality traits.
"The findings ... suggest a conclusion not widely considered by either human-personality or animal-behavior researchers: Differences in personality traits do exist and can be measured in animals," says the research paper by Samuel D. Gosling, an assistant professor of psychology at the University of Texas at Austin; Virginia S.Y. Kwan of Princeton University; and Oliver John of the University of California, Berkeley.
Gosling says that many researchers are reluctant to believe that dogs have distinct personalities. The mind boggles. Do personality researchers as a group have an aversion to dogs? Have they no experience with owning a variety of dogs who have very distinct personalities? It is amazing what obvious truths even have to be proved by science.
While some dogs and some human children behave poorly as a result of a lack of training or abuse, many others are just plain determined to be aggressive, defiant, or highly motivated to achieve some goal regardless of any adult human supervision. That dogs have unique traits just as humans do is obvious to anyone who has considerable experience with multiple dogs. Even within a breed there is considerable variation, though less than is found between breeds.
The demonstration that dog personalities can be classified is useful for enabling the search for genetic variations that influence personality. It strikes me, however, that the four traits used are inadequate for describing all genetically based variation in dog behavior. For instance, dogs vary in their instinctive liking for water and for retrieving, and in many other ways that obviously have genetic bases. The wide range of extremes of dog personalities, produced by the development of so many breeds for different purposes, provides fertile ground on which to search for genetic factors that influence personality and behavior.
Cognitive Behavioral Therapy (CBT), which aims to train depressed patients not to think negative thoughts about themselves, causes a different pattern of changes in the brain than the changes caused by anti-depressant drugs.
Using positron emission tomography (PET) -- multi-colored imaging that pinpoints where maximum changes in brain metabolism occur -- Dr. Mayberg's team, led by CBT expert Zindel Segal, PhD, and graduate student Kimberly Goldapple, generated a detailed picture of what this self-correction looks like.
CBT has theoretically been considered a top-down approach because it focuses on the cortical (top) area of the brain -- associated with thinking functions -- to modulate abnormal mood states. It aims to modify attention and memory functions, affective bias and maladaptive information processing. In contrast, drug therapy is considered a bottom-up approach because it alters the chemistry in the brain stem and limbic (bottom) regions which drive more basic emotional and circadian behaviors resulting in eventual upstream changes in depressive thinking.
In this current study in Archives, 14 clinically-depressed adult patients underwent a full course of CBT. They each received 15 to 20 individualized outpatient sessions. None were on drug therapy. The patients' brains were scanned prior to beginning treatment and at the end of the full course of therapy.
Investigators found that CBT targets many of the same limbic and cortical regions affected by drug therapy, but in 'different directions'. With drug therapy, metabolism (blood flow) decreases in the limbic area and increases in the cortical area. With CBT, Mayberg and colleagues identified the reverse pattern: limbic increases (in the hippocampus, dorsal mid cingulate) and cortical decreases (in the dorsolateral, ventrolateral and medial orbital frontal; inferior temporal and parietal). Furthermore, each treatment showed changes in unique brain regions supporting the top-down, bottom-up theories.
What explains this reverse pattern? As CBT patients learn to turn off the thinking patterns that lead them to dwell on negative thoughts and attitudes, activity in certain areas of the cortical (thinking, attention) region decreases as well.
"The challenge continues to be how to figure out 'how to best treat' for what the brain needs," says Dr. Mayberg. She suggests that brain scans may one day become a useful component of the treatment protocol for clinically depressed patients, helping doctors to determine in advance what treatment will be most efficacious, as well as monitor the effectiveness of a particular treatment strategy.
Both types of treatment work on only a subset of all depressed patients and the two different subsets only partially overlap. If patterns in the brains of depressed patients could be found that show how depressed patients differ from each other it might be possible to discover markers for which type of therapy is most likely to work. Some day depressed patients may have their brains scanned to determine what type of anti-depressant treatment has the best chance of working for each patient.
"This experiment lays the groundwork for looking for different markers that will help to optimize the treatment for a given individual; that's the really cool part," said Mayberg, a professor of psychiatry and neurology who conducted the study while at the University of Toronto but recently moved to Emory University in Atlanta.
Genetic testing will probably become even more common than brain scanning for the purpose of choosing the optimal therapy for treating depression. The genetic testing will be cheaper and easier to carry out. Also, genetic testing will be useful for identifying which anti-depressant drugs are more or less likely to work and more or less likely to cause side effects for each person.
Also, it may eventually become possible to automate much of the delivery of cognitive behavioral therapy. An interactive computer could be used to do part of the training of how to avoid thinking negative thoughts. It may also become possible to implant sensors and something like a hearing aid that would be triggered to tell a patient what positive thoughts to have when the sensors detect negative thoughts. Of course such a method of treatment would bring with it the potential of abuse as a means to control people.
A fascinating article published in the American Journal of Psychiatry by Swedish medical researcher Lars Farde, M.D., Ph.D., and colleagues from the Karolinska Institute reports that the concentration of serotonin receptors in the brain correlates inversely with spirituality.
Jacqueline Borg, Bengt Andrée, Henrik Soderstrom, and Lars Farde
The Serotonin System and Spiritual Experiences
Am J Psychiatry 2003 160: 1965-1969.
METHOD: Fifteen normal male subjects, ages 20-45 years, were examined with PET and the radioligand [11C]WAY100635. Personality traits were assessed with the Swedish version of the Temperament and Character Inventory self-report questionnaire. Binding potential, an index for the density of available 5-HT1A receptors, was calculated for the dorsal raphe nuclei, the hippocampal formation, and the neocortex. For each region, correlation coefficients between 5-HT1A receptor binding potential and Temperament and Character Inventory personality dimensions were calculated and analyzed in two-tailed tests for significance. RESULTS: The authors found that the binding potential correlated inversely with scores for self-transcendence, a personality trait covering religious behavior and attitudes. No correlations were found for any of the other six Temperament and Character Inventory dimensions. The self-transcendence dimension consists of three distinct subscales, and further analysis showed that the subscale for spiritual acceptance correlated significantly with binding potential but not with the other two subscales. CONCLUSIONS: This finding in normal male subjects indicated that the serotonin system may serve as a biological basis for spiritual experiences. The authors speculated that the several-fold variability in 5-HT1A receptor density may explain why people vary greatly in spiritual zeal.
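The core analysis described in the abstract is a per-region Pearson correlation between 5-HT1A binding potential and each personality score, tested two-tailed. Here is a minimal sketch of that computation; the fifteen data points are invented for illustration (roughly mimicking a strong inverse relationship) and are not the study's data.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def t_statistic(r, n):
    """t statistic for testing r != 0 with n paired observations (df = n - 2)."""
    return r * math.sqrt((n - 2) / (1 - r * r))

# Hypothetical values for 15 subjects: 5-HT1A binding potential in one region
# versus Temperament and Character Inventory self-transcendence score.
binding = [4.1, 3.8, 3.5, 3.9, 3.2, 2.9, 3.0, 2.7,
           2.5, 2.6, 2.2, 2.0, 1.9, 1.7, 1.5]
self_transcendence = [10, 12, 15, 11, 18, 20, 19, 22,
                      25, 23, 27, 30, 29, 33, 35]

r = pearson_r(binding, self_transcendence)
t = t_statistic(r, len(binding))
# For df = 13, |t| > 2.160 corresponds to p < 0.05 in a two-tailed test.
```

With these made-up numbers the correlation comes out strongly negative, which is the direction of the effect the authors report: higher self-transcendence goes with lower receptor binding.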
Are there particular alleles that cause different humans to have different numbers of serotonin receptors? Do more spiritual people, on average, have genetic variations that make them produce fewer serotonin receptors per nerve cell, or fewer nerve cells that make serotonin receptors?
Currently, are spiritual people having more children than non-spiritual people? If so, are the alleles that increase serotonin receptor concentrations being selected against? Does the extent to which spirituality correlates with larger family size differ between societies? And are some societies therefore being selected to become more spiritual than others?
To reiterate an argument I've made in the past: Once it becomes possible to control what genetic variations people pass on to their offspring and once genetic variations are discovered that alter personality then at that point the average personality types born to people of different regions, countries, occupations, economic classes, and religious beliefs will diverge. People will make decisions to make their children more like what they want ideal children to be. Imagine religious believers choosing to make their children have personalities that are highly spiritual while at the same time scientists and engineers choose to have children who are highly rational and skeptical. This could lead to genetic religious wars.
If people in some regions of the world decide to make their children more spiritual and other regions make their children more rational and skeptical then one can imagine wars being fought as a result of conflicts of values that flow from fundamental differences in brain wiring. One can also imagine wars fought to stop the people or governments of opposing countries from creating offspring that are either seen as a security threat (e.g. a highly willing deeply spiritual suicide martyr personality type) or as a blasphemy against god.
Scientists at Johns Hopkins have discovered the first direct evidence in mammals that a chemical intermediate in the production of fatty acids is a key regulator of appetite, according to a report in a recent issue of the Proceedings of the National Academy of Sciences.
Scientists have long known that hunger causes increases in some brain chemicals while lowering others. However, the root cause of hunger's effects -- the initial chemical trigger of appetite -- has been elusive.
In experiments with mice, the Johns Hopkins researchers showed that appetite is immediately and directly tied to amounts of a chemical called malonyl-CoA. In hungry mice, malonyl-CoA was almost undetectable in the brain. Once fasting mice were given food, however, amounts of the chemical increased to high levels within two hours. Furthermore, chemically reducing appetite by injecting a compound called C75 into the brain brought levels of malonyl-CoA up to those of mice given food, helping to explain C75's effects.
"From this work, it appears as though malonyl-CoA levels control appetite and levels of other brain chemicals that we know go up and down with hunger and feeding," says Dan Lane, Ph.D., professor of biological chemistry in Hopkins' Institute for Basic Biomedical Sciences. "There may be other contributors, but this is the first direct evidence that malonyl-CoA could be the body's primary appetite controller."
In previous work, Lane and his colleagues had shown that giving mice C75, which blocks conversion of malonyl-CoA into fatty acids, dramatically reduced animals' appetites. Subsequently, they found that C75 triggers levels of several known appetite signals (NPY, AgRP, POMC and others) to register "full" even when animals should have been hungry.
However, the new experiments, during which C75 was injected directly into the animals' brains, suggest that increasing levels of malonyl-CoA, caused by "blocking the dam" with C75, is the first step in the process that alters levels of those appetite signals.
"Fully understanding how appetite is regulated by the brain should reveal ways to control appetite," says Lane, who was studying how fat cells develop when he and colleagues discovered the appetite-suppressing effects of C75 a few years ago. "Because C75 was injected into the brain, rather than into the abdomen as in earlier experiments, we also now know that the compound's effects on appetite stem primarily from its effects on chemicals in the brain, not from effects it might have elsewhere in the body."
The scientists also discovered that preventing formation of malonyl-CoA by injecting a different substance (TOFA) into the brain partially reversed the appetite-suppressing effect of C75. Lane suggests that a better blocker of malonyl-CoA formation should more completely counteract C75's effects.
My guess is that C75 binds and blocks the activity of an enzyme that converts malonyl-CoA into something else. By preventing malonyl-CoA from being further metabolized, C75 causes a rise in the level of malonyl-CoA in brain cells, and that, in turn, probably causes malonyl-CoA to bind in places that trigger other signals which make the brain feel sated. If hunger can be increased by blocking malonyl-CoA formation, and decreased by slowing the breakdown or usage of malonyl-CoA, then it might turn out to be fairly easy to control appetite and weight.
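The dynamic being described can be illustrated with a toy kinetic model: malonyl-CoA is produced at a constant rate and consumed by a downstream enzyme, and "adding C75" is modeled as cutting the consumption rate, which raises the steady-state level. Every rate constant here is made up for illustration; this is not a model of the actual enzyme kinetics.

```python
def steady_state_level(production, consumption_rate, dt=0.01, steps=20000):
    """Integrate dM/dt = production - consumption_rate * M by Euler's method
    and return the approximate steady-state concentration (production / rate)."""
    m = 0.0
    for _ in range(steps):
        m += (production - consumption_rate * m) * dt
    return m

production = 1.0    # arbitrary units: constant synthesis of malonyl-CoA
k_normal = 2.0      # hypothetical consumption rate via fatty acid synthesis
k_with_c75 = 0.25   # hypothetical: C75 blocks most downstream consumption

baseline = steady_state_level(production, k_normal)    # analytically 1.0/2.0 = 0.5
with_c75 = steady_state_level(production, k_with_c75)  # analytically 1.0/0.25 = 4.0
```

Even this crude sketch captures the qualitative point in the press release: blocking the downstream "dam" raises the standing level of the intermediate several-fold without changing its production at all.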
A safe and effective pair of drugs for increasing and decreasing appetite would give humans easy conscious control of body weight. The means to take conscious control of appetite is one element of a larger toolset which humans need to develop in order to adapt ourselves to the lifestyles most common in industrialized societies, lifestyles to which our evolutionary past leaves us poorly suited. We also need to develop the means to adapt ourselves to lower levels of exercise, less need for fear and anger, and other changes we have made in our environments for which we are not well adapted.
"We were surprised to find that brain activity in response to faces of black individuals predicted how research participants performed on cognitive tasks after actual interracial interactions," says Jennifer Richeson, Assistant Professor of Psychological and Brain Sciences, the lead author on the paper. "To my knowledge, this is the first study to use brain imaging data in tandem with more standard behavioral data to test a social psychological theory."
Their findings suggest that harboring racial bias, however unintentional, makes negotiating interracial interactions more cognitively demanding. Similar to the depletion of a muscle after intensive exercise, the data suggest that the demands of the interracial interaction result in reduced capacity to engage in subsequent cognitive tasks, say the researchers.
For the study, thirty white individuals were measured for racial bias, which involved a computer test to record the ease with which individuals associate white American and black American racial groups with positive and negative concepts. Racial bias is measured by a pattern in which individuals take longer to associate the white Americans with negative concepts and black Americans with positive concepts. The study participants then interacted with either a black or a white individual, and afterward they were asked to complete an unrelated cognitive task in which they had to inhibit instinctual responses. In a separate fMRI session, these individuals were presented with photographs of unfamiliar black male and white male faces, and the activity of brain regions thought to be critical to cognitive control was assessed.
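The article does not give the scoring algorithm for the computerized bias test, but response-latency measures of this kind are commonly scored along the lines of the IAT's D score: the mean latency difference between the "incompatible" and "compatible" pairing blocks, divided by the pooled standard deviation of all trials. A rough sketch with hypothetical reaction times (not data from the study):

```python
import statistics

def bias_score(compatible_ms, incompatible_ms):
    """IAT-style D score: (mean incompatible - mean compatible) latency,
    divided by the standard deviation of all trials pooled together.
    Positive values mean slower responses in the incompatible pairing,
    which is the pattern interpreted as bias."""
    pooled_sd = statistics.stdev(compatible_ms + incompatible_ms)
    return (statistics.mean(incompatible_ms)
            - statistics.mean(compatible_ms)) / pooled_sd

# Hypothetical reaction times in milliseconds for one participant.
compatible = [642, 588, 703, 651, 599, 670]    # e.g. white + positive pairing
incompatible = [715, 768, 699, 780, 742, 731]  # e.g. black + positive pairing

d = bias_score(compatible, incompatible)  # larger positive d = stronger measured bias
```

Dividing by the pooled standard deviation puts participants with fast and slow overall reaction times on a comparable scale, which is why latency-difference measures are usually standardized this way.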
"We found that white people with higher scores on the racial bias measure experienced greater neural activity in response to the photographs of black males," says Richeson. "This heightened activity was in the right dorsolateral prefrontal cortex, an area in the front of the brain that has been linked to the control of thoughts and behaviors. Plus, these same individuals performed worse on the cognitive test after an actual interaction with a black male, suggesting that they may have been depleted of the necessary resources to complete the task."
According to Richeson, most people find it unacceptable to behave in prejudiced ways during interracial interactions and make an effort to avoid doing so, regardless of their level of racial bias. A different research project by Richeson and her colleagues suggested that these efforts could leave individuals temporarily depleted of the resources needed to perform optimally on certain cognitive tasks. This new study by Richeson provides striking evidence that supports the idea that interracial contact temporarily impairs cognitive task performance.
This study will of course occasion considerable discussion about the continued existence of racial stereotypes and the harm therefrom. But since this site is dedicated to taking less conventional looks at human nature and our future let us look at some other issues that others will tend to ignore.
What would be interesting is to see this study repeated with much larger groups of people of different races, occupations, and histories of living in different areas. Does the bias run stronger among those who have more or less experience with other races? Does it vary as a function of the age at which the person gained most of that experience? Does it run stronger as some function of IQ? Does it vary with personality type, with someone who is outgoing showing more or less bias than someone who is shy and retiring? Do some races bear more animosity or fear toward other races? This result covered only whites, and a pretty small sample of them, so the really interesting questions can't be answered yet.
Think about the economic implications of this work. People whose work performance varies a great deal as a function of how much cognitive effort they can muster (for instance engineers, computer programmers) ought to avoid sources of cognitive drain. One way to avoid sources of cognitive drain would be to isolate oneself from them (like by not answering a phone call from a girlfriend who wants an emotionally complicated conversation while I'm trying to program something complicated - not that I'd ever do such a thing. But gotta love caller ID! ;>). Another way might be to learn desensitization techniques. If a biofeedback machine or some other device could allow one to measure the extent of one's responses one might be able to learn to dampen the responses that cause cognitive drain.
There are other implications that go beyond race. What kinds of physical appearance and personality characteristics in some people cause which other kinds of people to strain to react cordially? Are there personality types that are simply incompatible and that will drain off too much in the way of cognitive resources when working with each other? Could employers use that knowledge to divide people up into teams that will reduce the total amount of cognitive waste that results from emotional reactions between co-workers?
Researchers at UCLA have demonstrated with Functional MRI (fMRI) scans that the pain of rejection looks similar in a brain scan to the neuronal activation pattern seen with physical pain.
In the first of three rounds, experimenters instructed UCLA undergraduates just to watch the two other players because "technical difficulties" prevented them from participating. In the second round, the students were included in the ball-tossing game, but they were excluded from the last three-quarters of the third round by the other players. While the undergraduates later reported feeling excluded in the third round, fMRI scans revealed elevated activity during both the first and third rounds in the anterior cingulate. Located in the center of the brain, the cingulate has been implicated in generating the adverse experience of physical pain.
"Rationally we can say being excluded doesn't matter, but rejection of any form still appears to register automatically in the brain, and the mechanism appears to be similar to the experience of physical pain," Lieberman said.
When the undergraduates were conscious of being snubbed, cingulate activity directly responded to the amount of distress that they later reported feeling at being excluded.
The researchers also detected elevated levels of activity in another portion of the brain — the right ventral prefrontal cortex — but only during the game's third round. Located behind the forehead and eyes, the prefrontal cortex is associated with thinking about emotions and with self-control.
"The folks who had the most activity in the prefrontal cortex had the least amount of activity in the cingulate, making us think that one area is inhibiting one or the other," Lieberman said.
The psychologists theorize that the pain of being rejected may have evolved because of the importance of social bonds for the survival of most mammals.
"Going back 50,000 years, social distance from a group could lead to death and it still does for most infant mammals," Lieberman said. "We may have evolved a sensitivity to anything that would indicate that we're being excluded. This automatic alarm may be a signal for us to reestablish social bonds before harm befalls us."
"These findings show how deeply rooted our need is for social connection," Eisenberger said. "There's something about exclusion from others that is perceived as being as harmful to our survival as something that can physically hurt us, and our body automatically knows this."
There are interesting legal ramifications to this report. As fMRI and other objective measures of pain become cheaper and more advanced, do not be surprised if such tests are used in legal cases to buttress claims of pain and suffering and win legal awards. This will be seen as unfair by those with higher pain thresholds and less sensitivity to rejection or to treatment that others might perceive as unfair and painful. Is it fair for people who suffer differing degrees of emotional pain from the same experience to receive different-sized legal settlements simply because they are not equally prone to feeling emotional pain in response to traumatic experiences?
There is another ramification to this report: humans are wired to not want to be rejected by other humans. As the authors state, this is probably a consequence of human evolution. Well, suppose it becomes possible for people to modify their minds to reduce their need for acceptance by others. This would have all sorts of consequences for behavior. A great many human activities are performed (for both good and ill) in order to win acceptance from others. What would be the net effect of a reduced desire to be accepted? My guess is that among many other effects it would tend to reduce altruistic behavior and would reduce the incentive to avoid doing things that are inconsiderate of others.
The ability to edit memories, change one's personality, change very basic desires, and to change what causes pain or pleasure could provide us with many benefits. But it could also create changes in human nature that undermine civilization. When it becomes possible to reduce one's feeling of empathy or to stop oneself from feeling guilty over acts committed against others some malevolent and foolish people will choose to do so. This could be done out of a motive to reduce suffering. Some who feel very rejected and in pain from rejection will decide to eliminate the pain response that occurs when one is rejected. Imagine the consequences if more people became indifferent to the approval of others.
The ability to do brain reprogramming is going to force the issue of what constitutes a rights-possessing being. Ayn Rand's claim that rights are a product of our ability to think rationally is just not an adequate explanation. It is part of the explanation but only a part. What we feel pain or pleasure over in dealing with others plays a large role in causing us to treat others fairly or unfairly. It seems inevitable that our minds will become much more mutable in the future. Once that happens we will have to face the question of how to decide whether each person who opts to have mind modifications done still possesses the minimum set of qualities that are necessary for a human to possess to safely live in a society, respect the rights of others, and carry out responsibilities that are expected of anyone who is a member of that society.
This is far from the only report that suggests there are qualities of the human brain that help support the functioning of humans in societies. See, for example my previous post on altruistic punishment for another example. Also, see the post Emotions Overrule Logic To Cause Us To Punish.
Professor Jaroslav Flegr of Charles University in Prague has discovered evidence that infection by the intracellular protozoan parasite Toxoplasma gondii (T. gondii) causes changes in human personalities.
He found the women infected with toxoplasma spent more money on clothes and were consistently rated as more attractive. “We found they were more easy-going, more warm-hearted, had more friends and cared more about how they looked,” he said. “However, they were also less trustworthy and had more relationships with men.”
By contrast, the infected men appeared to suffer from the “alley cat” effect: becoming less well groomed undesirable loners who were more willing to fight. They were more likely to be suspicious and jealous. “They tended to dislike following rules,” Flegr said.
Why all the cat parallels? The parasite infects cats and is passed on to rats via cat feces. In rats it creates the proverbial fatal attraction.
LONDON - Scientists have discovered a parasite that inhabits rats and makes them feel a suicidal attraction for cats. The parasite, which infects as many as one in five rats, can also affect humans.
The parasite, nicknamed the love bug but scientifically known as Toxoplasma gondii, an intracellular protozoan, infects the rodent's brain, inducing an effect similar to Prozac so it becomes less fearful of cats.
It might be too late to get rid of Fluffy. University College London T. gondii researcher Dr. Dominique Soldati says that once infected you have it for life, and the parasite population gradually grows. “Once you are infected you cannot get rid of this parasite and the numbers of them slowly grow over the years,” she said. “It’s not a nice thought.”
The scientists set up a wild enclosure for rats, with different smells in each corner. Rats infected by the parasite were attracted to the smell of cat urine.
The minds of infected rats are subtly altered so that they become less able to avoid being captured and eaten by cats. Cat feces eaten by rats thus spread the parasite to rats that the cats can eventually capture and eat, completing the cycle.
As this review of the molecular biology of T. gondii demonstrates, scientists are looking for ways to stop and destroy this parasite once it has infected humans.
T. gondii has evolved a remarkable ability to survive in its host, typically for long periods of time, with minimal pathogenicity and in a variety of cell types. However, the mechanisms by which this obligate intracellular parasite becomes a master at manipulating the structures and pathways of the host cell for its own nefarious purposes to create a hospitable environment remain difficult to analyse in the background of a nucleated host cell.
T. gondii infection is especially dangerous for children born to women who become infected during pregnancy.
Toxoplasma, mainly transmitted by consumption of contaminated meat or by cat faeces, chronically infects half the world's population. The pathogen is a leading cause of neurological birth defects in children born to mothers who contract the disease during pregnancy and can cause fatal toxoplasmosis encephalitis in immunosuppressed patients.
Scientists hope that understanding the gene's function will aid efforts to develop drugs that target and block the way Apicomplexa parasites penetrate host cells.
Women who want to have children should probably give away Fluffy to post-menopausal women who show signs of promiscuity and large tasteful wardrobes.
But what about the threat to Western Civilization? Cats are making our women less trustworthy and more superficial while making our men into scruffy loners who are unwilling to follow rules. If some terrorist group were releasing pathogens that had this effect we'd be hunting them down and killing them without mercy (assuming the FBI and CIA could find them - the anthrax mail case may never be solved). But since kitties are fluffy, make cute purring sounds, and occasionally rub up against people's legs they are considered adorable by many. This leaves them free to operate in plain sight to undermine Western Civilization while every single one of them affects an air of total indifference and disinterest.
Update: Christopher Genovese has taken the time to read some of Jaroslav Flegr's research papers and presents an excellent analysis of the question of whether Flegr's work has discovered a real effect on humans. My take is that while it isn't clear that Flegr has proved his case, it is plausible, in part because the human domestication of cats happened fairly recently (in ancient Egypt, if memory serves) in human evolution. So humans probably haven't had time to evolve an effective response to a deleterious cat parasite. By contrast, the likelihood of getting harmful parasites from dogs would seem lower since humans have been living with dogs for a much longer period of time.
Jay A. Gottfried, John O'Doherty and Raymond J. Dolan of the Institute of Neurology in London, U.K. have found that human brains, not too surprisingly, form connections between environmental stimuli and desired foods rather like Pavlov's dog.
For their Science study, the scientists used brain imaging on a group of 13 hungry human volunteers. The experiments involved an initial training period, in which the volunteers were shown abstract images in association with the smell of vanilla or peanut butter. Meanwhile, a functional magnetic resonance imaging (fMRI) machine monitored their brain activity.
The volunteers began to automatically associate certain images with either smell, according to the researchers.
Then, the volunteers ate their fill of either vanilla ice cream or peanut butter sandwiches, being asked to eat until they didn't want any more, but weren't uncomfortably full.
Back in the fMRI machine, the volunteers again experienced the various combinations of images and the two food smells. The researchers observed a change in brain activity for the responses related to the food that they had just eaten, but not for the other food.
The change was primarily in the brain's amygdala and orbitofrontal cortex, where the activity decreased significantly. Previous studies have also implicated these regions in conditioning. The researchers also observed some activity differences in other areas, including the ventral striatum, which is associated with the reward pathways in drug addiction.
About the role the amygdala and orbitofrontal cortex might play in the conditioning process, Gottfried stressed the importance of the fact that this activity decreased only when the volunteers were shown images corresponding to the particular food they had eaten.
Thus, this sort of brain activity is likely involved with anticipating the enjoyment of a given food – which also decreased after the volunteers had eaten until they didn't want any more. The volunteers' amygdala and orbitofrontal cortex responses remained the same for the smell (or corresponding picture) of the second food, which they did not eat.
Ultimately, this brain system may be far more versatile and wide-reaching than just a possible explanation for why food cravings can strike out of nowhere. It may offer an adaptable system for learning, Gottfried said, that allows us to recognize cues that predict important events and to discard cues that are no longer useful.
Jay Gottfried speculates that in people who suffer from obesity, eating a given food may not produce the normal decrease in the stimulus effect of images associated with the availability of that food. How quickly we become disgusted with a food may in part determine how well we can regulate our eating.
Importantly, the team also showed that the human brain can put a "brake" on the powerful desire for certain foods once the appetite has been sated. This system to turn the "delectable into the distasteful" may be crucial in regulating behaviour, they say. Detecting faults in this system might in future help shed light on compulsive eating disorders and substance addictions, speculates Gottfried, a neurologist.
Since satiation tends to be at least partially specific for particular foods people are more likely to overeat if presented with a succession of different kinds of foods.
Gottfried was trying to explain what he calls the "restaurant phenomenon."
"You sit down to your eight-course meal for your birthday and you have gone though all the appetizers and entrees and just as you feel you can't fit one more thing in your tummy, then they bring the dessert menu or the dessert cart rolls by and suddenly you discover you have room for the chocolate fondant," Gottfried said in a telephone interview.
"This is specific satiation -- you are full of one thing but not another."
"These processes operate in a very food-specific fashion, and this is important," Gottfried explains. "If, ultimately, what you need is a good balance of nutrients, vitamins, minerals, and so forth, it's important for you to be sampling different foods." The braking effect applies only when it comes to the same food item.
This result suggests a couple of potential strategies for limiting calorie consumption. One is to avoid environmental cues -- food pictures, food smells, or the sight of real food -- that remind you of a particular food. Another is to eat meals with fewer courses.
Michael May lost his sight at age 3 in an accident, and over 40 years later stem cell therapy helped restore sight to one eye. A few years on, it is clear that the parts of his brain that do image processing never developed and show no signs of going through the necessary development now.
He can discern motion, two-dimensional forms and color. "That was the most amazing thing. Initially I hadn't thought about color. To all of a sudden have the faucet turned on for this whole world of colors, it was amazing. Somewhere in the recesses of my mind was the ability to discern colors," May said in a telephone interview.
What he can't do is recognize objects in three dimensions, make sense of complex landscapes, recognize faces or interpret facial expressions.
Ione Fine and Donald MacLeod of the University of California at San Diego have conducted a battery of tests. Brain scans showed that the part of the brain that becomes activated in sighted people when they see faces and objects remained dormant in May. But when he looks at an object that is moving, the motion detection part of his brain lights up with activity.
The findings suggest that certain visual skills, such as detecting color and motion, are more hard-wired and develop earlier in infancy than others.
When asked to identify a cube illustrated on a two-dimensional computer screen, for example, Mr. May failed. But once Miss Fine commanded the cube to rotate, simulating motion in three dimensions, he immediately recognized it.
One scientist likens it to how it is easier to learn languages when young than when older. This result is also consistent with experiments done on cats decades ago in which their heads were kept in harnesses when they were young and they were shown only vertical or horizontal lines for a critical number of weeks (it's been too long since I learned this to recall it in precise detail). After that, the cats which had been exposed to vertical bars could recognise them but could not recognise horizontal bars. The cats exposed to horizontal bars during the critical developmental period could see only horizontal bars.
It may eventually become possible to feed neural stem cells and hormones to the part of the brain that processes sight in order to get it to revert to a more plastic state, so that in cases where sight is restored the brain can once again go through the process of learning how to see.
James Wood, 14, uses more of his brain to solve puzzles than other children. To the surprise of Melbourne experts conducting research, James - and seven other mathematically gifted children - used both hemispheres, and parts he wasn't expected to.
"It may be there's big education consequences of this work," says Fred Mendelsohn, head of the Howard Florey Institute. "If it's something that can be learned, you can obviously make a big difference to the way you teach people, the way they learn, the way they develop skills."
It could be that the brains of smarter kids have connections coded for genetically to allow them to recruit more parts of the brain to work on a given problem. If so, it may not be possible to train a brain using teaching techniques to connect various parts of the brain in ways that would allow more of it to be used when solving problems.
Melissa Jungers, Caroline Palmer, and Shari Speer, of Ohio State University have discovered that musicians and speakers adjust the speed and, in the case of the speakers, even the pattern of phrase breaks to match what they just heard.
In one study, the researchers used 16 experienced adult pianists. The pianists sight-read two melodies to establish their preferred performing rate. They then alternated listening to computer-generated melodies and playing different melodies on the piano. They heard 10 melodies and played 10 melodies. The participants were led to believe they were participating in a memory test – they were given no direction about how slow or fast to play.
The melodies they heard were performed at relatively slow or relatively fast rates. When they played, the pianists were provided with notation that did not include bar lines or time signatures, so there would be no indication of meter or any indirect indication of the rate at which the tune should be played.
After listening to a slow melody, the pianists played their melody more slowly (an average of 6.8 seconds) than after hearing a faster melody (an average of 5.3 seconds). The pianists' preferred rate – the rate at which they played before hearing the other melodies – was in between those two times, at 6.1 seconds.
The second study was designed to be as close as possible to the first study – except that the researchers examined speech rather than music. In this study, 64 native English speakers spoke 10 short sentences (6 to 7 words each). First, speakers read two sentences aloud from a computer screen to measure their preferred speaking rate. Next, speakers alternated listening to and reading sentences. In some cases, they listened to sentences spoken at a fast rate, in other cases at a slow rate. The participants were not told how fast or slow to speak – they were simply told to try to remember the sentences for a later memory test.
Forty-three of the 64 speakers spoke faster after they heard a sentence spoken at a fast tempo and slower after they heard a slower delivery.
When they heard the slow speakers, their sentences averaged 1.81 seconds long, while when they heard the fast speakers their sentences averaged 1.72 seconds long. Their preferred rate of speech averaged 1.80 seconds per sentence.
In addition, this study showed that speakers were also influenced by the pattern of phrase breaks in the recordings they heard. If the recordings had the largest pause in the sentence after the verb, for example, that is where the speakers tended to put their biggest pause.
It would be interesting to play music at different rates and then see if people will speak at different speeds as a result.
Steven Platek and colleagues have found that those most likely to yawn in response to others yawning have more empathy.
Platek said the yawners who mimicked were the same kind who said "ouch" when seeing someone else in pain. They tended to be more empathetic to their fellow man.
This makes evolutionary sense, agrees Ronald Baenninger, who has studied yawning at Temple University in Philadelphia, Pennsylvania. Contagious yawning may have helped our ancestors to coordinate times of activity and rest. "It's important that all group members be ready to do the same thing at the same time," Baenninger says.
If you want to test someone's degree of empathy then fake a yawn and see if the person yawns in response.
Professor Moshe Koppel of Bar-Ilan University in Israel has developed software that can identify male versus female writers of fiction and non-fiction text with 80% accuracy.
Female writers use more pronouns (I, you, she, their, myself), say the program's developers, Moshe Koppel of Bar-Ilan University in Ramat Gan, Israel, and colleagues. Males prefer words that identify or determine nouns (a, the, that) and words that quantify them (one, two, more).
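To make the word-category signal concrete, here is a toy sketch of the idea in Python. The word lists and the decision rule below are simplified assumptions for illustration only; they are not Koppel's actual feature set, weights, or classifier.

```python
# Toy illustration of a function-word gender signal, loosely inspired by the
# word categories Koppel's group reported. The word lists and the decision
# rule are simplified assumptions, not the published model.

PRONOUNS = {"i", "you", "she", "he", "we", "they", "their", "myself", "her", "him"}
NOUN_MARKERS = {"a", "an", "the", "that", "this", "one", "two", "more", "some", "many"}

def function_word_score(text):
    """Return (pronoun_rate, noun_marker_rate) per word of text."""
    words = text.lower().split()
    if not words:
        return 0.0, 0.0
    pron = sum(w in PRONOUNS for w in words)
    mark = sum(w in NOUN_MARKERS for w in words)
    return pron / len(words), mark / len(words)

def guess_author_style(text):
    """Crude rule: more pronouns than noun markers -> 'female-leaning', else 'male-leaning'."""
    pron_rate, mark_rate = function_word_score(text)
    return "female-leaning" if pron_rate > mark_rate else "male-leaning"
```

A real classifier would weight hundreds of such features learned from labeled text, and even then only reaches about 80% accuracy, per the article.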
My guess is that a lot of other kinds of patterns in writing styles will be found that provide indications about differences in personality characteristics and cognitive processes. While these patterns may correlate with gender, they are likely to be found to correlate with other characteristics as well, such as political leanings, aggressiveness, empathy, and other mental traits.
Prof Koppel said that when his research group first submitted one of two papers to be published to the publishing panel of the prestigious National Academy of Sciences in the United States, the referees rejected it "on ideological grounds". "They said, ‘What do you mean? You’re trying to make some claim about men and women being different, and we don’t know if that’s true. That’s just the kind of thing that people are saying in order to oppress women’," he told The Boston Globe.
As more empirical evidence is gathered about innate differences in cognitive processes between individuals, the ideological leftists who embrace a tabula rasa view of human nature are going to find themselves increasingly on the defensive. The steady advance of a truly scientific model of human nature will undermine all ideological belief systems about human nature and politics. The extent to which ideologies will be undermined by science will vary. Every ideology amounts to a set of simplifying assumptions about reality. The amount of simplification varies between ideologies, and the importance of each simplification for the overall structure of each ideology varies considerably as well. But at this point it seems clear that radical egalitarians who claim we all have equal innate abilities and equal innate desires and preferences are going to find their ideology badly mauled by scientific advances in our understanding of human genetics and neurobiology.
September 17, 2002
There is a sound neurological basis for the cliché that men are more aggressive than women, according to new findings by scientists at the University of Pennsylvania School of Medicine.
Using magnetic resonance imaging (MRI) scans, the Penn scientists illustrated for the first time that the relative size of the sections of the brain known to constrain aggression and monitor behavior is larger in women than in men.
The research, by Ruben C. Gur, PhD, and Raquel E. Gur, MD, PhD, and their colleagues in Penn's Department of Psychiatry and Department of Epidemiology, was published in a recent issue of the journal Cerebral Cortex.
The findings provide a new research path for therapies that may eventually help psychiatric patients control inappropriate aggression and dangerous patterns of impulsive behavior. They also bolster previous work by the Gurs demonstrating that although some gender differences develop as result of adaptive patterns of socialization, other distinctions are biologically based and probably innate.
"As scientists become more capable of mapping the functions of activity in various parts of the brain, we are discovering a variety of differences in the way men and women's brains are structured and how they operate," said Ruben Gur, first author of the study.
"Perhaps the most salient emotional difference between men and women, dwarfing all other differences, is aggression," he said. "This study affords us neurobiological evidence that women may have a better brain capacity than men for actually 'censoring' their aggressive and anger responses."
The Gurs' work relied on established scientific findings that human emotions are stimulated and regulated through a network that extends through much of the limbic system at the base of the brain (the region encompassing the amygdala, hypothalamus and mesocorticolimbic dopamine systems), and then upward and forward into the region around the eyes and forehead (the orbital and dorsolateral frontal area), and under the temples (the parietal and temporal cortex).
The amygdala is involved in emotional behavior related to arousal and excitement, while the orbital frontal region is involved in the modulation of aggression.
The Gurs' study measured the ratio of orbital to amygdala volume in a sample of 116 right-handed, healthy adults younger than 50 years of age; 57 subjects were male and 59 were female. Once the scientists adjusted their measurements to allow for the difference between men and women in physical size, they found that the women's brains had a significantly higher volume of orbital frontal cortex in proportion to amygdala volume than did the brains of the men.
"Because men and women differ in the way they process the emotions associated with perception, experience, expression, and most particularly in aggression, our belief is that the proportional difference in size in the region of the brain that governs behavior, compared to the region related to impulsiveness, may be a major factor in determining what is often considered 'gendered-related' behavior," Raquel Gur said.
Other Penn investigators participating in the study were Faith Gunning-Dixon, PhD, and Warren B. Bilker, PhD, of the Department of Epidemiology.
In fact, only one man had a "modulator" that was at least seven times larger than his "emotional stimulator," compared to eight women, and only three women had a really small modulator (less than 3.5 times the size of the stimulator) compared to about a quarter of the men. But oddly enough, one woman had the smallest modulator of all, less than two times the size of her amygdala, suggesting that it might not be a good idea to rile her up.
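The "modulator" versus "stimulator" comparison above is just a volume ratio, so it is easy to make concrete. A minimal sketch: the 7 and 3.5 cutoffs come from the paragraph above, while the function names and bucket labels are mine.

```python
def orbital_amygdala_ratio(orbital_volume, amygdala_volume):
    """Ratio of orbitofrontal ('modulator') to amygdala ('stimulator') volume."""
    if amygdala_volume <= 0:
        raise ValueError("amygdala volume must be positive")
    return orbital_volume / amygdala_volume

def modulator_bucket(ratio):
    """Classify a ratio using the cutoffs mentioned above (labels are illustrative)."""
    if ratio >= 7:
        return "large modulator"
    if ratio < 3.5:
        return "small modulator"
    return "intermediate"
```

By this illustrative bucketing, the one unusual woman mentioned above (ratio below 2) would land well inside "small modulator".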
Here is the original paper Sex Differences in Temporo-limbic and Frontal Brain Volumes of Healthy Adults.
Petty conducted the study with Pablo Brinol, a former doctoral student at Ohio State now at the Universidad Autonoma de Madrid in Spain. The research appears in the current issue of the Journal of Personality and Social Psychology.
In one study, the researchers told 82 college students that they were testing the sound quality of stereo headphones – particularly how the headphones would perform when they are being jostled, as during dancing or jogging.
Half the participants were told to move their heads up and down (nodding) about once per second while wearing the headphones. The other half was told to move their heads from side to side (shaking) while listening on the headphones.
All of the participants listened to a tape of a purported campus radio program that included music and a station editorial advocating that students be required to carry personal identification cards.
After listening to the tape, the participants rated the headphones and gave their opinions about the music and the editorial that they heard. The study found that head movements did affect whether they agreed with the editorial. But the effect is more complicated than might be expected.
The study found that nodding your head up and down is, in effect, telling yourself that you have confidence in your own thoughts – whether those thoughts are positive or negative. Shaking your head does the opposite: it gives people less confidence in their own thoughts.
So participants in this study who heard an editorial that made good arguments agreed more with the message when they were nodding in a “yes” manner than when shaking in a “no” manner. This is because the nodding movements increased confidence in the favorable thoughts people had about the good arguments, compared to shaking.
However, students who heard an editorial that made poor arguments showed the reverse pattern. These students agreed less with the message when they were nodding than when shaking. This is because the nodding movements increased confidence in the negative thoughts they had to the poor arguments compared to shaking.
Want to increase your self-confidence on some subject? Think about it while nodding yes. Or listen to or read someone else talking about it while nodding yes. Of course, if you are reaching your opinions on some subject without sufficient critical thought or knowledge then it makes sense to decrease your confidence on that subject so that you try harder to learn enough to be right.
Also, if someone is trying to sell you on something and they are nodding yes while interacting with you, then keep in mind that their self-confidence may be due more to the nodding than to their actually knowing what the heck they are talking about.
As the saying goes, "I am not making this up." The antidepressant SSRI citalopram controls obsessive-compulsive shopping disorder.
In a study appearing in the July issue of the Journal of Clinical Psychiatry, patients taking citalopram, a selective serotonin reuptake inhibitor that is approved for use as an antidepressant, scored lower on a scale that measures compulsive shopping tendencies than those on a placebo. The majority of patients using the medication rated themselves "very much improved" or "much improved" and reported a loss of interest in shopping.
"I'm very excited about the dramatic response from people who had been suffering for decades," said Lorrin Koran, MD, professor of psychiatry and behavioral sciences and lead author of the study. "My hope is that people with this disorder will become aware that it's treatable and they don't have to suffer."
Compulsive shopping disorder, which is estimated to affect between 2 and 8 percent of the U.S. population, is characterized by a preoccupation with shopping for unneeded items and the inability to resist purchasing such items. Although some people may scoff at the notion of shopping being considered an illness, Koran said this is a very real disorder. It is common for sufferers to wind up with closets or rooms filled with unwanted purchases (one study participant had purchased more than 2,000 wrenches; another owned 55 cameras), to damage relationships by lying to loved ones about their purchases, and to rack up thousands of dollars in debt.
"Compulsive shopping leads to serious psychological, financial and family problems including depression, overwhelming debt and the breakup of relationships," Koran said. "People don't realize the extent of damage it does to the sufferer."
Earlier studies suggested that the class of medications known as SSRIs might be effective for treating the disorder, but this had not been confirmed through a trial in which participants didn't know whether they were taking a placebo or the actual medication. Koran and his team sought to test citalopram - the newest SSRI on the market at that time - by conducting a seven-week, open-label trial followed by a nine-week, double-blind, placebo-controlled trial.
The study involved 24 participants (23 women and one man) who were defined as suffering from compulsive shopping disorder based on their scores on the Yale-Brown Obsessive-Compulsive Scale-Shopping Version, or YBOCS-SV. Patients with scores above 17 are generally considered as suffering from compulsive shopping disorder. Most of the participants had engaged in compulsive shopping for at least a decade and all had experienced substantial financial or social adverse consequences of the disorder.
During the open-label portion of the study, each participant took citalopram for seven weeks. By the end of the trial, the mean score of the YBOCS-SV decreased from 24.3 at baseline to 8.2. Fifteen patients (63 percent) were defined as responders - meaning they self-reported as being "very much improved" or "much improved" and had a 50 percent or greater decrease in their YBOCS-SV scores. Three subjects discontinued their use of the medication because of adverse events such as headache, rash or insomnia.
The responders were randomized into the double-blind portion of the trial in which half took citalopram for nine weeks and the other half took a placebo. Five of the eight patients (63 percent) who took the placebo relapsed - indicated by self-reporting and YBOCS-SV scores above 17. The seven patients who continued the medication saw a decrease in their YBOCS-SV scores and also reported a continued loss of interest in shopping, cessation of browsing for items on the Internet or TV shopping channels, and the ability to shop normally without making impulsive purchases.
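The study's responder rule combines a self-report with a score change, so it can be stated precisely. Here is a minimal sketch assuming the thresholds described above (the function names and structure are mine):

```python
def ybocs_sv_responder(baseline_score, final_score, self_report):
    """Responder per the study's stated rule: self-rated 'much improved' or
    'very much improved' AND a 50 percent or greater drop in YBOCS-SV score."""
    improved = self_report in ("much improved", "very much improved")
    halved = final_score <= 0.5 * baseline_score
    return improved and halved

def has_compulsive_shopping_disorder(score):
    """Scores above 17 on the YBOCS-SV are generally taken to indicate the disorder."""
    return score > 17
```

For example, a participant going from the study's mean baseline of 24.3 down to 8.2 while self-reporting "very much improved" counts as a responder, while the same score drop with no self-reported improvement does not.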
I've argued in the past that biomedical advances will lead to treatments that cause large changes in group-average behavior. Well, here's a drug that already exists which has the potential to affect the size of the GDP and the efficiency of the markets. My guess is that it would be a net benefit to the economy over a longer period of time since consumer debt in the US economy is much too high. People who do not make compulsive purchases probably tend to accumulate longer-lasting assets and to invest more.
Takeshi Satow, of the Kyoto University Graduate School of Medicine in Japan, has discovered that applying an electrical stimulation to an area on the surface of the inferior temporal gyrus region of the brain produces happiness and laughter.
Researchers found the tickle spot on one epileptic woman's brain when they realized that stimulating a specific brain region caused her to feel happy and laugh.
This brings to mind an excellent story written by Spider Robinson entitled God Is An Iron about a woman who was trying to commit suicide by starving to death while hooked up to a device that gave her intense continuous pleasure. That story became the second chapter of his novel Mindkiller.
While current drugs for making people happier are fairly crude, it seems inevitable that far more narrowly focused techniques for inducing happiness with fewer side effects will be developed. Whether, as in Robinson's story, someone will want to be happy and still want to kill themselves remains to be seen.
Scientists used a stimulation technique to improve the sensitivity of people's fingertips, and then gave them drugs that either doubled or deleted this effect. Similar skin stimulation/drug treatment combinations may eventually help the elderly or stroke victims button shirts and aid professional pianists according to the authors of a paper appearing in the 04 July issue of the journal Science, published by AAAS, the science society.
Finger stimulations and drugs can temporarily reorganize parts of the human brain. This stimulation, called co-activation, shuffles the synapses that link neurons. The stimulated area becomes more sensitive as more neurons are recruited to process encountered tactile information. The scientists showed that amphetamine doubled stimulation-induced gains in tactile acuity. In the presence of an alternate drug, an NMDA blocker, the improvements in tactile acuity, or perceptual learning, gained via finger stimulations were lost.
Dinse said that related treatments could improve a person's ability to read Braille and that drug-mediated muscle stimulation could help the elderly and chronic pain patients perform everyday tasks.
"We are at the beginning of an era where we can interact with the brain. We can apply what we know about brain plasticity to train it to alter behavior. People are always trying to find ways to improve learning. What we tested is unconscious skill learning. How far could this carry to cognitive learning?…that remains to be seen," said Dinse.
"My personal opinion," Dinse maintained, "is that progress in brain pharmacology will sooner or later result in implications that are equally or possibly more dramatic than the implications tied to discussions about genes and cloning."
Drugs will almost certainly be developed that enhance the training of the mind to increase specific types of sensitivity and discernment of sensory signals. Musicians who learn to discern finer-grained differences between musical notes are another example. Also, drugs will be found - perhaps the same drugs - that will enhance the ability to learn new forms of coordination, such as when learning a musical instrument. Also, drugs will be found that enhance general learning without causing harmful side effects (unless, of course, you use the drugs to learn truly harmful ideas such as a vile and dangerous religious belief).
"We were able to change the tactile acuity of 80-year-old subjects to a performance of a 50-year-old," Dinse said -- a 50 percent to 100 percent improvement.
Coactivation causes a remapping of somatosensory cortex, in which the area used to represent the index finger becomes larger and produces a stronger EEG signal. In the new study, this cortical reorganisation was considerably more dramatic in the group that had received amphetamine.
Imagine the possibilities once scientists manage to find ways to increase the sensitivity of sex organs. That's one kind of enhancement that will overcome public opposition to human brain enhancement.
While Dinse is using amphetamines for his studies he recognises that they have too many effects including effects that are harmful. Another recent study reports that ex-users of methamphetamine show signs of neuronal damage.
But there is significant evidence that the drug can cause damage to the brain's neurons - the cells which are used for thinking.
Methamphetamine users have reduced concentrations of a chemical called N-acetyl-aspartate, which is a byproduct of the way neurons work.
A decrease in brain levels of N-acetyl-aspartate occurs in a number of neurological disorders such as multiple sclerosis and Alzheimer's disease. It is found at lower levels even in ex-methamphetamine users. Hence methamphetamine appears to cause prolonged neuronal changes that are likely signs of neurological damage.
In a review of selenium metabolism (which is worth a read if you have any interest in selenium metabolism) the authors mention in passing that methamphetamine use causes free radical generation and damage to dopaminergic neurons.
Methamphetamine (MA) exposure of animals results in enhanced formation of superoxide radical (O2–) and nitric oxide (NO), which interact to produce peroxynitrite (OONO–). Peroxynitrite is a potent oxidant, leading to dopaminergic damage (Imam and Ali 2000). Thus, multiple dose administration of MA to mice results in long-lasting toxic effects in the nigrostriatal dopaminergic system, which is a relevant model of PD.
So kids, the moral of this story is don't try this at home. It is bad for you.
University of Michigan evolutionary biologist Jianzhi “George” Zhang argues that the development of the ability to see colors in our primate ancestors led to the loss of the ability to respond to sexual pheromones.
Zhang believes that a significant gene duplication made the difference and that it happened sometime between 23 million years ago and the split of the New World and Old World primates about 35 million years ago. An ancestor of the Old World primates (humans, chimps, gorillas, orangutans, gibbons, baboons and guerezas) developed a second copy of the red/green color-vision gene, which resides on the X chromosome. Female New World monkeys have full color vision because females have two X chromosomes that harbor both red and green color vision genes. But males only have one X chromosome, so New World males have only one copy of either the red or green gene, and that leaves them color-blind. After the red/green gene duplication in the Old World family however, even the males got color vision too.
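The X-linked inheritance logic above can be made concrete with a short sketch. This is my illustration, not code from Zhang's work; the `trichromat` function and the tuple representation of opsin loci are assumptions for demonstration purposes only.

```python
# Illustrative sketch: why the X-linked opsin arrangement leaves New World
# males color-blind while the Old World gene duplication gives both sexes
# full color vision. (The autosomal blue opsin is assumed present in all.)

def trichromat(x_chromosomes):
    """A primate is trichromatic if its X chromosomes collectively
    carry both a red and a green opsin allele."""
    alleles = set()
    for x in x_chromosomes:
        alleles.update(x)
    return {"red", "green"} <= alleles

# New World monkeys: one opsin locus per X, carrying either red or green.
nw_female_het = [("red",), ("green",)]   # heterozygous female: two X's
nw_male = [("red",)]                      # male: single X, one allele

# Old World primates after the duplication: every X carries both genes.
ow_male = [("red", "green")]

print(trichromat(nw_female_het))  # True  - heterozygous females see color
print(trichromat(nw_male))        # False - males are dichromatic
print(trichromat(ow_male))        # True  - duplication fixes males too
```

The key point the sketch captures is that before the duplication, full color vision required two different alleles, and a male's single X could never supply both.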
Once our ancestors could see in color, visual inspection of a potential mate yielded far more useful information, and at a greater distance, than scent did. As a result of natural selection, color-seeing primates came to have neuronal wiring that placed much more importance on appearance in mate choice. In Zhang's view it is therefore not coincidental that around the time our primate ancestors developed the ability to see color they also lost the ability to respond to pheromones:
To test their idea, Zhang’s team zeroed in on a human gene called TRP2, which makes an ion channel that is unique to the pheromone signaling pathway. They found that in humans and Old World primates, this gene suffered a mutation just over 23 million years ago that rendered it dysfunctional. But because we could use color vision for mating, it didn’t hurt us. In turn, the pheromone receptor genes that rely on this ion channel fell into disuse, and in a random fashion, mutated to a dysfunctional state because they haven’t experienced any pressure from natural selection. Zhang calls this process “evolutionary deterioration.”
The FuturePundit blog focuses on the future. This report is about events that took place tens of millions of years ago in our evolutionary past. So how does this discovery about the history of human sexual evolution figure into the human future? In a couple of ways:
Many changes that happened in our evolutionary past will not be lost to us forever once it becomes possible to do genetic engineering to ourselves and our progeny. If we want to recover lost functionality or behavioral tendencies it will eventually become possible to do so. I would go so far as to predict that there will eventually be cultish groups who will pursue biological nostalgia fads to make themselves more authentic and less modern by giving themselves features associated with our pre-human ancestors. These faddists of the future will use biological technology to take the back-to-nature movement to a whole 'nuther level.
Allan Snyder, director of the Centre for the Mind at the University of Sydney, is using transcranial magnetic stimulation (TMS) to slow down or speed up various parts of the brain, and by doing so appears to be able to unlock savant intellectual abilities dormant in many minds.
As remarkable as the cat-drawing lesson was, it was just a hint of Snyder's work and its implications for the study of cognition. He has used TMS dozens of times on university students, measuring its effect on their ability to draw, to proofread and to perform difficult mathematical functions like identifying prime numbers by sight. Hooked up to the machine, 40 percent of test subjects exhibited extraordinary, and newfound, mental skills. That Snyder was able to induce these remarkable feats in a controlled, repeatable experiment is more than just a great party trick; it's a breakthrough that may lead to a revolution in the way we understand the limits of our own intelligence -- and the functioning of the human brain in general.
Snyder claims TMS can rapidly improve drawing abilities:
Professor Allan Snyder and colleague Elaine Mulcahy say tests on 17 volunteers show their device can improve drawing skills within 15 minutes.
Some scientists think he may be on to something important.
In a 1999 paper, Snyder and his colleague John Mitchell challenged the compulsive-practice explanation for savant abilities, arguing that the same skills are biologically latent in all of us. "Everyone in the world was skeptical," says Vilayanur Ramachandran, director of the Center for Brain and Cognition at the University of California at San Diego. "Snyder deserves credit for making it clear that savant abilities might be extremely important for understanding aspects of human nature and creativity."
"I wrote a comment two or three years ago in Nature, on his theory on autism and early information processing. I never commented on his TMS stuff and the reason is I'm a little bit skeptical. And there's no data so far available supporting his claims," said Professor Niels Birbaumer, of the University of Tubingen, Germany.
For some more relevant reading, see Darold A. Treffert's articles on savants, which include his own comments on the results of studies of TMS, which he calls repetitive transcranial magnetic stimulation (rTMS).
Anorexia and bulimia nervosa may be caused by an auto-immune disorder in which antibodies attack the hypothalamus or pituitary.
Three-quarters of the anorexic and bulimic women studied by Serguei Fetissov of the Karolinska Institute in Stockholm carry blood antibodies targeted against appetite centres in the brain, he finds. Just 16% of those without eating disorders have such antibodies.
Another article with additional details.
To test the theory, the investigators withdrew blood serum from 57 women between the ages of 17 and 42 who had anorexia, bulimia or both. Most of the women (74 percent) produced antibodies that, when applied to sections of rat brains and rat pituitary glands, selectively attached to cells that produce three specific neuropeptides: alpha-MSH, ACTH and LHRH.
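How strong is the 74% versus 16% contrast? A rough two-proportion z-test gives a sense. Note the assumptions here: the article gives 57 patients, but the excerpts don't state the control group size, so the n=40 below is a placeholder, not a reported figure.

```python
# Back-of-the-envelope significance check. The 74%-of-57-patients figure is
# from the article; the 16% control rate is reported, but the control group
# size (n=40 here) is a GUESS, not stated in the excerpts.
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                        # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))  # pooled standard error
    return (p1 - p2) / se

patients_pos = round(0.74 * 57)   # ~42 of 57 patients antibody-positive
controls_pos = round(0.16 * 40)   # hypothetical control group of 40

z = two_proportion_z(patients_pos, 57, controls_pos, 40)
print(round(z, 1))  # well above the 1.96 threshold for p < 0.05
```

Even under conservative guesses about the control group size, a gap this large between the two proportions would be highly statistically significant.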
This is a fascinating result. The targeting of adrenocorticotropic hormone suggests that stress may trigger the auto-immune response. But there may be a genetic predisposition for this inappropriate immune response. It brings up the question of just what other behavioral and endocrine disorders of currently unknown cause might be caused by auto-immune responses.
This technique could conceivably be used to discover the political sympathies of a suspected traitor or terrorist.
In the study, Decety and doctoral student Thierry Chaminade used positron emission tomography (PET) scans to explore what brain systems were activated while people watched videos of actors telling stories that were either sad or neutral in tone. The neutral stories were based on everyday activities such as cooking and shopping. The sad stories described events that could have happened to anyone, such as a drowning accident or the illness of a close relative. The actors were videotaped telling the stories, which lasted one to two minutes, with three different expressions – neutral, happy or sad.
Decety and Chaminade found that, as people watched the videos, different brain regions were activated depending on whether an actor's expressions matched the emotional content of the story.
When the story content and expression were congruent, neural activity increased in emotional processing areas of the brain – the amygdala and the adjacent orbitofrontal cortex and the insula. In addition, increased activation also was noted in what neuroscientists call the "shared representational" network which includes the right inferior parietal cortex and premotor cortex. This network refers to brain areas that are activated when a person has a mental image of performing an action, actually performs that action or observes someone else performing it.
However, these emotional processing areas were suppressed when the story content and expression were mismatched, such as by having a person smile while telling about his mother's death. Instead, activation was centered in the ventromedial prefrontal cortex and superior frontal gyrus, regions that deal with social conflict.
After watching each video clip, the 12 subjects in the study also were asked to rate the storyteller's mood and likability. Not surprisingly the subjects found the storytellers more likable and felt more sympathetic toward them when their emotional expression matched a story's content than when it did not.
"Sympathy is a very basic way in which we are connected to other people," said Decety. "We feel more sympathy if the person we are interacting with is more like us. When people act in strange ways, you feel that person is not like you.
"It is important to note that the emotional processing network of the brain was not activated when the subjects in our study watched what we would consider to be inappropriate social behavior. Knowing how the brain typically functions in people when they are sympathetic will lead to a better understanding of why some individuals lack sympathy."
Imagine a future where people can be genetically engineered to lack sympathy. I think the technical ability to eventually do this is a matter of when, not if. PET Scans might be used to detect the equivalent of Blade Runner replicants.
If SD-6 started using this technique to look for sympathy then Sydney Bristow of Alias could be in a whole world of trouble. Still, it would be hard to word the questions to trip her up since she is supposed to believe that by working for SD-6 she's already working for the CIA.
Using the enzyme chondroitinase ABC, Italian and British researchers found they could free nerve cells from a layer of sugar and protein so that they could once again grow new connections, which may be useful for treating brain trauma:
Pizzorusso and his team sealed one eye shut in each of about a dozen adult lab rats with otherwise perfect vision. Normally, the critical period for vision passes about five weeks after birth.
When the rats were anaesthetized and their visual cortexes given the bacterial enzyme, the biochemical digested the sugary proteins and unlocked the nerve cells, allowing them once again to grow new connections. Nerves from the sealed eye, in a case of "use it or lose it," began reaching towards the open eye and its stream of sensory input.
It's possible to suppress a trained fear response in rats:
The researchers then electrically stimulated the infralimbic area in rats that had been fear conditioned but not extinguished — in effect simulating the safety signal, while pairing it with the tone. Remarkably, the rats showed little freezing. Later, the rats continued to be unafraid of the tone even without the stimulation, suggesting that memory for extinction was strengthened by experimentally mimicking the safety signal.
Since the prefrontal cortex is known to project to the amygdala, a hub of fear memory deep in the brain, the researchers propose that increased activity of infralimbic neurons in the prefrontal cortex strengthens memory of safety by inhibiting the amygdala's memory of fear. They speculate that stimulating parts of the prefrontal cortex in anxiety disorder patients, using an experimental technique called transcranial magnetic stimulation, might help them control fear.
Imagine some future battlefield where one of the combatant countries has implanted mini-electrodes in the brains of its soldiers. It could suppress all fear and make its soldiers fearless. They wouldn't panic when under fire.
The neural stem cells in this study are adult stem cells that are present in the adult human brain. While this article doesn't state it, they are probably from the hippocampus. These cells normally differentiate to create neurons to form new memories and to replace lost cells. In Parkinson's Disease a particular type of neuron that makes the neurotransmitter dopamine dies at a much faster rate than normal. One potential way to treat Parkinson's is to induce the adult stem cells to reproduce at a faster rate and to differentiate into dopamine-producing neurons to replace the lost neurons. A research group at Jefferson Medical College has demonstrated that it is possible to induce human adult neural stem cells to produce dopamine:
Developmental biologist Lorraine Iacovitti, Ph.D., professor of neurology at Jefferson Medical College of Thomas Jefferson University in Philadelphia, is searching for ways to convert stem cells into dopamine-making neurons to replace those lost in Parkinson's. In previous work, she and her co-workers showed that mouse neural stem cells placed in rats with Parkinson's disease could develop into brain cells that produced tyrosine hydroxylase (TH), the enzyme needed to make dopamine.
Dr. Iacovitti, who also is associate director of the Farber Institute for Neurosciences at Jefferson, wanted to see if human neural stem cells could become dopamine-producing brain cells as well. She and her colleagues grew neural stem cells in a laboratory dish. Using a cocktail of protein growth factors and nutrients, the researchers found they could coax approximately 25 percent of the stem cells to make TH in the dish, proving the stem cells had the capacity to manufacture dopamine. What's more, when they removed the growth factor-cocktail, the cells continued to produce the enzyme. She reports her team's findings November 5 at the annual meeting of the Society for Neuroscience in Orlando.
"We have two examples of human stem cells that do this," she says. "The obvious extension [of these results] is to take those predifferentiated human dopamine neurons and transplant then into Parkinson's disease model systems."
This is still a long way away from a useful therapy. But the value of this result is that it shows that the neural stem cells have the potential to produce dopamine. They haven't gone down a differentiation path that precludes their ability to make dopamine. This is great news.
Using adult stem cells to do this has a few advantages aside from the obvious one of avoiding the ethical objections some people have to the use of embryonic stem cells. First of all, it is theorized by some scientists that adult stem cells may be at lesser risk of converting into cancer cells than embryonic stem cells. Also, neural stem cells, being more differentiated than embryonic stem cells, are some unknown number of steps closer to being neurons. So to convert them to neurons of a particular type may turn out to be easier to do. Adult stem cells are also already immunologically compatible with their hosts. Another big potential advantage is that adult stem cells are already in the host body. It may be possible to come up with a mix of drugs and/or gene therapy that would flow up into the brain and tell those adult neural stem cells to reproduce at a much faster rate and convert into dopamine-producing neurons.
The ability to better control adult neural stem cells has other applications in treatment of disease. See this recent post on hippocampal stem cells and depression for another example. In the long run the ability to do gene therapy on adult neural stem cells and to control their cell division and differentiation will be useful not only for treating classical neurological disorders such as Parkinson's but also to rejuvenate aging brains, to help lift depression, to repair traumas to the brain, and even to raise intelligence.
An interesting report in New Scientist about sea lion memory:
California sea lions may have the best memory of all non-human creatures. A female called Rio that learned a trick involving letters and numbers could still perform it 10 years later - even though she hadn't performed the trick in the intervening period.
U Penn scientist Daniel Langleben is developing fMRI as a lie detector and expects that within 50 years it will be possible to read minds. His collaborator Ruben Gur can already detect whether a person recognizes another person.
Ruben Gur, a neuropsychologist at the University of Pennsylvania, says new kinds of brain scans can reveal when a person recognizes a familiar face, no matter how hard he or she tries to conceal it.
The scanning machine, called a functional MRI, takes pictures that highlight specific parts of the brain activated during certain tasks. Telltale parts of your brain "light up," he said, when you are presented with a face you have seen before.
This is evidence that there is a part of the brain that is involved in moral enforcement:
The sudden and uncontrollable paedophilia exhibited by a 40-year-old man was caused by an egg-sized brain tumour, his doctors have told a scientific conference. And once the tumour had been removed, his sex-obsession disappeared.
The cancer was located in the right lobe of the orbitofrontal cortex, which is known to be tied to judgment, impulse control and social behaviour. But neurologists Russell Swerdlow and Jeffrey Burns, of the University of Virginia at Charlottesville, believe it is the first reported case linking damage to the region with paedophilia.
Did this guy have this behavior as a result of the tumor's increasing the amount of pleasure he felt for forbidden activities? Or did the tumor disable a part of the brain that enforces moral constraint? Either way there is an important lesson here for future human genetic engineering: It will be possible some day to genetically engineer humans who do not feel as much constraint to respect the rights of others. Genetic engineering will make possible the design of minds that have a different set of desires than the typical range of desires seen in fairly normal law-abiding humans today.
Some day in the future it will become possible to genetically engineer a mind to find obesity or an aged appearance attractive. Or a person's tastes could be changed to find sweets repulsive or bitter tastes yummy. So there will be ways to make changes that result in people whose tastes are, at least by our standards, extremely weird and even repugnant. These odd changes in tastes could create problems if, for instance, a fraction of the human race were made to like color schemes and designs that the rest of us think are disgusting. Imagine the political fights that could result over zoning ordinances or the appearance of public parks.
But the real danger from mind engineering comes from the ability to fiddle with the parts of the mind that involve moral constraints and intense desires directed at other humans. A mind could be engineered to feel no remorse at killing someone or to feel joy from beating and dominating others physically. A mind could be made to derive pleasure from deception, hurting others physically, and insulting others. Minds can be thought of as complex computer programs. In programming terms, then, modification of moral programming, and of the desires that moral programming constrains, poses the greatest future danger from genetic engineering.
A lot of attention is being paid to ethical arguments about choosing the sex of children or about selecting sperm and eggs that have higher intelligence. In terms of the potential danger to society these issues are small potatoes next to the issue of changing the genetic code in ways that affect moral and desire programming.
It would also be interesting to test pain sensitivity for milder pain without using anesthesia. One also wonders whether redheads have different levels of sensitivity to hot and cold or other differences in sensations. From a genetic engineering perspective it might be desirable to give your kids a different variation of the gene for the melanocortin-1 receptor and then add the red-headedness genetic variation just to the melanocytes of the hair after birth. Though we shouldn't rule out the possibility of unknown advantages to this mutation that manifest in other ways:
Ten red-haired women between 19 and 40 years of age and ten more with dark hair were given a commonly-used inhaled anaesthetic in the study. After each dose of the anaesthetic, the women were given a standard electric shock.
The process was repeated until the women said they felt no pain. Their reflexes were also monitored to assess the effectiveness of the painkiller. The researchers found that redheads required 20 per cent more anaesthetic to dull the pain.
A smaller group of blondes was tested and found to have the same pain sensitivity as brunettes:
The sun triggers a hormone that in turn triggers the production of melanin to form a tan. Redheads seldom tan easily because they have a defective receptor for that hormone — a quirk with this “melanocortin-1 receptor” that also leaves their hair red. Without its intended receptor to dock in, the melanin-producing hormone may cross-react with a related receptor on brain cells that influences pain sensitivity, Sessler explained.
Blocking LVGCC channels can prevent the extinction of fear:
In a discovery with implications for treatment of anxiety disorders, UCLA Neuropsychiatric Institute investigators have identified a distinct molecular process in the brain involved in overcoming fear. The findings will be published in the Oct. 15 edition of the Journal of Neuroscience. The study of how mice acquire, express and extinguish conditional fear shows for the first time that L-type voltage-gated calcium channels (LVGCCs) -- one of hundreds of varieties of electrical switches found in brain cells -- are required to overcome fear but play no role in becoming fearful or expressing fear. The findings suggest that it may be possible to identify the cells, synapses and molecular pathways specific to extinguishing fear, and to the treatment of human anxiety disorders.
"Brain plasticity, or the ability of the central nervous system to modify cellular connections, has long been recognized as a key component to learning and memory," said Dr. Mark Barad, the UCLA Neuropsychiatric Institute's Tennenbaum Family Center faculty scholar and an assistant professor in-residence of psychiatry at the David Geffen School of Medicine at UCLA. "The discovery of a distinct molecular process in overcoming fear bodes well for development of new drugs that can make psychotherapy, or talk therapy, easier and more effective in treating anxiety disorders. More broadly, the findings also suggest that distinct molecular processes may be involved in the expression and treatment of other psychiatric disorders."
Both the acquisition and extinction of conditional fear are forms of active learning. The acquisition of conditional fear requires a unique pairing of an initially neutral conditional stimulus with an aversive unconditional stimulus. In this research, the conditional stimulus was a tone and the unconditional stimulus was a mild foot shock.
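The study's distinction between acquisition and extinction as separate learning processes can be illustrated with a toy model. To be clear, this is my sketch, not the UCLA team's method: it uses a simple Rescorla-Wagner-style update and simply assumes that blocking LVGCCs disables the extinction update while leaving acquisition intact, which is the paper's qualitative finding.

```python
# Toy model (illustration only): acquisition and extinction as separate
# learning processes, where blocking LVGCCs disables only extinction.

def train(fear, got_shock, lvgcc_blocked, lr=0.5):
    """Update the fear association after one tone presentation."""
    if got_shock:                        # acquisition: tone paired with shock
        return fear + lr * (1.0 - fear)  # fear moves toward maximum
    if lvgcc_blocked:                    # extinction requires LVGCCs...
        return fear                      # ...so blocking them freezes fear
    return fear + lr * (0.0 - fear)      # extinction: tone alone, fear decays

fear = 0.0
for _ in range(5):                       # conditioning trials
    fear = train(fear, got_shock=True, lvgcc_blocked=False)

extinguished = blocked = fear
for _ in range(5):                       # extinction trials (tone alone)
    extinguished = train(extinguished, got_shock=False, lvgcc_blocked=False)
    blocked = train(blocked, got_shock=False, lvgcc_blocked=True)

print(round(fear, 2), round(extinguished, 2), round(blocked, 2))
# fear fades with extinction trials only when LVGCCs are functional
```

Running the model, the conditioned animal ends with high fear; extinction trials erase it unless the LVGCC pathway is blocked, in which case the fear value never moves, mirroring the reported dissociation.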
If blocking LVGCC will prevent one from overcoming fear then would a design of LVGCC that opens at a lower threshold of stimulation result in a personality that overcomes fear more easily? Will it become possible to genetically engineer personality types that are more fearless? Will people choose such personality types for their children when it becomes possible to do so?