As humans face increasing distractions in their personal and professional lives, University of British Columbia researchers have discovered that people can gain greater control over their thoughts with real-time brain feedback.
The study is the world's first investigation of how real-time functional Magnetic Resonance Imaging (fMRI) feedback from the brain region responsible for higher-order thoughts, including introspection, affects our ability to control these thoughts. The researchers find that real-time brain feedback significantly improves people's ability to control their thoughts and effectively 'train their brains.'
This result would be a lot more interesting if MR scanners did not cost $1 million to $3 million. Supercooled superconducting coils make for an expensive machine. What's a cheaper way to do this? What are the prospects for room temperature superconductors?
Train your brain to make your rostrolateral prefrontal cortex (RLPFC) think more higher-order thoughts. Any way to do this without an fMRI machine?
For the study, published in the current issue of the journal NeuroImage, participants performed tasks that either raised or lowered mental introspection in 30-second intervals over four six-minute sessions. fMRI technology tracked real-time activity in the rostrolateral prefrontal cortex (RLPFC), the region of the brain involved with higher-order thoughts.
Participants with access to real-time fMRI feedback could see their RLPFC activity increase during introspection and decrease during non-introspective thoughts, such as mental tasks that focused on body sensations. These participants used the feedback to guide their thoughts, which significantly improved their ability to control their thoughts and successfully perform the mental tasks. In contrast, participants given inaccurate or no brain feedback did not achieve any improvement in brain regulation.
What would also be helpful: A graphical display above one's desk that shows how much people coming in to ask questions are cutting into mental output. Recent mental productivity should be shown going back an hour or so with a live feed. Then a pair of bars on a bar graph could show how much mental work one has done so far and how much one could have done without interrupts.
What sounds like science fiction is actually possible: thanks to magnetic stimulation, the activity of certain brain nerve cells can be deliberately influenced. What happens in the brain in this context has been unclear up to now. Medical experts from Bochum under the leadership of Prof. Dr. Klaus Funke (Department of Neurophysiology) have now shown that various stimulus patterns changed the activity of distinct neuronal cell types. In addition, certain stimulus patterns led to rats learning more easily. The knowledge obtained could contribute to cerebral stimulation being used more purposefully in future to treat functional disorders of the brain. The researchers have published their studies in the Journal of Neuroscience and in the European Journal of Neuroscience.
Why overstimulate your brain with amphetamine when you can use magnets instead?
Intermittent theta burst stimulation (iTBS), that's the ticket.
Since the mid-1990s, repetitive TMS has been used to make purposeful changes to the excitability of nerve cells in the human cortex: "In general, the activity of the cells drops as a result of a low-frequency stimulation, i.e. with one magnetic pulse per second. At higher frequencies from five to 50 pulses per second, the activity of the cells increases", explained Prof. Funke. Above all, the researchers are specifically interested in the effects of specific stimulus patterns like the so-called theta burst stimulation (TBS), in which 50 Hz bursts are repeated at 5 Hz. "This rhythm is based on the natural theta rhythm of four to seven Hertz which can be observed in an EEG", says Funke. The effect depends above all on whether such stimulus patterns are delivered continuously (cTBS, attenuating effect) or with interruptions (intermittent, iTBS, strengthening effect).
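The timing arithmetic behind cTBS and iTBS can be sketched in code. A minimal sketch, assuming the TBS parameters most commonly reported in the literature (3-pulse bursts at 50 Hz, burst onsets at 5 Hz, 600 pulses total, and 2-second iTBS trains separated by 8-second pauses); none of these specific numbers come from the Bochum study itself:

```python
def tbs_pulse_times(total_pulses=600, intermittent=True,
                    burst_hz=50.0, theta_hz=5.0,
                    bursts_per_train=10, pause_s=8.0):
    """Return pulse timestamps (seconds) for a theta burst protocol."""
    intra = 1.0 / burst_hz   # 20 ms between the 3 pulses of a burst
    inter = 1.0 / theta_hz   # 200 ms between burst onsets (5 Hz)
    times, burst, offset = [], 0, 0.0
    while len(times) < total_pulses:
        t0 = offset + burst * inter
        for k in range(3):                     # one burst = 3 pulses
            if len(times) < total_pulses:
                times.append(t0 + k * intra)
        burst += 1
        if intermittent and burst % bursts_per_train == 0:
            offset += pause_s                  # 8 s gap after each 2 s train
    return times

itbs = tbs_pulse_times(intermittent=True)
ctbs = tbs_pulse_times(intermittent=False)
print(f"iTBS: 600 pulses over {itbs[-1]:.2f} s")   # ~191.84 s
print(f"cTBS: 600 pulses over {ctbs[-1]:.2f} s")   # ~39.84 s
```

The same 600 pulses take roughly 40 seconds delivered continuously but over three minutes delivered intermittently, which is the only structural difference between the attenuating and strengthening protocols.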
The iTBS enhanced rat learning. I'm picturing iTBS labs in colleges that are booked 24x7 leading up to finals.
The study, published in the December issue of the APA journal Experimental and Clinical Psychopharmacology, included 80 college students (34 men and 46 women) between the ages of 18 and 40. Some were given Red Bull, while others were given lower amounts of caffeine added to Squirt, a lemon-flavored decaffeinated soda that looks and tastes like Red Bull. Others were given plain Squirt as a placebo. A half-hour after finishing the drinks, participants took a computerized "go/no-go" test in which they had to respond quickly to targets on a screen. They were instructed to hit the forward slash key when a green target appeared and do nothing when a blue target appeared.
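Scoring a go/no-go task like this one is straightforward to sketch. The trial representation and the example numbers below are hypothetical illustrations, not data from the study:

```python
def score_go_no_go(trials):
    """trials: list of (color, responded) pairs.
    Returns (hit rate on green 'go' trials,
             false-alarm rate on blue 'no-go' trials)."""
    go = [responded for color, responded in trials if color == "green"]
    nogo = [responded for color, responded in trials if color == "blue"]
    hit_rate = sum(go) / len(go) if go else 0.0
    false_alarm_rate = sum(nogo) / len(nogo) if nogo else 0.0
    return hit_rate, false_alarm_rate

# hypothetical session: 4 go trials (3 responses) and 2 no-go trials (1 lapse)
trials = [("green", True), ("green", True), ("green", False),
          ("green", True), ("blue", False), ("blue", True)]
hits, false_alarms = score_go_no_go(trials)
print(hits, false_alarms)  # 0.75 0.5
```

Responding to a blue target counts as a false alarm (a failure of inhibition), which is the measure such tasks use to separate speed from impulse control.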
Participants were also asked how stimulated and mentally fatigued they felt after the drinks. The students who were given Red Bull reported feeling more stimulated and less tired than the other participants, but their response rates were slower.
This reminds me of my old post: Scientists Demonstrate Best Way To Use Caffeine
Avoid over-revving the brain or body with drugs. You might feel like a Master Of The Universe. But you run the risk of putting in the performance of a cheesy B-movie actor.
Berkeley — Research on a drug commonly prescribed to Alzheimer's disease patients is helping neuroscientists at the University of California, Berkeley, better understand perceptual learning in healthy adults.
In a new study, to be published online Thursday, Sept. 16, in the journal Current Biology, researchers from UC Berkeley's Helen Wills Neuroscience Institute and School of Optometry found that study participants showed significantly greater benefits from practice on a task that involved discriminating directions of motion after they took donepezil, sold under the brand name Aricept, compared with a placebo.
The study examined the effects of the drug on a task of detecting whether successive sets of moving dots were moving in the same direction. The results prove nothing about the utility of taking this drug to study science, law, or history. The researchers want to study the effects of this drug on other forms of learning.
Will the development of new drugs for Alzheimer's, dementia, and other neurological disorders also produce more drugs that boost intellectual ability? Seems likely.
The study was double blind.
Neither the researchers nor the participants knew whether they were taking the placebo or donepezil, a cholinesterase inhibitor that enhances the effects of the neurotransmitter acetylcholine in the brain. Cholinesterase inhibitors act by blocking an enzyme that breaks down acetylcholine. Acetylcholine is known to play an important role in mediating visual attention and, in animal studies, has been found to promote changes in the brain that are associated with learning.
Have you used drugs to enhance your ability to learn? If so, which drugs and which ones helped? Still using drugs to either enhance learning or to enhance performance of tasks you already know how to do? Anything work better than caffeine?
Ritalin isn't just for improving concentration. Ritalin tweaks a receptor in the amygdala in a way that boosts learning.
Doctors treat millions of children with Ritalin every year to improve their ability to focus on tasks, but scientists now report that Ritalin also directly enhances the speed of learning.
In animal research, the scientists showed for the first time that Ritalin boosts both of these cognitive abilities by increasing the activity of the neurotransmitter dopamine deep inside the brain. Neurotransmitters are the chemical messengers neurons use to communicate with each other. They release the molecule, which then docks onto receptors of other neurons. The research demonstrated that one type of dopamine receptor aids the ability to focus, and another type improves the learning itself.
The scientists also established that Ritalin produces these effects by enhancing brain plasticity – strengthening communication between neurons where they meet at the synapse. Research in this field has accelerated as scientists have recognized that our brains can continue to form new connections – remain plastic – throughout life.
"Since we now know that Ritalin improves behavior through two specific types of neurotransmitter receptors, the finding could help in the development of better targeted drugs, with fewer side effects, to increase focus and learning," said Antonello Bonci, MD, principal investigator at the Ernest Gallo Clinic and Research Center and professor of neurology at UCSF. The Gallo Center is affiliated with the UCSF Department of Neurology.
Anyone out there using Ritalin to speed your learning?
Or do you use any other drugs to speed learning? If so, which one?
When drugs, gene therapies, and cell therapies become available that will allow you to change your personality will you opt to do so? Most people in a recent study indicated they don't want to change cognitive traits that they think are fundamental to their identity. So which traits do you think are fundamental to your identity? Would you like to change any of them?
Healthy people are more willing to take drugs to enhance traits that are not fundamental to their identity.
According to a new study in the Journal of Consumer Research, people's willingness to take a pill or drug depends on whether the trait the drug promises to enhance is one they consider fundamental.
Authors Jason Riis (NYU, Harvard Business School), Joseph P. Simmons (Yale University), and Geoffrey P. Goodwin (Princeton University) examine the moral dilemmas that arise as technologies develop that not only cure disease but also enhance already-healthy people. As many young people without diagnosed disorders or deficits take Ritalin or Adderall to improve concentration or anti-depressants to lift their moods, this study examines what makes healthy people willing to take pills.
People do not see increasing their ability to concentrate as something that would alter their identity.
The researchers determined that people do not feel comfortable using a pill to enhance a trait they believe to be fundamental to their identity. But less-fundamental traits, including concentration, are more acceptable targets.
"We suggest that people's willingness to take psychological enhancements will largely depend on beliefs about whether those enhancements will alter characteristics considered fundamental to self-identity," the authors write.
During a series of studies, the researchers found that young people were less likely to agree to take a drug to increase their social comfort than one that increased their ability to concentrate. The most common reason participants said they wouldn't want to take a pill was because it would "fundamentally change who I am."
But with proper marketing it is possible to sell people on more kinds of changes to who they are.
Not surprisingly, the marketing message affected participants' responses. When the researchers tested different advertising taglines, they found that participants responded more positively to a drug promising to help them become "more than who you are," than one that would allow them to become "who you are."
"Together, this research converges to highlight the importance of identity expression and preservation in governing the choices and lives of consumers," write the authors.
Will introverts opt to become extroverts? I think it more likely introverts will decide to become extroverts than vice versa. What do you think?
Imagine you could make yourself more likely or less likely to become angry when you see something that you think is morally wrong. Would you tune your emotional response? If so, in which direction?
If public nudity makes you deeply embarrassed would you like to alter your brain so that you do not feel any embarrassment or shame when nude in front of others? Or would you like to suppress the embarrassment response in any other circumstances?
Would you be willing to use biotechnology such as neural stem cells to make you more relaxed or more confident or change something else about your mental state?
Here is one of the most fundamental traits: Do you find yourself having moral reactions that you intellectually disagree with on some level? Would you like to alter what you find morally wrong or morally acceptable?
Some people are tempted to use Ritalin (methylphenidate) in order to boost their cognitive performance. But will it burn out your brain the way methamphetamine can? At low doses, Ritalin does not appear to work as a stimulant.
MADISON - Stimulant medications such as Ritalin have been prescribed for decades to treat attention deficit hyperactivity disorder (ADHD), and their popularity as "cognition enhancers" has recently surged among the healthy, as well.
What's now starting to catch up is knowledge of what these drugs actually do in the brain. In a paper publishing online this week in Biological Psychiatry, University of Wisconsin-Madison psychology researchers David Devilbiss and Craig Berridge report that Ritalin fine-tunes the functioning of neurons in the prefrontal cortex (PFC) - a brain region involved in attention, decision-making and impulse control - while having few effects outside it.
Because of the potential for addiction and abuse, controversy has swirled for years around the use of stimulants to treat ADHD, especially in children. By helping pinpoint Ritalin's action in the brain, the study should give drug developers a better road map to follow as they search for safer alternatives.
At the same time, the results support the idea that today's ADHD drugs may be safer than people think, says Berridge. Mounting behavioral and neurochemical evidence suggests that clinically relevant doses of Ritalin primarily target the PFC, without affecting brain centers linked to over-arousal and addiction. In other words, Ritalin at low doses doesn't appear to act like a stimulant at all.
Emphasis on the "at low doses". At higher doses the picture is different.
Ritalin at lower doses appears to cause the prefrontal cortex (PFC) to be more sensitive to signals coming in from the hippocampus.
When they listened to individual PFC neurons, the scientists found that while cognition-enhancing doses of Ritalin had little effect on spontaneous activity, the neurons' sensitivity to signals coming from the hippocampus increased dramatically. Under higher, stimulatory doses, on the other hand, PFC neurons stopped responding to incoming information. "This suggests that the therapeutic effects of Ritalin likely stem from this fine-tuning of PFC sensitivity," says Berridge. "You're improving the ability of these neurons to respond to behaviorally relevant signals, and that translates into better cognition, attention and working memory." Higher doses associated with drug abuse and cognitive impairment, in contrast, impair functioning of the PFC.
Ritalin may work by reducing the number of things the mind pays attention to. This lets you more productively do mental processing on what actually gets the mind's focus.
More intriguing still were the results that came from tuning into the entire chorus of neurons at once. When groups of neurons were already "singing" together strongly, Ritalin reinforced this coordinated activity. At the same time, the drug weakened activity that wasn't well coordinated to begin with. All of this suggests that Ritalin strengthens dominant and important signals within the PFC, while lessening weaker signals that may act as distractors, says Berridge.
But be careful. Ritalin has side effects.
The science journal Nature asked its readers to take an online survey of cognitive-enhancing drug use. 1400 responded and 20% reported using drugs for brain enhancement, with methylphenidate (Ritalin) the most popular, followed by modafinil (Provigil, used to reduce sleepiness).
For those who choose to use, methylphenidate was the most popular: 62% of users reported taking it. 44% reported taking modafinil, and 15% said they had taken beta blockers such as propranolol, revealing an overlap between drugs. 80 respondents specified other drugs that they were taking. The most common of these was Adderall, an amphetamine similar to methylphenidate. But there were also reports of centrophenoxine, piracetam, Dexedrine and various alternative medicines such as ginkgo and omega-3 fatty acids.
The most popular reason for taking the drugs was to improve concentration. Improving focus for a specific task (admittedly difficult to distinguish from concentration) ranked a close second and counteracting jet lag ranked fourth, behind 'other' which received a few interesting reasons, such as “party”, “house cleaning” and “to actually see if there was any validity to the afore-mentioned article”.
Propranolol (sold as Inderal) is a beta blocker which suppresses fight-or-flight stress reactions. Some musicians use beta blockers for performances, though the drugs' primary use is to lower high blood pressure. Propranolol is also used in lower doses against anxiety.
The willingness of scientific researchers to use currently available drugs as cognitive enhancers suggests that these drugs might really work to improve mental performance. It also shows that people who do competitive, intellectually difficult work look for ways to get an edge.
Ritalin for faster computer chip design, less buggy software development, and more optimized mechanical designs? Ritalin for brainstorming marketing strategies? Anyone tried it for intellectual work?
Fifty per cent of people told to breathe normally or through their mouths yawned while watching other people yawn, while none of those told to breathe through their noses yawned. The researchers also found that subjects who held a cold pack to their forehead did not catch yawns from the film, while those who held a warm or room-temperature pack yawned normally (Evolutionary Psychology, vol 5, p 92).
Breathing through your nose cools blood flowing to the brain. Would breathing in through the nose and out through the mouth do a better job of cooling the brain by venting the warmer air from the lungs away from the nose and brain?
Yawning supposedly increases group vigilance. Our hunter ancestors either on stake-out or perhaps manning defense perimeters around camps might have needed to yawn in unison to keep their minds alert to prey or predators. Cooling overheated brains increases their efficiency and hence the yawns.
ALBANY, N.Y. (June 29, 2007) -- The next time you "catch a yawn" from someone across the room, you're not copying their sleepiness, you're participating in an ancient, hardwired ritual that might have evolved to help groups stay alert as a means of detecting danger. That's the conclusion of University at Albany researchers Andrew C. Gallup and Gordon G. Gallup Jr. in a study outlined in the May 2007 issue of Evolutionary Psychology (Volume 5.1, 2007).
The psychologists, who studied yawning in college students, concluded that people do not yawn because they need oxygen, since experiments show that raising or lowering oxygen and carbon dioxide in the blood fails to produce the reaction. Rather, yawning acts as a brain-cooling mechanism. The brain burns up to a third of the calories we consume, and as a consequence generates heat. According to Gallup and Gallup, our brains, not unlike computers, operate more efficiently when cool, and yawning enhances the brain's functioning by increasing blood flow and drawing in cooler air.
Has the need to dissipate heat slowed the evolution of human intelligence just as heat dissipation problems are slowing the rate of advance of modern computer microprocessor speeds?
These results suggest easy practical ways to boost mental performance: breathe through your nose to cool the blood flowing to your brain, or press a cool pack against your forehead.
Stanford University neuroscientists have designed a gene that enhances memory and learning ability in animals under stress. Writing in the Nov. 8 issue of the Journal of Neuroscience, the Stanford team says that the experimental technique might one day lead to new forms of gene therapy that can reduce the severe neurological side effects of steroids, which are prescribed to millions of patients with arthritis, asthma and other illnesses.
"Steroids can mess up the part of the brain involved in judgment and cognition," said neuroendocrinologist Robert Sapolsky, co-author of the study. "In extreme cases it's called steroid dementia. Ideally, if you could deliver this gene safely, it would protect the person from some of these cognitive side effects, while allowing the steroid to do whatever helpful thing it should be doing elsewhere in the body."
The gene therapy combines two receptors into a single gene and a single gene product.
For the experiment, Sapolsky and his team created what geneticists call a chimera--an experimental strand of DNA made with two genes stitched together, in this case a glucocorticoid-receptor gene from a rat combined with an estrogen-receptor gene from a human.
When this new chimeric gene was injected into the hippocampus of a rat, the result was dramatic. The gene produced new protein receptors that quickly converted stress-inducing glucocorticoids into beneficial estrogen signals.
The gene therapy was injected into the hippocampus region of the brain in male rats.
Once injected, individual copies of the virus penetrate the hippocampal neurons, thereby delivering the chimeric gene and activating it in the rat's brain. The new gene then transforms harmful corticoids into helpful estrogens--a process that should hypothetically block the animal's negative behavioral response to stress.
To make sure that natural estrogen wasn't a factor, the experiment was restricted to male rats only. Every rat was trained to find a hidden platform in a water maze; the platform was later removed for testing. To raise corticoid levels in the animal's bloodstream, the rats were subjected to a variety of stresses, such as immobilization or cold temperature, then released into the water, where observers counted how quickly and how often they swam to the area above the missing platform.
Stress tests were conducted before the animal received training, immediately after training and 24 hours later. "This taps into three different domains and three different timings--the effects of stress on learning, on storing learned information as memory and on retrieving that memory," Sapolsky explained. The results were clear: When stress was applied 24 hours after training, the rats infected with the chimeric gene swam to the area of the missing platform faster, and spent significantly more time looking for it, than the normal rats did.
"These results are pretty fantastic, " Nicholas said. "They suggest that this gene therapy not only blocks the deleterious effects of glucocorticoids but actually enhances spatial memory and learning through estrogen-controlled events, even in the presence of stress. Seeing this enhancement effect was pretty exciting. It's the best we could have hoped for."
What I wonder: Does the estrogen created from the glucocorticoids increase female behavior in the male rats?
Also, suppose the hippocampus was genetically engineered to simply have fewer glucocorticoid receptors. That might make it less susceptible to damage from stress. But would some advantage be lost?
We need better methods for consciously controlling stress responses. In modern environments stress responses often serve no useful function: the cues that trigger them are rarely associated with the kinds of natural dangers that stress responses were designed to handle. So the body's response to stress has become quite maladaptive in industrial societies. We aren't evolved to handle modern society, and our reactions to it are too often problematic.
We need genetic engineering to adapt our bodies to our technologies and the environments we have created with our technologies.
Jan Born, a neuroscientist at the University of Lübeck in Germany who led the research, said the electrical current, applied via electrodes stuck to the scalp, seemed to enhance a part of the sleep cycle linked to consolidating word memory. Dr Born had 13 medical students learn a list of words and tested how many they remembered after a set time. He had them repeat the exercise after a nap.
The results, published today in Nature, show that without electrical current the volunteers remembered, on average, 37.42 words before sleep and 39.5 words when they woke. It confirmed research that sleep is important for consolidating learned information. After electrical stimulation the number of words volunteers remembered rose to 41.27 after sleep.
Born thinks that the current enhanced activity in the hippocampus, which is key to new memory formation.
The students’ various sleep stages were monitored using an electroencephalogram (EEG) machine. When the students entered a period of light sleep, Born’s team started to apply a gentle current in one-second-long pulses, every second, for about 30 minutes. The EEG readings revealed that this current had put students into a deeper state of sleep.
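The closed-loop logic just described, watch the EEG for a sleep-stage transition and then stimulate for a fixed window, can be sketched as a toy simulation. The per-second time base and stage labels are simplifying assumptions, not details from Born's setup:

```python
def stimulation_schedule(stages, duration_s=1800):
    """stages: list of per-second sleep-stage labels from an EEG monitor.
    Returns the seconds at which a one-second pulse would be delivered:
    stimulation begins when light sleep is first detected and continues
    for duration_s seconds (about 30 minutes in the study)."""
    pulses = []
    started = None
    for t, stage in enumerate(stages):
        if started is None and stage == "light_sleep":
            started = t                     # entering light sleep starts the block
        if started is not None and t - started < duration_s:
            pulses.append(t)
    return pulses

# hypothetical recording: 5 s awake, then 20 s of light sleep
stages = ["awake"] * 5 + ["light_sleep"] * 20
print(stimulation_schedule(stages, duration_s=10))  # pulses at seconds 5-14
```

The point of the EEG gating is that the current is applied only once the target sleep stage begins, rather than on a fixed clock.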
The electric current was so small the students could not feel it.
What I'd like to do: turn down memory formation after boring days but turn up memory formation on nights after intense learning and complex problem solving.
Also, imagine taking naps throughout the day and evening when you are in an intense learning phase. The naps, enhanced by electric currents to accelerate memory formation, might allow you to learn more information per day by giving the hippocampus newly learned material to process several times a day.
Rejuvenated neural stem cells injected into the brain will probably help memory formation once cell manipulation biotechnologies become advanced enough to produce such stem cells.
At the request of readers I've been out looking for information about whether methylphenidate (Ritalin) and dextroamphetamine (Dexedrine; Adderall) boost IQ and SAT scores. Haven't come up with anything quantitative yet. But in the process of looking I came across some interesting reports on biofeedback treatments for ADD (attention deficit disorder)/attention deficit hyperactive disorder (ADHD). A recent Stanford symposium, entitled "Brainwave Entrainment to External Rhythmic Stimuli: Interdisciplinary Research and Clinical Perspectives", surveyed methods of using rhythmic stimuli as cognitive therapy. Maybe Janet Jackson has been delivering cognitive therapy.
Harold Russell, a clinical psychologist and adjunct research professor in the Department of Gerontology and Health Promotion at the University of Texas Medical Branch at Galveston, used rhythmic light and sound stimulation to treat ADD (attention deficit disorder) in elementary and middle school boys. His studies found that rhythmic stimuli that sped up brainwaves in subjects increased concentration in ways similar to ADD medications such as Ritalin and Adderall. Following a series of 20-minute treatment sessions administered over several months, the children made lasting gains in concentration and performance on IQ tests and had a notable reduction in behavioral problems compared to the control group, Russell said.
But the article does not quantify these gains.
The frequency of the delivered light and sound is controlled using biofeedback to measure brain activity.
"For most of us, the brain is locked into a particular level of functioning," the psychologist said. "If we ultimately speed up or slow down the brainwave activity, then it becomes much easier for the brain to shift its speed as needed."
Russell, whose study was funded by the U.S. Department of Education and included 40 experimental subjects, hopes to earn approval from the Food and Drug Administration to use the brainwave entrainment device as a treatment for ADD. The device uses an EEG to read brainwaves and then presents rhythmic light and sound stimuli through special eyeglasses and headphones at a slightly higher frequency than the brain's natural rhythm.
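The device's core idea, read the brain's dominant rhythm from the EEG and then stimulate slightly above it, can be sketched as follows. The brute-force frequency scan and the 1 Hz offset are illustrative assumptions, not the actual design of Russell's device:

```python
import math

def dominant_frequency(samples, fs, f_lo=1.0, f_hi=30.0):
    """Crude DFT scan in 0.5 Hz steps: return the frequency (Hz)
    with the most power in the EEG sample."""
    best_f, best_p = f_lo, 0.0
    f = f_lo
    while f <= f_hi:
        re = sum(s * math.cos(2 * math.pi * f * k / fs)
                 for k, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * f * k / fs)
                 for k, s in enumerate(samples))
        p = re * re + im * im
        if p > best_p:
            best_f, best_p = f, p
        f += 0.5
    return best_f

fs = 256  # Hz, a typical EEG sampling rate
# synthetic "EEG": 2 seconds of a pure 10 Hz alpha-like rhythm
eeg = [math.sin(2 * math.pi * 10 * k / fs) for k in range(fs * 2)]
natural = dominant_frequency(eeg, fs)
stimulus = natural + 1.0  # drive light/sound slightly above the natural rhythm
print(natural, stimulus)  # 10.0 11.0
```

A real device would use proper spectral estimation and update the stimulus frequency continuously, but the feedback loop it closes is the one shown here: measure, then lead the rhythm by a small margin.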
Thomas Budzynski, an affiliate professor of psychology at the University of Washington, conducted similar experiments with a small group of underachieving college students at Western Washington University. He found that rhythmic light and sound therapy helped students achieve a significant improvement in their grades.
Again, the article does not quantify the gains. Still, interesting.
Budzynski also found that rhythmic therapy could improve cognitive functioning in some elderly people by increasing blood flow throughout the brain. "The brain tends to groove on novel stimuli," Budzynski explained. "When a novel stimulus is applied to the brain, the brain lights up and cerebral blood flow increases." To maintain the high blood flow, Budzynski used a random alternation of rhythmic lights and sounds to stimulate the brains of elderly people. The result: Many of the seniors improved performance on an array of cognitive tests.
Wouldn't you like to try some of these methods of biofeedback to see if you could enhance your own cognitive function?
Jacques Duff, the Australian president-elect of the International Society of Neuronal Regulation, runs a centre in Melbourne that has treated more than 1000 people. He believes the treatment is so effective the need for medication can sometimes be eradicated.
"In the case of ADHD, within 20 sessions the effect is similar to Ritalin, with the effects being permanent," Duff says.
Biofeedback-based therapy strikes me as probably lower risk than drugs.
The IQ score boost of ADHD kids is probably much higher than would be seen with people who do not have ADHD. Also, the extent of the benefit might be different depending on whether one's problem is more distractibility versus hyperactivity.
As the technique works on strengthening brainwaves, just about anyone can benefit from it, with students and athletes attending clinics. However, Duff warns that budding Einsteins will be disappointed.
"There is an optimum set-point for the brain. You can't, for example, keep making someone smarter. On average though, an IQ increase of 15 points is seen in children with ADHD and learning difficulties."
The use of biofeedback for ADD/ADHD is nothing new. You can find lots of articles on it going back decades if you search on it.
Results: BASC Monitor and TOVA scores indicated similar significant improvements in both groups. No significant difference in treatment change was seen in between-group comparisons. Parents' subjective appraisal of treatment effect on ADHD was more positive for the videogame group. The videogame treatment was rated significantly more enjoyable by both parents and children. Trends on pre-post QEEG change maps indicated that the videogame training may have advantages in creating more quantitative EEG effect in the therapeutic direction.
Conclusions: We conclude that the videogame biofeedback technology, as implemented in the NASA prototype tested, produced equivalent results to standard neurofeedback in effects on ADHD symptoms. Both the videogame and standard neurofeedback improved the functioning of children with ADHD substantially above the benefits of medication. The videogame technology provided advantages over standard neurofeedback treatment in terms of enjoyability for the children and positive parent perception, and possibly has stronger quantitative post-treatment effects on EEG.
I'd love to see large scale controlled tests of the cognitive performance effects of ADHD drugs with SAT, IQ, and other tests delivered before and after administration of drugs to people with and without ADHD and to people with a wide range of IQ levels.
Also see my post ADHD Drugs In Vogue For Boosting College Test Scores.
BETHESDA, MD. (May 30, 2006) – Methylphenidate (Ritalin) elevates norepinephrine levels in the brains of rats to help focus attention while suppressing nerve signal transmissions in the sensory pathways to make it easier to block out extraneous stimuli, a Philadelphia research team has found.
Their report in the Journal of Neurophysiology helps explain how a stimulant aids people with attention deficit and hyperactivity disorders to improve their focus without increasing their motor activity. Methylphenidate, prescribed under the brand name Ritalin, has been used for more than 20 years, mostly in children, to treat attention deficit hyperactivity disorder (ADHD) and attention deficit disorder (ADD). The drug can also help people who don't suffer either disorder to attend better to a cognitive task.
Despite its wide use, little is known about how the drug, a chemical cousin of amphetamines, produces its therapeutic effects. Researchers want to unlock the mystery of why the drug has the paradoxical effect of decreasing hyperactive behavior and increasing the ability to focus, even though it is a stimulant, said Barry Waterhouse, the study's senior author.
"We're developing a series of behavioral and electrophysiological assays for examining the actions of drugs like methylphenidate," Waterhouse said. "If we can show exactly how methylphenidate works, we may be able to produce even more effective drugs and provide a better understanding of the physiology underlying ADHD."
The study, using rats, is the first to document the increase in norepinephrine and suppression of the neuronal response in this sensory pathway of the brain. "Methylphenidate enhances noradrenergic transmission and suppresses mid- and long-latency sensory responses in the primary somatosensory cortex of awake rats," by Philadelphia-based researchers Candice Drouin, University of Pennsylvania; Michelle Page, Thomas Jefferson University; and Barry Waterhouse, Drexel University College of Medicine, appears online in the Journal of Neurophysiology, published by The American Physiological Society.
Can Ritalin boost the mental performance of the average person? It is increasingly popular with college students. Should brain workers be taking it in order to boost their productivity?
The development of better drugs to increase the ability to focus will eventually result in strong market pressures for their use. Workers who do not take such drugs will find it increasingly difficult to compete with those who embrace cognition-enhancing drugs.
"The larger effect sizes we calculated for stimulant ADHD medications, compared to nonstimulants or the novel stimulant modafinil, lead us to conclude that amphetamine and methylphenidate based stimulant medications are more effective in treating symptoms of ADHD," said Stephen V. Faraone, Ph.D., lead researcher and director of child and adolescent psychiatry at SUNY Upstate Medical University. "Our results should help physicians who have had to rely on qualitative comparisons among published trials, along with their own clinical experience, to draw conclusions about an ADHD medication's relative efficacy because of largely absent direct head-to-head drug comparisons."
The researchers compared study outcomes using effect sizes, a commonly used, standard statistical measure to determine the magnitude of a particular effect resulting from an intervention, such as a drug used on a population, irrespective of the population size. Effect sizes are generally categorized as small (0.2), medium (0.5) and large (0.8). Standardized mean averages, or effect sizes, for dependent measures in each study were computed by taking the mean of the active drug group minus the mean of the placebo group and dividing the result by the pooled standard deviation of the groups.
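The effect size computation described above can be sketched in a few lines of Python. The group means, standard deviations, and sample sizes below are made-up numbers chosen purely for illustration, not figures from the Faraone analysis:

```python
# Standardized effect size (Cohen's d), as described above:
# (mean of drug group - mean of placebo group) / pooled standard deviation.
import math

def pooled_sd(sd1, n1, sd2, n2):
    """Pooled standard deviation of two groups."""
    return math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))

def effect_size(mean_drug, sd_drug, n_drug, mean_placebo, sd_placebo, n_placebo):
    """Cohen's d: difference in group means divided by pooled SD."""
    return (mean_drug - mean_placebo) / pooled_sd(sd_drug, n_drug, sd_placebo, n_placebo)

# Hypothetical symptom-score improvement: drug group mean 14, placebo mean 8,
# both groups with SD 7 and 50 subjects each.
d = effect_size(14.0, 7.0, 50, 8.0, 7.0, 50)
print(round(d, 2))  # ~0.86, a "large" effect by the 0.2/0.5/0.8 convention
```

Note that the result is unitless, which is what lets the researchers compare outcomes across studies that used different rating scales.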
After adjusting for the influence of individual study design features, the researchers calculated effect size based on Total ADHD scores. Long-acting and short-acting stimulant medications showed the largest effect sizes among all medications (E = 0.83 and E = 0.9 respectively), followed by nonstimulant and modafinil-based medications (E = 0.62). Statistically significant differences in effect size occurred in comparisons between nonstimulant/modafinil-based medications and long-acting (p = 0.004) as well as short-acting stimulants (p = 0.002).
For the analysis, Faraone and his colleagues used data from 29 double-blind, placebo-controlled treatment studies of 4,465 children with ADHD, with an average age of 10 years, published during or after 1980. Designs for all of the studies were randomized, double-blind with placebo controls that lasted for two or more weeks in populations diagnosed with ADHD as defined using criteria from the American Psychiatric Association's Diagnostic and Statistical Manual of Mental Disorders, Revised Third Edition or Fourth Edition (DSM).
My guess: ADHD is a product of natural selection. Tendency to totally focus on a single subject was probably maladaptive for much of human history.
Conscious analysis of problems works best for simpler problems. But for more complex problems it may be best to absorb the facts and then distract your conscious mind while giving the subconscious time to work out the best choice.
One group was given four minutes to pick a favourite car from a list after weighing up four attributes, including fuel consumption and legroom.
The other group was given a series of puzzles to keep their conscious selves busy before making a decision.
The conscious thought group managed to pick the best car based on four aspects around 55% of the time, while the unconscious thought group only chose the right one 40% of the time.
But when the experiment was made more complex by bringing in 12 attributes to weigh up, the conscious thought group's success rate fell to around 23% as opposed to nearly 60% for the unconscious thought group.
Instead, the scientists conclude, the best strategy is to gather all of the relevant information -- such as the price, the number of bathrooms, the age of the roof -- and then put the decision out of mind for a while.
Then, when the time comes to decide, go with what feels right. ''It is much better to follow your gut," said Ap Dijksterhuis, a professor of psychology at the University of Amsterdam, who led the research.
For relatively simple decisions, he said, it is better to use the rational approach. But the conscious mind can consider only a few facts at a time. And so with complex decisions, he said, the unconscious appears to do a better job of weighing the factors and arriving at a sound conclusion.
Dijksterhuis and his team also propose that, although we are unaware of it, our brains are churning through the mass of information involved in a complex decision and sifting out the best option.
The study ties in with a growing trend in psychology research over the past 15 years, suggesting that our unconscious mind is more important than we once thought. "A lot of complicated processes occur without our being aware of it," says Daniel Kahneman, an authority on decision making at Princeton University, New Jersey.
I wonder whether people with higher levels of intelligence have higher thresholds of complexity of problems before it makes sense to let their subconscious handle a problem.
Prof Dijksterhuis said: "Your brain is capable of juggling lots of facts and possibilities at the same time when you let it work without specifically thinking about the decision. But when you are specifically thinking about a problem, your brain isn't able to weigh up as much information. I sit on things and rely on my gut."
Whether the conscious mind does best will also depend on the nature of the problem. For example, I doubt the subconscious can compete with the conscious mind when a problem requires mathematical analysis.
Work on memory enhancers may be furthest along. Eric R. Kandel of Columbia University, who won a Nobel Prize for his research on learning and memory in the sea slug Aplysia, is one proponent. He found that learning occurs at the synapse (the junction between two neurons) by several means. The synapse is enhanced when a protein called CREB is activated, and CREB plays a role in memory formation in fruit flies and in mice. With these discoveries came the 1998 birth of Memory Pharmaceuticals, Kandel's Montvale, N.J.-based company, which hopes to formulate a drug that will raise the amount of CREB in the human neural system and thus facilitate the formation of long-term memories. One of the most promising chemicals is called MEM 1414. If clinical trials go well, MEM 1414 could be on the market after 2008. At least one other company, Helicon Therapeutics in Farmingdale, N.Y., is also investigating CREB to improve human memory formation.
Alternative drugs are also in the works based on other brain mechanisms. Before a neuron naturally increases CREB, certain channels on its membrane must open to allow positive ions to flow into the cell. The ions then trigger a cascade of events leading to the activation of CREB. One channel of interest is known as NMDA. In 1999 Joseph Z. Tsien, Ya-Ping Tang and their colleagues, then at Princeton University, discovered that increasing the number of NMDA receptors in the mouse hippocampus led to better performance on a spatial-memory task. Now researchers and pharmaceutical companies are pursuing NMDA receptor agonists (they combine with the receptors) as nootropes. At least a dozen new drugs of this kind are making their way toward clinical trials.
Some FuturePundit readers have requested I post more on currently available cognitive enhancement methods. Well, Gazzaniga points to one way to boost learning:
Self-medicating with Starbucks is one thing. But consider the following. In July 2002 Jerome Yesavage and his colleagues at Stanford University discovered that donepezil, a drug approved by the FDA to slow the memory loss of Alzheimer's patients, improves the memory of the normal population. The researchers trained pilots in a flight simulator to perform specific maneuvers and to respond to emergencies that developed during their mock flight, after giving half the pilots donepezil and half a placebo. One month later they retested the pilots and found that those who had taken the donepezil remembered their training better, as shown by improved performance. The possibility exists that donepezil could become a Ritalin for college students. I believe nothing can stop this trend, either.
Donepezil is marketed by Pfizer as Aricept. Note that doctors in the United States can prescribe drugs for purposes other than their original FDA approved purpose. Therefore donepezil can be had by anyone in America who can find a cooperative doctor willing to write an Aricept prescription. In some Third World countries you just have to show up at a pharmacy and wave some money in front of the pharmacist and Aricept can be acquired without the cooperation of the doctor. Note that I'm not advocating this. My point is that some people are going to try Aricept to boost their learning and no doubt some already are doing this.
Since donepezil blocks acetylcholinesterase, it boosts levels of the neurotransmitter acetylcholine. This probably causes other consequences aside from increased learning. For example, I've noticed that taking large amounts of choline (which presumably also boosts acetylcholine) makes me a lot more prone to depression. Your mileage may vary.
Gazzaniga also relays claims that Ritalin enhances cognitive function not just for hyperactives but also for regular minds. Has anyone ever come across any controlled studies on SAT tests, IQ tests, or other measures of cognitive function that demonstrate this claim? A lot of claims are floating around out there. But in the absence of controlled (preferably double blind) studies I'll treat such claims with skepticism.
The article above is adapted from Gazzaniga's book The Ethical Brain.
Dogs are like humans in yet another way. Elderly dogs demonstrate better cognitive performance if given higher antioxidant diets and more stimulating environments.
During the two-year longitudinal study, William Milgram, Ph.D., of the University of Toronto, Elizabeth Head, Ph.D., and Carl Cotman, Ph.D., of the University of California, Irvine and their colleagues found older beagles performed better on cognitive tests and were more likely to learn new tasks when they were fed a diet fortified with plenty of fruits, vegetables and vitamins, were exercised at least twice weekly, and were given the opportunity to play with other dogs and a variety of stimulating toys. The study is reported in the January 2005 Neurobiology of Aging.
Citrus pulp mixed in with dog food? I wonder if they had problems getting the dogs to eat it.
For the study, the researchers divided 48 older beagles (ages 7 to 11) into four groups. One group was fed a regular diet and received standard care; a second group received standard care but was fed an antioxidant fortified diet, consisting of standard dog food supplemented with tomatoes, carrot granules, citrus pulp, spinach flakes, the equivalent of 800 IUs of vitamin E, 20 milligrams per kilogram of vitamin C, and two mitochondrial co-factors--lipoic acid and carnitine; the third was fed a regular diet, but their environment was enriched (regular exercise, socialization with other dogs, and access to novel toys); the fourth group received a combination of the antioxidant diet as well as environmental enrichment. In addition, a set of 17 young dogs (ages 1 to 3) were divided into two groups, one fed a regular diet and the other fed the antioxidant fortified diet.
I am skeptical that the vitamin E was a big benefit. Too much of a single antioxidant can actually dampen down metabolism by quenching too many free radicals. Not all free radicals are purely detrimental. The body uses free radical molecules for intracellular and intercellular signalling. Dampen down those signals too much and the net result can be harmful. I'd like to see this experiment repeated with more fruits and vegetables and no vitamins. I bet well-chosen fruits and vegetables such as blueberries, spinach, kale, and perhaps even some nuts could provide as much antioxidant punch as this study's mixture that included vitamins.
The fruits and vegetables added to the antioxidant fortified diet were the equivalent of increasing intake from 3 servings to 5 or 6 servings daily. Previous research suggests that antioxidants might reduce free radical damage to neurons in the brain, which scientists believe is involved in age-associated learning and memory problems. Mitochondrial co-factors may help neurons function more efficiently, slash free radical production and lead to improvements in brain function. Other studies suggest that stimulating environments improve learning ability, induce beneficial changes in cellular structure, may help the brain grow new neurons, and increase the resistance of neurons to injury.
I've had Australian Shepherds turn up their noses at me when I offered them various fruits - and this in spite of their begging when they saw me eating out of a human food bowl. But perhaps mixed in with much tastier foods (like some blood poured out of a red meat package) dogs could be persuaded to eat their fruits. Though getting them to eat tomato sauce is not hard when it is mixed with pasta and some oil.
The combination of better environment and better diet had the most powerful effect.
Overall, older dogs in the combined intervention group did the best on these learning tasks, outperforming dogs in the control group (standard diet, standard care) as well as those that received either the antioxidant diet or environmental enrichment. However, older beagles that received at least one of these interventions also did better than the control group. For instance, all 12 of the older beagles in the combined intervention group were able to solve the reversal learning problem. In comparison, 8 of the 12 dogs that ate the antioxidant diet without environmental enrichment and 8 of the 10 that received environmental enrichment without the antioxidant diet solved the problem. Only two of the eight older dogs in the control group were able to do this task. Dietary intervention in the younger canines had no effect.
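The group splits above (12 of 12 combined-intervention dogs solving the reversal problem versus 2 of 8 controls) lend themselves to a quick back-of-the-envelope significance check. The sketch below is my own, not an analysis from the paper; it implements a one-sided Fisher exact test from the hypergeometric distribution using only the standard library:

```python
# One-sided Fisher exact test for a 2x2 table [[a, b], [c, d]]:
# probability of seeing at least `a` successes in the first group
# under the null hypothesis of no group difference.
from math import comb

def fisher_one_sided(a, b, c, d):
    n = a + b + c + d          # total subjects
    k = a + c                  # total successes across both groups
    m = a + b                  # size of the first group
    p = 0.0
    for x in range(a, min(m, k) + 1):
        p += comb(k, x) * comb(n - k, m - x) / comb(n, m)
    return p

# Combined intervention: 12 solved, 0 failed; control: 2 solved, 6 failed.
p = fisher_one_sided(12, 0, 2, 6)
print(f"{p:.5f}")  # about 0.0007 -- very unlikely to arise by chance
```

With only 20 dogs in the comparison, an exact test like this is more appropriate than a chi-square approximation.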
Similar dietary changes for older humans would probably provide a similar cognitive benefit.
Also see my previous posts "Concord Grape Juice Improves Memory Of Aged Rats" and "Choline May Restore Middle Aged Memory Formation".
"It used to be on the fringes completely, but now it's seeping into the mainstream," says Steven Roy Goodman, a college and graduate-school admissions consultant in Washington. "If you're one of hundreds of kids fighting for one of 10 spots, you'll do everything you can to get the extra edge."
William Pollack, a psychologist and director of the Centers for Men and Young Men at Harvard University's McLean Hospital, says he has spoken with more than 50 students who say they've used stimulants to raise their test scores. "I've seen it become a trend," Dr. Pollack says.
More than 6.4 million prescriptions for Adderall were filled last year. Some of these prescriptions are being used by students seeking a quick fix for studying.
A Yale University junior said Adderall helped him read the 576-page novel "Crime and Punishment" and write a 15-page paper — all in 30 hours.
"In earlier generations, people would take NoDoz and get themselves high on caffeine and that sort of thing," said the student, who asked not to be identified. "This is more efficient than that."
Twenty-four-year-old Chul Yim is among those considering Ritalin as a stimulant to help him focus before his important law school admission test.
"What was described to me is it gives you tunnel vision. You focus on the question. Once you're done with that question, you move on to the next question and you still have that energy," said Yim.
I'm not arguing for using these drugs. But if you are considering a drug like Ritalin note that the benefits are likely to be greater for someone who is easily distracted. If you think your mind stays focused while taking a test Ritalin is less likely to provide a benefit.
Also, lots of brain enhancers really present you with a trade-off. See, for example, my post Caffeine Alertness Comes At Cost Of Word Recall. A stimulant amphetamine-based drug like Adderall can make a mind race too quickly or make a person jittery. There is no guarantee it will provide a net benefit. Though I can easily imagine someone taking the SAT twice, once normal and once on Adderall. Even there, it might make more sense to try practice tests with and without your preferred drug to find out if it really will help you.
I personally would be reluctant to start using an amphetamine. There is the real danger of permanent neuronal damage. If you are not scared by the idea of using stimulant drugs you really ought to be. See my previous posts Methamphetamine Addict Brain Scans Show Extensive Losses and Partial Recovery From Methamphetamine-Induced Brain Damage. You only get one brain. Do not abuse it.
For some alternatives also see my previous posts Scientists Demonstrate Best Way To Use Caffeine and Modafinil Boosts Human Mental Abilities. Modafinil may be safer than amphetamines when the need is simply to avoid sleepiness and time sleeping. But I'm not convinced that modafinil doesn't exact some cost in the form of some permanent damage. Use any of these compounds sparingly.
Meenakshi Iyer, Ph.D. of the US National Institute of Neurological Disorders and Stroke Brain Stimulation Unit has found that a current of two thousandths of an ampere applied to the scalp using electrodes increases the speed of word recall.
A current of two thousandths of an ampere (a fraction of that needed to power a digital watch) applied for 20 minutes is enough to produce a significant improvement, according to data presented this week at the annual meeting of the Society for Neuroscience, held in San Diego.
The volunteers were asked to name as many words as possible beginning with a particular letter. Given around 90 seconds, most people get around 20 words. But when Iyer administered the current, her volunteers were able to name around 20% more words than controls, who had the electrodes attached but no current delivered. A smaller current of one thousandth of an amp had no effect.
This result is not a strong case for using electric currents to improve your brain's performance. First of all, I'm not convinced that this is entirely harmless. The current flows are causing events in the brain that otherwise would not occur. Are all the effects transitory? We do not know.
Also, the improvement in performance in one kind of test may very well decrease performance in other kinds of tests. There are precedents for this. For example, caffeine helps the brain stay on a chain of thought but does so at the expense of reducing word recall on unrelated subjects. Well, the ability to stay on a chain of thought is obviously useful in a lot of situations and so the trade-off is often worth it. But what is the trade-off from having the ability to recall more words that start with the same letter? Iyer's work needs to be repeated with a larger assortment of tests of cognitive function to see if there is any decay in the ability to perform other types of mental tasks.
Also, a wider range of cognitive tests run on people who take this treatment might turn up other benefits of this treatment. Perhaps the ability to recall more words improves the ability to find words to use to write better prose. People who do a lot of writing such as reporters, book authors, and Hollywood script writers might benefit from a little bit of electric juice. But another question that needs to be answered is just how long does the effect last?
My advice is to go jogging if you have hit a mental writing block. Exercise will increase brain performance while fat and table sugar will probably decrease brain performance.
Update: My point above about there often being trade-offs in cognitive enhancement is driven home by a new report out of Ohio State University showing that stress increases memory recall but decreases problem solving ability.
Researchers at Ohio State University gave a battery of simple cognitive tests to 19 first-year medical students one to two days before a regular classroom exam – a period when they would be highly stressed. Students were also given a similar battery of tests a week after the exam, when things were less hectic.
While pre-exam stress helped students accurately recall a list of memorized numbers, they did less well on the tests that required them to consider many possibilities in order to come up with a reasonable answer. A week after the exam, the opposite was true.
"Other studies have suggested that elevated stress levels can actually improve some aspects of cognition, particularly working memory," said Jessa Alexander, a study co-author and a research assistant in neurology at Ohio State. "The results of the two problem-solving tests we administered suggested a decline in problem solving abilities that required flexible thinking."
She conducted the study with David Beversdorf, an assistant professor of neurology at Ohio State. The two presented their findings on October 25 in San Diego at the annual Society for Neuroscience conference.
ITHACA, N.Y. -- Warm workers work better, an ergonomics study at Cornell University finds.
Chilly workers not only make more errors but cooler temperatures could increase a worker's hourly labor cost by 10 percent, estimates Alan Hedge, professor of design and environmental analysis and director of Cornell's Human Factors and Ergonomics Laboratory.
When the office temperature in a month-long study increased from 68 to 77 degrees Fahrenheit, typing errors fell by 44 percent and typing output jumped 150 percent. Hedge's study was exploring the link between changes in the physical environment and work performance.
"The results of our study also suggest raising the temperature to a more comfortable thermal zone saves employers about $2 per worker, per hour," says Hedge, who presented his findings this summer at the 2004 Eastern Ergonomics Conference and Exposition in New York City.
In the study, which was conducted at Insurance Office of America's headquarters in Orlando, Fla., each of nine workstations was equipped with a miniature personal environment-sensor for sampling air temperature every 15 minutes. The researchers recorded the amount of time that employees keyboarded and the amount of time they spent making error corrections. Hedge used a new research approach employing software that can synchronize a specific indoor environmental variable, in this case temperature, with productivity.
"At 77 degrees Fahrenheit, the workers were keyboarding 100 percent of the time with a 10 percent error rate, but at 68 degrees, their keying rate went down to 54 percent of the time with a 25 percent error rate," Hedge says. "Temperature is certainly a key variable that can impact performance."
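Combining Hedge's quoted keying-time and error-rate figures gives a rough sense of the productivity gap. This is a back-of-the-envelope sketch of my own; the study's actual productivity measure may have been computed differently:

```python
# Rough effective output = (fraction of time spent keying) x
#                          (fraction of keystrokes that are not errors).
def effective_output(keying_fraction, error_rate):
    return keying_fraction * (1.0 - error_rate)

warm = effective_output(1.00, 0.10)   # 77 F: keying 100% of the time, 10% errors
cold = effective_output(0.54, 0.25)   # 68 F: keying 54% of the time, 25% errors
print(round(warm / cold, 2))  # ~2.22: roughly double the useful output when warm
```

By this crude measure the warm office delivers more than twice the useful keystrokes per hour, which is at least in the same ballpark as the 150 percent output jump quoted above.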
One lesson of this study is that conservation should be done with better technology (e.g. better insulation) and not by making people suffer more extreme variations in temperature.
What is even more interesting here is the idea that there must be some optimal room temperature for productivity. Does anyone know whether psychometric studies of human intelligence have been conducted under a range of environmental conditions? Is there an optimal room temperature range for IQ? If so, what is that range?
In female rats running through water mazes, higher estrogen levels helped boost performance only when the conditions were not stressful.
"Water temperature totally reversed who did better," said Janice M. Juraska, a professor of psychology and of neuroscience. "Proestrous rats, which have high hormone levels, did better when the water was warm, presumably because they were less stressed. Estrous rats did better when the water was cold, presumably because they are not as prone to get stressed during this time."
Proestrous rats are fertile and ready to mate, while estrous rats have low hormone levels and won't mate. For the study -- funded by a grant to Juraska from the National Science Foundation -- 44 female rats were divided into four groups. The two groups of rats in proestrus and the two groups in estrus had to learn the route and swim to a submerged platform in either warm (91 degrees Fahrenheit; 33 Celsius) or cold water (66.2 degrees Fahrenheit; 19 Celsius).
Many scientists have tried to answer the hormones-cognition question, but the various findings, measuring different tasks, have been inconsistent and often contradictory.
"These discrepancies of sometimes opposite results have been very difficult to resolve," Juraska said. "Even for simple tests of spatial behavior, high hormones can either help or hinder, and nobody has understood why."
What is the evolutionary adaptation that is causing this difference in performance under different conditions?
We already have fairly limited abilities to regulate the extent of our stress response. We can take drugs that make us more relaxed and also other drugs that suppress inflammation response. We will of course eventually achieve much more control over the body's stress response. One future use of the ability to regulate stress response will be to create levels of stress that have the effect of optimally tuning the mind to reach peak performance for specific types of cognitive tasks.
Of course hormones will be manipulated to produce differences in cognitive function as well. But what we really need with hormones is the ability to make different target tissue types see different hormone levels. Rather than the whole body seeing, for instance, pharmaceutically boosted levels of testosterone expect to see the development of means to target only muscles or only the brain to produce only a subset of all the effects that testosterone produces. Ditto for estrogen, progesterone, and other hormones. Lots of effects of these hormones go together in nature. But you can bet people will want to, say, enhance their physical appearance without changing their cognitive function and vice versa.
The increased level of alertness from using caffeine from coffee or tea comes at a cost: When asked a question unrelated to your chain of thought you'll be less likely to recall the correct word for the answer if you are on caffeine.
Your first cup of coffee each morning increases your alertness, but a new study suggests caffeine--potentially because of how it interacts with neurons in the brain--might actually hinder your short-term recall of certain words. That is, it may temporarily suppress access to information locked in your memory and unrelated to your current train of thought. For example, you might struggle to remember an acquaintance's name after meeting many new people at a morning meeting.
The findings come from a study in the latest issue of APA's Behavioral Neuroscience (Vol. 118, No. 3) in which researchers Steve Womble, PhD, and Valerie Lesk, both of the International School for Advanced Studies in Trieste, Italy, examined the tip-of-the-tongue (TOT) phenomenon--a form of memory retrieval failure in which someone knows an answer for certain, yet is at a particular moment unable to recall it. They sought to provide a potential neurological explanation of how caffeine affects short-term memory.
Miss Lesk said: "In some conditions caffeine helps short-term memory and in others it makes it worse.
"It aids short-term memory when the information to be recalled is related to the current train of thought but hinders short-term memory when it is unrelated.
"If the word is unrelated then caffeine is still strengthening retrieval in the same way, but because it is unrelated to the word you want to find it is actually having a negative effect," she said.
Think of caffeine as a drug that reduces distractibility. Get on one chain of thought. Then encounter a distraction that requires you to shift to another chain of thought. Caffeine will inhibit your brain's ability to make that transition.
Obviously, this is a trade-off. The ability to resist distractions is a plus in some environments. But that is not always the case. Other environments require frequent attention shifting.
Caffeine is a tool. This is another piece of scientific evidence on how to use it. Also see my previous post: Scientists Demonstrate Best Way To Use Caffeine.
So what does this portend for the future? What we need (and I believe we will eventually find) are better pharmaceutical tools for shifting mental states to fit the types of work tasks we are doing. Imagine a safe, non-toxic, and fast-acting drug that reduces distractibility and then imagine another safe, non-toxic, and fast-acting drug that reverses the effect of the first drug. That would be a useful pair of drugs. Shift your mind into a distraction-resistant mode and work at the computer in your office. Then, when called out to an impromptu meeting to debate some issue, flip your mind into a mode that reacts well to handling the input of lots of other people who are all jumping around making competing points. Sound appealing?
Update: Another example of a situation where you wouldn't want to be on coffee is as a TV game show contestant. A player on Jeopardy is going to be hit by a series of questions on unrelated topics. One wouldn't want one's mind to be better at answering follow-up questions on the same topic at the cost of not being as good at answering questions on unrelated topics. But a medical student taking a test on body bones would probably do better on caffeine since the knowledge of all the bones would probably be stored together and memorized fairly recently before taking the test (caffeine works against recalling older memories too).
Julie Daniels, an epidemiologist at the University of North Carolina at Chapel Hill School of Public Health, has found in a population of English children that consumption of fish by mothers during pregnancy is positively correlated with cognitive development after controlling for educational levels of the mothers and some other factors.
CHAPEL HILL -- When fish is not contaminated, moderate consumption of the protein-rich food source by pregnant women and young children appears to boost the children’s neurological development, a new study shows.
"Our research adds to the literature suggesting that fish contains nutrients that may enhance early brain development," said Dr. Julie Daniels, assistant professor of epidemiology at the University of North Carolina at Chapel Hill School of Public Health. "We cannot say that we have proven that eating fish will have long-lasting effects in making people smarter since we have only looked at early development markers through an observational study."
More research is needed to corroborate the findings, Daniels said.
A report on the study appears in the July issue of the journal Epidemiology. Besides Daniels, authors are Drs. Matthew P. Longnecker of the National Institute of Environmental Health Sciences, Andrew S. Rowland of the University of New Mexico’s family and community medicine department and Jean Golding of the University of Bristol Institute of Child Health’s ALSPAC Study Team.
Conducted in Bristol, England, the research involved evaluating the association between mothers’ fish intake during pregnancy and their offspring’s early development of language and communication skills, Daniels said.
The team evaluated 7,421 English children born in 1991 and 1992. They studied the children since much has been learned about contaminants in fish, but little research has been done on the potential developmental benefits of eating fish, she said.
"We measured mothers’ and children’s fish intake by questionnaire," Daniels said. "Later, we assessed each child’s cognitive development using adaptations of the MacArthur Communicative Development Inventory at 15 months and the Denver Developmental Screening Test at 18 months."
Researchers also measured mercury levels in umbilical cord tissue for a subset of 1,054 children.
"We found total mercury concentrations to be low and not associated with neurodevelopment," she said. "Fish intake by mothers during pregnancy, and by infants after birth, was associated with higher mean developmental scores. For example, the adjusted mean MacArthur comprehension score for children whose mothers consumed fish four or more times a week was 72 compared with 68 among those whose mothers did not consume fish. While this may not be a major difference clinically, the statistically significant results were consistent across related subtests, which could be important across a large population."
Scientists found that there was a subtle but consistent link between eating fish during pregnancy and children’s subsequent test scores, even after adjusting for factors such as the age and education of the mother, whether she breastfed and the quality of the home environment.
The largest effect was seen in a test of the children’s understanding of words at age 15 months. Children whose mothers ate fish at least once a week scored 7 percent higher than those whose mothers never ate fish.
A similar pattern, although less marked, was seen in tests measuring social activity and language development. Developmental scores were also higher among children who also ate fish at least once a week before their first birthdays.
The study suggests that if a woman eats moderate quantities of fish -- about two to three servings per week, or 12 ounces, of non-contaminated species -- her child might benefit, the scientist said. There is no evidence that the more fish a woman eats, the higher that benefit would be.
"Women should definitely avoid shark, swordfish, king mackerel and tilefish, according to the U.S. Environmental Protection Agency and Food and Drug Administration," Daniels said. "Those fish are higher on the food chain and have greater accumulation of pollutants."
Depending on the region where they are caught, many of the most commonly eaten fish are low in pollutants while still being high in critical long-chain fatty acids and other nutrients, she said. They include salmon, herring, pollock, canned light tuna and sardines.
Daniels said she is pursuing similar work in a group of U.S. children to confirm the results in other populations.
"We also need to follow the children longer to determine whether any benefits from fish intake are permanent or transient," she said.
Fish intake during pregnancy has the potential to improve fetal development because it is a good source of iron and long chain omega fatty acids, which are necessary for proper development and function of the nervous system, Daniels said. Fish, especially oily fish, is a dietary source of eicosapentaenoic and docosahexaenoic acids (DHA), which are important in the structural and functional development of the brain before birth and through a child’s first year. The concentration of DHA in fetal brain increases rapidly during the last three months in the womb.
The Avon Longitudinal Study of Parents and Children, or ALSPAC, (also known as Children of the 90s) is a continuing research project based at the University of Bristol. It enrolled 14,000 mothers during pregnancy in 1991-2 and has followed the children and parents in minute detail ever since.
Support for the study came from the Medical Research Council, the Wellcome Trust, the Department of Health, the Department of the Environment, DfEE, Nutricia and other companies, all in the United Kingdom.
The US Food and Drug Administration has a very handy page of mercury levels in fish. See the chart Mercury Levels in Commercial Fish and Shellfish and follow down the column labelled "MEAN" to see how the various types of fish compare. Women especially would do well to avoid the higher scoring fish. You have to decide for yourself where you want to draw the line. Personally, I do not eat anything above 0.10 PPM and usually choose fish well below even that limit.
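If you want to apply a cutoff like mine systematically, the filtering is trivial. Here is a minimal Python sketch; the PPM figures below are rough, rounded approximations of the FDA chart's mean values, included only to illustrate the idea, so check the chart itself for current numbers.

```python
# Illustrative sketch: applying a personal mean-mercury cutoff to a fish list.
# PPM values are rough approximations of the FDA chart, not authoritative.
FISH_MEAN_MERCURY_PPM = {
    "tilefish (Gulf of Mexico)": 1.45,
    "swordfish": 0.99,
    "shark": 0.98,
    "king mackerel": 0.73,
    "canned albacore tuna": 0.35,
    "canned light tuna": 0.12,
    "salmon": 0.02,
    "sardines": 0.01,
}

def below_cutoff(fish_ppm, cutoff=0.10):
    """Return the fish whose mean mercury level is at or below the cutoff."""
    return sorted(name for name, ppm in fish_ppm.items() if ppm <= cutoff)

print(below_cutoff(FISH_MEAN_MERCURY_PPM))
```

Note that with a 0.10 PPM cutoff even canned light tuna drops off the list, which is why the cutoff you choose matters so much.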
Some commentators recommend avoiding fatty fish because more PCBs, DDT, dioxin, and other chemical compounds will be found in the fat. But the big health benefit from fish comes from the omega-3 fatty acids found in fish. So the advice to avoid fatty fish ends up defeating the purpose of eating fish in the first place.
I think a smarter approach is to avoid the types of fish that have been shown to have the most chemical contamination and to eat fish which have omega-3 fatty acids as a high fraction of total fats. That way you can limit the total amount of fish fat you need to consume in order to get the omega-3 fats. I prefer ocean fish to fresh water fish since the chemical pollution has become far more concentrated in lakes and rivers than in the oceans.
Salmon has very low levels of mercury, and at the same time wild salmon has lower levels of PCBs than beef. It also has a relatively good ratio of omega-3 fatty acids to omega-6 fatty acids. However, the Environmental Working Group has found that most farmed salmon has much higher levels of PCBs along with greater amounts of non-omega-3 fats.
Seven of ten farmed salmon purchased at grocery stores in Washington DC, San Francisco, and Portland, Oregon were contaminated with polychlorinated biphenyls (PCBs) at levels that raise health concerns, according to independent laboratory tests commissioned by Environmental Working Group.
These first-ever tests of farmed salmon from U.S. grocery stores show that farmed salmon are likely the most PCB-contaminated protein source in the U.S. food supply. On average farmed salmon have 16 times the dioxin-like PCBs found in wild salmon, 4 times the levels in beef, and 3.4 times the dioxin-like PCBs found in other seafood. The levels found in these tests track previous studies of farmed salmon contamination by scientists from Canada, Ireland, and the U.K. In total, these studies support the conclusion that American consumers nationwide are exposed to elevated PCB levels by eating farmed salmon.
However, not all farmed salmon has higher chemical contamination. Environmental Working Group did find 2 farmed salmon companies which use feeds that keep PCB levels down as low as those found in wild salmon.
Some farmed salmon companies -- Black Pearl and Clare Island Sea Farm -- are producing salmon that have very low PCB levels similar to those of wild salmon, Green said. These producers use herring and sardine fish meal, canola oil, soya and other uncontaminated ingredients.
Seek out the Black Pearl and Clare Island Sea Farm brands if you can find them.
Whether the higher levels of PCBs pose a health risk is not clear. Contaminant risks are typically greater for developing fetuses than for adults because the fetuses are going through complex changes which, if disrupted by chemical toxins, can cause improper development with very lasting and even permanent results. Pregnant women therefore need to be more conservative when evaluating risks.
Keep in mind when evaluating risks of fish that fish consumption probably will reduce your risks of heart disease and other diseases in which inflammation mechanisms are implicated (e.g. arthritis and perhaps some of the neurodegenerative diseases). So the risks of chemical contaminants have to be weighed against the health benefits of eating high omega-3 fatty acid foods. In the case of wild salmon the mercury and chemical contaminant levels are so low that a strong argument can be made for eating it. My guess is that even farmed salmon is a strong net health benefit for the vast majority, and in the case of farmed salmon given feed with low levels of contaminants the benefit from eating it is almost as big as the benefit from eating wild salmon.
Those Rastafarians with dreadlocks who never wash their hair may be on to something. Two compounds found in some shampoos, diethanolamine (DEA) and triethanolamine (TEA), may seep through the skin, into the brain, and block the ability of neurons to take up choline. Steven H. Zeisel, M.D., Ph.D and colleagues at University of North Carolina at Chapel Hill believe the reduction in choline uptake may reduce neural cell replication.
The research in animals centers around diethanolamine (DEA), a chemical used in shampoos, lotions, creams and other cosmetics. DEA is used widely because it provides a rich lather in shampoos and keeps a favorable consistency in lotions and creams, but there's also some research that shows it may rob the brain of its ability to make memory cells. "Depending on the treatment, some mice are stupid, some are not," said researcher Dr. Steven Zeisel.
In the modern scientific telling of the ancient story Samson was shampooed by Delilah and he became so dumb she was able to manipulate him to do her bidding without cutting off his hair.
Dr. Zeisel says if the DEA hypothesis holds true, the memory impact would probably be minimal in adults. But it could have a bigger effect on the developing brain, during pregnancy and the first few years of life.
Dr. Zeisel is investigating whether choline supplementation can counteract the effect of DEA and TEA.
The US National Institutes of Health National Library of Medicine happens to have an online database of household products and their ingredients. Check out the lists of Shampoo/Conditioner and Shampoo products and what each contains. It doesn't appear that all products are listed. Also, the ingredient lists look like what you find by reading the side of the bottle anyway. Still, if you want to check a number of different products while still remembering the names of these chemicals it is pretty quick to do.
Other recent work by Zeisel's lab showed how choline upregulates a gene to cause neurons to divide.
Now, working with nerve tissue derived from a human cancer known as a neuroblastoma, the UNC researchers have discovered why more choline causes stem cells -- the parents of brain cells -- to reproduce more than they would if insufficient choline were available.
A report on the findings will appear in the April issue of the Journal of Neurochemistry. Authors are doctoral student Mihai D. Niculescu and Dr. Steven H. Zeisel, professor and chair of nutrition at the UNC schools of public health and medicine. Dr. Yutaka Yamamuro, a former postdoctoral fellow in Zeisel's laboratory now with Nihon University in Japan, was a key contributor.
"We found that if we provided them with less choline, those nerve cells divided less and multiplied less," Zeisel said. "We then went on to try to explain why by looking at genes known to regulate cell division."
Scientists focused on cyclin-dependent kinase inhibitor 3 genes, which keep cells from dividing until a biochemical message turns the genes off, he said. They found exactly what they expected.
"We showed that choline donates a piece of its molecule called a methyl group and that gets put on the DNA for those genes," Zeisel said. "When the gene is methylated, its expression is shut down."
But when the gene is under-methylated -- such as when there’s not enough choline in the diet -- then it’s turned on -- halting or slowing nerve cell division, he said.
"Nature has built a remarkable switch into these genes, something like the switches we have on the walls at home and at work," Zeisel said. "In this very complicated study, we’ve discovered that the diet during pregnancy turns on or turns off division of stem cells that form the memory areas of the brain. Once you have changed formation of the memory areas, we can see it later in how the babies perform on memory testing once they are born. And the deficits can last a lifetime."
The next step, Zeisel said, will be to confirm that the same thing happens in living mouse fetuses when the mothers receive either high or low doses of choline.
Dr. Zeisel and other collaborators at UNC Chapel Hill and Tufts University have recently shown that insufficient folic acid even in later pregnancy results in lifelong reduction in cognitive ability in rats and mice.
CHAPEL HILL -- Folic acid is not just critical for brain development in embryos during the earliest stages of pregnancy, but it is a key to healthy brain growth and function late in pregnancy too, scientists at the University of North Carolina at Chapel Hill have discovered. Humans and other mammals lacking sufficient folic acid shortly before they are born can suffer lifelong brain impairment, the UNC animal studies indicate. Such research can never be done directly in growing human fetuses for obvious reasons, scientists say.
...The experiments involved feeding pregnant mice and rats high, normal or low amounts of folic acid in otherwise healthy diets, Zeisel said. Researchers then examined fetuses' brains and looked specifically at stem, or progenitor, cells that divide and give rise to various forebrain structures.

"In the babies of folic acid-deficient mothers, the stem cells divided less than half as much as in the babies of mothers on normal diets, so there were less than half the number of stem cells available to help populate the brain," he said. "In addition, the number of cells that were dying off was much greater -- twice as high as it should have been.

"So not only were fewer cells being born, but many more were dying, so that there were many fewer available to form important areas of brain. That means that those parts will be abnormal permanently, and that the folic acid story does not end soon after the beginning of pregnancy."

Essentially, folic acid is somehow promoting stem cell growth and survival so that the brain can form good memory centers, Zeisel said. To the researchers' knowledge, no one had ever looked before at folic acid's effects on brain in late pregnancy.

"In mice and rats, the brain centers we are talking about are almost identical to those in human beings, and -- along with what we already know about our human folic acid needs -- that's why we think these animal findings are applicable to humans," Zeisel said. "We have every reason to believe that this is true for pregnant women. It likely is the best evidence we're going to get because these experiments can never be done in humans."
Here is some useful news you can use. Morning "big gulp" coffee drinkers are misusing the power of caffeine. Researchers at the Sleep Disorders Center at Rush University Medical Center in Chicago along with colleagues at Brigham and Women’s Hospital and Harvard Medical School have shown that caffeine is best administered in a larger number of smaller doses with the doses coming later in the day.
Chicago - People who take small amounts of caffeine regularly during the day may be able to avoid falling asleep and perform well on cognitive tests without affecting their nighttime sleep habits.
Researchers from Rush University Medical Center, Brigham and Women’s Hospital and Harvard Medical School have discovered that caffeine works by thwarting one of two interacting physiological systems that govern the human sleep-wake cycle. The researchers, who report their findings in the May issue of the journal SLEEP, propose a novel regimen, consisting of frequent low doses of caffeine, to help shift workers, medical residents, truck drivers, and others who need to stay awake get a bigger boost from their tea or coffee.
"I hate to say it, but most of the population is using caffeine the wrong way by drinking a few mugs of coffee or tea in the morning, or three cups from their Starbuck’s grande on the way to work. This means that caffeine levels in the brain will be falling as the day goes on. Unfortunately, the physiological process they need to counteract is not a major player until the latter half of the day," said James Wyatt, PhD, sleep researcher at Rush University Medical Center and lead author on the study.
Though many studies have measured caffeine’s sleep-averting effects, most do not take into account that sleep is governed by two opposing but interacting processes. The circadian system promotes sleep rhythmically—an internal clock releases melatonin and other hormones in a cyclical fashion. In contrast, the homeostatic system drives sleep appetitively—it builds the longer one is awake. If the two drives worked together, the drive for sleep would be overwhelming. As it turns out, they oppose one another.
Caffeine is thought to block the receptor for adenosine, a critical chemical messenger involved in the homeostatic drive for sleep. If that were true, then caffeine would be most effective if it were administered in parallel with growing pressure from the sleep homeostatic system, and also with accumulating adenosine.
To test their hypothesis, the scientists studied 16 male subjects in private suites, free of time cues, for 29 days. Instead of keeping to a 24-hour day, researchers scheduled the subjects to live on a 42.85-hour day (28.57-hour wake episodes), simulating the duration of extended wakefulness commonly encountered by doctors, and military and emergency services personnel. The extended day was also designed to disrupt the subjects’ circadian system while maximizing the effects of the homeostatic push for sleep.
Following a randomized, double-blind protocol, subjects received either one caffeine pill, containing 0.3 mg per kilogram of body weight, roughly the equivalent of two ounces of coffee, or an identical-looking placebo. They took the pills upon waking and then once every hour. The goal of the steady dosing was to progressively build up caffeine levels in a way that would coincide with—and ultimately, counteract—the progressive push of the homeostatic system, which grows stronger the longer a subject stays awake.
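To put the regimen in concrete terms, here is a back-of-the-envelope Python sketch of the dosing arithmetic: 0.3 mg of caffeine per kg of body weight, taken once every waking hour. The 16-hour waking day in the second function is my own assumption for illustration, not a figure from the study.

```python
# Back-of-the-envelope sketch of the study's low-dose caffeine regimen:
# 0.3 mg per kg of body weight per hourly dose.
DOSE_MG_PER_KG = 0.3

def hourly_dose_mg(weight_kg):
    """Caffeine per hourly dose, in mg, for a given body weight."""
    return DOSE_MG_PER_KG * weight_kg

def total_daily_mg(weight_kg, waking_hours=16):
    """Cumulative caffeine over a hypothetical 16-hour waking day."""
    return hourly_dose_mg(weight_kg) * waking_hours

# For a 70 kg person this works out to roughly 21 mg per hour -- a small
# fraction of the ~100 mg in a typical cup of brewed coffee.
print(f"{hourly_dose_mg(70):.0f} mg/hour, {total_daily_mg(70):.0f} mg/day")
```

The point of spreading such tiny doses through the day is to track the homeostatic sleep drive, which builds the longer one is awake, rather than front-loading caffeine in the morning when that drive is weakest.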
The strategy worked. Subjects who took the low-dose caffeine performed better on cognitive tests. They also exhibited fewer accidental sleep onsets, or microsleeps. EEG tests showed that placebo subjects were unintentionally asleep 1.57 percent of the time during the scheduled wake episodes, compared with 0.32 percent for those receiving caffeine. Despite their enhanced wakefulness, the caffeine-taking subjects reported feeling sleepier than their placebo counterparts, suggesting that the wake-promoting effects of caffeine do not replace the restorative effects gained through sleep.
Coffee, tea, and other caffeine-containing beverages are tools. Don't drink more than you need, and slow your rate of drinking to spread it out. Keep in mind that once you reach the point where you no longer need to maintain a high feeling of wakefulness you should immediately stop drinking it. If you need something more powerful, consider Provigil (modafinil). My strongly felt advice is to stay away from methamphetamine and other amphetamines because they cause brain damage. I don't have any specific knowledge about toxic effects of caffeine or modafinil on neurons. But sleep deprivation is definitely harmful. A life lived with a constant need for anti-sleep stimulants is a life in need of some serious restructuring to allow for more sleep time.
Young women who took iron supplementation for 16 weeks significantly improved their attention, short-term and long-term memory, and their performance on cognitive tasks, even though many were not considered to be anemic when the study began, according to researchers at Pennsylvania State University.
The study, the first to systematically examine the impact of iron supplementation on cognitive functioning in women aged 18 to 35 (average age 21), was presented at Experimental Biology 2004, in the American Society of Nutritional Sciences' scientific program. Dr. Laura Murray-Kolb, a postdoctoral fellow in the lab of Dr. John Beard, says the study shows that even modest levels of iron deficiency have a negative impact on cognitive functioning in young women. She says the study also is the first to demonstrate how iron supplementation can reverse this impact in this age group.
Baseline cognition testing, looking at memory, stimulus encoding, retrieval, and other measures of cognition, was performed on 149 women classified as either iron sufficient, iron deficient but not anemic, or anemic. All of the women underwent a health history, and the research design controlled for or took into account any differences in smoking, social status, grade point average, and other measures. The women were then given either 60 mg of iron supplementation (elemental iron) or placebo treatment for four months. At the end of that period, the 113 women remaining in the study took the same tests again.
On the baseline test, women who were iron deficient but not anemic completed the tasks in the same amount of time as iron sufficient women of the same age, but they performed significantly worse. Women who were anemic also performed significantly worse, and in addition they took longer. The more anemic a woman was, the longer it took her to complete the tasks. However, supplementation and the subsequent increase in iron stores markedly improved cognition scores (memory, attention, and learning tasks) and time to complete the tasks.
This finding has great implications, says Dr. Murray-Kolb, because the prevalence of iron deficiency remains at 9 percent to 11 percent for women of reproductive age and 25 percent for pregnant women. In non-industrialized countries, the prevalence of anemia is over 40 percent in non-pregnant women and over 50 percent for pregnant women and for children aged five to 14. According to current prevalence estimates, iron deficiency affects the lives of more than two billion people worldwide.
The findings also are important, say the researchers, because they illustrate the significance of lower degrees of iron deficiency on cognitive functioning, including memory, attention, learning tasks, and time to complete tasks.
Some of the known consequences of iron deficiency are reduced physical endurance, an impaired immune response, temperature regulation difficulties, changes in energy metabolism, and in children, a decrease in cognitive performance as well as negative effects on behavior. While iron deficiency was once presumed to exert most of its deleterious effects only if it had reached the level of anemia, it has more recently become recognized that many organs show negative changes in functioning before there is any drop in hemoglobin concentration.
Authors of the study are Dr. Murray-Kolb, Dr. Beard, both of the Nutritional Sciences Department at Penn State, and Dr. Keith Whitfield, of Penn State's Biobehavioral Health Department.
Two billion people would operate at a higher level of cognitive function if they received adequate amounts of iron. That represents a huge amount of wasted potential, and that is from just one micronutrient deficiency. There may well be other widespread micronutrient deficiencies (choline, omega 3 fatty acids, zinc, and still others) that are reducing cognitive function. But even worse, some of these deficiencies are surely holding back brain development before and after birth and preventing full genetic potential from ever being realized. For people who grow up eating a nutritionally deficient diet, supplementation in adulthood will be beneficial. But if a person goes through a stage of development with deficiencies persisting throughout that stage, then supplementation that comes later in life will be too late to allow full brain development.
Unfortunately the taboos around IQ help to prevent a full airing of the implications of this sort of research. That is a shame. We need to talk about how to raise IQs and improve mental function by improving nutrition and in other ways as well. For instance, British researchers found that improved nutrition makes prisoners behave much better. Would it also lower crime rates if released prisoners were required to take supplements? Also, a recent report provides preliminary evidence that zinc will lower the severity of attention deficit hyperactivity disorder (ADHD) in children. It may be the case that millions of children would be learning more and disrupting classes less if they were getting more zinc in their diets.
In order to improve average level of mental functioning I am an advocate of both more aggressive fortification of foods and of a more rapid reduction in the emissions of compounds such as mercury and lead that are neurotoxins. Also, a lot more research into nutritional state and cognitive function should be done. Our brains are our greatest assets and we should err on the side of doing whatever we can to ensure they will develop to reach their fullest genetic potential.
Here is a pretty wild idea. Some day brain scans may be used to determine what each mind is in best condition to learn each day.
It's the future as imagined by Max Cynader, director of the Brain Research Centre at the University of British Columbia. "Forty or 50 years from now, a student will stick her head in a scanner and see what she could best learn that day," he says. "That's a dream. We aren't there, but we can see how to get to there from here."
Many types of scanners use radiation. The PET (Positron Emission Tomography) scan uses radioactive glucose. The CAT or CT scan uses X-rays and so involves radiation exposure. In fact, CT scan doses are much larger than the dose from conventional X-ray images. Well, one alternative is Magnetic Resonance Imaging. Leaving aside the question of whether the high magnetic fields might cause some damage (my guess: yes!) it faces a more immediate obstacle: MRI requires a person to lie still for up to 20 minutes. This is not practical since it would take too much time, the kids wouldn't lie still, and too many machines would be needed. But if a means could be developed to get a good view of the brain on a daily basis without causing any radiation damage then why not check each brain to see what it appears to be up for learning?
It seems more likely that methods will first be found to enhance a brain's function for learning each particular type of material. One idea I've previously suggested is to use drugs to cycle more rapidly between wakefulness and the sleeping brain states in which memories are consolidated. It makes sense to enhance memory formation when one is experiencing things that are worth remembering. The person who has a boring clerical job doesn't want to go to sleep to remember every boring detail of every day of drudgery. But when studying for a test or learning new technologies it would be very helpful to enhance memory formation.
Taking a nutrient called choline during pregnancy could "super-charge" children's brains for life, suggests a study in rats.
Offspring born to pregnant rats given the supplement were known to be faster learners with better memories. But the new work, by Scott Swartzwelder and colleagues at Duke University Medical Center in North Carolina, US, shows this is due to having bigger brain cells in vital areas.
Previous studies have shown that the offspring of rats fed choline have better memories and their cognitive function does not decay as rapidly as they age.
Eggs are a good source of choline. For a chart on choline food sources see my previous posts Choline May Restore Middle Aged Memory Formation. For more on choline's effects down at the level of genetic regulation see my post Nutrients Change Embryonic DNA Methylation, Risk Of Disease.
In the current study, the researchers explored the effects of choline on neurons in the hippocampus, a brain region that is critical for learning and memory. They fed pregnant rats extra amounts of choline during a brief but critical window of pregnancy, then studied how their hippocampal neurons differed from those of control rats.
The researchers found that hippocampal neurons were larger, and they possessed more tentacle-like "dendrites" that reach out and receive signals from neighboring neurons.
"Having more dendrites means that a neuron has more surface area to receive incoming signals," said Scott Swartzwelder, Ph.D., senior author of the study and a neuropsychologist at Duke and the Durham VA Medical Center. "This could make it easier to push the neuron to the threshold for firing its signal to another neuron." When a neuron fires a signal, it releases brain chemicals called "neurotransmitters" that trigger neighboring neurons to react. As neurons successively fire, one to the next, they create a neural circuit that can process new information, he said.
Not only were neurons structured with more dendrites, they also "fired" electrical signals more rapidly and sustained their firing for longer periods of time, the study showed. The neurons also rebounded more easily from their resting phase in between firing signals. These findings complement a previous study by this group showing that neurons from supplemented animals were less susceptible to insults from toxic drugs that are known to kill neurons.
Collectively, these behaviors should heighten the neurons' capacity to accept, transmit and integrate incoming information, said Swartzwelder.
"We've seen before that the brains of choline-supplemented rats have a greater plasticity -- or an ability to change and react to stimuli more readily than normal rats -- and now we are beginning to understand why," he said.
VICHY, FRANCE, December 18, 2003 -- Consuming Concord grape juice significantly improved laboratory animals' short-term memory in a water maze test as well as their neuro-motor skills in certain of the coordination, balance and strength tests, according to preliminary research presented at the 1st International Conference on Polyphenols and Health recently held in Vichy, France.
"In the study we subjected 45 senescent rats-meaning they were mature animals approaching the end of their expected life spans-to a range of tests and challenges that are commonly accepted methods of measuring changes in short-term memory and neuro-motor skills," says James A. Joseph, Ph.D., Chief, Neurosciences Laboratory, USDA Human Nutrition Research Center on Aging and lead researcher in the study. "Concord grape juice appeared to reduce or reverse the loss of sensitivity of muscarinic receptors, thus enhancing cognitive and some motor skills in the test animals. In many of the tests we saw significant improvements or trends toward improvement."
The memory test was the Morris water maze, an age-sensitive challenge that requires animals to use spatial learning to find a platform submerged 2 cm below the surface of a pool of water. Rats fed a 10% solution of Concord grape juice found the platform in roughly 20% less time than the control group. Other tests measured the animals' ability to balance on a horizontal stationary rod; a rotating, slowly accelerating rod; and various sized planks, and their ability to hold onto a suspended wire and an inclined wire screen. In some of those tests, improvements were seen in one or both of the groups consuming a 10% or a 50% solution of Concord grape juice.
"The Concord grape juice findings are not surprising," explains Joseph. "We have seen similar effects in the work we've done in blueberries."
The researchers point to several factors as potential mechanisms of action, including increased dopamine production and a potent overall antioxidant effect. According to previously published USDA studies, Concord grape juice has the highest total antioxidants of any fruits, vegetables or juices tested.
Regarding the reference to previous USDA studies: Be aware that spinach is especially efficacious for improving aged rat memories. Also, blueberries and blackberries have more antioxidant activity than red grapes. Note that while raisins have high scores this is due to dehydration and that their antioxidant to calorie ratio is probably no better than that of undehydrated grapes.
Also on a related topic, if you have forgotten my previous post see: Choline May Restore Middle Aged Memory Formation.
MIT professor Richard Wurtman and post-doc Lisa Teather found that choline in the form of cytidine (5')-diphosphocholine (CDP-choline) improved learning in rats after 2 months of supplementation.
Among rats not getting CDP-choline, the older animals seemed to forget much of the previous day's learning, Teather says, while the young ones didn't. By the end of 4 days of testing, she notes, the difference between these groups "was really huge," suggesting that the older ones had trouble forming long-term memories. However, she notes, among CDP-choline–supplemented rats, middle-aged animals "mastered the [maze learning] as readily as the young animals did." Her group is now in the process of evaluating the impact of CDP-choline on memory development in the rodent equivalent of senior citizens.
No quick fix
"The interesting thing," observes Teather, "is that if you feed the [rats the supplemented] diet for 1 month, you can't rescue memories." The animals had to get CDP-choline for at least 2 months to receive some memory protection. And that, she says, points to a mechanism for what the nutritional supplement is doing.
Teather and Wurtman theorize that it takes a month of choline supplementation for the brain to build up enough of the neurotransmitter acetylcholine before surplus choline starts getting used in the synthesis of phospholipids to make more membranes. Then, gradually, enough extra neuronal membrane becomes available to support new memory formation. They also expect that humans would have to get only 500 mg per day because the human body retains choline more efficiently.
So how to get choline in the diet? See the lists below. Keep in mind that there are about 454 grams in a pound and so 100 grams of food is less than a quarter pound.
FOOD SOURCES

  Food                 Choline (mg/100g)
  Liver, desiccated    2170
  Heart, beef          1720
  Brewer's yeast        300
  Nuts                  220
  Pulses                120
  Citrus fruits          85
  Bread, wholemeal       80
  Bananas                44
  Food/serving                                  Choline (mg/serving)
  Beef liver, 85 grams (3 ounces)               453.2
  Egg, 61 grams (1 large)                       345.0
  Beef steak, 85 grams (3 ounces)                58.5
  Cauliflower, 99 grams (1/6 medium head)        43.9
  Iceberg lettuce, 89 grams (1/6 medium head)    28.9
  Peanuts, 1 ounce                               28.3
  Peanut butter, 32 grams (2 Tbsp)               26.1
  Grape juice, 8 ounces                          12.9
  Potato, 148 grams (1 medium)                   12.9
  Orange, 154 grams (1 medium)                   11.5
  Whole milk, 8 ounces                            9.7
Someone who doesn't want to eat eggs, liver, or beef heart is probably not going to get enough choline in their diet to achieve the effect that the MIT researchers expect. The amount of nuts needed to get enough choline would add up to a lot of calories, though nuts do have other health benefits.
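To make the calorie point concrete, here is a quick back-of-envelope calculation of how many servings of each food it would take to reach the 500 mg/day figure the MIT researchers expect for humans, using the per-serving values from the table above (the helper function itself is just illustrative):

```python
# Servings of a single food needed to reach a ~500 mg/day choline target,
# using per-serving values from the table above.
CHOLINE_MG_PER_SERVING = {
    "beef liver (3 oz)": 453.2,
    "egg (1 large)": 345.0,
    "beef steak (3 oz)": 58.5,
    "cauliflower (1/6 head)": 43.9,
    "peanuts (1 oz)": 28.3,
    "whole milk (8 oz)": 9.7,
}

TARGET_MG = 500.0

def servings_needed(food: str) -> float:
    """Servings of one food needed to hit the daily choline target."""
    return TARGET_MG / CHOLINE_MG_PER_SERVING[food]

for food in CHOLINE_MG_PER_SERVING:
    print(f"{food}: {servings_needed(food):.1f} servings/day")
```

About 1.1 servings of liver or 1.4 large eggs would do it, while peanuts alone would take nearly 18 ounces a day, which is exactly the calorie problem noted above.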
Memory Enhancing Drugs May Worsen Working Memory
New Haven, Conn. -- A new study cautions that while drugs being designed to enhance memory in the elderly seem to be effective for some types of memory, they may actually worsen working memory, according to a study by Yale researchers published Thursday in the journal Neuron.
Working memory is the cognitive ability that intelligently regulates our thoughts, actions and feelings, letting us plan and organize for the future. It is governed by the prefrontal cortex. This type of memory is constantly updated and is known to be impaired by the normal aging process.
The ability to lay down long-term memories depends upon another region of the brain, the hippocampus.
The study by Amy Arnsten, associate professor and director of graduate studies in neurobiology at Yale School of Medicine, shows that the prefrontal cortex and hippocampus have different chemical needs, and that medications being developed to enhance long-term memory actually worsen working memory in aged animals.
Biotech companies are focusing on activating protein kinase A (PKA), an enzyme in hippocampal cells which strengthens long-term memory formation. Arnsten and colleagues found that activation of protein kinase A in the prefrontal cortex worsened working memory, while inhibiting this enzyme in the prefrontal cortex improved working memory in aged rats. In collaboration with Ronald Duman's laboratory, they found that aged rats with naturally occurring working memory impairment had signs of overactive protein kinase A in their prefrontal cortex.
"Because PKA is over-activated in the aged prefrontal cortex, PKA stimulation actually makes the situation worse by further impairing working memory," Arnsten said.
The study was funded by the National Institute on Aging (NIA) of the National Institutes of Health (NIH). "This important study tells us that one size may not fit all when developing treatment strategies for cognitive deficits," says Molly Wagster, program director for neuropsychology of aging research at the NIA. "The differing effects of PKA activity in the hippocampus and the prefrontal cortex suggest that distinct neurochemical needs of different regions of the brain must be addressed for the development of effective ways to enhance cognition."
Co-authors included Brian Ramos, Shari Birnbaum, Isabelle Lindenmayer, Samuel Newton and Ronald Duman.
Most drugs are really imprecise tools for modifying metabolism because they typically work on more than one site with more than one effect. It is therefore not surprising that a drug aimed at one desired effect on the hippocampus has an undesired effect on the prefrontal cortex. Drugs that attempt to reverse or slow down aging are going to be tough to develop because the most effective treatments would reverse the damage that accumulates with aging. Attempts to simply suppress or up-regulate some activity are unlikely to reverse most forms of accumulated damage. It may well be possible to develop drugs that reverse some subset of all the types of accumulated damage. But theoretically more powerful approaches such as gene therapy and cell therapy will likely prove more efficacious in the long run.
"Prefrontal cortex functions are essential to the information age and they naturally decrease with normal aging, so it's particularly important to see what this cortex needs and to give it back to this part of the brain," says study author Amy F.T. Arnsten, an associate professor and director of graduate studies in neurobiology at Yale University School of Medicine. "There's some deterioration in the hippocampus with normal aging, but what really erodes the hippocampus is Alzheimer's."
Businessweek has an article that highlights the development of drugs for memory and general cognitive enhancement.
HUMAN TRIALS. C.L. took nine capsules of CX516 daily for 12 weeks this spring. The impact was immediate. "At the start of the trial, I could remember less than five words out of a list of 20. By the second week, I could get 14 out of 20. There was a very, very appreciable enhancement." He has since finished his part in the study and says it was "heartbreaking" to go off the drug. "I've been thinking of some other way to get it, and I don't give a damn if it's legal or illegal."
If he waits a few years, C.L. should be able to solve his problem legally. At least 60 pharmaceutical and biotech companies around the world are working on novel memory pills. Some 40 are in human trials, and the first of these could be on the market within the next few years.
The aging of the baby boomers and their unwillingness to go quietly into the night provides a big incentive for drug developers to come out with compounds that will help boomers think more clearly. This would be especially helpful for the boomers who did too much bad LSD and other recreational drugs in the 60s and who haven't had a clear thought since then.
The Company’s proprietary family of AMPAKINE compounds affects and enhances the activity of AMPA-type glutamate receptors, complexes of proteins that are involved in communication between cells in the human brain (neurons). Glutamatergic transmission between neurons is by far the brain’s most abundant communication system. When a neuron releases glutamate and it binds to the AMPA receptor, AMPAKINE compounds increase or magnify the effect of glutamate, thereby amplifying normal brain signals.
The AMPAKINE compounds, which can be taken orally, rapidly enter the brain and increase the ability of neurons to communicate with each other. Data from pilot clinical trials have shown that the AMPAKINE CX516, a lead compound, is effective in improving memory in elderly volunteers and patients with schizophrenia.
Some of the drugs mentioned in the Businessweek article operate on neurotransmitter metabolism by enhancing neurotransmitter receptor sensitivity, preventing reuptake, stimulating release, or perhaps by preventing breakdown or stimulating production of neurotransmitters. One or more of these approaches will eventually result in useful drugs that provide real benefits for the middle aged, the elderly and perhaps even younger minds. Large numbers of people (including me!) would certainly jump at the opportunity to take a memory enhancing drug that operated by one of these mechanisms if the side effects didn't seem too onerous or risky. But the better longer term approach to the prevention of age-related decline in memory and processing speed is to develop treatments that reverse brain aging. What is worrying about any approach that does not target an actual aging or disease process is that turning up the functional level of the brain could conceivably increase mental functioning in the short term but, by speeding up brain metabolism, accelerate brain aging.
The article mentions the experimental Alzheimer's Disease vaccine AN-1792 from Elan Pharmaceuticals, which successfully cleared away beta amyloid plaque from many test subjects. In a clinical trial this vaccine prevented mental decline in two thirds of the subjects while causing fatal brain inflammation from immune T cell attack in a small number of them. The result provides evidence for the idea that plaque accumulation really does cause Alzheimer's and strongly suggests that a better vaccine (probably one designed to hit a different antigen section of the plaque that doesn't look like any surface protein on normal brain cells) could successfully target the plaque while avoiding brain inflammation. This result is causing Elan and others to pursue better Alzheimer's vaccines.
This vaccine approach for Alzheimer's is probably the easiest way to try to stop a disease process that comes with age. Send in antibodies to target the plaque and the immune system eats up the plaque and the plaque is no longer there to kill neurons. The problem, of course, is that there are plenty of disease processes and aging processes that are not amenable to treatment with vaccines. The other disease processes that come with aging will be much harder to target to stop and reverse. Accumulated damage in neurons will likely require gene therapy to send in instructions to repair them. But effective gene therapies for brain rejuvenation will take much longer to develop in comparison to vaccines.
Another useful approach will be to send in stem cells to replace aged neural stem cells in the hippocampus. The brain is constantly forming new neurons from stem cells in the brain but the reservoirs of stem cells age from repeated replication and accumulated damage and they need to be replaced in order to restore the ability to form new neurons back to youthful levels.
The classic chemical compound drug approach to disease treatment has real limits to what it can accomplish. To do the most complex manipulations of an organism requires many steps and fancier instructions than a single chemical compound can provide. In a nutshell, what is needed is the ability to send in the equivalent of computer programs. Cell therapies and gene therapies have a lot more potential than classical chemical compound drugs in the long run because cells and genes carry much more information and can therefore carry out more complex tasks.
Chemical compound drug therapies still have a long life ahead of them. One reason is that cell therapy and gene therapy development are both (unfortunately) still in their infancy. Also, the same kinds of advances in DNA sequencing and gene activity assays that are pointing the way toward useful gene therapies are also pointing the way toward targets in the cell for chemical compounds. Since cells already have tens of thousands of genes one way that drug therapies will improve is by the development of compounds that can turn individual genes on and off by binding to regulatory proteins. Fairly complex transformations of cells will eventually be carried out by sending in series of compounds that sequentially turn on and off various genes. One area where this will be used first is in cancer therapy. The way forward for anti-angiogenesis compound usage is probably going to be to use multiple compounds in concert to cut off different pathways that cause the angiogenesis (the growth of new blood vessels) that helps keep growing tumors fed. Also, anti-angiogenesis compounds will be used in combination with other compounds that regulate other steps involved in cell proliferation.
Update: A team led by Canadian researcher Dr. William Molloy of McMaster University and of St. Peter's Centre for Studies in Aging in Hamilton, Ontario has discovered that antibiotics slow the rate of advance of Alzheimer's Disease.
Alzheimer's patients who took the drugs - doxycycline and rifampin - in combination for three months showed significantly slower cognitive decline at six months out from the start of the trial than patients who received a placebo, they will report at the annual meeting of the Infectious Disease Society of America in San Diego on Saturday.
Armed with the knowledge that plaques formed in the brain have been associated with jumbled memories, and the fact that researchers have also found the chlamydia pneumoniae bacterium in the brains of people with Alzheimer's disease, Molloy came up with a novel hypothesis.
The antibiotics did not, however, lower the level of chlamydia by all that much (which itself is bad news). So did they knock out a different bacterium, block some action that chlamydia takes, or work in some manner unrelated to bacteria? Also, would other antibiotics have a more beneficial effect? Expect a lot of research groups to jump to repeat this result. Heck, expect doctors to be deluged by requests for antibiotics prescriptions for Alzheimer's patients. Is the best antibiotic against chlamydia still under patent? What drug company makes it and how much profit do they make from that antibiotic?
The research lends credence to the notion that common bacterial infections might play a role in determining who is stricken with the debilitating neurological disorder. It also offers hope of a cheap, simple treatment for Alzheimer's, a condition for which there is virtually no effective treatment.
There is a hypothesis that bacteria play a role in the formation of plaques in arteries and this study was an attempt to discover whether the same might be going on with Alzheimer's plaques. This suggests another target for vaccine development: the bacteria that might be involved in Alzheimer's. Also, the development of better antibiotics would be another approach. Given that Alzheimer's plaque formation probably starts for people in their 40s or 50s it also brings up the question of whether we should all go get dosed with a heavy antibiotic regimen before our brains start accumulating damage. It may also be an argument for less sexual promiscuity if chlamydia is mostly getting spread by that route. (anyone know?)
In the January 2003 issue of Pediatrics researchers from the Institute for Nutrition Research of the University of Oslo in Norway reported supplementation with omega 3 fatty acid DocosaHexaenoic Acid (DHA) boosted the intelligence of infants.
We received dietary information from 76 infants (41 in the cod liver oil group and 35 in the corn oil group), documenting that all of them were breastfed at 3 months of age. Children who were born to mothers who had taken cod liver oil (n = 48) during pregnancy and lactation scored higher on the Mental Processing Composite of the K-ABC at 4 years of age as compared with children whose mothers had taken corn oil (n = 36; 106.4 [7.4] vs 102.3 [11.3]). The Mental Processing Composite score correlated significantly with head circumference at birth (r = 0.23), but no relation was found with birth weight or gestational length. The children's mental processing scores at 4 years of age correlated significantly with maternal intake of DHA and eicosapentaenoic acid during pregnancy. In a multiple regression model, maternal intake of DHA during pregnancy was the only variable of statistical significance for the children's mental processing scores at 4 years of age. CONCLUSION: Maternal intake of very-long-chain n-3 PUFAs during pregnancy and lactation may be favorable for later mental development of children.
This result supports a previous report arguing for a link between breast-feeding as a source of DHA and intelligence.
The importance of omega-3 fatty acids for brain development is getting a lot of support from a variety of quarters. A just-announced study by USC psychology professor Adrian Raine and colleagues found that supplementing the diets of poor children in Mauritius with higher quality food, including fish for omega-3 fatty acids, reduced their rate of committing crimes when they got older.
The research, published in this month's American Journal of Psychiatry, involved 100 Mauritian children and a group of around 350 control subjects not put through the programme. EEGs - scans of brain electrical activity - taken at the age of 11 found heightened activity compared to their peers; they were also less likely to have criminal records and 35 per cent less likely to report having engaged in some criminal activity and got away with it.
The most striking effects were observed in those most malnourished when they started the programme, Raine said, suggesting that the diet - unusually rich in fish - could be the crucial element.
Raine thinks the omega-3 fatty acids may be responsible for the difference in behavior. Note that he also foresees a day when surgery might be used to correct prefrontal lobe defects that prevent people from controlling the impulses that lead them to commit crimes!
Raine also says that we can't ignore biological and genetic causes of mental illness.
Raine cautioned, however, that there does appear to be a strong genetic component to schizophrenia that shouldn't be discounted. "Pushing biology and genetic issues under the carpet isn't going to help society in the long run," he said. The good nutrition and educational programs early in life might at least delay the onset of mental illness in some people, he added.
Raine was involved in earlier research that found less grey matter in the prefrontal lobes of violent criminals.
Researchers writing in this month’s Archives of General Psychiatry have found that men with antisocial personality disorder, a condition characterized by violence and criminal behavior, have 11 percent less gray matter than normal men in a part of the brain called the prefrontal cortex.
Firstly, the prefrontal cortex is responsible for self-restraint and deliberate foresight. If this part of the brain was damaged, then one effect that would arise would be the tendency for one to act on all his impulses without thinking ahead or thinking of the consequences. Second, the prefrontal cortex is important for learning conditioned responses. This area of the brain has been thought to be central to a child's ability to learn to feel remorse, conscience, and social sensitivity (7). If the prefrontal cortex was to function abnormally, how is the child supposed to learn how to have a conscience? For example, one study reported that children who received damage to their prefrontal cortex before the age of seven developed abnormal social behavior, which was characterized by their inability to control their aggression and anger (2). Lastly, Raine suggests that if prefrontal deficits underlie the APD group's low levels of autonomic arousal, these people may unconsciously be trying to compensate through stimulation-seeking (5).
So does a diet deficient in omega-3 fatty acids lead to poor development of the prefrontal lobe and hence to both lower intelligence and more criminality?
The results about the Mauritian children follow on the heels of another recent study that found improving the nutrition of prisoners decreased prison violence.
A few months ago, C. Bernard Gesch of Oxford University and coworkers reported in the British Journal of Psychiatry that vitamin-mineral-essential fatty acid supplements appeared capable of dampening violence in a prison population (Psychiatric News, October 2, 2002). However, J.S. Zil, M.D., J.D., chief forensic psychiatrist of the State of California Department of Corrections, told Psychiatric News that he was skeptical of their results. To which Gesch replied: "I don’t feel that Dr. Zil’s cynicism is a problem. It’s only natural to be cautious about such provocative findings."
Researchers from Imperial College London and Charing Cross Hospital have discovered a way to help musicians improve their musical performances by an average of up to 17 per cent, equivalent to an improvement of one grade or class of honours.
The research, published in this month's edition of Neuroreport, shows that using a process known as neurofeedback, students at London's Royal College of Music were able to improve their performance across a number of areas including their musical understanding and imagination, and their communication with the audience.
Dr Tobias Egner, from Imperial College London at Charing Cross Hospital, one of the authors of the study, comments: "This is a unique use of neurofeedback. It has been used for helping with a number of conditions such as attention deficit disorder and epilepsy, but this is the first time it has been used to improve a complex set of skills such as musical performance in healthy students."
Two experiments were conducted involving a total of 97 students. In both experiments, the students were assessed on two pieces of music, both before and after the neurofeedback training, according to a 10-point scale adapted from a standard set of music performance evaluation criteria of the Associated Board of the Royal Schools of Music, by a panel of expert judges. The judges evaluated video-recorded performances, and were unaware of whether the performance had been given before or after the intervention.
Neurofeedback monitors brain activity through sensors attached to the scalp which filter out the brainwaves. These filtered brainwaves are then fed back to the individual in the form of a video game displayed on screen, and the participant learns to control the game by altering particular aspects of their brain activity. This alteration in brain activity can influence cognitive performance.
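The band-filtering step at the heart of this can be sketched in a few lines. The following is a toy illustration, not the researchers' actual software: it estimates theta (4-8 Hz) and alpha (8-12 Hz) power in an EEG epoch via the FFT and computes the theta/alpha ratio that an alpha/theta-style protocol would feed back to the trainee. The 256 Hz sampling rate and the synthetic test signal are assumptions:

```python
import numpy as np

FS = 256  # assumed sampling rate in Hz

def band_power(signal: np.ndarray, low: float, high: float, fs: int = FS) -> float:
    """Mean spectral power of `signal` in the [low, high) frequency band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs < high)
    return float(spectrum[mask].mean())

def theta_alpha_ratio(signal: np.ndarray) -> float:
    """Feedback value: theta (4-8 Hz) power relative to alpha (8-12 Hz) power."""
    return band_power(signal, 4, 8) / band_power(signal, 8, 12)

# Synthetic one-second epoch: a strong 6 Hz (theta) tone plus a weak 10 Hz (alpha) tone.
t = np.arange(FS) / FS
epoch = 2.0 * np.sin(2 * np.pi * 6 * t) + 0.5 * np.sin(2 * np.pi * 10 * t)
print(theta_alpha_ratio(epoch))  # ratio > 1, i.e. theta dominates
```

A real system would run this continuously over short sliding windows and map the ratio onto the video game display described above.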
In the first experiment, 22 students out of 36 were trained on two neurofeedback protocols (SMR and beta1), commonly used as tools for the enhancement of attention, and, following this, on a deep relaxation alpha/theta (a/t) protocol. In addition, a second group of 12 was engaged in a regime of weekly physical exercise and a mental skills training programme derived from applications in sports psychology. A third group consisted of a scholastic grade and age matched no-training group, which served as a control.
In the second experiment, a different cohort of students were randomly allocated to one of six training groups: alpha/theta neurofeedback, beta1 neurofeedback, SMR neurofeedback, physical exercise, mental skills training, or a group that engaged in Alexander Technique training.
All of the students who received neurofeedback training were found to have improved their performances marginally compared with those who received other forms of training, but those who had received the alpha/theta (a/t) protocol improved their performance the most. The range of improvement in performance for the alpha/theta group was between 13.5 per cent and 17 per cent.
Professor John Gruzelier, from Imperial College London at Charing Cross Hospital, and senior author of the study, adds: "These results show that neurofeedback can have a marked effect on musical performance. The alpha/theta training protocol has found promising applications as a complementary therapeutic tool in post-traumatic stress disorder and alcoholism. While it has a role in stress reduction by reducing the level of stage fright, the magnitude and range of beneficial effects on artistic aspects of performance have wider implications than alleviating stress."
In light of other recent results about musical training improving verbal abilities it would be interesting to know whether neurofeedback training can improve verbal abilities as well.
Update: Links to published papers of their research available here.
Now, however, findings published in April in The New England Journal of Medicine strongly suggest not only that any amount of lead is harmful to a child's brain but also that greater damage seems to occur at levels below 10 micrograms than above that. In other words, there is no threshold for lead's effects on the brain, and just small amounts seem to have relatively large effects. If a blood level of, say, 15 micrograms can shave 2 points off a child's IQ, then a level of 5 micrograms might reduce IQ by 5 points or more.
Dr. Richard L. Canfield of Cornell University led the analysis of data collected in a lead dust control research program. The conclusions published in April are significant because they are based on a study that looked, for the first time ever, at a population of children whose blood lead concentration never went above the government’s current benchmark of 10 µg/dL. Dr. Canfield and his co-investigators found that children with blood lead levels below 10 µg/dL had a decrease in IQ of 1.37 points for every increment of 1 µg/dL of blood lead burden. This is actually higher than the one-half-point decrease per 1 µg/dL that has been consistently found in populations with children above the official 10 µg/dL limit.
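Putting the two slopes together gives a crude piecewise picture of the dose-response relationship described above. The exact functional form here is my own simplification for illustration, not Canfield's statistical model:

```python
# A crude piecewise reading of the dose-response figures quoted above:
# roughly 1.37 IQ points lost per ug/dL below 10 ug/dL of blood lead,
# and about 0.5 points per ug/dL beyond that. Purely illustrative.
def iq_points_lost(blood_lead_ug_dl: float) -> float:
    if blood_lead_ug_dl <= 10:
        return 1.37 * blood_lead_ug_dl
    return 1.37 * 10 + 0.5 * (blood_lead_ug_dl - 10)

print(iq_points_lost(5))   # about 6.85 points at half the official limit
print(iq_points_lost(15))  # about 16.2 points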
Note the latter report is written by socialists who blame the use of lead on the profit system. I'd be curious to see data about lead levels collected from people in the USSR back when the USSR still existed. My guess is that they used lead paint and lead in gasoline as much as was the case in the US. Also, the US banned lead in gasoline a couple of decades sooner than some of the less capitalistic European countries.
The half life of lead in bones is 20 to 30 years. So if someone has been exposed to lead for a long time the bones will be a continuing source of release of lead into the blood for years after exposure to lead has stopped. Good nutrition helps. There is a fair amount of research literature on the protective effects of thiamin for instance. Because of the lack of restrictions on lead release in India a number of scientists have looked for cheap ways to protect people from lead toxicity and thiamin has been found beneficial for humans and animals (PDF format). Also, adequate calcium in the diet may decrease lead absorption.
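The slow release from bone can be made concrete with the half-life formula: the fraction of the original bone lead burden remaining after t years is 0.5 raised to the power t divided by the half-life. A quick sketch, assuming a 25-year half-life in the middle of the quoted range:

```python
# With a 20-to-30 year half-life, lead stored in bone leaches out for decades.
# Fraction of the original burden remaining after t years: 0.5 ** (t / half_life).
def fraction_remaining(years: float, half_life: float = 25.0) -> float:
    return 0.5 ** (years / half_life)

for years in (5, 10, 25, 50):
    print(years, round(fraction_remaining(years), 2))
```

So even a decade after exposure ends, roughly three quarters of the stored lead is still there to leach into the blood.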
Given the long term economic costs of lower IQs (lower incomes, higher crime rates, etc.) this latest result argues for more vigorous efforts to remove lead from old houses and apartments.
Down-regulating a single gene in aged mice boosted their mental functioning to be more like younger mice.
All of us experience a successive decline in learning and memory capacities with ageing. In the course of their investigations of the neurophysiological basis of this decline, Thomas Blank, Ingrid Nijholt, Min-Jeong Kye, Jelena Radulovic, and Joachim Spiess from the Max Planck Institute for Experimental Medicine in Göttingen have obtained new insight into the mechanisms of age-related learning deficits in the mouse model. In experiments with mice, the Max Planck researchers were able to revert the observed age-related learning and memory deficits by down-regulation of calcium-activated potassium channels (SK3) located in the hippocampus, a brain region recognized to be important for learning and memory. The researchers published their results as a Brief Communication in the journal Nature Neuroscience.
This brings up the obvious question: if the human equivalent of the SK3 gene could be down-regulated, would old minds regain some of their lost youthful ability? It may not be that easy, because the number of calcium-activated potassium channels in aged human brains might be higher in order to compensate for some other change caused by aging.
A different research team has just published a paper showing that just as humans have a measure of general intelligence called 'g' mice have their own measurable 'g' for general intelligence.
Mice have a version of 'g', according to a team led by Louis Matzel of Rutgers University in Piscataway, New Jersey. Animals that come top in one learning test often score better on others, they found: a maze champion might be a sniffing sensation too. "Once in a while you come across one that's absolutely stunning," says Matzel.
Both of these results are important because it is a lot easier to do work on mice than on humans. The latter result is particularly interesting because genes that have variations that affect mouse intelligence may turn out to have equivalents in humans that also have variations that affect intelligence in humans.
It would be great if human learning could be enhanced so easily.
A new study by researchers in Italy and the United States has found 140 genes, located in an area of the brain called the hippocampus, that had significantly altered activity when rats navigated a water maze. By enhancing the protein product of one of those genes, the scientists significantly boosted the rodents' spatial learning ability.
By "enhancing" it sounds like all they did was to inject a protein called fibroblast growth factor 18:
In another experiment, Alkon's group showed that they could improve spatial learning in rats by injecting them with FGF-18.
Update: It sounds like they first used DNA microarray technology to identify the list of genes that were upregulated during learning as a clue as to which compounds made by the neurons might enhance learning. The Reuters Health write-up gives the clearest indication of what they did:
There were six major groups of memory-related genes, with the largest being genes involved in cell signaling. One of these signaling genes contains the blueprints for a substance called fibroblast growth factor (FGF)-18. Cavallaro's team found that injecting extra FGF-18 into the rats' brains improved their ability to learn.
They identified the genes activated during learning, categorized the genes by type of function, and then focused on genes that are involved in cell signalling. In other words they focused on genes that control other genes and that control other parts of the cell. This allowed them to narrow down their candidates for intervention to compounds that would have the best chance to have an impact on cell development under the conditions of learning.
Their ability to carry out this experiment was made possible by advances in DNA microarray technology (Affymetrix is best known for its DNA microarray assay technology) that allow researchers to watch the regulatory state of thousands of genes at once. In this case the researchers watched the state of 2500 genes in order to discover the 140 that were involved in learning. The development of better tools to monitor biological systems accelerates the rate at which the function of cells can be puzzled out.
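The winnowing step can be caricatured in a few lines: compare each gene's expression level before and after learning and keep the ones that changed substantially. The gene names (other than fgf18) and the two-fold threshold below are made up for illustration; the real study applied formal statistics across roughly 2,500 genes:

```python
# Toy version of a microarray screen: keep genes whose expression changed
# at least two-fold between the "before" and "after" conditions.
# Gene names other than fgf18, and the threshold, are hypothetical.
def changed_genes(before: dict, after: dict, fold: float = 2.0) -> list:
    hits = []
    for gene, base in before.items():
        ratio = after[gene] / base
        if ratio >= fold or ratio <= 1.0 / fold:
            hits.append(gene)
    return hits

before = {"fgf18": 10.0, "actb": 100.0, "creb1": 8.0}
after = {"fgf18": 35.0, "actb": 105.0, "creb1": 3.0}
print(changed_genes(before, after))  # ['fgf18', 'creb1']
```

Filtering the survivors down to the cell-signalling genes, as the researchers did, is then what points to candidates like FGF-18 for intervention.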