Those of you trying to learn complex classical music pieces should probably listen to the music while sleeping.
EVANSTON, Ill. – Want to nail that tune that you've practiced and practiced? Maybe you should take a nap with the same melody playing during your sleep, new provocative Northwestern University research suggests.
The research builds on existing evidence suggesting that memories can be reactivated during sleep and that their storage can be strengthened in the process.
In the Northwestern study, research participants learned how to play two artificially generated musical tunes with well-timed key presses. Then while the participants took a 90-minute nap, the researchers presented one of the tunes that had been practiced, but not the other.
I doubt this technique would help if applied to listening to book tapes while you sleep. Though perhaps people with sleep apnea (who wake up often) could take in snippets of text every time they wake up.
But be careful about what you try to use sleep to remember. Sleep helps form memories of distressing images. So perhaps stay up late after a really bad experience you don't want to remember in great detail.
In testosterone-deficient men, major weight loss was an added benefit of testosterone replacement therapy for most of the patients who participated in a new study. The results will be presented Saturday at The Endocrine Society's 94th Annual Meeting in Houston.
"The substantial weight loss found in our study—an average of 36 pounds—was a surprise," said the study's lead author, Farid Saad, PhD, of Berlin-headquartered Bayer Pharma.
Although prior studies using testosterone therapy in testosterone-deficient men consistently show changes in body composition, such as increased lean mass and decreased fat mass, Saad said the net effect on weight seemed unchanged in those studies. However, Saad said their study, which took place in Germany, had a longer follow-up by at least two years and used long-acting injections of testosterone.
Also, testosterone decline is very often due to obesity and depression. So then weight gain and testosterone decline seem like a vicious cycle.
A new study finds that a drop in testosterone levels over time is more likely to result from a man's behavioral and health changes than from aging itself. The study results will be presented Monday at The Endocrine Society's 94th Annual Meeting in Houston.
"Declining testosterone levels are not an inevitable part of the aging process, as many people think," said study co-author Gary Wittert, MD, professor of medicine at the University of Adelaide in Adelaide, Australia. "Testosterone changes are largely explained by smoking behavior and changes in health status, particularly obesity and depression."
More sex will boost male testosterone levels. Sated lust is good for health.
We've got neural circuits in the prefrontal cortex of our brains that model how the values of others differ from our own.
Researchers at the RIKEN Brain Science Institute (BSI) in Japan have uncovered two brain signals in the human prefrontal cortex involved in how humans predict the decisions of other people. Their results suggest that the two signals, each located in distinct prefrontal circuits, strike a balance between expected and observed rewards and choices, enabling humans to predict the actions of people with different values than their own.
What I wonder: If the techniques these researchers developed were applied to the study of autistic test subjects, what would be found? Do higher functioning autistics do well or poorly (as compared to neurotypicals) at simulating the value systems of others? More generally: do autistics have broad deficiencies in their ability to simulate the minds of others? Or do they have only specific deficiencies in their capabilities to simulate certain aspects of other minds while being quite strong in other areas? Do they even have special strengths in mind simulation? Are they weak at simulating to predict emotional reactions while being strong at simulating, say, values? Also, do the patterns of simulation weaknesses and strengths vary from autistic to autistic? I would expect that to be the case because the many inherited and de novo mutations associated with autism cause different autistic attributes.
Our minds have mental circuits for simulating the minds of others.
Learning another person's values and mental processes is often assumed to require simulation of the other's mind: using one's own familiar mental processes to simulate unfamiliar processes in the mind of the other. While simple and intuitive, this explanation is hard to prove due to the difficulty in disentangling one's own brain signals from those of the simulated other.
Research scientists Shinsuke Suzuki and Hiroyuki Nakahara, a Principal Investigator of the Laboratory for Integrated Theoretical Neuroscience at RIKEN BSI, together with their collaborators, set out to disentangle these signals using functional Magnetic Resonance Imaging (fMRI) on humans. First, they studied the behavior of subjects as they played a game by making predictions about the other's behavior based on the knowledge of others and their decisions. Then they generated a computer model of the simulation process to examine the brain signals underlying the prediction of the other's behavior.
Parts of the simulation of values of others take place in the ventromedial prefrontal cortex and the dorsomedial prefrontal cortex.
The authors found that humans simulate the decisions of other people using two brain signals encoded in the prefrontal cortex, an area responsible for higher cognition (Figure 1). One signal involves the estimated value of the reward to the other person, and is called the reward signal, referring to the difference between the other's values, simulated in one's mind, and the reward benefit that the other actually received. The other signal is called the action signal, relating to the other's expected action predicted by the simulation process in one's mind, and what the other person actually did, which may or may not be different. They found that the reward signal is processed in a part of the brain called the ventromedial prefrontal cortex. The action signal, on the other hand, was found in a separate brain area called the dorsomedial prefrontal cortex.
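The two signals described above map naturally onto a simple reinforcement-learning sketch. The toy model below is my own illustration, not the RIKEN team's actual model, and all names in it are hypothetical: it tracks a simulated reward prediction error (analogous to the vmPFC reward signal) and a simulated action prediction error (analogous to the dmPFC action signal) while observing another agent's choices.

```python
# Toy "simulation of another's mind" learner (illustration only, not the
# RIKEN team's actual model). It maintains two quantities analogous to
# the two signals described above:
#   - a simulated reward prediction error (the vmPFC-like reward signal)
#   - a simulated action prediction error (the dmPFC-like action signal)

def simulate_other(trials, alpha_r=0.3, alpha_a=0.3):
    """Learn to predict another agent's choices between two options.

    trials: list of (choice, reward) pairs observed from the other agent,
            with choice in {0, 1} and reward in {0, 1}.
    """
    values = [0.5, 0.5]   # simulated values the other assigns to each option
    p_one = 0.5           # predicted probability the other picks option 1
    for choice, reward in trials:
        # Reward signal: reward the other actually received minus the
        # value we simulated on the other's behalf.
        reward_error = reward - values[choice]
        values[choice] += alpha_r * reward_error
        # Action signal: what the other actually did minus what we
        # predicted the other would do.
        action_error = choice - p_one
        p_one += alpha_a * action_error
    return values, p_one
```

After watching an agent repeatedly choose option 1 and get rewarded for it, both the simulated value of option 1 and the predicted probability of choosing it climb toward 1.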
What I also wonder: when faced with large value differences are some people more inclined to feel stressed or hostile than others? Is this due to innate differences caused by genetic variants?
We need better drugs for our emotions. Check out this important research that highlights the need for drugs to manage desire and love.
Along with colleagues in the USA and Switzerland, Pfaus analyzed the results from 20 separate studies that examined brain activity while subjects engaged in tasks such as viewing erotic pictures or looking at photographs of their significant others. By pooling this data, the scientists were able to form a complete map of love and desire in the brain.
They found that two brain structures in particular, the insula and the striatum, are responsible for tracking the progression from sexual desire to love. The insula is a portion of the cerebral cortex folded deep within an area between the temporal lobe and the frontal lobe, while the striatum is located nearby, inside the forebrain.
Love and sexual desire activate different areas of the striatum. The area activated by sexual desire is usually activated by things that are inherently pleasurable, such as sex or food. The area activated by love is involved in the process of conditioning by which things paired with reward or pleasure are given inherent value. That is, as feelings of sexual desire develop into love, they are processed in a different place in the striatum.
The area of the striatum where love develops is also the area involved in drug addiction. Rewarded sexual desire develops into the addiction.
Somewhat surprisingly, this area of the striatum is also the part of the brain that is associated with drug addiction. Pfaus explains there is good reason for this. "Love is actually a habit that is formed from sexual desire as desire is rewarded. It works the same way in the brain as when people become addicted to drugs."
Habit? Love is an addiction.
Love dampens down parts of the brain involved in desire. Obviously we need a drug that will prevent loss of desire.
While love may be a habit, it's not necessarily a bad one. Love activates different pathways in the brain that are involved in monogamy and in pair bonding. Some areas in the brain are actually less active when a person feels love than when they feel desire. "While sexual desire has a very specific goal, love is more abstract and complex, so it's less dependent on the physical presence of someone else," says Pfaus.
It takes an artist to understand. Love is the drug.
You're going to have to face it. Addicted to love.
Scared by violent mice? Live in fear of murine madness? Scientists have found a way to block mouse rage.
Pathological rage can be blocked in mice, researchers have found, suggesting potential new treatments for severe aggression, a widespread trait characterized by sudden violence, explosive outbursts and hostile overreactions to stress.
Blocking the NMDA receptor moderates mouse aggression. A drug that targets NMDA receptors in humans might cut human aggression as well.
In a study appearing today in the Journal of Neuroscience, researchers from the University of Southern California and Italy identify a critical neurological factor in aggression: a brain receptor that malfunctions in overly hostile mice. When the researchers shut down the brain receptor, which also exists in humans, the excess aggression completely disappeared.
The findings are a significant breakthrough in developing drug targets for pathological aggression, a component in many common psychological disorders including Alzheimer's disease, autism, bipolar disorder and schizophrenia.
Suppose a gene therapy gets developed that will so reduce NMDA receptor activity that violent prisoners could have their chances of committing violent acts reduced by a few orders of magnitude. Would you favor use of the NMDA-blocking gene therapy as a condition for parole of a violent felon?
Canola oil and olive oil, both high in monounsaturated fats, do a better job of enabling carotenoid absorption than polyunsaturated fats such as corn oil and soybean oil.
In a human trial, researchers fed subjects salads topped off with saturated, monounsaturated and polyunsaturated fat-based dressings and tested their blood for absorption of fat-soluble carotenoids – compounds such as lutein, lycopene, beta-carotene and zeaxanthin. Those carotenoids are associated with reduced risk of several chronic and degenerative diseases such as cancer, cardiovascular disease and macular degeneration.
The study, published early online in the journal Molecular Nutrition & Food Research, found that monounsaturated fat-rich dressings required the least amount of fat to get the most carotenoid absorption, while saturated fat and polyunsaturated fat dressings required higher amounts of fat to get the same benefit.
So to absorb the beneficial compounds out of vegetables eat them with small amounts of olive oil or canola oil.
Olive oil and canola oil are effective at lower doses.
Monounsaturated fat-rich dressings, such as canola and olive oil-based dressings, promoted equivalent carotenoid absorption at 3 grams of fat as at 20 grams, suggesting that this lipid source may be a good choice for those craving lower fat options but still wanting to optimize absorption of health-promoting carotenoids from fresh vegetables.
Would a more vigorous immune system help prevent Alzheimer's disease? The immune system looks like it protects our brains from beta-amyloid build-up. So would immune system rejuvenation protect the brain from Alzheimer's?
Recent work in mice suggested that the immune system is involved in removing beta-amyloid, the main Alzheimer's-causing substance in the brain. Researchers have now shown for the first time that this may apply in humans.
Researchers at the Peninsula College of Medicine and Dentistry, University of Exeter with colleagues in the National Institute on Aging in the USA and in Italy screened the expression levels of thousands of genes in blood samples from nearly 700 people. The telltale marker of immune system activity against beta-amyloid, a gene called CCR2, emerged as the top marker associated with memory in people.
The immune system ages along with the rest of the body. Is Alzheimer's disease partly caused by the immune system becoming too sluggish to prevent beta-amyloid build-up in the brain?
A rejuvenated immune system would offer a number of advantages. Most obviously, old people would be at less risk of death from pneumonia or assorted infections picked up in hospitals. But also, an aged immune system is associated with an increased risk of cancer. In fact, rare people have an especially anti-cancer immune system. So a combined rejuvenation and anti-cancer enhancement of the immune system would cut cancer risks.
Since the immune system also removes metabolic trash from the body, a rejuvenated immune system would likely reduce the rate of overall body aging by preventing the build-up of harmful cell secretions such as beta-amyloid.
Among the problems: dangerous gases in air flight corridors. But crop failures strike me as a bigger concern.
WASHINGTON—A modern recurrence of an extraordinary type of volcanic eruption in Iceland could inject large quantities of hazardous gases into North Atlantic and European flight corridors, potentially for months at a time, a new study suggests. Using computer simulations, researchers are investigating the likely atmospheric effects if a “flood lava” eruption took place in Iceland today. Flood lava eruptions, which stand out for the sheer amounts of lava and sulfurous gases they release and the way their lava sprays from cracks like fiery fountains, have occurred in Iceland four times in roughly the past thousand years, records indicate, the most recent being the deadly and remarkable eruption of Iceland’s volcano Laki in 1783-84.
In my view the human race has been lucky in terms of the severity of geological phenomena since the late 19th century. The 19th century saw many more severe natural events than the 20th did. We might be overdue. Since volcanic eruptions in Iceland on a scale similar to Laki's 1783-1784 eruption have happened about 4 times in the last thousand years, it should not surprise us if a similar eruption occurs in the 21st century.
When Laki sprang to life on June 8, 1783, it generated a sulfuric acid haze that dispersed over Iceland, France, England, the Netherlands, Sweden, Italy, and other countries. It killed a fifth of Iceland’s population and three-quarters of the island’s livestock. It also destroyed crops, withered vegetation, and sowed human disease and death in several Northern European nations. During the eight months that Laki erupted, the volcano blasted 122 million tons of sulfur dioxide into the atmosphere – seven times more than did the 1991 Mt. Pinatubo eruption in the Philippines and approximately 50 to 100 times more per day than Iceland’s Eyjafjallajökull volcano released in 2010.
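As a back-of-envelope sanity check on those figures (my own arithmetic, assuming the eight-month eruption ran roughly 240 days):

```python
# Back-of-envelope check of the SO2 figures quoted above.
# Assumption (mine): the eight-month eruption lasted ~240 days.

laki_total_tons = 122e6                # SO2 over the whole eruption
laki_per_day = laki_total_tons / 240   # ~508,000 tons/day

# "seven times more" than Pinatubo implies Pinatubo released
# about 17 million tons, in line with commonly cited estimates.
pinatubo_total = laki_total_tons / 7

# "50 to 100 times more per day" than Eyjafjallajokull implies
# that volcano released roughly 5,000-10,000 tons/day in 2010.
eyja_per_day_low = laki_per_day / 100
eyja_per_day_high = laki_per_day / 50
```

So the quoted multiples are internally consistent with the 122-million-ton total.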
Well that does not sound like fun.
Today such an eruption would surely cause a couple of years of global cooling (just as the Mt Pinatubo eruption of 1991 cooled the lower atmosphere). The researchers in the latest report focus on aviation impacts. The air in some air lanes would become unhealthy for humans.
In the new simulations – focusing again on the first month of the eruption – average daily concentrations of the droplets, in up to 10 percent of the air space, would exceed 10 times London’s average daily concentration of the corrosive pollutant, the researchers found.
Earth has so many volcanoes waiting to erupt. The next doozy might be a repeat of big eruptions in Nicaragua.
SELFOSS, ICELAND—Giant volcanic eruptions in Nicaragua over the past 70,000 years could have injected enough gases into the atmosphere to temporarily thin the ozone layer, according to new research. And, if it happened today, a similar explosive eruption could do the same, releasing more than twice the amount of ozone-depleting halogen gases currently in the stratosphere due to man-made emissions.
So many ways to get whacked by our planet, sun, and asteroids.
An overabundance of multipotent progenitor stem cells and a decline in tumor-suppressing myoepithelial cells likely combine to create conditions favorable for cancer to develop.
Mark LaBarge, a cell and molecular biologist in Berkeley Lab’s Life Sciences Division, led a study in which it was determined that aging causes an increase in multipotent progenitors – a type of adult stem cell believed to be at the root of many breast cancers – and a decrease in the myoepithelial cells that line the breast’s milk-producing luminal cells and are believed to serve as tumor suppressors.
This result reminds me of Aubrey de Grey's argument that rejuvenation therapies are the best way to prevent diseases of old age. A key goal for rejuvenation is to get rid of cells that work poorly or which do harm. Another key goal is to increase the supply of cells that do necessary repair and regulatory functions. Achieve both those goals in breasts and the incidence of breast cancer would drop sharply.
“This is a big step towards understanding the cellular basis for age-related vulnerability to breast cancer,” LaBarge says. “Now that we have defined some of the cell and molecular changes that occur in the epithelium during the aging process and we have the ability to assay them functionally, it should be possible to look for ways to avoid those states and perhaps even reverse them.”
Stem cell therapies and other cell therapies aren't just about doing repairs. If we can replace aged stem cells with youthful stem cells we can reduce the risk of cancer. Stem cells collect mutations as they age. Old mutated cells do a poorer job at tissue repair while, at the same time, the accumulation of dangerous combinations of mutations puts stem cells closer to the point where they will start dividing uncontrollably.
The use of stem cell therapies to reduce cancer risk looks challenging because the new stem cells have to displace the existing aged stem cells.
CAMBRIDGE, Mass. -- One of the biggest risk factors for liver, colon or stomach cancer is chronic inflammation of those organs, often caused by viral or bacterial infections. A new study from MIT offers the most comprehensive look yet at how such infections provoke tissues into becoming cancerous.
The study, which is appearing in the online edition of Proceedings of the National Academy of Sciences the week of June 11, tracked a variety of genetic and chemical changes in the livers and colons of mice infected with Helicobacter hepaticus, a bacterium similar to Helicobacter pylori, which causes stomach ulcers and cancer in humans.
What I wonder: How much cancer could be prevented if we all got tested for chronic infections and got treated for any infections found? Or could a round of antibiotics without even first testing for a bacterial infection cut cancer risks?
The immune system generates toxic chemicals to kill bacteria. But those toxic chemicals also damage tissue in ways that can lead to cancer.
In the colon, but not the liver, neutrophils secreted hypochlorous acid (also found in household bleach), which significantly damages proteins, DNA and RNA by adding a chlorine atom to them. The hypochlorous acid is meant to kill bacteria, but it also leaks into surrounding tissue and damages the epithelial cells of the colon. The researchers found that levels of one of the chlorine-damage products in DNA and RNA, chlorocytosine, correlated well with the severity of the inflammation, which could allow them to predict the risk of chronic inflammation in patients with infections of the colon, liver or stomach. Tannenbaum recently identified another chlorine-damage product in proteins: chlorotyrosine, which correlates with inflammation. While these results point to an important role for neutrophils in inflammation and cancer, "we don't know yet if we can predict the risk for cancer from these damaged molecules," Dedon says.
Unfortunately, we live in a time when drug resistant strains of tuberculosis, gonorrhoea, and other bacteria are spreading globally. Be careful. Bacteria should be taken as a serious threat to health once again.
Downing, Ross, and colleagues reviewed drug approval decisions of the FDA, the Canadian drug regulator, Health Canada, and the European Medicines Agency (EMA) between 2001 and 2010. They studied each regulator's database of drug approvals to identify novel therapeutics as well as the timing of key regulatory events, allowing regulatory review speed to be calculated. Canada and Europe were chosen as a comparison because they face similar pressures to approve new drugs quickly while ensuring they do not put patients at risk.
Drugs reach the market in the United States sooner.
The team found that the median total time to review was 322 days at FDA, 366 days at EMA and 393 days at Health Canada.
"Among the subsample of drugs approved for all three regulators, the FDA's reviews were over three months faster than those of the EMA or Health Canada," said Downing. "The total review time at the FDA was faster than EMA, despite the FDA's far higher proportion of applications requiring multiple regulatory reviews."
Downing added that most new drug therapies were first approved for use in the U.S. "Examining novel drugs approved in multiple markets, we found that 64% of medicines approved in both the U.S. and in Europe were approved for U.S. patients first, and 86% of medicines approved in both the U.S. and Canada were also approved first in the U.S." he said.
I would like to see a much faster review process for drugs for fatal diseases. People who already have a diagnostic death sentence should be more free to try unproven drugs. If you get told you have just months to live the drug regulatory agencies can't protect you as much as they can kill you by preventing you from trying experimental drugs that are your last chance before checking out of the Life Hotel.
Chromosomal deletions in DNA often involve just one of two gene copies inherited from either parent. But scientists haven't known how a deletion in one gene from one parent, called a "hemizygous" deletion, can contribute to cancer.
A research team led by Stephen Elledge, a professor in the Department of Genetics at Harvard Medical School, and his post-doctoral fellow Nicole Solimini, has now provided an answer. The most common hemizygous deletions in cancer, their research shows, involve a variety of tumor suppressing genes called STOP genes (suppressors of tumorigenesis and proliferation) that scatter randomly throughout the genome, but that sometimes cluster in the same place on a chromosome. And these clusters, said Elledge, who is also a professor of medicine at Brigham and Women's Hospital, tend to be deleted as a group. "Eliminating the cluster gives a bigger bang for the deletion buck," he said.
So I've got a modest proposal for genetically reengineering the human genome to reduce the risk of cancer: make big deletions of multiple tumor suppressor genes lethal for the cell. How? Mix absolutely essential genes in between the tumor suppressors. If the suppressors get deleted in a big block then the cell should die due to loss of essential genes. Basically, do genetic layout so those big deletion mutations are lethal.
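The layout idea can be illustrated with a toy model (entirely hypothetical; real genomes are vastly more complicated, and this ignores how deletions actually arise): interleaving essential genes between tumor suppressors means any contiguous deletion spanning more than one suppressor also removes an essential gene and kills the cell.

```python
# Hypothetical gene layouts: "T" = tumor suppressor, "E" = essential gene.

def deletion_outcome(layout, start, length):
    """Classify a contiguous deletion of `length` genes starting at `start`."""
    deleted = layout[start:start + length]
    if "E" in deleted:
        return "cell dies"       # an essential gene was lost
    if deleted.count("T") > 1:
        return "cancer-prone"    # multiple suppressors lost, cell survives
    return "survives"            # at most one suppressor lost

clustered = ["T"] * 6 + ["E"] * 6   # suppressors clustered on the chromosome
interleaved = ["T", "E"] * 6        # essential genes mixed between suppressors

print(deletion_outcome(clustered, 0, 3))    # three suppressors lost: "cancer-prone"
print(deletion_outcome(interleaved, 0, 3))  # deletion hits an essential gene: "cell dies"
```

In the clustered layout a single deletion can wipe out several suppressors and leave a viable, cancer-prone cell; in the interleaved layout the same deletion is lethal to that cell.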
Your new word for the day: haploinsufficient. Try to mix it into everyday conversation.
This finding is especially interesting in light of the two-hit model of cancer formation, which holds that both copies of a recessive gene need to be inactivated to trigger a biological effect. Thus the loss of a single tumor suppressor copy should have little or no influence on tumor cell proliferation because the remaining copy located on the other chromosome is there to pick up the slack.
Elledge's research points to a different hypothesis, namely that STOP genes in a hemizygous deletion aren't recessive but are instead haploinsufficient, meaning that they depend on two copies to function normally. "If a tumor suppressor is haploinsufficient, then a single gene copy lacks the potency needed to fully restrain tumorigenesis," explained Elledge, who is also a Howard Hughes Medical Institute Investigator. "So by removing clusters of haploinsufficient genes all at once, the cancer cell immediately propels its growth forward without having to wait for the other copies to also be lost."
Perhaps 10 or 20 years hence stem cells intended for therapies will get genetically enhanced to be far less susceptible to mutations that could turn them cancerous. By rearranging the genome the number of mutations needed to make a cancerous cell could be greatly increased, with cells becoming more likely to die than become cancerous when they lose genes involved in protection against cancer.
Since some rare people have immune systems which aggressively attack cancers we should also figure out what makes their genomes better in this regard. Then we should develop gene therapies and cell therapies that enhance our immune systems to snuff out early stage cancers.
Many people flatter themselves that they've got firm, unchanging, and incorruptible moral compasses. Yet people can be easily swayed to adopt different moral positions by what role they think they are playing.
CORVALLIS, Ore. – An individual’s sense of right or wrong may change depending on their activities at the time – and they may not be aware of their own shifting moral integrity — according to a new study looking at why people make ethical or unethical decisions.
Focusing on dual-occupation professionals, the researchers found that engineers had one perspective on ethical issues, yet when those same individuals were in management roles, their moral compass shifted. Likewise, medic/soldiers in the U.S. Army had different views of civilian casualties depending on whether they most recently had been acting as soldiers or medics.
One wonders: As assorted occupations get automated out of existence and people shift into other occupations what is the net effect on moral perspectives? What ethical positions are people becoming more likely to take because of growth in some occupations? Which moral positions are becoming more of rarities as factories get automated or because functions previously done by people meeting face-to-face are now done on the phone or through web pages?
Just hints to a person about what role they should use as their perspective caused them to take different ethical positions.
The researchers conducted three different studies with employees who had dual roles. In one case, 128 U.S. Army medics were asked to complete a series of problem-solving tests, which included subliminal cues that hinted they might be acting as either a medic or a soldier. No participant said the cues had any bearing on their behavior – but apparently they did. A much larger percentage of those in the medic category than in the soldier category were unwilling to put a price on human life.
In another test, a group of engineer-managers were asked to write about a time they either behaved as a typical manager, engineer, or both. Then they were asked whether U.S. firms should engage in “gifting” to gain a foothold in a new market. Despite the fact such a practice would violate federal laws, more than 50 percent of those who fell into the “manager” category said such a practice might be acceptable, compared to 13 percent of those in the engineer category.
Are more people thinking like managers? Do they compensate for their managerial ethics by becoming more altruistic in other areas? Or does managerial ethical thinking pervade their ethical calculations in other aspects of their lives?
Do you find your ethical positions more influenced by online communities where you play a role? Do you have more or less contact with humans than you did in your job 10 years ago? Do you sense your ethical perspective shifting? If so, in what directions?
One thing I see changing: As people work with and chat online with people who come from distant places, people are growing their in-groups. There is less local focus and more of a recognition of the need to form and maintain relationships with people in distant places and to incorporate the perspectives and interests of distant groups into one's own moral calculations.
Need brain rejuvenation? Time to take a skin cell sample and use it to make brain stem cells. The brain stem cells have both research and therapeutic potential.
SAN FRANCISCO, CA—June 7, 2012—Scientists at the Gladstone Institutes have for the first time transformed skin cells—with a single genetic factor—into cells that develop on their own into an interconnected, functional network of brain cells. The research offers new hope in the fight against many neurological conditions because scientists expect that such a transformation—or reprogramming—of cells may lead to better models for testing drugs for devastating neurodegenerative conditions such as Alzheimer's disease.
These scientists see an advantage in their technique because they do not convert the skin cells all the way into general purpose pluripotent stem cells. The fear with pluripotent stem cells is that they might go rogue in the body and act like cancer. Pluripotent stem cells simply have too much potential and can convert into too many other cell types.
In findings appearing online today in Cell Stem Cell, researchers in the laboratory of Gladstone Investigator Yadong Huang, MD, PhD, describe how they transferred a single gene called Sox2 into both mouse and human skin cells. Within days the skin cells transformed into early-stage brain stem cells, also called induced neural stem cells (iNSCs). These iNSCs began to self-renew, soon maturing into neurons capable of transmitting electrical signals. Within a month, the neurons had developed into neural networks.
The transformation of cells into assorted specialized stem cell types will become less risky once scientists develop technology to cheaply screen out cells that carry too many dangerous mutations. We need stem cells with a good genetic state (no risky mutations or mutations that reduce functionality), a good epigenetic state (solidly in a desired state rather than a mix of states), and a youthful state (long telomere caps on the chromosomes).
Researchers at the Karolinska Institutet in Sweden have developed a vaccine against beta amyloid protein which might enable the immune system to basically gobble up this protein that causes death of neurons and their support cells.
The new treatment, which is presented in Lancet Neurology, involves active immunisation, using a type of vaccine designed to trigger the body's immune defence against beta-amyloid. In this second clinical trial on humans, the vaccine was modified to affect only the harmful beta-amyloid. The researchers found that 80 per cent of the patients involved in the trials developed their own protective antibodies against beta-amyloid without suffering any side-effects over the three years of the study. The researchers believe that this suggests that the CAD106 vaccine is a tolerable treatment for patients with mild to moderate Alzheimer's. Larger trials must now be conducted to confirm the CAD106 vaccine's efficacy.
One problem I see: the immune systems of older folks are much less effective due to aging. Will the aged be able to mount a big enough immune response to clear away the beta amyloid? Not clear.
Using scientific theories, toy ecosystem modeling and paleontological evidence as a crystal ball, 18 scientists, including one from Simon Fraser University, predict we're on a much worse collision course with Mother Nature than currently thought.
In Approaching a state-shift in Earth's biosphere, a paper just published in Nature, the authors, whose expertise span a multitude of disciplines, suggest our planet's ecosystems are careening towards an imminent, irreversible collapse.
Earth's accelerating loss of biodiversity, its climates' increasingly extreme fluctuations, its ecosystems' growing connectedness and its radically changing total energy budget are precursors to reaching a planetary state threshold or tipping point.
Once that happens, which the authors predict could be reached this century, the planet's ecosystems, as we know them, could irreversibly collapse in the proverbial blink of an eye.
As humans replace more of the wilderness with cities and farms, the planet loses stabilizing buffer areas. Ecosystems become simpler and more vulnerable to shocks. As we humans come to manage more of the Earth's surface and seas, we will have to become far more skilled at managing human-made ecosystems.
Maybe we'll develop the ability to model ecosystems in really complex computer simulations and these simulations might tell us where disaster looms and how to avert it. Then again, once the computers alert us to an approaching problem governments and their populaces may show themselves unwilling to make the needed levels of sacrifice.
The authors promote solutions that seem unlikely to be put into place.
The authors recommend governments undertake five actions immediately if we are to have any hope of delaying or minimizing a planetary-state-shift. Arne Mooers, an SFU biodiversity professor and a co-author of this study, summarizes them as follows.
"Society globally has to collectively decide that we need to drastically lower our population very quickly. More of us need to move to optimal areas at higher density and let parts of the planet recover. Folks like us have to be forced to be materially poorer, at least in the short term. We also need to invest a lot more in creating technologies to produce and distribute food without eating up more land and wild species. It's a very tall order."
I do not expect governments and their populaces to be willing to make very large sacrifices. Population growth and economic growth seem on course to cause the loss of many species and the expansion of human habitats at the expense of dwindling wild habitats. I'd like to be proven wrong on this point.
Coauthor Elizabeth Hadly from Stanford University said "we may already be past these tipping points in particular regions of the world. I just returned from a trip to the high Himalayas in Nepal, where I witnessed families fighting each other with machetes for wood – wood that they would burn to cook their food in one evening. In places where governments are lacking basic infrastructure, people fend for themselves, and biodiversity suffers. We desperately need global leadership for planet Earth."
The authors note that studies of small-scale ecosystems show that once 50-90 percent of an area has been altered, the entire ecosystem tips irreversibly into a state far different from the original, in terms of the mix of plant and animal species and their interactions. This situation typically is accompanied by species extinctions and a loss of biodiversity.
Overpopulation is the root cause. That used to be a major concern of environmentalists back in the 1970s. Are environmentalists going to reawaken to the problem as the situation becomes more dire?
Currently, to support a population of 7 billion people, about 43 percent of Earth's land surface has been converted to agricultural or urban use, with roads cutting through much of the remainder. The population is expected to rise to 9 billion by 2045; at that rate, current trends suggest that half Earth's land surface will be disturbed by 2025. To Barnosky, this is disturbingly close to a global tipping point.
Since the human population is going to rise and since industrialization will increase the land used per person it seems hard to imagine how we'll avoid the tipping point.
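A back-of-envelope check on those quoted numbers, under the simplifying assumption that converted land scales in direct proportion to population (the study's own projection also factors in roads and rising per-capita land use, so this is a lower-bound sketch, not the authors' method):

```python
# If 7 billion people use 43% of Earth's land surface, what share
# would 9 billion people use at the same per-capita footprint?
current_pop = 7e9          # people today
current_share = 0.43       # fraction of land surface converted
projected_pop = 9e9        # population expected by 2045

per_capita_share = current_share / current_pop
projected_share = per_capita_share * projected_pop
print(round(projected_share, 3))  # ~0.553, i.e. about 55% of the land surface
```

Even this conservative scaling lands well past the 50 percent mark the authors flag, which is why the tipping point looks hard to avoid.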
"We believe that ongoing loss of biological diversity is diminishing the ability of ecosystems to sustain human societies," says Andrew Gonzalez, Associate Professor with the Department of Biology and the Quebec Centre for Biodiversity Science at McGill University and author on the paper.
I'm pretty bearish on the 21st century future of Earth from an ecological standpoint.
Not really a new story. But here's more evidence for the protective effects of coffee. Caffeine and/or other chemicals in coffee cut the risk of developing Alzheimer's disease.
Tampa, FL (June 4, 2012) Those cups of coffee that you drink every day to keep alert appear to have an extra perk – especially if you're an older adult. A recent study monitoring the memory and thinking processes of people older than 65 found that all those with higher blood caffeine levels avoided the onset of Alzheimer's disease in the two-to-four years of study follow-up. Moreover, coffee appeared to be the major or only source of caffeine for these individuals.
Researchers from the University of South Florida (www.usf.edu) and the University of Miami (www.miami.edu) say the case control study provides the first direct evidence that caffeine/coffee intake is associated with a reduced risk of dementia or delayed onset. Their findings will appear in the online version of an article to be published June 5 in the Journal of Alzheimer's Disease, published by IOS Press (http://health.usf.edu/nocms/publicaffairs/now/pdfs/JAD111781.pdf). The collaborative study involved 124 people, ages 65 to 88, in Tampa and Miami.
Speaking as someone who is not a coffee fan: How else to get the benefit? See the bottom of the post for hints on how coffee delivers this benefit. It is a synergistic effect between caffeine and something else in coffee.
Maybe older adults with mild memory impairment ought to start heavy coffee drinking.
"These intriguing results suggest that older adults with mild memory impairment who drink moderate levels of coffee -- about 3 cups a day -- will not convert to Alzheimer's disease -- or at least will experience a substantial delay before converting to Alzheimer's," said study lead author Dr. Chuanhai Cao, a neuroscientist at the USF College of Pharmacy (http://health.usf.edu/nocms/pharmacy/) and the USF Health Byrd Alzheimer's Institute (http://health.usf.edu/nocms/byrd/). "The results from this study, along with our earlier studies in Alzheimer's mice, are very consistent in indicating that moderate daily caffeine/coffee intake throughout adulthood should appreciably protect against Alzheimer's disease later in life."
It is amazing that a beverage that humans have been drinking for centuries can deliver more brain protection benefit than any drug so far devised by the pharmaceutical industry and academic researchers.
The study shows this protection probably occurs even in older people with early signs of the disease, called mild cognitive impairment, or MCI. Patients with MCI already experience some short-term memory loss and initial Alzheimer's pathology in their brains. Each year, about 15 percent of MCI patients progress to full-blown Alzheimer's disease. The researchers focused on study participants with MCI, because many were destined to develop Alzheimer's within a few years.
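That 15 percent annual conversion rate compounds quickly, which is why a two-to-four year follow-up captures so many conversions. A quick calculation (assuming, for illustration, a constant rate applied independently each year):

```python
# Fraction of MCI patients expected to progress to Alzheimer's
# within n years, given a constant 15% annual conversion rate.
annual_rate = 0.15

def fraction_converted(years):
    # Probability of converting at least once in `years` years
    return 1 - (1 - annual_rate) ** years

print(round(fraction_converted(2), 2))  # ~0.28 after two years
print(round(fraction_converted(4), 2))  # ~0.48 after four years
```

So within the study's follow-up window, roughly a quarter to a half of an MCI cohort would be expected to convert, giving the researchers enough events to compare against caffeine levels.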
Blood caffeine levels at the study's onset were substantially lower (51 percent less) in participants diagnosed with MCI who progressed to dementia during the two-to-four year follow-up than in those whose mild cognitive impairment remained stable over the same period.
Sounds like some pretty heavy coffee drinking is needed for maximum benefit.
No one with MCI who later developed Alzheimer's had initial blood caffeine levels above a critical level of 1200 ng/ml – equivalent to drinking several cups of coffee a few hours before the blood sample was drawn. In contrast, many with stable MCI had blood caffeine levels higher than this critical level.
Coffee cuts other disease risks.
In addition to Alzheimer's disease, moderate caffeine/coffee intake appears to reduce the risk of several other diseases of aging, including Parkinson's disease, stroke, Type II diabetes, and breast cancer. However, supporting studies for these benefits have all been observational (uncontrolled), and controlled clinical trials are needed to definitively demonstrate therapeutic value.
A study tracking the health and coffee consumption of more than 400,000 older adults for 13 years, and published earlier this year in the New England Journal of Medicine, found that coffee drinkers reduced their risk of dying from heart disease, lung disease, pneumonia, stroke, diabetes, infections, and even injuries and accidents.
In 2011 this same USF research group showed that at least in mice caffeine and some other component(s) of coffee act to boost cytokines and their results suggest boosting these cytokines causes the benefit.
In both AβPPsw+PS1 transgenic mice and non-transgenic littermates, acute i.p. treatment with caffeinated coffee greatly and specifically increased plasma levels of granulocyte-colony stimulating factor (GCSF), IL-10, and IL-6. Neither caffeine solution alone (which provided high plasma caffeine levels) or decaffeinated coffee provided this effect, indicating that caffeine synergized with some as yet unidentified component of coffee to selectively elevate these three plasma cytokines. The increase in GCSF is particularly important because long-term treatment with coffee (but not decaffeinated coffee) enhanced working memory in a fashion that was associated only with increased plasma GCSF levels among all cytokines. Since we have previously reported that long-term GCSF treatment enhances cognitive performance in AD mice through three possible mechanisms (e.g., recruitment of microglia from bone marrow, synaptogenesis, and neurogenesis), the same mechanisms could be complimentary to caffeine's established ability to suppress Aβ production. We conclude that coffee may be the best source of caffeine to protect against AD because of a component in coffee that synergizes with caffeine to enhance plasma GCSF levels, resulting in multiple therapeutic actions against AD.
We need other ways to boost these cytokines. Anyone familiar with the research literature who knows other ways to do this?
Why make small steps forward when really big steps are possible? Princeton researchers have developed a way to make immunoassays for biomarkers (e.g. blood proteins for cancer or Alzheimer's) 3 million times more sensitive. So far fewer molecules need to be present to detect a medical condition. Earlier detection becomes possible.
The breakthrough involves a common biological test called an immunoassay, which mimics the action of the immune system to detect the presence of biomarkers – the chemicals associated with diseases. When biomarkers are present in samples, such as those taken from humans, the immunoassay test produces a fluorescent glow (light) that can be measured in a laboratory. The greater the glow, the more of the biomarker is present. However, if the amount of biomarker is too small, the fluorescent light is too faint to be detected, setting the limit of detection. A major goal in immunoassay research is to improve the detection limit.
The Princeton researchers tackled this limitation by using nanotechnology to greatly amplify the faint fluorescence from a sample. By fashioning glass and gold structures so small they could only be seen with a powerful electron microscope, the scientists were able to drastically increase the fluorescence signal compared to conventional immunoassays, leading to a 3-million-fold improvement in the limit of detection. That is, the enhanced immunoassay would require 3 million times fewer biomarkers to be present compared to a conventional immunoassay. (In technical terms, the researchers measured an improvement in the detection limit from 0.9 nanomolars to 300 attomolars.)
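The quoted improvement checks out arithmetically: converting both detection limits to molar concentration and taking their ratio recovers the 3-million-fold figure.

```python
# Ratio of conventional to enhanced immunoassay detection limits.
conventional_limit = 0.9e-9   # 0.9 nanomolar, in mol/L
enhanced_limit = 300e-18      # 300 attomolar, in mol/L

improvement = conventional_limit / enhanced_limit
print(f"{improvement:.0e}")   # 3e+06, i.e. a 3-million-fold improvement
```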
This technique doesn't just enable earlier detection. It also enables detection of compounds that never reach high concentrations. So expect more biomarkers to be found with this technique.
These researchers are now going to compare the sensitivity of this approach for cancers and Alzheimer's in hopes of earlier stage detection.
As next steps in his research, Chou said he is conducting tests to compare the sensitivity of the D2PA-enhanced immunoassay to a conventional immunoassay in detecting breast and prostate cancers. In addition he is collaborating with researchers at Memorial Sloan-Kettering Cancer Center in New York to develop tests to detect proteins associated with Alzheimer's disease at a very early stage.
"You can have very early detection with our approach," he said.
The scaling down of biotechnology to work at very small scales is yielding rapid advances, just as the scaling down of semiconductor devices enabled computers to become both cheaper and more powerful at the same time. This trend toward smaller scale devices gives me the most optimism about how soon we will get advances in biotechnology that will enable the development of rejuvenation therapies.
An Associated Press article takes a look at smart bomb drugs and other advances reported on in the recent 2012 annual meeting of the American Society of Clinical Oncology (ASCO).
—New “smart” drugs that deliver powerful poisons directly to cancer cells while leaving healthy ones alone.
—A new tool that helps the immune system attack a broad range of cancer types.
The basic problem with cancer is that cancer cells are your own cells. Their similarities to normal cells make them very hard to target without a lot of "friendly fire" deaths of healthy cells. Biotechnological advances (e.g. DNA sequencing) help in part because they allow identification of many more differences between cancer cells and normal cells. What is still an open question: how many unique (or nearly unique) features of cancer cells will be found? How hard will it be to exploit those differences to kill only cancer cells?
All the abstracts for the 2012 annual meeting of (ASCO) are online. Looking thru the category Immunotherapy and Biologic Therapy I count 103 abstracts.
With populations aging and the incidence of cancer soaring, pharmaceutical companies are investing more in cancer therapy development. As a result more of the research presented at ASCO has industry involvement. I see this as a good thing because it indicates an acceleration of the rate at which research will get translated into treatments brought to market.
They found that 48% of research accepted for presentation at the meeting in 2011 came from a group where at least one author had a relationship to industry—up from 39% of research presented in 2006. These ties to industry appeared to increase every year.
Interestingly, in a second related abstract by the same authors, Beverly Moy, M.D., M.P.H., clinical director of the Breast Oncology Program and a medical oncologist at the Massachusetts General Hospital, reported that high profile research—selected to be presented more prominently at the meeting—was more likely to come from scientists with relationships to industry. Studies from authors with ties to industry also tended to receive higher scores from their peers.
Industry is going to get more involved in treatments that show the most promise. With advances being made in underlying biotechnologies (e.g. cheap DNA sequencing, microfluidics, gene chips, biotechnologies for growing immune cells) used to identify targets for cancer therapy and for creating treatments the long war against cancer should start producing more victories in the next 20 years.
Key farming regions in the US are drawing water from underground sources at unsustainable rates, with slightly more than one-third of the southern Great Plains at risk of tapping out its sources within the next 30 years.
This is before taking into consideration risks of a megadrought.
"Basically, irrigated agriculture in much of the southern High Plains is unsustainable," said Scanlon.
Continued population growth combined with already overdrawn natural resources makes the future of many resources look pretty grim. One problem with this sort of depletion is that it removes a buffer. Imagine the Great Plains gets hit by a large drought lasting years or even decades. It has happened before. Without a big underground buffer of water the impact will be much more severe.
Given cheap enough energy water depletion would not have to make such a big impact. With very cheap energy (fusion energy some day?) water desalinization can provide us with as much water as we need, at least in coastal regions.
The Hubble Space Telescope keeps on giving. Brace for impact.
The Milky Way is set to collide with its closest neighbor, the Andromeda galaxy, astronomers working with the Hubble Space Telescope said Thursday. Galactic residents need not brace for impact just yet, however: The predicted collision would take place in 4 billion years.
So imagine you live long enough to still be around when rejuvenation therapies become available. Will any rejuvenated people from the 20th or 21st century survive millions of years? Billions of years? One would have to be both very risk avoidant and very lucky to make it that long. Plus, one would need to travel in a planet spaceship between stars when Sol gets too old. How far would humanity need to travel to get to a much younger star?
During the 2 billion year collision period our sun will be thrown farther away from the galactic core.
Computer simulations derived from Hubble's data show that it will take an additional two billion years after the encounter for the interacting galaxies to completely merge under the tug of gravity and reshape into a single elliptical galaxy similar to the kind commonly seen in the local universe.
Although the galaxies will plow into each other, stars inside each galaxy are so far apart that they will not collide with other stars during the encounter. However, the stars will be thrown into different orbits around the new galactic center. Simulations show that our solar system will probably be tossed much farther from the galactic core than it is today.
I want to know whether aliens from Andromeda will use the collision as an opportunity to invade the Milky Way.