February 20, 2004
Dogs Evolved To Read Human Cues

Dogs are better than chimpanzees at reading human signals.

A chimpanzee enters a room where food is hidden in one of two opaque containers. A human gazes at the container that hides the food, reaches for it with an outstretched arm, and marks it with a wooden block. The chimp doesn't get the message, even though chimpanzees are one of Homo sapiens' two closest extant primate relatives and might be expected to figure it out. Biological anthropologist Brian Hare and colleagues tried this game with 11 chimps, and only two of the brainy apes used the conspicuous cues to find the food.

Dog owners may not be surprised to learn that nine of 11 dogs in the same situation correctly read the human signals and found the food. A control exercise established that odor was not a cue in either trial.

Humans served as a selective factor in canine evolution.

"Our new work provides direct evidence that dogs' lengthy contact with humans has served as a selection factor, leading to distinct evolutionary changes," says Hare, who recently completed his Ph.D. in anthropology in Harvard's Faculty of Arts and Sciences. "This is the first demonstration that humans play an ongoing role in the evolution of canine cognition."

Wolves do not look to humans for help but dogs do.

Ádám Miklósi led a group of researchers at Eötvös University in Budapest, Hungary, who conducted the "shell game" tests on wolves. The test wolves were raised by humans and socialized to a level comparable to that of their dog counterparts. But although they could follow some signals, the wolves could not perform at the level of dogs.

Miklósi's test also included an important second step. He presented the animals with an unsolvable problem—a bowl of food that was impossible to access. The team found that while wolves continued to work at the unsolvable problem for long periods, dogs quickly looked at the humans for help.

Dogs branched off from wolves only 15,000 years ago.

Dec. 4 — The Eves of the dog world are five or six wolf females that lived in or near China nearly 15,000 years ago, according to a series of genetic studies.

The progenitor breeds from which all current breeds descend first appeared only 3,000 to 5,000 years ago.

The researchers believe that by 10,000 to 12,000 years later, 10 "progenitor breeds" of dog had been created to fulfill different roles alongside their masters. It took a further 3,000 to 5,000 years for people to create the 300 or so pure breeds known today.

What is interesting about this result from a human evolutionary perspective is that it demonstrates how, contrary to popular belief, 10,000 or 20,000 years of selective pressure from relatively new environmental factors can produce large changes in shape, cognitive function, and behavior of a species. The example of dogs changing so much under human influence suggests the possibility that humans have changed a great deal as they moved out of Africa and evolved to fit into various ecological niches around the world.

An example of an evolutionary adaptation in humans that may have developed as recently as dogs diverged from wolves is found in the Andean population, which developed an adaptation to high altitudes.

Previous studies have shown that the Tibetan, Ethiopian and Andean populations have developed slightly different ways of boosting their oxygen levels to cope with the thin air. Those in the Andes pump out more haemoglobin - a molecule that carries oxygen around in the blood. The Tibetans, by contrast, have relatively low haemoglobin levels but breathe faster to take in more oxygen. "The slightest bit of exercise makes them really pant," Beall says.

The Tibetans probably had more time in which to develop high-altitude adaptations, and the Ethiopians certainly had more, since humans have been in Africa longest. But the Andean adaptation could not begin until human populations crossed the Bering Strait and migrated all the way to South America.

Loren Cordain claims that the ability of adult northern Europeans to digest lactose sugar is a fairly recent adaptation that may have become widespread in just the last few hundred generations of humans.

Commentary: Calculations estimate how long it took the gene for adult lactase persistence (ALP) in northern Europeans to increase from a pre-agricultural incidence rate of 5% to its present rate of approximately 70% [Aoki 1991]. (Note: The enzyme lactase is required to digest the sugar lactose in milk, and normally is not produced in significant quantity in human beings after weaning.) For the gene frequency to increase from 0.05 to 0.70 within the 250 generations that have occurred since the advent of dairying, a selective advantage in excess of 5% may have been required [Aoki 1991].

Therefore, some genetic changes can occur quite rapidly, particularly in polymorphic genes (those with more than one variant of the gene already in existence) with wide variability in their phenotypic expression. ("Phenotypic expression" means the physical characteristic(s) which a gene produces.) Because humans normally maintain lactase activity in their guts until weaning (approximately 4 years of age in modern-day hunter-gatherers), the type of genetic change (neoteny) required for adult lactase maintenance can occur quite rapidly if there is sufficient selective pressure. Maintenance of childlike genetic characteristics (neoteny) is what occurred with the geologically rapid domestication of the dog during the late Pleistocene and Mesolithic [Budiansky 1992].
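The Aoki-style calculation quoted above can be illustrated with a toy deterministic model of one-locus selection. This is a minimal sketch, not Aoki's actual model (his analysis includes factors omitted here, which is why his required selective advantage differs): it assumes lactase persistence acts as a dominant allele, that carriers enjoy a constant fitness advantage s, and random mating (Hardy-Weinberg proportions). The function name and starting values are illustrative.

```python
# Toy one-locus selection model for a dominant advantageous allele
# (e.g. adult lactase persistence). Assumptions, not Aoki's actual model:
# carriers (AA and Aa) have fitness 1 + s, non-carriers (aa) have fitness 1.

def generations_to_reach(p0, target, s):
    """Count generations for allele frequency to rise from p0 to target."""
    p, gens = p0, 0
    while p < target:
        q = 1.0 - p
        w_bar = 1.0 + s * (1.0 - q * q)   # mean fitness; 1 - q^2 = carrier frequency
        p = p * (1.0 + s) / w_bar          # standard selection recurrence
        gens += 1
    return gens

gens = generations_to_reach(p0=0.05, target=0.70, s=0.05)
print(gens)  # roughly 120-130 generations, comfortably under 250
```

With s = 0.05 the frequency climbs from 5% to 70% in roughly 125 generations, or around 3,000 years at 25 years per generation - a concrete sense of how fast a strongly favored allele can spread.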

Domestication of animals for milk began only about 6,000 years ago and so the selection for adult human lactase enzyme synthesis began only then.

Influence of human culture on genetic selection pressures.

However--and this is where it gets interesting--those population groups that do retain the ability to produce lactase and digest milk into adulthood are those descended from the very people who first began domesticating animals for milking during the Neolithic period several thousand years ago.[119] (The earliest milking populations in Europe, Asia, and Africa began the practice probably around 4,000 B.C.[120]) And even more interestingly, in population groups where cultural changes have created "selection pressure" for adapting to certain behavior--such as drinking milk in this case--the rate of genetic adaptation to such changes significantly increases. In this case, the time span for widespread prevalence of the gene for lactose tolerance within milking population groups has been estimated at approximately 1,150 years[121]--a very short span of time in evolutionary terms.

It is worth noting that domestication of milk animals was such a large selective advantage that it could cause the mutation for lactase expression in adults to be selected for in a relatively short period of time. But since the selective pressure for adult lactase expression was very strong, this suggests that any kind of behavior or other aspect of human physiology that was beneficial for milk animal herding and protection of milk animals would also have been selected for very strongly at the same time that adult lactase expression was being selected for. We have to consider the possibility that personality types more suited for herd-tending and herd-protection may have been fundamentally different from the personality types most suited for a hunter-gatherer lifestyle that involved no use of milk animals.

Another post-Africa adaptation in humans is the spread of a mitochondrial mutation for generating more heat in colder weather.

These lineages are not found at all in Africans but occur in 14 percent of people in temperate zones and in 75 percent of those inhabiting Arctic zones. Wallace and his colleagues say this correlation is evidence that the lineages were positively selected because they help the body generate more heat.

...

Wallace says that climatic selection may have operated on the human population from the moment it moved north of the African tropics. Most such pioneers died but two lineages, known as M and N, arose in northeast Africa some 65,000 years ago and might have been adapted to temperate climates. Almost everyone outside of sub-Saharan Africa has mitochondria descended from the M and N lineages.

The writers of the research paper reporting on the heat-generating mtDNA variation speculate that human mtDNA carries adaptations to local environmental conditions that, under modern environments and diets, raise the incidence of a number of diseases.

Evidence has already accumulated that different human mtDNA lineages are functionally different. Haplogroup T is associated with reduced sperm motility in European males (30), and the tRNAGln nucleotide position 4336 variant in haplogroup H is associated with late-onset Alzheimer's disease (31). Moreover, Europeans harboring the mild ND6 nucleotide position 14484 and ND4L nucleotide position 10663 Leber's hereditary optic neuropathy missense mutations are more prone to blindness if they also harbor the mtDNA haplogroup J (32, 33), and haplogroup J is associated with increased European longevity (34). Because haplogroup J mtDNAs harbor two missense mutations in complex I genes (Y304H in ND1 and A458T in ND5), in addition to the above-mentioned L236T variant in the cytb gene, these polymorphisms all could affect the efficiency of OXPHOS ATP production and thus exacerbate the energy defects of mildly deleterious new mutations.

Given that mtDNA lineages are functionally different, it follows that the same variants that are advantageous in one climatic and dietary environment might be maladaptive when these individuals are placed in a different environment. Hence, ancient regionally beneficial mtDNA variants could be contributing to modern bioenergetic disorders such as obesity, diabetes, hypertension, cardiovascular disease, and neurodegenerative diseases as people move to new regions and adopt new lifestyles.

In humans, mitochondrial DNA (mtDNA) is only 16,569 DNA letters long, whereas the DNA in the human cell nucleus is over 3 billion letters long. Note that while the mtDNA is very small, it still manages to have many variations with different effects on disease risks and environmental adaptation. It seems likely that the heat-generating variation is not the only mtDNA variation that resulted from selective pressure for humans to adapt to local conditions.

Another important point about canine evolution: to the extent that dog breeds developed special adaptations to perform various functions, those dogs reduced the need for humans to perform those functions themselves and hence changed the selective pressures on humans.

"We know that dogs were useful for lots of things in Stone Age culture, as draft animals, in hunting, for warmth, and for protection," said Jennifer Leonard, a postdoctoral fellow at the Smithsonian Institution’s National Museum of Natural History. And in sharing food, shelter, survival and play, modern dogs have somehow genetically acquired an insight about humans that has earned them the title of man's best friend.

For instance, a hunting dog that could smell prey reduced the need for humans to have an acute sense of smell for that purpose. Therefore the domestication of dogs must have changed the selective pressures on humans. Those changes in selective pressures must have been different depending on the types of dogs and the ecological niches various human groups found themselves in. Human groups that learned to train and work with dogs for various purposes had a selective advantage against human groups that did not do so. So just as humans have exerted selective pressures in dog evolution it seems highly likely that dogs have caused selective pressures in human evolution.

Randall Parker, 2004 February 20 08:57 PM  Trends, Human Evolution


Comments
Alex said at February 23, 2004 10:18 AM:

And yet people continue to cling to the intellectually bankrupt, completely unscientific notion that cats are better than dogs.

Bob Badour said at February 23, 2004 5:03 PM:

That's because humans evolved to read feline cues.

Mike said at February 25, 2004 2:44 PM:

It would be interesting to see the same analysis on cats. My guess is that cat domestication occurred even later, after the advent of agriculture. The cat's chief function was to prevent the very significant loss of grain and eggs to rodents. It would also be interesting to see this kind of analysis with a rat. Which evolved faster, the cat or the rat?

Abiola Lapite said at February 28, 2004 12:45 PM:

"contrary to popular belief, 10,000 or 20,000 years of selective pressure from relatively new environmental factors can produce large changes in shape, cognitive function, and behavior of a species."

The generation time for dogs is much shorter than it is for humans, so extrapolating from one case to the other is illegitimate here. What is more, dogs, being largely creatures of instinct bred by men to fulfill particular goals, have faced much more severe selection pressure than any anatomically modern humans will have.

Abiola Lapite said at February 28, 2004 12:52 PM:

"But since the selective pressure for adult lactase expression was very strong this suggests that any kind of behavior or other aspect of human physiology that was beneficial for milk animal herding and protection of milk animals would also have been selected for very strongly at the same time that adult lactase expression was being selected for."

No it doesn't. One thing doesn't follow from the other, and there's not a shred of evidence in favor of it either.

"We have to consider the possibility that personality types more suited for herd-tending and herd-protection may have been fundamentally different from the personality types most suited for a hunter-gatherer lifestyle that involved no use of milk animals."

It is a theoretical possibility, but then again, so are a lot of other things. Do keep in mind that humans show a lot more plasticity than dogs. Neither Europeans nor Eskimos have evolved cavemen levels of hirsuteness, have they? And yet, both groups manage to survive the higher latitudes somehow.

Randall Parker said at February 28, 2004 1:17 PM:

Abiola, but humans have changed into all sorts of forms to fit various niches.

Higher latitudes: Look at the mitochondrial DNA mutation to generate more heat in the people who live closer to the north pole. Look at the mutation for lighter skin so that more Vitamin D could be synthesized from less sunlight. Those are both mutations that better adapted humans to higher latitudes. It seems reasonable to expect more to be found.

As for why no hirsuteness: probably because they could wear the furs of killed animals, there was no need to develop that adaptation. It required less metabolic energy to wear dead fur than to constantly grow hair. The furs were also more convenient, since they could be donned and removed quickly in response to fast temperature changes.

Generation times of dogs: Yes, but humans have had tens of thousands more years to adapt to various environments since leaving Africa. So they've had more generations. Plus, look at how few generations it has taken to produce various breeds. Instead of tens of thousands of years we are talking thousands or even hundreds of years to produce radical changes in shape and functionality.

Abiola Lapite said at February 28, 2004 4:32 PM:

"Higher latitudes: Look at the mitochondrial DNA mutation to generate more heat for the people who live in closer to the north pole."
I pursued this claim when it was originally made, and it hasn't actually been demonstrated. At this point, all we have is one man's speculation that this is the underlying reason.

"Look at the mutation to make lighter skin so that more Vitamin D could be synthesized from less sunlight."
Look at exceptions like East Asians and Eskimos, who are darker than Europeans despite living at similar or higher latitudes. It is rash in the extreme to jump from a statistical signal of selection (which is all this guy actually found) to making bold claims about selection for heat generation, as if it were actually backed up by scientific evidence. The fact that 86 percent of Europeans don't even carry the supposedly beneficial variant ought at least to make you wonder.

"humans have had tens of thousands of more years to adapt to various environments since leaving Africa"
Humans left Africa ~50,000 years ago, as opposed to 10-20,000 years for dogs. The generation time for humans, at about 25 years, is more than 10 times as long as that for dogs. The arithmetic is pretty clear - there've been many more generations of dogs to select on.

"look at how few generations it has taken to produce various breeds"
Only with extremely intense human-directed selection - incestuous "line breeding" is commonplace amongst dog breeders, but in no society I'm aware of has it been the norm amongst the ordinary population; finally, as I've already pointed out, there've been many more dog generations.

The bottom line is that while selection has undoubtedly occurred amongst human populations, it simply isn't scientifically justifiable to casually extrapolate from dogs to humans. Dogs don't have the brains or the opposable thumbs to make fur coats to keep themselves warm in the European winter, and there's no good reason to assume that any "innate" personality differences between human populations will be anything like as clearcut as they are between dogs bred using closed stud-books - as is the norm with Kennel Clubs throughout the world.

Randall Parker said at February 28, 2004 11:02 PM:

Abiola, the bulk of Europe is well to the north of China. Southern Italians are fairly dark and darker than northern Italians who are darker than southern Germans and so on.

As for Eskimos: You have to adjust for a diet that includes a lot of fish oils. A populace eating a high Vitamin D diet didn't experience as much selective pressure to develop lighter-colored skin. Tellingly, they have never suffered from rickets. Also, since altitude and cloud cover also factor in, latitude by itself is not a sufficient measure of UV exposure.

Are you aware of the work of Nina Jablonski and George Chaplin that correlates skin color with UV exposure?

Here is more on the work of Jablonski and Chaplin on skin color.

According to Jablonski and Chaplin, the amount of pigment in the skin is an adaptation to UVR levels. Skin pigmentation in humans has evolved over time to permit just enough UVR to enter the body to stimulate production of vitamin D3, but not so much as to destroy necessary folate. Using clinical data and satellite data collected by NASA that measured UVR levels at the Earth’s surface, Jablonski and Chaplin created a map of UVR at different latitudes that demarcates three human skin tone zones.

Zone 1, in the tropics, contains peoples with high levels of melanin, a dark skin pigment that acts as a natural sunblock. Zone 2 includes most of the United States and southern Europe. Residents of this zone historically have moderately pigmented skin that is easily altered through tanning. Zone 2 residents increase melanin levels to prevent folate loss and lighten their tan to take advantage of dimmer, briefer days during winter months. Residents of Zone 3, high-latitude and polar regions, face the greatest risk of vitamin D3 deficiency due to diminished UV exposure and compensate for this by eating vitamin D3-rich foods.

Females tend to have lighter skin than males in every examined population, a "significant biological message" that Jablonski attributes to the need for women to generate extra vitamin D3 at critical times in their lives, particularly while pregnant or nursing an infant, despite the possible risk of folate loss from UV overexposure.

As our dark-skinned ancestors migrated from the sun-drenched tropics into higher latitudes with different climates and environments, their skin had to adapt to changing sunlight levels. Jablonski points out that today’s globe-trotting population still needs to be aware of the health implications of moving between skin tone zones and to take appropriate precautions.
People with light skin who are exposed to lots of sunlight face the risk of folate loss, while those with darker skin who are not exposed to enough sunlight face problems related to vitamin D3 deficiency.


More on skin color.

Credit for describing the relationship between latitude and skin color in modern humans is usually ascribed to an Italian geographer, Renato Biasutti, whose widely reproduced “skin color maps” illustrate the correlation of darker skin with equatorial proximity (Figure 2). More recent studies by physical anthropologists have substantiated and extended these observations; a recent review and analysis of data from more than 100 populations (Relethford 1997) found that skin reflectance is lowest at the equator, then gradually increases, about 8% per 10° of latitude in the Northern Hemisphere and about 4% per 10° of latitude in the Southern Hemisphere. This pattern is inversely correlated with levels of UV irradiation, which are greater in the Southern than in the Northern Hemisphere. An important caveat is that we do not know how patterns of UV irradiation have changed over time; more importantly, we do not know when skin color is likely to have evolved, with multiple migrations out of Africa and extensive genetic interchange over the last 500,000 years (Templeton 2002).
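The Relethford gradient quoted above (reflectance rising about 8% per 10° of latitude in the Northern Hemisphere and about 4% in the Southern) can be expressed as a rough linear model. The strict linearity and the function below are illustrative assumptions; the underlying population data are only approximately linear.

```python
# Rough linear model of the latitude gradient in skin reflectance
# reported by Relethford (1997), as quoted above. Strict linearity
# is an assumption for illustration only.

def reflectance_increase(latitude_deg, hemisphere="N"):
    """Approximate % increase in skin reflectance relative to the equator."""
    rate = 8.0 if hemisphere == "N" else 4.0  # percent per 10 degrees of latitude
    return rate * (abs(latitude_deg) / 10.0)

print(reflectance_increase(50, "N"))  # 40.0: a +40% increase at 50 degrees N
print(reflectance_increase(30, "S"))  # 12.0: a +12% increase at 30 degrees S
```

The steeper northern slope is consistent with the quoted observation that UV irradiation at equivalent latitudes is greater in the Southern Hemisphere than in the Northern.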

Randall Parker said at February 29, 2004 10:16 AM:

Here's another example of a fairly recent change in allele frequency due to natural selection:

From approximately 1920 to 1950, a kuru epidemic devastated the Fore in the Highlands of Papua New Guinea. At mortuary feasts, kinship groups would consume deceased relatives, a practice that probably started around the end of the 19th Century, according to local oral history. The Australian authorities imposed a ban on cannibalism there in the mid-1950s.

The same genetic variation in the prion protein that helps protect against Creutzfeldt-Jakob disease turned out to do the same for kuru. Studying Fore women who had participated in mortuary feasts, Collinge's group found that 23 out of the 30 women were heterozygous for the prion protein gene, possessing one normal copy and one with the M129V mutation.

The researchers sequenced and analyzed the prion protein gene in more than 2000 chromosome samples from people selected to represent worldwide genetic diversity. They found either M129V or E219K in every population, with the prevalence decreasing in East Asia (except for the Fore, who have the highest frequency in the world).

Randall Parker said at February 29, 2004 11:57 AM:

A darker-skinned population in northern regions had to use fish as a food source, but a lighter-skinned population could hunt or farm for food.

Until the 1980s, researchers could only estimate how much ultraviolet radiation reaches Earth's surface. But in 1978, NASA launched the Total Ozone Mapping Spectrometer. Three years ago, Jablonski and Chaplin took the spectrometer's global ultraviolet measurements and compared them with published data on skin color in indigenous populations from more than 50 countries. To their delight, there was an unmistakable correlation: The weaker the ultraviolet light, the fairer the skin. Jablonski went on to show that people living above 50 degrees latitude have the highest risk of vitamin D deficiency. "This was one of the last barriers in the history of human settlement," Jablonski says. "Only after humans learned fishing, and therefore had access to food rich in vitamin D, could they settle these regions."

estman said at July 13, 2004 5:37 PM:

That experiment with the chimpanzees and the dogs (which led to the conclusion that dogs read human signals better) possibly includes a serious error:
The dogs in the experiment all lived with humans (all dogs do) and thus had gathered experience in reading human signals. The chimps probably had significantly fewer occasions to learn human signals. If that is so, then we can't simply conclude that dogs adapted (genetically/evolutionarily) better to reading human signals.

JS said at November 24, 2004 3:10 PM:

Estman,

Your assumptions are wrong. Not all dogs live with humans. Most or all dogs used in studies are bred for lab studies and have very little contact with humans until the research is done. Therefore monkeys, rats, dogs, etc. - all animals in studies - do NOT live with humans; they live in labs/research centers.
