
The Demise of Mitochondrial Eve

Brad Harrub, Ph.D. and Bert Thompson, Ph.D.
© 2003 Apologetics Press, Inc.

All Rights Reserved.
Reproduced by Permission from Apologetics Press

On the first day of 1987, a scientific “discovery” seized the attention of the popular press.  The original scientific article that caused all the commotion—“Mitochondrial DNA and Human Evolution”—appeared in the January 1, 1987 issue of Nature, and was authored by Rebecca Cann, Mark Stoneking, and Allan C. Wilson (see Cann, et al., 1987).  These three scientists announced that they had “proven” that all modern human beings can trace their ancestry back to a single woman who lived 200,000 years ago in Africa.  This one woman was nicknamed “Eve” (a.k.a., “mitochondrial Eve”)—much to the media’s delight.  An article in the January 26, 1987 issue of Time magazine bore the headline, “Everyone’s Genealogical Mother: Biologists Speculate that ‘Eve’ Lived in Sub-Saharan Africa” (Lemonick, 1987).  A year later, that “speculation” became a major Newsweek production titled, “The Search for Adam and Eve” (Tierney, et al., 1988).  The provocative front cover presented a snake, tree, and a nude African couple in a “Garden of Eden” type setting.  The biblical-story imagery was reinforced by showing the woman offering an apple to the man.

A word of explanation is in order.  For decades, evolutionists had been trying to determine the specific geographical origin of humans—whether we all came from one specific locale, or whether there were many small pockets of people scattered around the globe.  In the course of that research, a curious piece of data came to light.  As they considered various human populations, Africans seemed to show much more genetic variation than non-Africans (i.e., Asians, Europeans, Native Americans, Pacific Islanders, et al.).  According to molecular biologists, this increased variability is the result of African populations being older, and thus having had more time to accumulate mutations and diverge from one another.  This assumption led some researchers to postulate that Africa was the ancient “cradle of civilization” from which all of humanity had emerged.

The genetic material (DNA) in a cell’s nucleus controls the functions of the cell, bringing in nutrients from the body and making hormones, proteins, and other chemicals.  Outside the nucleus is an area known as the cytoplasmic matrix (generally referred to simply as the cytoplasm), which contains, among other things, tiny bean-shaped organelles known as mitochondria.  These often are described as the “energy factories” of the cell.

Mitochondria contain their own DNA, which they use to make certain proteins; the DNA in the nucleus oversees production of the rest of the proteins necessary for life and its functions.  However, mitochondrial DNA (mtDNA) was thought to be special for two reasons.  First, it is short and relatively simple in comparison to the DNA found within the nucleus, containing only thirty-seven genes instead of the 70,000+ genes located in the nuclear DNA.  This makes it relatively easy to analyze.  Second, unlike nuclear DNA, which each person inherits in a jumbled form from both parents, mitochondrial DNA was thought to be passed on only through the mother’s line (more about this later).  Working from the assumption that mtDNA is passed to the progeny only by the mother, Dr. Cann and her coworkers believed that each new cell should contain copies of only the egg’s mitochondria.  In trying to draw the human family tree, therefore, researchers took a special interest in these minute strands of genetic code.  What they really were interested in, of course, were the variations in mitochondrial DNA from one group of people to another.

Although our mtDNA should be, in theory at least, the same as our mother’s mtDNA, small changes (or mutations) in the genetic code can, and do, arise.  On rare occasions, mutations are serious enough to do harm.  More frequently, however, the mutations have no effect on the proper functioning of either the DNA or the mitochondria.  In such cases, the mutational changes will be preserved and carried on to succeeding generations.

Theoretically, if scientists could look farther and farther into the past, they would find that the number of women who contributed the modern varieties of mitochondrial DNA becomes smaller and smaller until, finally, we arrive at one “original” mother.  She, then, would be the only woman out of all the women living in her day whose line of daughters has continued unbroken, generation after generation, to the present.  Coming forward in time, we would see that the mtDNA varieties found within her female contemporaries were gradually eliminated as their daughters did not have children, had only sons, or had daughters who did not have daughters.  This does not mean, of course, that we would look like this putative ancestral mother; rather, it means only that we would have gotten our mitochondrial DNA from her.
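
The logic of this thought experiment can be made concrete with a short simulation.  The Python sketch below is our own illustration, using arbitrary, hypothetical numbers rather than anything drawn from the actual studies: each generation, every daughter inherits her mtDNA lineage from a randomly chosen mother, and lineages that leave no daughters disappear for good.

    import random

    # Illustrative only: a constant-sized population of women.  Each generation,
    # every daughter draws her mother (and thus her mtDNA lineage) at random from
    # the previous generation.  A lineage that leaves no daughters in some
    # generation is gone forever.
    POP_SIZE = 100                      # hypothetical number of women per generation
    random.seed(1)

    lineages = list(range(POP_SIZE))    # each founding mother starts her own lineage
    generation = 0
    while len(set(lineages)) > 1:
        lineages = [random.choice(lineages) for _ in range(POP_SIZE)]
        generation += 1

    print(f"All mtDNA lineages trace back to founder #{set(lineages).pop()} "
          f"after {generation} generations")

With these toy numbers the lineages typically converge on a single founding mother within a few hundred generations.  The figures themselves mean nothing; the run simply illustrates why a single mitochondrial ancestor is the expected outcome of the process just described, not evidence that only one woman was alive.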

To find this woman, researchers compared the different varieties of mtDNA in the human family.  Since mtDNA occurs in fairly small quantities, and since the researchers wanted as large a sample as possible from each person, they decided to use human placentas as their source of the mtDNA.  So, Rebecca Cann and her colleagues selected 145 pregnant women and two cell lines representing the five major geographic regions: 20 Africans, 34 Asians, 46 Caucasians, 21 aboriginal Australians, and 26 aboriginal New Guineans (Cann, et al., 1987, 325:32).  All placentas from the first three groups came from babies born in American hospitals.  Only two of the 20 Africans were born in Africa.

After analyzing a portion of the mtDNA in the cells of each placenta, they found that the differences “grouped” the samples by region.  In other words, Asians were more like each other than they were like Europeans, people from New Guinea were more like each other than they were like people from Australia, and so on.

Next, they saw two major branches form in their computer-generated tree of recent human evolution.  Seven African individuals formed one distinct branch, which started lower on the trunk than the other.  This was because the differences among these individuals were much greater than the differences between other individuals and other groups.  More differences mean more mutations, and hence more time to accumulate those changes.  If the Africans have more differences, then their lineage must be older than all the others.  The second major branch bore the non-African groups and, significantly, a scattering of the remaining thirteen Africans in the sample.  To the researchers, the presence of Africans among non-Africans meant an African common ancestor for the non-African branches, which, likewise, meant an African common ancestor for both branches.  The nickname “Eve” stuck to this hypothetical common ancestral mother, and later fired the media’s imagination.
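
The tree-building step itself can be sketched in a few lines of code.  The example below is our own simplified stand-in, with made-up sequences and a generic clustering routine (assuming SciPy is available); it is not the analysis Cann’s team actually ran, but it shows how samples separated by fewer differences end up joined lower on the tree.

    from itertools import combinations
    from scipy.cluster.hierarchy import dendrogram, linkage

    # Hypothetical toy "mtDNA" fragments of equal length for five samples.
    samples = {
        "African-1":    "ACCTGATTACGA",
        "African-2":    "ACTTGATCACGA",
        "Asian-1":      "ACCTGGTTACGA",
        "European-1":   "ACCTGGTTACGG",
        "NewGuinean-1": "ACCTGGTTATGG",
    }
    names = list(samples)

    def differences(a, b):
        """Count the positions at which two equal-length sequences differ."""
        return sum(x != y for x, y in zip(a, b))

    # Condensed pairwise distance list, in the order SciPy expects.
    dists = [differences(samples[a], samples[b]) for a, b in combinations(names, 2)]

    # Average-linkage clustering joins the most similar samples first, so pairs
    # with fewer differences sit on nearby branches of the resulting tree.
    tree = linkage(dists, method="average")
    print(dendrogram(tree, labels=names, no_plot=True)["ivl"])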

Having concluded that the African group was the oldest, Dr. Cann and her colleagues wanted to find out just how old the group might be.  To do this, they used what is known as a “molecular clock” that, in this case, was based on mutations in the mtDNA.  The rate at which the clock ticked was determined from the accumulation of changes over a given period of time.  As we note below in our discussion of the so-called molecular clock, if the assumption was made that there was one mutation every 1,000 years, and if scientists found a difference of 10 mutations between us and our ancient hypothetical ancestor, they then could infer that that ancestor lived 10,000 years ago.

The researchers looked in two places for their figures.  First, they compared mtDNA from humans with that from chimpanzees, and then used paleontology and additional molecular data to determine the age of the supposed common ancestor.  This (and similar calculations on other species) revealed a mutation rate in the range of 2% to 4% per million years.  Second, they compared the groups in their study that were close geographically, and took the age of the common ancestor from estimated times of settlement as indicated by anthropology and archaeology.  Again, 2% to 4% every million years seemed reasonable to them.

Since the common mitochondrial ancestor diverged from all others by 0.57%, she must have lived sometime between approximately 140,000 (0.57 ÷ 4 × 1,000,000) and 290,000 (0.57 ÷ 2 × 1,000,000) years ago.  The figure of 200,000 was chosen as a suitable round number.
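
The arithmetic behind those dates is easy to reproduce.  The short calculation below simply restates the figures already given in the text (0.57% divergence, and a rate of 2% to 4% per million years); it is a worked example, not an independent estimate.

    # Divergence-to-time conversion described in the text:
    #   years = (observed divergence, %) / (rate, % per million years) * 1,000,000
    divergence_percent = 0.57            # divergence from the common mitochondrial ancestor
    for rate_percent_per_myr in (2.0, 4.0):
        years = divergence_percent / rate_percent_per_myr * 1_000_000
        print(f"rate {rate_percent_per_myr}%/Myr  ->  about {years:,.0f} years ago")

    # Prints roughly 285,000 years ago for the 2% rate and 142,500 years ago for
    # the 4% rate, which round to the 290,000 and 140,000 figures quoted above.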

The results obtained from analysis of mitochondrial DNA eventually led to what is known in evolutionary circles as the “Out of Africa” theory.  This is the idea that the descendants of mitochondrial Eve spread out of Africa and colonized the rest of the world, supplanting all other hominid populations in the process.  Many (though not all) evolutionists claim that such an interpretation is in accord with archaeological, paleontological, and other genetic data (see Stringer and Andrews, 1988; for an opposing viewpoint, see the written debate in the April 1992 issue of Scientific American).

While many evolutionists have accepted the mitochondrial DNA tree, they differ widely in their views regarding both the source of the nuclear DNA and the “humanity” of Eve.  Some believe that Eve contributed all the nuclear DNA, in addition to the mitochondrial DNA.  Some believe she was an “archaic” Homo sapiens, while others believe she was fully human.  The exact interpretation is hotly debated because mitochondrial DNA is “something of a passenger in the genetic processes that lead to the formation of new species: it therefore neither contributes to the formation of a new species nor reveals anything about what actually happened” (Lewin, 1987, 238:24).

The Demise of Mitochondrial Eve

Things change rapidly in science.  What is popular one day is not the next.  Theories come, and theories go.  And so it is with mitochondrial Eve.  She once was in vogue as “the woman of the moment,” so to speak.  Now, she has become virtually the “crazy aunt in the attic” whom no one wants to admit even exists.

But it was not forbidden fruit that caused her demise this time around.  The “passing” of one of evolution’s most familiar icons is due to new scientific facts that have surfaced since her introduction in 1987.  If humans received mitochondrial DNA only from their mothers, then researchers could “map” a family tree using that information.  And, if the mutations affecting mtDNA had indeed occurred at constant rates, then the mtDNA could serve as a molecular clock for timing evolutionary events and reconstructing the evolutionary history of extant species.  It is the “ifs” in these two sentences that are the problem.

Mitochondrial Eve is alleged to have lived in Africa at the beginning of the Upper Pleistocene period (between 100,000 and 200,000 years ago).  She has been described as the most-recent common ancestor of all humans on Earth today, with respect to matrilineal descent.  The validity of these assertions, however, is dependent upon two critically important assumptions: (1) that mtDNA is, in fact, derived exclusively from the mother; and (2) that the mutation rates associated with mtDNA have remained constant over time.  However, we now know that both of these assumptions are wrong!

First, let us examine the assumption that mtDNA is derived solely from the mother.  In response to a paper that appeared in Science in 1999, anthropologist Henry Harpending of the University of Utah lamented: “There is a cottage industry of making gene trees in anthropology and then interpreting them.  This paper will invalidate most of that” (as quoted in Strauss, 1999, 286:2436).  Just as women thought they were getting a fair shake in science, the tables turned.  As one study noted:

Women have struggled to gain equality in society, but biologists have long thought that females wield absolute power in a sphere far from the public eye: in the mitochondria, cellular organelles whose DNA is thought to pass intact from mother to child with no paternal influence.  On page 2524, however, a study by Philip Awadalla of the University of Edinburgh and Adam Eyre-Walker and John Maynard Smith of the University of Sussex in Brighton, U.K., finds signs of mixing between maternal and paternal mitochondrial DNA (mtDNA) in humans and chimpanzees.  Because biologists have used mtDNA as a tool to trace human ancestry and relationships, the finding has implications for everything from the identification of bodies to the existence of a “mitochondrial Eve” 200,000 years ago (Strauss, 286:2436, emphasis added).

One year later, researchers made this startling admission:

Mitochondrial DNA (mtDNA) is generally assumed to be inherited exclusively from the mother….  Several recent papers, however, have suggested that elements of mtDNA may sometimes be inherited from the father.  This hypothesis is based on evidence that mtDNA may undergo recombination.  If this does occur, maternal mtDNA in the egg must cross over with homologous sequences in a different DNA molecule; paternal mtDNA seems the most likely candidate….  If mtDNA can recombine, irrespective of the mechanism, there are important implications for mtDNA evolution and for phylogenetic studies that use mtDNA (Morris and Lightowlers, 2000, 355:1290, emphasis added).

In 2002, a study was conducted that concluded:

Nevertheless, even a single validated example of paternal mtDNA transmission suggests that the interpretation of inheritance patterns in other kindreds thought to have mitochondrial disease should not be based on the dogmatic assumption of absolute maternal inheritance of mtDNA….  The unusual case described by Schwartz and Vissing is more than a mere curiosity (Williams, 2002, 347:611, emphasis added).

And now we know that the mtDNA coming from fathers can amount to far more than small “fractional” contributions.  The August 2002 issue of the New England Journal of Medicine contained the results of one study, which concluded:

Mammalian mitochondrial DNA (mtDNA) is thought to be strictly maternally inherited….  Very small amounts of paternally inherited mtDNA have been detected by the polymerase chain reaction (PCR) in mice after several generations of interspecific backcrosses….  We report the case of a 28-year-old man with mitochondrial myopathy due to a novel 2-bp mtDNA deletion….  We determined that the mtDNA harboring the mutation was paternal in origin and accounted for 90 percent of the patient’s muscle mtDNA (Schwartz and Vissing, 2002, 347:576, emphasis added).

Ninety percent!  And all this time, evolutionists have been selectively shaping our family tree using what was alleged to be only maternal mtDNA!

As scientists have begun to comprehend the fact, and significance, of the “death” of mitochondrial Eve, many have found themselves searching for alternatives that can help them maintain their current beliefs regarding human origins.  But this recombination ability in mtDNA makes the entire discussion a moot point.  As Strauss noted:

Such recombination could be a blow for researchers who have used mtDNA to trace human evolutionary history and migrations.  They have assumed that the mtDNA descends only through the mother, so they could draw a single evolutionary tree of maternal descent—all the way back to an African “mitochondrial Eve,” for example.  But “with recombination there is no single tree,” notes Harpending.  Instead, different parts of the molecule have different histories.  Thus, “there’s not one woman to whom we can trace our mitochondria,” says Eyre-Walker (1999, 286:2436, emphasis added).

Our thoughts on the matter exactly.

The Molecular Clock—Dating Mitochondrial Ancestors

Second, let us examine the assumption that the mutations affecting mtDNA did indeed occur at constant rates.  The researchers who made the initial announcement about Eve not only gave a location for this amazing female, but also proposed the time period during which she was supposed to have lived.  However, in order for the mtDNA theory to be of any practical use, those scientists had to assume that random mutations in the DNA occurred at documented, steady rates.  For example, if they speculated that there was one mutation every 1,000 years, and they found a difference of 10 mutations between us and our ancient hypothetical ancestor, they then could infer that that ancestor lived 10,000 years ago.  Scientists—who used this concept to determine the age of mitochondrial Eve—refer to this proposed mutation rate as a “molecular clock.” One group of researchers described the process as follows:

The hypothesis of the molecular clock of evolution emerged from early observations that the number of amino acid replacements in a given protein appeared to change linearly with time.  Indeed, if proteins (and genes) evolve at constant rates they could serve as molecular clocks for timing evolutionary events and reconstructing the evolutionary history of extant species (Rodriguez-Trelles, et al., 2001, 98:11405, parenthetical item in orig.).

It sounds good in theory, but the actual facts tell an entirely different story.  As these same researchers went on to admit:

The neutrality theory predicts that the rate of neutral molecular evolution is constant over time, and thus that there is a molecular clock for timing evolutionary events.  It has been observed that the variance of the rate of evolution is generally larger than expected according to the neutrality theory, which has raised the question of how reliable the molecular clock is or, indeed, whether there is a molecular clock at all….  The observations are inconsistent with the predictions made by various subsidiary hypotheses proposed to account for the overdispersion of the molecular clock (98:11405, emphasis added).

Another study, published in 2002, pointed out a built-in, natural bias toward older ages that results from use of the molecular clock.  The researchers who carried out the study noted:

There is presently a conflict between fossil- and molecular-based evolutionary time scales.  Molecular approaches for dating the branches of the tree of life frequently lead to substantially deeper times of divergence than those inferred by paleontologists….  Here we show that molecular time estimates suffer from a methodological handicap, namely that they are asymmetrically bounded random variables, constrained by a nonelastic boundary at the lower end, but not at the higher end of the distribution.  This introduces a bias toward an overestimation of time since divergence, which becomes greater as the length of the molecular sequence and the rate of evolution decrease….  Despite the booming amount of sequence information, molecular timing of evolutionary events has continued to yield conspicuously deeper dates than indicated by the stratigraphic data.  Increasingly, the discrepancies between molecular and paleontological estimates are ascribed to deficiencies of the fossil record, while sequence-based time tables gain credit.  Yet, we have identified a fundamental flaw of molecular dating methods, which leads to dates that are systematically biased towards substantial overestimation of evolutionary times (Rodriguez-Trelles, et al., 2002, 99:8112,8114, emphasis added).
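
The kind of asymmetry the authors describe can be illustrated with a simple simulation of our own devising; it is not their analysis, and every number in it is arbitrary.  When sequences are short, the observed fraction of differing sites is noisy, and the standard correction that converts that fraction into an evolutionary distance is bounded below by zero but unbounded above, so the noise inflates the average estimate.

    import numpy as np

    # Simplified Monte Carlo illustration (not the authors' analysis).  A distance
    # is estimated from the observed fraction of differing sites, p_hat, using the
    # Jukes-Cantor correction d_hat = -(3/4) * ln(1 - (4/3) * p_hat).  The
    # corrected estimate cannot go below zero but has no upper bound, so sampling
    # noise pushes the average estimate above the true value, more so when the
    # compared sequences are short.
    rng = np.random.default_rng(0)

    true_distance = 0.05                         # substitutions per site (hypothetical)
    true_p = 0.75 * (1.0 - np.exp(-4.0 * true_distance / 3.0))

    for seq_length in (50, 200, 5000):           # number of sites compared
        diffs = rng.binomial(seq_length, true_p, size=100_000)
        p_hat = diffs / seq_length
        usable = p_hat < 0.75                    # correction is undefined at p >= 3/4
        d_hat = -0.75 * np.log(1.0 - (4.0 / 3.0) * p_hat[usable])
        print(f"{seq_length:5d} sites: mean estimate {d_hat.mean():.4f} "
              f"(true value {true_distance})")

On these toy values the overshoot is small, but it runs in one direction only and grows as the sequences get shorter, which is the pattern of systematic overestimation described in the passage quoted above.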

Until approximately 1997, we did not have good empirical measures of mutation rates in humans.  However, that situation greatly improved when geneticists were able to analyze DNA from individuals with well-established family trees going back several generations.  One study found that mutation rates in mitochondrial DNA were eighteen times higher than previous estimates (see Parsons, et al., 1997).

Science writer Ann Gibbons authored an article for the January 2, 1998 issue of Science titled “Calibrating the Mitochondrial Clock,” the subheading of which read as follows: “Mitochondrial DNA appears to mutate much faster than expected, prompting new DNA forensics procedures and raising troubling questions about the dating of evolutionary events.” In that article, she discussed the new data which showed that the mutation rates used to obtain mitochondrial Eve’s age no longer could be considered valid, and concluded:

Regardless of the cause, evolutionists are most concerned about the effect of a faster mutation rate.  For example, researchers have calculated that “mitochondrial Eve”—the woman whose mtDNA was ancestral to that in all living people—lived 100,000 to 200,000 years ago in Africa.  Using the new clock, she would be a mere 6,000 years old (1998, 279:29, emphasis added).

Gibbons quickly went on to note, of course, that “no one thinks that’s the case” (279:29).  She concluded her article by discussing the fact that many test results are (to use her exact word) “inconclusive.”  She then noted: “And, for now, so are some of the evolutionary results gained by using the mtDNA clock” (279:29).
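
The size of the revision Gibbons reports follows directly from how the clock works: the inferred age is inversely proportional to the assumed mutation rate, so an eighteen-fold faster rate shrinks the age estimate by roughly the same factor.  The back-of-envelope check below uses round numbers of our own choosing; the published recalibrations differ in their details.

    # Back-of-envelope check (round, illustrative numbers only): the inferred age
    # scales as 1 / (mutation rate), so an 18-fold faster rate divides the age
    # estimate by about 18.
    old_age_estimates = (100_000, 200_000)   # years, the range cited by Gibbons
    rate_increase = 18                        # factor reported by Parsons, et al. (1997)

    for old_age in old_age_estimates:
        print(f"{old_age:,} years / {rate_increase} -> about {old_age / rate_increase:,.0f} years")

    # Prints roughly 5,600 and 11,100 years, a range that brackets the roughly
    # 6,000-year figure Gibbons quotes; the exact value depends on the calibration used.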

We now know that the two key assumptions behind the data used to establish the existence of “mitochondrial Eve” are not just flawed, but wrong.  The assumption that mitochondrial DNA is passed down only by the mother is completely incorrect (it also can be passed on by the father).  And, the mutation rates used to calibrate the so-called “molecular clock” are now known to have been in error.  (To use the words of Rodriguez-Trelles and his coworkers, the method contains a “fundamental flaw.”)

Philip Awadalla and his coworkers noted in Science: “Many inferences about the pattern and tempo of human evolution and mtDNA evolution have been based on the assumption of clonal inheritance.  These inferences will now have to be reconsidered” (1999, 286:2525).  However, rather than merely “reconsidering” their theory and attempting to revamp it accordingly, evolutionists need to admit, honestly and forthrightly, that “mitochondrial Eve,” as it turns out, has existed only in their minds, not in the facts of the real world.  Science works by analyzing the data and forming hypotheses based on those data.  Science is not supposed to massage the data until they fit a certain preconceived hypothesis.  All of the conclusions that have been drawn from research on mitochondrial Eve via the molecular clock must now be discarded as unreliable.  A funeral and interment are in order for mitochondrial Eve.


References

Awadalla, Philip, Adam Eyre-Walker, and John Maynard Smith (1999), “Linkage Disequilibrium and Recombination in Hominid Mitochondrial DNA,” Science, 286:2524-2525, December 24.

Cann, Rebecca L., Mark Stoneking, and Allan C. Wilson (1987), “Mitochondrial DNA and Human Evolution,” Nature, 325:31-36, January 1.

Gibbons, Ann (1998), “Calibrating the Mitochondrial Clock,” Science, 279:28-29, January 2.

Lemonick, Michael D. (1987), “Everyone’s Genealogical Mother,” Time, p. 66, January 26.

Lewin, Roger (1987), “The Unmasking of Mitochondrial Eve,” Science, 238:24-26, October 2.

Morris, Andrew A. M., and Robert N. Lightowlers (2000), “Can Paternal mtDNA be Inherited?,” The Lancet, 355:1290-1291, April 15.

Parsons, Thomas J., et al. (1997), “A High Observed Substitution Rate in the Human Mitochondrial DNA Control Region,” Nature Genetics, 15:363.

Rodriguez-Trelles, Francisco, Rosa Tarrio, and Francisco J. Ayala (2001), “Erratic Overdispersion of Three Molecular Clocks: GPDH, SOD, and XDH,” Proceedings of the National Academy of Sciences, 98:11405-11410, September 25.

Rodriguez-Trelles, Francisco, Rosa Tarrio, and Francisco J. Ayala (2002), “A Methodological Bias Toward Overestimation of Molecular Evolutionary Time Scales,” Proceedings of the National Academy of Sciences, 99:8112-8115, June 11.

Schwartz, Marianne and John Vissing (2002), “Paternal Inheritance of Mitochondrial DNA,” New England Journal of Medicine, 347:576-580, August 22.

Strauss, Evelyn (1999), “mtDNA Shows Signs of Paternal Influence,” Science, 286:2436, December 24.

Stringer, C.B. and P. Andrews (1988), “Genetic and Fossil Evidence for the Origin of Modern Humans,” Science, 239:1263-1268, March 11.

Tierney, John, Lynda Wright, and Karen Springen (1988), “The Search for Adam and Eve,” Newsweek, pp. 46-52, January 11.

Williams, R. Sanders (2002), “Another Surprise from the Mitochondrial Genome,” New England Journal of Medicine, 347:609-611, August 22.


Brad Harrub is a graduate of Kentucky Wesleyan College, where he earned a B.S. degree in biology.  He also earned a Ph.D. in neurobiology and anatomy from the College of Medicine at the University of Tennessee in Memphis.  He is a member of the Society for Neuroscience, and was listed in the 2001-2002 edition of Who’s Who Among Scientists and Researchers.  He was an invited speaker at the 2003 International Conference on Creationism.  He currently serves as the Director of Scientific Information at Apologetics Press, and as associate editor of Reason & Revelation.

Bert Thompson is a graduate of Abilene Christian University, where he earned a B.S. degree in biology.  He also is a graduate of Texas A&M University, where he earned both M.S. and Ph.D. degrees in microbiology.  Dr. Thompson is a former professor in the College of Veterinary Medicine at Texas A&M, where he taught for several years.  While at Texas A&M, he served as Coordinator of the Cooperative Education Program in Biomedical Science.  Currently, Dr. Thompson is the Executive Director of Apologetics Press and editor of Reason & Revelation.

