Implications of a mismatch between hominin genes and bones

Finds in Kenya, Ethiopia and Chad during the first few years of the 21st century suggest that bipedal hominins, perhaps on the human clade, emerged as long ago as 7 Ma. Even using the previously accepted molecular-clock age for the separation of chimpanzees and hominins, this is dangerously close to the time of their last common ancestor (5-10 Ma). Results from more detailed comparison of the chimp and human genomes (Patterson, N. et al. 2006. Genetic evidence for complex speciation of humans and chimpanzees. Nature, doi:10.1038/nature04789, online) throw up a bewildering series of possibilities. On Patterson et al.’s reckoning, our descent split from that of our nearest relatives no more than 6.3 Ma ago and perhaps as recently as 5.4 Ma, implying an overlap between the tangible fossil evidence and that based on DNA. Of even greater concern is the fact that human and chimp X-chromosomes are more similar than the other chromosomes, and seem to have diverged even later. One way in which this greater similarity could have arisen is through hybridisation accompanied by natural selection operating more strongly on X-chromosome genes: studies of other closely related species show that genes on the X-chromosome which make hybrids less fertile create strong selection pressures on that chromosome. An explanation that accommodates both the young apparent date of splitting and strong selection on the X-chromosome is that speciation did take place before the time when the oldest hominin fossils were preserved, but that interbreeding between the two closely related lines continued long afterwards. 
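The molecular-clock reasoning behind dates like these can be sketched with a toy calculation; the divergence fraction and substitution rate below are illustrative assumptions, not figures from Patterson et al.:

```python
# Toy molecular-clock estimate (illustrative numbers only).
# Under a constant clock, two lineages separated t years ago differ at a
# fraction d of neutral sites, where d = 2 * mu * t (both lineages
# accumulate substitutions independently), so t = d / (2 * mu).
d = 0.012    # assumed neutral human-chimp divergence, ~1.2% of sites
mu = 1.0e-9  # assumed substitution rate per site per year
t = d / (2 * mu)
print(f"implied split: {t / 1e6:.1f} Ma")  # → implied split: 6.0 Ma
```

Shifting either assumed number by a few tens of percent moves the date across the whole 5-7 Ma range, which is why bipedal fossils near 7 Ma put such pressure on the genetic estimates.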

Understandably, palaeoanthropologists and geneticists are arguing heatedly, but both are failing to recognise the great differences between fossil and extant genetic evidence: each is bound to tell a different part of the story. Yet another strand is the ecology of either lineage, the end point being a regional separation into creatures of forest and of open savannah, separated by considerable distances in Africa – basically west and east of the East African Rift system. Before that climatic and vegetation-cover schism, what would have stopped a great many branchings from either lineage of very closely related animals? The rarity of fossils on either side may leave the true relationships early in the history of both clades completely impenetrable. One thing is for sure: although chimps and humans today do make close friendships, that is as far as it goes…

See also: Holmes, B. 2006. Did humans and chimps once merge? New Scientist, v. 190 (20 May 2006), p. 14; Pennisi, E. 2006. Genomes throw kinks in timing of chimp-human split. Science, v. 312, p. 985-986.

Hobbit matters

Debate about the significance of the tiny hominid fossils from the Indonesian island of Flores (H. floresiensis) continues to escalate. The remains are sufficiently complete for analysis of features other than the size and morphology of skull and brain. It seems that the shoulder structure is different from that of modern humans, but more similar to that of full-sized H. erectus (see Culotta, E. 2006. How the hobbit shrugged: tiny hominid’s story takes a new turn. Science, v. 312, p. 983-984). In ourselves, when standing straight, our inner elbows face slightly forwards so that we can work with both hands in front of the body. The necessary twist in the humerus is somewhat less in H. floresiensis, and by itself that would inhibit tool making. However, the shoulder bones of the fossil articulate differently with the hobbit humerus, so that a hunched posture would allow intricate work, though not an overarm throwing action. Much the same features characterise the well-preserved upper bodies of H. erectus fossils from Africa and Georgia. Incidentally, like J.R.R. Tolkien’s fictional hobbits, H. floresiensis also had disproportionately large feet.

It seems inescapable that H. floresiensis did make tools. As well as the 90-12 ka artefacts found in the Liang Bua cave with the hominid remains, which some have reckoned too complex for the small people to have made, large numbers of similarly sophisticated stone tools have been found at other sites in Flores. These occur with similar prey species, though not hominid remains, from as long ago as 800 ka; a time at which only H. erectus was present in the Indonesian archipelago (Brumm, A. et al. 2006. Early stone technology on Flores and its implications for Homo floresiensis. Nature, v. 441, p. 624-628).

The minute size of H. floresiensis, with a brain capacity of a mere 400 cm3, continues to cause some researchers to suspect that the fossils – in fact 9 sets of remains from Liang Bua – were nothing other than congenitally deformed modern humans: microcephalics. Anatomist Robert Martin of the Chicago Field Museum of Natural History (see www.sciencemag.org/cgi/content/full/312/5776/999b) used scaling factors from other dwarfed mammals of island faunas to model the body size expected for a similarly dwarfed hominid that might arise from an isolated H. erectus population. He calculated that the 400 cm3 brain of H. floresiensis should be associated with a creature of around 11 kg body mass: about the size of a small monkey. Yet the famous skull shows none of the other deformities associated with microcephaly (see Culotta, E. 2006. How the hobbit shrugged: tiny hominid’s story takes a new turn. Science, v. 312, p. 983-984).
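Martin’s scaling argument can be illustrated with a simple power-law model. The reference brain and body values and the exponent below are assumptions chosen for the sketch, not his published parameters:

```python
# Toy island-dwarfing allometry (all figures assumed for illustration).
# If brain volume scales as body_mass**k within a dwarfing lineage, then
# the body mass predicted for a given brain volume is
# ref_body * (brain / ref_brain) ** (1 / k).
ref_brain = 900.0  # assumed ancestral H. erectus brain volume, cm3
ref_body = 60.0    # assumed ancestral H. erectus body mass, kg
k = 0.48           # assumed scaling exponent for dwarfing lineages
brain = 400.0      # H. floresiensis brain volume, cm3
body = ref_body * (brain / ref_brain) ** (1 / k)
print(f"predicted body mass: {body:.0f} kg")  # → predicted body mass: 11 kg
```

With a shallow exponent like this, a brain of only 400 cm3 maps onto a monkey-sized body, which is the crux of Martin’s objection: a metre-tall hominid with such a brain then needs some other explanation.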

Hominid evolution: a line or a bush?

Since the late 19th century it has been clear that two species of our genus Homo inhabited Europe and the Middle East: modern humans and Neanderthals. Recent partial sequences of Neanderthal genetic material, compared with the human genome, confirm that the two did not interbreed; at least, no trace of Neanderthal genetics remains in that of modern humans. The discovery in Indonesia that fully modern immigrants occupied the same territory as Homo erectus from 70 to 20 thousand years ago adds more weight to the hypothesis of multiple occupancy of the world by different kinds of humans until recent times. The astonishing discovery in 2003 of the remains of tiny hominids (Homo floresiensis) on Flores, whose occupancy lasted from at least 840 ka to as recently as 12 ka (see The little people of Flores, Indonesia, November 2004 issue of EPN), confirms mixed occupancy late in hominid evolution. That includes several different representatives of Homo – habilis, ergaster and erectus – and also paranthropoids in Africa around 2 Ma ago. As regards Homo, this cohabitation, especially that in Africa, supports two hypotheses: that our lineage was bush-like and involved separate extinctions and sudden appearances of new species (cladogenesis), or that the great variability in physiognomy (polymorphy) of modern humans extended back for a considerable time. The second is the view of Jonathan Kingdon, who believes insufficient hominid fossils have been collected to rule out polymorphism among tool-using and tool-creating beings. The idea of a single lineage since the first appearance of bipedal apes that led unerringly through gradual changes to modern humans (phyletic evolution) has been largely discarded. For at least part of the 6-7 Ma hominid record, that abandonment may have to be reconsidered, following a report of remarkably productive excavations in the Awash Valley of NE Ethiopia (White, T.D. and 21 others 2006. Asa Issie, Aramis and the origin of Australopithecus. Nature, v. 440, p. 883-889).

The Middle Awash is the single most productive area for hominid remains and for other fossils that help establish changes in their environment. That is so because of consistent collecting for more than two decades by a multinational team, co-led by Ethiopian and US palaeoanthropologists, from a sequence of flood-plain sediments over 1 km thick, liberally interlayered with dateable volcanic horizons. Its middle parts record three species, Ardipithecus ramidus, Australopithecus anamensis and Australopithecus afarensis (of which ‘Lucy’ was a member), in an age range from 4.42 to 3.88 Ma. White and the other members of the team have unearthed 30 new fossils of all three species, but, so far, no examples of more than one species in any particular thickness of sediment. Of course, ‘absence of evidence is not evidence of absence’, but this massive addition to the Pliocene hominid record is a challenge to the prevailing hypothesis of cladogenesis – Stephen J. Gould’s idea of punctuated equilibrium, in which species arise by the sudden appearance of new characteristics in descendants of earlier ancestors. Its test is whether or not ancestral species co-exist with new species for a time. In the Middle Awash, it seems that they do not, even though the critical 300 m of sediments represents only 200 thousand years.
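The co-existence test itself is simple interval logic: do the stratigraphic age ranges of ancestor and descendant overlap? The sketch below uses invented sub-ranges within the quoted 4.42-3.88 Ma envelope purely to show the check:

```python
# Sketch of the co-existence test for cladogenesis.  The age sub-ranges
# are invented for illustration; only the 4.42-3.88 Ma envelope comes
# from the text.  Ages in Ma as (older, younger), decreasing to present.
ranges = {
    "Ar. ramidus":   (4.42, 4.30),
    "Au. anamensis": (4.20, 4.10),
    "Au. afarensis": (4.00, 3.88),
}

def overlap(r1, r2):
    # Two (older, younger) ranges overlap if each begins before the
    # other ends.
    return r1[0] >= r2[1] and r2[0] >= r1[1]

species = list(ranges)
coexisting = [(a, b) for i, a in enumerate(species)
              for b in species[i + 1:] if overlap(ranges[a], ranges[b])]
print("co-existing pairs:", coexisting or "none")
```

Under cladogenesis at least one ancestor-descendant pair should overlap; the Middle Awash succession, as sampled so far, returns none.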

The three species, and their predecessor Ardipithecus ramidus kadabba (5.5-5.8 Ma), show variations in their teeth, with Ar. r. kadabba and Ar. ramidus sharing some similarities, and Au. anamensis and Au. afarensis others. The shift between the two sets of common dentition can be explained by either gradual change in a single lineage over about 2.5 to 3.0 Ma, or a sudden speciation event, perhaps around 4.5 Ma. The lack of overlap favours the first hypothesis. Complicating factors are rife, however, for there may have been migrations (Ar. ramidus is known from far to the south in Kenya), and yet more evidence will undoubtedly be found in the vast amount of sediment of this age in the Afar Depression.

See also: Dalton, R. 2006. Feel it in your bones. Nature, v. 440, p. 1100-1101.

Palaeodentistry

Those of a nervous disposition should not read this item.

A 7500 to 9000 year-old Neolithic graveyard in Pakistan has yielded remains of about 300 people who cultivated wheat, barley and cotton, and herded cattle. There is nothing remarkable in that, except that nine individuals have teeth that have clearly been drilled neatly (Coppa, A. et al. 2006. Early Neolithic tradition of dentistry. Nature, v. 440, p. 755). The holes are between 1 and 3 mm in diameter and up to 3.5 mm deep, and would have exposed sensitive parts of the tooth. Excavations of the nearby village of Mehrgarh have yielded tiny flint drill heads associated with beads of various ornamental materials, and the drills are of the same size as the tooth holes. Quite probably, miniature bow-drills tipped with flint were used on teeth for at least 1500 years – there is no evidence of tooth drilling from younger cemeteries in the area, despite abundant evidence of dental decay. Experiments show that such drills would take less than a minute to produce the neat holes, probably wielded by the bead-makers rather than by specialist dentists.

Asian Homo erectus skilled in tool making

The 1.8 Ma emigrants from Africa who first populated the Far East have not been regarded as having been especially inventive. While their ‘cousins’ in Africa developed the aesthetically stunning bi-face axe about 1.6 to 1.4 Ma ago (the first instance of visualising a finished object within a rough piece of raw material), H. erectus in East Asia is associated with the most primitive stone tools, made by simply breaking flinty stones. That seemed to have been the extent of their stone-working skills up to their final demise about 20 thousand years ago – not a lot of progress in 1.8 million years. A report in March at the Indo-Pacific Prehistory Association Congress (Manila) of yet-to-be-published work by Harry Widianto of Indonesia’s National Centre of Archaeology may force a revision of this less than charitable view of early Asians (Stone, R. 2006. Java Man’s first tools. Science, v. 312, p. 361). In the Solo district of Java, made famous by Eugène Dubois, who found the first fossils of H. erectus there, a wealth of finely worked flake tools has been discovered in sediments that are about 1.6 Ma old. Most are small and made from blood-red to beige, translucent chalcedony. It seems that necessity was the mother of invention in this case, because suitable materials for sharp tools are very scarce in Java.

Climate change and collapse of early civilisations

About 4200 years ago early civilisations of the Old World underwent decline and collapse. Examples are the Akkadian civilisation of the upper Tigris and Euphrates basins, the Harappan of the Indus Valley (Mohenjo-daro), the pharaonic Old Kingdom of Egypt and the Minoan of Crete. This period of the Bronze Age has been thought by some to have experienced either massive volcanism – the explosion of Santorini – or even a comet strike. Others have correlated collapses of city states with Biblical events. Whatever happened, its outcome spanned a vast area of western Asia and north-eastern Africa, so another candidate is climatic drying leading to drought and famine. That is perhaps not so spectacular a fate as near-instant environmental upheaval, but probably just as effective for societies dependent on regular agricultural production or, in the case of Crete, on wide-ranging trade.

Detecting climate change is now well established using proxy records of one kind or another, such as those based on isotopes and sedimentation changes in sea-floor sediments and cave flowstone (speleothem), and dust records in ice cores. Such time series from the mid- to late Holocene are increasing in number, with particular interest growing in records from speleothem now that precise age sequences are possible using uranium-series dating. A flowstone record from a cave in northern Italy has helped link other time series from the North Atlantic floor, the Middle East and East Africa (Drysdale, R. et al. 2006. Late Holocene drought responsible for the collapse of Old World civilizations is recorded in an Italian cave flowstone. Geology, v. 34, p. 101-104). A team of geochemists and environmental scientists from Australia, Italy and the UK has shown a remarkable coincidence among these widely different records, centred on 3900-4200 BP. In the high-latitude North Atlantic there is an upsurge in fragments deposited by ice rafting, while mean sea-surface temperatures swung downwards. Kilimanjaro ice shows a marked peak in atmospheric dustiness. Carbonate deposition peaked in the Gulf of Oman. Finally, the Italian flowstone shows peaks in δ18O, δ13C and the magnesium:calcium ratio of its carbonates. The conclusion is a period of climatic cooling and drying that spanned 40 degrees of latitude over a period of several hundred years. This is not the signature likely to have been associated with instantaneous catastrophes. Yet nor is it typical of the episodic climate shifts lasting of the order of a few thousand years, which are now well-known features of the last glacial period and the current interglacial. It was certainly sufficiently prolonged and large to have wrought havoc on early civilisations, and throughout the Old World it clearly did.
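The kind of multi-proxy coincidence the team reports can be illustrated with a toy exercise: flag the interval in which every record departs from its own baseline at once. The data below are entirely invented; the real series are far noisier and unevenly sampled:

```python
# Toy multi-proxy anomaly detection (invented data for illustration).
# Each series is sampled at the same 100-year steps; we flag times where
# every proxy departs from its own mean by more than one standard deviation.
from statistics import mean, stdev

times = list(range(3500, 4600, 100))          # years BP
proxies = {
    "ice-rafted debris": [1, 1, 2, 1, 1, 6, 7, 6, 1, 1, 2],
    "dust flux":         [3, 2, 3, 3, 2, 9, 8, 9, 3, 2, 3],
    "flowstone d18O":    [5, 5, 4, 5, 5, 12, 11, 12, 5, 4, 5],
}

def anomalous(series):
    # True where a value lies more than one standard deviation from the mean.
    m, s = mean(series), stdev(series)
    return [abs(x - m) > s for x in series]

flags = [anomalous(v) for v in proxies.values()]
shared = [t for t, *f in zip(times, *flags) if all(f)]
print("shared anomaly (years BP):", shared)  # → [4000, 4100, 4200]
```

The point of requiring all proxies to flag at once is that no single noisy record can create the event; that is essentially how a coincidence centred on 3900-4200 BP gains its force.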

Culture and human evolution

Culture, in the most general sense that encompasses tools, clothing, habitation and fire, has increasingly set humans and their ancestors apart from the rest of the natural world. It might therefore seem that becoming more ‘human’ cushions our line from Darwinian natural selection, since we have created our own ‘nature’ and carry it with us. Setting fully modern humans adrift in the environment without that culture would undoubtedly extinguish the species rapidly; in that hypothetical context we are far from ‘fit’, in Darwin’s sense. However, the development of humanity’s cultural milieu has itself provided a continually changing, increasingly pervasive artificial set of conditions for natural selection. Culturally, the most dramatic step in human evolution for which we have tangible evidence emerged with the explosive appearance of graphic art and a complex ‘toolkit’ around 35 thousand years ago in Europe. That huge advance will undoubtedly be traced back, perhaps by tens of millennia, when archaeological finds in Africa and Australia, for instance, are more precisely dated. Evidence from the DNA of male-carried Y chromosomes indicates that a profound genetic shift occurred around 70 ka, perhaps resulting from a decline in global human numbers to a very small population after the climatic disaster wrought by the explosive eruption of the Toba volcano in Indonesia. That too was a time when fully modern humanity distributed itself more thinly by a decisive exodus from Africa. Some specialists have speculated that the cultural explosion stemmed from that evolutionary ‘bottleneck’. There are genetic signs of adaptation to cultural practices and the selective pressures that accompanied them after the rise of agriculture and settlement (see Has human evolution stopped?, September 2005 issue of EPN). Recent work on the whole human genome gives an inkling that even more pervasive evolutionary changes took place in the last 50 thousand years (Wang, E.T. et al., 2005. Global landscape of recent inferred Darwinian selection for Homo sapiens. Proceedings of the National Academy of Sciences, www.pnas.org/cgi/doi/10.1073/pnas.0509691102).

Wang and colleagues from the University of California studied the occurrence of single-letter differences in the genetic code (single-nucleotide polymorphisms – SNPs). Scattered across all human chromosomes are about 1.6 million of these SNPs. They appear not to do anything themselves, but they can be linked to nearby genes: when natural selection favours a particular mutated variant of a gene, the associated SNPs are selected along with it. The approach used by Wang et al. is a statistical search for pairs of SNPs that occur together more often than would be expected from the chance ‘reshuffling’ that occurs from generation to generation. Their analysis suggests that around 1800 genes, a remarkable 7% of the total, have changed over the last 50 thousand years. Interestingly, that is similar to the degree of genetic change in maize since its domestication from its wild ancestor. As well as genes connected with protein metabolism, which could have changed as new diets followed the rise of agriculture, some that are involved in brain function have been selected too.
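The heart of the SNP-pair approach is a comparison of observed co-occurrence with the frequency expected under independent reshuffling. A minimal sketch with invented toy data (the published method involves many more safeguards):

```python
# Minimal sketch of an SNP-pair co-occurrence test (invented toy data).
# Each tuple is one sampled chromosome: 1 = variant allele present at
# that SNP, 0 = absent.  We compare observed joint frequency with the
# frequency expected if the two sites assorted independently.
chromosomes = [
    (1, 1), (1, 1), (1, 1), (1, 0),
    (0, 0), (0, 0), (0, 0), (0, 1),
]
n = len(chromosomes)
p_a = sum(a for a, _ in chromosomes) / n              # variant frequency, SNP A
p_b = sum(b for _, b in chromosomes) / n              # variant frequency, SNP B
p_ab = sum(1 for a, b in chromosomes if a and b) / n  # observed co-occurrence
D = p_ab - p_a * p_b  # linkage disequilibrium: excess over independence
print(f"observed {p_ab:.3f}, expected {p_a * p_b:.3f}, D = {D:+.3f}")
```

Persistent excess co-occurrence of this kind, surviving many generations of recombination, is what flags a genomic region as a candidate for recent selection.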

Although at an early stage, this kind of research confirms that we are indeed still evolving along Darwinian lines, perhaps unwittingly domesticating ourselves. It is easy to assume that ideas, skills and artistic sensibilities are passed on through language and learning and thereby grow and diversify, but the deep feelings that they stimulate suggest that some aspects have become ‘hard-wired’ in all of us. Everyone unconsciously taps their feet to rhythm, can be moved to a vast range of emotions by music, words and visual stimuli, and can ‘sense’ an environment captured, even in abstraction, by a talented artist; such responses inspire further development. Until around 50 ka, human culture, insofar as we can see evidence for it, had remained fixed for more than a million years through several species and subspecies of the genus Homo. Appearing between 1.6 and 1.4 Ma ago, the bi-face stone axe endured as humanity’s highest known achievement until those very recent times.

See also: Holmes, R. 2005. Civilisation left its mark on our genes. New Scientist, 24/31 December 2005 issue, p. 8.

Earliest tourism in northern Europe

Some years ago British palaeoanthropologists were in a state of high excitement about finds of stone tools, evidence of prolonged human habitation and fragmentary skeletal remains from a sandpit at Boxgrove on England’s southern coast. They showed the earliest human presence at high latitudes around 400-500 ka. The date of early colonisation has now been pushed back to 700 ka by finds in a shoreline exposure of riverine sediments on the coast of Suffolk, on England’s east coast. The Cromer Forest Bed of Middle Pleistocene age has been known since Victorian times as a rich source of the flora and fauna of one of the earliest interglacials of the current period of 100 ka climate cyclicity. At that time the North Sea had yet to establish the connection that would eventually separate the British Isles from Europe, and the site at Pakefield would have been the estuary of a now-vanished river system draining the Midlands and Wales. So far no human bones have turned up in the excavations, which have to be conducted at low tide, but many flint tools pepper the organic-rich sediments (Parfitt, S.A. et al., 2005. The earliest record of human activity in northern Europe. Nature, v. 438, p. 1008-1012). As with most terrestrial deposits, establishing the age of human occupation posed the greatest difficulty. Careful documentation of magnetic polarity, combined with fossils – including distinctive voles – and a new technique that relies on assessing the degree of protein degradation in bivalve shells, helped tie down the age precisely.

Around 800 ka human occupation had begun in Spain, and the Pakefield site shows that the northward migration of flora and fauna following a glacial epoch was swift, establishing conditions considerably warmer than in the Holocene. It seems that this Mediterranean climate encouraged such northward penetration by humans, most likely during a short period of particular warmth. Long eyed by archaeologists as a potential source of human remains, the Cromer Forest Beds have at last repaid that patience. Yet around the world there are many other, equally promising strata of Pleistocene age that have not had such undivided attention for so long. A glance at the distribution of keynote sites for palaeoanthropology shows how narrow the search for human origins and migratory destinations has been up to now, though it is understandable that once finds have been made, funds and scientists cluster where progress is best guaranteed. Very rarely, either a ‘shot in the dark’ pays off or something surprising turns up at a site being excavated for other purposes. Broadening the search may well carry high financial and career risks, yet the more discoveries are made at well-trodden sites, the greater the likelihood that the full story of human evolution and migration will be revealed by breaking new ground.

See also: Roebroeks, W. 2005. Life on the Costa del Cromer. Nature, v. 438, p. 921-922.

Biogeochemical evidence for vegetation change when hominins evolved

A long-held theory concerning the background to hominin evolution is that the freeing of the hands by bipedalism was triggered by a shift in the ecology of East Africa from forest to more open grassland.  That might well have happened as the Neogene uplift associated with development of the East African Rift transformed the regional wind and rainfall patterns to the way they are today, thereby creating the conditions for the modern savannahs and semi-deserts of the area long associated with human origins.  The lakes of East Africa are ephemeral in the context of Neogene climate change, and so their sediments are not much use in charting long-term shifts in flora.  However, the modern wind systems shift dust and organic particles consistently towards the Gulf of Aden, so sediment cores there potentially provide a continuous record of vegetation change – provided they contain ‘biomarkers’ that distinguish the debris of trees from that of grasses. The first biomarker records from the Gulf of Aden seabed powerfully confirm the notion of vegetation change as a possible driver of hominin evolution (Feakins, S.J. et al., 2005. Biomarker records of late Neogene changes in northeast African vegetation. Geology, v. 33, p. 977-980).

Up to about 3.5 Ma the cores contain plant-derived waxes characteristic of trees, which use C3 metabolic processes, but thereafter evidence for increasing C4 grasses predominates.  Coinciding with that broad trend is an enrichment in 13C in soil carbonates on land, which probably reflects increased grassland too.  Although records of hominin diversity before about 3 Ma are scanty, later times saw the rise of several bipedal species, grouped as the powerfully jawed paranthropoids and the more daintily chewing members of the lineage that led to modern humans. Detail in those sections of marine core that were used – presumably costs prevented continuous measurements – shows that the carbon-isotope signals in the waxes varied in harmony with evidence for climate change, so the proportions of savannah and woodland probably shifted quite rapidly.  However, because cold-dry periods have tended to be longer than those which were warm and more humid, savannah would have had more influence on faunas than ephemeral woodland. Fascinating as this empirical relationship between hominin evolution and vegetation change is, what Africa lacks – as indeed does most of the planet – is data that chart accurately how topography has changed with time. Cosmogenic and U-Th/He apatite thermochronology, on which so much hope and funding have been invested, have proved spectacularly ineffectual compared with careful work on the likely effects of changing landforms.

The geological sources of myths

Sitting on top of the Kremlin, beside Red Square, is a huge five-pointed red star that is illuminated at night.  This is not just a relic of Stalin’s Soviet Union: it has its origins in a common myth that shows up concretely in archaeological digs, particularly in the Middle East, in the form of collections of fossil sea urchins and starfish. They, of course, possess the five-fold symmetry unique to the Echinodermata, which also figures in the emblematic pentagram of Dennis Wheatley’s awful novels about satanism and on the pointed hats of latter-day wizards and warlocks. I learned of this fascinating link between geology and symbolism at a session on Geology and Mythology at the 32nd International Geological Congress in Florence (August 2004). This branch of geoscience seems destined to thrive, and Kevin Krajick has helped ensure that it does by reviewing a range of geo-inspired myths (Krajick, K. 2005. Tracking myth to geological reality. Science, v. 310, p. 762-764). His examples range from Pitman and Ryan’s hypothesis linking the flood myth of the Near East, first recorded in the Epic of Gilgamesh, to catastrophic filling of the Black Sea basin as sea level rose and spilled through the Bosporus around 7600 years ago, to the Oracle of Delphi. The most interesting and useful are those myths that incorporate an implicit warning of risk. Among these are pictograms of two-headed serpents, carved by native people of the NW United States, which are reputed to shake the ground; these a’yahos are found around major active fault zones. Cameroonian taboos include some that relate clearly to exhalations of carbon dioxide from crater lakes, as happened with disastrous effects at Nyos in 1986. The seafaring Moken of western Thailand have a tradition that a rapidly falling tide presages a man-eating wave: no Moken died in the 26 December 2004 tsunami, despite living on a shore that was badly hit.

Growing evidence for ‘hobbits’

Various shenanigans within the Indonesian palaeoanthropology community have hindered evaluation of all the evidence surrounding the diminutive adult female skeleton found in Liang Bua cave on Flores in 2003.  Her skull was damaged after prolonged examination by a leading national figure in the science, and further excavation in the cave has now been blocked indefinitely. Whether she is indeed a member of a new species of hominin, Homo floresiensis, or merely an individual modern human dwarfed by some genetic defect, as some claim, seems closer to resolution (Morwood, M.J. and 10 others 2005.  Further evidence for small-bodied hominins from the Late Pleistocene of Flores, Indonesia. Nature, v. 437, p. 1012-1017). During the 2004 field season at Liang Bua the Australian-Indonesian team unearthed remains of nine other individuals of similarly diminished stature. They included another jaw bone that is virtually identical to that of the first ‘hobbit’: neither has the chin that unites all fully modern humans.  Significantly, the new piece of lower jaw is dated at some 3 ka older than the original, so the chances of both being from physiologically unfortunate modern humans are remote.

The new finds also include stone tools, more advanced than any found in association with one of H. floresiensis’s possible ancestors, H. erectus.  Whoever they were, the ‘hobbits’ also butchered prey and cooked meat.  There is negative evidence in support of the new species hypothesis too: compared with human sites of the Late Pleistocene, Liang Bua is conspicuously lacking in evidence for any form of art. But the idea is not proven.  It would take a definite association between fossils and tools, as for instance in a burial, to show that the implements belonged to ‘hobbits’ rather than having been introduced by a fully human visitor. Moreover, should any evidence for moderns be found in Liang Bua or other caves of interest, the possibility of mixture of cultures and fossils would leave things up in the air.

It is worth noting that Indonesian scientists are not the only ones prone to obstructive tactics as regards hominin sites. They have long been a bone of contention throughout Africa, where both local and visiting scientists have tried to throw spanners in their colleagues’ research ambitions.

See also: Dalton, R. 2005. More evidence for hobbit unearthed as diggers are refused access to cave. Nature, v. 437, p. 934-935; Lieberman, D.E. 2005. Further fossil finds from Flores. Nature, v. 437, p. 957-958.

Congenital disease, human migration and population growth

The way in which genetic features are inherited has become a key tool in distinguishing human populations, the times and routes of their migrations as separate groups, and when they merged with other groups.  The most familiar outcomes are those based on mitochondrial DNA and lines of female descent, which show, with little room for manoeuvre, that all of us descend from Africans alive around 150 to 200 ka.  Studies of the male Y chromosome help fine-tune the record to show short periods when either populations fell so low that human survival passed through only a few small bands (e.g. around 70 ka) or Big Men corralled most women for their own purposes (the now famous case of Genghis Khan’s genes still dominating the genetics of Central Asian peoples). Dennis Drayna of the US NIH outlines yet another revealing feature of genetics with historical connotations in the October 2005 issue of Scientific American (Drayna, D. 2005. Founder mutations. Scientific American, v. 293(4), p. 60-67).

Disabling congenital diseases, such as cystic fibrosis and sickle-cell anaemia, together with adverse reaction to alcohol and the ability of adults to tolerate the lactose in milk, are all passed down the generations in different ways. Understanding of the genetic processes involved stems largely from medical research on genetic mutations aimed at identifying groups that are at risk.  From it have emerged details of the structure and location of the responsible genes in chromosomal DNA.  Inherited conditions involve either different mutations in a single gene, or an identical change at a specific location.  Of the latter, the most common seems to be an innate tendency in DNA for the same mutation to affect a specific gene – a so-called ‘hot-spot’ mutation, which arises independently in unrelated individuals.  More rare is a defect embedded in a length of DNA (a haplotype) whose structure is identical in all those who carry the mutation. That common identity suggests that the mutation arose once and has been passed down ever since: a ‘founder’ mutation. It is this shared mutant structure that unites the four examples above.

Since a ‘founder’ mutation arose at some specific time in the past, it can potentially be used to trace population history, and so passes into the realm of palaeoanthropology. The fascinating and most useful feature is that the greater the separation in generations from the individual in whom the mutation occurred, the shorter the shared haplotype becomes, in terms of its length of DNA.  That phenomenon is a consequence of sexual recombination among descendants: in the founding individual the whole chromosome is the haplotype, and the region around the mutation becomes increasingly ‘diluted’ with time.  Measuring its length today therefore harks back to the time of foundation.  What has become clear is that not all founder mutations have obvious consequences. Instead of being carried by as few as one in a million of a population (the general case for mutations that cause disability and therefore suffer adverse natural selection), some founder mutations are carried by a few percent of people. Such abundance indicates either neutral effects or some subtle benefit to fitness.  Diseases ascribed to them appear only when both parents contribute the mutation: most are recessive. 
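The dilution described above gives a rough clock. On the simplified assumption of about one recombination per 100 centimorgans per meiosis, the shared segment flanking the mutation shrinks to roughly 100/g cM per side after g generations, so its observed length dates the founder. All numbers below are assumed for illustration:

```python
# Back-of-envelope founder dating from haplotype length (assumed figures).
# Each meiosis recombines with probability ~1% per centimorgan (cM), so
# the expected shared segment on one side of the mutation after g
# generations is roughly 100/g cM; invert to estimate g from a length.
shared_cM = 2.0                # assumed shared haplotype, one flank, in cM
generations = 100 / shared_cM  # expected generations back to the founder
years = generations * 25       # assumed 25-year generation time
print(f"~{generations:.0f} generations, ~{years:.0f} years to the founder")
```

On this crude model a long haplotype like that of the HFE mutation implies a recent founder (of the order of 1 ka), while an exceptionally short one implies thousands of generations, in keeping with the ~100 ka foundation of the non-taster mutation discussed below.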

A good example is a mutation of the HFE gene that confers above-normal iron absorption, a decided advantage as protection against anaemia on an iron-deficient diet. An individual with two copies vastly overcompensates, and iron accumulates to deadly levels in their cells. Studies of its incidence in global populations indicate that it arose in Ireland, western Britain and Brittany and then spread south-eastwards. It appears to be a Celtic trait, although one acquired not in the Celts’ original heartland in Central Europe but at the limit of their migration more than 2000 years ago. Its haplotype is quite long and suggests a founder around 800 AD. There are no records of significant late Celtic migrations, and quite possibly the spread was through wide-ranging Vikings, who dominated parts of the western British Isles at that time. A more fascinating case is the founder mutation that prevents people who carry it from tasting bitterness. Most people do experience bitter tastes, which is very handy for avoiding toxic plants; about 25% do not. Maybe the mutation involved conferred some advantage, but the fact is that the haplotype is exceptionally short, representing a foundation at about 100 ka. It occurs in Africa along with 6 variants of the bitter-taster gene, yet beyond that continent only one taster form and the non-taster form occur commonly. That tallies with the hypothesis of the major movement out of Africa to populate the rest of the world with modern humans, around 75 ka ago. The surveys go intriguingly further: had descendants of those African migrants bred successfully and regularly with earlier Eurasian hominins (Neanderthals and Erects), then non-African versions of the bitterness-detecting gene ought to be present among non-African populations. Not one ‘alien’ haplotype has been detected, and this novel approach seems to have laid to rest that particularly intriguing bit of speculation.

Climate change and human evolution


One clear character of the record of investigations into human evolution is that, rather than becoming clearer as data accumulate, our origins become more of a puzzle. With every major fossil find the hominin clade or bush of descent acquires what appears to be another branch. With the recent publication of the genome of our closest living relative, the chimpanzee, and of its earliest fossil remains (Nature, v. 437, p. 47-108), it will hardly be surprising if the assumptions behind a gene-based time of separation of the two clades (5-7 Ma) come into question. Studies of the Y-chromosomes of living human males have suggested ‘bottlenecks’ in our recent evolutionary past, interpreted to indicate near-catastrophic declines in numbers, perhaps to a few scattered bands. One such ‘near-extinction’ seems to have occurred about 70 thousand years ago, and has been linked to the huge explosion of the Toba ‘supervolcano’ in Indonesia, in whose ash biface axes are poignantly preserved. Toba would have had a global climatic effect at a time when fully modern humans were migrating rapidly from Africa across Eurasia; thinly spread and easily isolated by disaster. What followed was an explosive development of both material and aesthetic culture, perhaps enabled by some serious selection amongst those who endured Toba’s global blast.

It is always tempting to restrict hypothesising with the ‘Just the facts’ outlook, as people of my generation will remember from the main detective in the Dragnet TV series: that is, to ideas based on hominin remains alone. Yet all evolution takes place within a wider environmental context; for much of our history that of East Africa. Scanty knowledge of tropical climates there and a reliance on distant deep-sea records led to the widespread belief that this centre of most hominin evolution became steadily drier from the late Miocene onwards. Lake beds in the East African Rift system have held the key to a more useful record, and now some of the detail is emerging (Trauth, M.H. et al. 2005. Late Cenozoic moisture history of East Africa. Science, v. 309, p. 2051-2053). Lakes in the Rift are handy for climate study because they span 8 degrees of latitude north and south of the equator, a spread that helps to separate the local effects of volcanism and tectonics on their sedimentary record from those of regional climate change. Many have little outflow and a local supply of water, so their levels depend mainly on the balance between local precipitation and evaporation. The actively subsiding basins in which they form can preserve unbroken, thick records of both lake and river sediments.

Trauth et al. compile environmental and chronological information from sediments in seven Rift basins, going back to about 3 Ma. Volcanic events provide plenty of dating opportunities to calibrate and correlate the sedimentary evidence. They show three rift-long episodes of deep lakes, spanning broad periods at 2.7-2.5, 1.9-1.7 and 1.1-0.9 Ma. A few sections reveal lake-level fluctuations on Milankovitch timescales. The longer episodes link in time, respectively, to the intensification of Northern Hemisphere glaciation, to a shift in east-west air circulation over Africa and to the switch in dominant glacial cyclicity from 41 ka to 100 ka. Wisely, the authors consider the climatic information to be crucial to studies of human evolution, but still too coarse to be used with confidence in relation to details of the fossil record. Long humid periods would have been ‘easy’, whereas the intervening drier periods may have experienced ups and downs in humidity on Milankovitch timescales. Fluctuating conditions would have been more stressful and more likely to witness speciation. One very odd feature is that the 1.9-1.7 Ma period of deep rift lakes is the time when H. erectus became the first tooled-up being to migrate far beyond Africa. Many have regarded migration as a response to environmental stress, but just as likely is an expansion of opportunity.

Has human evolution stopped?

There can be no doubt that the way in which humans consciously build ‘shields’ of many kinds between themselves and their surroundings placed our species, and those leading up to it, in an increasingly different relationship to the environment from that of other organisms. Fire, habitations, tools, weapons and clothing emerged far back in our evolutionary ‘bush’, to be followed more recently by artificial means of feeding ourselves in a vast range of climatic conditions. In the last century these ‘shields’ have been added to by medical protection against pathogens.

Many of the physical traits of the modern human frame would not be ‘fit’ in a purely Darwinian sense for life unprotected by myriad cultural devices: they arose from genetic potential largely because growing human culture allowed them to be fit for purposes other than survival at its simplest level. The range of basic physiognomies among modern humans does seem to reflect natural selection suited to various climatic regions, such as the differences between cold- and heat-adapted peoples. That perhaps began during the great expansion out of Africa some 70 ka ago. But the much greater range of facial characteristics within all populations (a really human characteristic compared with other primates) is probably a result of random genetic drift rather than any kind of evolutionary selection. There are also differences that have arisen since the widespread adoption of agriculturally produced foods about 10 ka ago, as in the shapes of jaws and skulls, probably linked to easier mastication. These are most easily explained by the way in which the use of muscle tends to sculpt the bone to which it is attached: the change arises during the life of the individual.

With what appears to be the start of a global unification of cultures, and greater security for the more fortunate one third of humanity at least, it might be expected that natural selection is on the wane for humans. A mere 10 thousand years since the rise of agriculture, and far less since modern cultures arose, it is perhaps too soon to conclude that we have cut loose from Darwinian processes. Indeed, recent genetic research has turned up several developments that must be recent results of natural selection. One is the split between adults who can metabolise cows’ milk and those who cannot. The first group, a minority, clusters around the Near East (it includes most Europeans) and in a few parts of Africa where cattle domestication arose. A large block of the human genome, about a million base pairs of nucleotides, includes the gene that produces the necessary enzyme, lactase, and the block persists intact in those adults able to digest milk. The large size of the whole haplotype is typical of recent genetic developments, and the researchers are confident that it resulted from selective pressure where dairy farming began, between 5 and 10 ka.
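The quoted block size and age can be cross-checked with a standard rule of thumb (an assumption, not from the article): recombination trims the haplotype around a selected variant to roughly 200/g centimorgans after g generations, and 1 million base pairs corresponds on average to about 1 cM in the human genome.

```python
# Rough cross-check of the quoted age of the lactase haplotype.
# Both conversion factors below are assumed rules of thumb, not
# figures taken from the article.

haplotype_cM = 1.0                  # the ~1 million base-pair block quoted above
generations = 200.0 / haplotype_cM  # ~200 generations since selection began
age_years = generations * 25.0      # at ~25 years per generation
print(age_years)                    # 5000.0, inside the quoted 5-10 ka window
```

On these assumptions a megabase-scale haplotype implies only a few hundred generations of recombination, consistent with selection beginning alongside early dairy farming.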

Genes that confer resistance to infectious diseases that can cut life short before successful reproduction are good candidates for showing the effects of natural selection, especially in areas where medical care and drugs are not available. For a long while natural resistance among some West Africans to malaria parasites was linked with heritable sickle-cell anaemia, but recent research has shown a more complex basis that involves several genes. Interestingly, ‘dating’ of the associated genetic changes gives recent ages between 3 and 6 ka, perhaps linked to the rise of farming practices. Clearing land and ponding water on fields would have encouraged the malaria-carrying Anopheles mosquitoes, which are not forest species: a cultural change presaged a genetic one. Similar results have emerged from studies of inherited protection against HIV/AIDS, yet that disease only appeared in pandemic form very recently (unless misidentified earlier). An explanation may centre on selective pressure favouring the protective mutation during earlier epidemics, such as plague and smallpox among early Europeans, who seem to have the highest resistance to HIV/AIDS.

So it is hard to say whether selective pressures will continue to work on the human genome as cultural convergence proceeds and living standards become, hopefully, more equitably shared. Since the limit on human brain size is the skull, and that is limited by the maximum pathway through the human female pelvis, it is very difficult to imagine our evolution into big-heads.

Source: Balter, M. 2005. Are humans still evolving? Science, v. 309, p. 234-237.

The route and the pace out of Africa

Tool-making hominin species left their African homeland several times in the past, the earliest being shortly after the appearance of Homo erectus, about 1.8 Ma ago. Those early migrants ended up in eastern Asia, where they thrived until as recently as 12 thousand years ago (if indeed H. floresiensis does prove to be a miniature Erect). Europe was reached by at least three waves: possibly advanced H. erectus around 0.5 Ma; Neanderthals as early as 0.25 Ma; modern humans around 40 thousand years ago at the earliest. The fully modern human record in Asia begins at 67 thousand years ago, suggesting an exodus from Africa between 80 and 70 thousand years ago. There is an oddity here: simple geography suggests that Europe should have been colonised first in each wave out of Africa, because it is closer. But the Nile to Middle East to Europe route was not successfully used by our immediate forebears until long after they moved eastwards, although there is evidence of temporary occupation of parts of Palestine by H. sapiens between 100 and 80 thousand years ago. Several reasons have been suggested, including the possibility of direct competition with Neanderthals, who occupied the same Middle Eastern sites around 100 ka, and the relative difficulty of passage along the Nile compared with a coastal route in NE Africa.

Eritrean and US archaeologists have shown that around 100 ka the Eritrean coast was occupied by humans who subsisted on seafood: always available whatever the climate, whereas the potential for terrestrial game fluctuates. That has led to the suggestion that the Africans who colonised Asia and Australasia left by island-hopping across the narrow Straits of Bab el Mandab when sea level began to fall around 70 ka. A coastal route, well stocked with food, would have allowed rapid movement eastwards. That seems intuitively likely, because an eastward route through the Middle East is barred by deserts, which would have been even more arid as glacial conditions developed. Moreover, a Middle Eastern route would have led more directly to Asia Minor and ultimately Europe. The conundrum deepens, since the Straits of Bab el Mandab would have been even easier to cross at the time of the last glacial maximum, around 20 ka, yet there are no archaeological signs of populations of that age in Yemen and Oman, although research there has hardly begun. Unravelling routes is possible, just, by analysing modern population genetics (Macaulay, V. et al. 2005. Single, rapid coastal settlement of Asia revealed by analysis of complete mitochondrial genomes. Science, v. 308, p. 1034-1036). People living in the Andaman Islands and on the Malaysian Peninsula include groups who differ substantially from their neighbours and may be descendants of the original colonisers. Mitochondrial DNA from these groups indicates a branching from an original type around 65 ka, remarkably suggesting a single founding woman. That cannot be taken exactly at face value, but it does suggest that only a small band migrated to these two areas, perhaps no larger than a few hundred people. The fact that they reached the Andaman Islands may indicate that theirs was a boat-using culture. Whatever the means, movement was rapid, possibly as fast as 4 km per year, thereby allowing the early colonisation of Australia.
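The quoted pace can be sanity-checked with back-of-envelope arithmetic. The coastal route length below is an assumption, not a figure from the article: very roughly 12,000-15,000 km from the Horn of Africa to northern Australia.

```python
# Back-of-envelope transit time at the quoted dispersal rate of ~4 km/yr.
# The route lengths are assumed round numbers, not from the article.

def transit_time_years(route_km, km_per_year=4.0):
    """Years to traverse a route at a steady coastal dispersal rate."""
    return route_km / km_per_year

for route_km in (12_000, 15_000):
    print(route_km, "km ->", int(transit_time_years(route_km)), "years")
```

At around 4 km per year the whole journey takes only three to four thousand years, so a branching around 65 ka comfortably accommodates an early human presence in Australia.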

Analyses of mtDNA in Africa suggest that about 85 ka ago there was a major expansion of people whose descendants make up more than two thirds of modern Africans. Could it be that this expansion reflected climatic and ecological change, with migrants from elsewhere putting such competitive pressure on the inhabitants of the Red Sea coast that they crossed the daunting Straits of Bab el Mandab? Perhaps similar pressure was the driving force as late as 40 ka, when modern humans reached Europe itself, undoubtedly along the Middle East route.

See also: Forster, P. & Matsumura, S. 2005. Did early humans go north or south? Science, v. 308, p. 965-966.

Changing the world

Because humanity and its activities have transformed the vegetated face of our home planet, caused its climate to warm and pushed an increasing number of other species over the edge of extinction, some circles have coined the name “Anthropocene” for the last half of the Holocene Epoch. Human-induced change almost certainly began as soon as settled agriculture arose to dominate most societies (see Did the earliest agriculture kick-start global warming?, in EPN of April 2005). In terms of atmospheric emissions and the mobilisation of metals, human fluxes now come close to natural rates: facts that emerge from annual reviews of mining and energy use. But are we truly significant geological agents as well as influences on the atmosphere and biosphere? Two articles in April 2005 suggest that we are.

Quarries, mines and other excavations are obvious signs of human erosive power, but our farming activities produce more insidious results by inducing soil erosion. Although its effects are well known from such areas as the Ethiopian Highlands and the 1930s “Dust Bowl” of the US Midwest, a global measure of the rates involved requires a careful compilation of quantitative data. Bruce Wilkinson of the University of Michigan has made the first attempt (Wilkinson, B.H. 2005. Humans as geological agents: A deep-time perspective. Geology, v. 33, p. 161-164). Throughout the Phanerozoic, the volume of sedimentary rocks suggests that enough erosion has taken place to have stripped a uniform blanket 3 km deep from the continental surface. That gives an average erosion rate for the last half-billion years of Earth history of the order of tens of metres per million years. Assembling information about current rates of human-induced stripping, roughly divided 30:70 between excavation and soil erosion, Wilkinson arrives at a staggering figure for anthropogenic denudation: hundreds of metres per million years. Our activities in the outer part of the rock cycle proceed an order of magnitude faster than purely natural weathering, erosion and transportation. He suggests that humanity began to outpace natural denudation sometime around the time of the Norman Conquest.
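The comparison rests on simple arithmetic, sketched below from the figures quoted above. The ~540 Myr Phanerozoic duration, and 300 m/Myr as a stand-in for ‘hundreds of metres per million years’, are assumptions for illustration, not values from the article.

```python
# Rough arithmetic behind the natural-versus-anthropogenic comparison.
# The Phanerozoic duration and the representative anthropogenic rate
# are assumed round numbers, not figures from the article.

stripped_m = 3000.0            # ~3 km stripped from the continental surface
phanerozoic_myr = 540.0        # assumed length of the Phanerozoic, Myr

natural_rate = stripped_m / phanerozoic_myr      # metres per Myr
print(round(natural_rate, 1))                    # ~5.6 m/Myr from these round numbers

anthropogenic_rate = 300.0     # assumed stand-in for 'hundreds of m/Myr'
print(round(anthropogenic_rate / natural_rate))  # a factor of ~50
```

Even with generous rounding, human stripping outpaces the long-term natural average by at least an order of magnitude.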

This awesome picture might seem to indicate that rates of sediment deposition on continental margins are also tremendously elevated by our actions. That aspect has been studied by geoscientists from the US and the Netherlands (Syvitski, J.P.M. et al. 2005. Impact of humans on the flux of terrestrial sediment to the global coastal ocean. Science, v. 308, p. 376-380). In fact, the opposite is happening. Syvitski et al.’s analysis of historical sediment loads in the catchments and lower reaches of the world’s major rivers shows that while overall sediment transport has increased by 2.3 billion t per year since human effects became noticeable in the sedimentary record, the amount delivered to the sea has fallen: some 1.4 billion t per year no longer adds to marine sedimentation. Instead, that mass ends up behind dams of one kind or another. In the last 50 years more than 100 billion t of sediment, containing 1 to 3 billion t of carbon, has accumulated in silted-up reservoirs or been redistributed to farmland by irrigation diversions. One outcome is that natural coastal protection by spits and sand bars is growing less effective. Another is that fewer nutrients are reaching the near-shore marine biosphere, with possible effects on fish stocks, coral reefs and other habitats.
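The quoted fluxes imply a simple mass balance, sketched below. The final inference, how much sediment is intercepted before reaching the sea, is my reading of the two quoted numbers, not a figure stated in the article.

```python
# A minimal sediment mass balance from the two fluxes quoted above.
# Units: billions of tonnes (Gt) per year. The interception figure is
# an inference from these numbers, not a value given in the article.

transport_increase = 2.3   # extra sediment mobilised by human activity
delivery_decrease = 1.4    # fall in sediment actually reaching the sea

# Modern delivery = pre-human baseline + increase - interception, so a
# net fall of 1.4 Gt/yr implies interception of the sum of the two:
intercepted = transport_increase + delivery_decrease
print(round(intercepted, 1))   # ~3.7 Gt/yr held back behind dams and diversions
```

An interception rate of that order, building up as dam construction accelerated over the past half-century, is consistent with the cumulative figure of more than 100 billion t quoted above.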

Caring among the Erects

Dmanisi in Georgia provided one great surprise in human evolution by yielding abundant remains of 1.7 Ma-old Homo erectus where they might be least expected: in the Caucasus region, beyond mountain barriers that would have confronted any migration from further south. The archaeological sites have provided another surprise in the form of a well-preserved skull of a completely toothless individual. It is clear from the regrowth of bone into the sockets that this “masticatorily impaired” individual survived for years after losing all their teeth (Lordkipanidze, D. et al. 2005. The earliest toothless hominin skull. Nature, v. 434, p. 717-718). It is impossible to believe that the individual could have survived on a tough meat-and-vegetable diet without special preparation of soft victuals. Although the person’s survival cannot prove that other Erects helped out, that is a distinct possibility. Losing teeth through dental disease or trauma would have been immensely painful and debilitating, yet the individual did survive. We have to move forward to around 40 thousand years ago for comparably compelling evidence that Neanderthal society cared for disadvantaged people, when several near-complete skeletons show evidence of long-term, crippling damage.