Probing the Earth’s mantle using noise

Artistic impression of a global seismic tomogram – beneath a Mercator projection – dividing the mantle into ‘warm’ and ‘cool’ regions (Credit: Cornell University Geology Department – http://www.geo.cornell.edu/geology/classes/Geo101/graphics/s12fsl.jpg)

It goes without saying that it is difficult to sample the mantle. The only direct samples are inclusions found in igneous rocks that formed by partial melting at depth, so that the magma incorporated fragments of mantle rock as it rose, or where tectonics has shoved once very deep blocks to the surface. Even if such samples were not contaminated in some way, they are isolated from any context. For 20 years geophysicists have been analysing seismograms from many stations across the globe for every digitally recorded earthquake, to use in a form of depth sounding. This seismic tomography assesses variations in the speed of body (P and S) waves according to the path that they travelled through the Earth.

Unusually high speeds at a particular depth suggest more rigid rock and thus cooler temperatures, whereas hotter materials slow down body waves. The result is images of deep structure in vertical 2-D slices, but the quality of such sections depends, ironically, on plate tectonics. Earthquakes, by definition, mainly occur at plate boundaries, which are lines at the surface. Such a one-dimensional source for seismic tomograms inevitably leaves the bulk of the mantle as a blur. But there are more ways of killing a cat than drowning it in melted butter. All kinds of processes unconnected with tectonics, such as ocean waves hitting the shore and interfering with one another across the ocean basins, plus changes in atmospheric pressure especially associated with storms, also create waves similar in kind to seismic ones that pass through the solid Earth.

Such aseismic energy produces the background noise seen on any seismogram. Even though this noise is way below the energy and amplitude associated with earthquakes, it is continuous and all-pervading: its cumulative energy is enormous. Given highly sensitive modern detectors and sophisticated processing, much the same kind of depth sounding is possible using micro-seismic noise, but for the entire planet and at high resolution. Rather than imaging speed variations, this approach can pick up reflections from physical boundaries in the solid Earth. Surface micro-seismic waves of exactly the same kind as Rayleigh and Love waves from earthquakes have already been used to analyse the Mohorovičić discontinuity between crust and upper mantle, as well as features in the continental crust; indeed the potential of noise was recognised in the 1960s. But the deep mantle and core are the principal targets, being far out of reach of experimental seismic surveys using artificial energy input. It seems they are now accessible using body-wave noise (Poli, P. et al. 2012. Body-wave imaging of Earth’s mantle discontinuities from ambient seismic noise. Science, v. 338, p. 1063-1065).

Poli and colleagues from the University of Grenoble in France, together with co-workers in Finland, used a temporary network of 42 seismometers laid out in Arctic Finland to pick up noise, and sophisticated signal processing to separate surface waves from body waves. Their experiment resolved two major mantle discontinuities at ~410 and 660 km depth that define a transition zone between the upper and lower mantle, where the dominant mineral of the upper mantle – olivine – changes to a more closely packed crystal structure akin to that of the mineral perovskite that is thought to characterise the lower mantle. Moreover, they were able to demonstrate that the two-step shift to perovskite occupies depth intervals of about 10-15 km.
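The kernel of such noise methods is cross-correlation: noise recorded at two stations contains energy that has travelled between them, so a long cross-correlation of the two records peaks at the inter-station travel time. A minimal synthetic sketch (the sampling rate, delay and noise levels are illustrative, not those of the Poli et al. experiment):

```python
import numpy as np

rng = np.random.default_rng(42)
fs = 20.0                # sampling rate in Hz (illustrative)
n = 6000                 # five minutes of 'noise'

source = rng.standard_normal(n)        # shared ambient noise field
delay = 60                             # imposed 3 s inter-station travel time
sta_a = source + 0.5 * rng.standard_normal(n)
sta_b = np.roll(source, delay) + 0.5 * rng.standard_normal(n)

# cross-correlate the two records; the peak lag recovers the travel time
xcorr = np.correlate(sta_b, sta_a, mode="full")
lags = np.arange(-(n - 1), n)
best = lags[np.argmax(xcorr)]
print(best / fs)         # → 3.0 (seconds)
```

In practice the effort lies in stacking many day-long correlations and in filtering to separate surface-wave from the much weaker body-wave energy.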

Applying the method elsewhere doesn’t need a flurry of new closely spaced seismic networks. Data are already available from arrays aimed at conventional seismic tomography, such as USArray, which deploys 400 portable stations in area-by-area steps across the United States (http://earth-pages.co.uk/2009/11/01/the-march-of-the-seismometers/).

It is early days, but micro-seismic noise seems very like the dreams of planetary probing foreseen by several science fiction writers, such as Larry Niven who envisaged ‘deep radar’ being deployed for exploration by his piratical hero Louis Wu. Trouble is, radar of that kind would need a stupendous power source and would probably fry any living beings unwise enough to use it. Noise may be a free lunch to the well-equipped geophysicist of the future.

  • Prieto, G.A. 2012. Imaging the deep Earth. Science (Perspectives), v. 338, p. 1037-1038.

Breakthrough in human tools: the scene shifts to Africa

A means of assessing the cognitive abilities of hominins is through the objects that they created, whether tools or artefacts with apparent symbolic significance. The latter include pigments, coloured shells, beads, artwork or even deliberately parallel and crossing lines gouged on otherwise innocuous rock. Undoubtedly valuable to their creators, possibly treasured and passed on until lost or broken – most are fragile – symbolic artefacts are rare. So although they shout ‘thoughtful’, their age tells us little about when such a capacity first arose. Many archaeologists and palaeoanthropologists assert that creating and/or manipulating symbols may signify a link with being able to speak. Tools are a lot easier to find, probably as discards and lost items, and a well-described and understood sequence of forms and sometimes uses has been established, which extends as far back as perhaps 3 Ma – before the genus Homo appeared.

In terms of what they reveal about the consciousness of their makers and users, there are possibly four major recognisable steps. Chimpanzees and some birds can learn to pick up natural objects, such as stones and twigs, and use them: some bands of chimps even retain the knowledge. A step beyond that is preparing a natural object for use, as with breaking a pebble to create a cutting edge: something not exclusively human, because it is possible that pre-human hominins created the earliest such Oldowan tools. Being able to visualise hidden potential inside something natural is altogether more advanced, and is represented by the iconic bi-face or Acheulean ‘hand-axe’. Its earliest makers, H. ergaster and H. erectus, literally brought such objects to light by skilfully knapping away the outer parts of substantial lumps of suitable rock. The knowledge endured for more than a million years but was eventually added to and superseded by a range of more delicate and specific stone tools, though these more sophisticated tools represented the same ‘liberation’ of a simple idea held in rock. The fourth general cognitive leap was to add several resources together as composite tools, and arguably we have not long emerged from that phase with the creation of composite tools that help us design and make other tools: a machine-tool culture.

Example of a microlith: a backed-edge bladelet (credit: Wikipedia)

It is that penultimate step-up in consciousness that has been engaging archaeologists since they first realised that some small, sharp chips of stone were not waste but deliberately crafted for combination with wood or bone. Such ‘microliths’ have been found in intact arrows and sickles of the Meso- and Neolithic, but their range steadily goes back in time with more research. Unmistakeable microliths have now been discovered at the South African coastal site at Pinnacle Point, in an occupation layer that is 71 ka old (Brown, K.S. and 8 others 2012. An early and enduring advanced technology originating 71,000 years ago in South Africa. Nature, v. 491, p. 590-593).

The Pinnacle Point technology was indeed sophisticated, microlith manufacture requiring fire treatment as well as choice of rock and careful shaping and sharpening. As well as extending the microlith culture back so far the team of South African, US, Australian and Greek archaeologists compared them with 28 later African tool kits. The designs have barely changed from 71 ka to those of the last few hundred years. Kyle Brown and colleagues show that the industrial method endured, thereby laying to rest the somewhat reactionary notion that the methods were lost again and again in Africa after separate inventions and were only taken up decisively by the supposed ‘advanced’ anatomically modern humans who colonised Europe…

It is difficult to see how the Pinnacle Point microliths could have been useful, unless hafted in arrows or throwing sticks – maybe even saws and sickles? Crucially, they predate larger blade-tools that could have been hafted to form spears. The focus must now shift to the Zambian scene where possible microliths are reported at two 250 ka sites. If confirmed, they would link the decisive fourth cognitive step towards humanity with the very origin of fully modern humans, rather than a much later, non-African dawning of ‘smarts’ along with language, advanced art and much else in the chilly caves of southern Europe.

Of all human-colonised continents Africa lags far behind the rest as regards spread and density of archaeological digs. Only the ‘famous’ sites attract resources for investigation. Imagine what might emerge once there are more local people with research skills, equipment and transport; and, dare I say it, more independence of action and the attendant confidence in their ability.

A glimpse of the Hadean

There is something deeply unsatisfying, even untidy, about a geoscientific history from which the first half billion years is more or less a blank. Every likely stone has been turned and every isotope hurled as a curve-ball through a mass spectrometer in the quest for either direct evidence of Hadean events or an acrid whiff that lingers in later matter. All, that is, except for one…

Formed in a proposed supernova that likely helped trigger formation of the Sun and Solar System, 150Gd quickly decayed to produce 146Sm, which itself had a half-life of about 68 Ma. That is too short for any significant trace of that radioactive rare-earth element to remain in terrestrial rocks, but its daughter isotope 142Nd bears witness to its former existence. Checking the proportion of 142Nd against the heavier, stable 144Nd is a means of detecting ancient fractionation of samarium from neodymium – between a magma and its solid source, or between a residual magma and the solids that crystallised from it – that took place while 146Sm still existed.
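The claim that 146Sm is effectively extinct follows from simple half-life arithmetic, sketched here using the 68 Ma half-life quoted above:

```python
def fraction_remaining(t_ma, half_life_ma=68.0):
    """Fraction of a radionuclide surviving after t_ma million years."""
    return 0.5 ** (t_ma / half_life_ma)

# by the end of the Hadean, roughly 500 Ma after formation, well under
# 1 % of the original 146Sm survived...
print(fraction_remaining(500))     # ~0.006
# ...and over the whole age of the Earth essentially none at all
print(fraction_remaining(4500))    # ~1e-20
```

So any 142Nd excess or deficit measured today must record Sm/Nd fractionation that happened within the Earth’s first few hundred million years.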

A popular and well-supported view of the Hadean is that shortly after accretion of the Earth a stupendous impact left a deep ‘ocean’ of magma and flung off mass that produced the Moon. Solidification of that ocean, which would have involved denser minerals sinking and lighter ones rising to higher levels, has been suggested to have resulted in differentiation of the mantle into two portions, one enriched, the other depleted; an event on which the entire later geochemical history of our planet has depended. Should either part of the mantle melt again, the igneous rocks that would result should carry a neodymium isotope signature of one or the other. Little sign of either emerges from studies of igneous rocks younger than 2.5 Ga, but older rocks from Greenland that go back to 3.8 Ga demonstrate that almost all of them melted from the Hadean depleted mantle. Without rocks carrying 142Nd/144Nd ratios signifying the other side of the more ancient mantle division, an enriched source, the grand idea was flawed. But this one-sidedness appears now to have been balanced by other Archaean igneous rocks (Rizo, H. et al. 2012. The elusive Hadean enriched reservoir revealed by 142Nd deficits in Isua Archaean rocks. Nature, v. 491, p. 96-100).

3.8 billion year-old Amitsoq gneisses, West Greenland (Image credit: Stephen Moorbath, via Royal Society)

The analysed rocks are interesting for another reason, for they are 3.4 Ga-old vertical sheets of basalt, or dykes, that cut through the more ancient west Greenland crust. They are the first evidence of a brittle crust that cracked under tension, the fissures being filled by mantle-derived magma. Some members of the Ameralik dyke swarm show just the isotopic signature predicted for the enriched member of the postulated fundamental mantle division. However, for some yet-to-be-recognised reason, few post-Archaean rocks show any sign of widespread mantle heterogeneity. Such matters could be addressed with any confidence only after mass spectrometry allowed precise discrimination between isotopes of a whole variety of both common and rare elements. That was not so long ago, so a rich trove of future revelations can be anticipated.

Batter your planet

Artist’s depiction of the asteroid impact 65 million years ago that caused the K-T mass extinction. (Photo credit: Wikipedia)

Just in time for the festive season I have been sent the URL for an on-line impact simulator written by a team from Imperial College London and the University of Arizona (Collins, G.S. et al. 2005. Earth Impact Effects Program: A Web-based computer program for calculating the regional environmental consequences of a meteoroid impact on Earth. Meteoritics and Planetary Science, v. 40, p. 817–840), with a web presence designed at Purdue University, Indiana. ImpactEarth (http://www.purdue.edu/impactearth/) has been around for two years and has a scientifically pleasing level of precision, thanks to the authors, Gareth Collins, Jay Melosh and Robert Marcus.

The fact that the target shown by the accompanying animation and other graphics seems to be the Washington-New York megalopolis may be a cause for some concern for US readers, especially the Department of Homeland Security, National Security Agency and CIA. They can rest easy, however, as this seems to be a matter of artistic licence: the choice of parameters allows for ocean strikes and targets of sedimentary or crystalline rocks. Others are impactor diameter and density, impact angle and speed, plus distance from ground zero. An element of whimsy allows the casual user to choose inbound humpback whales, school buses and the Empire State Building as well as more astronomically likely scenarios.

There are a number of missing parameters, such as direction relative to Earth’s rotation, latitude and the likely effect of an ice-cap strike, and no mention in the results of the electromagnetic burst from atmospheric compression on entry – the Diesel effect. However, the thermal effects on bystanders, buildings and vegetation at the ‘viewpoint’ personalise the experience to some extent. It is in the detail about crater dimensions and evolution, lithospheric melting and what might happen to the Earth’s axial tilt and day length that the wealth of computations produces surprises. It is not easy to destroy our planet: using a body with a density of 3000 kg m-3 and the diameter of Asia causes no significant melting or changes in axial tilt at speeds less than 12 km s-1, but does change the length of the day by up to 113 hours. This is because the energy of impacts, and therefore the work done by them, is proportional to the square of the speed. Mind you, nothing is left standing as the seismic effect has a Richter Magnitude of more than 15! Yet, curiously, no atmospheric or thermal radiation effects are noted.
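That ‘square of the speed’ dependence is just the kinetic-energy formula E = ½mv²; a sketch with illustrative figures (the simulator itself uses far more elaborate scaling laws):

```python
import math

def impact_energy_joules(diameter_m, density_kg_m3, speed_m_s):
    """Kinetic energy of a spherical impactor: E = 1/2 * m * v**2."""
    volume = math.pi / 6.0 * diameter_m ** 3   # volume of a sphere
    mass = density_kg_m3 * volume
    return 0.5 * mass * speed_m_s ** 2

# a 1 km body of density 3000 kg/m3 at 12 km/s carries ~1.1e20 J...
e1 = impact_energy_joules(1000.0, 3000.0, 12_000.0)
# ...and doubling the speed quadruples the energy delivered
e2 = impact_energy_joules(1000.0, 3000.0, 24_000.0)
print(e2 / e1)    # → 4.0
```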

Have fun.

Hominin round-up

Our tenacious companions.

Male human head louse, Pediculus humanus capitis (credit: Wikipedia)

Until recently humans and lice were inseparable, and the same goes for all primates and nearly all mammals. However, unlike fleas, which will happily suck any blood that is going provided it is easily tapped, lice are tailored to their hosts. Should a baboon louse, for instance, get into your short and curlies it will almost certainly die. In any case, again unlike fleas, lice cannot leap: they spread through intimate contact. The human head louse spreads especially well among nursery- and infant-school children, as any parent knows, because lessons often involve them literally getting their heads together. Less well known is that Pediculus humanus eschews soiled or greasy hair, so it is the well-scrubbed kids who suffer and spread ‘beasts on the head’. Conversely, the clothes louse that carries typhus and other infections is deterred by regular laundry and ironing. And then there is the …

Short fuse on clathrate bomb?

Gas hydrate (methane clathrate) block embedded in seabed sediment (Photo credit: Wikipedia)

The biggest tsunami to affect inhabitants of Britain, mentioned in the earlier post Landslides and multiple dangers, emanated from the Storegga Slide in the northern North Sea west of Norway. That submarine debris flow was probably launched by gas hydrates beneath the sea bed breaking down to release methane, thereby destabilising soft sediments on the continental slope. Similar slides were implicated in breaking Europe-America communications in the 20th century, such as the Grand Banks Slide of 1929 that severed submarine cables up to 600 km from the source of the slide. Even now, much Internet traffic is carried across oceans along optic-fibre cables, so breakages disrupt and slow services. A more mysterious facet of clathrate breakdown is its possible implication in unexplained and sudden losses of ships. When gas escapes to the surface, the net density of seawater decreases, the more so as the proportion of bubbles increases. Ship design and cargo loading rest on an assumed water density range from fresh to salt water and for different temperatures at high and low latitudes.
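The ship-sinking hazard is simple buoyancy arithmetic: bubbles add volume but almost no mass, so the bulk density of aerated water falls in proportion to the gas fraction. A sketch with hypothetical figures:

```python
def aerated_density(water_density_kg_m3, gas_fraction):
    """Bulk density of water containing a volume fraction of gas bubbles;
    the gas itself contributes negligible mass."""
    return water_density_kg_m3 * (1.0 - gas_fraction)

# a hull with a mean density of 900 kg/m3 floats in seawater (1025 kg/m3)
# but founders once the bubble fraction exceeds about 12 % by volume
print(aerated_density(1025.0, 0.15))   # → 871.25, less than the hull's density
```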

Gulf Stream map (credit: Wikipedia)

The Atlantic seaboard of the USA hosts some of the best-studied accumulations of clathrates in the top 100-300 m of seabed sediments. Since their discovery these ‘cage complexes’ of mainly methane and carbon dioxide trapped within a lattice of water-ice molecules have been studied in detail. Importantly, the temperatures at which they form and the range over which they remain stable depend on pressure, and therefore on depth below the sea surface. At atmospheric pressure solid methane hydrate is unstable at any likely temperature, and requires -20°C to form at a pressure equivalent to 200 m water depth. Yet it is stable at temperatures up to 10°C at 500 m depth and up to 20°C at a depth of 2 km. Modern sea water cools to around 0°C at depths greater than 1.5 km, so gas hydrates can form virtually anywhere that there is a source of methane or CO2 in seafloor sediment. Within the sediments temperature increases sharply with depth beneath the seabed because of geothermal heat flow, thereby limiting the clathrate stability zone to the top few hundred metres.
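Those figures let the thickness of the stability zone be estimated to first order: hydrate survives below the seabed only until the geotherm crosses the local stability temperature. A sketch that ignores the small pressure change across the zone and assumes a geothermal gradient of 40°C per km (my illustrative figure, not from the text):

```python
def stability_zone_thickness_m(seafloor_t_c, stability_limit_t_c,
                               geotherm_c_per_km=40.0):
    """Depth below the seabed at which the geotherm reaches the hydrate
    stability temperature (pressure change over the interval ignored)."""
    if seafloor_t_c >= stability_limit_t_c:
        return 0.0
    return (stability_limit_t_c - seafloor_t_c) / geotherm_c_per_km * 1000.0

# under 2 km of water: ~0 deg C bottom water and a ~20 deg C stability
# limit (figures from the text) give a zone a few hundred metres thick
print(stability_zone_thickness_m(0.0, 20.0))   # → 500.0 (metres)
```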

Two factors may lead to clathrate instability: falling sea level and sea-floor pressure, or rising sea-floor temperature. Many gas-hydrate deposits, especially on the continental shelf and continental edge, are likely to be close to their stability limits, hence the worries about destabilisation should global warming penetrate through the water column. The western North Atlantic is an area of especial concern because the Gulf Stream flows northward from the Caribbean to pass close to the US seaboard off the Carolinas: that massive flow of warm tropical water has been increasing during the last 5 thousand years, so that its thermal effects are shifting westwards.

Geophysicists Benjamin Phrampus and Matthew Hornbach of the Southern Methodist University in Dallas, Texas have used thermal modelling to predict that gas-hydrate instability is imminent across 10 thousand square kilometres of the Carolina Rise (Phrampus, B.J. & Hornbach, M.J. 2012. Recent changes to the Gulf Stream causing widespread gas hydrate destabilization. Nature, v. 490, p. 527-530). As a test they analysed two seismic reflection profiles across the Carolina Rise, seeking anomalies known as bottom-simulating reflectors that signify free gas in the sediments. These are expected at the base of the gas-hydrate zone and their presence helps assess sediment temperature. At water depths less than 1 km, the base of the gas-hydrate zone modelled from the present temperature profile through the overlying seawater lies significantly above its signature on the seismic lines. The deeper levels probably formed under cooler conditions than now – probably eight degrees cooler – and may be unstable. If that is correct, the Carolina Rise area seems set to release around 2.5 Gt of methane to add to atmospheric greenhouse warming. The Storegga Slide also lies close to the northern track of the Gulf Stream – North Atlantic Drift…

Una parodia della giustizia? (A parody of justice?)

Damage caused by the L’ Aquila earthquake of 6 April 2009. (credit: Reuters)

Lying above a destructive plate margin, albeit a small one, Italy is prone to earthquakes. Seismometers detect a great many of low magnitude that no one notices and that do no obvious damage to buildings. From 2006 to autumn 2008 the Abruzzo region on the eastern flank of the Apennine mountains of central Italy experienced a background of one low-magnitude tremor every day (Papadopoulos, G.A. et al. 2010. Strong foreshock signal preceding the L’Aquila (Italy) earthquake (Mw 6.3) of 6 April 2009. Natural Hazards and Earth System Sciences, v. 10, p. 19-24). In the following 6 months the rate more than doubled, but the epicentres continued to be almost randomly situated. Things changed dramatically in the 10 days following 27 March 2009: the pace increased to twenty times the normal ‘background’ and epicentres clustered directly beneath the regional capital L’Aquila (population 73 thousand), close to a known fault line. At 3.32 am on 6 April 2009 the Paganica fault failed less than 10 km below L’Aquila, directing most of the Magnitude 6.3 energy at the town. This was the deadliest earthquake in Italy for three decades: 308 people died, 1500 were injured and 40 thousand found themselves homeless. Silvio Berlusconi, not a man to flinch from controversy, commented on German TV about the homeless, ‘Of course, their current lodgings are a bit temporary. But they should see it like a weekend of camping’.

Former Italian Prime Minister Silvio Berlusconi (credit: Wikipedia)

L’Aquila has a dismal history of seismic damage, having been devastated seven times since the 14th century. Having grown on a foundation of lake-bed sediments, notorious for amplifying ground movements, the city was clearly at high risk in much the same manner as Mexico City. Shaken several times before and built with no regard to seismicity, much of L’Aquila’s centuries-old building stock was incapable of resisting the event of 6 April 2009: up to 11 thousand buildings were damaged, some collapsing completely.

Not only was the earthquake preceded by an increasing pace of foreshocks, but many local people reported strange ‘earth lights’ during the months beforehand (Fidani, C. 2010. The earthquake lights (EQL) of the 6 April 2009 Aquila earthquake, in Central Italy. Natural Hazards and Earth System Sciences, v. 10, p. 967-978). In fact, so many sightings were made that plans have been outlined for a CCTV monitoring network in rural areas.

So, this disaster was not short of signs that all was not well in Abruzzo, in a seismic sense: historical precedent; poor urban siting; foreshocks and oddities that have come to be associated with impending energy release. But was this litany sufficient to predict the place, date and magnitude of what was coming? Plate tectonics, local structural geology and worldwide seismicity allow geophysicists to assess risk from earthquakes in the same way as hydrologists can outline flood-prone areas: literally on flood plains. Yet there are few if any records of a devastating earthquake having been predicted anywhere with sufficient accuracy to allow evacuation and mitigation of death and injury. That is despite the fact that teams of seismologists in the western US, Japan, Italy and several other well-off countries continually monitor seismic events, even those with energies many orders of magnitude less than the ones that kill or injure. Such bodies are faced with a dreadful choice in the face of evidence like that summarised above: warn tens of thousands to evacuate, organise such an exodus in a few days and prepare accommodation for them; or advise that similar seismic escalations rarely lead to massive damage, with an estimate of the probability of risk. Both choices are guesswork, for there are no rigorous equations that spell ‘doom’ or ‘all clear’ from such data. Earthquakes are not rainstorms or hurricanes, as 250 thousand dead people on the shores of the Indian Ocean bear grim witness.

Despite broad knowledge of the deep uncertainty associated with earthquakes and volcanic eruptions – no longer confined to specialist scientists these days, even in the least developed parts of the world – the Italian authorities saw fit to prosecute six earth scientists and a public official for multiple manslaughter. Because they provided “inaccurate, incomplete and contradictory” information about what might have been the aftermath of tremors felt ahead of the 6 April 2009 earthquake, a regional court sentenced all of them to six years in prison – two years more than even the prosecution demanded – and they are to pay the equivalent of £6.7 million in compensation. This was not a jury verdict, but the decision of a single judge, Marco Billi. No scientist, even one poring over data from the Large Hadron Collider in search of the Higgs boson, would ever claim that what they report is perfectly accurate, complete and incontrovertible. The L’Aquila Seven never said they were certain that no earthquake would ensue, and the city’s people were well aware of what risk they faced, in much the same way that Neapolitans living on the slopes of Vesuvius know that one day they may be incinerated.

This is a travesty of justice so bizarre that one must look to the famous adage of Roman law: cui bono? Certainly not the victims and their mourners, and definitely not science, because any sensible Italian geophysicist will in future simply play dumb. There is already a huge worldwide outcry, not just from outraged scientists.

Added 25 October 2012: The 12 October issue of Science carried a lengthy summary of proceedings early in the trial (Cartlidge, E. 2012. Aftershocks in the courtroom. Science, v. 338, p. 185-188). Nature‘s editorial and further comment on the L’Aquila verdict are also worth reading.

New twist on lunar origin

Artistic impression of the moon-forming giant impact. (credit: Wikipedia)

Although a few would-be space-faring countries have ambitions, a post-Apollo crewed mission to the Moon is unlikely for quite a while. Yet moon-struck curiosity goes on: currently there is a surge in re-examining the lunar samples brought back more than 40 years ago. The Lunar Sample Laboratory Facility in Houston holds about a third of a ton of rock and regolith. I suppose part of the reason why lunar rocks are being re-analysed – in fact some for the first time – is because new or improved methods are available, but frustration among a growing community of planetary geochemists having little more than meteorites to peer at probably plays a role as well. Since Hartmann and Davis first suggested it, the giant-impact theory for the Moon’s origin has dominated geochemical ideas. Most tangible is the evidence for a magma ocean: plagioclase crystals floated during its fractional crystallisation probably formed the glaring-white lunar highlands composed of anorthosite. More subtle are ideas about what happened to the Mars-sized planet that did the damage to Earth and flung vaporised rock into orbit to accrete into the new Moon, and about the effects of the stupendous energy on the geochemistry of all three bodies. The latest contribution to all this is new research on isotopes of zinc (Paniello, R.C. et al. 2012. Zinc isotope evidence for the origin of the Moon. Nature, v. 490, p. 376-379).

The focus on zinc is because it is easily vaporised compared with more refractory materials, such as calcium and titanium, and as well as being ‘volatile’ it has five naturally occurring isotopes with relative atomic masses of 64 (the most abundant), 66, 67, 68 and 70. In general, isotopes of an element behave in slightly different ways during geological and cosmological processes, which changes their proportions in the products; a process known as ‘mass fractionation’. Paniello and colleagues from Washington University, Missouri, and the Scripps Institution of Oceanography, California, found that Moon rocks are enriched in the heavier isotopes of zinc yet depleted in total zinc compared with terrestrial rocks and with meteorites supposed to have come from Mars. Unlike that of those two planets, the Moon’s zinc deviates from its abundance relative to other elements recorded by chondritic meteorites. This zinc depletion tallies with volatile loss from incandescent vapour blurted from the colliding planets. But it doesn’t help with the detailed predictions from the giant-impact model. A variety of scenarios suggest that the Moon should be made from remnants of the inbound impactor’s mantle, yet studies of other elements’ isotopes indicate that the Moon is rather Earth-like. Those of zinc do not, so it looks as if they have to be explained by a complete rethink of the whole hypothesis (Elliott, T. 2012. Galvanized lunacy. Nature, v. 490, p. 346-347).
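Isotope geochemists usually report such mass fractionation in delta notation: the per-mil deviation of a sample’s isotope ratio from that of a reference standard. A sketch with hypothetical ratios (not values from the Paniello et al. paper):

```python
def delta_per_mil(ratio_sample, ratio_standard):
    """Per-mil deviation of an isotope ratio from a reference standard."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

# hypothetical 66Zn/64Zn ratios: a lunar basalt slightly enriched in the
# heavy isotope relative to a terrestrial standard gives a positive delta
r_standard = 0.5650
r_lunar = 0.5657
print(round(delta_per_mil(r_lunar, r_standard), 2))   # → 1.24 (per mil)
```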

The shuffling poles

The mechanical disconnection of the lithosphere from the Earth’s deep mantle by a more ductile zone in the upper mantle – the asthenosphere – suggests that the lithosphere might move independently. If that were the case then points on the surface would shift relative to the axis of rotation and the magnetic poles, irrespective of plate tectonics. So it makes sense to speak of absolute and relative motions of tectonic plates. The second relates to plates’ motions relative to each other and to the ancient position of the magnetic poles, assumed to be reasonably close to that of the past pole of rotation, yet measurable from the direction of palaeomagnetism retained in rocks on this or that tectonic plate. Plotting palaeomagnetic pole positions through time for each tectonic plate gives the impression that the poles have wandered. Such apparent polar wandering has long been a key element in judging ancient plate motions. Absolute plate motion judges the direction and speed of plates relative to supposedly fixed mantle plumes beneath volcanic hot spots, the classic case being Hawaii, over which the Pacific Plate has moved to leave a chain of extinct volcanoes that become progressively older to the north-west. But it turns out that between about 80 and 50 Ma there are some gross misfits using the hot-spot frame of reference. An example is the 60° bend where the Hawaiian chain becomes the Emperor seamount chain, which some have ascribed to hot spots shifting (see http://earth-pages.co.uk/2009/05/01/the-great-bend-of-the-pacific-ocean-floor/).

Age of Pacific Ocean floor, showing the Hawaii-Emperor seamount chain in black. (credit: Wikipedia)

Ideas have shifted dramatically since it became clear that hot spots can move, and there has been an attempt to estimate their actual motions (Doubrovine, P.V. et al. 2012. Absolute plate motions in a reference frame defined by moving hot spots in the Pacific, Atlantic, and Indian oceans. Journal of Geophysical Research: Solid Earth, v. 117, B09101, doi:10.1029/2011JB009072). It is early days for the revised view of absolute motion of the lithosphere, and estimates go back only 120 Ma. However, one outcome has been a realistic examination of whether the positions of the poles have shifted through time; a possibility that is hidden in apparent polar wander paths. Since the mid-Cretaceous it seems that a slow and hesitant but significant polar shuffle has taken place at rates between 0.1 and 1.0° Ma-1, heading first in one direction and then retracing its steps to reach the current proximity of the magnetic poles to the poles of rotation.
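For scale, those angular rates convert to surface displacement of the pole via simple arc length (s = rθ); a quick sketch:

```python
import math

EARTH_RADIUS_KM = 6371.0

def polar_wander_km_per_ma(rate_deg_per_ma):
    """Surface displacement of the rotation pole for a given angular rate."""
    return math.radians(rate_deg_per_ma) * EARTH_RADIUS_KM

# the quoted 0.1-1.0 degrees per Ma range corresponds to roughly
# 11-111 km of polar motion per million years
print(round(polar_wander_km_per_ma(0.1), 1))   # → 11.1
print(round(polar_wander_km_per_ma(1.0), 1))   # → 111.2
```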

Landslides and multiple dangers

A landslide in Guerrero, Mexico in August, 1989. (credit: Wikipedia)

Just as modern humans were establishing a permanent foothold in Britain and engaging in the transition to settled farming and livestock husbandry, disaster struck some of the most attractive Mesolithic real estate. Around 8 000 years ago the east coast of Scotland, from the Shetland Isles to the Firth of Forth, was struck by a tsunami as big as the one that devastated north-eastern Honshu in the Japanese archipelago in 2011. It washed over low-lying islands of Shetland and Orkney and roiled up the great inlets or firths of eastern mainland Scotland to leave thick sand deposits containing carcases of whales and other large sea mammals. At that time, Britain was joined to the rest of Europe by marshy lowlands linking East Anglia and the Netherlands, dubbed ‘Doggerland’, at the southern end of a huge gulf that became the North Sea. Final sea-level rise removed that initial gateway to Britain, so we cannot judge what damage the tsunami wrought, but tools and animal bones dredged from the area show that it was full of game and people. A disaster, but not one linked to seismicity. The driving force has been recognised in a series of submarine scars off the west coast of Norway that witness massive slides of sediment on the sea-bed area known as Storegga. Similar scars around the Hawaiian Islands and those making up the Azores and Canaries in the mid-Atlantic bear witness to many large slippage events, both on the sea bed and from the islands themselves. Recognising signs of past tsunami damage in coastal areas worldwide reveals plenty of cases triggered by landslides rather than earthquakes.

The March 2011 Sendai tsunami and those which ravaged lands around the Indian Ocean in late 2004 formed because of vertical movements on major faults that dropped or shoved up the oceanic crust itself. Yet any sudden change in the shape of the sea floor will displace all the ocean water above; the difference from seismic tsunamis lies in the energy source: instead of tectonic plate forces, gravitational potential energy is released by slumps and slides. That may happen because of erosion producing unstable steep slopes, the build-up of sedimentary piles, large outpourings of lava, or slopes being destabilised by minor earthquakes or the release of gases from the sediments themselves. The Mesolithic submarine slide at Storegga may have been set in motion by a massive release of methane from gas-hydrate deposits, and such is the extent of scarring of the sea floor there that it must have happened before and may do so again.

Copper engraving showing the 1755 Lisbon tsunami overwhelming ships in the harbor. (credit: Wikipedia)

Realisation of the potential for tsunamis to be triggered by submarine and coastal slides has spurred bathymetric studies in a number of likely areas, including the Gorringe Bank that lies on the Atlantic floor just west of the Iberian Peninsula. It is tectonic in origin but has a thick veneer of sediment brought by Iberian river systems. On its northern flank is a 35 km long scar of a slip that moved 80 km3 of sediment (Lo Iacono, C. and 11 others 2012. Large, deepwater slope failures: implications for landslide-generated tsunamis. Geology, v. 40, p. 931-934). The Spanish-British-Italian group estimate that the slip would have generated a 15 m tsunami most likely to have affected the Iberian coast south of Lisbon. Conditions for slides of similar magnitude still exist on the Gorringe Bank. One unstable system ripe for collapse is present far out in the Atlantic on the south-east coast of the island of Pico in the Azores (Hildenbrand, A. et al. 2012. Large-scale active slump on the southeast flank of Pico Island, Azores. Geology, v. 40, p. 939-942). This is a coastal area where repeated volcanism has piled up lavas on the flanks of the island’s main volcanic edifice. Failure has already started, with a number of prominent arcuate scars having developed. The Pico slide creeps sideways very slowly, but vertical displacements are estimated at up to a centimetre a year. The volume of the slowly moving mass is an order of magnitude less than that of the fossil slide on the Gorringe Bank. Yet should it fail entirely, the slopes involved, the absence of water’s slowing effect and the height of the mass might ensure that comparable energy is delivered to the Atlantic Ocean, though the likely trajectory of tsunamis would be parallel to the coast of Africa rather than directly towards it.
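The claim that a mass an order of magnitude smaller might still deliver comparable energy can be illustrated with a back-of-envelope potential-energy estimate, E = ρVgh. The volumes below follow the figures quoted above, but the densities and drop heights are illustrative guesses, not values from either paper, and effects such as buoyancy and internal deformation of the sliding mass are ignored.

```python
G = 9.81  # gravitational acceleration, m s^-2

def slide_energy_joules(volume_km3, density_kg_m3, drop_m):
    """E = rho * V * g * h, treating the slide as a rigid block."""
    volume_m3 = volume_km3 * 1e9
    return density_kg_m3 * volume_m3 * G * drop_m

# Gorringe Bank: large submarine sediment volume, but a modest drop
# (assumed 500 m drop and 2000 kg/m3 wet sediment; buoyancy ignored)
gorringe = slide_energy_joules(80.0, 2000.0, 500.0)

# Pico flank: a tenth of the volume, but denser volcanic rock falling
# from a much greater height (assumed 2000 m drop, 2700 kg/m3)
pico = slide_energy_joules(8.0, 2700.0, 2000.0)

print(f"Gorringe ~{gorringe:.1e} J, Pico ~{pico:.1e} J")
```

Under these assumed numbers both come out in the 10^17 J range, which is the sense in which a far smaller subaerial slide could rival the energy of the much larger submarine one.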

Landslides of all kinds, though hazardous, have long been thought to pose less of a risk to life globally than the more spectacular seismic and volcanic hazards, but there are few data to support that view. In an attempt to assess the annual risk properly, David Petley of Durham University, UK ‘mined’ world-wide landslide records for the seven years since 2004 (Petley, D. 2012. Global patterns of loss of life from landslides. Geology, v. 40, p. 927-930). There were more than 2600 recorded slope failures that killed people, causing a total of more than 32 thousand fatalities: ten times more than previous vague estimates. This is a minimum because many landslides occur in very remote areas, especially in the mountainous regions of China and the Himalaya. The number of fatalities accompanying each event shows distinct signs, on a country-by-country basis, of a relationship with population density. Several international initiatives aimed at measuring disaster risk are emerging, one being the Integrated Global Observing Strategy for Geohazards (IGOS).