Occasionally, journals not usually associated with mainstream geosciences publish something startling but easily missed. Nature of 12 September 2013 alerted me to just such an oddity. It seems that the chemistry of sea-floor hydrothermal vents can potentially generate electrical power (Yamamoto, M. et al. 2013. Generation of electricity and illumination by an environmental fuel cell in deep-sea hydrothermal vents. Angewandte Chemie, online DOI: 10.1002/ange.201302704).
The team from the Japan Agency for Marine-Earth Science and Technology, the RIKEN Centre for Sustainable Resource Science and the University of Tokyo used a submersible ROV to suspend a fuel cell, based on a platinum cathode and iridium anode, in hydrothermal vents that emerge from the Okinawa Trough off southern Japan at a depth of over 1 km. It recorded a tiny but measurable power output of a few milliwatts.
The fluids issuing from the vents are at over 300°C while seawater is around 4°C, creating a very high thermal gradient. More importantly, the fluid-seawater interface is also a boundary between geochemically very different conditions. The fluids are distinctly acidic (pH 4.8) compared with the slight alkalinity of seawater, and contain high concentrations of hydrogen and hydrogen sulfide but undetectable oxygen (seawater is slightly oxygenated).
The fuel cell was designed so that iridium in the anode catalyses the oxidation of H2S at the geochemical interface, yielding the electrons that constitute an electric current. The experiment neatly demonstrated its success by lighting three light-emitting diodes.
Does this herald an entirely new means of renewable power generation? Perhaps, if the fuel cell can be scaled up enormously. Yet the very basis of oxidation and reduction is expressed by the mnemonic OILRIG (Oxidation Is Loss, Reduction Is Gain – of electrons), and any redox reaction in nature represents an exploitable potential; even plants can be electricity producers. In fact all fuel cells exploit oxidation reactions of one kind or another.
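As a back-of-envelope illustration of what a few milliwatts means chemically, the sketch below assumes the anode half-reaction H2S → S + 2H⁺ + 2e⁻ (two electrons per molecule oxidised); the voltage and current figures are wholly hypothetical, chosen only to land in the milliwatt range the experiment reported, not taken from the paper:

```python
# Back-of-envelope for a vent fuel cell. The half-reaction
# H2S -> S + 2H+ + 2e- gives n = 2 electrons per H2S oxidised.
# Voltage and current are ASSUMED illustrative values.
F = 96485.0          # Faraday constant, C/mol
n = 2                # electrons transferred per H2S molecule
V = 0.5              # assumed cell voltage, volts
I = 0.01             # assumed current, amperes (10 mA)

power_mW = V * I * 1000.0            # electrical power in milliwatts
h2s_rate_mol_per_s = I / (n * F)     # H2S oxidised per second (Faraday's law)

print(f"power: {power_mW:.1f} mW")
print(f"H2S consumed: {h2s_rate_mol_per_s:.2e} mol/s")
```

With these assumed figures the cell delivers 5 mW while consuming only a few tens of nanomoles of H2S per second – a reminder of how dilute the energy source is, and why enormous scale-up would be needed.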
Media coverage of the disasters following the magnitude 9.0 earthquake of 11 March 2011, which devastated the north-eastern coast of Honshu, Japan around the city of Sendai, is now (early May) fitful and dominated by the aftermath of the tsunamis’ effect on the Fukushima Daiichi nuclear power station. For those who escaped the tsunamis the experience is irredeemably seared on their memory. Unlike the great waves of 26 December 2004, which killed ten times as many people around the Indian Ocean, it will also be unforgettable for those of us far from the event who witnessed the lengthy, high-definition footage of the black-water torrents that swept all before them far inland. But that is no longer ‘news’…
Only six to seven weeks later, lessons are being learned that probably should have been anticipated long before. Japan has the world’s best disaster-preparedness systems, centred on civil engineering whose capacity to resist great earthquakes was proven on 11 March: the terrifying tremors resulted in far fewer casualties than would have occurred anywhere else under such conditions. The tragedy lay in the size of the tsunamis – as high as 30 m in some areas – that reached the coast within an hour of the seismic event. As well as causing devastation and loss of life along the coast and up fertile low-lying valleys, waves of this size swept over the defences of the coastal Fukushima Daiichi nuclear power plant, cutting off emergency power supplies: the world’s largest tsunami barriers proved inadequate to the task and near-meltdown ensued.
Despite the densest network of seismometers anywhere, and earthquake early-warning and risk-assessment systems already in place, the events were not forecast; the only warning was the earthquake itself, which alerted a well-versed population to the imminence of the tsunamis to follow. Public education and preparedness proved to be the major life-saver, except of course for those tragically killed or lost without trace. So what went wrong?
The risk assessment and warning systems produced results that bore little relation to the actual seismic shaking: the warning was for the immediate vicinity of Sendai city to experience the highest intensities (5-6), with most of the rest of Honshu, including Tokyo, expected to experience intensities in the 2-4 range. For Fukushima Daiichi a maximum magnitude of 7.2 in its vicinity was predicted to have less than a 10% chance of occurring over the next 50 years. In reality seismometers across the whole eastern part of Honshu north of Tokyo recorded intensities between 5 and 7, demonstrated graphically by numerous CCTV recordings in shops and offices. The emerging opinion is that the theory and historical data used for risk and warning systems are flawed or inadequate. For instance, the earthquake ripped along 400 km of the Japan Trench subduction zone rather than acting as a point source – a lesson also from the Sumatra earthquake of 26 December 2004, when ocean-floor thrusting extended 1200 km northwards to the Andaman Islands. Great earthquakes are far too infrequent for sufficient modern-style seismic data to have been collected from previous cases in the 20th century, but it seems clear since 2004 that: (1) stresses accumulate to unexpectedly high values where opposed plates are coupled or stuck together; (2) the ‘point-source’ model for earthquakes, encouraged by the pinpointing of seismic foci and epicentres by the world-wide seismic network, is far from reality, the more so for the biggest stress accumulations; (3) existing approaches will fail for events with magnitudes greater than 8.0.
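Why does a 400 km rupture rather than a point source matter so much? Moment magnitude grows with the total rupture area and slip: seismic moment M0 = μ·A·D, with Mw = (2/3)·log10(M0) − 6.07 for M0 in N·m. The sketch below takes the 400 km length from the text; the down-dip width, average slip and rock rigidity are assumed, plausible values for illustration only:

```python
import math

# Seismic moment M0 = mu * A * D; moment magnitude Mw = (2/3)log10(M0) - 6.07.
# Length (400 km) is from the article; width, slip and rigidity are ASSUMED.
mu = 3.0e10         # rigidity of crustal rock, Pa (assumed)
length = 400e3      # rupture length along the Japan Trench, m
width = 150e3       # down-dip rupture width, m (assumed)
slip = 20.0         # average fault slip, m (assumed)

M0 = mu * (length * width) * slip          # seismic moment, N·m
Mw = (2.0 / 3.0) * math.log10(M0) - 6.07   # moment magnitude

print(f"M0 = {M0:.2e} N·m, Mw = {Mw:.2f}")
```

With these assumptions the result comes out near Mw 9.0, illustrating how a rupture spanning hundreds of kilometres, rather than a point, is what pushes an event into the ‘great’ class that existing approaches underestimated.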
Part of the problem is the sparse record of great earthquakes and the likelihood that, if they do have cyclicity, it may be of the order of hundreds to thousands of years. Historical sources record a large earthquake and tsunamis affecting the Sendai district in 869 CE (Common Era), confirmed recently by geologists who located a typical tsunami deposit extending 3-4 km up the Sendai Plain, compared with more than 5 km in March 2011. The survey team claimed at the time that their discovery might indicate far higher risk in the area than modelled ‘officially’. Sadly, evaluation of that prediction was still incomplete when disaster struck. Geoscientists can map faults, infer the duration of their activity and work out the mechanisms by which they fail, but apart from historical data – often sketchy – pinpointing and quantifying past events is beyond us. Looking at more widespread secondary effects, tsunami deposits in particular, which often contain dateable organic debris, seems a fruitful way forward for coastal areas likely to bear the brunt of both shaking and huge inundations and the powerful ebbing of their flood waters. That is a topic in its infancy, but one now likely to burgeon.
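The gap between the 869 CE event and 2011 hints at why hazard statements like ‘less than 10% in 50 years’ can mislead. Under a simple Poisson (memoryless) assumption, a recurrence interval translates into a time-window probability as sketched below; treating two known events as defining a rate is of course the crudest possible estimate, used here purely for illustration:

```python
import math

# Mapping a recurrence interval to a 50-year exceedance probability
# under a Poisson (memoryless) assumption -- a sketch, not a hazard model.
interval_years = 2011 - 869        # 1142 years between the two known events
rate = 1.0 / interval_years        # assumed constant annual event rate

window = 50.0                      # planning window, years
p = 1.0 - math.exp(-rate * window) # P(at least one event in the window)

print(f"interval: {interval_years} yr, P(event in {window:.0f} yr) = {p:.1%}")
```

The answer is a few per cent – seemingly reassuring, which is exactly why long recurrence intervals lull planning: a low probability per 50-year window says nothing about whether the interval is nearly up.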
Ominously, because great earthquakes are so rare along any plate boundary, for seven greater than magnitude 8 to occur worldwide in a matter of six years (Sumatra 2004, 9.1, and 2005, 8.8; Kuril Islands 2006, 8.3, and 2007, 8.1; Sichuan 2008, 8.0; Chile 2010, 8.8; Japan 2011, 9.0) raises the question: do they occur in time clusters, and if so, why? Although the numbers are small enough to strain statistics, comparing the last six years with the previous century or so of seismometer recordings shows that great earthquakes have never occurred so frequently. Is there a domino effect so that, say, energy from the Sumatran earthquake of late 2004 has somehow been transmitted throughout the interconnected subduction-zone system to destabilise other highly stressed areas? It is widely acknowledged that clustering can occur within a single subduction system, as it may have done with the two great earthquakes (2006 and 2007) in the Kuril Islands on the same boundary as the Sendai event, and with the two off Sumatra (2004 and 2005) followed by three more with magnitude >7 in 2010 on what had previously been regarded as a relatively quiescent subduction zone. Analysing all recorded seismic events greater than magnitude 5 to improve the statistics suggests that clustering does not extend to global scales, yet great earthquakes buck other trends shown by lesser ones. Their motions, both vertical and lateral, could conceivably cause widespread destabilisation, yet worryingly the only test of the idea is the occurrence of yet more in the next few years.
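Whether seven events in six years is genuinely surprising depends entirely on the assumed background rate, which is precisely what the sparse record leaves uncertain. A minimal sketch of the Poisson null hypothesis (events independent, occurring at a constant rate) is below; the rate used is an assumed figure for illustration, not a measured catalogue rate:

```python
import math

# Under a Poisson null hypothesis (no interaction between events),
# how surprising are k great earthquakes in t years?
# The background rate is an ASSUMED illustrative figure.
rate = 0.5      # assumed great-earthquake rate, events per year
t = 6.0         # observation window, years
k = 7           # observed count in the window

mean = rate * t   # expected number of events in the window
# Tail probability P(K >= k) = 1 - sum_{i<k} e^{-mean} * mean^i / i!
p_tail = 1.0 - sum(math.exp(-mean) * mean**i / math.factorial(i)
                   for i in range(k))

print(f"P(>= {k} events in {t:.0f} yr) = {p_tail:.3f}")
```

With this assumed rate the tail probability is about 3%, which would look like clustering; double the assumed rate and the surprise largely evaporates. The calculation therefore only restates the article’s point: the statistics hinge on a background rate that a century of recordings cannot pin down.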
Sources: Normile, D. et al. 2011. Devastating earthquake defied expectations. Science, v. 331, p. 1375-1376; Brahic, C. et al. 2011. Megaquake aftermath. New Scientist, v. 209 (19 March 2011), p. 6-8; Cyranoski, D. 2011. Japan faces up to failure of its earthquake preparations. Nature, v. 471, p. 556-557; Normile, D. 2011. Scientific consensus on great quake came too late. Science, v. 332, p. 22-23.
See also: Geller, R.J. 2011. Shake-up time for Japanese seismology. Nature, v. 472, p. 407-409; Kerr, R.A. 2011. New work reinforces megaquake’s harsh lessons in geoscience. Science, v. 332, p. 911.