The Science Case

The science case for ExtremeEarth expresses the conviction that the concerted development of advancing information technologies can be used to: (i) provide a new foundation for predictions of Earth-system hazards and vulnerability; (ii) integrate diverse, unconventional, and previously unusable data streams to document Earth’s present and past state; and (iii) enable the fusion of models and data – of past, present and future – in ways that expose them to the full ingenuity of diverse application communities. In so doing, ExtremeEarth will build upon and strengthen European excellence in the Earth sciences, support European institutions, and launch a new wave of innovation at the intersection of Earth-system and information science. These advances will also change our capacity to model, and even predict, more disparate events within the Earth system, e.g. air pollution impacts on health, floods, storm surges, droughts, fires, and disease vectors. This breadth also brings other science domains into ExtremeEarth’s technology scope.

Increasing the Physics in Predictions — Weather and Climate Extremes

Advances in high-performance computing (HPC) have brought us to the point where it is now possible to contemplate replacing crude statistical models of crucial climate processes (called ‘parameterisations’) with more fundamental principles [1]. Extreme computing is making it possible to replace rules of thumb with laws of nature – to replace parameterisations of ocean eddies, of precipitating deep convection, and of gravity waves with explicit representations of the transient dynamics of these processes. Harnessing the power of extreme computing towards this end will allow us to develop qualitatively different models [2], ones much better grounded in the laws of physics, providing less biased and more reliable insights into how extremes respond to warming, and what large surprises a warmer world may entail [3]. This need is shared equally by weather and climate prediction [4].
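To make the term ‘parameterisation’ concrete, the sketch below (an illustration added here, not drawn from the source) shows the standard Reynolds-averaging argument: averaging the governing equations leaves unclosed eddy-flux terms, which conventional models approximate with a statistical rule and which convection- and eddy-resolving models compute explicitly.

    % Reynolds decomposition of a field, e.g. potential temperature:
    %   \theta = \overline{\theta} + \theta'
    % Averaging the vertical advection term leaves an unclosed eddy flux:
    \overline{w\theta} = \overline{w}\,\overline{\theta} + \overline{w'\theta'}
    % A typical parameterisation closes it with a crude statistical rule,
    % here an eddy diffusivity K:
    \overline{w'\theta'} \approx -K\,\frac{\partial\overline{\theta}}{\partial z}
    % An explicit (convection-resolving) simulation instead computes
    % \overline{w'\theta'} directly from the resolved fields w' and \theta'.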

 

Changes in large-scale patterns of atmospheric and oceanic circulations are a wild card of climate change, and expose the vulnerability of societies to environmental change. Will the basic structure of the tropical circulation qualitatively change as the Earth warms? Will the West Antarctic ice sheet collapse and, if so, how soon? How will patterns of winter storms change, and what do these changes imply for weather extremes and sea level? What will be the societal impacts of these extreme events? Even in the absence of climate change we have frightfully little insight into these types of questions, as our imagination is held hostage to models that were never developed for such tasks, and whose inadequacies in confronting such challenges have slowly become apparent [5].

Key parameterisations in existing global weather and climate models condition the behaviour of these large-scale aspects of the climate system to a disproportionate extent. Momentum transport by parameterised gravity waves, and Rossby waves emanating from large-scale patterns of parameterised convection in the tropics, determine the position of the storm tracks; tropical convection interacts intimately with patterns of sea-surface temperatures, fine-scale currents, and processes in the upper ocean to determine the structure of the tropical climate; and parameterisations of ocean eddies condition the stability of the Southern Ocean and its ability to transport heat (including to the ice sheets) and carbon through the climate system. Advances in HPC, if they can be harvested, make it possible to simulate crucial processes directly – ocean eddies, ice fracture, atmospheric gravity waves, and precipitating deep convection – and offer an opportunity to develop entirely new insights into the large-scale dynamics of our changing climate and the risks they pose for present and future society [6]. Similarly, much higher-resolution regional models (e.g. turbulence-resolving) will lead to better local predictions of high-impact weather.
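A back-of-the-envelope scaling (added here for illustration; the grid spacings are indicative values, not taken from the source) shows why harvesting these advances demands extreme computing: with an explicit, CFL-limited time step, the cost of a global model grows roughly with the cube of the horizontal refinement factor.

    % Refining the horizontal grid spacing \Delta x by a factor r multiplies
    % the number of grid columns by r^2, and the CFL condition
    %   \Delta t \lesssim \Delta x / c
    % shortens the time step by a further factor of r, so
    \mathrm{cost} \propto r^{3}, \qquad r = \Delta x_{\mathrm{old}} / \Delta x_{\mathrm{new}}
    % Example: moving from a 50 km grid to a convection-permitting 2.5 km
    % grid gives r = 20, i.e. a cost increase of order 20^3 = 8000.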

Increasing the Physics in Predictions — Earthquakes and Volcanic Eruptions


The inherent physical complexity of the solid Earth – the chaotic nature of the interaction of its crustal fault systems, and of its underground magma flows with the confining rock systems – combined with the lack of sufficient and accurate observations of phenomena occurring below Earth’s surface, introduces enormous computing challenges similar to those faced in weather and climate prediction. Until now these challenges have forced solid-Earth models to rely disproportionately on statistical and piecemeal modelling approaches. ExtremeEarth will build a technology platform of the scale and power required to tackle, for the first time, the ambitious goals of (i) predicting three fundamental aspects of earthquakes – their spatial and temporal distribution, their initiation and rupture, and the resulting seismic shaking at the frequencies most relevant for our built environment – and (ii) creating the first global volcanic simulator capable of predicting the initiation of a volcanic eruption, the space-time evolution of the eruption dynamics, its impacts on the territory, and the space-time distribution of airborne volcanic ash.

It is often said that earthquakes will never be predicted, but we are in a period where the observation of earthquakes and our physical understanding of rupture phenomena, magma flow, and wave propagation in heterogeneous media are all advancing rapidly. At the same time, extreme-scale computing is reaching the point where it promises the spatial and temporal resolution, and the complexity, necessary to model and integrate all of these elements together. This situation motivates ExtremeEarth’s ambition to develop the tools required to predict important aspects of earthquakes and possibly even volcanic eruptions.

To this end, ExtremeEarth will enable the community to transcend traditional approaches – based on scarce historical information and crude statistics – and build multi-scale, physics-based predictive models ultimately capable of protecting society. These new models will make it possible to integrate information about active faults, geodetic strain rates, remote-sensing imaging, seismicity distribution, and geodynamic constraints, and thus to simulate the physical processes leading to earthquakes and volcanic eruptions. For instance, physical models integrating the newest earthquake dynamics on complex fault geometries with real-time data assimilation from near-fault observatories will be constructed to identify possible precursors and map the initiation and evolution of rupture. Similarly, physical models of the coupled dynamics in volcanic sub-domains will be integrated with automatic signal detection from massive analysis of data from multi-parametric volcano monitoring systems, to resolve the deep volcano dynamics and anticipate the occurrence of an eruption. Additionally, methodologies for full-waveform simulations in complex heterogeneous media – accounting for the geological and sedimentary structures observed at the Earth’s surface and in the upper crust – that have been developed in recent years can be scaled up, through extreme-scale computing, into a fully integrated European broad-band platform. The target capability is to generate synthetic seismograms across Europe up to 10 Hz in frequency, corresponding to wavelengths of 60–100 m for the typical sedimentary basins on which most of our cities are built.
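The 10 Hz target and the 60–100 m wavelengths are linked by the elementary relation between wave speed, frequency, and wavelength; the shear-wave speeds below are typical values for soft sedimentary basins, inserted here only to make the arithmetic explicit.

    % Wavelength of a seismic wave of frequency f travelling at shear-wave
    % speed v_s:
    \lambda = v_s / f
    % For soft sedimentary basins, v_s is typically 600--1000 m/s, so at
    % f = 10 Hz:
    \lambda = \frac{600\ \mathrm{m/s}}{10\ \mathrm{Hz}} \;\text{to}\; \frac{1000\ \mathrm{m/s}}{10\ \mathrm{Hz}} = 60\text{--}100\ \mathrm{m}
    % Resolving such wavelengths with 5--10 grid points per wavelength implies
    % grid spacings of roughly 6--20 m, which is what pushes continental-scale
    % full-waveform simulation to extreme computing.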

Fusing Models and Data

Limitations in the physics of existing models, as well as computational restrictions (for instance on ensemble data-assimilation methods, or on methods of empirical inference), also restrict our ability to sensibly ingest and make use of the data collected continually by a vast array of sensors: satellites, ground stations, organised surface networks, commodity device sensing, and so on. Data assimilation – the process whereby diverse and heterogeneous measurements of different quality are cast into a structure that makes them usable – is limited by the quality and resolution of our physical models. New methods of empirical inference (deep learning) are only beginning to tap the potential of the vast amounts of fused model-observation data. By simulating the Earth system on the scales at which it is observed, and by explicitly representing the transient dynamics of the observed processes rather than a crude estimate of their statistics, it will become possible to integrate increasingly diverse data streams and exploit their full information, in ways that blur the distinction in how users interact with models and data, or look at the past versus the future.
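As a concrete, minimal illustration of the data-assimilation step described above, the sketch below implements one analysis step of a textbook stochastic ensemble Kalman filter; it is a generic example, not ExtremeEarth’s actual system, and all names and numbers in it are invented.

    import numpy as np

    rng = np.random.default_rng(0)

    def enkf_update(ensemble, y_obs, H, obs_err_std):
        """One stochastic ensemble Kalman filter analysis step.

        ensemble    : (n_members, n_state) array of forecast states
        y_obs       : (n_obs,) observation vector
        H           : (n_obs, n_state) linear observation operator
        obs_err_std : observation error standard deviation
        """
        n_members = ensemble.shape[0]
        n_obs = len(y_obs)
        R = (obs_err_std ** 2) * np.eye(n_obs)    # observation error covariance

        # Ensemble anomalies (deviations from the ensemble mean).
        X = ensemble - ensemble.mean(axis=0)      # (n_members, n_state)
        Y = X @ H.T                               # anomalies in observation space

        # Sample covariances and Kalman gain K = Pxy Pyy^{-1}.
        Pxy = X.T @ Y / (n_members - 1)           # (n_state, n_obs)
        Pyy = Y.T @ Y / (n_members - 1) + R       # (n_obs, n_obs)
        K = np.linalg.solve(Pyy, Pxy.T).T         # (n_state, n_obs)

        # Perturbed observations keep the analysis ensemble spread consistent.
        y_pert = y_obs + obs_err_std * rng.standard_normal((n_members, n_obs))
        innovations = y_pert - ensemble @ H.T     # (n_members, n_obs)
        return ensemble + innovations @ K.T

    # Toy usage: a 3-variable state, 100 members, only the first variable observed.
    truth = np.array([1.0, -2.0, 0.5])
    H = np.array([[1.0, 0.0, 0.0]])
    prior = truth + rng.standard_normal((100, 3))        # crude prior ensemble
    y = H @ truth + 0.1 * rng.standard_normal(1)         # one noisy observation
    posterior = enkf_update(prior, y, H, obs_err_std=0.1)
    print(prior.mean(axis=0), posterior.mean(axis=0))

After the update, the observed variable’s ensemble mean moves towards the observation and its spread shrinks, while unobserved variables are adjusted only through the sample covariances – which is precisely why richer models and denser observations make assimilation more powerful.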

 

[1] Stevens and Bony, 2013: What Are Climate Models Missing? Science, 340(6136), 1053–1054.

[2] Bony et al., 2015: Clouds, circulation and climate sensitivity. Nature Geoscience, 8, 261–268.

[3] IPCC, WG II, https://www.ipcc.ch/pdf/assessment-report/ar5/wg2/WGIIAR5-FrontMatterA_FINAL.pdf

[4] Bauer et al., 2015: The quiet revolution of numerical weather prediction. Nature, 525, 47–55, doi:10.1038/nature14956.

[5] Marotzke et al., 2017: Climate research must sharpen its view. Nature Climate Change, 7, 89–91.

[6] Palmer, 2014: Build high-resolution global climate models. Nature, 515, 338–339.