
GSA Today, Volume 25, Issue 6 (June 2015)


Article, pp. 42–43

GROUNDWORK:

Moving lithospheric modeling forward: Attributes of a community computer code

C.M. Cooper1, Eric Mittelstaedt2, Claire A. Currie3, Jolante van Wijk4, Louise H. Kellogg5, Lorraine Hwang5, Ramon Arrowsmith6

1 Washington State University, School of the Environment, PO Box 624812, Pullman, Washington 99164-2812, USA
2 University of Idaho, Dept. of Geological Sciences, 875 Perimeter Drive, MS 3022, Moscow, Idaho 83844-3022, USA
3 University of Alberta, Dept. of Physics, Edmonton, Alberta, Canada T6G 2E1
4 New Mexico Institute of Mining and Technology, Dept. of Earth & Environmental Science, 801 Leroy Place, Socorro, New Mexico 87801, USA
5 University of California Davis, Earth and Planetary Sciences, Computational Infrastructure for Geodynamics, 2215 Earth and Physical Sciences, One Shields Avenue, Davis, California 95616, USA
6 Arizona State University, School of Earth & Space Exploration, P.O. Box 876004, Tempe, Arizona 85287-6004, USA

We live on a planet with an active surface that is modified and deformed at multiple temporal and spatial scales owing to diverse processes occurring at plate boundaries and plate interiors. The processes of mid-ocean-ridge spreading, mountain building, subduction of tectonic plates, mantle drag, intra-continental deformation, earthquakes, and volcanism cross traditional disciplinary boundaries (Fig. 1A). Understanding these lithospheric processes is valuable not only for satisfying intellectual curiosity and refining our working knowledge of plate tectonics, but also for understanding threats to life, property, and infrastructure. Computer modeling and simulation are increasingly powerful tools that researchers employ to better understand lithospheric deformation and unravel the complex feedbacks that drive the evolution of Earth's surface. The field is poised for a significant advance that takes advantage of recent expansions in computing power, improved representation of idealized processes, increased data availability, and better communication between software developers and geoscientists.


Manuscript received 28 Aug. 2014; accepted 2 Dec. 2014

doi: 10.1130/GSATG230GW.1

Figure 1. (A) The study of how the lithosphere deforms spans disciplines to help us understand Earth processes from the subatomic to the global scale and from microseconds to hundreds of millions of years. This wide range of scales in space and time is challenging to accommodate using today's computational resources (after Hwang et al., 2014). (B) Mathematical techniques commonly used in geophysics research can be classified as continuum, analytical, or discontinuous methods, posing computational and numerical challenges for multidisciplinary research.

To move forward as a community, we must address the key scientific drivers motivating present and future lithospheric deformation research. The processes to incorporate include melting and melt transport, strain localization and de-localization, surface processes (e.g., erosion and deposition), and mantle-lithosphere interaction. Understanding these requires integrating constraints from seismic imaging, the earthquake cycle, and plate boundary evolution, along with more realistic Earth-like rheologies, into numerical models that are reliable, portable, and computationally efficient.

Lithospheric modelers confront a broad range of challenges in addressing these drivers. Scientifically, crucial geological processes lack theoretical or empirical descriptions (e.g., variable fault dip at depth, shear-band spacing, localization of deformation, and deformation coupled with melting and melt migration). Incorporating the vast quantity of new data available through initiatives such as the National Science Foundation's EarthScope and data compilations such as GPlates (Qin et al., 2012) and PetDB (Lehnert et al., 2000) requires both the development of new data-handling methods and an understanding of how these data sets interrelate. Added to these challenges are the difficulties of implementing the numerical methods required to run the desired simulations, including modeling systems with large variations in material properties over short spatial scales; maintaining discrete material boundaries as the model evolves; and incorporating realistic fault evolution and faulting behavior. Lastly, extending models to three dimensions increases numerical and model complexity, an area that has seen limited development.
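
To make one of these numerical difficulties concrete, a common strategy for maintaining discrete material boundaries is to carry composition on Lagrangian markers that are advected through the flow and then projected back onto the computational grid. The sketch below is purely illustrative; the prescribed velocity field, marker count, and grid resolution are assumptions, not taken from any particular community code:

```python
import numpy as np

# Illustrative marker-based tracking of a sharp compositional boundary.
# All parameters below are placeholder assumptions.
rng = np.random.default_rng(0)
n_markers = 10_000
x = rng.uniform(0.0, 1.0, n_markers)       # nondimensional coordinates
y = rng.uniform(0.0, 1.0, n_markers)
composition = (y > 0.5).astype(float)      # 0 = "mantle", 1 = "crust" (toy labels)

def velocity(x, y):
    """Divergence-free test flow resembling a single convection cell."""
    u = -np.sin(np.pi * x) * np.cos(np.pi * y)
    v = np.cos(np.pi * x) * np.sin(np.pi * y)
    return u, v

dt = 1e-3
for _ in range(1000):                      # advect markers; composition rides along
    u, v = velocity(x, y)
    x = np.clip(x + dt * u, 0.0, 1.0)      # clip is only a safeguard; the normal
    y = np.clip(y + dt * v, 0.0, 1.0)      # velocity vanishes on the boundaries

# Project marker composition onto a coarse grid, e.g., to build a viscosity field
weighted, _, _ = np.histogram2d(x, y, bins=32, weights=composition)
counts, _, _ = np.histogram2d(x, y, bins=32)
crust_fraction = np.divide(weighted, counts,
                           out=np.zeros_like(weighted), where=counts > 0)
```

Because the markers, not the grid, carry the material identity, the interface between the two compositions stays sharp after many advection steps, at the cost of extra bookkeeping and of choosing a marker-to-grid projection.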

Modeling complex systems requires validation and verification of software. Establishing and running benchmarks and test suites not only "proves" a code but also provides important insight for the researcher: limits in parameter space and trade-offs between different model specifications become better known. Benchmarking performance helps to inform the use of computational resources and to understand numerical uncertainty.
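
For instance, a minimal verification test might compare a numerical solution against a known analytical one. The sketch below is a hypothetical example, not an established community benchmark; the grid spacing, time step, and tolerance are assumptions. It checks an explicit finite-difference solution of one-dimensional lithospheric cooling against the analytical half-space cooling solution:

```python
import numpy as np
from math import erf, sqrt

# Illustrative verification test: 1-D half-space cooling.
# All numerical choices below are placeholder assumptions.
kappa = 1.0e-6                  # thermal diffusivity, m^2/s
T_mantle = 1350.0               # interior temperature, deg C
depth, nz = 100.0e3, 201        # model depth (m) and number of nodes
dz = depth / (nz - 1)
z = np.linspace(0.0, depth, nz)

dt = 0.4 * dz * dz / kappa      # below the explicit stability limit of 0.5
t_end = 3.15e14                 # ~10 Myr in seconds
nsteps = int(t_end / dt)

# Explicit finite-difference solution of dT/dt = kappa * d2T/dz2
T = np.full(nz, T_mantle)
T[0] = 0.0                      # cold surface boundary condition
# (the bottom node keeps T_mantle, approximating the half-space condition)
for _ in range(nsteps):
    T[1:-1] += dt * kappa * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dz**2

# Analytical half-space cooling solution at the same elapsed time
t = nsteps * dt
T_exact = np.array([T_mantle * erf(zi / (2.0 * sqrt(kappa * t))) for zi in z])

misfit = np.max(np.abs(T - T_exact))
print(f"maximum misfit: {misfit:.2f} deg C")
assert misfit < 10.0, "benchmark tolerance exceeded"   # assumed tolerance
```

A suite of such tests, run automatically whenever the code changes, is what turns a one-off comparison into the kind of ongoing verification described above.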

The heterogeneity of the lithosphere translates into a heterogeneous approach to modeling lithospheric processes. Computational approaches employed to address the key scientific interests of the community tend to be based on continuum, analytical, or discontinuous methods (Fig. 1B). The choice among these mathematical methods, several of which may be deployed in any one code, depends on the maturity of the research area and the specifics of the research question. Individual researchers often develop numerical techniques and modeling software capable of solving specific geologic problems. While these efforts frequently result in numerical codes that are powerful and apt for the problem at hand, they seldom translate into a more universal modeling tool.

For example, while one technique might be optimal for understanding the evolution of ocean basins and localized faulting at a mid-ocean ridge, it may not be applicable to regional-scale subduction dynamics. Similarly, techniques used to model stress/strain fields over the earthquake cycle may not be optimal for understanding stress/strain fields generated during continental collision. In addition, while grain-scale processes are critical to understanding how rocks deform, it may not be necessary to include these small-scale effects when trying to understand continental-scale deformation. Furthermore, we understand that the lithosphere behaves as an elasto-visco-plastic material, but many of the governing characteristics of this rheologic behavior are not well defined and thus are not easy to incorporate into numerical models. Large-scale geophysical observatories, such as EarthScope's real-time seismological and geodetic data streams (Williams et al., 2010), coupled with the breadth of basic and applied research questions on the structure and evolution of the North American continent, pose a great interpretive challenge that requires a broad range of lithospheric dynamics modeling capabilities.
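
As one illustration of how such composite elasto-visco-plastic behavior is often approximated in practice, a creep viscosity that depends on temperature and strain rate can be combined with a plastic yield cap by taking the smaller of the two. The sketch below is a simplified example; the flow-law and yield parameters are placeholder assumptions, not calibrated laboratory values:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def creep_viscosity(strain_rate, T, A=1e-16, n=3.5, E=530e3):
    """Power-law (dislocation-creep-style) viscosity:
    eta = 0.5 * A**(-1/n) * strain_rate**((1-n)/n) * exp(E / (n*R*T)).
    A, n, and E are placeholder values, not a published flow law."""
    return 0.5 * A**(-1.0 / n) * strain_rate**((1.0 - n) / n) * np.exp(E / (n * R * T))

def yield_viscosity(strain_rate, pressure, cohesion=20e6, friction=0.6):
    """Viscosity that caps stress at a Drucker-Prager-like yield stress,
    tau_yield = cohesion + friction * pressure."""
    tau_yield = cohesion + friction * pressure
    return tau_yield / (2.0 * strain_rate)

def effective_viscosity(strain_rate, T, pressure, eta_min=1e18, eta_max=1e25):
    """Take the weaker (smaller) of the creep and yield viscosities,
    then clip to numerically tractable bounds."""
    eta = np.minimum(creep_viscosity(strain_rate, T),
                     yield_viscosity(strain_rate, pressure))
    return np.clip(eta, eta_min, eta_max)

# A cool, shallow point is yield-limited; a hot, deep point is creep-controlled.
print(effective_viscosity(1e-15, T=600.0, pressure=3e8))    # ~1e23 Pa s
print(effective_viscosity(1e-15, T=1600.0, pressure=3e9))   # ~1e20 Pa s
```

Even this simplified form leaves out elasticity, strain weakening, and grain-size evolution, which is precisely where the poorly defined characteristics noted above become an obstacle.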

Therein lies the challenge of modeling lithospheric processes: Can we build a community code (or suite of codes) that spans the breadth of lithospheric processes, maintains the numerical rigor required to solve such problems, and remains accessible to users with a wide range of experience?

The ideas in this article emerged from the 2014 CIG EarthScope Institute for Lithospheric Modeling workshop, a joint workshop of the Computational Infrastructure for Geodynamics (CIG) long-term tectonics community and the EarthScope National Office. This was the first dedicated workshop in North America on modeling lithospheric deformation in more than a decade; the topic had previously been folded into larger workshops and national meetings with a broader scope, diluting many of the discussions pertinent to lithospheric deformation modeling. The workshop highlighted the complexity and variety within the discipline, suggesting that a "one size fits many" approach, that is, a single community code that fits most researchers' needs, might not be the best way forward. Rather, a move toward a common core, or engine, that researchers can build upon or modify to suit their specific research problems might be a more realistic and fruitful endeavor. Community-developed scientific codes can build on established numerical methods (ideally benchmarked, documented, and open source) while taking advantage of state-of-the-art techniques. As a community of user-developers is established, a shared expertise emerges that in turn leads to improved computational tools.
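
One way to picture such a common core is sketched below: a small time-stepping engine onto which researchers register their own physics modules (rheology, surface processes, melt transport, and so on). The class and function names are purely illustrative assumptions and do not correspond to any existing CIG code:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class ModelState:
    """Solution fields and bookkeeping owned by the shared core."""
    time: float = 0.0
    fields: Dict[str, float] = field(default_factory=dict)

class CoreEngine:
    """Minimal 'engine': owns time stepping, delegates physics to plug-ins."""
    def __init__(self):
        self._modules: Dict[str, Callable[[ModelState, float], None]] = {}

    def register(self, name: str, update: Callable[[ModelState, float], None]):
        """Attach a researcher-supplied module without modifying the core."""
        self._modules[name] = update

    def step(self, state: ModelState, dt: float):
        """Advance one time step by calling each registered module in turn."""
        for update in self._modules.values():
            update(state, dt)
        state.time += dt

# Example: a user plugs a toy erosion law into the shared core.
def simple_erosion(state: ModelState, dt: float):
    topo = state.fields.setdefault("topography", 1000.0)     # placeholder value, m
    state.fields["topography"] = topo * (1.0 - 1.0e-7 * dt)  # placeholder decay law

engine = CoreEngine()
engine.register("erosion", simple_erosion)
state = ModelState()
engine.step(state, dt=1000.0)
```

The point is not this particular interface but the division of labor it implies: the community maintains and verifies the core, while individual groups contribute and exchange the problem-specific pieces.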

To begin this journey, we must build a community vision. The workshop articulated the following needs: (1) continue the conversations, either in person at meetings or via online forums; (2) establish the means to collate best practices and known successful numerical techniques; (3) develop benchmarks and use cases (specific examples of scientific or technical problems or questions, with identified goals, key users, and outcomes); and (4) collectively begin a community-wide benchmark exercise to assess current computing capabilities and guide development of the next generation of models. Through these efforts, the community as a whole can move lithospheric deformation modeling into the next frontier. This requires your involvement; please consider visiting the CIG website and joining the listservs and online community.

References Cited

  1. Hwang, L., Jordan, T., Kellogg, L., Tromp, J., and Willemann, R., 2014, Advancing Solid Earth System Science through High Performance Computing: http://geodynamics.org/cig/files/1614/0224/2811/AdvHPC-June2014.pdf (last accessed 27 Jan. 2015).
  2. Lehnert, K., Su, Y., Langmuir, C., Sarbas, B., and Nohl, U., 2000, A global geochemical database structure for rocks: Geochemistry, Geophysics, Geosystems, v. 1, no. 5, doi: 10.1029/1999GC000026.
  3. Qin, X., Müller, R.D., Cannon, J., Landgrebe, T.C.W., Heine, C., Watson, R.J., and Turner, M., 2012, The GPlates Geological Information Model and Markup Language: Geoscientific Instrumentation, Methods and Data Systems Discussions, v. 2, p. 365–428, doi: 10.5194/gid-2-365-2012.
  4. Williams, M.L., Fischer, K.M., Freymueller, J.T., Tikoff, B., Tréhu, A.M., et al., 2010, Unlocking the Secrets of the North American Continent: An EarthScope Science Plan for 2010–2020, February 2010, 78 p., http://www.earthscope.org/assets/uploads/pages/es_sci_plan_hi.pdf (last accessed 27 Jan. 2015).
