docs: GEOS paper submittal to JOSS #2886
New file: a GitHub Actions workflow that builds the JOSS draft PDF on each push:

```yaml
on: [push]

jobs:
  paper:
    runs-on: ubuntu-latest
    name: Paper Draft
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: Build draft PDF
        uses: openjournals/openjournals-draft-action@master
        with:
          journal: joss
          # This should be the path to the paper within your repo.
          paper-path: src/docs/JOSS/paper.md
      - name: Upload
        uses: actions/upload-artifact@v4
        with:
          name: paper
          # This is the output path where Pandoc will write the compiled
          # PDF. Note, this should be the same directory as the input
          # paper.md
          path: src/docs/JOSS/paper.pdf
```
Review comments:

- we should update this figure because some things have changed since when it was made (e.g., the
- I think @TotoGaz generated this...but perhaps this figure is not required for the paper...and it can just be updated in the documentation.
- It's generated on the fly from
New file: `paper.bib`, the bibliography referenced by the paper:

```bibtex
@article{Settgast:2017,
  author  = {Settgast, Randolph R. and Fu, Pengcheng and Walsh, Stuart D.C. and White, Joshua A. and Annavarapu, Chandrasekhar and Ryerson, Frederick J.},
  title   = {A fully coupled method for massively parallel simulation of hydraulically driven fractures in 3-dimensions},
  journal = {International Journal for Numerical and Analytical Methods in Geomechanics},
  volume  = {41},
  number  = {5},
  pages   = {627-653},
  year    = {2017},
  doi     = {10.1002/nag.2557}
}

@inproceedings{Beckingsale:2019,
  author    = {Beckingsale, David A. and Burmark, Jason and Hornung, Rich and Jones, Holger and Killian, William and Kunen, Adam J. and Pearce, Olga and Robinson, Peter and Ryujin, Brian S. and Scogland, Thomas R. W.},
  booktitle = {2019 IEEE/ACM International Workshop on Performance, Portability and Productivity in HPC (P3HPC)},
  title     = {RAJA: Portable Performance for Large-Scale Scientific Applications},
  pages     = {71-81},
  year      = {2019},
  doi       = {10.1109/P3HPC49587.2019.00012}
}

@misc{CHAI:2023,
  title     = {CHAI},
  year      = {2023},
  publisher = {GitHub},
  journal   = {GitHub repository},
  url       = {https://github.com/LLNL/chai}
}

@article{Beckingsale:2020,
  author  = {Beckingsale, D. A. and McFadden, M. J. and Dahm, J. P. S. and Pankajakshan, R. and Hornung, R. D.},
  title   = {Umpire: Application-focused management and coordination of complex hierarchical memory},
  journal = {IBM Journal of Research and Development},
  volume  = {64},
  number  = {3/4},
  pages   = {15:1-15:10},
  year    = {2020},
  doi     = {10.1147/JRD.2019.2954403}
}

@inproceedings{hypre,
  author    = {Falgout, R. D. and Yang, U. M.},
  title     = {\textit{hypre}: a Library of High Performance Preconditioners},
  booktitle = {Lecture Notes in Computer Science},
  pages     = {632--641},
  year      = {2002},
  doi       = {10.1007/3-540-47789-6_66}
}

@misc{petsc-web-page,
  author       = {Satish Balay and Shrirang Abhyankar and Mark~F. Adams and Steven Benson and Jed Brown and Peter Brune and Kris Buschelman and Emil~M. Constantinescu and Lisandro Dalcin and Alp Dener and Victor Eijkhout and Jacob Faibussowitsch and William~D. Gropp and V\'{a}clav Hapla and Tobin Isaac and Pierre Jolivet and Dmitry Karpeev and Dinesh Kaushik and Matthew~G. Knepley and Fande Kong and Scott Kruger and Dave~A. May and Lois Curfman McInnes and Richard Tran Mills and Lawrence Mitchell and Todd Munson and Jose~E. Roman and Karl Rupp and Patrick Sanan and Jason Sarich and Barry~F. Smith and Stefano Zampini and Hong Zhang and Hong Zhang and Junchao Zhang},
  title        = {{PETS}c {W}eb page},
  url          = {https://petsc.org/},
  howpublished = {\url{https://petsc.org/}},
  year         = {2024}
}

@manual{trilinos-website,
  title  = {The {T}rilinos {P}roject {W}ebsite},
  author = {The {T}rilinos {P}roject {T}eam},
  year   = {2020 (accessed May 22, 2020)},
  url    = {https://trilinos.github.io}
}

@article{BUI:2020,
  author  = {Bui, Quan M. and Osei-Kuffuor, Daniel and Castelletto, Nicola and White, Joshua A.},
  title   = {A Scalable Multigrid Reduction Framework for Multiphase Poromechanics of Heterogeneous Media},
  journal = {SIAM Journal on Scientific Computing},
  volume  = {42},
  number  = {2},
  pages   = {B379-B396},
  year    = {2020},
  doi     = {10.1137/19M1256117}
}

@article{BUI:2021114111,
  author  = {Quan M. Bui and François P. Hamon and Nicola Castelletto and Daniel Osei-Kuffuor and Randolph R. Settgast and Joshua A. White},
  title   = {Multigrid reduction preconditioning framework for coupled processes in porous and fractured media},
  journal = {Computer Methods in Applied Mechanics and Engineering},
  volume  = {387},
  pages   = {114111},
  year    = {2021},
  doi     = {10.1016/j.cma.2021.114111}
}

@book{IPCC_2023,
  author    = {{Intergovernmental Panel on Climate Change IPCC}},
  title     = {Climate Change 2022 - Mitigation of Climate Change: Working Group III Contribution to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change},
  publisher = {Cambridge University Press},
  address   = {Cambridge},
  year      = {2023},
  doi       = {10.1017/9781009157926}
}

@misc{GEOS_RTD,
  title = {GEOS Documentation},
  year  = {2024},
  url   = {https://geosx-geosx.readthedocs-hosted.com/en/latest/}
}
```
New file: `paper.md`, the JOSS paper draft:

---
title: 'GEOS-2023: A portable multi-physics simulation framework'
tags:
  - reservoir simulations
  - computational mechanics
  - multiphase flow
  - c++
authors:
  - name: Randolph R. Settgast
    orcid: 0000-0002-2536-7867
    corresponding: true
    affiliation: 1
  - name: Ryan M. Aronson
    affiliation: 3
  - name: Julien R. Besset
    affiliation: 2
  - name: Andrea Borio
    affiliation: 5
  - name: Thomas J. Byer
    affiliation: 1
  - name: Nicola Castelletto
    affiliation: 1
  - name: Aurélien Citrain
    affiliation: 2
  - name: Benjamin C. Corbett
    affiliation: 1
  - name: James Corbett
    affiliation: 1
  - name: Philippe Cordier
    affiliation: 2
  - name: Matthias A. Cremon
    affiliation: 1
  - name: Cameron M. Crook
    affiliation: 1
  - name: Matteo Cusini
    affiliation: 1
  - name: Fan Fei
    affiliation: 1
  - name: Stefano Frambati
    affiliation: 2
  - name: Andrea Franceschini
    affiliation: 3
  - name: Matteo Frigo
    affiliation: 3
  - name: Thomas Gazzola
    affiliation: 2
  - name: Herve Gross
    affiliation: 2
  - name: Francois Hamon
    affiliation: 2
  - name: Brian M. Han
    affiliation: 1
  - name: Michael Homel
    affiliation: 1
  - name: Jian Huang
    affiliation: 2
  - name: Tao Jin
    affiliation: 1
  - name: Dickson Kachuma
    affiliation: 2
  - name: Mohammad Karimi-Fard
    affiliation: 3
  - name: Sergey Klevtsov
    affiliation: 3
  - name: Alexandre Lapene
    affiliation: 2
  - name: Victor A. P. Magri
    affiliation: 1
  - name: Daniel Osei-Kuffuor
    affiliation: 1
  - name: Stefan Povolny
    affiliation: 1
  - name: Shabnam J. Semnani
  - name: Chris S. Sherman
    affiliation: 1
  - name: Melvin Rey
    affiliation: 2
  - name: Hamdi A. Tchelepi
    affiliation: 3
  - name: William R. Tobin
    affiliation: 1
  - name: Pavel Tomin
    affiliation: 4
  - name: Lionel Untereiner
    orcid: 0000-0002-8025-2616
  - name: Joshua A. White
    affiliation: 1
  - name: Hui Wu
    affiliation: 1
affiliations:
  - name: Lawrence Livermore National Laboratory, USA
    index: 1
  - name: TotalEnergies E&P Research & Technology, USA
    index: 2
  - name: Stanford University, USA
    index: 3
  - name: Chevron Technical Center, USA
    index: 4
  - name: Politecnico di Torino, Italy
    index: 5
date: 15 December 2023
bibliography: paper.bib
---
# Summary | ||
|
||
GEOS is a simulation framework focused solving tightly-coupled multi-physics problems with an initial emphasis subsurface reservoir applications. | ||
rrsettgast marked this conversation as resolved.
Show resolved
Hide resolved
|
||
Currently GEOS actively supports implementations for studying carbon sequestration, geothermal energy, hydrogen storage, and similar subsurface applications. | ||
The unique aspect of GEOS that differentiates it from existing reservoir simulators is the ability to provide tightly-coupled compositional flow, poromechanics, faults and fractures slip, and thermal effects, etc. | ||
Extensive documentation is available on Read the Docs [@GEOS_RTD]. | ||
Note that the version of GEOS described here should be considered a separate work form the previous incarnation of GEOS referred to in [@Settgast:2017]. | ||
rrsettgast marked this conversation as resolved.
Show resolved
Hide resolved
|

# Statement of need

The increasing threat of climate change has resulted in an increased focus on mitigating carbon emissions into the atmosphere.
Carbon Capture and Storage (CCS) of CO2 in subsurface reservoirs and saline aquifers is an important component of the strategy to meet global climate goals.

Given the 2050 net-zero GHG goals, the CO2 storage capacities required to offset emissions are orders of magnitude greater than current levels [@IPCC_2023].
The ability to evaluate reservoir performance and containment risks associated with the injection of liquefied CO2 into the subsurface in a reproducible and transparent manner is an important consideration when developing new storage sites.
The primary goal of GEOS is to provide the global community with an open-source tool capable of simulating the complex coupled physics that occurs when liquefied CO2 is injected into a subsurface reservoir.

Thus, GEOS is freely available and focused on the simulation of reservoir integrity through various failure mechanisms such as caprock failure, fault leakage, and wellbore failure.

# GEOS Components

The core C++17 infrastructure provides common computer-science capabilities typically required for solving differential equations using a spatially discrete method.
The components of the infrastructure provided by GEOS include a data hierarchy, a discrete mesh data structure, a mesh-based MPI communications interface, degree-of-freedom management, I/O services, and a physics package interface.

By design, GEOS is intended to be a generic multi-physics simulation platform.
The physics package interface in GEOS is intended to encapsulate the development of numerical methods applied to the solution of governing equations relevant to a problem.
When implementing a physics package for a set of coupled physics equations, each individual physics package is first developed as a stand-alone capability.
The single-physics capabilities are then applied together in a coupled physics package and solved through a flexible strategy ranging from solving the fully monolithic system to a split-operator approach.
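
As a minimal sketch of the monolithic versus split-operator distinction (plain Python with NumPy, not GEOS code; the two residual functions are invented stand-ins for single-physics packages), the same two-equation coupled system can be solved both ways:

```python
# Hypothetical illustration (not GEOS code): two invented residual
# equations standing in for single-physics packages,
#   r1(x, y) = x^2 + y - 3 = 0,   r2(x, y) = x + y^2 - 5 = 0,
# solved (a) monolithically with Newton's method and (b) with a
# split-operator (sequential) fixed-point loop.
import numpy as np

def r1(x, y):
    return x**2 + y - 3.0

def r2(x, y):
    return x + y**2 - 5.0

def solve_monolithic(x, y, tol=1e-12, max_iter=50):
    """Newton's method on the fully coupled system [r1; r2]."""
    for _ in range(max_iter):
        r = np.array([r1(x, y), r2(x, y)])
        if np.linalg.norm(r) < tol:
            break
        J = np.array([[2.0 * x, 1.0],
                      [1.0, 2.0 * y]])  # analytic Jacobian of [r1; r2]
        dx, dy = np.linalg.solve(J, -r)
        x, y = x + dx, y + dy
    return x, y

def solve_split(x, y, tol=1e-10, max_iter=200):
    """Each 'package' solves its own equation with the other field frozen."""
    for _ in range(max_iter):
        x = np.sqrt(3.0 - y)  # solve r1 = 0 for x, holding y fixed
        y = np.sqrt(5.0 - x)  # solve r2 = 0 for y, holding x fixed
        if abs(r1(x, y)) + abs(r2(x, y)) < tol:
            break
    return x, y
```

From suitable starting points both approaches converge to the same coupled solution; the trade-off mirrored in GEOS is robustness of the monolithic solve versus the simplicity and reuse of sequential single-physics solves.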

To solve the linear systems that arise from the boundary value problem, GEOS maintains a generic linear algebra interface (LAI) capable of wrapping various linear algebra packages such as hypre [@hypre], PETSc [@petsc-web-page], and Trilinos [@trilinos-website].
Currently, only the hypre interface is actively maintained in GEOS.
For multi-physics problems involving the solution of a coupled linear system, GEOS currently relies on a multigrid reduction preconditioning strategy available in hypre, as presented in [@BUI:2020;@BUI:2021114111].
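
To convey the flavor of the reduction approach (this is not hypre's MGR implementation; the small dense blocks below are invented placeholders for discretized physics), one can eliminate one field from a 2x2 block system and solve a Schur-complement system for the other:

```python
# Hypothetical sketch of the "reduction" idea behind multigrid reduction
# (MGR) preconditioning; NOT hypre's implementation. The blocks are
# invented placeholders for, e.g., a mechanics and a flow sub-problem.
import numpy as np

rng = np.random.default_rng(0)
n = 8
A = 4.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))  # first-field block
B = 0.1 * rng.standard_normal((n, n))                    # coupling blocks
C = 0.1 * rng.standard_normal((n, n))
D = 2.0 * np.eye(n) + 0.1 * rng.standard_normal((n, n))  # second-field block
f = rng.standard_normal(n)
g = rng.standard_normal(n)

# Reduction: eliminate the first field, solve the Schur complement for
# the second, then back-substitute. MGR replaces these exact dense
# solves with cheap approximate (multigrid) solves on each level.
S = D - C @ np.linalg.solve(A, B)                      # Schur complement
p = np.linalg.solve(S, g - C @ np.linalg.solve(A, f))  # reduced solve
u = np.linalg.solve(A, f - B @ p)                      # back-substitution

# Agreement with the monolithic solve of the full 2x2 block system.
K = np.block([[A, B], [C, D]])
x = np.linalg.solve(K, np.concatenate([f, g]))
assert np.allclose(np.concatenate([u, p]), x)
```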

The performance portability strategy utilized by GEOS applies LLNL's suite of portability tools: RAJA [@Beckingsale:2019], CHAI [@CHAI:2023], and Umpire [@Beckingsale:2020].
The RAJA performance portability layer provides portable kernel launching and wrappers for reductions, atomics, and local/shared memory to achieve performance on both CPU and GPU hardware.
The combination of CHAI/Umpire provides memory-motion management for platforms with heterogeneous memory spaces (i.e., host memory and device memory).
Through this strategy, GEOS has been successfully run on platforms ranging from GPU-based exascale systems to CPU-based laptops with near-optimal performance.

In addition to its C++ core, the GEOS project provides a Python3 interface that allows the simulation capabilities to be integrated into complex Python workflows involving components unrelated to GEOS.
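
For instance, since GEOS is configured through XML input decks, a Python driver can generate deck variants for a parameter sweep before invoking the solver. The sketch below is hypothetical: the element and attribute names are illustrative only, not the actual GEOS input schema.

```python
# Hypothetical workflow sketch: generate XML input-deck variants for a
# parameter sweep. The element/attribute names ("Problem", "SourceFlux",
# "scale") are illustrative placeholders, NOT the actual GEOS schema.
import xml.etree.ElementTree as ET

def make_deck(injection_rate):
    """Return an XML string for one (hypothetical) injection-rate case."""
    root = ET.Element("Problem")
    ET.SubElement(root, "SourceFlux",
                  name="co2Injection",
                  scale=str(injection_rate))
    return ET.tostring(root, encoding="unicode")

decks = [make_deck(rate) for rate in (0.5, 1.0, 2.0)]
# Each string could be written to disk and handed to the GEOS
# executable, or driven through the Python interface directly.
```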

# Applications

To date, GEOS has been used to simulate problems relevant to CO2 storage, enhanced geothermal systems, hydrogen storage, and both conventional and unconventional oil and gas extraction.
Often these simulations involve coupling between compositional multiphase flow and transport, poroelasticity, thermal transport, and interactions with faults and fractures.

As an example of a field case where GEOS has been applied, we present a coupled compositional flow/mechanics simulation of CO2 injection and storage at a large real-world storage site.
Figure \ref{RW_results}a illustrates the computational mesh and Figure \ref{RW_results}b shows results after 25 years of injection.
Simulations such as this will play a critical role in predicting the viability of potential CO2 storage sites.

![Real-world CO2 storage site: (a) discrete mesh, with transparency applied to the overburden region to reveal the complex faulted structure of the storage reservoir; (b) results of a compositional flow simulation after 25 years of CO2 injection. The CO2 plume is shown in white near the bottom of the well. Colors in the reservoir layer indicate changes in fluid pressure, and colors in the overburden indicate vertical displacement resulting from the injection. Note that color scales have been removed intentionally.\label{RW_results}](RW_final.pdf){ width=100% }

As an example of the weak scalability of GEOS on exascale systems, we present two weak scaling studies on a simple wellbore geometry using the exascale Frontier supercomputer located at Oak Ridge National Laboratory (ORNL).
The results of the mechanics weak scaling study (Figure \ref{fig:Frontier_scaling}a) show flat scaling of the GEOS processes (assembly/field synchronization) up to 16,384 MPI ranks and 81.3e9 degrees of freedom (1/4 of Frontier).
There is a moderate decrease in efficiency in the hypre preconditioner setup and solve, but given the complexity of those algorithms this level of scaling efficiency is excellent.
The compositional flow study presented in Figure \ref{fig:Frontier_scaling}b shows similarly good weak scaling.
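
For reference, weak-scaling efficiency is simply the baseline time divided by the observed time at each rank count, since ideal weak scaling keeps the time per step constant as ranks and problem size grow together. The timings below are made-up placeholders, not the Frontier measurements shown in the figure:

```python
# Weak-scaling efficiency from per-timestep wall times. Ideal weak
# scaling keeps the time constant as ranks and problem size grow, so
# efficiency = baseline time / observed time. The numbers here are
# hypothetical placeholders, NOT the reported Frontier results.
ranks = [512, 4096, 16384]
time_per_step = [10.0, 10.4, 11.1]  # seconds (hypothetical)

efficiency = [time_per_step[0] / t for t in time_per_step]
for r, e in zip(ranks, efficiency):
    print(f"{r:6d} ranks: {100.0 * e:5.1f}% weak-scaling efficiency")
```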

![Weak scaling results on ORNL/Frontier: execution time per timestep vs. number of ranks for (a) a mechanics and (b) a compositional flow simulation, respectively.\label{fig:Frontier_scaling}](GEOS_Frontier_scaling.pdf){ width=100% }

# Acknowledgements

This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

This research was supported by the Exascale Computing Project (ECP), Project Number 17-SC-20-SC, a collaborative effort of two DOE organizations (the Office of Science and the National Nuclear Security Administration) responsible for the planning and preparation of a capable exascale ecosystem, including software, applications, hardware, advanced system engineering, and early testbed platforms, to support the nation's exascale computing imperative.

Partial funding was provided by TotalEnergies and Chevron through the FC-MAELSTROM project, a collaborative effort between Lawrence Livermore National Laboratory, TotalEnergies, Chevron, and Stanford University aiming to develop an exascale-compatible, multiscale, research-oriented simulator for modeling fully coupled flow, transport, and geomechanics in geological formations.

# References

Review comments:

- we should really update this figure before submitting this. Some things are inaccurate.
- It won't be included in the manuscript.