From 3ccf452f870923b5a5ef848228fa9c2dfeff4272 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Mon, 11 Mar 2024 09:36:49 -0600 Subject: [PATCH 01/26] First go at converting LaTeX to Sphinx-compatible RST --- scm/doc/TechGuide/acknow.rst | 16 + scm/doc/TechGuide/chap_cases.rst | 605 +++++++++++++++++++ scm/doc/TechGuide/chap_ccpp.rst | 385 ++++++++++++ scm/doc/TechGuide/chap_function.rst | 157 +++++ scm/doc/TechGuide/chap_hsd.rst | 344 +++++++++++ scm/doc/TechGuide/chap_intro.rst | 116 ++++ scm/doc/TechGuide/chap_quick.rst | 892 ++++++++++++++++++++++++++++ scm/doc/TechGuide/chap_repo.rst | 22 + scm/doc/TechGuide/main.rst | 1 + scm/doc/TechGuide/preface.rst | 21 + scm/doc/TechGuide/title.rst | 25 + 11 files changed, 2584 insertions(+) create mode 100644 scm/doc/TechGuide/acknow.rst create mode 100644 scm/doc/TechGuide/chap_cases.rst create mode 100644 scm/doc/TechGuide/chap_ccpp.rst create mode 100644 scm/doc/TechGuide/chap_function.rst create mode 100644 scm/doc/TechGuide/chap_hsd.rst create mode 100644 scm/doc/TechGuide/chap_intro.rst create mode 100644 scm/doc/TechGuide/chap_quick.rst create mode 100644 scm/doc/TechGuide/chap_repo.rst create mode 100644 scm/doc/TechGuide/main.rst create mode 100644 scm/doc/TechGuide/preface.rst create mode 100644 scm/doc/TechGuide/title.rst diff --git a/scm/doc/TechGuide/acknow.rst b/scm/doc/TechGuide/acknow.rst new file mode 100644 index 000000000..dcb376ad2 --- /dev/null +++ b/scm/doc/TechGuide/acknow.rst @@ -0,0 +1,16 @@ +.. container:: titlepage + + .. container:: flushleft + + Acknowledgement + + | If significant help was provided via the helpdesk, email, or + support forum for work resulting in a publication, please + acknowledge the Developmental Testbed Center team. + + | For referencing this document please use: + + Firl, G., D. Swales, L. Carson, L. Bernardet, D. Heinzeller, and + M. Harrold, 2022. Common Community Physics Package Single Column + Model v6.0.0 User and Technical Guide. 40pp. 
Available at + https://dtcenter.org/sites/default/files/paragraph/scm-ccpp-guide-v6-0-0.pdf. diff --git a/scm/doc/TechGuide/chap_cases.rst b/scm/doc/TechGuide/chap_cases.rst new file mode 100644 index 000000000..55c1bd6fb --- /dev/null +++ b/scm/doc/TechGuide/chap_cases.rst @@ -0,0 +1,605 @@ +.. _`chapter: cases`: + +Cases +===== + +How to run cases +---------------- + +Only two files are needed to set up and run a case with the SCM. The +first is a configuration namelist file found in that contains parameters +for the SCM infrastructure. The second necessary file is a NetCDF file +containing data to initialize the column state and time-dependent data +to force the column state. The two files are described below. + +.. _`subsection: case config`: + +Case configuration namelist parameters +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The namelist expects the following parameters: + +- + + - Identifier for which dataset (initialization and forcing) to load. + This string must correspond to a dataset included in the directory + (without the file extension). + +- + + - Specify the model runtime in seconds (integer). This should + correspond with the forcing dataset used. If a runtime is + specified that is longer than the supplied forcing, the forcing is + held constant at the last specified values. 
+ +- + + - An integer representing how forcing for temperature and moisture + state variables is applied (1 :math:`=` total advective + tendencies, 2 :math:`=` horizontal advective tendencies with + prescribed vertical motion, 3 :math:`=` relaxation to observed + profiles with vertical motion prescribed) + +- + + - An integer representing how forcing for horizontal momentum state + variables is applied (1 :math:`=` total advective tendencies; not + implemented yet, 2 :math:`=` horizontal advective tendencies with + prescribed vertical motion, 3 :math:`=` relaxation to observed + profiles with vertical motion prescribed) + +- + + - A floating point number representing the timescale in seconds for + the relaxation forcing (only used if :math:`=` or :math:`=` ) + +- + + - A boolean set to if surface flux are specified from the forcing + data (there is no need to have surface schemes in a suite + definition file if so) + +- + + - Surface roughness length in cm for calculating surface-related + fields from specified surface fluxes (only used if is True). 
+ +- + + - An integer representing the character of the surface (0 :math:`=` + sea surface, 1 :math:`=` land surface, 2 :math:`=` sea-ice + surface) + +- + + - An integer representing the choice of reference profile to use + above the supplied initialization and forcing data (1 :math:`=` + “McClatchey” profile, 2 :math:`=` mid-latitude summer standard + atmosphere) + +- + + - An integer representing the year of the initialization time + +- + + - An integer representing the month of the initialization time + +- + + - An integer representing the day of the initialization time + +- + + - An integer representing the hour of the initialization time + +- + + - A list of floating point values representing the characteristic + horizontal domain area of each atmospheric column in square meters + (this could be analogous to a 3D model’s horizontal grid size or + the characteristic horizontal scale of an observation array; these + values are used in scale-aware schemes; if using multiple columns, + you may specify an equal number of column areas) + +- + + - A boolean set to if UFS atmosphere initial conditions are used + rather than field campaign-based initial conditions + +- + + - An integer representing the grid size of the UFS atmosphere + initial conditions; the integer represents the number of grid + points in each horizontal direction of each cube tile + +- + + - 0 => original DTC format, 1 => DEPHY-SCM format. + +Optional variables (that may be overridden via run script command line +arguments) are: + +- + + - File containing FV3 vertical grid coefficients. + +- + + - Specify the integer number of vertical levels. + +.. _`subsection: case input`: + +Case input data file (CCPP-SCM format) +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The initialization and forcing data for each case is stored in a NetCDF +(version 4) file within the directory. 
Each file has at least two
dimensions ( and , potentially with additions for vertical snow and soil
levels) and is organized into three groups: scalars, initial, and forcing.
Not all fields are required for all cases. For example, the fields and
are only needed if the variable :math:`=` in the case configuration file,
and state nudging variables are only required if :math:`=` or :math:`=`
. Using an active LSM (Noah, NoahMP, RUC) requires many more variables
than are listed here. Example files for use with the Noah and NoahMP LSMs
are included in .

.. literalinclude:: arm_case_header.txt
   :name: lst_case_input_ccpp_netcdf_header
   :caption: example NetCDF file (CCPP-SCM format) header for case initialization and forcing data

.. _`subsection: case input dephy`:

Case input data file (DEPHY format)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The Development and Evaluation of Physics in atmospheric models (DEPHY)
format is an internationally-adopted data format intended for use by
SCMs and LESs. The initialization and forcing data for each case is
stored in a NetCDF (version 4) file, although these files are not
included in the CCPP SCM repository by default. To access these cases,
clone the DEPHY-SCM repository and provide the DEPHY-SCM file location
to the SCM. For example:

.. code:: bash

   cd [...]/ccpp-scm/scm/data
   git clone https://github.com/GdR-DEPHY/DEPHY-SCM DEPHY-SCM
   cd [...]/ccpp-scm/scm/bin
   ./run_scm.py -c MAGIC_LEG04A --case_data_dir [...]/ccpp-scm/scm/data/DEPHY-SCM/MAGIC/LEG04A -v

Each DEPHY file has three dimensions (, , ) and contains the initial
conditions (, ) and forcing data (, ). Just as when using the CCPP-SCM
formatted inputs (`1.1.2 <#subsection: case input>`__), not all fields
are required for all cases. More information on the DEPHY format
requirements can be found at
`DEPHY `__.

.. 
literalinclude:: dephy_case_header.txt + :name: lst_case_input_netcdf_header + :caption: example NetCDF file (DEPHY format) header for case initialization and forcing data + +Included Cases +-------------- + +Several cases are included in the repository to serve as examples for +users to create their own and for basic research. All case configuration +namelist files for included cases can be found in and represent the +following observational field campaigns: + +- Tropical Warm Pool – International Cloud Experiment (TWP-ICE) + maritime deep convection + +- Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) + Summer 1997 continental deep convection + +- Atlantic Stratocumulus Transition EXperiment (ASTEX) maritime + stratocumulus-to-cumulus transition + +- Barbados Oceanographic and Meteorological EXperiment (BOMEX) maritime + shallow convection + +- Large eddy simulation ARM Symbiotic Simulation and Observation + (LASSO) for May 18, 2016 (with capability to run all LASSO dates - + see `1.4 <#sec:lasso>`__) continental shallow convection + +For the ARM SGP case, several case configuration files representing +different time periods of the observational dataset are included, +denoted by a trailing letter. The LASSO case may be run with different +forcing applied, so three case configuration files corresponding to +these different forcing are included. In addition, two example cases are +included for using UFS Atmosphere initial conditions: + +- UFS initial conditions for 38.1 N, 98.5 W (central Kansas) for 00Z on + Oct. 3, 2016 with Noah variables on the C96 FV3 grid () + +- UFS initial conditions for 38.1 N, 98.5 W (central Kansas) for 00Z on + Oct. 3, 2016 with NoahMP variables on the C96 FV3 grid () + +See `1.5 <#sec:UFSreplay>`__ for information on how to generate these +files for other locations and dates, given appropriate UFS Atmosphere +initial conditions and output. 

How to set up new cases
-----------------------

Setting up a new case involves preparing the two types of files listed
above. For the case initialization and forcing data file, this typically
involves writing a custom script or program to convert the data from its
original format to the format that the SCM expects. An example of this
type of script written in Python is included in . The script reads in
the data as supplied from its source, converts any necessary variables,
and writes a NetCDF (version 4) file in the format described in
subsections `1.1.2 <#subsection: case input>`__ and
`1.1.3 <#subsection: case input dephy>`__. For reference, the following
formulas are used:

.. math:: \theta_{il} = \theta - \frac{\theta}{T}\left(\frac{L_v}{c_p}q_l + \frac{L_s}{c_p}q_i\right)

.. math:: q_t = q_v + q_l + q_i

where :math:`\theta_{il}` is the ice-liquid water potential temperature,
:math:`\theta` is the potential temperature, :math:`L_v` is the latent
heat of vaporization, :math:`L_s` is the latent heat of sublimation,
:math:`c_p` is the specific heat capacity of air at constant pressure,
:math:`T` is absolute temperature, :math:`q_t` is the total water
specific humidity, :math:`q_v` is the water vapor specific humidity,
:math:`q_l` is the suspended liquid water specific humidity, and
:math:`q_i` is the suspended ice water specific humidity.

As shown in the example NetCDF header, the SCM expects that the vertical
dimension is pressure levels (index 1 is the surface) and the time
dimension is in seconds. The initial conditions expected are the height
of the pressure levels in meters, and arrays representing vertical
columns of :math:`\theta_{il}` in K, :math:`q_t`, :math:`q_l`, and
:math:`q_i` in kg kg\ :math:`^{-1}`, :math:`u` and :math:`v` in m
s\ :math:`^{-1}`, turbulence kinetic energy in m\ :math:`^2`
s\ :math:`^{-2}`, and ozone mass mixing ratio in kg kg\ :math:`^{-1}`. 
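When writing a new case-generation script, the conversion formulas above can be sketched in Python as follows. This is a hedged illustration, not the repository's own script; the thermodynamic constants are typical textbook values (assumptions) and should be replaced with those used by the target model.

```python
# Sketch of the case-input conversion formulas. Constant values are
# illustrative assumptions, not those of any particular model.
L_V = 2.501e6  # latent heat of vaporization (J kg-1)
L_S = 2.834e6  # latent heat of sublimation (J kg-1)
C_P = 1004.0   # specific heat of dry air at constant pressure (J kg-1 K-1)

def theta_il(theta, temperature, q_l, q_i):
    """Ice-liquid water potential temperature (K)."""
    return theta - (theta / temperature) * ((L_V / C_P) * q_l + (L_S / C_P) * q_i)

def q_t(q_v, q_l, q_i):
    """Total water specific humidity (kg kg-1)."""
    return q_v + q_l + q_i
```

With no condensate (``q_l = q_i = 0``), ``theta_il`` reduces to ``theta`` and ``q_t`` to ``q_v``, which is a convenient sanity check for a new conversion script.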
+ +For forcing data, the SCM expects a time series of the following +variables: latitude and longitude in decimal degrees [in case the +column(s) is moving in time (e.g., Lagrangian column)], the surface +pressure (Pa) and surface temperature (K). If surface fluxes are +specified for the new case, one must also include a time series of the +kinematic surface sensible heat flux (K m s\ :math:`^{-1}`) and +kinematic surface latent heat flux (kg kg\ :math:`^{-1}` m +s\ :math:`^{-1}`). The following variables are expected as 2-dimensional +arrays (vertical levels first, time second): the geostrophic u (E-W) and +v (N-S) winds (m s\ :math:`^{-1}`), and the horizontal and vertical +advective tendencies of :math:`\theta_{il}` (K s\ :math:`^{-1}`) and +:math:`q_t` (kg kg\ :math:`^{-1}` s\ :math:`^{-1}`), the large scale +vertical velocity (m s\ :math:`^{-1}`), large scale pressure vertical +velocity (Pa s\ :math:`^{-1}`), the prescribed radiative heating rate (K +s\ :math:`^{-1}`), and profiles of u, v, T, :math:`\theta_{il}` and +:math:`q_t` to use for nudging. + +Although it is expected that all variables are in the NetCDF file, only +those that are used with the chosen forcing method are required to be +nonzero. 
For example, the following variables are required depending on +the values of and specified in the case configuration file: + +- :math:`=` + + - Not implemented yet + +- :math:`=` + + - geostrophic winds and large scale vertical velocity + +- :math:`=` + + - u and v nudging profiles + +- :math:`=` + + - horizontal and vertical advective tendencies of + :math:`\theta_{il}` and :math:`q_t` and prescribed radiative + heating (can be zero if radiation scheme is active) + +- :math:`=` + + - horizontal advective tendencies of :math:`\theta_{il}` and + :math:`q_t`, prescribed radiative heating (can be zero if + radiation scheme is active), and the large scale vertical pressure + velocity + +- :math:`=` + + - :math:`\theta_{il}` and :math:`q_t` nudging profiles and the large + scale vertical pressure velocity + +For the case configuration file, it is most efficient to copy an +existing file in and edit it to suit one’s case. Recall from subsection +`1.1.1 <#subsection: case config>`__ that this file is used to configure +the SCM framework parameters for a given case. Be sure to check that +model timing parameters such as the time step and output frequency are +appropriate for the physics suite being used. There is likely some +stability criterion that governs the maximum time step based on the +chosen parameterizations and number of vertical levels (grid spacing). +The parameter should match the name of the case input data file that was +configured for the case (without the file extension). The parameter +should be less than or equal to the length of the forcing data unless +the desired behavior of the simulation is to proceed with the last +specified forcing values after the length of the forcing data has been +surpassed. The initial date and time should fall within the forcing +period specified in the case input data file. 
If the case input data is +specified to a lower altitude than the vertical domain, the remainder of +the column will be filled in with values from a reference profile. There +is a tropical profile and mid-latitude summer profile provided, although +one may add more choices by adding a data file to and adding a parser +section to the subroutine in . Surface fluxes can either be specified in +the case input data file or calculated using a surface scheme using +surface properties. If surface fluxes are specified from data, set to +and specify for the surface over which the column resides. Otherwise, +specify a . In addition, one must specify a for each column. + +To control the forcing method, one must choose how the momentum and +scalar variable forcing are applied. The three methods of Randall and +Cripe (1999, JGR) have been implemented: “revealed forcing” where total +(horizontal :math:`+` vertical) advective tendencies are applied (type +1), “horizontal advective forcing” where horizontal advective tendencies +are applied and vertical advective tendencies are calculated from a +prescribed vertical velocity and the calculated (modeled) profiles (type +2), and “relaxation forcing” where nudging to observed profiles replaces +horizontal advective forcing combined with vertical advective forcing +from prescribed vertical velocity (type 3). If relaxation forcing is +chosen, a that represents the timescale over which the profile would +return to the nudging profiles must be specified. + +.. _`sec:lasso`: + +Using other LASSO cases +----------------------- + +In order to use other LASSO cases than the one provided, perform the +following steps: + +#. Access http://archive.arm.gov/lassobrowser and use the navigation on + the left to choose the dates for which you would like to run a SCM + simulation. Pay attention to the “Large Scale Forcing” tab where you + can choose how the large scale forcing was generated, with options + for ECMWF, MSDA, and VARANAL. 
All are potentially valid, and it is + likely worth exploring the differences among forcing methods. Click + on Submit to view a list of simulations for the selected criteria. + Choose from the simulations (higher skill scores are preferred) and + check the “Config Obs Model Tar” box to download the data. Once the + desired simulations have been checked, order the data (you may need + to create an ARM account to do so). + +#. Once the data is downloaded, decompress it. From the directory, copy + the files , , and into their own directory under . + +#. Modify to point to the input files listed above. Execute the script + in order to generate a case input file for the SCM (to be put in ): + + .. code:: bash + + ./lasso1_forcing_file_generator_gjf.py + +#. Create a new case configuration file (or copy and modify an existing + one) in . Be sure that the variable points to the newly + created/processed case input file from above. + +.. _`sec:UFSreplay`: + +Using UFS Output to Create SCM Cases: UFS-Replay +------------------------------------------------ + +.. _`subsection: pydepend`: + +Python Dependencies +~~~~~~~~~~~~~~~~~~~ + +The scripts here require a few python packages that may not be found by +default in all python installations. There is a YAML file with the +python environment needed to run the script in . To create and activate +this environment using conda: + +Create environment (only once): + +This will create the conda environment + +Activate environment: + +.. _`subsection: ufsicgenerator`: + +UFS_IC_generator.py +~~~~~~~~~~~~~~~~~~~ + +A script exists in to read in UFS history (output) files and their +initial conditions to generate a SCM case input data file, in DEPHY +format. + +.. code:: bash + + ./UFS_IC_generator.py [-h] (-l LOCATION LOCATION | -ij INDEX INDEX) -d + DATE -i IN_DIR -g GRID_DIR -f FORCING_DIR -n + CASE_NAME [-t {1,2,3,4,5,6,7}] [-a AREA] [-oc] + [-lam] [-sc] [-near] + +Mandatory arguments: + +#. 
OR : Either longitude and latitude in decimal degrees east and north
   of a location OR the UFS grid index with the tile number

   - -l 261.51 38.2 (two floating point values separated by a space)

   - -ij 8 49 (two integer values separated by a space; this option
     must also use the argument to specify the tile number)

#. YYYYMMDDHHMMSS: date corresponding to the UFS initial conditions

#. : path to the directory containing the UFS initial conditions

#. : path to the directory containing the UFS supergrid files (AKA "fix"
   directory)

#. : path to the directory containing the UFS history files

#. : name of case

Optional arguments:

#. : if one already knows the correct tile for the given longitude and
   latitude OR one is specifying the UFS grid index ( argument)

#. : area of grid cell in :math:`m^2` (if known or different from the
   value calculated from the supergrid file)

#. : flag if UFS initial conditions were generated using an older
   version of chgres (global_chgres); this might be the case for
   pre-2018 data

#. : flag to signal that the ICs and forcing are from a limited-area
   model run

#. : flag to create UFS reference file for comparison

#. : flag to indicate using the nearest UFS history file gridpoint

.. _`subsection: ufsforcingensemblegenerator`:

UFS_forcing_ensemble_generator.py
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

There is an additional script in to create UFS-replay case(s) starting
with output from UFS Weather Model (UWM) Regression Tests (RTs).

.. code:: bash

   UFS_forcing_ensemble_generator.py [-h] -d DIR -n CASE_NAME
      (-lonl LON_1 LON_2 -latl LAT_1 LAT_2 -nens NENSMEMBERS |
       -lons [LON_LIST] -lats [LAT_LIST])
      [-dt TIMESTEP] [-cres C_RES] [-sdf SUITE] [-sc] [-near]

Mandatory arguments:

#. : path to UFS Regression Test output

#. : name of cases

#. 
Either: (see examples below)

   - AND AND : longitude range, latitude range, and number of cases to
     create

   - AND : longitude and latitude of cases

Optional arguments:

#. : SCM timestep, in seconds

#. : UFS spatial resolution

#. : CCPP suite definition file to use for ensemble

#. : flag to create UFS reference file for comparison

#. : flag to indicate using the nearest UFS history file gridpoint

The following examples are run from within the directory to create SCM
cases starting with output from UFS Weather Model regression tests.

On the supported platforms Cheyenne (NCAR) and Hera (NOAA), there are
staged UWM RTs located at:

- 
- 

.. _`subsection: example1`:

Example 1: UFS-replay for single point
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

UFS regression test, , for a single point.

.. code:: bash

   ./UFS_forcing_ensemble_generator.py -d /glade/scratch/epicufsrt/GMTB/CCPP-SCM/UFS_RTs/control_c192/ -sc --C_RES 192 -dt 360 -n control_c192 -lons 300 -lats 34

Upon successful completion of the script, the command to run the case(s)
will print to the screen. For example,

.. code:: bash

   ./run_scm.py --npz_type gfs --file scm_ufsens_control_c192.py --timestep 360

The file is created in , where the SCM run script is to be executed.

.. _`subsection: example2`:

Example 2: UFS-replay for list of points
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

UFS regression test, , for multiple points.

.. code:: bash

   ./UFS_forcing_ensemble_generator.py -d /glade/scratch/epicufsrt/GMTB/CCPP-SCM/UFS_RTs/control_c384/ -sc --C_RES 384 -dt 225 -n control_c384 -lons 300 300 300 300 -lats 34 35 35 37

Upon successful completion of the script, the command to run the case(s)
will print to the screen. For example,

.. code:: bash

   ./run_scm.py --npz_type gfs --file scm_ufsens_control_c384.py --timestep 225

The file contains all of the cases created. 
Each case created will have the +naming convention , where the suffix is the case number from 0 to the +number of points provided. The contents of the file should look like: + +.. code:: bash + + run_list = [{"case": "control_c384_n000", "suite": "SCM_GFS_v16"}, + {"case": "control_c384_n001", "suite": "SCM_GFS_v16"}, + {"case": "control_c384_n002", "suite": "SCM_GFS_v16"}, + {"case": "control_c384_n003", "suite": "SCM_GFS_v16"}] + +.. _`subsection: example3`: + +Example 3: UFS-replay for an ensemble of points +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +UFS regression test, , for an ensemble (10) of randomly selected points +over a specified longitude (:math:`300-320^oW`) and latitude +(:math:`40-50^oN`) range + +But first, to use the test we need to rerun the regression test to +generate UFS history files with a denser and constant output interval. +First, in , change to , where is the UFS history file output frequency +(in hours), see `UFS Weather Model Users +Guide `__ +for more details. + +For the purposes of this example the test has already been rerun, but if +starting from your own UWM RTs, you can rerun the UWM regression test, +on Cheyenne for example, by running the following command in the RT +directory: + +Now the cases can be generated with the following command: + +.. code:: bash + + ./UFS_forcing_ensemble_generator.py -d /glade/scratch/epicufsrt/GMTB/CCPP-SCM/UFS_RTs/control_p8/ -sc --C_RES 96 -dt 720 -n control_p8 -lonl 300 320 -latl 40 50 -nens 10 -sdf SCM_GFS_v17_p8 + +Upon successful completion of the script, the command to run the case(s) +will print to the screen. For example, + +.. code:: bash + + ./run_scm.py --npz_type gfs --file scm_ufsens_control_p8.py --timestep 720 + +The file contains ten cases (n000-n009) to be run. The contents of the +file should look like: + +.. 
code:: bash + + run_list = [{"case": "control_p8_n000", "suite": "SCM_GFS_v17_p8"}, + {"case": "control_p8_n001", "suite": "SCM_GFS_v17_p8"}, + {"case": "control_p8_n002", "suite": "SCM_GFS_v17_p8"}, + {"case": "control_p8_n003", "suite": "SCM_GFS_v17_p8"}, + {"case": "control_p8_n004", "suite": "SCM_GFS_v17_p8"}, + {"case": "control_p8_n005", "suite": "SCM_GFS_v17_p8"}, + {"case": "control_p8_n006", "suite": "SCM_GFS_v17_p8"}, + {"case": "control_p8_n007", "suite": "SCM_GFS_v17_p8"}, + {"case": "control_p8_n008", "suite": "SCM_GFS_v17_p8"}, + {"case": "control_p8_n009", "suite": "SCM_GFS_v17_p8"}] diff --git a/scm/doc/TechGuide/chap_ccpp.rst b/scm/doc/TechGuide/chap_ccpp.rst new file mode 100644 index 000000000..0964b48f5 --- /dev/null +++ b/scm/doc/TechGuide/chap_ccpp.rst @@ -0,0 +1,385 @@ +.. _`chapter: ccpp_interface`: + +CCPP Interface +============== + +Chapter 6 of the CCPP v6 Technical Documentation +(https://ccpp-techdoc.readthedocs.io/en/v6.0.0/) provides a wealth of +information on the overall process of connecting a host model to the +CCPP framework for calling physics. This chapter describes the +particular implementation within this SCM, including how to set up, +initialize, call, and change a physics suite using the CCPP framework. + +Setting up a suite +------------------ + +Setting up a physics suite for use in the CCPP SCM with the CCPP +framework involves three steps: preparing data to be made available to +physics through the CCPP, running the script to reconcile SCM-provided +variables with physics-required variables, and preparing a suite +definition file. + +Preparing data from the SCM +~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +As described in sections 6.1 and 6.2 of the `CCPP Technical +Documentation `__ a host +model must allocate memory and provide metadata for variables that are +passed into and out of the schemes within the physics suite. 
As of this +release, in practice this means that a host model must do this for all +variables needed by all physics schemes that are expected to be used +with the host model. For this SCM, all variables needed by the physics +schemes are allocated and documented in the file and are contained +within the derived data type. This derived data type initializes its +component variables in a type-bound procedure. As mentioned in section +6.2 of the `CCPP Technical +Documentation `__, files +containing all required metadata was constructed for describing all +variables in the derived data type. These files are , and . Further, +exists to provide metadata for derived data type definitions and their +instances, which is needed by the ccpp-framework to properly reference +the data. The standard names of all variables in this table must match +with a corresponding variable within one or more of the physics schemes. +A list of all standard names used can be found in . + +Editing and running +~~~~~~~~~~~~~~~~~~~~ + +General instructions for configuring and running the script can be found +in chapter 8 of the `CCPP Technical +Documentation `__. The +script expects to be run with a host-model-dependent configuration file, +passed as argument . Within this configuration file are variables that +hold paths to the variable definition files (where metadata tables can +be found on the host model side), the scheme files (a list of paths to +all source files containing scheme entry points), the auto-generated +physics schemes makefile snippet, the auto-generated physics scheme caps +makefile snippet, and the directory where the auto-generated physics +caps should be written out to. As mentioned in section +`[section: compiling] <#section: compiling>`__, this script must be run +to reconcile data provided by the SCM with data required by the physics +schemes before compilation – this is done automatically by . 

Preparing a suite definition file
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The suite definition file is a text file read by the model at compile
time. It is used to specify the physical parameterization suite, and
includes information about the number of parameterization groupings,
which parameterizations are part of each group, the order in which the
parameterizations should be run, and whether subcycling will be used to
run any of the parameterizations with shorter timesteps.

In addition to the six or so major parameterization categories (such as
radiation, boundary layer, deep convection, resolved moist physics,
etc.), the suite definition file can also have an arbitrary number of
additional interstitial schemes in between the parameterizations to
prepare or postprocess data. In many models this interstitial code is
hidden from the model user; with the suite definition file, both the
physical parameterizations and the interstitial processing are listed
explicitly.

For this release, supported suite definition files used with this SCM
are found in and have default namelist, tracer configuration, and
timesteps attached in . For all of these suites, the physics schemes
have been organized into three groupings following how the physics are
called in the UFS Atmosphere model, although no code is executed in the
SCM time loop between execution of the grouped schemes. Several
“interstitial” schemes are included in the suite definition file to
execute code that previously was part of a hard-coded physics driver.
Some of these schemes may eventually be rolled into the schemes
themselves, improving portability.

Initializing/running a suite
----------------------------

The process for initializing and running a suite in this SCM is
described in sections
`[section: physics init] <#section: physics init>`__ and
`[section: time integration] <#section: time integration>`__,
respectively. 
A more general description of the process for performing +suite initialization and running can also be found in sections 6.4 and +6.5 of the `CCPP Technical +Documentation `__. + +Changing a suite +---------------- + +Replacing a scheme with another +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Prior to being able to swap a scheme within a suite, one must first add +a CCPP-compliant scheme to the pool of available schemes in the CCPP +physics repository. This process is described in chapter 2 of the `CCPP +Technical +Documentation `__. + +Once a CCPP-compliant scheme has been added to the CCPP physics +repository, the process for modifying an existing suite should take the +following steps into account: + +- Examine and compare the arguments of the scheme being replaced and + the replacement scheme. + + - Are there any new variables that the replacement scheme needs from + the host application? If so, these new variables must be added to + the host model cap. For the SCM, this involves adding a component + variable to the derived data type and a corresponding entry in the + metadata table. The new variables must also be allocated and + initialized in the type-bound procedure. + + - Do any of the new variables need to be calculated in an + interstitial scheme? If so, one must be written and made + CCPP-compliant itself. The `CCPP Technical + Documentation `__ + will help in this endeavor, and the process outlined in its + chapter 2 should be followed. + + - Do other schemes in the suite rely on output variables from the + scheme being replaced that are no longer being supplied by the + replacement scheme? Do these output variables need to be + derived/calculated in an interstitial scheme? If so, see the + previous bullet about adding one. + +- Examine existing interstitial schemes related to the scheme being + replaced. 
+ + - There may be scheme-specific interstitial schemes (needed for one + specific scheme) and/or type-generic interstitial schemes (those + that are called for all schemes of a given type, i.e. all PBL + schemes). Does one need to write analogous scheme-specific + interstitial schemes for the replacement? + + - Are the type-generic interstitial schemes relevant or do they need + to be modified? + +- Depending on the answers to the above considerations, edit the suite + definition file as necessary. Typically, this would involve finding + the elements associated with the scheme to be replaced and its + associated interstitial elements and simply replacing the scheme + names to reflect their replacements. See chapter 4 of the `CCPP + Technical + Documentation `__ for + further details. + +Modifying “groups” of parameterizations +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The concept of grouping physics in the suite definition file (currently +reflected in the elements) enables “groups” of parameterizations to be +called with other computation (perhaps related to the dycore, I/O, etc.) +in between. In the suite definition file included in this release, three +groups are specified, but currently no computation happens between calls +for these groups. However, one can edit the groups to suit the needs of +the host application. For example, if a subset of physics schemes needs +to be more tightly connected with the dynamics and called more +frequently, one could create a group consisting of that subset and place +a call in the appropriate place in the host application. The remainder +of the parameterizations groups could be called using calls in a +different part of the host application code. + +Subcycling parameterizations +~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The suite definition file allows subcycling of schemes, or calling a +subset of schemes at a smaller time step than others. The element in the +suite definition file controls this function. 
All schemes within such an +element are called times during one call. An example of this is found in +the suite definition file, where the surface schemes are executed twice +for each timestep (implementing a predictor/corrector paradigm). Note +that no time step information is included in the suite definition file. +**If subcycling is used for a set of parameterizations, the smaller time +step must be an input argument for those schemes. This is not handled +automatically by the ccpp-framework yet.** + +Adding variables +---------------- + +.. _adding_physics_only_variable: + +Adding a physics-only variable +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Suppose that one wants to add the variable to a scheme that spans the +depth of the column and that this variable is internal to physics, not +part of the SCM state or subject to external forcing. Here is how one +would do so: + +#. First, add the new variable to the derived data type definition in . + Within the definition, you’ll notice that there are nested derived + data types (which contain most of the variables needed by the physics + and are used for mainly legacy reasons) and several other + integers/reals/logicals. One could add the new variable to one of the + nested GFS derived data types if the variable neatly fits inside one + of them, but it is suggested to bypass the GFS derived data types and + add a variable directly to the type definition: + + .. code:: fortran + + real(kind=kind_phys), allocatable :: foo(:,:) + +#. Second, within the subroutine, add an allocate and initialization + statement. + + .. code:: fortran + + allocate(foo(n_columns, n_levels)) + physics%foo = 0.0 + + Note that even though foo only needs to have the vertical dimension, + it is also allocated with the dimension as the first dimension since + this model is intended to be used with multiple independent columns. 
+
+   Also, the initialization in this creation subroutine can be
+   overwritten by an initialization subroutine associated with a
+   particular scheme.
+
+#. At this point, these changes are enough to allocate the new variable
+   ( is called in the main subroutine of ), although this variable
+   cannot be used in a physics scheme yet. For that, you’ll need to add
+   an entry in the corresponding metadata file. See section 2.2 of the
+   `CCPP Technical
+   Documentation `__
+   for more information regarding the format.
+
+#. On the physics scheme side, there will also be a metadata file entry
+   for . For example, say that scheme uses . If is further initialized
+   in ’s subroutine, a metadata entry for must be found in the
+   corresponding section in the metadata file. If it is used in ’s run
+   subroutine, a metadata entry for foo must also appear in the metadata
+   file section for . The metadata entry on the physics scheme side has
+   the same format as the one on the host model side described above.
+   The standard name, rank, type, and kind must match the entry from the
+   host model table. Other attributes (local name, units (assuming that
+   an automatic conversion exists in the ccpp-framework), long_name,
+   intent) can differ. The local name corresponds to the name of the
+   variable used within the scheme subroutine, and the intent attribute
+   should reflect how the variable is actually used within the scheme.
+
+   Note: In addition to the metadata file, the argument list for the
+   scheme subroutine must include the new variable (i.e., must actually
+   be in the argument list for and be declared appropriately in regular
+   Fortran).
+
+If a variable is declared following these steps, it can be used in any
+CCPP-compliant physics scheme and it will retain its value from timestep
+to timestep. A variable will ONLY be zeroed out (either every timestep
+or periodically) if it is in the or data types. 
So, if one needs the new +variable to be ‘prognostic’, one would need to handle updating its value +within the scheme, something like: + +.. math:: \text{foo}^{t+1} = \text{foo}^t + \Delta t*\text{foo\_tendency} + +Technically, the host model can “see” foo between calls to physics +(since the host model allocated its memory at initialization), but it +will not be touching it. + +Adding a prognostic SCM variable +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The following instructions are valid for adding a passive, prognostic +tracer to the SCM. Throughout these instructions, the new tracer is +called ‘smoke’. + +#. Add a new tracer to the SCM state. In do the following: + + - Add an index for the new tracer in the definition. + + - Do the following in the subroutine: + + - Increment + + - Set = + + - Set = + + - Note: is initialized to zero in this subroutine already, so + there is no need to do so again. + +#. Initialize the new tracer to something other than zero (from an input + file). + + - Edit an existing input file (in ): add a field in the ‘initial’ + group of the NetCDF file(s) (with vertical dimension in pressure + coordinates) with an appropriate name in one (or all) of the input + NetCDF files and populate with whatever values are necessary to + initialize the new tracer. + + - Create a new input variable to read in the initialized values. In + : + + - Add a new input variable in + + .. code:: fortran + + real(kind=dp), allocatable :: input_smoke(:) + + - In , allocate and initialize the new variable to 0. + + - Read in the input values to initialize the new tracer. In : + + - Add a variable under the initial profile section: + + .. code:: fortran + + real(kind=dp), allocatable :: input_smoke(:) !< smoke profile (fraction) + + - Add the new input variable to the allocate statement. + + - Read the values in from the file: + + .. 
code:: fortran + + call check(NF90_INQ_VARID(grp_ncid,"smoke",varID)) + call check(NF90_GET_VAR(grp_ncid,varID,input_smoke)) + + - set = + + - Interpolate the input values to the model grid. Edit : + + - Add a loop over the columns to call that puts on grid levels in + + .. code:: fortran + + do i=1, scm_state%n_cols + call interpolate_to_grid_centers(scm_input%input_nlev, scm_input%input_pres, scm_input%input_smoke, scm_state%pres_l(i,1,:), & + scm_state%n_levels, scm_state%state_tracer(i,1,:,scm_state%smoke_index,1), last_index_init, 1) + end do + + - At this point, you have a new tracer initialized to values + specified in the input file on the model vertical grid, but it is + not connected to any physics or changed by any forcing. + +#. For these instructions, we’ll assume that the tracer is not subject + to any external forcing (e.g., horizontal advective forcing, sources, + sinks). If it is, further work is required to: + + - One needs to provide data on how tracer is forced in the input + file, similar to specifying its initial state, as above. + + - Create, allocate, and read in the new variable for forcing + (similar to above). + + - Add to (similar to above, but interpolates the forcing to the + model grid and model time). + + - Add statements to time loop to handle the first time step and + time-advancing. + + - Edit in . + +#. In order to connect the new tracer to the CCPP physics, perform steps + 1-4 in section `1.4.1 <#adding_physics_only_variable>`__ for adding a + physics variable. In addition, do the following in order to associate + the variable with variables used in the physics through a pointer: + + - Point the new physics variable to in . + +#. There may be additional steps depending on how the tracer is used in + the physics and how the physics scheme is integrated with the current + GFS physics suite. 
For example, the GFS physics has two tracer + arrays, one for holding tracer values before the physics timestep () + and one for holding tracer values that are updated during/after the + physics (). If the tracer needs to be part of these arrays, there are + a few additional steps to take. If you need help, please post on the + support forum at: + https://dtcenter.org/forum/ccpp-user-support/ccpp-single-column-model. diff --git a/scm/doc/TechGuide/chap_function.rst b/scm/doc/TechGuide/chap_function.rst new file mode 100644 index 000000000..8560a95b1 --- /dev/null +++ b/scm/doc/TechGuide/chap_function.rst @@ -0,0 +1,157 @@ +.. _`chapter: algorithm`: + +Algorithm +========= + +Algorithm Overview +------------------ + +Like most SCMs, the algorithm for the CCPP SCM is quite simple. In a +nutshell, the SCM code performs the following: + +- Read in an initial profile and the forcing data. + +- Create a vertical grid and interpolate the initial profile and + forcing data to it. + +- Initialize the physics suite. + +- Perform the time integration, applying forcing and calling the + physics suite each time step. + +- Output the state and physics data. + +In this chapter, it will briefly be described how each of these tasks is +performed. + +Reading input +------------- + +The following steps are performed at the beginning of program execution: + +#. Call in the module to read in the ` <#subsection: case config>`__ and + ` <#subsection: physics config>`__ namelists. This subroutine also + sets some variables within the derived type from the data that was + read. + +#. Call (or if using the DEPHY format) in the module to read in the + `case input data file <#subsection: case input>`__. This subroutine + also sets some variables within the derived type from the data that + was read. + +#. Call in the module to read in the reference profile data. This + subroutine also sets some variables within the derived type from the + data that was read. 
At this time, there is no “standard” format for
+   the reference profile data file. There is a statement within the
+   subroutine that reads in differently-formatted data. If adding a new
+   reference profile, a section that reads its data must be added to
+   this subroutine.
+
+Setting up vertical grid and interpolating input data
+-----------------------------------------------------
+
+The CCPP SCM uses pressure for the vertical coordinate (lowest index is
+the surface). The pressure levels are calculated using the surface
+pressure and coefficients (:math:`a_k` and :math:`b_k`), which are taken
+directly from FV3 code (). For vertical grid options, inspect for valid
+values of . The default vertical coordinate uses 127 levels and sets to
+the empty string. Alternatively, one can specify the (:math:`a_k` and
+:math:`b_k`) coefficients via an external file in the directory and pass
+it in to the SCM via the argument of the run script. If changing the
+number of vertical levels or the algorithm via the or run script
+arguments, be sure to check and that the vertical coordinate is as
+intended.
+
+After the vertical grid has been set up, the state variable profiles
+stored in the derived data type are interpolated from the input and
+reference profiles in the subroutine of the module.
+
+.. _`section: physics init`:
+
+Physics suite initialization
+----------------------------
+
+With the CCPP framework, initializing a physics suite is a 3-step
+process:
+
+#. Initialize variables needed for the suite initialization routine. For
+   suites originating from the GFS model, this involves setting some
+   values in a derived data type used in the initialization subroutine.
+   Call the suite initialization subroutine to perform suite
+   initialization tasks that are not already performed in the routines
+   of the CCPP-compliant schemes (or associated initialization stages
+   for groups or suites listed in the suite definition file). 
Note: As
+   of this release, this step will require another suite initialization
+   subroutine to be coded for a non-GFS-based suite to handle any
+   initialization that is not already performed within CCPP-compliant
+   scheme initialization routines.
+
+#. Associate the variables with the appropriate pointers in the derived
+   data type.
+
+#. Call with the derived data type as input. This call executes the
+   initialization stages of all schemes, groups, and suites that are
+   defined in the suite definition file.
+
+.. _`section: time integration`:
+
+Time integration
+----------------
+
+The CCPP SCM uses the simple forward Euler scheme for time-stepping.
+
+During each step of the time integration, the following sequence occurs:
+
+#. Update the elapsed model time.
+
+#. Calculate the current date and time given the initial date and time
+   and the elapsed time.
+
+#. Call the subroutine in the module to interpolate the forcing data in
+   space and time.
+
+#. Recalculate the pressure variables (pressure, Exner function,
+   geopotential) in case the surface pressure has changed.
+
+#. Call in the module. Within this subroutine:
+
+   - Call the appropriate subroutine from the module.
+
+   - For each column, call to call all physics schemes within the suite
+     (this assumes that all suite parts are called sequentially without
+     intervening code execution).
+
+#. Check to see if output should be written during the current time step
+   and call in the module if necessary.
+
+Writing output
+--------------
+
+Output is accomplished via writing to a NetCDF file. If not in the
+initial spin-up period, a subroutine is called to determine whether data
+needs to be added to the output file during every timestep. Variables
+can be written out as instantaneous or time-averaged, and there are 5
+output periods:
+
+#. one associated with how often instantaneous variables should be
+   written out (controlled by the run script variable).
+
+#. 
one associated with how often diagnostic variables (either
+   instantaneous or time-averaged) should be written out (controlled by
+   the run script variable)
+
+#. one associated with the shortwave radiation period (controlled by
+   variable in the physics namelist)
+
+#. one associated with the longwave radiation period (controlled by the
+   variable in the physics namelist)
+
+#. one associated with the minimum of the shortwave and longwave
+   radiation intervals (for writing output if any radiation is called)
+
+Further, which variables are output on each interval is controlled
+via the source file. Of course, any changes to this file must result in
+a recompilation to take effect. There are several subroutines for
+initializing the output file () and for appending to it () that are
+organized according to their membership in physics derived data types.
+See the source file to understand how to change output variables.
diff --git a/scm/doc/TechGuide/chap_hsd.rst b/scm/doc/TechGuide/chap_hsd.rst
new file mode 100644
index 000000000..2a4e5e1d9
--- /dev/null
+++ b/scm/doc/TechGuide/chap_hsd.rst
@@ -0,0 +1,344 @@
+.. _`chapter: Hierarchical_Physics_Development`:
+
+Hierarchical Physics Development
+================================
+
+Chapter 7 of the CCPP v6 Technical Documentation
+(https://ccpp-techdoc.readthedocs.io/en/v6.0.0/) provides an overview of
+the tools supported by the Single Column Model (SCM) to facilitate
+hierarchical system development (HSD).
+
+Background
+----------
+
+Developing and implementing a new physics parameterization for use in an
+operational setting requires extensive testing and evaluation. This is
+to ensure that new developments aren’t yielding unexpected results and
+that all computational considerations are being met. From a scientific
+perspective, this process should be incremental and hierarchical, i.e.
+using HSD, which follows a systems-engineering approach, i.e. 
initial +testing of simple idealized cases that focus on small elements (e.g., +physics schemes) of an Earth System Model (ESM) first in isolation, then +progressively connecting elements with increased coupling between ESM +components at the different HSD steps. HSD includes SCMs (including +individual elements within the SCM), Small-Domain, Limited-Area and +Regional Models, all the way up to complex fully-coupled ESMs with +components for atmosphere/chemistry/aerosols, ocean/waves/sea-ice, +land-hydrology/snow/land-ice, and biogeochemical cycles/ecosystems, a +subset of which (i.e. atmosphere+land and specified ocean conditions) +has traditionally addressed Numerical Weather Prediction (NWP) needs. +HSD is end-to-end in that it includes data ingest and quality control, +data assimilation, modeling, post-processing, and verification. The +requirements for advancing from one HSD step to the next are appropriate +metrics/benchmarks for ESM (or ESM components or elements) performance, +many of which are at the physical process level, plus the necessary +forcing data sets to drive and validate models. Datasets for use in +different HSD steps are obtained from measurements (e.g. field programs +and observational networks), ESM output, or idealized conditions (often +used to “stress-test” schemes/elements/system components, among many +other options). It is important to note that the HSD process is +concurrent and iterative, i.e. more complex HSD steps can provide +information to be used at simpler HSD steps, and vice versa. This also +includes understanding spatial and temporal dependencies in model +physics, and the need for consistency in those solutions in models +between higher-resolution/regional short-range, global +medium/extended-range, and subseasonal-to-seasonal time scales. 
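The single-column rung of this hierarchy reduces, in essence, to repeatedly applying physics tendencies to an isolated column state. The sketch below illustrates that idea in a few lines of Python; it is a toy illustration only (the function name `advance_column` and the relaxation "physics" are hypothetical, not part of the SCM source), mirroring the forward-Euler stepping the SCM uses:

```python
def advance_column(state, tendency, dt, n_steps):
    """Advance a single-column profile with forward-Euler steps.

    state    : list of per-level values (e.g., temperature in K)
    tendency : function returning d(state)/dt for the current state
    dt       : timestep (s)
    """
    for _ in range(n_steps):
        incr = tendency(state)
        # state^{t+1} = state^t + dt * tendency, one value per level
        state = [s + dt * t for s, t in zip(state, incr)]
    return state

# Toy "physics": relax a uniform 280 K column toward a linear
# reference profile with a 1-hour timescale (illustrative only).
nlev = 10
ref = [300.0 - 50.0 * k / (nlev - 1) for k in range(nlev)]
col = [280.0] * nlev
col = advance_column(col,
                     lambda s: [(r - x) / 3600.0 for r, x in zip(ref, s)],
                     dt=600.0, n_steps=6)
```

In a real SCM run, the tendency function is replaced by the full physics suite and the loop by the model time integration, but the structure of the update is the same.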
+
+The CCPP-SCM provides developers working within CCPP-compliant host
+models the ability to test their physics innovations without having to
+worry about the coupling to the dynamical core. This is a critical step
+in the model development hierarchy, providing insight into how an
+introduced physics change can modify the evolution of the internal
+physics state. However, there are still challenges, most notably the
+interactions between introduced changes and the other physics schemes in
+the suite.
+
+CCPP Suite Simulator
+--------------------
+
+Overview
+~~~~~~~~
+
+The CCPP Suite Simulator is a CCPP-compliant physics scheme that
+provides the ability to turn on/off physical processes in a Suite
+Definition File (SDF), using namelist options. This simulator
+‘piggybacks’ on an existing SDF, replacing physics tendencies with
+data-driven tendencies (`1.1 <#fig:CSS_tendency_schematic>`__).
+
+.. figure:: images/CSS_tendency_schematic.png
+   :name: fig:CSS_tendency_schematic
+   :width: 80.0%
+
+   Equation for internal physics state evolution for process-split
+   physics suite, where S is the prognostic state and D are simulated
+   data tendencies. Top) Standard Suite Definition File; Middle) Active
+   PBL physics with simulated tendencies for other schemes; Bottom) Active
+   PBL and radiation, with simulated tendencies for other
+   schemes.
+
+Process-split vs. Time-split Physics Process
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Process-split physics processes are schemes that share a common input
+state, whereas time-split processes use the state provided by the
+previous physics process. An SDF can be any combination of time-split
+and process-split schemes, as long as the appropriate interstitial
+schemes are created to couple the physics schemes.
+
+About the CCPP Suite Simulator
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The CCPP Suite Simulator (CSS) emulates the evolution of the internal
+physics state provided by the SDF. 
There are different deployments of
+the suite simulator, depending on the role(s) and order of the physical
+processes in the SDF being emulated (e.g. time-split vs.
+process-split). For instance, SDFs consisting of only
+process-split physics schemes can be handled simply by adding the
+simulator to the end of the SDF, since for process-split schemes the
+order is not critical to the evolution of the internal physics state. On
+the other hand, for SDFs that contain time-split processes, where the
+simulator is added is important for preserving the order of the internal
+state evolution.
+
+.. _`subsection: pydepend`:
+
+Python Dependencies
+~~~~~~~~~~~~~~~~~~~
+
+The scripts here require a few python packages that may not be found by
+default in all python installations. There is a YAML file with the
+python environment needed to run the script in . To create and activate
+this environment using conda:
+
+Create environment (only once):
+
+This will create the conda environment
+
+Activate environment:
+
+Enabling the CCPP Suite Simulator
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+To use the CSS in the CCPP-SCM, three modifications need to be made:
+
+#. Add CSS, and any interstitial schemes needed for coupling the CSS to
+   the host (e.g. SCM), to an existing CCPP SDF (or create a new SDF).
+
+#. Set in the GFS physics namelist.
+
+#. Modify, or create a new, namelist that has the options needed to
+   activate the CSS.
+
+Modifying the CCPP Suite Definition File
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+The SDF needs to be modified to include the CSS scheme, and any
+interstitial schemes needed for your host application. Here we will
+illustrate how to use the CSS within SCM (UFS) physics, for which all
+applications use a physics SDF with a mixture of process-split and
+time-split physics schemes. In general,
+
+- for SDFs that contain ONLY process-split schemes, the CSS can be
+  added to the end of the SDF for all configurations. 
In this case we + have the flexibility to switch “on/off” any combination of active + physics and simulated physics at runtime, via namelist, with the same + modified SDF. + +- when using SDFs that contain ONLY time-split schemes, the CSS needs + to be added before and after each scheme you want to switch “on/off”. + So one could add calls to the CSS between each process in the SDF to + obtain total flexibility, or just around the schemes they are + interested in. + +In the examples below we will demonstrate how to modify SDFs to use the +CSS for SCM (UFS) physics applications, +`1.2.9 <#section:Suite_with_Active_Radiation>`__ and +`1.2.10 <#section:Suite_with_Active_cldmp>`__. + +Namelist for the CCPP Suite Simulator +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The CSS has its own namelist, , that needs to be added to the physics +namelists used by the SCM. + +.. literalinclude:: css_nml.txt + :name: lst_css_nml_ex1 + :caption: Example namelist for CCPP Suite Simulator. + +- : Input file with simulated data tendencies (See + `1.2.8 <#section:Creating_Custom_Data_for_Simulator>`__ for how to + create input file from SCM output). + +- : Number of physical processes in the input data. + +- : Configuration for physical process XYZ. + + - 0 - Active scheme; 1 - Use data + + - 0 - Process-split scheme; 1 - Time-split scheme + + - Index for scheme order (1 - ) + +For example, in Listing `[lst_css_nml_ex1] <#lst_css_nml_ex1>`__, there +are two active schemes, longwave and shortwave radiation, and five +simulated schemes: PBL, gravity-wave drag, deep/shallow convection, and +cloud microphysics. The radiation, gravity-wave drag and PBL schemes are +all process-split, whereas convection and cloud microphysics are +time-split. + +.. 
_`section:Creating_Custom_Data_for_Simulator`:
+
+Creating Custom Data for Simulator
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Navigate to
+
+Provided with the SCM are scripts to generate data for the suite
+simulator using output from a previous SCM run. The first script, ,
+extracts the physics tendencies from a user-specified time interval,
+which are used for constant forcing in the suite simulator. The other
+script, , creates a two-dimensional forcing dataset. The suite simulator
+interpolates these forcings in time.
+
+#. Run the SCM twice using the TWPICE case with the and suites.
+
+   .. code:: bash

+      cd ccpp-scm/scm/bin
+      ./run_scm.py -c twpice -s SCM_GFS_v16
+      ./run_scm.py -c twpice -s SCM_GFS_v17_p8
+
+#. Create 2D forcing data for the CSS, using SCM output from the TWPICE
+   case with suite.
+
+   .. code:: bash
+
+      cd ccpp-scm/scm/etc/scripts/ccpp_suite_sim
+      ./create_2D_CSSdata.py --cases twpice --suites SCM_GFS_v16
+
+#. Create constant forcing data for the CSS, using SCM output, at
+   forecast time 3600s, from the TWPICE case with suite.
+
+   .. code:: bash
+
+      cd ccpp-scm/scm/etc/scripts/ccpp_suite_sim
+      ./create_1D_CSSdata.py --cases twpice --suites SCM_GFS_v17_p8 --time 3600
+
+The data file will be written to with the following format, .
+
+.. _`section:Suite_with_Active_Radiation`:
+
+Example 1: Suite with Active Radiation
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+For this example we will use the two-dimensional forcing data from
+`1.2.8 <#section:Creating_Custom_Data_for_Simulator>`__.
+
+First, we need to modify the SDF to include the CSS, and an additional
+interstitial scheme to couple to the GFS physics (see
+`1.2 <#fig:CSS_SDF_ex1>`__).
+
+.. figure:: images/SDF_changes_for_CSS_ex1.png
+   :name: fig:CSS_SDF_ex1
+   :width: 80.0%
+
+   Changes to GFS v16 physics SDF to include CCPP suite simulator for
+   radiation parameterization. All other parameterizations are replaced
+   by simulated data.
+
+Next, the physics namelist needs to be configured to:
+
+#. 
Add the data file, created in
+   `1.2.8 <#section:Creating_Custom_Data_for_Simulator>`__, to the
+   namelist.
+
+#. Turn “off” all schemes except the radiation (see Listing
+   `[lst_css_nml_ex1] <#lst_css_nml_ex1>`__)
+
+Finally, we rebuild the SCM with the modified SDFs to include the CSS,
+and run the SCM using the TWPICE case with the modified suite.
+
+.. code:: bash
+
+   cd ccpp-scm/scm/bin
+   cmake ../src -DCCPP_SUITES=SCM_GFS_v16
+   ./run_scm.py -c twpice -s SCM_GFS_v16
+
+.. _`section:Suite_with_Active_cldmp`:
+
+Example 2: Suite with Active Cloud Microphysics
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+For this example we will use the constant forcing data from
+`1.2.8 <#section:Creating_Custom_Data_for_Simulator>`__.
+
+First, we need to modify the SDF to include the CSS, and an additional
+interstitial scheme to couple to the GFS physics (see
+`1.3 <#fig:CSS_SDF_ex2>`__).
+
+.. figure:: images/SDF_changes_for_CSS_ex2.png
+   :name: fig:CSS_SDF_ex2
+   :width: 80.0%
+
+   Changes to GFS v17 Prototype 8 physics SDF to include CCPP suite
+   simulator for cloud microphysics parameterization. All other
+   parameterizations are replaced by simulated data.
+
+Next, the physics namelist needs to be configured to:
+
+#. Add the data file, created in
+   `1.2.8 <#section:Creating_Custom_Data_for_Simulator>`__, to the
+   namelist.
+
+#. Turn “off” all schemes except the cloud microphysics (see Listing
+   `[lst_css_nml_ex2] <#lst_css_nml_ex2>`__)
+
+.. literalinclude:: css_nml_ex2.txt
+   :name: lst_css_nml_ex2
+   :caption: Example namelist for CCPP Suite Simulator with active cloud microphysics.
+
+Finally, we rebuild the SCM with the modified SDFs to include the CSS,
+and run the SCM using the TWPICE case with the modified suite.
+
+.. code:: bash
+
+   cd ccpp-scm/scm/bin
+   cmake ../src -DCCPP_SUITES=SCM_GFS_v17_p8
+   ./run_scm.py -c twpice -s SCM_GFS_v17_p8
+
+.. _`section:plotting_tools`:
+
+Plotting tools
+~~~~~~~~~~~~~~
+
+Additionally, plotting scripts are provided in :
+
+#. .. 
code:: bash
+
+      ./plt_scmout_2d.py [-h] -n CASE_NAME -sdf SDF -nmls NMLS -vars VAR1 VAR2 VAR3
+
+   Mandatory arguments:
+
+   #. name of case
+
+   #. CCPP suite definition file
+
+   #. namelists, separated by a space
+
+   #. variables to plot, separated by a space
+
+#. .. code:: bash
+
+      ./plt_scmout_3d.py [-h] -n CASE_NAME -sdf SDF -nmls NMLS -vars VAR1 VAR2 VAR3 -time TIME
+
+   Mandatory arguments:
+
+   #. name of case
+
+   #. CCPP suite definition file
+
+   #. namelists, separated by a space
+
+   #. variables to plot, separated by a space
+
+   #. time to plot, in seconds
diff --git a/scm/doc/TechGuide/chap_intro.rst b/scm/doc/TechGuide/chap_intro.rst
new file mode 100644
index 000000000..f99913eed
--- /dev/null
+++ b/scm/doc/TechGuide/chap_intro.rst
@@ -0,0 +1,116 @@
+.. _`chapter: introduction`:
+
+Introduction
+============
+
+A single column model (SCM) can be a valuable tool for diagnosing the
+performance of a physics suite, from validating that schemes have been
+integrated into a suite correctly to deep dives into how physical
+processes are being represented by the approximating code. This SCM has
+the advantage of working with the Common Community Physics Package
+(CCPP), a library of physical parameterizations for atmospheric
+numerical models and the associated framework for connecting potentially
+any atmospheric model to physics suites constructed from its member
+parameterizations. In fact, this SCM serves as perhaps the simplest
+example for using the CCPP and its framework in an atmospheric model.
+This version contains all parameterizations of NOAA’s evolved
+operational GFS v16 suite (implemented in 2021), plus additional
+developmental schemes. The schemes are grouped in six supported suites
+described in detail in the `CCPP Scientific
+Documentation `__
+(GFS_v16, GFS_v17p8, RAP, HRRR, RRFS_v1beta, and WoFS_v0).
+
+This document serves as both the User and Technical Guides for this
+model. 
It contains a Quick Start Guide with instructions for obtaining
+the code, compiling, and running a sample test case, an explanation for
+what is included in the repository, a brief description of the operation
+of the model, a description of how cases are set up and run, and
+finally, an explanation for how the model interfaces with physics
+through the CCPP infrastructure.
+
+| Please refer to the release web page for further documentation and
+  user notes:
+| https://dtcenter.org/community-code/common-community-physics-package-ccpp/download
+
+Version Notes
+-------------
+
+The CCPP SCM v6.0.0 contains the following major and minor changes since
+v5.0.
+
+Major
+
+- Inclusion of regression testing functionality
+
+- Combine single- and multi-run capabilities into one script
+
+Minor
+
+- Add RUC LSM support
+
+- Add the GFS_v17p8, HRRR, RRFS_v1beta, and WoFS_v0 suites
+
+- Update the vertical coordinate code to better match latest FV3
+  vertical coordinate code
+
+- Simplify the case configuration namelists
+
+- Add greater flexibility for output location (outside of bin
+  directory)
+
+Limitations
+~~~~~~~~~~~
+
+This release bundle has some known limitations:
+
+- In the output file, temperature tendency variables all mistakenly
+  have the same description, although their variable names are correct.
+  This has been fixed in the development code.
+
+- Using the RRFS_v1beta, HRRR, and WoFS_v0 suites for cases where deep
+  convection is expected to be active will likely produce
+  strange/unreliable results, unless the forcing has been modified to
+  account for the deep convection. This is because forcing for existing
+  cases assumes a horizontal scale for which deep convection is
+  subgrid-scale and is expected to be parameterized. The suites without
+  convection are intended for use with regional UFS simulations with
+  horizontal scale small enough not to need a deep convection
+  parameterization active, and they do not contain a deep convective
+  scheme. 
Nevertheless, these suites are included with the SCM as-is
+  for research purposes.
+
+- The provided cases over land points cannot use an LSM at this time
+  due to the lack of initialization data for the LSMs. Therefore, for
+  the provided cases over land points (ARM_SGP_summer_1997\_\* and
+  LASSO\_\*, where sfc_type = 1 is set in the case configuration file),
+  prescribed surface fluxes must be used:
+
+  - surface sensible and latent heat fluxes must be provided in the
+    case data file
+
+  - sfc_flux_spec must be set to true in the case configuration file
+
+  - the surface roughness length in cm must be set in the case
+    configuration file
+
+  - the suite definition file used (physics_suite variable in the case
+    configuration file) must have been modified to use prescribed
+    surface fluxes rather than an LSM.
+
+  - NOTE: If one can develop appropriate initial conditions for the
+    LSMs for the supplied cases over land points, however, there should
+    be no technical reason why they cannot be used with LSMs.
+
+- As of this release, using the SCM over a land point with an LSM is
+  possible through the use of UFS initial conditions (see section
+  `[sec:UFS ICs] <#sec:UFS ICs>`__). However, advective forcing terms
+  are unavailable as of this release, so only short integrations using
+  this configuration should be employed. Using dynamical tendencies
+  (advective forcing terms) from the UFS will be part of a future
+  release.
+
+- There are several capabilities of the developmental code that have
+  not been tested sufficiently to be considered part of the supported
+  release. Those include additional parameterizations. Users who want
+  to use experimental capabilities should refer to Subsection
+  `[section: development_code] <#section: development_code>`__.
diff --git a/scm/doc/TechGuide/chap_quick.rst b/scm/doc/TechGuide/chap_quick.rst
new file mode 100644
index 000000000..7a13df7d8
--- /dev/null
+++ b/scm/doc/TechGuide/chap_quick.rst
@@ -0,0 +1,892 @@
+.. 
_`chapter: quick`: + +Quick Start Guide +================= + +This chapter provides instructions for obtaining and compiling the CCPP +SCM. The SCM code calls CCPP-compliant physics schemes through the CCPP +framework code. As such, it requires the CCPP framework code and physics +code, both of which are included as submodules within the SCM code. This +package can be considered a simple example for an atmospheric model to +interact with physics through the CCPP. + +Alternatively, if one doesn’t have access or care to set up a machine +with the appropriate system requirements but has a working Docker +installation, it is possible to create and use a Docker container with a +pre-configured computing environment with a pre-compiled model. This is +also an avenue for running this software with a Windows PC. See section +`1.5 <#docker>`__ for more information. + +.. _obtaining_code: + +Obtaining Code +-------------- + +The source code for the CCPP and SCM is provided through GitHub.com. +This release branch contains the tested and supported version for +general use, while a development branch is less stable, yet contains the +latest developer code. Instructions for using either option are +discussed here. + +Release Code +~~~~~~~~~~~~ + +Clone the source using + +.. code:: bash + + git clone --recursive -b v6.0.0 https://github.com/NCAR/ccpp-scm + +Recall that the option in this command clones the main ccpp-scm +repository and all subrepositories (ccpp-physics and ccpp-framework). +Using this option, there is no need to execute and . + +The CCPP framework can be found in the ccpp/framework subdirectory at +this level. The CCPP physics parameterizations can be found in the +ccpp/physics subdirectory. + +.. 
_`section: development_code`: + +Development Code +~~~~~~~~~~~~~~~~ + +If you would like to contribute as a developer to this project, please +see (in addition to the rest of this guide) the scientific and technical +documentation included with this release: + +https://dtcenter.org/community-code/common-community-physics-package-ccpp/documentation + +There you will find links to all of the documentation pertinent to +developers. + +For working with the development branches (stability not guaranteed), +check out the branch of the repository: + +.. code:: bash + + git clone --recursive -b main https://github.com/NCAR/ccpp-scm + +By using the option, it guarantees that you are checking out the commits +of ccpp-physics and ccpp-framework that were tested with the latest +commit of the SCM main branch. You can always retrieve the commits of +the submodules that were intended to be used with a given commit of the +SCM by doing the following from the top level SCM directory: + +.. code:: bash + + git submodule update --init --recursive + +You can try to use the latest commits of the ccpp-physics and +ccpp-framework submodules if you wish, but this may not have been tested +(i.e. SCM development may lag ccpp-physics and/or ccpp-framework +development). To do so: + +#. Navigate to the ccpp-physics directory. + + .. code:: bash + + cd ccpp-scm/ccpp/physics + +#. Check out main. + + .. code:: bash + + git checkout main + +#. Pull down the latest changes just to be sure. + + .. code:: bash + + git pull + +#. Do the same for ccpp-framework + + .. code:: bash + + cd ../framework + git checkout main + git pull + +#. Change back to the main directory for following the instructions in + section `1.3 <#section: compiling>`__ assuming system requirements in + section `1.2 <#section: systemrequirements>`__ are met. + + .. code:: bash + + cd ../.. + +.. 
_`section: systemrequirements`: + +System Requirements, Libraries, and Tools +----------------------------------------- + +The source code for the SCM and CCPP components is in the form of +programs written in FORTRAN, FORTRAN 90, and C. In addition, the I/O +relies on the NetCDF libraries. Beyond the standard scripts, the build +system relies on use of the Python scripting language, along with cmake, +GNU make and date. + +The following software stacks have been tested with this code. Other +versions of various components will likely still work, however. + +- gfortran 12.1.0, gcc 12.1.0, cmake 3.23.2, NetCDF 4.7.4, Python + 3.9.12 + +- GNU compilers 10.1.0, cmake 3.16.4, NetCDF 4.8.1, Python 3.7.12 + +- GNU compilers 11.1.0, cmake 3.18.2, NetCDF 4.8.1, Python 3.8.5 + +- Intel compilers 2022.0.2, cmake 3.20.1, NetCDF 4.7.4, Python 3.7.11 + +- Intel compilers 2022.1.0, cmake 3.22.0, NetCDF 4.8.1, Python 3.7.12 + +Because these tools are typically the purview of system administrators +to install and maintain, they are considered part of the basic system +requirements. The Unified Forecast System (UFS) Short-Range Weather +Application release v1.0.0 of March 2021 provides software packages and +detailed instructions to install these prerequisites and the hpc-stack +on supported platforms (see +section `1.2.3 <#section: setup_supported_platforms>`__). + +Further, there are several utility libraries as part of the hpc-stack +package that must be installed with environment variables pointing to +their locations prior to building the SCM. + +- bacio - Binary I/O Library + +- sp - Spectral Transformation Library + +- w3emc - GRIB decoder and encoder library + +The following environment variables are used by the build system to +properly link these libraries: , , and . Computational platforms in +which these libraries are prebuilt and installed in a central location +are referred to as preconfigured platforms. 
Examples of preconfigured +platforms are most NOAA high-performance computing machines (using the +Intel compiler) and the NCAR Cheyenne system (using the Intel and GNU +compilers). The machine setup scripts mentioned in section +`1.3 <#section: compiling>`__ load these libraries (which are identical +to those used by the UFS Short and Medium Range Weather Applications on +those machines) and set these environment variables for the user +automatically. For installing the libraries and its prerequisites on +supported platforms, existing UFS packages can be used (see +section `1.2.3 <#section: setup_supported_platforms>`__). + +Compilers +~~~~~~~~~ + +The CCPP and SCM have been tested on a variety of computing platforms. +Currently the CCPP system is actively supported on Linux and MacOS +computing platforms using the Intel or GNU Fortran compilers. Windows +users have a path to use this software through a Docker container that +uses Linux internally (see section `1.5 <#docker>`__). Please use +compiler versions listed in the previous section as unforeseen build +issues may occur when using older versions. Typically the best results +come from using the most recent version of a compiler. If you have +problems with compilers, please check the “Known Issues” section of the +release website +(https://dtcenter.org/community-code/common-community-physics-package-ccpp/download). + +.. _`section: use_preconfigured_platforms`: + +Using Existing Libraries on Preconfigured Platforms +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Because the SCM can be built using the so-called `"spack-stack" +libraries `__ +maintained for the UFS Weather Model effort, there are many platforms +where the SCM can be built using those existing libraries. This can be +done by loading provided modules in the directory (must be done from the +top-level "ccpp-scm" directory; otherwise the command should point to +the corresponding absolute path): + +.. 
code:: sh
+
+   module purge
+   module use scm/etc/modules
+   module load [machine]_[compiler]
+
+View the contents of the scm/etc/modules directory to see if your
+machine/compiler combination is supported. As of this writing,
+modulefiles are provided for Intel and GNU compilers on the NCAR machine
+Derecho, the NOAA machines Hera and Jet, and the NOAA/MSU machine Orion.
+Loading these modules will set up all the needed compilers, libraries,
+and other programs needed for building, as well as the Python libraries
+needed for both building and running the SCM.
+
+.. _`section: setup_supported_platforms`:
+
+Installing Libraries on Non-preconfigured Platforms
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+For users on supported platforms such as generic Linux or macOS systems
+that have not been preconfigured, the hpc-stack project is suggested for
+installing prerequisite libraries. Visit
+https://github.com/NOAA-EMC/hpc-stack for instructions on installing
+prerequisite libraries in their docs directory. UFS users who already
+installed libraries via the hpc-stack package only need to set the
+compiler (), NetCDF (), and , and (, , ) environment variables to point
+to their installation paths in order to compile the SCM.
+
+The SCM uses only a small part of the UFS hpc-stack package and has
+fewer prerequisites (i.e. no or needed). Users who are not planning to
+use the UFS can install only NetCDF/NetCDF-Fortran manually or using the
+software package manager (, , ).
+
+The Python environment must provide the f90nml module for the SCM
+scripts to function. Users can test if f90nml is installed using this
+command in the shell:
+
+::
+
+   python -c "import f90nml"
+
+If f90nml is installed, this command will succeed silently; otherwise an
+error will be printed to the screen. 
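For orientation, a Fortran namelist of the kind the SCM case configuration files use is a plain-text set of ``&group ... /`` records, which f90nml exposes as nested dictionaries. The sketch below is a toy, standard-library-only parser for a made-up ``case_config`` group; the group name, variable names, and values here are illustrative, not taken from an actual SCM case file, and the SCM scripts themselves rely on f90nml rather than anything like this.

```python
import re

# A made-up namelist in the general format of an SCM case configuration file.
# All names and values below are illustrative only.
EXAMPLE = """
&case_config
  case_name = 'twpice'
  runtime = 99720
  sfc_flux_spec = .true.
/
"""

def parse_namelist(text):
    """Toy parser for flat Fortran namelists (f90nml handles the real cases)."""
    groups = {}
    # Each group starts with '&name' and ends with a '/' on its own line.
    for name, body in re.findall(r"&(\w+)(.*?)^\s*/", text, re.S | re.M):
        entries = {}
        for key, raw in re.findall(r"(\w+)\s*=\s*(\S+)", body):
            if raw[0] in "'\"":                 # Fortran character value
                entries[key] = raw.strip("'\"")
            elif raw in (".true.", ".false."):  # Fortran logical value
                entries[key] = raw == ".true."
            else:                               # assume a numeric value
                entries[key] = float(raw) if "." in raw else int(raw)
        groups[name] = entries
    return groups

print(parse_namelist(EXAMPLE)["case_config"])
```

f90nml returns an analogous nested mapping (``nml["case_config"]["case_name"]``), which is what the run scripts read and modify.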
To install the f90nml (v0.19) Python module, use the
+install method preferred for your Python environment (one of the
+following):
+
+- ::
+
+     easy_install f90nml==0.19
+
+- ::
+
+     pip install f90nml==0.19
+
+- ::
+
+     conda install f90nml=0.19
+
+or perform the following steps to install it manually from source:
+
+::
+
+   cd /directory/with/write/privileges
+   git clone -b v0.19 https://github.com/marshallward/f90nml
+   cd f90nml
+   python setup.py install [--prefix=/my/install/directory or --user]
+
+The directory must exist and its subdirectory (or instead of , depending
+on the system) must be in the environment variable.
+
+.. _`section: compiling`:
+
+Compiling SCM with CCPP
+-----------------------
+
+The first step in compiling the CCPP and SCM is to properly set up your
+user environment as described in
+sections `1.2.2 <#section: use_preconfigured_platforms>`__
+and `1.2.3 <#section: setup_supported_platforms>`__. The second step is
+to download the lookup tables and other large datasets (large binaries,
+:math:`<`\ 1 GB) needed by the physics schemes and place them in the
+correct directory: From the top-level code directory ( by default),
+execute the following scripts:
+
+.. code:: bash
+
+   ./contrib/get_all_static_data.sh
+   ./contrib/get_thompson_tables.sh
+
+If the download step fails, make sure that your system’s firewall does
+not block access to GitHub. If it does, download the files , , , from
+the GitHub release website using your browser and manually extract their
+contents in the directory . Similarly, do the same for and and extract
+to .
+
+Following this step, the top-level build system will use cmake to query
+system parameters, execute the CCPP prebuild script to match the physics
+variables (between what the host model – SCM – can provide and what is
+needed by physics schemes in the CCPP for the chosen suites), and build
+the physics caps needed to use them. Finally, make is used to compile
+the components.
+
+#. 
From the top-level code directory ( by default), change directory to + the top-level SCM directory. + + .. code:: bash + + cd scm + +#. Make a build directory and change into it. + + .. code:: bash + + mkdir bin && cd bin + +#. Invoke on the source code to build using one of the options below. + This step is used to identify for which suites the ccpp-framework + will build caps and which suites can be run in the SCM without + recompiling. + + - Default mode + + .. code:: bash + + cmake ../src + + By default, this option uses all supported suites. The list of + supported suites is controlled by . + + - All suites mode + + .. code:: bash + + cmake -DCCPP_SUITES=ALL ../src + + All suites in , regardless of whether they’re supported, will be + used. This list is typically longer for the development version of + the code than for releases. + + - Selected suites mode + + .. code:: bash + + cmake -DCCPP_SUITES=SCM_GFS_v16,SCM_RAP ../src + + This only compiles the listed subset of suites (which should still + have a corresponding entry in + + - The statements above can be modified with the following options + (put before ): + + - Use threading with openmp (not for macOS with clang+gfortran) + + .. code:: bash + + -DOPENMP=ON + + - Debug mode + + .. code:: bash + + -DCMAKE_BUILD_TYPE=Debug + + - One can also save the output of this step to a log file: + + .. code:: bash + + cmake [-DCMAKE_BUILD_TYPE ...] ../src 2>&1 | tee log.cmake + + CMake automatically runs the CCPP prebuild script to match required + physics variables with those available from the dycore (SCM) and to + generate physics caps and makefile segments. It generates software + caps for each physics group defined in the supplied Suite Definition + Files (SDFs) and generates a static library that becomes part of the + SCM executable. + + If necessary, the CCPP prebuild script can be executed manually from + the top level directory (). The basic syntax is + + .. 
code:: bash + + ./ccpp/framework/scripts/ccpp_prebuild.py --config=./ccpp/config/ccpp_prebuild_config.py --suites=SCM_GFS_v16,SCM_RAP[...] --builddir=./scm/bin [--debug] + + where the argument supplied via the variable is a comma-separated + list of suite names that exist in the directory. Note that suite + names are the suite definition filenames minus the prefix and suffix. + +#. Compile. Add to obtain more information on the build process. + + .. code:: bash + + make + + - One may also use more threads for compilation and/or save the + output of the compilation to a log file: + + .. code:: bash + + make -j4 2>&1 | tee log.make + +The resulting executable may be found at (Full path of ). + +Although is not currently implemented, an out-of-source build is used, +so all that is required to clean the build directory is (from the +directory) + +.. code:: bash + + pwd #confirm that you are in the ccpp-scm/scm/bin directory before deleting files + rm -rfd * + +Note: This command can be dangerous (deletes files without confirming), +so make sure that you’re in the right directory before executing! + +If you encounter errors, please capture a log file from all of the +steps, and start a thread on the support forum at: +https://dtcenter.org/forum/ccpp-user-support/ccpp-single-column-model + +Run the SCM with a supplied case +-------------------------------- + +There are several test cases provided with this version of the SCM. For +all cases, the SCM will go through the time steps, applying forcing and +calling the physics defined in the chosen suite definition file using +physics configuration options from an associated namelist. The model is +executed through a Python run script that is pre-staged into the +directory: . It can be used to run one integration or several +integrations serially, depending on the command line arguments supplied. + +.. 
_`subsection: singlerunscript`: + +Run Script Usage +~~~~~~~~~~~~~~~~ + +Running a case requires four pieces of information: the case to run +(consisting of initial conditions, geolocation, forcing data, etc.), the +physics suite to use (through a CCPP suite definition file), a physics +namelist (that specifies configurable physics options to use), and a +tracer configuration file. As discussed in chapter +`[chapter: cases] <#chapter: cases>`__, cases are set up via their own +namelists in . A default physics suite is provided as a user-editable +variable in the script and default namelists and tracer configurations +are associated with each physics suite (through ), so, technically, one +must only specify a case to run with the SCM when running just one +integration. For running multiple integrations at once, one need only +specify one argument () which runs through all permutations of supported +suites from and cases from . The run script’s options are described +below where option abbreviations are included in brackets. + +- + + - **This or the option are the minimum required arguments.** The + case should correspond to the name of a case in (without the + extension). + +- + + - The suite should correspond to the name of a suite in (without the + ) extension that was supplied in the or step. + +- + + - The namelist should correspond to the name of a file in (WITH the + extension). If this argument is omitted, the default namelist for + the given suite in will be used. + +- + + - The tracers file should correspond to the name of a file in (WITH + the extension). If this argument is omitted, the default tracer + configuration for the given suite in will be used. + +- + + - **This or the option are the minimum required arguments.** When + used alone, this option runs through all permutations of supported + suites from and cases from . When used in conjunction with the + option, only the runs configured in the file will be run. 
+ +- + + - This option may be used in conjunction with the argument. It + specifies a path and filename to a python file where multiple runs + are configured. + +- + + - Use this to run the executable through the debugger (if it is + installed on the system). + +- + + - Use this argument when running in a docker container in order to + successfully mount a volume between the host machine and the + Docker container instance and to share the output and plots with + the host machine. + +- + + - Use this to override the runtime provided in the case + configuration namelist. + +- + + - Use this to override the runtime provided in the case + configuration namelist by multiplying the runtime by the given + value. This is used, for example, in regression testing to reduce + total runtimes. + +- + + - Use this to change the number of vertical levels. + +- + + - Use this to change the type of FV3 vertical grid to produce (see + for valid values). + +- + + - Use this to specify the path/filename of a file containing the a_k + and b_k coefficients for the vertical grid generation code to use. + +- + + - Use this to specify the path to the build directory. + +- + + - Use this to specify the path to the run directory. + +- + + - Use this to specify the path to the directory containing the case + data file (useful for using the DEPHY case repository). + +- + + - Use this to specify the period of writing instantaneous output in + timesteps (if different than the default specified in the script). + +- + + - Use this to specify the period of writing instantaneous and + time-averaged diagnostic output in timesteps (if different than + the default specified in the script). + +- + + - Use this to specify the timestep to use (if different than the + default specified in ). + +- + + - Use this option to see additional debugging output from the run + script and screen output from the executable. + +When invoking the run script, the only required argument is the name of +the case to run. 
The case name used must match one of the case +configuration files located in (*without the .nml extension!*). If +specifying a suite other than the default, the suite name used must +match the value of the suite name in one of the suite definition files +located in (Note: not the filename of the suite definition file). As +part of the sixth CCPP release, the following suite names are valid: + +#. SCM_GFS_v16 + +#. SCM_GFS_v17p8 + +#. SCM_RAP + +#. SCM_HRRR + +#. SCM_RRFS_v1beta + +#. SCM_WoFS_v0 + +Note that using the Thompson microphysics scheme requires the +computation of look-up tables during its initialization phase. As of the +release, this process has been prohibitively slow with this model, so it +is HIGHLY suggested that these look-up tables are downloaded and staged +to use this scheme as described in +section `1.3 <#section: compiling>`__. The issue appears to be +machine/compiler-specific, so you may be able to produce the tables with +the SCM, especially when invoking with the option. + +Also note that some cases require specified surface fluxes. Special +suite definition files that correspond to the suites listed above have +been created and use the decoration. It is not necessary to specify this +filename decoration when specifying the suite name. If the variable in +the configuration file of the case being run is set to , the run script +will automatically use the special suite definition file that +corresponds to the chosen suite from the list above. + +If specifying a namelist other than the default, the value must be an +entire filename that exists in . Caution should be exercised when +modifying physics namelists since some redundancy between flags to +control some physics parameterizations and scheme entries in the CCPP +suite definition files currently exists. Values of numerical parameters +are typically OK to change without fear of inconsistencies. 
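To make the invocations described above concrete, here is a short sketch using only options that appear elsewhere in this guide (``twpice`` is one of the supplied cases; ``-c`` selects a case, ``-m`` runs all permutations of supported suites and cases, and ``-d`` is the Docker flag discussed later):

```shell
# Run from the build directory (ccpp-scm/scm/bin by default).

# Single integration: the case is the only required argument; the
# default suite, namelist, and tracer configuration are used.
./run_scm.py -c twpice

# All permutations of supported suites and cases:
./run_scm.py -m

# The same single run inside a Docker container (-d mounts the output):
./run_scm.py -c twpice -d
```

Suite, namelist, and tracer selections are added on top of these forms with the corresponding options described in the list above.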
If +specifying a tracer configuration other than the default, the value must +be an entire filename that exists in . The tracers that are used should +match what the physics suite expects, lest a runtime error will result. +Most of the tracers are dependent on the microphysics scheme used within +the suite. The tracer names that are supported as of this release are +given by the following list. Note that running without , , and may +result in a runtime error in all supported suites. + +#. sphum + +#. o3mr + +#. liq_wat + +#. ice_wat + +#. rainwat + +#. snowwat + +#. graupel + +#. hailwat + +#. cld_amt + +#. water_nc + +#. ice_nc + +#. rain_nc + +#. snow_nc + +#. graupel_nc + +#. hail_nc + +#. graupel_vol + +#. hail_vol + +#. ccn_nc + +#. sgs_tke + +#. liq_aero + +#. ice_aero + +#. q_rimef + +A NetCDF output file is generated in an output directory located named +with the case and suite within the run directory. If using a Docker +container, all output is copied to the directory in container space for +volume-mounting purposes. Any standard NetCDF file viewing or analysis +tools may be used to examine the output file (ncdump, ncview, NCL, etc). + +Batch Run Script +~~~~~~~~~~~~~~~~ + +If using the model on HPC resources and significant amounts of processor +time is anticipated for the experiments, it will likely be necessary to +submit a job through the HPC’s batch system. An example script has been +included in the repository for running the model on Hera’s batch system +(SLURM). It is located in . Edit the , , etc. to suit your needs and +copy to the directory. The case name to be run is included in the +variable. To use, invoke + +.. code:: bash + + ./scm_slurm_example.py + +from the directory. + +Additional details regarding the SCM may be found in the remainder of +this guide. More information on the CCPP can be found in the CCPP +Technical Documentation available at +https://ccpp-techdoc.readthedocs.io/en/v6.0.0/. + +.. 
_docker: + +Creating and Using a Docker Container with SCM and CCPP +------------------------------------------------------- + +In order to run a precompiled version of the CCPP SCM in a container, +Docker will need to be available on your machine. Please visit +https://www.docker.com to download and install the version compatible +with your system. Docker frequently releases updates to the software; it +is recommended to apply all available updates. NOTE: In order to install +Docker on your machine, you will be required to have root access +privileges. More information about getting started can be found at +https://docs.docker.com/get-started + +The following tips were acquired during a recent installation of Docker +on a machine with Windows 10 Home Edition. Further help should be +obtained from your system administrator or, lacking other resources, an +internet search. + +- Windows 10 Home Edition does not support Docker Desktop due to lack + of “Hyper-V” support, but does work with Docker Toolbox. See the + installation guide + (https://docs.docker.com/toolbox/toolbox_install_windows/). + +- You may need to turn on your CPU’s hardware virtualization capability + through your system’s BIOS. + +- After a successful installation of Docker Toolbox, starting with + Docker Quickstart may result in the following error even with + virtualization correctly enabled: . We were able to bypass this error + by opening a bash terminal installed with Docker Toolbox, navigating + to the directory where it was installed, and executing the following + command: + + .. code:: bash + + docker-machine create default --virtualbox-no-vtx-check + +Building the Docker image +~~~~~~~~~~~~~~~~~~~~~~~~~ + +The Dockerfile builds CCPP SCM v6.0.0 from source using the GNU +compiler. A number of required codes are built and installed via the +DTC-supported common community container. 
For reference, the common
+community container repository can be accessed here:
+https://github.com/NCAR/Common-Community-Container.
+
+The CCPP SCM has a number of system requirements and necessary libraries
+and tools. Below is a list, including versions, used to create the
+GNU-based Docker image:
+
+- gfortran - 9.3
+
+- gcc - 9.3
+
+- cmake - 3.16.5
+
+- NetCDF - 4.6.2
+
+- HDF5 - 1.10.4
+
+- ZLIB - 1.2.7
+
+- SZIP - 2.1.1
+
+- Python - 3
+
+- NCEPLIBS subset: bacio v2.4.1_4, sp v2.3.3_d, w3emc v2.9.2_d
+
+A Docker image containing the SCM, CCPP, and its software prerequisites
+can be generated from the code in the software repository obtained by
+following section `1.1 <#obtaining_code>`__ by executing the following
+steps:
+
+NOTE: Windows users can execute these steps in the terminal application
+that was installed as part of Docker Toolbox.
+
+#. Navigate to the directory.
+
+#. Run the command to generate the Docker image, using the supplied
+   Dockerfile.
+
+   .. code:: bash
+
+      docker build -t ccpp-scm .
+
+   Inspect the Dockerfile if you would like to see details for how the
+   image is built. The image will contain SCM prerequisite software from
+   DTC, the SCM and CCPP code, and a pre-compiled SCM executable with
+   the six supported suites. A successful build will show two images:
+   dtcenter/common-community-container and ccpp-scm. To list images,
+   type:
+
+   .. code:: bash
+
+      docker images
+
+Using a prebuilt Docker image from Dockerhub
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+A prebuilt Docker image for this release is available on Dockerhub for
+those who do not wish to build from source. In order to use this,
+execute the following from the terminal where Docker is run:
+
+.. code:: bash
+
+   docker pull dtcenter/ccpp-scm:v6.0.0
+
+To verify that it exists afterward, run
+
+.. 
code:: bash + + docker images + +Running the Docker image +~~~~~~~~~~~~~~~~~~~~~~~~ + +NOTE: Windows users can execute these steps through the Docker +Quickstart application installed with Docker Toolbox. + +#. Set up a directory that will be shared between the host machine and + the Docker container. When set up correctly, it will contain output + generated by the SCM within the container for manipulation by the + host machine. For Mac/Linux, + + .. code:: bash + + mkdir -p /path/to/output + + For Windows, you can try to create a directory of your choice to + mount to the container, but it may not work or require more + configuration, depending on your particular Docker installation. We + have found that Docker volume mounting in Windows can be difficult to + set up correctly. One method that worked for us was to create a new + directory under our local user space, and specifying the volume mount + as below. In addition, with Docker Toolbox, double check that the + mounted directory has correct permissions. For example, open + VirtualBox, right click on the running virtual machine, and choose + “Settings”. In the dialog that appears, make sure that the directory + you’re trying to share shows up in “Shared Folders" (and add it if it + does not) and make sure that the “auto-mount" and “permanent" options + are checked. + +#. Set an environment variable to use for your SCM output directory. For + *t/csh* shells, + + .. code:: bash + + setenv OUT_DIR /path/to/output + + For bourne/bash shells, + + .. code:: bash + + export OUT_DIR=/path/to/output + + For Windows, the format that worked for us followed this example: + +#. To run the SCM, you can run the Docker container that was just + created and give it the same run commands as discussed in section + `1.4.1 <#subsection: singlerunscript>`__. **Be sure to remember to + include the option for all run commands**. For example, + + .. 
code:: bash + + docker run --rm -it -v ${OUT_DIR}:/home --name run-ccpp-scm ccpp-scm ./run_scm.py -c twpice -d + + will run through the TWPICE case using the default suite and namelist + and put the output in the shared directory. NOTE: Windows users may + need to omit the curly braces around environment variables: use + instead of . For running through all supported cases and suites, use + + .. code:: bash + + docker run --rm -it -v ${OUT_DIR}:/home --name run-ccpp-scm ccpp-scm ./run_scm.py -m -d + + The options included in the above commands are the following: + + - removes the container when it exits + + - interactive mode with terminal access + + - specifies the volume mount from host directory (outside container) + to inside the container. Using volumes allows you to share data + between the host machine and container. For running the SCM, the + output is being mounted from inside the container to the on the + host machine. Upon exiting the container, data mounted to the host + machine will still be accessible. + + - names the container. If no name is provided, the daemon will + autogenerate a random string name. + + NOTE: If you are using a prebuilt image from Dockerhub, substitute + the name of the image that was pulled from Dockerhub in the commands + above; i.e. instead of above, one would have . + +#. To use the SCM interactively, run non-default configurations, create + plots, or even develop code, issue the following command: + + .. code:: bash + + docker run --rm -it -v ${OUT_DIR}:/home --name run-ccpp-scm ccpp-scm /bin/bash + + You will be placed within the container space and within the + directory of the SCM with a pre-compiled executable. At this point, + one could use the run scripts as described in previous sections + (remembering to include the option on run scripts if output is to be + shared with the host machine). 
   NOTE: If developing, since the container is ephemeral, one should
   push changes to a remote git repository to save them (i.e. a fork on
   GitHub.com).
diff --git a/scm/doc/TechGuide/chap_repo.rst b/scm/doc/TechGuide/chap_repo.rst
new file mode 100644
index 000000000..eb1e7570f
--- /dev/null
+++ b/scm/doc/TechGuide/chap_repo.rst
@@ -0,0 +1,22 @@
.. _`chapter: repository`:

Repository
==========

What is included in the repository?
-----------------------------------

The repository contains all code required to build the CCPP SCM and
scripts that can be used to obtain data to run it (e.g. downloading
large initialization tables for the Thompson microphysics schemes
discussed in subsection
`[subsection: singlerunscript] <#subsection: singlerunscript>`__ and
processed case data). It is functionally separated into 3 subdirectories
representing the SCM model infrastructure (``scm`` directory), the CCPP
infrastructure (``ccpp/framework`` directory), and the CCPP physics
schemes (``ccpp/physics`` directory). The entire repository resides on
GitHub’s NCAR space, and the ``ccpp/framework`` and ``ccpp/physics``
directories are git submodules that point to the ``ccpp-framework`` and
``ccpp-physics`` repositories on the same space. The structure of the
entire repository is represented below. Note that the repository also
contains files needed for using the CCPP with the UFS Atmosphere host
model that uses the Finite-Volume Cubed-Sphere (FV3) dynamical core.
diff --git a/scm/doc/TechGuide/main.rst b/scm/doc/TechGuide/main.rst
new file mode 100644
index 000000000..8b1378917
--- /dev/null
+++ b/scm/doc/TechGuide/main.rst
@@ -0,0 +1 @@

diff --git a/scm/doc/TechGuide/preface.rst b/scm/doc/TechGuide/preface.rst
new file mode 100644
index 000000000..581c32468
--- /dev/null
+++ b/scm/doc/TechGuide/preface.rst
@@ -0,0 +1,21 @@
Preface
=======

Meaning of typographic changes and symbols
------------------------------------------

Table `[tab:pre_typog] <#tab:pre_typog>`__ describes the type changes
and symbols used in this book.

..
container:: tabular + + | +l^l^l Typeface or Symbol & Meaning & Example + | & The names of commands, & Edit your + | & files, and directories; & Use to list all files. + | & on-screen computer output & . + | & What you type, contrasted & + | & with on-screen computer & + | & output & + | & Command line placeholder: & To delete a file, type + | & replace with a real name & + | & or value & diff --git a/scm/doc/TechGuide/title.rst b/scm/doc/TechGuide/title.rst new file mode 100644 index 000000000..451b8549e --- /dev/null +++ b/scm/doc/TechGuide/title.rst @@ -0,0 +1,25 @@ +.. container:: titlepage + + .. container:: center + + Common Community Physics Package + Single Column Model (SCM) + + User and Technical Guide + v6.0.0 + + | July 2023 + | Grant Firl + | *CIRA/CSU, NOAA GSL, and DTC* + | Dustin Swales, Laurie Carson, Michelle Harrold + | *NCAR and DTC* + | Ligia Bernardet + | *NOAA GSL and DTC* + | Dom Heinzeller + | *CU/CIRES, NOAA/GSL, and DTC* + | *currently JCSDA* + + | |image| + +.. 
|image| image:: images/dtc_logo.png + :width: 40.0% From 3c65377d486667807ba1373dda5958310e137584 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Mon, 11 Mar 2024 09:39:51 -0600 Subject: [PATCH 02/26] Initial .readthedocs.yaml --- .readthedocs.yaml | 32 ++++++++++++++++++++++++++++++++ 1 file changed, 32 insertions(+) create mode 100644 .readthedocs.yaml diff --git a/.readthedocs.yaml b/.readthedocs.yaml new file mode 100644 index 000000000..0cf1c1f78 --- /dev/null +++ b/.readthedocs.yaml @@ -0,0 +1,32 @@ +# .readthedocs.yaml +# Read the Docs configuration file +# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details + +# Required +version: 2 + +# Set the OS, Python version and other tools you might need +build: + os: ubuntu-22.04 + tools: + python: "3.12" + # You can also specify other tool versions: + # nodejs: "19" + # rust: "1.64" + # golang: "1.19" + +# Build documentation in the "docs/" directory with Sphinx +sphinx: + configuration: docs/conf.py + +# Optionally build your docs in additional formats such as PDF and ePub +# formats: +# - pdf +# - epub + +# Optional but recommended, declare the Python requirements required +# to build your documentation +# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html +# python: +# install: +# - requirements: docs/requirements.txt From 70f768b1eb10c72cd3f85a61631d1df66bc862a3 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Mon, 11 Mar 2024 10:01:09 -0600 Subject: [PATCH 03/26] Add conf.py in correct location --- .readthedocs.yaml | 2 +- scm/doc/TechGuide/conf.py | 212 ++++++++++++++++++++++++++++++++++++++ 2 files changed, 213 insertions(+), 1 deletion(-) create mode 100644 scm/doc/TechGuide/conf.py diff --git a/.readthedocs.yaml b/.readthedocs.yaml index 0cf1c1f78..c3f1466f3 100644 --- a/.readthedocs.yaml +++ b/.readthedocs.yaml @@ -17,7 +17,7 @@ build: # Build documentation in the "docs/" directory with Sphinx sphinx: - configuration: docs/conf.py + 
configuration: scm/doc/TechGuide/conf.py # Optionally build your docs in additional formats such as PDF and ePub # formats: diff --git a/scm/doc/TechGuide/conf.py b/scm/doc/TechGuide/conf.py new file mode 100644 index 000000000..2c8243069 --- /dev/null +++ b/scm/doc/TechGuide/conf.py @@ -0,0 +1,212 @@ +# -*- coding: utf-8 -*- +# +# Configuration file for the Sphinx documentation builder. +# +# This file does only contain a selection of the most common options. For a +# full list see the documentation: +# http://www.sphinx-doc.org/en/master/config + +# -- Path setup -------------------------------------------------------------- + +# If extensions (or modules to document with autodoc) are in another directory, +# add these directories to sys.path here. If the directory is relative to the +# documentation root, use os.path.abspath to make it absolute, like shown here. +# +import os +import sys +sys.path.insert(0, os.path.abspath('.')) + + +# -- Project information ----------------------------------------------------- + +project = 'Single Column Model (SCM) for the Common Community Physics Package (CCPP)' +copyright = '2024 ' +author = 'Firl, Grant, Ligia Bernardet, Dom Heinzeller, Dustin Swales, \\\ Tracy Hertneky, and Michael Kavulich, Jr.' + pdfauthor={Grant Firl / Laurie Carson / Ligia Bernardet / Dom Heinzeller}, + pdftitle={Common Community Physics Packaged (CCPP) Single Column Model (SCM)}, + +# The short X.Y version +version = '7.0' +# The full version, including alpha/beta/rc tags +release = '7.0.0' + +numfig = True + + +# -- General configuration --------------------------------------------------- + +# If your documentation needs a minimal Sphinx version, state it here. +# +# needs_sphinx = '1.0' + +# Add any Sphinx extension module names here, as strings. They can be +# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom +# ones. 
+extensions = [ + 'sphinx.ext.autodoc', + 'sphinx.ext.doctest', + 'sphinx.ext.intersphinx', + 'sphinx.ext.todo', + 'sphinx.ext.coverage', + 'sphinx.ext.mathjax', + 'sphinx.ext.ifconfig', + 'sphinx.ext.viewcode', + 'sphinx.ext.githubpages', + 'sphinx.ext.napoleon', + 'sphinxcontrib.bibtex', + 'sphinx.ext.autosectionlabel' +] + +bibtex_bibfiles = ['references.bib'] + +# Add any paths that contain templates here, relative to this directory. +templates_path = ['_templates'] + +# The suffix(es) of source filenames. +# You can specify multiple suffix as a list of string: +# +# source_suffix = ['.rst', '.md'] +source_suffix = '.rst' + +# The master toctree document. +master_doc = 'main' + +# The language for content autogenerated by Sphinx. Refer to documentation +# for a list of supported languages. +# +# This is also used if you do content translation via gettext catalogs. +# Usually you set "language" from the command line for these cases. +language = 'en' + +# List of patterns, relative to source directory, that match files and +# directories to ignore when looking for source files. +# This pattern also affects html_static_path and html_extra_path. +exclude_patterns = [] + +# The name of the Pygments (syntax highlighting) style to use. +pygments_style = None + + +# -- Options for HTML output ------------------------------------------------- + +# The theme to use for HTML and HTML Help pages. See the documentation for +# a list of builtin themes. +# +html_theme = 'classic' +#html_title = f'{project} v{revision} documentation' +html_title = f'{project} documentation' + +# Theme options are theme-specific and customize the look and feel of a theme +# further. For a list of options available for each theme, see the +# documentation. +# +# html_theme_options = {} +html_theme_options = {"body_max_width": "none"} + +# Add any paths that contain custom static files (such as style sheets) here, +# relative to this directory. 
They are copied after the builtin static files, +# so a file named "default.css" will overwrite the builtin "default.css". +html_static_path = ['_static'] + +def setup(app): + app.add_css_file('custom.css') # may also be an URL + +# Custom sidebar templates, must be a dictionary that maps document names +# to template names. +# +# The default sidebars (for documents that don't match any pattern) are +# defined by theme itself. Builtin themes are using these templates by +# default: ``['localtoc.html', 'relations.html', 'sourcelink.html', +# 'searchbox.html']``. +# +# html_sidebars = {} + + +# -- Options for HTMLHelp output --------------------------------------------- + +# Output file base name for HTML help builder. +htmlhelp_basename = 'CCPP-SCM' + + +# -- Options for LaTeX output ------------------------------------------------ + +latex_engine = 'pdflatex' +latex_elements = { + # The paper size ('letterpaper' or 'a4paper'). + # + # 'papersize': 'letterpaper', + + # The font size ('10pt', '11pt' or '12pt'). + # + # 'pointsize': '10pt', + + # Additional stuff for the LaTeX preamble. + # + # 'preamble': '', + + # Latex figure (float) alignment + # + # 'figure_align': 'htbp', +## 'maketitle': r'\newcommand\sphinxbackoftitlepage{For referencing this document please use: \newline \break Bernardet, L., G. Firl, D. Heinzeller, L. Pan, M. Zhang, M. Kavulich, J. Schramm, and L. Carson, 2022. CCPP Technical Documentation Release v6.0.0. Available at https://ccpp-techdoc.readthedocs.io/\textunderscore/downloads/en/v6.0.0/pdf/.}\sphinxmaketitle' +} + +# Grouping the document tree into LaTeX files. List of tuples +# (source start file, target name, title, +# author, documentclass [howto, manual, or own class]). +##latex_documents = [ +## (master_doc, 'CCPPtechnical.tex', 'CCPP Technical Documentation', +## author,'manual'), +##] + + +# -- Options for manual page output ------------------------------------------ + +# One entry per manual page. 
List of tuples +# (source start file, name, description, authors, manual section). +man_pages = [ + (master_doc, 'CCPP-SCM', 'CCPP-SCM Users Guide', + [author], 1) +] + + +# -- Options for Texinfo output ---------------------------------------------- + +# Grouping the document tree into Texinfo files. List of tuples +# (source start file, target name, title, author, +# dir menu entry, description, category) +texinfo_documents = [ + (master_doc, 'CCPP-SCM', 'CCPP-SCM Users Guide', + author, 'CCPP-SCM', 'Single-Column Model (SCM) for the Common Community Physics Package (CCPP).', + 'Miscellaneous'), +] + + +# -- Options for Epub output ------------------------------------------------- + +# Bibliographic Dublin Core info. +epub_title = project + +# The unique identifier of the text. This can be a ISBN number +# or the project homepage. +# +# epub_identifier = '' + +# A unique identification for the text. +# +# epub_uid = '' + +# A list of files that should not be packed into the epub file. +epub_exclude_files = ['search.html'] + + +# -- Extension configuration ------------------------------------------------- + +# -- Options for intersphinx extension --------------------------------------- + +# Example configuration for intersphinx: refer to the Python standard library. +intersphinx_mapping = {'https://docs.python.org/': None} + +# -- Options for todo extension ---------------------------------------------- + +# If true, `todo` and `todoList` produce output, else they produce nothing. 
+todo_include_todos = True From 45f74543886e398c253a5a6c287c010fd906f5d2 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Mon, 11 Mar 2024 10:08:11 -0600 Subject: [PATCH 04/26] Include proper master doc main.rst, fix rogue text in conf.py --- scm/doc/TechGuide/conf.py | 2 -- scm/doc/TechGuide/main.rst | 18 ++++++++++++++++++ 2 files changed, 18 insertions(+), 2 deletions(-) diff --git a/scm/doc/TechGuide/conf.py b/scm/doc/TechGuide/conf.py index 2c8243069..b3981cf28 100644 --- a/scm/doc/TechGuide/conf.py +++ b/scm/doc/TechGuide/conf.py @@ -22,8 +22,6 @@ project = 'Single Column Model (SCM) for the Common Community Physics Package (CCPP)' copyright = '2024 ' author = 'Firl, Grant, Ligia Bernardet, Dom Heinzeller, Dustin Swales, \\\ Tracy Hertneky, and Michael Kavulich, Jr.' - pdfauthor={Grant Firl / Laurie Carson / Ligia Bernardet / Dom Heinzeller}, - pdftitle={Common Community Physics Packaged (CCPP) Single Column Model (SCM)}, # The short X.Y version version = '7.0' diff --git a/scm/doc/TechGuide/main.rst b/scm/doc/TechGuide/main.rst index 8b1378917..89b95bf49 100644 --- a/scm/doc/TechGuide/main.rst +++ b/scm/doc/TechGuide/main.rst @@ -1 +1,19 @@ +CCPP Technical Documentation +============================ + +.. 
toctree:: + :numbered: + :maxdepth: 3 + + Title + Acknow + Preface + chap_intro + chap_quick + chap_repo + chap_function + chap_cases + chap_ccpp + chap_hsd + From 6656e5ff434500391bedca7b6c692da110340552 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Mon, 11 Mar 2024 11:01:50 -0600 Subject: [PATCH 05/26] Add requirements file for python dependencies --- .readthedocs.yaml | 6 +++--- scm/doc/TechGuide/requirements.txt | 1 + 2 files changed, 4 insertions(+), 3 deletions(-) create mode 100644 scm/doc/TechGuide/requirements.txt diff --git a/.readthedocs.yaml b/.readthedocs.yaml index c3f1466f3..3cbb7291f 100644 --- a/.readthedocs.yaml +++ b/.readthedocs.yaml @@ -27,6 +27,6 @@ sphinx: # Optional but recommended, declare the Python requirements required # to build your documentation # See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html -# python: -# install: -# - requirements: docs/requirements.txt +python: + install: + - requirements: scm/doc/TechGuide/requirements.txt diff --git a/scm/doc/TechGuide/requirements.txt b/scm/doc/TechGuide/requirements.txt new file mode 100644 index 000000000..ef36addc6 --- /dev/null +++ b/scm/doc/TechGuide/requirements.txt @@ -0,0 +1 @@ +sphinxcontrib-bibtex From bcb9fa94bf2e79555d40882dfe653a3071bef7e6 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Mon, 11 Mar 2024 11:04:49 -0600 Subject: [PATCH 06/26] Readthedocs insists that we have index.rst --- scm/doc/TechGuide/conf.py | 2 +- scm/doc/TechGuide/{main.rst => index.rst} | 0 2 files changed, 1 insertion(+), 1 deletion(-) rename scm/doc/TechGuide/{main.rst => index.rst} (100%) diff --git a/scm/doc/TechGuide/conf.py b/scm/doc/TechGuide/conf.py index b3981cf28..8027f532c 100644 --- a/scm/doc/TechGuide/conf.py +++ b/scm/doc/TechGuide/conf.py @@ -67,7 +67,7 @@ source_suffix = '.rst' # The master toctree document. -master_doc = 'main' +master_doc = 'index' # The language for content autogenerated by Sphinx. 
Refer to documentation # for a list of supported languages. diff --git a/scm/doc/TechGuide/main.rst b/scm/doc/TechGuide/index.rst similarity index 100% rename from scm/doc/TechGuide/main.rst rename to scm/doc/TechGuide/index.rst From acd117aa757d4663c823c47412c8cbc4714e3c7f Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Tue, 12 Mar 2024 08:21:13 -0600 Subject: [PATCH 07/26] Apparently chapters are case-sensitive. Fix up acknowledgements, and remove title page. --- scm/doc/TechGuide/acknow.rst | 22 +++++++++------------- scm/doc/TechGuide/index.rst | 7 +++---- scm/doc/TechGuide/title.rst | 25 ------------------------- 3 files changed, 12 insertions(+), 42 deletions(-) delete mode 100644 scm/doc/TechGuide/title.rst diff --git a/scm/doc/TechGuide/acknow.rst b/scm/doc/TechGuide/acknow.rst index dcb376ad2..c8014aba5 100644 --- a/scm/doc/TechGuide/acknow.rst +++ b/scm/doc/TechGuide/acknow.rst @@ -1,16 +1,12 @@ -.. container:: titlepage +Acknowledgement +=============== +If significant help was provided via the helpdesk, email, or +support forum for work resulting in a publication, please +acknowledge the Developmental Testbed Center team. - .. container:: flushleft +For referencing this document please use: - Acknowledgement + Firl, G., D. Swales, L. Carson, L. Bernardet, D. Heinzeller, M. Harrold, T. Hertneky, and + M. Kavulich, 2024. Common Community Physics Package Single Column Model v7.0.0 User and + Technical Guide. Available at https://ccpp-scm.readthedocs.io/en/latest/. - | If significant help was provided via the helpdesk, email, or - support forum for work resulting in a publication, please - acknowledge the Developmental Testbed Center team. - - | For referencing this document please use: - - Firl, G., D. Swales, L. Carson, L. Bernardet, D. Heinzeller, and - M. Harrold, 2022. Common Community Physics Package Single Column - Model v6.0.0 User and Technical Guide. 40pp. 
Available at - https://dtcenter.org/sites/default/files/paragraph/scm-ccpp-guide-v6-0-0.pdf. diff --git a/scm/doc/TechGuide/index.rst b/scm/doc/TechGuide/index.rst index 89b95bf49..3d4bacb58 100644 --- a/scm/doc/TechGuide/index.rst +++ b/scm/doc/TechGuide/index.rst @@ -1,13 +1,12 @@ -CCPP Technical Documentation +CCPP Single Column Model (SCM) User and Technical Guide v6.0.0 ============================ .. toctree:: :numbered: :maxdepth: 3 - Title - Acknow - Preface + acknow + preface chap_intro chap_quick chap_repo diff --git a/scm/doc/TechGuide/title.rst b/scm/doc/TechGuide/title.rst deleted file mode 100644 index 451b8549e..000000000 --- a/scm/doc/TechGuide/title.rst +++ /dev/null @@ -1,25 +0,0 @@ -.. container:: titlepage - - .. container:: center - - Common Community Physics Package - Single Column Model (SCM) - - User and Technical Guide - v6.0.0 - - | July 2023 - | Grant Firl - | *CIRA/CSU, NOAA GSL, and DTC* - | Dustin Swales, Laurie Carson, Michelle Harrold - | *NCAR and DTC* - | Ligia Bernardet - | *NOAA GSL and DTC* - | Dom Heinzeller - | *CU/CIRES, NOAA/GSL, and DTC* - | *currently JCSDA* - - | |image| - -.. |image| image:: images/dtc_logo.png - :width: 40.0% From d14f369aaa24d4234b42661c7189c6630baf3644 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Tue, 12 Mar 2024 08:54:11 -0600 Subject: [PATCH 08/26] fix image embedding, clean up "preface" table (basically copied from CCPP tech doc), fix deprecated feature in conf.py --- scm/doc/TechGuide/chap_hsd.rst | 6 +++--- scm/doc/TechGuide/conf.py | 2 +- scm/doc/TechGuide/preface.rst | 39 ++++++++++++++++++++++------------ 3 files changed, 30 insertions(+), 17 deletions(-) diff --git a/scm/doc/TechGuide/chap_hsd.rst b/scm/doc/TechGuide/chap_hsd.rst index 2a4e5e1d9..aada12eb7 100644 --- a/scm/doc/TechGuide/chap_hsd.rst +++ b/scm/doc/TechGuide/chap_hsd.rst @@ -66,7 +66,7 @@ Definition File (SDF), using namelist options. This simulator data-driven tendencies (`1.1 <#fig:CSS_tendency_schematic>`__). 
.. figure:: images/CSS_tendency_schematic.png - name: fig:CSS_tendency_schematic + :name: fig:CSS_tendency_schematic :width: 80.0% Equation for internal physics state evolution for process-split @@ -239,7 +239,7 @@ interstital scheme to couple to the GFS physics, (See `1.2 <#fig:CSS_SDF_ex1>`__). .. figure:: images/SDF_changes_for_CSS_ex1.png - name: fig:CSS_SDF_ex1 + :name: fig:CSS_SDF_ex1 :width: 80.0% Changes to GFS v16 physics SDF to include CCPP suite simulator for @@ -277,7 +277,7 @@ interstital scheme to couple to the GFS physics, (See `1.3 <#fig:CSS_SDF_ex2>`__). .. figure:: images/SDF_changes_for_CSS_ex2.png - name: fig:CSS_SDF_ex2 + :name: fig:CSS_SDF_ex2 :width: 80.0% Changes to GFS v17 Prototype 8 physics SDF to include CCPP suite diff --git a/scm/doc/TechGuide/conf.py b/scm/doc/TechGuide/conf.py index 8027f532c..e1bafb42d 100644 --- a/scm/doc/TechGuide/conf.py +++ b/scm/doc/TechGuide/conf.py @@ -202,7 +202,7 @@ def setup(app): # -- Options for intersphinx extension --------------------------------------- # Example configuration for intersphinx: refer to the Python standard library. -intersphinx_mapping = {'https://docs.python.org/': None} +intersphinx_mapping = {'python': ('https://docs.python.org/3', None)} # -- Options for todo extension ---------------------------------------------- diff --git a/scm/doc/TechGuide/preface.rst b/scm/doc/TechGuide/preface.rst index 581c32468..ce63434a4 100644 --- a/scm/doc/TechGuide/preface.rst +++ b/scm/doc/TechGuide/preface.rst @@ -4,18 +4,31 @@ Preface Meaning of typographic changes and symbols ------------------------------------------ -Table `[tab:pre_typog] <#tab:pre_typog>`__ describes the type changes -and symbols used in this book. +The table below describes the type changes and symbols used in this document. -.. container:: tabular +.. _scheme_suite_table: - | +l^l^l Typeface or Symbol & Meaning & Example - | & The names of commands, & Edit your - | & files, and directories; & Use to list all files. 
- | & on-screen computer output & . - | & What you type, contrasted & - | & with on-screen computer & - | & output & - | & Command line placeholder: & To delete a file, type - | & replace with a real name & - | & or value & +.. list-table:: *Type changes and symbols used in this guide.* + :header-rows: 1 + + * - Typeface or Symbol + - Meaning + - Examples + * - ``AaBbCc123`` + - + * The names of commands, files, and directories + * On-screen terminal output + - + * Edit your ``.bashrc`` file + * Use ``ls -a`` to list all files. + * ``host$ You have mail!`` + * - *AaBbCc123* + - + * The names of CCPP-specific terms, subroutines, etc. + * Captions for figures, tables, etc. + - + * Each scheme must include at least one of the following subroutines: ``{schemename}_timestep_init``, ``{schemename}_init``, ``{schemename}_run``, ``{schemename}_finalize``, and ``{schemename}_timestep_finalize``. + * *Listing 2.1: Fortran template for a CCPP-compliant scheme showing the* _run *subroutine.* + * - **AaBbCc123** + - Words or phrases requiring particular emphasis + - Fortran77 code should **not** be used From 350dda943fb2bc8191691c0b36883f1725af88b2 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Tue, 12 Mar 2024 09:11:12 -0600 Subject: [PATCH 09/26] Update and fix formatting for Introduction chapter --- scm/doc/TechGuide/chap_intro.rst | 22 +++++++++++----------- scm/doc/TechGuide/preface.rst | 2 +- 2 files changed, 12 insertions(+), 12 deletions(-) diff --git a/scm/doc/TechGuide/chap_intro.rst b/scm/doc/TechGuide/chap_intro.rst index f99913eed..68e4ddf1d 100644 --- a/scm/doc/TechGuide/chap_intro.rst +++ b/scm/doc/TechGuide/chap_intro.rst @@ -3,15 +3,15 @@ Introduction ============ -A single column model (SCM) can be a valuable tool for diagnosing the +A single-column model (SCM) can be a valuable tool for diagnosing the performance of a physics suite, from validating that schemes have been integrated into a suite correctly to deep dives into how physical processes 
are being represented by the approximating code. This SCM has -the advantage of working with the Common Community Physics Package +the advantage of working with and being developed alongside the Common Community Physics Package (CCPP), a library of physical parameterizations for atmospheric -numerical models and the associated framework for connecting potentially +numerical models (CCPP physics) and the associated framework for connecting potentially any atmospheric model to physics suites constructed from its member -parameterizations. In fact, this SCM serves as perhaps the simplest +parameterizations (CCPP framework). In fact, this SCM serves as perhaps the simplest example for using the CCPP and its framework in an atmospheric model. This version contains all parameterizations of NOAA’s evolved operational GFS v16 suite (implemented in 2021), plus additional @@ -22,9 +22,9 @@ Documentation `__ This document serves as both the User and Technical Guides for this model. It contains a Quick Start Guide with instructions for obtaining -the code, compiling, and running a sample test case, an explanation for -what is included in the repository, a brief description of the operation -of the model, a description of how cases are set up and run, and +the code, compiling, and running a sample test case; an explanation for +what is included in the repository; a brief description of the operation +of the model; a description of how cases are set up and run, and finally, an explanation for how the model interfaces with physics through the CCPP infrastructure. @@ -102,8 +102,8 @@ This release bundle has some known limitations: technical reason why they cannot be used with LSMs, however. - As of this release, using the SCM over a land point with an LSM is - possible through the use of UFS initial conditions (see section - `[sec:UFS ICs] <#sec:UFS ICs>`__). However, advective forcing terms + possible through the use of UFS initial conditions (see + :numref:`Section %s `). 
However, advective forcing terms are unavailable as of this release, so only short integrations using this configuration should be employed. Using dynamical tendencies (advective forcing terms) from the UFS will be part of a future @@ -112,5 +112,5 @@ This release bundle has some known limitations: - There are several capabilities of the developmental code that have not been tested sufficiently to be considered part of the supported release. Those include additional parameterizations. Users that want - to use experimental capabilities should refer to Subsection - `[section: development_code] <#section: development_code>`__. + to use experimental capabilities should refer to + :numref:`Subsection %s `. diff --git a/scm/doc/TechGuide/preface.rst b/scm/doc/TechGuide/preface.rst index ce63434a4..a5104db3f 100644 --- a/scm/doc/TechGuide/preface.rst +++ b/scm/doc/TechGuide/preface.rst @@ -24,7 +24,7 @@ The table below describes the type changes and symbols used in this document. * ``host$ You have mail!`` * - *AaBbCc123* - - * The names of CCPP-specific terms, subroutines, etc. + * The names of SCM- and CCPP-specific terms, subroutines, etc. * Captions for figures, tables, etc. - * Each scheme must include at least one of the following subroutines: ``{schemename}_timestep_init``, ``{schemename}_init``, ``{schemename}_run``, ``{schemename}_finalize``, and ``{schemename}_timestep_finalize``. 
From 42a5a9663c2ff3406a5077b583018495b33b18d1 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Tue, 12 Mar 2024 10:21:27 -0600 Subject: [PATCH 10/26] Update and fix formatting for first section of Quick Start chapter --- scm/doc/TechGuide/chap_quick.rst | 51 ++++++++++++++++++-------------- 1 file changed, 29 insertions(+), 22 deletions(-) diff --git a/scm/doc/TechGuide/chap_quick.rst b/scm/doc/TechGuide/chap_quick.rst index 7a13df7d8..4b4525e14 100644 --- a/scm/doc/TechGuide/chap_quick.rst +++ b/scm/doc/TechGuide/chap_quick.rst @@ -6,27 +6,27 @@ Quick Start Guide This chapter provides instructions for obtaining and compiling the CCPP SCM. The SCM code calls CCPP-compliant physics schemes through the CCPP framework code. As such, it requires the CCPP framework code and physics -code, both of which are included as submodules within the SCM code. This +code, both of which are included as submodules within the SCM git repository. This package can be considered a simple example for an atmospheric model to interact with physics through the CCPP. -Alternatively, if one doesn’t have access or care to set up a machine +Alternatively, if one doesn’t have access to or care to set up a machine with the appropriate system requirements but has a working Docker installation, it is possible to create and use a Docker container with a pre-configured computing environment with a pre-compiled model. This is also an avenue for running this software with a Windows PC. See section -`1.5 <#docker>`__ for more information. +:numref:`Section %s ` for more information. .. _obtaining_code: Obtaining Code -------------- -The source code for the CCPP and SCM is provided through GitHub.com. -This release branch contains the tested and supported version for -general use, while a development branch is less stable, yet contains the -latest developer code. Instructions for using either option are -discussed here. 
+The source code for the SCM, CCPP, and their required components are provided through GitHub. +The latest release branch contains the tested and supported version for +general use, while the development branch (``main``) contains the latest +developer code, but may not be as stable or consistent with existing documentation. +Instructions for using either option are discussed here. Release Code ~~~~~~~~~~~~ @@ -37,13 +37,19 @@ Clone the source using git clone --recursive -b v6.0.0 https://github.com/NCAR/ccpp-scm -Recall that the option in this command clones the main ccpp-scm -repository and all subrepositories (ccpp-physics and ccpp-framework). -Using this option, there is no need to execute and . +By using the ``--recursive`` option, it guarantees that you are checking out the commits +of ccpp-physics and ccpp-framework that were tested with the latest +commit of the SCM main branch. If not included initially, you can always retrieve the commits of +the submodules that were intended to be used with a given commit of the +SCM by executing the following command from the SCM directory: + +.. code:: bash -The CCPP framework can be found in the ccpp/framework subdirectory at + git submodule update --init --recursive + +The CCPP framework can be found in the ``ccpp/framework`` subdirectory at this level. The CCPP physics parameterizations can be found in the -ccpp/physics subdirectory. +``ccpp/physics`` subdirectory. .. _`section: development_code`: @@ -52,7 +58,7 @@ Development Code If you would like to contribute as a developer to this project, please see (in addition to the rest of this guide) the scientific and technical -documentation included with this release: +documentation included with this release for both the SCM and the CCPP: https://dtcenter.org/community-code/common-community-physics-package-ccpp/documentation @@ -60,17 +66,17 @@ There you will find links to all of the documentation pertinent to developers. 
For working with the development branches (stability not guaranteed), -check out the branch of the repository: +check out the ``main`` branch of the repository: .. code:: bash git clone --recursive -b main https://github.com/NCAR/ccpp-scm -By using the option, it guarantees that you are checking out the commits -of ccpp-physics and ccpp-framework that were tested with the latest -commit of the SCM main branch. You can always retrieve the commits of +Recall that the ``--recursive`` option in this command clones the main ccpp-scm +repository and all subrepositories (ccpp-physics and ccpp-framework). +If not included initially, you can always retrieve the commits of the submodules that were intended to be used with a given commit of the -SCM by doing the following from the top level SCM directory: +SCM by executing the following command from the SCM directory: .. code:: bash @@ -108,8 +114,8 @@ development). To do so: git pull #. Change back to the main directory for following the instructions in - section `1.3 <#section: compiling>`__ assuming system requirements in - section `1.2 <#section: systemrequirements>`__ are met. + :numref:`Section %s `, assuming system requirements in + section :numref:`Section %s ` are met. .. code:: bash @@ -121,7 +127,8 @@ System Requirements, Libraries, and Tools ----------------------------------------- The source code for the SCM and CCPP components is in the form of -programs written in FORTRAN, FORTRAN 90, and C. In addition, the I/O +programs written in FORTRAN 90 (with some required features from the +FORTRAN 2008 standard), and C. In addition, the model I/O relies on the NetCDF libraries. Beyond the standard scripts, the build system relies on use of the Python scripting language, along with cmake, GNU make and date. From 1a5dacdddb39c17b454e1b92c7a86f6079c4c466 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Tue, 12 Mar 2024 11:33:01 -0600 Subject: [PATCH 11/26] Finish formatting Quick Start chapter. 
Did not make any updates for latest changes or building with spack-stack; that will come in a subsequent PR
---
 scm/doc/TechGuide/chap_quick.rst | 230 ++++++++++++++++---------------
 1 file changed, 117 insertions(+), 113 deletions(-)

diff --git a/scm/doc/TechGuide/chap_quick.rst b/scm/doc/TechGuide/chap_quick.rst
index 4b4525e14..4a3a46782 100644
--- a/scm/doc/TechGuide/chap_quick.rst
+++ b/scm/doc/TechGuide/chap_quick.rst
@@ -152,8 +152,7 @@ to install and maintain, they are considered part of the basic system
 requirements. The Unified Forecast System (UFS) Short-Range Weather
 Application release v1.0.0 of March 2021 provides software packages and
 detailed instructions to install these prerequisites and the hpc-stack
-on supported platforms (see
-section `1.2.3 <#section: setup_supported_platforms>`__).
+on supported platforms (see :numref:`Section %s <section: setup_supported_platforms>`).

Further, there are several utility libraries as part of the hpc-stack
package that must be installed with environment variables pointing to
@@ -166,18 +165,18 @@ their locations prior to building the SCM.

- w3emc - GRIB decoder and encoder library

The following environment variables are used by the build system to
-properly link these libraries: , , and . Computational platforms in
+properly link these libraries: ``bacio_ROOT``, ``sp_ROOT``, and ``w3emc_ROOT``. Computational platforms on
 which these libraries are prebuilt and installed in a central location
-are referred to as preconfigured platforms. Examples of preconfigured
+are referred to as *preconfigured* platforms. Examples of preconfigured
 platforms are most NOAA high-performance computing machines (using the
 Intel compiler) and the NCAR Cheyenne system (using the Intel and GNU
-compilers).
The machine setup scripts mentioned in
+:numref:`Section %s <section: compiling>` load these libraries (which are identical
 to those used by the UFS Short and Medium Range Weather Applications on
 those machines) and set these environment variables for the user
 automatically. For installing the libraries and their prerequisites on
 supported platforms, existing UFS packages can be used (see
-section `1.2.3 <#section: setup_supported_platforms>`__).
+:numref:`Section %s <section: setup_supported_platforms>`).

Compilers
~~~~~~~~~

@@ -230,15 +229,16 @@ For users on supported platforms such as generic Linux or macOS systems
that have not been preconfigured, the *hpc-stack* project is suggested for
installing prerequisite libraries. Visit
https://github.com/NOAA-EMC/hpc-stack for instructions for installing
-prerequisite libraries via in their docs directory. UFS users who
-already installed libraries via the package only need to set the
-compiler (), NetCDF (), and , and (, , ) environment variables to point
+prerequisite libraries via *hpc-stack* in their docs directory. UFS users who
+already installed libraries via the *hpc-stack* package only need to set the
+compiler (``CC``, ``CXX``, ``FC``), NetCDF (``NetCDF_ROOT``), and ``bacio``,
+``sp`` and ``w3emc`` (``bacio_ROOT``, ``sp_ROOT``, ``w3emc_ROOT``) environment variables to point
 to their installation paths in order to compile the SCM.

-The SCM uses only a small part of the UFS package and has fewer
-prerequisites (i.e. no or needed). Users who are not planning to use the
+The SCM uses only a small part of the UFS *hpc-stack* package and has fewer
+prerequisites (i.e. no ESMF or wgrib2 needed). Users who are not planning to use the
 UFS can install only NetCDF/NetCDF-Fortran manually or using the
-software package manager (, , ).
+software package manager (apt, yum, brew).

The Python environment must provide the ``f90nml`` module for the SCM scripts to
function.
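The ``f90nml`` requirement can also be checked programmatically; this sketch uses the standard library's ``importlib.util.find_spec``, which reports whether an import would succeed without actually importing the package:

```python
import importlib.util

def module_available(name):
    # True if "import <name>" would succeed in the current interpreter.
    return importlib.util.find_spec(name) is not None

# The SCM run scripts need f90nml to read and write Fortran namelists,
# so it pays to check early with a clear message.
if not module_available("f90nml"):
    print("f90nml is not installed; see the installation options below")
```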
Users can test if f90nml is installed using this command in
@@ -248,8 +248,8 @@ the shell:

      python -c "import f90nml"

-If is installed, this command will succeed silently, otherwise an will
-be printed to screen. To install the (v0.19) Python module, use the
+If ``f90nml`` is installed, this command will succeed silently; otherwise an ``ImportError: No module named f90nml``
+will be printed to screen. To install the ``f90nml`` (v0.19) Python module, use the
 install method preferred for your Python environment (one of the
 following):

@@ -274,8 +274,10 @@ or perform the following steps to install it manually from source:

      cd f90nml
      python setup.py install [--prefix=/my/install/directory or --user]

-The directory must exist and its subdirectory (or instead of , depending
-on the system) must be in the environment variable.
+The directory ``/my/install/directory`` must exist and its subdirectory
+``/my/install/directory/lib/python[version]/site-packages`` (or ``lib64``
+instead of ``lib``, depending on the system) must be in the ``PYTHONPATH``
+environment variable.

.. _`section: compiling`:

@@ -288,7 +290,7 @@ sections `1.2.2 <#section: use_preconfigured_platforms>`__ and
`1.2.3 <#section: setup_supported_platforms>`__. The second step is to
download the lookup tables and other large datasets (large binaries,
:math:`<`\ 1 GB) needed by the physics schemes and place them in the
-correct directory: From the top-level code directory ( by default),
+correct directory: from the top-level code directory (``ccpp-scm`` by default),
 execute the following scripts:

.. code:: bash

   ./contrib/get_all_static_data.sh
   ./contrib/get_thompson_tables.sh

If the download step fails, make sure that your system’s firewall does
-not block access to GitHub. If it does, download the files , , , from
-the GitHub release website using your browser and manually extract its
-contents in the directory . Similarly, do the same for and and extract
-to .
- -Following this step, the top level build system will use to query system +not block access to GitHub. If it does, download the files ``comparison_data.tar.gz``, +``physics_input_data.tar.gz``, ``processed_case_input.tar.gz``, and ``raw_case_input.tar.gz`` +from the GitHub release website using your browser and manually extract its +contents in the directory ``scm/data``. Similarly, do the same for +``thompson_tables.tar.gz`` and ``MG_INCCN_data.tar.gz`` and extract +to ``scm/data/physics_input_data/``. + +Following this step, the top level build system will use ``cmake`` to query system parameters, execute the CCPP prebuild script to match the physics variables (between what the host model – SCM – can provide and what is needed by physics schemes in the CCPP for the chosen suites), and build -the physics caps needed to use them. Finally, is used to compile the +the physics caps needed to use them. Finally, ``make`` is used to compile the components. -#. From the top-level code directory ( by default), change directory to +#. From the top-level code directory (``ccpp-scm`` by default), change directory to the top-level SCM directory. .. code:: bash @@ -322,7 +326,7 @@ components. mkdir bin && cd bin -#. Invoke on the source code to build using one of the options below. +#. Invoke ``cmake`` on the source code to build using one of the options below. This step is used to identify for which suites the ccpp-framework will build caps and which suites can be run in the SCM without recompiling. @@ -334,7 +338,7 @@ components. cmake ../src By default, this option uses all supported suites. The list of - supported suites is controlled by . + supported suites is controlled by ``scm/src/suite_info.py``. - All suites mode @@ -342,7 +346,7 @@ components. cmake -DCCPP_SUITES=ALL ../src - All suites in , regardless of whether they’re supported, will be + All suites in ``scm/src/suite_info.py``, regardless of whether they’re supported, will be used. 
This list is typically longer for the development version of the code than for releases. @@ -353,10 +357,10 @@ components. cmake -DCCPP_SUITES=SCM_GFS_v16,SCM_RAP ../src This only compiles the listed subset of suites (which should still - have a corresponding entry in + have a corresponding entry in ``scm/src/suite_info.py``) - The statements above can be modified with the following options - (put before ): + (put before ``../src``): - Use threading with openmp (not for macOS with clang+gfortran) @@ -384,17 +388,17 @@ components. SCM executable. If necessary, the CCPP prebuild script can be executed manually from - the top level directory (). The basic syntax is + the top level directory (``ccpp-scm``). The basic syntax is .. code:: bash ./ccpp/framework/scripts/ccpp_prebuild.py --config=./ccpp/config/ccpp_prebuild_config.py --suites=SCM_GFS_v16,SCM_RAP[...] --builddir=./scm/bin [--debug] - where the argument supplied via the variable is a comma-separated + where the argument supplied via the ``--suites`` variable is a comma-separated list of suite names that exist in the directory. Note that suite - names are the suite definition filenames minus the prefix and suffix. + names are the suite definition filenames minus the ``suite_`` prefix and ``.xml`` suffix. -#. Compile. Add to obtain more information on the build process. +#. Compile. Add ``VERBOSE=1`` to obtain more information on the build process. .. code:: bash @@ -407,10 +411,10 @@ components. make -j4 2>&1 | tee log.make -The resulting executable may be found at (Full path of ). +The resulting executable may be found at ./scm (Full path of ``ccpp-scm/scm/bin/scm``). -Although is not currently implemented, an out-of-source build is used, -so all that is required to clean the build directory is (from the +Although ``make clean`` is not currently implemented, an out-of-source build is used, +so all that is required to clean the build directory is (from the ``bin`` directory) .. 
code:: bash @@ -432,8 +436,8 @@ There are several test cases provided with this version of the SCM. For all cases, the SCM will go through the time steps, applying forcing and calling the physics defined in the chosen suite definition file using physics configuration options from an associated namelist. The model is -executed through a Python run script that is pre-staged into the -directory: . It can be used to run one integration or several +executed through a Python run script that is pre-staged into the ``bin`` +directory: ``run_scm.py``. It can be used to run one integration or several integrations serially, depending on the command line arguments supplied. .. _`subsection: singlerunscript`: @@ -445,131 +449,130 @@ Running a case requires four pieces of information: the case to run (consisting of initial conditions, geolocation, forcing data, etc.), the physics suite to use (through a CCPP suite definition file), a physics namelist (that specifies configurable physics options to use), and a -tracer configuration file. As discussed in chapter -`[chapter: cases] <#chapter: cases>`__, cases are set up via their own -namelists in . A default physics suite is provided as a user-editable +tracer configuration file. As discussed in :numref:`Chapter %c `, cases are set up via their own +namelists in ``../etc/case_config``. A default physics suite is provided as a user-editable variable in the script and default namelists and tracer configurations -are associated with each physics suite (through ), so, technically, one +are associated with each physics suite (through ``../src/suite_info.py``), so, technically, one must only specify a case to run with the SCM when running just one integration. For running multiple integrations at once, one need only -specify one argument () which runs through all permutations of supported -suites from and cases from . 
The run script’s options are described +specify one argument (``-m``) which runs through all permutations of supported +suites from ``../src/suite_info.py`` and cases from ``../src/supported_cases.py``. The run script’s options are described below where option abbreviations are included in brackets. -- +- ``--case [-c]`` - - **This or the option are the minimum required arguments.** The - case should correspond to the name of a case in (without the - extension). + - **This or the ``--multirun`` option are the minimum required arguments.** The + case should correspond to the name of a case in ``../etc/case_config`` (without the + ``.nml`` extension). -- +- ``--suite [-s]`` - - The suite should correspond to the name of a suite in (without the - ) extension that was supplied in the or step. + - The suite should correspond to the name of a suite in ``../ccpp/suites`` (without the + ``.xml`` extension) that was supplied in the ``cmake`` or ``ccpp_prebuild`` step. -- +- ``--namelist [-n]`` - - The namelist should correspond to the name of a file in (WITH the - extension). If this argument is omitted, the default namelist for - the given suite in will be used. + - The namelist should correspond to the name of a file in ``../ccpp/physics_namelists`` (WITH the + ``.txt`` extension). If this argument is omitted, the default namelist for + the given suite in ``../src/suite_info.py`` will be used. -- +- ``--tracers [-t]`` - - The tracers file should correspond to the name of a file in (WITH - the extension). If this argument is omitted, the default tracer - configuration for the given suite in will be used. + - The tracers file should correspond to the name of a file in ``../etc/tracer_config`` (WITH + the ``.txt`` extension). If this argument is omitted, the default tracer + configuration for the given suite in ``../src/suite_info.py`` will be used. 
--
+- ``--multirun [-m]``

  - **This or the ``--case`` option are the minimum required arguments.** When
    used alone, this option runs through all permutations of supported
-     suites from and cases from . When used in conjunction with the
-     option, only the runs configured in the file will be run.
+     suites from ``../src/suite_info.py`` and cases from ``../src/supported_cases.py``. When used in conjunction with the
+     ``--file`` option, only the runs configured in the file will be run.

--
+- ``--file [-f]``

  - This option may be used in conjunction with the ``--multirun`` argument. It
    specifies a path and filename to a Python file where multiple runs
    are configured.

--
+- ``--gdb [-g]``

  - Use this to run the executable through the ``gdb`` debugger (if it is
    installed on the system).

--
+- ``--docker [-d]``

  - Use this argument when running in a Docker container in order to
    successfully mount a volume between the host machine and the Docker
    container instance and to share the output and plots with the host
    machine.

--
+- ``--runtime``

  - Use this to override the runtime provided in the case configuration
    namelist.

--
+- ``--runtime_mult``

  - Use this to override the runtime provided in the case configuration
    namelist by multiplying the runtime by the given value. This is
    used, for example, in regression testing to reduce total runtimes.

--
+- ``--levels [-l]``

  - Use this to change the number of vertical levels.

--
+- ``--npz_type``

  - Use this to change the type of FV3 vertical grid to produce (see
-    for valid values).
+    ``src/scm_vgrid.F90`` for valid values).

--
+- ``--vert_coord_file``

  - Use this to specify the path/filename of a file containing the a_k
    and b_k coefficients for the vertical grid generation code to use.
-- +- ``--bin_dir`` - Use this to specify the path to the build directory. -- +- ``--run_dir`` - Use this to specify the path to the run directory. -- +- ``--case_data_dir`` - Use this to specify the path to the directory containing the case data file (useful for using the DEPHY case repository). -- +- ``--n_itt_out`` - Use this to specify the period of writing instantaneous output in timesteps (if different than the default specified in the script). -- +- ``--n_itt_diag`` - Use this to specify the period of writing instantaneous and time-averaged diagnostic output in timesteps (if different than the default specified in the script). -- +- ``--timestep [-dt]`` - Use this to specify the timestep to use (if different than the - default specified in ). + default specified in ``../src/suite_info.py``). -- +- ``--verbose [-v]`` - Use this option to see additional debugging output from the run script and screen output from the executable. When invoking the run script, the only required argument is the name of the case to run. The case name used must match one of the case -configuration files located in (*without the .nml extension!*). If +configuration files located in ``../etc/case_config`` (*without the .nml extension!*). If specifying a suite other than the default, the suite name used must match the value of the suite name in one of the suite definition files -located in (Note: not the filename of the suite definition file). As +located in ``../../ccpp/suites`` (Note: not the filename of the suite definition file). As part of the sixth CCPP release, the following suite names are valid: #. SCM_GFS_v16 @@ -588,31 +591,30 @@ Note that using the Thompson microphysics scheme requires the computation of look-up tables during its initialization phase. 
As of the release, this process has been prohibitively slow with this model, so it is HIGHLY suggested that these look-up tables are downloaded and staged -to use this scheme as described in -section `1.3 <#section: compiling>`__. The issue appears to be +to use this scheme as described in :numref:`Section %s `. The issue appears to be machine/compiler-specific, so you may be able to produce the tables with -the SCM, especially when invoking with the option. +the SCM, especially when invoking ``cmake`` with the ``-DOPENMP=ON`` option. Also note that some cases require specified surface fluxes. Special suite definition files that correspond to the suites listed above have -been created and use the decoration. It is not necessary to specify this -filename decoration when specifying the suite name. If the variable in -the configuration file of the case being run is set to , the run script +been created and use the ``*_prescribed_surface`` decoration. It is not necessary to specify this +filename decoration when specifying the suite name. If the ``spec_sfc_flux`` variable in +the configuration file of the case being run is set to ``.true.``, the run script will automatically use the special suite definition file that corresponds to the chosen suite from the list above. If specifying a namelist other than the default, the value must be an -entire filename that exists in . Caution should be exercised when +entire filename that exists in ``../../ccpp/physics_namelists``. Caution should be exercised when modifying physics namelists since some redundancy between flags to control some physics parameterizations and scheme entries in the CCPP suite definition files currently exists. Values of numerical parameters are typically OK to change without fear of inconsistencies. If specifying a tracer configuration other than the default, the value must -be an entire filename that exists in . 
The tracers that are used should +be an entire filename that exists in ``../../scm/etc/tracer_config``. The tracers that are used should match what the physics suite expects, lest a runtime error will result. Most of the tracers are dependent on the microphysics scheme used within the suite. The tracer names that are supported as of this release are -given by the following list. Note that running without , , and may +given by the following list. Note that running without ``sphum``, ``o3mr``, and ``liq_wat`` and may result in a runtime error in all supported suites. #. sphum @@ -672,15 +674,15 @@ If using the model on HPC resources and significant amounts of processor time is anticipated for the experiments, it will likely be necessary to submit a job through the HPC’s batch system. An example script has been included in the repository for running the model on Hera’s batch system -(SLURM). It is located in . Edit the , , etc. to suit your needs and -copy to the directory. The case name to be run is included in the +(SLURM). It is located in ``ccpp-scm/scm/etc/scm_slurm_example.py``. Edit the ``job_name``, ``account``, etc. to suit your needs and +copy to the ``bin`` directory. The case name to be run is included in the ``command`` variable. To use, invoke .. code:: bash ./scm_slurm_example.py -from the directory. +from the ``bin`` directory. Additional details regarding the SCM may be found in the remainder of this guide. More information on the CCPP can be found in the CCPP @@ -716,7 +718,8 @@ internet search. - After a successful installation of Docker Toolbox, starting with Docker Quickstart may result in the following error even with - virtualization correctly enabled: . We were able to bypass this error + virtualization correctly enabled: ``This computer doesn’t have VT-X/AMD-v enabled. 
Enabling it in the BIOS is mandatory.``
+   We were able to bypass this error
   by opening a bash terminal installed with Docker Toolbox, navigating
   to the directory where it was installed, and executing the following
   command:

@@ -758,15 +761,15 @@ GNU-based Docker image:

A Docker image containing the SCM, CCPP, and its software prerequisites
can be generated from the code in the software repository obtained by
-following section `1.1 <#obtaining_code>`__ by executing the following
-steps:
+following the instructions in :numref:`Section %s <obtaining_code>`,
+and then executing the following steps:

NOTE: Windows users can execute these steps in the terminal application
that was installed as part of Docker Toolbox.

-#. Navigate to the directory.
+#. Navigate to the ``ccpp-scm/docker`` directory.

-#. Run the command to generate the Docker image, using the supplied
+#. Run the ``docker build`` command to generate the Docker image, using the supplied
   Dockerfile.

   .. code:: bash

@@ -844,10 +847,11 @@ Quickstart application installed with Docker Toolbox.

      export OUT_DIR=/path/to/output

For Windows, the format that worked for us followed this example:
+``/c/Users/myusername/path/to/directory/to/mount``

#. To run the SCM, you can run the Docker container that was just
-   created and give it the same run commands as discussed in section
-   `1.4.1 <#subsection: singlerunscript>`__. **Be sure to remember to
+   created and give it the same run commands as discussed in :numref:`Section %s <subsection: singlerunscript>`.
+   **Be sure to remember to
   include the ``-d`` option for all run commands**. For example,

   .. code:: bash

      docker run --rm -it -v ${OUT_DIR}:/home --name run-ccpp-scm ccpp-scm ./run_scm.py -c twpice -d

   will run through the TWPICE case using the default suite and namelist
   and put the output in the shared directory. NOTE: Windows users may
-   need to omit the curly braces around environment variables: use
-   instead of .
For running through all supported cases and suites, use + need to omit the curly braces around environment variables: use ``$OUT_DIR`` + instead of ``${OUT_DIR}``. For running through all supported cases and suites, use .. code:: bash docker run --rm -it -v ${OUT_DIR}:/home --name run-ccpp-scm ccpp-scm ./run_scm.py -m -d - The options included in the above commands are the following: + The options included in the above ``run`` commands are the following: - - removes the container when it exits + - ``−−rm`` removes the container when it exits - - interactive mode with terminal access + - ``-it`` interactive mode with terminal access - - specifies the volume mount from host directory (outside container) + - ``-v`` specifies the volume mount from host directory (outside container) to inside the container. Using volumes allows you to share data between the host machine and container. For running the SCM, the output is being mounted from inside the container to the on the host machine. Upon exiting the container, data mounted to the host machine will still be accessible. - - names the container. If no name is provided, the daemon will + - ``−−name`` names the container. If no name is provided, the daemon will autogenerate a random string name. NOTE: If you are using a prebuilt image from Dockerhub, substitute the name of the image that was pulled from Dockerhub in the commands - above; i.e. instead of above, one would have . + above; i.e. instead of ``ccpp-scm`` above, one would have ``dtcenter/ccpp-scm:v6.0.0``. #. 
To use the SCM interactively, run non-default configurations, create plots, or even develop code, issue the following command: From 0879584de2579bb6cdf90e16503e2fff2765691b Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Tue, 12 Mar 2024 15:09:00 -0600 Subject: [PATCH 12/26] Format "Repository" Chapter, including directory tree with descriptions --- scm/doc/TechGuide/chap_repo.rst | 54 ++++++++++++++++++++++++++++----- 1 file changed, 47 insertions(+), 7 deletions(-) diff --git a/scm/doc/TechGuide/chap_repo.rst b/scm/doc/TechGuide/chap_repo.rst index eb1e7570f..d49b9605a 100644 --- a/scm/doc/TechGuide/chap_repo.rst +++ b/scm/doc/TechGuide/chap_repo.rst @@ -9,14 +9,54 @@ What is included in the repository? The repository contains all code required to build the CCPP SCM and scripts that can be used to obtain data to run it (e.g. downloading large initialization tables for the Thompson microphysics schemes -discussed in subsection -`[subsection: singlerunscript] <#subsection: singlerunscript>`__ and +discussed in :numref:`Subsection %s ` and processed case data). It is functionally separated into 3 subdirectories -representing the SCM model infrastructure ( directory), the CCPP -infrastructure ( directory), and the CCPP physics schemes ( directory). -The entire repository resides on Github’s NCAR space, and the and -directories are git submodules that point to repositories and on the +representing the SCM model infrastructure (``scm`` directory), the CCPP +infrastructure (``ccpp/framework`` directory), and the CCPP physics schemes +(``ccpp/physics`` directory). The entire ``ccpp-scm`` repository resides on +Github’s NCAR space, and the ``ccpp/framework`` and ``ccpp/physics`` directories +are git submodules that point to repositories ``ccpp-framework`` and ``ccpp-physics`` on the same space. The structure of the entire repository is represented below. 
-Note that the repository also contains files needed for using the CCPP +Note that the ``ccpp-physics`` repository also contains files needed for using the CCPP with the UFS Atmosphere host model that uses the Finite-Volume Cubed-Sphere (FV3) dynamical core. + +| ``ccpp-scm/`` +| ``├── CMakeModules`` +| ``├── CODEOWNERS`` - list of code maintainers/developers who are automatically assigned to review Pull Requests on GitHub +| ``├── LICENSE`` +| ``├── README.md`` +| ``├── ccpp`` - contains the CCPP prebuild configuration file +| ``│   ├── config`` +| ``│   ├── framework`` - Contains CCPP framework submodule. See https://github.com/NCAR/ccpp-framework for contents +| ``│   ├── physics`` - Contains CCPP physics submodule. See https://github.com/NCAR/ccpp-physics for contents +| ``│   ├── physics_namelists`` - contains physics namelist files associated with suites +| ``│   └── suites`` - contains suite definition files +| ``├── contrib`` +| ``│   ├── get_all_static_data.sh`` - script for downloading/extracting the processed SCM case data +| ``│   ├── get_mg_inccn_data.sh`` - script for downloading/extracting the Morrison-Gettelman data +| ``│   └── get_thompson_tables.sh`` - script for downloading/extracting the Thompson lookup tables +| ``├── docker`` +| ``│   └── Dockerfile`` - contains Docker instructions for building the CCPP SCM image +| ``├── environment-suite-sim.yml`` - Python environment dependency file for the CCPP Suite Simulator +| ``├── environment-ufsreplay.yml`` - Python environment dependency file for the UFS Replay capability +| ``├── environment.yml`` - Python environment dependency file for the SCM +| ``├── scm`` +| ``│   ├── LICENSE.txt`` - Contains licensing information +| ``│   ├── data`` - Directory where data is staged by scripts in the ``ccpp/contrib/`` directory +| ``│   │   └── vert_coord_data`` - contains data to calculate vertical coordinates (from GSM-based GFS only) +| ``│   ├── doc`` +| ``│   │   └── TechGuide`` - Contains source code 
and other files for this User’s/Technical Guide +| ``│   ├── etc`` - contains case configuration, machine setup scripts, and plotting scripts +| ``│   │   ├── CENTOS_docker_setup.sh`` - contains machine setup for Docker container +| ``│   │   ├── case_config`` - contains case configuration files +| ``│   │   ├── modules`` - contains module files for loading build environments on both pre-configured and custom platforms +| ``│   │   ├── scm_qsub_example.py`` - example ``qsub`` (LSF) run script +| ``│   │   ├── scm_slurm_example.py`` - example ``srun`` (SLURM) run script +| ``│   │   ├── scripts`` - Python scripts for setting up cases, plotting, and the CCPP Suite Simulator +| ``│   │   │   ├── ccpp_suite_sim`` - Python scripts for the CCPP Suite Simulator +| ``│   │   │   ├── plot_configs`` - plot configuration files +| ``│   │   └── tracer_config`` - tracer configuration files +| ``│   └── src`` - source code for SCM infrastructure, Python run script, CMakeLists.txt for the SCM, example multirun setup files, suite_info.py +| ``└── test`` - Contains scripts for regression testing, Continuous Integration tests + From 36fc7e5b80bf108cddce36a3752a046bc7754613 Mon Sep 17 00:00:00 2001 From: "Michael Kavulich, Jr" Date: Tue, 12 Mar 2024 15:23:23 -0600 Subject: [PATCH 13/26] Format "Algorithm" chapter --- scm/doc/TechGuide/chap_function.rst | 68 ++++++++++++++--------------- 1 file changed, 34 insertions(+), 34 deletions(-) diff --git a/scm/doc/TechGuide/chap_function.rst b/scm/doc/TechGuide/chap_function.rst index 8560a95b1..ab80094bc 100644 --- a/scm/doc/TechGuide/chap_function.rst +++ b/scm/doc/TechGuide/chap_function.rst @@ -29,21 +29,21 @@ Reading input The following steps are performed at the beginning of program execution: -#. Call in the module to read in the ` <#subsection: case config>`__ and - ` <#subsection: physics config>`__ namelists. This subroutine also - sets some variables within the derived type from the data that was +#. 
Call ``get_config_nml()`` in the ``scm_input`` module to read in the ``case_config`` and + ``physics_config`` namelists. This subroutine also + sets some variables within the ``scm_state`` derived type from the data that was read. -#. Call (or if using the DEPHY format) in the module to read in the - `case input data file <#subsection: case input>`__. This subroutine - also sets some variables within the derived type from the data that +#. Call ``get_case_init()`` (or ``get_case_init_DEPHY()`` if using the DEPHY format) in the ``scm_input`` module to read in the + :numref:`case input data file `. This subroutine + also sets some variables within the ``scm_input`` derived type from the data that was read. -#. Call in the module to read in the reference profile data. This - subroutine also sets some variables within the derived type from the +#. Call ``get_reference_profile()`` in the ``scm_input`` module to read in the reference profile data. This + subroutine also sets some variables within the ``scm_reference`` derived type from the data that was read. At this time, there is no “standard” format for - the reference profile data file. There is a statement within the - subroutine that reads in differently-formatted data. If adding a new + the reference profile data file. There is a ``select case`` statement within the + ``get_reference_profile()`` subroutine that reads in differently-formatted data. If adding a new reference profile, it will be required to add a section that reads its data in this subroutine. @@ -53,18 +53,18 @@ Setting up vertical grid and interpolating input data The CCPP SCM uses pressure for the vertical coordinate (lowest index is the surface). The pressure levels are calculated using the surface pressure and coefficients (:math:`a_k` and :math:`b_k`), which are taken -directly from FV3 code (). For vertical grid options, inspect for valid -values of . The default vertical coordinate uses 127 levels and sets to -the empty string. 
Alternatively, one can specify the (:math:`a_k` and
-:math:`b_k`) coefficients via an external file in the directory and pass
-it in to the SCM via the argument of the run script. If changing the
-number of vertical levels or the algorithm via the or run script
-arguments, be sure to check and that the vertical coordinate is as
-inteneded.
+directly from FV3 code (``fv_eta.h``). For vertical grid options, inspect ``scm/src/scm_vgrid.F90`` for valid
+values of ``npz_type``. The default vertical coordinate uses 127 levels and sets ``npz_type`` to
+an empty string. Alternatively, one can specify the (:math:`a_k` and
+:math:`b_k`) coefficients via an external file in the ``scm/data/vert_coord_data`` directory and pass
+it in to the SCM via the ``--vert_coord_file`` argument of the run script. If changing the
+number of vertical levels or the algorithm via the ``--levels`` or ``--npz_type`` run script
+arguments, be sure to check ``scm/src/scm_vgrid.F90`` and ``fv_eta.h`` to confirm that the vertical coordinate is as
+intended.
 
 After the vertical grid has been set up, the state variable profiles
-stored in the derived data type are interpolated from the input and
-reference profiles in the subroutine of the module.
+stored in the ``scm_state`` derived data type are interpolated from the input and
+reference profiles in the ``set_state`` subroutine of the ``scm_setup`` module.
 
 .. _`section: physics init`:
 
@@ -86,10 +86,10 @@ process:
    initialization that is not already performed within CCPP-compliant
    scheme initialization routines.
 
-#. Associate the variables with the appropriate pointers in the derived
+#. Associate the ``scm_state`` variables with the appropriate pointers in the derived
    data type.
 
-#. Call with the derived data type as input. This call executes the
+#. Call ``ccpp_physics_init`` with the ``cdata`` derived data type as input. This call executes the
    initialization stages of all schemes, groups, and suites that are
    defined in the suite definition file. 
@@ -107,22 +107,22 @@ During each step of the time integration, the following sequence
 occurs:
 #. Calculate the current date and time given the initial date and time
    and the elapsed time.
 
-#. Call the subroutine in the module to interpolate the forcing data in
+#. Call the ``interpolate_forcing()`` subroutine in the ``scm_forcing`` module to interpolate the forcing data in
    space and time.
 
 #. Recalculate the pressure variables (pressure, Exner function,
    geopotential) in case the surface pressure has changed.
 
-#. Call in the module. Within this subroutine:
+#. Call ``do_time_step()`` in the ``scm_time_integration`` module. Within this subroutine:
 
-   - Call the appropriate subroutine from the module.
+   - Call the appropriate ``apply_forcing_*`` subroutine from the ``scm_forcing`` module.
 
-   - For each column, call to call all physics schemes within the suite
+   - For each column, call ``ccpp_physics_run()`` to call all physics schemes within the suite
      (this assumes that all suite parts are called sequentially without
      intervening code execution)
 
 #. Check to see if output should be written during the current time step
-   and call in the module if necessary.
+   and call ``output_append()`` in the module if necessary.
 
 Writing output
 --------------
@@ -134,24 +134,24 @@ can be written out as instantaneous or time-averaged and there are 5
 output periods:
 
 #. one associated with how often instantaneous variables should be
-   written out (controlled by the run script variable).
+   written out (controlled by the ``--n_itt_out`` run script variable).
 
 #. one associated with how often diagnostic (either instantaneous or
-   time-averaged) should be written out (controlled by the run script
+   time-averaged) should be written out (controlled by the ``--n_itt_diag`` run script
    variable)
 
-#. one associated with the shortwave radiation period (controlled by
+#. one associated with the shortwave radiation period (controlled by the ``fhswr``
    variable in the physics namelist)
 
-#. 
one associated with the longwave radiation period (controlled by the
+#. one associated with the longwave radiation period (controlled by the
    ``fhlwr`` variable in the physics namelist)
 
 #. one associated with the minimum of the shortwave and longwave
    radiation intervals (for writing output if any radiation is called)
 
 Further, which variables are output and on each interval are controlled
-via the source file. Of course, any changes to this file must result in
+via the ``scm/src/scm_output.F90`` source file. Of course, any changes to this file require
 a recompilation to take effect. There are several subroutines for
-initializing the output file () and for appending to it () that are
+initializing the output file (``output_init_*``) and for appending to it (``output_append_*``) that are
 organized according to their membership in physics derived data types.
-See the source file to understand how to change output variables.
+See the ``scm/src/scm_output.F90`` source file to understand how to change output variables.

From 5ae498236bf09a097d68e795964f3cddf900e8ee Mon Sep 17 00:00:00 2001
From: "Michael Kavulich, Jr"
Date: Wed, 13 Mar 2024 20:46:43 -0600
Subject: [PATCH 14/26] Format "Cases" and "CCPP Interface" chapters

---
 scm/doc/TechGuide/chap_cases.rst | 218 ++++++++++++++++---------------
 scm/doc/TechGuide/chap_ccpp.rst  | 120 +++++++++--------
 2 files changed, 172 insertions(+), 166 deletions(-)

diff --git a/scm/doc/TechGuide/chap_cases.rst b/scm/doc/TechGuide/chap_cases.rst
index 55c1bd6fb..3aa2b880a 100644
--- a/scm/doc/TechGuide/chap_cases.rst
+++ b/scm/doc/TechGuide/chap_cases.rst
@@ -7,7 +7,7 @@ How to run cases
 ----------------
 
 Only two files are needed to set up and run a case with the SCM. The
-first is a configuration namelist file found in that contains parameters
+first is a configuration namelist file found in ``ccpp-scm/scm/etc/case_config`` that contains parameters
 for the SCM infrastructure. 
The second necessary file is a NetCDF file containing data to initialize the column state and time-dependent data to force the column state. The two files are described below. @@ -17,22 +17,22 @@ to force the column state. The two files are described below. Case configuration namelist parameters ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -The namelist expects the following parameters: +The ``case_config`` namelist expects the following parameters: -- +- ``case_name`` - Identifier for which dataset (initialization and forcing) to load. This string must correspond to a dataset included in the directory - (without the file extension). + ``ccpp-scm/scm/data/processed_case_input/`` (without the file extension). -- +- ``runtime`` - Specify the model runtime in seconds (integer). This should correspond with the forcing dataset used. If a runtime is specified that is longer than the supplied forcing, the forcing is held constant at the last specified values. -- +- ``thermo_forcing_type`` - An integer representing how forcing for temperature and moisture state variables is applied (1 :math:`=` total advective @@ -40,7 +40,7 @@ The namelist expects the following parameters: prescribed vertical motion, 3 :math:`=` relaxation to observed profiles with vertical motion prescribed) -- +- ``mom_forcing_type`` - An integer representing how forcing for horizontal momentum state variables is applied (1 :math:`=` total advective tendencies; not @@ -48,52 +48,52 @@ The namelist expects the following parameters: prescribed vertical motion, 3 :math:`=` relaxation to observed profiles with vertical motion prescribed) -- +- ``relax_time`` - A floating point number representing the timescale in seconds for - the relaxation forcing (only used if :math:`=` or :math:`=` ) + the relaxation forcing (only used if ``thermo_forcing_type = 3`` or ``mom_forcing_type = 3``) -- +- ``sfc_flux_spec`` - - A boolean set to if surface flux are specified from the forcing + - A boolean set to ``.true.`` if 
surface fluxes are specified from the forcing
      data (there is no need to have surface schemes in a suite
      definition file if so)
 
--
+- ``sfc_roughness_length_cm``
 
    - Surface roughness length in cm for calculating surface-related
-     fields from specified surface fluxes (only used if is True).
+     fields from specified surface fluxes (only used if ``sfc_flux_spec`` is ``.true.``).
 
--
+- ``sfc_type``
 
    - An integer representing the character of the surface (0 :math:`=`
      sea surface, 1 :math:`=` land surface, 2 :math:`=` sea-ice surface)
 
--
+- ``reference_profile_choice``
 
    - An integer representing the choice of reference profile to use
      above the supplied initialization and forcing data (1 :math:`=`
      “McClatchey” profile, 2 :math:`=` mid-latitude summer standard
      atmosphere)
 
--
+- ``year``
 
    - An integer representing the year of the initialization time
 
--
+- ``month``
 
    - An integer representing the month of the initialization time
 
--
+- ``day``
 
    - An integer representing the day of the initialization time
 
--
+- ``hour``
 
    - An integer representing the hour of the initialization time
 
--
+- ``column_area``
 
    - A list of floating point values representing the characteristic
      horizontal domain area of each atmospheric column in square meters
@@ -102,29 +102,29 @@
      values are used in scale-aware schemes; if using multiple columns,
      you may specify an equal number of column areas)
 
--
+- ``model_ics``
 
-   - A boolean set to if UFS atmosphere initial conditions are used
+   - A boolean set to ``.true.`` if UFS atmosphere initial conditions are used
      rather than field campaign-based initial conditions
 
--
+- ``C_RES``
 
    - An integer representing the grid size of the UFS atmosphere
      initial conditions; the integer represents the number of grid
      points in each horizontal direction of each cube tile
 
--
+- ``input_type``
 
    - 0 => original DTC format, 1 => DEPHY-SCM format.
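+
+As a concrete illustration, a minimal ``case_config`` namelist might look
+like the following sketch (written in standard Fortran namelist syntax; the
+values shown are hypothetical, so consult the files in
+``ccpp-scm/scm/etc/case_config`` for working examples and the exact
+conventions each case uses):
+
+.. code:: fortran
+
+   &case_config
+     case_name = 'twpice',
+     runtime = 86400,
+     thermo_forcing_type = 2,
+     mom_forcing_type = 3,
+     relax_time = 7200.0,
+     sfc_flux_spec = .false.,
+     sfc_type = 0,
+     reference_profile_choice = 1,
+     year = 2006,
+     month = 1,
+     day = 19,
+     hour = 3,
+     column_area = 2.0E9,
+   /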
Optional variables (that may be overridden via run script command
 line arguments) are:
 
--
+- ``vert_coord_file``
 
    - File containing FV3 vertical grid coefficients.
 
--
+- ``n_levels``
 
    - Specify the integer number of vertical levels.
 
@@ -134,15 +134,15 @@
 Case input data file (CCPP-SCM format)
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 The initialization and forcing data for each case is stored in a NetCDF
-(version 4) file within the directory. Each file has at least two
-dimensions ( and , potentially with additions for vertical snow and soil
+(version 4) file within the ``ccpp-scm/scm/data/processed_case_input`` directory. Each file has at least two
+dimensions (``time`` and ``levels``, potentially with additions for vertical snow and soil
 levels) and is organized into 3 groups: scalars, initial, and forcing.
-Not all fields are required for all cases. For example the fields and
-are only needed if the variable :math:`=` in the case configuration file
-and state nudging variables are only required if :math:`=` or :math:`=`
+Not all fields are required for all cases. For example, the fields ``sh_flux_sfc`` and ``lh_flux_sfc``
+are only needed if the variable ``sfc_flux_spec = .true.`` in the case configuration file
+and state nudging variables are only required if ``thermo_forcing_type = 3`` or ``mom_forcing_type = 3``
 . Using an active LSM (Noah, NoahMP, RUC) requires many more variables
 than are listed here. Example files for using with Noah and NoahMP LSMs
-are included in .
+are included in ``ccpp-scm/scm/data/processed_case_input/fv3_model_point_noah[mp].nc``.
 
 .. _`subsection: case input dephy`:
 
@@ -168,9 +168,9 @@ to the SCM. For example:
 
    cd [...]/ccpp-scm/scm/bin
    ./run_scm.py -c MAGIC_LEG04A --case_data_dir [...]/ccpp-scm/scm/data/DEPHY-SCM/MAGIC/LEG04A -v
 
-Each DEPHY file has three dimensions (, , ) and contains the initial
-conditions (, ) and forcing data (, ). 
Just as when using the CCPP-SCM -formatted inputs, `1.1.2 <#subsection: case input>`__, not all fields +Each DEPHY file has three dimensions (``time``, ``t0``, ``levels``) and contains the initial +conditions (``t0``, ``levels``) and forcing data (``time``, ``levels``). Just as when using the CCPP-SCM +formatted inputs, :numref:`Subsection %s `, not all fields are required for all cases. More information on the DEPHY format requirements can be found at `DEPHY `__. @@ -184,7 +184,7 @@ Included Cases Several cases are included in the repository to serve as examples for users to create their own and for basic research. All case configuration -namelist files for included cases can be found in and represent the +namelist files for included cases can be found in ``ccpp-scm/scm/etc/case_config`` and represent the following observational field campaigns: - Tropical Warm Pool – International Cloud Experiment (TWP-ICE) @@ -201,7 +201,7 @@ following observational field campaigns: - Large eddy simulation ARM Symbiotic Simulation and Observation (LASSO) for May 18, 2016 (with capability to run all LASSO dates - - see `1.4 <#sec:lasso>`__) continental shallow convection + see :numref:`Section %s `) continental shallow convection For the ARM SGP case, several case configuration files representing different time periods of the observational dataset are included, @@ -211,12 +211,12 @@ these different forcing are included. In addition, two example cases are included for using UFS Atmosphere initial conditions: - UFS initial conditions for 38.1 N, 98.5 W (central Kansas) for 00Z on - Oct. 3, 2016 with Noah variables on the C96 FV3 grid () + Oct. 3, 2016 with Noah variables on the C96 FV3 grid (``fv3_model_point_noah.nc``) - UFS initial conditions for 38.1 N, 98.5 W (central Kansas) for 00Z on - Oct. 3, 2016 with NoahMP variables on the C96 FV3 grid () + Oct. 
3, 2016 with NoahMP variables on the C96 FV3 grid (``fv3_model_point_noahmp.nc``) -See `1.5 <#sec:UFSreplay>`__ for information on how to generate these +See :numref:`Section %s ` for information on how to generate these files for other locations and dates, given appropriate UFS Atmosphere initial conditions and output. @@ -227,11 +227,11 @@ Setting up a new case involves preparing the two types of files listed above. For the case initialization and forcing data file, this typically involves writing a custom script or program to parse the data from its original format to the format that the SCM expects, listed above. An -example of this type of script written in Python is included in . The +example of this type of script written in Python is included in ``ccpp-scm/scm/etc/scripts/twpice_forcing_file_generator.py``. The script reads in the data as supplied from its source, converts any necessary variables, and writes a NetCDF (version 4) file in the format -described in subsections `1.1.2 <#subsection: case input>`__ and -`1.1.3 <#subsection: case input dephy>`__. For reference, the following +described in subsections :numref:`Subsection %s ` and +:numref:`Subsection %s `. For reference, the following formulas are used: .. math:: \theta_{il} = \theta - \frac{\theta}{T}\left(\frac{L_v}{c_p}q_l + \frac{L_s}{c_p}q_i\right) @@ -276,47 +276,47 @@ s\ :math:`^{-1}`), and profiles of u, v, T, :math:`\theta_{il}` and Although it is expected that all variables are in the NetCDF file, only those that are used with the chosen forcing method are required to be nonzero. 
For example, the following variables are required depending on -the values of and specified in the case configuration file: +the values of ``mom_forcing_type`` and ``thermo_forcing_type`` specified in the case configuration file: -- :math:`=` +- ``mom_forcing_type = 1`` - Not implemented yet -- :math:`=` +- ``mom_forcing_type = 2`` - geostrophic winds and large scale vertical velocity -- :math:`=` +- ``mom_forcing_type = 3`` - u and v nudging profiles -- :math:`=` +- ``thermo_forcing_type = 1`` - horizontal and vertical advective tendencies of :math:`\theta_{il}` and :math:`q_t` and prescribed radiative heating (can be zero if radiation scheme is active) -- :math:`=` +- ``thermo_forcing_type = 2`` - horizontal advective tendencies of :math:`\theta_{il}` and :math:`q_t`, prescribed radiative heating (can be zero if radiation scheme is active), and the large scale vertical pressure velocity -- :math:`=` +- ``thermo_forcing_type = 3`` - :math:`\theta_{il}` and :math:`q_t` nudging profiles and the large scale vertical pressure velocity For the case configuration file, it is most efficient to copy an -existing file in and edit it to suit one’s case. Recall from subsection -`1.1.1 <#subsection: case config>`__ that this file is used to configure +existing file in ``ccpp-scm/scm/etc/case_config`` and edit it to suit one’s case. Recall from subsection +:numref:`Subsection %s ` that this file is used to configure the SCM framework parameters for a given case. Be sure to check that model timing parameters such as the time step and output frequency are appropriate for the physics suite being used. There is likely some stability criterion that governs the maximum time step based on the chosen parameterizations and number of vertical levels (grid spacing). -The parameter should match the name of the case input data file that was +The ``case_name`` parameter should match the name of the case input data file that was configured for the case (without the file extension). 
The parameter
 should be less than or equal to the length of the forcing data unless
 the desired behavior of the simulation is to proceed with the last
@@ -326,12 +326,12 @@ period specified in the case input data file. If the case input data is
 specified to a lower altitude than the vertical domain, the remainder of
 the column will be filled in with values from a reference profile. There
 is a tropical profile and mid-latitude summer profile provided, although
-one may add more choices by adding a data file to and adding a parser
-section to the subroutine in . Surface fluxes can either be specified in
+one may add more choices by adding a data file to ``ccpp-scm/scm/data/processed_case_input`` and adding a parser
+section to the subroutine ``get_reference_profile`` in ``scm/src/scm_input.F90``. Surface fluxes can either be specified in
 the case input data file or calculated using a surface scheme using
-surface properties. If surface fluxes are specified from data, set to
-and specify for the surface over which the column resides. Otherwise,
-specify a . In addition, one must specify a for each column.
+surface properties. If surface fluxes are specified from data, set ``sfc_flux_spec`` to ``.true.``
+and specify ``sfc_roughness_length_cm`` for the surface over which the column resides. Otherwise,
+specify a ``sfc_type``. In addition, one must specify a ``column_area`` for each column.
 
 To control the forcing method, one must choose how the momentum and
 scalar variable forcing are applied. The three methods of Randall and
@@ -343,7 +343,7 @@ prescribed vertical velocity and the calculated (modeled) profiles (type
 2), and “relaxation forcing” where nudging to observed profiles replaces
 horizontal advective forcing combined with vertical advective forcing
 from prescribed vertical velocity (type 3). 
If relaxation forcing is -chosen, a that represents the timescale over which the profile would +chosen, a ``relax_time`` that represents the timescale over which the profile would return to the nudging profiles must be specified. .. _`sec:lasso`: @@ -366,18 +366,18 @@ following steps: desired simulations have been checked, order the data (you may need to create an ARM account to do so). -#. Once the data is downloaded, decompress it. From the directory, copy - the files , , and into their own directory under . +#. Once the data is downloaded, decompress it. From the ``config`` directory, copy + the files ``input_ls_forcing.nc``, ``input_sfc_forcing.nc``, and ``wrfinput_d01.nc`` into their own directory under ``ccpp-scm/scm/data/raw_case_input/``. -#. Modify to point to the input files listed above. Execute the script - in order to generate a case input file for the SCM (to be put in ): +#. Modify ``ccpp-scm/scm/etc/scripts/lasso1_forcing_file_generator_gjf.py`` to point to the input files listed above. Execute the script + in order to generate a case input file for the SCM (to be put in ``ccpp-scm/scm/data/processed_case_input/``): .. code:: bash ./lasso1_forcing_file_generator_gjf.py #. Create a new case configuration file (or copy and modify an existing - one) in . Be sure that the variable points to the newly + one) in ``ccpp-scm/scm/etc/case_config``. Be sure that the ``case_name`` variable points to the newly created/processed case input file from above. .. _`sec:UFSreplay`: @@ -392,21 +392,29 @@ Python Dependencies The scripts here require a few python packages that may not be found by default in all python installations. There is a YAML file with the -python environment needed to run the script in . To create and activate +python environment needed to run the script in ``ccpp-scm/environment-ufsreplay.yml``. To create and activate this environment using conda: Create environment (only once): -This will create the conda environment +.. 
code:: bash + + > conda env create -f environment-ufsreplay.yml + +This will create the conda environment ``env_ufsreplay`` Activate environment: +.. code:: bash + + > conda activate env_ufsreplay + .. _`subsection: ufsicgenerator`: UFS_IC_generator.py ~~~~~~~~~~~~~~~~~~~ -A script exists in to read in UFS history (output) files and their +A script exists in ``scm/etc/scripts/UFS_IC_generator.py`` to read in UFS history (output) files and their initial conditions to generate a SCM case input data file, in DEPHY format. @@ -419,49 +427,49 @@ format. Mandatory arguments: -#. OR : Either longitude and latitude in decimal degrees east and north +#. ``--location (-l)`` OR ``--index (-ij)``: Either longitude and latitude in decimal degrees east and north of a location OR the UFS grid index with the tile number - -l 261.51 38.2 (two floating point values separated by a space) - -ij 8 49 (two integer values separated by a space; this option - must also use the argument to specify the tile number) + must also use the ``--tile (-t)`` argument to specify the tile number) -#. YYYYMMDDHHMMSS: date corresponding to the UFS initial conditions +#. ``--date (-d)`` YYYYMMDDHHMMSS: date corresponding to the UFS initial conditions -#. : path to the directory containing the UFS initial conditions +#. ``--in_dir (-i)``: path to the directory containing the UFS initial conditions -#. : path to the directory containing the UFS supergrid files (AKA "fix" +#. ``--grid_dir (-g)``: path to the directory containing the UFS supergrid files (AKA "fix" directory) -#. : path to the directory containing the UFS history files +#. ``--forcing_dir (-f)``: path to the directory containing the UFS history files -#. : name of case +#. ``--case_name (-n)``: name of case Optional arguments: -#. : if one already knows the correct tile for the given longitude and - latitude OR one is specifying the UFS grid index ( argument) +#. 
``--tile (-t)``: if one already knows the correct tile for the given longitude and + latitude OR one is specifying the UFS grid index (``--index`` argument) -#. : area of grid cell in :math:`m^2` (if known or different than the +#. ``--area (-a)``: area of grid cell in :math:`m^2` (if known or different than the value calculated from the supergrid file) -#. : flag if UFS initial conditions were generated using older version +#. ``--old_chgres (-oc)``: flag if UFS initial conditions were generated using older version of chgres (global_chgres); might be the case for pre-2018 data -#. : flag to signal that the ICs and forcing is from a limited-area +#. ``--lam (-lam)``: flag to signal that the ICs and forcing is from a limited-area model run -#. : flag to create UFS reference file for comparison +#. ``--save_comp (-sc)``: flag to create UFS reference file for comparison -#. : flag to indicate using the nearest UFS history file gridpoint +#. ``--use_nearest (-near)``: flag to indicate using the nearest UFS history file gridpoint .. _`subsection: ufsforcingensemblegenerator`: UFS_forcing_ensemble_generator.py ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -There is an additional script in to create UFS-replay case(s) starting +There is an additional script in ``scm/etc/scripts/UFS_forcing_ensemble_generator.py`` to create UFS-replay case(s) starting with output from UFS Weather Model (UWM) Regression Tests (RTs). .. code:: bash @@ -473,44 +481,44 @@ with output from UFS Weather Model (UWM) Regression Tests (RTs). Mandatory arguments: -#. : path to UFS Regression Test output +#. ``--dir (-d)``: path to UFS Regression Test output -#. : name of cases +#. ``--case_name (-n)``: name of cases #. 
Either: (see examples below)
 
-   - AND AND : longitude range, latitude range, and number of cases to
+   - ``--lon_limits (-lonl)`` AND ``--lat_limits (-latl)`` AND ``--nensmembers (-nens)``: longitude range, latitude range, and number of cases to
      create
 
-   - AND : longitude and latitude of cases
+   - ``--lon_list (-lons)`` AND ``--lat_list (-lats)``: longitude and latitude of cases
 
 Optional arguments:
 
-#. : SCM timestep, in seconds
+#. ``--timestep (-dt)``: SCM timestep, in seconds
 
-#. : UFS spatial resolution
+#. ``--C_res (-cres)``: UFS spatial resolution
 
-#. : CCPP suite definition file to use for ensemble
+#. ``--suite (-sdf)``: CCPP suite definition file to use for ensemble
 
-#. : flag to create UFS reference file for comparison
+#. ``--save_comp (-sc)``: flag to create UFS reference file for comparison
 
-#. : flag to indicate using the nearest UFS history file gridpoint
+#. ``--use_nearest (-near)``: flag to indicate using the nearest UFS history file gridpoint
 
-Examples to run from within the directory to create SCM cases starting
+Examples to run from within the ``scm/etc/scripts`` directory to create SCM cases starting
 with the output from a UFS Weather Model regression test(s):
 
 On the supported platforms Cheyenne (NCAR) and Hera (NOAA), there are
 staged UWM RTs located at:
 
--
--
+- Cheyenne ``/glade/scratch/epicufsrt/GMTB/CCPP-SCM/UFS_RTs``
+- Hera ``/scratch1/BMC/gmtb/CCPP-SCM/UFS_RTs``
 
 .. _`subsection: example1`:
 
 Example 1: UFS-replay for single point
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-UFS regression test, , for single point.
+UFS regression test, ``control_c192``, for single point.
 
 .. code:: bash
 
@@ -523,14 +531,14 @@ will print to the screen. For example,
 
    ./run_scm.py --npz_type gfs --file scm_ufsens_control_c192.py --timestep 360
 
-The file is created in , where the SCM run script is to be exectued.
+The file ``scm_ufsens_control_c192.py`` is created in ``ccpp-scm/scm/bin/``, where the SCM run script is to be executed.
 
.. 
_`subsection: example2`: Example 2: UFS-replay for list of points ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -UFS regression test, , for multiple points. +UFS regression test, ``control_c384``, for multiple points. .. code:: bash @@ -543,8 +551,8 @@ will print to the screen. For example, ./run_scm.py --npz_type gfs --file scm_ufsens_control_c384.py --timestep 225 -The file contains of the cases created. Each case created will have the -naming convention , where the suffix is the case number from 0 to the +The file ``scm_ufsens_control_c384.py`` contains ALL of the cases created. Each case created will have the +naming convention ``case_name_nXXX``, where the suffix ``XXX`` is the case number from 0 to the number of points provided. The contents of the file should look like: .. code:: bash @@ -559,21 +567,21 @@ number of points provided. The contents of the file should look like: Example 3: UFS-replay for an ensemble of points ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -UFS regression test, , for an ensemble (10) of randomly selected points +UFS regression test, ``control_p8``, for an ensemble (10) of randomly selected points over a specified longitude (:math:`300-320^oW`) and latitude (:math:`40-50^oN`) range -But first, to use the test we need to rerun the regression test to +But first, to use the ``control_p8`` test we need to rerun the regression test to generate UFS history files with a denser and constant output interval. -First, in , change to , where is the UFS history file output frequency +First, in ``control_p8/model_configure``, change ``--output_fh`` to ``"interval -1"``, where ``interval`` is the UFS history file output frequency (in hours), see `UFS Weather Model Users Guide `__ for more details. 
-For the purposes of this example the test has already been rerun, but if +For the purposes of this example the ``control_p8`` test has already been rerun, but if starting from your own UWM RTs, you can rerun the UWM regression test, on Cheyenne for example, by running the following command in the RT -directory: +directory: ``qsub job_card`` Now the cases can be generated with the following command: @@ -588,7 +596,7 @@ will print to the screen. For example, ./run_scm.py --npz_type gfs --file scm_ufsens_control_p8.py --timestep 720 -The file contains ten cases (n000-n009) to be run. The contents of the +The file ``scm_ufsens_control_p8.py`` contains ten cases (n000-n009) to be run. The contents of the file should look like: .. code:: bash diff --git a/scm/doc/TechGuide/chap_ccpp.rst b/scm/doc/TechGuide/chap_ccpp.rst index 0964b48f5..37b560733 100644 --- a/scm/doc/TechGuide/chap_ccpp.rst +++ b/scm/doc/TechGuide/chap_ccpp.rst @@ -15,7 +15,7 @@ Setting up a suite Setting up a physics suite for use in the CCPP SCM with the CCPP framework involves three steps: preparing data to be made available to -physics through the CCPP, running the script to reconcile SCM-provided +physics through the CCPP, running the ``ccpp_prebuild.py`` script to reconcile SCM-provided variables with physics-required variables, and preparing a suite definition file. @@ -29,36 +29,35 @@ passed into and out of the schemes within the physics suite. As of this release, in practice this means that a host model must do this for all variables needed by all physics schemes that are expected to be used with the host model. For this SCM, all variables needed by the physics -schemes are allocated and documented in the file and are contained -within the derived data type. This derived data type initializes its -component variables in a type-bound procedure. 
As mentioned in section
 6.2 of the `CCPP Technical
 Documentation `__, files
 containing all required metadata was constructed for describing all
-variables in the derived data type. These files are , and . Further,
+variables in the derived data type. These files are ``scm/src/GFS_typedefs.meta``, ``scm/src/CCPP_typedefs.meta``, and ``scm_physical_constants.meta``. Further,
 ``scm_type_defs.meta`` exists to provide metadata for derived data type
 definitions and their instances, which is needed by the ccpp-framework
 to properly reference the data. The standard names of all variables in
 this table must match with a corresponding variable within one or more
 of the physics schemes.
 
-A list of all standard names used can be found in .
+A list of all standard names used can be found in ``ccpp/framework/doc/DevelopersGuide/CCPP_VARIABLES_SCM.pdf``.
 
-Editing and running
-~~~~~~~~~~~~~~~~~~~~
+Editing and running ``ccpp_prebuild.py``
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
-General instructions for configuring and running the script can be found
+General instructions for configuring and running the ``ccpp_prebuild.py`` script can be found
 in chapter 8 of the `CCPP Technical
 Documentation `__. The script expects to be run with a
 host-model-dependent configuration file,
-passed as argument . Within this configuration file are variables that
+passed as argument ``--config=path_to_config_file``. 
Within this configuration file are variables that hold paths to the variable definition files (where metadata tables can be found on the host model side), the scheme files (a list of paths to all source files containing scheme entry points), the auto-generated physics schemes makefile snippet, the auto-generated physics scheme caps makefile snippet, and the directory where the auto-generated physics -caps should be written out to. As mentioned in section -`[section: compiling] <#section: compiling>`__, this script must be run +caps should be written out to. As mentioned in :numref:`Section %s `, this script must be run to reconcile data provided by the SCM with data required by the physics -schemes before compilation – this is done automatically by . +schemes before compilation – this is done automatically by ``cmake``. Preparing a suite definition file ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -80,8 +79,8 @@ physical parameterizations and the interstitial processing are listed explicitly. For this release, supported suite definition files used with this SCM -are found in and have default namelist, tracer configuration, and -timesteps attached in . For all of these suites, the physics schemes +are found in ``ccpp-scm/ccpp/suites`` and have default namelist, tracer configuration, and +timesteps attached in ``ccpp-scm/scm/src/suite_info.py``. For all of these suites, the physics schemes have been organized into 3 groupings following how the physics are called in the UFS Atmosphere model, although no code is executed in the SCM time loop between execution of the grouped schemes. Several @@ -95,8 +94,7 @@ Initializing/running a suite The process for initializing and running a suite in this SCM is described in sections -`[section: physics init] <#section: physics init>`__ and -`[section: time integration] <#section: time integration>`__, +:numref:`%s ` and :numref:`%s