From 0aed4f44624bc9f6169925f33762a992d0d4fedb Mon Sep 17 00:00:00 2001
From: tomvothecoder
Date: Tue, 28 Jan 2025 14:17:46 -0600
Subject: [PATCH 1/6] Update mamba references to conda

- Add `sphinx-copybutton` extension to add a copy button to code blocks
- Add `docs-versioned` command to `Makefile`
---
 Makefile                |  7 ++++
 conda-env/ci.yml        |  1 +
 conda-env/dev.yml       |  1 +
 docs/source/conf.py     |  2 +-
 docs/source/install.rst | 89 +++++++++++++----------------------------
 pyproject.toml          |  2 +-
 6 files changed, 38 insertions(+), 64 deletions(-)

diff --git a/Makefile b/Makefile
index 0ea712d9e..c39785f0e 100644
--- a/Makefile
+++ b/Makefile
@@ -88,6 +88,13 @@ docs: ## generate Sphinx HTML documentation, including API docs
 	$(MAKE) -C docs html
 	$(BROWSER) docs/_build/html/index.html

+docs-versioned: ## generate versioned Sphinx HTML documentation, including API docs
+	rm -rf docs/generated
+	cd docs && sphinx-multiversion source _build/html
+	$(MAKE) -C docs clean
+	$(MAKE) -C docs html
+	$(BROWSER) docs/_build/html/index.html
+
 # Build
 # ----------------------
 install: clean ## install the package to the active Python's site-packages
diff --git a/conda-env/ci.yml b/conda-env/ci.yml
index 7df34b39b..7ea92fda8 100644
--- a/conda-env/ci.yml
+++ b/conda-env/ci.yml
@@ -36,3 +36,4 @@ dependencies:
   - sphinx
   - sphinx_rtd_theme
   - sphinx-multiversion
+  - sphinx-copybutton
diff --git a/conda-env/dev.yml b/conda-env/dev.yml
index 8c009787d..8a82addf6 100644
--- a/conda-env/dev.yml
+++ b/conda-env/dev.yml
@@ -34,6 +34,7 @@ dependencies:
   - sphinx
   - sphinx_rtd_theme
   - sphinx-multiversion
+  - sphinx-copybutton
   # Quality Assurance Tools
   # =======================
   # Run `pre-commit autoupdate` to get the latest pinned versions of 'rev' in
diff --git a/docs/source/conf.py b/docs/source/conf.py
index c5d344b4d..da2144c54 100644
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -33,6 +33,7 @@
 extensions = [
     "sphinx_rtd_theme",
     "sphinx_multiversion",
+    "sphinx_copybutton"
 ]

 # Add any paths that
contain templates here, relative to this directory.
@@ -88,7 +89,6 @@
 #
 html_theme = "sphinx_rtd_theme"
-html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
 html_sidebars = {
     "**": [
         "versions.html",
diff --git a/docs/source/install.rst b/docs/source/install.rst
index c212c97d9..12df7c6e6 100644
--- a/docs/source/install.rst
+++ b/docs/source/install.rst
@@ -5,7 +5,7 @@ The installation procedure depends on what version you'd like to install.
 Activate **e3sm_unified** environment
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

-If you have an account on one of the E3SM supported machines (NERSC, Compy, Acme1, LCRC, Cooley, Rhea), you
+If you have an account on one of the E3SM supported machines (NERSC, Compy, LCRC, Cooley, Rhea), you
 can access ``e3sm_diags`` by activating ``e3sm_unified``, which is a conda
 environment that pulls together Python and other E3SM analysis tools such as ``e3sm_diags``,
 ``mpas-analysis``, ``NCO``, and ``processflow``.
@@ -34,14 +34,6 @@ The paths to ``e3sm_unified`` activation scripts are machine dependent:

         source /lus/theta-fs0/projects/ccsm/acme/tools/e3sm-unified/load_latest_e3sm_unified_cooley.sh

-**acme1**
-    ::
-
-        source /usr/local/e3sm_unified/envs/load_latest_e3sm_unified_acme1.sh
-
-
-
 Change ``.sh`` to ``.csh`` for csh shells.

 Note that ``e3sm_unified``'s development cycle is not in phase with ``e3sm_diags``,
 therefore the version of ``e3sm_diags`` included may not be the latest.
@@ -77,41 +69,30 @@ NERSC
 Others/Local
 ~~~~~~~~~~~~

-If the system doesn't come with conda pre-installed, follow these instructions:
+If the system doesn't come with conda pre-installed, follow these instructions for
+Unix-like platforms (macOS & Linux):

-1. Download Mambaforge
+1. Download Miniforge

-   Linux
-   ::
-
-       wget https://github.com/conda-forge/miniforge/releases/latest/download/Mambaforge-Linux-x86_64.sh
-
-   MacOS x86_64
-   ::
-
-       wget https://github.com/conda-forge/miniforge/releases/latest/download/Mambaforge-MacOSX-x86_64.sh
-
-2.
Install Mambaforge
+   ::

-   Linux
-   ::
+       wget "https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh"

-       bash ./Mambaforge-Linux-x86_64.sh
+2. Install Miniforge

-   MacOS x86_64
-   ::
+   ::

-       bash ./Mambaforge-MacOSX-x86_64.sh
+       bash Miniforge3-$(uname)-$(uname -m).sh

    When you see:

    ::

       by running conda init? [yes|no] [no] >>> yes

-   respond with ``yes`` so ``conda`` and ``mamba`` commands are available on
+   respond with ``yes`` so ``conda`` commands are available on
    initializing a new bash terminal.

-3. If you are working on a machine/network that intercepts SSL communications (such as acme1), you will get
+3. If you are working on a machine/network that intercepts SSL communications, you will get
 an SSL error unless you disable the SSL verification:

 ::
@@ -120,7 +101,7 @@ an SSL error unless you disable the SSL verification:

     binstar config --set ssl_verify False

-4. Once conda and mamba are properly working, you can install the **(a) Latest Stable Release** or
+4. Once conda is properly working, you can install the **(a) Latest Stable Release** or
 create a **(b) Development Environment**.

 .. _install_latest:

@@ -128,34 +109,14 @@ create a **(b) Development Environment**.
 (a) Latest Stable Release
 -------------------------

-1. Follow :ref:`"Others/Local" ` section for installing Conda.
-
-2. Get the yml file to create an environment.
-
-   ::
-
-       wget https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/main/conda-env/prod.yml
-
-3. Change ``prefix`` in that file to be your conda prefix. Typically, this will be ``~/miniconda3/envs/e3sm_diags_env``.
+1. Follow :ref:`"Others/Local" ` section for installing conda.

-4. Remove any cached conda packages. This will ensure that you always get the latest packages
+Create a new conda environment with ``e3sm_diags`` installed and activate it: ::

-    mamba clean --all
-
-5. Use conda to create a new environment with E3SM Diags (``e3sm_diags``) included.
-These steps should not be necessary if you installed Mambaforge as suggested
-above but may be needed if you have previously installed Miniconda3 instead: ::
-
-    conda install -y -n base mamba
-    conda config --add channels conda-forge
-    conda config --set channel_priority strict
-
-Create a new conda environment with ``e3sm_diags`` installed and activate it: ::

-    mamba env create -f conda-env/prod.yml  # Tip: Add the flag ``-n `` to customize the name of the environment
+    # Tip: Add the flag ``-n `` to customize the name of the environment
+    conda create -n e3sm_diags_env e3sm_diags
     conda activate e3sm_diags_env

 .. _dev-env:

@@ -163,12 +124,16 @@
 (b) Development Environment
 ---------------------------

-Unlike the latest stable release (i.e., the user environment), the development environment does not include E3SM Diags (``e3sm-diags``).
-Instead, the developer will ``pip install .`` to build ``e3sm-diags`` with changes (see step 6 below).
+Unlike the latest stable release (i.e., the user environment), the development
+environment does not include E3SM Diags (``e3sm-diags``). Instead, the developer will
+``make install`` (or ``python -m pip install .``) to build ``e3sm-diags`` with changes
+(see step 6 below).

 .. note::
-    The dev environment includes quality assurance (QA) tools such as code formatters, linters, and ``pre-commit``.
-    **You must use the dev environment for all contributions** because these QA tools are enforced using ``pre-commit`` checks in the continuous integration/continuous deployment build.
+    The dev environment includes quality assurance (QA) tools such as code formatters,
+    linters, and ``pre-commit``. **You must use the dev environment for all
+    contributions** because these QA tools are enforced using ``pre-commit`` checks in
+    the continuous integration/continuous deployment build.

 1. Follow :ref:`"Others/Local" ` section for installing conda.
@@ -222,7 +187,7 @@ Instead, the developer will ``pip install .`` to build ``e3sm-diags`` with changes

    ::

-       mamba clean --all
+       conda clean --all

 4. Enter the fork directory.

@@ -236,7 +201,7 @@ Instead, the developer will ``pip install .`` to build ``e3sm-diags`` with changes

    ::

-       mamba env create -f conda-env/dev.yml
+       conda env create -f conda-env/dev.yml
        conda activate e3sm_diags_env_dev

 6. Install ``pre-commit``.

@@ -249,7 +214,7 @@ Instead, the developer will ``pip install .`` to build ``e3sm-diags`` with changes

    ::

-       pip install .
+       make install  # or python -m pip install .

 8. Check that tests pass: ``./tests/test.sh``. This takes about 4 minutes.
diff --git a/pyproject.toml b/pyproject.toml
index d4c85c1ea..f0ebb69c4 100644
--- a/pyproject.toml
+++ b/pyproject.toml
@@ -49,7 +49,7 @@ dependencies = [

 [project.optional-dependencies]
 test = ["pytest", "pytest-cov"]
-docs = ["sphinx", "sphinx_rtd_theme", "sphinx-multiversion"]
+docs = ["sphinx", "sphinx_rtd_theme", "sphinx-multiversion", "sphinx-copybutton"]
 dev = [
     "pre-commit",
     "types-PyYAML",

From 74cdef3c4216b70f0564be966fcacf3f7f2e0b2f Mon Sep 17 00:00:00 2001
From: tomvothecoder
Date: Tue, 28 Jan 2025 14:18:10 -0600
Subject: [PATCH 2/6] Remove old v1 quickguides

---
 docs/source/quickguides/old/index.rst         |  15 -
 .../quickguides/old/quick-guide-aims4.rst     | 230 ------------
 .../quickguides/old/quick-guide-compy.rst     | 259 -------------
 .../quickguides/old/quick-guide-cooley.rst    | 276 --------------
 .../old/quick-guide-cori-haswell.rst          | 341 ------------------
 .../quickguides/old/quick-guide-cori.rst      | 279 --------------
 .../quickguides/old/quick-guide-docker.rst    | 124 -------
 .../old/quick-guide-edison-shifter.rst        | 259 -------------
 .../quickguides/old/quick-guide-edison.rst    | 209 -----------
 ...ick-guide-multiprocessing-cori-haswell.rst |  65 ----
 .../quickguides/old/quick-guide-rhea.rst      | 170 ---------
 11 files changed, 2227 deletions(-)
 delete mode 100644 docs/source/quickguides/old/index.rst
 delete mode 100644
docs/source/quickguides/old/quick-guide-aims4.rst
 delete mode 100644 docs/source/quickguides/old/quick-guide-compy.rst
 delete mode 100644 docs/source/quickguides/old/quick-guide-cooley.rst
 delete mode 100644 docs/source/quickguides/old/quick-guide-cori-haswell.rst
 delete mode 100644 docs/source/quickguides/old/quick-guide-cori.rst
 delete mode 100644 docs/source/quickguides/old/quick-guide-docker.rst
 delete mode 100644 docs/source/quickguides/old/quick-guide-edison-shifter.rst
 delete mode 100644 docs/source/quickguides/old/quick-guide-edison.rst
 delete mode 100644 docs/source/quickguides/old/quick-guide-multiprocessing-cori-haswell.rst
 delete mode 100644 docs/source/quickguides/old/quick-guide-rhea.rst

diff --git a/docs/source/quickguides/old/index.rst b/docs/source/quickguides/old/index.rst
deleted file mode 100644
index 44d6afe98..000000000
--- a/docs/source/quickguides/old/index.rst
+++ /dev/null
@@ -1,15 +0,0 @@
-#######################
-Old Quick Guides for v1
-#######################
-
-.. toctree::
-   :maxdepth: 2
-
-   quick-guide-edison
-   quick-guide-edison-shifter
-   quick-guide-compy
-   quick-guide-docker
-   quick-guide-cori
-   quick-guide-cooley
-   quick-guide-aims4
-   quick-guide-rhea
diff --git a/docs/source/quickguides/old/quick-guide-aims4.rst b/docs/source/quickguides/old/quick-guide-aims4.rst
deleted file mode 100644
index 2566bccbf..000000000
--- a/docs/source/quickguides/old/quick-guide-aims4.rst
+++ /dev/null
@@ -1,230 +0,0 @@
-
-Quick guide for LLNL AIMS4 or ACME1 (v1)
-========================================
-
-If you're pressed for time, just follow this quick guide to
-run ``e3sm_diags`` on ``aims4`` or ``acme1``.
-
-1. Log on to ``aims4``:
-
-::
-
-    ssh -Y aims4.llnl.gov
-
-or ``acme1``:
-
-::
-
-    ssh -Y acme1.llnl.gov
-
-2. If you don't have Anaconda installed, follow `this
-guide `__.
-
-3.
Make sure you are using ``bash`` - -:: - - bash - -Installing and creating an environment --------------------------------------- -The steps below detail how to create your own environment with ``e3sm_diags``. -However, it is possible to use the `E3SM Unified Environment `__ instead. -If you decide to use the unified environment, please do so and skip to step 5. - -4. Allow Anaconda to download packages, even with a firewall. - -:: - - conda config --set ssl_verify false - binstar config --set verify_ssl False - -5. Update Anaconda. - -:: - - conda update conda - -6. Get the yml file to create an environment. - -:: - - wget https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/conda/e3sm_diags_env.yml - -7. Remove any cached Anaconda packages. This will ensure that you always get the latest packages. - -:: - - conda clean --all - -8. Use Anaconda to create a new environment with ``e3sm_diags`` installed. -**Tip:** You can change the name of the environment by adding ``-n new_env_name`` to the end of ``conda env create ...``. - -:: - - conda env create -f e3sm_diags_env.yml - source activate e3sm_diags_env - - -Running the entire Latitude-longitude contour set -------------------------------------------------- - -9. Copy and paste the below code into ``myparams.py`` using your -favorite text editor. Adjust any options as you like. - -.. code:: python - - reference_data_path = '/p/cscratch/acme/data/obs_for_e3sm_diags/climatology/' - test_data_path = '/p/cscratch/acme/data/test_model_data_for_acme_diags/climatology/' - - test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison' - - sets = ["lat_lon"] - - backend = 'mpl' # 'mpl' is for the matplotlib plots. - - results_dir = 'lat_lon_demo' # name of folder where all results will be stored - - multiprocessing = True - num_workers = 16 # Number of processes to use - -10. Run the diags. - -:: - - e3sm_diags -p myparams.py - - -11. Open the following webpage to view the results. 
- -:: - - firefox --no-remote lat_lon_demo/viewer/index.html & - -- The ``--no-remote`` option uses the Firefox installed on this machine, - and not the one on your machine. - -**Tip:** Once you're on the webpage for a specific plot, click on the 'Output Metadata' -drop down menu to view the metadata for the displayed plot. - -* Running that command allows the displayed plot to be recreated. Changing any of the options will modify the resulting figure. - -Running all of the diagnostics sets ------------------------------------ - -12. To run all of the diagnostic sets that this software supports, open ``myparams.py`` -and remove the ``sets`` parameter. If should look like this: - -.. code:: python - - reference_data_path = '/p/cscratch/acme/data/obs_for_e3sm_diags/climatology/' - test_data_path = '/p/cscratch/acme/data/test_model_data_for_acme_diags/climatology/' - - test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison' - - # When commented out, this is ignored. - # We can also just delete this line. - # sets = ["lat_lon"] - - backend = 'vcs' # 'mpl' is for the matplotlib plots. - - results_dir = 'lat_lon_demo' # name of folder where all results will be stored - - multiprocessing = True - num_workers = 16 # Number of processes to use - -13. Now run and view the results. This will take some more time, so if you can, -change the ``num_workers`` parameter to use more processors so it can be faster! - -:: - - e3sm_diags -p myparams.py - firefox --no-remote lat_lon_demo/viewer/index.html & - - -Advanced: Running custom diagnostics ------------------------------------- -The following steps are for 'advanced' users, who want to run custom diagnostics. -So most users will not run the software like this. - -14. By default, all of the E3SM diagnostics are ran for the ``sets`` that -we defined above. This takes some time, so we'll create our own -diagnostics to be ran. Run the command - -:: - - touch mydiags.cfg - -and paste the code below in ``mydiags.cfg``. 
Check :doc:`defining parameters <../../available-parameters>` -for all available parameters. - -:: - - [#] - case_id = "GPCP_v2.2" - variables = ["PRECT"] - ref_name = "GPCP_v2.2" - reference_name = "GPCP (yrs1979-2014)" - seasons = ["ANN", "DJF"] - regions = ["global"] - test_colormap = "WhiteBlueGreenYellowRed.rgb" - reference_colormap = "WhiteBlueGreenYellowRed.rgb" - diff_colormap = "BrBG" - contour_levels = [0.5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 16] - diff_levels = [-5, -4, -3, -2, -1, -0.5, 0.5, 1, 2, 3, 4, 5] - - [#] - case_id = "SST_CL_HadISST" - variables = ["SST"] - ref_name = "HadISST_CL" - reference_name = "HadISST/OI.v2 (Climatology) 1982-2001" - seasons = ["ANN", "MAM"] - contour_levels = [-1, 0, 1, 3, 6, 9, 12, 15, 18, 20, 22, 24, 26, 28, 29] - diff_levels = [-5, -4, -3, -2, -1, -0.5, -0.2, 0.2, 0.5, 1, 2, 3, 4, 5] - -15. Run the custom diagnostics. - -:: - - e3sm_diags -p myparams.py -d mydiags.cfg - - -16. Open the following webpage to view the results. - -:: - - firefox --no-remote lat_lon_demo/viewer/index.html & - -More Options ------------- - -- You can modify the ``sets`` parameters in ``myparams.py`` to run - multiple sets. Possible options are: - ``'zonal_mean_xy', 'zonal_mean_2d', 'lat_lon, 'polar', 'cosp_histogram'``. - If the ``sets`` parameter is not defined, all of the aforementioned - sets are ran. Ex: - - .. code:: python - - sets = ['zonal_mean_xy', 'zonal_mean_2d', 'lat_lon', 'polar', 'cosp_histogram'] - -- Diagnostics can be ran in parallel with multi-processing. In - ``myparams.py``, add ``multiprocessing = True`` and set - ``num_workers`` to the number of workers you want to use. If - ``num_workers`` is not defined, it will automatically use 4 processors processes by default on a machine. Ex: - - .. 
code:: python - - # myparams.py - # In addition to your other parameters, include: - multiprocessing = True - num_workers = 4 - -Below figure shows a scalability test running the package for all lat_lon diagnostics on ACME1. Courtesy of Sterling Baldwin. - -.. figure:: ../../_static/quick-guide-aims4/performance_test.png - :width: 450px - :align: center - :alt: Performance_test - - Performance test running the package with full set: "lat_lon" diagnostics on ACME1 diff --git a/docs/source/quickguides/old/quick-guide-compy.rst b/docs/source/quickguides/old/quick-guide-compy.rst deleted file mode 100644 index e835b0a74..000000000 --- a/docs/source/quickguides/old/quick-guide-compy.rst +++ /dev/null @@ -1,259 +0,0 @@ - -Quick guide for running in e3sm_unified environment for Compy and Others (v1) -============================================================================= - -Activate the latest version of the e3sm_unified environment ------------------------------------------------------------ - -Most of the E3SM analysis software is maintained with an Anaconda metapackage. To get all of the tools in the metapackage in your path, use one of the sets of commands below and the activation path from below for individual e3sm analysis machines. 
Shown below we also provide the paths where climatology observational data e3sm_diags uses, replace ``/climatology`` to ``/time-series`` for time-series data: - - -**Compy** - :: - - source /compyfs/software/e3sm-unified/load_latest_e3sm_unified_py2.7.sh - - -obs at: ``/compyfs/e3sm_diags_data/obs_for_e3sm_diags/climatology/`` - - - -**For other analysis platforms** - -**Cori** - :: - - source /global/project/projectdirs/acme/software/anaconda_envs/load_latest_e3sm_unified_py2.7.sh - -obs at: ``/global/project/projectdirs/acme/e3sm_diags/obs_for_e3sm_diags/climatology/`` - - -**Anvil/blues** - :: - - source /lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_py2.7.sh - -obs at: ``/lcrc/soft/climate/e3sm_diags_data/obs_for_e3sm_diags/climatology/`` - - -**Cooley** - :: - - source /lus/theta-fs0/projects/ccsm/acme/tools/e3sm-unified/load_latest_e3sm_unified_py2.7.sh - -obs at:``/lus/theta-fs0/projects/ClimateEnergy_3/e3sm_diags/obs_for_e3sm_diags/climatology/`` - - -**acme1** - :: - - source /usr/local/e3sm_unified/envs/load_latest_e3sm_unified_py2.7.sh - -obs at:``/p/cscratch/acme/data/obs_for_e3sm_diags/climatology/`` - - -**Rhea** - :: - - source /ccs/proj/cli900/sw/rhea/e3sm-unified/load_latest_e3sm_unified_py2.7.sh - -obs at:``/ccs/proj/cli115/e3sm_diags_data/obs_for_acme_diags/climatology/`` - - -Change ``.sh`` to ``.csh`` for csh shells. - - -Running the entire annual latitude-longitude contour set --------------------------------------------------------- - -Copy and paste the below code into ``myparams.py`` using your favorite text editor. Adjust any options as you like. - - **Tip:** Make a folder in the following directory ``/compyfs/www/`` based off your username. - Then you can set ``results_dir`` to ``/compyfs/www//lat_lon_demo`` in ``myparams.py`` below - to view the results via a web browser here: https://compy-dtn.pnl.gov//lat_lon_demo - - - .. 
code:: python - - reference_data_path = '/compyfs/e3sm_diags_data/obs_for_e3sm_diags/climatology/' - test_data_path = '/compyfs/e3sm_diags_data/test_model_data_for_acme_diags/climatology/' - - test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison' - - sets = ["lat_lon"] - seasons = ["ANN"] - - # 'mpl' and 'vcs' are for matplotlib or vcs plots respectively. - backend = 'mpl' - - # Name of folder where all results will be stored. - results_dir = 'lat_lon_demo' - - multiprocessing = True - num_workers = 40 - -To enable multiprocessing rather than running in serial, the program will need to be ran in an -**interactive session** on compute nodes, or as a **batch job**. - -Here are some hardware details for `compy` - * 40 cores/node - * 192 GB DRAM/node - * 18400 total cores - - -Interactive session on compute nodes -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -First, request an interactive session with a single node -for one hour (running this example should take much less than this). - - :: - - salloc --nodes=1 --account=e3sm --time=00:30:00 - - -Once the session is available, launch E3SM Diagnostics. - - :: - - source /compyfs/software/e3sm-unified/load_latest_e3sm_unified_py2.7.sh - e3sm_diags -p myparams.py - - -Batch job -^^^^^^^^^ - -Alternatively, you can also create a script and submit it to the batch system. -Copy and paste the code below into a file named ``diags.bash``. -Please remember to change what directory you're in to one accessible to you. - - .. code:: bash - - #!/bin/bash -l - #SBATCH --job-name=diags - #SBATCH --output=diags.o%j - #SBATCH --account=e3sm - #SBATCH --nodes=1 - #SBATCH --time=00:30:00 - - # Please change the directory below. - source /compyfs/software/e3sm-unified/load_latest_e3sm_unified_py2.7.sh - e3sm_diags -p myparams.py - -And then submit it - - :: - - sbatch diags.bash - -View the status of your job with ``squeue -u ``. 
-Here's the meaning of some values under the State (``ST``) column: - -* ``PD``: Pending -* ``R``: Running -* ``CA``: Cancelled -* ``CD``: Completed -* ``F``: Failed -* ``TO``: Timeout -* ``NF``: Node Failure - - -Back to running the latitude-longitude contour set -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -5. Once you ran the diagnostics in an interactive session or via a batch job, open the following webpage to view the results. - - - :: - - lat_lon_demo/viewer/index.html - -**Tip:** Once you're on the webpage for a specific plot, click on the -'Output Metadata' drop down menu to view the metadata for the displayed plot. -Running that command allows the displayed plot to be recreated. -Changing any of the options will modify the just that resulting figure. - - - -Running all of the diagnostics sets ------------------------------------ - -Copy and paste the following into ``all_sets.py`` using your -favorite text editor: - - .. code:: python - - reference_data_path = '/compyfs/e3sm_diags_data/obs_for_e3sm_diags/climatology/' - test_data_path = '/compyfs/e3sm_diags_data/test_model_data_for_acme_diags/climatology/' - - test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison' - - # Not defining a sets parameter runs all of the default sets: - # ['zonal_mean_xy', 'zonal_mean_2d', 'lat_lon', 'polar', 'cosp_histogram'] - sets = ['zonal_mean_xy', 'zonal_mean_2d', 'lat_lon', 'polar', 'cosp_histogram'] - - # 'mpl' and 'vcs' are for matplotlib or vcs plots respectively. - backend = 'mpl' - - # Name of folder where all results will be stored. - results_dir = 'diag_demo' - - # Optional settings below: - - diff_title = 'Model - Obs' - - multiprocessing = True - num_workers = 40 - - -Advanced: Running custom diagnostics ------------------------------------- -The following steps are for 'advanced' users, who want to run custom diagnostics. -So most users will not run the software like this. 
- - -By default, all of the E3SM diagnostics are ran for the sets that we defined above. -This takes some time, so instead we create our own diagnostics to be ran. - - -Copy and paste the code below in ``mydiags.cfg``. -Check :doc:`Available Parameters <../../available-parameters>` -for all available parameters. - -For more examples of these types of files, look -`here `_ -for the cfg file that was used to create all of the latitude-longitude sets. - - - :: - - [#] - sets = ["lat_lon"] - case_id = "GPCP_v2.2" - variables = ["PRECT"] - ref_name = "GPCP_v2.2" - reference_name = "GPCP (yrs1979-2014)" - seasons = ["ANN", "DJF"] - regions = ["global"] - test_colormap = "WhiteBlueGreenYellowRed.rgb" - reference_colormap = "WhiteBlueGreenYellowRed.rgb" - diff_colormap = "BrBG" - contour_levels = [0.5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 16] - diff_levels = [-5, -4, -3, -2, -1, -0.5, 0.5, 1, 2, 3, 4, 5] - - [#] - sets = ["lat_lon"] - case_id = "SST_CL_HadISST" - variables = ["SST"] - ref_name = "HadISST_CL" - reference_name = "HadISST/OI.v2 (Climatology) 1982-2001" - seasons = ["ANN", "MAM"] - contour_levels = [-1, 0, 1, 3, 6, 9, 12, 15, 18, 20, 22, 24, 26, 28, 29] - diff_levels = [-5, -4, -3, -2, -1, -0.5, -0.2, 0.2, 0.5, 1, 2, 3, 4, 5] - -Run E3SM diagnostics with the ``-d`` parameter. - - :: - - e3sm_diags -p myparams.py -d mydiags.cfg - - diff --git a/docs/source/quickguides/old/quick-guide-cooley.rst b/docs/source/quickguides/old/quick-guide-cooley.rst deleted file mode 100644 index 18672f513..000000000 --- a/docs/source/quickguides/old/quick-guide-cooley.rst +++ /dev/null @@ -1,276 +0,0 @@ - -Quick guide for ANL ALCF Cooley using Singularity (v1) -====================================================== - -Obtaining the container image ------------------------------ - -Singularity is the container runtime used at ALCF to run containers. -It's an alternative to Docker, but supports Docker containers. - -1. 
Set the ``SINGULARITY_PULLFOLDER`` environmental variable to ``/lus/theta-fs0/projects/ccsm/acme/containers/``. -This is where the containers are stored. -If you're using bash, run the following. - - :: - - export SINGULARITY_PULLFOLDER="/lus/theta-fs0/projects/ccsm/acme/containers/" - - -2. Unlike Docker or Shifter (on NERSC), Singularity doesn't store the containers is a central repository. -So we just the directory below to store all downloaded containers. - -View the ``e3sm_diags`` images available on ALCF. - - :: - - ls $SINGULARITY_PULLFOLDER | grep e3sm_diags - -If the version you want to use is already available, then please continue to step 3. -Otherwise, you'll need to download the image you want, shown in step 2. - - -3. If the specific version you want or the ``latest`` image **is not shown**, download it. -You can view all of the images available on the -`e3sm_diags Docker Hub `_. - -Below, we are getting the image with the ``latest`` tag. -Images are stored in the ``SINGULARITY_PULLFOLDER`` directory. - - :: - - singularity pull docker://e3sm/e3sm_diags:latest - - -4. ``wget`` the following script. - - :: - - wget https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/e3sm_diags/container/e3sm_diags_container.py - - - -Running the entire annual latitude-longitude contour set --------------------------------------------------------- - -4. Copy and paste the below code into ``myparams.py`` using your favorite text editor. -Adjust any options as you like. - - .. code:: python - - reference_data_path = '/lus/theta-fs0/projects/ClimateEnergy_3/e3sm_diags/obs_for_e3sm_diags/climatology/' - test_data_path = '/lus/theta-fs0/projects/ClimateEnergy_3/e3sm_diags/test_model_data_for_e3sm_diags/climatology/' - - test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison' - - sets = ["lat_lon"] - seasons = ["ANN"] - - # 'mpl' and 'vcs' are for matplotlib or vcs plots respectively. - backend = 'mpl' - - # Name of folder where all results will be stored. 
- results_dir = 'lat_lon_demo' - - # Optional parameters to run in parallel. - multiprocessing = True - num_workers = 12 - -Singularity can be ran on the login nodes, but you should run -either in an interactive session on the compute nodes, or as a batch job. - -Interactive session on compute nodes -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -First, request an interactive session with a single node (12 cores) for one hour -(running this example should take much less than this) on the ``default`` queue. -If obtaining a session takes too long, try to use the ``debug`` queue. -For more information, -`see here `_. - - :: - - qsub --interactive --nodecount=1 --queue=default --time=01:00:00 - - -Once the session is available, reset the environmental variable. -In bash, use the below. - - :: - - export SINGULARITY_PULLFOLDER="/lus/theta-fs0/projects/ccsm/acme/containers/" - -Then launch E3SM Diagnostics: - - :: - - python e3sm_diags_container.py --singularity -p myparams.py - -**Tip:** You can select the version of the container you want to run with the ``--container_version`` argument. -If this argument isn't defined, it defaults to the ``latest`` container. - - :: - - python e3sm_diags_container.py --shifter --container_version v1.6.0 -p myparams.py - -Use the command ``exit`` when the run is done. - - -Batch job -^^^^^^^^^ - -Alternatively, you can also create a script and submit it to the batch system. - -First, create a temporary directory and make sure you download the script into it. -The reason we're doing this first is because the compute nodes -don't seem to not have Internet access. - - .. code:: bash - - mkdir ~/e3sm_diags_output - wget -P ~/e3sm_diags_output https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/e3sm_diags/container/e3sm_diags_container.py - - -Also, make sure your ``myparams.py`` is also in the directory you've made. - -Finally, copy and paste the code below into a file named ``diags.sh``. 
-You might need to make it executable with ``chmod u+x diags.sh``. - - .. code::bash - - #!/bin/bash -l - #COBALT --jobname=diags - #COBALT --output=diags.o%j - #COBALT --queue=default - #COBALT --nodecount=1 - #COBALT --time=01:00:00 - - export SINGULARITY_PULLFOLDER="/lus/theta-fs0/projects/ccsm/acme/containers/" - cd ~/e3sm_diags_output - python e3sm_diags_container.py --singularity -p myparams.py - -And then submit it - - :: - - qsub diags.bash - -View the status of your job with ``qstat -u ``. - - -Back to running the latitude-longitude contour set -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -5. Once you ran the diagnostics in an interactive session or via a batch job, -open the following webpage to view the results. - - :: - - lat_lon_demo/viewer/index.html - -**Tip:** Once you're on the webpage for a specific plot, click on the -'Output Metadata' drop down menu to view the metadata for the displayed plot. -Running that command allows the displayed plot to be recreated. -Changing any of the options will modify the just that resulting figure. - - - -Running all of the diagnostics sets ------------------------------------ - -Copy and paste the following into ``all_sets.py`` using your -favorite text editor: - - .. code:: python - - reference_data_path = '/lus/theta-fs0/projects/ClimateEnergy_3/e3sm_diags/obs_for_e3sm_diags/climatology/' - test_data_path = '/lus/theta-fs0/projects/ClimateEnergy_3/e3sm_diags/test_model_data_for_e3sm_diags/climatology/' - - test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison' - - # Not defining a sets parameter runs all of the default sets: - # ['zonal_mean_xy', 'zonal_mean_2d', 'lat_lon', 'polar', 'cosp_histogram'] - # Not defining a seasons parameter runs all of the seasons: - # ['ANN', 'DJF', 'MAM', 'JJA', 'SON'] - - # 'mpl' and 'vcs' are for matplotlib or vcs plots respectively. - backend = 'mpl' - - # Name of folder where all results will be stored. 
-        results_dir = 'diag_demo'
-
-        # Optional settings below:
-        diff_title = 'Model - Obs'
-
-        multiprocessing = True
-        num_workers = 12
-
-
-Compared to the previous short test above, note the following changes:
-
-* Plots for all the available sets ('zonal_mean_xy', 'zonal_mean_2d',
-  'lat_lon', 'polar', 'cosp_histogram') are generated.
-* Plots for all of the seasons ('ANN', 'DJF', 'MAM', 'JJA', 'SON') are generated.
-
-
-6. Again, run the diagnostics with this new parameter file (``all_sets.py``), either
-   in an interactive session or via a batch job.
-
-
-7. Open the following webpage to view the results.
-
-    ::
-
-        diag_demo/viewer/index.html
-
-
-
-Advanced: Running custom diagnostics
-------------------------------------
-The following steps are for 'advanced' users, who want to run custom diagnostics.
-Most users will not run the software like this.
-
-
-By default, all of the E3SM diagnostics are run for the sets that we defined above.
-This takes some time, so instead we create our own diagnostics to be run.
-
-
-8. Copy and paste the code below in ``mydiags.cfg``.
-Check :doc:`Available Parameters <../../available-parameters>`
-for all available parameters.
-
-For more examples of these types of files, look
-`here `_
-for the ``cfg`` file that was used to create all of the latitude-longitude sets.
-
-
-    ::
-
-        [#]
-        sets = ["lat_lon"]
-        case_id = "GPCP_v2.2"
-        variables = ["PRECT"]
-        ref_name = "GPCP_v2.2"
-        reference_name = "GPCP (yrs1979-2014)"
-        seasons = ["ANN", "DJF"]
-        regions = ["global"]
-        test_colormap = "WhiteBlueGreenYellowRed.rgb"
-        reference_colormap = "WhiteBlueGreenYellowRed.rgb"
-        diff_colormap = "BrBG"
-        contour_levels = [0.5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 16]
-        diff_levels = [-5, -4, -3, -2, -1, -0.5, 0.5, 1, 2, 3, 4, 5]
-
-        [#]
-        sets = ["lat_lon"]
-        case_id = "SST_CL_HadISST"
-        variables = ["SST"]
-        ref_name = "HadISST_CL"
-        reference_name = "HadISST/OI.v2 (Climatology) 1982-2001"
-        seasons = ["ANN", "MAM"]
-        contour_levels = [-1, 0, 1, 3, 6, 9, 12, 15, 18, 20, 22, 24, 26, 28, 29]
-        diff_levels = [-5, -4, -3, -2, -1, -0.5, -0.2, 0.2, 0.5, 1, 2, 3, 4, 5]
-
-9. Run E3SM diagnostics with the ``-d`` parameter.
-
-    ::
-
-        python e3sm_diags_container.py --singularity -p myparams.py -d mydiags.cfg
diff --git a/docs/source/quickguides/old/quick-guide-cori-haswell.rst b/docs/source/quickguides/old/quick-guide-cori-haswell.rst
deleted file mode 100644
index 1abc1171a..000000000
--- a/docs/source/quickguides/old/quick-guide-cori-haswell.rst
+++ /dev/null
@@ -1,341 +0,0 @@
-..
-    Comment: If you want to edit `quick-guide-{machine_name}.rst`, edit `quick-guide-generic.rst` instead and run `generate_quick_guides.py`.
-
-Cori-Haswell quick guide for running e3sm_diags v2
-=========================================================================
-
-.. warning::
-    As of ``v2.6.0``, ``e3sm_diags`` should be used as the module name instead of
-    ``acme_diags``. Instances of ``acme_diags`` in the Python import statements should
-    be replaced accordingly.
-
-1. Installation
------------------------------------------------------------
-
-We will use the e3sm_unified environment to install.
-For the latest stable release or if you don't have access to e3sm analysis machines,
-please instead refer to :ref:`Latest stable release `.
- -Most of the E3SM analysis software is maintained with an Anaconda metapackage -(E3SM unified environment). -If you have an account on Cori-Haswell, -then to get all of the tools in the metapackage in your path, -use the activation command below. -(Change ``.sh`` to ``.csh`` for csh shells.) - -Below, we also provide the paths for observational data needed by ``e3sm_diags`` (), -and some sample model data for testing (). -Both and have two subdirectories: -``/climatology`` and ``/time-series`` for climatology and time-series data respectively. - -Also listed below are paths where the HTML files () must be located to be displayed -at their corresponding web addresses (). - -: ``source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_cori-haswell.sh`` - -: ``/global/cfs/cdirs/e3sm/e3sm_diags/obs_for_e3sm_diags/`` - -: ``/global/cfs/cdirs/e3sm/e3sm_diags/test_model_data_for_acme_diags/`` - -: ``/global/cfs/cdirs/e3sm/www//`` - -: ``http://portal.nersc.gov/cfs/e3sm//`` - - - -2. Config and run --------------------------------------------------------- - -.. _Cori-Haswell_lat_lon: - -Running the annual mean latitude-longitude contour set -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -Copy and paste the below code into ``run_e3sm_diags.py`` using your favorite text editor. -Adjust any options as you like. - - **Tip:** Some of E3SM's analysis machines (**Acme1, Anvil, Compy, Cori**) - have web servers setup to host html results. - On Cori-Haswell, - create the directory ``/global/cfs/cdirs/e3sm/www//`` using your username. - Set ``results_dir`` to ``/global/cfs/cdirs/e3sm/www//doc_examples/lat_lon_demo`` - in ``run_e3sm_diags.py`` below. Then, you can view results via a web browser here: - http://portal.nersc.gov/cfs/e3sm//doc_examples/lat_lon_demo - - - .. 
code:: python

-        import os
-        from e3sm_diags.parameter.core_parameter import CoreParameter
-        from e3sm_diags.run import runner
-
-        param = CoreParameter()
-
-        param.reference_data_path = '/global/cfs/cdirs/e3sm/e3sm_diags/obs_for_e3sm_diags/climatology/'
-        param.test_data_path = '/global/cfs/cdirs/e3sm/e3sm_diags/test_model_data_for_acme_diags/climatology/'
-        param.test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison'
-        # All seasons ["ANN", "DJF", "MAM", "JJA", "SON"] run if this line is commented out.
-        param.seasons = ["ANN"]
-
-        prefix = '/global/cfs/cdirs/e3sm/www//doc_examples/'
-        param.results_dir = os.path.join(prefix, 'lat_lon_demo')
-        # Use the following if running in parallel:
-        #param.multiprocessing = True
-        #param.num_workers = 32
-
-        # Use below to run all core sets of diags:
-        #runner.sets_to_run = ['lat_lon','zonal_mean_xy', 'zonal_mean_2d', 'polar', 'cosp_histogram', 'meridional_mean_2d']
-        # Use below to run lat_lon map only:
-        runner.sets_to_run = ['lat_lon']
-        runner.run_diags([param])
-
-
-Run in serial with:
-
-    ::
-
-        python run_e3sm_diags.py
-
-The above run has the same results as running ``e3sm_diags -p lat_lon_params.py``
-using the code below for ``lat_lon_params.py``:
-
-
-    .. code:: python
-
-        reference_data_path = '/global/cfs/cdirs/e3sm/e3sm_diags/obs_for_e3sm_diags/climatology/'
-        test_data_path = '/global/cfs/cdirs/e3sm/e3sm_diags/test_model_data_for_acme_diags/climatology/'
-
-        test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison'
-
-        sets = ["lat_lon"]
-        seasons = ["ANN"]
-
-        # 'mpl' for matplotlib plots
-        backend = 'mpl'
-
-        # Name of folder where all results will be stored.
-        results_dir = '/global/cfs/cdirs/e3sm/www//doc_examples/lat_lon_demo'
-
-The new way of running (no ``-p``) is implemented in version 2.0.0,
-preparing ``e3sm_diags`` to accommodate more diagnostics sets with set-specific parameters.
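Conceptually, a ``-p`` parameter file is just a Python module whose top-level names are copied onto the parameter object. The stdlib sketch below illustrates that idea; ``FakeCoreParameter`` and ``apply_param_file`` are illustrative stand-ins, not the real ``e3sm_diags`` internals.

```python
# Illustrative sketch only: FakeCoreParameter stands in for the real
# CoreParameter class, and apply_param_file mimics what `e3sm_diags -p`
# does with a parameter file.
class FakeCoreParameter:
    def __init__(self):
        self.sets = []
        self.seasons = ["ANN", "DJF", "MAM", "JJA", "SON"]

PARAM_FILE = '''
sets = ["lat_lon"]
seasons = ["ANN"]
backend = "mpl"
'''

def apply_param_file(source, param):
    """Exec the parameter-file source and copy its top-level names onto `param`."""
    namespace = {}
    exec(source, namespace)
    for name, value in namespace.items():
        # Skip interpreter-injected names like __builtins__.
        if not name.startswith("__"):
            setattr(param, name, value)
    return param

p = apply_param_file(PARAM_FILE, FakeCoreParameter())
print(p.sets, p.seasons)  # ['lat_lon'] ['ANN']
```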
-
-
-To enable multiprocessing rather than running in serial, the program will need to be run in an
-**interactive session** on compute nodes, or as a **batch job**.
-
-
-Interactive session on compute nodes
-'''''''''''''''''''''''''''''''''''''
-
-First, request an interactive session with a single node
-(32 cores with Cori Haswell, 68 cores with Cori KNL)
-for one hour (running this example should take much less than this).
-If obtaining a session takes too long, try to use the ``debug`` partition.
-Note that the maximum time allowed for that partition is ``00:30:00``.
-
-    ::
-
-        salloc --nodes=1 --partition=regular --time=01:00:00 -C haswell
-
-
-Once the session is available, activate ``e3sm_unified`` and launch E3SM Diagnostics:
-
-    ::
-
-        source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_cori-haswell.sh
-        python run_e3sm_diags.py --multiprocessing --num_workers=32
-
-
-We could have set these multiprocessing parameters in ``run_e3sm_diags.py`` as well,
-but this shows that you can still submit parameters via the command line.
-
-Batch job
-'''''''''
-
-Alternatively, you can also create a script and submit it to the batch system.
-Copy and paste the code below into a file named ``diags.bash``.
-
-    .. code:: bash
-
-        #!/bin/bash -l
-        #SBATCH --job-name=diags
-        #SBATCH --output=diags.o%j
-        #SBATCH --partition=regular
-        #SBATCH --account=e3sm
-        #SBATCH --nodes=1
-        #SBATCH --time=01:00:00
-        #SBATCH -C haswell
-
-        source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_cori-haswell.sh
-        python run_e3sm_diags.py --multiprocessing --num_workers=32
-
-And then submit it:
-
-    ::
-
-        sbatch diags.bash
-
-View the status of your job with ``squeue -u ``.
-Here's the meaning of some values under the State (``ST``) column: - -* ``PD``: Pending -* ``R``: Running -* ``CA``: Cancelled -* ``CD``: Completed -* ``F``: Failed -* ``TO``: Timeout -* ``NF``: Node Failure - -View results on the web -''''''''''''''''''''''' -Once the run is completed, -open ``http://portal.nersc.gov/cfs/e3sm//doc_examples/lat_lon_demo/viewer/index.html`` to view the results. -If you don't see the results, you may need to set proper permissions. -Run ``chmod -R 755 /global/cfs/cdirs/e3sm/www//``. - -**Tip:** Once you're on the webpage for a specific plot, click on the -'Output Metadata' drop down menu to view the metadata for the displayed plot. -Running that command allows the displayed plot to be recreated. -Changing any of the options will modify just that resulting figure. - - - -Running all the core diagnostics sets -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -Core diagnostics set includes: -**lat_lon**, **zonal_mean_xy**, **zonal_mean_2d**, **polar**, **cosp_histogram**, -**meridional_mean_2d**. -These diags share a common parameter space (core parameters). -To run all these sets without defining set-specific parameters -(e.g. **plev** for **zonal_mean_2d** and **meridional_mean_2d**.), -replace the ``runner.sets_to_run`` line in ``run_e3sm_diags.py`` with the one below: - - :: - - runner.sets_to_run = ['lat_lon','zonal_mean_xy', 'zonal_mean_2d', 'polar', 'cosp_histogram', 'meridional_mean_2d'] - - -Running area mean time series set -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -In v2.0.0, the time series set was implemented to support regional averaged time series plotting -using monthly mean time series input. -This set is enabled if monthly mean time series is processed as documented -:doc:`here <../input-data-requirement>`. - -A ``run_e3sm_diags.py`` example for running area mean time series alone: - - .. 
code:: python

-        import os
-        from e3sm_diags.parameter.core_parameter import CoreParameter
-        from e3sm_diags.parameter.area_mean_time_series_parameter import AreaMeanTimeSeriesParameter
-        from e3sm_diags.run import runner
-
-        param = CoreParameter()
-
-        param.reference_data_path = '/global/cfs/cdirs/e3sm/e3sm_diags/obs_for_e3sm_diags/time-series/'
-        param.test_data_path = '/global/cfs/cdirs/e3sm/e3sm_diags/test_model_data_for_acme_diags/time-series/E3SM_v1/'
-        param.test_name = 'e3sm_v1'
-
-        prefix = '/global/cfs/cdirs/e3sm/www//doc_examples/'
-        param.results_dir = os.path.join(prefix, 'area_mean_with_obs')
-        # Use the following if running in parallel:
-        #param.multiprocessing = True
-        #param.num_workers = 40
-
-        # We're passing in this new object as well, in
-        # addition to the CoreParameter object.
-
-        ts_param = AreaMeanTimeSeriesParameter()
-        #ts_param.ref_names = ['none']  # Using this setting will plot only the model data, not the observation data
-        ts_param.start_yr = '2002'
-        ts_param.end_yr = '2008'
-
-        runner.sets_to_run = ['area_mean_time_series']
-        runner.run_diags([param, ts_param])
-
-
-This set can also be run with the core diagnostics sets,
-so that all the plots are shown in one viewer.
-The following is an example to run all sets:
-
-    .. 
code:: python

-        import os
-        from e3sm_diags.parameter.core_parameter import CoreParameter
-        from e3sm_diags.parameter.area_mean_time_series_parameter import AreaMeanTimeSeriesParameter
-        from e3sm_diags.run import runner
-
-        param = CoreParameter()
-
-        param.reference_data_path = '/global/cfs/cdirs/e3sm/e3sm_diags/obs_for_e3sm_diags/climatology/'
-        param.test_data_path = '/global/cfs/cdirs/e3sm/e3sm_diags/test_model_data_for_acme_diags/climatology/'
-        param.test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison'
-        param.multiprocessing = True
-        param.num_workers = 40
-        prefix = '/global/cfs/cdirs/e3sm/www//doc_examples'
-        param.results_dir = os.path.join(prefix, 'all_sets')
-
-        # Set-specific parameters for new sets
-        ts_param = AreaMeanTimeSeriesParameter()
-        ts_param.reference_data_path = '/global/cfs/cdirs/e3sm/e3sm_diags/obs_for_e3sm_diags/time-series/'
-        ts_param.test_data_path = '/global/cfs/cdirs/e3sm/e3sm_diags/test_model_data_for_acme_diags/time-series/E3SM_v1/'
-        ts_param.test_name = 'e3sm_v1'
-        ts_param.start_yr = '2002'
-        ts_param.end_yr = '2008'
-
-        runner.sets_to_run = ['lat_lon','zonal_mean_xy', 'zonal_mean_2d', 'polar', 'cosp_histogram', 'meridional_mean_2d', 'area_mean_time_series']
-        runner.run_diags([param, ts_param])
-
-
-Advanced: Running custom diagnostics
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-The following steps are for 'advanced' users, who want to run custom diagnostics.
-Most users will not run the software like this.
-
-
-By default, with ``e3sm_diags``,
-a built-in set of variables is defined for each diagnostics set.
-To do a short run, e.g. only running through a subset of variables,
-a configuration file is needed to customize the run.
-
-
-In the following example,
-only precipitation and sea surface temperature are compared between
-model and obs for the lat_lon set.
-Create ``mydiags.cfg`` file as below.
-
-Check :doc:`Available Parameters <../available-parameters>` for all available parameters.
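For orientation, each ``[#]`` entry in such a ``.cfg`` file is an INI-style section whose values are written as Python literals. The stdlib sketch below parses one minimal entry; it is a simplified illustration, not the actual ``e3sm_diags`` loader (which handles multiple ``[#]`` sections and many more parameters).

```python
import ast
import configparser

# A minimal custom-diagnostics entry, mirroring the mydiags.cfg examples
# in this guide (illustrative content only).
CFG_TEXT = """\
[#]
sets = ["lat_lon"]
variables = ["PRECT"]
seasons = ["ANN", "DJF"]
"""

def parse_diags_cfg(text):
    """Simplified sketch of reading one diagnostics .cfg entry.

    Each value is a Python literal (typically a list of strings), so
    ast.literal_eval converts it safely.
    """
    parser = configparser.ConfigParser()
    parser.read_string(text)
    return {key: ast.literal_eval(raw) for key, raw in parser["#"].items()}

params = parse_diags_cfg(CFG_TEXT)
print(params["sets"])     # ['lat_lon']
print(params["seasons"])  # ['ANN', 'DJF']
```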
-
-For a larger configuration file example, look
-`here `_
-for the cfg file that was used to create all of the latitude-longitude sets.
-
-
-    ::
-
-        [#]
-        sets = ["lat_lon"]
-        case_id = "GPCP_v2.3"
-        variables = ["PRECT"]
-        ref_name = "GPCP_v2.3"
-        reference_name = "GPCP"
-        seasons = ["ANN", "DJF", "MAM", "JJA", "SON"]
-        regions = ["global"]
-        test_colormap = "WhiteBlueGreenYellowRed.rgb"
-        reference_colormap = "WhiteBlueGreenYellowRed.rgb"
-        diff_colormap = "BrBG"
-        contour_levels = [0.5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 16]
-        diff_levels = [-5, -4, -3, -2, -1, -0.5, 0.5, 1, 2, 3, 4, 5]
-
-
-Run E3SM diagnostics with the ``-d`` parameter,
-using the :ref:`above run script ` as follows:
-
-    ::
-
-        python run_e3sm_diags.py -d mydiags.cfg
-
-
diff --git a/docs/source/quickguides/old/quick-guide-cori.rst b/docs/source/quickguides/old/quick-guide-cori.rst
deleted file mode 100644
index c3f07ec81..000000000
--- a/docs/source/quickguides/old/quick-guide-cori.rst
+++ /dev/null
@@ -1,279 +0,0 @@
-
-Quick guide for NERSC Cori using Shifter (v1)
-=============================================
-
-Obtaining the container image
------------------------------
-
-Shifter is the container runtime used at NERSC to run containers.
-It's an alternative to Docker, but supports Docker containers.
-
-1. View the ``e3sm_diags`` images available at NERSC.
-
-    ::
-
-        shifterimg images | grep e3sm_diags
-
-
-If the version you want to use is already available, then please continue to step 3.
-
-Otherwise, you'll need to download the image you want, shown in step 2.
-
-
-2. If the specific version you want or the ``latest`` image **is not shown**, download it.
-You can view all of the images available on the
-`e3sm_diags Docker Hub `_.
-Below, we are getting the image with the latest tag.
-
-    ::
-
-        shifterimg -v pull docker:e3sm/e3sm_diags:latest
-
-* You'll see the same message with a timestamp print multiple times.
-  This is normal and takes around 10 minutes or so.
-  This is a quirk of Shifter; we don't know why it happens.
-* Once an image is downloaded from a public repo like this one, all users on NERSC can use it.
-* You also cannot delete an image that you downloaded via Shifter for now.
-  Please email NERSC support and they can do that for you.
-
-
-3. ``wget`` the following script.
-
-    ::
-
-        wget https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/e3sm_diags/container/e3sm_diags_container.py
-
-
-
-Running the entire annual latitude-longitude contour set
---------------------------------------------------------
-
-4. Copy and paste the below code into ``myparams.py`` using your favorite text editor. Adjust any options as you like.
-
-    **Tip:** Make a folder in the following directory ``/global/project/projectdirs/acme/www/`` based on your username.
-    Then you can set ``results_dir`` to ``/global/project/projectdirs/acme/www//lat_lon_demo`` in ``myparams.py`` below
-    to view the results via a web browser here: http://portal.nersc.gov/project/acme//lat_lon_demo
-
-
-    .. code:: python
-
-        reference_data_path = '/global/project/projectdirs/acme/e3sm_diags/obs_for_e3sm_diags/climatology/'
-        test_data_path = '/global/project/projectdirs/acme/e3sm_diags/test_model_data_for_acme_diags/climatology/'
-
-        test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison'
-
-        sets = ["lat_lon"]
-        seasons = ["ANN"]
-
-        # 'mpl' and 'vcs' are for matplotlib or vcs plots respectively.
-        backend = 'mpl'
-
-        # Name of folder where all results will be stored.
-        results_dir = 'lat_lon_demo'
-
-Since Shifter cannot be run on the login nodes, it must be run either in an
-**interactive session** on compute nodes, or as a **batch job**.
-
-There are two kinds of compute nodes on Cori:
-
-* Cori KNL:
-    * 68 cores/node
-    * 128 GB/node
-    * Use ``-C knl`` with any ``salloc`` or ``srun`` commands.
-
-* Cori Haswell:
-    * 32 cores/node
-    * 128 GB/node
-    * Use ``-C haswell`` with any ``salloc`` or ``srun`` commands.
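When scripting runs, a common pattern is to match ``num_workers`` to the cores on the node type requested with ``-C``. The node-type list above can be captured in a small lookup; this helper is purely illustrative and not part of ``e3sm_diags``, and the core counts simply mirror the guide.

```python
# Hypothetical helper: map a salloc/srun -C node type to a num_workers
# value for a single node. Core counts mirror the guide above.
CORES_PER_NODE = {
    "haswell": 32,
    "knl": 68,
}

def workers_for(node_type):
    """Return a sensible single-node num_workers for `node_type`."""
    try:
        return CORES_PER_NODE[node_type]
    except KeyError:
        raise ValueError(f"unknown node type: {node_type!r}")

print(workers_for("haswell"))  # 32
```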
-
-For more information on how to run any batch job on Cori, consult
-`the documentation here `_.
-
-Interactive session on compute nodes
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-First, request an interactive session with a single node (32 cores with Cori Haswell, 68 cores with Cori KNL)
-for one hour (running this example should take much less than this).
-
-If obtaining a session takes too long, try to use the ``debug`` partition.
-Note that the maximum time allowed for this partition is ``00:30:00``.
-
-    ::
-
-        salloc --nodes=1 --partition=regular --time=01:00:00 -C haswell
-
-
-Once the session is available, launch E3SM Diagnostics.
-
-    ::
-
-        python e3sm_diags_container.py --shifter -p myparams.py
-
-**Tip:** You can select the version of the container you want to run with the ``--container_version`` argument.
-If this argument isn't defined, it defaults to the ``latest`` container.
-
-    ::
-
-        python e3sm_diags_container.py --shifter --container_version v1.5.0 -p myparams.py
-
-
-Batch job
-^^^^^^^^^
-
-Alternatively, you can also create a script and submit it to the batch system.
-Copy and paste the code below into a file named ``diags.bash``.
-Remember to change the directory below to one that is accessible to you.
-
-    .. code:: bash
-
-        #!/bin/bash -l
-        #SBATCH --job-name=diags
-        #SBATCH --output=diags.o%j
-        #SBATCH --partition=regular
-        #SBATCH --account=acme
-        #SBATCH --nodes=1
-        #SBATCH --time=01:00:00
-        #SBATCH -C haswell
-
-        # Please change the directory below.
-        cd /global/cscratch1/sd/golaz/tmp
-        wget https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/e3sm_diags/container/e3sm_diags_container.py
-        python e3sm_diags_container.py --shifter -p myparams.py
-
-And then submit it:
-
-    ::
-
-        sbatch diags.bash
-
-View the status of your job with ``squeue -u ``.
-Here's the meaning of some values under the State (``ST``) column:
-
-* ``PD``: Pending
-* ``R``: Running
-* ``CA``: Cancelled
-* ``CD``: Completed
-* ``F``: Failed
-* ``TO``: Timeout
-* ``NF``: Node Failure
-
-
-Back to running the latitude-longitude contour set
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-5. Once you've run the diagnostics in an interactive session or via a batch job, open the following webpage to view the results.
-
-
-    ::
-
-        lat_lon_demo/viewer/index.html
-
-**Tip:** Once you're on the webpage for a specific plot, click on the
-'Output Metadata' drop down menu to view the metadata for the displayed plot.
-Running that command allows the displayed plot to be recreated.
-Changing any of the options will modify just that resulting figure.
-
-
-.. _cori-params:
-
-Running all of the diagnostics sets
------------------------------------
-
-Copy and paste the following into ``all_sets.py`` using your
-favorite text editor:
-
-    .. code:: python
-
-        reference_data_path = '/global/project/projectdirs/acme/e3sm_diags/obs_for_e3sm_diags/climatology/'
-        test_data_path = '/global/project/projectdirs/acme/e3sm_diags/test_model_data_for_acme_diags/climatology/'
-
-        test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison'
-
-        # Not defining a sets parameter runs all of the default sets:
-        # ['zonal_mean_xy', 'zonal_mean_2d', 'lat_lon', 'polar', 'cosp_histogram']
-
-        # 'mpl' and 'vcs' are for matplotlib or vcs plots respectively.
-        backend = 'mpl'
-
-        # Name of folder where all results will be stored.
-        results_dir = 'diag_demo'
-
-        # Optional settings below:
-
-        diff_title = 'Model - Obs'
-
-        multiprocessing = True
-        # You can set this to 64 if running on the KNL nodes.
-        num_workers = 32
-
-
-Compared to the previous short test above, note the following changes:
-
-* Plots for all the available sets ('zonal_mean_xy', 'zonal_mean_2d',
-  'lat_lon', 'polar', 'cosp_histogram') are generated.
-* Multiprocessing with 32 workers is enabled.
-
-
-6. 
Again, run the diagnostics with this new parameter file (``all_sets.py``), either
-   in an interactive session or via a batch job.
-
-
-7. Open the following webpage to view the results.
-
-    ::
-
-        diag_demo/viewer/index.html
-
-
-
-Advanced: Running custom diagnostics
-------------------------------------
-The following steps are for 'advanced' users, who want to run custom diagnostics.
-Most users will not run the software like this.
-
-
-By default, all of the E3SM diagnostics are run for the sets that we defined above.
-This takes some time, so instead we create our own diagnostics to be run.
-
-
-8. Copy and paste the code below in ``mydiags.cfg``.
-Check :doc:`Available Parameters <../available-parameters>`
-for all available parameters.
-
-For more examples of these types of files, look
-`here `_
-for the cfg file that was used to create all of the latitude-longitude sets.
-
-
-    ::
-
-        [#]
-        sets = ["lat_lon"]
-        case_id = "GPCP_v2.2"
-        variables = ["PRECT"]
-        ref_name = "GPCP_v2.2"
-        reference_name = "GPCP (yrs1979-2014)"
-        seasons = ["ANN", "DJF"]
-        regions = ["global"]
-        test_colormap = "WhiteBlueGreenYellowRed.rgb"
-        reference_colormap = "WhiteBlueGreenYellowRed.rgb"
-        diff_colormap = "BrBG"
-        contour_levels = [0.5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 16]
-        diff_levels = [-5, -4, -3, -2, -1, -0.5, 0.5, 1, 2, 3, 4, 5]
-
-        [#]
-        sets = ["lat_lon"]
-        case_id = "SST_CL_HadISST"
-        variables = ["SST"]
-        ref_name = "HadISST_CL"
-        reference_name = "HadISST/OI.v2 (Climatology) 1982-2001"
-        seasons = ["ANN", "MAM"]
-        contour_levels = [-1, 0, 1, 3, 6, 9, 12, 15, 18, 20, 22, 24, 26, 28, 29]
-        diff_levels = [-5, -4, -3, -2, -1, -0.5, -0.2, 0.2, 0.5, 1, 2, 3, 4, 5]
-
-9. Run E3SM diagnostics with the ``-d`` parameter.
-
-    ::
-
-        python e3sm_diags_container.py --shifter -p myparams.py -d mydiags.cfg
-
-
diff --git a/docs/source/quickguides/old/quick-guide-docker.rst b/docs/source/quickguides/old/quick-guide-docker.rst
deleted file mode 100644
index c7d966cf7..000000000
--- a/docs/source/quickguides/old/quick-guide-docker.rst
+++ /dev/null
@@ -1,124 +0,0 @@
-
-Quick guide for using Docker (v1)
-==================================
-
-Docker allows software packages, called containers, to be run.
-E3SM Diagnostics has a Docker container that makes running the
-software easier than traditional methods.
-Any machine that has Docker installed can use the
-instructions below to run E3SM Diagnostics.
-
-Obtaining the container image
------------------------------
-
-1. Make sure that Docker is installed and running on your machine.
-To check this, run the command below and you should see some output.
-
-    ::
-
-        docker info
-
-2. View all of the ``e3sm_diags`` images.
-
-    ::
-
-        docker images | grep e3sm_diags
-
-If the version you want to use is already available, then please continue to step 4.
-
-Otherwise, you'll need to download the image you want, shown in step 3.
-
-3. If the specific version you want or the ``latest`` image is **not** shown,
-download it. You can view all of the images available on the
-`e3sm_diags Docker Hub `_.
-
-Below, we are getting the image with the latest tag:
-
-    ::
-
-        docker pull e3sm/e3sm_diags:latest
-
-Running a quick test
---------------------
-
-Since you probably don't have much sample data on your machine,
-below are the steps to run a quick test. If you do have data,
-please go to the "Running more diagnostics" section below.
-
-4. Clone the ``e3sm_diags`` repo, and go to the test directory.
-
-    ::
-
-        git clone https://github.com/E3SM-Project/e3sm_diags.git
-        cd e3sm_diags/tests/system/
-
-5. ``wget`` or ``curl`` the script to run the container.
- - :: - - wget https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/e3sm_diags/container/e3sm_diags_container.py - - # Or use this: - curl -O https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/e3sm_diags/container/e3sm_diags_container.py - -6. Run your diagnostics and examine the sample output. - - :: - - python e3sm_diags_container.py --docker -p all_sets.py -d all_sets.cfg - -**Tip:** You can select the version of the container you want to run with the ``--container_version`` argument. -If this argument isn't defined, it defaults to the ``latest`` container. - - :: - - python e3sm_diags_container.py --docker --container_version v1.6.0 -p myparams.py - - -7. Open the html below to view the results. - - :: - - all_sets_results/viewer/index.html - -Running more diagnostics ------------------------- - -To run other, more interesting diagnostics, you must download the data from one of our supported -machines (ALCF Cooley, NERSC Cori and others). -For more information on the format of the input data, please see the -:doc:`input data requirements <../../input-data-requirement>`. - -Below are the paths to the **observational data**: - - * Climatology data (6GB): - * NERSC: ``/global/project/projectdirs/acme/e3sm_diags/obs_for_e3sm_diags/climatology/`` - * ALCF: ``/lus/theta-fs0/projects/ClimateEnergy_3/e3sm_diags/obs_for_e3sm_diags/climatology/`` - * Time-series data (145GB): - * NERSC: ``/global/project/projectdirs/acme/e3sm_diags/obs_for_e3sm_diags/time-series/`` - * ALCF: ``/lus/theta-fs0/projects/ClimateEnergy_3/e3sm_diags/obs_for_e3sm_diags/time-series/`` - -We also have **sample model data** as well. You can use your own model data as well, -either climatology or time-series files created via ``nco``. -Again, if you want to use your own data, please see the -:doc:`input data requirements <../../input-data-requirement>`. - - * Climatology data (42GB): - * We have data from three models in these directories, from about 11.5GB to 19GB. 
- * NERSC: ``/global/project/projectdirs/acme/e3sm_diags/test_model_data_for_acme_diags/climatology/`` - * ALCF: ``/lus/theta-fs0/projects/ClimateEnergy_3/e3sm_diags/test_model_data_for_e3sm_diags/climatology/`` - * Time-series data (107GB): - * We have E3SM v1 data (94GB) and CESM1-CAM5 CMIP data (14GB). - * NERSC: ``/global/project/projectdirs/acme/e3sm_diags/test_model_data_for_acme_diags/time-series/`` - * ALCF: ``/lus/theta-fs0/projects/ClimateEnergy_3/e3sm_diags/test_model_data_for_e3sm_diags/time-series/`` - -Once the data is downloaded you can follow one of the -:doc:`many examples that we have <../../examples/index>`. - -Some points to remember: - - * You must change the ``reference_data_path`` and ``test_data_path`` accordingly. - - * Every instance of ``e3sm_diags`` should be ``python e3sm_diags_container.py --docker``. - - * Ex: Use ``python e3sm_diags_container.py --docker -p myparams.py`` instead of ``e3sm_diags -p myparams.py``. diff --git a/docs/source/quickguides/old/quick-guide-edison-shifter.rst b/docs/source/quickguides/old/quick-guide-edison-shifter.rst deleted file mode 100644 index 34414917f..000000000 --- a/docs/source/quickguides/old/quick-guide-edison-shifter.rst +++ /dev/null @@ -1,259 +0,0 @@ - -Quick guide for NERSC Edison using Shifter (v1) -=============================================== - -Obtaining the container image ------------------------------ - -Shifter is the container runtime used at NERSC to run containers. -It's an alternative to Docker, but supports Docker containers. - -1. View the ``e3sm_diags`` images available at NERSC. - - :: - - shifterimg images | grep e3sm_diags - - -If the version you want to use is already available, then please continue to step 3. - -Otherwise, you'll need to download the image you want, shown in step 2. - - -2. If the specific version you want or the ``latest`` image **is not shown**, download it. -You can view all of the images available on the -`e3sm_diags Docker Hub `_. 
-Below, we are getting the image with the latest tag.
-
-    ::
-
-        shifterimg -v pull docker:e3sm/e3sm_diags:latest
-
-* You'll see the same message with a timestamp print multiple times.
-  This is normal and takes around 10 minutes or so.
-  This is a quirk of Shifter; we don't know why it happens.
-* Once an image is downloaded from a public repo like this one, all users on NERSC can use it.
-* You also cannot delete an image that you downloaded via Shifter for now.
-  Please email NERSC support and they can do that for you.
-
-
-3. ``wget`` the following script.
-
-    ::
-
-        wget https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/e3sm_diags/container/e3sm_diags_container.py
-
-
-
-Running the entire annual latitude-longitude contour set
---------------------------------------------------------
-
-4. Copy and paste the below code into ``myparams.py`` using your favorite text editor. Adjust any options as you like.
-
-    **Tip:** Make a folder in the following directory ``/global/project/projectdirs/acme/www/`` based on your username.
-    Then you can set ``results_dir`` to ``/global/project/projectdirs/acme/www//lat_lon_demo`` in ``myparams.py`` below
-    to view the results via a web browser here: http://portal.nersc.gov/project/acme//lat_lon_demo
-
-    .. code:: python
-
-        reference_data_path = '/global/project/projectdirs/acme/e3sm_diags/obs_for_e3sm_diags/climatology/'
-        test_data_path = '/global/project/projectdirs/acme/e3sm_diags/test_model_data_for_acme_diags/climatology/'
-
-        test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison'
-
-        sets = ["lat_lon"]
-        seasons = ["ANN"]
-
-        # 'mpl' and 'vcs' are for matplotlib or vcs plots respectively.
-        backend = 'mpl'
-
-        # Name of folder where all results will be stored.
-        results_dir = 'lat_lon_demo'
-
-Since Shifter cannot be run on the login nodes, it must be run either in an
-**interactive session** on compute nodes, or as a **batch job**.
-
-
-Interactive session on compute nodes
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-First, request an interactive session with a single node (24 cores) for one hour (running this example should take much less than this).
-
-If obtaining a session takes too long, try to use the ``debug`` partition.
-Note that the maximum time allowed for this partition is ``00:30:00``.
-
-    ::
-
-        salloc --nodes=1 --partition=regular --time=01:00:00 -C haswell
-
-
-Once the session is available, launch E3SM Diagnostics.
-
-    ::
-
-        python e3sm_diags_container.py --shifter -p myparams.py
-
-**Tip:** You can select the version of the container you want to run with the ``--container_version`` argument.
-If this argument isn't defined, it defaults to the ``latest`` container.
-
-    ::
-
-        python e3sm_diags_container.py --shifter --container_version v1.5.0 -p myparams.py
-
-
-Batch job
-^^^^^^^^^
-
-Alternatively, you can also create a script and submit it to the batch system.
-Copy and paste the code below into a file named ``diags.bash``.
-Remember to change the directory below to one that is accessible to you.
-
-    .. code:: bash
-
-        #!/bin/bash -l
-        #SBATCH --job-name=diags
-        #SBATCH --output=diags.o%j
-        #SBATCH --partition=regular
-        #SBATCH --account=acme
-        #SBATCH --nodes=1
-        #SBATCH --time=01:00:00
-
-        # Please change the directory below.
-        cd /global/cscratch1/sd/golaz/tmp
-        wget https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/e3sm_diags/container/e3sm_diags_container.py
-        python e3sm_diags_container.py --shifter -p myparams.py
-
-And then submit it:
-
-    ::
-
-        sbatch diags.bash
-
-View the status of your job with ``squeue -u ``.
-Here's the meaning of some values under the State (``ST``) column:
-
-* ``PD``: Pending
-* ``R``: Running
-* ``CA``: Cancelled
-* ``CD``: Completed
-* ``F``: Failed
-* ``TO``: Timeout
-* ``NF``: Node Failure
-
-
-Back to running the latitude-longitude contour set
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-5. 
Once you ran the diagnostics in an interactive session or via a batch job, open the following webpage to view the results. - - - :: - - lat_lon_demo/viewer/index.html - -**Tip:** Once you're on the webpage for a specific plot, click on the 'Output Metadata' drop down menu to view the metadata for the displayed plot. -Running that command allows the displayed plot to be recreated. -Changing any of the options will modify the just that resulting figure. - - - -Running all of the diagnostics sets ------------------------------------ - -Copy and paste the following into ``all_sets.py`` using your -favorite text editor: - - .. code:: python - - reference_data_path = '/global/project/projectdirs/acme/e3sm_diags/obs_for_acme_diags/' - test_data_path = '/global/project/projectdirs/acme/e3sm_diags/test_model_data_for_acme_diags/' - - test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison' - - # Not defining a sets parameter runs all of the default sets: - # ['zonal_mean_xy', 'zonal_mean_2d', 'lat_lon', 'polar', 'cosp_histogram'] - - # 'mpl' and 'vcs' are for matplotlib or vcs plots respectively. - backend = 'mpl' - - # Name of folder where all results will be stored. - results_dir = 'diag_demo' - - # Optional settings below: - - diff_title = 'Model - Obs' - - multiprocessing = True - num_workers = 24 - - -Compared to the previous short test above, note the following changes: - -* Plots for all the available sets ('zonal_mean_xy', 'zonal_mean_2d', - 'lat_lon', 'polar', 'cosp_histogram') are generated. -* Multiprocessing with 24 workers is enabled. - - -6. Again, run the diagnostics with this new parameter file (``all_sets.py``), either - in an interactive session or via a batch job. - - -7. Open the following webpage to view the results. - - :: - - diags_demo/viewer/index.html - - - -Advanced: Running custom diagnostics ------------------------------------- -The following steps are for 'advanced' users, who want to run custom diagnostics. 
-So most users will not run the software like this. - - -By default, all of the E3SM diagnostics are ran for the sets that we defined above. -This takes some time, so instead we create our own diagnostics to be ran. - - -8. Copy and paste the code below in ``mydiags.cfg``. -Check :doc:`Available Parameters <../../available-parameters>` -for all available parameters. - -For more examples of these types of files, look -`here `_ -for the cfg file that was used to create all of the latitude-longitude sets. - - - :: - - [#] - sets = ["lat_lon"] - case_id = "GPCP_v2.2" - variables = ["PRECT"] - ref_name = "GPCP_v2.2" - reference_name = "GPCP (yrs1979-2014)" - seasons = ["ANN", "DJF"] - regions = ["global"] - test_colormap = "WhiteBlueGreenYellowRed.rgb" - reference_colormap = "WhiteBlueGreenYellowRed.rgb" - diff_colormap = "BrBG" - contour_levels = [0.5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 16] - diff_levels = [-5, -4, -3, -2, -1, -0.5, 0.5, 1, 2, 3, 4, 5] - - [#] - sets = ["lat_lon"] - case_id = "SST_CL_HadISST" - variables = ["SST"] - ref_name = "HadISST_CL" - reference_name = "HadISST/OI.v2 (Climatology) 1982-2001" - seasons = ["ANN", "MAM"] - contour_levels = [-1, 0, 1, 3, 6, 9, 12, 15, 18, 20, 22, 24, 26, 28, 29] - diff_levels = [-5, -4, -3, -2, -1, -0.5, -0.2, 0.2, 0.5, 1, 2, 3, 4, 5] - -9. Run E3SM diagnostics with the ``-d`` parameter. - - :: - - python e3sm_diags_container.py --shifter -p myparams.py -d mydiags.cfg - - diff --git a/docs/source/quickguides/old/quick-guide-edison.rst b/docs/source/quickguides/old/quick-guide-edison.rst deleted file mode 100644 index 6aa0fe71a..000000000 --- a/docs/source/quickguides/old/quick-guide-edison.rst +++ /dev/null @@ -1,209 +0,0 @@ - -Quick guide for NERSC Edison (v1) -================================= - -Installation ------------- - -1. Log on to NERSC Edison - -2. Make sure you're using ``bash`` - -:: - - bash - -3. Load the Anaconda module - -:: - - module load python/2.7-anaconda-4.4 - -4. 
Get the yml file to create an environment. - -:: - - wget https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/conda/e3sm_diags_env.yml - - -5. Allow Anaconda to download packages, even with a firewall. - -:: - - conda config --set ssl_verify false - binstar config --set verify_ssl False - - -6. Use Anaconda to create a new environment with ``e3sm_diags`` installed. -Tip: You can change the name of the environment by adding ``-n new_env_name`` to the end of ``conda env create ...``. - -:: - - conda env create -f e3sm_diags_env.yml - source activate e3sm_diags_env - - -Running the entire Latitude-longitude contour set -------------------------------------------------- - -7. Copy and paste the below code into ``myparams.py`` using your -favorite text editor. Adjust any options as you like. - -.. code:: python - - reference_data_path = '/global/project/projectdirs/acme/e3sm_diags/obs_for_e3sm_diags/climatology/' - test_data_path = '/global/project/projectdirs/acme/e3sm_diags/test_model_data_for_acme_diags/climatology/' - - test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison' - - sets = ["lat_lon"] - - backend = 'mpl' # 'vcs' is for vcs plots - - results_dir = 'lat_lon_demo' # name of folder where all results will be stored - - -8. Run E3SM diags. - -:: - - e3sm_diags -p myparams.py - -9. Open the following webpage to view the results: - -:: - - lat_lon_demo/viewer/index.html - - -Tip: Once you're on the webpage for a specific plot, click on the 'Output Metadata' -drop down menu to view the metadata for the displayed plot. - -* Running that command allows the displayed plot to be recreated. Changing any of the options will modify the resulting figure. - -Running all of the diagnostics sets ------------------------------------ - -Copy and paste the following into ``all_sets.py`` using your -favorite text editor: - -.. 
code:: python - - reference_data_path = '/global/project/projectdirs/acme/e3sm_diags/obs_for_acme_diags/' - test_data_path = '/global/project/projectdirs/acme/e3sm_diags/test_model_data_for_acme_diags/' - - test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison' - - # Not defining a sets parameter runs all of the default sets: - # ['zonal_mean_xy', 'zonal_mean_2d', 'lat_lon', 'polar', 'cosp_histogram'] - - # optional settings below - diff_title = 'Model - Obs' - - backend = 'mpl' # 'mpl' is for the matplotlib plots. - - results_dir = 'diag_demo' # name of folder where all results will be stored - - multiprocessing = True - num_workers = 24 - -Compared to the previous short test above, note the following changes: - -* Generate plots for all the available sets ('zonal_mean_xy', 'zonal_mean_2d', - 'lat_lon', 'polar', 'cosp_histogram'). -* Turn on multiprocessing with 24 workers. - -Since the example above turns on multiprocessing, it should not be run interactively -on the Edison login nodes (NERSC would likely flag it and let you know about it). -Instead, it can be run either in an interactive session on compute nodes, or as a batch -job. - - -Interactive session on compute nodes -^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -First, request an interactive session with a single node (24 cores) for one hour -(running this example should take much less than this): :: - - salloc --nodes=1 --partition=regular --time=01:00:00 - -Once the session is available, launch E3SM Diags: :: - - source activate e3sm_diags_env - e3sm_diags -p all_sets.py - -Batch job -^^^^^^^^^ - -Alternatively, you can also create a script and submit it to the batch system. -Copy and paste the code below into a file named ``diags.bash``: - -.. 
code:: bash
-
-    #!/bin/bash -l
-    #SBATCH --job-name=diags
-    #SBATCH --output=diags.o%j
-    #SBATCH --partition=regular
-    #SBATCH --account=acme
-    #SBATCH --nodes=1
-    #SBATCH --time=01:00:00
-
-    source activate e3sm_diags_env
-    cd /global/cscratch1/sd/golaz/tmp
-    e3sm_diags -p all_sets.py
-
-And then submit it ::
-
-    sbatch diags.bash
-
-That's it!
-
-
-Advanced: Running custom diagnostics
-------------------------------------
-The following steps are for 'advanced' users, who want to run custom diagnostics.
-So most users will not run the software like this.
-
-By default, all of the E3SM diagnostics are run for the ``sets`` that
-we defined above. This takes some time, so instead we create our own
-diagnostics to be run.
-
-10. Copy and paste the code below into ``mydiags.cfg``.
-Check :doc:`defining parameters <../../available-parameters>`
-for all available parameters.
-
-::
-
-    [#]
-    case_id = "GPCP_v2.2"
-    variables = ["PRECT"]
-    ref_name = "GPCP_v2.2"
-    reference_name = "GPCP (yrs1979-2014)"
-    seasons = ["ANN", "DJF"]
-    regions = ["global"]
-    test_colormap = "WhiteBlueGreenYellowRed.rgb"
-    reference_colormap = "WhiteBlueGreenYellowRed.rgb"
-    diff_colormap = "BrBG"
-    contour_levels = [0.5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 16]
-    diff_levels = [-5, -4, -3, -2, -1, -0.5, 0.5, 1, 2, 3, 4, 5]
-
-    [#]
-    case_id = "SST_CL_HadISST"
-    variables = ["SST"]
-    ref_name = "HadISST_CL"
-    reference_name = "HadISST/OI.v2 (Climatology) 1982-2001"
-    seasons = ["ANN", "MAM"]
-    contour_levels = [-1, 0, 1, 3, 6, 9, 12, 15, 18, 20, 22, 24, 26, 28, 29]
-    diff_levels = [-5, -4, -3, -2, -1, -0.5, -0.2, 0.2, 0.5, 1, 2, 3, 4, 5]
-
-11. Run E3SM diags.
-
-::
-
-    e3sm_diags -p myparams.py -d mydiags.cfg
-
-12. 
Open the following webpage to view the results: - -:: - - lat_lon_demo/viewer/index.html diff --git a/docs/source/quickguides/old/quick-guide-multiprocessing-cori-haswell.rst b/docs/source/quickguides/old/quick-guide-multiprocessing-cori-haswell.rst deleted file mode 100644 index cd3a26e9f..000000000 --- a/docs/source/quickguides/old/quick-guide-multiprocessing-cori-haswell.rst +++ /dev/null @@ -1,65 +0,0 @@ -To enable multiprocessing rather than running in serial, the program will need to be run in an -**interactive session** on compute nodes, or as a **batch job**. - - -Interactive session on compute nodes -''''''''''''''''''''''''''''''''''''' - -First, request an interactive session with a single node -(32 cores with Cori Haswell, 68 cores with Cori KNL) -for one hour (running this example should take much less than this). -If obtaining a session takes too long, try to use the ``debug`` partition. -Note that the maximum time allowed for that partition is ``00:30:00``. - - :: - - salloc --nodes=1 --partition=regular --time=01:00:00 -C haswell - - -Once the session is available, launch E3SM Diagnostics, to activate ``e3sm_unified``: - - :: - - source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_cori-haswell.sh - python run_e3sm_diags.py --multiprocessing --num_workers=32 - - -We could have also set these multiprocessing parameters in the ``run_e3sm_diags.py`` as well -but we're showing that you can still submit parameters via the command line. - -Batch job -''''''''' - -Alternatively, you can also create a script and submit it to the batch system. -Copy and paste the code below into a file named ``diags.bash``. - - .. 
code:: bash
-
-        #!/bin/bash -l
-        #SBATCH --job-name=diags
-        #SBATCH --output=diags.o%j
-        #SBATCH --partition=regular
-        #SBATCH --account=e3sm
-        #SBATCH --nodes=1
-        #SBATCH --time=01:00:00
-        #SBATCH -C haswell
-
-        source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_cori-haswell.sh
-        python run_e3sm_diags.py --multiprocessing --num_workers=32
-
-And then submit it:
-
-    ::
-
-        sbatch diags.bash
-
-View the status of your job with ``squeue -u ``.
-Here's the meaning of some values under the State (``ST``) column:
-
-* ``PD``: Pending
-* ``R``: Running
-* ``CA``: Cancelled
-* ``CD``: Completed
-* ``F``: Failed
-* ``TO``: Timeout
-* ``NF``: Node Failure
diff --git a/docs/source/quickguides/old/quick-guide-rhea.rst b/docs/source/quickguides/old/quick-guide-rhea.rst
deleted file mode 100644
index 52ee5452a..000000000
--- a/docs/source/quickguides/old/quick-guide-rhea.rst
+++ /dev/null
@@ -1,170 +0,0 @@
-Quick guide for OLCF Rhea (v1)
-==============================
-
-Running the software on Rhea shares similar steps as running on other machines. The paths to the datasets differ across file systems.
-We run ``e3sm_diags`` on ``Rhea``, which shares the same file system as ``Titan`` and is more suitable for data analysis.
-
-1. Log on to ``rhea``:
-
-::
-
-    ssh -Y rhea.ccs.ornl.gov
-
-2. If you don't have Anaconda installed, follow `this
-guide `__.
-
-3. Make sure you are using ``bash``
-
-::
-
-    bash
-
-Installing and creating an environment
---------------------------------------
-The steps below detail how to create your own environment with ``e3sm_diags``.
-However, it is possible to use the `E3SM Unified Environment `__ instead.
-If you decide to use the unified environment, please do so and skip to step 5 (Note, as of April 4th, 2018, the latest version of unified-env is 1.1.3).
-
-4. Allow Anaconda to download packages, even with a firewall.
-
-::
-
-    conda config --set ssl_verify false
-    binstar config --set verify_ssl False
-
-5. Update Anaconda.
-
-::
-
-    conda update conda
-
-6. Get the yml file to create an environment.
-
-::
-
-    wget https://raw.githubusercontent.com/E3SM-Project/e3sm_diags/master/conda/e3sm_diags_env.yml
-
-7. Remove any cached Anaconda packages. This will ensure that you always get the latest packages.
-
-::
-
-    conda clean --all
-
-8. Use Anaconda to create a new environment with ``e3sm_diags`` installed.
-**Tip:** You can change the name of the environment by adding ``-n new_env_name`` to the end of ``conda env create ...``.
-
-::
-
-    conda env create -f e3sm_diags_env.yml
-    source activate e3sm_diags_env
-
-
-Running all sets of diagnostics
--------------------------------------------------
-
-9. Copy and paste the below code into ``myparams.py`` using your
-favorite text editor. Adjust any options as you like.
-
-.. code:: python
-
-
-    reference_data_path = '/ccs/proj/cli115/e3sm_diags_data/obs_for_e3sm_diags/climatology/'
-    test_data_path = '/ccs/proj/cli115/e3sm_diags_data/test_model_data_for_acme_diags/'
-
-    test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison'
-
-    #sets = ["lat_lon"] # To run only lat_lon contour diags
-    #without specifying sets, sets = ['zonal_mean_xy', 'zonal_mean_2d', 'lat_lon', 'polar', 'cosp_histogram']
-
-    diff_title = 'Model - Obs'
-
-    backend = 'mpl' # 'vcs' is for vcs plotting backend.
-
-    results_dir = 'lat_lon_demo' # name of folder where all results will be stored
-
-
-10. Run the diags.
-
-::
-
-    e3sm_diags -p myparams.py
-
-
-11. Open the following webpage to view the results.
-
-::
-
-    firefox --no-remote lat_lon_demo/viewer/index.html &
-
-- The ``--no-remote`` option uses the Firefox installed on this machine,
-  and not the one on your machine.
-
-The above example runs the package in serial. Instead, it can be run much faster using multi-processing, either in an interactive session on compute nodes, or as a batch
-job.
-
-Add the lines below to ``myparams.py``:
-
-.. code:: python
-
-    multiprocessing = True
-    num_workers = 16 # Number of processes to use
-
-**Tip:** Once you're on the webpage for a specific plot, click on the 'Output Metadata'
-drop down menu to view the metadata for the displayed plot.
-
-* Running that command allows the displayed plot to be recreated. Changing any of the options will modify the resulting figure.
-
-
-Interactive session on compute nodes
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-First, request an interactive session with a single node (16 cores) for half an hour
-(running this example should take much less than this): ::
-
-
-    qsub -I -A charging_project_name -q name_of_queue -V -l nodes=1 -l walltime=00:30:00
-
-Once the session is available, launch E3SM Diags: ::
-
-    source activate e3sm_diags_env
-    e3sm_diags -p myparams.py
-
-Batch job
-^^^^^^^^^
-
-Alternatively, you can also create a script and submit it to the batch system.
-Copy and paste the code below into a file named ``diags.pbs`` and **change the following**:
-
-* Change ``charging_project_name`` to a valid value
-* Change ``$YOUR_WORKING_DIR`` to your working directory
-* Get the path of your Anaconda binary
-
-  * Run ``which conda``, and get a path like so:
-    ``/ccs/home/zhang40/anaconda3/envs/e3sm_diags_env/bin/conda``
-  * Copy everything from the beginning to 'anaconda2' (or 'anaconda3') and put it in:
-    ``export PATH="PASTE_HERE/bin:$PATH"``
-
-  An example path is:
-
-  ``export PATH="/ccs/home/zhang40/anaconda3/bin:$PATH"``
-
-.. 
code:: bash - - #!/bin/bash -l - # PLEASE CHANGE: charging_project_name - #PBS -A charging_project_name - #PBS -N e3sm_diags_test - #PBS -j oe - #PBS -l walltime=0:30:00,nodes=1 - - # PLEASE CHANGE: the line below to your valid path - export PATH="/ccs/home/zhang40/anaconda3/bin:$PATH" - source activate e3sm_diags_env - # PLEASE CHANGE: $YOUR_WORKING_DIR to a valid directory - cd $YOUR_WORKING_DIR - e3sm_diags -p myparams.py - -And then submit it :: - - qsub diags.pbs - From 41863485b3c0faded804932a9f242b025cb86d12 Mon Sep 17 00:00:00 2001 From: tomvothecoder Date: Tue, 28 Jan 2025 14:19:34 -0600 Subject: [PATCH 3/6] Remove acme1 references and warning about importing module --- docs/source/index.rst | 5 - .../quickguides/generate_quick_guides.py | 10 +- docs/source/quickguides/index.rst | 1 - docs/source/quickguides/quick-guide-acme1.rst | 273 ------------------ docs/source/quickguides/quick-guide-anvil.rst | 2 +- .../quickguides/quick-guide-chrysalis.rst | 4 - docs/source/quickguides/quick-guide-compy.rst | 4 - .../quickguides/quick-guide-general.rst | 4 - .../quickguides/quick-guide-generic.rst | 4 - .../quickguides/quick-guide-perlmutter.rst | 4 - 10 files changed, 2 insertions(+), 309 deletions(-) delete mode 100644 docs/source/quickguides/quick-guide-acme1.rst diff --git a/docs/source/index.rst b/docs/source/index.rst index 482107d68..2e4dfc545 100644 --- a/docs/source/index.rst +++ b/docs/source/index.rst @@ -13,11 +13,6 @@ Welcome to the E3SM Diagnostics Package documentation hub. To change the documentation version, use the version selector in the bottom left-hand corner. Please note, documentation for versions ``v2.5.0`` are not available in the version selector. -.. warning:: - As of ``v2.6.0``, ``e3sm_diags`` should be used as the module name instead of - ``acme_diags``. Instances of ``acme_diags`` in the Python import statements should - be replaced accordingly. - .. 
toctree:: :maxdepth: 2 :caption: Contents: diff --git a/docs/source/quickguides/generate_quick_guides.py b/docs/source/quickguides/generate_quick_guides.py index 99a3f11ac..f0abbdbb1 100644 --- a/docs/source/quickguides/generate_quick_guides.py +++ b/docs/source/quickguides/generate_quick_guides.py @@ -2,14 +2,6 @@ import subprocess EXPANSIONS = { - 'acme1': { - 'machine_name': 'Acme1', - 'activation_path': '/usr/local/e3sm_unified/envs/load_latest_e3sm_unified_acme1.sh', - 'obs_path': '/p/user_pub/e3sm/diagnostics/observations/Atm/', - 'test_data_path': '/p/user_pub/e3sm/e3sm_diags_data/test_model_data_for_acme_diags/', - 'html_path': '/var/www/acme/acme-diags//', - 'web_address': 'https://acme-viewer.llnl.gov//', - }, 'anvil': { 'machine_name': 'Anvil', 'activation_path': '/lcrc/soft/climate/e3sm-unified/load_latest_e3sm_unified_anvil.sh', @@ -78,7 +70,7 @@ def generate_quick_guides(): else: expansion = EXPANSIONS[machine_name][expansion_name] new_line = line.replace(expansion_indicator, expansion) - if multiprocessing and machine_name != 'acme1': + if multiprocessing: multiprocessing_file = '{d}quick-guide-multiprocessing-{m}.rst'.format( d=quick_guides_dir, m=machine_name) with open(multiprocessing_file, 'r') as mpf: diff --git a/docs/source/quickguides/index.rst b/docs/source/quickguides/index.rst index bed6aa782..30e7a34a8 100644 --- a/docs/source/quickguides/index.rst +++ b/docs/source/quickguides/index.rst @@ -6,7 +6,6 @@ Quick Guides :maxdepth: 2 quick-guide-general - quick-guide-acme1 quick-guide-anvil quick-guide-chrysalis quick-guide-compy diff --git a/docs/source/quickguides/quick-guide-acme1.rst b/docs/source/quickguides/quick-guide-acme1.rst deleted file mode 100644 index 5d7724f4f..000000000 --- a/docs/source/quickguides/quick-guide-acme1.rst +++ /dev/null @@ -1,273 +0,0 @@ -.. - Comment: If you want to edit `quick-guide-{machine_name}.rst`, edit `quick-guide-generic.rst` instead and run `generate_quick_guides.py`. 
-
-Acme1 quick guide for running e3sm_diags v2
-=========================================================================
-
-.. warning::
-    As of ``v2.6.0``, ``e3sm_diags`` should be used as the module name instead of
-    ``acme_diags``. Instances of ``acme_diags`` in the Python import statements should
-    be replaced accordingly.
-
-1. Installation
------------------------------------------------------------
-
-We will use the e3sm_unified environment to install.
-For the latest stable release or if you don't have access to e3sm analysis machines,
-please instead refer to :ref:`Latest stable release `.
-
-Most of the E3SM analysis software is maintained with an Anaconda metapackage
-(E3SM unified environment).
-If you have an account on Acme1,
-then to get all of the tools in the metapackage in your path,
-use the activation command below.
-(Change ``.sh`` to ``.csh`` for csh shells.)
-
-Below, we also provide the paths for observational data needed by ``e3sm_diags`` (),
-and some sample model data for testing ().
-Both and have two subdirectories:
-``/climatology`` and ``/time-series`` for climatology and time-series data respectively.
-
-Also listed below are paths where the HTML files () must be located to be displayed
-at their corresponding web addresses ().
-
-: ``source /usr/local/e3sm_unified/envs/load_latest_e3sm_unified_acme1.sh``
-
-: ``/p/user_pub/e3sm/diagnostics/observations/Atm/``
-
-: ``/p/user_pub/e3sm/e3sm_diags_data/test_model_data_for_acme_diags/``
-
-: ``/var/www/acme/acme-diags//``
-
-: ``https://acme-viewer.llnl.gov//``
-
-
-
-2. Config and run
---------------------------------------------------------
-
-.. _Acme1_lat_lon:
-
-Running the annual mean latitude-longitude contour set
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Copy and paste the below code into ``run_e3sm_diags.py`` using your favorite text editor.
-Adjust any options as you like.
-
-    **Tip:** Some of E3SM's analysis machines (**Acme1, Anvil, Compy, Cori**)
-    have web servers setup to host html results.
-    On Acme1,
-    create the directory ``/var/www/acme/acme-diags//`` using your username.
-    Set ``results_dir`` to ``/var/www/acme/acme-diags//doc_examples/lat_lon_demo``
-    in ``run_e3sm_diags.py`` below. Then, you can view results via a web browser here:
-    https://acme-viewer.llnl.gov//doc_examples/lat_lon_demo
-
-
-    .. code:: python
-
-        import os
-        from e3sm_diags.parameter.core_parameter import CoreParameter
-        from e3sm_diags.run import runner
-
-        param = CoreParameter()
-
-        param.reference_data_path = '/p/user_pub/e3sm/diagnostics/observations/Atm/climatology/'
-        param.test_data_path = '/p/user_pub/e3sm/e3sm_diags_data/test_model_data_for_acme_diags/climatology/'
-        param.test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison'
-        param.seasons = ["ANN"] # all seasons ["ANN","DJF", "MAM", "JJA", "SON"] will run if this line is commented out
-
-        prefix = '/var/www/acme/acme-diags//doc_examples/'
-        param.results_dir = os.path.join(prefix, 'lat_lon_demo')
-        # Use the following if running in parallel:
-        #param.multiprocessing = True
-        #param.num_workers = 32
-
-        # Use below to run all core sets of diags:
-        #runner.sets_to_run = ['lat_lon','zonal_mean_xy', 'zonal_mean_2d', 'polar', 'cosp_histogram', 'meridional_mean_2d']
-        # Use below to run lat_lon map only:
-        runner.sets_to_run = ['lat_lon']
-        runner.run_diags([param])
-
-
-Run in serial with:
-
-    ::
-
-        python run_e3sm_diags.py
-
-The above run has the same results as running ``e3sm_diags -p lat_lon_params.py``
-using the code below for ``lat_lon_params.py``:
-
-
-    .. code:: python
-
-        reference_data_path = '/p/user_pub/e3sm/diagnostics/observations/Atm/climatology/'
-        test_data_path = '/p/user_pub/e3sm/e3sm_diags_data/test_model_data_for_acme_diags/climatology/'
-
-        test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison'
-
-        sets = ["lat_lon"]
-        seasons = ["ANN"]
-
-        # Name of folder where all results will be stored.
-        results_dir = '/var/www/acme/acme-diags//doc_examples/lat_lon_demo'
-
-The new way of running (no ``-p``) is implemented in version 2.0.0,
-preparing ``e3sm_diags`` to accommodate more diagnostics sets with set-specific parameters.
-
-
-
-View results on the web
-'''''''''''''''''''''''
-Once the run is completed,
-open ``https://acme-viewer.llnl.gov//doc_examples/lat_lon_demo/viewer/index.html`` to view the results.
-If you don't see the results, you may need to set proper permissions.
-Run ``chmod -R 755 /var/www/acme/acme-diags//``.
-
-**Tip:** Once you're on the webpage for a specific plot, click on the
-'Output Metadata' drop down menu to view the metadata for the displayed plot.
-Running that command allows the displayed plot to be recreated.
-Changing any of the options will modify just that resulting figure.
-
-
-
-Running all the core diagnostics sets
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-The core diagnostics sets include:
-**lat_lon**, **zonal_mean_xy**, **zonal_mean_2d**, **polar**, **cosp_histogram**,
-**meridional_mean_2d**.
-These diags share a common parameter space (core parameters).
-To run all these sets without defining set-specific parameters
-(e.g. 
**plev** for **zonal_mean_2d** and **meridional_mean_2d**),
-replace the ``runner.sets_to_run`` line in ``run_e3sm_diags.py`` with the one below:
-
-    ::
-
-        runner.sets_to_run = ['lat_lon','zonal_mean_xy', 'zonal_mean_2d', 'polar', 'cosp_histogram', 'meridional_mean_2d']
-
-
-Running area mean time series set
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-In v2.0.0, the time series set was implemented to support regionally averaged time series plotting
-using monthly mean time series input.
-This set is enabled if monthly mean time series is processed as documented
-:doc:`here <../input-data-requirement>`.
-
-A ``run_e3sm_diags.py`` example for running area mean time series alone:
-
-    .. code:: python
-
-        import os
-        from e3sm_diags.parameter.core_parameter import CoreParameter
-        from e3sm_diags.parameter.area_mean_time_series_parameter import AreaMeanTimeSeriesParameter
-        from e3sm_diags.run import runner
-
-        param = CoreParameter()
-
-        param.reference_data_path = '/p/user_pub/e3sm/diagnostics/observations/Atm/time-series/'
-        param.test_data_path = '/p/user_pub/e3sm/e3sm_diags_data/test_model_data_for_acme_diags/time-series/E3SM_v1/'
-        param.test_name = 'e3sm_v1'
-
-        prefix = '/var/www/acme/acme-diags//doc_examples/'
-        param.results_dir = os.path.join(prefix, 'area_mean_with_obs')
-        # Use the following if running in parallel:
-        #param.multiprocessing = True
-        #param.num_workers = 40
-
-        # We're passing in this new object as well, in
-        # addition to the CoreParameter object.
-
-        ts_param = AreaMeanTimeSeriesParameter()
-        #ts_param.ref_names = ['none'] # Using this setting will plot only the model data, not the observation data
-        ts_param.start_yr = '2002'
-        ts_param.end_yr = '2008'
-
-        runner.sets_to_run = ['area_mean_time_series']
-        runner.run_diags([param, ts_param])
-
-
-This set can also be run with the core diagnostics sets,
-so that all the plots are shown in one viewer.
-The following is an example to run all sets:
-
-    .. code:: python
-
-        import os
-        from e3sm_diags.parameter.core_parameter import CoreParameter
-        from e3sm_diags.parameter.area_mean_time_series_parameter import AreaMeanTimeSeriesParameter
-        from e3sm_diags.run import runner
-
-        param = CoreParameter()
-
-        param.reference_data_path = '/p/user_pub/e3sm/diagnostics/observations/Atm/climatology/'
-        param.test_data_path = '/p/user_pub/e3sm/e3sm_diags_data/test_model_data_for_acme_diags/climatology/'
-        param.test_name = '20161118.beta0.FC5COSP.ne30_ne30.edison'
-        param.multiprocessing = True
-        param.num_workers = 40
-        prefix = '/var/www/acme/acme-diags//doc_examples'
-        param.results_dir = os.path.join(prefix, 'all_sets')
-
-        #
-        ##Set specific parameters for new sets
-        ts_param = AreaMeanTimeSeriesParameter()
-        ts_param.reference_data_path = '/p/user_pub/e3sm/diagnostics/observations/Atm/time-series/'
-        ts_param.test_data_path = '/p/user_pub/e3sm/diagnostics/observations/Atm/time-series/E3SM_v1/'
-        ts_param.test_name = 'e3sm_v1'
-        ts_param.start_yr = '2002'
-        ts_param.end_yr = '2008'
-
-        runner.sets_to_run = ['lat_lon','zonal_mean_xy', 'zonal_mean_2d', 'polar', 'cosp_histogram', 'meridional_mean_2d', 'area_mean_time_series']
-        runner.run_diags([param, ts_param])
-
-
-Advanced: Running custom diagnostics
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-The following steps are for 'advanced' users, who want to run custom diagnostics.
-So, most users will not run the software like this.
-
-
-By default, with ``e3sm_diags``,
-a built-in set of variables is defined for each diagnostics set.
-To do a short run, e.g. only running through a subset of variables,
-a configuration file is needed to customize the run.
-
-
-In the following example,
-only precipitation and sea surface temperature are run to compare
-model and obs for the lat_lon set.
-Create ``mydiags.cfg`` file as below.
-
-Check :doc:`Available Parameters <../available-parameters>` for all available parameters.
-
-For a larger configuration file example, look
-`here `_
-for the cfg file that was used to create all of the latitude-longitude sets.
-
-
-    ::
-
-        [#]
-        sets = ["lat_lon"]
-        case_id = "GPCP_v2.3"
-        variables = ["PRECT"]
-        ref_name = "GPCP_v2.3"
-        reference_name = "GPCP"
-        seasons = ["ANN", "DJF", "MAM", "JJA", "SON"]
-        regions = ["global"]
-        test_colormap = "WhiteBlueGreenYellowRed.rgb"
-        reference_colormap = "WhiteBlueGreenYellowRed.rgb"
-        diff_colormap = "BrBG"
-        contour_levels = [0.5, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 12, 13, 14, 15, 16]
-        diff_levels = [-5, -4, -3, -2, -1, -0.5, 0.5, 1, 2, 3, 4, 5]
-
-
-Run E3SM diagnostics with the ``-d`` parameter.
-Use the :ref:`above run script ` and run as follows:
-
-    ::
-
-        python run_e3sm_diags.py -d mydiags.cfg
-
-
diff --git a/docs/source/quickguides/quick-guide-anvil.rst b/docs/source/quickguides/quick-guide-anvil.rst
index 5f2cb2909..b944c108b 100644
--- a/docs/source/quickguides/quick-guide-anvil.rst
+++ b/docs/source/quickguides/quick-guide-anvil.rst
@@ -54,7 +54,7 @@ Running the annual mean latitude-longitude contour set
 Copy and paste the below code into ``run_e3sm_diags.py`` using your favorite text editor.
 Adjust any options as you like.
 
-    **Tip:** Some of E3SM's analysis machines (**Acme1, Anvil, Compy, Cori**)
+    **Tip:** Some of E3SM's analysis machines (**Anvil, Compy, Cori**)
     have web servers setup to host html results.
     On Anvil,
     create the directory ``/lcrc/group/e3sm/public_html/diagnostic_output//`` using your username.
diff --git a/docs/source/quickguides/quick-guide-chrysalis.rst b/docs/source/quickguides/quick-guide-chrysalis.rst
index f29a4665d..03ec95760 100644
--- a/docs/source/quickguides/quick-guide-chrysalis.rst
+++ b/docs/source/quickguides/quick-guide-chrysalis.rst
@@ -4,10 +4,6 @@
 Chrysalis quick guide for running e3sm_diags v2
 =========================================================================
 
-.. 
warning:: - As of ``v2.6.0``, ``e3sm_diags`` should be used as the module name instead of - ``acme_diags``. Instances of ``acme_diags`` in the Python import statements should - be replaced accordingly. 1. Installation ----------------------------------------------------------- diff --git a/docs/source/quickguides/quick-guide-compy.rst b/docs/source/quickguides/quick-guide-compy.rst index 0e33e298e..eb0353f19 100644 --- a/docs/source/quickguides/quick-guide-compy.rst +++ b/docs/source/quickguides/quick-guide-compy.rst @@ -4,10 +4,6 @@ Compy quick guide for running e3sm_diags v2 ========================================================================= -.. warning:: - As of ``v2.6.0``, ``e3sm_diags`` should be used as the module name instead of - ``acme_diags``. Instances of ``acme_diags`` in the Python import statements should - be replaced accordingly. 1. Installation ----------------------------------------------------------- diff --git a/docs/source/quickguides/quick-guide-general.rst b/docs/source/quickguides/quick-guide-general.rst index 2f3c5f5d5..c3095a15b 100644 --- a/docs/source/quickguides/quick-guide-general.rst +++ b/docs/source/quickguides/quick-guide-general.rst @@ -2,10 +2,6 @@ General quick guide for running e3sm_diags v2 ========================================================================= -.. warning:: - As of ``v2.6.0``, ``e3sm_diags`` should be used as the module name instead of - ``acme_diags``. Instances of ``acme_diags`` in the Python import statements should - be replaced accordingly. 1. 
Installation ----------------------------------------------------------- diff --git a/docs/source/quickguides/quick-guide-generic.rst b/docs/source/quickguides/quick-guide-generic.rst index 754dd3cac..4298dbe72 100644 --- a/docs/source/quickguides/quick-guide-generic.rst +++ b/docs/source/quickguides/quick-guide-generic.rst @@ -4,10 +4,6 @@ #expand machine_name# quick guide for running e3sm_diags v2 ========================================================================= -.. warning:: - As of ``v2.6.0``, ``e3sm_diags`` should be used as the module name instead of - ``acme_diags``. Instances of ``acme_diags`` in the Python import statements should - be replaced accordingly. 1. Installation ----------------------------------------------------------- diff --git a/docs/source/quickguides/quick-guide-perlmutter.rst b/docs/source/quickguides/quick-guide-perlmutter.rst index bb689debd..77bf7b867 100644 --- a/docs/source/quickguides/quick-guide-perlmutter.rst +++ b/docs/source/quickguides/quick-guide-perlmutter.rst @@ -4,10 +4,6 @@ Perlmutter quick guide for running e3sm_diags v2 ========================================================================= -.. warning:: - As of ``v2.6.0``, ``e3sm_diags`` should be used as the module name instead of - ``acme_diags``. Instances of ``acme_diags`` in the Python import statements should - be replaced accordingly. 1. 
Installation ----------------------------------------------------------- From 3aab4b670d3695e984606f3b988dc672aa17fdba Mon Sep 17 00:00:00 2001 From: tomvothecoder Date: Tue, 28 Jan 2025 14:28:56 -0600 Subject: [PATCH 4/6] Update v2 references to v3 --- docs/source/index.rst | 6 +++--- docs/source/quickguides/quick-guide-anvil.rst | 2 +- docs/source/quickguides/quick-guide-chrysalis.rst | 2 +- docs/source/quickguides/quick-guide-compy.rst | 2 +- docs/source/quickguides/quick-guide-general.rst | 2 +- docs/source/quickguides/quick-guide-generic.rst | 2 +- docs/source/quickguides/quick-guide-perlmutter.rst | 2 +- 7 files changed, 9 insertions(+), 9 deletions(-) diff --git a/docs/source/index.rst b/docs/source/index.rst index 2e4dfc545..6c2c96fc6 100644 --- a/docs/source/index.rst +++ b/docs/source/index.rst @@ -6,7 +6,7 @@ .. _index-label: *************************** -E3SM Diagnostics Package v2 +E3SM Diagnostics Package v3 *************************** Welcome to the E3SM Diagnostics Package documentation hub. @@ -49,8 +49,8 @@ that: - is flexible for user-specified diagnostics and configuration for use by other climate models. -Current State (v2 release) --------------------------- +Current State +------------- Algorithm and visualization codes for **latitude-longitude contour maps**, **polar contour maps**, the accompanying **summarizing table** and **Taylor diagram plots**, **pressure-latitude zonal mean contour plots**, diff --git a/docs/source/quickguides/quick-guide-anvil.rst b/docs/source/quickguides/quick-guide-anvil.rst index b944c108b..1a4e99961 100644 --- a/docs/source/quickguides/quick-guide-anvil.rst +++ b/docs/source/quickguides/quick-guide-anvil.rst @@ -1,7 +1,7 @@ .. Comment: If you want to edit `quick-guide-{machine_name}.rst`, edit `quick-guide-generic.rst` instead and run `generate_quick_guides.py`. 
-Anvil quick guide for running e3sm_diags v2 +Anvil quick guide for running e3sm_diags v3 ========================================================================= .. warning:: diff --git a/docs/source/quickguides/quick-guide-chrysalis.rst b/docs/source/quickguides/quick-guide-chrysalis.rst index 03ec95760..aec42afd9 100644 --- a/docs/source/quickguides/quick-guide-chrysalis.rst +++ b/docs/source/quickguides/quick-guide-chrysalis.rst @@ -1,7 +1,7 @@ .. Comment: If you want to edit `quick-guide-{machine_name}.rst`, edit `quick-guide-generic.rst` instead and run `generate_quick_guides.py`. -Chrysalis quick guide for running e3sm_diags v2 +Chrysalis quick guide for running e3sm_diags v3 ========================================================================= diff --git a/docs/source/quickguides/quick-guide-compy.rst b/docs/source/quickguides/quick-guide-compy.rst index eb0353f19..6f60b2ce8 100644 --- a/docs/source/quickguides/quick-guide-compy.rst +++ b/docs/source/quickguides/quick-guide-compy.rst @@ -1,7 +1,7 @@ .. Comment: If you want to edit `quick-guide-{machine_name}.rst`, edit `quick-guide-generic.rst` instead and run `generate_quick_guides.py`. 
-Compy quick guide for running e3sm_diags v2 +Compy quick guide for running e3sm_diags v3 ========================================================================= diff --git a/docs/source/quickguides/quick-guide-general.rst b/docs/source/quickguides/quick-guide-general.rst index c3095a15b..7a038b94a 100644 --- a/docs/source/quickguides/quick-guide-general.rst +++ b/docs/source/quickguides/quick-guide-general.rst @@ -1,5 +1,5 @@ -General quick guide for running e3sm_diags v2 +General quick guide for running e3sm_diags v3 ========================================================================= diff --git a/docs/source/quickguides/quick-guide-generic.rst b/docs/source/quickguides/quick-guide-generic.rst index 4298dbe72..f3704ed11 100644 --- a/docs/source/quickguides/quick-guide-generic.rst +++ b/docs/source/quickguides/quick-guide-generic.rst @@ -1,7 +1,7 @@ .. Comment: If you want to edit `quick-guide-{machine_name}.rst`, edit `quick-guide-generic.rst` instead and run `generate_quick_guides.py`. -#expand machine_name# quick guide for running e3sm_diags v2 +#expand machine_name# quick guide for running e3sm_diags v3 ========================================================================= diff --git a/docs/source/quickguides/quick-guide-perlmutter.rst b/docs/source/quickguides/quick-guide-perlmutter.rst index 77bf7b867..31c8469dc 100644 --- a/docs/source/quickguides/quick-guide-perlmutter.rst +++ b/docs/source/quickguides/quick-guide-perlmutter.rst @@ -1,7 +1,7 @@ .. Comment: If you want to edit `quick-guide-{machine_name}.rst`, edit `quick-guide-generic.rst` instead and run `generate_quick_guides.py`. 
-Perlmutter quick guide for running e3sm_diags v2 +Perlmutter quick guide for running e3sm_diags v3 ========================================================================= From 1cb57f2c81c5ab402d789f3f2aeeafaed9911870 Mon Sep 17 00:00:00 2001 From: tomvothecoder Date: Tue, 28 Jan 2025 15:42:44 -0600 Subject: [PATCH 5/6] Add info about v3 in current state --- README.md | 11 +++++++++++ docs/source/index.rst | 15 +++++++++++---- 2 files changed, 22 insertions(+), 4 deletions(-) diff --git a/README.md b/README.md index d8066d55e..3414d78de 100644 --- a/README.md +++ b/README.md @@ -46,6 +46,17 @@ This diagnostics package is constructed for supporting the diagnostics task of D E3SM Diags is modeled after the National Center for Atmospheric Research (NCAR) Atmosphere Model Working Group (AMWG) diagnostics package. In its version 1 release, E3SM Diags included a set of core essential diagnostics to evaluate the mean physical climate from model simulations. As of version 2, more process-oriented and phenomenon-based evaluation diagnostics have been implemented, as listed below: +### Major refactor with v3 development + +v3.0.0 marks a major milestone after nearly two years of work by the core development +team. This release introduces a completely new back-end, replacing CDAT with Xarray and +xCDAT. Given the significant scale of code changes, the version has been incremented to a +major release. The user-facing API for running E3SM Diagnostics remains backward-compatible between v2 and v3. + +The modernization improves performance, usability, and maintainability, paving the way +for future enhancements to E3SM Diags. The refactored codebase is now more robust +and extensively covered by unit tests, setting a solid foundation for ongoing testing and development. ### New Feature added during v2 development | Feature name
(set name) | Brief Introduction | Developers Contributors\* | Released version | diff --git a/docs/source/index.rst b/docs/source/index.rst index 6c2c96fc6..7bdac1684 100644 --- a/docs/source/index.rst +++ b/docs/source/index.rst @@ -53,10 +53,17 @@ Current State ------------- Algorithm and visualization codes for **latitude-longitude contour maps**, -**polar contour maps**, the accompanying **summarizing table** and **Taylor diagram plots**, **pressure-latitude zonal mean contour plots**, -**zonal mean line plots**, **pressure-longitude meridional mean contour plots**, **area mean time series plots**, and **Cloud Top Height-Tau** joint histograms -from COSP cloud simulator output. Plots can be created for annual -and seasonal climatologies, and monthly mean time series. In additional to the core sets being released in v1, **ENSO diags**, **QBO diags**, **Diurnal cycle phase plot**, **Streamflow evaluation**, **ARM diags**, and **TC analysis** are implemented in v2 release. +**polar contour maps**, the accompanying **summarizing table** and +**Taylor diagram plots**, **pressure-latitude zonal mean contour plots**, +**zonal mean line plots**, **pressure-longitude meridional mean contour plots**, +**area mean time series plots**, and **Cloud Top Height-Tau** joint histograms from +COSP cloud simulator output. Plots can be created for annual and seasonal climatologies, +and monthly mean time series. In addition to the core sets released in v1, +**ENSO diags**, **QBO diags**, **Diurnal cycle phase plot**, **Streamflow evaluation**, +**ARM diags**, and **TC analysis** were implemented in the v2 release. v3 introduces a +completely new back-end, replacing CDAT with Xarray and xCDAT. Given the significant +scale of code changes, the version has been incremented to a major release. The user-facing +API for running E3SM Diagnostics remains backward-compatible between v2 and v3. 
The package also supports custom user diagnostics, by specifying plot type, desired region (global, ocean, land, etc.), From f2572e0f04dc57f38d0a3c026de48b4cb710b86c Mon Sep 17 00:00:00 2001 From: tomvothecoder Date: Tue, 28 Jan 2025 15:44:12 -0600 Subject: [PATCH 6/6] Update salloc command in examples --- docs/source/examples.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/source/examples.rst b/docs/source/examples.rst index bab86c3d2..0ece5a25c 100644 --- a/docs/source/examples.rst +++ b/docs/source/examples.rst @@ -131,7 +131,7 @@ Use the code below to run the diagnostics. .. code:: # Allocate a node to run an interactive session on. You can also use a batch job. - salloc --nodes=1 --partition=regular --time=01:00:00 -C haswell + salloc --nodes 1 --qos interactive --time 01:00:00 --constraint cpu --account=e3sm # Enter the E3SM Unified environment. For Cori, the command to do this is: source /global/common/software/e3sm/anaconda_envs/load_latest_e3sm_unified_cori-haswell.sh # Running Ex.1. For examples 4,5,7 append ``-d diags.cfg``.
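A note for readers trying the updated example above: the ``-d`` files such as ``diags.cfg`` and ``mydiags.cfg`` are plain INI-style text, so their structure can be sanity-checked before an interactive allocation is even requested. The sketch below is illustrative only (not part of this patch): it assumes only the Python standard library, mirrors the ``GPCP_v2.3`` example from ``examples.rst``, and leaves the actual interpretation of values to ``e3sm_diags`` at runtime.

```python
# Sketch: sanity-check an e3sm_diags `-d` cfg file before running it.
# Assumption: such files are INI-style and parseable with configparser;
# e3sm_diags itself interprets the quoted strings and list literals.
import ast
import configparser

# A minimal cfg mirroring the GPCP_v2.3 example from the docs.
CFG_TEXT = """\
[#]
sets = ["lat_lon"]
case_id = "GPCP_v2.3"
variables = ["PRECT"]
seasons = ["ANN", "DJF", "MAM", "JJA", "SON"]
regions = ["global"]
"""

parser = configparser.ConfigParser()
parser.read_string(CFG_TEXT)

section = parser["#"]
# configparser stores raw strings; ast.literal_eval recovers the
# Python literals (quoted strings and lists) written in the cfg.
sets = ast.literal_eval(section["sets"])
seasons = ast.literal_eval(section["seasons"])
case_id = ast.literal_eval(section["case_id"])
print(sets)     # ['lat_lon']
print(case_id)  # GPCP_v2.3
```

Catching a malformed list or misquoted value this way is cheaper than discovering it after ``python run_e3sm_diags.py -d mydiags.cfg`` fails inside a batch job.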