[Refactor]: Refactor integration tests and add use_cfg flag to run_diags() #747

Merged (31 commits, Dec 13, 2023)
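For context, a minimal sketch of how the new flag is intended to be used from the Python API. The exact signature and default of `use_cfg` are assumptions based on the PR title and commits (the `run.py` diff is not shown in this excerpt):

```python
# Hypothetical usage of the new use_cfg flag on Run.run_diags().
from e3sm_diags.parameter.core_parameter import CoreParameter
from e3sm_diags.run import runner

param = CoreParameter()
param.test_data_path = "/path/to/test_data"       # placeholder paths
param.reference_data_path = "/path/to/ref_data"
param.results_dir = "/path/to/results"

runner.sets_to_run = ["lat_lon"]

# use_cfg=True: also run diagnostics defined in .cfg files;
# use_cfg=False: run only the parameters passed through the Python API.
runner.run_diags([param], use_cfg=False)
```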
Commits (31, all by tomvothecoder):
0c7307a  Remove default diags running with only Python API (Nov 29, 2023)
00865fe  Clean up `test_all_sets.py` and `test_diags.py` (Nov 29, 2023)
083ae3c  Fix paths for enso_diags in `all_sets_modified.cfg` (Nov 29, 2023)
daf1940  Fix referenced paths in `test_all_sets_image_diffs.py` (Nov 29, 2023)
9b55499  Remove unused NERSC function (Nov 29, 2023)
cb9f64e  Add debug mode (Nov 30, 2023)
715c025  Fix missing default value to debug in get_final_parameters (Nov 30, 2023)
0cdca64  Update pyproject.toml (Nov 30, 2023)
2f96ca4  Update e3sm_diags/run.py (Nov 30, 2023)
590fc37  Update e3sm_diags/run.py (Nov 30, 2023)
e7b049d  Fix sets to run (Nov 30, 2023)
9cb6777  Fix default sets (Nov 30, 2023)
0b99fff  Fix `test_all_sets_and_all_season` (Nov 30, 2023)
d19d4da  Refactor how cfg parameters are set (Nov 30, 2023)
7c9c607  Fix removal of CoreParameter default sets breaking integration test (Nov 30, 2023)
4c0a057  Fix class property and default sets (Nov 30, 2023)
63e29bf  Fix property (Nov 30, 2023)
4e24fb2  Update docstring (Nov 30, 2023)
4d562ce  Update e3sm_diags/run.py (Nov 30, 2023)
c2b5cf9  Fix comments (Nov 30, 2023)
2942f25  Fix cfg_params being modified in memory (Nov 30, 2023)
8618ba9  Update Makefile commands (Nov 30, 2023)
4e7bf9c  Revert core_parser changes (Nov 30, 2023)
3122650  Revert formatting changes (Dec 1, 2023)
720dc10  Revert formatting (Dec 1, 2023)
3ae6d54  Rename `debug` to `use_cfg` (Dec 12, 2023)
d3e0c9a  Update use_cfg param in test (Dec 12, 2023)
5eaf7b3  Remove `test_all_sets_modified_image_counts.py` (Dec 12, 2023)
9f1862a  Remove `test_dataset.py` and move `test_run.py` to unit tests dir (Dec 12, 2023)
13547ad  Update attr comment in Run class (Dec 12, 2023)
3ff0aa7  Rename `debug_params` to `params` (Dec 13, 2023)
12 changes: 10 additions & 2 deletions .github/workflows/build_workflow.yml
@@ -99,10 +99,18 @@ jobs:
mamba info

- if: ${{ steps.skip_check.outputs.should_skip != 'true' }}
name: Run Tests
name: Run Unit Tests
run: pytest tests/e3sm_diags

- if: ${{ steps.skip_check.outputs.should_skip != 'true' }}
name: Download Integration Test Data
run: python -m tests.integration.download_data

- if: ${{ steps.skip_check.outputs.should_skip != 'true' }}
name: Run Integration Tests
env:
CHECK_IMAGES: False
run: bash tests/test.sh
run: pytest tests/integration

publish-docs:
if: ${{ github.event_name == 'push' }}
8 changes: 8 additions & 0 deletions Makefile
@@ -55,6 +55,14 @@ clean-test: ## remove test and coverage artifacts
rm -fr .pytest_cache
rm -rf .mypy_cache

clean-test-int-res: ## remove integration test results and image check failures
rm -rf tests/integration/all_sets_results_test
rm -rf tests/integration/image_check_failures

clean-test-int-data: # remove integration test data and images (expected) -- useful when they are updated
rm -rf tests/integration/integration_test_data
rm -rf tests/integration/integration_test_images

# Quality Assurance
# ----------------------
pre-commit: # run pre-commit quality assurance checks
10 changes: 5 additions & 5 deletions docs/source/dev_guide/testing.rst
@@ -56,7 +56,7 @@ The unit and integration tests are run automatically as part of this.
Complete run test
-----------------

``tests/complete_run.py`` checks the images generated by all diagnostics to
``tests/integration/complete_run.py`` checks the images generated by all diagnostics to
see if any differ from expected.
This test is not run as part of the unit test suite, because it relies on a large
quantity of data found on LCRC (Anvil/Chrysalis).
@@ -78,7 +78,7 @@ Now that you have your changes on LCRC, enter your development environment. Then
.. code::

pip install . # Install your changes
python -m unittest tests/complete_run.py
pytest tests/integration/complete_run.py

If this test passes, you're done. If it fails however, that means
your code has changed what the output looks like.
@@ -108,11 +108,11 @@ then you need to update the expected images.
find . -type f -name '*.png' > ../image_list_all_sets.txt
cd ..

Run ``python -m unittest tests/complete_run.py`` again. Now, the test should pass.
Run ``pytest tests/integration/complete_run.py`` again. Now, the test should pass.

After merging your pull request, edit ``README.md``.
The version should be the version of E3SM Diags you ran
``python -m unittest tests/complete_run.py`` with,
the date should be the date you ran ``python -m unittest tests/complete_run.py`` on,
``pytest tests/integration/complete_run.py`` with,
the date should be the date you ran ``pytest tests/integration/complete_run.py`` on,
and the hash should be for the top commit shown by ``git log`` or on
https://github.com/E3SM-Project/e3sm_diags/commits/main.
7 changes: 5 additions & 2 deletions e3sm_diags/e3sm_diags_driver.py
@@ -347,15 +347,16 @@ def _collapse_results(parameters: List[List[CoreParameter]]) -> List[CoreParamet
return output_parameters


def main(parameters=[]):
def main(parameters=[]) -> List[CoreParameter]:
# Get the diagnostic run parameters
# ---------------------------------
parser = CoreParser()

# If no parameters are passed, use the parser args as defaults. Otherwise,
# create the dictionary of expected parameters.
if not parameters:
if len(parameters) == 0:
parameters = get_parameters(parser)

expected_parameters = create_parameter_dict(parameters)

if not os.path.exists(parameters[0].results_dir):
@@ -408,6 +409,8 @@ def main(parameters=[]):
)
raise Exception(message)

return parameters_results
Collaborator Author (tomvothecoder) comment:
`parameters_results` is now returned so that we can parse it for the `results_dir` in unit/integration tests.

It is also generally useful to see the results after each diagnostic run.



if __name__ == "__main__":
main()
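A short sketch of how a test might consume the new return value; only the `main()` return type comes from this diff, while the parameter setup and paths below are illustrative:

```python
# Illustrative use of the List[CoreParameter] now returned by main().
from e3sm_diags.e3sm_diags_driver import main
from e3sm_diags.parameter.core_parameter import CoreParameter

param = CoreParameter()
param.sets = ["lat_lon"]
param.results_dir = "all_sets_results_test"  # placeholder

results = main([param])

# Tests can read the results directory off the returned parameters instead of
# hard-coding it.
results_dir = results[0].results_dir
```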
50 changes: 28 additions & 22 deletions e3sm_diags/parameter/core_parameter.py
@@ -7,6 +7,32 @@

logger = custom_logger(__name__)

# FIXME: There is probably a better way of defining default sets because most of
# this is repeated in SETS_TO_PARAMETERS and SETS_TO_PARSERS.
# Also integration tests will break if "mp_partition" is included because
# we did not take it into account yet.
DEFAULT_SETS = [
"zonal_mean_xy",
"zonal_mean_2d",
"zonal_mean_2d_stratosphere",
"meridional_mean_2d",
"lat_lon",
"polar",
"area_mean_time_series",
"cosp_histogram",
"enso_diags",
"qbo",
"streamflow",
"diurnal_cycle",
"arm_diags",
"tc_analysis",
"annual_cycle_zonal_mean",
"lat_lon_land",
"lat_lon_river",
"aerosol_aeronet",
"aerosol_budget",
]


class CoreParameter:
def __init__(self):
@@ -58,28 +84,8 @@ def __init__(self):
# 'model_vs_obs' (by default), 'model_vs_model', or 'obs_vs_obs'.
self.run_type: str = "model_vs_obs"

# A list of the sets to be run. Default is all sets
self.sets: List[str] = [
"zonal_mean_xy",
"zonal_mean_2d",
"zonal_mean_2d_stratosphere",
"meridional_mean_2d",
"lat_lon",
"polar",
"area_mean_time_series",
"cosp_histogram",
"enso_diags",
"qbo",
"streamflow",
"diurnal_cycle",
"arm_diags",
"tc_analysis",
"annual_cycle_zonal_mean",
"lat_lon_land",
"lat_lon_river",
"aerosol_aeronet",
"aerosol_budget",
]
# A list of the sets to be run, by default all sets.
self.sets: List[str] = DEFAULT_SETS

# The current set that is being ran when looping over sets in
# `e3sm_diags_driver.run_diag()`.
1 change: 1 addition & 0 deletions e3sm_diags/plot/cartopy/arm_diags_plot.py
@@ -174,6 +174,7 @@ def plot_convection_onset_statistics(
var_time_absolute = cwv.getTime().asComponentTime()
time_interval = int(var_time_absolute[1].hour - var_time_absolute[0].hour)

# FIXME: UnboundLocalError: local variable 'cwv_max' referenced before assignment
number_of_bins = int(np.ceil((cwv_max - cwv_min) / bin_width))
bin_center = np.arange(
(cwv_min + (bin_width / 2)),
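One possible way to address the FIXME above, sketched as a standalone helper (assuming `cwv_min`/`cwv_max` are pre-initialized to None before the site-specific branches); the helper name and error-handling approach are illustrative, not the fix adopted in this PR:

```python
import numpy as np


def _number_of_cwv_bins(cwv_min, cwv_max, bin_width):
    """Compute the CWV bin count, failing with a clear message if the
    site-specific bounds were never assigned (instead of hitting the
    UnboundLocalError noted in the FIXME)."""
    if cwv_min is None or cwv_max is None:
        raise RuntimeError(
            "cwv_min/cwv_max were not set for this site; cannot bin CWV values."
        )
    return int(np.ceil((cwv_max - cwv_min) / bin_width))


# e.g. number_of_bins = _number_of_cwv_bins(cwv_min, cwv_max, 1.5)
```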