diff --git a/docs/demos/1-25-23-cwss-seminar/introduction-to-xcdat.ipynb b/docs/demos/1-25-23-cwss-seminar/introduction-to-xcdat.ipynb index 7ff9a41e..3a9a5a0e 100644 --- a/docs/demos/1-25-23-cwss-seminar/introduction-to-xcdat.ipynb +++ b/docs/demos/1-25-23-cwss-seminar/introduction-to-xcdat.ipynb @@ -61,16 +61,16 @@ "First, create the conda environment:\n", "\n", "```bash\n", - "conda create -n xcdat_notebook_0.7.3 -c conda-forge xcdat=0.7.3 xesmf matplotlib ipython ipykernel cartopy nc-time-axis gsw-xarray jupyter pooch\n", + "conda create -n xcdat_notebook -c conda-forge xcdat xesmf matplotlib ipython ipykernel cartopy nc-time-axis gsw-xarray jupyter pooch\n", "```\n", "\n", - "Then install the kernel from the `xcdat_notebook_0.7.3` environment using `ipykernel` and name the kernel with the display name (e.g., `xcdat_notebook_0.7.3`):\n", + "Then install the kernel from the `xcdat_notebook` environment using `ipykernel` and name the kernel with the display name (e.g., `xcdat_notebook`):\n", "\n", "```bash\n", - "python -m ipykernel install --user --name xcdat_notebook_0.7.3 --display-name xcdat_notebook_0.7.3\n", + "python -m ipykernel install --user --name xcdat_notebook --display-name xcdat_notebook\n", "```\n", "\n", - "Then to select the kernel `xcdat_notebook_0.7.3` in Jupyter to use this kernel.\n" + "Then select the kernel `xcdat_notebook` in Jupyter to use this kernel.\n" ] }, { diff --git a/docs/examples/climatology-and-departures.ipynb b/docs/examples/climatology-and-departures.ipynb index 08c7a225..54b056aa 100644 --- a/docs/examples/climatology-and-departures.ipynb +++ b/docs/examples/climatology-and-departures.ipynb @@ -42,16 +42,16 @@ "First, create the conda environment:\n", "\n", "```bash\n", - "conda create -n xcdat_notebook_0.7.3 -c conda-forge xcdat=0.7.3 xesmf matplotlib ipython ipykernel cartopy nc-time-axis gsw-xarray jupyter pooch\n", + "conda create -n xcdat_notebook -c conda-forge xcdat xesmf matplotlib ipython ipykernel cartopy 
nc-time-axis gsw-xarray jupyter pooch\n", "```\n", "\n", - "Then install the kernel from the `xcdat_notebook_0.7.3` environment using `ipykernel` and name the kernel with the display name (e.g., `xcdat_notebook_0.7.3`):\n", + "Then install the kernel from the `xcdat_notebook` environment using `ipykernel` and name the kernel with the display name (e.g., `xcdat_notebook`):\n", "\n", "```bash\n", - "python -m ipykernel install --user --name xcdat_notebook_0.7.3 --display-name xcdat_notebook_0.7.3\n", + "python -m ipykernel install --user --name xcdat_notebook --display-name xcdat_notebook\n", "```\n", "\n", - "Then to select the kernel `xcdat_notebook_0.7.3` in Jupyter to use this kernel.\n" + "Then select the kernel `xcdat_notebook` in Jupyter to use this kernel.\n" ] }, { diff --git a/docs/examples/general-utilities.ipynb b/docs/examples/general-utilities.ipynb index 3c1d2127..ba895d94 100644 --- a/docs/examples/general-utilities.ipynb +++ b/docs/examples/general-utilities.ipynb @@ -38,16 +38,16 @@ "First, create the conda environment:\n", "\n", "```bash\n", - "conda create -n xcdat_notebook_0.7.3 -c conda-forge xcdat=0.7.3 xesmf matplotlib ipython ipykernel cartopy nc-time-axis gsw-xarray jupyter pooch\n", + "conda create -n xcdat_notebook -c conda-forge xcdat xesmf matplotlib ipython ipykernel cartopy nc-time-axis gsw-xarray jupyter pooch\n", "```\n", "\n", - "Then install the kernel from the `xcdat_notebook_0.7.3` environment using `ipykernel` and name the kernel with the display name (e.g., `xcdat_notebook_0.7.3`):\n", + "Then install the kernel from the `xcdat_notebook` environment using `ipykernel` and name the kernel with the display name (e.g., `xcdat_notebook`):\n", "\n", "```bash\n", - "python -m ipykernel install --user --name xcdat_notebook_0.7.3 --display-name xcdat_notebook_0.7.3\n", + "python -m ipykernel install --user --name xcdat_notebook --display-name xcdat_notebook\n", "```\n", "\n", - "Then to select the kernel `xcdat_notebook_0.7.3` in 
Jupyter to use this kernel.\n" + "Then select the kernel `xcdat_notebook` in Jupyter to use this kernel.\n" ] }, { diff --git a/docs/examples/parallel-computing-with-dask.ipynb b/docs/examples/parallel-computing-with-dask.ipynb index 856c6387..0b81bf46 100644 --- a/docs/examples/parallel-computing-with-dask.ipynb +++ b/docs/examples/parallel-computing-with-dask.ipynb @@ -9,7 +9,7 @@ "\n", "Author: [Tom Vo](https://github.com/tomvothecoder)\n", "\n", - "Last Updated: 04/16/24 (v0.7.0)\n" + "Last Updated: 11/07/24 (v0.7.3)\n" ] }, { @@ -31,12 +31,12 @@ "- More Resources\n", "- FAQs\n", "\n", - "_The data used in the code examples can be found through the [Earth System Grid Federation (ESGF) search portal](https://aims2.llnl.gov/search)._\n", + "_The data used in this example can be found in the [xarray-data repository](https://github.com/pydata/xarray-data)._\n", "\n", "Users can [install their own instance of xcdat](../getting-started-guide/installation.rst) and follow these examples using their own environment (e.g., with vscode, Jupyter, Spyder, iPython) or [enable xcdat with existing JupyterHub instances](../getting-started-guide/getting-started-hpc-jupyter.rst). 
The conda environment used in this notebook includes xcdat, xesmf, matplotlib, ipython, ipykernel, cartopy, and jupyter:\n", "\n", "```bash\n", - "conda create -n xcdat_notebook_dask -c conda-forge xcdat xesmf matplotlib ipython ipykernel cartopy nc-time-axis jupyter jupyter-server-proxy\n", + "conda create -n xcdat_notebook_dask -c conda-forge xcdat matplotlib ipython ipykernel cartopy nc-time-axis jupyter jupyter-server-proxy\n", "```\n" ] }, @@ -303,7 +303,7 @@ }, { "cell_type": "code", - "execution_count": 1, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -313,10 +313,7 @@ "\n", "# Silence flox logger info messages.\n", "logger = logging.getLogger(\"flox\")\n", - "logger.setLevel(logging.WARNING)\n", - "\n", - "\n", - "filepath = \"https://esgf-data1.llnl.gov/thredds/dodsC/css03_data/CMIP6/CMIP/CSIRO/ACCESS-ESM1-5/historical/r10i1p1f1/Amon/tas/gn/v20200605/tas_Amon_ACCESS-ESM1-5_historical_r10i1p1f1_gn_185001-201412.nc\"" + "logger.setLevel(logging.WARNING)\n" ] }, { @@ -348,52 +345,9 @@ }, { "cell_type": "code", - "execution_count": 2, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "2024-06-07 13:48:37,817 [INFO]: proxy.py(:85) >> To route to workers diagnostics web server please install jupyter-server-proxy: python -m pip install jupyter-server-proxy\n", - "2024-06-07 13:48:37,817 [INFO]: proxy.py(:85) >> To route to workers diagnostics web server please install jupyter-server-proxy: python -m pip install jupyter-server-proxy\n", - "2024-06-07 13:48:37,835 [INFO]: scheduler.py(__init__:1711) >> State start\n", - "2024-06-07 13:48:37,835 [INFO]: scheduler.py(__init__:1711) >> State start\n", - "2024-06-07 13:48:37,837 [INFO]: diskutils.py(_check_lock_or_purge:252) >> Found stale lock file and directory '/var/folders/_h/t3wvkks5643fxnv07_kx9cx8000zpt/T/dask-scratch-space/scheduler-maoehc7f', purging\n", - "2024-06-07 13:48:37,837 [INFO]: 
diskutils.py(_check_lock_or_purge:252) >> Found stale lock file and directory '/var/folders/_h/t3wvkks5643fxnv07_kx9cx8000zpt/T/dask-scratch-space/scheduler-maoehc7f', purging\n", - "2024-06-07 13:48:37,838 [INFO]: diskutils.py(_check_lock_or_purge:252) >> Found stale lock file and directory '/var/folders/_h/t3wvkks5643fxnv07_kx9cx8000zpt/T/dask-scratch-space/worker-4ybkixzs', purging\n", - "2024-06-07 13:48:37,838 [INFO]: diskutils.py(_check_lock_or_purge:252) >> Found stale lock file and directory '/var/folders/_h/t3wvkks5643fxnv07_kx9cx8000zpt/T/dask-scratch-space/worker-4ybkixzs', purging\n", - "2024-06-07 13:48:37,841 [INFO]: diskutils.py(_check_lock_or_purge:252) >> Found stale lock file and directory '/var/folders/_h/t3wvkks5643fxnv07_kx9cx8000zpt/T/dask-scratch-space/worker-7meavnua', purging\n", - "2024-06-07 13:48:37,841 [INFO]: diskutils.py(_check_lock_or_purge:252) >> Found stale lock file and directory '/var/folders/_h/t3wvkks5643fxnv07_kx9cx8000zpt/T/dask-scratch-space/worker-7meavnua', purging\n", - "2024-06-07 13:48:37,844 [INFO]: scheduler.py(start_unsafe:4072) >> Scheduler at: tcp://127.0.0.1:58890\n", - "2024-06-07 13:48:37,844 [INFO]: scheduler.py(start_unsafe:4072) >> Scheduler at: tcp://127.0.0.1:58890\n", - "2024-06-07 13:48:37,844 [INFO]: scheduler.py(start_unsafe:4087) >> dashboard at: http://127.0.0.1:8787/status\n", - "2024-06-07 13:48:37,844 [INFO]: scheduler.py(start_unsafe:4087) >> dashboard at: http://127.0.0.1:8787/status\n", - "2024-06-07 13:48:37,845 [INFO]: scheduler.py(register_worker_plugin:7879) >> Registering Worker plugin shuffle\n", - "2024-06-07 13:48:37,845 [INFO]: scheduler.py(register_worker_plugin:7879) >> Registering Worker plugin shuffle\n", - "2024-06-07 13:48:37,853 [INFO]: nanny.py(start_unsafe:368) >> Start Nanny at: 'tcp://127.0.0.1:58893'\n", - "2024-06-07 13:48:37,853 [INFO]: nanny.py(start_unsafe:368) >> Start Nanny at: 'tcp://127.0.0.1:58893'\n", - "2024-06-07 13:48:37,863 [INFO]: nanny.py(start_unsafe:368) 
>> Start Nanny at: 'tcp://127.0.0.1:58895'\n", - "2024-06-07 13:48:37,863 [INFO]: nanny.py(start_unsafe:368) >> Start Nanny at: 'tcp://127.0.0.1:58895'\n", - "2024-06-07 13:48:38,357 [INFO]: scheduler.py(add_worker:4424) >> Register worker \n", - "2024-06-07 13:48:38,357 [INFO]: scheduler.py(add_worker:4424) >> Register worker \n", - "2024-06-07 13:48:38,360 [INFO]: scheduler.py(handle_worker:5934) >> Starting worker compute stream, tcp://127.0.0.1:58900\n", - "2024-06-07 13:48:38,360 [INFO]: scheduler.py(handle_worker:5934) >> Starting worker compute stream, tcp://127.0.0.1:58900\n", - "2024-06-07 13:48:38,361 [INFO]: core.py(handle_stream:1019) >> Starting established connection to tcp://127.0.0.1:58904\n", - "2024-06-07 13:48:38,361 [INFO]: core.py(handle_stream:1019) >> Starting established connection to tcp://127.0.0.1:58904\n", - "2024-06-07 13:48:38,362 [INFO]: scheduler.py(add_worker:4424) >> Register worker \n", - "2024-06-07 13:48:38,362 [INFO]: scheduler.py(add_worker:4424) >> Register worker \n", - "2024-06-07 13:48:38,363 [INFO]: scheduler.py(handle_worker:5934) >> Starting worker compute stream, tcp://127.0.0.1:58899\n", - "2024-06-07 13:48:38,363 [INFO]: scheduler.py(handle_worker:5934) >> Starting worker compute stream, tcp://127.0.0.1:58899\n", - "2024-06-07 13:48:38,363 [INFO]: core.py(handle_stream:1019) >> Starting established connection to tcp://127.0.0.1:58903\n", - "2024-06-07 13:48:38,363 [INFO]: core.py(handle_stream:1019) >> Starting established connection to tcp://127.0.0.1:58903\n", - "2024-06-07 13:48:38,387 [INFO]: scheduler.py(add_client:5691) >> Receive client connection: Client-50e779ce-250f-11ef-8879-e2398742200c\n", - "2024-06-07 13:48:38,387 [INFO]: scheduler.py(add_client:5691) >> Receive client connection: Client-50e779ce-250f-11ef-8879-e2398742200c\n", - "2024-06-07 13:48:38,388 [INFO]: core.py(handle_stream:1019) >> Starting established connection to tcp://127.0.0.1:58905\n", - "2024-06-07 13:48:38,388 [INFO]: 
core.py(handle_stream:1019) >> Starting established connection to tcp://127.0.0.1:58905\n" - ] - } - ], + "outputs": [], "source": [ "from dask.distributed import Client, LocalCluster\n", "\n", @@ -406,201 +360,9 @@ }, { "cell_type": "code", - "execution_count": 3, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "application/vnd.jupyter.widget-view+json": { - "model_id": "da998083ef784d3cade398da93944974", - "version_major": 2, - "version_minor": 0 - }, - "text/html": [ - "
\n", - "
\n", - "
\n", - "
\n", - "

LocalCluster

\n", - "

ed191b1e

\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - "\n", - " \n", - "
\n", - " Dashboard: http://127.0.0.1:8787/status\n", - " \n", - " Workers: 2\n", - "
\n", - " Total threads: 2\n", - " \n", - " Total memory: 7.45 GiB\n", - "
Status: runningUsing processes: True
\n", - "\n", - "
\n", - " \n", - "

Scheduler Info

\n", - "
\n", - "\n", - "
\n", - "
\n", - "
\n", - "
\n", - "

Scheduler

\n", - "

Scheduler-6c92480e-b0a9-4413-9c4e-e6a4dc4d8e79

\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
\n", - " Comm: tcp://127.0.0.1:58890\n", - " \n", - " Workers: 2\n", - "
\n", - " Dashboard: http://127.0.0.1:8787/status\n", - " \n", - " Total threads: 2\n", - "
\n", - " Started: Just now\n", - " \n", - " Total memory: 7.45 GiB\n", - "
\n", - "
\n", - "
\n", - "\n", - "
\n", - " \n", - "

Workers

\n", - "
\n", - "\n", - " \n", - "
\n", - "
\n", - "
\n", - "
\n", - " \n", - "

Worker: 0

\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - " \n", - "\n", - " \n", - "\n", - "
\n", - " Comm: tcp://127.0.0.1:58900\n", - " \n", - " Total threads: 1\n", - "
\n", - " Dashboard: http://127.0.0.1:58902/status\n", - " \n", - " Memory: 3.73 GiB\n", - "
\n", - " Nanny: tcp://127.0.0.1:58893\n", - "
\n", - " Local directory: /var/folders/_h/t3wvkks5643fxnv07_kx9cx8000zpt/T/dask-scratch-space/worker-yyydue5d\n", - "
\n", - "
\n", - "
\n", - "
\n", - " \n", - "
\n", - "
\n", - "
\n", - "
\n", - " \n", - "

Worker: 1

\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - " \n", - "\n", - " \n", - "\n", - "
\n", - " Comm: tcp://127.0.0.1:58899\n", - " \n", - " Total threads: 1\n", - "
\n", - " Dashboard: http://127.0.0.1:58901/status\n", - " \n", - " Memory: 3.73 GiB\n", - "
\n", - " Nanny: tcp://127.0.0.1:58895\n", - "
\n", - " Local directory: /var/folders/_h/t3wvkks5643fxnv07_kx9cx8000zpt/T/dask-scratch-space/worker-9wdyj7x3\n", - "
\n", - "
\n", - "
\n", - "
\n", - " \n", - "\n", - "
\n", - "
\n", - "\n", - "
\n", - "
\n", - "
" - ], - "text/plain": [ - "LocalCluster(ed191b1e, 'tcp://127.0.0.1:58890', workers=2, threads=2, memory=7.45 GiB)" - ] - }, - "metadata": {}, - "output_type": "display_data" - } - ], + "outputs": [], "source": [ "client.cluster" ] @@ -632,20 +394,9 @@ }, { "cell_type": "code", - "execution_count": 4, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "text/plain": [ - "'http://127.0.0.1:8787/status'" - ] - }, - "execution_count": 4, - "metadata": {}, - "output_type": "execute_result" - } - ], + "outputs": [], "source": [ "client.dashboard_link" ] @@ -660,765 +411,11 @@ }, { "cell_type": "code", - "execution_count": 5, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 221MB\n",
-       "Dimensions:    (time: 1980, bnds: 2, lat: 145, lon: 192)\n",
-       "Coordinates:\n",
-       "  * lat        (lat) float64 1kB -90.0 -88.75 -87.5 -86.25 ... 87.5 88.75 90.0\n",
-       "  * lon        (lon) float64 2kB 0.0 1.875 3.75 5.625 ... 354.4 356.2 358.1\n",
-       "    height     float64 8B ...\n",
-       "  * time       (time) object 16kB 1850-01-16 12:00:00 ... 2014-12-16 12:00:00\n",
-       "Dimensions without coordinates: bnds\n",
-       "Data variables:\n",
-       "    time_bnds  (time, bnds) object 32kB dask.array<chunksize=(1980, 2), meta=np.ndarray>\n",
-       "    lat_bnds   (lat, bnds) float64 2kB dask.array<chunksize=(145, 2), meta=np.ndarray>\n",
-       "    lon_bnds   (lon, bnds) float64 3kB dask.array<chunksize=(192, 2), meta=np.ndarray>\n",
-       "    tas        (time, lat, lon) float32 220MB dask.array<chunksize=(1695, 122, 162), meta=np.ndarray>\n",
-       "Attributes: (12/48)\n",
-       "    Conventions:                     CF-1.7 CMIP-6.2\n",
-       "    activity_id:                     CMIP\n",
-       "    branch_method:                   standard\n",
-       "    branch_time_in_child:            0.0\n",
-       "    branch_time_in_parent:           87658.0\n",
-       "    creation_date:                   2020-06-05T04:06:11Z\n",
-       "    ...                              ...\n",
-       "    variant_label:                   r10i1p1f1\n",
-       "    version:                         v20200605\n",
-       "    license:                         CMIP6 model data produced by CSIRO is li...\n",
-       "    cmor_version:                    3.4.0\n",
-       "    tracking_id:                     hdl:21.14100/af78ae5e-f3a6-4e99-8cfe-5f2...\n",
-       "    DODS_EXTRA.Unlimited_Dimension:  time
" - ], - "text/plain": [ - " Size: 221MB\n", - "Dimensions: (time: 1980, bnds: 2, lat: 145, lon: 192)\n", - "Coordinates:\n", - " * lat (lat) float64 1kB -90.0 -88.75 -87.5 -86.25 ... 87.5 88.75 90.0\n", - " * lon (lon) float64 2kB 0.0 1.875 3.75 5.625 ... 354.4 356.2 358.1\n", - " height float64 8B ...\n", - " * time (time) object 16kB 1850-01-16 12:00:00 ... 2014-12-16 12:00:00\n", - "Dimensions without coordinates: bnds\n", - "Data variables:\n", - " time_bnds (time, bnds) object 32kB dask.array\n", - " lat_bnds (lat, bnds) float64 2kB dask.array\n", - " lon_bnds (lon, bnds) float64 3kB dask.array\n", - " tas (time, lat, lon) float32 220MB dask.array\n", - "Attributes: (12/48)\n", - " Conventions: CF-1.7 CMIP-6.2\n", - " activity_id: CMIP\n", - " branch_method: standard\n", - " branch_time_in_child: 0.0\n", - " branch_time_in_parent: 87658.0\n", - " creation_date: 2020-06-05T04:06:11Z\n", - " ... ...\n", - " variant_label: r10i1p1f1\n", - " version: v20200605\n", - " license: CMIP6 model data produced by CSIRO is li...\n", - " cmor_version: 3.4.0\n", - " tracking_id: hdl:21.14100/af78ae5e-f3a6-4e99-8cfe-5f2...\n", - " DODS_EXTRA.Unlimited_Dimension: time" - ] - }, - "execution_count": 5, - "metadata": {}, - "output_type": "execute_result" - } - ], + "outputs": [], "source": [ - "ds = xc.open_dataset(filepath, chunks=\"auto\")\n", + "ds = xc.tutorial.open_dataset(\"ersstv5\", chunks=\"auto\")\n", "\n", "ds" ] @@ -1471,774 +468,13 @@ }, { "cell_type": "code", - "execution_count": 6, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "2024-06-07 13:48:47,372 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Nanny for 3.67s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:48:47,372 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Nanny for 3.67s. 
This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:48:51,545 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Scheduler for 7.84s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:48:51,545 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Scheduler for 7.84s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:48:51,573 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Nanny for 7.87s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:48:51,573 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Nanny for 7.87s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:48:51,745 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Nanny for 4.37s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:48:51,745 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Nanny for 4.37s. This is often caused by long-running GIL-holding functions or moving large chunks of data. 
This can cause timeouts and instability.\n", - "2024-06-07 13:48:59,235 [INFO]: utils_perf.py(_gc_callback:198) >> full garbage collection released 17.00 MiB from 111124 reference cycles (threshold: 9.54 MiB)\n", - "2024-06-07 13:48:59,235 [INFO]: utils_perf.py(_gc_callback:198) >> full garbage collection released 17.00 MiB from 111124 reference cycles (threshold: 9.54 MiB)\n", - "2024-06-07 13:49:22,404 [INFO]: scheduler.py(remove_client:5735) >> Remove client Client-50e779ce-250f-11ef-8879-e2398742200c\n", - "2024-06-07 13:49:22,404 [INFO]: scheduler.py(remove_client:5735) >> Remove client Client-50e779ce-250f-11ef-8879-e2398742200c\n", - "2024-06-07 13:49:22,404 [INFO]: core.py(handle_stream:1044) >> Received 'close-stream' from tcp://127.0.0.1:58905; closing.\n", - "2024-06-07 13:49:22,404 [INFO]: core.py(handle_stream:1044) >> Received 'close-stream' from tcp://127.0.0.1:58905; closing.\n", - "2024-06-07 13:49:22,405 [INFO]: scheduler.py(remove_client:5735) >> Remove client Client-50e779ce-250f-11ef-8879-e2398742200c\n", - "2024-06-07 13:49:22,405 [INFO]: scheduler.py(remove_client:5735) >> Remove client Client-50e779ce-250f-11ef-8879-e2398742200c\n", - "2024-06-07 13:49:22,406 [INFO]: scheduler.py(add_client:5727) >> Close client connection: Client-50e779ce-250f-11ef-8879-e2398742200c\n", - "2024-06-07 13:49:22,406 [INFO]: scheduler.py(add_client:5727) >> Close client connection: Client-50e779ce-250f-11ef-8879-e2398742200c\n", - "2024-06-07 13:49:45,868 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Nanny for 5.01s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:49:45,868 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Nanny for 5.01s. This is often caused by long-running GIL-holding functions or moving large chunks of data. 
This can cause timeouts and instability.\n", - "2024-06-07 13:49:47,050 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Scheduler for 6.19s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:49:47,050 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Scheduler for 6.19s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:49:47,056 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Nanny for 6.19s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n", - "2024-06-07 13:49:47,056 [INFO]: core.py(_measure_tick:744) >> Event loop was unresponsive in Nanny for 6.19s. This is often caused by long-running GIL-holding functions or moving large chunks of data. This can cause timeouts and instability.\n" - ] - }, - { - "data": { - "text/html": [ - "
\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "\n", - "
<xarray.Dataset> Size: 441MB\n",
-       "Dimensions:   (lat: 145, bnds: 2, lon: 192, time: 1980)\n",
-       "Coordinates:\n",
-       "  * lat       (lat) float64 1kB -90.0 -88.75 -87.5 -86.25 ... 87.5 88.75 90.0\n",
-       "  * lon       (lon) float64 2kB 0.0 1.875 3.75 5.625 ... 352.5 354.4 356.2 358.1\n",
-       "    height    float64 8B ...\n",
-       "  * time      (time) object 16kB 1850-01-01 00:00:00 ... 2014-12-01 00:00:00\n",
-       "Dimensions without coordinates: bnds\n",
-       "Data variables:\n",
-       "    lat_bnds  (lat, bnds) float64 2kB dask.array<chunksize=(145, 2), meta=np.ndarray>\n",
-       "    lon_bnds  (lon, bnds) float64 3kB dask.array<chunksize=(192, 2), meta=np.ndarray>\n",
-       "    tas       (time, lat, lon) float64 441MB dask.array<chunksize=(1, 122, 162), meta=np.ndarray>\n",
-       "Attributes: (12/48)\n",
-       "    Conventions:                     CF-1.7 CMIP-6.2\n",
-       "    activity_id:                     CMIP\n",
-       "    branch_method:                   standard\n",
-       "    branch_time_in_child:            0.0\n",
-       "    branch_time_in_parent:           87658.0\n",
-       "    creation_date:                   2020-06-05T04:06:11Z\n",
-       "    ...                              ...\n",
-       "    variant_label:                   r10i1p1f1\n",
-       "    version:                         v20200605\n",
-       "    license:                         CMIP6 model data produced by CSIRO is li...\n",
-       "    cmor_version:                    3.4.0\n",
-       "    tracking_id:                     hdl:21.14100/af78ae5e-f3a6-4e99-8cfe-5f2...\n",
-       "    DODS_EXTRA.Unlimited_Dimension:  time
" - ], - "text/plain": [ - " Size: 441MB\n", - "Dimensions: (lat: 145, bnds: 2, lon: 192, time: 1980)\n", - "Coordinates:\n", - " * lat (lat) float64 1kB -90.0 -88.75 -87.5 -86.25 ... 87.5 88.75 90.0\n", - " * lon (lon) float64 2kB 0.0 1.875 3.75 5.625 ... 352.5 354.4 356.2 358.1\n", - " height float64 8B ...\n", - " * time (time) object 16kB 1850-01-01 00:00:00 ... 2014-12-01 00:00:00\n", - "Dimensions without coordinates: bnds\n", - "Data variables:\n", - " lat_bnds (lat, bnds) float64 2kB dask.array\n", - " lon_bnds (lon, bnds) float64 3kB dask.array\n", - " tas (time, lat, lon) float64 441MB dask.array\n", - "Attributes: (12/48)\n", - " Conventions: CF-1.7 CMIP-6.2\n", - " activity_id: CMIP\n", - " branch_method: standard\n", - " branch_time_in_child: 0.0\n", - " branch_time_in_parent: 87658.0\n", - " creation_date: 2020-06-05T04:06:11Z\n", - " ... ...\n", - " variant_label: r10i1p1f1\n", - " version: v20200605\n", - " license: CMIP6 model data produced by CSIRO is li...\n", - " cmor_version: 3.4.0\n", - " tracking_id: hdl:21.14100/af78ae5e-f3a6-4e99-8cfe-5f2...\n", - " DODS_EXTRA.Unlimited_Dimension: time" - ] - }, - "execution_count": 6, - "metadata": {}, - "output_type": "execute_result" - } - ], + "outputs": [], "source": [ - "tas_avg = ds.temporal.group_average(\"tas\", freq=\"month\")\n", + "sst_avg = ds.temporal.group_average(\"sst\", freq=\"month\")\n", "\n", - "tas_avg" + "sst_avg" ] }, { @@ -2253,26 +489,15 @@ }, { "cell_type": "code", - "execution_count": 7, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "/opt/miniconda3/envs/xcdat_scipy_2024/lib/python3.11/site-packages/distributed/client.py:3164: UserWarning: Sending large graph of size 426.18 MiB.\n", - "This may cause some slowdown.\n", - "Consider scattering data ahead of time and using futures.\n", - " warnings.warn(\n" - ] - } - ], + "outputs": [], "source": [ "# NOTE: You might see `UserWarning: Sending large graph of 
size 420.59 MiB.`\n", "# which indicates that the Dask configuration might not be optimal for the\n", "# small dataset used in the example.\n", - "tas_avg_res = tas_avg.compute()\n", - "# Or call tas_avg.load() to modify the original dataset." + "sst_avg_res = sst_avg.compute()\n", + "# Or call sst_avg.load() to modify the original dataset." ] }, { @@ -2286,7 +511,7 @@ }, { "cell_type": "code", - "execution_count": 8, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ @@ -2401,25 +626,9 @@ }, { "cell_type": "code", - "execution_count": 9, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "/opt/miniconda3/envs/xcdat_scipy_2024/lib/python3.11/site-packages/distributed/client.py:1394: VersionMismatchWarning: Mismatched versions found\n", - "\n", - "+---------+----------------+----------------+----------------+\n", - "| Package | Client | Scheduler | Workers |\n", - "+---------+----------------+----------------+----------------+\n", - "| python | 3.11.9.final.0 | 3.12.3.final.0 | 3.12.3.final.0 |\n", - "| tornado | 6.4 | 6.4.1 | 6.4.1 |\n", - "+---------+----------------+----------------+----------------+\n", - " warnings.warn(version_module.VersionMismatchWarning(msg[0][\"warning\"]))\n" - ] - } - ], + "outputs": [], "source": [ "# For the purpose of this notebook, ignore the VersionMismatchWarning if you get\n", "# one.\n", @@ -2442,273 +651,9 @@ }, { "cell_type": "code", - "execution_count": 10, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "data": { - "text/html": [ - "
\n", - "
\n", - "
\n", - "

Client

\n", - "

Client-6b2935b6-250f-11ef-8879-e2398742200c

\n", - " \n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - "
Connection method: Direct
\n", - " Dashboard: http://127.0.0.1:58399/status\n", - "
\n", - "\n", - " \n", - "\n", - " \n", - "
\n", - "

Scheduler Info

\n", - "
\n", - "
\n", - "
\n", - "
\n", - "

Scheduler

\n", - "

Scheduler-f0ee5dff-8fc9-4e04-81f3-26600a2a275f

\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "
\n", - " Comm: tcp://192.168.11.21:8786\n", - " \n", - " Workers: 2\n", - "
\n", - " Dashboard: http://192.168.11.21:58399/status\n", - " \n", - " Total threads: 2\n", - "
\n", - " Started: 3 minutes ago\n", - " \n", - " Total memory: 32.00 GiB\n", - "
\n", - "
\n", - "
\n", - "\n", - "
\n", - " \n", - "

Workers

\n", - "
\n", - "\n", - " \n", - "
\n", - "
\n", - "
\n", - "
\n", - " \n", - "

Worker: tcp://127.0.0.1:58435

\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - " \n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - "
\n", - " Comm: tcp://127.0.0.1:58435\n", - " \n", - " Total threads: 1\n", - "
\n", - " Dashboard: http://127.0.0.1:58438/status\n", - " \n", - " Memory: 16.00 GiB\n", - "
\n", - " Nanny: tcp://127.0.0.1:58431\n", - "
\n", - " Local directory: /var/folders/_h/t3wvkks5643fxnv07_kx9cx8000zpt/T/dask-scratch-space/worker-4bscf5z9\n", - "
\n", - " Tasks executing: \n", - " \n", - " Tasks in memory: \n", - "
\n", - " Tasks ready: \n", - " \n", - " Tasks in flight: \n", - "
\n", - " CPU usage: 1.1%\n", - " \n", - " Last seen: Just now\n", - "
\n", - " Memory usage: 54.45 MiB\n", - " \n", - " Spilled bytes: 0 B\n", - "
\n", - " Read bytes: 1.09 GiB\n", - " \n", - " Write bytes: 1.09 GiB\n", - "
\n", - "
\n", - "
\n", - "
\n", - " \n", - "
\n", - "
\n", - "
\n", - "
\n", - " \n", - "

Worker: tcp://127.0.0.1:58436

\n", - "
\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - " \n", - "\n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - " \n", - "\n", - "
\n", - " Comm: tcp://127.0.0.1:58436\n", - " \n", - " Total threads: 1\n", - "
\n", - " Dashboard: http://127.0.0.1:58437/status\n", - " \n", - " Memory: 16.00 GiB\n", - "
\n", - " Nanny: tcp://127.0.0.1:58433\n", - "
\n", - " Local directory: /var/folders/_h/t3wvkks5643fxnv07_kx9cx8000zpt/T/dask-scratch-space/worker-46sk205e\n", - "
\n", - " Tasks executing: \n", - " \n", - " Tasks in memory: \n", - "
\n", - " Tasks ready: \n", - " \n", - " Tasks in flight: \n", - "
\n", - " CPU usage: 1.0%\n", - " \n", - " Last seen: Just now\n", - "
\n", - " Memory usage: 54.11 MiB\n", - " \n", - " Spilled bytes: 0 B\n", - "
\n", - " Read bytes: 1.09 GiB\n", - " \n", - " Write bytes: 1.09 GiB\n", - "
\n", - "
\n", - "
\n", - "
\n", - " \n", - "\n", - "
\n", - "
\n", - "
\n", - " \n", - "\n", - "
\n", - "
" - ], - "text/plain": [ - "" - ] - }, - "execution_count": 10, - "metadata": {}, - "output_type": "execute_result" - } - ], + "outputs": [], "source": [ "client_hpc" ] @@ -2722,33 +667,22 @@ }, { "cell_type": "code", - "execution_count": 11, + "execution_count": null, "metadata": {}, "outputs": [], "source": [ "ds_hpc = xc.open_dataset(filepath, chunks=\"auto\")\n", "\n", - "tas_avg_hpc = ds.temporal.group_average(\"tas\", freq=\"month\")" + "sst_avg_hpc = ds.temporal.group_average(\"sst\", freq=\"month\")" ] }, { "cell_type": "code", - "execution_count": 12, + "execution_count": null, "metadata": {}, - "outputs": [ - { - "name": "stderr", - "output_type": "stream", - "text": [ - "/opt/miniconda3/envs/xcdat_scipy_2024/lib/python3.11/site-packages/distributed/client.py:3164: UserWarning: Sending large graph of size 426.18 MiB.\n", - "This may cause some slowdown.\n", - "Consider scattering data ahead of time and using futures.\n", - " warnings.warn(\n" - ] - } - ], + "outputs": [], "source": [ - "tas_avg_hpc_res = tas_avg_hpc.compute()" + "sst_avg_hpc_res = sst_avg_hpc.compute()" ] }, { @@ -2836,7 +770,7 @@ ], "metadata": { "kernelspec": { - "display_name": "xcdat_dev", + "display_name": "xcdat_notebook_0.7.3", "language": "python", "name": "python3" }, @@ -2850,7 +784,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.11.9" + "version": "3.12.7" }, "orig_nbformat": 4 }, diff --git a/docs/examples/spatial-average.ipynb b/docs/examples/spatial-average.ipynb index 757e637a..4d150209 100644 --- a/docs/examples/spatial-average.ipynb +++ b/docs/examples/spatial-average.ipynb @@ -35,16 +35,16 @@ "First, create the conda environment:\n", "\n", "```bash\n", - "conda create -n xcdat_notebook_0.7.3 -c conda-forge xcdat=0.7.3 xesmf matplotlib ipython ipykernel cartopy nc-time-axis gsw-xarray jupyter pooch\n", + "conda create -n xcdat_notebook -c conda-forge xcdat xesmf matplotlib ipython ipykernel cartopy nc-time-axis 
gsw-xarray jupyter pooch\n", "```\n", "\n", - "Then install the kernel from the `xcdat_notebook_0.7.3` environment using `ipykernel` and name the kernel with the display name (e.g., `xcdat_notebook_0.7.3`):\n", + "Then install the kernel from the `xcdat_notebook` environment using `ipykernel` and name the kernel with the display name (e.g., `xcdat_notebook`):\n", "\n", "```bash\n", - "python -m ipykernel install --user --name xcdat_notebook_0.7.3 --display-name xcdat_notebook_0.7.3\n", + "python -m ipykernel install --user --name xcdat_notebook --display-name xcdat_notebook\n", "```\n", "\n", - "Then to select the kernel `xcdat_notebook_0.7.3` in Jupyter to use this kernel.\n" + "Then to select the kernel `xcdat_notebook` in Jupyter to use this kernel.\n" ] }, { diff --git a/docs/examples/temporal-average.ipynb b/docs/examples/temporal-average.ipynb index cd1df7dd..a9aa0519 100644 --- a/docs/examples/temporal-average.ipynb +++ b/docs/examples/temporal-average.ipynb @@ -40,16 +40,16 @@ "First, create the conda environment:\n", "\n", "```bash\n", - "conda create -n xcdat_notebook_0.7.3 -c conda-forge xcdat=0.7.3 xesmf matplotlib ipython ipykernel cartopy nc-time-axis gsw-xarray jupyter pooch\n", + "conda create -n xcdat_notebook -c conda-forge xcdat xesmf matplotlib ipython ipykernel cartopy nc-time-axis gsw-xarray jupyter pooch\n", "```\n", "\n", - "Then install the kernel from the `xcdat_notebook_0.7.3` environment using `ipykernel` and name the kernel with the display name (e.g., `xcdat_notebook_0.7.3`):\n", + "Then install the kernel from the `xcdat_notebook` environment using `ipykernel` and name the kernel with the display name (e.g., `xcdat_notebook`):\n", "\n", "```bash\n", - "python -m ipykernel install --user --name xcdat_notebook_0.7.3 --display-name xcdat_notebook_0.7.3\n", + "python -m ipykernel install --user --name xcdat_notebook --display-name xcdat_notebook\n", "```\n", "\n", - "Then to select the kernel `xcdat_notebook_0.7.3` in Jupyter to use this 
kernel.\n" + "Then to select the kernel `xcdat_notebook` in Jupyter to use this kernel.\n" ] }, {