{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "
\n", "\n", "
\n", "\"Unidata\n", "
\n", "\n", "

MetPy with xarray

\n", "

Unidata AMS 2021 Student Conference

\n", "\n", "
\n", "
\n", "\n", "---\n", "\n", "[Xarray](http://xarray.pydata.org/en/stable/) is a powerful Python package that provides N-dimensional\n", "labeled arrays and datasets following the Common Data Model. MetPy's suite of meteorological\n", "calculations are designed to integrate with xarray DataArrays as one of its two primary data\n", "models (the other being Pint Quantities). MetPy also provides DataArray and Dataset\n", "*accessors* (collections of methods and properties attached to the `.metpy` property) for\n", "coordinate/CRS and unit operations.\n", "\n", "Full information on MetPy's accessors is available in the [appropriate section of MetPy's documentation](https://unidata.github.io/MetPy/latest/api/generated/metpy.xarray.html), otherwise, continue on in this\n", "notebook for a demonstration of the three main components of MetPy's integration with xarray\n", "(coordinates/coordinate reference systems, units, and calculations), as well as instructive\n", "examples for both CF-compliant and non-compliant datasets.\n", "\n", "This training notebook a lengthy one, and is based off of the [xarray with MetPy Tutorial](https://unidata.github.io/MetPy/latest/tutorials/xarray_tutorial.html) in MetPy's documentation. Also, there are several other training notebooks that expand upon the features of xarray (see the [see also section below](#See-Also)). Feel free to use this notebook as the big-picture overview of working with gridded data in MetPy's ecosystem (and then going to the specific xarray notebooks where needed), or, look through the xarray notebooks individually first, since this notebook covers extensions to those behaviors to make them easier to use with MetPy.\n", "\n", "
\"HTML
\n", "\n", "\n", "### Focuses\n", "\n", "- Introduce the xarray package\n", "- Learn about coordinates and coordinate reference systems in xarray and MetPy\n", "- See how MetPy's unit functionality caries over to xarray\n", "- See how xarray data can be used with MetPy's calculation suite\n", "- Find instructive examples for working with both CF-compliant and non CF-compliant datasets\n", "\n", "### Objectives\n", "\n", "1. [Open a test Dataset](#1.-Open-a-test-Dataset)\n", "1. [Coordinates and Coordinate Reference Systems](#2.-Coordinates-and-Coordinate-Reference-Systems)\n", "1. [Units](#3.-Units)\n", "1. [Calculations](#4.-Calculations)\n", "1. [Dataset examples](#5.-Dataset-examples)\n", "\n", "---" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Imports" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "import numpy as np\n", "import xarray as xr\n", "\n", "# Any import of metpy will activate the accessors\n", "import metpy.calc as mpcalc\n", "from metpy.cbook import get_test_data\n", "from metpy.units import units" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "---" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 1. Open a test Dataset\n", "\n", "While xarray can handle a wide variety of n-dimensional data (essentially anything that can\n", "be stored in a netCDF file), a common use case is working with gridded model output. For more details on accessing data with xarray, see the [xarray data access notebook](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/dataAccess/xarray_data_access.ipynb).\n", "\n", "For now, we are just going to load a local file from MetPy's test data collection:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# Open the netCDF file as a xarray Dataset\n", "data = xr.open_dataset(get_test_data('irma_gfs_example.nc', False))\n", "\n", "# View a summary of the Dataset\n", "data" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Take a look at the output of this cell. The `data` variable here is an xarray `Dataset`, and it consists of *dimensions* and their associated *coordinates*, which in turn make up the axes along which the *data variables* are defined. The dataset also has a dictionary-like collection of *attributes*. If you've worked with NetCDF data in the past, hopefully these should be familiar to you.\n", "\n", "What happens if we look at just a single data variable?" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "temperature = data['Temperature_isobaric']\n", "temperature" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is a `DataArray`, which stores just a single data variable with its associated\n", "coordinates and attributes. These individual `DataArray`s are the kinds of objects that\n", "MetPy's calculations take as input (more on that in [Calculations section below](#4.Calculations)).\n", "\n", "If you are more interested in learning about xarray's terminology and data structures, see\n", "the [terminology section](http://xarray.pydata.org/en/stable/terminology.html) of xarray's\n", "documenation.\n", "\n", "Top\n", "\n", "---" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 2. 
Coordinates and Coordinate Reference Systems\n", "\n", "### Coordinate Types\n", "\n", "MetPy's first set of helpers is for identifying *coordinate types*. In a given dataset,\n", "coordinates can have a variety of different names and yet refer to the same type (such as the\n", "\"isobaric1\" and \"isobaric3\" coordinates seen above both referring to vertical isobaric coordinates). Following\n", "[CF conventions](http://cfconventions.org/Data/cf-conventions/cf-conventions-1.8/cf-conventions.html), as well as using some fall-back regular expressions, MetPy can\n", "systematically identify coordinates of the following types:\n", "\n", "- time\n", "- vertical\n", "- latitude\n", "- y\n", "- longitude\n", "- x\n", "\n", "When identifying a single coordinate, it is best to use the property directly associated\n", "with that type" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "temperature.metpy.time" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "When accessing multiple coordinate types simultaneously, you can use the ``.coordinates()``\n", "method to yield a generator for the respective coordinates\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "x, y = temperature.metpy.coordinates('x', 'y')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If you're [indexing into your data with xarray](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/xarray_indexing.ipynb), and want to use these shortcut aliases instead of the direct coordinate names, MetPy provides its own version of `.sel` and `.loc` to handle these! For example, to use MetPy's wrapped `.sel` and `.loc` to access 500 hPa heights at 1800Z, you can do the following:\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "heights = data['Geopotential_height_isobaric'].metpy.sel(\n", " time='2017-09-05 18:00',\n", " vertical=50000.\n", ")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "(Notice how we specified 50000 here without units...we'll go over [a better alternative in\n", "the next section on units](#3.-Units).)\n", "\n", "One point of warning: xarray's selection and indexing only work if these coordinates are\n", "*dimension coordinates*, meaning that they are 1D and share the name of their associated\n", "dimension. In practice, this means that you can't index a dataset that has 2D latitude and\n", "longitude coordinates by latitudes and longitudes; instead, you must index by the 1D y and x\n", "dimension coordinates. (What if these coordinates are missing, you may ask? MetPy has the [`.assign_y_x` helper](https://unidata.github.io/MetPy/latest/api/generated/metpy.xarray.html#metpy.xarray.MetPyDataArrayAccessor.assign_y_x) for that.)\n", "\n", "
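A runnable demonstration has to wait for the [Non-Compliant Dataset Example](#Non-Compliant-Dataset-Example) below, since our GFS sample already has 1D dimension coordinates, but the call itself is short (a minimal sketch, where `data_2d` stands in for a hypothetical dataset with 2D latitude/longitude coordinates and an already parsed or assigned CRS):\n", "\n", "```python\n", "# Derive 1D y/x dimension coordinates from the 2D latitude/longitude coordinates;\n", "# this requires that a CRS has already been parsed or assigned (see the next section)\n", "data_with_yx = data_2d.metpy.assign_y_x()\n", "```\n", "\n", "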
### Coordinate Reference Systems (CRS)\n", "\n", "Beyond just the coordinates themselves, a common need for both calculations with and plots\n", "of geospatial data is knowing the coordinate reference system (CRS) on which the horizontal\n", "spatial coordinates are defined. MetPy follows the [CF Conventions](http://cfconventions.org/Data/cf-conventions/cf-conventions-1.8/cf-conventions.html#grid-mappings-and-projections)\n", "for its CRS definitions, which it then caches on the `metpy_crs` coordinate in order for\n", "it to persist through calculations and other array operations. There are two ways to attach a CRS\n", "in MetPy:\n", "\n", "First, if your dataset already conforms to the CF Conventions, it will have a grid\n", "mapping variable that is associated with the other data variables by the `grid_mapping`\n", "attribute. This is automatically parsed via the `.parse_cf()` method:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# Parse full dataset\n", "data_parsed = data.metpy.parse_cf()\n", "\n", "# Parse subset of dataset\n", "data_subset = data.metpy.parse_cf([\n", " 'u-component_of_wind_isobaric',\n", " 'v-component_of_wind_isobaric',\n", " 'Vertical_velocity_pressure_isobaric'\n", "])\n", "\n", "# Parse single variable\n", "relative_humidity = data.metpy.parse_cf('Relative_humidity_isobaric')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If your dataset doesn't have a CF-conforming grid mapping variable, you can manually specify\n", "the CRS using the [`.assign_crs()` method](https://unidata.github.io/MetPy/latest/api/generated/metpy.xarray.html#metpy.xarray.MetPyDataArrayAccessor.assign_crs):\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "temperature = data['Temperature_isobaric'].metpy.assign_crs(\n", " grid_mapping_name='latitude_longitude',\n", " earth_radius=6371229.0\n", ")\n", "\n", "temperature" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Notice the newly added `metpy_crs` non-dimension coordinate. Now how can we use this in\n", "practice? For individual `DataArray`s, we can access the cartopy and pyproj objects\n", "corresponding to this CRS:\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# Cartopy CRS, useful for plotting\n", "relative_humidity.metpy.cartopy_crs" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# pyproj CRS, useful for projection transformations and forward/backward azimuth and great\n", "# circle calculations\n", "temperature.metpy.pyproj_crs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### One other note about coordinates and CRSs\n", "\n", "Finally, there are times when a certain horizontal coordinate type is missing from your\n", "dataset, and you need the other; that is, you have latitude/longitude and need y/x, or vice\n", "versa. This is where the ``.assign_y_x`` and ``.assign_latitude_longitude`` methods come in\n", "handy. Our current GFS sample won't work to demonstrate this (since, on its\n", "latitude-longitude grid, y is latitude and x is longitude), so for more information, take\n", "a look at the [Non-Compliant Dataset Example](#Non-Compliant-Dataset-Example) below.\n", "\n", "\n", "Top\n", "\n", "---" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 3. 
Units\n", "\n", "Since [unit-aware calculations](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/metpy_basics.ipynb) are a major part of the MetPy library, [unit support](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/units.ipynb) is a major\n", "part of MetPy's xarray integration!\n", "\n", "One very important point of consideration is that xarray data variables (in both\n", "`Dataset`s and `DataArray`s) can store both unit-aware and unit-naive array types.\n", "Unit-naive array types will be used by default in xarray, so we need to convert to a\n", "unit-aware type if we want to use xarray operations while preserving unit correctness (What are these xarray operations you may ask? See the [xarray aggregations](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/xarray_aggregations.ipynb) and [xarray calculations](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/xarray_calculations.ipynb) training notebooks!). MetPy\n", "provides the `.quantify()` method for this conversion (named since we are turning the data stored\n", "inside the xarray object into a Pint `Quantity` object...and if you want to learn more about those, see the [MetPy with units training notebook](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/units.ipynb)).\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "heights = heights.metpy.quantify()\n", "heights" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Notice how the units are now represented in the data itself, rather than as a text\n", "attribute. Now, even if we perform some kind of xarray operation (such as taking the zonal\n", "mean, which is [an aggregation operation](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/xarray_aggregations.ipynb)), the units are preserved\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "heights_mean = heights.mean('longitude')\n", "heights_mean" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "However, this \"quantification\" is not without its consequences. By default, xarray loads its\n", "data lazily to conserve memory usage. Unless your data is chunked into a Dask array (using\n", "the ``chunks`` argument), this ``.quantify()`` method will load data into memory, which\n", "could slow your script or even cause your process to run out of memory. And so, we recommend\n", "subsetting your data before quantifying it.\n", "\n", "Also, these Pint ``Quantity`` data objects are not properly handled by xarray when writing\n", "to disk. 
And so, if you want to safely export your data, you will need to undo the\n", "quantification with the ``.dequantify()`` method, which converts your data back to a\n", "unit-naive array with the unit as a text attribute\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "heights_mean_str_units = heights_mean.metpy.dequantify()\n", "heights_mean_str_units" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Other useful unit integration features include:\n", "\n", "Unit-based [selection/indexing](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/xarray_indexing.ipynb):\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "heights_at_45_north = data['Geopotential_height_isobaric'].metpy.sel(\n", " latitude=45 * units.degrees_north,\n", " vertical=300 * units.hPa\n", ")\n", "heights_at_45_north" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Unit conversion:\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "temperature_degC = temperature[0].metpy.convert_units('degC')\n", "temperature_degC" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Unit conversion for coordinates:\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "heights_on_hPa_levels = heights.metpy.convert_coordinate_units('isobaric3', 'hPa')\n", "heights_on_hPa_levels['isobaric3']" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Accessing just the underlying unit array:\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "heights_unit_array = heights.metpy.unit_array\n", "heights_unit_array" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Accessing just the underlying units:\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "height_units = heights.metpy.units\n", "height_units" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "Top\n", "\n", "---" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 4. Calculations\n", "\n", "MetPy's xarray integration extends to its calculation suite as well. Most grid-capable\n", "calculations (such as thermodynamics, kinematics, and smoothers) fully support xarray\n", "`DataArray`s by accepting them as inputs, returning them as outputs, and automatically\n", "using the attached coordinate data/metadata to determine grid arguments.\n", "\n", "As before, if you'd like more information on MetPy's calculations, see the [basic overview training notebook](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/metpy_basics.ipynb) or look through [MetPy's documentation](https://unidata.github.io/MetPy/latest/api/generated/metpy.calc.html).\n", "\n", "As an example, let's calculate geostrophic wind from our geopotential height data. Note that we only have to specify the height field, as all coordinate/grid information is automatically extracted from the attached metadata." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "heights = data_parsed.metpy.parse_cf('Geopotential_height_isobaric').metpy.sel(\n", " time='2017-09-05 18:00',\n", " vertical=500 * units.hPa\n", ")\n", "u_g, v_g = mpcalc.geostrophic_wind(heights)\n", "u_g" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, not every MetPy calculation returns a `DataArray`. For profile-based calculations (and most remaining calculations in the `metpy.calc`\n", "module not in the categories mentioned above), xarray `DataArray`s are accepted as inputs, but the outputs remain Pint Quantities (typically scalars), as in the example below with CAPE:\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "data_at_point = data.metpy.sel(\n", " time1='2017-09-05 12:00',\n", " latitude=40 * units.degrees_north,\n", " longitude=260 * units.degrees_east\n", ")\n", "dewpoint = mpcalc.dewpoint_from_relative_humidity(\n", " data_at_point['Temperature_isobaric'],\n", " data_at_point['Relative_humidity_isobaric']\n", ")\n", "cape, cin = mpcalc.surface_based_cape_cin(\n", " data_at_point['isobaric3'],\n", " data_at_point['Temperature_isobaric'],\n", " dewpoint\n", ")\n", "cape" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A few remaining portions of MetPy's calculations (mainly the interpolation module and a few\n", "other functions) do not fully support xarray, and so, use of ``.values`` may be needed to\n", "convert to a bare NumPy array.\n", "\n", "
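For example (a minimal sketch reusing the point data selected above; `.metpy.unit_array` keeps the units as a Pint array, while `.values` drops them and returns a plain NumPy array):\n", "\n", "```python\n", "# Bare NumPy array (units stripped) for functions that do not accept DataArrays\n", "temperature_array = data_at_point['Temperature_isobaric'].values\n", "\n", "# Pint Quantity array, for functions that are unit-aware but not xarray-aware\n", "temperature_quantity = data_at_point['Temperature_isobaric'].metpy.unit_array\n", "```\n", "\n", "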
] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "heights = data_parsed.metpy.parse_cf('Geopotential_height_isobaric').metpy.sel(\n", " time='2017-09-05 18:00',\n", " vertical=500 * units.hPa\n", ")\n", "u_g, v_g = mpcalc.geostrophic_wind(heights)\n", "u_g" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now, not every MetPy calculation will always return a `DataArray`. For profile-based calculations (and most remaining calculations in the `metpy.calc`\n", "module not in the categories mentioned above), xarray `DataArray`s are accepted as inputs, but the outputs remain Pint Quantities (typically scalars), like in this below example with CAPE:\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "data_at_point = data.metpy.sel(\n", " time1='2017-09-05 12:00',\n", " latitude=40 * units.degrees_north,\n", " longitude=260 * units.degrees_east\n", ")\n", "dewpoint = mpcalc.dewpoint_from_relative_humidity(\n", " data_at_point['Temperature_isobaric'],\n", " data_at_point['Relative_humidity_isobaric']\n", ")\n", "cape, cin = mpcalc.surface_based_cape_cin(\n", " data_at_point['isobaric3'],\n", " data_at_point['Temperature_isobaric'],\n", " dewpoint\n", ")\n", "cape" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A few remaining portions of MetPy's calculations (mainly the interpolation module and a few\n", "other functions) do not fully support xarray, and so, use of ``.values`` may be needed to\n", "convert to a bare NumPy array.\n", "\n", "Top\n", "\n", "---\n", "\n", "## 5. Dataset examples\n", "\n", "Now let's examine some other datasets so that we can go over the two most common workflows needed to handle any kind of gridded dataset you may have in xarray.\n", "\n", "### CF-Compliant Dataset Example\n", "\n", "The GFS sample used throughout this tutorial so far has been an example of a CF-compliant\n", "dataset. These kinds of datasets are easiest to work with it MetPy, since most of the\n", "\"xarray magic\" uses CF metadata. 
For this kind of dataset, a typical workflow looks like the\n", "following\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# Load data, parse it for a CF grid mapping, and promote lat/lon data variables to coordinates\n", "data = xr.open_dataset(\n", " get_test_data('narr_example.nc', False)\n", ").metpy.parse_cf().set_coords(['lat', 'lon'])\n", "\n", "# Subset to only the data you need to save on memory usage\n", "subset = data.metpy.sel(isobaric=500 * units.hPa)\n", "\n", "# Quantify if you plan on performing xarray operations that need to maintain unit correctness\n", "subset = subset.metpy.quantify()\n", "\n", "# Perform calculations\n", "heights = mpcalc.smooth_gaussian(subset['Geopotential_height'], 5)\n", "subset['u_geo'], subset['v_geo'] = mpcalc.geostrophic_wind(heights)\n", "\n", "# Plot\n", "heights.plot()" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "# Save output\n", "subset.metpy.dequantify().drop_vars('metpy_crs').to_netcdf('500hPa_analysis.nc')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Non-Compliant Dataset Example\n", "\n", "When CF metadata (such as grid mapping, coordinate attributes, etc.) are missing, a bit more\n", "work is required to manually supply the required information, for example,\n", "\n" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "jupyter": { "outputs_hidden": false } }, "outputs": [], "source": [ "nonstandard = xr.Dataset({\n", " 'temperature': (('y', 'x'), np.arange(0, 9).reshape(3, 3) * units.degC),\n", " 'y': ('y', np.arange(0, 3) * 1e5, {'units': 'km'}),\n", " 'x': ('x', np.arange(0, 3) * 1e5, {'units': 'km'})\n", "})\n", "\n", "# Add both CRS and then lat/lon coords using chained methods\n", "data = nonstandard.metpy.assign_crs(\n", " grid_mapping_name='lambert_conformal_conic',\n", " latitude_of_projection_origin=38.5,\n", " longitude_of_central_meridian=262.5,\n", " standard_parallel=38.5,\n", " earth_radius=6371229.0\n", ").metpy.assign_latitude_longitude()\n", "\n", "# Preview the changes\n", "data" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Once the CRS and additional coordinates are assigned, you can generally proceed as you would\n", "for a CF-compliant dataset.\n", "\n", "Top\n", "\n", "---\n", "\n", "## See Also\n", "\n", "There was a lot going on in this notebook! But, that's just because there is so much that you can do with xarray, and MetPy has a fair number of utilities to make it all go more smoothly when working with atmospheric science datasets. 
Since we couldn't cover the details of everything xarray has to offer here, be sure to take a look at the other xarray training notebooks linked below:\n", "\n", "- [Xarray data access](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/dataAccess/xarray_data_access.ipynb)\n", "- [Indexing with xarray](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/xarray_indexing.ipynb)\n", "- [Interpolation and regridding with xarray](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/xarray_interpolation.ipynb)\n", "- [Xarray aggregation operations](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/xarray_aggregations.ipynb)\n", "- [Calculations in xarray](https://nbviewer.jupyter.org/github/Unidata/pyaos-ams-2021/blob/master/notebooks/analysis/xarray_calculations.ipynb)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Top" ] } ], "metadata": { "kernelspec": { "display_name": "Python [conda env:pyaos-ams-2021]", "language": "python", "name": "conda-env-pyaos-ams-2021-py" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.1" } }, "nbformat": 4, "nbformat_minor": 4 }