{ "cells": [ { "cell_type": "markdown", "id": "d9ff4103", "metadata": {}, "source": [ "## Visualizing Hurricane Florence\n", "\n", "This examples makes a true-color animation of Hurricane Florence by stitching together images from GOES. It builds off this example from [pytroll-examples](https://github.com/pytroll/pytroll-examples/blob/main/satpy/GOES-16%20ABI%20-%20True%20Color%20Animation%20-%20Hurricane%20Florence.ipynb). You can see the output of that example [here](https://twitter.com/PyTrollOrg/status/1039555399433834497).\n", "\n", "Here's what our final animation will look like:\n", "\n", "" ] }, { "cell_type": "code", "execution_count": 1, "id": "d48a3b9f", "metadata": {}, "outputs": [], "source": [ "import urllib.request\n", "\n", "import contextily as ctx\n", "import geopandas\n", "import matplotlib.animation as animation\n", "import matplotlib.pyplot as plt\n", "import pandas as pd\n", "import planetary_computer\n", "import pystac_client\n", "import rioxarray\n", "import xarray as xr\n", "import odc.stac\n", "import dask.distributed" ] }, { "cell_type": "markdown", "id": "6217412c", "metadata": {}, "source": [ "### Find the storm\n", "\n", "First, we need to find where on earth the storm was. The NCEI [International Best Track Archive for Climate Stewardship](https://www.ncei.noaa.gov/access/metadata/landing-page/bin/iso?id=gov.noaa.ncdc:C00834) provides files with all the information we need." ] }, { "cell_type": "code", "execution_count": 2, "id": "a9601ba0", "metadata": {}, "outputs": [], "source": [ "file, _ = urllib.request.urlretrieve(\n", " \"https://www.ncei.noaa.gov/data/international-best-track-archive-for-\"\n", " \"climate-stewardship-ibtracs/v04r00/access/netcdf/IBTrACS.NA.v04r00.nc\"\n", ")\n", "# The storm id comes from the text file in\n", "# https://www.ncei.noaa.gov/data/international-best-track-archive-for-climate-stewardship-ibtracs\n", "# /v04r00/access/netcdf/\n", "# The name of this file changes with the update date, so we can't access it programmatically.\n", "STORM_ID = b\"2018242N13343\"\n", "ds = xr.open_dataset(file)\n", "storm_loc = (ds.sid == STORM_ID).argmax().item()\n", "\n", "data = ds.sel(storm=storm_loc)\n", "geometry = geopandas.points_from_xy(data.lon, data.lat)" ] }, { "cell_type": "markdown", "id": "e5b46235", "metadata": {}, "source": [ "`geometry` is a geopandas GeoArray with points tracking the location of the storm over time. We'll match those up with the timestamps to plot plot storm's trajectory. We'll also overlay the time period covered by our animation in red." ] }, { "cell_type": "code", "execution_count": 3, "id": "3b5ada5a", "metadata": {}, "outputs": [], "source": [ "df = (\n", " geopandas.GeoDataFrame(\n", " dict(\n", " time=pd.to_datetime(data.time).tz_localize(\"UTC\"),\n", " geometry=geopandas.points_from_xy(data.lon, data.lat),\n", " )\n", " )\n", " .set_crs(4326)\n", " .dropna()\n", ")\n", "\n", "start = pd.Timestamp(\"2018-09-11T13:00:00Z\")\n", "stop = pd.Timestamp(\"2018-09-11T15:40:00Z\")" ] }, { "cell_type": "code", "execution_count": 4, "id": "cbd23142", "metadata": {}, "outputs": [ { "data": { "text/html": [ "" ], "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "ax = df.to_crs(epsg=3857).plot(figsize=(12, 12))\n", "subset = df[df.time.dt.date == start.date()]\n", "subset.to_crs(epsg=3857).plot(ax=ax, color=\"r\")\n", "\n", "ctx.add_basemap(ax)\n", "ax.set_axis_off()\n", "ax.set(title=\"Path of Hurricane Florence (animation period in red)\");" ] }, { "cell_type": "markdown", "id": "0ca49d78", "metadata": {}, "source": [ "Let's save the bounding box for the subset of points we're animating . We'll use it in our query later on." ] }, { "cell_type": "code", "execution_count": 5, "id": "fc950ea3", "metadata": {}, "outputs": [], "source": [ "bbox = list(subset.total_bounds)" ] }, { "cell_type": "markdown", "id": "1330e644", "metadata": {}, "source": [ "### Get the imagery\n", "\n", "Now we'll get the GOES imagery using the Planteary Computer's STAC API. We'll use the `goes-cmi` collection. We'll also have the API filter down the images to just the \"mesoscale\" images (GOES takes images with various fields of view)." ] }, { "cell_type": "code", "execution_count": 6, "id": "4df7dad0", "metadata": {}, "outputs": [], "source": [ "catalog = pystac_client.Client.open(\n", " \"https://planetarycomputer.microsoft.com/api/stac/v1/\",\n", " modifier=planetary_computer.sign_inplace,\n", ")\n", "search = catalog.search(\n", " collections=[\"goes-cmi\"],\n", " bbox=bbox,\n", " datetime=[start, stop],\n", " query={\"goes:image-type\": {\"eq\": \"MESOSCALE\"}},\n", ")\n", "items = sorted(search.item_collection(), key=lambda x: x.datetime)" ] }, { "cell_type": "markdown", "id": "333e4d59", "metadata": {}, "source": [ "Let's load and plot the first item, just to make sure we're on the right track." ] }, { "cell_type": "code", "execution_count": 7, "id": "92855bfe", "metadata": {}, "outputs": [ { "data": { "text/html": [ "" ], "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "ds = rioxarray.open_rasterio(items[0].assets[\"C01_2km\"].href).load()\n", "ds[0].plot.imshow(figsize=(16, 9), cmap=\"Blues\");" ] }, { "cell_type": "markdown", "id": "25e9562b-cf79-491a-82f6-03478c18e4e9", "metadata": {}, "source": [ "We'll use `odc-stac` to load all the bands and time steps into a single datacube." ] }, { "cell_type": "code", "execution_count": 8, "id": "84e6d864-ff60-4927-8d6b-85228574a17d", "metadata": {}, "outputs": [], "source": [ "# Drop assets in different projections\n", "# https://github.com/opendatacube/odc-stac/issues/70\n", "bands = {\"C01_2km\", \"C02_2km\", \"C03_2km\"}\n", "\n", "for item in items:\n", " item.assets = {k: v for k, v in item.assets.items() if k in bands}" ] }, { "cell_type": "code", "execution_count": 9, "id": "b899ee26-e7ef-4363-92ea-0f3e6309e541", "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "
<xarray.Dataset>\n",
       "Dimensions:      (time: 160, y: 501, x: 501)\n",
       "Coordinates:\n",
       "  * time         (time) datetime64[ns] 2018-09-11T13:00:21 ... 2018-09-11T15:...\n",
       "  * y            (y) float64 3.368e+06 3.366e+06 ... 2.368e+06 2.366e+06\n",
       "  * x            (x) float64 3.196e+05 3.216e+05 ... 1.32e+06 1.322e+06\n",
       "    spatial_ref  int32 0\n",
       "Data variables:\n",
       "    red          (time, y, x) int16 dask.array<chunksize=(1, 501, 501), meta=np.ndarray>\n",
       "    nir09        (time, y, x) int16 dask.array<chunksize=(1, 501, 501), meta=np.ndarray>\n",
       "    blue         (time, y, x) int16 dask.array<chunksize=(1, 501, 501), meta=np.ndarray>\n",
       "Attributes:\n",
       "    crs:           PROJCRS["undefined",BASEGEOGCRS["undefined",DATUM["undefin...\n",
       "    grid_mapping:  spatial_ref
" ], "text/plain": [ "\n", "Dimensions: (time: 160, y: 501, x: 501)\n", "Coordinates:\n", " * time (time) datetime64[ns] 2018-09-11T13:00:21 ... 2018-09-11T15:...\n", " * y (y) float64 3.368e+06 3.366e+06 ... 2.368e+06 2.366e+06\n", " * x (x) float64 3.196e+05 3.216e+05 ... 1.32e+06 1.322e+06\n", " spatial_ref int32 0\n", "Data variables:\n", " red (time, y, x) int16 dask.array\n", " nir09 (time, y, x) int16 dask.array\n", " blue (time, y, x) int16 dask.array\n", "Attributes:\n", " crs: PROJCRS[\"undefined\",BASEGEOGCRS[\"undefined\",DATUM[\"undefin...\n", " grid_mapping: spatial_ref" ] }, "execution_count": 9, "metadata": {}, "output_type": "execute_result" } ], "source": [ "ds = odc.stac.stac_load(items, bands=bands, chunks={})\n", "key_to_common_name = {\n", " k: item.assets[k].to_dict()[\"eo:bands\"][0][\"common_name\"] for k in bands\n", "}\n", "\n", "ds = ds.rename(key_to_common_name)\n", "ds" ] }, { "cell_type": "markdown", "id": "e3f1a578", "metadata": {}, "source": [ "GOES doesn't have a true green band, which we need for our true color animation. We'll simulate it with a linear combination of the other bands (See [Bah et. al (2018)](https://agupubs.onlinelibrary.wiley.com/doi/10.1029/2018EA000379) for more on this technique)." ] }, { "cell_type": "code", "execution_count": 10, "id": "733c325f", "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "
<xarray.DataArray (time: 160, y: 501, x: 501)>\n",
       "dask.array<add, shape=(160, 501, 501), dtype=float64, chunksize=(1, 501, 501), chunktype=numpy.ndarray>\n",
       "Coordinates:\n",
       "  * time         (time) datetime64[ns] 2018-09-11T13:00:21 ... 2018-09-11T15:...\n",
       "  * y            (y) float64 3.368e+06 3.366e+06 ... 2.368e+06 2.366e+06\n",
       "  * x            (x) float64 3.196e+05 3.216e+05 ... 1.32e+06 1.322e+06\n",
       "    spatial_ref  int32 0
" ], "text/plain": [ "\n", "dask.array\n", "Coordinates:\n", " * time (time) datetime64[ns] 2018-09-11T13:00:21 ... 2018-09-11T15:...\n", " * y (y) float64 3.368e+06 3.366e+06 ... 2.368e+06 2.366e+06\n", " * x (x) float64 3.196e+05 3.216e+05 ... 1.32e+06 1.322e+06\n", " spatial_ref int32 0" ] }, "execution_count": 10, "metadata": {}, "output_type": "execute_result" } ], "source": [ "green = 0.45 * ds[\"red\"] + 0.1 * ds[\"nir09\"] + 0.45 * ds[\"blue\"]\n", "green" ] }, { "cell_type": "markdown", "id": "107c4dba", "metadata": {}, "source": [ "Now we'll normalize the data and apply a gamma correction for plotting." ] }, { "cell_type": "code", "execution_count": 11, "id": "d2a0a541", "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "\n", "
<xarray.DataArray 'stack-9695230fd7b589ed0e55bad9e0730a08' (time: 160, band: 3,\n",
       "                                                            y: 501, x: 501)>\n",
       "dask.array<clip, shape=(160, 3, 501, 501), dtype=float64, chunksize=(1, 1, 501, 501), chunktype=numpy.ndarray>\n",
       "Coordinates:\n",
       "  * time         (time) datetime64[ns] 2018-09-11T13:00:21 ... 2018-09-11T15:...\n",
       "  * y            (y) float64 3.368e+06 3.366e+06 ... 2.368e+06 2.366e+06\n",
       "  * x            (x) float64 3.196e+05 3.216e+05 ... 1.32e+06 1.322e+06\n",
       "    spatial_ref  int32 0\n",
       "  * band         (band) <U5 'red' 'green' 'blue'
" ], "text/plain": [ "\n", "dask.array\n", "Coordinates:\n", " * time (time) datetime64[ns] 2018-09-11T13:00:21 ... 2018-09-11T15:...\n", " * y (y) float64 3.368e+06 3.366e+06 ... 2.368e+06 2.366e+06\n", " * x (x) float64 3.196e+05 3.216e+05 ... 1.32e+06 1.322e+06\n", " spatial_ref int32 0\n", " * band (band) " ], "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "fig, ax = plt.subplots(figsize=(16, 16))\n", "rgb.isel(time=0).plot.imshow(rgb=\"band\", add_labels=False)\n", "ax.set_axis_off()" ] }, { "cell_type": "markdown", "id": "354e41ba", "metadata": {}, "source": [ "### Create the animation\n", "\n", "We'll use matplotlib's [FuncAnimation](https://matplotlib.org/stable/api/_as_gen/matplotlib.animation.FuncAnimation.html) to create the animation." ] }, { "cell_type": "code", "execution_count": 15, "id": "f1e2adaa", "metadata": {}, "outputs": [ { "data": { "text/html": [ "" ], "text/plain": [ "
" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "fig, ax = plt.subplots(figsize=(16, 16))\n", "fig.subplots_adjust(left=0, bottom=0, right=1, top=1, wspace=None, hspace=None)\n", "ax.set_axis_off()\n", "\n", "img = rgb[0].plot.imshow(ax=ax, add_colorbar=False, rgb=\"band\", add_labels=False)\n", "label = ax.text(\n", " 0.4,\n", " 0.03,\n", " pd.Timestamp(rgb.time.data[0]).isoformat(),\n", " transform=ax.transAxes,\n", " color=\"k\",\n", " size=20,\n", ")\n", "\n", "\n", "def animate(i):\n", " img.set_data(rgb[i].transpose(\"y\", \"x\", \"band\"))\n", " label.set_text(pd.Timestamp(rgb.time.data[i]).isoformat())\n", " return img, label\n", "\n", "\n", "ani = animation.FuncAnimation(fig, animate, frames=len(rgb), interval=120)\n", "ani.save(\n", " \"goes.mp4\",\n", " fps=15,\n", " extra_args=[\"-vcodec\", \"libx264\"],\n", " savefig_kwargs=dict(pad_inches=0, transparent=True),\n", ")" ] }, { "cell_type": "markdown", "id": "eb7aef0b-81c1-454f-9651-c9de45e7264a", "metadata": {}, "source": [ "And we can display it in this notebook using IPython." ] }, { "cell_type": "code", "execution_count": 16, "id": "f38385c1", "metadata": {}, "outputs": [ { "data": { "text/html": [ "" ], "text/plain": [ "" ] }, "execution_count": 16, "metadata": {}, "output_type": "execute_result" } ], "source": [ "from IPython.display import Video\n", "\n", "Video(\"goes.mp4\")" ] }, { "cell_type": "markdown", "id": "593fad1a", "metadata": {}, "source": [ "### Next steps\n", "\n", "Learn more about GOES and using the Planetary Computer\n", "\n", "* [GOES quickstart](../datasets/goes/goes-example.ipynb)\n", "* [Reading from the STAC API](https://planetarycomputer.microsoft.com/docs/quickstarts/reading-stac/)\n", "* [Scale with Dask](https://planetarycomputer.microsoft.com/docs/quickstarts/scale-with-dask/)" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.13" }, "widgets": { "application/vnd.jupyter.widget-state+json": { "state": {}, "version_major": 2, "version_minor": 0 } } }, "nbformat": 4, "nbformat_minor": 5 }