{ "cells": [ { "cell_type": "markdown", "id": "generous-bandwidth", "metadata": {}, "source": [ "# DeepCube Use Case 3: Fire Hazard Forecasting in the Mediterranean" ] }, { "cell_type": "markdown", "id": "japanese-cattle", "metadata": {}, "source": [ " ![use case3 icon](static/uc3-logo.png) " ] }, { "cell_type": "markdown", "id": "interior-logistics", "metadata": { "id": "collected-somerset" }, "source": [ "# Notebook 2: Fire danger maps in Greece with Deep Learning models" ] }, { "cell_type": "markdown", "id": "internal-range", "metadata": { "id": "creative-bracelet" }, "source": [ "This notebook shows how to load our pretrained models for Fire Danger prediction and perform inference on the published [greece_wildfire_datacube](https://zenodo.org/deposit/4943354) or its [cloud version](https://deepcube-h2020.eu/data-cubes/).\n", "\n", "For details on how our models were trained, see our paper \"Deep Learning Methods for Daily Wildfire Danger Forecasting\" [available on arXiv](https://arxiv.org/abs/2111.02736), which was accepted at the workshop on Artificial Intelligence for Humanitarian Assistance and Disaster Response at the 35th Conference on Neural Information Processing Systems (NeurIPS 2021).\n", "\n", "![arxiv paper](static/arxiv_paper.PNG)" ] }, { "cell_type": "markdown", "id": "intended-hands", "metadata": { "id": "understanding-intention" }, "source": [ "# Initial Imports\n", "\n", "We start by importing the libraries we need. Make sure they are installed on your system. 
" ] }, { "cell_type": "code", "execution_count": null, "id": "raised-compound", "metadata": { "id": "hungry-machine" }, "outputs": [], "source": [ "import xarray as xr\n", "import fsspec\n", "import zarr\n", "\n", "import numpy as np\n", "import pandas as pd\n", "\n", "import csv\n", "import matplotlib.pyplot as plt\n", "\n", "import random\n", "import os\n", "from tqdm import tqdm\n", "import gc\n", "from pathlib import Path\n", "\n", "import torchvision\n", "import torch\n", "from torch.utils.data import Dataset, DataLoader, TensorDataset" ] }, { "cell_type": "markdown", "id": "lucky-advocacy", "metadata": { "id": "quarterly-catalyst" }, "source": [ "# Access Datacube\n", "\n", "Let's open the dataset with xarray." ] }, { "cell_type": "code", "execution_count": 2, "id": "modern-travel", "metadata": { "id": "demographic-display" }, "outputs": [], "source": [ "# uncomment the following lines to download the dataset and open it locally\n", "# !wget -O dataset_greece.nc https://zenodo.org/record/4943354/files/dataset_greece.nc?download=1\n", "# ds = xr.open_dataset('./dataset_greece.nc')\n", "\n", "# comment out the following two lines if you access the dataset locally\n", "url = 'https://storage.de.cloud.ovh.net/v1/AUTH_84d6da8e37fe4bb5aea18902da8c1170/uc3/uc3cube.zarr'\n", "ds = xr.open_zarr(fsspec.get_mapper(url), consolidated=True)\n" ] }, { "cell_type": "code", "execution_count": 3, "id": "quiet-admission", "metadata": {}, "outputs": [ { "data": { "text/html": [ "
<xarray.Dataset>\n", "Dimensions: (time: 4314, x: 700, y: 562)\n", "Coordinates:\n", " band int64 ...\n", " spatial_ref (time) int64 dask.array<chunksize=(288,), meta=np.ndarray>\n", " * time (time) datetime64[ms] 2009-03-06 ... 2020-12-26\n", " * x (x) float64 19.86 19.87 19.89 ... 28.16 28.17 28.18\n", " * y (y) float64 41.62 41.61 41.59 ... 34.96 34.95 34.94\n", "Data variables: (12/58)\n", " 1 km 16 days EVI (time, y, x) float64 dask.array<chunksize=(288, 50, 50), meta=np.ndarray>\n", " 1 km 16 days NDVI (time, y, x) float64 dask.array<chunksize=(288, 50, 50), meta=np.ndarray>\n", " 1 km 16 days VI Quality (time, y, x) float64 dask.array<chunksize=(288, 50, 50), meta=np.ndarray>\n", " ET_500m (time, y, x) float64 dask.array<chunksize=(288, 50, 50), meta=np.ndarray>\n", " ET_QC_500m (time, y, x) float64 dask.array<chunksize=(288, 50, 50), meta=np.ndarray>\n", " FparExtra_QC (time, y, x) float64 dask.array<chunksize=(288, 50, 50), meta=np.ndarray>\n", " ... ...\n", " population_density_2020 (y, x) float32 dask.array<chunksize=(50, 50), meta=np.ndarray>\n", " roads_density_2020 (y, x) float64 dask.array<chunksize=(50, 50), meta=np.ndarray>\n", " slope_max (y, x) float32 dask.array<chunksize=(50, 50), meta=np.ndarray>\n", " slope_mean (y, x) float32 dask.array<chunksize=(50, 50), meta=np.ndarray>\n", " slope_min (y, x) float32 dask.array<chunksize=(50, 50), meta=np.ndarray>\n", " slope_std (y, x) float32 dask.array<chunksize=(50, 50), meta=np.ndarray>
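Once the cube is open, individual days and spatial windows can be pulled out with xarray's label-based indexing. The snippet below is a minimal sketch of that pattern: it builds a tiny synthetic dataset with the same `(time, y, x)` layout as the Greece datacube so it runs offline, and the variable name `ndvi` and the coordinate values are illustrative placeholders, not the cube's actual names (the real cube uses names like `1 km 16 days NDVI`).

```python
import numpy as np
import pandas as pd
import xarray as xr

# Tiny synthetic cube mimicking the (time, y, x) layout of the Greece
# datacube; note that y decreases from north to south, as in the real cube.
times = pd.date_range("2009-03-06", periods=10)
ds = xr.Dataset(
    {"ndvi": (("time", "y", "x"), np.random.rand(10, 4, 5))},
    coords={
        "time": times,
        "y": np.linspace(41.6, 34.9, 4),   # latitude, decreasing
        "x": np.linspace(19.9, 28.2, 5),   # longitude, increasing
    },
)

# Select the closest available day, then crop a lon/lat window.
# Because y is decreasing, the y slice runs from the larger to the smaller value.
day = ds["ndvi"].sel(time="2009-03-08", method="nearest")
window = day.sel(x=slice(20.0, 25.0), y=slice(41.0, 36.0))
print(window.shape)  # → (2, 2)
```

The same `.sel` calls apply unchanged to the lazily loaded zarr cube; only the selected window is fetched from storage when its values are actually needed.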