{ "cells": [ { "cell_type": "markdown", "metadata": { "hidePrompt": true }, "source": [ "\n", "\n", "
\n", "
\n", "## This Notebook is Deprecated\n", "\n", "The OpenSARLab MintPy workflow has been moved into the [OpenSARLab_MintPy_Recipe_Book](https://github.com/ASFOpenSARlab/opensarlab_MintPy_Recipe_Book), which can be found on OpenSARLab at the path: `~/Data_Recipe_Jupyter_Books/opensarlab_MintPy_Recipe_Book`\n", "\n", "We will no longer support this workflow or the `osl_mintpy` environment that it uses.\n", "\n", "Hint: Once you have navigated to the Recipe Book, click the \"JB\" tab on the left-hand JupyterLab sidebar to access its table of contents.\n", "\n", "
\n", "\n", "# Preparing a HyP3 InSAR Stack for MintPy\n", "\n", "**Author**: Alex Lewandowski; University of Alaska Fairbanks\n", " \n", "Based on [prep_hyp3_for_mintpy.ipynb](https://github.com/ASFHyP3/hyp3-docs) by Jiang Zhu; University of Alaska Fairbanks\n", "\n", "\n", "\n", "**Important Note about JupyterHub**\n", "Your JupyterHub server will automatically shut down when left idle for more than 1 hour. Your notebooks will not be lost, but you will have to restart their kernels and re-run them from the beginning. You will not be able to seamlessly continue running a partially run notebook." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import url_widget as url_w\n", "notebookUrl = url_w.URLWidget()\n", "display(notebookUrl)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from IPython.display import Markdown\n", "from IPython.display import display\n", "\n", "notebookUrl = notebookUrl.value\n", "user = !echo $JUPYTERHUB_USER\n", "env = !echo $CONDA_PREFIX\n", "if env[0] == '':\n", " env[0] = 'Python 3 (base)'\n", "if env[0] != '/home/jovyan/.local/envs/osl_mintpy':\n", " display(Markdown('WARNING:'))\n", " display(Markdown('This notebook should be run using the \"osl_mintpy\" conda environment.'))\n", " display(Markdown(f'It is currently using the \"{env[0].split(\"/\")[-1]}\" environment.'))\n", " display(Markdown('Select the \"osl_mintpy\" environment from the \"Change Kernel\" submenu of the \"Kernel\" menu.'))\n", " display(Markdown('If the \"osl_mintpy\" environment is not present, use Create_OSL_Conda_Environments.ipynb to create it.'))\n", " display(Markdown('Note that you must restart your server after creating a new environment before it is usable by notebooks.'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 0. 
Importing Relevant Python Packages\n", "\n", "In this notebook we will use the following scientific libraries:\n", "- [GDAL](https://www.gdal.org/) is a software library for reading and writing raster and vector geospatial data formats. It includes a collection of programs tailored for geospatial data processing. Most modern GIS systems (such as ArcGIS or QGIS) use GDAL in the background.\n", "\n", "**Our first step is to import GDAL and other needed packages**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import copy\n", "from datetime import datetime, timedelta\n", "import ipywidgets as widgets\n", "from itertools import chain\n", "import json\n", "from pathlib import Path\n", "import re\n", "import requests\n", "import shutil\n", "from tqdm.auto import tqdm\n", "from typing import Union\n", "\n", "import numpy as np\n", "from osgeo import gdal\n", "gdal.UseExceptions()\n", "import pandas\n", "from rasterio.warp import transform_bounds\n", "\n", "import opensarlab_lib as asfn\n", "\n", "from hyp3_sdk import Batch, HyP3\n", "\n", "from IPython.display import clear_output\n", "\n", "%matplotlib widget" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 1. Load Your Own Data Stack Into the Notebook\n", "\n", "This notebook assumes that you are accessing an InSAR time series created using the [Alaska Satellite Facility's](https://www.asf.alaska.edu/) value-added product system HyP3, available via [ASF Data Search/Vertex](https://search.asf.alaska.edu/). HyP3 is an ASF service used to prototype value-added products and provide them to users to collect feedback.\n", "\n", "You can access HyP3 on-demand products from your HyP3 account or from publicly available, pre-processed SARVIEWS Event data. 
https://sarviews-hazards.alaska.edu/\n", "\n", "Before downloading anything, create an analysis directory to hold your data.\n", "\n", "**Select or create a working directory for the analysis:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "while True:\n", " print(f\"Current working directory: {Path.cwd()}\")\n", " data_dir = Path(input(f\"\\nPlease enter the name of a directory in which to store your data for this analysis.\"))\n", " if data_dir == Path('.'):\n", " continue\n", " if data_dir.is_dir():\n", " contents = data_dir.glob('*')\n", " if len(list(contents)) > 0:\n", " choice = asfn.handle_old_data(data_dir)\n", " if choice == 1:\n", " if data_dir.exists():\n", " shutil.rmtree(data_dir)\n", " data_dir.mkdir()\n", " break\n", " elif choice == 2:\n", " break\n", " else:\n", " clear_output()\n", " continue\n", " else:\n", " break\n", " else:\n", " data_dir.mkdir()\n", " break" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Define absolute path to analysis directory:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "analysis_directory = Path.cwd()/(data_dir)\n", "print(f\"analysis_directory: {analysis_directory}\")" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def dates_from_product_name(product_name):\n", " regex = \"[0-9]{8}T[0-9]{6}_[0-9]{8}T[0-9]{6}\"\n", " results = re.search(regex, product_name)\n", " if results:\n", " return results.group(0)\n", " else:\n", " return None" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Create a HyP3 object and authenticate:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "slideshow": { "slide_type": "subslide" } }, "outputs": [], "source": [ "hyp3 = HyP3(prompt=True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**List projects containing active InSAR products and select one:**\n", "\n", "Your HyP3 
InSAR project should include DEMs, which are available as options when submitting a HyP3 project." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "my_hyp3_info = hyp3.my_info()\n", "active_projects = dict()\n", "\n", "print(\"Checking all HyP3 projects for current INSAR_GAMMA jobs\")\n", "for project in tqdm(my_hyp3_info['job_names']):\n", " batch = hyp3.find_jobs(name=project, job_type='INSAR_GAMMA').filter_jobs(running=False, include_expired=False)\n", " if len(batch) > 0:\n", " active_projects.update({batch.jobs[0].name: batch})\n", "\n", "if len(active_projects) > 0:\n", " display(Markdown(\"Note: After selecting a project, you must select the next cell before hitting the 'Run' button or typing Shift/Enter.\"))\n", " display(Markdown(\"Otherwise, you will rerun this code cell.\"))\n", " print('\\nSelect a Project:')\n", " project_select = asfn.select_parameter(active_projects.keys())\n", " display(project_select)\n", "else:\n", " print(\"Found no active projects containing InSAR products\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Select a date range of products to download:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "jobs = active_projects[project_select.value]\n", "\n", "display(Markdown(\"Note: After selecting a date range, you should select the next cell before hitting the 'Run' button or typing Shift/Enter.\"))\n", "display(Markdown(\"Otherwise, you may simply rerun this code cell.\"))\n", "print('\\nSelect a Date Range:')\n", "dates = asfn.get_job_dates(jobs)\n", "date_picker = asfn.gui_date_picker(dates)\n", "display(date_picker)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Save the selected date range and remove products falling outside of it:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "date_range = 
asfn.get_slider_vals(date_picker)\n", "date_range[0] = date_range[0].date()\n", "date_range[1] = date_range[1].date()\n", "print(f\"Date Range: {str(date_range[0])} to {str(date_range[1])}\")\n", "jobs = asfn.filter_jobs_by_date(jobs, date_range)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Gather the available paths and orbit directions for the remaining products:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "display(Markdown(\"This may take some time for projects containing many jobs...\"))\n", "asfn.set_paths_orbits(jobs)\n", "paths = set()\n", "orbit_directions = set()\n", "for p in jobs:\n", " paths.add(p.path)\n", " orbit_directions.add(p.orbit_direction)\n", "display(Markdown(\"Done.\"))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "---\n", "**Select a path:**\n", "\n", "This notebook does not currently support merging InSAR products in multiple paths." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "display(Markdown(\"Note: After selecting a path, you must select the next cell before hitting the 'Run' button or typing Shift/Enter.\"))\n", "display(Markdown(\"Otherwise, you will simply rerun this code cell.\"))\n", "print('\\nSelect a Path:')\n", "path_choice = asfn.select_parameter(paths)\n", "display(path_choice)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Save the selected flight path:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "flight_path = path_choice.value\n", "if flight_path:\n", " print(f\"Flight Path: {flight_path}\")\n", "else:\n", " print(\"WARNING: You must select a flight path in the previous cell, then rerun this cell.\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Select an orbit direction:**" ] }, { "cell_type": "code", "execution_count": 
null, "metadata": {}, "outputs": [], "source": [ "if len(orbit_directions) > 1:\n", " display(Markdown(\"Note: After selecting a flight direction, you must select the next cell before hitting the 'Run' button or typing Shift/Enter.\"))\n", " display(Markdown(\"Otherwise, you will simply rerun this code cell.\"))\n", "print('\\nSelect a Flight Direction:')\n", "direction_choice = asfn.select_parameter(orbit_directions, 'Direction:')\n", "display(direction_choice)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Save the selected orbit direction:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "direction = direction_choice.value\n", "print(f\"Orbit Direction: {direction}\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Filter jobs by path and orbit direction:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "jobs = asfn.filter_jobs_by_path(jobs, [flight_path])\n", "jobs = asfn.filter_jobs_by_orbit(jobs, direction)\n", "print(f\"There are {len(jobs)} products to download.\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Download the products, unzip them into a directory named after the product type, and delete the zip files:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "tags": [] }, "outputs": [], "source": [ "print(f\"\\nProject: {jobs.jobs[0].name}\")\n", "project_zips = jobs.download_files(analysis_directory)\n", "for z in project_zips:\n", " asfn.asf_unzip(str(analysis_directory), str(z))\n", " z.unlink()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 2. 
Confirm Presence of a DEM, Azimuth Angle Map, and Incidence Angle Map\n", "\n", "- DEMs and incidence angle maps are optional add-on products for HyP3, and they are necessary for MintPy.\n", " - Incidence angle maps are included with HyP3 jobs when the `Include Look Vectors` option is selected.\n", " - DEMs are included with HyP3 jobs when the `Include DEM` option is selected.\n", "- Azimuth angle maps are an optional add-on product for HyP3, necessary for MintPy only if running the correct_SET (Solid Earth Tides) step.\n", " - Azimuth angle maps are included with HyP3 jobs when the `Include Look Vectors` option is selected.\n", "\n", "**All of the above-mentioned files will be included in an InSAR project if Set MintPy Options is selected when adding InSAR jobs to a project in ASF Data Search (Vertex)**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "dems = list(analysis_directory.glob('*/*dem*.tif'))\n", "az_angle_maps = list(analysis_directory.glob('*/*lv_phi*.tif'))\n", "inc_angle_maps = list(analysis_directory.glob('*/*lv_theta*.tif'))\n", "\n", "if len(dems) > 0:\n", " print(\"Success: Found at least 1 DEM.\")\n", "else:\n", " raise FileNotFoundError(\"Failed to find at least 1 DEM. \\\n", " \\nYou will not be able to successfully run a MintPy time-series unless you reorder your HyP3 project \\\n", "with DEMs or provide one from another source.\")\n", " \n", "if len(az_angle_maps) > 0:\n", " print(\"Success: Found at least 1 Azimuth Angle Map.\")\n", "else:\n", " raise FileNotFoundError(\"Failed to find at least 1 Azimuth Angle Map. \\\n", " \\nYou will not be able to successfully run a MintPy time-series unless you reorder your HyP3 project \\\n", "with the 'Include Look Vectors' option selected.\")\n", " \n", "if len(inc_angle_maps) > 0:\n", " print(\"Success: Found at least 1 Incidence Angle Map.\")\n", "else:\n", " raise FileNotFoundError(\"Failed to find at least 1 Incidence Angle Map. 
\\\n", " \\nYou will not be able to successfully run a MintPy time-series unless you reorder your HyP3 project \\\n", "with the 'Include Inc. Angle Map' option selected.\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 3. Subset the Stack and Clean Up Unused Files\n", "\n", "**Delete unneeded files:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "for pattern in [\"xml\",\"png\",\"kmz\",\"md.txt\"]:\n", " unneeded_files = analysis_directory.glob(f\"*/*.{pattern}\")\n", " for file in unneeded_files:\n", " file.unlink()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Project all tiffs to the predominant UTM:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from typing import List, Union, Dict\n", "from collections import Counter\n", "\n", "def get_projection(img_path: Union[Path, str]) -> Union[str, None]:\n", " \"\"\"\n", " Takes: a string or posix path to a product in a UTM projection\n", "\n", " Returns: the projection (as a string) or None if none found\n", " \"\"\"\n", " img_path = str(img_path)\n", " try:\n", " info = gdal.Info(img_path, format='json')['coordinateSystem']['wkt']\n", " except KeyError:\n", " return None\n", " except TypeError:\n", " raise FileNotFoundError(img_path)\n", "\n", " regex = 'ID\[\"EPSG\",[0-9]{4,5}\]\]$'\n", " results = re.search(regex, info)\n", " if results:\n", " return results.group(0).split(',')[1][:-2]\n", " else:\n", " return None\n", "\n", "def get_projections(tiff_paths: List[Union[Path, str]]) -> Dict:\n", " \"\"\"\n", " Takes: List of string or posix paths to geotiffs\n", " \n", " Returns: Dictionary key: epsg, value: number of tiffs in that epsg \n", " \"\"\"\n", " epsgs = []\n", " for p in tiff_paths:\n", " epsgs.append(get_projection(p))\n", "\n", " epsgs = dict(Counter(epsgs))\n", " return epsgs\n", "\n", "def get_res(tiff):\n", " tiff = str(tiff)\n", " f = gdal.Open(tiff)\n", " return 
f.GetGeoTransform()[1] \n", "\n", "def get_no_data_val(pth):\n", " pth = str(pth)\n", " f = gdal.Open(pth)\n", " if f.GetRasterBand(1).DataType > 5:\n", " no_data_val = f.GetRasterBand(1).GetNoDataValue()\n", " return np.nan if no_data_val is None else no_data_val\n", " else:\n", " return 0" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "fnames = list(analysis_directory.glob('*/*.tif'))\n", "fnames.sort()\n", "epsgs = get_projections(fnames)\n", "predominant_epsg = None if len(epsgs) == 1 else max(epsgs, key=epsgs.get)\n", "\n", "if predominant_epsg:\n", " for pth in fnames:\n", " src_SRS = get_projection(str(pth))\n", " if src_SRS != predominant_epsg:\n", " res = get_res(pth)\n", " no_data_val = get_no_data_val(pth)\n", " \n", " temp = pth.parent/f\"temp_{pth.stem}.tif\"\n", " pth.rename(temp)\n", "\n", " warp_options = {\n", " \"dstSRS\":f\"EPSG:{predominant_epsg}\", \"srcSRS\":f\"EPSG:{src_SRS}\",\n", " \"targetAlignedPixels\":True,\n", " \"xRes\":res, \"yRes\":res,\n", " \"dstNodata\": no_data_val\n", " }\n", " gdal.Warp(str(pth), str(temp), **warp_options)\n", " temp.unlink()" ] }, { "cell_type": "markdown", "metadata": { "hideOutput": true }, "source": [ "**Determine the maximum and common extents of the stack and plot an Area-of-Interest Selector:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "amp = list(analysis_directory.glob('*/*_amp*.tif'))\n", "max_extents = asfn.get_max_extents(amp)\n", "xmin, ymin, xmax, ymax = transform_bounds(int(asfn.get_projection(str(amp[0]))), 3857, *max_extents)\n", "max_extents = [xmin, ymin, xmax, ymax]\n", "\n", "common_extents = asfn.get_common_coverage_extents(amp)\n", "xmin, ymin, xmax, ymax = transform_bounds(int(asfn.get_projection(str(amp[0]))), 3857, *common_extents)\n", "common_extents = [xmin, ymin, xmax, ymax]\n", "\n", "print(max_extents)\n", 
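"\n", "# Added sanity check (not in the original notebook): both extent lists are\n", "# [xmin, ymin, xmax, ymax] in Web Mercator (EPSG:3857), and the common\n", "# (intersection) coverage should always fall inside the maximum (union) extents.\n", "assert max_extents[0] <= common_extents[0] and common_extents[2] <= max_extents[2]\n", "assert max_extents[1] <= common_extents[1] and common_extents[3] <= max_extents[3]\n", 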
"print(common_extents)\n", "\n", "aoi = asfn.AOI_Selector(max_extents, common_extents, figsize=(10, 8))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Convert the subset corner coordinates from Web Mercator back to the input data's EPSG:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "try:\n", " xmin, ymin, xmax, ymax = transform_bounds(3857, \n", " int(asfn.get_projection(str(amp[0]))), \n", " *[aoi.x1, aoi.y1, aoi.x2, aoi.y2])\n", " ul = [xmin, ymax]\n", " lr = [xmax, ymin]\n", " print(\"AOI Corner Coordinates:\")\n", " print(f\"upper left corner: {ul}\")\n", " print(f\"lower right corner: {lr}\")\n", "except TypeError:\n", " print('TypeError')\n", " display(Markdown('This error may occur if an AOI was not selected.'))\n", " display(Markdown('Note that the square tool icon in the AOI selector menu is NOT the selection tool. It is the zoom tool.'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Crop the stack to the AOI and reproject to lat-lon:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "fnames = list(analysis_directory.glob('*/*.tif'))\n", "fnames.sort()\n", "\n", "for fname in fnames:\n", " clip = fname.parent/f\"{fname.stem}_clip.tif\"\n", " gdal.Translate(destName=str(clip), srcDS=str(fname), projWin=[ul[0], ul[1], lr[0], lr[1]])\n", " gdal.Warp(str(clip), str(clip), dstSRS='EPSG:4326', dstNodata=0)\n", " fname.unlink()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Remove any subset scenes containing no data:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "fnames = list(analysis_directory.glob('*/*.tif'))\n", "fnames = [str(f) for f in fnames]\n", "fnames.sort()\n", "\n", "removed = []\n", "for f in fnames:\n", " if \"dem\" not in f:\n", " raster = gdal.Open(f)\n", " if raster:\n", " band = raster.ReadAsArray()\n", " if 
np.count_nonzero(band) < 1:\n", " Path(f).unlink()\n", " removed.append(f)\n", "\n", "if len(removed) == 0:\n", " print(\"No GeoTiffs were removed\")\n", "else:\n", " print(f\"{len(removed)} GeoTiffs removed:\")\n", " for f in removed:\n", " print(f)" ] }, { "cell_type": "markdown", "metadata": { "tags": [] }, "source": [ "## 4. Proceed to the MintPy Time-Series Notebook\n", "\n", "**Run the code cell below for a link to the MintPy_Time_Series_From_Prepared_Data_Stack notebook:**" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from IPython.display import display, HTML\n", "\n", "current = Path.cwd()\n", "abs_path = current/\"MintPy_Time_Series_From_Prepared_Data_Stack.ipynb\"\n", "relative_path = abs_path.relative_to(current)\n", "\n", "link_t = f\"<a href='{relative_path}'>Open MintPy_Time_Series_From_Prepared_Data_Stack.ipynb</a> to run an InSAR time-series analysis on your data using MintPy\"\n", "html = HTML(link_t)\n", "display(html)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "*Prepare_HyP3_InSAR_Stack_for_MintPy.ipynb - Version 1.2.5 - April 2024*\n", " \n", "*Version Changes*\n", "\n", "- *Deprecate notebook*" ] } ], "metadata": { "kernelspec": { "display_name": "Python [conda env:.local-osl_mintpy]", "language": "python", "name": "conda-env-.local-osl_mintpy-py" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.9.0" } }, "nbformat": 4, "nbformat_minor": 4 }