{ "cells": [ { "cell_type": "markdown", "id": "d20494ef", "metadata": {}, "source": [ "# Time series analysis with HyP3 and MintPy\n", "\n", "This notebook walks through performing a time-series analysis of the 2019 Ridgecrest, CA earthquake with On Demand InSAR products from the Alaska Satellite facility and MintPy. We'll:\n", "\n", "1. Use the [ASF Search Python package](https://docs.asf.alaska.edu/asf_search/basics/) to:\n", " - Search ASF's catalog for Sentinel-1 SAR products covering the [Ridgecrest earthquake](https://earthquake.usgs.gov/storymap/index-ridgecrest.html)\n", " - Select a reference scene to generate a baseline stack\n", " - Select a [short baseline subset (SBAS)](https://docs.asf.alaska.edu/vertex/sbas/) of scene pairs for InSAR processing\n", "\n", "\n", "2. Use the [HyP3 Python SDK](https://hyp3-docs.asf.alaska.edu/using/sdk/) to:\n", " - Request On Demand InSAR products from ASF HyP3\n", " - Download the InSAR products when they are done processing\n", "\n", "\n", "3. Use [GDAL](https://gdal.org/api/index.html#python-api) and [MintPy](https://mintpy.readthedocs.io/en/latest/) to:\n", " - Prepare the InSAR products for MintPy\n", " - perform a time-series analysis with MintPy\n", " \n", "---\n", "\n", "**Note:** This notebook does assume you have some familiarity with InSAR processing with MintPy already, and is a minimal example without much context or explanations. 
If you're new to InSAR and MintPy, we suggest checking out:\n", "* our [InSAR on Demand Story Map](https://storymaps.arcgis.com/stories/68a8a3253900411185ae9eb6bb5283d3)\n", "\n", "\n", "* [OpenSARlab's](https://opensarlab-docs.asf.alaska.edu/) highly detailed walkthrough of using HyP3 + MintPy via these notebooks:\n", " * [Prepare a HyP3 InSAR Stack for MintPy](https://nbviewer.org/github/ASFOpenSARlab/opensarlab-notebooks/blob/master/SAR_Training/English/Master/Prepare_HyP3_InSAR_Stack_for_MintPy.ipynb)\n", " * [MintPy Time-series Analysis](https://nbviewer.org/github/ASFOpenSARlab/opensarlab-notebooks/blob/master/SAR_Training/English/Master/MintPy_Time_Series_From_Prepared_Data_Stack.ipynb)\n", " \n", " Note: While these notebooks assume you're working in OpenSARlab, you can run these \n", " notebooks outside OpenSARlab by creating [this conda environment](https://github.com/ASFOpenSARlab/opensarlab-envs/blob/main/Environment_Configs/insar_analysis_env.yml)." ] }, { "cell_type": "markdown", "id": "b0a6e353", "metadata": {}, "source": [ "## 0. Initial Setup\n", "\n", "To run this notebook, you'll need a conda environment with the required dependencies. 
You can set up a new environment (recommended) and start the Jupyter server like this:\n", "```shell\n", "conda create -n hyp3-mintpy python=3.10 asf_search hyp3_sdk \"mintpy>=1.5.2\" pandas jupyter ipympl\n", "\n", "conda activate hyp3-mintpy\n", "jupyter notebook hyp3_insar_stack_for_ts_analysis.ipynb\n", "```\n", "Or, install these dependencies into your own environment:\n", "```shell\n", "conda install python=3.10 asf_search hyp3_sdk \"mintpy>=1.5.2\" pandas jupyter ipympl\n", "\n", "jupyter notebook hyp3_insar_stack_for_ts_analysis.ipynb\n", "```" ] }, { "cell_type": "code", "execution_count": null, "id": "64e566f3", "metadata": {}, "outputs": [], "source": [ "\n", "from pathlib import Path\n", "\n", "from dateutil.parser import parse as parse_date\n" ] }, { "cell_type": "markdown", "id": "33543149", "metadata": {}, "source": [ "### Set parameters" ] }, { "cell_type": "code", "execution_count": null, "id": "7fa17bd0", "metadata": {}, "outputs": [], "source": [ "\n", "project_name = '2019_ridgecrest'\n", "work_dir = Path.cwd() / project_name\n", "data_dir = work_dir / 'data'\n", "\n", "stack_start = parse_date('2019-06-10 00:00:00Z')\n", "stack_end = parse_date('2019-07-21 00:00:00Z')\n", "max_temporal_baseline = 13  # days\n", "\n", "data_dir.mkdir(parents=True, exist_ok=True)\n" ] }, { "cell_type": "markdown", "id": "0e62c544", "metadata": {}, "source": [ "## 1. 
Select InSAR pairs with ASF Search" ] }, { "cell_type": "code", "execution_count": null, "id": "4498c66c", "metadata": {}, "outputs": [], "source": [ "\n", "import asf_search as asf\n", "import pandas as pd\n", "\n", "search_results = asf.geo_search(\n", " platform=asf.SENTINEL1,\n", " intersectsWith='POINT(-117.599330 35.769500)',\n", " start='2019-06-10',\n", " end='2019-07-21',\n", " processingLevel=asf.SLC,\n", " beamMode=asf.IW,\n", " flightDirection=asf.ASCENDING,\n", " )\n" ] }, { "cell_type": "code", "execution_count": null, "id": "9f7ccdcc", "metadata": {}, "outputs": [], "source": [ "\n", "baseline_results = asf.baseline_search.stack_from_product(search_results[-1])\n", "\n", "columns = list(baseline_results[0].properties.keys()) + ['geometry', ]\n", "data = [list(scene.properties.values()) + [scene.geometry, ] for scene in baseline_results]\n", "\n", "stack = pd.DataFrame(data, columns=columns)\n", "stack['startTime'] = stack.startTime.apply(parse_date)\n", "\n", "stack = stack.loc[(stack_start <= stack.startTime) & (stack.startTime <= stack_end)]\n" ] }, { "cell_type": "code", "execution_count": null, "id": "e8d9f38b", "metadata": {}, "outputs": [], "source": [ "\n", "sbas_pairs = set()\n", "\n", "for reference, rt in stack.loc[::-1, ['sceneName', 'temporalBaseline']].itertuples(index=False):\n", " secondaries = stack.loc[\n", " (stack.sceneName != reference)\n", " & (stack.temporalBaseline - rt <= max_temporal_baseline)\n", " & (stack.temporalBaseline - rt > 0)\n", " ]\n", " for secondary in secondaries.sceneName:\n", " sbas_pairs.add((reference, secondary))\n" ] }, { "cell_type": "markdown", "id": "9e5c5b0b", "metadata": {}, "source": [ "## 2. Request On Demand InSAR products from ASF HyP3\n", "\n", "Use your [NASA Earthdata login](https://urs.earthdata.nasa.gov/) to connect to [ASF HyP3](https://hyp3-docs.asf.alaska.edu/)." 
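, "\n", "\n", "If you'd rather not be prompted for credentials each time, `hyp3_sdk` can also pick them up from a `~/.netrc` file containing a line like the one below (the username and password are placeholders for your own Earthdata credentials):\n", "```\n", "machine urs.earthdata.nasa.gov login <username> password <password>\n", "```"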
] }, { "cell_type": "code", "execution_count": null, "id": "be78f415", "metadata": {}, "outputs": [], "source": [ "\n", "import hyp3_sdk as sdk\n", "\n", "hyp3 = sdk.HyP3(prompt=True)\n" ] }, { "cell_type": "code", "execution_count": null, "id": "e1dec3dd", "metadata": {}, "outputs": [], "source": [ "\n", "jobs = sdk.Batch()\n", "for reference, secondary in sbas_pairs:\n", " jobs += hyp3.submit_insar_job(reference, secondary, name=project_name,\n", " include_dem=True, include_look_vectors=True)\n" ] }, { "cell_type": "code", "execution_count": null, "id": "22b82d81", "metadata": {}, "outputs": [], "source": [ "\n", "jobs = hyp3.watch(jobs)\n" ] }, { "cell_type": "code", "execution_count": null, "id": "a3b3d2b3", "metadata": {}, "outputs": [], "source": [ "\n", "jobs = hyp3.find_jobs(name=project_name)\n" ] }, { "cell_type": "code", "execution_count": null, "id": "4b461cf0", "metadata": {}, "outputs": [], "source": [ "\n", "insar_products = jobs.download_files(data_dir)\n", "insar_products = [sdk.util.extract_zipped_product(ii) for ii in insar_products]\n" ] }, { "cell_type": "markdown", "id": "3181031f", "metadata": {}, "source": [ "## 3. 
Time-series Analysis with MintPy" ] }, { "cell_type": "markdown", "id": "3cd1b850", "metadata": {}, "source": [ "### 3.1 Subset all GeoTIFFs to their common overlap" ] }, { "cell_type": "code", "execution_count": null, "id": "31f75d5c", "metadata": {}, "outputs": [], "source": [ "from pathlib import Path\n", "from typing import List, Union\n", "from osgeo import gdal\n", "\n", "\n", "def get_common_overlap(file_list: List[Union[str, Path]]) -> List[float]:\n", " \"\"\"Get the common overlap of a list of GeoTIFF files\n", " \n", " Args:\n", " file_list: a list of GeoTIFF files\n", " \n", " Returns:\n", " [ulx, uly, lrx, lry], the upper-left x, upper-left y, lower-right x, and lower-right y\n", " corner coordinates of the common overlap\n", " \"\"\"\n", " \n", " corners = [gdal.Info(str(dem), format='json')['cornerCoordinates'] for dem in file_list]\n", "\n", " ulx = max(corner['upperLeft'][0] for corner in corners)\n", " uly = min(corner['upperLeft'][1] for corner in corners)\n", " lrx = min(corner['lowerRight'][0] for corner in corners)\n", " lry = max(corner['lowerRight'][1] for corner in corners)\n", " return [ulx, uly, lrx, lry]\n" ] }, { "cell_type": "code", "execution_count": null, "id": "43c55f08", "metadata": {}, "outputs": [], "source": [ "\n", "files = data_dir.glob('*/*_dem.tif')\n", "\n", "overlap = get_common_overlap(files)\n" ] }, { "cell_type": "code", "execution_count": null, "id": "20d94460", "metadata": {}, "outputs": [], "source": [ "\n", "from pathlib import Path\n", "from typing import List, Union\n", "\n", "def clip_hyp3_products_to_common_overlap(data_dir: Union[str, Path], overlap: List[float]) -> None:\n", " \"\"\"Clip all GeoTIFF files to their common overlap\n", " \n", " Args:\n", " data_dir:\n", " directory containing the GeoTIFF files to clip\n", " overlap:\n", " a list of the upper-left x, upper-left y, lower-right x, and lower-right y\n", " corner coordinates of the common overlap\n", " Returns: None\n", " \"\"\"\n", "\n", " \n", " 
files_for_mintpy = ['_water_mask.tif', '_corr.tif', '_unw_phase.tif', '_dem.tif', '_lv_theta.tif', '_lv_phi.tif']\n", "\n", " for extension in files_for_mintpy:\n", "\n", " for file in data_dir.rglob(f'*{extension}'):\n", "\n", " dst_file = file.parent / f'{file.stem}_clipped{file.suffix}'\n", "\n", " gdal.Translate(destName=str(dst_file), srcDS=str(file), projWin=overlap)\n" ] }, { "cell_type": "code", "execution_count": null, "id": "023ca045", "metadata": {}, "outputs": [], "source": [ "\n", "clip_hyp3_products_to_common_overlap(data_dir, overlap)\n" ] }, { "cell_type": "markdown", "id": "92be356f", "metadata": {}, "source": [ "### 3.2 Create the MintPy config file" ] }, { "cell_type": "code", "execution_count": null, "id": "9c3cf17d", "metadata": {}, "outputs": [], "source": [ "\n", "mintpy_config = work_dir / 'mintpy_config.txt'\n", "mintpy_config.write_text(\n", "f\"\"\"\n", "mintpy.load.processor = hyp3\n", "##---------interferogram datasets\n", "mintpy.load.unwFile = {data_dir}/*/*_unw_phase_clipped.tif\n", "mintpy.load.corFile = {data_dir}/*/*_corr_clipped.tif\n", "##---------geometry datasets:\n", "mintpy.load.demFile = {data_dir}/*/*_dem_clipped.tif\n", "mintpy.load.incAngleFile = {data_dir}/*/*_lv_theta_clipped.tif\n", "mintpy.load.azAngleFile = {data_dir}/*/*_lv_phi_clipped.tif\n", "mintpy.load.waterMaskFile = {data_dir}/*/*_water_mask_clipped.tif\n", "mintpy.troposphericDelay.method = no\n", "\"\"\")\n" ] }, { "cell_type": "markdown", "id": "87385631", "metadata": {}, "source": [ "### 3.3 Run MintPy to perform the time-series analysis" ] }, { "cell_type": "code", "execution_count": null, "id": "a012c642", "metadata": {}, "outputs": [], "source": [ "\n", "!smallbaselineApp.py --dir {work_dir} {mintpy_config}\n" ] }, { "cell_type": "code", "execution_count": null, "id": "3e866ac9", "metadata": {}, "outputs": [], "source": [ "\n", "%matplotlib widget\n", "from mintpy.cli import view, tsview\n" ] }, { "cell_type": "code", "execution_count": null, "id": 
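"5f3c9a1e", "metadata": {}, "outputs": [], "source": [ "\n", "# Optional sanity check (not part of the original workflow): list the HDF5\n", "# products smallbaselineApp.py wrote to work_dir before plotting them below.\n", "for h5_file in sorted(work_dir.glob('*.h5')):\n", "    print(h5_file.name)\n" ] }, { "cell_type": "code", "execution_count": null, "id": 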
"acf5cfdc", "metadata": {}, "outputs": [], "source": [ "\n", "view.main([f'{work_dir}/velocity.h5'])\n" ] }, { "cell_type": "code", "execution_count": null, "id": "b6172aa1", "metadata": {}, "outputs": [], "source": [ "\n", "tsview.main([f'{work_dir}/timeseries.h5'])\n" ] }, { "cell_type": "code", "execution_count": null, "id": "8a858c62", "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.10.12" } }, "nbformat": 4, "nbformat_minor": 5 }