{ "cells": [ { "cell_type": "markdown", "metadata": { "collapsed": true }, "source": [ "# Using the HyP3 SDK for Python\n", "\n", "[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/ASFHyP3/hyp3-sdk/main?filepath=docs%2Fsdk_example.ipynb)\n", "\n", "HyP3's Python SDK `hyp3_sdk` provides a convenience wrapper around the HyP3 API and HyP3 jobs.\n", "\n", "\n", "The HyP3 SDK can be installed using [Anaconda/Miniconda](https://docs.conda.io/projects/conda/en/latest/user-guide/install/download.html#anaconda-or-miniconda)\n", " (recommended) via [`conda`](https://anaconda.org/conda-forge/hyp3_sdk):\n", "\n", "```\n", "conda install -c conda-forge hyp3_sdk\n", "```\n", "\n", "Or using [`pip`](https://pypi.org/project/hyp3-sdk/):\n", "\n", "```\n", "python -m pip install hyp3_sdk\n", "```\n", "\n", "Full documentation of the SDK can be found in the [HyP3 documentation](https://hyp3-docs.asf.alaska.edu/using/sdk/).\n", "\n", "### Other recommended packages\n", "\n", "We also recommend installing the `asf_search` Python package for performing searches of the ASF catalog. The ASF Search\n", "Python package can be installed using [Anaconda/Miniconda](https://docs.conda.io/projects/conda/en/latest/user-guide/install/download.html#anaconda-or-miniconda)\n", "(recommended) via [`conda`](https://anaconda.org/conda-forge/asf_search):\n", "\n", "```\n", "conda install -c conda-forge asf_search\n", "```\n", "\n", "Or using [`pip`](https://pypi.org/project/asf-search):\n", "\n", "```\n", "python -m pip install asf_search\n", "```\n", "\n", "Full documentation of `asf_search` can be found in the [ASF search documentation](https://docs.asf.alaska.edu/asf_search/basics)." 
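, "\n", "\n", "To confirm both packages are importable before continuing, you can run a quick check. This is a minimal sketch using only the Python standard library; the `is_installed` helper is our own, not part of either package:\n", "\n", "```python\n", "import importlib.util\n", "\n", "def is_installed(package: str) -> bool:\n", "    \"\"\"Return True if the package can be imported in this environment.\"\"\"\n", "    return importlib.util.find_spec(package) is not None\n", "\n", "for pkg in ('hyp3_sdk', 'asf_search'):\n", "    print(pkg, 'found' if is_installed(pkg) else 'missing')\n", "```"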
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# initial setup\n", "import asf_search as asf\n", "import hyp3_sdk as sdk" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Authenticating to the API\n", "\n", "The SDK will attempt to pull your [NASA Earthdata Login](https://urs.earthdata.nasa.gov/) credentials out of `~/.netrc`\n", "by default, or you can pass your credentials in directly" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# .netrc\n", "hyp3 = sdk.HyP3()" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# or enter your credentials\n", "hyp3 = sdk.HyP3(prompt=True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Submitting jobs\n", "\n", "The SDK provides a submit method for [all supported job types](https://hyp3-docs.asf.alaska.edu/products/).\n", "\n", "### Submitting Sentinel-1 RTC jobs\n", "\n", "Sentinel-1 Radiometric Terrain Correction (RTC) jobs are submitted using [ESA granule IDs](https://sentinel.esa.int/web/sentinel/user-guides/sentinel-1-sar/naming-conventions).\n", "The example granules below can be viewed in [ASF Search here](https://search.asf.alaska.edu/#/?zoom=7.08772014623877¢er=-141.733866,58.498008&resultsLoaded=true&granule=S1A_IW_SLC__1SDV_20210214T154835_20210214T154901_036588_044C54_8494-SLC&searchType=List%20Search&searchList=S1A_IW_SLC__1SDV_20210214T154835_20210214T154901_036588_044C54_8494-SLC,S1B_IW_SLC__1SDV_20210210T153131_20210210T153159_025546_030B48_B568-SLC,S1A_IW_SLC__1SDV_20210210T025526_20210210T025553_036522_0449E2_7769-SLC,S1A_IW_SLC__1SDV_20210210T025501_20210210T025528_036522_0449E2_3917-SLC,S1B_IW_SLC__1SDV_20210209T030255_20210209T030323_025524_030A8D_7E88-SLC,S1B_IW_SLC__1SDV_20210209T030227_20210209T030257_025524_030A8D_5BAF-SLC,S1A_IW_SLC__1SDV_20210202T154835_20210202T154902_036413_044634_01A1-SLC)." 
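, "\n", "\n", "As a side note, the granule ID itself encodes the acquisition window. For illustration only (this `granule_start_time` helper is our own, not part of the SDK, and assumes the fixed field positions from the ESA naming convention linked above):\n", "\n", "```python\n", "from datetime import datetime\n", "\n", "def granule_start_time(granule: str) -> datetime:\n", "    \"\"\"Parse the acquisition start time from a Sentinel-1 granule ID.\"\"\"\n", "    # the start timestamp occupies characters 17-31 of the granule ID\n", "    return datetime.strptime(granule[17:32], '%Y%m%dT%H%M%S')\n", "\n", "print(granule_start_time('S1A_IW_SLC__1SDV_20210214T154835_20210214T154901_036588_044C54_8494'))\n", "```"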
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "granules = [\n", "    'S1A_IW_SLC__1SDV_20210214T154835_20210214T154901_036588_044C54_8494',\n", "    'S1B_IW_SLC__1SDV_20210210T153131_20210210T153159_025546_030B48_B568',\n", "    'S1A_IW_SLC__1SDV_20210210T025526_20210210T025553_036522_0449E2_7769',\n", "    'S1A_IW_SLC__1SDV_20210210T025501_20210210T025528_036522_0449E2_3917',\n", "    'S1B_IW_SLC__1SDV_20210209T030255_20210209T030323_025524_030A8D_7E88',\n", "    'S1B_IW_SLC__1SDV_20210209T030227_20210209T030257_025524_030A8D_5BAF',\n", "    'S1A_IW_SLC__1SDV_20210202T154835_20210202T154902_036413_044634_01A1',\n", "]\n", "\n", "rtc_jobs = sdk.Batch()\n", "for g in granules:\n", "    rtc_jobs += hyp3.submit_rtc_job(g, name='rtc-example')\n", "print(rtc_jobs)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here we've given each job the name `rtc-example`, which we can use later to search for these jobs.\n", "\n", "`HyP3.submit_rtc_job` also accepts\n", "[keyword arguments](https://hyp3-docs.asf.alaska.edu/using/sdk_api/#hyp3_sdk.hyp3.HyP3.submit_rtc_job)\n", "to customize the RTC products to your application." ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "### Submitting Sentinel-1 InSAR jobs\n", "\n", "The SDK can also submit Sentinel-1 Interferometric Synthetic Aperture Radar (InSAR) jobs, which process\n", "reference and secondary granule pairs.\n", "\n", "For a particular reference granule, we may want to use the nearest and next-nearest temporal neighbor granules as secondary\n", "scenes. 
To programmatically find our secondary granules for a reference granule, we'll define a `get_nearest_neighbors`\n", "function that uses the [baseline stack](https://docs.asf.alaska.edu/asf_search/ASFProduct/#stack) method from `asf_search`:" ] }, { "cell_type": "code", "execution_count": null, "outputs": [], "source": [ "from typing import Optional\n", "\n", "def get_nearest_neighbors(granule: str, max_neighbors: Optional[int] = None) -> asf.ASFSearchResults:\n", "    # look up the full metadata for the reference granule\n", "    granule = asf.granule_search(granule)[-1]\n", "    # keep only scenes acquired before the reference (negative temporal baseline), nearest-first\n", "    stack = reversed([item for item in granule.stack() if item.properties['temporalBaseline'] < 0])\n", "    return asf.ASFSearchResults(stack)[:max_neighbors]" ], "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } } }, { "cell_type": "markdown", "source": [ "Now, using the example granule list for our RTC jobs as the reference scenes, we can find their nearest and next-nearest neighbor granules and submit them\n", "as pairs for InSAR processing." ], "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } } }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "from tqdm.auto import tqdm  # For a nice progress bar: https://github.com/tqdm/tqdm#ipython-jupyter-integration\n", "\n", "insar_jobs = sdk.Batch()\n", "for reference in tqdm(granules):\n", "    neighbors = get_nearest_neighbors(reference, max_neighbors=2)\n", "    for secondary in neighbors:\n", "        insar_jobs += hyp3.submit_insar_job(reference, secondary.properties['sceneName'], name='insar-example')\n", "print(insar_jobs)" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "Like RTC jobs, `HyP3.submit_insar_job` accepts\n", "[keyword arguments](https://hyp3-docs.asf.alaska.edu/using/sdk_api/#hyp3_sdk.hyp3.HyP3.submit_insar_job)\n", "to customize the InSAR products to your application." 
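, "\n", "\n", "The selection inside `get_nearest_neighbors` can also be sketched in plain Python on made-up stack data (the scene names and baselines below are hypothetical): scenes with a negative temporal baseline were acquired before the reference, and the two closest to zero are the nearest and next-nearest neighbors:\n", "\n", "```python\n", "stack = [\n", "    {'sceneName': 'scene_a', 'temporalBaseline': -24},\n", "    {'sceneName': 'scene_b', 'temporalBaseline': -12},\n", "    {'sceneName': 'scene_c', 'temporalBaseline': 0},  # the reference itself\n", "    {'sceneName': 'scene_d', 'temporalBaseline': 12},\n", "]\n", "# earlier scenes only, ordered nearest-first, truncated to two neighbors\n", "earlier = [item for item in stack if item['temporalBaseline'] < 0]\n", "neighbors = sorted(earlier, key=lambda item: item['temporalBaseline'], reverse=True)[:2]\n", "print([item['sceneName'] for item in neighbors])\n", "```"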
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Submitting autoRIFT jobs\n", "\n", "AutoRIFT supports processing Sentinel-1, Sentinel-2, or Landsat-8 Collection 2 pairs.\n", "* Sentinel-1 jobs are submitted using [ESA granule IDs](https://sentinel.esa.int/web/sentinel/user-guides/sentinel-1-sar/naming-conventions)\n", "* Sentinel-2 jobs are submitted using [ESA granule IDs](https://sentinel.esa.int/web/sentinel/user-guides/sentinel-2-msi/naming-convention)\n", "* Landsat-8 Collection 2 jobs are submitted using [USGS scene IDs](https://www.usgs.gov/faqs/what-naming-convention-landsat-collection-2-level-1-and-level-2-scenes?qt-news_science_products=0#qt-news_science_products)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "autorift_pairs = [\n", "    # Sentinel-1 ESA granule IDs\n", "    ('S1A_IW_SLC__1SSH_20170221T204710_20170221T204737_015387_0193F6_AB07',\n", "     'S1B_IW_SLC__1SSH_20170227T204628_20170227T204655_004491_007D11_6654'),\n", "    # Sentinel-2 ESA granule IDs\n", "    ('S2B_MSIL1C_20200612T150759_N0209_R025_T22WEB_20200612T184700',\n", "     'S2A_MSIL1C_20200627T150921_N0209_R025_T22WEB_20200627T170912'),\n", "    # Landsat-8 Collection 2 USGS scene IDs\n", "    ('LC08_L1TP_009011_20200703_20200913_02_T1',\n", "     'LC08_L1TP_009011_20200820_20200905_02_T1'),\n", "]\n", "\n", "autorift_jobs = sdk.Batch()\n", "for reference, secondary in autorift_pairs:\n", "    autorift_jobs += hyp3.submit_autorift_job(reference, secondary, name='autorift-example')\n", "print(autorift_jobs)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "AutoRIFT does not currently accept any keyword arguments for product customization.\n", "\n", "## Monitoring jobs\n", "\n", "Once jobs are submitted, you can either watch the jobs until they finish" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "rtc_jobs = hyp3.watch(rtc_jobs)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "which will require you to 
keep the cell/terminal running, or you can come back later and search for jobs" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "rtc_jobs = hyp3.find_jobs(name='rtc-example')\n", "rtc_jobs = hyp3.watch(rtc_jobs)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Downloading files\n", "\n", "Batches are collections of jobs. A batch provides a snapshot of each job's status at the time the batch was created or last\n", "refreshed. To get updated information on a batch" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "print(insar_jobs)\n", "insar_jobs = hyp3.refresh(insar_jobs)\n", "print(insar_jobs)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "`hyp3.watch()` will return a refreshed batch once every job in the batch has completed.\n", "\n", "Batches can be added together" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "print(f'Number of Jobs:\\n RTC:{len(rtc_jobs)}\\n InSAR:{len(insar_jobs)}\\n autoRIFT:{len(autorift_jobs)}')\n", "all_jobs = rtc_jobs + insar_jobs + autorift_jobs\n", "print(f'Total number of Jobs: {len(all_jobs)}')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can check the status of a batch (at last refresh) by printing the batch" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "print(all_jobs)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "and filter jobs by status" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "succeeded_jobs = all_jobs.filter_jobs(succeeded=True, running=False, failed=False)\n", "print(f'Number of succeeded jobs: {len(succeeded_jobs)}')\n", "failed_jobs = all_jobs.filter_jobs(succeeded=False, running=False, failed=True)\n", "print(f'Number of failed jobs: {len(failed_jobs)}')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can 
download the files for all successful jobs" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "file_list = succeeded_jobs.download_files()" ] }, { "cell_type": "markdown", "metadata": { "pycharm": { "name": "#%% md\n" } }, "source": [ "*Note: only succeeded jobs will have files to download.*\n" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.8" } }, "nbformat": 4, "nbformat_minor": 1 }