{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "* Run SalishSeaCast with different winds\n", "* Run WaveWatchIII using results from SalishSeaCast \n", "* Run MOHID using results of SalishSeaCast and WaveWatchIII" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import matplotlib.pyplot as plt\n", "import numpy as np\n", "import pandas as pd\n", "import netCDF4 as nc\n", "import seaborn as sns\n", "import matplotlib.colors as mcolors\n", "import glob\n", "import os\n", "import xarray as xr\n", "import datetime\n", "from salishsea_tools import viz_tools, tidetools, geo_tools, gsw_calls, wind_tools\n", "import pickle\n", "from IPython.core.display import display, HTML\n", "display(HTML(\"\"))\n", "\n", "%matplotlib inline" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Run SalisihSeaCast from 01jan20 to 11jan20. This gives us three days of spinup and eight days of model results in order to make seven days of forcing files for mohid." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Using SOG (-123.67, 49.21) as a reference, we choose the dates\n", "\n", "* low 350 deg winds: 2017-08-05\n", "* high 350 deg winds: 2016-05-04\n", "* low 150 deg winds: 2017-05-11\n", "* high 150 deg winds: 2017-11-13" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We first take wind files from the date range of our choice ( -3 days for spinup) , replace the times, and then save the file wth the appropriate name." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "files = np.sort(glob.glob('/ocean/vdo/MEOPAR/fake_winds/ops_y2017m11*'))[9:20]\n", "day = 1\n", "for file in files:\n", " f = xr.open_dataset(file)\n", " f = f.assign_coords(time_counter=[datetime.datetime(2020,1,day) + datetime.timedelta(hours = i) for i in range(24)])\n", " f.to_netcdf('/ocean/vdo/MEOPAR/fake_winds/10nov17/ops_y2020m01d' +f'{day:02}' + '.nc')\n", " day = day + 1" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "files = np.sort(glob.glob('/ocean/vdo/MEOPAR/fake_winds/ops_y2017m05*'))[7:18]\n", "day = 1\n", "for file in files:\n", " f = xr.open_dataset(file)\n", " f = f.assign_coords(time_counter=[datetime.datetime(2020,1,day) + datetime.timedelta(hours = i) for i in range(24)])\n", " f.to_netcdf('/ocean/vdo/MEOPAR/fake_winds/08may17/ops_y2020m01d' +f'{day:02}' + '.nc')\n", " day = day + 1" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "files = np.sort(glob.glob('/ocean/vdo/MEOPAR/fake_winds/ops_y2016m05*'))[:11]\n", "day = 1\n", "for file in files:\n", " f = xr.open_dataset(file)\n", " f = f.assign_coords(time_counter=[datetime.datetime(2020,1,day) + datetime.timedelta(hours = i) for i in range(24)])\n", " f.to_netcdf('/ocean/vdo/MEOPAR/fake_winds/01may16/ops_y2020m01d' +f'{day:02}' + '.nc')\n", " day = day + 1" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "files = np.sort(glob.glob('/ocean/vdo/MEOPAR/fake_winds/ops_y2017m08*'))[1:12]\n", "day = 1\n", "for file in files:\n", " f = xr.open_dataset(file)\n", " f = f.assign_coords(time_counter=[datetime.datetime(2020,1,day) + datetime.timedelta(hours = i) for i in range(24)])\n", " f.to_netcdf('/ocean/vdo/MEOPAR/fake_winds/02aug17/ops_y2020m01d' +f'{day:02}' + '.nc')\n", " day = day + 1" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We then run SalishSeaCast making sure to 
  { "cell_type": "markdown", "metadata": {}, "source": [
    "We then run SalishSeaCast, remembering to edit the yaml file so that it points to the modified wind files." ] },
  { "cell_type": "markdown", "metadata": {}, "source": [
    "Once SalishSeaCast has finished running, we need to nowcast-ize the results files in order to make the wavewatch and MOHID forcing files. This needs to be done on Skookum in the salishsea-nowcast environment. We need to edit the yaml file so that the `results archive: hindcast` key points to the directory of our SalishSeaCast results." ] },
  { "cell_type": "markdown", "metadata": {}, "source": [
    "`python -m nowcast.workers.split_results --debug config_file hindcast 2020-01-01`" ] },
  { "cell_type": "markdown", "metadata": {}, "source": [
    "We now need to make the forcing files to run wavewatch. We need to remember to edit the yaml file to point to the correct modified wind files and to the desired results directory. We also need to edit the symlink `/results/SalishSea/nowcast` as we go through each date range. We run these commands for each run date from 01jan20 to 11jan20:" ] },
  { "cell_type": "markdown", "metadata": {}, "source": [
    "`python -m nowcast.workers.make_ww3_wind_file SalishSeaNowcast/config/wwatch3-forcing.yaml --debug salish nowcast --run-date 2020-01-01`\n",
    "\n",
    "`python -m nowcast.workers.make_ww3_current_file SalishSeaNowcast/config/wwatch3-forcing.yaml --debug salish nowcast --run-date 2020-01-01`" ] },
  { "cell_type": "markdown", "metadata": {}, "source": [
    "We then transfer the wavewatch forcing to Cedar and run wavewatch. We need to remember to edit the yaml file to point to the correct currents and wind files. We can run all 11 days with one command; it takes ~20 minutes to run one model day.\n",
    "\n",
    "`wwatch3 run --start-date 2020-01-01 --n-days 11 yaml_file 5:00:00 RESULTS_DIR`" ] },
  { "cell_type": "markdown", "metadata": {}, "source": [
    "We then make the MOHID forcing files using make-hdf5. We need to remember to edit the yaml file to point to the correct winds, wavewatch, and NEMO files. It takes ~35 min to make 7+1 days of forcing. From there we can run MOHID, remembering to edit the yaml file to point to the forcing files that we just created." ] },
  { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] },
  { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] },
  { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] },
  { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] },
  { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] }
 ],
 "metadata": {
  "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" },
  "language_info": {
   "codemirror_mode": { "name": "ipython", "version": 3 },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.8"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}