{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "As part of the EarthSim project, extensive support for visualizing triangular meshes was added to [HoloViews](http://holoviews.org/reference/elements/bokeh/TriMesh.html) and [Datashader](http://datashader.org/user_guide/6_Trimesh.html); see the websites for those projects for details of how to use this functionality. Here, we will show how to use the small utilities provided in `earthsim` itself, which allow you to read 3DM and mesh2d files with mesh data, and we'll then visualize the resulting structures." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import numpy as np\n", "import holoviews as hv\n", "import geoviews as gv\n", "import cartopy.crs as ccrs\n", "\n", "from holoviews import opts\n", "from holoviews.operation.datashader import datashade, rasterize\n", "import datashader as ds\n", "from colorcet import cm_n\n", "from earthsim.io import read_3dm_mesh, read_mesh2d\n", "\n", "datashade.precompute = True\n", "\n", "hv.extension('bokeh')\n", "\n", "size = dict(width=800, height=600)\n", "opts.defaults(\n", " opts.Image(**size), opts.RGB(**size), opts.VectorField(**size))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Static meshes\n", "\n", "Static trimeshes can be loaded using the ``read_3dm_mesh`` utility, which will return its simplices and vertices. Here we will load a mesh of the Chesapeake and Delaware Bays.\n", "\n", "Before we declare the ``TriMesh`` we will project the coordinates of the vertices to a Mercator projection for plotting, so that GeoViews does not have to re-project the coordinates each time the plot is displayed. We also declare the 'z' dimension representing the value at each vertex (water depth in this case)." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "fpath = '../data/Chesapeake_and_Delaware_Bays.3dm'\n", "tris, verts = read_3dm_mesh(fpath)\n", "points = gv.operation.project_points(gv.Points(verts, vdims=['z']))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now that we have the simplices and vertices we can construct the ``TriMesh``. Since the TriMesh is too large to display on its own in Bokeh (which provides all data to the web browser by default), we apply the ``datashade`` operation to render the data into an image before providing it to the browser. 
{ "cell_type": "markdown", "metadata": {}, "source": [ "To begin with, we will visualize the underlying mesh by datashading just ``trimesh.edgepaths``:" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "tiles = gv.WMTS('https://maps.wikimedia.org/osm-intl/{Z}/{X}/{Y}@2x.png')\n", "trimesh = gv.TriMesh((tris, points))\n", "chesapeake_mesh = tiles * datashade(trimesh.edgepaths)\n", "chesapeake_mesh" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "Here the plot may look continuous in most regions, but if you have a live, running version of this plot, you can zoom in to see that it is a wireframe with high resolution (many grid points and vertices) in the inland areas and low resolution in the open water.\n", "\n", "Since the mesh also has associated 'z' values, we can render the value at each pixel that falls inside a triangle, interpolating from the values at the vertices when there are multiple pixels per triangle, and aggregating across vertex values (averaging in this case) when there are multiple vertices per pixel:" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "chesapeake_interp = tiles * rasterize(trimesh, aggregator=ds.mean('z')).opts(colorbar=True, cmap='Viridis')\n", "chesapeake_interp" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "If you have a live running Python process (not a static web page), you can zoom in to the above plot and eventually make out the original triangles.\n", "\n", "## Meshes over time\n", "\n", "When running environmental simulations using a tool like AdH, the mesh is often accompanied by additional data files representing depths, velocities, and error estimates over time. We can load these values and use them to visualize how each variable changes as a simulation evolves.\n", "\n", "First, we again load a static mesh, this time projecting it from its native UTM Zone 11 coordinate system to the Mercator projection." ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "fpath = '../data/SanDiego_Mesh/SanDiego.3dm'\n", "tris, verts = read_3dm_mesh(fpath, skiprows=2)\n", "points = gv.operation.project_points(gv.Points(verts, vdims=['z'], crs=ccrs.UTM(11)))\n", "trimesh = gv.TriMesh((tris, points))\n", "san_diego = tiles * rasterize(trimesh, aggregator='mean').opts(colorbar=True, cmap='Viridis', clim=(-75, 0))\n", "san_diego" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "Next we load a mesh2d ``.dat`` file containing hydrodynamic error ('HYDRO_ERROR') values computed over the course of a simulation. The ``read_mesh2d`` utility returns a dictionary of dataframes indexed by time." ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "fpath = '../data/SanDiego_Mesh/SanDiego_err_hydro.dat'\n", "dfs = read_mesh2d(fpath)\n", "dfs[0].head()" ] },
{ "cell_type": "markdown", "metadata": {}, "source": [ "To explore this data we will wrap it in a ``DynamicMap`` that returns a TriMesh for each timepoint. First we declare some points containing the positions of the vertices, and project those (once at the start, rather than for each subsequent plot). Then we can make use of the ``add_dimension`` method on those points to add the additional 'HYDRO_ERROR' variable we loaded from the ``.dat`` file. Finally, we declare a DynamicMap indexed by time, with the keys of the dictionary as the available time values." ] },
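{ "cell_type": "markdown", "metadata": {}, "source": [ "Before building the ``DynamicMap``, it can be useful to confirm which time keys are actually available. A quick check (assuming, as the code below does, that the dictionary keys are sortable time values):" ] },
{ "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "# Inspect the available simulation times (the dictionary keys)\n", "times = sorted(dfs.keys())\n", "print('%d timepoints, from %s to %s' % (len(times), times[0], times[-1]))" ] },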
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "points = gv.operation.project_points(gv.Points((verts.x, verts.y), crs=ccrs.UTM(11)))\n", "\n", "def time_mesh(time):\n", " depth_points = points.add_dimension('HYDRO_ERROR', 0, dfs[time].values[:, 0], vdim=True)\n", " return gv.TriMesh((tris, depth_points), crs=ccrs.GOOGLE_MERCATOR)\n", "\n", "meshes = hv.DynamicMap(time_mesh, kdims='Time').redim.values(Time=sorted(dfs.keys()))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we can ``rasterize`` the data by interpolating the 'Velocity' and again plot the data over a map. We'll first activate the `scrubber` option for HoloMaps, so that we get a set of controls that makes it simple to explore the dataset over time." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "error_meshes = datashade(meshes, aggregator='mean', cmap=cm_n['fire'])\n", "error_plot = tiles * error_meshes\n", "\n", "hv.output(error_plot, holomap='scrubber', fps=4)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In addition to visualizing an interpolated mesh, we can also visualize vector field data. The ``SanDiego_ovl`` file contains vector data, which we can again load using the ``read_mesh2d`` utility. The GeoViews ``VectorField`` element expects the data to be expressed as angle and magntitude values, so we convert the values and then declare the ``VectorField``." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "fpath = '../data/SanDiego_Mesh/SanDiego_ovl.dat'\n", "velocity_dfs = read_mesh2d(fpath)\n", "\n", "def time_field(time):\n", " vx = velocity_dfs[time].values[:, 0]\n", " vy = velocity_dfs[time].values[:, 1]\n", " xs, ys = (points.dimension_values(i) for i in range(2))\n", " with np.errstate(divide='ignore', invalid='ignore'):\n", " angle = np.arctan2(vy, vx)\n", " mag = np.sqrt(vx**2+vy**2)\n", " return gv.VectorField((xs, ys, angle, mag), vdims=['Angle', 'Magnitude'],\n", " crs=ccrs.GOOGLE_MERCATOR)\n", "\n", "vectors = hv.DynamicMap(time_field, kdims='Time').redim.values(Time=sorted(dfs.keys()))\n", "\n", "tiles * vectors.opts(color='Magnitude', magnitude='Magnitude', scale=0.005, rescale_lengths=False)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here you will probably need to zoom in to see the structure, but you should be able to make out changes in the flow pattern over time. \n", "\n", "If you have data in other formats, you should be able to read the source code for the read_mesh2d and read_3dm_mesh utilities and adapt it to build the same data structures from whatever data you start with." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "language_info": { "name": "python", "pygments_lexer": "ipython3" } }, "nbformat": 4, "nbformat_minor": 2 }