{ "nbformat": 4, "nbformat_minor": 0, "metadata": { "accelerator": "GPU", "colab": { "name": "Combining the power of PyTorch and NiftyNet.ipynb", "version": "0.3.2", "provenance": [], "collapsed_sections": [], "toc_visible": true }, "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.8" } }, "cells": [ { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "7L7_DBmQzrY_" }, "source": [ "# Combining the power of PyTorch and NiftyNet" ] }, { "cell_type": "markdown", "metadata": { "id": "piVSIuG49xsT", "colab_type": "text" }, "source": [ "## Contents" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "ltcwoYBm7_fF" }, "source": [ "1. [Introduction](#Introduction)\n", "2. [Running the notebook](#Running-the-notebook)\n", "3. [Setup](#Setup)\n", "4. [Transferring parameters from NiftyNet to PyTorch](#Transferring-parameters-from-NiftyNet-to-PyTorch)\n", "5. [Testing the model](#Testing-the-model)\n", "6. [Future work](#Future-work)\n", "7. [Conclusion](#Conclusion)\n", "8. [Acknowledgments](#Acknowledgments)\n", "\n" ] }, { "cell_type": "markdown", "metadata": { "id": "pCLyphMDb-64", "colab_type": "text" }, "source": [ "## Introduction" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "5bsZ0Xi2z2Je" }, "source": [ "NiftyNet is \"[an open source convolutional neural networks platform for medical image analysis and image-guided therapy](https://niftynet.io/)\" built on top of [TensorFlow](https://www.tensorflow.org/). 
Due to its available implementations of successful architectures, patch-based sampling and straightforward configuration, it has become a [popular choice](https://github.com/NifTK/NiftyNet/network/members) to get started with deep learning in medical imaging.\n", "\n", "PyTorch is \"[an open source deep learning platform that provides a seamless path from research prototyping to production deployment](https://pytorch.org/)\". It is low-level enough to offer a lot of control over what is going on under the hood during training, and its [dynamic computational graph](https://medium.com/intuitionmachine/pytorch-dynamic-computational-graphs-and-modular-deep-learning-7e7f89f18d1) allows for easy debugging. Being a generic deep learning framework, it is not tailored to the needs of the medical imaging field, although its popularity in this field is increasing rapidly.\n", "\n", "One can [extend a NiftyNet application](https://niftynet.readthedocs.io/en/dev/extending_app.html), but this is not straightforward unless you are familiar with the framework and fluent in TensorFlow 1.X. It can therefore be convenient to implement applications in PyTorch while reusing NiftyNet models and functionality. In particular, combining both frameworks allows for fast architecture experimentation and transfer learning.\n", "\n", "So why not use [both](https://www.youtube.com/watch?v=vqgSO8_cRio&feature=youtu.be&t=5)? In this tutorial we will port the parameters of a model trained with NiftyNet to a PyTorch model and compare the results of running inference with both frameworks."
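] }, { "cell_type": "markdown", "metadata": {}, "source": [ "As a minimal sketch of what this porting involves (with hypothetical variable and layer names; the real mapping for HighRes3DNet is built later in this notebook): read each checkpoint variable as a NumPy array, permute the axes of the convolution kernels to match PyTorch's layout, and collect the tensors in a `state_dict`.\n", "\n", "```python\n", "import numpy as np\n", "import torch\n", "\n", "# TensorFlow stores 3D convolution kernels as (D, H, W, in_channels, out_channels);\n", "# PyTorch's Conv3d expects (out_channels, in_channels, D, H, W)\n", "kernel_tf = np.random.rand(3, 3, 3, 1, 16).astype(np.float32)  # stand-in for a checkpoint variable\n", "kernel_pt = torch.from_numpy(kernel_tf.transpose(4, 3, 0, 1, 2).copy())\n", "state_dict = {'block.0.conv.weight': kernel_pt}  # hypothetical PyTorch parameter name\n", "print(tuple(kernel_pt.shape))  # (16, 1, 3, 3, 3)\n", "```"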
] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "owZ4Cew-0D-H" }, "source": [ "### Image segmentation" ] }, { "cell_type": "markdown", "metadata": { "id": "21Sdl4ylayLD", "colab_type": "text" }, "source": [ "Although NiftyNet supports different applications, it is mostly used for medical image segmentation.\n", "\n", "***Image segmentation using deep learning*** were the 5 most common words in all full paper titles from *both* [MICCAI 2018](https://www.miccai2018.org/en/) and [MIDL 2019](https://2019.midl.io/), which shows the interest of the medical imaging community in the topic." ] }, { "cell_type": "markdown", "metadata": { "id": "3HsS0Isg-GP0", "colab_type": "text" }, "source": [ "
" ] }, { "cell_type": "markdown", "metadata": { "id": "jJ8E26da-GP1", "colab_type": "text" }, "source": [ ""Image segmentation using deep learning", guess this is the hottest topic in MIDL #MIDL2019 @midl_conference pic.twitter.com/64smdMjnxY
— Hua Ma (@forever_pippo) July 8, 2019
" ] }, { "cell_type": "markdown", "metadata": { "id": "v16_WPuKmtDm", "colab_type": "text" }, "source": [ "### HighRes3DNet" ] }, { "cell_type": "markdown", "metadata": { "id": "i_trk4Gv-GP4", "colab_type": "text" }, "source": [ "" ] }, { "cell_type": "markdown", "metadata": { "id": "IHNg6sBD-GP7", "colab_type": "text" }, "source": [ "HighRes3DNet is a residual convolutional neural network architecture designed to have a large receptive field and preserve a high resolution using a relatively small number of parameters. It was presented in 2017 by Li et al. at IPMI: [*On the Compactness, Efficiency, and Representation of 3D Convolutional Networks: Brain Parcellation as a Pretext Task*](https://arxiv.org/abs/1707.01992)." ] }, { "cell_type": "markdown", "metadata": { "id": "vOJMIC9j-GP8", "colab_type": "text" }, "source": [ "" ] }, { "cell_type": "markdown", "metadata": { "id": "CW59Mlpf-GP9", "colab_type": "text" }, "source": [ "The authors used NiftyNet to implement and train a model based on this architecture to perform [brain parcellation](https://ieeexplore.ieee.org/document/7086081?arnumber=7086081) using $T_1$-weighted MR images from the [ADNI dataset](http://adni.loni.usc.edu/). They achieved competitive segmentation performance compared with state-of-the-art architectures such as [DeepMedic](https://biomedia.doc.ic.ac.uk/software/deepmedic/) or [U-Net](https://arxiv.org/abs/1606.06650).\n", "\n", "This figure from the paper shows a parcellation produced by HighRes3DNet:\n", "\n", "\n", "\n", "\n", "\n", "\n" ] }, { "cell_type": "markdown", "metadata": { "id": "4gDgyxQH-GP-", "colab_type": "text" }, "source": [ "The code of the architecture is on [NiftyNet GitHub repository](https://github.com/NifTK/NiftyNet/blob/dev/niftynet/network/highres3dnet.py). 
The authors have uploaded the parameters and configuration file to the [Model Zoo](https://github.com/NifTK/NiftyNetModelZoo/tree/5-reorganising-with-lfs/highres3dnet_brain_parcellation).\n", "\n", "After reading the paper and the code, it is relatively straightforward to [implement the same architecture using PyTorch](https://github.com/fepegar/highresnet)." ] }, { "cell_type": "markdown", "metadata": { "id": "jfnsSHJh-GRn", "colab_type": "text" }, "source": [ "## Running the notebook" ] }, { "cell_type": "markdown", "metadata": { "id": "BAbMnglFJJx3", "colab_type": "text" }, "source": [ "All the code is hosted in a GitHub repository:\n", "[`fepegar/miccai-educational-challenge-2019`](https://github.com/fepegar/miccai-educational-challenge-2019).\n", "\n", "The latest release can also be found on the Zenodo repository under this DOI: [10.5281/zenodo.3352316](https://doi.org/10.5281/zenodo.3352316)." ] }, { "cell_type": "markdown", "metadata": { "id": "0w-5ghoyBZN1", "colab_type": "text" }, "source": [ "### Online" ] }, { "cell_type": "markdown", "metadata": { "id": "WyKog0VdBhUo", "colab_type": "text" }, "source": [ "If you have a Google account, the best way to run this notebook seamlessly is using [Google Colab](https://colab.research.google.com/drive/1vqDojKuC4Svb97LdoEyZQygm3jccX4hr). You will need to click on \"Open in playground\", at the top left:\n", "\n", "![Playground mode screenshot](https://github.com/fepegar/miccai-educational-challenge-2019/raw/master/images/playground.png)\n", "\n", "You will also get a couple of warnings that you can safely ignore. The first one warns about this notebook not being authored by Google and the second one asks for confirmation to reset all runtimes. 
These are valid points, but will not affect this tutorial.\n", "\n", "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1vqDojKuC4Svb97LdoEyZQygm3jccX4hr)\n", "\n", "---\n", "\n", "Please [report any issues on GitHub](https://github.com/fepegar/miccai-educational-challenge-2019/issues/new) and I will fix them. You can also [drop me an email](mailto:fernando.perez.garcia.17@ucl.ac.uk?subject=Combining%20the%20power%20of%20PyTorch%20and%20NiftyNet) if you have any questions or comments.\n", "\n" ] }, { "cell_type": "markdown", "metadata": { "id": "aNVkCt5vBe0V", "colab_type": "text" }, "source": [ "### Locally" ] }, { "cell_type": "markdown", "metadata": { "id": "JuJmtAHl-GRn", "colab_type": "text" }, "source": [ "To write this notebook I used Ubuntu 18.04 installed on an Alienware 13 R3 laptop, which includes a 6-GB GeForce GTX 1060 NVIDIA GPU. I am using CUDA 9.0.\n", "\n", "Inference using PyTorch took 5725 MB of GPU memory. TensorFlow usually takes as much as possible beforehand." 
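] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If TensorFlow's greedy allocation is a problem on your machine, you can ask it to allocate GPU memory on demand instead. This is an optional sketch using the TensorFlow 1.x API (the API NiftyNet targets) and is not needed for the rest of the notebook:\n", "\n", "```python\n", "import tensorflow as tf\n", "\n", "tf1 = tf.compat.v1  # the 1.x API, also reachable from TensorFlow 2\n", "\n", "# Allocate GPU memory on demand instead of reserving it all upfront\n", "config = tf1.ConfigProto()\n", "config.gpu_options.allow_growth = True\n", "# pass `config` when creating the session: tf1.Session(config=config)\n", "```"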
] }, { "cell_type": "markdown", "metadata": { "id": "pzitKNDD-GRq", "colab_type": "text" }, "source": [ "To run this notebook locally, I recommend downloading the repository and creating a [`conda`](https://docs.conda.io/en/latest/miniconda.html) environment:\n", "\n", "```shell\n", "git clone https://github.com/fepegar/miccai-educational-challenge-2019.git\n", "cd miccai-educational-challenge-2019\n", "conda create -n mec2019 python=3.6 -y\n", "conda activate mec2019 && conda install jupyterlab -y && jupyter lab\n", "```" ] }, { "cell_type": "markdown", "metadata": { "id": "Ne0BYvIHFJzR", "colab_type": "text" }, "source": [ "### nbviewer" ] }, { "cell_type": "markdown", "metadata": { "id": "t4c313Z_FPeb", "colab_type": "text" }, "source": [ "An already executed version of the notebook can be rendered using [nbviewer](https://nbviewer.jupyter.org/github/fepegar/miccai-educational-challenge-2019/blob/master/Combining_the_power_of_PyTorch_and_NiftyNet.ipynb?flush_cache=true).\n", "\n", "\n", " \n", "" ] }, { "cell_type": "markdown", "metadata": { "id": "vooxt1Jt4efL", "colab_type": "text" }, "source": [ "### Interactive volume plots" ] }, { "cell_type": "markdown", "metadata": { "id": "sMEKcXj14iy0", "colab_type": "text" }, "source": [ "If you run the notebook, you can use interactive plots to navigate through the volume slices by setting this variable to `True`. You might need to run the volume visualization cells individually after running the whole notebook. This feature is experimental and therefore disabled by default." 
] }, { "cell_type": "code", "metadata": { "id": "FT17pIB249jn", "colab_type": "code", "colab": {} }, "source": [ "interactive_plots = False" ], "execution_count": 0, "outputs": [] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "FZzLgIR80mVQ" }, "source": [ "## Setup" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "8BIsvPiS4ghW" }, "source": [ "### Install and import libraries" ] }, { "cell_type": "markdown", "metadata": { "id": "3Qtauo5uO70v", "colab_type": "text" }, "source": [ "Clone NiftyNet and some custom Python libraries for this notebook. This might take one or two minutes." ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "X2uoosvx1uxh", "colab": {} }, "source": [ "%%capture --no-stderr\n", "# This might take about 30 seconds\n", "!rm -rf NiftyNet && git clone https://github.com/NifTK/NiftyNet --depth 1\n", "!cd NiftyNet && git checkout df0f86733357fdc92bbc191c8fec0dcf49aa5499 && cd ..\n", "!pip install -r NiftyNet/requirements-gpu.txt\n", "!curl -O https://raw.githubusercontent.com/fepegar/miccai-educational-challenge-2019/master/requirements.txt\n", "!curl -O https://raw.githubusercontent.com/fepegar/miccai-educational-challenge-2019/master/tf2pt.py\n", "!curl -O https://raw.githubusercontent.com/fepegar/miccai-educational-challenge-2019/master/utils.py\n", "!curl -O https://raw.githubusercontent.com/fepegar/miccai-educational-challenge-2019/master/visualization.py\n", "!curl -O https://raw.githubusercontent.com/fepegar/miccai-educational-challenge-2019/master/highresnet_mapping.py\n", "!curl -O https://raw.githubusercontent.com/fepegar/highresnet/master/GIFNiftyNet.ctbl\n", "!pip install -r requirements.txt\n", "!pip install --upgrade numpy\n", "!pip install ipywidgets\n", "import sys\n", "sys.path.insert(0, 'NiftyNet')" ], "execution_count": 0, "outputs": [] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "TdjKHmBb82qG", "outputId": 
"9cd83db3-e0e2-44bf-a34f-456fe36bc5d4", "colab": { "base_uri": "https://localhost:8080/", "height": 34 } }, "source": [ "%matplotlib inline\n", "%config InlineBackend.figure_format='retina'\n", "\n", "import os\n", "import tempfile\n", "from pathlib import Path\n", "from configparser import ConfigParser\n", "\n", "import numpy as np\n", "import pandas as pd\n", "\n", "try:\n", " # Fancy rendering of Pandas tables\n", " import google.colab.data_table\n", " %load_ext google.colab.data_table\n", " print(\"We are on Google Colab\")\n", "except ModuleNotFoundError:\n", " print(\"We are not on Google Colab\")\n", " pd.set_option('display.max_colwidth', -1) # do not truncate strings when displaying data frames\n", " pd.set_option('display.max_rows', None) # show all rows\n", "\n", "import torch\n", "\n", "from highresnet import HighRes3DNet" ], "execution_count": 3, "outputs": [ { "output_type": "stream", "text": [ "We are on Google Colab\n" ], "name": "stdout" } ] }, { "cell_type": "code", "metadata": { "id": "pNpcQ8mZJm6H", "colab_type": "code", "colab": {} }, "source": [ "%%capture\n", "os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'\n", "from tensorflow.python.util import deprecation\n", "deprecation._PRINT_DEPRECATION_WARNINGS = False\n", "\n", "import tf2pt\n", "import utils\n", "import visualization\n", "import highresnet_mapping\n", "\n", "if interactive_plots: # for Colab or Jupyter\n", " plot_volume = visualization.plot_volume_interactive\n", "else: # for HTML, GitHub or nbviewer\n", " plot_volume = visualization.plot_volume\n", "\n", "from niftynet.io.image_reader import ImageReader\n", "from niftynet.engine.sampler_grid_v2 import GridSampler\n", "from niftynet.engine.windows_aggregator_grid import GridSamplesAggregator\n", "from niftynet.layer.pad import PadLayer\n", "from niftynet.layer.binary_masking import BinaryMaskingLayer\n", "from niftynet.layer.histogram_normalisation import HistogramNormalisationLayer\n", "from niftynet.layer.mean_variance_normalisation 
import MeanVarNormalisationLayer" ], "execution_count": 0, "outputs": [] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "O6Z5ftE48fxv" }, "source": [ "### Download NiftyNet model and test data" ] }, { "cell_type": "markdown", "metadata": { "id": "YHb8yWM7-GQJ", "colab_type": "text" }, "source": [ "We can use NiftyNet's `net_download` to get all we need from the [Model Zoo entry](https://github.com/NifTK/NiftyNetModelZoo/tree/5-reorganising-with-lfs/highres3dnet_brain_parcellation#downloading-model-zoo-files) corresponding to brain parcellation using HighRes3DNet:" ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "fA92bog98ewS", "colab": {} }, "source": [ "%%capture\n", "%run NiftyNet/net_download.py highres3dnet_brain_parcellation_model_zoo" ], "execution_count": 0, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "zz1V4CKy-GQM", "colab_type": "code", "outputId": "371eb9fb-e327-4844-9cc4-e85e30dc6223", "colab": { "base_uri": "https://localhost:8080/", "height": 459 } }, "source": [ "niftynet_dir = Path('~/niftynet').expanduser()\n", "utils.list_files(niftynet_dir)" ], "execution_count": 6, "outputs": [ { "output_type": "stream", "text": [ "niftynet/\n", " data/\n", " OASIS/\n", " license\n", " OAS1_0145_MR2_mpr_n4_anon_sbj_111.nii.gz\n", " models/\n", " highres3dnet_brain_parcellation/\n", " inference_niftynet_log\n", " databrain_std_hist_models_otsu.txt\n", " settings_inference.txt\n", " Modality0.csv\n", " parcellation_output/\n", " window_seg_OAS1_0145_MR2_mpr_n4_anon_sbj_111__niftynet_out.nii.gz\n", " inferred.csv\n", " logs/\n", " models/\n", " model.ckpt-33000.meta\n", " model.ckpt-33000.data-00000-of-00001\n", " model.ckpt-33000.index\n", " extensions/\n", " __init__.py\n", " highres3dnet_brain_parcellation/\n", " __init__.py\n", " highres3dnet_config_eval.ini\n", " network/\n", " __init__.py\n" ], "name": "stdout" } ] }, { "cell_type": "markdown", "metadata": { "id": "yjpKmyji-GQO", "colab_type": 
"text" }, "source": [ "There are three directories under `~/niftynet`:\n", "1. `extensions` is a Python package that contains the [configuration file].(https://niftynet.readthedocs.io/en/dev/config_spec.html)\n", "2. `models` contains the landmarks for [histogram standardization](https://ieeexplore.ieee.org/document/836373) (an MRI preprocessing step) and the network parameters.\n", "3. `data` contains an [OASIS](https://www.oasis-brains.org/) MRI sample that can be used to test the model." ] }, { "cell_type": "markdown", "metadata": { "id": "c8vms-m2p5d3", "colab_type": "text" }, "source": [ "Here are the paths to the downloaded files and to the files that will be generated by the notebook. I use `nn` for NiftyNet, `tf` for TensorFlow and `pt` for PyTorch." ] }, { "cell_type": "code", "metadata": { "colab_type": "code", "id": "e3Ll7LVl4WtC", "colab": {} }, "source": [ "models_dir = niftynet_dir / 'models'\n", "zoo_entry = 'highres3dnet_brain_parcellation'\n", "input_checkpoint_tf_name = 'model.ckpt-33000'\n", "input_checkpoint_tf_path = models_dir / zoo_entry / 'models' / input_checkpoint_tf_name\n", "data_dir = niftynet_dir / 'data' / 'OASIS'\n", "config_path = niftynet_dir / 'extensions' / zoo_entry / 'highres3dnet_config_eval.ini'\n", "histogram_landmarks_path = models_dir / zoo_entry / 'databrain_std_hist_models_otsu.txt'\n", "tempdir = Path(tempfile.gettempdir()) / 'miccai_niftynet_pytorch'\n", "tempdir.mkdir(exist_ok=True)\n", "output_csv_tf_path = tempdir / 'variables_tf.csv'\n", "output_state_dict_tf_path = tempdir / 'state_dict_tf.pth'\n", "output_state_dict_pt_path = tempdir / 'state_dict_pt.pth'\n", "prediction_pt_dir = tempdir / 'prediction'\n", "prediction_pt_dir.mkdir(exist_ok=True)\n", "color_table_path = 'GIFNiftyNet.ctbl'" ], "execution_count": 0, "outputs": [] }, { "cell_type": "markdown", "metadata": { "id": "IwdjnYc--GQR", "colab_type": "text" }, "source": [ "Note that the path to the checkpoint is not a path to an existing filename, but the 
basename of the three checkpoint files." ] }, { "cell_type": "markdown", "metadata": { "id": "Fpb_C4qt-GQR", "colab_type": "text" }, "source": [ "## Transferring parameters from NiftyNet to PyTorch" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "svG_jtqn0KGU" }, "source": [ "### Variables in TensorFlow world\n", "\n", "" ] }, { "cell_type": "markdown", "metadata": { "id": "JfZfj1zs-GQT", "colab_type": "text" }, "source": [ "There are two modules in the\n", "[repository](https://github.com/fepegar/miccai-educational-challenge-2019)\n", "that are relevant for this section.\n", "[`tf2pt`](https://github.com/fepegar/miccai-educational-challenge-2019/blob/master/tf2pt.py) contains generic functions that can be used to transform any TensorFlow model to PyTorch.\n", "[`highresnet_mapping`](https://github.com/fepegar/miccai-educational-challenge-2019/blob/master/highresnet_mapping.py) contains custom functions that are specific to HighRes3DNet.\n", "\n", "Let's see which variables are stored in the checkpoint.\n", "\n", "For clarity, the following variables are filtered out by\n", "[`highresnet_mapping.is_not_valid()`](https://github.com/fepegar/miccai-educational-challenge-2019/blob/c96777d654ac577c0dba218038f76c2497de946a/highresnet_mapping.py#L4-L11):\n", "* Variables used by the Adam optimizer during training\n", "* Variables with no shape, as they are of no use for inference\n", "* Variables containing `biased` or `ExponentialMovingAverage`. I experimented with them, and the results turned out to be different from those generated by NiftyNet." ] }, { "cell_type": "markdown", "metadata": { "id": "XLFf7VkP-GQT", "colab_type": "text" }, "source": [ "We will store the variable names in a data frame to list them in this notebook, and the values in a Python dictionary to retrieve them later. 
I figured out the code in\n", "[`tf2pt.checkpoint_tf_to_state_dict_tf()`](https://github.com/fepegar/miccai-educational-challenge-2019/blob/c96777d654ac577c0dba218038f76c2497de946a/tf2pt.py#L90-L129)\n", "by reading the corresponding [TensorFlow docs](https://www.tensorflow.org/api_docs/python/tf/train/list_variables) and [Stack Overflow answers](https://stackoverflow.com/search?q=restore+tensorflow)." ] }, { "cell_type": "code", "metadata": { "id": "N74zhuFYPwff", "colab_type": "code", "outputId": "330fba6e-fb22-4800-cc6b-82b9b3d18554", "colab": { "base_uri": "https://localhost:8080/", "height": 190 } }, "source": [ "# %%capture\n", "tf2pt.checkpoint_tf_to_state_dict_tf(\n", " input_checkpoint_tf_path=input_checkpoint_tf_path,\n", " output_csv_tf_path=output_csv_tf_path,\n", " output_state_dict_tf_path=output_state_dict_tf_path,\n", " filter_out_function=highresnet_mapping.is_not_valid,\n", " replace_string='HighRes3DNet/',\n", ")\n", "data_frame_tf = pd.read_csv(output_csv_tf_path)\n", "state_dict_tf = torch.load(output_state_dict_tf_path)" ], "execution_count": 8, "outputs": [ { "output_type": "stream", "text": [ "W0824 12:06:24.860505 140162299176832 deprecation_wrapper.py:119] From /content/tf2pt.py:106: The name tf.reset_default_graph is deprecated. Please use tf.compat.v1.reset_default_graph instead.\n", "\n", "W0824 12:06:24.872821 140162299176832 deprecation_wrapper.py:119] From /content/tf2pt.py:114: The name tf.get_variable is deprecated. Please use tf.compat.v1.get_variable instead.\n", "\n", "W0824 12:06:25.661258 140162299176832 deprecation_wrapper.py:119] From /content/tf2pt.py:122: The name tf.train.Saver is deprecated. Please use tf.compat.v1.train.Saver instead.\n", "\n", "W0824 12:06:25.749863 140162299176832 deprecation_wrapper.py:119] From /content/tf2pt.py:124: The name tf.Session is deprecated. 
Please use tf.compat.v1.Session instead.\n", "\n", "I0824 12:06:29.631742 140162299176832 saver.py:1280] Restoring parameters from /root/niftynet/models/highres3dnet_brain_parcellation/models/model.ckpt-33000\n" ], "name": "stderr" } ] }, { "cell_type": "code", "metadata": { "id": "5GoBd5Qq-GQU", "colab_type": "code", "outputId": "ba845b60-587f-4adc-9ec3-89493dbf2994", "colab": { "base_uri": "https://localhost:8080/", "height": 623 } }, "source": [ "data_frame_tf" ], "execution_count": 9, "outputs": [ { "output_type": "execute_result", "data": { "application/vnd.google.colaboratory.module+javascript": "\n import \"https://ssl.gstatic.com/colaboratory/data_table/a036b366c3cace79/data_table.js\";\n\n window.createDataTable({\n data: [[{\n 'v': 0,\n 'f': \"0\",\n },\n{\n 'v': 0,\n 'f': \"0\",\n },\n\"conv_0_bn_relu/bn_/beta\",\n\"16\"],\n [{\n 'v': 1,\n 'f': \"1\",\n },\n{\n 'v': 1,\n 'f': \"1\",\n },\n\"conv_0_bn_relu/bn_/gamma\",\n\"16\"],\n [{\n 'v': 2,\n 'f': \"2\",\n },\n{\n 'v': 2,\n 'f': \"2\",\n },\n\"conv_0_bn_relu/bn_/moving_mean\",\n\"16\"],\n [{\n 'v': 3,\n 'f': \"3\",\n },\n{\n 'v': 3,\n 'f': \"3\",\n },\n\"conv_0_bn_relu/bn_/moving_variance\",\n\"16\"],\n [{\n 'v': 4,\n 'f': \"4\",\n },\n{\n 'v': 4,\n 'f': \"4\",\n },\n\"conv_0_bn_relu/conv_/w\",\n\"3, 3, 3, 1, 16\"],\n [{\n 'v': 5,\n 'f': \"5\",\n },\n{\n 'v': 5,\n 'f': \"5\",\n },\n\"conv_1_bn_relu/bn_/beta\",\n\"80\"],\n [{\n 'v': 6,\n 'f': \"6\",\n },\n{\n 'v': 6,\n 'f': \"6\",\n },\n\"conv_1_bn_relu/bn_/gamma\",\n\"80\"],\n [{\n 'v': 7,\n 'f': \"7\",\n },\n{\n 'v': 7,\n 'f': \"7\",\n },\n\"conv_1_bn_relu/bn_/moving_mean\",\n\"80\"],\n [{\n 'v': 8,\n 'f': \"8\",\n },\n{\n 'v': 8,\n 'f': \"8\",\n },\n\"conv_1_bn_relu/bn_/moving_variance\",\n\"80\"],\n [{\n 'v': 9,\n 'f': \"9\",\n },\n{\n 'v': 9,\n 'f': \"9\",\n },\n\"conv_1_bn_relu/conv_/w\",\n\"1, 1, 1, 64, 80\"],\n [{\n 'v': 10,\n 'f': \"10\",\n },\n{\n 'v': 10,\n 'f': \"10\",\n },\n\"conv_2_bn/bn_/beta\",\n\"160\"],\n [{\n 'v': 11,\n 'f': \"11\",\n 
},\n{\n 'v': 11,\n 'f': \"11\",\n },\n\"conv_2_bn/bn_/gamma\",\n\"160\"],\n [{\n 'v': 12,\n 'f': \"12\",\n },\n{\n 'v': 12,\n 'f': \"12\",\n },\n\"conv_2_bn/bn_/moving_mean\",\n\"160\"],\n [{\n 'v': 13,\n 'f': \"13\",\n },\n{\n 'v': 13,\n 'f': \"13\",\n },\n\"conv_2_bn/bn_/moving_variance\",\n\"160\"],\n [{\n 'v': 14,\n 'f': \"14\",\n },\n{\n 'v': 14,\n 'f': \"14\",\n },\n\"conv_2_bn/conv_/w\",\n\"1, 1, 1, 80, 160\"],\n [{\n 'v': 15,\n 'f': \"15\",\n },\n{\n 'v': 15,\n 'f': \"15\",\n },\n\"res_1_0/bn_0/beta\",\n\"16\"],\n [{\n 'v': 16,\n 'f': \"16\",\n },\n{\n 'v': 16,\n 'f': \"16\",\n },\n\"res_1_0/bn_0/gamma\",\n\"16\"],\n [{\n 'v': 17,\n 'f': \"17\",\n },\n{\n 'v': 17,\n 'f': \"17\",\n },\n\"res_1_0/bn_0/moving_mean\",\n\"16\"],\n [{\n 'v': 18,\n 'f': \"18\",\n },\n{\n 'v': 18,\n 'f': \"18\",\n },\n\"res_1_0/bn_0/moving_variance\",\n\"16\"],\n [{\n 'v': 19,\n 'f': \"19\",\n },\n{\n 'v': 19,\n 'f': \"19\",\n },\n\"res_1_0/bn_1/beta\",\n\"16\"],\n [{\n 'v': 20,\n 'f': \"20\",\n },\n{\n 'v': 20,\n 'f': \"20\",\n },\n\"res_1_0/bn_1/gamma\",\n\"16\"],\n [{\n 'v': 21,\n 'f': \"21\",\n },\n{\n 'v': 21,\n 'f': \"21\",\n },\n\"res_1_0/bn_1/moving_mean\",\n\"16\"],\n [{\n 'v': 22,\n 'f': \"22\",\n },\n{\n 'v': 22,\n 'f': \"22\",\n },\n\"res_1_0/bn_1/moving_variance\",\n\"16\"],\n [{\n 'v': 23,\n 'f': \"23\",\n },\n{\n 'v': 23,\n 'f': \"23\",\n },\n\"res_1_0/conv_0/w\",\n\"3, 3, 3, 16, 16\"],\n [{\n 'v': 24,\n 'f': \"24\",\n },\n{\n 'v': 24,\n 'f': \"24\",\n },\n\"res_1_0/conv_1/w\",\n\"3, 3, 3, 16, 16\"],\n [{\n 'v': 25,\n 'f': \"25\",\n },\n{\n 'v': 25,\n 'f': \"25\",\n },\n\"res_1_1/bn_0/beta\",\n\"16\"],\n [{\n 'v': 26,\n 'f': \"26\",\n },\n{\n 'v': 26,\n 'f': \"26\",\n },\n\"res_1_1/bn_0/gamma\",\n\"16\"],\n [{\n 'v': 27,\n 'f': \"27\",\n },\n{\n 'v': 27,\n 'f': \"27\",\n },\n\"res_1_1/bn_0/moving_mean\",\n\"16\"],\n [{\n 'v': 28,\n 'f': \"28\",\n },\n{\n 'v': 28,\n 'f': \"28\",\n },\n\"res_1_1/bn_0/moving_variance\",\n\"16\"],\n [{\n 'v': 29,\n 'f': \"29\",\n },\n{\n 
'v': 29,\n 'f': \"29\",\n },\n\"res_1_1/bn_1/beta\",\n\"16\"],\n [{\n 'v': 30,\n 'f': \"30\",\n },\n{\n 'v': 30,\n 'f': \"30\",\n },\n\"res_1_1/bn_1/gamma\",\n\"16\"],\n [{\n 'v': 31,\n 'f': \"31\",\n },\n{\n 'v': 31,\n 'f': \"31\",\n },\n\"res_1_1/bn_1/moving_mean\",\n\"16\"],\n [{\n 'v': 32,\n 'f': \"32\",\n },\n{\n 'v': 32,\n 'f': \"32\",\n },\n\"res_1_1/bn_1/moving_variance\",\n\"16\"],\n [{\n 'v': 33,\n 'f': \"33\",\n },\n{\n 'v': 33,\n 'f': \"33\",\n },\n\"res_1_1/conv_0/w\",\n\"3, 3, 3, 16, 16\"],\n [{\n 'v': 34,\n 'f': \"34\",\n },\n{\n 'v': 34,\n 'f': \"34\",\n },\n\"res_1_1/conv_1/w\",\n\"3, 3, 3, 16, 16\"],\n [{\n 'v': 35,\n 'f': \"35\",\n },\n{\n 'v': 35,\n 'f': \"35\",\n },\n\"res_1_2/bn_0/beta\",\n\"16\"],\n [{\n 'v': 36,\n 'f': \"36\",\n },\n{\n 'v': 36,\n 'f': \"36\",\n },\n\"res_1_2/bn_0/gamma\",\n\"16\"],\n [{\n 'v': 37,\n 'f': \"37\",\n },\n{\n 'v': 37,\n 'f': \"37\",\n },\n\"res_1_2/bn_0/moving_mean\",\n\"16\"],\n [{\n 'v': 38,\n 'f': \"38\",\n },\n{\n 'v': 38,\n 'f': \"38\",\n },\n\"res_1_2/bn_0/moving_variance\",\n\"16\"],\n [{\n 'v': 39,\n 'f': \"39\",\n },\n{\n 'v': 39,\n 'f': \"39\",\n },\n\"res_1_2/bn_1/beta\",\n\"16\"],\n [{\n 'v': 40,\n 'f': \"40\",\n },\n{\n 'v': 40,\n 'f': \"40\",\n },\n\"res_1_2/bn_1/gamma\",\n\"16\"],\n [{\n 'v': 41,\n 'f': \"41\",\n },\n{\n 'v': 41,\n 'f': \"41\",\n },\n\"res_1_2/bn_1/moving_mean\",\n\"16\"],\n [{\n 'v': 42,\n 'f': \"42\",\n },\n{\n 'v': 42,\n 'f': \"42\",\n },\n\"res_1_2/bn_1/moving_variance\",\n\"16\"],\n [{\n 'v': 43,\n 'f': \"43\",\n },\n{\n 'v': 43,\n 'f': \"43\",\n },\n\"res_1_2/conv_0/w\",\n\"3, 3, 3, 16, 16\"],\n [{\n 'v': 44,\n 'f': \"44\",\n },\n{\n 'v': 44,\n 'f': \"44\",\n },\n\"res_1_2/conv_1/w\",\n\"3, 3, 3, 16, 16\"],\n [{\n 'v': 45,\n 'f': \"45\",\n },\n{\n 'v': 45,\n 'f': \"45\",\n },\n\"res_2_0/bn_0/beta\",\n\"16\"],\n [{\n 'v': 46,\n 'f': \"46\",\n },\n{\n 'v': 46,\n 'f': \"46\",\n },\n\"res_2_0/bn_0/gamma\",\n\"16\"],\n [{\n 'v': 47,\n 'f': \"47\",\n },\n{\n 'v': 47,\n 'f': 
> Just like #miccai2018! pic.twitter.com/3ZTHxj9iPT
> — Julia Schnabel (@ja_schnabel) July 8, 2019
| | Unnamed: 0 | name | shape |
|---|---|---|---|
| 0 | 0 | conv_0_bn_relu/bn_/beta | 16 |
| 1 | 1 | conv_0_bn_relu/bn_/gamma | 16 |
| 2 | 2 | conv_0_bn_relu/bn_/moving_mean | 16 |
| 3 | 3 | conv_0_bn_relu/bn_/moving_variance | 16 |
| 4 | 4 | conv_0_bn_relu/conv_/w | 3, 3, 3, 1, 16 |
| 5 | 5 | conv_1_bn_relu/bn_/beta | 80 |
| 6 | 6 | conv_1_bn_relu/bn_/gamma | 80 |
| 7 | 7 | conv_1_bn_relu/bn_/moving_mean | 80 |
| 8 | 8 | conv_1_bn_relu/bn_/moving_variance | 80 |
| 9 | 9 | conv_1_bn_relu/conv_/w | 1, 1, 1, 64, 80 |
| 10 | 10 | conv_2_bn/bn_/beta | 160 |
| 11 | 11 | conv_2_bn/bn_/gamma | 160 |
| 12 | 12 | conv_2_bn/bn_/moving_mean | 160 |
| 13 | 13 | conv_2_bn/bn_/moving_variance | 160 |
| 14 | 14 | conv_2_bn/conv_/w | 1, 1, 1, 80, 160 |
| 15 | 15 | res_1_0/bn_0/beta | 16 |
| 16 | 16 | res_1_0/bn_0/gamma | 16 |
| 17 | 17 | res_1_0/bn_0/moving_mean | 16 |
| 18 | 18 | res_1_0/bn_0/moving_variance | 16 |
| 19 | 19 | res_1_0/bn_1/beta | 16 |
| 20 | 20 | res_1_0/bn_1/gamma | 16 |
| 21 | 21 | res_1_0/bn_1/moving_mean | 16 |
| 22 | 22 | res_1_0/bn_1/moving_variance | 16 |
| 23 | 23 | res_1_0/conv_0/w | 3, 3, 3, 16, 16 |
| 24 | 24 | res_1_0/conv_1/w | 3, 3, 3, 16, 16 |
| 25 | 25 | res_1_1/bn_0/beta | 16 |
| 26 | 26 | res_1_1/bn_0/gamma | 16 |
| 27 | 27 | res_1_1/bn_0/moving_mean | 16 |
| 28 | 28 | res_1_1/bn_0/moving_variance | 16 |
| 29 | 29 | res_1_1/bn_1/beta | 16 |
| ... | ... | ... | ... |
| 75 | 75 | res_3_0/bn_0/beta | 32 |
| 76 | 76 | res_3_0/bn_0/gamma | 32 |
| 77 | 77 | res_3_0/bn_0/moving_mean | 32 |
| 78 | 78 | res_3_0/bn_0/moving_variance | 32 |
| 79 | 79 | res_3_0/bn_1/beta | 64 |
| 80 | 80 | res_3_0/bn_1/gamma | 64 |
| 81 | 81 | res_3_0/bn_1/moving_mean | 64 |
| 82 | 82 | res_3_0/bn_1/moving_variance | 64 |
| 83 | 83 | res_3_0/conv_0/w | 3, 3, 3, 32, 64 |
| 84 | 84 | res_3_0/conv_1/w | 3, 3, 3, 64, 64 |
| 85 | 85 | res_3_1/bn_0/beta | 64 |
| 86 | 86 | res_3_1/bn_0/gamma | 64 |
| 87 | 87 | res_3_1/bn_0/moving_mean | 64 |
| 88 | 88 | res_3_1/bn_0/moving_variance | 64 |
| 89 | 89 | res_3_1/bn_1/beta | 64 |
| 90 | 90 | res_3_1/bn_1/gamma | 64 |
| 91 | 91 | res_3_1/bn_1/moving_mean | 64 |
| 92 | 92 | res_3_1/bn_1/moving_variance | 64 |
| 93 | 93 | res_3_1/conv_0/w | 3, 3, 3, 64, 64 |
| 94 | 94 | res_3_1/conv_1/w | 3, 3, 3, 64, 64 |
| 95 | 95 | res_3_2/bn_0/beta | 64 |
| 96 | 96 | res_3_2/bn_0/gamma | 64 |
| 97 | 97 | res_3_2/bn_0/moving_mean | 64 |
| 98 | 98 | res_3_2/bn_0/moving_variance | 64 |
| 99 | 99 | res_3_2/bn_1/beta | 64 |
| 100 | 100 | res_3_2/bn_1/gamma | 64 |
| 101 | 101 | res_3_2/bn_1/moving_mean | 64 |
| 102 | 102 | res_3_2/bn_1/moving_variance | 64 |
| 103 | 103 | res_3_2/conv_0/w | 3, 3, 3, 64, 64 |
| 104 | 104 | res_3_2/conv_1/w | 3, 3, 3, 64, 64 |

105 rows × 3 columns
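Note the layout difference between the two frameworks visible in these tables: NiftyNet (TensorFlow) stores a 3D convolution kernel as (depth, height, width, in_channels, out_channels), e.g. `conv_0_bn_relu/conv_/w` with shape (3, 3, 3, 1, 16), while PyTorch's `Conv3d` stores the same kernel as (out_channels, in_channels, depth, height, width), i.e. (16, 1, 3, 3, 3). A minimal NumPy sketch of the required permutation (`tf_kernel_to_torch` is a hypothetical helper, not part of either library):

```python
import numpy as np

def tf_kernel_to_torch(kernel: np.ndarray) -> np.ndarray:
    """Permute a 5-D TF conv kernel (D, H, W, C_in, C_out)
    into PyTorch's Conv3d layout (C_out, C_in, D, H, W)."""
    if kernel.ndim != 5:
        raise ValueError(f'Expected a 5-D kernel, got shape {kernel.shape}')
    return kernel.transpose(4, 3, 0, 1, 2)

# Example: the first NiftyNet kernel, "conv_0_bn_relu/conv_/w",
# has TF shape (3, 3, 3, 1, 16).
tf_kernel = np.zeros((3, 3, 3, 1, 16))
print(tf_kernel_to_torch(tf_kernel).shape)  # (16, 1, 3, 3, 3)
```

In the actual transfer, the permuted array would be wrapped in a `torch.Tensor` and copied into the corresponding entry of the PyTorch model's state dict.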
| | name | shape |
|---|---|---|
| 0 | block.0.convolutional_block.1.weight | 16, 1, 3, 3, 3 |
| 1 | block.0.convolutional_block.2.weight | 16 |
| 2 | block.0.convolutional_block.2.bias | 16 |
| 3 | block.0.convolutional_block.2.running_mean | 16 |
| 4 | block.0.convolutional_block.2.running_var | 16 |
| 5 | block.1.dilation_block.0.residual_block.0.conv... | 16 |
| 6 | block.1.dilation_block.0.residual_block.0.conv... | 16 |
| 7 | block.1.dilation_block.0.residual_block.0.conv... | 16 |
| 8 | block.1.dilation_block.0.residual_block.0.conv... | 16 |
| 9 | block.1.dilation_block.0.residual_block.0.conv... | 16, 16, 3, 3, 3 |
| 10 | block.1.dilation_block.0.residual_block.1.conv... | 16 |
| 11 | block.1.dilation_block.0.residual_block.1.conv... | 16 |
| 12 | block.1.dilation_block.0.residual_block.1.conv... | 16 |
| 13 | block.1.dilation_block.0.residual_block.1.conv... | 16 |
| 14 | block.1.dilation_block.0.residual_block.1.conv... | 16, 16, 3, 3, 3 |
| 15 | block.1.dilation_block.1.residual_block.0.conv... | 16 |
| 16 | block.1.dilation_block.1.residual_block.0.conv... | 16 |
| 17 | block.1.dilation_block.1.residual_block.0.conv... | 16 |
| 18 | block.1.dilation_block.1.residual_block.0.conv... | 16 |
| 19 | block.1.dilation_block.1.residual_block.0.conv... | 16, 16, 3, 3, 3 |
| 20 | block.1.dilation_block.1.residual_block.1.conv... | 16 |
| 21 | block.1.dilation_block.1.residual_block.1.conv... | 16 |
| 22 | block.1.dilation_block.1.residual_block.1.conv... | 16 |
| 23 | block.1.dilation_block.1.residual_block.1.conv... | 16 |
| 24 | block.1.dilation_block.1.residual_block.1.conv... | 16, 16, 3, 3, 3 |
| 25 | block.1.dilation_block.2.residual_block.0.conv... | 16 |
| 26 | block.1.dilation_block.2.residual_block.0.conv... | 16 |
| 27 | block.1.dilation_block.2.residual_block.0.conv... | 16 |
| 28 | block.1.dilation_block.2.residual_block.0.conv... | 16 |
| 29 | block.1.dilation_block.2.residual_block.0.conv... | 16, 16, 3, 3, 3 |
| ... | ... | ... |
| 75 | block.3.dilation_block.1.residual_block.0.conv... | 64 |
| 76 | block.3.dilation_block.1.residual_block.0.conv... | 64 |
| 77 | block.3.dilation_block.1.residual_block.0.conv... | 64 |
| 78 | block.3.dilation_block.1.residual_block.0.conv... | 64 |
| 79 | block.3.dilation_block.1.residual_block.0.conv... | 64, 64, 3, 3, 3 |
| 80 | block.3.dilation_block.1.residual_block.1.conv... | 64 |
| 81 | block.3.dilation_block.1.residual_block.1.conv... | 64 |
| 82 | block.3.dilation_block.1.residual_block.1.conv... | 64 |
| 83 | block.3.dilation_block.1.residual_block.1.conv... | 64 |
| 84 | block.3.dilation_block.1.residual_block.1.conv... | 64, 64, 3, 3, 3 |
| 85 | block.3.dilation_block.2.residual_block.0.conv... | 64 |
| 86 | block.3.dilation_block.2.residual_block.0.conv... | 64 |
| 87 | block.3.dilation_block.2.residual_block.0.conv... | 64 |
| 88 | block.3.dilation_block.2.residual_block.0.conv... | 64 |
| 89 | block.3.dilation_block.2.residual_block.0.conv... | 64, 64, 3, 3, 3 |
| 90 | block.3.dilation_block.2.residual_block.1.conv... | 64 |
| 91 | block.3.dilation_block.2.residual_block.1.conv... | 64 |
| 92 | block.3.dilation_block.2.residual_block.1.conv... | 64 |
| 93 | block.3.dilation_block.2.residual_block.1.conv... | 64 |
| 94 | block.3.dilation_block.2.residual_block.1.conv... | 64, 64, 3, 3, 3 |
| 95 | block.4.convolutional_block.0.weight | 80, 64, 1, 1, 1 |
| 96 | block.4.convolutional_block.1.weight | 80 |
| 97 | block.4.convolutional_block.1.bias | 80 |
| 98 | block.4.convolutional_block.1.running_mean | 80 |
| 99 | block.4.convolutional_block.1.running_var | 80 |
| 100 | block.6.convolutional_block.0.weight | 160, 80, 1, 1, 1 |
| 101 | block.6.convolutional_block.1.weight | 160 |
| 102 | block.6.convolutional_block.1.bias | 160 |
| 103 | block.6.convolutional_block.1.running_mean | 160 |
| 104 | block.6.convolutional_block.1.running_var | 160 |

105 rows × 2 columns
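Both tables have 105 entries, so the transfer amounts to matching them up pairwise. Besides the block names, the batch-norm variables need renaming: TensorFlow's `gamma`/`beta`/`moving_mean`/`moving_variance` correspond to `weight`/`bias`/`running_mean`/`running_var` in PyTorch's `BatchNorm3d` state dict. A minimal sketch of that suffix translation (`torch_bn_attribute` is a hypothetical helper; real transfer code must also match the enclosing block names):

```python
# Mapping from NiftyNet batch-norm variable suffixes to the attribute
# names PyTorch's BatchNorm3d uses in its state dict.
TF_TO_TORCH_BN = {
    'gamma': 'weight',               # learned scale
    'beta': 'bias',                  # learned shift
    'moving_mean': 'running_mean',   # running statistics
    'moving_variance': 'running_var',
}

def torch_bn_attribute(tf_variable_name: str) -> str:
    """Translate e.g. 'res_2_0/bn_0/moving_mean' -> 'running_mean'."""
    suffix = tf_variable_name.rsplit('/', 1)[-1]
    return TF_TO_TORCH_BN[suffix]

print(torch_bn_attribute('res_2_0/bn_0/moving_mean'))  # running_mean
print(torch_bn_attribute('conv_0_bn_relu/bn_/gamma'))  # weight
```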