{ "cells": [ { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "4embtkV0pNxM" }, "source": [ "Deep Learning\n", "=============\n", "\n", "Assignment 4\n", "------------\n", "\n", "Previously in `2_fullyconnected.ipynb` and `3_regularization.ipynb`, we trained fully connected networks to classify [notMNIST](http://yaroslavvb.blogspot.com/2011/09/notmnist-dataset.html) characters.\n", "\n", "The goal of this assignment is make the neural network convolutional." ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "cellView": "both", "colab": { "autoexec": { "startup": false, "wait_interval": 0 } }, "colab_type": "code", "id": "tm2CQN_Cpwj0" }, "outputs": [], "source": [ "# These are all the modules we'll be using later. Make sure you can import them\n", "# before proceeding further.\n", "from __future__ import print_function\n", "import numpy as np\n", "import tensorflow as tf\n", "from six.moves import cPickle as pickle\n", "from six.moves import range" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "cellView": "both", "colab": { "autoexec": { "startup": false, "wait_interval": 0 }, "output_extras": [ { "item_id": 1 } ] }, "colab_type": "code", "executionInfo": { "elapsed": 11948, "status": "ok", "timestamp": 1446658914837, "user": { "color": "", "displayName": "", "isAnonymous": false, "isMe": true, "permissionId": "", "photoUrl": "", "sessionId": "0", "userId": "" }, "user_tz": 480 }, "id": "y3-cj1bpmuxc", "outputId": "016b1a51-0290-4b08-efdb-8c95ffc3cd01" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Training set (200000, 28, 28) (200000,)\n", "Validation set (10000, 28, 28) (10000,)\n", "Test set (10000, 28, 28) (10000,)\n" ] } ], "source": [ "pickle_file = 'notMNIST.pickle'\n", "\n", "with open(pickle_file, 'rb') as f:\n", " save = pickle.load(f)\n", " train_dataset = save['train_dataset']\n", " train_labels = save['train_labels']\n", " valid_dataset = save['valid_dataset']\n", " valid_labels = save['valid_labels']\n", " test_dataset = save['test_dataset']\n", " test_labels = save['test_labels']\n", " del save # hint to help gc free up memory\n", " print('Training set', train_dataset.shape, train_labels.shape)\n", " print('Validation set', valid_dataset.shape, valid_labels.shape)\n", " print('Test set', test_dataset.shape, test_labels.shape)" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "L7aHrm6nGDMB" }, "source": [ "Reformat into a TensorFlow-friendly shape:\n", "- convolutions need the image data formatted as a cube (width by height by #channels)\n", "- labels as float 1-hot encodings." 
] }, { "cell_type": "code", "execution_count": 3, "metadata": { "cellView": "both", "colab": { "autoexec": { "startup": false, "wait_interval": 0 }, "output_extras": [ { "item_id": 1 } ] }, "colab_type": "code", "executionInfo": { "elapsed": 11952, "status": "ok", "timestamp": 1446658914857, "user": { "color": "", "displayName": "", "isAnonymous": false, "isMe": true, "permissionId": "", "photoUrl": "", "sessionId": "0", "userId": "" }, "user_tz": 480 }, "id": "IRSyYiIIGIzS", "outputId": "650a208c-8359-4852-f4f5-8bf10e80ef6c" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Training set (200000, 28, 28, 1) (200000, 10)\n", "Validation set (10000, 28, 28, 1) (10000, 10)\n", "Test set (10000, 28, 28, 1) (10000, 10)\n" ] } ], "source": [ "image_size = 28\n", "num_labels = 10\n", "num_channels = 1 # grayscale\n", "\n", "import numpy as np\n", "\n", "def reformat(dataset, labels):\n", " dataset = dataset.reshape(\n", " (-1, image_size, image_size, num_channels)).astype(np.float32)\n", " labels = (np.arange(num_labels) == labels[:,None]).astype(np.float32)\n", " return dataset, labels\n", "train_dataset, train_labels = reformat(train_dataset, train_labels)\n", "valid_dataset, valid_labels = reformat(valid_dataset, valid_labels)\n", "test_dataset, test_labels = reformat(test_dataset, test_labels)\n", "print('Training set', train_dataset.shape, train_labels.shape)\n", "print('Validation set', valid_dataset.shape, valid_labels.shape)\n", "print('Test set', test_dataset.shape, test_labels.shape)" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "cellView": "both", "colab": { "autoexec": { "startup": false, "wait_interval": 0 } }, "colab_type": "code", "id": "AgQDIREv02p1" }, "outputs": [], "source": [ "def accuracy(predictions, labels):\n", " return (100.0 * np.sum(np.argmax(predictions, 1) == np.argmax(labels, 1))\n", " / predictions.shape[0])" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "5rhgjmROXu2O" }, "source": [ "Let's build a small network with two convolutional layers, followed by one fully connected layer. Convolutional networks are more expensive computationally, so we'll limit its depth and number of fully connected nodes." 
] }, { "cell_type": "code", "execution_count": 5, "metadata": { "cellView": "both", "colab": { "autoexec": { "startup": false, "wait_interval": 0 } }, "colab_type": "code", "id": "IZYv70SvvOan" }, "outputs": [], "source": [ "batch_size = 16\n", "patch_size = 5\n", "depth = 16\n", "num_hidden = 64\n", "\n", "graph = tf.Graph()\n", "\n", "with graph.as_default():\n", "\n", " # Input data.\n", " tf_train_dataset = tf.placeholder(\n", " tf.float32, shape=(batch_size, image_size, image_size, num_channels))\n", " tf_train_labels = tf.placeholder(tf.float32, shape=(batch_size, num_labels))\n", " tf_valid_dataset = tf.constant(valid_dataset)\n", " tf_test_dataset = tf.constant(test_dataset)\n", " \n", " # Variables.\n", " layer1_weights = tf.Variable(tf.truncated_normal(\n", " [patch_size, patch_size, num_channels, depth], stddev=0.1))\n", " layer1_biases = tf.Variable(tf.zeros([depth]))\n", " layer2_weights = tf.Variable(tf.truncated_normal(\n", " [patch_size, patch_size, depth, depth], stddev=0.1))\n", " layer2_biases = tf.Variable(tf.constant(1.0, shape=[depth]))\n", " layer3_weights = tf.Variable(tf.truncated_normal(\n", " [image_size // 4 * image_size // 4 * depth, num_hidden], stddev=0.1))\n", " layer3_biases = tf.Variable(tf.constant(1.0, shape=[num_hidden]))\n", " layer4_weights = tf.Variable(tf.truncated_normal(\n", " [num_hidden, num_labels], stddev=0.1))\n", " layer4_biases = tf.Variable(tf.constant(1.0, shape=[num_labels]))\n", " \n", " # Model.\n", " def model(data):\n", " conv = tf.nn.conv2d(data, layer1_weights, [1, 2, 2, 1], padding='SAME')\n", " hidden = tf.nn.relu(conv + layer1_biases)\n", " conv = tf.nn.conv2d(hidden, layer2_weights, [1, 2, 2, 1], padding='SAME')\n", " hidden = tf.nn.relu(conv + layer2_biases)\n", " shape = hidden.get_shape().as_list()\n", " reshape = tf.reshape(hidden, [shape[0], shape[1] * shape[2] * shape[3]])\n", " hidden = tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)\n", " return tf.matmul(hidden, layer4_weights) + layer4_biases\n", " \n", " # Training computation.\n", " logits = model(tf_train_dataset)\n", " loss = tf.reduce_mean(\n", " tf.nn.softmax_cross_entropy_with_logits(labels=tf_train_labels, logits=logits))\n", " \n", " # Optimizer.\n", " optimizer = tf.train.GradientDescentOptimizer(0.05).minimize(loss)\n", " \n", " # Predictions for the training, validation, and test data.\n", " train_prediction = tf.nn.softmax(logits)\n", " valid_prediction = tf.nn.softmax(model(tf_valid_dataset))\n", " test_prediction = tf.nn.softmax(model(tf_test_dataset))" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "cellView": "both", "colab": { "autoexec": { "startup": false, "wait_interval": 0 }, "output_extras": [ { "item_id": 37 } ] }, "colab_type": "code", "executionInfo": { "elapsed": 63292, "status": "ok", "timestamp": 1446658966251, "user": { "color": "", "displayName": "", "isAnonymous": false, "isMe": true, "permissionId": "", "photoUrl": "", "sessionId": "0", "userId": "" }, "user_tz": 480 }, "id": "noKFb2UovVFR", "outputId": "28941338-2ef9-4088-8bd1-44295661e628" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Initialized\n", "Minibatch loss at step 0: 3.213093\n", "Minibatch accuracy: 18.8%\n", "Validation accuracy: 10.0%\n", "Minibatch loss at step 50: 1.705882\n", "Minibatch accuracy: 25.0%\n", "Validation accuracy: 44.3%\n", "Minibatch loss at step 100: 1.193937\n", "Minibatch accuracy: 50.0%\n", "Validation accuracy: 62.9%\n", "Minibatch loss at step 150: 0.464993\n", "Minibatch accuracy: 
93.8%\n", "Validation accuracy: 72.7%\n", "Minibatch loss at step 200: 0.703133\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 77.4%\n", "Minibatch loss at step 250: 1.254208\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 77.5%\n", "Minibatch loss at step 300: 0.312940\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 79.2%\n", "Minibatch loss at step 350: 0.552538\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 76.3%\n", "Minibatch loss at step 400: 0.359237\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 80.2%\n", "Minibatch loss at step 450: 0.724704\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 78.5%\n", "Minibatch loss at step 500: 0.640203\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 80.4%\n", "Minibatch loss at step 550: 0.820833\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 80.8%\n", "Minibatch loss at step 600: 0.354834\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 81.7%\n", "Minibatch loss at step 650: 0.885277\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 80.8%\n", "Minibatch loss at step 700: 1.020606\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 81.5%\n", "Minibatch loss at step 750: 0.087126\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 82.5%\n", "Minibatch loss at step 800: 0.741018\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 80.8%\n", "Minibatch loss at step 850: 1.040641\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 81.2%\n", "Minibatch loss at step 900: 0.620474\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 82.6%\n", "Minibatch loss at step 950: 0.557300\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 82.7%\n", "Minibatch loss at step 1000: 0.389972\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 82.7%\n", "Test accuracy: 89.7%\n" ] } ], "source": [ "num_steps = 1001\n", "\n", "with tf.Session(graph=graph) as session:\n", " tf.initialize_all_variables().run()\n", " print('Initialized')\n", " for step in range(num_steps):\n", " offset = (step * batch_size) % (train_labels.shape[0] - batch_size)\n", " batch_data = train_dataset[offset:(offset + batch_size), :, :, :]\n", " batch_labels = train_labels[offset:(offset + batch_size), :]\n", " feed_dict = {tf_train_dataset : batch_data, tf_train_labels : batch_labels}\n", " _, l, predictions = session.run(\n", " [optimizer, loss, train_prediction], feed_dict=feed_dict)\n", " if (step % 50 == 0):\n", " print('Minibatch loss at step %d: %f' % (step, l))\n", " print('Minibatch accuracy: %.1f%%' % accuracy(predictions, batch_labels))\n", " print('Validation accuracy: %.1f%%' % accuracy(\n", " valid_prediction.eval(), valid_labels))\n", " print('Test accuracy: %.1f%%' % accuracy(test_prediction.eval(), test_labels))" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "KedKkn4EutIK" }, "source": [ "---\n", "Problem 1\n", "---------\n", "\n", "The convolutional model above uses convolutions with stride 2 to reduce the dimensionality. 
Replace the strides by a max pooling operation (`nn.max_pool()`) of stride 2 and kernel size 2.\n", "\n", "---" ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [], "source": [ "batch_size = 16\n", "patch_size = 5\n", "depth = 16\n", "num_hidden = 64\n", "\n", "graph = tf.Graph()\n", "\n", "with graph.as_default():\n", "\n", " # Input data.\n", " tf_train_dataset = tf.placeholder(\n", " tf.float32, shape=(batch_size, image_size, image_size, num_channels))\n", " tf_train_labels = tf.placeholder(tf.float32, shape=(batch_size, num_labels))\n", " tf_valid_dataset = tf.constant(valid_dataset)\n", " tf_test_dataset = tf.constant(test_dataset)\n", " \n", " # Variables.\n", " layer1_weights = tf.Variable(tf.truncated_normal(\n", " [patch_size, patch_size, num_channels, depth], stddev=0.1))\n", " layer1_biases = tf.Variable(tf.zeros([depth]))\n", " layer2_weights = tf.Variable(tf.truncated_normal(\n", " [patch_size, patch_size, depth, depth], stddev=0.1))\n", " layer2_biases = tf.Variable(tf.constant(1.0, shape=[depth]))\n", " layer3_weights = tf.Variable(tf.truncated_normal(\n", " [image_size // 4 * image_size // 4 * depth, num_hidden], stddev=0.1))\n", " layer3_biases = tf.Variable(tf.constant(1.0, shape=[num_hidden]))\n", " layer4_weights = tf.Variable(tf.truncated_normal(\n", " [num_hidden, num_labels], stddev=0.1))\n", " layer4_biases = tf.Variable(tf.constant(1.0, shape=[num_labels]))\n", " \n", " # Model.\n", " def model(data):\n", " conv1 = tf.nn.conv2d(data, layer1_weights, [1, 1, 1, 1], padding='SAME')\n", " bias1 = tf.nn.relu(conv1 + layer1_biases)\n", " pool1 = tf.nn.max_pool(bias1, [1, 2, 2, 1], [1, 2, 2, 1], padding='SAME')\n", " conv2 = tf.nn.conv2d(pool1, layer2_weights, [1, 1, 1, 1], padding='SAME')\n", " bias2 = tf.nn.relu(conv2 + layer2_biases)\n", " pool2 = tf.nn.max_pool(bias2, [1, 2, 2, 1], [1, 2, 2, 1], padding='SAME')\n", " shape = pool2.get_shape().as_list()\n", " reshape = tf.reshape(pool2, [shape[0], shape[1] * shape[2] * shape[3]])\n", " hidden = tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)\n", " return tf.matmul(hidden, layer4_weights) + layer4_biases\n", " \n", " # Training computation.\n", " logits = model(tf_train_dataset)\n", " loss = tf.reduce_mean(\n", " tf.nn.softmax_cross_entropy_with_logits(labels=tf_train_labels, logits=logits))\n", " \n", " # Optimizer.\n", " optimizer = tf.train.GradientDescentOptimizer(0.05).minimize(loss)\n", " \n", " # Predictions for the training, validation, and test data.\n", " train_prediction = tf.nn.softmax(logits)\n", " valid_prediction = tf.nn.softmax(model(tf_valid_dataset))\n", " test_prediction = tf.nn.softmax(model(tf_test_dataset))" ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Initialized\n", "Minibatch loss at step 0: 3.779494\n", "Minibatch accuracy: 6.2%\n", "Validation accuracy: 10.0%\n", "Minibatch loss at step 50: 1.866151\n", "Minibatch accuracy: 37.5%\n", "Validation accuracy: 40.1%\n", "Minibatch loss at step 100: 1.125046\n", "Minibatch accuracy: 56.2%\n", "Validation accuracy: 56.2%\n", "Minibatch loss at step 150: 0.518518\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 75.0%\n", "Minibatch loss at step 200: 1.233413\n", "Minibatch accuracy: 50.0%\n", "Validation accuracy: 76.1%\n", "Minibatch loss at step 250: 1.081354\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 77.6%\n", "Minibatch loss at step 300: 0.417454\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 
79.8%\n", "Minibatch loss at step 350: 0.428833\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 79.4%\n", "Minibatch loss at step 400: 0.205477\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 81.0%\n", "Minibatch loss at step 450: 0.671585\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 79.6%\n", "Minibatch loss at step 500: 0.673592\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 81.7%\n", "Minibatch loss at step 550: 0.713905\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 82.2%\n", "Minibatch loss at step 600: 0.371910\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 82.8%\n", "Minibatch loss at step 650: 0.860517\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 82.7%\n", "Minibatch loss at step 700: 0.719865\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 81.2%\n", "Minibatch loss at step 750: 0.061006\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 83.5%\n", "Minibatch loss at step 800: 0.573218\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 83.5%\n", "Minibatch loss at step 850: 0.780923\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 83.6%\n", "Minibatch loss at step 900: 0.558760\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.0%\n", "Minibatch loss at step 950: 0.525676\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 84.1%\n", "Minibatch loss at step 1000: 0.283331\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.4%\n", "Test accuracy: 91.3%\n" ] } ], "source": [ "num_steps = 1001\n", "\n", "with tf.Session(graph=graph) as session:\n", " tf.initialize_all_variables().run()\n", " print('Initialized')\n", " for step in range(num_steps):\n", " offset = (step * batch_size) % (train_labels.shape[0] - batch_size)\n", " batch_data = train_dataset[offset:(offset + batch_size), :, :, :]\n", " batch_labels = train_labels[offset:(offset + batch_size), :]\n", " feed_dict = {tf_train_dataset : batch_data, tf_train_labels : batch_labels}\n", " _, l, predictions = session.run(\n", " [optimizer, loss, train_prediction], feed_dict=feed_dict)\n", " if (step % 50 == 0):\n", " print('Minibatch loss at step %d: %f' % (step, l))\n", " print('Minibatch accuracy: %.1f%%' % accuracy(predictions, batch_labels))\n", " print('Validation accuracy: %.1f%%' % accuracy(\n", " valid_prediction.eval(), valid_labels))\n", " print('Test accuracy: %.1f%%' % accuracy(test_prediction.eval(), test_labels))" ] }, { "cell_type": "markdown", "metadata": { "colab_type": "text", "id": "klf21gpbAgb-" }, "source": [ "---\n", "Problem 2\n", "---------\n", "\n", "Try to get the best performance you can using a convolutional net. Look for example at the classic [LeNet5](http://yann.lecun.com/exdb/lenet/) architecture, adding Dropout, and/or adding learning rate decay.\n", "\n", "---" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The CNN below is loosely inspired by the LeNet5 architecture." 
] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [], "source": [ "batch_size = 16\n", "patch_size = 5\n", "depth = 16\n", "num_hidden = 64\n", "\n", "graph = tf.Graph()\n", "\n", "with graph.as_default():\n", "\n", " # Input data.\n", " tf_train_dataset = tf.placeholder(\n", " tf.float32, shape=(batch_size, image_size, image_size, num_channels))\n", " tf_train_labels = tf.placeholder(tf.float32, shape=(batch_size, num_labels))\n", " tf_valid_dataset = tf.constant(valid_dataset)\n", " tf_test_dataset = tf.constant(test_dataset)\n", " \n", " # Variables.\n", " layer1_weights = tf.Variable(tf.truncated_normal(\n", " [patch_size, patch_size, num_channels, depth], stddev=0.1))\n", " layer1_biases = tf.Variable(tf.zeros([depth]))\n", " layer2_weights = tf.Variable(tf.truncated_normal(\n", " [patch_size, patch_size, depth, depth], stddev=0.1))\n", " layer2_biases = tf.Variable(tf.constant(1.0, shape=[depth]))\n", " size3 = ((image_size - patch_size + 1) // 2 - patch_size + 1) // 2\n", " layer3_weights = tf.Variable(tf.truncated_normal(\n", " [size3 * size3 * depth, num_hidden], stddev=0.1))\n", " layer3_biases = tf.Variable(tf.constant(1.0, shape=[num_hidden]))\n", " layer4_weights = tf.Variable(tf.truncated_normal(\n", " [num_hidden, num_labels], stddev=0.1))\n", " layer4_biases = tf.Variable(tf.constant(1.0, shape=[num_labels]))\n", " \n", " # Model.\n", " def model(data):\n", " # C1 input 28 x 28\n", " conv1 = tf.nn.conv2d(data, layer1_weights, [1, 1, 1, 1], padding='VALID')\n", " bias1 = tf.nn.relu(conv1 + layer1_biases)\n", " # S2 input 24 x 24\n", " pool2 = tf.nn.avg_pool(bias1, [1, 2, 2, 1], [1, 2, 2, 1], padding='VALID')\n", " # C3 input 12 x 12\n", " conv3 = tf.nn.conv2d(pool2, layer2_weights, [1, 1, 1, 1], padding='VALID')\n", " bias3 = tf.nn.relu(conv3 + layer2_biases)\n", " # S4 input 8 x 8\n", " pool4 = tf.nn.avg_pool(bias3, [1, 2, 2, 1], [1, 2, 2, 1], padding='VALID')\n", " # F6 input 4 x 4\n", " shape = pool4.get_shape().as_list()\n", " reshape = tf.reshape(pool4, [shape[0], shape[1] * shape[2] * shape[3]])\n", " hidden = tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)\n", " return tf.matmul(hidden, layer4_weights) + layer4_biases\n", " \n", " # Training computation.\n", " logits = model(tf_train_dataset)\n", " loss = tf.reduce_mean(\n", " tf.nn.softmax_cross_entropy_with_logits(logits, tf_train_labels))\n", " \n", " # Optimizer.\n", " optimizer = tf.train.GradientDescentOptimizer(0.05).minimize(loss)\n", " \n", " # Predictions for the training, validation, and test data.\n", " train_prediction = tf.nn.softmax(logits)\n", " valid_prediction = tf.nn.softmax(model(tf_valid_dataset))\n", " test_prediction = tf.nn.softmax(model(tf_test_dataset))" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Initialized\n", "Minibatch loss at step 0: 2.887526\n", "Minibatch accuracy: 6.2%\n", "Validation accuracy: 10.0%\n", "Minibatch loss at step 50: 1.755834\n", "Minibatch accuracy: 43.8%\n", "Validation accuracy: 49.1%\n", "Minibatch loss at step 100: 1.122573\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 59.9%\n", "Minibatch loss at step 150: 0.818305\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 68.2%\n", "Minibatch loss at step 200: 1.141408\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 70.2%\n", "Minibatch loss at step 250: 1.179383\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 74.6%\n", "Minibatch loss at step 300: 
0.573698\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 75.9%\n", "Minibatch loss at step 350: 0.567827\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 73.7%\n", "Minibatch loss at step 400: 0.373525\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 78.5%\n", "Minibatch loss at step 450: 0.923118\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 77.4%\n", "Minibatch loss at step 500: 0.829145\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 78.0%\n", "Minibatch loss at step 550: 1.070729\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 78.0%\n", "Minibatch loss at step 600: 0.396518\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 79.6%\n", "Minibatch loss at step 650: 0.829448\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 79.4%\n", "Minibatch loss at step 700: 0.859195\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 80.2%\n", "Minibatch loss at step 750: 0.065276\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 80.4%\n", "Minibatch loss at step 800: 0.508514\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 80.8%\n", "Minibatch loss at step 850: 1.009475\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 79.8%\n", "Minibatch loss at step 900: 0.941263\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 81.1%\n", "Minibatch loss at step 950: 0.570730\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 81.1%\n", "Minibatch loss at step 1000: 0.480312\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 80.9%\n", "Minibatch loss at step 1050: 0.600223\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 80.8%\n", "Minibatch loss at step 1100: 0.612804\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 82.5%\n", "Minibatch loss at step 1150: 0.347645\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 80.9%\n", "Minibatch loss at step 1200: 1.044914\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 82.7%\n", "Minibatch loss at step 1250: 0.641687\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 82.5%\n", "Minibatch loss at step 1300: 0.314071\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 82.7%\n", "Minibatch loss at step 1350: 0.956139\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 82.1%\n", "Minibatch loss at step 1400: 0.257573\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 82.5%\n", "Minibatch loss at step 1450: 0.341947\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 83.4%\n", "Minibatch loss at step 1500: 0.527776\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 82.5%\n", "Minibatch loss at step 1550: 0.554325\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 82.9%\n", "Minibatch loss at step 1600: 0.881043\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 83.0%\n", "Minibatch loss at step 1650: 0.619974\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 82.4%\n", "Minibatch loss at step 1700: 0.640255\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 83.4%\n", "Minibatch loss at step 1750: 0.571080\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 83.2%\n", "Minibatch loss at step 1800: 0.540935\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 83.6%\n", "Minibatch loss at step 1850: 0.652062\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.3%\n", "Minibatch loss at step 1900: 0.286824\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.1%\n", "Minibatch loss at step 1950: 0.571306\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 
84.5%\n", "Minibatch loss at step 2000: 0.085546\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 84.5%\n", "Minibatch loss at step 2050: 0.778835\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.2%\n", "Minibatch loss at step 2100: 0.305310\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 85.1%\n", "Minibatch loss at step 2150: 0.367425\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 84.9%\n", "Minibatch loss at step 2200: 0.371174\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.4%\n", "Minibatch loss at step 2250: 0.613358\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.0%\n", "Minibatch loss at step 2300: 0.750378\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.2%\n", "Minibatch loss at step 2350: 0.418179\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.0%\n", "Minibatch loss at step 2400: 0.501598\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 84.4%\n", "Minibatch loss at step 2450: 0.581857\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 84.5%\n", "Minibatch loss at step 2500: 0.903965\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.2%\n", "Minibatch loss at step 2550: 0.426037\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.9%\n", "Minibatch loss at step 2600: 0.099809\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 85.3%\n", "Minibatch loss at step 2650: 0.328693\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 85.5%\n", "Minibatch loss at step 2700: 0.639310\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 85.3%\n", "Minibatch loss at step 2750: 1.238202\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 85.5%\n", "Minibatch loss at step 2800: 0.425977\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.5%\n", "Minibatch loss at step 2850: 0.074243\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 85.7%\n", "Minibatch loss at step 2900: 0.343937\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.2%\n", "Minibatch loss at step 2950: 0.420437\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 85.5%\n", "Minibatch loss at step 3000: 0.697705\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.7%\n", "Minibatch loss at step 3050: 0.445790\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 85.7%\n", "Minibatch loss at step 3100: 0.510697\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 84.9%\n", "Minibatch loss at step 3150: 0.656437\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.5%\n", "Minibatch loss at step 3200: 0.525945\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.3%\n", "Minibatch loss at step 3250: 0.379972\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.7%\n", "Minibatch loss at step 3300: 0.110951\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 86.5%\n", "Minibatch loss at step 3350: 0.242193\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 86.2%\n", "Minibatch loss at step 3400: 0.595421\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 86.3%\n", "Minibatch loss at step 3450: 0.388807\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 85.6%\n", "Minibatch loss at step 3500: 0.294891\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.8%\n", "Minibatch loss at step 3550: 0.295007\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 86.0%\n", "Minibatch loss at step 3600: 0.104769\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 86.2%\n", "Minibatch loss at step 3650: 0.843573\n", 
"Minibatch accuracy: 75.0%\n", "Validation accuracy: 85.7%\n", "Minibatch loss at step 3700: 0.847604\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 86.2%\n", "Minibatch loss at step 3750: 0.677515\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.9%\n", "Minibatch loss at step 3800: 0.011699\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 86.2%\n", "Minibatch loss at step 3850: 0.731791\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 86.7%\n", "Minibatch loss at step 3900: 0.505476\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.7%\n", "Minibatch loss at step 3950: 0.017837\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 86.2%\n", "Minibatch loss at step 4000: 0.389282\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 86.4%\n", "Minibatch loss at step 4050: 0.880657\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.9%\n", "Minibatch loss at step 4100: 0.545353\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 86.7%\n", "Minibatch loss at step 4150: 1.049770\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 86.1%\n", "Minibatch loss at step 4200: 0.369591\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 85.9%\n", "Minibatch loss at step 4250: 0.612737\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 86.3%\n", "Minibatch loss at step 4300: 0.461530\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 86.3%\n", "Minibatch loss at step 4350: 0.200963\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 86.4%\n", "Minibatch loss at step 4400: 1.108174\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 86.5%\n", "Minibatch loss at step 4450: 0.509003\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 86.1%\n", "Minibatch loss at step 4500: 0.609005\n", "Minibatch accuracy: 87.5%\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Validation accuracy: 86.8%\n", "Minibatch loss at step 4550: 0.372265\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.1%\n", "Minibatch loss at step 4600: 0.533304\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 86.2%\n", "Minibatch loss at step 4650: 0.773061\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 86.3%\n", "Minibatch loss at step 4700: 0.506120\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.0%\n", "Minibatch loss at step 4750: 0.928353\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 86.3%\n", "Minibatch loss at step 4800: 0.489047\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 86.7%\n", "Minibatch loss at step 4850: 0.354044\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.1%\n", "Minibatch loss at step 4900: 0.119216\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.0%\n", "Minibatch loss at step 4950: 0.178830\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 5000: 0.955608\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 86.2%\n", "Minibatch loss at step 5050: 0.237759\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.0%\n", "Minibatch loss at step 5100: 0.275851\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.0%\n", "Minibatch loss at step 5150: 0.449317\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 86.1%\n", "Minibatch loss at step 5200: 0.273233\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 86.6%\n", "Minibatch loss at step 5250: 0.154574\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 87.1%\n", "Minibatch loss at step 5300: 
0.304150\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.0%\n", "Minibatch loss at step 5350: 0.274738\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.0%\n", "Minibatch loss at step 5400: 0.401083\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 5450: 0.334712\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 86.7%\n", "Minibatch loss at step 5500: 0.447083\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 5550: 0.379631\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 86.6%\n", "Minibatch loss at step 5600: 0.286631\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 5650: 0.309757\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 5700: 0.507893\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.4%\n", "Minibatch loss at step 5750: 0.620230\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 5800: 0.169920\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.1%\n", "Minibatch loss at step 5850: 0.745497\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.1%\n", "Minibatch loss at step 5900: 0.767921\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 86.6%\n", "Minibatch loss at step 5950: 0.312666\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 6000: 0.168658\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 86.9%\n", "Minibatch loss at step 6050: 0.373202\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 6100: 0.774956\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 87.1%\n", "Minibatch loss at step 6150: 0.190329\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.4%\n", "Minibatch loss at step 6200: 1.050667\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 87.4%\n", "Minibatch loss at step 6250: 0.822543\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.1%\n", "Minibatch loss at step 6300: 0.708801\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 87.3%\n", "Minibatch loss at step 6350: 0.209116\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.3%\n", "Minibatch loss at step 6400: 0.129195\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.3%\n", "Minibatch loss at step 6450: 0.241793\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.4%\n", "Minibatch loss at step 6500: 0.820996\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 6550: 0.095217\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 87.4%\n", "Minibatch loss at step 6600: 0.252888\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 6650: 1.725186\n", "Minibatch accuracy: 56.2%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 6700: 0.191433\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.4%\n", "Minibatch loss at step 6750: 0.353232\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 6800: 0.726301\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.6%\n", "Minibatch loss at step 6850: 0.515651\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 6900: 0.452586\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.0%\n", "Minibatch loss at step 6950: 0.179482\n", "Minibatch accuracy: 100.0%\n", 
"Validation accuracy: 87.2%\n", "Minibatch loss at step 7000: 0.931371\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 7050: 0.871163\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 87.3%\n", "Minibatch loss at step 7100: 0.498850\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 7150: 0.299375\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.1%\n", "Minibatch loss at step 7200: 0.470262\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 7250: 0.305438\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 7300: 0.586515\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.3%\n", "Minibatch loss at step 7350: 0.242167\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.1%\n", "Minibatch loss at step 7400: 0.011232\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 87.8%\n", "Minibatch loss at step 7450: 0.328229\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 7500: 0.261614\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.8%\n", "Minibatch loss at step 7550: 0.394588\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.3%\n", "Minibatch loss at step 7600: 0.519581\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 7650: 0.201451\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 7700: 0.152650\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.1%\n", "Minibatch loss at step 7750: 0.329386\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.6%\n", "Minibatch loss at step 7800: 0.292810\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.6%\n", "Minibatch loss at step 7850: 0.450945\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.3%\n", "Minibatch loss at step 7900: 0.103688\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 7950: 0.825726\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 87.4%\n", "Minibatch loss at step 8000: 0.426394\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.6%\n", "Minibatch loss at step 8050: 0.299139\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 8100: 0.387193\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 8150: 0.923081\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 8200: 0.326014\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.3%\n", "Minibatch loss at step 8250: 0.321350\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.8%\n", "Minibatch loss at step 8300: 0.244986\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.8%\n", "Minibatch loss at step 8350: 0.455575\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.6%\n", "Minibatch loss at step 8400: 0.197823\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 8450: 0.207904\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 8500: 0.332120\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.6%\n", "Minibatch loss at step 8550: 0.314461\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.6%\n", "Minibatch loss at step 8600: 0.302965\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at 
step 8650: 0.540094\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 8700: 0.226378\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 8750: 0.103386\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 87.6%\n", "Minibatch loss at step 8800: 0.180387\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.8%\n", "Minibatch loss at step 8850: 0.014346\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 8900: 0.449845\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 8950: 0.290058\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 9000: 0.387533\n", "Minibatch accuracy: 93.8%\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Validation accuracy: 88.1%\n", "Minibatch loss at step 9050: 0.340914\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 9100: 0.406798\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 9150: 0.764048\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 87.2%\n", "Minibatch loss at step 9200: 0.293021\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 9250: 0.839284\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 9300: 0.926702\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 9350: 0.294565\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.4%\n", "Minibatch loss at step 9400: 0.348141\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 9450: 0.339952\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 9500: 0.244894\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 9550: 0.226825\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 9600: 0.356070\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 9650: 0.401066\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.8%\n", "Minibatch loss at step 9700: 0.234334\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 9750: 0.159659\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 9800: 0.380945\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 9850: 0.251576\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.4%\n", "Minibatch loss at step 9900: 0.566218\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 9950: 0.279412\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 10000: 0.215329\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 10050: 0.019123\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 10100: 0.356180\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 10150: 0.567362\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 10200: 0.189304\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 10250: 0.446280\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.0%\n", 
"Minibatch loss at step 10300: 0.126306\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 10350: 0.471321\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 10400: 0.290378\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 10450: 0.317353\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 10500: 0.340341\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 10550: 0.853542\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 10600: 0.720329\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 10650: 0.351055\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 10700: 0.048014\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 10750: 0.276408\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.4%\n", "Minibatch loss at step 10800: 0.477006\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 10850: 0.774067\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 10900: 0.237711\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 10950: 0.447895\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 11000: 0.093522\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.8%\n", "Minibatch loss at step 11050: 0.410395\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 11100: 0.094415\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 11150: 0.322285\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 11200: 0.145736\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 11250: 0.962268\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 11300: 0.364485\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 11350: 0.390612\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 11400: 0.284497\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 11450: 0.343720\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 11500: 0.392723\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 11550: 0.357363\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 11600: 0.417761\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 11650: 0.378388\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 11700: 0.893325\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 11750: 0.508242\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 11800: 0.019753\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 11850: 0.637589\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 11900: 0.371976\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at 
step 11950: 0.838995\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 12000: 0.444190\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 12050: 0.038973\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 12100: 0.501612\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 12150: 0.247185\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 12200: 0.328059\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 12250: 0.381155\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 12300: 0.291867\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 12350: 0.758298\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 12400: 0.018543\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 12450: 0.952901\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 12500: 0.452856\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 12550: 0.775165\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 12600: 0.543465\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 12650: 0.423330\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 87.5%\n", "Minibatch loss at step 12700: 0.339476\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.9%\n", "Minibatch loss at step 12750: 0.207814\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 12800: 0.152592\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 12850: 0.250027\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 12900: 0.150702\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 12950: 0.036315\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 13000: 0.224514\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 13050: 0.276813\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 13100: 0.267251\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 13150: 0.248624\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 13200: 0.317007\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 13250: 0.949058\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 13300: 0.298962\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 13350: 0.106796\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 13400: 0.471004\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 13450: 0.339622\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.4%\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Minibatch loss at step 13500: 0.388597\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 13550: 0.395972\n", "Minibatch accuracy: 93.8%\n", 
"Validation accuracy: 88.2%\n", "Minibatch loss at step 13600: 0.326045\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 13650: 0.497251\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 13700: 0.386662\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 13750: 0.576945\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 13800: 0.047079\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.0%\n", "Minibatch loss at step 13850: 0.130557\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.1%\n", "Minibatch loss at step 13900: 0.168503\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 13950: 0.453997\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 14000: 0.054875\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 14050: 0.387485\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 87.7%\n", "Minibatch loss at step 14100: 0.558513\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 14150: 0.281757\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 14200: 0.194573\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 87.6%\n", "Minibatch loss at step 14250: 0.234614\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 14300: 0.405016\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 14350: 0.146940\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 14400: 0.295947\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 14450: 0.073702\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 14500: 0.348648\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 14550: 0.555093\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 14600: 0.047173\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 87.8%\n", "Minibatch loss at step 14650: 0.611663\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 14700: 0.310703\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 14750: 0.285280\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 14800: 0.477825\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 14850: 0.301898\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 14900: 0.266927\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 14950: 0.401053\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 15000: 0.300880\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 15050: 0.685438\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 15100: 0.189911\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.7%\n", "Minibatch loss at step 15150: 0.255015\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 15200: 0.083468\n", "Minibatch accuracy: 100.0%\n", "Validation 
accuracy: 88.4%\n", "Minibatch loss at step 15250: 0.641439\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 15300: 0.183658\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 15350: 0.020951\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 15400: 0.783557\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 15450: 0.691905\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 15500: 0.161925\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 15550: 0.345921\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 15600: 0.448692\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 15650: 0.473705\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 15700: 0.084393\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 15750: 0.207775\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 15800: 0.126755\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 15850: 0.267497\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.9%\n", "Minibatch loss at step 15900: 0.642480\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 15950: 0.096726\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 16000: 1.024041\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 16050: 0.080016\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 16100: 0.323796\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 16150: 0.119888\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 16200: 0.465717\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.3%\n", "Minibatch loss at step 16250: 0.177272\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 16300: 0.465884\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 16350: 0.391384\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 16400: 0.658646\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 88.7%\n", "Minibatch loss at step 16450: 0.640159\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 16500: 0.026741\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 16550: 0.468924\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 16600: 0.337119\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.7%\n", "Minibatch loss at step 16650: 0.614522\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 16700: 0.511785\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 16750: 0.521505\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 16800: 0.644770\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 16850: 0.742855\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 
88.7%\n", "Minibatch loss at step 16900: 0.931507\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 16950: 0.210255\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.4%\n", "Minibatch loss at step 17000: 0.183624\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.7%\n", "Minibatch loss at step 17050: 0.308766\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 17100: 0.225872\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.7%\n", "Minibatch loss at step 17150: 0.325996\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.7%\n", "Minibatch loss at step 17200: 0.230195\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 17250: 0.547959\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 17300: 0.469168\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.9%\n", "Minibatch loss at step 17350: 0.212685\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.2%\n", "Minibatch loss at step 17400: 0.546534\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 17450: 0.272410\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 17500: 0.317612\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 17550: 0.484892\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.9%\n", "Minibatch loss at step 17600: 0.646033\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 17650: 0.209862\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 17700: 0.484305\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.7%\n", "Minibatch loss at step 17750: 0.251212\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 17800: 0.535176\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 17850: 0.298966\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.7%\n", "Minibatch loss at step 17900: 0.374506\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 89.1%\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Minibatch loss at step 17950: 0.356499\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 18000: 0.334625\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.2%\n", "Minibatch loss at step 18050: 0.343583\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 18100: 0.647946\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 89.2%\n", "Minibatch loss at step 18150: 0.221066\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 18200: 0.406646\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 18250: 0.407224\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 18300: 0.115090\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.9%\n", "Minibatch loss at step 18350: 0.321624\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 18400: 0.731389\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 88.5%\n", "Minibatch loss at step 18450: 0.288471\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 18500: 0.242199\n", "Minibatch 
accuracy: 93.8%\n", "Validation accuracy: 88.6%\n", "Minibatch loss at step 18550: 0.298285\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 18600: 0.080742\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 18650: 0.327417\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 18700: 0.129839\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 18750: 0.682693\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 88.9%\n", "Minibatch loss at step 18800: 0.382109\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 18850: 0.477357\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 89.2%\n", "Minibatch loss at step 18900: 0.015490\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 89.4%\n", "Minibatch loss at step 18950: 0.127917\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 19000: 0.568930\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 19050: 0.059831\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 19100: 0.014571\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 19150: 0.449375\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 19200: 0.294453\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 88.8%\n", "Minibatch loss at step 19250: 0.913588\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 19300: 0.226687\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 19350: 0.138060\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 88.7%\n", "Minibatch loss at step 19400: 0.512782\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 19450: 0.192825\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 19500: 0.244256\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 19550: 0.229286\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 19600: 0.274848\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 19650: 0.315529\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 88.9%\n", "Minibatch loss at step 19700: 0.079131\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 89.0%\n", "Minibatch loss at step 19750: 0.500986\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 89.1%\n", "Minibatch loss at step 19800: 0.255140\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 89.4%\n", "Minibatch loss at step 19850: 0.130912\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 89.2%\n", "Minibatch loss at step 19900: 0.186173\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.6%\n", "Minibatch loss at step 19950: 0.748589\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 89.3%\n", "Minibatch loss at step 20000: 0.276946\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 89.1%\n", "Test accuracy: 95.0%\n" ] } ], "source": [ "num_steps = 20001\n", "\n", "with tf.Session(graph=graph) as session:\n", " tf.initialize_all_variables().run()\n", " print('Initialized')\n", " for step in range(num_steps):\n", " offset = (step * batch_size) % (train_labels.shape[0] - 
batch_size)\n", " batch_data = train_dataset[offset:(offset + batch_size), :, :, :]\n", " batch_labels = train_labels[offset:(offset + batch_size), :]\n", " feed_dict = {tf_train_dataset : batch_data, tf_train_labels : batch_labels}\n", " _, l, predictions = session.run(\n", " [optimizer, loss, train_prediction], feed_dict=feed_dict)\n", " if (step % 50 == 0):\n", " print('Minibatch loss at step %d: %f' % (step, l))\n", " print('Minibatch accuracy: %.1f%%' % accuracy(predictions, batch_labels))\n", " print('Validation accuracy: %.1f%%' % accuracy(\n", " valid_prediction.eval(), valid_labels))\n", " print('Test accuracy: %.1f%%' % accuracy(test_prediction.eval(), test_labels))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The accuracy is good, but not as good as the 3-layer network from the previous assignment." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The next version of the net uses dropout and learning rate decay:" ] }, { "cell_type": "code", "execution_count": 13, "metadata": {}, "outputs": [], "source": [ "batch_size = 16\n", "patch_size = 5\n", "depth = 16\n", "num_hidden = 64\n", "beta_regul = 1e-3\n", "drop_out = 0.5\n", "\n", "graph = tf.Graph()\n", "\n", "with graph.as_default():\n", "\n", " # Input data.\n", " tf_train_dataset = tf.placeholder(\n", " tf.float32, shape=(batch_size, image_size, image_size, num_channels))\n", " tf_train_labels = tf.placeholder(tf.float32, shape=(batch_size, num_labels))\n", " tf_valid_dataset = tf.constant(valid_dataset)\n", " tf_test_dataset = tf.constant(test_dataset)\n", " global_step = tf.Variable(0)\n", " \n", " # Variables.\n", " layer1_weights = tf.Variable(tf.truncated_normal(\n", " [patch_size, patch_size, num_channels, depth], stddev=0.1))\n", " layer1_biases = tf.Variable(tf.zeros([depth]))\n", " layer2_weights = tf.Variable(tf.truncated_normal(\n", " [patch_size, patch_size, depth, depth], stddev=0.1))\n", " layer2_biases = tf.Variable(tf.constant(1.0, shape=[depth]))\n", " size3 = ((image_size - patch_size + 1) // 2 - patch_size + 1) // 2\n", " layer3_weights = tf.Variable(tf.truncated_normal(\n", " [size3 * size3 * depth, num_hidden], stddev=0.1))\n", " layer3_biases = tf.Variable(tf.constant(1.0, shape=[num_hidden]))\n", " layer4_weights = tf.Variable(tf.truncated_normal(\n", " [num_hidden, num_hidden], stddev=0.1))\n", " layer4_biases = tf.Variable(tf.constant(1.0, shape=[num_hidden]))\n", " layer5_weights = tf.Variable(tf.truncated_normal(\n", " [num_hidden, num_labels], stddev=0.1))\n", " layer5_biases = tf.Variable(tf.constant(1.0, shape=[num_labels]))\n", " \n", " # Model.\n", " def model(data, keep_prob):\n", " # C1 input 28 x 28\n", " conv1 = tf.nn.conv2d(data, layer1_weights, [1, 1, 1, 1], padding='VALID')\n", " bias1 = tf.nn.relu(conv1 + layer1_biases)\n", " # S2 input 24 x 24\n", " pool2 = tf.nn.avg_pool(bias1, [1, 2, 2, 1], [1, 2, 2, 1], padding='VALID')\n", " # C3 input 12 x 12\n", " conv3 = tf.nn.conv2d(pool2, layer2_weights, [1, 1, 1, 1], padding='VALID')\n", " bias3 = tf.nn.relu(conv3 + layer2_biases)\n", " # S4 input 8 x 8\n", " pool4 = tf.nn.avg_pool(bias3, [1, 2, 2, 1], [1, 2, 2, 1], padding='VALID')\n", " # F5 input 4 x 4\n", " shape = pool4.get_shape().as_list()\n", " reshape = tf.reshape(pool4, [shape[0], shape[1] * shape[2] * shape[3]])\n", " hidden5 = tf.nn.relu(tf.matmul(reshape, layer3_weights) + layer3_biases)\n", " # F6\n", " drop5 = tf.nn.dropout(hidden5, keep_prob)\n", " hidden6 = tf.nn.relu(tf.matmul(hidden5, layer4_weights) + layer4_biases)\n", " drop6 = 
tf.nn.dropout(hidden6, keep_prob)\n", " return tf.matmul(drop6, layer5_weights) + layer5_biases\n", " \n", " # Training computation.\n", " logits = model(tf_train_dataset, drop_out)\n", " loss = tf.reduce_mean(\n", " tf.nn.softmax_cross_entropy_with_logits(logits, tf_train_labels))\n", " \n", " # Optimizer.\n", " learning_rate = tf.train.exponential_decay(0.05, global_step, 1000, 0.85, staircase=True)\n", " optimizer = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss, global_step=global_step)\n", " \n", " # Predictions for the training, validation, and test data.\n", " train_prediction = tf.nn.softmax(logits)\n", " valid_prediction = tf.nn.softmax(model(tf_valid_dataset, 1.0))\n", " test_prediction = tf.nn.softmax(model(tf_test_dataset, 1.0))" ] }, { "cell_type": "code", "execution_count": 14, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Initialized\n", "Minibatch loss at step 0: 2.466877\n", "Minibatch accuracy: 25.0%\n", "Validation accuracy: 10.3%\n", "Minibatch loss at step 50: 2.161342\n", "Minibatch accuracy: 18.8%\n", "Validation accuracy: 26.2%\n", "Minibatch loss at step 100: 1.929306\n", "Minibatch accuracy: 37.5%\n", "Validation accuracy: 44.5%\n", "Minibatch loss at step 150: 1.402269\n", "Minibatch accuracy: 43.8%\n", "Validation accuracy: 56.8%\n", "Minibatch loss at step 200: 1.576180\n", "Minibatch accuracy: 43.8%\n", "Validation accuracy: 59.8%\n", "Minibatch loss at step 250: 1.603382\n", "Minibatch accuracy: 56.2%\n", "Validation accuracy: 67.5%\n", "Minibatch loss at step 300: 0.996797\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 74.0%\n", "Minibatch loss at step 350: 0.924196\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 72.1%\n", "Minibatch loss at step 400: 0.585693\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 72.5%\n", "Minibatch loss at step 450: 1.226259\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 74.3%\n", "Minibatch loss at step 500: 1.268786\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 75.3%\n", "Minibatch loss at step 550: 0.794506\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 76.7%\n", "Minibatch loss at step 600: 0.704979\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 77.4%\n", "Minibatch loss at step 650: 0.899304\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 78.2%\n", "Minibatch loss at step 700: 1.260285\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 79.6%\n", "Minibatch loss at step 750: 0.351961\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 79.3%\n", "Minibatch loss at step 800: 0.666271\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 79.1%\n", "Minibatch loss at step 850: 0.991982\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 78.4%\n", "Minibatch loss at step 900: 0.785811\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 79.7%\n", "Minibatch loss at step 950: 0.733023\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 80.3%\n", "Minibatch loss at step 1000: 0.335110\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 80.0%\n", "Minibatch loss at step 1050: 0.680891\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 80.1%\n", "Minibatch loss at step 1100: 0.795097\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 81.0%\n", "Minibatch loss at step 1150: 0.398647\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 80.9%\n", "Minibatch loss at step 1200: 1.061081\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 81.2%\n", 
"Minibatch loss at step 1250: 0.686575\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 81.3%\n", "Minibatch loss at step 1300: 0.320834\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 81.2%\n", "Minibatch loss at step 1350: 1.318262\n", "Minibatch accuracy: 56.2%\n", "Validation accuracy: 80.8%\n", "Minibatch loss at step 1400: 0.357428\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 81.5%\n", "Minibatch loss at step 1450: 0.332784\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 82.0%\n", "Minibatch loss at step 1500: 0.748170\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 81.8%\n", "Minibatch loss at step 1550: 0.876032\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 81.9%\n", "Minibatch loss at step 1600: 1.237206\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 81.4%\n", "Minibatch loss at step 1650: 0.727465\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 81.5%\n", "Minibatch loss at step 1700: 0.889423\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 81.2%\n", "Minibatch loss at step 1750: 0.521735\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 82.3%\n", "Minibatch loss at step 1800: 0.735978\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 82.1%\n", "Minibatch loss at step 1850: 1.355421\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 81.9%\n", "Minibatch loss at step 1900: 0.470354\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 81.5%\n", "Minibatch loss at step 1950: 1.045202\n", "Minibatch accuracy: 56.2%\n", "Validation accuracy: 83.1%\n", "Minibatch loss at step 2000: 0.212505\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 82.2%\n", "Minibatch loss at step 2050: 0.946967\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 83.1%\n", "Minibatch loss at step 2100: 0.289381\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 82.5%\n", "Minibatch loss at step 2150: 0.503749\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 83.1%\n", "Minibatch loss at step 2200: 0.464120\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 83.2%\n", "Minibatch loss at step 2250: 0.684509\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 83.7%\n", "Minibatch loss at step 2300: 0.723519\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 82.7%\n", "Minibatch loss at step 2350: 0.435728\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 83.4%\n", "Minibatch loss at step 2400: 0.848848\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 83.5%\n", "Minibatch loss at step 2450: 0.610091\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 84.2%\n", "Minibatch loss at step 2500: 0.973621\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 83.9%\n", "Minibatch loss at step 2550: 0.680162\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 83.7%\n", "Minibatch loss at step 2600: 0.277876\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 84.0%\n", "Minibatch loss at step 2650: 0.717264\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 83.9%\n", "Minibatch loss at step 2700: 0.608532\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 83.9%\n", "Minibatch loss at step 2750: 1.166722\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 83.5%\n", "Minibatch loss at step 2800: 0.666192\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.4%\n", "Minibatch loss at step 2850: 0.112893\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 83.5%\n", "Minibatch loss at step 2900: 0.545045\n", "Minibatch 
accuracy: 81.2%\n", "Validation accuracy: 83.9%\n", "Minibatch loss at step 2950: 0.628278\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.4%\n", "Minibatch loss at step 3000: 0.785475\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 83.9%\n", "Minibatch loss at step 3050: 0.930228\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.3%\n", "Minibatch loss at step 3100: 0.791880\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.4%\n", "Minibatch loss at step 3150: 0.641446\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.5%\n", "Minibatch loss at step 3200: 0.614943\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.0%\n", "Minibatch loss at step 3250: 0.768515\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 83.3%\n", "Minibatch loss at step 3300: 0.164835\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 84.5%\n", "Minibatch loss at step 3350: 0.500897\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.5%\n", "Minibatch loss at step 3400: 1.111665\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.7%\n", "Minibatch loss at step 3450: 0.670814\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.2%\n", "Minibatch loss at step 3500: 0.433042\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.6%\n", "Minibatch loss at step 3550: 0.315746\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 84.5%\n", "Minibatch loss at step 3600: 0.355311\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 84.4%\n", "Minibatch loss at step 3650: 0.756499\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 84.8%\n", "Minibatch loss at step 3700: 1.074692\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 84.5%\n", "Minibatch loss at step 3750: 0.792253\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 85.0%\n", "Minibatch loss at step 3800: 0.049834\n", "Minibatch accuracy: 100.0%\n", "Validation accuracy: 84.9%\n", "Minibatch loss at step 3850: 0.917343\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.2%\n", "Minibatch loss at step 3900: 0.633758\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.8%\n", "Minibatch loss at step 3950: 0.091348\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 84.9%\n", "Minibatch loss at step 4000: 0.386830\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 84.8%\n", "Minibatch loss at step 4050: 0.664441\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 84.9%\n", "Minibatch loss at step 4100: 0.818731\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.3%\n", "Minibatch loss at step 4150: 1.098946\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 85.2%\n", "Minibatch loss at step 4200: 0.427599\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 85.0%\n", "Minibatch loss at step 4250: 0.701067\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.1%\n", "Minibatch loss at step 4300: 0.915477\n", "Minibatch accuracy: 75.0%\n", "Validation accuracy: 85.0%\n", "Minibatch loss at step 4350: 0.301903\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.5%\n", "Minibatch loss at step 4400: 1.344068\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 85.0%\n", "Minibatch loss at step 4450: 0.401401\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.4%\n", "Minibatch loss at step 4500: 0.523591\n", "Minibatch accuracy: 87.5%\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Validation accuracy: 85.3%\n", "Minibatch loss at step 4550: 0.539713\n", 
"Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.3%\n", "Minibatch loss at step 4600: 0.340253\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.7%\n", "Minibatch loss at step 4650: 1.693758\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.5%\n", "Minibatch loss at step 4700: 0.515037\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.4%\n", "Minibatch loss at step 4750: 1.176289\n", "Minibatch accuracy: 62.5%\n", "Validation accuracy: 85.6%\n", "Minibatch loss at step 4800: 0.575342\n", "Minibatch accuracy: 81.2%\n", "Validation accuracy: 85.6%\n", "Minibatch loss at step 4850: 0.411686\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.7%\n", "Minibatch loss at step 4900: 0.197935\n", "Minibatch accuracy: 93.8%\n", "Validation accuracy: 85.7%\n", "Minibatch loss at step 4950: 0.303288\n", "Minibatch accuracy: 87.5%\n", "Validation accuracy: 85.8%\n", "Minibatch loss at step 5000: 1.693749\n", "Minibatch accuracy: 68.8%\n", "Validation accuracy: 85.5%\n", "Test accuracy: 91.9%\n" ] } ], "source": [ "num_steps = 5001\n", "\n", "with tf.Session(graph=graph) as session:\n", " tf.initialize_all_variables().run()\n", " print('Initialized')\n", " for step in range(num_steps):\n", " offset = (step * batch_size) % (train_labels.shape[0] - batch_size)\n", " batch_data = train_dataset[offset:(offset + batch_size), :, :, :]\n", " batch_labels = train_labels[offset:(offset + batch_size), :]\n", " feed_dict = {tf_train_dataset : batch_data, tf_train_labels : batch_labels}\n", " _, l, predictions = session.run(\n", " [optimizer, loss, train_prediction], feed_dict=feed_dict)\n", " if (step % 50 == 0):\n", " print('Minibatch loss at step %d: %f' % (step, l))\n", " print('Minibatch accuracy: %.1f%%' % accuracy(predictions, batch_labels))\n", " print('Validation accuracy: %.1f%%' % accuracy(\n", " valid_prediction.eval(), valid_labels))\n", " print('Test accuracy: %.1f%%' % accuracy(test_prediction.eval(), test_labels))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Well, the accuracy is worst. This net has many meta parameters and I don't feel comfortable in tuning them randomly. I should probably change the depth and make it different between the layers, since it looks like the increasing number of feature maps is a key design item." ] } ], "metadata": { "colab": { "default_view": {}, "name": "4_convolutions.ipynb", "provenance": [], "version": "0.3.2", "views": {} }, "kernelspec": { "display_name": "Python 2", "language": "python", "name": "python2" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 2 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython2", "version": "2.7.14" } }, "nbformat": 4, "nbformat_minor": 1 }