{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "### EffortlessAI: Multilayer Perceptron Tool\n", "*Version 1.2*\n", "*Last Updated: 2/6/2018* \n", "*Programmed By:* Daniel DiPietro \n", " \n", "Supports: \n", "* Loading/saving previous neural networks\n", "* Training via gradient descent\n", "* Custom amounts of layers and nodes\n", "* Visual representation of weights via matplotlib\n", "* Custom epoch progression\n", "* Training sets/Multiple queries\n", "* Viewing/querying previous epochs" ] }, { "cell_type": "code", "execution_count": 265, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#Imports\n", "import numpy as np\n", "import matplotlib.pyplot as plt\n", "import csv\n", "import json\n", "import copy\n", "import scipy.special\n", "import ipywidgets as widgets\n", "from IPython.display import Markdown, display, clear_output\n", "\n", "np.set_printoptions(threshold=np.nan)" ] }, { "cell_type": "code", "execution_count": 266, "metadata": { "scrolled": false }, "outputs": [ { "data": { "text/markdown": [ "### ** Console**" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "f96e3b5b30664944acb78f42546a4be8", "version_major": 2, "version_minor": 0 }, "text/plain": [ "A Jupyter Widget" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "f7bbda91b74c49198618b787a4d754b6", "version_major": 2, "version_minor": 0 }, "text/plain": [ "A Jupyter Widget" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "ef88a69f7da34d31b21668496742c93a", "version_major": 2, "version_minor": 0 }, "text/plain": [ "A Jupyter Widget" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "d6038a838db14d61a8bda27379a71a23", "version_major": 2, "version_minor": 0 }, "text/plain": [ "A Jupyter Widget" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "a18eb6499b854e2a97287e3c0ccf8f10", "version_major": 2, "version_minor": 0 }, "text/plain": [ "A Jupyter Widget" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "7d07352f2f0d4ee9abed6befb345f2af", "version_major": 2, "version_minor": 0 }, "text/plain": [ "A Jupyter Widget" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "29b2696793e04fbe911c1b1c647a1a32", "version_major": 2, "version_minor": 0 }, "text/plain": [ "A Jupyter Widget" ] }, "metadata": {}, "output_type": "display_data" }, { "data": { "application/vnd.jupyter.widget-view+json": { "model_id": "c618f4ea9d3944dfad51c8e02fdd322c", "version_major": 2, "version_minor": 0 }, "text/plain": [ "A Jupyter Widget" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "#Initializes console and output widgets\n", "def printmd(string):\n", " display(Markdown(string))\n", "\n", "printmd('### ** Console**')\n", "console = widgets.Output(\n", " readout=True,\n", " readout_format='d',\n", " layout=widgets.Layout(height='150px', border='solid 1px')\n", ")\n", "display(console)\n", " \n", "out = widgets.Output()\n", "with out: #Nerual network initialization tools\n", " printmd('### **New Neural Network Initialization**')\n", "display(out)\n", "\n", "load = 
widgets.Output()\n", "with load: #Neural network loading tools\n", " printmd('### **Load a Neural Network**')\n", "display(load)\n", " \n", "out2 = widgets.Output() #Neural network interaction tools\n", "display(out2)\n", "\n", "out3 = widgets.Output() #Subplots for NN weights\n", "display(out3)\n", "\n", "out4 = widgets.Output() #Querying\n", "display(out4)\n", "\n", "out5 = widgets.Output() #Training\n", "display(out5)\n", "\n", "out6 = widgets.Output() #Saving\n", "display(out6)\n", "\n", "#Declares neural network as global so that it can be easily accessed\n", "global n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**-------------------------------------------------**\n", "### Documentation\n", "#### New Neural Network Initialization\n", "* Users are able to quickly initialize their own neural networks through seven widgets.\n", "* Number of Hidden Layers Widget:\n", " * Describes the number of layers between the input layer and output layer. As of now, we only support a maximum of 3 hidden layers\n", "* Number of Nodes for Input Layer:\n", " * Must match with the size of the inputs used to query and train the neural network\n", "* Number of Nodes per Hidden Layer:\n", " * As this number increases, the ability of the neural network to recognize increasingly complicated non-linear patterns does as well.\n", " * As of now, all hidden layers must have the same number of nodes.\n", "* Number of Nodes for Output Layer:\n", " * Must match with the size of the targets used to train the neural network\n", "* Initial Weight Generation Method:\n", " * Users can have weights randomly generated between -0.5 and +0.5.\n", " * Users can have weights generated via a normal distribution where the mean occurs at zero and the standard deviation is equal to ${\\frac {1}{\\sqrt{incoming\\space links}}}$\n", "* Activation Function:\n", " * Users can select between a sigmoid (${\\frac {1}{1+e^{-x}}}$) and hyperbolic tangent activation function.\n", "* Learning Rate:\n", " * The most popular learning rates used for multi-layer perceptron training are included, ranging from 0.00001 to 10.\n", "\n", "#### Loading a Neural Network\n", "* In order to load neural network, users must first paste the JSON object data of a previously saved EffortlessAI neural network into the \"JSON\" text box.\n", "* Next, they must place the weights array of a previously saved EffortlessAI neural network into the \"weights\" text box. 
\n", "\n", "#### Neural Network Viewing\n", "* Users have access to a display that lists current characterics of the neural network and updates dynamically.\n", "* Users are able to visualize the weights of previous epochs and subsequent learning progression of a neural network via a slider that generates weight-modeling graphics for each epoch.\n", "\n", "#### Querying\n", "* Querying refers to the propagation of inputs through the neural network in order to calculate their respective outputs.\n", "* Users are able to query multiple sets at a time.\n", "* Formatting:\n", " * [[first query], [second query], [third query], [nth query]] where each query is a comma-separated list of numbers\n", " * *Example (for a neural network with 3 input nodes):* [[1, 3, 4], [5, 2, 7], [1, 0.9, 4]]\n", " * Outputs are structured in the same way as query inputs.\n", " * *Example:* [[Output of first query], [Output of second query], [Output of third query]]\n", "* Users are able to query either the most recent epoch of the neural network or a previous epoch.\n", " * In order to query the most recent epoch, ensure that \"Epoch to Query\" is set to \"Most Recent Epoch\"\n", " * In order to query a previous epoch, set the \"Currently Selected Epoch\" slider in \"Currently Loaded Neural Network\" section to your desired epoch. Then, make sure that \"Epoch to Query\" is set to \"Currently Selected Epoch\"\n", " \n", "#### Training\n", "* Training data is structured identically to queries. However, both inputs and targets must be specified.\n", "* Formatting:\n", " * Inputs: [[first set of inputs], [second set of inputs], [third set of inputs], [nth set of inputs]]\n", " * *Example (for a neural network with 3 input nodes):* [[1, 3, 4], [5, 2, 7], [1, 0.9, 4]]\n", " * Targets: [[first set of targets], [second set of targets], [third set of targets], [nth set of targets]]\n", " * *Example (for a neural network with 3 output nodes):* [[0.15, 0.93, 0.24], [0.45, 0.192, 0.37], [0.11, 0.89, 0.114]]\n", "* The number of training iterations can be set via an integer slider\n", " * For a large number of training iterations, the screen may appear to temporarily freeze while calculations are being performed. This is normal.\n", "\n", "#### Saving\n", "* As of now, saving works but is relatively messy and in its fledgling stages.\n", "* Neural networks with thousands of training epochs contain an extremely large amount of data.\n", " * Users are able to select the range of epochs that they would like to save. For example, if a user trained their neural network with 10,000 iterations of training data, they may only want to save epochs 9,000 through 10,000; this will give them a file that is nearly 1/10th the size of what it would be.\n", "* After setting the desired epoch saving settings, click the \"Save\" button.\n", "* An algorithm will generate a JSON object that stores the characteristics of the neural network. The weights will be outputted in a specially formatted array.\n", "* Save both the JSON object and weights array on your local computer. They can be pasted into the \"Loading a Neural Network\" section later on.\n", "\n", "#### Known Bugs\n", "* If initializing a new neural network while one is currently loaded, the subsequent weight graphics will have a delayed loading time when switching between epochs. This can be avoided by restrating the kernal whenever dealing with a new neural network.\n", "* Occassionally extra weight visualization graphics will appear after initialization. 
This is relatively uncommon, but if it happens, refresh the page.\n", "\n", "#### Upcoming Features\n", "* Including a converter that turns CSV files into properly formatted queries/training data\n", "* Improved save/load format\n", "* Ability to generate animations that demonstrate learning\n", "* Ability to create interactive charts that demonstrate performance in relation to test data for selected epochs\n", "* Ability to customize the normal generation weight generation method\n", "* More options for learning rates\n", "* More options for activation functions\n", "* Improved console messages\n", "* Optimized code; should speed up this software significantly\n", "* More interpolations for weight-representing graphics" ] }, { "cell_type": "code", "execution_count": 267, "metadata": { "collapsed": true, "scrolled": true }, "outputs": [], "source": [ "#Initializes Neural Network Initialization Widgets\n", "\n", "#Number of Hidden Layers Int Slider Widget & Associated Label\n", "LWSlider = widgets.IntSlider(\n", " value=1,\n", " min=1,\n", " max=3,\n", " step=1,\n", " disabled=False,\n", " continuous_update=False,\n", " orientation='horizontal',\n", " readout=True,\n", " readout_format='d')\n", "LayersWidget = widgets.HBox([\n", " widgets.Label(\n", " value=\"Number of Hidden Layers:\",\n", " layout=widgets.Layout(width='250px')),\n", " LWSlider\n", "])\n", "\n", "#Nodes Per Input Layer Int Slider Widget & Associated Label\n", "NPILSlider = widgets.IntSlider(\n", " value=200,\n", " min=3,\n", " max=1000,\n", " step=1,\n", " disabled=False,\n", " continuous_update=False,\n", " orientation='horizontal',\n", " readout=True,\n", " readout_format='d')\n", "NodesPerInputLayerWidget = widgets.HBox([\n", " widgets.Label(\n", " value=\"Number of Nodes for Input Layer:\",\n", " layout=widgets.Layout(width='250px')),\n", " NPILSlider\n", "])\n", "\n", "#Nodes Per Hidden Layer Int Slider Widget & Associated Label\n", "NPHLSlider = widgets.IntSlider(\n", " value=200,\n", " min=3,\n", " max=1000,\n", " step=1,\n", " disabled=False,\n", " continuous_update=False,\n", " orientation='horizontal',\n", " readout=True,\n", " readout_format='d')\n", "NodesPerHiddenLayerWidget = widgets.HBox([\n", " widgets.Label(\n", " value=\"Number of Nodes per Hidden Layer:\",\n", " layout=widgets.Layout(width='250px')),\n", " NPHLSlider\n", "])\n", "\n", "#Nodes Per Output Layer Int Slider Widget & Associated Label\n", "NPOLSlider = widgets.IntSlider(\n", " value=200,\n", " min=3,\n", " max=1000,\n", " step=1,\n", " disabled=False,\n", " continuous_update=False,\n", " orientation='horizontal',\n", " readout=True,\n", " readout_format='d')\n", "NodesPerOutputLayerWidget = widgets.HBox([\n", " widgets.Label(\n", " value=\"Number of Nodes for Output Layer:\",\n", " layout=widgets.Layout(width='250px')),\n", " NPOLSlider\n", "])\n", "\n", "#Weight Generation Method Radio Button Widget & Associated Label\n", "WGMRadioButton = widgets.RadioButtons(\n", " options=['Random between -0.5 and +0.5', 'N(μ,σ^2) for σ^2=1/sqrt(incoming links)'],\n", " value='Random between -0.5 and +0.5',\n", " disabled=False,)\n", "WeightGenerationMethodWidget = widgets.HBox([\n", " widgets.Label(\n", " value=\"Initial Weight Generation Method:\",\n", " layout=widgets.Layout(width='250px')),\n", " WGMRadioButton\n", "])\n", "\n", "#Activation Function Radio Button Widget & Associated Label\n", "AFRadioButton = widgets.RadioButtons(\n", " options=['Sigmoid', 'Tanh'],\n", " value='Sigmoid',\n", " disabled=False,)\n", "ActivationFunctionWidget = 
widgets.HBox([\n", " widgets.Label(\n", " value=\"Activation Function:\",\n", " layout=widgets.Layout(width='250px')),\n", " AFRadioButton\n", "])\n", "\n", "#Learning Rate Dropdown Widget\n", "LRDropdown = widgets.Dropdown(\n", " options=[0.00001, 0.0001, 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0],\n", " value=0.03,\n", " disabled=False,)\n", "LearningRateWidget = widgets.HBox([\n", " widgets.Label(\n", " value=\"Learning Rate:\"),\n", " LRDropdown\n", "])\n", "\n", "#Button to initialize a new neural network\n", "SubmitNNWidget = widgets.Button(\n", " description='Initialize',\n", " disabled=False,\n", " button_style='info',\n", " tooltip='Initialize Neural Network',\n", " icon='',\n", ")" ] }, { "cell_type": "code", "execution_count": 268, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#Groups and Displays Neural Network Initialization Widgets\n", "left_box = widgets.VBox([LayersWidget, NodesPerInputLayerWidget, NodesPerHiddenLayerWidget, NodesPerOutputLayerWidget])\n", "right_box = widgets.VBox([WeightGenerationMethodWidget, ActivationFunctionWidget, LearningRateWidget])\n", "outputs = widgets.HBox([left_box, right_box])\n", "outputs2 = widgets.HBox([SubmitNNWidget])\n", "\n", "#Prints initialization widgets into the correct output widget\n", "with out:\n", " display(outputs)\n", " display(outputs2)" ] }, { "cell_type": "code", "execution_count": 269, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#Rich text widget that will intake the JSON data\n", "LoadInput1 = widgets.Textarea(\n", " value=\"\",\n", " placeholder='',\n", " description='JSON:',\n", " disabled=False,\n", " layout = widgets.Layout(width='400px')\n", ")\n", "\n", "#Rich text input that will intake the weights data\n", "LoadInput2 = widgets.Textarea(\n", " value=\"\",\n", " placeholder='',\n", " description='Weights:',\n", " disabled=False,\n", " layout = widgets.Layout(width='400px')\n", ")\n", "\n", "#Button to load the inputted neural network\n", "LoadNNWidget = widgets.Button(\n", " description='Load NN',\n", " disabled=False,\n", " button_style='info',\n", " tooltip='Load NN',\n", " icon='',\n", " layout=widgets.Layout(margin='0px 0px 0px 0px')\n", ")\n", "\n", "#Formats and displays widgets\n", "LoadBox = widgets.HBox([LoadInput1, LoadInput2])\n", "LoadBoxLower = widgets.HBox([LoadNNWidget])\n", "\n", "with load:\n", " display(LoadBox, LoadBoxLower)" ] }, { "cell_type": "code", "execution_count": 270, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#Neural Network Class Definition\n", "class neuralNetwork():\n", " \n", " #initialization function\n", " def __init__(self, status, layers, inputnodes, hiddennodes, outputnodes, learningrate, activationfunction, weightinitialization):\n", " \n", " #set number of nodes in input, hidden, and output layers\n", " self.inodes = inputnodes\n", " self.hnodes = hiddennodes\n", " self.onodes = outputnodes\n", " self.epochs = 1\n", " self.actfunction = activationfunction\n", " self.methodgen = weightinitialization\n", "\n", " #number of hidden layers\n", " self.layers = layers\n", "\n", " #learning rate\n", " self.lr = learningrate\n", "\n", " #activation function\n", " if(activationfunction == \"Sigmoid\"):\n", " self.activation_function = lambda x: scipy.special.expit(x)\n", " elif(activationfunction == \"Tanh\"):\n", " self.activation_function = lambda x: np.tanh(x)\n", "\n", " #weights initialization\n", " #the weights array is made up of additional arrays representing the weights for each epoch\n", " #inside each epoch 
array is another set of arrays, where the indeces (low to high) represents the synpapses (left to right)\n", " #weights inside those arrays are w_i_j, where link is from node i to node j in the next layer\n", " #w11 w21\n", " #w12 w22 etc.\n", " \n", " #For a newly initialized neural network\n", " if(status == \"new\"):\n", " \n", " self.weights = [[]]\n", "\n", " if(weightinitialization=='Random between -0.5 and +0.5'):\n", "\n", " #weights between input and hidden layer\n", " self.weights[0].append((np.random.rand(self.hnodes, self.inodes) - 0.5)) \n", " #rows = number of hidden nodes; columns = number of input nodes\n", "\n", " #adds hidden layer weights\n", " for x in range(layers-1):\n", " self.weights[0].append((np.random.rand(self.hnodes, self.hnodes) - 0.5))\n", "\n", " #weights between hidden and output layer\n", " self.weights[0].append((np.random.rand(self.onodes, self.hnodes) - 0.5)) \n", " \n", " if(weightinitialization=='N(μ,σ^2) for σ^2=1/sqrt(incoming links)'): #samples a normal distribution where the mean is zero and the st. dev is the 1/sqrt(incoming links)\n", " #weights between input and hidden layer\n", " self.weights[0].append((np.random.normal(0.0, pow(self.hnodes, -0.5), (self.hnodes, self.inodes)))) \n", "\n", " #adds hidden layer weights\n", " for x in range(layers-1):\n", " self.weights[0].append((np.random.normal(0.0, pow(self.hnodes, -0.5), (self.hnodes, self.hnodes))))\n", "\n", " #weights between hidden and output layer\n", " self.weights[0].append((np.random.normal(0.0, pow(self.onodes, -0.5), (self.onodes, self.hnodes))))\n", " print(self.weights)\n", " \n", " #initializes widgets for viewing and interaction\n", " createInteract(self)\n", " \n", " #Called if a neural network is being reloaded rather than initialized from scratch\n", " elif(status==\"load\"):\n", " \n", " #Sets temporary variable equal to weights\n", " temp = str(LoadInput2.value)\n", " \n", " #Cleans up string\n", " temp = temp.replace('[', '')\n", " temp = temp.replace(']', '')\n", " temp = temp.replace('array', '')\n", " temp = temp.replace('(', '')\n", " temp = temp.replace('', '')\n", " templist = temp.split(',')\n", " \n", " #will be used later on\n", " templist2 = []\n", " \n", " #makes it easier to divide the list into epochs\n", " sizeofeachepoch = self.inodes*self.hnodes + self.hnodes*self.hnodes*(self.layers-1) + self.hnodes*self.onodes\n", " \n", " #adds epoch dimension\n", " for x in range(0, len(templist)):\n", " #adds dimensions for each epoch\n", " if (x%(sizeofeachepoch)==0 and x != 1):\n", " templist2.append(templist[x:x+sizeofeachepoch])\n", " \n", " #breaks up each epoch dimension into 2D numpy arrays representing the weights between each layer\n", " #This function is extremely messy and ineffecient; it will be optimized in the near future\n", " for x in range(0, len(templist2)):\n", " sublist = []\n", " if layers==1:\n", " sublist.append((np.array(templist2[x][0:self.inodes*self.hnodes]).reshape(self.hnodes, self.inodes)).astype('float_'))\n", " sublist.append((np.array(templist2[x][self.inodes*self.hnodes:self.inodes*self.hnodes+self.hnodes*self.onodes]).reshape(self.onodes, self.hnodes)).astype('float_'))\n", " if layers==2:\n", " sublist.append((np.array(templist2[x][0:self.inodes*self.hnodes]).reshape(self.hnodes, self.inodes)).astype('float_'))\n", " sublist.append((np.array(templist2[x][self.inodes*self.hnodes:self.hnodes*self.hnodes+self.inodes*self.hnodes]).reshape(self.hnodes, self.hnodes)).astype('float_'))\n", " 
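#weights between the last hidden layer and the output layer\n", "                    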
sublist.append((np.array(templist2[x][self.hnodes*self.hnodes+self.inodes*self.hnodes:self.hnodes*self.hnodes+self.inodes*self.hnodes+self.hnodes*self.onodes]).reshape(self.onodes, self.hnodes)).astype('float_'))\n", " elif layers==3:\n", " sublist.append((np.array(templist2[x][0:self.hnodes*self.hnodes]).reshape(self.hnodes, self.hnodes)).astype('float_'))\n", " sublist.append((np.array(templist2[x][self.inodes*self.hnodes:self.hnodes*self.hnodes+self.inodes*self.hnodes]).reshape(self.hnodes, self.hnodes)).astype('float_'))\n", " sublist.append((np.array(templist2[x][self.inodes*self.hnodes + self.hnodes*self.hnodes+self.inodes*self.hnodes:self.hnodes*self.onodes+self.inodes*self.hnodes + self.hnodes*self.hnodes+self.inodes*self.hnodes]).reshape(self.onodes, self.hnodes)).astype('float_'))\n", " sublist.append((np.array(templist2[x][self.hnodes*self.onodes+self.inodes*self.hnodes + self.hnodes*self.hnodes+self.inodes*self.hnodes:self.hnodes*self.onodes+self.inodes*self.hnodes + self.hnodes*self.hnodes+self.inodes*self.hnodes+self.hnodes*self.onodes]).reshape(self.onodes, self.hnodes)).astype('float_'))\n", "\n", " templist2[x] = sublist\n", " \n", " self.weights = templist2\n", " \n", " #sets epoch variable\n", " self.epochs = len(self.weights)\n", " \n", " #updates viewing widgets\n", " EpochSlider.max = self.epochs\n", " EpochSaves.max = self.epochs\n", " Epochs.value = 'Number of Epochs (1 = initialization): ' + str(self.epochs)\n", " \n", " #initializes widgets for viewing and interaction \n", " createInteract(self)\n", " \n", " #Inputs_list should be an array containing arrays representing individual training inputs\n", " #Note: any comments labeled with 'test' are included so that training can be visually tested if needed; just remove the '#'\n", " def train(self, inputs_list, targets_list):\n", " \n", " #test: print(\"First query: \" + str(self.query([[5, 2, 4]])))\n", " \n", " #convert inputs list to 2d array\n", " inputs = []\n", " targets = []\n", " \n", " for x in range(len(inputs_list)):\n", " inputs.append(np.array(inputs_list[x], ndmin=2).T)\n", " targets.append(np.array(targets_list[x], ndmin=2).T)\n", " \n", " #test: print(\"Inputs: \" + str(inputs))\n", " #test: print(\"Targets: \" + str(targets))\n", " \n", " #copy function prevents changes to newweightsarray from affecting the actual weights of the previous epoch\n", " newweightsarray = copy.copy(self.weights[self.epochs-1])\n", " \n", " #test: print(\"New Weights Array: \" + str(newweightsarray))\n", " \n", " #Iterates through for the entire training set and then only increases the epoch once.\n", " for z in range(len(inputs)):\n", " inputstorage = []\n", " outputstorage = []\n", " errorstorage = []\n", "\n", " #creates an array containing the inputs into each layer of the NN with the given training inputs\n", " #index 0 => input into the first hidden layer\n", " #highest index => input into the output layer\n", " for x in range(len(self.weights[0])):\n", " if x==0:\n", " inputstorage.append(np.dot(newweightsarray[x], inputs[z]))\n", " else:\n", " inputstorage.append(np.dot(newweightsarray[x], self.activation_function(inputstorage[x-1])))\n", " \n", " #test: print(\"Inputs Storage:\" + str(inputstorage))\n", " \n", " #creates an array containing the output from each layer of the NN with the given training inputs\n", " #index 0 => input from the first hidden layer\n", " #highest index => input into the output layer\n", " for x in range(len(self.weights[0])):\n", " 
outputstorage.append(self.activation_function(inputstorage[x]))\n", " \n", " #test: print(\"Output Storage:\" + str(outputstorage))\n", " \n", " #creates an array the holds the errors for each layer\n", " #index 0 => errors for the first hidden layer\n", " #highest index => errors for the output layer\n", " #the error for layer n is: the dot product of the transposed weights between layer n and n+1 and the error of layer n+1\n", "\n", " #output errors are just targets - outputs of the output array, so this is done outside of the loop to avoid executing\n", " #repretitive boolean functions for no reason\n", " errorstorage.append(targets[z] - outputstorage[len(outputstorage)-1])\n", "\n", " for x in range(len(self.weights[0])-1, 0, -1):\n", " errorstorage.insert(0, np.dot(newweightsarray[x].T, errorstorage[0]))\n", " \n", " #test: print(\"Error Storage:\" + str(errorstorage))\n", " \n", " change = []\n", " \n", " #finds the new weights by using gradient descent\n", " for x in range(len(newweightsarray)):\n", " if x==0: #between input and hidden\n", " change.append(self.lr * np.dot((errorstorage[x] * outputstorage[x] * (1.0 - outputstorage[x])), np.transpose(inputs[z]))) \n", " else: #between hidden and output\n", " change.append(self.lr * np.dot((errorstorage[x] * outputstorage[x] * (1.0 - outputstorage[x])), np.transpose(outputstorage[x-1])))\n", " \n", " #test: print(\"Change:\" + str(change))\n", " \n", " for x in range(len(newweightsarray)):\n", " change[x] += newweightsarray[x]\n", " \n", " newweightsarray = copy.copy(change)\n", " \n", " #test: print(\"New Weights:\" + str(newweightsarray))\n", " \n", " #increments epoch\n", " self.epochs = self.epochs + 1\n", " \n", " #adds the array to hold the new epoch\n", " self.weights.append([])\n", "\n", " #fills the previously added array with the weights between each layer\n", " for x in range(len(newweightsarray)):\n", " self.weights[self.epochs-1].append(newweightsarray[x])\n", " \n", " EpochSlider.max = self.epochs\n", " EpochSaves.max = self.epochs\n", " Epochs.value = 'Number of Epochs (1 = initialization): ' + str(self.epochs)\n", " \n", " def query(self, inputs_list, epoch):\n", " \n", " inputs = []\n", "\n", " #convert inputs_list into 2d array\n", " for z in range(len(inputs_list)):\n", " inputs.append(np.array(inputs_list[z], ndmin=2).T)\n", " #ndmin is a numpy parameter specifying the minimum number of dimensions and array shold have\n", " #.T transposes the array; turns first column into first row, second column into second row\n", " \n", " outputlist = []\n", " \n", " for z in range(len(inputs)):\n", " #storage array for output after query\n", " outputarray = inputs[z]\n", "\n", " for x in range(0, len(self.weights[epoch-1])):\n", " if x==0:\n", " outputarray = np.dot(self.weights[epoch-1][x], inputs[z])\n", " outputarray = self.activation_function(outputarray)\n", " else:\n", " outputarray = np.dot(self.weights[epoch-1][x], outputarray)\n", " outputarray = self.activation_function(outputarray)\n", "\n", " outputlist.append(outputarray)\n", " \n", " return outputlist\n", " \n", " #Converts the characteristics of neural network in JSON format\n", " def toJSON(self, lower, upper):\n", " data = {}\n", " data['inodes'] = self.inodes\n", " data['hnodes'] = self.hnodes\n", " data['onodes'] = self.onodes\n", " data['actfunction'] = self.actfunction\n", " data['methodgen'] = self.methodgen\n", " data['layers'] = self.layers\n", " data['lr'] = self.lr\n", " return(json.dumps(data))\n", " \n", " #Returns the weights of neural network\n", " 
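#for epochs lower through upper (1-indexed, inclusive); used by the Save section\n", "    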
def printWeights(self, lower, upper):\n", " outputdata = []\n", " for x in range(lower-1, upper):\n", " outputdata.append(self.weights[x])\n", " return(outputdata)\n", " " ] }, { "cell_type": "code", "execution_count": 271, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#Function that is called when the initialization button is clicked\n", "def InitializeNN(b):\n", " n = neuralNetwork(\"new\", LWSlider.value, NPILSlider.value, NPHLSlider.value, NPOLSlider.value, LRDropdown.value, AFRadioButton.value, WGMRadioButton.value)\n", " with console:\n", " print(\"Neural Network Successfully Initialized:\")\n", " print('\\t* Number of Hidden Layers: ' + str(LWSlider.value))\n", " print('\\t* Number of Nodes for Input Layer: ' + str(NPILSlider.value))\n", " print('\\t* Number of Nodes per Hidden Layer: ' + str(NPHLSlider.value))\n", " print('\\t* Number of Nodes for Output Layer: ' + str(NPOLSlider.value))\n", " print('\\t* Learning Rate: ' + str(LRDropdown.value))\n", " print('\\t* Activation Function: ' + AFRadioButton.value)\n", " print('\\t* Method of Initial Weight Generation: ' + WGMRadioButton.value)\n", " \n", "#Links initialization button to initialization function\n", "SubmitNNWidget.on_click(InitializeNN)\n", "\n", "#Function that is called when the loading button is clicked\n", "def LoadNN(b):\n", " try:\n", " NNJSON = json.loads(str(LoadInput1.value))\n", " inodes = NNJSON['inodes']\n", " hnodes = NNJSON['hnodes']\n", " onodes = NNJSON['onodes']\n", " methodgen = NNJSON['methodgen']\n", " actfunction = NNJSON['actfunction']\n", " layers = NNJSON['layers']\n", " lr = NNJSON['lr']\n", " try:\n", " n = neuralNetwork(\"load\", layers, inodes, hnodes, onodes, lr, actfunction, methodgen)\n", " except:\n", " with console:\n", " print(\"Improper weights format.\")\n", " except:\n", " with console:\n", " print(\"Improper JSON format.\")\n", "\n", "#Links loading button to loading function\n", "LoadNNWidget.on_click(LoadNN)" ] }, { "cell_type": "code", "execution_count": 272, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#Initializes Neural Network Interaction Widgets\n", "\n", "#Placed on outside for easy access by other code blocks. Doesn't really matter so long as its display is in the function\n", "EpochSlider = widgets.IntSlider(\n", " value=1,\n", " min=1,\n", " max=1,\n", " step=1,\n", " disabled=False,\n", " continuous_update=False,\n", " orientation='horizontal',\n", " readout=True,\n", " readout_format='d',\n", " layout=widgets.Layout(width='400px'))\n", "\n", "#Number of Trained Epochs. 
Created out here so that it can be easily accessed by other code blocks (not possible if in function)\n", "Epochs = widgets.Label(value = 'Number of Epochs (1 = initialization): 1')\n", "\n", "#Creates and displays widgets used in viewing the current neural network\n", "def createInteract(neuralnetwork):\n", " #Clears previous output in order to avoid duplicate outputs\n", " with out2:\n", " clear_output()\n", " printmd('### **Currently Loaded Neural Network**')\n", " \n", " #Widget for number of hidden layers of current NN\n", " HiddenLayers = widgets.Label(value = 'Number of Hidden Layers: ' + str(neuralnetwork.layers), layout=widgets.Layout(width='400px'))\n", " \n", " #Widget for number of input nodes of current NN\n", " InputNodes = widgets.Label(value = 'Number of Input Nodes: ' + str(neuralnetwork.inodes))\n", " \n", " #Widget for number of output nodes of current NN\n", " OutputNodes = widgets.Label(value = 'Number of Output Nodes: ' + str(neuralnetwork.onodes))\n", " \n", " #Widget for number of hidden nodes of current NN\n", " HiddenNodes = widgets.Label(value = 'Number of Nodes per Hidden Layer: ' + str(neuralnetwork.hnodes))\n", " \n", " #Widget for learning rate of current NN\n", " LearningRate = widgets.Label(value = 'Learning Rate: ' + str(neuralnetwork.lr))\n", " \n", " #Widget for activation function of current NN\n", " ActivationFunction = widgets.Label(value = 'Activation Function: ' + str(neuralnetwork.actfunction))\n", " \n", " #Widget for method of generation of current NN\n", " Methodgen = widgets.Label(value = 'Method of Initial Weight Generation: ' + str(neuralnetwork.methodgen))\n", " \n", " #groups widgets into a left-side box\n", " left_box1 = widgets.VBox([HiddenLayers, InputNodes, OutputNodes, HiddenNodes])\n", " \n", " #groups widgets into a right-side box\n", " right_box1 = widgets.VBox([LearningRate, ActivationFunction, Methodgen, Epochs])\n", " \n", " #groups left_box1 and right_box1 into a centered box for output\n", " outputs1 = widgets.HBox([left_box1, right_box1])\n", " \n", " #Select epoch to view weights of previous epochs\n", " EpochSlider.max=neuralnetwork.epochs\n", " EpochUp = widgets.Button(description = \"+\", button_style='info')\n", " EpochDown = widgets.Button(description = \"-\", button_style='info')\n", " EpochWidget = widgets.HBox([\n", " widgets.Label(\n", " value=\"Currently Selected Epoch:\",\n", " layout=widgets.Layout(width='200px')),\n", " EpochSlider,\n", " EpochDown,\n", " EpochUp\n", " ])\n", " \n", " with out2:\n", " display(outputs1)\n", " display(EpochWidget)\n", " \n", " genweightplots(EpochSlider.value, neuralnetwork)\n", " \n", " #Remakes the weight images when the slider is changed. 
Must be in here to have access to local vars easily\n", " def Sliderchange(slider):\n", " genweightplots(EpochSlider.value, neuralnetwork)\n", " \n", " EpochSlider.observe(Sliderchange, names='value')\n", " \n", " def bringEpochUp(b):\n", " EpochSlider.value = EpochSlider.value + 1\n", " \n", " def bringEpochDown(b):\n", " EpochSlider.value = EpochSlider.value - 1\n", " \n", " EpochDown.on_click(bringEpochDown)\n", " EpochUp.on_click(bringEpochUp)\n", " \n", " createQuery(neuralnetwork)\n", " createTrain(neuralnetwork)\n", " createSave(neuralnetwork)" ] }, { "cell_type": "code", "execution_count": 273, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#Function for creating the subplots representing weights and then displaying them in out3 \n", "def genweightplots(epochnum, neuralnetwork):\n", " with out3:\n", " clear_output()\n", " f, axs = plt.subplots(2,2,figsize=(15,5))\n", " weightarraynum = len(neuralnetwork.weights[0])\n", " \n", " #Here for testing\n", " #print(neuralnetwork.weights)\n", " #print(len(neuralnetwork.weights))\n", " \n", " for x in range(0, weightarraynum):\n", " plt.subplot(1, weightarraynum, x+1)\n", " plt.imshow(neuralnetwork.weights[epochnum-1][x], interpolation=\"nearest\")\n", " if x==0:\n", " plt.title(\"Layer \" + str(x+1) + \" (Input) & \" + str(x+2))\n", " elif x==(len(neuralnetwork.weights[epochnum-1])-1):\n", " plt.title(\"Layer \" + str(x+1) + \" & \" + str(x+2) + \" (Output)\")\n", " else:\n", " plt.title(\"Layer \" + str(x+1) + \" & \" + str(x+2))\n", " with out3:\n", " plt.show()" ] }, { "cell_type": "code", "execution_count": 274, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#Create query widgets\n", "\n", "queryoutput = []\n", "\n", "def createQuery(neuralnetwork):\n", " with out4:\n", " clear_output()\n", " printmd('### **Query**')\n", " \n", " QueryInput = widgets.Textarea(\n", " value=\"\",\n", " placeholder='See documentation for query formatting',\n", " description='Query Input:',\n", " disabled=False,\n", " layout = widgets.Layout(width='400px')\n", " )\n", " \n", " QueryOutput = widgets.Textarea(\n", " value=\"\",\n", " description='Output:',\n", " disabled=False,\n", " layout = widgets.Layout(width='400px')\n", " )\n", " \n", " EpochtoQuery = widgets.RadioButtons(\n", " options=['Most Recent Epoch', 'Currently Selected Epoch'],\n", " value='Most Recent Epoch',\n", " layout=widgets.Layout(width='300px'),\n", " disabled=False\n", " )\n", " \n", " EpochtoQueryWidget = widgets.HBox([\n", " widgets.Label(\n", " value=\"Epoch to Query:\",\n", " layout=widgets.Layout(width='200px')),\n", " EpochtoQuery\n", " ], layout=widgets.Layout(width='600px'))\n", " \n", " QueryButton = widgets.Button(description = \"Query\", button_style='info')\n", " \n", " #Future feature\n", " #SaveButton = widgets.Button(description = \"Save Output (csv)\", button_style='info', layout=widgets.Layout(margin='2px 0px 0px 50px'))\n", " \n", " Queryboxestop = widgets.HBox([QueryInput, QueryOutput])\n", " Queryboxesbottom = widgets.HBox([EpochtoQueryWidget, QueryButton])\n", " \n", " def QueryFunction(b):\n", " try:\n", " listofinput = json.loads(QueryInput.value)\n", " if len(listofinput[0]) != neuralnetwork.inodes:\n", " QueryOutput.value = \"Incorrect number of input values.\"\n", " else:\n", " if EpochtoQuery.value=='Most Recent Epoch':\n", " queryoutput = neuralnetwork.query(listofinput, neuralnetwork.epochs-1)\n", " elif EpochtoQuery.value=='Currently Selected Epoch':\n", " queryoutput = neuralnetwork.query(listofinput, 
EpochSlider.value)\n", " QueryOutput.value=str(queryoutput)\n", " except:\n", " with console:\n", " print(\"Incorrect query format.\")\n", "\n", " QueryButton.on_click(QueryFunction)\n", " \n", " \n", " with out4:\n", " display(Queryboxestop, Queryboxesbottom)" ] }, { "cell_type": "code", "execution_count": 275, "metadata": { "collapsed": true }, "outputs": [], "source": [ "#Creates training widgets\n", "\n", "def createTrain(neuralnetwork):\n", " with out5:\n", " clear_output()\n", " printmd('### **Train**')\n", " \n", " TargetsInput = widgets.Textarea(\n", " value=\"\",\n", " placeholder='See documentation for input list formatting',\n", " description='Inputs List:',\n", " disabled=False,\n", " layout = widgets.Layout(width='400px')\n", " )\n", " \n", " TargetsOutput = widgets.Textarea(\n", " value=\"\",\n", " placeholder='See documentation for target list formatting',\n", " description='Targets List:',\n", " disabled=False,\n", " layout = widgets.Layout(width='400px')\n", " )\n", " \n", " TrainingIterations = widgets.IntSlider(\n", " min=0,\n", " max=500,\n", " value=1,\n", " layout=widgets.Layout(width='300px')\n", " )\n", " \n", " TrainingIterationsWidge = widgets.HBox([\n", " widgets.Label(value=\"Training Iterations:\", layout=widgets.Layout(width='200px')),\n", " TrainingIterations \n", " ], layout=widgets.Layout(width = '600px'))\n", " \n", " TrainButton = widgets.Button(description = \"Train\", button_style='info')\n", " \n", " Trainboxestop = widgets.HBox([TargetsInput, TargetsOutput])\n", " Trainboxesbottom = widgets.HBox([TrainingIterationsWidge, TrainButton])\n", " \n", " def TrainFunction(b):\n", " try:\n", " listofinputs = json.loads(TargetsInput.value)\n", " listoftargets = json.loads(TargetsOutput.value)\n", " if len(listofinputs[0]) != neuralnetwork.inodes:\n", " with console:\n", " print(\"Incorrect number of input values.\")\n", " elif len(listoftargets[0]) != neuralnetwork.onodes:\n", " with console:\n", " print(\"Incorrect number of target values.\")\n", " else:\n", " for x in range(TrainingIterations.value):\n", " neuralnetwork.train(listofinputs, listoftargets)\n", " except:\n", " with console:\n", " print(\"Incorrect training list format.\")\n", " \n", " TrainButton.on_click(TrainFunction)\n", " \n", " \n", " with out5:\n", " display(Trainboxestop, Trainboxesbottom)" ] }, { "cell_type": "code", "execution_count": 276, "metadata": { "collapsed": true }, "outputs": [], "source": [ "EpochSaves = widgets.IntRangeSlider(\n", " min=1,\n", " max=1,\n", " value=[1,1],\n", " layout=widgets.Layout(width='300px')\n", ")\n", "\n", "#Creates saving widgets\n", "\n", "def createSave(neuralnetwork):\n", " with out6:\n", " clear_output()\n", " printmd('### **Save**')\n", " \n", " SaveOutput1 = widgets.Textarea(\n", " value=\"\",\n", " placeholder='',\n", " description='JSON:',\n", " disabled=False,\n", " layout = widgets.Layout(width='400px')\n", " )\n", " \n", " SaveOutput2 = widgets.Textarea(\n", " value=\"\",\n", " placeholder='',\n", " description='Weights:',\n", " disabled=False,\n", " layout = widgets.Layout(width='400px')\n", " )\n", " \n", " EpochSavesWidge = widgets.HBox([\n", " widgets.Label(value=\"Epoch Range:\", layout=widgets.Layout(width='200px')),\n", " EpochSaves \n", " ], layout=widgets.Layout(width = '600px'))\n", " \n", " TrainButton = widgets.Button(description = \"Train\", button_style='info')\n", " \n", " SaveButton = widgets.Button(description = \"Save\", button_style='info')\n", " \n", " SaveBox = widgets.HBox([SaveOutput1, SaveOutput2])\n", " 
SaveBoxLower = widgets.HBox([EpochSavesWidge, SaveButton])\n", " \n", " with out6:\n", " display(SaveBox)\n", " display(SaveBoxLower)\n", " \n", " def SaveFunction(b):\n", " SaveOutput1.value = str(neuralnetwork.toJSON(EpochSaves.value[0], EpochSaves.value[1]))\n", " SaveOutput2.value = str((np.stack(neuralnetwork.printWeights(EpochSaves.value[0], EpochSaves.value[1]))).tolist()) \n", " \n", " SaveButton.on_click(SaveFunction)" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.3" } }, "nbformat": 4, "nbformat_minor": 2 }