{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "

Neural Networks Demystified

\n", "

Part 5: Numerical Gradient Checking

\n", "\n", "\n", "

@stephencwelch

" ] }, { "cell_type": "code", "execution_count": 1, "metadata": {}, "outputs": [ { "data": { "image/jpeg": "/9j/4AAQSkZJRgABAQAAAQABAAD/2wCEABALDA4MChAODQ4SERATGCgaGBYWGDEjJR0oOjM9PDkzODdASFxOQERXRTc4UG1RV19iZ2hnPk1xeXBkeFxlZ2MBERISGBUYLxoaL2NCOEJjY2NjY2NjY2NjY2NjY2NjY2NjY2NjY2NjY2NjY2NjY2NjY2NjY2NjY2NjY2NjY2NjY//AABEIAWgB4AMBIgACEQEDEQH/xAAbAAEAAgMBAQAAAAAAAAAAAAAAAQIDBAYFB//EAEQQAAIBAwEFBAgCBwcCBwAAAAABAgMEESEFEjGS0hdBUVQTFiIyU2Fx0QaBFBVCQ5GhsSMzUnKTweFEVQckNGJzgoP/xAAZAQEBAQEBAQAAAAAAAAAAAAAAAQIDBAX/xAAhEQEBAQEAAwEAAgMBAAAAAAAAAQIRAxIhMUFRBBMiQv/aAAwDAQACEQMRAD8A+fgAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA6zs+2t5iy559I7PtreYsuefSByYOs7PtreYsuefSOz7a3mLLnn0gcmDrOz7a3mLLnn0js+2t5iy559IHJg6zs+2t5iy559I7PtreYsuefSByYOs7PdreYsuefSOz3a3mLLnn0gcmDrOz3a3mLLnn0js92t5iy559IHJg6zs92t5iy559I7PdreYsuefSByYOt7PdreYsuefSOz3a3mLLnn0gckDrezza3mLLnn0js82v5iy559IHJA63s82v5iy559JPZ3tfzFlzz6QORB13Z3tfzFlzz6R2d7X8xY88+kDkQdd2d7X8xY88+kdne1/MWPPPpA5EHXdne1/MWPPPpHZ1tfzNjzz6QORB13Z1tfzFjzz6R2dbX8zY88+kDkQdd2dbX8zY88+kdnW1/M2PPPpA5EHX9nW1/M2PPPpHZ1tfzNjzz6QOQB13Z1tfzNjzz6Sezra/mbHnn0gcgDr+zra/mbHnn0js62v5mx559IHIA6/s62v5mx559I7Otr+ZseefSByAOv7Odr+ZseefSOzna/mbHnn0gcgDr+zna/mbHnn0js52v5mx559IHIA6/s52v5mx559I7Odr+ZseefSByAOv7Odr+ZseefSOzna/mbHnn0gcgDr+zna/mbHnn0js52v5mx559IHIA6/s52v5mx559I7Odr+ZseefSByAOv7Odr+ZseefSOzna/mbHnn0gcgDr+zna/mbHnn0js52v5mx559IHIA6/s52v5mx559I7Odr+ZseefSByAOv7Odr+ZseefSOzna/mbHnn0gcgDr+zna/mbHnn0js52v5mx559IH0YAFAAAAAAAAAAASCABIAAkEEgTkkqSgJJIAAkAASQSAAAAkgkCCUQSAAYYDAAAkEAASQSAAAAAAAAAAAEkAAAABJBJAEggkAAAAAAAADWAAAAAAQSAAIAkEEgAAAJIAEggkCQQSBKBAyBYkrklASiSABIIJQAkgASAMgAAADAAAAAAAJBAAkAAAAAAAAAAAABJAJAgkAAAAAAAAADWBAAAAAAAAAAAAAAABJAAkEEgSCABIAAklFSQLIkpktkCQQSgJBBIAkgASCABIIAAkgkAAAAAAEkACQAAAAAAAAABIIAEggASQSQBIIAEkAAawAAAAAAABAAAkgACSABIIAEggkASQAJAAEgxVa0KS9p/kY43tNvg0OxeVtAw/pNL/EQ7qil76HTlbGRk05bQpLgpMhbSo96kidi+tbxOTVhe0ZftY+pnjJSWYtMdTnFySuRkqLAgASAAAAAkEEgAAAAAEggASAAAAAAAAAAAAAAAAAAAAA1gCAJBAAAAAAAAAAAAAAAABGQJMVzdUbWmqlxUVODe7l+JpbQ2vQtJeijvVrh8KVNZl/wXuKP6w2W4VabjKcd5RfGMi8o3oVIzgpQkpRfBp5TDlhNnEW17XsauKc3FJ+73HS2G1KN9T3G1Cq1jHczprxXM6krBcVZVKjbfEopYIm8SeeKKb2p5Hr/AIZHUfiRv6GPeWQwcWbyDHval86EGRTwZYVpw92TRrpls/MqWN+nfVFxaf1NqnfQl7yweOpFlUNTTFzHvwqRmvZaZbJ4UKzjweDPHaFSGj1+pr2YuK9Yk82O1I98P5meF/Sk8PKL2J61t5JKRlGazF5RYrKQAAAAEgAAAAJIAAkAAAAABBIAAAAAAAAAAAaoAAAAAAAAIAEkBvHE16t9a0f725pQfg5oSW/iNgZNCW2LLhCq6r8KcHL+iK/p91U/9Ps+rjxrSUP5cTXpR6OSspxgsyaSXe2ed6PaVb+8uKVvF91KO8/4v7COyrXe3q2/cS8a03L+XAck/aLVds2yk6dvvXVX/DRWf4vgjDOG0bxf21VWdN/sUvan+cu78jejGFOO7ThGEV3RWEQ3ke0n5FYLSzoWqaowSb96T1cvqzehojDEzRZnvRz34h2fuy/Sqa9mT9pJcH4nPqrKhLKeMHf1/R+hkqizFrDXieLRsLek87qb8Zas7Z88znlWY68yz2k7mThOMlLuk1ozNWuFA9R01JaSX0NC6tY1YPuZ5NWW9j0Z/ONSN7Fy4mzC5jJYyc1tCnWtJt5biY7fari/aZG66yElN6GXGDxbHaEKksKR61OqpERm7hFZCaaJWgQehGRLLZUqLbxOcsqTFBRp9xkpacSIosvoRqNyhWlTeU/yPRo1lVXg/A8aMtTYpzNTXGNZlesgeVKrVhrGpLH1ELysnrLP1Ne0cv8AXXqkmrRu41NJLdZtGoxZxIIJKABARIAAEkEgAABBIAAAAAAABBIAAAaoAAEEkAadR7S9I1SjaqnnRylLOPpgr6LacveureH+Wi3/AFkbwL7I0XZ3kve2lUX+SlFfcLZqf97dXVT61cf0wbpBfejT/VNl+1R33/75OX9WZadna0l/Z29KP0gjNkhsl1b/ACpolhIhshsq2QTkhsjJBAbBBIEoumULJga9xNynjuRrSeGZqnFswPiZdp8iks8UY1mUdTNUwkY1pEzY3m9eTtSgp05LBxl1D0NZx4eB3t3Hegzj9s0t2e98xn9XX40aFxOjNSi3odHs7a0aqSbxLwOWReDcXmLwzVy550+hULhSWTO6vA5DZu13BqnWf/2Ogp3Cmk08o52cdI9JSyiGa8KmVjJkjIDIStCItOJKepUXiy6Zj4ssgrIi8ZYMcSyIrPGRE6emY/wKZMkJlRjjPBu29y4+zJ5RrygpLejxKxeCy8Zs69iLUllPKLHl0q0qb9lm7SuIz0ejNy9crmxmAJKyAAoEkAIAACSASAAIAkAgASQAJIAA1iAAABADIBAAhghgGyr
YZGSCGQ2CABAAEkDJGQJyN40ry9nbVqNONvKt6TPutZWPkZaNxTrxcqcs44rg19V3ATNZka1ScYaJ5Yr1nL2YaLxMUYeJi11k/s3pT1ZbuJSwiEZdIwV45gzltt0/7OTOvqRzFnNbZh/ZyRZ+r/DlsaExeNGWpvXUvKn4HruPaPNLwS1RvWt3WtvdeY+DNCO9HR8DPSkmjz6zZ+u0srorTaMKy44l4M9GnW3kcgtHlaG/ZbQlTmo1NY+JjjfXV0/dLamK3mpwTXBmwZOoRbJTvJWQMqLJmJSLJhGbOgTwUTLINRmhJlpR3tY8TEmZYsFUiy8XgmUd7VcTHkI3aFy46T1RuRkpLKeUeQmZqNeVN6cPA3NOesf09MFKdSNSOYsubcgkgFAkgBAAkCAABJAAAAAMgADVAIIBEs7rxx7iSANBR2tonVs/ruyy/wCZMqO0Zf8AWUo/Sjn+rN0hso0Xb3//AHCP+gvuR+i3eHvbQqN/KnFG8VbINN2dx/3Cvyx+xR2NZ8doXP5bq/2N4gDSez2/+tu/9T/gr+rlnW7u3/8AqbxAGl+q6fmLr/WkV/VdLOVcXS+lZm+QwNB7LXnLz/WZV7Ki+N5e4/8AmZ6BAHk3Gy5Ua1K6s5VKlanLWNSo2pJ6Pj9TNC1nGs7qvKLryjuYgsKK/wBzfbwjBOWeJLWszrFu4G6WUkTlGOuvKpKAjFYMmE0EkZbjDUXss57bEfYZ0k0sM8LbEU6MvEsVxS97HzNymlumCtRlRqZlwfBmSjI93jvY8mpxecMmBpwllG6llGKpTOms9iSqxqp8Vgupx8TXacXlGRRU45R4t49XozrroNjX/wC5nL/KdBCalE4Cnv05qcG008nW7LvY3FFP9pcUc609TiWxoRHDJcTKiLIo9OBKYGRMspGJMsmBlUtS6lgwIsmFbKkTJbyyuJhT01MkGBC44LF3FTWVxMeWtGGWWnUlCWYs36FeNRYekjzPoXi2tVxNS8Z1mV6wNahcqWIz4+JsnTvXKzgSQAgSQCokgAAAAAAAAADUAIIABVgGQwQwIbIBABkAAQQSAGSAQAKskxVZdwqydUqTyYpSLamPST4nK13zOLx4F4rQpFGSKwRpYjBL+WpCIMVRaM8TaiTpSTOgklg8jbFLeoSlHikVeuRuoelgl3rgaUU6c8PQ2alTV6lVOnU9mb17md/Hv1c/Jnv1mpvKLuOTDFSpvEjYWqPoZ+x5q16lMwLNOWV/A3msmKdMzvEsWa4Q3akcoz21WdrW348O9eJioUY/sTe/3rBeUopuLaTXcz5+8+tenOvZ1VndRrU1OL0ZtxqZZx1pe/otTKlmL4o6C2u4VoKUJJnNXpt5CMFOtHvZlUk1oRV0SisZFsoC6JzgomSwMkZLBkizXRkiFbMZFpJT+phiZEwisk48SUy6xJYZjlBx1XAJxKbybVC6cNJ6xNLOpdMsvEs7+vXhOM1mLyWPJhVlTeYvBu0byM9J+y/E3NOdxY2QE0+ANMAACAAAAgFEggkDTAIIBUkhgQyAQAIAAggkgAQSQAIJIAgwVfeM5hqrvJfxrH6w/UquJOcMq038jDsumkXinjVlFpgujKskdCsxnQPVAYpycTQu6mYNM35rJ5d/BqLYacrXs16abXDJELeMe43ZPecvqY8JGhXcUo7svyZiw6ct2RlehOk47svyZ6PD5efK5bx37FOIayRh05bsi3HVHul681ikc05qUe4zToxvFKpCKhOPGPiUIi5U5KUHhrvMeTxTU+LnXGJUlrlYa4ovT9JSlmnJr6G7K3hXoRrQrOdX9uO6ayTzwxjuZ87WbmvVnUr3dn0W6SlUk5SfiepCCweJY3iSUWz16VVSWhixOs7hoVWhdSyiGskaiEWTKrQsgq6LJmPOCykBkUsItv4Ri4k5KNiDbMqTZqweWjepa4MrapKlvLwZi3XF4aN1xIlBSWC9RqNaFG8dxmqQcTDKLZSMlG6nTej/ACN+jdQqaPRnjuLyZYSa0YmrDWJXtg0aFeUdHrE3YzU1lM6SyvPrNiQAVAABAAFGkQwyCAQwQAZAAEAgAQAABBJAAhkkMCrKTWUXZDCz41nxKyeheaKPgc3cRZMxp6mRIipbyTnQhrAIIk+Jq3MVKm00bLMFZ+ywOWuafo68kuD1MTRtbQT/AEhY8cGSOybmeskor5s0vWjo0Vwen+p6q4ziP1VU/wAaHKntHnNKcd2X5MxLMJbsj1f1ZP8AxIpW2bOVPim1wPT4vLc/K47kv2PPa70QSswk4TWGiXHwPdNONi1CtOhUU4PH+56NShC/t/T0NKkfeTPLM1rcTtq0Zx1w9V4mPL4pufFzqxhTlTrYllfJnQbPqqUEjUuaH62oelt4xhUi9U+Jp2d1K2rejqLdlHRo+frFz8rtNddVHLWhOpqW90pxWGbKnk52NypwEyGwZbTktHiVRdAX7iSqYKsSnhm7QnlI0c6mxQepmrY9KOqDiRS90yriHO3jDKOVhmGVPHcbVRGOSCz61HAq4JLVmy4ZMc6YbhRljQ2YS3JZXBmvThg2Elgia42otSWUSa9B4e73Gwd5ex5tTlQSAVkIJIKrSIBBEGVJIAgAACAAIAAAgkgAQSVYEEEkAYqi1MLNifAwSWpi/rti/FMamRFWiYsy2uRglMkiMTTyUqx3jKyjaC9eBtW3a9pLgevbzVW3pzXfFMpd0lUg0a9hXjS/8tUeMe6/E3ljc+N7BVxRcG3Ji3V4DcRdjAHl7S2cq8XUprE1/M8RNwk4TWGjr8Hl7U2aq8XVpLFRd3idvH5PX5UseNKJQtCTTcZLDXcyZJHtzWKmnUnTeYycfoytSkq099ye94kFovBu4zr9SWxeFzUtmtcx8T07XaCljLPMwprDNSop20t6OXD+h4vN/j+v3LtnXXXQrbyzkyqXzObtL54R6lGtv8ZHjsdo9JSyXizVhUXAzQkZ40zoniY97QtFslWLLjqbFJamKKRmpYI03qHAz8DBRehlb0JHO/q0+BikTKeSrCycEQ0ie4FVCLxISLIiUjpNYNk14+8sGwdPG4+QAB0cwAAaBDDAEAEAAAAIAAgAACCSAIIZLKsAQABD4GCa1M5hmjOnTDFLJBZoqYdV1wLIxpllLQJVnwMckXyUfEgxTWh5t3R31nvXeepNGrWjoVWra7S3GqNzo+Cn9z08prK4HiXNupLONTWpXdzbSUY1PZ7kzcrnc/06NlKlSFKO9OSivmeYttpUnvUn6TuxwPJuLyrcVN6bb+XgXqTFr26m1IN4owcvm9EYpXN1P9pR+iPMt6uHqb9OvFrUza3MxqXVpObdXOZ9/wAzDTxjD4nqb8HH3jyqsXTqvGqbPV/j+T/zXPyZ/mEolDNFqSKTj4HtlcCnrJLxPTt7WhNujcpqT4PxPHcnF5XFHu2FzRvYRjUWKsP5nLz71J/yuXn3uzKljPfh7VJ9/gVo1mnodHRk6zqU6tPEF496PF2js92lXfprNJ/yPDfrvnTPb1cm9TqZweRbp50PQpZS1OddI346mWGpq05YNmEtTDcZ4ozQMMHkzw
 "source": [ "from IPython.display import YouTubeVideo\n", "YouTubeVideo('pHMzNW8Agq4')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Last time, we did a bunch of calculus to find the rate of change of our cost, J, with respect to our parameters, W. Although each calculus step was fairly straightforward, it’s still easy to make mistakes. Worse, our network doesn’t have a good way to tell us that it’s broken - code with incorrectly implemented gradients may appear to be functioning just fine." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is the most nefarious kind of error when building complex systems. Big, in-your-face errors are painful at first, but at least it’s clear that you must fix them for your work to succeed. Subtler errors can be more troublesome: they hide in your code and steal hours of your time, slowly degrading performance, while you wonder what the problem is.
" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A good solution here is to test the gradient computation part of our code, just as developer would unit test new portions of their code. We’ll combine a simple understanding of the derivative with some mild cleverness to perform numerical gradient checking. If our code passes this test, we can be quite confident that we have computed and coded up our gradients correctly. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To get started, let’s quickly review derivatives. Derivates tell us the slope, or how steep a function is. Once you’re familiar with calculus, it’s easy to take for granted the inner workings of the derivative - we just accept that the derivative of x^2 is 2x by the power rule. However, depending on how mean your calculus teacher was, you may have spent months not being taught the power rule, and instead required to compute derivatives using the definition. Taking derivatives this way is a bit tedious, but still important - it provides us a deeper understanding of what a derivative is, and it’s going to help us solve our current problem. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The definition of the derivative is really a glorified slope formula. The numerator gives us the change in y values, while the denominator is convenient way to express the change in x values. By including the limit, we are applying the slope formula across an infinitely small region – it’s like zooming in on our function, until it becomes linear. \n", "\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The definition tells us to zoom in until our x distance is infinitely small, but computers can’t really handle infinitely small numbers, especially when they’re in the bottom parts of fractions - if we try to plug in something too small, we will quickly lose precision. The good news here is that if we plug in something reasonable small, we can still get surprisingly good numerical estimates of the derivative. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We’ll modify our approach slightly by picking a point in the middle of the interval we would like to test, and call the distance we move in each direction epsilon. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let’s test our method with a simple function, x squared. We’ll choose a reasonable small value for epsilon, and compute the slope of x^2 at a given point by finding the function value just above and just below our test point. We can then compare our result to our symbolic derivative 2x, at the test point. If the numbers match, we’re in business!" 
] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Populating the interactive namespace from numpy and matplotlib\n" ] } ], "source": [ "%pylab inline\n", "#Import Code from previous videos:\n", "from partFour import *" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "def f(x):\n", " return x**2" ] }, { "cell_type": "code", "execution_count": 4, "metadata": {}, "outputs": [], "source": [ "epsilon = 1e-4\n", "x = 1.5" ] }, { "cell_type": "code", "execution_count": 5, "metadata": {}, "outputs": [], "source": [ "numericalGradient = (f(x+epsilon)- f(x-epsilon))/(2*epsilon)" ] }, { "cell_type": "code", "execution_count": 6, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "(2.9999999999996696, 3.0)" ] }, "execution_count": 6, "metadata": {}, "output_type": "execute_result" } ], "source": [ "numericalGradient, 2*x" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Add helper functions to our neural network class: " ] }, { "cell_type": "code", "execution_count": 7, "metadata": {}, "outputs": [], "source": [ "class Neural_Network(object):\n", " def __init__(self): \n", " #Define Hyperparameters\n", " self.inputLayerSize = 2\n", " self.outputLayerSize = 1\n", " self.hiddenLayerSize = 3\n", " \n", " #Weights (parameters)\n", " self.W1 = np.random.randn(self.inputLayerSize,self.hiddenLayerSize)\n", " self.W2 = np.random.randn(self.hiddenLayerSize,self.outputLayerSize)\n", " \n", " def forward(self, X):\n", " #Propogate inputs though network\n", " self.z2 = np.dot(X, self.W1)\n", " self.a2 = self.sigmoid(self.z2)\n", " self.z3 = np.dot(self.a2, self.W2)\n", " yHat = self.sigmoid(self.z3) \n", " return yHat\n", " \n", " def sigmoid(self, z):\n", " #Apply sigmoid activation function to scalar, vector, or matrix\n", " return 1/(1+np.exp(-z))\n", " \n", " def sigmoidPrime(self,z):\n", " #Gradient of sigmoid\n", " return np.exp(-z)/((1+np.exp(-z))**2)\n", " \n", " def costFunction(self, X, y):\n", " #Compute cost for given X,y, use weights already stored in class.\n", " self.yHat = self.forward(X)\n", " J = 0.5*sum((y-self.yHat)**2)\n", " return J\n", " \n", " def costFunctionPrime(self, X, y):\n", " #Compute derivative with respect to W and W2 for a given X and y:\n", " self.yHat = self.forward(X)\n", " \n", " delta3 = np.multiply(-(y-self.yHat), self.sigmoidPrime(self.z3))\n", " dJdW2 = np.dot(self.a2.T, delta3)\n", " \n", " delta2 = np.dot(delta3, self.W2.T)*self.sigmoidPrime(self.z2)\n", " dJdW1 = np.dot(X.T, delta2) \n", " \n", " return dJdW1, dJdW2\n", " \n", " #Helper Functions for interacting with other classes:\n", " def getParams(self):\n", " #Get W1 and W2 unrolled into vector:\n", " params = np.concatenate((self.W1.ravel(), self.W2.ravel()))\n", " return params\n", " \n", " def setParams(self, params):\n", " #Set W1 and W2 using single paramater vector.\n", " W1_start = 0\n", " W1_end = self.hiddenLayerSize * self.inputLayerSize\n", " self.W1 = np.reshape(params[W1_start:W1_end], (self.inputLayerSize , self.hiddenLayerSize))\n", " W2_end = W1_end + self.hiddenLayerSize*self.outputLayerSize\n", " self.W2 = np.reshape(params[W1_end:W2_end], (self.hiddenLayerSize, self.outputLayerSize))\n", " \n", " def computeGradients(self, X, y):\n", " dJdW1, dJdW2 = self.costFunctionPrime(X, y)\n", " return np.concatenate((dJdW1.ravel(), dJdW2.ravel()))\n", " " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can use the same approach to numerically 
{ "cell_type": "markdown", "metadata": {}, "source": [ "We can use the same approach to numerically evaluate the gradient of our neural network. It’s a little more complicated this time, since we have 9 gradient values (one per weight), and we’re interested in the gradient of our cost function. We’ll make things simpler by testing one gradient at a time: we’ll “perturb” each weight, adding epsilon to the current value and computing the cost function, subtracting epsilon from the current value and computing the cost function again, and then computing the slope between these two values. " ] }, { "cell_type": "code", "execution_count": 8, "metadata": {}, "outputs": [], "source": [ "def computeNumericalGradient(N, X, y):\n", "    paramsInitial = N.getParams()\n", "    numgrad = np.zeros(paramsInitial.shape)\n", "    perturb = np.zeros(paramsInitial.shape)\n", "    e = 1e-4\n", "\n", "    for p in range(len(paramsInitial)):\n", "        #Set perturbation vector\n", "        perturb[p] = e\n", "        N.setParams(paramsInitial + perturb)\n", "        loss2 = N.costFunction(X, y)\n", "        \n", "        N.setParams(paramsInitial - perturb)\n", "        loss1 = N.costFunction(X, y)\n", "\n", "        #Compute numerical gradient via the centered difference\n", "        numgrad[p] = (loss2 - loss1) / (2*e)\n", "\n", "        #Return the value we changed back to zero:\n", "        perturb[p] = 0\n", "        \n", "    #Return params to their original value:\n", "    N.setParams(paramsInitial)\n", "\n", "    return numgrad" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We’ll repeat this process across all of our weights, and when we’re done we’ll have a numerical gradient vector with the same number of values as we have weights. It’s this vector we would like to compare to our official gradient calculation. Below, we see that the two vectors appear very similar, which is a good sign, but we need to quantify just how similar they are. " ] }, { "cell_type": "code", "execution_count": 9, "metadata": {}, "outputs": [], "source": [ "NN = Neural_Network()" ] }, { "cell_type": "code", "execution_count": 10, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([-0.01570145,  0.04553206,  0.02332193, -0.01261438,  0.02962221,\n", "        0.02384717, -0.19123277, -0.24006099, -0.09635695])" ] }, "execution_count": 10, "metadata": {}, "output_type": "execute_result" } ], "source": [ "numgrad = computeNumericalGradient(NN, X, y)\n", "numgrad" ] }, { "cell_type": "code", "execution_count": 11, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "array([-0.01570145,  0.04553206,  0.02332193, -0.01261438,  0.02962221,\n", "        0.02384717, -0.19123277, -0.24006099, -0.09635695])" ] }, "execution_count": 11, "metadata": {}, "output_type": "execute_result" } ], "source": [ "grad = NN.computeGradients(X, y)\n", "grad" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A nice way to do this is to divide the norm of the difference by the norm of the sum of the vectors we would like to compare:\n", "\n", "$$\\frac{\\|grad - numgrad\\|}{\\|grad + numgrad\\|}$$\n", "\n", "Typical results should be on the order of 10^-8 or less if you’ve computed your gradient correctly. " ] }, { "cell_type": "code", "execution_count": 12, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "2.7533413768397804e-10" ] }, "execution_count": 12, "metadata": {}, "output_type": "execute_result" } ], "source": [ "norm(grad-numgrad)/norm(grad+numgrad)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "And that’s it - we can now check our computations and eliminate gradient errors before they become a problem. " ] },
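 { "cell_type": "markdown", "metadata": {}, "source": [ "As a final aside (this helper is our addition, not part of the original video): since this check behaves like a unit test, it’s convenient to wrap it in a small function you can re-run whenever costFunctionPrime changes. The name checkGradients and the 1e-8 pass threshold are our own choices here." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def checkGradients(N, X, y, tolerance=1e-8):\n", "    #Our addition: compare analytic and numerical gradients and report\n", "    #pass/fail along with the relative error itself.\n", "    grad = N.computeGradients(X, y)\n", "    numgrad = computeNumericalGradient(N, X, y)\n", "    relativeError = norm(grad - numgrad)/norm(grad + numgrad)\n", "    return relativeError < tolerance, relativeError\n", "\n", "checkGradients(NN, X, y)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Next time, we’ll train our Neural Network.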
" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.3" } }, "nbformat": 4, "nbformat_minor": 1 }