{ "cells": [ { "attachments": { "cuemacro_logo.png": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAPoAAAD6CAIAAAAHjs1qAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAAFiUAABYlAUlSJPAAAAAZdEVYdFNvZnR3YXJlAHBhaW50Lm5ldCA0LjAuMTnU1rJkAABGGElEQVR4Xu1dhV8U3ff+/SffVwUDC5BGxAC7uztfuzEwUQETFTEIFUFRsYilQ7q7G5aOhWU7eH8H7jDMzs7CUjLIPJ/nw2eZe2Z2duaZc8+5c+P//mPAYMyAkTuDMQRG7gzGEBi5MxhDYOTOYAyBkTuDMQRG7gzGEGgq9+aWFplcjv3DgMEQgV5yl8vl1dXVKalp1dU1HA4H28qAwRCBLnKXyWQREZEHDxzSnamtp6NbWlrK4/GxMgYMhgh0kXtNba2hvsGE//0D1Bw/wevzF6yAAYOhAy3kDjHMxQsXkdaBpkbGcfEJsBErZsBgiEALuUMkc/r0GVzuGv+MW7Z0GRPMMBhy0ELuHR0djo5OuNyBL1+9xsoYMBg60CWYObjvAK71SRM0cnNzsTIGDIYOtJC7WCLZtGEjLnftGTPZ1dVYGQMGQwdayL2trc1ATx+Xu86MmdXVNVgZg78FFbXtNU0jnI/RQu6xsfG41oEzpk6rrKzEyhj8LRCJZXyhFPtnhEALufv6+i9ftgKX+8QJGkVFRVgZAwZDB1rIvb6hwWK+BS53YGZGJlbGgMHQgRZyFwqFSxYtIco9LS0NK2PAYOhAC7lXVlZpT5+Ja32K5qTSklKsjAGDocPIy72jo8PfP2DaFC1c7kb6hmymZYbBMGDk5d7U3Gxjc2e61lRc7sDr164LhSLMggGDIcLIy72trc3UyISo9RlTp33//kMmk2EWDBgMEWgRu1+zvkaUO3CRxUK+QIAVMxiFkHd0NLTQ7g7SQu729vdJcl+/Zi3j3Uc12nhiwUi/VFIGLeT++PETktxt7eyxMgajDTJZR0dXCwT2P51AC7m7ur0lyd3723esjMGoglQmfxdYJJLQtGamhdxjY+M1/hmHa11r0uR37z9gZQxGFcLTauLzGiBwx/6nGWghd7FYfPbsOVzu2jNmJiWlYGUMRg8q6toj0mroKvVO0ELuEOfdvHELlzvwicMzegZ/DFShXSB5G1AokdJ6hDEt5C6VSnds34FrfeqkyZlZWVgZg9EAmbzjV3RFSTUX+5+uoIt3f+74AmndUN/gwcNHEokEK2MwGgA1cUZRc3ZpC/Y/XUELucfHJ+B9ZiZO0Dh+/ARWwICukMk6UvIasX+6IJWNgolSaCF3iVTq4uIGMQxS/CLLhUKhECtjQEsIRbIc2vtyZdBC7oCWlpZ1a9YhuYOnL6+owAoY0AwymVwskUWk1nyMGH2dtGkh9+rqmrVr1k3XmobkPnPa9Co2GytjQCdweZJzzklcnlggkoHosa2jB7SQe1UVW2siFskwcqczmltFCdn1+eUcmXxUNhPTQu4Qqds/eGxjcxfJXV9Xr76+HitjQDPUNwvYdTzsn9EGWsi9obExP7/A2cUVyX31ylXM2A7agieQ0rBnr5qghdzFYvGpU6eR1oFTJ085eOBwC7OcAYOhBi3kHhUdg2u9k/+Mu3HjFtMWyWDIQQu5JyUl6+nM0hw3HsldY9x4L6/PzPzudIBM1hGeWiOVdVQ38OAvtnXUghZy9/r85duPX8ePnUBy15o0+e07dyaYoQOEIhkrtrKj479vsRVphU3Y1lGLkZd7R0dHaFjETx+/y5evIrnPNjYJC48UiZhsdeQhFMsqatvhg0gib2od9eElLby7Pytglo4u0jpQc/wEi/kL2ttHa2vX3wHw6PKOjqySlhJ2W2U9zzuyLJ3x7oOHWCy+dPkqcTQTcNniJWIx0ylyZAABjEgs84mpaOQI2Q283FKOR0hxTSP/LxiAMPJyLyuvsLO7b2xgRJT7gf0HsWIGfxztPLFEKucLpX/f+JqRlztfIAiPiJylrUOU+/37D7FiBgyGDiMv97DwCP1ZekStAz9/+YoVM/gjkMs7QtNq2PW8Np4Y4hZs61+HkZd7VHTM9CkKE0QCkxKTsGIGfwSQjD7yzubyJWXVXKmsg8MV/5VDhUde7iWlZdevXSemqpM0NIuLi7FiBn8EeeUckPiLn7lV9bymVpGjTx74e6zsL8LIyx3w/cdP7ekzcLnP0tatqWEmvP5zEEtk4NcTsuvZjXzQuG9sZXlXW/vfB1rIPSU1befO3UjrkzUnnj93gZl1408CHHl1A7+dj7X8/pV+HWHk5S4Wi7ds2TpZQxO0Dj4+JDQsKZmZU+nPgSeQgjt/G1DY2i7GNv29GHm5S6UycOfg1EHuEMHPmDptjqlZQmIS00XszwCC9U9hJRY3Q+qa//4Zxkde7uGRv1kBQVs2bUbBDCKoPz4+AbNgMGyQyuRZxc15la0ugYVjIX4cebmnZ2QutFxE6kQwSUMzLy8fs2AwDIDsKKO4OS6rHol8jORKdAhmpMoTXlvMm8/j/7UvO0Yc8o6OqMzah9+ycZVX/xVdYvrEyMvdzz9g3Vpshhmce3btwYoZDANisuqCEtlfY8pjs+tFks6126MyarGyvxojLHeBQHjk8L8krQN379zFtEUOHyA99YosPeOSdNkt+dDL+LFzqUdY7mKxZJHlQpLWgdu3bmPkPhxoaescNAOXtpTNzS3nWNwKfforFxWNBYyw3GUymYfHR5LWge7uHpgFgyEFPkQjLLVG7xLrQ1gJ+neMYMS9uzg8IpKk9bnmc2tqx0Qo+Schl3eU1XDbBdir0zeBhSvuhdN2EaVhwsinqixWIEnuRvoGpWVlWDGDIcLvjFq3gEKZrKOqjgfSP+uWnKQ4Y/VYwMjL/e3bdyS5A13d3mHFDAYNiNRb28VWrskFla2uwUXSrvkdK+r+zk5gvWOE5S6VSp84PJs6eQpR69O1pqampWMWDAYHsUQen11/xCn+Z2zFG7+C2qYx/TZjhOVeWlZeUlJ6+7YNUe4a/4yzsbmDWTAYHDhcsXto8cLboaU1nQsnBaZUt/H+/q5gqjDCcm/hcMCRG+rpE+UOvH2bkfsQoLqRf9gpfuZFf9NrQQeex0JUMxonZR9CjKTcOzo6WAFBkJXu2YV1dkecOF7D8YUTZsRgoGAlVC24FTLrEuvx9xwbz4zaZkF1/VifumeEvbtQKLKYb0GK3efNMW9uGX3r/tAKDS3Cg45x2lb+jj55pdXcJo4Q/PqoWC1sWDHCcpfJZDt37CJ1hwT15+cz3SEHDp5AWtPI9wwv2fI4SiiWlde003x13z+GEZY7ICg4dOpkbJVJxOlaU6OjY7HioQA+LG0sADJRnkDS3CZabBMakV5bUNGaV85MLoth5OXe2NT07p07Ue5zZps1NTVjxYNGbHZdVnGzSDwmUjSprMPWK1Mgkt78mAFyhw98oRQrY0AHuUukUmdnbJkaRIht3r59jxUPDi1tojV2EakFTd
+iyrFNfy86Ojpde2Zxs8O3HIMrAesf/P7xu0wsYcKYHoy83Ds6Ouzs7IlyB+7auVs4FBNep5c0615iHX4eu/NhFLg6bOtfiric+qCU6ud++XqXWDMu+htcDjjrmjxKl8gbJtBC7g4Oz0jZ6uVLV4akA/CH8FK48doX/Wda+b8JKMS2/nUATfslVJlaB2lbdQod8d9XCZCzYhYMujDycq+prX33/sPa1WtwrU/S0AwMCpZIhiC/vOia3HP7X8TLh+IRohtksg5Hv3wdgtCBp1yShGMjXekXRlju4MLz8gsWzF8wXUthmkhDfYPGxsH21wNtr7KNwBUw91rQiLfHFbPborLqMoqah/BMvsWUa1v5r7sf6RZYaHErBH4p+HVG65QYYbmHhoVfu3b91i2FPjOIDx8+GmQ809ouNrsRjMsdmJDbgJX9cYjEshse6RBszOyKrNbaR1YPxUS7FfXtZteCtj6ORpMieUWU7Xkaw2NaY1RghOVeXlEpEomO/XuMpHUjfcPjx08OcjHh8tp2w6uBRLmD2wO3OiRZQb8AsfVV91Q4AXvv7JDkajPrIPh87GXCIF9zwu7n3ZINrgRkd7Wsc9rFt9zTWrjMmlYqMcJyb2pqDg4JNVDqIjZLW+fla+dBhu8ZpS26ihEtcP/zuB+xFZjFHwGI8tH3HHDq+1/EoQetrV18/EW8ziVWWPqgZn71ia+Ew660i4BnuIjddvJlwl88NfuQYORT1campsULFxO1vnrV6ri4hMH74J8JlSStA5fZhJVWd3aF/WNILWoCZcNX/4jufMzcQ0uMLgegk1lxL3zAr4EgFjK/3hmqwbPU3Cpy9stHXXwZ9IIRlrtAIPzq/d3MdDau9SmaEz0/emHFg8NtzwykKr1LrFndbn7WJVZ9i+CPhTONrcLVdli6XMLmghuec60zkkGEFNM7ZiBVDbuRb3E7FI5gei0IYrbqBh74dRHzRqkvjLx3F4vFy5cux+VuZmJaUzM047IPPItFqlp8O/SscxL6DFx1L3xgIusvKurav/4uA02j721oEaQXNeOngQg5a3/7oEPWu9MhGvJdyEy8IrFBvcXstvIxOR6vXxh5uUPQYmNzB5e7oZ7+c8cXQ5JNHnOKR5JaYhP6K6EzzMVF9p417K+capsFZ5yTbD5iNQwQgqjvcRX4v4hQ2yTk96PJFS7M/W/Z8Fuue6YXVrX9xXOxDwdGXu5yudzV7Z3muPG44u/etcXKBodTrxORpCxvheSUcXS7AmjEHzEV9cM8v3N1E//5z9ytD37jXxqdWfecVQAf1t2P3NNd8wDPuSar+XSDF4jIqNWxYi2yCW1uEzVyRv0y1n8YIy93QGNj092795DWJ2tOjI6OwQoGh6MvMO9ufCWglSc2se5plDzkFO/CKhzu4Q4nXiUQqxSv8FK7b9nw1EGQA+czp6s5EmhiHZRbzukzZ+UJpe8Di8yud+7lGVHqG1dZWdcuk3cUVrZiFgz6Ai3kLpPJbt28jeQ+XWuqj68fVjA47Hsag/QEmqus5+0nOFRdKxYoPjqrDjMdHoCsjbsT05V3wm5+znzwI2fj/d9QJJHJ53e1qyAuuRvmGV6K9qIEhysKSWJb3Ox8aWp8NZDLl7C70lPIU0PTmHWs1AUt5M5mVxOb3lcuWz4kS3estu/pQXDudeLjn7n4v0BQ/FnXZMx0eCCVyvHXui++5+55GuMUWHjsdSIUldVwiY4fuMI2HESMdiQBfL+rf8GiO2HI8vibRAh+nvvkhaXVNLcKR7xnxCgCLeQeHRM7bUrPgKYlCxeJRIOdHEIskVnc6WyqAxpcZmlf9H8XWERSmLF1IETA2A7DgJSiplmXsYTBI7T4gkuSR2QpiDUoufqmexp+GojaVqyAFDa2pyKs36UaXsGa6oGP/fNzyzgQKTFjT/sLugQzz5474nKfNEEjMzMTKxso2vmSZffCkT623v+9/G7459AS9LoHp7aVv29iFbbDMOClf2diinj2XUoRu40VX7XtYdTyu2EeIcV4H0YITvY7xsGHbQ7Ryo2SOeUcI8WuEJc90qLSatO6JzdloD5oIfeOjo6Y2Ph9e/eB0CdO0DDUM8jNzcPKBgqo7kHKM7vavC85J6+zj8wv55hfDwKR6Xe/1AQe7wothgl3PvW0Qi64GSKWyEVimVtYsVNAQUk116TLYc+5HhSTXVfdxDe8HKB3OSCzVGEKhtpmwYruhxYRovwXv/IgPeUzfdn7D1rIXSKRuLq927tnn+OLlw5Pnx86eHjwnd0bW4VBSWyDLkmde5ty0CFGKJadcU6y+5rlFlKMq8f0WhCa8nzIIZN1bHsUhX8R8HdG7feIsri8hoiMWpFE5hZUBKeHus20cEXwPIDNGZekivp2eFbhbO96Za64q6D17Y+jOGNgOcjhAy3kXldfb2psoj1j5qpVqw319M3NzK9du5GYkDSYl03fYyvmXAtCbe0nXic4fMmSyOTesRUHXsQVV7XhAgJnH5k5LO0z6aUtxJZ+4K0vmV8jygQiKfpZnyPLnLvWu+NwxSXstpVdfQ0ggofQ5eSbxN1PY0iZxo7uXr4MBgxayD0oKGSy5kTi+L0pEye9fPVaLB743b3nlYkLZduT6OoGPjw8BVWtG2wjPoV1jujDeetTBrbPkALCEpLcLW+GSKRy1JACKg9MqJLKOoVfVc+rbxasvh9JNCbxoGMcPjU7gwGDLsGM58dP06f0DGgC6aempmHFA8LeZ1ijO9DUOhAf8ZBd3PwutCeYAS6+EzYcs/pDijlLUe4Q2xCrK+JIQr5Q2ovcL7xNGWvrDgwTaCH3lJTUgMCgwMBgrUmTkdzNzebUDmIBDy5fYmGDtUIiRndHLN7R5aFJ1cQiCKDzKoZ+4qGCylZiTvwxtIRyQF1KYdOvxKqQ1GpzxYFXiNpW/nbe2UyD41CBFnJns9mXr1hnZefs2b0XufYXL14OJnBPLW4melZdK5atdxYqKq3mriUMYEX8+HvoFwupqFMYS3XuXYryD4ItmUVNp5yT8F6TRN7/mh2ZXsvMnDGEoIXca2rrLlyw0hw3Hg/fL160GozcS2q4qE0Gcf3939+jylHwIBBKH/7IwYsQ/3WKRzsOIcQS+Vp7LD6ByNvqXQplrxi+QPouqAg/E5zr70f+9RPj/HnQQu4tLRwPz09I6Ig/fvzCygaEqgYesUPYnGtBbbyePK+oslWbICyg+c2QIW/0gJR0Ufdr3QMv4kC7qqb9qGni738ea0qoCoytAyH0x4oZDB1oIXdAYlIyce4NiOaxggGhupFvShg0NPOiv4AQNyfmNxK1BYQge8jlVdcsMO7u8wg84hTXS30ll3ck5DS4BBSadu3iMcbWf/xjoIXcQQcRhOUmIaQpKCjAygYEkJoZob/hgpshxJaN2Oz62Kw6vBTx25CO14aA2yeuUr87oILkIatErRnrRWJZCbuNx7Q5Dg9oIXeJVLpty1Zc7lMmTiot7a03bJ9oahXO7XpJibjPKY446qe5VRSbU4+XIl55P6j6hATw4ye6B5cA7b2zmYyTDqCF3OVy+YF9+
3G5mxgZNzUNqv8TBMoHHOPwt5Kr7SLgAcDKumDZNdsWkcttI4Z2stz4vAb0msnwcsBL/wL7r1mDyL0ZDA3oEsy4u3vgzTJXr14bTLMM4EdMhf6lnpYZ4O8MhVb8MwTXiwixPiS4WPGgwa7n/YyrvOKe9jqo8GVg4a6nMWfeUjREMvjDoIXc+Xz+z1++RvqGE7oUv337jkH2d/8VT55h5rvi/O7pBU0kA20r/9i8IZtS721gkd4l1tbH0ejVEpcv8Y7++yeYpz9oIfeMzCz7+w9/R0XfvHlbZ8ZMnRnaiYlJWNmA4JtQRVLzp3CFto7aRj7JAPjs12B7HePwTew8AYimvkaVRaTVXHiXwuEyvbtGHrSQu58/68yp0+fOntuza9dcszlWVpfr6uqxsgHhY9e07kR+UmzaK2G3mVgHEmdDB+5+FqN+vNHGE+dXtX2LKr/rlfk9toK0GE5sQSM6puXt0MySlv0OMWN8QVOagBZy5/H5b5xd8Xb3NavXcNsHNUPQg6/ZuIgRScGMXN4RkMQ2Irx5BS64HarmfKK1Tfztj6Lwd1WGVwMLqxSmA8ivatXr7jBj753dyBEO9zwfDNQBLeQuFIq+en/bvHEzkvvyZctbBreu6r/dEyrhDEwiDwMFR45mKsVt9C8HFLDbsOJeATviewENrgTkKr6l+hFdjuQOKcGeJ9GVY379XpqAFnJvb293dXWbMXUakjvw1q3bA56MQCCSLrNVGAQETKRKQ1vbxXi3FqDuJVayegNAwVubEd/aWvnf/pRBDIRcAgvx0i8RpcywDJqAFnKHJHWy5kRc68Bdu3YPePxereIrVcQyqtlx88o5xDVeDK8EFKs9OfDn32XEmmHOjeDa5p7Jpr1jeibHc/IdsgyYwSAx8nKXyWSNTU3XrK9PISje2vp6RUUlnz+Q2cpzKhREDNS1YvGpehcKRLIlhG7xIN8EtYf3S6Ty9YSaARx8aAY2vREUHeyaWQDx6c9ctJ3BiGPk5Z6RkXnvni348vj4BGMDIyR3EyNjH1//xsaBvFv1iiQ3y8y7Hkw5GgiiD3YDf7tDz7gnfELdPiGTd5D6zdt0DwKEBIA4sOONP7PgPV1Ai2CGx+M9fPjY0sJylrYukruhnn59wwBf+tz+mI5LDXHlvXA0QlQZfvGVaC5scO13vmR+U/tlkFAsI2oauOdZLPoWd8XBgV8ihn7sCIOBYeTlXltX7+H58ZePb1lZua2tvZG+Ach94QJLgWAg09tCvrjrec9ckIhbn0SjQdAkgIe288pCI4lmXWKllzaDiFU9GCQQpzMAgvTD02qEos465PQbhR4KpP4LDEYQIyz3jo6Oy1aXN2/abGxodPjQkeLikqam5n+PHNXT0a2pGchMn6DXjQ9/E5NI4OHXCaqGVrzvnnMGdrHxyswobs5Vb9xqfkUrfnyg+fXglbbhre1iOIFlikFOWS2zygBdQIPYPTPr0aMntbV1a1atnjZF6/Ztmzt3bTX+Gefs7IpZ9Aeg6op6HmmYcy/dsyobeMRnQ+9ywGmXJHXWCKhvFuB7AU++Stj6OCqf3VrfIiB+u8nVQMosmcGIgBaxe05uXkxs3Cevzyhw1xw/YbrWVD//AKy4P4Cg5bonOXY//CxW1WB+nlCqPLLpJatARhX8EJHS3U0A0fBKwMKbITc+Zdh6ZxNHhW97GKVmdMTgD4AWck9ISDQ1NiEu4LFq+cqBdYoMSakmTWYEXHc/UpXmYPsr/wJjRcXDEVL6apFMLmg86UzuRUwiZAVZJS191xQM/hRoIfeGhkaI3XGtAyG8wcr6idSiJuIcBIg7n8ao8tYCkSy3nHP/G7mPzT2vvqcgDkxik/YiccGNkAGvI8lgODDycpfL5Z8/f7G0sFyzes2mjZs3bthkbjZHT3eW97cfmEV/0MaTLFacRnTf81j3oCKsWAkNrcIr71NfE2amRtzTOZIas6EElFortXiSuMMhholkaIWRl3tHR0cbF3t1X1nFrq6u2bVjFzj4bVu2DqDbDHjTTQ97Vv/SsWKVVHN7meMX8si9j6O3PonGd0Hc7BDdu1JFEpk5odsMJY9D1tv7Q8Pgz4IWwYxQKPz2/aejo+PvqOgjh46geAay1Q8eHzELtdHYKpxPGJQNWWNcTm9d5yGFveCWgtvjXP/wd+/TMqYojYdS5rYHv4d1dRAG/QUt5C6Ty587viD2iEQ8feo0ZqE2EgoaiQ2LkIN+i+tjRo2aJv48pfkZ1z74jcbdqUJ4as3crkXweqHjz9ywlGpsBwY0AC3kDoBs1cTQaOL4CUS5b920ub/xjP3XLKLgDK8ExOX20RkBwg1S/3XgKruI3uesyynn6BGagOAZQyuFEPk+sKicqicmg5ECXeQO4PF4np4fZ2hhPn7KxEksViBWph7EEvlmxSj8rFtyn++MILyOz6nXV2y+XHYvnKd6NZjWdvE6xfmpV9lGnHdLJr3Njc0e1BBEBkMOGskd4O8fML17CT7dmdr97RHZ1CokhSXhaqw5Wt8siEivJflmOI6qMRmQDR9WGi219VEUBEVoBQ6cyQUD6dHJYPhAL7lfsrqMtA48deqMSNS/PC+9pJk4c/Rs66AWNYYRVdXzjr9OMFDs3gg5blUjRW97nkDyL6EvO04dKxYriZ2Q12BiHbjsTtjcG8FwJmdck5ip2WkFGsldIBQuX7IMl7uPr19/J1dyCegZMgfc+ChKnfH/EALt6F5fGyeINTGvEbMggMMVn3Xp7DBM4qq74TYe6RKpPL2kpbKeV1LDPeWc9IpVwLRD0gp0kbtUKs3OyTXs6v0LhJzVxfVtv0YzwbNxTPGt/m31Fl2SyTuOUnUH8AilmIZXKJZlV3BI3WyA1z+kBSWz3wcVQUhT2yxw+Jl70CkeXyGHAU1AF7lXsasfPX4aGxe3ccNGjX/G6enM8vVj9cu7twskxNAZskZWIvUi1CRU1LWbEiaDx3nJnWKSVL/EKmK8hKhzifU9rqK4qg38OiuuEmIYJ7/8Nh4zHJt2oIvcm5tbPn/5WlXFzssvWL92ndakyUVFxViZeqhpFhDnjYHP71T3HSCipJqrrGDgLsdY5Wl7XRRHKiGa3QiubxGAMZpUVZ3+wwxGBHSRe1lZeUsLRyaTwV/be3YQz3zrZ5+Z4JSeBca0rVjWb1PK1RtXwRdJVypN1AHc/ISiH4F/12x4JM65GdzUKkzKa2Am2KA56CJ31HNGIpFwOJxFlgtB7m/fuWNl6uHel56FVCGSmXMtKEa9Zm+ImOy9yT0igZsfU3RVT8xtIJkB9zyNSStqtvuUyXh1moMucpfL5fUNjWERvzOzsiGMmW8+b/269WK1p5oRSWRbHRReMC20CW3jq7t7VhlHOZ6h9O45ZRzSuyTY8WNkmeWtkNB0Zkwq3UEXuSO8fe+empbe2tZ25PC/RgaG4Omxgr7Q1CaarZhu3vqYob6vhQdj0W2FdViB25/FKLeaZ5W0kOQOPPgs9u7nzF7ewjKgCegld1tbO62Jk6ZN1pqiOfHA/gMQymMFfSGEELgD9S6z8ioV5ijtHdmlLaTe
B8Cdz2OVB4X4xlcqy/1GV4s7ZsGAxqCR3CF8P3K4s/fvJA3NDx6ev6OipVK1/CXseMUjjai/bU+i+7XyTH5F65GXCcQjAG96ZSjLncMVbSRMHqZ3OeDa2xTrD4Naz57BHwON5A7itrO1t7O/r687KyIyktverqbc24XShYRQBLzvr7hKrExtfI4sw4+A+OxXnnI4xBcoDOU2vBLgEVR80S0ZK2ZAb9ArmBGJRMUlpaGhYcuWLrt+7YZAqNbMSpEZtbj+gBZqT9NOxA/CJKaI82+GcJQaFt3DFNrdLW6FJOY0VFP1rmFAQ9BL7nK5vKa27vuPX9rTZ1hbX1Ons7tM3nFcsQuA/ZcsrKw/gKCceBDEzxEK6102t4nMCCsDA42sA5l1CkYR6CX31ta2qqoqG5u7SxcvDY+IFIv7fmtTXttO7KpueCWweEAjKijl/uR7DlbcheS8RlI/4VmXA1LyKXqSMaAn6CV3Lre9obHR3d0zISHJ3cOzz3HNkKSSJlHa8ZSi9VAdsBIpZtFwZSks3h2RXkNqlgHvXsNEMqMH9JJ7QUFhc3ML+Hjw68WlZX2uaJBf2UoaQfcrvt9JKkJcLnkdbV0r1j3vbOIAbR+lHgQb7kdiZQxGA2gkd3DVAYFBSxYvyS8orKtvaG1rgy1YGRUkUvluxX7q6+9HKieXaqK8vl35xSo8P8SON29CikgGdt+ysTIGowH08u7l5eVLFi12/+D5+o3Lc0cnHk/lCl7wIDgpzoWke4kVkTnw1/hNbSJzpSVuQO5h3cP/EvMbSIGT3uWARCZwH1Wgl9xr6+pbOBy5XA5hTGxsfC/eHSJmkysKvQYsbEIH0yGxc6bsx1HEAyKutosQS+QiceeyNsTACXjgeVy/XmYxGHHQS+6gb1B4YmLSw0ePf/r4SqUqOxE0coRbCOrUsWI9/Z4zyKV6laffAC6xCYPwnZXEVp5X4xmzCs1oA73kjgCuvYpdnZScUt/QoKrbTHEVdx3hZb7yyqYDgHdUOX5AnHOuBYWk1RBnJkOcdYn1exCxE4MRAe3kzucLyisq29q4Eb+j0tMzYmLjsAICithty+4pDMgAT9/7HHfqwOFHDmm5pV747FduK5cZzDHKQDu5C4Wi/IJCCN/BwQuFQuUXqwKhdP4tBV9rfiM4s3RQq2wj2HxM944uN1Iadq3MDQ9+tzEDl0Yh6BjMACCEb2xqbm1rw+UOSWt9iwAi+yDFvr6g9XiqFbEHAGdWQVOr8ITiQmKUVH+BPga0Au3kDhJvaGjMyc3jctuJLTPyjo52vqSkhkucJ2zO9eC04mbMYtAQSztf40bnkN83kTjzon9lbXtiX1NPMqAh6CV3SEx9/FgQxrTzeIlJKaB7rOC//8KSqoVimf2nDAiv9S6xzKyDjK4E+Az0HWov+EyVsBK58GYIPBXMeI7RCNp5d5FYXFBYVFVVLZVKiYG7QCSTd85sIWviioCcdnEDR9j7nNT9BRdqD3bb/a9Zyqs7EXn4ZQK2A4PRBjrG7l2t7310DhsOwMPzM6oc3HZaUfO2J9Gk3mBAqFX2PouNzqrDdmAw2kBHuUNIw66uyc0rCAwMVm6ZGVbwRVJRV43BF0pvfkgjLhkJ3PyIWTVydIOOcgfw+PyU1LTW1jbs/z8LCJkaWgSJOQ0nXyZoW2GKn38zJH3o0mIGIwKayh0cfF19QxVbrUkehxxtPMlL3zz/+MqH33IOvUo44Zr0NqiImSHsLwDt5A756B8OYCiB5w4jlEcwGBbQTu7l5RWh4ZHt7WpN78iAQb9Ax2AmJzevpqbvRWYYMOgv6Cj3isrK0jLmLT3dATHn9+8/GhobxWIxMeCD7bSN/+gody63XZ05CEYX2tt5IpG4z9G3CDweT/0JA4mIiY0nzkVFlB18DgkJ3bRhk4mR8eaNm8PDI0tKy2B7SwuHeLUlEmlWdk5zcwuotrGpicfjt7a1wQcpnJBMBr+isakZNgqFQm47rx0+CQStrW2wF+i+ra2zJS03N7e0tKysvELVyHo4Ez5/ZGYroZfcQQ1wZeFawBWBC1pf39B72lrf0AC7NLe0wN/Ol64dWJrby9MiEAggWAI9we2pqaltampuaGgUiUToJS5RHwh1dfX37eyDgoJhF/gWAFgiM/iLPsAWGXy94qnC+aMfUlJa6uX12VDP4O49+48fvbBi1YAz3LZ1+/v37p4fP0FF13lcFRcBfTv6AD8EfrXji5f+rMCSktLKyip7+/vHjp0oKi6G0wsPC1+6eOlkzYloLSBErYmTbG7b7Nm99/Ub118+vvfu3rt4/sJCC0vNceOna00LDgkrKiqB7XNmmzk5vbx541ZcfCKfz/fzD/jx81d0TFxIaPiPX77FJaUCgbCxsQn0jWZsrqmty8nJK1S9GgU8Ni1qT3Y7tKCX3MGvw2X68vXb1avWvn5+2tNnHD1y9L37h6DgEODPnz7v33+AS+/o+OLr12/R0bGrVqyys38w4X/jjA2Mdu3YBXfr1MlTz5877ti2w8fHFwTXOS1ZcYmvn7+fH8ufFZCenslmV+vO1Dl/7vzZs+dt7tx7+84dbvzMadOXLF6yd/fey5cuv3z56uMnr8ysbBA23Mjz5y4gccDJzJs7z9TYFCxt79nm5xc8eeJw5tQZOL3FCxfv2L7z8WMHOPkH9x+AE/X2/rZh/YYZU6ddsroE2kVH0J+lN8fUjMfnIwWjJwd+NZwmSDw5JdXhiUNsfEJBQeHMqdPRLsuWLjt+/MSB/Qfs7tm+eePs7u4Bv4LDaQWvmpSUzGIFwl+4aPZ29vq6s1YuX6mnowt7afwzDjhFc+LE8ROAa1avWTBvPmmJZmVO0tCEvUyNjD96foKHc92adc7OrnACixYummc+NzEpGS4mnPbZc+fh2f7x0yc8onMWBnjM4W9dfT3UEimpafDgVdfU4lUTVAUJickBAYFwwr2MTftjoF0wk5KSCt4Frv7UyVOIN0NNThyvMWmCBvr88NETZZcGWif+q4qg1N279gQHh07W0CQVIYK80AeQCP4BP3PNcSq1dfzYiWvW13Nz89auXrNr5y7v7z/WrF5rZGA0seu0V65YaT7HvJfdgXPN5ljMt9CaNHnJosUGevqzTWYrSxmKkpJTLOYvwLe8e++OnzOJcObr166HH7t3zz64/nAXDh/qnJt2utbUmNjYisqqbVu3xcbGgQvPyMjy+vIVHHlXZSJpam4ODA4B+zYuNyU1HbZ33cMewANQVcVGM6lgm0YUtJM7xCcGs/SJN2PA1JkxE9fiAAgxLjhUjS4FDzn1dfVIWxBB7m5v36PHBpH4mUjYDhUUaSMiqD86JgYqGeJjADFbUDC2BVzAxg0bwZ2jIqhzCgqLUtPSwTeDZH39WYsXLYZLt3XLVojBa2prIUaHoAWqo7Wr10Jsk56RhYJvsAd/Dx8gCkxOTlFuPoZQEPtED9BO7nAFz545h27DUBEXPbirKZqT8O3guUm+f8QJKnz69Dny9CDN06fObNq0hWiAc+6cuYmJyfABfh3+TELEderk6RXLlkOuuXjhomlTtHBNQ4w
Bznjm9Bnw+cL5CxA9x8XGPXF4unfvPviblp7RlZP8V1pWnpqWFhsb6+vjC0EjSBxuCgR1kGdDcJiamg6hF6SnaPuoA+3kDohPSFy7Zi14L7jrEICiu4U4ZWKPWFURgplOff8zbuqkznDI3Mz8/v2HoGyIMeDIbm5vkRmEKz9/+Rw7dhz9q0yIi0hbhpxwnjrTZ+KfwctCFqirrWNoYKg5fgKcNpfL3b//oPa0GYZ6+hBAo8gKxD1/7vz7Dx5BWLxu7XrQ/fJly6dPmQpHePjw0YcPnq5ub+FnQryenp4BKQQ6PrhnSEimTtaCaxsbG49d665ZlwMCgyFdycsvAFedlZ2LFfyNoKPcoWYER+Ll9eXLF++EhMT3791tbe3XrFoNedjWrduhBj937gLcy4WWiwxmKYQEEM6amZgGBQb9+uUDaRbU5gvmzodUFWJHsIeM7YnDMwhAz545C4davnQZKyDQ3eMjfIaHas/uvfgixrAFjnz37j145JYtWXrs6HH4AOls51NE+DqcyBnj1PhnvLGB4aQJnZnfIstFkGtqTZwEn0m7wwEtF1jA6V04fxHSwSOHj1RWsYtKSkHlRvqGUMVBdmhrZ79v335wvVHRsdz2ds+PXi9evITflZ6R+fmL98NHDp6enz54fPwdFb1x/cZDhw5DKAhFEJZAQBgWFlFeXrF923aITNavWx8TG19ZVeX9/adAKEQpMo62rkGS1TU1ULVim/5S0FHulIA7BEFkRUUFeCP4FyrrwqISm9t3Thw/8ft3VHx8QkZ6RmFhEaRQ+D2TyWRNTU3oc1sbF7IlANxX2F5TUwtHS0hMYrOrS8vKrly5Cgf5+tV71crV8+fOi42Lh1QSROPg8Mzr81d4uu7ds4uJjZul3dnu0aXUGfBcQVUD0cLePXvfvHE5eODQjRu34KmYPkVr5fIV4EoDAoJYrADI7Xbu2Hn5ivW37z9u3bq90HIh/Hvt+g14gNFXwLl1Rsxl5RAuoM9wevX1DQKBAPQNMQM8n/AXb+tAgF9R39BYW1vL4bSiFlgw43A4X756g+duaWk5evT4xYtW8AMhiWxsauJy26OiYpAldogxiVEjd0rw+XySoxoYQD08Ph9UVcWuRjkZbETKkMpklZWVYrEEHrCKikoQq7OzC8iI09paDJ65pNT9g2eXnrgQh+zbu8/Pn1XFZsOWrgN3AgT36vWbwKAQkG9ySmp2Tm5lZRVWNtTg8fjwYMCZp6Wlh4aGYVsZdGN0y/3PAzWoEZ8xeDaSklJaW1uVm+FwwF65efmoNQMcLMlVM/hjYOTOYAyBkTuDMQRG7gzGEBi5MxhDYOTOYAyBkTuDMQRG7gzGEBi5MxhDYOTOYAyBkTuDMQRG7gzGEIZF7jKZjMvlJqekffv23cnp5X07O+CbN86+vv6FhUVixXkaGDD4YxhKuYOIW1tbP378tG3rtulaU/GO3SSam82xsbnDZlczomfwhzFkcufxeM7OLiZGxiRxAzX+Gac1cdLUyVOIQyd1Zmh7eX2Wq5hSggGD4cDQyD0/v2Dd2nW4lHFqjhtvucDi/Tv3jIzMvLz80JDQndt34oN6NMdP8PDwxA7BgMHwYwjkHhn5W1+HYkaHSRM07t69x+UqjE6HwP3Y0WO4jfb0GRUVQ7++EgMGlBis3KOjY2ZOw+YAInLiBA1HxxeU4xgKi4qJozYfPXqMFTBgMMwYlNyLiovx4ZtEgpovX7oiUTGsTt7RYaDXM5PM+nUbsAIGDIYZA5e7QCBYuWIlrloiFy6wbG5WuYx1R8d/psamuLGhvgFWwIDBMGPgcn/48BEuWSInTdAM7ppITRXq6uuJTTS6M7WxAgYMhhkDlHtJaamqGY42bdgkEvU2IaCn50ei/dLFS7ECBgyGGQOU+8kTp4iSxTlpgkZE18SwqtDa2jZn9hziLhcuXMTKGDAYZgxE7qWlZfgsuyQuslwIMT1mpwTQ+t49e4n2kzU0ExOTsGIGDIYZA5H7vbv3iJIl8s0bZ8yICgKh8OyZc/gspDO0pr50esV0JWDwx9BvuQuFQj0V04RrT5te1teaSjKZvIrNZgUEBQWFNDQ0Mlpn8CfRb7lHR8eQpvbEuXb1mgH3gUlLS/v+7Zsyo35HKT8SHA6HZIbo6+PL5/MxIyrIZLK0tPSHDx5t3brNxNB4+hQtIwPDzZu22Nvfz8jMpDz5kpLS2Ng4ZWZkZmEWXeC0tmZmZcfHJ+Tl5YvFvU0GzePxc3LzOie1zMzi9Xq2vaO9nVdeUZlfUFhWXo5m+YMw0s/Xl3RNEOvrG9BevUAikZaUlsFP8/f3//HtG4vFgpNkV1dTOiQ4IOkrEFn+/miiNYBILM7KzgHHBvclMSkZHCXargoikQgi28ePnuzevXfe3Hk6M2bONjHdsX3nC0cniJ8xo8Gh33K/eeMWSeU4X796gxn1E1KpdOf2HaSjIVpbX8OMCAgICCSZIZoYGrW2tmJGSoBLuXrlamIbKJGQjUD+3dbGxay7sXvXHs1x45V58sRJKJXJ5bFxcUcO/zttihYcGbZPnKAxd475z5+/SG+U4aFNTU07c+bczGnTwabTcvwEAz19FxdXNBmqmiguLnn69NnKFaumaE6cpKEJpw000jd8cP9hZOTvyRoU09WDewLVYvsrAU6ssLDozp17JkbGcCg4MeTO4C98huRq3dr1CUr5lYeHJ/ErcHYmb0IhPPwvXjiZGJnAb+w62jjN8RPMTGenp2dg+ysCXICb21szUzMwIx0QEQJgp5evMOtBoH9yh1Bkzmwz0qkgak2anJGRidn1E+DkQCKkAyL6+PhhRgTctblLMkPcsnkLZXQEj9PjJw6kaakpeejgIdw5AfgCge5MbZINovMb59zcvK1btoImSEVAuMdWF63wQxWXlOzds4/yXoLlnl17uFzyY6aMqir28WPHVT2uQMozAc42ma1q4tjm5pYrl68qr+kwZeIkY0Mj/IDwb0xsHLZP1xNy9vQZ3JjIo/8eZbECYF/SdkTwR8rVb0pK6kKLhegZ64Ua48Y3NTdj+wwU/ZN7eXmFqjYZfR1dUm8w9VFZWUUpBXAtbDaFW9qwYSPJEvHhw0eYBQFwp62tr1NezQXz5v/86ZOekQUyQlvAX8bF90z1X1RcDI8xbk/kwYOH+1z5A2pCkUj8xtmlz1UYoH5Ay9apwi8fX8q+Sepwx7YdlHFaTm6uuWKjMBASMBdXN3j8YJfyior5c+eh7URXApd0xbIV+C5EmpvN6eWBBIaFR6CDAOCAP378pBwaAce/dNHq9KnT+Fpa8POFwn5Ug5Ton9xDQkJVPYU7d+zCjPoPXz9/0tEQoSaBCBUz6gZU/ZQSBFcUEdFzKXE4OjpRur0Vy5ZDnYtswMnhDunGjVtoIyAoKLhPr9ML4XFdv269OkeAmsfXl6IeQ3iruFoTImy5dOlycUlpO4+XlJwCbpXyZwIhM8EOREBmZpbyymSGevpQZWEWXXBzdUNF5nPMcbnDw6DKC/TJ799/oIPA0X798tVScgQQFnp5fYY4AplB1K4zU3vq5Cne3t/QFo
SGxkbsU3/QP7nb290nnRzOwXRsvHWTOh/YtGGjcp/K7JxcSgHB019aRk5oEhITScvdIEKIUl3dsw49uKvN3TUGfMC2/vcfRLT4LiTu3LEzIvJ3C4fT3NLi4PAUXwKJRDjVXTt3gUsDszYu9737B1WWK5evIMZROMCvK1d9oGx4BjCLLjQ0NBrqYauPkBgWTn7xV1tXB5k6yWza5CmQm2IW3XByeolKly5eissdQnl8LxJBl7b3bEtKSyExvXAeW6OTSLSCAyAjM1O5vgIHAWkPMsABcVSVYiVfU1sLSQKkHNj/aqN/ct+7dx/p/BDh6gcEBGBG/QRIbYuK1bYePHiIGRHw9ctXkhnibCNjCLUxoy7w+YKli5eQzBA/KA4rgYp7146dqGjjeqyHJtzdVSr6wEGeit97ADyTxE78RBrqG5Deu9na2pNsECm7/oNuZlDFMDdv9lRBCNnZ2ZRRBAiIo7hmL/zYPbv3kMyAtrZ2xB+FEB0Tm5ySmpObF0uI3Z2dXUj7Is7S1olPSMSM/vsvKiqKVOEsW7IMhVWQrS1dRHFrzpw+o+zgcnJyN2/aAl5/2dLlEE7zeDzwgzO0pip7tz7RD7nL5R2qUhCo2sDpYnb9BIgSXxSJxMjIKMyIAKuLl0hmiIcOHsYsuvH1qzdl/Q6OgXRNwa2uW7MWlZ4+fRZthAhHZwa2ThiJX754Ixsc/qwAkg3ORsVqNyMzi2SACKeamamQ60skEoiYSWbAeeZzlV9dv+taD1mZlgssSHkquE+SDdDUyLhZvUQQHomjR46Sdke8csUaM+oCVDgzpk7DS/FICY7g6PgC345zlo6ucqqWmpo2XavnIPfu2u7asQs+gFeSKz2cfaIfcm9tbVWOtBDhV9Wp0bJLibz8AtLREKGmq6+nWJVzoeUikiXiK8VmUHiKFlpYkmwQvypGgQBI+fEu+B8+YI4/OztHVYpZoFSNJqqu3+vq6jCjLpSXV5IMEME3Z2dlY0ZdgMyB0mH7+PhiFt0AAamqXg7uP0D02RBjzDGlaFt74fQSs+gL4J7ndeevJELejxl1IzgkdPu2HRs3bLp75x67O3psaeHoUQ1/u3L5Kql6gbNdMG8+0QbFsXBZlC+COuiH3EvLyqFmJH43Tn1dXfSmYwDw+vyFdDREi/kLlButQJeUCoCAmBR3xsTEUDYiGekbosXMiPjcfQ7gzvGI4rPXZ3wvIqHKVq5wf/n4kcwQ9XRnkTxxamo6yQZRe/rMisqeYEYslixfuoxkA1T21gC4UIssFpIsEd+8ccGMuuDt/Z1kAIRMpqqKjVn0hfr6BsrcCaL20tJSzKhXODu7kvYFTp00OS8vH7PoRkJCImX9vG7NOuWbqA76IXeohVU1XS9dMsBOvPA0W124SDoaIsktIYCmKa81yJSYesKOJ46fJNkgXrt2HTPqBtRaoCFUevbMWRRcAs6epV7NGNwVMiDCxuYOyQxx9crVJHWqekEDIQox94BKnDLJpuyVVFtbp6Pi/UBaWs+bHbgsa1atIRkA9+7ei//qPhEaFk7aHdHMxBRv6eoF4CmWL1tO2he4ZOEi5deuvj6+E5Rut77urLx88oOhJvoh97j4eFUvvSgVoA5ACqtWriIdDdHZxQ0zIuDN6zckM0TL+RZEj9vc0gKRIskGCI9KuuK7MKiUrl61RqUQv+L+FbbPNSO3SSPa2dohGxxgvGHdepIZ4iWrS5hRN86cPkuyQTx18jRm0YUrV66SDIBQX9VRBXgxMbEkS0SdGdqQ2GFGnW+7SpXXvIdr4v2VnIr0ggf3H5COgAjpozrPTFFxCaXDsrOjaC3NyMgkBRQQiaWkpGLF/Uc/5B4aqrLR/eCBQ5hRP8HltkMlSDoaEKowyne0kI+SLBHPn7+AWXQhNSWVMpIxN5tDfCogQ33y5CmKjiBVCA3rWZmRXV2t3bWeujLBvWFG3YD6wdTYhGSG+P2HQrMaVFcL5i8g2SB6eX3GjDr7w7SbULUKqBrX6/D0GckSceXylcS65W33iuFEQt5VXFKCWaiBHduxJiwSHzygeMenjHfv3pN2BMItSEzsadLBAc+Pu/sHYr7r5vYWKxsQ+iF3FoulSu5H/z2GGfUTySmppEMhGszSa2zsWZoUAZRKWiYb4z/jPnp+xIy6AIkX2aaLZ071eFAIfo4fO4G0DsGrr68fMXaCq0+ZJMBz2NREbsEoKCikrPfgCJVVCkuowo+iDEbhmSe+31GVJT+merkBl0VVjyOIzYg/at++/SQDIAQhAkEfnbdwwHdR9qoAYURG/saMesXRfyladSA+qVVM6HHA+RPfJ/j5s7CCAaEfcv/18yf+rSQeOfIvZtRPuHa/tCNx6aLFyrlIWXkFpQRBGWmKfY/2KA4iwen+wROCgfT0jJs3b+vrdj45cJ8WLrBMTk4hygLg5PSKuCPOBfMWkCwBn1QktWamZqRXAdHRMSQbRGMDQw6nJ/B97/6BZIAYG9fTwQEHl8vFX/WT+JnQYAqeUp/KWcAzgFmogcKiYtLuiJBnkx5sVVhIlVJbzl+gKvVsam4h3vSU1DSsYEDoh9x9flG01yLu29uPS0YE3lmFRMoRfaq6MOhMn0Gc+EAgEMxT0eEMJA6OBH+vCY7q3l1b0lsYhN27KF7EAP9VerBB/efPnSeZIW7ZuJkYOwEcnzuSbBAhQiA+RSdPniYZAOGuc9speiVVVVWp6lVBfBFTXVNLefXgCmAWauD7D2qXB3kOMUlQBbg1lA7r8KEjmIUSIGXH60M4f8jKsIIBoR9yDw5W2YFkYKkq3GCL+ViTCIlfqZIne3vqLgxrVq0haqWhsZEyT8UJl2/uHPO7d22r2GxlVw0AVZkqvWNHdHFxxYy6AW5p1QrqbFs5/QK/QLJBfP78BWbRlb4voPLW88znUZ6tqjdcJoqvmSFupLx979+5YxZq4CpVAg08eOAgZtEryisqSDsi3lcd938gVHRw17CtA0U/5B4bG0cZdwIXL1yEGfUHEARTHhCCk6Iiiu4QmzZuJlki3iR06gJUVFYqd8YAzpw6beumzY8ePU5LS1fueUZESUmpqhmMUwntegicFs4sbazXHonhiqPUwdPP0qGYhQpqm4SEnpcGLRyO4SyKx3Xzxs2Ucre5Td0GumHdemKeGhIaRil3n1/kd0O9YKWKB/t1r4M2cahK1T596knTSdi//wBuduL4CWzrQNEPuefm5avq/as7YyaP1++BOZG/o0jHQTQxNFaus3h8/jSqNhyNceN//vyJGXWhqLCIMs/bv3cfKbSgBEhKVTUybYqWcoiZouIWgjGp+0BJaRml4CCmqqmpxYy6+rVTNgopx1EAiUSySUV36Nu3bTCjLnh7e5MMEIMCgzCLvgA+gvLFC2xMTk7GjHpFRMRv0r6IwUHBmIUi2rhcuIy4GYRSWMFA0Q+5gwRVvVTvfCWWX4DZqQ1VLbhrVq1W1mVObi7lw6Y1cRI8h5hRF3JzcynbSYjNMr0gI4Oipx4iKWpCcHGhzrYt5i8gvU8NCAiklPuSRYvx/q6AoqLiaZN77jHOi4qNr
QhtbW2q2kADAhU05OnhQTJA/NHdI7d3SKWy69dvkPZF1NPRra7pecfXC4KCgkn7IsYR+p8RERwcgtf/kJ9wWtuwgoGiH3IHLF9K8T4MCHfx/TuF/qh9Iisrewah6w+Rt24puCWEHz9+UmpFe9p0Ul+D7OxsSssL585hFqrR0NCgqqcN0OqiFWbXDblc/u/hIyQzxL2795Beu0DQRbJBJA1QhCtD2Un42tWrmAUBhYVFlD8WXENtbU+NAVAl95dqDIqDOgRuivIrKkTL+Qsouy4rw0/FwIbfVI2YcPW2bd2G2xw9chQrGAT6J3eHJ0/xrydxyaIlyt30VAECAOWxBTjxTlpEXL9+k2SGCN+LWXSjWMUQJMpeCURwOJwtm7eS9iLym5IjFApFixctJpkhkrqsAVQFvh8+eGAWXUhPS6esnS4rvaAFwCmRzBCNDYzaFZtxKDtCAg8fVtkqggARxfHjJygfKsRz5yiqHUoEh4SS9kV8S+Ur4+Lj8focvj0xSa14qXf0T+45uXmUDUlAqHQeP3pC8mfK4PMFTk6vVHU1Q3RVfHMGGg0JCdVVkQ5aLrAkdQTNy8unjEZmm5j20q+Iza5eT7UiA5HlSv3Rm5qaVI3rUX7XrSr9JYUTEAdTamvXjl2kx1UskexTMQLB2JAs95jYOMrD6mnr1KgORRISk1BLOVTFqm79x4+fMOu+AJIl7YsIV550ayBIW7ywp+vrju07endVaqJ/coev3L6N+gUeEJ7FK5ev1ijWoQiwY0VllYuLq5npbHTRIQ1Q1QMHot6CwiL088rLKy6cv6jqQgPhSz998oLaFowhXYbPEEqSbHA6UXVzhUc0LDzC2MAQDCZrTNy4cRNxF5wgIOWMQlW2PUtbV3kOZFXdp4m91iBBohx7AYSkjTioB077/v2HvThdD09P3PtkZ+dsVJHRAg/sPwguHFkiwC9NTUs/dhQbCQ7B1RtnV8rvgh+VqTgHSS+AjJy0OyL4SkisW7tD87q6+h0EmWlNnExKzwaM/skdkJWdTZky4pyiOXHzpi0Qpzo+d3zh+MLO1g6umsUCC2KbI1w4qMHNzahfBgHBeLbp7Hnmc3Ghq0rIgHA07ekzLS0WgiDQLTHUN6C8N3BYB4dn+PtLUENKatqhg4eQMdQ5b5xdLl60Iu6CE+JIZQfzVEVnlSULFyv374OLQDJDhCt2x+YOVPSvX7/p5WcCIa8ABwlBY3V1zdWr1pS/ESeUzjOft2vn7mVLlxMvPiVNjIyvWV9/6fTyyaPHp06exr0SUHPcBEjH4cGm/DoIShsa+jFs1FxFxzsg3L4N6zdu3ryFVGFSDmobGPotd8Dz5xRDUdQnuApn586XNSdV9NElEZ6u588dWayA3u8uTrjBFZWVcKdJ23FCxbJm9RqopkyNTfFjTp00+csXb6hSF1tSdxxXfvsok8n3q4glKFuIrayox2GRuGXTFjOT2aSNOOGEEeEziPjM6bO9ex+c69eud3BQmXqpIlwrcEzwnD9Vse+yJUv7jGCJUNVTWhV379otVaP5WE0MRO7w86D+JZ2WmjTU0w/vnnoBPKuqZB8nROE+Pr5wudvbeb17PiBUBdZXryG3Gh0doyqqVqblAgs0ZLi+vsGA6hUPMDSU3BFSLBbPNZ9LMkP84KHQZQ0hIzOrzyd27+693PZ2d3ePPi3B4OSJU3w+nzTLLCWtLl6CJxmuDN6zXx2aGBpFx8Sik9+zm/pbrl+/iQzUBMRjxB6OvXP9uvWkDGSQGIjcAaA/N7e3qrrIUhIStZs3b7W0KHRQgdBCleLhdm5Yt4E4TC4hIVFVizhQX1fv+/cfRE/j5+ffi30n/xk3c9qMhw8ft3f398jMzKTME8CDks4cABEFpSjhCAUF1G8hnj59riqugFDq+rUbaCoVEPGG9RtIBkTCRTt96jR6sGtr61YuX6Hq8QD/8qvLX3R9/3+lZWWQspNsSIRDwXW7Y3MXD6alUinlsF2wVJ44oE98/vKV8iITCUf+98hR/L4MFQYodwS437dv2RjM0lNVn8Kvgjxjzeq1EJVSjkuA2wBRyrIlyyB+xXUAR1swbz6egBJRVFxy6OBhcNsTJ2igGwz5Ltyba9bXIL/BjAiATPfM6TPwpOH28Bc+Q4K1Ytny16/ekLoZu6jooWm5wJL4Qh4hICCIZIYIWW+L6p5Mfn6sJYsWT+7+vXA+8HuXL10eFhaOixIAme6+vfuULyzYQy7+9u074pspHo/30unV4kWL4XdBrAg/EB4eI32Dhw8fKZ9JY1MTxP3gqojXBC4j7DJDayrkXeDISLO4FBUXUz6lEG0rj7jrE+imzzY2oTwmbDQxNPby+tKvGElNDEruCOCKEhKT3Fzdrl+/cfLk6ePHTpw9ex58w1u3t1FR0aBC4l2kBPywwqLioODgT54fvT5+Sk5O6f21RX1DQ2xcPPhyzw8ekOEp94wnoam5GSrlr1++eHz48PPHT9iXsvkIACfs4uyiTH8/lvKvyMjIIJkhen/1Vn42iIDfW1BQGBgY5Onh8enjJwjqJBIKe5lMBudzyeryyhUrQbtzTGdDWuLq4qpqflM4bHVNLURl8AMzs7J7n3ML8vW4uPhv377BOcDfkJBQiLVa26hfW8Llgkvt+eEDiV8/f1H/ZQsJELP5+vpBlLVu7XqocGbN1LaYv+DYseM/vv8ccqeOYwjkzoDBaAEjdwZjCIzcGYwhMHJnMIbAyJ3BGAIjdwZjCIzcGYwhMHJnMIbAyJ3BGAIjdwZjCIzcGYwhMHJnMIbAyJ3BGAIjdwZjCIzcGYwhMHJnMIbAyJ3BGAIjdwZjBv/99/8kDa6t9Ywk7gAAAABJRU5ErkJggg==" } }, "cell_type": "markdown", "metadata": {}, "source": [ "![cuemacro_logo.png](attachment:cuemacro_logo.png)\n", "\n", "# Populate databases for tcapy\n", "\n", "**Saeed Amen / Founder of Cuemacro**\n", "\n", "https://www.cuemacro.com / saeed@cuemacro.com / @saeedamenfx / All material is copyright Cuemacro / 2020" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Introduction\n", "\n", "In order for tcapy to work best it needs databases to ingest trade data and high frequency market data. Here we illustrate how to download market data from external sources, and how to populate your market tick database with it. 
We also show how to populate your trade database with your own trade/order data from CSV files, and how to dump trade/order data from your database via tcapy.\n", "\n", "This tutorial assumes that users are using MySQL to store their trade/order data and MongoDB for their market data, and are running on Ubuntu/WSL, on `localhost`. You can set up both MySQL and MongoDB on Windows if you prefer.\n", "\n", "Whilst our focus here is on MySQL and MongoDB, note that tcapy supports several databases, via adapters:\n", "\n", "* trade/order data - Microsoft SQL Server, MySQL, SQLite\n", "* market data - Arctic/MongoDB, PyStore, InfluxDB and kdb+/q\n", "\n", "You can also read trade/order and market data from flat files (CSV, Parquet or H5) or direct from external sources. However, we would strongly recommend, in particular for your market data, downloading from your external source and then dumping it into your database. For trade/order data, it is also more convenient to use a database, rather than using CSVs.\n", "\n", "If running locally, we are assuming that users have already installed MySQL/MongoDB on their Linux box, using `install_mysql.sh` and `install_mongodb.sh`, which are also triggered by `install_all_tcapy.sh`. You also need to make sure the databases have been started locally, using `restart_db.py`.\n", "\n", "You can of course use databases which are not installed locally, in which case you won't need to install anything locally, but you will still need the IP, usernames and passwords to access them. To begin with, as with the other notebooks, let's set our paths. We are assuming that we are running the front end on Windows and the backend on WSL/Ubuntu." ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "ExecuteTime": { "end_time": "2020-04-08T18:19:57.042226Z", "start_time": "2020-04-08T18:19:57.033250Z" } }, "outputs": [], "source": [ "import sys\n", "import os\n", "\n", "windows_tcapy_path = 'e:/cuemacro/tcapy' # Windows platform\n", "linux_tcapy_path = '/home/tcapyuser/cuemacro/tcapy' # Linux platform\n", "local_test_data_path = '../test/resources/' # Windows platform\n", "remote_test_data_path = '../test/resources/' # WSL drive\n", "\n", "# For dumping market data files\n", "# Assuming Windows platform (you might have to change these paths)\n", "# wsl$ gives us access to the WSL/Ubuntu drive on Windows\n", "# Make sure whatever paths you add exist!\n", "csv_folder = '\\\\\\\\wsl$\\\\Ubuntu\\\\data\\\\csv_dump\\\\'\n", "temp_data_folder = \"\\\\\\\\wsl$\\\\Ubuntu\\\\data\\\\temp\\\\\"\n", "temp_large_data_folder = \"\\\\\\\\wsl$\\\\Ubuntu\\\\data\\\\csv_dump\\\\large\\\\\"\n", "\n", "# Assuming the front end is on Windows\n", "sys.path.insert(0, windows_tcapy_path)\n", "\n", "from tcapy.conf.constants import Constants\n", "\n", "constants = Constants()\n", "\n", "# Should we download the tick data or upload to Arctic? (this will take ages!)\n", "download_tick_data = False\n", "upload_tick_data = False" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's also define the data vendor we are using (by default `dukascopy`) and which database we shall use for trade/order data (by default `mysql`). We'll also define our market database as `arctic`, although you can also select `pystore`. In both cases, we ensure that we are reading/writing to the test harness database/tables, rather than the main tables. 
Obviously, for production purposes, you would not use the databases or tables with `test_harness`; however, for tutorials etc. it is better to write to separate tables. We would even advise setting up a totally separate database for testing purposes, rather than using a production database." ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "ExecuteTime": { "end_time": "2020-04-08T18:19:59.911853Z", "start_time": "2020-04-08T18:19:57.043223Z" } }, "outputs": [], "source": [ "data_vendor = 'dukascopy' # 'ncfx' or 'dukascopy'\n", "\n", "# 'ms_sql_server' or 'mysql' or 'sqlite'\n", "sql_database = 'mysql'\n", "trade_data_database_name = 'trade_database_test_harness'\n", "\n", "# 'arctic' or 'pystore'\n", "market_database = 'arctic'\n", "market_data_database_name = 'market_data_table_test_harness'\n", "data_vendor_test_harness = 'testharness'\n", "\n", "# Change paths as appropriate\n", "pystore_path = '/data/pystore' # Linux\n", "pystore_path = '\\\\\\\\wsl$\\\\Ubuntu\\\\data\\\\pystore' # Windows\n", "\n", "from tcapy.data.databasesource import DatabaseSourceArctic, DatabaseSourcePyStore\n", "\n", "if data_vendor == 'ncfx':\n", " from tcapy.data.databasepopulator import DatabasePopulatorNCFX as DatabasePopulator\n", "\n", " tickers = constants.ncfx_tickers\n", "elif data_vendor == 'dukascopy':\n", " from tcapy.data.databasepopulator import DatabasePopulatorDukascopy as DatabasePopulator\n", " \n", " tickers = constants.dukascopy_tickers\n", "\n", "if sql_database == 'ms_sql_server':\n", " from tcapy.data.databasesource import DatabaseSourceMSSQLServer as DatabaseSource\n", "elif sql_database == 'mysql':\n", " from tcapy.data.databasesource import DatabaseSourceMySQL as DatabaseSource\n", "elif sql_database == 'sqlite':\n", " from tcapy.data.databasesource import DatabaseSourceSQLite as DatabaseSource\n", " \n", "if market_database == 'arctic':\n", " market_data_database_table = 'market_database_test_harness'\n", " database_source_market = DatabaseSourceArctic(postfix=data_vendor_test_harness)\n", " \n", "elif market_database == 'pystore':\n", " market_data_database_table = 'market_database_test_harness'\n", " database_source_market = DatabaseSourcePyStore(postfix=data_vendor_test_harness)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's also define the mappings from the trade/order keys to the underlying SQL database tables." ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "ExecuteTime": { "end_time": "2020-04-08T18:19:59.920827Z", "start_time": "2020-04-08T18:19:59.913845Z" } }, "outputs": [], "source": [ "sql_trade_order_mapping = {\n", " 'ms_sql_server' : {'trade_df' : '[dbo].[trade]', # Name of table which has broker messages to client\n", " 'order_df' : '[dbo].[order]'}, # Name of table which has orders from client\n", " 'mysql': {'trade_df': 'trade_database_test_harness.trade', # Name of table which has broker messages to client\n", " 'order_df': 'trade_database_test_harness.order'}, # Name of table which has orders from client\n", " 'sqlite': {'trade_df': 'trade_table', # Name of table which has broker messages to client\n", " 'order_df': 'order_table'} # Name of table which has orders from client\n", "}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Format of input data for tcapy\n", "\n", "tcapy expects input data to be in a certain format. If it isn't quite the right format, you can change it beforehand, or use `DataNorm` to change it on the fly. 
We strongly recommend looking at sample CSV files in the `tests_harness_data` folder which have trade data (`small_test_trade_df.csv`) and order data (`small_test_order_df.csv`).\n", "\n", "Note that the more fields we have recorded, the more scope we have for doing TCA. For example, the `venue` field is optional but if it is there, we can also slice and dice trade data by venue, to understand if some venues are more costly than others. \n", "\n", "If we have very limited data, we have less scope in what we can do for TCA. The granularity and quality of the data are also important. For example, very inaccurate timestamps mean that any benchmark we find will also be subject to an inaccurate timestamp. This can happen when trades are booked manually after a trade has taken place." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Trade data format\n", "\n", "We assume that trade messages are points in time, which are returned by our liquidity provider. These can be a range of messages, usually `trade` fills, but they can also be cancel messages (in which case we would not, for example, have a populated `executed_price` field). We give some of the fields here; note that we can have many more than this, and use those extra fields to filter trades. \n", "\n", "Note that tcapy will add many calculated fields to this, such as converting the notional to the reporting currency (typically USD), benchmarks (such as mid price), metrics (such as slippage) etc.\n", "\n", "* `Date` - index - date/time of the message\n", "* `id` - event identifier\n", "* `ticker` - asset\n", "* `side` - side of the trade (eg. +1 for buy and -1 for sell)\n", "* `executed_notional` - amount actually executed\n", "* `executed_price` - price at which trade has been dealt\n", "* `notional_currency` - currency in which notional is expressed\n", "* `event_type` - trade, cancel, cancel/replace or placement\n", "* optional fields\n", " * `ancestor_pointer_id` - pointing to the order of which this trade is part (if orders exist in the dataset)\n", " * `broker_id` - name of the liquidity provider\n", " * `venue` - venue where trade was executed\n", " * `algo_id` - name of the algo used (eg. `manual` or `TWAP` etc.)\n", " * `trader_id` - which trader executed the trade\n", " * `portfolio_id` - which internal portfolio is the trade for\n", " * `portfolio_manager_id` - which portfolio manager is this for" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Order data format\n", "\n", "You can also use order data in tcapy. Orders are sent by the client to their liquidity provider. They have start and end times and can include multiple trade events. Here we discuss some of the fields specific to orders. As with trades, tcapy will add additional calculated fields, such as `executed_notional` based upon the underlying trade fills in an order, and also `executed_price` calculated as the average execution price of the trade fills.\n", "\n", "* `benchmark_date_start` - start time of order\n", "* `benchmark_date_end` - end time of order\n", "* optional fields\n", " * `order_notional` - total amount of notional from the order" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Market data format\n", "\n", "High frequency market data is used as our benchmark in tcapy. Typically, this is going to be quote data. We clearly can't use our own trade data as a benchmark, as this is going to result in circular analysis. 
However, if we have access to a large database of trade data from many different counterparties, we could construct a benchmark out of that.\n", "\n", "Below, we give some of the fields which you can have in your market data for use in tcapy. At the very least we need to have a `mid` market quote, and this is typically what we'd use as a benchmark.\n", "\n", "* `Date` - date/time of the tick\n", "* `mid` - mid market quote, which can be our proxy for the reference price\n", "* `bid` - bid quote\n", "* `ask` - ask quote\n", "\n", "tcapy can use bid/ask quotes as benchmarks for selling/buying trades respectively, although by default it will select the `mid` field. For orders, tcapy can construct benchmarks like TWAP on the fly, or VWAP if you also have volume data (using the `BenchmarkTWAP` and `BenchmarkVWAP` classes). " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# MySQL for trade/order data\n", "\n", "Here we discuss how to add users to the MySQL database and also how to upload/dump data from your SQL database.\n", "\n", "## Adding users\n", "\n", "Once MySQL is installed, you are likely to want to create a user to access it. In this case, we assume the user is `tcapyuser`. In practice, you may wish to limit the precise privileges. The first step is to run the MySQL shell, which will allow us to run commands on the database:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " sudo mysql" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's add a user `tcapyuser` that can access the MySQL database. In practice, we may wish to limit their privileges! In this instance, we will only allow the user to log in from localhost. This [guide](https://medium.com/@harshityadav95/installing-mysql-in-ubuntu-linux-windows-subsystem-for-linux-from-scratch-d5771a4a2496) summarises how to install MySQL on Ubuntu/WSL and also the steps you can take to secure it." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " CREATE USER 'tcapyuser'@'localhost' IDENTIFIED BY 'password';\n", " GRANT ALL PRIVILEGES ON * . * TO 'tcapyuser'@'localhost';\n", " FLUSH PRIVILEGES;" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can also change the password of a user by running the following." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " SET PASSWORD FOR 'tcapyuser'@'localhost' = 'password';" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can also create the `trade_database` database that we'll use to store trade/order data. You should also create `trade_database_test_harness` if you want to run any tests or execute the code in this notebook. If you want to set up SQL Server or SQLite, you'll also need to create databases in those, as tcapy doesn't by default attempt to create databases (though it will create the appropriate tables automatically, if necessary)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " CREATE DATABASE trade_database;\n", " SHOW DATABASES;" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "MySQL can have issues with large SQL inserts; increasing the `max_allowed_packet` parameter can help." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " SELECT @@global.max_allowed_packet;\n", " SET GLOBAL max_allowed_packet=1073741824;" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The default location for MySQL data on Ubuntu is `/var/lib/mysql/`. You can configure it to use a different location."
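] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Once the user and database have been created, it can be worth sanity checking the connection from Python before pointing tcapy at it. Below is a minimal, hypothetical sketch (it is not part of tcapy) which assumes the `tcapyuser`/`password` credentials and the `trade_database` created above, running on `localhost`. It uses SQLAlchemy with the `mysql-connector-python` driver to check we can connect, and reads back the `max_allowed_packet` setting discussed above.\n", "\n", "    # A hedged sketch: assumes the user/database created above exist on localhost\n", "    from sqlalchemy import create_engine, text\n", "\n", "    engine = create_engine('mysql+mysqlconnector://tcapyuser:password@127.0.0.1:3306/trade_database')\n", "\n", "    with engine.connect() as conn:\n", "        # If this fails, recheck the CREATE USER/GRANT statements above\n", "        packet_size = conn.execute(text('SELECT @@global.max_allowed_packet')).scalar()\n", "        print('max_allowed_packet:', packet_size)"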
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Configuring tcapy to interact with your MySQL database\n", "\n", "You will likely need to edit `constants.py` to make sure it is pointing to the right SQL database. Also make sure you add any credentials to a new file `constantscred.py`, which should not be added to version control. Below we've shown some of the parameters you may need to change. Microsoft SQL database has similar parameters" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " ## MySQL\n", " mysql_host = '127.0.0.1'\n", " mysql_port = '3306'\n", "\n", " mysql_trade_data_database_name = 'trade_database'\n", "\n", " mysql_username = 'OVERWRITE_IN_ConstantsCred'\n", " mysql_password = 'OVERWRITE_IN_ConstantsCred'\n", "\n", " mysql_dump_record_chunksize = 10000 # Making the chunk size very big for MySQL can slow down inserts significantly\n", "\n", " mysql_trade_order_mapping = OrderedDict(\n", " [('trade_df', 'trade_database.trade'), # Name of the table which holds broker messages to clients\n", " ('order_df', 'trade_database.order')]) # Name of the table which has orders from client" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# SQLite for trade/order data\n", "\n", "tcapy also supports SQLite for storing trade/order data. SQLite is basically a flat file database, which is stored in a single file on your computer. If you are the only user, it can be an easy way to setup a database. SQLite is included in Python, so we don't need to install it separately unlike for example MySQL. The first thing you need to do is to activate your `py36tca` Python environment. We can do this by running `source /home/tcapyuser/cuemacro/tcapy/batch_scripts/linux/installation/activate_python_environment.sh` on Linux or running `activate_python_environment.bat` on Windows. Then we create a folder to house our SQL database and then run `sqlite3` from the Anaconda prompt to create `trade_database.db`. If you want to run the tcapy tests or execute the code in this notebook, you can also create `trade_database_test_harness.db`." ] }, { "cell_type": "markdown", "metadata": { "ExecuteTime": { "end_time": "2020-04-07T14:47:04.460141Z", "start_time": "2020-04-07T14:47:04.413260Z" } }, "source": [ " mkdir -p /home/tcapyuser/db\n", " sqlite3 /home/tcapyuser/db/trade_database.db\n", " >.quit" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Uploading trade/order data to SQL database\n", "\n", "In this instance we show how to upload trade/order CSV into the database tables (`trade` and `order`). Note that the database table names differ from the nicknames `trade_df` and `order_df` that we usually use. For simplicity, it is probably easier simply to name everything `trade_df` and `order_df`, but if you have an existing database, we recognise you might not be able to change table names easily. Warning this will overwrite your `trade` and `order` tables!! This code is generic so will work with all the supported SQL databases used." ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "ExecuteTime": { "end_time": "2020-04-08T18:20:00.419256Z", "start_time": "2020-04-08T18:19:59.921826Z" } }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "2020-04-08 19:19:59,961; INFO:tcapy.data.databasesource: Parsing e:/Remote/tcapy/tests_harness_data/small_test_trade_df.csv before database dump for table trade (databasesource.py:1036)\n", "2020-04-08 19:19:59,987; DEBUG:tcapy.data.databasesource: About to write to mysql database... 
(databasesource.py:1101)\n", "2020-04-08 19:20:00,168; DEBUG:tcapy.data.databasesource: No empty strings in column broker_id (databasesource.py:500)\n", "2020-04-08 19:20:00,169; DEBUG:tcapy.data.databasesource: No empty strings in column venue (databasesource.py:500)\n", "2020-04-08 19:20:00,171; DEBUG:tcapy.data.databasesource: No empty strings in column algo_id (databasesource.py:500)\n", "2020-04-08 19:20:00,291; INFO:tcapy.data.databasesource: Parsing e:/Remote/tcapy/tests_harness_data/small_test_order_df.csv before database dump for table order (databasesource.py:1036)\n", "2020-04-08 19:20:00,300; DEBUG:tcapy.data.databasesource: About to write to mysql database... (databasesource.py:1101)\n", "2020-04-08 19:20:00,369; WARNING:tcapy.data.databasesource: Primary key already exists... (databasesource.py:1196)\n", "2020-04-08 19:20:00,371; WARNING:tcapy.data.databasesource: Index already exists... (databasesource.py:1202)\n", "2020-04-08 19:20:00,372; DEBUG:tcapy.data.databasesource: No empty strings in column broker_id (databasesource.py:500)\n", "2020-04-08 19:20:00,373; DEBUG:tcapy.data.databasesource: No empty strings in column algo_id (databasesource.py:500)\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "(mysql.connector.errors.ProgrammingError) 1064 (42000): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'order ADD CONSTRAINT order_PK_trade PRIMARY KEY (`date`, id, ticker)' at line 1\n", "[SQL: ALTER TABLE order ADD CONSTRAINT order_PK_trade PRIMARY KEY (`date`, id, ticker)]\n", "(Background on this error at: http://sqlalche.me/e/f405)\n" ] } ], "source": [ "# Where are the trade/order CSVs stored, and how are they mapped?\n", "# This assumes you have already generated these files!\n", "csv_sql_table_trade_order_mapping = {'trade': os.path.join(local_test_data_path, 'small_test_trade_df.csv'),\n", " 'order': os.path.join(local_test_data_path, 'small_test_order_df.csv'),\n", " }\n", "\n", "# Get the actual table names in the database which may differ from \"nicknames\"\n", "trade_order_mapping = constants.trade_order_mapping[sql_database]\n", "\n", "# To interact with our SQL database\n", "database_source = DatabaseSource(trade_data_database_name=trade_data_database_name)\n", "\n", "# Upload each CSV to the associated table (replacing it!)\n", "for key in csv_sql_table_trade_order_mapping.keys():\n", " database_source.convert_csv_to_table(\n", " csv_sql_table_trade_order_mapping[key], None, key, database_name=trade_data_database_name,\n", " if_exists_table='replace')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Dumping trade/order data to CSV and Parquet" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can use the `DatabaseSource` objects to read back from our SQL database. Note that the data here won't have undergone any normalization process, which is typically applied during TCA calculations by the `DataNorm` object."
] }, { "cell_type": "code", "execution_count": 5, "metadata": { "ExecuteTime": { "end_time": "2020-04-08T18:20:00.512008Z", "start_time": "2020-04-08T18:20:00.420266Z" } }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "2020-04-08 19:20:00,448; DEBUG:tcapy.data.databasesource: Excecuted mysql query: select * from trade_database.trade , 660 returned (databasesource.py:959)\n", "2020-04-08 19:20:00,492; DEBUG:tcapy.data.databasesource: Excecuted mysql query: select * from trade_database.order , 322 returned (databasesource.py:959)\n" ] } ], "source": [ "# Map the \"nicknames\" to the CSV dumping output\n", "csv_trade_order_mapping_dump = {'trade_df': os.path.join(local_test_data_path, 'small_test_trade_df_dump.csv'),\n", " 'order_df': os.path.join(local_test_data_path, 'small_test_order_df_dump.csv'),\n", " }\n", "\n", "# Get the actual table names in the database which may differ from \"nicknames\"\n", "trade_order_mapping = constants.trade_order_mapping[sql_database]\n", "\n", "database_source = DatabaseSource(trade_data_database_name=trade_data_database_name)\n", "\n", "for k in trade_order_mapping.keys():\n", " trade_order_df = database_source.fetch_trade_order_data(table_name=trade_order_mapping[k])\n", " trade_order_df.to_csv(csv_trade_order_mapping_dump[k])\n", " trade_order_df.to_parquet(csv_trade_order_mapping_dump[k].replace('csv', 'parquet'))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Arctic/MongoDB for market tick data\n", "\n", "Here we discuss how to add users to the database, how to download data from Dukascopy and also how to upload tick data to Arctic/MongoDB." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Increasing open file limit on Linux for MongoDB\n", "\n", "MongoDB ends up opening many files at the same time. However, on Linux typically the `nofile` setting is quite low by default. In order to make MongoDB stable, it is recommended to increase the number of open files allowed. By default, during tcapy's installation `increase_file_limits.sh` is run which copies a modified version of `limits.conf` to `/etc/security/limits.conf` which increases the `nofile` parameters for `root` user (which is the default one in tcapy for running MongoDB). It also overwrites `/etc/pam.d/common-session` and `/etc/pam.d/common-session-noninteractive` to ensure that the `limits.conf` is observed for the root user.\n", "\n", "Note, that you may wish to use different users such as the `mongodb` user to run your instance of MongoDB, and also to run MongoDB on Linux a service at startup, as opposed to running it via `restart_db.sh`. However, this isn't the default usage with tcapy and would require changing the configuration.\n", "\n", "To check the `nofile` parameter has been currently set for the `root` user, you can run:\n", "\n", " sudo su\n", " ulimit -n\n", " \n", "The output should be `100000` if the above script has been run. Note, that it has only been fully tested on Ubuntu at present.\n", "\n", "If you do not increase the open file limit, you will likely experience a lot of issues, when trying to access MongoDB. This link in the [MongoDB explains more](https://docs.mongodb.com/manual/reference/ulimit/) and this article explains how to [tune MongoDB for a large number of users](https://www.mongodb.com/blog/post/tuning-mongodb--linux-to-allow-for-tens-of-thousands-connections) including a discussion of the open files point." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Adding users\n", "\n", "Once installed, you may wish to create various users for accessing MongoDB, by typing in the following commands into the Mongo shell. We can start the Mongo shell by running." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " sudo mongo" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can add a root user called `admin_root`. Generally, it is advisable to limit root access to the database." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " use admin\n", " db.createUser(\n", " {\n", " user: \"admin_root\",\n", " pwd: \"password\",\n", " roles: [\"root\"]\n", " }\n", " )" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Let's create a user `tcapyuser` who will be the one mainly accessing the MongoDB database via tcapy. Again, as with mysql, we might wish to reduce the privilages of the user." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " use admin\n", " db.createUser(\n", " {\n", " user: \"tcapyuser\",\n", " pwd: \"password\",\n", " roles: [ { role: \"userAdminAnyDatabase\", db: \"admin\" }, \n", " { role: \"dbAdminAnyDatabase\", db: \"admin\" }, \n", " { role: \"readWriteAnyDatabase\", db: \"admin\" } ]\n", " }\n", " )" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can also change passwords of user." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " use admin\n", " db.changeUserPassword(\"admin_root\", \"password\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Storage engines for Arctic/MongoDB\n", "\n", "tcapy uses [Arctic](https://github.com/man-group/arctic) which is Man AHL's open source Python library to intreact with MongoDB. It essentially takes Pandas DataFrame, and then compresses them before dumping them into MongoDB. It is very fast for fetching and dumping large amounts of time series data. The speed is relatively to several points, first because of the compression, it is less impacted by IO limits. Furthermore, because of the compression there is less overhead when accessing MongoDB over a network. \n", "\n", "There are several different storage engines in Arctic:\n", "\n", "* `ChunkStore`\n", "* `VersionStore`\n", "* `TickStore`\n", "\n", "Each engine has various different use cases. tcapy supports all three, however, we have opted to make `ChunkStore` the default engine, which stores the data in user defined chunks. \n", "\n", "In our case, our default chunk size is daily, because typically, when using Celery in parallel to fetch data in daily chunks (and caching these daily chunks in Redis), and it should be very fast to get such chunks. In our experience this seems to be quicker than doing the same thing using `VersionStore`.\n", "\n", "The precise chunk size you choose, will depend on how you intend to use it. For example, if you wouldn't run tcapy in parallel, then you might consider using larger chunk sizes. In the documentation, the reasons for Arctic speed is discussed in more detail, as well as a [summary of the various storage engines](https://arctic.readthedocs.io/en/latest/).\n", "\n", "Note, that you should not try to keep switching between the various storage engines, while using the same identical collection names eg. `market_data_table`. If you intend to use them in together (eg. for benchmarking) it would be better to have different collection names for each storage engine to avoid confusion." 
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Configuring tcapy to interact with your MongoDB database\n", "\n", "You will likely need to edit `constants.py` to make sure it is pointing to you MongoDB instance. Also make sure you add any credentials to a new file `constantscred.py`, which should not be added to version control. Below we've shown some of the parameters you may need to change." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " ### Arctic/MongoDB\n", " arctic_host = database_host\n", " arctic_port = 27017\n", " arctic_username = 'OVERWRITE_IN_ConstantsCred'\n", " arctic_password = 'OVERWRITE_IN_ConstantsCred'\n", "\n", " arctic_ssl = False\n", " arctic_ssl_cert_reqs = ssl.CERT_NONE\n", "\n", " # NOTE: database name not currently used in Arctic\n", " arctic_market_data_database_name = 'fx'\n", " arctic_trade_data_database_name = 'fx'\n", " arctic_market_data_database_table = 'market_data_table' # Name of the table with market tick data\n", "\n", " arctic_timeout_ms = 10 * 1000 # How many millisections should we have a timeout for in Arctic/MongoDB\n", "\n", " # https://arctic.readthedocs.io/en/latest/ has more discussion on the differences between the\n", " # various storage engines, and the various pros and cons\n", " # By default we use 'CHUNK_STORE' with 'D' (daily) bucketing - corresponding to our 'daily' fetching when using\n", " # with Celery\n", " arctic_lib_type = 'CHUNK_STORE' # storage engines: VERSION_STORE or TICK_STORE or CHUNK_STORE (default)\n", "\n", " # 'D' is for daily, you can also choose 'M' and other chunk sizes (depends on how you wish to cache data)\n", " arctic_chunk_store_freq = 'D'\n", "\n", " arctic_quota_market_data_GB = 80" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Experiementing with PyStore for market tick data\n", "\n", "Another choice for storing your market tick data, is [PyStore](https://github.com/ranaroussi/pystore). It isn't strictly speaking a database. It is modelled on Arctic, but instead of using MongoDB as the backend it is Pandas data store, which uses Parquet files on disk, which have been chunked. If you are the only user for the market tick data, PyStore can be an easy to use option (combined with SQLite for the trade/order data). In order to setup PyStore, you just need to make sure that it has access to a folder `/home/tcapyuser/pystore/`, which you can create with:\n", "\n", " mkdir -p /home/tcapyuser/pystore\n", " \n", "Note, that with pretty much everything else with tcapy, this is fully configurable. You can change the default PyStore dump location in `constants.py`. tcapy also has adapters to use InfluxDB and KDB for accessing market tick data.\n", "\n", "*At present we are still our working on our PyStore, and it is highly experimental, in particular, we are trying to speed up the appending of new data, so you may choose to use Arctic for the moment*" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Downloading market tick data from Dukascopy\n", "\n", "We are going to show how to download large amounts of tick data from Dukascopy to disk as Parquet files. We are later going to upload these files to MongoDB. Note, the syntax is very similar for NCFX, we just need to instantiate `DatabasePopulatorNCFX` instead of `DatabasePopulatorDukacopy`." 
] }, { "cell_type": "code", "execution_count": 6, "metadata": { "ExecuteTime": { "end_time": "2020-04-08T18:20:00.522979Z", "start_time": "2020-04-08T18:20:00.513006Z" } }, "outputs": [], "source": [ "start_date_csv = '01 Jan 2016'; finish_date_csv = '03 Apr 2020'; split_size = 'monthly' # 'daily' or 'monthly'\n", "\n", "db_populator = DatabasePopulator(temp_data_folder=temp_data_folder, temp_large_data_folder=temp_large_data_folder,\n", " tickers=tickers)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we can kick off the download. This will take a very long time, and hence only do this once! It will dump Parquet files to disk in `monthly` chunks, and will also create smaller temporary files. Having the temporary files is beneficial, so if the download gets interrupted, tcapy will ingest these, rather than redownloading externally. In this case, we have set `write_large_csv` to `False`. Whilst CSV files are universal, they can be very large in file size and slow to parse, compared to Parquet files." ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "ExecuteTime": { "end_time": "2020-04-08T18:20:00.527965Z", "start_time": "2020-04-08T18:20:00.523976Z" } }, "outputs": [], "source": [ "# Writes a CSV/Parquet to disk from data vendor (does not attempt to write anything to the database)\n", "# Will also dump temporary HDF5 files to disk (to avoid reloading them)\n", "\n", "if (download_tick_data):\n", " msg, df_dict = db_populator.download_to_csv(start_date_csv, finish_date_csv, tickers, split_size='monthly',\n", " csv_folder=csv_folder, return_df=False, remove_duplicates=False, write_large_csv=False,\n", " write_large_hdf5_parquet=True)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In practice, we'd recommend kicking off the downloading script `dump_data_vendor_to_parquet_csv_hdf5.py` separately, rather than running it in a Jupyter notebook, given the amount of time it takes, and also the amount of log output it generates. In our `csv_dump` folder, we'll end up with a bunch of monthly Parquet files which look like this." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ " AUDUSD_dukascopy_2016-01-03_22_00_01.868000+00_002016-01-31_23_59_57.193000+00_00.parquet\n", " AUDUSD_dukascopy_2016-02-01_00_00_00.055000+00_002016-02-29_23_59_56.657000+00_00.parquet\n", " AUDUSD_dukascopy_2016-03-01_00_00_00.712000+00_002016-03-31_23_59_50.885000+00_00.parquet" ] }, { "cell_type": "markdown", "metadata": { "ExecuteTime": { "end_time": "2020-04-03T18:56:56.092036Z", "start_time": "2020-04-03T18:56:56.079820Z" } }, "source": [ "## Uploading market tick data to market database" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We can now upload the tick data in the above Parquet (or CSV/HDF5) files to our market database! First we need to define the data source, the tickers we want to upload, and also the format of the data. To make things quicker, I've just filtered for 2017 (mainly because that cover the period of the pregenerated trade/order CSVs in the `tests_harness_data` folder that comes with tcapy)." 
] }, { "cell_type": "code", "execution_count": 8, "metadata": { "ExecuteTime": { "end_time": "2020-04-08T18:20:00.536944Z", "start_time": "2020-04-08T18:20:00.528963Z" } }, "outputs": [], "source": [ "ticker_mkt = ['EURUSD'] # You can add more tickers, but it will take much longer!\n", "\n", "file_extension = 'parquet' # parquet or csv or h5\n", "\n", "# Files dumped by DatabasePopulator look like this\n", "# 'EURUSD_dukascopy_2016-01-03_22_00_01.868000+00_002016-01-31_23_59_57.193000+00_00.parquet'\n", "#\n", "# Assume that ALL TIMES ARE IN UTC!\n", "csv_file = [x + '_' + data_vendor + '_*.' + file_extension for x in ticker_mkt]\n", "\n", "# These are all the files with the market tick data\n", "csv_market_data = [os.path.join(csv_folder, x) for x in csv_file]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we read the Parquet files and upload them to our market database (this will take a while!). Note that it will overwrite any tickers already in the database from the same data vendor. By default we use `DatabaseSourceArctic`, but note we could have changed it to `DatabaseSourcePyStore` earlier." ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "ExecuteTime": { "end_time": "2020-04-08T18:21:04.859112Z", "start_time": "2020-04-08T18:20:00.538937Z" } }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "2020-04-08 19:20:00,542; INFO:tcapy.data.databasesource: Attempting to load Arctic/MongoDB library: market_database_test_harness CHUNK_STORE (databasesource.py:1881)\n", "2020-04-08 19:20:00,544; INFO:tcapy.data.databasesource: Got Arctic/MongoDB library: market_database_test_harness CHUNK_STORE (databasesource.py:1901)\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "Uploading EURUSD\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "2020-04-08 19:20:00,547; INFO:arctic.arctic: Dropping collection: market_database_test_harness (arctic.py:326)\n", "2020-04-08 19:20:00,780; INFO:arctic.arctic: Dropping collection: market_database_test_harness.snapshots (arctic.py:330)\n", "2020-04-08 19:20:00,793; INFO:arctic.arctic: Dropping collection: market_database_test_harness.versions (arctic.py:330)\n", "2020-04-08 19:20:00,810; INFO:arctic.arctic: Dropping collection: market_database_test_harness.ARCTIC (arctic.py:330)\n", "2020-04-08 19:20:00,819; INFO:arctic.arctic: Dropping collection: market_database_test_harness.version_nums (arctic.py:330)\n", "2020-04-08 19:20:01,208; INFO:arctic.chunkstore.chunkstore: Trying to enable sharding... (chunkstore.py:46)\n", "2020-04-08 19:20:01,211; WARNING:arctic.chunkstore.chunkstore: Library created, but couldn't enable sharding: no such command: 'enablesharding'. 
This is OK if you're not 'admin' (chunkstore.py:50)\n", "2020-04-08 19:20:01,833; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-01-03_22_00_01.446000+00_002016-01-31_23_59_59.429000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:02,218; INFO:arctic.arctic: Mongo Quota: arctic.market_database_test_harness 0.000 / 80 GB used (arctic.py:625)\n", "2020-04-08 19:20:02,990; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-02-01_00_00_00.728000+00_002016-02-29_23_59_58.926000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:04,481; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-03-01_00_00_00.290000+00_002016-03-31_23_59_51.216000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:05,874; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-04-01_00_00_00.220000+00_002016-04-29_20_59_13.205000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:06,869; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-05-01_21_00_11.829000+00_002016-05-31_23_59_59.744000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:07,815; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-06-01_00_00_00.402000+00_002016-06-30_23_59_59.903000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:08,984; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-07-01_00_00_00.259000+00_002016-07-31_23_59_59.851000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:11,147; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-08-01_00_00_00.108000+00_002016-08-31_23_59_58.926000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:14,639; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-09-01_00_00_00.077000+00_002016-09-30_20_59_59.046000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:18,837; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-10-02_21_00_05.470000+00_002016-10-31_23_59_42.479000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:21,752; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-11-01_00_00_00.049000+00_002016-11-30_23_59_58.644000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:22,924; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2016-12-01_00_00_00.156000+00_002016-12-30_21_59_56.230000+00_00.parquet before tick 
database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:23,936; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-01-01_22_00_20.786000+00_002017-01-31_23_59_53.808000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:25,123; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-02-01_00_00_00.623000+00_002017-02-28_23_59_55.547000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:25,996; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-03-01_00_00_00.241000+00_002017-03-31_20_59_56.707000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:26,989; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-04-02_21_00_34.289000+00_002017-04-30_23_59_03.898000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:27,709; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-05-01_00_00_02.851000+00_002017-05-31_23_59_59.724000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:28,581; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-06-01_00_00_00.132000+00_002017-06-30_20_59_56.643000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:29,462; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-07-02_21_00_14.575000+00_002017-07-31_23_59_59.959000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:30,509; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-08-01_00_00_00.470000+00_002017-08-31_23_59_59.099000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:31,597; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-09-01_00_00_00.191000+00_002017-09-29_20_59_56.618000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:32,741; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-10-01_21_00_06.101000+00_002017-10-31_23_59_12.569000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:33,714; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-11-01_00_00_00.357000+00_002017-11-30_23_59_55.097000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:34,612; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2017-12-01_00_00_00.175000+00_002017-12-29_21_59_42.990000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:35,452; 
INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-01-01_22_00_08.661000+00_002018-01-31_23_59_47.019000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:36,633; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-02-01_00_00_00.061000+00_002018-02-28_23_59_58.929000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:37,825; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-03-01_00_00_00.185000+00_002018-03-30_20_59_55.773000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n" ] }, { "name": "stderr", "output_type": "stream", "text": [ "2020-04-08 19:20:38,912; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-04-01_21_00_18.456000+00_002018-04-30_23_59_59.599000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:39,906; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-05-01_00_00_00.129000+00_002018-05-31_23_59_59.903000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:41,160; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-06-01_00_00_00.434000+00_002018-06-29_20_59_56.236000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:42,257; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-07-01_21_00_16.014000+00_002018-07-31_23_59_57.465000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:43,449; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-08-01_00_00_00.042000+00_002018-08-31_20_59_57.280000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:44,574; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-09-02_21_00_04.803000+00_002018-09-30_23_59_59.225000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:45,341; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-10-01_00_00_00.205000+00_002018-10-31_23_59_59.448000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:46,400; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-11-01_00_00_00.222000+00_002018-11-29_23_59_59.541000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:47,332; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2018-12-02_22_00_01.940000+00_002018-12-31_22_00_03.048000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:48,407; INFO:tcapy.data.databasesource: Parsing 
\\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-01-01_22_02_37.254000+00_002019-01-31_23_59_58.297000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:49,829; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-02-03_22_00_03.618000+00_002019-02-28_23_59_59.156000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:50,915; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-03-03_22_00_03.788000+00_002019-03-31_23_59_55.956000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:51,985; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-04-01_00_00_00.064000+00_002019-04-30_23_59_51.619000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:53,174; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-05-01_00_00_00.053000+00_002019-05-30_23_59_59.532000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:54,221; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-06-02_21_01_02.358000+00_002019-06-30_23_59_59.336000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:55,294; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-07-01_00_00_00.173000+00_002019-07-31_23_59_59.914000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:56,360; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-08-01_00_00_00.134000+00_002019-08-29_23_59_57.067000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:57,532; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-09-01_21_04_09.364000+00_002019-09-30_23_59_59.830000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:58,423; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-10-01_00_00_00.002000+00_002019-10-31_23_59_53.804000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:20:59,451; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-11-03_22_00_04.956000+00_002019-11-28_23_59_46.180000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:21:00,116; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2019-12-01_22_00_03.296000+00_002019-12-31_21_59_57.829000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:21:00,845; INFO:tcapy.data.databasesource: Parsing 
\\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2020-01-01_22_01_12.821000+00_002020-01-30_23_59_57.058000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:21:01,584; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2020-02-02_22_00_14.503000+00_002020-02-27_23_59_58.483000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n", "2020-04-08 19:21:02,268; INFO:tcapy.data.databasesource: Parsing \\\\wsl$\\Ubuntu\\home\\tcapyuser\\csv_dump\\EURUSD_dukascopy_2020-03-01_22_00_01.729000+00_002020-03-31_23_59_57.265000+00_00.parquet before tick database dump for ticker EURUSD-testharness (databasesource.py:1701)\n" ] } ], "source": [ "if upload_tick_data:\n", "    for i in range(0, len(ticker_mkt)):\n", "        ticker = ticker_mkt[i]\n", "        csv_file = csv_market_data[i]\n", "\n", "        print('Uploading ' + ticker)\n", "\n", "        # Very first time, replace the whole table, then append afterwards\n", "        # Otherwise we would overwrite every previous ticker on each pass!\n", "        if i == 0:\n", "            if_exists_table = 'replace'\n", "        else:\n", "            if_exists_table = 'append'\n", "\n", "        database_source_market.convert_csv_to_table(csv_file, ticker, market_data_database_table,\n", "            if_exists_table=if_exists_table, remove_duplicates=False,\n", "            if_exists_ticker='replace', date_format=None,\n", "            read_in_reverse=False)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There is a script `cache_data_vendor_data.py` which illustrates how to directly populate the market tick database from an external download. It has two modes:\n", "\n", "* `ONEOFF` - a one-off download from an external data source (either Dukascopy or NCFX), written directly to the database\n", "* `DAILY_RUN_APPEND` - a regular download from an external data source (either Dukascopy or NCFX), written directly to the database (note that Dukascopy won't have the last few weeks of data); typically you could run this once every day, or potentially at an even higher frequency than that." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Finally, let's read back a month of the EURUSD data we just uploaded, resample it to business days, and print the mid prices, as a quick check that everything made it into the market database." ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "ExecuteTime": { "end_time": "2020-04-08T22:16:19.557815Z", "start_time": "2020-04-08T22:16:18.998604Z" } }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "2020-04-08 23:16:19,004; DEBUG:matplotlib.pyplot: Loaded backend module://ipykernel.pylab.backend_inline version unknown. 
(pyplot.py:225)\n", "2020-04-08 23:16:19,006; INFO:tcapy.data.databasesource: Attempting to load Arctic/MongoDB library: market_database_test_harness CHUNK_STORE (databasesource.py:1881)\n", "2020-04-08 23:16:19,007; INFO:tcapy.data.databasesource: Got Arctic/MongoDB library: market_database_test_harness CHUNK_STORE (databasesource.py:1901)\n", "2020-04-08 23:16:19,316; DEBUG:tcapy.data.databasesource: Extracted Arctic/MongoDB library: market_database_test_harness for ticker EURUSD-testharness between 2017-01-01 00:00:00 - 2017-02-01 00:00:00 from CHUNK_STORE (databasesource.py:1960)\n" ] }, { "name": "stdout", "output_type": "stream", "text": [ "                                mid\n", "Date                               \n", "2016-12-30 00:00:00+00:00  1.052560\n", "2017-01-02 00:00:00+00:00  1.045770\n", "2017-01-03 00:00:00+00:00  1.041765\n", "2017-01-04 00:00:00+00:00  1.049720\n", "2017-01-05 00:00:00+00:00  1.060400\n", "2017-01-06 00:00:00+00:00  1.052960\n", "2017-01-09 00:00:00+00:00  1.058605\n", "2017-01-10 00:00:00+00:00  1.055230\n", "2017-01-11 00:00:00+00:00  1.059430\n", "2017-01-12 00:00:00+00:00  1.061625\n", "2017-01-13 00:00:00+00:00  1.062750\n", "2017-01-16 00:00:00+00:00  1.060245\n", "2017-01-17 00:00:00+00:00  1.069665\n", "2017-01-18 00:00:00+00:00  1.062825\n", "2017-01-19 00:00:00+00:00  1.065935\n", "2017-01-20 00:00:00+00:00  1.071275\n", "2017-01-23 00:00:00+00:00  1.075930\n", "2017-01-24 00:00:00+00:00  1.072835\n", "2017-01-25 00:00:00+00:00  1.075270\n", "2017-01-26 00:00:00+00:00  1.068355\n", "2017-01-27 00:00:00+00:00  1.071720\n", "2017-01-30 00:00:00+00:00  1.070770\n", "2017-01-31 00:00:00+00:00  1.080100\n" ] } ], "source": [ "%matplotlib inline\n", "\n", "import datetime\n", "import pandas as pd\n", "\n", "df = database_source_market.fetch_market_data(start_date='01 Jan 2017', finish_date='01 Feb 2017', ticker='EURUSD',\n", "                                              table_name=market_data_database_table)\n", "\n", "# Take the last tick of each business day\n", "df = pd.DataFrame(df.resample('B').last())\n", "\n", "# Compute a mid price if the data only has bid/ask columns\n", "if 'mid' not in df.columns:\n", "    df['mid'] = (df['bid'] + df['ask']) / 2.0\n", "\n", "df = pd.DataFrame(df['mid'])\n", "\n", "print(df)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Conclusion\n", "\n", "We have examined the data formats tcapy expects for trade/order data, as well as market data.\n", "\n", "We have seen how to configure MySQL and MongoDB to work with trade/order data and market tick data respectively. We've also seen how we can populate a SQL database with trade/order data using tcapy, and dump it back out. On the market data side, we've investigated how we can read from an external source (Dukascopy) and dump the market data to disk, as well as how to populate MongoDB with it, all using tcapy. We also talked about how you can use SQLite and PyStore as flat-file alternatives, if you are likely to be the sole user of tcapy.\n", "\n", "It is important that you properly maintain your databases, for tcapy or indeed for any other purpose. If large periods of time have missing tick data, for example, you won't be able to do TCA over those periods (the sketch below shows one way to spot such gaps). Similarly, having lots of missing trade data in your SQL database could bias your results. Data is key to any sort of TCA!"
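] }, { "cell_type": "markdown", "metadata": {}, "source": [ "A minimal, hedged sketch of such a gap check (assuming a hypothetical `df_ticks` DataFrame of raw ticks with a `DatetimeIndex`, e.g. as returned by `fetch_market_data` before any resampling): count the ticks per business day and flag the empty days:\n", "\n", "    # Hypothetical sketch: spot business days with no ticks at all\n", "    tick_count = df_ticks.resample('B').size()\n", "    gap_days = tick_count[tick_count == 0]\n", "\n", "    if len(gap_days) > 0:\n", "        print('Business days with missing tick data:')\n", "        print(gap_days.index)"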
] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.10" }, "toc": { "base_numbering": 1, "nav_menu": {}, "number_sections": true, "sideBar": true, "skip_h1_title": false, "title_cell": "Table of Contents", "title_sidebar": "Contents", "toc_cell": false, "toc_position": {}, "toc_section_display": true, "toc_window_display": false } }, "nbformat": 4, "nbformat_minor": 4 }