{ "cells": [ { "cell_type": "markdown", "metadata": { "hide_input": true }, "source": [ "## Computer vision data" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [], "source": [ "%matplotlib inline\n", "from fastai.gen_doc.nbdoc import *\n", "from fastai.vision import * " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This module contains the classes that define datasets handling [`Image`](/vision.image.html#Image) objects and their transformations. As usual, we'll start with a quick overview before we get into the detailed API docs.\n", "\n", "Before any work can be done, a dataset needs to be converted into a [`DataBunch`](/basic_data.html#DataBunch) object - in the case of computer vision data, specifically into an [`ImageDataBunch`](/vision.data.html#ImageDataBunch) subclass.\n", "\n", "This is done with the help of the [data block API](/data_block.html) and the [`ImageList`](/vision.data.html#ImageList) class and its subclasses.\n", " \n", "However, there is also a group of shortcut methods provided by [`ImageDataBunch`](/vision.data.html#ImageDataBunch) which reduce the multiple stages of the data block API into a single wrapper method. 
These shortcut methods work well for:\n", "- Imagenet-style datasets ([`ImageDataBunch.from_folder`](/vision.data.html#ImageDataBunch.from_folder))\n", "- A pandas `DataFrame` with a column of filenames and a column of labels, which can be strings for classification, strings separated by a `label_delim` for multi-classification, or floats for a regression problem ([`ImageDataBunch.from_df`](/vision.data.html#ImageDataBunch.from_df))\n", "- A CSV file with the same format as above ([`ImageDataBunch.from_csv`](/vision.data.html#ImageDataBunch.from_csv))\n", "- A list of filenames and a list of targets ([`ImageDataBunch.from_lists`](/vision.data.html#ImageDataBunch.from_lists))\n", "- A list of filenames and a function to get the target from the filename ([`ImageDataBunch.from_name_func`](/vision.data.html#ImageDataBunch.from_name_func))\n", "- A list of filenames and a regex pattern to get the target from the filename ([`ImageDataBunch.from_name_re`](/vision.data.html#ImageDataBunch.from_name_re))\n", "\n", "In the last five factory methods, a random split is performed between train and validation; in the first one, the split can either be random or come from separate training and validation folders.\n", "\n", "If you're just starting out, you may choose to experiment with these shortcut methods, as they are also used in the first lessons of the fastai deep learning course. However, you can completely skip them and build your code with the data block API from the very beginning; internally, these shortcuts use this API anyway.\n", "\n", "The first part of this document is dedicated to the shortcut [`ImageDataBunch`](/vision.data.html#ImageDataBunch) factory methods. Then all the other computer vision-specific methods that are used with the data block API are presented."
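, "\n",
    "\n",
    "Since these shortcuts use the data block API internally, it can help to see what one of them expands to. Roughly, [`ImageDataBunch.from_folder`](/vision.data.html#ImageDataBunch.from_folder) corresponds to the following sketch (assuming `path` points at an Imagenet-style folder and `tfms` is a list of transforms; the exact defaults may differ):\n",
    "\n",
    "```python\n",
    "data = (ImageList.from_folder(path)   # grab all the images under path\n",
    "        .split_by_folder()            # split train/valid from the folder names\n",
    "        .label_from_folder()          # label each image with its parent folder name\n",
    "        .transform(tfms, size=24)     # apply the transforms and resize\n",
    "        .databunch())                 # convert everything to an ImageDataBunch\n",
    "```"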
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Quickly get your data ready for training" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To get you started as easily as possible, fastai provides two helper functions to create a [`DataBunch`](/basic_data.html#DataBunch) object that you can directly use for training a classifier. To demonstrate them, you'll first need to download and untar the file by executing the following cell. This will create a data folder containing an MNIST subset in `data/mnist_sample`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "PosixPath('/home/ubuntu/.fastai/data/mnist_sample')" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "path = untar_data(URLs.MNIST_SAMPLE); path" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There are a number of ways to create an [`ImageDataBunch`](/vision.data.html#ImageDataBunch). One common approach is to use *Imagenet-style folders* (see further down the page for details) with [`ImageDataBunch.from_folder`](/vision.data.html#ImageDataBunch.from_folder):" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "tfms = get_transforms(do_flip=False)\n", "data = ImageDataBunch.from_folder(path, ds_tfms=tfms, size=24)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Here the datasets will be automatically created in the structure of *Imagenet-style folders*. The parameters specified are:\n", "- the transforms to apply to the images in `ds_tfms` (here with `do_flip=False` because we don't want to flip numbers),\n", "- the target `size` of our pictures (here 24).\n", "\n", "As with all [`DataBunch`](/basic_data.html#DataBunch) usage, a `train_dl` and a `valid_dl` are created, both of type PyTorch [`DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader). 
\n", "\n", "If you want to have a look at a few images inside a batch, you can use [`DataBunch.show_batch`](/basic_data.html#DataBunch.show_batch). The `rows` argument is the number of rows and columns to display." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAVIAAAFgCAYAAADpZ/FJAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJzt3WmQVOXZxvH7ALKDsgqI7PuqIEZQBAVFIaISUSER45JK3jKaVL6kUiZVpqKWWYyKKSmXpCKKGsUVIyoiRhYBFwRl33dkUTYRFDzvh7zeXN1v9zAzTy/T3f9flVUXw8z0Caf7ybmfNYrj2AAAlVct3xcAAIWOhhQAAtGQAkAgGlIACERDCgCBaEgBIBANKQAEKrmGNIqig0n/HYui6MF8XxcqL4qiJ6Mo2h5F0f4oilZFUXRzvq8JlVeI9zMq5Qn5URTVM7PPzGxkHMfv5vt6UDlRFPU0szVxHB+Joqibmb1jZqPiOP4wv1eGyijE+1lyT6RJrjKznWY2O98XgsqL43hpHMdHvvvj//3XMY+XhACFeD9LvSG93swmx6X8WF4koih6KIqiQ2a2wsy2m9lreb4kBCi0+1mypX0URW3MbL2ZdYrjeH2+rwfhoiiqbmYDzWyomf0xjuNv8ntFCFFI97OUn0gnmNkcGtHiEcfxsTiO55hZazP7n3xfD8IU0v0s9Yb08XxfBLKihlXxPjVUSJW/nyXZkEZRNMjMTjOz5/J9LQgTRVHzKIqujaKofhRF1aMoGmFm48zs7XxfGyquUO9nSfaRRlH0sJnVjeP4unxfC8JEUdTMzKaaWV/774PBRjObGMfxo3m9MFRKod7PkmxIASCTSrK0B4BMoiEFgEA0pAAQiIYUAALVyOWLRVFU6ZGtKIo8V69ePeXXjx07dsLv0ZysefPmnjt16uS5Tp06nhs3buy5VatWnhs0aOB5586dnv/zn/94/vTTTz1/++23Ka8hjuP0F1jFhNzPUlFI97NBgwZ+P5s2bapf96yfEf16jRqpm5JNmzZ5/vrrrz23bNnSc/fu3T3XqlXLc82aNRN+l/453fel+x79TH7wwQeeFy5c6Hnjxo0p/zeodPeTJ1IACERDCgCBclrah9BSoE2bNp7r16/v+csvv/TcsGFDz1qaawlSrVri/49o2dKzZ0/PdevWTXkdXbt29axl/rx58zzv2bPH84oVKzxrmQNUBfq5uvrqqz137tzZ8znnnOO5Xbt2nrVbbf/+/Z5nzJjh+fPPP/fcvn17zxdddJHnsrreVLrvSzcvfteuXZ4HDhzo+YUXXvD89NNPe/7ss8/KdR3f4YkUAALRkAJAoCpd2usI3Lnnnuv52muv9dyrVy/POirepUsXz1qyaJmfSelmDOjIYXJXAlCV6Cj8r3/9a8/pRuR15omW1Pqev/LKKyv0e7QrLHlmS7169VLm8mjWrJnnJk2aeF6zZo3nadOmeaa0B4AcoyEFgEBVorTXETh9/L/00ks968hejx49PLdu3dqzPr5reZGupMgkHYXfvHmz5xdffNHzkSNHDKiq9H07depUzzrCrp+3U0891fO+ffs8L1261LPOZtHvP3z4cMqsC1gOHTqUcH06q0Z/l36u9HOoMwxOPvlkz9rFVtYsnorgiRQAAtGQAkCgvJX2W
s7rKNoFF1zgWScF9+3b17M+4p900kkpc3noJF2zxNJm7969ns877zzPyet/v6OT7ZcsWeJZyxM20T5O77/uX6Ajx7rYQmdC6H3btm1bti6x5OzYscPzXXfd5blRo0Yps5bL+nlZt26dZ50loz+rJfjRo0c9r1692nNyV5i+T7TN0PeGdu/pbB29Vp1ho5/JdPtflAdPpAAQiIYUAALltLTXETzdpk630dLReV0Tq4/smZI8mq+P/+lG+dI57bTTPPfp08fzWWed5fmjjz7yrCOVxUZnVeieBfpvqiWfzrDQ0j7dpOsvvvjCs5aj2r1iZvb+++97Zm+DE9PSVv8t9XOSbvs6Lc/1va1dOPo50tfSrL8nme6lod07uvhGP2/6/tESXn9269atnnXmQUXxRAoAgWhIASBQTkv72rVre27RooXnM8880/Pw4cM96/Z1SksHHRXX8u3AgQOedZdu/f6DBw8m/F4tK3SLPJ2EnG5yv5YL+np6HSGjglXRsGHDPGt5pfsiaBeObm2Ybt20jtRrWaj/djrqqqO8s2bNSrg+LSVnz55d1v8UlEE/F5q11M4FfT9oN+Htt9/u+ZJLLvGs76vdu3d7njlzpmfd5k+/p6J4IgWAQDSkABAop6X9li1bPGuZrxNntTzX0l4n565du9bzqlWrPGsZrbtxf/LJJ5514rB+v1niKKSWCFq26qillvN6oNb06dM9r1y50vM333xjxeTWW2/1rOuxded0La90hF2zdoXov5GWch06dPCsB7PpjA/tFjBLfM9Q2hcencFhlrgoZ8KECZ51qz7tztH7/9Zbb3l++OGHPetpFiF4IgWAQDSkABAop6W9lu16zrRO/tXH8XSTaF9//XXPDz30kGddK18eyRPtu3Xr5llH93WUWEctly9f7lknf8+fP99zMU8EHzVqlGe9b4sXL/aso+q6G7mW8zoRWkeCdXaGlnna7TJy5EjPOhPELHFBh3YTsOdB1XXKKad41tk8ZmZjx471/MMf/tCzfo71s6rdbU899ZRnPcs+U3giBYBANKQAECinpb1ud9W2bVvP3//+9z3raL6WiwsWLPCsj+w6+ltRyQfh6ewB3Z1fR4N1nbeWCFrOF/M6erVhwwbPWsJPnjzZs46K6kwK7TpJt1BBZ2386Ec/8qwlX8uWLT0nvxd0gr6+98paz43c00Ub+rkbM2ZMwvf17t3bs86eUYsWLfJ89913e9auN33P6CIR7fLRfSHWr19f9v8A44kUAILRkAJAoJyW9p07d/Y8ePBgz0OGDPGs5ZxOon3jjTc8a0kdMsld1+uaJU681wnguvP+xx9/7Fkn+uuIdKn461//6vmFF17wXOEzwWXUVe//uHHjPF911VWedR8Enc2hB6eZmb366quei22fg0KR7iSEXr16eR49erRnLe21+88sfTmvi2xeeuklz9old91113nW2Rx6iJ6+b3XbTl2bnw5PpAAQiIYUAALltLTXXeR1BE7XZutE7blz53rWkfqKTrxPJ7m012vSEX0dbdZRQS3nc72lWFUwadKklF/XyfO6f4FuQaijoqeffrrnjh07ek43Uq+LObSc17LOLPEQQkr73NFyXrtq9DDLiy++2HP//v0964i6/p6y6HvpzjvvrNjFpqHvF+3mS4cnUgAIREMKAIHydvidjpDrWus333zTs05y37hxY0auQUcC9ZA2s8RD63TNrh5ap6P2mbqmYqAjqkOHDvWsa6d19oOWS2effbZnPbxMS7avvvrKs96Pp59+2rPudo6qQSe868GWWtqnG40vL+0C0L0t9AQM7XrThT66zaN2K+j3fPrpp561G1LxRAoAgWhIASAQDSkABMppH6lObdEjIrQPYsqUKZ515VCmphdpv+iAAQMS/k77bXUDg/fee8/zsmXLPOs0nFKkq0V0FZL2NesmNDr9SftLNevmImrr1q2etZ9a3zvJm9BovypyRz87uvJQNw7RsQptC8raK1b/7tlnn
/Ws09x0UyHt59QxD33tK664wnPz5s096/vtscce86wbLCmeSAEgEA0pAATKaWmvpz/q6iSdwqJ7/2VjtZBOu9FTCc0SS0PdU1S7GCq6IUcxu/nmmz1rOa8rm8qzOkWnv+lKGC35dfXT+PHjPbdu3dqzbpxixnSoqmD//v2etRzXMl/fL1qaa07+s25Wk64LR7uS9HOv71XNOl1KT5195513Uv5+xRMpAASiIQWAQDkt7Z944gnPWnbpY75uEJINZW1eoasitm/f7lnLfI6pOK5Ro0aedYWIlvM6oqqnruqxIHrP0600OeOMMzzr5jfDhw/3nLwJjXbd/OMf/0j5esgu/UytXbs2Zc4W3bdWZ5VcdtllnnVWwWuvveZZTyfWrqd0eCIFgEA0pAAQKKel/dKlSz2vW7fOs47AZUO3bt0860mhWjqaJY4Kvv766551hkEp7juaji6e0H9LnfCsI55a2mvXSbpRex2R1/eOTuDW8k3LfzOzJk2aeNaRXT3CRktMum0Kky7i0Ht+5ZVXev7FL37hWWeD6D7HjzzyiOfFixdX6Bp4IgWAQDSkABAop6W9TnjVnG3pynldf2uWuIZ72rRpnnWEmfLvuHvvvdezLmY499xzPb/99tuedQS3PFasWOF51qxZnnVPBF37PGbMmISf1y6Aa665xrOuwT506JDnTB1hU4q0XNbuFl2Eo+vXV69eXeHX0NNmda9avc86i+Pyyy/3rF1yejxNukUCFT2ahidSAAhEQwoAgXJa2ueSTtrW0dzOnTt71on2ZmZbtmzxrMdZ6HphHKfdHAcOHPCsMx4yRbthdPRfXzd5bfZtt93mWU+q3LNnj2e9t5T2ldesWTPPOuH9kksu8ayfLx0V1xk8uk2llvJmiV0GI0aM8Dxw4EDP+rnftm2b54kTJ3p+8cUXPeveHsntQUXwRAoAgWhIASBQ0Zb2OlLfr18/z7r+es2aNQk/o5PEdQ1uRUfwkDt6SoFOtDdLHJG/5557POtJpTozQEdwUTH169f3rHswpDshVk8R1Qn1eupoWfS+a9fAwoULPeuJxPre0K6dTOGJFAAC0ZACQKCiLe11gnjNmjU96xZvOkHYLHFduJb2qLp0kn/yPdOD9/TgNC0fdWI3Kk8XrehiFi3bdaRdF1Xo7A/tbtNRfrPEUXgt4XUhjf6MnmaR7YMQeSIFgEA0pAAQqGhL+/PPP99z+/btPW/YsMGznoluljhJmNL+xH73u9951gMC586d6zkbWyRqWaizM5J3yL/00ks96/Z8OuJbnt3PcWK6sGH+/PmedfvDP/zhD5737t2bmwvLEZ5IASAQDSkABCra0l5LOR21X7lypec5c+Yk/IxutaWjvEhNDxQbNmyYZy2ptbTTQ+f0UDztRtERXF2/rXRRxYABAzzrjvpmZkOGDPGso8fvvvuu54ruhI4T0/tZKgcN8kQKAIFoSAEgUNGW9jparCOKWr7rhF0z1tRXVIMGDTy3a9fOc48ePTzrPgda5unIrq6J14nTuvVd48aNPespBzpqr7u0myUuvpgxY4Zn3SFdux6AyuKJFAAC0ZACQKCI0WkACMMTKQAEoiEFgEA0pAAQiIYUAALRkAJAIBpSAAhEQwoAgWhIASAQDSkABKIhBYBANKQAEIiGFAAC0ZACQCAaUgAIREMKAIFoSAEgEA0pAASiIQWAQDSkABCIhhQAAtGQAkAgGlIACERDCgCBaEgBIBANKQAEoiEFgEA0pAAQiIYUAALRkAJAoJJrSKMoOpj037Eoih7M93WhcrifxaVQ72eNfF9ArsVxXP+7HEVRPTP7zMyey98VIQT3s7gU6v0suSfSJFeZ2U4zm53vC0FGcD+LS8Hcz1JvSK83s8lxHMf5vhBkBPezuBTM/YwK4BqzIoqiNma23sw6xXG8Pt/XgzDcz+JSaPezlJ9IJ5jZnEK4SSgX7mdxKaj7WeoN6eP5vghkDPezuBTU/SzJ0j6KokFmNsPMWsRxf
CDf14Mw3M/iUoj3s1SfSK83sxcK5SbhhLifxaXg7mdJPpECQCaV6hMpAGQMDSkABKIhBYBANKQAECinm5ZEUcTI1gnEcRzl+xrKK1P3s27dup7POussz8OGDfN88cUXe+7UqZPnnTt3er7mmmsSfu+qVas8f/3115m41Aor5vt52WWXeX700Uc979q1y/PUqVM9P/HEE553796tr+v52LFjnqtXr558fSmv49tvv02ZDx06lPLrIdLdT55IASBQTqc/8UR6YsX8BJNOy5YtPT/wwAOemzdv7vno0aOeGzZs6Llv376eH3wwcdvKxx57zPOaNWtS/q5sK+b72aVLF8833XST50GDBnmuUeN40Ttv3jzPmzZt0tf1fOTIEc+1atVKeL2aNWumvA59ij18+LDn7du3e54zZ45nfWKuaPvHEykAZAkNKQAEorSvYoq5FEzn1FNP9ayl/Y4dOzzrwJG+Z0eNGuW5VatWCb/3ueeOb6yugx7r1q3zrGVhNhTz/axW7fhz2CmnnOJ58ODBngcMGOC5Z8+e+lopf+c333zjOXmwqU6dOp617NfuAH0vNW3a1POvfvUrzy+99JLninbzUNoDQJbQkAJAoJI7/A5Vj87xXLFiheclS5Z4Xr58ueeDBw96XrZsmefhw4cn/F4tN3Ue6ieffJLyNfbt2+eZzXxOTOdmfv75555ffvllz6+++qrndu3aea5du7Zn/bcu699dR+3r1/cz8hK6GC655BLPt9xyi+e2bdum/P5M4YkUAALRkAJAIEp75N1XX33leebMmZ43bNjgWZeC6ijt5s2bPWu3gJnZueee6/nyyy/3PHr0aM86iX/27OOn/h44cHxP4UwtLyxFOiti7dq1Gfu9OqLfvn17z7pYY+vWrZ43btzoORv3kydSAAhEQwoAgarchHwdUdNJu/oor1/Xx3T9nnSP7/qz6SYFl0VHmLNRIhTzBO5c0/fS0KFDPd95552e9R5OnjzZs07azsba7Kqoqt9P/bzqPgzjxo3z/POf/9yzzh74zW9+4zlkJzAm5ANAltCQAkCgvI3aa9nVuHFjzz169PDcp08fzzrqqutsdUK2jsbq1/VRXicF61pcs8SyTcsI/frPfvYzzzNmzPCsk8RRNWjZrhsJ6xp+ncR/6623etZt9+bOnetZZwwgt3QSvnbVaDm/evVqz88++6znbG/szRMpAASiIQWAQHkr7Zs0aeL5nnvu8dyvX7+U39OoUSPP2i2gXQG6tlon5mqJp90Cunt3snSl/dixYz2vXLnSs3YloOo56aSTPGuJqOdF6TZtWsIzIT8/9J6ZmfXu3dvzn//8Z8979uzxPGnSJM8LFizI4tUl4okUAALRkAJAoJyW9t27d/d8/fXXe77iiis8a0muh1fpumst+Vu0aOFZ1/Vu2bLFs+66XdYkfC352rRpk/J7dBf2evXqpf1dyD9doNG6dWvPnTt39qzvGT2cTXfR1/cPsivdpHszszvuuMOztgFa5r/yyivZu7gy8EQKAIFoSAEgUE5L+wsvvNDzj3/8Y8862q4l1dNPP+1ZJ0hraaZbpemZ1rrOVrdpK8sZZ5zh+Ze//GW5fgZVV7NmzTzr+0S7bfQ989FHH3k+dOhQlq8O39FyXkv2G264IeH79CC9F154wfOUKVOyeHXlwxMpAASiIQWAQDkt7bUk17JLpSvt9fAznZw/bdo0zzq6qoealffscp0ZQGlf+LTLSEeAtZTUNdj6HmNNfe6cfPLJnnW/DN0ez8xs/vz5nnWkXg8tzBeeSAEgEA0pAATKaWlfnt3pL7roIs86+V3Lri+++MLzwoULM3mJJ1SZXfWRG8n3pn///p7PPPNMz9oFpIfnaXdQtrddK3W6mEVnVNx0002e9+/fn/Azf/vb3zyvWrXKc3m77rKJJ1IACERDCgCBclra63Z05dmNPpcH85X12vm8JpSfboNnZnbeeed57tixY8qf0Qn52mXE1nmZp9tWdunSx
bOeTNCyZUvP999/f8LPz5kzx7Pet6qAJ1IACERDCgCBaEgBIFDepj9l4/tDlWd6FtOfqi5dyWSW2C+qR4poP7f+DP3fmaefF13ZePvtt3seNGiQ54cfftizbkxiZnbgwIFsXGJG8EQKAIFoSAEgUN6mP2Xj+3OhKl4TUkvXDaNTm/bu3Zvy68gM3V/0e9/7nuezzz7b82OPPeb5vvvu86xHDZlV7c8eT6QAEIiGFAAC5bS0130Ddb/H2rVrp/z+888/3/PKlSs9b926NSPXo5uimJkNGTIkI78X+ZFcylerlvo5Yc+ePZ5nzJjhmdNCM0NXMHXo0MHzpZde6nn37t2en3nmGc96b6pyKZ+MJ1IACERDCgCBclraL1q0yLOe2KgTclXfvn09axmeqdK+W7duCX8eOXLkCX+GCflVi5bvvXr1Svg7PV5E6QmhySPDqBz9XLRo0cKzfrb19NZHH33Us+4tWqj7wPJECgCBaEgBIFBOS3sdIdVR+0mTJnnWkbrnn3/esz7+h9ARxfr16yf8nc4eSDdiWEgjiaVAS/vkLqKmTZum/Jldu3Z5/uSTT7JzYSVG9zIYNmyY5/Hjx3t+7733POtIvc7mKdTPF0+kABCIhhQAAuW0tP/yyy89T58+3bNO2lXZWPusW3mNGTMm4e/69OnjmW30CkP16tU9d+/ePeHv9KTKo0ePel66dKlnSvvMGDFihOcbb7zRs259d++993ouhnJe8UQKAIFoSAEgUE5L+3RyuX1ZeXbBT8YpolVXrVq1PA8ePDjh77S037Rpk+fZs2d71pNDUTHt27f3rCP1DRo08DxlyhTPO3bs8FxsWxbyRAoAgWhIASBQlSjtCwmj9vmnk78vvPBCz3Xq1En4voMHD3p+4IEHPE+dOtUzXTUVowtadFu8AQMGeN6yZYvnt956y3Mxb1PIEykABKIhBYBAlPYV9OGHH3rWNdvIHZ2Er+vpk3fE160a582b51lLfpyYdmdpt8rAgQM9t27d2vMbb7zhecWKFZ6LuRuFJ1IACERDCgCBKO0rSEchdYIxckdHjjt27Jjy62aJWy9qN0wxl5jZoKW9dqto1i0y//73v3vW7TKLGU+kABCIhhQAAlHai/JMtt+4caPnw4cPZ/NykIaur+/atavn5FH7DRs2eGakvvK0K0Qn1etWmNrNtW3bttxcWBXCEykABKIhBYBAESOYABCGJ1IACERDCgCBaEgBIBANKQAEoiEFgEA0pAAQiIYUAALRkAJAIBpSAAhEQwoAgWhIASAQDSkABKIhBYBANKQAEIiGFAAC0ZACQCAaUgAIREMKAIFoSAEgEA0pAASiIQWAQDSkABCIhhQAAtGQAkAgGlIACERDCgCBaEgBIBANKQAEoiEFgEAl15BGUXQw6b9jURQ9mO/rQuVFUfRkFEXboyjaH0XRqiiKbs73NaHyCvF+RnEc5/sa8iaKonpm9pmZjYzj+N18Xw8qJ4qinma2Jo7jI1EUdTOzd8xsVBzHH+b3ylAZhXg/S+6JNMlVZrbTzGbn+0JQeXEcL43j+Mh3f/y//zrm8ZIQoBDvZ6k3pNeb2eS4lB/Li0QURQ9FUXTIzFaY2XYzey3Pl4QAhXY/S7a0j6KojZmtN7NOcRyvz/f1IFwURdXNbKCZDTWzP8Zx/E1+rwghCul+lvIT6QQzm0MjWjziOD4Wx/EcM2ttZv+T7+tBmEK6n6XekD6e74tAVtSwKt6nhgqp8vezJBvSKIoGmdlpZvZcvq8FYaIoah5F0bVRFNWPoqh6FEUjzGycmb2d72tDxRXq/SzJPtIoih42s7pxHF+X72tBmCiKmpnZVDPra/99MNhoZhPjOH40rxeGSinU+1mSDSkAZFJJlvYAkEk0pAAQiIYUAALRkAJAoBq5fLEoihjZOoE4jqN8X0N5cT9PjPtZXNLdT55IASAQDSkABMppaQ9kWo0ax9/CjRs3Tvi7Hj16eB42bJjniy++2PO8efM8L1iww
POsWbM879y50zPzro+rXbu259NPP91z165dPdevX99ztWrHn9sOHTrkWf9Nv/nm+L4k27ZtS3i9Y8eOed61a5fnL774wvPXX3+d8vdmG0+kABCIhhQAAuV0iSijgifGKG9qWjoOHz7c89ChQz23atUq4Wc6dOjg+ZRTTvHcoEEDz0eOHPGsZeGSJUs833nnnZ5nzpxZoesutvtZp04dz926dfM8fvx4z2PHjvXcsmVLz9oNs3v3bs9asn/55Zee58+fn/Daen8WL17secWKFZ6XL1/u+bPPPkv5syEYtQeALKEhBYBAlPZVTLGVgiEGDx7s+aqrrvJ84YUXem7Tpo3nr776KuHn165d63nOnDmedfR40KBBnnv37u1ZR6SXLVvm+S9/+YvnJ5988oT/G4rtfnbp0sXzT3/6U8+33Xab56NHj3o+fPiw55NOOsmzlvlRVL5/Iv0+zToD4P777/c8ZcoUz/pe0K6EiqK0B4AsoSEFgEBMyEeV0rp1a8833HCD51GjRnnWUu5f//qX548++ijhdy1atMjz5s2bPWtpd+qpp3q+6aabPE+YMMFz+/btPevE/vKU9sVGR+0bNWrkef/+/Z7XrFnj+bnnjp/mc9ZZZ3nWEf+6deuW67W1S0ZfWyf933LLLZ5r1arl+ZFHHvG8bt26cr1eRfBECgCBaEgBIBClPaqUpk2bem7evLnnLVu2eH7++ec9P/PMM5537NiR8Lu0CyAdXUf/73//27OW8B07Hj8JuF27dif8ncVMJ7/ffffdnvXfbtOmTZ43bNjg+amnnvKsZbeW7GWpXr265/79+3u+6667POvCjREjRnhevXq1Z0p7AKiCaEgBIBClPaoULQt1xFcndk+fPt2zjhZXhk4S13Iz3Rpx7W7Q9ft79+4Nuo5CoXsTrF+/3rNua6f3KpPb2mkXgE761/X5+j26p4LmbOCJFAAC0ZACQCBKe1Qpn3/+uefHH38866/XokULz6NHj/asE761JH377bc916xZM8tXV7XpwoZ9+/Zl/Pdrt4uZ2WmnneZZt+rTe6UOHjyYMmcDT6QAEIiGFAACFUxpryOqvXr18tynTx/POpKnE3D1EC0ddW3SpEnCazRs2NCzjv7p4Vq6pdrWrVs9cyha4dC13f369fOs6/n1fup9njFjhufQGQP4/+rVq+e5Z8+eCX935ZVXeh4zZoxn/Rxv377ds26dqDvqZwNPpAAQiIYUAAIVTGmv23fpTuY33nijZ538O3v2bM+rVq3yrGul27Ztm/AaWvbrJGydJK4Te1955RXPeh438kPXYpslvmd08vwZZ5zh+frrr/fcuHHjlL9XR3z1PaP7AtDNUzEnn3yyZ/031c/2BRdckPAzP/jBDzzr51B3y9etE/Xz+fHHHwdecdl4IgWAQDSkABCoYEp7Hc3TskBLAS2punfv7lk1rcibAAAG90lEQVRH83X0P3nCr/68lom6HZuuwZ41a5ZnHdmntMsuLeW0q6ZTp04J36dn3utk++TR4O/ofdPX0Ing2pWkk/knT57seeXKlWVdPixxa0I9mUAPOUyeVVOeQ/J02z491/7bb7+tzGWWG0+kABCIhhQAAhVMad+qVSvPOrqqJb+WVFpq63nnuuVWcgmmh3bpeel6vrqO/jZr1syzrhFH5ml3znXXXef5iiuu8Jw8CyPdAWnpSvh057Hr4gxdDKIH9ekosp7xjtS0W03/7fQ+6ZZ9Zon3Svc50K/r/TnzzDM960kIepJCprrheCIFgEA0pAAQqGBKey3TtLzWrbyWLFnieenSpZ53797tWUt7nahvlljaDxgwwLOODHft2tWzlg6M1GaeLorQrhYd5dXz0XVGRjIdtT1w4IDnKVOmeNZD9fR7dI33T37yE886qqxf/+1vf5v2OvBfWl6/+eabnnWkPXmUXsv+CRMmeNb7PnDgQM/6uW3fvr3niRMnet6zZ09FLz0lnkgBIBANKQAEoiEFgEAF00dau3Ztz
zpdQqepaB/pSy+95HnFihUVfj1dzaT9sNrXpn14yDzddERXF2l/V1n9ojqdSfveJk2a5HnatGme161b51k3odFpODoNT/vpdDqO9qMjNd0j+NVXX/U8c+bMtD+jqw3/+c9/eh45cqRn3eikS5cunsePH+9Z79Uf//hHzyEnwfJECgCBaEgBIFDB1KY61aRDhw6e9XFcpzLo9IrK0FJSV1LpUQY6XQqZp90oOm1Np7Np+a7fY2a2ZcsWz3PnzvWsZaGW/NqFk+73LF++3LOW/NrFoNOzkJp2nehnuLzltX4O9bOuUxp1A5RzzjnHs25m895773nWLoaKbnLCEykABKIhBYBABVPa6zEi+tit5ZWWeboypbx0JYWumNGVVAsWLPCc7eMLSp3ec+1G+dOf/uRZTwTdvHlzws/r6rbKzNz4js7g0M0v9u3b51k3sNHZBsgO7YbRPUj1Xml3i3YN6l7F48aN8zxv3jzPuglRecp8nkgBIBANKQAEKpjSXk9p1JJayz+dnJ9uBLYsWibqXpO6WYKOKurrIfO0q0ZH6l9++eWU36/7zqb6c2XpfdYTRdOV9slH2CB3tOvlww8/9Dxo0KCUuX///p4bNmzoWfczLg+eSAEgEA0pAAQqmNJeR22nT5/uef369Z7Xrl0b9Bo9evTwrKW9lohaOiC79BgIncCtk+hzTbuMdIRYZ3xomY/UdO8M/bfTbpTQY0C06yXdRH997XTXVB48kQJAIBpSAAhUMKW9rnfWPGPGjIy9hk6k1sm8Ws7rGl/kjp4iqllLsOTyTUu7TNF9HnQPBp20rV9HarrFnY6Wa/dc8n4ZmTrxU+nWfE2bNvVc0X00eCIFgEA0pAAQqGBK+1xIV9rrKHE+R4xLjZbw9913n2ctC7UU1O3xzMzeffddz+VZoKHdBFpudu7c2XPv3r1Tfo9u4bdo0SLP/fr1O+HrlqLRo0d71u3u3nrrLc+///3vE35GZ89oV0q6tfB6PzVrF4Hmim6dp3giBYBANKQAEIjSXnTq1MmzlpW6zl8P7ULm6cFkbdq08TxgwADPbdu29Vyt2vFnAR11NUtc866lvX5d73OjRo1Svt7555+f8uu6N0O6fQGQmnZ/6Nr3a6+91rN2qZiZffDBBynznDlzPOt91i4gPTBRt97Uz7Mefqj3szx4IgWAQDSkABCI0l40b97cs6671a3TdM03Mk9nTrRs2dKzjpDrtoZaso0dOzbhd2kXgB6MqL93yJAhnrW017JSX0+7ErRE3Lhxo+eFCxcayvb+++97fuONNzzr/dR7Y5Y4A+Kaa67xrJPndRRe98vQ37t69WrPEydO9Lxr1y7PHH4HADlGQwoAgSjthZZtugZXS7jK7LyP8tPSTP/dtXtFt6+rV6+e52HDhiX8Lt0WUQ9D1FJdy78aNY5/HHRBhr4vtOTTA9KmTJniee7cuYay6cwGPU9eR8uvvvrqhJ8577zzPGv3jM7uUDoDRH+vdsPo6H9Itx1PpAAQiIYUAAJR2ot0pb3u2h2yHhcnpmX74sWLPd9xxx2ehw4d6llHdrt165bwu7SE17JND0zUQ86020ZH+XXStp7I8M4773hesGCBZx39RWr6OdJ/0+eff95z8mkUun3mwIEDPWuZr3th6O9dtmyZ55kzZ6b8/hA8kQJAIBpSAAgUZWPXaQAoJTyRAkAgGlIACERDCgCBaEgBIBANKQAEoiEFgEA0pAAQiIYUAALRkAJAIBpSAAhEQwoAgWhIASAQDSkABKIhBYBANKQAEIiGFAAC0ZACQCAaUgAIREMKAIFoSAEgEA0pAASiIQWAQDSkABDofwHIWK/BesWovwAAAABJRU5ErkJggg==\n", "text/plain": [ "
" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "data.show_batch(rows=3, figsize=(5,5))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The second way to define the data for a classifier requires a structure like this:\n", "```\n", "path\\\n", " train\\\n", " test\\\n", " labels.csv\n", "```\n", "where the `labels.csv` file defines the label(s) of each image in the training set. This is the format you will need to use when each image can have multiple labels. It also works with single labels:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
<div>\n",
"<table border=\"1\" class=\"dataframe\">\n",
"  <thead>\n",
"    <tr style=\"text-align: right;\">\n",
"      <th></th>\n",
"      <th>name</th>\n",
"      <th>label</th>\n",
"    </tr>\n",
"  </thead>\n",
"  <tbody>\n",
"    <tr>\n",
"      <th>0</th>\n",
"      <td>train/3/7463.png</td>\n",
"      <td>0</td>\n",
"    </tr>\n",
"    <tr>\n",
"      <th>1</th>\n",
"      <td>train/3/21102.png</td>\n",
"      <td>0</td>\n",
"    </tr>\n",
"    <tr>\n",
"      <th>2</th>\n",
"      <td>train/3/31559.png</td>\n",
"      <td>0</td>\n",
"    </tr>\n",
"    <tr>\n",
"      <th>3</th>\n",
"      <td>train/3/46882.png</td>\n",
"      <td>0</td>\n",
"    </tr>\n",
"    <tr>\n",
"      <th>4</th>\n",
"      <td>train/3/26209.png</td>\n",
"      <td>0</td>\n",
"    </tr>\n",
"  </tbody>\n",
"</table>\n",
"</div>
" ], "text/plain": [ " name label\n", "0 train/3/7463.png 0\n", "1 train/3/21102.png 0\n", "2 train/3/31559.png 0\n", "3 train/3/46882.png 0\n", "4 train/3/26209.png 0" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "pd.read_csv(path/'labels.csv').head()" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can then use [`ImageDataBunch.from_csv`](/vision.data.html#ImageDataBunch.from_csv):" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "data = ImageDataBunch.from_csv(path, ds_tfms=tfms, size=28)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAVIAAAFgCAYAAADpZ/FJAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJzt3VmQVdX1x/HvEVSQQcURBxAEHEBAnEElDohTnNAkGpOYBJPKHCtDJRWTh39MpVJJTCrDQypxJHFIHAiC8wgqgmhERBFQxAEFBxwAFdD7f8C1zjr0pbvpfbtv33t/nxd2rW66T/e5d/feZ+29dlYqlRARkbbbotoXICJS69SRiogkUkcqIpJIHamISCJ1pCIiidSRiogkUkcqIpKoITvSLMv6ZFl2c5Zlq7MsW5pl2XnVviZpO93P+pJl2bezLJuTZdmHWZZdWe3raY2u1b6AKvkrsBbYBRgJTMuybG6pVJpf3cuSNtL9rC/LgEuA8UD3Kl9Lq2SNtrMpy7IewEpgWKlUWvhJbBLwSqlU+klVL042m+5n/cqy7BJgj1KpdEG1r6UljTi1HwJ8ZG+6T8wFhlbpeiSN7qdUXSN2pD2BdzaKvQP0qsK1SDrdT6m6RuxIVwG9N4r1Bt6rwrVIOt1PqbpG7EgXAl2zLBscYiMAJSZqk+6nVF3DdaSlUmk1cBPwf1mW9ciybAxwOjCpulcmbaH7WX+yLOuaZVk3oAvQJcuyblmWdeoVRg3XkX7im2xYVrECuBb4hpbK1DTdz/pyMfA+8BPg/E/aF1f1ilrQcMufREQqrVFHpCIiFaOOVEQkkTpSEZFE6khFRBJ16JKCLMuU2WpBqVTKqn0NraX72TLdz/qyqfupEamISCJ1pCIiidSRiogkUkcqIpKoU+9fFQHo0qWLt7feemsAunfPC6dbe926dR5bsWKFt7V7r2NtsUU+PrN79/HHH3ss3g/73K5d867I7uNHH33UrtdZSRqRiogkUkcqIpKo7qf23bp1A4pTwd69N9QB3mGHHTzWq1deUH3PPfcEoE+fPh576aWXvP3BBx8A8PLLL3vsueeeA+D999/3mKaUlbHtttt6e6eddgKK92v48OEADBo0yGNr16719vTp0wF48803Pfbiiy96+513NhTYj9PPRrfNNtsA+e8bYOeddwaKr3F7fwG8/fbbQP74BWD33XcHYOXKlR5777285rZ9rn1tgP79+wNw9913e8zuEcCOO+7YJDZgwIAmnxuvfc6
cOU2uvZI0IhURSdTpR6TxwfWWW24JFP8K2l/Ovfbay2MjR470ds+ePQv/Auyyyy4A7Lvvvh6LCY2+ffsCxZHQs88+620b7cS/rEuXLgXgsssu89jChfl5bO31l7ARDB6cF78/5ZRTANh///091qNHDyAfyUDxftr9jqOiadOmeXvBggVAcYSzfv36ilx7rRo1ahQAEyZM8JjN4N59912PxSTRq6++CuTvSYBjjjkGgLfeestjH374obfHjRvX5Hvb++rrX/+6x5YvX+5tmxHG2Pbbb+9tu8/xNfL73/8egBkzZngs3u9UGpGKiCRSRyoikqhTTe3twfN2223nMZuWQf7gep999vHYHnvsAcDAgQM9NmTIEG/btC9O7S1REaclLTn88MObxOK6xVWrVgH5tANgypQp3n7wwQdb/b2kaNddd/W23dsDDzzQY3ZvY3IwTh+PP/54oDiVi4+H7N5YQgLy5GKjJgzt93zuued6LE6fWyvLNtT42Jzfo32f2A+09vts6nvZ47qf/vSnHrv55ptb/fVbohGpiEgidaQiIomqNrW3dZ1xOmaZ95NPPtljlj0E2G233QDYe++9PRanaK3V0nTD1hvGbYZxzamtebNVBJBPRw444ACPPfnkk97W1L7t4j22e3PllVd6zKbhw4YN85itNYT88VDM/n/rW9/y9lFHHQXA5Zdf7rFbbrkFKK4VbiT2+4vvz1p+zGHrTK0PqTSNSEVEEnXoiDSOLA455BAAPve5z3ns0EMPBYprQm0XUhSTBjZiiLG4vvOZZ54BYP78/JhzW+/2xhtveCyuc7MdGrFoQnxI/Z3vfKfJNdlf6/hX2xJQkK+tW7NmTZP/K82bOXOmt21HUkwmWeypp57yWFy3a8lFW9MIcN5553nbRiljx4712NNPPw3AK6+84rFaHpFtLlsrvWzZMo9Z0m9TiZ0Y76zmzZvn7bjWOLVAikakIiKJ1JGKiCTq0Kl9XN9pW8POOOMMj1nCJhaciNNvG37ffvvtHps9ezaQT8ehOK2zYiK2hTN+/XJ1ESGfhp955pkes8IYm2LX+fzzz3ssPk6opdqKnU0sMBLbxqaU8b7HaaatM33sscc8FrcHf+YznwHg1FNP9ZgVzHjooYc81kjbRu+8806gmLSzR1rlkn9QfN+a+L4q93mWhI2PaiyJu9VWW3msXDsmjmLBGnuUU+5RTEwQV/JRhEakIiKJOnRE2q9fP29bgYlyuyXiyOHqq6/2tiUA4ujSkkxxR1FbxL+ylvSKBRViIZRyHnjgAaC4W+KFF17wdtwFJZVlI4/Vq1d7LJbZs91tcTnbrFmzvH3++ecDxVmDlWCL5d1i4qXe2Xssvp6XLFkCFN+zNuOD/D0Ydwza6DGOKGOSx75PfH/YKDZ+XmzbvT3yyCM9FndBWbGhcmUR43KucqPlttKIVEQkkTpSEZFEHTq1j8mXW2+9FShOqW06ZVN4KE7zbWoR12e2hQ3p4xQlTt3PPvtsIJ/iQ7E2qbnxxhu9PWnSJADuv/9+j6nievuJiQJ7DcU6mFakAvL7OXToUI/Fx0x2n+LjIfv68Ws2EntMEmvq2qORWAHf1mRDnuxr79e9rTMfM2aMx2Liyb5/TDZZOz4CULJJRKQTUUcqIpKoQ6f2MdtuhSZi6X9bqxm3hcZMbKXWYlrWL04NfvzjH3vb6p22VA8xrrEbMWIEUMw+PvHEE96OGWNpO8vexiyw1aI99thjPRa3g9rHrXbtxmwbaCxQcscddwDFqWsjitutY7uaytUYjq+Hcmyd+ebUIN4cGpGKiCTq0BFpfAht7fgXPxaISBETBDYKiRXVTzrpJKC4c8n+ym2OeLiWjV5POOEEj8UdWHfddRdQTJ5J68SkgCUIP//5z3vMEoWxvKLNECCfJcSv88gjj3jbjmuOZQ8tyaIiM52PHZce33/lxBH05MmTgeIMuJJruzUiFRFJpI5URCRR1Q+/q9Sas/gQOa4h/MIXvgAUD6+
zKUF8WN2WWpO27jW249eJ61Rtq2Es8mCPNWLBBmkq3ltbzxsLjFjSMG4jfO2117xthW3imsi4RdTicQtoZ0msyAaHHXaYt0877TSgeAhmOfEe2nbtWFSokutdNSIVEUlU9RFppcTyWHEphO2CiMc1xyOe28Kq6cfSfTYitbOooHhGkCWjFi9e7LF77rkHgGeffTbpeupduRKHdjQ35DttYnm25cuXe9sSfI8++qjH4og0daectI+YHIzFY/bbbz+geI5a/Fx7vcREoc1Q2it5qBGpiEgidaQiIonqZmofC05YcROA66+/HoBFixZ5zKbhcfoX26+//nrhXygmiWwqGNeeXnLJJQAcffTRHos7n2w9a3xobrUcNbVvXpyyW0Loj3/8o8e+973vAcVCJPFxgE374gGJWh/a+cUdjjGxa4/mNlVP1BK+MYlrOynbi0akIiKJ1JGKiCSqm6l9XL8Zp/lTp04t/FsJNlWMUwvbXhi3rcWsoolZffv/lTzyoB7Fe2trA23FA+QH2cVCJfE+2JbCWKTmhhtu8PZtt90G5KsxQEfDVJO9v+KjmgMOOMDbthV4U2u/rS5qPCgxrh9tD3oHi4gkqpsRaTXE9ar2VzBWDy8njnpsTWnckSPNs8RTPHTthz/8IQDHH3+8x2JRE9vVNmzYMI/FEacloeI6Uyug05Ydb5LGZm1HHXWUx2w3U2tYMvimm27yWExAtweNSEVEEqkjFRFJpKl9K8UtaDb1iOdqW03MuHY0TgvtdIBYf9WmpzokL439nv/3v/95LD5ieeONNwAYPXq0x8aNG+dtO+TQalYCXHrppUDxVIf169dX8rJlEyxJG4sPxXWk5cTTM2zrdjyI0t6/7fWoRiNSEZFE6khFRBJpat9Kcao4YMAAAM455xyP2RrGuG001kO0mpgxM2wVq1SPNI1N1+KW3ltuucXbDz30EAAHH3ywx7785S97e/z48UDxUY2tqLj22ms9Fg8w1OOY9mOPyeLxQC2thonvIVtjHLcEt/fqC41IRUQSaUTajLhONB7l+7Of/QyAsWPHeqxcjdO5c+d62x58P/744x6r1PHS0jybGcRkVLxfVrPWRkKQj1jjPdSR2u0njjhtZhDfXy2Ja7FnzpwJdGxyUCNSEZFE6khFRBJpav+JODXo378/AKNGjfJYPK/+iCOOAIprRk3cAnr33Xd7285Oj3VPtf2wY9h20JiMuvfee71t60iHDx/usT59+gDFbYr33Xdfu15nI7P3HJQ/tqel5F63bt28bdtBNbUXEakhDTkitb9e8bhkK7UG+SjkU5/6lMcOOuggb9v/i7udbKmFVcqH4k4Zq9CtUWjnYEVJAObPnw8Uy65ZctEO24Pi/dZ9rKxdd93V27169QKKo9DN+X3baRYduaxQI1IRkUTqSEVEEtXk1D6eYW+FDezc+I29/PLLQH4WNsCIESOA4rrBWD3dpvnxYXe5+pULFizw2HXXXQcUK7fbWdqgqWBHiCcNxLYlHWIsriO1RznlTjSw14+0L5vOQ3H9dnPi1H/KlCnejocldhSNSEVEEqkjFRFJVFNTe1vrGddv2vESceoeC4esXLkSgCFDhnhs0KBBAOy8884es3WDkE8t3n33XY8tWrTI23bQ3axZszxmB6jFdaI6QK3ybCth3FJo2fS4bjCuwrB7P3DgQI/ts88+3raiJXEt8RNPPFH4F/R4pj3F321La0Zta3VceXHrrbd6244a6UgakYqIJOr0I9K4ds9GGRMmTPDYBRdcABRHpG0Rj2udM2cOAA8//HCTGOSjU1uvJmliJfTevXsDm04cde264SUbE0NWwtCO6YViotBmKLHwTDzq12Yetp4U4KKLLgLycnqgEWlHaen3bOtDrTwiFHedxd2FHUUjUhGRROpIRUQSdfqpfUwA9OzZEygWkmjpUCxbUxarZVvxiqefftpj5abxtq0T8qQV5GfYS+uUKwgDcPTRRwPFNZ2WRNp22209tv/
++3vbthLGtcQ2jY8Jw5hwtKmiHYoG+cGDkCcKJ02a5LGFCxcCShh2lHiaxLJly4Dilt04Xb/qqquAYiIwfrwaj2A0IhURSdTpR6Txr4tVKL/hhhs8Zn/JTjrpJI/FXRKWGIqV6efNmwfAjBkzPBaTCnH0KeniUqXBgwd7+5e//CVQTBzZSPODDz7wWDxJwJKPcYmLHXEdR5yWtIJ8VHnNNdd47I477mjy/7V0rXri6QOW9IuzjniM+bRp04BiWcRqzxI1IhURSaSOVEQkUaa1cSIiaTQiFRFJpI5URCSROlIRkUTqSEVEEqkjFRFJpI5URCSROlIRkUTqSEVEEqkjFRFJpI5URCSROlIRkUTqSEVEEqkjFRFJpI5URCSROlIRkUTqSEVEEqkjFRFJpI5URCSROlIRkUTqSEVEEqkjFRFJpI5URCSROlIRkUTqSEVEEqkjFRFJpI5URCSROlIRkUTqSEVEEqkjFRFJ1JAdaZZlfbIsuznLstVZli3Nsuy8al+TtJ3uZ32pxfvZtdoXUCV/BdYCuwAjgWlZls0tlUrzq3tZ0ka6n/Wl5u5nViqVqn0NHSrLsh7ASmBYqVRa+ElsEvBKqVT6SVUvTjab7md9qdX72YhT+yHAR3aTPjEXGFql65E0up/1pSbvZyN2pD2BdzaKvQP0qsK1SDrdz/pSk/ezETvSVUDvjWK9gfeqcC2STvezvtTk/WzEjnQh0DXLssEhNgLotA+ypVm6n/WlJu9nwyWbALIsuw4oARPZkBW8FRjdmbOCsmm6n/WlFu9nI45IAb4JdAdWANcC3+jMN0lapPtZX2rufjbkiFREpJIadUQqIlIx6khFRBKpIxURSaSOVEQkUYcWLcmyrEMyW926dfP2QQcdBMCkSZM8du+99zb5vAMPPLDJ/3/mmWc89vDDD3t75syZAMybN89jb7zxRkWuvVQqZRX5Qh2go+5na2VZ/qvrLElU3c/6sqn7qRGpiEiiDl3+1FF/8eLIZKeddgJgwoQJHnv55ZeB4ojz+OOP9/axxx4LwPDhwz227bbbenvx4sUA3HTTTR77wx/+UJFr1whm82299dYA7Lnnnh47/PDDvb333ns3+firr77q7alTpwIwa9asil+b7md90YhURKSd1GVh5zjKfvvttwG46667PLZmzRoAXn/9dY/dc8893l60aBEAQ4YM8djBBx/s7QMOOACAU045xWNvvfUWAFdddVX6DyBssUX+N3677bbz9iGHHALAcccd5zEbccbP23HHHb1ts4ntt9/eY6tWrfL2+++/D8Bjjz3msfXr16f9ANJQNCIVEUmkjlREJFFdTu2jtWvXAnmCaFNsOh/b8+fndRKef/55b69YsQKAMWPGeOyMM84AYO7cuR576qmnvK2pYuv06dMHgMMOO8xjp512mrf3339/APr16+cxSy7aY5yN2++++y4Au+66q8d69uzpbftacTlcnPqLtEQjUhGRRHU/Ik3x2muvlW2/9NJLAGy11VYe+/znPw8UkyBxFKwRzqbF0eG4ceMAmDhxoscOPfRQb1uicMGCBR6zWUDcNPHkk09622YD06dPL/s9lyxZAugeSdtpRCoikkgdqYhIIk3t2+DDDz8E4J138sMOu3bd8KvcYYcdmsSkeXH95+mnnw7AUUcd5bE333zT21Yz4Xe/+12Tj8e1p3379vX2t7/9bQB6987PVIuJwNiWlsWk3B577OHtfffdt0ks3gdjCeDYXrduXbOx+Nhl+fLlQL5DMcbi/7f1wR1BI1IRkUTqSEVEEmnu2Qa25bBcEYyYOY5TGNm0OG2zNZ9xu+a//vUvb19zzTWFz4N8S7Bt3QX46le/6u2TTz4ZKE4F41bexx9/PO0HaDCjR4/29tixY71tU/v+/ft7zNb9brPNNh6LW7ibK5rUpUsXb8ep+//+9z8AZsyY4bH4erFHPeVWzXz88ceb/H4pNCIVEUmkEWkr7bzzzt62kU8samLrGufMmeM
xS0pJ89577z1v//a3vwWgV69eHosl7yzBN2DAAI9ZybxYRCbuOrOk37XXXuuxyZMnl/360rI4Io3rfa0ozAcffOAxu19Lly4t+7W23HLLwr+Ql0WMycGYkLSSl5/61Kc8Fmc1ts77hhtu8JglKV955ZVN/2AJNCIVEUmkjlREJJGm9s3o3r27t2Mi45hjjgGK0/0pU6YAeUETaL8H2/Xmo48+8vZzzz0H5NM7KBYbOfHEE4FiksMKxsQ1i3Ht6bRp0wD4y1/+4rFYi1Y2T5xmx/eITZuXLVvmMUv43H777R6L7xGb0sc11/aIIBaribVkLakV73c8wcLWE8fXiH3/eLZaJR+9aUQqIpJII9JmjBgxwttnnXWWt+1kUjuNFPJlORqFVkbcIWYjToDvfe97AOy1114eszJ6Cxcu9Njf/va3Ju2400X3qTLicj9L4D344IMes2STFZuB4u/eRo9xV5rNRmJpy1hkxnZOxYTi0Ucf7W0bqcYdWFa4pr3OqNOIVEQkkTpSEZFEDTO1j8kLewgdiyvEB8+2rjHWwYxrRm1Hxb///W+PaS1iZcXkwdChQ70dd5NtLD4OGDlypLf32WcfQDuYKuXPf/6zt3v06OHtlStXAsVEX1t299mOprgeNR6NbgmqOHWP7297L8fEkj1iaK+TKjQiFRFJpI5URCRR3U/tLdsXp3pnnnkmAAMHDvRYuW1t8ePxMLWpU6cC8MADD7TDFQsUp4dxdYSdVz9s2DCP2VbCWBgjbmO0qd5//vMfj8WtvHZvlclvnU1t92wLWx+6yy67eMweudkjGSjeb9seHD8ep/l2aOWjjz7qMXtPq2iJiEgnlbXXuqqy3yzLOu6bfcLWG1500UUe+9rXvgYUH1DH3TX2O4mj0Fh27ec//zlQHMVWSqlUyir+RdtJe97PeG9sFAr5KMQKlUA+gokJiVjUxGYl8USD7373u96+++67gfY5/K7R76et8Y27AGNS0Mrs7bfffh4bNGgQUJxV7Lbbbt62XVBvvfWWx2IZPVvP+o9//CP9B9jIpu6nRqQiIonUkYqIJKr7ZJNN2eM0vdzjjHLVtOOU8sADD/T2qaeeChTrHUplxXW9sYaktWMCymy11VbejrUq7fC78ePHe+wXv/iFt23948yZMz2m0w1aJyZ5LHFUbhof12zH7b22pvvggw9u8jXtsQAU13+uXr0agHvuucdjVjQI4KGHHmrDT5JGI1IRkUR1PyK1s32mT5/uMRvtxDNhyu3GsOIkUByRHnvssQDMnj3bYy+++GIlL1vaII4iZ82a5W2rth9LscXlcDbDiGc6WTk/aSrO1GJSzxJHhxxyiMesmEhc3hQLkDR3ZHkchcadg5deeilQLJhiVfGhmFTsKBqRiogkUkcqIpKo7qf2NsyPD6Zjuzmvvfaat+PDcnswHuuVamrfucTpne1Ai5X242MbO/EgJrA0td+0uL7T1mRDfihdXJNtdUbjY7SWWJIpfp24I2n33XcHipX649R+3bp1rf5elaIRqYhIInWkIiKJ6n5qn8LOqgd48sknvX3++ecDxcO1brnllo67MNksVpcyHsBm23whL3YS16HKpvXr18/bffr08bZNyWO23Wr7xrWltg4U8mx8PAbGatHGx2lxGm+rZmJt2ni8iWXzn3/++bLfsz1oRCoikkgj0mbEZFM8WM2KJcQye9L5lTv2GfLTD2y9qTQvFgjZaaedvL18+XIgH4UCzJs3Dyiu4Y3JIPs/cSebjV7jqRTx3tiJCfG45jh6tTWrsYyeXUdcjxpfD6k0IhURSaSOVEQkkab2zYj1KZcsWeLtZ599FigWX5DOL56dHteZ2gFuzW1XlNwzzzzj7ZhYssMFY51ee6/E+rJxar9ixQqg+F6zxwUxqRUfIRx33HFA8VDE+Bhg4sSJQHGr6rXXXgvAzTff7LGY4EqlEamISCJ1pCIiiTrlXCbWISx3HEicTnTUUSmvv/66ty3jGytCSecXX0v
xoDzbftgeR8fUo1jbN2bGK8Xea/E9F/sEqzcaVwLErapnn302AKNGjfKYPTqwY2Wg+Igh9bx7jUhFRBJ1yhFp3GESj2G1mqFxfWclHxg3Jz7stoO6YvLC2o1+pG+sO2lFQuJf/rhbrKN/V3EXTlwDbK8nS3w0Kku6QV5kJL6/qlEMxMSZp40u47XFo9Ht3n7605/2WN++fYHirCS+f1NpRCoikkgdqYhIok45tY9HEVitSIDevXsDxXqiTzzxBFB8AF4pcaoTCyQMHjwYKB7KZlOhRp/ajxkzxttHHnkkUNyWZ+sKO5JtOTz88MM9ZoUxAB555BGgeNxMI4pny3fv3h0obqVdtmxZh19TOTbNjwkiK0wDeYGSGLMt3nH9cGqCKdKIVEQkUacckcZkU9zdYAdpxR0oVl4rlsyKh5hZgZG22G233by97777ettGOLF4Q1ye0cg++9nPetsORrv66qs9VslCEc3ZcsstvX3ooYcCxSOa42hEI9INzjvvPG9bFfpJkyZ5rLOMSMuxg/cgL2ASq+bPmTMHyA/DrDSNSEVEEqkjFRFJ1Cmn9rE2YZw+W7Lgwgsv9Ng555wDwG233eax6667ztu2C6Ithg8f7u2YRLH1ay+88EKbv3a9soQg5Gtv47QrPqqxZF1bpvvxUYol+uIjobju96yzzgLgsMMO89jSpUu9ba+XuJOmEcV7Z4nCuO7SHqP9+9//9tjatWs76Opy9mgvvq6saj7kiWF7ZAPFfqQ9aEQqIpJIHamISKJOObWPB1XZQVaQTwFPPfVUj9kZ81ajEGDvvff29sqVK4HyWfUYi2sdbeoeM/U77rijt++//36gWNuwkmvSapH9LuORLHYf4hnyp5xyirftMLq43s8Kh7S0HTFOOfv37w/k01EoHkNh9zFeW5ye2jS/GtPUziS+12x6HAt/2NpSy+hDcS213ceYLbd2W7LlsbBMPDzPtoDG6fzo0aO9bSt1rD4q5EeNtBeNSEVEEnXKEWlMNi1atMjbtnsp7h6yv3ixpFa50notxeIo2Eak8XsvXrzY2/bgeu7cuS39KA3DRqR///vfPWbr+eLutEsuucTbP/jBDwB46qmnPGY7aWLiJ74ebMRqa1QBBg0aBBRHSnEn2vz58wG49dZbPRbba9asaenHawgzZszwtq3DHT9+vMdGjhwJ5AVAoLi21O5ZTORZkZoYi7MNKxgT34s224gJwxEjRnj79NNPB4o71eKMcPLkyUBxhN3esw2NSEVEEqkjFRFJlHVUhXmALMs67pvVqFKpVDN7TeP9tKl9TBDYOtwjjjjCYzF5YYeXxbqQtkYwJgLjtM0SjjvssIPH7HvGxzOxtuif/vQnIE9uQcetGa3V+2mPRuK9s0RhXFMdH6lZMsrW9UJ+aGQ8MC+eRGDrvOMaYDtUMhZRsUJBkNeVjff4jjvu8Pb1118PwOzZs8t+zxSbup8akYqIJNKItJOp1RFMc+KOmX322cfbY8eOBYpnX1nbElVQXOpkSZCnn37aY9OnTy/8CzBz5kxv2xK4OGLtqOIptX4/Y0nLAw44AIBzzz3XY/EEC9u1ZjugALp16wYUR5wx2WSJ3Xg/bDYSC8/EMpm2rOmGG27wWBzx2o7D9iitqRGpiEg7UUcqIpKoQ6f2IiL1SCNSEZFE6khFRBKpIxURSaSOVEQkkTpSEZFE6khFRBKpIxURSaSOVEQkkTpSEZFE6khFRBKpIxURSaSOVEQkkTpSEZFE6khFRBKpIxURSaSOVEQkkTpSEZFE6khFRBKpIxURSaSOVEQkkTpSEZFE6khFRBKpIxURSaSOVEQkkTpSEZFE6khFRBLosSahAAANzElEQVSpIxURSaSOVEQkkTpSEZFEDdmRZln27SzL5mRZ9mGWZVdW+3okTZZlfbIsuznLstVZli3Nsuy8al+TtF0tvj+7VvsCqmQZcAkwHuhe5WuRdH8F1gK7ACOBaVmWzS2VSvOre1nSRjX3/mzIEWm
pVLqpVCpNBt6s9rVImizLegATgJ+XSqVVpVLpQWAK8IXqXpm0VS2+PxuyI5W6MgT4qFQqLQyxucDQKl2PNCB1pFLregLvbBR7B+hVhWuRBqWOVGrdKqD3RrHewHtVuBZpUOpIpdYtBLpmWTY4xEYASjRJh2nIjjTLsq5ZlnUDugBdsizrlmVZo65gqGmlUmk1cBPwf1mW9ciybAxwOjCpulcmbVWL78+G7EiBi4H3gZ8A53/SvriqVyQpvsmGZTIrgGuBb2jpU02rufdnViqVqn0NIiI1rVFHpCIiFaOOVEQkkTpSEZFE6khFRBJ16JKCLMuU2WpBqVTKqn0NraX72TLdz/qyqfupEamISCJ1pCIiidSRiogkUkcqIpKoU+9fFWmNHj16ADBo0CCPnXjiid5+/fXXAVizZo3HsizPGWyxxYbxxPbbb++x999/H4Bnn33WY7Nnz/b22rVrK3LtUh80IhURSaSOVEQkkab2UpNsOg8wdOiGU0VOO+00j33961/39ssvvwzAO+9sXEh/A5va77TTTh5bvXo1APfee6/Hnn/+eW+/+uqrAKjoj4BGpCIiyTQilZo0fPhwb3/xi18EYMKECR7bbrvtvL3tttsCxdFjTDY1F/vggw+8feedd3r7zTc3HHD54Ycfbva1S/3RiFREJJE6UhGRRHUzte/SpYu3d911V2/3798fgK222spjXbtu+LEXLVrksaVLl7b3JUoFrVy50tvr168H8ik85AkkKD9lb61ddtnF22PGjPH2Y489BmhqLxtoRCoikkgdqYhIorqZ2h999NHejlMw2za48847e2zcuHEAPPnkkx6LU/slS5YA8NRTT3lszpw53rZtgx9//LHHbHopHWP58uXetnu3YsUKj+24447etql9nO7HR0Hlpv62jjR+LK4U6NWrF1B8xCCNSyNSEZFENTUi3XLLLYHiDpQLL7wQKI4WRo4c6e1+/foBxdGjGTJkiLctKQX56HW//fbz2A477OBtS1bFUdFrr70GFNcvvvvuu97+6KOPCv83XpN9TFovjgSfeOIJAGbMmOGxESNGeNt+5/HexAIl8Z6YVatWAXnxEigmMePOKhGNSEVEEqkjFRFJVFNT+8GDBwPFrYBWnKJ3794es2kZ5Fv54tR+2bJlQD4dh2KiwqZze++9t8fGjh3r7fHjxwPFKflzzz0HwNtvv+2x++67z9uW6Igft/9j1yht88gjjwDFoiJx6m2vjS9/+cseO+ecc7xtiaOoZ8+eALz33nses9cN6HGMFGlEKiKSqKZGpJZQsuVLkI80586d67FY1dx2oCxcuNBjtlQpJhli9XQbzcRRx8CBA71tyag4krEdMPvuu6/HBgwY0OT/W/k1gD/+8Y8APPDAA2V+Wmkt29EUE45HHHGEty05GT++9dZbN/s1bYYyc+ZMj914443etqr79eTTn/60t0844QSguKxv6tSpQD4DgHyZWBSXmdl7LC4ji7MF23EY70dzsdiOs4W4fDHOSDuKRqQiIonUkYqIJKqpqf0bb7wBwKxZszxm7WnTpjWJAWyzzTZAcQoSa0y2lj0iAOjTpw9QXJdoU8k99tjDYzGhYcUt7GcAeOuttzb7OmSDuA7Ufvef+cxnmsQgTzZ169bNY7YmOYpTxQULFgBw2223eWzKlCmpl92pxXXTZ511FlD8PR1++OFAvvMP8tMHIP/9xZMIXnjhBQBeeuklj40aNcrb9l6Kj8nsflnCb+OPWzy+ly6//HJvP/zwwwCsW7eu7M/ZHjQiFRFJpI5URCRRTU3tLbsdDySzrH0sQhGz7XGLX4q4/tPa3bt399jBBx8MFDOWsVblvHnzAPj1r3/tscWLF1fk2hrR/vvv7+0zzjgDyKejUMwSt7YeaVyHOn36dKD4Wqt3sTCPTcWHDRvmMdt6Hbdgx23Qa9euBVo+EDA+lrHHLfF90xK7n/F7x2m+rYyJK3Xam0akIiKJOuWINI704lpMK41miQD
I1/PZQ2uAPffc09uWbIrJIhuxxjVy5YqaRHFUY9d37LHHeuykk04C8t1XAI8++qi3bQ3i448/7rFKjZbrXby3dtDd8ccf7zErUNLSqCbe71j0xO7J5MmTPWYj0jgTqXdxh52NzuPvtG/fvgDstttuHosFhJobibZ08GBbxATUYYcd5m0rXqMRqYhIDVFHKiKSqFNN7W3NWqxmf9xxx3n7Rz/6EQDXX3+9x2wKEguM7L777t62qVmsJfniiy8C8PTTT3ssbhG1aX6c1sR6paNHjwbgoosu8pg9Lrjqqqs89uCDD3r7mWeeATSdb624HveYY47xtiWW7OQDyNfzxuReXCts9zMWqbEapgB33303APfcc4/H7DXSSOI26/POOw8oPmb70pe+BMD3v/99j8XtmDbVjrV7bfvupg4jbCkx1Vpxu3e5+rLtTSNSEZFEnWpEan9J4o6EWIzAypidffbZHrPdQXHEGUe0ZuLEid620eftt9/uMVueBPloJC6viMVITj311ML1AnznO98BikVJYmGLcsUdpCkraGGjfoDzzz/f24cccghQHH3a7zYmDOOI0pblzJ4922NXX321t+fPnw/o3K1y4gzqiiuuAIq7veKI0hI+do8gnx3G3225pYQtxeJuqSuvvBIozlrKFTXpSBqRiogkUkcqIpKoU03tbRoRpxOxVqdN+7773e96zHZZxGldTDTYMD8+7LY6h7HqfdwpY7sk4nQkJptsWvnf//7XY7a2VcfzprHHNrEASZwq2r2LxTSsWEasQ/urX/3K2zZFjOskY7ENVbtvHXvdx8dtkT1O+c9//tOu12GPzOLUPu5sjO2OohGpiEgidaQiIok61dS+nFgj0uoM2rZPyNcYxkIIsU7o0KFDgeLxBrb2LT4OiFn5cuL0z7KGMesfM/zSOvbYpV+/fh6zdcOxMIatRYR8DWJ8VGOPU+KW3IceesjblvGt1JpFqS7rE+L9jCtoytWabW8akYqIJOr0I9I4arSH3PFhtu0eip8XR49W0CIWMrERayx0EHdQWbIrjnpisRGrlB7Xniph0TpxV4sVI4nFXw466CAgP0xw4/9TjiWTYrIpjlDsPuoe1Ydyh9vFUah2NomI1CB1pCIiiTr91L6cmNixdpz+xWG+Tftigsqm8TaNhGKFbUtUXHbZZR6LaxAXLVrU5DqkdeK9sQTf1772NY/Z2sDNqZhu9THHjRvnsVjt3iq/x3sstavc+07JJhGRGqeOVEQkUU1O7cuJa8qs2g/k29rsTG7Iay1aRh+KWfmpU6cC+bpVKFZv0nrEtotTMDs6ZuDAgR6Lj2Bay9aZxvv5la98xdu2CiNuN5baEh/d2TrSGNPUXkSkxtXNiDSKa0atruWFF17osSOOOALIj5wFmDRpkretYnq59WqSptwxybFIjVVk31RF9XLs/8QdUjbaBbjrrrsAjUhrWdyZWE4chaoeqYhIDVJHKiKSqG6m9nEqFw9Ls7qWhx56qMeWL18OwMUXX+wxOwsbikeESGXFabwdCPivf/3LY5/97GeB4gGGrT0HPX6e1S2F6mwZlMoaNmyYtwcMGNDk47EGqd3v+HgobiFvDxqRiogkqvk/1fbX56ijjvKYHdkLcOCBBwLFo5ftOGdLQkCxXJ+0nzgyeOWVVwCYNm2axwYPHgwUD7eLZfQskRCTD+VGnHE0sjm7pKRz2tRR2ya+Bmx0GkepGpGKiHRy6khFRBLV5NQ+JpaspugFF1zgsf3228/bjz32GAA33XSTx+677z6geFa2dDyrbP/II494zNZ6xtqhgwYN8rbd+27dunnMpnVxx1mcCqoOae2L9zMmLE1MNFoyubVJykrQiFREJJE6UhGRRDU1tbfiFkceeaTHbJ1ojP3zn//09o033gjkaxZB60Q7m5hRteIx8aiRuKbUjicpV5giFqt54YUXvG2PEKR29e3b19ujRo1q8vH4WOfmm28GYN26de1/YZ/QiFREJFFNjUit2Ii
NQiFfJ2pJJYBLL73U2/bgOT6gbu81ZdI6PXv2BIo7VU444QSgeHpBHI3YOtK4NtRGoitWrPDYvffe6+0lS5ZU8rKlCuJso1evXk0+Xu2dbBqRiogkUkcqIpKopqb2Vls0rhO9//77Abjiiis89uKLL3p7/fr1HXNx0ipxbZ8dWheLzJx77rlAfggelE8sxbWhth44nmt/zTXXeHvBggWply1VFtd8l3s0F9cVW7LpN7/5jcdsO3h71RjWiFREJFGnH5HGpIItibEEEsAdd9wB5EfugkahnVk8k2ns2LEATJw40WN2j2PBiciWubz66qsesxKINhIBWLx4sbfXrFmTetlSZXGWaUsav/Wtb5X93OHDhwPFpJOKloiIdHLqSEVEEmU6WlhEJI1GpCIiidSRiogkUkcqIpJIHamISCJ1pCIiidSRiogkUkcqIpJIHamISCJ1pCIiidSRiogkUkcqIpJIHamISCJ1pCIiidSRiogkUkcqIpJIHamISCJ1pCIiidSRiogkUkcqIpJIHamISCJ1pCIiidSRiogkUkcqIpLo/wFBVhwORz8H+AAAAABJRU5ErkJggg==\n", "text/plain": [ "
" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "data.show_batch(rows=3, figsize=(5,5))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "An example of multiclassification can be downloaded with the following cell. It's a sample of the [planet dataset](https://www.google.com/search?q=kaggle+planet&rlz=1C1CHBF_enFR786FR786&oq=kaggle+planet&aqs=chrome..69i57j0.1563j0j7&sourceid=chrome&ie=UTF-8)." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "planet = untar_data(URLs.PLANET_SAMPLE)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If we open the labels files, we seach that each image has one or more tags, separated by a space." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
image_name  tags
0  train_21983  partly_cloudy primary
1  train_9516  clear cultivation primary water
2  train_12664  haze primary
3  train_36960  clear primary
4  train_5302  haze primary road
\n", "
" ], "text/plain": [ " image_name tags\n", "0 train_21983 partly_cloudy primary\n", "1 train_9516 clear cultivation primary water\n", "2 train_12664 haze primary\n", "3 train_36960 clear primary\n", "4 train_5302 haze primary road" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "df = pd.read_csv(planet/'labels.csv')\n", "df.head()" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "data = ImageDataBunch.from_csv(planet, folder='train', size=128, suffix='.jpg', label_delim=' ',\n", " ds_tfms=get_transforms(flip_vert=True, max_lighting=0.1, max_zoom=1.05, max_warp=0.))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The `show_batch`method will then print all the labels that correspond to each image." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "data.show_batch(rows=3, figsize=(10,8), ds_type=DatasetType.Valid)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You can find more ways to build an [`ImageDataBunch`](/vision.data.html#ImageDataBunch) without the factory methods in [`data_block`](/data_block.html#data_block)." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

class ImageDataBunch[source]

\n", "\n", "> ImageDataBunch(**`train_dl`**:[`DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader), **`valid_dl`**:[`DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader), **`fix_dl`**:[`DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader)=***`None`***, **`test_dl`**:`Optional`\\[[`DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader)\\]=***`None`***, **`device`**:[`device`](https://pytorch.org/docs/stable/tensor_attributes.html#torch-device)=***`None`***, **`dl_tfms`**:`Optional`\\[`Collection`\\[`Callable`\\]\\]=***`None`***, **`path`**:`PathOrStr`=***`'.'`***, **`collate_fn`**:`Callable`=***`'data_collate'`***, **`no_check`**:`bool`=***`False`***) :: [`DataBunch`](/basic_data.html#DataBunch)\n", "\n", "DataBunch suitable for computer vision. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is the same initialization as a regular [`DataBunch`](/basic_data.html#DataBunch) so you probably don't want to use this directly, but one of the factory methods instead." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Factory methods" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If you quickly want to get a [`ImageDataBunch`](/vision.data.html#ImageDataBunch) and train a model, you should process your data to have it in one of the formats the following functions handle. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

from_folder[source]

\n", "\n", "> from_folder(**`path`**:`PathOrStr`, **`train`**:`PathOrStr`=***`'train'`***, **`valid`**:`PathOrStr`=***`'valid'`***, **`valid_pct`**=***`None`***, **`classes`**:`Collection`\\[`T_co`\\]=***`None`***, **\\*\\*`kwargs`**:`Any`) → `ImageDataBunch`\n", "\n", "Create from imagenet style dataset in `path` with `train`,`valid`,`test` subfolders (or provide `valid_pct`). \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch.from_folder)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Refer to [`create_from_ll`](#ImageDataBunch.create_from_ll) to see all the `**kwargs` arguments.\n", "\n", "\"*Imagenet-style*\" datasets look something like this (note that the test folder is optional):\n", "\n", "```\n", "path\\\n", " train\\\n", " clas1\\\n", " clas2\\\n", " ...\n", " valid\\\n", " clas1\\\n", " clas2\\\n", " ...\n", " test\\\n", "```\n", "\n", "For example:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "data = ImageDataBunch.from_folder(path, ds_tfms=tfms, size=24)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Note that this (and all factory methods in this section) pass any `kwargs` to [`DataBunch.create`](/basic_data.html#DataBunch.create)." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

from_csv[source]

\n", "\n", "> from_csv(**`path`**:`PathOrStr`, **`folder`**:`PathOrStr`=***`None`***, **`label_delim`**:`str`=***`None`***, **`csv_labels`**:`PathOrStr`=***`'labels.csv'`***, **`valid_pct`**:`float`=***`0.2`***, **`fn_col`**:`int`=***`0`***, **`label_col`**:`int`=***`1`***, **`suffix`**:`str`=***`''`***, **`delimiter`**:`str`=***`None`***, **`header`**:`Union`\\[`int`, `str`, `NoneType`\\]=***`'infer'`***, **\\*\\*`kwargs`**:`Any`) → `ImageDataBunch`\n", "\n", "Create from a csv file in `path/csv_labels`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch.from_csv)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Refer to [`create_from_ll`](#ImageDataBunch.create_from_ll) to see all the `**kwargs` arguments.\n", "\n", "Create an [`ImageDataBunch`](/vision.data.html#ImageDataBunch) from `path` by splitting the data in `folder` and labelled in a file `csv_labels` between a training and validation set. Use `valid_pct` to indicate the percentage of the total images to use as the validation set. An optional `test` folder contains unlabelled data and `suffix` contains an optional suffix to add to the filenames in `csv_labels` (such as '.jpg'). `fn_col` is the index (or the name) of the the column containing the filenames and `label_col` is the index (indices) (or the name(s)) of the column(s) containing the labels. Use [`header`](https://pandas.pydata.org/pandas-docs/stable/reference/api/pandas.read_csv.html#pandas-read-csv) to specify the format of the csv header, and `delimiter` to specify a non-standard csv-field separator. In case your csv has no header, column parameters can only be specified as indices. 
If `label_delim` is passed, split what's in the label column according to that separator.\n", "\n", "For example:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "data = ImageDataBunch.from_csv(path, ds_tfms=tfms, size=24);" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

from_df[source]

\n", "\n", "> from_df(**`path`**:`PathOrStr`, **`df`**:`DataFrame`, **`folder`**:`PathOrStr`=***`None`***, **`label_delim`**:`str`=***`None`***, **`valid_pct`**:`float`=***`0.2`***, **`fn_col`**:`IntsOrStrs`=***`0`***, **`label_col`**:`IntsOrStrs`=***`1`***, **`suffix`**:`str`=***`''`***, **\\*\\*`kwargs`**:`Any`) → `ImageDataBunch`\n", "\n", "Create from a `DataFrame` `df`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch.from_df)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Refer to [`create_from_ll`](#ImageDataBunch.create_from_ll) to see all the `**kwargs` arguments.\n", "\n", "Same as [`ImageDataBunch.from_csv`](/vision.data.html#ImageDataBunch.from_csv), but passing in a `DataFrame` instead of a csv file. e.g" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
name  label
0  train/3/7463.png  0
1  train/3/21102.png  0
2  train/3/31559.png  0
3  train/3/46882.png  0
4  train/3/26209.png  0
\n", "
" ], "text/plain": [ " name label\n", "0 train/3/7463.png 0\n", "1 train/3/21102.png 0\n", "2 train/3/31559.png 0\n", "3 train/3/46882.png 0\n", "4 train/3/26209.png 0" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "df = pd.read_csv(path/'labels.csv', header='infer')\n", "df.head()" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "data = ImageDataBunch.from_df(path, df, ds_tfms=tfms, size=24)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Different datasets are labeled in many different ways. The following methods can help extract the labels from the dataset in a wide variety of situations. The way they are built in fastai is constructive: there are methods which do a lot for you but apply in specific circumstances and there are methods which do less for you but give you more flexibility.\n", "\n", "In this case the hierarchy is:\n", "\n", "1. [`ImageDataBunch.from_name_re`](/vision.data.html#ImageDataBunch.from_name_re): Gets the labels from the filenames using a regular expression\n", "2. [`ImageDataBunch.from_name_func`](/vision.data.html#ImageDataBunch.from_name_func): Gets the labels from the filenames using any function\n", "3. [`ImageDataBunch.from_lists`](/vision.data.html#ImageDataBunch.from_lists): Labels need to be provided as an input in a list" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

from_name_re[source]

\n", "\n", "> from_name_re(**`path`**:`PathOrStr`, **`fnames`**:`FilePathList`, **`pat`**:`str`, **`valid_pct`**:`float`=***`0.2`***, **\\*\\*`kwargs`**)\n", "\n", "Create from list of `fnames` in `path` with re expression `pat`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch.from_name_re)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Refer to [`create_from_ll`](#ImageDataBunch.create_from_ll) to see all the `**kwargs` arguments.\n", "\n", "Creates an [`ImageDataBunch`](/vision.data.html#ImageDataBunch) from `fnames`, calling a regular expression (containing one *re group*) on the file names to get the labels, putting aside `valid_pct` for the validation. In the same way as [`ImageDataBunch.from_csv`](/vision.data.html#ImageDataBunch.from_csv), an optional `test` folder contains unlabelled data.\n", "\n", "Our previously created dataframe contains the labels in the filenames so we can leverage it to test this new method. [`ImageDataBunch.from_name_re`](/vision.data.html#ImageDataBunch.from_name_re) needs the exact path of each file so we will append the data path to each filename before creating our [`ImageDataBunch`](/vision.data.html#ImageDataBunch) object." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "[PosixPath('/home/ubuntu/.fastai/data/mnist_sample/train/3/7463.png'),\n", " PosixPath('/home/ubuntu/.fastai/data/mnist_sample/train/3/21102.png')]" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "fn_paths = [path/name for name in df['name']]; fn_paths[:2]" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "pat = r\"/(\\d)/\\d+\\.png$\"\n", "data = ImageDataBunch.from_name_re(path, fn_paths, pat=pat, ds_tfms=tfms, size=24)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['3', '7']" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "data.classes" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

from_name_func[source]

\n", "\n", "> from_name_func(**`path`**:`PathOrStr`, **`fnames`**:`FilePathList`, **`label_func`**:`Callable`, **`valid_pct`**:`float`=***`0.2`***, **\\*\\*`kwargs`**)\n", "\n", "Create from list of `fnames` in `path` with `label_func`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch.from_name_func)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Refer to [`create_from_ll`](#ImageDataBunch.create_from_ll) to see all the `**kwargs` arguments.\n", "\n", "Works in the same way as [`ImageDataBunch.from_name_re`](/vision.data.html#ImageDataBunch.from_name_re), but instead of a regular expression it expects a function that will determine how to extract the labels from the filenames. (Note that `from_name_re` uses this function in its implementation).\n", "\n", "To test it we could build a function with our previous regex. Let's try another, similar approach to show that the labels can be obtained in a different way." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['3', '7']" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "def get_labels(file_path): return '3' if '/3/' in str(file_path) else '7'\n", "data = ImageDataBunch.from_name_func(path, fn_paths, label_func=get_labels, ds_tfms=tfms, size=24)\n", "data.classes" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

from_lists[source]

\n", "\n", "> from_lists(**`path`**:`PathOrStr`, **`fnames`**:`FilePathList`, **`labels`**:`StrList`, **`valid_pct`**:`float`=***`0.2`***, **`item_cls`**:`Callable`=***`None`***, **\\*\\*`kwargs`**)\n", "\n", "Create from list of `fnames` in `path`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch.from_lists)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Refer to [`create_from_ll`](#ImageDataBunch.create_from_ll) to see all the `**kwargs` arguments.\n", "\n", "The most flexible factory function; pass in a list of `labels` that correspond to each of the filenames in `fnames`.\n", "\n", "To show an example we have to build the labels list outside our [`ImageDataBunch`](/vision.data.html#ImageDataBunch) object and give it as an argument when we call `from_lists`. Let's use our previously created function to create our labels list." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "['3', '7']" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "labels_ls = list(map(get_labels, fn_paths))\n", "data = ImageDataBunch.from_lists(path, fn_paths, labels=labels_ls, ds_tfms=tfms, size=24)\n", "data.classes" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

create_from_ll[source]

\n", "\n", "> create_from_ll(**`lls`**:[`LabelLists`](/data_block.html#LabelLists), **`bs`**:`int`=***`64`***, **`val_bs`**:`int`=***`None`***, **`ds_tfms`**:`Union`\\[`Callable`, `Collection`\\[`Callable`\\], `NoneType`\\]=***`None`***, **`num_workers`**:`int`=***`4`***, **`dl_tfms`**:`Optional`\\[`Collection`\\[`Callable`\\]\\]=***`None`***, **`device`**:[`device`](https://pytorch.org/docs/stable/tensor_attributes.html#torch-device)=***`None`***, **`test`**:`Union`\\[`Path`, `str`, `NoneType`\\]=***`None`***, **`collate_fn`**:`Callable`=***`'data_collate'`***, **`size`**:`int`=***`None`***, **`no_check`**:`bool`=***`False`***, **`resize_method`**:[`ResizeMethod`](/vision.image.html#ResizeMethod)=***`None`***, **`mult`**:`int`=***`None`***, **`padding_mode`**:`str`=***`'reflection'`***, **`mode`**:`str`=***`'bilinear'`***, **`tfm_y`**:`bool`=***`False`***) → `ImageDataBunch`\n", "\n", "Create an [`ImageDataBunch`](/vision.data.html#ImageDataBunch) from [`LabelLists`](/data_block.html#LabelLists) `lls` with potential `ds_tfms`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch.create_from_ll)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Use `bs`, `num_workers`, `collate_fn` and a potential `test` folder. `ds_tfms` is a tuple of two lists of transforms to be applied to the training and the validation (plus test optionally) set. `tfms` are the transforms to apply to the [`DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader). The `size` and the `kwargs` are passed to the transforms for data augmentation." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

single_from_classes[source]

\n", "\n", "> single_from_classes(**`path`**:`PathOrStr`, **`classes`**:`StrList`, **`ds_tfms`**:`Union`\\[`Callable`, `Collection`\\[`Callable`\\]\\]=***`None`***, **\\*\\*`kwargs`**)\n", "\n", "Create an empty [`ImageDataBunch`](/vision.data.html#ImageDataBunch) in `path` with `classes`. Typically used for inference. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch.single_from_classes)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "
Note: This method is deprecated, you should use DataBunch.load_empty now.
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "jekyll_note('This method is deprecated, you should use DataBunch.load_empty now.')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Other methods" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In the next few methods we will use another dataset, CIFAR. This is because the second method will get the statistics for our dataset and we want to be able to show different statistics per channel. If we were to use MNIST, these statistics would be the same for every channel. White pixels are [255,255,255] and black pixels are [0,0,0] (or in normalized form [1,1,1] and [0,0,0]) so there is no variance between channels." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "PosixPath('/home/ubuntu/.fastai/data/cifar10')" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "path = untar_data(URLs.CIFAR); path" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

channel_view[source]

\n", "\n", "> channel_view(**`x`**:`Tensor`) → `Tensor`\n", "\n", "Make channel the first axis of `x` and flatten remaining axes \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(channel_view)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "data = ImageDataBunch.from_folder(path, ds_tfms=tfms, valid='test', size=24)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "def channel_view(x:Tensor)->Tensor:\n", " \"Make channel the first axis of `x` and flatten remaining axes\"\n", " return x.transpose(0,1).contiguous().view(x.shape[1],-1) " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This function takes a tensor and flattens all dimensions except the channels, which it keeps as the first axis. This function is used to feed [`ImageDataBunch.batch_stats`](/vision.data.html#ImageDataBunch.batch_stats) so that it can get the pixel statistics of a whole batch.\n", "\n", "Let's take as an example the dimensions our MNIST batches: 128, 3, 24, 24." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "t = torch.Tensor(128, 3, 24, 24)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "torch.Size([128, 3, 24, 24])" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "t.size()" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "tensor = channel_view(t)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "torch.Size([3, 73728])" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "tensor.size()" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

batch_stats[source]

\n", "\n", "> batch_stats(**`funcs`**:`Collection`\\[`Callable`\\]=***`None`***) → `Tensor`\n", "\n", "Grab a batch of data and call reduction function `func` per channel \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch.batch_stats)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "[tensor([0.4928, 0.4767, 0.4671]), tensor([0.2677, 0.2631, 0.2630])]" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "data.batch_stats()" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

normalize[source]

\n", "\n", "> normalize(**`stats`**:`Collection`\\[`Tensor`\\]=***`None`***, **`do_x`**:`bool`=***`True`***, **`do_y`**:`bool`=***`False`***)\n", "\n", "Add normalize transform using `stats` (defaults to `DataBunch.batch_stats`) \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageDataBunch.normalize)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In the fast.ai library we have `imagenet_stats`, `cifar_stats` and `mnist_stats` so we can add normalization easily with any of these datasets. Let's see an example with our dataset of choice: MNIST." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "ImageDataBunch;\n", "\n", "Train: LabelList\n", "y: CategoryList (50000 items)\n", "[Category truck, Category truck, Category truck, Category truck, Category truck]...\n", "Path: /home/ubuntu/.fastai/data/cifar10\n", "x: ImageList (50000 items)\n", "[Image (3, 32, 32), Image (3, 32, 32), Image (3, 32, 32), Image (3, 32, 32), Image (3, 32, 32)]...\n", "Path: /home/ubuntu/.fastai/data/cifar10;\n", "\n", "Valid: LabelList\n", "y: CategoryList (10000 items)\n", "[Category truck, Category truck, Category truck, Category truck, Category truck]...\n", "Path: /home/ubuntu/.fastai/data/cifar10\n", "x: ImageList (10000 items)\n", "[Image (3, 32, 32), Image (3, 32, 32), Image (3, 32, 32), Image (3, 32, 32), Image (3, 32, 32)]...\n", "Path: /home/ubuntu/.fastai/data/cifar10;\n", "\n", "Test: None" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "data.normalize(cifar_stats)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "[tensor([ 0.0074, -0.0219, 0.0769]), tensor([1.0836, 1.0829, 1.0078])]" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "data.batch_stats()" ] }, { "cell_type": "markdown", 
"metadata": {}, "source": [ "## Data normalization" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "You may also want to normalize your data, which can be done by using the following functions." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

normalize[source]

\n", "\n", "> normalize(**`x`**:`Tensor`, **`mean`**:`FloatTensor`, **`std`**:`FloatTensor`) → `Tensor`\n", "\n", "Normalize `x` with `mean` and `std`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(normalize)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

denormalize[source]

\n", "\n", "> denormalize(**`x`**:`Tensor`, **`mean`**:`FloatTensor`, **`std`**:`FloatTensor`, **`do_x`**:`bool`=***`True`***) → `Tensor`\n", "\n", "Denormalize `x` with `mean` and `std`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(denormalize)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

normalize_funcs[source]

\n", "\n", "> normalize_funcs(**`mean`**:`FloatTensor`, **`std`**:`FloatTensor`, **`do_x`**:`bool`=***`True`***, **`do_y`**:`bool`=***`False`***) → `Tuple`\\[`Callable`, `Callable`\\]\n", "\n", "Create normalize/denormalize func using `mean` and `std`, can specify `do_y` and `device`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(normalize_funcs)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "On MNIST the mean and std are 0.1307 and 0.3081 respectively (looked on Google). If you're using a pretrained model, you'll need to use the normalization that was used to train the model. The imagenet norm and denorm functions are stored as constants inside the library named imagenet_norm and imagenet_denorm. If you're training a model on CIFAR-10, you can also use cifar_norm and cifar_denorm.\n", "\n", "You may sometimes see warnings about *clipping input data* when plotting normalized data. That's because even although it's denormalized when plotting automatically, sometimes floating point errors may make some values slightly out or the correct range. You can safely ignore these warnings in this case." 
] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZoAAAGoCAYAAAB/tCPFAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDMuMC4wLCBodHRwOi8vbWF0cGxvdGxpYi5vcmcvqOYd8AAAIABJREFUeJzt3XmQldWZx/HnyL43q+z7JiCLgCgOi4CA6LiTwGQkidGUmcRkkkpSmamZyViZqpnEKVNlJkbjlEmMKUejGCFggcquRhZZBEH2Nc2O0Cyy+c4fE2t8fm97b9/uPn1vd38/Vf7xo7vvPen79n3ynueec0KSJAYAQCxX5HsAAICajUIDAIiKQgMAiIpCAwCIikIDAIiKQgMAiIpCAwCIikJTBiGEZ0MIxSGEUyGErSGE+/M9JhQurhfkojZcL4EFm9mFEAaa2fYkSc6HEPqb2RIzuyVJkjX5HRkKEdcLclEbrhfuaMogSZJNSZKc/yT+5b9eeRwSChjXC3JRG64XCk0ZhRAeDyGcNbMtZlZsZvPzPCQUMK4X5KKmXy9MneUghFDHzK43s/Fm9uMkSS7md0QoZFwvyEVNvl64o8lBkiSXkyRZYWadzexr+R4PChvXC3JRk68XCk351LUaNoeKqLhekIsad71QaLIIIbQLIcwIITQNIdQJIUwxs5lmtijfY0Ph4XpBLmrL9UKPJosQQlsze9HMhtj/FeY9ZvZYkiRP5XVgKEhcL8hFbbleKDQAgKiYOgMAREWhAQBERaEBAERFoQEARFW3Kp8shFBQnzyYOnWqy9OnT3d54sSJLrdq1Sr1GAcOHHD56NGjLjdu3Njl1q1bu9ysWTOXly1b5vJPfvITl99+++3UGHKRJEmo0ANUoUK7Xmojrhfk4rOuF+5oAABRUWgAAFFRaAAAUVVpjybfWrZs6fKsWbNcvuWWW1w+efKky++8807qMV999VWX586d6/JHH33k8pQpU1z+6le/6nLfvn1dHj16tMsV7dEAQFXjjgYAEBWFBgAQFYUGABBVrerRdO7c2eUOHTq4vGHDBpfXrVvn8ssvv5x6zEWLMu/mfcUVvpa/8cYbLl9//fUu33XXXS43bdo04+MDKJt69eq5rH9bDRs2dDkEvyRENyDOtiFxnTp1Mj5fSUmJyxcuXEg9xpkzZ1w+f/68yx9//HHGMRQK7mgAAFFRaAAAUVFoAABR1aoezcWLF12eM2eOy8XFxS6/9dZbLu/duzfn59Q51H79+rncpUsXl3Xe9vDhwzk/JwpD+/btU/+m66SKiopc1j6BXj96DWsPcNu2bS5v377dZV3XVZvo71r3Mhw0aJDLDRo0yOnx9bXQ57v33ntdnj17tsv62pmZLV261GXtIx87dszly5cvl22wVYw7GgBAVBQaAEBUFBoAQFS1qkezZcuWjDmGFi1auHzDDTe43LNnT5d1P7XNmzfHGRgqrG5d/+czfvx4l/W8IzOzAQMGuNyoUSOX9fwi7RPo93ft2tXltWvXuvytb33L5VWrVqXGVFvo2U933323y8OHD3dZ173ouppsWXs2SvdWLK2/8uCDD7q8YMEClx977DGXV65cmfE584U7GgBAVBQaAEBUFBoAQFS1qkdTFSZMmODyqFGjXJ48ebLLTZo0cXnx4sUub926tRJHh1zompbu3bu7rP2WGTNmuKw9G7P0mUi6v9W5c+dc1h6Njkn30xoxYoTLI0eOdHn9+vUZn78m0zVp//Vf/+Wy9rt0zZP2XHbs2OHy8ePHXdbXQnvCvXr1cln3XjRLv3/cfPPNLn/44YcuHz161OWdO3emHjMfuKMBAERFoQEAREWhAQBERY8mBzonb2b29a9/3WVdF6Nn4Oi8
7Jo1a1zWdTS6lxHiqV+/vsvak7nvvvtcnjRpUsbv13U2ZunXU/ci+93vfuey7m3WqVMnl3WdxenTp13es2ePy4W6F1ZV0N/Nm2++6bL2r9q1a+eyvp5HjhzJ+PgbN250Wfsper1of9fMbNiwYS7r2h7dK1Gv4ULBHQ0AICoKDQAgKgoNACAqejQZXH311S7/8Ic/TH3PX/3VX7ncqlUrl/VccT3zW8+geP/99zN+P+Jp3ry5y+PGjXP5u9/9rsv62uq6iqeffjr1HH/4wx9c1nUzOs+ve5/df//9Luv+Wnpeia6jqM09GnXp0iWXtYeiOVfaf+vdu7fL+v6ie62ZmXXs2NFl7Qvp+8WhQ4dyHmdV4I4GABAVhQYAEBWFBgAQFT2aT9F9o2699VaXb7zxxtTP6Ofadd2Dzrnr9+vaHP1cPHudVR3d90vn6D/66KOMX//1r3/t8gsvvJB6jt27d7v88ccfu6zXoO51pdeg9niWLFnicnFxcWoMqBy691lRUZHLuq/dnXfe6fLMmTNdHjhwYOo5SkpKXH7jjTdcnj9/vssnTpzIMOL84Y4GABAVhQYAEBWFBgAQFT2aT9E1MLquorT5z71797qsPRU9T2Ts2LEut2jRwmXd74geTdU5e/asy7oP3c9//nOXdU3D3LlzXdZ9xszSPRm9PgYPHuzyN7/5TZd17YWeIf/qq6+6XNG1IPh/umZJ1zhpD2batGkuX3fddS5feeWVWZ/z1KlTLus6qf3797use50VynlD3NEAAKKi0AAAoqLQAACiokfzKbpOQj+zXtp895kzZ1yeN2+ey3oOuc7J33DDDS5fe+21Lj/xxBMZRozKpHtf6ZqXRx991GVd46BrWrQfY2bWqFEjl/Vc+n/91391WdfN6JgeeeQRlwvljPiaSM+jad26tct/8zd/47KeJaM9Hd0rrzT6HDfddJPLui5vzpw5Lu/bt8/lgwcPuqzXfCzc0QAAoqLQAACiotAAAKKiR/MpOuf++uuvZ8xloZ+D13PJdd1MvXr1XG7Tpo3LR48ezXkMKB+dv8513zDtx5mZjRkzxuUf/ehHLg8dOtRlvSb/4z/+w2U9z6i0vhAqh57lc/LkSZd1DZOeR9OvXz+XdU1UadeLru3S62PkyJEuf/WrX3V59uzZLmufUceoezVWFu5oAABRUWgAAFFRaAAAUdGjiSzb/kjZvr9z584u06MpXLo33m233Zb6ngcffNDlQYMGuaxrNfQ8Iz0jXtfuIB7tf2mP5pVXXnFZz6PRNTGdOnVyWfc9NDNbu3aty7ouT8+wmTRpkst33HGHy9oD/s///E+XN23a5HJZ1vqUBXc0AICoKDQAgKgoNACAqCg0AICo+DBAZHXq1HFZG4Lqiit87deGIgrX6NGjXdZNFs3SGy3qJpvacNYG8YwZM1w+dOiQy3oQHws449FGuS6ezaZhw4Yut23bNvU9uimmLuB+6623XNYPD3zta19z+eabb3ZZP1yii5KPHTuWGlN5cEcDAIiKQgMAiIpCAwCIih5NZLn2aPQgNRbkVR8jRoxwuU+fPqnvOXHihMvZFmAWFRW5PHPmTJd1Dv/Xv/61y8ePH//sASOv9KBFfS1Lowu2NevrrX2k+++/3+UvfOELLh85csTlZ5991uXybrrJHQ0AICoKDQAgKgoNACCqWtWj0c+p64aFMfoh119/vcu6SabOeeocvh58hMKlc+5btmxJfc+6detcnjNnjss65659nkceecTlBx54wOV3333X5RUrVrish7mhZtHr57XXXnO5Xbt2Lv/gBz9wedq0aS6vXLnSZd10s6y4owEAREWhAQBERaEBAERVo3s0PXv2dPnuu+92WT+3vmrVKpd37drlcln2jRo/frzLuu6hS5cuLh84cMDll19+2eXt27dnfU4Uhscff9zlp556KvU9Fy5cyJj1GtO9y37yk59kzLfeeqvLmzdvdln3RkPNpj2bZcuWuTx58mSX+/fv73Kv
Xr1cpkcDAChIFBoAQFQUGgBAVDW6R6PnPXz+8593Wc9+WbhwoctPPPGEy+PGjUs9R4cOHVweNGiQy1OmTHFZ5+Qffvhhlzdu3Ogy62iqjxiv1eXLl13WOfYPP/zQZV23pesm6NHUbrt373b5lVdecfmHP/yhy3r+TXlxRwMAiIpCAwCIikIDAIiqRvdo9GwFXVNw3XXXufz3f//3Lk+aNMll7a+YmbVs2dJl7dno3mX6OfQ1a9a4XFJSknoO4BMHDx50ecOGDS6PGTPGZV0X8cEHH7hc2jWNmkv3d1y7dq3LzZs3d7lp06aV8rzc0QAAoqLQAACiotAAAKKqVT2aH//4xy7r2Qs6n33VVVe53K9fv9Rz6LoEPQNe9097/vnnXd6/f3/qMYHPoucXrV+/3uWJEye6rPv96doyejRVJ4Tgsq7jK20vxSRJKnUMui5L12Hp17VH07hxY5fLunaMOxoAQFQUGgBAVBQaAEBUNbpHo3QfMc3ak7n66qtdHjJkSOoxz58/77Kua9Aejc6JnjlzJsOIgcwuXbqU8eu6LqJevXoxh4NP0X7YqFGjXNbzsIqLi1OPce7cuUodk/aB9HrQ82uaNWvmMj0aAEBBotAAAKKi0AAAoqpVPZpsdC80zS+88ELWx2jQoIHLuk6hsj8Xj8qj6xwK8bWqW9f/yY4ePdpl7QscO3bMZdbN5E/nzp1dHjlypMtLlixJ/czq1atjDsnGjh3rcv369V3WdTRNmjRxWXs6n4U7GgBAVBQaAEBUFBoAQFT0aCqZrqtB4dAeTLdu3Vxu3bq1yzr/vGfPnjgDy4Gu9brmmmtc1vNqli5d6jLnHVUdXeOk/Y+hQ4e6rHszmplt27bN5ZMnT+Y0Br3mdV2VnrmlfwO6bqZRo0Y5Pf8nuKMBAERFoQEAREWhAQBERY8GtYbOL8+cOdPl3r17u/zyyy+7XBU9Gp1T79Gjh8v/+I//6HK7du1cXrduncv0ZPJHezS6r6F+fdiwYanH0L3OtmzZ4rKeh6XfX1RU5LKuu+revXvGn9f9106fPp0aY1lwRwMAiIpCAwCIikIDAIgqFOJ+TgCAmoM7GgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFJoyCCE8G0IoDiGcCiFsDSHcn+8xoXBxvaCsQgin5b/LIYSf5XtclS0kSZLvMRS8EMJAM9ueJMn5EEJ/M1tiZrckSbImvyNDIeJ6QXmEEJqY2SEzm5YkybJ8j6cycUdTBkmSbEqS5Pwn8S//9crjkFDAuF5QTveY2WEzW57vgVQ2Ck0ZhRAeDyGcNbMtZlZsZvPzPCQUMK4XlMMXzeyZpAZOMzF1loMQQh0zu97MxpvZj5MkuZjfEaGQcb2grEIIXc1sl5n1TpJkV77HU9m4o8lBkiSXkyRZYWadzexr+R4PChvXC3Iwy8xW1MQiY0ahKa+6xpw7yo7rBdnMMrPf5HsQsVBosgghtAshzAghNA0h1AkhTDGzmWa2KN9jQ+HhekGuQgijzayTmf0+32OJhR5NFiGEtmb2opkNsf8rzHvM7LEkSZ7K68BQkLhekKsQwpNm1jhJknvzPZZYKDQAgKiYOgMAREWhAQBERaEBAERFoQEARFW3Kp8shMAnD/IsSZKQ7zGUVezrpX79+i537tzZ5R/96Ecujx492uUrr7zS5QYNGmR9zosX/eYA+mEczWfPnnX50qVLLh85csTlLVu2uDx/vt/55plnnnH58uXLGcdbna+Xxo0bu68PGjTI5X/4h39weeTIkS63adPG5UOHDrl87Ngxl8+fP++yXg/6uz516pTLBw4cMLVp0yaXS0pKXH777bdd3r9/
v8utWrVyWa/Z/v37u1xUVOTyU0/5D0ueOHEiNcZP+6zrhTsaAEBUFBoAQFRVOnUGFBKdWunatavLOtXSsWNHl+vUqeNyWdak1a2b25+cTu8pnRrp0aOHy/369XNZ/zfo1JpOvVVnOs24d+9el/ft2+dy3759Xe7QoYPLOrWq01A6ddaoUaOM47tw4YLLH330Uep7xo8f7/IVV/h7gxD8TJVO1+k1nm16t7i42OX169e7vGDBgow//1m4owEAREWhAQBERaEBAERFjwa11rlz51zetcsfBbJ48WKX9aOlOmevuSx0Xv7MmTMuaw9G5+R1zr5hw4Yua4/mG9/4hsuDBw92+ac//WmWEVcf2gPRjyevWrXKZX39evbs6bL21/TjzatXr3a5efPmLmtPr169ei7ra21m1q5dO5e1Z6evvz6mXi+atY+1Z8+ejGMuL+5oAABRUWgAAFFRaAAAUdGjQa2lc/i6huAXv/iFyzqH3qlTJ5evueYal0tbV6P/dvr0aZd1LcYDDzzgcrdu3VzOti5C59jbtm3rsq7T0C1NahL93f/pT39yWftbI0aMcLlZs2Yub9682eVvf/vbLmv/Q3/3ev1oP80svZZHx6h9pcmTJ7usPRsd09y5c11+5JFHXNbfUXlxRwMAiIpCAwCIikIDAIiKHg1qLZ2z157NBx984LKuQdD5cl1HUVqPRtdi6N5jY8eOzfgclbWu4RO6tuTkyZOV+viFTLflX7hwocu6l5lu879o0SKXtcen67R0r7U1a9aUfbB/MWTIEJcfeughl/Wa0zG/+eabLj/33HMur1u3LucxlQV3NACAqCg0AICoKDQAgKjy2qNp3769y71793ZZ5xt1Xyg9Fjfbvj7Z6OMfPXo049dL+zf9nDpqDr0edQ5+9+7dLut8ull6bca1117r8nXXXeey7nWV63k22Wzbts1lPaOlJtPX789//rPLzz//fMaf19c7xt++9uj0vKEpU6a4rD083b9Pz5NZtmyZy6W9x1UG7mgAAFFRaAAAUVFoAABR5bVHM3ToUJcfffRRl3XfJ50T/fDDD13Odp62Zv2M+eHDh11euXJlxq+bpc+k0HUIer6IZu0zleXceeSHXj963oj2X+64447UY4wbN87lPn36uKx7U1W2JUuWuKx7Xb3//vtRn7+Q6N+a/i1q/yoftI89bNgwl1u0aJHx57du3eryW2+95XJp72kxcEcDAIiKQgMAiIpCAwCIKq89Gv3cuZ7RrZ9z1/lr7eG0bt3aZT2ro0mTJi7rmgSds33wwQddPn78uKlTp065rHPcr7/+usuvvfaaywcPHnRZ/zehcGgPUM8T+f73v+/y8OHDU49RVFRUqWPS/dl0zl3XSfzhD39wWc+f0f26UHWaNm2a+rerrrrK5XvvvddlfY87ceKEyzt37nQ5X30n7mgAAFFRaAAAUVFoAABR5bVHo/OHTz75pMs33nijy3PmzHH55Zdfdrl+/fouZzvjPVvu0qWLy/qZdrP0mRV6voh+7v2uu+5y+YUXXnBZ1zno2iHkj/ZodB+qAQMGuNyyZcvUY2gfUNfm6Nc//vhjl3X/vS1btri8atUql59++mmX9XqiJ1g49P3GzOz22293WdfNaJ95/fr1Lj/77LMu56sHxx0NACAqCg0AICoKDQAgqrz2aPT8h1/96lcu79+/3+XNmze7rGdu63x348aNXW7WrFnGrOts9HPtpe0rNGjQIJf1PBHt0VxzzTUuZ5v3X7p0qcs7duxwWddRoOro3lh6lofupWeW+xlJ+vpu2rTJZZ2DX7x4scu6TovrpXDomip9LzEzGzt2rMv6nqZ7mS1cuNDl1atXV2SIlYY7GgBAVBQaAEBUFBoAQFR57dHonPZ7773ncklJicvZ5pd1DUK2s2B0/jqb0s4K0TlzXccwcOBAl/WMeJ2D1TNOdE72pZdecnnPnj0ZRozKpD0XvX5+97vfuTxhwoTUY+jeVbrWJtsZSl27dnV5
0qRJLusZTdrXPHTokMu6Vx/nIVUdXZen/VszszZt2ris70F6Db744osuF8rryR0NACAqCg0AICoKDQAgqrz2aJTOgeuakXzTdRNm6b2jNG/YsMHlffv2uZxt3c3KlStd1p4Nqo7uO6Znf+i+Ytq/MzObOnWqy3feeafLupZLzxvp0aOHy926dXN5+vTpLutaM11389xzz7msPRvEo/vMlbaXovZY9PXRtYaFujcidzQAgKgoNACAqCg0AICoCqpHUxPougc9I0f3V9PPxZ87d85lPX9E1wIhfy5duuTy3r17XT579mzqZ3Td0zvvvOOyngmv+19pD0fPI1G6NkN/vl27di7rfoOoPHrezG233eZyaeuudB2Nrot64403XNb3m0LZ2447GgBAVBQaAEBUFBoAQFQUGgBAVHwYoIL0oLK2bdu6rJtmfvnLX3a5U6dOLs+fP99lXXB3/Pjxco0TVU8/yGGW3vRy27ZtLuuC3p49e7qsm3Lef//9LuuHA/TDJn379nX5gQcecLl3796pMaNyNGrUyGXdYFffS8zSH/7RwyKff/55l0tbVF4IuKMBAERFoQEAREWhAQBEVVA9miuu8HWvTp06LutiSF0wp5se5kofXzc01EOqzMwGDBjg8siRI12+8cYbXR4xYoTL69evd/nRRx91WXs0p0+fTo0B1Ydes7oxp77e+nWdx9fDA5s2bZrx+XVBn27keOWVV2b8eZSd9sd0sbZukKqLac3Sf++6KFgfUxd8FwruaAAAUVFoAABRUWgAAFFVaY9GeyA636wb/OmGcjrneeTIEZd1vjrb8+t8ufaE+vTp4/INN9yQesx77rkn48/oQUVLlixx+be//a3Lq1evdplNNKsPXSdRWr9ED67TrGsrhgwZ4rJeX9rXzEb/BvRgLe3hoPz0/Wz8+PEZv67vP2bp97gXXnjB5ZKSkgqMsOpwRwMAiIpCAwCIikIDAIiqSns0RUVFLn/jG99wedq0aS7rvkva09m5c6fL+plzXVejnzHXQ6h0PlTHo/tEmaX3Ftq9e7fL8+bNc/nZZ591+b333ks9JgqTrnPQfch037BRo0alHkN7MtrX0b8RXSdR2jx+ZWrdunXUx6/JtP+lPeUxY8a4rK91aT3mAwcOuLx///6sP1OIuKMBAERFoQEAREWhAQBEVaU9mptvvtnlyZMnu6xrCHTOU3suAwcOdFnXBGj/5MKFCy4PHz48y4g9HY+Z2aZNm1yeO3euy8uWLXNZ+0A653758uWcxoSqo2tM9OwhXeOi++CZpde9ZNvfL9d1Mtno34heb9Vlzr8Qaf9N1wF27NjR5RYtWrisZxWZpd9f9D1NX89CxR0NACAqCg0AICoKDQAgqirt0cyePdvlY8eOuTxjxgyXb7rpJpd1bzKdXz558qTLum/YqlWrMn5/p06dXNY517Fjx5pq1aqVy3qG+1133eXy66+/7rLuXbRhwwaX6dkUDu0R6vWoPTxdI1MepfUFP62ic/RHjx51eeHChS7r+Un4bLq33YQJE1zWdTPaf9N1N2ZmH3zwgcul9XGqA+5oAABRUWgAAFFRaAAAUYWq/Bx2CCGR7L5eXT4T/ml6xvrMmTNdvv32213WtRbvv/++y//0T//k8rvvvuuy9gVylSRJ5kn/AqLXS75pz65r164uf/e733X51ltvTT2GztPruplsPZlstKenc/pr1651Wdd9aY9m69atXC9lpOuqfvGLX7isPV69ntasWZN6zH/5l39xeenSpRUZYnSf9f7CHQ0AICoKDQAgKgoNACCqKl1Ho3uL6VksuhdZdXDo0CGXf/azn7m8ceNGl++77z6XJ06c6PLDDz/s8kMPPeTyrl27XGadTdXR84x27Njhsu5rp2fJmJn179/fZd0/TX9G12YUFxe7rGvR9u3b57Jef2+//bbLW7dudVnX1aDssu1jqOuqtB938ODB1GPWrVulb9HRcEcDAIiKQgMAiIpCAwCIqkonAHUv
n+rYk8lGeyY6b69nsnfv3t3l3r17u6z7vT3xxBMVHCHKS9d56ZqmOXPmuKyvvVl6P71evXq5PGrUKJevuuoql998802Xd+7c6bL28LZv3+6y9hT1fBNUHv1d6/XTsGFDl3UvvdJ+prrijgYAEBWFBgAQFYUGABBVlfZo/vSnP1Xl0xUEnXc9fPiwy3v37nVZzxnX/ZBqypxtTaCvhe4rdvr06dTP6LqXLVu2uLxgwQKXtYeiZyixjqpwHThwwGV9rfS9QXs6Zul1U9UVdzQAgKgoNACAqCg0AICoqrRH0759e5d176gzZ864XNGzV6qCnkej62R0b6s77rjDZd3/Tc8LWb58eUWHiDwp7fotKSnJw0iQDytWrHBZezKrV692Wc8CMjNr1apV5Q8sD7ijAQBERaEBAERFoQEARBVYlwEAiIk7GgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFJoyCCE8G0IoDiGcCiFsDSHcn+8xoXBxvaCsQgin5b/LIYSf5XtclS0kSZLvMRS8EMJAM9ueJMn5EEJ/M1tiZrckSbImvyNDIeJ6QXmEEJqY2SEzm5YkybJ8j6cycUdTBkmSbEqS5Pwn8S//9crjkFDAuF5QTveY2WEzW57vgVQ2Ck0ZhRAeDyGcNbMtZlZsZvPzPCQUMK4XlMMXzeyZpAZOMzF1loMQQh0zu97MxpvZj5MkuZjfEaGQcb2grEIIXc1sl5n1TpJkV77HU9m4o8lBkiSXkyRZYWadzexr+R4PChvXC3Iwy8xW1MQiY0ahKa+6xpw7yo7rBdnMMrPf5HsQsVBosgghtAshzAghNA0h1AkhTDGzmWa2KN9jQ+HhekGuQgijzayTmf0+32OJhR5NFiGEtmb2opkNsf8rzHvM7LEkSZ7K68BQkLhekKsQwpNm1jhJknvzPZZYKDQAgKiYOgMAREWhAQBERaEBAERFoQEARFW3Kp8shMAnD/IsSZKQ7zGUFddLdoMHD3a5a9euLvfu3dvlIUOGuNy3b1+Xt2/f7vKsWbO4XlBmn/X+wh0NACAqCg0AICoKDQAgqirt0QAZclIhAAAMeElEQVSomGbNmrk8ffp0l6dMmeKy9mTq1KmT8fG7detWgdEBpeOOBgAQFYUGABAVhQYAEFVB92jq1avn8vDhw10eOXKky126dHG5YcOGLp87d85lXTPw3nvvubx161aXjx8/nmXEQFyNGjVyedy4cS7369fP5TNnzrjcuHFjl+vW9W8Bbdu2regQgRTuaAAAUVFoAABRUWgAAFEVdI9m4sSJLt90000u6z5P7du3d7lly5Yu6xqCAwcOuLxq1SqXFy5c6PJbb72VGuOhQ4dS/wbE8tFHH7m8du1aly9cuOCy9hk//PBDlzdt2uTyd77zHZevueaaco0Tafr+o2uiOnTokPoZ7blpH/nPf/6zy9qHLhTc0QAAoqLQAACiotAAAKIqqB6Nfqa/R48eLjdp0sTlffv2ubxt2zaXdR1OUVGRy8OGDXN51qxZLvfs2TPj+MzMXnzxRZeThCMxEM+pU6dc/ta3vuWyrrNp0KCBy/Xr13dZezonTpxwed68eeUaJ9Jat27tsva/pk6dmvqZmTNnuvzEE0+4/Nprr7l88OBBl7Und/nyZZf1emjXrl3GMWuP6NixY6kxl4Y7GgBAVBQaAEBUFBoAQFQF1aPRz5lv3LjR5aVLl7qse5XpfLPSz61/7nOfc/mhhx5yeejQoS7r2R5m6TlSnRMFqpKuo8h1XUVxcXFlDqdWa9q0qcvaA9Y1S7ruz8xsz549Lut+j7169XK5pKTEZX0P/fjjjzM+pz6e9sUfe+wxl0tb
W1ga7mgAAFFRaAAAUVFoAABRFVSP5vz58y4vX768Uh//9OnTLv/+9793+f3333f5pz/9qct9+vRJPaZ+Fn7RokUVGSJQpfT8mbvuuitPI6l5br75ZpfHjx/vsq5RuXTpUuoxtKcyYMAAl7XvfPHiRZcnTJjgsu6V17VrV5f1DC/d7zFbH/yzcEcDAIiKQgMAiIpCAwCIqqB6NLHpPmS6b5Se7aE9HZ0vNUvvFQQUMt3vb9KkSS5/8YtfrMrhVGstWrRwWc/Puueee1zWfcR+9atfuVzamhTdW0zfk/T9R3s0+p6lawF/8IMfZHy8559/3mXtY5cVdzQAgKgoNACAqCg0AICoalWPRul85O233+5y8+bNXV61alXqMfbu3Vv5AwP+Qnsquu6lTZs2LusZTDpnr+smvv71r7tc3nUSNcEVV/j/3609GO2xdOvWzWXtb+m+YZs3b3ZZezQ7duxIjUn3TtSei45Zv675tttuc1nPK9J1M7qW8ezZs6kxlgV3NACAqCg0AICoKDQAgKhqdI9Gz0/XOdXRo0e7fMcdd7isZ0G8/fbbqefYuXNnRYaIWkbn1LVPqH0B3UtvzJgxLt9yyy0u9+3bN+PjKz2v5p//+Z9dfvTRRzP+fHWm5181btzY5VtvvdXladOmudyvXz+X27dv77KuOfnjH/+Y8et6lkxZlLa279O0r6Rj1td/y5YtLh89ejTnMZWGOxoAQFQUGgBAVBQaAEBUBd2j0TnUqVOnutyxY0eXdT5az+zu0qWLyz169HBZ5yv/53/+x+V33nknNUY93wH4hPZjzNJ9w+7du7us58rPnDnTZV1Ho38jpT3np+l+f3PnznX5t7/9rcs1qUeja4y0f6F7lenZPDfeeKPL2tPZuHGjy6+//rrL2qMp75qUTEIILuvawMGDB7u8evVql5cuXeqyvieWF3c0AICoKDQAgKgoNACAqAqqR6Pzy61atXJ58uTJLuuaAT0/W+dgO3Xq5PKhQ4dc1jUE+pny0s70zra3FGoPnR/X69fM7Prrr3f5m9/8pss9e/Z0Wa/hunUr909W+wblWctRXWhP5corr3RZ1ygNHTrU5YYNG7qs7w/PPfecy7qX2fHjx13Wfll5aI9uwoQJLk+ZMsVl3W/t8ccfd3nbtm0VHlNpuKMBAERFoQEAREWhAQBEVVA9Gp3j1nUxelaGntWg/RGdv9THa9Kkicvf+c53XH7zzTddLu1Mb93/bP/+/S5fvnw59TOomXTd1sCBA1Pf87d/+7cu69oMpX8TOq9/+PBhl3XvK+0TaZ/xyJEjLtek61V/d3q2z5AhQ1y+6qqrXG7ZsqXL2tPVHsxLL73kclX0ZHTMn//8513WHvLs2bNd3rRpU4XHVBbc0QAAoqLQAACiotAAAKKi0AAAoiqoDwNoI/LAgQMuf+9738v487qgShdk6UFnd955p8ujRo1y+Utf+pLLM2bMSD2nLnBas2aNy7qxnuYYG+shP7Txfu2116a+5+67787pMfUDMGvXrnV58eLFLuumm7pgb9euXS5rgzvbQVrVib4fdO3a1eXp06dn/Lr+bl599VWXFy1a5LIelFgZ9AMNuuhcm/9XX321y6+99prLS5YsqbzB5YA7GgBAVBQaAEBUFBoAQFQF1aOpKD2ETBdPzps3z+WVK1e63KFDB5f79Onjsm5YZ2Y2bNgwl6dNm+aybsw3fPhwl3/5y1+6rH0pVB/ao+nWrVvqe7ItwNRreO/evS4/+eSTLs+fP9/lcePGuazXmy46Li4udrkm9Wh0ceP58+ddbtCggctnzpxxWfsZTz/9tMtbt26t4Aiz0+vlpptucvmv//qvXdbXU/tKp0+frsTRlR13NACAqCg0AICoKDQAgKhqVI9G6bqcU6dOZcy7d+92WQ8J2rlzZ+o5tEczduxYl3XjPl3XsG/fPpf/+7//O/UcqB6OHj3qsq6zMEvPkes6Ge3JvPvuuy5v377dZT0ITdeC6Eayutbj
3LlzqTHWFNqT+eCDD1x++OGHXdbfhR5spq9dZWyS+Wl68KNZuu93yy23uKyHPeqmmevWrauk0VUMdzQAgKgoNACAqCg0AICoanSPJlfa0zl27JjLS5cuTf2MzqHrXlK6t5X2bKZOneryihUrXNZ5YhQuvV5Wr16d+h7tA+phfXqYn/Z99CAr3a9PD1bTvfh07VhJSUlqjDWF/m5Pnjzp8jvvvJPx57V/Vtk9GVXa43/hC19wuXfv3i7r3ne6bkbXBuULdzQAgKgoNACAqCg0AICo6NFUkM5x//GPf3RZ967SdQ19+/Z1WdfZ0KOpPvRsIV0T81n/lon2ZMaMGePy5z73OZd1v77f/OY3LmvPpjbTdTZVTV/bAQMGpL5H9zbTtTYLFy50edWqVZU0usrFHQ0AICoKDQAgKgoNACAqejSVTNfiLFiwwGXdi0r3Mho8eHCcgaFa0Dn4jh07uvylL33JZZ3XX7x4scvLly93OV/nkSB9tkyTJk1cvu+++1I/c/XVV7us5w+99NJLlTS6uLijAQBERaEBAERFoQEAREWPporp+TO6Vxpz6LWLnmuv66gefPBBlydNmuSyrqN4/PHHXd60aVNFh4hK0qhRI5cnTJjg8sSJE1M/s2HDBpefeeYZl3V/vULFHQ0AICoKDQAgKgoNACAqejRVTNfN6LztwYMHq3I4qIAGDRpkzA0bNkz9TNu2bV3u16+fy9OnT3e5c+fOLv/bv/2by6+88orLO3fudDnf+3nh/+lr/+1vf9vlS5cupX5Ge3Dr16+v/IFVAe5oAABRUWgAAFFRaAAAUdGjiaxnz54ujxgxwuWmTZu6rJ+bR/5o/0xfy169ermsZw01a9Ys9ZijRo1yedq0aS7rufGzZ892WXsyW7dudfnixYup50R+6PXx7//+7y53797d5Tlz5qQeQ3s0euZRdcEdDQAgKgoNACAqCg0AICp6NDnQfanMzNq0aeOy7lU1evRol/Wz9KtXr3aZHk3+6HkhrVu3dllf26FDh7rcvHlzl1u2bJl6Dl0Xo32gefPmufzLX/7SZe3J6PlHyJ/27du7PG7cOJdvuOEGl3fs2OGyvvZmZtu3b6+k0eUXdzQAgKgoNACAqCg0AICoqnWPpm5dP/z+/fu7rOsadO8p3XesqKjI5RYtWris57ebpefc9TF0/6J169a5vGjRIpePHj2aeg5UDb1edN3M3/3d37ncoUMHl7WHV9prqa//ggULXF6+fLnLu3fvdpmeTOFo3Lixy8OGDXP5K1/5isunTp1y+ec//7nLr776auo5dF1VdcUdDQAgKgoNACAqCg0AIKpQU+YAAQCFiTsaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFQUGgBAVBQaAEBUFBoAQFT/CxFWnMifWD0bAAAAAElFTkSuQmCC\n", "text/plain": [ "
" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "data = ImageDataBunch.from_folder(untar_data(URLs.MNIST_SAMPLE),\n", " ds_tfms=tfms, size=24)\n", "data.normalize()\n", "data.show_batch(rows=3, figsize=(6,6))" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

get_annotations[source]

\n", "\n", "> get_annotations(**`fname`**, **`prefix`**=***`None`***)\n", "\n", "Open a COCO style json in `fname` and returns the lists of filenames (with maybe `prefix`) and labelled bboxes. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(get_annotations)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To use this dataset and collate samples into batches, you'll need to following function:" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

bb_pad_collate[source]

\n", "\n", "> bb_pad_collate(**`samples`**:`BatchSamples`, **`pad_idx`**:`int`=***`0`***) → `Tuple`\\[`FloatTensor`, `Tuple`\\[`LongTensor`, `LongTensor`\\]\\]\n", "\n", "Function that collect `samples` of labelled bboxes and adds padding with `pad_idx`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(bb_pad_collate)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Finally, to apply transformations to [`Image`](/vision.image.html#Image) in a [`Dataset`](https://pytorch.org/docs/stable/data.html#torch.utils.data.Dataset), we use this last class." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## ItemList specific to vision" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The vision application adds a few subclasses of [`ItemList`](/data_block.html#ItemList) specific to images." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

class ImageList[source]

\n", "\n", "> ImageList(**\\*`args`**, **`convert_mode`**=***`'RGB'`***, **\\*\\*`kwargs`**) :: [`ItemList`](/data_block.html#ItemList)\n", "\n", "[`ItemList`](/data_block.html#ItemList) suitable for computer vision. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageList, title_level=3)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Create a [`ItemList`](/data_block.html#ItemList) in `path` from filenames in `items`. `create_func` will default to [`open_image`](/vision.image.html#open_image). `label_cls` can be specified for the labels, `xtra` contains any extra information (usually in the form of a dataframe) and `processor` is applied to the [`ItemList`](/data_block.html#ItemList) after splitting and labelling." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

from_folder[source]

\n", "\n", "> from_folder(**`path`**:`PathOrStr`=***`'.'`***, **`extensions`**:`StrList`=***`None`***, **\\*\\*`kwargs`**) → [`ItemList`](/data_block.html#ItemList)\n", "\n", "Get the list of files in `path` that have an image suffix. `recurse` determines if we search subfolders. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageList.from_folder)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

from_df[source]

\n", "\n", "> from_df(**`df`**:`DataFrame`, **`path`**:`PathOrStr`, **`cols`**:`IntsOrStrs`=***`0`***, **`folder`**:`PathOrStr`=***`None`***, **`suffix`**:`str`=***`''`***, **\\*\\*`kwargs`**) → `ItemList`\n", "\n", "Get the filenames in `cols` of `df` with `folder` in front of them, `suffix` at the end. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageList.from_df)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

get_image_files[source]

\n", "\n", "> get_image_files(**`c`**:`PathOrStr`, **`check_ext`**:`bool`=***`True`***, **`recurse`**=***`False`***) → `FilePathList`\n", "\n", "Return list of files in `c` that are images. `check_ext` will filter to `image_extensions`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(get_image_files)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

open[source]

\n", "\n", "> open(**`fn`**)\n", "\n", "Open image in `fn`, subclass and overwrite for custom behavior. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageList.open)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

show_xys[source]

\n", "\n", "> show_xys(**`xs`**, **`ys`**, **`imgsize`**:`int`=***`4`***, **`figsize`**:`Optional`\\[`Tuple`\\[`int`, `int`\\]\\]=***`None`***, **\\*\\*`kwargs`**)\n", "\n", "Show the `xs` (inputs) and `ys` (targets) on a figure of `figsize`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageList.show_xys)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

show_xyzs[source]

\n", "\n", "> show_xyzs(**`xs`**, **`ys`**, **`zs`**, **`imgsize`**:`int`=***`4`***, **`figsize`**:`Optional`\\[`Tuple`\\[`int`, `int`\\]\\]=***`None`***, **\\*\\*`kwargs`**)\n", "\n", "Show `xs` (inputs), `ys` (targets) and `zs` (predictions) on a figure of `figsize`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageList.show_xyzs)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

class ObjectCategoryList[source]

\n", "\n", "> ObjectCategoryList(**`items`**:`Iterator`\\[`T_co`\\], **`classes`**:`Collection`\\[`T_co`\\]=***`None`***, **`label_delim`**:`str`=***`None`***, **`one_hot`**:`bool`=***`False`***, **\\*\\*`kwargs`**) :: [`MultiCategoryList`](/data_block.html#MultiCategoryList)\n", "\n", "[`ItemList`](/data_block.html#ItemList) for labelled bounding boxes. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ObjectCategoryList, title_level=3)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

class ObjectItemList[source]

\n", "\n", "> ObjectItemList(**\\*`args`**, **`convert_mode`**=***`'RGB'`***, **\\*\\*`kwargs`**) :: [`ImageList`](/vision.data.html#ImageList)\n", "\n", "[`ItemList`](/data_block.html#ItemList) suitable for object detection. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ObjectItemList, title_level=3)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

class SegmentationItemList[source]

\n", "\n", "> SegmentationItemList(**\\*`args`**, **`convert_mode`**=***`'RGB'`***, **\\*\\*`kwargs`**) :: [`ImageList`](/vision.data.html#ImageList)\n", "\n", "[`ItemList`](/data_block.html#ItemList) suitable for segmentation tasks. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(SegmentationItemList, title_level=3)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

class SegmentationLabelList[source]

\n", "\n", "> SegmentationLabelList(**`items`**:`Iterator`\\[`T_co`\\], **`classes`**:`Collection`\\[`T_co`\\]=***`None`***, **\\*\\*`kwargs`**) :: [`ImageList`](/vision.data.html#ImageList)\n", "\n", "[`ItemList`](/data_block.html#ItemList) for segmentation masks. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(SegmentationLabelList, title_level=3)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

class PointsLabelList[source]

\n", "\n", "> PointsLabelList(**`items`**:`Iterator`\\[`T_co`\\], **`path`**:`PathOrStr`=***`'.'`***, **`label_cls`**:`Callable`=***`None`***, **`inner_df`**:`Any`=***`None`***, **`processor`**:`Union`\\[[`PreProcessor`](/data_block.html#PreProcessor), `Collection`\\[[`PreProcessor`](/data_block.html#PreProcessor)\\]\\]=***`None`***, **`x`**:`ItemList`=***`None`***, **`ignore_empty`**:`bool`=***`False`***) :: [`ItemList`](/data_block.html#ItemList)\n", "\n", "[`ItemList`](/data_block.html#ItemList) for points. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(PointsLabelList, title_level=3)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

class PointsItemList[source]

\n", "\n", "> PointsItemList(**\\*`args`**, **`convert_mode`**=***`'RGB'`***, **\\*\\*`kwargs`**) :: [`ImageList`](/vision.data.html#ImageList)\n", "\n", "[`ItemList`](/data_block.html#ItemList) for [`Image`](/vision.image.html#Image) to [`ImagePoints`](/vision.image.html#ImagePoints) tasks. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(PointsItemList, title_level=3)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

class ImageImageList[source]

\n", "\n", "> ImageImageList(**\\*`args`**, **`convert_mode`**=***`'RGB'`***, **\\*\\*`kwargs`**) :: [`ImageList`](/vision.data.html#ImageList)\n", "\n", "[`ItemList`](/data_block.html#ItemList) suitable for [`Image`](/vision.image.html#Image) to [`Image`](/vision.image.html#Image) tasks. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageImageList, title_level=3)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Building your own dataset" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This module also contains a few helper functions to allow you to build you own dataset for image classification." ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

download_images[source]

\n", "\n", "> download_images(**`urls`**:`StrList`, **`dest`**:`PathOrStr`, **`max_pics`**:`int`=***`1000`***, **`max_workers`**:`int`=***`8`***, **`timeout`**=***`4`***)\n", "\n", "Download images listed in text file `urls` to path `dest`, at most `max_pics` \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(download_images)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

verify_images[source]

\n", "\n", "> verify_images(**`path`**:`PathOrStr`, **`delete`**:`bool`=***`True`***, **`max_workers`**:`int`=***`4`***, **`max_size`**:`int`=***`None`***, **`recurse`**:`bool`=***`False`***, **`dest`**:`PathOrStr`=***`'.'`***, **`n_channels`**:`int`=***`3`***, **`interp`**=***`2`***, **`ext`**:`str`=***`None`***, **`img_format`**:`str`=***`None`***, **`resume`**:`bool`=***`None`***, **\\*\\*`kwargs`**)\n", "\n", "Check if the images in `path` aren't broken, maybe resize them and copy it in `dest`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(verify_images)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "It will try if every image in this folder can be opened and has `n_channels`. If `n_channels` is 3 – it'll try to convert image to RGB. If `delete=True`, it'll be removed it this fails. If `resume` – it will skip already existent images in `dest`. If `max_size` is specified, image is resized to the same ratio so that both sizes are less than `max_size`, using `interp`. Result is stored in `dest`, `ext` forces an extension type, `img_format` and `kwargs` are passed to PIL.Image.save. Use `max_workers` CPUs." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Undocumented Methods - Methods moved below this line will intentionally be hidden" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

get[source]

\n", "\n", "> get(**`i`**)\n", "\n", "Subclass if you want to customize how to create item `i` from `self.items`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(PointsItemList.get)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

new[source]

\n", "\n", "> new(**`items`**:`Iterator`\\[`T_co`\\], **`processor`**:`Union`\\[[`PreProcessor`](/data_block.html#PreProcessor), `Collection`\\[[`PreProcessor`](/data_block.html#PreProcessor)\\]\\]=***`None`***, **\\*\\*`kwargs`**) → `ItemList`\n", "\n", "Create a new [`ItemList`](/data_block.html#ItemList) from `items`, keeping the same attributes. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(SegmentationLabelList.new)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

from_csv[source]

\n", "\n", "> from_csv(**`path`**:`PathOrStr`, **`csv_name`**:`str`, **`header`**:`str`=***`'infer'`***, **\\*\\*`kwargs`**) → `ItemList`\n", "\n", "Get the filenames in `path/csv_name` opened with `header`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageList.from_csv)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

get[source]

\n", "\n", "> get(**`i`**)\n", "\n", "Subclass if you want to customize how to create item `i` from `self.items`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ObjectCategoryList.get)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

get[source]

\n", "\n", "> get(**`i`**)\n", "\n", "Subclass if you want to customize how to create item `i` from `self.items`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageList.get)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

reconstruct[source]

\n", "\n", "> reconstruct(**`t`**:`Tensor`)\n", "\n", "Reconstruct one of the underlying item for its data `t`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(SegmentationLabelList.reconstruct)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

show_xys[source]

\n", "\n", "> show_xys(**`xs`**, **`ys`**, **`imgsize`**:`int`=***`4`***, **`figsize`**:`Optional`\\[`Tuple`\\[`int`, `int`\\]\\]=***`None`***, **\\*\\*`kwargs`**)\n", "\n", "Show the `xs` (inputs) and `ys`(targets) on a figure of `figsize`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageImageList.show_xys)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

show_xyzs[source]

\n", "\n", "> show_xyzs(**`xs`**, **`ys`**, **`zs`**, **`imgsize`**:`int`=***`4`***, **`figsize`**:`Optional`\\[`Tuple`\\[`int`, `int`\\]\\]=***`None`***, **\\*\\*`kwargs`**)\n", "\n", "Show `xs` (inputs), `ys` (targets) and `zs` (predictions) on a figure of `figsize`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageImageList.show_xyzs)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

open[source]

\n", "\n", "> open(**`fn`**)\n", "\n", "Open image in `fn`, subclass and overwrite for custom behavior. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageList.open)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

analyze_pred[source]

\n", "\n", "> analyze_pred(**`pred`**:`Tensor`)\n", "\n", "Called on `pred` before `reconstruct` for additional preprocessing. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(PointsItemList.analyze_pred)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

analyze_pred[source]

\n", "\n", "> analyze_pred(**`pred`**, **`thresh`**:`float`=***`0.5`***)\n", "\n", "Called on `pred` before `reconstruct` for additional preprocessing. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(SegmentationLabelList.analyze_pred)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

reconstruct[source]

\n", "\n", "> reconstruct(**`t`**:`Tensor`)\n", "\n", "Reconstruct one of the underlying item for its data `t`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(PointsItemList.reconstruct)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

open[source]

\n", "\n", "> open(**`fn`**)\n", "\n", "Open image in `fn`, subclass and overwrite for custom behavior. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(SegmentationLabelList.open)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

reconstruct[source]

\n", "\n", "> reconstruct(**`t`**:`Tensor`)\n", "\n", "Reconstruct one of the underlying item for its data `t`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ImageList.reconstruct)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

resize_to[source]

\n", "\n", "> resize_to(**`img`**, **`targ_sz`**:`int`, **`use_min`**:`bool`=***`False`***)\n", "\n", "Size to resize to, to hit `targ_sz` at same aspect ratio, in PIL coords (i.e w*h) \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(resize_to)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

reconstruct[source]

\n", "\n", "> reconstruct(**`t`**, **`x`**)\n", "\n", "Reconstruct one of the underlying item for its data `t`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ObjectCategoryList.reconstruct)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

reconstruct[source]

\n", "\n", "> reconstruct(**`t`**, **`x`**)\n", "\n", "Reconstruct one of the underlying item for its data `t`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(PointsLabelList.reconstruct)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

analyze_pred[source]

\n", "\n", "> analyze_pred(**`pred`**, **`thresh`**:`float`=***`0.5`***)\n", "\n", "Called on `pred` before `reconstruct` for additional preprocessing. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(PointsLabelList.analyze_pred)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/markdown": [ "

get[source]

\n", "\n", "> get(**`i`**)\n", "\n", "Subclass if you want to customize how to create item `i` from `self.items`. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(PointsLabelList.get)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## New Methods - Please document or move to the undocumented section" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

analyze_pred[source]

\n", "\n", "> analyze_pred(**`pred`**)\n", "\n", "Called on `pred` before `reconstruct` for additional preprocessing. \n", "\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(ObjectCategoryList.analyze_pred)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [] } ], "metadata": { "jekyll": { "keywords": "fastai", "summary": "Basic dataset for computer vision and helper function to get a DataBunch", "title": "vision.data" }, "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" } }, "nbformat": 4, "nbformat_minor": 2 }