{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "## Customizing datasets in fastai" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [], "source": [ "from fastai.gen_doc.nbdoc import *\n", "from fastai.vision import *" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In this tutorial, we'll see how to create custom subclasses of [`ItemBase`](/core.html#ItemBase) or [`ItemList`](/data_block.html#ItemList) while retaining everything the fastai library has to offer. To allow basic functions to work consistently across various applications, the fastai library delegates several tasks to one of those specific objects, and we'll see here which methods you have to implement to have everything work properly. But first let's take a step back to see where you'll use your end result." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Links with the data block API" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The data block API works by allowing you to pick a class that is responsible for getting your items and another class that is in charge of getting your targets. Together, they create a pytorch [`Dataset`](https://pytorch.org/docs/stable/data.html#torch.utils.data.Dataset) that is then wrapped inside a [`DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader). The training set, validation set and optionally a test set are then all put in a [`DataBunch`](/basic_data.html#DataBunch).\n", "\n", "The data block API allows you to mix and match what class your inputs have, what class your targets have, how to do the split between train and validation set, then how to create the [`DataBunch`](/basic_data.html#DataBunch), but if you have a very specific kind of input/target, the fastai classes might not be sufficient for you. This tutorial explains what is needed to create a new class of items and what methods are important to implement or override.\n", "\n", "We'll proceed in two phases: first we focus on what you need to create a custom [`ItemBase`](/core.html#ItemBase) class (which is the type of your inputs/targets), then on how to create your custom [`ItemList`](/data_block.html#ItemList) (which is basically a set of [`ItemBase`](/core.html#ItemBase)), while highlighting which methods are called by the library." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Creating a custom [`ItemBase`](/core.html#ItemBase) subclass" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The fastai library contains three basic types of [`ItemBase`](/core.html#ItemBase) that you might want to subclass:\n", "- [`Image`](/vision.image.html#Image) for vision applications\n", "- [`Text`](/text.data.html#Text) for text applications\n", "- [`TabularLine`](/tabular.data.html#TabularLine) for tabular applications\n", "\n", "Whether you decide to create your own item class or to subclass one of the above, here is what you need to implement:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Basic attributes" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "These are the most important attributes your custom [`ItemBase`](/core.html#ItemBase) needs, as they're used everywhere in the fastai library:\n", "- `ItemBase.data` is the thing that is passed to pytorch when you want to create a [`DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader). This is what needs to be fed to your model. 
Note that it might be different from the representation of your item, since for display purposes you might want something more understandable.\n", "- `__str__` representation: if applicable, this is what will be displayed when the fastai library has to show your item.\n", "\n", "Take a [`MultiCategory`](/core.html#MultiCategory) object `o` for instance:\n", "- `o.data` is a tensor where the tags are one-hot encoded\n", "- `str(o)` returns the tags separated by `;`\n", "\n", "If you want to code the way data augmentation should be applied to your custom `Item`, you should write an `apply_tfms` method. This is what will be called if you apply a [`transform`](/vision.transform.html#vision.transform) block in the data block API." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Example: ImageTuple" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "For cycleGANs, we need to create a custom type of item since we feed the model tuples of images. Let's look at how to code this. The first step is to code the [`data`](/vision.data.html#vision.data) attribute, since that is what will be given to the model. Note that we still keep track of the initial object (usually in an `obj` attribute) to be able to show nice representations later on. Here the object is the tuple of images, and the data is their underlying tensors normalized between -1 and 1." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "class ImageTuple(ItemBase):\n", "    def __init__(self, img1, img2):\n", "        self.img1,self.img2 = img1,img2\n", "        self.obj,self.data = (img1,img2),[-1+2*img1.data,-1+2*img2.data]" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Then we want to apply data augmentation to our tuple of images. That's done by writing an `apply_tfms` method as we saw before. Here we pass that call along to the two underlying images, then update the data." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "    def apply_tfms(self, tfms, **kwargs):\n", "        self.img1 = self.img1.apply_tfms(tfms, **kwargs)\n", "        self.img2 = self.img2.apply_tfms(tfms, **kwargs)\n", "        self.data = [-1+2*self.img1.data,-1+2*self.img2.data]\n", "        return self" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We define one last method to stack the two images next to each other, which we will use later for a customized `show_batch` / `show_results` behavior." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "    def to_one(self): return Image(0.5+torch.cat(self.data,2)/2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is all you need to create your custom [`ItemBase`](/core.html#ItemBase). You won't be able to use it until you have put it inside your custom [`ItemList`](/data_block.html#ItemList) though, so you should continue reading the next section." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Creating a custom [`ItemList`](/data_block.html#ItemList) subclass" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is the main class that allows you to group your inputs or your targets in the data block API. You can then use any of the splitting or labelling methods before creating a [`DataBunch`](/basic_data.html#DataBunch). To make sure everything works properly, here is what you need to know."
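, "\n", "\n", "As a quick reminder of where that end result fits, here is a rough sketch of a standard data block pipeline using the built-in [`ImageList`](/vision.data.html#ImageList); the folder layout (a `path` pointing at images organized in class subfolders), the transforms and the batch size are placeholder assumptions, not something defined in this tutorial:\n", "``` python\n", "data = (ImageList.from_folder(path)        # where to find the items\n", "        .split_by_rand_pct(0.2)            # how to split them between training and validation\n", "        .label_from_folder()               # how to label them\n", "        .transform(get_transforms(), size=224)\n", "        .databunch(bs=64))                 # wrap everything in a DataBunch\n", "```\n", "A custom [`ItemList`](/data_block.html#ItemList) is meant to drop into exactly this kind of pipeline, which is why the class variables and methods described below matter."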
] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Class variables" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Whether you're directly subclassing [`ItemList`](/data_block.html#ItemList) or one of the particular fastai ones, make sure you know the content of the following three variables, as you may need to adjust them:\n", "- `_bunch` contains the class that will be used to create a [`DataBunch`](/basic_data.html#DataBunch) \n", "- `_processor` contains a class (or a list of classes) of [`PreProcessor`](/data_block.html#PreProcessor) that will then be used as the default to create a processor for this [`ItemList`](/data_block.html#ItemList)\n", "- `_label_cls` contains the class that will be used to create the labels by default\n", "\n", "`_label_cls` is the first to be used in the data block API, in the labelling function. If this variable is set to `None`, the label class will be set to [`CategoryList`](/data_block.html#CategoryList), [`MultiCategoryList`](/data_block.html#MultiCategoryList) or [`FloatList`](/data_block.html#FloatList) depending on the type of the first item. The default can be overridden by passing a `label_cls` in the kwargs of the labelling function.\n", "\n", "`_processor` is the second to be used. The processors are called at the end of the labelling to apply some kind of function on your items. The default processor of the inputs can be overridden by passing a `processor` in the kwargs when creating the [`ItemList`](/data_block.html#ItemList), and the default processor of the targets can be overridden by passing a `processor` in the kwargs of the labelling function. \n", "\n", "Processors are useful for pre-processing some data, but you also need to put in their state any variable you want to save for the call of `data.export()` before creating a [`Learner`](/basic_train.html#Learner) object for inference: the state of the [`ItemList`](/data_block.html#ItemList) isn't saved there, only its processors are. For instance `SegmentationProcessor`'s only reason to exist is to save the dataset classes, and during the process call, it doesn't do anything apart from setting the `classes` and `c` attributes of its dataset.\n", "``` python\n", "class SegmentationProcessor(PreProcessor):\n", "    def __init__(self, ds:ItemList): self.classes = ds.classes\n", "    def process(self, ds:ItemList): ds.classes,ds.c = self.classes,len(self.classes)\n", "```\n", "\n", "`_bunch` is the last class variable used in the data block API. When you type the final `databunch()`, the data block API calls the `create` method of the `_bunch` of the inputs. " ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Keeping \\_\\_init\\_\\_ arguments" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If you pass additional arguments in your `__init__` call that you save in the state of your [`ItemList`](/data_block.html#ItemList), you have to make sure they are also passed along in the `new` method, as this one is used to create your training and validation sets when splitting. To do that, you just have to add their names to the `copy_new` attribute of your custom [`ItemList`](/data_block.html#ItemList), preferably during the `__init__`. 
Here we will need two collections of filenames (for the two types of images) so we make sure the second one is copied like this:\n", "\n", "```python\n", "def __init__(self, items, itemsB=None, **kwargs):\n", "    super().__init__(items, **kwargs)\n", "    self.itemsB = itemsB\n", "    self.copy_new.append('itemsB')\n", "```\n", "\n", "Be sure to keep the kwargs as is, as they contain all the additional stuff you can pass to an [`ItemList`](/data_block.html#ItemList)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Important methods" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### - get" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The most important method you have to implement is `get`: this one will enable your custom [`ItemList`](/data_block.html#ItemList) to generate an [`ItemBase`](/core.html#ItemBase) from the thing stored in its `items` array. For instance an [`ImageList`](/vision.data.html#ImageList) has the following `get` method:\n", "``` python\n", "def get(self, i):\n", "    fn = super().get(i)\n", "    res = self.open(fn)\n", "    self.sizes[i] = res.size\n", "    return res\n", "```\n", "The first line basically looks at `self.items[i]` (which is a filename). The second line opens it, since the `open` method is just\n", "``` python\n", "def open(self, fn): return open_image(fn)\n", "```\n", "The third line is there for [`ImagePoints`](/vision.image.html#ImagePoints) or [`ImageBBox`](/vision.image.html#ImageBBox) targets that require the size of the input [`Image`](/vision.image.html#Image) to be created. Note that if you are building a custom target class and you need the size of an image, you should call `self.x.size[i]`. " ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "
Note: If you just want to customize the way an `Image` is opened, subclass `Image` and just change the\n", "`open` method.
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "jekyll_note(\"\"\"If you just want to customize the way an `Image` is opened, subclass `Image` and just change the\n", "`open` method.\"\"\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### - reconstruct" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is the method that is called in `data.show_batch()`, `learn.predict()` or `learn.show_results()` to transform a pytorch tensor back into an [`ItemBase`](/core.html#ItemBase). In a way, it does the opposite of calling `ItemBase.data`. It should take a tensor `t` and return the same kind of thing as the `get` method.\n", "\n", "In some situations ([`ImagePoints`](/vision.image.html#ImagePoints), [`ImageBBox`](/vision.image.html#ImageBBox) for instance) you need to have a look at the corresponding input to rebuild your item. In this case, you should have a second argument called `x` (don't change that name). For instance, here is the `reconstruct` method of [`PointsItemList`](/vision.data.html#PointsItemList):\n", "```python\n", "def reconstruct(self, t, x): return ImagePoints(FlowField(x.size, t), scale=False)\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "#### - analyze_pred" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This is the method that is called in `learn.predict()` or `learn.show_results()` to transform predictions into an output tensor suitable for `reconstruct`. For instance we may need to take the argmax (for [`Category`](/core.html#Category)) or keep the predictions greater than a certain threshold (for [`MultiCategory`](/core.html#MultiCategory)). It should take a tensor, along with optional kwargs, and return a tensor.\n", "\n", "For instance, here is the `analyze_pred` method of [`MultiCategoryList`](/data_block.html#MultiCategoryList):\n", "```python\n", "def analyze_pred(self, pred, thresh:float=0.5): return (pred >= thresh).float()\n", "```\n", "`thresh` can then be passed as a kwarg during the calls to `learn.predict()` or `learn.show_results()`." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Advanced show methods" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If you want to use methods such as `data.show_batch()` or `learn.show_results()` with a brand new kind of [`ItemBase`](/core.html#ItemBase) you will need to implement two other methods. In both cases, the generic function will grab the tensors of inputs, targets and predictions (if applicable), reconstruct the corresponding [`ItemBase`](/core.html#ItemBase) (as seen before), but it will delegate to the [`ItemList`](/data_block.html#ItemList) the way to display the results.\n", "\n", "``` python\n", "def show_xys(self, xs, ys, **kwargs)->None:\n", "\n", "def show_xyzs(self, xs, ys, zs, **kwargs)->None:\n", "```\n", "In both cases `xs` and `ys` represent the inputs and the targets, and in the second case `zs` represents the predictions. They are lists of the same length, which depends on the `rows` argument you passed. The kwargs are passed from `data.show_batch()` / `learn.show_results()`. As an example, here is the source code of those methods in [`ImageList`](/vision.data.html#ImageList):\n", "\n", "``` python\n", "def show_xys(self, xs, ys, figsize:Tuple[int,int]=(9,10), **kwargs):\n", "    \"Show the `xs` and `ys` on a figure of `figsize`. 
`kwargs` are passed to the show method.\"\n", "    rows = int(math.sqrt(len(xs)))\n", "    fig, axs = plt.subplots(rows,rows,figsize=figsize)\n", "    for i, ax in enumerate(axs.flatten() if rows > 1 else [axs]):\n", "        xs[i].show(ax=ax, y=ys[i], **kwargs)\n", "    plt.tight_layout()\n", "\n", "def show_xyzs(self, xs, ys, zs, figsize:Tuple[int,int]=None, **kwargs):\n", "    \"\"\"Show `xs` (inputs), `ys` (targets) and `zs` (predictions) on a figure of `figsize`.\n", "    `kwargs` are passed to the show method.\"\"\"\n", "    figsize = ifnone(figsize, (6,3*len(xs)))\n", "    fig,axs = plt.subplots(len(xs), 2, figsize=figsize)\n", "    fig.suptitle('Ground truth / Predictions', weight='bold', size=14)\n", "    for i,(x,y,z) in enumerate(zip(xs,ys,zs)):\n", "        x.show(ax=axs[i,0], y=y, **kwargs)\n", "        x.show(ax=axs[i,1], y=z, **kwargs)\n", "```\n", "\n", "Linked to this method is the class variable `_show_square` of an [`ItemList`](/data_block.html#ItemList). It defaults to `False` but if it's `True`, the `show_batch` method will send `rows * rows` `xs` and `ys` to `show_xys` (so that it shows a square of inputs/targets), like here for images." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Example: ImageTupleList" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Continuing our custom item example, we create a custom [`ItemList`](/data_block.html#ItemList) class that will wrap those `ImageTuple`s properly. The first thing is to write a custom `__init__` method (since we need a second list of filenames here), which means we also have to make sure that extra state is passed along by the `new` method (via `copy_new`, as explained above)." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "class ImageTupleList(ImageList):\n", "    def __init__(self, items, itemsB=None, **kwargs):\n", "        super().__init__(items, **kwargs)\n", "        self.itemsB = itemsB\n", "        self.copy_new.append('itemsB')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We then specify how to get one item. Here we use the image from the first list of items and pick one randomly from the second list." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "    def get(self, i):\n", "        img1 = super().get(i)\n", "        fn = self.itemsB[random.randint(0, len(self.itemsB)-1)]\n", "        return ImageTuple(img1, open_image(fn))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "We also add a custom factory method to directly create an `ImageTupleList` from two folders." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "    @classmethod\n", "    def from_folders(cls, path, folderA, folderB, **kwargs):\n", "        itemsB = ImageList.from_folder(path/folderB).items\n", "        res = super().from_folder(path/folderA, itemsB=itemsB, **kwargs)\n", "        res.path = path\n", "        return res" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Finally, we have to specify how to reconstruct the `ImageTuple` from tensors if we want `show_batch` to work. We recreate the images and denormalize." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "    def reconstruct(self, t:Tensor):\n", "        return ImageTuple(Image(t[0]/2+0.5),Image(t[1]/2+0.5))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "There is no need to write an `analyze_pred` method since the default behavior (returning the output tensor) is what we need here. 
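As a point of reference, the default `analyze_pred` of the base [`ItemList`](/data_block.html#ItemList) amounts to an identity function, something like this (paraphrased, not the exact library source):\n", "``` python\n", "def analyze_pred(self, pred): return pred  # pass the raw prediction tensor through unchanged\n", "```\n", "\n", "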
However `show_results` won't work properly unless the target (which we don't really care about here) has the right `reconstruct` method: the fastai library uses the `reconstruct` method of the target on the outputs. That's why we create another custom [`ItemList`](/data_block.html#ItemList) with just that `reconstruct` method. The first line is to reconstruct our dummy targets, and the second one is the same as in `ImageTupleList`." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "class TargetTupleList(ItemList):\n", "    def reconstruct(self, t:Tensor):\n", "        if len(t.size()) == 0: return t\n", "        return ImageTuple(Image(t[0]/2+0.5),Image(t[1]/2+0.5))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To make sure our `ImageTupleList` uses that for labelling, we pass it in `_label_cls` and this is what the result looks like." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "class ImageTupleList(ImageList):\n", "    _label_cls=TargetTupleList\n", "    def __init__(self, items, itemsB=None, **kwargs):\n", "        super().__init__(items, **kwargs)\n", "        self.itemsB = itemsB\n", "        self.copy_new.append('itemsB')\n", "    \n", "    def get(self, i):\n", "        img1 = super().get(i)\n", "        fn = self.itemsB[random.randint(0, len(self.itemsB)-1)]\n", "        return ImageTuple(img1, open_image(fn))\n", "    \n", "    def reconstruct(self, t:Tensor):\n", "        return ImageTuple(Image(t[0]/2+0.5),Image(t[1]/2+0.5))\n", "    \n", "    @classmethod\n", "    def from_folders(cls, path, folderA, folderB, **kwargs):\n", "        itemsB = ImageList.from_folder(path/folderB).items\n", "        res = super().from_folder(path/folderA, itemsB=itemsB, **kwargs)\n", "        res.path = path\n", "        return res" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Lastly, we want to customize the behavior of `show_batch` and `show_results`. Remember the `to_one` method just puts the two images next to each other." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "    def show_xys(self, xs, ys, figsize:Tuple[int,int]=(12,6), **kwargs):\n", "        \"Show the `xs` and `ys` on a figure of `figsize`. `kwargs` are passed to the show method.\"\n", "        rows = int(math.sqrt(len(xs)))\n", "        fig, axs = plt.subplots(rows,rows,figsize=figsize)\n", "        for i, ax in enumerate(axs.flatten() if rows > 1 else [axs]):\n", "            xs[i].to_one().show(ax=ax, **kwargs)\n", "        plt.tight_layout()\n", "\n", "    def show_xyzs(self, xs, ys, zs, figsize:Tuple[int,int]=None, **kwargs):\n", "        \"\"\"Show `xs` (inputs), `ys` (targets) and `zs` (predictions) on a figure of `figsize`.\n", "        `kwargs` are passed to the show method.\"\"\"\n", "        figsize = ifnone(figsize, (12,3*len(xs)))\n", "        fig,axs = plt.subplots(len(xs), 2, figsize=figsize)\n", "        fig.suptitle('Ground truth / Predictions', weight='bold', size=14)\n", "        for i,(x,z) in enumerate(zip(xs,zs)):\n", "            x.to_one().show(ax=axs[i,0], **kwargs)\n", "            z.to_one().show(ax=axs[i,1], **kwargs)" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [] } ], "metadata": { "jekyll": { "keywords": "fastai", "summary": "Advanced tutorial, explains how to create your custom `ItemBase` or `ItemList`", "title": "Custom ItemList" }, "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" } }, "nbformat": 4, "nbformat_minor": 2 }