{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Welcome to fastai" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "hide_input": true }, "outputs": [], "source": [ "from fastai.vision import *\n", "from fastai.gen_doc.nbdoc import *\n", "from fastai.core import *\n", "from fastai.basic_train import *" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The fastai library simplifies training fast and accurate neural nets using modern best practices. It's based on research in to deep learning best practices undertaken at [fast.ai](http://www.fast.ai), including \"out of the box\" support for [`vision`](/vision.html#vision), [`text`](/text.html#text), [`tabular`](/tabular.html#tabular), and [`collab`](/collab.html#collab) (collaborative filtering) models. If you're looking for the source code, head over to the [fastai repo](https://github.com/fastai/fastai) on GitHub. For brief examples, see the [examples](https://github.com/fastai/fastai/tree/master/examples) folder; detailed examples are provided in the full documentation (see the sidebar). For example, here's how to train an MNIST model using [resnet18](https://arxiv.org/abs/1512.03385) (from the [vision example](https://github.com/fastai/fastai/blob/master/examples/vision.ipynb)):" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/html": [ "Total time: 00:09

\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
epochtrain_lossvalid_lossaccuracy
10.1285800.0826470.973503
\n" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "path = untar_data(URLs.MNIST_SAMPLE)\n", "data = ImageDataBunch.from_folder(path)\n", "learn = cnn_learner(data, models.resnet18, metrics=accuracy)\n", "learn.fit(1)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

Note: This documentation is all built from notebooks;\n", "that means that you can try any of the code you see in any notebook yourself!\n", "You'll find the notebooks in the docs_src folder of the\n", "fastai repo. For instance,\n", "here\n", "is the notebook source of what you're reading now.
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "jekyll_note(\"\"\"This documentation is all built from notebooks;\n", "that means that you can try any of the code you see in any notebook yourself!\n", "You'll find the notebooks in the docs_src folder of the\n", "fastai repo. For instance,\n", "here\n", "is the notebook source of what you're reading now.\"\"\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Installation and updating" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To install or update fastai, we recommend `conda`:\n", "\n", "```\n", "conda install -c pytorch -c fastai fastai \n", "```\n", "For troubleshooting, and alternative installations (including pip and CPU-only options) see the [fastai readme](https://github.com/fastai/fastai/blob/master/README.md)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Reading the docs" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To get started quickly, click *Applications* on the sidebar, and then choose the application you're interested in. That will take you to a walk-through of training a model of that type. You can then either explore the various links from there, or dive more deeply into the various fastai modules.\n", "\n", "We've provided below a quick summary of the key modules in this library. For details on each one, use the sidebar to find the module you're interested in. Each module includes an overview and example of how to use it, along with documentation for every class, function, and method. API documentation looks, for example, like this:\n", "\n", "### An example function" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "hide_input": true }, "outputs": [ { "data": { "text/markdown": [ "

"<h4 id=\"rotate\"><code>rotate</code><a href=\"https://github.com/fastai/fastai/blob/master/fastai/vision/transform.py\" class=\"source_link\">[source]</a><a class=\"source_link\" data-toggle=\"collapse\" data-target=\"#rotate-pytest\">[test]</a></h4>\n",
"\n",
"> rotate(**`degrees`**:[`uniform`](/torch_core.html#uniform)) → [`Image`](/vision.image.html#Image) :: [`TfmAffine`](/vision.image.html#TfmAffine)\n",
"\n",
"<div class=\"collapse\" id=\"rotate-pytest\"><div class=\"card card-body pytest_card\"><p>Tests found for <code>rotate</code>:</p><ul><li><code>pytest -sv tests/test_vision_transform.py::test_deterministic_transforms</code> <a href=\"https://github.com/fastai/fastai/blob/master/tests/test_vision_transform.py\" class=\"source_link\">[source]</a></li></ul><p>To run tests please refer to this <a href=\"/dev/test.html\">guide</a>.</p></div></div>\n",
\n", "\n", "Rotate image by `degrees`. " ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "show_doc(rotate, full_name='rotate')" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "---\n", "\n", "Types for each parameter, and the return type, are displayed following standard Python [type hint syntax](https://www.python.org/dev/peps/pep-0484/). Sometimes for compound types we use [type variables](/fastai_typing.html). Types that are defined by fastai or Pytorch link directly to more information about that type; try clicking *Image* in the function above for an example. The docstring for the symbol is shown immediately after the signature, along with a link to the source code for the symbol in GitHub. After the basic signature and docstring you'll find examples and additional details (not shown in this example). As you'll see at the top of the page, all symbols documented like this also appear in the table of contents.\n", "\n", "For inherited classes and some types of decorated function, the base class or decorator type will also be shown at the end of the signature, delimited by `::`. For `vision.transforms`, the random number generator used for data augmentation is shown instead of the type, for randomly generated parameters." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Module structure" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Imports" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "fastai is designed to support both interactive computing as well as traditional software development. For interactive computing, where convenience and speed of experimentation is a priority, data scientists often prefer to grab all the symbols they need, with `import *`. Therefore, fastai is designed to support this approach, without compromising on maintainability and understanding.\n", "\n", "In order to do so, the module dependencies are carefully managed (see next section), with each exporting a carefully chosen set of symbols when using `import *` . In general, for interactive computing, to just play around the core modules and the training loop you can do\n", "\n", "```\n", "from fastai.basics import *\n", "```\n", "\n", "If you want experiment with one of the *applications* such as vision, then you can do\n", "\n", "```\n", "from fastai.vision import *\n", "```" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "That will give you all the standard external modules you'll need, in their customary namespaces (e.g. `pandas as pd`, `numpy as np`, `matplotlib.pyplot as plt`), plus the core fastai libraries. In addition, the main classes and functions for your application ([`fastai.vision`](/vision.html#vision), in this case), e.g. creating a [`DataBunch`](/basic_data.html#DataBunch) from an image folder and training a convolutional neural network (with [`cnn_learner`](/vision.learner.html#cnn_learner)), are also imported. If you don't wish to import any application, but want all the main functionality from fastai, use `from fastai.basics import *`. Of course, you can also just import the specific symbols that you require, without using `import *`.\n", "\n", "If you wish to see where a symbol is imported from, either just type the symbol name (in a REPL such as Jupyter Notebook or IPython), or (in most editors) wave your mouse over the symbol to see the definition. 
"For instance:" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [ { "data": { "text/plain": [ "fastai.basic_train.Learner" ] }, "execution_count": null, "metadata": {}, "output_type": "execute_result" } ], "source": [ "Learner" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Dependencies" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "At the base of everything are the two modules [`core`](/core.html#core) and [`torch_core`](/torch_core.html#torch_core) (we're not including the `fastai.` prefix when naming modules in these docs). They define the basic functions we use in the library; [`core`](/core.html#core) only relies on general modules, whereas [`torch_core`](/torch_core.html#torch_core) requires PyTorch. Most type-hinting shortcuts are defined there too (at least the ones that don't depend on fastai classes defined later). Nearly all modules below import [`torch_core`](/torch_core.html#torch_core).\n", "\n", "Then, there are three modules directly on top of [`torch_core`](/torch_core.html#torch_core):\n", "- [`basic_data`](/basic_data.html#basic_data), which contains the class that takes a [`Dataset`](https://pytorch.org/docs/stable/data.html#torch.utils.data.Dataset) or PyTorch [`DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader), wraps it in a [`DeviceDataLoader`](/basic_data.html#DeviceDataLoader) (a class that sits on top of a [`DataLoader`](https://pytorch.org/docs/stable/data.html#torch.utils.data.DataLoader) and is in charge of putting the data on the right device, as well as applying transforms such as normalization), and regroups them in a [`DataBunch`](/basic_data.html#DataBunch).\n", "- [`layers`](/layers.html#layers), which contains basic functions to define custom layers or groups of layers.\n", "- [`metrics`](/metrics.html#metrics), which contains all the metrics.\n", "\n", "This takes care of the basics; then we regroup a model with some data in a [`Learner`](/basic_train.html#Learner) object to take care of training. More specifically:\n", "- [`callback`](/callback.html#callback) (depends on [`basic_data`](/basic_data.html#basic_data)) defines the basis of callbacks and the [`CallbackHandler`](/callback.html#CallbackHandler). Those are functions that are called at every step of the training loop and allow us to customize what happens there (see the sketch below);\n", "- [`basic_train`](/basic_train.html#basic_train) (depends on [`callback`](/callback.html#callback)) defines [`Learner`](/basic_train.html#Learner) and [`Recorder`](/basic_train.html#Recorder) (a callback that records training stats) and contains the training loop;\n", "- [`callbacks`](/callbacks.html#callbacks) (depends on [`basic_train`](/basic_train.html#basic_train)) is a submodule defining various callbacks, such as for mixed-precision training or 1cycle annealing;\n", "- [`train`](/train.html#train) (depends on [`callbacks`](/callbacks.html#callbacks)) defines helper functions to invoke the callbacks more easily.\n", "\n", "From [`basic_data`](/basic_data.html#basic_data) we branch out to the four main *applications*, each of which has its own module: [`vision`](/vision.html#vision), [`text`](/text.html#text), [`collab`](/collab.html#collab), or [`tabular`](/tabular.html#tabular).\n",
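"\n",
"Before looking inside the applications, here is a minimal sketch of the callback mechanism described above. It is only an illustration: the class below is made up, and we assume the fastai v1 callback API documented in [`callback`](/callback.html#callback):\n",
"\n",
"```\n",
"from fastai.basics import *\n",
"\n",
"class PrintEpochStats(Callback):\n",
"    # the CallbackHandler calls this at the end of every epoch,\n",
"    # passing the current training state as keyword arguments\n",
"    def on_epoch_end(self, epoch, smooth_loss, **kwargs):\n",
"        print(f'epoch {epoch}: smooth loss {float(smooth_loss):.4f}')\n",
"\n",
"# pass an instance when fitting, e.g. learn.fit(1, callbacks=[PrintEpochStats()])\n",
"```\n",
"\n",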
"Each of those application modules is built in the same way, with:\n", "- a submodule named `transform` that handles the transformations of our data (data augmentation for computer vision, numericalization and tokenization for text, and preprocessing for tabular data).\n", "- a submodule named `data` that contains the classes to create datasets specific to this application and the helper functions to create [`DataBunch`](/basic_data.html#DataBunch) objects.\n", "- a submodule named `models` that contains the models specific to this application.\n", "- optionally, a submodule named {application}.learner that contains a [`Learner`](/basic_train.html#Learner) specific to the application.\n", "\n", "Here is a graph of the key module dependencies:" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![Modules overview](imgs/dependencies.svg)" ] } ], "metadata": { "jekyll": { "keywords": "fastai", "toc": "false" }, "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.2" } }, "nbformat": 4, "nbformat_minor": 2 }