{ "cells": [ { "cell_type": "markdown", "metadata": { "id": "d1KVLcqHWdgI" }, "source": [ "# Downloading and Evaluating Open Images\n", "\n", "Downloading Google's [Open Images dataset](https://storage.googleapis.com/openimages/web/download.html) is now easier than ever with the [FiftyOne Dataset Zoo](https://voxel51.com/docs/fiftyone/user_guide/dataset_zoo/index.html#dataset-zoo-open-images-v7)! You can load all three splits of Open Images V7, including image-level labels, detections, segmentations, visual relationships, and point labels.\n", "\n", "FiftyOne also natively supports [Open Images-style evaluation](https://voxel51.com/docs/fiftyone/user_guide/evaluation.html#open-images-style-evaluation), so you can easily evaluate your object detection models and explore the results directly in the library.\n", "\n", "This walkthrough covers:\n", "\n", "- Downloading [Open Images](https://storage.googleapis.com/openimages/web/index.html) from the [FiftyOne Dataset Zoo](https://voxel51.com/docs/fiftyone/user_guide/dataset_zoo/index.html)\n", "- Computing predictions using a model from the [FiftyOne Model Zoo](https://voxel51.com/docs/fiftyone/user_guide/model_zoo/index.html)\n", "- Performing [Open Images-style evaluation](https://voxel51.com/docs/fiftyone/user_guide/evaluation.html#open-images-style-evaluation) in FiftyOne to evaluate a model and compute its mAP\n", "- Exploring the dataset and [evaluation results](https://voxel51.com/docs/fiftyone/user_guide/evaluation.html)\n", "- [Visualizing embeddings](https://voxel51.com/docs/fiftyone/user_guide/brain.html#visualizing-embeddings) through [interactive plots](https://voxel51.com/docs/fiftyone/user_guide/plots.html)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**So, what's the takeaway?**\n", "\n", "Starting a new ML project takes data and time, and the datasets in the [FiftyOne Dataset Zoo](https://voxel51.com/docs/fiftyone/user_guide/dataset_zoo/index.html) can help jump start the development 
process.\n", "\n", "Open Images in particular is one of the largest publicly available datasets for object detection, classification, segmentation, and more. Additionally, with [Open Images evaluation](https://voxel51.com/docs/fiftyone/user_guide/evaluation.html#open-images-style-evaluation) available natively in FiftyOne, you can quickly evaluate your models and compute mAP and PR curves.\n", "\n", "While metrics like mAP are often used to compare models, the best way to improve your model's performance isn't to look at aggregate metrics but instead to get hands-on with your evaluation and visualize how your model performs on individual samples. All of this is made easy with FiftyOne!" ] }, { "cell_type": "markdown", "metadata": { "id": "R5SJCYnQXnOL" }, "source": [ "## Setup" ] }, { "cell_type": "markdown", "metadata": { "id": "Swt8yYgIbOkv" }, "source": [ "If you haven't already, install FiftyOne:" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "colab": { "base_uri": "https://localhost:8080/" }, "id": "627MxtHOUjU6", "outputId": "0c87eadb-60e4-452d-fa13-a5b8d06b218f" }, "outputs": [], "source": [ "!pip install fiftyone" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "In this tutorial, we’ll use some [TensorFlow models](https://github.com/tensorflow/models) and [PyTorch](https://pytorch.org/vision/stable/index.html) to generate predictions and embeddings, and we’ll use the [UMAP method](https://github.com/lmcinnes/umap) to reduce the dimensionality of embeddings, so we need to install the corresponding packages:" ] }, { "cell_type": "code", "execution_count": 2, "metadata": {}, "outputs": [], "source": [ "!pip install tensorflow torch torchvision umap-learn" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This tutorial also includes some of FiftyOne's [interactive plotting capabilities](https://voxel51.com/docs/fiftyone/user_guide/plots.html).\n", "\n", "The recommended way to work with FiftyOne’s interactive plots is 
in [Jupyter notebooks](https://jupyter.org/) or [JupyterLab](https://jupyterlab.readthedocs.io/en/stable/). In these environments, you can leverage the full power of plots by [attaching them to the FiftyOne App](https://voxel51.com/docs/fiftyone/user_guide/plots.html#attaching-plots) and bidirectionally interacting with the plots and the App to identify interesting subsets of your data.\n", "\n", "To use interactive plots in Jupyter notebooks, ensure that you have the `ipywidgets` package installed:" ] }, { "cell_type": "code", "execution_count": 3, "metadata": {}, "outputs": [], "source": [ "!pip install 'ipywidgets>=8,<9'" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "If you’re working in JupyterLab, refer to [these instructions](https://voxel51.com/docs/fiftyone/user_guide/plots.html#working-in-notebooks) to get set up.\n" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "
Support for interactive plots in non-notebook contexts and Google Colab is coming soon! In the meantime, you can still use FiftyOne's plotting features in those environments, but you must manually call `plot.show()`
to update the state of a plot to match the state of a connected session, and any callbacks that would normally be triggered in response to interacting with a plot will not be triggered.