{ "cells": [ { "attachments": {}, "cell_type": "markdown", "metadata": {}, "source": [ ">### 🚩 *Create a free WhyLabs account to get more value out of whylogs!*
\n", ">*Did you know you can store, visualize, and monitor whylogs profiles with the [WhyLabs Observability Platform](https://whylabs.ai/whylogs-free-signup?utm_source=whylogs-Github&utm_medium=whylogs-example&utm_campaign=flask_with_whylogs)? Sign up for a [free WhyLabs account](https://whylabs.ai/whylogs-free-signup?utm_source=whylogs-Github&utm_medium=whylogs-example&utm_campaign=flask_with_whylogs) to leverage the power of whylogs and WhyLabs together!*" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": true, "pycharm": { "name": "#%% md\n" } }, "source": [ "# Integrating Whylogs into your Flask Flow\n", "\n", "[![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/whylabs/whylogs/blob/mainline/python/examples/integrations/flask_streaming/flask_with_whylogs.ipynb)\n", "\n", "Now that you've gone through the basics, let's get into how to integrate whylogs into your current work flow. Today, let's look at Flask to build a web app that wil use data from IRIS and us it for a prediction. Notice that we will log both the input received and the prediction! \n", "\n", "### What you'll Need\n", "- Docker\n", "- pandas\n", "- scikit-learn\n", "- Flask" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "## Overview\n", "\n", "![Data Flow by Felipe de Pontes Adachi](./assets/flask_whylogs_whylabs_flow.jpeg)\n", "\n", "We’ll deploy locally a Flask application, which is responsible for serving the user with the requested predictions through a REST endpoint. Our application will use the whylogs library to create statistical profiles of both input and output features of our application during production. These statistical properties will then be sent in microbatches to either to a local writer or WhyLabs at fixed intervals. If sent to WhyLabs, it will merge them automatically, creating statistical profiles on a daily basis." ] }, { "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "#### Let's get our environment ready! \n", "Uncomment the whylogs install if you don't have it with whylabs extension installed" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Note: you may need to restart the kernel to use updated packages.\n" ] } ], "source": [ "# Note: you may need to restart the kernel to use updated packages.\n", "%pip install pandas utils joblib scikit-learn Flask\n", "%pip install whylogs" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [], "source": [ "import random\n", "import numpy as np\n", "import time\n", "import requests\n", "import pandas as pd\n", "from joblib import dump\n", "from sklearn.svm import SVC\n", "import sklearn.datasets \n", "from sklearn.model_selection import train_test_split" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "### Step 1: Load the Data\n", "For this we will be using the IRIS dataset for our classification. It looks at the sepal and petal lengths and widths to be able to make a prediction of what species it is. This data set is readily available, but in this case we will grab it from sklearn's dataset library." 
] }, { "cell_type": "code", "execution_count": 3, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "data": { "text/html": [ "
\n", "\n", "\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
sepal length (cm)sepal width (cm)petal length (cm)petal width (cm)target
05.13.51.40.2setosa
14.93.01.40.2setosa
24.73.21.30.2setosa
34.63.11.50.2setosa
45.03.61.40.2setosa
..................
1456.73.05.22.3virginica
1466.32.55.01.9virginica
1476.53.05.22.0virginica
1486.23.45.42.3virginica
1495.93.05.11.8virginica
\n", "

150 rows × 5 columns

\n", "
" ], "text/plain": [ " sepal length (cm) sepal width (cm) petal length (cm) petal width (cm) \\\n", "0 5.1 3.5 1.4 0.2 \n", "1 4.9 3.0 1.4 0.2 \n", "2 4.7 3.2 1.3 0.2 \n", "3 4.6 3.1 1.5 0.2 \n", "4 5.0 3.6 1.4 0.2 \n", ".. ... ... ... ... \n", "145 6.7 3.0 5.2 2.3 \n", "146 6.3 2.5 5.0 1.9 \n", "147 6.5 3.0 5.2 2.0 \n", "148 6.2 3.4 5.4 2.3 \n", "149 5.9 3.0 5.1 1.8 \n", "\n", " target \n", "0 setosa \n", "1 setosa \n", "2 setosa \n", "3 setosa \n", "4 setosa \n", ".. ... \n", "145 virginica \n", "146 virginica \n", "147 virginica \n", "148 virginica \n", "149 virginica \n", "\n", "[150 rows x 5 columns]" ] }, "execution_count": 3, "metadata": {}, "output_type": "execute_result" } ], "source": [ "iris = sklearn.datasets.load_iris(as_frame=True)\n", "data = pd.DataFrame(data=iris.data, columns=iris.feature_names)\n", "\n", "data['target'] = [iris.target_names[i] for i in iris.target]\n", "data" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [], "source": [ " # Separating the independent variables from dependent variables\n", "X = data.iloc[:, 0:4].values\n", "y = data.iloc[:, -1].values\n", "x_train, x_test, y_train, y_test = train_test_split(X, y, test_size=0.30, random_state=0)" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false }, "source": [ "### Step 2: Train the Model\n", "Next up, it's time to train the model. We will use a simple SVC, fit the modle, then dump it to \"model.joblib\". " ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Train started.\n", "Train finished.\n", "Model saved as model.joblib\n" ] } ], "source": [ "# Train a classifier\n", "print(\"Train started.\")\n", "model = SVC()\n", "model.fit(x_train, y_train)\n", "print(\"Train finished.\")\n", "# Save the model\n", "dump(model, 'model.joblib')\n", "print(\"Model saved as model.joblib\")" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "### Step 3: Build and Run a Docker Image\n", "Within the directory this notebook is in you'll see all the code that makes up our flask app. There are a lot of files, but the main ones of interest are in the api folder. 
{ "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "### Step 3: Build and Run a Docker Image\n", "Within the directory this notebook is in, you'll find all the code that makes up our Flask app. There are a lot of files, but the main ones of interest are in the api folder. For this next step we will use Docker to build an image based on the requirements and settings in that directory; a rough sketch of this kind of whylogs integration is shown below, followed by the build itself." ] },
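{ "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "The actual application code lives in the api folder and is copied into the image. As an illustration only (the names, paths, and intervals here are assumptions, not a copy of the app's code), a Flask `/predict` endpoint could profile each request and prediction with whylogs' rolling logger like this:\n", "\n", "```python\n", "# Illustrative sketch -- see the api folder for the real application code\n", "import pandas as pd\n", "import whylogs as why\n", "from flask import Flask, jsonify, request\n", "from joblib import load\n", "\n", "app = Flask(__name__)\n", "model = load(\"model.joblib\")\n", "\n", "# Rolling logger: profiles roll over every 5 minutes and are written locally\n", "logger = why.logger(mode=\"rolling\", interval=5, when=\"M\", base_name=\"inference_profile\")\n", "logger.append_writer(\"local\", base_dir=\"logs\")\n", "\n", "@app.route(\"/predict\", methods=[\"POST\"])\n", "def predict():\n", "    features = pd.DataFrame([request.get_json()])\n", "    prediction = model.predict(features.values)[0]\n", "    # Log both the input features and the resulting prediction\n", "    logger.log(features.assign(**{\"class\": prediction}))\n", "    return jsonify({\"data\": {\"class\": prediction}, \"message\": \"Success\"})\n", "```" ] },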
{ "cell_type": "code", "execution_count": 6, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "[+] Building 1.4s (11/11) FINISHED\n", " => [internal] load build definition from Dockerfile  0.0s\n", " => => transferring dockerfile: 37B  0.0s\n", " => [internal] load .dockerignore  0.0s\n", " => => transferring context: 34B  0.0s\n", " => [internal] load metadata for docker.io/library/python:3.9  1.2s\n", " => [1/6] FROM docker.io/library/python:3.9@sha256:51c996c8c65d826d20c613  0.0s\n", " => [internal] load build context  0.0s\n", " => => transferring context: 57.28kB  0.0s\n", " => CACHED [2/6] RUN mkdir /app  0.0s\n", " => CACHED [3/6] WORKDIR /app  0.0s\n", " => CACHED [4/6] COPY requirements.txt .  0.0s\n", " => CACHED [5/6] RUN pip install --no-cache-dir -r requirements.txt  0.0s\n", " => [6/6] COPY ./ /app  0.0s\n", " => exporting to image  0.0s\n", " => => exporting layers  0.0s\n", " => => writing image sha256:33c622c8278ee63a8dd063de008fc364f5f33de4338ac  0.0s\n", " => => naming to docker.io/library/whylogs-flask  0.0s\n", "\n", "Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them\n" ] } ], "source": [ "!docker build --build-arg PYTHON_VERSION=3.9 -t whylogs-flask ." ] },
{ "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "Huzzah!! It built! We have a Docker image, but nothing is running yet. Open a terminal and execute the following command:\n", "\n", "```bash\n", "docker run --rm -p 5000:5000 whylogs-flask\n", "```\n" ] },
{ "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "### Step 4: Test Endpoint\n", "Let's make sure it's actually up and running by following the directions below:\n", "\n", "- Go to http://0.0.0.0:5000/apidocs/\n", "- Open the green /predict endpoint tab.\n", "- Click Try it out.\n", "- Click the green Execute button.\n", "- Check the response code; if it is 200, the API is working.\n", "\n", "If it's not working, run `docker ps` to check whether something else is running on that port. " ] },
{ "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "### Step 5: Mess with Data to Showcase a Drift\n", "Note that the logger is configured to roll over every 5 minutes; we recommend running this for at least 15 minutes before really digging into the visualizations." ] },
{ "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "The following functions modify the variable distributions in order to test drift detection in WhyLabs."
] }, { "cell_type": "code", "execution_count": 7, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [], "source": [ "def modify_random_column_values(data, value: float = np.random.uniform(low=0.0, high=10.0)) -> None:\n", " random_column = None\n", " data_mod = data.copy(deep=True)\n", " try:\n", " number_of_columns = len(data_mod.columns) - 2 # Index and label eliminated\n", " random_column = data_mod.columns[np.random.randint(number_of_columns) + 1]\n", " data_mod[random_column] = value\n", " except Exception as ex:\n", " raise f\"Error adding fix value in random column: {str(random_column)}\"\n", " return data_mod\n", " \n", " \n", "def add_random_column_outliers(data, number_outliers: int = 10) -> None:\n", " random_column = None\n", " data_mod = data.copy(deep=True)\n", " try:\n", " number_of_columns = len(data_mod.columns) - 2 # Index and label eliminated\n", " number_of_rows = data_mod.shape[0]\n", " random_column = data_mod.columns[np.random.randint(number_of_columns) + 1]\n", " for i in range(number_outliers):\n", " random_row = np.random.randint(0, number_of_rows)\n", " data_mod.loc[random_row, random_column] = round(np.random.uniform(low=20.0, high=50.0), 2)\n", " except Exception as ex:\n", " raise f\"Error adding outliers in random column: {random_column}\"\n", " return data_mod" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false }, "source": [ "Once it's working, you can try to send continous requests to the endpoint:" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [], "source": [ "labels = [\"sepal_length_cm\", \"sepal_width_cm\", \"petal_length_cm\", \"petal_width_cm\"]" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Dataset distribution modified!\n" ] } ], "source": [ "# modify a variable distribution\n", "data_mod = add_random_column_outliers(data, 30)\n", "print(\"Dataset distribution modified!\")" ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [], "source": [ "url = \"http://0.0.0.0:5000/api/v1\"" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "{'sepal_length_cm': 5.1, 'sepal_width_cm': 3.5, 'petal_length_cm': 1.4, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.9, 'sepal_width_cm': 3.0, 'petal_length_cm': 1.4, 'petal_width_cm': 25.89}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.7, 'sepal_width_cm': 3.2, 'petal_length_cm': 1.3, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.6, 'sepal_width_cm': 3.1, 'petal_length_cm': 1.5, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.0, 'sepal_width_cm': 3.6, 'petal_length_cm': 1.4, 'petal_width_cm': 48.97}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.4, 'sepal_width_cm': 3.9, 'petal_length_cm': 1.7, 'petal_width_cm': 48.1}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.6, 'sepal_width_cm': 3.4, 'petal_length_cm': 1.4, 'petal_width_cm': 
0.3}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.0, 'sepal_width_cm': 3.4, 'petal_length_cm': 1.5, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.4, 'sepal_width_cm': 2.9, 'petal_length_cm': 1.4, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.9, 'sepal_width_cm': 3.1, 'petal_length_cm': 1.5, 'petal_width_cm': 0.1}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.4, 'sepal_width_cm': 3.7, 'petal_length_cm': 1.5, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.8, 'sepal_width_cm': 3.4, 'petal_length_cm': 1.6, 'petal_width_cm': 26.61}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.8, 'sepal_width_cm': 3.0, 'petal_length_cm': 1.4, 'petal_width_cm': 0.1}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.3, 'sepal_width_cm': 3.0, 'petal_length_cm': 1.1, 'petal_width_cm': 43.34}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.8, 'sepal_width_cm': 4.0, 'petal_length_cm': 1.2, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.7, 'sepal_width_cm': 4.4, 'petal_length_cm': 1.5, 'petal_width_cm': 0.4}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.4, 'sepal_width_cm': 3.9, 'petal_length_cm': 1.3, 'petal_width_cm': 0.4}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.1, 'sepal_width_cm': 3.5, 'petal_length_cm': 1.4, 'petal_width_cm': 0.3}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.7, 'sepal_width_cm': 3.8, 'petal_length_cm': 1.7, 'petal_width_cm': 0.3}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.1, 'sepal_width_cm': 3.8, 'petal_length_cm': 1.5, 'petal_width_cm': 0.3}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.4, 'sepal_width_cm': 3.4, 'petal_length_cm': 1.7, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.1, 'sepal_width_cm': 3.7, 'petal_length_cm': 1.5, 'petal_width_cm': 0.4}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.6, 'sepal_width_cm': 3.6, 'petal_length_cm': 1.0, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.1, 'sepal_width_cm': 3.3, 'petal_length_cm': 1.7, 'petal_width_cm': 0.5}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.8, 'sepal_width_cm': 3.4, 'petal_length_cm': 1.9, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.0, 'sepal_width_cm': 3.0, 'petal_length_cm': 1.6, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.0, 'sepal_width_cm': 3.4, 'petal_length_cm': 1.6, 'petal_width_cm': 0.4}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.2, 'sepal_width_cm': 3.5, 'petal_length_cm': 1.5, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.2, 'sepal_width_cm': 3.4, 'petal_length_cm': 1.4, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 
4.7, 'sepal_width_cm': 3.2, 'petal_length_cm': 1.6, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.8, 'sepal_width_cm': 3.1, 'petal_length_cm': 1.6, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.4, 'sepal_width_cm': 3.4, 'petal_length_cm': 1.5, 'petal_width_cm': 0.4}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.2, 'sepal_width_cm': 4.1, 'petal_length_cm': 1.5, 'petal_width_cm': 0.1}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.5, 'sepal_width_cm': 4.2, 'petal_length_cm': 1.4, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.9, 'sepal_width_cm': 3.1, 'petal_length_cm': 1.5, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.0, 'sepal_width_cm': 3.2, 'petal_length_cm': 1.2, 'petal_width_cm': 27.27}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.5, 'sepal_width_cm': 3.5, 'petal_length_cm': 1.3, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.9, 'sepal_width_cm': 3.6, 'petal_length_cm': 1.4, 'petal_width_cm': 0.1}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.4, 'sepal_width_cm': 3.0, 'petal_length_cm': 1.3, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.1, 'sepal_width_cm': 3.4, 'petal_length_cm': 1.5, 'petal_width_cm': 28.7}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.0, 'sepal_width_cm': 3.5, 'petal_length_cm': 1.3, 'petal_width_cm': 0.3}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.5, 'sepal_width_cm': 2.3, 'petal_length_cm': 1.3, 'petal_width_cm': 0.3}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.4, 'sepal_width_cm': 3.2, 'petal_length_cm': 1.3, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.0, 'sepal_width_cm': 3.5, 'petal_length_cm': 1.6, 'petal_width_cm': 45.86}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.1, 'sepal_width_cm': 3.8, 'petal_length_cm': 1.9, 'petal_width_cm': 41.23}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.8, 'sepal_width_cm': 3.0, 'petal_length_cm': 1.4, 'petal_width_cm': 0.3}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.1, 'sepal_width_cm': 3.8, 'petal_length_cm': 1.6, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.6, 'sepal_width_cm': 3.2, 'petal_length_cm': 1.4, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.3, 'sepal_width_cm': 3.7, 'petal_length_cm': 1.5, 'petal_width_cm': 0.2}\n", "{'data': {'class': 'setosa'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.0, 'sepal_width_cm': 3.3, 'petal_length_cm': 1.4, 'petal_width_cm': 25.96}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.0, 'sepal_width_cm': 3.2, 'petal_length_cm': 4.7, 'petal_width_cm': 1.4}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.4, 'sepal_width_cm': 3.2, 'petal_length_cm': 4.5, 'petal_width_cm': 
1.5}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.9, 'sepal_width_cm': 3.1, 'petal_length_cm': 4.9, 'petal_width_cm': 1.5}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.5, 'sepal_width_cm': 2.3, 'petal_length_cm': 4.0, 'petal_width_cm': 31.79}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.5, 'sepal_width_cm': 2.8, 'petal_length_cm': 4.6, 'petal_width_cm': 1.5}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.7, 'sepal_width_cm': 2.8, 'petal_length_cm': 4.5, 'petal_width_cm': 32.73}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.3, 'sepal_width_cm': 3.3, 'petal_length_cm': 4.7, 'petal_width_cm': 1.6}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.9, 'sepal_width_cm': 2.4, 'petal_length_cm': 3.3, 'petal_width_cm': 1.0}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.6, 'sepal_width_cm': 2.9, 'petal_length_cm': 4.6, 'petal_width_cm': 1.3}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.2, 'sepal_width_cm': 2.7, 'petal_length_cm': 3.9, 'petal_width_cm': 30.06}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.0, 'sepal_width_cm': 2.0, 'petal_length_cm': 3.5, 'petal_width_cm': 1.0}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.9, 'sepal_width_cm': 3.0, 'petal_length_cm': 4.2, 'petal_width_cm': 1.5}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.0, 'sepal_width_cm': 2.2, 'petal_length_cm': 4.0, 'petal_width_cm': 1.0}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.1, 'sepal_width_cm': 2.9, 'petal_length_cm': 4.7, 'petal_width_cm': 1.4}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.6, 'sepal_width_cm': 2.9, 'petal_length_cm': 3.6, 'petal_width_cm': 47.54}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.7, 'sepal_width_cm': 3.1, 'petal_length_cm': 4.4, 'petal_width_cm': 1.4}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.6, 'sepal_width_cm': 3.0, 'petal_length_cm': 4.5, 'petal_width_cm': 44.65}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.8, 'sepal_width_cm': 2.7, 'petal_length_cm': 4.1, 'petal_width_cm': 1.0}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.2, 'sepal_width_cm': 2.2, 'petal_length_cm': 4.5, 'petal_width_cm': 1.5}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.6, 'sepal_width_cm': 2.5, 'petal_length_cm': 3.9, 'petal_width_cm': 1.1}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.9, 'sepal_width_cm': 3.2, 'petal_length_cm': 4.8, 'petal_width_cm': 1.8}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.1, 'sepal_width_cm': 2.8, 'petal_length_cm': 4.0, 'petal_width_cm': 1.3}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.3, 'sepal_width_cm': 2.5, 'petal_length_cm': 4.9, 'petal_width_cm': 1.5}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.1, 'sepal_width_cm': 2.8, 'petal_length_cm': 4.7, 'petal_width_cm': 1.2}\n", 
"{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.4, 'sepal_width_cm': 2.9, 'petal_length_cm': 4.3, 'petal_width_cm': 1.3}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.6, 'sepal_width_cm': 3.0, 'petal_length_cm': 4.4, 'petal_width_cm': 1.4}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.8, 'sepal_width_cm': 2.8, 'petal_length_cm': 4.8, 'petal_width_cm': 1.4}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.7, 'sepal_width_cm': 3.0, 'petal_length_cm': 5.0, 'petal_width_cm': 32.12}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.0, 'sepal_width_cm': 2.9, 'petal_length_cm': 4.5, 'petal_width_cm': 1.5}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.7, 'sepal_width_cm': 2.6, 'petal_length_cm': 3.5, 'petal_width_cm': 1.0}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.5, 'sepal_width_cm': 2.4, 'petal_length_cm': 3.8, 'petal_width_cm': 1.1}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.5, 'sepal_width_cm': 2.4, 'petal_length_cm': 3.7, 'petal_width_cm': 1.0}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.8, 'sepal_width_cm': 2.7, 'petal_length_cm': 3.9, 'petal_width_cm': 43.94}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.0, 'sepal_width_cm': 2.7, 'petal_length_cm': 5.1, 'petal_width_cm': 1.6}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.4, 'sepal_width_cm': 3.0, 'petal_length_cm': 4.5, 'petal_width_cm': 1.5}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.0, 'sepal_width_cm': 3.4, 'petal_length_cm': 4.5, 'petal_width_cm': 1.6}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.7, 'sepal_width_cm': 3.1, 'petal_length_cm': 4.7, 'petal_width_cm': 1.5}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.3, 'sepal_width_cm': 2.3, 'petal_length_cm': 4.4, 'petal_width_cm': 49.06}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.6, 'sepal_width_cm': 3.0, 'petal_length_cm': 4.1, 'petal_width_cm': 1.3}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.5, 'sepal_width_cm': 2.5, 'petal_length_cm': 4.0, 'petal_width_cm': 1.3}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.5, 'sepal_width_cm': 2.6, 'petal_length_cm': 4.4, 'petal_width_cm': 1.2}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.1, 'sepal_width_cm': 3.0, 'petal_length_cm': 4.6, 'petal_width_cm': 1.4}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.8, 'sepal_width_cm': 2.6, 'petal_length_cm': 4.0, 'petal_width_cm': 32.23}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.0, 'sepal_width_cm': 2.3, 'petal_length_cm': 3.3, 'petal_width_cm': 1.0}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.6, 'sepal_width_cm': 2.7, 'petal_length_cm': 4.2, 'petal_width_cm': 1.3}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.7, 'sepal_width_cm': 3.0, 'petal_length_cm': 4.2, 'petal_width_cm': 1.2}\n", "{'data': 
{'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.7, 'sepal_width_cm': 2.9, 'petal_length_cm': 4.2, 'petal_width_cm': 48.92}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.2, 'sepal_width_cm': 2.9, 'petal_length_cm': 4.3, 'petal_width_cm': 1.3}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.1, 'sepal_width_cm': 2.5, 'petal_length_cm': 3.0, 'petal_width_cm': 1.1}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.7, 'sepal_width_cm': 2.8, 'petal_length_cm': 4.1, 'petal_width_cm': 1.3}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.3, 'sepal_width_cm': 3.3, 'petal_length_cm': 6.0, 'petal_width_cm': 2.5}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.8, 'sepal_width_cm': 2.7, 'petal_length_cm': 5.1, 'petal_width_cm': 1.9}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.1, 'sepal_width_cm': 3.0, 'petal_length_cm': 5.9, 'petal_width_cm': 33.89}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.3, 'sepal_width_cm': 2.9, 'petal_length_cm': 5.6, 'petal_width_cm': 1.8}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.5, 'sepal_width_cm': 3.0, 'petal_length_cm': 5.8, 'petal_width_cm': 33.77}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.6, 'sepal_width_cm': 3.0, 'petal_length_cm': 6.6, 'petal_width_cm': 2.1}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 4.9, 'sepal_width_cm': 2.5, 'petal_length_cm': 4.5, 'petal_width_cm': 1.7}\n", "{'data': {'class': 'versicolor'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.3, 'sepal_width_cm': 2.9, 'petal_length_cm': 6.3, 'petal_width_cm': 1.8}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.7, 'sepal_width_cm': 2.5, 'petal_length_cm': 5.8, 'petal_width_cm': 25.81}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.2, 'sepal_width_cm': 3.6, 'petal_length_cm': 6.1, 'petal_width_cm': 2.5}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.5, 'sepal_width_cm': 3.2, 'petal_length_cm': 5.1, 'petal_width_cm': 2.0}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.4, 'sepal_width_cm': 2.7, 'petal_length_cm': 5.3, 'petal_width_cm': 1.9}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.8, 'sepal_width_cm': 3.0, 'petal_length_cm': 5.5, 'petal_width_cm': 2.1}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.7, 'sepal_width_cm': 2.5, 'petal_length_cm': 5.0, 'petal_width_cm': 34.12}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.8, 'sepal_width_cm': 2.8, 'petal_length_cm': 5.1, 'petal_width_cm': 2.4}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.4, 'sepal_width_cm': 3.2, 'petal_length_cm': 5.3, 'petal_width_cm': 2.3}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.5, 'sepal_width_cm': 3.0, 'petal_length_cm': 5.5, 'petal_width_cm': 37.76}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.7, 'sepal_width_cm': 3.8, 'petal_length_cm': 6.7, 'petal_width_cm': 2.2}\n", "{'data': {'class': 
'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.7, 'sepal_width_cm': 2.6, 'petal_length_cm': 6.9, 'petal_width_cm': 2.3}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.0, 'sepal_width_cm': 2.2, 'petal_length_cm': 5.0, 'petal_width_cm': 1.5}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.9, 'sepal_width_cm': 3.2, 'petal_length_cm': 5.7, 'petal_width_cm': 2.3}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.6, 'sepal_width_cm': 2.8, 'petal_length_cm': 4.9, 'petal_width_cm': 2.0}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.7, 'sepal_width_cm': 2.8, 'petal_length_cm': 6.7, 'petal_width_cm': 44.82}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.3, 'sepal_width_cm': 2.7, 'petal_length_cm': 4.9, 'petal_width_cm': 1.8}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.7, 'sepal_width_cm': 3.3, 'petal_length_cm': 5.7, 'petal_width_cm': 2.1}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.2, 'sepal_width_cm': 3.2, 'petal_length_cm': 6.0, 'petal_width_cm': 1.8}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.2, 'sepal_width_cm': 2.8, 'petal_length_cm': 4.8, 'petal_width_cm': 1.8}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.1, 'sepal_width_cm': 3.0, 'petal_length_cm': 4.9, 'petal_width_cm': 1.8}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.4, 'sepal_width_cm': 2.8, 'petal_length_cm': 5.6, 'petal_width_cm': 2.1}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.2, 'sepal_width_cm': 3.0, 'petal_length_cm': 5.8, 'petal_width_cm': 1.6}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.4, 'sepal_width_cm': 2.8, 'petal_length_cm': 6.1, 'petal_width_cm': 40.22}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.9, 'sepal_width_cm': 3.8, 'petal_length_cm': 6.4, 'petal_width_cm': 2.0}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.4, 'sepal_width_cm': 2.8, 'petal_length_cm': 5.6, 'petal_width_cm': 2.2}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.3, 'sepal_width_cm': 2.8, 'petal_length_cm': 5.1, 'petal_width_cm': 1.5}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.1, 'sepal_width_cm': 2.6, 'petal_length_cm': 5.6, 'petal_width_cm': 1.4}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 7.7, 'sepal_width_cm': 3.0, 'petal_length_cm': 6.1, 'petal_width_cm': 2.3}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.3, 'sepal_width_cm': 3.4, 'petal_length_cm': 5.6, 'petal_width_cm': 2.4}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.4, 'sepal_width_cm': 3.1, 'petal_length_cm': 5.5, 'petal_width_cm': 1.8}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.0, 'sepal_width_cm': 3.0, 'petal_length_cm': 4.8, 'petal_width_cm': 1.8}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.9, 'sepal_width_cm': 3.1, 'petal_length_cm': 5.4, 'petal_width_cm': 2.1}\n", "{'data': {'class': 'virginica'}, 'message': 
'Success'}\n", "{'sepal_length_cm': 6.7, 'sepal_width_cm': 3.1, 'petal_length_cm': 5.6, 'petal_width_cm': 2.4}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.9, 'sepal_width_cm': 3.1, 'petal_length_cm': 5.1, 'petal_width_cm': 49.55}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.8, 'sepal_width_cm': 2.7, 'petal_length_cm': 5.1, 'petal_width_cm': 1.9}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.8, 'sepal_width_cm': 3.2, 'petal_length_cm': 5.9, 'petal_width_cm': 2.3}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.7, 'sepal_width_cm': 3.3, 'petal_length_cm': 5.7, 'petal_width_cm': 2.5}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.7, 'sepal_width_cm': 3.0, 'petal_length_cm': 5.2, 'petal_width_cm': 2.3}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.3, 'sepal_width_cm': 2.5, 'petal_length_cm': 5.0, 'petal_width_cm': 1.9}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.5, 'sepal_width_cm': 3.0, 'petal_length_cm': 5.2, 'petal_width_cm': 2.0}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 6.2, 'sepal_width_cm': 3.4, 'petal_length_cm': 5.4, 'petal_width_cm': 2.3}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n", "{'sepal_length_cm': 5.9, 'sepal_width_cm': 3.0, 'petal_length_cm': 5.1, 'petal_width_cm': 1.8}\n", "{'data': {'class': 'virginica'}, 'message': 'Success'}\n" ] } ], "source": [ "healthy = requests.get(f\"{url}/health\")\n", "if healthy.ok:\n", "    for k in range(data_mod.shape[0]):\n", "        # Build a payload from each row of the modified dataset\n", "        payload = dict(zip(labels, data_mod.iloc[:, 0:4].values[k]))\n", "        print(payload)\n", "        response = requests.post(f\"{url}/predict\", json=payload)\n", "        if response.ok:\n", "            print(response.json())\n", "        time.sleep(random.randrange(2, 10))" ] }, { "attachments": {}, "cell_type": "markdown", "metadata": { "collapsed": false, "pycharm": { "name": "#%% md\n" } }, "source": [ "# Viewing the Data\n", "\n", "This app defaults to a local file writer. You can look at the logs inside the Docker container by running `docker exec -it <container_id> bash` (get the container id from `docker ps`); the profiles are all written under `/logs`.\n", "\n", "While there, you can visualize the results by following the instructions in the [\"Notebook Profile Visualizer\"](https://github.com/whylabs/whylogs/blob/9618e5dd6570bc484579ec1325f2f512ff56977f/python/examples/basic/Notebook_Profile_Visualizer.ipynb) example. Some more information can be found in the [\"Merging Profiles\"](https://github.com/whylabs/whylogs/blob/9618e5dd6570bc484579ec1325f2f512ff56977f/python/examples/basic/Merging_Profiles.ipynb) and [\"Streaming_Data_with_Log_Rotation\"](https://github.com/whylabs/whylogs/blob/mainline/python/examples/advanced/Log_Rotation_for_Streaming_Data/Streaming_Data_with_Log_Rotation.ipynb) notebooks.\n", "\n", "## Using WhyLabs\n", "As we talked about in the overview, you can also send these profiles to your WhyLabs observability dashboard, which automatically receives them and merges them into useful daily analyses. Follow the directions in \"WhyLabs Writer\" to change the `.env` file in this directory, then rerun the cells above. This time the requests will populate your WhyLabs dataset so you can see the drift on the WhyLabs portal. 
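\n",
"\n",
"You can also load one of the rolled-over profiles back into this notebook. The snippet below is a minimal sketch, assuming the whylogs v1 API and the local writer; the `.bin` filename is hypothetical and depends on when your logger rolled over:\n",
"\n",
"```python\n",
"import whylogs as why\n",
"from whylogs.viz import NotebookProfileVisualizer\n",
"\n",
"# Hypothetical path: the rolling logger writes timestamped .bin files under /logs\n",
"profile = why.read(\"logs/inference_profile.2023-01-01_12-00-00.bin\")\n",
"\n",
"viz = NotebookProfileVisualizer()\n",
"viz.set_profiles(target_profile_view=profile.view())\n",
"viz.profile_summary()\n",
"```\n",
"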
\n", "\n", "Included in this notebook's directory is a `.env` file with the needed variables set to None. \n", "\n", "![WhyLabs Profile](./assets/WhyLabs_profile.png)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "collapsed": false, "pycharm": { "name": "#%%\n" } }, "outputs": [], "source": [] } ], "metadata": { "kernelspec": { "display_name": ".venv", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 2 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython2", "version": "3.8.10" }, "vscode": { "interpreter": { "hash": "5dd5901cadfd4b29c2aaf95ecd29c0c3b10829ad94dcfe59437dbee391154aea" } } }, "nbformat": 4, "nbformat_minor": 0 }