{ "nbformat": 4, "nbformat_minor": 0, "metadata": { "kernelspec": { "display_name": "TensorFlow 2.3 on Python 3.6 (CUDA 10.1)", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.6.9" }, "colab": { "name": "10-3.vggnet_in_keras.ipynb", "provenance": [] }, "accelerator": "GPU" }, "cells": [ { "cell_type": "markdown", "metadata": { "id": "BZhefSFHEjMr" }, "source": [ "# 케라스로 VGGNet 만들기" ] }, { "cell_type": "markdown", "metadata": { "id": "2FvQC3sqEjMv" }, "source": [ "이 노트북에서 \"매우 깊은\" 합성곱 신경망 [VGGNet](https://arxiv.org/pdf/1409.1556.pdf)과 비슷한 모델을 훈련하여 옥스포드 꽃 데이터셋을 17개 카테고리로 분류합니다." ] }, { "cell_type": "markdown", "metadata": { "id": "MSxfpICvEjMv" }, "source": [ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/rickiepark/dl-illustrated/blob/master/notebooks/10-3.vggnet_in_keras.ipynb)" ] }, { "cell_type": "markdown", "metadata": { "id": "P1Jq5ISLEjMw" }, "source": [ "#### 동일하게 재현하기 위해 랜덤 시드 값을 지정합니다." ] }, { "cell_type": "code", "metadata": { "id": "USDP6PrYEjMw" }, "source": [ "import numpy as np\n", "np.random.seed(42)" ], "execution_count": 1, "outputs": [] }, { "cell_type": "markdown", "metadata": { "id": "3hwK7zQXEjMw" }, "source": [ "#### 라이브러리를 적재합니다." ] }, { "cell_type": "code", "metadata": { "id": "u7A2Q8BGEjMx" }, "source": [ "from tensorflow import keras\n", "from tensorflow.keras.models import Sequential\n", "from tensorflow.keras.layers import Dense, Dropout, Flatten, Conv2D, MaxPooling2D\n", "from tensorflow.keras.layers import BatchNormalization\n", "from tensorflow.keras.callbacks import TensorBoard " ], "execution_count": 2, "outputs": [] }, { "cell_type": "markdown", "metadata": { "id": "Mwd40wPgEjMx" }, "source": [ "#### 데이터를 적재하고 전처리합니다." ] }, { "cell_type": "markdown", "metadata": { "id": "ZGAyGZ6dEjMx" }, "source": [ "이미 `oxflower17.npz` 파일을 다운받았다고 가정합니다. 이 파일이 없다면 `10-2.alexnet_in_keras.ipynb` 노트북을 다시 실행하세요." ] }, { "cell_type": "code", "metadata": { "id": "StogDyZVEwWw", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "5d88053a-3853-4d5a-cb3a-8d43c4374f45" }, "source": [ "# 코랩을 사용할 경우 다음 셀을 실행하세요.\n", "!rm oxflower17*\n", "!wget https://bit.ly/36QytdH -O oxflower17.npz" ], "execution_count": 3, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "rm: cannot remove 'oxflower17*': No such file or directory\n", "--2022-12-05 14:17:30-- https://bit.ly/36QytdH\n", "Resolving bit.ly (bit.ly)... 67.199.248.10, 67.199.248.11\n", "Connecting to bit.ly (bit.ly)|67.199.248.10|:443... connected.\n", "HTTP request sent, awaiting response... 301 Moved Permanently\n", "Location: https://onedrive.live.com/download?cid=822579D69D2DC3B5&resid=822579D69D2DC3B5!597859&authkey=AGd0CpvKFkK8GtE [following]\n", "--2022-12-05 14:17:31-- https://onedrive.live.com/download?cid=822579D69D2DC3B5&resid=822579D69D2DC3B5!597859&authkey=AGd0CpvKFkK8GtE\n", "Resolving onedrive.live.com (onedrive.live.com)... 13.107.43.13\n", "Connecting to onedrive.live.com (onedrive.live.com)|13.107.43.13|:443... connected.\n", "HTTP request sent, awaiting response... 
302 Found\n", "Location: https://57ucia.bl.files.1drv.com/y4mwsCeYKx-CkygdaC_AgNBSRavTkON1bJosV1qsqIgK4FmXOSxDzwUZP_QKqFEyAwN-9Acv2_g1Zm_Xb0q2y1HHqWZFlp9TTuP6tNE2_ppV8POZDV0NrqrkzbZmj0N5Dv0VXZ7mjeI5md5zohP6UYjakliqfzMK9drrL3oU8wI6gRkgd-Ut7pP1f32l7DSAengF_MEScPyAhXMV_Wg56vFJw/oxflower17.npz?download&psid=1 [following]\n", "--2022-12-05 14:17:31-- https://57ucia.bl.files.1drv.com/y4mwsCeYKx-CkygdaC_AgNBSRavTkON1bJosV1qsqIgK4FmXOSxDzwUZP_QKqFEyAwN-9Acv2_g1Zm_Xb0q2y1HHqWZFlp9TTuP6tNE2_ppV8POZDV0NrqrkzbZmj0N5Dv0VXZ7mjeI5md5zohP6UYjakliqfzMK9drrL3oU8wI6gRkgd-Ut7pP1f32l7DSAengF_MEScPyAhXMV_Wg56vFJw/oxflower17.npz?download&psid=1\n", "Resolving 57ucia.bl.files.1drv.com (57ucia.bl.files.1drv.com)... 13.107.43.12\n", "Connecting to 57ucia.bl.files.1drv.com (57ucia.bl.files.1drv.com)|13.107.43.12|:443... connected.\n", "HTTP request sent, awaiting response... 200 OK\n", "Length: 252415092 (241M) [application/zip]\n", "Saving to: ‘oxflower17.npz’\n", "\n", "oxflower17.npz 100%[===================>] 240.72M 28.3MB/s in 12s \n", "\n", "2022-12-05 14:17:44 (19.7 MB/s) - ‘oxflower17.npz’ saved [252415092/252415092]\n", "\n" ] } ] }, { "cell_type": "code", "metadata": { "id": "OUxE5xw6eSmC", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "3dd457d3-b717-4dd3-c18d-6c93a68c0708" }, "source": [ "ls -al" ], "execution_count": 4, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "total 246516\n", "drwxr-xr-x 1 root root 4096 Dec 5 14:17 \u001b[0m\u001b[01;34m.\u001b[0m/\n", "drwxr-xr-x 1 root root 4096 Dec 5 14:17 \u001b[01;34m..\u001b[0m/\n", "drwxr-xr-x 4 root root 4096 Dec 1 20:07 \u001b[01;34m.config\u001b[0m/\n", "-rw-r--r-- 1 root root 252415092 Feb 7 2021 oxflower17.npz\n", "drwxr-xr-x 1 root root 4096 Dec 1 20:08 \u001b[01;34msample_data\u001b[0m/\n" ] } ] }, { "cell_type": "code", "metadata": { "id": "VGZXYi5HEjMx" }, "source": [ "import numpy as np\n", "\n", "data = np.load('oxflower17.npz', allow_pickle=True)\n", "X = data['X']\n", "Y = data['Y']" ], "execution_count": 5, "outputs": [] }, { "cell_type": "markdown", "metadata": { "id": "gaYGGHH9EjMx" }, "source": [ "#### 신경망 모델을 만듭니다." 
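, "\n", "\n", "The Keras default for `Conv2D` is `padding='valid'`, so each 3×3 convolution below trims the feature map slightly; the original VGGNet pads its 3×3 convolutions so that only the pooling layers reduce spatial resolution. As a minimal, hypothetical sketch (not what the next cell uses), a `'same'`-padded first block could look like this:\n", "\n", "```python\n", "from tensorflow.keras.models import Sequential\n", "from tensorflow.keras.layers import Conv2D, MaxPooling2D, BatchNormalization\n", "\n", "# Sketch only: a 'same'-padded variant of the first VGG-style block, closer to\n", "# the original paper, in which only pooling halves the spatial resolution.\n", "padded_block = Sequential([\n", "    Conv2D(64, 3, activation='relu', padding='same', input_shape=(224, 224, 3)),\n", "    Conv2D(64, 3, activation='relu', padding='same'),\n", "    MaxPooling2D(2, 2),\n", "    BatchNormalization(),\n", "])\n", "```"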
] }, { "cell_type": "code", "metadata": { "id": "0Q3DqY9IEjMy" }, "source": [ "model = Sequential()\n", "\n", "model.add(Conv2D(64, 3, activation='relu', input_shape=(224, 224, 3)))\n", "model.add(Conv2D(64, 3, activation='relu'))\n", "model.add(MaxPooling2D(2, 2))\n", "model.add(BatchNormalization())\n", "\n", "model.add(Conv2D(128, 3, activation='relu'))\n", "model.add(Conv2D(128, 3, activation='relu'))\n", "model.add(MaxPooling2D(2, 2))\n", "model.add(BatchNormalization())\n", "\n", "model.add(Conv2D(256, 3, activation='relu'))\n", "model.add(Conv2D(256, 3, activation='relu'))\n", "model.add(Conv2D(256, 3, activation='relu'))\n", "model.add(MaxPooling2D(2, 2))\n", "model.add(BatchNormalization())\n", "\n", "model.add(Conv2D(512, 3, activation='relu'))\n", "model.add(Conv2D(512, 3, activation='relu'))\n", "model.add(Conv2D(512, 3, activation='relu'))\n", "model.add(MaxPooling2D(2, 2))\n", "model.add(BatchNormalization())\n", "\n", "model.add(Conv2D(512, 3, activation='relu'))\n", "model.add(Conv2D(512, 3, activation='relu'))\n", "model.add(Conv2D(512, 3, activation='relu'))\n", "model.add(MaxPooling2D(2, 2))\n", "model.add(BatchNormalization())\n", "\n", "model.add(Flatten())\n", "model.add(Dense(4096, activation='relu'))\n", "model.add(Dropout(0.5))\n", "model.add(Dense(4096, activation='relu'))\n", "model.add(Dropout(0.5))\n", "\n", "model.add(Dense(17, activation='softmax'))" ], "execution_count": 6, "outputs": [] }, { "cell_type": "code", "metadata": { "id": "d3si-SqCEjMy", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "0163ff89-865d-4214-fbe6-e7934256ba32" }, "source": [ "model.summary()" ], "execution_count": 7, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "Model: \"sequential\"\n", "_________________________________________________________________\n", " Layer (type) Output Shape Param # \n", "=================================================================\n", " conv2d (Conv2D) (None, 222, 222, 64) 1792 \n", " \n", " conv2d_1 (Conv2D) (None, 220, 220, 64) 36928 \n", " \n", " max_pooling2d (MaxPooling2D (None, 110, 110, 64) 0 \n", " ) \n", " \n", " batch_normalization (BatchN (None, 110, 110, 64) 256 \n", " ormalization) \n", " \n", " conv2d_2 (Conv2D) (None, 108, 108, 128) 73856 \n", " \n", " conv2d_3 (Conv2D) (None, 106, 106, 128) 147584 \n", " \n", " max_pooling2d_1 (MaxPooling (None, 53, 53, 128) 0 \n", " 2D) \n", " \n", " batch_normalization_1 (Batc (None, 53, 53, 128) 512 \n", " hNormalization) \n", " \n", " conv2d_4 (Conv2D) (None, 51, 51, 256) 295168 \n", " \n", " conv2d_5 (Conv2D) (None, 49, 49, 256) 590080 \n", " \n", " conv2d_6 (Conv2D) (None, 47, 47, 256) 590080 \n", " \n", " max_pooling2d_2 (MaxPooling (None, 23, 23, 256) 0 \n", " 2D) \n", " \n", " batch_normalization_2 (Batc (None, 23, 23, 256) 1024 \n", " hNormalization) \n", " \n", " conv2d_7 (Conv2D) (None, 21, 21, 512) 1180160 \n", " \n", " conv2d_8 (Conv2D) (None, 19, 19, 512) 2359808 \n", " \n", " conv2d_9 (Conv2D) (None, 17, 17, 512) 2359808 \n", " \n", " max_pooling2d_3 (MaxPooling (None, 8, 8, 512) 0 \n", " 2D) \n", " \n", " batch_normalization_3 (Batc (None, 8, 8, 512) 2048 \n", " hNormalization) \n", " \n", " conv2d_10 (Conv2D) (None, 6, 6, 512) 2359808 \n", " \n", " conv2d_11 (Conv2D) (None, 4, 4, 512) 2359808 \n", " \n", " conv2d_12 (Conv2D) (None, 2, 2, 512) 2359808 \n", " \n", " max_pooling2d_4 (MaxPooling (None, 1, 1, 512) 0 \n", " 2D) \n", " \n", " batch_normalization_4 (Batc (None, 1, 1, 512) 2048 \n", " hNormalization) \n", " \n", " flatten (Flatten) (None, 
512) 0 \n", " \n", " dense (Dense) (None, 4096) 2101248 \n", " \n", " dropout (Dropout) (None, 4096) 0 \n", " \n", " dense_1 (Dense) (None, 4096) 16781312 \n", " \n", " dropout_1 (Dropout) (None, 4096) 0 \n", " \n", " dense_2 (Dense) (None, 17) 69649 \n", " \n", "=================================================================\n", "Total params: 33,672,785\n", "Trainable params: 33,669,841\n", "Non-trainable params: 2,944\n", "_________________________________________________________________\n" ] } ] }, { "cell_type": "markdown", "metadata": { "id": "tdcm6SWdEjMz" }, "source": [ "#### Configure the model." ] }, { "cell_type": "code", "metadata": { "id": "UF792LT3EjMz" }, "source": [ "model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])" ], "execution_count": 8, "outputs": [] }, { "cell_type": "markdown", "metadata": { "id": "3Ic6S4HFEjMz" }, "source": [ "#### Configure TensorBoard." ] }, { "cell_type": "code", "metadata": { "id": "Z2MbZ9eFEjMz" }, "source": [ "tensorbrd = TensorBoard('logs/vggnet')" ], "execution_count": 9, "outputs": [] }, { "cell_type": "markdown", "metadata": { "id": "pCOavb4hEjMz" }, "source": [ "#### Train!" ] }, { "cell_type": "code", "metadata": { "id": "47m02kWMEjM0", "colab": { "base_uri": "https://localhost:8080/" }, "outputId": "3fd26ac7-8a81-427c-a5dd-a5044f5f204c" }, "source": [ "model.fit(X, Y, batch_size=64, epochs=250, verbose=1, validation_split=0.1, shuffle=True, callbacks=[tensorbrd])" ], "execution_count": 10, "outputs": [ { "output_type": "stream", "name": "stdout", "text": [ "Epoch 1/250\n", "20/20 [==============================] - 60s 2s/step - loss: 3.2662 - accuracy: 0.1283 - val_loss: 10.0563 - val_accuracy: 0.0441\n", "Epoch 2/250\n", "20/20 [==============================] - 14s 713ms/step - loss: 2.8892 - accuracy: 0.1961 - val_loss: 8.2696 - val_accuracy: 0.0368\n", "Epoch 3/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 2.7490 - accuracy: 0.2435 - val_loss: 5.8112 - val_accuracy: 0.0809\n", "Epoch 4/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 2.9122 - accuracy: 0.1520 - val_loss: 3.4217 - val_accuracy: 0.1765\n", "Epoch 5/250\n", "20/20 [==============================] - 15s 750ms/step - loss: 2.2502 - accuracy: 0.2394 - val_loss: 3.7264 - val_accuracy: 0.1176\n", "Epoch 6/250\n", "20/20 [==============================] - 15s 743ms/step - loss: 2.1105 - accuracy: 0.2606 - val_loss: 3.6896 - val_accuracy: 0.1324\n", "Epoch 7/250\n", "20/20 [==============================] - 15s 735ms/step - loss: 1.9533 - accuracy: 0.3039 - val_loss: 2.3240 - val_accuracy: 0.2132\n", "Epoch 8/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 1.9497 - accuracy: 0.3154 - val_loss: 4.7434 - val_accuracy: 0.1029\n", "Epoch 9/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 1.8397 - accuracy: 0.3317 - val_loss: 3.1084 - val_accuracy: 0.2132\n", "Epoch 10/250\n", "20/20 [==============================] - 15s 731ms/step - loss: 1.8157 - accuracy: 0.3268 - val_loss: 2.4423 - val_accuracy: 0.2059\n", "Epoch 11/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 1.7919 - accuracy: 0.3464 - val_loss: 2.0414 - val_accuracy: 0.3015\n", "Epoch 12/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 1.6605 - accuracy: 0.4093 - val_loss: 2.9098 - val_accuracy: 0.2206\n", "Epoch 13/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 1.7663 - accuracy: 0.3668 - val_loss: 2.9539 - val_accuracy: 0.2279\n", 
"Epoch 14/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 1.7100 - accuracy: 0.3750 - val_loss: 2.3156 - val_accuracy: 0.2574\n", "Epoch 15/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 1.7604 - accuracy: 0.3685 - val_loss: 2.5862 - val_accuracy: 0.2721\n", "Epoch 16/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 1.6886 - accuracy: 0.3807 - val_loss: 2.8375 - val_accuracy: 0.1985\n", "Epoch 17/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 1.6031 - accuracy: 0.4126 - val_loss: 2.4606 - val_accuracy: 0.3309\n", "Epoch 18/250\n", "20/20 [==============================] - 15s 731ms/step - loss: 1.5542 - accuracy: 0.4216 - val_loss: 2.3299 - val_accuracy: 0.3162\n", "Epoch 19/250\n", "20/20 [==============================] - 15s 739ms/step - loss: 1.5380 - accuracy: 0.4338 - val_loss: 6.1641 - val_accuracy: 0.0956\n", "Epoch 20/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 1.6000 - accuracy: 0.4289 - val_loss: 1.9594 - val_accuracy: 0.4044\n", "Epoch 21/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 1.5976 - accuracy: 0.4150 - val_loss: 2.1397 - val_accuracy: 0.3897\n", "Epoch 22/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 1.5957 - accuracy: 0.4150 - val_loss: 2.5208 - val_accuracy: 0.3750\n", "Epoch 23/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 1.4529 - accuracy: 0.4592 - val_loss: 2.0267 - val_accuracy: 0.3897\n", "Epoch 24/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 1.3847 - accuracy: 0.4943 - val_loss: 2.7812 - val_accuracy: 0.3529\n", "Epoch 25/250\n", "20/20 [==============================] - 15s 735ms/step - loss: 1.3626 - accuracy: 0.5139 - val_loss: 2.0514 - val_accuracy: 0.4265\n", "Epoch 26/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 1.3875 - accuracy: 0.4894 - val_loss: 5.7763 - val_accuracy: 0.1765\n", "Epoch 27/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 1.4368 - accuracy: 0.4820 - val_loss: 2.3196 - val_accuracy: 0.3529\n", "Epoch 28/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 1.5101 - accuracy: 0.4698 - val_loss: 7.4431 - val_accuracy: 0.1471\n", "Epoch 29/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 1.3423 - accuracy: 0.5172 - val_loss: 6.5513 - val_accuracy: 0.2794\n", "Epoch 30/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 1.2838 - accuracy: 0.5523 - val_loss: 2.8749 - val_accuracy: 0.3162\n", "Epoch 31/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 1.4849 - accuracy: 0.4502 - val_loss: 3.3545 - val_accuracy: 0.2500\n", "Epoch 32/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 1.4781 - accuracy: 0.4730 - val_loss: 5.2195 - val_accuracy: 0.2132\n", "Epoch 33/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 1.2084 - accuracy: 0.5629 - val_loss: 2.1107 - val_accuracy: 0.3971\n", "Epoch 34/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 1.2114 - accuracy: 0.5711 - val_loss: 5.1488 - val_accuracy: 0.1765\n", "Epoch 35/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 1.3754 - accuracy: 0.5253 - val_loss: 4.6591 - val_accuracy: 0.3015\n", "Epoch 36/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 1.2198 - accuracy: 0.5605 - val_loss: 2.7805 - val_accuracy: 
0.3824\n", "Epoch 37/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 1.2980 - accuracy: 0.5580 - val_loss: 3.3633 - val_accuracy: 0.3603\n", "Epoch 38/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 1.1182 - accuracy: 0.6070 - val_loss: 4.4305 - val_accuracy: 0.3309\n", "Epoch 39/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 1.2001 - accuracy: 0.5850 - val_loss: 3.3975 - val_accuracy: 0.3529\n", "Epoch 40/250\n", "20/20 [==============================] - 15s 735ms/step - loss: 1.1188 - accuracy: 0.6062 - val_loss: 1.7927 - val_accuracy: 0.5368\n", "Epoch 41/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 1.0248 - accuracy: 0.6389 - val_loss: 3.1558 - val_accuracy: 0.4118\n", "Epoch 42/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 1.0201 - accuracy: 0.6626 - val_loss: 1.8570 - val_accuracy: 0.5515\n", "Epoch 43/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 1.0910 - accuracy: 0.6340 - val_loss: 3.2766 - val_accuracy: 0.4044\n", "Epoch 44/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.8742 - accuracy: 0.6895 - val_loss: 2.0596 - val_accuracy: 0.4706\n", "Epoch 45/250\n", "20/20 [==============================] - 15s 726ms/step - loss: 0.9692 - accuracy: 0.6601 - val_loss: 3.2208 - val_accuracy: 0.4265\n", "Epoch 46/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.8393 - accuracy: 0.7018 - val_loss: 2.1105 - val_accuracy: 0.4853\n", "Epoch 47/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.8139 - accuracy: 0.7206 - val_loss: 4.6318 - val_accuracy: 0.3456\n", "Epoch 48/250\n", "20/20 [==============================] - 15s 735ms/step - loss: 0.7749 - accuracy: 0.7206 - val_loss: 3.4388 - val_accuracy: 0.4118\n", "Epoch 49/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.7540 - accuracy: 0.7467 - val_loss: 2.5389 - val_accuracy: 0.4118\n", "Epoch 50/250\n", "20/20 [==============================] - 15s 731ms/step - loss: 0.7877 - accuracy: 0.7377 - val_loss: 3.2288 - val_accuracy: 0.4412\n", "Epoch 51/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.7410 - accuracy: 0.7377 - val_loss: 2.1626 - val_accuracy: 0.5294\n", "Epoch 52/250\n", "20/20 [==============================] - 15s 739ms/step - loss: 0.6719 - accuracy: 0.7688 - val_loss: 1.9679 - val_accuracy: 0.5368\n", "Epoch 53/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.7928 - accuracy: 0.7198 - val_loss: 2.4499 - val_accuracy: 0.5588\n", "Epoch 54/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.8884 - accuracy: 0.6977 - val_loss: 1.6234 - val_accuracy: 0.5882\n", "Epoch 55/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.6612 - accuracy: 0.7802 - val_loss: 2.7018 - val_accuracy: 0.5147\n", "Epoch 56/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.5996 - accuracy: 0.7998 - val_loss: 2.3317 - val_accuracy: 0.5294\n", "Epoch 57/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.8003 - accuracy: 0.7337 - val_loss: 3.2246 - val_accuracy: 0.4485\n", "Epoch 58/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.7348 - accuracy: 0.7402 - val_loss: 2.1631 - val_accuracy: 0.5735\n", "Epoch 59/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.6789 - accuracy: 0.7688 - val_loss: 2.4024 - 
val_accuracy: 0.5368\n", "Epoch 60/250\n", "20/20 [==============================] - 15s 731ms/step - loss: 0.5542 - accuracy: 0.8178 - val_loss: 3.2382 - val_accuracy: 0.5441\n", "Epoch 61/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.6183 - accuracy: 0.8031 - val_loss: 3.0212 - val_accuracy: 0.4779\n", "Epoch 62/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.5349 - accuracy: 0.8284 - val_loss: 2.9785 - val_accuracy: 0.5662\n", "Epoch 63/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.8073 - accuracy: 0.7426 - val_loss: 4.6083 - val_accuracy: 0.3235\n", "Epoch 64/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.6623 - accuracy: 0.7876 - val_loss: 3.1860 - val_accuracy: 0.4706\n", "Epoch 65/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.9305 - accuracy: 0.7312 - val_loss: 27.2112 - val_accuracy: 0.0662\n", "Epoch 66/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 1.5177 - accuracy: 0.5057 - val_loss: 39.9631 - val_accuracy: 0.1618\n", "Epoch 67/250\n", "20/20 [==============================] - 15s 734ms/step - loss: 1.0039 - accuracy: 0.6797 - val_loss: 7.9617 - val_accuracy: 0.2574\n", "Epoch 68/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.7978 - accuracy: 0.7328 - val_loss: 5.4666 - val_accuracy: 0.2941\n", "Epoch 69/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.7404 - accuracy: 0.7459 - val_loss: 5.2302 - val_accuracy: 0.3603\n", "Epoch 70/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.6279 - accuracy: 0.7851 - val_loss: 2.3682 - val_accuracy: 0.5368\n", "Epoch 71/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.5185 - accuracy: 0.8317 - val_loss: 2.6149 - val_accuracy: 0.5147\n", "Epoch 72/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.5704 - accuracy: 0.8137 - val_loss: 2.2288 - val_accuracy: 0.5294\n", "Epoch 73/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.5701 - accuracy: 0.8219 - val_loss: 3.3443 - val_accuracy: 0.4853\n", "Epoch 74/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.4282 - accuracy: 0.8595 - val_loss: 2.7132 - val_accuracy: 0.5074\n", "Epoch 75/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.5848 - accuracy: 0.8301 - val_loss: 2.4077 - val_accuracy: 0.5441\n", "Epoch 76/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.6240 - accuracy: 0.7925 - val_loss: 3.4086 - val_accuracy: 0.4853\n", "Epoch 77/250\n", "20/20 [==============================] - 15s 739ms/step - loss: 0.5727 - accuracy: 0.8227 - val_loss: 3.5711 - val_accuracy: 0.4779\n", "Epoch 78/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.5671 - accuracy: 0.8219 - val_loss: 2.9115 - val_accuracy: 0.4485\n", "Epoch 79/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.4861 - accuracy: 0.8513 - val_loss: 2.4437 - val_accuracy: 0.5662\n", "Epoch 80/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.4361 - accuracy: 0.8627 - val_loss: 2.0699 - val_accuracy: 0.6176\n", "Epoch 81/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.4324 - accuracy: 0.8546 - val_loss: 3.3301 - val_accuracy: 0.5074\n", "Epoch 82/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.4392 - accuracy: 0.8636 - 
val_loss: 3.3029 - val_accuracy: 0.5147\n", "Epoch 83/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.3867 - accuracy: 0.8709 - val_loss: 2.2068 - val_accuracy: 0.6397\n", "Epoch 84/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.5964 - accuracy: 0.8186 - val_loss: 2.6369 - val_accuracy: 0.3897\n", "Epoch 85/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.5212 - accuracy: 0.8309 - val_loss: 3.0138 - val_accuracy: 0.5294\n", "Epoch 86/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.3761 - accuracy: 0.8742 - val_loss: 2.4456 - val_accuracy: 0.6029\n", "Epoch 87/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.4721 - accuracy: 0.8538 - val_loss: 2.7435 - val_accuracy: 0.5074\n", "Epoch 88/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.3834 - accuracy: 0.8897 - val_loss: 2.5227 - val_accuracy: 0.6029\n", "Epoch 89/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.3329 - accuracy: 0.8946 - val_loss: 1.9727 - val_accuracy: 0.6397\n", "Epoch 90/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.3188 - accuracy: 0.9011 - val_loss: 2.2982 - val_accuracy: 0.6029\n", "Epoch 91/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.3258 - accuracy: 0.8979 - val_loss: 2.8195 - val_accuracy: 0.5368\n", "Epoch 92/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.2151 - accuracy: 0.9306 - val_loss: 2.5224 - val_accuracy: 0.5588\n", "Epoch 93/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.4403 - accuracy: 0.8668 - val_loss: 1.6825 - val_accuracy: 0.5588\n", "Epoch 94/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.3292 - accuracy: 0.8962 - val_loss: 2.1945 - val_accuracy: 0.6029\n", "Epoch 95/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.2402 - accuracy: 0.9322 - val_loss: 2.7774 - val_accuracy: 0.5809\n", "Epoch 96/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.3277 - accuracy: 0.8971 - val_loss: 2.0833 - val_accuracy: 0.5956\n", "Epoch 97/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.3645 - accuracy: 0.8938 - val_loss: 3.4449 - val_accuracy: 0.5956\n", "Epoch 98/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.2481 - accuracy: 0.9158 - val_loss: 4.1259 - val_accuracy: 0.5368\n", "Epoch 99/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.3086 - accuracy: 0.9183 - val_loss: 3.4797 - val_accuracy: 0.5956\n", "Epoch 100/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.3048 - accuracy: 0.9150 - val_loss: 2.7096 - val_accuracy: 0.6103\n", "Epoch 101/250\n", "20/20 [==============================] - 15s 740ms/step - loss: 0.2251 - accuracy: 0.9248 - val_loss: 2.2701 - val_accuracy: 0.6103\n", "Epoch 102/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.2167 - accuracy: 0.9346 - val_loss: 2.5304 - val_accuracy: 0.6176\n", "Epoch 103/250\n", "20/20 [==============================] - 15s 739ms/step - loss: 0.1994 - accuracy: 0.9461 - val_loss: 2.4895 - val_accuracy: 0.6250\n", "Epoch 104/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.2938 - accuracy: 0.9134 - val_loss: 3.7247 - val_accuracy: 0.4559\n", "Epoch 105/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.2316 - 
accuracy: 0.9322 - val_loss: 3.4561 - val_accuracy: 0.5662\n", "Epoch 106/250\n", "20/20 [==============================] - 15s 731ms/step - loss: 0.3199 - accuracy: 0.9093 - val_loss: 2.5353 - val_accuracy: 0.6250\n", "Epoch 107/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.6820 - accuracy: 0.8007 - val_loss: 2.6267 - val_accuracy: 0.5368\n", "Epoch 108/250\n", "20/20 [==============================] - 15s 731ms/step - loss: 0.4088 - accuracy: 0.8775 - val_loss: 3.6561 - val_accuracy: 0.5882\n", "Epoch 109/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.3424 - accuracy: 0.8971 - val_loss: 3.3292 - val_accuracy: 0.6029\n", "Epoch 110/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.2149 - accuracy: 0.9281 - val_loss: 2.7010 - val_accuracy: 0.5882\n", "Epoch 111/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.3383 - accuracy: 0.9199 - val_loss: 2.6733 - val_accuracy: 0.6397\n", "Epoch 112/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.2748 - accuracy: 0.9134 - val_loss: 2.6641 - val_accuracy: 0.6250\n", "Epoch 113/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.3857 - accuracy: 0.8930 - val_loss: 5.0618 - val_accuracy: 0.4412\n", "Epoch 114/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.1929 - accuracy: 0.9355 - val_loss: 2.6654 - val_accuracy: 0.6838\n", "Epoch 115/250\n", "20/20 [==============================] - 15s 739ms/step - loss: 0.1124 - accuracy: 0.9641 - val_loss: 2.5195 - val_accuracy: 0.6250\n", "Epoch 116/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.2161 - accuracy: 0.9412 - val_loss: 3.1727 - val_accuracy: 0.5515\n", "Epoch 117/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.5466 - accuracy: 0.8423 - val_loss: 12.7212 - val_accuracy: 0.3824\n", "Epoch 118/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.4371 - accuracy: 0.8717 - val_loss: 4.8675 - val_accuracy: 0.4706\n", "Epoch 119/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.2553 - accuracy: 0.9297 - val_loss: 5.2941 - val_accuracy: 0.5074\n", "Epoch 120/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.5379 - accuracy: 0.8652 - val_loss: 3.9211 - val_accuracy: 0.5000\n", "Epoch 121/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.6321 - accuracy: 0.8219 - val_loss: 3.7585 - val_accuracy: 0.4338\n", "Epoch 122/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.5189 - accuracy: 0.8489 - val_loss: 3.3254 - val_accuracy: 0.5294\n", "Epoch 123/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.3651 - accuracy: 0.8856 - val_loss: 3.4844 - val_accuracy: 0.5515\n", "Epoch 124/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.2544 - accuracy: 0.9158 - val_loss: 2.5919 - val_accuracy: 0.6176\n", "Epoch 125/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.2967 - accuracy: 0.9183 - val_loss: 2.8812 - val_accuracy: 0.5735\n", "Epoch 126/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.1552 - accuracy: 0.9493 - val_loss: 2.5907 - val_accuracy: 0.5809\n", "Epoch 127/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.2166 - accuracy: 0.9363 - val_loss: 2.6084 - val_accuracy: 0.6618\n", "Epoch 128/250\n", "20/20 [==============================] 
- 15s 728ms/step - loss: 0.2143 - accuracy: 0.9371 - val_loss: 2.8957 - val_accuracy: 0.6103\n", "Epoch 129/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.1722 - accuracy: 0.9477 - val_loss: 2.5440 - val_accuracy: 0.6691\n", "Epoch 130/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.2072 - accuracy: 0.9453 - val_loss: 2.4369 - val_accuracy: 0.5588\n", "Epoch 131/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.2161 - accuracy: 0.9322 - val_loss: 2.3118 - val_accuracy: 0.6029\n", "Epoch 132/250\n", "20/20 [==============================] - 15s 735ms/step - loss: 0.3386 - accuracy: 0.9028 - val_loss: 2.2970 - val_accuracy: 0.6176\n", "Epoch 133/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.2233 - accuracy: 0.9314 - val_loss: 4.4512 - val_accuracy: 0.5147\n", "Epoch 134/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.2226 - accuracy: 0.9322 - val_loss: 3.4755 - val_accuracy: 0.6103\n", "Epoch 135/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.2066 - accuracy: 0.9404 - val_loss: 4.4259 - val_accuracy: 0.5735\n", "Epoch 136/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.2263 - accuracy: 0.9248 - val_loss: 2.9096 - val_accuracy: 0.6471\n", "Epoch 137/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.1498 - accuracy: 0.9542 - val_loss: 6.1692 - val_accuracy: 0.5809\n", "Epoch 138/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.1396 - accuracy: 0.9714 - val_loss: 2.9781 - val_accuracy: 0.6691\n", "Epoch 139/250\n", "20/20 [==============================] - 15s 735ms/step - loss: 0.1784 - accuracy: 0.9534 - val_loss: 2.5399 - val_accuracy: 0.6691\n", "Epoch 140/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.2872 - accuracy: 0.9134 - val_loss: 2.7547 - val_accuracy: 0.6250\n", "Epoch 141/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.2601 - accuracy: 0.9322 - val_loss: 2.6683 - val_accuracy: 0.6618\n", "Epoch 142/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.2106 - accuracy: 0.9371 - val_loss: 2.9890 - val_accuracy: 0.5882\n", "Epoch 143/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.1632 - accuracy: 0.9526 - val_loss: 2.4284 - val_accuracy: 0.6765\n", "Epoch 144/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.1242 - accuracy: 0.9706 - val_loss: 2.7937 - val_accuracy: 0.6176\n", "Epoch 145/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.1971 - accuracy: 0.9444 - val_loss: 4.1914 - val_accuracy: 0.5882\n", "Epoch 146/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.1343 - accuracy: 0.9665 - val_loss: 3.4227 - val_accuracy: 0.6471\n", "Epoch 147/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.2261 - accuracy: 0.9485 - val_loss: 3.5983 - val_accuracy: 0.5809\n", "Epoch 148/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.4567 - accuracy: 0.8824 - val_loss: 3.1307 - val_accuracy: 0.5956\n", "Epoch 149/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.2179 - accuracy: 0.9363 - val_loss: 2.6638 - val_accuracy: 0.6544\n", "Epoch 150/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.4942 - accuracy: 0.8725 - val_loss: 3.1587 - val_accuracy: 0.4779\n", "Epoch 151/250\n", "20/20 
[==============================] - 15s 728ms/step - loss: 0.4567 - accuracy: 0.8627 - val_loss: 2.6603 - val_accuracy: 0.5662\n", "Epoch 152/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 1.1382 - accuracy: 0.6863 - val_loss: 4.3152 - val_accuracy: 0.4044\n", "Epoch 153/250\n", "20/20 [==============================] - 15s 731ms/step - loss: 0.4697 - accuracy: 0.8489 - val_loss: 4.2249 - val_accuracy: 0.4559\n", "Epoch 154/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.2749 - accuracy: 0.9126 - val_loss: 5.0660 - val_accuracy: 0.4412\n", "Epoch 155/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.2565 - accuracy: 0.9232 - val_loss: 3.9567 - val_accuracy: 0.4559\n", "Epoch 156/250\n", "20/20 [==============================] - 15s 735ms/step - loss: 0.4221 - accuracy: 0.8832 - val_loss: 3.9464 - val_accuracy: 0.4632\n", "Epoch 157/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.2936 - accuracy: 0.9167 - val_loss: 2.9204 - val_accuracy: 0.5956\n", "Epoch 158/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.1349 - accuracy: 0.9583 - val_loss: 3.0951 - val_accuracy: 0.5735\n", "Epoch 159/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.2387 - accuracy: 0.9338 - val_loss: 2.6730 - val_accuracy: 0.5588\n", "Epoch 160/250\n", "20/20 [==============================] - 15s 735ms/step - loss: 0.1126 - accuracy: 0.9632 - val_loss: 2.5732 - val_accuracy: 0.6471\n", "Epoch 161/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.0850 - accuracy: 0.9714 - val_loss: 2.9416 - val_accuracy: 0.6176\n", "Epoch 162/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.0938 - accuracy: 0.9690 - val_loss: 2.7792 - val_accuracy: 0.6324\n", "Epoch 163/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.0877 - accuracy: 0.9722 - val_loss: 2.3340 - val_accuracy: 0.6324\n", "Epoch 164/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.5006 - accuracy: 0.8611 - val_loss: 6.7806 - val_accuracy: 0.3603\n", "Epoch 165/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 1.2129 - accuracy: 0.6846 - val_loss: 44.4650 - val_accuracy: 0.1544\n", "Epoch 166/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 1.2097 - accuracy: 0.6340 - val_loss: 22.3854 - val_accuracy: 0.2132\n", "Epoch 167/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.8357 - accuracy: 0.7337 - val_loss: 9.3122 - val_accuracy: 0.3162\n", "Epoch 168/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.9829 - accuracy: 0.6944 - val_loss: 4.6894 - val_accuracy: 0.3382\n", "Epoch 169/250\n", "20/20 [==============================] - 15s 735ms/step - loss: 0.5872 - accuracy: 0.8007 - val_loss: 2.5263 - val_accuracy: 0.5074\n", "Epoch 170/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.4276 - accuracy: 0.8521 - val_loss: 2.9490 - val_accuracy: 0.4559\n", "Epoch 171/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.3349 - accuracy: 0.8807 - val_loss: 2.2318 - val_accuracy: 0.6029\n", "Epoch 172/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.3334 - accuracy: 0.8938 - val_loss: 2.3285 - val_accuracy: 0.5441\n", "Epoch 173/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.3521 - accuracy: 0.8938 - val_loss: 2.4853 - val_accuracy: 
0.5368\n", "Epoch 174/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.3810 - accuracy: 0.8791 - val_loss: 2.6499 - val_accuracy: 0.5441\n", "Epoch 175/250\n", "20/20 [==============================] - 15s 739ms/step - loss: 0.4385 - accuracy: 0.8807 - val_loss: 2.8500 - val_accuracy: 0.5074\n", "Epoch 176/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.2017 - accuracy: 0.9395 - val_loss: 2.7507 - val_accuracy: 0.5735\n", "Epoch 177/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.2694 - accuracy: 0.9306 - val_loss: 3.3662 - val_accuracy: 0.5221\n", "Epoch 178/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.2765 - accuracy: 0.9199 - val_loss: 2.5776 - val_accuracy: 0.5588\n", "Epoch 179/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.1781 - accuracy: 0.9485 - val_loss: 2.0949 - val_accuracy: 0.6103\n", "Epoch 180/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.1503 - accuracy: 0.9526 - val_loss: 2.5389 - val_accuracy: 0.5809\n", "Epoch 181/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.1195 - accuracy: 0.9624 - val_loss: 2.4658 - val_accuracy: 0.5809\n", "Epoch 182/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.2646 - accuracy: 0.9199 - val_loss: 2.5406 - val_accuracy: 0.5662\n", "Epoch 183/250\n", "20/20 [==============================] - 15s 739ms/step - loss: 0.3142 - accuracy: 0.9101 - val_loss: 2.2599 - val_accuracy: 0.6176\n", "Epoch 184/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.3990 - accuracy: 0.8913 - val_loss: 3.7274 - val_accuracy: 0.4706\n", "Epoch 185/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.2269 - accuracy: 0.9379 - val_loss: 3.0041 - val_accuracy: 0.5956\n", "Epoch 186/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.1512 - accuracy: 0.9542 - val_loss: 2.4126 - val_accuracy: 0.6250\n", "Epoch 187/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.1259 - accuracy: 0.9616 - val_loss: 3.3927 - val_accuracy: 0.5441\n", "Epoch 188/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 1.0651 - accuracy: 0.7402 - val_loss: 7.3412 - val_accuracy: 0.2132\n", "Epoch 189/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.6672 - accuracy: 0.7982 - val_loss: 2.1885 - val_accuracy: 0.5074\n", "Epoch 190/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.3971 - accuracy: 0.8913 - val_loss: 2.3195 - val_accuracy: 0.5735\n", "Epoch 191/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.3038 - accuracy: 0.9109 - val_loss: 2.2835 - val_accuracy: 0.5882\n", "Epoch 192/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.2168 - accuracy: 0.9314 - val_loss: 2.2191 - val_accuracy: 0.6176\n", "Epoch 193/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.1414 - accuracy: 0.9542 - val_loss: 2.0841 - val_accuracy: 0.6103\n", "Epoch 194/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.1847 - accuracy: 0.9526 - val_loss: 2.3761 - val_accuracy: 0.5735\n", "Epoch 195/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.1652 - accuracy: 0.9592 - val_loss: 2.2241 - val_accuracy: 0.5956\n", "Epoch 196/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.1288 - accuracy: 0.9632 - 
val_loss: 3.1524 - val_accuracy: 0.5882\n", "Epoch 197/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.1374 - accuracy: 0.9681 - val_loss: 2.9827 - val_accuracy: 0.6250\n", "Epoch 198/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.1301 - accuracy: 0.9592 - val_loss: 3.0995 - val_accuracy: 0.6250\n", "Epoch 199/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.0832 - accuracy: 0.9722 - val_loss: 3.0490 - val_accuracy: 0.5956\n", "Epoch 200/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.1782 - accuracy: 0.9534 - val_loss: 3.2922 - val_accuracy: 0.5809\n", "Epoch 201/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.0896 - accuracy: 0.9747 - val_loss: 2.7171 - val_accuracy: 0.6324\n", "Epoch 202/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.1080 - accuracy: 0.9673 - val_loss: 2.9443 - val_accuracy: 0.6029\n", "Epoch 203/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.2232 - accuracy: 0.9477 - val_loss: 3.0214 - val_accuracy: 0.6103\n", "Epoch 204/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.1760 - accuracy: 0.9583 - val_loss: 2.6577 - val_accuracy: 0.6397\n", "Epoch 205/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.0930 - accuracy: 0.9730 - val_loss: 2.8268 - val_accuracy: 0.6176\n", "Epoch 206/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.1137 - accuracy: 0.9632 - val_loss: 3.4081 - val_accuracy: 0.6029\n", "Epoch 207/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.1062 - accuracy: 0.9714 - val_loss: 3.8998 - val_accuracy: 0.6397\n", "Epoch 208/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.0630 - accuracy: 0.9796 - val_loss: 2.6259 - val_accuracy: 0.6397\n", "Epoch 209/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.0512 - accuracy: 0.9902 - val_loss: 2.7408 - val_accuracy: 0.6103\n", "Epoch 210/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.2230 - accuracy: 0.9379 - val_loss: 5.9677 - val_accuracy: 0.3897\n", "Epoch 211/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.6728 - accuracy: 0.8178 - val_loss: 6.7094 - val_accuracy: 0.5074\n", "Epoch 212/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.4207 - accuracy: 0.8685 - val_loss: 4.2323 - val_accuracy: 0.5809\n", "Epoch 213/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.3048 - accuracy: 0.9036 - val_loss: 2.9643 - val_accuracy: 0.5515\n", "Epoch 214/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.2672 - accuracy: 0.9297 - val_loss: 2.3335 - val_accuracy: 0.6397\n", "Epoch 215/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.1939 - accuracy: 0.9436 - val_loss: 2.4418 - val_accuracy: 0.5662\n", "Epoch 216/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.2710 - accuracy: 0.9314 - val_loss: 2.7091 - val_accuracy: 0.5441\n", "Epoch 217/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.1613 - accuracy: 0.9551 - val_loss: 2.6761 - val_accuracy: 0.6397\n", "Epoch 218/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.2350 - accuracy: 0.9257 - val_loss: 2.9476 - val_accuracy: 0.6176\n", "Epoch 219/250\n", "20/20 [==============================] - 15s 736ms/step - 
loss: 0.1618 - accuracy: 0.9534 - val_loss: 4.7399 - val_accuracy: 0.5956\n", "Epoch 220/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.2992 - accuracy: 0.9167 - val_loss: 2.5617 - val_accuracy: 0.5735\n", "Epoch 221/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.2729 - accuracy: 0.9338 - val_loss: 2.9277 - val_accuracy: 0.5956\n", "Epoch 222/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.1159 - accuracy: 0.9616 - val_loss: 2.6784 - val_accuracy: 0.6397\n", "Epoch 223/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.1831 - accuracy: 0.9518 - val_loss: 4.0484 - val_accuracy: 0.5882\n", "Epoch 224/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.1747 - accuracy: 0.9632 - val_loss: 2.7723 - val_accuracy: 0.6250\n", "Epoch 225/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.1140 - accuracy: 0.9673 - val_loss: 3.3044 - val_accuracy: 0.6029\n", "Epoch 226/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.1380 - accuracy: 0.9600 - val_loss: 2.8139 - val_accuracy: 0.6324\n", "Epoch 227/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.1065 - accuracy: 0.9665 - val_loss: 3.0242 - val_accuracy: 0.6250\n", "Epoch 228/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.0922 - accuracy: 0.9755 - val_loss: 3.0599 - val_accuracy: 0.5956\n", "Epoch 229/250\n", "20/20 [==============================] - 15s 727ms/step - loss: 0.0526 - accuracy: 0.9877 - val_loss: 2.8209 - val_accuracy: 0.6544\n", "Epoch 230/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.0211 - accuracy: 0.9959 - val_loss: 2.8154 - val_accuracy: 0.6912\n", "Epoch 231/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.0679 - accuracy: 0.9812 - val_loss: 2.9188 - val_accuracy: 0.6618\n", "Epoch 232/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.0305 - accuracy: 0.9918 - val_loss: 3.1998 - val_accuracy: 0.6544\n", "Epoch 233/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.1122 - accuracy: 0.9673 - val_loss: 3.4848 - val_accuracy: 0.6250\n", "Epoch 234/250\n", "20/20 [==============================] - 15s 730ms/step - loss: 0.2208 - accuracy: 0.9420 - val_loss: 3.6313 - val_accuracy: 0.5882\n", "Epoch 235/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.1803 - accuracy: 0.9469 - val_loss: 3.2286 - val_accuracy: 0.6103\n", "Epoch 236/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.0681 - accuracy: 0.9722 - val_loss: 2.5510 - val_accuracy: 0.6765\n", "Epoch 237/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.1305 - accuracy: 0.9698 - val_loss: 2.9013 - val_accuracy: 0.6250\n", "Epoch 238/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.0852 - accuracy: 0.9714 - val_loss: 3.1888 - val_accuracy: 0.6471\n", "Epoch 239/250\n", "20/20 [==============================] - 15s 738ms/step - loss: 0.1595 - accuracy: 0.9567 - val_loss: 3.1303 - val_accuracy: 0.6618\n", "Epoch 240/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.1517 - accuracy: 0.9624 - val_loss: 3.6736 - val_accuracy: 0.6029\n", "Epoch 241/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.1679 - accuracy: 0.9567 - val_loss: 3.0746 - val_accuracy: 0.6103\n", "Epoch 242/250\n", "20/20 
[==============================] - 15s 737ms/step - loss: 0.1704 - accuracy: 0.9592 - val_loss: 3.5271 - val_accuracy: 0.5882\n", "Epoch 243/250\n", "20/20 [==============================] - 15s 728ms/step - loss: 0.1806 - accuracy: 0.9567 - val_loss: 3.2975 - val_accuracy: 0.5809\n", "Epoch 244/250\n", "20/20 [==============================] - 15s 737ms/step - loss: 0.1679 - accuracy: 0.9600 - val_loss: 2.9738 - val_accuracy: 0.6103\n", "Epoch 245/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.0777 - accuracy: 0.9796 - val_loss: 3.6021 - val_accuracy: 0.6103\n", "Epoch 246/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.2468 - accuracy: 0.9551 - val_loss: 3.0834 - val_accuracy: 0.5956\n", "Epoch 247/250\n", "20/20 [==============================] - 15s 729ms/step - loss: 0.1754 - accuracy: 0.9485 - val_loss: 3.2508 - val_accuracy: 0.6176\n", "Epoch 248/250\n", "20/20 [==============================] - 15s 735ms/step - loss: 0.1908 - accuracy: 0.9502 - val_loss: 3.4634 - val_accuracy: 0.6029\n", "Epoch 249/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.1347 - accuracy: 0.9714 - val_loss: 2.7130 - val_accuracy: 0.6324\n", "Epoch 250/250\n", "20/20 [==============================] - 15s 736ms/step - loss: 0.1137 - accuracy: 0.9649 - val_loss: 2.3323 - val_accuracy: 0.6397\n" ] }, { "output_type": "execute_result", "data": { "text/plain": [ "" ] }, "metadata": {}, "execution_count": 10 } ] } ] }