{
"cells": [
{
"cell_type": "markdown",
"metadata": {
"id": "yK7ecnb6pKzp"
},
"source": [
"**Chapter 14 Deep Computer Vision Using Convolutional Neural Networks**"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "f6cR-I1WpKzs"
},
"source": [
"_This notebook contains all the sample code and solutions to the exercises in chapter 14._"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "CeYcCO3HpKzt"
},
"source": [
"<table align=\"left\">\n",
" <td>\n",
" <a href=\"https://colab.research.google.com/github/ageron/handson-ml3/blob/main/14_deep_computer_vision_with_cnns.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
" </td>\n",
" <td>\n",
" <a target=\"_blank\" href=\"https://kaggle.com/kernels/welcome?src=https://github.com/ageron/handson-ml3/blob/main/14_deep_computer_vision_with_cnns.ipynb\"><img src=\"https://kaggle.com/static/images/open-in-kaggle.svg\" /></a>\n",
" </td>\n",
"</table>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "dFXIv9qNpKzt",
"tags": []
},
"source": [
"# Setup"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "8IPbJEmZpKzu"
},
"source": [
"This project requires Python 3.7 or above:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"id": "TFSU3FCOpKzu"
},
"outputs": [],
"source": [
"import sys\n",
"\n",
"assert sys.version_info >= (3, 7)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "TAlKky09pKzv"
},
"source": [
"It also requires Scikit-Learn ≥ 1.0.1:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"id": "YqCwW7cMpKzw"
},
"outputs": [],
"source": [
"import sklearn\n",
"\n",
"assert sklearn.__version__ >= \"1.0.1\""
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "GJtVEqxfpKzw"
},
"source": [
"And TensorFlow ≥ 2.8:"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"id": "0Piq5se2pKzx"
},
"outputs": [],
"source": [
"import tensorflow as tf\n",
"\n",
"assert tf.__version__ >= \"2.8.0\""
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "DDaDoLQTpKzx"
},
"source": [
"As we did in earlier chapters, let's define the default font sizes to make the figures prettier:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"id": "8d4TH3NbpKzx"
},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"plt.rc('font', size=14)\n",
"plt.rc('axes', labelsize=14, titlesize=14)\n",
"plt.rc('legend', fontsize=14)\n",
"plt.rc('xtick', labelsize=10)\n",
"plt.rc('ytick', labelsize=10)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "YTsawKlapKzy"
},
"source": [
"This chapter can be very slow without a GPU, so let's make sure there's one, or else issue a warning:"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"id": "Ekxzo6pOpKzy"
},
"outputs": [],
"source": [
"# Is this notebook running on Colab or Kaggle?\n",
"IS_COLAB = \"google.colab\" in sys.modules\n",
"IS_KAGGLE = \"kaggle_secrets\" in sys.modules\n",
"\n",
"if not tf.config.list_physical_devices('GPU'):\n",
" print(\"No GPU was detected. Neural nets can be very slow without a GPU.\")\n",
" if IS_COLAB:\n",
" print(\"Go to Runtime > Change runtime and select a GPU hardware \"\n",
" \"accelerator.\")\n",
" if IS_KAGGLE:\n",
" print(\"Go to Settings > Accelerator and select GPU.\")"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "k9Tnd8cwpKzz"
},
"source": [
"# Convolutional Layers\n",
"## Implementing Convolutional Layers With Keras"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "KuamdOs5pKz0"
},
"source": [
"Let's load two sample images, center crop them to small 70×120 images, and rescale their pixel values to 0-1:"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"id": "I-kXsWgDpKz0"
},
"outputs": [],
"source": [
"from sklearn.datasets import load_sample_images\n",
"import tensorflow as tf\n",
"\n",
"images = load_sample_images()[\"images\"]\n",
"images = tf.keras.layers.CenterCrop(height=70, width=120)(images)\n",
"images = tf.keras.layers.Rescaling(scale=1 / 255)(images)"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "btpkyo8ZpKz0",
"outputId": "da87408f-5e8f-4c2e-c21f-4b73028d64a2"
},
"outputs": [
{
"data": {
"text/plain": [
"TensorShape([2, 70, 120, 3])"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"images.shape"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {
"id": "Jv6KYhPzpKz0"
},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"conv_layer = tf.keras.layers.Conv2D(filters=32, kernel_size=7)\n",
"fmaps = conv_layer(images)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "w-mtSoL_pKz1",
"outputId": "8cf6409d-6c0d-4d44-ceaa-4cc9faab21b7"
},
"outputs": [
{
"data": {
"text/plain": [
"TensorShape([2, 64, 114, 32])"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"fmaps.shape"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 511
},
"id": "ttMBSh9RpKz1",
"outputId": "2b6ccb30-f9b9-451c-86e1-a248da78acd2"
},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 1080x648 with 4 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code displays the two output feature maps for each image\n",
"\n",
"plt.figure(figsize=(15, 9))\n",
"for image_idx in (0, 1):\n",
" for fmap_idx in (0, 1):\n",
" plt.subplot(2, 2, image_idx * 2 + fmap_idx + 1)\n",
" plt.imshow(fmaps[image_idx, :, :, fmap_idx], cmap=\"gray\")\n",
" plt.axis(\"off\")\n",
"\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "TyKNtjntpKz1"
},
"source": [
"As you can see, randomly generated filters typically act like edge detectors, which is great since that's a useful tool in image processing, and that's the type of filters that a convolutional layer typically starts with. Then, during training, it gradually learns improved filters to recognize useful patterns for the task."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "Cm7giIfDpKz1"
},
"source": [
"Now let's use zero-padding:"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"id": "HHUI5jsNpKz1"
},
"outputs": [],
"source": [
"conv_layer = tf.keras.layers.Conv2D(filters=32, kernel_size=7,\n",
" padding=\"same\")\n",
"fmaps = conv_layer(images)"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "iyfeZ38EpKz2",
"outputId": "b42fd198-58c3-4ee5-8158-6252e4eca01d"
},
"outputs": [
{
"data": {
"text/plain": [
"TensorShape([2, 70, 120, 32])"
]
},
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"fmaps.shape"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "80Umgdm1pKz2",
"outputId": "ae8d2cf0-5b8e-4b83-d19b-db483caecb13"
},
"outputs": [
{
"data": {
"text/plain": [
"TensorShape([2, 35, 60, 32])"
]
},
"execution_count": 13,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extra code shows the output shape when we set strides=2\n",
"conv_layer = tf.keras.layers.Conv2D(filters=32, kernel_size=7, padding=\"same\",\n",
" strides=2)\n",
"fmaps = conv_layer(images)\n",
"fmaps.shape"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "jisXP9jfpKz2",
"outputId": "adfe3825-4724-4f89-e984-a75988b02927"
},
"outputs": [
{
"data": {
"text/plain": [
"(array([35, 60]), array([5, 5]))"
]
},
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extra code this utility function computes the size of the feature maps\n",
"# output by a convolutional layer. It also returns the number of ignored\n",
"# rows or columns if padding=\"valid\", or the number of zero-padded rows or\n",
"# columns if padding=\"same\".\n",
"\n",
"import numpy as np\n",
"\n",
"def conv_output_size(input_size, kernel_size, strides=1, padding=\"valid\"):\n",
" if padding==\"valid\":\n",
" z = input_size - kernel_size + strides\n",
" output_size = z // strides\n",
" num_ignored = z % strides\n",
" return output_size, num_ignored\n",
" else:\n",
" output_size = (input_size - 1) // strides + 1\n",
" num_padded = (output_size - 1) * strides + kernel_size - input_size\n",
" return output_size, num_padded\n",
"\n",
"conv_output_size(np.array([70, 120]), kernel_size=7, strides=2, padding=\"same\")"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "aIgA6FCopKz2"
},
"source": [
"Let's now look at the weights:"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "vH_xhNDVpKz2",
"outputId": "cc8e813f-7250-4c12-b168-8f1eb64ab9aa"
},
"outputs": [
{
"data": {
"text/plain": [
"(7, 7, 3, 32)"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"kernels, biases = conv_layer.get_weights()\n",
"kernels.shape"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "zXIgK5tMpKz2",
"outputId": "53e1abff-6329-4ccd-ff6e-4ad5f53bad21"
},
"outputs": [
{
"data": {
"text/plain": [
"(32,)"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"biases.shape"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {
"id": "ik87xvJhpKz3"
},
"outputs": [],
"source": [
"# extra code shows how to use the tf.nn.conv2d() operation\n",
"\n",
"tf.random.set_seed(42)\n",
"filters = tf.random.normal([7, 7, 3, 2])\n",
"biases = tf.zeros([2])\n",
"fmaps = tf.nn.conv2d(images, filters, strides=1, padding=\"SAME\") + biases"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "vt140LrypKz3"
},
"source": [
"Let's manually create two filters full of zeros, except for a vertical line of 1s in the first filter, and a horizontal one in the second filter (just like in Figure 14-5). The two output feature maps highlight vertical lines and horizontal lines, respectively. In practice you will probably never need to create filters manually, since the convolutional layers will learn them automatically."
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 520
},
"id": "7jSGHqKMpKz3",
"outputId": "c9e51288-dbc8-45db-dfed-c3faf49e5195"
},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 1080x648 with 4 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code shows how to manually create two filters to get images similar\n",
"# to those in Figure 14-5.\n",
"\n",
"plt.figure(figsize=(15, 9))\n",
"filters = np.zeros([7, 7, 3, 2])\n",
"filters[:, 3, :, 0] = 1\n",
"filters[3, :, :, 1] = 1\n",
"fmaps = tf.nn.conv2d(images, filters, strides=1, padding=\"SAME\") + biases\n",
"\n",
"for image_idx in (0, 1):\n",
" for fmap_idx in (0, 1):\n",
" plt.subplot(2, 2, image_idx * 2 + fmap_idx + 1)\n",
" plt.imshow(fmaps[image_idx, :, :, fmap_idx], cmap=\"gray\")\n",
" plt.axis(\"off\")\n",
"\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "sO0dtyuVpKz3"
},
"source": [
"Notice the dark lines at the top and bottom of the two images on the left, and on the left and right of the two images on the right? Can you guess what these are? Why were they not present in the previous figure?\n",
"\n",
"You guessed it! These are artifacts due to the fact that we used zero padding in this case, while we did not use zero padding to create the feature maps in the previous figure. Because of zero padding, the two feature maps based on the vertical line filter (i.e., the two left images) could not fully activate near the top and bottom of the images. Similarly, the two feature maps based on the horizontal line filter (i.e., the two right images) could not fully activate near the left and right of the images."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "YXahd-O0pKz4"
},
"source": [
"# Pooling Layers\n",
"## Implementing Pooling Layers With Keras"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "obubVYH-pKz4"
},
"source": [
"**Max pooling**"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {
"id": "v4qYbnjKpKz4"
},
"outputs": [],
"source": [
"max_pool = tf.keras.layers.MaxPool2D(pool_size=2)"
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {
"id": "Niwcuaw_pKz4"
},
"outputs": [],
"source": [
"output = max_pool(images)"
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 366
},
"id": "sZo5TrZ6pKz4",
"outputId": "471b4713-527a-41e7-820d-ed09f24a2195"
},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 864x576 with 2 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code this cell shows what max pooling with stride = 2 looks like\n",
"\n",
"import matplotlib as mpl\n",
"\n",
"fig = plt.figure(figsize=(12, 8))\n",
"gs = mpl.gridspec.GridSpec(nrows=1, ncols=2, width_ratios=[2, 1])\n",
"\n",
"ax1 = fig.add_subplot(gs[0, 0])\n",
"ax1.set_title(\"Input\")\n",
"ax1.imshow(images[0]) # plot the 1st image\n",
"ax1.axis(\"off\")\n",
"ax2 = fig.add_subplot(gs[0, 1])\n",
"ax2.set_title(\"Output\")\n",
"ax2.imshow(output[0]) # plot the output for the 1st image\n",
"ax2.axis(\"off\")\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "yJKqdXSEpKz4"
},
"source": [
"**Depth-wise pooling**"
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "ECn_PnixpKz4",
"outputId": "ca0cd587-9a20-40cd-b69d-56ce9be47f32"
},
"outputs": [
{
"data": {
"text/plain": [
"TensorShape([2, 70, 120, 20])"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extra code shows how to use the max_pool() op; only works on the CPU\n",
"np.random.seed(42)\n",
"fmaps = np.random.rand(2, 70, 120, 60)\n",
"with tf.device(\"/cpu:0\"):\n",
" output = tf.nn.max_pool(fmaps, ksize=(1, 1, 1, 3), strides=(1, 1, 1, 3),\n",
" padding=\"VALID\")\n",
"output.shape"
]
},
{
"cell_type": "code",
"execution_count": 23,
"metadata": {
"id": "G9rV71mrpKz4"
},
"outputs": [],
"source": [
"class DepthPool(tf.keras.layers.Layer):\n",
" def __init__(self, pool_size=2, **kwargs):\n",
" super().__init__(**kwargs)\n",
" self.pool_size = pool_size\n",
" \n",
" def call(self, inputs):\n",
" shape = tf.shape(inputs) # shape[-1] is the number of channels\n",
" groups = shape[-1] // self.pool_size # number of channel groups\n",
" new_shape = tf.concat([shape[:-1], [groups, self.pool_size]], axis=0)\n",
" return tf.reduce_max(tf.reshape(inputs, new_shape), axis=-1)"
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "lEHRtmhXpKz5",
"outputId": "16fa295e-72a1-43a6-b3c6-eb6694a1bc4f"
},
"outputs": [
{
"data": {
"text/plain": [
"True"
]
},
"execution_count": 24,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extra code shows that this custom layer gives the same result as max_pool()\n",
"np.allclose(DepthPool(pool_size=3)(fmaps), output)"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 225
},
"id": "hMJoKQxPpKz5",
"outputId": "8b5e494d-a7f6-4341-ba6c-2d82f2f45adc"
},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 864x576 with 2 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code computes and displays the output of the depthwise pooling layer\n",
"\n",
"depth_output = DepthPool(pool_size=3)(images)\n",
"\n",
"plt.figure(figsize=(12, 8))\n",
"plt.subplot(1, 2, 1)\n",
"plt.title(\"Input\")\n",
"plt.imshow(images[0]) # plot the 1st image\n",
"plt.axis(\"off\")\n",
"plt.subplot(1, 2, 2)\n",
"plt.title(\"Output\")\n",
"plt.imshow(depth_output[0, ..., 0], cmap=\"gray\") # plot 1st image's output\n",
"plt.axis(\"off\")\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "O5Sel6supKz5"
},
"source": [
"**Global Average Pooling**"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {
"id": "KW52BwBypKz5"
},
"outputs": [],
"source": [
"global_avg_pool = tf.keras.layers.GlobalAvgPool2D()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "s2E12ccVpKz5"
},
"source": [
"The following layer is equivalent:"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {
"id": "nG_X-OuTpKz5"
},
"outputs": [],
"source": [
"global_avg_pool = tf.keras.layers.Lambda(\n",
" lambda X: tf.reduce_mean(X, axis=[1, 2]))"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "Ygy0q39xpKz5",
"outputId": "39084b74-1687-458d-dd82-84277f3bd221"
},
"outputs": [
{
"data": {
"text/plain": [
"<tf.Tensor: shape=(2, 3), dtype=float32, numpy=\n",
"array([[0.643388 , 0.59718215, 0.5825038 ],\n",
" [0.7630747 , 0.2601088 , 0.10848834]], dtype=float32)>"
]
},
"execution_count": 28,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"global_avg_pool(images)"
]
},
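{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check (not in the original notebook), we can verify that the `Lambda` layer above produces the same output as Keras's `GlobalAvgPool2D` layer on the sample images:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# extra code (not in the original notebook) checks that the Lambda layer\n",
"# defined above gives the same result as GlobalAvgPool2D on the sample images\n",
"np.allclose(tf.keras.layers.GlobalAvgPool2D()(images), global_avg_pool(images))"
]
},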
{
"cell_type": "markdown",
"metadata": {
"id": "oid44Xx-pKz6"
},
"source": [
"# CNN Architectures"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "ELZe7PLfpKz6"
},
"source": [
"**Tackling Fashion MNIST With a CNN**"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "1IXwgw_0pKz6",
"outputId": "4cd7176d-5b2d-4bab-efd3-5ad967a2e43b"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-labels-idx1-ubyte.gz\n",
"32768/29515 [=================================] - 0s 0us/step\n",
"40960/29515 [=========================================] - 0s 0us/step\n",
"Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz\n",
"26427392/26421880 [==============================] - 1s 0us/step\n",
"26435584/26421880 [==============================] - 1s 0us/step\n",
"Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-labels-idx1-ubyte.gz\n",
"16384/5148 [===============================================================================================] - 0s 0us/step\n",
"Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-images-idx3-ubyte.gz\n",
"4423680/4422102 [==============================] - 0s 0us/step\n",
"4431872/4422102 [==============================] - 0s 0us/step\n"
]
}
],
"source": [
"# extra code loads the Fashion MNIST dataset, adds the channels axis to the\n",
"# inputs, scales the values to the 0-1 range, and splits the dataset\n",
"mnist = tf.keras.datasets.fashion_mnist.load_data()\n",
"(X_train_full, y_train_full), (X_test, y_test) = mnist\n",
"X_train_full = np.expand_dims(X_train_full, axis=-1).astype(np.float32) / 255\n",
"X_test = np.expand_dims(X_test.astype(np.float32), axis=-1) / 255\n",
"X_train, X_valid = X_train_full[:-5000], X_train_full[-5000:]\n",
"y_train, y_valid = y_train_full[:-5000], y_train_full[-5000:]"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {
"id": "34upiak4pKz6"
},
"outputs": [],
"source": [
"from functools import partial\n",
"\n",
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"DefaultConv2D = partial(tf.keras.layers.Conv2D, kernel_size=3, padding=\"same\",\n",
" activation=\"relu\", kernel_initializer=\"he_normal\")\n",
"model = tf.keras.Sequential([\n",
" DefaultConv2D(filters=64, kernel_size=7, input_shape=[28, 28, 1]),\n",
" tf.keras.layers.MaxPool2D(),\n",
" DefaultConv2D(filters=128),\n",
" DefaultConv2D(filters=128),\n",
" tf.keras.layers.MaxPool2D(),\n",
" DefaultConv2D(filters=256),\n",
" DefaultConv2D(filters=256),\n",
" tf.keras.layers.MaxPool2D(),\n",
" tf.keras.layers.Flatten(),\n",
" tf.keras.layers.Dense(units=128, activation=\"relu\",\n",
" kernel_initializer=\"he_normal\"),\n",
" tf.keras.layers.Dropout(0.5),\n",
" tf.keras.layers.Dense(units=64, activation=\"relu\",\n",
" kernel_initializer=\"he_normal\"),\n",
" tf.keras.layers.Dropout(0.5),\n",
" tf.keras.layers.Dense(units=10, activation=\"softmax\")\n",
"])"
]
},
{
"cell_type": "code",
"execution_count": 31,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "KZbWeIBYpKz6",
"outputId": "fd2181cd-3092-4f03-96ff-b573a39b21ef"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/10\n",
"1719/1719 [==============================] - 53s 28ms/step - loss: 0.7001 - accuracy: 0.7601 - val_loss: 0.3433 - val_accuracy: 0.8764\n",
"Epoch 2/10\n",
"1719/1719 [==============================] - 37s 21ms/step - loss: 0.3948 - accuracy: 0.8662 - val_loss: 0.3217 - val_accuracy: 0.8816\n",
"Epoch 3/10\n",
"1719/1719 [==============================] - 37s 22ms/step - loss: 0.3339 - accuracy: 0.8867 - val_loss: 0.3310 - val_accuracy: 0.8918\n",
"Epoch 4/10\n",
"1719/1719 [==============================] - 37s 22ms/step - loss: 0.2979 - accuracy: 0.9003 - val_loss: 0.2454 - val_accuracy: 0.9072\n",
"Epoch 5/10\n",
"1719/1719 [==============================] - 37s 21ms/step - loss: 0.2756 - accuracy: 0.9070 - val_loss: 0.2477 - val_accuracy: 0.9120\n",
"Epoch 6/10\n",
"1719/1719 [==============================] - 36s 21ms/step - loss: 0.2580 - accuracy: 0.9129 - val_loss: 0.2774 - val_accuracy: 0.9028\n",
"Epoch 7/10\n",
"1719/1719 [==============================] - 37s 21ms/step - loss: 0.2432 - accuracy: 0.9169 - val_loss: 0.2824 - val_accuracy: 0.9072\n",
"Epoch 8/10\n",
"1719/1719 [==============================] - 37s 21ms/step - loss: 0.2333 - accuracy: 0.9185 - val_loss: 0.2414 - val_accuracy: 0.9088\n",
"Epoch 9/10\n",
"1719/1719 [==============================] - 37s 22ms/step - loss: 0.2242 - accuracy: 0.9246 - val_loss: 0.2566 - val_accuracy: 0.9094\n",
"Epoch 10/10\n",
"1719/1719 [==============================] - 37s 22ms/step - loss: 0.2117 - accuracy: 0.9285 - val_loss: 0.2548 - val_accuracy: 0.9188\n",
"313/313 [==============================] - 2s 8ms/step - loss: 0.2759 - accuracy: 0.9117\n"
]
}
],
"source": [
"# extra code compiles, fits, evaluates, and uses the model to make predictions\n",
"model.compile(loss=\"sparse_categorical_crossentropy\", optimizer=\"nadam\",\n",
" metrics=[\"accuracy\"])\n",
"history = model.fit(X_train, y_train, epochs=10,\n",
" validation_data=(X_valid, y_valid))\n",
"score = model.evaluate(X_test, y_test)\n",
"X_new = X_test[:10] # pretend we have new images\n",
"y_pred = model.predict(X_new)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "h9kyemsZpKz6"
},
"source": [
"## LeNet-5"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "5glqD4rWpKz7"
},
"source": [
"The famous LeNet-5 architecture had the following layers:\n",
"\n",
"Layer | Type | Maps | Size | Kernel size | Stride | Activation\n",
"-------|-----------------|------|----------|-------------|--------|-----------\n",
" Out | Fully connected | | 10 | | | RBF\n",
" F6 | Fully connected | | 84 | | | tanh\n",
" C5 | Convolution | 120 | 1 × 1 | 5 × 5 | 1 | tanh\n",
" S4 | Avg pooling | 16 | 5 × 5 | 2 × 2 | 2 | tanh\n",
" C3 | Convolution | 16 | 10 × 10 | 5 × 5 | 1 | tanh\n",
" S2 | Avg pooling | 6 | 14 × 14 | 2 × 2 | 2 | tanh\n",
" C1 | Convolution | 6 | 28 × 28 | 5 × 5 | 1 | tanh\n",
" In | Input | 1 | 32 × 32 | | | \n",
"\n",
"There were a few tweaks here and there, which don't really matter much anymore, but in case you are interested, here they are:\n",
"\n",
"* MNIST images are 28 × 28 pixels, but they are zero-padded to 32 × 32 pixels and normalized before being fed to the network. The rest of the network does not use any padding, which is why the size keeps shrinking as the image progresses through the network.\n",
"* The average pooling layers are slightly more complex than usual: each neuron computes the mean of its inputs, then multiplies the result by a learnable coefficient (one per map) and adds a learnable bias term (again, one per map), then finally applies the activation function.\n",
"* Most neurons in C3 maps are connected to neurons in only three or four S2 maps (instead of all six S2 maps). See table 1 (page 8) in the [original paper](https://homl.info/lenet5) for details.\n",
"* The output layer is a bit special: instead of computing the matrix multiplication of the inputs and the weight vector, each neuron outputs the square of the Euclidian distance between its input vector and its weight vector. Each output measures how much the image belongs to a particular digit class. The cross-entropy cost function is now preferred, as it penalizes bad predictions much more, producing larger gradients and converging faster."
]
},
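{
"cell_type": "markdown",
"metadata": {},
"source": [
"The next cell (not in the original notebook) is a minimal sketch of a modernized LeNet-5 in Keras, assuming 28 × 28 inputs zero-padded to 32 × 32. The original scaled average pooling, partial C3 connectivity, and RBF output layer are replaced with their modern equivalents: plain average pooling, full connectivity, and a softmax output trained with cross entropy."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# extra code (not in the original notebook) sketches a modernized LeNet-5;\n",
"# the layer names in the comments refer to the table above\n",
"lenet5 = tf.keras.Sequential([\n",
" tf.keras.layers.ZeroPadding2D(padding=2, input_shape=[28, 28, 1]), # In\n",
" tf.keras.layers.Conv2D(filters=6, kernel_size=5, activation=\"tanh\"), # C1\n",
" tf.keras.layers.AvgPool2D(pool_size=2), # S2\n",
" tf.keras.layers.Conv2D(filters=16, kernel_size=5, activation=\"tanh\"), # C3\n",
" tf.keras.layers.AvgPool2D(pool_size=2), # S4\n",
" tf.keras.layers.Conv2D(filters=120, kernel_size=5, activation=\"tanh\"), # C5\n",
" tf.keras.layers.Flatten(),\n",
" tf.keras.layers.Dense(84, activation=\"tanh\"), # F6\n",
" tf.keras.layers.Dense(10, activation=\"softmax\") # Out\n",
"])"
]
},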
{
"cell_type": "markdown",
"metadata": {
"id": "iV10vudGpKz7"
},
"source": [
"# Implementing a ResNet-34 CNN Using Keras"
]
},
{
"cell_type": "code",
"execution_count": 32,
"metadata": {
"id": "p9EoM1dTpKz7"
},
"outputs": [],
"source": [
"DefaultConv2D = partial(tf.keras.layers.Conv2D, kernel_size=3, strides=1,\n",
" padding=\"same\", kernel_initializer=\"he_normal\",\n",
" use_bias=False)\n",
"\n",
"class ResidualUnit(tf.keras.layers.Layer):\n",
" def __init__(self, filters, strides=1, activation=\"relu\", **kwargs):\n",
" super().__init__(**kwargs)\n",
" self.activation = tf.keras.activations.get(activation)\n",
" self.main_layers = [\n",
" DefaultConv2D(filters, strides=strides),\n",
" tf.keras.layers.BatchNormalization(),\n",
" self.activation,\n",
" DefaultConv2D(filters),\n",
" tf.keras.layers.BatchNormalization()\n",
" ]\n",
" self.skip_layers = []\n",
" if strides > 1:\n",
" self.skip_layers = [\n",
" DefaultConv2D(filters, kernel_size=1, strides=strides),\n",
" tf.keras.layers.BatchNormalization()\n",
" ]\n",
"\n",
" def call(self, inputs):\n",
" Z = inputs\n",
" for layer in self.main_layers:\n",
" Z = layer(Z)\n",
" skip_Z = inputs\n",
" for layer in self.skip_layers:\n",
" skip_Z = layer(skip_Z)\n",
" return self.activation(Z + skip_Z)"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {
"id": "_0qA-kSkpKz7"
},
"outputs": [],
"source": [
"model = tf.keras.Sequential([\n",
" DefaultConv2D(64, kernel_size=7, strides=2, input_shape=[224, 224, 3]),\n",
" tf.keras.layers.BatchNormalization(),\n",
" tf.keras.layers.Activation(\"relu\"),\n",
" tf.keras.layers.MaxPool2D(pool_size=3, strides=2, padding=\"same\"),\n",
"])\n",
"prev_filters = 64\n",
"for filters in [64] * 3 + [128] * 4 + [256] * 6 + [512] * 3:\n",
" strides = 1 if filters == prev_filters else 2\n",
" model.add(ResidualUnit(filters, strides=strides))\n",
" prev_filters = filters\n",
"\n",
"model.add(tf.keras.layers.GlobalAvgPool2D())\n",
"model.add(tf.keras.layers.Flatten())\n",
"model.add(tf.keras.layers.Dense(10, activation=\"softmax\"))"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "tWnoERqepKz7"
},
"source": [
"# Using Pretrained Models from Keras"
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "wbS9p1FnpKz7",
"outputId": "a3d0e499-1036-478e-85af-06e882e25e21"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/resnet/resnet50_weights_tf_dim_ordering_tf_kernels.h5\n",
"102973440/102967424 [==============================] - 1s 0us/step\n",
"102981632/102967424 [==============================] - 1s 0us/step\n"
]
}
],
"source": [
"model = tf.keras.applications.ResNet50(weights=\"imagenet\")"
]
},
{
"cell_type": "code",
"execution_count": 35,
"metadata": {
"id": "_QhYKi22pKz8"
},
"outputs": [],
"source": [
"images = load_sample_images()[\"images\"]\n",
"images_resized = tf.keras.layers.Resizing(height=224, width=224,\n",
" crop_to_aspect_ratio=True)(images)"
]
},
{
"cell_type": "code",
"execution_count": 36,
"metadata": {
"id": "usbPpqkqpKz8"
},
"outputs": [],
"source": [
"inputs = tf.keras.applications.resnet50.preprocess_input(images_resized)"
]
},
{
"cell_type": "code",
"execution_count": 37,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "M-IYqzqRpKz8",
"outputId": "5e89ff3b-8afb-4d34-a769-dbe2d70983b5"
},
"outputs": [
{
"data": {
"text/plain": [
"(2, 1000)"
]
},
"execution_count": 37,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"Y_proba = model.predict(inputs)\n",
"Y_proba.shape"
]
},
{
"cell_type": "code",
"execution_count": 38,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "-uWvslEcpKz8",
"outputId": "5390390e-2edb-4bc3-b1aa-ff14bc2abfe3"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading data from https://storage.googleapis.com/download.tensorflow.org/data/imagenet_class_index.json\n",
"40960/35363 [==================================] - 0s 0us/step\n",
"49152/35363 [=========================================] - 0s 0us/step\n",
"Image #0\n",
" n03877845 - palace 54.69%\n",
" n03781244 - monastery 24.72%\n",
" n02825657 - bell_cote 18.55%\n",
"Image #1\n",
" n04522168 - vase 32.66%\n",
" n11939491 - daisy 17.81%\n",
" n03530642 - honeycomb 12.06%\n"
]
}
],
"source": [
"top_K = tf.keras.applications.resnet50.decode_predictions(Y_proba, top=3)\n",
"for image_index in range(len(images)):\n",
" print(f\"Image #{image_index}\")\n",
" for class_id, name, y_proba in top_K[image_index]:\n",
" print(f\" {class_id} - {name:12s} {y_proba:.2%}\")"
]
},
{
"cell_type": "code",
"execution_count": 39,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 0
},
"id": "alc_cnxVpKz8",
"outputId": "9918157b-9826-4c6d-8e64-9edaff7a8dc7"
},
"outputs": [
{
"data": {
"text/plain": [
"<Figure size 720x432 with 2 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code displays the cropped and resized images\n",
"\n",
"plt.figure(figsize=(10, 6))\n",
"for idx in (0, 1):\n",
" plt.subplot(1, 2, idx + 1)\n",
" plt.imshow(images_resized[idx] / 255)\n",
" plt.axis(\"off\")\n",
"\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "hqxnSBJ3pKz8"
},
"source": [
"# Pretrained Models for Transfer Learning"
]
},
{
"cell_type": "code",
"execution_count": 40,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 208,
"referenced_widgets": [
"2839afc6cb6d4a50b0bdad1fcb7f39d1",
"1c08c78c0d484eed9638ad2b757ab584",
"eefd1a01ef1c46e09ffbd97ad25377cf",
"d142189db76a4681a22f38ae252e4ebc",
"d441368305704ab9a3bdbe762ab340a4",
"57cbb645792f45adbfab9b29aa708809",
"b681dc2200ad4ee397a46602e8f4f654",
"0401482a18a94f22b95d5321bfa6f414",
"54a90429726b4d848358cafae87ad893",
"8f0660be3bf44dd48fd42cd52a507e32",
"f8ef3c06db574e3f88dc9a8c0bcd22ab"
]
},
"id": "mbktvHOXpKz8",
"outputId": "ee28b6fc-e112-4d2a-ad11-6bbb88c56a38"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"\u001b[1mDownloading and preparing dataset tf_flowers/3.0.1 (download: 218.21 MiB, generated: 221.83 MiB, total: 440.05 MiB) to /root/tensorflow_datasets/tf_flowers/3.0.1...\u001b[0m\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"WARNING:absl:Dataset tf_flowers is hosted on GCS. It will automatically be downloaded to your\n",
"local data directory. If you'd instead prefer to read directly from our public\n",
"GCS bucket (recommended if you're running on GCP), you can instead pass\n",
"`try_gcs=True` to `tfds.load` or set `data_dir=gs://tfds-data/datasets`.\n",
"\n"
]
},
{
"data": {
"application/vnd.jupyter.widget-view+json": {
"model_id": "2839afc6cb6d4a50b0bdad1fcb7f39d1",
"version_major": 2,
"version_minor": 0
},
"text/plain": [
"Dl Completed...: 0%| | 0/5 [00:00<?, ? file/s]"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"\n",
"\u001b[1mDataset tf_flowers downloaded and prepared to /root/tensorflow_datasets/tf_flowers/3.0.1. Subsequent calls will reuse this data.\u001b[0m\n"
]
}
],
"source": [
"import tensorflow_datasets as tfds\n",
"\n",
"dataset, info = tfds.load(\"tf_flowers\", as_supervised=True, with_info=True)\n",
"dataset_size = info.splits[\"train\"].num_examples\n",
"class_names = info.features[\"label\"].names\n",
"n_classes = info.features[\"label\"].num_classes"
]
},
{
"cell_type": "code",
"execution_count": 41,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "769isDkDpKz8",
"outputId": "891b3c57-1212-4959-b24f-574ad366cf4d"
},
"outputs": [
{
"data": {
"text/plain": [
"3670"
]
},
"execution_count": 41,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"dataset_size"
]
},
{
"cell_type": "code",
"execution_count": 42,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "Nok5SNbEpKz9",
"outputId": "f79e7c41-6454-4ae7-a497-a60024938480"
},
"outputs": [
{
"data": {
"text/plain": [
"['dandelion', 'daisy', 'tulips', 'sunflowers', 'roses']"
]
},
"execution_count": 42,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"class_names"
]
},
{
"cell_type": "code",
"execution_count": 43,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "D50TeDylpKz9",
"outputId": "f795a3b2-1170-49e2-8508-de275c0f1861"
},
"outputs": [
{
"data": {
"text/plain": [
"5"
]
},
"execution_count": 43,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"n_classes"
]
},
{
"cell_type": "code",
"execution_count": 44,
"metadata": {
"id": "M-lgeD08pKz9"
},
"outputs": [],
"source": [
"test_set_raw, valid_set_raw, train_set_raw = tfds.load(\n",
" \"tf_flowers\",\n",
" split=[\"train[:10%]\", \"train[10%:25%]\", \"train[25%:]\"],\n",
" as_supervised=True)"
]
},
{
"cell_type": "code",
"execution_count": 45,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 592
},
"id": "niSFaiTgpKz9",
"outputId": "45879b7b-31f5-43c8-dcbd-9a1865867a17"
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAqwAAAI/CAYAAACyMIvdAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOy9ebQty13f9/lVVXfv4ZxzxzdplhAgGUFkJtvLLMCxISSLxJaW7YBjm8nGweAwRIkXWQYSBxIHkxVsY8eOCcYImwAGBzCLAA44ASxBiBZSAA1PetJ7T3rTnc6whx6q6pc/qrp3n3MHvSvQe/c99Xetfc/du6u7q6u7q771/Q0lqsqECRMmTJgwYcKECfcqzPNdgQkTJkyYMGHChAkT7oSJsE6YMGHChAkTJky4pzER1gkTJkyYMGHChAn3NCbCOmHChAkTJkyYMOGexkRYJ0yYMGHChAkTJtzTmAjrhAkTJkyYMGHChHsaE2GdMGHChAkTJvyeISIfEpG3PN/1uBNE5E+LyHOSz1NEViLylXdR/itFZHW775/omAjrhAkTJkyYMOGOEJEHROTviMgHRKQRkY+IyM+JyH/wfNftRYwfBV7zfFfiXoF7viswYcKECRMmTLh3ISKvAn4NOAG+FXgnSfD648A/BF7xfNXtxQxV3QLb57se9womhXXChAkTJkyYcCf8g/z3s1X1x1T1var6blX9PuAzbreTiHyLiLxLRNZZkf1+ETk/2n5ORN4qIs+ISC0ij4jIN422/xUReV/edlVEfl5E7kpoE5G/KCKPishGRP4V8MCZ7Z8kIj8lIk/ler5DRL70TJkPicjfEJF/JCLHIvJhEfkvzpR5rYj8m1zX9549Ri7zUhH530TkRv78rIh88h3qfpNLQG6T94tIm//+5TPbVUS+VkR+PF/PIyLy5++iye5ZTIR1woQJEyZMmHBLiMhF4EuAv6+qN/lTqurhHXaPwDcBnwb8OeBzgb832v6dwKcDXwp8KvDVwEfyeT8b+PvAf5O3/XHg/xjV6wszOfvCO9T9DwE/CPwvwBuBnwH+5plie8DPAV8E/DvATwA/KSKvO1Pum4H/D/hM4H8AvltE/kg+jwH+JYlT/ZF8Hf81UI3qsgB+GaiBL8jlngT+dd72USEibwK+D/he4A3A3wH+gYj8h2eKfjvwU/l6fhT4ARF5wavgk0vAhAkTJkyYMOF2eC0gwLvvdkdV/d7R1w+JyH8J/JSIfIWqRuCVwDtU9TdymUdH5V8BrIGfVtWTvO2do+0b4L357+3wjcD/qarflb+/T0Q+B/iaUR3feea435UJ4J8mEeoev5AVZYC/JyL/GYlEvw34E8AfAF6tqo8BZKX4V0b7fxmpHb9KVTWX+SvAMyTC/mN3uI4ebwHeOqrH+0Tks4C/TiLjPd6qqj+cz/FtuR0+H/jhZ3GOexaTwjphwoR7ElPE8U3nmiKOJzwfkI95R5F/V0R+MZvQT4CfBErgwVzkfwb+YxF5p4h8j4h8wWj3XySR1A+KyD8Tka8Qkf1+o6r+hqq+bkR2b4XXkwjlGKe+i8hSRL5bRH43m+lXwGdzs1/uu858fwK4f3Sej/RkNePXSQpzj88CXg2c5Hd5BRwBF4BPusM1nL2eXzvz26+SyPIt66qqHrgyqusLFhNhnTBhwnOOKeL4ecEUcTzhY8HDgJLI0rOGiLwS+FmSMvtnSITtq/PmEkBVf46ksn4PcBn4WRH5J3nbCcn8/meBx0jBXu8RkZf8Hq/nLL4n1+/bSKb6NwK/0ddxhO7Md+XuOJQBfisff/z5FOAf3XWtb67LGL/Xut6TeMFfwIQJE15YyBHH7wD+PdIg9Bkkk9rPkiKOJ3wcoKpbVX3m+a7HhBcWVPU68PPAN4jI3tnt4yCqM/hsEun7ZlV9m6q+D7iJbKrqVVV9q6p+JclU/xUiUuVtXlV/SVX7fmJJMp8/W7wb+MNnfjv7/fOAH1LVn1DVdwEf5tkrnuPzvFREXj767XM5zbHeQXKvuKqq7z/zuX4X5/mjt6j/795lfV+QmAjrhAkTnmtMEcdTxPGEFxa+nuQa8Jsi8mdE5FNF5HUi8nXcbCrv8TCJY3yTiLxaRL6cFIA1QET+poj8KRH5ZBF5PfBm4BFVbUTkS0XkG0XkD2a19s8B+2RfWhH5XBF5j4h87h3q/XeBPyEi35rP8ZeBN50p8z7gTSLymSLy6SQ/z9ldtA3AvwbeA/yQiLxRUjDW/wT4UZl/BjxN8uH9gtwmny8i/+Od3tsz+NvAXxCRr8/X89eA/wT47rus7wsSE2GdMGHCcwaZIo57TBHHE14wUNVHSM/qL5Ke13cBvwT8R8DX3mafd5GCfb6FpAD+JVLQ0BgN8F2koKdfIxHS/vk7BP4UOzL4FuAvqWofyLQgvcu3fd5V9e0k1bYn1m8mvUtjfAsp8OlXSO/u2zkdLPVRkQPI3kR6Z38d+CFSf9SMymxIgU+PAD+er+mfknxYbzzL8/zvwF8j9R+/S2rfv6qqP3PHHV8sUNXpM32mz/R5Tj4kkqnAm55F2Q8Bb7nD9i8hDQgmf/9p4AduU/bNpACH/TvU6z3A597hfP8c+MUzv31/6kbveB1vB/7Gmev6kTNlHu7LAF8MBOAVo+2fl9vtK/P3r877yKiMBa4BfzZ//0pgNdp+9vuvnW0vEiH/1dF3Bf770XdHisr+88/3szR9ps/0+cT6fEIprDJFHZ891xR1POG5xhRxnDBFHE+YMGHCXeBFQ1hlijp+PjBFHU+4W0wRxwlTxPGECRMm3AVeFJ2OTFHHzwt0ijqecJfQKeL42Z5nijieMGHChBFeFISVKep4ijqe8ELCFHF8Z0wRxxMmTJhwBi94wipT1HGPKep4wgsCOkUc3xE6RRxPmDBhwk0Q1eckvufjhqyI/DrwZlX9lx+l7IeA71PV77nN9i8hEam5qkYR+WmSye2rb1H2zcA/AV6W/eNuVa8fAv7i7QI5ROSfA/ep6heNfvt+4GtU9bbBKSLyduBfqep3jq7rbar65aMyDwP/VFW/U0S+mDR4jtc5/jzSQPpVqvqDIvLVJHeKT9H8UIiIJQ2+X6eqPyYpQOv7VHUvbz/7/deA947bS0R+EHitqn5e/q7A38pmWbIifQx8rea1jydMmDBhwrPH5/yFL1JUUQVUEVWiKgrEGIcoa/RmlUqBKGAUJCeGSKwg5u+GZBARRECJoDqU6iG3GbFktCFVT/ujpV80HV7VAoY0X/OoQoySrkkkV05J+svp40cVRAwi/UUItq91Pn2UiBqLYLEERCGKoygXXFhcZn++z/7igHI2o/YtvvV0IeBDwGtA+3YUydcRKURwxlCECBpwzhJiQJ2gYsEYWt8RY6Cut7u2joKIYCS1SogtXWhp2y0+BqIqJp3lppt1MzVI15zaVRCxqCohhPyJow/4/P8YND8b/X0/e78iiGKMwRiLtRZjDcYJ1hqM0fTdCO/6iV/+mINp7wZ3Zb6+R/F7ijomkbTXA+dIaWH6qOMnS
FHH/0JEPoukBv2Mqv5fefdx1PHPA78A/GRPXjNJPauCnsXrSarqGG8jqTh9HZfAd5BU3oeAgmRiPGs6/f2MOh4fZ8HdRR3/wJnffpWknN2yrqrqRWSKOp4wYcKEjxGR3UCoAtp/U+V0f65wR5FKRhxJEsk9RZoUJeRD735P5zg9FJ8ZR4byMt52qiqnzyXSk830H83D1XAckeG8JjFeRBQjmfTqiBQKGLFEEtEUk65TAGcts6Jkf77HpYMLzOcL2uBZ1zXbpmHbNnS+G0h60ExciYhGiCERvxgJMYAViGAM+K5lW2/xoaPrukTyc11QwYgiAj56YuyGyQWqRBS5Kf7xVnRHTt0L1URET38gBoghEL2iUfPERm7/PIhksir5A9bKQFLFyG0nKR8vvBgI6zjq+I4K6xiyizr+xyQT9TWSmfJHGEUd53L/Psnk/7Mi8uOq+lWqeiIin0kyy30Rifj+dyLyOar6xO/b1aWo4y8hmTAfJuVA/CE+flHHX3aLbc82iON2mKKOJ0yYMOHjCJW
"text/plain": [
"<Figure size 864x720 with 9 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code displays the first 9 images in the validation set\n",
"\n",
"plt.figure(figsize=(12, 10))\n",
"index = 0\n",
"for image, label in valid_set_raw.take(9):\n",
" index += 1\n",
" plt.subplot(3, 3, index)\n",
" plt.imshow(image)\n",
" plt.title(f\"Class: {class_names[label]}\")\n",
" plt.axis(\"off\")\n",
"\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "gXG6iv8XpKz9"
},
"source": [
"All three datasets contain individual images. We need to batch them, but for this we first need to ensure they all have the same size, or else batching will not work. We can use a `Resizing` layer for this. We must also call the `tf.keras.applications.xception.preprocess_input()` function to preprocess the images appropriately for the Xception model. We will also add shuffling and prefetching to the training dataset."
]
},
{
"cell_type": "code",
"execution_count": 46,
"metadata": {
"id": "Bnz0n9XApKz9"
},
"outputs": [],
"source": [
"tf.keras.backend.clear_session() # extra code resets layer name counter\n",
"\n",
"batch_size = 32\n",
"preprocess = tf.keras.Sequential([\n",
" tf.keras.layers.Resizing(height=224, width=224, crop_to_aspect_ratio=True),\n",
" tf.keras.layers.Lambda(tf.keras.applications.xception.preprocess_input)\n",
"])\n",
"train_set = train_set_raw.map(lambda X, y: (preprocess(X), y))\n",
"train_set = train_set.shuffle(1000, seed=42).batch(batch_size).prefetch(1)\n",
"valid_set = valid_set_raw.map(lambda X, y: (preprocess(X), y)).batch(batch_size)\n",
"test_set = test_set_raw.map(lambda X, y: (preprocess(X), y)).batch(batch_size)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "ovNEMky-pKz9"
},
"source": [
"Let's take a look again at the first 9 images from the validation set: they're all 224x224 now, with values ranging from -1 to 1:"
]
},
{
"cell_type": "code",
"execution_count": 47,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 700
},
"id": "ZL3c3i4opKz9",
"outputId": "38847d8d-8822-41a3-cfb2-27479aa5debe"
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAqYAAAKrCAYAAAAj5U9BAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOy9e6x9W5bX9RlzzrX23uec3+/+7qPq9kO6KR5No+Fpp4FAgOAjxgABIhi1lQ7yUIHQEPwDxUSJhggY4yMqseMDxBfBiEqQtEEjdEAkBErTdHdVV3dXVVfdqvv4Pc45+7HWnGP4x5hzrbX3Ob/fvbeAvvdX7nHv/p39WI8555pjju94TjEzznSmM53pTGc605nOdKaPmsJH3YAznelMZzrTmc50pjOdCc7A9ExnOtOZznSmM53pTB8TOgPTM53pTGc605nOdKYzfSzoDEzPdKYznelMZzrTmc70saAzMD3Tmc50pjOd6UxnOtPHgs7A9ExnOtOZznSmM53pTB8L+roHpiLyYyLy+z/qdryIROSfEJGflLpdInIjIt/9IY7/bhG5ed7nM53p7zadefbOvc48e6aPLZ359c69zvz6d0gvNTAVkTdF5N8TkR8RkYOI/ISI/HkR+cc/6rZ9HdN/B/y0j7oRZ3o56cyzHwmdefZMXxOd+fUjof/f82v6qBvwtZKI/FTg+4Fr4A8AfwsH2v8Q8J8A3/JRte3rmcxsB+w+6nac6eWjM89+NHTm2TN9LXTm14+Gzvz6cltM/6P69zvM7L83sx8ys79tZv8h8HOfd5KI/D4R+bSI3Fbt73tF5NHi91dE5E+KyFdFZC8inxOR71n8/jtE5Ifrb++IyF8QkQ8F8EXknxORHxeRrYj8L8CbJ7//dBH5syLyVm3n3xCRX31yzI+JyB8UkT8uIs9E5Isi8i+fHPMzROT/qG39odNr1GO+WUT+WxF5XF9/TkR+5gvafsfNUMfksyIy1L+/7eR3E5HfLiJ/uvbncyLyXR9iyM709UFnnj3z7JleHjrz65lfPxoys5fuBbwGKPCvfIBjfwz4/YvP3wP8KuCnAr8C+DTwJxe//wfA3wS+E/hW4FcCv7H+9h1ABv6Z+tvPA34vkOrvvxIw4Fe+oD2/qLb9XwW+DfgdwLv+KKZjfh7wLwA/B/gZ9dgB+PaTfr0L/K56zO+u9/4l9fcA/D/A/wn8AuCXAn8dGIHvrsdcAD8M/Bf4QvPtwPcCPw5c1GO+G7hZ3Pf086+v1/xdtT+/u37+NYtjDPgi8F21rX+49udbPuq5dH795LzOPHvm2fPr5Xmd+fXMrx/p/PuoG/A1Ms131gfx6z8s09zz+z8GHIBQP/9PwH/2nGN/A/AUePCCdv0g8J0vuN9/DXzfyXffu2Sa55z3V4E/eNKv/+bkmM+0Y4B/FCjLiQn8sjpu310//5Z6jiyOiZUZf1P9/H5M8/2n41WZ8C8vPhvwhxefE7AFvuujnkvn10/O68yzZ549v16e15lfz/z6Ub5eVle+fM0nivwqEfm+apa/Bv4HoAe+oR7yHwP/pIj8LRH5YyLyKxanfx+u6fyoiPwpEfnNIvKg/Whmf83Mvt3M/toLmvCzgb9y8t3RZxG5FJE/IiI/UE3/N7gmeRrT8+mTz18CPrm4z0+Y2ecXv/9fuCbZ6B8EPgVci2cS3uCLwqvAT39BH0778/0n3/1l4O9/XlvNLANvL9p6pq9/OvOs05lnz/Qy0Jlfnc78+hHQywpMP4NrCD/7w5wkIt8K/DngbwO/EZ80v6X+3AOY2Z/HXQh/DHgD+HMi8p/X366BXwj8JuDzeED4D4rIN/0d9ueU/lht37+Gu0J+PvDXWhsXNJ58Nj7cMw24S+Xnn7y+DfjjH7rVd9uypL/Ttp7p5aYzzzqdefZMLwOd+dXpzK8fAb2UjTaz94C/APwuEbk6/X0ZaH1C34FPvN9rZn/FzH4YuDPhzewdM/uTZvbdwD8P/GYRWdXfspn9RTP7A3jMyCVwJ+D5BfS3gV988t3p518G/Akz+zNm9mk8duSDalfL+3yziPyUxXffyfEz/xt4PMo7ZvbZk9d7H+I+v/Se9v/Ah2zvmb6O6cyzH/g+Z54900dOZ379wPc58+vfA3opgWml34m7G/66iPxGEflZIvLtIvIvctf83ugzeJ+/R0Q+JSL/FB6oPZGI/CER+XUi8jNF5GfjMS+fM7ODiPxqEfk9IvILqmb4TwMP8ImDiHyniPygiHznC9r97wP/sIj8gXqP34YHNy/ph4FfLyK/UER+DvBfAesPMTYA/xsei/MnROTni8gvAf5dPLC80Z8CvgL8WRH5FXVMfrmI/Dsvyho8oT8K/LMi8jtrf343Hrj+Rz5ke8/09U9nnn0xnXn2TB8nOvPri+nMr3+P6KUFpmb2Odzk/33Av40zyl8Efi3w259zzqeB3wP8Plzb+K3A6Y4VB+Dfwmu2fT/OFL+m/vYE+HXME/L3A7/VzP5S/f0C+Fn17/Pa/VdxDbEx928A/vWTw34f8FXgLwF/Hg/K/kt8CDIzxZkx4HEvfwL4N2v/2jFb4JcDnwP+dO3Tf4nHvzz+gPf5H/Eswd+Lj+nvAf4lM/ufP0x7z/T1T2eefTGdefZMHyc68+uL6cyvf+9IzE7DFM50pjOd6UxnOtOZznSmn3x6aS2mZzrTmc50pjOd6Uxn+vqiMzA905nOdKYznelMZzrTx4LOwPRMZzrTmc50pjOd6UwfCzoD0zOd6UxnOtOZznSmM30sKL3ox+/505+wh5cXDKOgsQCBdXyVPD6lhGcUCrCimFdHEOuxVAhqdDygcAuiiOyJuqFEoagiIYAImjNiEGNERGh5WFL3nDADESUIYIJZAAIiIEjdmqJuY4XXtRCRei0Bqxea/tTfUAzleHMLAbwNz80HE7BlTdvF6WYFaitEIBxdO2KC3zN4u8VCPdzmu9dOiHeLCGi7jCz74v0TEdcszM8NtY+G+BhPbTNEpHbMCCJ1zMyPr+OH+Dkqfo766CImSB1LmbrlNxUzQt3owlBMFCggxY8VQSwihHp7w0xRDRixDao/EVOoY2SAqSJAEvH7ab2n1JaZ+nEIEAi1r/MzkqkEsRAQdHqx2PZMl8/S5qrF/8avefI1737yUdEf+k//qB2Ga7a7xxQ90HWJlNbE2LPqVmxWV1yuH9L3V6z6Ky6vHvHotW/gtVde4cHFFZu+Z9VFYjCi+VwJAZKAYGAFTBERRhJ7hce7az77xc/yA5/9ND/6+R/g3SfvUEy4vHiDb/nmn8OnvuXn8onX3yRIYH9b2G+33Gzf4TM/8v/yIz/6GfbjAMDtzTMOux1aRp48eYcvvfVZhuGWrusJEgkhEGNktekZNYNFHr36DfTdBYogMWGq7He3HHYHVv0FXdeR1oEUEqtuzVh2vPvky9zu3uUw7Njd7ijbA5YNKk+KCLHv6TYdFoTYr5AYCBhBC+OwZxgOiBgBwbKwSg945eEb/OJf9GuJaU3JEGPPerVizAd+/HOf5r13P
490CnHg6fW7HPa3mEHOsLm44NHrb/Do0Sfo+oeYdqxWPV0Hw/6Wp0/eY7vdseo2XK0f0nUPIK5QiRiG5QOH/S05DxQtvPXWl9jtbxjylt3uGV0X6GMkBSGKYZaRCCFF1EBVSQIxBUKKpK4jrgOSfN3VbGgGUec1EFIwUgRJAiliAYoVVJUYEqtVR7eO9OvIZnVBl3pSTKQQSSnR94nL1QWvX73B5eVD+vUFm80VV5evcHn1gBAU0cK6u+ATr30r3/Lmp3hts6GjTDzalrlQ2xRj91Lx7Hd81z9iUhcdM6vLapVQVRCp1vW1/X6nrrqT1WVykiXtHH+Hr4eC1HVeqqAxvzm2OGwSKickC7kmcjzUrb1NDlPbO8usUGW4YRQExQxUm+wXv6dMJx3dp2EERQjiOMBQP02E2Pre8IMYKgYSESLBCkHATLDQ0XVrXlm/ymZ1wcXqkvVqg6TIIQ+UUTE1shWyKsUUsYKqQYi0kcOMFIROIp0ZaCGEQAiQtUAESQm1gGFkLagWDocDpooFl79
"text/plain": [
"<Figure size 864x864 with 9 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code displays the first 9 images in the first batch of valid_set\n",
"\n",
"plt.figure(figsize=(12, 12))\n",
"for X_batch, y_batch in valid_set.take(1):\n",
" for index in range(9):\n",
" plt.subplot(3, 3, index + 1)\n",
" plt.imshow((X_batch[index] + 1) / 2) # rescale to 01 for imshow()\n",
" plt.title(f\"Class: {class_names[y_batch[index]]}\")\n",
" plt.axis(\"off\")\n",
"\n",
"plt.show()"
]
},
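{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "We can also verify that claim numerically. This is just a quick sanity check (not part of the original notebook): each batch should have shape (32, 224, 224, 3), with values roughly in the -1 to 1 range after `preprocess_input()`:\n",
  "\n",
  "```python\n",
  "# quick sanity check on one preprocessed batch\n",
  "for X_batch, y_batch in valid_set.take(1):\n",
  "    print(X_batch.shape)  # e.g. (32, 224, 224, 3)\n",
  "    print(float(tf.reduce_min(X_batch)), float(tf.reduce_max(X_batch)))\n",
  "```"
 ]
},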
{
"cell_type": "code",
"execution_count": 48,
"metadata": {
"id": "Ib0cA8Y1pKz9"
},
"outputs": [],
"source": [
"data_augmentation = tf.keras.Sequential([\n",
" tf.keras.layers.RandomFlip(mode=\"horizontal\", seed=42),\n",
" tf.keras.layers.RandomRotation(factor=0.05, seed=42),\n",
" tf.keras.layers.RandomContrast(factor=0.2, seed=42)\n",
"])"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "G7GrQjsspKz-"
},
"source": [
"Try running the following cell multiple times to see different random data augmentations:"
]
},
{
"cell_type": "code",
"execution_count": 49,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 700
},
"id": "w6GH5_vupKz-",
"outputId": "eeb2c924-2f4f-4aa1-bea9-951bebef4bf0"
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAqYAAAKrCAYAAAAj5U9BAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nOy9aaxty3bf9RtVNedauzm3e++5d+w0ju0IJ06wHkSJYssgOiUQRzgRYMhTSAPElhuZD4YgQQSKCEaIRgmRLBqb0EUgAhjLMgqNY8UYKyJGuM+LbezX3HfvPfecs/dea81ZVYMPo6pmzbX32ffeZz+fux9rHK2z15ptdWOMf40xapSoKic60YlOdKITnehEJzrRiyb3ogtwohOd6EQnOtGJTnSiE8EJmJ7oRCc60YlOdKITneh9QidgeqITnehEJzrRiU50ovcFnYDpiU50ohOd6EQnOtGJ3hd0AqYnOtGJTnSiE53oRCd6X9AJmJ7oRCc60YlOdKITneh9QZ/1wFREfkFEvvNFl+M+EpF/XER+XfJ2iciViHzkPVz/ERG5et7vE53o15pOPHvrXSeePdH7lk78eutdJ379VdKDBqYi8rki8u+KyN8WkYOI/IqI/ICI/CMvumyfxfRfAb/pRRfiRA+TTjz7QujEsyf6tOjEry+E/n/Pr+FFF+DTJRH5UuBHgGfAdwF/CwPafx/wHwK/4UWV7bOZVHUH7F50OU708OjEsy+GTjx7ok+HTvz6YujErw/bYvoXyt+vUdX/WlV/RlV/SlX/A+C3P+8mEfkOEfkJEbkus7/vEZFXuvMvi8j3icjrIrIXkY+KyLd15/+UiPxsOfeGiPygiLwngC8i/4yI/KKI3IjI/wh87tH53ywif1VEPlHK+TdF5PcfXfMLIvJnROQvichTEfllEfkXj675LSLyv5ay/szxM8o1Xygi/6WIPC6f7xeRL7un7LfcDKVNfl5EpvL3TxydVxH5kyLyV0p9Pioi3/QemuxEnx104tkTz57o4dCJX0/8+mJIVR/cB3gNyMC/9C6u/QXgO7vf3wZ8PfClwNcCPwF8X3f+3wf+L+DDwJcAXwd8Yzn3NUAE/qly7ncA3w6Ecv7rAAW+7p7y/D2l7P8y8FuBPwW8aV3RrvkdwD8HfBXwW8q1E/AVR/V6E/jmcs23lHf/7nLeAf838L8DvxP4PcCPAzPwkXLNOfCzwH+CCZqvAL4H+EXgvFzzEeCqe+/x728oz/zmUp9vKb//QHeNAr8MfFMp658r9fkNL3osnT6/Pp8Tz5549vR5OJ8Tv5749YWOvxddgE+TaT5cOuIb3ivT3HH+HwIOgCu//3vgP3rOtX8IeAI8uqdcPw18+J73/efADx0d+56eaZ5z348Cf+aoXv/F0TU/V68B/gEg9QMT+L2l3T5Sfv+xco901/jCjH+4/H4npvmR4/YqTPjXu98K/LnudwBugG960WPp9Pn1+Zx49sSzp8/D+Zz49cSvL/LzUF358mnfKPL1IvJDxSz/DPhvgRH4vHLJXwT+iIj8LRH5bhH52u72H8JmOn9HRP6yiPxREXlUT6rqj6nqV6jqj91ThK8E/sbRsdVvEbkQkT8vIj9ZTP9X2EzyOKbnJ45+fwz4nO49v6Kqv9Sd/z+wmWSlvxv4jcAzsZWEV5hQeBX4zffU4bg+P3J07K8Dv+15ZVXVCHyqK+uJPvvpxLNGJ5490UOgE78anfj1BdBDBaY/h80QvvK93CQiXwJ8P/BTwDdig+aPldMjgKr+AOZC+G7gg8D3i8h/XM49A34X8IeBX8ICwn9aRL7gV1mfY/ruUr5/BXOFfDXwY7WMHc1Hv5X31qcOc6l89dHntwJ/6T2X+nZZevrVlvVED5tOPGt04tkTPQQ68avRiV9fAD3IQqvqW8APAt8sIpfH5/tA6yP6Gmzgfbuq/g1V/Vng1oBX1TdU9ftU9SPAPwv8URHZlHNRVf+aqn4XFjNyAdwKeL6Hfgr4e4+OHf/+vcD3qup/o6o/gcWOvNvZVf+eLxSRL+6OfZh1n/9NLB7lDVX9+aPPW+/hPb/njvL/5Hss74k+i+nEs+/6PSeePdELpxO/vuv3nPj1M0APEpgW+tOYu+HHReQbReTLReQrROSf57b5vdLPYXX+NhH5jSLyT2CB2o1E5M+KyB8UkS8Tka/EYl4+qqoHEfn9IvKtIvI7y8zwnwQeYQMHEfmwiPy0iHz4nnL/e8DfLyLfVd7xJ7Dg5p5+FvgGEfldIvJVwH8GbN9D2wD8z1gszveKyFeLyO8G/h0ssLzSXwY+CfxVEfna0ia/T0T+7ftWDR7RvwX80yLyp0t9vgULXP/z77G8J/rspxPP3k8nnj3R+4lO/Ho/nfj1M0QPFpiq6kcxk/8PAf8mxih/DfhHgT/5nHt+AvhW4Duw2cYfB453rDgA/waWs+1HMKb4A+Xc28AfZBmQ3wn8cVX94XL+HPjy8vd55f5RbIZYmfsPAf/q0WXfAbwO/DDwA1hQ9g/zHkhVM8aMDot7+V7gXy/1q9fcAL8P+CjwV0qd/lMs/uXxu3zPf4etEvx2rE2/FfgXVPV/eC/lPdFnP5149n468eyJ3k904tf76cSvnzkS1eMwhROd6EQnOtGJTnSiE53o158erMX0RCc60YlOdKITnehEn110AqYnOtGJTnSiE53oRCd6X9AJmJ7oRCc60YlOdKITneh9QSdgeqITnehEJzrRiU50ovcFhftO/oX/7R9TzyMcW5x4RD2Jkb0euI6f4JA+RcxPmNMzVBLOexDBFqtBRtCsiGQERWTZTKJ+7xdfCfKc/SbkKJWsINzOLtuuxaHHzy7vy2i5T9vV9ZcConatlLfUIovWOxXKebSUq15Trhe6j3bFF6BrAy1XuVIAYXn3as5Q2wpFdXmEbd+1vF9VUQRt75Du2toefatpK2ApRem7jJIRsTorCUiI5PIua2PBIwRrh64Ra7PXndLsDct7rZyl4LXLVREtLSy5lYMydlC1tpXEUmkprWgvTmWMSa1v18f1m9T2bLu41aPSta3gUf7sP/z2p737yYui/+Un/0/VsvGIiOBEbFw6h3fB+ECVlBLTPLE73PDW00/xibc+xsc/+f/yS7/yd/jYx3+BN9/6JFc3z4gpkXEkVRsTGRwDG3/OJlyyCZc8unyND33o83n5pQ8SwpacIk+v3uDjn/wor3/ql7jeP0bVxo+I4H1gM24Jw4hzAy6M+OGMzfYcxXHY7TjsnrHfXaFZCWHEBUfYOHwAVeGw37PbXxOnmWlK5DlBVkQdqMmhMAY2mw2DH8B7kio5JwRF08w8H9AccU4QFTQ5xnDBF37Bl/GlX/rbGcIZqo4QBoIP3Fw/4ZMf/3meXX0KN2Qie65vnnCYdkzTzDzP4IRxGBnHLcFvcOJRzRwOO+I0IQjeeQRPVkA9Ydxydv6Ilx59gM3mAu82+OBRnXjy9E2ePn2T66unpBzRnHHB4QYlcgAywXucc4Qwcr55mfPNq4zhEcNwhnhP1sTN4
SlPr1/navcGKR1IUyJFyBFyzMZfGZwIm3HD2faMzWaDDwFESDkT88QcJ5SI84J3gjjBBcEHsT4aPJvNyPnFhsvLLefnI2F0hNEhIRHzFc7NbMeBVy5f5fNe/WK+6ENfwee/9uV84NEX8WjzEhsXCM6RVZlj4ibueOv647z+9Gf4xOP/h4+99bd588lb3OwPxKzEBNNh5vpmz//0r/3ig+LZr/kj/6CaXJMmz9vWjNnkseZMzpmclJQgJyXnIvurnOvkXf3e9FfVTyJ4L3jvcN7hveBc/7HjriqxQkL3kI6aPuyVYD1X5Y4IXhzeB5wLRQY5UDEppUqmjD+kjOOBEAJjGAni7PnZZI8g1i6AOk8SIdEkHlXaV+0p4nDi8K77BE9wnhxn9ocdN4cb+0zPmOc9ohFM2qEi9nTNuCx9BcmAipK16lAp/VfwDosOXLVZAQOitYwgokX3OBBP1fhIWrVsU7VFV0k7q8tbbn9Z7l+da8DF9L9Kd1ePFaqellaG7sytkdFjnYq7pH6XTgcL/Oj
"text/plain": [
"<Figure size 864x864 with 9 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code displays the same first 9 images, after augmentation\n",
"\n",
"plt.figure(figsize=(12, 12))\n",
"for X_batch, y_batch in valid_set.take(1):\n",
" X_batch_augmented = data_augmentation(X_batch, training=True)\n",
" for index in range(9):\n",
" plt.subplot(3, 3, index + 1)\n",
" # We must rescale the images to the 0-1 range for imshow(), and also\n",
" # clip the result to that range, because data augmentation may\n",
" # make some values go out of bounds (e.g., RandomContrast in this case).\n",
" plt.imshow(np.clip((X_batch_augmented[index] + 1) / 2, 0, 1))\n",
" plt.title(f\"Class: {class_names[y_batch[index]]}\")\n",
" plt.axis(\"off\")\n",
"\n",
"plt.show()"
]
},
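{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "In this notebook the `data_augmentation` model is only used for the visualization above. One possible way to use it during training (a sketch, not the only option) is to map it over the training set only, keeping `training=True` so the random layers stay active; alternatively, these layers could simply be placed at the start of the model itself:\n",
  "\n",
  "```python\n",
  "# sketch: apply the augmentation layers to the training pipeline only\n",
  "augmented_train_set = train_set.map(\n",
  "    lambda X, y: (data_augmentation(X, training=True), y))\n",
  "```"
 ]
},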
{
"cell_type": "markdown",
"metadata": {
"id": "kNL9AOsDpKz-"
},
"source": [
"Now let's load the pretrained model, without its top layers, and replace them with our own, for the flower classification task:"
]
},
{
"cell_type": "code",
2022-02-19 06:19:59 +01:00
"execution_count": 50,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "lRyCgvaKpKz-",
"outputId": "a825e173-8b1d-4217-a1c4-5491b49c3e82"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/xception/xception_weights_tf_dim_ordering_tf_kernels_notop.h5\n",
"83689472/83683744 [==============================] - 1s 0us/step\n",
"83697664/83683744 [==============================] - 1s 0us/step\n"
]
}
],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"base_model = tf.keras.applications.xception.Xception(weights=\"imagenet\",\n",
" include_top=False)\n",
"avg = tf.keras.layers.GlobalAveragePooling2D()(base_model.output)\n",
"output = tf.keras.layers.Dense(n_classes, activation=\"softmax\")(avg)\n",
"model = tf.keras.Model(inputs=base_model.input, outputs=output)"
]
},
{
"cell_type": "code",
"execution_count": 51,
"metadata": {
"id": "KBlyG6ElpKz-"
},
"outputs": [],
"source": [
"for layer in base_model.layers:\n",
" layer.trainable = False"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "WFEFw7GKpKz-"
},
"source": [
"Let's train the model for a few epochs, while keeping the base model weights fixed:"
]
},
{
"cell_type": "code",
"execution_count": 52,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "GGxK2yPcpKz-",
"outputId": "6b64214a-e104-4b6c-9b7a-3388fc9aa15f"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/3\n",
"86/86 [==============================] - 42s 417ms/step - loss: 0.7492 - accuracy: 0.8052 - val_loss: 0.6932 - val_accuracy: 0.8421\n",
"Epoch 2/3\n",
"86/86 [==============================] - 35s 393ms/step - loss: 0.3207 - accuracy: 0.9084 - val_loss: 0.6522 - val_accuracy: 0.8312\n",
"Epoch 3/3\n",
"86/86 [==============================] - 35s 393ms/step - loss: 0.1766 - accuracy: 0.9415 - val_loss: 0.7004 - val_accuracy: 0.8475\n"
]
}
],
"source": [
"optimizer = tf.keras.optimizers.SGD(learning_rate=0.1, momentum=0.9)\n",
"model.compile(loss=\"sparse_categorical_crossentropy\", optimizer=optimizer,\n",
" metrics=[\"accuracy\"])\n",
"history = model.fit(train_set, validation_data=valid_set, epochs=3)"
]
},
{
"cell_type": "code",
"execution_count": 53,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "GvGMiJMLpKz-",
"outputId": "91f2c96c-c058-45e0-e428-66fa6076ad56"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
" 0: input_1 33: block4_pool 66: block8_sepconv1_act 99: block11_sepconv2_act \n",
" 1: block1_conv1 34: batch_normalization_2 67: block8_sepconv1 100: block11_sepconv2 \n",
" 2: block1_conv1_bn 35: add_2 68: block8_sepconv1_bn 101: block11_sepconv2_bn \n",
" 3: block1_conv1_act 36: block5_sepconv1_act 69: block8_sepconv2_act 102: block11_sepconv3_act \n",
" 4: block1_conv2 37: block5_sepconv1 70: block8_sepconv2 103: block11_sepconv3 \n",
" 5: block1_conv2_bn 38: block5_sepconv1_bn 71: block8_sepconv2_bn 104: block11_sepconv3_bn \n",
" 6: block1_conv2_act 39: block5_sepconv2_act 72: block8_sepconv3_act 105: add_9 \n",
" 7: block2_sepconv1 40: block5_sepconv2 73: block8_sepconv3 106: block12_sepconv1_act \n",
" 8: block2_sepconv1_bn 41: block5_sepconv2_bn 74: block8_sepconv3_bn 107: block12_sepconv1 \n",
" 9: block2_sepconv2_act 42: block5_sepconv3_act 75: add_6 108: block12_sepconv1_bn \n",
" 10: block2_sepconv2 43: block5_sepconv3 76: block9_sepconv1_act 109: block12_sepconv2_act \n",
" 11: block2_sepconv2_bn 44: block5_sepconv3_bn 77: block9_sepconv1 110: block12_sepconv2 \n",
" 12: conv2d 45: add_3 78: block9_sepconv1_bn 111: block12_sepconv2_bn \n",
" 13: block2_pool 46: block6_sepconv1_act 79: block9_sepconv2_act 112: block12_sepconv3_act \n",
" 14: batch_normalization 47: block6_sepconv1 80: block9_sepconv2 113: block12_sepconv3 \n",
" 15: add 48: block6_sepconv1_bn 81: block9_sepconv2_bn 114: block12_sepconv3_bn \n",
" 16: block3_sepconv1_act 49: block6_sepconv2_act 82: block9_sepconv3_act 115: add_10 \n",
" 17: block3_sepconv1 50: block6_sepconv2 83: block9_sepconv3 116: block13_sepconv1_act \n",
" 18: block3_sepconv1_bn 51: block6_sepconv2_bn 84: block9_sepconv3_bn 117: block13_sepconv1 \n",
" 19: block3_sepconv2_act 52: block6_sepconv3_act 85: add_7 118: block13_sepconv1_bn \n",
" 20: block3_sepconv2 53: block6_sepconv3 86: block10_sepconv1_act 119: block13_sepconv2_act \n",
" 21: block3_sepconv2_bn 54: block6_sepconv3_bn 87: block10_sepconv1 120: block13_sepconv2 \n",
" 22: conv2d_1 55: add_4 88: block10_sepconv1_bn 121: block13_sepconv2_bn \n",
" 23: block3_pool 56: block7_sepconv1_act 89: block10_sepconv2_act 122: conv2d_3 \n",
" 24: batch_normalization_1 57: block7_sepconv1 90: block10_sepconv2 123: block13_pool \n",
" 25: add_1 58: block7_sepconv1_bn 91: block10_sepconv2_bn 124: batch_normalization_3 \n",
" 26: block4_sepconv1_act 59: block7_sepconv2_act 92: block10_sepconv3_act 125: add_11 \n",
" 27: block4_sepconv1 60: block7_sepconv2 93: block10_sepconv3 126: block14_sepconv1 \n",
" 28: block4_sepconv1_bn 61: block7_sepconv2_bn 94: block10_sepconv3_bn 127: block14_sepconv1_bn \n",
" 29: block4_sepconv2_act 62: block7_sepconv3_act 95: add_8 128: block14_sepconv1_act \n",
" 30: block4_sepconv2 63: block7_sepconv3 96: block11_sepconv1_act 129: block14_sepconv2 \n",
" 31: block4_sepconv2_bn 64: block7_sepconv3_bn 97: block11_sepconv1 130: block14_sepconv2_bn \n",
" 32: conv2d_2 65: add_5 98: block11_sepconv1_bn 131: block14_sepconv2_act \n"
]
}
],
"source": [
"for indices in zip(range(33), range(33, 66), range(66, 99), range(99, 132)):\n",
" for idx in indices:\n",
" print(f\"{idx:3}: {base_model.layers[idx].name:22}\", end=\"\")\n",
" print()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "L_bEwL8KpKz_"
},
"source": [
"Now that the weights of our new top layers are not too bad, we can make the top part of the base model trainable again, and continue training, but with a lower learning rate:"
]
},
{
"cell_type": "code",
"execution_count": 54,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "GEUNGlhvpKz_",
"outputId": "c622a91d-f634-4443-b87e-8d46defdb578"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/10\n",
"86/86 [==============================] - 72s 779ms/step - loss: 0.2921 - accuracy: 0.9117 - val_loss: 0.4541 - val_accuracy: 0.8711\n",
"Epoch 2/10\n",
"86/86 [==============================] - 67s 767ms/step - loss: 0.0382 - accuracy: 0.9876 - val_loss: 0.3715 - val_accuracy: 0.9074\n",
"Epoch 3/10\n",
"86/86 [==============================] - 67s 765ms/step - loss: 0.0140 - accuracy: 0.9956 - val_loss: 0.3679 - val_accuracy: 0.9038\n",
"Epoch 4/10\n",
"86/86 [==============================] - 67s 766ms/step - loss: 0.0167 - accuracy: 0.9945 - val_loss: 0.3298 - val_accuracy: 0.9020\n",
"Epoch 5/10\n",
"86/86 [==============================] - 67s 767ms/step - loss: 0.0090 - accuracy: 0.9971 - val_loss: 0.3876 - val_accuracy: 0.8984\n",
"Epoch 6/10\n",
"86/86 [==============================] - 67s 765ms/step - loss: 0.0044 - accuracy: 0.9985 - val_loss: 0.3708 - val_accuracy: 0.9038\n",
"Epoch 7/10\n",
"86/86 [==============================] - 67s 765ms/step - loss: 0.0029 - accuracy: 0.9996 - val_loss: 0.3535 - val_accuracy: 0.9147\n",
"Epoch 8/10\n",
"86/86 [==============================] - 67s 767ms/step - loss: 0.0070 - accuracy: 0.9975 - val_loss: 0.4261 - val_accuracy: 0.8893\n",
"Epoch 9/10\n",
"86/86 [==============================] - 67s 766ms/step - loss: 0.0061 - accuracy: 0.9985 - val_loss: 0.3635 - val_accuracy: 0.9074\n",
"Epoch 10/10\n",
"86/86 [==============================] - 67s 767ms/step - loss: 0.0018 - accuracy: 0.9996 - val_loss: 0.3550 - val_accuracy: 0.9111\n"
]
}
],
"source": [
"for layer in base_model.layers[56:]:\n",
" layer.trainable = True\n",
"\n",
"optimizer = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9)\n",
"model.compile(loss=\"sparse_categorical_crossentropy\", optimizer=optimizer,\n",
" metrics=[\"accuracy\"])\n",
"history = model.fit(train_set, validation_data=valid_set, epochs=10)"
]
},
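{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "After fine-tuning, the last step would normally be to evaluate the model on the test set we built earlier (a minimal example; the exact score will vary from run to run):\n",
  "\n",
  "```python\n",
  "# evaluate the fine-tuned model on the held-out test set\n",
  "test_loss, test_accuracy = model.evaluate(test_set)\n",
  "print(test_accuracy)\n",
  "```"
 ]
},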
{
"cell_type": "markdown",
"metadata": {
"id": "mpVsD1f8pKz_"
},
"source": [
"# Classification and Localization"
]
},
{
"cell_type": "code",
"execution_count": 55,
"metadata": {
"id": "k_7rd9hopKz_"
},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"base_model = tf.keras.applications.xception.Xception(weights=\"imagenet\",\n",
" include_top=False)\n",
"avg = tf.keras.layers.GlobalAveragePooling2D()(base_model.output)\n",
"class_output = tf.keras.layers.Dense(n_classes, activation=\"softmax\")(avg)\n",
"loc_output = tf.keras.layers.Dense(4)(avg)\n",
"model = tf.keras.Model(inputs=base_model.input,\n",
" outputs=[class_output, loc_output])\n",
"model.compile(loss=[\"sparse_categorical_crossentropy\", \"mse\"],\n",
" loss_weights=[0.8, 0.2], # depends on what you care most about\n",
" optimizer=optimizer, metrics=[\"accuracy\"])"
]
},
{
"cell_type": "code",
"execution_count": 56,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "gGGaA3SJpKz_",
"outputId": "2e525486-d886-4ba1-c123-1c8cdf7f1b8a"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/2\n",
"10/10 [==============================] - 22s 1s/step - loss: 1.2157 - dense_1_loss: 1.4583 - dense_2_loss: 0.2450 - dense_1_accuracy: 0.4219 - dense_2_accuracy: 0.2594\n",
"Epoch 2/2\n",
"10/10 [==============================] - 15s 1s/step - loss: 0.7974 - dense_1_loss: 0.9474 - dense_2_loss: 0.1972 - dense_1_accuracy: 0.7594 - dense_2_accuracy: 0.2812\n"
]
},
{
"data": {
"text/plain": [
"<keras.callbacks.History at 0x7eff33ce5250>"
]
},
"execution_count": 56,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extra code fits the model using random target bounding boxes (in real life\n",
"# you would need to create proper targets instead)\n",
"\n",
"def add_random_bounding_boxes(images, labels):\n",
" fake_bboxes = tf.random.uniform([tf.shape(images)[0], 4])\n",
" return images, (labels, fake_bboxes)\n",
"\n",
"fake_train_set = train_set.take(5).repeat(2).map(add_random_bounding_boxes)\n",
"model.fit(fake_train_set, epochs=2)"
]
},
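{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "The random bounding boxes above are only there to make the code run. With real targets, the localization output is usually evaluated with Intersection over Union (IoU) rather than MSE. Here is a minimal sketch of IoU for two boxes given as `[x_min, y_min, x_max, y_max]` (the coordinate convention is an assumption; use whichever your labels follow):\n",
  "\n",
  "```python\n",
  "def iou(box_a, box_b):\n",
  "    # intersection rectangle\n",
  "    x1 = max(box_a[0], box_b[0])\n",
  "    y1 = max(box_a[1], box_b[1])\n",
  "    x2 = min(box_a[2], box_b[2])\n",
  "    y2 = min(box_a[3], box_b[3])\n",
  "    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)\n",
  "    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])\n",
  "    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])\n",
  "    return inter / (area_a + area_b - inter)\n",
  "\n",
  "iou([0.1, 0.1, 0.5, 0.5], [0.3, 0.3, 0.7, 0.7])  # ≈ 0.14\n",
  "```"
 ]
},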
{
"cell_type": "markdown",
"metadata": {
"id": "mD9oCJ7vpKz_"
},
"source": [
"# Extra Material How mAP Relates to Precision/Recall"
]
},
{
"cell_type": "code",
"execution_count": 57,
"metadata": {
"id": "fgjxsrkLpKz_"
},
"outputs": [],
"source": [
"def maximum_precisions(precisions):\n",
" return np.flip(np.maximum.accumulate(np.flip(precisions)))"
]
},
{
"cell_type": "code",
"execution_count": 58,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 287
},
"id": "pB2kJkHrpKz_",
"outputId": "fd9f2bc1-ae06-4c60-8a0c-f4bec6577efa"
},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAY4AAAEOCAYAAACetPCkAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjIsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+WH4yJAAAgAElEQVR4nO3deViUVfvA8e89rIK476LiBq6paOYuLi36mrZpKZaaSVlp/VLL8n1NKysre7NSi8ysBNv0LculTXHJJfclFTUVxS01zQVBgfP7Y1gFhUGYZ4D7c11zMc85Z57n5jDMPc92jhhjUEoppXLLZnUASimlChdNHEoppRyiiUMppZRDNHEopZRyiCYOpZRSDtHEoZRSyiFOSxwiMktE/hKRHdeoFxF5V0T2icg2EQl2VmxKKaVyz5l7HLOBO65T3wOon/IIA2Y4ISallFIOclriMMasAP6+TpM+wGfGbi1QRkSqOic6pZRSueVudQAZVAcOZ1iOTSk7dnVDEQnDvleCt7d3y5o1azolQFeXnJyMzaanrUD7IiPti3TaF+n27NlzyhhTMS+vdaXEkWvGmHAgHCAoKMhER0dbHJFriIqKIiQkxOowXIL2RTrti3TaF+lEJCavr3Wl1HsEqJFh2T+lTCmllAtxpcSxAHgo5eqqNsA/xpgsh6mUUkpZy2mHqkRkLhACVBCRWOBFwAPAGPMBsAjoCewD4oAhzopNKaVU7jktcRhj+udQb4AnnBSOUkqpPHKlQ1V5smePHwEBEBFhzfYjIiAgAGw2LIsj4vFVBLjH0rVLJwLcY4l4fJXzg1BKFRuF8qqqq8XEQFiY/XloqPO2GxFh325cnHVxRDy+irAZLYjD1x5Dkj9hM8oCqwid3sE5QSilipUikTjA/uE9bBh89ZXztvnzz3DpkkVxJCbCP2f5+beWXKJE5hjw5ckPmnK++mnK1ilLmXI2ypaFMmVI++nhkf8hRUTAuHFw6BDUrAmTJjk3kYM9kY4LD+BQUidqusUyKeygJlCl8lmRSRxg/xA/dMi52yuwOEwyXIqHywmQcNn+8/JlqFABSpeBiwmw9zCXaJ7ty8+a0gz/97VX7+trKFs6OSWpSJbEUrbstZ/7+IBI5vXp3pdSxUeRShy1asHmzc7bXkCA/QMyT3EkJ8OuXXDgQPpj/37o3RsefhiOHAN///T25ctDUG149lno2xcuGFh9koCeR4hJ8s+y+hpuR1j3ehRn9pzkzP4znD18njPH4zk78kXOeFbmzNItnI3azJkTFTnrXZkYtwpsMWU4SxnOnb/+qS8Pj6xJZuXK9KSRKi4OHn8cdu7MoS8cZYx9jys+PtPj/W9uSksaaTHgy7jwAEKn53MMShVjRSZx+PjYD40406Semb/hAvhwkUk9NwMd7Fll//70pHDgALRoAaNH2z/8mje3fwAClCgBtWvDhQv25apV4X//gzp17BmqVKnMGy9ZEm67jUlhqwibUTZLDK+FHaDq6FCuOdjX3R7w63k4sC09toMH4fhxEr18Off0eM5Mi+BMmTqcrRzEmfL1OFu6Jmc63cWZf2ycPZHAmQsenPnHxpkzEBdnAMmymXPnDG+8kbUczFWLKcuSkrSSkzKXg303x+Zmf54I4J3ySC3K/u0ck1SdB2quJviOigT3q0+LFvY8rJTKmyKROGrVsuZ4eujCUKA943iVQ9SkJoeYxAuELloNHIROndKPWbm52Q/8p+5FuLnB119DlSr2hFGpUubjPzYb3HVXzjFM7wCkHtevRk23o7k7rt+kif2RkTEggjtQrv/tlKteImVvKBr2L4Z9F2DRPfa2DwyCeV9DjRpQuzYB8ikxJuuYYbXkMAf7PANnzkBgIMxIGfQ4MAj27s3cuGdPWLjQ/jygrj2JZjxWdvvtMGoUIPDONHuyzVAf0LZqtntfJbjEuqM1+PKjGvBRSlxusQRXiiW44SVadCxJ8F01qdq88vX7TCkFgBhjcm7lwpw2VtWVK/ZDS4cOQa9e9rKrD/SnErEfilq4MH1PokYNcC/YPF3g4/AkJ9sTGsAPP8C6dWmH2SJW1yKMj7Ls+YQzjNBGW+0f8K1awdSp9soPPrCfDMp4zKt6dahf316fksQccfU5jrQYhm8mdHoH/j5t2LxF2LTwGJvmHWDTsarsvVILk3JVepVyCQS38SK41mmC/fYSfGd1arbzR2yOxeFqdHymdNoX6URkozGmVZ5ebIwp1I/AwEBTYH780ZhHHzXm5puN8fIyBozx8TEmMdFeX6GCvezqR61aBRfTdSxbtsyS7RpjjKlVy8yhv6nFASMkmVocMHPo7/S+mDN8panldtgeg9thM2f4yuu2P3fknFk5bauZek+UGdQ/3jRtaoybLSntT1lOTptuZTeaMTcvM3M/vmiio41JSnLSL5NPLH1fuBjti3TABpPHz90icajqhly4AFu2wKZN9jPamzbBsmVQrhysXQtffgnBwTBihP1ncHD6t+533sl8KRFYc7LFFUyaRGhYGKFxc9PLfHxgUrhTwwid3oHQ6Rm/WWY9dJWRXzU/Ojx+Ex0eTy+79HcC27/bz6afT9vfEocqMHV9Yy6v9wKgpGcCLbx3E1znLME322hxe2Ua/qsO7t72f6f0S4IdOHSoVGGS14zjKg+H9jj+/tuYX3815sQJ+/LcucaIpO8pVKpkTI8exuzfb6+PjzcmOfn665wzx/6tWsT+c86c3MeTzyz/NlWE++LypUSzZYsxs2YZ82SHzaad31bjw4W0t463XDKtWxvTrdFR40l8ph1QHy7kuOdTkCx/X7gQ7Yt0FOc9Dr89e+xXHWV3dvyvv+Cjj9L3JA4csJfPmWNvGxwMEyak70lUrZr5uLqXV84BhIY6/6y8qyrCfeHh7UazZtCsGQwZYr93JulyEnt++pNNi46z+c9SbLrSlKW/V047Z5JKLwlWRU2hTxyA/bLXoUNh3jz7Sey77rIvX7kC//431KsHN98Mjz5qTxA332x/XWAgjB9vbeyq0HLzdKNhr7o07FWX1HR5rfPoh5KqOS0upQpa0UgcAAkJ9vseGjWy32ENUK0anD0LpUtbG5sqNmq6Hc32kmB3Etm/35M6dSwISql8VuhHx81EBP74A4YPT1/WpKGcaFLYQXy4mKnMi3jcPYRWrWDRB04cE0epAlK0EkfNrDegKeVModM7ED58M7XcYhGSqeUWy8fDN7Bjtwc1y/xDr+H+vNg5iqTLSVaHqlSeFZ1DVcX1MljlclIvCbbzJ/WS4DUbPBneejUvrQjh9+rriVhbj3J1y1oUpVJ5VzT2OGrVgvDwIntFjyoaSpQrwSd72vPBgBUsPXUTLRtcYFPELqvDUsphhT5xnA8MtA/Op0lDFQJiEx6N6MTKWftIMjbaDQlk1iyro1LKMYU+cShVGLUe0piNe0vTsbMbQ4fCsDuPE3823uqwlMoVTRxKWaRi7ZIsWQIvPHWBmT9UoUP1/cT8Fmt1WErlSBOHUhZyc4NJ75Tk2xd+Z29cdYI7+vDjp
A1Wh6XUdWniUMoF9JnUmg0/naG61yl6/DuYl7tFkZxUuKc8UEWXJg6lXET9WwNYE1Od0NqrGb80hN59hDNnrI5Kqaw0cSjlQnwr+fLZvva8/24yP/0ErZrGs+VLJ0xUppQDNHEo5WLEJjwxwsbyKEPCyXO0faAmn4atsjospdJo4lDKRbVtJ2zaJLQts5vBH3VgeOMVJJxLsDospTRxKOXKKjWuyE/HmvJs6yg+2NmJTtX2cmirnvhQ1tLEoZSLc/d2Z/K6EOaNWcuuhDq07F6GX36xOipVnGniUKqQuOeNNqzf4UOlSsLttxte/dcqkhOTrQ5LFUOaOJQqRIKCYN066Nd4J+MWdeDuGus5G/OP1WGpYkYTh1KFTMmSELmlEVPvWc6i48HcXP8M2+ftsTosVYxo4lCqEBKbMHJeZ5ZN28XFJG9uuc+fiHE7rQ5LFROaOJQqxDo8fhObNtu4ufIhBr7aiBEj4PJlq6NSRZ0mDqUKuSo3VeKXww145hl4/30IqfknseuPWR2WKsKcmjh
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"recalls = np.linspace(0, 1, 11)\n",
"\n",
"precisions = [0.91, 0.94, 0.96, 0.94, 0.95, 0.92, 0.80, 0.60, 0.45, 0.20, 0.10]\n",
"max_precisions = maximum_precisions(precisions)\n",
"mAP = max_precisions.mean()\n",
"plt.plot(recalls, precisions, \"ro--\", label=\"Precision\")\n",
"plt.plot(recalls, max_precisions, \"bo-\", label=\"Max Precision\")\n",
"plt.xlabel(\"Recall\")\n",
"plt.ylabel(\"Precision\")\n",
"plt.plot([0, 1], [mAP, mAP], \"g:\", linewidth=3, label=\"mAP\")\n",
"plt.grid(True)\n",
"plt.axis([0, 1, 0, 1])\n",
"plt.legend(loc=\"lower center\")\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "hFjs5WBKpK0A"
},
"source": [
"# Exercises"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "iXYUCZlvpK0B"
},
"source": [
"## 1. to 8."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "7gmFmNUjpK0B"
},
"source": [
"1. These are the main advantages of a CNN over a fully connected DNN for image classification:\n",
" * Because consecutive layers are only partially connected and because it heavily reuses its weights, a CNN has many fewer parameters than a fully connected DNN, which makes it much faster to train, reduces the risk of overfitting, and requires much less training data.\n",
" * When a CNN has learned a kernel that can detect a particular feature, it can detect that feature anywhere in the image. In contrast, when a DNN learns a feature in one location, it can detect it only in that particular location. Since images typically have very repetitive features, CNNs are able to generalize much better than DNNs for image processing tasks such as classification, using fewer training examples.\n",
" * Finally, a DNN has no prior knowledge of how pixels are organized; it does not know that nearby pixels are close. A CNN's architecture embeds this prior knowledge. Lower layers typically identify features in small areas of the images, while higher layers combine the lower-level features into larger features. This works well with most natural images, giving CNNs a decisive head start compared to DNNs.\n",
"2. Let's compute how many parameters the CNN has.\n",
" * Since its first convolutional layer has 3 × 3 kernels, and the input has three channels (red, green, and blue), each feature map has 3 × 3 × 3 weights, plus a bias term. That's 28 parameters per feature map. Since this first convolutional layer has 100 feature maps, it has a total of 2,800 parameters. The second convolutional layer has 3 × 3 kernels and its input is the set of 100 feature maps of the previous layer, so each feature map has 3 × 3 × 100 = 900 weights, plus a bias term. Since it has 200 feature maps, this layer has 901 × 200 = 180,200 parameters. Finally, the third and last convolutional layer also has 3 × 3 kernels, and its input is the set of 200 feature maps of the previous layers, so each feature map has 3 × 3 × 200 = 1,800 weights, plus a bias term. Since it has 400 feature maps, this layer has a total of 1,801 × 400 = 720,400 parameters. All in all, the CNN has 2,800 + 180,200 + 720,400 = 903,400 parameters.<br/>\n",
" * Now let's compute how much RAM this neural network will require (at least) when making a prediction for a single instance. First let's compute the feature map size for each layer. Since we are using a stride of 2 and `\"same\"` padding, the horizontal and vertical dimensions of the feature maps are divided by 2 at each layer (rounding up if necessary). So, as the input channels are 200 × 300 pixels, the first layer's feature maps are 100 × 150, the second layer's feature maps are 50 × 75, and the third layer's feature maps are 25 × 38. Since 32 bits is 4 bytes and the first convolutional layer has 100 feature maps, this first layer takes up 4 × 100 × 150 × 100 = 6 million bytes (6 MB). The second layer takes up 4 × 50 × 75 × 200 = 3 million bytes (3 MB). Finally, the third layer takes up 4 × 25 × 38 × 400 = 1,520,000 bytes (about 1.5 MB). However, once a layer has been computed, the memory occupied by the previous layer can be released, so if everything is well optimized, only 6 + 3 = 9 million bytes (9 MB) of RAM will be required (when the second layer has just been computed, but the memory occupied by the first layer has not been released yet). But wait, you also need to add the memory occupied by the CNN's parameters! We computed earlier that it has 903,400 parameters, each using up 4 bytes, so this adds 3,613,600 bytes (about 3.6 MB). The total RAM required is therefore (at least) 12,613,600 bytes (about 12.6 MB).<br/>\n",
" * Lastly, let's compute the minimum amount of RAM required when training the CNN on a mini-batch of 50 images. During training TensorFlow uses backpropagation, which requires keeping all values computed during the forward pass until the reverse pass begins. So we must compute the total RAM required by all layers for a single instance and multiply that by 50. At this point, let's start counting in megabytes rather than bytes. We computed before that the three layers require respectively 6, 3, and 1.5 MB for each instance. That's a total of 10.5 MB per instance, so for 50 instances the total RAM required is 525 MB. Add to that the RAM required by the input images, which is 50 × 4 × 200 × 300 × 3 = 36 million bytes (36 MB), plus the RAM required for the model parameters, which is about 3.6 MB (computed earlier), plus some RAM for the gradients (we will neglect this since it can be released gradually as backpropagation goes down the layers during the reverse pass). We are up to a total of roughly 525 + 36 + 3.6 = 564.6 MB, and that's really an optimistic bare minimum.\n",
"3. If your GPU runs out of memory while training a CNN, here are five things you could try to solve the problem (other than purchasing a GPU with more RAM):\n",
" * Reduce the mini-batch size.\n",
" * Reduce dimensionality using a larger stride in one or more layers.\n",
" * Remove one or more layers.\n",
" * Use 16-bit floats instead of 32-bit floats.\n",
" * Distribute the CNN across multiple devices.\n",
"4. A max pooling layer has no parameters at all, whereas a convolutional layer has quite a few (see the previous questions).\n",
"5. A local response normalization layer makes the neurons that most strongly activate inhibit neurons at the same location but in neighboring feature maps, which encourages different feature maps to specialize and pushes them apart, forcing them to explore a wider range of features. It is typically used in the lower layers to have a larger pool of low-level features that the upper layers can build upon.\n",
"6. The main innovations in AlexNet compared to LeNet-5 are that it is much larger and deeper, and it stacks convolutional layers directly on top of each other, instead of stacking a pooling layer on top of each convolutional layer. The main innovation in GoogLeNet is the introduction of _inception modules_, which make it possible to have a much deeper net than previous CNN architectures, with fewer parameters. ResNet's main innovation is the introduction of skip connections, which make it possible to go well beyond 100 layers. Arguably, its simplicity and consistency are also rather innovative. SENet's main innovation was the idea of using an SE block (a two-layer dense network) after every inception module in an inception network or every residual unit in a ResNet to recalibrate the relative importance of feature maps. Xception's main innovation was the use of depthwise separable convolutional layers, which look at spatial patterns and depthwise patterns separately. Lastly, EfficientNet's main innotation was the compound scaling method, to efficiently scale a model to a larger compute budget.\n",
"7. Fully convolutional networks are neural networks composed exclusively of convolutional and pooling layers. FCNs can efficiently process images of any width and height (at least above the minimum size). They are most useful for object detection and semantic segmentation because they only need to look at the image once (instead of having to run a CNN multiple times on different parts of the image). If you have a CNN with some dense layers on top, you can convert these dense layers to convolutional layers to create an FCN: just replace the lowest dense layer with a convolutional layer with a kernel size equal to the layer's input size, with one filter per neuron in the dense layer, and using `\"valid\"` padding. Generally the stride should be 1, but you can set it to a higher value if you want. The activation function should be the same as the dense layer's. The other dense layers should be converted the same way, but using 1 × 1 filters. It is actually possible to convert a trained CNN this way by appropriately reshaping the dense layers' weight matrices.\n",
"8. The main technical difficulty of semantic segmentation is the fact that a lot of the spatial information gets lost in a CNN as the signal flows through each layer, especially in pooling layers and layers with a stride greater than 1. This spatial information needs to be restored somehow to accurately predict the class of each pixel."
]
},
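{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "The parameter and memory figures in answer 2 above can be double-checked with a few lines of arithmetic (a small sketch using only the layer sizes stated in the exercise):\n",
  "\n",
  "```python\n",
  "# parameters: 3x3 kernels, 3 -> 100 -> 200 -> 400 feature maps, one bias per map\n",
  "params = [(3 * 3 * 3 + 1) * 100,     # 2,800\n",
  "          (3 * 3 * 100 + 1) * 200,   # 180,200\n",
  "          (3 * 3 * 200 + 1) * 400]   # 720,400\n",
  "print(sum(params))                   # 903,400\n",
  "\n",
  "# 32-bit activations, stride 2 and \"same\" padding, 200x300 input\n",
  "mem = [4 * 100 * 150 * 100,  # 6,000,000 bytes\n",
  "       4 * 50 * 75 * 200,    # 3,000,000 bytes\n",
  "       4 * 25 * 38 * 400]    # 1,520,000 bytes\n",
  "print(mem[0] + mem[1])       # 9,000,000 bytes for two consecutive layers\n",
  "print(sum(params) * 4)       # 3,613,600 bytes for the weights\n",
  "```"
 ]
},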
{
"cell_type": "markdown",
"metadata": {
"id": "KIpUNnvnpK0B"
},
"source": [
"## 9. High Accuracy CNN for MNIST\n",
"_Exercise: Build your own CNN from scratch and try to achieve the highest possible accuracy on MNIST._"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "FSsUWNuzpK0B"
},
"source": [
"The following model uses 2 convolutional layers, followed by 1 pooling layer, then dropout 25%, then a dense layer, another dropout layer but with 50% dropout, and finally the output layer. It reaches about 99.2% accuracy on the test set. This places this model roughly in the top 20% in the [MNIST Kaggle competition](https://www.kaggle.com/c/digit-recognizer/) (if we ignore the models with an accuracy greater than 99.79% which were most likely trained on the test set, as explained by Chris Deotte in [this post](https://www.kaggle.com/c/digit-recognizer/discussion/61480)). Can you do better? To reach 99.5 to 99.7% accuracy on the test set, you need to add image augmentation, batch norm, use a learning schedule such as 1-cycle, and possibly create an ensemble."
]
},
{
"cell_type": "code",
"execution_count": 59,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "6tdKYb9PpK0B",
"outputId": "37baf840-d76d-4d94-f692-524eef47a041"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/mnist.npz\n",
"11493376/11490434 [==============================] - 0s 0us/step\n",
"11501568/11490434 [==============================] - 0s 0us/step\n"
]
}
],
"source": [
"mnist = tf.keras.datasets.mnist.load_data()\n",
"(X_train_full, y_train_full), (X_test, y_test) = mnist\n",
"X_train_full = X_train_full / 255.\n",
"X_test = X_test / 255.\n",
"X_train, X_valid = X_train_full[:-5000], X_train_full[-5000:]\n",
"y_train, y_valid = y_train_full[:-5000], y_train_full[-5000:]\n",
"\n",
"X_train = X_train[..., np.newaxis]\n",
"X_valid = X_valid[..., np.newaxis]\n",
"X_test = X_test[..., np.newaxis]"
]
},
{
"cell_type": "code",
"execution_count": 60,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/"
},
"id": "uDchCzo3pK0B",
"outputId": "5e68d152-cd84-4451-db6c-d69fe7f18cc7"
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/10\n",
"1719/1719 [==============================] - 25s 14ms/step - loss: 0.1943 - accuracy: 0.9415 - val_loss: 0.0431 - val_accuracy: 0.9884\n",
"Epoch 2/10\n",
"1719/1719 [==============================] - 22s 13ms/step - loss: 0.0807 - accuracy: 0.9754 - val_loss: 0.0454 - val_accuracy: 0.9882\n",
"Epoch 3/10\n",
"1719/1719 [==============================] - 21s 12ms/step - loss: 0.0609 - accuracy: 0.9808 - val_loss: 0.0361 - val_accuracy: 0.9890\n",
"Epoch 4/10\n",
"1719/1719 [==============================] - 21s 12ms/step - loss: 0.0506 - accuracy: 0.9841 - val_loss: 0.0339 - val_accuracy: 0.9910\n",
"Epoch 5/10\n",
"1719/1719 [==============================] - 21s 12ms/step - loss: 0.0407 - accuracy: 0.9869 - val_loss: 0.0330 - val_accuracy: 0.9928\n",
"Epoch 6/10\n",
"1719/1719 [==============================] - 21s 12ms/step - loss: 0.0350 - accuracy: 0.9889 - val_loss: 0.0383 - val_accuracy: 0.9916\n",
"Epoch 7/10\n",
"1719/1719 [==============================] - 21s 12ms/step - loss: 0.0326 - accuracy: 0.9892 - val_loss: 0.0356 - val_accuracy: 0.9922\n",
"Epoch 8/10\n",
"1719/1719 [==============================] - 21s 12ms/step - loss: 0.0278 - accuracy: 0.9910 - val_loss: 0.0354 - val_accuracy: 0.9922\n",
"Epoch 9/10\n",
"1719/1719 [==============================] - 21s 12ms/step - loss: 0.0280 - accuracy: 0.9912 - val_loss: 0.0327 - val_accuracy: 0.9914\n",
"Epoch 10/10\n",
"1719/1719 [==============================] - 21s 12ms/step - loss: 0.0231 - accuracy: 0.9923 - val_loss: 0.0397 - val_accuracy: 0.9908\n",
"313/313 [==============================] - 2s 5ms/step - loss: 0.0306 - accuracy: 0.9914\n"
]
},
{
"data": {
"text/plain": [
"[0.030648918822407722, 0.9914000034332275]"
]
},
"execution_count": 60,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"tf.keras.backend.clear_session()\n",
"tf.random.set_seed(42)\n",
"np.random.seed(42)\n",
"\n",
"model = tf.keras.Sequential([\n",
" tf.keras.layers.Conv2D(32, kernel_size=3, padding=\"same\",\n",
" activation=\"relu\", kernel_initializer=\"he_normal\"),\n",
" tf.keras.layers.Conv2D(64, kernel_size=3, padding=\"same\",\n",
" activation=\"relu\", kernel_initializer=\"he_normal\"),\n",
" tf.keras.layers.MaxPool2D(),\n",
" tf.keras.layers.Flatten(),\n",
" tf.keras.layers.Dropout(0.25),\n",
" tf.keras.layers.Dense(128, activation=\"relu\",\n",
" kernel_initializer=\"he_normal\"),\n",
" tf.keras.layers.Dropout(0.5),\n",
" tf.keras.layers.Dense(10, activation=\"softmax\")\n",
"])\n",
"model.compile(loss=\"sparse_categorical_crossentropy\", optimizer=\"nadam\",\n",
" metrics=[\"accuracy\"])\n",
"\n",
"model.fit(X_train, y_train, epochs=10, validation_data=(X_valid, y_valid))\n",
"model.evaluate(X_test, y_test)"
]
},
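{
 "cell_type": "markdown",
 "metadata": {},
 "source": [
  "As suggested above, going past roughly 99.2% usually takes extra tricks. Here is one possible starting point (an untested sketch, not the reference solution): add light augmentation and batch normalization to the same architecture, then train longer, ideally with a learning-rate schedule:\n",
  "\n",
  "```python\n",
  "# sketch: same architecture with augmentation and batch norm added\n",
  "augment = tf.keras.Sequential([\n",
  "    tf.keras.layers.RandomTranslation(0.1, 0.1),\n",
  "    tf.keras.layers.RandomRotation(0.05),\n",
  "])\n",
  "model = tf.keras.Sequential([\n",
  "    augment,\n",
  "    tf.keras.layers.Conv2D(32, 3, padding=\"same\", use_bias=False),\n",
  "    tf.keras.layers.BatchNormalization(),\n",
  "    tf.keras.layers.Activation(\"relu\"),\n",
  "    tf.keras.layers.Conv2D(64, 3, padding=\"same\", use_bias=False),\n",
  "    tf.keras.layers.BatchNormalization(),\n",
  "    tf.keras.layers.Activation(\"relu\"),\n",
  "    tf.keras.layers.MaxPool2D(),\n",
  "    tf.keras.layers.Flatten(),\n",
  "    tf.keras.layers.Dropout(0.25),\n",
  "    tf.keras.layers.Dense(128, activation=\"relu\",\n",
  "                          kernel_initializer=\"he_normal\"),\n",
  "    tf.keras.layers.Dropout(0.5),\n",
  "    tf.keras.layers.Dense(10, activation=\"softmax\"),\n",
  "])\n",
  "```"
 ]
},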
{
"cell_type": "markdown",
"metadata": {
"id": "ax165YCQpK0B"
},
"source": [
"## 10. Use transfer learning for large image classification"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "n5KdUYhHpK0B"
},
"source": [
"_Exercise: Use transfer learning for large image classification, going through these steps:_\n",
"\n",
"* _Create a training set containing at least 100 images per class. For example, you could classify your own pictures based on the location (beach, mountain, city, etc.), or alternatively you can use an existing dataset (e.g., from TensorFlow Datasets)._\n",
"* _Split it into a training set, a validation set, and a test set._\n",
"* _Build the input pipeline, including the appropriate preprocessing operations, and optionally add data augmentation._\n",
"* _Fine-tune a pretrained model on this dataset._"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "STW6EOmbpK0C"
},
"source": [
"See the Flowers example above."
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "Bl9zizRopK0C"
},
"source": [
"## 11.\n",
"_Exercise: Go through TensorFlow's [Style Transfer tutorial](https://homl.info/styletuto). It is a fun way to generate art using Deep Learning._\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "W1yw_8PSpK0C"
},
"source": [
"Simply open the Colab and follow its instructions."
]
}
],
"metadata": {
"accelerator": "GPU",
"colab": {
"name": "14_deep_computer_vision_with_cnns.ipynb",
"provenance": []
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.8.12"
},
"nav_menu": {},
"toc": {
"navigate_menu": true,
"number_sections": true,
"sideBar": true,
"threshold": 6,
"toc_cell": false,
"toc_section_display": "block",
"toc_window_display": false
},
"widgets": {
"application/vnd.jupyter.widget-state+json": {
"0401482a18a94f22b95d5321bfa6f414": {
"model_module": "@jupyter-widgets/controls",
"model_module_version": "1.5.0",
"model_name": "ProgressStyleModel",
"state": {
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "ProgressStyleModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "StyleView",
"bar_color": null,
"description_width": ""
}
},
"1c08c78c0d484eed9638ad2b757ab584": {
"model_module": "@jupyter-widgets/base",
"model_module_version": "1.2.0",
"model_name": "LayoutModel",
"state": {
"_model_module": "@jupyter-widgets/base",
"_model_module_version": "1.2.0",
"_model_name": "LayoutModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "LayoutView",
"align_content": null,
"align_items": null,
"align_self": null,
"border": null,
"bottom": null,
"display": null,
"flex": null,
"flex_flow": null,
"grid_area": null,
"grid_auto_columns": null,
"grid_auto_flow": null,
"grid_auto_rows": null,
"grid_column": null,
"grid_gap": null,
"grid_row": null,
"grid_template_areas": null,
"grid_template_columns": null,
"grid_template_rows": null,
"height": null,
"justify_content": null,
"justify_items": null,
"left": null,
"margin": null,
"max_height": null,
"max_width": null,
"min_height": null,
"min_width": null,
"object_fit": null,
"object_position": null,
"order": null,
"overflow": null,
"overflow_x": null,
"overflow_y": null,
"padding": null,
"right": null,
"top": null,
"visibility": null,
"width": null
}
},
"2839afc6cb6d4a50b0bdad1fcb7f39d1": {
"model_module": "@jupyter-widgets/controls",
"model_module_version": "1.5.0",
"model_name": "HBoxModel",
"state": {
"_dom_classes": [],
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "HBoxModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/controls",
"_view_module_version": "1.5.0",
"_view_name": "HBoxView",
"box_style": "",
"children": [
"IPY_MODEL_eefd1a01ef1c46e09ffbd97ad25377cf",
"IPY_MODEL_d142189db76a4681a22f38ae252e4ebc",
"IPY_MODEL_d441368305704ab9a3bdbe762ab340a4"
],
"layout": "IPY_MODEL_1c08c78c0d484eed9638ad2b757ab584"
}
},
"54a90429726b4d848358cafae87ad893": {
"model_module": "@jupyter-widgets/base",
"model_module_version": "1.2.0",
"model_name": "LayoutModel",
"state": {
"_model_module": "@jupyter-widgets/base",
"_model_module_version": "1.2.0",
"_model_name": "LayoutModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "LayoutView",
"align_content": null,
"align_items": null,
"align_self": null,
"border": null,
"bottom": null,
"display": null,
"flex": null,
"flex_flow": null,
"grid_area": null,
"grid_auto_columns": null,
"grid_auto_flow": null,
"grid_auto_rows": null,
"grid_column": null,
"grid_gap": null,
"grid_row": null,
"grid_template_areas": null,
"grid_template_columns": null,
"grid_template_rows": null,
"height": null,
"justify_content": null,
"justify_items": null,
"left": null,
"margin": null,
"max_height": null,
"max_width": null,
"min_height": null,
"min_width": null,
"object_fit": null,
"object_position": null,
"order": null,
"overflow": null,
"overflow_x": null,
"overflow_y": null,
"padding": null,
"right": null,
"top": null,
"visibility": null,
"width": null
}
},
"57cbb645792f45adbfab9b29aa708809": {
"model_module": "@jupyter-widgets/controls",
"model_module_version": "1.5.0",
"model_name": "DescriptionStyleModel",
"state": {
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "DescriptionStyleModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "StyleView",
"description_width": ""
}
},
"8f0660be3bf44dd48fd42cd52a507e32": {
"model_module": "@jupyter-widgets/controls",
"model_module_version": "1.5.0",
"model_name": "DescriptionStyleModel",
"state": {
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "DescriptionStyleModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "StyleView",
"description_width": ""
}
},
"b681dc2200ad4ee397a46602e8f4f654": {
"model_module": "@jupyter-widgets/base",
"model_module_version": "1.2.0",
"model_name": "LayoutModel",
"state": {
"_model_module": "@jupyter-widgets/base",
"_model_module_version": "1.2.0",
"_model_name": "LayoutModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "LayoutView",
"align_content": null,
"align_items": null,
"align_self": null,
"border": null,
"bottom": null,
"display": null,
"flex": null,
"flex_flow": null,
"grid_area": null,
"grid_auto_columns": null,
"grid_auto_flow": null,
"grid_auto_rows": null,
"grid_column": null,
"grid_gap": null,
"grid_row": null,
"grid_template_areas": null,
"grid_template_columns": null,
"grid_template_rows": null,
"height": null,
"justify_content": null,
"justify_items": null,
"left": null,
"margin": null,
"max_height": null,
"max_width": null,
"min_height": null,
"min_width": null,
"object_fit": null,
"object_position": null,
"order": null,
"overflow": null,
"overflow_x": null,
"overflow_y": null,
"padding": null,
"right": null,
"top": null,
"visibility": null,
"width": null
}
},
"d142189db76a4681a22f38ae252e4ebc": {
"model_module": "@jupyter-widgets/controls",
"model_module_version": "1.5.0",
"model_name": "FloatProgressModel",
"state": {
"_dom_classes": [],
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "FloatProgressModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/controls",
"_view_module_version": "1.5.0",
"_view_name": "ProgressView",
"bar_style": "success",
"description": "",
"description_tooltip": null,
"layout": "IPY_MODEL_54a90429726b4d848358cafae87ad893",
"max": 5,
"min": 0,
"orientation": "horizontal",
"style": "IPY_MODEL_0401482a18a94f22b95d5321bfa6f414",
"value": 5
}
},
"d441368305704ab9a3bdbe762ab340a4": {
"model_module": "@jupyter-widgets/controls",
"model_module_version": "1.5.0",
"model_name": "HTMLModel",
"state": {
"_dom_classes": [],
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "HTMLModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/controls",
"_view_module_version": "1.5.0",
"_view_name": "HTMLView",
"description": "",
"description_tooltip": null,
"layout": "IPY_MODEL_f8ef3c06db574e3f88dc9a8c0bcd22ab",
"placeholder": "",
"style": "IPY_MODEL_8f0660be3bf44dd48fd42cd52a507e32",
"value": " 5/5 [00:10&lt;00:00, 2.12s/ file]"
}
},
"eefd1a01ef1c46e09ffbd97ad25377cf": {
"model_module": "@jupyter-widgets/controls",
"model_module_version": "1.5.0",
"model_name": "HTMLModel",
"state": {
"_dom_classes": [],
"_model_module": "@jupyter-widgets/controls",
"_model_module_version": "1.5.0",
"_model_name": "HTMLModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/controls",
"_view_module_version": "1.5.0",
"_view_name": "HTMLView",
"description": "",
"description_tooltip": null,
"layout": "IPY_MODEL_b681dc2200ad4ee397a46602e8f4f654",
"placeholder": "",
"style": "IPY_MODEL_57cbb645792f45adbfab9b29aa708809",
"value": "Dl Completed...: 100%"
}
},
"f8ef3c06db574e3f88dc9a8c0bcd22ab": {
"model_module": "@jupyter-widgets/base",
"model_module_version": "1.2.0",
"model_name": "LayoutModel",
"state": {
"_model_module": "@jupyter-widgets/base",
"_model_module_version": "1.2.0",
"_model_name": "LayoutModel",
"_view_count": null,
"_view_module": "@jupyter-widgets/base",
"_view_module_version": "1.2.0",
"_view_name": "LayoutView",
"align_content": null,
"align_items": null,
"align_self": null,
"border": null,
"bottom": null,
"display": null,
"flex": null,
"flex_flow": null,
"grid_area": null,
"grid_auto_columns": null,
"grid_auto_flow": null,
"grid_auto_rows": null,
"grid_column": null,
"grid_gap": null,
"grid_row": null,
"grid_template_areas": null,
"grid_template_columns": null,
"grid_template_rows": null,
"height": null,
"justify_content": null,
"justify_items": null,
"left": null,
"margin": null,
"max_height": null,
"max_width": null,
"min_height": null,
"min_width": null,
"object_fit": null,
"object_position": null,
"order": null,
"overflow": null,
"overflow_x": null,
"overflow_y": null,
"padding": null,
"right": null,
"top": null,
"visibility": null,
"width": null
}
}
}
},
"accelerator": "GPU"
},
"nbformat": 4,
"nbformat_minor": 4
}