{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Chapter 15 Processing Sequences Using RNNs and CNNs**"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"_This notebook contains all the sample code and solutions to the exercises in chapter 15._"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"<table align=\"left\">\n",
" <td>\n",
" <a href=\"https://colab.research.google.com/github/ageron/handson-ml3/blob/main/15_processing_sequences_using_rnns_and_cnns.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>\n",
" </td>\n",
" <td>\n",
" <a target=\"_blank\" href=\"https://kaggle.com/kernels/welcome?src=https://github.com/ageron/handson-ml3/blob/main/15_processing_sequences_using_rnns_and_cnns.ipynb\"><img src=\"https://kaggle.com/static/images/open-in-kaggle.svg\" /></a>\n",
" </td>\n",
"</table>"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "dFXIv9qNpKzt",
"tags": []
},
"source": [
"# Setup"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "8IPbJEmZpKzu"
},
"source": [
"This project requires Python 3.7 or above:"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"id": "TFSU3FCOpKzu"
},
"outputs": [],
"source": [
"import sys\n",
"\n",
"assert sys.version_info >= (3, 7)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "GJtVEqxfpKzw"
},
"source": [
"And TensorFlow ≥ 2.8:"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"id": "0Piq5se2pKzx"
},
"outputs": [],
"source": [
"from packaging import version\n",
"import tensorflow as tf\n",
"\n",
"assert version.parse(tf.__version__) >= version.parse(\"2.8.0\")"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "DDaDoLQTpKzx"
},
"source": [
"As we did in earlier chapters, let's define the default font sizes to make the figures prettier:"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"id": "8d4TH3NbpKzx"
},
"outputs": [],
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"plt.rc('font', size=14)\n",
"plt.rc('axes', labelsize=14, titlesize=14)\n",
"plt.rc('legend', fontsize=14)\n",
"plt.rc('xtick', labelsize=10)\n",
"plt.rc('ytick', labelsize=10)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "RcoUIRsvpKzy"
},
"source": [
"And let's create the `images/rnn` folder (if it doesn't already exist), and define the `save_fig()` function which is used through this notebook to save the figures in high-res for the book:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"id": "PQFH5Y9PpKzy"
},
"outputs": [],
"source": [
"from pathlib import Path\n",
"\n",
"IMAGES_PATH = Path() / \"images\" / \"rnn\"\n",
"IMAGES_PATH.mkdir(parents=True, exist_ok=True)\n",
"\n",
"def save_fig(fig_id, tight_layout=True, fig_extension=\"png\", resolution=300):\n",
" path = IMAGES_PATH / f\"{fig_id}.{fig_extension}\"\n",
" if tight_layout:\n",
" plt.tight_layout()\n",
" plt.savefig(path, format=fig_extension, dpi=resolution)"
]
},
{
"cell_type": "markdown",
"metadata": {
"id": "YTsawKlapKzy"
},
"source": [
"This chapter can be very slow without a GPU, so let's make sure there's one, or else issue a warning:"
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"id": "Ekxzo6pOpKzy"
},
"outputs": [],
"source": [
"if not tf.config.list_physical_devices('GPU'):\n",
" print(\"No GPU was detected. Neural nets can be very slow without a GPU.\")\n",
" if \"google.colab\" in sys.modules:\n",
" print(\"Go to Runtime > Change runtime and select a GPU hardware \"\n",
" \"accelerator.\")\n",
" if \"kaggle_secrets\" in sys.modules:\n",
" print(\"Go to Settings > Accelerator and select GPU.\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Basic RNNs"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's download the ridership data from the ageron/data project. It originally comes from Chicago's Transit Authority, and was downloaded from the [Chicago's Data Portal](https://homl.info/ridership)."
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading data from https://github.com/ageron/data/raw/main/ridership.tgz\n",
"114688/108512 [===============================] - 0s 0us/step\n",
"122880/108512 [=================================] - 0s 0us/step\n"
]
},
{
"data": {
"text/plain": [
"'./datasets/ridership.tgz'"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"tf.keras.utils.get_file(\n",
" \"ridership.tgz\",\n",
" \"https://github.com/ageron/data/raw/main/ridership.tgz\",\n",
" cache_dir=\".\",\n",
" extract=True\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"from pathlib import Path\n",
"\n",
"path = Path(\"datasets/ridership/CTA_-_Ridership_-_Daily_Boarding_Totals.csv\")\n",
"df = pd.read_csv(path, parse_dates=[\"service_date\"])\n",
"df.columns = [\"date\", \"day_type\", \"bus\", \"rail\", \"total\"] # shorter names\n",
"df = df.sort_values(\"date\").set_index(\"date\")\n",
"df = df.drop(\"total\", axis=1) # no need for total, it's just bus + rail\n",
"df = df.drop_duplicates() # remove duplicated months (2011-10 and 2014-07)"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"data": {
"text/html": [
"<div>\n",
"<style scoped>\n",
" .dataframe tbody tr th:only-of-type {\n",
" vertical-align: middle;\n",
" }\n",
"\n",
" .dataframe tbody tr th {\n",
" vertical-align: top;\n",
" }\n",
"\n",
" .dataframe thead th {\n",
" text-align: right;\n",
" }\n",
"</style>\n",
"<table border=\"1\" class=\"dataframe\">\n",
" <thead>\n",
" <tr style=\"text-align: right;\">\n",
" <th></th>\n",
" <th>day_type</th>\n",
" <th>bus</th>\n",
" <th>rail</th>\n",
" </tr>\n",
" <tr>\n",
" <th>date</th>\n",
" <th></th>\n",
" <th></th>\n",
" <th></th>\n",
" </tr>\n",
" </thead>\n",
" <tbody>\n",
" <tr>\n",
" <th>2001-01-01</th>\n",
" <td>U</td>\n",
" <td>297192</td>\n",
" <td>126455</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2001-01-02</th>\n",
" <td>W</td>\n",
" <td>780827</td>\n",
" <td>501952</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2001-01-03</th>\n",
" <td>W</td>\n",
" <td>824923</td>\n",
" <td>536432</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2001-01-04</th>\n",
" <td>W</td>\n",
" <td>870021</td>\n",
" <td>550011</td>\n",
" </tr>\n",
" <tr>\n",
" <th>2001-01-05</th>\n",
" <td>W</td>\n",
" <td>890426</td>\n",
" <td>557917</td>\n",
" </tr>\n",
" </tbody>\n",
"</table>\n",
"</div>"
],
"text/plain": [
" day_type bus rail\n",
"date \n",
"2001-01-01 U 297192 126455\n",
"2001-01-02 W 780827 501952\n",
"2001-01-03 W 824923 536432\n",
"2001-01-04 W 870021 550011\n",
"2001-01-05 W 890426 557917"
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"df.head()"
]
},
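{
"cell_type": "markdown",
"metadata": {},
"source": [
"Each row gives the bus and rail ridership for one day. The `day_type` column contains `W` for Weekdays, `A` for Saturdays, and `U` for Sundays or holidays."
]
},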
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's look at the first few months of 2019 (note that Pandas treats the range boundaries as inclusive):"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAjAAAADsCAYAAABqkpwSAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAC2I0lEQVR4nOydd5gcV5X2f7c694TumZFmlHOwZNmyJUuW5TSOmGiyE2DAmIWFJS6sDcuyS1pYk5ewNgZsg20MBoMxxtnjJMlKliwrx1FOkztVd1Xd749b1dOTu3u6a4y+eZ9Hj2aqq6rvVN1w7jnveY+QUjKKUYxiFKMYxShG8Y8EbaQbMIpRjGIUoxjFKEZRKEYNmFGMYhSjGMUoRvEPh1EDZhSjGMUoRjGKUfzDYdSAGcUoRjGKUYxiFP9wGDVgRjGKUYxiFKMYxT8cRg2YUYxiFKMYxShG8Q8H70g3oNSIRqNy1qxZI92MUYxiSMTjcSoqKka6GaMYRRajfXIUrzesW7fupJRybH+fnXIGTENDA2vXrh3pZoxiFEOiqamJxsbGkW7GKEaRxWifHMXrDUKI5oE+Gw0hjWIUoxjFKEYxin84jBowoxjFKEYxilGM4h8OowbMKEYxilGMYhSj+IfDqAEzilGMYhSjGMUo/uEwasCMYhSvc6xrbuOnz+5iXXPbSDdlFKMYxSheNzjlspBGMYpTCeua27jm9pWYliTg07j3I8tYPLVmpJs1ilMQ65rbeGR3mqrpbaN9bBT/EBj1wIxiFK9j/GblPgxLIoGMYbFqT8tIN2kUpyDWNbdx3R2r+OPODDfcuWrU2zeKfwiMGjCjGMXrFMc7Uzy99Vj2d00TLJtRN4ItGsWpilV7TpI2rbwN5dGw5iheDxgNIY1iFK9DSCn51wdfJWNJfnztWfzHw5uZEAmOuvZHURZMrg1nf/Zo2qCG8mhYcxSvF4x6YEYxihGAwzcYaAd7z8pmnt9xgi+/eT5vO2siH71oBluOdLH7RMzllo7i/wd0JDIAeAScPSU6qEHy5Jajo2HNUbwuMGrAjGIULmNdcxvX/2JgvsHOY11869GtXDJ3LO87dwoA7148Ca8meGDNgbK3bTQ08P8fVuxuYWI0xBVTfaxtbuN4Z2rAcw+3J7M/+7yDe2tKgdE+OYqBMGrAjGIULmPVnhZ0Q/ENUhmL36zch2lJANKGxad/t4HKgJf/efdChBAA1FcFuXxeAw+uO4humGVpl2NY3fb4dq7/xSiR002s2HWS/31m54g8c8uSrNzTwnkz62ic7MW0JH9Yd7DfczuSGZ7eehyASMhb9vDRuuY2rrP75HV3rGLdvtayfdcoXn9Y19yGp7Ju3ECfjxow/+B4eU8LPxmhiW8UxWHZjDqE/bMA/rzhMJd+r4lvPLKF63+xii1HOvn2u85kbFWgx3XXLp1MazzNk1uO9blnKbBqTwtpwwJANyxW7j5Zlu/5/xm53oSjHSl+s6qZq3/yItff+TLfe2IH192x0vWxvOVIJ+2JDOfPqmNchcbymXXcv3o/lm1U5+Lel5uJp02WTKtBSsrOfbnzhT3ZPpk2LT77wEZe2T/w81m7r5XvP7F9dD48BbCuuY1r71iJp6Jm4kDnnHIk3taUZNWekyybMWakm1J2rNh1khvufBkJBH27XldkunXNbby48wQXzB77umnT6wWLp9YQDfuIeE1uu/5cTnTp/PCpHdz54l4APJqgtsLf57oLZ49lYjTE71Yf4C1nTih5u5bNqMPrEWRMtXAd7hg4jDCKwuF4uNKGBQKkbR/UhH0IQAJpU/KVP2/itx9Z1m8fKAdW7lYclvNmjGHbK7u4/twpfPK+V3hh10kunjM2e14qY/KrF/dx4ewxnDU5ytrmNixLomlioFsXjYxp8fVHtvD3146iCWXoCyFoT6Z5x89WcOX8Bt50xngOtCWorwrQkczwxOZjrLUNl9uf38N9N5duPlzX3MaqPS0sm1HX454DHR9pvF7bVQh+v2a/mosG6V55GTBCiM8CH0GNsU3Ah4Aw8AAwDdgHvFdK2WaffytwE2ACn5JSPm4fXwzcBYSAR4FPSymlECIA3AMsBlqAa6SU++xrbgT+3W7KN6SUdw/W1s605AO/WsP9Jey8r1d894ntOHskPaPIdK+Hv9nJUjAsyU+e3cXvPnre66JdrycYlmRWVGPJtFoA9p6M870ntmNJQMp+36VHE1yzZDLff3IHzS1xptZVlLRNi6fWcOPyadz5wl7OmFjN79cc4PqlU1gwMVLS7/n/BWoROUlDdZD9LQnufXk/uu1NQMKFs8fw1bfOpyOR4YZfvkzGsBBCsP1YF5d//zluXD4NrwbLZowp6/h5afdJZoytYFwkyDbgyvnjqKvwc9/LzT0MmIdeOcTJmM7HLj6LrUc6kRJiaYPqoK9kbVnX3MbTW4/x7LbjbD3axUcvmsHl8+pZs6+NZTPqmDuuil+9uJefN+3miV6eyJpwdzsyZunmw3XNbdxwpzI8PZrgxuXTiAR97Dzexd9ePYolX1/ZWE7YLW1YeDXB/71vMZfPbxjpZhWEXcdj/PXVI8p2kbKvK9DGkAaMEGIi8ClgvpQyKYT4PXAtMB94Wkr5bSHELcAtwL8JIebbn58OTACeEkLMkVKawM+BjwKrUAbMVcDfUcZOm5RylhDiWuA7wDVCiFrgq8A5KONpnRDiYcdQGghGCTvv6xUrdp1k/f52PJrAtDMC3NqxDYX7Vzdj2O7njCl5csvR1/27cHPHIqUkmTYJeruH37IZdfi9GhnDGpQY+d5zJvPDp3bwwJoDfPGq00retjGVKmz1f+8/h3f89CU++8AG/vovFxD0eUr+XacyHPe3480SwPwJVXQkM1hS4vdqfObyOcyqrwLg3o8sy/a/ioCHT967nh88uQMor3c1Y1qs3tvKuxZNyh7zezXefc4k7nxhL8c6UzRUBzEtyS+e38MZEyMsn1nHoTZF5O1IZIoyYJzxtmhKlBljK+lIZli9p5X/emRz9pl99oo5fPqy2QAsnd49Hj512WzShsVPn92FBDQBH7t4JpfNa8gu3J4Saiat2tOCnlGcNcuU3PmC8pR6NYFpr61ONtbrYZ57ceeJbNjNsCQf/c1a3n7WRD58wXR0u52vZ89MS0znw3etIez38L33LOStP20/PNC5+YaQvEBICJFBeV4OA7cCjfbndwNNwL8BVwO/k1LqwF4hxC5gqRBiH1AtpVwJIIS4B3g7yoC5GvhP+14PAj8Rir34BuBJKWWrfc2TKKPn/sEaW8rO+3pEZyrDFx58lRljKvjG2xewak8L963ez+3P7eZtCydQERi5yOCOY108+upR2+ULloRnt5/g81fOxed5fVKuVuw6yY2/Xk3GlAS8Wkldz/0hbVoYlsSfYxMsnlrTYxEb6PvHRYJcelo9v197kM9eMafkzzSRVgTh8dVBbnvPQm781Wpue3w7X3nL/JJ+z6mORzcd6WG8/PMlM/nCG04b0FBePLWmx+9vP3si33tihworlXFxfPVgO4m0yfKZP
efL65ZM4fbn9vCHtQf45KWzeXLLUfacjPOT689GCEF1SBktnalMwd/phNKy3qh+oAllIAyES06r584X92QN/svmNbB4ag0/vvYsPvbb9Xz0whkle14tMR2Jeo9+r8bPbljEBbPH8NqhzuzfIcTrZ8050JYA1DP0eTQum1fP45uP8qdXDuE8Ur/39eMxykUqY/JPv1nHsc4U9390GYum1GDGWo4OdP6QK52U8pAQ4rvAfiAJPCGlfEII0SClPGKfc0QIUW9fMhHlYXFw0D6WsX/ufdy55oB9L0MI0QHU5R7v55oB8dGLStd5X4/42l+3cKQjyYMfX86iKTUsnzWG82eN4dpfrOIbf9vCf7/zzBFpl2M5VwS9/ODas9h1PIZhSX7w5A6+98QObnlj/x4Dt+O165rbWLn7JJUBL5sOdfLwxkPZxUY3LP7+2pGytiNpGwkBT88JuvciNhCuWzqFp7au5emtx7lqwYAE/aKQ0A1CPg+aJrh4zlg+cN5UfvniXi6bV8/ymcXzyk6FmHwh2Hi
"text/plain": [
"<Figure size 576x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"import matplotlib.pyplot as plt\n",
"\n",
"df[\"2019-03\":\"2019-05\"].plot(grid=True, marker=\".\", figsize=(8, 3.5))\n",
"save_fig(\"daily_ridership_plot\") # extra code saves the figure for the book\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAjAAAAFYCAYAAABNvsbFAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAEAAElEQVR4nOyddXhdVfb3P+e6xj2NNE3qnrpQo8VpseIUH9wdhmHwYXAGd0qhQKHQFilV6pa6t0njnlx3Oe8fN0mTNnJvmvub4SXf5+nT5Ny9z105sveS71pLEEWRbnSjG93oRje60Y0/EyT/bQG60Y1udKMb3ehGN0JFtwLTjW50oxvd6EY3/nToVmC60Y1udKMb3ejGnw7dCkw3utGNbnSjG93406FbgelGN7rRjW50oxt/OnQrMN3oRje60Y1udONPB9l/W4D/BqKiosTs7Oz/thjd6MZJsNlsaLXa/7YY3ehG97PYjf8J5OXl1YqiGN/aZ39JBSYxMZFt27b9t8XoRjdOwurVq5k8efJ/W4xudKP7WezG/wQEQShq67PuEFI3utGNbnSjG93406FbgelGN7rRjW50oxt/OnQrMN3oRje60Y1udONPh24Fphvd6EY3utGNbvzp0K3AdKMb3ehGN7rRjT8duhWYbnSjG93oRje68adDtwLTjW78CZFXZODtVUfJKzL8t0XpRje60Y3/CroVmG5040+GvCIDl76/kX8vPcQVH276n1JiXtr6Elf/cjUev+e/LUo3TgHrj9ayON/9P/VsAdg99v+2CN34H0K3AtONbvzJ8Pryw3j9IgBur59NBXX/ZYmOo2dkT2LVsf9tMbpxClhzqIYrP9rM90c8XPlRxwry/5U30OP38Pf1f2dp4dKwfk83/jzoVmC60Y0/ET5ed4y1R2qRCIHfRSAr7r9b7l0URXbX7Abgkt6X8PqU15FL5P9VmbrReczbcrzwqacDBfm3vRVc8t4GXvn9UFDKzqlAKkgptZby67FfOxybb8zH6DSGTZZu/G+gW4HpRjf+R5BXZOCHI27yCutb/Xze5iKeWbKfswYmMf/mMdwyKQu5VOD3/VVhl6s9C/v7I99z1S9XsatmV9OxNaVr+P7w92GV6/9X/Lf5TaWGZmEaQWBMVuseNVEU+ddvh/CL4Bc7VnZOBX7RT7W9mr/1eZ5s8bZ2r40oijy05iFuXX4roiiGRZ5u/G/gL9kL6a+CvCIDmwrqGJMVS25G9H9bnG60g7wiA5d/uAm318/i/I1cMCyV6yf0ZEBKBNuLjXy8toBf9lYytW8Cb1w2DIVMwqiesYDA+2vyuXFiTwakRIZVLk3KQm4aNZl7x1zVYsx5vc5DRGRw3OCmYz8e/ZFSSykX5FyARAiPnbS+bD3Li5fzyKhHUEqVYfmO/2vkFRm44sNNeHx+FDIJ824c83/67laYHOyvsHDpiDTyjpaRb/QToWp9m1i0q5xjtTYABEAuk7Sp7Jwqtldt57ql1+EsuQGvNQel/AAfXZvLhF49ThorCAJX9XqEvOIKthcbw379fH4fUokUCCha4Xre/4rIKzIg1cUmtfV5l1xpQRDuFQRhnyAIewVB+FoQBJUgCDGCICwTBOFIw//RzcY/KgjCUUEQDgmCcEaz47mCIOxp+OxNQRCEhuNKQRC+aTi+WRCEzGZz5jR8xxFBEOYEI6/JJf7PkdO6Go0L4ctLw+/aDRXLDx/m7iUf/0/J9N/GpoI6PF4/EAgLLdxZxrlvrWPU88uZ/d5GftlbiUSAmyb2RCE7/treOrkXkWo5L/56MMxy+RDllWwvPwYEFupnNz2Ly+dCKVVySe9LaHhdAXhyzJPMO2deWBfzfGM+Cw4v+FMTO7cW1vOPn/by3M/7uXv+Dq77dAsurx+/CE6Pn6+3FP2fehEW7SxHFAPP1d3DVeiUEm789S7m7p8LBHgoT214im3l+3h68X6G9IhEAMZkxYRN2aq3ufl6gw1X9XQ8tgxEwY0042UeXvkyO0uM5BXWN3mszA4XX28p5qH5RSw4vJCr5n0ZtnXGL/p56I+HuG/FU7y96ijLDx9m+nfTWVe2rtXxh+oP8UvBL2GRpTX8tz15p4pNBbXMfn8jUm10altjTtkDIwhCKnAX0F8URYcgCN8ClwH9gRWiKL4oCMIjwCPAw4Ig9G/4fACQAiwXBKG3KIo+4F3gZmAT8AtwJvArcANgEEUxWxCEy4B/AZcKghAD/AMYQWDdzxMEYZEoiu3eMYNL5MqPNv2fWzdtodJWyXObn+OmQTcxOH5wxxOCwLL9lbgaNkSnx8/G/Nr/ib81r7CeW77YjaLH9yw7UMDc2Q/8T8jViNUlq5mYOrHJovq/wpisWGRSAY9PRCmT8P7VuVSbXbz3Rz41FjcQsHK3FxsZ2yuuaV6kWs4dU7J59ucDrDtSy4ScuDa+ofNyKWLWI9EexFF0KzUuHXa3l1JbAd8e+pYUXQrXD7z+pHlRqigAvH4vNo+NSGXXe4euGXANV/S7AplEhl/0I4ri//l9CxaN3tBhaVEIgsC2wnqWH6hiV6mpaUy0Rs6AlAi2Fhrw+UVEYEFeGQcqLNw6uReJESq2HKsPi0fV4/cgl8j5cWc5ydkL+PjQJqYppnHrlBze3GvjaE1AzgJjAcuKllFYnInJkcDn1+cy59NtZMZpu1SmvCIDG/JrMdk9fJdXitXlZUb/K1lpqsbj8+M1jKPWlc6st9cjEFj8Bfyo0j/GZ8/C45uAQnsYjyOTTQV1XSbb2iM1/LqnnOwEPSlRGkpqFOwodLCo9hBKTR2jc/uQoc8A4EDdAd7b9R6vT3kdQRBYUrCErw58xYzMGcgkXR/8sHls7KzeSe/o3hTXyLj8k9/xy4sQlg/ki+tHtVg3/lcgimILw6cRBpubB77bjc8vBha+NtBVV1EGqAVB8AAaoBx4FJjc8PnnwGrgYWAmMF8URRdwTBCEo8AoQRAKgQhRFDcCCILwBTCLgAIzE3iq4VwLgP80eGfOAJaJoljfMGcZAaXn644EbozX/i9sniaXidUlq5nTPygHUoeos7pYvKscoOnl3lNmavNh+b9CvuEYT/58AK9XicTWC4+pP2sO1/zX70HjddlVs4s7V97J38f8ndl9ZgP/d2G43IxoLh+Vzhcbi/hozggm5sQD0CtBx5UNIYW2XPRXjcng0/WFvPjbARb1moBE0nX3ODcjmn5J0RyxKLh/em9eW36Yv83N46M5I1h/+Xp0cl2bc/2in6t+uYoe+h68POnlLpOpORqVl6c2PIVEkPCPsf/4rz7jraHRG9poUAAIAsRpFU2/SwS4cWJPbp+S0/TMjciIpqjeznur87njqx0IDfO6OrT01IanOFB/gGdGfsSBCjPTx/UiIyIJ6uD68T2Zu/Fv7NijxD9RpE9MH57Jnc/1n+zilklZbDcuQUz+jkrLg53+/p93l7PyYDUZsVoS9EoOVpr5clNxU6Zd7zQT/5jSg1n9hrGj2NTwPo4jJ1HHA9/uauKAiYKPBFUqg9N689smFbajj6GQnlpYy+6xs7pkN
QPjBrL5sMBDP65Fk/E+jp1z8LsTgbFNY132WEZp7iMtIg0IGEOl1lJMLhNRqiiu7n81V/W7CqnQtUp24/pVaavkluW38O9J/2br3nS8kjq0aXOxF93ENZ/AmQOTOWtgElFqOTtKjP8VasHWyq18vu9zXpz4IkqpkgsWXcCNg25kVvaspjHHam1c/9lWKk3OgO7SjgvylBUYURTLBEF4GSgGHMDvoij+LghCoiiKFQ1jKgRBSGiYkkrAw9KI0oZjnoafTzzeOKek4VxeQRBMQGzz463MaRfhjNcGC5PLRKQykj4xfdh+9fYuydwwOz3M+XQLtVY3z8wcgNnp4XCVlZ92lvPszwd44px+/5UF3mh3culPf8MmUSKT3IK39kxEEX7ZW864/nZGpw5tdV64FYhqezUP/PEAD418CK89nTPjHiVFOok1h2v4fntpwKUOyCQCX900uoF3Eh5EquUIwPhmllJuRjTzbhrT4hr8UfIHr+W9xtyz56JX6FHJpdw/ozf3fbuLxbvLmTk0qFcgaPR
"text/plain": [
"<Figure size 576x360 with 2 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"diff_7 = df[[\"bus\", \"rail\"]].diff(7)[\"2019-03\":\"2019-05\"]\n",
"\n",
"fig, axs = plt.subplots(2, 1, sharex=True, figsize=(8, 5))\n",
"df.plot(ax=axs[0], legend=False, marker=\".\") # original time series\n",
"df.shift(7).plot(ax=axs[0], grid=True, legend=False, linestyle=\":\") # lagged\n",
"diff_7.plot(ax=axs[1], grid=True, marker=\".\") # 7-day difference time series\n",
"axs[0].set_ylim([170_000, 900_000]) # extra code beautifies the plot\n",
"save_fig(\"differencing_plot\") # extra code saves the figure for the book\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {
"tags": []
},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"data": {
"text/plain": [
"['A', 'U', 'U']"
]
},
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"list(df.loc[\"2019-05-25\":\"2019-05-27\"][\"day_type\"])"
]
},
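{
"cell_type": "markdown",
"metadata": {},
"source": [
"May 25th, 2019 was a Saturday (`A`), May 26th was a Sunday (`U`), and May 27th was Memorial Day, a holiday, which is also coded `U`. That holiday explains the dip we saw at the end of May in the 7-day difference plot above."
]
},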
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Mean absolute error (MAE), also called mean absolute deviation (MAD):"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"bus 43915.608696\n",
"rail 42143.271739\n",
"dtype: float64"
]
},
"execution_count": 12,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"diff_7.abs().mean()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Mean absolute percentage error (MAPE):"
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"bus 0.082938\n",
"rail 0.089948\n",
"dtype: float64"
]
},
"execution_count": 13,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"targets = df[[\"bus\", \"rail\"]][\"2019-03\":\"2019-05\"]\n",
"(diff_7 / targets).abs().mean()"
]
},
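{
"cell_type": "markdown",
"metadata": {},
"source": [
"For reference, here are the two metrics we just computed, written out explicitly (with $y^{(t)}$ denoting the target at time step $t$, and $\\hat{y}^{(t)}$ the naive forecast, i.e., the value 7 days earlier):\n",
"\n",
"$\\mathrm{MAE} = \\dfrac{1}{N} \\sum_t \\left| y^{(t)} - \\hat{y}^{(t)} \\right| \\qquad \\mathrm{MAPE} = \\dfrac{1}{N} \\sum_t \\left| \\dfrac{y^{(t)} - \\hat{y}^{(t)}}{y^{(t)}} \\right|$"
]
},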
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's look at the yearly seasonality and the long-term trends:"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAjAAAAEQCAYAAACutU7EAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAADg6ElEQVR4nOydd3wbhfn/36dt2ZYt7xk7HtnbmZBA2BsKtKWsDgp0UNpSoINf+23pboECLZRNoWW0rLJHCMQkkDjD2YmTeO9tecjauvv9cTpZsmVbDnFW7/16QWxZ0g2d7j73jM8jSJKEioqKioqKisqJhOZYr4CKioqKioqKykRRBYyKioqKiorKCYcqYFRUVFRUVFROOFQBo6KioqKionLCoQoYFRUVFRUVlRMOVcCoqKioqKionHAcUwEjCMLTgiB0CIKwN8rnf1kQhP2CIOwTBOGFyV4/FRUVFRUVleMT4Vj6wAiCcBpgB/4pSdKccZ5bDLwEnClJkk0QhDRJkjqOxnqqqKioqKioHF8c0wiMJEnrgZ7QxwRBKBQE4X1BEMoFQdggCMKMwJ9uAh6WJMkWeK0qXlRUVFRUVP5HOR5rYB4HbpUkqQS4A/h74PFpwDRBED4TBKFMEITzj9kaqqioqKioqBxTdMd6BUIRBCEOOAV4WRAE5WFj4F8dUAysBnKADYIgzJEkqfcor6aKioqKiorKMea4EjDIEaFeSZIWRPhbE1AmSZIXqBUE4SCyoNl6FNdPRUVFRUVF5TjguEohSZLUjyxOvgQgyMwP/Pl14IzA4ynIKaWaY7GeKioqKioqKseWY91G/SKwCZguCEKTIAjfBK4FvikIwi5gH3BZ4OkfAN2CIOwH1gF3SpLUfSzWW0VFRUVFReXYckzbqFVUVFRUVFRUDofjKoWkoqKioqKiohINqoBRUVFRUVFROeE4Zl1IiYmJUlFR0bFa/KQyODhIbGzssV6NSeNk3j51205cTubtU7ftxORk3jY4OttXXl7eJUlSaqS/HTMBk56ezrZt247V4ieV0tJSVq9efaxXY9I4mbdP3bYTl5N5+9RtOzE5mbcNjs72CYJQP9rf1BSSioqKioqKygmHKmBUVFRUVFRUTjhUAaOioqKioqJywqEKGBUVFRUVFZUTDlXAqKioqKioqJxwjNuFJAjC08DFQIckSXMi/F0AHgQuBBzA1yVJ2n6kV1RFRUVFReVEob+/H4vFQkVFxbFelUkjISHhc22fXq8nLS0Ni8VyWK+Ppo36GeAh4J+j/P0C5KnQxcAy4JHAvyoqKkeR8nobm6q7WFGYQkme9VivjorK/yz9/f20t7czZcoUkpOTke/zTz4GBgaIj48/rNdKkoTT6aS5uRngsETMuAJGkqT1giDkj/GUy4B/SvJQpTJBEBIFQciUJKl1wmujoqJyWJTX2/jK45vw+iWMuipeuGm5KmJUVI4RHR0dZGdn4/f7T1rx8nkRBAGz2Ux2djYtLS2HJWCORA1MNtAY8ntT4DEVFZWjRFlNN16/PJjV6xcpq1EHtauoHCu8Xi8xMTHHejVOCGJiYvB6vYf12iPhxBtJXkYccS0Iws3AzQCpqamUlpYegcUff9jt9pN22+Dk3r4TddsMvb7gzxoBjL31lJY2hT3nRN22aDmZt0/dthOLhIQE7HY7fr+fgYGBY706k8aR2j6Xy3VYx8CREDBNQG7I7zlAS6QnSpL0OPA4wPTp06WT1WJZtY8+cTlRt62g28HvNq8D4K4LZ3LDyoIRzzlRty1aTubtU7ftxKKiooL4+PjPVSNyInCkts9kMrFw4cIJv+5IpJDeBL4qyCwH+tT6FxWVo8vG6q7gz1mJauhaRUVl4qxevZrvfe97x3o1oiaaNuoXgdVAiiAITcAvAT2AJEmPAu8it1BXIbdRf2OyVlZFRSUyG6u70WsFvH6JAZdv/BeoqKionOBE04V09Th/l4BbjtgaqaioTAhJkthY3c3ygmQ2VHYx6FYFjIqKysmP6sSronKCU91pp8vu5pxZ6QAMevzHeI1UVFROVHw+Hz/4wQ+wWq1YrVbuvPNORFEEID8/n3vvvTfs+cPTTq+99hrz5s0jJiaGpKQkTj/9dNrb2ydlXVUBo6JygvPvrbKLQUqsEb1WUFNIKioqh83zzz+PKIps2rSJxx57jMcff5wHHnggqte2tbXxla98ha997WtUVFSwfv16rr/++klb1yPRhaSionKMKK+38fSntQD86OWdGHUaNYWkonIccvdb+9jf0n9Ulzkry8IvL5k9oddkZmby17/+FUEQmDFjBocOHeIvf/kLP/rRj8Z9bUtLC16vly9+8Yvk5eUBMGfOiAlERww1AqOicgJTVtONGHBd8vpENIKgChgVFZXDZvny5WHuwStWrKC5uZn+/vHF1/z58zn77LOZM2cOV155JY888gidnZ2Ttq5qBEZF5QRmeUEyArJzpF6nwWo2MKAKGBWV446JRkKORzQaDXLfzhChLrparZY1a9ZQVlbGmjVreOqpp/jZz37GJ598wvz584/8+hzxd1RRUTlqlORZiTFoWTQlkedvXE5KvFGNwKioqBw2mzdvDhMpZWVlZGVlYbFYSE1NpbV1yObN5XJx4MCBsNcLgsCKFSv45S9/ydatW8nKyuI///nPpKyrGoFRUTmB8fhEHB4/q6enUZJnJdaoo8/hOdarpaKicoLS0tLCD3/4Q7773e+yZ88e7rnnHn7+858DcOaZZ/L0009z6aWXkpqayq9+9auwCExZWRlr167lvPPOIz09nR07dtDY2MisWbMmZV1VAaOicgJjC4iV5DgDAPFGHc02x7FcJRUVlROYa6+9Fr/fz7JlyxAEgW9+85vcdtttAPzsZz+jrq6Oyy67jLi4OG6//fawGpeEhAQ+++wz/va3v9Hb20tubi6/+MUvuO666yZlXVUBo6JyAtNldwOQHGsEINaoxa6mkFRUVA6D0IGKDz300Ii/WywWXnzxxeDvAwMDYd1JM2fO5L333pvUdQxFrYFRUTmB6bbLEZiUQAQm1qhj0K0a2amoqJz8qAJGReU4pLzexsPrqiivt435vO5BOQKTFDuUQhr0+BBFaayXqaioqJzwqCkkFZXjjI1VXVz/1BYkJAw6Dc/fuJySPGvE5yoRmOQ4JYWkQ5LA4fUTZ1S/3ioqKicvagRGRWUSiTaSEsrbu1vxSxKiJJvTldV0j/rcLrsHvVbAYpLFSmxAtKit1CoqKic76i2aisoksa2uhy8/tglg3EhKKFOSzcGfdVoNywuSR31ut91Ncqwx6JwZHxAydreP9M+z8ioqKirHOWoERkVlkvioogNRIqpISigpgXSQIMC0tDjKarpHjeD0DHqCLdQAsYaAgFEHOqqoqJzkqBEYFZVJItsaE/xZrxs7khJKb8Db5fzZGby3t419rf2jRnC6Bj3B+hdQU0gqKir/O6gRGBWVScJs0AKQn2yOOn0E0Of0IghQmBoHjB3B6ba7SYkdisCEppAmi8Op61EZG3WfqqhMHDUCo6IySTT2OAGIM
eiiFi8gCxiLSc8ZM9J49JNqfKKEdpRamG67J9hCDUMRmMkSMOX1Nq55ogyvX5xQXY/K6JTX27jqsU34RQmjXt2nKirRokZgVFQmicaApb/ilhstvQ4viWY9JXlW/vGNJRh1GuZmJ1CSZw27U3d4fDi9/mEpJDnqM1kppA2Vnbh9IqIEbq/IAx8eUqMGn5Oymm58ooTExGqlVFSOB0pLSxEEga6uroi/TyZqBEZFZZJoCgiYnkEPoiih0QhRva7P6SUxRg/AquJUvn9WMfd8cJBv/XMbHx/swC/K/jD3f3kBQFgRb7xRfp19ktx4p6fHB3+WgA1VXWyt71GjBp+DeTkJwZ8nUiulonI8cMopp9Da2kpy8tE/btUIjIrKJKGkkPyiRK/TO86zh+h1erEEBAzAgtxEAD7Y347XL/vDeHwin1XJdzgpIQLGpNegEcDujn55EyHHKrd4z8ocEjJq1ODzYTXLn59RTcmpHGd4PONPtjcYDGRkZAStHI4mqoBRUZkEvH6R1j4nBSmxwMTSSH0
"text/plain": [
"<Figure size 576x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"period = slice(\"2001\", \"2019\")\n",
"df_monthly = df.resample('M').mean() # compute the mean for each month\n",
"rolling_average_12_months = df_monthly[period].rolling(window=12).mean()\n",
"\n",
"fig, ax = plt.subplots(figsize=(8, 4))\n",
"df_monthly[period].plot(ax=ax, marker=\".\")\n",
"rolling_average_12_months.plot(ax=ax, grid=True, legend=False)\n",
"save_fig(\"long_term_ridership_plot\") # extra code saves the figure for the book\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAjAAAADICAYAAAD2r9syAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAACmBUlEQVR4nOydd3xcV5n+v+dOVdeoV0vuvcqxZdKcQgg1dFKABAjsD5YtwLILIZQFwu5CYGGXsoQQkkASCKGFQCDFceIkVmzLseNuy7Jk9TqSZkaadu/5/XHuvTOjZrk7YZ7Pxx+PZubO3Hvn3nOe877P+7xCSkkaaaSRRhpppJHGqwna+d6BNNJII4000kgjjZNFmsCkkUYaaaSRRhqvOqQJTBpppJFGGmmk8apDmsCkkUYaaaSRRhqvOqQJTBpppJFGGmmk8apDmsCkkUYaaaSRRhqvOjjP9w6cD+Tn58t58+ad7904YwiFQmRlZZ3v3ThjSB/PhY308VzYSB/PhY308ZwcGhsb+6WUxZO99jdJYEpLS9mxY8f53o0zhs2bN7Nx48bzvRtnDOnjubCRPp4LG+njubCRPp6TgxCidarX0imkNNJII4000kjjVYc0gUkjjTTSSCONNF51SBOYNNJII4000kjjVYc0gUkjjdcwGlv9/OCZJhpb/Sf1WhpppJHGhY6/SRFvGmn8LaCx1c+NP2kgEjfwOjUe+Gg9dTW+lNdiuoHbqfHArYnX0kgjjTReDTgjBEYIcQ/wFqBXSrnMfK4A+BVQC7QA75VS+s3XPg98BNCBf5RS/tV8vg64F8gA/gz8k5RSCiE8wP1AHTAAvE9K2WJuczNwu7krX5dS3ncmjimNNF7taGgeIBo3AIjqBg3NAzZJaWgeIGK+FounvpZGGmmcPkZGRujt7SUWi6U8n5eXx4EDB87TXp15nM7xuFwuSkpKyM3NPaXtz1QE5l7g+yiSYeFzwNNSyv8UQnzO/PvfhBBLgOuBpUAF8JQQYoGUUgd+BHwMaEARmGuBx1Fkxy+lnCeEuB74L+B9Jkn6MrAWkECjEOJRiyilkcbfMurnFOJ0CGK6xKkJ6ucUprxmweXUUv5OI400Tg8jIyP09PRQWVlJRkYGQgj7tUAgQE5OznncuzOLUz0eKSVjY2N0dHQAnBKJOSMaGCnlc8DguKevA6xoyH3A25Oe/6WUMiKlPAY0AeuEEOVArpRyq5RSosjQ2yf5rEeAq4S6It4APCmlHDRJy5Mo0pNGGn/zqKvxceslswH45JXzUyIsi8vVgJPrdabTR2mkcYbR29tLZWUlmZmZKeQljQSEEGRmZlJZWUlvb+8pfcbZFPGWSim7AMz/S8znK4G2pPe1m89Vmo/HP5+yjZQyDgwDhdN8VhpppAGU5WUAUJzjSXm+ezgMQDhusGZW/rnerTTSeE0jFouRkZFxvnfjVYGMjIwJabaZ4nyIeCejo3Ka5091m9QvFeJjqPQUxcXFbN68+YQ7+mpBMBhMH88FjPN5PPuPqYFh175DlI82J54f0AGIxg3+8vRmMpwzXyWmf58LG+njOf/Iy8sjGAxO+pqu6wQCgXO8R2cPZ+J4wuHwKf3GZ5PA9AghyqWUXWZ6yIoRtQPVSe+rAjrN56smeT55m3YhhBPIQ6Ws2oGN47bZPNnOSCnvAu4CWLhwoUxbOV+4SB/PmcMe/QgcOkxJ5Sw2blxoPz/Q2A7bdwOwaNU6ZhfNvJdJ+ve5sJE+nvOPAwcOTKkLSWtgJsLr9bJ69eqT3u5sppAeBW42H98M/CHp+euFEB4hxGxgPrDNTDMFhBD1pr7lg+O2sT7r3cAmUyfzV+AaIYRPCOEDrjGfSyONNMCuNAqE4ynPd4+E7cf9wcg53ac00kjjwsTGjRv55Cc/eb53Y8Y4U2XUD6EiIUVCiHZUZdB/Ag8LIT4CHAfeAyCl3CeEeBjYD8SBvzcrkAA+TqKM+nHzH8BPgZ8LIZpQkZfrzc8aFEJ8Ddhuvu+rUsrxYuI00vibRSSubq3xBKZreMx+PJAmMGmkkcarEGeEwEgpb5jipaumeP8dwB2TPL8DWDbJ82FMAjTJa/cA98x4Z9NI428IiQhMqkiuezhMUbab/mCUvmD0fOxaGmmkkcZpId1KII00XsOIxBSBCUZSIzCdQ2GWVOQB0B9IR2DSSCMNhXg8zj/90z/h8/nw+Xx89rOfxTDUOFJbW8udd96Z8v43velNKWmn3/72t6xYsYKMjAwKCgq4/PLL6enpOSv7miYwaaTxGsZUKaTukTDVvgx8mS4GQorApHsjpZHGhYdzfV8+8MADGIbB1q1b+fGPf8xdd93Fd7/73Rlt293dzfXXX8/NN9/MgQMHeO655/jABz5w1vY13QspjTRew5gshRSO6QyGopTneSnK9tAfiNLY6uf6u7YS0yVeV7o3UhppnGn8+x/3sb9zBF3XcTgcM9omEI5xsDuAIUETsKgshxyva8bfuaQily+/delJ7Wd5eTn/8z//gxCCRYsWcfjwYb7zne/w6U9/+oTbdnZ2EovFePe7301NTQ0Ay5ZNUIWcMaQjMGmk8RqGRWCSU0g9ZgVSWV6GIjDBCA3NA8R0ZaEUNXsjpZFGGucXI+E4hulsZkj199lGfX19invwhg0b6OjoYGRk5ITbrly5kquvvpply5bxrne9ix/96Ef09fWdtX1NR2DSSOM1DCuFlDzwdZkuvOV5Xgqz3ezrHKF+TiFCgJRM6JuURhppnD6sSMjJ+KY0tvq56e4GYnEDl1Pje9evPq+RUU3TUA4mCSS76DocDp544gkaGhp44okn+OlPf8rnP/95nn32WVauXHnm9+eMf2IaaaRxwcAS8Ubjhk1mrDYCZXYKKcKaWflkuFRY+wP1Nen0URppXACoq/HxwK31fPqahecsrfvSSy+lkJSGhgYqKirIzc2luLiYrq4u+7VwOMzhw4dTthdCsGHDBr785S+zfft2Kioq+NWvfnVW9jUdgUkjjdcwrBQSQDAcx5PtsCMwZbleinM8BCJx2v1jjEYVwdEnbcaRxplAY8sgW5sH2DC3KE0S05gR6mp85/Ra6ezs5J//+Z/5xCc+wZ49e/jWt77F7bffDsCVV17JPffcw9ve9jaKi4u54447iMcT0d2Ghgaeeuop3vCGN1BaWsrLL79MW1sbS5YsOSv7miYwaaTxGoYVdQFViVSY7aFreIxcr5Msj5PCLDcAW03NiyagZSB0Xvb1tY7GVj/vu6uBuCHxuprSQuk0LkjcdNNN6LrO+vXrEULwkY98hE996lMAfP7zn6elpYXrrruO7OxsvvCFL9DWluinnJeXxwsvvMD//u//MjQ0RHV1NV/84hd5//vff1b2NU1g0kjjNYxI3CDL7SAU1e1S6q7hMOVml+qibNWluuGoIjBrawpo6U8TmLOBhuYB4qYiM2YKpdMEJo0LCckNFb///e9PeD03N5eHHnoo5bkPfOADtqZn8eLFPP744xO2O1tIa2DSSONVglPxg4jGDYpyFEmxSqmP9gaJ6gaNrX77tRePD
pCX4WJtrY82/xgx3ZjyM9M4NdTPKUQziztcDi0tlE4jjdNEmsCkkcarAFY1wrefOMRNdzfMmMRE4oadJgpE4jS2+mnuD3GsP8RNdzfQbfZE6h4Js6A0m9qiLHRD0uEfm+5j0zgF1NX4WFyuVqo/uGlNOvqSRhqniTSBSSONVwEamgeIxA0MeXI+LZGYbqeJAuE4LzQlPBlicYND3QH773klOcwuygLgWFoHc5agQjDWeU4jjTROHWkCk0YarwLUzynEYZpLObWZpx8icYPC7EQKaU5xNqCmUZdT45L5xWR7lBRufkk2tYVqYm1N62DOCF5s6ud/Nx2xI2aWoeD41g5ppJHGySMt4k3jgkRjq5+G5gHq5xSmQ+2o9MM711Ty8I52Pr5xzozOSVw3iBuS4myVQgqG48wrUQTmfRdV85611dTV+CjKdhOMxJlfmk1Rtpsst4OWgdGzejx/C1Bpv5cA+IFZdRQMpwlMGmmcKaQJTBoXHJr8Ot98soGYbuBJ9+Wx4XEqo7lsz9S9UJKJn6W
"text/plain": [
"<Figure size 576x216 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"df_monthly.diff(12)[period].plot(grid=True, marker=\".\", figsize=(8, 3))\n",
"save_fig(\"yearly_diff_plot\") # extra code saves the figure for the book\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"If running on Colab or Kaggle, install the statsmodels library:"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [],
"source": [
"if \"google.colab\" in sys.modules:\n",
" %pip install -q -U statsmodels"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [],
"source": [
"from statsmodels.tsa.arima.model import ARIMA\n",
"\n",
"origin, today = \"2019-01-01\", \"2019-05-31\"\n",
"rail_series = df.loc[origin:today][\"rail\"].asfreq(\"D\")\n",
"model = ARIMA(rail_series,\n",
" order=(1, 0, 0),\n",
" seasonal_order=(0, 1, 1, 7))\n",
"model = model.fit()\n",
"y_pred = model.forecast() # returns 427,758.6"
]
},
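{
"cell_type": "markdown",
"metadata": {},
"source": [
"The `order=(1, 0, 0)` argument sets the regular ARIMA hyperparameters $(p, d, q)$: one autoregressive term, no differencing, and no moving-average terms. The `seasonal_order=(0, 1, 1, 7)` argument sets the seasonal hyperparameters $(P, D, Q, s)$: one seasonal difference and one seasonal moving-average term, with a period of 7 days to capture the weekly seasonality. The fitted results object can also display the estimated coefficients and fit statistics (a quick optional check, not in the book):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# extra code (added sketch) - shows the estimated coefficients, their standard\n",
"# errors, and statistics such as AIC/BIC, which are handy when comparing\n",
"# different (p, d, q) and (P, D, Q, s) orders\n",
"model.summary()"
]
},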
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"427758.62631318445"
]
},
"execution_count": 18,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y_pred[0] # ARIMA forecast"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"379044"
]
},
"execution_count": 19,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"df[\"rail\"].loc[\"2019-06-01\"] # target value"
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"426932"
]
},
"execution_count": 20,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"df[\"rail\"].loc[\"2019-05-25\"] # naive forecast (value from one week earlier)"
]
},
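{
"cell_type": "markdown",
"metadata": {},
"source": [
"On this particular day the SARIMA forecast (≈427,759) is almost identical to the naive forecast (426,932), and both overshoot the actual value (379,044) by a wide margin. A single forecast tells us very little, so let's evaluate the model over a longer period, retraining it every day:"
]
},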
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [],
"source": [
"origin, start_date, end_date = \"2019-01-01\", \"2019-03-01\", \"2019-05-31\"\n",
"time_period = pd.date_range(start_date, end_date)\n",
"rail_series = df.loc[origin:end_date][\"rail\"].asfreq(\"D\")\n",
"y_preds = []\n",
"for today in time_period.shift(-1):\n",
" model = ARIMA(rail_series[origin:today], # train on data up to \"today\"\n",
" order=(1, 0, 0),\n",
" seasonal_order=(0, 1, 1, 7))\n",
" model = model.fit() # note that we retrain the model every day!\n",
" y_pred = model.forecast()[0]\n",
" y_preds.append(y_pred)\n",
"\n",
"y_preds = pd.Series(y_preds, index=time_period)\n",
"mae = (y_preds - rail_series[time_period]).abs().mean() # returns 32,040.7"
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"32040.72008847262"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"mae"
]
},
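{
"cell_type": "markdown",
"metadata": {},
"source": [
"A mean absolute error of about 32,041 is a clear improvement over the naive forecast, whose MAE for the rail series was roughly 42,143 over the same period."
]
},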
{
"cell_type": "code",
"execution_count": 23,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAfoAAADgCAYAAADrL6QAAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAACqV0lEQVR4nOydd5gb1dX/P3fU6xb3umvjhhsYY2zTbDD1Teg4EHoPJKQBecH8SCAhEEgCSUiBECCEBEIPLRgwYNNs44Ibbrj3ul1dmrm/P2ZG2ySt5NXIid/9Ps8+uzuaGV2NZu73nnO+5xwhpaQLXehCF7rQhS4cmlAO9gC60IUudKELXeiCdegi+i50oQtd6EIXDmF0EX0XutCFLnShC4cwuoi+C13oQhe60IVDGF1E34UudKELXejCIYwuou9CF7rQhS504RCG/WAPoNgoLy+XQ4YMOdjD6MIhjHA4jM/nO9jD6MIhgK57qQvFwuLFi/dLKXtkeu2QI/pevXqxaNGigz2MLhzCmDNnDlOnTj3Yw+jCIYCue6kLxYIQYku217pc913oQhe60IUuHMLoIvoudKELXehCFw5hdBF9F7rQhS50oQuHMLqIvgtd6EIXutCFQxhdRN+FLnTh/zTmb9zPL99Zw+ItdQd7KF34D8biLXX8cfb6/8r75JBT3XehC/9nMW8ezJkDU6fC5MkHezT/8ZBS8ocP1/PwrK+QwJOfbuK56ycxvqriYA+tC/9hWLCphsueWEBK03DaFZ697r/rPuki+i504VDAvHk6wadS4HLBBx90kX0GLN5Sx/yNNfT0u3hx8TYWbm62zhIpjfkba/6rJvAuWAspJe+t2sOMV5eTUDUAkv+F90kX0XehC4cC3n4bEgn970RCt+y7iL4VFm+p45K/zCee0ifsoNvOjVMG8/TczcSSGhIY2Td4cAd5EGAufiYN7vZfRV5WYvGWOl5ZvJ3FW+tYu7uJfuVuIIkAHHaFSYO7HewhFoQuou9CF/4LsWBTDZ9vquXYw7rrk3Pb6mpTphycgf2HQkrJk59sTJO8AK46rppbTh3OqSN78+/lO3nqs80s3lzHScN7HtzBlhDm4iepajhsSjp08X+Z/BdsquHix+ejSf3/G6cM5rbThvONP89jS02Ex684+r/umnQRfRe68F+GxVvquOjx+UgJv1XW8ZtvHMHZe/eC0wnHHQezZ8OyZXDssQd7qP8R2FYb4c5/reCTdfsRQid5p11hyjCd0MdXVTC+qoJdDTH+Nm8zN0wZTNDtOLiDNrB4Sx1zN+xvXtAVGS8u2ppe/MRTGlc+tYAhPf3YP5/PMVuW82D1Edxw+6VMO7wnX2yt/z9B/o9/vDFN8jYBAbcDu01h6vCe/Ob9rxjay39wB3gA6CL6LhSGLsFXh7DaGpq/sQZpTESqJvne80sZ9fK/cYw4kn/d8Seukt+l7Ac/gAkT4Oij2x2/aHMtn2+qPeQn7MWba/nTnA18un4/dkXws3NGMbJPMOtn/85JQ5j55W7+Pm8L3znp4PfLWLyljosfn0dSlfzBvr44QsFPP9Wf32nT+Jd7AC8v3o4AhABFCMZXlWOfP58/PTcDm5YiOfd5LtU0vj1wFEkjRu1y/PeJ0fJFQzTJ5xtrUIwFYUs3/Yk164jNfZFNbygc8Y0zD+5AC0QX0f834D+FXD/7THcJS9kl+MoC0xWaSOnqXCtU3ObEY1qmV4wsp/qX6/nDgIv4zYcbeG7cdXy8bg2u6dPhiy+gQn9/KSUPzlzDYx9vRHBoT9iLt9Rx8V/mk1QlioDfXXwUZ4zuA8DR1ZWtdzaer9FTpzJ1eA+e/HQTVx9Xjdd5cKfHVxZvJ6nqK7qk2kkBWCgEd90FjzyClBJ59z0smnYDx1xwOTedNIQVOxqY3D/AUQs/IPHmL3GqSf04NcV3lR38of+x6bSy/0YxWk60mF//UFtGKKHyqwvHsqcx3rwgfOABjphxJ0cg0eY+B44X4bzz/nPm5g7QRfT/6Zg3D6ZNg3j84JPrM8+Aqup/dwm+MmLmil2tXKEPvrOaP3zzKHquXFK0CWHcgHIAJh/WjVtPG874lfNAanw+cDQAe5x+/vrdB7jxx1fDWWfB//wP8RNO5Cd7grywaBsAEoglNT5au/fQmbBbYP7GmjRJCmDDvnDmHf/yF7jxRtA0cLu549nXOGNtiuc+38p1Jwwu3YDbYEtNmDeX70z/b1NE4QKwefPgrbdg5054/XWoq0OiXw80lftmPYrc8gFi+SmcuG4dLFkC+/fjHDgQzW5HpFLYpMZJZx9P8OjDufDRuUj+O8VoWTFvHpx0EiSTaE4XS79xL9PPOoULxw/QX5cSfvUrmDFDv26ALZWC88+H4cNh40b93nE6C5ubi7xAWLylDpu/W+9sr3cR/X865szRSV7T9N8Hk1xrapr/FkK/SbuQxpc7GnjRIFLFmBUWbqrj5u/+iX/8807sqSS4XCgfdm6xFkvpi60pw3roJP3ox2gOB6uqDkcR+tz068YKTvnGlQx59gnkZ5+Bw8m6i+7jgrNO4d8rdhE3VOYvLNzGmWP6cHifQ0ttPqZfGdDe/ZqeYHv0gFdegXfeaT4oFmPEoo+YNPgs/vLJRi6fXIXLbiv52BsiSa55eiE2RfDzc0dz12tfcuWx1YUtyMx0SzMT48QTWTPtbKruvROHmiJps1N34830/eQD+NOf9H0UBR56CH7wA5TPP4fnn4e//hV+9CPGf/wxQ3r6SWqSh6YfcegsDl95RZ9XAWJRzlg3n7NO+47+fyIBN90ETz0FJ58M8+ahxhOkFAXnDdcjXnsNksnmffOdm+fN08+XSByQ8TZvw37+vXwXlT4nmoTFW2qZv7EWm6+iX7Zjuoj+Px1TpyKFgkBDahriyCMPzjhUVY/vTZ0KsRjMn6+vYv+vwyCOL4eO4+IVgjKPg/vOG83W2iiTBneju9/J0uvfwJGMI4BUPM6uf82kXyeIPprQid7jNEjoo49QJkzgyZumMn9jDSP7BnlszgZe/TTBbQgUJM5kgj/u+5g+02/lkolVzN9YQ4XXyW/f/4pz/vAZp4/qxVXHDTpkJvC9TfrkfcnEgZx/VH/9c5kTbCym7xQM6hP500/rk66qwgsv8P0XruGbL9fy3eeW8K0ph1l2TRZvqeOtDQkCg+rS77Hy5Xf49MlXqOgxnPvuuZojB5Rz12tfUuEt8FmbM6eZhGw2dk2ewnR1NEMvvo9JW1ewoGoMJ51/Lt/p2x1WrNANCSF00lMUnXgmT4bLL9c9itOmMexbv2GXt6w416OtRZtMwr//DZ98olvLxx3X+ffIB2vWACCFQEjJVQv+he3O2/Qx/fjHsGmT/vuee+Dzz1nx99f4WVMPHv7pTVRPnAhXXqlfL6czf8PHNN6kLNgz+u/lO7n5uSUYEh0UAWUeh/6/yH5cF9H/h2Nx3xGI3kMZtncTrlSSLb95DOXoE9jTGOOLrfWtREVrXn2XupmzqDjzVEacf3pxB/LRR7BnD+t+/AvmDDySq645A8eNN+qEbyu91fMfg
Xnz0KaehEjEGWazc9K3f8edt1xNnzJP8z6qim3jUgQYblPJp30P56JOvG00qRO922GDcBgWLYLbbkurxwGOO6w7D21fRXzu8zjUFAqSPu++CWedxfjrrmP86tUwdSo9zx/D9c8s4s3lu3hv1Z5DpjLcW8t30q/cw8/PHY0Qxgz48svNJC8E/PCH+gR++eX6ZOvzwW23Meamy/BM+RHvrdrDx+v2WaJjWLy5Nq0heH3jPL415TAGrPyCc390BcM1jStsdracOwb34NNx2hUaY8nC3mDqVJ2AVJWU3cH3d5fjGqLwZdUolvU/HIddYcbgbuCeqluViURmsjr6aL1Gw2mncc8DN/DeESfBOKVzXsWWCy4hoKwM6uvTL8uHH0acfjpceilUVsLy5dbEwDduhHfeQbvwQp6JVbKqvB/3u7fDY4/BH/+o7+N0wplnphc/rupRfPG7T1iyrY7q44/X9znvPLj11vzHN3WqPmemUgUtENbvDXH7K8sZt2M1k7au4POqMUy58hyOG9KdS5+Yr4t
"text/plain": [
"<Figure size 576x216 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code displays the SARIMA forecasts\n",
"fig, ax = plt.subplots(figsize=(8, 3))\n",
"rail_series.loc[time_period].plot(label=\"True\", ax=ax, marker=\".\", grid=True)\n",
"ax.plot(y_preds, color=\"r\", marker=\".\", label=\"SARIMA Forecasts\")\n",
"plt.legend()\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAA3kAAAFACAYAAAABGrWZAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAA/aUlEQVR4nO3dfZxcdX33//dnJyzZJMQNYFZyI1CJEWI1QEoulOpSij/A9gLUtnD1R1Gxqb9LuKzW/uSy/XnXq5Z6c7WKN5QWflIuhUsrClUsYmRLrYDcRSDEhYhAQmgCgSXZ7Cabnflcf5wzyczszOyZnTN7bub1fDzmsTtnzjnz/Z5z5nzmM9/v+R5zdwEAAAAA8qEn6QIAAAAAAOJDkgcAAAAAOUKSBwAAAAA5QpIHAAAAADlCkgcAAAAAOUKSBwAAAAA5QpIH5JyZDZqZm9mRba7nmHA9a+IqGwAgWWb2pJl9qIX5Y4kp3cTM3mlmozGsh22PyEjykGtmdqKZFc3s32ewbNcmNWY2ZGZfrJm8RdJRkjbMfokAoDuY2VfD2ONmtt/MnjCzz5rZ/DbX+3Eze6TOS78m6cvtrLvJe74tjMFfm+HyXZvUNEi+f6IgDu9MoEjIGJI85N0fKgherzWz45MuzEyZWW+daXPMzGarDO5edPf/cPfJ2XpPAOhSP1TwZf5XJP25pP8q6bMzXZmZHdLoNXd/zt3HZrruabxH0qclnWdmizr0HrOi3jasF5s7yd0nwjjss/m+yCaSPOSWmfVJ+i+S/l7SP0m6pOK1uq104bR3hE9/Gf69N5w+FM7TY2b/n5ltMbN9ZvawmZ1bs54lZvY1M9tpZmNmtsHMTq94/Y/MbLOZTYR//7BOOd5nZjeZ2R5Jnyr/Cht2+/iFpH2S5pvZy8zsajPbYWa7zexfm7U+mtkRZnaDmW01s3Ez22hm76p4/auS3izpfRW/Jh9Tb5uZ2ZvM7B4z22tm283sbyqDXtgi+GUz+5SZPR+W8bNmxrkHABrbF36Z3+LuX5f0NUnnSZKZ/d9mdm94vt9hZt80s6XlBStav84xs5+a2YSkP5L0MUmrKs7r7wznr2oxMrMPmtlDZrbHzJ4xs38ws/5WK2BmyySdriA5vVvS79e8PqWVrjLOmNkxku4IX3ounP7VcL5Dzexvw7iz18zuNrPTatb/GjO7xcxeMrNRM7vLzH41fK1pHK8ox4Vm9iMzG5f0Rxa0sn7XzD5sZlslbQ3nX2pmN5rZi+Hje2a2osm2eZWZ3Wxm/xFu5wfM7LcqXh+SdLSkz5T3V5Nt9raw/PvC+vyZ2cEfgMP9++dm9ndmtiuM/X867Q5E5vFFC3n2DklPuftDkq6X9AfW5NfMOk4J/56l4BfVt4XP3y/pTyV9WNKvSvq2pJvMbLUkWdCl5l8lHSPp/HCeT5ZXambnS/qipL+V9FpJn5f0ZTP77Zr3/5ikW8PlvxROO1ZB4vo7kl6vINH7nqSlkn5L0omS7pT0IzM7qkG95kp6IJx/Vfj+f2dmZ1TU7y5J/39Y76MUdNWsEn6p+L6kB8P3vUTShZL+qmbW35c0KekNki6V9MeSfq9B2QAAU41LKsevXgXx4fUKzuNHSrqhzjJ/raAV8DWSbpb0OUnDOnhe/98N3quk4Dy9SkG8OUXSlTMo87sk/cDddyqIwe9pcfktkt4e/r9KQZnfHz7/tII48m4F8edhSf9SjntmtkTSjyW5pDMlnaQgjhbC5ZvG8Qp/paA30AmSvhNOe7Ok1yn4bnCGmc1TkIzuDV87VdKzkn4YvlbPAgXx80wF+/Fb4fu/Jnz9bQoSyE/q4P6awsxOlvRNSTeF9bhc0n9XEGsrfSDcRicpOC4+bWanNigb8sLdefDI5UNBovWh8H+T9KSkt4fPj1Fw8l9Ts4xLesc08zwj6aM104Yk/a/w/z+UtFvSkQ3K9e+Srq2Z9lVJP64px5U183xc0n5JAxXTfkPSqKS+mnk3SPp/w/8Hw/XVLU84z42S/qGmPl+smadqe0j6S0mbJfVUzPNOBYnnvIr13FWzntsr34sHDx48eBx8hPHguxXPT5H0vKT/3WD+14Tn5mXh8/I5/+01831c0iN1ln+yHCsbrP+s8LzeU7P+ZjHFJD1REU8XSNoj6eSKeaasp06cqTfPfEkTkv6gYlpB0i8k/Y/w+V9KekpSb4PyTRfHy+X4kzr75jlJh1ZMe7ekxyVZTXl2Svrd8Pk7JY1Os9/vlvTnzfZL7fZQ0ML7ozr7eWvNem6omefxyvfikc8HLXnIJTM7TtIbJX1dkjw4q31Nrf+SWLvehZKWKEjUKv1YwS99UvCr4kPu/nyD1Rw/zfJl99VZdqu7b694frKkeQq6soyWHwpaCF/VoA6FsDvHQxZ0Jx1V8KvhKxuUt5HjFSRwpZp69Eo6rmLaQzXLbZO0uMX3AoBuclZ4Pt+roGfFnZIukyQzOyns6veUme3WwVhRew6vF0OmZWa/YWa3h936ditoJeqV9IoWVnOGpEWS/lmS3H1UQUtYWzE49CoFrZoH4qi7FxVsp8o4/GN3n6hdOGIcL6u3DR9x930Vz09W0Mtmd0UMfklB/RvF4flm9mkzezTs3jkqaY1mFofr1WNpWM8y4nAXmpN0AYAOeY+CX9KeruyaLklmtlxBd5QD08LprXTlrHfRc3lalMFQmi1ftqfOPLXTeiRtl/Trdebd1eC9PyTpTxR0V3lYQUvgp9T6Cd9Uvx6qmb6/zmv8wAQAjd0paZ2C8+c2d98vHbgc4DYFA7NcJGmHgu6a/6YgEatUL4Y0ZWZHK7gE4O8lfVRBa9RJCrqDtjLIyHsk9UvaUxODd5vZn3gw0MuUOKyDXVKbFjP8m6Y4vEHSBXXmfaHBe39WQQvphxS0qo1J+ke1to0l4jCaYAcjd8xsjqSLFfRLX13xeL2CX7PepaC7hVTdz311zarKvwCW+/DL3Xcp+AXstJp5T5P0aPj/A5JeZ42HfN40zfKteEDSgKSSu2+ueexosMxpkv7Z3a939w0Kuri8umaeCVXUu4FHJZ1q1YOonBYu+4tWKwIAOGAsPI8/VU7wQq9RkNR9xN3vdPefK/oPdFHO62sUJBofcPe73P0xBa1ekZnZ4QoGiblYU2PwPgXXy0szjMMKLhOYUEUcNbOCgmvhKuPwaVZn9MuIcbwVDyjovfJ8nTjcKMk7TdI/uvu3PBg3YKumtvpFjcP16rHV3Xe3Vg3kDUke8uitCoLg37v7I5UPBdeevVtBoLlb0ofNbJWZvUFTh6feoeBi9//LzAbM7GXh9M9I+lA46tarzeyTClrSPhe+/vVw2e+Y2a+b2bFm9p/t4Oian5F0kQWjZ64ws8sUDE7y6RnU9YcKumrcbGZnh+91qpl9wszqte5J0mMKLhY/LbzI+4sKuppUelLSKeEIY0da/dEwv6wg+H/ZzI43s7dKukLBtXydGo4bALrZ0wri16Vm9ivhefcvIi77pKSjw+6eR5rZoXXmeVzBd8M/DuPJhQoGYWnFRQquS/9anRh8kw522dysY
HCVj4ex9C0KBoqp9JSCVqe3mtnLzWyBu++R9BVJV1gwgujx4fMBHbzf35cVXAf4DTP7NTM7LozZq8PXp4vjrfiagh41N5vZm8Pt9iYz+5w1HmHzMUnnh/viVyX9LwWDolV6UtKvWzByZ6MfjT8n6c0WjL79ajP7fQU9dWbyfQI5Q5KHPLpE0h0ejOhV65sKhiX+TQXJniTdK+nvVBNcPLgf3H9TEJC2KRidTJK+oCBAfFrSIwpG0Hx72CqmMAC9WcGF3f8saaOkTyjsOuHu31FwbcUHFPwK935J/9Xd/7nViobXGp4j6UcKutcMS/qGpJVhmev5H5J+qmBkrzsVdD2pvVHtZxX8iviogl9bp1wn4O7PSDpbwbUPGyRdq6BLz0darQcAYHru/pyCFrLzFJyfPybpgxEX/5aCEZvXKzivX1hn/Q8piEkfDNf/HgVdCltxiaRvh9fJ1fqmgsTl1WEL5QUK7gX4MwVxsip+hHHmYwoGUtmu4EdJKRgV8xsKRoHeoHC0S3d/tmK5Nylolbx
"text/plain": [
"<Figure size 1080x360 with 2 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code shows how to plot the Autocorrelation Function (ACF) and the\n",
"# Partial Autocorrelation Function (PACF)\n",
"\n",
"from statsmodels.graphics.tsaplots import plot_acf, plot_pacf\n",
"\n",
"fig, axs = plt.subplots(nrows=1, ncols=2, figsize=(15, 5))\n",
"plot_acf(df[period][\"rail\"], ax=axs[0], lags=35)\n",
"axs[0].grid()\n",
"plot_pacf(df[period][\"rail\"], ax=axs[1], lags=35, method=\"ywm\")\n",
"axs[1].grid()\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"2022-02-17 19:19:46.679147: I tensorflow/core/platform/cpu_feature_guard.cc:151] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA\n",
"To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.\n"
]
},
{
"data": {
"text/plain": [
"[(<tf.Tensor: shape=(2, 3), dtype=int32, numpy=\n",
" array([[0, 1, 2],\n",
" [1, 2, 3]], dtype=int32)>,\n",
" <tf.Tensor: shape=(2,), dtype=int32, numpy=array([3, 4], dtype=int32)>),\n",
" (<tf.Tensor: shape=(1, 3), dtype=int32, numpy=array([[2, 3, 4]], dtype=int32)>,\n",
" <tf.Tensor: shape=(1,), dtype=int32, numpy=array([5], dtype=int32)>)]"
]
},
"execution_count": 25,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import tensorflow as tf\n",
"\n",
"my_series = [0, 1, 2, 3, 4, 5]\n",
"my_dataset = tf.keras.utils.timeseries_dataset_from_array(\n",
" my_series,\n",
" targets=my_series[3:], # the targets are 3 steps into the future\n",
" sequence_length=3,\n",
" batch_size=2\n",
")\n",
"list(my_dataset)"
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"0 1 2 3 \n",
"1 2 3 4 \n",
"2 3 4 5 \n",
"3 4 5 \n",
"4 5 \n",
"5 \n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"2022-02-17 19:19:46.784180: W tensorflow/core/framework/dataset.cc:744] Input of Window will not be optimized because the dataset does not implement the AsGraphDefInternal() method needed to apply optimizations.\n"
]
}
],
"source": [
"for window_dataset in tf.data.Dataset.range(6).window(4, shift=1):\n",
" for element in window_dataset:\n",
" print(f\"{element}\", end=\" \")\n",
" print()"
]
},
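{
"cell_type": "markdown",
"metadata": {},
"source": [
"Each item produced by the `window()` method is itself a small dataset (a *nested* dataset) containing up to 4 consecutive elements. To train a model we need regular tensors, not nested datasets, so we call `flat_map()` to batch each window dataset into a single tensor, dropping the shorter trailing windows with `drop_remainder=True`:"
]
},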
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[0 1 2 3]\n",
"[1 2 3 4]\n",
"[2 3 4 5]\n"
]
}
],
"source": [
"dataset = tf.data.Dataset.range(6).window(4, shift=1, drop_remainder=True)\n",
"dataset = dataset.flat_map(lambda window_dataset: window_dataset.batch(4))\n",
"for window_tensor in dataset:\n",
" print(f\"{window_tensor}\")"
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {},
"outputs": [],
"source": [
"def to_windows(dataset, length):\n",
" dataset = dataset.window(length, shift=1, drop_remainder=True)\n",
" return dataset.flat_map(lambda window_ds: window_ds.batch(length))"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[(<tf.Tensor: shape=(2, 3), dtype=int64, numpy=\n",
" array([[0, 1, 2],\n",
" [1, 2, 3]])>,\n",
" <tf.Tensor: shape=(2,), dtype=int64, numpy=array([3, 4])>),\n",
" (<tf.Tensor: shape=(1, 3), dtype=int64, numpy=array([[2, 3, 4]])>,\n",
" <tf.Tensor: shape=(1,), dtype=int64, numpy=array([5])>)]"
]
},
"execution_count": 29,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"dataset = to_windows(tf.data.Dataset.range(6), 4)\n",
"dataset = dataset.map(lambda window: (window[:-1], window[-1]))\n",
"list(dataset.batch(2))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Before we continue looking at the data, let's split the time series into three periods, for training, validation and testing. We won't look at the test data for now:"
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {},
"outputs": [],
"source": [
"rail_train = df[\"rail\"][\"2016-01\":\"2018-12\"] / 1e6\n",
"rail_valid = df[\"rail\"][\"2019-01\":\"2019-05\"] / 1e6\n",
"rail_test = df[\"rail\"][\"2019-06\":] / 1e6"
]
},
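{
"cell_type": "markdown",
"metadata": {},
"source": [
"We divide by one million to bring the values into roughly the 0–1 range, which works well with the default weight initialization and learning rate."
]
},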
{
"cell_type": "code",
"execution_count": 31,
"metadata": {},
"outputs": [],
"source": [
"seq_length = 56\n",
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"train_ds = tf.keras.utils.timeseries_dataset_from_array(\n",
" rail_train.to_numpy(),\n",
" targets=rail_train[seq_length:],\n",
" sequence_length=seq_length,\n",
" batch_size=32,\n",
" shuffle=True,\n",
" seed=42\n",
")\n",
"valid_ds = tf.keras.utils.timeseries_dataset_from_array(\n",
" rail_valid.to_numpy(),\n",
" targets=rail_valid[seq_length:],\n",
" sequence_length=seq_length,\n",
" batch_size=32\n",
")"
]
},
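{
"cell_type": "markdown",
"metadata": {},
"source": [
"Each window contains 56 consecutive days of data (8 weeks of daily values), and its target is the ridership on the day immediately following the window: that's why the targets start at index `seq_length`. A quick sanity check (extra code, not in the book):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# extra code (added sketch) - peek at the first batch: the windows have shape\n",
"# [batch_size, seq_length] and the targets have shape [batch_size]\n",
"for window_batch, target_batch in train_ds.take(1):\n",
"    print(window_batch.shape, target_batch.shape)"
]
},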
{
"cell_type": "code",
"execution_count": 32,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/500\n",
"33/33 [==============================] - 0s 5ms/step - loss: 0.0098 - mae: 0.1118 - val_loss: 0.0071 - val_mae: 0.0966\n",
"Epoch 2/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0070 - mae: 0.0883 - val_loss: 0.0052 - val_mae: 0.0768\n",
"Epoch 3/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0059 - mae: 0.0796 - val_loss: 0.0050 - val_mae: 0.0741\n",
"Epoch 4/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0055 - mae: 0.0761 - val_loss: 0.0049 - val_mae: 0.0732\n",
"Epoch 5/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0054 - mae: 0.0749 - val_loss: 0.0043 - val_mae: 0.0666\n",
"Epoch 6/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0051 - mae: 0.0724 - val_loss: 0.0041 - val_mae: 0.0638\n",
"Epoch 7/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0047 - mae: 0.0696 - val_loss: 0.0040 - val_mae: 0.0615\n",
"Epoch 8/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0051 - mae: 0.0735 - val_loss: 0.0038 - val_mae: 0.0599\n",
"Epoch 9/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0045 - mae: 0.0670 - val_loss: 0.0037 - val_mae: 0.0599\n",
"Epoch 10/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0046 - mae: 0.0677 - val_loss: 0.0041 - val_mae: 0.0658\n",
"Epoch 11/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0044 - mae: 0.0664 - val_loss: 0.0038 - val_mae: 0.0611\n",
"Epoch 12/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0042 - mae: 0.0634 - val_loss: 0.0034 - val_mae: 0.0551\n",
"Epoch 13/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0046 - mae: 0.0680 - val_loss: 0.0056 - val_mae: 0.0829\n",
"Epoch 14/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0044 - mae: 0.0671 - val_loss: 0.0039 - val_mae: 0.0637\n",
"Epoch 15/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0044 - mae: 0.0673 - val_loss: 0.0037 - val_mae: 0.0610\n",
"Epoch 16/500\n",
"33/33 [==============================] - 0s 4ms/step - loss: 0.0045 - mae: 0.0676 - val_loss: 0.0035 - val_mae: 0.0584\n",
"Epoch 17/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0044 - mae: 0.0662 - val_loss: 0.0033 - val_mae: 0.0544\n",
"Epoch 18/500\n",
"<<396 more lines>>\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0026 - mae: 0.0440 - val_loss: 0.0023 - val_mae: 0.0404\n",
"Epoch 217/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0029 - mae: 0.0500 - val_loss: 0.0028 - val_mae: 0.0526\n",
"Epoch 218/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0026 - mae: 0.0458 - val_loss: 0.0023 - val_mae: 0.0387\n",
"Epoch 219/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0027 - mae: 0.0454 - val_loss: 0.0023 - val_mae: 0.0396\n",
"Epoch 220/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0026 - mae: 0.0444 - val_loss: 0.0026 - val_mae: 0.0425\n",
"Epoch 221/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0026 - mae: 0.0452 - val_loss: 0.0023 - val_mae: 0.0387\n",
"Epoch 222/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0025 - mae: 0.0433 - val_loss: 0.0024 - val_mae: 0.0432\n",
"Epoch 223/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0026 - mae: 0.0441 - val_loss: 0.0029 - val_mae: 0.0489\n",
"Epoch 224/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0031 - mae: 0.0524 - val_loss: 0.0023 - val_mae: 0.0394\n",
"Epoch 225/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0025 - mae: 0.0424 - val_loss: 0.0023 - val_mae: 0.0386\n",
"Epoch 226/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0026 - mae: 0.0438 - val_loss: 0.0023 - val_mae: 0.0383\n",
"Epoch 227/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0027 - mae: 0.0463 - val_loss: 0.0023 - val_mae: 0.0405\n",
"Epoch 228/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0026 - mae: 0.0445 - val_loss: 0.0023 - val_mae: 0.0384\n",
"Epoch 229/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0025 - mae: 0.0430 - val_loss: 0.0023 - val_mae: 0.0382\n",
"Epoch 230/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0026 - mae: 0.0451 - val_loss: 0.0023 - val_mae: 0.0397\n",
"Epoch 231/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0025 - mae: 0.0434 - val_loss: 0.0023 - val_mae: 0.0401\n",
"Epoch 232/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0027 - mae: 0.0459 - val_loss: 0.0022 - val_mae: 0.0389\n",
"Epoch 233/500\n",
"33/33 [==============================] - 0s 3ms/step - loss: 0.0027 - mae: 0.0464 - val_loss: 0.0025 - val_mae: 0.0469\n"
]
}
],
"source": [
"tf.random.set_seed(42)\n",
"model = tf.keras.Sequential([\n",
" tf.keras.layers.Dense(1, input_shape=[seq_length])\n",
"])\n",
"early_stopping_cb = tf.keras.callbacks.EarlyStopping(\n",
" monitor=\"val_mae\", patience=50, restore_best_weights=True)\n",
"opt = tf.keras.optimizers.SGD(learning_rate=0.02, momentum=0.9)\n",
"model.compile(loss=tf.keras.losses.Huber(), optimizer=opt, metrics=[\"mae\"])\n",
"history = model.fit(train_ds, validation_data=valid_ds, epochs=500,\n",
" callbacks=[early_stopping_cb])"
]
},
{
"cell_type": "code",
"execution_count": 33,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"3/3 [==============================] - 0s 2ms/step - loss: 0.0022 - mae: 0.0379\n"
]
},
{
"data": {
"text/plain": [
"37866.38006567955"
]
},
"execution_count": 33,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extra code evaluates the model\n",
"valid_loss, valid_mae = model.evaluate(valid_ds)\n",
"valid_mae * 1e6"
]
},
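{
"cell_type": "markdown",
"metadata": {},
"source": [
"The linear model reaches a validation MAE of about 37,866: better than the naive forecast (MAE ≈ 42,143), though not as good as the SARIMA model (MAE ≈ 32,041)."
]
},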
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Using a Simple RNN"
]
},
{
"cell_type": "code",
"execution_count": 34,
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"model = tf.keras.Sequential([\n",
" tf.keras.layers.SimpleRNN(1, input_shape=[None, 1])\n",
"])"
]
},
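{
"cell_type": "markdown",
"metadata": {},
"source": [
"This model has a single recurrent neuron, so at each time step it must cram everything it needs to remember into one number, and its output goes through a `tanh` activation. With so little capacity we shouldn't expect much from it, and indeed its validation MAE (about 102,787, see below) is far worse than the linear model's."
]
},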
{
"cell_type": "code",
"execution_count": 35,
"metadata": {},
"outputs": [],
"source": [
"# extra code defines a utility function we'll reuse several time\n",
"\n",
"def fit_and_evaluate(model, train_set, valid_set, learning_rate, epochs=500):\n",
" early_stopping_cb = tf.keras.callbacks.EarlyStopping(\n",
" monitor=\"val_mae\", patience=50, restore_best_weights=True)\n",
" opt = tf.keras.optimizers.SGD(learning_rate=learning_rate, momentum=0.9)\n",
" model.compile(loss=tf.keras.losses.Huber(), optimizer=opt, metrics=[\"mae\"])\n",
" history = model.fit(train_set, validation_data=valid_set, epochs=epochs,\n",
" callbacks=[early_stopping_cb])\n",
" valid_loss, valid_mae = model.evaluate(valid_set)\n",
" return valid_mae * 1e6"
]
},
{
"cell_type": "code",
"execution_count": 36,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/500\n",
"33/33 [==============================] - 1s 11ms/step - loss: 0.0219 - mae: 0.1637 - val_loss: 0.0195 - val_mae: 0.1394\n",
"Epoch 2/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0170 - mae: 0.1553 - val_loss: 0.0179 - val_mae: 0.1482\n",
"Epoch 3/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0166 - mae: 0.1555 - val_loss: 0.0176 - val_mae: 0.1501\n",
"Epoch 4/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0164 - mae: 0.1558 - val_loss: 0.0173 - val_mae: 0.1534\n",
"Epoch 5/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0163 - mae: 0.1572 - val_loss: 0.0172 - val_mae: 0.1479\n",
"Epoch 6/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0162 - mae: 0.1555 - val_loss: 0.0170 - val_mae: 0.1496\n",
"Epoch 7/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0162 - mae: 0.1556 - val_loss: 0.0168 - val_mae: 0.1552\n",
"Epoch 8/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0161 - mae: 0.1580 - val_loss: 0.0169 - val_mae: 0.1448\n",
"Epoch 9/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0160 - mae: 0.1563 - val_loss: 0.0168 - val_mae: 0.1451\n",
"Epoch 10/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0159 - mae: 0.1562 - val_loss: 0.0167 - val_mae: 0.1454\n",
"Epoch 11/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0159 - mae: 0.1564 - val_loss: 0.0164 - val_mae: 0.1491\n",
"Epoch 12/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0158 - mae: 0.1559 - val_loss: 0.0165 - val_mae: 0.1445\n",
"Epoch 13/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0158 - mae: 0.1556 - val_loss: 0.0162 - val_mae: 0.1514\n",
"Epoch 14/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0157 - mae: 0.1564 - val_loss: 0.0162 - val_mae: 0.1533\n",
"Epoch 15/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0157 - mae: 0.1553 - val_loss: 0.0165 - val_mae: 0.1420\n",
"Epoch 16/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0158 - mae: 0.1562 - val_loss: 0.0164 - val_mae: 0.1425\n",
"Epoch 17/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0156 - mae: 0.1570 - val_loss: 0.0164 - val_mae: 0.1407\n",
"Epoch 18/500\n",
"<<687 more lines>>\n",
"Epoch 362/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0103 - mae: 0.1130 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 363/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0103 - mae: 0.1128 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 364/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0104 - mae: 0.1131 - val_loss: 0.0102 - val_mae: 0.1029\n",
"Epoch 365/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0104 - mae: 0.1133 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 366/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0104 - mae: 0.1128 - val_loss: 0.0103 - val_mae: 0.1028\n",
"Epoch 367/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0103 - mae: 0.1129 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 368/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0104 - mae: 0.1135 - val_loss: 0.0102 - val_mae: 0.1030\n",
"Epoch 369/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0103 - mae: 0.1129 - val_loss: 0.0103 - val_mae: 0.1028\n",
"Epoch 370/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0104 - mae: 0.1129 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 371/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0103 - mae: 0.1130 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 372/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0103 - mae: 0.1131 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 373/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0104 - mae: 0.1132 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 374/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0104 - mae: 0.1130 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 375/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0104 - mae: 0.1132 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 376/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0104 - mae: 0.1134 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 377/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0104 - mae: 0.1131 - val_loss: 0.0103 - val_mae: 0.1029\n",
"Epoch 378/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0103 - mae: 0.1128 - val_loss: 0.0103 - val_mae: 0.1029\n",
"3/3 [==============================] - 0s 3ms/step - loss: 0.0103 - mae: 0.1028\n"
]
},
{
"data": {
"text/plain": [
"102786.95076704025"
]
},
"execution_count": 36,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
2019-04-05 11:04:38 +02:00
"source": [
"fit_and_evaluate(model, train_ds, valid_ds, learning_rate=0.02)"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 37,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"univar_model = tf.keras.Sequential([\n",
" tf.keras.layers.SimpleRNN(32, input_shape=[None, 1]),\n",
" tf.keras.layers.Dense(1) # no activation function by default\n",
"])"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 38,
2019-04-05 11:04:38 +02:00
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/500\n",
"33/33 [==============================] - 1s 13ms/step - loss: 0.0489 - mae: 0.2061 - val_loss: 0.0060 - val_mae: 0.0854\n",
"Epoch 2/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0060 - mae: 0.0813 - val_loss: 0.0052 - val_mae: 0.0825\n",
"Epoch 3/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0042 - mae: 0.0647 - val_loss: 0.0041 - val_mae: 0.0656\n",
"Epoch 4/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0041 - mae: 0.0636 - val_loss: 0.0042 - val_mae: 0.0714\n",
"Epoch 5/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0039 - mae: 0.0595 - val_loss: 0.0023 - val_mae: 0.0387\n",
"Epoch 6/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0033 - mae: 0.0542 - val_loss: 0.0026 - val_mae: 0.0423\n",
"Epoch 7/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0032 - mae: 0.0502 - val_loss: 0.0021 - val_mae: 0.0354\n",
"Epoch 8/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0030 - mae: 0.0500 - val_loss: 0.0020 - val_mae: 0.0345\n",
"Epoch 9/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0033 - mae: 0.0539 - val_loss: 0.0050 - val_mae: 0.0825\n",
"Epoch 10/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0034 - mae: 0.0573 - val_loss: 0.0023 - val_mae: 0.0399\n",
"Epoch 11/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0030 - mae: 0.0493 - val_loss: 0.0022 - val_mae: 0.0377\n",
"Epoch 12/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0029 - mae: 0.0478 - val_loss: 0.0019 - val_mae: 0.0328\n",
"Epoch 13/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0028 - mae: 0.0460 - val_loss: 0.0024 - val_mae: 0.0404\n",
"Epoch 14/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0029 - mae: 0.0487 - val_loss: 0.0022 - val_mae: 0.0371\n",
"Epoch 15/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0029 - mae: 0.0469 - val_loss: 0.0019 - val_mae: 0.0306\n",
"Epoch 16/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0027 - mae: 0.0465 - val_loss: 0.0019 - val_mae: 0.0348\n",
"Epoch 17/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0029 - mae: 0.0485 - val_loss: 0.0024 - val_mae: 0.0426\n",
"Epoch 18/500\n",
"<<201 more lines>>\n",
"Epoch 119/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0024 - mae: 0.0428 - val_loss: 0.0020 - val_mae: 0.0334\n",
"Epoch 120/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0024 - mae: 0.0423 - val_loss: 0.0019 - val_mae: 0.0362\n",
"Epoch 121/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0408 - val_loss: 0.0019 - val_mae: 0.0356\n",
"Epoch 122/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0023 - mae: 0.0397 - val_loss: 0.0020 - val_mae: 0.0395\n",
"Epoch 123/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0024 - mae: 0.0429 - val_loss: 0.0017 - val_mae: 0.0297\n",
"Epoch 124/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0025 - mae: 0.0437 - val_loss: 0.0019 - val_mae: 0.0359\n",
"Epoch 125/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0024 - mae: 0.0430 - val_loss: 0.0017 - val_mae: 0.0305\n",
"Epoch 126/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0023 - mae: 0.0399 - val_loss: 0.0021 - val_mae: 0.0409\n",
"Epoch 127/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0023 - mae: 0.0411 - val_loss: 0.0018 - val_mae: 0.0314\n",
"Epoch 128/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0023 - mae: 0.0394 - val_loss: 0.0021 - val_mae: 0.0392\n",
"Epoch 129/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0023 - mae: 0.0416 - val_loss: 0.0017 - val_mae: 0.0329\n",
"Epoch 130/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0023 - mae: 0.0418 - val_loss: 0.0020 - val_mae: 0.0389\n",
"Epoch 131/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0398 - val_loss: 0.0017 - val_mae: 0.0297\n",
"Epoch 132/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0023 - mae: 0.0415 - val_loss: 0.0018 - val_mae: 0.0333\n",
"Epoch 133/500\n",
"33/33 [==============================] - 0s 12ms/step - loss: 0.0023 - mae: 0.0398 - val_loss: 0.0019 - val_mae: 0.0319\n",
"Epoch 134/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0023 - mae: 0.0401 - val_loss: 0.0019 - val_mae: 0.0333\n",
"Epoch 135/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0022 - mae: 0.0384 - val_loss: 0.0020 - val_mae: 0.0398\n",
"3/3 [==============================] - 0s 6ms/step - loss: 0.0018 - mae: 0.0290\n"
]
},
{
"data": {
"text/plain": [
"29014.97296988964"
]
},
"execution_count": 38,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
2019-04-05 11:04:38 +02:00
"source": [
"# extra code compiles, fits, and evaluates the model, like earlier\n",
"fit_and_evaluate(univar_model, train_ds, valid_ds, learning_rate=0.05)"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Deep RNNs"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 39,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"deep_model = tf.keras.Sequential([\n",
" tf.keras.layers.SimpleRNN(32, return_sequences=True, input_shape=[None, 1]),\n",
" tf.keras.layers.SimpleRNN(32, return_sequences=True),\n",
" tf.keras.layers.SimpleRNN(32),\n",
" tf.keras.layers.Dense(1)\n",
"])"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 40,
2019-04-05 11:04:38 +02:00
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/500\n",
"33/33 [==============================] - 2s 32ms/step - loss: 0.0393 - mae: 0.2109 - val_loss: 0.0085 - val_mae: 0.1110\n",
"Epoch 2/500\n",
"33/33 [==============================] - 1s 25ms/step - loss: 0.0068 - mae: 0.0858 - val_loss: 0.0032 - val_mae: 0.0629\n",
"Epoch 3/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0055 - mae: 0.0750 - val_loss: 0.0035 - val_mae: 0.0638\n",
"Epoch 4/500\n",
"33/33 [==============================] - 1s 27ms/step - loss: 0.0048 - mae: 0.0678 - val_loss: 0.0021 - val_mae: 0.0429\n",
"Epoch 5/500\n",
"33/33 [==============================] - 1s 27ms/step - loss: 0.0043 - mae: 0.0606 - val_loss: 0.0020 - val_mae: 0.0408\n",
"Epoch 6/500\n",
"33/33 [==============================] - 1s 27ms/step - loss: 0.0042 - mae: 0.0591 - val_loss: 0.0027 - val_mae: 0.0502\n",
"Epoch 7/500\n",
"33/33 [==============================] - 1s 25ms/step - loss: 0.0045 - mae: 0.0635 - val_loss: 0.0025 - val_mae: 0.0469\n",
"Epoch 8/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0042 - mae: 0.0592 - val_loss: 0.0027 - val_mae: 0.0498\n",
"Epoch 9/500\n",
"33/33 [==============================] - 1s 26ms/step - loss: 0.0039 - mae: 0.0555 - val_loss: 0.0034 - val_mae: 0.0619\n",
"Epoch 10/500\n",
"33/33 [==============================] - 1s 25ms/step - loss: 0.0041 - mae: 0.0590 - val_loss: 0.0022 - val_mae: 0.0400\n",
"Epoch 11/500\n",
"33/33 [==============================] - 1s 25ms/step - loss: 0.0037 - mae: 0.0526 - val_loss: 0.0022 - val_mae: 0.0408\n",
"Epoch 12/500\n",
"33/33 [==============================] - 1s 26ms/step - loss: 0.0037 - mae: 0.0543 - val_loss: 0.0019 - val_mae: 0.0349\n",
"Epoch 13/500\n",
"33/33 [==============================] - 1s 23ms/step - loss: 0.0034 - mae: 0.0493 - val_loss: 0.0019 - val_mae: 0.0334\n",
"Epoch 14/500\n",
"33/33 [==============================] - 1s 23ms/step - loss: 0.0035 - mae: 0.0505 - val_loss: 0.0020 - val_mae: 0.0341\n",
"Epoch 15/500\n",
"33/33 [==============================] - 1s 23ms/step - loss: 0.0034 - mae: 0.0494 - val_loss: 0.0020 - val_mae: 0.0360\n",
"Epoch 16/500\n",
"33/33 [==============================] - 1s 23ms/step - loss: 0.0033 - mae: 0.0496 - val_loss: 0.0027 - val_mae: 0.0474\n",
"Epoch 17/500\n",
"33/33 [==============================] - 1s 23ms/step - loss: 0.0037 - mae: 0.0559 - val_loss: 0.0020 - val_mae: 0.0332\n",
"Epoch 18/500\n",
"<<103 more lines>>\n",
"Epoch 70/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0026 - mae: 0.0422 - val_loss: 0.0022 - val_mae: 0.0363\n",
"Epoch 71/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0027 - mae: 0.0458 - val_loss: 0.0019 - val_mae: 0.0321\n",
"Epoch 72/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0025 - mae: 0.0413 - val_loss: 0.0020 - val_mae: 0.0335\n",
"Epoch 73/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0026 - mae: 0.0435 - val_loss: 0.0021 - val_mae: 0.0354\n",
"Epoch 74/500\n",
"33/33 [==============================] - 1s 25ms/step - loss: 0.0026 - mae: 0.0436 - val_loss: 0.0021 - val_mae: 0.0357\n",
"Epoch 75/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0026 - mae: 0.0432 - val_loss: 0.0021 - val_mae: 0.0347\n",
"Epoch 76/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0025 - mae: 0.0421 - val_loss: 0.0027 - val_mae: 0.0477\n",
"Epoch 77/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0027 - mae: 0.0444 - val_loss: 0.0019 - val_mae: 0.0320\n",
"Epoch 78/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0028 - mae: 0.0468 - val_loss: 0.0019 - val_mae: 0.0318\n",
"Epoch 79/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0027 - mae: 0.0466 - val_loss: 0.0021 - val_mae: 0.0366\n",
"Epoch 80/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0026 - mae: 0.0442 - val_loss: 0.0025 - val_mae: 0.0454\n",
"Epoch 81/500\n",
"33/33 [==============================] - 1s 25ms/step - loss: 0.0026 - mae: 0.0438 - val_loss: 0.0019 - val_mae: 0.0313\n",
"Epoch 82/500\n",
"33/33 [==============================] - 1s 26ms/step - loss: 0.0025 - mae: 0.0419 - val_loss: 0.0020 - val_mae: 0.0350\n",
"Epoch 83/500\n",
"33/33 [==============================] - 1s 27ms/step - loss: 0.0026 - mae: 0.0438 - val_loss: 0.0021 - val_mae: 0.0391\n",
"Epoch 84/500\n",
"33/33 [==============================] - 1s 27ms/step - loss: 0.0027 - mae: 0.0446 - val_loss: 0.0019 - val_mae: 0.0325\n",
"Epoch 85/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0027 - mae: 0.0456 - val_loss: 0.0019 - val_mae: 0.0318\n",
"Epoch 86/500\n",
"33/33 [==============================] - 1s 24ms/step - loss: 0.0025 - mae: 0.0419 - val_loss: 0.0021 - val_mae: 0.0372\n",
"3/3 [==============================] - 0s 9ms/step - loss: 0.0019 - mae: 0.0312\n"
]
},
{
"data": {
"text/plain": [
"31211.024150252342"
]
},
"execution_count": 40,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
2019-04-05 11:04:38 +02:00
"source": [
"# extra code compiles, fits, and evaluates the model, like earlier\n",
"fit_and_evaluate(deep_model, train_ds, valid_ds, learning_rate=0.01)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Multivariate time series"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 41,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
"df_mulvar = df[[\"bus\", \"rail\"]] / 1e6 # use both bus & rail series as input\n",
"df_mulvar[\"next_day_type\"] = df[\"day_type\"].shift(-1) # we know tomorrow's type\n",
"df_mulvar = pd.get_dummies(df_mulvar) # one-hot encode the day type"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 42,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
2019-04-05 11:04:38 +02:00
"source": [
"mulvar_train = df_mulvar[\"2016-01\":\"2018-12\"]\n",
"mulvar_valid = df_mulvar[\"2019-01\":\"2019-05\"]\n",
"mulvar_test = df_mulvar[\"2019-06\":]"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 43,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
2019-04-05 11:04:38 +02:00
"\n",
"train_mulvar_ds = tf.keras.utils.timeseries_dataset_from_array(\n",
" mulvar_train.to_numpy(), # use all 5 columns as input\n",
" targets=mulvar_train[\"rail\"][seq_length:], # forecast only the rail series\n",
" sequence_length=seq_length,\n",
" batch_size=32,\n",
" shuffle=True,\n",
" seed=42\n",
")\n",
"valid_mulvar_ds = tf.keras.utils.timeseries_dataset_from_array(\n",
" mulvar_valid.to_numpy(),\n",
" targets=mulvar_valid[\"rail\"][seq_length:],\n",
" sequence_length=seq_length,\n",
" batch_size=32\n",
")"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 44,
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"mulvar_model = tf.keras.Sequential([\n",
" tf.keras.layers.SimpleRNN(32, input_shape=[None, 5]),\n",
" tf.keras.layers.Dense(1)\n",
"])"
]
},
2019-04-05 11:04:38 +02:00
{
"cell_type": "code",
"execution_count": 45,
2019-04-05 11:04:38 +02:00
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/500\n",
"33/33 [==============================] - 1s 17ms/step - loss: 0.0386 - mae: 0.1872 - val_loss: 0.0011 - val_mae: 0.0346\n",
"Epoch 2/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0029 - mae: 0.0585 - val_loss: 0.0040 - val_mae: 0.0790\n",
"Epoch 3/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0018 - mae: 0.0435 - val_loss: 7.7056e-04 - val_mae: 0.0273\n",
"Epoch 4/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0017 - mae: 0.0407 - val_loss: 0.0010 - val_mae: 0.0362\n",
"Epoch 5/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0015 - mae: 0.0386 - val_loss: 8.1681e-04 - val_mae: 0.0306\n",
"Epoch 6/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0014 - mae: 0.0372 - val_loss: 0.0011 - val_mae: 0.0380\n",
"Epoch 7/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0014 - mae: 0.0366 - val_loss: 7.9942e-04 - val_mae: 0.0289\n",
"Epoch 8/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0013 - mae: 0.0344 - val_loss: 6.9211e-04 - val_mae: 0.0271\n",
"Epoch 9/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0015 - mae: 0.0374 - val_loss: 8.2185e-04 - val_mae: 0.0299\n",
"Epoch 10/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0014 - mae: 0.0363 - val_loss: 0.0017 - val_mae: 0.0494\n",
"Epoch 11/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0013 - mae: 0.0357 - val_loss: 0.0016 - val_mae: 0.0473\n",
"Epoch 12/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0013 - mae: 0.0337 - val_loss: 8.0260e-04 - val_mae: 0.0287\n",
"Epoch 13/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0013 - mae: 0.0349 - val_loss: 0.0011 - val_mae: 0.0389\n",
"Epoch 14/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0014 - mae: 0.0363 - val_loss: 6.3723e-04 - val_mae: 0.0245\n",
"Epoch 15/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0012 - mae: 0.0340 - val_loss: 6.2749e-04 - val_mae: 0.0255\n",
"Epoch 16/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0013 - mae: 0.0342 - val_loss: 0.0020 - val_mae: 0.0549\n",
"Epoch 17/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0012 - mae: 0.0332 - val_loss: 7.3463e-04 - val_mae: 0.0275\n",
"Epoch 18/500\n",
"<<181 more lines>>\n",
"Epoch 109/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0319 - val_loss: 6.3961e-04 - val_mae: 0.0244\n",
"Epoch 110/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0012 - mae: 0.0354 - val_loss: 0.0013 - val_mae: 0.0433\n",
"Epoch 111/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0010 - mae: 0.0307 - val_loss: 7.3263e-04 - val_mae: 0.0281\n",
"Epoch 112/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0014 - mae: 0.0377 - val_loss: 7.8642e-04 - val_mae: 0.0293\n",
"Epoch 113/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0012 - mae: 0.0340 - val_loss: 0.0013 - val_mae: 0.0415\n",
"Epoch 114/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0012 - mae: 0.0344 - val_loss: 0.0011 - val_mae: 0.0376\n",
"Epoch 115/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0314 - val_loss: 0.0010 - val_mae: 0.0344\n",
"Epoch 116/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0013 - mae: 0.0374 - val_loss: 7.2942e-04 - val_mae: 0.0264\n",
"Epoch 117/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0336 - val_loss: 0.0011 - val_mae: 0.0393\n",
"Epoch 118/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0014 - mae: 0.0392 - val_loss: 0.0015 - val_mae: 0.0455\n",
"Epoch 119/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0012 - mae: 0.0369 - val_loss: 0.0011 - val_mae: 0.0363\n",
"Epoch 120/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0012 - mae: 0.0348 - val_loss: 0.0011 - val_mae: 0.0372\n",
"Epoch 121/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0316 - val_loss: 0.0012 - val_mae: 0.0408\n",
"Epoch 122/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0330 - val_loss: 0.0022 - val_mae: 0.0583\n",
"Epoch 123/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0014 - mae: 0.0402 - val_loss: 0.0014 - val_mae: 0.0438\n",
"Epoch 124/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0014 - mae: 0.0392 - val_loss: 8.6813e-04 - val_mae: 0.0323\n",
"Epoch 125/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0319 - val_loss: 6.3585e-04 - val_mae: 0.0243\n",
"3/3 [==============================] - 0s 4ms/step - loss: 5.6491e-04 - mae: 0.0221\n"
]
},
{
"data": {
"text/plain": [
"22062.301635742188"
]
},
"execution_count": 45,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extra code compiles, fits, and evaluates the model, like earlier\n",
"fit_and_evaluate(mulvar_model, train_mulvar_ds, valid_mulvar_ds,\n",
" learning_rate=0.05)"
]
},
{
"cell_type": "code",
"execution_count": 46,
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/500\n",
"33/33 [==============================] - 1s 13ms/step - loss: 0.0398 - mae: 0.1953 - val_loss: 0.0073 - val_mae: 0.0998\n",
"Epoch 2/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0039 - mae: 0.0632 - val_loss: 0.0012 - val_mae: 0.0384\n",
"Epoch 3/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0027 - mae: 0.0509 - val_loss: 0.0010 - val_mae: 0.0362\n",
"Epoch 4/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0024 - mae: 0.0488 - val_loss: 0.0018 - val_mae: 0.0491\n",
"Epoch 5/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0023 - mae: 0.0473 - val_loss: 0.0012 - val_mae: 0.0372\n",
"Epoch 6/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0022 - mae: 0.0463 - val_loss: 0.0011 - val_mae: 0.0361\n",
"Epoch 7/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0019 - mae: 0.0442 - val_loss: 8.8553e-04 - val_mae: 0.0322\n",
"Epoch 8/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0018 - mae: 0.0427 - val_loss: 9.3772e-04 - val_mae: 0.0339\n",
"Epoch 9/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0017 - mae: 0.0411 - val_loss: 9.0027e-04 - val_mae: 0.0324\n",
"Epoch 10/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0019 - mae: 0.0440 - val_loss: 0.0014 - val_mae: 0.0427\n",
"Epoch 11/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0017 - mae: 0.0415 - val_loss: 0.0021 - val_mae: 0.0546\n",
"Epoch 12/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0017 - mae: 0.0412 - val_loss: 8.3458e-04 - val_mae: 0.0311\n",
"Epoch 13/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0016 - mae: 0.0399 - val_loss: 8.2083e-04 - val_mae: 0.0311\n",
"Epoch 14/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0015 - mae: 0.0391 - val_loss: 0.0010 - val_mae: 0.0358\n",
"Epoch 15/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0016 - mae: 0.0407 - val_loss: 0.0011 - val_mae: 0.0361\n",
"Epoch 16/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0014 - mae: 0.0378 - val_loss: 0.0012 - val_mae: 0.0380\n",
"Epoch 17/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0015 - mae: 0.0394 - val_loss: 9.6802e-04 - val_mae: 0.0346\n",
"Epoch 18/500\n",
"<<215 more lines>>\n",
"Epoch 126/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0317 - val_loss: 6.8940e-04 - val_mae: 0.0271\n",
"Epoch 127/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0328 - val_loss: 0.0013 - val_mae: 0.0412\n",
"Epoch 128/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0012 - mae: 0.0344 - val_loss: 7.6342e-04 - val_mae: 0.0292\n",
"Epoch 129/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0328 - val_loss: 8.3261e-04 - val_mae: 0.0311\n",
"Epoch 130/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0011 - mae: 0.0316 - val_loss: 6.7921e-04 - val_mae: 0.0263\n",
"Epoch 131/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0320 - val_loss: 7.7970e-04 - val_mae: 0.0297\n",
"Epoch 132/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0334 - val_loss: 7.4201e-04 - val_mae: 0.0286\n",
"Epoch 133/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0330 - val_loss: 9.3328e-04 - val_mae: 0.0339\n",
"Epoch 134/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0011 - mae: 0.0322 - val_loss: 6.9349e-04 - val_mae: 0.0267\n",
"Epoch 135/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0011 - mae: 0.0317 - val_loss: 6.6078e-04 - val_mae: 0.0261\n",
"Epoch 136/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0011 - mae: 0.0322 - val_loss: 9.1503e-04 - val_mae: 0.0322\n",
"Epoch 137/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0011 - mae: 0.0327 - val_loss: 6.7553e-04 - val_mae: 0.0261\n",
"Epoch 138/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0010 - mae: 0.0311 - val_loss: 7.1123e-04 - val_mae: 0.0276\n",
"Epoch 139/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0011 - mae: 0.0317 - val_loss: 6.7194e-04 - val_mae: 0.0260\n",
"Epoch 140/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0012 - mae: 0.0342 - val_loss: 0.0010 - val_mae: 0.0361\n",
"Epoch 141/500\n",
"33/33 [==============================] - 0s 13ms/step - loss: 0.0011 - mae: 0.0325 - val_loss: 7.6832e-04 - val_mae: 0.0293\n",
"Epoch 142/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0011 - mae: 0.0324 - val_loss: 6.7870e-04 - val_mae: 0.0264\n",
"3/3 [==============================] - 0s 5ms/step - loss: 6.5248e-04 - mae: 0.0259\n"
]
},
{
"data": {
"text/plain": [
"25850.363075733185"
]
},
"execution_count": 46,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extra code build and train a multitask RNN that forecasts both bus and rail\n",
"\n",
"tf.random.set_seed(42)\n",
"\n",
"seq_length = 56\n",
"train_multask_ds = tf.keras.utils.timeseries_dataset_from_array(\n",
" mulvar_train.to_numpy(),\n",
" targets=mulvar_train[[\"bus\", \"rail\"]][seq_length:], # 2 targets per day\n",
" sequence_length=seq_length,\n",
" batch_size=32,\n",
" shuffle=True,\n",
" seed=42\n",
")\n",
"valid_multask_ds = tf.keras.utils.timeseries_dataset_from_array(\n",
" mulvar_valid.to_numpy(),\n",
" targets=mulvar_valid[[\"bus\", \"rail\"]][seq_length:],\n",
" sequence_length=seq_length,\n",
" batch_size=32\n",
")\n",
"\n",
"tf.random.set_seed(42)\n",
"multask_model = tf.keras.Sequential([\n",
" tf.keras.layers.SimpleRNN(32, input_shape=[None, 5]),\n",
" tf.keras.layers.Dense(2)\n",
"])\n",
"\n",
"fit_and_evaluate(multask_model, train_multask_ds, valid_multask_ds,\n",
" learning_rate=0.02)"
]
},
{
"cell_type": "code",
"execution_count": 47,
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"data": {
"text/plain": [
"43441.63157894738"
]
},
"execution_count": 47,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# extra code evaluates the naive forecasts for bus\n",
"bus_naive = mulvar_valid[\"bus\"].shift(7)[seq_length:]\n",
"bus_target = mulvar_valid[\"bus\"][seq_length:]\n",
"(bus_target - bus_naive).abs().mean() * 1e6"
]
},
{
"cell_type": "code",
"execution_count": 48,
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"bus 26369\n",
"rail 25330\n"
]
}
],
"source": [
"# extra code evaluates the multitask RNN's forecasts both bus and rail\n",
"Y_preds_valid = multask_model.predict(valid_multask_ds)\n",
"for idx, name in enumerate([\"bus\", \"rail\"]):\n",
" mae = 1e6 * tf.keras.metrics.mean_absolute_error(\n",
" mulvar_valid[name][seq_length:], Y_preds_valid[:, idx])\n",
" print(name, int(mae))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Forecasting Several Steps Ahead"
]
},
{
"cell_type": "code",
"execution_count": 49,
"metadata": {
"tags": []
},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"X = rail_valid.to_numpy()[np.newaxis, :seq_length, np.newaxis]\n",
"for step_ahead in range(14):\n",
" y_pred_one = univar_model.predict(X)\n",
" X = np.concatenate([X, y_pred_one.reshape(1, 1, 1)], axis=1)"
]
},
{
"cell_type": "code",
"execution_count": 50,
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAjAAAADsCAYAAABqkpwSAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAB2LUlEQVR4nO2dd3gcxd2A37k76VROvVuyJUuy5YaxMbjQLDAdHJPQnFC/ACa0BAIxJSS0kNACoQQSWkwLxPSSmGqLauOCDe6WLMlWs2X1Xu5uvj92Tz7J6rqueZ/nnrubnZmdXZ12f/urQkqJQqFQKBQKhT9h8PYCFAqFQqFQKIaKEmAUCoVCoVD4HUqAUSgUCoVC4XcoAUahUCgUCoXfoQQYhUKhUCgUfocSYBQKhUKhUPgdgxJghBA3CiG2CiG2CCFeE0KECCFihRCfCiHy9fcYp/63CSEKhBA7hRCnOrXPEkJs1rc9LoQQertZCPEfvf07IUSG05hL9X3kCyEudeGxKxQKhUKh8FMGFGCEEKnAr4EjpZTTACOwGLgV+FxKOQH4XP+OEGKKvn0qcBrwlBDCqE/3NLAEmKC/TtPbLwdqpZTZwKPAA/pcscCdwBxgNnCns6CkUCgUCoVidDJYE5IJCBVCmIAwoBxYBLyob38ROFv/vAh4XUrZLqUsAgqA2UKIFCBSSrlaatnzXuoxxjHXm8ACXTtzKvCplLJGSlkLfMpBoUehUCgUCsUoZUABRkpZBjwM7AUqgHop5SdAkpSyQu9TASTqQ1KBEqcpSvW2VP1zz/ZuY6SUVqAeiOtnLoVCoVAoFKMY00AddJPNImA8UAe8IYS4qL8hvbTJftqHO8Z5jUvQTFOEhITMGjduXD/LUygUCv+kpaUFgLCwMC+vxHXY7XYMBhVPMtrZtWtXlZQyYShjBhRggJOAIinlAQAhxNvA0cB+IUSKlLJCNw9V6v1LgbFO49PQTE6l+uee7c5jSnUzVRRQo7fn9hiT13OBUspngGcAcnJy5M6dOwdxWAqFQuFf5ObmApCXl+fVdbiSvLy8ruNSjF6EEHuGOmYwYu9eYK4QIkz3S1kAbAfeBxxRQZcC7+mf3wcW65FF49GcddfqZqZGIcRcfZ5LeoxxzHUusFL3k/kYOEUIEaNrgk7R2xQKhUKhUIxiBtTASCm/E0K8CXwPWIGNaNoOC7BcCHE5mpBznt5/qxBiObBN73+tlNKmT3c1sAwIBVboL4DngZeFEAVompfF+lw1Qoh7gXV6v3uklDUjOmKFQqFQKBR+z2BMSEgp70QLZ3amHU0b01v/+4D7emlfD0zrpb0NXQDqZdsLwAuDWadCoVAoFIrRgfKcUigUCoVC4XcMSgOjUCgUCu/zz3/+09tLUCh8BiXAKBQKhZ+Qk5Pj7SUoFD6DMiEpFAqFn/DBBx/wwQcfeHsZCoVPoDQwCoVC4Sf89a9/BWDhwoVeXolC4X2UBkahUCgUCoXfoQQYhcKDbNhTy99XFbBhT623l6JQKBR+jRJgFAoPsa64hp8/s4aHP97Jhc+uUUKMQqEYGg8+CKtWdW9btUprH4UoAUahcCN2u2RtUQ13vreFy15YS4fNjgTarHa+KTjg7eUpFApvMRxh5Kij4PzzD45btUr7ftRR7lunD6OceH2YDXtqWVNYzdzMOGalx3h7OYpBIqXk+711/PfHCv63uYJ9DW2YTQZmjotmw55arDaJBNbsruG6EyQGQ29F1xWKQ3n55Ze9vQSFq3AII/ffD4cdBmvXwh13wF13aYKJzaa97Pbun6+9Fs4+G66+Gp5/HpYvhxNO8PbReAUlwPgoa4uquei5tXTa7JiMgjvPmsoJkxNJijBjMrpWcaYJSlXMzYxXgtIwkVLyY2k9H/5Yzv8276OsrpVgo4H5OQncNn0SCyYnYTGbuoTS/Q1tvLR6D/d8uI07F05Bq2+qUPTP2LFjvb0ExUhoaYHvvoOvvtJeTU1wxRXd+9x44+DmeuABWLBg1GpfQAkwPkVtcwdf7DrAyh2VfLx1Hx02OwCdNskd722B98BoECRHhpAaE0padCipMaGkOr3vb2jj+711zBkfS2aChaqmdqqa2qlu6qC6qZ2qpg6qm/X3pnbK61rZ19AOQLCxgNeWzFVCzCDZsKeW9zeVUd9qZcPeGkpqWgkyCo6bkMBNp0zkpClJRIYEdRszKz2GWekxSCkJMhp4/usikiJDuDo3y0tH4R08pV0MNC3mf/7zHwAuuOACL6/EiQcf1G6izlqAVatg3TpYutR76/IkfZ2DL7+EWbMOCizr10NnJwihaV0uvxzKyuDdd+HnP4f/+z8wGMBo1F69fd6wAW66CeLi4PPPITUVfv97uOYasFi8dgq8gRJgvIiUkm0VDazaUcnKHZVsKqnDLiEuPJh5mXF8s7sKu11iMhpYeloOYcEmSmtbKKttpayulTWF1exraMMuB79Pg4DYcDPxlmDiLWZiwoPZ39COBDpsdr7cdSAgLvTu5r8/lnP9axu7zv2MsdFcf+IETp2STFRYUP+DASEEvz9jMgca23ngox0kRJg5d1aam1ftG2zYU8vPn12D1WYn2GTg1StcJzTXt3SytbyezWX1fJlfxbe7q0CC2WTg1Sv9Xzh/+umnATcKMMMRRg4/HM49Fx57DC68EPLyNNPI8uXuWaMv4jAH/eMfYLXCa6/Bhx9qZh+AoCA48kj47W/huOPg6KMhJuagD8sf/gBPPw1XXtm/OWjVKrj1VnjnHa3fk09qc95yCzz0kPY3uuYaCA/3zHF7GSXAeAjHk+CMsdE0tVvJ21nJqh0H2NfQBsBhqVFcd+IETpyUyPTUKAwGMainx06bnX31bZTVtfLSt8Ws2LIPCQjgxEmJnD0zlXiLJrDEWcxEhwZ187nYsKeWC59bQ4fVjl3C1vJ6D5wN/6WxrZMnVxbw7FeFXcKLUcDJU5I4/8ihqfcNBsHD5x1OTXMHt7z1I3GWYE7ISXTDqn2LN9aX0GHVtIttnXZ+9fJ6jhofS2a8hcyEcDITtPee2que/w9VTe1sKatna3kDW8rq2VJeT0lNa1f/yBATUv8btVntPPjRDp74xUwSI0I8dqx+h+NGfO+9MHYsfP01PPGE5nNxzTVQXX3oq6VFG3vxxXDDDdDWpt3AR5NfRm4uXHCBJsg5mDULFi3SBJbZsyEsrPsYh/Di8GE54YTu33tj3bru26+7DqZOhTffhN27NQHmwQdHjSAjpBzC47sfkJOTI3fu3OntZXRjw55afvHsGtr1izaAxWziuAnxnDApkdycBJdcVB3CSKfVTtAQnmwdN4aCyibe2VjGc5ccyUlTkka8nkDCZpe8uaGEhz7eSVVTByfmJPDN7mqstqGd695obOtk8TNrKDzQzGtL5jJjbLRrF+9j3LR8E299X4ZAE+JmjI2mprmDvTUt2JzUifEWM5nx4WQmhBNsMvDa2r1YbRIhICYsmOrmjq6+6XFhTBsTxdTUSA5LjWLqmCiKqpq7hHMAKSHIZOCcI9JYcnwm4+P97+Kem5sLQF5en
usmbW/X/DK++ELTnnz9NXR0dO9jMGgag7i4Pl/1r79JVN5nWv+wMLjoIs3RdMaMfnefl5fXdVx+SUeH5lj73HMwcSLs2qWZdP70p/7HucP0tno13H03fPwxxMdr87S3wzHH+LyJTwixQUp55JDGKAHG/fx9VQEPf7yzSzNy/pFp3Hv2YQSbXB/FPhKbf7vVxqInv6GqqZ2PbzieOIvZ5evzR9YW1XDPh1vZUtbArPQY7lw4help0S71r6hsbOOcp7+lud3GW1cf7Zc318Hy06e+oaXDxk8OH9Pt3HVY7eytaaHwQBOFVc3a+4FmCquaqWnufkOdnBLBz2amMTU1kqljoogK7d1s5/w3igsP5pmvCnlzQymdNjunT0vmV/OzmJ4W7e5DdhlDEmD6ukF++612Q8vL04SW1au1m5wQMH065OZSs6uI2BXvU/qLXyLvuYcGcxhNHXaaO6w0tllparfSpL83tlkJ/foLrnjyNl6deQaXfv8BzJ1H1HffQGurZi655hpNO2E+9Jri1wJMVZV
"text/plain": [
"<Figure size 576x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"# extra code generates and saves Figure 1511\n",
"\n",
"# The forecasts start on 2019-02-26, as it is the 57th day of 2019, and they end\n",
"# on 2019-03-11. That's 14 days in total.\n",
"Y_pred = pd.Series(X[0, -14:, 0],\n",
" index=pd.date_range(\"2019-02-26\", \"2019-03-11\"))\n",
"\n",
"fig, ax = plt.subplots(figsize=(8, 3.5))\n",
"(rail_valid * 1e6)[\"2019-02-01\":\"2019-03-11\"].plot(\n",
" label=\"True\", marker=\".\", ax=ax)\n",
"(Y_pred * 1e6).plot(\n",
" label=\"Predictions\", grid=True, marker=\"x\", color=\"r\", ax=ax)\n",
"ax.vlines(\"2019-02-25\", 0, 1e6, color=\"k\", linestyle=\"--\", label=\"Today\")\n",
"ax.set_ylim([200_000, 800_000])\n",
"plt.legend(loc=\"center left\")\n",
"save_fig(\"forecast_ahead_plot\")\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's create an RNN that predicts all 14 next values at once:"
]
},
{
"cell_type": "code",
"execution_count": 51,
"metadata": {},
"outputs": [],
2019-04-05 11:04:38 +02:00
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
2019-04-05 11:04:38 +02:00
"\n",
"def split_inputs_and_targets(mulvar_series, ahead=14, target_col=1):\n",
" return mulvar_series[:, :-ahead], mulvar_series[:, -ahead:, target_col]\n",
2019-04-05 11:04:38 +02:00
"\n",
"ahead_train_ds = tf.keras.utils.timeseries_dataset_from_array(\n",
" mulvar_train.to_numpy(),\n",
" targets=None,\n",
" sequence_length=seq_length + 14,\n",
" batch_size=32,\n",
" shuffle=True,\n",
" seed=42\n",
").map(split_inputs_and_targets)\n",
"ahead_valid_ds = tf.keras.utils.timeseries_dataset_from_array(\n",
" mulvar_valid.to_numpy(),\n",
" targets=None,\n",
" sequence_length=seq_length + 14,\n",
" batch_size=32\n",
").map(split_inputs_and_targets)"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 52,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42)\n",
2019-04-05 11:04:38 +02:00
"\n",
"ahead_model = tf.keras.Sequential([\n",
" tf.keras.layers.SimpleRNN(32, input_shape=[None, 5]),\n",
" tf.keras.layers.Dense(14)\n",
"])"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 53,
2019-04-05 11:04:38 +02:00
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/500\n",
"33/33 [==============================] - 1s 12ms/step - loss: 0.1250 - mae: 0.3791 - val_loss: 0.0287 - val_mae: 0.1935\n",
"Epoch 2/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0191 - mae: 0.1613 - val_loss: 0.0136 - val_mae: 0.1289\n",
"Epoch 3/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0131 - mae: 0.1303 - val_loss: 0.0102 - val_mae: 0.1113\n",
"Epoch 4/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0108 - mae: 0.1164 - val_loss: 0.0083 - val_mae: 0.1009\n",
"Epoch 5/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0093 - mae: 0.1068 - val_loss: 0.0071 - val_mae: 0.0931\n",
"Epoch 6/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0083 - mae: 0.0996 - val_loss: 0.0061 - val_mae: 0.0862\n",
"Epoch 7/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0076 - mae: 0.0941 - val_loss: 0.0055 - val_mae: 0.0811\n",
"Epoch 8/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0072 - mae: 0.0900 - val_loss: 0.0050 - val_mae: 0.0779\n",
"Epoch 9/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0068 - mae: 0.0869 - val_loss: 0.0046 - val_mae: 0.0751\n",
"Epoch 10/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0066 - mae: 0.0844 - val_loss: 0.0045 - val_mae: 0.0737\n",
"Epoch 11/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0063 - mae: 0.0822 - val_loss: 0.0041 - val_mae: 0.0709\n",
"Epoch 12/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0061 - mae: 0.0804 - val_loss: 0.0039 - val_mae: 0.0688\n",
"Epoch 13/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0060 - mae: 0.0796 - val_loss: 0.0039 - val_mae: 0.0690\n",
"Epoch 14/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0059 - mae: 0.0777 - val_loss: 0.0036 - val_mae: 0.0656\n",
"Epoch 15/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0058 - mae: 0.0766 - val_loss: 0.0035 - val_mae: 0.0649\n",
"Epoch 16/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0056 - mae: 0.0755 - val_loss: 0.0034 - val_mae: 0.0638\n",
"Epoch 17/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0055 - mae: 0.0744 - val_loss: 0.0033 - val_mae: 0.0633\n",
"Epoch 18/500\n",
"<<303 more lines>>\n",
"Epoch 170/500\n",
"33/33 [==============================] - 0s 7ms/step - loss: 0.0032 - mae: 0.0474 - val_loss: 0.0014 - val_mae: 0.0359\n",
"Epoch 171/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0032 - mae: 0.0477 - val_loss: 0.0014 - val_mae: 0.0359\n",
"Epoch 172/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0032 - mae: 0.0479 - val_loss: 0.0014 - val_mae: 0.0353\n",
"Epoch 173/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0032 - mae: 0.0480 - val_loss: 0.0014 - val_mae: 0.0359\n",
"Epoch 174/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0032 - mae: 0.0481 - val_loss: 0.0015 - val_mae: 0.0365\n",
"Epoch 175/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0032 - mae: 0.0476 - val_loss: 0.0014 - val_mae: 0.0358\n",
"Epoch 176/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0032 - mae: 0.0474 - val_loss: 0.0014 - val_mae: 0.0355\n",
"Epoch 177/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0032 - mae: 0.0480 - val_loss: 0.0014 - val_mae: 0.0362\n",
"Epoch 178/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0032 - mae: 0.0476 - val_loss: 0.0014 - val_mae: 0.0353\n",
"Epoch 179/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0032 - mae: 0.0481 - val_loss: 0.0014 - val_mae: 0.0357\n",
"Epoch 180/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0032 - mae: 0.0476 - val_loss: 0.0014 - val_mae: 0.0352\n",
"Epoch 181/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0032 - mae: 0.0475 - val_loss: 0.0014 - val_mae: 0.0358\n",
"Epoch 182/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0032 - mae: 0.0474 - val_loss: 0.0014 - val_mae: 0.0357\n",
"Epoch 183/500\n",
"33/33 [==============================] - 0s 8ms/step - loss: 0.0032 - mae: 0.0477 - val_loss: 0.0014 - val_mae: 0.0358\n",
"Epoch 184/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0032 - mae: 0.0479 - val_loss: 0.0014 - val_mae: 0.0353\n",
"Epoch 185/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0032 - mae: 0.0473 - val_loss: 0.0015 - val_mae: 0.0368\n",
"Epoch 186/500\n",
"33/33 [==============================] - 0s 9ms/step - loss: 0.0032 - mae: 0.0475 - val_loss: 0.0014 - val_mae: 0.0356\n",
"3/3 [==============================] - 0s 3ms/step - loss: 0.0014 - mae: 0.0350\n"
]
},
{
"data": {
"text/plain": [
"35017.29667186737"
]
},
"execution_count": 53,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
2019-04-05 11:04:38 +02:00
"source": [
"# extra code compiles, fits, and evaluates the model, like earlier\n",
"fit_and_evaluate(ahead_model, ahead_train_ds, ahead_valid_ds,\n",
" learning_rate=0.02)"
]
},
{
"cell_type": "code",
"execution_count": 54,
"metadata": {},
"outputs": [],
"source": [
"X = mulvar_valid.to_numpy()[np.newaxis, :seq_length] # shape [1, 56, 5]\n",
"Y_pred = ahead_model.predict(X) # shape [1, 14]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's create an RNN that predicts the next 14 steps at each time step. That is, instead of just forecasting time steps 56 to 69 based on time steps 0 to 55, it will forecast time steps 1 to 14 at time step 0, then time steps 2 to 15 at time step 1, and so on, and finally it will forecast time steps 56 to 69 at the last time step. Notice that the model is causal: when it makes predictions at any time step, it can only see past time steps."
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To prepare the datasets, we can use `to_windows()` twice, to get sequences of consecutive windows, like this:"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 55,
2019-04-05 11:04:38 +02:00
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"data": {
"text/plain": [
"[<tf.Tensor: shape=(4, 3), dtype=int64, numpy=\n",
" array([[0, 1, 2],\n",
" [1, 2, 3],\n",
" [2, 3, 4],\n",
" [3, 4, 5]])>,\n",
" <tf.Tensor: shape=(4, 3), dtype=int64, numpy=\n",
" array([[1, 2, 3],\n",
" [2, 3, 4],\n",
" [3, 4, 5],\n",
" [4, 5, 6]])>]"
]
},
"execution_count": 55,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
2019-04-05 11:04:38 +02:00
"source": [
"my_series = tf.data.Dataset.range(7)\n",
"dataset = to_windows(to_windows(my_series, 3), 4)\n",
"list(dataset)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Then we can split these elements into the desired inputs and targets:"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 56,
2019-04-05 11:04:38 +02:00
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"data": {
"text/plain": [
"[(<tf.Tensor: shape=(4,), dtype=int64, numpy=array([0, 1, 2, 3])>,\n",
" <tf.Tensor: shape=(4, 2), dtype=int64, numpy=\n",
" array([[1, 2],\n",
" [2, 3],\n",
" [3, 4],\n",
" [4, 5]])>),\n",
" (<tf.Tensor: shape=(4,), dtype=int64, numpy=array([1, 2, 3, 4])>,\n",
" <tf.Tensor: shape=(4, 2), dtype=int64, numpy=\n",
" array([[2, 3],\n",
" [3, 4],\n",
" [4, 5],\n",
" [5, 6]])>)]"
]
},
"execution_count": 56,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
2019-04-05 11:04:38 +02:00
"source": [
"dataset = dataset.map(lambda S: (S[:, 0], S[:, 1:]))\n",
"list(dataset)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's wrap this idea into a utility function. It will also take care of shuffling (optional) and batching:"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 57,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
"def to_seq2seq_dataset(series, seq_length=56, ahead=14, target_col=1,\n",
" batch_size=32, shuffle=False, seed=None):\n",
" ds = to_windows(tf.data.Dataset.from_tensor_slices(series), ahead + 1)\n",
" ds = to_windows(ds, seq_length).map(lambda S: (S[:, 0], S[:, 1:, 1]))\n",
" if shuffle:\n",
" ds = ds.shuffle(8 * batch_size, seed=seed)\n",
" return ds.batch(batch_size)"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 58,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
"seq2seq_train = to_seq2seq_dataset(mulvar_train, shuffle=True, seed=42)\n",
"seq2seq_valid = to_seq2seq_dataset(mulvar_valid)"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 59,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"seq2seq_model = tf.keras.Sequential([\n",
" tf.keras.layers.SimpleRNN(32, return_sequences=True, input_shape=[None, 5]),\n",
" tf.keras.layers.Dense(14)\n",
" # equivalent: tf.keras.layers.TimeDistributed(tf.keras.layers.Dense(14))\n",
" # also equivalent: tf.keras.layers.Conv1D(14, kernel_size=1)\n",
"])"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 60,
2019-04-05 11:04:38 +02:00
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/500\n",
"33/33 [==============================] - 1s 17ms/step - loss: 0.0754 - mae: 0.2785 - val_loss: 0.0163 - val_mae: 0.1379\n",
"Epoch 2/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0097 - mae: 0.1050 - val_loss: 0.0071 - val_mae: 0.0853\n",
"Epoch 3/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0069 - mae: 0.0846 - val_loss: 0.0063 - val_mae: 0.0790\n",
"Epoch 4/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0060 - mae: 0.0773 - val_loss: 0.0056 - val_mae: 0.0729\n",
"Epoch 5/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0055 - mae: 0.0722 - val_loss: 0.0049 - val_mae: 0.0662\n",
"Epoch 6/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0052 - mae: 0.0690 - val_loss: 0.0051 - val_mae: 0.0683\n",
"Epoch 7/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0049 - mae: 0.0663 - val_loss: 0.0046 - val_mae: 0.0626\n",
"Epoch 8/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0047 - mae: 0.0640 - val_loss: 0.0043 - val_mae: 0.0589\n",
"Epoch 9/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0046 - mae: 0.0627 - val_loss: 0.0041 - val_mae: 0.0560\n",
"Epoch 10/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0045 - mae: 0.0616 - val_loss: 0.0043 - val_mae: 0.0589\n",
"Epoch 11/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0044 - mae: 0.0608 - val_loss: 0.0042 - val_mae: 0.0580\n",
"Epoch 12/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0043 - mae: 0.0594 - val_loss: 0.0040 - val_mae: 0.0554\n",
"Epoch 13/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0042 - mae: 0.0584 - val_loss: 0.0041 - val_mae: 0.0572\n",
"Epoch 14/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0042 - mae: 0.0577 - val_loss: 0.0042 - val_mae: 0.0580\n",
"Epoch 15/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0042 - mae: 0.0579 - val_loss: 0.0038 - val_mae: 0.0530\n",
"Epoch 16/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0041 - mae: 0.0573 - val_loss: 0.0039 - val_mae: 0.0534\n",
"Epoch 17/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0041 - mae: 0.0566 - val_loss: 0.0038 - val_mae: 0.0530\n",
"Epoch 18/500\n",
"<<219 more lines>>\n",
"Epoch 128/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0484 - val_loss: 0.0036 - val_mae: 0.0470\n",
"Epoch 129/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0489 - val_loss: 0.0036 - val_mae: 0.0472\n",
"Epoch 130/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0032 - mae: 0.0476 - val_loss: 0.0036 - val_mae: 0.0473\n",
"Epoch 131/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0032 - mae: 0.0483 - val_loss: 0.0036 - val_mae: 0.0479\n",
"Epoch 132/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0492 - val_loss: 0.0037 - val_mae: 0.0489\n",
"Epoch 133/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0499 - val_loss: 0.0036 - val_mae: 0.0480\n",
"Epoch 134/500\n",
"33/33 [==============================] - 0s 11ms/step - loss: 0.0033 - mae: 0.0486 - val_loss: 0.0035 - val_mae: 0.0469\n",
"Epoch 135/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0486 - val_loss: 0.0035 - val_mae: 0.0468\n",
"Epoch 136/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0491 - val_loss: 0.0035 - val_mae: 0.0467\n",
"Epoch 137/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0493 - val_loss: 0.0035 - val_mae: 0.0471\n",
"Epoch 138/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0486 - val_loss: 0.0036 - val_mae: 0.0476\n",
"Epoch 139/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0487 - val_loss: 0.0035 - val_mae: 0.0470\n",
"Epoch 140/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0492 - val_loss: 0.0035 - val_mae: 0.0467\n",
"Epoch 141/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0488 - val_loss: 0.0035 - val_mae: 0.0471\n",
"Epoch 142/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0493 - val_loss: 0.0035 - val_mae: 0.0468\n",
"Epoch 143/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0494 - val_loss: 0.0035 - val_mae: 0.0473\n",
"Epoch 144/500\n",
"33/33 [==============================] - 0s 10ms/step - loss: 0.0033 - mae: 0.0486 - val_loss: 0.0035 - val_mae: 0.0469\n",
"3/3 [==============================] - 0s 13ms/step - loss: 0.0034 - mae: 0.0459\n"
]
},
{
"data": {
"text/plain": [
"45928.88057231903"
]
},
"execution_count": 60,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
2019-04-05 11:04:38 +02:00
"source": [
"fit_and_evaluate(seq2seq_model, seq2seq_train, seq2seq_valid,\n",
" learning_rate=0.1)"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 61,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
"X = mulvar_valid.to_numpy()[np.newaxis, :seq_length]\n",
"y_pred_14 = seq2seq_model.predict(X)[0, -1] # only the last time step's output"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 62,
2019-04-05 11:04:38 +02:00
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"MAE for +1: 25,519\n",
"MAE for +2: 26,274\n",
"MAE for +3: 27,054\n",
"MAE for +4: 29,324\n",
"MAE for +5: 28,992\n",
"MAE for +6: 31,739\n",
"MAE for +7: 32,847\n",
"MAE for +8: 33,282\n",
"MAE for +9: 33,072\n",
"MAE for +10: 29,752\n",
"MAE for +11: 37,468\n",
"MAE for +12: 35,125\n",
"MAE for +13: 34,614\n",
"MAE for +14: 34,322\n"
]
}
],
2019-04-05 11:04:38 +02:00
"source": [
"Y_pred_valid = seq2seq_model.predict(seq2seq_valid)\n",
"for ahead in range(14):\n",
" preds = pd.Series(Y_pred_valid[:-1, -1, ahead],\n",
" index=mulvar_valid.index[56 + ahead : -14 + ahead])\n",
" mae = (preds - mulvar_valid[\"rail\"]).abs().mean() * 1e6\n",
" print(f\"MAE for +{ahead + 1}: {mae:,.0f}\")"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "markdown",
2019-04-05 11:04:38 +02:00
"metadata": {},
"source": [
"# Deep RNNs with Layer Norm"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 63,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
2021-10-17 04:04:08 +02:00
"class LNSimpleRNNCell(tf.keras.layers.Layer):\n",
2019-04-05 11:04:38 +02:00
" def __init__(self, units, activation=\"tanh\", **kwargs):\n",
" super().__init__(**kwargs)\n",
" self.state_size = units\n",
" self.output_size = units\n",
2021-10-17 04:04:08 +02:00
" self.simple_rnn_cell = tf.keras.layers.SimpleRNNCell(units,\n",
" activation=None)\n",
" self.layer_norm = tf.keras.layers.LayerNormalization()\n",
2021-10-17 04:04:08 +02:00
" self.activation = tf.keras.activations.get(activation)\n",
"\n",
2019-04-05 11:04:38 +02:00
" def call(self, inputs, states):\n",
" outputs, new_states = self.simple_rnn_cell(inputs, states)\n",
" norm_outputs = self.activation(self.layer_norm(outputs))\n",
" return norm_outputs, [norm_outputs]"
]
},
{
"cell_type": "code",
"execution_count": 64,
2019-04-05 11:04:38 +02:00
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"custom_ln_model = tf.keras.Sequential([\n",
" tf.keras.layers.RNN(LNSimpleRNNCell(32), return_sequences=True,\n",
" input_shape=[None, 5]),\n",
" tf.keras.layers.Dense(14)\n",
"])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Just training for 5 epochs to show that it works (you can increase this if you want):"
]
},
{
"cell_type": "code",
"execution_count": 65,
"metadata": {},
2022-02-19 10:24:54 +01:00
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/5\n",
"33/33 [==============================] - 2s 25ms/step - loss: 0.0809 - mae: 0.2898 - val_loss: 0.0178 - val_mae: 0.1511\n",
"Epoch 2/5\n",
"33/33 [==============================] - 1s 18ms/step - loss: 0.0149 - mae: 0.1438 - val_loss: 0.0156 - val_mae: 0.1245\n",
"Epoch 3/5\n",
"33/33 [==============================] - 1s 18ms/step - loss: 0.0120 - mae: 0.1281 - val_loss: 0.0131 - val_mae: 0.1160\n",
"Epoch 4/5\n",
"33/33 [==============================] - 1s 17ms/step - loss: 0.0105 - mae: 0.1167 - val_loss: 0.0118 - val_mae: 0.1095\n",
"Epoch 5/5\n",
"33/33 [==============================] - 1s 17ms/step - loss: 0.0093 - mae: 0.1067 - val_loss: 0.0105 - val_mae: 0.1038\n",
"3/3 [==============================] - 0s 14ms/step - loss: 0.0105 - mae: 0.1038\n"
]
},
{
"data": {
"text/plain": [
"103751.08569860458"
]
},
"execution_count": 65,
2022-02-19 10:24:54 +01:00
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"fit_and_evaluate(custom_ln_model, seq2seq_train, seq2seq_valid,\n",
" learning_rate=0.1, epochs=5)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Extra Material Creating a Custom RNN Class"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The RNN class is not magical. In fact, it's not too hard to implement your own RNN class:"
]
},
{
"cell_type": "code",
"execution_count": 66,
"metadata": {},
"outputs": [],
"source": [
2021-10-17 04:04:08 +02:00
"class MyRNN(tf.keras.layers.Layer):\n",
" def __init__(self, cell, return_sequences=False, **kwargs):\n",
" super().__init__(**kwargs)\n",
" self.cell = cell\n",
" self.return_sequences = return_sequences\n",
"\n",
" def get_initial_state(self, inputs):\n",
" try:\n",
" return self.cell.get_initial_state(inputs)\n",
" except AttributeError:\n",
" # fallback to zeros if self.cell has no get_initial_state() method\n",
" batch_size = tf.shape(inputs)[0]\n",
" return [tf.zeros([batch_size, self.cell.state_size],\n",
" dtype=inputs.dtype)]\n",
"\n",
" @tf.function\n",
" def call(self, inputs):\n",
" states = self.get_initial_state(inputs)\n",
2021-08-19 02:30:12 +02:00
" shape = tf.shape(inputs)\n",
" batch_size = shape[0]\n",
" n_steps = shape[1]\n",
" sequences = tf.TensorArray(\n",
" inputs.dtype, size=(n_steps if self.return_sequences else 0))\n",
" outputs = tf.zeros(shape=[batch_size, self.cell.output_size],\n",
" dtype=inputs.dtype)\n",
" for step in tf.range(n_steps):\n",
" outputs, states = self.cell(inputs[:, step], states)\n",
" if self.return_sequences:\n",
" sequences = sequences.write(step, outputs)\n",
"\n",
" if self.return_sequences:\n",
" # stack the outputs into an array of shape\n",
" # [time steps, batch size, dims], then transpose it to shape\n",
" # [batch size, time steps, dims]\n",
2021-08-19 02:30:12 +02:00
" return tf.transpose(sequences.stack(), [1, 0, 2])\n",
" else:\n",
" return outputs"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note that `@tf.function` requires the `outputs` variable to be created before the `for` loop, which is why we initialize its value to a zero tensor, even though we don't use that value at all. Once the function is converted to a graph, this unused value will be pruned from the graph, so it doesn't impact performance. Similarly, `@tf.function` requires the `sequences` variable to be created before the `if` statement where it is used, even if `self.return_sequences` is `False`, so we create a `TensorArray` of size 0 in this case."
]
},
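{
"cell_type": "markdown",
"metadata": {},
"source": [
"To see this constraint in isolation, here is a tiny sketch (an editor's addition, not in the original notebook): any tensor that is used after a loop inside a `@tf.function` must be created before the loop starts."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# editor's sketch (not in the original notebook): `total` is used after the\n",
"# loop, so it must be created before tf.range() starts iterating\n",
"@tf.function\n",
"def sum_over_time(x):\n",
"    total = tf.zeros_like(x[0])\n",
"    for step in tf.range(tf.shape(x)[0]):\n",
"        total += x[step]\n",
"    return total\n",
"\n",
"sum_over_time(tf.constant([1., 2., 3.]))  # returns 6.0"
]
},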
{
"cell_type": "code",
"execution_count": 67,
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42)\n",
"\n",
"custom_model = tf.keras.Sequential([\n",
" MyRNN(LNSimpleRNNCell(32), return_sequences=True, input_shape=[None, 5]),\n",
" tf.keras.layers.Dense(14)\n",
"])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Just training for 5 epochs to show that it works (you can increase this if you want):"
]
},
{
"cell_type": "code",
"execution_count": 68,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/5\n",
"33/33 [==============================] - 2s 26ms/step - loss: 0.0814 - mae: 0.2916 - val_loss: 0.0176 - val_mae: 0.1544\n",
"Epoch 2/5\n",
"33/33 [==============================] - 1s 20ms/step - loss: 0.0151 - mae: 0.1440 - val_loss: 0.0157 - val_mae: 0.1247\n",
"Epoch 3/5\n",
"33/33 [==============================] - 1s 19ms/step - loss: 0.0119 - mae: 0.1281 - val_loss: 0.0134 - val_mae: 0.1160\n",
"Epoch 4/5\n",
"33/33 [==============================] - 1s 18ms/step - loss: 0.0105 - mae: 0.1162 - val_loss: 0.0111 - val_mae: 0.1084\n",
"Epoch 5/5\n",
"33/33 [==============================] - 1s 18ms/step - loss: 0.0093 - mae: 0.1068 - val_loss: 0.0103 - val_mae: 0.1029\n",
"3/3 [==============================] - 0s 14ms/step - loss: 0.0103 - mae: 0.1029\n"
]
},
{
"data": {
"text/plain": [
"102874.92722272873"
]
},
"execution_count": 68,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"fit_and_evaluate(custom_model, seq2seq_train, seq2seq_valid,\n",
" learning_rate=0.1, epochs=5)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# LSTMs"
]
},
{
"cell_type": "code",
"execution_count": 69,
"metadata": {
"scrolled": true
},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"lstm_model = tf.keras.models.Sequential([\n",
" tf.keras.layers.LSTM(32, return_sequences=True, input_shape=[None, 5]),\n",
" tf.keras.layers.Dense(14)\n",
"])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Just training for 5 epochs to show that it works (you can increase this if you want):"
]
},
{
"cell_type": "code",
"execution_count": 70,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/5\n",
"33/33 [==============================] - 2s 29ms/step - loss: 0.0535 - mae: 0.2517 - val_loss: 0.0187 - val_mae: 0.1716\n",
"Epoch 2/5\n",
"33/33 [==============================] - 1s 16ms/step - loss: 0.0176 - mae: 0.1598 - val_loss: 0.0176 - val_mae: 0.1473\n",
"Epoch 3/5\n",
"33/33 [==============================] - 1s 16ms/step - loss: 0.0160 - mae: 0.1528 - val_loss: 0.0168 - val_mae: 0.1433\n",
"Epoch 4/5\n",
"33/33 [==============================] - 1s 16ms/step - loss: 0.0152 - mae: 0.1485 - val_loss: 0.0161 - val_mae: 0.1388\n",
"Epoch 5/5\n",
"33/33 [==============================] - 1s 16ms/step - loss: 0.0145 - mae: 0.1443 - val_loss: 0.0154 - val_mae: 0.1352\n",
"3/3 [==============================] - 0s 14ms/step - loss: 0.0154 - mae: 0.1352\n"
]
},
{
"data": {
"text/plain": [
"135186.25497817993"
]
},
"execution_count": 70,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"fit_and_evaluate(lstm_model, seq2seq_train, seq2seq_valid,\n",
" learning_rate=0.1, epochs=5)"
]
},
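{
"cell_type": "markdown",
"metadata": {},
"source": [
"Side note (an editor's addition, not in the original notebook): an `LSTM` layer is equivalent to wrapping an `LSTMCell` in a `tf.keras.layers.RNN` layer, but the dedicated `LSTM` layer is usually much faster, since it can use an optimized implementation (e.g., cuDNN when running on a GPU):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# editor's sketch: functionally equivalent to lstm_model above, but built\n",
"# from a generic RNN wrapper around an LSTMCell (usually slower)\n",
"equivalent_lstm_model = tf.keras.Sequential([\n",
"    tf.keras.layers.RNN(tf.keras.layers.LSTMCell(32), return_sequences=True,\n",
"                        input_shape=[None, 5]),\n",
"    tf.keras.layers.Dense(14)\n",
"])"
]
},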
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# GRUs"
]
},
{
"cell_type": "code",
"execution_count": 71,
"metadata": {
"scrolled": true
},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"gru_model = tf.keras.Sequential([\n",
" tf.keras.layers.GRU(32, return_sequences=True, input_shape=[None, 5]),\n",
" tf.keras.layers.Dense(14)\n",
"])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Just training for 5 epochs to show that it works (you can increase this if you want):"
]
},
{
"cell_type": "code",
"execution_count": 72,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/5\n",
"33/33 [==============================] - 2s 29ms/step - loss: 0.0516 - mae: 0.2489 - val_loss: 0.0165 - val_mae: 0.1529\n",
"Epoch 2/5\n",
"33/33 [==============================] - 1s 18ms/step - loss: 0.0145 - mae: 0.1386 - val_loss: 0.0139 - val_mae: 0.1260\n",
"Epoch 3/5\n",
"33/33 [==============================] - 1s 18ms/step - loss: 0.0118 - mae: 0.1249 - val_loss: 0.0121 - val_mae: 0.1170\n",
"Epoch 4/5\n",
"33/33 [==============================] - 1s 18ms/step - loss: 0.0106 - mae: 0.1166 - val_loss: 0.0111 - val_mae: 0.1109\n",
"Epoch 5/5\n",
"33/33 [==============================] - 1s 18ms/step - loss: 0.0098 - mae: 0.1107 - val_loss: 0.0104 - val_mae: 0.1071\n",
"3/3 [==============================] - 0s 14ms/step - loss: 0.0104 - mae: 0.1071\n"
]
},
{
"data": {
"text/plain": [
"107093.29694509506"
]
},
"execution_count": 72,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"fit_and_evaluate(gru_model, seq2seq_train, seq2seq_valid,\n",
" learning_rate=0.1, epochs=5)"
]
},
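{
"cell_type": "markdown",
"metadata": {},
"source": [
"Editor's note (not in the original notebook): a GRU cell uses three sets of weights (for the update gate, the reset gate, and the candidate state) versus four for an LSTM cell, so for the same number of units it has roughly three quarters as many parameters:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# editor's sketch: compare parameter counts for 32 units and 5 input features\n",
"gru_layer = tf.keras.layers.GRU(32)\n",
"lstm_layer = tf.keras.layers.LSTM(32)\n",
"gru_layer(tf.zeros([1, 10, 5]))   # build the layers on a dummy batch\n",
"lstm_layer(tf.zeros([1, 10, 5]))\n",
"print(gru_layer.count_params(), lstm_layer.count_params())"
]
},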
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Using One-Dimensional Convolutional Layers to Process Sequences"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"```\n",
" |-----0-----| |-----3----| |--... |-------52------|\n",
" |-----1----| |-----4----| ... | |-------53------|\n",
" |-----2----| |------5--...-51------| |-------54------|\n",
"X: 0 1 2 3 4 5 6 7 8 9 10 11 12 ... 104 105 106 107 108 109 110 111\n",
"Y: from 4 6 8 10 12 ... 106 108 110 112\n",
" to 17 19 21 23 25 ... 119 121 123 125\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 73,
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"conv_rnn_model = tf.keras.Sequential([\n",
" tf.keras.layers.Conv1D(filters=32, kernel_size=4, strides=2,\n",
" activation=\"relu\", input_shape=[None, 5]),\n",
" tf.keras.layers.GRU(32, return_sequences=True),\n",
" tf.keras.layers.Dense(14)\n",
"])\n",
"\n",
"longer_train = to_seq2seq_dataset(mulvar_train, seq_length=112,\n",
" shuffle=True, seed=42)\n",
"longer_valid = to_seq2seq_dataset(mulvar_valid, seq_length=112)\n",
"downsampled_train = longer_train.map(lambda X, Y: (X, Y[:, 3::2]))\n",
"downsampled_valid = longer_valid.map(lambda X, Y: (X, Y[:, 3::2]))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Just training for 5 epochs to show that it works (you can increase this if you want):"
]
},
{
"cell_type": "code",
"execution_count": 74,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/5\n",
"31/31 [==============================] - 2s 30ms/step - loss: 0.0482 - mae: 0.2420 - val_loss: 0.0214 - val_mae: 0.1616\n",
"Epoch 2/5\n",
"31/31 [==============================] - 1s 18ms/step - loss: 0.0165 - mae: 0.1532 - val_loss: 0.0171 - val_mae: 0.1423\n",
"Epoch 3/5\n",
"31/31 [==============================] - 1s 18ms/step - loss: 0.0144 - mae: 0.1447 - val_loss: 0.0157 - val_mae: 0.1342\n",
"Epoch 4/5\n",
"31/31 [==============================] - 1s 17ms/step - loss: 0.0130 - mae: 0.1361 - val_loss: 0.0141 - val_mae: 0.1254\n",
"Epoch 5/5\n",
"31/31 [==============================] - 1s 17ms/step - loss: 0.0115 - mae: 0.1256 - val_loss: 0.0124 - val_mae: 0.1159\n",
"1/1 [==============================] - 0s 88ms/step - loss: 0.0124 - mae: 0.1159\n"
]
},
{
"data": {
"text/plain": [
"115850.42625665665"
]
},
"execution_count": 74,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"fit_and_evaluate(conv_rnn_model, downsampled_train, downsampled_valid,\n",
" learning_rate=0.1, epochs=5)"
]
},
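{
"cell_type": "markdown",
"metadata": {},
"source": [
"Editor's note (not in the original notebook): a quick sanity check of the downsampling arithmetic in the diagram above. With `kernel_size=4`, `strides=2` and no padding, a 112-step sequence yields (112 - 4) // 2 + 1 = 55 output steps, which is exactly the number of targets kept by `Y[:, 3::2]` (every second time step from t = 3 to t = 111):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# editor's sketch: the Conv1D output length matches the downsampled targets\n",
"dummy_inputs = tf.zeros([1, 112, 5])\n",
"dummy_conv = tf.keras.layers.Conv1D(filters=32, kernel_size=4, strides=2)\n",
"print(dummy_conv(dummy_inputs).shape)  # (1, 55, 32)\n",
"print(tf.zeros([1, 112, 14])[:, 3::2].shape)  # (1, 55, 14)"
]
},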
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## WaveNet"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"```\n",
" ⋮\n",
"C2 /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\...\n",
" \\ / \\ / \\ / \\ / \\ / \\ / \\ \n",
" / \\ / \\ / \\ \n",
"C1 /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\ /\\ /...\\\n",
"X: 0 1 2 3 4 5 6 7 8 9 10 11 12 ... 111\n",
"Y: 1 2 3 4 5 6 7 8 9 10 11 12 13 ... 112\n",
" /14 15 16 17 18 19 20 21 22 23 24 25 26 ... 125\n",
"```"
]
},
{
"cell_type": "code",
"execution_count": 75,
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42) # extra code ensures reproducibility\n",
"wavenet_model = tf.keras.Sequential()\n",
"wavenet_model.add(tf.keras.layers.InputLayer(input_shape=[None, 5]))\n",
"for rate in (1, 2, 4, 8) * 2:\n",
" wavenet_model.add(tf.keras.layers.Conv1D(\n",
" filters=32, kernel_size=2, padding=\"causal\", activation=\"relu\",\n",
" dilation_rate=rate))\n",
"wavenet_model.add(tf.keras.layers.Conv1D(filters=14, kernel_size=1))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Just training for 5 epochs to show that it works (you can increase this if you want):"
]
},
{
"cell_type": "code",
"execution_count": 76,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/5\n",
"31/31 [==============================] - 2s 26ms/step - loss: 0.0796 - mae: 0.3159 - val_loss: 0.0239 - val_mae: 0.1723\n",
"Epoch 2/5\n",
"31/31 [==============================] - 1s 16ms/step - loss: 0.0172 - mae: 0.1585 - val_loss: 0.0182 - val_mae: 0.1545\n",
"Epoch 3/5\n",
"31/31 [==============================] - 1s 16ms/step - loss: 0.0159 - mae: 0.1561 - val_loss: 0.0181 - val_mae: 0.1505\n",
"Epoch 4/5\n",
"31/31 [==============================] - 1s 16ms/step - loss: 0.0155 - mae: 0.1535 - val_loss: 0.0175 - val_mae: 0.1479\n",
"Epoch 5/5\n",
"31/31 [==============================] - 1s 17ms/step - loss: 0.0147 - mae: 0.1488 - val_loss: 0.0166 - val_mae: 0.1407\n",
"1/1 [==============================] - 0s 74ms/step - loss: 0.0166 - mae: 0.1407\n"
]
},
{
"data": {
"text/plain": [
"140713.95993232727"
]
},
"execution_count": 76,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"fit_and_evaluate(wavenet_model, longer_train, longer_valid,\n",
" learning_rate=0.1, epochs=5)"
]
},
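{
"cell_type": "markdown",
"metadata": {},
"source": [
"Editor's note (not in the original notebook): each `kernel_size=2` causal layer extends the receptive field by its dilation rate, so the stack above (two blocks of rates 1, 2, 4, 8) sees 1 + 2 * (1 + 2 + 4 + 8) = 31 time steps per output step:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# editor's sketch: receptive field of the dilated causal conv stack above\n",
"receptive_field = 1 + sum(rate * (2 - 1) for rate in (1, 2, 4, 8) * 2)\n",
"print(receptive_field)  # 31"
]
},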
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Extra Material Wavenet Implementation"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here is the original WaveNet defined in the paper: it uses Gated Activation Units instead of ReLU and parametrized skip connections, plus it pads with zeros on the left to avoid getting shorter and shorter sequences:"
]
},
{
"cell_type": "code",
"execution_count": 77,
"metadata": {},
"outputs": [],
"source": [
"class GatedActivationUnit(tf.keras.layers.Layer):\n",
2019-04-05 11:04:38 +02:00
" def __init__(self, activation=\"tanh\", **kwargs):\n",
" super().__init__(**kwargs)\n",
2021-10-17 04:04:08 +02:00
" self.activation = tf.keras.activations.get(activation)\n",
"\n",
2019-04-05 11:04:38 +02:00
" def call(self, inputs):\n",
" n_filters = inputs.shape[-1] // 2\n",
" linear_output = self.activation(inputs[..., :n_filters])\n",
2021-10-17 04:04:08 +02:00
" gate = tf.keras.activations.sigmoid(inputs[..., n_filters:])\n",
2019-04-05 11:04:38 +02:00
" return self.activation(linear_output) * gate"
]
},
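{
"cell_type": "markdown",
"metadata": {},
"source": [
"Editor's note (not in the original notebook): the layer splits its input channels in half, one half for the signal and one half for the gate, so its output has half as many channels as its input:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# editor's sketch: 64 input channels -> 32 output channels\n",
"gau = GatedActivationUnit()\n",
"print(gau(tf.zeros([1, 10, 64])).shape)  # (1, 10, 32)"
]
},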
{
"cell_type": "code",
"execution_count": 78,
"metadata": {},
"outputs": [],
"source": [
"def wavenet_residual_block(inputs, n_filters, dilation_rate):\n",
2021-10-17 04:04:08 +02:00
" z = tf.keras.layers.Conv1D(2 * n_filters, kernel_size=2, padding=\"causal\",\n",
" dilation_rate=dilation_rate)(inputs)\n",
2019-04-05 11:04:38 +02:00
" z = GatedActivationUnit()(z)\n",
2021-10-17 04:04:08 +02:00
" z = tf.keras.layers.Conv1D(n_filters, kernel_size=1)(z)\n",
" return tf.keras.layers.Add()([z, inputs]), z"
2019-04-05 11:04:38 +02:00
]
},
{
"cell_type": "code",
"execution_count": 79,
"metadata": {},
"outputs": [],
"source": [
"tf.random.set_seed(42)\n",
"\n",
"n_layers_per_block = 3 # 10 in the paper\n",
"n_blocks = 1 # 3 in the paper\n",
"n_filters = 32 # 128 in the paper\n",
"n_outputs = 14 # 256 in the paper\n",
"\n",
"inputs = tf.keras.layers.Input(shape=[None, 5])\n",
"z = tf.keras.layers.Conv1D(n_filters, kernel_size=2, padding=\"causal\")(inputs)\n",
"skip_to_last = []\n",
"for dilation_rate in [2**i for i in range(n_layers_per_block)] * n_blocks:\n",
" z, skip = wavenet_residual_block(z, n_filters, dilation_rate)\n",
" skip_to_last.append(skip)\n",
"\n",
"z = tf.keras.activations.relu(tf.keras.layers.Add()(skip_to_last))\n",
"z = tf.keras.layers.Conv1D(n_filters, kernel_size=1, activation=\"relu\")(z)\n",
"Y_preds = tf.keras.layers.Conv1D(n_outputs, kernel_size=1)(z)\n",
"\n",
"full_wavenet_model = tf.keras.Model(inputs=[inputs], outputs=[Y_preds])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Just training for 5 epochs to show that it works (you can increase this if you want):"
]
},
{
"cell_type": "code",
"execution_count": 80,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/5\n",
"31/31 [==============================] - 2s 26ms/step - loss: 0.0706 - mae: 0.2861 - val_loss: 0.0209 - val_mae: 0.1630\n",
"Epoch 2/5\n",
"31/31 [==============================] - 1s 18ms/step - loss: 0.0137 - mae: 0.1398 - val_loss: 0.0140 - val_mae: 0.1273\n",
"Epoch 3/5\n",
"31/31 [==============================] - 1s 20ms/step - loss: 0.0104 - mae: 0.1190 - val_loss: 0.0116 - val_mae: 0.1125\n",
"Epoch 4/5\n",
"31/31 [==============================] - 1s 18ms/step - loss: 0.0086 - mae: 0.1048 - val_loss: 0.0096 - val_mae: 0.1020\n",
"Epoch 5/5\n",
"31/31 [==============================] - 1s 19ms/step - loss: 0.0073 - mae: 0.0942 - val_loss: 0.0087 - val_mae: 0.0953\n",
"1/1 [==============================] - 0s 71ms/step - loss: 0.0087 - mae: 0.0953\n"
]
},
{
"data": {
"text/plain": [
"95349.08086061478"
]
},
"execution_count": 80,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"fit_and_evaluate(full_wavenet_model, longer_train, longer_valid,\n",
" learning_rate=0.1, epochs=5)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In this chapter we explored the fundamentals of RNNs and used them to process sequences (namely, time series). In the process we also looked at other ways to process sequences, including CNNs. In the next chapter we will use RNNs for Natural Language Processing, and we will learn more about RNNs (bidirectional RNNs, stateful vs stateless RNNs, EncoderDecoders, and Attention-augmented Encoder-Decoders). We will also look at the Transformer, an Attention-only architecture."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# Exercise solutions"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 1. to 8."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"1. Here are a few RNN applications:\n",
" * For a sequence-to-sequence RNN: predicting the weather (or any other time series), machine translation (using an EncoderDecoder architecture), video captioning, speech to text, music generation (or other sequence generation), identifying the chords of a song\n",
" * For a sequence-to-vector RNN: classifying music samples by music genre, analyzing the sentiment of a book review, predicting what word an aphasic patient is thinking of based on readings from brain implants, predicting the probability that a user will want to watch a movie based on their watch history (this is one of many possible implementations of _collaborative filtering_ for a recommender system)\n",
" * For a vector-to-sequence RNN: image captioning, creating a music playlist based on an embedding of the current artist, generating a melody based on a set of parameters, locating pedestrians in a picture (e.g., a video frame from a self-driving car's camera)\n",
"2. An RNN layer must have three-dimensional inputs: the first dimension is the batch dimension (its size is the batch size), the second dimension represents the time (its size is the number of time steps), and the third dimension holds the inputs at each time step (its size is the number of input features per time step). For example, if you want to process a batch containing 5 time series of 10 time steps each, with 2 values per time step (e.g., the temperature and the wind speed), the shape will be [5, 10, 2]. The outputs are also three-dimensional, with the same first two dimensions, but the last dimension is equal to the number of neurons. For example, if an RNN layer with 32 neurons processes the batch we just discussed, the output will have a shape of [5, 10, 32].\n",
"3. To build a deep sequence-to-sequence RNN using Keras, you must set `return_sequences=True` for all RNN layers. To build a sequence-to-vector RNN, you must set `return_sequences=True` for all RNN layers except for the top RNN layer, which must have `return_sequences=False` (or do not set this argument at all, since `False` is the default).\n",
"4. If you have a daily univariate time series, and you want to forecast the next seven days, the simplest RNN architecture you can use is a stack of RNN layers (all with `return_sequences=True` except for the top RNN layer), using seven neurons in the output RNN layer. You can then train this model using random windows from the time series (e.g., sequences of 30 consecutive days as the inputs, and a vector containing the values of the next 7 days as the target). This is a sequence-to-vector RNN. Alternatively, you could set `return_sequences=True` for all RNN layers to create a sequence-to-sequence RNN. You can train this model using random windows from the time series, with sequences of the same length as the inputs as the targets. Each target sequence should have seven values per time step (e.g., for time step _t_, the target should be a vector containing the values at time steps _t_ + 1 to _t_ + 7).\n",
"5. The two main difficulties when training RNNs are unstable gradients (exploding or vanishing) and a very limited short-term memory. These problems both get worse when dealing with long sequences. To alleviate the unstable gradients problem, you can use a smaller learning rate, use a saturating activation function such as the hyperbolic tangent (which is the default), and possibly use gradient clipping, Layer Normalization, or dropout at each time step. To tackle the limited short-term memory problem, you can use `LSTM` or `GRU` layers (this also helps with the unstable gradients problem).\n",
"6. An LSTM cell's architecture looks complicated, but it's actually not too hard if you understand the underlying logic. The cell has a short-term state vector and a long-term state vector. At each time step, the inputs and the previous short-term state are fed to a simple RNN cell and three gates: the forget gate decides what to remove from the long-term state, the input gate decides which part of the output of the simple RNN cell should be added to the long-term state, and the output gate decides which part of the long-term state should be output at this time step (after going through the tanh activation function). The new short-term state is equal to the output of the cell. See Figure 1512.\n",
"7. An RNN layer is fundamentally sequential: in order to compute the outputs at time step _t_, it has to first compute the outputs at all earlier time steps. This makes it impossible to parallelize. On the other hand, a 1D convolutional layer lends itself well to parallelization since it does not hold a state between time steps. In other words, it has no memory: the output at any time step can be computed based only on a small window of values from the inputs without having to know all the past values. Moreover, since a 1D convolutional layer is not recurrent, it suffers less from unstable gradients. One or more 1D convolutional layers can be useful in an RNN to efficiently preprocess the inputs, for example to reduce their temporal resolution (downsampling) and thereby help the RNN layers detect long-term patterns. In fact, it is possible to use only convolutional layers, for example by building a WaveNet architecture.\n",
"8. To classify videos based on their visual content, one possible architecture could be to take (say) one frame per second, then run every frame through the same convolutional neural network (e.g., a pretrained Xception model, possibly frozen if your dataset is not large), feed the sequence of outputs from the CNN to a sequence-to-vector RNN, and finally run its output through a softmax layer, giving you all the class probabilities. For training you would use cross entropy as the cost function. If you wanted to use the audio for classification as well, you could use a stack of strided 1D convolutional layers to reduce the temporal resolution from thousands of audio frames per second to just one per second (to match the number of images per second), and concatenate the output sequence to the inputs of the sequence-to-vector RNN (along the last dimension)."
]
},
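{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here is a minimal sketch (an editor's addition, not part of the official solutions) of the sequence-to-vector architecture described in answer 4, for a univariate daily time series:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# editor's sketch of answer 4: every layer returns sequences except the top\n",
"# RNN layer, which has 7 units to forecast the next 7 days\n",
"forecast_7d_model = tf.keras.Sequential([\n",
"    tf.keras.layers.SimpleRNN(32, return_sequences=True,\n",
"                              input_shape=[None, 1]),\n",
"    tf.keras.layers.SimpleRNN(7)\n",
"])"
]
},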
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 9. Tackling the SketchRNN Dataset"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"_Exercise: Train a classification model for the SketchRNN dataset, available in TensorFlow Datasets._"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The dataset is not available in TFDS yet, the [pull request](https://github.com/tensorflow/datasets/pull/361) is still work in progress. Luckily, the data is conveniently available as TFRecords, so let's download it (it might take a while, as it's about 1 GB large, with 3,450,000 training sketches and 345,000 test sketches):"
]
},
{
"cell_type": "code",
"execution_count": 81,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading data from http://download.tensorflow.org/data/quickdraw_tutorial_dataset_v1.tar.gz\n",
"1065304064/1065301781 [==============================] - 230s 0us/step\n",
"1065312256/1065301781 [==============================] - 230s 0us/step\n"
]
}
],
"source": [
"tf_download_root = \"http://download.tensorflow.org/data/\"\n",
"filename = \"quickdraw_tutorial_dataset_v1.tar.gz\"\n",
"filepath = tf.keras.utils.get_file(filename,\n",
" tf_download_root + filename,\n",
" cache_dir=\".\",\n",
" extract=True)"
]
},
{
"cell_type": "code",
"execution_count": 82,
"metadata": {},
"outputs": [],
"source": [
"quickdraw_dir = Path(filepath).parent\n",
"train_files = sorted(\n",
" [str(path) for path in quickdraw_dir.glob(\"training.tfrecord-*\")]\n",
")\n",
"eval_files = sorted(\n",
" [str(path) for path in quickdraw_dir.glob(\"eval.tfrecord-*\")]\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 83,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"['datasets/training.tfrecord-00000-of-00010',\n",
" 'datasets/training.tfrecord-00001-of-00010',\n",
" 'datasets/training.tfrecord-00002-of-00010',\n",
" 'datasets/training.tfrecord-00003-of-00010',\n",
" 'datasets/training.tfrecord-00004-of-00010',\n",
" 'datasets/training.tfrecord-00005-of-00010',\n",
" 'datasets/training.tfrecord-00006-of-00010',\n",
" 'datasets/training.tfrecord-00007-of-00010',\n",
" 'datasets/training.tfrecord-00008-of-00010',\n",
" 'datasets/training.tfrecord-00009-of-00010']"
]
},
"execution_count": 83,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"train_files"
]
},
{
"cell_type": "code",
"execution_count": 84,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"['datasets/eval.tfrecord-00000-of-00010',\n",
" 'datasets/eval.tfrecord-00001-of-00010',\n",
" 'datasets/eval.tfrecord-00002-of-00010',\n",
" 'datasets/eval.tfrecord-00003-of-00010',\n",
" 'datasets/eval.tfrecord-00004-of-00010',\n",
" 'datasets/eval.tfrecord-00005-of-00010',\n",
" 'datasets/eval.tfrecord-00006-of-00010',\n",
" 'datasets/eval.tfrecord-00007-of-00010',\n",
" 'datasets/eval.tfrecord-00008-of-00010',\n",
" 'datasets/eval.tfrecord-00009-of-00010']"
]
},
"execution_count": 84,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"eval_files"
]
},
{
"cell_type": "code",
"execution_count": 85,
"metadata": {},
"outputs": [],
"source": [
"with open(quickdraw_dir / \"eval.tfrecord.classes\") as test_classes_file:\n",
" test_classes = test_classes_file.readlines()\n",
" \n",
"with open(quickdraw_dir / \"training.tfrecord.classes\") as train_classes_file:\n",
" train_classes = train_classes_file.readlines()"
]
},
{
"cell_type": "code",
"execution_count": 86,
"metadata": {},
"outputs": [],
"source": [
"assert train_classes == test_classes\n",
"class_names = [name.strip().lower() for name in train_classes]"
]
},
{
"cell_type": "code",
"execution_count": 87,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"['aircraft carrier',\n",
" 'airplane',\n",
" 'alarm clock',\n",
" 'ambulance',\n",
" 'angel',\n",
" 'animal migration',\n",
" 'ant',\n",
" 'anvil',\n",
" 'apple',\n",
" 'arm',\n",
" 'asparagus',\n",
" 'axe',\n",
" 'backpack',\n",
" 'banana',\n",
" 'bandage',\n",
" 'barn',\n",
" 'baseball',\n",
" 'baseball bat',\n",
" 'basket',\n",
" 'basketball',\n",
" 'bat',\n",
" 'bathtub',\n",
" 'beach',\n",
" 'bear',\n",
" 'beard',\n",
" 'bed',\n",
" 'bee',\n",
" 'belt',\n",
" 'bench',\n",
" 'bicycle',\n",
" 'binoculars',\n",
" 'bird',\n",
" 'birthday cake',\n",
" 'blackberry',\n",
" 'blueberry',\n",
" 'book',\n",
" 'boomerang',\n",
" 'bottlecap',\n",
" 'bowtie',\n",
" 'bracelet',\n",
" 'brain',\n",
" 'bread',\n",
" 'bridge',\n",
" 'broccoli',\n",
" 'broom',\n",
" 'bucket',\n",
" 'bulldozer',\n",
" 'bus',\n",
" 'bush',\n",
" 'butterfly',\n",
" 'cactus',\n",
" 'cake',\n",
" 'calculator',\n",
" 'calendar',\n",
" 'camel',\n",
" 'camera',\n",
" 'camouflage',\n",
" 'campfire',\n",
" 'candle',\n",
" 'cannon',\n",
" 'canoe',\n",
" 'car',\n",
" 'carrot',\n",
" 'castle',\n",
" 'cat',\n",
" 'ceiling fan',\n",
" 'cell phone',\n",
" 'cello',\n",
" 'chair',\n",
" 'chandelier',\n",
" 'church',\n",
" 'circle',\n",
" 'clarinet',\n",
" 'clock',\n",
" 'cloud',\n",
" 'coffee cup',\n",
" 'compass',\n",
" 'computer',\n",
" 'cookie',\n",
" 'cooler',\n",
" 'couch',\n",
" 'cow',\n",
" 'crab',\n",
" 'crayon',\n",
" 'crocodile',\n",
" 'crown',\n",
" 'cruise ship',\n",
" 'cup',\n",
" 'diamond',\n",
" 'dishwasher',\n",
" 'diving board',\n",
" 'dog',\n",
" 'dolphin',\n",
" 'donut',\n",
" 'door',\n",
" 'dragon',\n",
" 'dresser',\n",
" 'drill',\n",
" 'drums',\n",
" 'duck',\n",
" 'dumbbell',\n",
" 'ear',\n",
" 'elbow',\n",
" 'elephant',\n",
" 'envelope',\n",
" 'eraser',\n",
" 'eye',\n",
" 'eyeglasses',\n",
" 'face',\n",
" 'fan',\n",
" 'feather',\n",
" 'fence',\n",
" 'finger',\n",
" 'fire hydrant',\n",
" 'fireplace',\n",
" 'firetruck',\n",
" 'fish',\n",
" 'flamingo',\n",
" 'flashlight',\n",
" 'flip flops',\n",
" 'floor lamp',\n",
" 'flower',\n",
" 'flying saucer',\n",
" 'foot',\n",
" 'fork',\n",
" 'frog',\n",
" 'frying pan',\n",
" 'garden',\n",
" 'garden hose',\n",
" 'giraffe',\n",
" 'goatee',\n",
" 'golf club',\n",
" 'grapes',\n",
" 'grass',\n",
" 'guitar',\n",
" 'hamburger',\n",
" 'hammer',\n",
" 'hand',\n",
" 'harp',\n",
" 'hat',\n",
" 'headphones',\n",
" 'hedgehog',\n",
" 'helicopter',\n",
" 'helmet',\n",
" 'hexagon',\n",
" 'hockey puck',\n",
" 'hockey stick',\n",
" 'horse',\n",
" 'hospital',\n",
" 'hot air balloon',\n",
" 'hot dog',\n",
" 'hot tub',\n",
" 'hourglass',\n",
" 'house',\n",
" 'house plant',\n",
" 'hurricane',\n",
" 'ice cream',\n",
" 'jacket',\n",
" 'jail',\n",
" 'kangaroo',\n",
" 'key',\n",
" 'keyboard',\n",
" 'knee',\n",
" 'knife',\n",
" 'ladder',\n",
" 'lantern',\n",
" 'laptop',\n",
" 'leaf',\n",
" 'leg',\n",
" 'light bulb',\n",
" 'lighter',\n",
" 'lighthouse',\n",
" 'lightning',\n",
" 'line',\n",
" 'lion',\n",
" 'lipstick',\n",
" 'lobster',\n",
" 'lollipop',\n",
" 'mailbox',\n",
" 'map',\n",
" 'marker',\n",
" 'matches',\n",
" 'megaphone',\n",
" 'mermaid',\n",
" 'microphone',\n",
" 'microwave',\n",
" 'monkey',\n",
" 'moon',\n",
" 'mosquito',\n",
" 'motorbike',\n",
" 'mountain',\n",
" 'mouse',\n",
" 'moustache',\n",
" 'mouth',\n",
" 'mug',\n",
" 'mushroom',\n",
" 'nail',\n",
" 'necklace',\n",
" 'nose',\n",
" 'ocean',\n",
" 'octagon',\n",
" 'octopus',\n",
" 'onion',\n",
" 'oven',\n",
" 'owl',\n",
" 'paint can',\n",
" 'paintbrush',\n",
" 'palm tree',\n",
" 'panda',\n",
" 'pants',\n",
" 'paper clip',\n",
" 'parachute',\n",
" 'parrot',\n",
" 'passport',\n",
" 'peanut',\n",
" 'pear',\n",
" 'peas',\n",
" 'pencil',\n",
" 'penguin',\n",
" 'piano',\n",
" 'pickup truck',\n",
" 'picture frame',\n",
" 'pig',\n",
" 'pillow',\n",
" 'pineapple',\n",
" 'pizza',\n",
" 'pliers',\n",
" 'police car',\n",
" 'pond',\n",
" 'pool',\n",
" 'popsicle',\n",
" 'postcard',\n",
" 'potato',\n",
" 'power outlet',\n",
" 'purse',\n",
" 'rabbit',\n",
" 'raccoon',\n",
" 'radio',\n",
" 'rain',\n",
" 'rainbow',\n",
" 'rake',\n",
" 'remote control',\n",
" 'rhinoceros',\n",
" 'rifle',\n",
" 'river',\n",
" 'roller coaster',\n",
" 'rollerskates',\n",
" 'sailboat',\n",
" 'sandwich',\n",
" 'saw',\n",
" 'saxophone',\n",
" 'school bus',\n",
" 'scissors',\n",
" 'scorpion',\n",
" 'screwdriver',\n",
" 'sea turtle',\n",
" 'see saw',\n",
" 'shark',\n",
" 'sheep',\n",
" 'shoe',\n",
" 'shorts',\n",
" 'shovel',\n",
" 'sink',\n",
" 'skateboard',\n",
" 'skull',\n",
" 'skyscraper',\n",
" 'sleeping bag',\n",
" 'smiley face',\n",
" 'snail',\n",
" 'snake',\n",
" 'snorkel',\n",
" 'snowflake',\n",
" 'snowman',\n",
" 'soccer ball',\n",
" 'sock',\n",
" 'speedboat',\n",
" 'spider',\n",
" 'spoon',\n",
" 'spreadsheet',\n",
" 'square',\n",
" 'squiggle',\n",
" 'squirrel',\n",
" 'stairs',\n",
" 'star',\n",
" 'steak',\n",
" 'stereo',\n",
" 'stethoscope',\n",
" 'stitches',\n",
" 'stop sign',\n",
" 'stove',\n",
" 'strawberry',\n",
" 'streetlight',\n",
" 'string bean',\n",
" 'submarine',\n",
" 'suitcase',\n",
" 'sun',\n",
" 'swan',\n",
" 'sweater',\n",
" 'swing set',\n",
" 'sword',\n",
" 'syringe',\n",
" 't-shirt',\n",
" 'table',\n",
" 'teapot',\n",
" 'teddy-bear',\n",
" 'telephone',\n",
" 'television',\n",
" 'tennis racquet',\n",
" 'tent',\n",
" 'the eiffel tower',\n",
" 'the great wall of china',\n",
" 'the mona lisa',\n",
" 'tiger',\n",
" 'toaster',\n",
" 'toe',\n",
" 'toilet',\n",
" 'tooth',\n",
" 'toothbrush',\n",
" 'toothpaste',\n",
" 'tornado',\n",
" 'tractor',\n",
" 'traffic light',\n",
" 'train',\n",
" 'tree',\n",
" 'triangle',\n",
" 'trombone',\n",
" 'truck',\n",
" 'trumpet',\n",
" 'umbrella',\n",
" 'underwear',\n",
" 'van',\n",
" 'vase',\n",
" 'violin',\n",
" 'washing machine',\n",
" 'watermelon',\n",
" 'waterslide',\n",
" 'whale',\n",
" 'wheel',\n",
" 'windmill',\n",
" 'wine bottle',\n",
" 'wine glass',\n",
" 'wristwatch',\n",
" 'yoga',\n",
" 'zebra',\n",
" 'zigzag']"
]
},
"execution_count": 87,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"sorted(class_names)"
]
},
{
"cell_type": "code",
"execution_count": 88,
"metadata": {},
"outputs": [],
"source": [
"def parse(data_batch):\n",
" feature_descriptions = {\n",
" \"ink\": tf.io.VarLenFeature(dtype=tf.float32),\n",
" \"shape\": tf.io.FixedLenFeature([2], dtype=tf.int64),\n",
" \"class_index\": tf.io.FixedLenFeature([1], dtype=tf.int64)\n",
" }\n",
" examples = tf.io.parse_example(data_batch, feature_descriptions)\n",
" flat_sketches = tf.sparse.to_dense(examples[\"ink\"])\n",
" sketches = tf.reshape(flat_sketches, shape=[tf.size(data_batch), -1, 3])\n",
" lengths = examples[\"shape\"][:, 0]\n",
" labels = examples[\"class_index\"][:, 0]\n",
" return sketches, lengths, labels"
]
},
{
"cell_type": "code",
"execution_count": 89,
"metadata": {},
"outputs": [],
"source": [
"def quickdraw_dataset(filepaths, batch_size=32, shuffle_buffer_size=None,\n",
" n_parse_threads=5, n_read_threads=5, cache=False):\n",
" dataset = tf.data.TFRecordDataset(filepaths,\n",
" num_parallel_reads=n_read_threads)\n",
" if cache:\n",
" dataset = dataset.cache()\n",
" if shuffle_buffer_size:\n",
" dataset = dataset.shuffle(shuffle_buffer_size)\n",
" dataset = dataset.batch(batch_size)\n",
" dataset = dataset.map(parse, num_parallel_calls=n_parse_threads)\n",
" return dataset.prefetch(1)"
]
},
{
"cell_type": "code",
"execution_count": 90,
"metadata": {},
"outputs": [],
"source": [
"train_set = quickdraw_dataset(train_files, shuffle_buffer_size=10000)\n",
"valid_set = quickdraw_dataset(eval_files[:5])\n",
"test_set = quickdraw_dataset(eval_files[5:])"
]
},
{
"cell_type": "code",
"execution_count": 91,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"sketches = tf.Tensor(\n",
"[[[-0.08627451 0.11764706 0. ]\n",
" [-0.01176471 0.16806725 0. ]\n",
" [ 0.02352941 0.07563025 0. ]\n",
" ...\n",
" [ 0. 0. 0. ]\n",
" [ 0. 0. 0. ]\n",
" [ 0. 0. 0. ]]\n",
"\n",
" [[-0.04705882 -0.06696428 0. ]\n",
" [-0.09019607 -0.07142857 0. ]\n",
" [-0.0862745 -0.04464286 0. ]\n",
" ...\n",
" [ 0. 0. 0. ]\n",
" [ 0. 0. 0. ]\n",
" [ 0. 0. 0. ]]\n",
"\n",
" [[ 0. 0. 1. ]\n",
" [ 0. 0. 0. ]\n",
" [ 0.00784314 0.11320752 0. ]\n",
" ...\n",
" [ 0.11764708 0.01886791 0. ]\n",
" [-0.03529412 0.12264156 0. ]\n",
" [-0.19215688 0.33962262 1. ]]\n",
"\n",
" ...\n",
"\n",
" [[-0.21276593 -0.01960784 0. ]\n",
" [-0.31382978 0.00784314 0. ]\n",
" [-0.37234044 0.13725491 0. ]\n",
" ...\n",
" [ 0. 0. 0. ]\n",
" [ 0. 0. 0. ]\n",
" [ 0. 0. 0. ]]\n",
"\n",
" [[ 0. 0.4677419 0. ]\n",
" [-0.01176471 0.15053767 0. ]\n",
" [ 0.16470589 0.05376345 0. ]\n",
" ...\n",
" [ 0. 0. 0. ]\n",
" [ 0. 0. 0. ]\n",
" [ 0. 0. 0. ]]\n",
"\n",
" [[-0.04819274 0.01568627 0. ]\n",
" [-0.07228917 -0.01176471 0. ]\n",
" [-0.05622491 -0.03921568 0. ]\n",
" ...\n",
" [ 0. 0. 0. ]\n",
" [ 0. 0. 0. ]\n",
" [ 0. 0. 0. ]]], shape=(32, 104, 3), dtype=float32)\n",
"lengths = tf.Tensor(\n",
"[ 29 48 104 34 29 35 28 40 95 26 23 41 47 17 37 47 12 13\n",
" 17 41 36 23 8 15 60 32 54 38 68 30 89 36], shape=(32,), dtype=int64)\n",
"labels = tf.Tensor(\n",
"[ 95 190 163 12 77 213 216 278 25 202 310 33 327 204 260 181 337 233\n",
" 299 186 61 157 274 150 7 34 47 319 213 292 312 282], shape=(32,), dtype=int64)\n"
]
}
],
"source": [
"for sketches, lengths, labels in train_set.take(1):\n",
" print(\"sketches =\", sketches)\n",
" print(\"lengths =\", lengths)\n",
" print(\"labels =\", labels)"
]
},
{
"cell_type": "code",
"execution_count": 92,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAqwAAAYRCAYAAAB72cuIAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAEAAElEQVR4nOydd3xTVf/HP+fe7KRN96a9UPYeMsqyiiga3I8THKiAE/Tnijvu4HrEgQIq6iPurVF8UKkCgsiUvUPp3mnS7HvP748ErX1YTZPcpL3v16svSHLv+X7SnN587znfQSilkJCQkJCQkJCQkIhVGLEFSEhISEhISEhISBwPyWGVkJCQkJCQkJCIaSSHVUJCQkJCQkJCIqaRHFYJCQkJCQkJCYmYRnJYJSQkJCQkJCQkYhrJYZWQkJCQkJCQkIhp4sZhJYSUEEJeEVtHeyCE9CWErCGEuAkhVrH1SESfeJy3EhISEhISsYZMbAHt4CIAvpM5kBBSDGAFgHRKaV0ENZ2IJwA4AfQF0CKiDgkJCQkJCQmJuCVuVlgppQ2UUnu07RJCFB04vSeAVZRSK6W09hjjyzswvoRE1Ojg34KEhOhIc1giHiCEyAghRGwdsUZMOKzBbdPXCSHzCSGNwZ9nCSFMm2NeafVYQQh5ihByiBDiIYQcIITMIYRwCKyuAkAtIYQSQt4+2hjB594mhHzbxs5rhJDnCCG1AFYHn+9PCLEQQuyEkBpCyAeEkKzjvCcKYAiAh4MaTIQQLvj/KwghPxNCXABmE0JSg+OVEUJchJDthJAZR/kdvUYIeZ4Q0kAIqSWEzCWEKAkhrxJCmgghpYSQq9qcl0sI+bDV79VCCOnVns9HosPIjjW3CSHJhJB3gs+7CCE/EkIGtD6ZEHIRIWRrcJ4fJoQ80PpiRgixEkIeDs5le/CYywghScHP3kEI2UsIObPNuMed00f+Nggh9xJCygCUBZ+fTgj5o9V5nxBCcludVxyc55MIIb8TQpyEkPWEkOFt7F8XnLNOQsg3hJCbg383Ep0AQshEQsja4PyzBefCwOBrVwev3c7gHLul9WcfvF5uazPetYQQR6vHhYSQrwghVYSQFkLIRkLI1DbnWINjvUUIaQKwNPj8WELIL0H75cFra2Ikfx8SnRcS4B5CyP7gdXwrIWR6q9fNhJDdwdeshJBnCCGqVq+bCCHbgnN8PwAPAK0Y7yWWiQmHNcg0BPQUAZgNYBaA249z/DsArgbwfwD6AbgeQBOAwwAuDh4zAEA2gLnt1DIdAAEwAcDVhJBsAL8C2AZgFIAzAOgAfE1aOdVtyAawG8Dzwf8/1+q1pwEsANAfwJcAVAA2Apga1DwfwEJCyKQ2Y04DYAcwGoAZwIvB8/cAOAWB38kbhJAcACCEaBBw3t0ATkXgd1sJ4MfgaxLR4Xhz+20EPs/zEZhbTgDLCCFqACCEjADwCYDPAQwCYARwH4Bb29i4HcA6AMMBfIzAXHgfwHcAhiIwf987cpFsx5w+FcBgAFMAHJmPCgCPIHBDNhVAGoAPjvK+nw7qHQ6gHsDSI442IaQIwBsAXg3q+xrAo0f/9UnEG4QQGYCvAKxCYJ6MRuC6xhNCRiMw7xch8Nl/A+CxEMzoAHwPYHLQxmcAPieE9G1z3P8B2IXANfJ+QsggAP9FYM4NQSDcbCiAt0LQICEBBML/rgdwCwLf608j8B1uCL7eAuA6BHyVmwFcDuCBNmN0B3AlgEsQmJfuyMuOMyilov8AKEHA6SKtnnsQQFmbY14J/r8XAApgyjHGKw6+nnYUO6+0ee5tAN+2OebPNsc8BuCnNs8lB22MOs772gbA1OoxFzznzpP4nXwI4I02uta0ekwA1AL4utVzcgBeAP8KPr4OwN42v1cWAefhUrE/967wc7y53WoeT2z1mh6ADcANwcdLAfzcZkxTm78NK4APWj3WBcd96Shz75Tg4xPO6eDfRi0A5QneY9/geXnBx0f+/s5qdcy4Nsd8AGBZm3EWAaBif2bST8d/AKQEP+9Tj/La+wCWt3nujdaffXCOb2tzzLUAHCewuxbAg60eWwF80+aYdwG82ea5oUG9GWL/7qSf+PpBYCXUBWBCm+dfBPDdMc65EcC+Vo9NCOToZIr9fmL5J5ZWWNfS4CcXZA2A3GNs0wwDIODvrf9ws6HN4xEAJga3thzBbanDwdcKQxh/fesHhBA2uM37JyGkPjj+RQDy25z355H/BH9XNQC2tnrOB6ARQEYr3d0B2FvptiHgmISiWyI0jjq3EbjbFoKPAQCUUhsCn2n/4FP9EAxLacUq/O/fRuu54UBgpXZrq9erg/+2nhsnM6e3UUo9rY0TQoYHt2IPEULs+Hs+H3O+AqhoY78vAivCrfkdEp0CSmkDAjc8PwTDTv6PENIt+HI/tJrzQdo+PiGEEG1wa3UHCYTUOBBYRW07D9e3eTwCwPQ2c//I35h0XZRoL/0R2CVd1mZO3YTgfCKE/IsQsioYvuIA8G/87zwto5RWQ+KYxFOVgNaEGowsHOXcoyU9tc3oZwBYANx1lGNDmWBtx78LwJ0IhC5sBeAA8BT+/nI/QtsqCfQYzx25EWEAbEZg+6EtDe1SLBEJjjePaatjjhXX2fr5E82NI8e2nhsnM6f/MVcJIVoAPwD4EcBVCNw0pQFYiUCoQGuOZ/9470uiE0ApnUEIeRGBcJLzADxJCLkAJ3f9Pplr9XPBse9CYCfJicDqadt5eLTr+RsIOA1tKT8JbRISrTlyTTsXQGmb13yEkDEI7Jg+CuAOBEIXz8M/wwQBqZLQCYklh3U0IYS0WokaA6CCUtp8lGM3IjBJTgOw7Cive4P/sm2er0UgnrQ1QxDYNjoeGwFcCuBQcBUz3IxHYNvqP0AggBtAbwQmdkfYCOAKAHWU0o6OJRE6R53bAHbg79jWXwEguGo6CMCS4LE7EJgfrRmPwN14R6pmhDqn+yLgoN5PKT0Y1HxRCPZ3IhA725q2jyXiHErpFgBbAMwjhHwP4BoE5vSYNoe2fVwLILPN383QNseMB/AupfQzAAjGZxciEIJzPDYCGEAp3dee9yIhcQx2IJAkVUAp/bnti4SQfwEop5Q+3uq5gijq6zTEUkhADoAXCSF9gh/w3Tj6HTAopXsRSCx5gxByMSGkOyFkAvk7Q/4QAqs3BkJIOiFEF3z+ZwBnE0LOC9p5AUC3/7XwP7yKQGzhR4SQ0YSQHoSQMwghiwghCaG/5b/YA2ASIWR8MGHgFQS28jvKUgRWy74ihJwa/D1NJIFKA1KlgOhx1LkdnMdfIRCcPyGYDPIegGYE4vyAQNLeqcEs0t6EkGkIrMY/00FNoc7pUgQuzrcGzzEAePw4xx+LlwCcSQi5mxDSixByPYALQxhHIgYJXmvMJJCNX0AIOQ2B5L0dCHz2ZxBC7gt+9jPxv599CQJxsPeTQDWA6wH8q80xewBcGAxROfK3o8KJmQdgFAlUphlGCOlJCJlKCFkY+juW6KoEFw6eA/AcCVQ+6UkIGUoIuZEQMguBeZpLCJkWvGbehMBCk
kQ7iSWHdSkCK6K/A1gM4E0cw2ENcjUCX+ovIZAB+jYCX8CglJYjkMX8JAIO25FSVm+1+lmNwNb7FycSRimtQCBpREBgRXc7Al/4nuBPR3kCgXi+7xFYaWtBsPxKR6CUOgFMBHAAgUzzXQhkjycjEOsqER2ON7dnIPDZfx38V4NAMqELACilGxHIGr0YgSQ+c/CnQ92zQp3TNFBP+BoAFyDgfDyCQBZ2e+2vATATwBwEYl0vQMCRkDJjOwdOBHaJPkHgC/sdBP4O5lFK1yKQUX0TAp/9RQgknfwFpXRn8PVZwWMmIxAm1Zr/QyAkZSUC1861wf8fF0rpnwhcFzkAvyCwAvw0QgvvkpAAgIcQmMN3IXAtXY7ANfsgpfQbAM8ikIR1ZC4/LIrKOIf8MxdEJBGElCCQ3NG2VI+EhEQXgRDybwBnUEoHia1FIroEdx4+oZRKxdIlJCSOSizFsEpISHQhCCF3I7AS4UCgDuyNAO4XVZSEhISEREwiOawSEhJicQoCW2h6AAcRaIgwX1RFEhI
"text/plain": [
"<Figure size 864x2016 with 32 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"def draw_sketch(sketch, label=None):\n",
" origin = np.array([[0., 0., 0.]])\n",
" sketch = np.r_[origin, sketch]\n",
" stroke_end_indices = np.argwhere(sketch[:, -1]==1.)[:, 0]\n",
" coordinates = sketch[:, :2].cumsum(axis=0)\n",
" strokes = np.split(coordinates, stroke_end_indices + 1)\n",
" title = class_names[label.numpy()] if label is not None else \"Try to guess\"\n",
" plt.title(title)\n",
" plt.plot(coordinates[:, 0], -coordinates[:, 1], \"y:\")\n",
" for stroke in strokes:\n",
" plt.plot(stroke[:, 0], -stroke[:, 1], \".-\")\n",
" plt.axis(\"off\")\n",
"\n",
"def draw_sketches(sketches, lengths, labels):\n",
" n_sketches = len(sketches)\n",
" n_cols = 4\n",
" n_rows = (n_sketches - 1) // n_cols + 1\n",
" plt.figure(figsize=(n_cols * 3, n_rows * 3.5))\n",
" for index, sketch, length, label in zip(range(n_sketches), sketches, lengths, labels):\n",
" plt.subplot(n_rows, n_cols, index + 1)\n",
" draw_sketch(sketch[:length], label)\n",
" plt.show()\n",
"\n",
"for sketches, lengths, labels in train_set.take(1):\n",
" draw_sketches(sketches, lengths, labels)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Most sketches are composed of less than 100 points:"
]
},
{
"cell_type": "code",
"execution_count": 93,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAZwAAAEOCAYAAAC976FxAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAZiklEQVR4nO3df7RlZX3f8ffHATVRW0BHMxlIBnWqGe0K0inS2lirks5AdTSpLkiq+KMdSSDGX0kmpjbk1yr+XtJQJoNMhFZFE3U51TFoqYbaFGVARBCREUcZmcCoKWpQceTbP/YePB7uvXPO3HOfc+fO+7XWWefsZz/Pvd+972E+7H32eXaqCkmSFtoDpl2AJOnwYOBIkpowcCRJTRg4kqQmDBxJUhMGjiSpiaaBk2RdkpuT7EyyaYb1SXJ+v/76JCf27Q9O8ukkn01yY5I/GBhzTJKPJbmlfz665TZJkkbTLHCSLAMuANYDa4AzkqwZ6rYeWN0/NgIX9u3fB55eVT8PnACsS3Jyv24TcEVVrQau6JclSYtMyyOck4CdVXVrVd0DXAZsGOqzAbi0OlcBRyVZ0S9/p+9zZP+ogTGX9K8vAZ6zkBshSTo4RzT8XSuB2waWdwNPHqHPSmBPf4R0DfBY4IKq+lTf51FVtQegqvYkeeRMvzzJRrqjJh7ykIf8k8c//vHz3BxJOrxcc801X6+q5Qc7vmXgZIa24Xl1Zu1TVT8ETkhyFPCBJE+sqhtG/eVVtQXYArB27drasWPHqEMlSUCSr8xnfMtTaruB4waWjwVuH7dPVf0/4BPAur7pjiQrAPrnOydWsSRpYloGztXA6iTHJ3kgcDqwbajPNuCF/dVqJwN39afJlvdHNiT5CeCZwBcGxpzZvz4T+OACb4ck6SA0O6VWVfuSnANcDiwDtlbVjUnO6tdvBrYDpwI7gbuBF/fDVwCX9J/jPAB4b1V9qF93HvDeJC8Fvgo8r9U2SZJGl8Px9gR+hiNJ40tyTVWtPdjxzjQgSWrCwJEkNWHgSJKaMHAkSU0YOJKkJgwcSVITBo4kqQkDR5LUhIEjSWrCwJEkNWHgSJKaMHAkSU0YOJKkJgwcSVITBo4kqQkDR5LUhIEjSWrCwJEkNWHgSJKaMHAkSU0YOJKkJgwcSVITBo4kqQkDR5LUhIEjSWrCwJEkNWHgSJKaaBo4SdYluTnJziSbZlifJOf3669PcmLfflySjye5KcmNSX5zYMy5Sb6W5Lr+cWrLbZIkjeaIVr8oyTLgAuAUYDdwdZJtVfX5gW7rgdX948nAhf3zPuDVVXVtkocB1yT52MDYt1bVm1ptiyRpfC2PcE4CdlbVrVV1D3AZsGGozwbg0upcBRyVZEVV7amqawGq6tvATcDKhrVLkuapZeCsBG4bWN7N/UPjgH2SrAKeBHxqoPmc/hTc1iRHT6xiSdLEtAyczNBW4/RJ8lDgfcArqupbffOFwGOAE4A9wJtn/OXJxiQ7kuzYu3fvmKVLkuarZeDsBo4bWD4WuH3UPkmOpAubd1bV+/d3qKo7quqHVXUvcBHdqbv7qaotVbW2qtYuX7583hsjSRpPy8C5Glid5PgkDwROB7YN9dkGvLC/Wu1k4K6q2pMkwMXATVX1lsEBSVYMLD4XuGHhNkGSdLCaXaVWVfuSnANcDiwDtlbVjUnO6tdvBrYDpwI7gbuBF/fDnwK8APhckuv6ttdW1XbgDUlOoDv1tgt4WZMNkiSNJVXDH6MsfWvXrq0dO3ZMuwxJOqQkuaaq1h7seGcakCQ1YeBIkpowcCRJTRg4kqQmDBxJUhMGjiSpCQNHktSEgSNJasLAkSQ1YeBIkpowcCRJTRg4kqQmDBxJUhMGjiSpCQNHktSEgSNJasLAkSQ1YeBIkpowcCRJTRg4kqQmDBxJUhMGjiSpiSOmXYBmt2rTh0fqt+u80xa4EkmaP49wJElNGDiSpCYMHElSEwaOJKkJA0eS1ETTq9SSrAPeBiwD3l5V5w2tT7/+VOBu4EVVdW2S44BLgZ8C7gW2VNXb+jHHAO8BVgG7gOdX1d812aBFwqvZJB0Kmh3hJFkGXACsB9YAZyRZM9RtPbC6f2wELuzb9wGvrqqfA04Gzh4Yuwm4oqpWA1f0y5KkRablKbWTgJ1VdWtV3QNcBmwY6rMBuLQ6VwFHJVlRVXuq6lqAqvo2cBOwcmDMJf3rS4DnLPB2SJIOQsvAWQncNrC8mx+Fxsh9kqwCngR8qm96VFXtAeifHznTL0+yMcmOJDv27t17sNsgSTpILQMnM7TVOH2SPBR4H/CKqvrWOL+8qrZU1dqqWrt8+fJxhkqSJqDlRQO7geMGlo8Fbh+1T5Ij6cLmnVX1/oE+d+w/7ZZkBXDnxCufsFE/5JekpaTlEc7VwOokxyd5IHA6sG2ozzbghemcDNzVB0mAi4GbquotM4w5s399JvDBhdsESdLBanaEU1X7kpwDXE53WfTWqroxyVn9+s3AdrpLonfSXRb94n74U4AXAJ9Lcl3f9tqq2g6cB7w3yUuBrwLPa7RJkqQxNP0eTh8Q24faNg+8LuDsGcZ9kpk/36GqvgE8Y7KVSpImzZkGJElNGDiSpCYMHElSEwaOJKkJA0eS1ISBI0lqwsCRJDUxcuAkafqdHUnS0jLOEc6eJG9K8nMLVo0kacka56jltXRTzbwyyaeBtwPvqarvLEhlmjjvDCppmkY+wqmqi6rqnwNPBD4J/DHdUc/WJE9ZqAIlSUvD2BcNVNVNVfVbdLcOeC3wK8CVSb6Q5KwkXoggSbqfsS8E6G8t8EvAS4Cn0x3tXAz8NPA64Gl0tx6QJOk+IwdOkhPpQuYM4AfApcDZVXXLQJ8rgP896SIlSYe+cY5wPg18DNgIfLCq9s3Q5ybgskkUJklaWsYJnMdU1Vfm6lBVf8+PbpomSdJ9xvmA/+NJHj7cmOSoJLdOsCZJ0hI0TuCsors19LAHASsnUo0kack64Cm1JL80sHhakrsGlpfR3d5514TrkiQtMaN8hvOX/XPRXf486Ad0YfPqCdYkSVqCDhg4VfUAgCRfBv5pVX19wauSJC05I1+lVlXHL2QhkqSlbc7ASfIq4L9W1ff617OqqrdMtDJJ0pJyoCOc3wAuAb7Xv55NAQaOJGlWcwbO4Gk0T6lJkuZjXjM7JzlyUoVIkpa2cW4x/fIkvzywvBX4bpKbkzxuQaqTJC0Z4xzhvBzYC5DkqcDz6O6Fcx3w5lF+QJJ1fUDtTLJphvVJcn6//vp+hur967YmuTPJDUNjzk3ytSTX9Y9Tx9gmSVIj4wTOSn40o8CzgL+oqvcC5wInH2hwkmXABcB6YA1wRpI1Q93WA6v7x0bgwoF17wDWzfLj31pVJ/SP7aNsjCSprXFmi/4WsBz4KnAK8Ma+/QfAg0cYfxKws6puBUhyGbAB+PxAnw3ApVVVwFX9xKArqmpPVV2ZZNUY9eogrdr04ZH67TrvtAWuRNJSMs4RzkeBi5JcDDwW+Ejf/gTgyyOMXwncNrC8m/tP+jlKn5mc05+C25rk6Jk6JNmYZ
EeSHXv37h3hR0qSJmmcwDkb+D/AI4B/W1Xf7NtPBN49wvjM0FYH0WfYhcBjgBOAPczyeVJVbamqtVW1dvny5Qf4kZKkSRtnaptvMcOXP6vq90f8EbuB4waWjwVuP4g+w7//jv2vk1wEfGjEeiRJDY3zGQ4ASX4aeCRDR0dVde0Bhl4NrE5yPPA14HS6q9wGbaM7PXYZ8GTgrqrac4B6Vgz0eS5ww1z9JUnTMXLgJHkS8N+Bx3P/U1/FzDdn+1GHqn1JzgEu7/turaobk5zVr98MbAdOBXYCdzNwu+ok7waeBjwiyW7g96vqYuANSU7oa9gFvGzUbZIktTPOEc4Wug/0/wPdaa4DfbZyP/0ly9uH2jYPvC66z4pmGnvGLO0vGLcOSVJ74wTOGuBJVfXFhSpGkrR0jXOV2ueAn1qoQiRJS9s4gfNaus9LnpnkUUmOGXwsVIGSpKVhnFNq/7N//ig//vlNGOGiAUnS4W2cwPlXC1aFJGnJG+eLn3+9kIVIkpa2sW7AluQfJ/nTJB9JsqJ
"text/plain": [
"<Figure size 432x288 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
}
],
"source": [
"lengths = np.concatenate([lengths for _, lengths, _ in train_set.take(1000)])\n",
"plt.hist(lengths, bins=150, density=True)\n",
"plt.axis([0, 200, 0, 0.03])\n",
"plt.xlabel(\"length\")\n",
"plt.ylabel(\"density\")\n",
"plt.show()"
]
},
{
"cell_type": "code",
"execution_count": 94,
"metadata": {},
"outputs": [],
"source": [
"def crop_long_sketches(dataset, max_length=100):\n",
" return dataset.map(lambda inks, lengths, labels: (inks[:, :max_length], labels))\n",
"\n",
"cropped_train_set = crop_long_sketches(train_set)\n",
"cropped_valid_set = crop_long_sketches(valid_set)\n",
"cropped_test_set = crop_long_sketches(test_set)"
]
},
{
"cell_type": "code",
"execution_count": 95,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/2\n",
"107813/107813 [==============================] - 2048s 19ms/step - loss: 4.0817 - accuracy: 0.1705 - sparse_top_k_categorical_accuracy: 0.3747 - val_loss: 3.0628 - val_accuracy: 0.3127 - val_sparse_top_k_categorical_accuracy: 0.5969\n",
"Epoch 2/2\n",
"107813/107813 [==============================] - 3975s 37ms/step - loss: 2.7176 - accuracy: 0.3771 - sparse_top_k_categorical_accuracy: 0.6660 - val_loss: 2.4580 - val_accuracy: 0.4253 - val_sparse_top_k_categorical_accuracy: 0.7143\n"
]
}
],
"source": [
"model = tf.keras.Sequential([\n",
" tf.keras.layers.Conv1D(32, kernel_size=5, strides=2, activation=\"relu\"),\n",
" tf.keras.layers.BatchNormalization(),\n",
" tf.keras.layers.Conv1D(64, kernel_size=5, strides=2, activation=\"relu\"),\n",
" tf.keras.layers.BatchNormalization(),\n",
" tf.keras.layers.Conv1D(128, kernel_size=3, strides=2, activation=\"relu\"),\n",
" tf.keras.layers.BatchNormalization(),\n",
" tf.keras.layers.LSTM(128, return_sequences=True),\n",
" tf.keras.layers.LSTM(128),\n",
" tf.keras.layers.Dense(len(class_names), activation=\"softmax\")\n",
"])\n",
"optimizer = tf.keras.optimizers.SGD(learning_rate=1e-2, clipnorm=1.)\n",
"model.compile(loss=\"sparse_categorical_crossentropy\",\n",
" optimizer=optimizer,\n",
" metrics=[\"accuracy\", \"sparse_top_k_categorical_accuracy\"])\n",
"history = model.fit(cropped_train_set, epochs=2,\n",
" validation_data=cropped_valid_set)"
]
},
{
"cell_type": "code",
"execution_count": 96,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"WARNING:tensorflow:5 out of the last 18 calls to <function Model.make_predict_function.<locals>.predict_function at 0x7fd0e07f7a60> triggered tf.function retracing. Tracing is expensive and the excessive number of tracings could be due to (1) creating @tf.function repeatedly in a loop, (2) passing tensors with different shapes, (3) passing Python objects instead of tensors. For (1), please define your @tf.function outside of the loop. For (2), @tf.function has experimental_relax_shapes=True option that relaxes argument shapes that can avoid unnecessary retracing. For (3), please refer to https://www.tensorflow.org/guide/function#controlling_retracing and https://www.tensorflow.org/api_docs/python/tf/function for more details.\n"
]
}
],
"source": [
"y_test = np.concatenate([labels for _, _, labels in test_set])\n",
"y_probas = model.predict(test_set)"
]
},
{
"cell_type": "code",
"execution_count": 97,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"0.60668993"
]
},
"execution_count": 97,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"np.mean(tf.keras.metrics.sparse_top_k_categorical_accuracy(y_test, y_probas))"
]
},
{
"cell_type": "code",
"execution_count": 98,
"metadata": {},
"outputs": [
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAALUAAADdCAYAAAD99DOeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAwlUlEQVR4nO2deZwb1ZXvv7dKe0ut3lcvso2x8QJ4wdgsptmXJiHJhBcS8pKQxDAwScjCS5rJhOmXTAYnk5kJEJIQZzAMkLzJQth6wk5jwGY1NgbjBdvtrfdN3WrtVff9Idk0Hm8tlVTqdn0/H32kkkrnHkk/3Tp1695zhJQSC4uJhGK2AxYWRmOJ2mLCYYnaYsJhidpiwmGJ2mLCYYnaYsJhidpiwjHuRS2EkMe43WdQO/cJIZ4wwpZFbrGZ7YAB1I56fCWw6pDnIqN3FkLYpZSJfDhmYQ7jvqeWUnYeuAGDo58DXMCgEOKzQojnhRAR4CYhxJAQ4tOj7QghLhZCJIQQ1Ye2IYRoBr4INI46AjSkX5svhHhWCBERQvSne3T/0XwWQpwphFgvhIgKId4WQlxxiM2G9HbFqPcE0s8tHvXcHCFEixBiWAjRLYT4vRCiZtTr84UQz6U/77AQYqMQ4vz0a3YhxJ1CiHYhREwIsVcIsfL4v/nCZdyL+ji5HfglMAf4M/B74MuH7PNl4AkpZddh3v8z4A/As6SOArXAWiGEB3gSCAFLgE8CZwH3HskRIYQXeALYAiwCvgv8y1g/kBCiFlgDvJtu+yLACzwmhDjwu/4O6Ei/vgBoBqLp176R9vcaYCbwGWDrWP0oSKSUE+YGfDr1kQ5uBwAJfOeQ/RYDSaA+vV1KKky58ii27yMl+tHPrQCCgG/Ucw3pNk86gp0bgH7APeq5z6Xf03CIjYrDfJbF6e0fAs8dYrs0vc+S9PYQ8MUj+HEn8BwgzP7djL6dKD31m6M3pJRvAptIhRSQEtUA8Ncx2j0FeEdKOTzqubWATuqocDhmA+9KKUfH+q+NsV1I9fLLhRChAzdgb/q1Gen7fwN+mw69vi+EmD3q/fcBpwPbhBB3CyEaR/Xw45oJ8SGOg5HDPPdb4Lr04y8D90kptTHaFaR6xsNxpOeP9p4D6KP2PYD9kH0UoIWUMEffZpIKb5BSNpP6cz1CKix6Rwjx5fRr60n1/n+ftnU/8MxEEPa4/wBZ8CBQL4T4GrAQWH2M/eOAeshzm4HThBC+Uc+dRep7ff8Idt4H5gsh3KOeW3LIPj3p+9GjOKcfss96YC6wW0r5wSG3g0cOKeV2KeWdUspG4D+Ar456bVhK+Ucp5Y1AI3ABcNIR/B4/mB3/GHnjyDH14iPsfz8QA148Dtt/T+rwPguoINVzeoB24C/AfGA5qZOtPx/FjpeUaB8k1YteRCoUksB56X3swB7gYeBk4BJgIx+NqeuA7vQ+ZwLT07Z+A/gAN3A3qfg8kN5nE/Db9Pu/DXyWVAh1EnAHqfMDj9m/Y9Y6MNsBk0W9PP36F47DdiXwNDDMR0/q5pM64YqQisvvA/zHsLUUeDv9h3ob+Ju0zTNH7XMWsCFtdx2pnvQjn4VUqPGndLuR9B/qLsCRvv0O2J1upz0t+OL0e1eQ6u2HSZ1QvgicZfZvaMRNpD/gCYkQ4jPAPUCdlDJsoh9Xkertq6SUvWb5MVGYCFcUx0x6fDlAKqRYlW9BCyG+COwkFc7MA34OPG4J2hhO1BPF75KKUfuBH5nQfjXwAKlw4W5SQ4mfN8GPCckJHX5YTExO1J7aYgJjidpiwmGJ2mLCYYnaYsJhidpiwmGJ2mLCYYnaYsJhidpiwnFCXiYfbwSaWpaRmm3X2raycZ3J7hQ81hXFAmbm3z9cm9CdnwP9JyBUIAni7im+Hb0B/weVb3Wd9cJIwtd/cum7thklW5V17Q2bB2PlQSDctrLxhP1hLVEXADevulYkpe3cF/deVh9KFM8rcfZdokl14XC8JMPwUEqXGhFRzd0OIlTsGHCWuXr9bUMnvQQiVO9tK6/0dJZs6F76V2B4VummiipPh/ul/Zc8B0xWRHKmLm1/GK9HBUvUeSbQ1KJO929ZONnXdt2GniWOYKxsuiKSZ+jS5k3votmV2I6TS99T+6JVD3WOTGoH/p3UwoEEcNXi6pfbK9zd05/dc2V3Und455RvOKXC3TVvzb6LN4LiCxRvP7XE2X/Khp4z3wZ8Ve6OOS5bZNKe4eltgM+pRmqlVNxx3SkO7yWI1Iqyf5QoK9tWNsZz+60YiyVqgxkd/5JeclVkH1p2UsmWv90xONsWShRPIbViBkVoCV2qbzuU6LtLal92DMX9v3un54wX2lY2Ro9k08jeM9DUogLec+ufnuRQY3XP7fnYVcCNgJJajyAAOueUbXhlesm2//uL6+/fZFTbucQStYEEmloagKdA2kEKkBqoKoBLDUuvY3hHb6T6CYG+/pLAo0MuNfLkHSseipnr9Yek/zzP8eFR4R/sSuyKhO68UKDpEvX/KWh3fWzGf712x4qHClY4lqgNINDUMnt+xVv3be47bZEmbekRJUmps3fLQKzyNmD9GdUv7/rjt27Xj2qoADjcUeHGe7580Yt7L706nPReAxTXe3eH7Ur8/7YNzbzz0KNKIWCJOkNuXvX5oo6R+u+93rn8XKBBQdPrvbs794amladHKhLAheP1ZOtwBJpafHPL1/9Td7huRU+kxg30VHnaHz2j+pV77r7hvjePaSBPWKIeI4GmlqnA9U41+vWY5vKpIrFfk/ZfOJTY6m3//KmuE2FMOdDUIoALga+D/nEFiY7yJxB3AS+ZPZxoifo4CDS1qFOLP/icXYn/7IPBUypBSLsSe+bc+mfX+hzB2+9Y8dAJm0X1pnuuO3djzxk37Q9NvRQorXR3BCf72u5f373sL8AyTPhzW6I+Cjfdc928DT1nXN8emvpxYKrPHkzWefc8tnVg/rfaVjbuMdu/QiLQ1OKp8rTfpAj9R50jk1zpp3VS6RnyGoZZoj6E9KH1POBGRWj/S5cqwPPAr+qK9jy69gc3nrC98vFw86prxaM7rrkHxIp0hjUNxA/aVjbeni8fLFGnCTS1lM6vePOn7aHJn+uLVnuAgdqivc+cXvX66l/dcO+TZvs3nkidV8gXAKdA6hLlHKunzhM3r7pWhJNFX2jde9lFCd35N4B7kndXSJO22zpGJv+6bWVj5JhGLA5LoKllmdsW+k0kWTQHxOltKxvzduHmhBR1oKmlCPicQ4l+M6675qgiGdek7T5FaL/eefvH3zbbv4lCoKmlHNgu0DZ8fMZ/XZivCzYFJ+pcDonNuPXRuXPLNzy4uf/UU5K6wwm8u7j6lddri/bedtf1D+w3si2LFGf96Fc/bB+Z8oNqz74XusKTvp+PMKSgRJ0WdCupy7RRDDhr/saqz/v2DE2/ZUPPmecD5yoiqc8s2bJt68C8rwJrzR5TnejM/v4fz4tq7tbUloiQh5GQQlsk0JCaNyEEKWE3kMr4edyM6um3AUuc6ie+HtPcbkVobbpUv2tXkquf+
t73rJx1eSKqec4iNbSnkMrE2sAYf9OxUmiibgWpAyqIRGr7+Bl91p2eYaYpQn/2wilPtHrtQ/9yx4qHxlopwCJ7WtPDegrIBIjWXDdYUOEHQKCp5RFSGe0vHethKtDUcivIf0p/gRLET9pWNt6aE0ctjpvLf/rjx9/vP/3KQPH2Fa1//83f5rq9Qlx4GwJ6Moy7WkHEQCZBRCvcnU+3toqZBvtnMUYiyaJ1AF778Cv5aK/gRH1Syeblle6O/1Gg83hI/xEuBHEbcOHPzvvqF4FXWltFsaFOWoyJtqGZnQDv9i08XEEpwym0mJpgrHRYk4fWCzp+0sJeB9DaShfwVEODHDLIPYsM8NqHXKFEMSXOvsx/2DFQcD11T6S2vz9a9YERthoa5M6GBvl7gNZWcWZrq7jRCLsWY2NB1WuLAc6uez6jI/BYKThRk1q/l4vD1N8C32ltFUU5sG1xFHYPTd8GsDN48mA+2iu48KPC3Tm3zNVbmipGZSgrgMqGBjnS2ioEYGtokNaMuzywZ3hGN8D
"text/plain": [
"<Figure size 216x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Top-5 predictions:\n",
" 1. popsicle 13.105%\n",
" 2. computer 7.943%\n",
" 3. television 7.032%\n",
" 4. laptop 6.640%\n",
" 5. cell phone 5.520%\n",
"Answer: picture frame\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAALUAAADdCAYAAAD99DOeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAmuklEQVR4nO2deXhTVf7G35O9W7qkC12gFyiFsu+lgCwiyBB11HGbAURxKogKqKPEdRjcwuioKCBQF2AA/bmiEkURKYvsUva1wC0t3Zd0z35+fyTMVIatzU1Obrif58nT5ubmfN+0b84959xzvodQSiEhEUzIWAuQkBAaydQSQYdkaomgQzK1RNAhmVoi6JBMLRF0SKaWCDpEb2pCCL3KY7lAcZYTQtYJUZaEb1GwFiAAiS1+vwVAzkXHmlueTAhRUkrt/hAmwQbR19SU0tILDwDmlscAaACYCSF/JoT8QghpBjCDEFJHCLmrZTmEkLGEEDshJOHiGISQuQCmANC3uAKM8rzWixDyMyGkmRBS7anRI6+kmRCSSQjZRwixEELyCCETLipzlOd5bIv3cJ5jA1sc604IMRFC6gkh5YSQTwgh7Vq83osQstHzeesJIQcIIaM9rykJIe8SQooJIVZCSCEhxHjtf/nARfSmvkZeB7AYQHcAXwL4BMDUi86ZCmAdpbTsEu9/E8BnAH6G+yqQCGA7ISQUwHoADQAGA7gDwFAAH11OCCEkHMA6AMcBDADwDIA3WvuBCCGJALYAOOyJfROAcADfEkIu/F/XACjxvN4PwFwAFs9rMz167wPQBcC9AE60VkdAQikNmgeAu9wf6T/POQAUwFMXnTcQgANAsud5NNzNlFuuUPZyuE3f8lg2gFoAES2OjfLETLtMOdMAVAMIaXHsL573jLqojNhLfJaBnufzAGy8qOxozzmDPc/rAEy5jI53AWwEQFj/34R+XC819d6WTyilewEcgrtJAbhNVQPgh1aWmwHgIKW0vsWx7QBccF8VLkU3AIcppS3b+rtaGRdw1/IjCCENFx4ACj2vdfb8fAvAB56m1/OEkG4t3r8cQF8AJwkhiwgh+hY1vKgJig9xDTRe4tgHAB70/D4VwHJKqbOV5RK4a8ZLcbnjV3rPBVwtzr2A8qJzZABMcBuz5aML3M0bUErnwv3lWgt3s+ggIWSq57V9cNf+z3nKWgFgQzAYW/QfwAtWAUgmhDwGoD+Aj69yvg2A/KJjRwH0IYREtDg2FO6/67HLlHMMQC9CSEiLY4MvOqfC87PlKE7fi87ZB6AHgAJKaf5Fj/9cOSilpyil71JK9QA+BPDXFq/VU0o/p5Q+AkAP4EYAaZfRLR5Yt3+EfODybeqBlzl/BQArgM3XUPZzcF/euwKIhbvmDAVQDOBrAL0AjIC7s/XlFcoJh9u0q+CuRW+CuylEAYz0nKMEcA7AVwDSAYwDcAC/b1MnASj3nJMJoJOnrGUAIgCEAFgEd/uc85xzCMAHnvc/CeDPcDeh0gAsgLt/EMr6/+i1D1gLYGzqEZ7X77+GsuMA/ASgHr/v1PWCu8PVDHe7fDmAyKuUNQRAnucLlQfgT54yM1ucMxTAfk+5O+CuSX/3WeBuanzhidvs+UK9B0DleawBUOCJU+wxvNbz3my4a/t6uDuUmwEMZf0/FOJBPB/wuoQQci+ApQCSKKVNDHX8Ee7aPp5SWslKR7AQDHcUW41nfJmDu0mR429DE0KmADgDd3OmJ4B3AHwnGVoYrteO4jNwt1GrAbzMIH4CgH/D3VxYBPdQ4iQGOoKS67r5IRGcXK81tUQQI5laIuiQTC0RdEimlgg6JFNLBB2SqSWCDsnUEkGHZGqJoOO6uE3OGUxZcM9Wy+WN+h2M5Uj4mKC/o+g2NN0EQA1QJyBbBODQ8OSfuzTYIs7sr8g8BKB+XOra0CZHeNG28zeV8Ua9g7FsCS+4Hkz9CkCf//0ikqtBm7SqWo2Tyiob7dpzAK1Pjz7Src4Wdby0MeWInDgaBrb7tU9lU8L+07XdjqjklsYRyT9xJY3tDx+p6seHK+vqb+xgshDQugXZq4P7DxyABLWpZ+VMCvvm9L2/ArI+AJwAtQHkNgAnbua+7nu+oQMOVw6wEri0w5N/1p9vSK05U9vVLCeOqO66/X8obUyuqGhObJARZ3R8SElfszWm0eIMlQFUC5CLV8H8DwQuJ4XMLCeOpvjQkrgGmza/3h5ZoFE02tOjjqWXNCbvrmhOPB2prnb2jt3b+Wxtl11FDR35uJAS28CE7eGnzBnH883dy+FejpYJqQl1TQStqTmDifTU7Tt2uKp/Vzmxz3VSpQ0CGWJWzkRCQaKOV/eKOFnTU63TlMX1S9g15FRNd3NBXZpdq6pJzIg5eOOZ2vSiiuZEa6iiITE5vCCruLF9caNdS5UyW5xG0ZTSYNPaKWTqq0ekFP+51JBmAGMkY1+eYDb1kwD+1T9+x6avnnzlRtZ6LgdnMCm76/brOG1+2r6yIbbSphRVSsTZDp0jT2QerBhYUGONlWnkjXdZnKGZHl9TAM/zRv3rjKUHLEE5+tHzxTW3A5FvAPhyX3nWPaz1XAneqLcD+lIApRe9tObCL5zB9CvcS8Y0AEinyONR7tVdEpci6Grq6Useum1T4R/WyojzdLMjvC9v1F8qPYLo8IzijNWqap6XESojxBWXN/cBM2tdgUhQmZozmMYDrk808ubQm1LXDV/48Mo9rDUJzYOLZkzcVDhhBUBW8Ub9A6z1BCJBc0eRM5iGATABsiiLM4yuO3NvUDatPn508WqAvAZgyuB/fDCZtZ5AJGhMHRdSciHTEOBOOjOKnRqf80qUuqqoyRG6YmbO5B6sxQQaQWNqgCa7BwaoA4AdQC5bPb6DN+ptQxI3P9rkCHOtO3P365zB1Jo7S0FPUJiaM5g0Fc1JHQFqAshLuA7GcZdM//BbF1U846LyW/HfRJcSCBJTK2W2WwFoAdk7vFH/erAbugXvqOXNu1Uyywczlj44nLWYQCEoTN0p8uTboYp6C4BNrLX4E96od43pYHqKEEq2FI17jzOYguL/6S2i/yNwBlPUqZqMdqna07t5o761qXhFz+JpH29zuJTZDXZtXwCzWOsJBERvagB3uiCXH6vu+zfWQljhpIqPAfotgfOf05c8dCtrPawRvalDFA3TAHoKF+0WcD3BG/V0VMr6p8OUjfK9ZcM+5AymixO0X1eI2tSPLpvSz+IIHdw/fkcBb9QHz63RNrD8sYUnk8LPPVbZnBAH4AXWelgialNvKLhtLIUMUZrq11hrCQR+mjNnMYCVAH3+zwv+9ifWelghalPbnJp7AOz9aMb719Wox1WYpVWZHfnmbmvGGN8MZy2GBaI19UOLp48BMECjaPyCtZZAgjfqzf3jdxoqmhNVp80Zr7LWwwLRmrraEvcMgQtjOpjWs9YSaCx/bOE7cO/hMrPni2v+wFiO3xGlqTmDieSVZ3ZSyy07Fj284gBrPQHKs2GK+lK5zPnd48smp7IW409EaWoAAwGSZnGGfsBaSKDCG/XNw5I3G
uqsUbL1/B3zWevxJ6I0dXdd3tsy4nTAvc+4xGVY9kjOCgrZPLtLfS9nMF03oyGiMzVnMMnP1qYPSIs6VsYb9bWs9YiAVxXEflAjb1r92LIpvViL8QeiMzWA0c2OME2+OWM2ayFigDfq7WNTv33OQRXqX4tH51wPc69FZ2oC5yQAdS4qX8dai1h4f/pHJqXMPqfGEpeJ/+7HHrSIytQzcyZFq+S2+zntqSO8UW9hrUdMNDvC3gSQKyPOhTOWPngDaz2+RFSm3lk86narM4Skak9/xlqL2OCNele/+B0zVTJryP6KQZ8F89xrUX2w8ubEWwGUbi4a/x5rLWLk6ydfOdQp8uTLxQ2p7QA8wVqPrxCNqacveag9QG8B8On1uBhAKI5W9/07gLUAfe2ut58dzVqPLxCNqRvsEfMAohzd/vudrLWIGc8U3WmhikYUN3Qw3fX2sxrWmoRGNKbeVTIyLVTRUK5VmaX2tJfwRn35oHbb5hc3dgjZWzb8edZ6hEYUacc4gykJQBGAebxRP5exnKC
"text/plain": [
"<Figure size 216x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Top-5 predictions:\n",
" 1. garden hose 15.217%\n",
" 2. trumpet 10.083%\n",
" 3. rifle 8.203%\n",
" 4. spoon 5.367%\n",
" 5. moustache 4.533%\n",
"Answer: boomerang\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAALUAAADdCAYAAAD99DOeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAi70lEQVR4nO2dd3xb1dnHf0dbsmR5xtuR7exNJg6J44QwgsIKtCmvU1KGgZASE0pBQFNoKERtoZAEaBMDBUKghZaXJSB7koRMMkjiLMt727Is2dY87x+SeY2bKV/pStfn+/noI+nqnuc81/r56Lnnnvs8hFIKBkNIiPh2gMHgGiZqhuBgomYIDiZqhuBgomYIDiZqhuBgomYIjogXNSGEXuLxDkf9vEMI+ZILW4zgIuHbAQ5I6fZ6NoDiHts6uu9MCJFSSl2hcIzBDxE/UlNKa7seACzdtwFQALAQQu4ihGwmhHQAeJgQYiWE3NndDiHkOkKIixCS1LMPQshzAOYD0Hf7Bcj3fzaSELKRENJBCGn2j+jai/lMCJlECDlICOkkhBwihNzUw2a+/31CtzY6/7bx3bYNI4SYCCFthJB6QsiHhJDkbp+PJIRs8h9vGyHkMCFkuv8zKSFkBSGkmhDiIIRUEEKMl/+XD18iXtSXyTIAbwAYBuA/AD4EcG+Pfe4F8CWltO487V8C8BGAjfD9CqQA2EUIUQH4BoANwEQAtwOYDODtCzlCCFED+BLASQDjADwB4C9XekCEkBQA2wEc8/c9E4AawOeEkK7v9QMANf7PrwLwHIBO/2eL/P7+AsBAAHMBlFypH2EJpVQwDwB3+g7px/c6ABTAb3rsNx6AG0Ca/30sfGHK7IvYfgc+0XffVgigFYCm27Z8f58DLmDnQQDNAJTdtv2Pv01+DxsJ5zmW8f73SwFs6mE71r/PRP97K4D5F/BjBYBNAAjf3xvXj74yUu/v/oZSuh/AUfhCCsAnqhYAX1+h3aEAjlBK27pt2wXAC9+vwvkYAuAYpbR7rP/dFfYL+Eb5PEKIresBoML/WY7/+a8A3vSHXs8QQoZ0a/8OgDEAThFCXieE6LuN8BGNIA7iMrCfZ9ubAO7xv74XwDuUUs8V2iXwjYzn40LbL9amC2+3fbuQ9thHBMAEnzC7PwbCF96AUvocfP9cn8IXFh0hhNzr/+wgfKP/035b7wLYIARhR/wB9IL3AaQRQn4NYCyAf1xifycAcY9txwGMJoRoum2bDN/f9cQF7JwAMJIQouy2bWKPfRr8z91nccb02OcggOEAyiilZ3o8fvzloJSeppSuoJTqAbwF4P5un7VRSj+mlC4AoAcwA8CAC/gdOfAd/3D5wIVj6vEX2P9dAA4A2y7D9tPw/bwPBpAA38ipAlAN4H8BjASQB9/J1n8uYkcNn2jfh28UnQlfKEQBTPPvIwVQDuATAIMAXA/gMH4aU6cCqPfvMwlAtt/WagAaAEoAr8MXn+v8+xwF8Ka//WMA7oIvhBoAYDl85wcqvr/HXuuAbwd4FnWe//O7L8N2IoD1ANrw05O6kfCdcHXAF5e/A0B7CVtXAzjk/4c6BOAOv81J3faZDOB7v93d8I2kPzkW+EKNf/v77fD/Q60EIPM/PgBQ5u+n2i/4aH/bQvhG+zb4Tii3AZjM93fIxYP4D7BPQgiZC2AVgFRKaTuPftwK32jfj1LayJcfQkEIVxSvGP/8sg6+kKI41IImhMwHcA6+cGYEgFcBfMEEzQ199UTxCfhi1GYAz/PQfxKANfCFC6/DN5U4jwc/BEmfDj8YwqSvjtQMAcNEzRAcTNQMwcFEzRAcTNQMwcFEzRAcTNQMwcFEzRAcffIyudDRGUy58K3O22o26nfz7E7IYSO1wJi57C+LCbw7Afo8gE1+gfcp2EgtEHQG00gAvwOG/LzbWKVQS1sfhG/pap+Brf2IcOatfHROrT1t+RnLsHQAtihp60a7K3oWQKQAFQEE/VTVx+vbUx8wG/Xf8u1vKGCijlCGPPPxlE6P6ikANyklNpqkqn3PbB3wmNmob+6KqeMVdccGxJx8eG/t1IkUoji5uOO7qWkbP4qStr2yvHCtYL94JuoIY+DTn0xPU5d/bLYOjAfQBOCvuSlb3vqw6KXz5SsBAOgMpigAhSqJ7YV2t1olIu7vvFTyBwDfmI16wQmAiToCKCouIOVt2fceqr96PoCpCnG7fVTi/vV7a/PuNhv1tsu1s6h4nvZk88hnTrWMmAsgM15RVzcs/siqHVXX/cFs1HsvaSBCYKIOY3QGEwFwU6y88fUWR0J/EfHUeqn4RQBvmo36jku1v4hdmUZmKZSJHK80dSZJARwl8L5wS84/P1leuDbi8wwKUtSRPk9bVFwgbnXGvLCrasbNTq9imIh4KnJTtmyPVzYsWFH4ftulLVx2P/J15tvu6vSongAwNFFZ44yS2paYrQP/ajbq3Vz1E2oEI2qdwSTRyCypqVHlC0paRv4GIGL47qK+NlKErTOYxAB+JoJniRfiYdGyFovVGbsYwFqzUR+0EVRnMIkmJu9YYrbm/Ka+PVUDoDRNXVY8LmnXa1z+E4WKiBC1/8tOBpCRl75uTpUtU37WMtQN0Ix0tfnGxo4kb6dHpcZ/JZuhSFJVH5uUsj1vReH7LTy4flnoDCbJ1SlbXj7RPPruVkdcDIDjg2OPvjck7uirywvXOkLoBwEwG6BLADJBI7M42pwxv0Uvw51Qw4uo3b/XukUEIi+F9zcZs5XVtsycfXVTtAAyxid9O6elMz76bOsQO4CMaFnLuDanVkohIj3MdACoyNSc0wL0bHlbzmYAlcmqioLa9oyp/5/Zi0BCXM1uKv0DgNVmo74TYUJR8byoz87OvQsQPQUgOyWqorPDrXrA4ohfy+eJW1FxAbG5op/8rmZqgc2lHQHQugnJO3elRFUtWFG45oKzLOFCyEXt/r3WLSb/P6J6PKAD3B/8RLAS4vK6qfQsgIqcmBNxcrGj8njTGBOAiqlp60mU1HbsG/OcsvNNR01aunpjXXvatV3m4xQNnzV3JsQBJF8u7rCMS9q94UTTqLsP/WE+b+LWGUyKdHXp0zZX9NMWR7wYwH6ZuPPFWbpPPg23+WOdwTRNK2t+tdUZN0ZMXFYPlf4ZwGtmo76Vb98uRMhF7X1W6yUAIQSgFKAA/XnMw2/vr5vyKYDKSSnbmpJVVZWBfrnZhs9WeSF5AKBugLjgj6l1BlN+hqb03xVtWfEAqgC8EK+of/vAc/eE7Od9UfG8+P211zxebc+cDyAlVV3WpIs++8dd1TOWh/t88T2vL/zVlopZdwJELyGujjH9vttXZs25Y9+z94ddrhL+R2oKj2RpK2drUHQG04cApsGXfusnsx9FxQVkR9XMm5o7+z0F4BqtvNmVoSl9/VjjuCfNRr2TKx/O45MGwMMqie35drdaCtAtAHne719Yi7knOoNpbLa25ONzrYOz4Us2/8aAmBMrNxoer+Tbty54j6m5FDQAjFjyYTUh3jNHlxbkXWgfncFERifuvbepI/GVSluWBkCZR
mZ5eUbGV28uL1zL2QnRI6t/2b/KlrnqYH3uRIDERsta9l2dsu0fqxcU/42rPvhizLPvjbE44p8A6FypyEn6R5/beMYy9B6zUV/Ft28RMftxuTz09/vU68tubRvbb8/Ofy9+ceql9vef7d8A4A8AJsYpGlxtzuiFLq/8nd5MoekMpgQAj4qIe7GXSlSxioY9LZ2JRWajfm+gNsOVn7/6ZK7dpX7vh6arsgDiUYjbP5iR+dVbX5Xe6QFP1woEJWqdwTQGwKEEZe19+5+974J1V87TjkzP+GrJofpJhRZHfDqAs6MS9v0rS3t66ZVMqY159t2UbO3pTw83TBjjoRIpQP8zI+OrD99e+MYnARxORKEzmLIAPCmCpxCgIi/EHn/OeCdCfK1AaKK+G76c08PMRv2Fkp5frD0BMFsqchhdXvkwtdTaZHNFLwbwgdmov2CVgZnGv+ScsQx7BMADBF7lwNgTJadahs8xG/XHAz6YCGXhql+N31l17QetzriBvi3UA5AlZqN+Wah8EJSor1328ue
"text/plain": [
"<Figure size 216x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Top-5 predictions:\n",
" 1. wine bottle 24.326%\n",
" 2. hexagon 22.632%\n",
" 3. octagon 13.903%\n",
" 4. lipstick 2.759%\n",
" 5. blackberry 2.112%\n",
"Answer: square\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAALUAAADdCAYAAAD99DOeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAu9klEQVR4nO2dd3hUVfrHP++dkkmddCChjCIsKii6WFCUCPa46rqWVWyAqKDCuqvuiIJjj66uYtdY1/6zrjrqquhYVtxVFNvakJ0AoYYkkzqZcs/vj5loZCnJlEwyuZ/nmQfmzj3veW/mO+e+95T3iFIKA4N0Qku1AwYGicYQtUHaYYjaIO0wRG2QdhiiNkg7DFEbpB2GqA3Sjn4vahFR23k9nKB6HhaRVxJhyyC5mFPtQAIY0uX/RwHVmx1r73qyiFiUUsHecMwgNfT7llopta7zBTR2PQbYgEYROVlE3haRdmCOiDSJyPFd7YjIISISFJFBm9chIi7gDKCyyx2gIvrZOBF5S0TaRaQ+2qLbt+WziOwjIp+KiF9EPhORIzezWRF9X9yljCN6bEKXY7uIiFtEmkVkg4g8KSKDu3w+TkQWR6+3WUQ+F5GDop9ZROQ2EVkjIh0iskpEqrr/l++79HtRd5PrgbuAXYDngCeBGZudMwN4RSm1fgvlbwL+D3iLyF1gCPChiGQBrwMtwN7Ab4H9gAe35oiI5ACvAN8CvwYuAf7S0wsSkSHAe8BX0boPBnKAl0Sk83t9Algb/XwPwAX4o5/Njfr7e2AUcBLwXU/96JMopdLmBRwfuaSf3jsABfxps/MmACGgPPq+gEiYctQ2bD9MRPRdj80CfEBul2MV0Tp32oqdc4B6ILPLsVOiZSo2s1G8hWuZEH1/FbB4M9sF0XP2jr5vAs7Yih+3AYsBSfX3lujXQGmpP+n6Rin1CfAlkZACIqJqAF7rod2dgS+UUs1djn0I6ETuCltiDPCVUqprrP+vHtYLkVb+QBFp6XwBq6KfjYz++1fg/mjodZmIjOlS/mFgPPC9iNwpIpVdWvh+TVpcRDdo3cKx+4Hp0f/PAB5WSoV7aFeItIxbYmvHt1WmE73LuZ1YNjtHA9xEhNn1NYpIeINSykXkx/UikbDoCxGZEf3sUyKt//yorUeAN9NB2P3+AuLgMaBcRM4H9gQe2s75AcC02bH/ALuLSG6XY/sR+bt+sxU73wDjRCSzy7G9NztnY/Tfrr044zc751NgV6BGKbV8s9dPdw6l1A9KqduUUpXAA8BZXT5rVko9o5SaDVQCU4CdtuJ3/yHV8U8iX2w9pp6wlfMfATqAd7thez6R2/uvgGIiLWcWsAZ4ARgHHEjkYeu5bdjJISLax4i0ogcTCYUUMDl6jgVYCTwPjAYOBT7nlzF1GbAhes4+wI5RW/cBuUAmcCeR+NwRPedL4P5o+T8CJxMJoXYCFhF5PshK9fcYtw5S7UCKRX1g9PPTu2G7BHgDaOaXD3XjiDxwtROJyx8G7NuxtS/wWfQH9Rnwu6jNfbqcsx+wLGp3CZGW9BfXQiTUeDZab3v0B3U7YI2+ngBqovWsiQo+L1p2FpHWvpnIA+W7wH6p/g4T8ZLoBQ5IROQk4F6gTCnVlkI/jiHS2pcqpepS5Ue6kA4jij0m2r/sIBJSVPe2oEXkDGAFkXBmLHAr8LIh6MQwUB8ULyESo9YDV6eg/kHAo0TChTuJdCWemgI/0pIBHX4YpCcDtaU2SGMMURukHYaoDdIOQ9QGaYchaoO0wxC1QdphiNog7TBEbZB2DMhh8t7C4XRPJDJLzuOtqlySajsDBWNEMcHMq54myzbuPbKmaaffgLqByDTSMMizoK8bX/Lx5Lr2Qd7VLY4aTULabsVLD9jYPnhFbcuIVWYJmsYWfzppfVvZ8rWtw2qtWodlWO6Kg3/0jRkJooEKDs9dcfZ7l819JNXX2ZcxRB0HDqe7aOrwV06uadrRsbxxlxxQYzPNbfu1h7Lll2cqQPygOjLNbfagbvGHdGsHKKLv20O6NfDT+7C1PaQsAdDFrIXyQrqFrotgMkztmzrCmW9nmVs+m1T+Vr3N3P7UbbMe8/XqxfdhDFF3g0NvuMH+fcPYXwFjdy36bEZde+mg9W3lOcDgLqc1AF+NKfzCkmHyf/r5xr2/JLJG0AIEgamxhA7R0GNxxI4Kjyn8YklN08jm9lDOOCIzDQE9BNpnedaGb/Yo/VfLutbye75rGPeVt6pyQH65A17UXeNVYOnkoa8f3BbKnvTxugMUMDbP2jilKZCf03m+WYLhoswNvvVt5S8BX00Y9EFrada6j1797/Gfby6iZMfUp98x91cmCZ/+Qe1Ua1DPmGCS0H5hZbZGP67Lz9i0fHTB183/2bT7X1uC9g+9VZVNsfrQnxjQonY43ccI+nMKMYEoIgteo+sQVRjkuyHZK5vLc1Y1fLJ+/3uJ5Nj4r7eqsqcLdHuFedXTMtpDWZVv1BxbDOybZ234TVOgoDMhjsq1+NaNyPux8atN428G7SPADkwmzR5AB5yo51VPy2gK5F/mWXX4rxXaEfwUrCoF8u6O9u9eGl3wdc3H6ya5l7qmd6TU2QQw7bY/DvnnmqnjgH2H5njPqGsvHe4PZ0V7vX767v0gU9JF2ANG1DvNf2F4SLfOtGiBOUHdWmzROhqDesaLRDIUmYkj7u1POJxuIbK28RpQJ3R5AP3nhEH//P2zF163OnXeJYa0FrXD6TZnmVuOKctZedfyxp1Loik3/jF56D+W5mfUX7to1uPtA7UP+JcPoAgoyTS3SXnOyruXN+78J29VpX87JvosaSnq8+87Y+I3m3a74EffmMlAWbal2b+j/TvPl3UTZnurKr2p9q+v0PUHPXX4ywUrfKOr/+v7VRngLbKtv25S+eIHF816vE8+P2yLtBH1yEv/PimszGcDI0HtJ5F48XWFdg/g9lZVhlLrYf/A4XRPBXUTyPgh2asa17YOO8ZbVfleqv3qCWkh6oOuu+Wm/zaN+lM0PlQ5Ft9zBw5945a7znn4w1T71h854ZZLtUxz290frTvw+EDYVgjqxSnD3bc/OOfut1PtW3fo16J2ON3DiQxw/C46ageRbKYLvVWV16fSt3TA4XRnAn/QJLQAJLMkc93z69vKz/VWVW7cbuEU0i9F7XC6MyYM+uCFZRv2OSSkLEHgb6BOB4lr9M5gy5x/3xm7rPCNfvg/m3bfE6RtcNbq+/ce8v7Vt816rCHVvm2Jfidqh9N9GJHUWqNGF3y1ZkPbkAOWXXn6ioHai9GbOJzuMYJ+g0I7Otfa6G8O5J8DPOatqtS3W7gX6TeiPvPO8/dc0zLspe8bxpYDy81aYN7y6377aqr9Goicddc5c/+5Zurs9lD2GEFfdtDwV//24Jy7b0m1X530eVE7nO4M4CJQl1m0gG1M4VfPfln369O8VZX9frSvP+NwujXgJJup7Q5/OKswx+L7oCVoP4fI0HsFKbxj9mlRn3nn+c7P1u97qS9QkAc8V5y57pJPrpi5ItV+GfzM3OpT7V7fqDu/qJtwF
JAbHXpXIAFS9GzTJ0XtcLodwC3AsSWZ64JZlpbT3p0/7+kUu2WwDRxOd5Ggv6yQidFeqDCwIBW9UH1qjeLMu2bnnnir800h/B1wqKBfNrHsnSJD0H0fb1XlJoX2J1CBaGutEUkc3+v0mZba4XQfAep2kJEj7d8u/9E3Zqq3qjIlfxSD2In2Qp0I6lSr1pFbMewf5903+74HetOHlIt69r0z9v+2ftyj//WN3gH4vsi24eKlrukvpdQpg7g56dZL9lzeOOajxo5CFVaWI71VlYt7q+6UidrhdNuAiwX9cosWsJblrLrb2zTqQqNXI334/aKLR3+0dvJzIKMzza3TvrnmxGd7o96UiHr6nedd8dHayXPaQ9mlwDP7lS2++om5f/2y1x0xSDoOp7vQrAXe1HXTnhPL3rn78bm3zEl2nb0qaofTvQORrSCOLs5c72/wFx394/XHvNlrDhikhLnVpw36fOOEL2qadioF5nqrKm9PZn29IurZ907P39g25NlP1u+3P0g419p405Rhr/1l0az
"text/plain": [
"<Figure size 216x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Top-5 predictions:\n",
" 1. ear 62.866%\n",
" 2. moon 17.284%\n",
" 3. boomerang 3.729%\n",
" 4. knee 2.912%\n",
" 5. squiggle 2.257%\n",
"Answer: ear\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAALUAAADdCAYAAAD99DOeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAABMx0lEQVR4nO2dd3wUZf7H38/M1vRCrwEsiRqxIxYcEjvYC7YTy6EoqL+TU8GzLOqd3J146h2K4qnoWbCfgpWEATyxoSJIEEQSegvp2T7P74+ZhTUmpO1CCPt+vfa1mdl5ymQ/+8z3ad+vkFKSIEFnQtnbFUiQINYkRJ2g05EQdYJOR0LUCTodCVEn6HQkRJ2g05EQdYJOxz4vaiGEbOb1QozKeUEIMTsWeSWIL7a9XYEY0DPq75HAjAbnvNEXCyHsUsrgnqhYgr3DPt9SSyk3R15AZfQ5wAVUCiEuF0IUCyG8wM1CiGohxMXR+QghThNCBIUQ3RuWIYTwAKOBEVFPAM36LF8IMVcI4RVC7LBa9PTd1VkIMUQI8a0QwieE+E4IcXaDPDXruEtUmhzr3DFR5w4RQswRQtQIIbYKIV4VQvSI+jxfCFFk3W+NEGKJEGK49ZldCPGEEGKjEMIvhFgnhJjS8v98x2WfF3ULeRh4EjgEeAt4FbiuwTXXAbOllFsaSf8I8DowF/Mp0BP4XAiRBHwE1ALHARcAJwDPNVURIUQKMBtYARwN3An8vbU3JIToCSwAllllnwqkAO8JISLf6yvAJuvzIwEP4LM+u9Wq72XAgcAo4KfW1qNDIqXsNC/gYvOWdh7nABKY0OC6Y4AQ0Ns6zsQ0U0buJu8XMEUffW4MUAWkRp3TrDIPaCKfG4EdgDvq3BVWGq1BHl0auZdjrOMHgKIGeWda1xxnHVcDo5uoxxNAESD29vcW69f+0lJ/E30gpfwGWIppUoApqgrgw1bmmwf8IKWsiTr3OWBgPhUaIxdYJqWMtvW/bGW5YLbyw4QQtZEXsM76bJD1/ijwrGV6/UkIkRuV/gXgCGClEGKaEGJEVAu/T9MpbqIF1DVy7lngWuvv64AXpJThVuYrMFvGxmjq/O7SRDCiro1gb3CNAszBFGb060BM8wYppQfzx/Uupln0gxDiOuuzbzFb/7utvGYCn3YGYe/zN9AO/gP0FkKMB44Cnm/m+gCgNji3HBgshEiNOncC5v+1pIl8SoB8IYQ76txxDa7ZZr1Hj+Ic0eCab4FDgTIp5c8NXjufHFLKVVLKJ6SUI4B/A7+P+qxGSvmGlPImYARQABzQRL33Hfa2/RPLF03b1Mc0cf1MwA/Mb0Hed2M+3g8GumC2nEnARuAdIB8YhtnZems3+aRgivY/mK3oqZimkAROsa6xA2uBt4GDgNOBJfzapu4FbLWuGQIMtPJ6BkgF3MA0TPs8x7pmKfCslf524HJME+oA4HHM/kHS3v4e262DvV2BvSzqYdbnV7cg767AJ0ANv+7U5WN2uLyYdvkLQHozeR0PfGf9oL4DLrLyHBJ1zQnA91a+izBb0l/dC6ap8aZVrtf6Qf0TcFivV4Ayq5yNluDTrLRjMFv7GswO5XzghL39HcbiJawb3C8RQowCngZ6SSnr92I9zsNs7btJKbfvrXp0FjrDjGKrscaXczBNihl7WtBCiNHAL5jmzGHAY8D7CUHHhv21o3gnpo26A3hwL5TfHXgJ01yYhjmUeNVeqEenZL82PxJ0TvbXljpBJyYh6gSdjoSoE3Q6EqJO0OlIiDpBpyMh6gSdjoSoE3Q6EqJO0OnocNPkU0eNHIq5skyfMGv2or1cnQT7IHt9RjFKxAudaXVH+quT/gEoIHxAYULYCVrLXm2pH73irJNAmY8pYvzVydEfOzDFnhB1glaxV21qRQ0/DEKxdi0ZCKMYZMj6OAzoe61yCfZZ9pqop44aeWQ44DgeZNgSsh+pvL+rTlLsLn2CBE2xV0T98t8OzlbsoXeAraozcIUzo+4boYaXA1PNlntn3bS9Ub8E+zZ7RdS1m7I+MYK2/giZFPY7XvBXphyvqMYBwKsg/WbLLQIkzI8EbWCPjX5MHTVSBYaDfBFEZJf0duD1rIPWf+VMq//PFXesDLdmSC8x/JegMWIi6qbEZZ2/2pVVfXywzpUX9juckc+SulXOrN+a8XzGwE0HK7Zw9x0r+/wMJKf13XaMUIwuVWXdlwPJyd0rjkXItLrNWauAZGdGbb5QZGaw1ukPB+yZICQQBIYnhJ0A2iFqS7CjnFnVJ/l3pB4FQmD6aSuYMGv2Iuvzz2i7iSOBOsUeVBWbIUJehyPK3m6M6RNmzb6pjWUl6ES0aZw6WrD+HWnscjgkXTa3/1rMsWVtVwopQVQBGfZkb0Van+0vlf/Udx5Qn56zOd2R4mPbspwfgPqsg9ZLZ6q3JljvrKss7T4s5HWeJ43wFSC6WLb2RzaX//2Qz3E/0GuXwyPB+okLzwYGA3qfKScnWu39lLZOvmggrfFlaYAIY7bIqpTiuidvHjYF0nRMfxN2TGeMbuDTYJ37zGseWGI0lunUUSPVHSv7nCRs4ctVe2hMyOtUAL8MK18502sdUoqHb5kx77//nnjMbZVrevSyhgMFiODR2af/IJFzAATCv37iwuE/nznsR02T1W28xwT7KG0yP8yWWn5uZeEFbgO6pPQsd9RuzrwLqWxFGC8jRSlC9nCk+CYG651hGVZzJ8yavbFBXjZgWFqfbVNrt2QeaARtyYAvqWvlWmda/Vs1G7K/Cfkcr4NQAa9Qw+fYXMGPhGIEAjVJZwInAfqFh/zuUpu3523CNIOQhMoq+7/V35+x4uVjLp97FYCui2xNk+Vt/3cl2Bdos039jyvPLJVSIMPq5Q06hzcBT4JEqAYI+ZoM2S7rN2zp4q6Hrv1G0+TYqaNG2nofX/JY3ZaMcyvX9HQBXYUSDiV1qyqr25x1N/DBhFmza58cd3JusN5ZEqp3RbIPYZo2J2fnrr3+msk/PAeg6+IKV8WhL/X+clpQoKiAkEjL750REKjaz2cO+xnYAozTNPnUAw/cfaoQ8vxw2Pmyx+NJmCqdiDav/TBCtp8BdyMjDhmAAUKRYUWCGAXyOaGGxZriw29c/NSIniBO3PBFXrZiCyEU401pKK8dfs3cUTZn6EBNk68DzHps4NP+yoOuNkI2LJs9bOV7AjAjZ/jSaIeO7/kyf7xfilCqkI4jMZ2gnycQijRFrgErgYnAgoceuvN0w3B/bKW9zuPxFCaE3Xlos6hVR9CQxq5QDFHogB+kA1BBbgDxctm8wZ9aNvi5wMepvbfP7jds2ZIzz69dCKDrwgl0f+XvBwlfZcq7FT/nnQsChLHJlVn7x/S+2/5SubZr/2BNUp0Rsk0EFui62KZp8kJNk7XfzrwwV0j7lZi/gIDVOUUggoBumR1/83g8SSDfM6sqAOxDT3hp4adzX1IMA+OM01d3uOW4CVpHm2cUk7tVDlCdof4Nz1std6HqDK4FSUrPHTOBf0YNx4WA+Tc8+sW/IoIG0DT5yuKnRjy95fuBb1b83
OtcS3B+pHLeuKcWvFKzocvL/oo0Mg/c+PyEWbN3YDpG/GBnfbaeeKUw0whA9aX+pAKE1fq/9ply8iJdF6vmznU+rSjBD0AMMDu4Mjz0hJeEqqIKgVBV1I8/GRRZUJVgH6XNrZK3IuWLYL2jZxMf9w/7HQMUe+CV2k1Zd4C0WyINY7Wc0RdPHTVyqOoIXgH2M4yQ7QCEYSAVJbX39ik3PPrF11NHjewF6eOBT8tL+t0KoGny8Uj69RMXpriVwX6J4RAoYSDsqjlIARQl7J64fuLCj0OnOt5YvvySSYZhQ1V9Y/LzX/Ns3HjUCkWhAMxRdilBUVAeffT6m7Oyfp7qcNQ+fcUVi/9P14Ud+BfwmqbJedbxycCPmtZojJgEe5E2izpY514DJE0dNVKZMGv2ziG6/0zJPR4xaAZSLDKCjuW/FjRzgckNOpaFID8MB2x2a7zZQCoKSKNmQ5ePp44aOdTm9r0d8jpdIG6aMGt2Yz3bGxXD6ZQYN2D6ju4Hyo2RewwSKvzss0mHA3Tv/v1zN93
"text/plain": [
"<Figure size 216x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Top-5 predictions:\n",
" 1. monkey 34.293%\n",
" 2. mermaid 8.274%\n",
" 3. blueberry 7.341%\n",
" 4. camouflage 4.992%\n",
" 5. bear 4.961%\n",
"Answer: monkey\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAALUAAADdCAYAAAD99DOeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAoC0lEQVR4nO2deXhTVfrHPydb96ZladkJCiJixG0AYcSr4loXGFxGUSs6uCPiNnX7TUdHreK4Iyo6UgV1XHHGuo3L0UFcBhWNiChK2CkFurdpk9zz+yO3TmTYSpvcJL2f58nT3u2cb9pvTt577jnvEUopLCxSCZvZAiwsOhvL1BYph2Vqi5TDMrVFymGZ2iLlsExtkXJYprZIOZLe1EIItYvX3E6qZ64Q4vXOKMsitjjMFtAJ9I76/SRgzjb7mqNPFkI4lVLBeAizMIekb6mVUhvbXkBN9D4gHagRQpwlhHhfCNEMXCaEqBNCnBZdjhDiGCFEUAhRuG0dQohSoBgoivoG0IxjXiHEu0KIZiHEVqNFd+9MsxBilBDiSyFEQAjxlRDixG3K1IztHlHXeIx9h0bt208IUSGEqBdCbBJCPCeE6BV13CuEeM94v/VCiK+FEEcax5xCiAeFEOuFEC1CiDVCiLLd/8snLklv6t3kTuARYD/gZeA54IJtzrkAeF0pVbmd6+8BXgDeJfIt0BtYJITIBN4CGoCRwERgDPC3HQkRQmQDrwPfA4cA1wMz2/uGhBC9gY+Ab426xwPZwD+EEG3/12eBDcbxg4BSIGAcu9LQ+3tgCHAmsLy9OhISpVTKvIDTIm/pl20PoIBrtjnvUCAE9DW284mEKSftpOy5REwfvW8qUAvkRO3TjDoH76Cci4GtQEbUvrONa7RtyuixnfdyqLF9K/DeNmXnG+eMNLbrgOId6HgQeA8QZv/fOvvVVVrqxdEbSqnFgI9ISAERU1UDb7az3GHAN0qp+qh9iwCdyLfC9tgX+FYpFR3rf9bOeiHSyo8TQjS0vYA1xrG9jZ/3Ak8YoddNQoh9o66fCxwI/CCEmCWEKIpq4ZOalHgTu0HjdvY9AUwxfr8AmKuUCrezXEGkZdweO9q/s2va0KPObcO5zTk2oIKIMaNfQ4iENyilSol8uBYQCYu+EUJcYBz7kkjrf6NRVjnwr1QwdtK/gQ4wD+grhLgCOBh4ahfntwL2bfZ9B4wQQuRE7RtD5O+6bAflLAO8QoiMqH0jtzmnyvgZ3Ytz4DbnfAkMB1YppVZs8/rlm0Mp9aNS6kGlVBHwJPCHqGP1SqkXlVKXAkXAUcDgHehOHsyOfzrzxY5j6kN3cH450AJ8uBtl30jk630o0INIy5kJrAdeBbzAOCI3Wy/vpJxsIqadR6QVHU8kFFLAEcY5TmA18AqwD3As8DW/jqn7AJuMc0YBexllPQ7kABnALCLxucc4xwc8YVx/NXAWkRBqMPAAkfuDTLP/jx32gdkCTDb1OOP4ebtRdk/gHaCeX9/UeYnccDUTicvnAu5dlDUa+Mr4QH0FTDLKHBV1zhhgiVHuJ0Ra0l+9FyKhxktGvc3GB+ohwGW8ngVWGfWsNwyfa1w7lUhrX0/khvJDYIzZ/8POeAnjDXZJhBBnAo8BfZRSTSbqOJVIa1+glNpslo5UIRWeKLYbo3/ZQySkmBNvQwshioGfiYQz+wP3A/+0DN05dNUbxeuJxKhbgdtMqL8QeIZIuDCLSFfiOSboSEm6dPhhkZp01ZbaIoWxTG2Rclimtkg5LFNbpByWqS1SDsvUFimHZWqLlMMytUXKEbPH5N5y72FERohJX7Hvk1jVY2GxLTF5omgY+gMiI8UCwNGWsS3iRazCD42IoQWoDDv6MTGqx8Lif4iVqSUQAKVAkGkLn+8t9+bGqC4Li18RE1MbocbRefbQ3cfmbgrX644BgBz1zLDeu7rWwqKjxHyUnpTCNm3V/scL1Mtue8h2cFbtiQ9M3PBeTCu16NLEZeiplEL4mnL+MX9L35OadPsWhTjBV+z7T8wrtuiSxKuf2unNrO93ScGq1xSiHpSc/MLAS+NUt0UXIy6m1jTVCoz3pDX/DhiTYwtv/bY555GJz+91Szzqt+haxH3mi5Si2+ag86yHNw2aviXkGgL8EZjpK/ZZU3AsOgUzHpPf0sMZvOe87mvHA88Dd+2bXv/x9QsKt00UY2GxR5hh6v8Dxl5wfONqYPKQtIYPvw/kHPZhfbd/eMu9aSbosUgxTJ14K6U44I2aAt/njXl/2RJy3Qh80N/VfNobZ63Yapooi6THtFF6UoqRwFcn5m26UE5efhNwLqhxrbpt7fRXe3vN0mWR/Jg59PQ/wDXA3wF8xb55h+dsvXlL2Ol6v677695y71ATtVkkMQmR90NKYQfSNE01jSjf/1Ad8QYo29js6umPTlo332x9FsmF6ZMEDEO/AzwK8HXxt4uBMdm2sP3zxrx5xz87eLKpAi2SDtNNrWkqTGTdlHfb9vmKfStGZ1f/Nk3oK9cFM8q95d5t12exsNghCRF+RCOlEJpm5Mgt9+YQSVV77IGZtW/3draccPeEysQSbJFwmN5SRyOlOAF4R0qRDuAr9tUDJ++V1vj9kib3ce/XdZ/jLfdaD2ksdkpCmZqInm5A97YdvmJf69D0xv2zbKFHWpT9QlAvXPZKn5wdF2HR1UnE8MNuxNn/g7fcOwO4t7+ruXagq2n47Enr18VZnkUSkHCmBjDCjz8DszRNrY4+dvHLfR/4tCH/Ch2+IzIue605Ki0SlUQLP9roA7StGPUrHpu0brqOOBbEQIH6dPorvU+KvzyLRCYhW2oAKUUvTVMbd3TcW+49KMMW/gSFyyH0oxed+/0H8dRnkbgkrKnbkFIMBbprmlq07bEZr/Y6/KP6bs+1Knt34Pe+Yt9r8VdokWgktKmlFILIGBEncGBb/3U03nJvT6AC1CGjsmqefeK0tefGW6dFYpHQpgaQUgwD6jRN7bCnw1vuzRroalqxqjWzlw11q44otWbSdF0S3tRtGK32ME1T323v+HULCjPer+vxRKuynS1QTxznrrpi5oTKljjLtEgAErX3Y3tcASyRUgzf3sGZEyqbW5XtHFB3KMQffgxk+Sc8v1dWnDVaJADJ1FJ3A84HHtjRw5k2Lnip3wv/acw7DcQnwMm+Yp81k6YLkTSmjkZKkQk0b+/GsQ1vufc0YL4DfY2Wu+Wc+yZu/DR+Ci3MJJnCDwCkFP0AHzBlZ+f5in0v2dGPtws16LOG/I9GlO+/f3wUWphN0pka2AAsJLIE8k5ZUrz0gzHZ1ac36fYaHbHQW+4dF3t5FmaTlOFHe/GWewcCbwnU4MOyq+97bNK6683WZBE7ktrUUoqrgQGapq7a1bnjnx1SYEetWB9MzwEu9xV/+0jMBVqYQjKGH9H0AfpLKXa5ds27Z/+46YDM+kECKkDMOvjp4Xddv6BQxEGjRZxJ9pbaAYR31guyLd5yr8OGekxHXDA0veH75YFsr6/YF4qhTIs4k9SmbkNKUQBcC9ykaSq4q/OvX1AoNgTTPljS5
D4CqADO8BX7mmKt0yI+JHv40cYRwDTg4N05+e4JleqZ01drRMZsn5Auwp9d82rhPjHUZxFHUqKlBpBS9N3ZoKcdcdgzw34f0G3PZdvD9TVh5wG+Yp8/BvIs4kjKmLoNKcXRgE/T1KbdveaKV/pc+lF9tzsVogk4wVfs+zp2Ci1iTaqEHwBIKXoC/wD+1J7rHv7d+tkKMRZU2CH0zy9/pc+M2CjsOnjLvYd5y703GAvFxpVUbKk14HNNU+2+8Zv8wsD9NgTTvtwScjl0xNm+Yt8Lna2vK6DNH3rklpDzHRACaCXOKx6nVEsNoGlKappqklI4pRSD2nPt/DNWfbdvesMQHT4Bnh/1zLDrYiQzZZFSiJ6OlpdAOAA7kVlLWjw1pJypo5gHvNeW7Wl3eWTS+jUgjnUI/Y0m3XH3WS8M/NBb7rUe0uwCKcXhUgrbs1v6nPh9IDsHlA6EgCCRFZDjRiqb+n7gj5qmAu290Ffsaz4md/PEYen1vm+bc8cBc73lXmenK0wRpBTHAB9VBl3nL2/
"text/plain": [
"<Figure size 216x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Top-5 predictions:\n",
" 1. fork 8.643%\n",
" 2. shovel 7.149%\n",
" 3. syringe 6.684%\n",
" 4. screwdriver 5.352%\n",
" 5. stitches 4.247%\n",
"Answer: line\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAALUAAADdCAYAAAD99DOeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAs50lEQVR4nO2deZhdRZn/P3XO3Xvvzr6HJJAGAgQhIWGxIMogAZRRBsFR3Inww5lR1ODCZEYdIoIjaJRFZwjiPoyMEkBZLAgQwpIdkhASOnvSSXd6vfs99fvjnA6Xtrd0zu3u3NTnee7TXXXqnKp77/fWeU/VW/UKrTUGQzFhDXYDDAa/MaI2FB1G1Iaiw4jaUHQYURuKDiNqQ9FhRG0oOo55UQshdC+vB3yq5wEhxKN+XMtQWAKD3QAfGJ33/2XA/Z3yEvmFhRBBrXVmIBpmGByO+Z5aa72v4wU05ecBEaBJCHGNEOIZIUQCuEEI0SKE+Ej+dYQQ7xdCZIQQIzvXIYRYBFwHzM+7A0jv2AwhxFNCiIQQotHr0St6arMQYrYQYpUQIimEWC2EuLTTNaWXHpZ3ziQv76y8vJOFEMuEEK1CiHohxK+FEKPyjs8QQjztvd9WIcRaIcSF3rGgEOJuIcQeIURKCLFTCLG475/80OWYF3UfuQ34CXAy8DDwa+DTncp8GnhUa72/i/PvAH4HPIV7FxgNvCiEiAFPAG3ALOBKYC7wX901RAhRCjwKbALeA3wV+P6RviEhxGjgOWCDV/f7gFLgj0KIju/1V8Be7/hMYBGQ9I590WvvR4FpwNXA5iNtx5BEa100L+Aj7ls6nJ4EaODLncqdBWSBsV66CtdMuayHaz+AK/r8vM8BzUBZXp706pzazXWuBxqBaF7etd45stM1hnXxXs7y0v8OPN3p2lVemVleugW4rpt23A08DYjB/t78fh0vPfWr+Qmt9avAelyTAlxRHQIeP8Lr1gLrtNateXkvAg7uXaErpgMbtNb5tv7KI6wX3F7+AiFEW8cL2Okdm+L9/QHwM8/0+oYQYnre+Q8AZwBvCiGWCCHm5/XwxzRF8Sb6QHsXeT8DPuX9/2ngAa117givK3B7xq7oLr+nczpw8sp2EOxUxgKW4Qoz/zUN17xBa70I98f1CK5ZtE4I8Wnv2Crc3v/r3rWWAk8Wg7CP+TdwFDwEjBVC/D/gTOC/eymfBuxOeW8ApwshyvLy5uJ+rhu7uc5GYIYQIpqXN6tTmQPe3/xRnDM6lVkFnAJs11q/1el1+M6htd6itb5baz0f+Dnw2bxjrVrr32utvwDMBy4CpnbT7mOHwbZ//HzRvU19VjfllwIp4Nk+XPvruLf3k4BhuD1nDNgD/AGYAVyA+7D1cA/XKcUV7UO4vej7cE0hDbzXKxMEdgD/C5wIXAys5d029Rig3iszGzjBu9Z9QBkQBZbg2ueTvDLrgZ95538JuAbXhJoK3IX7fBAb7O/xqHUw2A0YZFFf4B3/RB+uPRz4C9DKux/qZuA+cCVw7fIHgIpernUOsNr7Qa0GPuxdc3ZembnAGu+6K3B70ne9F1xT43+8ehPeD+pHQMh7/QrY7tWzxxN8uXfu53B7+1bcB8pngbmD/R368RLeGzwuEUJcDdwLjNFaxwexHR/E7e1HaK0PDlY7ioVimFE8Yrzx5Um4JsX9Ay1oIcR1wDZcc+ZU4IfAn4yg/eF4fVD8Kq6N2gh8exDqHwn8AtdcWII7lPiPg9COouS4Nj8Mxcnx2lMbihgjakPRYURtKDqMqA1FhxG1oegwojYUHUbUhqLDiNpQdByX0+RDnV0Ll8/B9a5T4xafv2KQm3PMYWYUhxi7Fi6fC/wVt8NJAfOMsI8M01MfBUsWPHMervvqX2+856KjFt76execVxa8dFkgUxUC0OiQQEhc11NDHzE9dT9ZsuCZc4HlXjIJzOuvsHctXC6Az2mRvUuLTEQ4kSxeh5Muqbtmyrc+8RtfGn2cYB4U+4kdbvoy7hpCATqEawMfMevvvf6sbKjhZeBeoQMvto16bqZAXJCN7vst5HKh9ok37Fq4vPP6REMPGPOjnwQiDRfkUh171mgLhDqS8xfduigLWDijnU87JZYm+6AgsD/SdMrpTRN/92bl9n9YCKwDvutYyYd3LVy+AvPg2CeM+dEPlix4ZjzoumDJ7qdz6cpyJ1M6G5h14z0XvdKX8xfduiiLyFvEK8h9NjlP0MOdU6MdgTAPjn3AmB/94zMgRKZ93PVOpvT9wEFwbuvz2cL73Ds2QHCwtLdrgkY76djOlbjbN3wqEz6gNFoLhIW7IFf69zaKEyPqI+R3P/pk2Aq23WwF21648Z6L3r7xnotawxVv/xyseb++84tf7tNFtLevh34nra00mpwWiFQoPv5fxi0+/4Fxi89/IJga/nWBSOLuKJUBVAHeVlFhRH2EJBpPXOBkSksqJvz1+Y688nHP32aHmxJNde/79JIFz4iezgdY9O+LAmhygEaTW/TtRYFEzasfyUYabqeTeeH9Pw+4tfMxQ9cYm/oIWbLgmcfAOWP4KQ9N/oebHkjl5X8Gd9env7/xnov+MHgtNJie+ghYsuCpK0Ffgsg+d+D1T6Q7HV4Kzpt2qHnJ7370ySMeglNKzFdKzPOpqcc1pqfuI0sWPDMHnOVgdYxaxMGpiw57o8ZJl76Uapn0fKhs55Xp1vFzS0evfLBt7+xP4m5aIwHV28SMUuIVoF5KPb+gb+Q4wPTUfUe+M1yhHWCNFUjuyqXKK1Kt494PfD/dOn4uQNve2Z8AMqBfAP0fwNPuj6JHPkTePneG/mMmX/qOApEGgiAywM1f+PFlh3vfJQueqYxUbT5VO4Gvp5qnfAC0N9soAB0EIenBh0NKvbugrT+OMObHEeD1tpIezIklC54RgejBDU4mcpKTLckAAe9H0KNviFLiSiAhpX6iEG0/njCiLgC//sGNn2l888M/s4KtDzqZsk0Ym3pAMTZ1AbjmS0t+DvzRyZR96ISLF+w95aPz+jKqMZ93IhsYjgIj6sLxDdBlTW//3V3Ax5USPU7KSKnrpTQbRPqBEXWBuPGeizaUjFy99dC2S8u3P/edy6Xs2c5TSlyllLhsoNpXzBhRF5Bw+fZP6Fwo07Znzpf6UPyrwBcK3abjAfOgWGCWLHjmx+B8oXT0yy+17T3n5u4eGH9yw/kXp1tjc3Pp4J+//NtHjX/HUWB66oKT3QyW1bZ31ly6mYS58+rL5iQaKh7LpYO3Ak/fefVlvU3UHJssqpjDoopbWFRR0PdnJl8KxAOLbp/cvu+shRD4nOtjasE7/tCde2IJ2vZmLLsrc2yzqGKeRj8BCIFIs6hiHouaC/Iejah9ZsmCZy6yIw135tIzzgCdBfF7EJfjirU7f2gvT+NN1HRV5pgmZ+m/sx3RobeC/nCN+eEjSxY8817gqVyy5gycoI5UbvnijfdcdLWw0u+rPvF/to6Z9f3/6sqm9mzoQwj9CjCvGG1qy+EPGo1Gawq82MGI2lf0fA57PVm5ZNOJlQA3/OSSF0ef+dM1V
Sc8saX7c0UYbT1bjIIGEItaVmjYrQWbgIKZHmDMD1+J1ry+K9FwKqB1ZzNCSv3R7s5buui0AEyIRWtaqgagmYNGS0WuOpQWkdjX2gr6wzU9tY/YofirAHa46XG6cWDqambRjqSHAUSrW0cWvJGDSCaoGwJZUV3oeoyofaRt76w2gFyq6uedBa2UGKGUeBv4TOfz9q92w4Ef2jbqzwPRzsFCaFaFMpZovKskVsh6jKh9RAQSIfefbK6LwwdwQyXv6OJYKYDO2c0Fa9wgM2nhMvmHA64XQPWhwLhC1mVE7SNVJyw7FaBqyrLTOx+TUmsp9Sel1H/pfKxy8r7JAKWjG8sL38qBZ9LCZXNAP/1k5uwrAH6fveCSQtZnRO0jqaYpjQDp1nHbuyujlIgpJUL5ecGS5GSAaE1LUYoad0za2q2HA5COZT9WyMqMqH2kvX5mE0D7/vfs7Oq4UuIcoAV4b37+gQ2T6gAOvjFheRenFQPLAfZRpbPa4nRrS1shKzOi9hEr0B4GEHbK6abIG8Bi/ta
"text/plain": [
"<Figure size 216x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Top-5 predictions:\n",
" 1. snowflake 22.972%\n",
" 2. yoga 10.533%\n",
" 3. matches 6.915%\n",
" 4. candle 4.574%\n",
" 5. syringe 3.947%\n",
"Answer: trumpet\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAALUAAADdCAYAAAD99DOeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAA2AUlEQVR4nO2deZgcVbn/P6eqept9yWSSyTbZZ0gGAgmBsIQiKKABFUFAUAEVDQZFwesvokIEldzrBRVuFA0qKFwFFRTIFdDEYg1LAiEJmUnIMllnMpl96bWqzu+PqoEhJpnMTK+T/jxPP0l3nzrn7Z5vn3rP9r5CSkmWLMMJJdUGZMkSb7KizjLsyIo6y7AjK+osw46sqLMMO7KizjLsyIo6y7Aj40UthJD9PB6MUzsPCiGejkddWRKLlmoD4sDoPv+/CFhxyGuhvoWFEB4pZSwZhmVJDRnfU0spG3sfQHvf1wA/0C6E+LQQYrUQIgR8RQjRKYS4rG89QogPCyFiQojyQ9sQQiwFrgEW9rkD6O57NUKIfwohQkKIVrdHLzyazUKI04QQbwohwkKIt4QQHz2kTt19PqLPNZXua3P6vHaCEGKlEKJLCNEkhPiDEGJUn/drhBCr3M/bJYR4WwhxrvueRwhxrxBivxAiIoTYI4RYduzffPqS8aI+Ru4Cfg6cAPwF+APw+UPKfB54Wkp54DDX/zfwGPBPnLvAaOAVIUQO8AzQDcwFLgHOAH5zJEOEEHnA00AdMBv4FvDjgX4gIcRo4AVgk9v2h4A84EkhRO/f9X+BBvf9k4GlQNh972uuvVcCU4ErgC0DtSMtkVIOmwdwmfOR3nteCUjglkPKzQFMYIz7vBjHTbnoKHU/iCP6vq9dD3QA+X1e0902pxyhni8DrUCgz2tXudfoh9Qx4jCfZY77/A5g1SF1F7tl5rrPO4FrjmDHvcAqQKT67xbvx/HSU6/t+0RKuRbYiONSgCOqNuDvA6y3Gtggpezq89orgI1zVzgcVcAmKWVfX/+1AbYLTi8/XwjR3fsA9rjvTXb/vQd4wHW9viOEqOpz/YPALGCrEGK5EGJhnx4+oxkWH+IY6DnMaw8A17n//zzwoJTSGmC9AqdnPBxHev1o1/Ri9ynbi+eQMgqwEkeYfR9TcdwbpJRLcX5cf8VxizYIIT7vvvcmTu9/q1vXQ8A/hoOwM/4DDIGHgTFCiBuBU4Df9lM+CqiHvLYZOEkIkd/ntTNwvtfaI9RTC9QIIQJ9Xpt7SJmD7r99Z3FmHVLmTWAGsEtKue2Qx3t3Dinlu1LKe6WUC4FfA1/s816XlPJPUsobgIXAAmDKEezOHFLt/8TzwZF96jlHKP8QEAGeP4a6b8W5vU8HRuD0nDnAfuAJoAaYjzPY+stR6snDEe3DOL3oh3BcIQmc45bxALuBx4FpwPnA23zQp64AmtwypwGT3Lp+BeQDAWA5jn9e6ZbZCDzgXn8z8GkcF2oK8DOc8UFOqv+OQ9ZBqg1Isajnu+9/7hjqLgOeA7r44KCuBmfAFcLxyx8ECvup63TgLfcH9RZwqVvnaX3KnAGsd+tdg9OTfuCz4Lgaf3bbDbk/qPsAr/v4X2CX285+V/AF7rXX4/T2XTgDyueBM1L9N4zHQ7gf8LhECHEF8EugQkoZTKEdH8fp7UdKKZtTZcdwYTisKA4Yd365EselWJFsQQshrgF24LgzM4GfAk9lBR0fjteB4rdwfNRW4M4UtF8O/B7HXViOM5X4mRTYMSw5rt2PLMOT47WnzjKMyYo6y7AjK+osw46sqLMMO7KizjLsyIo6y7AjK+osw46sqLMMO47LZfIsH6Ryycp5OLv5jPplC9ek2Jwhk11RPM5xBG2/AEIFEQO+eNro5//66E3/1dXvxWlKVtTHOZOWPHmrjfLDvodsBDa5nq7O7ljhK0Dd3FEvFAAvvd44/+/AgfplC9NaNFlRH+c4PbVcBXhAmGD/4NTyV87d01XpbQyOzQVZBSLn/Stkx9i8XYqNeGt/94RnBHbdhZVPRL1q1PjZ9Q/3fLDe1Lg0WVEfxxiGOBV4+9pnnp7NEQT4qZ98WxmZ0zDnnZZZlfWdU8t9auikirw9n9zXNV5GbX9JbzmBbUuU7R4lun1Cwbbx29qrp4JQcI7BnZdMYWdFfZxiGKIU2Ak8ouvyhsHUUblk5XleJXzF5KK60sbgmP1t4bJRXjV8sml5JtvvHeeUcmTO/mfmjX5+4c+ufyQpYsuKehhwuFv9TSuuFiEzx1ffOcW3tW2mVuhtC5w5ZtXEHR3TwnWtJ0bzPB25V1b95oYtrTPeeHHf+dvzve0FJ498bd7Ojql79nRN6szzdBTNGLH+rPqOKTsOBMd05Hk6SqYU1Z25u2vittbwyC6fEpwStf1nSYQAEaJPbzz11ifOitmef4DwglRAoApzmyW1nwK/q1+2MKGD0KyoM5gbf/W5aS/tO+9L7ZERXwepAqjCjFrSY4P0gRD9VDEAZNSvhrym1DpN29spsPIkSpE7wDSB2+qXLbyrt3TvDy3f0/7GlOK6895qOv1DwByPEolWlWx6e2Pz7M8ApSTA786KOoO4acXVnnfbTvjw5tZZc0FeKJCnfTAekSTP01nbHStcKbAjc0a9fHpzqHzLzo5pWzURi51eYZwSiuU2zhn18vU7Oqb9xpZq+77u8dvqWk/cm6N1m+eMfXbUnq6Jeze1nNKc722P6uOe0Tojxe3P770gfOiMhytad4BJjH785solK4WCddq0knce3to6Y6KNqgC2c5ZYRPq7fiBkRZ3mfOLu2yauP3ja2cCFAa3nspCZ68FRwmvTizfutqRSu619xrc4RnEZhhgL3A98U9dl3VBsG+wMx9RbHx8Vs32/B/khN7aPBeJ7fXv6oZAVdZrxhZ8v8q7affGpwEdK/Ae/0Bou641i2jQqd88704o3b3hh7wV31i9b2NJ7TSauCLpTiS+A0ECGQSzI9tTDiMolKyuAC0fm7P9CV7RgXsjME4CV5+msPaF0fUN9x9SlTaHRr9YvW2j3V9eRMAxRDNwOfF/XZVu8bB8KlUtWrsCJGHV2/bKFL8Wr3uzejxRQuWSlBzizLND4GVUxr4axfoCW0Mim6pIN73bFCn6yq3PKHzfd+el2J4hSXDgXJ+LqQzjBb1KOTw0Rsfxt9csuipugISvqpFG5ZOX4fE/HJaPz9vw/VVQVWVILHAyVm5UF27omFdb9dkdH1f2W1DY+/R/fS8itU9fl44YhKnX9sPG3U8LEwq3nN4fK8+Jdb1bUR2D5otXv+amL718wIF/P9XHPAzpmlb129bb2qglQOKorVojosaPjC3a8urNj2k9ArDZu/XpnAsx/D8MQecBUXZdvpZOgAfZ3T2gV2OH+Sw6M41rUvQMsjxJ9KWZ71wOBeaOeH3Pi/hlfyGfsDYCQyNjX7/zv3/21p3oVkDOz9M2TvWp41JtNZ2wEApOLaudowizZ0lZTCwRK/Y2nQvnY3jniDQdny4q8vfu6Y4W3AM90RotrN9zxmWQOZL4D3GIYYpKuy71Jb
LdfOqNFKkeODjtojltRu4J+CaQSs714JMyIqkzfej4FtoJEIhDY4G1oqfkifvOLAJtaTumt4jIgtqdzkvSqYQEUAaGgmVfE+1veLBv1zpe+u/j7yf10H+C/gA3pJmgARVijFGGti3e9CRX1UG7hSUAHlBGWwqyIysyoZnkQaizQEomM2Ozz7jsdaWu2Aubk4nfufC00/S9A8NTyl8TovL2hJ7df2Vy/bOG/BWn/90UJ8VxyP5aDYQgPYLozHX9IhQ1H46YVVwvB5WUnlb0xzUk9Ez8SNqW3fNHqM4B/ARrYdk7ZhoeDB2f9FmjJq3glHCje1nj5Vx88XIT/hPP7H9zuf7PxrG/6JHeOs1RMJEHFerbA1m4HXp9x5XljWrZ88kuNby0OMXifWieF88aGIe7CyW2zUNdlNBU2HI25d6wobQpWNFcWvHuPcevXb4ln3QkR9WP3XRto2/GR7WaobHQ/RbuBVi3Q5NV8nXa4fcpLQEtO2dvlqq8z1LX37OeAloLxq3NUb1dz27aPbwDaF9+
"text/plain": [
"<Figure size 216x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Top-5 predictions:\n",
" 1. shovel 15.070%\n",
" 2. floor lamp 10.788%\n",
" 3. screwdriver 10.516%\n",
" 4. lipstick 9.559%\n",
" 5. lantern 7.887%\n",
"Answer: anvil\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAALUAAADdCAYAAAD99DOeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAA8rElEQVR4nO2deXxU1dnHv+fOnkky2SAEAoRdhChalYJir6DWSm21ttpWLYWK4mvVV601apdptYq12lZLpWLVuC9t9VWpiiVeikJVlE1AZJElQPY9mf2e94+5oTEEyMzcbMN8P5/5zMy9Z8vkN2eec85zniOklKRIkUwofd2AFCnMJiXqFElHStQpko6UqFMkHSlRp0g6UqJOkXSkRJ0i6RjwohZCyKM8njCpnieEEK+bUVaKnsXa1w0wgYIOr78OLOl0zdcxsRDCJqUM9UbDUvQNA76nllJWtD+Aho7XACfQIIT4nhCiTAjhA/5HCNEkhPh2x3KEEOcIIUJCiPzOdQghvMAcYHaHXwDVuFcshPiXEMInhKgzenTPkdoshJgqhPhYCOEXQqwVQpzfqUzVeJ/XIU+Rce2UDteOF0IsFUI0CyGqhBDPCSGGdLhfLIRYbvy9zUKI9UKIs4x7NiHEg0KI/UKIgBBirxBiYfc/+f7LgBd1N7kH+DNwPPB34DlgXqc084DXpZSVXeT/HfAi8C+ivwIFwCohRBrwJtACnAZcBEwHHjtcQ4QQ6cDrwKfAl4CfAvfF+gcJIQqAfwOfGHWfDaQDrwoh2v+vzwIHjPsnAV7Ab9y73mjvd4FxwKXA1ljb0S+RUibNA/h29E86+L4IkMDNndKdAoSBYcb7bKJmytePUPYTREXf8dp8oBHI6HBNNeoce5hyrgbqAFeHa9838qidysjr4m85xXj/a2B5p7KzjTSnGe+bgDmHaceDwHJA9PX/zezHsdJTr+n4Rkq5BthI1KSAqKjqgTdiLHcisEFK2dzh2ipAJ/qr0BXHAZ9IKTva+u/HWC9Ee/kzhRAt7Q9gr3FvjPH8APCoYXrdIYQ4rkP+J4ApwGdCiEVCiNkdevgBTVL8Ed2gtYtrjwJzjdfzgCeklJEYyxVEe8auONz1I+VpR++Qth1bpzQKsJSoMDs+xhE1b5BSeol+uV4hahZtEELMM+59TLT3v90oqxR4OxmEPeD/gAR4GhgmhPgxcDLw+FHSBwFLp2ubgROFEBkdrk0n+rluOUw5W4BiIYSrw7XTOqWpNp47zuJM6ZTmY2ASsFtKub3T4+Avh5Rym5TyQSnlbOCvwJUd7jVLKV+SUl4DzAZmAmMP0+6BQ1/bP2Y+OLxNfcph0pcCAWBFN8q+nejP+wQgj2jPmQbsB14GioEziQ62/n6EctKJivZpor3o2URNIQl8xUhjA/YA/wDGA+cC6/miTT0UqDLSTAVGG2U9AmQALmARUfu8yEizEXjUyH8T8D2iJtRY4I9Exwdpff1/TFgHfd2APhb1mcb9H3Sj7EHAMqCZLw7qiokOuHxE7fInAM9RyvoysNb4Qq0FLjbKnNohzXRgnVHuaqI96Rf+FqKmxt+Men3GF+ohwG48ngV2G/XsNwSfaeSdT7S3byY6oFwBTO/r/6EZD2H8gcckQohLgb8AQ6WUbX3Yjm8S7e0HSylr+qodyUIyrCjGjDG/XETUpFjS24IWQswBdhI1ZyYDfwBeSwnaHI7VgeJPidqodcCdfVB/PvAUUXNhEdGpxMv7oB1JyTFtfqRITo7VnjpFEpMSdYqkIyXqFElHStQpko6UqFMkHSlRp0g6UqJOkXSkRJ0i6Tgml8l7mqKSpdOIesdpuxbOXt3HzTnmSK0omkxU0LIMcAABEDNTwu5dUuaH+aiAA4Qgupv95qKSpeLIWVKYSUrU5qOBkMaOLR24ON3WtOaav8w7u2+bdeyQErXJTMjeWAUoIJaBmAH61WHdOmXZrm8sKyp5/faikqWd9xqmMJmUqE0m11V1F8BpQ/59966Fs1ftWnjBI7NGvn6yy+p7G8RvQP/4R3++Zs7RykkRP6mBoslM+vnz7ysiMnbm8H/m/XH+M1/4cItKln7DaWl7MhBxerIcdU/VB/Ku3bVwdvPhykoRH6me2kSKSpZmt4YyTm4OZi3pLGiAXQtnv3rOyFcnjfJsW14fyLsc2DTzngdSvbbJpERtImM8n15FdO7/5cOleeiqp/aV3XbT2cDpFhFu29k44Ykzf/PguqKSpYfE8EsRHynzw0TOvffefZWtQ4eMz95se+nGe/Sjpb9hyeXuA62Fr3xQccaZINpsiv9RAf6g7vxnam47flKiNomikqVpIGsL3OXLVv98wTdjzDsBeB6YYkwFShD/mJiz/uOizO3vvrHr4pW7Fs5O/aO6SUrUJlFUsvSbRMN7nbNr4ex/xZH/dpB3glCIKjtENHYHRIPW/Of43LXNec7qf/9737nbiMYOSS3Dd0HK98Mkhmd8/pN9LcNbdGldEWcR74AIEI3OFALOOXfk/w3Z3TR66tb64sEgp22uPWk8cJmRXgL+opKls1LC/iKpntoEvnL3H63VbUMCRZ7tu/95yx2j4y3naI5Q1y+5fNy/dl9wd1vYfXF0GV5GQPx818LZ9yTQ/KQj1VObwO6msWcCSoM/565EyjGEfNhe98H5T28rKln6gED/pkTYBDIiEVoidSYjqSk9c7gI8O9vHfFCT1e0a+Hs1en2xv8BsCihu1Omx6GkRJ0gNyy5TKTbmn6U46xet2vh7K7iYJtOczD7bwBh3dHSG/UNNFKiTpBtDcd/tSWU6ZqUu3ZjL1bbqIiIf2j67i/3Yp0DhpSoE2Rz7ZQzgcj66lNvS7gwr2caXs9teD3TjpRs18LZMtdZpWTYmo6Y7lglNVBMGHkhiBUbfn15bULFeD3TJHIF0f+JX3g9s/A2HtZebgpmf1DrH+xMqM4kJdVTJ8A1f5l3HoiJE7I3bjChOBWwif/umFGPlDgQcW7XpWWoCfUmHSlRJ8CG6lPOAxib9elzJhRXKxBIJIawj+jglJ+2LwCy4IYll7mOlO5YJCXqBNjXMnIasGbR1U98kGhZLe7IFRLpE4hfAe8ANxxY7HrgcOnHZW9xgxBB3XFConUnGylRx8mUXz55CdFTtdYcLe3RqHkobYjLp5xRlxPeibfRC5zf4o7U5lfabgzdlfm1rvJsrDn57wBvfH5x5xPDjnlSoo6DopKl0xoD2c9G38m5xvJ23OTV2i6x6AIpuBEAb6O/Njf8JeBTW1i8iNczp/OsSGMgZ4fxcngidScjKVHHhyppP0RTWDjKoO6IeD2C6NHOH+Zd1/Z2++WR8/y7FSnOlsgWiXxCIu8ElrcLOz9t3z6AcdmbvhJ33UlKStTxoYFo3wQQir6Pj/JhgauA41vTIi8dctPbuC9ok28LBCL65bFhfIG+XLCi3mHx4ba2dD5Y9JgnJeo42LVw9mpFhHaBvg1IyPUzu976vbBF6tWDwk92dd8eEg9LJBLZ7mOtAfxx/jMyGHFsXVc9dVe8dScrKVHHidvaNqo47+NIQ
g5FXk+2u80y1RLhL0Vz/ZVdJRHeptUC8ZZAtAJfWJCRKHuBwrjrT1JSoo6DopKlzuaQR2kKZr2bSDlBm3414BSIR46Urs2l/wtI3zc06O54PT9tn+6yth6fSBuSkZSo46MQYHfT2PfiLUB6M6chudNvjzTgbVx3pLQ1eaGPAexBcUnH68MzPnf6w66Myx68yRFvO5KRlKjjYKh7z0QAp6Vtf1wFRGcw3rGHFasjqKQfzYFJV3hHIvcPqrFldrz+cdWXn5EovLd/Viq8QgdSoo6DMVmfzgZQh78Z7144VSBsQPuSuHqkxEVz/VIgygDVmAIEQJfWvcbLlF3dgZSo42BT7UnNABFpiXc1UZPICIBEBunGlGBTRmQrkL+3MPDV9msTc9Y3A5w0+D+z4mxHUtIvRF1esnJWecnKu8pLVg4I/+A6/6AMoGbJNY/Ux1WAt3F1gye8GsDn0r9/JBfTduqzw6sA3K2Wg6Iem7VlL4CCPimudiQpfepPvWHJPEta1RnPOBl3KYBE3vG596UWm3/
"text/plain": [
"<Figure size 216x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Top-5 predictions:\n",
" 1. blueberry 13.230%\n",
" 2. submarine 11.078%\n",
" 3. bicycle 9.777%\n",
" 4. motorbike 9.246%\n",
" 5. eyeglasses 8.239%\n",
"Answer: pickup truck\n"
]
},
{
"data": {
"image/png": "iVBORw0KGgoAAAANSUhEUgAAALUAAADdCAYAAAD99DOeAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAABQG0lEQVR4nO2dd3hUVdrAf+fe6SmTSkJooRNgUIqoIDiAPdbVFRVX1oKyssK67PrFsmssq9EVO8ouNuyurrpq7OBYAAsqEpQqhJ7eM33u+f6YCRsxIZkGLM7vefIkc+9pN/POmfec8xYhpSRBgsMJ5WAPIEGCWJMQ6gSHHQmhTnDYkRDqBIcdCaFOcNiREOoEhx0JoU5w2PE/L9RCCNnFz1Mx6ucpIcRbsWgrQXzRHewBxICe7f4+HVi8zzVX+8JCCL2U0ncgBpbg4PA/P1NLKSvafoCG9tcAE9AghLhQCLFMCOECrhZCNAkhzmvfjhDiRCGETwiRs28fQohiYCZQ2O4bwB66ZxNCfCiEcAkh6kIzunV/YxZCHC2E+EYI4RZCfCuEOG2fNu2h11nt6uSHro1rd224EKJUCNEshKgSQrwghMhtd98mhFgaet5mIcR3QogpoXt6IcSDQojdQgiPEGKHEKKk+//5Q5f/eaHuJncCjwDDgX8DLwCX7VPmMuAtKWVlB/XvAf4FfEjwW6AnsEIIYQHeBVqA8cA5wATgic4GIoRIBt4C1gNjgeuAv4f7QEKInsAnwNpQ3ycAycAbQoi29/V5YE/o/migGHCH7s0NjfcCYDAwHdgQ7jgOSaSUh80PcF7wkfa+zgckMH+fcuMAP9Ar9DqdoJpy+n7afoqg0Le/NgtoBFLaXbOH+hzUSTtXAXWAud21i0J17Pu0kdXBs4wLvb4VWLpP2+mhMuNDr5uAmZ2M40FgKSAO9vsW659fyky9qv0LKeUqoIygSgFBoaoH3gmz3QJgjZSyud21FYBG8FuhI4YBa6WU7XX9L8LsF4Kz/GQhREvbD7AjdG9g6Pe9wGMh1etGIcSwdvWfAo4ENgohFgohCtvN8P/THBYP0Q1aO7j2GHBp6O/LgKeklIEw2xUEZ8aO6Oz6/uq0obUr24Z+nzIKUEpQMNv/DCao3iClLCb44XqdoFq0RghxWejeNwRn/xtCbS0BPjgcBPt//gGi4FmglxDi98AY4MkuynsBdZ9rPwBHCCFS2l2bQPD/uq6TdtYBNiGEud218fuUqQ79br+Lc+Q+Zb4BRgDbpJSb9/nZ+80hpdwkpXxQSlkIPA5c0e5es5TyZSnl74BCYCowqJNx/+9wsPWfWP7QuU49rpPySwAP8HE32r6B4Nf7UCCL4MxpAXYDrwE2YDLBxda/99NOMkGhfZbgLHoCQVVIAseHyuiB7cCrwBDgJOA7fqpT5wFVoTJHAwNCbf0TSAHMwEKC+nl+qEwZ8Fio/h+BCwmqUIOABwiuDywH+32MWg4O9gAOslBPDt2/pBttZwPvA838dFFnI7jgchHUy58CrF20dQzwbegD9S1wbqjNo9uVmQCsDrW7kuBM+pNnIahqvBLq1xX6QD0EGEI/zwPbQv3sDgl8aqjuLIKzfTPBBeXHwISD/R7G4keEHvAXiRBiOvAPIE9K6TyI4ziL4GzfQ0pZc7DGcbhwOJwohk1ofzmfoEqx+EALtBBiJrCFoDozErgfeDMh0LHhl7pQvI6gjloH3HYQ+s8BniGoLiwkuJV48UEYx2HJL1r9SHB48kudqRMcxiSEOsFhR0KoExx2JIQ6wWFHQqgTHHYkhDrBYUdCqBMcdiSEOsFhxwE5Js8vKj2WoLWYo7ykcOWB6DPBL5e4nyjmF5VOBD4i+K3gBaYlBDtBPIm7+mHWtc4maB+sgjQAN4dm7gQJ4kLchTo/dVNa8C8JQcE+CViaEOwE8SLuQr2jecAGkJpe8X4evCIEwZnbHu++E/wyibtQt/hSc0Ds9GnGP4JwEQxN4AMc8e47wS+TuAt1sr5xtFnXWhlaHE5LNdTffdbA529MLBYTxIu4C7UqAgUD09anAZSXFK58cOpvss8Z/PxtDodIinffCX6ZxHWfOr+o1AQZir7V1z5aaAlwn90uO4rFkSBB1MT78KUfQI0r55u2C3a73BLnPhP8womr+lGQ8d1EgBzLrp8EXXQ4RJ7DIR5yOERBPPtP8MskrkKdYao5CWBc7vJ9vaT9BOPYjY1n/wl+mcRVqL/YM3k3SK9O+Ne0v263yyogx26Xz8az/wS/TOJq+5FfVPoKwXhvl9GJQZPDIVS7PezAjAkSdEpcZ+pUQ8M4i645AHwE8m8gP2p/PO5wiMcIBkFPkCBmxHWmHn7TS1qyoamqytmzR+h4HIJx294f02OFZ1LvD5OGZ67+3qh6/2K3JwKQJIgNcRPq/KLSNKDeqDoXewLmiwEjoIH4SKANkyh9QkWdIL8am7NC8Wv6F76rHv9ieUlhfVwGleAXQTyFejTBqJrnEYy4aaedTv27f1w67tuqY8ZUtPYamWJoPL3Vl9Jfk8Hwz6rwbxieuVpr8aUs2do49BVgS3lJYWImT9At4ibUZ95zy/w1NePuyU/ddLzjhj980lk5h0OcC7xS0Zp3zh1f3OVu8qYflWqoP8MbMB7lDlgAUIWvbmDaBnedO/uJGldOKfBNeUmhNy4DT/A/T9xOFE0650QAW9bX27oo+i4wKzdp97I1t17cFHp928yHr9Gt2D1luE8zTsi2VF5Q48qZWOfOvgm4SREB3/F3PODc09r7CW/A9FGqoWHlmltnJCKGJgDiq348DHJGecnp6TFssycwcaB1/WVOv+XYPa19kkHoALLNFe5qV85zIJYPTS9bMyyj7JsHZj2XUFl+gcRTqEuBnuUlhWO6KutwCIWg7t1kt8t3w+jDAhw1Kuur39e5s0bvbOmfCaQBWHQtPqc/+R1g+bic5duyLRXvPHrVE00RPUyC/yniJtS2vz5XbVTdm1bdfPmErso6HEIQTHK5wW6Xv4q0z/yiUgUYdkxPxx93NvcburOlfy6hxDyK8Gua1H0BLD+mp6MxzVj3yqLZj6+PtK8Ehy5xEep5i2eIt7eep43M+nrVa3+8/aju1HE4RG9gt90utS4Lh8GQG17Nmdhr6R821o/I39XSrx/IsSAModubQC6fmLdUSJSnV+ye6igvKYxp/wkOPHER6pDuu1sRgWu23HnmwzHvIArOu++GpDRj7SVrasblVznzhikiMFmTalrodr1e8awa0+Nzrcmb9ti6uiPeLi8pPGi5YBJERryEegKwHCgsLyl8e7+Fi617A9047E090uvUW21llpcVKZZS3Bh3l695i2cIr2Y8ecWuKYMavRmjTarzRHfA0nYw5DepzvUFmWu8la15j+xu7VtaXlJYEe8xJYiOuAj1OffedPu3VcfeOCxjzfh3r7v+q04LFluPlchlBNOjeWoz/P/MqNfNFRIpEB5g2oEQ7H35/T8vGfJe+dkFPs14bJqx9pwWX8oQvxbUWIyqq2pg2obW8sZBDzj9ycuA7xMqy6FFX
PapNamOBBiRuXpTF0XtgFEghESas+r0RxNMXdw+jMIBF+qHr3x6I7AR+A9QdOnCOUkf7TjNBkzMNFf9ZlvTgBFOf/L9AHrF6556572NWxqHLASxPFnf+MXa2y5KuKodROKlfjwOnFpeUpi334LBmXopYAKECKXilkgpEG4O0kzdFflFpYJgUvuJg9O+/12tu8fQOnd2GoBAkz0sFQ2VzrxngOWjsr/67o35xRsO5nh/acRLqD8CDOUlhRO7LPxfnfrLVkvgmiSnehaARL4iEHMobqyK+QDjQH5RaTpw7BHZX87f1dK3X40rN49gKmWS9Y2tLT7rG8DySb3e35FhqnnngVnP+Q7qgA9j4iLUI//yfLPV2LBm+U1Xdy3U7ah90DIxs07/mSakR5FCpwnp39bPYwyo9B70G8+umA80juQXleoVETjy6NxPrt/cUNCj2pU7AOgJoBM+r1/qPwG5fGqft1tNOuczj1z1VGIBGiNiLtRn3HOrYW3NaM+RPb5c9tofb5sWVuVi6zTgQ2AKsKc
"text/plain": [
"<Figure size 216x252 with 1 Axes>"
]
},
"metadata": {
"needs_background": "light"
},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Top-5 predictions:\n",
" 1. stereo 21.389%\n",
" 2. radio 16.453%\n",
" 3. yoga 9.803%\n",
" 4. ant 6.983%\n",
" 5. power outlet 4.575%\n",
"Answer: calendar\n"
]
}
],
"source": [
"n_new = 10\n",
"Y_probas = model.predict(sketches)\n",
"top_k = tf.nn.top_k(Y_probas, k=5)\n",
"for index in range(n_new):\n",
" plt.figure(figsize=(3, 3.5))\n",
" draw_sketch(sketches[index])\n",
" plt.show()\n",
" print(\"Top-5 predictions:\".format(index + 1))\n",
" for k in range(5):\n",
" class_name = class_names[top_k.indices[index, k]]\n",
" proba = 100 * top_k.values[index, k]\n",
" print(\" {}. {} {:.3f}%\".format(k + 1, class_name, proba))\n",
" print(\"Answer: {}\".format(class_names[labels[index].numpy()]))"
]
},
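  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "In case `tf.nn.top_k()` is unfamiliar, here is a minimal sketch on a toy tensor (the values below are made up purely for illustration): it returns both the top `k` values and their indices along the last axis, which is what the code above relies on to map predictions back to class names:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "toy_probas = tf.constant([[0.1, 0.6, 0.3],\n",
    "                          [0.5, 0.2, 0.3]])\n",
    "toy_top_2 = tf.nn.top_k(toy_probas, k=2)\n",
    "toy_top_2.values, toy_top_2.indices  # top-2 probas and their class indices, per row"
   ]
  },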
{
"cell_type": "code",
"execution_count": 99,
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"2022-02-18 16:47:16.114014: W tensorflow/python/util/util.cc:368] Sets are not currently considered sequences, but this may change in the future, so consider avoiding using them.\n",
"WARNING:absl:Found untraced functions such as lstm_cell_1_layer_call_fn, lstm_cell_1_layer_call_and_return_conditional_losses, lstm_cell_2_layer_call_fn, lstm_cell_2_layer_call_and_return_conditional_losses, lstm_cell_1_layer_call_fn while saving (showing 5 of 10). These functions will not be directly callable after loading.\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"INFO:tensorflow:Assets written to: my_sketchrnn/assets\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"INFO:tensorflow:Assets written to: my_sketchrnn/assets\n",
"WARNING:absl:<keras.layers.recurrent.LSTMCell object at 0x7fd0e0822610> has the same name 'LSTMCell' as a built-in Keras object. Consider renaming <class 'keras.layers.recurrent.LSTMCell'> to avoid naming conflicts when loading with `tf.keras.models.load_model`. If renaming is not possible, pass the object in the `custom_objects` parameter of the load function.\n",
"WARNING:absl:<keras.layers.recurrent.LSTMCell object at 0x7fd0e080f070> has the same name 'LSTMCell' as a built-in Keras object. Consider renaming <class 'keras.layers.recurrent.LSTMCell'> to avoid naming conflicts when loading with `tf.keras.models.load_model`. If renaming is not possible, pass the object in the `custom_objects` parameter of the load function.\n"
]
}
],
"source": [
"model.save(\"my_sketchrnn\", save_format=\"tf\")"
]
},
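  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sketch (an extra cell, not in the original flow), the model can be reloaded from the `my_sketchrnn` directory we just saved. A plain reload should work here; if Keras ever complains about the `LSTMCell` name clash mentioned in the warnings above, pass the class via `custom_objects`, as they suggest:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "loaded_model = tf.keras.models.load_model(\"my_sketchrnn\")\n",
    "# If needed:\n",
    "# loaded_model = tf.keras.models.load_model(\n",
    "#     \"my_sketchrnn\", custom_objects={\"LSTMCell\": tf.keras.layers.LSTMCell})"
   ]
  },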
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 10. Bach Chorales\n",
"_Exercise: Download the [Bach chorales](https://homl.info/bach) dataset and unzip it. It is composed of 382 chorales composed by Johann Sebastian Bach. Each chorale is 100 to 640 time steps long, and each time step contains 4 integers, where each integer corresponds to a note's index on a piano (except for the value 0, which means that no note is played). Train a model—recurrent, convolutional, or both—that can predict the next time step (four notes), given a sequence of time steps from a chorale. Then use this model to generate Bach-like music, one note at a time: you can do this by giving the model the start of a chorale and asking it to predict the next time step, then appending these time steps to the input sequence and asking the model for the next note, and so on. Also make sure to check out [Google's Coconet model](https://homl.info/coconet), which was used for a nice [Google doodle about Bach](https://www.google.com/doodles/celebrating-johann-sebastian-bach)._\n",
"\n"
]
},
{
"cell_type": "code",
"execution_count": 100,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading data from https://github.com/ageron/data/raw/main/jsb_chorales.tgz\n",
"122880/117793 [===============================] - 0s 0us/step\n",
"131072/117793 [=================================] - 0s 0us/step\n"
]
},
{
"data": {
"text/plain": [
"'./datasets/jsb_chorales.tgz'"
]
},
"execution_count": 100,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"tf.keras.utils.get_file(\n",
" \"jsb_chorales.tgz\",\n",
" \"https://github.com/ageron/data/raw/main/jsb_chorales.tgz\",\n",
" cache_dir=\".\",\n",
" extract=True)"
]
},
{
"cell_type": "code",
"execution_count": 101,
"metadata": {},
"outputs": [],
"source": [
"jsb_chorales_dir = Path(\"datasets/jsb_chorales\")\n",
"train_files = sorted(jsb_chorales_dir.glob(\"train/chorale_*.csv\"))\n",
"valid_files = sorted(jsb_chorales_dir.glob(\"valid/chorale_*.csv\"))\n",
"test_files = sorted(jsb_chorales_dir.glob(\"test/chorale_*.csv\"))"
]
},
{
"cell_type": "code",
"execution_count": 102,
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"\n",
"def load_chorales(filepaths):\n",
" return [pd.read_csv(filepath).values.tolist() for filepath in filepaths]\n",
"\n",
"train_chorales = load_chorales(train_files)\n",
"valid_chorales = load_chorales(valid_files)\n",
"test_chorales = load_chorales(test_files)"
]
},
{
"cell_type": "code",
"execution_count": 103,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"[[74, 70, 65, 58],\n",
" [74, 70, 65, 58],\n",
" [74, 70, 65, 58],\n",
" [74, 70, 65, 58],\n",
" [75, 70, 58, 55],\n",
" [75, 70, 58, 55],\n",
" [75, 70, 60, 55],\n",
" [75, 70, 60, 55],\n",
" [77, 69, 62, 50],\n",
" [77, 69, 62, 50],\n",
" [77, 69, 62, 50],\n",
" [77, 69, 62, 50],\n",
" [77, 70, 62, 55],\n",
" [77, 70, 62, 55],\n",
" [77, 69, 62, 55],\n",
" [77, 69, 62, 55],\n",
" [75, 67, 63, 48],\n",
" [75, 67, 63, 48],\n",
" [75, 69, 63, 48],\n",
" [75, 69, 63, 48],\n",
" [74, 70, 65, 46],\n",
" [74, 70, 65, 46],\n",
" [74, 70, 65, 46],\n",
" [74, 70, 65, 46],\n",
" [72, 69, 65, 53],\n",
" [72, 69, 65, 53],\n",
" [72, 69, 65, 53],\n",
" [72, 69, 65, 53],\n",
" [72, 69, 65, 53],\n",
" [72, 69, 65, 53],\n",
" [72, 69, 65, 53],\n",
" [72, 69, 65, 53],\n",
" [74, 70, 65, 46],\n",
" [74, 70, 65, 46],\n",
" [74, 70, 65, 46],\n",
" [74, 70, 65, 46],\n",
" [75, 69, 63, 48],\n",
" [75, 69, 63, 48],\n",
" [75, 67, 63, 48],\n",
" [75, 67, 63, 48],\n",
" [77, 65, 62, 50],\n",
" [77, 65, 62, 50],\n",
" [77, 65, 60, 50],\n",
" [77, 65, 60, 50],\n",
" [74, 67, 58, 55],\n",
" [74, 67, 58, 55],\n",
" [74, 67, 58, 53],\n",
" [74, 67, 58, 53],\n",
" [72, 67, 58, 51],\n",
" [72, 67, 58, 51],\n",
" [72, 67, 58, 51],\n",
" [72, 67, 58, 51],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [72, 69, 65, 53],\n",
" [72, 69, 65, 53],\n",
" [72, 69, 65, 53],\n",
" [72, 69, 65, 53],\n",
" [74, 71, 53, 50],\n",
" [74, 71, 53, 50],\n",
" [74, 71, 53, 50],\n",
" [74, 71, 53, 50],\n",
" [75, 72, 55, 48],\n",
" [75, 72, 55, 48],\n",
" [75, 72, 55, 50],\n",
" [75, 72, 55, 50],\n",
" [75, 67, 60, 51],\n",
" [75, 67, 60, 51],\n",
" [75, 67, 60, 53],\n",
" [75, 67, 60, 53],\n",
" [74, 67, 60, 55],\n",
" [74, 67, 60, 55],\n",
" [74, 67, 57, 55],\n",
" [74, 67, 57, 55],\n",
" [74, 65, 59, 43],\n",
" [74, 65, 59, 43],\n",
" [72, 63, 59, 43],\n",
" [72, 63, 59, 43],\n",
" [72, 63, 55, 48],\n",
" [72, 63, 55, 48],\n",
" [72, 63, 55, 48],\n",
" [72, 63, 55, 48],\n",
" [72, 63, 55, 48],\n",
" [72, 63, 55, 48],\n",
" [72, 63, 55, 48],\n",
" [72, 63, 55, 48],\n",
" [75, 67, 60, 60],\n",
" [75, 67, 60, 60],\n",
" [75, 67, 60, 60],\n",
" [75, 67, 60, 60],\n",
" [77, 70, 62, 58],\n",
" [77, 70, 62, 58],\n",
" [77, 70, 62, 56],\n",
" [77, 70, 62, 56],\n",
" [79, 70, 62, 55],\n",
" [79, 70, 62, 55],\n",
" [79, 70, 62, 53],\n",
" [79, 70, 62, 53],\n",
" [79, 70, 63, 51],\n",
" [79, 70, 63, 51],\n",
" [79, 70, 63, 51],\n",
" [79, 70, 63, 51],\n",
" [77, 70, 63, 58],\n",
" [77, 70, 63, 58],\n",
" [77, 70, 60, 58],\n",
" [77, 70, 60, 58],\n",
" [77, 70, 62, 46],\n",
" [77, 70, 62, 46],\n",
" [77, 68, 62, 46],\n",
" [75, 68, 62, 46],\n",
" [75, 67, 58, 51],\n",
" [75, 67, 58, 51],\n",
" [75, 67, 58, 51],\n",
" [75, 67, 58, 51],\n",
" [75, 67, 58, 51],\n",
" [75, 67, 58, 51],\n",
" [75, 67, 58, 51],\n",
" [75, 67, 58, 51],\n",
" [74, 67, 58, 55],\n",
" [74, 67, 58, 55],\n",
" [74, 67, 58, 55],\n",
" [74, 67, 58, 55],\n",
" [75, 67, 58, 53],\n",
" [75, 67, 58, 53],\n",
" [75, 67, 58, 51],\n",
" [75, 67, 58, 51],\n",
" [77, 65, 58, 50],\n",
" [77, 65, 58, 50],\n",
" [77, 65, 56, 50],\n",
" [77, 65, 56, 50],\n",
" [70, 63, 55, 51],\n",
" [70, 63, 55, 51],\n",
" [70, 63, 55, 51],\n",
" [70, 63, 55, 51],\n",
" [75, 65, 60, 45],\n",
" [75, 65, 60, 45],\n",
" [75, 65, 60, 45],\n",
" [75, 65, 60, 45],\n",
" [74, 65, 58, 46],\n",
" [74, 65, 58, 46],\n",
" [74, 65, 58, 46],\n",
" [74, 65, 58, 46],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [74, 65, 58, 58],\n",
" [74, 65, 58, 58],\n",
" [74, 65, 58, 58],\n",
" [74, 65, 58, 58],\n",
" [75, 67, 58, 57],\n",
" [75, 67, 58, 57],\n",
" [75, 67, 58, 55],\n",
" [75, 67, 58, 55],\n",
" [77, 65, 60, 57],\n",
" [77, 65, 60, 57],\n",
" [77, 65, 60, 53],\n",
" [77, 65, 60, 53],\n",
" [74, 65, 58, 58],\n",
" [74, 65, 58, 58],\n",
" [74, 65, 58, 58],\n",
" [74, 65, 58, 58],\n",
" [72, 67, 58, 51],\n",
" [72, 67, 58, 51],\n",
" [72, 67, 58, 51],\n",
" [72, 67, 58, 51],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [72, 65, 57, 53],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46],\n",
" [70, 65, 62, 46]]"
]
},
"execution_count": 103,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"train_chorales[0]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Notes range from 36 (C1 = C on octave 1) to 81 (A5 = A on octave 5), plus 0 for silence:"
]
},
{
"cell_type": "code",
"execution_count": 104,
"metadata": {},
"outputs": [],
"source": [
"notes = set()\n",
"for chorales in (train_chorales, valid_chorales, test_chorales):\n",
" for chorale in chorales:\n",
" for chord in chorale:\n",
" notes |= set(chord)\n",
"\n",
"n_notes = len(notes)\n",
"min_note = min(notes - {0})\n",
"max_note = max(notes)\n",
"\n",
"assert min_note == 36\n",
"assert max_note == 81"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's write a few functions to listen to these chorales (you don't need to understand the details here, and in fact there are certainly simpler ways to do this, for example using MIDI players, but I just wanted to have a bit of fun writing a synthesizer):"
]
},
{
"cell_type": "code",
"execution_count": 105,
"metadata": {},
"outputs": [],
"source": [
"from IPython.display import Audio\n",
"\n",
"def notes_to_frequencies(notes):\n",
" # Frequency doubles when you go up one octave; there are 12 semi-tones\n",
" # per octave; Note A on octave 4 is 440 Hz, and it is note number 69.\n",
" return 2 ** ((np.array(notes) - 69) / 12) * 440\n",
"\n",
"def frequencies_to_samples(frequencies, tempo, sample_rate):\n",
" note_duration = 60 / tempo # the tempo is measured in beats per minutes\n",
" # To reduce click sound at every beat, we round the frequencies to try to\n",
" # get the samples close to zero at the end of each note.\n",
" frequencies = (note_duration * frequencies).round() / note_duration\n",
" n_samples = int(note_duration * sample_rate)\n",
" time = np.linspace(0, note_duration, n_samples)\n",
" sine_waves = np.sin(2 * np.pi * frequencies.reshape(-1, 1) * time)\n",
" # Removing all notes with frequencies ≤ 9 Hz (includes note 0 = silence)\n",
" sine_waves *= (frequencies > 9.).reshape(-1, 1)\n",
" return sine_waves.reshape(-1)\n",
"\n",
"def chords_to_samples(chords, tempo, sample_rate):\n",
" freqs = notes_to_frequencies(chords)\n",
" freqs = np.r_[freqs, freqs[-1:]] # make last note a bit longer\n",
" merged = np.mean([frequencies_to_samples(melody, tempo, sample_rate)\n",
" for melody in freqs.T], axis=0)\n",
" n_fade_out_samples = sample_rate * 60 // tempo # fade out last note\n",
" fade_out = np.linspace(1., 0., n_fade_out_samples)**2\n",
" merged[-n_fade_out_samples:] *= fade_out\n",
" return merged\n",
"\n",
"def play_chords(chords, tempo=160, amplitude=0.1, sample_rate=44100, filepath=None):\n",
" samples = amplitude * chords_to_samples(chords, tempo, sample_rate)\n",
" if filepath:\n",
" from scipy.io import wavfile\n",
" samples = (2**15 * samples).astype(np.int16)\n",
" wavfile.write(filepath, sample_rate, samples)\n",
" return display(Audio(filepath))\n",
" else:\n",
" return display(Audio(samples, rate=sample_rate))"
]
},
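  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick sanity check of the formula in `notes_to_frequencies()` (an extra cell, just for illustration): note 69 (A4) should map to 440 Hz, and note 60 (middle C) sits 9 semi-tones below A4, so it should map to 440 / 2 ** (9 / 12) ≈ 261.63 Hz:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "notes_to_frequencies([69, 60])  # expected: array([440., 261.6255...])"
   ]
  },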
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's listen to a few chorales:"
]
},
{
"cell_type": "code",
"execution_count": 106,
"metadata": {},
"outputs": [],
"source": [
"for index in range(3):\n",
" play_chords(train_chorales[index])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Divine! :)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"In order to be able to generate new chorales, we want to train a model that can predict the next chord given all the previous chords. If we naively try to predict the next chord in one shot, predicting all 4 notes at once, we run the risk of getting notes that don't go very well together (believe me, I tried). It's much better and simpler to predict one note at a time. So we will need to preprocess every chorale, turning each chord into an arpegio (i.e., a sequence of notes rather than notes played simultaneuously). So each chorale will be a long sequence of notes (rather than chords), and we can just train a model that can predict the next note given all the previous notes. We will use a sequence-to-sequence approach, where we feed a window to the neural net, and it tries to predict that same window shifted one time step into the future.\n",
"\n",
"We will also shift the values so that they range from 0 to 46, where 0 represents silence, and values 1 to 46 represent notes 36 (C1) to 81 (A5).\n",
"\n",
"And we will train the model on windows of 128 notes (i.e., 32 chords).\n",
"\n",
"Since the dataset fits in memory, we could preprocess the chorales in RAM using any Python code we like, but I will demonstrate here how to do all the preprocessing using tf.data (there will be more details about creating windows using tf.data in the next chapter)."
]
},
{
"cell_type": "code",
"execution_count": 107,
"metadata": {},
"outputs": [],
"source": [
"def create_target(batch):\n",
" X = batch[:, :-1]\n",
" Y = batch[:, 1:] # predict next note in each arpegio, at each step\n",
" return X, Y\n",
"\n",
"def preprocess(window):\n",
" window = tf.where(window == 0, window, window - min_note + 1) # shift values\n",
" return tf.reshape(window, [-1]) # convert to arpegio\n",
"\n",
"def bach_dataset(chorales, batch_size=32, shuffle_buffer_size=None,\n",
" window_size=32, window_shift=16, cache=True):\n",
" def batch_window(window):\n",
" return window.batch(window_size + 1)\n",
"\n",
" def to_windows(chorale):\n",
" dataset = tf.data.Dataset.from_tensor_slices(chorale)\n",
" dataset = dataset.window(window_size + 1, window_shift, drop_remainder=True)\n",
" return dataset.flat_map(batch_window)\n",
"\n",
" chorales = tf.ragged.constant(chorales, ragged_rank=1)\n",
" dataset = tf.data.Dataset.from_tensor_slices(chorales)\n",
" dataset = dataset.flat_map(to_windows).map(preprocess)\n",
" if cache:\n",
" dataset = dataset.cache()\n",
" if shuffle_buffer_size:\n",
" dataset = dataset.shuffle(shuffle_buffer_size)\n",
" dataset = dataset.batch(batch_size)\n",
" dataset = dataset.map(create_target)\n",
" return dataset.prefetch(1)"
]
},
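  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To make the value shift concrete (an extra cell, not in the original flow): `preprocess()` leaves silence (0) at 0, maps notes 36 to 81 onto 1 to 46, and flattens the chord sequence into an arpeggio:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "preprocess(tf.constant([[36, 81, 0, 69]]))  # expected: [1, 46, 0, 34]"
   ]
  },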
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's create the training set, the validation set and the test set:"
]
},
{
"cell_type": "code",
"execution_count": 108,
"metadata": {},
"outputs": [],
"source": [
"train_set = bach_dataset(train_chorales, shuffle_buffer_size=1000)\n",
"valid_set = bach_dataset(valid_chorales)\n",
"test_set = bach_dataset(test_chorales)"
]
},
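  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick inspection (extra cell): each batch should contain input windows and targets of the same shape, with the targets shifted one note into the future:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "for X_batch, Y_batch in train_set.take(1):\n",
    "    print(X_batch.shape, Y_batch.shape)  # e.g., (32, 131) (32, 131)"
   ]
  },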
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's create the model:\n",
"\n",
"* We could feed the note values directly to the model, as floats, but this would probably not give good results. Indeed, the relationships between notes are not that simple: for example, if you replace a C3 with a C4, the melody will still sound fine, even though these notes are 12 semi-tones apart (i.e., one octave). Conversely, if you replace a C3 with a C\\#3, it's very likely that the chord will sound horrible, despite these notes being just next to each other. So we will use an `Embedding` layer to convert each note to a small vector representation (see Chapter 16 for more details on embeddings). We will use 5-dimensional embeddings, so the output of this first layer will have a shape of `[batch_size, window_size, 5]`.\n",
"* We will then feed this data to a small WaveNet-like neural network, composed of a stack of 4 `Conv1D` layers with doubling dilation rates. We will intersperse these layers with `BatchNormalization` layers for faster better convergence.\n",
"* Then one `LSTM` layer to try to capture long-term patterns.\n",
"* And finally a `Dense` layer to produce the final note probabilities. It will predict one probability for each chorale in the batch, for each time step, and for each possible note (including silence). So the output shape will be `[batch_size, window_size, 47]`."
]
},
{
"cell_type": "code",
"execution_count": 109,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Model: \"sequential_19\"\n",
"_________________________________________________________________\n",
" Layer (type) Output Shape Param # \n",
"=================================================================\n",
" embedding (Embedding) (None, None, 5) 235 \n",
" \n",
" conv1d_22 (Conv1D) (None, None, 32) 352 \n",
" \n",
" batch_normalization_3 (Batc (None, None, 32) 128 \n",
" hNormalization) \n",
" \n",
" conv1d_23 (Conv1D) (None, None, 48) 3120 \n",
" \n",
" batch_normalization_4 (Batc (None, None, 48) 192 \n",
" hNormalization) \n",
" \n",
" conv1d_24 (Conv1D) (None, None, 64) 6208 \n",
" \n",
" batch_normalization_5 (Batc (None, None, 64) 256 \n",
" hNormalization) \n",
" \n",
" conv1d_25 (Conv1D) (None, None, 96) 12384 \n",
" \n",
" batch_normalization_6 (Batc (None, None, 96) 384 \n",
" hNormalization) \n",
" \n",
" lstm_3 (LSTM) (None, None, 256) 361472 \n",
" \n",
" dense_17 (Dense) (None, None, 47) 12079 \n",
" \n",
"=================================================================\n",
"Total params: 396,810\n",
"Trainable params: 396,330\n",
"Non-trainable params: 480\n",
"_________________________________________________________________\n"
]
}
],
"source": [
"n_embedding_dims = 5\n",
"\n",
"model = tf.keras.Sequential([\n",
" tf.keras.layers.Embedding(input_dim=n_notes, output_dim=n_embedding_dims,\n",
" input_shape=[None]),\n",
" tf.keras.layers.Conv1D(32, kernel_size=2, padding=\"causal\", activation=\"relu\"),\n",
" tf.keras.layers.BatchNormalization(),\n",
" tf.keras.layers.Conv1D(48, kernel_size=2, padding=\"causal\", activation=\"relu\", dilation_rate=2),\n",
" tf.keras.layers.BatchNormalization(),\n",
" tf.keras.layers.Conv1D(64, kernel_size=2, padding=\"causal\", activation=\"relu\", dilation_rate=4),\n",
" tf.keras.layers.BatchNormalization(),\n",
" tf.keras.layers.Conv1D(96, kernel_size=2, padding=\"causal\", activation=\"relu\", dilation_rate=8),\n",
" tf.keras.layers.BatchNormalization(),\n",
" tf.keras.layers.LSTM(256, return_sequences=True),\n",
" tf.keras.layers.Dense(n_notes, activation=\"softmax\")\n",
"])\n",
"\n",
"model.summary()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we're ready to compile and train the model!"
]
},
{
"cell_type": "code",
"execution_count": 110,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Epoch 1/20\n",
"98/98 [==============================] - 25s 208ms/step - loss: 1.8695 - accuracy: 0.5301 - val_loss: 3.7034 - val_accuracy: 0.1226\n",
"Epoch 2/20\n",
"98/98 [==============================] - 22s 225ms/step - loss: 0.9034 - accuracy: 0.7638 - val_loss: 3.4941 - val_accuracy: 0.1050\n",
"Epoch 3/20\n",
"98/98 [==============================] - 23s 233ms/step - loss: 0.7523 - accuracy: 0.7916 - val_loss: 3.3243 - val_accuracy: 0.1938\n",
"Epoch 4/20\n",
"98/98 [==============================] - 23s 232ms/step - loss: 0.6756 - accuracy: 0.8074 - val_loss: 2.5097 - val_accuracy: 0.3022\n",
"Epoch 5/20\n",
"98/98 [==============================] - 22s 223ms/step - loss: 0.6188 - accuracy: 0.8193 - val_loss: 1.7532 - val_accuracy: 0.4628\n",
"Epoch 6/20\n",
"98/98 [==============================] - 23s 237ms/step - loss: 0.5788 - accuracy: 0.8280 - val_loss: 1.0323 - val_accuracy: 0.6826\n",
"Epoch 7/20\n",
"98/98 [==============================] - 25s 256ms/step - loss: 0.5396 - accuracy: 0.8374 - val_loss: 0.7257 - val_accuracy: 0.7910\n",
"Epoch 8/20\n",
"98/98 [==============================] - 27s 278ms/step - loss: 0.5079 - accuracy: 0.8451 - val_loss: 0.8296 - val_accuracy: 0.7497\n",
"Epoch 9/20\n",
"98/98 [==============================] - 26s 267ms/step - loss: 0.4796 - accuracy: 0.8523 - val_loss: 0.6217 - val_accuracy: 0.8162\n",
"Epoch 10/20\n",
"98/98 [==============================] - 26s 270ms/step - loss: 0.4543 - accuracy: 0.8594 - val_loss: 0.6307 - val_accuracy: 0.8136\n",
"Epoch 11/20\n",
"98/98 [==============================] - 28s 285ms/step - loss: 0.4291 - accuracy: 0.8665 - val_loss: 0.6203 - val_accuracy: 0.8183\n",
"Epoch 12/20\n",
"98/98 [==============================] - 28s 284ms/step - loss: 0.4062 - accuracy: 0.8732 - val_loss: 0.6111 - val_accuracy: 0.8210\n",
"Epoch 13/20\n",
"98/98 [==============================] - 24s 247ms/step - loss: 0.3846 - accuracy: 0.8798 - val_loss: 0.6185 - val_accuracy: 0.8167\n",
"Epoch 14/20\n",
"98/98 [==============================] - 24s 247ms/step - loss: 0.3647 - accuracy: 0.8856 - val_loss: 0.6036 - val_accuracy: 0.8244\n",
"Epoch 15/20\n",
"98/98 [==============================] - 24s 248ms/step - loss: 0.3454 - accuracy: 0.8918 - val_loss: 0.6400 - val_accuracy: 0.8149\n",
"Epoch 16/20\n",
"98/98 [==============================] - 24s 243ms/step - loss: 0.3299 - accuracy: 0.8969 - val_loss: 0.6517 - val_accuracy: 0.8099\n",
"Epoch 17/20\n",
"98/98 [==============================] - 23s 240ms/step - loss: 0.3100 - accuracy: 0.9027 - val_loss: 0.6472 - val_accuracy: 0.8148\n",
"Epoch 18/20\n",
"98/98 [==============================] - 23s 238ms/step - loss: 0.2952 - accuracy: 0.9080 - val_loss: 0.6446 - val_accuracy: 0.8167\n",
"Epoch 19/20\n",
"98/98 [==============================] - 22s 221ms/step - loss: 0.2781 - accuracy: 0.9136 - val_loss: 0.6774 - val_accuracy: 0.8104\n",
"Epoch 20/20\n",
"98/98 [==============================] - 23s 234ms/step - loss: 0.2642 - accuracy: 0.9179 - val_loss: 0.6484 - val_accuracy: 0.8199\n"
]
},
{
"data": {
"text/plain": [
"<keras.callbacks.History at 0x7fd121a6bdf0>"
]
},
"execution_count": 110,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"optimizer = tf.keras.optimizers.Nadam(learning_rate=1e-3)\n",
"model.compile(loss=\"sparse_categorical_crossentropy\", optimizer=optimizer,\n",
" metrics=[\"accuracy\"])\n",
"model.fit(train_set, epochs=20, validation_data=valid_set)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"I have not done much hyperparameter search, so feel free to iterate on this model now and try to optimize it. For example, you could try removing the `LSTM` layer and replacing it with `Conv1D` layers. You could also play with the number of layers, the learning rate, the optimizer, and so on."
]
},
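  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For example, here is a minimal sketch of the fully convolutional variant mentioned above, with the `LSTM` layer replaced by an extra dilated `Conv1D` layer (the layer sizes and the extra dilation rate are untuned assumptions, so treat this as a starting point rather than a recommendation):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# A minimal sketch (untrained and untuned): same WaveNet-like stack as above,\n",
    "# but the LSTM is replaced by one more dilated causal convolution\n",
    "conv_only_model = tf.keras.Sequential([\n",
    "    tf.keras.layers.Embedding(input_dim=n_notes, output_dim=n_embedding_dims,\n",
    "                              input_shape=[None]),\n",
    "    tf.keras.layers.Conv1D(32, kernel_size=2, padding=\"causal\", activation=\"relu\"),\n",
    "    tf.keras.layers.BatchNormalization(),\n",
    "    tf.keras.layers.Conv1D(48, kernel_size=2, padding=\"causal\", activation=\"relu\", dilation_rate=2),\n",
    "    tf.keras.layers.BatchNormalization(),\n",
    "    tf.keras.layers.Conv1D(64, kernel_size=2, padding=\"causal\", activation=\"relu\", dilation_rate=4),\n",
    "    tf.keras.layers.BatchNormalization(),\n",
    "    tf.keras.layers.Conv1D(96, kernel_size=2, padding=\"causal\", activation=\"relu\", dilation_rate=8),\n",
    "    tf.keras.layers.BatchNormalization(),\n",
    "    tf.keras.layers.Conv1D(128, kernel_size=2, padding=\"causal\", activation=\"relu\", dilation_rate=16),\n",
    "    tf.keras.layers.BatchNormalization(),\n",
    "    tf.keras.layers.Dense(n_notes, activation=\"softmax\")\n",
    "])\n",
    "conv_only_model.compile(loss=\"sparse_categorical_crossentropy\",\n",
    "                        optimizer=tf.keras.optimizers.Nadam(learning_rate=1e-3),\n",
    "                        metrics=[\"accuracy\"])\n",
    "# Uncomment to train this variant:\n",
    "# conv_only_model.fit(train_set, epochs=20, validation_data=valid_set)"
   ]
  },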
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Once you're satisfied with the performance of the model on the validation set, you can save it and evaluate it one last time on the test set:"
]
},
{
"cell_type": "code",
"execution_count": 111,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"34/34 [==============================] - 3s 74ms/step - loss: 0.6631 - accuracy: 0.8164\n"
]
},
{
"data": {
"text/plain": [
"[0.6630987524986267, 0.8163789510726929]"
]
},
"execution_count": 111,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"model.save(\"my_bach_model\", save_format=\"tf\")\n",
"model.evaluate(test_set)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"**Note:** There's no real need for a test set in this exercise, since we will perform the final evaluation by just listening to the music produced by the model. So if you want, you can add the test set to the train set, and train the model again, hopefully getting a slightly better model."
]
},
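  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "If you decide to do that, the cleanest option is to rebuild the training pipeline from the combined chorale lists using the same preprocessing as before. As a simpler approximation, here is a minimal sketch (assuming `train_set` and `test_set` are the `tf.data.Dataset` pipelines built earlier) that just chains the two pipelines; note that examples are not shuffled across the two sets:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# A minimal sketch: Dataset.concatenate() yields all train batches, then all\n",
    "# test batches (no shuffling across the two sets)\n",
    "full_train_set = train_set.concatenate(test_set)\n",
    "# Uncomment to retrain on the combined data:\n",
    "# model.fit(full_train_set, epochs=20, validation_data=valid_set)"
   ]
  },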
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's write a function that will generate a new chorale. We will give it a few seed chords, it will convert them to arpegios (the format expected by the model), and use the model to predict the next note, then the next, and so on. In the end, it will group the notes 4 by 4 to create chords again, and return the resulting chorale."
]
},
{
"cell_type": "code",
"execution_count": 112,
"metadata": {},
"outputs": [],
"source": [
"def generate_chorale(model, seed_chords, length):\n",
" arpegio = preprocess(tf.constant(seed_chords, dtype=tf.int64))\n",
" arpegio = tf.reshape(arpegio, [1, -1])\n",
" for chord in range(length):\n",
" for note in range(4):\n",
" next_note = model.predict(arpegio, verbose=0).argmax(axis=-1)[:1, -1:]\n",
" arpegio = tf.concat([arpegio, next_note], axis=1)\n",
" arpegio = tf.where(arpegio == 0, arpegio, arpegio + min_note - 1)\n",
" return tf.reshape(arpegio, shape=[-1, 4])"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"To test this function, we need some seed chords. Let's use the first 8 chords of one of the test chorales (it's actually just 2 different chords, each played 4 times):"
]
},
{
"cell_type": "code",
"execution_count": 113,
"metadata": {},
"outputs": [],
"source": [
"seed_chords = test_chorales[2][:8]\n",
"play_chords(seed_chords, amplitude=0.2)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Now we are ready to generate our first chorale! Let's ask the function to generate 56 more chords, for a total of 64 chords, i.e., 16 bars (assuming 4 chords per bar, i.e., a 4/4 signature):"
]
},
{
"cell_type": "code",
"execution_count": 114,
"metadata": {},
"outputs": [],
"source": [
"new_chorale = generate_chorale(model, seed_chords, 56)\n",
"play_chords(new_chorale)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This approach has one major flaw: it is often too conservative. Indeed, the model will not take any risk, it will always choose the note with the highest score, and since repeating the previous note generally sounds good enough, it's the least risky option, so the algorithm will tend to make notes last longer and longer. Pretty boring. Plus, if you run the model multiple times, it will always generate the same melody.\n",
"\n",
"So let's spice things up a bit! Instead of always picking the note with the highest score, we will pick the next note randomly, according to the predicted probabilities. For example, if the model predicts a C3 with 75% probability, and a G3 with a 25% probability, then we will pick one of these two notes randomly, with these probabilities. We will also add a `temperature` parameter that will control how \"hot\" (i.e., daring) we want the system to feel. A high temperature will bring the predicted probabilities closer together, reducing the probability of the likely notes and increasing the probability of the unlikely ones."
]
},
{
"cell_type": "code",
"execution_count": 115,
"metadata": {},
"outputs": [],
"source": [
"def generate_chorale_v2(model, seed_chords, length, temperature=1):\n",
" arpegio = preprocess(tf.constant(seed_chords, dtype=tf.int64))\n",
" arpegio = tf.reshape(arpegio, [1, -1])\n",
" for chord in range(length):\n",
" for note in range(4):\n",
" next_note_probas = model.predict(arpegio)[0, -1:]\n",
" rescaled_logits = tf.math.log(next_note_probas) / temperature\n",
" next_note = tf.random.categorical(rescaled_logits, num_samples=1)\n",
" arpegio = tf.concat([arpegio, next_note], axis=1)\n",
" arpegio = tf.where(arpegio == 0, arpegio, arpegio + min_note - 1)\n",
" return tf.reshape(arpegio, shape=[-1, 4])"
]
},
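  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To get a feel for what the `temperature` parameter does, here is a quick numeric check (purely illustrative): we rescale the log-probabilities of the C3/G3 example above by a few temperatures and renormalize with a softmax. Temperatures above 1 flatten the distribution, temperatures below 1 sharpen it, and a temperature of 1 leaves it unchanged:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative only: effect of temperature on a C3=75% / G3=25% prediction.\n",
    "# softmax(log(p) / T) gives roughly [0.9, 0.1] at T=0.5, [0.75, 0.25] at\n",
    "# T=1.0, and about [0.68, 0.32] at T=1.5\n",
    "probas = tf.constant([[0.75, 0.25]])\n",
    "for T in (0.5, 1.0, 1.5):\n",
    "    print(T, tf.nn.softmax(tf.math.log(probas) / T).numpy().round(3))"
   ]
  },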
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's generate 3 chorales using this new function: one cold, one medium, and one hot (feel free to experiment with other seeds, lengths and temperatures). The code saves each chorale to a separate file. You can run these cells over an over again until you generate a masterpiece!\n",
"\n",
"**Please share your most beautiful generated chorale with me on Twitter @aureliengeron, I would really appreciate it! :))**"
]
},
{
"cell_type": "code",
"execution_count": 116,
"metadata": {
"scrolled": true
},
"outputs": [],
"source": [
"new_chorale_v2_cold = generate_chorale_v2(model, seed_chords, 56, temperature=0.8)\n",
"play_chords(new_chorale_v2_cold, filepath=\"bach_cold.wav\")"
]
},
{
"cell_type": "code",
"execution_count": 117,
"metadata": {},
"outputs": [],
"source": [
"new_chorale_v2_medium = generate_chorale_v2(model, seed_chords, 56, temperature=1.0)\n",
"play_chords(new_chorale_v2_medium, filepath=\"bach_medium.wav\")"
]
},
{
"cell_type": "code",
"execution_count": 118,
"metadata": {},
"outputs": [],
"source": [
"new_chorale_v2_hot = generate_chorale_v2(model, seed_chords, 56, temperature=1.5)\n",
"play_chords(new_chorale_v2_hot, filepath=\"bach_hot.wav\")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Lastly, you can try a fun social experiment: send your friends a few of your favorite generated chorales, plus the real chorale, and ask them to guess which one is the real one!"
]
},
{
"cell_type": "code",
"execution_count": 119,
"metadata": {},
"outputs": [],
"source": [
"play_chords(test_chorales[2][:64], filepath=\"bach_test_4.wav\")"
]
}
],
"metadata": {
"accelerator": "GPU",
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.10.6"
},
"nav_menu": {},
"toc": {
"navigate_menu": true,
"number_sections": true,
"sideBar": true,
"threshold": 6,
"toc_cell": false,
"toc_section_display": "block",
"toc_window_display": false
}
},
"nbformat": 4,
"nbformat_minor": 4
2016-09-27 23:31:21 +02:00
}