Update 11_training_deep_neural_networks.ipynb

Swapped the Activation and BatchNormalization lines to make the code consistent with the description and the book (p. 343), i.e., adding the BN layers BEFORE the activation functions.
branch main
Fai Sharji 2019-12-25 23:00:26 -05:00 committed by GitHub
parent 1f542d2757
commit de450d8077
1 changed file with 1 addition and 1 deletion

@@ -715,8 +715,8 @@
 " keras.layers.BatchNormalization(),\n",
 " keras.layers.Activation(\"relu\"),\n",
 " keras.layers.Dense(100, use_bias=False),\n",
-" keras.layers.Activation(\"relu\"),\n",
 " keras.layers.BatchNormalization(),\n",
+" keras.layers.Activation(\"relu\"),\n",
 " keras.layers.Dense(10, activation=\"softmax\")\n",
 "])"
 ]
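
For context, here is a minimal sketch of the full model as it reads after this change, with each BatchNormalization layer placed before its activation. The Flatten input layer, the 300-unit first hidden layer, and the keras import line are assumptions drawn from the surrounding notebook cell and the book's Fashion MNIST example, not part of this diff:

from tensorflow import keras

# Sketch of the corrected architecture; layers outside the hunk are assumed (see note above).
model = keras.models.Sequential([
    keras.layers.Flatten(input_shape=[28, 28]),   # assumed 28x28 Fashion MNIST input
    keras.layers.BatchNormalization(),
    keras.layers.Dense(300, use_bias=False),      # assumed first hidden layer
    keras.layers.BatchNormalization(),            # BN now comes BEFORE the activation...
    keras.layers.Activation("relu"),              # ...matching the description on p. 343
    keras.layers.Dense(100, use_bias=False),
    keras.layers.BatchNormalization(),
    keras.layers.Activation("relu"),
    keras.layers.Dense(10, activation="softmax")  # output layer, 10 classes
])

Placing BN before the activation means the activation sees normalized inputs, and because BN's learned offset (beta) already acts as a bias, the preceding Dense layers can use use_bias=False.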