Fixing wide-and-deep to use normalized inputs
Thanks for this great book/repo. I noticed that the wide and deep example uses non-normalized inputs in the wide path, which leads to performance degradation.
parent: eabcc492aa
commit: bdd246bf07
@@ -1568,7 +1568,7 @@
     "normalized = normalization_layer(input_)\n",
     "hidden1 = hidden_layer1(normalized)\n",
     "hidden2 = hidden_layer2(hidden1)\n",
-    "concat = concat_layer([input_, hidden2])\n",
+    "concat = concat_layer([normalized, hidden2])\n",
     "output = output_layer(concat)\n",
     "\n",
     "model = tf.keras.Model(inputs=[input_], outputs=[output])"
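For context, below is a minimal sketch of the wide and deep model this hunk patches, with the fix applied. The variable names (input_, normalized, hidden1, hidden2, concat, output) and layer names (normalization_layer, hidden_layer1, hidden_layer2, concat_layer, output_layer) come from the diff; the input shape, hidden-layer sizes, activations, and output size are illustrative assumptions, not necessarily the notebook's exact values.

import tensorflow as tf

# Input shape and layer sizes below are assumptions for illustration only.
input_ = tf.keras.layers.Input(shape=[8])

normalization_layer = tf.keras.layers.Normalization()
hidden_layer1 = tf.keras.layers.Dense(30, activation="relu")
hidden_layer2 = tf.keras.layers.Dense(30, activation="relu")
concat_layer = tf.keras.layers.Concatenate()
output_layer = tf.keras.layers.Dense(1)

normalized = normalization_layer(input_)
hidden1 = hidden_layer1(normalized)
hidden2 = hidden_layer2(hidden1)
# The fix: the wide path concatenates the *normalized* input (not the raw
# input_) with the deep path's output, so both paths see scaled features.
concat = concat_layer([normalized, hidden2])
output = output_layer(concat)

model = tf.keras.Model(inputs=[input_], outputs=[output])

Note that the Normalization layer still has to be adapted to the training data before fitting, e.g. normalization_layer.adapt(X_train), where X_train is a placeholder name for the training features.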