Point to the autodiff notebook in index.ipynb and math_differential_calculus.ipynb

main
Aurélien Geron 2020-04-12 18:47:26 +12:00
parent 1451060165
commit 78f33162fb
2 changed files with 2 additions and 2 deletions

index.ipynb

@@ -41,7 +41,7 @@
 "* [Differential Calculus](math_differential_calculus.ipynb)\n",
 "\n",
 "## Extra Material\n",
-"Work in progress\n",
+"* [Auto-differentiation](extra_autodiff.ipynb)\n",
 "\n",
 "## Misc.\n",
 "* [Equations](book_equations.pdf) (list of equations in the book)\n"

math_differential_calculus.ipynb

@@ -473,7 +473,7 @@
 "id": "ebb31wJp72Zn"
 },
 "source": [
-"**Important note:** in Deep Learning, differentiation is almost always performed automatically by the framework you are using (such as TensorFlow or PyTorch). This is called auto-diff, and I did [another notebook](https://github.com/ageron/handson-ml/blob/master/extra_autodiff.ipynb) on that topic. However, you should still make sure you have a good understanding of derivatives, or else they will come and bite you one day, for example when you use a square root in your cost function without realizing that its derivative approaches infinity when $x$ approaches 0 (tip: you should use $\\sqrt{x+\\epsilon}$ instead, where $\\epsilon$ is some small constant, such as $10^{-4}$)."
+"**Important note:** in Deep Learning, differentiation is almost always performed automatically by the framework you are using (such as TensorFlow or PyTorch). This is called auto-diff, and I did [another notebook](https://github.com/ageron/handson-ml2/blob/master/extra_autodiff.ipynb) on that topic. However, you should still make sure you have a good understanding of derivatives, or else they will come and bite you one day, for example when you use a square root in your cost function without realizing that its derivative approaches infinity when $x$ approaches 0 (tip: you should use $\\sqrt{x+\\epsilon}$ instead, where $\\epsilon$ is some small constant, such as $10^{-4}$)."
 ]
 },
 {
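
The note in the changed cell above warns that the derivative of $\sqrt{x}$ approaches infinity as $x$ approaches 0, and suggests using $\sqrt{x+\epsilon}$ instead. Below is a minimal sketch of that tip using TensorFlow's GradientTape; the snippet and its variable names are illustrative and not part of this commit.

```python
import tensorflow as tf

eps = 1e-4  # small constant epsilon, as suggested in the note

x = tf.Variable(0.0)  # the point where sqrt's derivative is problematic
with tf.GradientTape(persistent=True) as tape:
    y_raw = tf.sqrt(x)         # d/dx sqrt(x) = 1/(2*sqrt(x)) -> infinity as x -> 0
    y_safe = tf.sqrt(x + eps)  # derivative stays bounded near 0

print(tape.gradient(y_raw, x))   # inf (or NaN), which would break gradient descent
print(tape.gradient(y_safe, x))  # finite: 1 / (2 * sqrt(eps)) = 50.0
```

Adding the epsilon inside the square root wherever its argument can reach 0 keeps gradients finite while barely changing the value of the cost function.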