diff --git a/index.ipynb b/index.ipynb
index 9118072..09a1f2b 100644
--- a/index.ipynb
+++ b/index.ipynb
@@ -41,7 +41,7 @@
    "* [Differential Calculus](math_differential_calculus.ipynb)\n",
    "\n",
    "## Extra Material\n",
-   "Work in progress\n",
+   "* [Auto-differentiation](extra_autodiff.ipynb)\n",
    "\n",
    "## Misc.\n",
    "* [Equations](book_equations.pdf) (list of equations in the book)\n"
diff --git a/math_differential_calculus.ipynb b/math_differential_calculus.ipynb
index 9fcc6a2..5f50a95 100644
--- a/math_differential_calculus.ipynb
+++ b/math_differential_calculus.ipynb
@@ -473,7 +473,7 @@
    "id": "ebb31wJp72Zn"
   },
   "source": [
-    "**Important note:** in Deep Learning, differentiation is almost always performed automatically by the framework you are using (such as TensorFlow or PyTorch). This is called auto-diff, and I did [another notebook](https://github.com/ageron/handson-ml/blob/master/extra_autodiff.ipynb) on that topic. However, you should still make sure you have a good understanding of derivatives, or else they will come and bite you one day, for example when you use a square root in your cost function without realizing that its derivative approaches infinity when $x$ approaches 0 (tip: you should use $\\sqrt{x+\\epsilon}$ instead, where $\\epsilon$ is some small constant, such as $10^{-4}$)."
+    "**Important note:** in Deep Learning, differentiation is almost always performed automatically by the framework you are using (such as TensorFlow or PyTorch). This is called auto-diff, and I did [another notebook](https://github.com/ageron/handson-ml2/blob/master/extra_autodiff.ipynb) on that topic. However, you should still make sure you have a good understanding of derivatives, or else they will come and bite you one day, for example when you use a square root in your cost function without realizing that its derivative approaches infinity when $x$ approaches 0 (tip: you should use $\\sqrt{x+\\epsilon}$ instead, where $\\epsilon$ is some small constant, such as $10^{-4}$)."
   ]
  },
 {
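
For context, here is a minimal sketch of the $\sqrt{x+\epsilon}$ tip from the second hunk; it is not part of the diff itself. It assumes PyTorch (one of the frameworks the note mentions) and uses the note's suggested constant $\epsilon = 10^{-4}$ to show that the gradient of sqrt(x) blows up near 0 while the gradient of sqrt(x + eps) stays bounded.

import torch

# A value very close to 0, where the derivative of sqrt is nearly infinite.
x = torch.tensor(1e-8, requires_grad=True)

# Raw square root: gradient is 1 / (2 * sqrt(x)).
y = torch.sqrt(x)
y.backward()
print(x.grad)  # ~5000.0: 1 / (2 * sqrt(1e-8)) explodes as x -> 0

# Stabilized version: gradient is capped at 1 / (2 * sqrt(eps)).
x.grad = None  # reset the accumulated gradient
eps = 1e-4     # small constant from the note (an illustrative choice)
y = torch.sqrt(x + eps)
y.backward()
print(x.grad)  # ~50.0: bounded, so training stays numerically stable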