Point to the autodiff notebook in index.ipynb and math_differential_calculus.ipynb

parent 1451060165
commit 78f33162fb
@@ -41,7 +41,7 @@
    "* [Differential Calculus](math_differential_calculus.ipynb)\n",
    "\n",
    "## Extra Material\n",
    "Work in progress\n",
    "* [Auto-differentiation](extra_autodiff.ipynb)\n",
    "\n",
    "## Misc.\n",
    "* [Equations](book_equations.pdf) (list of equations in the book)\n"
@@ -473,7 +473,7 @@
    "id": "ebb31wJp72Zn"
   },
   "source": [
-    "**Important note:** in Deep Learning, differentiation is almost always performed automatically by the framework you are using (such as TensorFlow or PyTorch). This is called auto-diff, and I did [another notebook](https://github.com/ageron/handson-ml/blob/master/extra_autodiff.ipynb) on that topic. However, you should still make sure you have a good understanding of derivatives, or else they will come and bite you one day, for example when you use a square root in your cost function without realizing that its derivative approaches infinity when $x$ approaches 0 (tip: you should use $\\sqrt{x+\\epsilon}$ instead, where $\\epsilon$ is some small constant, such as $10^{-4}$)."
+    "**Important note:** in Deep Learning, differentiation is almost always performed automatically by the framework you are using (such as TensorFlow or PyTorch). This is called auto-diff, and I did [another notebook](https://github.com/ageron/handson-ml2/blob/master/extra_autodiff.ipynb) on that topic. However, you should still make sure you have a good understanding of derivatives, or else they will come and bite you one day, for example when you use a square root in your cost function without realizing that its derivative approaches infinity when $x$ approaches 0 (tip: you should use $\\sqrt{x+\\epsilon}$ instead, where $\\epsilon$ is some small constant, such as $10^{-4}$)."
    ]
   },
   {
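A minimal sketch of the $\sqrt{x+\epsilon}$ tip mentioned in the note above, assuming NumPy; the function names here are illustrative and not part of the notebook:

import numpy as np

EPSILON = 1e-4  # small constant, as suggested in the note

def sqrt_cost(x):
    # naive cost term: d/dx sqrt(x) = 1 / (2 * sqrt(x)), which blows up as x -> 0
    return np.sqrt(x)

def stable_sqrt_cost(x):
    # stabilized cost term: its derivative is bounded by 1 / (2 * sqrt(EPSILON))
    return np.sqrt(x + EPSILON)

def numerical_derivative(f, x, h=1e-8):
    # crude forward-difference estimate of f'(x)
    return (f(x + h) - f(x)) / h

for x in [1.0, 1e-2, 1e-6, 1e-10]:
    print(x, numerical_derivative(sqrt_cost, x),
          numerical_derivative(stable_sqrt_cost, x))

As x shrinks, the derivative of the naive version grows without bound, while the stabilized version stays finite.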