diff --git a/README.md b/README.md index 434480c..6fbff2a 100644 --- a/README.md +++ b/README.md @@ -2,11 +2,11 @@ Machine Learning Notebooks, 3rd edition ================================= This project aims at teaching you the fundamentals of Machine Learning in -python. It contains the example code and solutions to the exercises in the second edition of my O'Reilly book [Hands-on Machine Learning with Scikit-Learn, Keras and TensorFlow (3rd edition)](https://homl.info/er3): +python. It contains the example code and solutions to the exercises in the third edition of my O'Reilly book [Hands-on Machine Learning with Scikit-Learn, Keras and TensorFlow (3rd edition)](https://homl.info/er3): -**Note**: If you are looking for the second edition notebooks, check out [ageron/handson-ml2](https://github.com/ageron/handson-ml2). For the first edition, see check out [ageron/handson-ml](https://github.com/ageron/handson-ml). +**Note**: If you are looking for the second edition notebooks, check out [ageron/handson-ml2](https://github.com/ageron/handson-ml2). For the first edition, see [ageron/handson-ml](https://github.com/ageron/handson-ml). ## Quick Start @@ -34,7 +34,7 @@ Read the [Docker instructions](https://github.com/ageron/handson-ml3/tree/main/d ### Want to install this project on your own machine? -Start by installing [Anaconda](https://www.anaconda.com/distribution/) (or [Miniconda](https://docs.conda.io/en/latest/miniconda.html)), [git](https://git-scm.com/downloads), and if you have a TensorFlow-compatible GPU, install the [GPU driver](https://www.nvidia.com/Download/index.aspx), as well as the appropriate version of CUDA and cuDNN (see TensorFlow's documentation for more details). 
+Start by installing [Anaconda](https://www.anaconda.com/products/distribution) (or [Miniconda](https://docs.conda.io/en/latest/miniconda.html)), [git](https://git-scm.com/downloads), and if you have a TensorFlow-compatible GPU, install the [GPU driver](https://www.nvidia.com/Download/index.aspx), as well as the appropriate version of CUDA and cuDNN (see TensorFlow's documentation for more details). Next, clone this project by opening a terminal and typing the following commands (do not type the first `$` signs on each line, they just indicate that these are terminal commands): diff --git a/math_differential_calculus.ipynb b/math_differential_calculus.ipynb index d332649..244c2a0 100644 --- a/math_differential_calculus.ipynb +++ b/math_differential_calculus.ipynb @@ -49,7 +49,6 @@ "outputs": [], "source": [ "#@title\n", - "%matplotlib inline\n", "import matplotlib as mpl\n", "import matplotlib.pyplot as plt\n", "import numpy as np\n", @@ -167,7 +166,7 @@ "id": "gcb7eqkmGGXf" }, "source": [ - "But what if you want to know the slope of something else than a straight line? For example, let's consider the curve defined by $y = f(x) = x^2$:" + "But what if you want to know the slope of something other than a straight line? For example, let's consider the curve defined by $y = f(x) = x^2$:" ] }, { @@ -226,7 +225,7 @@ "id": "4qCXg9nQSp6S" }, "source": [ - "How can we put numbers on these intuitions? Well, say we want to estimate the slope of the curve at a point $\\mathrm{A}$, we can do this by taking another point $\\mathrm{B}$ on the curve, not too far away, and then computing the slope between these two points:\n" + "How can we put numbers on these intuitions? Well, say we want to estimate the slope of the curve at a point $\\mathrm{A}$. 
We can do this by taking another point $\\mathrm{B}$ on the curve, not too far away, and then computing the slope between these two points:\n" ] }, { @@ -963,7 +962,7 @@ "source": [ "# Differentiability\n", "\n", - "Note that some functions are not quite as well-behaved as $x^2$: for example, consider the function $f(x)=|x|$, the absolute value of $x$:" + "Note that some functions are not quite as well-behaved as $x^2$. For example, consider the function $f(x)=|x|$, the absolute value of $x$:" ] }, { @@ -2231,7 +2230,7 @@ "& = \\underset{x_\\mathrm{B} \\to x_\\mathrm{A}}\\lim x_\\mathrm{B} \\, + \\underset{x_\\mathrm{B} \\to x_\\mathrm{A}}\\lim x_\\mathrm{A}\\quad && \\text{since the limit of a sum is the sum of the limits}\\\\\n", "& = x_\\mathrm{A} \\, + \\underset{x_\\mathrm{B} \\to x_\\mathrm{A}}\\lim x_\\mathrm{A} \\quad && \\text{since } x_\\mathrm{B}\\text{ approaches } x_\\mathrm{A} \\\\\n", "& = x_\\mathrm{A} + x_\\mathrm{A} \\quad && \\text{since } x_\\mathrm{A} \\text{ remains constant when } x_\\mathrm{B}\\text{ approaches } x_\\mathrm{A} \\\\\n", - "& = 2 x_\\mathrm{A}\n", + "& = 2x_\\mathrm{A} &&\n", "\\end{align*}\n", "$\n", "\n", @@ -2307,7 +2306,7 @@ "& = \\underset{\\epsilon \\to 0}\\lim\\dfrac{{x}^2 + 2x\\epsilon + \\epsilon^2 - {x}^2}{\\epsilon}\\quad && \\text{since } (x + \\epsilon)^2 = {x}^2 + 2x\\epsilon + \\epsilon^2\\\\\n", "& = \\underset{\\epsilon \\to 0}\\lim\\dfrac{2x\\epsilon + \\epsilon^2}{\\epsilon}\\quad && \\text{since the two } {x}^2 \\text{ cancel out}\\\\\n", "& = \\underset{\\epsilon \\to 0}\\lim \\, (2x + \\epsilon)\\quad && \\text{since } 2x\\epsilon \\text{ and } \\epsilon^2 \\text{ can both be divided by } \\epsilon\\\\\n", - "& = 2 x\n", + "& = 2x &&\n", "\\end{align*}\n", "$\n", "\n", @@ -2343,7 +2342,7 @@ "\n", "The $f'$ notation is Lagrange's notation, while $\\dfrac{\\mathrm{d}f}{\\mathrm{d}x}$ is Leibniz's notation.\n", "\n", - "There are also other less common notations, such as Newton's notation $\\dot y$ (assuming 
$y = f(x)$) or Euler's notation $\\mathrm{D}f$." + "There are other less common notations, such as Newton's notation $\\dot y$ (assuming $y = f(x)$) or Euler's notation $\\mathrm{D}f$." ] }, { @@ -4164,7 +4163,7 @@ "\n", "It is possible to chain many functions. For example, if $f(x)=g(h(i(x)))$, and we define $y=i(x)$ and $z=h(y)$, then $\\dfrac{\\mathrm{d}f}{\\mathrm{d}x} = \\dfrac{\\mathrm{d}f}{\\mathrm{d}z} \\dfrac{\\mathrm{d}z}{\\mathrm{d}y} \\dfrac{\\mathrm{d}y}{\\mathrm{d}x}$. Using Lagrange's notation, we get $f'(x)=g'(z)\\,h'(y)\\,i'(x)=g'(h(i(x)))\\,h'(i(x))\\,i'(x)$\n", "\n", - "The chain rule is crucial in Deep Learning, as a neural network is basically as a long composition of functions. For example, a 3-layer dense neural network corresponds to the following function: $f(\\mathbf{x})=\\operatorname{Dense}_3(\\operatorname{Dense}_2(\\operatorname{Dense}_1(\\mathbf{x})))$ (in this example, $\\operatorname{Dense}_3$ is the output layer).\n" + "The chain rule is crucial in Deep Learning, as a neural network is basically a long composition of functions. For example, a 3-layer dense neural network corresponds to the following function: $f(\\mathbf{x})=\\operatorname{Dense}_3(\\operatorname{Dense}_2(\\operatorname{Dense}_1(\\mathbf{x})))$ (in this example, $\\operatorname{Dense}_3$ is the output layer).\n" ] }, { @@ -4296,7 +4295,7 @@ "\n", "At each iteration, the step size is proportional to the slope, so the process naturally slows down as it approaches a local minimum. 
Each step is also proportional to the learning rate: a parameter of the Gradient Descent algorithm itself (since it is not a parameter of the function we are optimizing, it is called a **hyperparameter**).\n", "\n", - "Here is an animation of this process on the function $f(x)=\\dfrac{1}{4}x^4 - x^2 + \\dfrac{1}{2}$:" + "Here is an animation of this process for the function $f(x)=\\dfrac{1}{4}x^4 - x^2 + \\dfrac{1}{2}$:" ] }, { @@ -5253,8 +5252,6 @@ ], "source": [ "#@title\n", - "from mpl_toolkits.mplot3d import Axes3D\n", - "\n", "def plot_3d(f, title):\n", " fig = plt.figure(figsize=(8, 5))\n", " ax = fig.add_subplot(111, projection='3d')\n", @@ -5367,7 +5364,7 @@ "$\\nabla f(\\mathbf{x}_\\mathrm{A}) = \\begin{pmatrix}\n", "\\dfrac{\\partial f}{\\partial x_1}(\\mathbf{x}_\\mathrm{A})\\\\\n", "\\dfrac{\\partial f}{\\partial x_2}(\\mathbf{x}_\\mathrm{A})\\\\\n", - "\\vdots\\\\\\\n", + "\\vdots\\\\\n", "\\dfrac{\\partial f}{\\partial x_n}(\\mathbf{x}_\\mathrm{A})\\\\\n", "\\end{pmatrix}$" ] @@ -5407,7 +5404,7 @@ "source": [ "# Jacobians\n", "\n", - "Until now we have only considered functions that output a scalar, but it is possible to output vectors instead. For example, a classification neural network typically outputs one probability for each class, so if there are $m$ classes, the neural network will output an $d$-dimensional vector for each input.\n", + "Until now, we have only considered functions that output a scalar, but it is possible to output vectors instead. For example, a classification neural network typically outputs one probability for each class, so if there are $m$ classes, the neural network will output a $d$-dimensional vector for each input.\n", "\n", "In Deep Learning we generally only need to differentiate the loss function, which almost always outputs a single scalar number. But suppose for a second that you want to differentiate a function $\\mathbf{f}(\\mathbf{x})$ which outputs $d$-dimensional vectors. 
The good news is that you can treat each _output_ dimension independently of the others. This will give you a partial derivative for each input dimension and each output dimension. If you put them all in a single matrix, with one column per input dimension and one row per output dimension, you get the so-called **Jacobian matrix**.\n", "\n", @@ -5532,7 +5529,7 @@ "& = \\underset{\\epsilon \\to 0}\\lim\\dfrac{g(x+\\epsilon)h(x+\\epsilon) - g(x)h(x+\\epsilon)}{\\epsilon} + \\underset{\\epsilon \\to 0}\\lim\\dfrac{g(x)h(x + \\epsilon) - g(x)h(x)}{\\epsilon} && \\quad \\text{since the limit of a sum is the sum of the limits}\\\\\n", "& = \\underset{\\epsilon \\to 0}\\lim{\\left[\\dfrac{g(x+\\epsilon) - g(x)}{\\epsilon}h(x+\\epsilon)\\right]} \\,+\\, \\underset{\\epsilon \\to 0}\\lim{\\left[g(x)\\dfrac{h(x + \\epsilon) - h(x)}{\\epsilon}\\right]} && \\quad \\text{factorizing }h(x+\\epsilon) \\text{ and } g(x)\\\\\n", "& = \\underset{\\epsilon \\to 0}\\lim{\\left[\\dfrac{g(x+\\epsilon) - g(x)}{\\epsilon}h(x+\\epsilon)\\right]} \\,+\\, g(x)\\underset{\\epsilon \\to 0}\\lim{\\dfrac{h(x + \\epsilon) - h(x)}{\\epsilon}} && \\quad \\text{taking } g(x) \\text{ out of the limit since it does not depend on }\\epsilon\\\\\n", - "& = \\underset{\\epsilon \\to 0}\\lim{\\left[\\dfrac{g(x+\\epsilon) - g(x)}{\\epsilon}h(x+\\epsilon)\\right]} \\,+\\, g(x)h'(x) && \\quad \\text{using the definition of h'(x)}\\\\\n", + "& = \\underset{\\epsilon \\to 0}\\lim{\\left[\\dfrac{g(x+\\epsilon) - g(x)}{\\epsilon}h(x+\\epsilon)\\right]} \\,+\\, g(x)h'(x) && \\quad \\text{using the definition of }h'(x)\\\\\n", "& = \\underset{\\epsilon \\to 0}\\lim{\\left[\\dfrac{g(x+\\epsilon) - g(x)}{\\epsilon}\\right]}\\underset{\\epsilon \\to 0}\\lim{h(x+\\epsilon)} + g(x)h'(x) && \\quad \\text{since the limit of a product is the product of the limits}\\\\\n", "& = \\underset{\\epsilon \\to 0}\\lim{\\left[\\dfrac{g(x+\\epsilon) - g(x)}{\\epsilon}\\right]}h(x) + h(x)g'(x) && \\quad \\text{since } h(x) \\text{ 
is continuous}\\\\\n", "& = g'(x)h(x) + g(x)h'(x) && \\quad \\text{using the definition of }g'(x)\n", @@ -5620,7 +5617,7 @@ "& = \\underset{\\epsilon \\to 0}\\lim{\\left[\\dfrac{1}{\\epsilon} \\, \\ln\\left(1 + \\dfrac{\\epsilon}{x}\\right)\\right]} && \\quad \\text{just moving things around a bit}\\\\\n", "& = \\underset{\\epsilon \\to 0}\\lim{\\left[\\dfrac{1}{xu} \\, \\ln\\left(1 + u\\right)\\right]} && \\quad \\text{defining }u=\\dfrac{\\epsilon}{x} \\text{ and thus } \\epsilon=xu\\\\\n", "& = \\underset{u \\to 0}\\lim{\\left[\\dfrac{1}{xu} \\, \\ln\\left(1 + u\\right)\\right]} && \\quad \\text{replacing } \\underset{\\epsilon \\to 0}\\lim \\text{ with } \\underset{u \\to 0}\\lim \\text{ since }\\underset{\\epsilon \\to 0}\\lim u=0\\\\\n", - "& = \\underset{u \\to 0}\\lim{\\left[\\dfrac{1}{x} \\, \\ln\\left((1 + u)^{1/u}\\right)\\right]} && \\quad \\text{since }a\\ln(b)=\\ln(a^b)\\\\\n", + "& = \\underset{u \\to 0}\\lim{\\left[\\dfrac{1}{x} \\, \\ln\\left((1 + u)^{1/u}\\right)\\right]} && \\quad \\text{since }a\\ln(b)=\\ln(b^a)\\\\\n", "& = \\dfrac{1}{x}\\underset{u \\to 0}\\lim{\\left[\\ln\\left((1 + u)^{1/u}\\right)\\right]} && \\quad \\text{taking }\\dfrac{1}{x} \\text{ out since it does not depend on }\\epsilon\\\\\n", "& = \\dfrac{1}{x}\\ln\\left(\\underset{u \\to 0}\\lim{(1 + u)^{1/u}}\\right) && \\quad \\text{taking }\\ln\\text{ out since it is a continuous function}\\\\\n", "& = \\dfrac{1}{x}\\ln(e) && \\quad \\text{since }e=\\underset{u \\to 0}\\lim{(1 + u)^{1/u}}\\\\\n", @@ -5644,9 +5641,9 @@ "\n", "We know the derivative of the exponential: $g'(x)=e^x$. We also know the derivative of the natural logarithm: $\\ln'(x)=\\dfrac{1}{x}$ so $h'(x)=\\dfrac{r}{x}$. 
Therefore:\n", "\n", - "$f'(x) = \\dfrac{r}{x}\\exp\\left({\\ln(x^r)}\\right)$\n", + "$f'(x) = \\dfrac{r}{x} e^{\\ln(x^r)}$\n", "\n", - "Since $a = \\exp(\\ln(a))$, this equation simplifies to:\n", + "Since $e^{\\ln(a)} = a$, this equation simplifies to:\n", "\n", "$f'(x) = \\dfrac{r}{x} x^r$\n", "\n", @@ -5657,7 +5654,7 @@ "Note that the power rule works for any $r \\neq 0$, including negative numbers and real numbers. For example:\n", "\n", "* if $f(x) = \\dfrac{1}{x} = x^{-1}$, then $f'(x)=-x^{-2}=-\\dfrac{1}{x^2}$.\n", - "* if $f(x) = \\sqrt(x) = x^{1/2}$, then $f'(x)=\\dfrac{1}{2}x^{-1/2}=\\dfrac{1}{2\\sqrt{x}}$" + "* if $f(x) = \\sqrt{x} = x^{1/2}$, then $f'(x)=\\dfrac{1}{2}x^{-1/2}=\\dfrac{1}{2\\sqrt{x}}$" ] }, { @@ -5800,17 +5797,17 @@ "source": [ "The circle is the unit circle (radius=1).\n", "\n", - "Assuming $0 < \\theta < \\dfrac{\\pi}{2}$, the area of the blue triangle (area $\\mathrm{A}$) is equal to its height ($\\sin(\\theta)$), times its base ($\\cos(\\theta)$), divided by 2. So $\\mathrm{A} = \\dfrac{1}{2}\\sin(\\theta)\\cos(\\theta)$.\n", + "Assuming $0 < \\theta < \\dfrac{\\pi}{2}$, the area of the blue triangle (area $\\mathrm{A}$) is equal to its height ($\\sin(\\theta)$) times its base ($\\cos(\\theta)$) divided by 2. 
So $\\mathrm{A} = \\dfrac{1}{2}\\sin(\\theta)\\cos(\\theta)$.\n", "\n", "The unit circle has an area of $\\pi$, so the circular sector (in the shape of a pizza slice) has an area of A + B = $\\pi\\dfrac{\\theta}{2\\pi} = \\dfrac{\\theta}{2}$.\n", "\n", - "Next, the large triangle (A + B + C) has an area equal to its height ($\\tan(\\theta)$) multiplied by its base (1) divided by 2, so A + B + C = $\\dfrac{\\tan(\\theta)}{2}$.\n", + "Next, the large triangle (A + B + C) has an area equal to its height ($\\tan(\\theta)$) multiplied by its base (of length 1) divided by 2, so A + B + C = $\\dfrac{\\tan(\\theta)}{2}$.\n", "\n", "When $0 < \\theta < \\dfrac{\\pi}{2}$, we have $\\mathrm{A} < \\mathrm{A} + \\mathrm{B} < \\mathrm{A} + \\mathrm{B} + \\mathrm{C}$, therefore:\n", "\n", "$\\dfrac{1}{2}\\sin(\\theta)\\cos(\\theta) < \\dfrac{\\theta}{2} < \\dfrac{\\tan(\\theta)}{2}$\n", "\n", - "We can multiply all the terms by 2 to get rid of the $\\dfrac{1}{2}$ factors. We can also divide by $\\sin(\\theta)$, which is stricly positive (assuming $0 < \\theta < \\dfrac{\\pi}{2}$), so the inequalities still hold:\n", + "We can multiply all the terms by 2 to get rid of the $\\dfrac{1}{2}$ factors. We can also divide by $\\sin(\\theta)$, which is strictly positive (assuming $0 < \\theta < \\dfrac{\\pi}{2}$), so the inequalities still hold:\n", "\n", "$cos(\\theta) < \\dfrac{\\theta}{\\sin(\\theta)} < \\dfrac{\\tan(\\theta)}{\\sin(\\theta)}$\n", "\n", @@ -5843,7 +5840,7 @@ "\n", "$\\dfrac{1}{cos(\\theta)} > \\dfrac{\\sin(\\theta)}{\\theta} > \\cos(\\theta)$\n", "\n", - "assuming $-\\dfrac{\\theta}{2} < \\theta < \\dfrac{\\pi}{2}$ and $\\theta \\neq 0$\n", + "assuming $-\\dfrac{\\pi}{2} < \\theta < \\dfrac{\\pi}{2}$ and $\\theta \\neq 0$\n", "
\n", "\n", "Since $\\cos$ is a continuous function, $\\underset{\\theta \\to 0}\\lim\\cos(\\theta)=\\cos(0)=1$. Similarly, $\\underset{\\theta \\to 0}\\lim\\dfrac{1}{cos(\\theta)}=\\dfrac{1}{\\cos(0)}=1$.\n", @@ -5872,12 +5869,12 @@ "\\begin{align*}\n", "\\underset{\\theta \\to 0}\\lim\\dfrac{\\cos(\\theta) - 1}{\\theta} & = \\underset{\\theta \\to 0}\\lim\\dfrac{\\cos(\\theta) - 1}{\\theta}\\frac{\\cos(\\theta) + 1}{\\cos(\\theta) + 1} && \\quad \\text{ multiplying and dividing by }\\cos(\\theta)+1\\\\\n", "& = \\underset{\\theta \\to 0}\\lim\\dfrac{\\cos^2(\\theta) - 1}{\\theta(\\cos(\\theta) + 1)} && \\quad \\text{ since }(a-1)(a+1)=a^2-1\\\\\n", - "& = \\underset{\\theta \\to 0}\\lim\\dfrac{\\sin^2(\\theta)}{\\theta(\\cos(\\theta) + 1)} && \\quad \\text{ since }\\cos^2(\\theta) - 1 = \\sin^2(\\theta)\\\\\n", - "& = \\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\theta}\\dfrac{\\sin(\\theta)}{\\cos(\\theta) + 1} && \\quad \\text{ just rearranging the terms}\\\\\n", - "& = \\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\theta} \\, \\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\cos(\\theta) + 1} && \\quad \\text{ since the limit of a product is the product of the limits}\\\\\n", - "& = \\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\cos(\\theta) + 1} && \\quad \\text{ since } \\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\theta}=1\\\\\n", - "& = \\dfrac{0}{1+1} && \\quad \\text{ since } \\underset{\\theta \\to 0}\\lim\\sin(\\theta)=0 \\text{ and } \\underset{\\theta \\to 0}\\lim\\cos(\\theta)=1\\\\\n", - "& = 0\\\\\n", + "& = \\underset{\\theta \\to 0}\\lim\\dfrac{-\\sin^2(\\theta)}{\\theta(\\cos(\\theta) + 1)} && \\quad \\text{ since }\\cos^2(\\theta) - 1 = -\\sin^2(\\theta)\\\\\n", + "& = -\\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\theta}\\dfrac{\\sin(\\theta)}{\\cos(\\theta) + 1} && \\quad \\text{ just rearranging the terms}\\\\\n", + "& = -\\underset{\\theta \\to 
0}\\lim\\dfrac{\\sin(\\theta)}{\\theta} \\, \\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\cos(\\theta) + 1} && \\quad \\text{ since the limit of a product is the product of the limits}\\\\\n", + "& = -\\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\cos(\\theta) + 1} && \\quad \\text{ since } \\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\theta}=1\\\\\n", + "& = -\\dfrac{0}{1+1} && \\quad \\text{ since } \\underset{\\theta \\to 0}\\lim\\sin(\\theta)=0 \\text{ and } \\underset{\\theta \\to 0}\\lim\\cos(\\theta)=1\\\\\n", + "& = 0 &&\n", "\\end{align*}\n", "$\n", "\n", @@ -5911,7 +5908,7 @@ "\\begin{align*}\n", "f'(x) & = \\underset{\\theta \\to 0}\\lim\\dfrac{f(x+\\theta) - f(x)}{\\theta} && \\quad\\text{by definition}\\\\\n", "& = \\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(x+\\theta) - \\sin(x)}{\\theta} && \\quad \\text{using }f(x) = \\sin(x)\\\\\n", - "& = \\underset{\\theta \\to 0}\\lim\\dfrac{\\cos(x)\\sin(\\theta) + \\sin(x)\\cos(\\theta) - \\sin(x)}{\\theta} && \\quad \\text{since } cos(a+b)=\\cos(a)\\sin(b)+\\sin(a)\\cos(b)\\\\\n", + "& = \\underset{\\theta \\to 0}\\lim\\dfrac{\\cos(x)\\sin(\\theta) + \\sin(x)\\cos(\\theta) - \\sin(x)}{\\theta} && \\quad \\text{since } \\sin(a+b)=\\cos(a)\\sin(b)+\\sin(a)\\cos(b)\\\\\n", "& = \\underset{\\theta \\to 0}\\lim\\dfrac{\\cos(x)\\sin(\\theta)}{\\theta} + \\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(x)\\cos(\\theta) - \\sin(x)}{\\theta} && \\quad \\text{since the limit of a sum is the sum of the limits}\\\\\n", "& = \\cos(x)\\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\theta} + \\sin(x)\\underset{\\theta \\to 0}\\lim\\dfrac{\\cos(\\theta) - 1}{\\theta} && \\quad \\text{bringing out } \\cos(x) \\text{ and } \\sin(x) \\text{ since they don't depend on }\\theta\\\\\n", "& = \\cos(x)\\underset{\\theta \\to 0}\\lim\\dfrac{\\sin(\\theta)}{\\theta} && \\quad \\text{since }\\underset{\\theta \\to 0}\\lim\\dfrac{\\cos(\\theta) - 1}{\\theta}=0\\\\\n", diff --git 
a/math_linear_algebra.ipynb b/math_linear_algebra.ipynb index c9f1312..629b175 100644 --- a/math_linear_algebra.ipynb +++ b/math_linear_algebra.ipynb @@ -6,7 +6,7 @@ "source": [ "**Math - Linear Algebra**\n", "\n", - "*Linear Algebra is the branch of mathematics that studies [vector spaces](https://en.wikipedia.org/wiki/Vector_space) and linear transformations between vector spaces, such as rotating a shape, scaling it up or down, translating it (ie. moving it), etc.*\n", + "*Linear Algebra is the branch of mathematics that studies [vector spaces](https://en.wikipedia.org/wiki/Vector_space) and linear transformations between vector spaces, such as rotating a shape, scaling it up or down, translating it (i.e. moving it), etc.*\n", "\n", "*Machine Learning relies heavily on Linear Algebra, so it is essential to understand what vectors and matrices are, what operations you can perform with them, and how they can be useful.*" ] @@ -33,7 +33,7 @@ "## Definition\n", "A vector is a quantity defined by a magnitude and a direction. For example, a rocket's velocity is a 3-dimensional vector: its magnitude is the speed of the rocket, and its direction is (hopefully) up. A vector can be represented by an array of numbers called *scalars*. Each scalar corresponds to the magnitude of the vector with regards to each dimension.\n", "\n", - "For example, say the rocket is going up at a slight angle: it has a vertical speed of 5,000 m/s, and also a slight speed towards the East at 10 m/s, and a slight speed towards the North at 50 m/s. The rocket's velocity may be represented by the following vector:\n", + "For example, say the rocket is going up at a slight angle: it has a vertical speed of 5,000 m/s, and also a slight speed towards the East at 10 m/s, and a slight speed towards the North at 50 m/s. 
The rocket's velocity may be represented by the following vector:\n", "\n", "**velocity** $= \\begin{pmatrix}\n", "10 \\\\\n", @@ -41,9 +41,9 @@ "5000 \\\\\n", "\\end{pmatrix}$\n", "\n", - "Note: by convention vectors are generally presented in the form of columns. Also, vector names are generally lowercase to distinguish them from matrices (which we will discuss below) and in bold (when possible) to distinguish them from simple scalar values such as ${meters\\_per\\_second} = 5026$.\n", + "Note: by convention vectors are generally presented in the form of columns. Also, vector names are usually lowercase to distinguish them from matrices (which we will discuss below) and in bold (when possible) to distinguish them from simple scalar values such as ${meters\\_per\\_second} = 5026$.\n", "\n", - "A list of N numbers may also represent the coordinates of a point in an N-dimensional space, so it is quite frequent to represent vectors as simple points instead of arrows. A vector with 1 element may be represented as an arrow or a point on an axis, a vector with 2 elements is an arrow or a point on a plane, a vector with 3 elements is an arrow or point in space, and a vector with N elements is an arrow or a point in an N-dimensional space… which most people find hard to imagine.\n", + "A list of N numbers may also represent the coordinates of a point in an N-dimensional space, so it is quite frequent to represent vectors as simple points instead of arrows. 
A vector with 1 element may be represented as an arrow or a point on an axis, a vector with 2 elements is an arrow or a point on a plane, a vector with 3 elements is an arrow or a point in space, and a vector with N elements is an arrow or a point in an N-dimensional space… which most people find hard to imagine.\n", "\n", "\n", "## Purpose\n", @@ -203,7 +203,7 @@ "metadata": {}, "source": [ "### 2D vectors\n", - "Let's create a couple very simple 2D vectors to plot:" + "Let's create a couple of very simple 2D vectors to plot:" ] }, { @@ -306,7 +306,7 @@ "metadata": {}, "source": [ "### 3D vectors\n", - "Plotting 3D vectors is also relatively straightforward. First let's create two 3D vectors:" + "Plotting 3D vectors is also relatively straightforward. First, let's create two 3D vectors:" ] }, { @@ -345,8 +345,6 @@ } ], "source": [ - "from mpl_toolkits.mplot3d import Axes3D\n", - "\n", "subplot3d = plt.subplot(111, projection='3d')\n", "x_coords, y_coords, z_coords = zip(a,b)\n", "subplot3d.scatter(x_coords, y_coords, z_coords)\n", @@ -470,7 +468,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Let's plot a little diagram to confirm that the length of vector $\\textbf{v}$ is indeed $\\approx5.4$:" + "Let's plot a little diagram to confirm that the length of vector $\\textbf{u}$ is indeed $\\approx5.4$:" ] }, { @@ -480,7 +478,7 @@ "outputs": [ { "data": { - "image/png": 
"<base64 PNG data omitted: output image for the vector-length plot (confirming the length of vector u is ≈5.4), rendered with Matplotlib 3.4.3>\n",
+      "image/png": 
"<base64 PNG data omitted: the same plot re-rendered with Matplotlib 3.5.2>\n",
X0CAr5UqlEuPj46FjiFxC5YoGWWlXuQes1l8lSVSuaJDVCNyd8fFxlpaWQkcRAVSuGmQ1kMr6q7ZnSRJULVcz6zCzX5nZb83sdTP7ej2C1YsGWY2lWCxy/vz50DFEIl25LgC3u/tHgZuBu8zsk7GmqhcNshqOuzM1NaXtWRJc1XL1Ffnyh63lt4b4vUuDrMZUuf+AlgckJIvyBDSzZuAkcAh41N3/dp3HHAWOAvT09Nza399f46jbVygU6OjouOzzba++ire1sXTjjYnIE0rS8kBtMjU3N9Pc3FyTPPl8nlwuV5M/qxaUZ3P1ynPkyJGT7n54va9FKteLDzbbCzwDPODur230uN7eXj9x4sRWc8ZmcHCQQ4cOXfK51rfe4po77+T088/Xfb11vTwhJS0P1CaTmXH11VfT1ta24zwDAwP09fXt+M+pFeXZXL3ymNmG5bql3QLufgEYAO7aeaywNMhqfFoekJCi7BboKV+xYmadwO8Db8ScK14aZGWGdg9IKC0RHnMAeKK87toE/NDdn403Vrw0yMqOyu6BXC5Xk+UBkaiqlqu7/zfwsTpkqRudyMoWd2dsbIyrrroK079zqZPMndDSiaxsWlxcZHZ2NnQMyZDMlasGWdlUufeAbk0o9ZKtctUgK9Mqrx4rUg+ZKlcNsrLN3ZmentbRWKmLTJWrBllSGW5p76vELTPlqkGWVCwuLjI3Nxc6hjS4zJSrBllSUbl61XBL4pSNcnXXIEsuUVl/FYlLJsq1eWwM0CBL/l9l54CuXiUu2SjXkRENsmRd2polcWn4ctUgSzZSWRooFouho0gDavhy1SBLNuPuTE5Oho4hDaixy7V8Iqu0f3/oJJJgs7OzekluqbmGLtfKiazSlVcGTiJJ5u5MTEyEjiENpqHLVSeyJKr5+XkWFhZCx5AG0rDlqkGWbIXWXqXWGrZcNciSrSoUCrqpi9RMY5arbi0o26BbEkotNWS56taCsl1zc3Pa9yo10ZDluu/hh1m65hoNsmTLdPUqtdJw5XpxkPXEE4GTSFrl83lKpVLoGJJyDVeuGmTJTrk7Fy5cCB1DUq6xylWDLKmR6elp3TFLdqShylWDLKmlqamp0BEkxRqqXDXIklpxd6ampvRaW7JtDVOuGmRJrbk78/PzoWNISjVMuWqQJbWmwZbsRNVyNbNrzOwFMztlZq+b2YP1CLYlGmRJTBYWFnQ7QtmWlgiPKQJ/7e4vm9lu4KSZPefuv4s5W2QaZElcKmuvIltV9crV3c+5+8vl92eAU8DVcQfbCg2yJE4zMzOhI0gK2VamoWZ2HfAicJO7T6/52lHgKEBPT8+t/f39NYy5SaaFBVrffJPF3l5ob1/3MYVCgY6OjrrkiUJ5qktapoWFBXbv3h06xkX5fJ5cLhc6xkVZzXPkyJGT7n54va9FWRYAwMxywI+Ar6wtVgB3PwYcA+jt7fVDhw5tM+7WXPvpT9MyMsLb5d0C6xkcHKReeaJQnuqSlumtt96ir68vdIyLBgYGlGcTScgTabeAmbWyUqxPufuP4420BRpkSZ24u16pQLYkym4BAx4HTrn7N+OPFJ0GWVJPWnuVrYhy5XobcB9wu5m9Un77w5hzRaJBltRTPp/XiS2JrOqaq7v/Akhce+lEltSbu1MoFOjs7AwdRVIgtSe0dCJL6s3dmZ6+bJYrsq50lqsGWRLI3NycbkUokaSyXDXIkpDm5uZCR5AUSGW5apAloWhpQKJKXblqkCWhFQoFvcaWVJW6ctUgS0IzM2ZnZ0PHkIRLV7lqkCUJ4O7k8/nQMSThUlWuGmRJUhQKBe0akE2lqlw1yJKkMDO9BIxsKjXlqkGWJImWBqSa1JSrBlmSNHNzc7rXgGwoHeWqQZYklG5DKBtJRblqkCVJ5O7akiUbSkW5apAlSaV1V9lI4stVgyxJsuXlZb30tqwr8eWqQZYknbZkyXqSXa4aZEnCubvukiXrSnS5apAlaVAoFLQlSy6T6HLVIEv
SwN0pFouhY0jCJLZcNciSNNG6q6yV2HLVIEvSQuuusp5klqsGWZIyWneVtRJZrhpkSdpo3VXWSmS5apAlaaR1V1ktceWqQZakkburXOUSiStXDbIkrXSHLFktWeWqQZakWLFY1Eu/yEVVy9XMvmNmo2b2WtxhNMiSNDMzFhcXQ8eQhIhy5fo94K6YcwAaZEn6qVylomq5uvuLwGTcQTTIkrTTUEtWS8yaqwZZ0gg01JIKi3KqxMyuA55195s2ecxR4ChAT0/Prf39/dFTuNP22muU9u+n1NMT/fsiKhQKdHR01PzP3S7lqS5pmbaSp62tLeY0K6+AkMvlYv85UWU1z5EjR066++H1vtZSqx/i7seAYwC9vb1+6NChyN+791vfYt/DD/P24GAs662Dg4NsJU/clKe6pGWKmsfMOHDgQOz/YxgYGKCvry/Wn7EVynO5RCwLaJAljURDLYFoW7GeBv4TuNHMhszsT2sZQIMsaSTurnIVIMKygLvfE2cADbKk0ahcBUIvC+hEljQgvRqsQOBy1YksaUSlUkn3dpWw5apBljQiM9O9XSVcuWqQJY1MSwMSrFw1yJJG5e4qVwlUrhpkSYPTMVgJUq4aZEmj03YsCVKuGmRJoyuVSqEjSGB1L1cNsiQL9IoEUvdy1SBLssDdVbAZV99y1SBLMsLMtDSQcXUtVw2yJCt0kEDqWq4aZElWuLuuXDOubuWqQZZkicpV6lauGmRJ1uiUVrbVp1w1yJIM0pprttWlXDXIkizSskC21aVcNciSLNI+12yLvVw1yJKs0g2zsy32ctUgS7JKV67ZFm+5apAlGaYr12yLtVw1yJIsU7lmW6zlqkGWZJ0KNrtiK1cNskS07pplsZWrBlmSdWamcs2w2MpVgywRLQtkWSzlauVjfxpkSdapXLMrnivXpSUNsiTzTM//TItUrmZ2l5n9j5kNmtnfRfkeDbIk69xdV64ZVrVczawZeBT4HPBh4B4z+3C179MgS0SyLMqV68eBQXd/290XgR8Af7Tpd7S21iCaSLppWSDbWiI85mrgzKqPh4BPrH2QmR0FKhOshQ9+8IOv7TxezVwJjIcOsYryVJe0TMqzuazm2fBX9Cjlut7/fi9bSHL3Y8AxADP7tbsfjhwvZsqzuaTlgeRlUp7NKc/loiwLDAHXrPr4IDAcTxwRkcYQpVxfAj5kZtebWRvwReDf4o0lIpJuVZcF3L1oZn8B/DvQDHzH3V+v8m3HahGuhpRnc0nLA8nLpDybU541TPvwRERqr24vrS0ikiUqVxGRGNS0XLdzTDZOZvYdMxs1s0TsuTWza8zsBTM7ZWavm9mDgfN0mNmvzOy35TxfD5mnwsyazew3ZvZsArK8Y2avmtkrZvbr0HkAzGyvmR03szfKz6VPBcxyY/nvpvI2bWZfCZWnnOkvy8/n18zsaTPrCJKjVmuu5WOybwJ/wMr2rZeAe9z9dzX5AdvL9BkgDzzp7jeFyrEqzwHggLu/bGa7gZPAH4f6O7KVI0S73D1vZq3AL4AH3f2/QuRZleuvgMNAt7vfHTjLO8Bhd0/MBnkzewL4D3d/rLyDp8vdLwSOVemAs8An3P3dQBmuZuV5/GF3nzezHwI/cffv1TtLLa9ct35MNmbu/iIwGTLDau5+zt1fLr8/A5xi5QRcqDzu7vnyh63lt6ATTjM7CHweeCxkjqQys27gM8DjAO6+mIRiLbsDeCtUsa7SAnSaWQvQRaB9+bUs1/WOyQYrjqQzs+uAjwG/DJyj2cxeAUaB59w9aB7gYeCrQFJu4e/AT83sZPmId2g3AGPAd8tLJ4+Z2a7Qocq+CDwdMoC7nwX+GTgNnAOm3P2nIbLUslwjHZMVMLMc8CPgK+4+HTKLu5fc/WZWTt593MyCLZ+Y2d3AqLufDJVhHbe5+y2s3BXuz8tLTSG1ALcA/+LuHwNmgSTMN9qALwD/GjjHFaz8xnw9cBWwy8zuDZGlluWqY7IRlNc2fwQ85e4
/Dp2novyr5QBwV8AYtwFfKK9z/gC43cz6A+bB3YfL/xwFnmFl+SukIWBo1W8Yx1kp29A+B7zs7iOBc/w+8L/uPubuS8CPgU+HCFLLctUx2SrKA6THgVPu/s0E5Okxs73l9ztZeWK+ESqPuz/k7gfd/TpWnj/Pu3uQqw4AM9tVHjxS/tX7TiDozhN3fw84Y2Y3lj91BxBsaLzKPQReEig7DXzSzLrK/73dwcpso+6i3BUrkm0ek42VmT0N9AFXmtkQ8A/u/njASLcB9wGvltc5Af7e3X8SKM8B4InylLcJ+KG7B9/+lCDvB54p35e1Bfi+u58IGwmAB4CnyhcxbwP3hwxjZl2s7BL6s5A5ANz9l2Z2HHgZKAK/IdBRWB1/FRGJgU5oiYjEQOUqIhIDlauISAxUriIiMVC5iojEQOUqIhIDlauISAz+D/IrBQWdSj4rAAAAAElFTkSuQmCC\n", "text/plain": [ "
" ] @@ -774,9 +772,9 @@ "metadata": {}, "source": [ "## Zero, unit and normalized vectors\n", - "* A **zero-vector ** is a vector full of 0s.\n", + "* A **zero-vector** is a vector full of 0s.\n", "* A **unit vector** is a vector with a norm equal to 1.\n", - "* The **normalized vector** of a non-null vector $\\textbf{u}$, noted $\\hat{\\textbf{u}}$, is the unit vector that points in the same direction as $\\textbf{u}$. It is equal to: $\\hat{\\textbf{u}} = \\dfrac{\\textbf{u}}{\\left \\Vert \\textbf{u} \\right \\|}$\n", + "* The **normalized vector** of a non-null vector $\\textbf{v}$, noted $\\hat{\\textbf{v}}$, is the unit vector that points in the same direction as $\\textbf{v}$. It is equal to: $\\hat{\\textbf{v}} = \\dfrac{\\textbf{v}}{\\left \\Vert \\textbf{v} \\right \\|}$\n", "\n" ] }, @@ -787,7 +785,7 @@ "outputs": [ { "data": { - "image/png": "iVBORw0KGgoAAAANSUhEUgAAAVQAAAD4CAYAAACzOx6UAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjQuMywgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/MnkTPAAAACXBIWXMAAAsTAAALEwEAmpwYAAAcNklEQVR4nO3deXxV9Z3/8dfn5mYjwbCFAAKCgA6KAgYXxvorOKjUtvqwM7Uu7bSjDo7VqdaNTufh+FPrWKt16riNa7WtFe0I1VatisJYWx0liCCiRVGQRYLsSch2z3f+OFFZEpKQ78255+T9fDzyyHIv3/smkHe+55zvOcecc4iISNelog4gIpIUKlQREU9UqCIinqhQRUQ8UaGKiHiSjuJFBwwY4EaMGOF93NraWkpKSryPmw1xygrxyhunrBCvvHHKCtnJW1VV9YlzrrzVB51z3f5WWVnpsmHevHlZGTcb4pTVuXjljVNW5+KVN05ZnctOXmCBa6PbtMkvIuKJClVExBMVqoiIJypUERFPVKgiIp6oUEVEPFGhioh4okIVEfFEhSoi4okKVUTEExWqiIgnKlQREU9UqCIinqhQRUQ8UaGKiHiiQhUR8USFKiLiSZcL1cyKzOw1M3vTzJaa2TU+gomIxI2Pe0o1AMc752rMLB942cyecc696mFsEZHY6HKhttxjpabl0/yWN9fVcUVE4sbCPuziIGZ5QBUwGrjDOTezlefMAGYAVFRUVM6aNavLr7u7mpoaSktLvY+bDXHKCvHKG6esEK+8ccoK2ck7derUKufcpFYfbOvuffvyBvQB5gHj9vY83fU0Xlmdi1feOGV1Ll5545TVuZjf9dQ5twWYD0z3Oa6ISBz4OMpfbmZ9Wj4uBqYB73R1XBGRuPFxlH8w8FDLftQU8Jhz7vcexhURiRUfR/kXAxM9ZBERiTWdKSUi4okKVUTEExWqiIgnKlQREU9UqCIinqhQRUQ8UaGKiHiiQhUR8USFKiLiiQpVRMQTFaqIiCcqVBERT1SoIiKeqFBFRDxRoYqIeKJCFRHxRIUqIuKJClVExBMVqoiIJypUERFPVKgiIp6oUEVEPFGhioh4okIVEfFEhSoi4okKVUTEExWqiIgnKlQREU9Uq
CIinqhQRUQ8UaGKiHiiQhUR8USFKiLiiQpVRMSTLheqmQ0zs3lmtszMlprZxT6CiYjETdrDGM3AZc65hWbWG6gys+edc297GFtEJDa6PEN1zq1zzi1s+Xg7sAzYv6vjiojEjdd9qGY2ApgI/K/PcUVE4sCcc34GMisF/ge43jk3u5XHZwAzACoqKipnzZrl5XV3VlNTQ2lpqfdxsyFOWSFeeeOUFeKVN05ZITt5p06dWuWcm9Tqg865Lr8B+cCzwKUdeX5lZaXLhnnz5mVl3GyIU1bn4pU3Tlmdi1feOGV1Ljt5gQWujW7zcZTfgPuBZc65W7o6nohIXPnYh3os8C3geDNb1PJ2sodxRURipcvLppxzLwPmIYuISKzpTCkREU9UqCIinqhQRUQ8UaGKiHiiQhUR8USFKiLiiQpVRMQTFaqIiCcqVBERT1SoIiKeqFBFRDxRoYqIeKJCFRHxRIUqIuKJClVExBMVqoiIJypUERFPVKgiIp6oUEVEPFGhioh4okIVEfFEhSoi4okKVUTEExWqiIgnKlQREU9UqCIinqhQRUQ8UaGKiHiiQhUR8USFKiLiiQpVRMQTFaqIiCcqVBERT1SoIiKeqFBFRDzxUqhm9oCZVZvZWz7GExGJI18z1AeB6Z7GEhGJpbSPQZxzL5nZCB9jJU1dJsPahgbWNTaytrGRbc3NNDlHeVMTd69dS3EqxaCCAgYXFDCksJB+6TRmFnVs2UcvvwwlJTBxIjzwAHzjG7B5M7z+Opx2Grz4IlRUwKGHwl13wZe+FHVi8cmcc34GCgv19865cW08PgOYAVBRUVE5a9YsL6+7s5qaGkpLS72P21FNzlGbyVCbybA9k2FHEBAQbgYYsPN3ekgmw9q8PGh5DCBoeV9gRmleHqV5efTKy6NXKvpd3VF/bzsjiqzPPTeQ224bQyZjmEE67aivT1FYGBAE0NwcftzU9PnjTU3GwQdv57rr/qTvbZZkI+/UqVOrnHOTWnvMywy1I5xz9wD3AEyaNMlNmTLF+2vMnz+fbIzblsA5Xt++ndkbNvBodTXrm5ooSKepyWQ+K8e23FxTw+Xt/EP3SqXIM8MB0/v144yBAzmxb196p7vtn+0z3f297Yoosh5+OLz/Pjz22K5fb2zM++zjhobPP540CZ57DsrK+vLSS6X63mZJd+ft/p/MBPiovp7b16zhnnXryDjHjkyG5pbH6j2+Tl3weS3/94YNPLtpEw1BwLS+fbli2DC+2KePdg/kiH79YPLkPQt1d+k0XHEFHHkk9O3bPdmk+6hQO8g5x4tbtvDjVav445YtOKDR0+6SjtqeyQDwzKZNvLR1K/3Saa4YNox/GDyYkry8dv60ZFteHvTqBXV1rT/ety+ceSb8+793by7pPr6WTT0CvAIcbGarzexcH+Pmile2bmVSVRWnLlnC3M2baXCu28t0Zw6oyWRY1dDAzBUrGPrnP3PH6tU0Be3taMhdxx8Pw4fv+fVVq8AMrrmm+zN1xPr1sHUrTJ0K3/senHpq68876SRYtAhuuaVb40k381KozrkznXODnXP5zrmhzrn7fYwbtXdqaznhzTeZ9uabLKypoTYHC6suCNiSyTBzxQqGv/oqs9avx9eBxu70xhtQWbnn16uqwvcTJ3Zvnr1Zvhyefx5eeAEGDQqP3N97L3zyCex+/DA/Pzy6f+WV4S+MwsJoMkv30CZ/K5qDgBtWreKGVatoaDlSn+tqg4DaxkbOe/dd7ly7ll+PHcvQoqKoY3XIihWwZUtuF+rGjXDDDfD3fw/nnAONjbBwIdTX71qS//iP8MQTUFMDpaVw++3w7W9Hl1u6lwp1N0trazl96VJW1tezIwdnpO2pDQJe2baNsa+9xs/GjOGcQYNy/sDVp6XZVqH27w/DhnVvpk/98IfhOtJvfhMefjgsxwULPn989wUXX/wiHHVUWKZXXBF+LD1H9Ascc8jda9dyZFUVy+rqcnLzvqOanaMmCLh4+XKmL
17M9ubm9v9QhBYuDN+3VajdOTvNZMLXNAs36wsLYfRoOPZYWLcODjus/TGGD4ebb4YvfAEKCrKfWXKHZqhAUxBw4fLlPLx+fSxnpW2pDQJe2rKF8QsWMHf8eA4sLo46UqsWLoShQ2HgwF2/vnw5bNgAEyZk9/U3bAhnlIccAkEQ7hu99dZwGdQJJ3R+vJ//3H9GiYceX6gbm5r48uLFLKmt3WXdZ1LUO8fK+nomLljAb8eNY2oOLn58++3wVMzd/frX4ftszFDffTc8Ov/hh+HpoS+8AHPmhLsW+vcPj9iLdFaPLtR1DQ1MXriQdY2NkS6DyrYA2JbJ8OUlS3h47FhOKy+POtIuamvDgzg7e+UV+PGPw499FWp1dbhs6Z/+CY44ItzfOXv2ngeWRPZVjy3UNQ0NHF1VxfqmJpoTXKY72xEEnL1sGT8PAr5RURF1nM8ccww88wycey6MHx+u13zqKRgzJjyd86CDujb+hRfC/vvDuHFw//3hkfiamnA/qYhPPfKg1PrGRiYvXJicMl21Ch58MHzfjh1BwD+8+y5PfPJJ9nN10F13hQvfH30Urr8enAuPpAdBeBCosyeBOQcvvRQW5htvhEucysvhlFPC/aWjRqlMJTt63Ax1RybDlEWLWNfQQG4f++4g5+Cmm8LFnAsXhkdT2mmLHUHAmW+/zYvjx3NMWVk3BW3bAQfAH/6w59ff6sTlyjduhLKy8NTPww47nPvvh5/+NJzd3nuvv6wie9OjZqjOOc56+21W1tcno0wBnn4aPv44bI116+DZZzv0x3YEAScvWcLqep+Xc+ley5fD0qVwzz0wYAAsWRIeXLr66reZMAEuvTS8NqlId+lRhfqjlSt5bvPm5CyN2ro1LNKZM2HIkPD8xrvvhm3bOvTHt2cynLB4MTtaLroSB+vXw1VXhQeYDjoonJCfdRbs2BEevDruONhvv8T8upSY6TGb/HM3beKGVauSU6YQbuP+9reff37kkeHanw5qbllSdc477/BIa+uWckRzM1xwQXjWUV4e3Hln+HkQaF+o5JYeMUPd1tzMmcuWJatMPdkRBDy5cSNPbdwYdZRdOBfuvSgshDVrwreysvA8+o0bwwm5ylRyTY8o1IuWL6cmx0+/jFJdEPCtZcvY3NQUaY5t28I1oWbh0qaSErjuuvAMqqefhtNPjzSeSLsSX6jPb9rE4xs2UJ+E5VE7u/TS8BSf3a1fH16c86GHOjVcXSbD+X/5i6dwHffBB+Fa0xtvDGeg69bB7373+bnwV14JOXrGrMgeEl2oTUHAd955J5GnlLJ8eesr3j8txdGjOzVcg3M8tXEjf9q61UO4vVuzJizQ9evhwAPhl7/8fLH9yJHwla9Anz5ZjyHiXaIPSt2/bh1bk7ipv3Zt2D57K9QxYzo9bF0QcNHy5SysrPR+yb8dO+D734e/+ZtwyeyPfgTnnx8ecNLdWyQpEjtDrctk+OEHH8T6Mnxt+rQ02yrU/fbb89JNHbS8ro6nN23qQrhd/eY3YZRt2+DVV8PN+pkzYfv2cBaqMpUkSWyh/mz1ahqSWKYQbu5D24W6D7PTT9UGAf+8fDnBPu5zrqsLj8KbwdVXhweWLr44vEHdokVw4on7HE0k5yVyk78pCLjpo4+Sue8UwtIsL9/zPsSrV4f3Eunk/tPdbWhq4oXNmzmhX78OPf+jj8J7Kf3Hf4Sne27YAL/6FXz1q+Fk+eSTuxRHJDYSOUN9cuNGMkk7qr+zlSthxIg9v/7CC+H7LhZqTSbDTz76aK/P+fDD8CStFSvCK9TPng2XXx5u2g8YAGefHZapSE+SyBnqjatWfXYP+0Sqrw+P8uxs6dLPr8jchU3+T728dSsf1dczbKcb/W3fHt5j6atfDS+398AD4Wy0sTG8u6dIT5e4Geo7tbW8VVsbdYzsOuSQ8FJMP/kJPP54uAbpqqvC+4gUFYXvuyhwjjvWrAHCywOcd14lW7aE1yktKQk377duDe+ZpDIVC
SVuhvpodXWyN/cBLrkkvPT8vHnhofOjj4b/+q/w8PnIkXseOp8zJ7xAaEcF+TQ29efGZc8w7PbwliDHHfcJQ4b0ZsUKr38TkURJXKHOqq5O9O1MABg0KJyd7q6tu8MtWRIeYt/7oEAjcBtwFtgA8qbP4a+/fhITB/Zi0KCV5OWN7FJskaRL1CZ/k3N8EOPre2bN9OnhlZf3cCBwGlAJrANOhvRV0H803Hsj+f/Snxfqc+fK/iK5LlGFurW5mbQuQbSnI44Ir3UHQB/gFuAo4CbgP4EqoBCK/huOGQK/+E8YNYr6IODh6upoMovEUKIKdVsmk8wzo7pq9WqovwSYDewPfB1oBv4WGBY+p9Dgu9+Fa6/dZTa7tLaWRn1PRTokUftQa5O8VKqz3l8J186CVc8QFmg5pN6AYCmflSiEh+jLysJ7No8atccwRakUS5O+akLEk8TMUGszGZqSfjCqPYs/hnMvgamFcN63YdViOPg2eOQ78PzXoOCmXZ9fVBReBv+hh1otU4CMc1Rt35716CJJkJgZ6qKamuT8duiMVzfBbXNg7XrgOeBhmHw3XLAZhj2863MnTw6XWkF4KfwLLghX6e9lv3NdEPCnrVvp2rlXIj1DYgr1nbo6esz8dMk6+Lda2PJT4DfA9+D4q+CC38GAy9r+cyedFK5H7dsXbrihw6eoLqqp4dtegoskW2IKdU1DA4m+Y/CHH8L3m2BLHeHBpcfgmFfhB++H+0D5QftjVFaGl8D/whfaWEbVuvUR3xpFJC4SU6gf7NjBuKhD+LbiQ7j6SVg9G5gCTIOD+sFPLoWyZcA3OzdeOr1P18/bpEIV6RAvhWpm04FbgTzgPufcj32M2xkrGxqSUahL1sGt18P7zwKDgethzJ1w7eUwqDSSSBlAC6dE2tflQjWzPOAO4ARgNfC6mT3pnHu7q2N3RnVjY3e+3L6bOxfuu4/Lq6vDS9mfdx6UHAF3/QI+KgQeBe6EI6+Dfz4Fht0bdWIKzWjWWlSRdvmYoR4FvOecWwFgZrOAU4FuLdRYLJmaOze8nWdDAzAQ1l8M198JXAMYTLkXvvs7KL8u4qC7Spnh4vD9FYmYj5VG+wM7X414dcvXulVzHH7g77uvpUwBJgPnAfkwYDDM6wtXXwnlvSMM2LYYfHdFIudjhtraIsY9fv7MbAYwA6CiooL58+d7eOnPfb+2lvJMhptraryO69Plu5wX/0TLG7DRcjp3Cmh0zvu/WbbU1NTEJivEK2+cskL35/VRqKvZ5VxGhgJrd3+Sc+4e4B6ASZMmuSlTpnh46c9d+NprnFNdzeWl0Ry46ZCBA8Ob0bfy9VzOXZJK8Xgmg+9/s2yZP39+bLJCvPLGKSt0f14fm/yvA2PMbKSZFQBnAE96GLdT+sfhsvHnnReeobSzwsLw6zms0TldxUukA7o8Q3XONZvZRcCzhMumHnDOLe1ysk4avntR5aJp08L3990HOx/l//TrOSyv/aeI9Hhe1qE6554GnvYx1r4aVVwc5ct33LRpMG0aN9fU5PRm/s76ptOgxf0i7UrM9USGFBYm5y+TYwYWFEQdQSQWEtNBY4qLW11uIF13aCfO+xfpyRJTqEeUlur0yCwoNuO4Pn2ijiESC4kp1D75+ToSnQX5qRSVMdnXKxK1xBQqQK9Uov46OaEuCBivQhXpkEQ10H7pNMUqVa8OLCqiOE+LpkQ6IlHt0yedmMu75oQCM84YODDqGCKxkahCLTCjPA5nTMVEvhmnDRgQdQyR2EhUoQKcPnBgcm5DELHCVEr7T0U6IXGF+o3ycgq1H7XL0sDp5eWYVk6IdFjimqeyd2/2j8N5/TkuP5Xie0OHRh1DJFYSV6hmxszhwynVLLVLDi0pYWxJou8jK+JdIlvnjIEDdYX5Luidl8fMYcPaf6KI7CKRhdorL4/zhwyhSLPUfVKUSnGqju6LdFpiG+dfDzhA1/DcByWpFD8dNYp8/TIS6bTE/tT0y89n5vDhOh21k
8oLCjiroiLqGCKxlOi2uWzYMApUqB1Wkkpx2+jR5GmplMg+SXTb9MrL47bRoylRqbYrbcbE3r35cv/+UUcRia3EN83ZFRVM3m8/dELq3hWa8cjYsVrIL9IFiS9UM+MXY8dSpCsmtakkleJno0cztKgo6igisZb4QgUYXFjIXWPG6ABVKwrMqOzdm3MHD446ikjs9ZiGOXvQIM6qqFCp7sSAfuk0s8eN06a+iAc9ql3uHDOGcSUlFKg8gPAOB3MnTKC/Lnko4kWPKtT8VIqnDz+cful0j79DanEqxa8POYRDdb6+iDc9qlAB+ufnM3/iRMp68NX9e6VS3DByJKfo9FIRr3pcoQIc3KsXf5wwgbK8vB43U+2VSnHVAQdwsS5+IuJdjyxUgHGlpfzpiCPok073mG9Cr1SKa0eO5AcHHBB1FJFE6ild0qpDS0p4vbKSoYWFFCb8QFVxKsUdY8ZwmWamIlnTowsVYFRxMYuPPJK/LitL5JKqtBl902leHD+e72itqUhWJa9B9kFZOs3z48dz/pAhiSrVXqkUo4uLWTxpEseUlUUdRyTxktMeXZRnxi2jRzNn3DgG5OfH/uLUxakUF+2/P4smTdIppSLdJN6tkQUn9uvHe0cfzd8NGBDL2WqvVIqRRUX8ceJEbhw1SneAFelG+mlrRVk6zS8POYQnxo1jVFFRLC7/V2hGSSrFzOHDWXbUUVT27h11JJEep+eubu+Aaf368Zejj2ZWdTWXvvceNZkMtUEQdaxd5JuRNuPcwYP5/yNG6DRSkQipUNuRMuOsigr+rrycBz/+mBtWrWJDYyN1QRDpnVVLUykc8J1Bg5g5fDjDtJ9UJHJd2pY1s6+b2VIzC8xskq9QuagglWLGkCGsOPponh0/nlP696fQjN7deJ3VQjN6pVKMKS7m1jFjqD72WG4/6CCVqUiO6OoM9S3ga8DdHrLEgplxbFkZxx52GJubmvjDpk08Ul3NC5s3kzajPghodH7mrimgNC+P+iDgr3r14qyKCk7t35+/0gVNRHJSlwrVObcM6LHX0uybn8+ZFRWcWVFBUxDw523beG3bNv5nyxaqtm9nY3MzRakUKaDZOeqDgEwr4xSnUuS3fA+bnCNFeBbX/ysr4+j99mNKnz4MKCjozr+aiOwDcx5mU2Y2H7jcObdgL8+ZAcwAqKioqJw1a1aXX3d3NTU1lJaWeh93XwVAY8uMtck5moKAjHMEQElDA3WFhaTMyDejwIz8lmLNz8FfULn2vd2bOGWFeOWNU1bITt6pU6dWOeda38XpnNvrGzCXcNN+97dTd3rOfGBSe2N9+lZZWemyYd68eVkZNxvilNW5eOWNU1bn4pU3Tlmdy05eYIFro9va3eR3zk3z0eoiIkmX+yvWRURioqvLpk4zs9XAZOApM3vWTywRkfjp6lH+OcAcT1lERGJNm/wiIp6oUEVEPFGhioh4okIVEfFEhSoi4okKVUTEExWqiIgnKlQREU9UqCIinqhQRUQ8UaGKiHiiQhUR8USFKiLiiQpVRMQTFaqIiCcqVBERT7zc9bTTL2q2AViZhaEHAJ9kYdxsiFNWiFfeOGWFeOWNU1bITt4DnHPlrT0QSaFmi5ktcG3d3jXHxCkrxCtvnLJCvPLGKSt0f15t8ouIeKJCFRHxJGmFek/UATohTlkhXnnjlBXilTdOWaGb8yZqH6qISJSSNkMVEYmMClVExJNEFaqZfd3MlppZYGY5u7TDzKab2btm9p6Z/SDqPHtjZg+YWbWZvRV1lvaY2TAzm2dmy1r+H1wcdaa2mFmRmb1mZm+2ZL0m6kztMbM8M3vDzH4fdZb2mNmHZrbEzBaZ2YLuet1EFSrwFvA14KWog7TFzPKAO4AvAYcAZ5rZIdGm2qsHgelRh+igZuAy59xY4Bjgwhz+3jYAxzvnxgMTgOlmdky0kdp1MbAs6hCdMNU5N0HrUPeRc26Zc+7dqHO04yjgPefcCudcIzALODXiTG1yzr0Eb
Io6R0c459Y55xa2fLyd8Id//2hTtc6Falo+zW95y9kjxGY2FPgycF/UWXJZogo1JvYHPtrp89Xk6A99nJnZCGAi8L8RR2lTyyb0IqAaeN45l7NZgZ8BVwJBxDk6ygHPmVmVmc3orhdNd9cL+WJmc4FBrTz0r865J7o7zz6wVr6WszOTODKzUuBx4BLn3Lao87TFOZcBJphZH2COmY1zzuXcvmoz+wpQ7ZyrMrMpEcfpqGOdc2vNbCDwvJm907K1lVWxK1Tn3LSoM3TRamDYTp8PBdZGlCVxzCyfsEwfds7NjjpPRzjntpjZfMJ91TlXqMCxwClmdjJQBOxnZr9yzn0z4lxtcs6tbXlfbWZzCHe1Zb1Qtcnf/V4HxpjZSDMrAM4Anow4UyKYmQH3A8ucc7dEnWdvzKy8ZWaKmRUD04B3Ig3VBufcvzjnhjrnRhD+f30xl8vUzErMrPenHwMn0k2/qBJVqGZ2mpmtBiYDT5nZs1Fn2p1zrhm4CHiW8KDJY865pdGmapuZPQK8AhxsZqvN7NyoM+3FscC3gONblsssaplV5aLBwDwzW0z4S/Z551zOL0eKiQrgZTN7E3gNeMo594fueGGdeioi4kmiZqgiIlFSoYqIeKJCFRHxRIUqIuKJClVExBMVqoiIJypUERFP/g+/9JOg1rcGQwAAAABJRU5ErkJggg==\n", + "image/png": "iVBORw0KGgoAAAANSUhEUgAAAVQAAAD4CAYAAACzOx6UAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjUuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8qNh9FAAAACXBIWXMAAAsTAAALEwEAmpwYAAAcSklEQVR4nO3deXxV9Z3/8dfn5mYjwbCFAAIqixZFEIMLY/0V+qBKbUcfOtVW2k4dy+BYndrWhS6Pjq22Y522th2rHdfaxZHqFKqtOggKo7ZaJRRBRIuiIksJsich2z3f3x8nyGJCEvK999xz8n4+HnlkuZfvfRPIO99zzvecY845RESk51JRBxARSQoVqoiIJypUERFPVKgiIp6oUEVEPElH8aKDBg1yRx99tPdx6+vrKSsr8z5uNsQpK8Qrb5yyQrzyxikrZCdvTU3Nu865ynYfdM7l/K26utplw+LFi7MybjbEKatz8cobp6zOxStvnLI6l528wFLXQbdpk19ExBMVqoiIJypUERFPVKgiIp6oUEVEPFGhioh4okIVEfFEhSoi4okKVUTEExWqiIgnKlQREU9UqCIinqhQRUQ8UaGKiHiiQhUR8USFKiLiiQpVRMSTHheqmZWY2Qtm9pKZrTKzb/sIJiISNz7uKdUEfNg5V2dmhcCzZva4c+55D2OLiMRGjwu17R4rdW2fFra9uZ6OKyISNxb2YQ8HMSsAaoAxwG3OuTntPGc2MBugqqqqeu7cuT1+3YPV1dVRXl7ufdxsiFNWiFfeOGWFeOWNU1bITt5p06bVOOcmt/tgR3fvO5w3oB+wGBh/qOfprqfxyupcvPLGKatz8cobp6zOxfyup865HcASYIbPcUVE4sDHUf5KM+vX9nEpMB14tafjiojEjY+j/EOBX7TtR00BDzrn/uBhXBGRWPFxlH8FMMlDFhGRWNOZUiIinqhQRUQ8UaGKiHiiQhUR8USFKiLiiQpVRMQTFaqIiCcqVBERT1SoIiKeqFBFRDxRoYqIeKJCFRHxRIUqIuKJClVExBMVqoiIJypUERFPVKgiIp6oUEVEPFGhioh4okIVEfFEhSoi4okKVUTEExWqiIgnKlQREU9UqCIinqhQRUQ8UaGKiHiiQhUR8USFKiLiiQpVRMQTFaqIiCcqVBERT1SoIiKeqFBFRDzpcaGa2QgzW2xmq81slZld5SOYiEjcpD2M0Qpc7ZxbZmZ9gRozW+ice8XD2CIisdHjGapzbpNzblnbx7uB1cCRPR1XRCRuvO5DNbOjgUnAn32O
KyISB+ac8zOQWTnwf8B3nXPz2nl8NjAboKqqqnru3LleXnd/dXV1lJeXex83G+KUFeKVN05ZIV5545QVspN32rRpNc65ye0+6Jzr8RtQCCwAvtKV51dXV7tsWLx4cVbGzYY4ZXUuXnnjlNW5eOWNU1bnspMXWOo66DYfR/kNuAdY7Zy7pafjiYjElY99qGcAnwU+bGbL297O8TCuiEis9HjZlHPuWcA8ZBERiTWdKSUi4okKVUTEExWqiIgnKlQREU9UqCIinqhQRUQ8UaGKiHiiQhUR8USFKiLiiQpVRMQTFaqIiCcqVBERT1SoIiKeqFBFRDxRoYqIeKJCFRHxRIUqIuKJClVExBMVqoiIJypUERFPVKgiIp6oUEVEPFGhioh4okIVEfFEhSoi4okKVUTEExWqiIgnKlQREU9UqCIinqhQRUQ8UaGKiHiiQhUR8USFKiLiiQpVRMQTFaqIiCdeCtXM7jWzWjN72cd4IiJx5GuGeh8ww9NYIiKxlPYxiHPuaTM72sdYSdOQybCxqYlNzc1sbG5mV2srLc5R2dLCHRs3UppKMaSoiKFFRQwrLmZAOo2ZRR1bDtOzz0JZGUyaBPfeC5/8JGzfDi++COefD089BVVVcMIJ8LOfwUc/GnVi8cmcc34GCgv1D8658R08PhuYDVBVVVU9d+5cL6+7v7q6OsrLy72P21UtzlGfyVCfybA7k2FPEBAQbgYYsP93elgmw8aCAmh7DCBoe19kRnlBAeUFBfQpKKBPKvpd3VF/b7sjiqxPPDGYW28dSyZjmEE67WhsTFFcHBAE0NoaftzSsu/xlhbjuON2c+ONf9T3NkuykXfatGk1zrnJ7T3mZYbaFc65O4E7ASZPnuymTp3q/TWWLFlCNsbtSOAcL+7ezbwtW/hNbS2bW1ooSqepy2TeK8eO/KCujms6+Yfuk0pRYIYDZgwYwKcGD+as/v3pm87ZP9t7cv297Ykosk6YAG+8AQ8+eODXm5sL3vu4qWnfx5MnwxNPQEVFf55+ulzf2yzJdd7c/2QmwDuNjfx0wwbu3LSJjHPsyWRobXus0ePrNAT7avl/tmxhwbZtNAUB0/v359oRI/hQv37aPZAnBgyAKVPeX6gHS6fh2mvhlFOgf//cZJPcUaF2kXOOp3bs4Hvr1vHMjh04oNnT7pKu2p3JAPD4tm08vXMnA9Jprh0xgn8aOpSygoJO/rRkW0EB9OkDDQ3tP96/P1x8Mfz7v+c2l+SOr2VTDwDPAceZ2Xoz+7yPcfPFczt3MrmmhvNWrmTR9u00OZfzMt2fA+oyGdY1NTFn7VqG/+lP3LZ+PS1BZzsa8tOll4IZbNny/sdefx2KiuDyy3Ofqys2b4adO2HaNPjiF+G889p/3tlnw/LlcMstOY0nOealUJ1zFzvnhjrnCp1zw51z9/gYN2qv1tfzkZdeYvpLL7Gsro76PCyshiBgRybDnLVrGfn888zdvBlfBxpzZcKE8P3L7axinjMHSkrg29/ObaZDWbMGFi6EJ5+EIUPCI/d33QXvvgsHHz8sLAyP7l93HYwcCcXF0WSW3NAmfztag4Cb1q3jpnXraGo7Up/v6oOA+uZmZr32Grdv3Mh/jxvH8JKSqGN1yf6FOm3avq//6U8wb164iTx4cDTZ9tq6FW66Cf7xH8MZdXMzLFsGjY0HluQ//zM8/DDU1UF5Ofz0p/C5z0WXW3JLhXqQVfX1XLRqFW83NrInD2eknakPAp7btYtxL7zAj8eO5dIhQ/L+wNXEieH7g2eo11wDI0bAl7+c+0x7ff3r4TrSz3wG7r8/LMelS/c9fvCCiw99CE49NSzTa68NP5beI/oFjnnkjo0bOaWmhtUNDXm5ed9Vrc5RFwRctWYNM1asYHdra+d/KEIDB8LQobBq1b6vPfggPPdcOCvM5UQ7k4GamnCf7sKF4exzzBg44wzYtAlOPLHzMUaOhB/8AD74wXD/r/QemqECLUHAFWvWcP/mzbGclXak
Pgh4escOJi5dyqKJExlVWhp1pA5NmAB//nP4cXMzfO1r4dKimTOz/9pbtoQzyuOPhyAI943+5CfhMqiPfKT74/385/4zSjz0+kLd2tLCx1asYGV9/QHrPpOi0Tnebmxk0tKl/G78eKbl6eLHCRNgwQJYvx4eegjWroX77gtnitnw2mvh0fm33gpPD33ySZg/P9zFMHBgeMRepLt6daFuampiyrJlbGpujnQZVLYFwK5Mho+tXMn948ZxfmVl1JHeZ++BqWeege98By64AM480+9r1NaGy5b+5V/g5JPD/Z3z5r3/wJLI4eq1hbqhqYnTamrY3NJCa4LLdH97goBPr17Nz4OAT1ZVRR3nAHsL9Utfgt274eab/Y19xRVw5JEwfjzcc094JL6uLnuzX+m9euVBqc3NzUxZtiwZZbpuXbhtvG5dl56+Jwj4p9de4+F3381urm4aNy48gFNbGxbgmDGHP5Zz8PTTYWH+5S/hPtnKSjj33HB/6ejRKlPJjl43Q92TyTB1+XI2NTWR38e+u8A5+P73wx2Oy5aFR1K60BR7goCLX3mFpyZO5PSKihwE7VxhITQ1Hf6f37oVKirCUz9PPHEC99wDP/whHHtsuOheJBd61QzVOcfMV17h7cbG+JcpwGOPwd/+FjbGpk3hUZ0u2hMEnLNyJesbfV7OJbfWrAmXWt15JwwaBCtXhgeXrr/+FU46Cb7ylfDapCK50qsK9Ttvv80T27cnY2nUzp1hkc6ZA8OGhec23nEH7NrV5SF2ZzJ8ZMUK9rRddCUONm+Gb34z3DVw7LHhpHzmTNizJ7yo85lnwhFHJOLXpcRQr9nkX7RtGzetW5eMMoVw+/Z3v9v3+SmnhOt+uqG1bUnVpa++ygMnnOA3n0etreHFUU49Nbyi0+23h58HgfaFSn7pFTPUXa2tXLx6dXLK1KM9QcAjW7fy6NatUUc5gHPhHoziYtiwIXyrqAjPo9+6NZyUq0wl3/SKQr1yzRrq8vz0yyg1BAGfXb2a7S0tkebYtStcE2oWLm0qK4MbbwwvjPLYY3DRRZHGE+lU4gt14bZt/HbLFhrjvjxqfzffHF6WaceO9z+2YUN4vuSPftStIRsyGS7761/95OuGN98Mbx1y883hDHTTJvj97/edC3/ddZDHZ8yKHCDRhdoSBFzy6qvJO6V09Ojw/Ztvvv+xO+4IF3Reckm3hmxyjke3buWPO3f2PF8nNmwIC3TzZhg1Cn71q32L7Y85Bj7+cejXL+sxRLxLdKHes2kTO5O4qT9qVPj+4EJ9+eXw3M2ZMw/rhkUNQcCVa9Zk5QLVe/aEp3w+9BD8+tfh6aXFxeEBp299K7wnk5Y4SdwltlAbMhm+/uabsb4MX4c6mqH+13+FOxwvvPCwh17T0MBj27b1INyBHnoojLRrFzz/fLhZP2dOeHppv37hUXuRpEhsof54/XqaklimELbSwIHhpZL2Wrw4XOU+a1aPLsJZHwT865o1BIc5S21oCI/Cm8H114ezzquuCifMy5fDWWcddjSRvJfIQm0JAr7/zjvJ23e6v1Gj9hVqSwvcfTd84AMwfXqPh97S0sKT27d3+fnvvBPuF73mmrBAnQs366++Gs45B77xDV1oWXqHRBbqI1u3kknSUf32jBoVHsXZsiVc4L9xY7ja3cPizLpMhv94551DPuett8ITtdauDa9QP29eWKi7doWngX7603DEET2OIhIriTxT6uZ16967h31i7d2PumJFOB0888x918Dz4NmdO3mnsZER+91/ZPfu8B5Lf//38PjjcO+9YZ83N4cXNxHp7RI3Q321vp6X6+ujjpF9e4/033ZbuOPyssu8Dh84x20bNgDhSqxZs6rZsQMefTTcrP/Rj8LLCRQVqUxF9krcDPU3tbXJ39wHOOqosMm2b4dPfCK8gnJH5s8PLxDaVUEhzS0DuXn144z4aXj868wz32XYsL6sXdvz6CJJlbhCnVtbm+jbmbwnnYYnnujac1eu
DA+xH9IQoBm4FZgJNoiCGfP5uwvPZtLgPgwZ8jYFBcf0KLJI0iVqk7/FOd6M8fU9s2bGjPDKy+8zCjgfqAY2AedA+pswcAzcdTOFXxvIk435dWV/kXyWqELd2dpKWpcger+TTw6vdQdAP+AW4FTg+8B/AjVAMZT8D5w+DH75nzB6NI1BwP21tdFkFomhRBXqrkwmmWdG9dT69dD4JWAecCRwIdAK/AMwInxOscEXvgA33HDAbHZVfT3N+p6KdEmi9qHWJ32pVHe88TbcMBfWPU5YoJWQ+gsEq3ivRCE8sFVRAd/73r6lWPspSaVY1RtWTYh4kJgZan0mQ0tvOBh1KCv+Bp//Ekwrhlmfg3Ur4Lhb4YFLYOEFUPT9A59fUhJeBv8Xv2i3TAEyzlGze3fWo4skQWJmqMvr6pLz26E7nt8Gt86HjZuBJ4D7YcodcPl2GHH/gc+dMiU85x/CSz1dfnm4Sv8Q+50bgoA/7txJD+7qLNJrJKZQX21ooNfMT1dugn+rhx0/BB4Cvggf/iZc/nsYdHXHf+7ss8P1qP37w003wZiu1eTyujo+5yW4SLIlplA3NDWR6MtpvvUWfLkFdjQQHlx6EE5/Hr76RrgPlK92PkZ1dXgJ/A9+sINlVO3bHPGtUUTiIjGF+uaePYyPOoRva9+C6x+B9fOAqcB0OHYA/MdXoGI18JnujZdOH9b187apUEW6xEuhmtkM4CdAAXC3c+57PsbtjrebmpJRqCs3wU++C28sAIYC34Wxt8MN18CQ8kgiZQAtnBLpXI8L1cwKgNuAjwDrgRfN7BHn3Cs9Hbs7apubc/lyh2/RIrj7bq6prQ0vZT9rFpSdDD/7JbxTDPwGuB1OuRH+9VwYcVfUiSk2o1VrUUU65WOGeirwunNuLYCZzQXOA3JaqLFYMrVoUXg7z6YmYDBsvgq+ezvwbcBg6l3whd9D5Y0RBz1Qyiwr95kSSRofK42OBPa/GvH6tq/lVGscfuDvvrutTAGmALOAQhg0FBb3h+uvg8q+EQbsWAy+uyKR8zFDbW8R4/t+/sxsNjAboKqqiiVLlnh46X2+XF9PZSbDD+rqvI7r0zUHnBf/cNsbsNXyOncKaHbO+79ZttTV1cUmK8Qrb5yyQu7z+ijU9RxwLiPDgY0HP8k5dydwJ8DkyZPd1KlTPbz0Ple88AKX1tZyTXk0B266ZPDg8Gb07Xw9n3OXpVL8NpPB979ZtixZsiQ2WSFeeeOUFXKf18cm/4vAWDM7xsyKgE8Bj3gYt1sGxuGy8bNmhWco7a+4OPx6Hmt2TlfxEumCHs9QnXOtZnYlsIBw2dS9zrlVPU7WTSMPLqp8tPeOpHffDfsf5fdwp9JsK4g6gEgMeFmH6px7DHjMx1iHa3RpaZQv33XTp8P06fygri6vN/P31z+dDm9VLSKHlJjriQwrLk7OXybPDC4qijqCSCwkpoPGlpa2u9xAeu6Ebpz3L9KbJaZQTy4v1+mRWVBqxpn9+kUdQyQWElOo/QoLdSQ6CwpTKapjsq9XJGqJKVSAPqlE/XXyQkMQMFGFKtIliWqgI9JpSlWqXo0qKaG0QIumRLoiUe3TL52Yy7vmhSIzPjV4cNQxRGIjUYVaZEZlHM6YiolCM84fNCjqGCKxkahCBbho8ODk3IYgYsWplPafinRD4gr1k5WVFGs/ao+lgYsqKzGtnBDpssQ1T3XfvhwZh/P681xhKsUXhw+POoZIrCSuUM2MOSNHUq5Zao+cUFbGuLJE30dWxLtEts6nBg/WFeZ7oG9BAXNGjOj8iSJygEQWap+CAi4bNowSzVIPS0kqxXk6ui/SbYltnG8cdZSu4XkYylIpfjh6NIX6ZSTSbYn9qRlQWMickSN1Omo3VRYVMbOqKuoYIrGU6La5esQIilSoXVaWSnHrmDEUaKmUyGFJdNv0KSjg1jFjKFOpdiptxqS+ffnYwIFRRxGJrcQ3zaerqphyxBHohNRDKzbjgXHjtJBfpAcSX6hm
xi/HjaNEV0zqUFkqxY/HjGF4SUnUUURiLfGFCjC0uJifjR2rA1TtKDKjum9fPj90aNRRRGKv1zTMp4cMYWZVlUp1PwYMSKeZN368NvVFPOhV7XL72LGMLyujSOUBhHc4WHTSSQzUJQ9FvOhVhVqYSvHYhAkMSKd7/R1SS1Mp/vv44zlB5+uLeNOrChVgYGEhSyZNoqIXX92/TyrFTcccw7k6vVTEq15XqADH9enDMyedREVBQa+bqfZJpfjmUUdxlS5+IuJdryxUgPHl5fzx5JPpl073mm9Cn1SKG445hq8edVTUUUQSqbd0SbtOKCvjxepqhhcXU5zwA1WlqRS3jR3L1ZqZimRNry5UgNGlpaw45RT+rqIikUuq0mb0T6d5auJELtFaU5GsSl6DHIaKdJqFEydy2bBhiSrVPqkUY0pLWTF5MqdXVEQdRyTxktMePVRgxi1jxjB//HgGFRbG/uLUpakUVx55JMsnT9YppSI5Eu/WyIKzBgzg9dNO4xODBsVyttonleKYkhKemTSJm0eP1h1gRXJIP23tqEin+dXxx/Pw+PGMLimJxeX/is0oS6WYM3Ikq089leq+faOOJNLr9N7V7V0wfcAA/nraacytreUrr79OXSZDfRBEHesAhWakzfj80KF86+ijdRqpSIRUqJ1ImTGzqopPVFZy39/+xk3r1rGluZmGIIj0zqrlqRQOuGTIEOaMHMkI7ScViVyPtmXN7EIzW2VmgZlN9hUqHxWlUsweNoy1p53GgokTOXfgQIrN6JvD66wWm9EnlWJsaSk/GTuW2jPO4KfHHqsyFckTPZ2hvgxcANzhIUssmBlnVFRwxoknsr2lhf/dto0Hamt5cvt20mY0BgHNzs/cNQWUFxTQGAR8oE8fZlZVcd7AgXxAFzQRyUs9KlTn3Gqg115Ls39hIRdXVXFxVRUtQcCfdu3ihV27+L8dO6jZvZutra2UpFKkgFbnaAwCMu2MU5pKUdj2PWxxjhThWVz/r6KC0444gqn9+jGoqCiXfzUROQzmPMymzGwJcI1zbukhnjMbmA1QVVVVPXfu3B6/7sHq6uooLy/3Pu7hCoDmthlri3O0BAEZ5wiAsqYmGoqLSZlRaEaRGYVtxVqYh7+g8u17eyhxygrxyhunrJCdvNOmTatxzrW/i9M5d8g3YBHhpv3Bb+ft95wlwOTOxtr7Vl1d7bJh8eLFWRk3G+KU1bl45Y1TVufilTdOWZ3LTl5gqeug2zrd5HfOTffR6iIiSZf/K9ZFRGKip8umzjez9cAU4FEzW+AnlohI/PT0KP98YL6nLCIisaZNfhERT1SoIiKeqFBFRDxRoYqIeKJCFRHxRIUqIuKJClVExBMVqoiIJypUERFPVKgiIp6oUEVEPFGhioh4okIVEfFEhSoi4okKVUTEExWqiIgnXu562u0XNdsCvJ2FoQcB72Zh3GyIU1aIV944ZYV45Y1TVshO3qOcc5XtPRBJoWaLmS11Hd3eNc/EKSvEK2+cskK88sYpK+Q+rzb5RUQ8UaGKiHiStEK9M+oA3RCnrBCvvHHKCvHKG6eskOO8idqHKiISpaTNUEVEIqNCFRHxJFGFamYXmtkqMwvMLG+XdpjZDDN7zcxeN7OvRp3nUMzsXjOrNbOXo87SGTMbYWaLzWx12/+Dq6LO1BEzKzGzF8zspbas3446U2fMrMDM/mJmf4g6S2fM7C0zW2lmy81saa5eN1GFCrwMXAA8HXWQjphZAXAb8FHgeOBiMzs+2lSHdB8wI+oQXdQKXO2cGwecDlyRx9/bJuDDzrmJwEnADDM7PdpInboKWB11iG6Y5pw7SetQD5NzbrVz7rWoc3TiVOB159xa51wzMBc4L+JMHXLOPQ1sizpHVzjnNjnnlrV9vJvwh//IaFO1z4Xq2j4tbHvL2yPEZjYc+Bhwd9RZ8lmiCjUmjgTe2e/z9eTpD32cmdnRwCTgzxFH6VDbJvRyoBZY6JzL26zAj4HrgCDiHF3l
gCfMrMbMZufqRdO5eiFfzGwRMKSdh77hnHs413kOg7XztbydmcSRmZUDvwW+5JzbFXWejjjnMsBJZtYPmG9m451zebev2sw+DtQ652rMbGrEcbrqDOfcRjMbDCw0s1fbtrayKnaF6pybHnWGHloPjNjv8+HAxoiyJI6ZFRKW6f3OuXlR5+kK59wOM1tCuK867woVOAM418zOAUqAI8zs1865z0Scq0POuY1t72vNbD7hrrasF6o2+XPvRWCsmR1jZkXAp4BHIs6UCGZmwD3AaufcLVHnORQzq2ybmWJmpcB04NVIQ3XAOfc159xw59zRhP9fn8rnMjWzMjPru/dj4Cxy9IsqUYVqZueb2XpgCvComS2IOtPBnHOtwJXAAsKDJg8651ZFm6pjZvYA8BxwnJmtN7PPR53pEM4APgt8uG25zPK2WVU+GgosNrMVhL9kFzrn8n45UkxUAc+a2UvAC8Cjzrn/zcUL69RTERFPEjVDFRGJkgpVRMQTFaqIiCcqVBERT1SoIiKeqFBFRDxRoYqIePL/AR/rja2S8NzeAAAAAElFTkSuQmCC\n", "text/plain": [ "
" ] @@ -803,8 +801,8 @@ "plt.plot(0, 0, \"ko\")\n", "plot_vector2d(v / LA.norm(v), color=\"k\", zorder=10)\n", "plot_vector2d(v, color=\"b\", linestyle=\":\", zorder=15)\n", - "plt.text(0.3, 0.3, r\"$\\hat{u}$\", color=\"k\", fontsize=18)\n", - "plt.text(1.5, 0.7, \"$u$\", color=\"b\", fontsize=18)\n", + "plt.text(0.3, 0.3, r\"$\\hat{v}$\", color=\"k\", fontsize=18)\n", + "plt.text(1.5, 0.7, \"$v$\", color=\"b\", fontsize=18)\n", "plt.axis([-1.5, 5.5, -1.5, 3.5])\n", "plt.gca().set_aspect(\"equal\")\n", "plt.grid()\n", @@ -1002,7 +1000,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Note: due to small floating point errors, `cos_theta` may be very slightly outside of the $[-1, 1]$ interval, which would make `arccos` fail. This is why we clipped the value within the range, using NumPy's `clip` function." + "Note: due to small floating point errors, `cos_theta` may be very slightly outside the $[-1, 1]$ interval, which would make `arccos` fail. This is why we clipped the value within the range, using NumPy's `clip` function." ] }, { @@ -1064,7 +1062,7 @@ "metadata": {}, "source": [ "# Matrices\n", - "A matrix is a rectangular array of scalars (ie. any number: integer, real or complex) arranged in rows and columns, for example:\n", + "A matrix is a rectangular array of scalars (i.e. any number: integer, real or complex) arranged in rows and columns, for example:\n", "\n", "\\begin{bmatrix} 10 & 20 & 30 \\\\ 40 & 50 & 60 \\end{bmatrix}\n", "\n", @@ -1207,7 +1205,7 @@ "metadata": {}, "source": [ "## Element indexing\n", - "The number located in the $i^{th}$ row, and $j^{th}$ column of a matrix $X$ is sometimes noted $X_{i,j}$ or $X_{ij}$, but there is no standard notation, so people often prefer to explicitely name the elements, like this: \"*let $X = (x_{i,j})_{1 ≤ i ≤ m, 1 ≤ j ≤ n}$*\". 
This means that $X$ is equal to:\n", + "The number located in the $i^{th}$ row and $j^{th}$ column of a matrix $X$ is sometimes noted $X_{i,j}$ or $X_{ij}$, but there is no standard notation, so people often prefer to explicitly name the elements, like this: \"*let $X = (x_{i,j})_{1 ≤ i ≤ m, 1 ≤ j ≤ n}$*\". This means that $X$ is equal to:\n", "\n", "$X = \\begin{bmatrix}\n", " x_{1,1} & x_{1,2} & x_{1,3} & \\cdots & x_{1,n}\\\\\n", @@ -1217,7 +1215,7 @@ " x_{m,1} & x_{m,2} & x_{m,3} & \\cdots & x_{m,n}\\\\\n", "\\end{bmatrix}$\n", "\n", - "However in this notebook we will use the $X_{i,j}$ notation, as it matches fairly well NumPy's notation. Note that in math indices generally start at 1, but in programming they usually start at 0. So to access $A_{2,3}$ programmatically, we need to write this:" + "However, in this notebook we will use the $X_{i,j}$ notation, as it matches NumPy's notation fairly well. Note that in math indices generally start at 1, but in programming they usually start at 0. So to access $A_{2,3}$ programmatically, we need to write this:" ] }, { @@ -1244,7 +1242,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The $i^{th}$ row vector is sometimes noted $M_i$ or $M_{i,*}$, but again there is no standard notation so people often prefer to explicitely define their own names, for example: \"*let **x**$_{i}$ be the $i^{th}$ row vector of matrix $X$*\". We will use the $M_{i,*}$, for the same reason as above. For example, to access $A_{2,*}$ (ie. $A$'s 2nd row vector):" + "The $i^{th}$ row vector is sometimes noted $M_i$ or $M_{i,*}$, but again there is no standard notation, so people often prefer to explicitly define their own names, for example: \"*let **x**$_{i}$ be the $i^{th}$ row vector of matrix $X$*\". We will use the $M_{i,*}$ notation, for the same reason as above. For example, to access $A_{2,*}$ (i.e. 
$A$'s 2nd row vector):" ] }, { @@ -1271,7 +1269,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Similarly, the $j^{th}$ column vector is sometimes noted $M^j$ or $M_{*,j}$, but there is no standard notation. We will use $M_{*,j}$. For example, to access $A_{*,3}$ (ie. $A$'s 3rd column vector):" + "Similarly, the $j^{th}$ column vector is sometimes noted $M^j$ or $M_{*,j}$, but there is no standard notation. We will use $M_{*,j}$. For example, to access $A_{*,3}$ (i.e. $A$'s 3rd column vector):" ] }, { @@ -1298,7 +1296,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Note that the result is actually a one-dimensional NumPy array: there is no such thing as a *vertical* or *horizontal* one-dimensional array. If you need to actually represent a row vector as a one-row matrix (ie. a 2D NumPy array), or a column vector as a one-column matrix, then you need to use a slice instead of an integer when accessing the row or column, for example:" + "Note that the result is actually a one-dimensional NumPy array: there is no such thing as a *vertical* or *horizontal* one-dimensional array. If you need to actually represent a row vector as a one-row matrix (i.e. a 2D NumPy array), or a column vector as a one-column matrix, then you need to use a slice instead of an integer when accessing the row or column, for example:" ] }, { @@ -1507,7 +1505,7 @@ "metadata": {}, "source": [ "## Adding matrices\n", - "If two matrices $Q$ and $R$ have the same size $m \\times n$, they can be added together. Addition is performed *elementwise*: the result is also a $m \\times n$ matrix $S$ where each element is the sum of the elements at the corresponding position: $S_{i,j} = Q_{i,j} + R_{i,j}$\n", + "If two matrices $Q$ and $R$ have the same size $m \\times n$, they can be added together. 
Addition is performed *elementwise*: the result is also an $m \\times n$ matrix $S$ where each element is the sum of the elements at the corresponding position: $S_{i,j} = Q_{i,j} + R_{i,j}$\n", "\n", "$S =\n", "\\begin{bmatrix}\n", @@ -1539,7 +1537,7 @@ } ], "source": [ - "B = np.array([[1,2,3], [4, 5, 6]])\n", + "B = np.array([[1, 2, 3], [4, 5, 6]])\n", "B" ] }, @@ -1638,7 +1636,7 @@ } ], "source": [ - "C = np.array([[100,200,300], [400, 500, 600]])\n", + "C = np.array([[100, 200, 300], [400, 500, 600]])\n", "\n", "A + (B + C)" ] @@ -1712,7 +1710,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Scalar multiplication is also defined on the right hand side, and gives the same result: $M \\lambda = \\lambda M$. For example:" + "Scalar multiplication is also defined on the right-hand side, and gives the same result: $M \\lambda = \\lambda M$. For example:" ] }, { @@ -1961,7 +1959,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "The `@` operator also works for vectors: `u @ v` computes the dot product of `u` and `v`:" + "The `@` operator also works for vectors. `u @ v` computes the dot product of `u` and `v`:" ] }, { @@ -1988,7 +1986,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Let's check this result by looking at one element, just to be sure: looking at $E_{2,3}$ for example, we need to multiply elements in $A$'s $2^{nd}$ row by elements in $D$'s $3^{rd}$ column, and sum up these products:" + "Let's check this result by looking at one element, just to be sure. 
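The elementwise addition, scalar multiplication, and `@` rules just described can be sketched as follows (the matrices are hypothetical, chosen only so the shapes match):

```python
import numpy as np

# Two matrices of the same size (2x3), for illustration
Q = np.array([[1, 2, 3], [4, 5, 6]])
R = np.array([[10, 20, 30], [40, 50, 60]])

# Elementwise addition: S[i, j] == Q[i, j] + R[i, j]
S = Q + R
assert S[0, 1] == Q[0, 1] + R[0, 1]

# Scalar multiplication gives the same result on either side
assert np.array_equal(2 * Q, Q * 2)

# @ is matrix multiplication; on two 1D arrays it computes the dot product
u = np.array([1, 2])
v = np.array([3, 4])
print(u @ v)  # 1*3 + 2*4 = 11
```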
To calculate $E_{2,3}$ for example, we need to multiply elements in $A$'s $2^{nd}$ row by elements in $D$'s $3^{rd}$ column, and sum up these products:" ] }, { @@ -2064,7 +2062,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This illustrates the fact that **matrix multiplication is *NOT* commutative**: in general $QR ≠ RQ$\n", + "This illustrates the fact that **matrix multiplication is *NOT* commutative**: in general $QR ≠ RQ$.\n", "\n", "In fact, $QR$ and $RQ$ are only *both* defined if $Q$ has size $m \\times n$ and $R$ has size $n \\times m$. Let's look at an example where both *are* defined and show that they are (in general) *NOT* equal:" ] @@ -2552,7 +2550,7 @@ "metadata": {}, "source": [ "## Converting 1D arrays to 2D arrays in NumPy\n", - "As we mentionned earlier, in NumPy (as opposed to Matlab, for example), 1D really means 1D: there is no such thing as a vertical 1D-array or a horizontal 1D-array. So you should not be surprised to see that transposing a 1D array does not do anything:" + "As we mentioned earlier, in NumPy (as opposed to Matlab, for example), 1D really means 1D: there is no such thing as a vertical 1D-array or a horizontal 1D-array. So you should not be surprised to see that transposing a 1D array does not do anything:" ] }, { @@ -2627,7 +2625,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Notice the extra square brackets: this is a 2D array with just one row (ie. a 1x2 matrix). In other words it really is a **row vector**." + "Notice the extra square brackets: this is a 2D array with just one row (i.e. a $1 \\times 2$ matrix). In other words, it really is a **row vector**." ] }, { @@ -2807,7 +2805,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Of course we could also have stored the same 4 vectors as row vectors instead of column vectors, resulting in a $4 \\times 2$ matrix (the transpose of $P$, in fact). 
It is really an arbitrary choice.\n", + "Of course, we could also have stored the same 4 vectors as row vectors instead of column vectors, resulting in a $4 \\times 2$ matrix (the transpose of $P$, in fact). It is really an arbitrary choice.\n", "\n", "Since the vectors are ordered, you can see the matrix as a path and represent it with connected dots:" ] @@ -3302,7 +3300,7 @@ "dx + ey + fz\n", "\\end{pmatrix}$\n", "\n", - "This transormation $f$ maps 3-dimensional vectors to 2-dimensional vectors in a linear way (ie. the resulting coordinates only involve sums of multiples of the original coordinates). We can represent this transformation as matrix $F$:\n", + "This transformation $f$ maps 3-dimensional vectors to 2-dimensional vectors in a linear way (i.e. the resulting coordinates only involve sums of multiples of the original coordinates). We can represent this transformation as matrix $F$:\n", "\n", "$F = \\begin{bmatrix}\n", "a & b & c \\\\\n", @@ -3313,11 +3311,11 @@ "\n", "$f(\\textbf{u}) = F \\textbf{u}$\n", "\n", - "If we have a matric $G = \\begin{bmatrix}\\textbf{u}_1 & \\textbf{u}_2 & \\cdots & \\textbf{u}_q \\end{bmatrix}$, where each $\\textbf{u}_i$ is a 3-dimensional column vector, then $FG$ results in the linear transformation of all vectors $\\textbf{u}_i$ as defined by the matrix $F$:\n", + "If we have a matrix $G = \\begin{bmatrix}\\textbf{u}_1 & \\textbf{u}_2 & \\cdots & \\textbf{u}_q \\end{bmatrix}$, where each $\\textbf{u}_i$ is a 3-dimensional column vector, then $FG$ results in the linear transformation of all vectors $\\textbf{u}_i$ as defined by the matrix $F$:\n", "\n", "$FG = \\begin{bmatrix}f(\\textbf{u}_1) & f(\\textbf{u}_2) & \\cdots & f(\\textbf{u}_q) \\end{bmatrix}$\n", "\n", - "To summarize, the matrix on the left hand side of a dot product specifies what linear transormation to apply to the right hand side vectors. 
We have already shown that this can be used to perform projections and rotations, but any other linear transformation is possible. For example, here is a transformation known as a *shear mapping*:" + "To summarize, the matrix on the left-hand side of a dot product specifies what linear transformation to apply to the right-hand side vectors. We have already shown that this can be used to perform projections and rotations, but any other linear transformation is possible. For example, here is a transformation known as a *shear mapping*:" ] }, { @@ -3455,7 +3453,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Let's show a last one: reflection through the horizontal axis:" + "Let's show a last one -- reflection through the horizontal axis:" ] }, { @@ -3531,7 +3529,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "We applied a shear mapping on $P$, just like we did before, but then we applied a second transformation to the result, and *lo and behold* this had the effect of coming back to the original $P$ (I've plotted the original $P$'s outline to double check). The second transformation is the inverse of the first one.\n", + "We applied a shear mapping on $P$, just like we did before, but then we applied a second transformation to the result, and *lo and behold* this had the effect of coming back to the original $P$ (I've plotted the original $P$'s outline to double-check). The second transformation is the inverse of the first one.\n", "\n", "We defined the inverse matrix $F_{shear}^{-1}$ manually this time, but NumPy provides an `inv` function to compute a matrix's inverse, so we could have written instead:" ] @@ -3634,7 +3632,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "This transformation matrix performs a projection onto the horizontal axis. Our polygon gets entirely flattened out so some information is entirely lost and it is impossible to go back to the original polygon using a linear transformation. 
In other words, $F_{project}$ has no inverse. Such a square matrix that cannot be inversed is called a **singular matrix** (aka degenerate matrix). If we ask NumPy to calculate its inverse, it raises an exception:" + "This transformation matrix performs a projection onto the horizontal axis. Our polygon gets entirely flattened out so some information is entirely lost, and it is impossible to go back to the original polygon using a linear transformation. In other words, $F_{project}$ has no inverse. Such a square matrix that cannot be inverted is called a **singular matrix** (aka degenerate matrix). If we ask NumPy to calculate its inverse, it raises an exception:" ] }, { @@ -3787,7 +3785,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Also, the inverse of scaling by a factor of $\\lambda$ is of course scaling by a factor or $\\frac{1}{\\lambda}$:\n", + "Also, the inverse of scaling by a factor of $\\lambda$ is of course scaling by a factor of $\\frac{1}{\\lambda}$:\n", "\n", "$ (\\lambda \\times M)^{-1} = \\frac{1}{\\lambda} \\times M^{-1}$\n", "\n", @@ -4088,7 +4086,7 @@ "source": [ "Correct!\n", "\n", - "The determinant can actually be negative, when the transformation results in a \"flipped over\" version of the original polygon (eg. a left hand glove becomes a right hand glove). For example, the determinant of the `F_reflect` matrix is -1 because the surface area is preserved but the polygon gets flipped over:" + "The determinant can actually be negative, when the transformation results in a \"flipped over\" version of the original polygon (e.g. a left-hand glove becomes a right-hand glove).
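A minimal sketch of the singular-matrix and determinant behaviour described above, assuming a projection matrix in the spirit of the text's `F_project` and a horizontal-axis reflection in the spirit of `F_reflect` (the exact values here are illustrative):

```python
import numpy as np

# Projection onto the horizontal axis: y-coordinate is destroyed
F_project = np.array([[1, 0],
                      [0, 0]])

# A singular matrix has determinant 0 ...
print(np.linalg.det(F_project))  # 0.0

# ... and asking for its inverse raises LinAlgError
try:
    np.linalg.inv(F_project)
except np.linalg.LinAlgError as err:
    print("not invertible:", err)

# A reflection through the horizontal axis preserves areas but flips
# orientation, so its determinant is -1 (and it is its own inverse)
F_reflect = np.array([[1, 0],
                      [0, -1]])
print(np.linalg.det(F_reflect))  # -1.0
```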
For example, the determinant of the `F_reflect` matrix is -1 because the surface area is preserved but the polygon gets flipped over:" ] }, { @@ -4507,7 +4505,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Indeed the horizontal vectors are stretched by a factor of 1.4, and the vertical vectors are shrunk by a factor of 1/1.4=0.714…, so far so good. Let's look at the shear mapping matrix $F_{shear}$:" + "Indeed, the horizontal vectors are stretched by a factor of 1.4, and the vertical vectors are shrunk by a factor of 1/1.4=0.714…, so far so good. Let's look at the shear mapping matrix $F_{shear}$:" ] }, { @@ -4556,7 +4554,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Wait, what!? We expected just one unit eigenvector, not two. The second vector is almost equal to $\\begin{pmatrix}-1 \\\\ 0 \\end{pmatrix}$, which is on the same line as the first vector $\\begin{pmatrix}1 \\\\ 0 \\end{pmatrix}$. This is due to floating point errors. We can safely ignore vectors that are (almost) colinear (ie. on the same line)." + "Wait, what!? We expected just one unit eigenvector, not two. The second vector is almost equal to $\\begin{pmatrix}-1 \\\\ 0 \\end{pmatrix}$, which is on the same line as the first vector $\\begin{pmatrix}1 \\\\ 0 \\end{pmatrix}$. This is due to floating point errors. We can safely ignore vectors that are (almost) collinear (i.e. on the same line)." ] }, { @@ -4630,16 +4628,9 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "# What next?\n", - "This concludes this introduction to Linear Algebra. Although these basics cover most of what you will need to know for Machine Learning, if you wish to go deeper into this topic there are many options available: Linear Algebra [books](http://linear.axler.net/), [Khan Academy](https://www.khanacademy.org/math/linear-algebra) lessons, or just [Wikipedia](https://en.wikipedia.org/wiki/Linear_algebra) pages. 
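The eigendecomposition discussed above can be sketched with `np.linalg.eig`. The diagonal matrix below is an illustrative reconstruction of the scaling described in the text (stretch x by 1.4, shrink y by 1/1.4), not necessarily the notebook's exact matrix:

```python
import numpy as np

# Diagonal scaling: stretches x by 1.4, shrinks y by 1/1.4
F_squeeze = np.array([[1.4, 0.0],
                      [0.0, 1 / 1.4]])

eigenvalues, eigenvectors = np.linalg.eig(F_squeeze)

# Each column of `eigenvectors` is a unit eigenvector paired with the
# eigenvalue at the same index, so F v = lambda v holds for every pair
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(F_squeeze @ v, lam * v)

print(eigenvalues)  # the two scaling factors, 1.4 and 1/1.4
```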
" + "# What's next?\n", + "This concludes this introduction to Linear Algebra. Although these basics cover most of what you will need to know for Machine Learning, if you wish to go deeper into this topic there are many options available: Linear Algebra [books](https://linear.axler.net/), [Khan Academy](https://www.khanacademy.org/math/linear-algebra) lessons, or just [Wikipedia](https://en.wikipedia.org/wiki/Linear_algebra) pages." ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [] } ], "metadata": {