Taylor polynomial clarified

I decided to write this article because every other article I found on the internet seemed too complicated and did not explain the basic principles. I will avoid formalism, but I am sure the formal treatments will be clear to you after reading this article.

Prerequisites

Before you start reading, you should know what a polynomial, a derivative and a factorial are, and be able to differentiate and integrate polynomials.

A polynomial is a function into which we put some real value x and receive an output value at the point x, computed using only addition, subtraction, multiplication and division.
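For example, here is a minimal Python sketch (the particular polynomial and the function name are just an illustration, not something used later in the article):

    # Example polynomial: p(x) = 3x^2 - 2x + 5, evaluated using
    # nothing but multiplication, subtraction and addition.
    def p(x):
        return 3 * x * x - 2 * x + 5

    print(p(2.0))  # 13.0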

The derivative of a function at a point is a number. When y grows as fast as x, the derivative is $=1$. When y grows faster than x, the derivative is $>1$; when it grows more slowly, it is $<1$. When x grows but y stays at the same value, the derivative is $=0$. For example, for $y = 2x$, y always grows twice as fast as x, so the derivative is 2 everywhere.

Motivation

In everyday life, one often needs to compute a sine, cosine, tangent or logarithm. We don't even have to go that far: we run into a problem as soon as we need to compute the square root of a number. The problem is that we (humans) only know how to add, subtract, multiply and divide. Most of us don't know how to compute sines, logarithms or square roots directly. Sadly, computers cannot help us: their circuits usually can only add (or add a negative value) and multiply (or multiply by an inverted value).

But hundreds of years ago, there already existed mathematical tables with the values of these functions. Also, when you press SIN on your calculator, you usually receive a very accurate answer. How do these people and machines do that?

The solution is to approximate our function (which we don't know how to compute) with some different function (which we do know how to compute).

Example

Let's say we want to compute $\sin(1)$, i.e. the sine of one radian, which has a value of around 0.841470984. From calculus, we can discover these 3 properties of sine:

  • sine of zero is zero ... $\sin(0) = 0$
  • derivative of sine in zero is one ... $\sin'(0) = 1$
  • sine is periodic ... $\sin(x) = \sin(x + 2\pi)$

The nice property of polynomials

"For each N points in a plane there exists a polynomial of degree at most N-1, which intersects them."

Which means that:

  • for each point there is a horizontal line which passes through it
  • for each two points there is a line which passes through them
  • for each three points there is a parabola which passes through them
  • ...

It looks like polynomials can have many different shapes and "resemble" many different functions.
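As a quick numerical illustration, here is a short Python sketch (it assumes NumPy is installed; the three points are made up for the example):

    import numpy as np

    # Three points in the plane with distinct x-coordinates.
    xs = [0.0, 1.0, 2.0]
    ys = [1.0, 3.0, 2.0]

    # Fit a polynomial of degree N-1 = 2 (a parabola) through all N = 3 points.
    parabola = np.poly1d(np.polyfit(xs, ys, deg=2))

    print(parabola(xs))  # approximately [1. 3. 2.] - it passes through all three points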

The basic principle of the Taylor polynomial

The Taylor polynomial is based on a simple principle:

"The more similar (high order) derivatives of two functions are at point X, the more similar these functions are around point X."

The process of reaching this idea could look like this: the functions should be similar to each other (around the point X). They probably should have the same value at X (obvious). They probably should pass through the point X going at the same angle (we don't want one to decrease at X while the other grows). One function should be as "curvy" as the other (we don't want one to be convex and the other concave). And now we may notice that we have just mentioned derivatives of order 0, 1 and 2. It probably works like this:

  • the functions should have the same derivative of order 0, i.e. the function value at our point
  • when they have the same derivative of order 1 (same slope of tangent), they are more similar
  • when they have the same derivative of order 2, they are even more similar
  • ...

Approximation with polynomials

We try to approximate the unknown function with a polynomial, because we know how to compute a polynomial (we only add, subtract, multiply and divide).

Our example: $\sin(x)$ has the zeroth derivative 0 at $x=0$, first derivative 1, second derivative 0, third derivative -1, and the pattern keeps repeating. If we find a polynomial which has the same derivatives at $x=0$ up to some order, we can compute the value of that polynomial instead of sine and trust that it is almost equal to sine.

Example: finding a polynomial for sine

Let's try to find a polynomial $p$ which is similar to sine around the point zero. First, we want it to have the same derivative of order 0, then 1, 2, 3, ..., gradually increasing our demands. Along the way, let's watch how the graph becomes more and more similar to sine, and how the value at point 1, $p(1)$, gets closer and closer to $\sin(1)$.

The polynomial of degree 0 which has the same value at 0 as sine is $p(x) = 0$ (same 0th derivative), p(1) = 0

  1. Sine has a first derivative equal to 1 at 0; this corresponds to the polynomial $p(x) = x$ (same 0th and 1st derivatives), p(1) = 1

  2. The function $p(x) = x$ has the second derivative at 0 equal to 0, so we don't have to change it.
  3. The third derivative of our polynomial at 0 must be -1, so let's start from that constant and integrate it repeatedly, adding a constant where needed.
    • $p'''(x) = -1$ - at zero it should be -1 (third derivative)
    • $p''(x) = -x$ - at zero should be 0 (second derivative), which is OK
    • $p'(x) = 1 - \frac{x^2}{2}$ , at zero should be 1 (first derivative), thus + 1
    • $p(x) = x - \frac{x^3}{6}$ at zero should be 0 (zeroth derivative), which is OK
    Indeed, the polynomial $p(x) = x - \frac{x^3}{6}$ is more similar to sine (same 0th to 3rd derivatives), p(1) = 0.833333

  4. The fourth derivative of our polynomial should be 0, which the previous polynomial already satisfies.
  5. The fifth derivative of our polynomial should be 1, so:
    • $p'''''(x) = 1$ - at zero it should be 1 (fifth derivative)
    • $p''''(x) = x$ - at zero it should be 0 (fourth derivative), which is OK
    • $p'''(x) = -1 + \frac{x^2}{2}$ , at zero should be -1 (third derivative), thus - 1
    • $p''(x) = -x + \frac{x^3}{6}$ at zero should be 0 (second derivative), which is OK
    • $p'(x) = 1 - \frac{x^2}{2} + \frac{x^4}{24} $ at zero should be 1 (first derivative), thus + 1
    • $p(x) = x - \frac{x^3}{6} + \frac{x^5}{120}$ at zero should be 0 (zeroth derivative), which is OK
    Indeed, the polynomial $p(x) = x - \frac{x^3}{6} + \frac{x^5}{120}$ is even more similar to sine (same 0th to 5th derivatives), p(1) = 0.841666
  6. The sixth derivative of our polynomial should be 0, which the previous polynomial already satisfies.
  7. In the same way, or by spotting the pattern, we can come up with the next, more precise polynomial $p(x) = x - \frac{x^3}{6} + \frac{x^5}{120} - \frac{x^7}{5040}$ (same 0th to 7th derivatives), p(1) = 0.841468

We have reached the polynomial $p(x) = x - \frac{x^3}{6} + \frac{x^5}{120} - \frac{x^7}{5040}$, which is very similar to $\sin(x)$ around 0.
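We can check this numerically with a few lines of Python (the name taylor_sin is mine, used only for this example):

    import math

    # The polynomial derived above: same 0th to 7th derivatives as sine at 0.
    def taylor_sin(x):
        return x - x**3 / 6 + x**5 / 120 - x**7 / 5040

    print(taylor_sin(1.0))  # 0.841468...
    print(math.sin(1.0))    # 0.841470984...

The approximation is excellent near 0 and gradually gets worse as x moves further away from it (compare taylor_sin(3.0) with math.sin(3.0)).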

As you have probably noticed, $\sin(x)$ has all derivatives of even order equal to 0 at the point 0, so the Taylor polynomial only changes when we increase the precision to an odd order of derivative. Now you can see where the term "odd function" came from (symmetry with respect to the origin): the Taylor polynomial of such a function at 0 contains only odd powers of x.

Taylor polynomial for any function

For a general function f we want to find a Taylor polynomial p which is very similar to f around some point a. We know several derivatives of f at the point a. The polynomial p must have the same derivatives as f at the point a, i.e. $p^{(i)}(a) = f^{(i)}(a)$. That is our only requirement on p.

When p is differentiated $i$ times and we substitute a into it, all summands should become 0 and only the value $f^{(i)}(a)$ should remain. Thus the polynomial p must contain the values of these derivatives. The summands should also contain powers of (x-a), so that they become 0 after substituting a. Each summand should have a denominator which cancels against the exponents that come down as factors during repeated differentiation (each exponent decreasing by 1 with every differentiation). That denominator is the product of these exponents, i.e. a factorial.

The following expression satisfies our requirements:

$p(x) = f(a) + \frac{f'(a) * (x-a) }{ 1! } + \frac{f''(a) * (x-a)^2 }{ 2! }+ \frac{f'''(a) * (x-a)^3 }{ 3! } + ...$

After substituting a, the polynomial equals $f(a)$ (other summands are 0). Let's try to differentiate it.

$p'(x) = 0 + f'(a) + \frac{f''(a) * (x-a)^1 }{ 1! }+ \frac{f'''(a) * (x-a)^2 }{ 2! } + ...$

After substituting a, the polynomial equals $f'(a)$ (other summands are 0). It works in the same way for higher orders of derivative.

Our polynomial for sine (a=0, zeroth derivative 0, first 1, second 0, third -1, fourth 0, ...) is:

$p(x) = 0 + \frac{1 * (x-0) }{ 1! } + \frac{0 * (x-0)^2 }{ 2! }+ \frac{-1 * (x-0)^3 }{ 3! } + \frac{0* (x-0)^4 }{ 4! } + \frac{1 * (x-0)^5 }{ 5! } +... $
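This formula translates almost directly into code. Here is a minimal sketch (the function name taylor and its arguments are my own, and it assumes we are given the derivative values $f(a), f'(a), f''(a), \dots$ as a list):

    from math import factorial

    def taylor(derivs_at_a, a, x):
        # Sum of f^(i)(a) * (x - a)^i / i! over all given derivative values.
        return sum(d * (x - a)**i / factorial(i)
                   for i, d in enumerate(derivs_at_a))

    # Sine around a = 0: derivatives 0, 1, 0, -1, 0, 1, 0, -1, ...
    print(taylor([0, 1, 0, -1, 0, 1, 0, -1], 0, 1.0))  # 0.841468...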

Taylor series

If we know all derivatives of some function, up to infinity (e.g. for sine, where they repeat), we can express that function with an infinite Taylor polynomial. The value of such a function can then be expressed as the sum of a series (of infinitely many numbers).

Example:

For the function $y=e^x$ we know that it has the value 1 at the point 0, and its first and all higher derivatives there are also 1. Let's write the Taylor polynomial for this function:

$p(x) = 1 + \frac{1 * (x-0) }{ 1! } + \frac{1* (x-0)^2 }{ 2! }+ \frac{1* (x-0)^3 }{ 3! } + ...$

Now we can compute the value of Euler's number $e$ (i.e. $e^1$) to any precision. It is:

$e = 1 + 1 + \frac{1}{2}+ \frac{1}{6} + \frac{1}{24} + \frac{1}{120} + ...$
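A few lines of Python sum this series term by term (the number of terms, 11, is an arbitrary choice for the example):

    from math import e, factorial

    # Sum the first 11 terms: 1/0! + 1/1! + 1/2! + ... + 1/10!
    approx = sum(1 / factorial(k) for k in range(11))

    print(approx)  # 2.7182818011...
    print(e)       # 2.718281828459045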
