Can the Taylor Series of Analytic Functions be Proven?

In summary: if f(x) = \sum_{n=0}^\infty a_n (x - a)^n, then differentiating term by term and evaluating at x = a gives a_n = f^{(n)}(a)/n!, which is the Taylor series formula.
  • #1
timm3r
I'm having a hard time understanding Taylor series: why it works and how it works. If someone could please explain it to me, that would be great. My teacher explained it in class, but he goes so fast that I have no idea what he's saying. He did give us some practice problems, but if I have no idea how it works, I won't be able to do them. These are a couple of the problems he gave, with x = a = 0.
1. y = e^x
2. y = sin x
3. y = 1/(1 - x)
4. y = ln(1 - x)
 
  • #2
OK, well, the theory behind it is actually very simple.

If we want to approximate a function near, say, x = 0, then the approximation should obviously have the same value at x = 0. Preferably, it should also match the function at the points around x = 0; to achieve this, the derivatives should be the same too. That's the simple theory.

For a better introduction, I have attached a PowerPoint presentation that will explain it better than I can. I renamed the extension to .pdf because PF wouldn't let me upload a file of this size otherwise, so when you get it, rename the extension back to .zip and then extract it. Good luck.
 

Attachments

  • Calc09_2day1.pdf
    212.8 KB
  • #3
Do you want a "proof" of the Taylor series (proof of what, exactly? You can't prove that the Taylor series sums to the original function, that's not always true!) or do you want to find the Taylor series of those functions?

There is also no "proof" that a Taylor polynomial (the Taylor series stopped at a particular finite power) is in any sense the "best" approximation; that also is not always true.

Think of it this way-
If we wanted the linear function such that its value and its derivative were the same as f's at some x = a, then we must have the tangent line there: y = f(a) + f'(a)(x - a).
If we want the second degree (quadratic) function that has exactly the same value, first derivative, and second derivative at x= a, then we must have y"= f"(a) so y'= f"(a)x+ C. But y'(a)= f"(a)a+ C= f'(a) so that C= f'(a)- f"(a)a:
y'= f"(a)x+ f'(a)- f"(a)a= f"(a)(x-a)+ f'(a). Integrating again,
[tex]y= \frac{f"(a)}{2}(x-a)^2+ f'(a)x+ C[/tex]
Since
[tex]y(a)= \frac{f"(a)}{2}(a-a)^2+ f'(a)a+ C= f'(a)a+ C= f(a)[/tex]
C= f(a)- f'(a)a and
[tex]y(x)= \frac{f"(a)}{2}(x-a)^2+ f'(a)x+ f(a)- f'(a)a[/tex]
[tex]= \frac{f"(a)}{2}(x-a)^2+ f'(a)(x-a)+ f(a)[/tex]

Starting from y^(n)(a) = f^(n)(a) and integrating repeatedly, always requiring that y^(k)(a) = f^(k)(a) for k < n, you get the nth Taylor polynomial.


The Taylor series for any infinitely differentiable function, f(x), about x= a, is given by
[tex]\Sigma_{n=0}^\infty \frac{f^{(n)}(a)}{n!}(x- a)^n[/tex]
where f^(n)(a) is the nth derivative of f evaluated at x = a. In particular, since you always have a = 0, the Taylor series is
[tex]\Sigma_{n=0}^\infty \frac{f^{(n)}(0)}{n!}x^n[/tex]

Now the important question! Do you know how to differentiate those functions? In particular can you see a pattern and guess the formula for the nth derivative at x= 0?
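Not part of the original thread, but here is a quick numeric sketch in Python (standard library only; the function names are my own) of the patterns hinted at above. Once you guess f^(n)(0) for each of the four practice functions, the partial sums of the Maclaurin series should approach the true value:

```python
import math

def maclaurin_exp(x, n_terms):
    # e^x: every derivative at 0 is 1, so a_k = 1/k!
    return sum(x**k / math.factorial(k) for k in range(n_terms))

def maclaurin_sin(x, n_terms):
    # sin x: the nonzero derivatives at 0 alternate +1, -1 on odd orders
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(n_terms))

def maclaurin_geom(x, n_terms):
    # 1/(1-x): f^(k)(0) = k!, so a_k = 1 for every k
    return sum(x**k for k in range(n_terms))

def maclaurin_log(x, n_terms):
    # ln(1-x): f^(k)(0) = -(k-1)! for k >= 1, so a_k = -1/k
    return -sum(x**k / k for k in range(1, n_terms + 1))

x = 0.3  # well inside every radius of convergence here
print(abs(maclaurin_exp(x, 10) - math.exp(x)))      # error is tiny
print(abs(maclaurin_sin(x, 6) - math.sin(x)))       # error is tiny
print(abs(maclaurin_geom(x, 30) - 1/(1 - x)))       # error is tiny
print(abs(maclaurin_log(x, 40) - math.log(1 - x)))  # error is tiny
```

Each coefficient formula comes straight from differentiating the function repeatedly and evaluating at 0, which is exactly the pattern-spotting exercise suggested above.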
 
  • #4
OK, thanks, you guys explained it a lot better for me.
 
  • #5
The attachment isn't working. And I think he (and I) want a proof of the Taylor series; I mean, how do we get the formula
[tex]
\Sigma_{n=0}^\infty \frac{f^{(n)}(a)}{n!}(x- a)^n
[/tex]
?
 
  • #6
I'll show you how to get the Taylor series. First, start with a power series about x=a:

[tex]
f(x) = \sum\limits_{n = 0}^\infty {a_n (x - a)^n } = a_0 + a_1 (x - a) + a_2 (x - a)^2 + \cdots + a_n (x - a)^n + \cdots
[/tex]

Differentiate term by term...

[tex]
\begin{gathered}
f(x) = a_0 + a_1 (x - a) + a_2 (x - a)^2 + \cdots + a_n (x - a)^n + \cdots \hfill \\
f'(x) = a_1 + 2a_2 (x - a) + 3a_3 (x - a)^2 + \cdots + na_n (x - a)^{n - 1} + \cdots \hfill \\
f''(x) = 1 \cdot 2a_2 + 2 \cdot 3a_3 (x - a) + 3 \cdot 4a_4 (x - a)^2 + \cdots \hfill \\
f'''(x) = 1 \cdot 2 \cdot 3a_3 + 2 \cdot 3 \cdot 4a_4 (x - a) + 3 \cdot 4 \cdot 5a_5 (x - a)^2 + \cdots \hfill \\
\end{gathered}
[/tex]

Then realize:

[tex]
f^{(n)} (x) = n!\,a_n + \cdots
[/tex]

The dots represent a sum of terms that each have (x - a) as a factor, so they all vanish when we set x = a. Evaluating at x = a therefore gives a_n:

[tex]
a_n = \frac{{f^{(n)} (a)}}
{{n!}}
[/tex]

Now plug this result into our original power series about x=a to get the Taylor series of a function:

[tex]
\sum\limits_{n = 0}^\infty {\frac{{f^{(n)} (a)}}
{{n!}}} (x - a)^n
[/tex]
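As a sanity check (my addition, not part of the original post; plain Python, names my own), the key fact used above, that the nth derivative of a power series evaluated at x = a picks out exactly n! times a_n, can be verified mechanically on a finite polynomial:

```python
import math

# Represent p(x) = sum c[k] * (x - a)^k by its coefficient list c.
def derivative(c):
    # d/dx of sum c[k](x-a)^k = sum k*c[k](x-a)^(k-1)
    return [k * c[k] for k in range(1, len(c))]

def value_at_a(c):
    # At x = a, every (x - a) factor is 0, so only the constant term survives.
    return c[0] if c else 0.0

coeffs = [5.0, -2.0, 3.0, 0.5, 7.0]  # arbitrary a_0 .. a_4
for n in range(len(coeffs)):
    c = coeffs[:]
    for _ in range(n):
        c = derivative(c)
    # f^(n)(a) / n! should recover a_n exactly
    assert value_at_a(c) / math.factorial(n) == coeffs[n]
print("a_n = f^(n)(a)/n! recovered for every n")
```

The same bookkeeping is what the term-by-term differentiation above does symbolically for the infinite series.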
 
  • #7
CylonMath said:
The attachment isn't working. And I think he (and I ) want the proof of Taylor series , I mean how we get [tex]
\Sigma_{n=0}^\infty \frac{f^{(n)}(a)}{n!}(x- a)^n
[/tex]
equation by proof ?
You don't "prove" it- that is the definition of the Taylor's series for a function having all derivatives.

If you mean, "prove that converges to the original function f(x) for all x in the radius of convergence", you can't- it isn't true. That is only true for "analytic" functions which, again, are defined as functions for which that is true!

For example, the function
[tex]f(x)= e^{-1/x^2}[/tex]
if x is not 0, with f(0) = 0. It has all derivatives, and all of its derivatives at x = 0 are 0, which means that its Taylor series about x = 0 is simply 0 + 0x + 0x^2 + ... = 0, which is not equal to f(x) for any non-zero x.
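A quick numeric illustration of that counterexample (my addition, not from the thread; plain Python): the function is strictly positive away from 0, while its Maclaurin series is identically zero, so the series misses it at every nonzero x.

```python
import math

def f(x):
    # Smooth everywhere, but not analytic at 0: all derivatives vanish there.
    return math.exp(-1.0 / x**2) if x != 0 else 0.0

# The Maclaurin series of f is identically 0, so the "approximation error"
# |f(x) - 0| is just f(x) itself: nonzero for every x != 0.
for x in (0.5, 0.2, 0.1):
    print(x, f(x))
```

The values shrink extremely fast as x approaches 0, which is exactly why every derivative at 0 comes out to 0 even though the function is not the zero function.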

(Linear Space's "proof" starts by assuming that there exist a power series equal to the function. What he showed was that if that is true, then the Taylor's series is that power series.)
 
  • #8
I am actually looking for a proof as well.

You say that is the definition of the Taylor series, but how does one prove that if a function F is analytic, it can be represented by a power series of the form
[tex]\Sigma^{\infty}_{n=0}a_nz^n[/tex]
where
[tex]a_n = f^{(n)}(0)/n![/tex]

My teacher recommended a method including 'picking z, using the Cauchy integral formula to compute f(z) as an integral, expand the integrand in a geometric series with ratio z/zeta, integrate term by term, and use CIF again to identify the integrals as a_n.'

But, I'm not sure exactly what is meant by 'pick z.' Or how to do this, really.

Thanks!
 
  • #9
Still curious
 
  • #10
saraaaahhhhhh said:
I am actually looking for a proof as well.

You say that is the definition of the Taylor series, but how does one prove that if a function F is analytic, it can be represented by a power series of the form
[tex]\Sigma^{\infty}_{n=0}a_nz^n[/tex]
where
[tex]a_n = f^{(n)}(0)/n![/tex]

My teacher recommended a method including 'picking z, using the Cauchy integral formula to compute f(z) as an integral, expand the integrand in a geometric series with ratio z/zeta, integrate term by term, and use CIF again to identify the integrals as a_n.'

But, I'm not sure exactly what is meant by 'pick z.' Or how to do this, really.

Thanks!
For the third or fourth time now, you don't "prove" a definition!

And "can be represented by a power series of the form
[tex]\Sigma^{\infty}_{n=0}a_nz^n[/tex]"
in some neighborhood of 0 (or z0 if you use (z- z0)^n) is the definition of "analytic at 0" (or z0).

Once you have
[tex]f(z)= \Sigma^{\infty}_{n=0}a_n(z- z_0)^n[/tex]
in some neighborhood of z0, taking z = z0 makes all but the 0th term 0 and gives f(z0) = a0.
Differentiating term by term gives
[tex]f'(z)= \Sigma^{\infty}_{n=1}na_n(z- z_0)^{n-1}[/tex]
and setting z= z0 gives
f'(z0)= a1, etc.
 
  • #11
timm3r said:
ok thanks you guys explained it a lot more for me.

Were your questions actually answered? Have you been able to find the Taylor series of those functions?

Also, the Taylor series at a = 0 is given a special name: the Maclaurin series.
 
  • #12
saraaaahhhhhh said:
I am actually looking for a proof as well.

You say that is the definition of the Taylor series, but how does one prove that if a function F is analytic, it can be represented by a power series of the form
[tex]\Sigma^{\infty}_{n=0}a_nz^n[/tex]
where
[tex]a_n = f^{(n)}(0)/n![/tex]

I think it would be best to start a new thread about this, as your problem is within complex analysis, where the original poster seems to be in a calculus 2 course.

HallsofIvy said:
For the third or fourth time now, you don't "prove" a definition!

And "can be represented by a power series of the form
[tex]\Sigma^{\infty}_{n=0}a_nz^n[/tex]"
in some neighborhood of 0 (or z0 if you use (z- z0)^n) is the definition of "analytic at 0" (or z0).

In complex analysis, this is not the definition of an analytic function. For example, look within Conway's Functions of One Complex Variable I on p. 34.

A function [itex]f:G\to \mathbb{C}[/itex] is analytic if f is continuously differentiable on G.

Later it is proven that if f is analytic then it has a power series representation, with a formula for the coefficients. This is what saraaaahhhhhh is referring to. It is also a theorem in complex analysis that a function with a power series representation is analytic, but these are not the definitions, as you imply they are. So yes, the statement that saraaaahhhhhh gave CAN be proven.

Also, look in Gamelin's Complex Analysis on p. 45.
 

Related to Can the Taylor Series of Analytic Functions be Proven?

1. What is a Taylor series?

A Taylor series is a representation of a function as an infinite sum of terms, where each term is a constant multiple of a power of (x - a) for the chosen center point a.

2. What is the purpose of a Taylor series?

The purpose of a Taylor series is to approximate a function near a specific point by using polynomials of increasing degree. This lets us estimate the value of a function at a point using only its derivatives at the center, which is often easier than evaluating the function directly.

3. How is a Taylor series calculated?

A Taylor series is calculated by taking derivatives of a function at a specific point and evaluating those derivatives at that point. The resulting coefficients of the derivatives are then used to construct the terms of the series.

4. What is the difference between a Taylor series and a Maclaurin series?

A Taylor series represents a function about any point x = a, while a Maclaurin series is the special case of a Taylor series where a = 0. This means that the Maclaurin series is written in powers of x itself.

5. How does the number of terms in a Taylor series affect its accuracy?

The more terms included in a Taylor series, the more accurate the approximation generally becomes within the series' radius of convergence. Any finite truncation is only an approximation; for an analytic function, the full infinite series does equal the function inside its radius of convergence, but in practice only finitely many terms can ever be computed, so including more terms gives a closer approximation at the cost of more work.
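As an illustration of that point (a sketch I've added, plain Python, names my own), here is the truncation error of the Maclaurin series of e^x at x = 1 shrinking as terms are added:

```python
import math

def exp_partial_sum(x, n_terms):
    # First n_terms terms of the Maclaurin series of e^x
    return sum(x**k / math.factorial(k) for k in range(n_terms))

x = 1.0
errors = [abs(exp_partial_sum(x, n) - math.exp(x)) for n in (2, 4, 8, 16)]
# Each doubling of the term count shrinks the error dramatically,
# because the k-th term is bounded by 1/k!.
print(errors)
```

With 16 terms the error is already near machine precision, which is why a modest Taylor polynomial is often all a practical computation needs.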
