Continuity and Differentiability of Infinite Series

In summary, the problem asked for proofs of the convergence, continuity, and differentiability of the function f(x) = ##\sum\limits_{n=1}^\infty\frac{\sin^n(x)}{\sqrt{n}}## on the interval (-π/2, π/2). The first part follows easily from the comparison test, while the second and third parts hinge on showing uniform convergence, which proved difficult. Attempts using the Weierstrass M-test on the whole interval and Taylor expansions both ran into problems. It may be that the series is not uniformly convergent on the full open interval, and further work (for example, restricting to closed subintervals, as suggested in the thread) is needed to complete the proofs of continuity and differentiability of f(x).
  • #1
AnalysisNewb

Homework Statement



I came across a problem with f: (-π/2, π/2) → ℝ defined by f(x) = [itex]\sum\limits_{n=1}^\infty\frac{\sin^n(x)}{\sqrt{n}}[/itex].

The problem had three parts.

The first was to prove the series was convergent ∀ x ∈ (-π/2, π/2)

The second was to prove that the function f(x) was continuous over the same interval

The third was to prove that the function f(x) was differentiable over the interval

Homework Equations

The Attempt at a Solution



The first part of the problem was simple: compare with the geometric series ##\sum |\sin(x)|^n##, which converges as long as |sin(x)| < 1. Since |sin(x)| = 1 only at x = ±π/2, which are excluded from the open interval, the series converges for every x in (-π/2, π/2).
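Written out, the comparison I had in mind is roughly
$$\sum_{n=1}^\infty \left|\frac{\sin^n(x)}{\sqrt{n}}\right| \le \sum_{n=1}^\infty |\sin(x)|^n = \frac{|\sin(x)|}{1-|\sin(x)|} < \infty \qquad \text{whenever } |\sin(x)| < 1,$$
so the series in fact converges absolutely at each point of the interval.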

The second part of the problem, I ran into difficulty. I tried to show that f(x) is the limit of a series of continuous functions to prove the continuity of f(x), but then I remembered that for this to work, the series of continuous functions has to be uniformly convergent and not merely convergent.

So, I tried to prove the series was uniformly convergent using the Weierstrass M-test, but failed because I couldn't find a convergent series of M terms that bounds the absolute value of the terms of the series for every x in the interval.

This same issue with convergent vs. uniformly convergent is what is plaguing me with the third part as well.

Any advice on how to work around this convergent vs. uniformly convergent issue would be greatly appreciated. If I can prove that the series is uniformly convergent, the second and third parts are quite easy: the Uniform Limit Theorem handles part 2, and the theorem on term-by-term differentiation (which needs the differentiated series to converge uniformly) handles part 3.
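For reference, the term-by-term derivative that the part-3 theorem would have to deal with is just the routine computation
$$\sum_{n=1}^\infty \frac{d}{dx}\,\frac{\sin^n(x)}{\sqrt{n}} = \sum_{n=1}^\infty \sqrt{n}\,\sin^{n-1}(x)\cos(x).$$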
 
  • #2
Is there any reason this function would not be uniformly convergent? The convergence properties are based on n and not x.
I think the wording is that for any epsilon>0, there exists an n such that |f_n(x) - f(x) | < epsilon for all x.
Maybe you need to add a caveat in there, like for -π/2 + δ < x < π/2 - δ, but as you said, it should work out.
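Spelling that caveat out (sketch only): if ##|x| \le \pi/2 - \delta##, then ##|\sin(x)| = \sin(|x|) \le \sin(\pi/2 - \delta) = \cos\delta < 1##, so ##M_n = \frac{(\cos\delta)^n}{\sqrt{n}}## would work as Weierstrass M terms on that smaller interval, since ##\sum_n (\cos\delta)^n## is a convergent geometric series.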
If that doesn't work, you could take a Taylor expansion of sin(x) around x_0 and see if you can find that this relation is true:
## \left| \sum_{n=1}^{\infty} \frac{(\sin x_0)^n}{\sqrt{n}} - \sum_{n=1}^{\infty} \frac{(\sin(x_0+\delta))^n}{\sqrt{n}} \right| < \epsilon##
 
  • #3
Yes, your wording for uniform convergence is similar to the one in my book, with the minor exception that it reads: for any ε > 0, there exists an N such that |f_n(x) - f(x)| < ε for all n > N and for all x.

I'm having some difficulty understanding how the convergence properties aren't based on x.

If I take |f_n(x)-f(x)| = [itex]\left|\sum\limits_{m=n+1}^\infty\frac{\sin^m(x)}{\sqrt{m}}\right| \le \sum\limits_{m=n+1}^\infty\frac{|\sin(x)|^m}{\sqrt{m}} \le \sum\limits_{m=n+1}^\infty|\sin(x)|^m[/itex] = [itex] \frac{|\sin(x)|^{n+1}}{1-|\sin(x)|} [/itex],

While [itex] \frac{|\sin(x)|^{n+1}}{1-|\sin(x)|} [/itex] can be made less than epsilon for each fixed x, doesn't the n needed to do so depend on x?
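Making that explicit (taking ##0 < \sin(x) < 1## for simplicity), the bound drops below epsilon exactly when
$$ (n+1)\ln(\sin(x)) < \ln\!\big(\epsilon\,(1-\sin(x))\big), \qquad \text{i.e.} \qquad n+1 > \frac{\ln\!\big(\epsilon\,(1-\sin(x))\big)}{\ln(\sin(x))}, $$
and the right-hand side goes to infinity as x → π/2, so no single n works for the whole interval.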

I can't think of a different way to do this, especially since [itex]\sum\limits_{n=1}^\infty\frac{1}{\sqrt{n}}[/itex] is divergent...
 
  • #4
You're right, I missed that part: as x goes to π/2, the n required for a given epsilon goes to infinity.
Then I would recommend the epsilon-delta style proof with the Taylor expansion.
 
  • #5
RUber said:
Then I would recommend the epsilon delta style proof with the Taylor expansion.

Trying this, it seems like the Taylor expansion doesn't aid the convergence, because the terms of the resulting series are either unchanged sine-series terms (when the powers are even) or slightly increased in amplitude when the power is a product of two of 3, 5, 7, 9, etc.

For example
f(x) = (x - x^3/3! + x^5/5! - ... ) + (1/sqrt(2)) ( x - x^3/3! + x^5/5! - ... )^2 + (1/sqrt(3)) ( x - x^3/3! + x^5/5! - ... )^3 + ...

[itex] = x + \frac{x^2}{\sqrt{2}} + x^3\left(\frac{1}{\sqrt{3}} - \frac{1}{3!}\right) + \cdots [/itex]

For positive x, this series appears to be bounded term by term by [itex] \sum\limits_{n=1}^\infty\frac{x^n}{\sqrt{n}}[/itex]

I don't think that bound converges except when |x| < 1, which doesn't cover the whole interval, so it doesn't settle uniform convergence either. (Although I might just be frustrated with all of the terms.)

Could it be possible that the initial [itex] f(x) = \sum\limits_{n=1}^\infty\frac{\sin^n(x)}{\sqrt{n}} [/itex] is not uniformly convergent on the interval?
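One thing that makes me suspect this, using the same divergence of ##\sum 1/\sqrt{n}## as above (this is only a sketch): for any fixed N, I can pick M with ##\sum_{m=N+1}^{M}\frac{1}{\sqrt{m}} > 1##, and then, letting x → π/2,
$$ \sum_{m=N+1}^{M}\frac{\sin^m(x)}{\sqrt{m}} \;\to\; \sum_{m=N+1}^{M}\frac{1}{\sqrt{m}} > 1, $$
so these blocks of the series cannot be made uniformly small, which is exactly what the Cauchy criterion for uniform convergence would require.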
 
  • #6
I was thinking something along the lines of ##\sin(x_0+\delta)= \sum_{i=0}^\infty \frac{\delta^i}{i!}\sin^{(i)}(x_0)##
Not for convergence, but to get straight at continuity.
That should give you a problem that looks like
##\sum_{n=1}^\infty\sum_{i=1}^\infty \frac{\delta^i}{i!\sqrt{n}}<\epsilon##
That too seems problematic. Perhaps you need to first fix x = x_0. Then x_0 is at least some δ away from π/2 and -π/2, because the interval is open. For any individual x_0 you should be able to show continuity, and therefore you have continuity at every x.
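One way to organize that (sketch only): pick η > 0 so that ##[x_0-\eta,\,x_0+\eta] \subset (-\pi/2, \pi/2)## and let ##q = \max_{|x-x_0|\le \eta}|\sin(x)| < 1##. Writing ##S_N## for the N-th partial sum,
$$ |f(x)-f(x_0)| \le |f(x)-S_N(x)| + |S_N(x)-S_N(x_0)| + |S_N(x_0)-f(x_0)|, $$
where both tail terms are at most ##\frac{q^{N+1}}{1-q}## for every x in that neighbourhood, and the middle term is small for x close to x_0 because ##S_N## is a finite sum of continuous functions.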
 

Related to Continuity and Differentiability of Infinite Series

1. What is the definition of a convergent infinite series?

An infinite series is convergent if its sequence of partial sums approaches a finite limit as more and more terms are added. In other words, the sum of the first N terms approaches a fixed value as N goes to infinity.
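In symbols: ##\sum_{n=1}^\infty a_n## converges to S when the partial sums ##S_N = a_1 + a_2 + \dots + a_N## satisfy ##\lim_{N\to\infty} S_N = S##.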

2. How do you test for the convergence of an infinite series?

One way to test for convergence is the ratio test, which looks at the limit of the ratio of consecutive terms: if that limit is less than 1, the series converges; if it is greater than 1, the series diverges; if it equals 1, the test is inconclusive. Another test is the integral test, which compares a series with positive, decreasing terms to an improper integral: the series converges if and only if the integral converges.
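For instance, applying the ratio test to ##\sum_{n=1}^\infty \frac{1}{n!}##: the ratio of consecutive terms is ##\frac{1/(n+1)!}{1/n!} = \frac{1}{n+1}##, which tends to 0 < 1, so the series converges.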

3. Can a divergent infinite series still have a finite limit?

No. By definition, a divergent series is one whose partial sums do not approach a finite limit. What is sometimes confused with this is conditional convergence: a series such as the alternating harmonic series ##\sum_{n=1}^\infty \frac{(-1)^{n+1}}{n}## converges, even though the series of absolute values ##\sum_{n=1}^\infty \frac{1}{n}## diverges.

4. What is the relationship between continuity and differentiability of infinite series?

For a function defined by an infinite series, continuity and differentiability are usually obtained from uniform convergence: if a series of continuous functions converges uniformly, its sum is continuous, and if the term-by-term differentiated series also converges uniformly, the sum is differentiable with derivative given by that differentiated series. Differentiability always implies continuity, but a continuous sum need not be differentiable; the Weierstrass function is a classic example.

5. How do you find the sum of an infinite series?

Finding the sum of an infinite series can sometimes be done exactly, for example by a telescoping (partial-sum) argument or by the geometric series formula. In many cases the sum has no closed-form expression and can only be approximated numerically.
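As a quick example, the geometric series formula gives ##\sum_{n=0}^\infty r^n = \frac{1}{1-r}## for |r| < 1, so ##\sum_{n=0}^\infty (1/2)^n = 2##.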
