Problem with proof of Chain rule for f:R->R

In summary, the thread discusses an exercise from Rosenlicht's Introduction to Analysis in which a flawed "proof" of the chain rule is given and the reader is asked to find the flaw. Two issues come up: the numerator is written as f(g(x_0)) - f(g(x)) instead of f(g(x)) - f(g(x_0)), which is just a typo, and, more seriously, g(x) may equal g(x_0) infinitely many times in every neighborhood of x_0, so the argument divides by zero. One way to handle that possibility is to define an auxiliary function h(x) equal to the difference quotient of f when g(x) ≠ g(x_0) and equal to f'(g(x_0)) otherwise; another is to treat that case separately and show that the derivative of the composite is then zero.
  • #1
Bacle

Hi, Analysts:

I am going over problems in Rosenlicht's Intro. Analysis book. In this problem, he asks one to find the flaw in the following argument to the effect that (f(g(x)))' = f'(g(x))g'(x). Unfortunately, the author does not clearly state the assumptions; I believe he assumes that f is differentiable at g(x_0) and that g'(x_0) exists.

Here is the exercise:

(f(g(x_0)))' =

1) [tex]\lim_{x\rightarrow x_0} \frac{f(g(x_0))-f(g(x))}{x-x_0} =[/tex]

(multiply by (g(x)-g(x_0))/(g(x)-g(x_0)))

2) [tex]\lim_{x\rightarrow x_0} \frac{f(g(x_0))-f(g(x))}{g(x)-g(x_0)}\cdot\frac{g(x)-g(x_0)}{x-x_0} =[/tex]

3) [tex]\lim_{x\rightarrow x_0} \frac{f(g(x_0))-f(g(x))}{g(x)-g(x_0)}\cdot\lim_{x\rightarrow x_0}\frac{g(x)-g(x_0)}{x-x_0} =[/tex]

4) [tex]f'(g(x_0))\, g'(x_0).[/tex]

I know the problem is in step 3, since f(g(x_0)) - f(g(x)) can be zero even for f continuous (or even smooth, real-analytic, etc.), using examples like f(x) = x^2 sin(1/x) for x ≠ 0 and f(x) = 0 otherwise.


My question is: what additional conditions do we need on g to make the proof work, and/or how can we change the proof to make it work in a more general case?


Thanks.
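
As a purely numerical illustration of the example function above (a rough sketch; the name phi and the sample points are my own choices, and I take the value at 0 to be 0): x^2 sin(1/x) is differentiable at 0 with derivative 0, yet it returns to the value 0 at every point x = 1/(nπ), hence infinitely often in any neighborhood of 0.

[code]
import math

# The example function from the post: x^2 * sin(1/x), extended by 0 at x = 0.
def phi(x):
    return x**2 * math.sin(1.0 / x) if x != 0 else 0.0

# phi vanishes at x = 1/(n*pi), i.e. arbitrarily close to 0
# (up to floating-point roundoff in sin(n*pi)).
for n in (10, 100, 1000):
    x = 1.0 / (n * math.pi)
    print(f"x = {x:.3e},  phi(x) = {phi(x):.3e}")

# The difference quotient at 0 is h*sin(1/h), which tends to 0, so phi'(0) = 0.
for h in (1e-2, 1e-4, 1e-6):
    print(f"h = {h:.0e},  (phi(h) - phi(0))/h = {(phi(h) - phi(0)) / h:.3e}")
[/code]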
 
  • #2


Are you certain the problem lies with f(g(x_0)) - f(g(x))?
I think that the problem is rather that g(x_0) - g(x) can be zero, and that you then divide by zero.

The way to save the situation is, I think, to make the assumption that [tex]g(x)\neq g(x_0)[/tex] in a neighbourhood of x_0.
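
A sketch of how the argument goes through under that extra assumption (my own write-up, keeping the standard hypotheses that g'(x_0) exists and f is differentiable at g(x_0)): since g'(x_0) exists, g is continuous at x_0, so g(x) → g(x_0); and since g(x) ≠ g(x_0) near x_0, the composition-of-limits theorem applies to the first factor:

[tex]\lim_{x\rightarrow x_0} \frac{f(g(x)) - f(g(x_0))}{g(x) - g(x_0)} = \lim_{u\rightarrow g(x_0)} \frac{f(u) - f(g(x_0))}{u - g(x_0)} = f'(g(x_0)).[/tex]

Both limits in step 3 then exist, so splitting the limit into a product is legitimate and the result is f'(g(x_0))g'(x_0).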
 
  • #3


There are actually two problems with this proof. The first is that you have written f(g(x_0)) - f(g(x)) in the numerator instead of f(g(x)) - f(g(x_0)) -- I assume this is a typo. The second problem, as micromass says, is with g, not f -- namely, g(x) may equal g(x_0) infinitely many times in any neighborhood of x_0. There is a way to handle that possibility, though -- define the function h(x) as follows:

h(x) = (f(g(x)) - f(g(x_0)))/(g(x) - g(x_0)) if g(x) ≠ g(x_0); h(x) = f'(g(x_0)) if g(x) = g(x_0).

It is a simple matter of checking cases to verify that h(x) * (g(x) - g(x_0))/(x - x_0) = (f(g(x)) - f(g(x_0)))/(x - x_0) for every x ≠ x_0, and it is also straightforward to verify that h is continuous at x_0. Then it follows immediately that:

[tex]\lim_{x\rightarrow x_0} \frac{f(g(x)) - f(g(x_0))}{x - x_0} = \lim_{x \rightarrow x_0} h(x) \frac{g(x) - g(x_0)}{x - x_0} = f'(g(x_0))g'(x_0)[/tex]
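
For completeness, a sketch of the case check mentioned above (my own write-up, not part of the original post): for x ≠ x_0,

[tex]g(x)\neq g(x_0):\quad h(x)\,\frac{g(x)-g(x_0)}{x-x_0} = \frac{f(g(x))-f(g(x_0))}{g(x)-g(x_0)}\cdot\frac{g(x)-g(x_0)}{x-x_0} = \frac{f(g(x))-f(g(x_0))}{x-x_0},[/tex]

[tex]g(x) = g(x_0):\quad h(x)\,\frac{g(x)-g(x_0)}{x-x_0} = f'(g(x_0))\cdot 0 = 0 = \frac{f(g(x))-f(g(x_0))}{x-x_0}.[/tex]

Continuity of h at x_0 follows from the differentiability of f at g(x_0) together with the continuity of g at x_0 (which holds because g'(x_0) exists).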
 
  • #4


It is also possible to treat the case where g equals g(x_0) in every nbhd of x_0 as a special case: then it is easy to see that the derivative is zero, so the formula holds as well.
 
  • #5


mathwonk said:
It is also possible to treat the case where g equals g(x_0) in every nbhd of x_0 as a special case: then it is easy to see that the derivative is zero, so the formula holds as well.

Is it? I agree that it's easy to see that if the derivative exists it must be zero, but how would you show that f(g(x)) is differentiable at x_0 without doing something like what I mentioned?
 
  • #6


Yes, thanks all. And sorry for the typos and carelessness; I will try to make sure I am well-awake next time I post. Yes, the problem is clearly when g(x_0) - g(x) is zero infinitely often, and not what I stated. Sorry.
 
  • #7


To show a limit is zero, it suffices to look only at points where the value is NOT zero, since where it IS zero it obviously converges to zero. Now when g(x) = g(x_0), we do have [f(g(x)) - f(g(x_0))]/[x - x_0] = 0, so we can ignore those points.

Thus, looking only at points where g(x) ≠ g(x_0), you can use the same trick as in the general case, i.e. there

[f(g(x)) - f(g(x_0))]/[x - x_0] = [f(g(x)) - f(g(x_0))]/[g(x) - g(x_0)] * [g(x) - g(x_0)]/[x - x_0],

and the second expression converges by the product rule to f'(g(x_0)) * 0 = 0.
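
One way to write out the estimate behind that product-rule step explicitly (my own phrasing): for x near x_0 with g(x) ≠ g(x_0),

[tex]\left|\frac{f(g(x))-f(g(x_0))}{x-x_0}\right| = \left|\frac{f(g(x))-f(g(x_0))}{g(x)-g(x_0)}\right|\cdot\left|\frac{g(x)-g(x_0)}{x-x_0}\right| \leq M\left|\frac{g(x)-g(x_0)}{x-x_0}\right| \rightarrow M\,|g'(x_0)| = 0,[/tex]

where M is any bound on the difference quotient of f near g(x_0) (such a bound exists because f'(g(x_0)) exists), and g'(x_0) = 0 because g(x) = g(x_0) at points arbitrarily close to x_0.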

Indeed your proof is apparently a disguised version of this one, but why disguise the idea? This was the standard proof a hundred years ago, before some smart aleck decided to make the proof look harder by using the clever tricks found in books today. G. H. Hardy may be to blame, as his book has a linear-approximation proof, which however has other virtues in higher dimensions.
 

Related to Problem with proof of Chain rule for f:R->R

What is the Chain Rule?

The Chain Rule is the rule for differentiating a composite function: if g is differentiable at x_0 and f is differentiable at g(x_0), then f(g(x)) is differentiable at x_0 and its derivative there is f'(g(x_0))g'(x_0).

What is the problem with the proof of Chain Rule for f:R->R?

The flawed proof multiplies and divides by g(x) - g(x_0), which may be zero for points x arbitrarily close to x_0 (for example when g is constant near x_0, or when g(x) = x^2 sin(1/x) and x_0 = 0). At those points the middle expression is undefined, so the step that splits the limit into a product of limits is not justified.

Why is the Chain Rule important?

The Chain Rule is important because it allows us to find the rate of change of a composite function, which is crucial in many fields such as physics, economics, and engineering. It also serves as a fundamental rule in calculus and is used in various other derivative rules.

Are there any alternative proofs for the Chain Rule?

Yes. One approach, used in this thread, replaces the difference quotient of f by an auxiliary function h that is defined to be f'(g(x_0)) wherever g(x) = g(x_0); another treats the case where g(x) = g(x_0) infinitely often near x_0 separately and shows the derivative is then zero. A third approach, found in Hardy's book, uses linear approximation and also generalizes to higher dimensions.

How can we address the problem with the proof of Chain Rule for f:R->R?

Either add the hypothesis that g(x) ≠ g(x_0) in some neighborhood of x_0, in which case the original argument goes through, or use one of the alternative proofs above, which need only the standard hypotheses that g is differentiable at x_0 and f is differentiable at g(x_0).
