
Understanding limits

Rido12

Well-known member
MHB Math Helper
Jul 18, 2013
715
I know this is probably a dumb question, but here it is anyway. My textbook says the following: "A function f is differentiable at a if f'(a) exists."

It then follows with an example asking whether f(x) = |x| is differentiable at x = 0. They prove this by finding the limit of its derivative, then splitting it into two equations: for the limit as h -> 0+ and as h -> 0-, finally concluding that it is not differentiable as the limits are different. Done.

My question is as follows: does the method, as demonstrated above, work for all functions when you're trying to find if a certain point is differentiable? I question it because even if the aforementioned limit DOES exist, it doesn't mean that f'(a) exists. It just means that it has a limit, as you can have a limit of something when f'(a) is undefined. Thus the method seems inadequate for drawing such a conclusion.

EDIT:
Sorry if this is confusing, and if you want me to clarify, I can provide a hypothetical situation.
Ok, time to clarify what I'm trying to convey with a hypothetical (not real) situation.

We are asked to see if f(x) = |x| is differentiable at x = 0. Let's PRETEND that the limit exists at f'(a): when you take the limit from the positive and the negative side, both come out to 5. However, f'(a) itself is actually undefined, even though the limit as you approach is 5. By the same logic the textbook example used, we would conclude YES, it is differentiable, BECAUSE the limit exists! But that is actually NOT the case, since by the definition f'(a) MUST exist. In this example the limit exists, but f'(a) doesn't, as it is undefined at a.

That being said, would additional steps be required in order to prove whether something is or is not differentiable? As we see in my hypothetical situation, just determining the limit was not enough.
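For anyone who wants to verify the textbook's |x| computation with a CAS, here is a minimal SymPy sketch (SymPy is my choice here; any CAS would do):

```python
import sympy as sp

h = sp.symbols('h', real=True)

# Difference quotient of f(x) = |x| at a = 0: (|0 + h| - |0|) / h
dq = (sp.Abs(0 + h) - sp.Abs(0)) / h

right = sp.limit(dq, h, 0, dir='+')  # limit as h -> 0+
left = sp.limit(dq, h, 0, dir='-')   # limit as h -> 0-

print(right, left)  # 1 -1: the one-sided limits differ, so f'(0) does not exist
```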
 

Bacterius

Well-known member
MHB Math Helper
Jan 26, 2012
644
Okay, this is hurting my brain so I'll conservatively assert that:

1. If the two limits are the same, you are correct that this says absolutely nothing about the existence of the derivative at $x = a$, so if the limits are the same this is as far as this method will take you.

2. If the two limits are different, then for most well-behaved functions the derivative does not exist at that point. However, there may exist pathological functions which are continuous while their derivative isn't, and vice versa, so this alone may not be sufficient to settle the existence of $f'(a)$.
 

Jameson

Administrator
Staff member
Jan 26, 2012
4,041
This is a really good question.

[Rest of post deleted due to a mistake. Updated thought can be found in my later post]
 

caffeinemachine

Well-known member
MHB Math Scholar
Mar 10, 2012
834
Hey Jameson! :) I think what you said is not entirely correct.

This is a really good question.

In order for a function of one variable, $f(x)$, to be differentiable at a point $a$, it must be continuous at $a$ and the following limit must exist ...
I think the thing in bold is redundant. The continuity of $f$ at $a$ follows from the differentiability of $f$ at $a$.

2) Just knowing that the above limit exists is not enough to say that $f(x)$ is differentiable at $a$. ...
I think knowing that the limit exists is equivalent to $f$ being differentiable at $a$. That is the definition of differentiability at a point. Moreover, strictly speaking, we should not say that $f(x)$ is differentiable at $a$ but simply that $f$ is differentiable at $a$.

Please correct me if I am wrong or if I have misinterpreted your response.
 

caffeinemachine

Well-known member
MHB Math Scholar
Mar 10, 2012
834
I know this is probably a dumb question
This is not a dumb question at all. :)

My textbook says the following: "A function f is differentiable at a if f'(a) exists."
It then follows with and example regarding if f(x) = |x| is differentiable at x = 0. They prove this by finding the limit of its derivative, and then splits it in two equations: for the limit as h -> 0+ and h -> 0-. Finally concluding that it is not differentiable as the limits are different. Done.

My question is as follows: does the method, as demonstrated above, work for all functions when you're trying to find if a certain point is differentiable? I question it because even if the aforementioned limit DOES exist, it doesn't mean that f'(a) exists. It just means that it has a limit, as you can have a limit of something when f'(a) is undefined. Thus, making the method inadequate to making such a conclusion.
I think you are making a mistake here. We write $$f'(a)=\lim_{x\to a}\frac{f(x)-f(a)}{x-a}$$ whether or not the limit exists. When the limit doesn't exist, $f'(a)$ is not defined. When the limit $\lim_{x\to a}\frac{f(x)-f(a)}{x-a}$ exists, $f'(a)$ is the same as this limit; then $f'(a)$ is a real number and certainly it exists.

Note that when $\lim_{x\to a}\frac{f(x)-f(a)}{x-a}$ exists, this doesn't say that $f'(a)$ has a limit; that would have no meaning. It says that $\frac{f(x)-f(a)}{x-a}$ has a limit as $x$ approaches $a$. As a shorthand, and to further the theory, we denote this limit, whether it exists or not, by $f'(a)$.
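The point that $f'(a)$ simply *is* this limit when it exists can be illustrated with a smooth function; a small SymPy sketch (the choices $f(x)=x^2$ and $a=3$ are arbitrary):

```python
import sympy as sp

x = sp.symbols('x', real=True)
f = x**2
a = 3

# The object that has (or fails to have) a limit is the difference
# quotient (f(x) - f(a)) / (x - a), not "f'(a)" itself.
dq = (f - f.subs(x, a)) / (x - a)

fprime_a = sp.limit(dq, x, a)
print(fprime_a)  # 6 -- by definition, this limit is f'(3)
```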

I have not yet read the content in the spoiler box thoroughly. Tell me if you have further doubts or need to discuss.

EDIT: I have implicitly assumed that $f$ is a real function having an interval $I$ in its domain such that $a\in I$.
 

Bacterius

Well-known member
MHB Math Helper
Jan 26, 2012
644
He is taking the limit of the derivative, not the original function itself.
 

Jameson

Administrator
Staff member
Jan 26, 2012
4,041
Hey Jameson! :) I think what you said is not entirely correct.
Yep, I think I need to edit a couple of things.

I think the thing in bold is redundant. The continuity of $f$ at $a$ follows from the differentiability of $f$ at $a$.
Yes I agree that continuity is a necessary condition of differentiability thus differentiability implies continuity, but the converse is not true. That's what I meant.

I think knowing that the limit exists equivalent to $f$ being differentiable at $a$. That is by definition of differentiability at a point. Moreover, strictly, we should not say that $f(x)$ is differentiable at $a$ but simply say that $f$ is differentiable at $a$.

Please correct me if I am wrong or if I have misinterpreted your response.
About the part in bold - yes I agree with you and would normally write that but thought the OP might be used to seeing functions always written in the form "$f(x)$" so wrote it that way. You're right of course.

Now that I think about it more, if the limit exists then the function will be continuous. I was thinking of a situation like $f(x)=\frac{x^2-4}{x-2}$ at $x=2$, where the derivative appears to exist by the limit definition but the function isn't continuous. However, now I see that this won't have a limit under the limit definition of a derivative, so the extra stipulation of being continuous isn't necessary: it's already implied by the existence of the limit in the definition of a derivative.

I suppose the main idea I'm expressing is differentiability implies continuity but the opposite is not true. :)
 

Jameson

Administrator
Staff member
Jan 26, 2012
4,041
He is taking the limit of the derivative, not the original function itself.
Are you sure? I think caffeinemachine addressed his terminology in a way that makes sense. It looks to me like we're talking about the limit definitions of functions. :confused:
 

caffeinemachine

Well-known member
MHB Math Scholar
Mar 10, 2012
834
He is taking the limit of the derivative, not the original function itself.
Umm.. now I am totally confused. :) If you understand his doubt then can you please frame it nicely and post it along with your solution?
 

Bacterius

Well-known member
MHB Math Helper
Jan 26, 2012
644
Are you sure? I think caffeinemachine addressed his terminology in a way that makes sense. It looks to me like the we're talking about the limit definitions of functions. :confused:
That was my first interpretation as well. But after rereading his example it actually seems that he is talking about taking the two-sided limit of the derivative of $f$ to establish whether the derivative exists at a given point (so the question would become: is the derivative being (non-)continuous at $x = a$ a sufficient condition for the (non-)existence of the derivative at $x = a$?)

At least that's my take on it. I figured the question was deeper than just "being continuous is necessary but not sufficient for differentiability". But it's possible I am just talking nonsense here :p haven't done any formal calculus in a while.
 

Rido12

Well-known member
MHB Math Helper
Jul 18, 2013
715
I haven't read all your responses yet, but it is the limit of the derivative. I don't know anymore haha, I'm too confused.

The textbook does this:

$$f'(0) = \lim_{h\to 0}\frac{|0 + h| - |0|}{h}$$
then splits it into two parts to compute the left and right limits separately.
 

Jameson

Administrator
Staff member
Jan 26, 2012
4,041
I'm pretty sure you don't mean "the limit of the derivative" but the "limit definition of the derivative". Your example seems to confirm this.
 

Rido12

Well-known member
MHB Math Helper
Jul 18, 2013
715
Yep, I think I need to edit a couple of things.



Yes I agree that continuity is a necessary condition of differentiability thus differentiability implies continuity, but the converse is not true. That's what I meant.



About the part in bold - yes I agree with you and would normally write that but thought the OP might be used to seeing functions always written in the form "$f(x)$" so wrote it that way. You're right of course.

Now that I think about it more the if the limit exists then it will be continuous. I was thinking of a situation like $f(x)=\frac{x^2-4}{x-2}$ at $x=2$ where the derivative appears to exist by the limit definition but isn't continuous. However now I see that this won't have a limit for the limit definition of a derivative so the extra stipulation of being continuous isn't necessary as it's already implied by the existence of the limit definition of a derivative.

I suppose the main idea I'm expressing is differentiability implies continuity but the opposite is not true. :)
It's quite strange actually. I always see functions written in the form "f(x)" rather than just "f"... My textbook uses "f", so now I do that too, but my previous math teachers used "f(x)".

- - - Updated - - -

I'm pretty sure you don't mean "the limit of the derivative" but the "limit definition of a derivative". Your example seems to confirm this.
I'm confused. OH. My example was just finding the derivative of a function by taking its limit, right? I got confused because I don't usually see the word "limit" when I take derivatives using the power rule, chain rule, etc., and I got mixed up thinking it was the limit of a derivative.

And I haven't finished reading the responses yet from the previous page.
 

Rido12

Well-known member
MHB Math Helper
Jul 18, 2013
715
That was my first interpretation as well. But after rereading his example it actually seems that he is talking about taking the two-sided limit of the derivative of $f$ to establish whether the derivative exists at a given point (so the question would become: is the derivative being (non-)continuous at $x = a$ a sufficient condition for the (non-)existence of the derivative at $x = a$?)

At least that's my take on it. I figured the question was deeper than just "being continuous is necessary but not sufficient for differentiability". But it's possible I am just talking nonsense here :p haven't done any formal calculus in a while.
This is precisely what I meant with my question; I don't know if that is the limit definition of derivatives or not. Now if only my original post were as concise as your explanation. (Envy)

EDIT: On second thought (brain hurts), I'm not sure that's how I meant it, but:
It's just that I'm not convinced that f'(a) exists merely by taking its limit from both sides. And it must exist for the function to be differentiable at that point.

This:
1. If the two limits are the same, you are correct that this says absolutely nothing about the existence of the derivative at $x = a$, so if the limits are the same this is as far as this method will take you.
I think the notation is confusing me; the limit definition of derivatives has both "f'(x)" and "lim" in it, so it makes me want to think that it's taking the LIMIT of the DERIVATIVE, but now that I think about it, it's simply taking the derivative.
 

Rido12

Well-known member
MHB Math Helper
Jul 18, 2013
715
This leads me to the following question:

When would it be necessary to use the limit definition of a derivative, and when can we just solve for the derivative using the chain/product/etc. rules? It is my understanding that the limit definition was used to derive the rules mentioned before.

Isn't the limit definition of a derivative the same as finding the derivative and then apply the limit? So the limit of the derivative?

For example, finding the limit (if there is one) for f(x) = |x| using the limit definition of a derivative, we compute the two one-sided limits and get:
$$\lim_{h\to 0^+}\frac{|h|}{h}=1,\qquad \lim_{h\to 0^-}\frac{|h|}{h}=-1$$

Instead, wouldn't it be possible to find the derivative first, then apply the limit?
So: $\frac{d}{dx}|x| = \frac{2x}{2|x|} = \frac{x}{|x|}$ (skipping steps). Hence:
$$\lim_{x\to 0^+}\frac{x}{|x|}=\frac{x}{x}=1,\qquad \lim_{x\to 0^-}\frac{x}{|x|}=\frac{x}{-x}=-1$$

Same answer.
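Both routes can be checked with a CAS; a minimal SymPy sketch of the "derivative first, then one-sided limits" route (note SymPy writes $x/|x|$ as $\operatorname{sign}(x)$):

```python
import sympy as sp

x = sp.symbols('x', real=True)

# Differentiate first (valid only for x != 0): d/dx |x| = sign(x) = x/|x|
fprime = sp.diff(sp.Abs(x), x)
print(fprime)  # sign(x)

# Then take the one-sided limits of the derivative at 0
right = sp.limit(fprime, x, 0, dir='+')
left = sp.limit(fprime, x, 0, dir='-')
print(right, left)  # 1 -1, matching the limit-definition computation
```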
 

Bacterius

Well-known member
MHB Math Helper
Jan 26, 2012
644
This leads me to the following question:

When would it be necessary to use the limit definition of a derivative, and when can we just solve for the derivative using the chain/product/etc. rules? It is my understanding that the limit definition was used to derive the rules mentioned before.

Isn't the limit definition of a derivative the same as finding the derivative and then apply the limit? So the limit of the derivative?

For example, finding the limit (if there is one) for f(x) = |x| using the limit definition of a derivative, we compute the two one-sided limits and get:
$$\lim_{h\to 0^+}\frac{|h|}{h}=1,\qquad \lim_{h\to 0^-}\frac{|h|}{h}=-1$$

Instead, wouldn't it be possible to find the derivative first, then apply the limit?
So: $\frac{d}{dx}|x| = \frac{2x}{2|x|} = \frac{x}{|x|}$ (skipping steps). Hence:
$$\lim_{x\to 0^+}\frac{x}{|x|}=\frac{x}{x}=1,\qquad \lim_{x\to 0^-}\frac{x}{|x|}=\frac{x}{-x}=-1$$

Same answer.
If you already have an expression for the derivative then you don't need to use the limit definition. You can just evaluate it directly (which you did, since you did not use "h" at all) :)

I am not sure what you mean though - what would be the purpose of calculating such a limit?

The chain, product rules etc.. can all be derived from the limit definition. They are just much more convenient to use, because calculating derivatives from the limit form (the so-called "first principles") gets rather tedious for anything bigger than a quadratic. The idea is that the limit definition gives the most "basic" definition of what a derivative is, which is important in mathematics (where something without a proper definition is useless).
 

Rido12

Well-known member
MHB Math Helper
Jul 18, 2013
715
If you already have an expression for the derivative then you don't need to use the limit definition. You can just evaluate it directly (which you did, since you did not use "h" at all) :)

I am not sure what you mean though - what would be the purpose of calculating such a limit?

The chain, product rules etc.. can all be derived from the limit definition. They are just much more convenient to use, because deriving from the limit form (the so-called "first principles") gets rather tedious for anything bigger than a quadratic.
Do you also mean to imply that finding the limit here is redundant?

Instead, wouldn't it be possible to find the derivative first, then apply the limit?
So: $\frac{d}{dx}|x| = \frac{2x}{2|x|} = \frac{x}{|x|}$ (skipping steps). Hence:
$$\lim_{x\to 0^+}\frac{x}{|x|}=\frac{x}{x}=1,\qquad \lim_{x\to 0^-}\frac{x}{|x|}=\frac{x}{-x}=-1$$
So we could use direct substitution instead of finding the limit, meaning that once we get $f'(x) = \frac{x}{|x|}$ we can see $f'(0) = 0/0$ is undefined, meaning that it is not possible to differentiate at $x = 0$? (Instead of doing all the two-sided limit hassle that was only necessary when using the limit definition of a derivative.)
 

Bacterius

Well-known member
MHB Math Helper
Jan 26, 2012
644
Do you also mean to imply that finding the limit here is redundant?



So we could use direct substitution instead of finding the limit, meaning that once we get $f'(x) = \frac{x}{|x|}$ we can see $f'(0) = 0/0$ is undefined, meaning that it is not possible to differentiate at $x = 0$?
Well, the same holds true for the function itself. Before differentiating your function you should rigorously show that it is in fact continuous at the locations you want to differentiate it at; otherwise the function is not differentiable there and you cannot even bring up the concept of a derivative (it is meaningless). In this case, our function $f(x) = |x| / x$ has a discontinuity at $x = 0$. You can rigorously show that by taking the left and right limits of the function around that point ($-1$ and $+1$ respectively) and demonstrating that $\lim_{x \to 0} f(x)$ does not exist, as they are not the same.

Once you know that your function is continuous at that point, then you may try to differentiate it. But that won't necessarily mean the derivative exists! Consider $g(x) = |x|$, it is continuous at $x = 0$ (left and right limits are the same) but it is not differentiable at $x = 0$.

You need to be very careful when differentiating. Just because you get a nice expression for your derivative everywhere except at, say, $x = 0$, does not mean that expression is valid for $x = 0$! This is why it's important to keep track of the domain of each function you are working with. If you find that your derivative is valid for all $x \neq 0$, then that's it - end of story. It is meaningless to plug $x = 0$ into it, the results are undefined because the derivative doesn't apply at that point.

Also, limits don't have much to do with derivatives in the grand scheme of things, but they provide a useful framework on top of which to build the concept of derivative and differentiability (as well as a lot of other important stuff). Don't associate derivatives with limits too much.

Here the "left and right limits" are used to show that the two-sided limit exists, and additionally show continuity. The limit expression of the derivative is different and has a geometrical interpretation, which is basically "take two points on a curve and draw a line between them, as the points get closer and closer together that line starts to approximate a tangent to the curve, which happens to also give the rate of change of the height of the curve with horizontal distance and lots of other useful things".
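To make the continuity-versus-differentiability contrast concrete, here is a brief SymPy sketch checking both properties of $g(x)=|x|$ at $0$ (my own illustration, following the argument above):

```python
import sympy as sp

x, h = sp.symbols('x h', real=True)
g = sp.Abs(x)

# Continuity at 0: the limit of g itself exists and equals g(0)
cont_limit = sp.limit(g, x, 0)
print(cont_limit == g.subs(x, 0))  # True: g is continuous at 0

# Differentiability at 0: the difference quotient must have a limit
dq = (g.subs(x, h) - g.subs(x, 0)) / h
dq_right = sp.limit(dq, h, 0, dir='+')
dq_left = sp.limit(dq, h, 0, dir='-')
print(dq_right, dq_left)  # 1 -1: it doesn't, so g is not differentiable at 0
```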
 

Rido12

Well-known member
MHB Math Helper
Jul 18, 2013
715
Once you know that your function is continuous at that point, then you may try to differentiate it. But that won't necessarily mean the derivative exists! Consider $g(x) = |x|$, it is continuous at $x = 0$ (left and right limits are the same) but it is not differentiable at $x = 0$.
Thanks! So I guess the last question is how I would prove that g(x) is not differentiable at x = 0. Say you prove that g(x) is continuous at 0; how would I then prove that it is not differentiable there?
 

Jameson

Administrator
Staff member
Jan 26, 2012
4,041
You already showed that the left and right hand limits at $x=0$ are 1 and -1, which is all you need to do. That shows that the limit doesn't exist thus $g$ is not differentiable at $x=0$. Once again, differentiability implies continuity but continuity does not imply differentiability.
 

Rido12

Well-known member
MHB Math Helper
Jul 18, 2013
715
So Step 1 is to determine continuity; if the function is continuous, we can proceed to find the derivative.
Step 2, we find the limit of both sides of the derivative at the point we want to prove, say x = 0, and we find that the limit doesn't exist.
Step 3, we conclude that because the limit doesn't exist, g'(x) is discontinuous at the point and impossible to differentiate. For a function to be differentiable, it must be continuous, but the converse is not true.

Is there anything wrong with the procedure above?
 

Bacterius

Well-known member
MHB Math Helper
Jan 26, 2012
644
Thanks! So I guess the last question is how would I prove that g(x) is not differentiable at x = 0. Say you prove that g(x) is continuous at 0. How would I prove that it is not differentiable at x = 0 .
You take the definition of "differentiable":

A function $f(x)$ on $\mathbb{R}$ is differentiable at $x = a$ if and only if $f'(a)$ exists.
From that definition you can choose different approaches depending on your function. For instance, take $f(x) = |x|$. Then, assume $x < 0$, where the function is (probably) differentiable - we'll prove it is shortly. Now apply the definition of the derivative:

$$f'(x) = \lim_{h \to 0} \frac{f(x + h) - f(x)}{h}$$

And we get:

$$f'(x) = \lim_{h \to 0} \frac{|x + h| - |x|}{h}$$

Now because $x < 0$, $|x| = -x$, and we can rewrite this as:

$$f'(x) = \lim_{h \to 0} \frac{|x + h| + x}{h}$$

Now assume $x + h > 0$. But this means that $h > -x$, and since $x$ is negative then $-x$ is positive so $h$ is greater than some non-zero value.. which is a contradiction since we assumed that $h \to 0$. Therefore $x + h < 0$, so $|x + h| = - (x + h) = -x - h$ and:

$$f'(x) = \lim_{h \to 0} \frac{-x - h + x}{h} = \lim_{h \to 0} \frac{-h}{h} = -1 ~ ~ ~ \text{for} ~ ~ x < 0$$

A similar reasoning for $x > 0$ leads to:

$$f'(x) = +1 ~ ~ ~ \text{for} ~ ~ x > 0$$

So we have:

$$f'(x) = \begin{cases} +1 ~ ~ &\text{for} ~ ~ x > 0 \\ -1 ~ ~ &\text{for} ~ ~ x < 0 \end{cases}$$

What now? Well now the left and right limits around $x = 0$ are easy (note we are never actually touching zero, $f'(x)$ has not been defined at that point) and we clearly see that:

$$\lim_{x \to 0^{-}} f'(x) \ne \lim_{x \to 0^{+}} f'(x)$$

And we may conclude that:

$$\lim_{x \to 0} f'(x) ~ ~ ~ \text{does not exist} ~ ~ ~ \implies ~ ~ ~ f'(0) ~ ~ ~ \text{does not exist}$$

And therefore by the definition of differentiability, $f(x)$ is not differentiable at $x = 0$.

That was the semi-rigorous, long-winded way. In practice you use things like the chain rule and various limit/derivative theorems (or even plain observation) to help you simplify the function instead of starting from scratch each time.

For instance, using the result above, you now know that $f(x) = |x|$ is not differentiable at $x = 0$. From that point on you can easily show that $f(x) = |x - a|$ is not differentiable at $x = a$, for all $a \in \mathbb{R}$, and you can now use that fact to find other, more complicated derivatives involving absolute values, using the chain rule, product rule, and so on.
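That reuse can be sketched in SymPy as well (the shift $a = 3$ is an arbitrary choice of mine):

```python
import sympy as sp

x = sp.symbols('x', real=True)
a = 3  # arbitrary shift for the sketch

# d/dx |x - a| = sign(x - a), undefined at x = a just as |x|' is at 0
fprime = sp.diff(sp.Abs(x - a), x)

right = sp.limit(fprime, x, a, dir='+')
left = sp.limit(fprime, x, a, dir='-')
print(right, left)  # 1 -1: f(x) = |x - 3| is not differentiable at x = 3
```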
 

Jameson

Administrator
Staff member
Jan 26, 2012
4,041
So Step 1 is to determine continuity; if the function is continuous, we can proceed to find the derivative.
Step 2, we find the limit of both sides of the derivative at the point we want to prove, say x = 0, and we find that the limit doesn't exist.
Step 3, we conclude that because the limit doesn't exist, g'(x) is discontinuous at the point and impossible to differentiate. For a function to be differentiable, it must be continuous, but the converse is not true.

Is there anything wrong with the procedure above?
I hope someone who is more versed in analysis stops by to comment on this, but I will add what I can.

1) Yes, this is a good start. There are other things to consider as well, such as a bend, cusp or vertical tangent - but your step #2 will sort those out anyway. You'll develop an eye for noticing these things quickly.

2) It's always a good idea to be able to use the limit definition of a derivative (left and right hand limits as you said) but of course in practice you have quicker ways of calculating derivatives, as you know. You can probably tell from the problem if the goal is to quickly calculate the derivative and use that as part of a greater problem, or if the differentiability itself is the main focus of the problem. Anyway, this step is ok.

3) If the limit doesn't exist, it doesn't automatically follow that the function is not continuous at that point. $f(x)=|x|$ is a great example of this. It is continuous at $x=0$ but not differentiable.

The only thing that I am not certain of is whether step 1 is implied by step 2. Last night I thought there could be an example of a function and a point where the limit definition of a derivative results in an answer for that point but the function was actually discontinuous at the point, so the derivative there couldn't exist. However, I'm not able to come up with an example of this and doubt that it's possible now. If anyone could that would be interesting.

Put another way: if we have a function of one real variable, $f$, can the following limit exist if the point $a$ is not in the domain of $f$?

\(\displaystyle \lim_{h \to 0} \frac{f(a + h) - f(a)}{h}\)

I think not because $f(a)$ is not defined. So that leads me to say that step 2 implies step 1.
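This can be probed with the earlier example $f(x)=\frac{x^2-4}{x-2}$; a small SymPy sketch (my illustration, relying on SymPy reporting the undefined value $0/0$ as `nan`):

```python
import sympy as sp

x = sp.symbols('x', real=True)
f = (x**2 - 4) / (x - 2)

value_at_2 = f.subs(x, 2)     # 0/0 -> nan: f(2) is undefined
lim_at_2 = sp.limit(f, x, 2)  # 4: the limit of f at 2 exists anyway

print(value_at_2, lim_at_2)

# Since f(2) is undefined, the difference quotient (f(2 + h) - f(2))/h
# cannot even be written down, so the limit defining f'(2) never gets started.
```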
 

Rido12

Well-known member
MHB Math Helper
Jul 18, 2013
715
Last night I thought there could be an example of a function and a point where the limit definition of a derivative results in an answer for that point but the function was actually discontinuous at the point, so the derivative there couldn't exist. However, I'm not able to come up with an example of this and doubt that it's possible now. If anyone could that would be interesting.
This problem was what compelled me to start this thread in the first place. It occurred to me: "what if the limit definition of a derivative results in an answer, but that point was actually discontinuous?" I'm also interested in whether such a function exists, as it would clear up my problems haha. I agree with you on the premise that step 2 implies step 1.

Thanks guys for the help!
 

MarkFL

Administrator
Staff member
Feb 24, 2012
13,775
Consider the function:

\(\displaystyle f(x)=\frac{x}{|x|}\)

It has a jump discontinuity at $x=0$, however, we find:

\(\displaystyle \lim_{h\to0^{-}}\frac{f(x+h)-f(x)}{h}=0=\lim_{h\to0^{+}}\frac{f(x+h)-f(x)}{h}\)

Is this the kind of thing you are after?