# Common Calculus Mistakes

#### ThePerfectHacker

##### Well-known member
It is very common to write that $\int \tfrac{1}{x} ~ dx = \log |x| + C$. But it is wrong.

The meaning of $\int f(x) ~ dx$ is the collection of all functions $F(x)$ such that $F'(x) = f(x)$. It is assumed that the function $f(x) = x^{-1}$ is defined with domain on $(-\infty,0)\cup (0,\infty)$. It is true that $\log |x|$ is defined on the same domain as $x^{-1}$ and on top of that $(\log |x|)' = x^{-1}$. Thus, many people conclude (incorrectly) that therefore $\log |x| + C$ are all anti-derivatives of $x^{-1}$.

To see why this is wrong, define the function,
$$H(x) = \left\{ \begin{array}{ccc} 0 & \text{if} & x< 0 \\ 1 & \text{if} & x>0 \end{array} \right.$$
It is not hard to see that $H'(x) = 0$ on its entire domain, and so $\log|x| + H(x)$ is an example of an anti-derivative of $x^{-1}$ which does not have the form $\log |x| + C$.
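A quick numerical sanity check of this claim (a Python sketch; the finite-difference helper is for illustration only, not part of the argument):

```python
# Check numerically that F(x) = log|x| + H(x) satisfies F'(x) = 1/x
# on both sides of 0, even though F is not log|x| + C for any single C.
import math

def F(x):
    H = 0.0 if x < 0 else 1.0                # the step function H (x != 0)
    return math.log(abs(x)) + H

def deriv(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)   # central difference

for x in (-2.0, -0.5, 0.5, 2.0):
    assert abs(deriv(F, x) - 1.0 / x) < 1e-5
```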

The reason why this mistake is common is because people use the following theorem: if $f(x),g(x)$ are functions defined on an open interval $I$ with $f'(x) = g'(x)$ then $f(x) = g(x) + C$ for some constant $C$. The mistake above stems from the fact that $(-\infty,0)\cup (0,\infty)$ is not an interval, so the theorem does not apply.

Whenever I teach logarithms in Calculus I always write instead,
$$\smallint x^{-1} ~ dx = \log x + C, ~ x>0$$
I tell my students that $x>0$ means that we are considering the function $x^{-1}$ on the interval $(0,\infty)$. With this restriction there is no longer need for absolute value and the equation is now true.

Post other calculus mistakes that are very common.

#### Klaas van Aarsen

##### MHB Seeker
Staff member
When an expression is written like
$$\int \tfrac{1}{x} ~ dx = \ln |x|\color{lightgray}{ + C}$$
it should be understood that $\color{lightgray}{C}$ is the integration constant (light gray color for emphasis), which may have different values on different parts of the domain when these parts are separated by undefined points.

It should be considered shorthand notation.

I agree that it commonly leads to mistakes by the uninitiated, so this shorthand can only be used safely when readers can be assumed to be aware of its meaning.

#### ZaidAlyafey

##### Well-known member
MHB Math Helper
Another common mistake

$$\displaystyle \int^1_{-1}\frac{1}{x^3}\, dx =0$$

The reasoning is that the function is odd; but the integrand is undefined at $x=0$, so we must instead write

$$\displaystyle \int^1_0 \frac{1}{x^3}\, dx = \lim_{t \to 0^+}\int^1_t \frac{1}{x^3}\, dx$$

where the limit is not finite. So the integral diverges.
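Concretely, using the antiderivative $-\tfrac{1}{2}x^{-2}$, the tail integral blows up (a small numerical sketch, assuming nothing beyond the formula above):

```python
# ∫_t^1 x^(-3) dx = (1/t^2 - 1)/2, which blows up as t -> 0+.
def tail_integral(t):
    return (1.0 / t**2 - 1.0) / 2.0

vals = [tail_integral(t) for t in (0.1, 0.01, 0.001)]
assert vals[0] < vals[1] < vals[2]     # strictly growing as t shrinks
assert tail_integral(1e-6) > 1e11      # no finite limit in sight
```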

#### Klaas van Aarsen

##### MHB Seeker
Staff member
This is where Cauchy invented the Cauchy Principal Value:
$$PV\int^1_{-1}\frac{1}{x^3}\, dx =0$$

The integral is still zero with the understanding (for the initiated) that the limits around zero are approached simultaneously.
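Pairing the two one-sided pieces symmetrically, the cancellation is exact for every $\epsilon$ (a sketch using the antiderivative $-\tfrac{1}{2}x^{-2}$):

```python
# Principal value: ∫_{-1}^{-eps} x^(-3) dx + ∫_{eps}^{1} x^(-3) dx
# cancels exactly by oddness, for every eps > 0.
def pv_pieces(eps):
    left  = (-1.0 / (2 * eps**2)) - (-1.0 / 2)   # ∫_{-1}^{-eps}
    right = (-1.0 / 2) - (-1.0 / (2 * eps**2))   # ∫_{eps}^{1}
    return left + right

for eps in (0.1, 0.01, 1e-4):
    assert abs(pv_pieces(eps)) < 1e-9
```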

#### Deveno

##### Well-known member
MHB Math Scholar
Another beauty:

$\dfrac{d}{dx}\sin^2(x) = \cos(\sin(x))\cos(x)$

instead of the correct:

$\dfrac{d}{dx}\sin^2(x) = 2\sin(x)\cos(x) = \sin(2x)$

where people misinterpret $\sin^2(x)$ to mean $\sin(\sin(x))$ instead of:

$(\sin(x))^2$ (the usual notation for powers of trig functions isn't very helpful, here).
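A numeric check of the two readings (Python sketch; the finite-difference helper is a throwaway, not part of the point):

```python
import math

def deriv(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: math.sin(x) ** 2              # (sin x)^2, NOT sin(sin x)
for x in (0.3, 1.0, 2.5):
    assert abs(deriv(f, x) - math.sin(2 * x)) < 1e-5   # correct rule

# the bogus "composition" reading gives a genuinely different value:
assert abs(math.cos(math.sin(1.0)) * math.cos(1.0) - math.sin(2.0)) > 0.4
```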

Similar gaffes can be constructed using inverse trig functions, which must undoubtedly confuse students even more.

#### Krizalid

##### Active member
A classic mistaken computation of the following derivative and integral:

$f(x)=\sin(2x)\implies f'(x)=\cos(2x),$ since many students forget to apply the chain rule.

$\displaystyle\int{\frac{dx}{1+2{{x}^{2}}}}=\tan^{-1}(2x)+k,$ again the same error, since they ignore that the chain rule is at work (the correct answer is $\tfrac{1}{\sqrt{2}}\tan^{-1}(\sqrt{2}\,x)+k$).
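Both corrections can be verified numerically (a sketch; `deriv` is a throwaway finite-difference helper):

```python
import math

def deriv(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

# the derivative of sin(2x) carries the inner factor 2:
for x in (0.2, 1.3):
    assert abs(deriv(lambda t: math.sin(2 * t), x) - 2 * math.cos(2 * x)) < 1e-5

# the antiderivative of 1/(1+2x^2) is arctan(sqrt(2) x)/sqrt(2), not arctan(2x):
F = lambda x: math.atan(math.sqrt(2) * x) / math.sqrt(2)
for x in (0.5, 2.0):
    assert abs(deriv(F, x) - 1.0 / (1 + 2 * x**2)) < 1e-5
```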


#### ThePerfectHacker

##### Well-known member
When an expression is written like
$$\int \tfrac{1}{x} ~ dx = \ln |x|\color{lightgray}{ + C}$$
it should be understood that $\color{lightgray}{C}$ is the integration constant (light gray color for emphasis), which may have different values on different parts of the domain when these parts are separated by undefined points.
I have never seen a standard Calculus book say otherwise. They write it without even explaining that.

..Counterexamples in Analysis..
Too advanced for an undergraduate course in calculus.

----

The examples above are more like common student mistakes. I was hoping to see more serious mistakes that one can find in actual textbooks.

I remembered two more serious mistakes from calculus. The first one being almost universal.

Fallacy of integrating a curve rather than a path: Many books say that given a curve $C$ (in space) with a (smooth) parametrization $\gamma(t)$, where $a\leq t\leq b$, and a (continuous) function $f(x,y,z)$, we define the line integral of $f$ along $C$ to be:
$$\int_C f(x,y,z) ~ ds = \int_a^b f(\gamma(t)) \, \|\gamma'(t)\| ~ dt$$
The big issue with this definition is that the RHS depends on the choice of parametrization. Calculus books "assure" us that this is not a problem: if there is an (orientation-preserving) reparametrization $\gamma_1(t)$ of the form $\gamma_1(t) = \gamma(\varphi(t))$ then everything is okay, and they go ahead and prove it using the chain rule. The problem with this "proof" is that not every parametrization arises in such a form.

The simplest counter-example is to parametrize the unit circle by $\gamma(t) = (\cos t,\sin t)$ and by $\gamma_1(t) = (\cos 2t,\sin 2t)$, where $0\leq t\leq 2\pi$ in both examples. Then integrating the function $f(x,y) = 1$ over this circle gives different answers: $2\pi$ for the first and $4\pi$ for the second. The curves are identical but the answers are different.
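The two answers can be checked with a crude midpoint-rule computation (a sketch; since $f=1$, the integrand reduces to the speed $\|\gamma'(t)\|$):

```python
import math

# ∫ f ds ≈ Σ f(γ(t)) |γ'(t)| Δt with f = 1 on the unit circle,
# so only the speed |γ'(t)| matters.
def arclength_integral(speed, a, b, n=100000):
    dt = (b - a) / n
    return sum(speed(a + (i + 0.5) * dt) * dt for i in range(n))

speed1 = lambda t: 1.0      # |γ'(t)|  for γ(t)  = (cos t,  sin t)
speed2 = lambda t: 2.0      # |γ₁'(t)| for γ₁(t) = (cos 2t, sin 2t)

I1 = arclength_integral(speed1, 0.0, 2 * math.pi)
I2 = arclength_integral(speed2, 0.0, 2 * math.pi)
assert abs(I1 - 2 * math.pi) < 1e-6
assert abs(I2 - 4 * math.pi) < 1e-6   # same point set, twice the answer
```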

It is best to define integration along paths rather than integration along curves, where a path is a parametrization $\gamma(t)$ of a particular curve, rather than just a set in space which we integrate over. So instead we should define,
$$\int_{\gamma}f(x,y,z) ~ ds = \int_a^b f(\gamma(t))\, \|\gamma'(t)\| ~ dt$$
This small correction eliminates these sorts of mistakes.

Fallacious Chain Rule Proof: Suppose that $f(x)$ and $g(x)$ are defined on $(-\infty,\infty)$ and both are differentiable. We will prove that $f(g(x))$ is differentiable everywhere and the derivative is $f'(g(x))g'(x)$. The "proof" goes as follows:
$$\lim_{t\to x} \frac{f(g(t)) - f(g(x))}{t-x} = \lim_{t\to x} \frac{f(g(t))-f(g(x))}{g(t)-g(x)}\cdot \frac{g(t)-g(x)}{t-x}$$
In the first factor on the RHS substitute $u = g(t)$, so as $t\to x$ we get $u\to g(x)$ (by continuity), and so,
$$\lim_{t\to x} \frac{f(g(t))-f(g(x))}{g(t)-g(x)} = \lim_{u\to g(x)} \frac{f(u) - f(g(x))}{u-g(x)} = f'(g(x))$$
while the limit of the second factor on the RHS is $g'(x)$. Therefore, in the end we see that the limit quotient for $f(g(x))$ exists and is equal to $f'(g(x))g'(x)$.

#### ZaidAlyafey

##### Well-known member
MHB Math Helper
$$\displaystyle \frac{d}{dx} x^x= x\cdot x^{x-1}= x^x$$

I think this one is the worst .
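The correct derivative, by logarithmic differentiation, is $x^x(\ln x + 1)$; a quick numeric check (sketch, finite differences only):

```python
import math

def deriv(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: x ** x
for x in (0.5, 1.0, 2.0):
    correct = x**x * (math.log(x) + 1)    # logarithmic differentiation
    assert abs(deriv(f, x) - correct) < 1e-4

# the naive power-rule answer x * x^(x-1) simplifies to x^x itself,
# which disagrees with the true derivative (except where ln x + 1 = 1):
assert abs(2 * 2**1 - 2**2 * (math.log(2) + 1)) > 1.0
```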

#### ThePerfectHacker

##### Well-known member
Write $f(x) = x^2 = \underbrace{x + x + ... + x}_{x\text{ times}}$. Therefore,
$$f'(x) = \underbrace{1 + 1 + ... + 1}_{x\text{ times}} = x$$
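The flaw: the number of summands itself depends on $x$ and must also be differentiated. Writing $x^2 = x\cdot x$ and keeping both factors via the product rule restores the missing $x$; a numeric sketch:

```python
def deriv(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

for x in (1.5, 3.0, 7.0):
    assert abs(deriv(lambda t: t * t, x) - 2 * x) < 1e-5   # true derivative
    fallacy = x                        # "1 + 1 + ... + 1, x times"
    assert abs(fallacy - 2 * x) == x   # the fallacy is off by exactly x
```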

#### ThePerfectHacker

##### Well-known member
I hate the L'Hopital rule and never use it. L'Hopefully this will make you dislike it as well.

We will compute,
$$\lim_{x\to \infty} \frac{x+\cos x\sin x}{e^{\sin x}(x+\cos x\sin x)}$$
Notice that it has the form "$\tfrac{\infty}{\infty}$" so we apply the evil rule,
$$\lim_{x\to \infty} \frac{[x+\cos x\sin x]'}{[e^{\sin x}(x+\cos x\sin x)]'} = \lim_{x\to\infty} \frac{2e^{-\sin x}\cos x}{x + 2\cos x + \cos x\sin x}=0$$
Clearly, the limit is $0$, as the denominator goes to $\infty$ while the numerator is bounded. Therefore, we conclude the original problem has limit $0$. But clearly, the original limit does not exist!
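Indeed the quotient simplifies to $e^{-\sin x}$, which oscillates forever; sampling it far out shows both large and small values (sketch):

```python
import math

def f(x): return x + math.cos(x) * math.sin(x)
def g(x): return math.exp(math.sin(x)) * (x + math.cos(x) * math.sin(x))

# f/g equals e^(-sin x) exactly, so it keeps oscillating between 1/e and e:
vals = [f(x) / g(x) for x in (1000 + 0.1 * k for k in range(200))]
assert max(vals) > 2.0    # near e   ≈ 2.718
assert min(vals) < 0.5    # near 1/e ≈ 0.368
```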

#### chisigma

##### Well-known member
I hate the L'Hopital rule and never use it. L'Hopefully this will make you dislike it as well...

We will compute...

$$\lim_{x\to \infty} \frac{x+\cos x\sin x}{e^{\sin x}(x+\cos x\sin x)}$$
If there are no mistakes, your expression contains the term $\displaystyle x + \cos x\ \sin x$ both in the numerator and the denominator, and in primary school they explained to me that in this case this term has to be canceled...

Kind regards

$\chi$ $\sigma$

#### ThePerfectHacker

##### Well-known member
Your expression contains the term $\displaystyle x + \cos x\ \sin x$ both in the numerator and the denominator, and in primary school they explained to me that in this case this term has to be canceled...
Why does that matter? Just L'Hopital numerator and denominator anyway.

#### Opalg

##### MHB Oldtimer
Staff member
I hate the L'Hopital rule and never use it. L'Hopefully this will make you dislike it as well.

We will compute,
$$\lim_{x\to \infty} \frac{x+\cos x\sin x}{e^{\sin x}(x+\cos x\sin x)}$$
Notice that it has the form "$\tfrac{\infty}{\infty}$" so we apply the evil rule,
$$\lim_{x\to \infty} \frac{[x+\cos x\sin x]'}{[e^{\sin x}(x+\cos x\sin x)]'} = \lim_{x\to\infty}\color{red}{ \frac{2e^{-\sin x}\cos x}{x + 2\cos x + \cos x\sin x}}=0$$
Clearly, the limit is $0$, as the denominator goes to $\infty$ while the numerator is bounded. Therefore, we conclude the original problem has limit $0$. But clearly, the original limit does not exist!
I don't see where that derivative comes from. What I get is $$\frac{[x+\cos x\sin x]'}{[e^{\sin x}(x+\cos x\sin x)]'} = \frac{1+\cos^2x-\sin^2x}{e^{\sin x}\bigl(\cos x(x + \cos x\sin x) + 1+\cos^2x - \sin^2x\bigr)}.$$ That does not have a limit as $x\to\infty$, so good old L'Hôpital's rule does not apply in this case.

Edit. I now see where that red fraction comes from (I had failed to notice that $1-\sin^2x = \cos^2x$)! As ILS points out below, you need to check the conditions for L'Hôpital's rule before doing any cancellation in the fraction.

Last edited:

#### Klaas van Aarsen

##### MHB Seeker
Staff member
We will compute,
$$\lim_{x\to \infty} \frac{x+\cos x\sin x}{e^{\sin x}(x+\cos x\sin x)}$$
Nice example!

Good ole l'Hôpital's rule is still valid (of course), but we need to check all conditions carefully.

Application of the rule fails because the condition $g'(x)\ne 0$ does not hold near infinity.

\begin{aligned}g'(x)
&=e^{\sin x}\cos x(x+\cos x \sin x) + e^{\sin x}(1-\sin^2 x + \cos^2 x) \\
&=e^{\sin x}\cos x(x + \cos x \sin x + 2 \cos x)\end{aligned}

When we approach infinity, $g'(x)$ changes sign infinitely often.
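Sampling $g'$ at points a quarter period apart confirms the sign changes persist arbitrarily far out (a sketch using the factored form above):

```python
import math

def g_prime(x):
    return math.exp(math.sin(x)) * math.cos(x) * (
        x + math.cos(x) * math.sin(x) + 2 * math.cos(x))

# for large x the bracketed factor is positive, so sign(g') = sign(cos x),
# which flips near every odd multiple of pi/2:
signs = {g_prime(1000 + k * math.pi / 2) > 0 for k in range(4)}
assert signs == {True, False}
```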

#### Random Variable

##### Well-known member
MHB Math Helper
Can the issue be rephrased as $g'(x)$ is zero at some point in every deleted neighborhood of infinity?

Last edited:

#### ThePerfectHacker

##### Well-known member
Can the issue be rephrased as $g'(x)$ is zero at some point in every deleted neighborhood of infinity?
The issue here is simpler, I think: the limit of $f'(x)/g'(x)$ as $x\to \infty$ does not even make sense. As many of you noticed, $g'(x) = 0$ at points in every neighborhood of $\infty$, while any time we consider limits at $\infty$ we require that the function be defined for all sufficiently large values of $x$. That is not the case here, so the very notion of the limit is not defined.

#### Random Variable

##### Well-known member
MHB Math Helper
That was sort of the point I was trying to make. But I didn't explain myself well.

Wouldn't the same thing apply to the limit at $x=a$? That is, for $\lim_{x \to a} \frac{f'(x)}{g'(x)}$ to be defined, wouldn't there need to be a small interval about $x=a$ where $g'(x)$ is not zero?

#### ThePerfectHacker

##### Well-known member
Wouldn't the same thing apply to the limit at $x=a$? That is, for $\lim_{x \to a} \frac{f'(x)}{g'(x)}$ to be defined, wouldn't there need to be a small interval about $x=a$ where $g'(x)$ is not zero?
Yes, it is the same idea.

In general, when we say $\lim_{x\to a} f(x)$ we mean that for any, blah blah blah, there is a $\delta > 0$ such that if $0<|x-a|<\delta$ blah blah blah. Afterwards, we evaluate $f(x)$, so that implicitly assumes that $f(x)$ must be defined for all $0<|x-a|<\delta$, in a punctured neighborhood of $a$.

#### Fantini

##### "Read Euler, read Euler." - Laplace
MHB Math Helper
I found this post in this thread about the fallacies of integrating over a curve instead of over a path and the chain rule proof. Can you elaborate on these, PerfectHacker? I feel I have missed the key points of why they're fallacies.

More specifically, I think I missed:

1) The distinction made between curve and path. In your example the second parametrization $(\cos 2t, \sin 2t)$ covers the circle twice instead of once, so in order for it to work the same as the other shouldn't you restrict the interval to $(0, \pi)$?

2) Whatever mistake the fallacious chain rule proof contains.

Thanks.

Best wishes,

Fantini.

#### ThePerfectHacker

##### Well-known member
1) The distinction made between curve and path. In your example the second parametrization $(\cos 2t, \sin 2t)$ covers the circle twice instead of once, so in order for it to work the same as the other shouldn't you restrict the interval to $(0, \pi)$?
Define "curve"?

One way of doing it is to make the following definition:

Definition: Let $C$ be a subset of $\mathbb{R}^3$. We say that $C$ is a curve if there exists a smooth function $\gamma(t)$, $a\leq t \leq b$, whose image is $C$. Such a $\gamma(t)$ is called a parametrization of $C$. [So basically a curve is a set which can be parametrized by a function.] A smooth function $\gamma(t)$, $a\leq t\leq b$, whose values lie in $\mathbb{R}^3$ is called a path.

A curve is a set and a path is a function; that is the difference. The problem with "curve" is that when you integrate over a curve, the result depends on the choice of parametrization.

2) Whatever mistake the fallacious chain rule proof contains.
The mistake is assuming that for any two parametrizations there is a function that relates the two parametrizations together, i.e. $\gamma_2(t) = \gamma_1(\psi(t))$.

#### Fantini

##### "Read Euler, read Euler." - Laplace
MHB Math Helper
Thank you for your clarification. However, you left two points untouched: you didn't comment on how your example covers the given curve twice instead of once, and you misunderstood what I meant by the chain rule fallacy. I meant your second point about the purported "proof" that $(f(g(x)))' =f'(g(x)) g'(x)$, assuming both differentiable and proceeding as

$$(f(g(x)))' = \lim_{t \to x} \frac{f(g(t)) - f(g(x))}{t - x} = \lim_{t \to x} \frac{f(g(t)) - f(g(x))}{g(t) - g(x)} \frac{g(t) - g(x)}{t -x}.$$

Thank you!

Best wishes,

Fantini.

#### ThePerfectHacker

##### Well-known member
you didn't comment on how your example covers the given curve twice instead of once
So what?

The example is supposed to illustrate how integration along a curve is not well-defined: it all depends on the parametrization you choose. If you say "make sure you do not overlap", that is a non-rigorous statement; how do you make it precise?

and you misunderstood what I meant by the chain rule fallacy. I meant your second point about the purported "proof" that $(f(g(x)))' =f'(g(x)) g'(x)$, assuming both differentiable and proceeding as.
Hint: Division by zero.
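To make the hint concrete: the factor $g(t)-g(x)$ in the denominator can vanish for $t$ arbitrarily close to $x$. A standard illustration (a sketch; the choice $g(t)=t^2\sin(1/t)$, $g(0)=0$ is my example, differentiable everywhere yet with zeros $1/(n\pi)\to 0$):

```python
import math

def g(t):
    # differentiable everywhere (including 0), with zeros at t = 1/(n*pi)
    return t**2 * math.sin(1.0 / t) if t != 0 else 0.0

x = 0.0
for n in (10, 100, 1000):
    t = 1.0 / (n * math.pi)
    # g(t) - g(x) is exactly 0 in exact arithmetic (sin(n*pi) = 0),
    # so the "cancelled" denominator is zero; numerically it is ~1e-18:
    assert abs(g(t) - g(x)) < 1e-10
```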