# Thread: Rigorous definition of "Differential"

1. First of all, I want to clarify that I posted this question on many forums and Q&A websites to increase the chances of getting an answer, so don't be surprised if you see my post somewhere else.
Now let's get started:

When it comes to definitions, I am very strict. Most textbooks define the differential of a function/variable along these lines:

--------------------------------------------------------------------------------
Let $\displaystyle f(x)$ be a differentiable function. Assuming that changes in $\displaystyle x$ are small, we have, to a good approximation:
$\displaystyle \Delta f(x)\approx {f}'(x)\Delta x$
where $\displaystyle \Delta f(x)$ is the change in the value of the function. Now, if we consider the changes in $\displaystyle f(x)$ to be small enough, we define the differential of $\displaystyle f(x)$ as follows:
$\displaystyle \mathrm{d}f(x):= {f}'(x)\mathrm{d} x$
where $\displaystyle \mathrm{d} f(x)$ is the differential of $\displaystyle f(x)$ and $\displaystyle \mathrm{d} x$ is the differential of $\displaystyle x$.

--------------------------------------------------------------------------------
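For concreteness, the approximation in that boxed definition is easy to check numerically. Here is a quick sketch (my own example, using $\displaystyle f(x)=x^2$, which is not from any textbook in particular); the error of the approximation shrinks like $\displaystyle (\Delta x)^2$:

```python
# Numerical check of Delta f ~ f'(x) * Delta x for the
# illustrative choice f(x) = x**2, so f'(x) = 2x.
def f(x):
    return x ** 2

def fprime(x):
    return 2 * x

x = 3.0
for dx in (0.1, 0.01, 0.001):
    actual = f(x + dx) - f(x)   # Delta f, the true change
    approx = fprime(x) * dx     # f'(x) * Delta x, the linear estimate
    print(dx, actual, approx, actual - approx)  # error equals dx**2 here
```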

What bothers me is that this definition is completely circular: we are defining the differential in terms of the differential itself. Some say that here $\displaystyle \mathrm{d} x$ is a separate object, independent of the meaning of "differential", but as we proceed it seems that's not the case:

First we define the differential as $\displaystyle \mathrm{d} f(x)=f'(x)\mathrm{d} x$; then we tell ourselves that $\displaystyle \mathrm{d} x$ is nothing but another representation of $\displaystyle \Delta x$; and then, without clarifying the reason, we treat $\displaystyle \mathrm{d} x$ as the differential of the variable $\displaystyle x$ and write the derivative of $\displaystyle f(x)$ as the ratio of $\displaystyle \mathrm{d} f(x)$ to $\displaystyle \mathrm{d} x$. So we have literally defined "Differential" in terms of another differential, and that is circular.

Secondly, I think it should be possible to define the differential without any knowledge of the notion of derivative. We could then define "Derivative" and "Differential" independently and deduce that the relation $\displaystyle f'{(x)}=\frac{\mathrm{d} f(x)}{\mathrm{d} x}$ is just a natural consequence of the two definitions (possibly using the notion of limits), not part of the definition itself.

Though I know many don't accept the concept of a differential quotient ($\displaystyle \frac{\mathrm{d} f(x)}{\mathrm{d} x}$) and treat this notation merely as a derivative operator ($\displaystyle \frac{\mathrm{d} }{\mathrm{d} x}$) acting on the function ($\displaystyle f(x)$), I think a "Derivative" should be representable as a "Differential quotient", for many reasons. For example, think of how we represent derivatives as ratios of differentials to show how the chain rule works by cancelling identical differentials, or how we break one differential into another in the $\displaystyle u$-substitution method for solving integrals. It is especially obvious when we solve differential equations, where we freely move $\displaystyle \mathrm{d} x$ and $\displaystyle \mathrm{d} y$ from one side of the equation to the other to form a term $\displaystyle \frac{\mathrm{d} y}{\mathrm{d} x}$, and then call that term the "Derivative of $\displaystyle y$". It seems we are actually treating differentials as something like algebraic expressions.
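To make the kind of manipulation I mean concrete, here is a worked example with a simple separable equation (an illustrative example of my own choosing), where the differentials are moved around exactly as if they were algebraic quantities:

$$\frac{\mathrm{d} y}{\mathrm{d} x}=xy \quad\Longrightarrow\quad \frac{\mathrm{d} y}{y}=x\,\mathrm{d} x \quad\Longrightarrow\quad \int\frac{\mathrm{d} y}{y}=\int x\,\mathrm{d} x \quad\Longrightarrow\quad \ln\left | y \right |=\frac{x^{2}}{2}+C$$

At the second step, $\displaystyle \mathrm{d} x$ is "multiplied across" as if it were a number, which is precisely the step that has no meaning if $\displaystyle \frac{\mathrm{d} y}{\mathrm{d} x}$ is only an indivisible operator symbol.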

I know the relation $\displaystyle \mathrm{d} f(x)=f'(x)\mathrm{d} x$ always works and always gives us a way to calculate differentials. But I (as a strict axiomatist) cannot accept it as the definition of the differential.

So my question is:

Can we define "Differential" more precisely and rigorously?

Thank you in advance.

P.S. I prefer the answer to be in the context of "Calculus" or "Analysis" rather than the "Theory of Differential forms". And again, I don't want a circular definition. I think it is possible to define "Differential" with the use of "Limits" in some way (though it's just a feeling).

2. Your restriction, that you want the answer to be given in terms of "Calculus" or "Analysis", pretty much guarantees that you won't get a very satisfactory answer. The "differential" as used in Calculus is just a notational device and is not given a rigorous definition. You need to go to Differential Geometry or "Calculus on Manifolds" to find a rigorous definition.

3. Originally Posted by HallsofIvy
Your restriction, that you want the answer to be given in terms of "Calculus" or "Analysis", pretty much guarantees that you won't get a very satisfactory answer. The "differential" as used in Calculus is just a notational device and is not given a rigorous definition. You need to go to Differential Geometry or "Calculus on Manifolds" to find a rigorous definition.
In this case, I think it's really bad to use Leibniz's notation in calculus. We should reformulate Calculus using Lagrange's notation for both derivative and antiderivative operations so we get rid of this nonsense differential.

4. Originally Posted by HamedBegloo
In this case, I think it's really bad to use Leibniz's notation in calculus. We should reformulate Calculus using Lagrange's notation for both derivative and antiderivative operations so we get rid of this nonsense differential.
Well, Leibniz's notation provides a great deal of intuition, and the non-rigorous manipulation of differentials is extremely helpful in many areas of physics and engineering. Lagrange's notation does not provide this flexibility. I believe one should be accustomed to both forms of notation.

5. Originally Posted by Rido12
Well, Leibniz's notation provides a great deal of intuition, and the non-rigorous manipulation of differentials is extremely helpful in many areas of physics and engineering. Lagrange's notation does not provide this flexibility. I believe one should be accustomed to both forms of notation.
I understand that Leibniz's notation has many intuitive and practical advantages that help students with little interest in higher mathematics solve calculus and differential-equations problems more easily. But it would be better if we could keep rigor and intuition at the same time. Leibniz introduced symbols like "$\displaystyle \int$" and "$\displaystyle \mathrm{d}$", which seem to insist that they have separate, standalone meanings and definitions. But when you ask about them, everyone says these symbols have no meaning by themselves, and we end up with $\displaystyle \int f(x) \mathrm{d}x$ and $\displaystyle \frac{\mathrm{d} f(x)}{\mathrm{d} x}$ being mere notations for the antiderivative and derivative respectively. Anyway, I thought the widespread usage of Leibniz's notation must have an important reason - not just that it makes these objects algebraically more flexible so students can manipulate them easily, but something from a more purely mathematical perspective.

6. Originally Posted by HamedBegloo
--------------------------------------------------------------------------------
Let $\displaystyle f(x)$ be a differentiable function. Assuming that changes in $\displaystyle x$ are small, we have, to a good approximation:
$\displaystyle \Delta f(x)\approx {f}'(x)\Delta x$
where $\displaystyle \Delta f(x)$ is the change in the value of the function. Now, if we consider the changes in $\displaystyle f(x)$ to be small enough, we define the differential of $\displaystyle f(x)$ as follows:
$\displaystyle \mathrm{d}f(x):= {f}'(x)\mathrm{d} x$
where $\displaystyle \mathrm{d} f(x)$ is the differential of $\displaystyle f(x)$ and $\displaystyle \mathrm{d} x$ is the differential of $\displaystyle x$.

--------------------------------------------------------------------------------

What bothers me is that this definition is completely circular.
Hi HamedBegloo! Welcome to MHB!

To be honest, I don't really see how it is circular.
Also, I don't see why we would need to have an independent definition.

The derivative in calculus is rigorously defined as:
$$f'(x) = \lim_{\Delta x \to 0}\frac{\Delta f(x)}{\Delta x}$$
Leibniz' notation is just that - a shorthand notation for the same thing, defined such that:
$$f'(x) =\frac{df}{dx}$$
It allows us to write the formulas more succinctly and more intuitively.
But whenever we use it, we should make sure we never forget what it really means - it's actually a limit.
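To illustrate that it really is a limit, here is a quick numerical sketch (my own example, using $f(x) = \sin x$ at $x = 1$, an arbitrary choice): the difference quotients approach the derivative as $\Delta x \to 0$.

```python
import math

# Difference quotients Delta f / Delta x for f = sin at x = 1.0;
# they approach the derivative cos(1.0) as Delta x shrinks.
x = 1.0
exact = math.cos(x)
for dx in (1e-1, 1e-2, 1e-3, 1e-4):
    quotient = (math.sin(x + dx) - math.sin(x)) / dx
    print(dx, quotient, quotient - exact)  # the gap shrinks roughly like dx/2
```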

Over the ages, the rigour of this notation has been a pain point for generations of mathematicians and theoretical physicists. Personally, I accept that within calculus no rigorous definition is possible, other than as a convenient shorthand.

As for a more rigorous definition of Leibniz' notation, I'm aware of:
1. The theory of differential forms that you already mentioned, which, as I understand it, is a more abstract formulation of the same thing.
2. The hyperreals, which introduce the notion of a positive value smaller than any positive real value, similar to the notion of infinity ($\infty$), which is bigger than any real value. Then again, the hyperreals have their flaws as well.

7. Originally Posted by I like Serena
Hi HamedBegloo! Welcome to MHB!
Thanks. I hope that someday I can be helpful here too.

Originally Posted by I like Serena
To be honest, I don't really see how it is circular.
Also, I don't see why we would need to have an independent definition.

The derivative in calculus is rigorously defined as:
$$f'(x) = \lim_{\Delta x \to 0}\frac{\Delta f(x)}{\Delta x}$$
Leibniz' notation is just that - a shorthand notation for the same thing, defined such that:
$$f'(x) =\frac{df}{dx}$$
It allows us to write the formulas more succinctly and more intuitively.
But whenever we use it, we should make sure we never forget what it really means - it's actually a limit.
Actually, it is the "Derivative" which is rigorously defined, not the "Differential". If $\displaystyle \frac{\mathrm{d} f(x)}{\mathrm{d} x}$ were just a notation for the derivative, there would be no problem. But we go further and define the differential as $\displaystyle \mathrm{d}f(x):= {f}'(x)\mathrm{d} x$, rather than just saying it's a relation between the limits of two differences. I even tried to define the differential as simply the limit of a difference as the difference approaches zero:
$\displaystyle \mathrm{d}x= \lim_{\Delta x \to 0}\Delta x$
But because of the so-called "Archimedean property" of the "Real number system", that limit makes the differential simply zero:
$\displaystyle \mathrm{d}x= 0$
which is not a sensible definition.

Originally Posted by I like Serena
The hyperreals, that introduce the notion of a positive value smaller than any positive real value, similar to the notion of infinity ($\infty$) that is bigger than any real value. Then again, the hyperreals have their flaws as well.
Yes, the idea of "Hyperreal analysis" seems very interesting to me too, and I don't see any problem with it. But I don't understand why there is such strong opposition to it in the math community. After all, my knowledge of the history of mathematics says that many didn't accept the concept of "Real numbers" as an extension of "Rational numbers" around the time of Pythagoras, but nowadays everything has changed. So maybe it's just a matter of time before the hyperreals are accepted by the mainstream math community.

However, although I strongly feel the hyperreals are useful, I think there might be a workaround in "Real analysis" too:

I remember someone pointing out that the main problem with defining the differential in "Real calculus" is that the mathematical objects $\displaystyle \mathrm{d} f(x)$ and $\displaystyle \mathrm{d} x$ aren't "Real numbers", or in some sense aren't "Real valued functions".

But I also remember there was a section in our "Calculus" course discussing "Infinite limits" and "Infinite derivatives", i.e. limits and derivatives whose value is "Infinity". Since infinity isn't a real number, these were treated as a kind of "Non-existent limit/derivative", but a special kind: this type of non-existence had significant properties that made it important.

Now, why not introduce "Infinitesimals" as some kind of non-existent limit and then relate that idea to the concept of "Differentials"? By doing so, we would neither enter non-standard analysis nor dismiss differentials, and we would even make them somewhat more rigorous.

Do you think it's possible?

8. Ah well, I guess we already know that not everything is black and white.

Let's take a look at what we have. We can write:
$$f'(x) = \lim_{\Delta x \to 0} \frac{\Delta f(x)}{\Delta x} \quad\iff\quad \lim_{\Delta x \to 0} \frac{f'(x)\Delta x - \Delta f(x)}{\Delta x} = 0$$
In other words, we can use the "shorthand" that:
$$f'(x) dx - df = 0$$
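That rewritten limit can also be checked numerically. A quick sketch (my own example, using $f = \exp$ at $x = 0.5$, an arbitrary choice): the residual $(f'(x)\Delta x - \Delta f)/\Delta x$ tends to $0$ as $\Delta x \to 0$.

```python
import math

# The rewritten limit: (f'(x)*dx - (f(x+dx) - f(x))) / dx -> 0 as dx -> 0,
# checked for f = exp (so f' = exp) at x = 0.5.
x = 0.5
for dx in (1e-1, 1e-2, 1e-3):
    residual = (math.exp(x) * dx - (math.exp(x + dx) - math.exp(x))) / dx
    print(dx, residual)  # shrinks roughly in proportion to dx
```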

As for the definite integral, that one is actually rigorously defined (as a Riemann integral) by (somewhat liberally):
$$\int_a^b f(x)dx = \lim_{\Delta x_i \to 0} \sum f(x_i)\Delta x_i$$
To be honest, I've never really thought about it, but that is Leibniz' way of writing this limit of sums; the antiderivative is then connected to it through the fundamental theorem of calculus.
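A sketch of that Riemann-sum definition in code (my own example, for $\int_0^1 x^2\,dx = 1/3$): refining the partition drives the sum toward the integral.

```python
# Left-endpoint Riemann sum: sum of f(x_i) * dx over a uniform partition
# of [a, b] into n pieces; it approaches the integral as n grows.
def riemann_sum(f, a, b, n):
    dx = (b - a) / n
    return sum(f(a + i * dx) * dx for i in range(n))

approx = riemann_sum(lambda x: x ** 2, 0.0, 1.0, 100_000)
print(approx)  # close to 1/3
```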

As for the hyperreals: the reals form a so-called "Field", meaning in particular that multiplications, additions, and additive inverses are properly defined. In fact the hyperreals form an ordered field as well, so arithmetic with infinitesimals and infinite numbers does work; the subtleties lie elsewhere. The field is non-Archimedean, and only first-order properties of the reals carry over (via the transfer principle), so some care is needed before everything we do in calculus can be transferred. It is the extended reals, with expressions such as $\infty - \infty$, where the arithmetic genuinely breaks down.
