Explaining Differentials: How Were They First Introduced in Calculus?

In summary, the conversation discusses the concept of differentials and how they are used to model physical problems. The speakers also touch on the idea of "change" and how it relates to differentials, with one speaker mentioning their experience with learning calculus and the use of "tricks" before understanding the concept of a limit. The concept of differentials is explained as a convenient notation that works, but may not have a rigorous definition in elementary calculus.
  • #1
PFuser1232
When I first came across differentials, I was told that they could be thought of as infinitesimal changes. However, I can't get my head around how they're actually used to model physical problems. For example, if ##x## is the x-coordinate of a moving body, then ##dx## is an infinitesimally small change in position. More generally, if ##\vec{r}## is the position vector of a body, then ##d\vec{r}## is an infinitesimally small change in position. What I don't understand is how we think of things like distance, mass, and area in terms of differentials. For example, we think of ##dA## as an infinitesimally small area; ##dm## as an infinitesimally small mass; and ##ds## as an infinitesimally small distance. Why do we avoid the concept of "change" when talking about mass, area, and distance (to name a few)?
 
  • #2
I was first introduced to the concept of differentiation at the age of 16, in my first calculus class. We were not told what ##\frac {dy}{dx}## actually meant; we were just taught 'tricks and tips', so to speak, for dealing with differentiation and integration. For example, 'When you differentiate a polynomial, reduce the power of x by 1 and multiply by the old power' was the first thing we were taught. It wasn't until differentiation reappeared in my mechanics and physics modules that the concept was explained in the terms you describe. It may help to think of ##dA##, ##dm##, and ##ds## as infinitesimally small 'amounts' of something, rather than changes. Can you provide an example of where you believe the concept of 'change' has been abandoned?
 
  • #3
That's the problem with taking calculus so young: you don't yet have the 'maturity' to grasp the theory behind the math (or at least your teachers don't think you do), so you just learn "rules". The derivative is defined in terms of a limit: [itex]\frac{dy}{dx}= \lim_{h\to 0} \frac{f(x+h)- f(x)}{h}[/itex]. One consequence is that, while the derivative is NOT a fraction, it can be "treated like one". You cannot, for example, prove the "chain rule", [itex]\frac{dy}{dx}\frac{dx}{dt}= \frac{dy}{dt}[/itex], just by saying "the dx terms cancel", but you can prove it by going back before the limit, cancelling in the "difference quotient", and then taking the limit again.
Because the derivative can be "treated like a quotient", we introduce the "differentials", dx and dy, separately to make use of that property. You say you were "told that they could be thought of as infinitesimal changes". That's fine if you were taught "non-standard" calculus, where the notion of "infinitesimals" is actually introduced rigorously. But that requires some very deep symbolic logic showing that one can introduce "infinitesimals" into the number system (and what you wind up with is NOT the standard real number system). Lacking that, you need to consider "differentials" as just a convenient notation that happens to work.
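As a numerical sketch of the limit definition above (my own illustration, not from any poster), here is the difference quotient for the assumed example f(x) = x² at x = 3, where the exact derivative is 2x = 6. As h shrinks, the secant slope approaches 6:

```python
def difference_quotient(f, x, h):
    """Slope of the secant line through (x, f(x)) and (x + h, f(x + h))."""
    return (f(x + h) - f(x)) / h

def f(x):
    return x ** 2

# Shrinking h drives the secant slope toward the derivative f'(3) = 6.
for h in [0.1, 0.01, 0.001, 0.0001]:
    print(f"h = {h:<8} slope = {difference_quotient(f, 3.0, h)}")
```

The derivative itself is the limit of these values, not any single one of them, which is why dy/dx is not literally a fraction even though it often behaves like one.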
 
  • #4
MohammedRady97 said:
Why do we avoid the concept of "change" when talking about mass, area, and distance (to name a few)?

I'm not sure what you are asking. "Change" of things is often dealt with in physics. Is the question: "Why do we use expressions where symbols like [itex] dy [/itex] appear as ordinary variables instead of only as part of a fraction like [itex] \frac{dy}{dx} [/itex] ?"

If you browse contributions to old threads concerning differentials, you won't find a unanimous point of view - even from the experts who generally agree on other aspects of mathematics. I think differentials have a standard definition in differential geometry, but there is no standard definition in elementary calculus. The treatment of differentials in calculus varies from textbook to textbook. Physics texts often reason with differentials without establishing any rigorous definition for them.
 
  • #5
HallsofIvy said:
That's the problem with taking calculus so young: you don't yet have the 'maturity' to grasp the theory behind the math (or at least your teachers don't think you do), so you just learn "rules". The derivative is defined in terms of a limit: [itex]\frac{dy}{dx}= \lim_{h\to 0} \frac{f(x+h)- f(x)}{h}[/itex]. One consequence is that, while the derivative is NOT a fraction, it can be "treated like one". You cannot, for example, prove the "chain rule", [itex]\frac{dy}{dx}\frac{dx}{dt}= \frac{dy}{dt}[/itex], just by saying "the dx terms cancel", but you can prove it by going back before the limit, cancelling in the "difference quotient", and then taking the limit again.
Because the derivative can be "treated like a quotient", we introduce the "differentials", dx and dy, separately to make use of that property. You say you were "told that they could be thought of as infinitesimal changes". That's fine if you were taught "non-standard" calculus, where the notion of "infinitesimals" is actually introduced rigorously. But that requires some very deep symbolic logic showing that one can introduce "infinitesimals" into the number system (and what you wind up with is NOT the standard real number system). Lacking that, you need to consider "differentials" as just a convenient notation that happens to work.

I write the following assuming this post was a reply to my own.

I had calculus classes for two years before university, using the 'tricks' we had been taught. I had trouble with the concepts of differentiation and integration during that time, as they had not been properly explained. I suppose the teaching of differentiation followed the order in which Newton developed it, and the concept of the limit was not mentioned until I reached university. There, we learned differentiation from first principles in our first week of calculus. After that week, the differentiation and integration that had been a slight problem for me before became 'simple', so to speak. I felt I had a much deeper understanding of the calculus I had learned over the previous two years, and more confidence in my ability to differentiate functions more complicated than polynomials. I feel the concept of a limit should be introduced at the earliest point, so that once students understand the groundwork they can move on to the 'tricks' that make differentiating and integrating easier. Although, you could argue that the two years of study before university gave me the foundation to appreciate the further insight gained from limit notation and differentiation from first principles.

Anyway, back to the topic at hand. OP, I suggest reading 'Zero: The Biography of a Dangerous Idea' by Charles Seife. The book focuses on the concepts of 0 and ∞, building up to an explanation of the development of calculus and the different notations used. It also explains how Newton developed the idea (and his notation) and how Newton's ideas differed slightly from those of Leibniz. It then goes on to explain a little about the 0/0 problem and l'Hôpital's rule. The book is popular science and as such is easy to digest, if a little slow at getting to what it's trying to convey.
 

Related to Explaining Differentials: How Were They First Introduced in Calculus?

1. What is a differential?

A differential represents an infinitesimally small change in a variable. For a function y = f(x), the differential of the dependent variable is written dy, and it is related to the differential dx of the independent variable by dy = f′(x) dx, where f′(x) is the derivative.

2. How do differentials relate to derivatives?

Differentials and derivatives are closely related concepts. The differential dy represents an infinitesimally small change in the dependent variable y, while the derivative dy/dx represents the rate of change of y with respect to the independent variable x; the two are connected by dy = (dy/dx) dx.

3. Why is it important to make sense of differentials?

Differentials are essential in many areas of science and engineering, particularly in physics, economics, and engineering. They allow us to model and understand real-world phenomena that involve rates of change, such as velocity, acceleration, and growth.

4. What is the process for finding differentials?

To find a differential, first find the derivative f′(x) of the function. The differential is then dy = f′(x) dx. Evaluating f′ at a specific point gives the slope of the tangent line to the graph at that point, which is why differentials are closely tied to tangent-line approximation.

5. Can differentials be used to estimate values?

Yes, differentials can be used to estimate values of a function near a specific point. This is known as linear approximation or tangent line approximation. By computing the differential at a given point, we can approximate values of the function near that point and make predictions about its behavior in the surrounding area.
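As a hedged sketch of this linear approximation (my own example, not from the thread): near x = a, f(a + dx) ≈ f(a) + f′(a)·dx, i.e. the change in f is approximated by the differential dy = f′(a)·dx. Here we estimate √4.1 using the nearby point a = 4, where f(a) = 2 and f′(a) = 1/(2√4) = 0.25:

```python
import math

def linear_approx(f_a, df_a, dx):
    """Approximate f(a + dx) using the differential dy = f'(a) * dx."""
    return f_a + df_a * dx

# Estimate sqrt(4.1) from the known values at a = 4.
estimate = linear_approx(2.0, 0.25, 0.1)  # 2 + 0.25 * 0.1 = 2.025
exact = math.sqrt(4.1)                    # about 2.0248
print(estimate, exact)
```

The estimate 2.025 agrees with the exact value to about three decimal places, which illustrates why replacing a small change Δy with the differential dy is often a good trade.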
