Why are Physicists so informal with mathematics?

  • #71
TurtleKrampus said:
I am completely out of my water here lmao
Well, to put it more simply, quantum field theory uses a bunch of operators on a Hilbert space to describe "fields", such as the electromagnetic field, which give rise to "particles". In essence, all of these are partial differential equations whose solutions are promoted to linear operators via "quantization". That's all fine when the underlying PDE is simple, as is the case for the Klein-Gordon or Dirac equations. These can generally be solved, and the appropriate Hilbert space on which to study their "quantized" versions is called a Fock space. We generally know how their solutions (called free fields) act on the Fock space.
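For reference, the free Klein-Gordon case looks schematically like this (the standard textbook sketch, suppressing all analytic subtleties): the classical equation
$$(\partial_t^2 - \nabla^2 + m^2)\,\phi = 0$$
is solved by superpositions of plane waves, and "quantization" promotes the Fourier coefficients to operators,
$$\phi(x) = \int \frac{d^3 p}{(2\pi)^3} \frac{1}{\sqrt{2\omega_p}} \left( a_p\, e^{-i p \cdot x} + a_p^\dagger\, e^{i p \cdot x} \right), \qquad \omega_p = \sqrt{|\mathbf{p}|^2 + m^2},$$
where ##a_p## and ##a_p^\dagger## annihilate and create momentum-##p## quanta on the Fock space.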

Unfortunately, when you want to have fields interacting with each other, you end up with PDEs involving multiple different functions coupled in convoluted ways. Just Google "standard model Lagrangian". Almost every different letter you see is a different field. Now apply the Euler-Lagrange equations to that and you get an absolutely insane system of horribly coupled PDEs. Of course nobody deals with the whole thing at once; we just look at parts of it. One such part is the famous Yang-Mills equations: https://en.m.wikipedia.org/wiki/Yang–Mills_equations
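For a toy illustration of how interactions spoil things, add a quartic self-interaction to the Klein-Gordon Lagrangian:
$$\mathcal{L} = \tfrac{1}{2}\,\partial_\mu \phi\, \partial^\mu \phi - \tfrac{1}{2} m^2 \phi^2 - \tfrac{\lambda}{4!} \phi^4 \quad \Longrightarrow \quad (\Box + m^2)\,\phi = -\tfrac{\lambda}{3!}\,\phi^3.$$
The Euler-Lagrange equation is now nonlinear, so solutions no longer superpose and the free-field machinery breaks down; the Standard Model couples many such fields together in far worse ways.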

The existence problem for the YM equations is a Millennium Prize problem. Actually, it's not just that one: none of the other realistic interacting QFTs have been solved in 4 spacetime dimensions either. The reason the others aren't Millennium Prize problems is probably mostly that many people doubt, for various reasons, that they even have solutions, never mind that they are still used.

But to quantize, say, the Klein-Gordon equation, the best way is to just solve the ("classical") PDE first, and then promote the solution to an operator in a specific sense. So how are we supposed to quantize the interacting ones when we can't solve them? Physicists have some workarounds. Probably the most commonly used one is a type of functional integral called a path integral. This is another tool that came out of physics (originally to describe Brownian motion, if I'm not mistaken) and has been applied to other areas of math. The idea is that you somehow integrate over a space of functions. Of course, to integrate you need a measure. For some specific functional integrals, this measure is known. For the path integrals of QFT, there is no rigorous formalization of the measure as of yet. Nevertheless, it is used.
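One functional integral whose measure *is* rigorously known is the Wiener measure over Brownian paths. Here is a minimal numerical sketch of my own (a toy example, not from the post; all helper names are made up): the Feynman-Kac formula says the solution of the heat equation ##u_t = \tfrac{1}{2} u_{xx}## with ##u(0,x) = f(x)## is an average of ##f## over Brownian paths started at ##x##, and for ##f(x) = x^2## the exact answer is ##u(T,x) = x^2 + T##.

```python
import random

def sample_path_endpoint(x0, T, steps, rng):
    """Sample one Brownian path from x0 over [0, T] and return its endpoint."""
    dt = T / steps
    x = x0
    for _ in range(steps):
        x += rng.gauss(0.0, dt ** 0.5)  # independent Gaussian increment of variance dt
    return x

def heat_solution_mc(f, x0, T, n_paths=20000, steps=50, seed=0):
    """Feynman-Kac: u(T, x0) = E[ f(x0 + W_T) ], estimated by Monte Carlo over paths."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        total += f(sample_path_endpoint(x0, T, steps, rng))
    return total / n_paths

# f(x) = x^2: the exact heat-equation solution is u(T, x) = x^2 + T
u = heat_solution_mc(lambda x: x * x, x0=1.0, T=0.5)  # close to 1.5, up to Monte Carlo error
```

The point of simulating whole paths (rather than just the Gaussian endpoint) is that this really is an integral over a space of functions, with a well-defined measure on it; for the path integrals of interacting QFT no analogous measure has been constructed.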

And then on top of all of that, you do perturbation theory. What's that? Well, we have a Hilbert space we don't really know, and operators representing the fields we have quantized, never mind the fact that we don't rigorously know how they act on that Hilbert space (or even whether we can rigorously consider these actions, because they relate to PDEs whose solutions we don't know), and we want to approximate the solutions to various problems regarding their action on these spaces using power series. Great.

Perhaps you will find it amusing to learn that these power series have divergent terms. But maybe you already heard that, and heard that you can just do renormalization, etc. Indeed, renormalization generally fixes the problem, and Epstein-Glaser theory shows how to do that rigorously, starting from first principles, in a manner that is not ad hoc. However, physicists usually don't do that; they follow a much less rigorous counterterm procedure that is easier to work with. But at least we know we can cure the divergences. The trouble is, even AFTER you cure these divergences in the individual terms, the series STILL diverges if you include every term: it has ZERO radius of convergence. The physicist's answer to this? "Well, I'll just keep the first few terms of the series, which don't diverge." In some cases people use other summation schemes, like Borel summation. But sometimes even that doesn't work.
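The zero-radius-of-convergence phenomenon is easy to reproduce in a toy model (my own illustration, not a QFT computation): the Euler integral ##\int_0^\infty e^{-t}/(1+xt)\,dt## has the asymptotic expansion ##\sum_n (-1)^n\, n!\, x^n##, which diverges for every ##x \neq 0##, yet its first few partial sums approximate the integral well.

```python
import math

def euler_integral(x, T=60.0, n=200_000):
    """Reference value of the Euler integral via trapezoidal quadrature on [0, T]."""
    f = lambda t: math.exp(-t) / (1.0 + x * t)
    h = T / n
    s = 0.5 * (f(0.0) + f(T))
    for k in range(1, n):
        s += f(k * h)
    return s * h

def partial_sum(x, N):
    """Partial sum of the divergent asymptotic series sum_n (-1)^n n! x^n."""
    return sum((-1) ** n * math.factorial(n) * x ** n for n in range(N + 1))

ref = euler_integral(0.1)
err_small = abs(partial_sum(0.1, 5) - ref)    # a few terms: good approximation
err_large = abs(partial_sum(0.1, 30) - ref)   # many terms: the series blows up
```

The error shrinks until roughly ##N \approx 1/x## and then grows without bound ("optimal truncation"); the physicist's "keep the first few terms" is exactly this move.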

So, to summarize, we start from PDEs that we don't know how to solve or if they even have solutions, we quantize them via integration measures that don't exist, and then we approximate the solutions to various problems using series that don't converge, by just ignoring the rest of the series, at least when we can get each term to converge. And it's not even a cutting edge theory, it's been around for decades. Not just that, but it is probably THE most successful physical theory ever, that has yielded the most precise predictions. This is how physicists learn to be less formal with math.

To learn about QFT, you may be interested in these books, written mostly for mathematicians, by mathematicians:

https://www.amazon.com/dp/0821847058/?tag=pfamazon01-20
https://www.amazon.com/dp/1316510271/?tag=pfamazon01-20

The second one is essentially a more digested version of the first, covering only the things that are for the most part well understood, while being significantly bigger. It's also interesting to see Talagrand's comments throughout the text indicating his struggle to understand why various things work. Really, that's the main strength of the book imo: when he covers something that is very suspicious but nevertheless works, he says so explicitly. However, I'm not sure how much you would get out of these books without further background in physics. Maybe you could try the Arnold book I mentioned, then something like Quantum Theory for Mathematicians by Brian Hall, and then the Talagrand book (or the Folland book if you prefer). You will also see how much of QM and QFT really is just representation theory, and why it was a huge motivator for its development.
 
  • #72
Why are physicists so informal with mathematics? Perhaps they fear excess rigor may lead to rigor mortis.
 
  • Like
Likes CalcNerd, vanhees71, jasonRF and 3 others
  • #73
Some quotes from prominent physicists:

Weyl: Space is a field of linear operators.
Heisenberg: Nonsense, Space is blue and birds fly through it.

Asher Peres: Quantum phenomena occur in a laboratory, not a Hilbert space.

I forgot who said this but it was probably an experimentalist:
" You can keep your hilbert space, I need the answer in volts"

Physicists tend to be relaxed when applying mathematics. They don't often focus on Peano axioms, Dedekind cuts and other axioms of mathematics.

By the way, I do not hold 100% with the iconoclastic viewpoints above. Einstein himself often lamented he wished he knew more mathematics. However I realize that physicists today are busy with publishing duties and competition, and they need to get on with doing physics.
 
  • Haha
Likes vanhees71
  • #74
mpresic3 said:
Some quotes from prominent physicists:

Weyl: Space is a field of linear operators.
Heisenberg: Nonsense, Space is blue and birds fly through it.

Asher Peres: Quantum phenomena occur in a laboratory, not a Hilbert space.

I forgot who said this but it was probably an experimentalist:
" You can keep your hilbert space, I need the answer in volts"

Physicists tend to be relaxed when applying mathematics. They don't often focus on Peano axioms, Dedekind cuts and other axioms of mathematics.

By the way, I do not hold 100% with the iconoclastic viewpoints above. Einstein himself often lamented he wished he knew more mathematics. However I realize that physicists today are busy with publishing duties and competition, and they need to get on with doing physics.
Some very different ways of how people think, or how they orient themselves.

Try this idea between Mathematics and Physics.
Mathematics - build and test the machine
Physics - Use the machine for what it is made for;
I am not saying that those comments are perfect. Just showing a way of thought.
 
  • Like
Likes vanhees71
  • #75
mpresic3 said:
Some quotes from prominent physicists:

Weyl: Space is a field of linear operators.
Heisenberg: Nonsense, Space is blue and birds fly through it.

Asher Peres: Quantum phenomena occur in a laboratory, not a Hilbert space.

I forgot who said this but it was probably an experimentalist:
" You can keep your hilbert space, I need the answer in volts"

Physicists tend to be relaxed when applying mathematics. They don't often focus on Peano axioms, Dedekind cuts and other axioms of mathematics.

By the way, I do not hold 100% with the iconoclastic viewpoints above. Einstein himself often lamented he wished he knew more mathematics. However I realize that physicists today are busy with publishing duties and competition, and they need to get on with doing physics.
That quote isn't by Weyl, it's by Felix Bloch.
Not sure where that misconception came from, but mathematicians don't often focus on Peano axioms or Dedekind cuts when doing math (nor are Dedekind cuts axioms; they're a construction).
 
  • #76
symbolipoint said:
Some very different ways of how people think, or how they orient themselves.

Try this idea between Mathematics and Physics.
Mathematics - build and test the machine
Physics - Use the machine for what it is made for;
I am not saying that those comments are perfect. Just showing a way of thought.
I don't like that analogy at all, sounds too reductive of mathematics to me.
 
  • #77
Can anyone point to an example where a lack of rigor led to a wrong physical result by an otherwise competent physicist?
 
  • #78
bob012345 said:
Can anyone point to an example where a lack of rigor led to a wrong physical result by an otherwise competent physicist?
The question is to what extent physics would have developed all the requisite mathematics without the rigorous mathematical research. A good example would be Noether's Theorem. Could physicists have figured out the key criteria without Emmy Noether having worked it out rigorously - or Sophus Lie having developed the theory of Lie groups in the first place?

It's one thing to demonstrate a free and easy version of a mathematical theorem, but another thing to develop the theory non-rigorously in the first place.

You could say the same about group theory, linear algebra, functional analysis, complex analysis, topology and differential geometry in general. Yes, you can use the mathematics non-rigorously, but would it have been developed non-rigorously in the first place?
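For a concrete sense of what the theorem buys you, here is its simplest mechanical instance (the standard textbook case, stated schematically): if a Lagrangian ##L(q, \dot q)## is unchanged by the translation ##q \to q + \epsilon##, then ##\partial L/\partial q = 0##, so the Euler-Lagrange equation
$$\frac{d}{dt}\frac{\partial L}{\partial \dot q} = \frac{\partial L}{\partial q} = 0$$
says the momentum ##p = \partial L/\partial \dot q## is conserved. The nontrivial content is that this generalizes to every continuous symmetry ##\delta q = \epsilon\, K(q)##, with conserved charge ##Q = (\partial L/\partial \dot q)\, K(q)##; pinning down the exact hypotheses is where the rigorous treatment earns its keep.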
 
  • Like
Likes TurtleKrampus and martinbn
  • #79
PeroK said:
The question is to what extent physics would have developed all the requisite mathematics without the rigorous mathematical research. A good example would be Noether's Theorem. Could physicists have figured out the key criteria without Emmy Noether having worked it out rigorously - or Sophus Lie having developed the theory of Lie groups in the first place?

It's one thing to demonstrate a free and easy version of a mathematical theorem, but another thing to develop the theory non-rigorously in the first place.

You could say the same about group theory, linear algebra, functional analysis, complex analysis, topology and differential geometry in general. Yes, you can use the mathematics non-rigorously, but would it have been developed non-rigorously in the first place?
What does Noether's theorem say about parity if anything?
 
  • #80
Parity is not a continuous symmetry.
 
  • Like
Likes vanhees71 and bob012345
  • #81
TurtleKrampus said:
I don't like that analogy at all, sounds too reductive of mathematics to me.
What I said was this, which you reacted to:
Some very different ways of how people think, or how they orient themselves.

Try this idea between Mathematics and Physics.
Mathematics - build and test the machine
Physics - Use the machine for what it is made for;
I am not saying that those comments are perfect. Just showing a way of thought.

That was the best I could think at the current time. As I plainly said, the comment is not perfect. Have you a thought along the lines of the original posted topic about Mathematics differently handled between Physicists and Mathematicians, and if you want to share with readers here, then say those thoughts.
 
  • #82
Why are mathematicians so formal with physics?
 
  • Like
Likes haushofer and symbolipoint
  • #83
Frabjous said:
Why are mathematicians so formal with physics?
Are they?
 
  • #84
PeroK said:
It's one thing to demonstrate a free and easy version of a mathematical theorem, but another thing to develop the theory non-rigorously in the first place.

You could say the same about group theory, linear algebra, functional analysis, complex analysis, topology and differential geometry in general. Yes, you can use the mathematics non-rigorously, but would it have been developed non-rigorously in the first place?
Almost all mathematics (until a certain point at least) was developed non-rigorously first. I was recently reading Riemann. There is NOTHING rigorous in modern terms about his writings. In many respects they were LESS rigorous than a lot of modern theoretical physics. He wildly asserts without proper proof all over the place, and he often does not give proper definitions of things (which actually makes it confusing sometimes). He also missed some counterexamples to his claims, as was demonstrated by others later. In fact, mathematics from that era has a long history of mathematicians successively finding errors in previous theories, improving them, and then erring themselves, only to be corrected later by others.

Even today, to find new things mathematicians don't tend to start from the "rigorous" picture. They use intuition and imprecise concepts, and restore the rigor only after they have found their result. Intuition plays a significant role. Read Terence Tao's perspective on this: https://terrytao.wordpress.com/career-advice/theres-more-to-mathematics-than-rigour-and-proofs/

Another place where I noticed this was Richard Borcherds' lectures that he uploads on YouTube. I don't remember what it was exactly, but he described a theorem as saying "you can often find x". "Often" is of course an extremely imprecise word. But it is useful in that context, because if you are too rigorous about things you tend to forget what you are really doing. In the course of devising a new proof or theorem, rigor is often ignored at first, and after the skeleton of the new construct has been heuristically worked out, mathematicians polish it and reintroduce the rigor. The end product is presented rigorously, which makes the process seem like it never happened, and that is what leads to this misconception.
 
  • Like
Likes vanhees71
  • #85
AndreasC said:
Almost all mathematics (until a certain point at least) was developed non-rigorously first. I was recently reading Riemann. There is NOTHING rigorous in modern terms about his writings. In many respects they were LESS rigorous than a lot of modern theoretical physics. He wildly asserts without proper proof all over the place, and he often does not give proper definitions of things (which actually makes it confusing sometimes). He also missed some counterexamples to his claims, as was demonstrated by others later. In fact, mathematics from that era has a long history of mathematicians successively finding errors in previous theories, improving them, and then erring themselves, only to be corrected later by others.

Even today, to find new things mathematicians don't tend to start from the "rigorous" picture. They use intuition and imprecise concepts, and restore the rigor only after they have found their result. Intuition plays a significant role. Read Terence Tao's perspective on this: https://terrytao.wordpress.com/career-advice/theres-more-to-mathematics-than-rigour-and-proofs/

Another place where I noticed this was Richard Borcherds' lectures that he uploads on YouTube. I don't remember what it was exactly, but he described a theorem as saying "you can often find x". "Often" is of course an extremely imprecise word. But it is useful in that context, because if you are too rigorous about things you tend to forget what you are really doing. In the course of devising a new proof or theorem, rigor is often ignored at first, and after the skeleton of the new construct has been heuristically worked out, mathematicians polish it and reintroduce the rigor. The end product is presented rigorously, which makes the process seem like it never happened, and that is what leads to this misconception.
I don't agree with most of this. All of the examples seem perfectly rigorous to me.
 
  • #86
AndreasC said:
Even today, to find new things mathematicians don't tend to start from the "rigorous" picture. They use intuition and imprecise concepts, and restore the rigor only after they have found their result. Intuition plays a significant role. Read Terence Tao's perspective on this: https://terrytao.wordpress.com/career-advice/theres-more-to-mathematics-than-rigour-and-proofs/
Having read that link, I believe you are seriously misrepresenting Tao's position.
 
  • #87
AndreasC said:
Almost all mathematics (until a certain point at least) was developed non-rigorously first. I was recently reading Riemann. There is NOTHING rigorous in modern terms about his writings. In many respects they were LESS rigorous than a lot of modern theoretical physics. He wildly asserts without proper proof all over the place, and he often does not give proper definitions of things (which actually makes it confusing sometimes). He also missed some counterexamples to his claims as was demonstrated by others later. In fact mathematics from that era have a long history of mathematicians successively finding errors with the previous theories, improving them, and then erring themselves, only to be corrected later by others.
This is precisely my point. Mathematical rigour was (re-)introduced in the 19th century in order to establish which intuitive results were correct, which were false, and the precise hypotheses required for a result to hold. Riemann largely pre-dates this. Mathematics had progressed to the point where no one could figure out what was true and what was not. The rigorous method largely post-dates Riemann. Modern mathematics (from 1860) couldn't exist without it. Tao actually emphasises its importance in the link you provided (with my underlining):

The “post-rigorous” stage, in which one has grown comfortable with all the rigorous foundations of one’s chosen field, and is now ready to revisit and refine one’s pre-rigorous intuition on the subject, but this time with the intuition solidly buttressed by rigorous theory. (For instance, in this stage one would be able to quickly and accurately perform computations in vector calculus by using analogies with scalar calculus, or informal and semi-rigorous use of infinitesimals, big-O notation, and so forth, and be able to convert all such calculations into a rigorous argument whenever required.)
 
  • Like
Likes vanhees71
  • #88
As an undergraduate, we had a TA that was ungodly smart. The joke that went around was that if one got lost in a proof, just write down “trivially“. The TA would get to it, agree that it was trivial, and give one credit.
In reality, if I wrote down this proof, it would be non-rigorous. If the TA wrote down the identical proof it would be rigorous. Rigor is a construct defined by the audience. The OP is in Tao’s phase two. The physics is not addressed to that audience.
 
  • Like
Likes vanhees71 and PeroK
  • #89
PeroK said:
The rigorous method largely post-dates Riemann. Modern mathematics (from 1860) couldn't exist without it.
Yes, obviously rigor played a huge part in mathematics reaching where it did, but usually you first discover something new non-rigorously, and then you reintroduce rigor and expand on things. One famous example is the Dirac delta function. Dirac used it completely non-rigorously (and in fact physicists still do). He didn't start from a rigorous concept and simplify it. He started from this tool, and then mathematicians developed the theory of distributions, which is rigorous and far more powerful than just this one thing.
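The way distribution theory tames the delta can be sketched in a few lines (a toy illustration of my own, with made-up helper names): a delta is the limit of normalized bumps ##\delta_\epsilon## acting on smooth test functions, ##\langle \delta_\epsilon, f \rangle \to f(0)## as ##\epsilon \to 0##.

```python
import math

def bump(x, eps):
    """Nascent delta: a normalized Gaussian of width eps."""
    return math.exp(-x * x / (2 * eps * eps)) / (eps * math.sqrt(2 * math.pi))

def pairing(eps, f, a=-10.0, b=10.0, n=100_000):
    """<delta_eps, f> = integral of bump(x) * f(x) dx, via the trapezoid rule."""
    h = (b - a) / n
    s = 0.5 * (bump(a, eps) * f(a) + bump(b, eps) * f(b))
    for k in range(1, n):
        x = a + k * h
        s += bump(x, eps) * f(x)
    return s * h

# As eps -> 0 the pairing converges to f(0) = cos(0) = 1
approx_coarse = pairing(0.1, math.cos)
approx_fine = pairing(0.01, math.cos)
```

Distribution theory then defines ##\delta## directly by ##\langle \delta, f \rangle = f(0)## and extends differentiation, Fourier transforms, etc., to such objects, which is exactly the "rigorous and far more powerful" theory mentioned above.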

Bottom line is, I agree with you that modern mathematics couldn't have reached where it has without rigor. However, I disagree that the relationship is as simple as "rigorous result first, then simplification". I think there is a lot more back and forth. And really, most of the math physics uses was developed non-rigorously, because it is a lot more basic and manageable than modern mathematics, so rigorous foundations are not necessary to inform your intuition. But moving forward, rigor takes part in the process as described by Tao, refining tools and intuition and building hugely complex structures.

By the way, even if you read Poincare (long after Riemann), it's not really rigorous by modern standards for the most part. It's really only after Bourbaki perhaps (though some people were doing it earlier) that things got pretty hardcore, rigor-wise. Which has its place, I don't discount that. It's just that the relationship between the two is a bit complex, and this is why physics can survive with these standards.
 
Last edited:
  • Like
Likes vanhees71
  • #90
AndreasC said:
Yes, obviously rigor played a huge part in mathematics reaching where it did, but usually you first discover something new non-rigorously, and then you reintroduce rigor and expand on things.
I don't believe this is the way mathematicians have worked in the past 150 years. The onus is on you to prove that this is true. In particular, this statement is entirely false.
but usually you first discover something new non rigorously, and then you reintroduce rigor
I think it's obvious from that that you have never done any mathematical research.

AndreasC said:
One famous example is the Dirac delta function. Dirac used it completely non-rigorously (and in fact physicists still do).
That is one example, but is far from the norm.
AndreasC said:
And really, most of the math physics uses was developed non-rigorously,
I'd like you to prove this claim. I don't believe this is true.
AndreasC said:
, so rigorous foundations are not necessary to inform your intuition.
They are according to Terence Tao. See the link above. Tao directly contradicts what you say.
AndreasC said:
But moving forward rigor takes part in the process described by Tao, refining tools and building hugely complex structures.
IMO, you've misunderstood what Tao is saying.
 
  • #91
PeroK said:
They are according to Terence Tao. See the link above. Tao directly contradicts what you say.
But neither my statement nor, I believe, Tao's is universal here... Obviously some things are intuitive even without going through this process. It's why physicists can get away with not knowing the rigorous foundations and still extract good results.

PeroK said:
I don't believe this is the way mathematicians have worked in the past 150 years. The onus is on you to prove that this is true. In particular, this statement is entirely false.
I have a few arguments to that effect. First, the cutoff is not as sharp or as old as you claim. As I said before, even as late as Poincare things are not that rigorous. Second, look at the importance conjectures have. A conjecture is not proven. However, you have things such as the Langlands program and many others which are entirely about exploring the ramifications of conjectures, or trying to prove them. But how were the conjectures formulated in the first place? They did not follow from rigorous foundations directly, or they would already be proved. But they are not just random assertions either; they are somehow special, and seem "likely" to be true. In a sense, you could say that the whole process until a conjecture is proved is the moment of intuitive discovery, before it is polished and made rigorous, suspended in time for years.

Now, I don't have to work much to prove that most mathematics used in physics was either not developed rigorously or could have been developed some other way. Most of it was developed before the 20th century, so that's that... At the boundaries, of course, the situation changes and you are right.
 
  • Skeptical
Likes PeroK
  • #92
AndreasC said:
Second, look at the importance conjectures have. A conjecture is not proven.
An unproven conjecture is not non-rigorous mathematics. Those are two very different things.
 
  • #93
AndreasC said:
But how were the conjectures formulated in the first place?
Through rigorous mathematics.
AndreasC said:
They did not follow from rigorous foundations directly
Yes they did.
AndreasC said:
But they are not just random assertions either; they are somehow special, and seem "likely" to be true. In a sense, you could say that the whole process until a conjecture is proved is the moment of intuitive discovery, before it is polished and made rigorous, suspended in time for years.
This is a fantasy.
 
  • #94
AndreasC said:
Well, to put it more simply, quantum field theory uses a bunch of operators on a Hilbert space to describe "fields", such as the electromagnetic field, which give rise to "particles". In essence, all of these are partial differential equations whose solutions are promoted to linear operators via "quantization". That's all fine when the underlying PDE is simple, as is the case for the Klein-Gordon or Dirac equations. These can generally be solved, and the appropriate Hilbert space on which to study their "quantized" versions is called a Fock space. We generally know how their solutions (called free fields) act on the Fock space.

Unfortunately, when you want to have fields interacting with each other, you end up with PDEs involving multiple different functions coupled in convoluted ways. Just Google "standard model Lagrangian". Almost every different letter you see is a different field. Now apply the Euler-Lagrange equations to that and you get an absolutely insane system of horribly coupled PDEs. Of course nobody deals with the whole thing at once; we just look at parts of it. One such part is the famous Yang-Mills equations: https://en.m.wikipedia.org/wiki/Yang–Mills_equations

The existence problem for the YM equations is a Millennium Prize problem. Actually, it's not just that one: none of the other realistic interacting QFTs have been solved in 4 spacetime dimensions either. The reason the others aren't Millennium Prize problems is probably mostly that many people doubt, for various reasons, that they even have solutions, never mind that they are still used.

But to quantize, say, the Klein-Gordon equation, the best way is to just solve the ("classical") PDE first, and then promote the solution to an operator in a specific sense. So how are we supposed to quantize the interacting ones when we can't solve them? Physicists have some workarounds. Probably the most commonly used one is a type of functional integral called a path integral. This is another tool that came out of physics (originally to describe Brownian motion, if I'm not mistaken) and has been applied to other areas of math. The idea is that you somehow integrate over a space of functions. Of course, to integrate you need a measure. For some specific functional integrals, this measure is known. For the path integrals of QFT, there is no rigorous formalization of the measure as of yet. Nevertheless, it is used.

And then on top of all of that, you do perturbation theory. What's that? Well, we have a Hilbert space we don't really know, and operators representing the fields we have quantized, never mind the fact that we don't rigorously know how they act on that Hilbert space (or even whether we can rigorously consider these actions, because they relate to PDEs whose solutions we don't know), and we want to approximate the solutions to various problems regarding their action on these spaces using power series. Great.

Perhaps you will find it amusing to learn that these power series have divergent terms. But maybe you already heard that, and heard that you can just do renormalization, etc. Indeed, renormalization generally fixes the problem, and Epstein-Glaser theory shows how to do that rigorously, starting from first principles, in a manner that is not ad hoc. However, physicists usually don't do that; they follow a much less rigorous counterterm procedure that is easier to work with. But at least we know we can cure the divergences. The trouble is, even AFTER you cure these divergences in the individual terms, the series STILL diverges if you include every term: it has ZERO radius of convergence. The physicist's answer to this? "Well, I'll just keep the first few terms of the series, which don't diverge." In some cases people use other summation schemes, like Borel summation. But sometimes even that doesn't work.

So, to summarize, we start from PDEs that we don't know how to solve or if they even have solutions, we quantize them via integration measures that don't exist, and then we approximate the solutions to various problems using series that don't converge, by just ignoring the rest of the series, at least when we can get each term to converge. And it's not even a cutting edge theory, it's been around for decades. Not just that, but it is probably THE most successful physical theory ever, that has yielded the most precise predictions. This is how physicists learn to be less formal with math.

To learn about QFT, you may be interested in these books, written mostly for mathematicians, by mathematicians:

https://www.amazon.com/dp/0821847058/?tag=pfamazon01-20
https://www.amazon.com/dp/1316510271/?tag=pfamazon01-20

The second one is essentially a more digested version of the first, covering only the things that are for the most part well understood, while being significantly bigger. It's also interesting to see Talagrand's comments throughout the text indicating his struggle to understand why various things work. Really, that's the main strength of the book imo: when he covers something that is very suspicious but nevertheless works, he says so explicitly. However, I'm not sure how much you would get out of these books without further background in physics. Maybe you could try the Arnold book I mentioned, then something like Quantum Theory for Mathematicians by Brian Hall, and then the Talagrand book (or the Folland book if you prefer). You will also see how much of QM and QFT really is just representation theory, and why it was a huge motivator for its development.
I should've responded to this sooner, but this message is too large for me to possibly respond to in its entirety (mostly because when I've tried, I get bored less than midway through). Anyway, thanks for your messages; I've read them, I just can't really respond to them, if that makes sense. I'm not really interested in self-studying QFT, at least at the moment. I'm using most of my "study time" for either my URS or just regular classes. Once again, I really appreciate the answers.
 
  • #95
symbolipoint said:
What I said was this, which you reacted to:


That was the best I could think at the current time. As I plainly said, the comment is not perfect. Have you a thought along the lines of the original posted topic about Mathematics differently handled between Physicists and Mathematicians, and if you want to share with readers here, then say those thoughts.
Yeah, I don't have / didn't want to think of a better analogy, sorry if this sounds harsh. I don't care how physicists do math; not even most math done by mathematicians is thought of rigorously at first (though there are plenty of cases of just bashing, where you may as well start with rigor so you don't waste time). You have intuition, and see where that leads you.

Again, my problem was in the exposure, please disregard the original post I guess.
Just the other day I had a really awful experience with my Physics professor while introducing preliminary knowledge to Lagrangian mechanics. The basic gist is since the professor basically never mentions the type of domain / codomain of functions (implicitly or explicitly), him writing f(X), where X is a function of a real variable t, as opposed to say something like ##(f \circ X)## lead to a lot of unnecessary ambiguity, that did indeed lead to unnecessary confusion, since at a later point we mention the derivative of f and write f(X(t)) as opposed to f(X)(t).
There are some important details that I also left out, but I don't really want to write them out, since they'd require a lot more context and I honestly can't be bothered to write it all out.
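The notational point above can be sketched in plain Python with made-up choices of f and X (these particular functions are just illustrations, not from the lecture): ##f \circ X## is itself a function of t, while f(X(t)) is that function's value at one particular t, and being explicit about which one is meant removes the ambiguity.

```python
# Made-up example functions, purely to illustrate the notation.
def X(t):          # X: R -> R, a trajectory depending on time t
    return 3.0 * t

def f(x):          # f: R -> R, a function of position x
    return x ** 2

# "f o X" is a NEW function of t; "f(X(t))" is its value at a specific t.
def f_after_X(t):  # (f o X)(t) = f(X(t))
    return f(X(t))

print(f_after_X(2.0))  # f(X(2)) = (3*2)^2 = 36.0
```

With the composition named explicitly, "the derivative of f" and "the derivative of f∘X" can no longer be confused, which is exactly the distinction the lecture notation blurred.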
 
  • #96
PeroK said:
Through rigorous mathematics.
Let me put it another way. Most mathematicians believe the Riemann hypothesis is correct. But this suspicion is not grounded in some kind of rigorous proof, obviously. It's just a suspicion, supported by non-rigorous arguments. Nevertheless, this suspicion is part of the process of real math even if it is not officially admitted as a theorem. In fact, the conjecture would not have been formulated and widely investigated like that had there not been an intuitive jump from what was rigorously known to something that still to this day isn't. It's also telling that entire fields are founded on conjectures. If mathematicians didn't admit non-rigorous methods at all, these fields would be considered a waste of time until the conjectures were proven or disproven.

One example that I just thought of is Perelman's proof of the Poincare conjecture. Perelman is of course thought of as the one who proved it, but his proof was more like a proof sketch. It still had some very non-trivial gaps, non-rigorous "jumps" that he made to reach the solution, that were later filled in by other mathematicians (I believe it was Cao and someone else? I don't remember, you can look it up). This to me reveals that the creative jump is usually non-rigorous, and supported by (informed) intuition.

Looking back at your original post, I realize that I actually don't really disagree with it much, and perhaps I didn't express myself well enough either. The main point of disagreement is that I don't think most of the unrigorous math in physics is really a "simplification" of a rigorous result; for instance, the way calculus is used in physics, with hand-wavy infinitesimals etc., is close to how calculus was originally conceived, before being formalized and made rigorous with epsilons and deltas. But other than that, I don't really disagree that rigor was immensely helpful in building the complex mathematical structures we have today.
 
  • #97
AndreasC said:
Let me put it another way. Most mathematicians believe the Riemann hypothesis is correct. But this suspicion is not grounded in some kind of rigorous proof, obviously. It's just a suspicion, supported by non-rigorous arguments. Nevertheless, this suspicion is part of the process of real math even if it is not officially admitted as a theorem. In fact, the conjecture would not have been formulated and widely investigated like that had there not been an intuitive jump from what was rigorously known to something that still to this day isn't. It's also telling that entire fields are founded on conjectures. If mathematicians didn't admit non-rigorous methods at all, these fields would be considered a waste of time until the conjectures were proven or disproven.

One example that I just thought of is Perelman's proof of the Poincare conjecture. Perelman is of course thought of as the one who proved it, but his proof was more like a proof sketch. It still had some very non-trivial gaps, non-rigorous "jumps" that he made to reach the solution, that were later filled in by other mathematicians (I believe it was Cao and someone else? I don't remember, you can look it up). This to me reveals that the creative jump is usually non-rigorous, and supported by (informed) intuition.

Looking back at your original post, I realize that I actually don't really disagree with it much, and perhaps I didn't express myself well enough either. The main point of disagreement is that I don't think most of the unrigorous math in physics is really a "simplification" of a rigorous result; for instance, the way calculus is used in physics, with hand-wavy infinitesimals etc., is close to how calculus was originally conceived, before being formalized and made rigorous with epsilons and deltas. But other than that, I don't really disagree that rigor was immensely helpful in building the complex mathematical structures we have today.
The suspicion that the Riemann hypothesis is true is mostly due to how it relates to the structure of prime numbers; it being false would produce very interesting mathematics. Do not be deceived: there are results both ways, results that hold if the Riemann hypothesis is true and results that hold if it's false.
Regardless, mathematics isn't necessarily the study of proving theorems; conditional results (i.e. if A then B) are still considered mathematics, and completely rigorous. (A premise doesn't have to be true for what's stated to be rigorous / valid.)
There are also multiple mathematicians that believe / want to believe that the Riemann hypothesis is false, while they are a minority they do exist.

Humans are flawed creatures; researchers, when writing papers, won't write out every single mathematical step, and they're bound to make mistakes.
Most of these mistakes are somewhat naive: you recognize the similarity between things and think "since this holds for X, this is bound to be true here because I can turn this into X", when there's a subtle error in the thinking. Just the other day I thought "surely the set of affine transformations on X forms a ring, since I can embed them in a matrix algebra"; this argument was flawed because the embedding doesn't preserve addition in one entry (mind you, I should've known better, as if this were true it'd be a very well known result).
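The subtle error can be made concrete with a small sketch (the 1-D case and the specific coefficients are made up for illustration): encoding the affine map ##x \mapsto ax + b## as the homogeneous matrix ##\begin{pmatrix} a & b \\ 0 & 1 \end{pmatrix}##, matrix multiplication preserves the bottom row, but matrix addition turns the bottom-right entry into 2, so sums fall outside the embedded set.

```python
# Encode the affine map x -> a*x + b (1-D case) as [[a, b], [0, 1]].
def affine_to_matrix(a, b):
    return [[a, b], [0, 1]]

def mat_mul(M, N):  # 2x2 matrix product
    return [[sum(M[i][k] * N[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def mat_add(M, N):  # 2x2 entrywise sum
    return [[M[i][j] + N[i][j] for j in range(2)] for i in range(2)]

M1 = affine_to_matrix(2, 1)   # x -> 2x + 1
M2 = affine_to_matrix(3, 5)   # x -> 3x + 5

# Multiplication keeps the form: bottom row stays (0, 1), and the product
# encodes the composition x -> 2*(3x + 5) + 1 = 6x + 11.
print(mat_mul(M1, M2))        # [[6, 11], [0, 1]]

# Addition does not: the bottom-right entry becomes 2, so the sum
# no longer encodes an affine map in this form.
print(mat_add(M1, M2))        # [[5, 6], [0, 2]]
```

So the embedding is multiplicative but not additive, which is exactly why the "it's a subring of a matrix algebra" shortcut fails.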
People who do mathematics, especially with very long arguments, don't have time to check the validity of every small proposition they "swear" is true, especially when at a sufficiently advanced level you can't simply google "is X true?" and get an answer.

Does this mean that mathematics is unrigorous? I believe not at all, because first and foremost the published material will always be in formal language, or very easily formalized language; there should never be any ambiguity about what is meant. Secondly, the peer review process exists for this very reason: people will read what you write, and they'll spot those very propositions and check whether they do indeed hold.

I don't care about how people arrive at the idea, I only care about its exposition. How people think of anything is their business, and I don't care at all.

But like, calculus HAS been formalized. There's no need for 'new' objects, which I honestly believe will only give a false sense of understanding and that do not help at all with computation.
 
  • #98
TurtleKrampus said:
The suspicion that the Riemann hypothesis is true is mostly due to how it relates to the structure of prime numbers, it being false would produce very interesting mathematics.
Right, that's my point: as with many other conjectures, there are arguments that are not rigorous but nevertheless sound plausible, indicate research directions, etc.

My general point is that there is a point during discovery, or while devising a proof, where mathematicians have to take a leap that they typically fill in later. They didn't always fill it in later, but today the structures they deal with are incredibly complex and it has proven necessary to do so.
TurtleKrampus said:
But like, calculus HAS been formalized. There's no need for 'new' objects, which I honestly believe will only give a false sense of understanding and that do not help at all with computation.
Right, calculus has been formalized. But now we get to physics and how physicists use math. The thing is, physicists are first of all constrained and informed by physical reality, on top of the mathematical structure. For instance, for a body thrown upwards in a gravitational field, you can tell its trajectory will have a critical point, without doing any math. You can also tell it will be continuous, etc. So while in this particular example you could be very formal with it, it's kind of a burden to carry all that baggage. You won't encounter the pathological cases that mathematicians deal with so you can safely just ignore complications to keep things easy.
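The thrown-body example can be checked in a few lines (the initial speed here is an arbitrary made-up value): the height ##y(t) = vt - \tfrac{1}{2}gt^2## peaks exactly once, at ##t = v/g##, just as the physical picture predicts before any calculus is done.

```python
# Made-up initial speed; g is standard gravity in m/s^2.
v, g = 10.0, 9.8

def y(t):
    """Height of a body thrown straight up at speed v."""
    return v * t - g * t**2 / 2

# Setting y'(t) = v - g*t to zero gives the single critical point:
t_peak = v / g

# It is a maximum: nearby values on either side are smaller.
print(y(t_peak) > y(t_peak - 0.1) and y(t_peak) > y(t_peak + 0.1))  # True
```

None of the pathological cases a mathematician would have to rule out (discontinuity, non-differentiability, multiple critical points) can occur here, which is the physicist's license to skip them.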

That works most of the time, but not all of the time. Sometimes you need some more serious math to really get to the bottom of something. But that's only a small part of physics research, so it's not what most physicists are taught, although maybe they should.
 
  • #99
@AndreasC I believe you use the word "rigorous" differently than the way mathematicians use it.
 
  • #100
martinbn said:
@AndreasC I believe you use the word "rigorous" differently than the way mathematicians use it.
You might be right but I think maybe the issue is that it's not really used consistently in general...
 
  • #101
martinbn said:
@AndreasC I believe you use the word "rigorous" differently than the way mathematicians use it.
When I was a graduate student I worked my way through A Hilbert Space Problem Book, by Paul Halmos.

https://link.springer.com/book/10.1007/978-1-4684-9330-6

It included a series of open-ended questions, where you had either to prove something or find a counterexample, without being given the usual undergraduate hints as to whether it was true or not. As I recall, almost every statement that looked intuitively attractive had a counterexample, and statements that looked like they couldn't be correct could be proved. Even the statements of the problems required a grounding in rigorous mathematics.

Halmos was trying to achieve what Tao describes above: retraining the intuition of a student to be based on sound, rigorous mathematics, not on woolly thinking.
 
  • #102
AndreasC said:
You might be right but I think maybe the issue is that it's not really used consistently in general...
I think any two mathematicians will always agree on whether something is rigorous or not.
 
  • #103
martinbn said:
I think any two mathematicians will always agree on whether something is rigorous or not.
Different fields have different standards of rigor so I doubt it... After all I think sometimes there is a bit of confusion between "formal" and "rigorous"...
 
  • #104
AndreasC said:
Different fields have different standards of rigor so I doubt it...
You can try. Ask some and see.
AndreasC said:
After all I think sometimes there is a bit of confusion between "formal" and "rigorous"...
I was going to say that, it seems that you are thinking about "formal" not "rigorous", because Bourbaki and Poincare are rigorous, but the first is more formal in style.
 
  • #105
martinbn said:
I was going to say that, it seems that you are thinking about "formal" not "rigorous", because Bourbaki and Poincare are rigorous, but the first is more formal in style.
I see your point; however, Poincare is not fully rigorous even in that sense. There are examples of theorems and proofs he published that turned out to be kind of wrong due to relying on more or less heuristic arguments, and which were later refined either by others or by himself. If you read Analysis Situs, for example, there are lots of things he doesn't justify. For instance, he initially stated Poincare duality in 1893, without proof. Then in Analysis Situs he "proves" it, but Heegaard showed the proof was more or less wrong. If I recall correctly, Heegaard also made numerous other criticisms of Analysis Situs. Back then math had a lot of back and forth like that. Today, you generally have to be more rigorous to publish, though it depends on the field.
 
