swampwiz
I was looking at this discussion of swapping roots of a polynomial causing the discriminant to loop around the origin.
https://www.akalin.com/quintic-unsolvability
Although it appears to be the case, has this mathematical fact ever been proven?
It seems that the discriminant is the product of the squares of the differences of each pair of roots, and thus the polar angle of the discriminant is simply twice the sum of the polar angles of the difference vectors, one for each pair of roots.
I can see how it works for a quadratic, since there is only a single pair of roots: swapping them forces the difference vector to make a net sweep of 180 degrees (in one direction or the other), so the polar angle of the discriminant must make a net sweep of 360 degrees. However, I don't see how this argument extends to cubic or higher-degree polynomials.
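To make the quadratic case concrete, here is a small numerical sketch (function names and paths are my own choices, not from the linked article): it tracks the cumulative change in the argument of the discriminant as two roots are swapped along semicircular arcs, and confirms the net sweep is one full turn.

```python
import cmath
import math

def discriminant(roots):
    """Product of the squared differences over all pairs of roots."""
    d = 1.0 + 0.0j
    n = len(roots)
    for i in range(n):
        for j in range(i + 1, n):
            d *= (roots[i] - roots[j]) ** 2
    return d

def winding_of_discriminant(paths, steps=1000):
    """Net change in arg(discriminant) as each root follows its path.

    `paths` is a list of functions t -> complex root position, t in [0, 1].
    Returns the net sweep in turns (1.0 = 360 degrees).
    """
    total = 0.0
    prev = cmath.phase(discriminant([p(0.0) for p in paths]))
    for k in range(1, steps + 1):
        t = k / steps
        cur = cmath.phase(discriminant([p(t) for p in paths]))
        delta = cur - prev
        # Unwrap the phase: keep each step's change in (-pi, pi].
        if delta > math.pi:
            delta -= 2 * math.pi
        elif delta <= -math.pi:
            delta += 2 * math.pi
        total += delta
        prev = cur
    return total / (2 * math.pi)

# Swap the roots +1 and -1 along the upper and lower unit semicircles.
swap = [lambda t: cmath.exp(1j * math.pi * t),    # +1 -> -1 (upper arc)
        lambda t: -cmath.exp(1j * math.pi * t)]   # -1 -> +1 (lower arc)
print(winding_of_discriminant(swap))  # ≈ 1.0 turn
```

Here the difference vector is `2 * exp(1j * math.pi * t)`, which sweeps 180 degrees, so its square (the discriminant) sweeps 360 degrees, in agreement with the argument above. The same function can be fed paths for three or more roots to experiment with the cubic case numerically.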