- Thread starter
- #1

Hi, does anyone know the formula/process for the least squares method if they give several (x, y) points and ask me to find y = Ax^2 + B?

Are you searching for the 'particular' quadratic polynomial $A\ x^{2} + B$ and not the 'general' quadratic polynomial $A\ x^{2} + B\ x + C$?...
Sounds like you want a non-linear least squares.
Bring in a dummy variable \(\displaystyle X = x^2\) to give you a new set of points, and then perform a linear least squares regression on X vs y to give an equation of the form \(\displaystyle y = A\,X + B = A\,x^2 + B\).
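If it helps, here is a minimal Python sketch of that substitution (assuming NumPy; the data points are made up purely for illustration):

```python
import numpy as np

# Sample data points (made up for illustration).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 32.8])

# Dummy variable X = x^2 turns y = A*x^2 + B into the straight line y = A*X + B.
X = x**2

# A degree-1 polynomial fit returns the coefficients [A, B] (highest power first).
A, B = np.polyfit(X, y, 1)
print(f"y = {A:.3f} * x^2 + {B:.3f}")
```

This works precisely because the model is linear in the coefficients A and B.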
Let's suppose that the approximating polynomial is $y=A\ x + B$, so that we have the so-called 'least squares straight line'. In this case, if we have a discrete set of N + 1 points $[y_{i},x_{i}],\ i=0,1,...,N$, then we have to minimize with respect to A and B the sum...
$\displaystyle S= \sum_{i=0}^{N} [y_{i} - A\ x_{i} - B]^{2}$ (1)
Proceeding in standard fashion, we compute the partial derivatives and require them to vanish...
$\displaystyle \frac{\partial{S}}{\partial{A}} = - 2\ \sum_{i=0}^{N} x_{i}\ (y_{i} - A x_{i} - B) = 0$
$\displaystyle \frac{\partial{S}}{\partial{B}} = - 2\ \sum_{i=0}^{N} (y_{i} - A x_{i} - B) = 0$ (2)
Rearranging, we arrive at the 2 x 2 linear system of equations...
$\displaystyle A\ \sum_{i=0}^{N} x_{i}^{2} + B\ \sum_{i=0}^{N} x_{i} = \sum_{i=0}^{N} x_{i}\ y_{i}$
$\displaystyle A\ \sum_{i=0}^{N} x_{i} + B\ (N+1) = \sum_{i=0}^{N} y_{i}$ (3)
... the solution of which is...
$\displaystyle A = \frac{(N+1)\ \sum_{i=0}^{N} x_{i}\ y_{i}- \sum_{i=0}^{N} x_{i}\ \sum_{i=0}^{N} y_{i}}{(N+1)\ \sum_{i=0}^{N} x_{i}^{2} - (\sum_{i=0}^{N} x_{i})^{2}}$
$\displaystyle B = \frac {\sum_{i=0}^{N} x_{i}^{2}\ \sum_{i=0}^{N} y_{i} - \sum_{i=0}^{N} x_{i}\ \sum_{i=0}^{N} x_{i}\ y_{i}}{(N+1)\ \sum_{i=0}^{N} x_{i}^{2} - (\sum_{i=0}^{N} x_{i})^{2}}$ (4)
Very well!... but what to do if we want to use a polynomial of degree n > 1?... we will examine that [if necessary...] in a later post...

Kind regards

$\chi$ $\sigma$

Of course, if we are searching for a polynomial like $\displaystyle y=A\ x^{2} + B$, all we have to do is write $x_{i}^{2}$ instead of $x_{i}$ in (4)...
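To make that recipe concrete, here is a short Python sketch (assuming NumPy; the function name is mine) that evaluates formulas (4) with $x_{i}^{2}$ substituted for $x_{i}$, exactly as described above:

```python
import numpy as np

def fit_quadratic_no_linear_term(x, y):
    """Least squares fit of y = A*x^2 + B via formulas (4),
    with x_i^2 substituted for x_i."""
    t = np.asarray(x, dtype=float) ** 2      # substituted abscissae t_i = x_i^2
    y = np.asarray(y, dtype=float)
    n1 = len(t)                              # number of points, i.e. N + 1
    st, sy = t.sum(), y.sum()                # sum of t_i, sum of y_i
    stt, sty = (t * t).sum(), (t * y).sum()  # sum of t_i^2, sum of t_i * y_i
    denom = n1 * stt - st ** 2               # common denominator of (4)
    A = (n1 * sty - st * sy) / denom
    B = (stt * sy - st * sty) / denom
    return A, B
```

This is algebraically the same fit as the dummy-variable approach suggested earlier in the thread.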
The set of derivatives of all the equations with respect to the coefficients is a Jacobian matrix J.
Solving the system is equivalent to multiplying by $(J^T J)^{-1}J^T$.
This can be generalized to more complicated expressions.
The solution remains the same as long as the expression is linear in the coefficients.
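As a quick sketch of that matrix formulation in Python (assuming NumPy; the column order of J is my choice): for the model $y = A\,x^{2} + B$ the Jacobian has one row $[x_i^2,\ 1]$ per data point, and the coefficients come out of the normal equations:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 32.8])

# Jacobian of the model w.r.t. the coefficients (A, B):
# one row [x_i^2, 1] per data point.
J = np.column_stack([x**2, np.ones_like(x)])

# Normal equations: coefficients = (J^T J)^{-1} J^T y.
# solve() solves the 2 x 2 system directly instead of forming the inverse.
A, B = np.linalg.solve(J.T @ J, J.T @ y)
```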
From the 'purely theoretical' point of view what you say is true... from the 'practical' point of view, when you have an approximating polynomial of degree n > 1, the procedure leads in most cases to an ill-conditioned linear problem, so that a different approach [discovered by the German mathematician Carl Friedrich Gauss...] has to be used...

Kind regards

$\chi$ $\sigma$

I believe the only practical problem is how to find the inverse matrix.
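For what it's worth, library least squares routines sidestep the explicit inverse entirely, which also helps with the ill-conditioning mentioned above. A sketch, again assuming NumPy:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 32.8])
J = np.column_stack([x**2, np.ones_like(x)])

# lstsq factors J (via SVD under the hood) instead of forming (J^T J)^{-1},
# which is the numerically safer route when J^T J is ill conditioned.
(A, B), *_ = np.linalg.lstsq(J, y, rcond=None)
```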