- Thread starter: Juliayaho

Hi, does anyone know the "formula/process" for the least squares method if they give me several (x, y) points and ask me to find y = Ax^2 + B?

Are You searching for the 'particular' quadratic polynomial $A\ x^{2} + B$ and not the 'general' quadratic polynomial $A\ x^{2} + B\ x + C$?...

Kind regards

$\chi$ $\sigma$

- Admin

Sounds like you want a non-linear least squares fit.

Suppose we write your equation like:

$$\begin{bmatrix}y_1 \\ \vdots \\ y_n \end{bmatrix} =
\begin{bmatrix}x_1^2 & 1 \\ \vdots & \vdots \\ x_n^2 & 1 \end{bmatrix}
\begin{bmatrix}A \\ B \end{bmatrix}$$

If we call the matrix J, and the vector of coefficients $\mathbf a$, we can write this as:

$$\mathbf y = J \mathbf a$$

The least squares solution (minimizing the sum of squared residuals in y) is:

$$\mathbf a = (J^T J)^{-1} J^T \mathbf y$$
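As a quick sanity check, this formula can be run directly with NumPy; the sample points below are invented for illustration and roughly follow y = 2x^2 + 1:

```python
import numpy as np

# Invented sample points, roughly following y = 2 x^2 + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 9.2, 19.1, 32.8])

# Design matrix J with columns [x^2, 1], matching the equation above
J = np.column_stack([x**2, np.ones_like(x)])

# a = (J^T J)^{-1} J^T y; solving the system is preferable to
# explicitly forming the inverse
a = np.linalg.solve(J.T @ J, J.T @ y)
A, B = a
print(A, B)  # close to A = 2, B = 1
```

Using `solve` on the normal equations rather than computing $(J^T J)^{-1}$ explicitly gives the same result more cheaply and a bit more accurately.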

Bring in a dummy variable \(\displaystyle X = x^2\) to get a new set of points, and then perform a linear least squares regression of y against X to give an equation of the form \(\displaystyle y = A\,X + B = A\,x^2 + B\).


Let's suppose that the approximating polynomial is $y=A\ x + B$, so that we have the so-called 'least squares straight line'. In this case, if we have a discrete set of N + 1 points $[x_{i},y_{i}],\ i=0,1,...,N$, then we have to minimize with respect to A and B the sum...

$\displaystyle S= \sum_{i=0}^{N} [y_{i} - A\ x_{i} - B]^{2}$ (1)

Proceeding in standard fashion, we compute the partial derivatives and require them to vanish...

$\displaystyle \frac{\partial{S}}{\partial{A}} = - 2\ \sum_{i=0}^{N} x_{i}\ (y_{i} - A x_{i} - B) = 0$

$\displaystyle \frac{\partial{S}}{\partial{B}} = - 2\ \sum_{i=0}^{N} (y_{i} - A x_{i} - B) = 0$ (2)

Rearranging, we arrive at the 2 x 2 linear system of equations...

$\displaystyle A\ \sum_{i=0}^{N} x_{i}^{2} + B\ \sum_{i=0}^{N} x_{i} = \sum_{i=0}^{N} x_{i}\ y_{i}$

$\displaystyle A\ \sum_{i=0}^{N} x_{i} + B\ (N+1) = \sum_{i=0}^{N} y_{i}$ (3)

... the solution of which is...

$\displaystyle A = \frac{(N+1)\ \sum_{i=0}^{N} x_{i}\ y_{i}- \sum_{i=0}^{N} x_{i}\ \sum_{i=0}^{N} y_{i}}{(N+1)\ \sum_{i=0}^{N} x_{i}^{2} - (\sum_{i=0}^{N} x_{i})^{2}}$

$\displaystyle B = \frac {\sum_{i=0}^{N} x_{i}^{2}\ \sum_{i=0}^{N} y_{i} - \sum_{i=0}^{N} x_{i}\ \sum_{i=0}^{N} x_{i}\ y_{i}}{(N+1)\ \sum_{i=0}^{N} x_{i}^{2} - (\sum_{i=0}^{N} x_{i})^{2}}$ (4)
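Formulas (3)-(4) translate directly into a few lines of Python; the data points below are invented for illustration and roughly follow y = 3x + 2:

```python
# Invented data points, roughly following y = 3 x + 2
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2.1, 4.9, 8.2, 11.1, 13.8]

n1 = len(xs)                       # N + 1 in the notation above
sx = sum(xs)                       # sum of x_i
sy = sum(ys)                       # sum of y_i
sxx = sum(x * x for x in xs)       # sum of x_i^2
sxy = sum(x * y for x, y in zip(xs, ys))  # sum of x_i y_i

den = n1 * sxx - sx ** 2           # common denominator in (4)
A = (n1 * sxy - sx * sy) / den
B = (sxx * sy - sx * sxy) / den
print(A, B)  # A = 2.96, B = 2.1 for this data
```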

Very well!... but what do we do if we want to use a polynomial of degree n>1?... we will examine that [if necessary...] in a later post...

Kind regards

$\chi$ $\sigma$


Of course, if we are searching for a polynomial like $\displaystyle y=A\ x^{2} + B$, all we have to do is write $x_{i}^{2}$ instead of $x_{i}$ in (4)...
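A minimal numerical check of this substitution, with invented data generated exactly from $y = 1.5\,x^{2} + 0.5$:

```python
# Invented data, generated exactly from y = 1.5 x^2 + 0.5
xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.5 * x ** 2 + 0.5 for x in xs]

# Substitute X = x^2 and reuse the straight-line formulas (4)
X = [x ** 2 for x in xs]
n1 = len(X)
sx, sy = sum(X), sum(ys)
sxx = sum(v * v for v in X)
sxy = sum(v * y for v, y in zip(X, ys))
den = n1 * sxx - sx ** 2
A = (n1 * sxy - sx * sy) / den
B = (sxx * sy - sx * sxy) / den
print(A, B)  # recovers A = 1.5, B = 0.5
```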

Kind regards

$\chi$ $\sigma$

- Admin

The set of derivatives of all the equations with respect to the coefficients is a Jacobian matrix J.

Solving the system is equivalent to multiplying by $(J^T J)^{-1}J^T$.

This can be generalized to more complicated expressions.

The solution remains the same as long as the expression is linear in the coefficients.
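For example (a sketch with invented data), fitting a cubic only changes the columns of J; the coefficients still come out of the same pseudo-inverse formula:

```python
import numpy as np

# Invented data, generated exactly from y = x^3 - 2x + 1
x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0])
y = x ** 3 - 2 * x + 1

# One column of J per basis function; the model stays linear in the
# coefficients even though it is non-linear in x
J = np.column_stack([x ** 3, x ** 2, x, np.ones_like(x)])

# a = (J^T J)^{-1} J^T y, i.e. the pseudo-inverse applied to y
a = np.linalg.solve(J.T @ J, J.T @ y)
print(a)  # close to [1, 0, -2, 1]
```

The same pattern works for any basis functions (sines, exponentials in x, etc.) as long as the coefficients enter linearly.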


From the 'purely theoretical' point of view, what You say is true... from the 'practical' point of view, when You have an approximating polynomial of degree n>1, the procedure leads in most cases to an ill-conditioned linear problem, so that a different approach [discovered by the German mathematician Carl Friedrich Gauss...] has to be used...

Kind regards

$\chi$ $\sigma$

- Admin

I believe the only practical problem is how to find the inverse matrix.

For 2 coefficients there is no problem, as it is straightforward to find the inverse matrix, as your equations show.

For 3 or more coefficients we may need to be careful about how we find the inverse (if the problem is ill-conditioned).

For that, there are numerical methods that minimize the rounding errors.
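One such method, as a sketch: an SVD- or QR-based solver works on J directly instead of forming $J^T J$, whose condition number is roughly the square of J's. NumPy's `np.linalg.lstsq` is SVD-based; the data below are invented for illustration:

```python
import numpy as np

# Invented, mildly ill-conditioned example: monomial basis on an
# interval far from the origin
x = np.linspace(10.0, 11.0, 30)
true = np.array([1.0, -2.0, 0.5, 3.0, -1.0])   # coefficients of x^4 ... x^0
y = np.polyval(true, x)

J = np.vander(x, 5)  # columns x^4, x^3, x^2, x, 1

# SVD-based solve: avoids forming J^T J explicitly
coef, *_ = np.linalg.lstsq(J, y, rcond=None)
residual = np.max(np.abs(J @ coef - y))
print(residual)  # stays at round-off level
```

Centering the data (fitting in powers of x minus its mean) or using an orthogonal polynomial basis reduces the conditioning problem further.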