Solving this system of equations.

jasonc

New member
Jun 19, 2012
6
I have a personal project I'm working on that involves calibrating some values from a sensor input. So far I've been doing the calibrations in a very tedious manner but I'd like to write a program to solve the calibration for me. Very long story very short, this basically boils down to the following:

I have this equation:

\[v=\frac{-gG-bB+r-C}{-gG-bB+a-C}\]

I have a set of many v, r, g, b, and a values (v is either 0 or 1, a is a known constant, and r, g, b vary between 0 and 1), and I want to find the best-fit values for G, B, and C.

Is this possible? I don't know much about linear equations but I don't think this is one. Can anybody recommend any methods for solving these?

Thanks!
J
 

Sudharaka

Well-known member
MHB Math Helper
Feb 5, 2012
1,621
Hi jasonc, :)

What you have is a system of equations which you have to solve for \(G,\,B,\mbox{ and }C\).

Whenever \(v=1\),

\[1=\frac{-gG-bB+r-C}{-gG-bB+a-C}\Rightarrow r=a\]

Hence the set of values that you have should satisfy the criterion \(r=a\) whenever \(v=1\). Otherwise this system is not solvable.

Whenever \(v=0\),

\[0=\frac{-gG-bB+r-C}{-gG-bB+a-C}\Rightarrow -gG-bB+r-C=0\]

When you plug in values for \(r,\,g,\,b\) you have a set of linear equations in the three unknowns \(G,\,B,\mbox{ and }C\). If there are more than three linearly independent equations then the system doesn't have a solution. If there are exactly three linearly independent equations then the system has a unique solution. If the number of linearly independent equations is less than three, the system will have infinitely many solutions.
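
For instance, a quick sketch of that idea in Python (NumPy assumed; the data arrays below are illustrative placeholders rather than your actual measurements):

```python
import numpy as np

# Illustrative placeholder data: one entry per measurement; a is the known constant.
v = np.array([0, 0, 0, 0, 1])
r = np.array([0.20, 0.55, 0.90, 0.40, 0.70])
g = np.array([0.10, 0.50, 0.80, 0.30, 0.60])
b = np.array([0.30, 0.20, 0.70, 0.60, 0.10])
a = 0.70

# Keep only the v = 0 rows; each one gives the linear equation g*G + b*B + C = r.
mask = (v == 0)
A = np.column_stack([g[mask], b[mask], np.ones(mask.sum())])
rhs = r[mask]

# lstsq copes with exactly three independent equations as well as an
# overdetermined (noisy) set of them, in the least-squares sense.
(G, B, C), *_ = np.linalg.lstsq(A, rhs, rcond=None)
print(G, B, C)
```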

Kind Regards,
Sudharaka.
 

CaptainBlack

Well-known member
Jan 26, 2012
890
You want a non-linear optimisation tool that finds the minimum of:

\[ \mathrm{Ob}(G,B,C)=\sum_i \left|v_i-\frac{-g_iG-b_iB+r_i-C}{-g_iG-b_iB+a-C}\right|^{\alpha} \]

Usual choices of \(\alpha\) are 1 and 2 (2 is better, as solvers that assume smoothness will work better with it).

I would initially suggest you look at the (non-linear) solvers that ship with Excel and/or Gnumeric.
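
If you would rather do it in code than in a spreadsheet, here is a rough sketch of minimising that objective with SciPy's general-purpose minimiser (the data and the starting point are illustrative placeholders):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative placeholder data; a is the known constant.
v = np.array([0, 1, 0, 0, 1], dtype=float)
r = np.array([0.20, 0.70, 0.90, 0.40, 0.70])
g = np.array([0.10, 0.60, 0.80, 0.30, 0.50])
b = np.array([0.30, 0.10, 0.70, 0.60, 0.20])
a = 0.70
alpha = 2  # exponent in the objective; 1 or 2 as discussed above

def objective(params):
    G, B, C = params
    # Model prediction of v for each measurement.
    model = (-g * G - b * B + r - C) / (-g * G - b * B + a - C)
    return np.sum(np.abs(v - model) ** alpha)

result = minimize(objective, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
G, B, C = result.x
print(G, B, C)
```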

CB
 

Ackbach

Indicium Physicus
Staff member
Jan 26, 2012
4,192
I'm not so sure it's nonlinear. Consider that you can re-arrange the equation thus:

$$va-r=(v-1)gG+(v-1)bB+(v-1)C=(v-1)(gG+bB+C).$$

Thus, you could try minimizing the difference

$$Ob(G,B,C)=\sum_{i}\left(v_{i}a-r_{i}+(1-v_{i})(g_{i}G+b_{i}B+C)\right)^{2}.$$

You might even be able to derive the explicit formulas you need by using the standard calculus treatment of setting the derivatives

$$\frac{\partial Ob}{\partial G}=\frac{\partial Ob}{\partial B}=\frac{\partial Ob}{\partial C}=0.$$
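
As a rough sketch (Python with NumPy assumed; the data is purely illustrative), that derivative condition is exactly the normal equations of an ordinary linear least-squares fit:

```python
import numpy as np

# Illustrative placeholder data; a is the known constant.
v = np.array([0, 1, 0, 0, 1], dtype=float)
r = np.array([0.20, 0.70, 0.90, 0.40, 0.70])
g = np.array([0.10, 0.60, 0.80, 0.30, 0.50])
b = np.array([0.30, 0.10, 0.70, 0.60, 0.20])
a = 0.70

# Residual for row i:  v_i*a - r_i + (1 - v_i)*(g_i*G + b_i*B + C),
# so the design matrix carries the (1 - v_i) factors and the target is r_i - v_i*a.
w = 1.0 - v
A = np.column_stack([w * g, w * b, w])
y = r - v * a

# Setting dOb/dG = dOb/dB = dOb/dC = 0 gives the normal equations A^T A x = A^T y.
G, B, C = np.linalg.solve(A.T @ A, A.T @ y)
print(G, B, C)
```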
 

CaptainBlack

Well-known member
Jan 26, 2012
890
I'm not so sure it's nonlinear.
I'm not sure I said it was intrinsically non-linear, only that it could be solved fairly easily with a non-linear optimisation tool. But in the sense of mathematical programming it is non-linear.

In terms of regression it can be reduced to a linear least-squares problem, but are we sure that least squares is the desired optimality condition? Also, you are no longer minimising the sum of squared residuals between the variable of interest and the model, so you have lost any obvious sense in which this is a good fit; that is, you have introduced an optimality condition different from minimising the sum of some strictly increasing function of the absolute residuals (or some even more general function of the residuals).

If you want to go down this route, you either need to prove that the two solutions are the same or invoke the principle of "good enough for government purposes".

CB

PS I know that I did not leave the optimality condition fully general, but least squares and least absolute value are the two most popular optimality conditions.

PPS Another approach is to treat this as a probabilistic problem where the RHS is the probability of v being 1; then we could go down the route of a maximum likelihood (or maximum posterior probability) estimator for the model parameters, but we would still probably end up with a numerical non-linear least-squares problem to solve. See my warship battle-damage survival probability paper for an example of this approach.
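
A very rough sketch of that probabilistic route in Python (SciPy assumed; the data, the starting point, and the clipping of the probabilities are illustrative choices rather than part of the model itself):

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative placeholder data; a is the known constant.
v = np.array([0, 1, 0, 0, 1], dtype=float)
r = np.array([0.20, 0.70, 0.90, 0.40, 0.70])
g = np.array([0.10, 0.60, 0.80, 0.30, 0.50])
b = np.array([0.30, 0.10, 0.70, 0.60, 0.20])
a = 0.70

def neg_log_likelihood(params):
    G, B, C = params
    # Treat the RHS of the model as P(v = 1), clipped into (0, 1) so the logs stay finite.
    p = (-g * G - b * B + r - C) / (-g * G - b * B + a - C)
    p = np.clip(p, 1e-9, 1.0 - 1e-9)
    return -np.sum(v * np.log(p) + (1.0 - v) * np.log(1.0 - p))

result = minimize(neg_log_likelihood, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
print(result.x)  # maximum-likelihood estimates of G, B, C
```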
 