Solving Simultaneous Equations Using Matrices

  • #1
theJorge551
I have to teach myself pre-calculus and basic calculus over the summer. While covering matrices, the chapter on solving simultaneous systems of equations puts forth several methods, one of which is Gaussian elimination with augmented matrices. I understand why the first element of the newly augmented matrix now has to equal zero, but the formula for adjusting every other element in the first row isn't clearly defined in my book, and they show the result without going through how to evaluate the other elements. Is it basically a matter of building an operation like "Row 1 minus 3 x (Row 2)" to make the first element equal zero, or is there an ironclad method for each row reduction?
 
  • #2
Do you mean the first element of the matrix has to equal one?

In any event, you can read more about Gaussian elimination here:

http://en.wikipedia.org/wiki/Gaussian_elimination

Basically, though, you can take a row in an augmented matrix and do a few different things to it.

1) Multiply it by a constant.
2) Switch it with another row.
3) Add a multiple of another row to it.

Gaussian elimination just consists of performing a sequence of these "row operations" until your matrix is reduced to "echelon form", where all nonzero rows are above any zero rows, and the first nonzero number in each row is a) 1, and b) in a column further to the right than that of the row above it.
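The row operations and echelon-form reduction described above can be sketched in code. This is a minimal illustration in Python, not code from the thread; the example system is hypothetical, and it assumes a square system with a unique solution.

```python
def gaussian_elimination(aug):
    """Reduce an augmented matrix [A | b] to echelon form, then back-substitute."""
    n = len(aug)
    for col in range(n):
        # Row operation 2: swap in the row with the largest entry in this column
        # (partial pivoting, which also avoids dividing by zero).
        pivot = max(range(col, n), key=lambda r: abs(aug[r][col]))
        aug[col], aug[pivot] = aug[pivot], aug[col]
        # Row operation 1: scale the pivot row so its leading entry is 1.
        lead = aug[col][col]
        aug[col] = [x / lead for x in aug[col]]
        # Row operation 3: subtract multiples of the pivot row from rows below it.
        for r in range(col + 1, n):
            factor = aug[r][col]
            aug[r] = [x - factor * p for x, p in zip(aug[r], aug[col])]
    # The matrix is now in echelon form; solve from the bottom row upward.
    sol = [0.0] * n
    for i in range(n - 1, -1, -1):
        sol[i] = aug[i][n] - sum(aug[i][j] * sol[j] for j in range(i + 1, n))
    return sol

# 2x + 3y = 8 and x - y = -1 have the solution x = 1, y = 2.
print(gaussian_elimination([[2.0, 3.0, 8.0], [1.0, -1.0, -1.0]]))
```

Each row of the augmented matrix holds one equation's coefficients followed by its constant, exactly as in the {a b : c} notation used later in the thread.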
 
  • #3
Well, with the method the book puts forth, the first row (of a 2 x 3 augmented matrix) reads {a b : c}, in which a is the first x coefficient, b is the first y coefficient, and c is the independent constant. It then says that a must become zero, leaving a newly transformed b and transformed c, from which the value of y can be found on its own. Hence, one can then find the value of x using the other equation. I've been doing some practice problems, and it seems one simply needs to add another row to it (sort of like what the book describes) after multiplying one of the rows by a constant. Thanks
 
  • #4
Are you sure that is what your book says? I have never seen it done that way. You want to reduce something like "ax + by = c, dx + ey = f" to "x = p, y = q" or, in terms of matrices,
[tex]\begin{bmatrix}a & b & c \\ d & e & f\end{bmatrix}[/tex]
to
[tex]\begin{bmatrix}1 & 0 & p \\ 0 & 1 & q\end{bmatrix}[/tex]

That is, you want the first number in the first column (more generally the numbers on the main diagonal) to be one, not zero.
 
  • #5
I am quite positive that it's what my book says (I apologize, I'm hopeless at using LaTeX to make matrices and don't quite have a grasp of it yet). Here's the process the book describes; it involves no use of the identity matrix in any way, as far as I can see.

For the equations ax+by = c, and px + qy = d

a b : c
p q : d

You create some sort of row operation. For example, if a = 3 and p = 3, then the operation would be Row 1 - Row 2, and the new augmented matrix is

0 b-q : c-d
p q : d

And the new equation determining the value of y is (b - q)y = (c - d), which is simple to solve at that point. To determine the value of x, the book then demonstrates that one need only plug the value of y into the equation px + qy = d and solve for x.

I'm aware that there is a method for using the identity matrix, but my book isolates that completely from the method for Gaussian elimination.
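The book's procedure as described above can be written out step by step. This is a sketch with hypothetical coefficients (the thread never gives concrete numbers beyond a = p = 3); it assumes p and the transformed y coefficient are nonzero.

```python
# 3x + 2y = 12  ->  row 1:  a b : c
# 3x -  y = 3   ->  row 2:  p q : d
a, b, c = 3.0, 2.0, 12.0
p, q, d = 3.0, -1.0, 3.0

# Row 1 - (a/p) * Row 2 makes the x entry of row 1 zero (here a/p = 1,
# so it is just Row 1 - Row 2, as in the book's example).
k = a / p
b2, c2 = b - k * q, c - k * d   # new row 1 reads:  0  b2 : c2

y = c2 / b2                     # solve (b - q)y = (c - d) for y
x = (d - q * y) / p             # back-substitute into px + qy = d
print(x, y)
```

For these numbers the eliminated row is 0 3 : 9, giving y = 3 and then x = 2, which satisfies both original equations.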
 
  • #6
That's a bit weird. The way I've usually seen it, you use row operations to transform the coefficient matrix into an upper triangular matrix or row-echelon form, and then solve from the bottom up.

Gauss-Jordan elimination goes a bit further and transforms the coefficient matrix into the identity matrix.
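The Gauss-Jordan variant mentioned above can be sketched as follows: after each pivot is scaled to 1, entries are cleared both below and above it, so the coefficient side of [A | b] becomes the identity matrix and the last column is the solution. This minimal illustration assumes nonzero pivots (no row swaps) for brevity; the example system is hypothetical.

```python
def gauss_jordan(aug):
    """Reduce an augmented matrix [A | b] to [I | solution]."""
    n = len(aug)
    for col in range(n):
        lead = aug[col][col]            # assumes a nonzero pivot
        aug[col] = [x / lead for x in aug[col]]
        # Clear this column in every OTHER row, above and below the pivot.
        for r in range(n):
            if r != col:
                f = aug[r][col]
                aug[r] = [x - f * p for x, p in zip(aug[r], aug[col])]
    return [row[n] for row in aug]      # the last column is the solution

# x + 2y = 5, 3x + 4y = 11  ->  x = 1, y = 2
print(gauss_jordan([[1.0, 2.0, 5.0], [3.0, 4.0, 11.0]]))
```

Compared with plain Gaussian elimination, this does more arithmetic per pivot but needs no separate back-substitution pass.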
 

Related to Solving Simultaneous Equations Using Matrices

1. How do you solve simultaneous equations using matrices?

To solve simultaneous equations using matrices, you first write the system in matrix form: collect the coefficients into a coefficient matrix and the constants into a column, giving an augmented matrix. Then you apply elementary row operations (or multiply by the inverse of the coefficient matrix) to reduce the system to a form from which the solutions can be read off.
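The inverse-matrix route mentioned above can be shown concretely for a 2x2 system. This is a hedged sketch with made-up numbers, using the standard closed-form inverse of a 2x2 matrix; it assumes the determinant is nonzero.

```python
# 2x + 3y = 8,  x - y = -1, written as A x = b.
A = [[2.0, 3.0],
     [1.0, -1.0]]      # coefficient matrix
b = [8.0, -1.0]        # constant column

# Closed-form inverse of a 2x2 matrix: (1/det) * [[d, -b], [-c, a]].
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
inv = [[ A[1][1] / det, -A[0][1] / det],
       [-A[1][0] / det,  A[0][0] / det]]

# The solution is x = A^(-1) b.
x = [inv[0][0] * b[0] + inv[0][1] * b[1],
     inv[1][0] * b[0] + inv[1][1] * b[1]]
print(x)
```

For larger systems the explicit inverse is rarely computed in practice; row reduction (as in the thread) is both cheaper and more numerically stable.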

2. What is the advantage of using matrices to solve simultaneous equations?

Using matrices to solve simultaneous equations can be more efficient and less prone to errors compared to traditional methods such as substitution or elimination. It also allows for a systematic approach to solving equations and can be applied to larger systems of equations.

3. Can matrices be used to solve any type of simultaneous equations?

Matrix methods apply directly to linear systems of simultaneous equations, with any number of variables and equations. Nonlinear systems cannot be solved by row reduction alone, though matrix techniques still appear inside the numerical methods used for them (see question 5).

4. How do you know if a system of simultaneous equations has a unique solution?

A system of simultaneous equations has a unique solution if the number of equations is equal to the number of variables and the determinant of the coefficient matrix is non-zero. If the determinant is zero, the system may have infinitely many solutions or no solutions at all.
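The determinant criterion above is easy to check directly for a 2x2 system. A minimal sketch (example matrices are hypothetical; in floating point a tolerance would be used instead of exact comparison with zero):

```python
def classify_2x2(A):
    """Apply the determinant test: nonzero det means a unique solution."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return "unique solution" if det != 0 else "none or infinitely many"

print(classify_2x2([[2.0, 3.0], [1.0, -1.0]]))  # det = -5, unique
print(classify_2x2([[1.0, 2.0], [2.0, 4.0]]))   # det = 0, rows are multiples
```

In the second example the two equations have proportional left-hand sides, so the system is either redundant (infinitely many solutions) or inconsistent (none), depending on the constants.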

5. Can matrices be used to solve nonlinear simultaneous equations?

Matrices also play a role in solving nonlinear simultaneous equations, but the process is more complex: rather than reducing the system directly, one typically linearises it and solves repeatedly, using a matrix of partial derivatives (the Jacobian). Numerical methods are then used to approximate the solutions.
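One such numerical method is Newton's method, chosen here purely as an illustration (the answer above does not name a specific method). At each step the nonlinear system is replaced by the linear system J * delta = -F, where J is the Jacobian matrix, and that linear system is solved exactly as in the rest of this thread. The example equations are hypothetical.

```python
def newton_2d(f, jac, x, y, steps=20):
    """Solve f(x, y) = (0, 0) by Newton's method for two unknowns."""
    for _ in range(steps):
        f1, f2 = f(x, y)
        a, b, c, d = jac(x, y)          # Jacobian entries [[a, b], [c, d]]
        det = a * d - b * c
        # Solve J * (dx, dy) = (-f1, -f2) via the 2x2 inverse formula.
        dx = (-f1 * d + f2 * b) / det
        dy = (f1 * c - f2 * a) / det
        x, y = x + dx, y + dy
    return x, y

# Nonlinear system: x^2 + y^2 = 5 and x*y = 2 (one solution is x = 2, y = 1).
f = lambda x, y: (x * x + y * y - 5, x * y - 2)
jac = lambda x, y: (2 * x, 2 * y, y, x)   # partial derivatives of f1 and f2
print(newton_2d(f, jac, 2.5, 0.5))
```

The starting guess matters: Newton's method converges rapidly near a solution but can diverge or find a different root from a poor initial point.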
