Do Row Operations Maintain Equivalence in Matrix Subtraction?

  • Thread starter MathewsMD
In summary, the question is whether matrix C can still be obtained by computing A - B and then performing row operations, and whether the row space is distorted. The response is that elementary row operations may be performed in any order and always yield a matrix row equivalent to the original, so the row space is unchanged. For the specific question of whether (A' + B)' = (A + B)', however, the claim is not true in general, and small examples can be used to test it.
  • #1
MathewsMD
If you have two arbitrary matrices, A and B, I was wondering if row operations can be performed in any order to produce the same results.

For example, you perform elementary row operations on A to produce A', then compute A' - B, then perform further elementary row operations on that result to produce a new matrix C.

Can matrix C still be obtained by computing A - B (using A, not A') and then performing row operations? By performing row operations, the matrix still remains intact, correct? There's no distortion of the row space, right?
 
  • #3
MathewsMD said:
If you have two arbitrary matrices, A and B, I was wondering if row operations can be performed in any order to produce the same results.
Let's limit the discussion to a single matrix A. You can do elementary row operations in any order. At each step along the way, the new matrix will be equivalent to the one you started with.
MathewsMD said:
For example, you perform elementary row operations on A to produce A', then compute A' - B
Why? When you perform an elementary row operation on A, you get a new matrix A' that is equivalent to A, but not equal to it. Some of the entries in the new matrix are different from those in A.

The basic idea is that if Ax = 0, for instance, then A'x = 0 as well, even though A ##\neq## A'. Subtracting one matrix from another is not an elementary row operation, so I don't get the point of your question.
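A quick numerical illustration of this point (a minimal sketch, assuming Python with sympy; the matrices here are arbitrary examples): one elementary row operation changes the entries of A, yet the new matrix is row equivalent to A, so the same x solves both Ax = 0 and A'x = 0.

```python
# Sketch (assumes Python with sympy): one elementary row operation changes A's
# entries, but A' stays row equivalent to A: the same x solves Ax = 0 and
# A'x = 0, and both matrices reduce to the same RREF.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])

# Elementary row operation R2 -> R2 - 2*R1, encoded by the elementary matrix E.
E = Matrix([[ 1, 0, 0],
            [-2, 1, 0],
            [ 0, 0, 1]])
A_prime = E * A

x = Matrix([-1, -1, 1])                      # a vector with A*x = 0
print(A_prime == A)                          # False: the entries changed
print(A * x, A_prime * x)                    # both products are the zero vector
print(A.rref()[0] == A_prime.rref()[0])      # True: same reduced row echelon form
```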
MathewsMD said:
, then perform further elementary row operations on that result to produce a new matrix C.

Can matrix C still be obtained by computing A - B (using A, not A') and then performing row operations? By performing row operations, the matrix still remains intact, correct? There's no distortion of the row space, right?
 
  • #4
Mark44 said:
Let's limit the discussion to a single matrix A. You can do elementary row operations in any order. At each step along the way, the new matrix will be equivalent to the one you started with.
Why? When you perform an elementary row operation on A, you get a new matrix A' that is equivalent to A, but not equal to it. Some of the entries in the new matrix are different from those in A.

The basic idea is that if Ax = 0, for instance, then A'x = 0 as well, even though A ##\neq## A'. Subtracting one matrix from another is not an elementary row operation, so I don't get the point of your question.

Okay, thank you for the response. I was trying to get at this question specifically: if A' is a reduced form of A, is (A' + B)' = (A + B)', where (A + B)' is some reduction of A + B? Does this equation hold true? Note: the reductions on each matrix may involve completely different operations; there's just some sequence of row operations being done.
 
  • #5
MathewsMD said:
Okay, thank you for the response. I was trying to get at this question specifically: if A' is a reduced form of A, is (A' + B)' = (A + B)', where (A + B)' is some reduction of A + B? Does this equation hold true? Note: the reductions on each matrix may involve completely different operations; there's just some sequence of row operations being done.
Why don't you try a couple of simple examples - say 2 x 2 matrices or 3 x 3 matrices?

What you wrote is, I think, garbled.
(A' + B)' = (A + B)'
Did you mean (A' + B') = (A + B)'?

Also, by "=" do you mean "is row equivalent to" or "equals"? A professor I had in a 400-level linear algebra class was always very careful to write ##\equiv## when he was doing row operations, a habit that I've followed ever since.
 
  • #6
Mark44 said:
Why don't you try a couple of simple examples - say 2 x 2 matrices or 3 x 3 matrices?

What you wrote is, I think, garbled.

Did you mean (A' + B') = (A + B)'?

Also, by "=" do you mean "is row equivalent to" or "equals"? A professor I had in a 400-level linear algebra class was always very careful to write ##\equiv## when he was doing row operations, a habit that I've followed ever since.

Sorry, I meant "equals." By " ' " I am just referring to an arbitrary sequence of elementary row operations (I am not sure if there is better notation), and the ' used for one matrix isn't necessarily the same sequence of row operations done on the other matrices.

The only case I can find where it is definitely not true is when A = B, since then A - B = 0, and no row operations can be used to derive A' - B. I can't seem to prove it for the nontrivial case, though it does seem to hold.
 
  • #7
MathewsMD said:
The only case I can find where it is definitely not true is when A = B, since then A - B = 0
Did you try it with a few specific examples, as I suggested? I don't think the statement is true at all.
MathewsMD said:
, and no row operations can be used to derive A' - B. I can't seem to prove it for the nontrivial case, though it does seem to hold.
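A concrete check of this (a minimal sketch, assuming Python with sympy; the matrices are chosen only for illustration): if the row operation on A is R1 -> -R1, the sums A + B and A' + B can have different ranks, so no sequence of row operations carries one to the other.

```python
# Sketch (assumes Python with sympy): a 2 x 2 example where A' + B and A + B
# are NOT row equivalent, because the row operation changes the rank of the sum.
from sympy import Matrix

A = Matrix([[1, 0],
            [0, 0]])
B = Matrix([[1, 0],
            [0, 1]])

# Elementary row operation on A: R1 -> -1 * R1.
E = Matrix([[-1, 0],
            [ 0, 1]])
A_prime = E * A                        # [[-1, 0], [0, 0]]

S1 = A + B                             # [[2, 0], [0, 1]], rank 2
S2 = A_prime + B                       # [[0, 0], [0, 1]], rank 1
print(S1.rank(), S2.rank())            # 2 1
print(S1.rref()[0] == S2.rref()[0])    # False: no row operations turn one into the other
```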
 

Related to Do Row Operations Maintain Equivalence in Matrix Subtraction?

1. What is the purpose of performing row reductions?

The purpose of performing row reductions is to simplify a system of linear equations and solve for the unknown variables. This process involves manipulating the rows of a matrix to reduce it to a simpler form, making it easier to solve for the variables.

2. How do row reductions help in solving systems of equations?

Row reductions help in solving systems of equations by reducing a complex system into a simpler form that is easier to solve. This allows us to find the solution to the system of equations in a more efficient manner.

3. What are the basic steps involved in performing row reductions?

The basic steps involved in performing row reductions are (a short code sketch follows the list):

  1. Identify the leading entry (first non-zero number) in each row.
  2. Use row operations to create zeros in the columns below the leading entries.
  3. Perform row swaps (if necessary) to get the leading entries in the correct order.
  4. Continue the process until the matrix is in its reduced row echelon form.
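A minimal sketch of these steps in code (assuming plain Python; the rref helper and the example system are purely illustrative):

```python
# Sketch (plain Python): the four steps above, applied to an augmented matrix.
# Fraction keeps the arithmetic exact.
from fractions import Fraction

def rref(rows):
    """Reduce a matrix (list of row lists) to reduced row echelon form."""
    M = [[Fraction(x) for x in row] for row in rows]
    n_rows, n_cols = len(M), len(M[0])
    pivot_row = 0
    for col in range(n_cols):
        # Steps 1 and 3: find a row at or below pivot_row with a nonzero entry
        # in this column and swap it into the pivot position.
        nonzero = next((r for r in range(pivot_row, n_rows) if M[r][col] != 0), None)
        if nonzero is None:
            continue
        M[pivot_row], M[nonzero] = M[nonzero], M[pivot_row]
        # Scale the pivot row so its leading entry is 1.
        lead = M[pivot_row][col]
        M[pivot_row] = [entry / lead for entry in M[pivot_row]]
        # Step 2: create zeros in this column in every other row.
        for r in range(n_rows):
            if r != pivot_row and M[r][col] != 0:
                factor = M[r][col]
                M[r] = [a - factor * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
        if pivot_row == n_rows:
            break
    return M  # Step 4: at this point the matrix is in reduced row echelon form

# Augmented matrix of the system x + 2y = 5, 3x + 4y = 6.
# The entries come back as Fractions; the solution read off the RREF is x = -4, y = 9/2.
print(rref([[1, 2, 5], [3, 4, 6]]))
```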

4. Can row reductions be used to solve any system of equations?

Yes, row reduction can be applied to any system of linear equations. If the system has a unique solution, row reduction will find it; otherwise, the reduced form reveals whether the system has no solution or infinitely many solutions.

5. Are there any limitations to using row reductions to solve systems of equations?

While row reduction is a powerful tool for solving systems of equations, it has limitations. It can become computationally intensive for large systems, and it applies only to linear systems: it does not handle non-linear equations, such as those with variables raised to a power higher than one.
