A*transpose(A) for orthonormal columns?

In summary: if an n x m matrix A has orthonormal columns, then A'A = I (the m x m identity). When n = m, A is square and both A'A and AA' equal I. When n > m, AA' is an n x n matrix of rank m, so it cannot equal the n x n identity; it is instead the orthogonal projection onto the column space of A.
  • #1
FrogPad

Homework Statement

Consider an n x m matrix A with n >= m. If all columns of A are orthonormal, then A'A = I. What can you say about AA'?

Where A'A = transpose(A)*A and AA' = A*transpose(A)

Homework Equations

The Attempt at a Solution


For the case that n = m:
A is square. Since the columns of A are orthonormal (mutually orthogonal and of unit length), A is an orthogonal matrix, so A' = A^(-1).

So, A'A = I and AA' = I.

For the case that n > m:
I'm lost here...

Any hints? What should I look up in the books to understand this?
 
  • #2
One thought is...

For n > m, rank(A) = m.

So AA' = L where L is [n x n]. I claimed rank(L) = m; this can actually be proved: if AA'x = 0, then x'AA'x = ||A'x||^2 = 0, so A'x = 0. Hence AA' and A' have the same null space, which gives rank(AA') = rank(A') = rank(A) = m.

Now, if we wanted AA' = I where I is [n x n], we would need rank(AA') = rank(I) = n.

Let's try AA' = I:

-> rank(AA') = rank(I)
-> rank(L) = rank(I)
-> m = n

This contradicts n > m, so AA' cannot equal I in this case.
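The rank argument can be checked numerically. Below is a minimal sketch using NumPy; the 5x3 size and the random seed are arbitrary choices, and the orthonormal columns come from a QR decomposition of a random matrix:

```python
import numpy as np

# Build a 5x3 matrix with orthonormal columns via reduced QR (n > m).
rng = np.random.default_rng(0)
A, _ = np.linalg.qr(rng.standard_normal((5, 3)))

AtA = A.T @ A   # 3x3
AAt = A @ A.T   # 5x5

print(np.allclose(AtA, np.eye(3)))    # True: A'A = I
print(np.allclose(AAt, np.eye(5)))    # False: AA' != I when n > m
print(np.allclose(AAt @ AAt, AAt))    # True: AA' is idempotent (a projection)
print(np.linalg.matrix_rank(AAt))     # 3, i.e. rank m rather than n
```

The idempotence check hints at the full answer: AA' is the orthogonal projection onto the column space of A.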


Thanks for any help
 

Related to A*transpose(A) for orthonormal columns?

1. What is the significance of A*transpose(A) for orthonormal columns?

If the columns of an n x m matrix A are orthonormal, meaning they are mutually perpendicular and each has length 1, then transpose(A)*A equals the m x m identity matrix I. The reverse product A*transpose(A) equals I only when A is square (n = m); when n > m it is instead the orthogonal projection onto the column space of A. These properties are useful in many applications, such as solving least-squares problems and data compression.

2. How is A*transpose(A) for orthonormal columns calculated?

Form the transpose of A, denoted A^T, and multiply: A^T*A is an m x m matrix, while A*A^T is n x n. The (i, j) entry of A^T*A is the dot product of columns i and j of A, so orthonormal columns put 1 on the diagonal and 0 everywhere else, giving A^T*A = I. The same dot-product view explains why A*A^T need not be the identity: its entries are dot products of the rows of A, which are generally not orthonormal when n > m.
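The dot-product view above can be seen on a small hand-built example (the particular 3x2 matrix below is an illustrative choice, not from the original thread):

```python
import numpy as np

# A 3x2 matrix whose two columns are orthonormal by construction.
A = np.array([[1/np.sqrt(2), 0.0],
              [1/np.sqrt(2), 0.0],
              [0.0,          1.0]])

print(A.T @ A)                            # 2x2 identity: columns are orthonormal
print(np.allclose(A @ A.T, np.eye(3)))    # False: the 3x3 product is not I
```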

3. What is the difference between A*transpose(A) in general and for orthonormal columns?

The difference lies in the assumptions on A. For a general matrix, A^T*A is simply a symmetric positive semidefinite (Gram) matrix whose entries are the dot products of the columns; it need not equal I. When the columns are orthonormal, those dot products are all 0 or 1 and A^T*A collapses to the identity. Orthonormal columns are therefore a special case in which the Gram matrix takes its simplest possible form.

4. How can A*transpose(A) for orthonormal columns be used in solving systems of linear equations?

In least-squares problems, the best approximate solution of Ax = b satisfies the normal equations A^T*A x = A^T*b, obtained by multiplying both sides of Ax = b on the left by A^T. If A has orthonormal columns, then A^T*A = I and the normal equations collapse to x = A^T*b, so no matrix inversion is needed. This is one reason least-squares algorithms often first orthonormalize the columns of A, for example via QR decomposition.
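A minimal sketch of that shortcut, assuming NumPy and an arbitrary 6x2 example built with QR: the closed form x = A^T*b should agree with a general least-squares solver.

```python
import numpy as np

# Least squares when A has orthonormal columns: the normal equations
# A'A x = A'b collapse to x = A'b, since A'A = I.
rng = np.random.default_rng(1)
A, _ = np.linalg.qr(rng.standard_normal((6, 2)))   # orthonormal columns
b = rng.standard_normal(6)

x_direct = A.T @ b                                 # closed form using A'A = I
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)    # general solver

print(np.allclose(x_direct, x_lstsq))              # True
```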

5. What are some examples of applications that utilize A*transpose(A) for orthonormal columns?

Matrices with orthonormal columns appear throughout mathematics and engineering. The Gram-Schmidt process and QR decomposition produce such matrices from an arbitrary set of linearly independent columns. In signal and image processing, orthonormal transforms re-express data in a basis where it can be compressed or filtered more easily. In machine learning, principal component analysis projects data onto a small set of orthonormal directions, using A*A^T as the projection onto that subspace, to reduce dimensionality while preserving as much variance as possible.
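The PCA use case can be sketched as follows. This is a hypothetical example with random data; the matrix V of the top-k right singular vectors has orthonormal columns, so V^T*V = I, and X @ V @ V.T projects each data row onto the k-dimensional principal subspace:

```python
import numpy as np

# Hypothetical centered dataset: 10 samples, 4 features.
rng = np.random.default_rng(2)
X = rng.standard_normal((10, 4))
X = X - X.mean(axis=0)

# Top-2 principal directions via SVD; V is 4x2 with orthonormal columns.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
V = Vt[:2].T

print(np.allclose(V.T @ V, np.eye(2)))   # True: V'V = I
X_proj = X @ V @ V.T                     # projection onto the rank-2 subspace
print(np.linalg.matrix_rank(X_proj))     # 2
```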
