What is the intuition behind using eigenvectors of AA^T and A^T A in the SVD?

In summary, the SVD uses the eigenvectors of A^T A as the input basis and the eigenvectors of AA^T as the output basis for the matrix. The underlying reason is that A^T acts "like" an inverse of A: it is the adjoint with respect to the inner products defined on the two vector spaces, and those inner products are what determine perpendicularity. This makes it possible to identify the orthogonal complement of AU, the image of U under A, where A maps U into a subspace of V.
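As a quick numerical sanity check of this summary (an illustrative NumPy sketch, not part of the original thread), the eigenvectors of A^T A do reproduce the V factor of the SVD, and its eigenvalues are the squared singular values:

```python
import numpy as np

# A hypothetical random 4x3 matrix as a concrete example.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# SVD: A = U S V^T. Columns of V form the input basis,
# columns of U the output basis.
U, s, Vt = np.linalg.svd(A)

# Eigendecomposition of A^T A (eigh returns ascending eigenvalues).
w, vecs = np.linalg.eigh(A.T @ A)

# Eigenvalues of A^T A are the squared singular values...
print(np.allclose(w[::-1], s**2))

# ...and its eigenvectors match the columns of V up to sign.
print(np.allclose(np.abs(vecs[:, ::-1]), np.abs(Vt.T)))
```

The same check with AA^T recovers the U factor.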
  • #1
daviddoria
In the SVD, we use the eigenvectors of AA^T and A^T A as the input and output bases for the matrix. Does anyone have any intuition about these matrices? I.e., if I multiply a vector x by AA^T, what space (in terms of the column space, etc., of A) does it bring x to?

Thanks,

Dave
 
  • #2
If A: U -> V, that is, if A maps vector space U to vector space V, then A^T: V -> U, so it acts "like" an inverse. The defining condition is <Au, v>_V = <u, A^T v>_U. Note that Au and v are in vector space V, while u and A^T v are in vector space U; that is why I put the "V" and "U" subscripts on the inner products, to indicate which space each inner product is defined in. Of course, an important application of the inner product is to define "orthogonal" (perpendicular) vectors. If A maps U into a subspace of V, then every vector v in the orthogonal complement of that subspace has the property that <Au, v> = 0 for all u. Since <Au, v> = 0 = <u, A^T v>, and u can be any member of U, it follows that A^T v = 0 for every v in the orthogonal complement of AU.
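This last step can be seen numerically (a small NumPy sketch with a made-up 3x2 matrix): a vector orthogonal to the image of A is sent to zero by A^T.

```python
import numpy as np

# Hypothetical 3x2 example: A maps U = R^2 into V = R^3.
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [1.0, 0.0]])

# A vector orthogonal to both columns of A, i.e. in the
# orthogonal complement of the image A(U) inside V.
v = np.cross(A[:, 0], A[:, 1])

# <Au, v> = <u, A^T v> for all u, and <Au, v> = 0 here,
# so A^T must send v to the zero vector of U.
print(np.allclose(A.T @ v, 0))
```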
 

Related to What is the intuition behind using eigenvectors of AA^T and A^T A in the SVD?

What is the meaning of "A A^T"?

"A A^T" refers to the product of a matrix A and its transpose A^T. This operation is also known as the inner product or dot product of A with itself.

How is "A A^T" calculated?

To calculate "A A^T", each row of A is dotted with each row of A (equivalently, the rows of A are multiplied against the columns of A^T, which are the rows of A). The result is a square matrix whose size equals the number of rows of A, regardless of how many columns A has.
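A tiny worked example (illustrative values only) showing both the entry-by-entry dot-product structure and the square shape:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])  # a 2x3 matrix

G = A @ A.T  # 2x2: square, sized by the row count of A

# Entry (i, j) is the dot product of row i with row j of A.
print(G[0, 1] == np.dot(A[0], A[1]))
print(G.shape)
```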

What is the significance of "A A^T" in linear algebra?

"A A^T" is important in linear algebra because it can help determine the properties of matrix A. For example, if A A^T is a diagonal matrix, then A is an orthogonal matrix and its columns are orthogonal to each other.

How is "A A^T" related to the concept of symmetry?

"A A^T" is related to symmetry because it is a symmetric matrix. This means that the entries above and below the main diagonal are reflections of each other. In other words, the transpose of "A A^T" is equal to "A A^T" itself.

Can "A A^T" be used to solve systems of linear equations?

Yes, "A A^T" can be used to solve systems of linear equations. By taking the inverse of "A A^T" and multiplying it with the right hand side of the equation, the original matrix A can be obtained. This can then be used to solve for the variables in the system of equations.
