Transforms that Preserve The Dominant Eigenvector?

In summary, the conversation is about finding transforms that preserve the dominant eigenvector of a stochastic matrix. The original poster knows of methods that preserve the whole spectrum, but is specifically looking for methods that preserve the dominant eigenvector. The suggested approach is to build a transformation matrix B directly from the dominant eigenvector/eigenvalue pair.
  • #1
csguy
Hi,

I'm working with stochastic matrices (square matrices whose entries give the probabilities of moving between states of a Markov chain), and I am looking for transforms that preserve the dominant eigenvector (the "stationary distribution" of the chain). Specifically, I want to transform the matrix so that its antidiagonal becomes zero.

I remember studying a host of methods that would preserve the spectrum (e.g. QR method, Jacobi rotation, Householder matrices, etc.), but which methods preserve the dominant eigenvector?

Any suggestions?
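For concreteness, here is a small sketch of the "dominant eigenvector = stationary distribution" connection. The 3-state transition matrix below is made up for illustration; the stationary distribution is the dominant *left* eigenvector of a row-stochastic matrix, found here by power iteration:

```python
import numpy as np

# Hypothetical 3-state row-stochastic transition matrix (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# The stationary distribution pi satisfies pi P = pi, i.e. pi is the
# dominant left eigenvector of P (eigenvalue 1). Power iteration:
pi = np.ones(3) / 3.0
for _ in range(200):
    pi = pi @ P
pi /= pi.sum()

# Check: pi is unchanged by a step of the chain.
assert np.allclose(pi @ P, pi)
```

Any transform of P that keeps this eigenvector fixed leaves the long-run behaviour of the chain intact.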
  • #2
Suppose [itex]A \textbf{v} = \lambda \textbf{v}[/itex], where [itex](\textbf{v}, \lambda)[/itex] is the dominant eigenvector/eigenvalue pair and [itex]\textbf{v}[/itex] has components [itex]v_1, v_2, \ldots, v_n[/itex]. Then you could do something like
[tex]B = \lambda \left[\begin{matrix} 1 & 0 & 0 & \cdots & 0 \\ \frac{v_2}{v_1} & 0 & 0 & \cdots & 0 \\ \frac{v_3}{v_1} & 0 & 0 & \cdots & 0 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ \frac{v_{n-1}}{v_1} & 0 & 0 & \cdots & 0 \\ 0 & \frac{v_n}{v_2} & 0 & \cdots & 0 \end{matrix}\right][/tex]

Working out the multiplication confirms [itex]B \textbf{v} = \lambda \textbf{v}[/itex]: the first row gives [itex]\lambda v_1[/itex], row [itex]i[/itex] (for [itex]2 \le i \le n-1[/itex]) gives [itex]\lambda \frac{v_i}{v_1} v_1 = \lambda v_i[/itex], and the last row gives [itex]\lambda \frac{v_n}{v_2} v_2 = \lambda v_n[/itex].
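As a numerical sanity check, the B above can be built and tested with NumPy. An arbitrary positive matrix serves as a stand-in here, since the construction only needs a known dominant eigenpair:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
A = rng.random((n, n))  # positive matrix, so the dominant eigenpair is real

# Dominant eigenpair of A (the Perron eigenpair for a positive matrix).
w, V = np.linalg.eig(A)
k = np.argmax(np.abs(w))
lam, v = w[k].real, V[:, k].real

# Build B as in the post: the first column carries v_i / v_1 for
# rows 1..n-1, and the last row puts v_n / v_2 in column 2.
B = np.zeros((n, n))
B[0, 0] = 1.0
for i in range(1, n - 1):
    B[i, 0] = v[i] / v[0]
B[n - 1, 1] = v[n - 1] / v[1]
B *= lam

# B shares the eigenpair (lam, v) with A.
assert np.allclose(B @ v, lam * v)
```

Note that B is far from stochastic; it only demonstrates that many matrices share one prescribed eigenpair.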
 

Related to Transforms that Preserve The Dominant Eigenvector?

1. What are transforms that preserve the dominant eigenvector?

Transforms that preserve the dominant eigenvector are linear transformations that do not change the direction of the dominant eigenvector of a matrix. This means that the dominant eigenvector remains the same before and after the transformation.

2. Why is preserving the dominant eigenvector important?

Preserving the dominant eigenvector is important because it often encodes the quantity of interest: for a stochastic matrix, the dominant eigenvector is the stationary distribution of the Markov chain. A transform that preserves it lets us restructure the matrix without changing that distribution.

3. What are some examples of transforms that preserve the dominant eigenvector?

Examples include multiplying the matrix by a nonzero scalar, taking powers of the matrix, and, more generally, applying any polynomial to the matrix. These operations change the eigenvalues but leave every eigenvector, including the dominant one, unchanged. By contrast, a similarity transform by a rotation or reflection generally rotates the eigenvectors as well.

4. How can we determine if a transformation preserves the dominant eigenvector?

To determine if a transformation preserves the dominant eigenvector, we can calculate the eigenvalues and eigenvectors of the original matrix and of the transformed matrix. If the dominant eigenvector remains the same (up to scaling), then the transformation preserves it; the dominant eigenvalue itself may change.
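A sketch of that check, using a polynomial in a made-up positive matrix A as the transformation (polynomials in A share A's eigenvectors, so the dominant one survives):

```python
import numpy as np

def dominant_eigvec(M):
    """Unit-norm dominant eigenvector of M, sign-normalized."""
    w, V = np.linalg.eig(M)
    v = V[:, np.argmax(np.abs(w))].real
    v /= np.linalg.norm(v)
    return v if v[0] >= 0 else -v

rng = np.random.default_rng(1)
A = rng.random((4, 4))  # positive matrix: real, simple dominant eigenpair

# A polynomial in A, e.g. T = A^2 + 2A + I, has the same eigenvectors
# as A, so the dominant eigenvector is preserved (the eigenvalues change).
T = A @ A + 2 * A + np.eye(4)

assert np.allclose(dominant_eigvec(A), dominant_eigvec(T), atol=1e-6)
```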

5. Can a transformation change the dominant eigenvector?

Yes. Most transformations, including most linear ones, change the dominant eigenvector. For example, a generic similarity transform [itex]S^{-1} A S[/itex] maps the eigenvector [itex]\textbf{v}[/itex] to [itex]S^{-1} \textbf{v}[/itex]. Only special families, such as scalar multiples, powers, and polynomials of the matrix, are guaranteed to leave the dominant eigenvector unchanged.
