Something that can't be right about eigenvectors--where is my mistake?

  • Thread starter: marschmellow
  • Tags: Mistake
In summary, the thread discusses the eigenvectors and eigenvalues of the matrix obtained when a homogeneous higher-order ODE is rewritten as a first-order system. The original poster argues that each component of an eigenvector must equal the previous component times the eigenvalue, yet believes that ordinary matrices provide counterexamples. The replies point out that the property does hold for matrices derived by converting a scalar ODE to a first-order system, and only fails for general matrices; they also touch on the non-uniqueness of the A matrix and the self-consistency of the chosen boundary conditions. The conclusion is that there is no error in the logic, but setting the initial conditions so that only one constant is non-zero is artificial and does not lead to anything significant.
  • #1
marschmellow

Xs are eigenvectors, lambdas are eigenvalues, and Cs are constants of integration.

If we rearrange some homogeneous higher-order system into a matrix equation, we get the first equation on the word document. The solution to that equation is the second equation on the word document. But if we set the initial conditions such that all of the constants except C1 are equal to zero, the solution is the third equation, with the components of the eigenvector written out, indexed by m. Each component of Y is the derivative of the previous component, which implies that each component of any eigenvector equals the product of the component that it follows and its corresponding eigenvalue. But almost any particular case you could find would be a counterexample to this claim. I brought this up with my teacher and he reasoned that it couldn't be true but also couldn't find any mistake in my logic. Where did I go wrong?
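Since the equations themselves live in the attached .doc, here is a plausible reconstruction of the setup being described, assuming the usual conversion of an n-th order homogeneous ODE into a first-order system with state vector Y = (y, y', ..., y^(n-1)):

```latex
% Assumed reconstruction of the three equations referred to above.
\mathbf{Y}' = A\,\mathbf{Y},
\qquad
\mathbf{Y}(t) = \sum_{i=1}^{n} C_i\, e^{\lambda_i t}\, \mathbf{X}_i ,
\qquad
\mathbf{Y}(t) = C_1 e^{\lambda_1 t}
\begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}
\;\;\text{(all } C_i = 0 \text{ except } C_1\text{)}.
% Each component of Y is the derivative of the one before it:
Y_{m+1} = \frac{d}{dt}\, Y_m
\;\Longrightarrow\;
x_{m+1} = \lambda_1\, x_m .
```

Under that reading, the claim in question is exactly the last implication: each eigenvector component is the eigenvalue times the component before it.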
 

Attachments

  • eqns.doc
    42.5 KB
  • #2


Each component of Y is the derivative of the previous component

Only if you derived your matrix by taking a differential equation in one variable and converting it into a system of first-order equations. Try to find a counterexample that actually satisfies the requirements of such a converted matrix. A general matrix, on the other hand, will not give such an eigenvector.

*edit*

If the converted matrix is what you are talking about, remember how it was derived. You would expect the eigenvectors of the matrix to do that. However, you don't care, as you only want one component of the solution - the first one. That would be the solution to your original ODE.
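A quick numerical check of this point, as a sketch (the ODE y'' + 3y' + 2y = 0 and numpy are illustrative choices, not taken from the thread):

```python
import numpy as np

# Companion matrix for y'' + 3*y' + 2*y = 0, i.e. y'' = -2*y - 3*y',
# with state vector Y = (y, y').
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

eigvals, eigvecs = np.linalg.eig(A)
for lam, v in zip(eigvals, eigvecs.T):
    v = v / v[0]          # scale so the first component is 1
    print(lam, v)         # second component equals lam: v = (1, lam)

# A generic matrix does not have this structure:
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])
eigvals_B, eigvecs_B = np.linalg.eig(B)
for lam, v in zip(eigvals_B, eigvecs_B.T):
    v = v / v[0]
    print(lam, v)         # v[1] is generally not equal to lam
```

For the companion matrix the printed eigenvectors come out as (1, -1) and (1, -2), matching the eigenvalues -1 and -2; for the generic matrix they do not.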
 
Last edited:
  • #3


Sethric said:
Only if you derived your matrix by taking a differential equation in one variable and converting it into a system of first-order equations. Try to find a counterexample that actually satisfies the requirements of such a converted matrix. A general matrix, on the other hand, will not give such an eigenvector.

*edit*

If the converted matrix is what you are talking about, remember how it was derived. You would expect the eigenvectors of the matrix to do that. However, you don't care, as you only want one component of the solution - the first one. That would be the solution to your original ODE.


Hmm, you're right. I just tried a few second-order ODEs out, and the claim did hold. I could have sworn that I had found counterexamples from matrices derived from an ODE, but maybe I didn't; maybe they were just ordinary matrices. Well, thanks for your help!
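For reference, the general second-order case can be worked out directly (an illustrative calculation, not part of the original post): for y'' + by' + cy = 0 with state vector (y, y'),

```latex
% Worked 2x2 example: y'' + b y' + c y = 0, state vector Y = (y, y')^T.
A = \begin{pmatrix} 0 & 1 \\ -c & -b \end{pmatrix},
\qquad
(A - \lambda I)\begin{pmatrix} 1 \\ \lambda \end{pmatrix}
= \begin{pmatrix} \lambda - \lambda \\ -c - b\lambda - \lambda^2 \end{pmatrix}
= \mathbf{0}
\quad\text{whenever } \lambda^2 + b\lambda + c = 0 .
```

So every eigenvector of the converted matrix is a multiple of (1, λ): the second component really is λ times the first.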
 
  • #4


I don't think there is any error in your logic, but "setting the initial conditions such that only C_1 is non-zero" is a rather artificial thing to do.

Your "A matrix" for the DE is not unique. You are choosing one particular A matrix, and then choosing boundary conditions derived from its eigenpairs, so the boundary conditions are by definition "self consistent" with the matrix in the way you describe.

But I don't think that leads to anything very interesting, because you could pick a different A matrix with different eigenpairs.
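As an illustration of that non-uniqueness (the example is chosen here, not taken from the post), the same equation y'' = -y can be written with two different state vectors:

```latex
% Illustrative example: y'' = -y with two choices of state vector.
\text{With } \mathbf{Y} = (y,\, y')^T:\quad
A_1 = \begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix},
\qquad
\text{with } \mathbf{Y} = (y',\, y)^T:\quad
A_2 = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}.
```

Both matrices describe the same ODE, but their eigenvectors differ: for A_1 they are multiples of (1, λ), while for A_2 they are multiples of (λ, 1). So any pattern in the eigenvector components reflects the chosen representation, not the ODE itself.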
 
  • #5


There are a few possible slips in the reasoning. First, eigenvalues and eigenvectors are found in two separate steps: the eigenvalues come from the characteristic equation det(A − λI) = 0, and the eigenvectors are then obtained by solving (A − λI)X = 0. The eigenvalues may be complex or repeated, which can make the structure of the eigenvectors less obvious.

Second, the statement that "each component of any eigenvector equals the product of the component that it follows and its corresponding eigenvalue" is not true for a general matrix. It holds for companion matrices, i.e. the matrices obtained by converting a single higher-order ODE into a first-order system, because each state component is by construction the derivative of the one before it. For other matrices the components of an eigenvector need not have any such relationship.

Lastly, setting all constants except C1 to zero is not a test of a general property of eigenvectors. It simply picks out a single eigen-mode of one particular choice of A matrix, so the boundary conditions are self-consistent with that matrix by construction and say nothing about eigenvectors of matrices in general.

In conclusion, the relationship you noticed is real, but it is a consequence of how the companion matrix is built rather than a property of eigenvectors as such. It is always best to keep track of which matrix a statement applies to and to check each step of the logic against a concrete example.
 

Related to Something that can't be right about eigenvectors--where is my mistake?

1. What are eigenvectors and why are they important in mathematics?

Eigenvectors are special vectors in linear algebra that do not change direction when multiplied by a given matrix. They are important because they represent the directions along which a linear transformation acts by simply scaling the vector, and they have many applications in fields such as physics, engineering, and computer science.
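In symbols, a nonzero vector x is an eigenvector of A with eigenvalue λ when

```latex
A\mathbf{x} = \lambda\,\mathbf{x}, \qquad \mathbf{x} \neq \mathbf{0}.
```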

2. Can there be a mistake in the calculation of eigenvectors?

Yes, mistakes are common when computing eigenvectors. Typical ones include errors in setting up or solving the characteristic equation, arithmetic slips when solving (A − λI)x = 0, and forgetting that eigenvalues (and hence eigenvectors) may be complex.

3. How can I check if my calculated eigenvectors are correct?

To check a calculated eigenvector, multiply the original matrix by the eigenvector and compare the result with the eigenvector scaled by its eigenvalue. If Ax = λx (up to rounding error in a numerical computation), then the pair (λ, x) is a correct eigenpair.
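A minimal sketch of that check in numpy (the matrix here is just an example):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])           # example matrix

eigvals, eigvecs = np.linalg.eig(A)
for lam, x in zip(eigvals, eigvecs.T):
    # (lam, x) is a valid eigenpair iff A @ x equals lam * x.
    assert np.allclose(A @ x, lam * x)
```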

4. What happens if a matrix has repeated eigenvalues?

If a matrix has a repeated eigenvalue, there may be several linearly independent eigenvectors associated with it; in that case they span a subspace (the eigenspace), and any linear combination of them is again an eigenvector with the same eigenvalue. It is also possible for a repeated eigenvalue to have fewer independent eigenvectors than its multiplicity; such a matrix is called defective and cannot be diagonalized.
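Two small matrices illustrating the difference (chosen purely for illustration), both with the repeated eigenvalue λ = 1:

```latex
I = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}
\;\text{(every nonzero vector is an eigenvector)},
\qquad
J = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}
\;\text{(only multiples of } (1,\,0)^T \text{ are eigenvectors; } J \text{ is defective)}.
```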

5. Are eigenvectors unique for a given matrix?

No, eigenvectors are not unique for a given matrix. An eigenvector is only determined up to a nonzero scalar multiple, so any rescaling of an eigenvector is again an eigenvector. Different matrices can also share the same eigenvectors, and when an eigenvalue is repeated there may be many equally valid choices of basis for its eigenspace.
