What makes Eigenvalues and Eigenvectors important and how were they developed?

In summary, linear algebra can be a challenging math class, but approaching it from a geometric perspective is key to truly understanding the concepts. Eigenvectors and eigenvalues are crucial to understanding linear transformations and their applications, such as finding symmetries and simplifying calculations. They are also fundamental in physics, making linear algebra a highly relevant and useful subject to study. Some recommended books for gaining a deeper understanding are Anton's "Elementary Linear Algebra," Axler's "Linear Algebra Done Right," and Strang's "Introduction to Linear Algebra."
  • #1
dduardo
I'm currently taking linear algebra and it has to be the worst math class EVER. It is extremely easy, but I find the lack of application discouraging. I really want to understand how the concepts arose, not simply memorize an algorithm for grinding through mindless, tedious operations. My professor is unhelpful and brushes off any discussion in class, citing a lack of time to cover all the material. He also assumes too much, leaving little room for proofs.

I would appreciate it if anyone would be kind enough to post about the importance of eigenvalues and eigenvectors, how they were developed, and possible applications for their use.

Any input is welcome. :smile:
 
  • #2
Well, the first step in understanding anything in linear algebra is to think about it geometrically, in my opinion. Instead of matrices, think about linear transformations of vectors. i.e., think about geometric operations on vectors: think in terms of arrows rotating, stretching, shearing, etc. If you choose a basis (a set of axes), then you can write down the matrix components of the transformation in this basis; you can get infinitely many matrices (related by similarity transformations) that are all different ways of representing the same geometric transformation.

Then, consider some specific transformations, and look for vectors that are unchanged under the transformation: we are looking for symmetry. For instance, if you have a 3D rotation about an axis, any vector pointing along that axis will be unchanged under that rotation; that axis is a symmetry of the rotation. This is an example of an eigenvector.
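To make that concrete, here is a quick numerical check of the rotation example (my own sketch in Python/NumPy, not part of the original post):

[code]
import numpy as np

# Rotation by 30 degrees about the z-axis.
theta = np.radians(30)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])

v = np.array([0.0, 0.0, 1.0])  # a vector along the rotation axis
print(R @ v)                   # [0. 0. 1.] -- unchanged: an eigenvector with eigenvalue 1
[/code]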

We can relax our definition a little: let's look for vectors that are not completely unchanged by a transformation, but just have their direction unchanged. (Well, we'll count reversing direction as leaving the direction unchanged; the vector still points along the same line.) For example, suppose you have a transformation that stretches vectors in one direction and squashes them in another. Then those two directions are eigenvectors of the transformation: e.g., a vector pointing purely in the "stretch" direction gets stretched that way; the squashing acts along the other direction, but the vector has no component in that direction, so the squashing doesn't do anything to it. It remains pointing in the stretched direction. The eigenvalues of those eigenvectors are the amounts of stretching and squashing, respectively.

Now, if you looked at this transformation from the perspective of a matrix in a basis that didn't point along these directions, this nice geometric property would be obscured. But if you choose a basis consisting of eigenvectors of the transformation, then the matrix becomes simple: it's just diagonal, with the diagonal components being the eigenvalues!

(That's because a column of a matrix represents how the transformation acts on one of the basis vectors; if you choose your basis axis to be the eigenvectors, then the eigenvectors will have only one nonzero component, and since they remain in the same direction after the transformation, the transformed basis vectors will also have one nonzero component.)

Example: suppose I choose a linear transformation T with T(u) = 5u and T(v) = 3v, so that T(u+v) = 5u + 3v, where u and v are orthogonal. This transformation stretches vectors parallel to u by a factor of 5, and it stretches vectors parallel to v by a factor of 3. In this basis, u has components [1 0]T because it is written as u = 1·u + 0·v, and similarly v has components [0 1]T. After the transformation, u' = T(u) = 5u = [5 0]T and v' = T(v) = 3v = [0 3]T. So in the {u,v} basis, the transformation T is represented by a diagonal matrix with components,

[5 0]
[0 3]

Not all matrices have a nice real set of eigenvalues, so they can't all be diagonalized like this. But when they can be, it makes the matrix easy to deal with. The eigenvectors tell you which basis will give you the easy diagonal matrix. Even when a matrix can't be diagonalized, eigenvectors are a good way to find out in which directions the transformation is "simple", and that can simplify calculations.

Example: if you want to compute a matrix power A^n, you could multiply the matrix by itself over and over. Or you could find the eigenbasis that makes it diagonal (if such a basis exists): raising that matrix to a power is trivial, since you're just raising the diagonal elements to a power. Then, if you want, you can change the answer back to whatever basis you were originally using.
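A sketch of that shortcut in Python/NumPy (the matrix here is an invented example, not anything from the thread):

[code]
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])  # a made-up diagonalizable matrix (eigenvalues 5 and 2)

vals, P = np.linalg.eig(A)  # columns of P are the eigenvectors
n = 10

# Raise only the diagonal (the eigenvalues) to the power, then change basis back.
A_n = P @ np.diag(vals**n) @ np.linalg.inv(P)
print(np.allclose(A_n, np.linalg.matrix_power(A, n)))  # True
[/code]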

Or, in physics: angular momentum L and angular velocity ω are related by the inertia tensor (matrix) I, via L = Iω. The eigenvectors of the inertia tensor are the "principal axes" of a body: if you rotate the body about one of those axes, then the angular momentum of the body will point in the same direction as the axis of rotation. So you can decompose a general rotation into rotations about those axes and analyze them separately: they "decouple" from each other (as in the stretching/squashing example, where the stretching in one direction and the squashing in another are independent of each other). You do the same "decoupling" in other applications to find normal modes of oscillation and other characteristic resonant behavior of a body.
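Here is a sketch of the principal-axes idea (the inertia tensor below is invented purely for illustration):

[code]
import numpy as np

# A symmetric inertia tensor (made-up values).
I = np.array([[ 6.0, -2.0, 0.0],
              [-2.0,  5.0, 0.0],
              [ 0.0,  0.0, 3.0]])

moments, axes = np.linalg.eigh(I)  # eigh is for symmetric matrices
print(moments)                     # the principal moments of inertia (eigenvalues)
# Each column of axes is a principal axis (an eigenvector).

# Spinning about a principal axis, L = I @ omega is parallel to omega:
omega = axes[:, 0]
print(np.allclose(I @ omega, moments[0] * omega))  # True
[/code]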

I'll also throw in a few recommendations for my favorite linear algebra books:

Anton, Elementary Linear Algebra
Axler, Linear Algebra Done Right
Strang, Introduction to Linear Algebra

If matrix algebra seems like an unmotivated bunch of meaningless, mindless manipulations on a pile of numbers, these books will help. The secret, as I said, is to look at the geometry of linear transformations acting on abstract vector spaces, not at matrix gymnastics.

By the way, linear algebra lies behind everything in physics: it's one of the most underrated math courses. Quantum mechanics is just linear algebra on infinite-dimensional vectors. Special relativity involves linear algebra on 4D spacetime vectors (Lorentz transformations are analogous to rotations). General relativity involves lots of tensors, which are generalizations of matrices that act on more than one vector at a time (multilinear transformations). Mechanics has inertia tensors, elasticity tensors, normal modes of wave equations ... electromagnetism has field tensors ... and so on. I had a particularly strong linear algebra background; my linear algebra course has served me better than any other single course that I've ever taken.
 
  • #3
Wow, great summary, Ambitwistor. Not to mention very clear! You explain things better than Strang himself. I use his book ("Introduction to Linear Algebra") in class, and I've listened to his lectures on MIT OpenCourseWare, but your explanations are much better. Can you teach my class, Ambitwistor? :wink:

I had no idea that a basis was a set of axes. Now it makes sense that the number of vectors in the basis defines the dimension of the vector space.
 
  • #4
I'll add a different major application: differential equations.

The basic theory of linear differential equations IS linear algebra: the set of all solutions to a linear homogeneous differential equation forms a vector space. And 90% of solving non-linear differential equations consists of reducing them to linear equations!
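To give the flavor: for a constant-coefficient linear system x' = Ax, the eigenvectors decouple the equations, and each eigendirection simply evolves as e^(λt). A rough sketch with an invented matrix:

[code]
import numpy as np

A = np.array([[ 0.0,  1.0],
              [-2.0, -3.0]])  # a made-up system; eigenvalues -1 and -2

vals, P = np.linalg.eig(A)
x0 = np.array([1.0, 0.0])     # initial condition

def x(t):
    # x(t) = P exp(D t) P^{-1} x0, where D is the diagonal eigenvalue matrix
    return (P @ np.diag(np.exp(vals * t)) @ np.linalg.inv(P) @ x0).real

print(x(0.0))  # [1. 0.] -- recovers the initial condition
[/code]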

Linear algebra really is the theory of "linear problems".
 
  • #5
newbie here

what's the significance of eigenvectors in terms of describing oscillation?
 
  • #6
Originally posted by robinyau
what's the significance of eigenvectors in terms of describing oscillation?

Eigenvectors in a linear mechanical system describe normal modes of oscillation, which are resonant modes that "decouple" from each other (oscillate independently of each other); you can describe the general motion as a superposition of these modes. See, for instance,

http://othello.mech.northwestern.edu/ea3/book/modes1/modes.html
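As a sketch of the idea (a standard two-masses-and-three-springs setup, not taken from that page):

[code]
import numpy as np

# Two equal masses coupled by three identical springs: x'' = -K x.
k, m = 1.0, 1.0
K = (k / m) * np.array([[ 2.0, -1.0],
                        [-1.0,  2.0]])

w2, modes = np.linalg.eigh(K)  # eigenvalues are the squared mode frequencies
print(np.sqrt(w2))             # frequencies 1.0 and sqrt(3)
print(modes)                   # columns: in-phase [1,1] and out-of-phase [1,-1] modes
[/code]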
 
  • #7
Hi, I'm a newbie too, but speaking of eigenvectors, what's the importance of normalizing eigenvectors?

Dan
 
  • #8
Originally posted by Dan_potato
Hi, I'm a newbie too, but speaking of eigenvectors, what's the importance of normalizing eigenvectors?

It's just convenient. An eigenvector isn't uniquely defined, since you can multiply any eigenvector by any number and get another eigenvector. All the eigenvectors obtained that way correspond to the same eigenvalue (as long as you don't multiply by zero). People like to single out one of them as "representative" of the bunch, and since they differ only in their lengths (up to a sign), the simplest way to do that is to pick the one with unit length.
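A quick NumPy illustration of both points, with a made-up matrix:

[code]
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([0.0, 5.0])  # an eigenvector of A with eigenvalue 3

print(np.allclose(A @ v, 3 * v))              # True
print(np.allclose(A @ (7 * v), 3 * (7 * v)))  # True: any nonzero multiple works too
print(v / np.linalg.norm(v))                  # the unit-length representative: [0. 1.]
[/code]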
 
  • #9
Hi, I am a newbie as well, and I was also very confused by normalising.
OK, so now I understand the importance of normalising eigenvectors, but how do you actually do this operation?
For example, given the matrix

3 0 0
5 4 0
3 6 1

I calculate the eigenvalues to be 3, 4 and 1.
Using eigenvalue 3,
I get an eigenvector of

k
-13.5k
-5k

So how do you normalise this? Any help would be greatly appreciated.
 
  • #10
Originally posted by bracey
I get an eigenvector of

k
-13.5k
-5k

Are you sure? I get an eigenvector of,

[tex]
\begin{pmatrix}1 \\ -5 \\ -13.5 \end{pmatrix}
[/tex]


So how do you normalise this?

The same way you normalize any vector: divide it by its magnitude.
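For the corrected eigenvector above, a quick NumPy check looks like this:

[code]
import numpy as np

v = np.array([1.0, -5.0, -13.5])
print(v / np.linalg.norm(v))  # roughly [0.069, -0.346, -0.935]
[/code]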
 
  • #11
Brilliant, cheers. You were right; I typed the eigenvector in wrong.

So if I've got this right, then the normalised eigenvector should be

0.069
-0.346
-0.935
 
  • #12
Yes, that's right.
 
  • #13
thanks for the help Ambitwistor, much appreciated
 
  • #14
This is probably going to make me sound really stupid, but as this is my first post, go easy on me!

I'm a bit stuck on working out eigenvectors for 3 by 3 matrices; I get confused when trying to get the relationships between the three values. Hope that makes sense!
 
  • #15
I'm not sure what you mean by "the three values". The three components of one of the eigenvectors? The three eigenvalues? Are you able to find the eigenvalues?
 
  • #16
Yeah, sorry, it's not that clear, is it? I meant the three terms that make up the eigenvector; I can get the eigenvalues all right. Take, for example, the matrix that bracey posted: how did you get the
1
-5
-13.5

I can get the equations out of the matrix OK, but then I get stuck.
 
  • #17
Well, if the eigenvector is v = [x,y,z]T, then applying bracey's matrix to it yields a vector with components [3x,5x+4y,3x+6y+z]T. That has to equal 3 (the eigenvalue) times v, so

3x = 3x
5x+4y = 3y
3x+6y+z = 3z

The first equation is satisfied by any x, so we are free to set x=1. The second then gives 5+4y=3y, or y=-5. The third gives 3-30+z = 3z, or 2z = -27, or z = -13.5.
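If it helps, here's a NumPy cross-check of that hand calculation:

[code]
import numpy as np

A = np.array([[3.0, 0.0, 0.0],
              [5.0, 4.0, 0.0],
              [3.0, 6.0, 1.0]])
v = np.array([1.0, -5.0, -13.5])

print(np.allclose(A @ v, 3 * v))  # True: v is an eigenvector with eigenvalue 3
[/code]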
 
  • #18
OK, let me see if I have got this right. Take the same matrix again, but this time with an eigenvalue of 4.

So you get [3x, 5x+4y, 3x+6y+z] again, but this time it has to equal 4 times the vector:

3x = 4x
5x+4y = 4y
3x+6y+z = 4z

So the first gives x=1.3, the second gives y=0, and the third gives z=1.3 again.

Have I got that right, or have I just made an idiot of myself? If I have, then I blame the hard day I have had!
 
  • #19

3x = 4x
5x+4y = 4y
3x+6y+z = 4z

So the first gives x=1.3, the second gives y=0, and the third gives z=1.3 again.

The only solution to the equation 3x = 4x is x=0. That makes the second equation 4y = 4y, which holds for any y, so take y=1. The third equation becomes 6+z=4z, or 3z=6, or z=2.
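Same kind of cross-check, and np.linalg.eig will hand you all three eigenpairs at once:

[code]
import numpy as np

A = np.array([[3.0, 0.0, 0.0],
              [5.0, 4.0, 0.0],
              [3.0, 6.0, 1.0]])
v = np.array([0.0, 1.0, 2.0])
print(np.allclose(A @ v, 4 * v))  # True

vals, vecs = np.linalg.eig(A)
print(vals)  # 3, 4 and 1 (possibly in a different order);
# each column of vecs is the matching unit eigenvector
[/code]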
 
  • #20
Oh yeah, sorry about that, I didn't think it looked right! Like I said, it's been a long, hard day!
Thank you for the help; I will try a few more to make sure I've got it.
 
  • #21
We've just been doing eigenvectors etc. and I'm grasping the concepts pretty much OK; it's just that my main problem half the time is working out the eigenvalues, which isn't too good since that's usually how you start the question. Take the following matrix A:

-3 7 -5
2 4 3
1 2 2

I just have no idea how to get them through the normal way of setting the determinant of A - λI equal to zero. Any help would be much appreciated!

My next problem is that when I'm looking at the eigenspace and then solving equations like x+y+z=0 to work out what x, y and z need to be to form an eigenvector, what happens if all three equations are the same? Here's what I have:

6x-2y-4z
3x-y-2z
6x-2y-4z

all equal to zero. Do I take x, y and z as being zero, or what? So confused!
 
  • #22
Take your matrix
-3 7 -5
2 4 3
1 2 2

subtract λ from each of the numbers on the main diagonal. That is the same as subtracting λI from your matrix.

Now you have
(-3-λ) 7 -5
2 (4-λ) 3
1 2 (2-λ)

Now go through the steps for computing the determinant of this matrix, keeping the lambda factors in parentheses. You will get an expression with the lambda factors in it. Multiply the factors out and collect terms in powers of lambda. You now have a polynomial in lambda. Set it equal to zero and solve the equation.
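If you want to check a hand computation numerically, NumPy can produce the characteristic polynomial's coefficients directly (a sketch using the matrix above):

[code]
import numpy as np

A = np.array([[-3.0, 7.0, -5.0],
              [ 2.0, 4.0,  3.0],
              [ 1.0, 2.0,  2.0]])

coeffs = np.poly(A)      # coefficients of det(lambda*I - A)
print(coeffs)            # [1. -3. -25. 13.] -> lambda^3 - 3 lambda^2 - 25 lambda + 13
print(np.roots(coeffs))  # the eigenvalues -- not nice round numbers for this matrix
[/code]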
 
  • #23
Thanks for your help, but as it turned out (after asking one of the pure maths tutors today), the lecturer must have been on the booze when he wrote out the examples, because most of them can't actually be solved (well, not at the level we're at, anyway). Typical.
 
  • #24
I took linear algebra about 20 years ago. It is comforting to know that nothing has changed. My instructor also stripped the subject of any conceivable applications. The class was boring drudgery until the entire class fell asleep; then the one interesting nugget of theory was slipped in unnoticed.

If I ever had to teach it, I would dress in a different outlandish costume each day - lederhosen one day, matador outfit the next, then perhaps nothing but a loin cloth. That might keep the students awake.

Oddly enough, it is very important to know if you go into physics or any field with extensive mathematical modelling.

Njorl
 

1. What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are concepts in linear algebra used to understand the behavior of linear transformations. An eigenvalue is the factor by which a vector is scaled when it is transformed by a linear transformation. Eigenvectors are the vectors that do not change direction when transformed, but are only scaled by the corresponding eigenvalue.

2. How are eigenvalues and eigenvectors calculated?

Eigenvalues and eigenvectors can be calculated by solving the characteristic equation, the polynomial equation formed by setting the determinant of A − λI equal to 0, where A is the matrix of the linear transformation. The eigenvalues are the solutions of this equation, and the corresponding eigenvectors can be found by substituting each eigenvalue into (A − λI)v = 0 and solving for v.

3. What is the significance of eigenvalues and eigenvectors in data analysis?

Eigenvalues and eigenvectors are commonly used in data analysis and machine learning algorithms because they can help reduce the dimensionality of a dataset. By finding the eigenvectors of the covariance matrix of a dataset, the data can be projected onto a lower-dimensional space without losing much information, making it easier to analyze and visualize.
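A minimal PCA sketch (with randomly generated data, purely for illustration):

[code]
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))  # 200 samples, 3 features
X = X - X.mean(axis=0)         # center the data

cov = np.cov(X, rowvar=False)     # 3x3 covariance matrix
vals, vecs = np.linalg.eigh(cov)  # covariance matrices are symmetric

top2 = vecs[:, np.argsort(vals)[::-1][:2]]  # eigenvectors of the 2 largest eigenvalues
X_reduced = X @ top2                        # project onto 2 dimensions
print(X_reduced.shape)                      # (200, 2)
[/code]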

4. Can a matrix have multiple eigenvalues and eigenvectors?

Yes, a matrix can have multiple eigenvalues and corresponding eigenvectors. An n×n matrix has n eigenvalues, counted with multiplicity. In some cases, a matrix may have repeated eigenvalues, which can result in fewer linearly independent eigenvectors.

5. What is the relationship between eigenvalues and diagonalization?

Eigenvalues and eigenvectors play a crucial role in the diagonalization of a matrix. Diagonalization is the process of finding a diagonal matrix that is similar to the original matrix, meaning the two share the same eigenvalues; the diagonal entries are exactly those eigenvalues. The diagonal form makes computation and analysis of the matrix much easier.
