Motivate me!

Petrus

Well-known member
Feb 21, 2013
739
Hello MHB,
I am interested to know WHY you enjoy linear algebra! I myself feel like I'm "forcing" myself to learn it, because I don't see the point of linear algebra. I mean, I compute bases and much more, but if I am honest it feels like I have no use for them (I have only used them for solving differential equations).
I am OPEN to reading any text/article to motivate myself as to WHY I should learn it; I have a really hard time studying something that doesn't feel interesting/useful.

Best Regards,
\(\displaystyle |\pi\rangle\)
 

Ackbach

Indicium Physicus
Staff member
Jan 26, 2012
4,193
Well, I don't know if you're studying physics, but if you are, I can say this: Quantum Mechanics is Linear Algebra on steroids. QM is where the action is, where LA gets seriously applied. Very few topics in LA are untouched by QM.
 

Petrus

Well-known member
Feb 21, 2013
739
Hello,
I do not study physics, but I may (I'm not sure) study it later. I do plan to study some programming! Currently I am only taking math courses!

Regards,
\(\displaystyle |\pi\rangle\)
 

Ackbach

Indicium Physicus
Staff member
Jan 26, 2012
4,193
Linear algebra is quite useful to programmers as well. One particular area of application is numerical analysis. Indeed, when I took numerical analysis, which was a year-long course, half of it was numerical linear algebra. Finding out how you compute roots of polynomials (convert to a matrix problem and find eigenvalues using the QR algorithm), etc., was interesting and useful. Knowing how the technology works is essential to using it well, I find.
 

Petrus

Well-known member
Feb 21, 2013
739
That sounds good! I have not taken numerical analysis, but do you mean any polynomial? Any chance you can link me an example?

Regards,
\(\displaystyle |\pi\rangle\)
 

Ackbach

Indicium Physicus
Staff member
Jan 26, 2012
4,193
It works for any polynomial. You first form the Frobenius companion matrix of the polynomial. This is a matrix whose eigenvalues are provably equal to the roots of the polynomial. Then you use the QR method, or implicit QR, or whatever you want, to compute the eigenvalues of the companion matrix, and voila! Roots!
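
To make this concrete, here is a minimal sketch in Python/NumPy (my illustration, not from the thread itself). It builds the companion matrix of a monic cubic and hands it to np.linalg.eigvals, which applies a QR-type algorithm from LAPACK under the hood:

[CODE]
import numpy as np

def companion(a):
    """Frobenius companion matrix of the monic polynomial
    x^n + a[n-1]*x^(n-1) + ... + a[1]*x + a[0]."""
    n = len(a)
    C = np.zeros((n, n))
    C[1:, :-1] = np.eye(n - 1)    # ones on the subdiagonal
    C[:, -1] = -np.asarray(a)     # last column holds -a_0, ..., -a_{n-1}
    return C

# x^3 - 6x^2 + 11x - 6 = (x - 1)(x - 2)(x - 3), so a = [-6, 11, -6]
C = companion([-6.0, 11.0, -6.0])
print(np.linalg.eigvals(C))       # the roots 1, 2, 3 (in some order)
[/CODE]

Incidentally, this is essentially what numpy.roots does internally.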
 

Jameson

Administrator
Staff member
Jan 26, 2012
4,043
Linear algebra comes up a lot in statistics as well, because you have large amounts of data that you need to manipulate to find things like regressions, principal components, and other decompositions of data. This semester I've really started to appreciate how important LA is: in real-world practice we don't have perfect data that follows nice functions, so we use large matrices to estimate as closely as possible.
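
As a small illustration (my sketch, not part of the original post): ordinary least-squares regression is purely a linear-algebra computation, projecting the data vector $y$ onto the column space of the design matrix $X$:

[CODE]
import numpy as np

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=100)])  # design matrix: intercept + one predictor
beta_true = np.array([2.0, -3.0])
y = X @ beta_true + 0.1 * rng.normal(size=100)             # noisy observations of a linear model

# ordinary least squares: minimize ||X @ beta - y||, solved via the SVD
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)   # close to [2., -3.]
[/CODE]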
 

ModusPonens

Well-known member
Jun 26, 2012
45
I think it's hard to overestimate the importance of linear algebra. Every area of applied mathematics is touched by it, be it physics, numerical analysis, or probability and statistics. LA is also important in pure mathematics: it is applied throughout the vast fields of differential geometry and differential equations. Its way of thinking, the way you learn to deal with vector spaces and linear transformations, is a door to the gigantic field of abstract algebra and, eventually, category theory. The only field for which I can't recall a use is logic. Someone will probably remind me of one. :)

Most importantly, learn to appreciate the subject. It's very beautiful. Think deeply about what the results mean and what the definitions are saying. The imagery this subject creates in your mind is very appealing. :)
 

Fantini

"Read Euler, read Euler." - Laplace
MHB Math Helper
Feb 29, 2012
342
I understand your frustration. Much has been said here, but I think the answers to these questions on MSE might enlighten you even more.

Why study linear algebra?

Importance of linear algebra.

It takes some time and experience to begin to appreciate the power of the tools we are learning. Linear algebra is one of those tools. Given time, you'll begin to see things in perspective and realize how powerful it is.

Cheers. :)
 

Deveno

Well-known member
MHB Math Scholar
Feb 15, 2012
1,967
Many people find Linear Algebra tedious, and perhaps even...boring. This is because vector spaces are (mathematically speaking) fairly well-behaved entities. So....why is this important?

For one, it turns out that very complicated (hard) problems in math can be "approximated" (or MODELED) by a suitable linear problem, which may be EASY to solve where the original problem is practically impossible. An example of this is solving the equation of motion of a pendulum, where the sine term is replaced by a simple $x$ term (the function $f(x) = x$ is a "first-order approximation", a linear one, of the function $g(x) = \sin x$).
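
Spelled out (my sketch of the standard small-angle argument, not part of the original post): the exact pendulum equation is nonlinear, but near $\theta = 0$ one replaces $\sin\theta$ by $\theta$:

$$\ddot{\theta} + \frac{g}{L}\sin\theta = 0 \qquad\longrightarrow\qquad \ddot{\theta} + \frac{g}{L}\,\theta = 0,$$

and the linearized equation has the elementary solution $\theta(t) = \theta_0\cos\left(\sqrt{g/L}\,t\right)$ for a pendulum released from rest at angle $\theta_0$, whereas the original equation has no solution in elementary functions.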

Many unexpected things turn out to be linear: one prominent example is the derivative operator, $D$. Facts learned in linear algebra can be exploited to solve differential equations, which have many practical applications in "the real world".
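
For instance (a minimal sketch of mine, not from the original post), on the space of polynomials of degree at most 3, with basis $\{1, x, x^2, x^3\}$, the operator $D$ is literally a $4 \times 4$ matrix:

[CODE]
import numpy as np

# matrix of d/dx on polynomials of degree <= 3, in the basis {1, x, x^2, x^3}
D = np.zeros((4, 4))
for k in range(1, 4):
    D[k - 1, k] = k                    # d/dx of x^k is k*x^(k-1)

p = np.array([5.0, 0.0, -2.0, 1.0])    # coefficients of p(x) = 5 - 2x^2 + x^3
print(D @ p)                           # [0., -4., 3., 0.], i.e. p'(x) = -4x + 3x^2
[/CODE]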

In computer programming, any time you have multiple data registers (that you can perform arithmetic operations on) that do not affect each other, you are effectively dealing with a vector space. Linear algebra provides a very convenient way of talking about such data arrays. Many things can be conveniently and efficiently described as a matrix: game pay-offs, weather trends, economic indicators...it's a long list.

Fourier analysis becomes much more intelligible when you realize you are using trig functions as a basis for the vector space of continuous periodic functions of a given period, relative to an inner product given by integrating two functions multiplied together over the period interval. Fourier first used these to analyze heat transfer, but they are widely used now in signal processing.
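
To illustrate (my sketch, not from the original post): the coefficient of $\sin x$ in a Fourier series is an orthogonal projection computed with exactly that inner product:

[CODE]
import numpy as np

# inner product <f, g> = integral of f*g over one period [0, 2*pi),
# approximated here by a Riemann sum on a uniform grid
x = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
dx = x[1] - x[0]

def inner(f_vals, g_vals):
    return np.sum(f_vals * g_vals) * dx

square = np.sign(np.sin(x))   # square wave of period 2*pi
b1 = inner(square, np.sin(x)) / inner(np.sin(x), np.sin(x))
print(b1)   # approximately 4/pi = 1.2732..., the textbook coefficient
[/CODE]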

Multivariable calculus is MUCH easier to understand with the vocabulary of vector spaces in play. One simply replaces the vector space $\Bbb R$ with $\Bbb R^m$ for the domain and $\Bbb R^n$ for the codomain. The linear function $x \mapsto f'(p)x$ (translated by a suitable constant) becomes the linear transformation $x \mapsto Df(p)x$ (translated by a suitable constant vector). Some important cases:

m = 1, n = 2 or 3 (curves, often represented parametrically, with the domain variable representing time)

m = 2, n = 1 (surfaces in 3-space, viewed as graphs $z = f(x,y)$)

m = n (vector fields). These can be visualized graphically by using the "dual nature" of vectors as "arrows" (with "heads" and "tails") and as "points", one uses the "point form" for the domain, and the "arrow form" for the range (one draws an arrow representing $f(p)$ at the point $p$).
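
A concrete check (my sketch, not from the original post) that the Jacobian matrix $Df(p)$ really is the linear map that best approximates $f$ near $p$:

[CODE]
import numpy as np

# f : R^2 -> R^2, f(x, y) = (x^2 - y, x*y)
def f(v):
    x, y = v
    return np.array([x**2 - y, x * y])

# its Jacobian matrix Df(x, y), computed by hand
def Df(v):
    x, y = v
    return np.array([[2 * x, -1.0],
                     [y, x]])

p = np.array([1.0, 2.0])
h = np.array([1e-3, -2e-3])
print(f(p + h))           # the true value near p
print(f(p) + Df(p) @ h)   # the linear approximation; agrees to about 1e-6
[/CODE]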

In some areas of math (such as representation theory), one seeks to reduce a tough question about an algebraic object (such as a group), to an equivalent (and easier to work with) question about linear algebra.

***********

A really "broad" answer: in many areas of mathematics, the ability to "add" things is a fundamental requirement. Often, the things we are adding can also be "sized" by some multiplicative factor. Technically, the term mathematics use for such a structure is a (left) $R$-module, where $R$ refers to some (usually commutative) ring (rings are important in mathematics, because division doesn't always make "sense"). You've already become familiar with a lot of rings:

1) The integers (the basic example!)
2) The set of even integers
3) Polynomial functions over a given field (like the real numbers)
4) $n \times n$ matrices
5) Continuous (or differentiable) real-valued functions defined on a set $A$.

Now, when we can always divide (by nonzero elements) in a commutative ring $R$, it is a field, $F$. For certain "well-behaved" rings, it is possible to arrange this by creating a "field of fractions" (this means what you probably imagine it does; the rationals are the field of fractions of the integers). This allows us to embed our $R$-module inside an $F$-module, that is: a vector space.

You are probably aware that a system of linear equations, such as:

$2x + 3y = 2$
$4x - 3y = 1$

doesn't always have an INTEGER solution, even though the coefficients of the equations are all integers (here the solution is $x = \frac{1}{2},\ y = \frac{1}{3}$). If we can use fractions, though... hey, no problem! That is why we use fields instead of rings.

Loosely speaking, vectors are things we can add and scale. Of course, these two operations have to be "compatible" (that is why the vector space axioms have the form they do): we want a certain kind of "interaction" between our adding and scaling. The thing to remember is that in vector spaces, the addition is "king". It controls the situation to a large degree (as seen by the importance of the 0-vector in so many situations). The scaling is a secondary operation that has to "respect" the vector addition (it IS king, after all).

Often, the role of the underlying field is so minor with respect to understanding what is going on that a single field (such as the real numbers) is relied upon exclusively. What IS important is "how many copies" of the underlying field we're dealing with: the DIMENSION of the vector space. It is not much of an exaggeration to say that the dimension of a vector space is its single most important property. If one knows the dimension of a space, it's pretty much smooth sailing from that point. This is why the rank-nullity theorem is such a biggie: it gives us dimensional information.

***************

Still bigger picture: higher mathematics has two main roads of exploration:

1) "Building with tinker-toys": we choose (sometimes arbitrarily) some rules for structures to follow, and see what JUST THE RULES THEMSELVES tell us, divorced from any particular instance of something that happens to obey them. We let the structures play with each other, and see if they get along, or not.

2) "Examining the nooks and crannies": under a highly restricted set of assumptions, we see what neat stuff happens under these "perfect storm" conditions.

Linear algebra is central to both kinds of exploration, in many disparate branches of math.
 

Evgeny.Makarov

Well-known member
MHB Math Scholar
Jan 30, 2012
2,492
A couple of things relevant to a programmer. In computer graphics, many transformations are described by matrices, even non-linear ones such as translations (unlike linear transformations, they have no fixed point) and perspective projections (they don't preserve parallelism); the trick is to work in homogeneous coordinates. Also, have a look at this online course: Coding the Matrix. It describes several applications of linear algebra to computer science, including the PageRank computation used by Google search.
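
To show the homogeneous-coordinates trick (my sketch, not part of the original post): a 2-D translation, which is not linear, becomes an honest $3 \times 3$ matrix, so it composes with rotations by plain matrix multiplication:

[CODE]
import numpy as np

def translation(tx, ty):
    # translation by (tx, ty) in homogeneous coordinates
    return np.array([[1.0, 0.0, tx],
                     [0.0, 1.0, ty],
                     [0.0, 0.0, 1.0]])

def rotation(theta):
    # rotation by theta about the origin, in homogeneous coordinates
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s, c, 0.0],
                     [0.0, 0.0, 1.0]])

p = np.array([1.0, 0.0, 1.0])                     # the point (1, 0), homogeneous coordinate 1
M = translation(2.0, 3.0) @ rotation(np.pi / 2)   # rotate 90 degrees, then translate
print(M @ p)                                      # approximately [2., 4., 1.], i.e. the point (2, 4)
[/CODE]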
 

ModusPonens

Well-known member
Jun 26, 2012
45
Hey Evgeny

Since you know quite a lot about logic, can you mention applications of linear algebra to logic?
 

Evgeny.Makarov

Well-known member
MHB Math Scholar
Jan 30, 2012
2,492
Well, there exists linear logic, which is quite a curious object. It may seem horribly restrictive: for example, one cannot derive "$A$ and $A$ implies $A$". But it is intended for reasoning about resources rather than truth, so this restriction is reasonable. It has two conjunctions: $\otimes$ and $\&$. It is said that $\otimes$ "behaves like" the tensor product and $\&$ "behaves like" the Cartesian product, but I am not sure how far this analogy goes. The semantics of linear logic is pretty complicated, even though it came before the inference rules. While Googling, I also found a work in progress that tries to construct a semantics of linear logic from ordinary vector spaces and linear operators.