Dimension and Orthogonality in Vector Spaces: A Proof of the Inequality m ≤ n

In summary, this thread asks whether a set of nonzero pairwise orthogonal vectors in an n-dimensional subspace can have more than n elements. The solution is to show that such a set is linearly independent, which, by the definition of dimension, means it has at most n elements.
  • #1
roam

Homework Statement


If {u1, u2,...,um} are nonzero pairwise orthogonal vectors of a subspace W of dimension n, prove that [tex]m \leq n[/tex].

The Attempt at a Solution



I've looked through all my notes but I still can't understand what this question is asking or which definitions I need to use... I'm stuck and would appreciate some guidance so I can get started. Thank you.
 
  • #2
Well, what is n? It's the dimension. So we're asking whether a set of mutually orthogonal vectors can have cardinality greater than n. Can it? Hmm, what is dimension? It is the cardinality (size) of a maximal linearly independent set. Does that help? (I.e. you should now think: if I can show that the set is ________ then by what's gone before it must have at most n elements.)
 
  • #3
matt grime said:
Well, what is n? It's the dimension. So we're asking whether a set of mutually orthogonal vectors can have cardinality greater than n. Can it? Hmm, what is dimension? It is the cardinality (size) of a maximal linearly independent set. Does that help? (I.e. you should now think: if I can show that the set is ________ then by what's gone before it must have at most n elements.)

Do you mean I have to prove that the set is linearly independent? A set of mutually orthogonal non-zero vectors is always linearly independent.

OK, so the vectors {u1, u2, ..., um} lie in the subspace W of dimension n, and they are pairwise orthogonal (i.e. uj · ui = 0 if i ≠ j).

Let's take a linear combination of the vectors in this set that gives the zero vector:

k1u1+k2u2+...+kmum = 0

I just need to show that the only possibility for the constants is k1 = k2 = ... = km = 0.

If I take the dot product of both sides of the equation with u1:

u1 · (k1u1 + k2u2 + ... + kmum) = u1 · 0 = 0

Distributing the dot product and factoring out the constants:

k1(u1 · u1) + k2(u1 · u2) +...+km(u1 · um) = 0

k1(u1 · u1) + k2(0) + ... + km(0) = 0

Since u1 is nonzero, u1 · u1 ≠ 0, so k1 = 0. Repeating the same dot-product argument with u2, ..., um shows that every scalar coefficient is zero, and therefore the vectors are linearly independent.
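Written compactly for a general index i, that step is:

[tex]u_i \cdot \left( \sum_{j=1}^{m} k_j u_j \right) = \sum_{j=1}^{m} k_j (u_i \cdot u_j) = k_i (u_i \cdot u_i) = 0 \;\Longrightarrow\; k_i = 0.[/tex]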

I showed that the vectors in the set are linearly independent but I can't see exactly how this is useful in showing that [tex]m \leq n[/tex]...
Could you please provide me with more explanation of what I need to do next?
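(Not part of the proof, just a numerical sanity check with vectors I made up: a small NumPy sketch confirming that a matrix whose columns are pairwise orthogonal nonzero vectors has full column rank, i.e. the vectors are linearly independent and m ≤ n.)

[code]
import numpy as np

# Sanity check (illustration only): pairwise orthogonal nonzero vectors
# are linearly independent, so a matrix with those vectors as columns
# has rank m, and m cannot exceed the dimension n.

# Example: m = 3 pairwise orthogonal vectors in a space of dimension n = 4.
u1 = np.array([1.0,  1.0, 0.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0, 0.0])
u3 = np.array([0.0,  0.0, 2.0, 0.0])
U = np.column_stack([u1, u2, u3])   # columns are the u_i
n, m = U.shape

# Pairwise orthogonality: off-diagonal entries of the Gram matrix U^T U vanish.
gram = U.T @ U
assert np.allclose(gram, np.diag(np.diag(gram)))

# Linear independence: the rank equals the number of vectors, and m <= n.
assert np.linalg.matrix_rank(U) == m
assert m <= n
print(f"m = {m}, n = {n}, rank = {np.linalg.matrix_rank(U)}")
[/code]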
 
  • #4
I stated the definition of dimension in my first post: it is the maximum size of *any* set of linearly independent vectors. You have a set of m linearly independent vectors. You know that *by definition* the maximum size a set of linearly independent vectors can have is n.
 
  • #5
It proves that [tex]m \leq n[/tex], end of proof? Is what I've done sufficient to write up as a proof, or do I also need to justify the characterization of dimension that you stated?
 
  • #6
Yes it obviously proves m<=n. Was that really a question? Do I need to explain it more?

What I used above is the definition of dimension. What other one do you have?
 
  • #7
OK, thank you very much for your help Matt. :biggrin:
 
  • #8
Seriously, are you using a different definition? You could be using the minimal size of a spanning set as the definition, so you probably ought to show that this agrees with maximal size of a linearly independent set. Or whatever definition you're using (I can't think of another elementary one).
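(For reference, if your course takes the minimal size of a spanning set as the definition, the standard bridge is the exchange lemma:

[tex]\{v_1,\dots,v_n\} \text{ spans } W \ \text{ and } \ \{u_1,\dots,u_m\} \subseteq W \text{ linearly independent} \ \Longrightarrow\ m \leq n.[/tex]

Since a basis is simultaneously spanning and linearly independent, this forces the minimal spanning size and the maximal linearly independent size to coincide, so either definition gives the same n.)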
 

Related to Dimension and Orthogonality in Vector Spaces: A Proof of the Inequality m ≤ n

1. What is the difference between dimension and orthogonality?

The dimension of a vector space (or subspace) is the number of vectors in any basis of it, equivalently the maximum size of a linearly independent set. Orthogonality, on the other hand, is a relation between two vectors: they are orthogonal when their dot (inner) product is zero, i.e. they are perpendicular.

2. How does dimension affect orthogonality?

The dimension of a space bounds the number of nonzero mutually orthogonal vectors it can contain. For example, a 2-dimensional space contains at most 2 nonzero mutually orthogonal vectors, while a 3-dimensional space contains at most 3.

3. Can a higher dimensional space have fewer orthogonal vectors than a lower dimensional space?

The space itself cannot: an n-dimensional space always admits n mutually orthogonal vectors (an orthogonal basis), so a higher-dimensional space admits at least as many as a lower-dimensional one. However, a particular set of vectors chosen from a higher-dimensional space can contain fewer mutually orthogonal vectors, for example when some of the vectors are linearly dependent and hence cannot all be pairwise orthogonal.

4. How are dimension and orthogonality related to linear independence?

Linear independence is the property of a set of vectors in which none of the vectors can be expressed as a linear combination of the others. Nonzero pairwise orthogonal vectors are automatically linearly independent, so the dimension, being the maximum size of a linearly independent set, also bounds the size of any set of nonzero pairwise orthogonal vectors. That is exactly the inequality m ≤ n proved in the thread above.

5. Why is orthogonality important in mathematics and science?

Orthogonality has many practical applications in mathematics and science, such as in vector algebra, optimization problems, and signal processing. It allows for the simplification and analysis of complex systems by breaking them down into orthogonal components. It also helps in solving equations and finding solutions in various fields of study, including physics, engineering, and computer science.
