Orthogonality in h^n for a field h

In summary, the conversation discusses a linear map from a vector space over a field h to h^n, the product of the field with itself n times. Two vectors in h^n are called orthogonal if their dot product is zero. The question asks whether the map is surjective when the only vector orthogonal to every vector in its image is the zero vector. The conversation concludes that the map is indeed surjective; the proof relies on the result that, for a non-degenerate pairing, the dimension of a subspace and the dimension of its orthogonal complement add up to n.
  • #1
Josh Swanson
Let V be a vector space over a field h and let n be a positive integer. Let f:V -> h^n be a linear map given by
f(v) = (f1(v), f2(v), ..., fn(v)). Call two vectors (g1, ..., gn) and (h1, ..., hn) in h^n "orthogonal" if

g1 h1 + ... + gn hn = 0

Suppose the only vector orthogonal to every vector f(v) is the 0 vector. Is f surjective?
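For a quick feel for what the hypothesis excludes, consider V = h and f(v) = (v, v): the vector (1, -1) is orthogonal to every f(v), since v*1 + v*(-1) = 0, and f is indeed not surjective. The hypothesis rules out exactly this kind of degeneracy.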


Maybe I'm just missing something, but the only way I can see to get the result is to consider the orthogonal complement of the image of f, but that requires the codomain to be an inner product space, which h^n isn't in general. (In fact, in this application, h can be either a field of prime order or the rationals.)
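For a concrete instance of the problem: if h is the field of order 2, the nonzero vector (1, 1) in h^2 satisfies

1*1 + 1*1 = 0,

so it is "orthogonal" to itself. The pairing is therefore not positive definite (positivity doesn't even make sense here), and the usual inner-product arguments about orthogonal complements can't be applied verbatim.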

This question was inspired by Andrea Ferretti's MathOverflow answer (http://mathoverflow.net/questions/13322/slick-proof-a-vector-space-has-the-same-dimension-as-its-dual-if-and-only-if-it) showing that the dimension of an infinite-dimensional vector space is less than that of its dual. I'm not convinced the end of that proof works out. Considering orthogonality in essentially the situation I've outlined is suggested at the end of the comments by KConrad.
 
  • #2
Let f(V) be our image. It is proven in www.maths.bris.ac.uk/~maxmr/la2/notes_5.pdf that the familiar dimension formula still holds for every subspace W of [itex]h^n[/itex]:

[tex]\dim{W}+\dim{W^\bot}=n[/tex]

In particular (since [itex]f(V)^\bot=\{0\}[/itex])

[tex]\dim{f(V)}=n[/tex]

so our map is surjective.

EDIT: I should probably say why our "inner product" is non-degenerate, since I can imagine that this is not always the case. It's because [itex]f(V)^\bot=\{0\}[/itex]: since [itex]f(V)\subseteq h^n[/itex], we have [itex](h^n)^\bot\subseteq f(V)^\bot=\{0\}[/itex]. So the radical is [itex]\{0\}[/itex], which is equivalent to non-degeneracy.
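For anyone who wants to see the dimension count concretely, here is a minimal sketch (the prime p = 5, the ambient dimension n = 4, and the sample spanning rows are chosen arbitrarily for illustration) that brute-forces [itex]W^\bot[/itex] over a small prime field and checks [itex]\dim{W}+\dim{W^\bot}=n[/itex]:

[code]
# Minimal sketch: check dim W + dim W^perp = n for the dot-product pairing
# on (F_p)^n, where W is spanned by the rows below.
from itertools import product

p, n = 5, 4                                   # small prime field F_p, ambient dimension n

def rank_mod_p(rows):
    """Rank of a list of length-n vectors over F_p, by Gaussian elimination."""
    rows = [list(r) for r in rows]
    rank = 0
    for col in range(n):
        pivot = next((i for i in range(rank, len(rows)) if rows[i][col] % p), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        inv = pow(rows[rank][col], p - 2, p)  # inverse mod p (p is prime)
        rows[rank] = [x * inv % p for x in rows[rank]]
        for i in range(len(rows)):
            if i != rank and rows[i][col] % p:
                c = rows[i][col]
                rows[i] = [(a - c * b) % p for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

def orthogonal_complement(W):
    """Brute-force W^perp = {c in F_p^n : c . w = 0 for every w in W}."""
    return [c for c in product(range(p), repeat=n)
            if all(sum(ci * wi for ci, wi in zip(c, w)) % p == 0 for w in W)]

# Rows spanning W; the third row is the sum of the first two, so dim W = 2.
W = [(1, 0, 2, 3), (0, 1, 1, 1), (1, 1, 3, 4)]

dim_W = rank_mod_p(W)
dim_W_perp = rank_mod_p(orthogonal_complement(W))
print(dim_W, dim_W_perp, dim_W + dim_W_perp == n)   # prints: 2 2 True
[/code]

For this rank-2 subspace of [itex](\mathbb{F}_5)^4[/itex] the script prints 2 2 True, matching the formula.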
 
  • #3
Ah, wonderful, thanks for the link! I was hoping the result was still true outside of inner product spaces, and so it is. That completes the proof then.


P.S. Non-degeneracy can also be deduced easily without recourse to the existence of [itex]f[/itex] from

[tex](c_1, \ldots, c_i, \ldots, c_n) \cdot (0, \ldots, 0, 1, 0, \ldots, 0) = c_i[/tex]
so if c is orthogonal to [itex]h^n[/itex] then
[tex]c \cdot e_i = c_i = 0[/tex]
so [itex]c = 0[/itex].
 

Related to Orthogonality in h^n for a field h

1. What is orthogonality in h^n for a field h?

Orthogonality in h^n for a field h refers to a relation between vectors in h^n that generalizes perpendicularity: two vectors (g1, ..., gn) and (h1, ..., hn) are called orthogonal if their dot product g1 h1 + ... + gn hn is equal to zero.

2. Why is orthogonality important in h^n for a field h?

Orthogonality is important in h^n for a field h because it lets geometric notions such as perpendicularity and orthogonal complements be used in arbitrary dimension. It also has many applications in fields such as physics, engineering, and computer science.

3. How is orthogonality achieved in h^n for a field h?

Two vectors in h^n are orthogonal exactly when their dot product is zero, so orthogonality is achieved by choosing vectors whose dot product vanishes. Over fields such as the rationals or the reals, a linearly independent set can be orthogonalized with the Gram-Schmidt process (a small sketch is given below); over a general field this can break down, because a nonzero vector may have dot product zero with itself, making the Gram-Schmidt division undefined.
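As an illustration of the Gram-Schmidt step, here is a minimal sketch (the example vectors are made up, and the field is taken to be the rationals, where no nonzero vector has dot product zero with itself, so the division is safe):

[code]
# Minimal sketch: Gram-Schmidt over the rationals with exact Fraction arithmetic.
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Orthogonalize a list of linearly independent vectors in Q^n."""
    ortho = []
    for v in vectors:
        w = [Fraction(x) for x in v]
        for u in ortho:
            # Safe over Q: dot(u, u) is never 0 for a nonzero u.
            coeff = dot(w, u) / dot(u, u)
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        ortho.append(w)
    return ortho

basis = [(1, 1, 0), (1, 0, 1), (0, 1, 1)]
for u in gram_schmidt(basis):
    print(u)          # any two distinct output vectors have dot product 0
[/code]

Over a finite field the same procedure can hit a division by zero when some intermediate vector is orthogonal to itself.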

4. Can orthogonality exist in h^n for a field h with non-Euclidean metrics?

Yes. Orthogonality does not require a Euclidean inner product: it can be defined on any vector space equipped with a bilinear form, such as the dot product on h^n used in this thread, whether or not that form is positive definite.

5. How is orthogonality related to linear independence in h^n for a field h?

In h^n for a field h, orthogonality and linear independence are closely related. Over the reals or the rationals with the standard dot product, nonzero pairwise orthogonal vectors are always linearly independent. Over a general field this can fail, because a nonzero vector can be orthogonal to itself (see the example below). Conversely, over the rationals or the reals, a set of linearly independent vectors can be orthogonalized using the Gram-Schmidt process to obtain a set of orthogonal vectors spanning the same subspace.
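For example, over the field with five elements, the vector v = (1, 2) is orthogonal to itself, since 1*1 + 2*2 = 5 = 0 in that field. Consequently v and 2v = (2, 4) are orthogonal to each other (their dot product is 2*(1*1 + 2*2) = 0) yet linearly dependent. Over the reals or the rationals this cannot happen, which is why the statements above are restricted to those fields.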
