Confusion about basis vectors and the metric tensor

In summary, the covariant and contravariant metric tensors can be defined using coordinate basis vectors and dual basis vectors, respectively, with the inner product denoted by °. The definition of a basis does not require orthogonality, only linear independence: orthogonality implies linear independence, but the converse does not hold. The dual basis recovers the convenience of orthogonality for an arbitrary basis through the relation [itex]e_i \cdot e^j = \delta_i^j[/itex].
  • #1
nomadreid
In "A Student's Guide to Vectors and Tensors" by Daniel Fleisch, I read that the covariant metric tensor gij=ei°ei (I'm leaving out the → s above the e's) where ei and ei are coordinate basis vectors and ° denotes the inner product, and similarly for the contravariant metric tensor using dual base vectors. But I thought the definition of base vectors included that the inner product of two distinct ones of the same type was zero, and similarly for dual base vectors. (For example, the basis vectors of Rn, with the inner product = the dot product.) Where is my thinking wrong?
 
  • #2
All you need for a basis is linear independence. If you also have orthogonality -- i.e., if [itex]e_i \cdot e_j = 0[/itex] when [itex]i \ne j[/itex] -- that can be convenient, but it's not necessary.

The dual basis is the tool we use to get this same convenience (zero dot products) with an arbitrary basis. The main relation here is:
[tex]
e_i \cdot e^j = \delta_i^j
[/tex]
where [itex]\delta_i^j[/itex] is the Kronecker delta symbol (1 if [itex]i=j[/itex]; 0 otherwise).
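
To see this numerically, here is a minimal sketch (assuming NumPy, with basis vectors stored as matrix rows; the random basis is just an illustrative choice) that builds the dual basis of a non-orthogonal basis and checks the Kronecker-delta relation:

[code]
import numpy as np

# A random basis of R^3: the rows e_1, e_2, e_3 are almost surely
# linearly independent but not orthogonal.
rng = np.random.default_rng(0)
E = rng.normal(size=(3, 3))

# The dual basis e^j (rows of D) is defined by e_i . e^j = delta_i^j,
# i.e. E @ D.T = I, so D is the transpose of the inverse of E.
D = np.linalg.inv(E).T

# Check the defining relation: all cross dot products vanish
# and e_i . e^i = 1.
print(np.allclose(E @ D.T, np.eye(3)))   # True
[/code]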
 
  • #3
Ah, thanks, it had not occurred to me that, although orthogonality implies linear independence, the converse is not true. This spurred me to try some examples, and I see that one can take the linearly independent vectors {(1,1), (1,0)} as a basis spanning [itex]\mathbb{R}^2[/itex] even though they are not orthogonal, their dot product being 1. OK, case closed. Thanks again.
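
For the record, here is that example worked out numerically (a minimal sketch assuming NumPy, vectors as rows): the covariant metric tensor of {(1,1), (1,0)}, its dual basis, and the contravariant metric tensor.

[code]
import numpy as np

# The basis from the post above: e_1 = (1, 1), e_2 = (1, 0) as rows of E.
E = np.array([[1.0, 1.0],
              [1.0, 0.0]])

# Covariant metric tensor g_ij = e_i . e_j
g = E @ E.T
print(g)
# [[2. 1.]
#  [1. 1.]]  <- the off-diagonal 1 is the dot product mentioned above

# Dual basis e^j = g^{jk} e_k (rows of D), obtained with the inverse metric.
D = np.linalg.inv(g) @ E
print(D)
# [[ 0.  1.]
#  [ 1. -1.]]

# Contravariant metric tensor g^ij = e^i . e^j; it equals the inverse of g.
print(np.allclose(D @ D.T, np.linalg.inv(g)))   # True
[/code]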
 

Related to Confusion about basis vectors and the metric tensor

1. What is the difference between a basis vector and a metric tensor?

A basis vector is one of a set of linearly independent vectors that spans a vector space, while the metric tensor is the array of inner products of basis vectors, [itex]g_{ij} = e_i \cdot e_j[/itex], which encodes the lengths of and angles between those vectors.

2. How are basis vectors and metric tensors related?

The components of the covariant metric tensor are the pairwise inner products of the coordinate basis vectors, and the components of the contravariant metric tensor are the pairwise inner products of the dual basis vectors.

3. Can basis vectors and metric tensors be used interchangeably?

No, they serve different purposes. Basis vectors are used to express the components of a vector in a vector space, while the metric tensor relates covariant and contravariant components and defines inner products between vectors.

4. How do I determine the basis vectors for a given vector space?

The basis vectors for a vector space can be determined by finding a set of linearly independent vectors that span the space. These vectors can then be used as a basis to represent any vector in the space.
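
As a quick numerical version of this check (a sketch assuming NumPy, with candidate vectors as matrix rows): a set of n vectors forms a basis of an n-dimensional space exactly when the matrix they form has full rank.

[code]
import numpy as np

# Candidate basis vectors as rows.
candidates = np.array([[1.0, 1.0],
                       [1.0, 0.0]])

# n vectors in R^n form a basis iff the matrix has rank n.
is_basis = np.linalg.matrix_rank(candidates) == candidates.shape[1]
print(is_basis)   # True: {(1, 1), (1, 0)} is a (non-orthogonal) basis of R^2
[/code]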

5. Can there be more than one set of basis vectors for a given vector space?

Yes, there can be infinitely many sets of basis vectors for a given vector space. However, all sets will have the same number of vectors, known as the dimension of the vector space.
