Tensor notation for vector product proofs

skatenerd

Active member
Oct 3, 2012
114
I am new to tensor notation, but have known how to work with vector calculus for a while now. I understand for the most part how the Levi-Civita symbol and Kronecker delta work with the Einstein summation convention. However, there are a few things I'm iffy about.
For example, I have a problem where I am to prove
$$\vec{A}\cdot\vec{B}\times\vec{C} = \vec{B}\cdot\vec{C}\times\vec{A} = \vec{C}\cdot\vec{A}\times\vec{B}$$
using tensor notation to avoid having to write out all the terms.
So I know the very left side of this equation would look like
$$\vec{A}\cdot(\vec{B}\times\vec{C}) = A_i (\vec{B}\times\vec{C})_i = A_i \varepsilon_{ijk} B_j C_k$$
But then I get confused when trying to assign the indices for the next two parts of the equation.
Would the second part look like this:
$$\vec{B}\cdot(\vec{C}\times\vec{A}) = B_j (\vec{C}\times\vec{A})_j = B_j \varepsilon_{jkl} C_k A_l$$
Or would the indices of the epsilon be the same as for the first part (\(\varepsilon_{ijk}\))?
Same confusion goes for the third part. The reason I have this uncertainty is that I know the vector triple product requires introducing two extra indices. So I guess my incomplete understanding of these symbols is leaving me confused here. Thanks in advance for any guidance.
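For what it's worth, I did a quick numeric sanity check of that first expression; this is just my own NumPy sketch, not anything from the book:

```python
import numpy as np

# Levi-Civita symbol as a 3x3x3 array: +1 on even permutations, -1 on odd
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

rng = np.random.default_rng(0)
A, B, C = rng.standard_normal((3, 3))

# A_i eps_{ijk} B_j C_k, with all repeated indices summed by einsum
lhs = np.einsum('i,ijk,j,k->', A, eps, B, C)
print(np.isclose(lhs, A @ np.cross(B, C)))  # True
```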
 

dwsmith

Well-known member
Feb 1, 2012
1,673

Even if you rearrange and your indices come out in some other order, say \(jik\), you can always relabel the new indices as \(lmn\) and then set \(lmn = ijk\); since summed indices are arbitrary, you are right back to \(ijk\).
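Concretely, a small worked example of the relabeling (the vectors \(x\), \(y\), \(z\) are placeholders of my own):
\[
\varepsilon_{jik}\,x_j y_i z_k = \varepsilon_{lmn}\,x_l y_m z_n = \varepsilon_{ijk}\,x_i y_j z_k ,
\]
since every index is summed, renaming \(j\to l\), \(i\to m\), \(k\to n\) (and then renaming again to \(i\), \(j\), \(k\)) changes nothing.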

Is that what you were asking? That is what I took from the question. If not, clarify where I lost you.
 

skatenerd

Active member
Oct 3, 2012
114
I'm not sure I made myself clear enough...
What I am unsure of in constructing this equation in tensor notation is this: if I take the cross product of two arbitrary vectors B and C (which lie in some arbitrary plane), would the indices of the epsilon symbol be different than if I took the cross product of vectors C and A? Or would it make sense to call them both just \(\varepsilon_{ijk}\)? That somehow doesn't seem like it would make sense to me, but I'm not sure what else would be correct.
 

dwsmith

Well-known member
Feb 1, 2012
1,673
I will explain the first equality, which may help:
\[
\mathbf{A}\cdot(\mathbf{B}\times\mathbf{C}) = \varepsilon_{ijk}a_ib_jc_k
\]
which is just a sum over the repeated indices; here \(a\), \(b\), and \(c\) are the components of the vectors.

We can then write \(\varepsilon_{ijk}a_ib_jc_k = \varepsilon_{jki}b_jc_ka_i\).

We can either do a substitution or recall that \(jki\) is a cyclic (even) permutation of \(ijk\), so it doesn't introduce a negative sign. Thus, relabeling the dummy indices \(j\to i\), \(k\to j\), \(i\to k\), we can rewrite the equation as
\[
\varepsilon_{ijk}b_ic_ja_k = \mathbf{B}\cdot(\mathbf{C}\times\mathbf{A})
\]
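The remaining equality follows the same way with the other cyclic permutation:
\[
\varepsilon_{ijk}a_ib_jc_k = \varepsilon_{kij}c_ka_ib_j = \varepsilon_{ijk}c_ia_jb_k = \mathbf{C}\cdot(\mathbf{A}\times\mathbf{B}),
\]
since \(kij\) is also an even permutation of \(ijk\).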
 

skatenerd

Active member
Oct 3, 2012
114
Ahhh wow thank you so much. Now I see what you meant by the indices being arbitrary. And seeing that this proof works now helps me better conceptualize what these indices are actually doing.
 

skatenerd

Active member
Oct 3, 2012
114
One other quick question, about a slightly different thing, if you don't mind... How would you set up a problem in tensor notation to find the magnitude of a cross product? Like, for instance, the vector A crossed with the vector B? Would it make sense just to dot the product with itself?
 

dwsmith

Well-known member
Feb 1, 2012
1,673
I don't quite understand your question. Can you give me an example problem or question?
 

skatenerd

Active member
Oct 3, 2012
114
I want to do this in tensor notation:
$$|\vec{A}\times\vec{B}|$$
Magnitude of a cross product of two arbitrary vectors. So the way I know to start is:
$$|\varepsilon_{ijk}A_j B_k|$$
Taking this magnitude in tensor notation is what I am not sure I understand how to do. Would it make sense to write
$$(\varepsilon_{ijk}A_j B_k)(\varepsilon_{ilm}A_l B_m)$$
(dotting the product with itself, with fresh dummy indices in the second factor) and then use the "epsilon killer" identity to simplify it? Not really sure of any other way to notate it.
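For reference, in case my nickname for it is off, the identity I mean is the double-epsilon contraction
$$\varepsilon_{ijk}\varepsilon_{ilm} = \delta_{jl}\delta_{km} - \delta_{jm}\delta_{kl}$$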
 

dwsmith

Well-known member
Feb 1, 2012
1,673
I would just interpret it as
\[
\lvert\mathbf{A}\times\mathbf{B}\rvert = \lvert \mathbf{A}\rvert\,\lvert \mathbf{B}\rvert\sin(\theta)
\]
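A quick numeric check of that interpretation (just my own NumPy sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B = rng.standard_normal((2, 3))

# angle between A and B, recovered from the dot product
theta = np.arccos(A @ B / (np.linalg.norm(A) * np.linalg.norm(B)))

lhs = np.linalg.norm(np.cross(A, B))
rhs = np.linalg.norm(A) * np.linalg.norm(B) * np.sin(theta)
print(np.isclose(lhs, rhs))  # True
```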
Is there a specific end goal of this question? Should you come up with a certain expression?
 

skatenerd

Active member
Oct 3, 2012
114
Well, actually, the goal of the question is to prove the expression you just wrote, using tensor notation. I was just having a hard time even getting started with the whole idea of taking a magnitude in tensor notation.
 

dwsmith

Well-known member
Feb 1, 2012
1,673
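Here is one way to get there, a sketch using that \(\varepsilon\)-\(\delta\) contraction, with distinct dummies in the two epsilon factors:
\[
\lvert\mathbf{A}\times\mathbf{B}\rvert^2
= (\varepsilon_{ijk}A_jB_k)(\varepsilon_{ilm}A_lB_m)
= (\delta_{jl}\delta_{km} - \delta_{jm}\delta_{kl})A_jB_kA_lB_m
= (A_jA_j)(B_kB_k) - (A_jB_j)(A_kB_k).
\]
That is \(\lvert\mathbf{A}\rvert^2\lvert\mathbf{B}\rvert^2 - (\mathbf{A}\cdot\mathbf{B})^2 = \lvert\mathbf{A}\rvert^2\lvert\mathbf{B}\rvert^2(1 - \cos^2\theta) = \lvert\mathbf{A}\rvert^2\lvert\mathbf{B}\rvert^2\sin^2\theta\), and taking the square root (\(\sin\theta \ge 0\) for \(0 \le \theta \le \pi\)) gives the result.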

skatenerd

Active member
Oct 3, 2012
114
That really helps a lot. Definitely going to bookmark that for future reference :D