Levi Civita (Permutation) Symbol Proof

In summary, the proof uses the two equations provided in the textbook, which relate the permutation symbol to the cross product and the Kronecker delta to the dot product. By manipulating these equations, respecting the "occurrence rule" for repeated indices, and using the Kronecker delta as an index switcher, it is possible to show that the permutation symbol is equal to the determinant of the matrix formed by the Kronecker deltas.
  • #1
blink-

Homework Statement


Prove the following:
[itex]\varepsilon_{ijk}=
\left| \begin{array}{ccc}
\delta_{1i} & \delta_{1j} & \delta_{1k} \\
\delta_{2i} & \delta_{2j} & \delta_{2k} \\
\delta_{3i} & \delta_{3j} & \delta_{3k}
\end{array} \right|[/itex]

Homework Equations


From my textbook:
[itex]\hat{e}_3 = \hat{e}_1 \times \hat{e}_2, \quad \hat{e}_1 = \hat{e}_2 \times \hat{e}_3, \quad \ldots \quad \varepsilon_{ijk} \hat{e}_k = \hat{e}_i \times \hat{e}_j \\
\delta_{ij} = \hat{e}_i \cdot \hat{e}_j [/itex]

From a website:
[itex] \varepsilon_{ijk} = (\hat{e}_i \times \hat{e}_j)\cdot\hat{e}_k [/itex]

The Attempt at a Solution


I don't even know where to start. My textbook says I should be able to prove the determinant proof using those two relations they provide; however, I have not been able to prove anything.

It seems as though every continuum mechanics book I've ever seen likes to say "it's easy to show the determinant proof." Apparently it's so easy that no book feels the need to show the derivation. Am I missing any relations? Can someone give me hints or "suggestions" to get me going in the right direction?

Thanks.
 
Last edited:
  • #2
Start with the first formula and sub the second one in:

eijk = (ei x ej) . (ei x ej) dij

Next, notice that (ei x ej) is dotted with itself, meaning...?
 
  • #3
jedishrfu,

Thank you very much for your quick response. My wife and I have been looking at your post and can't seem to understand how you got there. Do I need the third equation? I found it on a website but my textbook informs me that I only need the two other equations... When I rearrange those, I seem to get something different than that listed on the website...

[itex]\varepsilon_{ijk} \hat{e}_k = \hat{e}_i \times \hat{e}_j [/itex]
[itex]\varepsilon_{ijk} \hat{e}_k \cdot \hat{e}_k = (\hat{e}_i \times \hat{e}_j) \cdot \hat{e}_k [/itex]
[itex]\varepsilon_{ijk} \delta_{kk} = (\hat{e}_i \times \hat{e}_j) \cdot \hat{e}_k [/itex]
[itex]3\varepsilon_{ijk} = (\hat{e}_i \times \hat{e}_j) \cdot \hat{e}_k [/itex]

which is not equal to what the website says...

The only thing that we've gotten when trying to use your hint is the following:

[itex]\delta_{ij} = \hat{e}_i \cdot \hat{e}_j [/itex]
[itex]\phantom{\delta_{ij}} = \left( \frac{\hat{e}_j \times \hat{e}_k}{\varepsilon_{ijk}} \right) \cdot \left( \frac{\hat{e}_k \times \hat{e}_i}{\varepsilon_{ijk}} \right) [/itex]

which can be further manipulated, but it doesn't seem to give anything?

Furthermore, even if we could get it to the state you mention, I am still slightly baffled. I understand that with something dotted with itself you have the norm squared, but I can't figure out if there is another relation or how to use that one.

Sorry, I'm sure this is straightforward for you, but I've always had a hard time with this damn permutation symbol. I've used it for countless other identity proofs, but I've never been able to prove this determinant identity of the symbol itself.

Thanks Again.
 
  • #4
blink- said:

Homework Statement


Prove the following:
[itex]\varepsilon_{ijk}=
\left( \begin{array}{ccc}
\delta_{1i} & \delta_{1j} & \delta_{1k} \\
\delta_{2i} & \delta_{2j} & \delta_{2k} \\
\delta_{3i} & \delta_{3j} & \delta_{3k}
\end{array} \right)[/itex]

Do you mean [itex]\varepsilon_{ijk}= \begin{vmatrix} \delta_{1i} & \delta_{1j} & \delta_{1k} \\ \delta_{2i} & \delta_{2j} & \delta_{2k} \\ \delta_{3i} & \delta_{3j} & \delta_{3k} \end{vmatrix}[/itex]?

blink- said:
jedishrfu,

Thank you very much for your quick response. My wife and I have been looking at your post and can't seem to understand how you got there.

I have no idea what jedishrfu did (or was trying to do) there either, so don't feel too bad.

Do I need the third equation? I found it on a website but my textbook informs me that I only need the two other equations... When I rearrange those, I seem to get something different than that listed on the website...

[itex]\varepsilon_{ijk} \hat{e}_k = \hat{e}_i \times \hat{e}_j [/itex]
[itex]\varepsilon_{ijk} \hat{e}_k \cdot \hat{e}_k = (\hat{e}_i \times \hat{e}_j) \cdot \hat{e}_k [/itex]
[itex]\varepsilon_{ijk} \delta_{kk} = (\hat{e}_i \times \hat{e}_j) \cdot \hat{e}_k [/itex]
[itex]3\varepsilon_{ijk} = (\hat{e}_i \times \hat{e}_j) \cdot \hat{e}_k [/itex]

which is not equal to what the website says...

Whenever you are doing index gymnastics, at each step, you should check 2 things:

(1) Does any index occur more than twice in a single term?
(2) Does each term have the same free indices?

When taking the dot product of [itex]\varepsilon_{ijk} \hat{e}_k[/itex] (which has an implied summation over [itex]k[/itex]) with [itex]\hat{e}_k[/itex], you need to use a different dummy index for the summation:

[tex] (\hat{e}_i \times \hat{e}_j) \cdot \hat{e}_k = \varepsilon_{ijm} \hat{e}_m \cdot \hat{e}_k = \varepsilon_{ijm}\delta_{km} = \varepsilon_{ijk}[/tex]
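
If you ever want a quick numerical sanity check of that identity (it is not part of the proof, just a way to catch index mistakes), something like the following NumPy sketch works; the closed form [itex](i-j)(j-k)(k-i)/2[/itex] is only used here as a convenient way to tabulate [itex]\varepsilon_{ijk}[/itex] for the comparison:

[code]
import numpy as np

# Standard orthonormal basis vectors e_1, e_2, e_3 (rows of the identity matrix).
e = np.eye(3)

def epsilon(i, j, k):
    """Levi-Civita symbol for 1-based indices i, j, k in {1, 2, 3}."""
    return (i - j) * (j - k) * (k - i) / 2

# Check epsilon_ijk == (e_i x e_j) . e_k for every index combination.
for i in range(1, 4):
    for j in range(1, 4):
        for k in range(1, 4):
            triple_product = np.dot(np.cross(e[i - 1], e[j - 1]), e[k - 1])
            assert np.isclose(triple_product, epsilon(i, j, k))

print("epsilon_ijk = (e_i x e_j) . e_k checks out for all i, j, k")
[/code]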
 
  • #5
Sorry for the confusion, but in your first post I saw a formula relating ek to ei x ej delta ij and didn't realize the delta ij was part of the 3rd equation. Now, though, it shows on the 3rd line. Did you edit the post?
 
  • #6
gabbagabbahey said:
Do you mean [itex]\varepsilon_{ijk}= \begin{vmatrix} \delta_{1i} & \delta_{1j} & \delta_{1k} \\ \delta_{2i} & \delta_{2j} & \delta_{2k} \\ \delta_{3i} & \delta_{3j} & \delta_{3k} \end{vmatrix}[/itex]?

Yes, that is absolutely right. It is the determinant of the matrix, not just the matrix itself.

gabbagabbahey said:
Whenever you are doing index gymnastics, at each step, you should check 2 things:

(1) Does any index occur more than twice in a single term?
(2) Does each term have the same free indices?

When taking the dot product of [itex]\varepsilon_{ijk} \hat{e}_k[/itex] (which has an implied summation over [itex]k[/itex]) with [itex]\hat{e}_k[/itex], you need to use a different dummy index for the summation:

[tex] (\hat{e}_i \times \hat{e}_j) \cdot \hat{e}_k = \varepsilon_{ijm} \hat{e}_m \cdot \hat{e}_k = \varepsilon_{ijm}\delta_{km} = \varepsilon_{ijk}[/tex]

That is a very clear explanation. I sometimes forget about the "occurrence rule." I had the revelation while sleeping that what I did was wrong (mainly, that I can't sum [itex]\delta_{kk}[/itex] on its own, because it is multiplied by [itex]\varepsilon_{ijk}[/itex], which shares the index). My plan was then to use the Kronecker delta as an index switcher [itex]m\rightarrow k[/itex], which would leave me with that final formula (although my methodology was incorrect).

jedishrfu said:
sorry for he confusion but in your first post I saw a formula relating ek to eixej delta ij and didn't realize the delta ij was part of the 3rd equation. now though it shows it on the 3rd line. did you edit the post?

Sorry about that. I did edit the post because I think I had typed an error in one of the supplied equations. You just replied too fast (you probably never hear that!).

Thanks guys, I'll keep trying.
 
Last edited:
  • #7
If you are allowed to use the well-known fact that the determinant of a 3 x 3 matrix can be written as [itex]\det(\mathbf{A}) = \varepsilon_{lmn} A_{1l} A_{2m} A_{3n}[/itex] (see here and the reference cited therein), then it should be fairly easy to show.

If not, you will probably just have to expand the determinant using whatever methods you are allowed to use and compare the result to [itex]\varepsilon_{ijk}[/itex].
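
In case it is not obvious that this formula is the familiar one, writing out the six nonzero terms of [itex]\varepsilon_{lmn} A_{1l} A_{2m} A_{3n}[/itex] gives

[tex]\varepsilon_{lmn} A_{1l} A_{2m} A_{3n} = A_{11}\left(A_{22}A_{33} - A_{23}A_{32}\right) - A_{12}\left(A_{21}A_{33} - A_{23}A_{31}\right) + A_{13}\left(A_{21}A_{32} - A_{22}A_{31}\right),[/tex]

which is just the cofactor expansion of [itex]\det(\mathbf{A})[/itex] along the first row.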
 
  • #8
gabbagabbahey said:
If you are allowed to use the well-known fact that the determinant of a 3 x 3 matrix can be written as [itex]\det(\mathbf{A}) = \varepsilon_{lmn} A_{1l} A_{2m} A_{3n}[/itex] (see here and the reference cited therein), then it should be fairly easy to show.

If not, you will probably just have to expand the determinant using whatever methods you are allowed to use and compare the result to [itex]\varepsilon_{ijk}[/itex].

Thanks gabbagabbahey. We are allowed to use [itex]\det(\mathbf{A}) = \varepsilon_{lmn} A_{1l} A_{2m} A_{3n}[/itex] (it was another proof I did).

The problem I am running into is starting the problem. The way the proofs are expected to be done is by working from the left side until it equals the right. This rules out proving it by simply expanding the determinant. I have been trying for days now but can't seem to solve it using the two relations the book lays out (and explicitly says "can be easily proved using these relations"). Does anyone know how to start the proof with the equations I defined earlier in the thread?

Thanks Again.
 
  • #9
blink- said:
Thanks gabbagabbahey. We are allowed to use [itex]\det(\mathbf{A}) = \varepsilon_{lmn} A_{1l} A_{2m} A_{3n}[/itex] (it was another proof I did).

The way the proofs are expected to be done is by working from the left side until it equals the right.

Are you sure about this? What is the difference between proving a=b and proving b=a?

Does anyone know how to start the proof with the equations I defined earlier in the thread?

Just use the matrix [itex]\mathbf{A} = \begin{pmatrix} \delta_{1i} & \delta_{1j} & \delta_{1k} \\ \delta_{2i} & \delta_{2j} & \delta_{2k} \\ \delta_{3i} & \delta_{3j} & \delta_{3k} \end{pmatrix} [/itex] and calculate the determinant using the above formula. What is [itex]A_{1l}[/itex]? What is [itex]A_{2m}[/itex]?...
 
  • #10
[itex]\mathrm{det}\begin{pmatrix} \delta_{1i} & \delta_{1j} & \delta_{1k} \\ \delta_{2i} & \delta_{2j} & \delta_{2k} \\ \delta_{3i} & \delta_{3j} & \delta_{3k} \end{pmatrix} = \mathrm{det}(\delta) = \varepsilon_{lmn}\delta_{il}\delta_{jm}\delta_{kn} = \varepsilon_{ijk} [/itex]

It's much easier when I stop trying to solve for the other side. I could have done this from the beginning but I was really trying to manipulate the LHS. Thanks for your help.
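
For anyone reading along later, the identity is also easy to check numerically with a brute-force sketch like the one below (the closed form [itex](i-j)(j-k)(k-i)/2[/itex] is only used to tabulate [itex]\varepsilon_{ijk}[/itex] for the comparison, not as part of the proof):

[code]
import numpy as np

def epsilon(i, j, k):
    """Levi-Civita symbol for 1-based indices i, j, k in {1, 2, 3}."""
    return (i - j) * (j - k) * (k - i) / 2

def delta(a, b):
    """Kronecker delta."""
    return 1.0 if a == b else 0.0

# Check epsilon_ijk == det of the matrix of Kronecker deltas for every i, j, k.
for i in range(1, 4):
    for j in range(1, 4):
        for k in range(1, 4):
            A = np.array([[delta(1, i), delta(1, j), delta(1, k)],
                          [delta(2, i), delta(2, j), delta(2, k)],
                          [delta(3, i), delta(3, j), delta(3, k)]])
            assert np.isclose(np.linalg.det(A), epsilon(i, j, k))

print("epsilon_ijk = det(delta matrix) checks out for all i, j, k")
[/code]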
 
  • #11
blink- said:
[itex]\mathrm{det}\begin{pmatrix} \delta_{1i} & \delta_{1j} & \delta_{1k} \\ \delta_{2i} & \delta_{2j} & \delta_{2k} \\ \delta_{3i} & \delta_{3j} & \delta_{3k} \end{pmatrix} = \mathrm{det}(\delta) = \varepsilon_{lmn}\delta_{il}\delta_{jm}\delta_{kn} = \varepsilon_{ijk} [/itex]

It's much easier when I stop trying to solve for the other side. I could have done this from the beginning but I was really trying to manipulate the LHS. Thanks for your help.

This isn't quite correct. [itex]A_{1l}[/itex] represents the [itex]l[/itex]th component along the 1st row of the matrix [itex]\mathbf{A}[/itex]. Using [itex]\mathbf{A}=\begin{pmatrix} \delta_{1i} & \delta_{1j} & \delta_{1k} \\ \delta_{2i} & \delta_{2j} & \delta_{2k} \\ \delta_{3i} & \delta_{3j} & \delta_{3k} \end{pmatrix}[/itex], you do not have [itex]A_{1l} = \delta_{il}[/itex]. You do however know that [itex]A_{l1}=\delta_{li}[/itex] (the [itex]l[/itex]th component along the 1st column), so you want to be sure to expand the determinant along the columns of your matrix instead of the rows. Of course [itex]\delta_{il}=\delta_{li}[/itex], so technically there is nothing incorrect in your equations, but your reasoning is not clear. When you expand the determinant along each column you have [itex]\det(\mathbf{A}) = \varepsilon_{lmn} A_{l1} A_{m2} A_{n3}[/itex].
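
Spelled out for this particular matrix, the column version of the argument reads

[tex]\det(\mathbf{A}) = \varepsilon_{lmn} A_{l1} A_{m2} A_{n3} = \varepsilon_{lmn}\,\delta_{li}\,\delta_{mj}\,\delta_{nk} = \varepsilon_{ijk},[/tex]

with the Kronecker deltas simply renaming the dummy indices [itex]l\rightarrow i[/itex], [itex]m\rightarrow j[/itex], [itex]n\rightarrow k[/itex].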
 

Related to Levi Civita (Permutation) Symbol Proof

1. What is the Levi Civita symbol proof?

The Levi Civita symbol proof is a mathematical proof used to show the properties and relationships of the Levi Civita symbol, also known as the permutation symbol. This symbol is used in vector calculus and differential geometry to represent the sign of a permutation of a set of numbers.

2. What are the properties of the Levi Civita symbol?

The Levi Civita symbol has three main properties: it is completely antisymmetric, it takes the value of +1 for an even permutation and -1 for an odd permutation, and it takes the value of 0 if any two indices are equal.

3. How is the Levi Civita symbol used in vector calculus?

In vector calculus, the Levi Civita symbol is used to define the cross product between two vectors. It is also used in the definition of the curl and divergence of a vector field. The symbol is helpful in simplifying vector equations and expressing them in a concise form.
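
As an illustration (not tied to any particular textbook), the component form [itex](\vec{a}\times\vec{b})_i = \varepsilon_{ijk} a_j b_k[/itex] can be checked against NumPy's built-in cross product; the example vectors below are arbitrary:

[code]
import numpy as np

def epsilon(i, j, k):
    """Levi-Civita symbol for 1-based indices i, j, k in {1, 2, 3}."""
    return (i - j) * (j - k) * (k - i) / 2

def cross_via_epsilon(a, b):
    """(a x b)_i = sum over j, k of epsilon_ijk a_j b_k (shifted to 0-based for NumPy)."""
    return np.array([sum(epsilon(i, j, k) * a[j - 1] * b[k - 1]
                         for j in range(1, 4) for k in range(1, 4))
                     for i in range(1, 4)])

# Arbitrary example vectors for the comparison.
a = np.array([1.0, 2.0, 3.0])
b = np.array([-4.0, 5.0, 0.5])

assert np.allclose(cross_via_epsilon(a, b), np.cross(a, b))
print(cross_via_epsilon(a, b))  # same result as np.cross(a, b)
[/code]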

4. Can the Levi Civita symbol proof be extended to higher dimensions?

Yes, the Levi Civita symbol proof can be extended to higher dimensions. In three dimensions, the symbol has three indices, but in higher dimensions, it can have more indices. The properties and relationships of the symbol remain the same, but the number of terms in the proof increases.
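
As a sketch of how the definition carries over to higher dimensions, the symbol is just the sign of the permutation formed by its indices; the function below is purely illustrative:

[code]
def levi_civita(*indices):
    """Sign of the permutation of 1..n given by the indices; 0 if they are not a permutation of 1..n (e.g. a repeated index)."""
    n = len(indices)
    if sorted(indices) != list(range(1, n + 1)):
        return 0
    # Count inversions: index pairs that appear out of order.
    inversions = sum(1 for a in range(n) for b in range(a + 1, n)
                     if indices[a] > indices[b])
    return -1 if inversions % 2 else 1

# Familiar 3D values, and the same rule applied in 4D.
print(levi_civita(1, 2, 3), levi_civita(2, 1, 3), levi_civita(1, 1, 2))  # 1 -1 0
print(levi_civita(1, 2, 3, 4), levi_civita(2, 1, 3, 4))                  # 1 -1
[/code]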

5. Are there any practical applications of the Levi Civita symbol proof?

Yes, the Levi Civita symbol proof has practical applications in various fields of physics and engineering. It is used in electromagnetism, fluid mechanics, and quantum mechanics, to name a few. The symbol helps in simplifying and solving complex equations and is an essential tool in these fields.
