Transformation behavior of the gradient

In summary, the thread discusses the partial derivative of a vector's components in general coordinates and the motivation for defining the covariant derivative: the extra term that appears under a coordinate change prevents the plain partial derivative from being a tensor. The mistake in the opening post is switching the order of two partial derivatives taken with respect to different coordinate systems, which is not allowed; applying the chain rule correctly leaves a non-vanishing first term.
  • #1
joda80
Hi All,

I think I have confused myself ... perhaps you can tell me where my reasoning is wrong. The idea is that in general coordinates the partial derivative of a vector,
[tex]\frac{\partial A^i}{\partial x^j},[/tex]
is not a tensor because an additional term arises (which is the motivation for defining the covariant derivative).

However, when I do the math, that additional term always drops out, as demonstrated below. I'm pretty sure my math is wrong, but I don't see where. Maybe you can help out.

Here's my thinking:

[tex] \frac{\partial A^i}{\partial x^j} = \frac{\partial }{\partial x^j} \left[\frac{\partial x^i}{\partial a^r}\bar{A}^r \right].[/tex]

Applying the product rule,

[tex] \frac{\partial A^i}{\partial x^j} = \frac{\partial^2 x^i}{\partial x^j \partial a^r} \bar{A}^r
+ \frac{\partial x^i}{\partial a^r} \frac{\partial \bar{A}^r}{\partial x^j}.[/tex]

Switching the order of the partial derivatives in the first term results in

[tex] \frac{\partial A^i}{\partial x^j} = \frac{\partial}{\partial a^r}\left[ \frac{\partial x^i}{\partial x^j}\right] \bar{A}^r
+ \frac{\partial x^i}{\partial a^r} \frac{\partial \bar{A}^r}{\partial x^j},[/tex]

which we may write as

[tex] \frac{\partial A^i}{\partial x^j} = \frac{\partial}{\partial a^r}\left[\delta^i_j \right] \bar{A}^r + \frac{\partial x^i}{\partial a^r} \frac{\partial \bar{A}^r}{\partial x^j}.[/tex]

Since the unit tensor,

[tex]\delta^i_j[/tex]

is constant, its derivative is zero, so the first term vanishes and we get, after applying the chain rule to the second term,

[tex] \frac{\partial A^i}{\partial x^j} = \frac{\partial x^i}{\partial a^r}\frac{\partial a^q}{\partial x^j} \frac{\partial \bar{A}^r}{\partial a^q}, [/tex]

which would imply that the lhs actually is a second-order tensor, which I think is wrong -- but where did I make the mistake?

Thanks so much for your help!



Johannes
 
  • #2
Well, the problem is the term with the second derivative (the first term): that term shouldn't vanish, and it is precisely what prevents the expression from being a tensor.

Let's examine that term more explicitly. We know that the transformation of coordinates is ##x^i=x^i(a^r)##, i.e. each x^i is a function of the a coordinates. So, say we have a function y(x); what you have done in that first term is analogous to:

$$\frac{\partial}{\partial y}\left(\frac{\partial y}{\partial x}\right)= \frac{\partial}{\partial x}\left(\frac{\partial y}{\partial y}\right)=0$$

This is not kosher. You can't switch the derivatives around like that. Equality of mixed partial derivatives works for a multivariate function like y(x, z), where the derivatives in x and z can be exchanged; but we don't have functions like y(x, y), and we certainly can't trade a derivative in x for a derivative in y.

What you should have done is apply the chain rule:
$$\frac{\partial}{\partial x^j}=\frac{\partial a^q}{\partial x^j}\frac{\partial}{\partial a^q}$$

so that the first term becomes

$$\frac{\partial a^q}{\partial x^j}\frac{\partial^2 x^i}{\partial a^q \partial a^r}\bar{A}^r$$

which does not vanish in general.
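For a concrete check, here is a minimal SymPy sketch (an illustration, not part of the derivation above), taking the barred coordinates to be polar coordinates ##a = (r, \theta)## with ##x^1 = r\cos\theta##, ##x^2 = r\sin\theta##, and an arbitrary sample field ##\bar{A}^r = (1, 0)##:

```python
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
a = sp.Matrix([r, th])                        # barred coordinates a^r = (r, theta)
x = sp.Matrix([r*sp.cos(th), r*sp.sin(th)])   # x^i(a^r): the unbarred coordinates

J = x.jacobian(a)      # dx^i/da^r
Jinv = J.inv()         # da^q/dx^j

# Mixed second derivatives d^2 x^i / (da^q da^s): generically nonzero.
second = [[[sp.diff(x[i], a[q], a[s]) for s in range(2)]
           for q in range(2)] for i in range(2)]
print(second[0][1][1])   # d^2 x^1 / dtheta^2 = -r*cos(theta), not zero

# Inhomogeneous term (da^q/dx^j)(d^2 x^i / da^q da^s) Abar^s for the
# sample barred components Abar = (1, 0):
Abar = sp.Matrix([1, 0])
term = sp.zeros(2, 2)
for i in range(2):
    for j in range(2):
        term[i, j] = sp.simplify(sum(Jinv[q, j] * second[i][q][s] * Abar[s]
                                     for q in range(2) for s in range(2)))
print(term)   # nonzero, so dA^i/dx^j does not transform as a tensor
```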
 
  • #3
That makes sense … thanks for helping out!
 

Related to Transformation behavior of the gradient

What is the transformation behavior of the gradient?

The transformation behavior of the gradient refers to how the components of the gradient change when the coordinates of a system are changed. In other words, it describes how the gradient's components, and therefore its apparent direction and magnitude, are affected when the coordinate system is rotated, translated, or rescaled.

How is the gradient transformed when the coordinate system is rotated?

When the coordinate system is rotated, the components of the gradient are transformed by the same rotation matrix. The gradient as a geometric vector is unchanged: its direction and magnitude in space stay the same, but its components with respect to the new axes are different.
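For illustration, a minimal NumPy sketch (using an arbitrary example field f(x, y) = x² + y and an arbitrary rotation angle) showing that the new components are exactly the rotated old components:

```python
import numpy as np

# Hypothetical example field and its gradient in the original coordinates.
def f(x, y):
    return x**2 + y

def grad_f(x, y):
    return np.array([2.0*x, 1.0])

theta = 0.3                                    # arbitrary rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

p = np.array([1.2, -0.7])                      # a point, old coordinates
p_new = R @ p                                  # the same point, new coordinates

# In the rotated system the field is f_new(x') = f(R^T x'); differentiate it
# numerically (central differences) with respect to the new coordinates.
eps = 1e-6
g_new = np.array([(f(*(R.T @ (p_new + eps*e))) - f(*(R.T @ (p_new - eps*e)))) / (2*eps)
                  for e in np.eye(2)])

print(g_new)              # components in the new (rotated) coordinates
print(R @ grad_f(*p))     # same numbers: the components rotate, the vector does not change
```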

What happens to the gradient when the coordinate system is translated?

If the coordinate system is translated, the components of the gradient are unchanged, because the Jacobian of a translation is the identity. Only the coordinate labels of points shift, not the vector itself.
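A minimal sketch of the same kind (same example field as above), for a shift of the origin:

```python
import numpy as np

def f(x, y):                       # same example field as above
    return x**2 + y

c = np.array([5.0, -3.0])          # shift of the origin: x'^i = x^i + c^i
p = np.array([1.2, -0.7])
p_new = p + c                      # the same point, new coordinates

# In the shifted system the field is f_new(x') = f(x' - c); its gradient at
# p_new equals the gradient of f at p, component by component.
eps = 1e-6
g_old = np.array([(f(*(p + eps*e)) - f(*(p - eps*e))) / (2*eps) for e in np.eye(2)])
g_new = np.array([(f(*(p_new + eps*e - c)) - f(*(p_new - eps*e - c))) / (2*eps)
                  for e in np.eye(2)])
print(g_old, g_new)                # identical
```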

Does the gradient change when the coordinate system is scaled?

Yes, the gradient is affected by a rescaling of the coordinate system. Under a uniform rescaling x'^i = k x^i, each component of the gradient is divided by k, because the gradient transforms covariantly: its direction is unchanged, but its magnitude changes inversely with the coordinate scale factor.
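And a sketch for a uniform rescaling (same example field as above), showing the inverse scaling of the components:

```python
import numpy as np

def f(x, y):                       # same example field as above
    return x**2 + y

k = 1000.0                         # scale factor, e.g. metres -> millimetres: x'^i = k x^i
p = np.array([1.2, -0.7])
p_new = k * p

# In the rescaled system the field is f_new(x') = f(x'/k).
eps = 1e-4
g_old = np.array([(f(*(p + eps*e)) - f(*(p - eps*e))) / (2*eps) for e in np.eye(2)])
g_new = np.array([(f(*((p_new + eps*e) / k)) - f(*((p_new - eps*e) / k))) / (2*eps)
                  for e in np.eye(2)])
print(g_old)            # [2.4, 1.0]
print(k * g_new)        # same numbers: each component was divided by k
```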

Why is understanding the transformation behavior of the gradient important?

Understanding the transformation behavior of the gradient is important in many areas of science and engineering, particularly in vector calculus and optimization problems. It allows us to analyze and solve problems in different coordinate systems and understand how the gradient changes with respect to those transformations.
