Multivariate calculus problem: Calculating the gradient vector

In summary, the conversation covers computing the gradient of a two-variable function, differentiating a vector-valued function, and some confusion over whether and how the chain rule can be applied to differentiate their composition.
  • #1
squenshl
Homework Statement
Let ##U := \left\{(x,y)\in \mathbb{R}^2: xy\neq 0\right\}## and let ##f: U\to \mathbb{R}## be defined by
$$f(x,y) := (\log_{e}{(|x|)})^2+(\log_{e}{(|y|)})^2.$$


1. Calculate ##\nabla f(x,y)## at each point of ##U##.

2. Let ##\mathbf{r}: (0,1)\to \mathbb{R}^2## be defined by ##\mathbf{r}(t) := \left(e^{\sin{(t)}},e^{\cos{(t)}}\right).##
Calculate the derivative of ##\mathbf{r}## at each point of ##(0,1).##

3. Justify whether you can use the chain rule to calculate the derivative of ##f\circ \mathbf{r}.##
If it is justifiable, calculate the derivative of ##f\circ \mathbf{r}## using the chain rule.
Relevant Equations
None
1. We find the partial derivatives of ##f## with respect to ##x## and ##y##, using ##\frac{d}{dx}\ln{(|x|)} = \frac{1}{x}## for ##x\neq 0##, to get ##f_x = \frac{2\ln{(|x|)}}{x}## and ##f_y = \frac{2\ln{(|y|)}}{y}.## This makes the gradient vector
$$\nabla{f} = \begin{bmatrix}
f_x \\
f_y
\end{bmatrix} = \begin{bmatrix}
\frac{2\ln{(|x|)}}{x} \\
\frac{2\ln{(|y|)}}{y}
\end{bmatrix}.$$
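A quick sympy sanity check of these partials (a sketch restricted to ##x, y > 0## so that ##|x| = x## and ##|y| = y##; the negative case works the same way with ##\ln{(|x|)}##):

```python
import sympy as sp

# Restrict to x, y > 0 so that |x| = x and |y| = y (assumption for this check).
x, y = sp.symbols('x y', positive=True)

f = sp.log(x)**2 + sp.log(y)**2

# The differences should simplify to 0 if f_x = 2 ln(x)/x and f_y = 2 ln(y)/y.
print(sp.simplify(sp.diff(f, x) - 2*sp.log(x)/x))  # expected: 0
print(sp.simplify(sp.diff(f, y) - 2*sp.log(y)/y))  # expected: 0
```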

2. We have
$$\mathbf{r}'(t) = \left(\cos{(t)}e^{\sin{(t)}},-\sin{(t)}e^{\cos{(t)}}\right).$$
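And a short sympy check of this componentwise derivative:

```python
import sympy as sp

t = sp.symbols('t', real=True)

# r(t) = (e^{sin t}, e^{cos t}) on the interval (0, 1)
r = sp.Matrix([sp.exp(sp.sin(t)), sp.exp(sp.cos(t))])

# Componentwise derivative; should match (cos(t) e^{sin t}, -sin(t) e^{cos t}).
print(r.diff(t))
```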

After this I'm a little confused. Any help is appreciated.
 
  • #2
For (3), what does your version of the chain rule say?

If your multivariable calculus textbook is rigorous, it might also want you to show that ##U## is an open set (which is easily seen by a drawing).
 
  • #3
The derivative of ##\mathbf{r}## at each point of ##(0,1)##?
 
  • #4
squenshl said:
The derivative of ##\mathbf{r}## at each point of ##(0,1)##?
That confused me initially too. It would be clearer if it said "at each point in the open interval (0,1)."
 
  • #5
Okay, (1) and (2) are done.
So for (3), since ##e^{\sin{(t)}} > 0## and ##e^{\cos{(t)}} > 0##, ##f\circ \mathbf{r}(t) = \left(\ln{(e^{\sin{(t)}})}\right)^2+\left(\ln{(e^{\cos{(t)}})}\right)^2 = \sin^2{(t)}+\cos^2{(t)} = 1##, so the derivative is ##0##.
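As a cross-check, here is a short sympy sketch that computes ##\nabla f(\mathbf{r}(t))\cdot \mathbf{r}'(t)## directly, which is what the chain rule would give for ##(f\circ\mathbf{r})'(t)##:

```python
import sympy as sp

t = sp.symbols('t', real=True)

# Components of r(t); both are strictly positive, so |x| = x and |y| = y on the path.
x_t = sp.exp(sp.sin(t))
y_t = sp.exp(sp.cos(t))

# Gradient of f evaluated along the path: (2 ln|x|/x, 2 ln|y|/y) at (x_t, y_t).
grad_f = sp.Matrix([2*sp.log(x_t)/x_t, 2*sp.log(y_t)/y_t])

# r'(t)
r_prime = sp.Matrix([sp.diff(x_t, t), sp.diff(y_t, t)])

# Chain-rule expression grad f(r(t)) . r'(t); should simplify to 0.
print(sp.simplify(grad_f.dot(r_prime)))  # expected: 0
```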
 
  • #6
For #3, you need to calculate the derivative using the chain rule if it can be applied.
 
  • #7
Okay, then I’m lost. How do we justify whether the chain rule can be used?
 
  • #8
I'm sure the conditions are stated in your textbook or were covered in lecture. Look them up.
 

Related to Multivariate calculus problem: Calculating the gradient vector

1. What is a gradient vector in multivariate calculus?

The gradient vector of a multivariate function is the vector whose components are the function's partial derivatives with respect to each variable. At a given point it gives the direction of steepest ascent of the function, and its magnitude gives the rate of increase in that direction.

2. Why is calculating the gradient vector important?

Calculating the gradient vector is important because it locates candidate maxima and minima (critical points) of a multivariate function and gives the direction of steepest ascent. This information is useful in many applications, such as optimization problems in economics, physics, and engineering.

3. How do you calculate the gradient vector?

To calculate the gradient vector, you first find the partial derivatives of the function with respect to each of its variables. Then, you combine these derivatives into a vector, with each component representing the derivative with respect to a specific variable. The resulting vector is the gradient vector.
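For instance, a minimal sympy sketch using a hypothetical example function ##f(x,y) = x^2 y + \sin{(y)}## (chosen only for illustration):

```python
import sympy as sp

x, y = sp.symbols('x y')

# Hypothetical example function.
f = x**2 * y + sp.sin(y)

# Gradient = vector of partial derivatives.
grad_f = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])
print(grad_f)  # Matrix([[2*x*y], [x**2 + cos(y)]])
```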

4. What is the relationship between the gradient vector and the level curves of a function?

The gradient vector is perpendicular to the level curves of a function. The gradient points in the direction of steepest ascent, while a level curve consists of points where the function takes the same value, so the two are orthogonal wherever the gradient is nonzero.
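A small symbolic check of this orthogonality for the hypothetical example ##f(x,y) = x^2 + y^2##, whose level curves are circles of radius ##R##:

```python
import sympy as sp

t, R = sp.symbols('t R', positive=True)

# Level curve f = R^2 of f(x, y) = x^2 + y^2, parametrized as a circle.
x_t = R*sp.cos(t)
y_t = R*sp.sin(t)

# Tangent vector to the level curve.
tangent = sp.Matrix([sp.diff(x_t, t), sp.diff(y_t, t)])

# Gradient of f evaluated on the curve: (2x, 2y).
grad_f = sp.Matrix([2*x_t, 2*y_t])

# Perpendicular <=> the dot product is identically zero.
print(sp.simplify(grad_f.dot(tangent)))  # expected: 0
```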

5. Can the gradient vector be used to solve optimization problems?

Yes, the gradient vector can be used to solve optimization problems for a multivariate function. Setting the gradient vector equal to zero and solving for the variables gives the critical points of the function, which can then be evaluated and classified (for example with the second-derivative test) to determine maximum or minimum values.
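As an illustration, a minimal sympy sketch that finds and evaluates the critical point of the hypothetical function ##f(x,y) = x^2 + y^2 - 2x + 4y##:

```python
import sympy as sp

x, y = sp.symbols('x y')

# Hypothetical example function (illustration only).
f = x**2 + y**2 - 2*x + 4*y

# Critical points: solve grad f = 0.
critical = sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y])
print(critical)  # {x: 1, y: -2}

# Here f = (x - 1)**2 + (y + 2)**2 - 5, so this critical point is a minimum.
print(f.subs(critical))  # minimum value: -5
```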
