Calculating Functional Derivatives: Understanding Notation and Examples

In summary: the variable ##y## is a dummy variable, so the answer cannot depend on it. The functional ##F[f]## does not take ##f(x)## (the value of a function at a point) as an argument; it takes the function ##f## itself. It is therefore more instructive to write ##\delta_z(x) \equiv \delta(x-z)## and feed ##f + \epsilon \delta_z## to the functional, so that the functional derivative is defined by $$\frac{\delta F}{\delta f(z)} = \lim_{\epsilon \to 0} \frac{F[f+\epsilon \delta_z] - F[f]}{\epsilon}.$$
  • #1
BiGyElLoWhAt
If I understand what's going on (quite possibly I don't), I think my book is using bad (confusing) notation.

Homework Statement


As written: "Calculate ##\frac{\delta H[f]}{\delta f(z)} \ \text{where} \ H=\int G(x,y)f(y)dy##"

and ##\frac{\delta H[f]}{\delta f(z)}## is the functional derivative of H[f] with respect to f(y) at z (I think that's what it is).

Homework Equations


...

The Attempt at a Solution


So I think what's happening here is that we're concerned with how the value of H[f] changes with f(y) when y = z.

This gives rise to (changing notation slightly) ##\frac{dF[f]}{d f(y)} = \lim_{\epsilon \to 0} \frac{F[f(y) + \epsilon \delta (x-y)] - F[f(y)]}{\epsilon}##, where ##\delta(x-y)## is the Dirac delta function centered at ##y = x## (I believe this is so, please correct me, my book is not explicit). So this gives us our ##F[f + \epsilon] - F[f]## when ##y = x## and ##F[f] - F[f] = 0## when ##y## doesn't equal ##x##. If this is right, going through the example:

##\frac{d H[f]}{d f(z)} ##
##=\lim_{\epsilon \to 0}\frac{\int\left[G(x,y)\left(f(z) + \epsilon \delta (x-y)\right) - G(x,y)f(z)\right]dy}{\epsilon}##
##=\left\{
\begin{array}{ll}
-\int G(x,y)\,dy & \text{if } y=z \\
0 & \text{otherwise}
\end{array}\right.##

I tried to show my train of thought, please correct any misconceptions I have about this. I would also appreciate it if someone would check my answers. Apparently my book doesn't have the answers to its exercises.
 
  • #2
Your y is a dummy variable and your answer cannot depend on it. The functional ##F[f]## itself does not take ##f(x)## as an argument (i.e., the value of a function in a point), it takes ##f##, the function, as an argument. It is therefore more instructive to write ##\delta(x-z) \equiv \delta_z(x)## and give ##f + \epsilon \delta_z## as an argument to the functional. The definition of the functional derivative would then be
$$
\frac{\delta F}{\delta f(z)} = \lim_{\epsilon \to 0} \frac{F[f+\epsilon \delta_z] - F[f]}{\epsilon}.
$$
You will obtain
$$
F[f +\epsilon \delta_z] = \int G(x,y) (f(y) + \epsilon \delta_z(y)) dy.
$$
I will let you take it from there, but in the end it is not much stranger than having a sum
$$
S = \sum_m G_m x_m
$$
and asking for ##\partial S/\partial x_k##.
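Working that discrete analogue out explicitly (a quick check of the same logic):
$$
\frac{\partial S}{\partial x_k} = \sum_m G_m \frac{\partial x_m}{\partial x_k} = \sum_m G_m \delta_{mk} = G_k,
$$
with the Kronecker delta ##\delta_{mk}## playing exactly the role the Dirac delta plays in the continuous case.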
 
  • #3
I guess I'm missing what this delta is. Is it not the Dirac delta? If not, what is it?
 
  • #4
Is it perhaps just to represent a 2 dimensional epsilon? The product of epsilon and delta, that is. So small changes in f over a range of small changes in y?
 
  • #5
BiGyElLoWhAt said:
If I understand what's going on (quite possibly I don't), I think my book is using bad (confusing) notation.

Homework Statement


As written: "Calculate ##\frac{\delta H[f]}{\delta f(z)} \ \text{where} \ H=\int G(x,y)f(y)dy##"

and ##\frac{\delta H[f]}{\delta f(z)}## is the functional derivative of H[f] with respect to f(y) at z (I think that's what it is).

Homework Equations


...

The Attempt at a Solution


So I think what's happening here is that we're concerned with how the value of H[f] changes with f(y) when y = z.

This gives rise to (changing notation slightly) ##\frac{dF[f]}{d f(y)} = \lim_{\epsilon \to 0} \frac{F[f(y) + \epsilon \delta (x-y)] - F[f(y)]}{\epsilon}##, where ##\delta(x-y)## is the Dirac delta function centered at ##y = x## (I believe this is so, please correct me, my book is not explicit). So this gives us our ##F[f + \epsilon] - F[f]## when ##y = x## and ##F[f] - F[f] = 0## when ##y## doesn't equal ##x##. If this is right, going through the example:

##\frac{d H[f]}{d f(z)} ##
##=\lim_{\epsilon \to 0}\frac{\int\left[G(x,y)\left(f(z) + \epsilon \delta (x-y)\right) - G(x,y)f(z)\right]dy}{\epsilon}##
##=\left\{
\begin{array}{ll}
-\int G(x,y)\,dy & \text{if } y=z \\
0 & \text{otherwise}
\end{array}\right.##

I tried to show my train of thought, please correct any misconceptions I have about this. I would also appreciate it if someone would check my answers. Apparently my book doesn't have the answers to its exercises.

What is the definition of ##\delta H(f) / \delta f## that your book uses?
 
  • #6
BiGyElLoWhAt said:
I guess I'm missing what this delta is. Is it not the Dirac delta? If not, what is it?

It is the Dirac delta. But it must be seen as a function of an argument just as ##f## is a function of an argument and the functional takes functions as arguments, not function values. When you write ##\delta_z(x)## instead of ##\delta(z-x)##, it simply becomes more evident which of the variables should be used as the function parameter (not that it matters in this case, the delta is symmetric). The value ##F[f+\epsilon \delta_z]## is simply the value of the functional when its argument is the function ##f + \epsilon\delta_z##. Now, this function has a parameter ##z##, which is the reason that the resulting functional derivative is a function of ##z##.
 
  • #7
Ray Vickson said:
What is the definition of ##\delta H(f) / \delta f## that your book uses?
It shows the limit definition of a run-of-the-mill calc 1 derivative and then says:
"The derivative of the function tells you how the number returned by the function f(x) changes as you slightly change the number x that you feed into the 'machine'. In the same way, we can define a functional derivative of a functional F[f] as follows:
(insert limit definition from my first post, I don't feel like re-latexing it)
The functional derivative tells you how the number returned by the functional F[f(x)] changes as you slightly change the function f(x) that you feed into the 'machine' "

Orodruin said:
It is the Dirac delta. But it must be seen as a function of an argument just as ##f## is a function of an argument and the functional takes functions as arguments, not function values. When you write ##\delta_z(x)## instead of ##\delta(z-x)##, it simply becomes more evident which of the variables should be used as the function parameter (not that it matters in this case, the delta is symmetric). The value ##F[f+\epsilon \delta_z]## is simply the value of the functional when its argument is the function ##f + \epsilon\delta_z##. Now, this function has a parameter ##z##, which is the reason that the resulting functional derivative is a function of ##z##.

So, the Dirac delta is 1 at one point and 0 everywhere else, no? So unless y = z, ##\delta(y-z)## is 0, which gives you a value of 0 for the derivative (0/epsilon I would think would be zero). Are we not looking at numerical values of f when we're looking at how F changes with respect to f? I see that f is definitely a function, and F takes a function as an argument; but epsilon is a number, and so multiplying it by the Dirac delta gives us epsilon at the point ##\delta## is centered around and 0 everywhere else. So aren't we really only looking at changes of our function f at the point z (or wherever we're centering around)? Maybe I don't see the purpose of multiplying epsilon by the Dirac delta otherwise.
 
  • #8
Actually, not that I think it really matters, but the argument in the Dirac delta per my book would be y-x, not x-y as I have in my first post.
 
  • #9
BiGyElLoWhAt said:
So, the Dirac delta is 1 at one point and 0 everywhere else, no?

No. It has the properties that ##\delta(x-y) = 0## if ##x \neq y## and that ##\int \delta(x-y) dx = 1## as long as ##y## is in the integration domain.
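The property that does the work in these problems is the sifting identity: for any sufficiently well-behaved ##g##,
$$
\int \delta(x-y)\, g(x)\, dx = g(y).
$$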

BiGyElLoWhAt said:
Are we not looking at numerical values of f when we're looking at how F changes with respect to f?
Yes, but the thing you need to understand is that a functional is essentially a map from a function space to the real (or complex) numbers. As such, it makes no sense to write ##F[f(x)]##, as ##f(x)## is a number (the value of the function ##f## at the point ##x##) and not a function, and the functional ##F## needs a function as its argument. Now, in sloppy notation, you might write out the ##x## as a dummy variable, but your result is not dependent on this dummy variable.

Let us say you have functions on the interval ##I = [0,L]## and that ##f## is a function defined on that interval. The function ##g = f + \epsilon\delta_y## is also a function on the same interval, which for ##x \in I## takes the value ##g(x) = f(x) + \epsilon \delta(x-y)##.

If the functional ##F## is simply
$$
F[f] = \int_0^L f(x) dx,
$$
then it is quite apparent that this does not depend on any parameter ##x## - you might just as well have called the dummy variable ##x## something else.

If you compare with the discrete case that I also mentioned, ##x## corresponds to the summation variable ##m##, while ##f(x)## corresponds to the value of ##x_m##.
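As a quick check with the definition from post #2, this simple functional gives, for ##z \in (0,L)##,
$$
\frac{\delta F}{\delta f(z)} = \lim_{\epsilon \to 0} \frac{1}{\epsilon}\left[\int_0^L \left(f(x) + \epsilon \delta(x-z)\right) dx - \int_0^L f(x)\, dx\right] = \int_0^L \delta(x-z)\, dx = 1,
$$
which depends neither on the dummy variable ##x## nor, in this particular case, on ##z##.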

BiGyElLoWhAt said:
not that I think it really matters, but the argument in the Dirac delta per my book would be y-x, not x-y as I have in my first post.

The Dirac delta is symmetric so this is indeed irrelevant.
 
  • #10
So I guess my question now is this: how does f(x) (the argument taken by the functional) change? I see what you're saying about mapping from my f space to my F space.

If f(x) were to equal cos(x), are we looking at changes in f(x) such as epsilon = sin(x) - cos(x) or something similar, but for all possible "small changes" to our function?
So this would also include epsilon = constant, so that we look at the change from cos(x) to cos(x) + .001.

Perhaps this is my area of confusion. Is epsilon a number or a function itself?
 
  • #11
##\epsilon## is just a small number (in fact you send it to zero in the limit), just as usual. When considering the functional derivative, you are considering small (as dictated by ##\epsilon \to 0##) changes to the function, just as you consider small deviations from the function argument when you do partial derivatives of a normal function. If f(x) = cos(x), the functional F[f] will be a number, just as it will be a number if f(x) = cos(x) - sin(x) or f(x) = cos(x) + 100000. However, these are not small perturbations of the function, just as evaluating f(x) at x = 0 and at x = 10^65 is not a small change of the argument x.
 
  • #12
Ok.
I appreciate you bearing with me, by the way.

I am, however, still missing something. I'm not sure what that is, though.
 
  • #13
So epsilon is a number; however, when we alter our function by epsilon, epsilon must be multiplied by a function for our functional to take it as an argument. Am I right so far?

I see where the delta function would be the go-to function, as in if someone said pick the most obvious function to multiply epsilon by. Is there any sort of derivation for this? Or was it just the most useful function anyone could come up with?
 
  • #14
I would rather say that it is the most useful definition you can come up with. In a similar fashion you could ask why partial derivatives are defined in the way they are with respect to one coordinate at a time only and not like a more general directional derivative. The answer is that you can build the variation of the functional for any variation of the argument using the functional derivative, just as you can build the directional derivative using partial derivatives and the direction vector.
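Concretely, the claim is that for a small variation ##\delta f## of the argument,
$$
F[f + \delta f] - F[f] \approx \int \frac{\delta F}{\delta f(z)}\, \delta f(z)\, dz,
$$
in the same way that ##df = \sum_k (\partial f/\partial x_k)\, dx_k## builds the change of an ordinary function out of its partial derivatives and the components of the displacement.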
 
  • #15
Actually, if I remember correctly, in calc 3 we went through and very rigorously derived partial derivatives by seeing what happens with small changes w.r.t. each variable, and the limit definition for partials came out of it. I am curious if there is something like this for functional derivatives.
 
  • #16
But this is my point exactly. You define partial derivatives using what happens when you change one of the coordinates and you define the functional derivative as what happens when you change one of the function values.

Again, it helps seeing the function values as the coordinates and their argument as simply a label denoting which coordinate is intended.

In a separable space, it might have been an idea to define the functional derivative using a countable basis, but still I think it is perfectly fine as it is.
 
  • #17
Let me offer up a worked example to perhaps show my confusion.
##J [f] = \int [f (y)]^p \phi (y)dy##
The derivative w.r.t f(x) is given as
## \lim_{\epsilon \to 0} \frac{1}{\epsilon}[\int [f (y) + \epsilon \delta (y-x)]^p \phi (y)dy-\int [f (y)]^p\phi (y)dy]##
And they skip straight to the answer ##p [f(x)]^{p-1}\phi (x)##.
If we actually expand this out and let epsilon get close to zero, the ##f^p## terms cancel as expected, and all of the terms except the one linear in ##\epsilon\delta## go to zero, because they have a degree of epsilon^2 or higher and thus vanish when we cancel the epsilons and let epsilon go to zero. I guess the integral actually saves this one, because of the delta function having area 1, but what if J wasn't defined as an integral? Literally every worked example is an integrated functional. So the answer to my problem would be the case where p = 0, and so get rid of the piecewise conditions, and it's just G(x,y) everywhere. I still don't feel like I really understand exactly what's going on, though.
 
  • #18
I hope that rendered properly. My app won't display LaTeX for some reason; I don't know if I need a third-party app for LaTeX code or what. Just got it yesterday.
 
  • #19
BiGyElLoWhAt said:
Let me offer up a worked example to perhaps show my confusion.
##J [f] = \int [f (y)]^p \phi (y)dy##
The derivative w.r.t f(x) is given as
## \lim_{\epsilon \to 0} \frac{1}{\epsilon}[\int [f (y) + \epsilon \delta (y-x)]^p \phi (y)dy-\int [f (y)]^p\phi (y)dy]##
And they skip straight to the answer ##p [f(x)]^{p-1}\phi (x)##.
If we actually expand this out and let epsilon get close to zero, the ##f^p## terms cancel as expected, and all of the terms except the one linear in ##\epsilon\delta## go to zero, because they have a degree of epsilon^2 or higher and thus vanish when we cancel the epsilons and let epsilon go to zero. I guess the integral actually saves this one, because of the delta function having area 1, but what if J wasn't defined as an integral? Literally every worked example is an integrated functional. So the answer to my problem would be the case where p = 0, and so get rid of the piecewise conditions, and it's just G(x,y) everywhere. I still don't feel like I really understand exactly what's going on, though.

Actually, the integral is irrelevant, as I forgot we're looking at the derivative w.r.t. f(x), and the derivative doesn't include the integral (the integral is an operand).
So if we have, after canceling all the terms out in the numerator:

##\lim_{\epsilon\to 0} \frac{pf(x)^{p-1}\epsilon \delta(y-x)\phi(x)}{\epsilon}##
How does that delta function just go away? Unless we're only looking at how our functional J changes with small changes to the function f at the point our delta function is centered around? I hope that is a clear question.
 
  • #20
You had it right from the beginning. The integral is part of the functional and is used to perform the integration over the delta.
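Spelled out for the example from post #17, that step reads
$$
\frac{\delta J}{\delta f(x)} = \lim_{\epsilon \to 0} \frac{1}{\epsilon}\int \left[\left(f(y) + \epsilon\delta(y-x)\right)^p - f(y)^p\right]\phi(y)\, dy = \int p\, f(y)^{p-1}\, \delta(y-x)\, \phi(y)\, dy = p\, f(x)^{p-1}\phi(x),
$$
where the terms of order ##\epsilon^2## and higher drop out in the limit and the ##\delta## then performs the remaining integral.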
 
  • #21
Ok, so what would happen if there weren't an integral attached to J?
 
  • #22
You would get a delta function as the functional derivative. Note that what you differentiate must still be a functional. For example:
F[f] = f(0)
is a mapping from functions to real numbers (assuming f is a function from real numbers to real numbers) and so is a functional. Its functional derivative at ##z## would be
$$
\lim_{\epsilon \to 0}\frac{F[f+\epsilon\delta_z] - F[f]}{\epsilon} = \delta_z(0) = \delta(z).
$$
 
  • #23
Ok, so basically, minus the integral, the answer is what I posted a few posts back.
 
  • #24
What you showed in post #17 was correct and the "magic step" was simply using the ##\delta## to perform the integral.
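Applying the same two steps to the original ##H[f] = \int G(x,y) f(y)\, dy## (with ##x## held fixed) gives
$$
\frac{\delta H[f]}{\delta f(z)} = \lim_{\epsilon \to 0} \frac{1}{\epsilon}\int G(x,y)\, \epsilon\, \delta(y-z)\, dy = \int G(x,y)\, \delta(y-z)\, dy = G(x,z).
$$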
 
  • #25
Ok cool. Thanks again. I still need to work on this obviously, but at least I have a half-a**ed idea of what I'm doing
 
  • #26
I'm curious, and I want to work this problem for myself. As written, the definition of the functional seems like a transformation, not a functional. The latter should return a real, possibly complex, number. Your definition of H is an indefinite integral of the product of a kernel function, G[x,y], and a function of y, f[y], so it will return a function of x. For example, the Fourier transform has as its kernel Exp[-i x y] and it returns a function of y. It's not a functional.
 
  • #27
Mark Harder said:
I'm curious, and I want to work this problem for myself. As written, the definition of the functional seems like a transformation, not a functional. The latter should return a real, possibly complex, number. Your definition of H is an indefinite integral of the product of a kernel function, G[x,y], and a function of y, f[y], so it will return a function of x. For example, the Fourier transform has as its kernel Exp[-i x y] and it returns a function of y. It's not a functional.

By the use of notation, I would assume that G(x,y) is a Green's function for some differential equation. The H in itself is a functional for fixed x, just as it is a function (of x) for fixed f, and seen as a function of x it will be the solution to the DE with an inhomogeneity f(x).
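That is, assuming ##G## is the Green's function of some linear operator ##\mathcal{L}## (which is only a guess from the notation), applying ##\mathcal{L}## in the variable ##x## gives
$$
\mathcal{L}_x H(x) = \int \mathcal{L}_x G(x,y)\, f(y)\, dy = \int \delta(x-y)\, f(y)\, dy = f(x).
$$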
 
  • #28
Orodruin said:
By the use of notation, I would assume that G(x,y) is a Green's function for some differential equation. The H in itself is a functional for fixed x, just as it is a function (of x) for fixed f, and seen as a function of x it will be the solution to the DE with an inhomogeneity f(x).

OK, got that. And is the evaluation of the functional at some one value of z, together with the delta function that singles it out, the means by which the problem solution finds this single value of f?
 

Related to Calculating Functional Derivatives: Understanding Notation and Examples

1. What are functional derivatives?

Functional derivatives are a mathematical tool used in the calculus of variations to calculate the rate of change of a functional with respect to a function. They are similar to ordinary derivatives, but instead of measuring how a number changes when you change another number, they measure how the number returned by a functional changes when you change the function that is fed into it.

2. How are functional derivatives represented?

Functional derivatives are typically represented using the notation δF/δf, where F is the functional and f is the function. This notation is similar to the notation used for ordinary derivatives, but the δ symbol is used to represent the functional derivative instead of d.

3. What is the process of calculating functional derivatives?

The process of calculating functional derivatives involves perturbing the function at the point of interest and taking the limit of a difference quotient as the size of the perturbation approaches zero. This is done using the definition of the functional derivative, ##\delta F/\delta f(z) = \lim_{\epsilon \to 0} (F[f + \epsilon\delta_z] - F[f])/\epsilon##, where ##\delta_z## is a Dirac delta centered at the point ##z##.
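As a rough illustration (not from any particular textbook), here is a minimal numerical sketch of this process for the functional H[f] = ∫ G(x,y) f(y) dy discussed above, using a narrow Gaussian as a stand-in for the Dirac delta; the kernel, test function, grid, and widths are arbitrary choices for the example, and the result should come out close to the analytic answer G(x,z).

```python
import numpy as np

# Numerical sketch: approximate the functional derivative of
#   H[f] = integral of G(x, y) f(y) dy   (x held fixed)
# at a point z by perturbing f with eps times a narrow Gaussian
# (a "nascent" Dirac delta) and forming the difference quotient.
# The analytic answer is G(x, z).

y = np.linspace(0.0, 1.0, 20001)            # integration grid on [0, 1]
dy = y[1] - y[0]
x, z = 0.3, 0.7                             # fixed x; point z where we differentiate

G = lambda a, b: np.exp(-(a - b) ** 2)      # example kernel (arbitrary choice)
f = np.cos(2 * np.pi * y)                   # example function (arbitrary choice)

def H(f_values):
    """H[f] = integral of G(x, y) f(y) dy, evaluated with a simple Riemann sum."""
    return np.sum(G(x, y) * f_values) * dy

# Narrow Gaussian standing in for delta(y - z): unit area, width sigma.
sigma = 1e-3
delta_z = np.exp(-(y - z) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

eps = 1e-6
approx = (H(f + eps * delta_z) - H(f)) / eps    # difference quotient
exact = G(x, z)

print(approx, exact)   # both close to exp(-0.16) ≈ 0.852
```

Because H is linear in f, the difference quotient here is exact in epsilon; the only discrepancy comes from the finite width of the Gaussian and the quadrature.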

4. How are functional derivatives used in real-world applications?

Functional derivatives have various applications in physics, engineering, and economics. They are used to determine critical points of functionals, which can then be used to optimize various systems. They also appear throughout classical and quantum field theory, for example when deriving equations of motion from an action.

5. What are some common mistakes when working with functional derivatives?

Some common mistakes when working with functional derivatives include forgetting to take the limit of the difference quotient, not using the correct notation for the functional derivative, and not fully understanding the underlying concepts of calculus of variations. It is important to carefully follow the steps involved in calculating functional derivatives and to have a solid understanding of the notation and concepts involved.
