How should I find the nontrivial stationary paths?

  • #1
Math100
Homework Statement
Consider the functional ## S[y]=\alpha y(1)^2+\int_{0}^{1}\beta y'^2dx, y(0)=0 ##, with a natural boundary condition at ## x=1 ## and subject to the constraint ## C[y]=\gamma y(1)^2+\int_{0}^{1}w(x)y^2dx=1 ##, where ## \alpha, \beta ## and ## \gamma ## are nonzero constants.
a) Show that the stationary paths of this system satisfy the Euler-Lagrange equation ## \beta\frac{d^2y}{dx^2}+\lambda w(x)y=0, y(0)=0, (\alpha-\gamma\lambda)y(1)+\beta y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
b) Let ## w(x)=1 ## and ## \alpha=\beta=\gamma=1 ##. Find the nontrivial stationary paths, stating clearly the eigenfunctions ## y ## (normalized so that ## C[y]=1 ##) and the values of the associated Lagrange multiplier.
Relevant Equations
None.
a) Proof:
Let ## \lambda ## be the Lagrange multiplier.
Then the auxiliary functional is ## \overline{S}[y]=\alpha y(1)^2+\int_{0}^{1}\beta y'^2dx-\lambda (\gamma y(1)^2+\int_{0}^{1}w(x)y^2dx-1) ##.
This gives ## \overline{S}[y+\epsilon h]=\alpha (y(1)+\epsilon h(1))^2+\int_{0}^{1}\beta (y'+\epsilon h')^2dx-\lambda (\gamma(y(1)+\epsilon h(1))^2+\int_{0}^{1}w(x)(y+\epsilon h)^2dx-1) ##, where ## y+\epsilon h ## is an admissible perturbation, so that ## h(0)=0 ##.
Note that the Gateaux differential ## \triangle\overline{S}[y, h] ## is given by ## \frac{d}{d\epsilon}\overline{S}[y+\epsilon h]\vert_{\epsilon=0} ##.
Thus ## \frac{d}{d\epsilon}\overline{S}[y+\epsilon h]\vert_{\epsilon=0}=2\alpha y(1)h(1)+2\int_{0}^{1}\beta y'h'dx-2\lambda (\gamma y(1)h(1)+\int_{0}^{1}wyhdx) ##.

From here, how should I show that the stationary paths of this system satisfy the given Euler-Lagrange equation?

b) Let ## w(x)=1 ## and ## \alpha=\beta=\gamma=1 ##.
Consider the Euler-Lagrange equation ## \beta\frac{d^2y}{dx^2}+\lambda w(x)y=0, y(0)=0, (\alpha-\gamma\lambda)y(1)+\beta y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
Then we have ## \frac{d^2y}{dx^2}+\lambda y=0, y(0)=0, (1-\lambda)y(1)+y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
This gives ## y=c_{1}\sin(\sqrt{\lambda}x)+c_{2}\cos(\sqrt{\lambda}x) ##.

From here, how should I find the nontrivial stationary paths?
 
  • #2
Math100 said:
Homework Statement: Consider the functional ## S[y]=\alpha y(1)^2+\int_{0}^{1}\beta y'^2dx, y(0)=0 ##, with a natural boundary condition at ## x=1 ## and subject to the constraint ## C[y]=\gamma y(1)^2+\int_{0}^{1}w(x)y^2dx=1 ##, where ## \alpha, \beta ## and ## \gamma ## are nonzero constants.
a) Show that the stationary paths of this system satisfy the Euler-Lagrange equation ## \beta\frac{d^2y}{dx^2}+\lambda w(x)y=0, y(0)=0, (\alpha-\gamma\lambda)y(1)+\beta y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
b) Let ## w(x)=1 ## and ## \alpha=\beta=\gamma=1 ##. Find the nontrivial stationary paths, stating clearly the eigenfunctions ## y ## (normalized so that ## C[y]=1 ##) and the values of the associated Lagrange multiplier.
Relevant Equations: None.

a) Proof:
Let ## \lambda ## be the Lagrange multiplier.
Then the auxiliary functional is ## \overline{S}[y]=\alpha y(1)^2+\int_{0}^{1}\beta y'^2dx-\lambda (\gamma y(1)^2+\int_{0}^{1}w(x)y^2dx-1) ##.
This gives ## \overline{S}[y+\epsilon h]=\alpha (y(1)+\epsilon h(1))^2+\int_{0}^{1}\beta (y'+\epsilon h')^2dx-\lambda (\gamma(y(1)+\epsilon h(1))^2+\int_{0}^{1}w(x)(y+\epsilon h)^2dx-1) ##, where ## y+\epsilon h ## is an admissible perturbation, so that ## h(0)=0 ##.
Note that the Gateaux differential ## \triangle\overline{S}[y, h] ## is given by ## \frac{d}{d\epsilon}\overline{S}[y+\epsilon h]\vert_{\epsilon=0} ##.
Thus ## \frac{d}{d\epsilon}\overline{S}[y+\epsilon h]\vert_{\epsilon=0}=2\alpha y(1)h(1)+2\int_{0}^{1}\beta y'h'dx-2\lambda (\gamma y(1)h(1)+\int_{0}^{1}wyhdx) ##.

From here, how should I show that the stationary paths of this system satisfy the given Euler-Lagrange equation?

Assuming your work is correct, you have [tex]
(\alpha - \gamma\lambda)y(1)h(1) + \int_0^1 \beta y'h' - \lambda w y h \,dx = 0.[/tex] What is the next step in all of these problems? Integrate [itex]y'h'[/itex] by parts.
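
In case it helps to see where that leads, here is a sketch of that step (using only ##h(0)=0##, with ##h(1)## left free): [tex]
\int_0^1 \beta y'h'\,dx = \beta y'(1)h(1) - \int_0^1 \beta y''h\,dx,[/tex] so the stationarity condition becomes [tex]
\left[(\alpha - \gamma\lambda)y(1) + \beta y'(1)\right]h(1) - \int_0^1 \left(\beta y'' + \lambda w y\right)h\,dx = 0.[/tex] Since this must hold for every admissible ##h## (first for ##h## vanishing at ##x=1##, then for ##h(1)## arbitrary), both the integrand factor and the boundary factor must vanish, which is exactly the stated Euler-Lagrange equation together with the natural boundary condition.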

b) Let ## w(x)=1 ## and ## \alpha=\beta=\gamma=1 ##.
Consider the Euler-Lagrange equation ## \beta\frac{d^2y}{dx^2}+\lambda w(x)y=0, y(0)=0, (\alpha-\gamma\lambda)y(1)+\beta y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
Then we have ## \frac{d^2y}{dx^2}+\lambda y=0, y(0)=0, (1-\lambda)y(1)+y'(1)=0 ##, where ## \lambda ## is a Lagrange multiplier.
This gives ## y=c_{1}\sin(\sqrt{\lambda}x)+c_{2}\cos(\sqrt{\lambda}x) ##.

This assumes [itex]\lambda \neq 0[/itex]. What happens if [itex]\lambda = 0[/itex]? Do you get a non-zero solution for [itex]y[/itex]?

For [itex]\lambda = k^2 > 0[/itex], you know from the condition [itex]y(0) = 0[/itex] that [itex]c_2 = 0[/itex]. That leaves you with the condition at [itex]y(1)[/itex], which takes the form [tex]
c_1f(k) = 0.[/tex] We know that [itex]c_1 \neq 0[/itex], so that requires [itex]f(k) = 0[/itex]. The condition [itex]C[c_1\sin kx] = 1[/itex] then gives you [itex]c_1[/itex] in terms of [itex]k[/itex].
 
  • #3
pasmith said:
Assuming your work is correct, you have [tex]
(\alpha - \gamma\lambda)y(1)h(1) + \int_0^1 \beta y'h' - \lambda w y h \,dx = 0.[/tex] What is the next step in all of these problems? Integrate [itex]y'h'[/itex] by parts.



This assumes [itex]\lambda \neq 0[/itex]. What happens if [itex]\lambda = 0[/itex]? Do you get a non-zero solution for [itex]y[/itex]?

For [itex]\lambda = k^2 > 0[/itex], you know from the condition [itex]y(0) = 0[/itex] that [itex]c_2 = 0[/itex]. That leaves you with the condition at [itex]y(1)[/itex], which takes the form [tex]
c_1f(k) = 0.[/tex] We know that [itex]c_1 \neq 0[/itex], so that requires [itex]f(k) = 0[/itex]. The condition [itex]C[c_1\sin kx] = 1[/itex] then gives you [itex]c_1[/itex] in terms of [itex]k[/itex].
So for part b), I've got ## y=A\sin(\sqrt{\lambda}x)+B\cos(\sqrt{\lambda}x) ##, where ## A, B ## are constants. The condition ## y(0)=0 ## gives ## B=0 ## and the boundary condition at ## x=1 ## gives ## y(1)=0\implies 0=A\sin(\sqrt{\lambda})\implies \sin(\sqrt{\lambda})=0 ## since ## A\neq 0 ##. This means ## \sqrt{\lambda}=n\pi\implies y=A\sin(n\pi x) ##.
Thus, the constraint gives ## 1=\int_{0}^{1}[A\sin(n\pi x)]^2dx\implies 1=A^2\int_{0}^{1}\sin^2(n\pi x)dx\implies A=\sqrt{2} ##.
Hence, ## y=\sqrt{2}\sin(\sqrt{\lambda}x) ##.
Is this the correct stationary path?
 
  • #4
Try again. The condition at [itex]x = 1[/itex] is [tex]
(1 - \lambda)y(1) + y'(1) = 0.[/tex]
 
  • #5
pasmith said:
Try again. The condition at [itex]x = 1[/itex] is [tex]
(1 - \lambda)y(1) + y'(1) = 0.[/tex]
I still don't get this one. How does ## (1-\lambda)y(1)+y'(1)=0 ## determine the other constant ## A ##?
 
  • #6
You are dealing with an eigenvalue problem. The condition at 1 tells you that either [itex]A = 0[/itex], which is the trivial solution, or else [itex]\lambda[/itex] must satisfy a certain condition. Then the constraint [itex]C[y] = 1[/itex] determines [itex]A[/itex].
 
  • #7
pasmith said:
You are dealing with an eigenvalue problem. The condition at 1 tells you that either [itex]A = 0[/itex], which is the trivial solution, or else [itex]\lambda[/itex] must satisfy a certain condition. Then the constraint [itex]C[y] = 1[/itex] determines [itex]A[/itex].
How can ## \lambda ## satisfy a certain condition? And how do I find the constant ## A ##?
 
  • #8
pasmith said:
You are dealing with an eigenvalue problem. The condition at 1 tells you that either [itex]A = 0[/itex], which is the trivial solution, or else [itex]\lambda[/itex] must satisfy a certain condition. Then the constraint [itex]C[y] = 1[/itex] determines [itex]A[/itex].
Since the constant ## B=0 ##, we have ## y=A\sin(\sqrt{\lambda}x) ## and using the boundary condition at ## x=1 ## gives ## (1-\lambda)y(1)+y'(1)=0\implies (1-\lambda)A\sin(\sqrt{\lambda})+A\sqrt{\lambda}\cos(\sqrt{\lambda})=0 ##. But then this means ## \sin(\sqrt{\lambda})=0\implies \sqrt{\lambda}=n\pi ## for ## n\neq 0 ## and ## \cos(\sqrt{\lambda})=0\implies \sqrt{\lambda}=(n+\frac{1}{2})\pi ## for some ## n\in\mathbb{Z} ##. What's wrong here?
 
  • #9
Math100 said:
Since the constant ## B=0 ##, we have ## y=A\sin(\sqrt{\lambda}x) ## and using the boundary condition at ## x=1 ## gives ## (1-\lambda)y(1)+y'(1)=0\implies (1-\lambda)A\sin(\sqrt{\lambda})+A\sqrt{\lambda}\cos(\sqrt{\lambda})=0 ##. But then this means ## \sin(\sqrt{\lambda})=0\implies \sqrt{\lambda}=n\pi ## for ## n\neq 0 ## and ## \cos(\sqrt{\lambda})=0\implies \sqrt{\lambda}=(n+\frac{1}{2})\pi ## for some ## n\in\mathbb{Z} ##. What's wrong here?
What's wrong is your assumption that the ##\sin## and ##\cos## terms must vanish individually. Following the suggestion of @pasmith, set ##\lambda=k^2## and write your boundary (eigenvalue) condition as ##\left(1-k^{2}\right)\sin k+k\cos k=0##. Beyond the trivial solution ##k=0##, a plot of the function on the left-hand side suggests that the condition has an infinity of roots, the first few of which are (using Mathematica): ##k_1=1.20779,k_2=3.44824,k_3=6.44095,k_4=9.53048,k_5=12.6458##. The squares of these are the first five allowed values of the Lagrange multiplier ##\lambda## in part b) of your variational problem. All that remains now is to plug your eigensolutions ##y_\lambda (x)## into ##C[y_\lambda]=1## and calculate the normalization factors ##A_\lambda##.
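
If you want to reproduce those roots without Mathematica, a minimal sketch in Python with NumPy/SciPy is below (the scan range and step size are assumptions; widen or refine them if a root seems to be missed):
[code]
import numpy as np
from scipy.optimize import brentq

# Eigenvalue condition at x = 1 with alpha = beta = gamma = 1, w = 1, lambda = k^2
def f(k):
    return (1.0 - k**2) * np.sin(k) + k * np.cos(k)

# Scan k > 0 for sign changes, then refine each bracketing interval with Brent's method.
ks = np.linspace(0.01, 16.0, 4000)
roots = [brentq(f, a, b) for a, b in zip(ks[:-1], ks[1:]) if f(a) * f(b) < 0]

print([round(k, 5) for k in roots[:5]])      # k_n: ~1.20779, 3.44824, 6.44095, 9.53048, 12.6458
print([round(k**2, 4) for k in roots[:5]])   # corresponding Lagrange multipliers lambda_n = k_n^2
[/code]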
 
  • #10
renormalize said:
What's wrong is your assumption that the ##\sin## and ##\cos## terms must vanish individually. Following the suggestion of @pasmith, set ##\lambda=k^2## and write your boundary (eigenvalue) condition as ##\left(1-k^{2}\right)\sin k+k\cos k=0##. Beyond the trivial solution ##k=0##, a plot of the function on the left-hand side suggests that the condition has an infinity of roots, the first few of which are (using Mathematica): ##k_1=1.20779,k_2=3.44824,k_3=6.44095,k_4=9.53048,k_5=12.6458##. The squares of these are the first five allowed values of the Lagrange multiplier ##\lambda## in part b) of your variational problem. All that remains now is to plug your eigensolutions ##y_\lambda (x)## into ##C[y_\lambda]=1## and calculate the normalization factors ##A_\lambda##.
I don't understand. If the condition has an infinity of roots, then how are we supposed to plug those ## k ## values into the eigensolutions ##y_\lambda (x)## and then into ##C[y_\lambda]=1## in order to find the constant ## A ##?
 
  • #11
Math100 said:
I don't understand. If the condition has an infinity of roots, then how are we supposed to plug those ## k ## values into the eigensolutions ##y_\lambda (x)## and then into ##C[y_\lambda]=1## in order to find the constant ## A ##?
It's exactly analogous to what you would do for the simple boundary/eigenvalue condition ##\sin k_n=0##. Of course, for that case you know that the infinity of roots are explicitly given by ##k_n=n\pi##, where ##n## is any natural number, and you use those values to express the normalization ##A_n## as a function of ##n\pi##. But what if you didn't know that explicit solution? You'd simply replace ##n\pi## in ##A_n## by ##k_n##, along with the statement that the allowed eigenvalues ##k_n## are the roots of ##\sin k_n=0##. (And perhaps display one or more of the eigenvalues that you find numerically.) Just do the same thing for your variational problem.
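
Concretely, as a sketch of that last step (worth double-checking the algebra yourself): with ##y_k(x)=A_k\sin(kx)##, ##w=1## and ##\gamma=1##, the constraint reads [tex]
C[y_k] = A_k^2\sin^2 k + A_k^2\int_0^1 \sin^2(kx)\,dx = A_k^2\left(\sin^2 k + \frac{1}{2} - \frac{\sin 2k}{4k}\right) = 1,[/tex] so [tex]
A_k = \left(\sin^2 k + \frac{1}{2} - \frac{\sin 2k}{4k}\right)^{-1/2},[/tex] where ##k## is any positive root of ##\left(1-k^{2}\right)\sin k + k\cos k = 0##.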
 

Similar threads

  • Calculus and Beyond Homework Help
Replies
6
Views
865
  • Calculus and Beyond Homework Help
Replies
2
Views
553
  • Calculus and Beyond Homework Help
Replies
18
Views
1K
  • Calculus and Beyond Homework Help
Replies
9
Views
555
  • Calculus and Beyond Homework Help
Replies
2
Views
470
  • Calculus and Beyond Homework Help
Replies
5
Views
630
  • Calculus and Beyond Homework Help
Replies
2
Views
722
  • Calculus and Beyond Homework Help
Replies
4
Views
700
Replies
1
Views
637
  • Calculus and Beyond Homework Help
Replies
1
Views
711
Back
Top