# Laplace Transforms (proofs of)

#### DreamWeaver

##### Well-known member
I should state, from the outset, that this tutorial is NOT going to go into any great detail about the theory and applications of Laplace transforms. Some of the aforementioned will be discussed in a cursory way, but the aim here is merely to provide a selection of proofs for common transforms. That said, on with the show...

Throughout, I will use the notation

$$\displaystyle F(w) = \mathfrak{L}(f) = \int_{0}^{\infty} e^{-wx}f(x)\, dx$$

and

$$\displaystyle f(x) = \mathfrak{L}^{-1}(F)$$

NOTE:

In the field of Laplace transforms, it is standard practice to use "s" rather than the "w" I will use here. Consider the following expressions of the same transform:

$$\displaystyle F(w) = \mathfrak{L}(f) = \int_{0}^{\infty} e^{-wx}f(x)\, dx$$

$$\displaystyle F(s) = \mathfrak{L}(f) = \int_{0}^{\infty} e^{-sx}f(x)\, dx$$

Personally, I find the superscripted "s" in the second example far less legible than the equivalent "w" in the first; hence my use of "w" throughout this thread.

----------------
Proposition 01:
----------------

Let $$\displaystyle f(x)=1$$ (the unit constant), then

$$\displaystyle {\color{BrickRed} \mathfrak{L}(1) = \int_0^{\infty} e^{-wx}dx = \frac{1}{w}}$$

Proof:

$$\displaystyle \mathfrak{L}(1) = \int_0^{\infty} e^{-wx}dx = -\frac{1}{w}\, e^{-wx}\, \Bigg|_0^{\infty}$$

Hence when $$\displaystyle w>0$$ this becomes

$$\displaystyle -\frac{1}{w}\, \left[ \lim_{x\to \infty} e^{-wx} - e^{0}\right] = -\frac{1}{w}\, \left[ 0 - 1 \right] = \frac{1}{w}$$

As was to be shown. $$\displaystyle \Box$$
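As a quick numerical sanity check (my own addition, not part of the proof), the result is easy to verify in Python with a simple composite Simpson quadrature; the helper `laplace_num` and the sample values of $$\displaystyle w$$ are arbitrary choices:

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    """Composite-Simpson approximation of F(w) = ∫_0^upper e^{-wx} f(x) dx.
    For the decaying integrands checked here, the tail beyond `upper`
    is negligible."""
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

# Proposition 01: L(1) = 1/w, valid for w > 0
for w in (0.5, 1.0, 3.0):
    assert abs(laplace_num(lambda x: 1.0, w) - 1.0 / w) < 1e-6
print("Proposition 01 verified numerically")
```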

----------------
Proposition 02:
----------------

Linearity of the Laplace transform: Let $$\displaystyle a$$ and $$\displaystyle b$$ be scalar constants, and $$\displaystyle f$$ and $$\displaystyle g$$ be functions of the variable $$\displaystyle x$$, then:

$$\displaystyle {\color{BrickRed}\mathfrak{L}(a\, f +b\, g) = a\, \mathfrak{L}(f) + b\, \mathfrak{L}(g)}$$

This follows directly from the linearity of integrals:

$$\displaystyle \int (a\, f(x) + b\, g(x) )\, dx = a\, \int f(x)\, dx + b\, \int g(x)\, dx. \, \Box$$

----------------
Proposition 03:
----------------

Let $$\displaystyle f(x)=e^{ax}$$ for $$\displaystyle x \ge 0$$, and suppose $$\displaystyle w-a>0$$; then:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(e^{ax}) = \frac{1}{w-a}}$$

Proof:

$$\displaystyle \mathfrak{L}(e^{ax}) = \int_0^{\infty} e^{ax}e^{-wx}\, dx = \int_0^{\infty} e^{-(w-a)x}\, dx=$$

$$\displaystyle -\frac{1}{(w-a)}e^{-(w-a)x}\, \Bigg|_0^{\infty} = \frac{1}{w-a}$$

Provided $$\displaystyle w-a > 0$$. The proposition is now proved. $$\displaystyle \Box$$
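Again, for anyone who likes to double-check numerically (this snippet and its sample parameter values are my own additions):

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    # Composite Simpson rule for F(w) = ∫_0^upper e^{-wx} f(x) dx;
    # the tail beyond `upper` is negligible for these decaying integrands.
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

# Proposition 03: L(e^{ax}) = 1/(w - a), valid when w - a > 0
for a, w in ((1.0, 3.0), (-0.5, 1.0)):
    approx = laplace_num(lambda x, a=a: math.exp(a * x), w)
    assert abs(approx - 1.0 / (w - a)) < 1e-6
print("Proposition 03 verified numerically")
```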

----------------
Proposition 04:
----------------

Let $$\displaystyle a \in \mathbb{R}^{+}$$, $$\displaystyle n\in\mathbb{N}$$, and $$\displaystyle w>0$$, then the following Laplace transforms hold:

$$\displaystyle {\color{BrickRed}\mathfrak{L}(x^n) = \frac{n!}{w^{n+1}} }$$

$$\displaystyle {\color{BrickRed}\mathfrak{L}(x^a) = \frac{\Gamma(a+1)}{w^{a+1}} }$$

Proof:

We begin with the integral definition of the Euler Gamma function $$\displaystyle \Gamma(x)$$:

$$\displaystyle \Gamma(x) = \int_0^{\infty}e^{-t}t^{x-1}\, dt$$

Replacing $$\displaystyle x$$ with $$\displaystyle x+1$$ and then performing an integration by parts, we obtain:

$$\displaystyle \Gamma(x+1) = \int_0^{\infty}e^{-t}t^{x}\, dt =$$

$$\displaystyle -e^{-t}t^x\, \Bigg|_0^{\infty} + x\, \int_0^{\infty}e^{-t}t^{x-1}\, dt$$

The boundary term vanishes at both limits (for $$\displaystyle x>0$$), while the remaining integral is, by the definition of the Euler Gamma function, equal to $$\displaystyle x\, \Gamma(x)$$.

Hence

$$\displaystyle \Gamma(1+x) = x\, \Gamma(x)$$

Next, set x=1 in the integral representation of the Euler Gamma function:

$$\displaystyle \Gamma(1) = \int_0^{\infty}e^{-t}\, dt = -e^{-t}\, \Bigg|_0^{\infty} = 1$$

So

$$\displaystyle \Gamma(1) =1$$

$$\displaystyle \Gamma(2) = 1\, \Gamma(1) = 1$$

$$\displaystyle \Gamma(3) = 2\, \Gamma(2) = 2$$

$$\displaystyle \Gamma(4) = 3\, \Gamma(3) = 6$$

And more generally,

$$\displaystyle \Gamma(m+1) = m!$$

Next, assume that $$\displaystyle a>0$$, as per part 2 of proposition 4. Then

$$\displaystyle \mathfrak{L}(x^a) = \int_0^{\infty}e^{-wx}x^a\,dx$$

Apply the substitution $$\displaystyle t=wx, dt=wdx\, \Rightarrow$$

$$\displaystyle \mathfrak{L}(x^a) = \frac{1}{w}\, \int_0^{\infty} e^{-t}\left(\frac{t}{w}\right)^a\, dt = \frac{1}{w^{a+1}}\, \int_0^{\infty}e^{-t}t^{(a+1)-1}\, dt = \frac{\Gamma(a+1)}{w^{a+1}}$$

This proves part 2 of proposition 4. Replacing the positive, real number $$\displaystyle a$$ with the natural number $$\displaystyle n\in\mathbb{N}$$, and then using $$\displaystyle \Gamma(n+1)=n!$$ proves the first part. $$\displaystyle \Box$$
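Both parts can be spot-checked numerically; the following snippet (my own addition, with arbitrary sample values of $$\displaystyle n, a, w$$) compares a Simpson-rule approximation of the transform against the closed forms:

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    # Composite Simpson rule for F(w) = ∫_0^upper e^{-wx} f(x) dx
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

# Part 1: L(x^n) = n!/w^{n+1}
n_, w = 3, 2.0
assert abs(laplace_num(lambda x: x ** n_, w)
           - math.factorial(n_) / w ** (n_ + 1)) < 1e-5

# Part 2: L(x^a) = Γ(a+1)/w^{a+1} for real a > 0
a, w = 2.5, 1.5
assert abs(laplace_num(lambda x: x ** a, w)
           - math.gamma(a + 1) / w ** (a + 1)) < 1e-5
print("Proposition 04 verified numerically")
```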

----------------
Proposition 05:
----------------

For $$\displaystyle w > 0$$:

$$\displaystyle {\color{BrickRed}\mathfrak{L}(\sin \omega x) = \int_0^{\infty}e^{-wx}\sin \omega x\, dx = \frac{\omega}{w^2+\omega^2} }$$

$$\displaystyle {\color{BrickRed}\mathfrak{L}(\cos \omega x) = \int_0^{\infty}e^{-wx}\cos \omega x\, dx = \frac{w}{w^2+\omega^2} }$$

Proof:

Set $$\displaystyle a=i\omega$$ in proposition (03) (the calculation there remains valid for complex $$\displaystyle a$$, provided $$\displaystyle \mathscr{Re}(w-a)>0$$), then

$$\displaystyle \mathfrak{L}(e^{i\omega x}) = \frac{1}{w-i\omega} = \frac{w+i\omega}{w^2+\omega^2} = \frac{w}{w^2+\omega^2} + \frac{i\omega}{w^2+\omega^2}$$

On the other hand,

$$\displaystyle e^{i\omega x} = \cos \omega x + i\sin \omega x$$

Hence, by proposition (02),

$$\displaystyle \mathfrak{L}(e^{i\omega x}) = \mathfrak{L}(\cos \omega x) + i\, \mathfrak{L}(\sin \omega x)$$

Equating the real and imaginary parts proves both parts of proposition (05). $$\displaystyle \Box$$
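Here is a short numerical confirmation of both transforms (my own sanity check; the sample values $$\displaystyle \omega=2, w=1$$ are arbitrary):

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    # Composite Simpson rule for F(w) = ∫_0^upper e^{-wx} f(x) dx
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

# Proposition 05: L(sin ωx) = ω/(w²+ω²), L(cos ωx) = w/(w²+ω²)
omega, w = 2.0, 1.0
assert abs(laplace_num(lambda x: math.sin(omega * x), w)
           - omega / (w**2 + omega**2)) < 1e-6
assert abs(laplace_num(lambda x: math.cos(omega * x), w)
           - w / (w**2 + omega**2)) < 1e-6
print("Proposition 05 verified numerically")
```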

----------------
Proposition 06:
----------------

For $$\displaystyle w > |a|$$:

$$\displaystyle {\color{BrickRed}\mathfrak{L}(\sinh ax) = \int_0^{\infty}e^{-wx}\sinh ax\, dx = \frac{a}{w^2-a^2} }$$

$$\displaystyle {\color{BrickRed}\mathfrak{L}(\cosh ax) = \int_0^{\infty}e^{-wx}\cosh ax\, dx = \frac{w}{w^2-a^2} }$$

Proof:

$$\displaystyle \sinh ax = \frac{e^{ax}-e^{-ax}}{2}$$

$$\displaystyle \cosh ax = \frac{e^{ax}+e^{-ax}}{2}$$

Hence by propositions (03) and (02),

$$\displaystyle \mathfrak{L}(\sinh ax) = \frac{1}{2}\mathfrak{L}(e^{ax}) - \frac{1}{2}\mathfrak{L}(e^{-ax}) =$$

$$\displaystyle \frac{1}{2}\, \frac{1}{w-a} - \frac{1}{2}\, \frac{1}{w+a} = \frac{a}{w^2-a^2}$$

The equivalent form for the Hyperbolic Cosine is proved in the same manner. $$\displaystyle \Box$$
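Both hyperbolic transforms check out numerically as well (my own addition; $$\displaystyle a=1, w=2$$ are arbitrary sample values satisfying $$\displaystyle w>|a|$$):

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    # Composite Simpson rule for F(w) = ∫_0^upper e^{-wx} f(x) dx
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

# Proposition 06: L(sinh ax) = a/(w²-a²), L(cosh ax) = w/(w²-a²)
a, w = 1.0, 2.0
assert abs(laplace_num(lambda x: math.sinh(a * x), w)
           - a / (w**2 - a**2)) < 1e-6
assert abs(laplace_num(lambda x: math.cosh(a * x), w)
           - w / (w**2 - a**2)) < 1e-6
print("Proposition 06 verified numerically")
```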

----------------
Proposition 07:
----------------

Let $$\displaystyle \mathscr{Re}(w) > \mathscr{Re}(a)$$, and $$\displaystyle \mathscr{Re}(\lambda) > 0$$, then:

$$\displaystyle {\color{BrickRed}\mathfrak{L}(x^{\lambda -1}e^{ax}) = \frac{\Gamma(\lambda)}{(w-a)^{\lambda}} }$$

Proof:

$$\displaystyle \mathfrak{L} (x^{\lambda -1}e^{ax}) = \int_0^{\infty} e^{-wx}\, (x^{\lambda -1}e^{ax})\, dx = \int_0^{\infty} e^{-(w-a)x}x^{\lambda-1}\, dx$$

This is almost an Euler Gamma integral; applying the substitution $$\displaystyle t=(w-a)x$$ gives:

$$\displaystyle \int_0^{\infty} e^{-(w-a)x}x^{\lambda-1}\, dx = \frac{1}{(w-a)}\, \int_0^{\infty} \left(\frac{t}{w-a} \right)^{\lambda-1}e^{-t}\, dt =$$

$$\displaystyle \frac{1}{(w-a)^{\lambda}}\, \int_0^{\infty}e^{-t}t^{\lambda -1}\, dt = \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}$$

This completes the proof. $$\displaystyle \Box$$
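For real parameters this can be confirmed numerically against `math.gamma` (a sanity check of my own; the values $$\displaystyle \lambda=2.5, a=0.5, w=2$$ are arbitrary):

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    # Composite Simpson rule for F(w) = ∫_0^upper e^{-wx} f(x) dx
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

# Proposition 07: L(x^{λ-1} e^{ax}) = Γ(λ)/(w-a)^λ
lam, a, w = 2.5, 0.5, 2.0
f = lambda x: x ** (lam - 1) * math.exp(a * x)
closed = math.gamma(lam) / (w - a) ** lam
assert abs(laplace_num(f, w) - closed) < 1e-5
print("Proposition 07 verified numerically")
```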

----------------
Corollary:
----------------

For $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$ and $$\displaystyle \mathscr{Re}(\lambda) > 0$$:

$$\displaystyle {\color{BrickRed}\mathfrak{L}(x^{\lambda -1}e^{-ax}) = \frac{\Gamma(\lambda)}{(w+a)^{\lambda}} }$$

----------------
Proposition 08:
----------------

Let $$\displaystyle \mathscr{Re}(w) > \mathscr{Re}(a)$$, and $$\displaystyle \mathscr{Re}(\lambda) > 0$$, then:

$$\displaystyle {\color{BrickRed}\mathfrak{L}(x^{\lambda -1}e^{ax}\log x) = \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg] }$$

Proof:

By proposition (07), we have:

$$\displaystyle \mathfrak{L}(x^{\lambda -1}e^{ax}) = \frac{\Gamma(\lambda)}{(w-a)^{\lambda}} \Rightarrow$$

$$\displaystyle \frac{d}{d\lambda} \mathfrak{L}(x^{\lambda -1}e^{ax}) = \mathfrak{L}'(x^{\lambda -1}e^{ax})= \frac{d}{d\lambda}\, \frac{\Gamma(\lambda)}{(w-a)^{\lambda}} =$$

$$\displaystyle \frac{(w-a)^{\lambda} \Gamma'(\lambda) - (w-a)^{\lambda}\log(w-a) \Gamma(\lambda) }{(w-a)^{2\lambda}} = \frac{\Gamma'(\lambda) - \log(w-a) \Gamma(\lambda) }{(w-a)^{\lambda}}$$

Defining the Digamma function $$\displaystyle \psi_0(z)$$ in the usual way,

$$\displaystyle \psi_0(z)=\frac{d}{dz}\log\Gamma(z) = \frac{\Gamma'(z)}{\Gamma(z)}$$

This becomes,

$$\displaystyle \mathfrak{L}'(x^{\lambda -1}e^{ax})= \frac{\Gamma'(\lambda) - \log(w-a) \Gamma(\lambda) }{(w-a)^{\lambda}} =$$

$$\displaystyle \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg]$$

On the other hand, by the (usual) integral representation of the Laplace transform, differentiating under the integral sign,

$$\displaystyle \mathfrak{L}'(x^{\lambda -1}e^{ax})= \frac{d}{d\lambda}\, \int_0^{\infty}e^{-(w-a)x}x^{\lambda -1}\, dx =$$

$$\displaystyle \int_0^{\infty}e^{-(w-a)x}x^{\lambda -1}(\log x)\, dx = \mathfrak{L}(x^{\lambda -1}e^{ax}\log x)$$

Hence

$$\displaystyle \mathfrak{L}(x^{\lambda -1}e^{ax}\log x) = \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg]$$

This completes the proof. $$\displaystyle \Box$$

----------------
Corollary:
----------------

$$\displaystyle {\color{BrickRed}\mathfrak{L}(x^{\lambda -1}e^{-ax}\log x) = \frac{\Gamma(\lambda)}{(w+a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w+a) \Bigg] }$$

This follows directly from propositions (07) and (08), provided that the conditions $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$, and $$\displaystyle \mathscr{Re}(\lambda) > 0$$ are satisfied.
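For real parameters, proposition (08) can be spot-checked numerically. The standard library has no digamma, so the snippet below (entirely my own addition) approximates $$\displaystyle \psi_0$$ by a central difference of `math.lgamma`; the values $$\displaystyle \lambda=2, a=0.5, w=2$$ are arbitrary:

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    # Composite Simpson rule for F(w) = ∫_0^upper e^{-wx} f(x) dx
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

def digamma(x, h=1e-5):
    # ψ0(x) = d/dx log Γ(x), via a central difference of lgamma
    # (plenty accurate for a numerical sanity check)
    return (math.lgamma(x + h) - math.lgamma(x - h)) / (2 * h)

# Proposition 08: L(x^{λ-1} e^{ax} log x) = Γ(λ)/(w-a)^λ [ψ0(λ) - log(w-a)]
lam, a, w = 2.0, 0.5, 2.0
f = lambda x: 0.0 if x == 0.0 else x ** (lam - 1) * math.exp(a * x) * math.log(x)
closed = math.gamma(lam) / (w - a) ** lam * (digamma(lam) - math.log(w - a))
assert abs(laplace_num(f, w) - closed) < 1e-4
print("Proposition 08 verified numerically")
```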

----------------
Proposition 09:
----------------

Let $$\displaystyle \mathscr{Re}(w) > \mathscr{Re}(a)$$, and $$\displaystyle \mathscr{Re}(\lambda) > 0$$, then:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}e^{ax}(\log x)^2) =}$$

$$\displaystyle {\color{BrickRed} \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \Bigg\{ \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg]^2 + \psi_1(\lambda) \Bigg\} }$$

And

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}e^{ax}(\log x)^3) =}$$

$$\displaystyle {\color{BrickRed} \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \Bigg\{ \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg]^3 + 3\, \psi_1(\lambda)\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg] + \psi_2(\lambda) \Bigg\} }$$

Proof:

Proposition (08) can itself be differentiated - on both sides - multiple times, to give:

$$\displaystyle \frac{d^m}{d\lambda^m}\, \mathfrak{L} (x^{\lambda-1}e^{ax}(\log x)) = \mathfrak{L}(x^{\lambda-1}e^{ax}(\log x)^{m+1}) = \frac{d^m}{d\lambda^m}\, \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg]$$

Since

$$\displaystyle \frac{d^m}{d\lambda^m}\, \mathfrak{L} (x^{\lambda-1}e^{ax}(\log x)) = \frac{d^m}{d\lambda^m}\, \int_0^{\infty}e^{-(w-a)x}x^{\lambda-1}(\log x)\, dx =$$

$$\displaystyle \int_0^{\infty} e^{-(w-a)x} (\log x)\, \left[ \frac{d^m}{d\lambda^m}\, x^{\lambda -1 }\right]\, dx =$$

$$\displaystyle \int_0^{\infty}e^{-(w-a)x} x^{\lambda-1}(\log x)^{m+1}\, dx = \mathfrak{L} (x^{\lambda-1}e^{ax}(\log x)^{m+1})$$

The Polygamma functions are given by:

$$\displaystyle \psi_m(z) = \frac{d^{m+1}}{dz^{m+1}}\, \log\Gamma(z)$$

and so

$$\displaystyle \psi_1(x) = \frac{d}{dx}\, \psi_0(x)$$

$$\displaystyle \psi_2(x) = \frac{d}{dx}\, \psi_1(x)$$

$$\displaystyle \psi_3(x) = \frac{d}{dx}\, \psi_2(x)$$

Etc.

Hence

$$\displaystyle \mathfrak{L}(x^{\lambda-1}e^{ax}(\log x)^2) = \frac{d}{d\lambda}\, \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg] =$$

$$\displaystyle \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg]^2 + \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \psi_1(\lambda)$$

And

$$\displaystyle \mathfrak{L}(x^{\lambda-1}e^{ax}(\log x)^3) = \frac{d}{d\lambda}\, \mathfrak{L}(x^{\lambda-1}e^{ax}(\log x)^2) =$$

$$\displaystyle \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg]^3 + \frac{3\, \Gamma(\lambda)}{(w-a)^{\lambda}}\, \psi_1(\lambda)\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg] +$$

$$\displaystyle \frac{\Gamma(\lambda)}{(w-a)^{\lambda}}\, \psi_2(\lambda)$$

This completes the proof. $$\displaystyle \Box$$

----------------
Corollary:
----------------

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}e^{-ax}(\log x)^2) = }$$

$$\displaystyle {\color{BrickRed} \frac{\Gamma(\lambda)}{(w+a)^{\lambda}}\, \Bigg\{ \Bigg[ \psi_0(\lambda) - \log(w+a) \Bigg]^2 + \psi_1(\lambda) \Bigg\} }$$

And

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}e^{-ax}(\log x)^3) = }$$

$$\displaystyle {\color{BrickRed} \frac{\Gamma(\lambda)}{(w+a)^{\lambda}}\, \Bigg\{ \Bigg[ \psi_0(\lambda) - \log(w+a) \Bigg]^3 + 3\, \psi_1(\lambda)\, \Bigg[ \psi_0(\lambda) - \log(w+a) \Bigg] + \psi_2(\lambda) \Bigg\} }$$

These follow directly from propositions (07), (08), and (09) above, provided that the conditions $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$, and $$\displaystyle \mathscr{Re}(\lambda) > 0$$ are satisfied.

----------------
Proposition 10:
----------------

For $$\displaystyle \mathscr{Re}(\lambda) > 0$$ and $$\displaystyle \mathscr{Re}(w) > 0$$:

Part 1:

$$\displaystyle {\color{BrickRed}\mathfrak{L}(x^{\lambda -1}\log x) = \frac{\Gamma(\lambda)}{w^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log w \Bigg] }$$

Part 2:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}(\log x)^2) =}$$

$$\displaystyle {\color{BrickRed} \frac{\Gamma(\lambda)}{w^{\lambda}}\, \Bigg\{ \Bigg[ \psi_0(\lambda) - \log w \Bigg]^2 + \psi_1(\lambda) \Bigg\} }$$

Part 3:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}(\log x)^3) =}$$

$$\displaystyle {\color{BrickRed} \frac{\Gamma(\lambda)}{w^{\lambda}}\, \Bigg\{ \Bigg[ \psi_0(\lambda) - \log w \Bigg]^3 + 3\, \psi_1(\lambda)\, \Bigg[ \psi_0(\lambda) - \log w \Bigg] + \psi_2(\lambda) \Bigg\} }$$

Proof:

Set $$\displaystyle a=0$$ in propositions (08) and (09), or their corollaries. $$\displaystyle \Box$$

----------------
Proposition 11:
----------------

For $$\displaystyle \mathscr{Re}(w) > \mathscr{Re}(a)$$:

$$\displaystyle {\color{BrickRed} \mathfrak{L}\left( \frac{e^{ax}}{\sqrt{x}} \right) = \sqrt{ \frac{\pi}{w-a} } }$$

Proof:

This follows directly by setting $$\displaystyle \lambda = 1/2$$ in proposition (07), but here's the full proof anyway:

$$\displaystyle \mathfrak{L}\left( \frac{e^{ax}}{\sqrt{x}} \right) = \int_0^{\infty} \frac{e^{-(w-a)x}}{\sqrt{x}}\, dx = \frac{1}{(w-a)}\, \int_0^{\infty} \frac{e^{-t}}{ \sqrt{\frac{t}{(w-a)}} }\, dt =$$

$$\displaystyle \frac{1}{ \sqrt{w-a} }\, \int_0^{\infty}e^{-t}t^{1/2-1}\, dt = \frac{\Gamma\left( \tfrac{1}{2} \right) }{ \sqrt{w-a} } = \sqrt{ \frac{\pi}{w-a} }$$

Since

$$\displaystyle \Gamma\left( \tfrac{1}{2} \right) = \sqrt{\pi}$$

$$\displaystyle \Box$$
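Numerically, the integrand $$\displaystyle x^{-1/2}e^{-(w-a)x}$$ is singular at the origin, so a direct quadrature behaves poorly; the substitution $$\displaystyle x=t^2$$ turns the transform into a smooth Gaussian integral, $$\displaystyle 2\int_0^{\infty}e^{-(w-a)t^2}\,dt$$, which is easy to check. The snippet (my own addition, with arbitrary sample values $$\displaystyle a=0.5, w=2$$):

```python
import math

def simpson(g, lo, hi, n=20000):
    # Composite Simpson rule on [lo, hi]
    h = (hi - lo) / n
    total = g(lo) + g(hi)
    for k in range(1, n):
        total += (4 if k % 2 else 2) * g(lo + k * h)
    return total * h / 3.0

# Proposition 11: L(e^{ax}/√x) = √(π/(w-a))
a, w = 0.5, 2.0
c = w - a
# substitution x = t² removes the 1/√x singularity:
# ∫_0^∞ e^{-c x} x^{-1/2} dx = 2 ∫_0^∞ e^{-c t²} dt
approx = 2.0 * simpson(lambda t: math.exp(-c * t * t), 0.0, 12.0)
assert abs(approx - math.sqrt(math.pi / c)) < 1e-8
print("Proposition 11 verified numerically")
```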

----------------
Proposition 12:
----------------

For $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$:

$$\displaystyle {\color{BrickRed} \mathfrak{L}\left( \frac{e^{-ax}}{\sqrt{x}} \right) = \sqrt{ \frac{\pi}{w+a} }}$$

Proof:

This is a corollary of proposition (11), where $$\displaystyle a$$ has been replaced with $$\displaystyle -a$$ (affecting the subsequent change of conditions: $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$).

$$\displaystyle \Box$$

----------------
Proposition 13:
----------------

For $$\displaystyle \mathscr{Re}(w) > |\mathscr{Re}(a)|$$:

$$\displaystyle {\color{BrickRed} \mathfrak{L}\left( \frac{\sinh ax}{\sqrt{x} } \right) = \frac{\sqrt{\pi}}{2}\, \frac{\sqrt{w+a} -\sqrt{w-a} }{\sqrt{w^2-a^2}} }$$

$$\displaystyle {\color{BrickRed} \mathfrak{L}\left( \frac{\cosh ax}{\sqrt{x} } \right) = \frac{\sqrt{\pi}}{2}\, \frac{\sqrt{w+a} +\sqrt{w-a} }{\sqrt{w^2-a^2}} }$$

Proof:

The case for $$\displaystyle \sinh ax$$ is proven directly. The case for $$\displaystyle \cosh ax$$ is identical in methodology.

Firstly, we have:

$$\displaystyle \sinh ax = \frac{e^{ax}-e^{-ax}}{2}$$

$$\displaystyle \cosh ax = \frac{e^{ax}+e^{-ax}}{2}$$

Hence

$$\displaystyle \mathfrak{L}\left( \frac{\sinh ax}{\sqrt{x}} \right) = \frac{1}{2}\, \mathfrak{L}\left( \frac{e^{ax}}{ \sqrt{x} } \right) - \frac{1}{2}\, \mathfrak{L}\left( \frac{e^{-ax}}{ \sqrt{x} } \right) =$$

$$\displaystyle \frac{1}{2}\, \left[ \sqrt{ \frac{\pi}{w-a} } - \sqrt{ \frac{\pi}{w+a} }\right] =$$

$$\displaystyle \frac{\sqrt{\pi}}{2}\, \frac{\sqrt{w+a} -\sqrt{w-a} }{\sqrt{w^2-a^2}}$$

This completes the proof. $$\displaystyle \Box$$

----------------
Proposition 14:
----------------

Let $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$, and $$\displaystyle \mathscr{Re}(\lambda) > 0$$, then:

Part 1:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}\sinh ax) = \frac{\Gamma(\lambda)}{2}\, \frac{ (w+a)^{\lambda} - (w-a)^{\lambda} }{(w^2-a^2)^{\lambda}} }$$

Part 2:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}\cosh ax) = \frac{\Gamma(\lambda)}{2}\, \frac{ (w+a)^{\lambda} + (w-a)^{\lambda} }{(w^2-a^2)^{\lambda}} }$$

Proof:

$$\displaystyle \sinh ax = \frac{e^{ax}-e^{-ax}}{2}\, \Rightarrow$$

$$\displaystyle \mathfrak{L}(x^{\lambda-1}\sinh ax) = \mathfrak{L}\left( \frac{x^{\lambda-1} e^{ax }}{2} \right) - \mathfrak{L}\left( \frac{x^{\lambda-1} e^{-ax }}{2} \right) =$$

$$\displaystyle \frac{1}{2}\, \mathfrak{L} \left( x^{\lambda-1} e^{ax } \right) - \frac{1}{2}\, \mathfrak{L} \left( x^{\lambda-1} e^{-ax } \right) =$$

$$\displaystyle \frac{1}{2}\, \frac{\Gamma(\lambda)}{(w-a)^{\lambda}} - \frac{1}{2}\, \frac{\Gamma(\lambda)}{(w+a)^{\lambda}}$$

By proposition (07) and its corollary. Similarly

$$\displaystyle \cosh ax = \frac{e^{ax}+e^{-ax}}{2}\, \Rightarrow$$

$$\displaystyle \mathfrak{L}(x^{\lambda-1}\cosh ax) = \mathfrak{L}\left( \frac{x^{\lambda-1} e^{ax }}{2} \right) + \mathfrak{L}\left( \frac{x^{\lambda-1} e^{-ax }}{2} \right) =$$

$$\displaystyle \frac{1}{2}\, \frac{\Gamma(\lambda)}{(w-a)^{\lambda}} + \frac{1}{2}\, \frac{\Gamma(\lambda)}{(w+a)^{\lambda}}$$

Hence

$$\displaystyle \mathfrak{L}(x^{\lambda-1}\sinh ax) = \frac{\Gamma(\lambda)}{2}\, \frac{ (w+a)^{\lambda} - (w-a)^{\lambda} }{(w^2-a^2)^{\lambda}}$$

And

$$\displaystyle \mathfrak{L}(x^{\lambda-1}\cosh ax) = \frac{\Gamma(\lambda)}{2}\, \frac{ (w+a)^{\lambda} + (w-a)^{\lambda} }{(w^2-a^2)^{\lambda}}$$

As was to be shown. $$\displaystyle \Box$$
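A numerical check of Part 1 for real parameters (my own addition; $$\displaystyle \lambda=2.5, a=0.5, w=2$$ are arbitrary sample values with $$\displaystyle w>|a|$$):

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    # Composite Simpson rule for F(w) = ∫_0^upper e^{-wx} f(x) dx
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

# Proposition 14, Part 1:
# L(x^{λ-1} sinh ax) = Γ(λ)/2 · [1/(w-a)^λ − 1/(w+a)^λ]
lam, a, w = 2.5, 0.5, 2.0
f = lambda x: x ** (lam - 1) * math.sinh(a * x)
closed = 0.5 * math.gamma(lam) * (1.0 / (w - a) ** lam - 1.0 / (w + a) ** lam)
assert abs(laplace_num(f, w) - closed) < 1e-5
print("Proposition 14 (Part 1) verified numerically")
```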

----------------
Proposition 15:
----------------

Let $$\displaystyle \mathscr{Re}(w) > |\mathscr{Im}(b)|$$, then:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(\sin(a+bx)) = \frac{w\sin a + b\cos a}{(w^2+b^2)} }$$

Proof:

By the Addition formula for the Sine:

$$\displaystyle \sin(x\pm y) = \sin x\cos y \pm \cos x\sin y \Rightarrow$$

$$\displaystyle \mathfrak{L}(\sin(a+bx)) = \mathfrak{L}(\sin a\ \cos bx + \cos a\sin bx) = \sin a\, \mathfrak{L}(\cos bx) + \cos a\, \mathfrak{L}(\sin bx)$$

By proposition (05) this equates to

$$\displaystyle \sin a\, \frac{w}{w^2+b^2} +\cos a\, \frac{b}{w^2+b^2} \Rightarrow$$

$$\displaystyle \mathfrak{L}(\sin(a+bx)) = \frac{w\sin a + b\cos a}{(w^2+b^2)}$$

This concludes the proof. $$\displaystyle \Box$$
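A quick numerical confirmation for real $$\displaystyle a, b$$ (my own addition; the sample values are arbitrary):

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    # Composite Simpson rule for F(w) = ∫_0^upper e^{-wx} f(x) dx
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

# Proposition 15: L(sin(a + bx)) = (w sin a + b cos a)/(w² + b²)
a, b, w = 0.7, 2.0, 1.0
approx = laplace_num(lambda x: math.sin(a + b * x), w)
closed = (w * math.sin(a) + b * math.cos(a)) / (w**2 + b**2)
assert abs(approx - closed) < 1e-6
print("Proposition 15 verified numerically")
```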

----------------
Proposition 16:
----------------

Let $$\displaystyle \mathscr{Re}(w) > |\mathscr{Im}(b)|$$, then:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(\cos(a+bx)) = \frac{w\cos a - b\sin a}{(w^2+b^2)} }$$

Proof:

By the Addition formula for the Cosine:

$$\displaystyle \cos (x\pm y) = \cos x\cos y \mp \sin x\sin y \Rightarrow$$

$$\displaystyle \mathfrak{L}(\cos(a+bx)) = \mathfrak{L}(\cos a\ \cos bx - \sin a\sin bx) = \cos a\, \mathfrak{L}(\cos bx) - \sin a\, \mathfrak{L}(\sin bx)$$

By proposition (05) this equates to

$$\displaystyle \cos a\, \frac{w}{w^2+b^2} -\sin a\, \frac{b}{w^2+b^2} \Rightarrow$$

$$\displaystyle \mathfrak{L}(\cos(a+bx)) = \frac{w\cos a - b\sin a}{(w^2+b^2)}$$

This concludes the proof. $$\displaystyle \Box$$

----------------
Proposition 17:
----------------

Let $$\displaystyle \mathscr{Re}(w) > |\mathscr{Re}(b)|$$, then:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(\sinh(a+bx)) = \frac{w\sinh a + b\cosh a}{(w^2-b^2)} }$$

Proof:

By the Addition formula for the Hyperbolic Sine:

$$\displaystyle \sinh(x\pm y) = \sinh x\cosh y \pm \cosh x\sinh y \Rightarrow$$

$$\displaystyle \mathfrak{L}(\sinh(a+bx)) = \mathfrak{L}(\sinh a\ \cosh bx + \cosh a\sinh bx) = \sinh a\, \mathfrak{L}(\cosh bx) + \cosh a\, \mathfrak{L}(\sinh bx)$$

By proposition (06) this equates to

$$\displaystyle \sinh a\, \frac{w}{w^2-b^2} +\cosh a\, \frac{b}{w^2-b^2} \Rightarrow$$

$$\displaystyle \mathfrak{L}(\sinh(a+bx)) = \frac{w\sinh a + b\cosh a}{(w^2-b^2)}$$

This concludes the proof. $$\displaystyle \Box$$

----------------
Proposition 18:
----------------

Let $$\displaystyle \mathscr{Re}(w) > |\mathscr{Re}(b)|$$, then:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(\cosh(a+bx)) = \frac{w\cosh a + b\sinh a}{(w^2-b^2)} }$$

Proof:

By the Addition formula for the Hyperbolic Cosine:

$$\displaystyle \cosh (x\pm y) = \cosh x\cosh y \pm \sinh x\sinh y \Rightarrow$$

$$\displaystyle \mathfrak{L}(\cosh(a+bx)) = \mathfrak{L}(\cosh a \cosh bx + \sinh a\sinh bx) = \cosh a\, \mathfrak{L}(\cosh bx) + \sinh a\, \mathfrak{L}(\sinh bx)$$

By proposition (06) this equates to

$$\displaystyle \cosh a\, \frac{w}{w^2-b^2} +\sinh a\, \frac{b}{w^2-b^2} \Rightarrow$$

$$\displaystyle \mathfrak{L}(\cosh(a+bx)) = \frac{w\cosh a + b\sinh a}{(w^2-b^2)}$$

This concludes the proof. $$\displaystyle \Box$$

----------------
Proposition 19:
----------------

Defining the Modified Bessel function of the second kind (of order 1/2) in the usual way:

$$\displaystyle K_{1/2}(x) = e^{-x}\, \sqrt{\frac{\pi}{2x}}$$

And with $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$, as well as $$\displaystyle \mathscr{Re}(\lambda) > 1/2$$, then:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}K_{1/2}(ax)) = \sqrt{ \frac{\pi}{2a} }\, \frac{ \Gamma\left( \lambda - \frac{1}{2} \right) }{(w+a)^{\lambda -1/2} } }$$

Proof:

$$\displaystyle \mathfrak{L}(x^{\lambda-1} K_{1/2}(ax)) = \int_0^{\infty}e^{-wx}x^{\lambda-1} K_{1/2}(ax)\, dx =$$

$$\displaystyle \sqrt{ \frac{\pi}{2a} }\, \int_0^{\infty} x^{\lambda-3/2}e^{-(w+a)x}\, dx = \frac{1}{(w+a)^{\lambda-1/2}} \sqrt{ \frac{\pi}{2a} }\, \int_0^{\infty} e^{-t}t^{\lambda-3/2} \, dt =$$

$$\displaystyle \sqrt{ \frac{\pi}{2a} }\, \frac{ \Gamma\left( \lambda - \frac{1}{2} \right) }{(w+a)^{\lambda -1/2} }$$

$$\displaystyle \Box$$
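Since $$\displaystyle K_{1/2}$$ has the closed form above, this transform is also easy to check numerically for real parameters (my own addition; $$\displaystyle \lambda=2, a=1, w=1$$ are arbitrary sample values):

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    # Composite Simpson rule for F(w) = ∫_0^upper e^{-wx} f(x) dx
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

# Proposition 19: L(x^{λ-1} K_{1/2}(ax)) = √(π/2a) · Γ(λ-1/2)/(w+a)^{λ-1/2}
lam, a, w = 2.0, 1.0, 1.0
K_half = lambda x: math.exp(-x) * math.sqrt(math.pi / (2.0 * x))
# for λ = 2 the integrand behaves like √x near 0, so define f(0) = 0
f = lambda x: 0.0 if x == 0.0 else x ** (lam - 1) * K_half(a * x)
closed = math.sqrt(math.pi / (2.0 * a)) * math.gamma(lam - 0.5) / (w + a) ** (lam - 0.5)
assert abs(laplace_num(f, w) - closed) < 1e-4
print("Proposition 19 verified numerically")
```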

----------------
Proposition 20:
----------------

Defining the Modified Bessel function of the second kind (of order 1/2) in the usual way:

$$\displaystyle K_{1/2}(x) = e^{-x}\, \sqrt{\frac{\pi}{2x}}$$

And with $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$, as well as $$\displaystyle \mathscr{Re}(\lambda) > 1/2$$, then:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}K_{1/2}(ax)\, \log x) = }$$

$$\displaystyle {\color{BrickRed} \sqrt{ \frac{\pi}{2a} }\, \frac{ \Gamma\left( \lambda - \frac{1}{2} \right) }{(w+a)^{\lambda -1/2} }\, \Bigg[ \psi_0\left( \lambda - \frac{1}{2} \right) - \log(w+a) \Bigg] }$$

Proof:

This follows by differentiation of both sides of proposition (19) with respect to the parameter $$\displaystyle \lambda$$, and then using

$$\displaystyle \psi_0(x) = \frac{d}{dx}\, \log\Gamma(x) = \frac{\Gamma'(x)}{\Gamma(x)}$$

$$\displaystyle \Box$$

More generally, we define the Modified Bessel function of the second kind, of fractional 'half-order' $$\displaystyle n+1/2$$ - where $$\displaystyle n \in\mathbb{N} \cup \{0\}$$ - by the finite sum:

$$\displaystyle K_{n+1/2}(x) = e^{-x}\, \sqrt{ \frac{\pi}{2x} }\, \sum_{j=0}^n \frac{(n+j)!}{2^jj!\, (n-j)!x^j}$$

The first few examples of which are:

$$\displaystyle K_{1/2}(x) = e^{-x} \, \sqrt{ \frac{\pi}{2x} }$$

$$\displaystyle K_{3/2}(x) = e^{-x} \, \sqrt{ \frac{\pi}{2x} }\, \left( \frac{1}{x} +1\right)$$

$$\displaystyle K_{5/2}(x) = e^{-x} \, \sqrt{ \frac{\pi}{2x} }\, \left( \frac{3}{x^2} + \frac{3}{x} + 1\right)$$

$$\displaystyle K_{7/2}(x) = e^{-x} \, \sqrt{ \frac{\pi}{2x} }\, \left( \frac{15}{x^3} + \frac{15}{x^2} + \frac{6}{x} + 1\right)$$

In general, as the previous case $$\displaystyle (n=0)$$ in propositions (19) and (20) illustrates, should we wish to multiply these functions by $$\displaystyle x^{\lambda-1}$$, and then find the Laplace transform, we will require that the parameter $$\displaystyle \lambda$$ satisfies $$\displaystyle \mathscr{Re}(\lambda) > n+1/2$$.

----------------
Proposition 21:
----------------

For $$\displaystyle n\in\mathbb{N}\cup \{0\}$$, $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$, and $$\displaystyle \mathscr{Re}(\lambda) > n+1/2$$:

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}K_{n+1/2}(ax)) = }$$

$$\displaystyle {\color{BrickRed} \sqrt{ \frac{\pi}{2a} }\, \sum_{j=0}^n \frac{(n+j)!\, \Gamma\left( \lambda - j - \frac{1}{2} \right) }{2^jj!\, (n-j)!a^j(w+a)^{\lambda - j - 1/2} } }$$

Proof:

$$\displaystyle \mathfrak{L}(x^{\lambda-1}K_{n+1/2}(ax)) =$$

$$\displaystyle \sqrt{ \frac{\pi}{2a} }\, \sum_{j=0}^n \frac{(n+j)!}{2^jj!\, (n-j)!a^j}\, \int_0^{\infty} x^{\lambda-j-3/2} e^{-(w+a)x}\, dx =$$

$$\displaystyle \sqrt{ \frac{\pi}{2a} }\, \sum_{j=0}^n \frac{(n+j)!}{2^jj!\, (n-j)!a^j}\, \frac{1}{(w+a)}\, \int_0^{\infty} \left( \frac{t}{w+a} \right)^{\lambda-j-3/2} e^{-t}\, dt =$$

$$\displaystyle \sqrt{ \frac{\pi}{2a} }\, \sum_{j=0}^n \frac{(n+j)!}{2^jj!\, (n-j)!a^j(w+a)^{\lambda - j - 1/2} }\, \int_0^{\infty} t^{\lambda-j-3/2} e^{-t}\, dt =$$

$$\displaystyle \sqrt{ \frac{\pi}{2a} }\, \sum_{j=0}^n \frac{(n+j)!\, \Gamma\left( \lambda - j - \frac{1}{2} \right) }{2^jj!\, (n-j)!a^j(w+a)^{\lambda - j - 1/2} }$$

This concludes the proof. $$\displaystyle \Box$$
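The general formula can be spot-checked numerically for small $$\displaystyle n$$; the snippet below (my own addition) builds $$\displaystyle K_{n+1/2}$$ from the finite sum above and compares the quadrature against the closed form, with arbitrary sample values $$\displaystyle n=1, \lambda=3, a=1, w=1$$:

```python
import math

def laplace_num(f, w, upper=40.0, n=20000):
    # Composite Simpson rule for F(w) = ∫_0^upper e^{-wx} f(x) dx
    h = upper / n
    total = f(0.0) + math.exp(-w * upper) * f(upper)
    for k in range(1, n):
        x = k * h
        total += (4 if k % 2 else 2) * math.exp(-w * x) * f(x)
    return total * h / 3.0

def K_half_order(n_, x):
    # K_{n+1/2}(x) via the finite sum in the text
    s = sum(math.factorial(n_ + j)
            / (2**j * math.factorial(j) * math.factorial(n_ - j) * x**j)
            for j in range(n_ + 1))
    return math.exp(-x) * math.sqrt(math.pi / (2.0 * x)) * s

def closed_form(n_, lam, a, w):
    # the R.H.S. of proposition 21
    return math.sqrt(math.pi / (2.0 * a)) * sum(
        math.factorial(n_ + j) * math.gamma(lam - j - 0.5)
        / (2**j * math.factorial(j) * math.factorial(n_ - j)
           * a**j * (w + a) ** (lam - j - 0.5))
        for j in range(n_ + 1))

n_, lam, a, w = 1, 3.0, 1.0, 1.0
# for λ = 3, n = 1 the integrand behaves like √x near 0, so define f(0) = 0
f = lambda x: 0.0 if x == 0.0 else x ** (lam - 1) * K_half_order(n_, a * x)
assert abs(laplace_num(f, w) - closed_form(n_, lam, a, w)) < 1e-4
print("Proposition 21 verified numerically for n = 1")
```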

----------------
Corollary:
----------------

$$\displaystyle {\color{BrickRed} \mathfrak{L}(x^{\lambda-1}K_{n+1/2}(ax)\, \log x) = }$$

$$\displaystyle {\color{BrickRed} \sqrt{ \frac{\pi}{2a} }\, \sum_{j=0}^n \frac{(n+j)!\, \Gamma\left( \lambda - j - \frac{1}{2} \right) }{2^jj!\, (n-j)!a^j(w+a)^{\lambda - j - 1/2} } \, \left[ \psi_0\left( \lambda - j - \frac{1}{2} \right) - \log(w+a) \right] }$$

Proof:

Differentiate proposition (21) with respect to the parameter $$\displaystyle \lambda$$. The L.H.S. is easy, whereas the finite series on the R.H.S. contains the general (differentiated!) term:

$$\displaystyle \frac{d}{d\lambda}\, \frac{ \Gamma\left( \lambda - j - \frac{1}{2} \right) }{(w+a)^{\lambda - j - 1/2}} =$$

$$\displaystyle \frac{ \Gamma\left( \lambda - j - \frac{1}{2} \right) }{(w+a)^{\lambda - j - 1/2}} \, \left[ \psi_0\left( \lambda - j - \frac{1}{2} \right) - \log(w+a) \right]$$

This proves the Corollary. $$\displaystyle \Box$$

----------------
Proposition 22:
----------------

Let $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$, and $$\displaystyle \mathscr{Re}(\lambda) > 0$$, then:

$$\displaystyle {\color{BrickRed} \mathfrak{L} (x^{\lambda-1}\sinh ax\, \log x) = }$$

$$\displaystyle {\color{BrickRed} \frac{\Gamma(\lambda)}{2(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg] - \frac{\Gamma(\lambda)}{2(w+a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w+a) \Bigg] }$$

Proof:

By proposition (14) - Part 1 - we have:

$$\displaystyle \mathfrak{L}(x^{\lambda-1}\sinh ax) = \frac{\Gamma(\lambda)}{2}\, \frac{ (w+a)^{\lambda} - (w-a)^{\lambda} }{(w^2-a^2)^{\lambda}} =$$

$$\displaystyle \frac{\Gamma(\lambda)}{2}\, \left[ \frac{1}{(w-a)^{\lambda}} - \frac{1}{(w+a)^{\lambda}} \right]$$

Hence

$$\displaystyle \mathfrak{L}'(x^{\lambda-1}\sinh ax) = \frac{d}{d\lambda} \mathfrak{L}(x^{\lambda-1}\sinh ax) =$$

$$\displaystyle \frac{\Gamma'(\lambda)}{2}\, \left[ \frac{1}{(w-a)^{\lambda}} - \frac{1}{(w+a)^{\lambda}} \right] +$$

$$\displaystyle \frac{\Gamma(\lambda)}{2}\, \left[ -\frac{\log(w-a)}{(w-a)^{\lambda}} + \frac{\log(w+a)}{(w+a)^{\lambda}} \right] =$$

$$\displaystyle \frac{\Gamma(\lambda)}{2(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg] - \frac{\Gamma(\lambda)}{2(w+a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w+a) \Bigg]$$

On the other hand,

$$\displaystyle \mathfrak{L}'(x^{\lambda-1}\sinh ax) = \frac{d}{d\lambda} \mathfrak{L}(x^{\lambda-1}\sinh ax) =$$

$$\displaystyle \frac{d}{d\lambda}\, \int_0^{\infty}e^{-wx}x^{\lambda-1}\sinh ax\, dx = \int_0^{\infty}e^{-wx}\sinh ax\, \left[ \frac{d}{d\lambda}\, x^{\lambda-1} \right]\, dx =$$

$$\displaystyle \int_0^{\infty}e^{-wx}x^{\lambda-1}\sinh ax\, \log x\, dx \equiv \mathfrak{L} (x^{\lambda-1}\sinh ax\, \log x)$$

This concludes the proof. $$\displaystyle \Box$$

----------------
Proposition 23:
----------------

Let $$\displaystyle \mathscr{Re}(w) > -\mathscr{Re}(a)$$, and $$\displaystyle \mathscr{Re}(\lambda) > 0$$, then:

$$\displaystyle {\color{BrickRed} \mathfrak{L} (x^{\lambda-1}\cosh ax\, \log x) = }$$

$$\displaystyle {\color{BrickRed} \frac{\Gamma(\lambda)}{2(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg] + \frac{\Gamma(\lambda)}{2(w+a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w+a) \Bigg] }$$

Proof:

By proposition (14) - Part 2 - we have:

$$\displaystyle \mathfrak{L}(x^{\lambda-1}\cosh ax) = \frac{\Gamma(\lambda)}{2}\, \frac{ (w+a)^{\lambda} + (w-a)^{\lambda} }{(w^2-a^2)^{\lambda}} =$$

$$\displaystyle \frac{\Gamma(\lambda)}{2}\, \left[ \frac{1}{(w-a)^{\lambda}} + \frac{1}{(w+a)^{\lambda}} \right]$$

Hence

$$\displaystyle \mathfrak{L}'(x^{\lambda-1}\cosh ax) = \frac{d}{d\lambda} \mathfrak{L}(x^{\lambda-1}\cosh ax) =$$

$$\displaystyle \frac{\Gamma'(\lambda)}{2}\, \left[ \frac{1}{(w-a)^{\lambda}} + \frac{1}{(w+a)^{\lambda}} \right] +$$

$$\displaystyle \frac{\Gamma(\lambda)}{2}\, \left[ -\frac{\log(w-a)}{(w-a)^{\lambda}} - \frac{\log(w+a)}{(w+a)^{\lambda}} \right] =$$

$$\displaystyle \frac{\Gamma(\lambda)}{2(w-a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w-a) \Bigg] + \frac{\Gamma(\lambda)}{2(w+a)^{\lambda}}\, \Bigg[ \psi_0(\lambda) - \log(w+a) \Bigg]$$

On the other hand,

$$\displaystyle \mathfrak{L}'(x^{\lambda-1}\cosh ax) = \frac{d}{d\lambda} \mathfrak{L}(x^{\lambda-1}\cosh ax) =$$

$$\displaystyle \frac{d}{d\lambda}\, \int_0^{\infty}e^{-wx}x^{\lambda-1}\cosh ax\, dx = \int_0^{\infty}e^{-wx}\cosh ax\, \left[ \frac{d}{d\lambda}\, x^{\lambda-1} \right]\, dx =$$

$$\displaystyle \int_0^{\infty}e^{-wx}x^{\lambda-1}\cosh ax\, \log x\, dx \equiv \mathfrak{L} (x^{\lambda-1}\cosh ax\, \log x)$$

This concludes the proof. $$\displaystyle \Box$$