# How can I show that there is exactly one f?

#### evinda

##### Well-known member
MHB Site Helper
Hello!!! I wanted to ask you how I can show that there is exactly one continuous $f$ on $x>0$ such that $f(x)=1+\frac{1}{x}\int_{1}^{x}f(t)\,dt\ \forall x>0$. Is there a theorem that I could use? Or do I just have to suppose that there is a $g\neq f$ which satisfies $g(x)=1+\frac{1}{x}\int_{1}^{x}g(t)\,dt\ \forall x>0$, and then conclude that it must be $g(x)=f(x)$?


#### chisigma

##### Well-known member
> Hello!!! I wanted to ask you how I can show that there is exactly one continuous $f$ on $x>0$ such that $f(x)=1+\frac{1}{x}\int_{1}^{x}f(t)\,dt\ \forall x>0$. Is there a theorem that I could use? Or do I just have to suppose that there is a $g\neq f$ which satisfies $g(x)=1+\frac{1}{x}\int_{1}^{x}g(t)\,dt\ \forall x>0$, and then conclude that it must be $g(x)=f(x)$?

Setting $f(x)=y$ and differentiating the expression, with a little rearrangement you arrive at the ODE...

$\displaystyle y' = \frac{1}{x},\ y(1)=1\qquad (1)$

... which has the unique solution $y=1 + \ln x$...
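The "little rearrangement" can be made explicit as follows (a sketch, using only the original equation):

```latex
Multiplying the integral equation by $x$ gives
\[
  x\,f(x) = x + \int_{1}^{x} f(t)\,dt .
\]
Since $f$ is continuous, the right-hand side is differentiable; differentiating,
\[
  f(x) + x\,f'(x) = 1 + f(x)
  \quad\Longrightarrow\quad
  f'(x) = \frac{1}{x},
\]
and putting $x=1$ in the original equation gives the initial condition $f(1)=1$.
```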

Kind regards

$\chi$ $\sigma$

#### evinda

##### Well-known member
MHB Site Helper
> Setting $f(x)=y$ and differentiating the expression, with a little rearrangement you arrive at the ODE...
>
> $\displaystyle y' = \frac{1}{x},\ y(1)=1\qquad (1)$
>
> ... which has the unique solution $y=1 + \ln x$...
>
> Kind regards
>
> $\chi$ $\sigma$

Can't I show the uniqueness of the solution before finding it?

#### chisigma

##### Well-known member
> Can't I show the uniqueness of the solution before finding it?

The ODE is in the form...

$\displaystyle \frac{d y}{d x} = f(x,y),\ y(x_{0})=y_{0}\ (1)$

... where $f(\cdot,\cdot)$ and its first-order partial derivative with respect to $y$ are continuous in a neighbourhood of $(x_{0},y_{0})$, so that (1) has one and only one solution [Picard's theorem...]

Kind regards

$\chi$ $\sigma$
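As a quick numerical sanity check of chisigma's solution, one can verify that $y=1+\ln x$ satisfies the original integral equation at a few sample points. A minimal sketch in Python (standard library only; the midpoint Riemann sum below is just a stand-in for the exact integral):

```python
import math

def f(x):
    # chisigma's candidate solution y = 1 + ln x
    return 1.0 + math.log(x)

def integral_from_1(func, x, n=100_000):
    # midpoint Riemann sum for the integral of func over [1, x]
    # (the sign works out automatically when x < 1)
    h = (x - 1.0) / n
    return h * sum(func(1.0 + (k + 0.5) * h) for k in range(n))

# check f(x) == 1 + (1/x) * integral_1^x f(t) dt at several x > 0
for x in (0.5, 2.0, 10.0):
    rhs = 1.0 + integral_from_1(f, x) / x
    assert abs(f(x) - rhs) < 1e-6, (x, f(x), rhs)

print("integral equation satisfied at all sample points")
```

The midpoint rule's $O(h^2)$ error is far below the $10^{-6}$ tolerance at this step count, so the assertions pass comfortably.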

#### evinda

##### Well-known member
MHB Site Helper
So, do I have to take the functions $f(x)=1+\frac{1}{x}\int_{1}^{x}f(t)\,dt$ and $g(x)=1+\frac{1}{x}\int_{1}^{x}g(t)\,dt$ and say:
$g(1)=f(1) \text{ and } g'(1)=f'(1)$, and so conclude that $f(x)=g(x)\ \forall x>0$? Or am I wrong?

#### evinda

##### Well-known member
MHB Site Helper
> So, do I have to take the functions $f(x)=1+\frac{1}{x}\int_{1}^{x}f(t)\,dt$ and $g(x)=1+\frac{1}{x}\int_{1}^{x}g(t)\,dt$ and say:
> $g(1)=f(1) \text{ and } g'(1)=f'(1)$, and so conclude that $f(x)=g(x)\ \forall x>0$? Or am I wrong?

The theorem is like this:
$f,g$ satisfy $y''+by=0$, $f(x_{0})=g(x_{0})$, $f'(x_{0})=g'(x_{0})$ $\Rightarrow$ $f(x)=g(x)\ \forall x \in (-\infty,+\infty)$. Right? So, is it applicable in this case, or not?

#### Evgeny.Makarov

##### Well-known member
MHB Math Scholar
> Setting $f(x)=y$ and differentiating the expression, with a little rearrangement you arrive at the ODE...
>
> $\displaystyle y' = \frac{1}{x},\ y(1)=1\qquad (1)$

> The theorem is like this:
> $f,g$ satisfy $y''+by=0$, $f(x_{0})=g(x_{0})$, $f'(x_{0})=g'(x_{0})$ $\Rightarrow$ $f(x)=g(x)\ \forall x \in (-\infty,+\infty)$. Right? So, is it applicable in this case, or not?

chisigma says that the original integral equation is reducible to a first-order differential equation, while you are quoting a theorem about second-order differential equations... I am not sure whether it makes more sense to clarify the assumptions of the Picard–Lindelöf theorem or to explain the difference between first- and second-order equations... I've seen evidence that you can solve complicated problems. Please give more thought to your posts.
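For completeness, the first-order reduction also yields a direct uniqueness argument that answers the original question without quoting any theorem (a sketch):

```latex
Suppose $f$ and $g$ are both continuous on $(0,\infty)$ and satisfy the
integral equation, and set $h = f - g$. Subtracting the two equations and
multiplying by $x$ gives
\[
  x\,h(x) = \int_{1}^{x} h(t)\,dt .
\]
The right-hand side is differentiable (its integrand is continuous), so $h$
is differentiable too. Differentiating,
\[
  h(x) + x\,h'(x) = h(x)
  \quad\Longrightarrow\quad
  h'(x) = 0 ,
\]
so $h$ is constant on $(0,\infty)$. Putting $x=1$ in the integral equation
gives $f(1) = g(1) = 1$, hence $h \equiv 0$, i.e. $f = g$.
```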