# Why does log(f(x)) = log(g(x)) imply f(x) = g(x)?

#### Poly

##### Member
In my book/course we always assume $\log(f(x)) = \log(g(x)) \implies f(x) = g(x)$.

Could someone explain why this is true? Usually f(x) and g(x) are polynomials.

#### MarkFL

Staff member
Basically, I would say it has to do with the one-to-one nature of log functions. Since there is only one output associated with a given input, then given two equal outputs, we know the inputs must also be equal.

#### Poly

##### Member
Can we prove that $\log(x)$ is one-to-one then? For all $a,b \in\mathbb{R}^{+}$ that $\log(a) = \log(b) \implies a = b$.

#### Jameson

Staff member
Proving the existence and uniqueness of the logarithm is something you might see in real analysis in college. That is probably overkill for what you need, but a proof is sketched here.

How rigorous do you need your proof to be? Are you looking for an intuitive explanation for yourself or a proof?

#### MarkFL

Staff member
Can we prove that $\log(x)$ is one-to-one then? For all $a,b \in\mathbb{R}^{+}$ that $\log(a) = \log(b) \implies a = b$.
Using differential calculus, one can demonstrate that the log function is monotonically increasing. I hesitate to use this as you have posted in the Pre-Calculus forum. At this level, it generally suffices to use the horizontal line test on the graph of the general log function and observe that the log function intersects any horizontal line only once.
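The horizontal line test can also be spot-checked numerically; here is a small Python sketch (the heights and sample points are my own choices) showing that each horizontal line at height $y$ meets the graph of $\ln$ at exactly one point, $x = e^y$:

```python
import math

# Horizontal line test, numerically: for each height y, the graph of
# ln meets the horizontal line at exactly one x, namely x = exp(y).
crossings = {y: math.exp(y) for y in [-2.0, 0.0, 1.5, 3.0]}
for y, x in crossings.items():
    assert math.isclose(math.log(x), y)   # the crossing lies on the graph
    # monotonicity rules out a second crossing: ln is below y just to
    # the left of x and above y just to the right
    assert math.log(x * 0.9) < y < math.log(x * 1.1)
```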

#### Poly

##### Member
Proving the existence and uniqueness of the logarithm is something you might see in real analysis in college. That is probably overkill for what you need, but a proof is sketched here.

How rigorous do you need your proof to be? Are you looking for an intuitive explanation for yourself or a proof?
Initially I just wanted an explanation. Now I sort of get it, but I'm tempted to learn a proof. Thanks for the link.

#### Poly

##### Member
Using differential calculus, one can demonstrate that the log function is monotonically increasing. I hesitate to use this as you have posted in the Pre-Calculus forum. At this level, it generally suffices to use the horizontal line test on the graph of the general log function and observe that the log function intersects any horizontal line only once.
I posted it in this section because I thought it would be simpler, but I don't mind calculus.

#### MarkFL

Staff member
I posted it in this section because I thought it would be simpler, but I don't mind calculus.
To keep it as simple as possible, consider:

$\displaystyle f(x)=\ln(x)$ where $\displaystyle 0<x$

We see then that:

$\displaystyle f'(x)=\frac{1}{x}>0 \quad \forall\, x\in\mathbb{R}^{+}$

This means that for all x in the domain, the log function is increasing.

A similar argument can be used for logs of other bases, via the change of base theorem. If the base is less than 1, then the function will be decreasing, but it will still be monotonic.
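Both claims above can be checked numerically; here is a quick Python sketch (the sample points, step size, and tolerances are arbitrary choices of mine):

```python
import math

# Central-difference check that d/dx ln(x) = 1/x > 0 on R+.
h = 1e-6
derivs = {}
for x in [0.1, 1.0, 5.0, 100.0]:
    derivs[x] = (math.log(x + h) - math.log(x - h)) / (2 * h)
    assert math.isclose(derivs[x], 1 / x, rel_tol=1e-4)
    assert derivs[x] > 0  # increasing everywhere on its domain

# Change of base: log_a(x) = ln(x)/ln(a).  For 0 < a < 1 the factor
# 1/ln(a) is negative, so log_a is decreasing but still monotonic.
vals = [math.log(x, 0.5) for x in [1.0, 2.0, 4.0]]
assert vals[0] > vals[1] > vals[2]
```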

#### Poly

##### Member
Thanks. So to extend this to $f(x) = \ln(g(x))$, where $g(x)$ takes values in $\mathbb{R}^+$, we would have $f'(x) = \frac{g'(x)}{g(x)}$. Why is this greater than zero?

#### MarkFL

Staff member
You're right: unless the argument of the log function is itself monotonic, there is no guarantee that the resulting composite function is also monotonic. Perhaps we can approach this from a different angle. Consider:

$\displaystyle \log_a(f(x))=\log_a(g(x))$ where $\displaystyle a\ne1$.

Convert from logarithmic to exponential form:

$\displaystyle f(x)=a^{\log_a(g(x))}=g(x)$
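That last step rests on the identity $a^{\log_a(y)} = y$, which is easy to spot-check numerically (the base and test values below are arbitrary choices of mine):

```python
import math

# Spot-check the identity a**log_a(y) = y used in the conversion step.
a = 3.0
for y in [0.25, 1.0, 7.0, 42.0]:
    assert math.isclose(a ** math.log(y, a), y)
```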

#### Plato

##### Well-known member
MHB Math Helper
Can we prove that $\log(x)$ is one-to-one then? For all $a,b \in\mathbb{R}^{+}$ that $\log(a) = \log(b) \implies a = b$.

Here is another way. Suppose that $e$ is the base (really it can be any generic base).

\begin{align*}\log(a)&=\log(b)\\ e^a &=e^b \\e^{a-b} &=1\\ a-b&=0\\a &= b \end{align*}.

#### Poly

##### Member
Here is another way. Suppose that $e$ is the base (really it can be any generic base).

\begin{align*}\log(a)&=\log(b)\\ e^a &=e^b \\e^{a-b} &=1\\ a-b&=0\\a &= b \end{align*}.
Thanks. But that relies on $e^{a} = e^{b} \implies a = b$.

#### Poly

##### Member
Here is another way. Suppose that $e$ is the base (really it can be any generic base).

\begin{align*}\log(a)&=\log(b)\\ e^a &=e^b \\e^{a-b} &=1\\ a-b&=0\\a &= b \end{align*}.
How do you get $a-b = 0$? I see that $e^{0} = 1$, but that amounts to using what we're proving.

#### Jameson

Staff member
How do you get $a-b = 0$? I see that $e^{0} = 1$, but that amounts to using what we're proving.
It's using the fact that given a base $b>1$, if $$\displaystyle b^x=1$$ then $x=0$ is the unique solution. However, using this fact without proof is not rigorous; it needs to be proven. I don't think you're going to see an answer that is truly rigorous without going into some high-level math like in the link I gave you.

This also only applies to the vector space $\mathbb{R}^1$, I believe, and proving something about a polynomial space doesn't automatically follow without justification. So again, this boils down to how rigorous we are going to get.

I think it's great though you are pursuing this proof!

#### Plato

##### Well-known member
MHB Math Helper
How do you get $a-b = 0$? I see that $e^{0} = 1$, but that amounts to using what we're proving.

Absolutely it does not.

Any mathematically literate person knows that
$$\text{If }b\ne 0\text{, then }b^x=1\text{ if and only if }x=0.$$


#### Jameson

Staff member
Plato,

Isn't it also necessary to exclude $b=1$ and $b=-1$?

#### Plato

##### Well-known member
MHB Math Helper
Plato,

Isn't it also necessary to exclude $b=1$ and $b=-1$?
That is a very good point.
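For what it's worth, the corrected statement (base $b > 0$ with $b \ne 1$) is easy to spot-check numerically; a small Python sketch, with bases of my own choosing:

```python
import math

# With the corrected hypotheses (b > 0 and b != 1), b**x = 1 has the
# unique solution x = 0, since x = log_b(b**x) = log_b(1) = 0.
for b in [0.5, 2.0, 10.0]:
    assert b ** 0 == 1.0
    assert math.log(1.0, b) == 0.0
# The excluded bases really do break uniqueness:
assert 1.0 ** 5 == 1.0          # b = 1: every exponent works
assert (-1.0) ** 2 == 1.0       # b = -1: x = 2 also works
```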

#### Deveno

##### Well-known member
MHB Math Scholar
this proof relies on two facts (the proof of these two facts is another story):

1) $\log_a(x) - \log_a(y) = \log_a(\frac{x}{y})$

2) $a^x = y$ when $\log_a(y) = x$ <--this is really saying $\log_a$ is 1-1, more on that later.

(this is often given as the definition of log (base a) of y: the number you have to exponentiate a by to get y).

now if:

$\log_a(x) = \log_a(y)$

$\log_a(\frac{x}{y}) = \log_a(x) - \log_a(y) = 0$

so:

$a^{\log_a(\frac{x}{y})} = a^0$

that is:

$\frac{x}{y} = 1$

which means:

$x = y$.
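a quick numeric spot-check of facts (1) and (2) above (the base and test pairs are arbitrary choices, not part of the argument):

```python
import math

# fact (1), the quotient rule the derivation rests on:
# log_a(x) - log_a(y) = log_a(x/y)
a = 2.0
for x, y in [(8.0, 2.0), (5.0, 5.0), (0.3, 1.2)]:
    lhs = math.log(x, a) - math.log(y, a)
    rhs = math.log(x / y, a)
    assert math.isclose(lhs, rhs, abs_tol=1e-12)

# fact (2): a**log_a(y) = y
assert math.isclose(a ** math.log(5.0, a), 5.0)
```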

to be honest, there's a degree of "circularity" in this argument: we use the fact that $\log_a$ is an inverse function to exponentiation by a (functions with inverses have to be 1-1). to avoid that, we need a better definition of $\log_a$.

the most "practical" definition (in terms of avoiding self-reference) is:

$$\log(x) = \int_1^x \dfrac{1}{t}\ dt$$

then one can PROVE that:

$\log(ab) = \log(a) + \log(b)$ (*)

and that $\log$ (note the absence of an indicated base) is an increasing function (and thus has an inverse) on $(0,\infty)$.

the proof of (*) is interesting, so i'll give it here:

$$\log(ab) = \int_1^{ab} \dfrac{1}{t}\ dt = \int_1^a \dfrac{1}{t}\ dt + \int_a^{ab} \dfrac{1}{t}\ dt = \log(a) + \int_a^{ab} \frac{1}{t}\ dt$$

to evaluate the second integral, we make a "u-substitution":

let $u = \dfrac{t}{a}$, so that $du = \dfrac{1}{a}\ dt$

as $t$ goes from $a$ to $ab$, $u$ goes from 1 to $b$ so:

$$\int_a^{ab} \frac{1}{t}\ dt = \int_1^b \frac{a}{au}\ du = \int_1^b \frac{1}{u}\ du = \log(b)$$
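this additive law can also be verified numerically straight from the integral definition; a sketch using a midpoint-rule approximation (the step count and tolerances are my own choices):

```python
import math

def log_via_integral(x, n=100_000):
    """Midpoint-rule approximation of log(x) = integral of 1/t from 1 to x,
    valid here for x >= 1."""
    h = (x - 1.0) / n
    return h * sum(1.0 / (1.0 + (k + 0.5) * h) for k in range(n))

a, b = 2.0, 3.0
# the additive law proved above, checked on the integral definition:
assert math.isclose(log_via_integral(a * b),
                    log_via_integral(a) + log_via_integral(b),
                    rel_tol=1e-8)
# and it agrees with the library's natural log:
assert math.isclose(log_via_integral(a), math.log(a), rel_tol=1e-8)
```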

it should be clear that $\log$ defined this way is increasing, so it ought to have an inverse defined on its range (ignoring, for the moment, just what its range might be). let's call this function $g$.

suppose $x = \log(a)$, $y = \log(b)$ which means:

$g(x) = a, g(y) = b$.

then: $x + y = \log(a) + \log(b) = \log(ab)$, so:

$g(x+y) = g(\log(ab)) = ab = g(x)g(y)$.

so $g$ "acts like" it's some function $c^x$ for some $c$. what might $c$ be?

well, $c^1 = c$ so we should have:

$$1 = \log(c) = \int_1^c \frac{1}{t}\ dt$$

this only gives a definition of "log" (no base). can you figure out how you would get to "$\log_a$"(with a base)?
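one way to see what $c$ must be: approximate the integral numerically and bisect for the value with $\log(c) = 1$; it lands on $e \approx 2.71828$. (the approximation scheme below is my own sketch, not part of the argument.)

```python
import math

def log_int(x, n=20_000):
    """Midpoint-rule approximation of the integral of 1/t from 1 to x (x >= 1)."""
    h = (x - 1.0) / n
    return h * sum(1.0 / (1.0 + (k + 0.5) * h) for k in range(n))

# bisect on [2, 3] for the c with log(c) = 1; log_int is increasing
# there, so bisection converges to the unique root.
lo, hi = 2.0, 3.0
for _ in range(40):
    mid = (lo + hi) / 2.0
    if log_int(mid) < 1.0:
        lo = mid
    else:
        hi = mid
assert math.isclose(lo, math.e, rel_tol=1e-5)
```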

*******
as others have pointed out, $\log(f(x))$ may not be a 1-1 function. BUT....

if $\log(f(x)) = \log(g(x)),\ \forall x \in \Bbb R$

we can conclude that:

$f(x) = g(x),\ \forall x \in \Bbb R$, hence $f = g$.

it could happen that $\log(f(x))$ is undefined for certain $x$. in that case, we can only be sure that $f(x) = g(x)$ for those $x$ where the logs ARE defined, so $f = g$ need not hold everywhere. sometimes one has to consider $|f(x)|$ and $|g(x)|$ instead.
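a concrete (hypothetical) pair illustrating that last point: $f(x) = x - 2$ and $g(x) = |x - 2|$ agree wherever $\log(f(x))$ is defined, yet differ as functions on $\Bbb R$:

```python
import math

# f and g satisfy log(f(x)) = log(g(x)) wherever the left side is
# defined (x > 2), yet f != g on all of R.
def f(x): return x - 2.0
def g(x): return abs(x - 2.0)

for x in [2.5, 3.0, 10.0]:               # log(f(x)) is defined here
    assert math.isclose(math.log(f(x)), math.log(g(x)))
    assert f(x) == g(x)
assert f(0.0) != g(0.0)                  # equality fails off that set
```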