- Thread starter: Poly
- Start date: Jan 26, 2012

- Admin
- #4

How rigorous do you need your proof to be? Are you looking for an intuitive explanation for yourself or a proof?

- Admin
- #5

> Can we prove that $\log(x)$ is one-to-one then? For all $a,b \in\mathbb{R}^{+}$, $\log(a) = \log(b) \implies a = b$.

Using differential calculus, one can demonstrate that the log function is monotonically increasing. I hesitate to use this as you have posted in the Pre-Calculus forum. At this level, it generally suffices to use the horizontal line test on the graph of the general log function and observe that, for any horizontal line, the log function intersects this line only once.

- Thread starter
- #6

> How rigorous do you need your proof to be? Are you looking for an intuitive explanation for yourself or a proof?

Initially I just wanted an explanation. Now I sort of get it, but I'm tempted to learn a proof. Thanks for the link.

- Thread starter
- #7

> Using differential calculus, one can demonstrate that the log function is monotonically increasing. I hesitate to use this as you have posted in the Pre-Calculus forum. At this level, it generally suffices to use the horizontal line test on the graph of the general log function and observe that, for any horizontal line, the log function intersects this line only once.

I posted it in this section because I thought it would be simpler, but I don't mind calculus.

- Admin
- #8

> I posted it in this section because I thought it would be simpler, but I don't mind calculus.

To keep it as simple as possible, consider:

$\displaystyle f(x)=\ln(x)$ where $\displaystyle 0<x$

We see then that:

$\displaystyle f'(x)=\frac{1}{x}>0$ $\displaystyle\text{ }$ $\displaystyle\forall x\in\mathbb{R}^{+}$

This means that $f$ is strictly increasing on all of $\mathbb{R}^{+}$, and a strictly increasing function is necessarily one-to-one.

A similar argument can be used for logs of other bases, via the change of base theorem. If the base is less than 1, then the function will be decreasing, but it will still be monotonic.
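As a quick numerical sanity check of the monotonicity claims (a sketch only, not a proof; the sample points and base $1/2$ are arbitrary choices):

```python
import math

# Sample points in increasing order (arbitrary choices).
xs = [0.1, 0.5, 1.0, 2.0, 10.0, 100.0]

# ln should be strictly increasing on these points...
ln_vals = [math.log(x) for x in xs]
assert all(a < b for a, b in zip(ln_vals, ln_vals[1:]))

# ...while a log with base < 1 should be strictly decreasing
# (change of base: log_{1/2}(x) = ln(x) / ln(1/2)).
half_vals = [math.log(x, 0.5) for x in xs]
assert all(a > b for a, b in zip(half_vals, half_vals[1:]))
```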


- Admin
- #10

$\displaystyle \log_a(f(x))=\log_a(g(x))$ where $\displaystyle a\ne1$.

Convert from logarithmic to exponential form:

$\displaystyle f(x)=a^{\log_a(g(x))}=g(x)$
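The conversion step can be illustrated numerically (the base $a=3$ and value $y=42$ are arbitrary sample choices):

```python
import math

# a^(log_a(y)) recovers y, for a > 0, a != 1.
a, y = 3.0, 42.0
recovered = a ** math.log(y, a)  # math.log(y, a) is log base a of y
assert math.isclose(recovered, y)
```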

- #11

> Can we prove that $\log(x)$ is one-to-one then? For all $a,b \in\mathbb{R}^{+}$, $\log(a) = \log(b) \implies a = b$.

Here is another way. Suppose that $e$ is the base (really, it can be a generic base).

[tex]\begin{align*}\log(a)&=\log(b)\\ e^a &=e^b \\e^{a-b} &=1\\ a-b&=0\\a &= b \end{align*}[/tex].

- Thread starter
- #12

> Here is another way. Suppose that $e$ is the base (really, it can be a generic base).
>
> [tex]\begin{align*}\log(a)&=\log(b)\\ e^a &=e^b \\e^{a-b} &=1\\ a-b&=0\\a &= b \end{align*}[/tex]

Thanks. But that relies on $e^{a} = e^{b} \implies a = b$.

- Thread starter
- #13

> Here is another way. Suppose that $e$ is the base (really, it can be a generic base).
>
> [tex]\begin{align*}\log(a)&=\log(b)\\ e^a &=e^b \\e^{a-b} &=1\\ a-b&=0\\a &= b \end{align*}[/tex]

How do you get $a-b = 0$? I see that $e^{0} = 1$, but that amounts to using what we're proving.

- Admin
- #14

> How do you get $a-b = 0$? I see that $e^{0} = 1$, but that amounts to using what we're proving.

It's using the fact that, given a base $b>1$ and $b^x=1$, then $x=0$ is the unique solution. However, using this fact without proof is not rigorous; it needs to be proven. I don't think you're going to see an answer that is truly rigorous without going into some high-level math like in the link I gave you.

This also only applies to the vector space $\mathbb{R}^1$, I believe, and proving something about polynomial space doesn't automatically follow without justification. So again, this boils down to how rigorous we are going to get.

I think it's great though you are pursuing this proof!

- #15

> How do you get $a-b = 0$? I see that $e^{0} = 1$, but that amounts to using what we're proving.

Any mathematically literate person knows that

[tex]\text{If }b\ne 0\text{, then }b^x=1\text{ if and only if }x=0[/tex].

- Admin
- #16

Plato,

Isn't it also necessary to exclude $b=1$ and $b=-1$?

- #17

> Plato,
>
> Isn't it also necessary to exclude $b=1$ and $b=-1$?

That is a very good point.

- #18

1) $\log_a(x) - \log_a(y) = \log_a(\frac{x}{y})$

2) $a^x = y$ when $\log_a(y) = x$ <--this is really saying $\log_a$ is 1-1, more on that later.

(this is often given as the definition of log (base a) of y: the number you have to exponentiate a by to get y).

now if:

$\log_a(x) = \log_a(y)$

then, by 1):

$\log_a(\frac{x}{y}) = \log_a(x) - \log_a(y) = 0$

so:

$a^{\log_a(\frac{x}{y})} = a^0$

that is:

$\frac{x}{y} = 1$

which means:

$x = y$.

to be honest, there's a degree of "circularity" in this argument: we use the fact that $\log_a$ is an inverse function to exponentiation by a (functions with inverses have to be 1-1). to avoid that, we need a better definition of $\log_a$.

the most "practical" definition (in terms of avoiding self-reference) is:

$$\log(x) = \int_1^x \dfrac{1}{t}\ dt$$

then one can PROVE that:

$\log(ab) = \log(a) + \log(b)$ (*)

and that $\log$ (note the absence of an indicated base) is an increasing function (and thus has an inverse) on $(0,\infty)$.

the proof of (*) is interesting, so i'll give it here:

$$\log(ab) = \int_1^{ab} \dfrac{1}{t}\ dt = \int_1^a \dfrac{1}{t}\ dt + \int_a^{ab} \dfrac{1}{t}\ dt = \log(a) + \int_a^{ab} \frac{1}{t}\ dt$$

to evaluate the second integral, we make a "u-substitution":

let $u = \dfrac{t}{a}$, so that $du = \dfrac{1}{a}\ dt$

as $t$ goes from $a$ to $ab$, $u$ goes from 1 to $b$ so:

$$\int_a^{ab} \frac{1}{t}\ dt = \int_1^b \frac{a}{au}\ du = \int_1^b \frac{1}{u}\ du = \log(b)$$
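the identity can also be checked numerically straight from the integral definition (a sketch, using a midpoint-rule quadrature; the values $a=2$, $b=5$ and the tolerance are arbitrary choices):

```python
import math

def log_via_integral(x, n=10000):
    """Midpoint-rule approximation of the integral of 1/t from 1 to x."""
    h = (x - 1.0) / n
    return sum(h / (1.0 + (i + 0.5) * h) for i in range(n))

a, b = 2.0, 5.0

# log(ab) should equal log(a) + log(b)...
assert abs(log_via_integral(a * b)
           - (log_via_integral(a) + log_via_integral(b))) < 1e-4

# ...and should agree with the library natural log.
assert abs(log_via_integral(a * b) - math.log(a * b)) < 1e-4
```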

it should be clear that $\log$ defined this way is increasing, so it ought to have an inverse defined on its range (ignoring, for the moment, just what its range might be). let's call this function $g$.

suppose $x = \log(a)$, $y = \log(b)$ which means:

$g(x) = a, g(y) = b$.

then: $x + y = \log(a) + \log(b) = \log(ab)$, so:

$g(x+y) = g(\log(ab)) = ab = g(x)g(y)$.

so $g$ "acts like" it's some function $c^x$ for some $c$. what might $c$ be?

well, $c^1 = c$ so we should have:

$$1 = \log(c) = \int_1^c \frac{1}{t}\ dt$$

this only gives a definition of "log" (no base). can you figure out how you would get to "$\log_a$"(with a base)?
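numerically, that $c$ can be pinned down by bisection against the integral definition, and it comes out to Euler's number $e$ (a sketch with a midpoint-rule quadrature; the bracket $(2,3)$, iteration count, and tolerance are arbitrary choices):

```python
import math

def log_via_integral(x, n=20000):
    """Midpoint-rule approximation of the integral of 1/t from 1 to x."""
    h = (x - 1.0) / n
    return sum(h / (1.0 + (i + 0.5) * h) for i in range(n))

# Bisection: find c in (2, 3) with log_via_integral(c) = 1.
lo, hi = 2.0, 3.0
for _ in range(40):
    mid = (lo + hi) / 2.0
    if log_via_integral(mid) < 1.0:
        lo = mid
    else:
        hi = mid

# The solution should be Euler's number e.
assert abs((lo + hi) / 2.0 - math.e) < 1e-4
```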

*******

as others have pointed out, $\log(f(x))$ may not be a 1-1 function. BUT....

if $\log \circ f = \log \circ g,\ \forall x \in \Bbb R$

we can conclude that:

$f(x) = g(x),\ \forall x \in \Bbb R$, hence $f = g$.

it could happen that $\log(f(x))$ is undefined, for certain $x$. in that case, we can only be sure that $f = g$ for those $x$ where the logs ARE defined. so sometimes one has to consider $|f(x)|$ and $|g(x)|$ instead.