Understanding the Relationship Between Logarithmic and Exponential Functions

In summary, the conversation discusses the inequality log(n) < n^a for all n, a > 0, why it is hard to grasp intuitively, and the concept of increasing functions. The main question is whether there is a constant a such that x^a < log(x), and the thread works out the role of the constant in the definition of big-O. The conclusion is that it suffices to prove the existence of a constant c, rather than finding a specific value, such that log(n) is always at most a factor c larger than n^a.
  • #1
mikeng
This is probably really easy, but I am really rusty and I want to really understand.
I am trying to understand why [tex] \log(n) < n^a [/tex] for all n, a > 0.
I am really having a hard time getting an intuition for this. Of course if I let a >= 1 then it's obvious, but how do I know there is not some extremely small epsilon that breaks the inequality? I understand maybe I should use that in a proof, but I am kind of stuck.

Also, this is not directly related, but a thought I had: is there a name for functions that have the property that if x < y then g(x) < g(y)? I'm curious.
 
  • #2
mikeng said:
Also, this is not directly related, but a thought I had: is there a name for functions that have the property that if x < y then g(x) < g(y)? I'm curious.

Functions that satisfy ##x < y \Rightarrow g(x) < g(y)## are called strictly increasing functions. You might want to read this: https://en.wikipedia.org/wiki/Monotonic_function
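
A standard example (my addition, not from the thread): the logarithm itself is strictly increasing. Taking log as the natural logarithm, as the calculus later in this thread does,
[tex] \frac{d}{dx}\log(x) = \frac{1}{x} > 0 \quad \text{for } x > 0, [/tex]
so its derivative is positive everywhere on its domain and x < y implies log(x) < log(y).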
 
  • #3
Math_QED said:
Functions that satisfy ##x < y \Rightarrow g(x) < g(y)## are called strictly increasing functions.
Thanks, that makes sense :)
 
  • #4
I see, it did not occur to me to use derivatives.
I have been looking this over to try to see if I missed something. Correct me if I am wrong, but doesn't this only show log(x) < x? I wonder whether there is a constant a such that [tex]x^a < \log(x)[/tex]
Hmm, but using derivatives we must show that
[tex]\frac{1}{x} < a x^{a-1}[/tex]
Or is my thinking wrong?
Or, more simply, that [tex]1 < a x^a[/tex] (see the sketch below).
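
To close that gap (a sketch I'm adding, continuing the derivative comparison above): solving ##1 < a x^a## gives
[tex] a x^a > 1 \iff x^a > \frac{1}{a} \iff x > \left(\frac{1}{a}\right)^{1/a}, [/tex]
so once ##x > (1/a)^{1/a}##, the derivative of ##x^a## exceeds the derivative of ##\log(x)##, and from that point on ##x^a## grows faster than ##\log(x)##. Note this does not make ##\log(x) < x^a## hold for all x, only eventually.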
 
  • #5
mikeng said:
I am trying to understand why [tex] \log(n) < n^a [/tex] for all n, a > 0.
Try n = 100, a = .001
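
A quick numeric check of that counterexample (my addition; either base of log breaks the inequality here):
[code]
import math

n, a = 100, 0.001
print(math.log10(n))  # 2.0
print(n ** a)         # ~1.0046, so log(n) > n**a here
print(math.log(n))    # ~4.605 -- the natural log breaks the inequality too
[/code]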
 
  • #7
mikeng
Thanks, but then I must have misunderstood the origin of the question. It says: recall that for any c > 0, log(n) is O(n^c).
This is the first problem from here: https://ocw.mit.edu/courses/electri...fall-2011/assignments/MIT6_006F11_ps1_sol.pdf

There must be something I am missing here, because clearly log(n) < n^c is not true for all c > 0.

Hold on a minute, lol, I forgot the constant. I can choose my constant in the definition of big-O to cancel any c > 0, or?

Well, after thinking about it, I am not sure how to find such a constant. Hmm, I am probably making this a lot more complicated.
 
  • #8
The constant is the important difference. log(n) can be larger than n^a, but no matter which a you choose, it will always be at most some factor c larger (c depends on a). Finding a specific c can be challenging, but it is not necessary: it is sufficient to show that such a c exists.
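
One way to prove existence (a sketch I'm adding, using the natural log and the standard bound ##\ln(x) \le x## for ##x > 0##):
[tex] \ln(n) = \frac{1}{a}\ln\!\left(n^a\right) \le \frac{1}{a}\,n^a, [/tex]
so ##c = 1/a## works, giving ##\log(n) = O(n^a)## for every fixed ##a > 0##. Changing the base of the logarithm only changes the constant.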
 
  • #9
mfb said:
The constant is the important difference. log(n) can be larger than n^a, but no matter which a you choose, it will always be at most some factor c larger (c depends on a). Finding a specific c can be challenging, but it is not necessary: it is sufficient to show that such a c exists.

Thanks, it makes sense. I will continue to think about how to prove that: to show that no matter how much bigger log(n) gets than n^a, it is bigger by at most some constant factor. But at least I understand why it is true; I just want to prove it. Thanks :)
 

Related to Understanding the Relationship Between Logarithmic and Exponential Functions

1. What does the notation "log(n)" mean in this context?

The notation "log(n)" refers to the logarithm of n. In this thread it is read as the base-10 logarithm, but for growth-rate comparisons the base does not matter: logarithms in different bases differ only by a constant factor.

2. How is the logarithm function related to exponential functions?

The logarithm function is the inverse of the exponential function: if y = a^x, then x = log_a(y). For example, 2^3 = 8 and log_2(8) = 3.

3. Can you explain the meaning of "log(n) < n^a" in simpler terms?

This inequality says that the logarithm of n is smaller than n raised to the power a. As the thread shows, this can fail for small a at moderate n, but as n increases, log(n) grows more slowly than n^a, so n^a eventually overtakes it and stays ahead.
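
A small script (my addition) showing where the crossover happens for a fairly small exponent:

[code]
import math

a = 0.1
for k in range(0, 13):
    n = 10 ** k
    # log10(n) is just k; compare it with n**a = 10**(k*a)
    print(n, k, n ** a, k < n ** a)
# For a = 0.1, log10(n) exceeds n**a over a wide middle range,
# but n**a overtakes it for good around n ~ 1e11.
[/code]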

4. What is the significance of this inequality in mathematics?

This inequality is significant in mathematics because it helps us understand the growth rates of functions. It shows that power functions, such as n^a (for any fixed a > 0), eventually grow much faster than logarithmic functions, such as log(n), even though log(n) can be larger for small n.

5. How is this inequality used in real-world applications?

This inequality is used in various real-world applications, such as analyzing algorithms and data structures. It helps us understand the time complexity and efficiency of different algorithms, and how efficiently data can be sorted and searched. It is also used in fields such as finance and biology to model and predict growth rates.
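
As a concrete algorithmic illustration (my addition, not from the thread): binary search on a sorted list makes about log2(n) comparisons, which is where logarithmic running times typically come from:

[code]
import math

def binary_search(items, target):
    """Return (index, comparisons) for target in a sorted list, or (-1, comparisons)."""
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, steps
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

n = 1_000_000
idx, steps = binary_search(list(range(n)), n - 1)
print(steps, math.ceil(math.log2(n)))  # about 20 comparisons for a million items
[/code]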
