  1. evinda (MHB Master, MHB Site Helper, MHB Math Helper)
    Join Date: Apr 2013 · Posts: 3,562 · Thanks: 3,466 times · Thanked: 1,072 times
    Awards: MHB Model User Award (2016), MHB Model User Award (2014)
    #1
    Hello!!!

    We want to find an efficient algorithm that checks whether an odd number is a prime or not.
    In order to obtain such an algorithm, one tests the congruence $(X+a)^n \equiv X^n+a$ not "absolutely" in $\mathbb{Z}_n[X]$, but modulo a polynomial $X^r-1$, where $r$ has to be chosen in a clever way. That is, one compares, in $\mathbb{Z}_n[X]$, the polynomials
    $$(X+a)^n \pmod{X^r-1} \text{ and } (X^n+a) \pmod{X^r-1}=X^{n \mod{r}}+a.$$

    In the intermediate results that appear in the course of the computation of the power $(X+a)^n$, all coefficients are reduced modulo $n$, hence they can never exceed $n$. Calculating modulo $X^r-1$ just means that one can replace $X^s$ by $X^{s \mod{r}}$, hence the degrees of the polynomials that appear as intermediate results can be kept below $r$. This keeps the computational cost in the polynomial range as long as $r$ is $O((\log{n})^c)$ for some constant $c$.
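The reduction just described can be sketched in code (a hedged illustration, not from the text; `polymul_mod` is a hypothetical helper using schoolbook multiplication):

```python
# A sketch of multiplication in Z_n[X] modulo X^r - 1.
# Polynomials are coefficient lists, index = power of X.
def polymul_mod(p, q, r, n):
    """Product of p and q in Z_n[X]/(X^r - 1): coefficients are reduced
    mod n and exponents are folded mod r, exactly as described above."""
    out = [0] * r
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[(i + j) % r] = (out[(i + j) % r] + pi * qj) % n
    return out

# (X + 2)^2 = X^2 + 4X + 4; with r = 2 the X^2 term folds onto X^0:
print(polymul_mod([2, 1], [2, 1], 2, 7))  # → [5, 4], i.e. 4X + 5
```

The fold `(i + j) % r` keeps every intermediate degree below $r$, and the `% n` keeps every coefficient below $n$, matching the two reductions in the text.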

    How do we compute the computational cost for computing $(X+a)^n \pmod{X^r-1}$ ?

  3. Klaas van Aarsen (MHB Seeker, MHB Global Moderator, MHB Math Scholar, MHB Coder)
    Join Date: Mar 2012 · Location: Leiden · Posts: 7,707 · Thanks: 5,162 times · Thanked: 13,906 times
    Awards: MHB Model Helper Award (2017), MHB Best Ideas (2017), MHB Analysis Award (2017), MHB Calculus Award (2017), MHB Model Helper Award (2016)
    #2
    Quote Originally Posted by evinda View Post
    This keeps the computational cost in the polynomial range as long as $r$ is $O((\log{n})^c)$ for some constant $c$.
    Hey evinda!!

    What does $r$ is $O((\log{n})^c)$ mean?

    Quote Originally Posted by evinda View Post
    How do we compute the computational cost for computing $(X+a)^n \pmod{X^r-1}$ ?
    That depends on the algorithm we use, doesn't it?

    We might for instance set up a recursive algorithm based on squaring.
    In that case let $f(n,r)$ be the computational cost of evaluating $(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)^{n} \pmod{X^r-1, n}$.
    Then:
    $$f(2k,r)=f(k,r)+\text{ cost to evaluate }(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)^2$$
    and:
    $$f(2k+1,r)=f(2k,r)+\text{ cost to evaluate }(X^{r-1}+c_{r-2}X^{r-2} + ... + c_0)(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)$$
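A hedged sketch of this squaring recursion (the names `polymul_mod` and `powmod_poly` are illustrative; schoolbook multiplication stands in for whatever routine, e.g. the FFT, one actually uses):

```python
def polymul_mod(p, q, r, n):
    # Schoolbook product in Z_n[X]/(X^r - 1); a stand-in for an FFT multiply.
    out = [0] * r
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[(i + j) % r] = (out[(i + j) % r] + pi * qj) % n
    return out

def powmod_poly(a, k, r, n):
    """(X + a)^k in Z_n[X]/(X^r - 1) via the recursion above (assumes r >= 2)."""
    if k == 1:
        base = [0] * r
        base[0], base[1] = a % n, 1    # the polynomial X + a
        return base
    if k % 2 == 0:                     # f(2m, r) = f(m, r) + one squaring
        half = powmod_poly(a, k // 2, r, n)
        return polymul_mod(half, half, r, n)
    # f(2m+1, r) = f(2m, r) + one more multiplication
    return polymul_mod(powmod_poly(a, k - 1, r, n),
                       powmod_poly(a, 1, r, n), r, n)

# Sanity check against the congruence from the thread: for prime n = 7,
# (X + 1)^7 ≡ X^(7 mod 3) + 1 = X + 1  (mod X^3 - 1, 7).
print(powmod_poly(1, 7, 3, 7))  # → [1, 1, 0]
```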

    Can we find the worst case computational cost for computing $(X+a)^n \pmod{X^r-1,n}$ from that?

  4. evinda
    #3 Thread Author
    Quote Originally Posted by I like Serena View Post
    Hey evinda!!

    What does $r$ is $O((\log{n})^c)$ mean?
    I assume that we will find the computational cost with respect to $r$ and then deduce that the computational cost is polynomial only if $r=O((\log{n})^c)$.

    Quote Originally Posted by I like Serena View Post
    That depends on the algorithm we use doesn't it?
    We use fast exponentiation in the ring $\mathbb{Z}_n[X]$.

    Quote Originally Posted by I like Serena View Post
    We might for instance set up a recursive algorithm based on squaring.
    In that case let $f(n,r)$ be the computational cost of evaluating $(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)^{n} \pmod{X^r-1, n}$.
    So, $f(n,r)$ will be the computational cost of raising any monic polynomial of degree $r-1$ to the $n$-th power?

    Quote Originally Posted by I like Serena View Post

    Then:
    $$f(2k,r)=f(k,r)+\text{ cost to evaluate }(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)^2$$
    and:
    $$f(2k+1,r)=f(2k,r)+\text{ cost to evaluate }(X^{r-1}+c_{r-2}X^{r-2} + ... + c_0)(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)$$
    I see...

    Quote Originally Posted by I like Serena View Post
    Can we find the worst case computational cost for computing $(X+a)^n \pmod{X^r-1,n}$ from that?
    Do we multiply two polynomials of some degree $d$ using the FFT, the time complexity of which is $O(d \log{d})$ ?

    If so, will we have a recursive relation of the following form?

    $f(n,n)=\left\{\begin{matrix}
    f\left(\frac{n}{2},n \right )+O\left(\frac{n}{2}\log{\left(\frac{n}{2} \right )} \right ), & n \text{ even} \\
    f\left(\frac{n-1}{2},n \right )+O\left(\frac{n-1}{2}\log{\left(\frac{n-1}{2} \right )} \right )+\dots, & n \text{ odd}
    \end{matrix}\right.$


    For the case when $n$ is odd, how do we multiply a polynomial of degree $n-1$ with a polynomial of degree $1$ ? Do we use again the FFT?

  5. Klaas van Aarsen
    #4
    Quote Originally Posted by evinda View Post
    I assume that we will find the computational cost in respect to $r$ and then we will deduce that the computational cost is polynomial only if $r=O((\log{n})^c)$.
    Isn't $r$ supposed to be a constant?

    Quote Originally Posted by evinda View Post
    So, $f(n,r)$ will be the computational cost of raising any monic polynomial of degree $r-1$ to the $n$-th power?
    Yep.

    Quote Originally Posted by evinda View Post
    Do we multiply two polynomials of some degree $d$ using the FFT, the time complexity of which is $O(d \log{d})$ ?
    I guess we can.

    Quote Originally Posted by evinda View Post
    If so, will we have a recursive relation of the following form?

    $f(n,n)=\left\{\begin{matrix}
    f\left(\frac{n}{2},n \right )+O\left(\frac{n}{2}\log{\left(\frac{n}{2} \right )} \right ), & n \text{ even} \\
    f\left(\frac{n-1}{2},n \right )+O\left(\frac{n-1}{2}\log{\left(\frac{n-1}{2} \right )} \right )+\dots, & n \text{ odd}
    \end{matrix}\right.$
    Shouldn't that be:
    $$f(n,r)=\begin{cases}
    f\left(\frac{n}{2},r \right )+O\left(r\log{r} \right ), & n \text{ even} \\
    f\left(n-1,r \right )+O\left(r\log{r} \right ), & n \text{ odd}
    \end{cases}$$
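As a hedged sanity check on this recurrence (illustrative code, not from the thread): counting one ring multiplication per step, the total number of multiplications is $O(\log n)$, and each costs $O(r\log r)$ with an FFT-based multiply:

```python
import math

def num_mults(n):
    """Number of polynomial multiplications performed by the recurrence:
    one squaring when n is even, one extra multiplication when n is odd."""
    count = 0
    while n > 1:
        count += 1
        n = n - 1 if n % 2 else n // 2
    return count

# At most ~2 log2(n) multiplications, so the total cost is O(log n * r log r).
assert all(num_mults(n) <= 2 * math.log2(n) for n in range(2, 10_000))
```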


    Quote Originally Posted by evinda View Post
    For the case when $n$ is odd, how do we multiply a polynomial of degree $n-1$ with a polynomial of degree $1$ ? Do we use again the FFT?
    We don't.
    First we evaluate the left hand polynomial completely, yielding a polynomial of degree $r-1$.
    And then we multiply by the right hand polynomial of degree $r-1$.
    We can indeed use the FFT for that.

  6. evinda
    #5 Thread Author
    Quote Originally Posted by I like Serena View Post
    Isn't $r$ supposed to be a constant?
    Yes, it just says at the beginning that $r$ will have to be chosen in a clever way. So they mean that we have to pick an $r$ of order $O((\log{n})^c)$, don't they?



    Quote Originally Posted by I like Serena View Post

    Shouldn't that be:
    $$f(n,r)=\begin{cases}
    f\left(\frac{n}{2},r \right )+O\left(r\log{r} \right ), & n \text{ even} \\
    f\left(n-1,r \right )+O\left(r\log{r} \right ), & n \text{ odd}
    \end{cases}$$



    We don't.
    First we evaluate the left hand polynomial completely, yielding a polynomial of degree $r-1$.
    And then we multiply by the right hand polynomial of degree $r-1$.
    We can indeed use the FFT for that.
    Isn't it as follows?

    $(X+a)^{2k+1} \mod{(X^r-1)}=((X+a)^k)^2 \cdot (X+a) \mod{(X^r-1)}$

    So don't we multiply a polynomial of maximum degree $r-1$ by a polynomial of degree $1$?
    Or am I wrong?

  7. Klaas van Aarsen
    #6
    Quote Originally Posted by evinda View Post
    Yes, it just says at the beginning that $r$ will have to be chosen in a clever way. So they mean that we have to pick an $r$ in the order $O((\log{n})^c)$, or not?
    I don't know what that is supposed to mean. I was hoping you could clarify.


    Quote Originally Posted by evinda View Post
    Isn't it as follows?

    $(X+a)^{2k+1} \mod{(X^r-1)}=((X+a)^k)^2 \cdot (X+a) \mod{(X^r-1)}$

    So don't we multiply a polynomial of maximum degree $r-1$ by a polynomial of degree $1$?
    Or am I wrong?
    Oh yes, the right side polynomial is always $(X+a)$.
    That means we can calculate it directly:
    $$(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)(X+a) \\
    = X^r + (b_{r-2}+a)X^{r-1} + (b_{r-3}+ab_{r-2})X^{r-2} + ... +(b_0+ab_1)X+ab_0 \\
    = (b_{r-2}+a)X^{r-1} + (b_{r-3}+ab_{r-2})X^{r-2} + ... +(b_0+ab_1)X+(ab_0 + 1)
    $$
    It takes $r$ multiplications and $r$ additions, so it's $O(r)$.

    It doesn't really matter though. We might as well use the same FFT for $O(r\log r)$.
    That's because $O(r\log r + r)=O(r\log r)$.
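The direct $O(r)$ multiplication by $(X+a)$ can be sketched as follows (a hedged illustration; `mul_by_x_plus_a` is a hypothetical name; each input coefficient contributes once shifted up a degree and once scaled by $a$, with the folded $X^r$ term wrapping onto the constant):

```python
def mul_by_x_plus_a(b, a, r, n):
    """Multiply the polynomial with coefficient list b (index = power,
    length r) by (X + a) in Z_n[X]/(X^r - 1) using O(r) operations."""
    out = [0] * r
    for k in range(r):
        out[(k + 1) % r] = (out[(k + 1) % r] + b[k]) % n  # X * b_k X^k
        out[k] = (out[k] + a * b[k]) % n                  # a * b_k X^k
    return out

# (X + 2) * (X + 2) = X^2 + 4X + 4 in Z_7[X]/(X^3 - 1):
print(mul_by_x_plus_a([2, 1, 0], 2, 3, 7))  # → [4, 4, 1]
```

This avoids the FFT for the odd step, though as noted above it makes no asymptotic difference since $O(r\log r + r)=O(r\log r)$.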

  8. evinda
    #7 Thread Author
    Quote Originally Posted by I like Serena View Post
    I don't know what that is supposed to mean. I was hoping you could clarify.
    It says the following:



    But as they write it, they mean that the computational cost is polynomial only if $r$ is $O((\log{n})^c)$. I don't know why they say it like that.

    Quote Originally Posted by I like Serena View Post
    Oh yes, the right side polynomial is always $(X+a)$.
    That means we can calculate it directly:
    $$(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)(X+a) \\
    = X^r + (b_{r-2}+a)X^{r-1} + (b_{r-3}+ab_{r-2})X^{r-2} + ... +(b_0+ab_1)X+ab_0 \\
    = (b_{r-2}+a)X^{r-1} + (b_{r-3}+ab_{r-2})X^{r-2} + ... +(b_0+ab_1)X+(ab_0 + 1)
    $$
    It takes $r$ multiplications and $r$ additions, so it's $O(r)$.

    It doesn't really matter though. We might as well use the same FFT for $O(r\log r)$.
    That's because $O(r\log r + r)=O(r\log r)$.
    So don't we have the following?

    $$f(n,r)=\begin{cases}
    f\left(\frac{n}{2},r \right )+O\left(r\log{r} \right ), & n \text{ even} \\
    f\left(n-1,r \right )+O(r), & n \text{ odd}
    \end{cases}$$

    Also, I was wondering whether it is possible that the polynomial we get has degree $<r-1$...

    In addition, we also need an initial condition in order to be able to solve the recurrence relation, don't we?
    Do we check the computational cost when $n=r$, in order to find an initial condition? Or am I wrong?

  9. Klaas van Aarsen
    #8
    Quote Originally Posted by evinda View Post
    It says the following:

    But as they write it, they mean that the computational cost is polynomial only if $r$ is $O((\log{n})^c)$. I don't know why they say it like that.
    It actually says 'if $r$ is of size $O((\log{n})^c)$'.
    So I believe they're not talking about computational complexity but just about which constant value of $r$ to pick.
    In particular, that affects the number of different values of $a$ that we have to check.

    Anyway, we're currently only talking about the computational complexity for a single value of $a$.

    Quote Originally Posted by evinda View Post
    So don't we have the following?

    $$f(n,r)=\begin{cases}
    f\left(\frac{n}{2},r \right )+O\left(r\log{r} \right ), & n \text{ even} \\
    f\left(n-1,r \right )+O(r), & n \text{ odd}
    \end{cases}$$

    Also, I was wondering whether it is possible that the polynomial we get has degree $<r-1$...
    Yep.

    A degree $<r-1$ can happen in some of the iterations, but not generally.
    And it does not matter for the worst case computational complexity.

    Quote Originally Posted by evinda View Post
    In addition, we also need an initial condition in order to be able to solve the recurrence relation, don't we?
    Do we check the computational cost when $n=r$, in order to find an initial condition? Or am I wrong?
    Indeed, we need an initial condition.
    As the algorithm stands, we can only stop when $n=1$, can't we?
    Otherwise we don't actually have the polynomial yet.
    And then there is nothing to do, so the computational complexity is $0$.

  10. evinda
    #9 Thread Author
    Quote Originally Posted by I like Serena View Post
    It actually says 'if $r$ is of size $O((\log{n})^c)$'.
    So I believe they're not talking about computational complexity but just about which constant value of $r$ to pick.
    In particular that affects that number of different values of $a$ that we have to check.
    So the computational cost will be polynomial, no matter what $r$ we will pick, right?

    It doesn't say how $r$ affects the number of different values of $a$ that we have to check, does it?


    Quote Originally Posted by I like Serena View Post

    A degree $<r-1$ can happen in some of the iterations, but not generally.
    And it does not matter for the worst case computational complexity.
    Ah, ok.


    Quote Originally Posted by I like Serena View Post

    Indeed, we need an initial condition.
    As the algorithm is now, we can only stop when $n=1$ can't we?
    Otherwise we don't actually have the polynomial yet.
    And then there is nothing to do, so the computational complexity is $0$.
    Ah yes, I see... So we have the following recurrence relation, right?


    $$f(n,r)=\begin{cases}
    0, & n=1\\
    f\left(\frac{n}{2},r \right )+O(r \log{r}), & n \text{ even} \\
    f(n-1,r)+O(r), & n \text{ odd},\ n>1.
    \end{cases}$$

    How can we solve the recurrence relation, now that $f$ is a function of two variables? Do we maybe replace $r$ by $O((\log{n})^c)$ ?

  11. Klaas van Aarsen
    #10
    Quote Originally Posted by evinda View Post
    So the computational cost will be polynomial, no matter what $r$ we will pick, right?

    It doesn't say how $r$ affects the number of different values of $a$ that we have to check, does it?
    I still don't quite get what is intended.
    And as far as I can tell the cost will be polynomial whatever we do.


    Quote Originally Posted by evinda View Post
    Ah yes, I see... So we have the following recurrence relation, right?

    $$f(n,r)=\begin{cases}
    0, & n=1\\
    f\left(\frac{n}{2},r \right )+O(r \log{r}), & n \text{ even} \\
    f(n-1,r)+O(r), & n \text{ odd},\ n>1.
    \end{cases}$$

    How can we solve the recurrence relation, now that $f$ is a function of two variables? Do we maybe replace $r$ by $O((\log{n})^c)$ ?
    Yep.

    We can simplify it:
    $$\begin{aligned}f(n,r)&=\begin{cases}
    0, & n=1\\
    f\left(\frac{n}{2},r \right )+O(r \log{r}), & n \text{ even } \\
    f(n-1,r)+O(r), & n \text{ odd}>1
    \end{cases} \\
    &=\begin{cases}
    0, & n=1\\
    f\left(\frac{n}{2},r \right )+O(r \log{r}), & n \text{ even } \\
    f(\frac{n-1}2,r)+O(r) + O(r\log r), & n \text{ odd}>1
    \end{cases} \\
    &=\begin{cases}
    0, & n=1\\
    f\left(\lfloor\frac{n}{2}\rfloor,r \right )+O(r \log{r}), & n >1
    \end{cases} \\
    &= O(\log n \cdot r \log r)
    \end{aligned}$$

    Now we can substitute $r=O((\log n)^c)$ if we want.
