
MHB Master
#1
November 24th, 2017,
13:27
Hello!!!
We want to find an efficient algorithm that checks whether an odd number is a prime or not.
In order to obtain such an algorithm, one tests the congruence $(X+a)^n \equiv X^n+a$ not "absolutely" in $\mathbb{Z}_n[X]$, but modulo a polynomial $X^r-1$, where $r$ has to be chosen in a clever way. That is, one compares, in $\mathbb{Z}_n[X]$, the polynomials
$$(X+a)^n \pmod{X^r-1} \text{ and } (X^n+a) \pmod{X^r-1}=X^{n \bmod r}+a.$$
In the intermediate results that appear in the course of the computation of the power $(X+a)^n$, all coefficients are reduced modulo $n$, hence they can never exceed $n$. Calculating modulo $X^r-1$ just means that one can replace $X^s$ by $X^{s \bmod r}$, hence the degrees of the polynomials that appear as intermediate results can be kept below $r$. This keeps the computational cost in the polynomial range as long as $r$ is $O((\log{n})^c)$ for some constant $c$.
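For concreteness, this double reduction can be sketched in a few lines of Python (a sketch only; the coefficient-list representation and the helper name are made up):

```python
# Sketch: multiply two polynomials modulo (X^r - 1) and modulo n.
# A polynomial is a coefficient list of length r; index i holds the X^i coefficient.
def polymul_mod(p, q, r, n):
    out = [0] * r
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            # X^(i+j) is replaced by X^((i+j) mod r); coefficients are reduced mod n
            out[(i + j) % r] = (out[(i + j) % r] + pi * qj) % n
    return out
```

Every intermediate coefficient stays below $n$ and every degree stays below $r$, which is exactly what keeps the cost under control.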
How do we compute the computational cost for computing $(X+a)^n \pmod{X^r-1}$?


MHB Seeker
#2
November 25th, 2017,
06:27
Originally Posted by
evinda
This keeps the computational cost in the polynomial range as long as $r$ is $O((\log{n})^c)$ for some constant $c$.
Hey evinda!!
What does $r$ is $O((\log{n})^c)$ mean?
Originally Posted by
evinda
How do we compute the computational cost for computing $(X+a)^n \pmod{X^r-1}$?
That depends on the algorithm we use, doesn't it?
We might for instance set up a recursive algorithm based on squaring.
In that case let $f(n,r)$ be the computational cost of evaluating $(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)^{n} \pmod{X^r-1, n}$.
Then:
$$f(2k,r)=f(k,r)+\text{ cost to evaluate }(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)^2$$
and:
$$f(2k+1,r)=f(2k,r)+\text{ cost to evaluate }(X^{r-1}+c_{r-2}X^{r-2} + ... + c_0)(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)$$
Can we find the worst case computational cost for computing $(X+a)^n \pmod{X^r-1,n}$ from that?
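A minimal Python sketch of this recursive scheme (illustrative names; `mulmod` is plain schoolbook multiplication rather than the FFT, and $r \ge 2$ is assumed):

```python
# Sketch of the recursive square-and-multiply scheme for (X + a)^k mod (X^r - 1, n).
# Polynomials are coefficient lists of length r; index i holds the X^i coefficient.

def mulmod(p, q, r, n):
    # schoolbook product, reducing X^s -> X^(s mod r) and coefficients mod n
    out = [0] * r
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[(i + j) % r] = (out[(i + j) % r] + pi * qj) % n
    return out

def powmod(a, k, r, n):
    """(X + a)^k mod (X^r - 1, n), mirroring the cases f(2k, r) and f(2k+1, r)."""
    base = [0] * r
    base[0], base[1] = a % n, 1        # the polynomial X + a
    if k == 1:
        return base                    # base case: nothing left to do
    if k % 2 == 0:
        half = powmod(a, k // 2, r, n)
        return mulmod(half, half, r, n)                    # one squaring
    return mulmod(powmod(a, k - 1, r, n), base, r, n)      # one multiply by (X + a)
```

Each even step costs one squaring and each odd step one extra multiplication, matching the two cases of $f$ above.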

MHB Master
#3
November 25th, 2017,
07:38
Thread Author
Originally Posted by
I like Serena
Hey
evinda!!
What does $r$ is $O((\log{n})^c)$ mean?
I assume that we will find the computational cost with respect to $r$, and then we will deduce that the computational cost is polynomial only if $r=O((\log{n})^c)$.
Originally Posted by
I like Serena
That depends on the algorithm we use doesn't it?
We use fast exponentiation in the ring $\mathbb{Z}_n[X]$.
Originally Posted by
I like Serena
We might for instance set up a recursive algorithm based on squaring.
In that case let $f(n,r)$ be the computational cost of evaluating $(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)^{n} \pmod{X^r-1, n}$.
So, $f(n,r)$ will be the computational cost of evaluating the $n$-th power of any monic polynomial of degree $r-1$?
Originally Posted by
I like Serena
Then:
$$f(2k,r)=f(k,r)+\text{ cost to evaluate }(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)^2$$
and:
$$f(2k+1,r)=f(2k,r)+\text{ cost to evaluate }(X^{r-1}+c_{r-2}X^{r-2} + ... + c_0)(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)$$
I see...
Originally Posted by
I like Serena
Can we find the worst case computational cost for computing $(X+a)^n \pmod{X^r-1,n}$ from that?
Do we multiply two polynomials of some degree $d$ using the FFT, the time complexity of which is $O(d \log{d})$ ?
If so, will we have a recursive relation of the following form?
$f(n,n)=\left\{\begin{matrix}
f\left(\frac{n}{2},n \right )+O\left(\frac{n}{2}\log{\left(\frac{n}{2} \right )} \right ), & n \text{ even} \\
f\left(\frac{n-1}{2},n \right )+O\left(\frac{n-1}{2}\log{\left(\frac{n-1}{2} \right )} \right )+\dots, & n \text{ odd}
\end{matrix}\right.$
For the case when $n$ is odd, how do we multiply a polynomial of degree $n-1$ with a polynomial of degree $1$? Do we use the FFT again?

MHB Seeker
#4
November 25th, 2017,
08:23
Originally Posted by
evinda
I assume that we will find the computational cost with respect to $r$, and then we will deduce that the computational cost is polynomial only if $r=O((\log{n})^c)$.
Isn't $r$ supposed to be a constant?
Originally Posted by
evinda
So, $f(n,r)$ will be the computational cost of evaluating the $n$-th power of any monic polynomial of degree $r-1$?
Yep.
Originally Posted by
evinda
Do we multiply two polynomials of some degree $d$ using the FFT, the time complexity of which is $O(d \log{d})$ ?
I guess we can.
Originally Posted by
evinda
If so, will we have a recursive relation of the following form?
$f(n,n)=\left\{\begin{matrix}
f\left(\frac{n}{2},n \right )+O\left(\frac{n}{2}\log{\left(\frac{n}{2} \right )} \right ), & n \text{ even} \\
f\left(\frac{n-1}{2},n \right )+O\left(\frac{n-1}{2}\log{\left(\frac{n-1}{2} \right )} \right )+\dots, & n \text{ odd}
\end{matrix}\right.$
Shouldn't that be:
$$f(n,r)=\begin{cases}
f\left(\frac{n}{2},r \right )+O\left(r\log{r} \right ), & n \text{ even} \\
f\left(n-1,r \right )+O\left(r\log{r} \right ), & n \text{ odd}
\end{cases}$$
Originally Posted by
evinda
For the case when $n$ is odd, how do we multiply a polynomial of degree $n-1$ with a polynomial of degree $1$? Do we use the FFT again?
We don't.
First we evaluate the left hand polynomial completely, yielding a polynomial of degree $r-1$.
And then we multiply by the right hand polynomial of degree $r-1$.
We can indeed use the FFT for that.

MHB Master
#5
November 25th, 2017,
10:14
Thread Author
Originally Posted by
I like Serena
Isn't $r$ supposed to be a constant?
Yes, it just says at the beginning that $r$ will have to be chosen in a clever way. So they mean that we have to pick an $r$ that is $O((\log{n})^c)$, or not?
Originally Posted by
I like Serena
Shouldn't that be:
$$f(n,r)=\begin{cases}
f\left(\frac{n}{2},r \right )+O\left(r\log{r} \right ), & n \text{ even} \\
f\left(n-1,r \right )+O\left(r\log{r} \right ), & n \text{ odd}
\end{cases}$$
We don't.
First we evaluate the left hand polynomial completely, yielding a polynomial of degree $r-1$.
And then we multiply by the right hand polynomial of degree $r-1$.
We can indeed use the FFT for that.
Isn't it as follows?
$(X+a)^{2k+1} \mod{(X^r-1)}=((X+a)^k)^2 \cdot (X+a) \mod{(X^r-1)}$
So don't we multiply a polynomial of maximum degree $r-1$ by a polynomial of degree $1$?
Or am I wrong?

MHB Seeker
#6
November 25th, 2017,
11:30
Originally Posted by
evinda
Yes, it just says at the beginning that $r$ will have to be chosen in a clever way. So they mean that we have to pick an $r$ that is $O((\log{n})^c)$, or not?
I don't know what that is supposed to mean. I was hoping you could clarify.
Originally Posted by
evinda
Isn't it as follows?
$(X+a)^{2k+1} \mod{(X^r-1)}=((X+a)^k)^2 \cdot (X+a) \mod{(X^r-1)}$
So don't we multiply a polynomial of maximum degree $r-1$ by a polynomial of degree $1$?
Or am I wrong?
Oh yes, the right side polynomial is always $(X+a)$.
That means we can calculate it directly:
$$(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)(X+a) \\
= X^r + (b_{r-2}+a)X^{r-1} + (b_{r-3}+ab_{r-2})X^{r-2} + ... +(b_0+ab_1)X+ab_0 \\
= (b_{r-2}+a)X^{r-1} + (b_{r-3}+ab_{r-2})X^{r-2} + ... +(b_0+ab_1)X+(ab_0 + 1)
$$
It takes $r$ multiplications and $r$ additions, so it's $O(r)$.
It doesn't really matter though. We might as well use the same FFT for $O(r\log r)$.
That's because $O(r\log r + r)=O(r\log r)$.
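The $O(r)$ step can be written out directly as a short sketch in Python (the function name and list representation are made up):

```python
# Sketch: multiply a reduced polynomial b (coefficient list of length r,
# index i holds the X^i coefficient) by (X + a), modulo (X^r - 1, n).
def times_x_plus_a(b, a, r, n):
    out = [0] * r
    for i, bi in enumerate(b):
        out[(i + 1) % r] = (out[(i + 1) % r] + bi) % n  # the X * b part; X^r wraps to 1
        out[i] = (out[i] + a * bi) % n                  # the a * b part
    return out
```

A single pass over the $r$ coefficients, so the cost is linear in $r$ as claimed.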

MHB Master
#7
November 25th, 2017,
11:55
Thread Author
Originally Posted by
I like Serena
I don't know what that is supposed to mean. I was hoping you could clarify.
It says the following:
But as they write it, they mean that the computational cost is polynomial only if $r$ is $O((\log{n})^c)$. I don't know why they say it like that.
Originally Posted by
I like Serena
Oh yes, the right side polynomial is always $(X+a)$.
That means we can calculate it directly:
$$(X^{r-1}+b_{r-2}X^{r-2} + ... + b_0)(X+a) \\
= X^r + (b_{r-2}+a)X^{r-1} + (b_{r-3}+ab_{r-2})X^{r-2} + ... +(b_0+ab_1)X+ab_0 \\
= (b_{r-2}+a)X^{r-1} + (b_{r-3}+ab_{r-2})X^{r-2} + ... +(b_0+ab_1)X+(ab_0 + 1)
$$
It takes $r$ multiplications and $r$ additions, so it's $O(r)$.
It doesn't really matter though. We might as well use the same FFT for $O(r\log r)$.
That's because $O(r\log r + r)=O(r\log r)$.
So don't we have the following?
$$f(n,r)=\begin{cases}
f\left(\frac{n}{2},r \right )+O\left(r\log{r} \right ), & n \text{ even} \\
f\left(n-1,r \right )+O(r), & n \text{ odd}
\end{cases}$$
Also, I was wondering whether it is possible that the polynomial we get has degree $<r-1$...
In addition, we also need an initial condition in order to be able to solve the recurrence relation, don't we?
Do we check the computational cost when $n=r$, in order to find an initial condition? Or am I wrong?

MHB Seeker
#8
November 25th, 2017,
12:09
Originally Posted by
evinda
It says the following:
But as they write it, they mean that the computational cost is polynomial only if $r$ is $O((\log{n})^c)$. I don't know why they say it like that.
It actually says 'if $r$ is of size $O((\log{n})^c)$'.
So I believe they're not talking about computational complexity but just about which constant value of $r$ to pick.
In particular, that affects the number of different values of $a$ that we have to check.
Anyway, we're currently only talking about the computational complexity for a single value of $a$.
Originally Posted by
evinda
So don't we have the following?
$$f(n,r)=\begin{cases}
f\left(\frac{n}{2},r \right )+O\left(r\log{r} \right ), & n \text{ even} \\
f\left(n-1,r \right )+O(r), & n \text{ odd}
\end{cases}$$
Also, I was wondering whether it is possible that the polynomial we get has degree $<r-1$...
Yep.
A degree $<r-1$ can happen in some of the iterations, but not generally.
And it does not matter for the worst case computational complexity.
Originally Posted by
evinda
In addition, we also need an initial condition in order to be able to solve the recurrence relation, don't we?
Do we check the computational cost when $n=r$, in order to find an initial condition? Or am I wrong?
Indeed, we need an initial condition.
As the algorithm is now, we can only stop when $n=1$, can't we?
Otherwise we don't actually have the polynomial yet.
And then there is nothing to do, so the computational complexity is $0$.

MHB Master
#9
November 25th, 2017,
12:56
Thread Author
Originally Posted by
I like Serena
It actually says 'if $r$ is of size $O((\log{n})^c)$'.
So I believe they're not talking about computational complexity but just about which constant value of $r$ to pick.
In particular that affects that number of different values of $a$ that we have to check.
So the computational cost will be polynomial, no matter what $r$ we pick, right?
It doesn't say how $r$ affects the number of different values of $a$ that we have to check, does it?
Originally Posted by
I like Serena
A degree $<r1$ can happen in some of the iterations, but not generally.
And it does not matter for the worst case computational complexity.
Ah, ok.
Originally Posted by
I like Serena
Indeed, we need an initial condition.
As the algorithm is now, we can only stop when $n=1$, can't we?
Otherwise we don't actually have the polynomial yet.
And then there is nothing to do, so the computational complexity is $0$.
Ah yes, I see... So we have the following recurrence relation, right?
$$f(n,r)=\left\{\begin{matrix}
0, & n=1\\
f\left(\frac{n}{2},r \right )+O(r \log{r}), & n \text{ even} \\
f(n-1,r)+O(r), & n \text{ odd},\ n>1.
\end{matrix}\right.$$
How can we solve the recurrence relation, now that $f$ is a function of two variables? Do we maybe replace $r$ by $O((\log{n})^c)$ ?

MHB Seeker
#10
November 25th, 2017,
13:10
Originally Posted by
evinda
So the computational cost will be polynomial, no matter what $r$ we will pick, right?
It doesn't say how $r$ affects the number of different values of $a$ that we have to check, does it?
I still don't quite get what is intended.
And as far as I can tell the cost will be polynomial whatever we do.
Originally Posted by
evinda
Ah yes, I see... So we have the following recurrence relation, right?
$$f(n,r)=\left\{\begin{matrix}
0, & n=1\\
f\left(\frac{n}{2},r \right )+O(r \log{r}), & n \text{ even} \\
f(n-1,r)+O(r), & n \text{ odd},\ n>1.
\end{matrix}\right.$$
How can we solve the recurrence relation, now that $f$ is a function of two variables? Do we maybe replace $r$ by $O((\log{n})^c)$ ?
Yep.
We can simplify it:
$$\begin{aligned}f(n,r)&=\begin{cases}
0, & n=1\\
f\left(\frac{n}{2},r \right )+O(r \log{r}), & n \text{ even} \\
f(n-1,r)+O(r), & n \text{ odd},\ n>1
\end{cases} \\
&=\begin{cases}
0, & n=1\\
f\left(\frac{n}{2},r \right )+O(r \log{r}), & n \text{ even} \\
f\left(\frac{n-1}{2},r\right)+O(r) + O(r\log r), & n \text{ odd},\ n>1
\end{cases} \\
&=\begin{cases}
0, & n=1\\
f\left(\lfloor\tfrac{n}{2}\rfloor,r \right )+O(r \log{r}), & n >1
\end{cases} \\
&= O(\log n \cdot r \log r)
\end{aligned}$$
Now we can substitute $r=O((\log n)^c)$ if we want.
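To tie everything together, here is an illustrative end-to-end sketch (not the full AKS test; the names are made up and `mulmod` is schoolbook multiplication). For prime $n$ the computed power must equal $X^{n \bmod r}+a$:

```python
# Sketch: check the congruence (X + a)^n ≡ X^(n mod r) + a  (mod X^r - 1, n).
# Polynomials are coefficient lists of length r; index i holds the X^i coefficient.

def mulmod(p, q, r, n):
    # schoolbook product modulo (X^r - 1, n)
    out = [0] * r
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[(i + j) % r] = (out[(i + j) % r] + pi * qj) % n
    return out

def lhs(a, n, r):
    """(X + a)^n mod (X^r - 1, n), by iterative square-and-multiply; assumes r >= 2."""
    base = [0] * r
    base[0], base[1] = a % n, 1   # the polynomial X + a
    result = None
    k = n
    while k:
        if k & 1:
            result = base[:] if result is None else mulmod(result, base, r, n)
        base = mulmod(base, base, r, n)
        k >>= 1
    return result

def rhs(a, n, r):
    """X^(n mod r) + a, the value the congruence predicts for prime n."""
    out = [0] * r
    out[0] = a % n
    out[n % r] = (out[n % r] + 1) % n
    return out
```

For example, `lhs(2, 7, 3) == rhs(2, 7, 3)` holds because $7$ is prime; for composite $n$ the two sides generally differ for a suitable choice of $a$, which is the heart of the test.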