
[SOLVED] Square root in Q(root 2) means it's in Z[root 2]

caffeinemachine

Let $a,b \in \mathbb{Z}$. Show that if $a+b\sqrt{2}$ has a square root in $\mathbb{Q}(\sqrt{2})$, then that square root actually lies in $\mathbb{Z}[\sqrt{2}]$.

Only one approach comes to mind. Let $r_1, r_2 \in \mathbb{Q}$ be such that $a+b\sqrt{2}=(r_1+r_2\sqrt{2})^2$. Expanding gives $a=r_1^2+2r_2^2$ and $b=2r_1r_2$, so I need to somehow show that $r_1, r_2$ are integers. I played with these equations by writing $r_i=p_i/q_i$ with $\gcd(p_i,q_i)=1$, but I couldn't conclude anything.
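For a concrete instance of what the claim predicts: $$(1+\sqrt{2})^2 = 3+2\sqrt{2},$$ so $3+2\sqrt{2}$ has the square root $1+\sqrt{2}\in\mathbb{Z}[\sqrt{2}]$; here $a=3=r_1^2+2r_2^2$ and $b=2=2r_1r_2$ with $r_1=r_2=1$.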
 

Opalg

If $a+b\sqrt{2}$ has a square root in $\mathbb{Q}(\sqrt{2})$, then the square root can be written in the form $\dfrac{p+q\sqrt2}r$, where $p$, $q$ and $r$ are integers and $r$ is chosen to be positive and as small as possible (so that in particular the triple $p,\,q,\,r$ will have no common factor greater than 1).

Then $r^2(a+b\sqrt2) = (p+q\sqrt2)^2 = p^2+2q^2 + 2pq\sqrt2$, and therefore $p^2+2q^2 - r^2a = (r^2b-2pq)\sqrt2.$ But $\sqrt2$ is irrational, so no nonzero integer multiple of it can be an integer. Both sides must therefore be zero, giving $$p^2+2q^2 = r^2a, \qquad 2pq = r^2b.$$
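As a concrete check of these equations (an illustration only, not part of the proof), take the deliberately non-reduced representation $\left(\dfrac{9+6\sqrt2}{3}\right)^2 = 17+12\sqrt2$, that is, $p=9$, $q=6$, $r=3$, $a=17$, $b=12$. Then indeed $$p^2+2q^2 = 81+72 = 153 = 9\cdot 17 = r^2a, \qquad 2pq = 108 = 9\cdot 12 = r^2b.$$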
Suppose that $r$ has an odd prime factor $\rho$. Then the second of those displayed equations shows that $\rho$ is a factor of either $p$ or $q$. The first of the displayed equations then shows that $\rho$ is a factor of both $p$ and $q$. Thus $p$, $q$ and $r$ have the common factor $\rho$, contrary to the initial assumption.
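In the illustration above this is exactly what happens: $\rho=3$ divides $r$, and it divides both $p=9$ and $q=6$, so the whole triple cancels down, $$\frac{9+6\sqrt2}{3} = \frac{3+2\sqrt2}{1},$$ which is precisely the reduction that the minimality of $r$ forbids.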

Next, suppose that $r$ is even, say $r=2s$. Then the first displayed equation becomes $p^2+2q^2 = 4s^2a$, showing that $p$ must be even, say $p=2t.$ It follows that $2t^2+q^2 = 2s^2a$, showing that $q$ is even. Thus $p$, $q$ and $r$ have the common factor 2, again contrary to the initial assumption.
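Again a concrete illustration (not part of the proof): $\left(\dfrac{6+4\sqrt2}{2}\right)^2 = 17+12\sqrt2$ gives $p=6$, $q=4$, $r=2$. With $s=1$ the first equation reads $36+32 = 68 = 4\cdot 17$, so $p=2t$ with $t=3$; then $2t^2+q^2 = 18+16 = 34 = 2\cdot 17 = 2s^2a$ with $q=4$ even, and the triple cancels down to $p=3$, $q=2$, $r=1$.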

The conclusion is that $r$ has no prime factors at all and is therefore equal to 1, proving that $a+b\sqrt{2}$ has a square root in $\mathbb{Z}[\sqrt{2}].$
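As an aside, here is a minimal SymPy sketch for sanity-checking the statement numerically (an illustration only; it relies on SymPy's `sqrtdenest` succeeding whenever the square root really does lie in $\mathbb{Q}(\sqrt2)$):

```python
# Sanity check of the claim: whenever sqrt(a + b*sqrt(2)) lies in Q(sqrt(2)),
# its coefficients should come out as integers.  Illustrative sketch only.
from sympy import sqrt, sqrtdenest

SQRT2 = sqrt(2)

def sqrt_in_Q_sqrt2(a, b):
    """If a + b*sqrt(2) has a square root c0 + c1*sqrt(2) with c0, c1
    rational, return (c0, c1); otherwise return None."""
    root = sqrtdenest(sqrt(a + b * SQRT2))  # try to denest the nested radical
    c1 = root.coeff(SQRT2)                  # coefficient of sqrt(2)
    c0 = root - c1 * SQRT2                  # rational part, if denesting worked
    if c0.is_rational and c1.is_rational:
        return c0, c1
    return None

print(sqrt_in_Q_sqrt2(3, 2))    # (1, 1):  (1 + sqrt(2))**2   = 3 + 2*sqrt(2)
print(sqrt_in_Q_sqrt2(17, 12))  # (3, 2):  (3 + 2*sqrt(2))**2 = 17 + 12*sqrt(2)
print(sqrt_in_Q_sqrt2(1, 1))    # None: 1 + sqrt(2) has no square root in Q(sqrt(2))
```

In every case where a pair is returned, the coefficients are integers, as the argument above predicts.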
 

caffeinemachine

Thank you so much!