Number Theory: Why isn't gcd(0,0) = 1?


Poirot

Banned
Feb 15, 2012
250
I have read that, by convention, gcd(0,0) = 0. But surely 1 fits the bill: everything divides 0, including 1, and since 1 divides everything, we must have gcd(0,0) = 1.
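For what it's worth, this convention shows up in real implementations. A quick check in Python (assuming Python 3.5+, where math.gcd was added):

```python
# CPython follows the gcd(0, 0) = 0 convention.
import math

print(math.gcd(0, 0))   # 0
print(math.gcd(0, 12))  # 12 -- gcd(0, n) = n for n >= 0
```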
 

Petek

Member
Jan 8, 2013
20
Poirot said:
I have read that, by convention, gcd(0,0) = 0. But surely 1 fits the bill: everything divides 0, including 1, and since 1 divides everything, we must have gcd(0,0) = 1.
What's your source for the convention that gcd(0,0) = 0? I looked at several books on number theory and they all include in the definition of gcd(a,b) the assumption that at least one of a and b is nonzero. Besides, since every positive integer divides 0, there is no greatest common divisor of 0 and 0.
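To spell out that last point as a formula: the set of common divisors of 0 and 0 is

$$\{d \in \mathbb{Z}^+ : d \mid 0 \text{ and } d \mid 0\} = \mathbb{Z}^+,$$

and $\mathbb{Z}^+$ has no largest element, so a "greatest" common divisor in the magnitude sense does not exist.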
 

Bacterius

Well-known member
MHB Math Helper
Jan 26, 2012
644
Petek said:
What's your source for the convention that gcd(0,0) = 0? I looked at several books on number theory and they all include in the definition of gcd(a,b) the assumption that at least one of a and b is nonzero. Besides, since every positive integer divides 0, there is no greatest common divisor of 0 and 0.
No, it is a convention: see here. Having $\gcd(0, 0) = 1$ would break the following property:

$$m \cdot \gcd(a, b) = \gcd(m \cdot a, m \cdot b) \quad \text{for } m \geq 0,$$

which would yield $1 = \text{anything}$: setting $a = b = 0$ gives $m \cdot \gcd(0, 0) = \gcd(0, 0)$, so if $\gcd(0, 0) = 1$, then $m \cdot 1 = 1$ for every $m \geq 0$.

Generally, conventions like these are chosen to minimize the number of special cases theorems have to deal with. In and of themselves, they are rather trivial; in practice, it's not very useful to know whether $\gcd(0, 0) = 0$ or $1$.
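This "fewest special cases" point can be seen in the Euclidean algorithm itself: the standard iterative version (a minimal sketch in Python, not from this thread) returns 0 for gcd(0, 0) without any special-casing.

```python
def gcd(a: int, b: int) -> int:
    # Iterative Euclidean algorithm. For (0, 0) the loop body never
    # runs, so the function returns 0 -- the convention falls out for free.
    while b != 0:
        a, b = b, a % b
    return abs(a)

assert gcd(0, 0) == 0   # no special case needed
assert gcd(0, 7) == 7   # gcd(0, n) = n
assert gcd(12, 18) == 6
```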
 

Klaas van Aarsen

MHB Seeker
Staff member
Mar 5, 2012
8,780
Bacterius said:
No, it is a convention: see here. Having $\gcd(0, 0) = 1$ would break the following property:

$$m \cdot \gcd(a, b) = \gcd(m \cdot a, m \cdot b) \quad \text{for } m \geq 0,$$

which would yield $1 = \text{anything}$: setting $a = b = 0$ gives $m \cdot \gcd(0, 0) = \gcd(0, 0)$, so if $\gcd(0, 0) = 1$, then $m \cdot 1 = 1$ for every $m \geq 0$.

Generally, conventions like these are chosen to minimize the number of special cases theorems have to deal with. In and of themselves, they are rather trivial; in practice, it's not very useful to know whether $\gcd(0, 0) = 0$ or $1$.
I do not consider Wolfram|Alpha a good reference.
There are many cases where it does not give the proper mathematical answer.
I don't blame it; it's a calculator, not a math reference.

As for gcd(0,0), I don't see specific references to it on Wikipedia or on Wolfram MathWorld.
However, both give the definition that it "is the largest positive integer that divides the numbers without a remainder."
Since every positive integer divides 0, it would follow from that definition that gcd(0,0) is undefined (or infinite).
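One standard way to reconcile that definition with the $\gcd(0,0) = 0$ convention (not raised in the thread, but worth noting): take "greatest" with respect to divisibility rather than magnitude, i.e. define

$$\gcd(a, b) = d \quad \text{where } d \mid a,\ d \mid b,\ \text{and } c \mid d \text{ for every common divisor } c \text{ of } a \text{ and } b.$$

Since every integer divides 0, $d = 0$ satisfies this for $a = b = 0$, giving $\gcd(0, 0) = 0$ with no exception needed.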
 

Bacterius

Well-known member
MHB Math Helper
Jan 26, 2012
644
Klaas van Aarsen said:
I do not consider Wolfram|Alpha a good reference.
There are many cases where it does not give the proper mathematical answer.
I don't blame it; it's a calculator, not a math reference.

As for gcd(0,0), I don't see specific references to it on Wikipedia or on Wolfram MathWorld.
However, both give the definition that it "is the largest positive integer that divides the numbers without a remainder."
Since every positive integer divides 0, it would follow from that definition that gcd(0,0) is undefined (or infinite).
The assumption was that the people who wrote Wolfram|Alpha probably know their stuff and would handle the special cases correctly. Of course it's not an official reference, but it's handy for quickly checking things :)