The greatest common divisor of n integers

In summary, the statement gcd(a_1, ..., a_n) = min{ gcd(a_i, a_j) : i, j ∈ {1, ..., n} } is not true (the set {6, 10, 15} is a counterexample). The original claim, that the gcd exists and can be written as an integer linear combination of the a_i, can instead be proved by induction on n.
  • #1
Ratpigeon

Homework Statement


Define the gcd of a set of n integers, S = {a_1, ..., a_n}.
Prove that it exists and can be written as
q_1 a_1 + ... + q_n a_n for some integers q_1, ..., q_n.

Homework Equations



Euclid's Algorithm?

The Attempt at a Solution



I have the statement that gcd(a_1, ..., a_n) = min{ gcd(a_i, a_j) : i, j ∈ {1, ..., n} }

and then I know I can say that since any two integers have a gcd, the gcd of the set exists and can be represented as a linear combination of the a_i and a_j that attain the minimal pairwise gcd; but I don't know how to prove the first statement.
 
  • #2
I say this should be moved to the Precalculus help page...

Anyhow... :) Well, for a start, your statement is NOT true.
Consider {6, 10, 15}. Then gcd(6, 10, 15) = 1, but gcd(6, 10) = 2, gcd(6, 15) = 3, and gcd(10, 15) = 5.

Try solving it for n = 3. See if that gives you any ideas on how to approach the general case.
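Just to illustrate, here is a quick numerical check of that counterexample (a minimal Python sketch; the variable names are only for illustration):

```python
from itertools import combinations
from math import gcd

nums = [6, 10, 15]

# gcd of the whole set, folded pairwise: gcd(gcd(6, 10), 15)
g_all = nums[0]
for x in nums[1:]:
    g_all = gcd(g_all, x)

# smallest pairwise gcd over all pairs {a_i, a_j}
min_pairwise = min(gcd(a, b) for a, b in combinations(nums, 2))

print(g_all)         # 1
print(min_pairwise)  # 2, so the proposed identity fails for this set
```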
 
  • #3
This is easily done by induction (over the number of arguments) from gcd(a_1, a_2) = q_1 a_1 + q_2 a_2.
 
  • #4
voko said:
This is done by induction.

The base case, gcd(a_1, a_2) = q_1 a_1 + q_2 a_2 for some integers q_1, q_2, does indeed hold and is known as Bézout's identity.

Now assume gcd(a_1, ..., a_n) = q_1 a_1 + ... + q_n a_n, and check for gcd(a_1, ..., a_n, a_{n + 1}).

It is straightforward to show that gcd(a_1, ..., a_n, a_{n + 1}) = gcd(gcd(a_1, ..., a_n), a_{n + 1}). Applying Bézout's identity to this pair, gcd(a_1, ..., a_n, a_{n + 1}) = A gcd(a_1, ..., a_n) + B a_{n + 1} = A (q_1 a_1 + ... + q_n a_n) + B a_{n + 1} = (A q_1) a_1 + ... + (A q_n) a_n + B a_{n + 1}, which is exactly the form we are looking for.
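As an illustration of this construction (not part of the original post), here is a minimal Python sketch; `ext_gcd` and `ngcd` are hypothetical names chosen here. It folds the extended Euclidean algorithm over the list, mirroring the inductive step above, and returns the coefficients q_1, ..., q_n:

```python
def ext_gcd(a, b):
    """Extended Euclidean algorithm: return (g, x, y) with g = gcd(a, b) = x*a + y*b."""
    if b == 0:
        return (abs(a), 1 if a >= 0 else -1, 0)
    g, x, y = ext_gcd(b, a % b)
    # g = x*b + y*(a % b) = y*a + (x - (a // b)*y)*b
    return (g, y, x - (a // b) * y)

def ngcd(nums):
    """Return (g, coeffs) with g = gcd(nums) and g == sum(q*a for q, a in zip(coeffs, nums))."""
    g, coeffs = nums[0], [1]
    for a in nums[1:]:
        # Inductive step: gcd(g, a) = A*g + B*a, so scale the old coefficients by A.
        g, A, B = ext_gcd(g, a)
        coeffs = [A * q for q in coeffs] + [B]
    return g, coeffs

g, q = ngcd([6, 10, 15])
print(g, q, 6*q[0] + 10*q[1] + 15*q[2])  # 1 [-14, 7, 1] 1
```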

You shouldn't merely give the answer - next time, try to lead the OP toward the correct solution. :)
 
  • #5
Thanks, I didn't spot that counterexample, so the false statement set me off on the wrong track. Should be fine now.
 

Related to The greatest common divisor of n integers

1. What is the definition of the greatest common divisor (GCD) of n integers?

The greatest common divisor of n integers is the largest positive integer that divides evenly into all of the given integers.

2. How do you calculate the GCD of n integers?

To calculate the GCD of n integers, apply the Euclidean algorithm repeatedly: compute the GCD of the first two integers, then the GCD of that result with the next integer, and so on. For two numbers, the algorithm replaces the larger number with the remainder of dividing it by the smaller one and repeats until the remainder is 0; the last non-zero remainder is the GCD.
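A minimal sketch of that procedure (illustrative Python; `euclid` and `gcd_n` are names chosen here, not standard functions):

```python
from functools import reduce

def euclid(a, b):
    """Two-argument Euclidean algorithm: repeat (a, b) -> (b, a mod b) until the remainder is 0."""
    a, b = abs(a), abs(b)
    while b:
        a, b = b, a % b
    return a

def gcd_n(nums):
    """GCD of n integers: fold the two-argument GCD over the list."""
    return reduce(euclid, nums)

print(gcd_n([24, 36, 60]))  # 12
```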

3. Can the GCD of n integers be negative?

No, the GCD of n integers is always a positive integer. It is the largest positive integer that divides evenly into all of the given integers.

4. What is the relationship between the GCD of n integers and the least common multiple (LCM) of n integers?

For two positive integers, the GCD and LCM are related by the formula GCD(a, b) * LCM(a, b) = a * b (this identity does not extend directly to more than two integers). The GCD and LCM are complementary in the sense that the GCD is the largest factor shared by the integers, while the LCM is the smallest positive integer that is a multiple of all of them.
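A quick check of the two-integer identity, and of the fact that it does not carry over to three integers (illustrative Python using the standard library's math.gcd):

```python
from math import gcd

def lcm(a, b):
    """LCM of two positive integers via gcd(a, b) * lcm(a, b) == a * b."""
    return a * b // gcd(a, b)

a, b = 12, 18
print(gcd(a, b) * lcm(a, b) == a * b)   # True

# The identity does not extend directly to three integers:
nums = [6, 10, 15]
g = gcd(gcd(6, 10), 15)                 # 1
l = lcm(lcm(6, 10), 15)                 # 30
print(g * l == 6 * 10 * 15)             # False
```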

5. Can the GCD of n integers be 0?

Only in the degenerate case where all of the given integers are 0: by the usual convention, gcd(0, ..., 0) = 0, since every integer divides 0 and there is no largest such divisor. If at least one of the integers is non-zero, the GCD is a positive integer.
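For what it's worth, Python's math.gcd follows this convention (a tiny check):

```python
from math import gcd

print(gcd(0, 0))  # 0, the conventional value when every argument is 0
print(gcd(0, 7))  # 7, one non-zero argument makes the GCD positive
```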
