A specific combination problem

In summary, the binomial coefficient is a formula for counting the number of arrangements that realize a particular combination, i.e., when the order doesn't matter.
  • #1
JohnnyGui
Hello,

I have been trying to solve this problem but I can't seem to find a way.

Given are ##n## cards and each card can show one of two values: M or K.

How many possible permutations are there in which there are as many cards with M as there are with K? Assume ##n## is an even number of cards.

Is it possible to derive a formula for this as a function of ##n##? How does one deduce this?
 
  • #3
DrClaude said:
Check out the binomial coefficient.

Thanks for the reply. So my question is basically calculated by:
$$\binom n k = \frac{n!}{(0.5n)! \cdot (0.5n)!}$$
Since I'm asking for an equal number of cards with M as with K, that means ##k = 0.5n##. Is this formula therefore correct?
 
  • #4
JohnnyGui said:
Thanks for the reply. So my question is basically calculated by:
$$\binom n k = \frac{n!}{(0.5n)! \cdot (0.5n)!}$$
Is this correct?
Correct. You are basically choosing the positions of the n/2 M cards (or K cards) among the n cards.
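As a quick sanity check (an illustrative sketch, not part of the argument above), one can enumerate every M/K sequence for small even ##n## and compare the number of balanced sequences against the formula:

```python
# Brute-force check: enumerate all 2^n sequences of M/K cards, count those
# with equally many M's and K's, and compare with n! / ((n/2)! * (n/2)!).
from itertools import product
from math import comb, factorial

def count_balanced(n):
    """Number of length-n M/K sequences containing exactly n/2 M's."""
    return sum(1 for seq in product("MK", repeat=n) if seq.count("M") == n // 2)

for n in (2, 4, 6, 8):
    by_formula = factorial(n) // (factorial(n // 2) ** 2)
    assert count_balanced(n) == by_formula == comb(n, n // 2)
```

For ##n = 4## both give 6, and for ##n = 6## both give 20.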
 
  • #5
DrClaude said:
Correct. You are basically choosing the positions of the n/2 M cards (or K cards) among the n cards.

This might be a bit too much to ask, but is there a simple way to explain how this formula is derived? I do understand that the total number of permutations is ##2^n##, and I know how to calculate the number of possible permutations when asking for a very specific sequence/order.
But when you ask about a particular combination in which the order doesn't matter, there is a chance of counting certain permutations multiple times when doing this manually. I know I need to subtract the permutations I have already covered, but it's a lot of work and I can't seem to recognize a pattern to put into a formula.
 
  • #6
JohnnyGui said:
I need to subtract these possible permutations that I have covered before

There can be combinatorial problems where such a subtraction is necessary, but typically the many-permutations-are-the-same-combination aspect is handled by division or multiplication. The general concept is that 1 combination of objects can be used to generate F permutations of the objects. (So if we are interested in counting combinations, we can count permutations and then divide that answer by F.)

How many possible permutations are there in which there are as many cards with M as there are with K?

Since you used the word "permutation" it indicates that you wish to consider n cards that are distinguishable. However, you didn't indicate any way to distinguish one "M" from another "M".

The concept of several literally "indistinguishable" things is paradoxical. For example, if we had two balls that were literally indistinguishable, then we wouldn't have two balls; we would have only one ball. They would be the same ball. If we want to reason about several "indistinguishable" things, we have to keep in mind that they are indistinguishable with respect to some properties but distinguishable with respect to others.

Suppose we have 4 cards, C1,C2,C3,C4, and a set of 4 (distinct) labels: M1,M2,K1,K2. How many "ways" can the 4 labels be assigned to the 4 cards? We have 4 choices of a label for card C1, 3 choices of a label for card C2, etc. So we have a total of (4)(3)(2)(1) = 24 "ways".

Suppose we wish to consider M1 and M2 "indistinguishable" (with respect to their having the same "M-ness").
Then a "way" like
C1=M1, C2=M2, C3=K1, C4=K2
is no longer distinct from a "way" like
C1=M2, C2=M1, C3=K1, C4=K2

Losing the distinction between M1 and M2, we would describe another kind of "way" by:
C1=M, C2=M, C3=K1, C4=K2
Each "way" of this kind can be realized in 2 "ways" of the previous kind. So if we are interested in counting the number of "ways" of this kind we can use:
(number of "ways" of this kind)(2) = (number of "ways" of the previous kind) = 24
(number of "ways" of this kind) = 24/2 = (number of "ways" with M1 not distinguished from M2)

Similarly, if we wish to also lose the distinction between K1 and K2 we can use

(number of "ways" with K1 not distinguished from K2 and M1 not distinguished from M2)(2) = (number of ways with M1 not distinguished from M2)

(number of "ways" with K1 not distinguished from K2 and M1 not distinguished from M2) = (24/2)/2 = 24/4 = 6.
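For what it's worth, this collapsing of labeled "ways" can be checked by direct enumeration (a sketch using the illustrative label names above):

```python
# Enumerate all 24 assignments of the labels M1, M2, K1, K2 to cards C1..C4,
# then erase the subscripts and count the distinct remaining patterns.
from itertools import permutations

labels = ["M1", "M2", "K1", "K2"]
all_ways = set(permutations(labels))                           # 24 labeled "ways"
patterns = {tuple(lbl[0] for lbl in way) for way in all_ways}  # drop subscripts

assert len(all_ways) == 24
assert len(patterns) == 24 // (2 * 2)  # divide by 2! for the M's, 2! for the K's
```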
 
  • #7
JohnnyGui said:
This might be a bit too much to ask, but is there a simple way to explain how this formula is derived? I do understand that the total number of permutations is ##2^n##, and I know how to calculate the number of possible permutations when asking for a very specific sequence/order.
But when you ask about a particular combination in which the order doesn't matter, there is a chance of counting certain permutations multiple times when doing this manually. I know I need to subtract the permutations I have already covered, but it's a lot of work and I can't seem to recognize a pattern to put into a formula.

Let ##n=2k##. We need to count how many ways we can arrange ##k## M's and ##k## K's. Note that each permutation is defined by the position of the M's. Now, imagine each position is numbered from 1 to ##n##. Each permutation is defined by a set of ##k## numbers, representing the positions of all the M's. And, the number of ways we can choose ##k## numbers from ##n## numbers is ##\binom{n}{k}##.
 
  • #8
If some visual is needed / helpful, working through Pascal's Triangle could be instructive. (This problem is equivalent to looking at the middle entry of row ##n## of the triangle.)
 
  • #9
JohnnyGui said:
This might be a bit too much to ask, but is there a simple way to explain how this formula is derived?
Consider that there is a total of ##n## cards. You want to pick the position of ##m## of these cards (say, the ##n/2## M cards). There are ##n## positions where you can put the first card, then ##n-1## positions for the second card, ##n-2## for the third card, and so on. So the total number of arrangements of these ##m## cards in the ##n## positions is
$$
n \times (n-1) \times (n-2) \times \cdots \times (n-m+1)
$$
which can be written succinctly as
$$
\frac{n!}{(n-m)!}
$$
This will give you the number of permutations of ##m## in ##n##.

But in your case, you don't care if it was the first card in position 3 and the second card in position 10, or the first card in position 10 and the second card in position 3, and so on for all the cards. So you are overcounting the number of distinct outcomes. The amount of overcounting is exactly the number of ways you can shuffle the ##m## cards among the ##m## positions you've picked. The number of arrangements of the ##m## cards is thus ##m \times (m-1) \times \cdots \times 1 = m!##. Hence, the total number of combinations of ##m## in ##n## is
$$
\frac{n!}{(n-m)! m!}
$$
which is the binomial coefficient.
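The two steps of this derivation can be checked numerically (my own sketch, using Python's standard library):

```python
# Step 1: the falling factorial n*(n-1)*...*(n-m+1) equals n!/(n-m)!.
# Step 2: dividing by m! removes the overcounting and gives comb(n, m).
from math import comb, factorial

def falling(n, m):
    """Ordered placements of m cards into n positions."""
    result = 1
    for i in range(m):
        result *= n - i
    return result

n, m = 10, 5
assert falling(n, m) == factorial(n) // factorial(n - m)  # permutations of m in n
assert falling(n, m) // factorial(m) == comb(n, m)        # combinations of m in n
```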
 
  • #10
Stephen Tashi said:
There can be combinatorial problems where such a subtraction is necessary, but typically the many-permutations-are-the-same-combination aspect is handled by division or multiplication. The general concept is that 1 combination of objects can be used to generate F permutations of the objects. (So if we are interested in counting combinations, we can count permutations and then divide that answer by F.)

Since you used the word "permutation" it indicates that you wish to consider n cards that are distinguishable. However, you didn't indicate any way to distinguish one "M" from another "M".

The concept of several literally "indistinguishable" things is paradoxical. For example, if we had two balls that were literally indistinguishable, then we wouldn't have two balls; we would have only one ball. They would be the same ball. If we want to reason about several "indistinguishable" things, we have to keep in mind that they are indistinguishable with respect to some properties but distinguishable with respect to others.

Suppose we have 4 cards, C1,C2,C3,C4, and a set of 4 (distinct) labels: M1,M2,K1,K2. How many "ways" can the 4 labels be assigned to the 4 cards? We have 4 choices of a label for card C1, 3 choices of a label for card C2, etc. So we have a total of (4)(3)(2)(1) = 24 "ways".

Suppose we wish to consider M1 and M2 "indistinguishable" (with respect to their having the same "M-ness").
Then a "way" like
C1=M1, C2=M2, C3=K1, C4=K2
is no longer distinct from a "way" like
C1=M2, C2=M1, C3=K1, C4=K2

Losing the distinction between M1 and M2, we would describe another kind of "way" by:
C1=M, C2=M, C3=K1, C4=K2
Each "way" of this kind can be realized in 2 "ways" of the previous kind. So if we are interested in counting the number of "ways" of this kind we can use:
(number of "ways" of this kind)(2) = (number of "ways" of the previous kind) = 24
(number of "ways" of this kind) = 24/2 = (number of "ways" with M1 not distinguished from M2)

Similarly, if we wish to also lose the distinction between K1 and K2 we can use

(number of "ways" with K1 not distinguished from K2 and M1 not distinguished from M2)(2) = (number of ways with M1 not distinguished from M2)

(number of "ways" with K1 not distinguished from K2 and M1 not distinguished from M2) = (24/2)/2 = 24/4 = 6.

Thanks for the explanation. I see that in the case of 4 cards with 2 K's and 2 M's, one would have to divide by 4 to treat M1 the same as M2 and K1 the same as K2. How does this number of copies increase with more cards? For example, for 6 cards with M1, M2, M3 and K1, K2, K3, there are 720 possible permutations if the K's and M's are distinguished. Apparently I have to divide that by 36 to get the 20 possible permutations of 3 K's and 3 M's if they are not distinguished, but I can't seem to extrapolate the division by 4 in the 4-card system to the division by 36 in the 6-card system. Perhaps I'm missing something very obvious here.

@DrClaude : Reading your post at the moment.
@StoneTemplePython : Thanks, I'll check that out
 
  • #11
DrClaude said:
Consider that there is a total of ##n## cards. You want to pick the position of ##m## of these cards (say, the ##n/2## M cards). There are ##n## positions where you can put the first card, then ##n-1## positions for the second card, ##n-2## for the third card, and so on. So the total number of arrangements of these ##m## cards in the ##n## positions is
$$
n \times (n-1) \times (n-2) \times \cdots \times (n-m+1)
$$
which can be written succinctly as
$$
\frac{n!}{(n-m)!}
$$
This will give you the number of permutations of ##m## in ##n##.

But in your case, you don't care if it was the first card in position 3 and the second card in position 10, or the first card in position 10 and the second card in position 3, and so on for all the cards. So you are overcounting the number of distinct outcomes. The amount of overcounting is exactly the number of ways you can shuffle the ##m## cards among the ##m## positions you've picked. The number of arrangements of the ##m## cards is thus ##m \times (m-1) \times \cdots \times 1 = m!##. Hence, the total number of combinations of ##m## in ##n## is
$$
\frac{n!}{(n-m)! m!}
$$
which is the binomial coefficient.

Thanks for the clarification. Can I say that the formula ##\frac{n!}{(n-m)!}## is only used if I do care about which M card is in which position (3 or 10 like you said) but it doesn't matter for me which of the K cards is in which position?
 
  • #12
JohnnyGui said:
Thanks for the explanation. I see that in the case of 4 cards with 2 K's and 2 M's, one would have to divide by 4 to treat M1 the same as M2 and K1 the same as K2. How does this number of copies increase with more cards? For example, for 6 cards with M1, M2, M3 and K1, K2, K3, there are 720 possible permutations if the K's and M's are distinguished. Apparently I have to divide that by 36 to get the 20 possible permutations of 3 K's and 3 M's if they are not distinguished, but I can't seem to extrapolate the division by 4 in the 4-card system to the division by 36 in the 6-card system. Perhaps I'm missing something very obvious

Think about having four cards. An M and three K's, labelled ##K_1, K_2,K_3##.

You have ##4!## permutations. However, if you consider the K's are all the same, then you have only ##4## permutations - each defined by the position of the M.

This is because with three K's you have ##3!## ways to arrange them, so each new permutation corresponds to ##6## original permutations.

You can extend this argument to the case there are three M's as well. You start with ##6!## permutations where all the cards are different. First you consider all the K's to be the same, reducing the total by a factor of 6. Then you consider all the M's to be the same, reducing by another factor of 6.

In general, with ##n## cards, ##k## of which are K's and ##m = n -k## of which are M's, we have ##n!## total permutations and ##\frac{n!}{k!(n-k)!}## permutations if we consider all the K's and all the M's the same.

This ties into the previous arguments to get the binomial coefficient again.
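The 6-card numbers in question can be verified directly (an illustrative check):

```python
# 6 cards, 3 K's and 3 M's: 6! = 720 labeled permutations; each unlabeled
# pattern corresponds to 3! * 3! = 36 relabelings, leaving 720/36 = 20 patterns.
from math import comb, factorial

n, k = 6, 3
total = factorial(n)                            # all cards distinguishable
per_pattern = factorial(k) * factorial(n - k)   # relabelings of the same pattern
assert total == 720
assert per_pattern == 36
assert total // per_pattern == comb(n, k) == 20
```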
 
  • #13
JohnnyGui said:
Thanks for the clarification. Can I say that the formula ##\frac{n!}{(n-m)!}## is only used if I do care about which M card is in which position (3 or 10 like you said) but it doesn't matter for me which of the K cards is in which position?
Yes. You've effectively divided the total permutations by ##k! = (n-m)!## to ignore the permutations of the K's in each permutation.
 
  • #14
PeroK said:
Yes. You've effectively divided the total permutations by ##k! = (n-m)!## to ignore the permutations of the K's in each permutation.

Thanks a lot for explaining and verifying this. I'm happy to say I was able to derive the formula myself before reading your explanation in post #12.

I noticed that if I want half of the cards ##M## and half of the cards ##K## of a total of 4 cards, that for each order, for example ##MMKK##, there are 3 more sequences of that order that I don't care about. In this case ##M_1M_2K_1K_2##, ##M_2M_1K_1K_2## and ##M_2M_1K_2K_1## for example.

The number of sequences that I don't care about for each particular order with ##n## cards can be calculated simply by looking at how many remaining possibilities there are for each card. In the case of 4 cards, it is ##2 \cdot 1 \cdot 2 \cdot 1 = 4## sequences for each order: the first card being either ##M_1## or ##M_2##, the second card must then be the remaining ##M## card, the third card being either ##K_1## or ##K_2##, and the fourth must be the remaining ##K## card.
This can be rewritten as ##(0.5n)! \cdot (0.5n)!## if I want equal numbers of ##M## and ##K## cards. If I want a different number of M cards than K cards, then I can extrapolate this reasoning to the fact that there are ##k! \cdot (n-k)!## possibilities for each order.

There are ##n!## orders, each order having ##k! \cdot (n-k)!## sequences. Since I don't care about these extra possible sequences for each order (I only want 1 of each order), I simply have to divide the possible amount of orders (##n!##) by the possible amount of sequences per order. Thus
$$\frac{n!}{k!\cdot(n-k)!}$$
 
  • #15
PeroK said:
Yes. You've effectively divided the total permutations by ##k! = (n-m)!## to ignore the permutations of the K's in each permutation.

There is something I have been wondering about. If the chance that I would throw either a K or an M card is 50%, can I deduce that the chance that I'd throw an equal amount of K's as M's in a trial of ##n## throws is equal to the following:
$$\frac{n!}{(0.5n)! \cdot (0.5n)!} \cdot \frac{1}{2^n}$$
??
 
  • #16
JohnnyGui said:
There is something I have been wondering about. If the chance that I would throw either a K or an M card is 50%, can I deduce that the chance that I'd throw an equal amount of K's as M's in a trial of ##n## throws is equal to the following:
$$\frac{n!}{(0.5n)! \cdot (0.5n)!} \cdot \frac{1}{2^n}$$
??
Yes. If you think about the ways to get ##k## K's out of ##n## cards, that is ##\binom{n}{k}##, for any ##k## from ##0## to ##n##.

And the sum of these gives the total number of permutations:

##\sum_{k=0}^{n} \binom{n}{k} = 2^n##

As each of the ##2^n## permutations is equally likely, the probabilities follow.
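Both identities can be confirmed with a short computation (an illustrative sketch):

```python
# Check: summing comb(n, k) over k = 0..n gives 2^n, and the central term
# divided by 2^n is the probability of an exactly even split.
from math import comb
from fractions import Fraction

for n in (2, 4, 10, 20):
    assert sum(comb(n, k) for k in range(n + 1)) == 2 ** n

# Probability of an even split for n = 4: comb(4, 2) / 2^4 = 6/16 = 3/8.
p_equal = Fraction(comb(4, 2), 2 ** 4)
assert p_equal == Fraction(3, 8)
```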
 
  • #17
PeroK said:
Yes. If you think about the ways to get ##k## K's out of ##n## cards, that is ##\binom{n}{k}##, for any ##k## from ##0## to ##n##.

And the sum of these gives the total number of permutations:

##\sum_{k=0}^{n} \binom{n}{k} = 2^n##

As each of the ##2^n## permutations is equally likely, the probabilities follow.

Great, thanks! I noticed that when I want to write out ##\binom{n}{k}## as ##\frac{n!}{k!\cdot (n-k)!}## in the summation ##\sum_{k=0}^{n} \binom{n}{k} = 2^n##, I'd have to do it like the following;
$$2 + \sum_{k=1}^{n-1} \frac{n!}{k!\cdot (n-k)!} = 2^n$$
This is because, in this case, you cannot start the summation at ##k=0## or end it at ##k=n##, since that would make you divide by 0 in the summation. This leaves 2 remaining possibilities (all ##M##'s or all ##K##'s) that are not included in the summation, so you'd have to add 2 next to it. Right?
 
  • #18
##0!=1## by definition.
 
  • #19
PeroK said:
##0!=1## by definition.

Ah, I see they covered that problem.

There's one last thing. I realized that as the number of throws ##n## increases, the chance according to ##\frac{n!}{(0.5n)! \cdot (0.5n)!} \cdot \frac{1}{2^n}## of throwing equal numbers of ##M##'s and ##K##'s decreases remarkably (each ##M## or ##K## having a 50% chance). I understand that this is because there are many other possible combinations with different numbers of ##M##'s and ##K##'s as ##n## increases.

However, even though the chance of throwing equal numbers of ##M##'s and ##K##'s decreases at higher ##n##, isn't that chance still the largest compared to the chance of any other particular combination with different numbers of ##M##'s and ##K##'s?
 
  • #20
JohnnyGui said:
Ah, I see they covered that problem.

There's one last thing. I realized that as the number of throws ##n## increases, the chance according to ##\frac{n!}{(0.5n)! \cdot (0.5n)!} \cdot \frac{1}{2^n}## of throwing equal numbers of ##M##'s and ##K##'s decreases remarkably (each ##M## or ##K## having a 50% chance). I understand that this is because there are many other possible combinations with different numbers of ##M##'s and ##K##'s as ##n## increases.

However, even though the chance of throwing equal numbers of ##M##'s and ##K##'s decreases at higher ##n##, isn't that chance still the largest compared to the chance of any other particular combination with different numbers of ##M##'s and ##K##'s?

Yes. From Pascal's triangle you can see the binomial coefficients are always largest in the middle. For even ##n## this is a single maximum value at ##n/2##.

But, as you have observed, the probability of getting exactly ##n/2## reduces as ##n## increases, as the binomial distribution spreads out.
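Both effects can be seen in a few lines (an illustrative check, not from the posts above):

```python
# The probability of exactly n/2 heads is the single largest term for even n,
# yet that central probability shrinks as n grows.
from math import comb

def p_exact(n, k):
    """Probability of exactly k heads in n fair tosses."""
    return comb(n, k) / 2 ** n

for n in (10, 100, 1000):
    center = p_exact(n, n // 2)
    assert all(p_exact(n, k) <= center for k in range(n + 1))  # largest term

# ...but the central probability itself decreases with n.
assert p_exact(10, 5) > p_exact(100, 50) > p_exact(1000, 500)
```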
 
  • #21
PeroK said:
Yes. From Pascal's triangle you can see the binomial coefficients are always largest in the middle. For even ##n## this is a single maximum value at ##n/2##.

But, as you have observed, the probability of getting exactly ##n/2## reduces as ##n## increases, as the binomial distribution spreads out.

What surprises me is that as ##n## increases, the ratio of the chance of throwing equal numbers of K's and M's to the summed chance of throwing any other numbers of K's and M's declines. This would mean that you'd have an increasingly lower chance of correctly deducing that throwing either an M or a K is 50% each as your number of throws ##n## increases.
Shouldn't the binomial distribution represent the individual chances of M and K better as ##n## increases? That is, shouldn't the "hill" at ##n/2## be not only higher, but also slimmer?
 
  • #22
JohnnyGui said:
What surprises me is that as ##n## increases, the ratio of the chance of throwing equal numbers of K's and M's to the summed chance of throwing any other numbers of K's and M's declines. This would mean that you'd have an increasingly lower chance of correctly deducing that throwing either an M or a K is 50% each as your number of throws ##n## increases.
Shouldn't the binomial distribution represent the individual chances of M and K better as ##n## increases? That is, shouldn't the "hill" at ##n/2## be not only higher, but also slimmer?

That's more or less the common misconception about the "law of averages". If you toss a coin 1,000 times, it's unlikely you will get exactly 500 heads and 500 tails. But, it's likely that you will get 49%-51% heads, and almost certain that you will get 45%-55% heads.

Compare this with 10 tosses, where there is a substantial chance (roughly 38%) of getting 40% heads or fewer.

The distribution gets slimmer relative to the number of tosses, but wider in absolute terms.
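The "slimmer relative, wider absolute" point can be illustrated numerically (my own sketch; the particular bands are just examples):

```python
# The chance of landing within a fixed *percentage* band around 50% heads grows
# with n, while the *absolute* spread (std. dev. = sqrt(n)/2) keeps widening.
from math import ceil, comb, floor, sqrt

def p_band(n, lo_frac, hi_frac):
    """P(lo_frac <= heads/n <= hi_frac) for n fair coin tosses."""
    lo = ceil(lo_frac * n - 1e-9)   # small guard against float rounding
    hi = floor(hi_frac * n + 1e-9)
    return sum(comb(n, k) for k in range(lo, hi + 1)) / 2 ** n

# Relative band 45%-55%: probability increases toward 1 as n grows.
assert p_band(100, 0.45, 0.55) < p_band(1000, 0.45, 0.55) < p_band(10000, 0.45, 0.55)

# Absolute spread in the number of heads still grows like sqrt(n)/2.
assert sqrt(100) / 2 < sqrt(1000) / 2 < sqrt(10000) / 2
```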
 
  • #23
PeroK said:
That's more or less the common misconception about the "law of averages". If you toss a coin 1,000 times, it's unlikely you will get exactly 500 heads and 500 tails. But, it's likely that you will get 49%-51% heads, and almost certain that you will get 45%-55% heads.

Compare this with 10 tosses, where there is a substantial chance (roughly 38%) of getting 40% heads or fewer.

The distribution gets slimmer relative to the number of tosses, but wider in absolute terms.

Thanks. So can I say that the chance to throw between 49%-51% heads increases as ##n## increases? Putting it in mathematical terms:
$$\sum_{k=0.49n}^{0.51n} \frac{n!}{k!\cdot (n-k)!} \cdot \frac{1}{2^n}$$
The value coming out of this formula will approach 1 as ##n## increases?
 
  • #24
JohnnyGui said:
Thanks. So can I say that the chance to throw between 49%-51% heads increases as ##n## increases? Putting it in mathematical terms:
$$\sum_{k=0.49n}^{0.51n} \frac{n!}{k!\cdot (n-k)!} \cdot \frac{1}{2^n}$$
The value coming out of this formula will approach 1 as ##n## increases?

Yes. There are a few ways to get at this. One particularly nice way for your problem: suppose heads has a payoff of ##+1## and tails has a payoff of ##-1##. The expected payoff per toss is zero. The expected variance per toss is ##1##. Because tosses are independent, variance for ##n## tosses ##= n * 1 = n##. That means the std deviation for n tosses grows with ##\sqrt{n}## -- this is how wide the normal distribution will get as n grows large.

your series is:

##\sum_{k=a}^{b} \frac{n!}{k!\cdot (n-k)!} \cdot \frac{1}{2^n}##

where ##a = 0.49n## and ##b = 0.51n##.
- - - -
edit:
to make this sensible, I need to first translate this from the ##\{0,1\}## case you were using (mean 0.5) to the ##\{-1,1\}## zero-mean case. Your ##a## and ##b## referred to having 49 and 51 percent heads (and vice versa for tails) -- in this adjusted payoff setup, ##a## goes from ##0.49 = 0.49(1) + 0.51(0)## to ##0.49(1) + 0.51(-1) = -0.02##

so I should say ##a = -0.02n## and ##b = 0.02n##
and you are interested in

##P(a \leq X \leq b)##
or if using the cumulative distribution function

##Pr(X \leq b) - Pr(X \leq a)##

or something along those lines.
- - - -
A clever finish would be to rescale so that you have a standard normal random variable -- i.e., divide by ##\sqrt{n}## so that it has expected value of zero (still) and variance of one. If you do the adjustment on ##a## and ##b##, you see that you have ##a' = -0.02n \frac{1}{\sqrt{n}} = -0.02\sqrt{n}## and ##b' = 0.02\sqrt{n}## --
(edited to line up with the above)

that is, they grow arbitrarily large as ##n## grows; hence even events that are 10 sigmas out on your bell curve are well within the bounds of ##[a', b']## for large enough ##n##.

There are some other ways to get at this with the strong and weak laws of large numbers, but since we're talking about zero-mean coin tossing, drawing out the implications of a rescaled bell curve is quite nice in my view.
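The rescaling argument can be sketched with the standard normal CDF (an illustrative check; the function names and the 0.02 band width are just the example values from above):

```python
# Normal-approximation check: the rescaled band edge b' = 0.02*sqrt(n) grows
# without bound, so the standard-normal mass inside [-b', b'] tends to 1.
from math import erf, sqrt

def phi(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def mass_in_band(n, eps=0.02):
    """Approximate P(-eps*n <= payoff sum <= eps*n) for n +/-1 tosses."""
    b = eps * n / sqrt(n)          # rescaled edge b' = eps * sqrt(n)
    return phi(b) - phi(-b)

assert mass_in_band(100) < mass_in_band(10_000) < mass_in_band(1_000_000)
assert mass_in_band(1_000_000) > 0.999999   # the band edge is 20 sigmas out
```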
 
  • #25
StoneTemplePython said:
Yes. There are a few ways to get at this. One particularly nice way for your problem: suppose heads has a payoff of ##+1## and tails has a payoff of ##-1##. The expected payoff per toss is zero. The expected variance per toss is ##1##. Because tosses are independent, variance for ##n## tosses ##= n * 1 = n##. That means the std deviation for n tosses grows with ##\sqrt{n}## -- this is how wide the normal distribution will get as n grows large.

your series is:

##\sum_{k=a}^{b} \frac{n!}{k!\cdot (n-k)!} \cdot \frac{1}{2^n}##

where ##a = 0.49n## and ##b = 0.51n##. A clever finish would be to rescale such that you have a standard normal random variable -- i.e. divide by ##\sqrt{n}## so that it has expected value of zero (still) and variance of one. If you do the adjustment on ##a## and ##b##, you see that you have

##a' = 0.49n \frac{1}{\sqrt{n}} = 0.49 \sqrt{n}## and ##b' = 0.51 \sqrt{n}## -- that is, they grow arbitrarily large as ##n## grows hence even events that are 10 sigmas out on your bell curve are well within the bounds of ##[a', b']## for large enough n.

There are some other ways to get at this with the strong and weak laws of large numbers, but since we're talking about zero-mean coin tossing, drawing out the implications of a rescaled bell curve is quite nice in my view.

Thanks a lot, I think I get what you mean with this. You're basically transforming the distribution so that you can show that the standard deviation would always stay within the boundaries of ##a## and ##b## if I understand correctly.

I've got one question. I've tried to test this summation using summation calculators on several websites, but they all give answers that deviate from 1. This is what I'm trying to calculate:
$$\lim_{n \rightarrow \infty} \sum_{k=a \cdot n}^{b\cdot n} \frac{n!}{k!\cdot (n-k)!} \cdot \frac{1}{2^n}$$
Here, ##a## and ##b## are any fractions of ##n## equally far from the mean (##0.5n##), where ##b > a##. Shouldn't this limit give the answer ##1## regardless of how wide or narrow the range ##a##-##b## is?
 
  • #26
JohnnyGui said:
Thanks a lot, I think I get what you mean with this. You're basically transforming the distribution so that you can show that the standard deviation would always stay within the boundaries of ##a## and ##b## if I understand correctly.

I've got one question. I've tried to test this summation using summation calculators on several websites, but they all give answers that deviate from 1. This is what I'm trying to calculate:
$$\lim_{n \rightarrow \infty} \sum_{k=a \cdot n}^{b\cdot n} \frac{n!}{k!\cdot (n-k)!} \cdot \frac{1}{2^n}$$
Here, ##a## and ##b## are any fractions of ##n## equally far from the mean (##0.5n##), where ##b > a##. Shouldn't this limit give the answer ##1## regardless of how wide or narrow the range ##a##-##b## is?

Or the flip side -- I centered and normalized so that you always have zero mean and 1 std deviation on the normal random variable. Note: the process of converting it to zero mean changes the interpretation of a and b a little bit so I went back and edited my post -- a belated thought on my end.

so long as ##a## is less than the mean and ##b## greater than the mean, and you scale them both by ##n##, then in principle it should tend to one. The series you are evaluating doesn't have a direct closed form, though, so I have no idea how it's being evaluated by whatever computer algebra system you used. Also, the form you are using isn't zero-mean, which could create some challenges.

There are basically two ways to evaluate it: 1.) think more abstractly and learn your limit laws in probability. 2.) if you code: run a bunch of simulation trials, where each one has you tossing a very large number ##n## of times -- the fraction of trials whose share of heads lands in that range should be awfully close to one.
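Option 2 can be sketched in a few lines of Python (the function name and parameters here are my own illustration):

```python
# Simulation: toss a fair coin n times per trial (using an n-bit random integer
# and counting its 1-bits) and estimate how often the heads fraction lands in
# [0.49, 0.51]. For large n, nearly every trial should land in the band.
import random

def fraction_in_band(trials, n, lo=0.49, hi=0.51, seed=0):
    """Fraction of trials whose heads count falls within [lo*n, hi*n]."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        heads = bin(rng.getrandbits(n)).count("1")  # n fair tosses at once
        if lo * n <= heads <= hi * n:
            hits += 1
    return hits / trials

estimate = fraction_in_band(trials=200, n=100_000)
assert estimate > 0.99   # the band edge is ~6 standard deviations out
```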
 
  • #27
StoneTemplePython said:
Or the flip side -- I centered and normalized so that you always have zero mean and 1 std deviation on the normal random variable. Note: the process of converting it to zero mean changes the interpretation of a and b a little bit so I went back and edited my post -- a belated thought on my end.

so long as ##a## is less than the mean and ##b## greater than the mean, and you scale them both by ##n##, then in principle it should tend to one. The series you are evaluating doesn't have a direct closed form, though, so I have no idea how it's being evaluated by whatever computer algebra system you used. Also, the form you are using isn't zero-mean, which could create some challenges.

There are basically two ways to evaluate it: 1.) think more abstractly and learn your limit laws in probability. 2.) if you code: run a bunch of simulation trials, where each one has you tossing a very large number ##n## of times -- the fraction of trials whose share of heads lands in that range should be awfully close to one.

This got me wondering how one actually deduces the individual chance of, for example, heads or tails being 50% each. Apparently, it is not based on the chance that one would throw an exact equal amount of heads as tails with increasing ##n##, since that chance actually declines with ##n##.
So are the individual chances of a head or tail determined by the mean of a range that reaches a chance of 1 with increasing ##n##?
 
  • #28
Scratch what I asked above; I was confused for a bit. I was thinking about the ratio of the number of outcomes in a very narrow percentage range (49%-51%, for example) to the number of outcomes in a wider percentage range (40%-60%) for a particular value of ##n##. Is it correct that the following fraction:
$$\frac{\sum_{k=0.49n}^{0.51n} \frac{n!}{k!\cdot (n-k)!}}{\sum_{k=0.4n}^{0.6n} \frac{n!}{k!\cdot (n-k)!}}$$
would also approach 1 as ##n## increases? That would mean that with increasing ##n##, the outcomes within the same narrow range contribute an ever larger share of the outcomes within the wider percentage range.
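This ratio can be evaluated directly for a few values of ##n## (an illustrative check):

```python
# Ratio of binomial mass in the narrow 49%-51% band to the mass in the wider
# 40%-60% band, for increasing n: it climbs toward 1.
from math import ceil, comb, floor

def band_sum(n, lo, hi):
    """Sum of comb(n, k) for lo*n <= k <= hi*n (float-safe bounds)."""
    first = ceil(lo * n - 1e-9)
    last = floor(hi * n + 1e-9)
    return sum(comb(n, k) for k in range(first, last + 1))

ratios = [band_sum(n, 0.49, 0.51) / band_sum(n, 0.40, 0.60)
          for n in (100, 1_000, 10_000)]
assert ratios[0] < ratios[1] < ratios[2]   # increasing with n
assert ratios[2] > 0.9                     # already close to 1 by n = 10,000
```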
 
  • #29
@StoneTemplePython :

There is something else I was quite surprised by.

When you do a large number of trials, say ##T##, each trial consisting of flipping a coin ##n## times where ##n## is relatively low (like 10), the most frequent trial outcome would be an equal number of heads and tails among its ##n## throws, which is what I expect. From this, you would deduce that the chances for heads and tails are each 50%. This whole method can be considered as 1 large trial consisting of flipping the coin ##T \cdot n## times.

However, something peculiar arises when the chance for heads differs from that for tails, for example 0.3 for heads and 0.7 for tails. When you do many trials, each with the same relatively low ##n## throws, then no matter how many trials you do, the most frequent trial would have 27.93% heads among its ##n## throws. You would therefore erroneously deduce that the chance for heads is 27.93% instead of 30%. The part I don't understand is that when you consider this whole method as 1 trial consisting of ##T \cdot n## throws, the number of heads among all those throws together would indeed be nearly 30%.

I do understand this mathematically, but I can't seem to put it in terms of logical thinking. Why can you accurately deduce the individual chance of 2 outcomes by doing many trials, each consisting of a low number ##n##, only if the chance of each outcome is 50%? While when they differ in chance, doing many trials, each with the same low ##n##, would show a less accurate individual chance?
 
  • #30
JohnnyGui said:
I do understand this mathematically, but I can't seem to put it in terms of logical thinking. Why can you accurately deduce the individual chance of 2 outcomes by doing many trials, each consisting of a low number ##n##, only if the chance of each outcome is 50%? While when they differ in chance, doing many trials, each with the same low ##n##, would show a less accurate individual chance?

This is pretty far astray from the original post of this thread: your original question was asked and answered, and some follow-ups were also asked and answered. Now you have a question about inference, which requires a new thread, at a minimum. Your line of thinking here doesn't make sense to me. With a large enough number of tosses, we should be able to estimate the probability of heads to any (in the real world, reasonable) degree of precision. There are a lot of different approaches and ways to frame the inference problem.

Personally, I think you need to study probability theory first, then revisit these questions in a few months.
 
Last edited:
  • #31
JohnnyGui said:
Hello,

I have been trying to solve this problem but I can't seem to find a way.

Given are ##n## cards and each card can show one of two values: M or K.

How many possible permutations are there in which there are as many cards with M as there are with K? Given that ##n## is an even amount of cards.

Is it possible to derive a formula for this as a function of ##n##? How does one deduce this?

This way makes sense to me (a rephrasing of someone else's answer, I think DrClaude's): assume you need to go from point A to point B along a grid system, where you must go, say, north (M) ##j## times and east (K) ##j## times in order to arrive at B, i.e. ##n = 2j##. How many ways can you make this trip? The ##j## places where you go north (or east) fully determine the rest of the trip.
This gives you a way of counting the paths where M = K. The total number of paths is straightforward.
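This path-counting picture can be checked by brute force for small ##n## (a sketch; the function name is my own): enumerate every length-##n## M/K sequence and compare the number of balanced ones against ##\binom{n}{n/2}##.

```python
from itertools import product
from math import comb

def balanced_sequences(n):
    """Brute force: count length-n M/K sequences with equally many M's and K's."""
    return sum(1 for seq in product("MK", repeat=n) if seq.count("M") == n // 2)

# Verify the brute-force count against the binomial coefficient
for n in (2, 4, 6, 8, 10):
    assert balanced_sequences(n) == comb(n, n // 2)
    print(n, comb(n, n // 2))   # n=2: 2, n=4: 6, n=6: 20, n=8: 70, n=10: 252
```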
 
Last edited:
  • #32
StoneTemplePython said:
This is pretty far astray from the original post of this thread: your original question was asked and answered, and some follow-ups were also asked and answered. Now you have a question about inference, which requires a new thread, at a minimum. Your line of thinking here doesn't make sense to me. With a large enough number of tosses, we should be able to estimate the probability of heads to any (in the real world, reasonable) degree of precision. There are a lot of different approaches and ways to frame the inference problem.

Personally, I think you need to study probability theory first, then revisit these questions in a few months.

I was talking about the case where that large number of tosses is divided into trials of a small number of tosses each, and how one can interpret those small trials to deduce the individual chance.
It was my intention to ask the question in my OP as a base leading up to this question; making a new thread for every question I have about this would seem inefficient to me. I eventually got it figured out, though, so never mind.

WWGD said:
This way makes sense to me (a rephrasing of someone else's answer, I think DrClaude's): assume you need to go from point A to point B along a grid system, where you must go, say, north (M) ##j## times and east (K) ##j## times in order to arrive at B, i.e. ##n = 2j##. How many ways can you make this trip? The ##j## places where you go north (or east) fully determine the rest of the trip.
This gives you a way of counting the paths where M = K. The total number of paths is straightforward.

This is a creative way to think about it. It helped me deduce the same formula again. Thanks!
 

Related to A specific combination problem

1. What is a combination problem?

A combination problem is a mathematical problem that involves selecting a certain number of items from a larger set of items, without regard to the order in which they are selected. The number of possible combinations can be calculated using the formula ##\binom{n}{r} = \frac{n!}{r!\,(n-r)!}##, where ##n## is the total number of items and ##r## is the number of items being selected.

2. How is a combination problem different from a permutation problem?

A combination problem differs from a permutation problem in that the order of selection does not matter in a combination problem, while it does matter in a permutation problem. In a combination problem, the same group of items can be selected in different orders and still be considered the same combination. In a permutation problem, different orders of selection result in different outcomes.

3. What are some real-life applications of combination problems?

Combination problems can be used in a variety of real-life situations, such as selecting a team from a larger pool of players, choosing a combination of toppings on a pizza, or creating a password with a specific number of characters from a set of letters and numbers.

4. How do you solve a combination problem?

To solve a combination problem, you can use the formula ##\binom{n}{r} = \frac{n!}{r!\,(n-r)!}##, where ##n## is the total number of items and ##r## is the number of items being selected. Alternatively, you can use a combination calculator or list out all possible combinations.
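As a sketch, the formula translates directly into code (the team-selection numbers and the helper name `n_choose_r` are assumptions for illustration):

```python
from math import factorial, comb

def n_choose_r(n, r):
    # direct translation of nCr = n! / (r! * (n - r)!)
    return factorial(n) // (factorial(r) * factorial(n - r))

# Example: choosing a 5-player team from 12 candidates
print(n_choose_r(12, 5))                  # 792 possible teams
assert n_choose_r(12, 5) == comb(12, 5)   # matches Python's built-in
```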

5. Can a combination problem have repeated items?

Yes, a combination problem can have repeated items. In this case, the formula for the number of combinations becomes ##\binom{n+r-1}{r} = \frac{(n+r-1)!}{r!\,(n-1)!}##, where ##n## is the total number of items and ##r## is the number of items being selected. This accounts for the repeated items and ensures that all possible combinations are considered.
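The repetition formula can be cross-checked against a direct enumeration (a sketch; the ice-cream example, values, and the name `n_multichoose_r` are assumptions for illustration):

```python
from itertools import combinations_with_replacement
from math import factorial

def n_multichoose_r(n, r):
    # combinations with repetition: (n + r - 1)! / (r! * (n - 1)!)
    return factorial(n + r - 1) // (factorial(r) * factorial(n - 1))

# Example: r = 3 scoops chosen from n = 4 flavours, repeats allowed
n, r = 4, 3
enumerated = sum(1 for _ in combinations_with_replacement(range(n), r))
print(n_multichoose_r(n, r), enumerated)   # both equal 20
```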
