Selection bias with respect to potentially apocalyptic events

  • Thread starter: Patra Neutrino
  • Tags: Bias, Events
In summary, the author suggests that survivor bias is a form of selection bias and that it affects our estimation of probability.
  • #1
Patra Neutrino
Note to moderators: I’m not sure if this topic is mainstream enough for the forum. If you decide to remove it, I’ll understand. Not convinced I've put it in the right section either!

I’m sure you’re all familiar with, or will easily understand, survivor bias, which is a form of selection bias. Imagine you took 3,600 people and put them into individual, isolated cells. Now imagine that you sent a guard into each cell, each carrying two revolvers loaded with five bullets apiece, and you asked each guard to spin each cylinder and fire both revolvers, point blank, at each prisoner’s head. Approximately (1/6)*(1/6) of the prisoners would survive, i.e. about 100.

If you asked each surviving prisoner to estimate the number of bullets per chamber, they would never say five. They would underestimate the number of bullets each gun had been loaded with by virtue of having survived the event.* (Imagine it from your point of view: a guard has just come in and fired two guns at you. You heard two clicks.)
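The setup lends itself to a quick sanity check by simulation. A minimal sketch (the survivor count fluctuates around the expected 100):

```python
import random

# Monte Carlo sketch of the experiment: 3,600 prisoners, two six-chamber
# revolvers each loaded with five bullets, one spin-and-fire per gun.
random.seed(0)  # fixed seed so the run is repeatable
N = 3600

def survives() -> bool:
    # Each cylinder spin lands on the single empty chamber with probability 1/6.
    return random.randrange(6) == 0 and random.randrange(6) == 0

survivors = sum(survives() for _ in range(N))
print(survivors)  # fluctuates around the expected N * (1/6)**2 = 100
```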

I got to thinking about apocalypses, and the probability of, say, a nuclear war. We know that it looked pretty hairy in the 1960s and 1980s, but it didn’t happen. Therefore, it is often inferred, there probably won’t be a nuclear war anytime soon, because lots of people thought there was going to be one before, and there wasn’t.

If we make the initial assumption that nuclear wars always kill 100% of the global population, and further assume the multiverse** model of reality, an analogy can easily be made with the guard/prisoner example above. If human life in most other universes was wiped out in the last century, we’d never know about it, and by the anthropic principle, we would just be in one of the few universes in which there was no nuclear war.

But nuclear wars don’t always kill 100% of the population, so the calculation gets a lot messier. I have a couple of ideas myself***, but that’s enough for now. I posit, however, that we are underestimating the probability of future apocalypses due to survivor bias. Does anyone have any thoughts on the matter?

(I intend this to be a very open thread. I’m not totally convinced of my conclusion, but somewhere in all of this I’m sure there’s an interesting line of discussion regarding probability, the anthropic principle, multiverses, or something like that).

Cheers.

*Any surviving mathematicians could use Bayes’ Theorem to estimate the probability of there having been n bullets in each gun as follows: p(0)=39.6%, p(1)=27.5%, p(2)=17.6%, p(3)= 9.9%, p(4)=4.4%, p(5)=1.1%, p(6)=0%. Working available on request.
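That working can be sketched in a few lines, assuming a uniform prior over 0–6 bullets per gun and a likelihood of ((6−n)/6)² for surviving both shots:

```python
from fractions import Fraction

# Uniform prior over n = 0..6 bullets per gun. Likelihood of surviving
# both shots with n bullets in each six-chamber cylinder: ((6 - n)/6)**2.
likelihood = {n: Fraction(6 - n, 6) ** 2 for n in range(7)}
total = sum(likelihood.values())
posterior = {n: lik / total for n, lik in likelihood.items()}

for n, p in posterior.items():
    # Reproduces the footnote's figures: 39.6%, 27.5%, 17.6%, 9.9%, 4.4%, 1.1%, 0%.
    print(f"p({n}) = {float(p):.1%}")
```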

**I’m not sure it is actually necessary to assume this, but it makes things easier at first. I don't think the multiverse approach should ultimately have any bearing upon standard probability, however.

***I think that the probability of existing in any given universe in the multiverse is proportional to the probability of your having been born in that universe, which in this case is, I suspect, proportional to the surviving population of an apocalyptic event.
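The footnote’s population-weighting idea can be illustrated with a toy calculation (the branch probabilities and survival fractions below are purely illustrative, not estimates):

```python
from fractions import Fraction

# Illustrative numbers only: a crisis kills 90% of the population with
# probability 1/2, and nobody with probability 1/2. Weighting each
# branch by its surviving population (the footnote's proposal), a
# randomly sampled survivor is far more likely to find themselves in
# the no-war branch than the branch probabilities alone suggest.
branches = {
    "war, 10% survive": (Fraction(1, 2), Fraction(1, 10)),
    "no war, all survive": (Fraction(1, 2), Fraction(1, 1)),
}
weights = {name: prob * frac for name, (prob, frac) in branches.items()}
total = sum(weights.values())
for name, w in weights.items():
    print(name, float(w / total))  # 1/11 vs. 10/11
```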
 
  • #2
Thanks for the post! Sorry you aren't generating responses at the moment. Do you have any further information, have you come to any new conclusions, or is it possible to reword the post?
 
  • #3
I don't think invoking a "multiverse" to define probability works. Here are some of the assumptions:
1) That there is a multiverse.
2) That the number of branch universes generated from, say, 1950 to 1990 was finite.
3) That the number of branches formed provides a meaningful denominator for the probability. That is, that all such universes are of equal or knowable weight.

All that said, I estimate the probability of a human-race-ending catastrophe to be 1. It's just a matter of when.

What matters more than an absolute estimation of the probability of a World War 3 within the next century is an estimation of what should be done to minimize this probability.

The current tack is "MAD": Mutually Assured Destruction.
 
  • #4
Thanks for the reply.

Perhaps it isn't necessary to consider multiverses at all. After all, the example with the guards and the prisoners doesn't need them.

If you first assume a nuclear war would end all life, and assume that on exactly two occasions last century there was a 5/6 chance of a nuclear war, do we have an exact analogy with the guards/prisoners? I don't see why not.

If so, the next step should just be a case of making more realistic assumptions and doing some harder maths.
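Under those assumptions, the "harder maths" is the prisoners' calculation in disguise. A sketch, assuming (illustratively) a uniform prior over a six-point grid of per-crisis war probabilities:

```python
from fractions import Fraction

# Illustrative sketch: treat each of the two crises as a trigger pull
# with unknown per-crisis war probability q on the uniform grid
# q = k/6, k = 0..6. Conditional on our having survived both, the
# posterior over q is proportional to (1 - q)**2 -- exactly the
# prisoners' bullet-count posterior.
likelihood = {k: Fraction(6 - k, 6) ** 2 for k in range(7)}
total = sum(likelihood.values())
posterior = {k: lik / total for k, lik in likelihood.items()}

# Survivor-conditioned mean of q: about 0.19, far below the assumed 5/6.
expected_q = sum(Fraction(k, 6) * p for k, p in posterior.items())
print(float(expected_q))
```

The point of the sketch is the gap: conditioning on survival drags the estimate of q far below the 5/6 assumed to be the true value, which is the survivor bias in question.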
 
  • #5
Patra Neutrino said:
"I’m sure you’re all familiar with, or will easily understand, survivor bias, which is a form of selection bias. ... If you asked each surviving prisoner to estimate the number of bullets per chamber, they would never say five."

... usually there is, at most, one bullet per chamber.

Presumably they would see that the guns were six-shooters apiece, so they should estimate at most five bullets per gun.
But we are talking about the illogical estimate - hence the subject.

Certainly, the assessment that the chance of an accident is small only because an accident has not happened would be irrational.

I suspect survivors with enough statistics education to know about Bayes' theorem are unlikely to make their estimates like that ... they would realize they have a sampling bias and proceed appropriately, i.e. assume there were n more people initially who were removed by the experiment. If they were scientists, they would want to gather evidence to support some hypothesis about what just happened.

Stats ed can be like that - I used to think that correlation was causation, but I took a stats course and now I don't think that at all. It is possible that the course influenced my thinking but without further evidence...

It's tricky - I think it is reasonable, in NZ, to expect that I won't be obliterated by an RPG attack on my house ... well, it has not happened, right?
But I don't think RPG attacks are unlikely only because they have not happened - rather, for other reasons associated with the culture here. The absence of RPG attacks on myself or anyone I know merely bears out the conclusion that they are unlikely - providing supporting evidence for the model I used to arrive at that conclusion. Of course, it would be better to try to disprove it ... on second thought, maybe not better...

OTOH: I've known people to conclude that it is OK for them to speed because they are such good drivers - based on the fact they are not dead yet.
 

Related to Selection bias with respect to potentially apocalyptic events

1. What is selection bias?

Selection bias is a type of bias that occurs when the sample or data being analyzed is not representative of the entire population. This can happen when certain groups or individuals are systematically excluded from the sample, leading to inaccurate or skewed results.

2. How does selection bias apply to potentially apocalyptic events?

In the context of potentially apocalyptic events, selection bias can occur when certain groups or regions are more likely to be affected by the event, leading to a skewed understanding of its impact. For example, if a study only looks at data from urban areas, it may underestimate the impact of an apocalyptic event on rural communities.

3. How can selection bias be avoided in studies related to potentially apocalyptic events?

To avoid selection bias, it is important to have a diverse and representative sample that includes a variety of groups and regions that may be affected by the event. Researchers should also carefully consider their sampling methods and strive to collect data from a wide range of sources.

4. What are the consequences of selection bias in understanding potentially apocalyptic events?

Selection bias can lead to inaccurate or incomplete understanding of the impact and potential consequences of a potentially apocalyptic event. This can have serious implications for preparedness and response efforts, as well as for policy decisions related to preventing or mitigating the event.

5. How can selection bias be addressed in the scientific community?

The scientific community can address selection bias by promoting diversity and inclusivity in research and data collection, and by encouraging transparency and open access to data. Peer review and replication studies can also help identify and address potential biases in research related to potentially apocalyptic events.
