
Probability of Having a Disease

LostMinosHeins

New member
Sep 10, 2019
1
According to the CDC, 1 out of 102 persons who are males of a particular demographic has HIV. That means there is a 1/102 = .98 * 100 = 1% chance of having HIV. There is a swab HIV test available that is 91.7% accurate at identifying HIV-positive people. This means that about 11 out of 12 people who are HIV positive will test positive on the test.

So say a person takes this HIV test and they fit into the demographic of the CDC statistic. If 1 in 12 people who are HIV positive tests negative on the test, it seems there is an 8.3% possibility that any person taking the test in general is HIV positive. Is that correct? On its face it doesn't seem to make much sense, because it doesn't even put you below the 1% chance you have HIV if you're a male in the USA, according to the CDC. So does that really mean that being a male in the USA, according to the CDC, makes you less likely to have HIV than being a person who tests negative on the swab test? It might be comparing apples and oranges, because the 1-in-12 number assumes every one of the 12 has HIV, but it would still be useful to compare them.

But using the 1% statistic, and assuming the person is a randomly selected individual of that demographic, it seems (logically) like the probability would be reduced from 1% if someone in that category tested negative on the test. It seems like a compound statistical problem. Is there a way of calculating, from these two statistics, how far below 1% the chance of having HIV now is for a person who has tested negative on the swab test? One statistic looks like it assumes every person being tested is HIV positive, and the other comes from a random sample of all males in the US of a particular demographic, but it would be informative to use them together to derive this. I just don't know if they can be, because I was told that if you use the rule of multiplying probabilities, they have to be of the same group and add up to 100%. I think it depends on whether the probabilities represent independent or dependent events. Which category do these two events fall into? And how can they be manipulated to calculate how likely it is that the person has HIV if they fall into both categories?
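(For concreteness: if it were legitimate to combine the two numbers with Bayes' theorem, and if the test were also 91.7% accurate for people who do not have HIV, I would guess the calculation looks something like [tex]P(\text{HIV}\mid\text{negative test}) = \frac{0.01 \times 0.083}{0.01 \times 0.083 + 0.99 \times 0.917} \approx 0.09\%[/tex], but whether those assumptions are valid is exactly what I'm unsure about.)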
 

HallsofIvy

Well-known member
MHB Math Helper
Jan 29, 2012
1,151
According to the CDC, 1 out of 102 persons who are males of a particular demographic has HIV. That means there is a 1/102 = .98 * 100 = 1% chance of having HIV.
.98 * 100 is 98, not 1%. What you mean is 1/102 = 0.0098, which is 0.98%, which is approximately 1%.
There is a swab HIV test available that is 91.7% accurate at identifying HIV-positive people. This means that about 11 out of 12 people who are HIV positive will test positive on the test.

So say a person takes this HIV test and they fit into the demographic of the CDC statistic. If 1 in 12 people who are HIV positive tests negative on the test, it seems there is an 8.3% possibility that any person taking the test in general is HIV positive. Is that correct?
I'm not sure what this is supposed to mean.
Suppose we have 10000 people. 10000/102 ≈ 98 people have the disease. That is 98/10000 = 0.98%, just as you said above. You haven't yet said anything about testing positive or negative on this test. Saying that the test is "91.7% accurate" means that 91.7% of people who have HIV will test positive and 91.7% of people who do not have it will test negative. (It is more common to have different percentages for these two cases, but this may be a simplification.) Of the 98 people who have the disease, 91.7% of them, 98(0.917) ≈ 90, will test positive, and of the remaining 10000 - 98 = 9902 people, 100 - 91.7 = 8.3% of them, 0.083(9902) ≈ 822, will also test positive. So a total of 90 + 822 = 912 people test positive, of whom 90 actually have the disease. If a person tests positive on the test, the probability they actually have the disease is [tex]\frac{90}{912} \approx 9.9\%[/tex]. That might seem surprisingly low, but it is much higher than the original 0.98%.
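The same counting argument as a short script, offered only as a sketch (the assumption that the test is 91.7% accurate in both directions is the simplification noted above, not a figure taken from any particular test's documentation):

[code]
# Sketch of the counting argument above, under the stated simplification
# that the test is 91.7% accurate both for people who have HIV and for
# people who do not (real tests quote separate sensitivity/specificity).

population = 10_000
prevalence = 1 / 102      # CDC figure quoted in the thread
sensitivity = 0.917       # P(test positive | has HIV)
specificity = 0.917       # P(test negative | no HIV) -- simplification

has_hiv = population * prevalence              # about 98 people
no_hiv = population - has_hiv                  # about 9902 people

true_positives = has_hiv * sensitivity         # about 90
false_positives = no_hiv * (1 - specificity)   # about 822

p_hiv_given_positive = true_positives / (true_positives + false_positives)
print(f"P(HIV | positive test) is about {p_hiv_given_positive:.1%}")  # ~9.9%
[/code]

Run as-is it prints roughly 9.9%, matching the count above; a real test's separate sensitivity and specificity would only change the two 0.917 inputs.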

On its face it doesn't seem to make much sense, because it doesn't even put you below the 1% chance you have HIV if you're a male in the USA, according to the CDC. So does that really mean that being a male in the USA, according to the CDC, makes you less likely to have HIV than being a person who tests negative on the swab test? It might be comparing apples and oranges, because the 1-in-12 number assumes every one of the 12 has HIV, but it would still be useful to compare them.

But using the 1% statistic, and assuming the person is a randomly selected individual of that demographic, it seems (logically) like the probability would be reduced from 1% if someone in that category tested negative on the test. It seems like a compound statistical problem. Is there a way of calculating, from these two statistics, how far below 1% the chance of having HIV now is for a person who has tested negative on the swab test? One statistic looks like it assumes every person being tested is HIV positive, and the other comes from a random sample of all males in the US of a particular demographic, but it would be informative to use them together to derive this. I just don't know if they can be, because I was told that if you use the rule of multiplying probabilities, they have to be of the same group and add up to 100%. I think it depends on whether the probabilities represent independent or dependent events. Which category do these two events fall into? And how can they be manipulated to calculate how likely it is that the person has HIV if they fall into both categories?
What "categories" are you talking about? I would think testing positive or negative but it is impossible for a person to test both positive or negative (at least on one test).
