Std dev is different depending on the scale?

In summary, the standard deviation is the square root of the variance and measures the variability of data. When the same objects are measured in different units, the variance must be converted in squared units (e.g. mm² to m²), so that the standard deviation converts in the ordinary linear units (mm to m).
  • #1
Lobotomy
Hello,
std dev is the square root of the variance.

Assume we measure lengths of something normally distributed, using millimeters.
We calculate our variance to be 100 mm and thus the std dev to be sqrt(100) = 10 mm.

But if we instead measured the same objects in meters, we'd get the variance to be 0.1 m (exactly the same as 100 mm), but then the std dev is sqrt(0.1) ≈ 0.3162, which is 316 mm!

So has our std dev suddenly increased from 10 mm to 316 mm just by using a different scale when measuring the same objects?
 
  • #2
Your variance would be 100 mm², not just mm (note that the units of variance are those of the variable squared), so when converting to meters you need to convert mm² to m².

Otherwise, when you took the square root, your standard deviation would have units of the square root of a meter, which is pretty weird.
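
The general rule behind this (a standard property of variance, spelled out here for clarity rather than quoted from the thread): rescaling every measurement by a factor c rescales the variance by c² and the standard deviation by |c|:

$$\operatorname{Var}(cX) = E\big[(cX - c\,E[X])^2\big] = c^2\,E\big[(X - E[X])^2\big] = c^2\operatorname{Var}(X), \qquad \sigma_{cX} = |c|\,\sigma_X.$$

Going from millimeters to meters, c = 1/1000, so the variance shrinks by a factor of 10⁶ while the standard deviation shrinks by a factor of 10³.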
 
  • #3
Office_Shredder said:
Your variance would be 100 mm², not just mm (note that the units of variance are those of the variable squared), so when converting to meters you need to convert mm² to m².

Otherwise, when you took the square root, your standard deviation would have units of the square root of a meter, which is pretty weird.

OK, I see. Thanks.

So the first time we had a variance of 100 mm² and a std dev of 10 mm.
Then we measure in meters.
That means we have a variance of 0.0001 m² (0.0001 m² = 100 mm²).
So we calculate the std dev sqrt(0.0001) = 0.01 m, which is equal to 10 mm! Makes more sense.
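
A quick numerical check of this resolution (a minimal sketch with made-up sample data; the specific mean and spread are assumptions for illustration):

```python
import numpy as np

# Hypothetical lengths in millimeters, roughly normally distributed.
rng = np.random.default_rng(0)
lengths_mm = rng.normal(loc=500.0, scale=10.0, size=10_000)

# The same measurements expressed in meters.
lengths_m = lengths_mm / 1000.0

var_mm = lengths_mm.var()  # units: mm^2
var_m = lengths_m.var()    # units: m^2

# Variance converts with the square of the unit factor: 1 m^2 = 10^6 mm^2.
print(var_mm, var_m * 1e6)  # the two values agree

# Standard deviation converts linearly: 1 m = 1000 mm.
print(lengths_mm.std(), lengths_m.std() * 1000)  # the two values agree
```

Both printed pairs agree, confirming that the standard deviation changes only by the unit conversion factor, not in physical terms.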
 

Related to Std dev is different depending on the scale?

1. What is standard deviation and why is it important?

Standard deviation is a measure of how spread out a set of data is from the mean. It is important because it allows us to understand the variability of a data set and make comparisons between different groups or populations.
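
For reference, the usual textbook formula for the sample standard deviation (not spelled out in the thread itself) is

$$s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2},$$

where \(\bar{x}\) is the sample mean of the n observations.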

2. How does the scale affect the standard deviation?

The scale changes the numerical value of the standard deviation because it changes the units of measurement. For example, the same weight data measured in kilograms and in pounds gives different numbers for the standard deviation, even though the physical spread is identical.
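
Concretely, since multiplying every measurement by a conversion factor c multiplies the standard deviation by |c| (using 1 kg ≈ 2.2046 lb):

$$\sigma_{\text{lb}} \approx 2.2046\,\sigma_{\text{kg}}, \qquad \sigma^2_{\text{lb}} \approx 4.86\,\sigma^2_{\text{kg}}.$$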

3. Can standard deviation be negative?

No, standard deviation cannot be negative. It is the square root of the variance, which is an average of squared deviations and therefore never negative; it is zero exactly when all the data values are equal.

4. Why is it important to consider the scale when interpreting standard deviation?

It is important to consider the scale because it affects the magnitude of the standard deviation and can make comparisons between data sets misleading. For example, if one data set is measured in millimeters and another in kilometers, the millimeter data will have a numerically much larger standard deviation even if the underlying variability is similar.

5. Is there a preferred scale for calculating standard deviation?

No, there is no preferred scale for calculating standard deviation. The scale should be chosen based on the type of data and the units that make the most sense for the specific context. It is important to be consistent with the scale used when making comparisons or interpreting the results.
