# Why may variance be difficult to use as a measure of spread?


### 1 Answer

Variance is certainly a good measure of spread. However, it is less intuitive than standard deviation because its value is the expected value of the **square** of the deviation from the mean, so it is expressed in squared units rather than the units of the data.

This makes real-world interpretation difficult, because we often describe spread in terms of multiples of a typical deviation from the mean. If we use variance for this, an interval of one "average variance" around the mean may extend far outside the set of values itself, depending on the set!

For example, consider the following set of numbers:

`A = {1,2,3,4,5,6}`

The mean of this set is 3.5, the (sample) variance of the set is 3.5, and the standard deviation is about 1.871. The standard deviation gives us a much better indicator of how closely the samples are grouped around the mean, in the same units as the samples themselves: the interval of one standard deviation around the mean is roughly [1.63, 5.37], which brackets most of the set. If we instead step one variance away from the mean, we get [0, 7], which is wider than the entire set, so it does not really help us in any intuitive sense.
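The numbers above can be checked with Python's standard `statistics` module, a minimal sketch assuming the sample (n − 1) definition of variance, which matches the values quoted:

```python
import statistics

A = [1, 2, 3, 4, 5, 6]

mean = statistics.mean(A)      # 3.5
var = statistics.variance(A)   # sample variance: 3.5
sd = statistics.stdev(A)       # sqrt(3.5) ≈ 1.871

# One standard deviation around the mean stays inside the data's range,
# while one variance around the mean spills past both ends of the set.
print(mean - sd, mean + sd)    # ≈ 1.63 .. 5.37
print(mean - var, mean + var)  # 0.0 .. 7.0
```

Note that `statistics.pvariance` (the population variance, dividing by n) would give about 2.92 instead; the answer's figure of 3.5 corresponds to the sample variance.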

Variance is generally used more when we are analyzing the sources of variation in a data set, rather than describing how spread out the values are.

I hope this helps!
