We are given a population with a mean iron level of `mu=15.5` g/dl and a population standard deviation of `sigma=1.6` g/dl. We assume that the iron levels in the population are distributed approximately normally. We are asked to determine the range within which the average iron level of a group of specified size would be considered normal.

Let us use a sample size of 25; that is, we measure the iron levels of 25 randomly chosen individuals from the population. We are asked to determine the range of "normal" values for the average of this sample.

Because the sample mean of `n` independent observations has standard deviation `sigma/sqrt(n)`, the standard error here is `sigma/sqrt(n)=1.6/sqrt(25)=0.32` g/dl. (Contrast this with the standard deviation for a single individual, which is 1.6 g/dl.) Since the population itself is approximately normal, the sample mean is also approximately normally distributed.
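The standard-error calculation above can be sketched in a few lines of Python (the variable names are ours, chosen for illustration):

```python
import math

mu = 15.5      # population mean iron level (g/dl)
sigma = 1.6    # population standard deviation (g/dl)
n = 25         # sample size

# Standard error of the sample mean: sigma / sqrt(n)
se = sigma / math.sqrt(n)
print(se)  # 0.32
```

Note how quickly the standard error shrinks: quadrupling the sample size to 100 would halve it to 0.16 g/dl.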

Now apply the empirical (68-95-99.7) rule with the mean of 15.5 and the standard error of 0.32.

Approximately 68% of samples of size 25 will have a mean between 15.18 and 15.82 g/dl, 95% will have a mean between 14.86 and 16.14 g/dl, and 99.7% will have a mean between 14.54 and 16.46 g/dl.
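These three intervals are just the mean plus or minus 1, 2, and 3 standard errors. A minimal sketch that reproduces them (variable names are ours):

```python
import math

mu, sigma, n = 15.5, 1.6, 25
se = sigma / math.sqrt(n)  # standard error of the sample mean

# Empirical (68-95-99.7) rule: mean +/- k standard errors
for k, pct in [(1, 68), (2, 95), (3, 99.7)]:
    low, high = mu - k * se, mu + k * se
    print(f"{pct}% of sample means fall between {low:.2f} and {high:.2f} g/dl")
```

Running this prints the intervals (15.18, 15.82), (14.86, 16.14), and (14.54, 16.46), matching the values above.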

**Thus, with probability about 99.7%, we would expect the mean iron level of a sample of 25 individuals randomly chosen from the population to fall between 14.54 and 16.46 g/dl.**
