# Consider iron levels in a population that has a mean of 15.5 g/dL and a standard deviation of 1.6 g/dL. You are measuring the iron levels in one patient. Give an example of an iron level from the patient that would be considered unusual. Describe how you determined that such a level would be considered unusual, using the Empirical Rule.

We are given a population with mean `mu = 15.5` g/dL and population standard deviation `sigma = 1.6` g/dL. We are asked to give an iron level from one individual that would be considered "unusual."

We assume that the iron levels in the population are approximately normally distributed. That means that if we drew the histogram for the population, the shape would be bell-like: symmetric, with a single peak at the center.

Under that assumption, we can apply the Empirical Rule: approximately 68% of the population lies within 1 standard deviation of the mean, about 95% lies within 2 standard deviations of the mean, and close to 99.7% lies within 3 standard deviations of the mean.

Thus about 68% of the population will have iron levels from 13.9–17.1 g/dL, about 95% from 12.3–18.7 g/dL, and about 99.7% from 10.7–20.3 g/dL.
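The interval arithmetic above (mean plus or minus k standard deviations) can be sketched as a short Python snippet; the variable names are illustrative, not from the original problem:

```python
# Empirical Rule intervals for mu = 15.5 g/dL, sigma = 1.6 g/dL.
mu, sigma = 15.5, 1.6

for k, pct in [(1, 68), (2, 95), (3, 99.7)]:
    lo, hi = mu - k * sigma, mu + k * sigma
    print(f"About {pct}% of iron levels fall in {lo:.1f}-{hi:.1f} g/dL")
```

Running this prints the three intervals 13.9–17.1, 12.3–18.7, and 10.7–20.3 g/dL, matching the figures above.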

Any reading less than 10.7 g/dL or greater than 20.3 g/dL would be very unusual: only about 3 in 1,000 people would be expected to have such extreme iron levels. For example, a reading of 21.0 g/dL from the patient would be considered unusual, since it lies more than 3 standard deviations above the mean.
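The "more than 3 standard deviations from the mean" criterion can be expressed as a small helper function; `is_unusual` is a hypothetical name introduced here for illustration:

```python
# A reading is unusual if it lies more than k standard deviations from the
# mean, i.e., outside the 99.7% band of the Empirical Rule when k = 3.
def is_unusual(x, mu=15.5, sigma=1.6, k=3):
    return abs(x - mu) > k * sigma

print(is_unusual(21.0))  # 21.0 g/dL is above 20.3 g/dL -> True
print(is_unusual(16.0))  # within one standard deviation -> False
```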