When someone looks at a nebula through a telescope, are they actually looking at what the nebula used to look like a few million years ago?
Generally speaking, yes, although "thousands of years" is a more realistic timeframe than millions: most of the nebulae we observe lie within our own galaxy, which is only about 100,000 light years across, so their light typically takes hundreds to a few thousand years to reach us.
The term "light year" is a bit nonintuitive for many despite the apparent simplicity of its name. A light year is not a unit of time, but of distance; it is the distance light travels in one year. Since light travels at an incredible rate relative to our own everyday conceptions of fast and slow, it is more practical to describe interstellar distances in terms of light years as opposed to miles, or something like "light minutes," although this is useful for reference to smaller scales (for example, we're about 8 light minutes from the sun, meaning that if it burned out it would take 8 minutes for us to find out).
Connected to this is the fundamental principle that information cannot be transmitted faster than the speed of light. Without getting into particle physics and the attempts to bend or break this rule, light speed is the maximum speed at which information about any event can reach us, so the soonest we can learn that an event has taken place is the time it would take light to cross the distance between us.
For a nebula, or any other stellar object, its distance in light years is therefore also the time lag between what we're seeing and the object's actual current state: an object N light years away appears to us as it was N years ago.
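To make that identity concrete, here is a small sketch. The ~1,300 light year distance to the Orion Nebula is an approximate published figure used purely for illustration; the point is that the lookback time in years equals the distance in light years by definition:

```python
# One light year in kilometers, and the lookback time for a nearby nebula.
C_KM_S = 299_792.458                 # speed of light, km/s
SECONDS_PER_YEAR = 365.25 * 24 * 3600

light_year_km = C_KM_S * SECONDS_PER_YEAR
print(f"One light year is about {light_year_km:.3e} km")  # ~9.461e+12 km

orion_distance_ly = 1_300            # approximate distance to the Orion Nebula
lookback_years = orion_distance_ly   # by definition: distance in ly = lookback in years
print(f"We see the Orion Nebula as it was roughly {lookback_years:,} years ago")
```

No conversion is needed in the last step; that is exactly what makes light years such a convenient unit here.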