If you look at the recent history of computers and microchips, the technology has become much smaller, faster, and less expensive. Moore's Law states that the number of transistors that can be cost-effectively manufactured on a microchip doubles roughly every two years.
The rule has held up well over time, although many observers expect the trend to plateau sooner or later. The Law has far outlasted what Gordon Moore himself could have imagined when he first described it in 1965:
The complexity for minimum component costs has increased at a rate of roughly a factor of two per year... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer.
The same pattern appears outside the computing world as well: DVD players and iPods became faster and gained capacity and capability, yet at the same time they became less expensive.
Moore's law refers to a trend in computer hardware development and design noted by Gordon E. Moore. He described the trend in a paper presented in 1965, observing that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965, and he predicted that this trend would continue for at least ten years, that is, up to 1975.
In 1975 Moore revised his prediction to a doubling every two years rather than every year. That prediction has held quite accurately to date, although some experts now expect the trend to continue only until around 2015.
Because of the popularity of computers among the general public, many different versions of Moore's prediction circulate as "Moore's law". The variations include using a period of 18 months instead of two years, and restating the change in terms of other measures, such as a doubling of computing power for the same hardware cost.
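The difference between the two-year and 18-month variants can be made concrete with a little arithmetic. The sketch below is illustrative only; the function name and the starting figures are my own choices, not taken from any source, but the doubling rule itself is the one described above:

```python
def projected_count(base_count, base_year, target_year, doubling_period=2.0):
    """Project a component count forward in time, assuming one doubling
    every `doubling_period` years (Moore's revised 1975 figure by default)."""
    doublings = (target_year - base_year) / doubling_period
    return base_count * 2 ** doublings

# Over the same six years the two common variants diverge noticeably:
# a two-year period gives 2^3 = 8x growth, an 18-month period 2^4 = 16x.
two_year = projected_count(1000, 2000, 2006, doubling_period=2.0)   # 8000
eighteen_month = projected_count(1000, 2000, 2006, doubling_period=1.5)  # 16000
```

Because the growth is exponential, even this small difference in the assumed period compounds quickly: over twelve years the 18-month variant projects roughly eight times the count that the two-year variant does.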