# An airplane is flying at 300 m/s (672 mi/h). How much time must elapse before a clock in the airplane and one on the ground differ by 1 s?


In the SPECIAL (or restricted) theory of relativity it is demonstrated that the clock of an observer moving with speed v falls behind a clock that is considered motionless. In other words, time dilates for moving systems.

The relation between the time `Delta(t)` shown by the clock of the moving observer and the time `Delta(t')` shown by the motionless clock is

`Delta(t') = (Delta(t))/sqrt(1 - v^2/c^2)` where `c = 3*10^8 m/s` is the speed of light in vacuum.

For a 1 second difference we have

`1 = Delta(t') -Delta(t) = Delta(t)*(1/sqrt(1-v^2/c^2) -1)=`

`= Delta(t)*(1/sqrt(1 - (9*10^4)/(9*10^16)) - 1) ~~ (v^2/(2c^2))*Delta(t) = 5*10^-13*Delta(t)`

(using the approximation `1/sqrt(1-x) - 1 ~~ x/2` for `x = v^2/c^2 < < 1`).

Therefore the time that must elapse before the clocks differ by 1 second is

`Delta(t) = 1/(5*10^-13) = 2*10^12 seconds ~~ 63400 years`
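The arithmetic above can be checked with a short script (my own sketch, not part of the original answer), using the same small-v approximation `v^2/(2c^2)` for the rate difference:

```python
import math

c = 3.0e8   # speed of light, m/s
v = 300.0   # plane speed, m/s

# Exact rate factor and its small-v approximation v^2/(2c^2).
exact = 1.0 / math.sqrt(1.0 - v**2 / c**2) - 1.0
approx = v**2 / (2.0 * c**2)
print(approx)                    # 5e-13 per second of ground time

# Ground-clock time needed for the clocks to differ by 1 s.
dt = 1.0 / approx
print(dt)                        # 2e12 seconds
print(dt / (365 * 24 * 3600))    # about 63,400 years (365-day years)
```

At v = 300 m/s the exact Lorentz factor and the quadratic approximation agree to many digits, which is why the approximation is safe here.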

Observation:

In the GENERAL theory of relativity it is shown that there is another, much more pronounced effect that makes one clock run faster or slower relative to another. This is gravitational time dilation: clocks closer to large masses (like the Earth) run slower relative to clocks at high altitudes (far from Earth). There is no need for one clock to move relative to the other. Since the problem gives only the speed of the plane, the answer does not take this effect into account.
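To get a feel for the size of the gravitational effect, the weak-field rate offset between a clock at altitude h and one on the ground is approximately `g*h/c^2`. The altitude below is my own assumption (the problem does not give one), so this is only an order-of-magnitude comparison:

```python
c = 3.0e8      # speed of light, m/s
g = 9.81       # gravitational acceleration near the surface, m/s^2
h = 10000.0    # assumed cruising altitude, m (not given in the problem)
v = 300.0      # plane speed, m/s

# Weak-field gravitational rate offset g*h/c^2:
# the clock at altitude runs faster than the ground clock.
grav = g * h / c**2
# Kinematic (special-relativistic) offset for comparison: v^2/(2c^2).
kinematic = v**2 / (2.0 * c**2)

print(grav)                 # about 1.09e-12
print(grav / kinematic)     # about 2.2
```

At a typical cruising altitude the gravitational offset is roughly twice the kinematic one and acts in the opposite direction, which supports the observation above that it cannot be ignored in a full treatment.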