An object is dropped from a tower 180 m high. How long does it take to reach the ground? (Take g = 10 m/s^2.)
I assume we ignore air resistance. Then the height of the object at time t is given by the rule
`h(t) = h_0 - (g*t^2)/2,`
where `h_0` is the initial height and `t = 0` corresponds to the moment the object was dropped. The object hits the ground when `h(t_1) = 0`, which gives a simple equation:
`180 - (10*t_1^2)/2 = 0,`
`t_1^2 = 36,` so `t_1 = 6` s.
(The negative root `t_1 = -6` is rejected, since t must be >= 0.)
The initial speed is zero at a height of 180 m.
The object travels downward, so we can take the downward direction as positive: g = 10 m/s^2.
So, using the equations of motion, S = ut + (1/2)gt^2,
where S = 180 m, u = 0 m/s, and t is the time taken to reach the ground:
180 = (1/2)(10)t^2
t^2 = 36
t = 6 s
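The calculation above can be sketched in a few lines of Python; the function name `fall_time` is just for illustration:

```python
import math

def fall_time(height_m, g=10.0):
    """Time for an object dropped from rest to fall height_m metres,
    ignoring air resistance: h = (1/2) g t^2  =>  t = sqrt(2h/g)."""
    return math.sqrt(2 * height_m / g)

print(fall_time(180))  # 6.0 (seconds)
```

With the more precise g = 9.8 m/s^2 the same formula gives roughly 6.06 s, so the rounded value used in the problem barely changes the answer.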