A ball is thrown vertically downward with an initial speed of 20.5 m/s from a height of 58.8 m.
What will be its speed just before it strikes the ground? How long will it take for the ball to reach the ground?
We have the initial speed: v0 = 20.5 m/s.
We also know the height from which the ball is thrown: h = 58.8 m.
We'll write the equation of motion relating speed and distance (taking the downward direction as positive, so gravity increases the speed):
v^2 = v0^2 + 2g*h
v = sqrt(v0^2 + 2g*h) (1)
We'll substitute the given data into the relation (1):
v = sqrt[(20.5)^2 + 2*9.8*58.8]
v = sqrt(420.25 + 1152.48)
v = sqrt(1572.73)
v = 39.7 m/s
The speed of the ball just before it strikes the ground is v = 39.7 m/s.
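As a quick numerical check of this step, here is a minimal Python sketch (the variable names v0, h, and g are my own, chosen to mirror the notation above):

```python
import math

v0 = 20.5   # initial downward speed, m/s
h = 58.8    # height above the ground, m
g = 9.8     # acceleration due to gravity, m/s^2

# Time-independent kinematic equation: v^2 = v0^2 + 2*g*h
v = math.sqrt(v0**2 + 2 * g * h)
print(f"v = {v:.1f} m/s")  # prints v = 39.7 m/s
```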
To calculate how long it will take for the ball to reach the ground, we'll write the equation of motion:
v = v0 + g*t
We'll subtract v0 from both sides:
gt = v - v0
We'll divide by g:
t = (v-v0)/g (2)
We'll substitute the velocities in (2):
t = (39.7 - 20.5)/9.8
t = 19.2/9.8
t = 1.96 s
The ball takes t = 1.96 s to reach the ground.
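The same sketch can verify this second step and cross-check the result against the displacement equation h = v0*t + (1/2)*g*t^2 (the cross-check is my addition, not part of the original solution):

```python
import math

v0, h, g = 20.5, 58.8, 9.8  # m/s, m, m/s^2

v = math.sqrt(v0**2 + 2 * g * h)  # speed just before impact
t = (v - v0) / g                  # time to reach the ground, from v = v0 + g*t
print(f"t = {t:.2f} s")           # prints t = 1.96 s

# Cross-check: plugging t back into h = v0*t + 0.5*g*t^2 should recover 58.8 m
print(f"h = {v0 * t + 0.5 * g * t**2:.1f} m")  # prints h = 58.8 m
```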