# Suppose that on Earth you can throw a ball vertically upward a distance of 2.00 m. Given that the acceleration of gravity on the Moon is 1.67 m/s², how high could you throw a ball on the Moon?

Suppose that on Earth you can throw a ball vertically upward a distance of 2.00 m. Given that the acceleration of gravity on the Moon is 1.67 m/s², how high could you throw a ball on the Moon? (Take the *y*-axis in the vertical direction, and assume that the location of your hand is at *y* = 0.)


### 1 Answer

First, we must find the initial velocity the ball needs on Earth to reach a height of 2.00 m. The maximum height follows from the kinematic equation:

H(Earth) = (v^2 – v0^2) / (–2*g(Earth))

v → the velocity at any instant.

v0 → the initial velocity of the ball.

g(Earth) → the acceleration of gravity on Earth.

H(Earth) → the height reached on Earth.

The negative sign in the denominator reflects that gravity decelerates the upward motion. At the point of maximum height the final velocity v is zero, so the equation simplifies to:

H(Earth) = v0^2 / (2*g(Earth))

v0 = sqrt(2*g(Earth)*H(Earth))

v0 = sqrt(2*9.8*2.00) = 6.26 m/s
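The arithmetic above can be checked with a short script (a minimal sketch; the variable names are mine, not part of the original problem):

```python
import math

g_earth = 9.8    # m/s^2, acceleration of gravity on Earth
h_earth = 2.00   # m, height reached on Earth

# v0 = sqrt(2 * g * H): initial speed needed to coast up to height H
v0 = math.sqrt(2 * g_earth * h_earth)
print(round(v0, 2))  # → 6.26
```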

Using this value of the initial velocity, we evaluate the same equation to find the height reached when the ball is thrown on the Moon:

H(Moon) = (v^2 – v0^2) / (–2*g(Moon))

H(Moon) = v0^2 / (2*g(Moon)) = (6.26)^2 / (2*1.67) = 11.73 m
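The Moon calculation can be verified the same way; since v0 is common to both throws, the heights also scale as the ratio of the gravities (again a sketch with my own variable names):

```python
import math

g_earth, g_moon = 9.8, 1.67  # m/s^2
h_earth = 2.00               # m

v0 = math.sqrt(2 * g_earth * h_earth)   # initial speed from the Earth throw
h_moon = v0**2 / (2 * g_moon)           # same kinematics, lunar gravity
print(round(h_moon, 2))                 # → 11.74

# Equivalent shortcut: H(Moon) = H(Earth) * g(Earth) / g(Moon)
print(round(h_earth * g_earth / g_moon, 2))  # → 11.74
```

Carrying full precision gives 11.74 m; the 11.73 m in the answer comes from rounding v0 to 6.26 m/s before squaring. Either figure is acceptable at this precision.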

**So, a ball thrown with the same initial speed on the Moon will reach a height of about 11.73 m.**