We can solve this by using kinematic equations and the background knowledge that the acceleration due to gravity at the surface of the Earth is 9.8 m/s^2.
We also have to assume that the rock is being dropped rather than thrown, which makes its initial velocity 0. If the rock were thrown, we would need to know its initial velocity to solve the problem reliably.
Acceleration is a measure of the change in velocity over time. If we know that time is 3 seconds, and the initial velocity is zero, then we can find the velocity the rock will achieve in 3 seconds under that acceleration, as well as the distance that it will cover in that time while accelerating.
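As a quick check of the velocity part, here is a minimal sketch in plain Python (the variable names are my own, not part of the problem):

```python
g = 9.8   # acceleration due to gravity, m/s^2
t = 3.0   # fall time, s
v0 = 0.0  # initial velocity (dropped, not thrown), m/s

# v = v0 + a*t: velocity gained under constant acceleration
v = v0 + g * t
print(v)  # approximately 29.4 m/s after 3 seconds
```

So after 3 seconds of free fall the rock is moving at about 29.4 m/s.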
The best equation to use is:
x = ViT + .5aT^2
Vi (the initial velocity) is 0. This means that the distance the rock travels depends entirely on the acceleration acting on it over this timeframe.
a = 9.8 m/s^2
T = 3 s
x = .5(9.8)(3^2)
x = 44.1 m
In a more realistic setup that accounts for effects like air resistance, which would slow the rock slightly, it would probably cover less than 44.1 m, because its average velocity over this time period would be lower.