The distance, in feet, a stone drops in $t$ seconds is given by $d(t) = 16t^2$. The depth of a hole is to be approximated by dropping a stone and listening for it to hit the bottom. Assume that the time measurement is accurate to within 1/10 of a second. Use a linear approximation to estimate an upper bound for the propagated error in the depth if the measured time is:

a) 3 seconds. (Give your answer in feet.)

b) 6 seconds. (Give your answer in feet.)
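The linear approximation says the propagated error is bounded by $\Delta d \approx d'(t)\,\Delta t = 32t\,\Delta t$ with $\Delta t = 0.1$ s. A minimal sketch checking both parts numerically (the function name `propagated_error` is my own, not from the problem):

```python
def propagated_error(t, dt=0.1):
    """Linear-approximation bound on the error in d(t) = 16*t**2.

    Since d'(t) = 32*t, the propagated error is approximately
    |d'(t)| * dt = 32 * t * dt (in feet).
    """
    return 32 * t * dt

# Part (a): measured time t = 3 s
print(propagated_error(3))

# Part (b): measured time t = 6 s
print(propagated_error(6))
```

This gives about 9.6 feet for part (a) and 19.2 feet for part (b): doubling the measured time doubles the error bound, because $d'(t) = 32t$ grows linearly in $t$.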