Determine the minimum sample size needed to estimate the mean seek time of disk drives from a production run to an accuracy of 0.025 ms with 99% confidence. Assume that the standard deviation s(y) is known to be 0.075 ms from a previous study.

Solution: The required z value for a 99% confidence interval is 2.576, and from CI theory the 99% plus/minus half-width of the estimate is

    delta = z(99) * s(y) / sqrt(n)

where delta is the required accuracy. Substituting the given values:

    0.025 = 2.576 * 0.075 / sqrt(n)

Rearranging terms:

    n = ((2.576 * 0.075) / 0.025)**2 = 59.72

Rounding up, 60 disk drives would be required. (Using the rougher approximation z = 2.6 instead gives n = 60.84, or 61 drives.)
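The calculation above can be sketched in Python. The function name `sample_size` is illustrative; the critical value is computed exactly from the standard normal distribution rather than using a rounded table value:

```python
import math
from statistics import NormalDist

def sample_size(accuracy, sigma, confidence=0.99):
    """Minimum n so that the half-width z * sigma / sqrt(n) <= accuracy."""
    # Two-sided critical value z_{1 - alpha/2}; for 99% this is about 2.576.
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    # Solve accuracy = z * sigma / sqrt(n) for n, then round up to a whole unit.
    return math.ceil((z * sigma / accuracy) ** 2)

print(sample_size(0.025, 0.075))  # -> 60
```

Rounding n up (rather than to the nearest integer) is deliberate: the sample size must be at least large enough to achieve the stated accuracy, so fractional results are always ceiled.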