You are interested in the performance of an algorithm that processes images. In particular, you are interested in the relationship between 'processing time' and 'image complexity' and would like to derive a regression equation that will allow you to predict 'processing time' from 'image complexity'. You select a random sample of images from a corpus of images and compute the following sample statistics:

processing time: mean = 460ms; standard deviation = 100ms
image complexity: mean = 90; standard deviation = 20
correlation between 'processing time' and 'image complexity': 0.8

Question #1: Derive the regression equation to predict 'processing time' from 'image complexity' and state the equation.

Solution: Notice that you need to predict 'processing time' from 'image complexity', so 'processing time' is the response variable and 'image complexity' is the predictor. The slope is the correlation multiplied by the ratio of the standard deviations (response over predictor):

slope = r(sy/sx) = 0.8(100/20) = 4

Since the regression line passes through the point of means (90, 460):

intercept = 460 - 4(90) = 460 - 360 = 100

Hence the regression equation is:

processing time = 100 + 4(image complexity)

Question #2: From your answer to Question #1, answer the following:

a) Predict the processing time for an image that has an 'image complexity' score of 150.

Solution: Simply plug the 'image complexity' score into the regression equation:

processing time = 100 + 4(150) = 100 + 600 = 700ms

b) Interpret the coefficients of the regression equation. Assume that a complexity score of zero means that the image is monochromatic (i.e. zero complexity).

Solution:
SLOPE: For each unit increase in 'image complexity', we can expect 'processing time' to increase by 4ms.
INTERCEPT: When 'image complexity' is zero (i.e. monochromatic), the predicted 'processing time' is 100ms. Here the intercept is meaningful and may represent a fixed overhead of 100ms incurred when processing any image.
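The derivation in both questions can be checked with a short script. Below is a minimal sketch in Python; the variable names (mean_x, sd_y, predict_processing_time, etc.) are illustrative choices, not part of the original problem.

```python
# Sample statistics from the problem.
mean_x, sd_x = 90, 20      # image complexity
mean_y, sd_y = 460, 100    # processing time (ms)
r = 0.8                    # correlation between the two variables

# Least-squares slope: correlation times the ratio of standard deviations.
slope = r * (sd_y / sd_x)            # 0.8 * (100 / 20) = 4.0

# The regression line passes through the point of means (mean_x, mean_y).
intercept = mean_y - slope * mean_x  # 460 - 4 * 90 = 100.0

def predict_processing_time(complexity):
    """Predicted processing time (ms) for a given image complexity score."""
    return intercept + slope * complexity

print(f"processing time = {intercept:.0f} + {slope:.0f}(image complexity)")
print(f"prediction at complexity 150: {predict_processing_time(150):.0f}ms")
# Output:
# processing time = 100 + 4(image complexity)
# prediction at complexity 150: 700ms
```

Note that a complexity score of 150 lies three standard deviations above the sample mean of 90, so this prediction is an extrapolation near (or beyond) the edge of the sampled data and should be treated with appropriate caution.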