Serial correlation (also known as autocorrelation) is a violation of the Ordinary Least Squares assumption that the error terms of different observations are uncorrelated with one another. In a model with first-order serial correlation, the current value of the error term depends on the value immediately preceding it:

e_t = ρ·e_{t-1} + u_t

where e = error term of the equation in question; ρ = first-order autocorrelation coefficient; u = classical (not serially correlated) error term.
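As a quick illustrative sketch (the variable names and parameter values below are my own choices, not from the post), we can simulate an AR(1) error process and check that its sample lag-1 autocorrelation comes out near ρ:

```python
import numpy as np

# Simulate e_t = rho * e_{t-1} + u_t, where u_t is a classical
# (serially uncorrelated, white-noise) error term.
rng = np.random.default_rng(0)
rho = 0.7          # first-order autocorrelation coefficient (assumed value)
n = 5000
u = rng.normal(size=n)   # classical error term

e = np.zeros(n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + u[t]

# The sample correlation between e_t and e_{t-1} should be close to rho.
lag1 = np.corrcoef(e[:-1], e[1:])[0, 1]
print(round(lag1, 2))
```

With a large sample the printed lag-1 autocorrelation will be close to the ρ used in the simulation, which is exactly the pattern the Durbin-Watson test is designed to detect.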
