The error \(\epsilon\) is a theoretical quantity in the true model, just like the coefficients \(\alpha\) and \(\beta\):
\(Y = \alpha + \beta X + \epsilon\)
When you do a regression, you estimate these parameters with a fitted model:
\(\hat{Y} = a + bX\)
where \(a\) and \(b\) are estimates of \(\alpha\) and \(\beta\), respectively. The residual \(e\) is the difference between a data point and the fitted line: \(e = Y - \hat{Y} = Y - (a + bX)\). You will never observe the error, just as you will never know the true coefficients. You have estimates, and the residual estimates the error: the variation in the relationship of Y ~ X that is not accounted for by the model. So your question is analogous to asking "what is the difference between the estimate and the true coefficient?" They are related, but they are not the same entity at all.
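
If it helps to see the distinction concretely, here is a minimal simulation sketch (Python with NumPy; the parameter values, sample size, and noise level are arbitrary choices of mine). Because we generate the data ourselves, we happen to know the true errors, so we can compare them with the residuals from the fitted line:

```python
import numpy as np

rng = np.random.default_rng(42)

# True model: Y = alpha + beta*X + epsilon. We only "know" alpha, beta,
# and epsilon here because we are the ones simulating the data.
alpha, beta = 2.0, 0.5
n = 100
X = rng.uniform(0, 10, size=n)
epsilon = rng.normal(0, 1, size=n)   # the true errors
Y = alpha + beta * X + epsilon

# Ordinary least squares fit: a and b estimate alpha and beta.
b, a = np.polyfit(X, Y, deg=1)       # polyfit returns slope first, then intercept
Y_hat = a + b * X

# Residuals estimate the errors, but they are not the errors.
e = Y - Y_hat
print(f"a = {a:.3f} (alpha = {alpha}),  b = {b:.3f} (beta = {beta})")
print(f"max |residual - error| = {np.max(np.abs(e - epsilon)):.3f}")
```

With real data you only ever have the last part: the residuals \(e\) computed from the fitted values, which stand in for the errors you cannot observe.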