# Difference between Mean Squared Residuals (MSR) and Mean Squared Error (MSE)

The difference between Mean Squared Residuals (MSR) and Mean Squared Error (MSE) lies in the denominator of the formula.

## Formula of Sum of Squared Residuals (SSR)

Both MSR and MSE start from the sum of squared residuals (SSR), also known as the residual sum of squares (RSS) or the sum of squared errors (SSE). That is, SSR (or RSS or SSE) is the sum of the squares of the residuals, the deviations between the predicted values and the actual values from the data.

$SSR=\sum_{i=1}^{n} (\hat{y}_i-y_i)^2$

where,

$n$ is the number of observations

$\hat{y}_i$ is the estimated (predicted) value

$y_i$ is the observed value
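The SSR formula above can be sketched in a few lines of Python; the prediction and observation arrays here are made-up values for illustration.

```python
# Sum of squared residuals (SSR): sum of (predicted - observed)^2.
# y_hat and y are illustrative, made-up values.
y_hat = [2.5, 0.0, 2.1, 7.8]  # predicted values
y = [3.0, -0.5, 2.0, 7.0]     # observed values

# Square each residual (y_hat_i - y_i) and add them up.
ssr = sum((yh - yo) ** 2 for yh, yo in zip(y_hat, y))
print(ssr)
```

Whether you compute residuals as $\hat{y}_i - y_i$ or $y_i - \hat{y}_i$ makes no difference here, since the squaring removes the sign.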

## Formula of Mean Squared Residuals (MSR)

MSR is the ratio between SSR and the number of observations $n$. The following is the formula for Mean Squared Residuals (MSR).

$MSR=\frac{SSR}{n}=\frac{\sum_{i=1}^{n} (\hat{y}_i-y_i)^2 }{n}$

where,

$n$ is the number of observations

$\hat{y}_i$ is the estimated (predicted) value

$y_i$ is the observed value
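A minimal sketch of the MSR computation, again with made-up prediction and observation values:

```python
# MSR = SSR / n: divide the sum of squared residuals by the number
# of observations. y_hat and y are illustrative, made-up values.
y_hat = [2.5, 0.0, 2.1, 7.8]  # predicted values
y = [3.0, -0.5, 2.0, 7.0]     # observed values
n = len(y)                    # number of observations

ssr = sum((yh - yo) ** 2 for yh, yo in zip(y_hat, y))
msr = ssr / n
print(msr)
```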

## Formula of Mean Squared Error (MSE)

MSR is a biased estimate of the variance of the unobserved errors. We can remove the bias by dividing SSR by the degrees of freedom, df = n − p − 1, instead of n, where n is the number of observations and p is the number of parameters being estimated (excluding the intercept); the extra 1 accounts for the intercept.

MSE is the ratio between SSR and the degrees of freedom. Thus, MSE is an unbiased estimate of the variance of the unobserved errors. The following is the formula for Mean Squared Error (MSE).

$MSE=\frac{SSR}{n-p-1}=\frac{\sum_{i=1}^{n} (\hat{y}_i-y_i)^2 }{n-p-1}$

where,

$n$ is the number of observations

$p$ is the number of estimated parameters (excluding the intercept)

$\hat{y}_i$ is the estimated (predicted) value

$y_i$ is the observed value
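The contrast between the two denominators can be seen in a small worked example. The sketch below fits a one-predictor regression line (so p = 1 and df = n − 2) using the closed-form slope and intercept, then computes both MSR and MSE from the same SSR; the data points are made up for illustration.

```python
# Fit a one-predictor OLS line by the closed-form formulas, then
# compare the biased MSR (divide by n) with the unbiased MSE
# (divide by n - p - 1). The (x, y) data are made-up values.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n, p = len(x), 1  # one estimated slope; the intercept is excluded from p

# Closed-form least-squares slope and intercept.
x_bar = sum(x) / n
y_bar = sum(y) / n
slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
        / sum((xi - x_bar) ** 2 for xi in x)
intercept = y_bar - slope * x_bar

# Predictions and the shared sum of squared residuals.
y_hat = [intercept + slope * xi for xi in x]
ssr = sum((yh - yo) ** 2 for yh, yo in zip(y_hat, y))

msr = ssr / n            # biased estimate of the error variance
mse = ssr / (n - p - 1)  # unbiased estimate of the error variance
print(msr, mse)
```

Because n − p − 1 < n, MSE is always at least as large as MSR for the same fit; the gap shrinks as n grows relative to p.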

Side Note: You might see some posts online that use n rather than n − p − 1 in the denominator of MSE. In that case, just be aware that the result is a biased estimate rather than an unbiased one (see reference below).