Sum of squares

Represents a measure of variation or deviation from the mean. It is calculated as the sum of the squared differences from the mean. The total sum of squares accounts for both the sum of squares from the factors and the sum of squares from random chance or error.
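
As a minimal sketch with made-up numbers, the sum of squares is simply the sum of squared deviations of each observation from the mean:

```python
import numpy as np

# Hypothetical observations (illustrative values only)
data = np.array([4.0, 7.0, 5.0, 9.0, 5.0])
mean = data.mean()                   # 6.0
ss = np.sum((data - mean) ** 2)      # (4-6)^2 + (7-6)^2 + (5-6)^2 + (9-6)^2 + (5-6)^2
print(ss)                            # 16.0
```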

Regression

In regression, the total sum of squares helps express the total variation of the y's. For example, you collect data to determine a model explaining overall sales as a function of your advertising budget.

The total sum of squares =  regression sum of squares (SSR) + residual sum of squares (SSE)

Σ(yᵢ − ȳ)² = Σ(ŷᵢ − ȳ)² + Σ(yᵢ − ŷᵢ)²

where ȳ is the mean of the y's and ŷᵢ is the fitted value from the regression.

The regression sum of squares is the variation attributed to the relationship between the x's and y's, or in this case between the advertising budget and your sales. The residual sum of squares is the variation attributed to the error.

By comparing the regression sum of squares to the total sum of squares, you find the proportion of the total variation that is explained by the regression model (R2, the coefficient of determination). The larger this value is, the better the model explains sales as a function of advertising budget.
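
The decomposition above can be sketched in Python with hypothetical advertising/sales figures (the numbers are illustrative, not from any real dataset):

```python
import numpy as np

# Hypothetical data: advertising budget (x) and sales (y)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 9.9, 12.1])

# Fit a simple linear regression y = b0 + b1*x by least squares
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

ss_total = np.sum((y - y.mean()) ** 2)      # total sum of squares
ss_reg = np.sum((y_hat - y.mean()) ** 2)    # regression sum of squares (SSR)
ss_err = np.sum((y - y_hat) ** 2)           # residual sum of squares (SSE)

r_squared = ss_reg / ss_total               # coefficient of determination
print(f"SST={ss_total:.3f}  SSR={ss_reg:.3f}  SSE={ss_err:.3f}  R2={r_squared:.4f}")
```

Note that SSR + SSE reproduces the total sum of squares, and R2 is their ratio.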

ANOVA

In analysis of variance (ANOVA), the total sum of squares helps express the total variation that can be attributed to various factors. For example, you run an experiment to test the effectiveness of three laundry detergents.

The total sum of squares =  treatment sum of squares (SST) + error sum of squares (SSE)

The treatment sum of squares is the variation attributed to the treatments, in this case the differences between the laundry detergents. The error sum of squares is the variation attributed to random error.

Dividing each sum of squares by its degrees of freedom converts it to a mean square. The ratio of the treatment mean square to the error mean square (the F statistic) determines whether there is a significant difference due to detergent. The larger this ratio is, the more the treatments affect the outcome.
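
A one-way ANOVA decomposition for the detergent example can be sketched as follows; the scores are made up purely for illustration:

```python
import numpy as np

# Hypothetical cleanliness scores for three laundry detergents
groups = {
    "A": np.array([77.0, 81.0, 71.0, 76.0, 80.0]),
    "B": np.array([72.0, 58.0, 74.0, 66.0, 70.0]),
    "C": np.array([76.0, 85.0, 82.0, 80.0, 77.0]),
}

all_obs = np.concatenate(list(groups.values()))
grand_mean = all_obs.mean()

# Treatment (between-group) sum of squares
ss_treat = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups.values())
# Error (within-group) sum of squares
ss_error = sum(np.sum((g - g.mean()) ** 2) for g in groups.values())
ss_total = np.sum((all_obs - grand_mean) ** 2)

# Mean squares: divide each sum of squares by its degrees of freedom
df_treat = len(groups) - 1
df_error = len(all_obs) - len(groups)
ms_treat = ss_treat / df_treat
ms_error = ss_error / df_error
f_ratio = ms_treat / ms_error  # compare against an F(df_treat, df_error) critical value
print(f"SS Treat={ss_treat:.2f}  SS Error={ss_error:.2f}  F={f_ratio:.2f}")
```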

Sequential sum of squares vs. adjusted sum of squares

Minitab breaks down the SS Regression or Treatments component of variance into sums of squares for each factor.

·    Sequential sums of squares depend on the order the factors are entered into the model. The sequential sum of squares for a factor is the unique portion of SS Regression explained by that factor, given any previously entered factors.

For example, if you have a model with three factors, X1, X2, and X3, the sequential sum of squares for X2 shows how much of the remaining variation X2 explains, given that X1 is already in the model. To obtain a different sequence of factors, repeat the regression procedure entering the factors in a different order.

·    Adjusted sums of squares do not depend on the order the factors are entered into the model. The adjusted sum of squares for a factor is the unique portion of SS Regression explained by that factor, given all other factors in the model, regardless of entry order.

For example, if you have a model with three factors, X1, X2, and X3, the adjusted sum of squares for X2 shows how much of the remaining variation X2 explains, given that X1 and X3 are also in the model.
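
The distinction can be sketched with synthetic data (all numbers and coefficients below are invented for illustration): the sequential sum of squares for X2 is the drop in residual SS when X2 enters after X1, while the adjusted sum of squares is the drop when X2 enters a model that already contains both X1 and X3.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40
# Hypothetical correlated predictors; correlation is what makes order matter
x1 = rng.normal(size=n)
x2 = 0.6 * x1 + rng.normal(size=n)   # correlated with x1
x3 = 0.5 * x2 + rng.normal(size=n)   # correlated with x2
y = 2.0 + 1.5 * x1 + 1.0 * x2 + 0.5 * x3 + rng.normal(size=n)

def sse(*cols):
    """Residual sum of squares after regressing y on an intercept plus cols."""
    X = np.column_stack([np.ones(n), *cols])
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return resid @ resid

# Sequential SS for X2: reduction in SSE when X2 enters after X1
seq_ss_x2 = sse(x1) - sse(x1, x2)
# Adjusted SS for X2: reduction in SSE given X1 and X3 already in the model
adj_ss_x2 = sse(x1, x3) - sse(x1, x2, x3)
print(f"sequential SS(X2)={seq_ss_x2:.3f}  adjusted SS(X2)={adj_ss_x2:.3f}")
```

Because X2 is correlated with the other predictors, the two quantities generally differ; if the predictors were orthogonal, they would coincide.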