Basic formulas for the least squares method:

  1. The general form of simple linear regression is:
    \[
    y = \beta_0 + \beta_1 x + \epsilon
    \]
    where \( y \) is the dependent variable, \( x \) is the independent variable, \( \beta_0 \) is the intercept, \( \beta_1 \) is the slope, and \( \epsilon \) is the error term.
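As a minimal sketch, the model above can be simulated in Python; the parameter values and noise scale here are illustrative assumptions, not values from the text:

```python
import random

# Assumed illustrative parameters: intercept 2.0, slope 0.5
beta0, beta1 = 2.0, 0.5
random.seed(42)  # fixed seed for reproducibility

# y = beta0 + beta1*x + epsilon, with Gaussian noise as the error term
xs = [float(i) for i in range(10)]
ys = [beta0 + beta1 * x + random.gauss(0, 0.1) for x in xs]
```

Fitting a regression to data generated this way should recover coefficients close to the assumed values.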
  2. Regression coefficients:
  • Intercept (\( \beta_0 \)):
    \[
    \beta_0 = \bar{y} - \beta_1 \bar{x}
    \]
  • Slope (\( \beta_1 \)):
    \[
    \beta_1 = \frac{n \sum xy - (\sum x)(\sum y)}{n \sum x^2 - (\sum x)^2}
    \]
    where \( n \) is the number of observations, \( \sum xy \) is the sum of the products of \( x \) and \( y \), \( \sum x \) and \( \sum y \) are the sums of the values of \( x \) and \( y \), and \( \sum x^2 \) is the sum of the squared values of \( x \).
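Both coefficient formulas can be computed directly from the summary sums. A small Python sketch, using a made-up five-point dataset chosen so the arithmetic comes out cleanly:

```python
# Illustrative dataset (not from the text)
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)

sum_x, sum_y = sum(x), sum(y)
sum_xy = sum(xi * yi for xi, yi in zip(x, y))
sum_x2 = sum(xi ** 2 for xi in x)

# Slope from the closed-form formula
beta1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
# Intercept from the sample means
beta0 = sum_y / n - beta1 * (sum_x / n)
# beta1 = 0.6, beta0 ≈ 2.2
```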
  3. Sum of squared errors (SSE):
    \[
    SSE = \sum (y_i - \hat{y}_i)^2
    \]
    where \( y_i \) are the observed values and \( \hat{y}_i \) are the predicted values.
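A short sketch of the SSE computation, using the same illustrative dataset and treating the coefficients 2.2 and 0.6 as an assumed fitted line:

```python
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
beta0, beta1 = 2.2, 0.6  # assumed fitted coefficients, for illustration

# Predicted values, then squared residuals summed
y_hat = [beta0 + beta1 * xi for xi in x]
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
# sse ≈ 2.4
```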
  4. Total sum of squares (SST):
    \[
    SST = \sum (y_i - \bar{y})^2
    \]
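SST depends only on the observed values, not on the fit. On the same illustrative dataset:

```python
# Total variation of y around its mean
y = [2, 4, 5, 4, 5]
y_bar = sum(y) / len(y)
sst = sum((yi - y_bar) ** 2 for yi in y)
# sst = 6.0
```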
  5. Regression sum of squares (SSR):
    \[
    SSR = SST - SSE
    \]
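For a line fitted by least squares with an intercept, SST − SSE also equals the sum of squared deviations of the predictions from the mean. A sketch checking both forms on the same illustrative dataset (coefficients assumed to be the least-squares fit):

```python
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
beta0, beta1 = 2.2, 0.6  # assumed least-squares coefficients

y_bar = sum(y) / len(y)
y_hat = [beta0 + beta1 * xi for xi in x]

sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
sst = sum((yi - y_bar) ** 2 for yi in y)
ssr = sst - sse
# Direct form: squared deviations of predictions from the mean
ssr_direct = sum((yh - y_bar) ** 2 for yh in y_hat)
# ssr ≈ ssr_direct ≈ 3.6
```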
  6. Coefficient of determination (R²):
    \[
    R^2 = \frac{SSR}{SST}
    \]
    This coefficient shows what proportion of the variation in the dependent variable is explained by the independent variable.
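The formulas above can be chained into one short end-to-end sketch: fit the line from the summary sums, then compute R² from the sums of squares. The dataset is the same made-up one used for illustration:

```python
x = [1, 2, 3, 4, 5]
y = [2, 4, 5, 4, 5]
n = len(x)

# Fit by the closed-form least-squares formulas
beta1 = (n * sum(a * b for a, b in zip(x, y)) - sum(x) * sum(y)) / (
    n * sum(a * a for a in x) - sum(x) ** 2
)
beta0 = sum(y) / n - beta1 * sum(x) / n

y_bar = sum(y) / n
y_hat = [beta0 + beta1 * xi for xi in x]
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))
sst = sum((yi - y_bar) ** 2 for yi in y)

r2 = (sst - sse) / sst  # R^2 = SSR / SST
# r2 ≈ 0.6: about 60% of the variation in y is explained by x
```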

By Math