The approach to estimating the parameters of a regression equation is outlined in Exhibit 33.25. The intercept and coefficient estimates are based on the Least Squares Error (LSE) method, where the criterion is to minimize the sum of squared errors (SSE), i.e., the squared differences between actual and predicted values: $$ \min\sum_i (y_i-\hat y_i)^2 $$

$y_i$: observed value.

$\hat y_i$: predicted value.

In the case of simple linear regression:

$$ SSE = \sum_{i=1}^n (y_i - \hat y_i )^2 = \sum_{i=1}^n [y_i - (b_0 + b_1 x_i)]^2 $$

Differentiating SSE with respect to $b_0$ and $b_1$ and setting the derivatives to zero, we obtain the estimate for the *regression coefficient* (slope):

$$ b_1 = \frac{\sum_{i=1}^n (x_i - \bar x)(y_i - \bar y)}{\sum_{i=1}^n (x_i - \bar x)^2} $$

And $b_0$, the intercept:

$$ b_0 = \bar y - b_1 \bar x $$
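The intercept and slope estimates can be computed directly from these closed-form expressions. A minimal NumPy sketch (the data points are hypothetical toy values):

```python
import numpy as np

# Toy data (hypothetical): e.g., advertising spend (x) vs. sales (y)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Least-squares estimates: slope from deviation cross-products, then intercept
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# SSE for the fitted line
sse = np.sum((y - (b0 + b1 * x)) ** 2)
```

The same estimates fall out of any standard least-squares routine (e.g., `np.polyfit(x, y, 1)`), which is a convenient cross-check.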

As with simple regression, estimates for multiple linear regression are based on the LSE method. The parameters $b_1, b_2, b_3$, etc. are called *partial regression coefficients*; each reveals the importance of its respective predictor variable in driving the response variable.

*Note: The partial contribution of each x-variable (as measured by its b-coefficient)
may not agree in relative magnitude (or even sign) with the bivariate correlation between the
x-variable and y (the dependent variable).*
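This note can be demonstrated numerically. In the sketch below (simulated data with hypothetical parameters), $x_2$ is positively correlated with $y$ at the bivariate level, yet its partial regression coefficient is negative, because $x_2$ is strongly correlated with $x_1$:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Hypothetical predictors: x2 is strongly correlated with x1
x1 = rng.normal(size=n)
x2 = x1 + 0.3 * rng.normal(size=n)

# y rises with x1 but falls with x2 (holding x1 fixed)
y = 2.0 * x1 - 1.0 * x2 + 0.1 * rng.normal(size=n)

# Multiple regression via least squares (design matrix with intercept column)
X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

print("bivariate corr(x2, y):", np.corrcoef(x2, y)[0, 1])  # positive
print("partial coefficient b2:", b[2])                      # negative
```

The bivariate correlation is positive because $x_2$ tracks $x_1$, which strongly drives $y$; the partial coefficient isolates the effect of $x_2$ with $x_1$ held fixed, and that effect is negative.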

Regression coefficients vary with measurement scales. If we standardize (i.e., subtract the mean and divide by the standard deviation) $y$ as well as $x_1, x_2, x_3$, the resulting equation is scale-invariant:

$$ \hat y^{*} = \beta_1 x_1^{*} + \beta_2 x_2^{*} + \beta_3 x_3^{*}, \qquad \beta_j = b_j \frac{s_{x_j}}{s_y} $$

where the starred variables are standardized and the $\beta_j$ are the standardized (beta) coefficients.
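Scale invariance can be verified numerically. In this sketch (simulated data, hypothetical numbers), rescaling the predictor (say, dollars to thousands of dollars) multiplies the raw coefficient by 1,000 but leaves the standardized coefficient unchanged:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 300

# Hypothetical predictor, e.g., spend measured in dollars
x = rng.normal(50, 10, size=n)
y = 3.0 * x + rng.normal(0, 5, size=n)

def slope(x, y):
    """Simple-regression slope via the least-squares formula."""
    return np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)

b_dollars = slope(x, y)
b_thousands = slope(x / 1000, y)   # raw coefficient is 1000x larger

# Standardized (beta) coefficient: b * s_x / s_y -- identical on both scales
beta_dollars = b_dollars * x.std() / y.std()
beta_thousands = b_thousands * (x / 1000).std() / y.std()
```

Because the beta coefficients are scale-free, they can be compared across predictors measured in different units.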

