Chapter 2: Simple Linear Regression

1 The model

The simple linear regression model for $n$ observations can be written as

$$y_i = \beta_0 + \beta_1 x_i + e_i, \qquad i = 1, 2, \ldots, n. \tag{1}$$

The designation simple indicates that there is only one predictor variable $x$, and linear means that the model is linear in $\beta_0$ and $\beta_1$. The intercept $\beta_0$ and the slope $\beta_1$ are unknown constants, and both are called regression coefficients; the $e_i$'s are random errors. For model (1), we have the following assumptions:

1. $E(e_i) = 0$ for $i = 1, 2, \ldots, n$, or, equivalently, $E(y_i) = \beta_0 + \beta_1 x_i$.

2. $\mathrm{var}(e_i) = \sigma^2$ for $i = 1, 2, \ldots, n$, or, equivalently, $\mathrm{var}(y_i) = \sigma^2$.

3. $\mathrm{cov}(e_i, e_j) = 0$ for all $i \neq j$, or, equivalently, $\mathrm{cov}(y_i, y_j) = 0$.
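
To make these assumptions concrete, the following sketch simulates data from model (1). The sample size and the true values $\beta_0 = 2$, $\beta_1 = 0.5$, and $\sigma = 1$ are illustrative choices, not values from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative true parameters (assumed for this sketch, not from the text)
beta0, beta1, sigma, n = 2.0, 0.5, 1.0, 50

x = rng.uniform(0, 10, size=n)      # fixed predictor values x_i
e = rng.normal(0, sigma, size=n)    # errors: mean 0, variance sigma^2, uncorrelated
y = beta0 + beta1 * x + e           # model (1): y_i = beta_0 + beta_1 x_i + e_i
```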

2 Ordinary Least Squares Estimation

The method of least squares estimates $\beta_0$ and $\beta_1$ so that the sum of squared differences between the observations $y_i$ and the straight line is a minimum, i.e., it minimizes

$$S(\beta_0, \beta_1) = \sum_{i=1}^{n} (y_i - \beta_0 - \beta_1 x_i)^2.$$

[Figure 1: Equation of a straight line, $E(Y \mid X = x) = \beta_0 + \beta_1 x$, plotted against the predictor $X$; the intercept $\beta_0$ and the slope $\beta_1$ are annotated.]

The least-squares estimators of $\beta_0$ and $\beta_1$, say $\hat{\beta}_0$ and $\hat{\beta}_1$, must satisfy the equations obtained by setting the partial derivatives of $S(\beta_0, \beta_1)$ equal to zero:

$$-2 \sum_{i=1}^{n} (y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i) = 0 \tag{2}$$

$$-2 \sum_{i=1}^{n} (y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i)\, x_i = 0 \tag{3}$$

Simplifying these two equations yields

$$n\hat{\beta}_0 + \hat{\beta}_1 \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} y_i, \qquad \hat{\beta}_0 \sum_{i=1}^{n} x_i + \hat{\beta}_1 \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i y_i. \tag{4}$$
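
Viewed another way, equations (4) form a $2 \times 2$ linear system in $\hat{\beta}_0$ and $\hat{\beta}_1$. As a sketch (the function name is ours), the system can be assembled and solved directly:

```python
import numpy as np

def solve_normal_equations(x, y):
    """Solve the normal equations (4) for (beta0_hat, beta1_hat).

    x and y are NumPy arrays of equal length n.
    """
    n = len(x)
    A = np.array([[n,       x.sum()],
                  [x.sum(), (x**2).sum()]])
    b = np.array([y.sum(), (x * y).sum()])
    return np.linalg.solve(A, b)
```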

Equations (4) are called the least-squares normal equations. The solution to the normal equations is

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} x_i y_i - n\bar{x}\bar{y}}{\sum_{i=1}^{n} x_i^2 - n\bar{x}^2} = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2} = \frac{S_{xy}}{S_{xx}},$$

$$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}.$$
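
A direct transcription of these formulas might look like the sketch below; the function name `ols_fit` is ours, and only NumPy is assumed.

```python
import numpy as np

def ols_fit(x, y):
    """Closed-form least-squares estimates: beta1_hat = Sxy/Sxx, beta0_hat = ybar - beta1_hat*xbar."""
    xbar, ybar = x.mean(), y.mean()
    Sxy = np.sum((x - xbar) * (y - ybar))
    Sxx = np.sum((x - xbar) ** 2)
    beta1_hat = Sxy / Sxx
    beta0_hat = ybar - beta1_hat * xbar
    return beta0_hat, beta1_hat
```

This agrees with `solve_normal_equations` above up to floating-point rounding.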

The difference between the observed value $y_i$ and the corresponding fitted value $\hat{y}_i$ is a residual, i.e.,

$$e_i = y_i - \hat{y}_i = y_i - (\hat{\beta}_0 + \hat{\beta}_1 x_i), \qquad i = 1, 2, \ldots, n.$$
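
Continuing the sketch (reusing `x`, `y`, and `ols_fit` from above), the residuals can be computed and checked: equations (2) and (3) imply that the residuals sum to zero and are orthogonal to the predictor.

```python
beta0_hat, beta1_hat = ols_fit(x, y)
y_hat = beta0_hat + beta1_hat * x   # fitted values y_hat_i
resid = y - y_hat                   # residuals e_i = y_i - y_hat_i

# Consequences of equations (2) and (3): both sums vanish up to rounding.
print(resid.sum())           # sum of e_i, approximately 0
print((x * resid).sum())     # sum of x_i * e_i, approximately 0
```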
