EE263 Autumn 2007-08
Lecture 5 Least-squares
Stephen Boyd
• least-squares (approximate) solution of overdetermined equations
• projection and orthogonality principle
• least-squares estimation
• BLUE property
5-1
Overdetermined linear equations
consider y = Ax where A ∈ R^{m×n} is (strictly) skinny, i.e., m > n
? called overdetermined set of linear equations (more equations than unknowns)
? for most y, cannot solve for x
one approach to approximately solve y = Ax:

• define residual or error r = Ax − y
• find x = x_ls that minimizes ||r||

x_ls is called the least-squares (approximate) solution of y = Ax
Geometric interpretation
Ax_ls is the point in R(A) closest to y (Ax_ls is the projection of y onto R(A))
[figure: y, its projection Ax_ls onto the subspace R(A), and the residual r between them]
Least-squares (approximate) solution
• assume A is full rank, skinny
• to find x_ls, we'll minimize the norm of the residual squared,

||r||^2 = x^T A^T A x − 2 y^T A x + y^T y
• set gradient w.r.t. x to zero: ∇_x ||r||^2 = 2 A^T A x − 2 A^T y = 0
• this yields the normal equations: A^T A x = A^T y
• the assumptions imply A^T A is invertible, so we have

x_ls = (A^T A)^{-1} A^T y
. . . a very famous formula
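A quick numerical sketch of the formula (the random A and y here are illustrative, not from the lecture): solving the normal equations reproduces the solution returned by NumPy's built-in least-squares routine.

```python
import numpy as np

# Illustrative overdetermined system: A is skinny (m > n) and full rank
# with probability 1 for Gaussian entries.
rng = np.random.default_rng(0)
m, n = 20, 3
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)

# Least-squares solution via the normal equations: A^T A x = A^T y,
# i.e. x_ls = (A^T A)^{-1} A^T y.
x_ls = np.linalg.solve(A.T @ A, A.T @ y)

# Cross-check against NumPy's dedicated least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, y, rcond=None)
```

In practice one solves the normal equations (or uses QR, below) rather than explicitly forming the inverse of A^T A.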
• x_ls is a linear function of y
• x_ls = A^{-1} y if A is square
• x_ls solves y = A x_ls if y ∈ R(A)
• A† = (A^T A)^{-1} A^T is called the pseudo-inverse of A
• A† is a left inverse of (full rank, skinny) A:

A† A = (A^T A)^{-1} A^T A = I
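The left-inverse property can be checked numerically; this sketch (with an arbitrary full-rank skinny A) also confirms the formula agrees with NumPy's pseudo-inverse routine.

```python
import numpy as np

# Illustrative full-rank skinny matrix (shapes chosen arbitrarily).
rng = np.random.default_rng(1)
A = rng.standard_normal((10, 4))

# Pseudo-inverse formula from the slide: A† = (A^T A)^{-1} A^T.
A_pinv = np.linalg.inv(A.T @ A) @ A.T

# A† is a left inverse of A: A† A = I.
left_prod = A_pinv @ A
```

For full-rank skinny A, this formula coincides with `np.linalg.pinv(A)`.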
Projection on R(A)
Ax_ls is (by definition) the point in R(A) that is closest to y, i.e., it is the projection of y onto R(A):

Ax_ls = P_{R(A)}(y)

• the projection function P_{R(A)} is linear, and given by

P_{R(A)}(y) = Ax_ls = A (A^T A)^{-1} A^T y

• A (A^T A)^{-1} A^T is called the projection matrix (associated with R(A))
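The defining properties of the projection matrix can be verified directly; this sketch (with an arbitrary full-rank A) checks that P = A (A^T A)^{-1} A^T is symmetric, idempotent, and fixes vectors already in R(A).

```python
import numpy as np

# Illustrative full-rank skinny matrix.
rng = np.random.default_rng(2)
A = rng.standard_normal((8, 3))

# Projection matrix onto R(A): P = A (A^T A)^{-1} A^T.
P = A @ np.linalg.inv(A.T @ A) @ A.T

# A vector already in R(A) is left unchanged by P.
v = A @ rng.standard_normal(3)
Pv = P @ v
```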
Orthogonality principle
the optimal residual

r = Ax_ls − y = (A (A^T A)^{-1} A^T − I) y

is orthogonal to R(A):

⟨r, Az⟩ = y^T (A (A^T A)^{-1} A^T − I)^T A z = 0

for all z ∈ R^n
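The orthogonality principle is easy to confirm numerically: since ⟨r, Az⟩ = z^T A^T r, it suffices to check A^T r = 0 for the optimal residual. The data below are illustrative.

```python
import numpy as np

# Illustrative overdetermined system.
rng = np.random.default_rng(3)
A = rng.standard_normal((15, 4))
y = rng.standard_normal(15)

# Optimal residual r = A x_ls - y.
x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
r = A @ x_ls - y

# Orthogonality to R(A): A^T r = 0, hence <r, Az> = z^T (A^T r) = 0 for all z.
gram = A.T @ r
```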
Least-squares via QR factorization
• A ∈ R^{m×n} skinny, full rank
• factor as A = QR with Q^T Q = I_n, R ∈ R^{n×n} upper triangular, invertible
• the pseudo-inverse is

(A^T A)^{-1} A^T = (R^T Q^T Q R)^{-1} R^T Q^T = R^{-1} Q^T

so x_ls = R^{-1} Q^T y
• the projection onto R(A) is given by the matrix

A (A^T A)^{-1} A^T = A R^{-1} Q^T = Q Q^T
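The QR route can be sketched as follows (random data for illustration): NumPy's reduced QR gives Q with orthonormal columns and invertible upper-triangular R, so x_ls = R^{-1} Q^T y, and Q Q^T is the projection onto R(A).

```python
import numpy as np

# Illustrative skinny, full-rank system.
rng = np.random.default_rng(4)
A = rng.standard_normal((12, 3))
y = rng.standard_normal(12)

# Reduced QR factorization: Q is 12x3 with Q^T Q = I, R is 3x3 upper triangular.
Q, R = np.linalg.qr(A)

# x_ls = R^{-1} Q^T y (solve the triangular system rather than inverting R).
x_qr = np.linalg.solve(R, Q.T @ y)

# Reference solution and the projection matrix Q Q^T.
x_ref, *_ = np.linalg.lstsq(A, y, rcond=None)
P = Q @ Q.T
```

Solving via QR avoids forming A^T A, whose condition number is the square of that of A.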