Linear Least Squares Approximation
By Kristen Bauer, Renee Metzger, Holly Soper, Amanda Unklesbay

Linear Least Squares
- Is the line of best fit for a group of points.
- It seeks to minimize the sum, over all data points, of the squared differences between the function value and the data value.
- It is the earliest form of linear regression.

Gauss and Legendre
The method of least squares was first published by Legendre in 1805 and by Gauss in 1809. Although Legendre's work was published earlier, Gauss claimed he had been using the method since 1795. Both mathematicians applied the method to determine the orbits of bodies about the sun. Gauss went on to publish further developments of the method in 1821.

Example
Consider the points (1, 2.1), (2, 2.9), (5, 6.1), and (7, 8.3) with the best-fit line f(x) = 0.9x + 1.4. The squared errors are:

    x_i    f(x_i)    y_i    e_i = (f(x_i) - y_i)^2
    1      2.3       2.1    (2.3 - 2.1)^2 = 0.04
    2      3.2       2.9    (3.2 - 2.9)^2 = 0.09
    5      5.9       6.1    (5.9 - 6.1)^2 = 0.04
    7      7.7       8.3    (7.7 - 8.3)^2 = 0.36

So the total squared error is 0.04 + 0.09 + 0.04 + 0.36 = 0.53. By finding better coefficients for the best-fit line, we can make this error smaller.
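As a quick check, the total squared error above can be reproduced in Mathematica (the language used for the algorithm later in this presentation). This is a minimal sketch of our own; the helper name squaredError is not from the slides:

    (* Sum of squared differences between f(x_i) and y_i over {x, y} pairs. *)
    squaredError[data_, f_] := Total[(f[#[[1]]] - #[[2]])^2 & /@ data]

    squaredError[{{1, 2.1}, {2, 2.9}, {5, 6.1}, {7, 8.3}}, (0.9 # + 1.4 &)]
    (* -> 0.53 *)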

We want to minimize the vertical distance between the points and the line. For n data points,

    E = (d_1)^2 + (d_2)^2 + ... + (d_n)^2
      = [f(x_1) - y_1]^2 + [f(x_2) - y_2]^2 + ... + [f(x_n) - y_n]^2
      = (m x_1 + b - y_1)^2 + (m x_2 + b - y_2)^2 + ... + (m x_n + b - y_n)^2
      = \sum_{i=1}^{n} (m x_i + b - y_i)^2

E must be MINIMIZED!
How do we do this? In E = \sum (m x_i + b - y_i)^2, treat the x_i and y_i as constants, since we are trying to find m and b. So: partials! We set \partial E/\partial m = 0 and \partial E/\partial b = 0. But how do we know whether this will yield a maximum, a minimum, or a saddle point?

[Figure: surfaces illustrating a minimum point, a maximum point, and a saddle point]

Minimum!
Since the expression E is a sum of squares and is therefore nonnegative (i.e. it looks like an upward-opening paraboloid), we know the solution must be a minimum.

We can prove this by using the Second Partials Derivative Test.

2nd Partials Test
Suppose the gradient of f is zero at (x_0, y_0); an instance of this is \partial E/\partial m = \partial E/\partial b = 0. We set

    A = f_{xx}(x_0, y_0),    B = f_{xy}(x_0, y_0),    C = f_{yy}(x_0, y_0)

and form the discriminant D = AC - B^2. If D > 0, then f takes on
- a local minimum at (x_0, y_0) if A > 0;
- a local maximum at (x_0, y_0) if A < 0.

Calculating the Discriminant
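(The computation on this slide survives only as an image; the following is our reconstruction from E = \sum (m x_i + b - y_i)^2, using the second-partials definitions above.)

    A = \partial^2 E/\partial m^2 = 2 \sum x_i^2
    B = \partial^2 E/\partial m \partial b = 2 \sum x_i
    C = \partial^2 E/\partial b^2 = 2n

    D = AC - B^2 = 4n \sum x_i^2 - 4 (\sum x_i)^2 = 4 [n \sum x_i^2 - (\sum x_i)^2]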

If D > 0, then f takes on a local minimum at (x_0, y_0) if A > 0 and a local maximum at (x_0, y_0) if A < 0. We know D > 0, i.e. n \sum x_i^2 > (\sum x_i)^2 when the x_i do not all have the same value, by an inductive proof; those details are not covered in this presentation. We know A > 0, since A = 2 \sum x_i^2 is positive whenever some x_i is nonzero.

Therefore
Setting \partial E/\partial m and \partial E/\partial b equal to zero yields the two equations that minimize E, the sum of the squares of the errors. Thus the linear least squares algorithm (as presented) is valid, and we can continue.

E = \sum (m x_i + b - y_i)^2 is minimized (as just shown) when the partial derivatives with respect to each of the variables are zero, i.e. \partial E/\partial m = 0 and \partial E/\partial b = 0.

    \partial E/\partial b = \sum 2(m x_i + b - y_i) = 0          (set equal to 0)
        =>  m \sum x_i + b n = \sum y_i
        =>  m S_x + b n = S_y

    \partial E/\partial m = \sum 2 x_i (m x_i + b - y_i) = 2 \sum (m x_i^2 + b x_i - x_i y_i) = 0
        =>  m \sum x_i^2 + b \sum x_i = \sum x_i y_i
        =>  m S_{xx} + b S_x = S_{xy}

NOTE: \sum x_i = S_x,  \sum y_i = S_y,  \sum x_i^2 = S_{xx},  \sum x_i y_i = S_{xy}.
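The same normal equations can be checked symbolically in Mathematica. This is a sketch of our own (not from the slides), written out for three generic points so that Sum expands to explicit terms:

    (* E for three symbolic points; D[] differentiates, Solve[] recovers m and b. *)
    e = Sum[(m x[i] + b - y[i])^2, {i, 1, 3}];
    Solve[{D[e, m] == 0, D[e, b] == 0}, {m, b}] // Simplify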

Solving for m
Next we solve the system of equations for the unknowns m and b:

    m S_{xx} + b S_x = S_{xy}    (multiply by n)      =>  n m S_{xx} + b n S_x = n S_{xy}
    m S_x + b n = S_y            (multiply by S_x)    =>  m S_x S_x + b n S_x = S_y S_x

    Subtract:  n m S_{xx} - m S_x S_x = n S_{xy} - S_y S_x
    Factor m:  m (n S_{xx} - S_x S_x) = n S_{xy} - S_y S_x

    m = (n S_{xy} - S_x S_y) / (n S_{xx} - S_x^2)

Solving for b

    m S_{xx} + b S_x = S_{xy}    (multiply by S_x)    =>  m S_x S_{xx} + b S_x S_x = S_x S_{xy}
    m S_x + b n = S_y            (multiply by S_{xx}) =>  m S_x S_{xx} + b n S_{xx} = S_y S_{xx}

    Subtract:      b S_x S_x - b n S_{xx} = S_x S_{xy} - S_y S_{xx}
    Solve for b:   b (S_x S_x - n S_{xx}) = S_x S_{xy} - S_y S_{xx}

    b = (S_y S_{xx} - S_x S_{xy}) / (n S_{xx} - S_x^2)

Example
Find the linear least squares approximation to the data (1, 1), (2, 4), (3, 8), using the formulas just derived:

    S_x    = 1 + 2 + 3 = 6
    S_{xx} = 1^2 + 2^2 + 3^2 = 14
    S_y    = 1 + 4 + 8 = 13
    S_{xy} = 1(1) + 2(4) + 3(8) = 33
    n      = number of points = 3

    m = (3·33 - 6·13) / (3·14 - 6^2) = (99 - 78) / (42 - 36) = 21/6 = 3.5
    b = (13·14 - 6·33) / (3·14 - 6^2) = (182 - 198) / 6 = -16/6 ≈ -2.667

The line of best fit is y = 3.5x - 2.667.
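This example can be verified against Mathematica's built-in Fit function, which performs a least-squares fit over the given basis (here {1, x}):

    Fit[{{1, 1}, {2, 4}, {3, 8}}, {1, x}, x]
    (* -> -2.66667 + 3.5 x *)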

Line of best fit: y = 3.5x - 2.667
[Figure: plot of the three data points and the fitted line]

THE ALGORITHM in Mathematica
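(The code on the following slides is preserved only as images, so the listing below is not the authors' exact code. It is a minimal sketch of the algorithm using the S_x, S_y, S_{xx}, S_{xy} formulas derived above; the function name leastSquaresLine is our own.)

    (* Least squares line through a list of {x, y} pairs, returned as {m, b}. *)
    leastSquaresLine[data_] := Module[{n, sx, sy, sxx, sxy},
      n   = Length[data];
      sx  = Total[data[[All, 1]]];                 (* S_x  = sum of x_i     *)
      sy  = Total[data[[All, 2]]];                 (* S_y  = sum of y_i     *)
      sxx = Total[data[[All, 1]]^2];               (* S_xx = sum of x_i^2   *)
      sxy = Total[data[[All, 1]] data[[All, 2]]];  (* S_xy = sum of x_i y_i *)
      {(n sxy - sx sy)/(n sxx - sx^2),             (* slope m               *)
       (sy sxx - sx sxy)/(n sxx - sx^2)}]          (* intercept b           *)

Running leastSquaresLine[{{1, 1}, {2, 4}, {3, 8}}] gives {7/2, -8/3}, matching the example above.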

Activity
For this activity we are going to use the linear least squares approximation in a real-life situation. You will be given a box score from either a baseball or softball game. From the box score, write out one point per player, with the x-coordinate being the number of at-bats that player had in the game and the y-coordinate being the number of hits that player had in the game. After doing that, use the linear least squares approximation to find the best-fitting line. The slope of the best-fitting line you find approximates the team's batting average (hits per at-bat) for that game.
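As an illustration with made-up numbers (not from a real box score), here are nine hypothetical players' {at-bats, hits} points fed to the leastSquaresLine sketch above:

    points = {{4, 1}, {4, 2}, {3, 1}, {5, 2}, {3, 0}, {4, 1}, {2, 1}, {5, 1}, {3, 1}};
    N[First[leastSquaresLine[points]]]
    (* -> 0.291667, the slope, read as the team's batting average for the game *)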

In Conclusion
E = \sum (m x_i + b - y_i)^2 is the sum of the squared errors between the set of data points (x_1, y_1), ..., (x_i, y_i), ..., (x_n, y_n) and the line approximating the data, f(x) = mx + b. By minimizing this error with calculus methods, we get the equations for m and b that yield the least squared error:

    m = (n S_{xy} - S_x S_y) / (n S_{xx} - S_x^2)
    b = (S_y S_{xx} - S_x S_{xy}) / (n S_{xx} - S_x^2)

Advantages
Many common methods of approximating data seek to minimize some measure of difference between the approximating function and the given data points. Advantages of using the squares of the differences at each point, rather than the plain differences, absolute values of differences, or other measures of error, include:
- Positive differences do not cancel negative differences.
- Differentiation is straightforward, since the squared error is differentiable everywhere (unlike the absolute value), so the minimum can be found with calculus.
