The Elements of Statistical Learning — Chapter 5 Exercise Solutions


The Elements of Statistical Learning — Chapter 5
oxstar@SJTU, January 6, 2011

Ex. 5.9  Derive the Reinsch form $S_\lambda = (I + \lambda K)^{-1}$ for the smoothing spline.

Answer  Let $K = (N^T)^{-1} \Omega_N N^{-1}$, so that $K$ does not depend on $\lambda$ and $\Omega_N = N^T K N$. Then

$$
\begin{aligned}
S_\lambda &= N(N^T N + \lambda \Omega_N)^{-1} N^T \\
          &= N(N^T N + \lambda N^T K N)^{-1} N^T \\
          &= N\,[N^T (I + \lambda K) N]^{-1} N^T \\
          &= N N^{-1} (I + \lambda K)^{-1} (N^T)^{-1} N^T \\
          &= (I + \lambda K)^{-1}.
\end{aligned}
$$

We can also use the singular value decomposition of $N$, i.e. $N = U D V^T$, where $U$ and $V$ are orthogonal matrices and $D$ is a diagonal matrix. Then

$$
\begin{aligned}
S_\lambda &= N(N^T N + \lambda \Omega_N)^{-1} N^T \\
          &= U D V^T (V D U^T U D V^T + \lambda \Omega_N)^{-1} V D U^T \\
          &= \bigl( U D^{-1} V^T (V D^2 V^T) V D^{-1} U^T + \lambda\, U D^{-1} V^T \Omega_N V D^{-1} U^T \bigr)^{-1} \\
          &= (I + \lambda K)^{-1},
\end{aligned}
$$

where $K = U D^{-1} V^T \Omega_N V D^{-1} U^T$.

Ex. 5.15  This exercise derives some of the results quoted in Section 5.8.1. Suppose $K(x, y)$ satisfies the conditions (5.45), and let $f(x) \in \mathcal{H}_K$. Show that

1. $\langle K(\cdot, x_i), f \rangle_{\mathcal{H}_K} = f(x_i)$.
2. $\langle K(\cdot, x_i), K(\cdot, x_j) \rangle_{\mathcal{H}_K} = K(x_i, x_j)$.
3. If $g(x) = \sum_{i=1}^N \alpha_i K(x, x_i)$, then
   $$J(g) = \sum_{i=1}^N \sum_{j=1}^N K(x_i, x_j)\,\alpha_i \alpha_j.$$

Suppose that $\tilde{g}(x) = g(x) + \rho(x)$, with $\rho(x) \in \mathcal{H}_K$ orthogonal in $\mathcal{H}_K$ to each of $K(x, x_i)$, $i = 1, \dots, N$. Show that

4. $$\sum_{i=1}^N L(y_i, \tilde{g}(x_i)) + \lambda J(\tilde{g}) \ge \sum_{i=1}^N L(y_i, g(x_i)) + \lambda J(g),$$
   with equality iff $\rho(x) = 0$.

Proof  Expand $K(x, y) = \sum_{k=1}^\infty \gamma_k \phi_k(x) \phi_k(y)$ and $f = \sum_{k=1}^\infty c_k \phi_k$ as in (5.45), so that $K(\cdot, x_i)$ has expansion coefficients $\gamma_k \phi_k(x_i)$, and $\bigl\langle \sum_k a_k \phi_k, \sum_k b_k \phi_k \bigr\rangle_{\mathcal{H}_K} = \sum_k a_k b_k / \gamma_k$.

1.
$$
\langle K(\cdot, x_i), f \rangle_{\mathcal{H}_K}
= \sum_{k=1}^\infty \frac{[\gamma_k \phi_k(x_i)]\, c_k}{\gamma_k}
= \sum_{k=1}^\infty c_k \phi_k(x_i) = f(x_i). \tag{1}
$$

2.
$$
\langle K(\cdot, x_i), K(\cdot, x_j) \rangle_{\mathcal{H}_K}
= \sum_{k=1}^\infty \frac{[\gamma_k \phi_k(x_i)][\gamma_k \phi_k(x_j)]}{\gamma_k}
= \sum_{k=1}^\infty \gamma_k \phi_k(x_i) \phi_k(x_j) = K(x_i, x_j). \tag{2}
$$

3. By (2),
$$
J(g) = \langle g, g \rangle_{\mathcal{H}_K}
= \sum_{i=1}^N \sum_{j=1}^N \alpha_i \alpha_j \langle K(\cdot, x_i), K(\cdot, x_j) \rangle_{\mathcal{H}_K}
= \sum_{i=1}^N \sum_{j=1}^N K(x_i, x_j)\,\alpha_i \alpha_j.
$$

4. Because $\rho(x)$ is orthogonal in $\mathcal{H}_K$ to each $K(\cdot, x_i)$, we have $\langle K(\cdot, x_i), \rho \rangle_{\mathcal{H}_K} = 0$, and by (1)
$$
\tilde{g}(x_i) = \langle K(\cdot, x_i), \tilde{g} \rangle_{\mathcal{H}_K}
= \langle K(\cdot, x_i), g \rangle_{\mathcal{H}_K} + \langle K(\cdot, x_i), \rho \rangle_{\mathcal{H}_K}
= g(x_i).
$$
Moreover,
$$
J(\tilde{g}) = \langle \tilde{g}, \tilde{g} \rangle_{\mathcal{H}_K}
= J(g) + 2 \sum_{i=1}^N \alpha_i \langle K(\cdot, x_i), \rho \rangle_{\mathcal{H}_K} + J(\rho)
= J(g) + J(\rho).
$$
Hence
$$
\sum_{i=1}^N L(y_i, \tilde{g}(x_i)) + \lambda J(\tilde{g})
= \sum_{i=1}^N L(y_i, g(x_i)) + \lambda\,\bigl(J(g) + J(\rho)\bigr)
\ge \sum_{i=1}^N L(y_i, g(x_i)) + \lambda J(g),
$$
with equality iff $J(\rho) = 0$, i.e. iff $\rho(x) = 0$.

Ex. 5.16  Consider the ridge regression problem (5.53), and assume $M \ge N$. Assume you have a kernel $K$ that computes the inner product $K(x, y) = \sum_{m=1}^M h_m(x) h_m(y)$.

1. Derive (5.62) on page 171 in the text. How would you compute the matrices $V$ and $D_\gamma$, given $K$? Hence show that (5.63) is equivalent to (5.53).
2. Show that $\hat{f} = H\hat{\beta} = K(K + \lambda I)^{-1} y$, where $H$ is the $N \times M$ matrix of evaluations $h_m(x_i)$, and $K = HH^T$ the $N \times N$ matrix of inner products $h(x_i)^T h(x_j)$.
3. Show that $\hat{f}(x) = h(x)^T \hat{\beta} = \sum_{i=1}^N K(x, x_i)\,\hat{\alpha}_i$ with $\hat{\alpha} = (K + \lambda I)^{-1} y$.
4. How would you modify your solution if $M < N$?

Answer

1. By the definition of $K(x, y)$ we have
$$
K(x, y) = \sum_{m=1}^M h_m(x) h_m(y) = \sum_{i=1}^\infty \gamma_i \phi_i(x) \phi_i(y).
$$
Multiply both sides by $\phi_k(x)$ and integrate w.r.t. $x$, i.e. compute $\langle K(x, y), \phi_k(x) \rangle$:
$$
\sum_{m=1}^M \Bigl(\int h_m(x) \phi_k(x)\,dx\Bigr) h_m(y)
= \sum_{i=1}^\infty \gamma_i \Bigl(\int \phi_i(x) \phi_k(x)\,dx\Bigr) \phi_i(y). \tag{3}
$$
By the orthonormality of the eigenfunctions $\phi_i(x)$,
$$
\int \phi_i(x) \phi_k(x)\,dx = \langle \phi_i, \phi_k \rangle =
\begin{cases} 1, & i = k \\ 0, & i \ne k \end{cases}
$$
so (3) simplifies to
$$
\sum_{m=1}^M \Bigl(\int h_m(x) \phi_k(x)\,dx\Bigr) h_m(y) = \gamma_k \phi_k(y).
$$
Let $g_{km} = \int h_m(x) \phi_k(x)\,dx$, so that
$$
\sum_{m=1}^M g_{km} h_m(y) = \gamma_k \phi_k(y). \tag{4}
$$
Taking $\langle \cdot, \phi_\ell(y) \rangle$ of both sides,
$$
\sum_{m=1}^M g_{km} \Bigl(\int h_m(y) \phi_\ell(y)\,dy\Bigr) = \gamma_k \int \phi_k(y) \phi_\ell(y)\,dy,
\qquad \text{i.e.} \qquad
\sum_{m=1}^M g_{km} g_{\ell m} = \gamma_k \delta_{k\ell},
$$
where $g_{\ell m} = \int h_m(y) \phi_\ell(y)\,dy$ and $\delta_{k\ell} = 1$ if $\ell = k$, $0$ otherwise. Let $G_M = \{g_{km}\}$, $k, m \le M$, whose rows are linearly independent; then
$$
G_M G_M^T = \mathrm{diag}(\gamma_1, \gamma_2, \dots, \gamma_M) = D_\gamma.
$$
Let $V^T = D_\gamma^{-1/2} G_M$. Then
$$
V V^T G_M^T = G_M^T D_\gamma^{-1} G_M G_M^T = G_M^T D_\gamma^{-1} D_\gamma = G_M^T,
$$
and since $G_M^T$ is invertible, $V V^T = I$: $V^T$ is an orthogonal matrix.

Let $h(x) = (h_1(x), \dots, h_M(x))^T$ and $\phi(x) = (\phi_1(x), \dots, \phi_M(x))^T$. Equation (4) reads $G_M h(x) = D_\gamma \phi(x)$; multiplying both sides by $V D_\gamma^{-1/2}$ gives
$$
V D_\gamma^{-1/2} G_M h(x) = V D_\gamma^{-1/2} D_\gamma \phi(x)
\qquad \Longrightarrow \qquad
h(x) = V D_\gamma^{1/2} \phi(x),
$$
since $V D_\gamma^{-1/2} G_M = V V^T = I$. This is (5.62). Consequently,
$$
\begin{aligned}
&\min_{\{\beta_m\}_1^M} \sum_{i=1}^N \Bigl(y_i - \sum_{m=1}^M \beta_m h_m(x_i)\Bigr)^2 + \lambda \sum_{m=1}^M \beta_m^2
&& \text{(5.63)} \\
&= \min_{\beta} \sum_{i=1}^N \bigl(y_i - \beta^T h(x_i)\bigr)^2 + \lambda\,\beta^T \beta
&& \beta = (\beta_1, \dots, \beta_M)^T \quad (5) \\
&= \min_{\beta} \sum_{i=1}^N \bigl(y_i - \beta^T V D_\gamma^{1/2} \phi(x_i)\bigr)^2 + \lambda\,\beta^T \beta \\
&= \min_{c} \sum_{i=1}^N \bigl(y_i - c^T \phi(x_i)\bigr)^2 + \lambda\,(V D_\gamma^{-1/2} c)^T V D_\gamma^{-1/2} c
&& c = D_\gamma^{1/2} V^T \beta \\
&= \min_{c} \sum_{i=1}^N \bigl(y_i - c^T \phi(x_i)\bigr)^2 + \lambda\, c^T D_\gamma^{-1} c \\
&= \min_{\{c_j\}} \sum_{i=1}^N \Bigl(y_i - \sum_j c_j \phi_j(x_i)\Bigr)^2 + \lambda \sum_j \frac{c_j^2}{\gamma_j},
&& \text{(5.53)}
\end{aligned}
$$
so (5.63) is equivalent to (5.53).

2. Setting the derivative of (5) w.r.t. $\beta$ to zero gives
$$
-H^T (y - H\hat{\beta}) + \lambda \hat{\beta} = 0
\qquad \Longrightarrow \qquad
\hat{\beta} = (H^T H + \lambda I)^{-1} H^T y.
$$
Since $M \ge N$, $K = HH^T$ is invertible (for $H$ of full row rank), and $K + \lambda I = HH^T + \lambda I$ is invertible for any $\lambda \ge 0$. Using the identity $(H^T H + \lambda I)^{-1} H^T = H^T (HH^T + \lambda I)^{-1}$, which follows from $H^T (HH^T + \lambda I) = (H^T H + \lambda I) H^T$, we get
$$
\hat{f} = H\hat{\beta} = H H^T (HH^T + \lambda I)^{-1} y = K (K + \lambda I)^{-1} y.
$$

3. From part 2, $\hat{\beta} = H^T (K + \lambda I)^{-1} y = H^T \hat{\alpha}$ with $\hat{\alpha} = (K + \lambda I)^{-1} y = (\hat{\alpha}_1, \dots, \hat{\alpha}_N)^T$, so $\hat{f} = H\hat{\beta} = K\hat{\alpha}$ and
$$
\hat{f}(x) = h(x)^T \hat{\beta} = h(x)^T H^T \hat{\alpha}
= \sum_{i=1}^N h(x_i)^T h(x)\,\hat{\alpha}_i
= \sum_{i=1}^N K(x, x_i)\,\hat{\alpha}_i.
$$

4. If $M < N$, then $K = HH^T$ has rank at most $M < N$ and is not invertible. If $\lambda \ne 0$, $K + \lambda I$ is still invertible, so the solutions above still hold. But if $\lambda = 0$, we get
$$
\hat{f} = H\hat{\beta} = H (H^T H)^{-1} H^T y,
$$
the projection of $y$ onto the column space of $H$, which in general no longer equals $y$.
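The identities in Ex. 5.16 are easy to check numerically. The sketch below (NumPy; the random basis matrix `H`, response `y`, and the value `lam = 0.7` are arbitrary choices for illustration, not from the text) verifies that the primal ridge fit $H\hat{\beta}$ equals the kernel form $K(K + \lambda I)^{-1} y$, and that the prediction at a new point computed from $h(x)^T \hat{\beta}$ matches $\sum_i K(x, x_i)\,\hat{\alpha}_i$.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, lam = 5, 8, 0.7            # M >= N, as assumed in parts 1-3
H = rng.standard_normal((N, M))  # H[i, m] plays the role of h_m(x_i)
y = rng.standard_normal(N)

# Primal ridge solution: beta_hat = (H^T H + lam*I)^{-1} H^T y
beta_hat = np.linalg.solve(H.T @ H + lam * np.eye(M), H.T @ y)

# Kernel (dual) form: K = H H^T, alpha_hat = (K + lam*I)^{-1} y
K = H @ H.T
alpha_hat = np.linalg.solve(K + lam * np.eye(N), y)

# Part 2: the fitted values agree, H beta_hat == K alpha_hat
assert np.allclose(H @ beta_hat, K @ alpha_hat)

# Part 3: predictions at a new point x agree,
# h(x)^T beta_hat == sum_i K(x, x_i) alpha_hat_i
h_x = rng.standard_normal(M)  # stands in for h(x) at a new point x
k_x = H @ h_x                 # k_x[i] = h(x_i)^T h(x) = K(x, x_i)
assert np.allclose(h_x @ beta_hat, k_x @ alpha_hat)
print("identities verified")
```

Note that the kernel form only solves an $N \times N$ system, which is the practical point of the exercise when $M \gg N$.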
