Chapter 6: The LMS Algorithm


Contents
1. Derivation of the LMS Algorithm
2. Convergence of the Weight Vector
3. An Example of Convergence
4. Learning Curve
5. Noise in the Weight-Vector Solution
6. Misadjustment
7. Performance

1. Derivation of the LMS Algorithm

For the adaptive linear combiner, in either of its two basic structural forms, the output $y_k$ is a linear combination of the input samples, and the error is

$$\varepsilon_k = d_k - X_k^T W_k.$$

(Figure 6.1 shows the adaptive linear combiner: (a) in general form; (b) as a transversal filter.)

The LMS algorithm is a steepest-descent method; what distinguishes it is the way the gradient is estimated. The true mean-square error is

$$\xi = E[\varepsilon_k^2] = E[d_k^2] + W^T R W - 2 P^T W,$$

whereas LMS takes the instantaneous squared error itself as the estimate, $\hat{\xi} = \varepsilon_k^2$. The true gradient is

$$\nabla \triangleq \frac{\partial \xi}{\partial W} = \left[ \frac{\partial \xi}{\partial w_0} \;\; \frac{\partial \xi}{\partial w_1} \;\; \cdots \;\; \frac{\partial \xi}{\partial w_L} \right]^T = 2RW - 2P,$$

and the LMS gradient estimate is

$$\hat{\nabla}_k = \left[ \frac{\partial \varepsilon_k^2}{\partial w_0} \;\; \frac{\partial \varepsilon_k^2}{\partial w_1} \;\; \cdots \;\; \frac{\partial \varepsilon_k^2}{\partial w_L} \right]^T = 2\varepsilon_k \left[ \frac{\partial \varepsilon_k}{\partial w_0} \;\; \cdots \;\; \frac{\partial \varepsilon_k}{\partial w_L} \right]^T = -2\varepsilon_k X_k.$$

With this simple estimate of the gradient, we can now specify a steepest-descent type of adaptive algorithm, the LMS algorithm:

$$W_{k+1} = W_k - \mu \hat{\nabla}_k = W_k + 2\mu \varepsilon_k X_k.$$

From its form we can see that the LMS algorithm is both simple and efficient.

2. Convergence of the Weight Vector

Is it reasonable to replace the mean value with a single instantaneous estimate? To examine LMS convergence, note first that the gradient estimate $\hat{\nabla}_k = -2\varepsilon_k X_k$ is unbiased when the weight vector is held constant:

$$E[\hat{\nabla}_k] = -2E[\varepsilon_k X_k] = -2\left( E[d_k X_k] - E[X_k X_k^T] W \right) = -2(P - RW) = \nabla.$$

As the number of samples averaged at each iteration grows, the gradient estimate approaches the ideal gradient value at each iteration. For the convergence of the weight vector, we can therefore consider two cases:

1) Estimate a gradient for each data sample, holding the weight vector fixed, and adjust the weights only after collecting many such estimates at each iteration. Since the gradient estimate is unbiased, the averaged estimate is close to the ideal gradient value, and the corresponding algorithm is convergent.

2) Estimate a gradient from the sample data at each iteration and adjust the weights immediately. The proof proceeds as follows: after a sufficient number of iterations, taking the expectation of the LMS update $W_{k+1} = W_k + 2\mu\varepsilon_k X_k$ and writing $W^* = R^{-1}P$,

$$E[W_{k+1}] = E[W_k] + 2\mu E[\varepsilon_k X_k] = E[W_k] + 2\mu\left( E[d_k X_k] - E[X_k X_k^T W_k] \right) = E[W_k] + 2\mu\left( P - R\,E[W_k] \right) = (I - 2\mu R)\,E[W_k] + 2\mu R W^*.$$

This is the same recursion as in the steepest-descent method. Translating coordinates with $V_k = W_k - W^*$ gives

$$E[V_{k+1}] = (I - 2\mu R)\,E[V_k].$$

Rotating into the principal-axis coordinate system with $V_k' = Q^{-1} V_k$, where $Q$ diagonalizes $R$, gives $E[V_{k+1}'] = (I - 2\mu\Lambda)\,E[V_k']$, and hence

$$E[V_k'] = (I - 2\mu\Lambda)^k V_0',$$

where $V'$ is the weight vector $W$ expressed in the principal-axis system, $\Lambda$ is the diagonal eigenvalue matrix of $R$, and $V_0'$ is the initial weight vector in the principal-axis system.

Such convergence is guaranteed only if

$$0 < \mu < \frac{1}{\lambda_{\max}},$$

where $\lambda_{\max}$ is the largest eigenvalue of $R$, i.e., the largest diagonal element of $\Lambda$. Since $\lambda_{\max} \le \mathrm{tr}[R]$, a more practical bound is, in general,

$$0 < \mu < \frac{1}{\mathrm{tr}[R]},$$

and for a transversal filter, where $\mathrm{tr}[R] = (L+1)\times(\text{signal power})$,

$$0 < \mu < \frac{1}{(L+1)(\text{signal power})}.$$

Within these bounds, the LMS weight vector converges in the mean after a sufficient number of iterations.

3. An Example of Convergence

Given the elements of the input correlation matrix $R$ and the cross-correlation vector $P$ (tabulated in the original slides), the optimum weight vector follows from

$$R W^* = P.$$

For the two-weight example with $N = 16$ and $\varphi = 0.01$, the optimum weight values are $w_0^* = 3.784$ and $w_1^* = -4.178$.

Plotted together with the contours of constant $\xi$, the figure shows two weight-value tracks over the iterations. Because of the noisy gradient estimate at each iteration, the weight tracks are seen to be erratic; they do not always proceed in the direction of the true gradient.

If the LMS algorithm were made to continue, either track would wander around the vicinity of $\xi_{\min}$, giving a noisy weight-vector solution. Finally, note that the values of $\mu$ in the figure (0.05 and 0.10) are well below the upper bound; with the LMS algorithm it is typical to use values of $\mu$ on the order of a tenth of the upper bound.
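To make the example concrete, here is a minimal NumPy sketch of the LMS recursion above. The signal model is an assumption reconstructed from the stated values: input $x_k = \sin(2\pi k/16)$ plus white noise of power $\varphi = 0.01$, desired response $d_k = 2\cos(2\pi k/16)$, and two weights on the current and delayed input. With these choices, $R$ and $P$ reproduce the quoted optimum weights, so the recursion should settle near $w_0^* = 3.784$, $w_1^* = -4.178$.

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed signal model, reconstructed from the stated example values:
    # x_k = sin(2*pi*k/N) plus white noise of power phi; d_k = 2*cos(2*pi*k/N).
    N, phi = 16, 0.01
    n_iter = 5000
    k_idx = np.arange(n_iter + 1)
    x = np.sin(2 * np.pi * k_idx / N) + np.sqrt(phi) * rng.standard_normal(n_iter + 1)
    d = 2 * np.cos(2 * np.pi * k_idx / N)

    # Transversal-filter bound: 0 < mu < 1/((L+1)*(signal power)), with L+1 = 2.
    signal_power = np.mean(x ** 2)            # about 0.5 + phi
    mu = 0.1 / (2 * signal_power)             # a tenth of the upper bound (~0.10)

    W = np.zeros(2)                           # start from W_0 = 0
    for k in range(1, n_iter + 1):
        X_k = np.array([x[k], x[k - 1]])      # current and one-sample-delayed input
        eps = d[k] - X_k @ W                  # eps_k = d_k - X_k^T W_k
        W = W + 2 * mu * eps * X_k            # W_{k+1} = W_k + 2*mu*eps_k*X_k

    print(W)  # wanders near W* = (3.784, -4.178): a noisy weight-vector solution

Note the step size: it is set to a tenth of the transversal-filter upper bound, matching the $\mu = 0.10$ track in the figure, so the final weights fluctuate visibly about the optimum rather than settling exactly on it.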
4. Learning Curve

The learning-curve behavior follows from the learning-curve analysis of the steepest-descent method. In the principal-axis system, the standard results (for small $\mu\lambda_p$) are:

- the geometric ratio of the $p$th weight mode: $r_p = 1 - 2\mu\lambda_p$;
- the time constant of the weight: $\tau_p \approx \dfrac{1}{2\mu\lambda_p}$;
- the time constant of the learning curve: $\tau_{p,\mathrm{mse}} = \dfrac{\tau_p}{2} \approx \dfrac{1}{4\mu\lambda_p}$,

where the time constants of the adaptive process are measured in numbers of iterations.

5. Noise in the Weight-Vector Solution

Let us define $N_k$ as a vector of noise in the gradient estimate at the $k$th iteration:

$$\hat{\nabla}_k = \nabla_k + N_k.$$

If we assume that the LMS process has converged to a steady-state weight-vector solution near $W^*$ with a small value of the adaptive gain constant $\mu$, then $\nabla_k \approx 0$ and

$$N_k = \hat{\nabla}_k = -2\varepsilon_k X_k.$$

Treating $\varepsilon_k$ and $X_k$ as approximately independent near the minimum, the covariance of the noise is

$$\mathrm{cov}[N_k] = E[N_k N_k^T] = 4E[\varepsilon_k^2 X_k X_k^T] \approx 4\xi_{\min} R,$$

which in the principal-axis coordinate system becomes

$$\mathrm{cov}[N_k'] = 4\xi_{\min} \Lambda.$$

Propagating this gradient noise through the weight update gives the weight-vector covariance in the principal-axis coordinate system,

$$\mathrm{cov}[V_k'] = \mu \xi_{\min} I,$$

and since this is proportional to the identity, it is the same in the translated coordinate system:

$$\mathrm{cov}[V_k] = \mu \xi_{\min} I.$$

6. Misadjustment

The misadjustment in an adaptive process is defined as the ratio of the excess mean-square error to the minimum mean-square error, and is a measure of the "cost of adaptability":

$$M = \frac{\text{excess MSE}}{\xi_{\min}};$$

for the LMS algorithm the standard small-$\mu$ result is $M = \mu\,\mathrm{tr}[R]$. In order to trade misadjustment off against adaptive speed, it is necessary to derive the relationship between the misadjustment and the time constants of adaptation.
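As a rough simulation check of the misadjustment formula, here is a sketch under the same assumed signal model as the earlier convergence example (the model, the step size $\mu = 0.05$, and the run lengths are illustrative choices, not values from the slides). It measures the steady-state excess mean-square error $E[V_k^T R V_k]$ and compares its ratio to $\xi_{\min}$ with the small-$\mu$ prediction $M = \mu\,\mathrm{tr}[R]$.

    import numpy as np

    rng = np.random.default_rng(1)

    # Same assumed signal model as in the convergence sketch above.
    N, phi = 16, 0.01
    n_iter, n_tail = 50000, 30000
    k_idx = np.arange(n_iter + 1)
    x = np.sin(2 * np.pi * k_idx / N) + np.sqrt(phi) * rng.standard_normal(n_iter + 1)
    d = 2 * np.cos(2 * np.pi * k_idx / N)

    # Analytic R, P, W*, xi_min for this model (two weights on x_k, x_{k-1}).
    c = np.cos(2 * np.pi / N)
    R = 0.5 * np.array([[1.0, c], [c, 1.0]]) + phi * np.eye(2)
    P = np.array([0.0, -np.sin(2 * np.pi / N)])
    W_star = np.linalg.solve(R, P)            # about (3.784, -4.178)
    xi_min = np.mean(d ** 2) - P @ W_star     # xi_min = E[d^2] - P^T W*

    mu = 0.05                                 # well below 1/tr[R] ~ 0.98
    M_pred = mu * np.trace(R)                 # predicted misadjustment, mu*tr[R]

    W = np.zeros(2)
    excess = np.empty(n_iter)
    for k in range(1, n_iter + 1):
        X_k = np.array([x[k], x[k - 1]])
        eps = d[k] - X_k @ W
        W = W + 2 * mu * eps * X_k
        V = W - W_star
        excess[k - 1] = V @ R @ V             # instantaneous excess MSE, V^T R V

    # Average over the tail, after the transient has died out. Only ballpark
    # agreement is expected: the small-mu analysis treats eps_k and X_k as
    # independent, which this sinusoid-plus-noise model satisfies only roughly.
    M_meas = excess[-n_tail:].mean() / xi_min
    print(f"measured M = {M_meas:.3f}, predicted mu*tr[R] = {M_pred:.3f}")

Measuring the excess MSE through the weight error $V_k$ rather than through $\varepsilon_k^2$ avoids burying the small excess under the $\xi_{\min}$ noise floor, which would otherwise require far longer averaging.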
