Learning with AdaBoost


Xin Li, Tuid 910876215
Temple University, Fall 2007

Outline
• Introduction and background of Boosting and AdaBoost
• AdaBoost algorithm example
• AdaBoost algorithm in the current project
• Experiment results
• Discussion and conclusion

Boosting Algorithm
Definition of Boosting [1]: Boosting refers to a general method of producing a very accurate prediction rule by combining rough and moderately inaccurate rules of thumb.
Boosting procedure [2]: Given a set of N labeled training examples (x_1, y_1), …, (x_N, y_N), where y_i is the label associated with instance x_i, on each round t = 1, …, T:
• the booster devises a distribution (importance) D_t over the example set;
• the booster requests a weak hypothesis (rule of thumb) h_t with low error ε_t.
After T rounds, the booster combines the weak hypotheses into a single prediction rule.

Boosting Algorithm (cont'd)
The intuitive idea: alter the distribution over the domain in a way that increases the probability of the "harder" parts of the space, thus forcing the weak learner to generate new hypotheses that make fewer mistakes on those parts.
Disadvantages:
• the accuracies of the weak hypotheses must be known in advance;
• the performance bound depends only on the accuracy of the least accurate weak hypothesis.

Background of AdaBoost [2]
[figure: background of AdaBoost]

AdaBoost Algorithm [2]
[figure: AdaBoost pseudocode]

Advantages of AdaBoost
AdaBoost adapts to the errors of the weak hypotheses returned by WeakLearn: unlike the conventional boosting algorithm, the errors need not be known ahead of time. The update rule reduces the probability assigned to those examples on which the hypothesis makes good predictions and increases the probability of the examples on which the prediction is poor.

The Error Bound [3]
Suppose that the weak learning algorithm WeakLearn, when called by AdaBoost, generates hypotheses with errors ε_1, …, ε_T. Then the error ε = Pr_{i~D}[h_f(x_i) ≠ y_i] of the final hypothesis h_f output by AdaBoost is bounded above by

  ε ≤ 2^T ∏_{t=1..T} √(ε_t (1 − ε_t))
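This bound can be checked numerically with a small sketch. The following is an illustrative implementation only, not the code from the project: the one-dimensional data set and the threshold-stump weak learner are invented, but the weight update, the choice of α_t = ½ ln((1 − ε_t)/ε_t), and the bound 2^T ∏ √(ε_t(1 − ε_t)) follow the algorithm as stated above.

```python
# Hedged sketch: minimal binary AdaBoost (labels in {-1, +1}) with
# one-dimensional threshold "stumps" as the weak learner.  Data set and
# stump learner are invented for illustration.
import math

def weak_learn(xs, ys, d):
    """Best threshold stump under distribution d: (thresh, sign, weighted error)."""
    best = None
    for thresh in xs:
        for sign in (+1, -1):
            # stump predicts `sign` for x <= thresh, `-sign` otherwise
            err = sum(w for x, y, w in zip(xs, ys, d)
                      if (sign if x <= thresh else -sign) != y)
            if best is None or err < best[2]:
                best = (thresh, sign, err)
    return best

def adaboost(xs, ys, rounds):
    n = len(xs)
    d = [1.0 / n] * n                      # uniform initial distribution
    hyps, errors = [], []
    for _ in range(rounds):
        thresh, sign, eps = weak_learn(xs, ys, d)
        eps = max(eps, 1e-12)              # guard against a perfect stump
        alpha = 0.5 * math.log((1 - eps) / eps)
        hyps.append((thresh, sign, alpha))
        errors.append(eps)
        # update rule: shrink weights of correct examples, grow the wrong ones
        d = [w * math.exp(-alpha * y * (sign if x <= thresh else -sign))
             for x, y, w in zip(xs, ys, d)]
        z = sum(d)                         # renormalise to a distribution
        d = [w / z for w in d]
    return hyps, errors

def predict(hyps, x):
    s = sum(alpha * (sign if x <= thresh else -sign)
            for thresh, sign, alpha in hyps)
    return 1 if s >= 0 else -1

xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [1, 1, -1, -1, 1, 1, -1, -1]          # not separable by a single stump
hyps, errors = adaboost(xs, ys, rounds=5)
train_err = sum(predict(hyps, x) != y for x, y in zip(xs, ys)) / len(xs)
T = len(errors)
bound = (2 ** T) * math.prod(math.sqrt(e * (1 - e)) for e in errors)
assert train_err <= bound + 1e-9           # the bound from the slide holds
```

Because each round normalises the weights, the training error is exactly bounded by the product of the normalisation factors Z_t = 2√(ε_t(1 − ε_t)), which is where the stated bound comes from.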
Note that the errors generated by WeakLearn are not uniform, and the final error depends on the errors of all of the weak hypotheses. Recall that the errors of the previous boosting algorithms depend only on the maximal error of the weakest hypothesis, ignoring the advantage that can be gained from the hypotheses whose errors are smaller.

A Toy Example [2]
Training set: 10 points (represented by plus or minus).
Original status: equal weights for all training samples.

A Toy Example (cont'd)
Round 1: three "plus" points are not correctly classified; they are given higher weights.

A Toy Example (cont'd)
Round 2: three "minus" points are not correctly classified; they are given higher weights.

A Toy Example (cont'd)
Round 3: one "minus" and two "plus" points are not correctly classified; they are given higher weights.

A Toy Example (cont'd)
Final classifier: integrate the three "weak" classifiers to obtain a final strong classifier.

A Look at AdaBoost [3] Again
[figure: AdaBoost pseudocode revisited]

AdaBoost (cont'd): Multi-class Extensions
The previous discussion is restricted to binary classification problems. In general the label set Y may have any number of labels, giving a multi-class problem. The multi-class case (AdaBoost.M1) requires the accuracy of each weak hypothesis to be greater than 1/2. This condition is stronger in the multi-class case than in binary classification, where a random guess already achieves accuracy 1/2.

AdaBoost.M1
[figure: AdaBoost.M1 pseudocode]

Error Upper Bound of AdaBoost.M1 [3]
As in the binary classification case, the error of the final hypothesis is bounded:

  ε ≤ 2^T ∏_{t=1..T} √(ε_t (1 − ε_t))
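The AdaBoost.M1 bookkeeping described above can be sketched as follows. The three "weak hypotheses" here are hand-made stand-ins for WeakLearn, invented purely for illustration; only the β_t = ε_t/(1 − ε_t) reweighting and the log(1/β_t)-weighted plurality vote follow the algorithm.

```python
# Hedged sketch of AdaBoost.M1's bookkeeping for multi-class labels.
# The weak hypotheses are hand-made stand-ins for WeakLearn.
import math

ys = [0, 0, 1, 1, 2, 2]                    # true labels of six examples
# each weak hypothesis lists its predicted label for every example
hypotheses = [
    [0, 0, 1, 1, 1, 2],                    # wrong on example 4
    [0, 2, 1, 1, 2, 2],                    # wrong on example 1
    [0, 0, 2, 1, 2, 2],                    # wrong on example 2
]

n = len(ys)
d = [1.0 / n] * n                          # uniform initial distribution
betas = []
for h in hypotheses:
    eps = sum(w for w, p, y in zip(d, h, ys) if p != y)
    assert eps < 0.5                       # M1's requirement on WeakLearn
    beta = eps / (1 - eps)
    betas.append(beta)
    # multiply the weight of each correctly classified example by beta (< 1)
    d = [w * (beta if p == y else 1.0) for w, p, y in zip(d, h, ys)]
    z = sum(d)                             # renormalise to a distribution
    d = [w / z for w in d]

def final_label(i):
    # weighted plurality vote: the label maximising the sum of log(1/beta_t)
    votes = {}
    for h, beta in zip(hypotheses, betas):
        votes[h[i]] = votes.get(h[i], 0.0) + math.log(1.0 / beta)
    return max(votes, key=votes.get)

preds = [final_label(i) for i in range(n)]  # recovers ys on this toy data
```

Note how the condition ε_t < 1/2 is what keeps β_t below 1, so correct examples always lose weight relative to mistakes, exactly as in the binary case.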
How Does AdaBoost.M1 Work [4]?
[figure: worked illustration of AdaBoost.M1]

AdaBoost in Our Project
1) The initialization gives the target class the same total weight as all the other stuff:
   bird[1, …, 10] = ½ × 1/10; other stuff[1, …, 690] = ½ × 1/690.
2) A history record is preserved to strengthen the weight-updating process.
3) The unified model obtained from CPM alignment is used for the training process.

AdaBoost in Our Project
2) The history record:
[figure: weight histogram (with history record) vs. weight histogram (without history record)]

AdaBoost in Our Project
3) The unified model obtained from CPM alignment is used for the training process. This has reduced the overfitting problem.
3.1) The overfitting problem.
3.2) The CPM model.

AdaBoost in Our Project
3.1) The overfitting problem: why does the trained AdaBoost not work for birds 11–20? I have compared:
I) the rank of the alpha value for each of the 60 classifiers;
II) how each classifier actually detected birds in the training process;
III) how each class…
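The asymmetric initialization described above (half of the weight mass on the 10 bird examples, half on the 690 others) can be sketched in a few lines; the variable names below are invented, not taken from the project code.

```python
# Hedged sketch of the project's asymmetric weight initialization:
# the target class and the background split the total mass 50/50.
n_bird, n_other = 10, 690
bird = [0.5 * 1.0 / n_bird] * n_bird       # 0.05 per bird example
other = [0.5 * 1.0 / n_other] * n_other    # ~0.000725 per background example
weights = bird + other
total = sum(weights)                       # a proper distribution: sums to 1
```

Compared with a uniform 1/700 start, this makes each bird example roughly 35 times heavier, so the first weak hypotheses cannot ignore the rare target class.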
