CS 189 Spring 2019 Introduction to Machine Learning: Final Exam

• Please do not open the exam before you are instructed to do so.

• Electronic devices are forbidden on your person, including cell phones, iPods, headphones, and laptops. Turn your cell phone off and leave all electronics at the front of the room, or risk getting a zero on the exam.

• When you start, the first thing you should do is check that you have all 12 pages and all 6 questions. The second thing is to please write your initials at the top right of every page after this one (e.g., write "JS" if you are Jonathan Shewchuk).

• The exam is closed book, closed notes except your two cheat sheets.

• You have 3 hours.

• Mark your answers on the exam itself in the space provided. Do not attach any extra sheets.

• The total number of points is 150. There are 26 multiple choice questions worth 3 points each, and 5 written questions worth a total of 72 points.

• For multiple answer questions, fill in the bubbles for ALL correct choices: there may be more than one correct choice, but there is always at least one correct choice. NO partial credit on multiple answer questions: the set of all correct answers must be checked.

First name / Last name / SID
First and last name of student to your left
First and last name of student to your right

Q1. [78 pts] Multiple Answer

Fill in the bubbles for ALL correct choices: there may be more than one correct choice, but there is always at least one correct choice. NO partial credit: the set of all correct answers must be checked.

(a) [3 pts] Which of the following algorithms can learn nonlinear decision boundaries? The decision trees use only axis-aligned splits.
- A depth-five decision tree
- Quadratic discriminant analysis (QDA)
- AdaBoost with depth-one decision trees
- Perceptron

Solution: The solutions are obvious other than AdaBoost with depth-one decision trees, where you can form nonlinear boundaries because the final classifier is not actually a linear combination of the linear weak learners.

(b) [3 pts] Which of the following classifiers are capable of achieving 100% training accuracy on the data below? (Figure not preserved in this copy.) The decision trees use only axis-aligned splits.
- Logistic regression
- A neural network with one hidden layer
- AdaBoost with depth-one decision trees
- AdaBoost with depth-two decision trees

Solution: Top left: Each weak learner will either classify the points from each pair in different classes, or classify every point in the same class. Since the meta-classifier is a weighted sum of all of these weak classifiers, each of which has a 50% training accuracy, the meta-classifier cannot have 100% accuracy. Top right: A neural network with one hidden layer (with enough units) is a universal function approximator. Lower left: Logistic regression finds a linear decision boundary, which cannot separate the data. Lower right: A depth-two decision tree can fully separate the data.

(c) [3 pts] Which of the following are true of support vector machines?
- Increasing the hyperparameter C tends to decrease the training error
- The hard-margin SVM is a special case of the soft-margin SVM with the hyperparameter C set to zero
- Increasing the hyperparameter C tends to decrease the margin
- Increasing the hyperparameter C tends to decrease the sensitivity to outliers

Solution: Top left: True, from the lecture notes. Bottom left: False; the hard-margin SVM is where C tends towards infinity. Top right: False; the perceptron is trained using gradient descent and the SVM is trained using a quadratic program. Bottom right: True: slack becomes less expensive, so you allow data points to be farther on the wrong side of the margin and make the margin bigger. Doing this will never reduce the number of data points inside the margin.

(d) [3 pts] Let r(x) be a decision rule that minimizes the risk for a three-class classifier with labels y ∈ {0, 1, 2} and an asymmetric loss function. What is true about r(·)?
- ∀y ∈ {0, 1, 2}, ∃x : r(x) = y
- If we don't have access to the underlying data distribution P(X) or P(Y|X), we cannot exactly compute the risk of r(·)
- ∀x, r(x) is a class y that maximizes the posterior probability P(Y = y | X = x)
- If P(X = x) changes but P(Y = y | X = x) remains the same for all x, y, then r(X) still minimizes the risk

Solution: Top left: It is possible that r(X) is the same for all X. Top right: No, because the risk is asymmetric. Lower left: By definition of risk, we need to be able to compute expectations over these two distributions. Lower right: Given that r(X) has no constraint, it can pick the y that minimizes risk for every X = x without trade-offs. Therefore, if only the marginals change, that choice is not affected.

(e) [3 pts] Which of the following are true about two-class Gaussian discriminant analysis? Assume you have estimated the parameters μ̂_C, π̂_C, Σ̂_C for class C and μ̂_D, π̂_D, Σ̂_D for class D.
- If μ̂_C = μ̂_D and π̂_C = π̂_D, then the LDA and QDA classifiers are identical
- If Σ̂_C = I (the identity matrix) and Σ̂_D = 5I, then the LDA and QDA classifiers are identical
- If Σ̂_C = Σ̂_D, π̂_C = 1/6, and π̂_D = 5/6, then the LDA and QDA classifiers are identical
- If the LDA and QDA classifiers are identical, then the posterior probability P(Y = C | X = x) is linear in x

Solution: Top left: False; the covariance matrices might differ, making the QDA decision function nonlinear. Bottom left: False; the QDA decision function is nonlinear. Top right: Correct. Bottom right: No; the posterior is a logistic function.

(f) [3 pts] Consider an n × d design matrix X with labels y ∈ Rⁿ. What is true of fitting this data with dual ridge regression with the polynomial kernel k(X_i, X_j) = (X_iᵀX_j + 1)ᵖ = Φ(X_i) · Φ(X_j) and regularization parameter λ → 0?
- If the polynomial degree is high enough, the polynomial will fit the data exactly
- The algorithm computes Φ(X_i) and Φ(X_j) in O(dᵖ) time
- The algorithm solves an n × n linear system
- When n is very large, this dual algorithm is more likely to overfit than the primal algorithm with degree-p polynomial features

Solution: Top left: See the definition of dual ridge regression. Lower left: Both give the same solution, no matter n! Top right: The dual method problem of ridge regressi…
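Part (a)'s AdaBoost observation can be made concrete. The sketch below (not the course's reference code) runs discrete AdaBoost with axis-aligned depth-one stumps on made-up 1-D data labeled +, −, −, +: no single stump separates it, but the sign of the weighted vote of stumps does, so the learned boundary is nonlinear (two decision thresholds).

```python
import numpy as np

# Hypothetical "interval" data: no single depth-one split is perfect.
X = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1, -1, -1, 1])

def stump_predict(x, t, p):
    """Depth-one tree: predict p where x < t, and -p elsewhere."""
    return np.where(x < t, p, -p)

def best_stump(x, y, w):
    """Exhaustively pick the stump with the lowest weighted error."""
    best = (None, None, np.inf)
    for t in np.arange(-0.5, 4.0, 1.0):
        for p in (1, -1):
            err = np.sum(w * (stump_predict(x, t, p) != y))
            if err < best[2]:
                best = (t, p, err)
    return best

def adaboost(x, y, rounds=5):
    w = np.full(len(x), 1.0 / len(x))   # uniform initial weights
    ensemble = []                        # list of (alpha, t, p)
    for _ in range(rounds):
        t, p, err = best_stump(x, y, w)
        err = max(err, 1e-12)            # guard against a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, t, p))
        # Upweight misclassified points, downweight correct ones.
        w *= np.exp(-alpha * y * stump_predict(x, t, p))
        w /= w.sum()
    return ensemble

def predict(ensemble, x):
    score = sum(a * stump_predict(x, t, p) for a, t, p in ensemble)
    return np.sign(score)
```

Each individual stump misclassifies at least one of the four points, yet the ensemble reaches 100% training accuracy within a few rounds; its decision regions are a union of intervals, which no single linear threshold can produce.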
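Part (c)'s claims about C can also be checked numerically. Below is a rough subgradient-descent sketch of the 1-D soft-margin objective ½w² + C·Σ hinge; the data, step sizes, and iteration count are ad hoc assumptions, not tuned values. With a positive outlier sitting inside the negative cluster, a large C behaves like a hard margin (large |w|, small margin 1/|w|), while a small C makes slack cheap, tolerates the outlier, and keeps the margin wide.

```python
import numpy as np

# Hypothetical 1-D data: two negative points, three positive points,
# one of which (x = -1.5) is an outlier inside the negative cluster.
x = np.array([-3.0, -2.0, -1.5, 2.0, 3.0])
y = np.array([-1.0, -1.0, 1.0, 1.0, 1.0])

def train_soft_margin(x, y, C, iters=20000):
    """Subgradient descent on (1/2) w^2 + C * sum(max(0, 1 - y(wx + b)))."""
    w, b = 0.0, 0.0
    for t in range(1, iters + 1):
        lr = 0.2 / (1 + 0.01 * t)           # decaying step size
        viol = y * (w * x + b) < 1           # points violating the margin
        gw = w - C * np.sum(y[viol] * x[viol])
        gb = -C * np.sum(y[viol])
        w -= lr * gw
        b -= lr * gb
    return w, b
```

The geometric margin is 1/|w|, so comparing |w| across two runs shows the trade-off: raising C shrinks the margin and raises sensitivity to the outlier.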
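Part (d)'s point that maximizing the posterior is not risk-minimizing under an asymmetric loss is easy to see with a small example; the loss matrix and posterior below are illustrative assumptions, not from the exam.

```python
import numpy as np

# Hypothetical asymmetric loss L[prediction, true label] for labels {0, 1, 2}.
# Predicting 0 when the truth is 2 is ten times worse than other mistakes.
L = np.array([[0, 1, 10],
              [1, 0, 1],
              [1, 1, 0]])

def bayes_rule(posterior):
    """Risk-minimizing action: argmin_z sum_y L[z, y] * P(Y = y | X = x)."""
    return int(np.argmin(L @ posterior))

p = np.array([0.5, 0.3, 0.2])  # posterior P(Y = y | X = x) at some fixed x
```

Here the posterior argmax is class 0, but its expected loss (2.3) exceeds that of class 1 (0.7), so the risk-minimizing rule picks class 1; with a symmetric 0-1 loss the two rules would coincide.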
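For part (e), here is a quick numerical check (with made-up means, priors, and a shared covariance) that the QDA log posterior odds g(x) is affine in x whenever Σ̂_C = Σ̂_D, even with unequal priors such as 1/6 and 5/6; the posterior P(Y = C | X = x) is then the logistic function s(g(x)), which is not linear in x.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_gaussian(x, mu, Sigma):
    """Log density of N(mu, Sigma) at x."""
    d = len(mu)
    diff = x - mu
    return (-0.5 * diff @ np.linalg.solve(Sigma, diff)
            - 0.5 * np.log(np.linalg.det(Sigma))
            - 0.5 * d * np.log(2 * np.pi))

# Shared covariance (the LDA assumption); unequal means and priors.
Sigma = np.array([[2.0, 0.3], [0.3, 1.0]])
mu_C, mu_D = np.array([1.0, 0.0]), np.array([-1.0, 2.0])
pi_C, pi_D = 1 / 6, 5 / 6

def g(x):
    """QDA decision function: log posterior odds of class C over class D."""
    return (log_gaussian(x, mu_C, Sigma) + np.log(pi_C)
            - log_gaussian(x, mu_D, Sigma) - np.log(pi_D))
```

The test uses the identity g(x1 + x2) − g(x1) − g(x2) + g(0) = 0, which holds exactly for affine functions; the quadratic terms of the two log densities cancel only because the covariances are equal.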
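Part (f)'s dual picture can be sketched directly (dimensions, degree, and seed below are arbitrary assumptions): dual ridge regression never forms Φ explicitly; it solves the n × n system (K + λI)a = y and predicts with f(x) = Σᵢ aᵢ k(Xᵢ, x). With λ → 0 and a degree high enough that the lifted feature space has at least n independent directions, the fit interpolates the labels; with degree 1 it generally cannot.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 8, 3
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

def poly_kernel(A, B, p):
    """k(a, b) = (a . b + 1)^p, computed for all pairs of rows."""
    return (A @ B.T + 1.0) ** p

def dual_ridge_fit(X, y, lam, p):
    """Solve the n-by-n dual system (K + lam * I) a = y."""
    K = poly_kernel(X, X, p)
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def dual_ridge_predict(X_train, a, X_test, p):
    """f(x) = sum_i a_i k(X_i, x), the kernelized prediction."""
    return poly_kernel(X_test, X_train, p) @ a
```

Note the costs involved: building K and solving the system depend on n, not on the O(dᵖ)-dimensional feature space, which is the point of the dual formulation.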
