Machine Learning Exam (mids20a)


CS 189, Spring 2020
Introduction to Machine Learning
Midterm A

• Please do not open the exam before you are instructed to do so.
• The exam is closed book, closed notes except your cheat sheets.
• Please write your name at the top of each page of the Answer Sheet. (You may do this before the exam.)
• You have 80 minutes to complete the midterm exam (6:40–8:00 PM). (If you are in the DSP program and have an allowance of 150% or 200% time, that comes to 120 minutes or 160 minutes, respectively.)
• When the exam ends (8:00 PM), stop writing. You have 15 minutes to scan the exam and turn it in to Gradescope. You must remain visible on camera while you scan your exam and turn it in (unless the scanning device is your only self-monitoring device). Most of you will use your cellphone and a third-party scanning app. If you have a physical scanner in your workspace that you can make visible from your camera, you may use that. Late exams will be penalized at a rate of 10 points per minute after 8:15 PM. (The midterm has 100 points total.) Continuing to work on the exam after 8:00 PM (or not being visible prior to submission) may incur a score of zero.
• Mark your answers on the Answer Sheet. If you absolutely must use overflow space for a written question, use the space for "Written Question #5" (but please try hard not to overflow). Do not attach any extra sheets.
• The total number of points is 100. There are 10 multiple choice questions worth 4 points each, and three written questions worth 20 points each.
• For multiple answer questions, fill in the bubbles for ALL correct choices: there may be more than one correct choice, but there is always at least one correct choice. NO partial credit on multiple answer questions: the set of all correct answers must be checked.
• For written questions, please write your full answer in the space provided and clearly label all subparts of each written question. Again, do not attach extra sheets.

First name ________  Last name ________  SID ________

Q1. [40 pts] Multiple Answer

Fill in the bubbles for ALL correct choices: there may be more than one correct choice, but there is always at least one correct choice. NO partial credit: the set of all correct answers must be checked.

(a) [4 pts] Let X be an m × n matrix. Which of the following are always equal to rank(X)?

A: rank(Xᵀ)
B: rank(XᵀX)
C: m − dim(nullspace(X))
D: dim(rowspace(X))

Solution: Option C is not equal because, by the rank–nullity theorem, n − dim(nullspace(X)) = rank(X), and in general m ≠ n. Options A and D are equal, since dim(rowspace(X)) = dim(columnspace(Xᵀ)) = rank(Xᵀ) = rank(X). Option B is also equal: Xv = 0 if and only if XᵀXv = 0, so X and XᵀX have the same nullspace and, both having n columns, the same rank.

(b) [4 pts] Which of the following types of square matrices can have negative eigenvalues?

A: a symmetric matrix
B: I − uuᵀ, where u is a unit vector
C: an orthonormal matrix (M such that MᵀM = I)
D: ∇²f(x), where f(x) is a Gaussian PDF

Solution:
A: A symmetric matrix can have negative eigenvalues; its eigenvalues just have to be real.
B: I − uuᵀ cannot. Since u is a unit vector, (I − uuᵀ)u = u − u(uᵀu) = 0, so u is an eigenvector with eigenvalue 0; any vector orthogonal to u is an eigenvector with eigenvalue 1. The eigenvalues are 0 and 1, never negative.
C: An orthonormal matrix can have the eigenvalue −1 (for example, M = −I).
D: A Gaussian PDF is concave near its mode, so its Hessian there has negative eigenvalues.

(c) [4 pts] Choose the correct statement(s) about Support Vector Machines (SVMs).

A: if a finite set of training points from two classes is linearly separable, a hard-margin SVM will always find a decision boundary correctly classifying every training point
B: if a finite set of training points from two classes is linearly separable, a soft-margin SVM will always find a decision boundary correctly classifying every training point
C: every trained two-class hard-margin SVM model has at least one point of each class at a distance of exactly 1/‖w‖ (the margin width) from the decision boundary
D: every trained two-class soft-margin SVM model has at least one point of each class at a distance of exactly 1/‖w‖ (the margin width) from the decision boundary

Solution: Option A is correct: fundamental material about SVMs from lectures. Option C is also correct: at the hard-margin optimum, at least one constraint yᵢ(w · xᵢ + α) ≥ 1 must be active for each class, since otherwise the boundary could be shifted and w shrunk, contradicting optimality; those active points lie at distance exactly 1/‖w‖. A soft-margin SVM's slack variables permit misclassified points and need not place any point exactly on the margin, so B and D are false.
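As a quick numerical check of part (a)'s claims (a sketch added here, not part of the original exam; the matrix sizes and random seed are arbitrary choices):

```python
# Build a 5x4 matrix of rank 2 and compare the four candidate quantities.
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 5, 4, 2
X = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # rank r

rank = np.linalg.matrix_rank
nullity = n - rank(X)                      # dim(nullspace(X)), by rank-nullity
print(rank(X), rank(X.T), rank(X.T @ X))   # 2 2 2  -> options A and B equal rank(X)
print(n - nullity)                         # 2      -> rank-nullity uses n...
print(m - nullity)                         # 3      -> ...so option C fails when m != n
# Option D: dim(rowspace(X)) is rank(X) by definition of rank.
```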
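Part (b)'s eigenvalue claims can be confirmed the same way; the specific matrices below (a 2×2 reflection, u = (0.6, 0.8), M = −I, and a one-dimensional standard Gaussian) are our own illustrative choices:

```python
# Check each option's eigenvalues on concrete matrices.
import numpy as np

# A: a symmetric matrix can have negative eigenvalues (reflection across y = x).
A = np.array([[0.0, 1.0], [1.0, 0.0]])
print(np.linalg.eigvalsh(A))                           # [-1.  1.]

# B: I - u u^T has eigenvalues 0 (along u) and 1 (orthogonal to u), never negative.
u = np.array([0.6, 0.8])                               # a unit vector
print(np.linalg.eigvalsh(np.eye(2) - np.outer(u, u)))  # ~[0.  1.]

# C: an orthonormal matrix can have eigenvalue -1, e.g. M = -I (M^T M = I).
print(np.linalg.eigvals(-np.eye(2)))                   # [-1. -1.]

# D: the 1-D standard Gaussian PDF f(x) = exp(-x^2/2)/sqrt(2*pi) has
# f''(0) = -1/sqrt(2*pi) < 0, so its Hessian at the mode is negative.
print(-1.0 / np.sqrt(2.0 * np.pi))                     # about -0.399
```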
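Part (c)'s margin claim can also be checked empirically. The sketch below approximates a hard-margin SVM with scikit-learn's soft-margin SVC by making C very large; the toy data set is an assumption of this example, not from the exam:

```python
# With separable data and C -> infinity, the SVM classifies every point
# correctly, and each class has a point at distance ~1/||w|| from the boundary.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 0.0], [2.0, 1.0]])  # toy data
y = np.array([-1, -1, 1, 1])

clf = SVC(kernel="linear", C=1e10).fit(X, y)    # huge C ~ hard margin
w, b = clf.coef_[0], clf.intercept_[0]
dist = np.abs(X @ w + b) / np.linalg.norm(w)    # distances to the boundary
print(np.all(clf.predict(X) == y))              # True (option A)
print(dist.min(), 1 / np.linalg.norm(w))        # both ~1.0 (option C)
```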
(d) [4 pts] Suppose we perform least-squares linear regression, but we don't assume that all weight vectors are equally reasonable; instead, we use the maximum a posteriori method to impose a normally-distributed prior probability on the weights. Then we are doing

A: L2 regularization
B: Lasso regression
C: logistic regression
D: ridge regression

Solution: As shown in Lecture 13, the Bayesian justification for ridge regression is derived by applying MAP to the posterior probability with a Gaussian prior on the weights. Ridge regression and L2 regularization are the same thing, so A and D are both correct.

(e) [4 pts] Which of the following statements regarding ROC curves are true?

A: the ROC curve is monotonically increasing
B: for a logistic regression classifier, the ROC curve's horizontal axis is the posterior probability used as a threshold for the decision rule
C: the ROC curve is concave
D: if the ROC curve passes through (0, 1), the classifier is always correct (on the test data used to make the ROC curve)

Solution: The axes of an ROC curve do not correspond to the "knob" (the threshold) we're turning when we plot the curve, so B is false. Loosening the threshold can only raise both rates: always predicting positive gives a 100% true positive rate and a 100% false positive rate, and the curve increases monotonically toward that corner. The curve does not have to be concave; it just has to be increasing. If it passes through (0, 1), then since it's increasing, the curve is a horizontal line at y = 1. So we have no false positives nor false negatives at some threshold.

(f) [4 pts] One way to understand regularization is to ask which vectors minimize the regularization term. Consider the set of unit vectors in the plane: {x ∈ ℝ² : ‖x‖₂² = 1}. Which of the following regularization terms are minimized solely by the four unit vectors {(0, 1), (1, 0), (−1, 0), (0, −1)} and no other unit vector?

A: f(x) = ‖x‖₀ = the # of nonzero entries of x
B: f(x) = ‖x‖₁
C: f(x) = ‖x‖₂²
D: f(x) = ‖x‖∞ = max{|x₁|, |x₂|}

Solution: The first option is almost true by definition: these are the sparsest unit vectors. The second option follows from Cauchy–Schwarz (‖x‖₁ ≥ ‖x‖₂ = 1, with equality only when one coordinate is zero); intuitively, we also know that the ℓ1 norm promotes sparsity, so we should expect this to be true. Finally, notice that ‖x‖₂² always equals 1 on this set, and that max{|x₁|, |x₂|} is minimized when |x₁| = |x₂|, so neither C nor D is minimized solely by those four vectors.
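To make part (d) concrete (an added sketch, not part of the exam): up to constants, the negative log-posterior under a Gaussian prior is the ridge objective ‖Xw − y‖² + λ‖w‖², whose minimizer has the closed form (XᵀX + λI)⁻¹Xᵀy. The data and λ = 0.1 below are arbitrary choices:

```python
# MAP with a Gaussian prior on w <=> ridge regression.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
X = rng.standard_normal((20, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.standard_normal(20)
lam = 0.1

w_map = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)  # closed form
w_sk = Ridge(alpha=lam, fit_intercept=False).fit(X, y).coef_
print(np.allclose(w_map, w_sk))   # True
```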
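A brief check of part (e)'s monotonicity claim (illustrative only; the labels and scores are made up):

```python
# An ROC curve is traced by sweeping the decision threshold; both axes
# (FPR and TPR) are nondecreasing along the sweep.
import numpy as np
from sklearn.metrics import roc_curve

y_true = np.array([0, 0, 1, 1, 0, 1, 1, 0])
scores = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.6, 0.5])
fpr, tpr, thresholds = roc_curve(y_true, scores)
print(np.all(np.diff(fpr) >= 0), np.all(np.diff(tpr) >= 0))  # True True
```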
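Finally, part (f)'s comparison can be tabulated numerically. This sketch (not from the exam) evaluates each regularizer at an axis-aligned unit vector and at the diagonal unit vector where |x₁| = |x₂|:

```python
# Compare the candidate regularizers at representative unit vectors.
import numpy as np

pts = {
    "(1, 0)": np.array([1.0, 0.0]),
    "(0, 1)": np.array([0.0, 1.0]),
    "diagonal": np.array([1.0, 1.0]) / np.sqrt(2),
}
for name, x in pts.items():
    l0 = np.count_nonzero(x)        # ||x||_0
    l1 = np.abs(x).sum()            # ||x||_1
    l2sq = float(x @ x)             # ||x||_2^2 (constant on the unit circle)
    linf = np.abs(x).max()          # ||x||_inf
    print(f"{name}: l0={l0}  l1={l1:.3f}  l2sq={l2sq:.3f}  linf={linf:.3f}")
# The axis points minimize l0 and l1; l2sq is 1 everywhere on the circle;
# linf is smaller (0.707) at the diagonal point, so C and D are incorrect.
```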
