Machine Learning Exam (mids19)

CS 189 Spring 2019 Introduction to Machine Learning: Midterm

• Please do not open the exam before you are instructed to do so.
• The exam is closed book, closed notes except your one-page cheat sheet.
• Electronic devices are forbidden on your person, including cell phones, iPods, headphones, and laptops. Turn your cell phone off and leave all electronics at the front of the room, or risk getting a zero on the exam.
• You have 1 hour and 20 minutes.
• Please write your initials at the top right of each page after this one (e.g., write "JS" if you are Jonathan Shewchuk). Finish this by the end of your 1 hour and 20 minutes.
• Mark your answers on the exam itself in the space provided. Do not attach any extra sheets.
• The total number of points is 100. There are 20 multiple choice questions worth 3 points each, and 4 written questions worth a total of 40 points.
• For multiple answer questions, fill in the bubbles for ALL correct choices: there may be more than one correct choice, but there is always at least one correct choice. NO partial credit on multiple answer questions: the set of all correct answers must be checked.

First name / Last name / SID / First and last name of student to your left / First and last name of student to your right

Q1. [60 pts] Multiple Answer

Fill in the bubbles for ALL correct choices: there may be more than one correct choice, but there is always at least one correct choice. NO partial credit: the set of all correct answers must be checked.

(a) [3 pts] Let A be a real, symmetric n × n matrix. Which of the following are true about A's eigenvectors and eigenvalues?
○ A can have no more than n distinct eigenvalues
○ A can have no more than 2n distinct unit-length eigenvectors
○ The vector 0 is an eigenvector, because A0 = 0
○ We can find n mutually orthogonal eigenvectors of A

Solution: There can be infinitely many unit-length eigenvectors if the multiplicity of any eigenvalue is greater than 1 (so the eigenspace is a plane, and you can pick any vector on the unit circle in that plane). The vector 0 is not an eigenvector, by definition.

(b) [3 pts] The matrix that has eigenvector [1, 2]⊤ with eigenvalue 2 and eigenvector [−2, 1]⊤ with eigenvalue 1 (note that these are not unit eigenvectors!) is
○ [9 2; 2 6]
○ [9/5 2/5; 2/5 6/5]
○ [6 2; 2 9]
○ [6/5 2/5; 2/5 9/5]

(c) [3 pts] Consider a binary classification problem where we know both of the class conditional distributions exactly. To compute the risk,
○ we need to know all the sample points
○ we need to know the loss function
○ we need to know the class prior probabilities
○ we need to use gradient descent

(d) [3 pts] Assuming we can find algorithms to minimize them, which of the following cost functions will encourage sparse solutions (i.e., solutions where many components of w are zero)?
○ ‖Xw − y‖₂² + λ‖w‖₁
○ ‖Xw − y‖₂² + λ‖w‖₁²
○ ‖Xw − y‖₂² + λ · (# of nonzero components of w)
○ ‖Xw − y‖₂² + λ‖w‖₂²

Solution: The first answer is Lasso, which we know finds sparse solutions. The second is Lasso with the penalty squared; squaring the penalty leaves the same isocontours, so it keeps the same sparsity-inducing properties as Lasso. The third cost function directly penalizes solutions that are not sparse, so it naturally encourages sparse solutions. The last is ridge regression, which shrinks weights but does not set weights to zero.

(e) [3 pts] Which of the following statements about logistic regression are correct?
○ The cost function of logistic regression is convex
○ Logistic regression uses the squared error as the loss function
○ The cost function of logistic regression is concave
○ Logistic regression assumes that each class's points are generated from a Gaussian distribution

(f) [3 pts] Which of the following statements about stochastic gradient descent and Newton's method are correct?
○ Newton's method often converges faster than stochastic gradient descent, especially when the dimension is small
○ Newton's method converges in one iteration when the cost function is exactly quadratic with one unique minimum
○ If the function is continuous with continuous derivatives, Newton's method always finds a local minimum
○ Stochastic gradient descent reduces the cost function at every iteration

(g) [3 pts] Let X ∈ R^{n×d} be a design matrix containing n sample points with d features each. Let y ∈ R^n be the corresponding real-valued labels. What is always true about every solution w that locally minimizes the linear least squares objective ‖Xw − y‖₂², no matter what the value of X is?
○ w = X⁺y (where X⁺ is the pseudoinverse)
○ w is in the null space of X
○ w satisfies the normal equations
○ All of the local minima are global minima

Solution: Top left: w = X⁺y is the least squares solution with least norm. If the null space of X is nontrivial, there are infinitely many other solutions that minimize ‖Xw − y‖₂² but that have larger norm. Bottom left: if w were in the null space of X, then Xw = 0, which minimizes the objective only if X⊤y = 0, i.e., only if y lies in the null space of X⊤. Top right: the normal equations X⊤Xw = X⊤y characterize exactly the values of w at which the gradient of the least squares objective is zero, so any minimizer must satisfy them. Bottom right: the objective is convex, and therefore all local minimizers are also global minimizers.

(h) [3 pts] We are using linear discriminant analysis to classify points x ∈ R^d into three different classes. Let S be the set of points in R^d that our trained model classifies as belonging to the first class. Which of the following are true?
○ The decision boundary of S is always a hyperplane
○ The decision boundary of S is always a subset of a union of hyperplanes
○ S can be the whole space R^d
○ S is always connected (that is, every pair of points in S is connected by a path in S)

Solution: Top left: given that we have three classes, S is defined by two linear inequalities, and therefore its boundary may not be a hyperplane. Bottom left: given that S is defined as the set of points satisfying a set of linear inequalities, its boundary is a subset of the union of the hyperplanes defined by each of the linear inequalities. Top right: if the prior for the first class is high enough, the probab…
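A quick sanity check of question (b): computing the eigendecomposition A = VΛV⁻¹ with V = [[1, −2], [2, 1]] and Λ = diag(2, 1) gives the last option, [6/5 2/5; 2/5 9/5]. The sketch below (the `matvec` helper is made up for illustration) verifies the two eigenpairs in exact rational arithmetic:

```python
# Check that A = [[6/5, 2/5], [2/5, 9/5]] maps [1, 2] to 2*[1, 2]
# and [-2, 1] to 1*[-2, 1]. Fractions avoid floating-point fuzz.
from fractions import Fraction as F

def matvec(A, v):
    """Multiply a 2x2 matrix (list of rows) by a 2-vector."""
    return [A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1]]

A = [[F(6, 5), F(2, 5)],
     [F(2, 5), F(9, 5)]]

print(matvec(A, [1, 2]))    # equals [2, 4], i.e. eigenvalue 2
print(matvec(A, [-2, 1]))   # equals [-2, 1], i.e. eigenvalue 1
```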
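The sparsity claim in (d) is easiest to see in one dimension, where both penalized problems have closed forms: the ℓ₁ penalty soft-thresholds, setting small inputs exactly to zero, while the ℓ₂ penalty only rescales. A minimal sketch (the helper names are invented for this illustration):

```python
# 1-D view of question (d).
# argmin_w (w - y)^2 + lam*|w|   is soft thresholding: exactly 0 when |y| <= lam/2.
# argmin_w (w - y)^2 + lam*w^2   is pure shrinkage: y / (1 + lam), never exactly 0.

def lasso_1d(y, lam):
    """Soft-thresholding solution of the 1-D lasso problem."""
    if y > lam / 2:
        return y - lam / 2
    if y < -lam / 2:
        return y + lam / 2
    return 0.0

def ridge_1d(y, lam):
    """Shrinkage solution of the 1-D ridge problem."""
    return y / (1 + lam)

for y in [0.3, 1.0, -2.0]:
    # Small y is zeroed by the lasso but only shrunk by ridge.
    print(y, lasso_1d(y, lam=1.0), ridge_1d(y, lam=1.0))
```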
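For (e), the convexity of the logistic-regression cost can be spot-checked numerically with a midpoint test on a toy 1-D dataset (the dataset and test points below are arbitrary choices for illustration):

```python
# Midpoint convexity check for J(w) = sum_i log(1 + exp(-y_i * w * x_i)):
# for a convex J, J((a + b)/2) <= (J(a) + J(b)) / 2 for every a, b.
import math

data = [(0.5, 1), (1.5, 1), (-1.0, -1), (2.0, -1)]   # (x_i, y_i) pairs

def J(w):
    return sum(math.log(1 + math.exp(-y * w * x)) for x, y in data)

for a, b in [(-3.0, 2.0), (0.0, 5.0), (-1.0, -0.25)]:
    print(J((a + b) / 2) <= (J(a) + J(b)) / 2 + 1e-12)   # True for each pair
```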
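The second choice in (f) can be demonstrated directly: on an exactly quadratic cost, one Newton step lands on the minimizer from any starting point. A tiny sketch on the (arbitrarily chosen) quadratic f(w) = (w − 3)² + 1:

```python
# One Newton step w <- w - f'(w) / f''(w) on f(w) = (w - 3)^2 + 1.
# For a quadratic, the Hessian is constant, so the step is exact.

def newton_step(w):
    grad = 2 * (w - 3)   # f'(w)
    hess = 2             # f''(w)
    return w - grad / hess

for w0 in [-10.0, 0.0, 100.0]:
    print(w0, "->", newton_step(w0))   # always reaches the minimizer 3.0
```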
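A numeric check of the "normal equations" choice in (g), on a made-up 3-point, 2-feature design: any w solving X⊤Xw = X⊤y makes the gradient 2X⊤(Xw − y) vanish, so it minimizes ‖Xw − y‖₂².

```python
# Solve the normal equations for a small overdetermined system by hand,
# then confirm the gradient of the least squares objective is (near) zero.

X = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]   # 3 points: bias column + one feature
y = [1.0, 2.0, 2.0]

# Build X^T X (2x2) and X^T y (2-vector).
xtx = [[sum(r[i] * r[j] for r in X) for j in range(2)] for i in range(2)]
xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(2)]

# Solve the 2x2 system with Cramer's rule.
det = xtx[0][0] * xtx[1][1] - xtx[0][1] * xtx[1][0]
w = [(xty[0] * xtx[1][1] - xtx[0][1] * xty[1]) / det,
     (xtx[0][0] * xty[1] - xty[0] * xtx[1][0]) / det]

# Gradient of ||Xw - y||^2 is 2 X^T (Xw - y); it should vanish at w.
residual = [sum(r[j] * w[j] for j in range(2)) - yi for r, yi in zip(X, y)]
grad = [2 * sum(r[i] * res for r, res in zip(X, residual)) for i in range(2)]
print("w =", w)
print("gradient =", grad)   # approximately [0, 0]
```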
