Machine Learning Exam: mids14

CS 189 Spring 2014  Introduction to Machine Learning  Midterm

• You have 2 hours for the exam.
• The exam is closed book, closed notes except your one-page crib sheet.
• Please use non-programmable calculators only.
• Mark your answers ON THE EXAM ITSELF. If you are not sure of your answer you may wish to provide a brief explanation.
• For true/false questions, fill in the True/False bubble.
• For multiple-choice questions, fill in the bubbles for ALL CORRECT CHOICES (in some cases, there may be more than one). We have introduced a negative penalty for false positives for the multiple choice questions such that the expected value of randomly guessing is 0. Don't worry: for this section, your score will be the maximum of your score and 0, so you cannot incur a negative score for this section.

First name / Last name / SID
First and last name of student to your left
First and last name of student to your right

For staff use only:
Q1. True or False  /10
Q2. Multiple Choice  /24
Q3. Decision Theory  /8
Q4. Kernels  /14
Q5. L2-Regularized Linear Regression with Newton's Method  /8
Q6. Maximum Likelihood Estimation  /8
Q7. Affine Transformations of Random Variables  /13
Q8. Generative Models  /15
Total  /100

Q1. [10 pts] True or False
(a) [1 pt] The hyperparameters in the regularized logistic regression model are η (the learning rate) and λ (the regularization term).  ○ True ○ False
(b) [1 pt] The objective function used in L2-regularized logistic regression is convex.  ○ True ○ False
(c) [1 pt] In SVMs, the values of α_i for non-support vectors are 0.  ○ True ○ False
(d) [1 pt] As the number of data points approaches ∞, the error rate of a 1-NN classifier approaches 0.  ○ True ○ False
(e) [1 pt] Cross validation will guarantee that our model does not overfit.  ○ True ○ False
(f) [1 pt] As the number of dimensions increases, the percentage of the volume in the unit-ball shell with thickness ε grows.  ○ True ○ False
(g) [1 pt] In logistic regression, the Hessian of the (non-regularized) log likelihood is positive definite.  ○ True ○ False
(h) [1 pt] Given a binary classification scenario with Gaussian class conditionals and equal prior probabilities, the optimal decision boundary will be linear.  ○ True ○ False
(i) [1 pt] In the primal version of SVM, we are minimizing the Lagrangian with respect to w, and in the dual version, we are minimizing the Lagrangian with respect to α.  ○ True ○ False
(j) [1 pt] For the dual version of soft-margin SVM, the α_i's for support vectors satisfy α_i ≤ C.  ○ True ○ False
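As a side note to Q1(b) and (g): a minimal numerical sketch of the Hessian of the (unregularized) logistic regression log-likelihood, H = −X^T S X with S = diag(s_i (1 − s_i)) and s_i = σ(w^T x_i). The data, dimensions, and variable names below are synthetic and illustrative, not taken from the exam.

import numpy as np

# Hessian of the logistic regression log-likelihood at an arbitrary weight
# vector w, for synthetic data: H = -X^T S X with S = diag(s_i * (1 - s_i)).
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))        # 20 synthetic points, 3 features
w = rng.normal(size=3)              # arbitrary weight vector

s = 1.0 / (1.0 + np.exp(-X @ w))    # sigmoid(w^T x_i) for each point
S = np.diag(s * (1.0 - s))
H = -X.T @ S @ X                    # Hessian of the log-likelihood

print(np.linalg.eigvalsh(H))        # eigenvalues are all <= 0 here

Since s_i (1 − s_i) ≥ 0, the quadratic form v^T H v = −Σ_i s_i (1 − s_i)(x_i^T v)² is never positive, so the log-likelihood is concave and the L2-regularized negative log-likelihood is convex.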
Q2. [24 pts] Multiple Choice
(a) [3 pts] Consider the binary classification problem where y ∈ {0, 1} is the label and we have prior probability P(y = 0) = π_0. If we model P(x | y = 1) to be the following distributions, which one(s) will cause the posterior P(y = 1 | x) to have a logistic function form?
○ Gaussian  ○ Poisson  ○ Uniform  ○ None of the above

(b) [3 pts] Given the following data samples (square and triangle belong to two different classes), which one(s) of the following algorithms can produce zero training error?
[figure: scatter plot of the square-class and triangle-class points]
○ 1-nearest neighbor  ○ Support vector machine  ○ Logistic regression  ○ Linear discriminant analysis

(c) [3 pts] The following diagrams show the iso-probability contours for two different 2D Gaussian distributions. On the left side, the data ∼ N(0, I) where I is the identity matrix. The right side has the same set of contour levels as the left side. What is the mean and covariance matrix for the right side's multivariate Gaussian distribution?
[figure: two contour plots with x and y axes running from −5 to 5]
○ μ = [0; 0]^T, Σ = [1 0; 0 1]
○ μ = [0; 1]^T, Σ = [1 0; 0 1]
○ μ = [0; 1]^T, Σ = [4 0; 0 0.25]
○ μ = [0; 1]^T, Σ = [2 0; 0 0.5]

(d) [3 pts] Given the following data samples (square and triangle mean two classes), which one(s) of the following kernels can we use in SVM to separate the two classes?
[figure: scatter plot of the two classes]
○ Linear kernel  ○ Polynomial kernel  ○ Gaussian RBF (radial basis function) kernel  ○ None of the above

(e) [3 pts] Consider the following plots of the contours of the unregularized error function along with the constraint region. What regularization term is used in this case?
○ L2  ○ L1  ○ L∞  ○ None of the above

(f) [3 pts] Suppose we have a covariance matrix
Σ = [5 a; a 4]
What is the set of values that a can take on such that Σ is a valid covariance matrix?
○ a ≤ 2√20  ○ a ≤ √20  ○ a ≥ 0  ○ −√20 ≤ a ≤ √20

(g) [3 pts] The soft-margin SVM formulation is as follows:
min  (1/2) w^T w + C Σ_{i=1}^{N} ξ_i
subject to  y_i (w^T x_i + b) ≥ 1 − ξ_i  ∀i
            ξ_i ≥ 0  ∀i
What is the behavior of the width of the margin (2/‖w‖) as C → 0?
○ Behaves like hard margin  ○ Goes to infinity  ○ Goes to zero  ○ None of the above

(h) [3 pts] In Homework 4, you fit a logistic regression model on spam and ham data for a Kaggle competition. Assume you had a very good score on the public test set, but when the GSIs ran your model on a private test set, your score dropped a lot. This is likely because you overfitted by submitting multiple times and changing the following between submissions:
○ λ, your penalty term  ○ η, your step size  ○ Your convergence criterion  ○ Fixing a random bug

(i) [0 pts] BONUS QUESTION (Answer this only if you have time and are confident of your other answers, because this is not extra points.) We have constructed the multiple choice problems such that every false positive will incur some negative penalty. For one of these multiple choice problems, given that there are p points, r correct answers, and k choices, what is the formula for the penalty such that the expected value of random guessing is equal to 0? (You may assume k ≥ r.)
Solution: p / (k − r)

Q3. [8 pts] Decision Theory
Consider the following generative model for a 2-class classification problem, in which the class conditionals are Bernoulli distributions:
p(ω_1) = π,  p(ω_2) = 1 − π
x | ω_1 = 1 with probability 0.5, 0 with probability 0.5
x | ω_2 = 1 with probability 0.5, 0 with probability 0.5
Assume the loss matrix
                         true class = 1    true class = 2
predicted class = 1            0                λ_12
predicted class = 2           λ_21               0

(a) [8 pts] Give a condition in terms of λ_12, λ_21, and π that determines when class 1 should always be chosen as the minimum-risk class.
Solution: Based on Bayes' rule, the posterior probability P(ω_i | x) is
P(ω_1 | x) = P(x | ω_1) P(ω_1) / P(x) = 0.5π / P(x)
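In the same spirit, a minimal sketch of the minimum-risk decision rule for the setup in Q3, under the reconstruction above (priors π and 1 − π, identical Bernoulli(0.5) class conditionals, zero loss on the diagonal). The function name and the example values of π, λ_12, λ_21 are illustrative only, not taken from the exam.

def min_risk_class(pi, l12, l21, x):
    """Return the minimum-conditional-risk class (1 or 2) for observation x."""
    # Both class conditionals put probability 0.5 on either value of x, so the
    # posteriors equal the priors and the decision does not depend on x.
    post1 = (0.5 * pi) / 0.5          # P(w1 | x) = pi
    post2 = (0.5 * (1.0 - pi)) / 0.5  # P(w2 | x) = 1 - pi
    risk1 = l12 * post2               # conditional risk of predicting class 1
    risk2 = l21 * post1               # conditional risk of predicting class 2
    return 1 if risk1 <= risk2 else 2

# Class 1 is always the minimum-risk choice exactly when l12 * (1 - pi) <= l21 * pi.
print(min_risk_class(pi=0.7, l12=1.0, l21=1.0, x=0))   # -> 1
print(min_risk_class(pi=0.2, l12=1.0, l21=1.0, x=1))   # -> 2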
