Materials on Recent Advances in Deep Learning

How Do Deep Neural Networks Reduce the Dimensionality of Data? (2019/12/13)

◆ 2012: In the ImageNet recognition challenge, Hinton's team classified 1.2 million images with a 16% error rate, against a 26% error rate for its closest competitor.
◆ 2006: Hinton and Salakhutdinov proposed a learning method for deep neural networks. Subsequently (news of June 26, 2012), researchers at Google and Stanford University created the largest neural network to date, a nine-layer deep neural network that learned on its own from images on the internet. The system, built to simulate the human brain, had one billion connections and was trained for three days on 10 million images using 16,000 processor cores; after three days of unsupervised learning it had formed the concept of a cat. (News of Oct. 2012) Microsoft demonstrated a fully automated speech interpretation system at the 21st Century Computing Conference; the key technology of this system is the DNN (deep neural network).
◆ 2013: Hinton joined Google; Yann LeCun joined Facebook; Baidu founded its Institute of Deep Learning (IDL).
◆ 2014: Andrew Ng joined Baidu.

Recent Advances of Deep Learning

◆ 2015: Journal Citation Reports for SCI.

Three Ideas of CNN

1. Local receptive fields (the area of local perception)
   [Fig. 1: Connection pattern of a conventional neural net. Fig. 2: Connection pattern of a convolutional neural net.]
2. Shared weights
   [Fig. 3: A single convolution kernel.]
   Sharing weights reduces the complexity of the network and the amount of computation.
   [Fig. 4: Multiple convolution kernels. Fig. 5: Inside a convolutional neural network.]
3. Secondary extraction of the feature maps (i.e., pooling/subsampling); ideas 1-3 are illustrated in the sketch after the reference list below.

Research points that can produce new ideas
※ Pretraining algorithms for the initial weights
※ The architecture of deep neural networks
※ Layer-by-layer learning algorithms (see the second sketch below)

References
[1] G. E. Hinton, R. R. Salakhutdinov. Reducing the Dimensionality of Data with Neural Networks [J]. Science, 2006, 313: 504-507.
[2] Yann LeCun, Yoshua Bengio, Geoffrey Hinton. Deep learning [J]. Nature, 2015, 521: 436-444.
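The three CNN ideas above can be made concrete in code. The following is a minimal NumPy sketch (not taken from the slides, which contain no code): a single shared kernel slides over an image so that each output unit sees only a small local patch (idea 1) while every unit reuses the same weights (idea 2), and a max-pooling step performs the secondary extraction of the feature map (idea 3). The function names, the 3x3 kernel size, and the 2x2 pooling window are illustrative assumptions.

```python
import numpy as np

def conv2d_single_kernel(image, kernel):
    """Slide one shared kernel over the image (valid convolution).

    Each output unit is connected only to a small local patch of the
    input (local receptive field), and every unit reuses the same
    kernel weights (weight sharing).
    """
    H, W = image.shape
    kH, kW = kernel.shape
    out = np.zeros((H - kH + 1, W - kW + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + kH, j:j + kW]   # local receptive field
            out[i, j] = np.sum(patch * kernel)  # same weights everywhere
    return out

def max_pool(feature_map, size=2):
    """Secondary extraction of the feature map: keep the maximum of
    each non-overlapping size x size block (max pooling)."""
    H, W = feature_map.shape
    H2, W2 = H // size, W // size
    blocks = feature_map[:H2 * size, :W2 * size].reshape(H2, size, W2, size)
    return blocks.max(axis=(1, 3))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.random((8, 8))    # toy 8x8 "image"
    kernel = rng.random((3, 3))   # one 3x3 shared kernel: only 9 weights in total
    fmap = conv2d_single_kernel(image, kernel)
    print(fmap.shape)             # (6, 6) feature map
    print(max_pool(fmap).shape)   # (3, 3) after secondary extraction
```

Weight sharing is exactly why the network complexity and the amount of computation drop: the 3x3 kernel has only 9 parameters regardless of the input size, and in practice several kernels (as in Fig. 4) are used to produce several feature maps.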

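The title question and the research points on initial-weight pretraining and layer-by-layer learning refer to the approach of reference [1]: a deep autoencoder is pretrained greedily, one layer at a time, and its narrow middle layer yields a low-dimensional code for the data. The sketch below illustrates that idea under simplifying assumptions: plain single-layer autoencoders trained by gradient descent stand in for the restricted Boltzmann machines of [1], the end-to-end fine-tuning stage is omitted, and all layer sizes and hyperparameters are made up for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_autoencoder_layer(X, n_hidden, lr=0.1, epochs=200, seed=0):
    """Train one shallow autoencoder (X -> hidden -> X) by gradient
    descent on the squared reconstruction error; return the encoder weights."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))   # encoder weights
    W2 = rng.normal(0.0, 0.1, (n_hidden, n_in))   # decoder weights
    for _ in range(epochs):
        H = sigmoid(X @ W1)        # encode
        X_hat = H @ W2             # decode (linear output)
        err = X_hat - X
        # Backpropagate the reconstruction error through this layer only.
        grad_W2 = H.T @ err / len(X)
        grad_H = err @ W2.T * H * (1 - H)
        grad_W1 = X.T @ grad_H / len(X)
        W1 -= lr * grad_W1
        W2 -= lr * grad_W2
    return W1

def pretrain_deep_encoder(X, layer_sizes):
    """Greedy layer-by-layer pretraining: train each layer as a shallow
    autoencoder on the codes produced by the layers below it. The resulting
    weights would then initialize a deep autoencoder for end-to-end
    fine-tuning (omitted in this sketch)."""
    weights, codes = [], X
    for n_hidden in layer_sizes:
        W = train_autoencoder_layer(codes, n_hidden)
        weights.append(W)
        codes = sigmoid(codes @ W)   # input for the next layer
    return weights, codes            # `codes` is the low-dimensional code

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.random((256, 64))                        # toy data: 256 samples, 64 dims
    _, code = pretrain_deep_encoder(X, [32, 16, 2])
    print(code.shape)                                # (256, 2): reduced to 2 dimensions
```

In [1] the layer-wise pretrained weights are unrolled into a deep encoder-decoder and fine-tuned with backpropagation; the pretraining stage supplies the good initial weights that the first research point above refers to.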