How to Reduce the Dimensionality of Data with Deep Neural Networks?
2019/12/13

◆ 2012: ImageNet Recognition Challenge. Hinton's team won with a 16% error rate in classifying 1.2 million images, against a 26% error rate by its closest competitor.
◆ 2006: Hinton and Salakhutdinov proposed a learning method for deep neural networks. Subsequently (June 26, 2012, news), researchers at Google and Stanford University built what was then the largest neural network: a nine-layer deep neural network that learned on its own from images found on the internet. The system, intended to simulate the human brain, had one billion connections and was trained for three days on 10 million images using 16,000 cores. After three days of unsupervised learning, it formed the concept of a cat. (Oct. 2012, news) Microsoft demonstrated a fully automated interpretation system at the 21st Century Computing Conference; the key technology behind this system is the DNN (deep neural network).
◆ 2013: Hinton joined Google; Yann LeCun joined Facebook; Baidu established its Institute of Deep Learning (IDL).
◆ 2014: Andrew Ng joined Baidu.

Recent Advances of Deep Learning
◆ 2015: Journal Citation Reports for SCI.

Three Ideas of CNN
1. Local perception (local receptive fields)
Fig. 1 Connection pattern of conventional neural nets
Fig. 2 Connection pattern of convolutional neural nets
2. Shared weights
Fig. 3 Single convolution kernel
Weight sharing reduces the complexity of the network and the amount of calculation.
Fig. 4 Multiple convolution kernels
Fig. 5 Inside a convolutional neural network
3. Secondary extraction of the feature maps (subsampling)

Research points that can produce new ideas
※ Pretraining algorithms for initial weights
※ The architecture of deep neural networks
※ Layer-by-layer learning algorithms

Reference
[1] G. E. Hinton, R. R. Salakhutdinov. Reducing the Dimensionality of Data with Neural Networks[J]. Science, 2006, 313(5786): 504-507.
[2] Yann LeCun, Yoshua Bengio, Geoffrey Hinton. Deep learning[J]. Nature, 2015, 521(7553): 436-444.
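The three ideas of CNNs listed above can be sketched in a few lines of NumPy. This is a minimal illustration, not an implementation from the poster: the 28x28 input, the 3x3 kernel, and the 2x2 pooling window are assumed example sizes. One kernel slides over the whole image (weight sharing), each output value depends only on a small patch (local perception), and max pooling performs the secondary extraction of the feature map.

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Valid 2-D correlation with a single shared kernel.
    Every output pixel reuses the same kernel weights (weight sharing),
    and each output pixel sees only a small local patch (local perception)."""
    H, W = image.shape
    kh, kw = kernel.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2(fmap):
    """2x2 max pooling: the 'secondary extraction' of the feature map."""
    H, W = fmap.shape
    return fmap[:H // 2 * 2, :W // 2 * 2].reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

image = np.random.rand(28, 28)      # assumed example: one 28x28 grayscale image
kernel = np.random.rand(3, 3)       # one shared 3x3 convolution kernel

fmap = conv2d_valid(image, kernel)  # 26x26 feature map
pooled = max_pool2(fmap)            # 13x13 after subsampling

# Weight sharing is where the savings come from: this layer has only the
# 9 kernel weights, whereas one fully connected unit over the same input
# would already need 28*28 = 784 weights.
print(fmap.shape, pooled.shape, kernel.size)
```

The nested Python loops are kept for clarity; a real CNN library computes the same operation with optimized batched routines, but the parameter count and the shrinking feature-map sizes are exactly what the figures above depict.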