Using Recurrent Neural Networks for Time Series Forecasting

Sandy D. Balkin
Pennsylvania State University
Department of Management Science and Information Systems
University Park, Pennsylvania 16802
e-mail: sxb31@psu.edu

Abstract. In the past few years, artificial neural networks (ANNs) have been investigated as a tool for time series analysis and forecasting. The most popular architecture is the multilayer perceptron, a feedforward network often trained by back-propagation. The forecasting performance of ANNs relative to traditional methods is still open to question, although many experimenters seem optimistic. One problem with the multilayer perceptron is that, in its simplest form, it is similar to a pure autoregressive type model, so it lacks the ability to account for any moving average structure that may exist. By making a network recurrent, it is possible to include such structure. We present several examples showing how an ANN can be used to represent an ARMA scheme and compare the forecasting abilities of feedforward and recurrent neural networks with traditional methods.

Working Paper Series number 97-11. This paper was presented at the 1997 International Symposium on Forecasting, Barbados.

1. Introduction

An Artificial Neural Network (ANN) is an information processing paradigm that is inspired by the way the brain processes information. The key element of this paradigm is the novel structure of the information processing system. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. ANNs, like people, learn by example. An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process. Learning in biological systems involves adjustments to the synaptic connections that exist between the neurons. This is true of ANNs as well. For an introduction to ANNs see, for example, Hinton (1988), Lippmann (1987), or Cheng and Titterington (1994).

Recently, ANNs have been investigated as a tool for time series forecasting. Examples include Tang, et al. (1991), Hill, et al. (1996), and Faraway and Chatfield (1995). The most popular class is the multilayer perceptron, a feedforward network trained by backpropagation. This class of network consists of an input layer, a number of hidden layers, and an output layer. Figure 1 is an example of an ANN of this type with 3 neurons in the input layer, a single hidden layer with 2 neurons, and 1 neuron in the output layer.

The first step in using an ANN concerns the choice of structure, known as the architecture of the network. Neural networks are known to be universal function approximators (Hornik, Stinchcombe, & White, 1989); however, for an ANN to perform well, the inputs and the number of nodes in the hidden layer must be carefully chosen. Since they learn by example, it is vital that the inputs characterize the important relationships and correlations in the time series being modeled.

For a feedforward neural network with one hidden layer, the prediction equation, given by Faraway and Chatfield (1995), for computing forecasts x̂_t using selected past observations x_{t-j_1}, ..., x_{t-j_k} at lags (j_1, ..., j_k) with h nodes in the hidden layer will be referred to as NN[j_1, ..., j_k; h]. Thus, the network in Figure 1 is NN[1, 12, 13; 2].

[Figure 1: A feedforward neural network with input nodes x_{t-1}, x_{t-12}, x_{t-13} and a constant, a single hidden layer, and one output node.]

The functional form may be written as

    x̂_t = φ_o( w_{co} + Σ_h w_{ho} φ_h( w_{ch} + Σ_i w_{ih} x_{t-j_i} ) )

where {w_{ch}} denote the weights for the connections between the constant input and the hidden neurons, and w_{co} denotes the weight of the direct connection between the constant input and the output. The weights {w_{ih}} and {w_{ho}} denote the weights for the other connections, between the inputs and the hidden neurons and between the hidden neurons and the output, respectively. The two functions φ_h and φ_o denote the activation functions that define the mappings from inputs to hidden nodes and from hidden nodes to output(s), respectively. The logistic function, y = φ(x) = 1/(1 + e^{-x}), is widely used in the ANN literature. Faraway and Chatfield (1995) recommend, for forecasting applications, that φ_h be logistic and φ_o linear.
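To make the prediction equation concrete, the following sketch evaluates it for the NN[1, 12, 13; 2] architecture of Figure 1, with a logistic hidden activation and a linear output as recommended above. It is a minimal illustration in Python with numpy; the function names and the randomly drawn weights are invented for the example and are not taken from the paper.

    import numpy as np

    def logistic(z):
        # phi_h: the logistic activation used for the hidden layer
        return 1.0 / (1.0 + np.exp(-z))

    def nn_forecast(x, lags, w_ih, w_ch, w_ho, w_co):
        """One-step-ahead forecast of x at time t from past values at the given lags.

        x    : 1-D array holding the series up to time t-1
        lags : (j_1, ..., j_k), e.g. (1, 12, 13)
        w_ih : (k, h) input-to-hidden weights
        w_ch : (h,)   constant-to-hidden weights
        w_ho : (h,)   hidden-to-output weights
        w_co : scalar constant-to-output weight
        """
        inputs = np.array([x[-j] for j in lags])    # x_{t-j_1}, ..., x_{t-j_k}
        hidden = logistic(w_ch + inputs @ w_ih)     # phi_h applied node-wise
        return w_co + hidden @ w_ho                 # phi_o linear

    # Illustrative weights for NN[1, 12, 13; 2] (k = 3 inputs, h = 2 hidden nodes)
    rng = np.random.default_rng(0)
    series = rng.normal(size=100)
    x_hat = nn_forecast(series, lags=(1, 12, 13),
                        w_ih=rng.normal(size=(3, 2)),
                        w_ch=rng.normal(size=2),
                        w_ho=rng.normal(size=2),
                        w_co=0.0)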
Forecasts are generated iteratively by performing successive one-step-ahead forecasts, using previous forecasts as estimates of observables. This approach represents a neural network implementation of a non-linear autoregressive (NLAR) model of order k,

    x_t = f(x_{t-j_1}, x_{t-j_2}, ..., x_{t-j_k}, ε_t)

where {ε_t} is a sequence of IID random variables and f: ℝ^{k+1} → ℝ. It should be noted that the literature on ANNs rarely speaks in terms of random variables, or of any assumptions about them. However, almost all applications use the least squares criterion for fitting so that, by implication, the errors are assumed to be at least uncorrelated with equal variances. Further, the use of ANNs for forecasting requires constancy of model structure for future observations, implying temporal invariance of f.

This NLAR network, in its simplest form, will have zero nodes in the hidden layer. The resulting prediction equation for computing a forecast for x_t becomes

    x̂_t = φ_o( w_{co} + Σ_i w_{io} x_{t-j_i} )

where φ_o is a linear function, w_{co} denotes the weight of the direct connection between the constant input and the output, and {w_{io}} are the weights of the connections between the input neurons and the output neuron. This equation corresponds to the traditional autoregressive (AR) forecasting function

    x̂_t = c + φ_1 x_{t-j_1} + φ_2 x_{t-j_2} + ... + φ_k x_{t-j_k}

where k is the number of input nodes in the corresponding neural network at lags (j_1, ..., j_k). So, in its simplest form, the neural network architecture reduces to the standard AR type model.
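As a small illustration of the points above, the sketch below (again Python with numpy; the weight values and function names are invented for the example) computes the zero-hidden-node forecast, which coincides with the AR forecast function, and generates multi-step forecasts iteratively by feeding each one-step-ahead forecast back in as an estimate of the corresponding future observable.

    import numpy as np

    def ar_network_forecast(history, lags, w_io, w_co):
        # Zero-hidden-node network: x_hat_t = w_co + sum_i w_io x_{t-j_i},
        # i.e. the AR forecast c + phi_1 x_{t-j_1} + ... + phi_k x_{t-j_k}
        inputs = np.array([history[-j] for j in lags])
        return w_co + inputs @ w_io

    def iterate_forecasts(history, lags, w_io, w_co, steps):
        # Successive one-step-ahead forecasts, with each forecast appended to the
        # history so that it stands in for the as-yet-unobserved value it estimates
        extended = list(history)
        forecasts = []
        for _ in range(steps):
            f = ar_network_forecast(np.asarray(extended), lags, w_io, w_co)
            forecasts.append(f)
            extended.append(f)
        return np.array(forecasts)

    # Example: lags (1, 12, 13), with w_io playing the role of (phi_1, phi_2, phi_3)
    # and w_co the role of the constant c
    history = np.sin(np.arange(60) * 2 * np.pi / 12)
    print(iterate_forecasts(history, lags=(1, 12, 13),
                            w_io=np.array([0.5, 0.3, -0.2]), w_co=0.1, steps=6))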
