Extracting Rules from a Fuzzy/Crisp Recurrent Neural Network using a Self-Organizing Map


A. Blanco, M. Delgado, M. C. Pegalajar*
Departamento de Ciencias de la Computación e Inteligencia Artificial, E.T.S.I. Informática, Universidad de Granada, Avenida de Andalucía 38, 18071 Granada, Spain

*Author to whom all correspondence should be addressed; e-mail: mcarmen@decsai.ugr.es

International Journal of Intelligent Systems, Vol. 15, 595-621 (2000). © 2000 John Wiley & Sons, Inc.

Although the extraction of symbolic knowledge from trained feedforward neural networks has been widely studied, research on recurrent neural networks (RNNs) has been more neglected, even though they perform better in areas such as control, speech recognition, time series prediction, etc. Nowadays, a subject of particular interest is crisp/fuzzy grammatical inference, in which the application of these neural networks has proven to be suitable. In this paper, we present a method using a self-organizing map (SOM) for extracting knowledge from a recurrent neural network able to infer a crisp/fuzzy regular language. Identification of this language is done only from a crisp/fuzzy example set of the language. © 2000 John Wiley & Sons, Inc.

1. INTRODUCTION

Even though artificial neural networks (ANNs) have demonstrated their effectiveness in many learning problems, understanding how they represent the knowledge they capture is difficult. A method for producing a suitable rule base to describe an ANN should therefore provide more than just the identification of a system; it should also offer a simple explanation of the internal mechanism of the ANN, providing an interpretation of it. A learning method must build a suitable model for an example set, but the resulting model must also be easy to understand.1-6

In this work, we focus on knowledge extraction from recurrent neural networks (RNNs),7-10 which are currently being widely studied. A problem also dealt with herein is fuzzy/crisp grammatical inference,11,12 since it is an appropriate test for investigating some of the fundamental issues in RNNs, such as training algorithms, knowledge representation and extraction, etc. RNNs with time-discrete inputs are able to learn deterministic finite automata from examples. In their study of these neural networks, Cleeremans et al.13 concluded that the hidden unit activations represent past histories and that the activation clusters can represent the sought-for states of the automaton. Starting from this assumption, the problem of extracting the rules associated with the deterministic finite automaton (DFA) learned from the training examples is reduced to exploring and analyzing the clusters in the output space of the hidden recurrent neurons. We have developed a method that detects these clusters (DFA states) and then extracts the transitions between the states of the automaton learned by the RNN.

Our method uses Kohonen's self-organizing map (SOM),14 since, in our opinion, it is especially suitable for this task. The SOM tends to cluster patterns with similar characteristics, with incoming patterns being classified by the units they activate in the competitive layer. Similarities among patterns are mapped into closeness relationships on the competitive layer grid. After training is complete, pattern relationships and groupings can be observed on the competitive layer.

In certain cases the available examples have an associated uncertainty or vagueness, in which case fuzzy grammars15 are best able to express such uncertainties, and the neural networks are able to infer these grammars. To extract the fuzzy automaton associated with the fuzzy examples, we have developed a method that also uses the SOM.

This paper provides a brief introduction to the grammatical inference problem, the architecture of second-order recurrent neural networks, and their training in Section 2. The extraction algorithm for the automata is discussed in detail in Section 3. In Section 4, we focus on the fuzzy grammatical inference problem and the neural model used for fuzzy grammatical inference. The extraction algorithm for fuzzy automata from the trained network is presented in Section 5. Finally, some conclusions are provided.
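To make the idea described above concrete, the following is a minimal sketch (not the extraction algorithm detailed in Section 3) of how hidden-state vectors from a trained RNN could be clustered with a small SOM and read off as DFA states and transitions. The hidden-state trajectories, grid size, and the q_start label are illustrative stand-ins; in practice the vectors would come from running the trained second-order network over the example strings.

```python
# Sketch: cluster RNN hidden-state vectors with a tiny SOM and treat each
# winning grid unit as a candidate DFA state; transitions are read off from
# consecutive hidden states and the symbol that caused each step.
# The trajectories below are random stand-in data, not output of a real RNN.
import numpy as np

rng = np.random.default_rng(0)

hidden_dim, grid_w, grid_h = 4, 3, 3
trajectories = [
    [(rng.random(hidden_dim), sym) for sym in "0110"],   # one (hidden state, symbol) pair per symbol read
    [(rng.random(hidden_dim), sym) for sym in "101"],
]

# --- Train a small SOM on all hidden-state vectors --------------------------
weights = rng.random((grid_w * grid_h, hidden_dim))       # one prototype per grid unit
coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)], dtype=float)
samples = np.array([h for traj in trajectories for h, _ in traj])

def winner(x):
    """Index of the SOM unit whose prototype is closest to x."""
    return int(np.argmin(np.linalg.norm(weights - x, axis=1)))

for t in range(200):                                      # simple decaying learning rate and radius
    lr = 0.5 * (1 - t / 200)
    radius = 1.5 * (1 - t / 200) + 0.5
    x = samples[rng.integers(len(samples))]
    w = winner(x)
    dist = np.linalg.norm(coords - coords[w], axis=1)
    influence = np.exp(-(dist ** 2) / (2 * radius ** 2))
    weights += lr * influence[:, None] * (x - weights)

# --- Read off a DFA: each winning unit plays the role of an automaton state --
transitions = {}                                          # (state, symbol) -> next state
for traj in trajectories:
    prev_state = "q_start"                                # hypothetical start-state label
    for h, sym in traj:
        state = winner(h)
        transitions[(prev_state, sym)] = state
        prev_state = state

for (src, sym), dst in sorted(transitions.items(), key=str):
    print(f"delta({src}, {sym}) = {dst}")
```

In this sketch each winning SOM unit stands for one automaton state; if the same (state, symbol) pair were observed leading to different winning units, that would indicate the grid is too coarse to separate the underlying activation clusters.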
2. CRISP GRAMMATICAL INFERENCE USING RECURRENT NEURAL NETWORKS

The grammatical inference problem has been extensively studied11,12,16 in recent years due to its many fields of application: pattern recognition, information retrieval, design of programming languages, compilation and translation, graphic languages, man-machine communication, etc. Before dealing with the grammatical inference problem using neural networks, we give a set of definitions needed to understand the subsequent development of this work.

2.1. Definitions

DEFINITION 2.1.1. A regular grammar, G, is a four-tuple G = (N, T, P, S), where N is a finite set of nonterminal symbols, T is a finite set of terminal symbols, P is a finite set of productions of the form A → aB or A → a, where A, B ∈ N and a ∈ T, and S is the starting symbol, S ∈ N.

DEFINITION 2.1.2. A deterministic finite automaton (DFA) is a structure M = (Q, Σ, δ, q0, F), where Q is a finite set whose elements are called states; Σ is a finite set, the input alphabet; and δ: Q × Σ → Q is the transition function (recall that Q × Σ is the set of ordered pairs {(q, a) | q ∈ Q and a ∈ Σ}). Intuitively, δ is a function that tells a state where to move in response to an input: if M is in state q and sees input a, it moves to state δ(q, a). q0 ∈ Q is the starting state, and F is a subset of Q whose elements are called accepting (final) states.

DEFINITION 2.1.3. A string x is accepted by the DFA M (and hence is a member of the regular language L(M)) if an accepting state is reached after string x has been read by M.

THEOREM 2.1.1. If a language L is generated by a regular grammar G, then there exists a DFA M such that L = L(M).
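As a small illustration of Definitions 2.1.2 and 2.1.3, the fragment below encodes a DFA M = (Q, Σ, δ, q0, F) directly as Python data and checks acceptance of a string by following δ from the start state. The particular automaton (binary strings containing an even number of 1s) is a standard example, not one taken from the paper.

```python
# DFA M = (Q, Sigma, delta, q0, F) accepting binary strings with an even number of 1s.
Q = {"even", "odd"}                      # states
Sigma = {"0", "1"}                       # input alphabet
delta = {                                # transition function delta: Q x Sigma -> Q
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}
q0, F = "even", {"even"}                 # start state and set of accepting states

def accepts(x: str) -> bool:
    """True iff an accepting state is reached after the whole string x has been read."""
    q = q0
    for a in x:
        q = delta[(q, a)]
    return q in F

print(accepts("0110"))   # True:  two 1s, so "0110" is in L(M)
print(accepts("101"))    # True:  two 1s
print(accepts("10"))     # False: one 1
```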
