A Tutorial on Python's hmmlearn Library for Hidden Markov Models


hmmlearn implements Hidden Markov Models (HMMs). The HMM is a generative probabilistic model, in which a sequence of observable \(\mathbf{X}\) variables is generated by a sequence of internal hidden states \(\mathbf{Z}\). The hidden states are not observed directly. The transitions between hidden states are assumed to have the form of a (first-order) Markov chain. They can be specified by the start probability vector \(\boldsymbol{\pi}\) and a transition probability matrix \(\mathbf{A}\). The emission probability of an observable can be any distribution with parameters \(\boldsymbol{\theta}\) conditioned on the current hidden state. The HMM is completely determined by \(\boldsymbol{\pi}\), \(\mathbf{A}\) and \(\boldsymbol{\theta}\).

There are three fundamental problems for HMMs:

- Given the model parameters and observed data, estimate the optimal sequence of hidden states.
- Given the model parameters and observed data, calculate the likelihood of the data.
- Given just the observed data, estimate the model parameters.

The first and the second problem can be solved by the dynamic programming algorithms known as the Viterbi algorithm and the Forward-Backward algorithm, respectively. The last one can be solved by an iterative Expectation-Maximization (EM) algorithm, known as the Baum-Welch algorithm.

References:

[Rabiner89] Lawrence R. Rabiner, "A tutorial on hidden Markov models and selected applications in speech recognition", Proceedings of the IEEE 77.2, pp. 257-286, 1989.
[Bilmes98] Jeff A. Bilmes, "A gentle tutorial of the EM algorithm and its application to parameter estimation for Gaussian mixture and hidden Markov models", 1998.

Available models:

- hmm.GaussianHMM: Hidden Markov Model with Gaussian emissions.
- hmm.GMMHMM: Hidden Markov Model with Gaussian mixture emissions.
- hmm.MultinomialHMM: Hidden Markov Model with multinomial (discrete) emissions.

Read on for details on how to implement an HMM with a custom emission probability.

Building HMM and generating samples

You can build an HMM instance by passing the parameters described above to the constructor. Then, you can generate samples from the HMM by calling sample:

```python
import numpy as np
from hmmlearn import hmm

np.random.seed(42)

model = hmm.GaussianHMM(n_components=3, covariance_type="full")
model.startprob_ = np.array([0.6, 0.3, 0.1])
model.transmat_ = np.array([[0.7, 0.2, 0.1],
                            [0.3, 0.5, 0.2],
                            [0.3, 0.3, 0.4]])
model.means_ = np.array([[0.0, 0.0], [3.0, -3.0], [5.0, 10.0]])
model.covars_ = np.tile(np.identity(2), (3, 1, 1))
X, Z = model.sample(100)
```

The transition probability matrix need not be ergodic. For instance, a left-right HMM can be defined as follows:

```python
lr = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                     init_params="cm", params="cmt")
lr.startprob_ = np.array([1.0, 0.0, 0.0])
lr.transmat_ = np.array([[0.5, 0.5, 0.0],
                         [0.0, 0.5, 0.5],
                         [0.0, 0.0, 1.0]])
```

If any of the required parameters are missing, sample will raise an exception:

```python
model = hmm.GaussianHMM(n_components=3)
X, Z = model.sample(100)
# Traceback (most recent call last):
#   ...
# sklearn.utils.validation.NotFittedError: This GaussianHMM instance is not
# fitted yet. Call 'fit' with appropriate arguments before using this method.
```

Fixing parameters

Each HMM parameter has a character code which can be used to customize its initialization and estimation. The EM algorithm needs a starting point to proceed, so prior to training each parameter is assigned a value, either random or computed from the data. It is possible to hook into this process and provide a starting point explicitly. To do so:

1. ensure that the character code for the parameter is missing from init_params, and then
2. set the parameter to the desired value.

For example, consider an HMM with an explicitly initialized transition probability matrix:

```python
model = hmm.GaussianHMM(n_components=3, n_iter=100, init_params="mcs")
model.transmat_ = np.array([[0.7, 0.2, 0.1],
                            [0.3, 0.5, 0.2],
                            [0.3, 0.3, 0.4]])
```

A similar trick applies to parameter estimation. If you want to fix some parameter at a specific value, remove the corresponding character from params and set the parameter value before training.

Examples: Sampling from HMM

Training HMM parameters and inferring the hidden states

You can train an HMM by calling the fit method. The input is a matrix of concatenated sequences of observations (aka samples) along with the lengths of the sequences (see Working with multiple sequences). Note that since the EM algorithm is a gradient-based optimization method, it will generally get stuck in local optima. You should in general try to run fit with various initializations and select the highest scored model. The score of the model can be calculated by the score method.

The inferred optimal hidden states can be obtained by calling the predict method. The predict method can be specified with a decoder algorithm. Currently the Viterbi algorithm ("viterbi") and maximum a posteriori estimation ("map") are supported. This time, the input is a single sequence of observed values. Note that the states in remodel will have a different order than those in the generating model:

```python
remodel = hmm.GaussianHMM(n_components=3, covariance_type="full", n_iter=100)
remodel.fit(X)
# GaussianHMM(algorithm='viterbi', ...
Z2 = remodel.predict(X)
```

Monitoring convergence

The number of EM algorithm iterations is upper bounded by the n_iter parameter. The training proceeds until n_iter steps were performed or the change in score is lower than the specified threshold tol. Note that, depending on the data, the EM algorithm may or may not achieve convergence in the given number of steps. You can use the monitor_ attribute to diagnose convergence:

```python
remodel.monitor_
# ConvergenceMonitor(history=[...], iter=12, n...
```
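To make the generative story behind sample concrete, here is a minimal pure-NumPy sketch of how a Gaussian-emission HMM produces data: draw the first state from \(\boldsymbol{\pi}\), walk the Markov chain according to \(\mathbf{A}\), and emit from the Gaussian of the current state. The function name sample_hmm is our own illustration, not part of hmmlearn's API.

```python
import numpy as np

def sample_hmm(pi, A, means, covs, n_samples, rng):
    """Draw (X, Z) from a Gaussian-emission HMM, mirroring the idea of sample()."""
    n_states = len(pi)
    Z = np.empty(n_samples, dtype=int)
    Z[0] = rng.choice(n_states, p=pi)            # initial state from pi
    for t in range(1, n_samples):
        Z[t] = rng.choice(n_states, p=A[Z[t - 1]])  # Markov-chain step
    # Emit one Gaussian observation per visited state.
    X = np.array([rng.multivariate_normal(means[z], covs[z]) for z in Z])
    return X, Z
```

For example, with a start vector concentrated on state 0 and an identity transition matrix, every sampled state is 0 and all observations come from the first Gaussian.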
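The score method corresponds to the second fundamental problem above: computing the likelihood of the data given the model. As a hedged illustration of what happens under the hood, here is a minimal NumPy sketch of the scaled forward algorithm for a discrete-emission HMM (the function name forward_log_likelihood and the discrete setting are our choices for exposition; hmmlearn implements this internally for its own emission models):

```python
import numpy as np

def forward_log_likelihood(pi, A, B, obs):
    """Log-likelihood of a discrete observation sequence under an HMM.

    pi : (n,) start probabilities; A : (n, n) transition matrix;
    B : (n, m) emission probabilities; obs : sequence of symbol indices.
    """
    alpha = pi * B[:, obs[0]]        # joint prob. of first symbol and each state
    log_lik = 0.0
    for symbol in obs[1:]:
        # Normalize at each step to avoid underflow, accumulating the log scale.
        scale = alpha.sum()
        log_lik += np.log(scale)
        alpha = (alpha / scale) @ A * B[:, symbol]
    return log_lik + np.log(alpha.sum())
```

On a tiny model this agrees with brute-force summation over all hidden-state paths, which is a handy sanity check since the number of paths grows exponentially in the sequence length.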
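Likewise, predict with the "viterbi" decoder solves the first fundamental problem: finding the single most likely hidden-state path. The sketch below is a minimal log-domain Viterbi implementation for a discrete-emission HMM, again as our own illustration rather than hmmlearn's code:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most likely hidden-state path for a discrete-emission HMM (log domain)."""
    n, T = len(pi), len(obs)
    log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
    delta = log_pi + log_B[:, obs[0]]     # best log-prob of paths ending in each state
    psi = np.zeros((T, n), dtype=int)     # back-pointers
    for t in range(1, T):
        cand = delta[:, None] + log_A     # cand[i, j]: best path into i, then i -> j
        psi[t] = cand.argmax(axis=0)
        delta = cand.max(axis=0) + log_B[:, obs[t]]
    # Backtrack from the best final state.
    path = [int(delta.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]
```

Dynamic programming gives the same answer as exhaustively scoring every path, but in \(O(Tn^2)\) time instead of \(O(n^T)\).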
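Finally, fit addresses the third problem via Baum-Welch, and the convergence behavior tracked by monitor_ follows from a key EM property: the log-likelihood never decreases between iterations. The sketch below shows one scaled Baum-Welch update for a discrete-emission HMM; the function name and setting are our illustration (hmmlearn's fit performs the analogous updates for its emission models), and the returned log-likelihood, evaluated under the input parameters, is the kind of quantity a convergence monitor records per step.

```python
import numpy as np

def baum_welch_step(pi, A, B, obs):
    """One scaled Baum-Welch (EM) update for a discrete-emission HMM.

    Returns updated (pi, A, B) and the log-likelihood of obs under the
    *input* parameters, which must not decrease across repeated calls.
    """
    obs = np.asarray(obs)
    n, T = len(pi), len(obs)
    alpha = np.zeros((T, n))
    c = np.zeros(T)                          # per-step scaling factors
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):                    # scaled forward pass
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]
    beta = np.zeros((T, n))                  # scaled backward pass
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]
    gamma = alpha * beta                     # state posteriors P(z_t | obs)
    # Pairwise posteriors xi[t, i, j] = P(z_t = i, z_{t+1} = j | obs).
    xi = (alpha[:-1, :, None] * A[None, :, :]
          * (B[:, obs[1:]].T * beta[1:])[:, None, :]) / c[1:, None, None]
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.vstack([gamma[obs == k].sum(axis=0)
                       for k in range(B.shape[1])]).T
    new_B /= gamma.sum(axis=0)[:, None]
    return new_pi, new_A, new_B, np.log(c).sum()
```

Iterating this update until the change in log-likelihood drops below a tolerance is exactly the stopping rule described above for n_iter and tol.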
