Effect of Neural Networks on Selected Features by Meta-Heuristic Algorithms (IJMSC-V2-N3-4)

I.J. Mathematical Sciences and Computing, 2016, 3, 41-48
Published Online July 2016 in MECS
DOI: 10.5815/ijmsc.2016.03.04

Index Terms: Feature selection, data mining, algorithm cluster, heuristic methods.

© 2016 Published by MECS Publisher. Selection and/or peer review under responsibility of the Research Association of Modern Education and Computer Science.

*Corresponding author. Tel.: 09304586038. E-mail address: may.toghraee@yahoo.com

1. Introduction

Manual processing of data has become impracticable because of fast-growing technologies, and even machine learning and knowledge extraction techniques struggle with such large databases. Managing very high-dimensional data is one of the common challenges in knowledge extraction and machine learning, and feature selection is among the most challenging and most important activities in machine learning and pattern recognition; it has been studied both in machine learning and in statistical pattern recognition. The issue matters in many applications (e.g. classification), since these applications involve a large number of features, many of which are useless or uninformative. Eliminating such features does not change the informative content of the data, but it does reduce the computational cost of the application and avoids storing a large amount of useless information alongside the useful data.

Not eliminating the useless features causes dimensionality problems. The dimensionality problem says that as the number of dimensions grows, the data points (samples) tend to drift apart: the higher the number of dimensions (features), the larger the random component of the distance between samples. As a result, pairwise distances are inflated by the dimensionality and become less representative of the real distances between samples, so the quality of classification or clustering degrades. Stated another way, some clusters or classes in the feature space are coherent only with respect to a few specific features. The sketch below illustrates this distance-concentration effect.
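The following minimal sketch (an illustration added for clarity, not code from the paper) shows the effect numerically with NumPy and SciPy: for points drawn uniformly at random, the relative contrast between the farthest and the nearest pairwise distance shrinks as the number of features grows, so distances carry less and less information.

import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

for n_features in (2, 10, 100, 1000):
    X = rng.random((100, n_features))         # 100 random samples with n_features dimensions
    d = pdist(X)                              # all pairwise Euclidean distances
    contrast = (d.max() - d.min()) / d.min()  # relative contrast between farthest and nearest pair
    print(f"{n_features:5d} features: relative contrast = {contrast:7.3f}")

As the printed contrast approaches zero, the nearest and the farthest neighbour become almost indistinguishable, which is one way to see why classification and clustering quality drops in very high-dimensional feature spaces.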
Three general approaches have been proposed to overcome this dimensionality problem: (a) using subspaces specified for each cluster or class by the user, (b) using feature selection or dimension-reduction methods such as principal component analysis, and (c) using subspace clustering or subspace classification methods. This report discusses the feature selection methods of approach (b).

Many solutions and algorithms have been proposed for the feature selection problem, some of which are 30 or 40 years old. When these algorithms were first presented, their main drawback was their computational cost. Fast computers and large storage have since made that concern less pressing; at the same time, the large data sets of new problems have made it important to find a fast algorithm for this task.

Feature selection comes in two types:
a) Supervised feature selection: class labels are used during the feature selection algorithm (Zhao & Liu 2007).
b) Unsupervised feature selection: class labels are not used during the feature selection algorithm (G.D 2008).

The research domain here is limited to supervised feature selection, where labels are used during the feature selection algorithm. The purpose of feature selection is to reduce the dimensionality of a data set to a smaller subset of its features; this subset of the original features is known as the best subset of features. A target function serves as the quality criterion, and it can express different goals depending on the underlying hypothesis; the quality can be measured, for example, as classification accuracy. Using local search methods and nature-inspired search algorithms, we pursue the following goals:
a) Discuss the efficiency of feature selection methods.
b) Explain the evaluation algorithm of this research.
c) Report tests and results obtained with these methods on real data sets.

2. Efficiency Function Methods

The efficiency functions used to evaluate candidate subsets fall into (1) wrapper methods, (2) embedded methods, and (3) filter methods.

Wrapper methods: To assess each candidate feature subset, a classification model is built on those features (trained and tested), and the accuracy of the trained classifier on a distinct experimental (held-out) set is taken as the quality of that subset; a minimal sketch follows.
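The sketch below (an illustration, not the authors' implementation) combines this wrapper evaluation with a simple greedy forward search over subsets, one basic form of the local search mentioned in the introduction. The breast cancer data set from scikit-learn and the k-nearest-neighbour classifier are stand-ins chosen only to keep the example self-contained; they are not the data sets or the neural network model used in the paper.

from sklearn.datasets import load_breast_cancer        # stand-in data set, not the paper's
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier     # stand-in for the wrapped classifier
from sklearn.metrics import accuracy_score

def wrapper_score(subset, X_tr, y_tr, X_te, y_te):
    # Target function of the wrapper: train a classifier on the candidate subset only
    # and return its accuracy on a distinct held-out set.
    clf = KNeighborsClassifier(n_neighbors=5)
    clf.fit(X_tr[:, subset], y_tr)
    return accuracy_score(y_te, clf.predict(X_te[:, subset]))

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

selected, remaining, best_score = [], list(range(X.shape[1])), 0.0
while remaining:                                        # greedy forward selection
    scores = {f: wrapper_score(selected + [f], X_tr, y_tr, X_te, y_te) for f in remaining}
    f_best = max(scores, key=scores.get)
    if scores[f_best] <= best_score:                    # stop when no remaining feature helps
        break
    best_score = scores[f_best]
    selected.append(f_best)
    remaining.remove(f_best)

print("selected features:", selected, "held-out accuracy:", round(best_score, 3))

A metaheuristic feature selection method replaces the greedy loop with a population-based, nature-inspired search over subsets, while the per-subset wrapper evaluation stays the same.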
