I. Attribute selection

1. Theoretical background
See the following two articles:
(a) 数据挖掘中的特征选择算法综述及基于WEKA的性能比较 (A survey of feature selection algorithms in data mining and a WEKA-based performance comparison), Chen Lianglong
(b) 数据挖掘中约简技术与属性选择的研究 (Research on reduction techniques and attribute selection in data mining), Liu Hui

2. Attribute selection in WEKA

2.1 Evaluation strategies (attribute evaluator)
Evaluators fall broadly into filter and wrapper methods: filter methods score individual attributes, while wrapper methods score feature subsets.
Wrapper (subset) evaluators: CfsSubsetEval, WrapperSubsetEval.
Filter (single-attribute) evaluators: CorrelationAttributeEval, GainRatioAttributeEval, InfoGainAttributeEval, OneRAttributeEval, PrincipalComponents, ReliefFAttributeEval, SymmetricalUncertAttributeEval.

2.1.1 Wrapper methods
(1) CfsSubsetEval
Evaluates an attribute subset by the predictive ability of each individual feature together with the redundancy among them; subsets whose features are individually predictive yet mutually uncorrelated score well.
Evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them. Subsets of features that are highly correlated with the class while having low intercorrelation are preferred. For more information see: M. A. Hall (1998). Correlation-based Feature Subset Selection for Machine Learning. Hamilton, New Zealand.
(2) WrapperSubsetEval
In the wrapper approach, the downstream learning algorithm is embedded in the selection process: a feature subset is judged by the predictive performance that algorithm achieves with it, and little attention is paid to the predictive power of each individual feature. Consequently, the features in the best subset need not each be individually optimal.
Evaluates attribute sets by using a learning scheme. Cross validation is used to estimate the accuracy of the learning scheme for a set of attributes. For more information see: Ron Kohavi, George H. John (1997). Wrappers for feature subset selection. Artificial Intelligence. 97(1-2):273-324.

2.1.2 Filter methods
If one of these evaluators is chosen, the search method must be Ranker (a usage sketch follows at the end of this subsection).
(1) CorrelationAttributeEval
Selects attributes by the correlation between each individual attribute and the class.
Evaluates the worth of an attribute by measuring the correlation (Pearson's) between it and the class. Nominal attributes are considered on a value by value basis by treating each value as an indicator. An overall correlation for a nominal attribute is arrived at via a weighted average.
(2) GainRatioAttributeEval
Selects attributes by gain ratio.
Evaluates the worth of an attribute by measuring the gain ratio with respect to the class.
GainR(Class, Attribute) = (H(Class) - H(Class | Attribute)) / H(Attribute).
(3) InfoGainAttributeEval
Selects attributes by information gain.
Evaluates the worth of an attribute by measuring the information gain with respect to the class.
InfoGain(Class, Attribute) = H(Class) - H(Class | Attribute).
(4) OneRAttributeEval
Evaluates attributes using the OneR classifier.
Class for building and using a 1R classifier; in other words, uses the minimum-error attribute for prediction, discretizing numeric attributes. For more information, see: R. C. Holte (1993). Very simple classification rules perform well on most commonly used datasets. Machine Learning. 11:63-91.
(5) PrincipalComponents
Principal component analysis (PCA).
Performs a principal components analysis and transformation of the data. Use in conjunction with a Ranker search. Dimensionality reduction is accomplished by choosing enough eigenvectors to account for some percentage of the variance in the original data --- default 0.95 (95%). Attribute noise can be filtered by transforming to the PC space, eliminating some of the worst eigenvectors, and then transforming back to the original space.
(6) ReliefFAttributeEval
Evaluates attributes by their ReliefF score.
Evaluates the worth of an attribute by repeatedly sampling an instance and considering the value of the given attribute for the nearest instance of the same and different class. Can operate on both discrete and continuous class data. For more information see: Kenji Kira, Larry A. Rendell: A Practical Approach to Feature Selection. In: Ninth International Workshop on Machine Learning, 249-256, 1992. Igor Kononenko: Estimating Attributes: Analysis and Extensions of RELIEF. In: European Conference on Machine Learning, 171-182, 1994. Marko Robnik-Sikonja, Igor Kononenko: An adaptation of Relief for attribute estimation in regression. In: Fourteenth International Conference on Machine Learning, 296-304, 1997.
(7) SymmetricalUncertAttributeEval
Evaluates attributes by their symmetrical uncertainty with respect to the class.
Evaluates the worth of an attribute by measuring the symmetrical uncertainty with respect to the class.
SymmU(Class, Attribute) = 2 * (H(Class) - H(Class | Attribute)) / (H(Class) + H(Attribute)).
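As an illustration of how a filter evaluator is paired with the Ranker search, the following is a minimal sketch using the standard weka.attributeSelection API; the file name data.arff and the choice of the last attribute as the class are placeholder assumptions.

import weka.attributeSelection.AttributeSelection;
import weka.attributeSelection.InfoGainAttributeEval;
import weka.attributeSelection.Ranker;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class RankAttributesDemo {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("data.arff");        // placeholder dataset
        data.setClassIndex(data.numAttributes() - 1);         // assume class is the last attribute

        AttributeSelection selector = new AttributeSelection();
        selector.setEvaluator(new InfoGainAttributeEval());   // single-attribute (filter) evaluator
        selector.setSearch(new Ranker());                     // filter evaluators require Ranker
        selector.SelectAttributes(data);

        // rankedAttributes(): one row per attribute, {attribute index, merit (information gain)}
        for (double[] row : selector.rankedAttributes()) {
            System.out.printf("attribute %d : %.4f%n", (int) row[0], row[1]);
        }
    }
}

Swapping InfoGainAttributeEval for CorrelationAttributeEval, GainRatioAttributeEval, or any other evaluator listed in 2.1.2 leaves the rest of the code unchanged.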
2.2 Search strategies (Search Method)

2.2.1 Search methods that pair with the wrapper (subset) evaluators of 2.1.1 (a usage sketch follows at the end of this list)
(1) BestFirst
Best-first search, a greedy search strategy.
Searches the space of attribute subsets by greedy hillclimbing augmented with a backtracking facility. Setting the number of consecutive non-improving nodes allowed controls the level of backtracking done. Best first may start with the empty set of attributes and search forward, or start with the full set of attributes and search backward, or start at any point and search in both directions (by considering all possible single attribute additions and deletions at a given point).
(2) ExhaustiveSearch
Exhaustively searches all possible attribute subsets.
Performs an exhaustive search through the space of attribute subsets starting from the empty set of attributes. Reports the best subset found.
(3) GeneticSearch
Search based on the simple genetic algorithm proposed by Goldberg in 1989.
Performs a search using the simple genetic algorithm described in Goldberg (1989). For more information see: David E. Goldberg (1989). Genetic algorithms in search, optimization and machine learning. Addison-Wesley.
(4) GreedyStepwise
Stepwise forward or backward search.
Performs a greedy forward or backward search through the space of attribute subsets. May start with no/all attributes or from an arbitrary point in the space. Stops when the addition/deletion of any remaining attributes results in a decrease in evaluation. Can also produce a ranked list of attributes by traversing the space from one side to the other and recording the order that attributes are selected.
(5) RandomSearch
Random search.
Performs a Random search in the space of attribute subsets. If no start set is supplied, Random search starts from a random point and reports the best subset found. If a start set is supplied, Random searches randomly for subsets that are as good as or better than the start point with the same or fewer attributes. Using Random search in conjunction with a start set containing all attributes equates to the LVF algorithm of Liu and Setiono (ICML-96). For more information see: H. Liu, R. Setiono: A probabilistic approach to feature selection - A filter solution. In: 13th International Conference on Machine Learning, 1996.
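To tie a subset evaluator to one of these search methods, here is a minimal sketch combining CfsSubsetEval with BestFirst through the weka.attributeSelection API; as before, data.arff and the last-attribute class index are placeholder assumptions, and GreedyStepwise or GeneticSearch could be substituted for BestFirst without other changes.

import java.util.Arrays;
import weka.attributeSelection.AttributeSelection;
import weka.attributeSelection.BestFirst;
import weka.attributeSelection.CfsSubsetEval;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class SelectSubsetDemo {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("data.arff");    // placeholder dataset
        data.setClassIndex(data.numAttributes() - 1);     // assume class is the last attribute

        AttributeSelection selector = new AttributeSelection();
        selector.setEvaluator(new CfsSubsetEval());       // subset evaluator from 2.1.1
        selector.setSearch(new BestFirst());              // greedy hillclimbing with backtracking from 2.2.1
        selector.SelectAttributes(data);

        // Indices of the selected attributes (the class index is appended at the end)
        System.out.println(Arrays.toString(selector.selectedAttributes()));
        // Full textual report of the search and the selected subset
        System.out.println(selector.toResultsString());
    }
}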