Search - Information Gain - List
Decision tree generation with the C4.5 algorithm: the attribute with the largest information gain is chosen as the splitting attribute, the decision tree is grown accordingly, and decision rules are then derived from the tree.
Date : 2025-12-31 Size : 11kb User : 亚亚
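As a rough illustration of the selection criterion described above (this is not code from the package; the toy weather data and function names are assumptions), a Python sketch that computes the information gain of each attribute and picks the one with the largest gain:

    from collections import Counter
    from math import log2

    def entropy(labels):
        """Shannon entropy of a list of class labels."""
        total = len(labels)
        return -sum((n / total) * log2(n / total) for n in Counter(labels).values())

    def information_gain(rows, labels, attr_index):
        """Entropy reduction obtained by splitting on the given attribute."""
        base = entropy(labels)
        by_value = {}
        for row, label in zip(rows, labels):
            by_value.setdefault(row[attr_index], []).append(label)
        remainder = sum(len(subset) / len(labels) * entropy(subset)
                        for subset in by_value.values())
        return base - remainder

    # Toy data: (outlook, windy) -> play
    rows = [("sunny", "no"), ("sunny", "yes"), ("overcast", "no"),
            ("rain", "no"), ("rain", "yes")]
    labels = ["no", "no", "yes", "yes", "no"]

    gains = {i: information_gain(rows, labels, i) for i in range(2)}
    best = max(gains, key=gains.get)
    print("information gain per attribute:", gains)
    print("split on attribute index:", best)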

A genetic algorithm for the knapsack problem, written for the Information and Computing Science program of Nanjing University of Aeronautics and Astronautics. The program solves the knapsack problem with a genetic algorithm using a binary string encoding: 1 means an item is selected, 0 means it is not. Problem description: from M items, choose a subset to place in a knapsack of capacity W; the items have weights W1, W2, ..., Wn and corresponding values P1, P2, ..., Pn. Find the selection with the maximum total value. Note: in this problem all weights are integers.
Date : 2025-12-31 Size : 25kb User : 王杰
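A minimal sketch of the binary-encoded genetic algorithm the entry describes (the item weights, values, capacity, and GA parameters below are invented placeholders, not the original program's data):

    import random

    weights = [12, 7, 11, 8, 9]      # W1..Wn (integers, per the problem statement)
    values  = [24, 13, 23, 15, 16]   # P1..Pn
    CAPACITY, POP, GENS = 26, 30, 100

    def fitness(chrom):
        """Total value of selected items; overweight solutions score 0."""
        w = sum(wi for wi, bit in zip(weights, chrom) if bit)
        v = sum(vi for vi, bit in zip(values, chrom) if bit)
        return v if w <= CAPACITY else 0

    def crossover(a, b):
        cut = random.randint(1, len(a) - 1)      # single-point crossover
        return a[:cut] + b[cut:]

    def mutate(chrom, rate=0.05):
        return [1 - bit if random.random() < rate else bit for bit in chrom]

    population = [[random.randint(0, 1) for _ in weights] for _ in range(POP)]
    for _ in range(GENS):
        population.sort(key=fitness, reverse=True)
        parents = population[:POP // 2]          # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(POP - len(parents))]
        population = parents + children

    best = max(population, key=fitness)
    print("best selection:", best, "value:", fitness(best))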

This code implements the ID3 decision tree algorithm from data mining. By analyzing the training data, it computes how important each attribute is to the target attribute, i.e., the information gain of each attribute, and uses these values to split the training data step by step until the target classes are reached, thereby generating the decision tree. The resulting tree can then be used to classify new data and predict the likely value of its target attribute.
Date : 2025-12-31 Size : 283kb User :
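A compact sketch of the two stages the entry describes, recursive ID3 tree generation by information gain and classification of new records, under assumed names and a toy dataset:

    from collections import Counter
    from math import log2

    def entropy(labels):
        n = len(labels)
        return -sum(c / n * log2(c / n) for c in Counter(labels).values())

    def gain(rows, labels, a):
        groups = {}
        for r, y in zip(rows, labels):
            groups.setdefault(r[a], []).append(y)
        return entropy(labels) - sum(len(g) / len(labels) * entropy(g)
                                     for g in groups.values())

    def id3(rows, labels, attrs):
        """Recursively grow a tree; leaves hold a class, inner nodes an attribute index."""
        if len(set(labels)) == 1 or not attrs:
            return Counter(labels).most_common(1)[0][0]
        best = max(attrs, key=lambda a: gain(rows, labels, a))
        node = {"attr": best, "children": {}}
        for value in set(r[best] for r in rows):
            sub = [(r, y) for r, y in zip(rows, labels) if r[best] == value]
            node["children"][value] = id3([r for r, _ in sub], [y for _, y in sub],
                                          [a for a in attrs if a != best])
        return node

    def classify(node, row):
        """Walk the tree to predict the target attribute of a new record."""
        while isinstance(node, dict):
            node = node["children"].get(row[node["attr"]], "unknown")
        return node

    rows   = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "hot")]
    labels = ["no", "no", "yes", "no"]
    tree = id3(rows, labels, [0, 1])
    print(classify(tree, ("rain", "mild")))   # expected: "yes"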

This example mainly computes the information entropy of an image: it measures how compact the image's intensity distribution is, expressed as the reduction in uncertainty.
Date : 2025-12-31 Size : 7kb User : 陈雪娇
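A quick sketch of histogram-based image entropy along the lines the entry suggests (NumPy only; the random test image is a placeholder, not the original example's input):

    import numpy as np

    def image_entropy(gray):
        """Shannon entropy (bits/pixel) of an 8-bit grayscale image."""
        hist, _ = np.histogram(gray, bins=256, range=(0, 256))
        p = hist[hist > 0] / gray.size          # drop empty bins to avoid log(0)
        return float(-np.sum(p * np.log2(p)))

    # A uniform random image is close to the 8-bit maximum of 8 bits/pixel;
    # a constant image has entropy 0 (no uncertainty to reduce).
    noisy = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
    flat  = np.full((64, 64), 128, dtype=np.uint8)
    print(image_entropy(noisy), image_entropy(flat))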

Partial code for a decision tree algorithm (reads data from a file and computes the information gain).
Date : 2025-12-31 Size : 268kb User : 李志琼
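To illustrate the file-reading half, a sketch using only the standard csv module (the column layout, the file name train.csv, and the last-column-is-label convention are assumptions):

    import csv
    from collections import Counter
    from math import log2

    def entropy(labels):
        n = len(labels)
        return -sum(c / n * log2(c / n) for c in Counter(labels).values())

    def gain_from_csv(path, attr_index):
        """Read a CSV whose last column is the class label and compute the
        information gain of the attribute in column attr_index."""
        with open(path, newline="") as f:
            rows = [r for r in csv.reader(f) if r]      # skip blank lines
        labels = [r[-1] for r in rows]
        groups = {}
        for r in rows:
            groups.setdefault(r[attr_index], []).append(r[-1])
        remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
        return entropy(labels) - remainder

    # print(gain_from_csv("train.csv", 0))   # hypothetical input file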

Computes the information gain of attributes from information entropy and derives a reduced attribute set, implementing feature selection.
Date : 2025-12-31 Size : 5kb User : Jacky
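One way to read "reduced attribute set" is to rank attributes by information gain and keep only those above a threshold; a sketch under that assumption (the dataset and threshold are made up):

    from collections import Counter
    from math import log2

    def entropy(labels):
        n = len(labels)
        return -sum(c / n * log2(c / n) for c in Counter(labels).values())

    def info_gain(rows, labels, a):
        groups = {}
        for r, y in zip(rows, labels):
            groups.setdefault(r[a], []).append(y)
        return entropy(labels) - sum(len(g) / len(labels) * entropy(g)
                                     for g in groups.values())

    def select_features(rows, labels, threshold=0.1):
        """Keep only the attribute indices whose information gain exceeds threshold."""
        return [a for a in range(len(rows[0]))
                if info_gain(rows, labels, a) > threshold]

    rows   = [("a", "x", "1"), ("a", "y", "2"), ("b", "x", "1"), ("b", "y", "2")]
    labels = ["pos", "neg", "pos", "neg"]
    print(select_features(rows, labels))   # attributes 1 and 2 separate the classes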

This paper presents a comparative study of how feature selection methods affect classification performance in Chinese text categorization. Four methods are examined: document frequency (DF), information gain (IG), mutual information (MI), and the χ² test (CHI). Two classifiers, a support vector machine (SVM) and KNN, are used to evaluate the effectiveness of the different methods. The experimental results show that feature selection methods that perform well in English text categorization (IG, MI, and CHI) are not suitable for Chinese text categorization without modification. The paper analyzes the reasons for this difference theoretically and discusses possible corrections, including using a very large training corpus and combining feature selection methods. A further experiment verifies the effectiveness of the combined feature selection method.
Date : 2025-12-31 Size : 235kb User : xz
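The comparison could be reproduced in outline with scikit-learn, which ships χ² and mutual-information scorers (the tiny corpus below is a placeholder, and whether these scorers match the paper's exact definitions of CHI and MI is an assumption):

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.feature_selection import SelectKBest, chi2, mutual_info_classif

    docs = ["cheap watches buy now", "meeting schedule for monday",
            "buy cheap meds now", "project schedule and meeting notes"]
    y = [1, 0, 1, 0]                              # 1 = spam-like class, 0 = normal

    vec = CountVectorizer()
    X = vec.fit_transform(docs)                   # term-frequency document matrix
    vocab = vec.get_feature_names_out()

    for name, scorer in [("CHI", chi2), ("MI", mutual_info_classif)]:
        selector = SelectKBest(scorer, k=3).fit(X, y)
        print(name, "keeps:", list(vocab[selector.get_support()]))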

Decision tree classification: reads the data, computes the information gain ratio, and selects the best splitting attribute.
Date : 2025-12-31 Size : 63kb User : lihongjun
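A sketch of choosing the split by gain ratio rather than raw gain, where gain ratio = information gain / split information of the attribute (the data and names below are assumptions):

    from collections import Counter
    from math import log2

    def entropy(values):
        n = len(values)
        return -sum(c / n * log2(c / n) for c in Counter(values).values())

    def gain_ratio(rows, labels, a):
        """Information gain of attribute a divided by its split information."""
        groups = {}
        for r, y in zip(rows, labels):
            groups.setdefault(r[a], []).append(y)
        remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
        gain = entropy(labels) - remainder
        split_info = entropy([r[a] for r in rows])   # entropy of the attribute itself
        return gain / split_info if split_info else 0.0

    rows   = [("sunny", "high"), ("sunny", "high"), ("rain", "high"), ("rain", "low")]
    labels = ["no", "no", "yes", "yes"]
    best = max(range(2), key=lambda a: gain_ratio(rows, labels, a))
    print("best splitting attribute:", best)   # attribute 0 separates the classes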

Complete information gain ratio code, with comments; it has been run successfully and can compute information entropy and information gain.
Date : 2025-12-31 Size : 1kb User : fan lu

The small bug in yesterday's upload has been fixed. Code that computes the information gain ratio; the input is the samples and their class labels.
Date : 2025-12-31 Size : 1kb User : fan lu

MATLAB program for the ID3 decision tree algorithm, using information gain to select attributes.
Date : 2025-12-31 Size : 2kb User : huadong

C4.5 is another classification decision tree algorithm in machine learning. It is an important algorithm obtained by improving ID3; compared with ID3, the key improvement is that attributes are selected by information gain ratio.
Date : 2025-12-31 Size : 60kb User : sunlee0729
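The point of the improvement can be seen numerically: a high-arity attribute such as a record ID ties a genuinely predictive attribute on raw information gain but is penalized by the gain ratio. A small sketch with toy data and assumed names:

    from collections import Counter
    from math import log2

    def entropy(values):
        n = len(values)
        return -sum(c / n * log2(c / n) for c in Counter(values).values())

    def gain_and_ratio(column, labels):
        """Return (information gain, gain ratio) of a single attribute column."""
        groups = {}
        for v, y in zip(column, labels):
            groups.setdefault(v, []).append(y)
        gain = entropy(labels) - sum(len(g) / len(labels) * entropy(g)
                                     for g in groups.values())
        split_info = entropy(column)
        return gain, (gain / split_info if split_info else 0.0)

    labels    = ["yes", "no", "yes", "no", "yes", "no"]
    record_id = ["1", "2", "3", "4", "5", "6"]            # unique per row: useless predictor
    weather   = ["sun", "rain", "sun", "rain", "sun", "rain"]

    print("id:     ", gain_and_ratio(record_id, labels))  # same raw gain, much lower ratio
    print("weather:", gain_and_ratio(weather, labels))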

Code for computing information gain. The splitDataSet function selects the subset of samples for each feature value; for example, to select the samples whose age (feature 0) is "youth" (encoded as 0), we can call splitDataSet(dataSet, 0, 0), and the returned subset consists of the five samples whose age is "youth". chooseBestFeatureToSplit is the function that selects the optimal feature.
Date : 2025-12-31 Size : 19kb User : 冰琳
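A sketch of what those two functions typically look like (the small toy dataset below, with age as feature 0 and the class label in the last column, is an assumption based on the description, not the uploaded code):

    from collections import Counter
    from math import log2

    def calcShannonEnt(dataSet):
        """Entropy of the class labels stored in the last column of each row."""
        counts = Counter(row[-1] for row in dataSet)
        n = len(dataSet)
        return -sum(c / n * log2(c / n) for c in counts.values())

    def splitDataSet(dataSet, axis, value):
        """Rows whose feature `axis` equals `value`, with that feature removed."""
        return [row[:axis] + row[axis + 1:] for row in dataSet if row[axis] == value]

    def chooseBestFeatureToSplit(dataSet):
        """Index of the feature with the largest information gain."""
        base = calcShannonEnt(dataSet)
        best_gain, best_feat = 0.0, -1
        for axis in range(len(dataSet[0]) - 1):
            remainder = 0.0
            for value in set(row[axis] for row in dataSet):
                sub = splitDataSet(dataSet, axis, value)
                remainder += len(sub) / len(dataSet) * calcShannonEnt(sub)
            if base - remainder > best_gain:
                best_gain, best_feat = base - remainder, axis
        return best_feat

    # feature 0 = age (0 youth, 1 middle-aged, 2 senior), last column = class
    dataSet = [[0, 0, "no"], [0, 1, "no"], [1, 0, "yes"], [2, 1, "yes"], [2, 0, "yes"]]
    print(splitDataSet(dataSet, 0, 0))          # the rows whose age is "youth"
    print(chooseBestFeatureToSplit(dataSet))    # expected: 0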