Search - Information Gain

Search list

[Other resource] ID3code

Description: The ID3 algorithm generates a decision tree by selecting the attribute with the largest information gain as the splitting attribute at each node, and derives decision rules from the resulting tree. This is a comprehensive, classic version of the ID3 decision-tree source code, including the implementation and its test data. ID3 is a useful data-mining algorithm and should be helpful to many readers.
Platform: | Size: 11293 | Author: 李顺古 | Hits:
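The selection rule the ID3 entries above describe (pick the attribute with the largest information gain) can be sketched in a few lines of Python. This is an illustrative sketch, not the uploaded source; all function names are ours.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Class) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """InfoGain = H(Class) - H(Class | Attribute) for one attribute column."""
    n = len(labels)
    by_value = {}
    for row, lab in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(lab)
    cond = sum(len(g) / n * entropy(g) for g in by_value.values())
    return entropy(labels) - cond

def best_attribute(rows, labels, attr_indices):
    """ID3's split rule: the attribute with the largest information gain."""
    return max(attr_indices, key=lambda i: information_gain(rows, labels, i))
```

For example, with rows `[('sunny','hot'), ('sunny','mild'), ('rain','mild'), ('rain','hot')]` and labels `['no','no','yes','yes']`, attribute 0 separates the classes perfectly (gain 1 bit) while attribute 1 carries no information (gain 0), so `best_attribute` returns 0.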

[Other resource] Theclassicalid3

Description: A comprehensive, classic version of the ID3 decision-tree source code, including the implementation and its test data. ID3 is a useful data-mining algorithm and should be helpful to many readers. The algorithm generates a decision tree by selecting the attribute with the largest information gain as the splitting attribute at each node, and derives decision rules from the resulting tree.
Platform: | Size: 40058 | Author: 王小明 | Hits:

[Other resource] C4.5 algorithm source code

Description: The C4.5 algorithm generates a decision tree by selecting the attribute with the largest information gain as the splitting attribute at each node, and derives decision rules from the resulting tree.
Platform: | Size: 11474 | Author: 亚亚 | Hits:

[AI-NN-PR] C4.5 algorithm source code

Description: The C4.5 algorithm generates a decision tree by selecting the attribute with the largest information gain as the splitting attribute at each node, and derives decision rules from the resulting tree.
Platform: | Size: 11264 | Author: 亚亚 | Hits:

[Program doc] PhDThesisforLDPC

Description: Master's thesis: performance analysis of LDPC codes over Rayleigh fading channels. Channel coding provides coding gain and saves scarce power resources, making it an essential technology in modern digital communication systems. LDPC codes use low-complexity iterative decoding and achieve performance approaching the Shannon limit. Because of these advantages, their promise for reliable information transmission has attracted strong attention from academia and industry, making them one of the most prominent research topics in channel coding today. The thesis examines the performance of LDPC codes over Rayleigh fading channels, an LDPC-coded space-time block coding system, and an LDPC-coded orthogonal frequency-division multiplexing (OFDM) system.
Platform: | Size: 2362368 | Author: daniel | Hits:

[Other Databases] ID3code

Description: The ID3 algorithm generates a decision tree by selecting the attribute with the largest information gain as the splitting attribute at each node, and derives decision rules from the resulting tree. This is a comprehensive, classic version of the ID3 decision-tree source code, including the implementation and its test data. ID3 is a useful data-mining algorithm and should be helpful to many readers.
Platform: | Size: 38912 | Author: 李顺古 | Hits:

[Other Databases] Theclassicalid3

Description: A comprehensive, classic version of the ID3 decision-tree source code, including the implementation and its test data. ID3 is a useful data-mining algorithm and should be helpful to many readers. The algorithm generates a decision tree by selecting the attribute with the largest information gain as the splitting attribute at each node, and derives decision rules from the resulting tree.
Platform: | Size: 39936 | Author: 王小明 | Hits:

[Web Server] jitko

Description: Jitko source code: a web-based JavaScript network hacking and network assessment tool. It gets onto a client machine and executes predetermined commands to gather information.
Platform: | Size: 7168 | Author: jeffrey schultz | Hits:

[Algorithm] InformationGain

Description: An information-gain program in C++ that measures how much classification information one variable carries about another; the variables take continuous values.
Platform: | Size: 5920768 | Author: wangli | Hits:

[AI-NN-PR] BigTree2

Description: A decision-tree implementation for data mining based on ID3. By analyzing the training data it computes how important each attribute is to the target attribute, i.e. the information gain of every attribute, and uses those values to partition the training data step by step until the target classes are separated, thereby generating the decision tree. The finished tree can then be used to classify new data, predicting the likely value of the target attribute.
Platform: | Size: 289792 | Author: | Hits:
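The recursive generate-then-classify process this entry describes (split on the highest-gain attribute, recurse until nodes are pure, then walk the tree for new data) can be sketched as follows. This is our own minimal illustration, not the uploaded code.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def build_tree(rows, labels, attrs):
    """Recursive ID3: stop on a pure node, otherwise split on the
    attribute whose information gain is largest."""
    if len(set(labels)) == 1:            # pure leaf
        return labels[0]
    if not attrs:                        # no attributes left: majority vote
        return Counter(labels).most_common(1)[0][0]
    def gain(a):
        n, groups = len(labels), {}
        for r, l in zip(rows, labels):
            groups.setdefault(r[a], []).append(l)
        return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups.values())
    best = max(attrs, key=gain)
    node = {}
    for v in set(r[best] for r in rows):
        sub = [(r, l) for r, l in zip(rows, labels) if r[best] == v]
        srows, slabels = zip(*sub)
        node[v] = build_tree(list(srows), list(slabels),
                             [a for a in attrs if a != best])
    return (best, node)

def classify(tree, row):
    """Walk the tree to predict a new row's class (the 'testing' step)."""
    while isinstance(tree, tuple):
        attr, node = tree
        tree = node[row[attr]]
    return tree
```

A tree is represented as either a class label (leaf) or a `(attribute_index, {value: subtree})` pair; `classify` simply follows the branch matching each attribute value until it reaches a leaf.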

[matlab] @linear

Description: The SVM normal-vector feature-selection algorithm considers only the normal's contribution to feature selection and ignores the contribution of the feature distribution. Building on an analysis of the SVM normal algorithm, this work proposes a weighted SVM normal algorithm based on the different probabilities with which a feature appears in positive and negative examples, so that both the normal and the feature distribution are taken into account. Experiments show that with a smaller feature space, the weighted SVM normal algorithm selects features better than both the plain SVM normal algorithm and the information-gain algorithm.
Platform: | Size: 4096 | Author: 苏苏 | Hits:

[ADO-ODBC] c45

Description: A decision-tree implementation for data mining based on C4.5. By analyzing the training data it computes how important each attribute is to the target attribute, i.e. the information gain of every attribute, and uses those values to partition the training data step by step until the target classes are separated, thereby generating the decision tree. The finished tree can then be used to classify new data, predicting the likely value of the target attribute.
Platform: | Size: 769024 | Author: zkm | Hits:

[Other] 1

Description: A comprehensive, classic version of the ID3 decision-tree source code, including the implementation and its test data. ID3 is a useful data-mining algorithm and should be helpful to many readers. The algorithm generates a decision tree by selecting the attribute with the largest information gain as the splitting attribute at each node, and derives decision rules from the resulting tree.
Platform: | Size: 39936 | Author: kintsen | Hits:

[Mathematics-Numerical algorithms] ID3

Description: The ID3 algorithm (Quinlan) uses top-down induction of decision trees. Given a set of classified examples, a decision tree is induced, biased by the information-gain measure, which heuristically leads to small trees. Examples are given in attribute-value representation, and the set of possible classes is finite. Only tests that split the set of instances on the value of a single attribute are supported.
Platform: | Size: 4096 | Author: Minh | Hits:

[Special Effects] Entropy

Description: A complete utility class for information-entropy calculations: mutual information, entropy, information gain, conditional entropy, and more.
Platform: | Size: 8192 | Author: Johnny | Hits:
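The quantities this entry lists are closely related: information gain is exactly the mutual information I(X;Y) = H(X) - H(X|Y). A minimal sketch of the relationships (our own illustration, not the uploaded class):

```python
import math
from collections import Counter

def H(xs):
    """Shannon entropy of a sample of discrete values."""
    n = len(xs)
    return -sum((c / n) * math.log2(c / n) for c in Counter(xs).values())

def H_cond(xs, ys):
    """Conditional entropy H(X | Y) estimated from paired samples."""
    n = len(xs)
    groups = {}
    for x, y in zip(xs, ys):
        groups.setdefault(y, []).append(x)
    return sum(len(g) / n * H(g) for g in groups.values())

def mutual_information(xs, ys):
    """I(X;Y) = H(X) - H(X|Y); identical to the information gain of Y about X."""
    return H(xs) - H_cond(xs, ys)
```

Perfectly correlated samples give 1 bit of mutual information, independent samples give 0, and the measure is symmetric in its arguments.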

[OtherDecisionTree

Description: A decision tree that calculates information gain.
Platform: | Size: 652288 | Author: Aakashi | Hits:

[matlabmatlab_Lab3

Description: Information-gain calculation in Matlab.
Platform: | Size: 1024 | Author: Molinken | Hits:

[matlabParInfoGain

Description: ParInfoGain computes information gain and gain ratio in parallel in Matlab, using the Matlab Parallel Computing Toolbox or the Distributed Computing Server if available. Information gain is defined as InfoGain(Class, Attribute) = H(Class) - H(Class | Attribute). Gain ratio is defined as GainRatio(Class, Attribute) = (H(Class) - H(Class | Attribute)) / H(Attribute), where H is the entropy, H = -sum(i=1 to k) p_i log2 p_i.
Platform: | Size: 3072 | Author: agkj | Hits:
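The gain-ratio definition in the entry above can be checked with a short (serial, not parallel) sketch in Python; the function names are ours, and the formula is exactly the one the entry states.

```python
import math
from collections import Counter

def entropy(values):
    """H = -sum p_i log2 p_i over the distinct values in the sample."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def gain_ratio(classes, attribute):
    """GainRatio(Class, Attribute) =
    (H(Class) - H(Class | Attribute)) / H(Attribute)."""
    n = len(classes)
    groups = {}
    for c, a in zip(classes, attribute):
        groups.setdefault(a, []).append(c)
    h_cond = sum(len(g) / n * entropy(g) for g in groups.values())
    info_gain = entropy(classes) - h_cond
    return info_gain / entropy(attribute)
```

Note how the H(Attribute) denominator penalizes many-valued attributes: a binary attribute that perfectly separates two classes has gain ratio 1.0, while a four-valued attribute with the same 1-bit gain has H(Attribute) = 2 bits and gain ratio 0.5.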

[matlabC4_5.m

Description: This algorithm was proposed by Quinlan (1993). The C4.5 algorithm generates a classification decision tree for a given data set by recursive partitioning, growing the tree with a depth-first strategy. The algorithm considers all possible tests that can split the data set and selects the test that gives the best information gain. For each discrete attribute, it considers one test with as many outcomes as the attribute has distinct values. For each continuous attribute, it considers binary tests on every distinct value of the attribute. To gather the entropy gains of all these binary tests efficiently, the training data at the node is sorted by the continuous attribute's values, and the entropy gains of the binary cuts at each distinct value are computed in one scan of the sorted data. This process is repeated for each continuous attribute.
Platform: | Size: 2048 | Author: rajesh | Hits:
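C4.5's handling of a continuous attribute, as described above, amounts to sorting by the attribute and evaluating a binary cut at each distinct value. A simple sketch (ours, not the uploaded `C4_5.m`; for clarity it recomputes the subset entropies per cut rather than using the incremental one-scan counting the description mentions):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_binary_split(values, labels):
    """Sort the (value, label) pairs, try a cut between every pair of
    distinct adjacent values, and return (threshold, gain) for the
    'value <= t' vs 'value > t' split with the largest entropy gain."""
    pairs = sorted(zip(values, labels), key=lambda p: p[0])
    n = len(pairs)
    base = entropy(labels)
    best_t, best_gain = None, -1.0
    for i in range(1, n):
        if pairs[i][0] == pairs[i - 1][0]:
            continue                      # not a distinct cut point
        t = (pairs[i - 1][0] + pairs[i][0]) / 2
        left = [l for _, l in pairs[:i]]
        right = [l for _, l in pairs[i:]]
        gain = base - (len(left) / n * entropy(left)
                       + len(right) / n * entropy(right))
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain
```

For values `[1, 2, 8, 9]` with labels `['no', 'no', 'yes', 'yes']`, the cut between 2 and 8 (threshold 5.0) separates the classes perfectly, yielding a 1-bit gain.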

[JSP/Java] entropy

Description: PPT courseware on information in information theory, covering self-information, information gain, and entropy. Good content, recommended.
Platform: | Size: 538624 | Author: tracy | Hits:

CodeBus www.codebus.net