Search - Information Gain - List
The ID3 algorithm (Quinlan) performs top-down induction of decision trees. Given a set of classified examples, a decision tree is induced, biased by the information-gain measure, which heuristically leads to small trees. The examples are given in attribute-value representation, and the set of possible classes is finite. Only tests that split the set of instances according to the value of a single attribute are supported.
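The information-gain measure mentioned above can be sketched in a few lines. This is a minimal illustration, not code from the listed package; the function names and the dict-based example representation are assumptions for the sketch.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(examples, labels, attr):
    """Entropy reduction from splitting `examples` on attribute `attr`.

    Each example is a dict mapping attribute names to discrete values.
    """
    n = len(labels)
    by_value = {}
    for x, y in zip(examples, labels):
        by_value.setdefault(x[attr], []).append(y)
    # Weighted entropy of the partition induced by attr
    remainder = sum(len(ys) / n * entropy(ys) for ys in by_value.values())
    return entropy(labels) - remainder
```

For instance, if an attribute separates the classes perfectly, the gain equals the full entropy of the label set (1 bit for a balanced binary problem).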
Date : 2025-12-31 Size : 4kb User : Minh

ID3 is a greedy algorithm for constructing decision trees. It originated from the Concept Learning System (CLS) and uses the rate of decrease in information entropy as the criterion for choosing test attributes: at each node it selects, among the attributes not yet used for splitting, the one with the highest information gain, and repeats this process until the resulting tree classifies the training examples perfectly.
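The greedy, recursive procedure described above can be sketched as follows. This is a self-contained illustration under simple assumptions (discrete attributes, examples as dicts, trees as nested dicts), not the code from the listed package.

```python
import math
from collections import Counter

def _entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def id3(examples, labels, attrs):
    """Recursively build a decision tree.

    Internal nodes are dicts {"attr": name, "branches": {value: subtree}};
    leaves are class labels.
    """
    # Base cases: node is pure, or no attributes remain (take majority class)
    if len(set(labels)) == 1:
        return labels[0]
    if not attrs:
        return Counter(labels).most_common(1)[0][0]

    n = len(labels)
    def gain(a):
        groups = {}
        for x, y in zip(examples, labels):
            groups.setdefault(x[a], []).append(y)
        return _entropy(labels) - sum(len(g) / n * _entropy(g)
                                      for g in groups.values())

    # Greedy step: split on the attribute with the highest information gain
    best = max(attrs, key=gain)
    tree = {"attr": best, "branches": {}}
    for v in {x[best] for x in examples}:
        pairs = [(x, y) for x, y in zip(examples, labels) if x[best] == v]
        sub_x = [x for x, _ in pairs]
        sub_y = [y for _, y in pairs]
        tree["branches"][v] = id3(sub_x, sub_y,
                                  [a for a in attrs if a != best])
    return tree
```

Because each attribute is used at most once per path and each recursive call shrinks the example set, the recursion terminates once every leaf is pure or no attributes remain.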
Date : 2025-12-31 Size : 2.38mb User : 秦冰

An introduction to decision trees and random forests. It mainly analyzes decision-tree learning algorithms — information gain, ID3, C4.5, and CART — and then presents random forests. Three questions matter most for decision trees: 1. Feature selection: which feature to use as the splitting feature at a given node. 2. Split-value selection: having chosen a feature, how to partition the subtrees. 3. What to do when the decision tree overfits. Each question is addressed in turn. A decision tree typically selects the optimal feature recursively and partitions the training data according to that feature.
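The feature-selection criteria the three algorithm families use differ mainly in the impurity measure: ID3 uses raw information gain, C4.5 normalizes it into a gain ratio, and CART uses Gini impurity. A minimal sketch of the latter two (the function names and dict-based example format are assumptions for illustration):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """CART's Gini impurity: 1 minus the sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def gain_ratio(examples, labels, attr):
    """C4.5's criterion: information gain normalized by the split's own
    entropy, which penalizes attributes with many distinct values."""
    n = len(labels)
    groups = {}
    for x, y in zip(examples, labels):
        groups.setdefault(x[attr], []).append(y)
    gain = entropy(labels) - sum(len(g) / n * entropy(g)
                                 for g in groups.values())
    split_info = entropy([x[attr] for x in examples])
    return gain / split_info if split_info > 0 else 0.0
```

For a balanced binary label set, Gini impurity is 0.5 and entropy is 1 bit; both are 0 for a pure node, so either can drive the recursive splitting described above.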
Date : 2025-12-31 Size : 2.02mb User : ZJN27