Search - learning classifiers

Search list

[Windows Develop] mppdf

Description: A PDF on multiple-classifier ensemble algorithms, a technique with broad application prospects; shared here to aid understanding and learning of the algorithm.
Platform: | Size: 102841 | Author: ge | Hits:

[Other] MLC21NT-C

Description: machine learning, accuracy estimation, cross-validation, bootstrap, ID3, decision trees, decision graphs, naive-bayes, decision tables, majority, induction algorithms, classifiers, categorizers, general logic diagrams, instance-based algorithms, discretization, lazy learning, bagging, MineSet.
Platform: | Size: 3153562 | Author: infinite8 | Hits:

[Other resource] Bayesnet

Description: A Java toolkit for training, testing, and applying Bayesian Network Classifiers. Implemented classifiers have been shown to perform well in a variety of artificial intelligence, machine learning, and data mining applications.
Platform: | Size: 504530 | Author: lyb | Hits:

[Other resource] Herbrich-Learning-Kernel-Classifiers-Theory-and-Al

Description: Learning Kernel Classifiers: Theory and Algorithms. Introduction: this chapter introduces the general problem of machine learning and how it relates to statistical inference. 1.1 The Learning Problem and (Statistical) Inference. It was only a few years after the introduction of the first computer that one of man's greatest dreams seemed to be realizable: artificial intelligence. Bearing in mind that in the early days the most powerful computers had much less computational power than a cell phone today, it comes as no surprise that much theoretical research on the potential of machines' capabilities to learn took place at this time. This becomes a computational problem as soon as the dataset gets larger than a few hundred examples.
Platform: | Size: 2537081 | Author: google2000 | Hits:

[Other resource] BP-nn

Description: A BP neural network classifier. The program has two modes: learning and classification. In learning mode, type bp learn at the DOS prompt to start training; the learned weights are written to weight.dat. In working mode, type bp work at the DOS prompt to classify the data in classfyme.dat; when recognition finishes, the results are written to results.dat. While bp is running in either mode, none of Weight.dat, Sample.dat, classfyme.dat, or results.dat may be opened manually.
Platform: | Size: 80529 | Author: wh | Hits:

[Other resource] icsiboost-0.3.tar

Description: Boosting is a meta-learning approach that aims at combining an ensemble of weak classifiers to form a strong classifier. Adaptive Boosting (AdaBoost) implements this idea as a greedy search for a linear combination of classifiers, overweighting the examples that are misclassified by each classifier. icsiboost implements AdaBoost over stumps (one-level decision trees) on discrete and continuous attributes (words and real values). See http://en.wikipedia.org/wiki/AdaBoost and the papers by Y. Freund and R. Schapire for more details [1]. This approach is one of the most efficient and simplest ways to combine continuous and nominal values. Our implementation aims to allow training from millions of examples with hundreds of features in reasonable time and memory.
Platform: | Size: 116681 | Author: njustyw | Hits:
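The scheme the icsiboost entry describes (greedy stump selection, then reweighting the examples each stump misclassifies) can be sketched in a few lines. This is not code from icsiboost; it is a minimal pure-Python illustration on a single real-valued feature, with all names invented here:

```python
import math

def stump_predict(threshold, polarity, x):
    # One-level decision tree ("stump") on a single real-valued feature.
    return polarity if x >= threshold else -polarity

def train_adaboost(points, labels, rounds=10):
    """Greedy AdaBoost: each round picks the stump with the lowest
    weighted error, then overweights the examples it misclassifies."""
    n = len(points)
    w = [1.0 / n] * n                      # example weights
    ensemble = []                          # (alpha, threshold, polarity)
    candidates = sorted(set(points))
    for _ in range(rounds):
        best = None
        for t in candidates:
            for pol in (1, -1):
                err = sum(wi for wi, x, y in zip(w, points, labels)
                          if stump_predict(t, pol, x) != y)
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)  # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, pol))
        # Reweight: misclassified examples gain weight, then normalize.
        w = [wi * math.exp(-alpha * y * stump_predict(t, pol, x))
             for wi, x, y in zip(w, points, labels)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def ada_predict(ensemble, x):
    score = sum(a * stump_predict(t, pol, x) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1

xs = [1.0, 2.0, 3.0, 6.0, 7.0, 8.0]
ys = [-1, -1, -1, 1, 1, 1]
model = train_adaboost(xs, ys, rounds=5)
print([ada_predict(model, x) for x in xs])  # separable data: matches ys
```

The real package additionally handles discrete (word) attributes and is engineered for millions of examples; this sketch only shows the weight-update idea.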

[Linux-Unix] gmmbayestb-v0.1.tar

Description: This package contains Matlab m-files for learning finite Gaussian mixtures from sample data and performing data classification with Mahalanobis distance or Bayesian classifiers. Each class in the training set is learned individually with one of three variations of the Expectation Maximization algorithm: the basic EM algorithm with covariance fixing, the Figueiredo-Jain clustering algorithm, and the greedy EM algorithm. The basic EM and FJ algorithms can handle complex-valued data directly; the greedy EM algorithm cannot.
Platform: | Size: 20480 | Author: | Hits:
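The per-class generative approach this toolbox describes can be illustrated in its simplest degenerate form: one diagonal-covariance Gaussian per class (a one-component "mixture"), classifying by smallest Mahalanobis distance. This is a hedged sketch, not the Matlab code, and it omits the EM variants entirely:

```python
from collections import defaultdict

def fit_class_gaussians(samples, labels):
    """Fit one diagonal-covariance Gaussian per class: the
    one-component special case of per-class mixture learning."""
    by_class = defaultdict(list)
    for x, y in zip(samples, labels):
        by_class[y].append(x)
    params = {}
    for y, xs in by_class.items():
        dim = len(xs[0])
        mean = [sum(x[d] for x in xs) / len(xs) for d in range(dim)]
        # Per-dimension variance, floored to keep distances finite.
        var = [max(sum((x[d] - mean[d]) ** 2 for x in xs) / len(xs), 1e-6)
               for d in range(dim)]
        params[y] = (mean, var)
    return params

def mahalanobis2(x, mean, var):
    # Squared Mahalanobis distance; with a diagonal covariance this is
    # just a variance-scaled Euclidean distance.
    return sum((xi - m) ** 2 / v for xi, m, v in zip(x, mean, var))

def classify(params, x):
    return min(params, key=lambda y: mahalanobis2(x, *params[y]))

X = [(0.0, 0.1), (0.2, -0.1), (-0.1, 0.0),
     (5.0, 5.1), (4.8, 5.0), (5.2, 4.9)]
Y = ["a", "a", "a", "b", "b", "b"]
model = fit_class_gaussians(X, Y)
print(classify(model, (0.1, 0.0)), classify(model, (5.0, 5.0)))  # a b
```

A full mixture per class replaces the single (mean, var) pair with several weighted components learned by EM, which is where the package's three algorithm variants come in.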

[2D Graphic] ComponentbasedFaceDetection

Description: Abstract. We present a component-based, trainable system for detecting frontal and near-frontal views of faces in still gray images. The system consists of a two-level hierarchy of Support Vector Machine (SVM) classifiers. On the first level, component classifiers independently detect components of a face. On the second level, a single classifier checks if the geometrical configuration of the detected components in the image matches a geometrical model of a face. We propose a method for automatically learning components by using 3-D head models. This approach has the advantage that no manual interaction is required for choosing and extracting components. Experiments show that the component-based system is significantly more robust against rotations in depth than a comparable system trained on whole face patterns.
Platform: | Size: 349184 | Author: a | Hits:

[matlab] Ensemble-learning-based-on-GMDH

Description: A program for multiple-classifier ensemble selection based on self-organizing data mining (GMDH).
Platform: | Size: 1771520 | Author: 肖进 | Hits:

[AI-NN-PR] self-taught-learning

Description: Self-taught learning combines a sparse autoencoder with a classifier. First, a sparse autoencoder is trained on unlabeled handwritten digits 5-9 to obtain optimal parameters; forward propagation then produces features for the training and test sets; a softmax model is trained on the labeled 0-4 training set; finally, the test set is fed into the classification model to perform classification.
Platform: | Size: 9228288 | Author: 单清序 | Hits:

[AI-NN-PR] onlinesvm

Description: An online SVM classifier compiled in C, suited to fast learning on big data; it has low memory requirements and handles large volumes of data.
Platform: | Size: 53248 | Author: 杨杨 | Hits:
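The reason an online SVM keeps memory low, as this entry claims, is that each training example updates the model once and can then be discarded. As a rough illustration (not the package's C code; learning rate, regularization constant, and all names are assumptions of this sketch), a linear SVM trained by stochastic subgradient descent on the hinge loss looks like:

```python
def train_online_svm(stream, dim, lam=0.001, eta=0.1, epochs=50):
    """Online linear SVM via stochastic subgradient descent on the
    hinge loss: one pass-by-pass update per example, constant memory."""
    w = [0.0] * dim
    b = 0.0
    for _ in range(epochs):
        for x, y in stream:               # y is +1 or -1
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            # L2 regularization as weight decay.
            w = [wi * (1 - eta * lam) for wi in w]
            if margin < 1:                # hinge loss is active
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
    return w, b

def svm_predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

data = [([2.0, 2.0], 1), ([1.5, 2.5], 1),
        ([-2.0, -2.0], -1), ([-2.5, -1.5], -1)]
w, b = train_online_svm(data, dim=2)
print([svm_predict(w, b, x) for x, _ in data])  # [1, 1, -1, -1]
```

Here the whole dataset is revisited for several epochs only because it is tiny; in a true streaming setting a single pass over the examples is the point of the method.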

[AI-NN-PR] Ensemble-Learning

Description: Ensemble learning combines the predictions of several base classifiers; this includes the Bagging and AdaBoost algorithms, as well as random forest, a classifier that uses many trees to train on and predict samples.
Platform: | Size: 2048 | Author: 董小鱼 | Hits:
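Of the methods this entry names, Bagging is the simplest to show: train each base classifier on a bootstrap resample and combine by majority vote. The sketch below is an assumption-laden illustration, not the package's code; the threshold-stump base learner is a toy invented here for the example:

```python
import random

def train_bagging(data, train_base, n_estimators=15, seed=0):
    """Bagging: each base classifier sees a bootstrap sample (drawn
    with replacement); predictions are combined by majority vote."""
    rng = random.Random(seed)
    models = []
    while len(models) < n_estimators:
        sample = [rng.choice(data) for _ in data]   # bootstrap resample
        if {y for _, y in sample} != {1, -1}:
            continue   # skip degenerate resamples with only one class
        models.append(train_base(sample))
    return models

def vote(models, x):
    score = sum(m(x) for m in models)               # labels are +1 / -1
    return 1 if score >= 0 else -1

def train_threshold_stump(sample):
    # Toy base learner (an assumption of this sketch): threshold a 1-D
    # feature at the midpoint between the two class means.
    pos = [x for x, y in sample if y == 1]
    neg = [x for x, y in sample if y == -1]
    t = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda x: 1 if x >= t else -1

data = [(0.5, -1), (1.0, -1), (1.5, -1), (4.0, 1), (4.5, 1), (5.0, 1)]
models = train_bagging(data, train_threshold_stump, n_estimators=15)
print([vote(models, x) for x, _ in data])  # matches the labels
```

Random forest follows the same recipe with decision trees as the base learner, plus random feature subsets at each split; AdaBoost differs in that it reweights examples sequentially rather than resampling independently.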

[Industry research] 2-learning

Description: Reading text in photographs is a challenging problem that has received a significant amount of attention. Two key components of most systems are (i) text detection from images and (ii) character recognition, and many recent methods have been proposed to design better feature representations and models for both. In this paper, we apply methods recently developed in machine learning (specifically, large-scale algorithms for learning features automatically from unlabeled data) and show that they allow us to construct highly effective classifiers for both detection and recognition, to be used in a high-accuracy end-to-end system.
Platform: | Size: 1145856 | Author: qwmpg | Hits:

[Other] Stanford-University-machine-learning

Description: A comprehensive introduction to machine learning, covering supervised learning, generative learning algorithms, support vector machines, learning theory, the EM algorithm, the perceptron and large-margin classifiers, and regularization and model selection.
Platform: | Size: 3403776 | Author: 高乐莲 | Hits:

CodeBus www.codebus.net