Search - classification by nearest neighbor - List
[AI-NN-PR] knn_vb
DL : 0
In pattern recognition, the k-nearest neighbor algorithm (k-NN) classifies objects based on the closest training examples in the feature space. k-NN is a type of instance-based learning, or lazy learning, where the function is only approximated locally and all computation is deferred until classification. It is among the simplest of all machine learning algorithms: an object is classified by a majority vote of its neighbors and assigned to the class most common among its k nearest neighbors (k is a positive integer, typically small). If k = 1, the object is simply assigned to the class of its single nearest neighbor. A minimal sketch of this voting rule follows the entry.
Date : 2025-12-16    Size : 51kb    User : Putra
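The description above spells out the k-NN decision rule, so a short sketch may help. The Python below is illustrative only (the knn_vb package itself is presumably Visual Basic, per its name); the function name and toy data are hypothetical, not taken from the package.

# Minimal k-NN sketch: rank by distance, then majority-vote the k nearest.
from collections import Counter
import math

def knn_classify(train_points, train_labels, query, k=3):
    # Indices of training points sorted by Euclidean distance to the query.
    order = sorted(range(len(train_points)),
                   key=lambda i: math.dist(train_points[i], query))
    # Majority vote among the k nearest neighbors; k=1 reduces to the
    # single-nearest-neighbor rule mentioned in the description.
    votes = Counter(train_labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Toy usage: two clusters in a 2-D feature space.
X = [(0.0, 0.0), (0.2, 0.1), (1.0, 1.0), (0.9, 1.1)]
y = ["a", "a", "b", "b"]
print(knn_classify(X, y, (0.1, 0.0), k=3))  # -> "a"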
[AI-NN-PR] CNN
DL : 0
CNN (Condensed Nearest Neighbor) is the earliest instance selection algorithm based on nearest neighbor classification. The core idea is that if an instance cannot be correctly classified by the currently selected set, that instance is added to the set. This program implements the CNN algorithm and performs instance selection for nearest neighbor classification well. A sketch of the condensation loop follows the entry.
Date : 2025-12-16    Size : 841kb    User : 周鑫
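A sketch of the condensation loop the description outlines, assuming Hart's classic formulation of CNN: seed the selected set, then sweep the training data, adding any instance the current set misclassifies under 1-NN, until a full pass adds nothing. Function names and data are illustrative, not taken from the packaged program.

import math

def nn_label(store, point):
    # 1-NN label of `point` over the stored (feature, label) pairs.
    return min(store, key=lambda s: math.dist(s[0], point))[1]

def condense(X, y):
    store = [(X[0], y[0])]        # seed the condensed set with one instance
    changed = True
    while changed:                # sweep until a full pass adds nothing
        changed = False
        for xi, yi in zip(X, y):
            if nn_label(store, xi) != yi:
                store.append((xi, yi))   # misclassified -> must be kept
                changed = True
    return store

X = [(0.0, 0.0), (0.1, 0.1), (0.2, 0.0), (1.0, 1.0), (1.1, 0.9)]
y = ["a", "a", "a", "b", "b"]
print(condense(X, y))  # a small subset that still classifies all of X correctly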
[AI-NN-PR] MLkNN
DL : 0
ML-KNN is derived from the traditional k-nearest neighbor (KNN) algorithm. In detail, for each unseen instance, its k nearest neighbors in the training set are first identified. Then, based on statistical information gained from the label sets of these neighboring instances, i.e. the number of neighboring instances belonging to each possible class, the maximum a posteriori (MAP) principle is used to determine the label set of the unseen instance. Experiments on three real-world multi-label learning problems, i.e. yeast gene functional analysis, natural scene classification, and automatic web page categorization, show that ML-KNN achieves performance superior to some well-established multi-label learning algorithms (abstract, Pattern Recognition, 2007). A sketch of the decision rule follows the entry.
Date : 2025-12-16    Size : 5kb    User : 玖
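A compact Python sketch of the decision rule in the abstract: count how many of an instance's k neighbors carry each label, then apply the MAP rule with Laplace-smoothed priors and likelihoods. This is a simplified illustration under those assumptions, not the packaged MLkNN code; the toy data are hypothetical.

import math

def mlknn_train(X, Y, k=2, s=1.0):
    # X: feature tuples; Y: label sets (may overlap); s: Laplace smoothing.
    n = len(X)
    neigh = [sorted((j for j in range(n) if j != i),
                    key=lambda j: math.dist(X[i], X[j]))[:k] for i in range(n)]
    prior, cond = {}, {}
    for l in set().union(*Y):
        has = [l in Y[i] for i in range(n)]
        prior[l] = (s + sum(has)) / (s * 2 + n)      # smoothed P(H=1)
        # c[h][j]: instances with/without label l whose j neighbors carry it
        c = [[0] * (k + 1) for _ in range(2)]
        for i in range(n):
            c[has[i]][sum(l in Y[m] for m in neigh[i])] += 1
        cond[l] = [[(s + c[h][j]) / (s * (k + 1) + sum(c[h]))
                    for j in range(k + 1)] for h in range(2)]
    return X, Y, prior, cond, k

def mlknn_predict(model, x):
    X, Y, prior, cond, k = model
    neigh = sorted(range(len(X)), key=lambda j: math.dist(X[j], x))[:k]
    out = set()
    for l in prior:
        j = sum(l in Y[m] for m in neigh)
        # MAP rule: keep l iff P(H=1)P(C=j|H=1) > P(H=0)P(C=j|H=0)
        if prior[l] * cond[l][1][j] > (1 - prior[l]) * cond[l][0][j]:
            out.add(l)
    return out

# Toy usage: two tight clusters, one label each.
X = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.0), (1.0, 1.0), (1.1, 1.0), (1.2, 1.0)]
Y = [{"yeast"}] * 3 + [{"scene"}] * 3
print(mlknn_predict(mlknn_train(X, Y), (0.05, 0.0)))  # -> {'yeast'}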