Search - matlab classifier plot

Search list

[Other resource] simple-SVM

Description: MATLAB code for training some support vector machine (SVM) classifiers and plotting the support vectors. (A minimal MATLAB sketch of this kind of workflow follows this entry.)
Platform: | Size: 2488 | Author: 江兰 | Hits:
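The sketch below illustrates the kind of workflow this entry describes, not the uploaded package itself: it trains a linear SVM with fitcsvm (Statistics and Machine Learning Toolbox) on synthetic 2-D data and marks the support vectors on a scatter plot. The data, kernel choice, and plot styling are all assumptions for illustration.

```matlab
% Minimal sketch (not the uploaded simple-SVM code): train a linear SVM
% on synthetic 2-D data and circle the support vectors in the plot.
rng(1);
X = [randn(50,2)+1; randn(50,2)-1];        % two synthetic Gaussian clusters
y = [ones(50,1); -ones(50,1)];             % class labels +1 / -1

mdl = fitcsvm(X, y, 'KernelFunction', 'linear');   % Statistics and ML Toolbox

gscatter(X(:,1), X(:,2), y);               % scatter the two classes
hold on
sv = mdl.SupportVectors;                   % training points that are support vectors
plot(sv(:,1), sv(:,2), 'ko', 'MarkerSize', 10, 'DisplayName', 'support vectors');
hold off
```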

[2D Graphic] simple-SVM

Description: MATLAB code for training some support vector machine (SVM) classifiers and plotting the support vectors.
Platform: | Size: 2048 | Author: 江兰 | Hits:

[Communication] rocplot

Description: ROC curves illustrate performance on a binary classification problem where classification is based on simply thresholding a set of scores at varying levels. Lenient thresholds give high sensitivity but low specificity; strict thresholds give high specificity but low sensitivity. The ROC curve plots this trade-off over a range of thresholds (usually sensitivity vs. 1-specificity, but I prefer sensitivity vs. specificity; this code gives you the option). It is theoretically possible to operate anywhere on the convex hull of an ROC curve, so this is plotted too. The area under the curve (AUC) of an ROC plot is a measure of overall accuracy, and the area under the ROC convex hull (ROCCH) is a kind of upper bound on what might be achievable with a weighted combination of differently thresholded results from the given classifier. (A minimal MATLAB sketch of the thresholding idea follows this entry.)
Platform: | Size: 4096 | Author: saadat | Hits:
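The sketch below shows the thresholding idea the description refers to, not the uploaded rocplot code: a threshold is swept over synthetic classifier scores, sensitivity and specificity are recorded at each setting, and the AUC is estimated with the trapezoidal rule. The score distributions and plot axes are assumptions for illustration.

```matlab
% Minimal sketch (not the uploaded rocplot code): sweep a threshold over
% classifier scores, record sensitivity and specificity, estimate AUC.
rng(1);
scores = [randn(100,1)+1; randn(100,1)-1];   % synthetic scores for the two classes
labels = [true(100,1); false(100,1)];        % true class membership

thr  = sort(scores, 'descend');              % one threshold per observed score
sens = zeros(size(thr));
spec = zeros(size(thr));
for k = 1:numel(thr)
    pred    = scores >= thr(k);                            % strict -> lenient as k grows
    sens(k) = sum(pred  &  labels) / sum(labels);          % true positive rate
    spec(k) = sum(~pred & ~labels) / sum(~labels);         % true negative rate
end

plot(1 - spec, sens, '-');                   % conventional sens vs. 1-spec axes
xlabel('1 - specificity'); ylabel('sensitivity');
auc = trapz(1 - spec, sens);                 % trapezoidal estimate of the AUC
title(sprintf('ROC, AUC = %.3f', auc));
```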

[OS program] code-(2)

Description: Using MATLAB tools for MLP NNs (e.g., newff, …), design a two-layer feed-forward neural network as a classifier to categorize the input geometric shapes.
- The snapshots and bitmaps of the shapes are given:
  - Training shapes: shkt.bmp
  - Training patterns: trn.txt (each shape is a 125*140 matrix)
  - Test shapes: shks.bmp
  - Test patterns: tsn.txt (each shape is a 125*140 matrix)
- Since the dimension of the inputs is too high (17500-dimensional), they cannot be applied to the net directly, so … .
- Try the number of hidden neurons to be at least … .
- Train the NN until all training patterns are correctly classified.
- To examine the generalization ability of the NN after training: a) apply it to the test patterns and report the accuracies; b) add p noise (p = 5, 10, …, 75) to the training shapes (degrading only the black pixels of the shapes) and report the accuracy versus p in a plot.
(A minimal MATLAB sketch of this workflow follows this entry.)
Platform: | Size: 3072 | Author: fatemeh | Hits:
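The sketch below illustrates the workflow this assignment describes, not the uploaded solution: the very high-dimensional inputs are reduced with PCA, then a two-layer feed-forward classifier is trained. Synthetic random data stands in for the shkt.bmp / trn.txt files, the sample count and class count are made up, the number of retained components and hidden neurons are arbitrary, and patternnet (Deep Learning Toolbox) is used in place of the older newff named in the description.

```matlab
% Minimal sketch (not the uploaded assignment code): PCA dimension
% reduction followed by a two-layer feed-forward classifier.
rng(1);
nSamples = 200;                              % hypothetical number of shape patterns
X = rand(17500, nSamples);                   % 125*140 = 17500 raw pixels per shape (synthetic)
T = full(ind2vec(randi(4, 1, nSamples)));    % 4 hypothetical shape classes, one-hot targets

[~, score] = pca(X');                        % PCA with samples in rows
Xred = score(:, 1:20)';                      % keep 20 components as the net input

net = patternnet(15);                        % one hidden layer of 15 neurons + output layer
net = train(net, Xred, T);                   % Deep Learning Toolbox training routine

Y   = net(Xred);
acc = mean(vec2ind(Y) == vec2ind(T));        % fraction of training patterns classified correctly
fprintf('training accuracy: %.2f\n', acc);
```

For the noise experiment in the original exercise, the same trained net would be evaluated on copies of the training shapes with a fraction p of their black pixels degraded, and the resulting accuracies plotted against p.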

CodeBus www.codebus.net