Description: A BP (back-propagation) algorithm I wrote in MATLAB as part of my graduation project. The program runs without problems; corrections and feedback are welcome. Platform: |
Size: 3072 |
Author:FX |
Hits:
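The entry above implements standard back-propagation in MATLAB. As a rough illustration of the same technique, here is a minimal sketch in plain Python (not the author's code): one tanh hidden layer and a sigmoid output trained on XOR by stochastic gradient descent. The network size (4 hidden units) and learning rate (0.5) are illustrative assumptions, not taken from the download.

```python
# Minimal back-propagation sketch: tanh hidden layer, sigmoid output,
# trained on XOR by stochastic gradient descent (plain Python).
import math, random

random.seed(0)
H, lr = 4, 0.5
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(H)]  # [x1, x2, bias]
w2 = [random.uniform(-1, 1) for _ in range(H + 1)]                  # last entry = bias

def forward(x):
    h = [math.tanh(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
    s = sum(w2[j] * h[j] for j in range(H)) + w2[H]
    return h, 1.0 / (1.0 + math.exp(-s))       # sigmoid output

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # XOR
mse = lambda: sum((forward(x)[1] - t) ** 2 for x, t in data) / len(data)

mse_before = mse()
for _ in range(5000):
    x, t = random.choice(data)
    h, y = forward(x)
    d_out = (y - t) * y * (1 - y)                  # output delta (sigmoid')
    for j in range(H):
        d_h = d_out * w2[j] * (1 - h[j] ** 2)      # hidden delta (tanh')
        w1[j][0] -= lr * d_h * x[0]
        w1[j][1] -= lr * d_h * x[1]
        w1[j][2] -= lr * d_h
        w2[j] -= lr * d_out * h[j]
    w2[H] -= lr * d_out
mse_after = mse()
print(mse_before, mse_after)
```

After training, the mean squared error on the four XOR patterns should be well below its initial value.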
Description: The adaptive Neural Network Library is a collection of blocks that implement several Adaptive Neural Networks featuring
different adaptation algorithms.
There are 11 blocks that implement basically these 5 kinds of neural networks:
1) Adaptive Linear Network (ADALINE)
2) Multi-Layer Perceptron with Extended Backpropagation algorithm (EBPA)
3) Radial Basis Functions (RBF) Networks
4) RBF Networks with Extended Minimal Resource Allocating algorithm (EMRAN)
5) RBF and Piecewise Linear Networks with Dynamic Cell Structure (DCS) algorithm
A Simulink example on the approximation of a scalar nonlinear function of 4 variables is included. Platform: |
Size: 198656 |
Author:叶建槐 |
Hits:
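The simplest of the five network types above, ADALINE, adapts a purely linear unit sample by sample. The following plain-Python sketch of the LMS (Widrow-Hoff) update rule illustrates that behavior; the target coefficients and the rate lr = 0.1 are assumptions for illustration only, not values from the library.

```python
# ADALINE / LMS sketch: a linear unit adapts toward a noise-free
# linear target, one sample at a time.
import random

random.seed(1)
w = [0.0, 0.0]      # weights for x1, x2
b = 0.0             # bias
lr = 0.1

for _ in range(2000):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    t = 2 * x[0] - 3 * x[1] + 1            # target: linear in the inputs
    y = w[0] * x[0] + w[1] * x[1] + b      # linear (ADALINE) output
    e = t - y
    w[0] += lr * e * x[0]                  # LMS rule: w <- w + lr * e * x
    w[1] += lr * e * x[1]
    b += lr * e

print(w, b)
```

Because the target is noise-free and linear, the weights converge to the true coefficients (2, -3) and bias 1.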
Description: Batch version of the back-propagation algorithm.
% Given a set of corresponding input-output pairs and an initial network
% [W1,W2,critvec,iter]=batbp(NetDef,W1,W2,PHI,Y,trparms) trains the
% network with backpropagation.
%
% The activation functions must be either linear or tanh. The network
% architecture is defined by the matrix NetDef consisting of two
% rows. The first row specifies the hidden layer while the second
% specifies the output layer.
Platform: |
Size: 2048 |
Author:张镇 |
Hits:
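The help text above describes the batch variant: gradients are accumulated over the whole training set (PHI, Y) and the weights W1, W2 are updated once per pass. This plain-Python sketch mirrors that structure with a tanh hidden layer and a linear output, as the help text specifies; the network size, learning rate, and training data are assumed for illustration.

```python
# Batch back-propagation sketch: sum gradients over all input-output
# pairs (PHI, Y), then apply one weight update per epoch.
import math, random

random.seed(0)
H, lr = 5, 0.01
W1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]  # [w, bias]
W2 = [random.uniform(-0.5, 0.5) for _ in range(H + 1)]                  # last = bias

PHI = [i / 10.0 - 1.0 for i in range(21)]        # inputs on [-1, 1]
Y = [math.sin(math.pi * p) for p in PHI]         # targets

def net(p):
    h = [math.tanh(w[0] * p + w[1]) for w in W1]
    return h, sum(W2[j] * h[j] for j in range(H)) + W2[H]   # linear output

def sse():
    return sum((net(p)[1] - t) ** 2 for p, t in zip(PHI, Y))

before = sse()
for _ in range(500):
    g1 = [[0.0, 0.0] for _ in range(H)]          # accumulated gradients
    g2 = [0.0] * (H + 1)
    for p, t in zip(PHI, Y):
        h, y = net(p)
        d = y - t                                # linear output delta
        for j in range(H):
            dh = d * W2[j] * (1 - h[j] ** 2)     # back-propagate through tanh'
            g1[j][0] += dh * p
            g1[j][1] += dh
            g2[j] += d * h[j]
        g2[H] += d
    for j in range(H):                           # one update after the full pass
        W1[j][0] -= lr * g1[j][0]
        W1[j][1] -= lr * g1[j][1]
        W2[j] -= lr * g2[j]
    W2[H] -= lr * g2[H]
after = sse()
print(before, after)
```

The sum-of-squared-errors criterion decreases over the batch updates, which is what batbp's critvec output would record per iteration.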
Description: BP neural network code: a back-propagation algorithm implementation in MATLAB, with training performed element-wise. Platform: |
Size: 1024 |
Author:Saswata Nath |
Hits:
Description: An MLP with a back-propagation training algorithm, designed for function-approximation problems. Platform: |
Size: 1024 |
Author:Paulo |
Hits:
Description: MATLAB neural network backprop code.
This code implements the basic back-propagation of error learning algorithm. The network has tanh hidden neurons and a linear output neuron. Adjust the learning rate with the slider.
Platform: |
Size: 2048 |
Author:azoma |
Hits:
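The entry above exposes the learning rate through a slider. This tiny sketch (an illustration, not part of the download) shows why that control matters: gradient descent on the one-dimensional error surface e(w) = (w - 3)^2 converges for a small rate and diverges once the rate passes the stability limit, which here is 1.0.

```python
# Effect of the learning rate on gradient descent over e(w) = (w - 3)^2.
def descend(lr, steps=50):
    w = 0.0
    for _ in range(steps):
        w -= lr * 2 * (w - 3)     # gradient of (w - 3)^2 is 2(w - 3)
    return w

w_small = descend(0.1)    # |1 - 2*lr| = 0.8 < 1: converges toward 3
w_large = descend(1.1)    # |1 - 2*lr| = 1.2 > 1: oscillates and diverges
print(w_small, w_large)
```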
Description: An error-back-propagation-based character-recognition program developed in MATLAB. The program folder contains training images of some English alphabets and numerals; the neural network is initialized with the newff command and trained with the train command.
After the simulation, the program recognizes the input character. Accuracy on the provided images is 100%. Platform: |
Size: 7730176 |
Author:manish |
Hits:
Description: A BP neural network for e-mail classification (2 classes), implementing the details of the back-propagation algorithm by hand rather than through direct toolbox function calls. Platform: |
Size: 2048 |
Author:horizonch115 |
Hits: