Search - backpropagation
Description: This "classic" style of neural network is used for spontaneously generating an algorithm for the analysis of numeric patterns. The goal is to "teach" the network how to analyze a desired set of numeric inputs. For instance, the network might be taught to calculate the AND, OR, and XOR of two input numbers: given the inputs 0 and 1, the network should output 0, 1, and 1 respectively. The trick is that the network is not taught how to analyze the numbers; instead, the network is given several sets of inputs and the correct output for each input set, and attempts to synthesize an algorithm that provides correct outputs. If this process is performed correctly, the network may be able to yield the correct analysis of input sets for which it has not been taught the correct output.
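The teaching scheme this entry describes can be sketched in a few dozen lines of Python: show the network XOR input/output pairs and let backpropagation adjust the weights. This is a generic illustration, not the listed package itself; the 2-3-1 shape, sigmoid activations, learning rate, and epoch count are arbitrary choices.

```python
# Minimal backpropagation sketch: learn XOR from input/output pairs.
# (Illustrative only: shape 2-3-1, sigmoid units, lr and epochs arbitrary.)
import math, random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

H = 3                                              # hidden neurons
w1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0

patterns = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
lr = 0.5

def forward(x):
    h = [sigmoid(w1[j][0] * x[0] + w1[j][1] * x[1] + b1[j]) for j in range(H)]
    o = sigmoid(sum(w2[j] * h[j] for j in range(H)) + b2)
    return h, o

def error():
    return sum((forward(x)[1] - t) ** 2 for x, t in patterns)

before = error()
for _ in range(5000):
    for x, t in patterns:
        h, o = forward(x)
        d_o = (o - t) * o * (1 - o)                # sigmoid' = o*(1-o)
        for j in range(H):
            d_h = d_o * w2[j] * h[j] * (1 - h[j])  # backpropagated delta
            w2[j] -= lr * d_o * h[j]
            w1[j][0] -= lr * d_h * x[0]
            w1[j][1] -= lr * d_h * x[1]
            b1[j] -= lr * d_h
        b2 -= lr * d_o
after = error()
```

The key point the listing makes is visible here: the training loop never encodes XOR itself, only the input/target pairs and a rule for reducing the output error.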
Platform: |
Size: 56320 |
Author: diana |
Hits:
Description: Purpose: This program is used for testing the source code in
* Chapter 5: Introduction to Neural Networks and the Backpropagation Algorithm
* Book: AI Application Programming
Platform: |
Size: 2841600 |
Author: tuan |
Hits:
Description: simple but incomplete backpropagation
Platform: |
Size: 1024 |
Author: Adrian |
Hits:
Description: backpropagation for Neural network
Platform: |
Size: 5120 |
Author: hacmiu |
Hits:
Description: Simulation of a pressure process rig with a backpropagation neural network
Platform: |
Size: 35840 |
Author: zul |
Hits:
Description: Backpropagation (BP) code in Java
Platform: |
Size: 2048 |
Author: mayel |
Hits:
Description: backpropagation algorithm
Platform: |
Size: 1024 |
Author: ore |
Hits:
Description: Backpropagation neural network using .NET
Platform: |
Size: 2456576 |
Author: jokosu |
Hits:
Description: A backpropagation algorithm in C, from the Valluru Rao book
Platform: |
Size: 9216 |
Author: Giri |
Hits:
Description: matlab code for backpropagation
Platform: |
Size: 6144 |
Author: acp |
Hits:
Description: Artificial Intelligence: backpropagation
Platform: |
Size: 66560 |
Author: nmsa |
Hits:
Description: This book provides illustrative examples in C++ that the reader can use as a basis for further experimentation. A key to learning about neural networks and appreciating their inner workings is to experiment. Neural networks, in the end, are fun to learn about and discover. Although the language used for description is C++, you will not find extensive class libraries in this book. With the exception of the backpropagation simulator, you will find fairly simple example programs for many different neural network architectures and paradigms. Since backpropagation is widely used and also easy to tame, a simulator is provided with the capacity to handle large input data sets. You use the simulator in one of the chapters in this book to solve a financial forecasting problem. You will find ample room to expand and experiment with the code presented in this book.
Platform: |
Size: 2395136 |
Author: vasi |
Hits:
Description: C++ artificial neural network source code, intended for AI beginners to study. This is a set of artificial neural network programs. The later versions are multilayer perceptron (MLP) backpropagation neural networks in which the number of layers and nodes are user-selectable.
Platform: |
Size: 90112 |
Author: alex |
Hits:
Description: MATLAB code that implements the generalized delta rule
Platform: |
Size: 1024 |
Author: sara |
Hits:
Description: backpropagation neural network
Platform: |
Size: 10240 |
Author: bahariii |
Hits:
Description: BP program for approximating the single-output function Y=SIN(X). The network structure is assumed to be 3--2--1, with input dimension M and N samples in total; by convention the input is not counted as a layer, but the output is.
Activation functions: hardlim (0,1), hardlims (-1,1), purelin, logsig (0,1), tansig (-1,1), softmax, poslin, radbas, satlin, satlins, tribas
Training algorithms: 1. traingd, traingdm, traingda (variable learning rate backpropagation), trainrp (resilient backpropagation);
2. conjugate gradient (traincgf, traincgp, traincgb, trainscg), quasi-Newton (trainbfg, trainoss), and Levenberg-Marquardt (trainlm).
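As a rough guide to the activation functions named in the entry above, here are pure-Python stand-ins for the MATLAB names (the output ranges match the parentheses in the listing; these are sketches, not the toolbox implementations):

```python
# Pure-Python sketches of common MATLAB transfer functions.
import math

def hardlim(x):   return 1.0 if x >= 0 else 0.0     # step, outputs {0, 1}
def hardlims(x):  return 1.0 if x >= 0 else -1.0    # step, outputs {-1, 1}
def purelin(x):   return x                          # linear, unbounded
def logsig(x):    return 1.0 / (1.0 + math.exp(-x)) # sigmoid, range (0, 1)
def tansig(x):    return math.tanh(x)               # tanh, range (-1, 1)
```

Note that MATLAB's tansig, defined as 2/(1+exp(-2x))-1, is mathematically identical to tanh(x).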
Platform: |
Size: 2048 |
Author: 刘老师 |
Hits:
Description: Backpropagation with momentum
by Andres Perez-Uribe
References:
- G. Hinton, "How neural networks learn from experience", Scientific American, Sep. 1992.
- P. Werbos, "The Roots of Backpropagation: From Ordered Derivatives to Neural Networks and Political Forecasting", John Wiley and Sons, New York, 1994.
Compile: gcc -o Bkprop Bkprop.c -lm
Run: see the example at the end of the C code.
There is no guarantee that the code will do what you expect or that it is error free. It is simply meant to provide a useful way to experiment with the backpropagation learning algorithm.
Update Oct 7/99... thanks to Stephane Pouyet: variable names changed for the sake of clarity. The original names came from Hinton's paper.
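The momentum variant this entry implements can be summarized by one update rule: each weight change blends the fresh gradient step with the previous change, so updates keep some "velocity" across iterations. The sketch below is a generic Python illustration; the names and constants are not taken from Bkprop.c.

```python
# Gradient descent with momentum: dw mixes the current gradient step
# with the previous weight change (alpha is the momentum term).
# Illustrative names and defaults, not from Bkprop.c.
def momentum_step(w, grad, prev_dw, lr=0.1, alpha=0.9):
    dw = -lr * grad + alpha * prev_dw
    return w + dw, dw

# Usage: minimize f(w) = w**2, whose gradient is 2*w.
w, prev = 5.0, 0.0
for _ in range(200):
    w, prev = momentum_step(w, 2 * w, prev)
```

With alpha = 0, this reduces to plain gradient descent; larger alpha smooths the trajectory and can speed convergence along consistent gradient directions.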
Platform: |
Size: 3072 |
Author: amasz |
Hits:
Description: MATLAB neural network backprop code. This code implements the basic backpropagation-of-error learning algorithm. The network has tanh hidden neurons and a linear output neuron. Adjust the learning rate with the slider.
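The architecture this entry describes, tanh hidden neurons feeding a single linear output neuron, can be sketched as follows. This is a generic pure-Python illustration fitting sin(x), not the MATLAB demo itself; the hidden-layer size and learning rate are arbitrary choices.

```python
# Backprop sketch: tanh hidden layer, linear output neuron.
# (Illustrative: H, lr, epochs, and the sin(x) target are arbitrary.)
import math, random

random.seed(1)

H = 3                                              # hidden neurons
w1 = [random.uniform(-1, 1) for _ in range(H)]     # input -> hidden
b1 = [0.0] * H
w2 = [random.uniform(-1, 1) for _ in range(H)]     # hidden -> output
b2 = 0.0

data = [(x / 10.0, math.sin(x / 10.0)) for x in range(-10, 11)]
lr = 0.05

def forward(x):
    h = [math.tanh(w1[j] * x + b1[j]) for j in range(H)]
    o = sum(w2[j] * h[j] for j in range(H)) + b2   # linear output neuron
    return h, o

def sse():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

err0 = sse()
for _ in range(200):
    for x, t in data:
        h, o = forward(x)
        d_o = o - t                                # linear output: delta = error
        for j in range(H):
            d_h = d_o * w2[j] * (1 - h[j] ** 2)    # tanh' = 1 - h^2
            w2[j] -= lr * d_o * h[j]
            w1[j] -= lr * d_h * x
            b1[j] -= lr * d_h
        b2 -= lr * d_o
err1 = sse()
```

A linear output neuron simplifies the output delta to the raw error (no derivative factor), which is why this pairing is common for regression-style targets.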
Platform: |
Size: 2048 |
Author: azoma |
Hits:
Description: Newly written Jordan backpropagation (BP) neural network; tested and runs well. It can be used for data prediction, classification, and pattern recognition.
Platform: |
Size: 3072 |
Author: 徐俊仁 |
Hits:
Description: Backpropagation algorithm implementation
Platform: |
Size: 2048 |
Author: Tommy |
Hits: