Title: rnn-from-scratch-master
Description: Note that the parameters `(W, U, V)` are shared across all time steps. At each time step the output is produced by a **softmax**, so you can use the **cross-entropy** loss as the error function and an optimization method (e.g. gradient descent) to learn the parameters `(W, U, V)`. Let's recap the equations of our RNN (illustrated in `figures/rnn_equation.png`):

`s_t = tanh(U x_t + W s_{t-1})`

`o_t = softmax(V s_t)`
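To make the shared-parameter forward pass and cross-entropy loss concrete, here is a minimal NumPy sketch. It is not the repo's actual code; the names `forward`, `cross_entropy`, and the shapes of `U`, `W`, `V` are illustrative assumptions (one-hot word inputs indexed by integer id, a `tanh` hidden state, a softmax output).

```python
import numpy as np

def forward(x, U, W, V):
    """One forward pass: x is a list of word indices.
    Returns softmax outputs o (T x vocab) and hidden states s."""
    T = len(x)
    hidden = U.shape[0]
    s = np.zeros((T + 1, hidden))   # s[-1] serves as the initial zero state
    o = np.zeros((T, V.shape[0]))
    for t in range(T):
        # The same U, W, V are reused at every time step (shared parameters).
        s[t] = np.tanh(U[:, x[t]] + W @ s[t - 1])
        z = V @ s[t]
        o[t] = np.exp(z - z.max()) / np.exp(z - z.max()).sum()  # softmax
    return o, s

def cross_entropy(o, y):
    # Mean negative log-probability of the correct next words y.
    return -np.mean(np.log(o[np.arange(len(y)), y]))

# Tiny usage example with random parameters (illustrative sizes only).
rng = np.random.default_rng(0)
vocab, hidden = 10, 4
U = rng.normal(scale=0.1, size=(hidden, vocab))
W = rng.normal(scale=0.1, size=(hidden, hidden))
V = rng.normal(scale=0.1, size=(vocab, hidden))
o, s = forward([1, 3, 5], U, W, V)
loss = cross_entropy(o, [3, 5, 0])
```

Gradient descent would then update `U`, `W`, `V` from the gradients of this loss, computed via backpropagation through time.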
File list (check whether you need any of these files):
 

rnn-from-scratch-master
.......................\README.md
.......................\__pycache__
.......................\...........\activation.cpython-34.pyc
.......................\...........\gate.cpython-34.pyc
.......................\...........\layer.cpython-34.pyc
.......................\...........\output.cpython-34.pyc
.......................\...........\preprocessing.cpython-34.pyc
.......................\...........\rnn.cpython-34.pyc
.......................\activation.py
.......................\data
.......................\....\reddit-comments-2015-08.csv
.......................\figures
.......................\.......\gradient.png
.......................\.......\init.png
.......................\.......\rnn-bptt-with-gradients.png
.......................\.......\rnn-bptt1.png
.......................\.......\rnn-compuattion-graph.png
.......................\.......\rnn-compuattion-graph_2.png
.......................\.......\rnn.jpg
.......................\.......\rnn_equation.png
.......................\.......\rnn_eval.png
.......................\.......\rnn_loss.png
.......................\.......\rnn_loss_2.png
.......................\gate.py
.......................\layer.py
.......................\output.py
.......................\preprocessing.py
.......................\rnn.py
.......................\rnnlm.py
    

CodeBus www.codebus.net