Resource overview
Note to users: this resource contains MATLAB examples implemented separately with an RNN (recurrent neural network) and an LSTM (long short-term memory network). The RNN.m and LSTM.m files can be run directly, and all required helper functions are included. If anything fails to run, please leave a comment.
Code snippet and file information
% The LSTM MATLAB code follows. I have commented it in English, so it should be easy to follow:
% implementation of LSTM
clc
clear all
close all
%% training dataset generation
binary_dim = 8;
largest_number = 2^binary_dim - 1;
binary = cell(largest_number + 1, 1);
for i = 1:largest_number + 1
binary{i} = dec2bin(i-1, binary_dim);
int2binary{i} = binary{i};
end
%% input variables
alpha = 0.1;
input_dim = 2;
hidden_dim = 32;
output_dim = 1;
%% initialize neural network weights
% in_gate = sigmoid(X(t) * U_i + H(t-1) * W_i) ------- (1)
U_i = 2 * rand(input_dim, hidden_dim) - 1;
W_i = 2 * rand(hidden_dim, hidden_dim) - 1;
U_i_update = zeros(size(U_i));
W_i_update = zeros(size(W_i));
% forget_gate = sigmoid(X(t) * U_f + H(t-1) * W_f) ------- (2)
U_f = 2 * rand(input_dim, hidden_dim) - 1;
W_f = 2 * rand(hidden_dim, hidden_dim) - 1;
U_f_update = zeros(size(U_f));
W_f_update = zeros(size(W_f));
% out_gate = sigmoid(X(t) * U_o + H(t-1) * W_o) ------- (3)
U_o = 2 * rand(input_dim, hidden_dim) - 1;
W_o = 2 * rand(hidden_dim, hidden_dim) - 1;
U_o_update = zeros(size(U_o));
W_o_update = zeros(size(W_o));
% g_gate = tanh(X(t) * U_g + H(t-1) * W_g) ------- (4)
U_g = 2 * rand(input_dim, hidden_dim) - 1;
W_g = 2 * rand(hidden_dim, hidden_dim) - 1;
U_g_update = zeros(size(U_g));
W_g_update = zeros(size(W_g));
out_para = 2 * rand(hidden_dim, output_dim) - 1;
out_para_update = zeros(size(out_para));
% C(t) = C(t-1) .* forget_gate + g_gate .* in_gate ------- (5)
% S(t) = tanh(C(t)) .* out_gate ------- (6)
% Out = sigmoid(S(t) * out_para) ------- (7)
% Note: Equations (1)-(6) are the core of the LSTM forward pass, and
% equation (7) transfers the hidden layer to the predicted output, i.e.
% the output layer. (Sometimes softmax is used for equation (7).)
%% train
iter = 99999; % training iterations
for j = 1:iter
% generate a simple addition problem (a + b = c)
a_int = randi(round(largest_number/2)); % int version
a = int2binary{a_int+1}; % binary encoding
b_int = randi(floor(largest_number/2)); % int version
b = int2binary{b_int+1}; % binary encoding
% true answer
c_int = a_int + b_int; % int version
c = int2binary{c_int+1}; % binary encoding
% where we'll store our best guess (binary encoded)
d = zeros(size(c));
if length(d)<8
pause;
end
% total error
overallError = 0;
% difference in output layer i.e. (target - out)
output_deltas = [];
% values of hidden layer i.e. S(t)
hidden_layer_values = [];
cell_gate_values = [];
% initialize S(0) as a zero-vector
hidden_layer_values = [hidden_layer_values; zeros(1, hidden_dim)];
cell_gate_values = [cell_gate_values; zeros(1, hidden_dim)];
% initialize memory gate
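The snippet is truncated at this point; the forward pass in the full LSTM.m is not shown here. As a hedged sketch (my reconstruction from the comment equations above, not the original file's code), one forward time step applying equations (1)-(7) could look like:

```matlab
% Illustrative sketch only: one LSTM forward time step following
% equations (1)-(7) above. X is 1 x input_dim, H_prev and C_prev are
% 1 x hidden_dim; sigmoid() is the helper from sigmoid.m.
in_gate     = sigmoid(X * U_i + H_prev * W_i);            % (1)
forget_gate = sigmoid(X * U_f + H_prev * W_f);            % (2)
out_gate    = sigmoid(X * U_o + H_prev * W_o);            % (3)
g_gate      = tanh(X * U_g + H_prev * W_g);               % (4)
C           = C_prev .* forget_gate + g_gate .* in_gate;  % (5)
H           = tanh(C) .* out_gate;                        % (6)
out         = sigmoid(H * out_para);                      % (7)
```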
Attr        Size      Date       Time  Name
----------- --------- ---------- ----- ----
File        10107     2018-12-18 11:37 RNN-LSTM\LSTM.m
File        5821      2018-12-18 11:23 RNN-LSTM\RNN.m
File        63        2018-12-18 11:25 RNN-LSTM\sigmoid.m
File        81        2018-12-18 11:29 RNN-LSTM\sigmoid_output_to_derivative.m
File        68        2018-12-18 12:10 RNN-LSTM\tan_h.m
File        72        2018-12-18 12:10 RNN-LSTM\tan_h_output_to_derivative.m
File        184       2018-12-18 12:12 RNN-LSTM\参考网址.txt
Directory   0         2018-12-18 13:40 RNN-LSTM\
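The package lists four small helper files whose contents are not shown. Judging by their names and sizes, they are presumably the standard activation functions and their output-based derivatives; a hedged guess at their contents (each in its own .m file):

```matlab
% sigmoid.m -- presumed contents: elementwise logistic function
function y = sigmoid(x)
    y = 1 ./ (1 + exp(-x));
end

% sigmoid_output_to_derivative.m -- derivative expressed in terms of the
% sigmoid's output y = sigmoid(x): dy/dx = y .* (1 - y)
function d = sigmoid_output_to_derivative(y)
    d = y .* (1 - y);
end

% tan_h.m -- elementwise hyperbolic tangent
function y = tan_h(x)
    y = tanh(x);
end

% tan_h_output_to_derivative.m -- for y = tanh(x): dy/dx = 1 - y.^2
function d = tan_h_output_to_derivative(y)
    d = 1 - y.^2;
end
```

Expressing the derivatives in terms of the activation's output (rather than its input) is the usual trick in hand-written backpropagation: the forward pass has already computed y, so no extra exp or tanh evaluation is needed.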
Related resources
- DeepLearningLSTM.m
- LSTM prediction with MATLAB
- RNN算法推导过程及代码.zip
- Generalized regression neural network prediction code, including original
- GRNN data prediction - based on a generalized regression neural network
- 9 RBF, GRNN, and PNN neural network case studies, MATLAB reference
- Recurrent neural network (RNN) implemented in Matlab
- GRNN神经网络.zip
- LSTM neural network in MATLAB
- GRU time-series neural network in matlab
- RNN-LSTM convolutional neural network implemented in Matlab
- LSTM-regression-master.zip
- LSTM算法推导及代码.zip
- RNN algorithm bundle for matlab
- Radial basis function neural network MATLAB simulation
- RNN-LSTM convolutional neural network Matlab implementation, simple
- Data based on a generalized regression neural network (GRNN)
- LSTM-Matlab code
- RNN-LSTM convolutional neural network Matlab implementation
- Recurrent neural network program (LSTM) implemented in matlab
- RNN neural network built with matlab
- rnn-esn neural-network-based load forecasting
- rnn matlab time-series prediction implementation
- RNN example of implementing a recurrent neural network in MATLAB
- GRNN
- LSTM
- LSTM-MATLAB-master, personally tested and working
- lstm for time-series analysis and prediction
- matlab development - huashiyiqikeLSTMMATLAB