Does anyone know of code for building an LSTM recurrent neural network?

I am trying to build a form of recurrent neural network: a Long Short-Term Memory (LSTM) RNN. I have not been able to find this architecture available on the web. Any advice would be appreciated.
  1 Comment
Bradley Wright on 25 May 2016
I have also been on the lookout for an LSTM network in MATLAB that I could adopt and re-purpose. I would really like to see MathWorks give more support to neural nets.
In the meantime, I did find this: https://github.com/joncox123/Cortexsys


Answers (8)

oshri on 19 Jan 2017
Hi, I implemented an LSTM today using the MATLAB Neural Network Toolbox. Here is the code:
function net1 = create_LSTM_network(input_size, before_layers, before_activation, hidden_size, after_layers, after_activations, output_size)
%% This part splits the input into two separate parts: the first part
% is the input itself and the second part is the memory (hidden state).
real_input_size = input_size;
N_before = length(before_layers);
N_after = length(after_layers);
delays_vec = 1;
if (N_before > 0) && (N_after > 0)
    input_size = before_layers(end);
    net1 = fitnet([before_layers, input_size+hidden_size, hidden_size*ones(1,9), after_layers]);
elseif (N_before > 0) && (N_after == 0)
    input_size = before_layers(end);
    net1 = fitnet([before_layers, input_size+hidden_size, hidden_size*ones(1,9)]);
elseif (N_before == 0) && (N_after > 0)
    net1 = fitnet([input_size+hidden_size, hidden_size*ones(1,9), after_layers]);
else
    net1 = fitnet([input_size+hidden_size, hidden_size*ones(1,9)]);
end
net1 = configure(net1, rand(real_input_size, 200), rand(output_size, 200));
%% Concatenation layer and gate naming
net1.layers{N_before+1}.name = 'Concatenation Layer';
net1.layers{N_before+2}.name = 'Forget Amount';
net1.layers{N_before+3}.name = 'Forget Gate';
net1.layers{N_before+4}.name = 'Remember Amount';
net1.layers{N_before+5}.name = 'tanh Input';
net1.layers{N_before+6}.name = 'Remember Gate'; % renamed from a duplicate 'Forget Gate': this layer applies the remember amount to the tanh input
net1.layers{N_before+7}.name = 'Update Memory';
net1.layers{N_before+8}.name = 'tanh Memory';
net1.layers{N_before+9}.name = 'Combine Amount';
net1.layers{N_before+10}.name = 'Combine Gate';
net1.layerConnect(N_before+3, N_before+7) = 1;
net1.layerConnect(N_before+1, N_before+10) = 1;
net1.layerConnect(N_before+4, N_before+3) = 0;
net1.layerWeights{N_before+1, N_before+10}.delays = delays_vec;
if N_before > 0
    net1.LW{N_before+1, N_before} = [eye(input_size); zeros(hidden_size, input_size)];
else
    net1.IW{1,1} = [eye(input_size); zeros(hidden_size, input_size)];
end
net1.LW{N_before+1, N_before+10} = repmat([zeros(input_size, hidden_size); eye(hidden_size)], [1, size(delays_vec,2)]);
net1.layers{N_before+1}.transferFcn = 'purelin';
net1.layerWeights{N_before+1, N_before+10}.learn = false;
if N_before > 0
    net1.layerWeights{N_before+1, N_before}.learn = false;
else
    net1.inputWeights{1,1}.learn = false;
end
% Biases on the Forget/Remember/Combine Amount and tanh Input layers (plus the layer after the LSTM block).
net1.biasConnect = [ones(1,N_before) 0 1 0 1 1 0 0 0 1 0 1 ones(1,N_after)]';
%% First (forget) gate
net1.layers{N_before+2}.transferFcn = 'logsig';
net1.layerWeights{N_before+3, N_before+2}.weightFcn = 'scalprod';
% net1.layerWeights{3, 7}.weightFcn = 'scalprod';
net1.layerWeights{N_before+3, N_before+2}.learn = false;
net1.layerWeights{N_before+3, N_before+7}.learn = false;
net1.layers{N_before+3}.netInputFcn = 'netprod';
net1.layers{N_before+3}.transferFcn = 'purelin';
net1.LW{N_before+3, N_before+2} = 1;
% net1.LW{3, 7} = 1;
%% Second (input) gate
net1.layerConnect(N_before+4, N_before+1) = 1;
net1.layers{N_before+4}.transferFcn = 'logsig';
%% tanh candidate input
net1.layerConnect(N_before+5, N_before+4) = 0;
net1.layerConnect(N_before+5, N_before+1) = 1;
%% Second gate multiplication
net1.layerConnect(N_before+6, N_before+4) = 1;
net1.layers{N_before+6}.netInputFcn = 'netprod';
net1.layers{N_before+6}.transferFcn = 'purelin';
net1.layerWeights{N_before+6, N_before+5}.weightFcn = 'scalprod';
net1.layerWeights{N_before+6, N_before+4}.weightFcn = 'scalprod';
net1.layerWeights{N_before+6, N_before+5}.learn = false;
net1.layerWeights{N_before+6, N_before+4}.learn = false;
net1.LW{N_before+6, N_before+5} = 1;
net1.LW{N_before+6, N_before+4} = 1;
%% Cell-state (C) update
delays_vec = 1;
net1.layerConnect(N_before+7, N_before+3) = 1;
net1.layerWeights{N_before+3, N_before+7}.delays = delays_vec;
net1.layerWeights{N_before+7, N_before+3}.weightFcn = 'scalprod';
net1.layerWeights{N_before+7, N_before+6}.weightFcn = 'scalprod';
net1.layers{N_before+7}.transferFcn = 'purelin';
net1.LW{N_before+7, N_before+3} = 1;
net1.LW{N_before+7, N_before+6} = 1;
net1.LW{N_before+3, N_before+7} = repmat(eye(hidden_size), [1, size(delays_vec,2)]);
net1.layerWeights{N_before+3, N_before+7}.learn = false;
net1.layerWeights{N_before+7, N_before+6}.learn = false;
net1.layerWeights{N_before+7, N_before+3}.learn = false;
%% Output stage
net1.layerConnect(N_before+9, N_before+8) = 0;
net1.layerConnect(N_before+10, N_before+8) = 1;
net1.layerConnect(N_before+9, N_before+1) = 1;
net1.layerWeights{N_before+10, N_before+8}.weightFcn = 'scalprod';
net1.layerWeights{N_before+10, N_before+9}.weightFcn = 'scalprod';
net1.LW{N_before+10, N_before+9} = 1;
net1.LW{N_before+10, N_before+8} = 1;
net1.layers{N_before+10}.netInputFcn = 'netprod';
net1.layers{N_before+10}.transferFcn = 'purelin';
net1.layers{N_before+9}.transferFcn = 'logsig';
net1.layers{N_before+5}.transferFcn = 'tansig';
net1.layers{N_before+8}.transferFcn = 'tansig';
net1.layerWeights{N_before+10, N_before+9}.learn = false;
net1.layerWeights{N_before+10, N_before+8}.learn = false;
net1.layerWeights{N_before+7, N_before+3}.learn = false;
for ll = 1:N_before
    net1.layers{ll}.transferFcn = before_activation;
end
for ll = 1:N_after
    net1.layers{end-ll}.transferFcn = after_activations;
end
net1.layerWeights{N_before+8, N_before+7}.weightFcn = 'scalprod';
net1.LW{N_before+8, N_before+7} = 1;
net1.layerWeights{N_before+8, N_before+7}.learn = false;
net1 = configure(net1, rand(real_input_size, 200), rand(output_size, 200));
net1.trainFcn = 'trainlm';
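For those asking below how to call this function, here is a minimal usage sketch, assuming a single-output regression problem with no extra layers before or after the LSTM block. The data matrices X and T are hypothetical placeholders; because the network contains delayed layer connections, the data should be presented in sequence (cell-array) form, e.g. via con2seq:

% Hypothetical data: 3 input features, 1 target, 500 time steps.
X = rand(3, 500);
T = rand(1, 500);
% 8 hidden (memory) units, no before/after layers.
net = create_LSTM_network(3, [], '', 8, [], '', 1);
% Convert to sequence form because the network has internal delays.
Xseq = con2seq(X);
Tseq = con2seq(T);
% Train with the toolbox's standard train function, then simulate.
net = train(net, Xseq, Tseq);
Y = net(Xseq);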
  6 Comments
sujan ghimire on 11 Nov 2017
I have tried 25 inputs with 1 output for nonlinear regression and it is not working.



Sebastine Hirimeti on 16 Feb 2017
Hi,
Can someone please help me a bit more with how to run the code, the inputs, etc.?
Also, is there any reference, such as a paper, on how this is built?
Thanks

David Kuske on 26 Oct 2017
Does this code support regression output for LSTMs? MATLAB doesn't seem to have implemented that yet. Also, any help on how to use this code would be highly appreciated.

Shounak Mitra on 31 Oct 2017
As of R2017b, MATLAB does support LSTMs. Please check https://www.mathworks.com/help/nnet/examples/classify-sequence-data-using-lstm-networks.html
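For reference, the built-in sequence classification workflow looks roughly like this (a minimal sketch of the R2017b+ API; the layer sizes and the randomly generated dummy data are illustrative assumptions, not the example's actual data):

% Dummy data: 50 sequences with 12 features each, varying lengths, 3 classes.
XTrain = arrayfun(@(~) rand(12, randi([20 40])), 1:50, 'UniformOutput', false)';
YTrain = categorical(randi(3, 50, 1));
% Sequence-to-label classification network.
layers = [ ...
    sequenceInputLayer(12)
    lstmLayer(100, 'OutputMode', 'last')
    fullyConnectedLayer(3)
    softmaxLayer
    classificationLayer];
options = trainingOptions('adam', 'MaxEpochs', 30);
net = trainNetwork(XTrain, YTrain, layers, options);
YPred = classify(net, XTrain);   % predicted class labels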
  1 Comment
chadi cream on 7 Feb 2018
LSTM in MATLAB R2017b supports classification, but does it support prediction/regression? Thanks



vinothini v on 9 Oct 2018
Please, can anyone explain the code? I didn't understand it.

Shounak Mitra on 9 Oct 2018
Edited: KSSV on 6 Jun 2019
@Chadi: Yes, it does support regression as well. Here's the doc link: https://www.mathworks.com/help/deeplearning/ug/long-short-term-memory-networks.html
@Vinothini: You do not need to understand the code above. You can start using LSTMs in your work directly by following the documentation link I pasted above.
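For the regression case, the workflow is a small variation of the classification one (a sketch, assuming a release with LSTM regression support; the 25-input/1-output sizes and dummy data are illustrative):

% Dummy data: 40 sequences of 25 features, one scalar response each.
XTrain = arrayfun(@(~) rand(25, 30), 1:40, 'UniformOutput', false)';
YTrain = rand(40, 1);
% Sequence-to-one regression network.
layers = [ ...
    sequenceInputLayer(25)
    lstmLayer(50, 'OutputMode', 'last')
    fullyConnectedLayer(1)
    regressionLayer];
options = trainingOptions('adam', 'MaxEpochs', 60);
net = trainNetwork(XTrain, YTrain, layers, options);
YPred = predict(net, XTrain);   % predicted responses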

Renaud Jougla on 6 May 2019
Hello everybody. I am a relatively new user of MATLAB. I am trying to use an LSTM ANN. The code proposed above runs well. Unfortunately, I don't understand how to use it afterwards. I have this function called create_LSTM_network, but how can I use it with my training data? For example, if I have data called x_train with predictor variables and y_train with the data I want to predict, how do I write my code to train my LSTM ANN with these data?
Thanks a lot for your help and time.
Regards

Kwangwon Seo on 18 Jul 2019
Hi. I tried to use the code above, but I don't know how to feed my data into it.
Can anyone help with this?
Thank you.
