LSTM Python hyperparameters vs MATLAB

Philip Hua on 30 May 2022
Answered: Philip Hua on 3 Jun 2022
I am reading an LSTM research paper, and it states:
The following experiments investigate deep RNN models parameterized by the following hyperparameters:
1. num_layers – the number of memory cell layers
2. rnn_size – the number of hidden units per memory cell (i.e. hidden state dimension)
3. wordvec – dimension of vector embeddings
4. seq_length – number of frames before truncating BPTT gradient
I can see how to set 2 and 3 (the number of hidden units and the input size), but I cannot find where one would set 1 and 4.

Answers (2)

David Willingham on 1 Jun 2022
Hi Philip,
For 1, the number of layers is not a single settable parameter by default. You need to set up an experiment that tests networks of different depths and see which one gives the best results. The example Try Multiple Pretrained Networks for Transfer Learning shows how you can use the Experiment Manager app in MATLAB to do this.
For 4, while I don't have an example to share, you could use Experiment Manager to set up an experiment that varies the sequence length of the input data used to train the LSTM.
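To sketch how both hyperparameters could be swept in one experiment, a setup function might build the layer stack from the trial's hyperparameter table. The function name, the 64-unit hidden state, the 50-dimensional input, and the 10-class output below are all placeholders I've assumed for illustration, not values from the thread:

```matlab
% Hypothetical Experiment Manager setup function (name and numeric
% values are assumptions). params is the struct Experiment Manager
% passes in, with fields numLayers and seqLength defined in the
% experiment's hyperparameter table.
function [layers, options] = lstmExperimentSetup(params)
    rnnSize = 64;                    % placeholder hidden-state size (rnn_size)
    layers = sequenceInputLayer(50); % placeholder input dimension (wordvec)
    for i = 1:params.numLayers       % hyperparameter 1: num_layers
        layers = [layers; lstmLayer(rnnSize)]; %#ok<AGROW>
    end
    layers = [layers
        fullyConnectedLayer(10)      % placeholder number of classes
        softmaxLayer
        classificationLayer];
    options = trainingOptions('adam', ...
        'SequenceLength', params.seqLength); % hyperparameter 4: seq_length
end
```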
David Willingham on 1 Jun 2022
For 1, in MATLAB this isn't a single settable parameter; however, you can stack LSTM layers manually:
[lstmLayer(64); lstmLayer(64)]
For 4, trainingOptions has a SequenceLength option that controls the sequence length used for each mini-batch.
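Putting both suggestions together, a minimal sketch might look like the following. All numeric values here are placeholders I've chosen for illustration, not recommendations:

```matlab
% Mapping the paper's hyperparameters onto MATLAB (placeholder values):
rnnSize    = 64;   % 2. rnn_size   -> NumHiddenUnits of each lstmLayer
wordvec    = 50;   % 3. wordvec    -> input feature dimension
seqLength  = 35;   % 4. seq_length -> 'SequenceLength' in trainingOptions
numClasses = 10;   % placeholder output size

% 1. num_layers = 2 -> stack two lstmLayer objects
layers = [
    sequenceInputLayer(wordvec)
    lstmLayer(rnnSize)
    lstmLayer(rnnSize)
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];

options = trainingOptions('adam', ...
    'SequenceLength', seqLength, ...  % split sequences into chunks of 35
    'MaxEpochs', 10);
```

Setting 'SequenceLength' to a positive integer splits each mini-batch's sequences into chunks of that length, which plays a role similar to the truncated-BPTT length in the paper.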


Philip Hua on 3 Jun 2022
Thank you David. Could you, however, clarify the suggested network configuration above? The number of memory cells, I think, is not the same as the number of LSTM layers, right? Perhaps you could kindly share an unrolled network diagram labeling the suggested configuration?

