Dropout layer for neural network

Lingies Santhirasekaran on 19 Jun 2019
Answered: M1997 on 28 Jun 2019
Dear all,
I would like to know how to use dropout for a neural network. I have checked the documentation: https://www.mathworks.com/help/deeplearning/ref/nnet.cnn.layer.dropoutlayer.html but it doesn't exactly show how to apply it to a fitting problem. There is an example, but it's only for CNNs. I'm not sure if it has the feature to apply dropout to hidden layers the way it can be done in Keras.
I attempted to do it this way: net.layers{1} = dropoutLayer(0.5). It seemed like a logical way to apply it, but it did not work.
I have searched and read all the questions and answers regarding dropoutLayer in MATLAB, but I can't find any proper way to apply it. Anyone who has tried it before, please advise me on this. Thanks in advance.

Answers (1)

M1997 on 28 Jun 2019
Hey,
From what I understand, the dropout layer randomly sets a fraction of its input elements to zero, with the dropout probability given by the value X in dropoutLayer(X).
When you create the layers for an LSTM network, let's say:
numHiddenUnits = 200; % "Number of Neurons"
featureDimension = 2; % "Number of Inputs"
numResponses = 1; %"Number of Outputs"
layers = [ ...
    % Input sequence layer
    sequenceInputLayer(featureDimension)
    % LSTM layer
    lstmLayer(numHiddenUnits,'OutputMode','sequence')
    % Fully connected layer reducing the output size to 50
    fullyConnectedLayer(50)
    % Before the final fully connected layer (whose output size matches
    % numResponses, the number of outputs in the training data), randomly
    % set half of the activations to zero.
    dropoutLayer(0.5)
    fullyConnectedLayer(numResponses)
    regressionLayer];
The dropout layer will randomly set 50% of the activations coming out of the first fullyConnectedLayer to zero during training; at prediction time, the output of the layer is equal to its input.
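The effect can be illustrated with plain MATLAB, outside of any network. This is a conceptual sketch only: the mask and the rescaling shown here follow the common "inverted dropout" convention from the Srivastava et al. paper, and are not meant to reproduce the internal implementation of dropoutLayer.

```matlab
% Conceptual sketch of dropout applied to a vector of activations.
p = 0.5;                        % dropout probability, as in dropoutLayer(0.5)
x = [1 2 3 4 5 6 7 8];          % example activations from the previous layer

% Training time: zero out each element with probability p, then rescale
% the survivors so the expected sum matches the original activations.
mask = rand(size(x)) > p;       % logical mask: 1 = keep, 0 = drop
yTrain = (x .* mask) ./ (1 - p);

% Prediction time: dropout is a pass-through.
yPredict = x;
```

Roughly half the entries of yTrain will be zero on any given run, which is why a different "thinned" network is trained on each iteration.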
This is the reference MATLAB provides for understanding dropout (if you have used Keras, you may not need to read it): Srivastava, N., G. Hinton, A. Krizhevsky, I. Sutskever, and R. Salakhutdinov. "Dropout: A Simple Way to Prevent Neural Networks from Overfitting." Journal of Machine Learning Research, Vol. 15, pp. 1929–1958, 2014.
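For completeness, here is a hedged sketch of training such a network with trainNetwork on toy sequence-to-sequence regression data. The data, dimensions, and training options below are made up purely for illustration; it assumes the Deep Learning Toolbox is installed.

```matlab
% Layer array with dropout before the final fully connected layer,
% as in the example above (2 input features, 1 response).
layers = [ ...
    sequenceInputLayer(2)
    lstmLayer(200,'OutputMode','sequence')
    fullyConnectedLayer(50)
    dropoutLayer(0.5)
    fullyConnectedLayer(1)
    regressionLayer];

% Toy data: 100 variable-length sequences. Each observation is a
% numFeatures-by-length matrix; each response is 1-by-length.
numObs = 100;
XTrain = cell(numObs,1);
YTrain = cell(numObs,1);
for i = 1:numObs
    len = randi([20 40]);
    XTrain{i} = rand(2, len);
    YTrain{i} = sum(XTrain{i}, 1);   % illustrative target
end

options = trainingOptions('adam', ...
    'MaxEpochs', 5, ...
    'MiniBatchSize', 20, ...
    'Verbose', false);

net = trainNetwork(XTrain, YTrain, layers, options);
```

Note that dropout is only active while trainNetwork is running; calling predict on the trained network uses all activations.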

Release

R2018b
