
Can I simulate a wave-like output with an LSTM network and the same inputs?

Hello everyone,
I am currently having difficulty with an LSTM simulation.
I am running a virtual hydrological experiment with linear reservoirs: the outflow of the reservoir is calculated from the rainfall input and the combined water balance of multiple linear reservoirs.
The rainfall is constant (2 mm/day), but I switch it on and off. The graph below shows the rainfall for 22932 days (the training data; 2/3 of the total data).
The outflow has a wave-like pattern; the image below shows the output for the same 22932 days (training data; 2/3 of the total data).
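For reference, a minimal sketch of this kind of setup (read here as a cascade of linear reservoirs; the storage coefficient, the number of reservoirs, and the on/off period below are placeholders, not my actual values) could look like:
nDays = 22932;
prcpExample = 2*double(mod(floor((0:nDays-1)'/30),2)==0); %2 mm/day, switched on and off every 30 days (placeholder period)
k = 20;   %storage coefficient in days (placeholder)
nRes = 3; %number of reservoirs in series (placeholder)
S = zeros(nRes,1);
Qexample = zeros(nDays,1);
for t = 1:nDays
    inflow = prcpExample(t);
    for r = 1:nRes
        S(r) = S(r) + inflow; %water balance: add the inflow to storage
        Q = S(r)/k;           %linear reservoir: outflow proportional to storage
        S(r) = S(r) - Q;
        inflow = Q;           %this reservoir's outflow feeds the next one
    end
    Qexample(t) = inflow;     %outflow of the last reservoir
end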
My test is to examine whether I can simulate the outflow of the reservoir from rainfall alone with an LSTM network. So far my simulation result is less than ideal; I cannot reproduce the wave-like pattern from rainfall alone. The result below shows the test outflow (Testing1) and the simulation (sim0):
Is it possible to simulate the wave-like pattern from precipitation alone, when the precipitation pulses are identical but the outflow differs in intensity? Here is my code. Please take a look and explain how I can improve my simulation. Thank you.
num_hidden_units = 100; %not shown in the original post; set to your chosen number of hidden units
miniBatchSize = 256; %one predictor sequence
numResponses = 1;
featureDimension0 = 1; %precipitation alone
layers0 = [ ...
    sequenceInputLayer(featureDimension0)
    lstmLayer(num_hidden_units,'OutputMode','sequence')
    fullyConnectedLayer(numResponses)
    dropoutLayer(0.01)
    regressionLayer];
options = trainingOptions('adam', ...
    'MaxEpochs',50, ...
    'GradientThreshold',1, ...
    'MiniBatchSize',miniBatchSize, ...
    'InitialLearnRate',0.005, ...
    'SequenceLength',5733, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',200, ...
    'LearnRateDropFactor',0.001, ...
    'Verbose',0);
options.Plots = 'training-progress';
%input prcp01, the precipitation
%input nu01, the simulated discharge
length_01 = length(prcp01);
LTrain_01 = length_01*2/3; %2/3 for training
LTrain_01 = round(LTrain_01);
XTrain01 = prcp01(1:LTrain_01,:);
XTest01 = prcp01(LTrain_01+1:end,:); %test set starts after the training period
YTrain01 = squeeze(nu01(2,1:LTrain_01,end));
YTest01 = squeeze(nu01(2,LTrain_01+1:end,end));
%training
net_nu01 = trainNetwork(XTrain01',YTrain01,layers0,options);
%simulation for the training period
Qsim100a = predict(net_nu01,XTrain01');
Qsim100a = double(Qsim100a');
%simulation for the test period
Qsim100b = predict(net_nu01,XTest01');
Qsim100b = double(Qsim100b');

Answers (1)

Ashutosh on 25 Aug 2023
The following steps can be followed to improve the simulation:
  • The current LSTM network architecture contains only one LSTM layer and one fully connected layer. Try increasing the number of LSTM layers or hidden units to capture more complex patterns, and experiment with different architectures to find the optimal configuration for this input (see the first sketch after this list).
  • Normalize the input data, which can help the LSTM network converge faster. Apply the same normalization to both the training and the testing data for better performance (also shown in the first sketch below).
  • Experiment with hyperparameters such as MaxEpochs, MiniBatchSize, and the dropout rate to find their optimal configuration (see the second sketch below).
  • Other recurrent layer types, such as the gated recurrent unit (GRU) and the bidirectional LSTM, can also be tried to find the optimal architecture (see the third sketch below).
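For example (reusing the variable names from your code), a deeper network with built-in z-score normalization of the input could look like the sketch below; the hidden-unit counts and dropout rate are placeholder values to experiment with, not tuned recommendations:
numHiddenUnits1 = 128; %placeholder; tune for your data
numHiddenUnits2 = 64;  %placeholder
layers1 = [ ...
    sequenceInputLayer(featureDimension0,'Normalization','zscore') %normalization statistics are learned from the training data and reused at prediction time
    lstmLayer(numHiddenUnits1,'OutputMode','sequence')
    dropoutLayer(0.2) %placeholder dropout rate
    lstmLayer(numHiddenUnits2,'OutputMode','sequence')
    fullyConnectedLayer(numResponses)
    regressionLayer];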
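One simple way to experiment with hyperparameters systematically is to loop over a few candidate values and keep the network that performs best on a held-out validation part of the training data; the candidate values below are placeholders:
learnRates = [0.01 0.005 0.001]; %placeholder candidates
batchSizes = [64 128 256];       %placeholder candidates
for i = 1:numel(learnRates)
    for j = 1:numel(batchSizes)
        opts = trainingOptions('adam', ...
            'MaxEpochs',50, ...
            'MiniBatchSize',batchSizes(j), ...
            'InitialLearnRate',learnRates(i), ...
            'GradientThreshold',1, ...
            'Verbose',0);
        netCandidate = trainNetwork(XTrain01',YTrain01,layers1,opts);
        %evaluate netCandidate on a validation split (e.g. the last part of the
        %training period) and keep the configuration with the lowest error
    end
end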

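Swapping the recurrent layer type is then a one-line change; for instance, a GRU-based variant of the same sketch (again with placeholder sizes):
layersGRU = [ ...
    sequenceInputLayer(featureDimension0,'Normalization','zscore')
    gruLayer(numHiddenUnits1,'OutputMode','sequence')
    fullyConnectedLayer(numResponses)
    regressionLayer];
%or use bilstmLayer(numHiddenUnits1,'OutputMode','sequence') for a bidirectional LSTM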