Multivariate Regression (in time and features) Using LSTM
I am trying to feed an LSTM with several streamflow time series and their delayed versions for gap filling. Let x be the initial matrix of selected predictors, one per column, with size(x,1) being the number of time samples. To introduce time dependence, each predictor is interleaved with its delayed versions (delays 0 to ndt-1, ndt being the window length) as below:
x1 = zeros(size(x,1), ndt*size(x,2));  % preallocate the delay-embedded matrix
for ii = 1:size(x,2)
    for j = 1:ndt
        % Column ndt*(ii-1)+j holds predictor ii delayed by j-1 steps
        % (j = 1 keeps the undelayed series).
        x1(j:end, ndt*(ii-1)+j) = x(1:end-j+1, ii);
    end
end
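For clarity, x1 is then fed to the network below as one long sequence, transposed so that features run along the first dimension (which is what numFeatures = size(xTrain,1) assumes):
xTrain = x1';  % (ndt * number of predictors)-by-ntime, features along dim 1
yTrain = y;    % 1-by-ntime target (the row vector y described below)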
The corresponding LSTM is:
numFeatures = size(xTrain,1);
numResponses = size(yTrain,1);
numHiddenUnits = 300;
layers = [ ...
    sequenceInputLayer(numFeatures)
    lstmLayer(numHiddenUnits)
    fullyConnectedLayer(numResponses)
    regressionLayer];
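For completeness, I train with something like the following (the solver and hyperparameters are just illustrative choices):
options = trainingOptions('adam', ...
    'MaxEpochs', 200, ...
    'GradientThreshold', 1, ...
    'InitialLearnRate', 0.005, ...
    'Plots', 'training-progress');
net = trainNetwork(xTrain, yTrain, layers, options);
yPred = predict(net, xTrain);  % sanity check on the training sequence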
The target is a row vector y. Is there a more effective arrangement to introduce time dependencies in an LSTM? I mean, I have tried to associate every instance of y with a 3D array x2 containing the values of x (not of x1) from t-ndt+1 to t:
x2 = zeros(ndt, size(x,2), size(x,1));  % window x predictors x time (first ndt-1 slices stay zero)
for ii = ndt:size(x,1)
    x2(:,:,ii) = x(ii-ndt+1:ii, :);     % the ndt steps ending at time ii
end
But I don't know how to adapt the LSTM above to that arrangement.
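My best guess (untested) is to drop the 3D array, store each window as one cell with the predictors along the first dimension and the ndt time steps along the second, and switch to a sequence-to-one network via 'OutputMode','last' (the names XTrain, YTrain, layers2 are just illustrative):
nobs = size(x,1) - ndt + 1;                 % one observation per complete window
XTrain = cell(nobs, 1);
for ii = ndt:size(x,1)
    XTrain{ii-ndt+1} = x(ii-ndt+1:ii, :)';  % size(x,2)-by-ndt sequence
end
YTrain = y(ndt:end)';                       % one response per window
layers2 = [ ...
    sequenceInputLayer(size(x,2))
    lstmLayer(numHiddenUnits, 'OutputMode', 'last')  % emit only the last step
    fullyConnectedLayer(1)
    regressionLayer];
net2 = trainNetwork(XTrain, YTrain, layers2, options);
Would that be the right way to set it up?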
I know the "Sequence-to-Sequence Regression Using Deep Learning" example, but it does not include explicit time dependencies.
Thanks.