Neural Network - Multi Step Ahead Prediction (PART 2)

Answer by Lucas García on 7 Sep 2011
Edited by Lucas García on 3 Sep 2015 at 13:26 to simplify step 5
The original 4-year-old thread is very long, with non-sequential, irrelevant, and confusing posts. Therefore, I thought it would be less confusing to continue in a new thread, starting with comments on García's interesting post.
% When using narxnet, the network performs only a one-step-ahead
% prediction after it has been trained. Therefore, you need to use
% closeloop to perform a multi-step-ahead prediction and turn the
% network into a parallel configuration.
GEH0 = [ ' It seems to me if you have a net with 2 delays ' ...
         'that you are making a two-step ahead prediction ' ]
Take a look at this example of a multi-step-ahead prediction, N steps ahead.
It uses the dataset magdata.mat, which is available in the Neural Network
Toolbox. Also, some of the inputs will be used for performing the
multi-step-ahead prediction, and the results will be validated against the
original data. I hope the comments help you understand.
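Before the full example, here is a minimal sketch of the two configurations the quoted explanation refers to, using the same functions as below; the delays and hidden-layer size are just the step-3 defaults, and net0/netc0 are throwaway names so they do not clash with the net built later:
net0  = narxnet(1:2,1:2,10); % series-parallel (open loop): fed the measured y(t-1), y(t-2)
netc0 = closeloop(net0);     % parallel (closed loop): feeds back its own predicted outputs
view(net0), view(netc0)      % compare the two block diagrams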
rng('default') , GEH1 = 'Added for replication study'
%%1. Importing data
S = load('magdata');
X = con2seq(S.u);
T = con2seq(S.y);
GEH2 = ' indices 1:4001 '
%%2. Data preparation
N = 300; % Multi-step ahead prediction
% Input and target series are divided in two groups of data:
% 1st group: used to train the network
inputSeries = X(1:end-N);
targetSeries = T(1:end-N);
GEH3 = ' indices 1:3701 '
% 2nd group: this is the new data used for simulation.
% inputSeriesVal will be used for predicting new targets.
% targetSeriesVal will be used for network validation after
% prediction.
inputSeriesVal = X(end-N+1:end);
targetSeriesVal = T(end-N+1:end); % This is generally not available
GEH4 = ' indices 3702:4001 '
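A quick sanity check of the split; the expected counts follow from GEH2-GEH4 (4001 samples in total for magdata):
numel(X)              % 4001 total samples (GEH2)
numel(inputSeries)    % 3701 training samples, indices 1:3701 (GEH3)
numel(inputSeriesVal) % 300 held-out samples, indices 3702:4001 (GEH4)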
%%3. Network Architecture
delay = 2;
neuronsHiddenLayer = 10;
GEH5 = ' Are these default values anywhere near optimal? '
% Network Creation
net = narxnet(1:delay,1:delay,neuronsHiddenLayer);
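GEH5 asks whether delay = 2 and 10 hidden neurons are anywhere near optimal. A minimal sketch of a search over candidate hidden-layer sizes (the candidate list and the single trial per candidate are illustrative assumptions, not part of the original post; a thorough search would also vary the delays and average over several random weight initializations):
Hcandidates = 2:2:20;                  % illustrative candidates
bestPerf = Inf; bestH = NaN;
for H = Hcandidates
    neti = narxnet(1:delay,1:delay,H);
    [Xsi,Xii,Aii,Tsi] = preparets(neti,inputSeries,{},targetSeries);
    neti = train(neti,Xsi,Tsi,Xii,Aii);
    Yi = neti(Xsi,Xii,Aii);
    perfi = perform(neti,Tsi,Yi);      % open-loop MSE for this candidate
    if perfi < bestPerf
        bestPerf = perfi; bestH = H;
    end
end
bestH, bestPerf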
%%4. Training the network
[Xs,Xi,Ai,Ts] = preparets(net,inputSeries,{},targetSeries);
GEH6 = ' Xs( 2x3699 ) , Xi( 2x2 ), Ai( 2x0 ), Ts( 1x3699 ) '
GEH7 = ' In general, save the initial training RNG state for duplication purposes '
S1 = rng; % GEH
net = train(net,Xs,Ts,Xi,Ai);
view(net)
Y = net(Xs,Xi,Ai);
GEH8 = ' What are the random trn/val/tst indices ? '
GEH9 = [ 'Missing training record; following info from training plots: ' ...
         'ValStop@epoch 144, MSEval = 6.1e-7, MSE = 1.7e-6, ' ...
         'Training Time = 8 sec, Gradient = 2.75e-4, Mu = 1e-6, ' ...
         'Significant Input/Error correlations @ lags 3-to-6, ' ...
         'Significant Error autocorrelations @ lags 1-to-4 ' ]
GEH10 = ' Y( 1x3699 ) '
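GEH8 and GEH9 both point at information that the single-output train call discards. A sketch assuming the call in step 4 is written with two outputs instead (the fields shown are the standard training-record fields; the numbers would only match GEH9 if run in place of, not after, the original call):
[net,tr] = train(net,Xs,Ts,Xi,Ai);                % tr is the training record
tr.trainInd(1:5), tr.valInd(1:5), tr.testInd(1:5) % random trn/val/tst indices (GEH8)
tr.best_epoch                                     % epoch of the validation stop
tr.best_vperf                                     % best validation MSE
tr.time(end)                                      % training time in seconds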
%Performance for the series-parallel implementation, only one-step-ahead prediction
GEH11 = ' delay = 2 <==> two-step ahead '
GEH12 = ' (Xi,Ai) ~ 1:2 ; (Xs,Ts,Y) ~ 3:3701 '
perf = perform(net,Ts,Y) % 1.5105e-06
GEH13 = ' Should normalize by var(cell2mat(Ts),1) = 2.051 to get Rsq = 1 '
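The normalization GEH13 suggests, spelled out (Rsq = 1 - NMSE, using the biased variance of the targets as the reference):
NMSEo = perf/var(cell2mat(Ts),1) % normalized MSE of the open-loop fit
Rsqo  = 1 - NMSEo                % ~ 1, an essentially perfect one-step-ahead fit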
%%5. Multi-step ahead prediction
[Xs1,Xio,Aio] = preparets(net,inputSeries(1:end-delay),{},targetSeries(1:end-delay));
[Y1,Xfo,Afo] = net(Xs1,Xio,Aio);
% Xs1( 2x3697 ), Xio( 2x2 ), Aio( 2x0 ) % Y1( 1x3697 ), Xfo( 2x2 ), Afo( 2x0 )
GEH14 = 'input ~ 1:3699; Xio,Aio ~ 1:2; Xs1,Y1 ~ 3:3699, Xfo,Afo ~3698:3699'
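% The final open-loop states (Xfo,Afo) are handed to closeloop below, which
% converts them into the closed-loop initial states (Xic,Aic), so the N-step
% prediction continues from where the open-loop run above stopped.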
[netc,Xic,Aic] = closeloop(net,Xfo,Afo);
[yPred,Xfc,Afc] = netc(inputSeriesVal,Xic,Aic);
% Xic(1x2), Aic(2x2), yPred(1x300), Xfc(1x2 ), Afc( 2x2)
multiStepPerformance = perform(net,yPred,targetSeriesVal) % 0.12753
GEH15 = [ ' Should normalize by var(cell2mat(targetSeriesVal),1) = 1.394 ' ...
          'to get Rsq = 0.90852 ' ]
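The same normalization for the N-step-ahead (closed-loop) prediction:
NMSEc = multiStepPerformance/var(cell2mat(targetSeriesVal),1)
Rsqc  = 1 - NMSEc  % ~ 0.909: about 91 percent of the target variance is explained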
view(netc)
figure;
plot([cell2mat(targetSeries),nan(1,N);
nan(1,length(targetSeries)),cell2mat(yPred);
nan(1,length(targetSeries)),cell2mat(targetSeriesVal)]')
legend('Original Targets','Network Predictions','Expected Outputs')
Hope this helps.
Greg
