NARnet Closed loop prediction results not good

I had issues with the dimensions of my input data, so I decided to stick with a NARnet and use only the volume of Oil Produced as my input (1050x1 double).
NMSEs =
0.0572
NMSEc =
6.0875
My questions are:
1. What do the NMSE values mean for my network?
2. The error autocorrelation and input-error cross-correlation plots look good (I guess!). What do they mean for my network?
3. The open-loop results look right, but the closed-loop results just look awful. Please help!
Thank You
Here is my code:
clc
plt=0;
% Autoregression Time-Series Problem with a NAR Neural Network
% Created Sat May 16 23:01:03 WAT 2015
%
% This script assumes this variable is defined:
%
% PInput - feedback time series. 1050x1double
T = tonndata(PInput,false,false);
N = length (T);
% Choose a Training Function
%
trainFcn = 'trainlm'; % Levenberg-Marquardt
% Create a Nonlinear Autoregressive Network
feedbackDelays = 1:4;
hiddenLayerSize = 6;
net = narnet(feedbackDelays,hiddenLayerSize,'open',trainFcn);
rng ('default')
% Choose Feedback Pre/Post-Processing Functions
% Settings for feedback input are automatically applied to feedback output
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer states.
% Using PREPARETS allows you to keep your original time series data unchanged, while
% easily customizing it for networks with differing numbers of delays, with
% open loop or closed loop feedback modes.
[Xs Xsi Asi Ts] = preparets(net,{},{},T);
ts1 = cell2mat( Ts );
plt = plt+1; figure(plt), hold on
plot( 5:N, ts1, 'LineWidth', 2 )
% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivide
net.divideFcn = 'divideblock'; % Divide data into contiguous blocks
net.divideMode = 'time'; % Divide up every value
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;
% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse'; % Mean squared error
% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','plotresponse', ...
'ploterrcorr', 'plotinerrcorr'};
% Train the Network
[net,tr,Ys,Es,Xf,Af] = train(net,Xs,Ts,Xsi,Asi);
ys1=cell2mat(Ys);
plot(5:N, ys1, 'ro', 'LineWidth', 2 )
legend( 'TARGET', 'OUTPUT' )
title( 'OPENLOOP NARNET RESULTS' )
%
Es = gsubtract( Ts, Ys ); % Open-loop errors
%view( net )
NMSEs = mse( Es ) /var( ts1,1 )
% Test the Network
y = net(Xs,Xsi,Asi);
e = gsubtract(Ts,y);
performance = perform(net,Ts,y);
% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(Ts,tr.trainMask);
valTargets = gmultiply(Ts,tr.valMask);
testTargets = gmultiply(Ts,tr.testMask);
trainPerformance = perform(net,trainTargets,y);
valPerformance = perform(net,valTargets,y);
testPerformance = perform(net,testTargets,y);
% Closed Loop Network
% For multi-step prediction.
% The function CLOSELOOP replaces the feedback input with a direct
% connection from the output layer.
%netc = closeloop(net);
%[xc,xic,aic,tc] = preparets(netc,{},{},T);
%yc = netc(xc,xic,aic);
%perfc = perform(net,tc,yc);
[ netc Xci Aci ] = closeloop(net,Xsi,Asi);
%view(netc)
[Xc,Xci,Aci,Tc] = preparets(netc,{},{},Ts);
[ Yc Xcf Acf ] = netc(Xc,Xci,Aci);
Ec = gsubtract(Tc,Yc);
yc1 = cell2mat(Yc);
tc = ts1;
NMSEc = mse(Ec) /var(tc,1)
% Multi-step Prediction
Xc2 = cell(1,N);
[ Yc2 Xcf2 Acf2 ] = netc( Xc2, Xcf, Acf );
yc2 = cell2mat(Yc2);
plt = plt+1; figure(plt), hold on
plot( 5:N, tc, 'LineWidth', 2 )
plot( 9:N, yc1, 'ro', 'LineWidth', 2 )
plot( N+1:2*N, yc2, 'o', 'LineWidth', 2 )
plot( N+1:2*N, yc2, 'r', 'LineWidth', 2 )
%axis( [ 0 2*N+2 0 1.3 ] )
legend( 'TARGET', 'OUTPUT' , 'TARGETLESS PREDICTION')
title( 'CLOSED LOOP NARNET RESULTS' )

Accepted Answer

Greg Heath on 19 May 2015
Edited: Greg Heath on 28 Sep 2015
I haven't looked at your code because you haven't looked at my previous posts regarding the transition from openloop to closeloop.
In more than one of them I clearly state that if the closed-loop performance is significantly worse than the open-loop performance, it usually means that small open-loop output errors are being fed back to the input and accumulating.
If the openloop errors are not too large, just train the closeloop configuration initialized with the weights obtained via the openloop training.
Typically, training the closeloop configuration from random initial weights takes a VERY LONG TIME. Hence the openloop + closeloop technique.
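For concreteness, here is a minimal sketch of that openloop-then-closeloop retraining, reusing net and T from the script above; the default trainlm settings are an assumption, not something specified in this thread:
% Close the loop on the already-trained open-loop net, then continue
% training so the weights adapt to their own fed-back outputs.
netc = closeloop(net);                    % keeps the weights learned in open loop
[Xc,Xci,Aci,Tc] = preparets(netc,{},{},T);
[netc,trc] = train(netc,Xc,Tc,Xci,Aci);   % retrain in closed-loop form
Yc = netc(Xc,Xci,Aci);
ec = cell2mat(gsubtract(Tc,Yc));
NMSEc = mean(ec.^2)/var(cell2mat(Tc),1)   % closed-loop NMSE after retraining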
However, sometimes the closeloop followup does not improve performance enough.
Then there are two obvious choices:
1. Design the closeloop configuration from scratch.
2. Redo the openloop + closeloop design with different initial weights, and/or different lags, and/or a different number of hidden nodes, and/or a different number of epochs, and/or different ... (see the sketch below).
Guess which one I would choose?
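For reference, a rough sketch of choice 2; the search ranges (hidden layer sizes 2:2:10, 10 random initializations each) are illustrative assumptions, not values from this thread:
% Re-run the open-loop design over candidate hidden layer sizes and random
% initializations, keeping the design with the lowest open-loop NMSE.
bestNMSE = Inf;
for numH = 2:2:10                          % candidate numbers of hidden nodes
    for trial = 1:10                       % random weight initializations per size
        net = narnet(1:4,numH,'open','trainlm');
        net.divideFcn = 'divideblock';
        [Xs,Xsi,Asi,Ts] = preparets(net,{},{},T);
        [net,tr] = train(net,Xs,Ts,Xsi,Asi);   % new random initial weights each pass
        es = cell2mat(gsubtract(Ts,net(Xs,Xsi,Asi)));
        NMSE = mean(es.^2)/var(cell2mat(Ts),1);
        if NMSE < bestNMSE
            bestNMSE = NMSE;  bestnet = net;
        end
    end
end
bestNMSE                                   % best open-loop NMSE found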
Hope this helps.
Thank you for formally accepting my answer
Greg
  2 Comments
Greg Heath on 19 May 2015
Uh oh.
I just took a look at your code. You only have
NMSEo ~ 0.05.
If you take a look at my designs you will see that my goal is either
NMSEgoal = 0.01
or
NMSEgoal = 0.005
Therefore you need a better openloop design.
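(For reference, NMSE here is the mean squared error normalized by the target variance, so Rsquare = 1 - NMSE and a model that always predicts the target mean scores NMSE = 1. A quick check of the posted values:)
Rsq_open   = 1 - 0.0572   % ~0.94: open loop explains ~94% of target variance
Rsq_closed = 1 - 6.0875   % negative: closed loop is worse than predicting the mean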
Hope this helps.
Greg
Olumide Oladoyin on 20 May 2015
I have read quite a bit on crosscorr, autocorr, and nncorr.
I just don't get what this statement means:
Use the significant lags of the target autocorrelation function and the target/input crosscorrelation function to determine ID and FD.
Could you please explain further?
From the code below: 1. What does the value 35 mean? 2. Do I just use the values inputdelays and feedbackdelays and pass them straight into the net like this?
net = narxnet( inputdelays , feedbackdelays ,hiddenLayerSize,'open',trainFcn)
x = OilRate;
t = GasProduced;
X = zscore(cell2mat(x));
T = zscore(cell2mat(t));
[ I N ] = size(X)
[ O N ] = size(T)
crosscorrXT = nncorr(X,T,N-1);
autocorrT = nncorr(T,T,N-1);
crosscorrXT(1:N-1) = []; % Delete negative lags; element k now corresponds to lag k-1
autocorrT(1:N-1) = [];
sigthresh95 = 0.21 % 95% significance threshold
sigcrossind = -1 + find( crosscorrXT >= sigthresh95 ) % Significant crosscorrelation lags
sigautoind  = -1 + find( autocorrT >= sigthresh95 )   % Significant autocorrelation lags
inputdelays = sigcrossind(sigcrossind <= 35)
feedbackdelays = sigautoind(sigautoind <= 35)
feedbackdelays(1) = [] % Delete the zero lag (autocorrelation at lag 0 is always 1)
