Why did the network performance decrease?
Hi all,
I applied NARXNET to predict a time series. The problem is that when I use a for loop to find the optimal number of hidden nodes (HN) and then train a new network with the selected HN, the performance (R value) decreases, e.g., from 0.9833 to 0.9663. Why?
Thank you for your help.
Accepted Answer
Greg Heath
on 26 Apr 2016
Given a value for the number of hidden nodes, using different random weight initializations AND random data divisions will yield a spread of results. The difference you state is typical.
To keep things manageable, I typically do not train more than 100 nets at a time: numH = numel(Hmin:dH:Hmax) = 10 candidate values of H and Ntrials = 10 random initializations for each. I display the 100 NMSE or Rsq = 1 - NMSE results in an Ntrials x numH matrix, then display the min, median, mean, std, and max of Rsq for each H in a 5 x numH matrix.
You would be surprised how disparate some results can be.
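A minimal sketch of that double loop, assuming X and T are cell arrays holding the input and target time series and ID and FD are the input and feedback delay vectors (those names are placeholders, not taken from the original question):

for j = 1:numH
Hmin = 1; dH = 1; Hmax = 10;             % numH = 10 candidate hidden-node counts
Ntrials = 10;                            % random initializations per H
Hvals = Hmin:dH:Hmax;
numH  = numel(Hvals);
Rsq   = zeros(Ntrials, numH);            % Ntrials x numH results matrix
% Reference MSE of the naive constant (mean) model, used to normalize MSE
MSE00 = mean(var(cell2mat(T), 1, 2));
for j = 1:numH
    H = Hvals(j);
    for i = 1:Ntrials
        rng(i)                                   % reproducible random init and data division
        net = narxnet(ID, FD, H);                % open-loop NARX net with H hidden nodes
        [Xs, Xi, Ai, Ts] = preparets(net, X, {}, T);
        net = train(net, Xs, Ts, Xi, Ai);
        Ys   = net(Xs, Xi, Ai);
        NMSE = perform(net, Ts, Ys) / MSE00;     % default performance is MSE
        Rsq(i, j) = 1 - NMSE;                    % Rsq = 1 - NMSE
    end
end
% 5 x numH summary: min, median, mean, std, and max of Rsq for each H
summaryRsq = [min(Rsq); median(Rsq); mean(Rsq); std(Rsq); max(Rsq)]

If you pick H from the best single Rsq and then retrain with a fresh random state, the new net is just another draw from that column of Rsq, which is why the reported R value can come out lower. Judge each H by the whole column (e.g., its median), not by one lucky run.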
Searching the NEWSGROUP and ANSWERS using
greg Ntrials
should bring up enough examples.
Hope this helps.
Thank you for formally accepting my answer
Greg