The neural network stops before it starts ("minimum gradient reached" error)?

Hello everybody. I am using a neural network in my graduation research project, "Pattern Identification". I have built a custom neural network and I am training it with the 'trainlm' function.
The problem is that as soon as I start the training process, the training window reports that the minimum gradient has been reached, and the network stops before it has really begun training!
(The architecture of my NN is complicated: it has three inputs and 13 layers, each with a different number of neurons.)
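For reference, here is a minimal sketch (nothing like my actual 13-layer network; x and t are placeholder data) showing where the stopping reason and the gradient threshold can be inspected:

% Minimal sketch only -- not the actual 13-layer network.
% x and t are placeholders for the real inputs and targets.
x = rand(3, 200);                              % 3 inputs, 200 samples (dummy data)
t = rand(1, 200);                              % dummy targets
net = feedforwardnet([10 10 10], 'trainlm');   % small multilayer example
[net, tr] = train(net, x, t);
tr.stop                     % reason training stopped, e.g. 'Minimum gradient reached.'
net.trainParam.min_grad     % gradient threshold (default 1e-7 for trainlm)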
I hope someone can help me.
Thanks in advance!
  2 Comments
EanX on 30 Sep 2015
I have a similar problem designing a NARNET.
Currently the best performance was obtained with 15 hidden nodes (1 hidden layer), mapminmax as the input processing function (ymin = 0, ymax = 10), tansig as the transfer function for layer 1, and poslin for the output layer.
I noticed that if I change the output transfer function to the more commonly used purelin I can avoid the "minimum gradient reached" issue, but I used poslin to keep the forecasts always positive.
Can anyone enlighten me on this topic?
Thanks in advance!
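For reference, this is roughly how the network is set up (T is a placeholder for my real series; indexing mapminmax as the second processing function assumes the default processFcns order on the feedback input):

% Sketch of the NARNET configuration described above.
T  = num2cell(abs(sin(1:200)) + 0.1);     % dummy positive series; use your own
net = narnet(1:2, 15);                    % 15 hidden nodes, 1 hidden layer
% mapminmax is the 2nd default processing function on the feedback input
net.inputs{1}.processParams{2}.ymin = 0;
net.inputs{1}.processParams{2}.ymax = 10;
net.layers{1}.transferFcn = 'tansig';     % hidden layer
net.layers{2}.transferFcn = 'poslin';     % output layer -> nonnegative forecasts
% (switching this to 'purelin' avoids the premature MinGrad stop,
%  but then forecasts are no longer guaranteed positive)
[Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T);
[net, tr] = train(net, Xs, Ts, Xi, Ai);
tr.stop                                   % why training stopped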
Greg Heath on 30 Sep 2015
Edited: Greg Heath on 30 Sep 2015
Do not change the defaults until you have obtained the best default result.
Are you looping over 10 or more weight initializations and getting a premature MinGrad stop on all of them?
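A minimal sketch of such a loop (illustrative only; the dummy series T stands in for the real data, and the narnet settings are taken from the comment above):

% Loop over random weight initializations and record why each run stopped.
T = num2cell(abs(sin(1:200)) + 0.1);        % dummy series; use your own
Ntrials = 10;
stopReason = cell(Ntrials, 1);
for k = 1:Ntrials
    net = narnet(1:2, 15);                  % new net -> new random initial weights
    [Xs, Xi, Ai, Ts] = preparets(net, {}, {}, T);
    [net, tr] = train(net, Xs, Ts, Xi, Ai);
    stopReason{k} = tr.stop;                % e.g. 'Minimum gradient reached.'
end
disp(stopReason)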


Accepted Answer

Greg Heath on 30 May 2014
The best way to design a neural net is to first find the simplest architecture that yields a good solution.
My advice is to start with a single hidden layer and find the smallest number of hidden neurons that will yield a solution that cannot be significantly improved by using more neurons.
The best targets for patternnet are unit column vectors with a single unity component (i.e., one-hot columns, as produced by ind2vec). The best training algorithm for that type of target is trainscg.
Hope this helps.
Greg
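A minimal sketch of that strategy (the iris_dataset example data, the range of candidate sizes, and the test-error criterion are stand-ins, not part of the answer above):

% Sketch: single hidden layer, one-hot targets, trainscg, and a search
% over hidden-layer sizes. iris_dataset is stand-in data only.
[x, t] = iris_dataset;                     % 4 inputs, 3 classes, one-hot targets
results = zeros(20, 2);                    % columns: [H, test error]
for H = 1:20
    net = patternnet(H, 'trainscg');       % single hidden layer of H neurons
    net.trainParam.showWindow = false;     % suppress the training GUI in the loop
    [net, tr] = train(net, x, t);
    y = net(x(:, tr.testInd));
    err = mean(vec2ind(y) ~= vec2ind(t(:, tr.testInd)));
    results(H, :) = [H, err];
end
disp(results)   % pick the smallest H beyond which the error stops improving

In practice you would also repeat each candidate size over several random weight initializations, as suggested in the comments above, before settling on the smallest adequate H.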
  1 Comment
Abdullah on 30 May 2014
Thanks so much, I will try to apply what you have mentioned and then tell you the results.
Thanks so much :)

