Why does 'net.IW{1,1}' return negative and very small values like -0.0090, 0.2447, and so on?
I am building a multilayer neural network with the newff function.

size of input = 3 x 150 (3 input variables, 150 samples)
size of target = 1 x 150

net = newff(minmax(Inputs), [10, 1], {'logsig' 'logsig'});

There are 10 neurons in the hidden layer and 1 neuron in the output layer, so the size of the input weight matrix of the hidden layer should be 3 x 10.

a = weight;        % the values in weight are 50 and above
a = net.IW{1,1};

Why does a = net.IW{1,1}; return negative and very small values?
Accepted Answer
Greg Heath
on 23 Jun 2015
[ 10 3 ] = size(net.IW{1,1})   % (No. of hidden nodes) x (No. of inputs), not 3 x 10
Do not worry about the magnitude or sign of a single weight in a single design. Typically, it is almost impossible to interpret the role of any individual weight.
The best approach for designing a real-world net is to make multiple designs to
1. Mitigate the uncertainty of using random initial weights
2. Reduce the time and effort needed to find a sufficiently low local minimum in a mountainous weight space (see the sketch below).
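[Editor's note: the following is a minimal sketch of the multiple-design idea, not part of the original answer. It assumes the 3 x 150 matrix Inputs from the question and a 1 x 150 target matrix Targets; the name Targets is an assumption.]

% Train several designs from different random initial weights and keep
% the best one; newff assigns fresh random initial weights on every call.
Ntrials  = 10;                          % number of independent designs
bestPerf = Inf;
bestNet  = [];
for k = 1:Ntrials
    net = newff(minmax(Inputs), [10, 1], {'logsig' 'logsig'});
    net.trainParam.showWindow = false;  % suppress the training GUI
    net = train(net, Inputs, Targets);
    e    = sim(net, Inputs) - Targets;  % error = output - target
    perf = mse(e);
    if perf < bestPerf
        bestPerf = perf;
        bestNet  = net;
    end
end
bestPerf          % best training MSE over the Ntrials designs
bestNet.IW{1,1}   % 10 x 3 input weight matrix of the best design

Every run yields a different IW{1,1}; what matters is how the chosen design performs on data it has not seen.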
Using too many weights is called overfitting. If an overfit net is also trained too long (overtraining), the training error can be driven very low; however, such designs very often do not perform well on nontraining data.
I try to minimize the number of weights by minimizing the number of hidden nodes, subject to the constraint that the variance of the error (= output - target) is smaller than the average variance of the target components by a factor of roughly 100 to 200 (i.e., an error variance no larger than about 0.005 to 0.01 times the mean target variance).
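[Editor's note: the sketch below, not part of the original answer, shows one way that search could be implemented, reusing Inputs and Targets from above; the factor of 100 is the looser end of the ~100 to 200 range.]

% Find the smallest hidden layer whose training error meets the goal.
MSEgoal = 0.01 * mean(var(Targets', 1));   % ~1/100 of the mean target variance
for H = 1:10
    net = newff(minmax(Inputs), [H, 1], {'logsig' 'logsig'});
    net.trainParam.showWindow = false;
    net = train(net, Inputs, Targets);
    e = sim(net, Inputs) - Targets;        % error = output - target
    if mse(e) <= MSEgoal
        fprintf('H = %d hidden nodes meets the goal (MSE = %g)\n', H, mse(e));
        break
    end
end

Because of the random initial weights, each candidate H should really be tried several times (as in the earlier loop) before rejecting it, and the final net should be judged on held-out data rather than on the training error alone.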
Hope this helps.
Thank you for formally accepting my answer
Greg
2 Comments
Greg Heath
on 9 Dec 2019
My advice is to normalize real-valued data to [ -1 , 1 ] and use TANSIG for hidden node functions.
Greg
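[Editor's note: a minimal sketch of that advice, not part of the original thread, again assuming Inputs and Targets as above. mapminmax maps each row to [ -1, 1 ] by default; the purelin output layer is an assumption for a real-valued target, not part of the comment.]

% Normalize each input row to [-1, 1] and use tansig hidden nodes.
[xn, xs] = mapminmax(Inputs);           % xs stores the scaling settings
net = newff(minmax(xn), [10, 1], {'tansig' 'purelin'});   % purelin output is an assumption
net = train(net, xn, Targets);
% New data must be scaled with the same settings before simulation:
% ynew = sim(net, mapminmax('apply', Xnew, xs));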