What is the meaning of range in linear neural network newlin?
I read this in the MATLAB help:
net = newlin([-1 1],1,[0 1],0.01);
This code creates a single-input (range of [-1 1]) linear layer with one neuron, etc.
But when I wrote this code and gave it inputs outside that range, it still produced correct results. What, then, is the significance of defining the range in the code?
Here is my example:
net = newlin([1 2; 1 2],1);
net.iw{1,1} = [1 2];
net.b{1} = 0;
p = [1 2 2 3; 2 1 3 1];
a = sim(net,p)
a =
5 4 8 5
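The output above makes sense once you note that a linear layer simply computes a = W*p + b for each input column; the declared range never clips or rescales the inputs at simulation time. A minimal sketch of that arithmetic (in NumPy, purely to illustrate the computation outside MATLAB):

```python
import numpy as np

# Reproduce the forum example: a linear layer computes a = W @ p + b
# for every input column, with no clipping to the declared range.
W = np.array([[1, 2]])   # net.iw{1,1} = [1 2]
b = 0                    # net.b{1} = 0
p = np.array([[1, 2, 2, 3],
              [2, 1, 3, 1]])

a = W @ p + b
print(a)  # [[5 4 8 5]] -- matches sim(net,p), even where inputs exceed [1 2]
```

Note that the third and fourth columns of p contain values outside the declared [1 2] range, yet the outputs are still the exact linear combinations.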
Accepted Answer
Greg Heath
on 12 Jun 2013
According to both the help newlin and doc newlin documentation in R2011b (NNET 7.0.2):
>> help newlin
newlin Create a linear layer.
1. Obsoleted in R2010b NNET 7.0. Last used in R2010a NNET 6.0.4. The recommended function is linearlayer.
2. Syntax
net = newlin(P,S,ID,LR)
net = newlin(P,T,ID,LR)
P - RxQ matrix of Q representative input vectors.
S - Number of elements in the output vector.
T - SxQ2 matrix of Q2 representative S-element output vectors.
ID - Input delay vector, default = [0].
LR - Learning rate, default = 0.01;
NET = newlin(PR,S,0,P) takes an alternate argument,
P - Matrix of input vectors.
and returns a linear layer with the maximum stable learning rate
Notice that PR is not defined and ID is missing from the argument list.
I know from other documentation that
PR = minmax(P)
which is merely used to help find good initial weight values for init or train. It does nothing to P during training.
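For reference, minmax(P) takes an RxQ matrix of Q input vectors and returns an Rx2 matrix of per-row [min max] ranges. A hedged sketch of that behavior (NumPy stand-in, not the MATLAB implementation):

```python
import numpy as np

# Sketch of what MATLAB's minmax(P) returns: for an RxQ matrix of Q
# input vectors, an Rx2 matrix holding each row's [min, max] range.
def minmax(P):
    P = np.asarray(P)
    return np.column_stack((P.min(axis=1), P.max(axis=1)))

p = np.array([[1, 2, 2, 3],
              [2, 1, 3, 1]])
print(minmax(p))  # [[1 3]
                  #  [1 3]]
```

So for the questioner's data, the actual input range is [1 3] per row, wider than the [1 2] ranges passed to newlin, which only affects initialization, not simulation.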
However, I do not know why ID is missing, and why someone would use this syntax.
My guess is that when used correctly, it will provide the optimum value for LR. Then the other forms of syntax can be used to train the net.
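The "maximum stable learning rate" mentioned in the help is, per the maxlinlr documentation, 0.9999 divided by the largest eigenvalue of P*P' (with a row of ones appended to P when the layer has a bias). A hedged NumPy sketch of that documented formula; treat it as an illustration under those assumptions, not a drop-in replacement for maxlinlr:

```python
import numpy as np

# Sketch of the rule maxlinlr documents for a linear layer's maximum
# stable learning rate: lr = 0.9999 / max(eig(Pb @ Pb.T)), where Pb
# is P with a row of ones appended when the layer has a bias.
def max_stable_lr(P, bias=True):
    P = np.asarray(P, dtype=float)
    if bias:
        P = np.vstack([P, np.ones(P.shape[1])])  # bias input of 1
    return 0.9999 / np.max(np.linalg.eigvalsh(P @ P.T))

p = np.array([[1, 2, 2, 3],
              [2, 1, 3, 1]], dtype=float)
print(max_stable_lr(p))  # a small positive rate for this dataset
```

Learning rates above this bound make the LMS weight updates diverge, which is presumably why this syntax exists at all.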
HOWEVER, the default training function is trainb. When I ran the example in help newlin, I found the following:
1. The maximum epoch is really 1000, NOT 100 as stated in the help trainb documentation
2. TRAINB does not have the trainParam.min_grad property for stopping.
Therefore, even though the best epoch was 370, the algorithm kept chugging along with MSE ~ constant until it reached epoch 1000.
I DO NOT RECOMMEND USING NEWLIN OR TRAINB. If you have an obsolete version of NNTBX, USE
net = newff(P,T,[]); % Linear, no input delay
net = newfftd(P,T,ID,[]); % Linear with input delay
However, check the documentation for default I/O processing and trn/val/tst data division.
Hope this helps.
Thank you for formally accepting my answer
Greg