Using more layers makes my network perform worse
Timothee Fichot on 10 Aug 2020
Answered: Pranav Verma on 14 Aug 2020
Hello,
I'm trying to see, on a simple data set of size 2*1000, the impact of using more layers. The goal of my NN is to predict a binary value from an input vector of size 2*1 (the two previous time steps). I ran 25 training repetitions for 3 different NNs: one with 1 layer, one with 2 layers and one with 3, each layer having 8 neurons. I would have guessed that the NN with 3 layers would perform best, but over the 25 repetitions the 2-layer network was the best and the 3-layer network was the worst.
Do you have any idea why I would get that? If you have any questions please let me know!
My code to train my NN looks like this:
% numLayers, numNeurones, transFct, input and target are defined earlier
net = patternnet(ones(1,numLayers)*numNeurones);  % numLayers hidden layers of numNeurones neurons each
net = configure(net,input,target);
% set the transfer function of every hidden layer
for layer = 1:numLayers
    net.layers{layer}.transferFcn = transFct;
end
net.layers{end}.transferFcn = 'logsig';           % output layer
net.trainFcn = 'traingda';                        % gradient descent with adaptive learning rate
net.trainParam.epochs = 5000;
net.trainParam.max_fail = 100;                    % validation failures allowed before early stopping
net.trainParam.min_grad = 1e-10;
net.trainParam.showWindow = false;
net = train(net,input,target,'useParallel','yes');
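For reference, a minimal sketch of the outer comparison loop described above, leaving out the transfer-function and training-parameter settings from the snippet (the names layerCounts, numRepetitions and ce are illustrative, not from the original post):
% compare 1, 2 and 3 hidden layers of 8 neurons over repeated trainings
layerCounts = [1 2 3];
numRepetitions = 25;
ce = zeros(numRepetitions, numel(layerCounts));     % cross-entropy per run
for c = 1:numel(layerCounts)
    numLayers = layerCounts(c);
    for r = 1:numRepetitions
        net = patternnet(ones(1,numLayers)*8);      % 8 neurons per hidden layer
        net.trainParam.showWindow = false;
        net = train(net, input, target);
        ce(r,c) = perform(net, target, net(input)); % patternnet default: crossentropy
    end
end
mean(ce)   % average performance for 1, 2 and 3 hidden layers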
Accepted Answer
Pranav Verma on 14 Aug 2020
Hi Timothee,
There is no well-defined connection between the number of hidden layers and accuracy. Whether increasing the number of layers improves accuracy depends on the complexity of the problem you are trying to solve.
The fact that fewer layers perform better may be due to a variety of reasons:
- Increasing the number of layers in a network may cause vanishing gradients.
- The number of neurons may be too high, leading to overfitting.
- The increased number of layers may also lead to overfitting.
- The number of epochs may be too high relative to the dataset size.
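One way to narrow this down is to compare training and test performance for each network; a large gap points to overfitting. A minimal sketch, assuming the same input, target and net variables as in your question:
% tr is the training record returned by train; it stores the data division
[net, tr] = train(net, input, target);
yTrain = net(input(:, tr.trainInd));
yTest  = net(input(:, tr.testInd));
perfTrain = perform(net, target(:, tr.trainInd), yTrain);
perfTest  = perform(net, target(:, tr.testInd),  yTest);
% a much lower error on the training set than on the test set
% suggests the larger network is overfitting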