Neural Network Backpropagation Optimization
6 views (last 30 days)
Hi all,
I am training a shallow neural network with the following layers using the Neural Network Toolbox.
input layer - 100 (features)
hidden layer - 10 (neurons)
output layer - 100 (labels)
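For reference, a network with this shape could be created as follows (a minimal sketch; the original post does not show the creation call, so the use of feedforwardnet is an assumption):

```matlab
% Hypothetical creation of the 100-10-100 network described above;
% the input and output sizes are set later by configure from the data.
net = feedforwardnet(10,'trainlm');   % 10 hidden neurons, Levenberg-Marquardt
```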
I used the following code to format the training set.
[inputs,~] = tonndata(features,1,0);
[labels,~] = tonndata(targets,1,0);
After these two lines, inputs and labels are 1x5000 cell arrays, each element being a 100x1 signal. Then I configure, initialize, and train the network using Levenberg-Marquardt backpropagation like this:
net = configure(net,inputs,labels);
net = init(net);
[trained_net,perf] = train(net,inputs,labels);
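One thing that may be worth checking (a hedged sketch, not a definitive fix): when train receives 1x5000 cell arrays, it interprets the data as a time series of 5000 timesteps. For a static fitting problem, the equivalent matrix form presents all samples concurrently, which is usually much faster for a feedforward network; a cheaper-per-iteration training function can also be tried:

```matlab
% If there is no time dependence between the 5000 samples, the matrix
% form is equivalent for a static feedforward net and avoids the
% sequential per-timestep processing of cell-array data:
X = cell2mat(inputs);    % 100x5000 matrix, one column per sample
T = cell2mat(labels);    % 100x5000 matrix
[trained_net,perf] = train(net,X,T);

% Alternatively, a cheaper-per-iteration algorithm such as scaled
% conjugate gradient often scales better to larger networks:
net.trainFcn = 'trainscg';
```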
The network view is as follows:

[network diagram not shown]
The issue is that during training each epoch takes around 2 minutes, which seems too long. I know the Levenberg-Marquardt algorithm involves computing the Jacobian and solving a linear system with the approximate Hessian JᵀJ at each step, which takes time, but MATLAB's documentation literally says:
"trainlm is often the fastest backpropagation algorithm in the toolbox, and is highly recommended as a first-choice supervised algorithm, although it does require more memory than other algorithms."
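For scale, a back-of-the-envelope count (using the 100-10-100 sizes and the 5000 samples from the post) suggests why each iteration is expensive even when trainlm converges in few epochs:

```matlab
% Weights + biases: hidden layer (100+1)*10, output layer (10+1)*100
nW = (100+1)*10 + (10+1)*100;   % = 2110 parameters
nE = 5000 * 100;                % = 500000 individual errors
% The Jacobian is nE-by-nW (roughly 8 GB in doubles), and each LM step
% forms J'*J and solves a 2110x2110 system, so per-epoch times in the
% minutes are plausible for this problem size.
```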
I would like to know if anyone has a suggestion to reduce the training time, or whether I am missing something in the model setup.
Thanks in advance.