How can I constrain neural network weights?

Luke Wilhelm on 7 Dec 2012
Answered: Sara Perez on 12 Sep 2019
I am using the Neural Network Toolbox to create a feedforward network. The input is a 4x1 vector; then there is one 4-neuron hidden layer, one 6-neuron hidden layer, and a 4-neuron output layer. I would like to constrain the final 4x6 matrix of layer weights so that the weight values cannot be negative. I realize this will probably affect the network's accuracy, but for the purposes of my research I would like to see what the results are.
Is it possible to constrain the layer weights in this way? I have found how to set layer weights to a specified value and prevent them from learning using net.layerWeights{i,j}.learn = false;, but not how to allow the weights to change while preventing them from becoming negative.
Thanks, Luke
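
One possible workaround, as a minimal sketch only (not a built-in toolbox feature): train in short passes and project the constrained 4x6 layer-weight matrix back onto non-negative values after each pass. The layer indices assume the 4-6-4 architecture above; the data, epoch count, and number of passes are illustrative.

% Sketch (assumption): enforce non-negative layer weights by projection --
% train in short passes and clamp any negative entries back to zero.
x = rand(4, 100);                  % example 4x1 inputs (100 samples)
t = rand(4, 100);                  % example 4x1 targets

net = feedforwardnet([4 6]);       % two hidden layers: 4 and 6 neurons
net = configure(net, x, t);
net.trainParam.epochs = 5;         % short training passes
net.trainParam.showWindow = false;

for pass = 1:50
    net = train(net, x, t);
    % net.LW{3,2} is the 4x6 weight matrix from the 6-neuron hidden
    % layer into the 4-neuron output layer; clamp it after each pass.
    net.LW{3,2} = max(net.LW{3,2}, 0);
end

The final weights satisfy the constraint because the clamp is the last step, but the result is a heuristic, not the true constrained optimum.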
  1 Comment
Greg Heath on 9 Dec 2012
One hidden layer is sufficient for a universal approximator.
If the hidden-node activation functions are all odd (e.g., tanh), changing the sign of all the weights (and the bias) connected to one hidden node does not change the output.
Therefore, if there is only one output node, the task is easy: flip signs one hidden node at a time until every output-layer weight is non-negative, without changing the network function.
Otherwise, with multiple output nodes, it will not work in general.
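
For example, a quick numerical check of that sign-flip symmetry, assuming tanh hidden units and random weights (a one-hidden-layer, one-output network for simplicity):

% tanh is odd, so y = LW*tanh(IW*x + b) is unchanged if the weights into
% and out of one hidden unit (including its bias) all change sign.
IW = randn(4, 4);  b = randn(4, 1);  LW = randn(1, 4);  x = randn(4, 1);
y1 = LW * tanh(IW*x + b);

k = 2;                                      % pick one hidden unit
IW(k, :) = -IW(k, :);
b(k)     = -b(k);
LW(:, k) = -LW(:, k);
y2 = LW * tanh(IW*x + b);

disp(max(abs(y1 - y2)))                     % ~0, up to round-off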


Answers (2)

R L on 24 Jul 2015
I would like to ask how you set a subset of the layer weights to a specified value while preventing their learning using net.layerWeights{i,j}.learn = false.
Did you ever solve your question about constraining the weights to have a specified sign during training? Thanks.
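
A minimal sketch of that freezing technique, assuming the classic network-object API (the weight values, data, and sizes are illustrative):

% Sketch (assumption): fix the 4x6 weights between layers 2 and 3 to
% chosen values and exclude them from training updates.
net = feedforwardnet([4 6]);
net = configure(net, rand(4, 10), rand(4, 10));

net.LW{3,2} = ones(4, 6);                % set the layer weights to chosen values
net.layerWeights{3,2}.learn = false;     % keep train/adapt from updating them

net = train(net, rand(4, 100), rand(4, 100));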

Sara Perez on 12 Sep 2019
You can set the layer property 'WeightLearnRateFactor' to zero, so the weights won't be modified or learned during training.
more info here:
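
For example, a minimal sketch using the layer-array API (assumptions: a recent release with featureInputLayer, and a 4-6-4 architecture matching the original question; the training data and options are illustrative):

% Sketch (assumption): freeze the last fully connected layer's weights by
% setting its learn-rate factors to zero; earlier layers still train.
layers = [
    featureInputLayer(4)
    fullyConnectedLayer(4)
    tanhLayer
    fullyConnectedLayer(6)
    tanhLayer
    fullyConnectedLayer(4, 'WeightLearnRateFactor', 0, ...
                           'BiasLearnRateFactor', 0)    % frozen layer
    regressionLayer];

X = rand(100, 4);                 % 100 samples, 4 features
T = rand(100, 4);                 % 4 regression targets per sample
options = trainingOptions('adam', 'MaxEpochs', 20, 'Verbose', false);
net = trainNetwork(X, T, layers, options);

Note that this only freezes the weights at their initial values; it does not constrain their sign, which was the original question.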
