Freeze specific weights in a custom neural network
Hi, I've made a custom neural network with 69 layers. It has 3 inputs, and the first input is either 1 or -1. What I need is for the connection from this input to different layers to be scaled by a constant, non-learnable weight, so that training only adjusts the other weights. Thank you for your help! This is the first time I've asked a community on the internet :)
Answers (1)
Sara Perez
on 12 Sep 2019
You can set the layer property 'WeightLearnRateFactor' to zero, so the weights won't be modified or learned during training.
More info is in the MATLAB documentation for layer learn-rate properties.
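For example, here is a minimal sketch of freezing one layer in a Deep Learning Toolbox layer array (the layer sizes and names below are illustrative, not taken from the original question):

layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3, 8, 'Name', 'conv_frozen')
    reluLayer
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

% Zero learn-rate factors mean trainNetwork never updates this layer's
% weights or bias, so they keep whatever values they start with.
layers(2).WeightLearnRateFactor = 0;
layers(2).BiasLearnRateFactor   = 0;

To pin the frozen weights to a specific constant (as in the question), you can also assign the layer's Weights property directly before training, so the frozen connection applies exactly the scaling you want.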