MATLAB gives me a different output value every time I train a neural network. Why?

I am training a multilayer neural network. Input data: 3 inputs with 150 samples (3x150); target: 1x150.
I did not specify the weights and biases. Is that why training the network returns a different output value every time?

Accepted Answer

Greg Heath
Greg Heath on 2 Jul 2015
The default data division and weight initialization are both random.
To reproduce a design you have to know the state of the RNG before the net is configured with initial weights and before the data is divided into training, validation and test subsets.
When designing multiple nets in a double for loop (creation in the outer loop and training in the inner loop), you only have to initialize the RNG once: before the first loop. The RNG changes its state every time it is called. Therefore, for reproducibility, record the RNG state at the beginning of the inner loop.
Exactly when the RNG is called differs between generations of designs. For the obsolete NEWFF family (e.g., NEWFIT, NEWPR and NEWFF), weights are initialized when the net is created. For the current FEEDFORWARDNET family (e.g., FITNET, PATTERNNET and FEEDFORWARDNET), weights can be initialized explicitly by the CONFIGURE function; otherwise they are initialized automatically by the TRAIN function.
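The double-loop pattern above can be sketched roughly as follows (a minimal illustration, not tested on your data; the hidden-layer sizes, trial count, and variable names x, t are illustrative assumptions):

```matlab
% Sketch: reproducible multi-net design with a double for loop.
% Assumes x (3x150 inputs) and t (1x150 targets) already exist.
rng(0);                               % seed the RNG ONCE, before the first loop

hiddenSizes = [5 10 15];              % candidate hidden-layer sizes (illustrative)
numTrials   = 10;                     % random weight restarts per size
savedStates = cell(numel(hiddenSizes), numTrials);

for i = 1:numel(hiddenSizes)          % outer loop: create the net
    net = fitnet(hiddenSizes(i));
    for j = 1:numTrials               % inner loop: train it
        savedStates{i,j} = rng;       % record the RNG state for reproducibility
        net = configure(net, x, t);   % explicit random weight initialization
        [net, tr] = train(net, x, t); % TRAIN also performs the random data division
    end
end
```

To reproduce a particular design later, restore the recorded state with rng(savedStates{i,j}) and repeat the configure/train pair for that net.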
When I find out exactly where the data is divided, I will post in both the NEWSGROUP and ANSWERS.
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (1)

Walter Roberson
Walter Roberson on 1 Jul 2015
The weights are initialized randomly unless you specifically initialize them.
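A minimal sketch of that point, assuming the current FITNET interface and some existing data x (3x150) and t (1x150): seeding the RNG identically before each run makes both the initial weights and the data division identical, so the two trained nets match.

```matlab
% Two runs from the same RNG seed produce the same trained net.
rng(0);
net1 = train(fitnet(10), x, t);

rng(0);                               % restore the same seed
net2 = train(fitnet(10), x, t);

% net1(x) and net2(x) should now agree; without the rng calls they differ.
```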
