Feedforwardnet training: inputs partially disconnect from the hidden layer after I train my network, even though I set the inputConnect field correctly

Hi, I am facing a problem when training my feedforwardnet nn_h with multiple inputs X_h (in cell format) and a single target Y_h. I do not know why some inputs become disconnected from the hidden layer after I train the network, as shown in the resulting figure:
[Figure miss.jpg: network view after training, with some input connections to the hidden layer missing]
My simple code is as follows:
clear;
inputnum = 3;
batch_size = 10;

% Build three inputs, each constant across the batch
X_h = cell(inputnum,1);
for i = 1:inputnum
    X_h{i,1} = zeros(1,batch_size);
    for j = 1:batch_size
        X_h{i,1}(:,j) = i;
    end
end

% Single constant target
Y_h = cell(1,1);
Y_h{1,1} = zeros(1,batch_size);
for j = 1:batch_size
    Y_h{1,1}(:,j) = 4;
end

nn_h = feedforwardnet(10);
nn_h.numInputs = inputnum;
for k = 1:inputnum
    % Even though I connect each input to the hidden layer here,
    % some connections appear to be removed after 'train', as the figure above shows
    nn_h.inputConnect(1,k) = 1;
end
nn_h = train(nn_h,X_h,Y_h);
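
For what it's worth, one way to investigate is to inspect the connection and preprocessing settings stored on the network object after training. This is only a diagnostic sketch; inputConnect, inputs{k}.processFcns, and inputs{k}.processedSize are documented network-object properties, and view is the standard network diagram viewer:

% Which inputs are still connected to layer 1 (the hidden layer)?
disp(nn_h.inputConnect)
% How is each input preprocessed, and what size does it have after processing?
for k = 1:inputnum
    disp(nn_h.inputs{k}.processFcns)
    disp(nn_h.inputs{k}.processedSize)
end
% Redraw the network diagram
view(nn_h)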

Accepted Answer

Nicky on 29 May 2019
Hi, I think I found the answer myself.
As the code above shows, the first input X_h{1,1} is constant across all 10 data points, so the bias constant b is apparently enough to represent it in the training process. That makes sense!
On the other hand, if I make the elements of X_h{1,1} vary across the batch, the connection between the first input X_h{1,1} and the hidden layer comes back.
As for the other inputs X_h{2,1} and X_h{3,1}, which are also constant, from what I observe MATLAB only removes the connection of the first constant input.
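
As a quick check of this observation, one can make the first input vary across the batch and retrain; the connection to the hidden layer should then remain, as described above. This is a minimal sketch based on the code in the question (the small random variation is only for illustration):

X_h{1,1} = 1 + 0.01*randn(1,batch_size);   % first input no longer constant across the batch
nn_h2 = feedforwardnet(10);
nn_h2.numInputs = inputnum;
nn_h2.inputConnect(1,:) = 1;               % connect every input to the hidden layer
nn_h2 = train(nn_h2, X_h, Y_h);
view(nn_h2)                                % the first input should now stay connected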


Release: R2015b
