Custom regression (multiple outputs)

jaehong kim on 12 Feb 2021
Commented: jaehong kim on 14 Feb 2021
Hi, I am working on a custom regression neural network.
Input size = 2, output size = 6, number of data samples = 25001.
However, after a certain number of iterations, the network produces the same output for all 25001 samples.
(In the attached plot, the x-axis was Target and the y-axis was the network output.)
Initially the outputs differ across samples, but after a while they all collapse to a single value.
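One way to confirm the collapse numerically (a minimal check, assuming the dlnet and dlX defined in the code below) is to look at the spread of the predictions across samples:
--------------------------------------------------------------------------------------------
% If every sample gets the same prediction, each output row's standard
% deviation across the 25001 samples is near zero.
out = predict(dlnet,dlX);              % 6-by-25001 dlarray
disp(std(extractdata(out),0,2).');     % six values, all ~0 when collapsed
--------------------------------------------------------------------------------------------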
My code is here.
--------------------------------------------------------------------------------------------
clear, clc, close all

% Load the data set: columns 2-11 of sim_linear.xlsx hold the signals.
Data = readmatrix('sim_linear.xlsx');

% Shuffle the rows once so every signal shares the same permutation.
Data = Data(randperm(size(Data,1)),:);

Y_at  = Data(:,2);
Y_ft  = Data(:,3);
F_at  = Data(:,4);    % read but unused below
F_ft  = Data(:,5);
P_cot = Data(:,6);    % read but unused below
T_cot = Data(:,7);
T_bt  = Data(:,8);
F_et  = Data(:,9);
T_et  = Data(:,10);
PW_t  = Data(:,11);

% Inputs: 2-by-25001, Outputs (targets): 6-by-25001.
Inputs  = [Y_at, Y_ft].';
Outputs = [F_ft, T_cot, T_bt, F_et, T_et, PW_t].';
layers = [
    featureInputLayer(2,'Name','in')
    fullyConnectedLayer(64,'Name','fc1')
    tanhLayer('Name','tanh1')
    fullyConnectedLayer(32,'Name','fc2')
    tanhLayer('Name','tanh2')
    fullyConnectedLayer(16,'Name','fc3')
    tanhLayer('Name','tanh3')
    fullyConnectedLayer(8,'Name','fc4')
    tanhLayer('Name','tanh4')
    fullyConnectedLayer(6,'Name','fc5')
    ];
lgraph = layerGraph(layers);
dlnet  = dlnetwork(lgraph);

% Adam state and hyperparameters.
averageGrad   = [];
averageSqGrad = [];
learnRate     = 0.005;
gradDecay     = 0.75;
sqGradDecay   = 0.95;

dlX = dlarray(Inputs,'CB');   % 2 channels by 25001 batch observations

for it = 1:500
    % Full-batch gradient step; adamupdate expects the iteration count
    % to start at 1, so pass the loop counter directly.
    [out,loss,NNgrad] = dlfeval(@gradients,dlnet,dlX,Outputs);
    [dlnet.Learnables,averageGrad,averageSqGrad] = adamupdate( ...
        dlnet.Learnables,NNgrad,averageGrad,averageSqGrad, ...
        it,learnRate,gradDecay,sqGradDecay);
    if mod(it,100) == 0
        fprintf('Iteration %d, loss %.4g\n', it, extractdata(loss));
    end
end
function [out,loss,NNgrad] = gradients(dlnet,dlx,t)
% Forward pass plus sum-of-squared-errors loss over all 6 outputs and
% all samples (a sum, not a mean, so it scales with the data set size).
out    = forward(dlnet,dlx);
loss   = sum((out - t).^2,'all');
NNgrad = dlgradient(loss,dlnet.Learnables);
end
-------------------------------------------------------------------------------------------------------------------------------------------------
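For reference, the loss above is a raw sum of squared errors over all 6 outputs and 25001 samples, so its gradient grows with the data set size. A mean-squared-error variant (a sketch, not the code that produced the plot) would average instead:
--------------------------------------------------------------------------------------------
function [out,loss,NNgrad] = gradientsMSE(dlnet,dlx,t)
% Hypothetical alternative loss: mean squared error, so the gradient
% magnitude does not scale with the number of samples.
out    = forward(dlnet,dlx);
loss   = sum((out - t).^2,'all')/numel(t);
NNgrad = dlgradient(loss,dlnet.Learnables);
end
--------------------------------------------------------------------------------------------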
Thanks for reading my question. I hope someone can help.
  3 Comments
jaehong kim on 14 Feb 2021
Edited: jaehong kim on 14 Feb 2021
Thank you for reading my question!
Is there a problem with it? Is more information needed to give an answer?
jaehong kim on 14 Feb 2021
A sample of Inputs (a 2-by-10 slice of the full 2-by-25001 matrix):
 0.1992  -0.7085  -0.0474  -0.4406  -0.1188  -0.3818  -0.8150  -0.3583  -0.4511  -0.4783
 0.9204   0.2764   0.7833   0.5459   0.7072   0.5024   0.2000   0.5996   0.5400   0.5149
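The inputs above are already roughly in [-1, 1]. If the six target rows in Outputs sit on much larger scales, the tanh layers can saturate; a hypothetical per-row normalization (not part of the code above) would be:
--------------------------------------------------------------------------------------------
% Hypothetical target normalization: zero mean and unit variance per
% output row; train against OutputsN and undo the scaling after predict.
mu = mean(Outputs,2);
sg = std(Outputs,0,2);
OutputsN = (Outputs - mu)./sg;
--------------------------------------------------------------------------------------------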


Answers (0)
