How to apply the gradient method to a single-variable optimization problem
As the title says, I'm having trouble optimizing a single-variable function. I have already built a proper objective through interpolation; it is listed in the code as 'J'. Now I have to use the gradient method to maximize it over the range T = 0:120, but I'm lacking the knowledge of how to do so.
I've tried different approaches so far, such as fmincon and fminsearch (which, by the way, I was told I can't use for some reason), and searched the web for answers, but nothing has worked. Eventually I decided that the gradient method would be the best way to solve this, but at this point I really struggle to come up with a solution.
I think I'm either lacking some knowledge of how gradient-based optimization should work, or I'm missing something. Either way, I'd appreciate any help on what to do next.
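Before touching the specific J, a toy one-dimensional gradient-ascent loop may clarify the mechanics. Everything here is illustrative (the function, step size, and iteration count are not from the question): start from a guess and repeatedly step in the direction of the derivative, which moves uphill toward a maximum.

```matlab
% Toy gradient ascent: maximize f(x) = -(x-3)^2, whose maximum is at x = 3.
f  = @(x) -(x-3)^2;
fd = @(x) -2*(x-3);      % derivative of f

x = 0;                   % starting guess
alpha = 0.1;             % step size (learning rate)
for k = 1:200
    x = x + alpha*fd(x); % step uphill along the gradient
end
% x converges toward 3, the maximizer of f
```

The same loop applies to any differentiable objective; the only problem-specific parts are the derivative and the choice of step size.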
Below is the code I have so far, excluding the part where the Lagrange interpolation is computed, as I know for sure that part works.
% Data given
Cp  = 85;
Cs  = 22;
Kr  = 15000;
Kp  = 1320;
Kop = 4000;
tau = 10;
% Only T should be symbolic. The original code also declared syms for
% Cp, Cs, Kr, Kp, Kop and tau AFTER assigning them, which overwrote the
% numeric values above with empty symbols.
syms T
% Interpolating polynomials from the Lagrange interpolation step
output = -0.000591914003251751*T^5 + 0.0157328896604937*T^4 ...
         -0.255479792438275*T^3 + 2.25724123015864*T^2 ...
         -8.07704906204852*T + 199.7;
input  = -0.000265610835537946*T^5 + 0.00716439790013456*T^4 ...
         -0.112811639550303*T^3 + 0.936296091270008*T^2 ...
         -2.55652056277004*T + 240.1;
Gsx  = Cp*output - Cs*input;
IGsx = int(Gsx, T);          % antiderivative of Gsx with respect to T
%%
J  = (IGsx - Kr - Kp*tau - Kop*T)/(T + tau);  % objective to maximize
Jd = diff(J, T);                              % its derivative dJ/dT
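With the symbolic J and Jd above in hand, one way to finish is a plain gradient-ascent loop: convert the symbolic expressions to numeric function handles with matlabFunction, step uphill along Jd, and clamp T back into [0, 120] after each step. This is a sketch, not a definitive implementation; the starting point, step size alpha, iteration cap, and tolerance are all assumptions that may need tuning (if the iterates oscillate or blow up, reduce alpha).

```matlab
% Gradient ascent on J over T in [0, 120] -- a sketch; starting point,
% alpha, the iteration cap and tol are assumed values, not given data.
Jfun  = matlabFunction(J);    % numeric handle for the objective
Jdfun = matlabFunction(Jd);   % numeric handle for its derivative

T_k   = 60;                   % starting guess (middle of the range)
alpha = 0.01;                 % step size
tol   = 1e-8;                 % stop when the step becomes negligible

for k = 1:10000
    step = alpha*Jdfun(T_k);              % move toward increasing J
    T_k  = min(max(T_k + step, 0), 120);  % project back onto [0, 120]
    if abs(step) < tol
        break
    end
end

fprintf('Maximum of J near T = %.4f, J(T) = %.4f\n', T_k, Jfun(T_k));
```

Because gradient ascent only finds a stationary point, it is worth plotting J over 0:120 (e.g. with fplot) first to confirm the loop has landed on the global maximum of the range rather than a local one or a boundary.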
