linprog in a for loop
Hi community, I am trying to optimize a function for a finite set of scenarios using linprog. However, I am not certain whether a for loop or symsum is the better way to handle the summation.
The difficulty is that the optimized variables z and y appear inside the summation itself (so each instance needs to be optimized before the next value can be added).
I want to build the optimization over the instances 0-200, which in symsum I thought would look like this, with the parameters:
l = 1;
q = 11;
s = 2;
A1 = 1;
The values z and y are determined in the optimization process, subject to the constraints y(k) = x - A1'*z(k), y(k) > 0, x > 0, and 0 < z(k) < k.
Because the problem needs to be optimized for every instance, I think a for loop around the summation might be a better way to program this. However, I am not very skilled at correctly formulating such a for loop with a linprog call inside it, which is where I hoped someone could help me out.
k represents an instance of demand, each with a probability of occurring. I used a normal distribution to estimate the probability of each k:
demand = [0:1:200];
prob = normpdf(demand,100,10);
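One caveat worth noting here: normpdf returns density values, not discrete probabilities, so over a finite grid they will not sum exactly to 1. A small sketch of normalizing them into proper scenario weights (assuming the 0:200 grid above):

```matlab
demand = 0:1:200;
prob = normpdf(demand, 100, 10);   % density values, not probabilities
prob = prob / sum(prob);           % normalize so the scenario weights sum to 1
```

With a grid spacing of 1 the raw densities already sum to nearly 1, but normalizing makes the expected-value calculation exact.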
I want to run this optimization for every demand possibility from 0 to 200; in each new setting, pk(k), z(k) and y(k) change. I hope someone can suggest how best to put this into MATLAB.
For the linprog call itself, I gave it a try with the following code:
f2 = [-s.', (l-q).'];  % objective coefficients for [Y, Z]
E = sum(f2);           % value for expected optimum solution Q(x,D)
Aeq = [eye(j), A1.'];  % note: i, j, k and x must be defined beforehand
beq = x;
lb = zeros(1, i+j);
ub = [inf(1,j), k];
sol = linprog(f2, [], [], Aeq, beq, lb, ub);  % [] for the unused inequality arguments A and b
y = sol(1)
z = sol(2)
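To show what I mean, here is a minimal sketch of the loop structure I have in mind: one small LP per demand instance k, accumulating the probability-weighted optimum. It assumes x is a known first-stage value (x = 100 below is a placeholder) and that each subproblem has just the two variables y(k) and z(k), so the constraint y(k) = x - A1'*z(k) becomes the single equality y + A1*z = x:

```matlab
% Sketch: solve one LP per demand instance k, then form the
% probability-weighted expected optimum E[Q(x,D)].
l = 1; q = 11; s = 2; A1 = 1;   % parameters from the question
x = 100;                        % assumed first-stage value (placeholder)

demand = 0:1:200;
prob = normpdf(demand, 100, 10);
prob = prob / sum(prob);        % normalize to a discrete distribution

f = [-s, (l-q)];                % objective coefficients for [y; z]
opts = optimoptions('linprog', 'Display', 'none');

E = 0;                          % running expected optimum
y = zeros(size(demand));
z = zeros(size(demand));
for i = 1:numel(demand)
    k = demand(i);
    Aeq = [1, A1];              % y + A1*z = x, i.e. y = x - A1'*z
    beq = x;
    lb = [0, 0];                % y >= 0, z >= 0
    ub = [Inf, k];              % 0 <= z <= k
    [sol, fval, exitflag] = linprog(f, [], [], Aeq, beq, lb, ub, opts);
    if exitflag == 1            % only accumulate solved instances
        y(i) = sol(1);
        z(i) = sol(2);
        E = E + prob(i) * fval; % weight this scenario by its probability
    end
end
```

This is only a sketch of the structure, not a definitive implementation; the per-instance Aeq, lb and ub would need to be adapted to your actual model dimensions if y and z are vectors rather than scalars.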
Hope that someone can help me out!