Exam 2
Exercise 2. Consider the following optimization problem:

\min_{x \in \mathbb{R}^4} f(x) = 2x_1^2 + x_2^2 + 3x_3^2 + 2x_4^2 + x_1 x_3 + 2x_2 x_3 + x_3 x_4 - 5x_1 - 4x_3 + 3x_4
We need to analyze the convexity of each term and of the objective function as a whole. A function is convex if its Hessian matrix is positive semi-definite.
In this case the objective function is quadratic, so its Hessian is constant and can be read off from the second partial derivatives.
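Written out, the second partial derivatives give (this matches the matrix H constructed below):

\nabla^2 f(x) = \begin{bmatrix} 4 & 0 & 1 & 0 \\ 0 & 2 & 2 & 0 \\ 1 & 2 & 6 & 1 \\ 0 & 0 & 1 & 4 \end{bmatrix}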
To build the Hessian matrix in MATLAB:
H = [4 0 1 0
0 2 2 0
1 2 6 1
0 0 1 4];
fprintf('Eigen Values Objective Function:\n');
display(eig(H))

Output
Eigen Values Objective Function:
1.0608
3.5933
4.0000
7.3460
Since all the eigenvalues are strictly positive, the Hessian matrix is positive definite; therefore this is a strongly convex optimization problem.
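Positive definiteness can also be verified with a Cholesky factorization, a standard MATLAB check: the second output of chol is zero exactly when the matrix is positive definite.

% Alternative check: chol returns p == 0 iff H is positive definite
[~, p] = chol(H);
if p == 0
    fprintf('H is positive definite\n');
else
    fprintf('H is not positive definite\n');
end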
Starting from the point x0 = (0, 0, 0, 0), we apply the gradient method with exact line search. The function is a quadratic function of the form

f(x) = \frac{1}{2} x^\top Q x + q^\top x, \qquad \nabla f(x) = Q x + q
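For a quadratic objective, the exact line-search step size along a descent direction d has a closed form, which is the formula used in the code below: minimizing \phi(\alpha) = f(x + \alpha d) over \alpha gives

\phi'(\alpha) = g^\top d + \alpha \, d^\top Q d = 0 \quad\Longrightarrow\quad \alpha = -\frac{g^\top d}{d^\top Q d}, \qquad g = \nabla f(x) = Qx + q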
Pseudocode for the gradient method with the steepest descent direction:
% Gradient method
Choose x0, tolerance = 1e-6, i = 0
x = x0; g = Q*x + q;
while (norm(g) > tolerance)
    [step direction] d = -g;
    [step size] alpha = -(g'*d)/(d'*Q*d);
    x = x + alpha*d;
    g = Q*x + q;
    i = i + 1;
end
% Problem data: the Hessian built above and the linear term
Q = H;
q = [-5
0
-4
3];
x0 = [0
0
0
0];
tolerance = 1e-6;
% Starting Point
x = x0;
i = 0;
% Objective value and gradient at the starting point
v = (1/2)*x'*Q*x + q'*x;
g = Q*x + q;
fprintf('Gradient Method with exact line search\n');
fprintf('i \t f(x) \t \t \t ||grad f(x)|| \n');
while (norm(g)>tolerance)
v = (1/2)*x'*Q*x + q'*x;
g = Q*x + q;
fprintf('%i \t %2.10f \t \t %2.10f \n',i,v,norm(g));
% Direction
d = -g;
% Step size
alpha = -(g'*d)/(d'*Q*d);
% New point
x = x + alpha*d;
% Next iteration
i = i+1;
end
Output
Gradient Method with exact line search
i f(x) ||grad f(x)||
0 0.0000000000 7.0710678119
1 -5.0403225806 2.4685851395
2 -5.5976694222 1.1368021920
3 -5.7883000924 0.9808455712
4 -5.8850831815 0.6005067114
5 -5.9375039954 0.5321261507
. . .
. . .
. . .
45 -6.0000000000 0.0000027326
46 -6.0000000000 0.0000016771
47 -6.0000000000 0.0000014863
48 -6.0000000000 0.0000009122
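As a cross-check, the unique minimizer of a strongly convex quadratic solves Q x* = -q, so it can be computed directly and compared with the limit of the iterates (the values in the comments follow from the data above):

% Closed-form minimizer: solve Q*xstar = -q
xstar = -Q\q;                              % gives xstar = (1, -1, 1, -1)'
vstar = (1/2)*xstar'*Q*xstar + q'*xstar;   % gives vstar = -6, matching f(x) above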
Starting from the point (0, 0, 0, 0), how many iterations are needed? Write the point found by the method at each iteration.

From the output above, the gradient norm falls below the tolerance 1e-6 at iteration i = 48, so the method needs 48 iterations (49 evaluated points, i = 0, ..., 48) to satisfy the stopping criterion.
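To report the point found at each iteration, the print statement inside the loop can be extended to show the current iterate x as well (a minimal modification; variable names as in the script above):

% Inside the while-loop, print the current iterate alongside f(x) and ||grad f(x)||
fprintf('%i \t %2.10f \t %2.10f \t [% .6f % .6f % .6f % .6f]\n', ...
        i, v, norm(g), x(1), x(2), x(3), x(4));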