
Exercise 2. Consider the following optimization problem:

$$\left\lbrace \begin{array}{ll} \mathrm{minimize} & 2x_1^2 +x_1 x_3 +x_2^2 +2x_2 x_3 +3x_3^2 +x_3 x_4 +2x_4^2 -5x_1 -4x_3 +3x_4 \\ x\in \mathbb{R}^4 & \\ \end{array}\right.$$

2.1 Do global optimal solutions exist? Why?

Yes. The objective is a strictly convex quadratic function (its Hessian is positive definite, as shown in 2.2 below), so it is coercive on $\mathbb{R}^4$ and attains a unique global minimum.

2.2 Is it a convex problem? Why?

We need to analyze the convexity of the objective function. A twice-differentiable function is convex if and only if its Hessian matrix is positive semi-definite everywhere. In this case the objective function is: $$f\left(x\right)\ =\ 2x_1^2 +x_1 x_3 +x_2^2 +2x_2 x_3 +3x_3^2 +x_3 x_4 +2x_4^2 -5x_1 -4x_3 +3x_4$$

To build the Hessian matrix quickly, write the quadratic part in the general form: $$\mathrm{General}\ \mathrm{Form}=a_1 x_1^2 +a_2 x_2^2 +a_3 x_3^2 +a_4 x_4^2 +a_5 x_1 x_2 +a_6 x_1 x_3 +a_7 x_1 x_4 +a_8 x_2 x_3 +a_9 x_2 x_4 +a_{10} x_3 x_4$$

$$\mathrm{Hessian}=\left\lbrack \begin{array}{cccc} 2a_1 & a_5 & a_6 & a_7 \\ a_5 & 2a_2 & a_8 & a_9 \\ a_6 & a_8 & 2a_3 & a_{10} \\ a_7 & a_9 & a_{10} & 2a_4 \end{array}\right\rbrack$$
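
Matching coefficients with this objective gives $a_1 =2$, $a_2 =1$, $a_3 =3$, $a_4 =2$, $a_6 =1$, $a_8 =2$, $a_{10} =1$ and $a_5 =a_7 =a_9 =0$, so the (constant) Hessian is:

$$\nabla^2 f\left(x\right)=\left\lbrack \begin{array}{cccc} 4 & 0 & 1 & 0 \\ 0 & 2 & 2 & 0 \\ 1 & 2 & 6 & 1 \\ 0 & 0 & 1 & 4 \end{array}\right\rbrack$$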

H = [4 0 1 0
     0 2 2 0
     1 2 6 1
     0 0 1 4];

fprintf('Eigen Values Objective Function:');
display(eig(H))

Output

Eigen Values Objective Function:
    1.0608
    3.5933
    4.0000
    7.3460

Since all the eigenvalues are strictly positive, the Hessian matrix is positive definite; therefore the objective function is strongly convex and this is a convex optimization problem.

2.3 Find the global minimum by using the gradient method with exact line search.

Starting from the point $\left(0,0,0,0\right)$ with $\left\|\nabla f\left(x\right)\right\|<{10}^{-6}$ as stopping criterion. How many iterations are needed?

The function is a quadratic function with the form:

$$\begin{array}{l} f\left(x\right)=\frac{1}{2}x^T Qx+q^T x\\ g\left(x\right)=\nabla f\left(x\right)=Qx+q \end{array}$$
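
For such a quadratic with positive definite $Q$, the exact line search step size along a direction $d_k$ has a closed form: setting $\frac{d}{d\alpha }f\left(x_k +\alpha d_k \right)=d_k^T \left(Qx_k +q\right)+\alpha \,d_k^T Qd_k =0$ gives

$$\alpha_k =-\frac{g\left(x_k \right)^T d_k }{d_k^T Qd_k }$$

which is exactly the step size used in the code below.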

Pseudocode of the gradient method, using the steepest descent direction and the exact line search step size:

% Gradient method (pseudocode)
Choose x0, set x = x0, tolerance = 1e-6, i = 0
g = Q*x + q
while (norm(g) > tolerance)
   [step direction] d = -g;
   [step size] alpha = -(g'*d)/(d'*Q*d);
   x = x + alpha*d;
   g = Q*x + q;
   i = i + 1;
end

The corresponding MATLAB code, where Q is the quadratic term (the Hessian H computed in 2.2):
% Quadratic term of f (the Hessian from 2.2)
Q = H;

q = [-5
      0
     -4
      3];

x0 = [0
      0
      0
      0];

tolerance = 1e-6;

% Starting Point
x = x0;
i = 0;

v = (1/2)*x'*Q*x + q'*x;
g = Q*x + q;

fprintf('Gradient Method with exact line search \n');
fprintf('i \t f(x) \t \t \t ||grad f(x)|| \n');

while (norm(g)>tolerance)
    v = (1/2)*x'*Q*x + q'*x;
    g = Q*x + q;
   
    fprintf('%i \t %2.10f \t \t %2.10f \n',i,v,norm(g));
    
    % Direction
    d = -g;
    % Step size
    alpha = -(g'*d)/(d'*Q*d);
    % New point
    x = x + alpha*d;
    % Next iteration
    i = i+1;
end

Output

Gradient Method with exact line search
i 	 f(x) 	 	 	 ||grad f(x)|| 
0 	 0.0000000000 	 	 7.0710678119 
1 	 -5.0403225806 	 	 2.4685851395 
2 	 -5.5976694222 	 	 1.1368021920 
3 	 -5.7883000924 	 	 0.9808455712 
4 	 -5.8850831815 	 	 0.6005067114 
5 	 -5.9375039954 	 	 0.5321261507 
. 	      . 	  	      .
. 	      . 	  	      .
. 	      . 	  	      .
45 	 -6.0000000000 	 	 0.0000027326 
46 	 -6.0000000000 	 	 0.0000016771 
47 	 -6.0000000000 	 	 0.0000014863 
48 	 -6.0000000000 	 	 0.0000009122 

2.4 Find the global minimum by using the conjugate gradient method

Starting from the point (0, 0, 0, 0). How many iterations are needed? Report the point found by the method at each iteration.
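
A minimal sketch of the (linear) conjugate gradient method for this quadratic, reusing Q, q, x0, and tolerance defined above; the Fletcher–Reeves update is one standard choice (for a quadratic it coincides with the other classical formulas). In exact arithmetic, conjugate gradient on a strictly convex quadratic in $\mathbb{R}^4$ terminates in at most 4 iterations.

% Conjugate gradient method (sketch)
x = x0;
g = Q*x + q;                      % gradient of f at x
d = -g;                           % first direction: steepest descent
i = 0;

fprintf('Conjugate Gradient Method \n');
fprintf('i \t f(x) \t \t \t ||grad f(x)|| \n');
while (norm(g) > tolerance)
    % Exact line search along d (same closed form as before)
    alpha = -(g'*d)/(d'*Q*d);
    x = x + alpha*d;
    g_new = Q*x + q;
    % Fletcher-Reeves coefficient and new conjugate direction
    beta = (g_new'*g_new)/(g'*g);
    d = -g_new + beta*d;
    g = g_new;
    i = i + 1;
    fprintf('%i \t %2.10f \t \t %2.10f \n', i, (1/2)*x'*Q*x + q'*x, norm(g));
    display(x')                   % point found at this iteration
end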

2.5 Find the global minimum by using the Newton method

Starting from the point (0, 0, 0, 0). How many iterations are needed?
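
A sketch of the Newton method for this problem, again reusing Q, q, x0, and tolerance defined above; since f is quadratic with constant Hessian Q, the full Newton step from any starting point lands directly on the stationary point, so a single iteration should suffice.

% Newton method (sketch)
x = x0;
g = Q*x + q;
i = 0;

while (norm(g) > tolerance)
    d = -(Q\g);       % Newton direction: solve Q*d = -g (the Hessian of f is Q)
    x = x + d;        % full Newton step (step size 1)
    g = Q*x + q;
    i = i + 1;
end

fprintf('Newton iterations: %i \n', i);
display(x)

The point returned is $x=-Q^{-1} q$, which should coincide with the limit of the gradient method above, where $f\left(x\right)\approx -6$.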
