(1) (Necessary/Sufficient Conditions for Local/Global Optimality) Consider the unconstrained
minimization of the following function:
f(x) = x_1^2 − x_1 x_2 + x_2^2 − 3 x_2.
Find a local minimum. Is this local minimum also a global minimum?
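As a numerical sanity check (in Python rather than the course's Matlab, and no substitute for the analytical argument the problem asks for), one can solve ∇f = 0 and inspect the constant Hessian:

```python
# Numerical check for problem (1): f(x) = x1^2 - x1*x2 + x2^2 - 3*x2.
import numpy as np

def f(x):
    return x[0]**2 - x[0]*x[1] + x[1]**2 - 3*x[1]

# For this quadratic, grad f(x) = H x + c with constant Hessian H.
H = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
c = np.array([0.0, -3.0])

x_star = np.linalg.solve(H, -c)      # stationary point: grad f = 0
print(x_star)                         # [1. 2.]
print(np.linalg.eigvalsh(H))          # [1. 3.] -> H is positive definite
print(f(x_star))                      # -3.0
```

Since the Hessian is positive definite everywhere, f is strictly convex, which is what decides the local-vs-global question.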
(2) (Positive/Negative (Semi)Definiteness of Symmetric Matrices) Identify which of the following symmetric matrices are positive/negative definite, positive/negative semi-definite, or indefinite.
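The matrices for this problem are in the original handout and are not reproduced here; as an illustration of how to check your hand classifications, a symmetric matrix can be sorted into these classes by the signs of its eigenvalues (the example matrices below are my own, not the assigned ones):

```python
# Classify a symmetric matrix by the signs of its eigenvalues.
import numpy as np

def classify(A, tol=1e-9):
    """Return the definiteness class of a symmetric matrix A."""
    w = np.linalg.eigvalsh(A)        # real eigenvalues, ascending
    w[np.abs(w) < tol] = 0.0         # treat tiny values as exact zeros
    if np.all(w > 0):
        return "positive definite"
    if np.all(w >= 0):
        return "positive semi-definite"
    if np.all(w < 0):
        return "negative definite"
    if np.all(w <= 0):
        return "negative semi-definite"
    return "indefinite"

print(classify(np.array([[2.0, -1.0], [-1.0, 2.0]])))  # positive definite
print(classify(np.array([[1.0, 2.0], [2.0, 1.0]])))    # indefinite
print(classify(np.array([[1.0, 1.0], [1.0, 1.0]])))    # positive semi-definite
```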
(3) (Convex functions) Problem 3.17 from Boyd & Vandenberghe.
(4) (Optimality Conditions for Constrained Optimization) Express the optimality conditions
for the following problem (note that this has the form of the power control problem we
encountered in the introduction lecture(s)):
maximize Σ_{i=1}^n log(α_i + p_i)
s.t. p_i ≥ 0, ∀i; and Σ_{i=1}^n p_i ≤ P_max,
where α_i > 0, ∀i and P_max < ∞ are given constants. Then, use the optimality conditions
to write the optimal solution p^⋆ as a (potentially implicit) function of (α_i)_i and P_max.
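For intuition on the structure the optimality conditions produce, the KKT conditions of this log-utility problem yield a water-filling form, p_i^⋆ = max(0, w − α_i), with the "water level" w chosen so the powers sum to P_max. A minimal sketch (the function name and the bisection tolerance are my own choices):

```python
# Water-filling sketch: find w such that sum_i max(0, w - alpha_i) = P_max,
# then p_i = max(0, w - alpha_i). Uses bisection on the water level.
import numpy as np

def water_filling(alpha, P_max, tol=1e-10):
    alpha = np.asarray(alpha, dtype=float)
    lo, hi = alpha.min(), alpha.max() + P_max   # w must lie in this range
    while hi - lo > tol:
        w = 0.5 * (lo + hi)
        if np.maximum(0.0, w - alpha).sum() > P_max:
            hi = w                               # too much power: lower the level
        else:
            lo = w                               # too little power: raise the level
    return np.maximum(0.0, 0.5 * (lo + hi) - alpha)

p = water_filling([1.0, 2.0, 4.0], P_max=3.0)
print(p, p.sum())    # [2. 1. 0.] 3.0  (the weakest channel gets no power)
```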
(5) (Equivalent Problems) Problem 4.58 from Boyd & Vandenberghe.
(6) (Convex Problem Formulation) Problem 4.62 from Boyd & Vandenberghe.
(7) (Uniqueness of Projection onto a Convex Set) Prove that the projection
P_C[x] = arg min_{y ∈ C} ‖x − y‖_2
of a point x ∈ R^n onto a convex set C ⊂ R^n is unique.
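One standard route (a sketch of one possible argument, via the parallelogram law): if two points of C both attained the minimum distance, their midpoint would be strictly closer.

```latex
% Suppose y_1, y_2 \in C both attain d = \min_{y \in C} \|x - y\|_2.
% By convexity the midpoint m = (y_1 + y_2)/2 lies in C, and the
% parallelogram law gives
\[
  \|x - m\|_2^2
  = \tfrac{1}{2}\|x - y_1\|_2^2 + \tfrac{1}{2}\|x - y_2\|_2^2
    - \tfrac{1}{4}\|y_1 - y_2\|_2^2
  = d^2 - \tfrac{1}{4}\|y_1 - y_2\|_2^2 .
\]
% Since m \in C forces \|x - m\|_2 \ge d, it follows that y_1 = y_2.
```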
(8) (About the Pure Newton Method) Problem 9.10 from Boyd & Vandenberghe.
(9) (Multi-Step Method Implementation-Investigation) Using the Matlab code you developed
for PS1, implement the heavy-ball method and Nesterov's method (with constant choices of
β and α) for the same quadratic function used in PS1, and compare their convergence
with each other, as well as with the steepest-descent method you already implemented.
Provide plots of the trajectories of these methods from the same initial conditions,
comment on the outcomes, and provide your insights on the comparison.
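The constant-parameter updates can be sketched as below. This is Python rather than the Matlab the assignment asks for, and the quadratic, step size α, and momentum β are illustrative placeholders, not the PS1 choices; it only illustrates the three update rules (plotting is left out).

```python
# Steepest descent, heavy ball, and Nesterov on f(x) = 0.5 x'Qx - b'x,
# all with constant step size alpha and constant momentum beta.
import numpy as np

Q = np.array([[10.0, 0.0], [0.0, 1.0]])   # placeholder ill-conditioned quadratic
b = np.array([1.0, 1.0])
grad = lambda x: Q @ x - b
x_star = np.linalg.solve(Q, b)             # exact minimizer, for error tracking

L, mu = 10.0, 1.0                          # extreme eigenvalues of Q
alpha = 1.0 / L
beta = ((np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu)))**2

def run(method, x0, iters=200):
    x = x_prev = np.array(x0, dtype=float)
    traj = [x]
    for _ in range(iters):
        if method == "gd":                 # steepest descent
            x_new = x - alpha * grad(x)
        elif method == "heavy_ball":       # momentum added after the gradient step
            x_new = x - alpha * grad(x) + beta * (x - x_prev)
        else:                              # Nesterov: gradient at extrapolated point
            y = x + beta * (x - x_prev)
            x_new = y - alpha * grad(y)
        x_prev, x = x, x_new
        traj.append(x)
    return np.array(traj)

for m in ("gd", "heavy_ball", "nesterov"):
    err = np.linalg.norm(run(m, [5.0, 5.0])[-1] - x_star)
    print(f"{m:10s} final error = {err:.2e}")
```

The `traj` arrays returned by `run` are what you would plot to compare trajectories from the same initial condition.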