## Description

(1) (Necessary/Sufficient Conditions for Local/Global Optimality) Consider the unconstrained minimization of the function

$$f(x) = x_1^2 - x_1 x_2 + x_2^2 - 3x_2.$$

Find a local minimum. Is this local minimum also a global minimum?
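As a numerical sanity check (not a substitute for the analytical argument): since the gradient of this quadratic is linear, the stationary point can be found by solving a linear system, and the Hessian's eigenvalues settle definiteness. A minimal Python sketch, assuming NumPy:

```python
import numpy as np

# For f(x) = x1^2 - x1*x2 + x2^2 - 3*x2, the gradient is linear:
# grad f(x) = [2*x1 - x2, -x1 + 2*x2 - 3] = H x + c, with
H = np.array([[2.0, -1.0], [-1.0, 2.0]])  # the (constant) Hessian
c = np.array([0.0, -3.0])

x_star = np.linalg.solve(H, -c)   # stationary point: grad f(x) = 0
eigs = np.linalg.eigvalsh(H)      # eigenvalues of the Hessian

print(x_star)  # the unique stationary point
print(eigs)    # all positive => H is positive definite everywhere,
               # so f is strictly convex and x_star is the global minimum
```

Because the Hessian is constant and positive definite, the local minimum found from the first-order condition is automatically the global minimum.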

(2) (Positive/Negative (Semi)Definiteness of Symmetric Matrices) Identify which of the following symmetric matrices are positive/negative definite, positive/negative semi-definite, or none:

$$A = \begin{pmatrix} 2 & 1 \\ 1 & 5 \end{pmatrix}, \quad
B = \begin{pmatrix} 2 & -1 & -1 \\ -1 & 2 & -1 \\ -1 & -1 & 2 \end{pmatrix}, \quad
C = \begin{pmatrix} -3 & -3 & 0 \\ -3 & -10 & -7 \\ 0 & -7 & -8 \end{pmatrix}, \quad
D = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix}.$$
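The exercise calls for an analytical argument (e.g., leading principal minors or eigenvalue signs), but you can double-check your classifications numerically. A sketch assuming NumPy; the `tol` guard handles eigenvalues that are zero up to floating-point error:

```python
import numpy as np

def classify(M, tol=1e-10):
    """Classify a symmetric matrix by the signs of its eigenvalues."""
    w = np.linalg.eigvalsh(M)
    if np.all(w > tol):
        return "positive definite"
    if np.all(w < -tol):
        return "negative definite"
    if np.all(w >= -tol):
        return "positive semi-definite"
    if np.all(w <= tol):
        return "negative semi-definite"
    return "indefinite"

A = np.array([[2, 1], [1, 5]])
B = np.array([[2, -1, -1], [-1, 2, -1], [-1, -1, 2]])
C = np.array([[-3, -3, 0], [-3, -10, -7], [0, -7, -8]])
D = np.array([[1, 2], [2, 1]])

for name, M in [("A", A), ("B", B), ("C", C), ("D", D)]:
    print(name, classify(M))
```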

(3) (Convex Functions) Problem 3.17 from Boyd & Vandenberghe.

(4) (Optimality Conditions for Constrained Optimization) Express the optimality conditions for the following problem (note that this has the form of the power control problem we encountered in the introduction lecture(s)):

$$\max \ \sum_{i=1}^{n} \log(\alpha_i + p_i) \quad \text{s.t.} \quad p_i \ge 0, \ \forall i; \ \text{and} \ \sum_{i=1}^{n} p_i \le P_{\max},$$

where $\alpha_i > 0, \forall i$ and $P_{\max} < \infty$ are given constants. Then, use the optimality conditions to write the optimal solution $p^\star$ as a (potentially implicit) function of $(\alpha_i)_i$ and $P_{\max}$.
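Once you have derived the optimality conditions, you will find that the solution has a "water-filling" form: $p_i^\star = \max(0, w - \alpha_i)$ for a water level $w$ (the reciprocal of the budget constraint's multiplier) chosen so the power budget binds. The sketch below, assuming NumPy and using bisection on $w$, is only an illustrative numerical check of that structure; the names and the bisection bracket are my choices, not part of the problem:

```python
import numpy as np

def water_fill(alpha, P_max, iters=100):
    """Numerically solve max sum(log(alpha_i + p_i)) s.t. p >= 0, sum(p) <= P_max.
    The KKT conditions give p_i = max(0, w - alpha_i), with water level w
    chosen so the (binding) budget constraint sum(p_i) = P_max holds."""
    alpha = np.asarray(alpha, dtype=float)
    lo, hi = alpha.min(), alpha.max() + P_max   # bracket for the water level
    for _ in range(iters):                      # bisection: sum(p) is monotone in w
        w = 0.5 * (lo + hi)
        if np.maximum(0.0, w - alpha).sum() > P_max:
            hi = w
        else:
            lo = w
    return np.maximum(0.0, w - alpha)

p = water_fill([0.5, 1.0, 2.0], P_max=2.0)
print(p, p.sum())  # powers sum to P_max; channels with smaller alpha_i get more power
```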

(5) (Equivalent Problems) Problem 4.58 from Boyd & Vandenberghe.

(6) (Convex Problem Formulation) Problem 4.62 from Boyd & Vandenberghe.

(7) (Uniqueness of Projection onto a Convex Set) Prove that the projection

$$P_C[x] = \arg\min_{y \in C} \|x - y\|_2$$

of a point $x \in \mathbb{R}^n$ onto a convex set $C \subset \mathbb{R}^n$ is unique.
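For intuition (not a proof): on a simple convex set such as the Euclidean ball, the projection can be written in closed form and is visibly a single point. A small sketch assuming NumPy:

```python
import numpy as np

def project_onto_ball(x, r=1.0):
    """Projection of x onto the closed Euclidean ball {y : ||y||_2 <= r}:
    points inside are fixed, points outside are radially shrunk to the boundary."""
    nrm = np.linalg.norm(x)
    return x if nrm <= r else (r / nrm) * x

p = project_onto_ball(np.array([3.0, 4.0]))  # ||x|| = 5, so shrink by 1/5
print(p)  # [0.6 0.8]
```

Note that convexity is essential: projecting the origin onto the unit *sphere* (a non-convex set) yields infinitely many closest points.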

(8) (About the Pure Newton Method) Problem 9.10 from Boyd & Vandenberghe.

(9) (Multi-Step Method Implementation/Investigation) Using the Matlab code you developed for PS1, implement the heavy-ball method and Nesterov's method (with constant choices of β and α) for the same quadratic function used in PS1, and compare their convergence with each other, as well as with the steepest-descent method you already implemented. Provide plots of the trajectories of these methods from the same initial conditions, comment on the outcomes, and provide your insights on the comparison.
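The assignment asks for Matlab; purely for illustration of the three update rules, here is a hedged Python sketch on a generic quadratic $f(x) = \tfrac{1}{2}x^\top Q x$. The matrix `Q`, the step sizes, and the momentum parameters below are illustrative textbook choices (based on the extreme eigenvalues $L$ and $\mu$), not values prescribed by the problem:

```python
import numpy as np

Q = np.diag([1.0, 10.0])     # example ill-conditioned quadratic: f(x) = 0.5 x^T Q x
grad = lambda x: Q @ x
L, mu = 10.0, 1.0            # largest / smallest eigenvalues of Q
kappa = L / mu               # condition number

def steepest_descent(x, alpha, steps):
    for _ in range(steps):
        x = x - alpha * grad(x)
    return x

def heavy_ball(x, alpha, beta, steps):
    x_prev = x.copy()
    for _ in range(steps):
        # momentum term beta*(x - x_prev) carries over the previous step
        x, x_prev = x - alpha * grad(x) + beta * (x - x_prev), x
    return x

def nesterov(x, alpha, beta, steps):
    x_prev = x.copy()
    for _ in range(steps):
        y = x + beta * (x - x_prev)   # gradient evaluated at the look-ahead point
        x, x_prev = y - alpha * grad(y), x
    return x

x0 = np.array([5.0, 5.0])
beta_hb = ((np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)) ** 2   # classical heavy-ball choice
alpha_hb = 4.0 / (np.sqrt(L) + np.sqrt(mu)) ** 2
beta_nag = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)         # constant-momentum Nesterov

for name, x in [("GD",  steepest_descent(x0, 1 / L, 200)),
                ("HB",  heavy_ball(x0, alpha_hb, beta_hb, 200)),
                ("NAG", nesterov(x0, 1 / L, beta_nag, 200))]:
    print(name, np.linalg.norm(x))   # distance to the minimizer x* = 0
```

On ill-conditioned quadratics like this one, the momentum methods typically shrink the distance to the optimum much faster per iteration than plain steepest descent, which is the kind of comparison your Matlab plots should exhibit.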