ECE509 Homework 4 Convex Optimization

1. Consider the optimization
$$\min_{x \in \mathbf{R}^2}\; e^{x_1 + 3x_2 - 0.1} + e^{x_1 - 3x_2 - 0.1} + e^{-x_1 - 0.1}.$$
Write code to solve this optimization using the gradient method with backtracking line search parameters $\alpha = 0.1$ and $\beta = 0.6$. Plot $f(x^{(k)})$ versus $k$ for $k = 0, 1, 2, \dots, 50$ on a log-linear plot.
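A minimal NumPy sketch of the gradient method with backtracking for this objective. The starting point $x^{(0)} = (1, 1)$ is an assumption, since the problem statement does not specify one:

```python
import numpy as np

def f(x):
    # Objective from Problem 1
    return (np.exp(x[0] + 3*x[1] - 0.1)
            + np.exp(x[0] - 3*x[1] - 0.1)
            + np.exp(-x[0] - 0.1))

def grad(x):
    e1 = np.exp(x[0] + 3*x[1] - 0.1)
    e2 = np.exp(x[0] - 3*x[1] - 0.1)
    e3 = np.exp(-x[0] - 0.1)
    return np.array([e1 + e2 - e3, 3*e1 - 3*e2])

def gradient_method(x0, alpha=0.1, beta=0.6, iters=50):
    x = x0.astype(float)
    hist = [f(x)]                         # f(x^(k)) for k = 0, 1, ..., iters
    for _ in range(iters):
        g = grad(x)
        t = 1.0
        # Backtracking line search: shrink t until the Armijo condition holds
        while f(x - t*g) > f(x) - alpha * t * (g @ g):
            t *= beta
        x = x - t*g
        hist.append(f(x))
    return x, hist

x_star, hist = gradient_method(np.array([1.0, 1.0]))  # x^(0) = (1, 1) is an assumption
```

Setting the gradient to zero gives $x_2^\star = 0$ and $x_1^\star = -\tfrac{1}{2}\ln 2$, so the iterates should approach the optimal value $p^\star \approx 2.559$; plotting `hist` on a semilog-y axis (e.g. `matplotlib.pyplot.semilogy`) gives the requested figure.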
2. Consider the optimization
$$\min_{x \in \mathbf{R}^n}\; -\sum_{i=1}^{n} \log\!\left(1 - x_i^2\right) \;-\; \sum_{i=1}^{n} \log\!\left(1 - a_i^T x\right)$$
where $n = 5000$ and the $a_i$ are randomly generated vectors. Solve this optimization using Newton's method with backtracking line search ($\alpha = 0.01$ and $\beta = 0.5$). Plot $f(x^{(k)})$ versus $k$ for $k = 0, 1, 2, \dots, 30$ on a log-linear plot.
3. Derive the distributed ADMM updates for the SVM problem.
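One standard way to set this up, sketched here under assumptions the problem statement does not fix: take the hinge-loss SVM $\min_w \tfrac{1}{2}\|w\|_2^2 + C\sum_j \max(0, 1 - y_j w^T x_j)$, split the training examples into $N$ blocks $B_1, \dots, B_N$, give block $i$ a local copy $w_i$ of the weight vector, and enforce consensus with a global variable $z$ (the consensus-ADMM pattern):

```latex
% Consensus form:
%   minimize    \sum_{i=1}^N f_i(w_i) + \tfrac{1}{2}\|z\|_2^2
%   subject to  w_i = z, \quad i = 1, \dots, N,
% where f_i(w) = C \sum_{j \in B_i} \max(0,\, 1 - y_j w^T x_j).
% The resulting ADMM updates (\bar{w}, \bar{u} denote averages over i):
\begin{align*}
w_i^{k+1} &= \operatorname*{argmin}_{w_i}\Big( f_i(w_i)
             + \tfrac{\rho}{2}\,\|w_i - z^k + u_i^k\|_2^2 \Big), \\
z^{k+1}   &= \operatorname*{argmin}_{z}\Big( \tfrac{1}{2}\|z\|_2^2
             + \tfrac{N\rho}{2}\,\|z - \bar{w}^{k+1} - \bar{u}^k\|_2^2 \Big)
           = \frac{N\rho}{1 + N\rho}\,\big(\bar{w}^{k+1} + \bar{u}^k\big), \\
u_i^{k+1} &= u_i^k + w_i^{k+1} - z^{k+1}.
\end{align*}
```

Each $w_i$-update is a small regularized SVM over block $B_i$ only and can be solved in parallel; the $z$-update has the closed form shown because its objective is a strongly convex quadratic (set the gradient $z + N\rho(z - \bar{w}^{k+1} - \bar{u}^k)$ to zero).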