Description
The goals of this exercise are to
• Illustrate that regularization techniques using a smoothing 2-norm are not well suited to problems
where the data has sharp gradients or discontinuities.
• Illustrate the potential benefits of Total Variation (TV) regularization.
Mathematical Background and Setup
Let A denote the blurring operator defined in the context of HW #2 with n = 220 and consider the
data x provided in the file TrueData.m (this is column 100 of the 220 × 220 image Datamatrix.png). The
noisy blurred data provided in the file BlurData.m was obtained as
\hat{d} = A x + \xi \qquad (1)
where ξ denotes a vector of random noise. The performance of the TSVD, Tikhonov, and TV methods in
reconstructing the true vector x is tested as follows.
Consider the solution \hat{x}_k provided by the TSVD,
\hat{x}_k = \sum_{i=1}^{k} \frac{u_i^T \hat{d}}{\sigma_i} \, v_i \qquad (2)
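For illustration, a minimal MATLAB sketch of (2) could look as follows; it assumes the blurring matrix A and the noisy data vector d (the \hat{d} of (1)) are already in scope, and the truncation level k is a placeholder to be tuned.

    % Truncated SVD reconstruction, Eq. (2).
    [U, S, V] = svd(A);                 % A = U*S*V'
    s = diag(S);                        % singular values, largest first
    k = 20;                             % truncation level (tune experimentally)
    coeff = (U(:,1:k)' * d) ./ s(1:k);  % u_i'*d / sigma_i for i = 1..k
    xk = V(:,1:k) * coeff;              % Eq. (2): sum_i coeff(i) * v_i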
Consider also the solution \hat{x}_{\lambda,L} provided by the Tikhonov regularization (with x_0 = 0):
(A^T A + \lambda^2 L^T L) \, \hat{x}_{\lambda,L} = A^T \hat{d} \qquad (3)
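Equation (3) amounts to a single linear solve. The sketch below, under the same assumptions on A and d, also shows one common way to build first- and second-difference operators L1 and L2; the operators used in class may differ, e.g., in their boundary treatment.

    % Tikhonov regularization, Eq. (3): (A'A + lambda^2 L'L) x = A'd.
    n = size(A, 2);
    lambda = 1e-2;                  % regularization parameter (tune)
    L  = eye(n);                    % identity regularization, L = I
    L1 = diff(eye(n));              % (n-1) x n first-difference operator
    L2 = diff(eye(n), 2);           % (n-2) x n second-difference operator
    xTik = (A'*A + lambda^2*(L'*L)) \ (A'*d);   % normal-equations solve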
Consider the total variation solution \hat{x}_{\alpha,\beta} obtained by solving the minimization problem
\min_{\hat{x} \in \mathbb{R}^n} J_{\alpha,\beta}(\hat{x}), \quad \text{where} \quad J_{\alpha,\beta}(\hat{x}) \stackrel{\text{def}}{=} \| A \hat{x} - \hat{d} \|^2 + \alpha^2 \, T(\hat{x}, \beta) \qquad (4)
and T(\hat{x}, \beta) is a smooth approximation to the 1-norm \| L_1 \hat{x} \|_1, defined as
T(\hat{x}, \beta) = \sum_{i=1}^{n-1} \sqrt{ \beta^2 + | \hat{x}_{i+1} - \hat{x}_i |^2 } \qquad (5)
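Minimizing (4)-(5) requires an iterative method; one common choice, sketched below as an assumption rather than a prescription, is the lagged-diffusivity fixed-point iteration, which freezes the TV weights at the current iterate so that each step reduces to a linear solve.

    % Lagged-diffusivity iteration for Eqs. (4)-(5).
    alpha = 0.1;  beta = 1e-3;      % TV parameters (tune experimentally)
    L1 = diff(eye(n));              % first-difference operator
    xTV = zeros(n, 1);              % initial guess
    for iter = 1:50
        w = 1 ./ sqrt(beta^2 + (L1*xTV).^2);   % 1/sqrt(beta^2 + |x_{i+1}-x_i|^2)
        D = spdiags(w, 0, n-1, n-1);           % diagonal weight matrix
        % Stationarity of (4): 2A'(Ax - d) + alpha^2 * L1'*D*L1 * x = 0
        xNew = (2*(A'*A) + alpha^2*(L1'*D*L1)) \ (2*(A'*d));
        if norm(xNew - xTV) <= 1e-8*norm(xNew), xTV = xNew; break, end
        xTV = xNew;
    end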
Homework requirements:
(30/10 pts) Implement the TSVD (2) and the Tikhonov regularization (3) with L = I to reconstruct the true
data x. Provide the graphs of the reconstructed data \hat{x} and the error in the approximation, \hat{x} − x.
(40 pts) Experiment further with the regularization operator L taken as L1 and L2. Provide the graphs of the
reconstructed data \hat{x} and the error in the approximation, \hat{x} − x.
MTH 510 students only
(20 pts) Implement the TV method (4)-(5). Find appropriate values for the parameters α and β, and provide the
graphs of the reconstructed data \hat{x}_{\alpha,\beta} and the error in the approximation, \hat{x}_{\alpha,\beta} − x
(one possible parameter sweep is sketched below these requirements).
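Because the true x is available in this exercise, one simple (if idealized) way to select parameters is to sweep a grid and keep the value minimizing the reconstruction error. In the sketch below, tv_reconstruct is a hypothetical helper wrapping the fixed-point loop sketched earlier, and the grid itself is illustrative.

    % Idealized choice of alpha: minimize ||x_hat - x|| over a log grid.
    alphas = logspace(-3, 1, 20);
    errs = zeros(size(alphas));
    for j = 1:numel(alphas)
        xHat = tv_reconstruct(A, d, alphas(j), beta);  % hypothetical wrapper
        errs(j) = norm(xHat - x);
    end
    [~, jBest] = min(errs);
    semilogx(alphas, errs), xlabel('\alpha'), ylabel('reconstruction error')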