EECE 7220 / 8220: Scientific Computing Homework: 4


Learning Objective: To understand how to solve a rectangular matrix system of equations (a least squares problem) using: 1) QR decomposition, 2) the Moore-Penrose inverse (from the normal equation), 3) the SVD, and 4) an optimization procedure.
Data Generation: To evaluate your algorithms (programs), generate test data $\{(x_i, y_i)\}_{i=1}^{100}$ that has a quadratic relationship, as follows (a MATLAB sketch of the full data generation is given after this list):
1. Let the underlying relationship between the variables $x_i$ (independent) and $y_i$ (dependent) be quadratic, i.e. $y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2$.
2. Choose any non-zero values for the quadratic model parameters $\beta_0$, $\beta_1$, and $\beta_2$.
3. Generate 100 data points from the quadratic model $\{(x_i, y_i)\}_{i=1}^{100}$, where $y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2$. For example, choose $x_i = -50$ to $49$; $\beta_0 = 100$; $\beta_1 = -2$; and $\beta_2 = 3$.
4. The generated data points $\{(x_i, y_i)\}_{i=1}^{100}$ now lie exactly on a quadratic polynomial.
5. Add random variation to each of the data points as follows: $y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + \epsilon_i$, where the random variable $\epsilon_i$ follows a normal distribution with mean 0 and standard deviation $\sigma$, i.e. $\epsilon_i \sim \mathrm{Normal}(0, \sigma^2)$. You can choose any standard deviation $\sigma$, e.g. $\sigma = 500$. Hint: in MATLAB, you can use eps = normrnd(mu, sigma, 100, 1) to draw 100 random $\epsilon_i$ from a normal distribution with mean mu = 0 and standard deviation sigma.
6. Example model parameters for data generation: $\beta_0 = 100$; $\beta_1 = -2$; $\beta_2 = 3$; $\vec{x} = \{-50, -49, -48, \ldots, 48, 49\}$; $\sigma = 500$.
7. Least squares problem to be solved: $X_{100\times 3}\,\vec{\beta}_{3\times 1} = \vec{y}_{100\times 1}$.
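
A minimal MATLAB sketch of the data generation and of assembling the 100×3 design matrix X in step 7 (the variable name beta_true is an illustrative choice, and normrnd requires the Statistics and Machine Learning Toolbox):

% Generate noisy quadratic test data and the 100x3 design matrix.
beta_true = [100; -2; 3];            % example [beta0; beta1; beta2] from step 6
x = (-50:49)';                       % 100 sample points x_i
sigma = 500;                         % noise standard deviation
eps = normrnd(0, sigma, 100, 1);     % 100 draws from Normal(0, sigma^2); shadows MATLAB's built-in eps
y = beta_true(1) + beta_true(2)*x + beta_true(3)*x.^2 + eps;   % noisy observations y_i
X = [ones(100,1), x, x.^2];          % design matrix, so X*beta = y is the least squares problem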
AUXILIARY ALGORITHMS
Modified Gram-Schmidt Orthogonalization Procedure:
Input: A set of $r$ linearly independent vectors $\{\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_r\}$
Output: A set of $r$ orthonormal vectors $\{\vec{q}_1, \vec{q}_2, \ldots, \vec{q}_r\}$ such that $\mathrm{Span}(\{\vec{x}_i\}_{i=1}^{r}) = \mathrm{Span}(\{\vec{q}_i\}_{i=1}^{r})$

r_11 ← ‖x_1‖
if r_11 = 0 then
    Stop
else
    q_1 ← x_1 / r_11
end
for j ← 2 to r do
    q̂ ← x_j
    for i ← 1 to j − 1 do
        r_ij ← (q̂, q_i)
        q̂ ← q̂ − r_ij q_i
    end
    r_jj ← ‖q̂‖
    if r_jj = 0 then
        Stop
    else
        q_j ← q̂ / r_jj
    end
end

Algorithm 1: Modified Gram-Schmidt Orthogonalization Procedure
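
A MATLAB sketch of Algorithm 1, written as a function that also collects the coefficients r_ij into an upper triangular matrix R so that A = QR (the function name mgs_qr and the error messages are assumed choices, not part of the assignment):

function [Q, R] = mgs_qr(A)
% Modified Gram-Schmidt orthogonalization of the columns of A (m-by-n, m >= n).
% Returns Q (m-by-n) with orthonormal columns and upper triangular R (n-by-n)
% such that A = Q*R.
    [m, n] = size(A);
    Q = zeros(m, n);
    R = zeros(n, n);
    R(1,1) = norm(A(:,1));
    if R(1,1) == 0
        error('Columns are linearly dependent.');
    end
    Q(:,1) = A(:,1) / R(1,1);
    for j = 2:n
        qhat = A(:,j);
        for i = 1:j-1
            R(i,j) = Q(:,i)' * qhat;        % r_ij = (qhat, q_i)
            qhat = qhat - R(i,j) * Q(:,i);  % remove the q_i component from qhat
        end
        R(j,j) = norm(qhat);
        if R(j,j) == 0
            error('Columns are linearly dependent.');
        end
        Q(:,j) = qhat / R(j,j);             % q_j = qhat / r_jj
    end
end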
Back-substitution Procedure to solve $R\vec{\beta} = \vec{y}$:
Input: 1. An upper triangular matrix $R \in \mathbb{C}^{n \times n}$; and 2. a right-side vector $\vec{y} \in \mathbb{C}^{n \times 1}$
Output: $\vec{\beta} \in \mathbb{C}^{n \times 1}$

for j ← n down to 1 do
    β_j ← ( y_j − Σ_{k=j+1}^{n} r_jk β_k ) / r_jj
end

Algorithm 2: Back-substitution procedure
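
A corresponding MATLAB sketch of Algorithm 2 (the function name back_substitute is an assumed choice):

function beta = back_substitute(R, y)
% Solve R*beta = y by back-substitution, where R is n-by-n upper triangular.
    n = size(R, 1);
    beta = zeros(n, 1);
    for j = n:-1:1
        % For j = n the sum over k = j+1, ..., n is empty and contributes 0.
        beta(j) = (y(j) - R(j, j+1:n) * beta(j+1:n)) / R(j, j);
    end
end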
Homework Problem:
1. Using QR decomposition, write a program to solve for the model parameters $\vec{\beta}$ in $X_{100\times 3}\,\vec{\beta}_{3\times 1} = \vec{y}_{100\times 1}$ using the data generated above (a MATLAB sketch combining these steps is given after the list of steps):
· With $X = QR$,
$X\vec{\beta} = \vec{y}$
$QR\vec{\beta} = \vec{y}$
· Multiplying both sides by $Q^{*}$ and using the fact that $Q^{*}Q = I$ for rectangular matrices with orthonormal columns,
$Q^{*}QR\vec{\beta} = Q^{*}\vec{y}$
$R\vec{\beta} = Q^{*}\vec{y}$
· Recognize that $R$ is an upper-triangular matrix.
· Solve for the unknown parameters $\vec{\beta} = (\beta_1, \ldots, \beta_n)^{T}$ from $R\vec{\beta} = Q^{*}\vec{y}$ using the back-substitution procedure.
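
One possible MATLAB sketch for this problem, assuming the mgs_qr and back_substitute functions sketched in the Auxiliary Algorithms section and the X and y from the data-generation sketch:

% Least squares solution of X*beta = y via QR decomposition.
[Q, R] = mgs_qr(X);                     % X = Q*R with Q (100x3), R (3x3) upper triangular
beta_qr = back_substitute(R, Q' * y);   % solve R*beta = Q'*y by back-substitution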
2. Write a program to solve for the model parameters $\vec{\beta}$ from the normal equation of $X_{100\times 3}\,\vec{\beta}_{3\times 1} = \vec{y}_{100\times 1}$ using the data generated above (refer to the lecture notes for details of the solution using the normal equation; a sketch is given below).
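
A minimal sketch of the normal-equation solution $\vec{\beta} = (X^{*}X)^{-1}X^{*}\vec{y}$, shown with MATLAB's backslash operator and, equivalently, with the built-in Moore-Penrose pseudoinverse (the derivation itself is in the lecture notes):

% Normal equation: (X'*X)*beta = X'*y.
beta_ne = (X' * X) \ (X' * y);   % solve the 3x3 normal equations
beta_pinv = pinv(X) * y;         % equivalent: apply the Moore-Penrose pseudoinverse directly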
3. Using singular value decomposition (use your SVD code from homework 1), write a program to solve for the model parameters $\vec{\beta}$ from $X_{100\times 3}\,\vec{\beta}_{3\times 1} = \vec{y}_{100\times 1}$ using the data generated above:
· $X \in \mathbb{C}^{m \times n}$ where $m > n$.
· Using the SVD, $X = U\Sigma V^{*}$, where the columns of $U$ and $V$ are orthonormal:
$X\vec{\beta} = \vec{y}$
$U\Sigma V^{*}\vec{\beta} = \vec{y}$
· Multiplying both sides by $U^{*}$ and using the fact that $U^{*}U = I$ for matrices with orthonormal columns,
$U^{*}U\Sigma V^{*}\vec{\beta} = U^{*}\vec{y}$
$\Sigma V^{*}\vec{\beta} = U^{*}\vec{y}$
$\vec{\beta} = (\Sigma V^{*})^{-1}\,U^{*}\vec{y}$
· Observe that $\Sigma$ is diagonal and $V$ is unitary, so $(\Sigma V^{*})^{-1} = V\Sigma^{-1}$ and $\vec{\beta} = V\Sigma^{-1}U^{*}\vec{y}$ is computed easily once the SVD is available (a MATLAB sketch is given below).
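
A sketch of the SVD-based solution; MATLAB's built-in svd is used here only as a stand-in for your homework-1 SVD code, with the economy-size option so that $U$ is 100×3 and $\Sigma$ is 3×3:

% Thin SVD: X = U*S*V' with U (100x3), S (3x3 diagonal), V (3x3).
[U, S, V] = svd(X, 'econ');
beta_svd = V * (S \ (U' * y));   % beta = V * inv(S) * U' * y; S is diagonal, so the solve is trivial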
4. Using unconstrained minimization, write a program to solve for the model parameters $\vec{\beta}$ from $X_{100\times 3}\,\vec{\beta}_{3\times 1} = \vec{y}_{100\times 1}$ using the data generated above. Note: you can use the built-in MATLAB function fminunc() to solve the problem.

$\hat{\beta} = \operatorname*{arg\,min}_{\vec{\beta}} \; \lVert \vec{y} - X\vec{\beta} \rVert^{2}$
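
A sketch of the unconstrained-minimization approach; the anonymous objective function and the zero starting point beta0 are illustrative choices, and fminunc requires the Optimization Toolbox:

% Minimize the squared residual norm ||y - X*beta||^2 over beta.
objective = @(beta) norm(y - X * beta)^2;
beta0 = zeros(3, 1);                    % arbitrary starting point
beta_opt = fminunc(objective, beta0);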