Description
tldr: Perform linear regression of a noisy sine wave using a set of Gaussian basis
functions with learned location and scale parameters. Model parameters are
learned with stochastic gradient descent. Use of automatic differentiation is
required. Hint: note your limits!
Problem Statement
Consider a set of scalars $\{x_1, x_2, \ldots, x_N\}$ drawn from $U(0, 1)$
and a corresponding set $\{y_1, y_2, \ldots, y_N\}$ where:

$$y_i = \sin(2\pi x_i) + \epsilon_i \quad (1)$$

and $\epsilon_i$ is drawn from $\mathcal{N}(0, \sigma_{\text{noise}})$. Given the following functional form:
$$\hat{y}_i = \sum_{j=1}^{M} w_j \,\phi_j(x_i \mid \mu_j, \sigma_j) + b \quad (2)$$
with:
$$\phi(x \mid \mu, \sigma) = \exp\!\left(\frac{-(x - \mu)^2}{\sigma^2}\right) \quad (3)$$
find estimates $\hat{b}$, $\{\hat{\mu}_j\}$, $\{\hat{\sigma}_j\}$, and $\{\hat{w}_j\}$ that minimize the loss function:
$$J(y, \hat{y}) = \frac{1}{2}(y - \hat{y})^2 \quad (4)$$
for all $(x_i, y_i)$ pairs. Estimates for the parameters must be found using stochastic
gradient descent. A framework that supports automatic differentiation must be used.
Set $N = 50$, $\sigma_{\text{noise}} = 0.1$. Select $M$ as appropriate. Produce two plots. First,
show the data points, a noiseless sine wave, and the manifold produced by the
regression model. Second, show each of the $M$ basis functions. Plots must be of
suitable visual quality.
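A minimal training sketch is given below, assuming PyTorch as the automatic-differentiation framework; the specific value of $M$, the learning rate, the number of SGD steps, and the log-parameterization of the scales are illustrative choices, not requirements from the statement.

```python
import math
import torch

# Constants from the problem statement; M and the optimizer settings are assumptions.
N, M = 50, 6
SIGMA_NOISE = 0.1

torch.manual_seed(0)

# Synthetic data: x_i ~ U(0, 1), y_i = sin(2*pi*x_i) + eps_i, eps_i ~ N(0, sigma_noise).
x = torch.rand(N)
y = torch.sin(2 * math.pi * x) + SIGMA_NOISE * torch.randn(N)

# Learnable parameters: weights w_j, bias b, locations mu_j, and log-scales.
# Parameterizing sigma through its log keeps the scales strictly positive.
w = torch.randn(M, requires_grad=True)
b = torch.zeros(1, requires_grad=True)
mu = torch.rand(M, requires_grad=True)
log_sigma = torch.full((M,), math.log(0.2), requires_grad=True)

def predict(xb):
    """Evaluate y_hat(x) = sum_j w_j * phi_j(x | mu_j, sigma_j) + b, Eqs. (2)-(3)."""
    sigma = torch.exp(log_sigma)
    phi = torch.exp(-((xb[:, None] - mu) ** 2) / sigma ** 2)  # shape (batch, M)
    return phi @ w + b

optimizer = torch.optim.SGD([w, b, mu, log_sigma], lr=0.05)

for step in range(20_000):
    i = torch.randint(0, N, (1,))                 # one (x_i, y_i) pair per SGD step
    optimizer.zero_grad()
    loss = 0.5 * (y[i] - predict(x[i])) ** 2      # J(y, y_hat) from Eq. (4)
    loss.sum().backward()                         # autodiff supplies all gradients
    optimizer.step()
```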
[Figure 1: Example plots for models with equally spaced sigmoid and Gaussian basis functions. Four panels: "Fit 1", "Bases for Fit 1", "Fit 2", and "Bases for Fit 2", each plotting y against x.]
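Continuing the sketch above (and reusing its `predict`, `mu`, `log_sigma`, and data tensors), plots in the spirit of Figure 1 could be produced with matplotlib; the layout and styling here are arbitrary illustrative choices.

```python
import matplotlib.pyplot as plt

xs = torch.linspace(0.0, 1.0, 200)
with torch.no_grad():
    y_hat = predict(xs)                                              # model manifold
    phi = torch.exp(-((xs[:, None] - mu) ** 2) / torch.exp(log_sigma) ** 2)

# Plot 1: data points, noiseless sine wave, and the fitted model.
plt.figure()
plt.scatter(x, y, s=15, label="data")
plt.plot(xs, torch.sin(2 * math.pi * xs), label="sin(2πx)")
plt.plot(xs, y_hat, label="model")
plt.xlabel("x"); plt.ylabel("y"); plt.legend(); plt.title("Fit")

# Plot 2: each of the M learned basis functions.
plt.figure()
for j in range(M):
    plt.plot(xs, phi[:, j])
plt.xlabel("x"); plt.ylabel("y"); plt.title("Bases")

plt.show()
```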