STATS 202: Data Mining and Analysis HOMEWORK # 1


Introduction
Homework problems are selected from the course textbook: An Introduction to Statistical Learning.
Problem 1 (4 points)
Chapter 2, Exercise 2 (p. 52).
Problem 2 (4 points)
Chapter 2, Exercise 3 (p. 52).
Problem 3 (4 points)
Chapter 2, Exercise 7 (p. 53).
Problem 4 (4 points)
Chapter 10, Exercise 1 (p. 413).
Problem 5 (4 points)
Chapter 10, Exercise 2 (p. 413).
Problem 6 (4 points)
Chapter 10, Exercise 4 (p. 414).
Problem 7 (4 points)
Chapter 10, Exercise 9 (p. 416).
Problem 8 (4 points)
Chapter 3, Exercise 4 (p. 120).
Problem 9 (4 points)
Chapter 3, Exercise 9 (p. 122). In parts (e) and (f), you need only try a few interactions and transformations.
Problem 10 (4 points)
Chapter 3, Exercise 14 (p. 125).
Problem 11 (5 points)
Let x_1, …, x_n be a fixed set of input points and y_i = f(x_i) + ε_i, where ε_i iid∼ P with E(ε_i) = 0 and Var(ε_i) < ∞. Prove that the MSE of a regression estimate f̂ fit to (x_1, y_1), …, (x_n, y_n) at a random test point x_0, i.e.

E[(y_0 − f̂(x_0))²],

decomposes into variance, squared bias, and irreducible error components.
Hint: You can apply the bias-variance decomposition proved in class.
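The decomposition in Problem 11 can be checked numerically before proving it. The sketch below is purely illustrative and not part of the assignment: it assumes a hypothetical truth f(x) = sin(x) with noise σ = 0.3, refits a straight line to many simulated training sets, and compares the empirical MSE at a test point x_0 to the sum of the variance, squared bias, and irreducible error terms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the assignment): f(x) = sin(x), sigma = 0.3
f = np.sin
sigma = 0.3
x_train = np.linspace(0.0, 3.0, 30)  # fixed input points x_1, ..., x_n
x0 = 1.5                             # fixed test point

# Repeatedly draw training noise, fit a degree-1 polynomial (a biased
# model for sin), and record the prediction at x0.
preds = []
for _ in range(20000):
    y_train = f(x_train) + rng.normal(0.0, sigma, x_train.size)
    coef = np.polyfit(x_train, y_train, deg=1)
    preds.append(np.polyval(coef, x0))
preds = np.array(preds)

# Independent test responses y0 = f(x0) + eps at the same point x0
y0 = f(x0) + rng.normal(0.0, sigma, preds.size)

mse = np.mean((y0 - preds) ** 2)          # E[(y0 - fhat(x0))^2]
variance = preds.var()                    # Var(fhat(x0))
bias_sq = (preds.mean() - f(x0)) ** 2     # (E[fhat(x0)] - f(x0))^2
irreducible = sigma ** 2                  # Var(eps)

print(f"MSE          = {mse:.4f}")
print(f"Var+Bias2+s2 = {variance + bias_sq + irreducible:.4f}")
```

The two printed numbers should agree up to Monte Carlo error, which is exactly the identity the problem asks you to prove in general.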
Problem 12 (5 points)
Consider the regression through the origin model (i.e. with no intercept):

y_i = βx_i + ε_i    (1)
(a) (1 point) Find the least squares estimate for β.
(b) (2 points) Assume ε_i iid∼ P such that E(ε_i) = 0 and Var(ε_i) = σ² < ∞. Find the standard error of the estimate.
(c) (2 points) Find conditions that guarantee that the estimator is consistent. N.b. an estimator β̂_n of a parameter β is consistent if β̂_n →ᵖ β, i.e. if the estimator converges to the parameter value in probability.