CptS 570 Machine Learning Homework #1


1 Analytical Part (2 percent grade)
This part will be graded as a PASS or FAIL.
1. Answer the following questions with a yes or no along with proper justification.
a. Is the decision boundary of voted perceptron linear?
b. Is the decision boundary of averaged perceptron linear?
2. In class, we saw the Passive-Aggressive (PA) update, which tries to achieve a margin equal
to one after each update. Derive the PA weight update for achieving margin M.
3. Consider the following setting. You are provided with n training examples: (x1, y1, h1), · · · , (xn, yn, hn),
where xi is the input example, yi is the class label (+1 or -1), and hi > 0 is the importance
weight of the example. The teacher gave you some additional information by specifying the
importance of each training example.
a. How will you modify the perceptron algorithm to be able to leverage this extra information? Please justify your answer.
b. How can you solve this learning problem using the standard perceptron algorithm? Please
justify your answer. I’m looking for a reduction based solution.
4. Consider the following setting. You are provided with n training examples: (x1, y1), (x2, y2), · · · , (xn, yn),
where xi is the input example, and yi is the class label (+1 or -1). However, the training data
is highly imbalanced (say 90% of the examples are negative and 10% of the examples are
positive) and we care more about the accuracy on positive examples.
a. How will you modify the perceptron algorithm to solve this learning problem? Please
justify your answer.
b. How can you solve this learning problem using the standard perceptron algorithm? Please
justify your answer. I’m looking for a reduction based solution.
2 Programming and Empirical Analysis Part (6 percent grade)
1. Programming and empirical analysis question.
Implement a binary classifier with both perceptron and passive-aggressive (PA) weight update
as shown below.
Algorithm 1 Online Binary-Classifier Learning Algorithm
Input: D = Training examples, T = maximum number of training iterations
Output: w, the final weight vector
1: Initialize the weights w = 0
2: for each training iteration itr ∈ {1, 2, · · · , T} do
3:   for each training example (xt, yt) ∈ D do
4:     ŷt = sign(w · xt) // predict using the current weights
5:     if mistake then
6:       w = w + τ · yt · xt // update the weights
7:     end if
8:   end for
9: end for
10: return final weight vector w
For the standard perceptron, you will use τ = 1, and for the Passive-Aggressive (PA) algorithm, you
will compute the learning rate τ as follows:

τ = (1 − yt · (w · xt)) / ‖xt‖²   (1)
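The pseudocode and learning-rate rule above can be sketched in Python with NumPy. This is a minimal illustrative sketch, not the required submission: the function name, the `update` parameter, and the tie-breaking convention sign(0) = +1 are assumptions not fixed by the assignment.

```python
import numpy as np

def train_online_binary(D, T, update="perceptron"):
    """Online binary-classifier learning, following Algorithm 1.

    D: list of (x, y) pairs, x a NumPy array, y in {+1, -1}.
    T: maximum number of training iterations.
    update: "perceptron" uses tau = 1; "pa" uses the PA rate from Eq. (1).
    """
    w = np.zeros(len(D[0][0]))               # 1: initialize weights w = 0
    for _ in range(T):                       # 2: for each training iteration
        for x, y in D:                       # 3: for each training example
            score = np.dot(w, x)
            y_hat = 1 if score >= 0 else -1  # 4: predict (sign(0) = +1 assumed)
            if y_hat != y:                   # 5: mistake
                if update == "pa":
                    tau = (1 - y * score) / np.dot(x, x)  # Eq. (1)
                else:
                    tau = 1.0
                w = w + tau * y * x          # 6: update the weights
    return w                                 # 10: final weight vector
```

On a linearly separable toy set, both variants converge to a separating weight vector; only the step size τ differs on each mistake.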
Implement a multi-class online learning algorithm with both perceptron and passive-aggressive
(PA) weight update as shown below. Employ the single weight vector representation (representation II, as discussed in class). This representation is defined as follows. Each training example
is of the form (xt, yt), where xt ∈