EE 559 Homework Week 12

1. (a) Two classes are described by normal densities as follows:

$$p(\mathbf{x} \mid S_i) = N(\mathbf{x};\, \mathbf{m}_i, \Sigma_i), \quad i = 1, 2, \qquad P(S_1) = P(S_2) = 0.5.$$

Write an expression for the decision rule, simplified as much as possible. Try to put your answer into this general form:

$$\mathbf{w}^T \mathbf{x} \;\underset{S_2}{\overset{S_1}{\gtrless}}\; -w_0$$

and give expressions for $\mathbf{w}$ and $w_0$ in terms of given quantities. If it isn't expressible in this form, give the discriminant function in simplest form.

(b) For this part you are also given:

$$\mathbf{m}_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \quad \mathbf{m}_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}, \quad \Sigma_1 = \Sigma_2 = \begin{bmatrix} 1 & 0.5 \\ 0.5 & 2.25 \end{bmatrix}.$$

Solve, algebraically, for the Bayes minimum error classifier (i.e., give the resulting decision rule algebraically). Plot (by hand or by computer) the decision boundary and label the decision regions in 2D nonaugmented feature space.

2. A Naïve Bayes classifier is a Bayes classifier in which the features, conditioned on class, are assumed independent; thus:

$$p(\mathbf{x} \mid S_i) = \prod_{j=1}^{D} p(x_j \mid S_i) \quad \forall i = 1, \ldots, C.$$

Repeat Problem 1 for a Naïve Bayes classifier, except let the feature variances in part (b) be

$$\left(\sigma_1^{(i)}\right)^2 = 1, \quad \left(\sigma_2^{(i)}\right)^2 = 2.25, \qquad i = \text{class index} = 1, 2.$$

3. A Bayes minimum risk classifier uses a different criterion than a Bayes minimum error classifier. This classifier was introduced in Lecture 19, and is covered in DHS Sec. 2.2. In this problem, please use our notation ($S_i$ for class $i$ instead of $\omega_i$).

For a Bayes minimum risk classifier with the given information of Problem 1 above, and losses $\lambda_{ij}$ given by:
$$\lambda_{ii} = 0, \; i = 1, 2; \qquad \lambda_{12} = 2\lambda_{21} > 0,$$

solve for the decision regions and boundary. Plot them in 2D nonaugmented feature space; compare to the plot of Problem 1. Which decision region has grown because of the given losses? For the losses given above, which decision incurs more loss (deciding $S_1$ or deciding $S_2$)?
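The hand derivations above can be sanity-checked numerically. The sketch below assumes the standard equal-covariance Gaussian discriminant results ($\mathbf{w} = \Sigma^{-1}(\mathbf{m}_1 - \mathbf{m}_2)$ and the usual $w_0$), which the assignment asks you to derive yourself; treat it as a check on your algebra, not as the derivation. The function name `linear_rule` is ours, not the course's.

```python
import numpy as np

# Given data from Problem 1(b); priors are equal.
m1 = np.array([1.0, 1.0])
m2 = np.array([1.0, -1.0])
Sigma = np.array([[1.0, 0.5],
                  [0.5, 2.25]])

def linear_rule(m1, m2, Sigma, P1=0.5, P2=0.5):
    """Linear discriminant for Gaussian classes with equal covariance
    (assumed standard result, to be derived in Problem 1(a)):
    decide S1 when w^T x > -w0."""
    w = np.linalg.solve(Sigma, m1 - m2)                 # w = Sigma^{-1}(m1 - m2)
    w0 = -0.5 * (m1 + m2) @ w + np.log(P1 / P2)         # log-prior term is 0 here
    return w, w0

# Problem 1: full covariance matrix.
w, w0 = linear_rule(m1, m2, Sigma)
print("Problem 1: w =", w, " w0 =", w0)                 # boundary: w^T x = -w0

# Problem 2: naive Bayes keeps only the per-feature variances
# (sigma_1)^2 = 1 and (sigma_2)^2 = 2.25, same for both classes,
# i.e., a diagonal covariance.
Sigma_nb = np.diag([1.0, 2.25])
w_nb, w0_nb = linear_rule(m1, m2, Sigma_nb)
print("Problem 2: w =", w_nb, " w0 =", w0_nb)

# Problem 3: with lambda_ii = 0 and lambda_12 = 2*lambda_21, the
# minimum-risk rule replaces the threshold 0 on the log-likelihood
# ratio with ln(lambda_12/lambda_21) = ln 2, shifting the boundary
# of Problem 1 and enlarging one decision region.
shift = np.log(2.0)
print("Problem 3: decide S1 when w^T x + w0 >", shift)
```

Comparing the printed boundaries against your algebraic answers (and against each other) makes it easy to see how dropping the off-diagonal covariance term in Problem 2, or adding the loss-ratio threshold in Problem 3, moves the decision boundary.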