EE 559 Homework - Week 14
1. DHS Ch. 3 Problem 2. (The problem starts "Let x have a uniform density..." and
you are asked to find the Maximum Likelihood estimate of a parameter of the
uniform density.)
Hint: If you find that taking the derivative w.r.t. $\theta$ won't give a meaningful answer,
try writing the expression for the likelihood and maximizing it by inspection.
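A minimal numeric sketch in Python (not a substitute for the derivation), assuming the parameter in question is the upper limit $\theta$ of a uniform density on $[0, \theta]$; the sample values are made up:

    import numpy as np

    x = np.array([0.7, 2.1, 1.3, 2.9, 0.4])   # hypothetical samples from U(0, theta)
    thetas = np.linspace(0.1, 5.0, 1000)       # candidate values of theta

    # Likelihood of U(0, theta): theta^(-n) if theta >= max(x), else 0,
    # since any theta smaller than an observed sample has zero likelihood.
    lik = np.where(thetas >= x.max(), thetas ** (-len(x)), 0.0)

    print("theta maximizing likelihood:", thetas[np.argmax(lik)])  # ~ max(x) = 2.9

Inspecting the likelihood this way shows why calculus fails here: the likelihood is discontinuous at max(x) and strictly decreasing beyond it.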
2. DHS, Chapter 3, Problem 17, plus additional part (f) given below. This problem
starts "The purpose of this problem is to derive the Bayesian classifier for the
d-dimensional multivariate Bernoulli case." It walks you through the Bayesian method
for parameter estimation in a Bayes minimum error classifier (Approach 2,
"Integrating over posterior predictive," in Lecture 25 notes).
Note that $x_i$ is binary, with $x_i \in \{0, 1\}$, and that dataset $D$ is the same set of samples
that we called vector $z$ in lecture. Also, $\theta_i = $ probability that $x_i = 1$, with
$0 \le \theta_i \le 1 \;\; \forall i$.
Hints: For part (b), the uniform distribution for $p(\theta)$ extends over $0 \le \theta \le 1$; also,
use Bayes theorem. For part (d), the integral they refer to is Eq. (2) in Lecture 25
notes (or Eq. (3) with the class distinctions ($S_i$ and subscripts denoting class $i$)
omitted).
(f) Consider a C-class problem, in which class $S_k$ contains $n_k$ samples denoted
as the set $D_k$ (or the vector $z_k$), and these samples sum to
$\mathbf{s}^{(k)} = \left[ s_1^{(k)}, \ldots, s_d^{(k)} \right]^T$. Give the Bayes minimum error decision rule based
on your estimation technique of parts (a)-(d). For estimates of the priors
$P(S_k)$, use the frequency of occurrence of the prototypes.
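If you want to sanity-check your part (f) rule numerically, here is a minimal Python sketch. It assumes the standard result that parts (a)-(d) lead to, namely that a uniform prior on each $\theta_i$ gives the posterior-predictive estimate $(s_i + 1)/(n + 2)$; the class summaries below are hypothetical:

    import numpy as np

    def log_predictive(x, s, n):
        """Log of p(x | D_k) for a binary feature vector x, given the
        per-feature sums s (= s^(k)) and sample count n (= n_k) of one class."""
        theta = (s + 1.0) / (n + 2.0)          # posterior-predictive estimate
        return np.sum(x * np.log(theta) + (1 - x) * np.log(1 - theta))

    # Hypothetical 2-class, 3-feature training summary.
    s = {1: np.array([4, 1, 3]), 2: np.array([0, 5, 2])}   # feature sums s^(k)
    n = {1: 6, 2: 7}                                       # class counts n_k
    n_total = sum(n.values())

    def classify(x):
        # Bayes minimum-error rule: maximize P(S_k) p(x | D_k), with priors
        # estimated by frequency of occurrence, P(S_k) = n_k / n_total.
        scores = {k: np.log(n[k] / n_total) + log_predictive(x, s[k], n[k])
                  for k in s}
        return max(scores, key=scores.get)

    print(classify(np.array([1, 0, 1])))       # -> 1 for this toy data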
3. In this problem you will use k-nearest-neighbor density estimation. You are given
the following data points for a 2-class problem in a 1-D feature space:
$S_1$: 1.0, 4.0
$S_2$: 3.0, 6.0
For parts (a)-(c) below, use $k = 2$.
(a) Graph the k-nearest-neighbor estimates of the density functions $p(x|S_1)$
and $p(x|S_2)$. Be sure to label pertinent values on both axes. Also give the
density estimates algebraically, for each region in feature space.
(b) Estimate the a priori probabilities based on frequency of occurrence of the
prototypes in each class.
(c) Use the estimates you have developed in (a)-(b) above to find the decision
boundaries and regions for a Bayes minimum-error classifier based on k-nearest
neighbors.
(d) For this part, use instead the decision rule for a discriminative kNN classifier
(which doesn't calculate $p(x|S_i)$ explicitly). Let $k = 3$ (= number of
samples over all classes that are inside region $R$), and find the decision
boundary and regions. Classify the points 3.25, 3.75. (Hint: if you're not
sure how to come up with the boundary and regions, try classifying the two
points first. A numeric sketch follows below.)
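A minimal Python sketch for numerically checking parts (a)-(d), not a substitute for the graphs and algebra asked for above. It assumes the common 1-D convention $V_k(x) = 2 \times$ (distance from $x$ to the k-th nearest prototype of the class) in the estimate $\hat{p}(x|S_i) = k / (n_i V_k(x))$; if lecture used a different volume convention, adjust accordingly:

    import numpy as np

    data = {1: np.array([1.0, 4.0]), 2: np.array([3.0, 6.0])}   # prototypes per class
    n_total = sum(len(pts) for pts in data.values())

    def knn_density(x, pts, k):
        # kNN density estimate: k / (n_i * V_k(x)), V_k(x) = 2 * k-th NN distance.
        d = np.sort(np.abs(pts - x))
        return k / (len(pts) * 2.0 * d[k - 1])

    def bayes_knn(x, k=2):
        # Parts (a)-(c): Bayes minimum-error rule with P(S_i) = n_i / n_total
        # and the kNN class-conditional density estimates above.
        scores = {i: (len(pts) / n_total) * knn_density(x, pts, k)
                  for i, pts in data.items()}
        return max(scores, key=scores.get)

    def discriminative_knn(x, k=3):
        # Part (d): among the k nearest prototypes over ALL classes, vote by
        # class membership; no explicit density estimate is formed.
        labeled = [(abs(p - x), i) for i, pts in data.items() for p in pts]
        votes = [i for _, i in sorted(labeled)[:k]]
        return max(set(votes), key=votes.count)

    for x in (3.25, 3.75):
        print(x, bayes_knn(x, k=2), discriminative_knn(x, k=3))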