VE 492 Homework 9
Question 1: Maximum Likelihood Estimation

We will begin with a short derivation. Consider a probability distribution with a domain that consists of $n$ different values. We get to observe $N$ total samples from this distribution. We use $N_i$ to represent the number of the samples for which outcome $i$ occurs. Our goal is to estimate the probabilities $\theta_1, \theta_2, \ldots, \theta_{n-1}$ of each of the events. The probability of the last outcome, $n$, equals $1 - \sum_{i=1}^{n-1} \theta_i$.

In maximum likelihood estimation, we choose the $\theta_i$ that maximize the likelihood of the observed samples,

$$L(\theta) = \prod_{i=1}^{n-1} \theta_i^{N_i} \left( 1 - \sum_{i=1}^{n-1} \theta_i \right)^{N_n}.$$

For this derivation, it is easiest to work with the log of the likelihood. Maximizing the log-likelihood also maximizes the likelihood, since the two quantities are related by a monotonic transformation. Taking logs we obtain

$$\log L(\theta) = \sum_{i=1}^{n-1} N_i \log \theta_i + N_n \log \left( 1 - \sum_{i=1}^{n-1} \theta_i \right).$$

Setting the derivative with respect to each $\theta_i$ equal to zero, we obtain $n-1$ equations in the $n-1$ unknowns $\theta_1, \ldots, \theta_{n-1}$:

$$\frac{N_i}{\theta_i} - \frac{N_n}{1 - \sum_{j=1}^{n-1} \theta_j} = 0, \qquad i = 1, \ldots, n-1.$$

Multiplying through by $\theta_i \left( 1 - \sum_{j=1}^{n-1} \theta_j \right)$ turns the original $n-1$ nonlinear equations into $n-1$ linear equations:

$$N_i \left( 1 - \sum_{j=1}^{n-1} \theta_j \right) = N_n \, \theta_i, \qquad i = 1, \ldots, n-1.$$

That is, the maximum likelihood estimate of the $\theta_i$ can be found by solving a linear system of $n-1$ equations in $n-1$ unknowns. Doing so shows that the maximum likelihood estimate is simply the count for each outcome divided by the total number of samples, i.e.,

$$\theta_i = \frac{N_i}{\sum_{j=1}^{n} N_j} = \frac{N_i}{N}.$$
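As a quick, optional sanity check of the closed-form result $\theta_i = N_i / N$, here is a minimal Python sketch (not part of the assignment; the outcome names and counts in the example are made up) that returns the estimates as irreducible fractions:

```python
from fractions import Fraction

def mle_estimates(counts):
    """Maximum likelihood estimates for a discrete distribution:
    each outcome's probability is its count divided by the total
    number of samples (theta_i = N_i / N, as derived above)."""
    total = sum(counts.values())
    return {outcome: Fraction(c, total) for outcome, c in counts.items()}

# Made-up counts, purely to illustrate the formula.
print(mle_estimates({"x": 2, "y": 3, "z": 5}))
# {'x': Fraction(1, 5), 'y': Fraction(3, 10), 'z': Fraction(1, 2)}
```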
Notice: Please write the answers to each part on a single row (so there will be 3 rows for this question), and please use irreducible fractions in your answers. Sample answer:
1,1/2,1/3,1/4
2/5 (instead of 4/10),1/3,4/7
3/8,3/7,3/5
Part 1. Now, consider a sampling process with 3 possible outcomes: R, G, and B. We observe the following sample counts:

outcome   R   G   B
count     3   1   7
1) What is the total sample count $N$?
2) What are the maximum likelihood estimates for the probabilities of each outcome?

$P_{ML}(R) =$
$P_{ML}(G) =$
$P_{ML}(B) =$
Part 2. Now, use Laplace smoothing with strength $k$ to estimate the probabilities of each outcome (a short illustrative sketch follows the blanks below).

$P_{LAP,k}(R) =$
$P_{LAP,k}(G) =$
$P_{LAP,k}(B) =$
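Below is a minimal sketch of the same computation with add-$k$ (Laplace) smoothing, assuming the standard estimator $P_{LAP,k}(x) = \frac{N_x + k}{N + k|X|}$, where $|X|$ is the number of possible outcomes; the counts and the strength used in the example are made up.

```python
from fractions import Fraction

def laplace_estimates(counts, k):
    """Add-k (Laplace) smoothing: add a pseudo-count of k to every
    outcome before normalizing, i.e. (N_x + k) / (N + k * |X|)."""
    total = sum(counts.values())
    return {outcome: Fraction(c + k, total + k * len(counts))
            for outcome, c in counts.items()}

# Made-up counts and strength, purely to illustrate the formula.
print(laplace_estimates({"x": 2, "y": 3, "z": 5}, k=1))
# {'x': Fraction(3, 13), 'y': Fraction(4, 13), 'z': Fraction(6, 13)}
```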
Part 3. Now, consider Laplace smoothing in the limit $k \to \infty$. Fill in the corresponding probability estimates (see the worked limit after the blanks).

$P_{LAP,\infty}(R) =$
$P_{LAP,\infty}(G) =$
$P_{LAP,\infty}(B) =$
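For the limit, a short worked calculation, again assuming the add-$k$ estimator given above:

```latex
\lim_{k \to \infty} P_{LAP,k}(x)
  = \lim_{k \to \infty} \frac{N_x + k}{N + k\,|X|}
  = \lim_{k \to \infty} \frac{N_x/k + 1}{N/k + |X|}
  = \frac{1}{|X|}
```

That is, as $k$ grows the pseudo-counts dominate the observed counts, and the smoothed estimate approaches the uniform distribution over the $|X|$ outcomes.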