Description
1. (Modified from Problem 7.37 of our text). Let $X_1, \dots, X_n$ be a random sample from a uniform distribution on the interval $(-\theta, 2\theta)$, $\theta > 0$. That is, the $X_i$'s are iid with pdf
$$f_\theta(x) = \frac{1}{3\theta}\, 1\{-\theta < x < 2\theta\}$$
for $\theta > 0$. Find, if one exists, a best unbiased estimator of $\theta$.
2. (Problem 7.55(b) of our text). Let $X_1, \dots, X_n$ be a random sample from the pdf $f_\theta(x) = e^{-(x-\theta)}$ for $x > \theta$, where $-\infty < \theta < \infty$. Find the best unbiased estimator of $\varphi(\theta) = \theta^r$ for some constant $r \ge 1$ (here $r$ might or might not be an integer).
3. (Problem 7.57 of our text). Let $X_1, \dots, X_{n+1}$ be iid Bernoulli($p$), and define the function $h(p)$ by $h(p) = P_p\left(\sum_{i=1}^n X_i > X_{n+1}\right)$, the probability that the first $n$ observations exceed the $(n+1)$st.
(a) Show that
$$\delta(X_1, \dots, X_{n+1}) = \begin{cases} 1, & \text{if } \sum_{i=1}^n X_i > X_{n+1}; \\ 0, & \text{otherwise,} \end{cases}$$
is an unbiased estimator of $h(p)$.
(b) Find the best unbiased estimator of $h(p)$.
4. (Motivated from Problem 7.59 of our text). Let $X_1, \dots, X_n$ be iid $N(\mu, \sigma^2)$, where both $\mu$ and $\sigma$ are unknown, i.e., $\theta = (\mu, \sigma)$. Find the best unbiased estimators of
(a) $\varphi(\theta) = \sigma$; (b) $\varphi(\theta) = \sigma^2$; and (c) $\varphi(\theta) = \sigma^4$.
5. (Modified from 6.31(c) & 7.60). Let $X_1, \dots, X_n$ be iid gamma($\alpha, \beta$) with $\alpha > 1$ known. That is, the pdf of $X_i$ is
$$f_\beta(x) = \frac{1}{\Gamma(\alpha)\beta^\alpha}\, x^{\alpha-1} e^{-x/\beta}, \qquad 0 \le x < \infty,$$
for $\alpha > 1$ and $\beta > 0$. By Theorems 6.2.10 and 6.2.25, the statistic $T = \sum_{i=1}^n X_i$ is complete sufficient for $\beta$, and it is also well known that $T = \sum_{i=1}^n X_i$ has a Gamma($n\alpha, \beta$) distribution. To help you find the best unbiased estimator of $\varphi(\beta) = 1/\beta$ when $\alpha > 1$ is known, we split it into the following steps.
(a) Show that $E_\beta\left(\frac{1}{X_1}\right) = \frac{1}{(\alpha-1)\beta}$, and conclude that $\delta = (\alpha - 1)/X_1$ is an unbiased estimator of $1/\beta$.
(b) Prove the lemma: if $U/V$ and $U$ are independent random variables, then
$$E\left(\frac{U}{V}\right) = \frac{E(1/V)}{E(1/U)}.$$
(c) Use Basu's Theorem and the lemma in (b) to show that under the setting of this question,
$$E\left(\frac{1}{X_1}\,\Big|\,T\right) = E\left(\frac{T}{X_1}\cdot\frac{1}{T}\,\Big|\,T\right) = \frac{E(1/X_1)}{E(1/T)}\cdot\frac{1}{T}.$$
(d) Find the best unbiased estimator of $\varphi(\beta) = 1/\beta$.
6. (Motivated from Problem 10.9 of our text). Suppose that $X_1, \dots, X_n$ are iid Poisson($\theta$). Find the best unbiased estimator of
(a) $\varphi_1(\theta) = e^{-\theta}$, the probability that $X = 0$.
(b) $\varphi_2(\theta) = \theta e^{-\theta}$, the probability that $X = 1$.
(c) A preliminary test of a possible carcinogenic compound can be performed by measuring the mutation rate of microorganisms exposed to the compound. An experimenter places the compound
in 15 petri dishes and records the following number of mutant colonies:
10, 7, 8, 13, 8, 9, 5, 7, 6, 8, 3, 6, 6, 3, 5.
Calculate the best unbiased estimators of $e^{-\theta}$, the probability that no mutant colonies emerge, and $\theta e^{-\theta}$, the probability that one mutant colony will emerge.
Hints: If you have already thought about each problem for at least 30 minutes, then please feel free to
look at the hints. Otherwise, please try the problem first, as getting help from the hints takes away most of
the fun.
Problem 1: you have found the complete sufficient statistic for θ in HW#7, right? Can you use it to
find the best unbiased estimator?
Problem 2: Use the complete sufficient statistic $T$ to construct an unbiased estimator. Using integration by parts,
$$E_\theta(T^r) = \int_\theta^\infty n e^{n(\theta-t)}\, t^r\, dt = \Big[-t^r e^{n(\theta-t)}\Big]_\theta^\infty + r\int_\theta^\infty e^{n(\theta-t)}\, t^{r-1}\, dt = \theta^r + \frac{r}{n}\, E_\theta(T^{r-1}).$$
From this relation, can you see how to achieve $E_\theta(g(T)) = \theta^r$ and find an unbiased estimator $g(T)$ of $\theta^r$?
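If you want to sanity-check the recursion above numerically before using it, note that the pdf $n e^{n(\theta - t)}$, $t > \theta$, says $T$ is distributed as $\theta$ plus an Exponential with rate $n$, which is easy to simulate. A minimal Monte Carlo sketch (the values of $\theta$, $n$, and $r$ are arbitrary illustration choices):

```python
import random

# Monte Carlo sanity check of the integration-by-parts recursion:
#   E_theta(T^r) = theta^r + (r/n) * E_theta(T^{r-1}),
# where T has pdf n * e^{n(theta - t)} for t > theta, i.e. T = theta + Exp(n).
# theta, n, r below are arbitrary illustration values.

random.seed(1)
theta, n, r, reps = 1.5, 4, 2.5, 400_000

ts = [theta + random.expovariate(n) for _ in range(reps)]

lhs = sum(t ** r for t in ts) / reps                              # E(T^r)
rhs = theta ** r + (r / n) * sum(t ** (r - 1) for t in ts) / reps
print(round(lhs, 4), round(rhs, 4))  # the two should nearly agree
```

This only checks the displayed identity numerically; the homework still asks for the exact argument.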
Problem 3: Recall that $X_{n+1}$ can only take two possible values, 0 or 1, and you may need to consider different cases of $\sum_{i=1}^{n+1} X_i = b$, depending on whether $b = 0, 1, 2$, or $b \ge 3$.
Problem 4: We have shown in class that $(\bar{X}, S^2)$ is the complete sufficient statistic. Also, $U = (n-1)S^2/\sigma^2$ has a $\chi^2_{n-1}$ distribution; compute $E(U^{p/2}) = C_{p,n}$, which does not depend on $\theta = (\mu, \sigma)$. Hence, $(n-1)^{p/2} S^p / C_{p,n}$ is an unbiased estimator of $\sigma^p$. What happens for $p = 1, 2, 4$?
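To see that $C_{p,n}$ is a concrete, computable constant, you can compare the standard chi-square moment formula $E(U^q) = 2^q\,\Gamma(k/2 + q)/\Gamma(k/2)$ for $U \sim \chi^2_k$, with $k = n-1$ and $q = p/2$, against a Monte Carlo average (the values $n = 10$, $p = 1$ below are arbitrary):

```python
import math
import random

# Sanity check that C_{p,n} = E(U^{p/2}) for U ~ chi^2_{n-1} is a concrete
# constant: compare the standard chi-square moment formula
#   E(U^q) = 2^q * Gamma(k/2 + q) / Gamma(k/2),  k = n - 1, q = p/2,
# with a Monte Carlo average. n = 10 and p = 1 are arbitrary choices.

random.seed(2)
n, p, reps = 10, 1, 300_000
k = n - 1

c_exact = 2 ** (p / 2) * math.gamma(k / 2 + p / 2) / math.gamma(k / 2)

# A chi^2_k variable is Gamma(shape = k/2, scale = 2).
c_mc = sum(random.gammavariate(k / 2, 2) ** (p / 2) for _ in range(reps)) / reps
print(round(c_exact, 4), round(c_mc, 4))  # the two should nearly agree
```

The exact value of $C_{p,n}$ in terms of Gamma functions is what you need for the estimator; the simulation is only a numerical cross-check.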
Problem 5: in part (a), note that
$$E_\beta\left(\frac{1}{X_1}\right) = \frac{1}{\Gamma(\alpha)\beta^\alpha}\int_0^\infty \frac{1}{x}\, x^{\alpha-1} e^{-x/\beta}\, dx = \frac{1}{\Gamma(\alpha)\beta^\alpha}\,\Gamma(\alpha-1)\,\beta^{\alpha-1}.$$
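A quick Monte Carlo check of this identity, i.e. that $E_\beta(1/X_1) = 1/((\alpha-1)\beta)$ so $\delta = (\alpha-1)/X_1$ is unbiased for $1/\beta$, can be run before you commit to the algebra (the parameter values below are arbitrary illustration choices):

```python
import random

# Monte Carlo check of the part (a) identity: for X1 ~ gamma(alpha, beta)
# with alpha > 1,  E_beta(1/X1) = 1/((alpha - 1) * beta),  so that
# delta = (alpha - 1)/X1 is unbiased for 1/beta.
# alpha and beta below are arbitrary illustration values.

random.seed(3)
alpha, beta, reps = 3.0, 2.0, 400_000

# random.gammavariate(shape, scale) matches the (alpha, beta) parametrization here
mc = sum(1.0 / random.gammavariate(alpha, beta) for _ in range(reps)) / reps
exact = 1.0 / ((alpha - 1) * beta)
print(round(mc, 4), round(exact, 4))  # the two should nearly agree
```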
In part (b), note that U/V and 1/U are also independent.
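A numerical instance of the lemma can also build intuition: construct a pair with $U/V$ independent of $U$ by design, by drawing $R = U/V$ and $U$ independently and setting $V = U/R$, then compare $E(U/V)$ with $E(1/V)/E(1/U)$. The Gamma choices below are arbitrary, and this is an illustration only, not the requested proof:

```python
import random

# Numerical instance of the lemma in part (b): build U/V independent of U
# by construction (draw R = U/V and U independently, set V = U/R), then
# check E(U/V) ~ E(1/V) / E(1/U). Gamma parameters are arbitrary choices.

random.seed(5)
reps = 300_000
sum_r = sum_inv_v = sum_inv_u = 0.0
for _ in range(reps):
    r = random.gammavariate(2.0, 1.0)   # R = U/V, independent of U by design
    u = random.gammavariate(3.0, 1.0)
    v = u / r
    sum_r += r                          # accumulates U/V
    sum_inv_v += 1.0 / v
    sum_inv_u += 1.0 / u

lhs = sum_r / reps                                # E(U/V)
rhs = (sum_inv_v / reps) / (sum_inv_u / reps)     # E(1/V) / E(1/U)
print(round(lhs, 3), round(rhs, 3))  # the two should nearly agree
```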
Part (c) illustrates the use of Basu's Theorem. The key is to prove that $T/X_1$ and $T$ are independent.
Please feel free to use the well-known fact about conditional expectation that if $U$ and $W$ are independent, then $E(U g(W) \mid W) = E(U)\, g(W)$ for any real-valued function $g(\cdot)$. In particular, $E\left(\frac{U}{W}\,\Big|\,W\right) = E(U)\cdot\frac{1}{W}$.
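Before proving the independence claim, you can look at it numerically. The sketch below checks only a necessary condition, near-zero correlation between $X_1/T$ and $T$, by Monte Carlo; independence itself comes from Basu's Theorem, not from simulation, and the parameter values are arbitrary:

```python
import random

# Numerical look at the key claim in part (c): by Basu's Theorem, the
# ancillary statistic X1/T is independent of the complete sufficient T.
# We check only a necessary condition (near-zero correlation) here.
# alpha, beta, and n are arbitrary illustration values.

random.seed(4)
alpha, beta, n, reps = 2.0, 1.5, 5, 200_000
ratios, totals = [], []
for _ in range(reps):
    xs = [random.gammavariate(alpha, beta) for _ in range(n)]
    t = sum(xs)
    ratios.append(xs[0] / t)
    totals.append(t)

mr = sum(ratios) / reps
mt = sum(totals) / reps
cov = sum((r - mr) * (t - mt) for r, t in zip(ratios, totals)) / reps
var_r = sum((r - mr) ** 2 for r in ratios) / reps
var_t = sum((t - mt) ** 2 for t in totals) / reps
corr = cov / (var_r * var_t) ** 0.5
print(round(corr, 4))  # should be near 0
```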
In part (d), you may need to compute the value $E(1/T)$. Please feel free to use the fact that $T$ has a Gamma($n\alpha, \beta$) distribution, and thus the conclusion in (a) applies to $E_\beta(1/T)$ too, except that $\alpha$ is replaced by $n\alpha$.
Also in part (d), you may or may not use the result in (c), depending on which method you are using: Method 1 (hunting) or Method 2 (Rao-Blackwell). Either approach is fine for part (d), as long as your final answer is correct.
Problem 6: Use Theorem 6.2.25 to find the complete sufficient statistic $T$ for the Poisson distribution. In (a), the indicator variable $\delta = 1(X_1 = 0)$ is unbiased. In (b), the indicator variable $\delta = 1(X_1 = 1)$ is unbiased. Then we can use Method 2 (Rao-Blackwell) to find the best unbiased estimator $\delta^* = E(\delta \mid T)$.
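To see Method 2 in action numerically, without handing over the closed form, you can approximate $E(\delta \mid T = t)$ for part (a) by simulating many Poisson samples and averaging $\delta$ within each observed value of $T$; a sketch with arbitrary $n$ and $\theta$:

```python
import random

# Numerical illustration of Method 2 (Rao-Blackwell) for Problem 6(a):
# approximate E(delta | T = t), with delta = 1(X1 = 0) and T = sum of X's,
# by averaging delta within each simulated value of T.
# n = 5 and theta = 0.5 are arbitrary illustration values.

random.seed(0)

def poisson(theta):
    # Knuth's multiplicative sampler (fine for small theta)
    L, k, p = 2.718281828459045 ** (-theta), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

n, theta, reps = 5, 0.5, 200_000
by_t = {}  # t -> (sum of delta, count)
for _ in range(reps):
    xs = [poisson(theta) for _ in range(n)]
    t, delta = sum(xs), (1 if xs[0] == 0 else 0)
    s, c = by_t.get(t, (0, 0))
    by_t[t] = (s + delta, c + 1)

# Since T is sufficient, E(delta | T = t) depends on t alone, not on theta;
# the empirical averages below estimate the Rao-Blackwellized estimator at each t.
for t in sorted(by_t)[:5]:
    s, c = by_t[t]
    print(t, round(s / c, 3))
```

Comparing these empirical conditional averages with your closed-form answer is a useful final check before plugging in the data from part (c).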