STAT 2006 Assignment 2


1. For two random variables $(X, Y)$, the joint MGF can be defined as
   $$M_{XY}(s, t) = E[e^{sX + tY}].$$
   Find $M_{XY}(s, t)$ when $X$ and $Y$ are two jointly normal random variables with $E(X) = \mu_X$, $E(Y) = \mu_Y$, $\mathrm{Var}(X) = \sigma_X^2$, $\mathrm{Var}(Y) = \sigma_Y^2$, and $\rho(X, Y) = \rho$.
2. Let $Y_1, Y_2, \ldots, Y_n \overset{\text{i.i.d.}}{\sim} \exp(\theta)$ be random samples. If the $Y_i$'s are sorted in ascending order, the ordered random variables $X_1, X_2, \ldots, X_n$ with joint pdf
   $$f_{X_1, X_2, \ldots, X_n}(x_1, x_2, \ldots, x_n) = \frac{n!}{\theta^n} \exp\left\{-\frac{1}{\theta} \sum_{i=1}^{n} x_i\right\},$$
   where $0 \le x_1 \le x_2 \le \cdots \le x_n < \infty$, are obtained.
   (a) Let $U_1 = X_1$ and $U_i = X_i - X_{i-1}$, $i = 2, 3, \ldots, n$. Find the joint pdf of $(U_1, U_2, \ldots, U_n)$.
   (b) Are $U_1, U_2, \ldots, U_n$ mutually independent? What is the marginal distribution of each $U_i$, $i = 1, 2, \ldots, n$?
   (c) Using the results of parts (a) and (b), or otherwise, find $E[X_1]$ and $E[X_n]$, the expectations of the sample minimum and sample maximum.
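If the derivation in part (c) yields the standard exponential order-statistic results $E[X_1] = \theta/n$ and $E[X_n] = \theta \sum_{i=1}^{n} 1/i$, they can be sanity-checked by simulation. A minimal sketch (the parameter values $\theta = 2$, $n = 5$ are arbitrary choices, not from the problem):

```python
import random

random.seed(0)
theta, n, reps = 2.0, 5, 20000

mins, maxs = [], []
for _ in range(reps):
    # expovariate takes a rate, so 1/theta gives mean-theta exponentials
    sample = [random.expovariate(1 / theta) for _ in range(n)]
    mins.append(min(sample))
    maxs.append(max(sample))

emp_min = sum(mins) / reps
emp_max = sum(maxs) / reps

# Compare against the candidate closed forms:
print(emp_min, theta / n)                                    # both near 0.4
print(emp_max, theta * sum(1 / i for i in range(1, n + 1)))  # both near 4.57
```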
3. Let $Z_1$ and $Z_2$ be independent $N(0, 1)$ random variables, and define new random variables $X$ and $Y$ by
   $$X = a_X Z_1 + b_X Z_2 + c_X \quad \text{and} \quad Y = a_Y Z_1 + b_Y Z_2 + c_Y,$$
   where $a_X, b_X, c_X, a_Y, b_Y$ and $c_Y$ are constants.
   (a) Show that $E[X] = c_X$, $\mathrm{Var}(X) = a_X^2 + b_X^2$, $E[Y] = c_Y$, $\mathrm{Var}(Y) = a_Y^2 + b_Y^2$, and $\mathrm{Cov}(X, Y) = a_X a_Y + b_X b_Y$.
   (b) If we define the constants $a_X, b_X, c_X, a_Y, b_Y$ and $c_Y$ by
   $$a_X = \sqrt{\frac{1 + \rho}{2}}\,\sigma_X, \quad b_X = \sqrt{\frac{1 - \rho}{2}}\,\sigma_X, \quad c_X = \mu_X,$$
   $$a_Y = \sqrt{\frac{1 + \rho}{2}}\,\sigma_Y, \quad b_Y = -\sqrt{\frac{1 - \rho}{2}}\,\sigma_Y, \quad c_Y = \mu_Y,$$
   where $\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2$ and $\rho$ are constants, $-1 \le \rho \le 1$, then show that
   $$E[X] = \mu_X, \quad \mathrm{Var}(X) = \sigma_X^2, \quad E[Y] = \mu_Y, \quad \mathrm{Var}(Y) = \sigma_Y^2, \quad \mathrm{Corr}(X, Y) = \rho.$$
   (c) Show that $(X, Y)$ has the bivariate normal pdf with parameters $\mu_X, \mu_Y, \sigma_X^2, \sigma_Y^2$ and $\rho$.
4. Let $X_1, X_2, \ldots, X_n$ and $Y_1, Y_2, \ldots, Y_m$ be independent exponentially distributed random samples with mean $\theta$. Let $T_\alpha = \alpha \bar{X} + (1 - \alpha) \bar{Y}$, where $0 < \alpha < 1$.
   (a) Find $E[T_\alpha]$ and $\mathrm{Var}(T_\alpha)$.
   (b) Show that, for any $\epsilon > 0$, $P(|T_\alpha - \theta| > \epsilon) \to 0$ as $m, n \to \infty$ [Hint: use Chebyshev's inequality].
5. Let $X_1, X_2, \ldots, X_n \overset{\text{i.i.d.}}{\sim} U[0, 1]$.
   (a) Find the mean and variance of $\ln(X_1)$.
   (b) Let $0 \le a < b$. Find
   $$\lim_{n \to \infty} P\left(a \le (X_1 X_2 \cdots X_n)^{n^{-1/2}} e^{n^{1/2}} \le b\right)$$
   in terms of $a$ and $b$.
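If the CLT argument in part (b) produces a limit of the form $\Phi(\ln b) - \Phi(\ln a)$, this can be checked numerically: taking logs, $\ln W_n = (\sum_{i=1}^{n} \ln X_i + n)/\sqrt{n}$, which a simulation can compare against the candidate limit. A sketch (the values $n = 400$, $a = 0.5$, $b = 2$ are illustrative choices):

```python
import math
import random

random.seed(1)
n, reps = 400, 2000
a, b = 0.5, 2.0

hits = 0
for _ in range(reps):
    # 1 - random.random() lies in (0, 1], so the log is always defined
    log_w = (sum(math.log(1.0 - random.random()) for _ in range(n)) + n) / math.sqrt(n)
    if math.log(a) <= log_w <= math.log(b):
        hits += 1

def Phi(z):  # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

print(hits / reps)                           # empirical probability
print(Phi(math.log(b)) - Phi(math.log(a)))   # candidate limit, about 0.51
```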
6. Let $f(x; \theta) = \theta x^{\theta - 1}$, where $0 \le x \le 1$ and $0 < \theta < \infty$.
   (a) Show that the maximum likelihood estimator (MLE) of $\theta$ is $\hat{\theta} = -\dfrac{n}{\sum_{i=1}^{n} \ln(X_i)}$.
   (b) Given the observed random sample 0.55, 0.88, 0.43, 0.78, 0.66, what is the MLE of $\theta$?
   (c) Show that $Y_1 := -\ln(X_1) \sim \exp\left(\frac{1}{\theta}\right)$.
   (d) Hence, show that $S := \sum_{i=1}^{n} Y_i = -\sum_{i=1}^{n} \ln(X_i) \sim \Gamma\left(n, \frac{1}{\theta}\right)$.
   (e) Is $\hat{\theta}$ an unbiased estimator?
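Part (b) is a direct plug-in of the estimator stated in part (a); a minimal Python check:

```python
import math

data = [0.55, 0.88, 0.43, 0.78, 0.66]
n = len(data)

# MLE formula from part (a): theta_hat = -n / sum(ln x_i)
theta_hat = -n / sum(math.log(x) for x in data)
print(theta_hat)  # approximately 2.24
```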
7. Suppose $X_1, \ldots, X_n$ are i.i.d. with pdf $f(x; \theta) = 2x/\theta^2$, $0 < x < \theta$, zero elsewhere. Note this is a non-regular case. Find:
   (a) The MLE $\hat{\theta}$ for $\theta$.
   (b) The constant $c$ so that $E(c\hat{\theta}) = \theta$.
   (c) The MLE for the median of the distribution.
8. A random sample $X_1, X_2, \ldots, X_n$ of size $n$ is taken from a Poisson distribution with mean $\lambda$, $0 < \lambda < \infty$.
   (a) Show that the maximum likelihood estimator for $\lambda$ is $\hat{\lambda} = \bar{X}$.
   (b) Let $X$ equal the number of flaws per 100 feet of a used computer tape. Assume that $X$ has a Poisson distribution with mean $\lambda$. If 50 observations of $X$ yielded 3 zeros, 5 ones, 5 twos, 8 threes, 12 fours, 9 fives, and 8 sixes, find the maximum likelihood estimate of $\lambda$.
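Since part (a) gives $\hat{\lambda} = \bar{X}$, part (b) reduces to computing the sample mean of the tabulated counts:

```python
# Flaw counts: value -> frequency, 50 observations in total
counts = {0: 3, 1: 5, 2: 5, 3: 8, 4: 12, 5: 9, 6: 8}
n = sum(counts.values())                              # 50
lam_hat = sum(k * f for k, f in counts.items()) / n   # sample mean = MLE
print(lam_hat)  # 3.6
```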
9. Let $X_1, X_2, \ldots, X_n$ be random samples from distributions with the given probability density functions $f(x; \theta)$. In each case, find the maximum likelihood estimator $\hat{\theta}$.
   (a) $f(x; \theta) = \dfrac{\theta^4}{6} x^3 e^{-\theta x}$, where $0 < x < \infty$ and $0 < \theta < \infty$.
   (b) When $\theta = 1$, $f(x; \theta) = 1$, where $0 < x < 1$. When $\theta = 2$, $f(x; \theta) = \dfrac{1}{2\sqrt{x}}$, where $0 < x < 1$.
   (c) $f(x; \theta) = \theta$, where $0 \le x \le \dfrac{1}{\theta}$ and $\theta > 0$.
10. Let $X_1, X_2, \ldots, X_n$ be i.i.d. with pdf
    $$f(x \mid \theta) = \frac{1}{\theta}, \quad 0 \le x \le \theta, \quad \theta > 0.$$
    Estimate $\theta$ using both the method of moments and maximum likelihood. Calculate the means and variances of the two estimators.
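Assuming the two estimators work out to $2\bar{X}$ (method of moments, from $E[X] = \theta/2$) and $X_{(n)}$ (MLE, from the likelihood's support constraint), their means can be compared by simulation. A sketch with arbitrary $\theta = 1$, $n = 10$:

```python
import random

random.seed(2)
theta, n, reps = 1.0, 10, 20000

mom_vals, mle_vals = [], []
for _ in range(reps):
    sample = [random.uniform(0, theta) for _ in range(n)]
    mom_vals.append(2 * sum(sample) / n)  # method of moments: twice the sample mean
    mle_vals.append(max(sample))          # MLE: sample maximum

mom_mean = sum(mom_vals) / reps
mle_mean = sum(mle_vals) / reps
print(mom_mean)  # near theta = 1 (unbiased)
print(mle_mean)  # near n/(n+1)*theta ~ 0.91 (biased low)
```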
11. Let the pdf of $X$ be defined by
    $$f(x; \theta) = \begin{cases} \dfrac{4}{\theta^2}\,x & \text{for } 0 < x \le \dfrac{\theta}{2}, \\[4pt] -\dfrac{4}{\theta^2}\,x + \dfrac{4}{\theta} & \text{for } \dfrac{\theta}{2} < x \le \theta, \\[4pt] 0 & \text{otherwise}, \end{cases}$$
    where $0 < \theta \le 2$.
(a) Find the method-of-moments estimator of θ.
(b) For the following observations of X, give a point estimate of θ:
0.3209 0.2412 0.2557 0.3544 0.4168
0.5621 0.0230 0.5442 0.4552 0.5592
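If part (a) yields $\hat{\theta} = 2\bar{X}$ (which follows if $E[X] = \theta/2$, as the symmetry of this triangular density suggests), part (b) is a one-line computation:

```python
data = [0.3209, 0.2412, 0.2557, 0.3544, 0.4168,
        0.5621, 0.0230, 0.5442, 0.4552, 0.5592]

# Symmetric triangular density on (0, theta) => E[X] = theta/2,
# so the method-of-moments estimator is theta_hat = 2 * xbar.
xbar = sum(data) / len(data)
theta_hat = 2 * xbar
print(theta_hat)  # approximately 0.7465
```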
12. Let $X_1, X_2, \ldots, X_n$ be a random sample of size $n$ from an exponential distribution with unknown mean $\theta$.
    (a) Show that the distribution of the random variable $W = (2/\theta) \sum_{i=1}^{n} X_i$ is $\chi^2(2n)$.
    (b) Use $W$ to construct a $100(1 - \alpha)\%$ confidence interval for $\theta$.
    (c) If $n = 8$ and $\bar{x} = 65.2$, give the endpoints of a 90% confidence interval for the mean $\theta$.
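Once the pivot from parts (a) and (b) is in hand, part (c) is arithmetic. A sketch using the $\chi^2(16)$ critical values from a standard table (the quantiles 7.962 and 26.296 are tabulated approximations, not computed here):

```python
# Pivot: W = (2/theta) * sum(X_i) ~ chi2(2n), so
#   2*n*xbar / chi2_upper <= theta <= 2*n*xbar / chi2_lower
n, xbar = 8, 65.2
total = n * xbar           # sum of the observations, 521.6

# Tabulated chi-square quantiles for 2n = 16 degrees of freedom:
chi2_lo = 7.962            # 5th percentile
chi2_hi = 26.296           # 95th percentile

lower = 2 * total / chi2_hi
upper = 2 * total / chi2_lo
print(lower, upper)        # roughly (39.7, 131.0)
```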
13. The independent random variables $X_1, \ldots, X_n$ have the common distribution
    $$P(X_i \le x \mid \alpha, \beta) = \begin{cases} 0 & \text{if } x < 0, \\ (x/\beta)^\alpha & \text{if } 0 \le x \le \beta, \\ 1 & \text{if } x > \beta, \end{cases}$$
    where the parameters $\alpha$ and $\beta$ are positive.
    (a) Assuming $\alpha$ and $\beta$ are both unknown, find the MLEs of $\alpha$ and $\beta$.
    (b) The lengths of cuckoo eggs found in hedge sparrow nests can be modeled with this distribution. For the data
    22.0, 23.9, 20.9, 23.8, 26.0, 25.0, 21.7, 23.8, 22.8, 23.1, 23.1, 23.5, 23.0, 23.0
    find the MLEs of $\alpha$ and $\beta$.
    (c) If $\alpha$ is a known constant, $\alpha_0$, find an upper confidence limit for $\beta$ with confidence coefficient 0.95.
    (d) Use the data in (b) to construct an interval estimate for $\beta$. Assume that $\alpha$ is known and equal to its MLE.
14. (a) Let $Y$ be an exponential random variable with mean $\lambda$ and let $X := \theta_1 + \theta_2 Y$, $\theta_2 > 0$. Find the pdf of $X$, and remember to state the support of $X$. $X$ is said to follow a shifted exponential distribution with location parameter $\theta_1$ and scale parameter $\theta_2$.
    (b) Let $X_1, X_2, \ldots, X_n$ be a random sample in which the $X_i$ are identically distributed as $X$. Find the method-of-moments estimators for $\theta_1$ and $\theta_2$.
    (c) When $\theta_2$ is fixed, show that the likelihood function is strictly increasing in $\theta_1$ when $\theta_1 \le x_{(1)}$ and is equal to zero when $\theta_1 > x_{(1)}$, where $x_{(1)} := \min\{x_1, x_2, \ldots, x_n\}$ is the sample minimum. Hence find the maximum likelihood estimators of $\theta_1$ and $\theta_2$.
15. A manufacturer sells a light bulb that has a mean life of 1580 hours with a standard deviation of 58 hours. A new manufacturing process is being tested, and there is interest in knowing the mean life $\mu$ of the new bulbs. How large a sample is required so that $[\bar{x} - 10, \bar{x} + 10]$ is an approximate 90% confidence interval for $\mu$? You may assume that the change in the standard deviation is minimal.
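The usual approach is to set the half-width $z_{0.05}\,\sigma/\sqrt{n}$ equal to 10 and solve for $n$, rounding up. A sketch using the standard approximation $z_{0.05} \approx 1.645$:

```python
import math

sigma, half_width = 58, 10
z = 1.645  # approximate z_{0.05} for a 90% interval

# Solve z * sigma / sqrt(n) <= half_width for n, then round up
n = math.ceil((z * sigma / half_width) ** 2)
print(n)  # 92
```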