Statistical Inference Test Set 1
1. Let $X \sim P(\lambda)$. Find unbiased estimators of (i) $\lambda^3$, (ii) $e^{-\lambda}\cos\lambda$, (iii) $\sin\lambda$. (iv) Show that there does not exist an unbiased estimator of $1/\lambda$. (v) Show that there does not exist an unbiased estimator of $\exp\{-1/\lambda\}$.
2. Let $X_1, X_2, \ldots, X_n$ be a random sample from a $N(\mu, \sigma^2)$ population. Find unbiased and consistent estimators of the signal-to-noise ratio $\mu/\sigma$ and of the quantile $\mu + b\sigma$, where $b$ is any given real number.
3. Let $X_1, X_2, \ldots, X_n$ be a random sample from a $U(-\theta, 2\theta)$ population. Find an unbiased and consistent estimator of $\theta$.
4. Let $X_1, X_2$ be a random sample from an exponential population with mean $1/\lambda$. Let $T_1 = \dfrac{X_1 + X_2}{2}$ and $T_2 = \sqrt{X_1 X_2}$. Show that $T_1$ is unbiased and $T_2$ is biased. Further, prove that $\mathrm{MSE}(T_2) \le \mathrm{Var}(T_1)$.
5. Let $T_1$ and $T_2$ be unbiased estimators of $\theta$ with respective variances $\sigma_1^2$ and $\sigma_2^2$, and $\mathrm{cov}(T_1, T_2) = \sigma_{12}$ (assumed to be known). Consider $T = \alpha T_1 + (1-\alpha) T_2$, $0 \le \alpha \le 1$. Show that $T$ is unbiased and find the value of $\alpha$ for which $\mathrm{Var}(T)$ is minimized.
6. Let $X_1, X_2, \ldots, X_n$ be a random sample from an $\mathrm{Exp}(\mu, \sigma)$ population. Find the method of moments estimators (MMEs) of $\mu$ and $\sigma$.
7. Let $X_1, X_2, \ldots, X_n$ be a random sample from a Pareto population with density
\[ f_X(x) = \frac{\beta\alpha^{\beta}}{x^{\beta+1}}, \quad x > \alpha, \ \alpha > 0, \ \beta > 2. \]
Find the method of moments estimators of $\alpha$ and $\beta$.
8. Let $X_1, X_2, \ldots, X_n$ be a random sample from a $U(-\theta, \theta)$ population. Find the MME of $\theta$.
9. Let $X_1, X_2, \ldots, X_n$ be a random sample from a lognormal population with density
\[ f_X(x) = \frac{1}{x\sigma\sqrt{2\pi}} \exp\left\{ -\frac{1}{2\sigma^2}\left(\log_e x - \mu\right)^2 \right\}, \quad x > 0. \]
Find the MMEs of $\mu$ and $\sigma^2$.
10. Let $X_1, X_2, \ldots, X_n$ be a random sample from a double exponential $(\mu, \sigma)$ population. Find the MMEs of $\mu$ and $\sigma$.
Hints and Solutions
1. (i) $E\{X(X-1)(X-2)\} = \lambda^3$, so $T(X) = X(X-1)(X-2)$ is unbiased for $\lambda^3$.
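A quick numerical sanity check of this identity (a sketch only; the value of $\lambda$, the seed and the sample size are arbitrary illustrative choices):

import numpy as np

# Monte Carlo check that E[X(X-1)(X-2)] = lambda^3 for X ~ P(lambda).
rng = np.random.default_rng(0)
lam = 2.5
x = rng.poisson(lam, size=1_000_000)
print(np.mean(x * (x - 1) * (x - 2)), lam**3)  # the two numbers should be close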
(ii) For this we solve the estimating equation. Let $T(X)$ be unbiased for $e^{-\lambda}\cos\lambda$. Then $E\,T(X) = e^{-\lambda}\cos\lambda$ for all $\lambda > 0$, that is,
\[ \sum_{x=0}^{\infty} T(x)\, e^{-\lambda}\, \frac{\lambda^x}{x!} = e^{-\lambda}\cos\lambda \quad \text{for all } \lambda > 0 \]
\[ \Rightarrow \sum_{x=0}^{\infty} T(x)\, \frac{\lambda^x}{x!} = \cos\lambda = 1 - \frac{\lambda^2}{2!} + \frac{\lambda^4}{4!} - \cdots \quad \text{for all } \lambda > 0. \]
As the two power series are identical on an open interval, equating coefficients of powers of $\lambda$ on both sides gives
\[ T(x) = \begin{cases} 0, & \text{if } x = 2m+1, \\ 1, & \text{if } x = 4m, \\ -1, & \text{if } x = 4m+2, \end{cases} \qquad m = 0, 1, 2, \ldots \]
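As an illustrative check (the value of $\lambda$ and the truncation point of the series are arbitrary), the series with this $T(x)$ can be summed numerically and compared with $e^{-\lambda}\cos\lambda$:

import math

# Check that sum_x T(x) e^{-lam} lam^x / x! equals e^{-lam} cos(lam)
# for the T(x) found above.
def T(x):
    if x % 2 == 1:
        return 0.0
    return 1.0 if x % 4 == 0 else -1.0

lam = 1.7
series = sum(T(x) * math.exp(-lam) * lam**x / math.factorial(x) for x in range(100))
print(series, math.exp(-lam) * math.cos(lam))  # should agree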
(iii) For this too we have to solve an estimating equation; here we use Euler's identity to solve it. Let $U(X)$ be unbiased for $\sin\lambda$. Then
\[ \sum_{x=0}^{\infty} U(x)\, e^{-\lambda}\, \frac{\lambda^x}{x!} = \sin\lambda = \frac{1}{2i}\left(e^{i\lambda} - e^{-i\lambda}\right) \quad \text{for all } \lambda > 0 \]
\[ \Rightarrow \sum_{x=0}^{\infty} U(x)\, \frac{\lambda^x}{x!} = \frac{1}{2i}\left(e^{(1+i)\lambda} - e^{(1-i)\lambda}\right) = \frac{1}{2i}\left( \sum_{k=0}^{\infty} \frac{(1+i)^k \lambda^k}{k!} - \sum_{k=0}^{\infty} \frac{(1-i)^k \lambda^k}{k!} \right) \quad \text{for all } \lambda > 0. \]
Applying De Moivre's theorem to the two terms inside the parentheses, with $1 \pm i = \sqrt{2}\left(\cos\frac{\pi}{4} \pm i\sin\frac{\pi}{4}\right)$, we get
\[ \sum_{x=0}^{\infty} U(x)\, \frac{\lambda^x}{x!} = \frac{1}{2i}\sum_{k=0}^{\infty} \frac{\lambda^k}{k!}\,(\sqrt{2})^k \left[ \left(\cos\frac{k\pi}{4} + i\sin\frac{k\pi}{4}\right) - \left(\cos\frac{k\pi}{4} - i\sin\frac{k\pi}{4}\right) \right] = \sum_{k=0}^{\infty} \frac{\lambda^k}{k!}\,(\sqrt{2})^k \sin\frac{k\pi}{4} \quad \text{for all } \lambda > 0. \]
Equating the coefficients of powers of $\lambda$ on both sides gives
\[ U(x) = (\sqrt{2})^x \sin\frac{x\pi}{4}, \qquad x = 0, 1, 2, \ldots \]
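A similar numerical check of $U(x) = (\sqrt{2})^x \sin(x\pi/4)$ (the value of $\lambda$ and the truncation point are arbitrary illustrative choices):

import math

# Check that E[(sqrt(2))^X sin(X*pi/4)] = sin(lam) for X ~ P(lam),
# by summing the Poisson expectation directly.
def U(x):
    return math.sqrt(2)**x * math.sin(x * math.pi / 4)

lam = 2.3
expectation = sum(U(x) * math.exp(-lam) * lam**x / math.factorial(x) for x in range(100))
print(expectation, math.sin(lam))  # should agree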
In Parts (iv) and (v), we can show in a similar way that the corresponding estimating equations do not have any solutions.
2. Let $\bar{X} = \dfrac{1}{n}\sum_{i=1}^{n} X_i$ and $S^2 = \dfrac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X})^2$. Then
\[ \bar{X} \sim N\!\left(\mu, \frac{\sigma^2}{n}\right) \quad \text{and} \quad W = \frac{(n-1)S^2}{\sigma^2} \sim \chi^2_{n-1}. \]
It can be seen that
\[ E\!\left(W^{1/2}\right) = \frac{\sqrt{2}\,\Gamma\!\left(\frac{n}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)} \quad \text{and} \quad E\!\left(W^{-1/2}\right) = \frac{\Gamma\!\left(\frac{n-2}{2}\right)}{\sqrt{2}\,\Gamma\!\left(\frac{n-1}{2}\right)}. \]
Using these, we get unbiased estimators of $\sigma$ and $\dfrac{1}{\sigma}$ as
\[ T_1 = \sqrt{\frac{n-1}{2}}\,\frac{\Gamma\!\left(\frac{n-1}{2}\right)}{\Gamma\!\left(\frac{n}{2}\right)}\, S \quad \text{and} \quad T_2 = \sqrt{\frac{2}{n-1}}\,\frac{\Gamma\!\left(\frac{n-1}{2}\right)}{\Gamma\!\left(\frac{n-2}{2}\right)}\,\frac{1}{S}, \]
respectively. As $\bar{X}$ and $S^2$ are independently distributed, $U_1 = \bar{X}\,T_2$ is unbiased for $\dfrac{\mu}{\sigma}$. Further, $U_2 = \bar{X} + b\,T_1$ is unbiased for $\mu + b\sigma$. As $\bar{X}$ and $S^2$ are consistent for $\mu$ and $\sigma^2$ respectively, $U_1$ and $U_2$ are also consistent for $\dfrac{\mu}{\sigma}$ and $\mu + b\sigma$ respectively.
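The following Monte Carlo sketch checks that $T_1$ and $U_1$ average to $\sigma$ and $\mu/\sigma$ respectively; the parameter values, seed and replication count are arbitrary illustrative choices:

import math
import numpy as np

# Monte Carlo check of the unbiased estimators T1 = c1*S for sigma and
# U1 = Xbar * c2 / S for mu/sigma, with the Gamma-function constants above.
rng = np.random.default_rng(1)
mu, sigma, n, reps = 3.0, 2.0, 10, 200_000
c1 = math.sqrt((n - 1) / 2) * math.gamma((n - 1) / 2) / math.gamma(n / 2)
c2 = math.sqrt(2 / (n - 1)) * math.gamma((n - 1) / 2) / math.gamma((n - 2) / 2)
samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)
s = samples.std(axis=1, ddof=1)            # sample standard deviation S
print(np.mean(c1 * s), sigma)              # average of T1, close to sigma
print(np.mean(xbar * c2 / s), mu / sigma)  # average of U1, close to mu/sigma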
3. As $\mu_1' = \dfrac{\theta}{2}$, $T = 2\bar{X}$ is unbiased for $\theta$. $T$ is also consistent for $\theta$.
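A quick simulation check (the values of $\theta$, $n$ and the seed are illustrative):

import numpy as np

# Monte Carlo check that 2*Xbar is unbiased for theta under U(-theta, 2*theta).
rng = np.random.default_rng(2)
theta, n, reps = 1.5, 20, 200_000
samples = rng.uniform(-theta, 2 * theta, size=(reps, n))
print(np.mean(2 * samples.mean(axis=1)), theta)  # should be close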
4. As $E(X_i) = \dfrac{1}{\lambda}$, $T_1$ is unbiased. Also, $X_1$ and $X_2$ are independent, so
\[ E(T_2) = E\!\left(\sqrt{X_1 X_2}\right) = \left\{E\!\left(\sqrt{X_1}\right)\right\}^2 = \frac{\pi}{4\lambda} \ne \frac{1}{\lambda}, \]
so $T_2$ is biased. Further, $\mathrm{Var}(T_1) = \dfrac{1}{2\lambda^2}$, and
\[ \mathrm{MSE}(T_2) = E\!\left(\sqrt{X_1 X_2} - \frac{1}{\lambda}\right)^2 = E(X_1 X_2) - \frac{2}{\lambda}\,E\!\left(\sqrt{X_1 X_2}\right) + \frac{1}{\lambda^2} = \left(2 - \frac{\pi}{2}\right)\frac{1}{\lambda^2} < \frac{1}{2\lambda^2} = \mathrm{Var}(T_1). \]
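A Monte Carlo sketch of these three quantities ($\lambda$, the seed and the replication count are arbitrary illustrative choices):

import numpy as np

# Check of E(T2) = pi/(4*lam), Var(T1) = 1/(2*lam^2) and
# MSE(T2) = (2 - pi/2)/lam^2 for two Exp(mean 1/lam) observations.
rng = np.random.default_rng(3)
lam, reps = 2.0, 1_000_000
x1 = rng.exponential(1 / lam, reps)
x2 = rng.exponential(1 / lam, reps)
t1, t2 = (x1 + x2) / 2, np.sqrt(x1 * x2)
print(np.mean(t2), np.pi / (4 * lam))                          # E(T2), biased for 1/lam
print(np.var(t1), 1 / (2 * lam**2))                            # Var(T1)
print(np.mean((t2 - 1 / lam) ** 2), (2 - np.pi / 2) / lam**2)  # MSE(T2)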
5. $E(T) = \alpha E(T_1) + (1-\alpha) E(T_2) = \theta$, so $T$ is unbiased. Since $\mathrm{Var}(T) = \alpha^2\sigma_1^2 + (1-\alpha)^2\sigma_2^2 + 2\alpha(1-\alpha)\sigma_{12}$, the minimizing choice of $\alpha$ is obtained as
\[ \alpha = \frac{\sigma_2^2 - \sigma_{12}}{\sigma_1^2 + \sigma_2^2 - 2\sigma_{12}}. \]
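A small numerical check of the closed form against a grid search (the variances and covariance below are arbitrary illustrative values):

import numpy as np

# Compare the closed-form minimizer with a grid search over
# Var(alpha*T1 + (1-alpha)*T2).
s1sq, s2sq, s12 = 4.0, 1.0, 0.5

def var_T(alpha):
    return alpha**2 * s1sq + (1 - alpha) ** 2 * s2sq + 2 * alpha * (1 - alpha) * s12

alphas = np.linspace(0, 1, 100_001)
print(alphas[np.argmin(var_T(alphas))])        # grid minimizer, about 0.125 here
print((s2sq - s12) / (s1sq + s2sq - 2 * s12))  # closed-form value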
6. Here $f(x) = \dfrac{1}{\sigma}\exp\left\{-\dfrac{x-\mu}{\sigma}\right\}$, $x > \mu$, $\sigma > 0$, so $\mu_1' = \mu + \sigma$ and $\mu_2' = (\mu+\sigma)^2 + \sigma^2$. Hence $\sigma = \sqrt{\mu_2' - \mu_1'^2}$ and $\mu = \mu_1' - \sqrt{\mu_2' - \mu_1'^2}$. The method of moments estimators for $\mu$ and $\sigma$ are therefore given by
\[ \hat{\mu}_{MM} = \bar{X} - \sqrt{\frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2} \quad \text{and} \quad \hat{\sigma}_{MM} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2}. \]
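A simulation check of these MMEs (the values $\mu = 2$, $\sigma = 3$, the sample size and the seed are arbitrary illustrative choices):

import numpy as np

# Shifted-exponential MMEs computed from simulated data; the estimates
# should be close to the true (mu, sigma) for a large sample.
rng = np.random.default_rng(4)
mu, sigma, n = 2.0, 3.0, 500_000
x = mu + rng.exponential(sigma, n)
sigma_hat = np.sqrt(np.mean((x - x.mean()) ** 2))
mu_hat = x.mean() - sigma_hat
print(mu_hat, sigma_hat)  # compare with (2.0, 3.0)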
7. Here $\mu_1' = \dfrac{\beta\alpha}{\beta-1}$ and $\mu_2' = \dfrac{\beta\alpha^2}{\beta-2}$. Solving,
\[ \alpha = \frac{\mu_1'\sqrt{\mu_2'}}{\sqrt{\mu_2'} + \sqrt{\mu_2' - \mu_1'^2}}, \qquad \beta = 1 + \sqrt{\frac{\mu_2'}{\mu_2' - \mu_1'^2}}. \]
The method of moments estimators for $\alpha$ and $\beta$ are therefore given by
\[ \hat{\alpha}_{MM} = \frac{\bar{X}\sqrt{\sum_{i=1}^{n} X_i^2}}{\sqrt{\sum_{i=1}^{n} X_i^2} + \sqrt{\sum_{i=1}^{n}(X_i - \bar{X})^2}}, \qquad \hat{\beta}_{MM} = 1 + \sqrt{\frac{\sum_{i=1}^{n} X_i^2}{\sum_{i=1}^{n}(X_i - \bar{X})^2}}. \]
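A simulation check of the Pareto MMEs; classical Pareto variates are generated here by inversion of the distribution function, and the parameter values are arbitrary illustrative choices (with $\beta > 2$ so the second moment exists):

import numpy as np

# Pareto(alpha, beta) variates by inversion: X = alpha / U**(1/beta), U ~ U(0,1).
rng = np.random.default_rng(5)
alpha, beta, n = 2.0, 5.0, 1_000_000
x = alpha / rng.uniform(size=n) ** (1 / beta)
s2 = np.sum(x**2)                   # sum of squares
ss = np.sum((x - x.mean()) ** 2)    # centred sum of squares
beta_hat = 1 + np.sqrt(s2 / ss)
alpha_hat = x.mean() * np.sqrt(s2) / (np.sqrt(s2) + np.sqrt(ss))
print(alpha_hat, beta_hat)  # compare with (2.0, 5.0)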
8. Since $\mu_1' = 0$, we consider $\mu_2' = \dfrac{\theta^2}{3}$. So $\hat{\theta}_{MM} = \sqrt{\dfrac{3}{n}\sum_{i=1}^{n} X_i^2}$.
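A quick simulation check (illustrative $\theta$, $n$ and seed):

import numpy as np

# Check that sqrt(3/n * sum X_i^2) recovers theta under U(-theta, theta) sampling.
rng = np.random.default_rng(6)
theta, n = 2.5, 500_000
x = rng.uniform(-theta, theta, n)
print(np.sqrt(3 * np.mean(x**2)), theta)  # should be close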
9. Here $\mu_1' = e^{\mu + \sigma^2/2}$ and $\mu_2' = e^{2\mu + 2\sigma^2}$. So
\[ \mu = \log\frac{\mu_1'^2}{\sqrt{\mu_2'}}, \qquad \sigma^2 = \log\frac{\mu_2'}{\mu_1'^2}, \]
and the method of moments estimators for $\mu$ and $\sigma^2$ are therefore given by
\[ \hat{\mu}_{MM} = \log\frac{\bar{X}^2}{\sqrt{\frac{1}{n}\sum_{i=1}^{n} X_i^2}}, \qquad \hat{\sigma}^2_{MM} = \log\frac{\frac{1}{n}\sum_{i=1}^{n} X_i^2}{\bar{X}^2}. \]
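A simulation check of the lognormal MMEs (the parameter values, sample size and seed are illustrative):

import numpy as np

# Lognormal MMEs computed from simulated data.
rng = np.random.default_rng(7)
mu, sigma, n = 0.5, 0.8, 1_000_000
x = rng.lognormal(mu, sigma, n)
m1, m2 = x.mean(), np.mean(x**2)
print(np.log(m1**2 / np.sqrt(m2)), np.log(m2 / m1**2))  # compare with (0.5, 0.64)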
10. Here $f(x) = \dfrac{1}{2\sigma}\exp\left\{-\dfrac{|x-\mu|}{\sigma}\right\}$, $x \in \mathbb{R}$, $\mu \in \mathbb{R}$, $\sigma > 0$, so $\mu_1' = \mu$ and $\mu_2' = \mu^2 + 2\sigma^2$. Hence $\mu = \mu_1'$ and $\sigma = \sqrt{\dfrac{1}{2}\left(\mu_2' - \mu_1'^2\right)}$. The method of moments estimators for $\mu$ and $\sigma$ are therefore given by
\[ \hat{\mu}_{MM} = \bar{X} \quad \text{and} \quad \hat{\sigma}_{MM} = \sqrt{\frac{1}{2n}\sum_{i=1}^{n}(X_i - \bar{X})^2}. \]
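A simulation check of the double exponential (Laplace) MMEs (the parameter values, sample size and seed are illustrative):

import numpy as np

# Double exponential MMEs computed from simulated data.
rng = np.random.default_rng(8)
mu, sigma, n = 1.0, 2.0, 1_000_000
x = rng.laplace(mu, sigma, n)
sigma_hat = np.sqrt(np.mean((x - x.mean()) ** 2) / 2)
print(x.mean(), sigma_hat)  # compare with (1.0, 2.0)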