Statistical Inference Test Set 3

1. Let X_1, X_2, ..., X_n be a random sample from a population with density function

f(x) = (x/θ) exp(−x²/(2θ)), x > 0, θ > 0.

Find the FRC lower bound for the variance of an unbiased estimator of θ. Hence derive a UMVUE for θ.

2. Let X_1, X_2, ..., X_n be a random sample from a discrete population with mass function

P(X = −1) = (1 − θ)/2,  P(X = 0) = 1/2,  P(X = 1) = θ/2,  0 < θ < 1.

Find the FRC lower bound for the variance of an unbiased estimator of θ. Show that the variance of the unbiased estimator X̄ + 1/2 is greater than or equal to this bound.

3. Let X_1, X_2, ..., X_n be a random sample from a population with density function

f(x) = θ(1 + x)^{−(1+θ)}, x > 0, θ > 0.

Find the FRC lower bound for the variance of an unbiased estimator of 1/θ. Hence derive a UMVUE for 1/θ.

4. Let X_1, X_2, ..., X_n be a random sample from a Pareto population with density

f_X(x) = βα^β / x^{β+1}, x > α, α > 0, β > 2.

Find a sufficient statistic when (i) α is known, (ii) β is known, and (iii) both α and β are unknown.

5. Let X_1, X_2, ..., X_n be a random sample from a Gamma(p, λ) population. Find a sufficient statistic when (i) p is known, (ii) λ is known, and (iii) both p and λ are unknown.

6. Let X_1, X_2, ..., X_n be a random sample from a Beta(λ, µ) population. Find a sufficient statistic when (i) µ is known, (ii) λ is known, and (iii) both λ and µ are unknown.

7. Let X_1, X_2, ..., X_n be a random sample from a continuous population with density function

f(x) = θ / (1 + x)^{1+θ}, x > 0, θ > 0.

Find a minimal sufficient statistic.

8. Let X_1, X_2, ..., X_n be a random sample from a double exponential population with density

f(x) = (1/2) e^{−|x−θ|}, x ∈ ℝ, θ ∈ ℝ.

Find a minimal sufficient statistic.

9. Let X_1, X_2, ..., X_n be a random sample from a discrete uniform population with pmf

p(x) = 1/θ, x = 1, 2, ..., θ,

where θ is a positive integer. Find a minimal sufficient statistic.


10. Let X_1, X_2, ..., X_n be a random sample from a geometric population with pmf

f(x) = p(1 − p)^{x−1}, x = 1, 2, ..., 0 < p < 1.

Find a minimal sufficient statistic.

11. Let X have a N(0, σ²) distribution. Show that X is not complete, but X² is complete.

12. Let X_1, X_2, ..., X_n be a random sample from an exponential population with density

f(x) = λ e^{−λx}, x > 0, λ > 0.

Show that Y = Σ_{i=1}^n X_i is complete.

13. Let X_1, X_2, ..., X_n be a random sample from an exponential population with density

f(x) = e^{−(x−µ)}, x > µ, µ ∈ ℝ.

Show that Y = X_(1) is complete.

14. Let X_1, X_2, ..., X_n be a random sample from a N(θ, θ²) population. Show that (X̄, S²) is minimal sufficient but not complete.


Hints and Solutions

1. log f(x|θ) = log x − log θ − x²/(2θ), so that ∂ log f(x|θ)/∂θ = x²/(2θ²) − 1/θ. Hence

E[(∂ log f(X|θ)/∂θ)²] = E[(X²/(2θ²) − 1/θ)²] = E(X⁴)/(4θ⁴) − E(X²)/θ³ + 1/θ² = 8θ²/(4θ⁴) − 2θ/θ³ + 1/θ² = 1/θ².

So the FRC lower bound for the variance of an unbiased estimator of θ is θ²/n. Further, T = (1/(2n)) Σ_{i=1}^n X_i² is unbiased for θ and Var(T) = θ²/n. Hence T is UMVUE for θ.
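As a quick check of the moments used above, note that P(X > x) = exp(−x²/(2θ)), so U = X²/(2θ) is a standard exponential variable. Hence E(X²) = 2θ E(U) = 2θ and E(X⁴) = 4θ² E(U²) = 8θ², which gives E(T) = (1/(2n))·n·2θ = θ and Var(T) = (1/(4n²))·n·(8θ² − 4θ²) = θ²/n, so T attains the FRC bound.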

2. E(X) = θ − 1/2. So T = X̄ + 1/2 is unbiased for θ, and

Var(T) = (1 + 4θ − 4θ²)/(4n).

Further,

∂ log f(x|θ)/∂θ = −1/(1 − θ) if x = −1; 0 if x = 0; 1/θ if x = 1.

So we get

E[(∂ log f(X|θ)/∂θ)²] = 1/(2θ(1 − θ)).

The FRC lower bound for the variance of an unbiased estimator of θ is 2θ(1 − θ)/n. It can be seen that

Var(T) − 2θ(1 − θ)/n = (1 − 2θ)²/(4n) ≥ 0.
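To verify the displayed quantities: E(X²) = (1 − θ)/2 + θ/2 = 1/2, so Var(X) = 1/2 − (θ − 1/2)² = 1/4 + θ − θ² and Var(T) = Var(X̄) = (1 + 4θ − 4θ²)/(4n); likewise E[(∂ log f(X|θ)/∂θ)²] = [1/(1 − θ)²]·(1 − θ)/2 + [1/θ²]·(θ/2) = 1/(2(1 − θ)) + 1/(2θ) = 1/(2θ(1 − θ)).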

3. We have ∂ log f(x|θ)/∂θ = 1/θ − log(1 + x). Further, E[log(1 + X)] = 1/θ and E[{log(1 + X)}²] = 2/θ². So

E[(∂ log f(X|θ)/∂θ)²] = 1/θ²,

and the FRC lower bound for the variance of an unbiased estimator of 1/θ is 1/(nθ²). Now T = (1/n) Σ_{i=1}^n log(1 + X_i) is unbiased for 1/θ and Var(T) = 1/(nθ²). Hence T is UMVUE for 1/θ.
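These moments follow from the fact that P(X > x) = (1 + x)^{−θ}, so Y = log(1 + X) satisfies P(Y > y) = e^{−θy}; that is, Y is exponential with rate θ, with mean 1/θ, second moment 2/θ² and variance 1/θ². The bound can also be read off from the general form [g′(θ)]²/(n E[(∂ log f/∂θ)²]) with g(θ) = 1/θ, which gives (1/θ⁴)/(n/θ²) = 1/(nθ²).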

4. Using the Factorization Theorem, we get sufficient statistics in each case as below: (i) Π_{i=1}^n X_i, (ii) X_(1), (iii) (X_(1), Π_{i=1}^n X_i).
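A brief sketch of the factorization: the joint density is Π_{i=1}^n f_X(x_i) = β^n α^{nβ} (Π_{i=1}^n x_i)^{−(β+1)} I(x_(1) > α). With α known the indicator involves no unknown parameter and β enters only through Π x_i; with β known the product term involves no unknown parameter and α enters only through x_(1); with both unknown the pair (X_(1), Π X_i) is required.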

 

 

5. Using the Factorization Theorem, we get sufficient statistics in each case as below: (i) Σ_{i=1}^n X_i, (ii) Π_{i=1}^n X_i, (iii) (Σ_{i=1}^n X_i, Π_{i=1}^n X_i).
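Here the joint Gamma(p, λ) density factorizes as Π_{i=1}^n f(x_i) = [λ^{np}/Γ(p)^n] (Π_{i=1}^n x_i)^{p−1} exp(−λ Σ_{i=1}^n x_i), so λ enters only through Σ x_i and p only through Π x_i, which yields the three answers above; the Beta case of Problem 6 is handled in exactly the same way.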

6. Using the Factorization Theorem, we get sufficient statistics in each case as below: (i) Π_{i=1}^n X_i, (ii) Π_{i=1}^n (1 − X_i), (iii) (Π_{i=1}^n X_i, Π_{i=1}^n (1 − X_i)).

7. Using the Lehmann-Scheffé Theorem, a minimal sufficient statistic is Π_{i=1}^n (1 + X_i).
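To see this from the likelihood-ratio criterion: for two samples x and y, f(x|θ)/f(y|θ) = [Π(1 + x_i)/Π(1 + y_i)]^{−(1+θ)}, which is free of θ if and only if Π(1 + x_i) = Π(1 + y_i).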

8. Using the Lehmann-Scheffé Theorem, a minimal sufficient statistic is the vector of order statistics (X_(1), ..., X_(n)).


9. Using the Lehmann-Scheffé Theorem, a minimal sufficient statistic is the largest order statistic X_(n).
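Here the joint pmf is p(x|θ) = θ^{−n} I(x_(n) ≤ θ) for samples of positive integers, so the ratio p(x|θ)/p(y|θ) is free of θ exactly when the indicators I(x_(n) ≤ θ) and I(y_(n) ≤ θ) switch at the same point, i.e., when x_(n) = y_(n).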

10. Using the Lehmann-Scheffé Theorem, a minimal sufficient statistic is Σ_{i=1}^n X_i.
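In this case f(x|p)/f(y|p) = (1 − p)^{Σ x_i − Σ y_i}, which is free of p if and only if Σ x_i = Σ y_i.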

11. Since E(X) = 0 for all σ > 0 but P(X = 0) ≠ 1, X is not complete. Let W = X². The pdf of W is

f_W(w) = [1/(σ√(2πw))] e^{−w/(2σ²)}, w > 0.

Now E_σ[g(W)] = 0 for all σ > 0 implies

∫_0^∞ g(w) w^{−1/2} e^{−w/(2σ²)} dw = 0 for all σ > 0.

Uniqueness of the Laplace transform implies g(w) = 0 a.e. Hence X² is complete.
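To spell out the Laplace-transform step: putting s = 1/(2σ²), which ranges over all of (0, ∞) as σ does, the condition reads ∫_0^∞ [g(w) w^{−1/2}] e^{−sw} dw = 0 for all s > 0, so the Laplace transform of g(w) w^{−1/2} vanishes identically and therefore g(w) w^{−1/2} = 0, i.e., g(w) = 0, almost everywhere.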

12. Note that Y = Σ_{i=1}^n X_i has a Gamma(n, λ) distribution. Now, proceeding as in Problem 11, it can be proved that Y is complete.
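Explicitly, E_λ[g(Y)] = 0 for all λ > 0 gives, after dropping the constant factor λ^n/Γ(n), ∫_0^∞ g(y) y^{n−1} e^{−λy} dy = 0 for all λ > 0, and uniqueness of the Laplace transform again yields g(y) y^{n−1} = 0, hence g(y) = 0, almost everywhere.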

13. Note that the density of Y = X_(1) is given by f_Y(y) = n e^{n(µ−y)}, y > µ, µ ∈ ℝ. Now E_µ[g(Y)] = 0 for all µ ∈ ℝ implies

∫_µ^∞ g(y) n e^{n(µ−y)} dy = 0 for all µ ∈ ℝ,

that is,

∫_µ^∞ g(y) e^{−ny} dy = 0 for all µ ∈ ℝ.

Using Lebesgue integration theory we conclude that g(y) = 0 a.e. Hence Y is complete.
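One way to make the last step concrete: the function G(µ) = ∫_µ^∞ g(y) e^{−ny} dy is identically zero, and since the integrand is locally integrable, G is differentiable almost everywhere with G′(µ) = −g(µ) e^{−nµ}; G ≡ 0 then forces g(µ) = 0 a.e.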

14. Minimal sufficiency can be proved using the Lehmann-Scheffé theorem. To see that (X̄, S²) is not complete, note that

E[(n/(n + 1)) X̄² − S²] = 0 for all θ > 0.

However,

P((n/(n + 1)) X̄² − S² = 0) = 0.
