S. Alhidairah 201701

CHAPTER 4

Discrete Random Variables and Probability Distribution

Contents:

1. Basic concepts
2. Discrete random variable
3. Probability distribution
4. Expectation
5. Higher moments
6. Variance
7. Cumulative distribution function
8. The probability generating function
9. The moment generating function
10. Discrete probability distributions:
    1. Bernoulli distribution
    2. Binomial distribution
    3. Poisson distribution
    4. Geometric distribution
    5. Negative Binomial distribution
    6. Hyper Geometric distribution

Basic concepts

1.A variable:

The attribute or characteristic that can assume different values.

2.Random variable:

The variable is called random when its values are related to the outcomes of a random experiment.

Random variables are classified into two broad types: discrete and continuous.

1.A discrete random variable has a countable set of distinct possible values.

Ex(1): The number of boys in a classroom.

Ex(2): The number of smokers in a certain city.

Ex(3): The number of errors in a randomly selected computer.

2.A continuous random variable is such that any value (to any number of decimal places) within some interval is a possible value.

Ex(4): The height of a person at a certain age.

Ex(5): The temperature of a certain city.

Ex(6): The air pressure at different heights.

(2)

2 | S . A l h i d a i r a h 2 0 1 7 0 1

Discrete Random Variables

Probability distribution:

For a discrete random variable, its probability distribution (also called the probability distribution function) is any table, graph, or formula that gives each possible value and the probability of that value.

For a discrete r.v. X, we define the probability mass function (p.m.f) of X by:

f(x) = P(x) = P(X = x)

If x is one of the distinct values x1, x2, ..., xn with probabilities f1, f2, ..., fn, then f(x) represents a p.m.f if it satisfies:

1) f(xi) ≥ 0 for all xi, i = 1, 2, ..., n
2) Σ f(xi) = 1, summing over i = 1, ..., n
3) P(X = k) = f(k)

EX(7): If the experiment consists of flipping a coin twice, find the sample space.

Solution:
S = {(T,T), (T,H), (H,T), (H,H)}
X : number of heads
(T,T) → X = 0, (T,H) → X = 1, (H,T) → X = 1, (H,H) → X = 2
so X = 0, 1, 2.

Ex(8): Suppose that two dice are rolled. Find the p.m.f for the sum of the two dice.

Solution:
S = {(i, j); i, j = 1, ..., 6}, n(S) = 6² = 36
X : the sum of the two dice, X = 2, 3, ..., 12

f(2) = P(X = 2) = P(1,1) = 1/36
f(3) = P(X = 3) = P(1,2) + P(2,1) = 2/36
f(4) = P(X = 4) = P(1,3) + P(2,2) + P(3,1) = 3/36
f(5) = P(X = 5) = P(1,4) + P(2,3) + P(3,2) + P(4,1) = 4/36
f(6) = P(X = 6) = P(1,5) + P(2,4) + P(3,3) + P(4,2) + P(5,1) = 5/36
f(7) = P(X = 7) = P(1,6) + P(2,5) + P(3,4) + P(4,3) + P(5,2) + P(6,1) = 6/36
f(8) = P(X = 8) = P(2,6) + P(3,5) + P(4,4) + P(5,3) + P(6,2) = 5/36
f(9) = P(X = 9) = P(3,6) + P(4,5) + P(5,4) + P(6,3) = 4/36
f(10) = P(X = 10) = P(4,6) + P(5,5) + P(6,4) = 3/36
f(11) = P(X = 11) = P(5,6) + P(6,5) = 2/36
f(12) = P(X = 12) = P(6,6) = 1/36

The p.m.f for the sum of two dice is:

X 2 3 4 5 6 7 8 9 10 11 12 Total

f(x) 1/36 2/36 3/36 4/36 5/36 6/36 5/36 4/36 3/36 2/36 1/36 1
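As a check on the table, the 36 equally likely outcomes can be enumerated directly. The short Python sketch below (an illustration, not part of the original notes) tallies each sum with exact fractions:

```python
from fractions import Fraction
from itertools import product

# Tally P(X = s) for the sum s of two fair dice over all 36 outcomes.
pmf = {}
for i, j in product(range(1, 7), repeat=2):
    pmf[i + j] = pmf.get(i + j, Fraction(0)) + Fraction(1, 36)
```

The resulting dictionary matches the table above, e.g. pmf[7] = 1/6 and pmf[2] = pmf[12] = 1/36.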


Ex(9): If the experiment consists of flipping a coin twice, find the p.m.f for the number of heads.

Solution:
S = {(H,H), (H,T), (T,H), (T,T)}
X : number of heads, X = 0, 1, 2
f(0) = P(X = 0) = P(T,T) = (1/2)(1/2) = 1/4
f(1) = P(X = 1) = P(H,T) + P(T,H) = 1/4 + 1/4 = 2/4
f(2) = P(X = 2) = P(H,H) = (1/2)(1/2) = 1/4

Ex(10): Consider the p.m.f of the discrete r.v. X, f(x) = P(X = x) = k(5x + 7) for x = 0, 1, 3, 4. Find the value of k.

X 0 1 3 4
f(x) 7k 12k 22k 27k

Solution:

We have: 7k+12k+22k+27k=1

68k=1 → k=1/68

[Figure: bar chart of the p.m.f for the sum of two dice, f(x) against x = 2, ..., 12]

The p.m.f for the number of heads (Ex(9)) is:

X 0 1 2 Total
f(x) 1/4 2/4 1/4 1


Expectation (Expected value, mean value, mean)

Determines the value about which the values of X tend to accumulate .

μ = E(X) = Σ x f(x)

The value of E(X) may be positive, negative, or zero.

Properties of expected value:
1. If c is a constant then E(c) = c.
2. E(aX + b) = aE(X) + b, where a, b are constants.
3. E(c g(X)) = c E(g(X)).
4. E[g1(X) + g2(X)] = E(g1(X)) + E(g2(X)).

Higher moments of X:

The r-th non-central moment of the discrete r.v. X is denoted by:

μr' = E(X^r) = Σ x^r P(X = x) = Σ x^r f(x)

In particular:
μ0' = E(X^0) = 1
μ1' = E(X^1) = μ
μ2' = E(X²), and σ² = E(X²) − μ².

Ex(11): Consider the probability distribution:

X 1 2 3 4
f(x) 1/6 1/3 1/3 1/6

Find the following:
i) E(X).
Solution: E(X) = 1(1/6) + 2(1/3) + 3(1/3) + 4(1/6) = 2.5
ii) E(2X² + 5).
Solution: E(2X² + 5) = 2E(X²) + 5 = 2(1/6 + 4/3 + 3 + 16/6) + 5 = 19.33
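The two expectations above can be reproduced by summing g(x) f(x) over the support. A Python sketch (illustrative, not part of the original notes), using exact fractions:

```python
from fractions import Fraction

# p.m.f of Ex(11): X takes 1, 2, 3, 4 with probabilities 1/6, 1/3, 1/3, 1/6.
f = {1: Fraction(1, 6), 2: Fraction(1, 3), 3: Fraction(1, 3), 4: Fraction(1, 6)}

def expect(g, pmf):
    # E[g(X)] = sum of g(x) f(x) over the support.
    return sum(g(x) * p for x, p in pmf.items())

mean = expect(lambda x: x, f)               # E(X) = 5/2
e_quad = expect(lambda x: 2 * x**2 + 5, f)  # E(2X^2 + 5) = 58/3, i.e. 19.33...
```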

Variance of a r.v X:

It measures the degree of scatter (measured in squared units of X) of the value of X about its mean µ. It is denoted by Var(X) or σ2 and defined by:

σ2 σ2x Var(X)V(X)E

X-μ

2 E(X2)μ2.


Standard deviation

It measures the degree of scatter of the value of X (measured in the same units as X) about its mean µ. It is denoted by σ and defined as:

σ V(X)  Var(X).

Properties of variance :

1. V(X) ≥ 0.
2. V(k) = 0, where k is a constant.
3. V(kX) = k² V(X), where k is a constant.
4. V(aX + b) = a² V(X), where a, b are constants.

Ex(12): For the probability distribution:

X 1 2 3 4
f(x) 1/6 1/3 1/3 1/6

We have E(X) = 2.5, E(X²) = 7.17. Find the following:
i) Var(X).
Solution: Var(X) = E(X²) − [E(X)]² = 7.17 − (2.5)² = 0.92
ii) Var(3X − 5).
Solution: Var(3X − 5) = 9 Var(X) = 9 × 0.92 = 8.28
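The same computation in exact arithmetic gives Var(X) = 11/12 ≈ 0.917 and Var(3X − 5) = 33/4 = 8.25; the 0.92 and 8.28 above come from rounding E(X²) to 7.17. A Python sketch (illustrative, not from the notes):

```python
from fractions import Fraction

# Same p.m.f as Ex(12); exact fractions avoid rounding drift.
f = {1: Fraction(1, 6), 2: Fraction(1, 3), 3: Fraction(1, 3), 4: Fraction(1, 6)}

mean = sum(x * p for x, p in f.items())       # E(X) = 5/2
second = sum(x**2 * p for x, p in f.items())  # E(X^2) = 43/6
var = second - mean**2                        # Var(X) = E(X^2) - mu^2 = 11/12
var_lin = 9 * var                             # Var(3X - 5) = 3^2 Var(X) = 33/4
```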

Cumulative distribution function for x:

If X is a discrete random variable, then the distribution function F(x) is defined by:

F(x) = P(X ≤ x) = Σ f(xi), summing over all xi ≤ x

Remarks:

1. F(x) is a non-decreasing function.
2. F(−∞) = 0, F(∞) = 1.
3. 0 ≤ F(x) ≤ 1.
4. lim h→0+ F(x + h) = F(x), i.e. F is right-continuous.
5. F(x) is unique for X.

6. If X is a discrete r.v. then:
P(a < X < b) = F(b) − F(a) − f(b)
P(a ≤ X < b) = F(b) − F(a) + f(a) − f(b)
P(a ≤ X ≤ b) = F(b) − F(a) + f(a)
P(a < X ≤ b) = F(b) − F(a)

7. We can find the p.m.f from a cumulative distribution function: f(xi) = F(xi) − F(xi−1).

Ex(13): If the distribution function of the discrete r.v. X is given by:

F(x) =
  0,     x < 0.
  1/2,   0 ≤ x < 1.
  3/5,   1 ≤ x < 2.
  4/5,   2 ≤ x < 3.
  9/10,  3 ≤ x < 4.
  1,     x ≥ 4.

1) Find the probability mass function of X.
2) Find P(X > 1).
3) Find P(1/2 < X ≤ 3/2).

Solution:
1) f(0) = F(0) − F(−1) = 1/2 − 0 = 0.5
   f(1) = F(1) − F(0) = 3/5 − 1/2 = 6/10 − 5/10 = 0.1
   f(2) = F(2) − F(1) = 4/5 − 3/5 = 0.2
   f(3) = F(3) − F(2) = 9/10 − 8/10 = 0.1
   f(4) = F(4) − F(3) = 1 − 9/10 = 0.1

The p.m.f is:

X 0 1 2 3 4
f(x) 0.5 0.1 0.2 0.1 0.1

2) P(X > 1) = 1 − P(X ≤ 1) = 1 − F(1) = 1 − 3/5 = 0.4
3) P(1/2 < X ≤ 3/2) = F(3/2) − F(1/2) = F(1.5) − F(0.5) = 3/5 − 1/2 = f(1) = 0.1
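Remark 7 (recovering the p.m.f from step heights of F) translates directly into code. A Python sketch of Ex(13), illustrative and not part of the original notes:

```python
# CDF of Ex(13); f(x_i) = F(x_i) - F(x_{i-1}) recovers the p.m.f.
def F(x):
    if x < 0:
        return 0.0
    if x < 1:
        return 1 / 2
    if x < 2:
        return 3 / 5
    if x < 3:
        return 4 / 5
    if x < 4:
        return 9 / 10
    return 1.0

support = [0, 1, 2, 3, 4]
f = {x: F(x) - F(x - 1) for x in support}  # step height at each support point

p_gt_1 = 1 - F(1)                 # P(X > 1) = 1 - F(1) = 0.4
p_mid = F(1.5) - F(0.5)           # P(1/2 < X <= 3/2) = f(1) = 0.1
```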

Ex(14): If X is a discrete r.v. with p.m.f f(x) = cx; x = 0, 1, 2, 3, find the value of c.

Solution:
Σ cx = c(0 + 1 + 2 + 3) = 6c = 1 → c = 1/6.

Ex(15): If X is a discrete r.v with p.m.f

X 0 1 2 3 Total

f(x) 8/125 36/125 54/125 27/125 1

Find:
1. P(X ≥ 1)
2. P(X > 1)
3. P(2 ≤ X ≤ 3)
4. P(1 ≤ X ≤ 3)
5. P(1 < X ≤ 3)
6. P(2 < X ≤ 3)
7. P(X = 1 | 1 ≤ X ≤ 2)
8. P(X = 2 | X > 1)

Solution:
1. P(X ≥ 1) = f(1) + f(2) + f(3) = 1 − f(0) = 117/125.
2. P(X > 1) = f(2) + f(3) = 54/125 + 27/125 = 81/125.
3. P(2 ≤ X ≤ 3) = f(2) + f(3) = 81/125.
4. P(1 ≤ X ≤ 3) = f(1) + f(2) + f(3) = 117/125.
5. P(1 < X ≤ 3) = f(2) + f(3) = 81/125.
6. P(2 < X ≤ 3) = f(3) = 27/125.
7. P(X = 1 | 1 ≤ X ≤ 2) = f(1)/(f(1) + f(2)) = (36/125)/((36 + 54)/125) = 36/90 = 0.4.
8. P(X = 2 | X > 1) = f(2)/(f(2) + f(3)) = (54/125)/(81/125) = 54/81 = 2/3.


Probability generating function

If X is a discrete r.v. with probability mass function f(x), then the probability generating function is given by:

Gx(t) = E(t^X) = Σ t^x f(x)

Notes:
1. Gx(1) = 1.
2. Gx'(1) = E(X) = μ.
3. Gx''(1) = E[X(X − 1)].
4. Var(X) = Gx''(1) + Gx'(1) − [Gx'(1)]².

Ex(16): If X has the probability generating function Gx(t) = e^(5(t−1)), find:

1. E(X) and V(X).
2. E[X(X − 1)].

Solution:
1. (i) Gx'(t) = (d/dt) e^(5(t−1)) = 5 e^(5(t−1)), so E(X) = Gx'(1) = 5 e^0 = 5.
   (ii) Gx''(t) = 25 e^(5(t−1)), so Gx''(1) = 25 and
        V(X) = Gx''(1) + Gx'(1) − [Gx'(1)]² = 25 + 5 − 25 = 5.
2. E[X(X − 1)] = Gx''(1) = 25.
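As a numerical cross-check (my addition, not in the notes), G'(1) and G''(1) can be approximated by central differences; the step size h = 1e-5 is an arbitrary choice:

```python
import math

# G(t) = exp(5(t - 1)), the p.g.f of Ex(16).
def G(t):
    return math.exp(5 * (t - 1))

h = 1e-5  # finite-difference step (a pragmatic choice, not from the notes)
g1 = (G(1 + h) - G(1 - h)) / (2 * h)          # approximates G'(1)  = E(X)
g2 = (G(1 + h) - 2 * G(1) + G(1 - h)) / h**2  # approximates G''(1) = E[X(X-1)]
var = g2 + g1 - g1**2                         # Var(X) = G''(1) + G'(1) - [G'(1)]^2
```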

Moment generating function

It is denoted by Mx(t) and defined by:

Mx(t) = E(e^(tX)) = Σ e^(tx) f(x)

Notes:
1. M(0) = 1.
2. M'(t) = E(X e^(tX)), so M'(0) = E(X) = μ.
3. M''(t) = E(X² e^(tX)), so M''(0) = E(X²) = μ2'.
4. M^(r)(0) = E(X^r) = μr'.


Ex(17): Find the moment generating function given that:

X 1 3 5

f(x) 0.25 0.5 0.25

Solution:
Mx(t) = Σ e^(tx) f(x) = 0.25 e^t + 0.5 e^(3t) + 0.25 e^(5t)
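The m.g.f above can be evaluated numerically; a Python sketch (illustrative, not from the notes) also checks that M(0) = 1 and that M'(0) recovers the mean, using a central-difference derivative:

```python
import math

# M(t) = sum of e^{tx} f(x) for the table in Ex(17).
f = {1: 0.25, 3: 0.5, 5: 0.25}

def M(t):
    return sum(math.exp(t * x) * p for x, p in f.items())

mean = sum(x * p for x, p in f.items())  # E(X) directly from the table
h = 1e-5
m1 = (M(h) - M(-h)) / (2 * h)            # approximates M'(0) = E(X)
```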

Discrete Probability Distributions

1.Bernoulli distribution:

Bernoulli trial: a random experiment whose outcome is either:

i- Success, with probability p (0 < p < 1), or
ii- Failure, with probability q (q = 1 − p).

Now let X denote the number of successes in one Bernoulli trial; the possible values of X are 0 and 1.

P(X = 1) = P(success) = p
P(X = 0) = P(failure) = q
such that p + q = 1.

A random variable X is said to be a Bernoulli random variable if its probability mass function is given by: f(x) = P(X = x) = p^x q^(1−x), x = 0, 1, 0 < p < 1.

This is referred to simply as: X~Ber(p)

Ex(18): Tossing a coin once and looking for a head.

Ex(19): Rolling a die once and looking for “face 1”

Characteristics of Bernoulli distribution:

1.Expectation: μE(X)p 2.Variance: Var(X) = p q 3.Standard deviation: σ pq

4.Moment generating function : Mx(t)qpet 5. The probability generating function : Gx(t)qpt
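As an illustrative check (not from the notes), a quick simulation compares sample moments of Bernoulli draws with E(X) = p and Var(X) = pq; the seed and p = 0.3 are arbitrary choices:

```python
import random

# Simulate n Bernoulli(p) trials and compare sample moments with p and pq.
random.seed(0)
p = 0.3
n = 100_000
draws = [1 if random.random() < p else 0 for _ in range(n)]

sample_mean = sum(draws) / n
sample_var = sum((d - sample_mean) ** 2 for d in draws) / n
```

With n this large the sample mean and variance land close to 0.3 and 0.21.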

2. Binomial distribution:

Consider n Bernoulli trials, such that:

1- The outcome of each trial is:

i- Success with probability p.

ii-Failure with probability q.

2- The probability p of success is constant, it doesn’t change from trial to trial.

3- The trials are independent.

Let X be a r.v. denoting the number of successes in these n trials (or in a random sample of size n drawn with replacement). The possible values of X are X = 0, 1, 2, ..., n. The p.m.f of X, which follows a Binomial distribution with parameters n and p, is:

f(x) = C(n, x) p^x q^(n−x), x = 0, 1, ..., n, 0 ≤ p ≤ 1, q = 1 − p

This is referred to simply as: X~BIN(n,p)

Characteristics of Binomial distribution:

1.Expectation: μE(X)np 2.Variance: Var(X) = n p q 3.Standard deviation: σ npq

4.Moment generating function : Mx(t)(qpet)n 5. The probability generating function : Gx(t)(qpt)n

Ex(20): A coin is tossed 6 times. Find the probability that a head will appear:

i. Exactly 3 times.

Solution:
f(x) = C(6, x) (1/2)^x (1/2)^(6−x), x = 0, 1, 2, ..., 6
f(3) = C(6, 3) (1/2)^3 (1/2)^3 = 20/64 = 0.313

ii. At least 2 times.

Solution:
P(X ≥ 2) = f(2) + f(3) + ... + f(6)
= 1 − [f(0) + f(1)]
= 1 − [C(6, 0)(1/2)^0(1/2)^6 + C(6, 1)(1/2)^1(1/2)^5]
= 1 − 7/64 = 0.890

Ex(21): The m.g.f of the r.v. X is: Mx(t) = (0.25 + 0.75 e^t)^6.

Determine the probability distribution, mean, and variance.

Solution:
X~BIN(6, 0.75), so E(X) = np = 6(0.75) = 4.5 and Var(X) = npq = 6(0.75)(0.25) = 1.125.
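The binomial p.m.f is easy to evaluate with Python's math.comb; this sketch (an illustration, not part of the notes, using the fair-coin numbers of Ex(20)) reproduces both answers:

```python
from math import comb

# Binomial p.m.f: C(n, x) p^x q^(n-x).
def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p) ** (n - x)

p_exactly_3 = binom_pmf(3, 6, 0.5)                               # 20/64
p_at_least_2 = 1 - binom_pmf(0, 6, 0.5) - binom_pmf(1, 6, 0.5)   # 57/64
```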

3. Poisson distribution:

Let X be a r.v. denoting the number of successes in a sequence of n (n ≥ 30) Binomial trials with success probability p (p < 0.5), satisfying the conditions of the Binomial distribution, such that λ = np is finite.

The possible values of X are X = 0, 1, .... The p.m.f of X is:

f(x) = λ^x e^(−λ) / x!, x = 0, 1, 2, ..., λ > 0

In this case we say X follows a Poisson distribution with parameter λ.

Characteristics of Poisson distribution:
1. The Poisson distribution is the limiting distribution of a binomial distribution as n tends to infinity.
2. The Poisson distribution is the distribution for rare events. It is used when n is large and p is small, such that λ = np is finite.
3. Expectation: μ = E(X) = λ
4. Variance: Var(X) = λ
5. Standard deviation: σ = √λ
6. Moment generating function: Mx(t) = e^(λ(e^t − 1))


Ex(22): The number X of annual earthquakes in a certain country has mean 4. What is the probability distribution of X?

Solution:

An earthquake is a rare event, so the distribution of X is Poisson with parameter λ = 4.

Ex(23): Consider the case when X has a Poisson distribution with parameter 3. Find the p.m.f and P(X=2).

Solution:

The p.m.f is:
f(x) = 3^x e^(−3) / x!, x = 0, 1, 2, ...

P(X = 2) = f(2) = 3² e^(−3) / 2! = 4.5 e^(−3) ≈ 0.224

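A Python sketch of the Poisson p.m.f (illustrative, not part of the notes), reproducing Ex(23) with λ = 3:

```python
from math import exp, factorial

# Poisson p.m.f: lambda^x e^{-lambda} / x!.
def poisson_pmf(x, lam):
    return lam**x * exp(-lam) / factorial(x)

p2 = poisson_pmf(2, 3)                              # 4.5 e^{-3}
total = sum(poisson_pmf(x, 3) for x in range(100))  # should be ~ 1
```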
4. Geometric distribution:

Let X be a r.v. denoting the number of Bernoulli trials required to obtain the first success. The possible values of X are X = 1, 2, ...

The p.m.f of X is: f(x) = P(X = x) = p q^(x−1), x = 1, 2, ...

This is called the Geometric distribution with parameter p; we denote this by writing: X~GEOM(p)

Characteristics of Geometric distribution:
1. Expectation: μ = E(X) = 1/p
2. Variance: Var(X) = q/p²
3. Standard deviation: σ = √q / p
4. Moment generating function: Mx(t) = p e^t / (1 − q e^t)
5. The probability generating function: Gx(t) = p t / (1 − q t)

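These formulas can be checked by truncating the infinite sums of the p.m.f. A Python sketch (illustrative, not from the notes; p = 0.65 matches Ex(24) below, and the cutoff 500 is an arbitrary truncation point):

```python
# Geometric p.m.f f(x) = p q^(x-1); truncated sums approximate the moments.
p = 0.65
q = 1 - p

def geo_pmf(x):
    return p * q ** (x - 1)

mean = sum(x * geo_pmf(x) for x in range(1, 500))       # ~ 1/p
second = sum(x**2 * geo_pmf(x) for x in range(1, 500))  # ~ E(X^2)
var = second - mean**2                                  # ~ q/p^2
p_gt_2 = 1 - geo_pmf(1) - geo_pmf(2)                    # = q^2 = 0.1225
```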

Ex(24): Consider the case X~GEOM(0.65).

This means that X has a Geometric distribution with parameter p = 0.65.

f(x) = P(X = x) = 0.65 (0.35)^(x−1), x = 1, 2, ...

We have:
E(X) = 1/0.65 = 1.538
Var(X) = 0.35/(0.65)² = 0.828
P(X > 2) = 1 − [f(1) + f(2)] = 0.1225
P(X < 4) = f(1) + f(2) + f(3) = 0.957

5. Negative Binomial distribution:

Let X be a r.v. denoting the number of Bernoulli trials required to obtain the k-th success. The possible values of X are k, k+1, k+2, ...

The p.m.f of X is:

f(x) = P(X = x) = C(x−1, k−1) p^k q^(x−k), x = k, k+1, k+2, ...

This is called the Negative Binomial distribution with parameters k and p; we denote this by writing: X~NIB(k, p)

Remark:

-The negative binomial distribution can be reduced to the geometric distribution when k = 1.

-Thus the geometric distribution is a special case of the negative binomial distribution.

Characteristics of Negative Binomial distribution:

1. Expectation: μ = E(X) = k/p
2. Variance: Var(X) = kq/p²
3. Standard deviation: σ = √(kq) / p
4. Moment generating function: Mx(t) = (p e^t / (1 − q e^t))^k
5. The probability generating function: Gx(t) = (p t / (1 − q t))^k


Ex(25): Consider the case when X~NIB(5, 0.8).

This means that X has a Negative Binomial distribution with k=5 , and p=0.8 We have:

E(X)=5/0.8=6.25

Var(X)=5(0.2)/(0.8)2=1.563

f(7) = P(X = 7) = C(6, 4) (0.8)^5 (0.2)² = 0.197

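A Python sketch of the negative binomial p.m.f (illustrative, not part of the notes), reproducing the numbers of Ex(25); the truncation at x = 200 is an arbitrary choice:

```python
from math import comb

# Negative binomial p.m.f: number of trials x to reach the k-th success.
def nib_pmf(x, k, p):
    return comb(x - 1, k - 1) * p**k * (1 - p) ** (x - k)

f7 = nib_pmf(7, 5, 0.8)                                    # C(6,4) 0.8^5 0.2^2
mean = sum(x * nib_pmf(x, 5, 0.8) for x in range(5, 200))  # ~ k/p = 6.25
```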
6. Hyper Geometric distribution:

Consider a collection of k objects of a certain type and N-k of another type. A random sample of size n is drawn without replacement. Let X be a r.v denoting the number of objects of the first type in the selected sample.

The p.m.f is:

f(x) = C(k, x) C(N−k, n−x) / C(N, n), x = 0, 1, ..., n, with x ≤ k and n − x ≤ N − k

E(X) = n(k/N)
Var(X) = n(k/N)(1 − k/N)((N − n)/(N − 1))

In this case we say that X has a Hyper Geometric distribution with parameters n, k, N, and we write: X~HG(n,k,N)

Ex(26): From a group of 6 men and 4 women , a random sample of 5 persons is selected without replacement. Let X denotes the number of men in the sample.

It is clear that X~HG(5,6,10), and the p.m.f is:

f(x) = C(6, x) C(4, 5−x) / C(10, 5), x = 1, 2, 3, 4, 5

(x = 0 is impossible here, since the 4 women cannot fill all 5 places.)

P(2 men selected) = f(2) = C(6, 2) C(4, 3) / C(10, 5) = 60/252 = 0.238

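A Python sketch of the hypergeometric p.m.f (illustrative, not part of the notes), reproducing Ex(26): 5 people drawn without replacement from 6 men and 4 women, with X the number of men:

```python
from math import comb

# Hypergeometric p.m.f: C(k, x) C(N-k, n-x) / C(N, n).
def hg_pmf(x, n, k, N):
    return comb(k, x) * comb(N - k, n - x) / comb(N, n)

f2 = hg_pmf(2, 5, 6, 10)                                  # 60/252
mean = sum(x * hg_pmf(x, 5, 6, 10) for x in range(1, 6))  # n k/N = 3
```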