Linear Combinations of Normal Random Variables

Linear Function of a Normal Random Variable
If X ~ N(µ, σ²) and a and b are constants, then

Y = aX + b ~ N(aµ + b, a²σ²)
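As a quick check of this result, the following sketch (assuming NumPy is available; the parameter values are purely illustrative) simulates Y = aX + b and compares the empirical mean and variance with aµ + b and a²σ².

    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma, a, b = 5.0, 2.0, 3.0, -1.0        # illustrative values
    x = rng.normal(mu, sigma, size=100_000)      # X ~ N(mu, sigma^2)
    y = a * x + b                                # Y = aX + b

    print(y.mean(), a * mu + b)                  # both close to a*mu + b
    print(y.var(), a**2 * sigma**2)              # both close to a^2 * sigma^2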
Linear Combinations of Independent Normal Random Variables
If Xi ~ N(µi, σi²), 1 ≤ i ≤ n, are independent variables and if ai, 1 ≤ i ≤ n, and b are constants, then

Y = a1X1 + ... + anXn + b ~ N(µ, σ²)

where

µ = a1µ1 + ... + anµn + b

and

σ² = a1²σ1² + ... + an²σn²
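A small simulation sketch of this result (assuming NumPy; the coefficients and parameters are illustrative): it forms Y = a1X1 + a2X2 + a3X3 + b from independent normals and compares the empirical mean and variance with the formulas above.

    import numpy as np

    rng = np.random.default_rng(1)
    mus    = np.array([1.0, 2.0, 3.0])           # illustrative means mu_i
    sigmas = np.array([0.5, 1.0, 2.0])           # illustrative sd's sigma_i
    a      = np.array([2.0, -1.0, 0.5])          # coefficients a_i
    b      = 4.0

    xs = rng.normal(mus, sigmas, size=(200_000, 3))  # independent X_i
    y  = xs @ a + b                                  # Y = a1 X1 + ... + an Xn + b

    print(y.mean(), a @ mus + b)                     # close to a1 mu1 + ... + an mun + b
    print(y.var(),  (a**2) @ (sigmas**2))            # close to a1^2 sig1^2 + ... + an^2 sign^2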
Averaging Independent Normal Random Variables
If Xi ~ N(µ, σ²), 1 ≤ i ≤ n, are independent variables, then their average X̄ is distributed

X̄ ~ N(µ, σ²/n)
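The shrinking variance can be seen directly in a short simulation (assuming NumPy; n and the parameters are illustrative): the variance of the simulated averages is close to σ²/n.

    import numpy as np

    rng = np.random.default_rng(2)
    mu, sigma, n = 10.0, 3.0, 25                     # illustrative values
    xbars = rng.normal(mu, sigma, size=(50_000, n)).mean(axis=1)

    print(xbars.mean(), mu)                          # close to mu
    print(xbars.var(),  sigma**2 / n)                # close to sigma^2 / n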
The Central Limit Theorem
If X1, ..., Xn is a sequence of independent identically distributed random variables with a mean µ and a variance σ², then the distribution of their average X̄ can be approximated by a N(µ, σ²/n) distribution. Similarly, the distribution of the sum X1 + ... + Xn can be approximated by a N(nµ, nσ²) distribution.
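The following sketch (assuming NumPy; the exponential population and the choice of n are illustrative) averages samples from a decidedly non-normal distribution and shows that the averages nevertheless behave like draws from N(µ, σ²/n).

    import numpy as np

    rng = np.random.default_rng(3)
    n = 40                                           # illustrative sample size
    # Exponential population with scale 1: mu = 1, sigma^2 = 1
    xbars = rng.exponential(1.0, size=(100_000, n)).mean(axis=1)

    print(xbars.mean(), 1.0)                         # close to mu = 1
    print(xbars.var(),  1.0 / n)                     # close to sigma^2 / n
    # A histogram of xbars would look close to the N(1, 1/n) density.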
The Chi-Square Distribution
A chi-square random variable with v degrees of freedom, X, can be generated as

X = X1² + ... + Xv²

where the Xi are independent standard normal random variables. A chi-square distribution with v degrees of freedom is a gamma distribution with shape parameter v/2 and scale parameter 2; it has an expectation of v and a variance of 2v.
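This construction is easy to check by simulation. The sketch below (assuming NumPy and SciPy; v is an illustrative value) sums v squared standard normals, compares the sample mean and variance with v and 2v, and compares one quantile with scipy.stats.chi2.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    v = 6                                            # illustrative degrees of freedom
    z = rng.standard_normal(size=(100_000, v))
    x = (z**2).sum(axis=1)                           # X = X1^2 + ... + Xv^2

    print(x.mean(), v)                               # expectation v
    print(x.var(),  2 * v)                           # variance 2v
    print(np.quantile(x, 0.95), stats.chi2.ppf(0.95, df=v))   # matching quantiles

The t-Distribution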
A t random variable with v degrees of freedom, tv, is defined to be

tv ~ N(0,1) / √(χv²/v)
where the N(0,1) and χv² random variables are independently distributed. The t-distribution has a shape similar to a standard normal distribution but is a little flatter. As v → ∞, the t-distribution tends to a standard normal distribution.
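A simulation sketch of this construction (assuming NumPy and SciPy; v is illustrative): it forms N(0,1)/√(χv²/v) from independent pieces and compares a tail quantile with scipy.stats.t and with the standard normal.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    v = 8                                            # illustrative degrees of freedom
    z    = rng.standard_normal(200_000)              # N(0, 1)
    chi2 = rng.chisquare(v, size=200_000)            # chi-square, independent of z
    t    = z / np.sqrt(chi2 / v)                     # t with v degrees of freedom

    print(np.quantile(t, 0.975), stats.t.ppf(0.975, df=v))   # matching quantiles
    # Heavier tails ("flatter" shape) than the standard normal:
    print(np.quantile(t, 0.975), stats.norm.ppf(0.975))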
Sample Mean
If X1, ..., Xn are observations from a population with a mean µ and a variance σ², then the central limit theorem indicates that the sample mean µ̂ = X̄ has the approximate distribution

µ̂ = X̄ ~ N(µ, σ²/n)
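In practice this is used to attach a standard error to the point estimate. The sketch below (a hypothetical data set, assuming NumPy) computes µ̂ = X̄ together with an estimated standard error s/√n, where the sample standard deviation s stands in for the unknown σ.

    import numpy as np

    # Hypothetical data set (illustrative values)
    x = np.array([4.9, 5.3, 5.1, 4.7, 5.0, 5.4, 4.8, 5.2])
    n = len(x)

    mu_hat = x.mean()                                # point estimate of mu
    se     = x.std(ddof=1) / np.sqrt(n)              # estimate of sigma / sqrt(n)
    print(mu_hat, se)                                # mu_hat is approximately N(mu, sigma^2/n)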
Sample Variance
If X1, ..., Xn are normally distributed with a mean µ and a variance σ², then the sample variance S² is distributed

S² ~ σ²χn−1²/(n − 1)
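Equivalently, (n − 1)S²/σ² ~ χn−1², which the sketch below checks by simulation (assuming NumPy and SciPy; the parameter values are illustrative).

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    mu, sigma, n = 0.0, 2.0, 12                      # illustrative values
    samples = rng.normal(mu, sigma, size=(100_000, n))
    s2 = samples.var(axis=1, ddof=1)                 # sample variances S^2

    scaled = (n - 1) * s2 / sigma**2                 # should be chi-square with n-1 df
    print(scaled.mean(), n - 1)                      # a chi-square mean equals its df
    print(np.quantile(scaled, 0.95), stats.chi2.ppf(0.95, df=n - 1))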
t-Statistic
If X1, ..., Xn are normally distributed with a mean µ, then

√n(X̄ − µ)/S ~ tn−1
This result is very important since in practice an experimenter knows the value of n and the observed sample mean X̄ and sample variance S², and so knows everything in the quantity

√n(X̄ − µ)/S

except for µ. This allows the experimenter to make useful inferences about µ.
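For example, the sketch below (a hypothetical data set and candidate value µ0, assuming NumPy and SciPy) computes the t-statistic at µ0 and the corresponding two-sided tail probability from the tn−1 distribution; candidate values of µ0 that give an extreme statistic are implausible values for µ.

    import numpy as np
    from scipy import stats

    # Hypothetical data set and candidate value mu0 (illustrative)
    x   = np.array([10.2, 9.8, 10.5, 10.1, 9.9, 10.4, 10.0, 10.3])
    mu0 = 10.0
    n   = len(x)

    t_stat = np.sqrt(n) * (x.mean() - mu0) / x.std(ddof=1)   # sqrt(n)(Xbar - mu0)/S
    p_two_sided = 2 * stats.t.sf(abs(t_stat), df=n - 1)      # tail probability of t(n-1)
    print(t_stat, p_two_sided)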