🔖 Statistical Inference I (Theory of Estimation: Efficiency)
In this article we discuss the following terms:
I. Efficiency.
II. Mean Square Error.
III. Consistency.
📚I. Efficiency:
We know that any two unbiased estimators of a parameter give rise to infinitely many unbiased estimators of that parameter (every weighted combination ɑT₁ + (1 − ɑ)T₂ is again unbiased). Therefore, if a parameter has more than one unbiased estimator, the problem is to choose the best one from the class of unbiased estimators, and for that we need some further criterion. In this situation we examine the variability of the estimators: the measure of variability of an estimator T around its mean is Var(T). Hence, if T is an unbiased estimator of the parameter, its variance measures its precision; the smaller the variance, the greater the precision.
📑 i. Efficient estimator: An estimator T is said to be an efficient estimator of θ if T is an unbiased estimator of θ and its variance is smaller than that of any other unbiased estimator, i.e. Var(T) < Var(T*),
where T* is any other unbiased estimator of θ.
🔖ii. Relative Efficiency: If T₁ and T₂ are two unbiased estimators of a parameter θ with E(T₁²) < ∞ and E(T₂²) < ∞, then the relative efficiency of T₁ with respect to T₂ is denoted by Efficiency(T₁, T₂) and defined as
Efficiency(T₁, T₂) = V(T₁)/V(T₂).
Remarks: i. If Efficiency(T₁, T₂) = 1, then T₁ and T₂ are equally efficient estimators.
ii. If Efficiency(T₁, T₂) < 1, then V(T₁) < V(T₂),
i.e. T₁ is more efficient than T₂.
iii. If Efficiency(T₁, T₂) > 1, then V(T₁) > V(T₂),
i.e. T₂ is more efficient than T₁.
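These remarks can be checked by simulation. Below is a minimal Python sketch (added here for illustration; the function name and setup are ours, not from the notes) that approximates Efficiency(T₁, T₂) by generating repeated samples and comparing the empirical variances of two estimators.

```python
import numpy as np

def relative_efficiency(estimator1, estimator2, sampler, n_rep=100_000):
    """Approximate Efficiency(T1, T2) = V(T1)/V(T2) by Monte Carlo.

    estimator1, estimator2: functions mapping one sample to one estimate.
    sampler: function returning one random sample (a 1-D array).
    """
    t1 = np.array([estimator1(sampler()) for _ in range(n_rep)])
    t2 = np.array([estimator2(sampler()) for _ in range(n_rep)])
    return t1.var() / t2.var()
```

For two unbiased estimators, a returned ratio below 1 indicates, per remark ii, that the first estimator is the more efficient one.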
Example 1: If X₁ and X₂ are two independent observations from a normal distribution with mean θ and variance σ², show that T₁ = (X₁ + X₂)/2 and T₂ = (X₁ + 2X₂)/3 are unbiased estimators of θ, and find the relative efficiency Efficiency(T₁, T₂).
Solution: Since X₁ and X₂ are two independent observations from a normal distribution with mean θ and variance σ², we have
E(X₁) = E(X₂) = θ.
We have T₁ = (X₁ + X₂)/2 and T₂ = (X₁ + 2X₂)/3, so
E(T₁) = E[(X₁ + X₂)/2] = [E(X₁) + E(X₂)]/2 = (θ + θ)/2 = 2θ/2 = θ,
E(T₂) = E[(X₁ + 2X₂)/3] = [E(X₁) + 2E(X₂)]/3 = (θ + 2θ)/3 = 3θ/3 = θ.
Thus T₁ and T₂ are unbiased estimators of the parameter θ.
Now we obtain the variances of T₁ and T₂. Since X₁ and X₂ are independent,
V(T₁) = V[(X₁ + X₂)/2] = [V(X₁) + V(X₂)]/4 = [σ² + σ²]/4 = 2σ²/4 = σ²/2,
V(T₂) = V[(X₁ + 2X₂)/3] = [V(X₁) + 4V(X₂)]/9 = [σ² + 4σ²]/9 = 5σ²/9.
The relative efficiency of T₁ with respect to T₂ is
Efficiency(T₁, T₂) = V(T₁)/V(T₂)
= (σ²/2)/(5σ²/9)
= 9/10
= 0.9 < 1.
Therefore Efficiency(T₁, T₂) < 1, which means V(T₁) = σ²/2 < V(T₂) = 5σ²/9.
Hence T₁ is more efficient than T₂.
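A short simulation confirms both the unbiasedness and the ratio 9/10. This is a sketch added for illustration; the values θ = 5 and σ = 2 are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, n_rep = 5.0, 2.0, 200_000

# Draw X1, X2 ~ N(theta, sigma^2) independently, n_rep times.
x1 = rng.normal(theta, sigma, n_rep)
x2 = rng.normal(theta, sigma, n_rep)

t1 = (x1 + x2) / 2        # T1 = (X1 + X2)/2
t2 = (x1 + 2 * x2) / 3    # T2 = (X1 + 2X2)/3

print(t1.mean(), t2.mean())   # both close to theta = 5 (unbiasedness)
print(t1.var() / t2.var())    # close to 9/10 = 0.9
```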
Example 2: If X₁, X₂, ..., Xₙ is a random sample from a normal distribution with mean θ and variance σ², T₁ = the sample mean x̄, and T₂ = the sample median, find the efficiency of T₁ relative to T₂, given that the variance of the median is V(T₂) = πσ²/2n (for large n).
Solution: Since X₁, X₂, ..., Xₙ is a random sample from a normal distribution with mean θ and variance σ², we have V(x̄) = V(T₁) = σ²/n, and V(T₂) = πσ²/2n.
Therefore the efficiency of T₁ relative to T₂ is
Efficiency(T₁, T₂) = V(T₁)/V(T₂)
= [σ²/n] / [πσ²/2n]
= [σ²/n] × [2n/(πσ²)]
= 2/π
≈ 0.6366 < 1.
Therefore Efficiency(T₁, T₂) < 1, which means V(T₁) = σ²/n < V(T₂) = πσ²/2n.
Hence T₁ is more efficient than T₂, i.e. the sample mean is more efficient than the sample median.
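The asymptotic ratio 2/π can also be checked numerically. Below is a sketch added for illustration; the choices n = 100 and the N(0, 1) setting are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma, n, n_rep = 0.0, 1.0, 100, 50_000

# n_rep independent samples of size n from N(theta, sigma^2).
samples = rng.normal(theta, sigma, size=(n_rep, n))
means = samples.mean(axis=1)       # T1 = sample mean
medians = np.median(samples, axis=1)  # T2 = sample median

print(means.var() / medians.var())  # close to 2/pi ~ 0.6366
```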
📚II. Mean Square Error (M.S.E.):
Let T₁ and T₂ be two estimators of a parameter θ, where T₁ is an unbiased estimator of θ and T₂ is a biased estimator of θ. Suppose the possible values of T₁ are widely spread around θ while the values of T₂ are concentrated near the parameter. In this case T₂ may be preferred over T₁.
In such a case we need to study the variability of the estimator around the parameter itself, not just around its mean. The variance measures the variability of an estimator around its mean, so if T is an unbiased estimator its variance is a good measure of precision. But if T is a biased estimator of the parameter, we take the variability of T around θ, namely E(T − θ)², as the measure of its precision, and this quantity is called the Mean Square Error (M.S.E.).
Mean Square Error: An estimator T is said to be a good estimator of θ if its mean square error is minimum, i.e.
E(T − θ)² ≤ E(T* − θ)²,
where T* is any other estimator of θ.
This criterion is known as the mean square error criterion.
Result: Show that MSE(T) = Var(T) + b²(T, θ), where b(T, θ) = E(T) − θ is the bias of T.
Proof:
By definition, MSE(T) = E(T − θ)²
= E[T − E(T) + E(T) − θ]²
= E[T − E(T)]² + [E(T) − θ]² + 2[E(T) − θ]·E[T − E(T)]
= E[T − E(T)]² + [E(T) − θ]²      (since E[T − E(T)] = E(T) − E(T) = 0, and E(T) − θ is a constant)
= Var(T) + b²(T, θ)
{where Var(T) = E[T − E(T)]² and b(T, θ) = E(T) − θ}.
Therefore MSE(T) = Var(T) + b²(T, θ).
Definition: Mean Square Error (M.S.E.):
The M.S.E. of an estimator T of a parameter θ is defined as
MSE(T) = E(T − θ)² = Var(T) + b²(T, θ) if T is a biased estimator of θ, and
MSE(T) = Var(T)
if T is an unbiased estimator of θ (since then b(T, θ) = 0).
This is the definition of the M.S.E.
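The decomposition is easy to verify numerically. Below is a sketch (added for illustration; the parameter values are arbitrary) using the biased variance estimator Σ(Xᵢ − x̄)²/n, whose bias as an estimator of σ² is −σ²/n.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2, n, n_rep = 4.0, 10, 200_000

samples = rng.normal(0.0, np.sqrt(sigma2), size=(n_rep, n))
t = samples.var(axis=1)            # biased estimator: sum((x - xbar)^2) / n

mse  = np.mean((t - sigma2) ** 2)  # E(T - theta)^2
var  = t.var()                     # Var(T)
bias = t.mean() - sigma2           # b(T, theta) = E(T) - theta

print(mse, var + bias ** 2)        # the two numbers agree
```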
Example 1: If T₁ and T₂ are two unbiased estimators of a parameter θ with variances σ₁² and σ₂² respectively and correlation ρ, find the best linear unbiased combination of T₁ and T₂, and also find the expression for the variance of such a combination.
Solution: We have two unbiased estimators T₁ and T₂ of the parameter θ with
E(T₁) = θ, E(T₂) = θ,
V(T₁) = σ₁², V(T₂) = σ₂²,
and correlation ρ.
Consider a linear combination of the estimators T₁ and T₂: T = ɑT₁ + (1 − ɑ)T₂.
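Since E(T) = ɑE(T₁) + (1 − ɑ)E(T₂) = ɑθ + (1 − ɑ)θ = θ for every ɑ, the combination T is unbiased for each choice of ɑ. Its variance is
V(T) = ɑ²σ₁² + (1 − ɑ)²σ₂² + 2ɑ(1 − ɑ)ρσ₁σ₂.
To find the best (minimum-variance) combination, set dV(T)/dɑ = 0:
2ɑσ₁² − 2(1 − ɑ)σ₂² + 2(1 − 2ɑ)ρσ₁σ₂ = 0,
which gives
ɑ = (σ₂² − ρσ₁σ₂)/(σ₁² + σ₂² − 2ρσ₁σ₂).
Substituting this ɑ back into V(T) gives the variance of the best linear unbiased combination:
V(T) = σ₁²σ₂²(1 − ρ²)/(σ₁² + σ₂² − 2ρσ₁σ₂).
In particular, when ρ = 0 this reduces to ɑ = σ₂²/(σ₁² + σ₂²) and V(T) = σ₁²σ₂²/(σ₁² + σ₂²).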
Note: A smaller M.S.E. (for unbiased estimators, a smaller variance) means greater precision. Hence, while comparing two estimators T₁ and T₂ of θ, we choose the estimator with the smaller M.S.E.; this is how the M.S.E. modifies the efficiency criterion given earlier.