Monotone likelihood ratio: Relation to other statistical properties





Relation to other statistical properties

Monotone likelihoods are used in several areas of statistical theory, including point estimation and hypothesis testing, as well as in probability models.


Exponential families

One-parameter exponential families have monotone likelihood functions. In particular, the one-dimensional exponential family of probability density functions or probability mass functions with








$$f_{\theta}(x) = c(\theta)\,h(x)\exp\bigl(\pi(\theta)\,T(x)\bigr)$$



has a monotone non-decreasing likelihood ratio in the sufficient statistic $T(x)$, provided that $\pi(\theta)$ is non-decreasing.
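As a concrete illustration (an addition, not part of the original text), the sketch below numerically checks this property for the Poisson family, for which $T(x)=x$ and $\pi(\theta)=\log\theta$ is non-decreasing; the choice of family, the parameter values, and the scipy dependency are assumptions made for the example.

```python
import numpy as np
from scipy.stats import poisson

# Poisson(theta): f_theta(x) = exp(-theta) * theta**x / x!
#                            = c(theta) * h(x) * exp(pi(theta) * T(x)),
# with T(x) = x and pi(theta) = log(theta), which is non-decreasing in theta.
theta0, theta1 = 2.0, 5.0      # any pair with theta1 > theta0
x = np.arange(0, 40)           # grid of values of the sufficient statistic T(x) = x

# Likelihood ratio f_{theta1}(x) / f_{theta0}(x); it should be non-decreasing in x.
ratio = poisson.pmf(x, theta1) / poisson.pmf(x, theta0)
assert np.all(np.diff(ratio) >= 0)

print(ratio[:5])   # exp(-(theta1 - theta0)) * (theta1 / theta0)**x, increasing in x
```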


Most powerful tests: Karlin–Rubin theorem

Monotone likelihood functions are used to construct uniformly most powerful tests, according to the Karlin–Rubin theorem. Consider a scalar measurement having a probability density function parameterized by a scalar parameter $\theta$, and define the likelihood ratio $\ell(x) = f_{\theta_1}(x)/f_{\theta_0}(x)$. If $\ell(x)$ is monotone non-decreasing in $x$ for any pair $\theta_1 \geq \theta_0$ (meaning that the greater $x$ is, the more likely $H_1$ is), then the threshold test:







$$\varphi(x) = \begin{cases} 1 & \text{if } x > x_0 \\ 0 & \text{if } x < x_0 \end{cases}$$


where $x_0$ is chosen such that $\operatorname{E}_{\theta_0}\varphi(X) = \alpha$,



is the UMP test of size $\alpha$ for testing $H_0: \theta \leq \theta_0$ vs. $H_1: \theta > \theta_0$.


Note that exactly the same test is also UMP for testing $H_0: \theta = \theta_0$ vs. $H_1: \theta > \theta_0$.
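As a minimal sketch (not from the original article), the following Python code implements such a threshold test for a single $N(\theta, 1)$ measurement, a family with a monotone likelihood ratio in $x$; the specific distribution, the simulation check, and the scipy dependency are assumptions for illustration.

```python
import numpy as np
from scipy.stats import norm

def threshold_test(x, theta0, alpha, sigma=1.0):
    """One-sided threshold test of H0: theta <= theta0 vs H1: theta > theta0
    based on a single N(theta, sigma^2) measurement x.

    The threshold x0 is chosen so that P_{theta0}(X > x0) = alpha,
    i.e. E_{theta0}[phi(X)] = alpha, giving a size-alpha test."""
    x0 = theta0 + sigma * norm.ppf(1.0 - alpha)
    return 1 if x > x0 else 0

theta0, alpha = 0.0, 0.05
print(threshold_test(2.1, theta0, alpha))   # 1: reject H0
print(threshold_test(0.8, theta0, alpha))   # 0: do not reject H0

# Empirical size check under theta = theta0 (should be close to alpha).
rng = np.random.default_rng(0)
draws = rng.normal(theta0, 1.0, size=100_000)
print(np.mean([threshold_test(x, theta0, alpha) for x in draws]))
```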


Median unbiased estimation

Monotone likelihood functions are used to construct median-unbiased estimators, using methods specified by Johann Pfanzagl and others. One such procedure is an analogue of the Rao–Blackwell procedure for mean-unbiased estimators: the procedure holds for a smaller class of probability distributions than does the Rao–Blackwell procedure for mean-unbiased estimation, but for a larger class of loss functions.


Lifetime analysis: survival analysis and reliability

If a family of distributions $f_{\theta}(x)$ has the monotone likelihood ratio property in $T(x)$, then the family exhibits first-order stochastic dominance in $x$ and has a monotone hazard rate in $\theta$ (both are derived in the proofs below), but not conversely: neither monotone hazard rates nor stochastic dominance imply the MLRP.
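The following sketch (an addition, not part of the source) illustrates both implications numerically for exponential lifetimes parameterized by their mean $\theta$, a family with the MLR property in $x$; the family, the grid, and the scipy dependency are assumptions for the example.

```python
import numpy as np
from scipy.stats import expon

# Exponential lifetimes with mean theta: f_theta(x) = (1/theta) * exp(-x/theta).
# For theta1 > theta0 the likelihood ratio is increasing in x, so MLR holds.
theta0, theta1 = 1.0, 3.0
x = np.linspace(0.01, 20.0, 500)

f0, f1 = expon.pdf(x, scale=theta0), expon.pdf(x, scale=theta1)
F0, F1 = expon.cdf(x, scale=theta0), expon.cdf(x, scale=theta1)

# First-order stochastic dominance: F_{theta1}(x) <= F_{theta0}(x) for all x.
assert np.all(F1 <= F0)

# Monotone hazard rate: f_{theta1}(x)/(1 - F_{theta1}(x)) <= f_{theta0}(x)/(1 - F_{theta0}(x)).
hazard0, hazard1 = f0 / (1.0 - F0), f1 / (1.0 - F1)
assert np.all(hazard1 <= hazard0)

print(hazard0[:3], hazard1[:3])   # constant hazards, roughly 1/theta0 and 1/theta1
```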


Proofs

Let the distribution family $f_{\theta}$ satisfy MLR in $x$, so that for $\theta_1 > \theta_0$ and $x_1 > x_0$:











$$\frac{f_{\theta_1}(x_1)}{f_{\theta_0}(x_1)} \geq \frac{f_{\theta_1}(x_0)}{f_{\theta_0}(x_0)},$$



or equivalently:








$$f_{\theta_1}(x_1)\,f_{\theta_0}(x_0) \geq f_{\theta_1}(x_0)\,f_{\theta_0}(x_1).$$



Integrating this expression twice, we obtain:

1. With respect to $x_0$ over the range $(-\infty, x_1)$:

$$f_{\theta_1}(x_1)\,F_{\theta_0}(x_1) \geq F_{\theta_1}(x_1)\,f_{\theta_0}(x_1);$$

2. With respect to $x_1$ over the range $(x_0, \infty)$:

$$f_{\theta_0}(x_0)\,[1-F_{\theta_1}(x_0)] \geq f_{\theta_1}(x_0)\,[1-F_{\theta_0}(x_0)],$$

where $F_{\theta}$ denotes the cumulative distribution function.
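As a sanity check on this step (an addition, not part of the source), the sketch below verifies both integrated inequalities on a grid for exponential lifetimes with mean $\theta$, an MLR family; the family, the grid, and the scipy dependency are assumptions.

```python
import numpy as np
from scipy.stats import expon

# Exponential family with mean theta (MLR in x). Check, on a grid of x:
#   (1) f_{theta1}(x) * F_{theta0}(x)       >= F_{theta1}(x) * f_{theta0}(x)
#   (2) f_{theta0}(x) * (1 - F_{theta1}(x)) >= f_{theta1}(x) * (1 - F_{theta0}(x))
theta0, theta1 = 1.0, 2.5
x = np.linspace(0.01, 15.0, 400)

f0, f1 = expon.pdf(x, scale=theta0), expon.pdf(x, scale=theta1)
F0, F1 = expon.cdf(x, scale=theta0), expon.cdf(x, scale=theta1)

assert np.all(f1 * F0 >= F1 * f0)                    # inequality (1)
assert np.all(f0 * (1.0 - F1) >= f1 * (1.0 - F0))    # inequality (2)
print("both integrated inequalities hold on the grid")
```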



First-order stochastic dominance

Combine the two inequalities above: together they give $F_{\theta_1}(x)/F_{\theta_0}(x) \leq f_{\theta_1}(x)/f_{\theta_0}(x) \leq [1-F_{\theta_1}(x)]/[1-F_{\theta_0}(x)]$, and the outer inequality rearranges to first-order dominance:

$$F_{\theta_1}(x) \leq F_{\theta_0}(x)\ \forall x.$$



Monotone hazard rate

Use only the second inequality above to get a monotone hazard rate:

$$\frac{f_{\theta_1}(x)}{1-F_{\theta_1}(x)} \leq \frac{f_{\theta_0}(x)}{1-F_{\theta_0}(x)}\ \forall x.$$



Example





