# MLEs help!

#### lyb2012

##### New Member
I ran into a problem while doing my statistics homework!
The question is:
Let X and Y be independent exponential random variables with
$$f(x \mid \lambda)=\frac{1}{\lambda} e^{-x/\lambda},\quad x>0, \qquad f(y \mid \mu)=\frac{1}{\mu} e^{-y/\mu},\quad y>0.$$
We observe Z and W with
$$Z=\min(X,Y) \quad\text{and}\quad W=\begin{cases}1 & \text{if } Z=X,\\ 0 & \text{if } Z=Y.\end{cases}$$
Assume that $$(Z_i,W_i)$$, $$i=1,\dots,n$$, are n i.i.d. observations. Find the MLEs of λ and μ.

I can get the likelihood function
$$L(\lambda, \mu \mid \mathbf{z}, \mathbf{w})=\prod_{i=1}^n \left(\frac{\lambda}{\lambda+\mu}\right)^{w_i} \left(\frac{\mu}{\lambda+\mu}\right)^{1-w_i} \left(1-e^{-\left(\frac{1}{\mu}+\frac{1}{\lambda}\right)z_i}\right)$$

Could someone help me get the MLEs of λ and μ?


#### BGM

##### TS Contributor
For the first part of the question, see Casella and Berger, *Statistical Inference*, Exercise 4.26,

in which you will need to calculate

$$\Pr\{Z \leq z, W = w\}, ~~ w = 0, 1$$
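As a sketch of how that computation can go (using the mean parameterization $$\lambda, \mu$$ from the original post), for $$w = 1$$ condition on the value of $$X$$:

$$\Pr\{Z \leq z, W = 1\} = \Pr\{X \leq z, X < Y\} = \int_0^z \frac{1}{\lambda} e^{-x/\lambda}\, e^{-x/\mu}\, dx = \frac{\mu}{\lambda+\mu}\left(1 - e^{-\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)z}\right),$$

and symmetrically $$\Pr\{Z \leq z, W = 0\} = \frac{\lambda}{\lambda+\mu}\left(1 - e^{-\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)z}\right)$$. In particular, $$Z$$ and $$W$$ turn out to be independent, with $$Z$$ exponential of mean $$\lambda\mu/(\lambda+\mu)$$.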

Note that you have observations of $$(Z_i, W_i)$$. When you observe a continuous random variable, the likelihood function should equal the probability density function, not the CDF. Only when the data themselves are censored (specifically, left-censored) does the likelihood contribution equal the CDF.

So the very first step of the second half of the question (see Casella and Berger, *Statistical Inference*, Exercise 7.14) is to write down a correct likelihood function. You will need to differentiate with respect to $$z$$ for each value of $$w$$, and then combine the two parts using the indicator $$w$$.
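As a sketch (again with means $$\lambda$$ and $$\mu$$), differentiating $$\Pr\{Z \leq z, W = w\}$$ with respect to $$z$$ gives, for each $$w \in \{0, 1\}$$,

$$f(z, w \mid \lambda, \mu) = \left(\frac{1}{\lambda}\right)^{w}\left(\frac{1}{\mu}\right)^{1-w} e^{-\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)z},$$

so the likelihood of the sample is

$$L(\lambda, \mu \mid \mathbf{z}, \mathbf{w}) = \lambda^{-\sum_i w_i}\, \mu^{-\left(n - \sum_i w_i\right)}\, \exp\!\left(-\left(\frac{1}{\lambda}+\frac{1}{\mu}\right)\sum_{i=1}^n z_i\right).$$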

Then proceed with the usual maximization of the log-likelihood function.
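If I have the likelihood right, setting the partial derivatives of $$\ell(\lambda,\mu) = -\left(\sum_i w_i\right)\log\lambda - \left(n-\sum_i w_i\right)\log\mu - \left(\frac{1}{\lambda}+\frac{1}{\mu}\right)\sum_i z_i$$ to zero gives the closed forms $$\hat\lambda = \sum_i z_i / \sum_i w_i$$ and $$\hat\mu = \sum_i z_i / \left(n - \sum_i w_i\right)$$. Here is a quick simulation sketch in Python to sanity-check them (the parameter values are hypothetical, chosen only for illustration):

```python
import random

random.seed(0)

# Hypothetical true means, for illustration only.
lam, mu = 2.0, 5.0
n = 200_000

# Simulate (Z_i, W_i): Z = min(X, Y), W = 1 if the minimum is X.
z_sum, w_sum = 0.0, 0
for _ in range(n):
    x = random.expovariate(1.0 / lam)  # exponential with mean lam
    y = random.expovariate(1.0 / mu)   # exponential with mean mu
    z_sum += min(x, y)
    w_sum += 1 if x < y else 0

# Closed-form MLEs from the log-likelihood above.
lam_hat = z_sum / w_sum
mu_hat = z_sum / (n - w_sum)

print(lam_hat, mu_hat)
```

With a large sample the two estimates should land close to the true means, which is a useful check that the stationary points really are the maximizers.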

#### lyb2012

##### New Member
I realized where my problem was. Thank you very much.