In all the literature I can find, it is stated (and "proven" trivially) that for i.i.d. samples \(r_i\) from a Rayleigh distribution with parameter \(\sigma\), the MLE is \(\widehat{\sigma} = \frac{\sum r_i^2}{2n}\), and that it is an unbiased estimator of \(\sigma\).
But any Monte Carlo test shows that this is not the case: only the square root of that quantity behaves anything like an estimator of \(\sigma\).
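Here is the kind of check I mean (a minimal sketch using NumPy; the sample size, seed, and true \(\sigma\) are arbitrary choices of mine):

```python
import numpy as np

# Draw Rayleigh samples and compare sum(r_i^2)/(2n) and its square root
# against the true sigma.
rng = np.random.default_rng(0)
sigma_true = 2.0
n = 10_000

r = rng.rayleigh(scale=sigma_true, size=n)
stat = np.sum(r**2) / (2 * n)   # the quantity quoted as the "MLE"

print(stat)            # comes out near 4, i.e. sigma_true**2, not sigma_true
print(np.sqrt(stat))   # comes out near 2, i.e. sigma_true
```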
Obviously I'm misunderstanding what it means to be a Maximum Likelihood Estimator. Can somebody explain what I'm missing?