Thread: Consequences of choosing the wrong likelihood function...

1. Consequences of choosing the wrong likelihood function...

hey everyone! just a quicky-quick question here.

i think i remember reading somewhere that some of the nice properties of parameter estimates obtained via maximum likelihood get lost when you choose the wrong likelihood function over which you should be maximising. say the true data-generating model requires you to choose... i dunno, a beta or a gamma likelihood but you incorrectly choose to maximize over a gaussian likelihood.

does anyone remember what gets lost? what remains? what changes? i'm aaalmost sure that you keep consistency but you lose the efficiency property if you leave it as it is but i cant.freakin.find.the.chapter.where.i.read.this.

ideas or references of where to look for this stuff are appreciated

thanks peeps!

(ps- i'll mention y'all who helped me on the dedication section of my thesis. and will get you a virtual cupcake <-- (maybe))

2. Re: Consequences of choosing the wrong likelihood function...

solved through a private chat with Dason.

3. Re: Consequences of choosing the wrong likelihood function...

Originally Posted by spunky
just a quicky-quick question here.
But if we are typing very slowly?

Pawitan (2001), In All Likelihood, page 370, says under "Maximum likelihood under a wrong model":

“Therefore, maximizing the likelihood is equivalent to finding the best model, the one closest to the true distribution in the sense of the Kullback-Leibler distance.”
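To put that statement in symbols (my notation, not a quote from the book): if the data truly come from a density [MATH]g[/MATH] but we maximise the likelihood over a family [MATH]f_\theta[/MATH], then under regularity conditions the MLE converges to

[MATH]\theta^* = \arg\min_\theta \, KL(g \,\|\, f_\theta) = \arg\max_\theta \, E_g[\log f_\theta(X)][/MATH]

i.e. to the member of the (wrong) family that is closest to the truth in the Kullback-Leibler sense.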
Then Pawitan shows a number of examples, among them a gamma model estimated with a normal distribution model.

“Thus the mean and variance of the true distribution is consistently estimated. This is an example where a ‘wrong’ model would still yield consistent estimates of useful population parameters. Such estimates are said to be robust with respect to model mis-specification.
Using a wrong model, we will generally get biased or inconsistent estimates, but we might also lose efficiency."
Maximum likelihood works well for a correct model under regularity conditions, when the likelihood can be approximated by a quadratic function. But the regularity conditions are not fulfilled in, for example, a uniform distribution whose parameter is the endpoint of the support, since the parameter sits on the boundary.
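If it helps to see the gamma-estimated-with-a-normal example numerically, here is a small simulation sketch (the gamma parameter values are just made up for illustration). The normal MLEs have closed forms, the sample mean and the 1/n sample variance, and they still consistently estimate the true mean and variance even though the normal model is wrong:

```python
import numpy as np

rng = np.random.default_rng(42)

# True data-generating model: Gamma (illustrative parameter values)
shape, scale = 2.0, 3.0
true_mean = shape * scale       # = 6.0
true_var = shape * scale**2     # = 18.0

n = 200_000
x = rng.gamma(shape, scale, size=n)

# "Wrong" model: maximise a Gaussian likelihood in (mu, sigma^2).
# The maximisers are the sample mean and the 1/n sample variance.
mu_hat = x.mean()               # argmax over mu
var_hat = x.var()               # argmax over sigma^2 (ddof=0, i.e. divide by n)

# Both estimates sit close to the true mean (6) and variance (18)
# for large n, despite the misspecified likelihood.
print(mu_hat, var_hat)
```

The efficiency loss Pawitan mentions does not show up here; it concerns the variance of the estimator relative to the MLE under the correct gamma likelihood, not its consistency.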

4. The Following User Says Thank You to GretaGarbo For This Useful Post:

spunky (09-20-2012)

5. Re: Consequences of choosing the wrong likelihood function...

Originally Posted by GretaGarbo
But if we are typing very slowly?
oh well, a quicky-quick question can also warrant a long and tedious answer... but i'm prepared for that.

and greta, thank you very much for everything (now i can even provide a reference for that part of my thesis).

your command of statistics never ceases to amaze me. i really like reading your posts; i learn quite a bit from them

6. Re: Consequences of choosing the wrong likelihood function...

I've heard God kills a kitten if you're wrong. Hope this helps or at least provides a smile (no need to give me credit in your thesis though).

7. Re: Consequences of choosing the wrong likelihood function...

Originally Posted by trinker
I've heard God kills a kitten if you're wrong.
uhmm... are you sure? i've only heard of something like that in this context

8. Re: Consequences of choosing the wrong likelihood function...

True but there are many kinds. See LINK

Again please don't quote me there's no need. I'm just glad to provide assistance.

9. Re: Consequences of choosing the wrong likelihood function...

I'll add that I'm certainly hoping the consequences aren't too severe since I'm not entirely convinced that anybody has ever really chosen a 100% correct likelihood ever for any non-trivial data.

10. Re: Consequences of choosing the wrong likelihood function...

Originally Posted by Dason
I'm not entirely convinced that anybody has ever really chosen a 100% correct likelihood ever for any non-trivial data.
i have (i had to add more smileys because i didn't know i needed at least 10 characters in my messages)

11. Re: Consequences of choosing the wrong likelihood function...

Oh really? I'd love to hear about it.

12. Re: Consequences of choosing the wrong likelihood function...

you'll have to give me a cupcake.

i am in possession of THE only dataset ever known to humankind where the correct likelihood was fit to the data...

13. Re: Consequences of choosing the wrong likelihood function...

Spunky: [MATH]\hspace{5in}[/MATH] tricks the box so you can get away with fewer characters and not have it show up. I'll prove it in the next post (now we've truly derailed this thread)

14. Re: Consequences of choosing the wrong likelihood function...

Hi

15. Re: Consequences of choosing the wrong likelihood function...

Originally Posted by trinker
I'll prove it in the next post (now we've truly derailed this thread)
you have proved nothing! :P

ps- meh, i got the answer i needed from Dason & Greta. this post can come down in flames now...

16. Re: Consequences of choosing the wrong likelihood function...

Originally Posted by spunky
(ps- i'll mention y'all who helped me on the dedication section of my thesis. and will get you a virtual cupcake <-- (maybe)
@spunky

And give the cakes to Jake, who is, I believe, a “cookie scientist”.

To bring back this discussion a little bit to the original post I would like to ask this:

What is the meaning of this:

Dason on the Cauchy distribution:
"YOU BETTER LOOK OUT BECAUSE THIS IS SOMETHING THAT IS GOING TO GET YOU"
And what did you mean by this expression (something like this):

“Frequentism used to be cool but then they got a knife in their knee.”
And what does this mean:

“Workin’ for the Raptors”
Does it mean: “Working for the Raptors”?

I understand (nowadays, but I didn't before) that "raptors" are not "eagles and hawks", but rather velociraptors, some kind of ancient dinosaur. All of this can be very confusing for new readers.

(Why don’t you ever use upper case letters in the beginning of a sentence?)

I was googling the expression "Maximum likelihood under a wrong model" and found many links.

Edit:
Thanks for the friendly words! I hope this is not considered confrontational. I am just curious and want to know.