STAT 330 September 25, 2012

Today’s class centered on likelihoods. First we looked at the Cauchy distribution. I asserted without proof that the average of N (standard) Cauchy-distributed random variables has the same distribution as a single standard Cauchy random variable, so that taking averages doesn’t improve our estimates. The likelihood does, however, get more and more peaked as we get more and more data. We looked at the likelihood for samples of 1, 2, 3 and more Cauchy observations. We noted that with 2 observations we often get a “two-peaked” likelihood, and with 3 we sometimes see odd bumps on the likelihood. These quiet down as more data are obtained.
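The no-improvement claim is easy to check numerically. Here is a quick sketch (my own, not from the lecture; the function name is made up) that samples standard Cauchy variables by inverse-CDF, F⁻¹(u) = tan(π(u − ½)), and estimates the interquartile range of the sample mean. For a standard Cauchy the quartiles are ±1, so the IQR is 2, and it stays near 2 no matter how many draws we average:

```python
import math
import random
import statistics

random.seed(0)

def cauchy_sample_mean_iqr(n, trials=2000):
    """Interquartile range of the mean of n standard Cauchy draws,
    estimated from `trials` repetitions."""
    means = []
    for _ in range(trials):
        # inverse-CDF sampling: F^{-1}(u) = tan(pi * (u - 1/2))
        draws = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(n)]
        means.append(sum(draws) / n)
    q1, _, q3 = statistics.quantiles(means, n=4)
    return q3 - q1

for n in (1, 10, 100):
    # the IQR hovers around 2 for every n: averaging doesn't help
    print(n, round(cauchy_sample_mean_iqr(n), 2))
```

The IQR is used instead of the standard deviation because a Cauchy variable has no finite variance, so the sample standard deviation would itself fail to settle down.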

I then stated the Likelihood Principle and discussed the fact that it is a consequence of the Sufficiency Principle and the Conditionality Principle, both of which seem unremarkable. Yet the Likelihood Principle is quite controversial, and many frequentist procedures violate it. Bayesian procedures never violate it, because in Bayesian inference the information about the parameters contained in the data enters only through the likelihood, which the Bayesian mantra (posterior ∝ prior × likelihood) automatically uses.
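As a small illustration of that mantra (my own example, not from the class), here is a grid approximation to the posterior for the location x0 of a unit-scale Cauchy under a flat prior, where the posterior is just the normalized likelihood. With two well-separated observations it also reproduces the two-peaked shape noted earlier:

```python
import math

def cauchy_pdf(x, x0):
    """Unit-scale Cauchy density with location x0."""
    return 1.0 / (math.pi * (1.0 + (x - x0) ** 2))

def grid_posterior(data, grid):
    """Posterior over grid points: prior x likelihood, normalized.
    A flat prior is used, so the posterior is proportional to the likelihood."""
    post = []
    for x0 in grid:
        like = 1.0
        for x in data:
            like *= cauchy_pdf(x, x0)
        post.append(like)
    total = sum(post)
    return [p / total for p in post]

data = [-1.0, 4.0]                          # two observations, far apart
grid = [i * 0.1 for i in range(-50, 101)]   # x0 candidates from -5 to 10
post = grid_posterior(data, grid)
```

Plotting `grid` against `post` shows two peaks, one near each observation, with a dip in between, exactly the bimodal likelihood seen with two Cauchy observations.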

2 Responses to “STAT 330 September 25, 2012”

1. Kalil Says:

Hi professor, two questions:

– First, I think you said that you have the proof that the sample average’s distribution is Cauchy (so the average doesn’t improve the estimate); if you do have it, can you upload it?
– And second, in chart 60 the likelihood function is defined as L(x0|{x1,…,xn}). Is that correct, or should it be L({x1,…,xn}|x0)?

Regards

• bayesrules Says:

I really should have used a ; instead of a | in writing the likelihood, e.g., L(x0; {x1, x2, …, xn}). Usually the likelihood has the data on the right and the parameters on the left.

I’ll make a pdf of the proof and post it.