Stat 330 September 20, 2012

A student asked me to post the hidden charts 36-40 from today’s lecture. Here they are.

I have posted the next Chart Set, on MCMC.

We looked at the problem of measuring the bias of a coin that is not fair, given that we have observed some number h of heads and t of tails. Again, we need to identify the states of nature. The states of nature are always things that you do not know but would like to know. We know how many heads and tails we observed, so those cannot be the states of nature. What we do not know is the bias b of the coin, so the possible values of b, between 0 and 1, are the states of nature. For a prior we chose one that is uniform in the bias, that is, all values of the bias are equally probable, though we considered the possibility of a triangular or bell-shaped prior as well. For the likelihood: since the tosses are independent, the likelihood is proportional to b^h(1-b)^t. If we know the exact sequence of heads and tails, that is the probability of the particular sequence we observed; if we only know the number of heads and tails, then there is also a binomial coefficient C(h+t, h) as a factor. However, since this binomial coefficient does not depend on b, the two likelihoods differ only by a constant factor and we can use either one.
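As a concrete illustration (not from the lecture), here is a minimal Python sketch of the two likelihoods, using made-up counts h = 7 and t = 3; it confirms they differ only by the constant binomial coefficient:

```python
from math import comb

def likelihood_sequence(b, h, t):
    """Probability of one particular sequence with h heads and t tails."""
    return b**h * (1 - b)**t

def likelihood_counts(b, h, t):
    """Probability of h heads in h + t tosses (binomial)."""
    return comb(h + t, h) * b**h * (1 - b)**t

# The ratio is comb(h + t, h), a constant that does not depend on b,
# so either function can serve as the likelihood.
print(likelihood_counts(0.6, 7, 3) / likelihood_sequence(0.6, 7, 3))  # == comb(10, 7) == 120
```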

Then the Bayesian mantra: posterior is proportional to prior times likelihood. We’d have to normalize by dividing this product by its integral from 0 to 1. That’s the posterior distribution. See Chart #40 in the supplemental chart set you can download above.
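Here is a minimal sketch of that computation in Python, approximating the normalizing integral on a grid over [0, 1]; the counts h = 7 and t = 3 are hypothetical:

```python
import numpy as np

h, t = 7, 3                      # hypothetical observed counts
b = np.linspace(0, 1, 1001)      # grid over the states of nature
db = b[1] - b[0]                 # grid spacing

prior = np.ones_like(b)          # uniform prior in the bias
like = b**h * (1 - b)**t         # likelihood, up to a constant

unnorm = prior * like                       # prior times likelihood
posterior = unnorm / (unnorm.sum() * db)    # divide by the integral over [0, 1]

print(posterior.sum() * db)      # ~1.0: the posterior is a proper density
```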

We talked about various summaries of the posterior. As a point summary, the mean is superior to the mode (Chart #43).
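With the uniform prior, the posterior is a Beta(h+1, t+1) distribution (see the conjugate-prior discussion below), so both point summaries have simple closed forms; a quick check with the same hypothetical counts:

```python
h, t = 7, 3                     # hypothetical counts, as above

# With the uniform prior the posterior is Beta(h + 1, t + 1), so:
mean = (h + 1) / (h + t + 2)    # posterior mean (Laplace's rule of succession)
mode = h / (h + t)              # posterior mode (equals the MLE here)

print(mean)   # 0.666...
print(mode)   # 0.7
```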

We then showed how to get the same results by simulation: draw a sample from the posterior distribution and summarize that sample. As the data become more numerous, the posterior concentrates and the estimates become more accurate.
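A minimal simulation sketch, assuming the uniform prior (so the posterior is Beta(h+1, t+1), which numpy can sample directly) and the same hypothetical counts:

```python
import numpy as np

rng = np.random.default_rng(0)
h, t = 7, 3                     # hypothetical counts

# Sample from the Beta(h + 1, t + 1) posterior.
draws = rng.beta(h + 1, t + 1, size=100_000)

print(draws.mean())                          # ~0.667, close to the posterior mean
print(np.percentile(draws, [2.5, 97.5]))     # a 95% credible interval for b
```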

We also showed how you can use a beta prior with the binomial likelihood to get a beta posterior, thus staying within the class of beta functions (“conjugate priors”); and how you can use the beta posterior as a prior with a new set of data to get an even more accurate result. We also pointed out that the rules of probability theory (when you explicitly consider the stuff to the right of the conditioning bar) don’t let you “cheat” by using the same data twice!
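A sketch of the conjugate updating, with hypothetical counts for the two data sets; note that updating in two steps gives the same beta posterior as pooling all the data at once:

```python
# Conjugate beta updating: a Beta(a, b) prior combined with h heads and
# t tails yields a Beta(a + h, b + t) posterior.
a, b = 1, 1                 # Beta(1, 1) is the uniform prior

# First data set (hypothetical)
h1, t1 = 7, 3
a, b = a + h1, b + t1       # posterior is Beta(8, 4)

# Use that posterior as the prior for a second data set (hypothetical)
h2, t2 = 12, 8
a, b = a + h2, b + t2       # posterior is Beta(20, 12)

# Same answer as pooling all 19 heads and 11 tails at once:
# Beta(1 + 19, 1 + 11) = Beta(20, 12).
print(a, b)                 # 20 12
```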

Finally, I introduced the Cauchy distribution and stated that the mean of N Cauchy-distributed random variables has the same Cauchy distribution as a single observation. Taking averages of Cauchy-distributed data does not improve things; the average is no better than a single observation. So averaging is not how we should approach data that happen to follow a Cauchy distribution.
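A small simulation illustrates this; since the Cauchy distribution has no finite variance, the code compares interquartile ranges rather than standard deviations:

```python
import numpy as np

rng = np.random.default_rng(0)
N, trials = 100, 10_000

singles = rng.standard_cauchy(trials)                   # single observations
means = rng.standard_cauchy((trials, N)).mean(axis=1)   # means of N observations

# For a standard Cauchy the quartiles are at -1 and +1, so the IQR is 2.
# The IQR of the means is the same: averaging buys nothing.
q = [25, 75]
print(np.diff(np.percentile(singles, q)))   # ~2
print(np.diff(np.percentile(means, q)))     # also ~2
```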


One Response to “Stat 330 September 20, 2012”

  1. Ahmed Says:

    Thank you for posting the notes Professor Jefferys!
    -Ahmed
