The next assignment is to do problems from Michael Lavine’s book, “Introduction to Statistical Thought”. Please do problems 2, 28, 29 and 36 at the end of Chapter 1 (starts on p. 80). This is due on Thursday, October 4.

Here is a link to the proof that the average of two standard Cauchys is a standard Cauchy; the proof can be generalized to the average of N independent standard Cauchys.
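In case the link goes stale, here is a quick sketch of one standard argument via characteristic functions (the linked proof may proceed differently). The standard Cauchy has characteristic function $e^{-|t|}$, so for independent standard Cauchys $X_1,\dots,X_N$ with average $\bar X = \frac{1}{N}\sum_j X_j$,

```latex
\varphi_{X_j}(t) = E\!\left[e^{itX_j}\right] = e^{-|t|},
\qquad
\varphi_{\bar X}(t) = \prod_{j=1}^{N} \varphi_{X_j}\!\left(\tfrac{t}{N}\right)
= \left(e^{-|t|/N}\right)^{N} = e^{-|t|}.
```

Since $\bar X$ has the same characteristic function as a standard Cauchy, it has the same distribution. In particular, averaging Cauchys does not reduce their spread, which is why the usual law of large numbers fails here.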

I already mentioned Nate Silver’s blog, where he uses statistics (mostly Bayesian) to predict election outcomes. He was interviewed today by Salon.com; the interview offers a lot of insight into sampling and the interpretation of data. Nate has just come out with a book based on his experiences, which looks quite interesting.

Today’s lecture looked at MCMC in general and at Gibbs sampling in particular. Gibbs sampling is used when you can sample from the conditional distributions of some of the parameters given the others. You start somewhere in the sample space and sample some of the parameters conditional on the remaining ones, *replacing* the old values of those parameters with the new draws; then, conditional on this updated set, you sample another group of parameters given the rest of the updated set. You continue in this way until all the parameters have been sampled. The result is that you will have jumped to a new place in the sample space, generating a new sample of all the parameters. You then repeat this procedure as long as you wish, to generate a large number of samples.
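The scheme above can be sketched in code. The class example used R; here is a Python sketch instead (an assumption on my part), for the textbook case of a standard bivariate normal with correlation rho, where each full conditional is itself a one-dimensional normal:

```python
import random
import math

def gibbs_bivariate_normal(rho, n_samples, burn_in=1000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is itself normal, so we can sample it directly:
        x | y ~ Normal(rho * y, 1 - rho^2)
        y | x ~ Normal(rho * x, 1 - rho^2)
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)   # conditional standard deviation
    x, y = 0.0, 0.0                   # arbitrary starting point
    samples = []
    for i in range(n_samples + burn_in):
        x = rng.gauss(rho * y, sd)    # update x given the current y ...
        y = rng.gauss(rho * x, sd)    # ... then y given the *new* x
        if i >= burn_in:              # discard the initial transient
            samples.append((x, y))
    return samples
```

Note the key point from the lecture: the second draw conditions on the *updated* value of x, not the old one. Running the chain long enough, the empirical correlation of the retained (x, y) pairs approaches rho.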

We illustrated this with a very simple (trivial) example on a finite state space; we then derived a sampling scheme for inference on normally distributed data with unknown mean and variance, and were about to discuss an R program for doing the sampling.
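For the normal model, a common derivation (and the one sketched here, which may differ in details from the lecture's) uses the noninformative prior p(mu, sigma^2) proportional to 1/sigma^2, giving a normal full conditional for mu and an inverse-gamma full conditional for sigma^2. Since the R program itself isn't reproduced here, this is a Python sketch under those assumptions:

```python
import random
import math

def gibbs_normal(data, n_samples, burn_in=500, seed=1):
    """Gibbs sampler for Normal(mu, sigma^2) data, assuming the standard
    noninformative prior p(mu, sigma^2) proportional to 1/sigma^2
    (the lecture's choice of prior may differ).  Full conditionals:
        mu      | sigma^2, y ~ Normal(ybar, sigma^2 / n)
        sigma^2 | mu,      y ~ Inverse-Gamma(n/2, sum((y_i - mu)^2) / 2)
    """
    rng = random.Random(seed)
    n = len(data)
    ybar = sum(data) / n
    # start sigma^2 at the sample variance (any positive value works)
    sigma2 = sum((y - ybar) ** 2 for y in data) / n or 1.0
    draws = []
    for i in range(n_samples + burn_in):
        # mu given sigma^2: normal centered on the sample mean
        mu = rng.gauss(ybar, math.sqrt(sigma2 / n))
        # sigma^2 given mu: inverse-gamma, drawn as 1 / Gamma(shape, scale)
        s = sum((y - mu) ** 2 for y in data)
        sigma2 = 1.0 / rng.gammavariate(n / 2.0, 2.0 / s)
        if i >= burn_in:
            draws.append((mu, sigma2))
    return draws
```

With enough data, the posterior draws of mu concentrate near the sample mean and those of sigma^2 near the sample variance, which is a useful sanity check on any such sampler.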