Course talk:CPSC522/MCMC


Some suggestions

Hi Ricky,

This page is very interesting, and here are some suggestions:

1. I think your page is missing the "builds on" section.

2. The Monte Carlo integration example is a little hard to understand. The idea itself is simple, so maybe you can borrow from the Wikipedia page: https://en.wikipedia.org/wiki/Monte_Carlo_integration#Example; their example is more concise (a quick sketch of that kind of example is below, after point 3).

3. I have the same feeling as Arthur; maybe look at this handout: http://galton.uchicago.edu/~eichler/stat24600/Handouts/l12.pdf. I don't think you need to spend so much time proving the correctness of the Metropolis-Hastings algorithm; it would be better to give some examples, like the one in the link.
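
For instance, the kind of concise example I have in mind is the usual Monte Carlo estimate of pi. Here is a rough sketch just to illustrate the style (the sample count is arbitrary, and this is not code from your page):

    import random

    def estimate_pi(n_samples=100_000):
        # The fraction of uniform points in the unit square that land inside
        # the quarter circle approximates pi / 4.
        inside = 0
        for _ in range(n_samples):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                inside += 1
        return 4.0 * inside / n_samples

    print(estimate_pi())  # close to 3.14159 for large n_samples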

Sincerely,

Junyuan Zheng

JunyuanZheng (talk) 06:34, 4 February 2016

Hi Junyuan,

1. I changed the wording to Background.

2. Weird... Mine is the exact same example, but definitely with more explanation...

3. Thanks! I'll take a look at the pdf and see if there's anything I can add. :)

Thanks for the feedback,

Ricky

TianQiChen (talk) 03:24, 11 February 2016
 

Some suggestions

Hi Ricky,

Thanks for your informative page; I learned a lot from it.

Just a few things that came to mind about your page:

1. The abstract is too short and is not a clear summary of the wiki page.

2. You explain MCMC from a machine learning point of view rather than an artificial intelligence one. I expected the page to be framed around graphical models, for example taking a graph with large treewidth and showing how to do approximate inference on it using MCMC approaches.

3. A few references are never cited in the text. Also, some parts, such as the algorithms, need citations.


Cheers,

Bahare

BahareFatemi (talk) 04:08, 3 February 2016

Hi Bahare,

1. I'll update the abstract to contain more specifics.

2. True, but covering graphical models properly would probably require another page of its own. That said, the Gibbs sampler I describe is the most basic MCMC tool used for sampling from graphical models (a tiny sketch of the basic update is below, after point 3).

3. I'll add more citations to some specific claims.
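
For intuition, a Gibbs sweep just alternates draws from each variable's full conditional. Here is a rough sketch, using a standard bivariate normal only because its conditionals have a simple closed form (the correlation and sample counts are arbitrary, and this is not the graphical-model example from the page):

    import random

    def gibbs_bivariate_normal(rho=0.8, n_samples=5000, burn_in=500):
        # Full conditionals of a standard bivariate normal:
        #   x | y ~ N(rho * y, 1 - rho^2),  y | x ~ N(rho * x, 1 - rho^2)
        sd = (1.0 - rho * rho) ** 0.5
        x, y = 0.0, 0.0
        samples = []
        for i in range(n_samples + burn_in):
            x = random.gauss(rho * y, sd)
            y = random.gauss(rho * x, sd)
            if i >= burn_in:
                samples.append((x, y))
        return samples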

Thanks,

Ricky

TianQiChen (talk) 03:16, 11 February 2016
 

Some suggestions regarding MCMC

Hi Ricky,

Nice to read your wiki about MCMC; it helped me a lot in understanding the relationship between Markov chains and Monte Carlo methods. I have a few questions about your wiki:

1. You put a large paragraph on the proof of correctness of the MH algorithm; is there a particular reason for that?

2. Can you explain more about how to choose between different proposal distributions and their respective advantages and disadvantages?

3. You mention in the applications that MCMC can be applied in any context where the density of interest is too complex for simple sampling methods. Can you elaborate more on your Gibbs sampling example? I don't see the advantage of applying MCMC to that model over other approaches.

Thanks,

Arthur

BaoSun (talk) 02:52, 2 February 2016

Hey Arthur,

1. The proof of correctness is essential for any MCMC algorithm, as it is the reason the law of large numbers can be applied to the chain's samples. Generally, any new variant of an MCMC algorithm must be accompanied by such a proof. I'm just showing the basic idea behind why Metropolis-Hastings is correct (the key condition is written out after point 3).

2. Sure, although there really isn't a general rule beyond using whatever densities are available to you.

3. That is better suited to a "problems" section for basic Monte Carlo methods, like rejection sampling, but let me see if I can add some more intuition.
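
For reference, the key condition is detailed balance. Using the usual notation (target \pi, proposal q), which may not match the exact symbols on the page, the Metropolis-Hastings acceptance probability

    \alpha(x \to x') = \min\left(1, \frac{\pi(x')\, q(x \mid x')}{\pi(x)\, q(x' \mid x)}\right)

is chosen exactly so that

    \pi(x)\, q(x' \mid x)\, \alpha(x \to x') = \pi(x')\, q(x \mid x')\, \alpha(x' \to x)

holds for all x and x', and detailed balance in turn implies that \pi is a stationary distribution of the chain.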

Thanks for the feedback,

Ricky

TianQiChen (talk) 03:13, 11 February 2016