Course talk:CPSC522/Hidden Markov Models


critiques

There's good potential here. Here are my scores, with comments below. Let me know if you want to discuss any of the points.

Scale of 1 to 5, where 1 = strongly disagree and 5 = strongly agree:


  • (5) The topic is relevant for the course.
  • (2) The writing is clear and the English is good.
  • (4) The page is written at an appropriate level for CPSC 522 students (where the students have diverse backgrounds).
  • (2) The formalism (definitions, mathematics) was well chosen to make the page easier to understand.
  • (2) The abstract is a concise and clear summary.
  • (2.5) There were appropriate (original) examples that helped make the topic clear.
  • (3) There was appropriate use of (pseudo-) code.
  • (4) It had a good coverage of representations, semantics, inference and learning (as appropriate for the topic).
  • (5) It is correct.
  • (4) It was neither too short nor too long for the topic.
  • (5) It was an appropriate unit for a page (it shouldn't be split into different topics or merged with another page).
  • (1) It links to appropriate other pages in the wiki.
  • (3) The references and links to external pages are well chosen.
  • (2) I would recommend this page to someone who wanted to find out about the topic.
  • (2) This page should be highlighted as an exemplary page for others to emulate.

Comments:


  • Proofreading is recommended, as there are grammatical errors throughout that make the page harder to read; this is more of a problem given the low text-to-math ratio.
  • Centering the formulas makes the page harder to read. You can indent the formulas by starting a line with ':'.
  • I think we're supposed to use the built-in math tags to enter formulas instead of images.
  • There's a consistent pattern of giving a definition/algorithm before motivating it; this results in a disjointed flow that can be difficult to follow.
  • I didn't see any links to other pages in the wiki.

Abstract:

  • This is more of an introduction than an abstract. The abstract should contain a summary of the page contents, and the introduction currently there would be better placed in the content section.
  • I believe the HMM page is mentioned in the Graphical Models and Bayesian Networks page, so it might be good to have links to those in the "Builds on" section.

Definition of HMM:

  • It feels like the goal is to have as little text as possible. If I knew nothing about HMMs, I'd have difficulty understanding this section. Having more qualitative descriptions, perhaps with figures, would be useful here.
  • "All possible states" and "all possible observations" reads as very universal. You should specify that you're talking about some system that can be in any one of N states, and that at any time, one of M observations can be made; then that you're making observations at T different time steps, so that at each time step the system is in state i_t with observation o_t.
  • If someone doesn't know what a state transition probability matrix is, then they also wouldn't know why they need one. As before, it would be good to motivate its existence by talking about how, when the system is in state i, it can transition into state j with some probability p, and how it's useful to collect these probabilities into a matrix that we call...
  • Similarly for the output matrix and initial state vector: motivate/describe them before you give the math (I've sketched one possible way to set this up after this list).
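
For concreteness, here's a rough sketch of the kind of setup I mean, in (roughly) the standard Rabiner-style notation; the symbol choices q_t and o_t for the state and observation at time t are mine, not necessarily the page's:

```latex
% Illustrative HMM notation (a suggestion, not the page's own):
% N possible states, M possible observation symbols.
\begin{align*}
S &= \{s_1, \dots, s_N\} && \text{the } N \text{ possible states} \\
V &= \{v_1, \dots, v_M\} && \text{the } M \text{ possible observations} \\
A &= [a_{ij}], \quad a_{ij} = P(q_{t+1} = s_j \mid q_t = s_i) && \text{state transition probability matrix} \\
B &= [b_j(k)], \quad b_j(k) = P(o_t = v_k \mid q_t = s_j) && \text{output (emission) probability matrix} \\
\pi &= [\pi_i], \quad \pi_i = P(q_1 = s_i) && \text{initial state vector}
\end{align*}
```

Introduced in this order (the system first, then its states and observations, then the probabilities collected into matrices), each object is motivated before its math appears.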

Example:

  • The example you chose is a good one, as it can be used to illustrate the concepts very well.
  • The description of the rules would be better put as a caption to a state transition diagram (as a figure).
  • Tables 2 and 3 should be explicitly mentioned in the text of the example; a reader wouldn't know to look for them and might easily get confused.

Three core problems of HMM:

  • The hierarchy of section headings seems to break down in here, so it's hard to tell what is a subsection of what.
  • You should mention that the forward-backward algorithm consists of a forward procedure and a backward procedure, so that people expect those headings.
  • It would be very useful to continue the ball example through this section so that people have an example of how the calculations are carried out (I've sketched what I mean below this list).
  • You discuss an Expectation Maximization algorithm, which is what I expect to see next, but instead the Baum-Welch algorithm is given, which has not yet been mentioned; it would help to note that Baum-Welch is EM specialized to HMMs.
  • "Viterbi", not "Veterbi".
JordonJohnson (talk)04:53, 5 February 2016

Hi Jordon,

Thank you for your detailed critique; it really helps me a lot in improving my page. I have fixed the issues you pointed out. If you have further suggestions, please let me know. Thanks a lot!

Best regards, Jiahong Chen

JiahongChen (talk)08:23, 9 February 2016
 

suggestions

Hi Chen,

Thanks for your informative page and great examples.

I have a few suggestions:

1. The HMM image is very small. It would be better to make it bigger, and to explain the image and the fundamental concepts of HMMs in more depth than the page currently does.

2. There are a few typos in the page.

3. There are lots of algorithms, and I do not think we need all of them to explain HMMs on a wiki page. Also, you have not explained these algorithms enough. I would prefer to read fewer algorithms with more discussion of each, rather than many algorithms with little explanation.


Cheers,

Bahare

BahareFatemi (talk)05:20, 3 February 2016

Hi Bahare,

Thank you! I have increased the size of the image and fixed some typos. I think the algorithms are an important component of solving the three fundamental problems of HMMs, so I want to keep them. I have added some explanation of the algorithms to the page, which I think will make them easier to understand. Thank you for your kind critique!

Best regards, Jiahong Chen

JiahongChen (talk)08:20, 9 February 2016
 

Some Suggestions Regarding Hidden Markov Models

Hi Chen,

It was nice to read your wiki page about Hidden Markov Models; it gave me a sound understanding of HMMs from the basics through the details. I have some questions about the page, listed as follows:

1. Since you mention in the introduction that HMMs have lots of applications in industry, can you describe a detailed example of one such application as an illustration?

2. Regarding the core problems of HMMs, can you elaborate on why probability calculation, learning, and prediction are the core problems, and in which way they affect HMMs?

3. Can you explain your algorithms together with their corresponding letters in detail? It is a little bit hard to understand the math formulas otherwise.

Thank you,
Arthur

BaoSun (talk)00:48, 2 February 2016

Hi Arthur,

Thank you for your kind critique! I have added an application of HMMs to my wiki page. As for the letters used in the algorithms, I think I have defined all of them in the algorithm descriptions and in the HMM properties section. Could you point out which parts were unclear? I will fix them. Thank you!

Best regards, Jiahong Chen

JiahongChen (talk)08:16, 9 February 2016