Critique

Hi Mehrdad,

A solid first draft! Fairly easy to read, and the overall flow is reasonable.

General comments:

  • As the page is currently organized, it's not intuitive where the two papers come in, or which parts of the content come from which paper. It also isn't clearly stated what exactly the second paper contributes.
  • For your references, you should use reference tags instead of superscript tags. That makes the references much easier to maintain (the wiki does it for you).
  • Some of your equations are done using images instead of math tags; it's better to consistently use the math tags to fit the assignment specification.

Section-specific comments:

  • Abstract
    • There seems to be some overlap between your abstract and background knowledge sections. I'd recommend removing the background knowledge part from the abstract and just cover it in a "Builds on" section, as per the page template.
    • Some parts of the abstract feel more like an introduction than a summary of the content.
    • There is some redundancy (e.g., the conditional model is referenced twice).
  • HMMs
    • Markov networks are undirected, so HMMs are more like Bayes networks with Markov properties.
  • Maximum Entropy
    • The writing style is a bit confusing here. For example, the first sentence seems to say that maximum entropy models have entropy, which is obvious; and it takes until after the part in parentheses to realize that the sentence was actually saying something else.
  • Skip-chain Models
    • I see that the images are credited to the proper source if you click on them, but I'd recommend crediting them directly in the page as well. You can see how figures are handled in the Learning Markov Logic Network Structure page for an example of what I mean, and the help pages in the UBC wiki give different options for how to do it.
    • I was able to glean what is from the figure, but it would be good to state it explicitly.
  • Motivation
    • Why do we want to model the posterior? Can you give an example?
  • Model and Algorithm
    • What is ?
    • What I'm getting from this section is that MEMMs are HMMs where the transition probabilities are learned from a set of features applied to a maximum entropy model. Is that correct?
  • Other Alternatives
    • A bullet list would be more readable here.
    • "Here we present its model and we provide a specific task and analyze its performance on it." Are you providing the task and analyzing the performance? Or are you summarizing the authors' work in providing the task and analyzing the performance? If this is a quote from the paper, it needs to be quoted and cited.
  • MoP-MEMM
    • It would be good to explicitly state the new terms and symbols used in the equations
    • I'm having trouble understanding how this would be useful (i.e. how it solves the problems you mentioned earlier).
  • The Task and Results
    • Which authors?
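For reference, the HMM and MEMM points above (particularly the question about transition probabilities in "Model and Algorithm") correspond to the standard textbook factorizations. This is a hedged sketch using assumed notation (y_t for hidden states/labels, x_t for observations, f_k for feature functions, λ_k for learned weights), not equations taken from the page under review:

```latex
% HMM: a *directed* generative model over states and observations
P(x_{1:T}, y_{1:T}) = \prod_{t=1}^{T} P(y_t \mid y_{t-1})\, P(x_t \mid y_t)

% MEMM: a conditional model in which each transition is a maximum
% entropy (log-linear) distribution over the next state given features
P(y_{1:T} \mid x_{1:T}) = \prod_{t=1}^{T} P(y_t \mid y_{t-1}, x_t),
\qquad
P(y_t \mid y_{t-1}, x_t)
  = \frac{\exp\!\left(\sum_k \lambda_k f_k(x_t, y_t)\right)}{Z(x_t, y_{t-1})}
```

In this reading, the MEMM does replace the HMM's generative transition and emission distributions with per-state maximum entropy classifiers conditioned on features of the observation, which matches the interpretation asked about above.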

Clear skies,
Jordon

JordonJohnson (talk) 21:39, 10 March 2016

Hi Jordon,

Thank you for your very insightful feedback. I will work on my page and solve the problems you mentioned. I will rearrange the page; I had thought we should read the two papers and develop a page about the concept in the same format as the previous page we developed, so rearranging will help a lot. I will convert the images to math format, add some citations, and remove all the duplication and redundancy. I will also explain in more detail which paper contributed which parts, and add a legend of sorts for the formulas. Thank you again!

Cheers,

Mehrdad Ghomi

MehrdadGhomi (talk) 21:01, 12 March 2016