Critique 3

Comments:

Overall, well done. I felt that you explained the material well and the examples were useful. I will mention some grammar errors:

  • Decision Trees are tree-like structures that depict different possible decisions that can be made and their outcome for a given problem.
  • This page gives an overview about decision trees, their building blocks, and learning a decision tree.
  • Decision trees, as the name suggests, are particularly
  • it to make a decision for any new data set
  • Influence diagrams and Decision Analysis tools are related to Decision trees
  • The graph inconsistently uses "Get's" instead of "Gets"
  • The paragraph: "But in order to make decisions on the basis of a decision tree, we first need to construct one. But before we construct a decision tree we need to keep in mind what kind of tree we want. A number of decision trees can be generated based on the input features of the historical data. But, trees with minimum height are preferred" uses the word "but" too often

There were also a couple of points that I think could be clearer:

  • In the algorithm shown, how is the pick happening? Is it random?
  • "Thus, to make a decision tree one can search the entire space for the smallest decision tree." Where do you get the search space from? must you create all the trees first?
  • The word "outlook" isn't used in weather much, they tend to call it the "forecast".
Scheme:
  • The topic is relevant for the course. 5
  • The writing is clear and the English is good. 3
  • The page is written at an appropriate level for CPSC 522 students (where the students have diverse backgrounds). 5
  • The formalism (definitions, mathematics) was well chosen to make the page easier to understand. 4
  • The abstract is a concise and clear summary. 5
  • There were appropriate (original) examples that helped make the topic clear. 5
  • There was appropriate use of (pseudo-) code. 4
  • It had a good coverage of representations, semantics, inference and learning (as appropriate for the topic). 5
  • It is correct. 5
  • It was neither too short nor too long for the topic. 5
  • It was an appropriate unit for a page (it shouldn't be split into different topics or merged with another page). 5
  • It links to appropriate other pages in the wiki. 5
  • The references and links to external pages are well chosen. 5
  • I would recommend this page to someone who wanted to find out about the topic. 4
  • This page should be highlighted as an exemplary page for others to emulate. 4

If I were grading it out of 20, I would give it 17.

JocelynMinns (talk) 21:10, 7 February 2018

Hi JocelynMinns,

I have made changes to address the mistakes you pointed out. Here are my answers to your questions:

  • In the algorithm shown, how is the pick happening? Is it random?

-- Yes, here the algorithm randomly picks a condition on which to split the input data samples.
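
For instance, a minimal Python sketch of that random pick might look like the following (the function name pick_condition and the toy feature names are only illustrative, not the code from the page):

    import random

    def pick_condition(examples, features):
        # Randomly choose one input feature (condition) to split on.
        feature = random.choice(features)
        # Partition the data samples by the value they take on that feature.
        partitions = {}
        for example in examples:
            partitions.setdefault(example[feature], []).append(example)
        return feature, partitions

    # Toy usage with a weather-style data set.
    data = [{"forecast": "sunny", "play": "yes"},
            {"forecast": "rain", "play": "no"},
            {"forecast": "sunny", "play": "yes"}]
    print(pick_condition(data, ["forecast"]))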

  • "Thus, to make a decision tree one can search the entire space for the smallest decision tree." Where do you get the search space from? must you create all the trees first?

-- Yes, the naive way is to search the entire space of decision trees by constructing all of them and then picking the smallest one. But since that space grows exponentially, we instead search greedily while minimizing the error. A description of this greedy search algorithm is provided on my wiki page.
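
To make the greedy idea concrete, here is a rough Python sketch (assuming "error" means misclassification error; the helper names are hypothetical and not taken from the page). It evaluates each candidate split once and keeps the one with the lowest weighted error, instead of building every possible tree:

    def misclassification_error(examples, target):
        # Fraction of examples whose label differs from the majority label.
        labels = [e[target] for e in examples]
        majority = max(set(labels), key=labels.count)
        return sum(1 for label in labels if label != majority) / len(labels)

    def greedy_pick(examples, features, target):
        # Greedily pick the single split with the lowest weighted error,
        # rather than enumerating the exponentially many possible trees.
        best_feature, best_error = None, float("inf")
        for feature in features:
            partitions = {}
            for e in examples:
                partitions.setdefault(e[feature], []).append(e)
            error = sum(len(p) / len(examples) * misclassification_error(p, target)
                        for p in partitions.values())
            if error < best_error:
                best_feature, best_error = feature, error
        return best_feature, best_error

    # On the toy data set from the previous sketch,
    # greedy_pick(data, ["forecast"], "play") returns ("forecast", 0.0).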

Thanks, Ekta Aggarwal

EktaAggarwal (talk) 21:25, 8 February 2018