Course talk:CPSC 522/Progressive Neural Network

From UBC Wiki


Peer Feedback

The page is nicely structured, and I love how concise and to the point it is. The diagram really helps with understanding. At times, though, previewing what you are about to discuss, especially in shorter sections, can end up repeating the same idea. But overall, a very nice, informative fundamental page. I'd rate it 18/20.

(5) The topic is relevant for the course.

(5) The writing is clear and the English is good.

(5) The page is written at an appropriate level for CPSC 522 students (where the students have diverse backgrounds).

(5) The formalism (definitions, mathematics) was well chosen to make the page easier to understand.

(5) The abstract is a concise and clear summary.

(4) There were appropriate (original) examples that helped make the topic clear.

(N/A) There was appropriate use of (pseudo-) code.

(5) It had a good coverage of representations, semantics, inference and learning (as appropriate for the topic).

(5) It is correct.

(5) It was neither too short nor too long for the topic.

(5) It was an appropriate unit for a page (it shouldn't be split into different topics or merged with another page).

(5) It links to appropriate other pages in the wiki.

(5) The references and links to external pages are well chosen.

(5) I would recommend this page to someone who wanted to find out about the topic.

(4) This page should be highlighted as an exemplary page for others to emulate.

PeymanBateni (talk) 18:55, 14 March 2020

Peer Feedback

Your page is well structured and the concepts presented are well introduced. I'd rephrase some parts of the introduction to make it easier to read. For example, "This paper will give a short overview of continual learning and the different approaches used in literature to achieve it. The paper will then explore one of these methods, progressive neural networks." could become something like: "This paper will provide a short overview of continual learning and the different approaches used, diving into more detail on the progressive neural networks method." It seems as if you are yet to finish your page; I'd like to see some examples of applications of the algorithm, or where it has been used. I liked your graph; it was very relevant and helpful to my comprehension.

(5) The topic is relevant for the course.

(4) The writing is clear and the English is good.

(5) The page is written at an appropriate level for CPSC 522 students (where the students have diverse backgrounds).

(4) The formalism (definitions, mathematics) was well chosen to make the page easier to understand.

(5) The abstract is a concise and clear summary.

(-) There were appropriate (original) examples that helped make the topic clear.

(-) There was appropriate use of (pseudo-) code.

(4) It had a good coverage of representations, semantics, inference and learning (as appropriate for the topic).

(5) It is correct.

(5) It was neither too short nor too long for the topic.

(5) It was an appropriate unit for a page (it shouldn't be split into different topics or merged with another page).

(5) It links to appropriate other pages in the wiki.

(5) The references and links to external pages are well chosen.

(5) I would recommend this page to someone who wanted to find out about the topic.

(4) This page should be highlighted as an exemplary page for others to emulate.

TommasoDAmico (talk) 01:25, 10 March 2020

I think you need to be much clearer about what the problem is. You know the problem is that we want to learn multiple tasks, where we both want not to forget how to solve earlier tasks and want to use what is learned in previous tasks for subsequent tasks. (Is this correct?) But you don't tell us; we have to guess.

You need to set up the definitions more clearly. E.g., "However, most ANN’s fail to learn a new task without becoming inadequate in the original task". You should add something like "when learning multiple tasks sequentially".

In your definition of "continual learning", are you assuming that the learner knows when the task is changing? Do you really mean "at each time step a system (e.g. a neural network) receives a new sample from a different task"? This seems to imply that the task changes for each sample. Is that correct? Surely then X_t and y_t should be multiple pairs, not one pair. Why do you use {} for pairs, or do you mean sets? I think the setup needs to be explained more clearly in straightforward language.

I found the math impenetrable and unhelpful. Notation should only ever be introduced when it makes the description simpler.

Do progressive neural networks - the topic of this page - degrade performance of previous tasks? (I'm guessing not, but that assumes that we know what the task is, so you should be more explicit about this). If not, it seems like the introduction of the slack variables is an irrelevant detail. You could just say that other methods degrade the performance.

It might be useful in the progressive NN section to say what happens with more than 2 tasks. The formulation looks weird to me. Why would we sum the outputs of the previous layers -- which I think you are doing -- rather than letting the number of input nodes grow linearly for the n-th task? Maybe you need to define the U notation better.
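For concreteness, the summed-lateral-connection formulation that the paragraph above questions (as given in reference [3], Rusu et al., "Progressive Neural Networks") can be sketched roughly as follows. Function and variable names here are my own illustration, not from the page under review:

```python
import numpy as np

def progressive_layer(h_prev_own, h_prev_laterals, W, Us, f=np.tanh):
    """One hidden layer of column k in a progressive network.

    h_prev_own:      activations of layer i-1 in the current column k
    h_prev_laterals: layer i-1 activations from the frozen columns j < k
    W:               within-column weight matrix W_i^(k)
    Us:              lateral adapter matrices U_i^(k:j), one per column j < k
    """
    z = W @ h_prev_own
    for U, h_j in zip(Us, h_prev_laterals):
        # Lateral contributions from earlier columns are summed into the
        # pre-activation, not concatenated onto the input.
        z = z + U @ h_j
    return f(z)
```

Note that summing the U_i^(k:j) h_{i-1}^(j) terms is algebraically the same as concatenating the previous columns' activations and multiplying by the block matrix [U_i^(k:1) | ... | U_i^(k:k-1)], so the "sum" and "growing input" views coincide; only the number of adapter columns grows with the task index.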

"Please note that the notations follow the notations from [3]." Note that this shouldn't mean that can get away without defining the notation. (I can't tell whether you intend that). It would be simpler to say "The notation follows [3]."

DavidPoole (talk) 23:14, 6 March 2020