Course talk:CPSC522/Regularization for Neural Networks

Contents

Thread title                                         Replies   Last modified
Critique                                             1         07:47, 21 April 2016
Comments and critique                                1         03:03, 21 April 2016
Suggestion                                           1         22:34, 20 April 2016
Suggestions for Regularization for Neural Networks   2         00:46, 19 April 2016

Critique

Hi Ricky,

Good job. Your page is concise and clear, and the layout is good. I learned a lot from your page, and I really like the pictures you put in. But the Analysis section seems confusing: you'd better add a separate Conclusion section, or just rename Analysis to Conclusion. The Background section should also be renamed to a Builds On section.

Sincerely,

Ke Dai

KeDai (talk) 06:27, 21 April 2016

True, the analysis is scattered around the wiki page. I've changed that section header to Conclusion.

TianQiChen (talk) 07:04, 21 April 2016
 

Comments and critique

Hi Ricky,

Firstly, let me thank you for your project and your contribution to the CPSC 522 wiki pages. I like your page and find your project topic interesting. I'd like to share my opinion and give you my comments; maybe they will help you.

1) First of all, you can treat this assignment's wiki page as a research paper and include the usual, common sections, such as abstract, motivation, introduction, method, evaluation, results, conclusion, future work, and acknowledgements.

2) Some of the links should use just a single word as the clickable blue pointer; in some parts of your page there is a long blue link, which looks a bit unusual.

3) Finally, as I mentioned in the first point above, there should be subsections such as conclusion, results, or data, so that a reader who wants to look at just one specific aspect of the page can do so as efficiently as possible.

Thanks for your page; it was a pleasure to read, and I hope these comments help you improve your contribution.

Mehrdad Ghomi

MehrdadGhomi (talk) 02:40, 21 April 2016

Hey Mehrdad,

1) Which section is missing? I currently have the following:

1 Hypothesis
2 Abstract
3 Method Descriptions
4 Experiments
5 Analysis
6 References

Given the structure of my hypothesis, I simply put the conclusion into the analysis section. I can add a future work section though.

2) What are the "some parts"? I found one long hyperlink and changed it to a reference, but I couldn't find any others. Thanks!

3) What subsections are missing? The results are big PNG images, and the data is available for reproducing the results. All the links are in the table of contents.

TianQiChen (talk) 02:58, 21 April 2016
 

Suggestion

Hi Ricky,

This page looks good! Here are a few suggestions:

1. Some terminology used on this page is a little hard to understand, for instance, ReLU activation functions. I hope you can add more external links, so people do not need to google everything by themselves.

2. I can't understand why the input for each sample is the vector ; would you mind explaining this to me?

The experiment results look good!

Sincerely,

Junyuan Zheng

JunyuanZheng (talk) 18:47, 19 April 2016

Hey Junyuan,

I've added a link explaining ReLU. It's a simple function that removes all negative values: ReLU(x) = max(0,x).
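
In code, that is just the following (a minimal NumPy sketch of the max(0, x) definition above, written for this reply rather than taken from the page):

    import numpy as np

    def relu(x):
        # Keep positive values; replace every negative value with 0.
        return np.maximum(0, x)

    print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # [0.  0.  0.  1.5]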

The most basic method to learn a function y = f(x) is to construct a neural network that approximates f using x alone. This is a neural network with 1 input (x) and 1 output (y). This requires the neural network to learn properties of the function from only x. To learn a complex function, the network needs to be large. But to simplify the network's job, I'm essentially giving it more information in the inputs. So the neural network learns an approximation over that input vector instead, so it now has 4 inputs and 1 output.
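
Roughly, the idea looks like this (a sketch under my own assumptions: the target function, network sizes, and especially the four input features are made-up placeholders, not the ones used on the page):

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    x = rng.uniform(-3.0, 3.0, size=(1000, 1))
    y = np.sin(3.0 * x).ravel()  # stand-in for the target function y = f(x)

    # 1 input (x) -> 1 output (y): the network must learn everything from x alone.
    plain_net = MLPRegressor(hidden_layer_sizes=(64, 64), activation="relu", max_iter=2000)
    plain_net.fit(x, y)

    # 4 inputs -> 1 output: extra hand-computed features simplify the network's job.
    # These particular features are illustrative guesses only.
    x_aug = np.hstack([x, x ** 2, np.sin(x), np.cos(x)])
    aug_net = MLPRegressor(hidden_layer_sizes=(64, 64), activation="relu", max_iter=2000)
    aug_net.fit(x_aug, y)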

TianQiChen (talk) 22:34, 20 April 2016
 

Suggestions for Regularization for Neural Networks

Hi Ricky, Everything I've read on your page so far gives me a good background on the methods that you'll be comparing against L2 regularization. But your actual description of the problem, the experiments, and the analysis of the results seem incomplete; I'm guessing you'll be putting more material on the page soon.
Just as a side note, under your L2-regularization section, you might want to look at the math formula. It is not rendered properly.
Thanks,
Ritika
PS: Our critiques are due tomorrow (i.e., the 19th), right?

RitikaJain (talk) 23:45, 18 April 2016

Right. Still working on writing down the experiment results.

Thanks for the heads up on that math equation.
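
It's meant to be the usual L2 penalty added to the training loss, something along the lines of:

    L(\theta) = \text{loss}(\theta) + \frac{\lambda}{2} \lVert \theta \rVert_2^2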

The critiques are due on the 20th, I think.

TianQiChen (talk) 00:08, 19 April 2016

Oh lol okay. I had it in mind that they were due tomorrow and I was getting all worked up. :P
Good luck with the writing!

RitikaJain (talk) 00:46, 19 April 2016