Suggestion

Hey Junyuan,

I've added a link explaining ReLU. It's a simple function that replaces all negative values with zero: ReLU(x) = max(0, x).
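For concreteness, here is ReLU in a couple of lines of Python (numpy is used purely for illustration):

```python
import numpy as np

def relu(x):
    # Replaces negative values with zero; positive values pass through unchanged.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5])))  # -> [0.  0.  0.  1.5]
```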

The most basic method is to learn the function y = f(x) directly: construct a neural network that approximates f using sample pairs (x, y). This is a neural network with 1 input (x) and 1 output (y). It requires the network to learn the properties of the function from x alone, so to learn a complex function the network needs to be large. But to simplify the network's job, I'm essentially giving it more information in the inputs: the network receives x together with three additional precomputed features and learns an approximation of y from those, so it now has 4 inputs and 1 output. A minimal sketch of both setups follows below.
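Here is a minimal numpy sketch of the two forward passes side by side. The particular extra features (x**2, sin x, cos x) are hypothetical placeholders I picked for illustration, not necessarily the ones used in the article:

```python
import numpy as np

def mlp_forward(inputs, W1, b1, W2, b2):
    # One hidden layer with ReLU, followed by a single linear output unit.
    h = np.maximum(0.0, W1 @ inputs + b1)
    return W2 @ h + b2

rng = np.random.default_rng(0)
hidden = 16
x = 0.5

# Basic setup: 1 input (x) -> 1 output (y).
W1a = rng.normal(size=(hidden, 1)); b1a = np.zeros(hidden)
W2a = rng.normal(size=(1, hidden)); b2a = np.zeros(1)
y_basic = mlp_forward(np.array([x]), W1a, b1a, W2a, b2a)

# Augmented setup: 4 inputs -> 1 output. The extra features are
# placeholder transformations of x, standing in for whatever extra
# information is actually precomputed.
features = np.array([x, x**2, np.sin(x), np.cos(x)])
W1b = rng.normal(size=(hidden, 4)); b1b = np.zeros(hidden)
W2b = rng.normal(size=(1, hidden)); b2b = np.zeros(1)
y_aug = mlp_forward(features, W1b, b1b, W2b, b2b)

print(y_basic, y_aug)  # untrained outputs; the weights would be fit to (x, y) pairs
```

The point of the second setup is that the network no longer has to discover those transformations of x on its own, so a smaller network can reach the same accuracy.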

TianQiChen (talk) 22:34, 20 April 2016