AN02 Activation Functions
Activation Function
The summation function for a particular node combines the inputs from all nodes in the previous layer (a weighted sum) and adds the bias corresponding to that node. The output of the summation function acts as the input to the activation function.
The activation function is one of the most important parts of Artificial Neural Networks (ANNs). It determines the output of a node for a given input. We have a choice of several activation functions, depending on whether the node is in a hidden layer or the output layer. Nodes in the input layer do not have an activation function; they simply pass the input through as it is. The activation function also imparts non-linearity to the ANN, i.e., the network can learn complex non-linear relationships if appropriate activation functions are used.
Summation Function and Activation Function together are also called Transfer Function of the neuron.
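The transfer function described above can be sketched in a few lines of code. This is a minimal illustration, not a library API: the names `transfer`, `x`, `w`, and `b`, and the sample values, are all made up for the example, and a sigmoid (covered below) is used as the activation.

```python
import numpy as np

def transfer(inputs, weights, bias, activation):
    """Transfer function of a neuron: summation followed by activation."""
    z = np.dot(weights, inputs) + bias  # summation function: weighted sum plus bias
    return activation(z)                # activation function applied to the sum

# Illustrative example with a sigmoid activation
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.0, 2.0])  # inputs from the previous layer (example values)
w = np.array([0.4, 0.3, 0.1])   # corresponding weights (example values)
b = 0.2                         # bias of this node (example value)

y = transfer(x, w, b, sigmoid)  # weighted sum is 0.3; output is sigmoid(0.3)
```

Here the weighted sum is 0.4·0.5 + 0.3·(−1.0) + 0.1·2.0 + 0.2 = 0.3, and the node's output is the sigmoid of that sum.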
Commonly Used Activation Functions
1. Step function
2. Signum (sign) function
3. Linear Function
4. ReLU (Rectified Linear Unit) function
5. Leaky ReLU (Leaky Rectified Linear Unit) function
6. tanh function
7. Sigmoid function
8. Softmax function
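The eight functions listed above can all be written in a few lines of NumPy. This is a sketch for reference, using the standard textbook definitions (the leaky-ReLU slope of 0.01 is a common default, not a fixed constant; the max-shift in softmax is a standard trick for numerical stability):

```python
import numpy as np

def step(z):
    # 1 if input is non-negative, else 0
    return np.where(z >= 0, 1.0, 0.0)

def signum(z):
    # +1 for positive input, -1 for negative, 0 at zero
    return np.sign(z)

def linear(z):
    # identity: passes the input through unchanged
    return z

def relu(z):
    # zero for negative input, identity for positive input
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # like ReLU, but with a small slope (alpha) for negative input
    return np.where(z > 0, z, alpha * z)

def tanh(z):
    # hyperbolic tangent, output in (-1, 1)
    return np.tanh(z)

def sigmoid(z):
    # logistic function, output in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    # exponentiate and normalize so outputs sum to 1 (a probability vector);
    # subtracting the max avoids overflow without changing the result
    e = np.exp(z - np.max(z))
    return e / e.sum()
```

Step, signum, sigmoid, tanh, and softmax squash their input into a bounded range, while linear, ReLU, and leaky ReLU are unbounded above; softmax differs from the others in that it operates on a whole vector of node outputs rather than a single value, which is why it is typically reserved for the output layer of a classifier.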