The Activation Function: Let us continue from where we left off in the previous section: the basic structure of a single neuron. So far we have learned that the input layer feeds values into the neuron (green circle), and that the neuron combines those input values and produces a final output value. But what exactly happens inside the neuron is important to understand.
The Processing inside the Neuron: Once values are fed into the neuron, it calculates the weighted sum of the input values. As we can see, the neuron (green circle) computes the weighted sum of all the input variables and then applies the chosen activation function to that sum. Below is a list of the most common activation functions used in ANNs:
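The weighted sum described above can be sketched in a few lines of NumPy. The input values and weights below are made-up illustrative numbers, not values from the figure:

```python
import numpy as np

# Hypothetical example: three inputs from the input layer,
# each with its own weight inside the neuron.
x = np.array([0.5, -1.2, 3.0])   # input values
w = np.array([0.4, 0.1, -0.7])   # one weight per input

# The neuron's first step: the weighted sum of its inputs.
z = np.dot(w, x)   # 0.5*0.4 + (-1.2)*0.1 + 3.0*(-0.7) = -2.02
print(z)
```

The activation function is then applied to `z` to produce the neuron's output.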
- Threshold function: On the X-axis we have the weighted sum of the inputs, and on the Y-axis we have the output, which is either 0 or 1. The threshold function is very simple: it outputs 1 when the weighted sum is greater than a certain threshold value; otherwise the output is 0.
- Sigmoid function: A sigmoid function is a mathematical function with a characteristic "S"-shaped curve, or sigmoid curve. Often, "sigmoid function" refers to the special case of the logistic function, σ(x) = 1 / (1 + e^(-x)). Here x is the weighted sum of the input values.
- Rectifier function: In the context of artificial neural networks, the rectifier (also called ReLU, for rectified linear unit) is an activation function defined as the positive part of its argument: f(x) = max(0, x).
- Hyperbolic tangent function: Like the logistic sigmoid, the tanh function is sigmoidal ("s"-shaped), but its output values range over (-1, 1). Thus strongly negative inputs to tanh map to strongly negative outputs, and only zero-valued inputs are mapped to near-zero outputs.
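The four activation functions listed above can be written directly from their definitions. This is a minimal sketch (the function names and the threshold default of 0 are choices for illustration, not from the original text):

```python
import numpy as np

def threshold(z, theta=0.0):
    """Outputs 1 when the weighted sum exceeds the threshold, else 0."""
    return np.where(z > theta, 1.0, 0.0)

def sigmoid(z):
    """Logistic sigmoid: maps any z into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def rectifier(z):
    """Rectifier (ReLU): the positive part of its argument, max(0, z)."""
    return np.maximum(0.0, z)

def tanh(z):
    """Hyperbolic tangent: sigmoidal, but maps z into (-1, 1)."""
    return np.tanh(z)

# Apply each function to a few weighted sums.
z = np.array([-2.0, 0.0, 2.0])
print(threshold(z))   # → [0. 0. 1.]
print(sigmoid(z))     # values in (0, 1), with sigmoid(0) = 0.5
print(rectifier(z))   # → [0. 0. 2.]
print(tanh(z))        # values in (-1, 1), symmetric about 0
```

Comparing the printed rows makes the differences concrete: the threshold function is all-or-nothing, the sigmoid and tanh are smooth, and the rectifier passes positive sums through unchanged.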
Next Section: Artificial Neural Network part 4