Activation Function
Nerd Cafe
What is an Activation Function in Artificial Neural Networks (ANNs)?

An activation function is a (typically non-linear) function applied to a neuron's weighted sum of inputs. It determines the neuron's output and gives the network the capacity to model non-linear relationships.
Why Do We Use Activation Functions?
1. Introduce Non-Linearity

Without non-linear activations, any stack of layers collapses into a single linear transformation, so the network could only learn linear mappings no matter how deep it is.
2. Control the Output Range

Many activation functions squash their input into a bounded interval, which keeps activations numerically stable and lets outputs be interpreted as probabilities or scores:

| Activation Function | Output Range |
| --- | --- |
| Sigmoid | (0, 1) |
| Tanh | (−1, 1) |
| ReLU | [0, +∞) |
3. Gradient Propagation in Backpropagation

During backpropagation, the gradient at every layer is multiplied by the derivative of that layer's activation function. The shape of this derivative therefore decides whether gradients vanish (as with saturated sigmoids), explode, or flow stably — a key reason ReLU-style functions dominate deep networks.
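To make the non-linearity point concrete, here is a minimal sketch (the weight values are arbitrary illustrative choices): two layers with no activation in between collapse into one linear map, while inserting a ReLU breaks that equivalence.

```python
import numpy as np

x = np.array([1.0, -2.0])
W1 = np.array([[1.0, 0.0],
               [0.0, 1.0]])   # first layer weights (illustrative)
W2 = np.array([[1.0, 1.0]])   # second layer weights (illustrative)

# Without an activation, two layers collapse into one linear map W2 @ W1:
two_layers = W2 @ (W1 @ x)
one_layer = (W2 @ W1) @ x
print(np.allclose(two_layers, one_layer))   # True

# A ReLU between the layers breaks this collapse:
with_relu = W2 @ np.maximum(0.0, W1 @ x)
print(np.allclose(with_relu, one_layer))    # False
```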
1. Step Function (Binary Threshold)

Formula:

f(x) = 1 if x ≥ 0, else 0

Numerical Example:

f(−0.5) = 0 and f(2) = 1: any input below the threshold of 0 maps to 0, anything at or above it maps to 1.

Python Code:
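A minimal NumPy sketch (the threshold of 0 is the usual convention, assumed here):

```python
import numpy as np

def step(x):
    """Binary threshold activation: 1 where x >= 0, else 0."""
    return np.where(x >= 0, 1, 0)

x = np.array([-0.5, 0.0, 2.0])
print(step(x))  # [0 1 1]
```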

2. Sigmoid Function

Formula:

σ(x) = 1 / (1 + e^(−x))

Numerical Example:

For x = 2: σ(2) = 1 / (1 + e^(−2)) ≈ 1 / 1.1353 ≈ 0.8808. For x = 0, σ(0) = 0.5 exactly.

Python Code:
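A minimal sketch with NumPy (function name is illustrative):

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(0.0))            # 0.5
print(round(sigmoid(2.0), 4))  # 0.8808
```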

3. Tanh (Hyperbolic Tangent)

Formula:

tanh(x) = (e^x − e^(−x)) / (e^x + e^(−x))

Numerical Example:

tanh(1) ≈ 0.7616. Unlike the sigmoid, the output is zero-centred, ranging over (−1, 1).

Python Code:
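A minimal sketch using NumPy's built-in implementation:

```python
import numpy as np

def tanh(x):
    """Hyperbolic tangent: zero-centred squashing into (-1, 1)."""
    return np.tanh(x)

print(round(tanh(1.0), 4))  # 0.7616
print(tanh(0.0))            # 0.0
```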

4. ReLU (Rectified Linear Unit)

Formula:

f(x) = max(0, x)

Numerical Example:

f(−3) = 0 and f(4) = 4: negative inputs are clipped to zero, positive inputs pass through unchanged.

Python Code:
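A minimal NumPy sketch:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: max(0, x), applied element-wise."""
    return np.maximum(0, x)

x = np.array([-3.0, 0.0, 4.0])
print(relu(x))  # [0. 0. 4.]
```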

5. Leaky ReLU

Formula:

f(x) = x if x > 0, else αx (with a small slope such as α = 0.01)

Numerical Example:

With α = 0.01: f(−5) = −0.05, while f(3) = 3. The small negative slope keeps gradients alive for negative inputs.

Python Code:
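A minimal NumPy sketch (α = 0.01 is a common default, assumed here):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: identity for x > 0, small slope alpha for x <= 0.
    alpha = 0.01 is an assumed default."""
    return np.where(x > 0, x, alpha * x)

# leaky_relu(np.array([-5.0, 3.0])) -> values [-0.05, 3.0]
print(leaky_relu(np.array([-5.0, 3.0])))
```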

6. ELU (Exponential Linear Unit)

Formula:

f(x) = x if x > 0, else α(e^x − 1) (commonly α = 1)

Numerical Example:

With α = 1: f(−1) = e^(−1) − 1 ≈ −0.632, and f(2) = 2. Negative outputs saturate smoothly towards −α instead of being cut off.

Python Code:
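A minimal NumPy sketch (α = 1.0 is the common default, assumed here):

```python
import numpy as np

def elu(x, alpha=1.0):
    """ELU: identity for x > 0, smooth exponential curve below 0.
    alpha = 1.0 is an assumed default."""
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

print(round(float(elu(-1.0)), 4))  # -0.6321
print(float(elu(2.0)))             # 2.0
```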

7. Softplus

Formula:

f(x) = ln(1 + e^x)

Numerical Example:

f(0) = ln 2 ≈ 0.693. For large positive x, softplus(x) ≈ x, so it behaves like a smooth version of ReLU.

Python Code:
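A minimal NumPy sketch (`np.log1p` computes ln(1 + x) with better precision than `np.log(1 + x)`):

```python
import numpy as np

def softplus(x):
    """Softplus: ln(1 + e^x), a smooth approximation of ReLU."""
    return np.log1p(np.exp(x))

print(round(float(softplus(0.0)), 4))   # 0.6931
print(round(float(softplus(10.0)), 4))  # 10.0 (approaches x for large x)
```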

8. Swish (by Google)

Formula:

f(x) = x · σ(x) = x / (1 + e^(−x))

Numerical Example:

f(2) = 2 · σ(2) ≈ 2 · 0.8808 ≈ 1.7616. Unlike ReLU, Swish is smooth and allows small negative outputs.

Python Code:
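A minimal NumPy sketch (this is the same function PyTorch and TensorFlow call SiLU):

```python
import numpy as np

def swish(x):
    """Swish (SiLU): x * sigmoid(x), smooth and non-monotonic."""
    return x / (1.0 + np.exp(-x))

print(round(float(swish(2.0)), 4))   # 1.7616
print(round(float(swish(-1.0)), 4))  # -0.2689
```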

9. Mish

Formula:

f(x) = x · tanh(softplus(x)) = x · tanh(ln(1 + e^x))

Python Code:
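A minimal NumPy sketch, composed from the softplus and tanh defined above:

```python
import numpy as np

def mish(x):
    """Mish: x * tanh(softplus(x)), smooth and non-monotonic."""
    return x * np.tanh(np.log1p(np.exp(x)))

print(round(float(mish(0.0)), 4))  # 0.0
print(round(float(mish(2.0)), 4))  # 1.944
```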

Summary Table:

| Function | Range | Differentiable | Non-linearity | Uses |
| --- | --- | --- | --- | --- |
| Step | {0, 1} | No (not at 0) | Yes | Early perceptrons |
| Sigmoid | (0, 1) | Yes | Yes | Binary-classification outputs |
| Tanh | (−1, 1) | Yes | Yes | Hidden layers, RNNs |
| ReLU | [0, +∞) | Yes (except at 0) | Yes | Default for hidden layers |
| Leaky ReLU | (−∞, +∞) | Yes (except at 0) | Yes | Avoiding "dying ReLU" |
| ELU | (−α, +∞) | Yes | Yes | Smooth alternative to ReLU |
| Softplus | (0, +∞) | Yes | Yes | Smooth ReLU approximation |
| Swish | ≈ (−0.28, +∞) | Yes | Yes | Deep networks |
| Mish | ≈ (−0.31, +∞) | Yes | Yes | Deep vision networks |