Sigmoid
- Function: g(z) = 1 / (1 + e^(-z))
- Output range: (0, 1)
- Common in binary classification output layers
ReLU
- Function: g(z) = max(0, z)
- Output range: [0, ∞)
- Default choice for hidden layers
Linear
- Function: g(z) = z
- Output range: (-∞, ∞)
- Used in regression output layers
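As a minimal sketch, the three activations above can be written directly in NumPy (the function names here are illustrative, not from any particular framework):

```python
import numpy as np

def sigmoid(z):
    # Squashes any real input into (0, 1): 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Zeroes out negative inputs, passes positives through: max(0, z)
    return np.maximum(0.0, z)

def linear(z):
    # Identity activation: output equals input, range (-inf, inf)
    return z

z = np.array([-2.0, 0.0, 2.0])
print(sigmoid(z))  # ~[0.119, 0.5, 0.881] -- strictly between 0 and 1
print(relu(z))     # [0.0, 0.0, 2.0]      -- non-negative, unbounded above
print(linear(z))   # [-2.0, 0.0, 2.0]     -- unchanged
```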
Activation functions introduce the nonlinearity that lets a neural network represent more than a single linear transformation of its inputs. Sigmoid was initially favored for its connection to logistic regression, but it saturates for inputs of large magnitude; alternatives like ReLU allow unbounded positive activations whose gradient does not vanish, enabling models to capture more complex patterns.
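A short gradient sketch (pure NumPy, names again illustrative) makes the saturation contrast concrete: sigmoid's derivative peaks at 0.25 and vanishes for inputs of large magnitude, while ReLU's derivative stays at 1 wherever the unit is active.

```python
import numpy as np

def sigmoid_grad(z):
    # d/dz sigmoid(z) = sigmoid(z) * (1 - sigmoid(z)); at most 0.25
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1.0 - s)

def relu_grad(z):
    # d/dz max(0, z) = 1 for z > 0, else 0 (taking 0 at z = 0 by convention)
    return (z > 0).astype(float)

z = np.array([-10.0, -1.0, 1.0, 10.0])
print(sigmoid_grad(z))  # ~[4.5e-05, 0.197, 0.197, 4.5e-05] -- vanishes at the tails
print(relu_grad(z))     # [0.0, 0.0, 1.0, 1.0]              -- constant 1 when active
```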