Activation Functions

Activation functions decide whether a neuron should be activated by transforming its weighted input into an output signal. They introduce non-linearity into the network, allowing it to learn complex patterns; without them, any stack of layers would collapse into a single linear transformation.

Common Functions

Sigmoid

Defined as σ(x) = 1 / (1 + e^(-x)). Squashes output between 0 and 1, which makes it a natural fit for representing probabilities, e.g. in binary classification. Its gradient shrinks toward zero for large positive or negative inputs, which can slow training in deep networks.
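A minimal sketch of the sigmoid in plain Python (the function name `sigmoid` is illustrative, not from a library):

```python
import math

def sigmoid(x):
    # 1 / (1 + e^(-x)): maps any real input into the open interval (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0))    # 0.5, the midpoint
print(sigmoid(10))   # approaches 1 for large positive inputs
print(sigmoid(-10))  # approaches 0 for large negative inputs
```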

ReLU

Rectified Linear Unit, defined as f(x) = max(0, x): returns x if x > 0, else 0. The most common choice in modern networks because it is cheap to compute and its gradient does not vanish for positive inputs.
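ReLU is a one-liner; a sketch (the name `relu` is illustrative):

```python
def relu(x):
    # max(0, x): passes positive values through unchanged, zeroes out negatives
    return max(0.0, x)

print(relu(3.5))   # 3.5
print(relu(-2.0))  # 0.0
```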

Tanh

Squashes output between -1 and 1. Unlike sigmoid, it is zero-centered, which often helps gradient-based optimization.
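Python's standard library already provides tanh, so no custom implementation is needed:

```python
import math

print(math.tanh(0))   # 0.0: zero-centered, unlike sigmoid
print(math.tanh(5))   # approaches 1 for large positive inputs
print(math.tanh(-5))  # approaches -1 for large negative inputs
```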

Code Visualization

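Since the interactive runner is not available here, a minimal sketch that tabulates all three functions side by side (plain Python, no plotting library assumed; the helper names `sigmoid` and `relu` are illustrative):

```python
import math

def sigmoid(x):
    # 1 / (1 + e^(-x)): output in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # max(0, x): output in [0, inf)
    return max(0.0, x)

# Tabulate each activation over a small range of inputs
print(f"{'x':>4} {'sigmoid':>8} {'relu':>6} {'tanh':>7}")
for x in range(-4, 5):
    print(f"{x:>4} {sigmoid(x):>8.3f} {relu(x):>6.1f} {math.tanh(x):>7.3f}")
```

The table makes the shapes easy to compare: sigmoid saturates near 0 and 1, ReLU is flat at zero for negatives and linear for positives, and tanh saturates near -1 and 1 around a zero-centered midpoint.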