Backpropagation
Backpropagation is the core algorithm for training neural networks. It computes the gradient of the loss function with respect to every weight in the network, which lets an optimizer such as gradient descent update the weights to reduce the error.
How it Works
- Forward Pass: Input data flows through the network to produce a prediction.
- Calculate Error: Compare the prediction with the actual target using a loss function.
- Backward Pass: Propagate the error backwards through the network, applying the chain rule to compute the gradient of the loss with respect to each weight.
- Update Weights: Adjust each weight in the direction that reduces the loss, typically via gradient descent.
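The four steps above can be sketched end-to-end for a single sigmoid neuron. All the numbers here (input, target, initial weight and bias, learning rate) are illustrative, not from any real dataset:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative training example: one input, one target
x, y_true = 0.5, 1.0

# Arbitrary starting weight, bias, and learning rate
w, b = 0.8, 0.1
lr = 0.5

for step in range(3):
    # 1. Forward pass: compute the prediction
    z = w * x + b
    y_pred = sigmoid(z)

    # 2. Calculate error: squared loss L = (y_pred - y_true)^2
    loss = (y_pred - y_true) ** 2

    # 3. Backward pass: chain rule gives dL/dw and dL/db
    dL_dy = 2 * (y_pred - y_true)   # derivative of loss w.r.t. prediction
    dy_dz = y_pred * (1 - y_pred)   # derivative of sigmoid
    dL_dw = dL_dy * dy_dz * x       # dz/dw = x
    dL_db = dL_dy * dy_dz           # dz/db = 1

    # 4. Update weights: step against the gradient
    w -= lr * dL_dw
    b -= lr * dL_db

    print(f"step {step}: loss={loss:.4f}, w={w:.4f}, b={b:.4f}")
```

Running this shows the loss shrinking each step as the weight and bias move in the direction that pushes the prediction toward the target.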
Simulating Weight Updates
This is a simplified view of how a single weight gets updated to reduce error.
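A minimal stand-alone sketch of that idea, using a linear model `y_pred = w * x` with made-up numbers (the true weight that fits this example is 4.0):

```python
# Illustrative single-weight update for a linear model y_pred = w * x
x, y_true = 2.0, 8.0   # one training example (made-up values)
w = 1.0                # initial weight
lr = 0.1               # learning rate

for step in range(5):
    y_pred = w * x                    # forward pass
    loss = (y_pred - y_true) ** 2     # squared error
    grad = 2 * (y_pred - y_true) * x  # dL/dw via the chain rule
    w = w - lr * grad                 # gradient descent update
    print(f"step {step}: w={w:.3f}, loss={loss:.3f}")
```

Each iteration nudges `w` opposite to the gradient, so after a few steps it converges on the value that drives the error to zero.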