Backpropagation is the algorithm used to compute gradients in neural networks by applying the chain rule of calculus backward through the network's layers, from the loss at the output to the parameters at each layer. It is what makes training deep neural networks computationally efficient.
Interactive lesson with visualizations and practice problems, part of the Chain Rule (Multivariate) lesson in Mathematics Foundations.
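To make the chain-rule idea concrete, here is a minimal sketch (all names and numbers are illustrative, not from the lesson): a tiny two-parameter network `y_hat = w2 * sigmoid(w1 * x)` with squared-error loss, where the backward pass applies the chain rule step by step from the loss back to each weight, and the result is checked against a finite-difference approximation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward_backward(x, y, w1, w2):
    """Forward pass, then backward pass via the chain rule.

    Network: y_hat = w2 * sigmoid(w1 * x), loss = (y_hat - y)^2.
    Returns (loss, dL/dw1, dL/dw2).
    """
    # Forward pass: keep intermediates needed by the backward pass.
    z = w1 * x            # pre-activation
    h = sigmoid(z)        # hidden activation
    y_hat = w2 * h        # prediction
    loss = (y_hat - y) ** 2

    # Backward pass: chain rule, applied from output back to input.
    dL_dyhat = 2.0 * (y_hat - y)   # dL/dy_hat
    dL_dw2 = dL_dyhat * h          # dL/dw2 = dL/dy_hat * dy_hat/dw2
    dL_dh = dL_dyhat * w2          # propagate through the output layer
    dL_dz = dL_dh * h * (1.0 - h)  # sigmoid'(z) = h * (1 - h)
    dL_dw1 = dL_dz * x             # dL/dw1 = dL/dz * dz/dw1
    return loss, dL_dw1, dL_dw2

# Sanity check: compare analytic gradients to central finite differences.
x, y, w1, w2 = 0.5, 1.0, 0.3, -0.8
_, g1, g2 = forward_backward(x, y, w1, w2)
eps = 1e-6
num_g1 = (forward_backward(x, y, w1 + eps, w2)[0]
          - forward_backward(x, y, w1 - eps, w2)[0]) / (2 * eps)
num_g2 = (forward_backward(x, y, w1, w2 + eps)[0]
          - forward_backward(x, y, w1, w2 - eps)[0]) / (2 * eps)
print(abs(g1 - num_g1) < 1e-6, abs(g2 - num_g2) < 1e-6)
```

Real networks do the same thing over vectors and many layers, reusing each layer's stored forward-pass intermediates so every gradient is computed in a single backward sweep.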