What is Backpropagation?
TL;DR
An algorithm for efficiently computing gradients in neural network training. The foundational technique of deep learning.
Backpropagation: Definition & Explanation
Backpropagation is an algorithm that propagates errors (losses) backward from the output layer to the input layer of a neural network, efficiently computing the gradient of the loss with respect to each parameter. Popularized by Rumelhart, Hinton, and Williams in 1986, it is the foundational technique that enabled deep learning's advancement. By applying the chain rule layer by layer, it computes all parameter gradients in a single backward pass, even in networks with enormous numbers of parameters; gradient descent then uses these gradients to update the parameters.

All modern neural network models, including Transformer-based models such as GPT and BERT, are trained with backpropagation. It is one of the most important concepts for understanding deep learning.
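The chain-rule mechanics described above can be sketched in a few lines. The following is a minimal illustrative example, not a production implementation: a two-parameter scalar "network" (one tanh hidden unit, one linear output) whose gradients are derived by hand via the chain rule and then used for gradient descent. The layer sizes, learning rate, and training target are arbitrary choices for the sketch.

```python
import math

# Toy network (illustrative values throughout):
#   h = tanh(w1 * x), y_hat = w2 * h, loss = (y_hat - y)^2

def forward(x, w1, w2):
    h = math.tanh(w1 * x)
    y_hat = w2 * h
    return h, y_hat

def backward(x, y, h, y_hat, w2):
    # Chain rule, applied from the loss backward toward the input:
    dL_dyhat = 2.0 * (y_hat - y)          # dL/dy_hat
    dL_dw2 = dL_dyhat * h                 # dy_hat/dw2 = h
    dL_dh = dL_dyhat * w2                 # dy_hat/dh = w2
    dL_dw1 = dL_dh * (1.0 - h * h) * x    # dtanh(u)/du = 1 - tanh(u)^2, du/dw1 = x
    return dL_dw1, dL_dw2

# Gradient descent driven by the backpropagated gradients.
x, y = 1.5, 0.5        # single training example (arbitrary)
w1, w2 = 0.1, 0.1      # initial parameters
lr = 0.1               # learning rate
for _ in range(200):
    h, y_hat = forward(x, w1, w2)
    g1, g2 = backward(x, y, h, y_hat, w2)
    w1 -= lr * g1
    w2 -= lr * g2

h, y_hat = forward(x, w1, w2)
print(round(y_hat, 3))  # converges toward the target 0.5
```

Real frameworks generalize exactly this pattern: the forward pass caches intermediate values (here, `h`), and the backward pass reuses them so every gradient is computed in one sweep from the loss back to the inputs, rather than by re-deriving each parameter's gradient from scratch.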