Backpropagation

Definition - What does Backpropagation mean?

Backpropagation is a technique used to train certain classes of neural networks. It is, in essence, a principle that allows a machine learning program to adjust itself by examining its past performance.

Backpropagation is sometimes called the “backpropagation of errors.”

Techopedia explains Backpropagation

Backpropagation as a technique relies on gradient descent: it calculates the gradient of the loss function at the network's output and distributes that gradient backward through the layers of a deep neural network, producing adjusted weights for the neurons. Although backpropagation can be used in both supervised and unsupervised networks, it is generally regarded as a supervised learning method, because it needs a known target output against which to measure the error.
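The procedure described above can be sketched in a few lines of code. The following is a minimal illustration, not a production implementation: a hypothetical two-layer network with sigmoid activations, trained on a small toy dataset. The error gradient is computed at the output and pushed backward through each layer (the chain rule), and the weights are updated by gradient descent.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset (hypothetical, for illustration): 4 samples, 2 features each.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Randomly initialized weights: input -> hidden and hidden -> output.
W1 = rng.normal(scale=0.5, size=(2, 4))
W2 = rng.normal(scale=0.5, size=(4, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss():
    # Mean squared error of the network's predictions.
    return float(np.mean((sigmoid(sigmoid(X @ W1) @ W2) - y) ** 2))

loss_before = loss()
lr = 1.0  # learning rate (arbitrary choice for this sketch)

for step in range(5000):
    # Forward pass: data flows one way through the network.
    h = sigmoid(X @ W1)      # hidden-layer activations
    out = sigmoid(h @ W2)    # network output

    # Gradient of the loss at the output layer...
    d_out = (out - y) * out * (1 - out)
    # ...distributed backward to the hidden layer via the chain rule.
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates: the "adjusted weights" for the neurons.
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h

loss_after = loss()
```

After training, `loss_after` should be smaller than `loss_before`: the backward-flowing error signal has adjusted the weights so the network better matches the target outputs.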

After the emergence of simple feedforward neural networks, in which data flows in only one direction, engineers found that they could use backpropagation to adjust the network's weights after the fact. Backpropagation can be thought of as a way to train a system based on its own activity: adjusting how accurately or precisely the neural network processes certain inputs, or how it moves toward some other desired state.
