Backpropagation is a training algorithm used for multilayer neural networks. In this method, we propagate the error from the output end of the network back to all the weights inside the network, allowing efficient calculation of the gradient of the error with respect to each weight.
It is divided into several steps, as follows:
* Forward propagation of the training data through the network to generate the output.
* Using the target value and the output value, the error derivative is computed with respect to the output activation.
* Then we backpropagate to compute the derivative of the error with respect to the activations of the previous layer, and continue this for all the hidden layers.
* Using the previously computed derivatives for the output and all the hidden layers, we calculate the error derivatives with respect to the weights.
* Finally, we update the weights, as shown in the sketch after this list.
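The steps above map directly onto a few lines of code. Below is a minimal sketch in Python with NumPy, assuming a network with one hidden layer, sigmoid activations, and a squared-error loss; all names here (`sigmoid`, `W1`, `W2`, `lr`) are illustrative choices for this example, not fixed by the method itself.

```python
import numpy as np

# Illustrative setup: one input vector, one hidden layer, one output.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 1))   # one training input
t = np.array([[1.0]])         # target value
W1 = rng.normal(size=(3, 4))  # input -> hidden weights
W2 = rng.normal(size=(1, 3))  # hidden -> output weights
lr = 0.1                      # learning rate (assumed value)

# Step 1: forward propagation of the training data to generate output.
h = sigmoid(W1 @ x)           # hidden activations
y = sigmoid(W2 @ h)           # output activation

# Step 2: error derivative with respect to the output activation,
# using the target and output values (dE/dy for E = 1/2 * (y - t)^2).
dE_dy = y - t

# Step 3: backpropagate to get derivatives with respect to the
# previous layer's activations (sigmoid derivative is a * (1 - a)).
dE_dz2 = dE_dy * y * (1 - y)  # derivative at the output pre-activation
dE_dh = W2.T @ dE_dz2         # derivative w.r.t. hidden activations
dE_dz1 = dE_dh * h * (1 - h)  # derivative at the hidden pre-activation

# Step 4: error derivatives with respect to the weights of each layer.
dE_dW2 = dE_dz2 @ h.T
dE_dW1 = dE_dz1 @ x.T

# Step 5: update the weights by gradient descent.
W2 -= lr * dE_dW2
W1 -= lr * dE_dW1
```

Running this loop repeatedly over the training data drives the output toward the target; in practice the same five steps apply unchanged to deeper networks, with step 3 repeated once per hidden layer.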