Machine Learning - Interview Questions
What is the Vanishing Gradient Problem?
In Machine Learning, we encounter the Vanishing Gradient Problem while training neural networks with gradient-based methods such as backpropagation. This problem makes it hard to learn and tune the parameters of the earlier layers in the network.
 
The vanishing gradients problem is one example of the unstable behavior that we may encounter when training a deep neural network.
 
It describes a situation where a deep multilayer feed-forward network or a recurrent neural network is unable to propagate useful gradient information from the output end of the model back to the layers close to the input end of the model.
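As a rough illustration, the following minimal NumPy sketch (the network depth, width, weight scale, and use of sigmoid activations are all assumptions chosen to make the effect visible, not part of the original text) runs a forward and backward pass through a stack of sigmoid layers and prints the gradient norm at each layer. The norms shrink rapidly for layers closer to the input, which is exactly the vanishing behavior described above.

import numpy as np

np.random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_layers, width = 10, 32                           # assumed toy configuration
x = np.random.randn(1, width)                      # a single input example
weights = [np.random.randn(width, width) * 0.1 for _ in range(n_layers)]

# Forward pass, storing each layer's activations for backpropagation.
activations = [x]
for W in weights:
    activations.append(sigmoid(activations[-1] @ W))

# Backward pass: start from a dummy loss gradient of ones at the output.
grad = np.ones_like(activations[-1])
for i in reversed(range(n_layers)):
    a_out = activations[i + 1]
    grad = grad * a_out * (1.0 - a_out)            # multiply by sigmoid derivative
    grad_W = activations[i].T @ grad               # gradient w.r.t. weights[i]
    print(f"layer {i}: ||dL/dW|| = {np.linalg.norm(grad_W):.2e}")
    grad = grad @ weights[i].T                     # propagate gradient to the earlier layer

Each backward step multiplies the gradient by the sigmoid derivative (at most 0.25) and by the small weight matrix, so the printed norms fall by roughly an order of magnitude every few layers; the earliest layers receive almost no useful gradient signal.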