Deep Learning - Interview Questions
What are the main benefits of Mini-batch Gradient Descent?
* It is more computationally efficient than stochastic gradient descent: processing a batch of examples at once exploits vectorized hardware instead of updating after every single example.
* The gradient noise introduced by small batches can improve generalization by steering optimization toward flat minima.
* It improves convergence: each mini-batch gradient is an approximation of the gradient over the entire training set, and the remaining noise in that estimate can help the optimizer escape poor local minima.
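The points above can be illustrated with a minimal sketch of mini-batch gradient descent, here fitting a one-dimensional linear regression with mean-squared error. All data, variable names, and hyperparameters (learning rate, batch size, epoch count) are illustrative assumptions, not part of the original question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = 3x + 2 plus a little noise (illustrative).
X = rng.uniform(-1, 1, size=(1000, 1))
y = 3.0 * X[:, 0] + 2.0 + rng.normal(scale=0.1, size=1000)

w, b = 0.0, 0.0     # model parameters
lr = 0.1            # learning rate (assumed)
batch_size = 32     # mini-batch size (assumed)
epochs = 50

n = len(X)
for epoch in range(epochs):
    perm = rng.permutation(n)  # reshuffle the data each epoch
    for start in range(0, n, batch_size):
        idx = perm[start:start + batch_size]
        xb, yb = X[idx, 0], y[idx]
        err = (w * xb + b) - yb
        # Gradients of the batch MSE; each mini-batch gradient is a noisy
        # estimate of the full-dataset gradient.
        grad_w = 2.0 * np.mean(err * xb)
        grad_b = 2.0 * np.mean(err)
        w -= lr * grad_w
        b -= lr * grad_b

print(w, b)  # parameters should approach the true values 3.0 and 2.0
```

Each inner-loop step computes the gradient over only `batch_size` examples, which is why one update is cheap, yet because the batch is sampled from the whole training set the update direction approximates the full-batch gradient on average.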