Deep Learning - Interview Questions
Explain the following variants of Gradient Descent: Stochastic, Batch, and Mini-batch.
Stochastic Gradient Descent :
Stochastic gradient descent (SGD) computes the gradient and updates the parameters using only a single training example at a time. Each update is therefore cheap but noisy, which makes SGD well suited to large datasets.
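A minimal NumPy sketch of SGD, assuming a linear model with squared-error loss; the names X, y, w, lr, and epochs are illustrative, not from the source.

```python
import numpy as np

def sgd(X, y, w, lr=0.01, epochs=10):
    # Hypothetical setup: X is (n, d), y is (n,), w is (d,).
    n = len(y)
    for _ in range(epochs):
        for i in np.random.permutation(n):  # visit examples in random order
            xi, yi = X[i], y[i]
            grad = (xi @ w - yi) * xi       # gradient from a single example
            w = w - lr * grad               # one parameter update per example
    return w
```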

Batch Gradient Descent :
Batch gradient descent computes the gradient over the entire training dataset and performs just one parameter update per pass (epoch).
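For contrast, a sketch of batch gradient descent under the same assumed linear-regression setup; note the single update per pass over the full dataset.

```python
import numpy as np

def batch_gd(X, y, w, lr=0.01, epochs=100):
    n = len(y)
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / n  # gradient averaged over the whole dataset
        w = w - lr * grad             # exactly one update per pass
    return w
```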

Mini-batch Gradient Descent :
Mini-batch gradient descent is a variation of stochastic gradient descent: instead of a single training example, a small batch of samples is used for each update. This balances the low per-update cost of SGD against the lower gradient variance of batch updates, making mini-batch gradient descent one of the most popular optimization algorithms in deep learning.
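A sketch of the mini-batch variant under the same assumed setup; batch_size is an illustrative hyperparameter.

```python
import numpy as np

def minibatch_gd(X, y, w, lr=0.01, epochs=10, batch_size=32):
    n = len(y)
    for _ in range(epochs):
        idx = np.random.permutation(n)          # shuffle once per epoch
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]   # indices of one mini-batch
            grad = X[b].T @ (X[b] @ w - y[b]) / len(b)  # gradient over the batch
            w = w - lr * grad                   # one update per mini-batch
    return w
```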