Keras - Interview Questions
Why do you think batch normalization is important for training deep networks?
Batch normalization is important for training deep networks because it mitigates internal covariate shift: as earlier layers update during training, the distribution of inputs to later layers keeps changing, which causes training instability and slow convergence. Batch normalization addresses this by normalizing each layer's activations over the current mini-batch (then rescaling them with learned parameters), which stabilizes training, permits higher learning rates, and leads to faster convergence.
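As a minimal sketch of how this looks in Keras, the snippet below inserts a `BatchNormalization` layer after each hidden `Dense` layer. The layer sizes and input shape are illustrative assumptions, not part of the question.

```python
import tensorflow as tf

# Illustrative sketch: a small dense network with BatchNormalization
# applied to each hidden layer's pre-activations, so they are
# re-normalized per mini-batch before the nonlinearity.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(784,)),       # assumed input size
    tf.keras.layers.Dense(128),
    tf.keras.layers.BatchNormalization(),      # normalize over the batch
    tf.keras.layers.Activation("relu"),
    tf.keras.layers.Dense(64),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Activation("relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

Placing `BatchNormalization` between the linear layer and the activation is one common convention; applying it after the activation also works in practice. Note that the layer behaves differently at training time (batch statistics) versus inference time (moving averages), which Keras handles automatically via the `training` flag.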