Deep Learning - Interview Questions
Why is zero initialization not a good weight initialization process?
If all the weights in the network are initialized to zero, then every neuron in a layer produces the same output and receives the same gradients during backpropagation.
 
As a result, the network cannot learn anything useful because there is no source of asymmetry between neurons; all units in a layer remain identical copies of one another. That is why we need to add randomness to the weight initialization process.
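A minimal NumPy sketch of this symmetry problem, assuming a hypothetical toy setup (3 inputs, 4 sigmoid hidden units, 1 linear output, squared-error loss, random data): after training with zero-initialized weights, all hidden units still share identical incoming weights, so the hidden layer behaves like a single neuron.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))        # 8 random training examples (assumed data)
y = rng.normal(size=(8, 1))        # random targets

W1 = np.zeros((3, 4))              # zero-initialized hidden-layer weights
W2 = np.zeros((4, 1))              # zero-initialized output weights
lr = 0.1

for step in range(50):
    h = sigmoid(X @ W1)            # every hidden unit computes the same output
    y_hat = h @ W2
    err = y_hat - y

    dW2 = h.T @ err / len(X)
    dh = err @ W2.T * h * (1 - h)  # identical gradient for every hidden unit
    dW1 = X.T @ dh / len(X)

    W1 -= lr * dW1
    W2 -= lr * dW2

# All columns of W1 (one per hidden unit) are still identical: the neurons
# never differentiate, which is exactly the symmetry problem described above.
print(W1)
```

Replacing the zero initialization with, for example, `W1 = rng.normal(scale=0.1, size=(3, 4))` breaks the symmetry, and the columns of W1 diverge during training.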