Correct Answer : Adding dropout layers
Explanation : Dropout layers randomly deactivate a fraction of neurons on each training pass, which forces the network to learn redundant representations instead of relying on any single neuron, and thus helps prevent overfitting.
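For illustration, here is a minimal PyTorch sketch showing where a dropout layer sits in a network and how it behaves differently in training versus evaluation mode (the layer sizes, dropout rate, and dummy input are illustrative assumptions, not part of the question):

```python
import torch
import torch.nn as nn

# Illustrative model: sizes and dropout rate are assumptions.
# nn.Dropout zeroes a random fraction of activations during training.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),    # drop 50% of activations during training
    nn.Linear(256, 10),
)

model.train()             # dropout is active in training mode
x = torch.randn(32, 784)  # dummy batch of 32 inputs
logits = model(x)

model.eval()              # dropout is disabled at inference time
with torch.no_grad():
    logits = model(x)
```

Note that dropout only fires in training mode; calling `model.eval()` turns it off so predictions are deterministic at inference time.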