Machine Learning - Interview Questions
What is the difference between Gini Impurity and Entropy in a Decision Tree?
* Gini Impurity and Entropy are the two impurity metrics used to decide how to split a node in a Decision Tree.
* Gini Impurity measures the probability that a randomly chosen sample from a branch would be classified incorrectly if it were labelled at random according to the class distribution in that branch.
* Entropy measures the uncertainty (lack of information) in a node's class distribution. The Information Gain of a split is the difference between the parent's entropy and the weighted entropy of the child nodes; picking the split with the highest gain reduces the uncertainty about the output label the most (see the sketch after this list).
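The short Python sketch below illustrates how both metrics and the Information Gain of a split can be computed from class counts. The function names and the example node are hypothetical, chosen for illustration only, and are not taken from any particular library.

```python
import numpy as np

def gini_impurity(labels):
    """Probability of misclassifying a random sample when it is labelled
    according to the branch's class distribution: 1 - sum(p_i^2)."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def entropy(labels):
    """Shannon entropy of the class distribution: -sum(p_i * log2(p_i))."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(parent, left, right):
    """Entropy of the parent node minus the weighted entropy of its children."""
    n = len(parent)
    weighted_child_entropy = (len(left) / n) * entropy(left) \
                             + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted_child_entropy

# Hypothetical node: 6 samples of class 0 and 4 of class 1,
# split into a pure left branch and a mixed right branch.
parent = [0] * 6 + [1] * 4
left, right = [0] * 5, [0] * 1 + [1] * 4

print(f"Gini impurity of parent:  {gini_impurity(parent):.3f}")   # 0.480
print(f"Entropy of parent:        {entropy(parent):.3f}")         # 0.971
print(f"Information gain of split: {information_gain(parent, left, right):.3f}")
```

In this example the split sends all of class 0's "easy" samples to a pure left branch, so the weighted child entropy drops well below the parent's entropy and the split yields a positive Information Gain.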