1. Bootstrapping (Random Sampling with Replacement)
2. Training Base Models
3. Aggregation (Voting or Averaging)
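The three steps above can be sketched by hand before reaching for `BaggingClassifier`. This is an illustrative sketch, not a production implementation; names such as `n_models` are our own choices.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

rng = np.random.default_rng(42)
n_models = 25
models = []
for _ in range(n_models):
    # 1. Bootstrapping: sample training rows with replacement
    idx = rng.integers(0, len(X_train), size=len(X_train))
    # 2. Training base models: one tree per bootstrap sample
    models.append(DecisionTreeClassifier().fit(X_train[idx], y_train[idx]))

# 3. Aggregation: majority vote across the trees' predictions
all_preds = np.stack([m.predict(X_test) for m in models])  # shape (n_models, n_test)
votes = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, all_preds)
print("Manual bagging accuracy:", (votes == y_test).mean())
```

Because each tree sees a different bootstrap sample, the individual trees disagree on hard cases, and the vote averages away much of that variance.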
```python
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load dataset
data = load_iris()
X, y = data.data, data.target
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Bagging with Decision Trees
# (the "estimator" parameter replaced "base_estimator" in scikit-learn 1.2)
bagging = BaggingClassifier(estimator=DecisionTreeClassifier(), n_estimators=100, random_state=42)
bagging.fit(X_train, y_train)
y_pred_bagging = bagging.predict(X_test)
print("Bagging Accuracy:", accuracy_score(y_test, y_pred_bagging))

# Random Forest
random_forest = RandomForestClassifier(n_estimators=100, random_state=42)
random_forest.fit(X_train, y_train)
y_pred_rf = random_forest.predict(X_test)
print("Random Forest Accuracy:", accuracy_score(y_test, y_pred_rf))
```
Here's a breakdown of the key concepts:
* Types of Ensemble Learning:
There are several popular ensemble learning techniques, including:
* How Ensemble Learning Works:
The basic idea behind ensemble learning is to create a diverse set of models, each with its own strengths and weaknesses. By combining the predictions of these models, the ensemble can achieve better overall performance.
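As a small illustration of combining diverse models, three different classifiers can be pooled with a majority vote using scikit-learn's `VotingClassifier`; the particular model choices here are ours, not prescribed by the text.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

# Three models with different strengths and weaknesses,
# combined by a hard (majority) vote over predicted class labels
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(random_state=42)),
        ("knn", KNeighborsClassifier()),
    ],
    voting="hard",
)
ensemble.fit(X_train, y_train)
print("Voting ensemble accuracy:", ensemble.score(X_test, y_test))
```

The vote only helps when the base models make *different* mistakes, which is why diversity among the members matters as much as their individual accuracy.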
* Benefits of Ensemble Learning:
* Challenges of Ensemble Learning:
* Applications of Ensemble Learning:
Ensemble learning is used in a wide variety of applications, including:
| Feature | Bagging | Boosting |
| --- | --- | --- |
| Goal | Reduce variance | Reduce bias |
| Type of Ensemble | Parallel ensemble method, where base learners are trained independently. | Sequential ensemble method, where base learners are trained one after another. |
| Base Learners | Trained in parallel on different bootstrap subsets of the data. | Trained sequentially, with each subsequent learner focusing on correcting the mistakes of its predecessors. |