
Ensemble Learning, Bagging and Boosting

Buse Köseoğlu
Mar 25, 2021

What is Ensemble Learning?

I’ll start with an example to illustrate ensemble learning. Imagine a group of students in a classroom working on a problem. If only one student works on it, there is no one to check whether the answer is right or wrong. But if they work on it as a group, they can catch each other’s mistakes and vote on the correct answer. This is called the wisdom of the crowd.

We can apply the same idea when developing a model. If we train a group of algorithms and aggregate their predictions, for example by majority vote, the technique is called ensemble learning, and the group of predictors is called an ensemble.
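As a minimal sketch of this idea in scikit-learn (the dataset and the three base models here are illustrative choices, not from the article):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Illustrative synthetic dataset
X, y = make_classification(n_samples=500, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Three different algorithms "vote" on each prediction,
# like the students comparing answers
ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier()),
        ("svm", SVC()),
    ],
    voting="hard",  # majority vote on the predicted classes
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))
```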

What is Bagging?

Bagging is short for bootstrap aggregating. This approach uses the same training algorithm for every predictor, but trains each one on a different random subset of the training set, sampled with replacement (a bootstrap sample). The individual predictions are then aggregated, typically by majority vote for classification or by averaging for regression.

A single predictor trained on one of these subsets would tend to overfit it. Aggregating many such predictors smooths out their individual errors, so the ensemble produces more consistent predictions (lower variance) and is less prone to overfitting than any one predictor alone.
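For example, a bagging ensemble of decision trees in scikit-learn might look like the sketch below (the dataset, the decision-tree base learner, and the parameter values are illustrative assumptions, not from the article):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Same algorithm (a decision tree) for every predictor,
# each trained on its own bootstrap sample of the training set
bagging = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=100,   # number of predictors in the ensemble
    max_samples=0.8,    # each sees 80% of the training set
    bootstrap=True,     # sample with replacement (the "bootstrap" in bagging)
    random_state=0,
)
bagging.fit(X_train, y_train)
print("bagging accuracy:", bagging.score(X_test, y_test))
```

The predictions of the 100 trees are combined by majority vote, which is exactly the aggregation step described above.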
