Bagging (bootstrap aggregating) is a technique used to reduce the variance of our predictions by combining the results of multiple classifiers, each trained on a different bootstrap sub-sample of the same dataset. The following figure will make it clearer.
The steps followed in bagging are:
1) Create Multiple DataSets
2) Build Multiple Classifiers
3) Combine Classifiers
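The three steps above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: it assumes scikit-learn decision trees as the base classifiers and uses a majority vote (via the mean of predicted labels) to combine them.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

n_models = 25
predictions = []
for i in range(n_models):
    # 1) Create a dataset: draw a bootstrap sample (with replacement)
    idx = rng.randint(0, len(X_train), size=len(X_train))
    # 2) Build a classifier on that sample
    clf = DecisionTreeClassifier(random_state=i).fit(X_train[idx], y_train[idx])
    predictions.append(clf.predict(X_test))

# 3) Combine classifiers by majority vote over the n_models predictions
votes = np.stack(predictions)
ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("ensemble accuracy:", (ensemble_pred == y_test).mean())
```

In practice you would reach for a ready-made implementation such as scikit-learn's `BaggingClassifier`, which wraps exactly this loop.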
Note that the number of models built here is not a hyper-parameter. A larger number of models is always at least as good as a smaller one: adding models either improves performance or leaves it roughly unchanged. It can be shown theoretically that, under the assumption that the classifiers' errors are independent, the variance of the combined prediction is reduced to 1/n of the original variance (n: number of classifiers).
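The 1/n claim is easy to check numerically. The sketch below is an idealized simulation, not real classifiers: it models each classifier's output as the true value plus independent unit-variance noise, which is exactly the independence assumption under which the 1/n result holds.

```python
import numpy as np

rng = np.random.RandomState(42)
n = 20            # number of "classifiers" being averaged
trials = 10_000   # repeated experiments to estimate variance

# a single classifier: true value 0 plus unit-variance Gaussian noise
single = rng.normal(0, 1, size=trials)

# an ensemble: the average of n such independent predictions
averaged = rng.normal(0, 1, size=(trials, n)).mean(axis=1)

print("single-model variance:", single.var())     # close to 1.0
print("ensemble variance:   ", averaged.var())    # close to 1/n = 0.05
```

With real base classifiers the errors are correlated (they are trained on overlapping bootstrap samples), so the reduction is smaller than 1/n, which is why the statement needs the independence caveat.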
Bagging, Classifier — Nov 21, 2018