Bagging
Bagging is short for bootstrap aggregating and is one of the most popular ensemble learning approaches. The main idea of bagging is to sample subsets from the training set with replacement and then train the same algorithm (e.g., a decision tree) on each subset. Sampling with replacement means that the same training instance can be drawn more than once for a given subset. A related approach, which samples without replacement, is called pasting.
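As a minimal sketch of the sampling step, assuming the training data sits in a NumPy feature matrix X and a label vector y (the helper name bootstrap_subsets and the argument names are illustrative, not part of the original text):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

def bootstrap_subsets(X, y, n_subsets, subset_size):
    """Yield (X_subset, y_subset) pairs sampled with replacement from the training set."""
    for _ in range(n_subsets):
        # Indices are drawn with replacement, so the same row can appear several times.
        idx = rng.integers(0, len(X), size=subset_size)
        yield X[idx], y[idx]
```

Each of these subsets would then be used to fit its own copy of the base algorithm.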
So, how can we make a final prediction from the predictions of the trained predictors? The simplest way is to take the most frequent prediction (a majority vote) for classification, or to average the predictions for regression. The main benefit of bagging is that it reduces variance and thus helps to avoid overfitting.
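For an end-to-end illustration, one option is scikit-learn's BaggingClassifier, which trains the base estimators on bootstrap samples and aggregates their predictions for you; the toy dataset and every parameter value below are assumptions chosen for the example rather than recommended settings:

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# An illustrative two-class dataset and a simple train/test split.
X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# bootstrap=True means sampling with replacement (bagging);
# bootstrap=False would sample without replacement (pasting).
bag_clf = BaggingClassifier(
    DecisionTreeClassifier(),
    n_estimators=100,   # number of trees, each fit on its own sampled subset
    max_samples=0.8,    # each subset holds 80% as many instances as the training set
    bootstrap=True,
    random_state=42,
)
bag_clf.fit(X_train, y_train)

# The ensemble aggregates the trees' predictions into a single class per instance.
print("test accuracy:", bag_clf.score(X_test, y_test))
```

Because each tree sees a slightly different view of the training data, the aggregated prediction is typically less sensitive to the quirks of any single sample, which is where the variance reduction comes from.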