Episode 4 - Ensemble models. Bagging and boosting
Episode 4 of the Your Data Teacher Podcast, hosted by Your Data Teacher, titled "Episode 4 - Ensemble models. Bagging and boosting", was published on June 10, 2021 and runs 11 minutes.
Summary
In this episode, I'm going to talk about ensemble models, particularly bagging and boosting. Bagging is very useful for reducing variance, while boosting is used for reducing bias. The most common bagging algorithm is Random Forest; the most common boosting algorithm is Gradient Boosting, whose best-known implementations are XGBoost, LightGBM and CatBoost.
Home Page: https://www.yourdatateacher.com
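The contrast the episode draws, bagging to reduce variance and boosting to reduce bias, can be sketched in plain Python with a toy depth-1 "stump" learner. Everything here (the dataset, the stump, the ensemble sizes) is an illustrative assumption, not something taken from the episode:

```python
import random
import statistics

random.seed(42)

# Toy 1-D regression data: y = x + Gaussian noise (illustrative only)
X = [i / 6 for i in range(60)]
y = [x + random.gauss(0, 0.5) for x in X]

def fit_stump(xs, ys):
    """Depth-1 regression tree: try every split point and keep the one
    with the lowest squared error -- a simple, high-variance weak learner."""
    best = None
    for split in sorted(set(xs))[:-1]:
        left = [yi for xi, yi in zip(xs, ys) if xi <= split]
        right = [yi for xi, yi in zip(xs, ys) if xi > split]
        lm, rm = statistics.mean(left), statistics.mean(right)
        sse = sum((yi - lm) ** 2 for yi in left) + sum((yi - rm) ** 2 for yi in right)
        if best is None or sse < best[0]:
            best = (sse, split, lm, rm)
    _, split, lm, rm = best
    return lambda x: lm if x <= split else rm

def bootstrap_stump():
    """One stump trained on a bootstrap resample (the 'b' in bagging)."""
    idx = [random.randrange(len(X)) for _ in range(len(X))]
    return fit_stump([X[i] for i in idx], [y[i] for i in idx])

def bagged_predict(x0, n_estimators=15):
    """Bagging: average the predictions of many bootstrap-trained stumps."""
    return statistics.mean(bootstrap_stump()(x0) for _ in range(n_estimators))

def boosted_model(n_rounds=20, lr=0.5):
    """Gradient boosting for squared loss: each new stump is fitted to the
    residuals of the current ensemble, so the ensemble's bias shrinks."""
    preds = [0.0] * len(X)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, preds)]
        s = fit_stump(X, residuals)
        stumps.append(s)
        preds = [pi + lr * s(xi) for pi, xi in zip(preds, X)]
    return lambda x: sum(lr * s(x) for s in stumps)

# Bagging reduces variance: predictions at one point spread far less
# across bagged ensembles than across individual stumps.
x0 = 5.0
var_single = statistics.pvariance([bootstrap_stump()(x0) for _ in range(20)])
var_bagged = statistics.pvariance([bagged_predict(x0) for _ in range(20)])

# Boosting reduces bias: training error drops well below a single stump's.
def mse(model):
    return statistics.mean((yi - model(xi)) ** 2 for xi, yi in zip(X, y))

mse_single = mse(fit_stump(X, y))
mse_boosted = mse(boosted_model())

print(f"prediction variance: single stump {var_single:.3f}, bagged {var_bagged:.3f}")
print(f"training MSE: single stump {mse_single:.3f}, boosted {mse_boosted:.3f}")
```

Averaging independent bootstrap fits leaves the bias unchanged but divides the prediction variance by roughly the ensemble size, while fitting successive learners to residuals keeps lowering the ensemble's bias; Random Forest and Gradient Boosting are production-grade versions of these two loops.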