Gradient Boosted Decision Trees
Gradient boosting builds a model sequentially: each new model is trained to correct the errors of the ensemble built so far, typically by fitting the residuals (more generally, the negative gradient of the loss) of the current predictions. Combining many simple models (usually shallow decision trees) in this way yields a strong predictor.
Example: it works like a team of specialists in which each member fixes the mistakes of the one before. Popular implementations include XGBoost and LightGBM, which are widely used in fraud detection, credit scoring, and recommendation systems.
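As a minimal sketch of how such a library is typically used (assuming the xgboost Python package and scikit-learn for synthetic data; the parameter values here are illustrative, not tuned):

```python
# A minimal sketch of training a gradient boosted classifier with XGBoost.
# Assumes the `xgboost` and `scikit-learn` packages; data and parameters
# are illustrative placeholders, not a tuned real-world setup.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Synthetic binary-classification data standing in for, e.g., fraud labels.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = XGBClassifier(
    n_estimators=200,   # number of boosting rounds (trees)
    max_depth=3,        # shallow trees act as weak learners
    learning_rate=0.1,  # shrinks each tree's contribution
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```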
Key components are the weak learners (typically shallow decision trees), a differentiable loss function that measures prediction error, and an additive model that sums the trees' outputs, with a learning rate scaling each tree's contribution, as the sketch below shows.
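To make those components concrete, here is a toy from-scratch sketch of gradient boosting for squared-error regression, where each weak learner fits the residuals of the current additive model. All names and constants are illustrative; production libraries add regularization, subsampling, and optimized tree construction on top of this core loop.

```python
# Toy gradient boosting for squared-error regression, showing the three
# components: weak learners, a loss (squared error), and an additive model.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
trees = []
prediction = np.full_like(y, y.mean())  # start from a constant model

for _ in range(100):
    residuals = y - prediction                 # negative gradient of squared error
    tree = DecisionTreeRegressor(max_depth=2)  # shallow tree = weak learner
    tree.fit(X, residuals)
    prediction += learning_rate * tree.predict(X)  # additive, weighted update
    trees.append(tree)

def predict(X_new):
    # Sum the scaled contributions of all trees on top of the initial constant.
    out = np.full(len(X_new), y.mean())
    for tree in trees:
        out += learning_rate * tree.predict(X_new)
    return out

print("train MSE:", np.mean((y - prediction) ** 2))
```

Each iteration of the loop is one "specialist" from the analogy above: it only has to model what the ensemble so far still gets wrong, and the learning rate keeps any single tree from dominating the final prediction.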