Random Forests
Random forests improve on single decision trees by combining the predictions of many de-correlated trees, which reduces overfitting and increases stability. They are widely used in credit scoring, fraud detection, and customer churn prediction.
Each tree is trained on a bootstrap sample of the data (drawn with replacement), and at each split only a random subset of the features is considered, so the trees make different errors. Their predictions are then aggregated by majority vote (for classification) or by averaging (for regression).
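The three ingredients above, bootstrap sampling, random feature selection, and vote aggregation, can be sketched with a minimal from-scratch example. This is an illustration, not a production implementation: the "trees" are single-split stumps, the data is a made-up toy set, and all names (`bootstrap_sample`, `train_stump`, `random_forest`) are ours for the sketch.

```python
import random
from collections import Counter

def bootstrap_sample(X, y, rng):
    # Draw len(X) examples with replacement (bootstrap sampling).
    idx = [rng.randrange(len(X)) for _ in range(len(X))]
    return [X[i] for i in idx], [y[i] for i in idx]

def train_stump(X, y, features, rng):
    # Train a one-split "tree" on a random subset of the features,
    # keeping the (feature, threshold) pair with the fewest errors.
    n_sub = max(1, int(len(features) ** 0.5))  # common sqrt(d) heuristic
    best = None
    for f in rng.sample(features, n_sub):
        for t in sorted({x[f] for x in X}):
            left = [yi for xi, yi in zip(X, y) if xi[f] <= t]
            right = [yi for xi, yi in zip(X, y) if xi[f] > t]
            if not left or not right:
                continue
            lmaj = Counter(left).most_common(1)[0][0]
            rmaj = Counter(right).most_common(1)[0][0]
            errors = sum(v != lmaj for v in left) + sum(v != rmaj for v in right)
            if best is None or errors < best[0]:
                best = (errors, f, t, lmaj, rmaj)
    _, f, t, lmaj, rmaj = best
    return lambda x: lmaj if x[f] <= t else rmaj

def random_forest(X, y, n_trees=25, seed=0):
    rng = random.Random(seed)
    features = list(range(len(X[0])))
    trees = []
    for _ in range(n_trees):
        Xb, yb = bootstrap_sample(X, y, rng)  # each tree sees its own sample
        trees.append(train_stump(Xb, yb, features, rng))
    # Aggregate the ensemble by majority vote (classification).
    return lambda x: Counter(tree(x) for tree in trees).most_common(1)[0][0]

# Toy data: class 1 when the first feature exceeds 5 (second feature mirrors it).
X = [[i, 10 - i] for i in range(10)]
y = [int(xi[0] > 5) for xi in X]
predict = random_forest(X, y)
print(predict([8, 2]), predict([2, 8]))
```

Individual stumps trained on different bootstrap samples disagree near the decision boundary, but the majority vote smooths those disagreements out, which is exactly the variance reduction the ensemble is built for. In practice one would reach for a library implementation with full-depth trees rather than stumps.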