Random Forest builds hundreds of decision trees on randomized versions of the training data and combines their predictions — typically outperforming any single tree by a wide margin.
How It Works
- Randomly sample the training data (with replacement) — bagging
- At each split, consider only a random subset of features
- Build many trees, each slightly different
- For classification: majority vote. For regression: average
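The steps above can be sketched in a few lines. This is a minimal from-scratch illustration, assuming scikit-learn's `DecisionTreeClassifier` as the base learner and a synthetic dataset; the sample counts and tree count are arbitrary:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

trees = []
for _ in range(25):                          # build many slightly different trees
    idx = rng.integers(0, len(X), len(X))    # bootstrap sample (with replacement) — bagging
    tree = DecisionTreeClassifier(max_features="sqrt")  # random feature subset at each split
    tree.fit(X[idx], y[idx])
    trees.append(tree)

# Classification: majority vote across the ensemble (labels here are 0/1)
votes = np.stack([t.predict(X) for t in trees])
ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("training accuracy:", (ensemble_pred == y).mean())
```

For regression, the final step would average the trees' numeric predictions instead of voting. In practice you would use a library implementation rather than hand-rolling the loop.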
Why It's Great
- Highly accurate out of the box
- Handles missing values well in implementations that support them natively
- Provides feature importance scores
- More resistant to overfitting than a single deep tree, since averaging many decorrelated trees reduces variance
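The out-of-the-box accuracy and the feature importance scores can both be seen with a library implementation. A short sketch, assuming scikit-learn's `RandomForestClassifier` and a synthetic dataset (the parameter values are arbitrary):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=8,
                           n_informative=3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # strong baseline with default settings
print("cv accuracy: %.3f" % scores.mean())

clf.fit(X, y)
for i, imp in enumerate(clf.feature_importances_):  # impurity-based importance per feature
    print(f"feature {i}: {imp:.3f}")
```

Note that `feature_importances_` is impurity-based and can overstate high-cardinality features; permutation importance is a common cross-check.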
When to Use
When you need a strong baseline fast. Random Forest is one of the most reliable all-purpose algorithms.