📚 Machine Learning Fundamentals Syllabus
Stickipedia University

Gradient boosting builds trees sequentially — each new tree corrects the errors of the previous ones. It's the dominant algorithm for structured/tabular data.

How It Works

Each new model is fit to the errors (residuals) of the ensemble so far, so it concentrates on the examples the previous models got wrong. Combining many weak learners (typically shallow trees) this way yields a single strong model.
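The loop described above can be sketched from scratch for squared-error regression. This is a minimal illustration, not any library's implementation: it uses depth-1 trees (stumps) on a single feature, and the function names, learning rate, and toy sine data are all assumptions chosen for clarity.

```python
import numpy as np

def fit_stump(x, residual):
    """Fit a depth-1 regression tree (stump) to the current residuals."""
    best = None
    for t in np.unique(x):
        left, right = residual[x <= t], residual[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        # sum of squared errors if we predict each side's mean
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, left_val, right_val = best
    return lambda z: np.where(z <= t, left_val, right_val)

def gradient_boost(x, y, n_trees=50, lr=0.1):
    """Boosting for squared error: each stump fits the remaining residuals."""
    base = y.mean()                      # initial prediction: the mean
    pred = np.full_like(y, base, dtype=float)
    stumps = []
    for _ in range(n_trees):
        residual = y - pred              # negative gradient of squared error
        stump = fit_stump(x, residual)
        pred += lr * stump(x)            # shrink each tree's contribution
        stumps.append(stump)
    return lambda z: base + lr * sum(s(z) for s in stumps)

# Toy usage: fit a noiseless sine curve with 50 shrunken stumps.
x = np.linspace(0, 10, 100)
y = np.sin(x)
model = gradient_boost(x, y)
```

The shrinkage factor `lr` is what the real libraries call the learning rate: smaller values need more trees but usually generalize better.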

Key Implementations

  • XGBoost: the library that popularized high-performance gradient boosting — it dominated Kaggle competitions for years
  • LightGBM: histogram-based with leaf-wise tree growth; typically faster than XGBoost on large datasets
  • CatBoost: strong native handling of categorical features, and often performs well with minimal tuning

When to Use

Structured/tabular data, where boosted trees remain the strongest baseline. Reach for it when you need the best possible accuracy and can spend time tuning hyperparameters (number of trees, learning rate, tree depth).



Reference:

XGBoost documentation: https://xgboost.readthedocs.io/
