A comprehensive series explaining gradient boosting from first principles to advanced implementation details. Learn how modern gradient boosting libraries like XGBoost and LightGBM actually work under the hood.

Posts in this series

  1. What is Gradient Boosting? — Intuition for gradient boosting without heavy math
  2. Functional Gradient Descent — The mathematical foundation: gradient descent in function space
  3. Trees and the Split Gain Formula — Why trees are the standard weak learner and how we derive optimal splits
  4. Histogram-Based Split Finding — How XGBoost and LightGBM reduce split enumeration from O(data) to O(bins) per feature
  5. Depth-Wise vs Leaf-Wise Growth — Two strategies for tree construction
  6. Gradient-Based Sampling (GOSS) — LightGBM’s sampling optimization
  7. EFB and Categorical Features — Exclusive Feature Bundling and categorical handling
  8. Regularization in Practice — Hyperparameter tuning for gradient boosting
  9. XGBoost vs LightGBM — A practical comparison