Boosters is a high-performance gradient boosting library for Python and Rust.
## The Story Behind Boosters
What started as a learning experiment turned into a deep dive into gradient boosting internals.
I initially set out to dissect XGBoost models - understanding what’s really inside them and trying to reproduce their results from scratch. This curiosity led me through extensive research into how various optimizations and algorithms actually work under the hood.
Along the way, I documented my entire research and design process on the library’s website, complete with proper LaTeX formulas and detailed explanations:
- 📚 Research Documentation - Deep dives into gradient boosting theory and algorithms
- 🏗️ Design RFCs - Architecture decisions and implementation rationale
I implemented everything in Rust, which naturally led me further down the optimization path. After months of work, I ended up with a library that not only has feature and algorithm parity with XGBoost and LightGBM, but is also just as fast - or faster.
## Features
- ⚡ High Performance - Optimized Rust implementation with zero-cost abstractions
- 🐍 Python Integration - Seamless Python bindings via PyO3
- 🔧 Full Feature Parity - Supports all major XGBoost and LightGBM algorithms
- 📊 Well Documented - Comprehensive research docs and design rationale
## Quick Start
```shell
pip install boosters
```

```python
from boosters import GradientBoosting

model = GradientBoosting()
model.fit(X_train, y_train)
predictions = model.predict(X_test)
```

## Links
- 📚 Documentation - Full docs, tutorials, and API reference
- 🐙 GitHub Repository - Source code and issue tracker
- 🔬 Research - Theory and algorithm deep-dives
- 🏗️ Design RFCs - Implementation decisions
## Related Posts
This project is accompanied by a 10-part blog series that explains gradient boosting from theory to implementation:
Inside Gradient Boosting:
- What is Gradient Boosting?
- Functional Gradient Descent
- Trees and the Split Gain Formula
- Histogram-Based Split Finding
- Depth-Wise vs Leaf-Wise Tree Growth
- Gradient-Based Sampling (GOSS)
- EFB and Categorical Features
- Regularization and Hyperparameter Tuning
- XGBoost vs LightGBM: A Practical Comparison
- Building Boosters: From Scratch in Rust
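As a taste of the math the series covers, the XGBoost-style split gain can be computed directly from per-child gradient and hessian sums. A minimal sketch (the function and variable names here are my own, not the library's API):

```python
def split_gain(g_left, h_left, g_right, h_right, lam=1.0, gamma=0.0):
    """Gain of splitting a node into (left, right) children, XGBoost-style.

    g_*: sum of first-order gradients in each child
    h_*: sum of second-order gradients (hessians) in each child
    lam: L2 regularization on leaf weights; gamma: per-split complexity penalty
    """
    def score(g, h):
        # Structure score of a leaf with optimal weight: G^2 / (H + lambda)
        return g * g / (h + lam)

    parent = score(g_left + g_right, h_left + h_right)
    return 0.5 * (score(g_left, h_left) + score(g_right, h_right) - parent) - gamma


# A split that separates negative from positive gradients yields a positive gain:
print(split_gain(g_left=-4.0, h_left=3.0, g_right=5.0, h_right=4.0))  # → 4.4375
```

Split finding then reduces to scanning candidate thresholds and keeping the one with the largest gain; the histogram-based variant in the series computes these gradient sums per bin instead of per sample.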
Other ML Articles: