B.1 Gradient-Boosted Trees

The open-source gradient tree boosting packages XGBoost [Chen and Guestrin, 2016] and LightGBM [Ke et al., 2017] have been used in many winning entries of machine learning competitions; see https://xgboost.readthedocs.io/ and https://lightgbm.readthedocs.io/.

Table B.1 maps the hyperparameter notation of Table 7.21 to the corresponding parameters in XGBoost and LightGBM. Each package has many additional parameters not shown here.

              XGBoost                         LightGBM
Figure 7.21   Parameter           Default     Parameter           Default
K             num_boost_round     10          num_iterations      100
λ             reg_lambda          1           lambda_l2           0
η             eta                 0.3         learning_rate       0.1
γ             gamma               0           min_gain_to_split   0
css           colsample_bytree    1           feature_fraction    1
ss            subsample           1           bagging_fraction    1

Table B.1: Hyperparameters for two open-source gradient-boosted tree learning packages.