The open-source gradient tree boosting libraries XGBoost [Chen and Guestrin, 2016] and LightGBM [Ke et al., 2017] have been used in many winning entries in machine learning competitions; see https://xgboost.readthedocs.io/ and https://lightgbm.readthedocs.io/.
Table B.1 provides the mapping from the parameters of the code in Figure 7.21 into both XGBoost and LightGBM. Each library has many additional parameters not shown here.
| Parameter (Figure 7.21) | XGBoost Default | LightGBM Parameter | LightGBM Default |
|---|---|---|---|
| num_boost_round | 10 | num_iterations | 100 |
| lambda_reg | 1 | lambda_l2 | 0 |
| eta | 0.3 | learning_rate | 0.1 |
| gamma | 0 | min_gain_to_split | 0 |
| colsample_bytree | 1 | feature_fraction | 1 |
| subsample | 1 | bagging_fraction | 1 |
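
As a concrete illustration, the following is a minimal sketch that trains one model with each library, passing the Table B.1 parameters explicitly at their default values. The synthetic dataset and the binary classification objective are assumptions made for this example, not part of the table; note that XGBoost's own name for the L2 penalty (lambda_reg above) is "lambda" (alias "reg_lambda").

```python
import numpy as np
import xgboost as xgb
import lightgbm as lgb

# Synthetic binary classification data, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500) > 0).astype(int)

# XGBoost, native training API, with the Table B.1 parameters
# spelled out at their defaults.
dtrain = xgb.DMatrix(X, label=y)
xgb_model = xgb.train(
    {"objective": "binary:logistic", "lambda": 1, "eta": 0.3, "gamma": 0,
     "colsample_bytree": 1, "subsample": 1},
    dtrain,
    num_boost_round=10,
)

# LightGBM, with the corresponding parameters at their (different) defaults.
lgb_model = lgb.train(
    {"objective": "binary", "num_iterations": 100, "lambda_l2": 0,
     "learning_rate": 0.1, "min_gain_to_split": 0,
     "feature_fraction": 1, "bagging_fraction": 1},
    lgb.Dataset(X, label=y),
)

print(xgb_model.predict(dtrain)[:5])  # predicted probabilities from XGBoost
print(lgb_model.predict(X)[:5])       # predicted probabilities from LightGBM
```

Note that the two runs are not equivalent: with everything left at its defaults, LightGBM builds ten times as many trees with a third of the step size and no L2 regularization, which is one reason the defaults matter when comparing the libraries.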