GPU speedup: XGBoost 7.3x, LightGBM 3.6x (excluding GOSS results), CatBoost 3.3x
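For reference, a minimal sketch of how GPU training is switched on in each of the three libraries. This is not the benchmark's actual setup: the data here is synthetic, and the parameters shown are simply each library's documented GPU switch (all three require a GPU-enabled build and drivers).

```python
# Minimal sketch: enabling GPU training in XGBoost, LightGBM, CatBoost.
# Synthetic data; hyper-parameters are illustrative, not the benchmark's.
import numpy as np
import xgboost as xgb
import lightgbm as lgb
from catboost import CatBoostClassifier

X = np.random.rand(10_000, 50)
y = np.random.randint(0, 2, 10_000)

# XGBoost: GPU histogram-based tree construction
xgb_model = xgb.XGBClassifier(tree_method="gpu_hist", n_estimators=100)
xgb_model.fit(X, y)

# LightGBM: GPU device. Note: GOSS is a separate boosting mode
# (boosting_type="goss"), which the 3.6x figure above excludes.
lgb_model = lgb.LGBMClassifier(device="gpu", n_estimators=100)
lgb_model.fit(X, y)

# CatBoost: GPU task type
cb_model = CatBoostClassifier(task_type="GPU", iterations=100, verbose=False)
cb_model.fit(X, y)
```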
"while XGBoost can explore its space of hyper-parameters very fast, it does not always locate the configuration that results in the best score. While it clearly wins in both multi-class ranking tasks (Microsoft, Yahoo), for the Higgs dataset it loses to LightGBM, despite the latter being significantly slower. Furthermore, for the Epsilon dataset XGBoost cannot be used due to memory limitations.
... there are tasks for which LightGBM, albeit slower, can converge to a solution that generalizes better. Furthermore, for datasets with a large number of features, XGBoost cannot run due to memory limitations, and CatBoost converges to a good solution in the shortest time. Therefore, while we observe interesting trends, there is still no clear winner in terms of time-to-solution across all datasets and learning tasks. The challenge of building a robust GPU-accelerated GBDT framework that excels in all scenarios is thus very much an open problem."
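To make "time-to-solution" concrete: one way to measure it is to time training under early stopping against a held-out set and record the best validation score reached. A hedged sketch, using synthetic data and arbitrary thresholds, with LightGBM chosen only for illustration:

```python
# Hypothetical time-to-solution measurement: wall-clock time until
# early stopping triggers, plus the best validation score reached.
import time
import lightgbm as lgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# 28 features, roughly the shape of the Higgs dataset (illustrative only)
X, y = make_classification(n_samples=50_000, n_features=28)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.2)

start = time.perf_counter()
model = lgb.LGBMClassifier(n_estimators=1000)
model.fit(
    X_tr, y_tr,
    eval_set=[(X_val, y_val)],
    eval_metric="auc",
    callbacks=[lgb.early_stopping(stopping_rounds=50)],
)
elapsed = time.perf_counter() - start

auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
print(f"time-to-solution: {elapsed:.1f}s, validation AUC: {auc:.4f}")
```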