Gradient boosting (Machine Learning)
noun phrase
Definition: An ensemble learning method in which models are built sequentially, with each new model trained to reduce the errors made by the combined previous models by optimizing a loss function through gradient-based updates in function space [Friedman 2001; scikit-learn Documentation].
Example in context: “In particular, the last ten years have seen a resurgence of the popularity of neural networks, following the ‘deep learning revolution’, as well as highlighted superior performance of ensemble algorithms like gradient boosting in a number of applications.” [Parimbelli et al. 2022]
Related terms: boosting, gradient boosting decision trees (GBDT)
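A minimal sketch of the sequential fitting described in the definition, assuming squared-error loss (for which the negative gradient is simply the residual) and using scikit-learn decision trees as the base learners; the function names and parameter values are illustrative, not from any cited source:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost(X, y, n_estimators=50, lr=0.1, max_depth=2):
    """Fit an ensemble sequentially: each tree is trained on the
    negative gradient of the loss (for squared error, the residuals)
    of the current combined model."""
    base = y.mean()                    # initial constant model
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_estimators):
        residuals = y - pred           # negative gradient of 0.5*(y - F)^2
        tree = DecisionTreeRegressor(max_depth=max_depth).fit(X, residuals)
        pred += lr * tree.predict(X)   # shrunken gradient step in function space
        trees.append(tree)
    return base, trees

def predict(base, trees, X, lr=0.1):
    """Sum the base prediction and the shrunken tree outputs."""
    return base + lr * sum(t.predict(X) for t in trees)

# Toy data: noisy quadratic relationship.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = X[:, 0] ** 2 + rng.normal(scale=0.1, size=200)

base, trees = gradient_boost(X, y)
mse = np.mean((predict(base, trees, X) - y) ** 2)
```

After boosting, the training error (`mse`) falls well below the variance of `y`, illustrating how each successive tree reduces the errors of the ensemble built so far.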