Gradient boosting is used in regression and classification problems to produce a predictive model in the form of an ensemble of weak prediction models, typically decision trees. It generalizes other boosting methods by allowing the optimization of an arbitrary differentiable loss function: at each stage, a new weak learner is fitted to the negative gradient of the loss with respect to the current model's predictions.
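The stagewise procedure above can be sketched from scratch for the squared-error loss, where the negative gradient is simply the residual. The following is a minimal illustration using one-feature decision stumps as the weak learners; all function names are illustrative and do not come from any particular library.

```python
def stump_fit(x, residuals):
    """Fit a depth-1 regression tree (stump): find the threshold that
    minimises the squared error of the two leaf means."""
    best = None
    order = sorted(range(len(x)), key=lambda i: x[i])
    for k in range(1, len(x)):
        thr = (x[order[k - 1]] + x[order[k]]) / 2
        left = [residuals[i] for i in order[:k]]
        right = [residuals[i] for i in order[k:]]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda v: lm if v <= thr else rm

def gradient_boost(x, y, n_rounds=50, lr=0.1):
    """Each round fits a stump to the negative gradient of the loss
    (for squared error, the residuals) and adds it to the ensemble."""
    f0 = sum(y) / len(y)              # initial constant prediction
    stumps = []
    pred = [f0] * len(y)
    for _ in range(n_rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]  # negative gradient
        s = stump_fit(x, resid)
        stumps.append(s)
        pred = [pi + lr * s(xi) for pi, xi in zip(pred, x)]
    return lambda v: f0 + lr * sum(s(v) for s in stumps)

x = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
y = [0.0, 0.2, 0.9, 1.1, 2.1, 1.9]
model = gradient_boost(x, y)
print(round(model(4.0), 2))  # close to the observed value 2.1
```

The learning rate `lr` shrinks each stump's contribution, which is the usual way boosting trades training speed for generalization; a different differentiable loss would only change how `resid` is computed.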
The Light Gradient Boosting Machine (LightGBM) is a particular variation of gradient boosting, with some modifications that make it particularly advantageous. It is based on decision trees, but it grows them leaf-wise rather than level by level: at each step it splits the leaf that yields the largest reduction in loss, which tends to reach a lower loss with the same number of leaves.
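The leaf-wise strategy can be illustrated in isolation with a small sketch, assuming a squared-error split criterion; the names and the data are hypothetical and this is not LightGBM's actual implementation, which also uses histogram-based binning and other optimizations.

```python
def best_split(points):
    """points: list of (x, y). Return (gain, threshold) of the best
    squared-error split, or None if the leaf cannot be split."""
    if len(points) < 2:
        return None
    pts = sorted(points)
    ys = [y for _, y in pts]
    mean = sum(ys) / len(ys)
    sse_parent = sum((y - mean) ** 2 for y in ys)
    best = None
    for k in range(1, len(pts)):
        left, right = ys[:k], ys[k:]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((y - lm) ** 2 for y in left)
               + sum((y - rm) ** 2 for y in right))
        gain = sse_parent - sse
        thr = (pts[k - 1][0] + pts[k][0]) / 2
        if best is None or gain > best[0]:
            best = (gain, thr)
    return best

def grow_leaf_wise(points, max_leaves=4):
    """Leaf-wise (best-first) growth: repeatedly split whichever
    current leaf offers the single largest gain."""
    leaves = [points]
    while len(leaves) < max_leaves:
        candidates = [(best_split(leaf), i) for i, leaf in enumerate(leaves)]
        candidates = [(c, i) for c, i in candidates if c is not None]
        if not candidates:
            break
        (gain, thr), i = max(candidates)  # leaf with the largest gain wins
        leaf = leaves.pop(i)
        leaves.append([p for p in leaf if p[0] <= thr])
        leaves.append([p for p in leaf if p[0] > thr])
    return leaves

data = [(0, 0.0), (1, 0.1), (2, 5.0), (3, 5.1), (4, 9.9), (5, 10.0)]
print(sorted(len(leaf) for leaf in grow_leaf_wise(data, max_leaves=3)))
# → [2, 2, 2]: both splits land where they reduce the loss most
```

A level-wise grower would instead split every leaf at each depth, spending its leaf budget uniformly; growing best-first concentrates splits where they help, which is why LightGBM pairs this strategy with a `num_leaves` cap to control overfitting.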