Ensemble modeling - XGBoost
XGBoost, that is, Extreme Gradient Boosting, is a widely used machine learning ensemble technique that has helped data scientists across the globe achieve highly accurate results. XGBoost is built on the principles of ensemble modeling and is an improved version of the Gradient Boosted Machine (GBM) algorithm. In general, the XGBoost algorithm creates multiple classifiers that are weak learners, meaning models that perform only slightly better than a random guess. Each learner in the ensemble can be a linear model or a tree model, built iteratively on a random sample of the data and weighted according to what the previously built models learned. At each step, a tree is built, and the cases that the tree fails to classify correctly are assigned a corresponding weight. The next iteration of model building learns from the mistakes of the previous model. At each step, the weight of an incorrect prediction is calculated using...
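To make the boosting process concrete, the following is a minimal sketch of fitting an XGBoost classifier in Python. The dataset, hyperparameter values, and variable names here are illustrative assumptions chosen for the example, not prescribed by the text; it assumes the xgboost and scikit-learn packages are installed.

```python
# Minimal, illustrative XGBoost example (assumed dataset and parameters).
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# Load a sample binary-classification dataset
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# Each of the n_estimators trees is a weak learner; every new tree is fit
# to correct the errors made by the ensemble built so far.
model = XGBClassifier(
    n_estimators=100,    # number of boosting rounds (trees)
    max_depth=3,         # shallow trees act as weak learners
    learning_rate=0.1,   # shrinks each tree's contribution
    subsample=0.8,       # random row sampling for each tree
    eval_metric="logloss",
)
model.fit(X_train, y_train)

# Evaluate the boosted ensemble on held-out data
preds = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, preds))
```

In this sketch, lowering max_depth keeps each tree weak, while n_estimators and learning_rate together control how many boosting iterations are run and how strongly each new tree corrects the previous ones.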