7. Building ML models using Azure Machine Learning
In the previous chapter, we learned how to extract features from textual and categorical columns using NLP techniques. In this chapter, we will use the knowledge we have gained so far to create and train a powerful tree-based ensemble classifier.
First, we will look behind the scenes of popular ensemble classifiers such as random forest, XGBoost, and LightGBM. These classifiers perform extremely well in real-world scenarios, and all of them are based on decision trees under the hood. By understanding their main benefits, you will be able to easily spot problems that decision-tree ensembles can solve.
We will also learn the difference between gradient boosting and random forest and what makes these tree ensembles useful for practical applications. Both techniques help to overcome the main weaknesses of single decision trees, namely overfitting and high variance: random forest does so by averaging many de-correlated trees trained on random subsets of the data (bagging), while gradient boosting trains trees sequentially, each one correcting the errors of the previous ones. Both approaches can be applied to many different classification and regression problems.
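To make the contrast concrete, here is a minimal sketch (not part of the book's Azure Machine Learning workflow, and using a synthetic dataset for illustration) that trains a single decision tree, a random forest, and a gradient-boosted ensemble with scikit-learn and compares their test accuracy:

```python
# Minimal sketch: single decision tree vs. two tree ensembles on synthetic data.
# The dataset and hyperparameters are illustrative assumptions, not from the book.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification problem with some noisy features
X, y = make_classification(
    n_samples=2000, n_features=20, n_informative=10, random_state=42
)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

models = {
    # A single tree: prone to overfitting (high variance)
    "decision_tree": DecisionTreeClassifier(random_state=42),
    # Bagging: averages many de-correlated trees to reduce variance
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=42),
    # Boosting: trees are fitted sequentially, each correcting prior errors
    "gradient_boosting": GradientBoostingClassifier(random_state=42),
}

scores = {
    name: model.fit(X_train, y_train).score(X_test, y_test)
    for name, model in models.items()
}
print(scores)
```

On most runs of a dataset like this, both ensembles outperform the single tree on held-out data, which is exactly the effect the chapter explores in more depth with XGBoost and LightGBM.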
Finally...