Summary
In this chapter, we learned about Extreme Gradient Boosting (XGBoost), an implementation of Gradient Boosting Machines. We learned how to install the library, and then we applied it to a variety of supervised learning problems: classification, regression, and ranking.
XGBoost shines when the data is structured: when it is possible to extract good features from our data and put these features into a tabular format. In some cases, however, the data is quite hard to structure. For example, when dealing with images or sounds, a lot of effort is needed to extract useful features. But we do not necessarily have to do the feature extraction ourselves; instead, we can use Neural Network models, which can learn the best features themselves.
In the next chapter, we will look at deeplearning4j, a deep learning library for Java.