Neural networks for tabular competitions
Having discussed how neural networks can be used with DAEs, we will complete this chapter by looking at how neural networks can help you in a tabular competition more generally. Gradient boosting solutions still clearly dominate tabular competitions (as well as real-world projects); however, neural networks can sometimes catch signals that gradient boosting models cannot, and they can serve as excellent single models or as models that shine in an ensemble.
As many past and present Grandmasters often point out, mixing diverse models (such as a neural network and a gradient boosting model) usually produces better results on a tabular data problem than either model taken separately. Owen Zhang, previously number one on Kaggle, discusses at length in the following interview how neural networks and GBMs can be blended for better results in a competition: https://www.youtube.com/watch?v=LgLcfZjNF44.
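As a minimal sketch of such a blend, the following example averages the predicted probabilities of a simple feed-forward network and a gradient boosting model on a synthetic dataset. The dataset, the scikit-learn model choices, and the 50/50 weighting are illustrative assumptions only; in a real competition you would tune the blend weights on validation data or out-of-fold predictions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

# Synthetic tabular data (illustrative only)
X, y = make_classification(n_samples=5000, n_features=20,
                           n_informative=10, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Neural networks need scaled inputs; tree-based models do not
scaler = StandardScaler().fit(X_train)

nn = MLPClassifier(hidden_layer_sizes=(64, 32),
                   max_iter=500, random_state=0)
nn.fit(scaler.transform(X_train), y_train)

gbm = GradientBoostingClassifier(random_state=0)
gbm.fit(X_train, y_train)

nn_pred = nn.predict_proba(scaler.transform(X_valid))[:, 1]
gbm_pred = gbm.predict_proba(X_valid)[:, 1]

# Simple 50/50 blend of the two models' probabilities
blend = 0.5 * nn_pred + 0.5 * gbm_pred

print("NN AUC   :", roc_auc_score(y_valid, nn_pred))
print("GBM AUC  :", roc_auc_score(y_valid, gbm_pred))
print("Blend AUC:", roc_auc_score(y_valid, blend))
```

Because the two models make errors in different places, the averaged predictions often score better than either model alone, which is the basic intuition behind blending diverse models.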
Building a neural network quickly...