Hyperparameter Optimization
How a Kaggle solution performs is not determined solely by the type of learning algorithm you choose. Aside from the data and the features you use, it is also strongly determined by the algorithm’s hyperparameters: the parameters of the algorithm that must be fixed prior to training and cannot be learned from the data during training. Choosing the right data and features is most effective in tabular data competitions; hyperparameter optimization, however, is effective in competitions of every kind. In fact, given fixed data and a fixed algorithm, hyperparameter optimization is the surest way to enhance the algorithm’s predictive performance and climb the leaderboard. It also helps in ensembling, because an ensemble of tuned models almost always performs better than an ensemble of untuned ones.
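To make the distinction concrete, here is a minimal sketch (not from the original text) using scikit-learn on synthetic data: the entries in `param_distributions` are hyperparameters fixed before training, while the model's internal parameters (the tree splits) are learned when `fit` is called. The specific search ranges are illustrative assumptions, not recommendations.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

# Synthetic stand-in for competition data
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Hyperparameters: fixed before training, not learned from the data
# (these ranges are illustrative assumptions)
param_distributions = {
    "n_estimators": [100, 300, 500],
    "max_depth": [None, 5, 10, 20],
    "min_samples_leaf": [1, 2, 5],
}

search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions=param_distributions,
    n_iter=20,
    cv=5,
    scoring="roc_auc",
    random_state=0,
)
search.fit(X, y)  # the model's parameters (tree splits) are learned here
print(search.best_params_, search.best_score_)
```

Each candidate configuration is evaluated by cross-validation, so the search optimizes estimated generalization performance rather than training fit.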
You may hear that tuning hyperparameters manually is feasible if you understand the effect each choice has on the algorithm. Many...