Adding dropout to prevent overfitting
Another popular regularization method is dropout. Dropout forces a neural network to learn multiple independent representations by randomly dropping neurons (and their incoming and outgoing connections) during training. For example, with a dropout rate of 0.5, each neuron's output is set to zero with a probability of 0.5 on every training update, so a given unit participates in only about half of the updates on average. Because a different thinned sub-network is trained at each step, a network with dropout can be seen as an ensemble of networks.
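To make the placement concrete, here is a minimal sketch of where Dropout layers sit in a Keras Sequential model; the layer sizes and the binary classification task are assumed purely for illustration, not taken from this recipe:

from keras.models import Sequential
from keras.layers import Dense, Dropout

# Hypothetical architecture: 20 input features, two hidden layers of 64 units
model = Sequential()
model.add(Dense(64, input_dim=20, activation='relu'))
model.add(Dropout(0.5))  # each hidden unit is zeroed with probability 0.5 per training update
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))
model.compile(loss='binary_crossentropy', optimizer='adam')

Note that Keras uses inverted dropout: activations are rescaled during training, so dropout is simply inactive at prediction time and no extra adjustment is needed.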
In the following recipe, we will improve a model that clearly overfits the training data by adding dropout layers.
How to do it...
- Import the libraries as follows:
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.wrappers.scikit_learn import KerasRegressor
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import KFold
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
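As a preview of how these imports fit together, the following is a minimal sketch rather than the recipe's final model: create_model, the layer sizes, and the placeholder X and y are assumptions standing in for the architecture and dataset used later in the recipe.

# Hypothetical network builder for the scikit-learn wrapper
def create_model():
    model = Sequential()
    model.add(Dense(128, input_dim=13, activation='relu'))
    model.add(Dropout(0.5))  # drop half of the hidden units during training
    model.add(Dense(1))
    model.compile(loss='mean_squared_error', optimizer='adam')
    return model

np.random.seed(7)
X = np.random.rand(100, 13)  # placeholder data; replace with the recipe's dataset
y = np.random.rand(100)

# Standardize the inputs and cross-validate the Keras model in one pipeline
estimators = [('standardize', StandardScaler()),
              ('mlp', KerasRegressor(build_fn=create_model, epochs=10,
                                     batch_size=16, verbose=0))]
pipeline = Pipeline(estimators)
kfold = KFold(n_splits=5)
results = cross_val_score(pipeline, X, y, cv=kfold)
# KerasRegressor.score returns the negative loss, so negate it to report MSE
print('MSE: %.2f (%.2f)' % (-results.mean(), results.std()))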