DNNs
While there are better ways to implement purely linear models, specifying DNNs with a varying number of layers is where TensorFlow and learn really shine.
We'll use the same input features, but now we'll build a DNN with two hidden layers, the first with 10 neurons and the second with 5. Creating this model will take only one line of Python code; it could not be easier.
The specification is similar to our linear model. We still need SKCompat
, but now it's learn.DNNClassifier
. For arguments, there's one additional requirement: the number of neurons on each hidden layer, passed as a list. This one simple argument, which really captures the essence of a DNN model, puts the power of deep learning at your fingertips.
There are some optional arguments as well, but we'll mention only optimizer. It lets you choose among common optimization routines, such as stochastic gradient descent (SGD) or Adam. Very convenient!
# Dense neural net
# (argument values here are illustrative; 'train' is assumed to hold the
# same input features used for the linear model earlier)
classifier = estimator.SKCompat(learn.DNNClassifier(
        hidden_units=[10, 5],  # two hidden layers: 10 neurons, then 5
        n_classes=5,
        feature_columns=learn.infer_real_valued_columns_from_input(train)))