
How TFLearn makes building TensorFlow models easier

  • 7 min read
  • 04 Jun 2018


Today, we will introduce you to TFLearn and create layers and models that are directly useful in any model implementation with TensorFlow.

TFLearn is a modular library in Python that is built on top of core TensorFlow.

Note: This article is an excerpt from the book Mastering TensorFlow 1.x written by Armando Fandango. In this book, you will learn how to build TensorFlow models to work with multilayer perceptrons using Keras, TFLearn, and R.

TIP: TFLearn is different from the TensorFlow Learn package, which is also known as TF Learn (with one space between TF and Learn). TFLearn is available at http://tflearn.org/, and its source code is available on GitHub at https://github.com/tflearn/tflearn.

TFLearn can be installed in Python 3 with the following command:

pip3 install tflearn


Note: To install TFLearn in other environments or from source, please refer to the following link: http://tflearn.org/installation/

The simple workflow in TFLearn is as follows (a compact sketch combining all seven steps appears right after the list):

  1.  Create an input layer first.
  2.  Pass the input object to create further layers.
  3.  Add the output layer.
  4.  Create the net using an estimator layer such as regression.
  5.  Create a model from the net created in the previous step.
  6.  Train the model with the model.fit() method.
  7.  Use the trained model to predict or evaluate.
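
Here is a minimal sketch that puts all seven steps together; it assumes a generic classification dataset, so X_train, Y_train, X_test, Y_test, num_inputs, and n_classes are placeholders for your own data rather than names from the example that follows:

import tflearn

# Step 1: create the input layer (num_inputs is your feature count)
input_layer = tflearn.input_data(shape=[None, num_inputs])

# Steps 2 and 3: hidden layer, then the output layer
layer1 = tflearn.fully_connected(input_layer, 10, activation='relu')
output = tflearn.fully_connected(layer1, n_classes, activation='softmax')

# Step 4: create the net with an estimator layer
net = tflearn.regression(output, optimizer='adam', loss='categorical_crossentropy')

# Steps 5 to 7: build, train, and evaluate the model
model = tflearn.DNN(net)
model.fit(X_train, Y_train, n_epoch=10, batch_size=100, show_metric=True)
score = model.evaluate(X_test, Y_test)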

Creating the TFLearn Layers


Let us learn how to create the layers of the neural network models in TFLearn:

  1.  Create an input layer first:

input_layer = tflearn.input_data(shape=[None, num_inputs])

  2.  Pass the input object to create further layers:

layer1 = tflearn.fully_connected(input_layer, 10, activation='relu')

layer2 = tflearn.fully_connected(layer1, 10, activation='relu')

  3.  Add the output layer:

output = tflearn.fully_connected(layer2, n_classes, activation='softmax')

  4.  Create the final net from the estimator layer such as regression:

net = tflearn.regression(output, optimizer='adam', metric=tflearn.metrics.Accuracy(), loss='categorical_crossentropy')


TFLearn provides several classes of layers, which are described in the following sub-sections.

TFLearn core layers


TFLearn offers the following layers in the tflearn.layers.core module:

  • input_data: This layer is used to specify the input layer for the neural network.
  • fully_connected: This layer is used to specify a layer where all the neurons are connected to all the neurons in the previous layer.
  • dropout: This layer is used to specify dropout regularization. The input elements are scaled by 1/keep_prob while keeping the expected sum unchanged.
  • custom_layer: This layer is used to specify a custom function to be applied to the input. This class wraps our custom function and presents the function as a layer.
  • reshape: This layer reshapes the input into the output of the specified shape.
  • flatten: This layer converts the input tensor to a 2D tensor.
  • activation: This layer applies the specified activation function to the input tensor.
  • single_unit: This layer applies the linear function to the inputs.
  • highway: This layer implements the fully connected highway function.
  • one_hot_encoding: This layer converts the numeric labels to their binary vector one-hot encoded representations.
  • time_distributed: This layer applies the specified function to each time step of the input tensor.
  • multi_target_data: This layer creates and concatenates multiple placeholders, specifically used when the layers use targets from multiple sources.
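
As a quick illustration, here is a sketch chaining a few core layers for MNIST-sized inputs (784 features); the layer sizes are illustrative choices, and the 0.8 passed to dropout is the keep_prob:

net = tflearn.input_data(shape=[None, 784])
net = tflearn.fully_connected(net, 128, activation='relu')
net = tflearn.dropout(net, 0.8)  # keep 80% of activations during training
net = tflearn.fully_connected(net, 10, activation='softmax')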

TFLearn convolutional layers


TFLearn offers the following layers in the tflearn.layers.conv module:

  • conv_1d: This layer applies 1D convolutions to the input data.
  • conv_2d: This layer applies 2D convolutions to the input data.
  • conv_3d: This layer applies 3D convolutions to the input data.
  • conv_2d_transpose: This layer applies the transpose of conv_2d to the input data.
  • conv_3d_transpose: This layer applies the transpose of conv_3d to the input data.
  • atrous_conv_2d: This layer computes a 2D atrous convolution.
  • grouped_conv_2d: This layer computes a depth-wise 2D convolution.
  • max_pool_1d: This layer computes 1D max pooling.
  • max_pool_2d: This layer computes 2D max pooling.
  • avg_pool_1d: This layer computes 1D average pooling.
  • avg_pool_2d: This layer computes 2D average pooling.
  • upsample_2d: This layer applies the row- and column-wise 2D repeat operation.
  • upscore_layer: This layer implements the upscore operation as specified in http://arxiv.org/abs/1411.4038.
  • global_max_pool: This layer implements the global max pooling operation.
  • global_avg_pool: This layer implements the global average pooling operation.
  • residual_block: This layer implements the residual block to create deep residual networks.
  • residual_bottleneck: This layer implements the residual bottleneck block for deep residual networks.
  • resnext_block: This layer implements the ResNeXt block.
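
For example, a small convolutional stack for 28x28 grayscale images might look like the following sketch (the filter counts and sizes here are illustrative choices, not prescriptions):

net = tflearn.input_data(shape=[None, 28, 28, 1])
net = tflearn.conv_2d(net, 32, 3, activation='relu')  # 32 filters of size 3x3
net = tflearn.max_pool_2d(net, 2)                     # 2x2 max pooling
net = tflearn.conv_2d(net, 64, 3, activation='relu')
net = tflearn.max_pool_2d(net, 2)
net = tflearn.fully_connected(net, 10, activation='softmax')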

TFLearn recurrent layers


TFLearn offers the following layers in the tflearn.layers.recurrent module:

  • simple_rnn: This layer implements the simple recurrent neural network model.
  • bidirectional_rnn: This layer implements the bi-directional RNN model.
  • lstm: This layer implements the LSTM model.
  • gru: This layer implements the GRU model.
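
For instance, treating each 28x28 image as a sequence of 28 rows, an LSTM classifier could be sketched as follows (128 hidden units is an illustrative choice):

net = tflearn.input_data(shape=[None, 28, 28])  # 28 time steps of 28 features each
net = tflearn.lstm(net, 128)                    # LSTM layer with 128 units
net = tflearn.fully_connected(net, 10, activation='softmax')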

TFLearn normalization layers


TFLearn offers the following layers in the tflearn.layers.normalization module:

  • batch_normalization: This layer normalizes the output of activations of the previous layer for each batch.
  • local_response_normalization: This layer implements local response normalization.
  • l2_normalize: This layer applies L2 normalization to the input tensors.
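
A common pattern is to insert batch normalization between a linear layer and its activation; here is a sketch, reusing the input_layer created earlier:

net = tflearn.fully_connected(input_layer, 64)  # no activation yet
net = tflearn.batch_normalization(net)          # normalize per batch
net = tflearn.activation(net, 'relu')           # then apply the activation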

TFLearn embedding layers


TFLearn offers only one layer in the tflearn.layers.embedding_ops module:

  • embedding: This layer implements the embedding function for a sequence of integer IDs or floats.
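
For example, a text model might map sequences of 100 word IDs drawn from a 10,000-word vocabulary into 128-dimensional vectors; the vocabulary size and dimensions here are illustrative:

net = tflearn.input_data(shape=[None, 100])  # 100 integer word IDs per sample
net = tflearn.embedding(net, input_dim=10000, output_dim=128)
net = tflearn.lstm(net, 128)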

TFLearn merge layers


TFLearn offers the following layers in the tflearn.layers.merge_ops module:

  • merge_outputs: This layer merges a list of tensors into a single tensor; it is generally used to merge output tensors of the same shape.
  • merge: This layer merges a list of tensors into a single tensor; you can specify the axis along which the merge needs to be done.
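
For instance, two parallel branches over the same input can be concatenated with merge, as in this sketch (again reusing the input_layer from the earlier step example):

branch1 = tflearn.fully_connected(input_layer, 32, activation='relu')
branch2 = tflearn.fully_connected(input_layer, 32, activation='tanh')
net = tflearn.merge([branch1, branch2], mode='concat', axis=1)  # concatenate along the feature axis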

TFLearn estimator layers


TFLearn offers only one layer in the tflearn.layers.estimator module:

  • regression: This layer implements linear or logistic regression.

While creating the regression layer, you can specify the optimizer, loss, and metric functions. TFLearn offers the following optimizer functions as classes in the tflearn.optimizers module:

  • SGD
  • RMSprop
  • Adam
  • Momentum
  • AdaGrad
  • Ftrl
  • AdaDelta
  • ProximalAdaGrad
  • Nesterov


Note: You can create custom optimizers by extending the tflearn.optimizers.Optimizer base class.
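
Instead of passing the optimizer by name, you can instantiate one of these classes and pass it to the regression layer; in this sketch, the learning-rate and decay values are illustrative, and output is the output layer created earlier:

sgd = tflearn.optimizers.SGD(learning_rate=0.01, lr_decay=0.96, decay_step=100)
net = tflearn.regression(output, optimizer=sgd, loss='categorical_crossentropy')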

TFLearn offers the following metric functions as classes or ops in the tflearn.metrics module:

  • Accuracy or accuracy_op
  • Top_k or top_k_op
  • R2 or r2_op
  • WeightedR2 or weighted_r2_op
  • binary_accuracy_op


Note: You can create custom metrics by extending the tflearn.metrics.Metric base class.
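
A metric instance is passed to the regression layer the same way as in the earlier example; for instance, to track top-3 accuracy instead of plain accuracy (k=3 is an illustrative choice):

top3 = tflearn.metrics.Top_k(k=3)
net = tflearn.regression(output, optimizer='adam', metric=top3, loss='categorical_crossentropy')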

TFLearn provides the following loss functions, known as objectives, in the tflearn.objectives module:

  • softmax_categorical_crossentropy
  • categorical_crossentropy
  • binary_crossentropy
  • weighted_crossentropy
  • mean_square
  • hinge_loss
  • roc_auc_score
  • weak_cross_entropy_2d
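
A loss function is selected by passing its name to the regression layer. For example, a two-class model with a single sigmoid output unit would typically use binary_crossentropy, as in this sketch (layer2 is the hidden layer from the earlier step example):

output = tflearn.fully_connected(layer2, 1, activation='sigmoid')
net = tflearn.regression(output, optimizer='adam', loss='binary_crossentropy')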


While specifying the input, hidden, and output layers, you can specify the activation functions to be applied to the output. TFLearn provides the following activation functions in the tflearn.activations module:

  • linear
  • tanh
  • sigmoid
  • softmax
  • softplus
  • softsign
  • relu
  • relu6
  • leaky_relu
  • prelu
  • elu
  • crelu
  • selu
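
An activation can be given by name in a layer's activation argument, or applied separately through the activation layer from tflearn.layers.core; this sketch shows both forms on the input_layer from earlier:

net = tflearn.fully_connected(input_layer, 10, activation='leaky_relu')

# equivalently, apply the activation as a separate layer:
net2 = tflearn.fully_connected(input_layer, 10)
net2 = tflearn.activation(net2, activation='leaky_relu')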

Creating the TFLearn Model


Create the model from the net created in the previous step (step 4 in the Creating the TFLearn Layers section):

model = tflearn.DNN(net)
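
The DNN constructor also accepts optional arguments such as tensorboard_verbose and checkpoint_path; the values in this sketch are illustrative:

model = tflearn.DNN(net, tensorboard_verbose=0, checkpoint_path='model.ckpt')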

Types of TFLearn models


TFLearn offers two different model classes:

  • DNN (Deep Neural Network) model: This class allows you to create a multilayer perceptron from the network that you have created from the layers.
  • SequenceGenerator model: This class allows you to create a deep neural network that can generate sequences.

Training the TFLearn Model


After creating the model, train it with the model.fit() method:

model.fit(X_train, Y_train, n_epoch=n_epochs, batch_size=batch_size, show_metric=True, run_id='dense_model')
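
You can also ask fit() to hold out a validation split; for example, passing validation_set=0.1 reserves 10% of the training data for validation (the fraction is an illustrative choice):

model.fit(X_train, Y_train, n_epoch=n_epochs, validation_set=0.1, batch_size=batch_size, show_metric=True, run_id='dense_model')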

Using the TFLearn Model


Use the trained model to predict or evaluate:

score = model.evaluate(X_test, Y_test)

print('Test accuracy:', score[0])
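
To obtain predictions rather than a score, call model.predict(), which returns class probabilities for a softmax output; the np.argmax step in this sketch recovers the predicted class labels:

import numpy as np

y_proba = model.predict(X_test)        # probabilities per class
y_labels = np.argmax(y_proba, axis=1)  # predicted class indices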


The complete code for the TFLearn MNIST classification example is provided in the notebook ch-02_TF_High_Level_Libraries. The output from the TFLearn MNIST example is as follows:

Training Step: 5499  | total loss: 0.42119 | time: 1.817s
| Adam | epoch: 010 | loss: 0.42119 - acc: 0.8860 -- iter: 54900/55000
Training Step: 5500  | total loss: 0.40881 | time: 1.820s
| Adam | epoch: 010 | loss: 0.40881 - acc: 0.8854 -- iter: 55000/55000
--
Test accuracy: 0.9029


Note: You can get more information about TFLearn from the following link: http://tflearn.org/.

To summarize, we got to know TFLearn and its different layers and models.

If you found this post useful, do check out this book Mastering TensorFlow 1.x, to explore advanced features of TensorFlow 1.x, and gain insight into TensorFlow Core, Keras, TF Estimators, TFLearn, TF Slim, Pretty Tensor, and Sonnet.

