At this point we introduce our implementation of a recurrent model with LSTM blocks for an image classification problem. The dataset we use is the well-known MNIST.
The implemented model is composed of a single LSTM layer, followed by a reduce-mean operation and a softmax layer, as illustrated in the following figure:
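The shape flow through this pipeline can be sketched as follows. This is a minimal NumPy stand-in, not the actual TensorFlow model: the LSTM outputs are random placeholders, and the batch size, number of time steps, hidden size, and class count are illustrative values I chose, not taken from the source.

```python
import numpy as np

# Illustrative sizes (assumptions, not from the source)
batch, time_steps, hidden, n_classes = 4, 28, 128, 10

# Placeholder for the LSTM layer's outputs:
# one hidden vector per time step, shape (batch, time_steps, hidden)
lstm_outputs = np.random.randn(batch, time_steps, hidden)

# Reduce-mean across the time dimension -> shape (batch, hidden)
pooled = np.mean(lstm_outputs, axis=1)

# Softmax layer: a linear projection followed by softmax
W = np.random.randn(hidden, n_classes)
b = np.zeros(n_classes)
logits = pooled @ W + b
exp = np.exp(logits - logits.max(axis=1, keepdims=True))
probs = exp / exp.sum(axis=1, keepdims=True)

print(pooled.shape)  # (4, 128)
print(probs.shape)   # (4, 10) -- each row sums to 1
```

The reduce-mean step collapses the per-time-step outputs into a single fixed-size vector per image, which is what the softmax classifier then consumes.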
Dataflow in an RNN architecture
The following function computes the mean of elements across the dimensions of a tensor, reducing input_tensor along the dimensions given in axis. Unless keep_dims is true, the rank of the tensor is reduced by 1 for each entry in axis; if keep_dims is true, the reduced dimensions are retained with length 1:
tf.reduce_mean(input_tensor, axis=None,
keep_dims=False, name=None, reduction_indices=None)
If axis has no entries, all dimensions are reduced, and a tensor with a single element is returned.
For example:
# 'x' is [[1., 1....
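To make the reduction semantics concrete, here is a small worked illustration on a 2×2 matrix. The matrix values are my own choice, and NumPy's np.mean is used as a stand-in since it follows the same axis/keepdims conventions described above:

```python
import numpy as np

x = np.array([[1., 1.],
              [2., 2.]])

# No axis given: all dimensions are reduced to a single value
print(np.mean(x))          # 1.5

# Reduce along axis 0 (columns)
print(np.mean(x, axis=0))  # [1.5 1.5]

# Reduce along axis 1 (rows)
print(np.mean(x, axis=1))  # [1. 2.]

# With keepdims=True the reduced dimension is kept with length 1
print(np.mean(x, axis=1, keepdims=True).shape)  # (2, 1)
```

In the model above, this reduction is applied along the time-step dimension of the LSTM outputs.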