As we have seen previously, one technique for building a model is to pass a list of the required layers to the tf.keras.Sequential() constructor. In this instance, we have three layers: an embedding layer, an RNN layer, and a dense layer.
The first layer, the embedding, is a lookup table of vectors, one vector for the numeric value of each character; each vector has embedding_dimension elements. The middle layer is a recurrent GRU with recurrent_nn_units units. The final layer is a dense output layer with vocabulary_length units.
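To make the embedding layer concrete, the following minimal sketch (with hypothetical sizes, not the book's actual values) shows how it maps integer character codes to dense vectors:

```python
import tensorflow as tf

# Hypothetical sizes for illustration: a 65-character vocabulary,
# each character mapped to a 256-element vector.
embedding = tf.keras.layers.Embedding(input_dim=65, output_dim=256)

# A batch of one sequence of three character codes.
char_ids = tf.constant([[1, 5, 9]])
vectors = embedding(char_ids)

print(vectors.shape)  # one vector per input character
```

Each integer index simply selects a row of the layer's trainable weight matrix, so the output has one 256-element vector per input character.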
For each input character, the model looks up its embedding, runs the GRU for a single time step with that embedding as input, and passes the GRU output to the dense layer, which generates logits (log odds) over the vocabulary for the next character.
The following diagram shows this flow:
![](https://static.packt-cdn.com/products/9781789530759/graphics/assets/3832c4d7-fb9f-40af-99bf-9f2c4acd2584.png)
The code that implements this model is, therefore, as follows:
def build_model...
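The function above is truncated; a minimal sketch of how it might look is shown below, using the layer names and parameters from the text. The stateful, fixed-batch-size plumbing that character-RNN models often add for generation is omitted here for brevity, so treat this as an illustration rather than the book's exact code:

```python
import tensorflow as tf

def build_model(vocabulary_length, embedding_dimension, recurrent_nn_units):
    # Stack the three layers described above: embedding -> GRU -> dense.
    model = tf.keras.Sequential([
        # Lookup table: one embedding_dimension-long vector per character.
        tf.keras.layers.Embedding(vocabulary_length, embedding_dimension),
        # Recurrent layer; return_sequences=True emits an output at every
        # time step, so the dense layer produces logits for each position.
        tf.keras.layers.GRU(recurrent_nn_units, return_sequences=True),
        # Unnormalized log odds (logits) over the vocabulary.
        tf.keras.layers.Dense(vocabulary_length)
    ])
    return model

model = build_model(vocabulary_length=65, embedding_dimension=256,
                    recurrent_nn_units=1024)
# A batch of one ten-character sequence yields logits of shape
# (batch, time steps, vocabulary_length).
logits = model(tf.zeros([1, 10], dtype=tf.int32))
print(logits.shape)
```

At generation time, the next character is typically drawn from these logits, for example with tf.random.categorical, rather than always taking the argmax.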