In this chapter, we introduced RNNs and the role they play in music generation, showing that operating on a sequence and remembering past inputs are essential properties for generating music.
We also generated a MIDI file using the Drums RNN model on the command line, covering most of its parameters and learning how to configure the model's output. By walking through the generation algorithm, we explained how it works and how the different flags change its execution.
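As a reminder of that workflow, the following is a minimal sketch of a command-line invocation. It assumes Magenta 1.x is installed and that the pre-trained drum_kit_rnn.mag bundle has been downloaded into the working directory; the output directory, primer, and flag values shown here are illustrative:

```bash
# Generate one drum sequence of 64 steps, primed with a single bass drum hit (pitch 36).
drums_rnn_generate \
  --bundle_file=drum_kit_rnn.mag \
  --config=drum_kit \
  --output_dir=output \
  --num_outputs=1 \
  --num_steps=64 \
  --primer_drums="[(36,)]" \
  --temperature=1.0
```

Raising --temperature above 1.0 makes the sampling more random, while lowering it makes the output more conservative.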
By using the Drums RNN model in Python, we showed how to build a more versatile application. Along the way, we learned about the MIDI specification, how Magenta encodes a NoteSequence using Protobuf, and how to encode a sequence as a one-hot vector. We also introduced the idea of sending the generated MIDI to other applications, a topic we'll cover in Chapter 9, Making Magenta...
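The sketch below recalls the core of that Python workflow. It assumes Magenta 1.x's module layout (magenta.music, magenta.protobuf, and the drums_rnn model package) and a local drum_kit_rnn.mag bundle; the file names, primer, and four-second generation window are illustrative rather than prescriptive:

```python
import magenta.music as mm
from magenta.models.drums_rnn import drums_rnn_sequence_generator
from magenta.music import sequence_generator_bundle
from magenta.protobuf import generator_pb2, music_pb2

# Load the pre-trained bundle and build the "drum_kit" sequence generator.
bundle = sequence_generator_bundle.read_bundle_file("drum_kit_rnn.mag")
generator_map = drums_rnn_sequence_generator.get_generator_map()
generator = generator_map["drum_kit"](checkpoint=None, bundle=bundle)
generator.initialize()

# Build a primer NoteSequence (a Protobuf message) with one bass drum hit.
primer = music_pb2.NoteSequence()
primer.tempos.add(qpm=120)
primer.notes.add(pitch=36, velocity=100, start_time=0, end_time=0.125,
                 is_drum=True)
primer.total_time = 0.125

# Ask for roughly four seconds of drums, continuing right after the primer.
options = generator_pb2.GeneratorOptions()
options.args["temperature"].float_value = 1.0
options.generate_sections.add(start_time=primer.total_time, end_time=4.0)

sequence = generator.generate(primer, options)

# Write the generated NoteSequence to a MIDI file.
mm.sequence_proto_to_midi_file(sequence, "generated_drums.mid")
```

The generated sequence stays a NoteSequence message until the final conversion step, which is also what makes it straightforward to route the result to other applications instead of a file, as we'll see in Chapter 9.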