Generating new music notes
In this recipe, we will generate new sample music notes. New musical notes can be generated by altering the num_timesteps parameter. However, one should be careful when increasing the number of timesteps, as the resulting higher dimensionality of the input vectors can make the current RBM setup computationally inefficient. These RBMs can be made more efficient at learning by stacking them into Deep Belief Networks (DBNs). Readers can leverage the DBN code from Chapter 5, Generative Models in Deep Learning, to generate new musical notes.
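To see why larger num_timesteps values strain the RBM, note that the visible vector has length num_timesteps * 2 * note_range, so the weight matrix grows linearly with the number of timesteps. The following is a minimal numpy sketch of the single visible-hidden-visible pass the recipe performs; all sizes and the randomly initialised parameters are illustrative assumptions, not values from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not taken from the recipe)
num_timesteps = 15          # hypothetical number of timesteps per sample
note_range = 78             # hypothetical span of MIDI pitches
n_visible = num_timesteps * 2 * note_range  # visible vector length
n_hidden = 50               # hypothetical hidden-layer size

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Random stand-ins for the trained parameters prv_w, prv_hb, prv_vb
W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
hb = np.zeros(n_hidden)
vb = np.zeros(n_visible)

# One sampled "song" vector plays the role of sample_image
v0 = rng.integers(0, 2, size=(1, n_visible)).astype(float)

h0 = sigmoid(v0 @ W + hb)        # visible -> hidden activation
v1 = sigmoid(h0 @ W.T + vb)      # hidden -> visible reconstruction
S = v1[0].reshape(num_timesteps, 2 * note_range)  # note-state matrix
```

Doubling num_timesteps here doubles n_visible and hence the number of rows in W, which is the cost the recipe warns about.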
How to do it...
- Create new sample music:
# Compute hidden-layer activations from the visible input
hh0 = tf$nn$sigmoid(tf$matmul(X, W) + hb)
# Reconstruct the visible layer from the hidden activations
vv1 = tf$nn$sigmoid(tf$matmul(hh0, tf$transpose(W)) + vb)
# Run the hidden pass using the trained weights and biases
feed = sess$run(hh0, feed_dict = dict(X = sample_image, W = prv_w, hb = prv_hb))
# Run the reconstruction pass to obtain the generated sample
rec = sess$run(vv1, feed_dict = dict(hh0 = feed, W = prv_w, vb = prv_vb))
# Reshape the first generated sample into a num_timesteps x (2 * note_range) note-state matrix
S = np$reshape(rec[1, ], newshape = shape(num_timesteps, 2 * note_range))
- Regenerate the MIDI file:
midi_manipulation$noteStateMatrixToMidi(S, name=paste0("generated_chord_1...
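The reconstructed matrix S holds sigmoid probabilities in [0, 1], whereas a note-state matrix encodes notes as 0/1 entries, so it is common to binarise S before converting it to MIDI. The sketch below shows one way to do this in numpy by sampling each entry as a Bernoulli draw; the sizes and the random stand-in for S are illustrative assumptions, not values from the recipe.

```python
import numpy as np

rng = np.random.default_rng(1)
num_timesteps, note_range = 15, 78  # hypothetical sizes

# Random stand-in for the reconstructed probability matrix S from the recipe
S = rng.random((num_timesteps, 2 * note_range))

# Binarise: draw each entry as a Bernoulli sample with its own probability,
# yielding a 0/1 note-state matrix suitable for MIDI conversion
S_binary = (rng.random(S.shape) < S).astype(int)
```

A simple alternative is a fixed threshold, for example (S > 0.5).astype(int), which gives a deterministic rather than sampled binarisation.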