Attention
Are you paying attention? If so, certainly not to everything equally. In any text, some words matter more than others. An attention mechanism is a way for a neural network to focus on certain elements of a sequence. Focusing, for neural networks, means amplifying what is important.
An attention layer is a fully connected layer that takes in a sequence and outputs a weighting for that sequence. The sequence is then multiplied element-wise by those weightings:
from tensorflow.keras import backend as K
from tensorflow.keras.layers import (Dense, Lambda, Multiply,
                                     Permute, RepeatVector, Reshape)

def attention_3d_block(inputs, time_steps, single_attention_vector=False):
    input_dim = int(inputs.shape[2])
    # Swap the time and feature axes: (batch, time, features) -> (batch, features, time)
    a = Permute((2, 1), name='Attent_Permute')(inputs)
    a = Reshape((input_dim, time_steps), name='Reshape')(a)
    # A softmax over the time axis creates the attention vector
    a = Dense(time_steps, activation='softmax', name='Attent_Dense')(a)
    if single_attention_vector:
        # Average over the feature axis so all features share one attention vector
        a = Lambda(lambda x: K.mean(x, axis=1), name='Dim_reduction')(a)
        # Repeat the shared vector so it can be multiplied with every feature
        a = RepeatVector(input_dim, name='Repeat')(a)
    # Swap the axes back so the weights line up with the input sequence
    a_probs = Permute((2, 1), name='Attention_vec')(a)
    # Scale the input sequence by its attention weights
    output_attention_mul = Multiply(name='Attention_mul')([inputs, a_probs])
    return output_attention_mul
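To see the block in context, here is a minimal usage sketch that places it on top of an LSTM encoder. The input shape, layer sizes, and binary classification head are hypothetical placeholders, not taken from the text:

from tensorflow.keras.layers import Input, LSTM, Flatten, Dense
from tensorflow.keras.models import Model

TIME_STEPS = 20   # hypothetical sequence length
FEATURES = 8      # hypothetical number of input features

inp = Input(shape=(TIME_STEPS, FEATURES))
# return_sequences=True keeps one output per time step for the attention block to weight
x = LSTM(32, return_sequences=True)(inp)
x = attention_3d_block(x, time_steps=TIME_STEPS)
x = Flatten()(x)
out = Dense(1, activation='sigmoid')(x)  # e.g., a binary classification head

model = Model(inputs=inp, outputs=out)
model.compile(optimizer='adam', loss='binary_crossentropy')
model.summary()

Because the LSTM returns the full sequence, the attention block can up-weight the time steps that matter most before the result is flattened and passed to the output layer.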