Store and retrieve information in Neural Turing Machines
An attention mechanism can be used to access part of the memory in memory-augmented networks.
The concept of memory in Neural Turing Machines has been inspired by both neuroscience and computer hardware.
Using RNN hidden states to store information does not scale: the hidden state cannot hold sufficiently large amounts of data and retrieve it reliably, even when the RNN is augmented with a memory cell, as in the LSTM.
To address this, Neural Turing Machines (NTMs) were designed with an external memory bank and read/write heads, while retaining the magic of being trained end-to-end via gradient descent.
Reading from the memory bank is done with an attention over the (variable) memory bank, in the same way as the attention over inputs in the previous examples:
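In the notation of the NTM paper (Graves et al., 2014), with $M_t$ an $N \times M$ memory matrix and $w_t$ a normalized attention vector over its $N$ rows, the read head returns the weighted sum

$$r_t = \sum_{i=1}^{N} w_t(i)\, M_t(i), \qquad \text{with } \sum_{i=1}^{N} w_t(i) = 1 \text{ and } w_t(i) \ge 0.$$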
This can be illustrated the following way:
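(A minimal sketch, assuming NumPy; `memory`, `read_weights`, and the shapes are illustrative choices, not from any particular NTM library.)

```python
import numpy as np

# Memory bank: N slots, each storing a vector of size M.
N, M = 8, 4
memory = np.random.randn(N, M)

# Attention weights over the slots: a softmax makes them non-negative
# and sum to 1 (random scores here; in an NTM they come from a
# learned addressing mechanism).
scores = np.random.randn(N)
read_weights = np.exp(scores) / np.exp(scores).sum()

# Read: r_t = sum_i w_t(i) * M_t(i), a weighted sum of the slots.
read_vector = read_weights @ memory   # shape (M,)
```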
Writing a value to the memory bank, in turn, consists of blending the new value into part of the memory, again through an attention mechanism:
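In the NTM paper this is refined into an erase step followed by an add step, both gated by the same attention weights; for a single write head the two steps combine into

$$M_t(i) = M_{t-1}(i)\,\big[\mathbf{1} - w_t(i)\, e_t\big] + w_t(i)\, a_t,$$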
where the add vector $a_t$ describes the information to store, the erase vector $e_t$ (with components in $(0,1)$) controls how much of the old content is cleared, and the attention weights $w_t(i)$ select which memory rows are modified.
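The write can be sketched the same way (continuing the NumPy example above; the erase/add split is the one from the NTM paper, while the random vectors stand in for learned quantities):

```python
# Write weights: another attention over the N slots.
w_scores = np.random.randn(N)
write_weights = np.exp(w_scores) / np.exp(w_scores).sum()

# e_t in (0, 1): how much of each slot component to clear;
# a_t: the information to store (both learned in a real NTM).
erase = 1.0 / (1.0 + np.exp(-np.random.randn(M)))   # sigmoid
add = np.random.randn(M)

# M_t(i) = M_{t-1}(i) * (1 - w_t(i) e_t) + w_t(i) a_t, for every row i.
memory = memory * (1.0 - np.outer(write_weights, erase)) \
         + np.outer(write_weights, add)
```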