Memory networks
Answering questions or resolving problems given a few facts or a story has led to the design of a new type of network: the memory network. The facts or the story are embedded into a memory bank, as if they were inputs. To solve tasks that require ordering the facts or creating transitions between them, memory networks apply a recurrent reasoning process, in multiple steps or hops, over the memory banks.
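The hop mechanism described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a reference implementation: the helper name `memory_hop` and the embedding matrices `A` and `C` are assumptions made here for the example.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def memory_hop(u, X, A, C):
    """One reasoning hop (hypothetical helper).

    u: (d,)  current query/control state
    X: (n, v) bag-of-words fact vectors
    A, C: (v, d) embedding matrices for the two memory banks
    """
    m = X @ A           # first memory bank
    c = X @ C           # second memory bank
    p = softmax(m @ u)  # attention weights over the n facts
    o = p @ c           # attention-weighted readout
    return u + o        # identity (residual) connection

rng = np.random.default_rng(0)
v, d, n = 20, 8, 5          # vocabulary size, embedding dim, number of facts
X = rng.random((n, v))
A = 0.1 * rng.standard_normal((v, d))
C = 0.1 * rng.standard_normal((v, d))
u = rng.standard_normal(d)

# Several hops; per-hop matrices could differ, here they are shared.
for _ in range(3):
    u = memory_hop(u, X, A, C)
print(u.shape)  # prints (8,)
```

Sharing `A` and `C` across hops is one design choice; making them a function of the hop index, as in the text, simply means allocating one pair of matrices per hop.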
First, the query or question q is converted into a constant input embedding:
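In the usual end-to-end memory network formulation, with B an embedding matrix (the symbol is an assumption here), this can be written as:

```latex
u = B\,q
```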
Meanwhile, at each step of the reasoning, the facts X needed to answer the question are embedded into two memory banks, with embedding coefficients that are a function of the timestep:
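A plausible form, with per-hop embedding matrices A^{(t)} and C^{(t)} applied to each fact x_i (the notation is assumed here):

```latex
m_i^{(t)} = A^{(t)} x_i, \qquad c_i^{(t)} = C^{(t)} x_i
```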
The attention weights are then computed by comparing the question embedding with the first memory bank:
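Assuming the usual inner-product attention between the query embedding u and the first-bank entries m_i:

```latex
p_i = \operatorname{softmax}\!\left(u^{\top} m_i\right)
```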
The hop's output is then read from the second memory bank, with entries selected, that is, weighted, by the attention:
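With p_i the attention weights and c_i the second-bank embeddings (symbols assumed here), the readout is the weighted sum:

```latex
o = \sum_i p_i\, c_i
```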
The output at each reasoning time step is then combined with the current state through an identity (residual) connection, as seen previously, which eases the training of the recurrence:
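A plausible form of this residual update, with u^{(t)} the state after hop t (notation assumed):

```latex
u^{(t+1)} = u^{(t)} + o^{(t)}
```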
A linear layer and a classification softmax layer are added on top of the last hop's output to predict the answer:
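Assuming K hops and a weight matrix W (both names are assumptions here), the answer distribution can be sketched as:

```latex
\hat{a} = \operatorname{softmax}\!\left(W\, u^{(K+1)}\right)
```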