Memory is often associated with Recurrent Neural Networks (RNNs), but that association is not entirely accurate. An RNN is really only useful for storing a sequence of events, giving the network what you might call a temporal sense, a sense of time if you will. RNNs do this by feeding state back into themselves in a recursive, or recurrent, loop. An example of how this looks is shown here:
The diagram shows the internal representation of a recurrent neuron unrolled over a number of time steps or iterations, where x represents the input at a time step and h denotes the state. The network weights W, U, and V remain the same for all time steps and are trained using a technique called Backpropagation Through Time (BPTT). We won't go into the math of BPTT here, leaving that for the reader to explore on their own.
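To make the recurrence concrete, here is a minimal NumPy sketch (an illustration, not the book's implementation) of a recurrent neuron unrolled over a few time steps. The state update h = tanh(W x + U h) carries information forward, and the same W, U, and V matrices are reused at every step. The layer sizes, the tanh activation, and the random initialization are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

input_size, hidden_size, output_size = 3, 4, 2

# Shared weights, reused at every time step (the W, U, and V above).
W = rng.standard_normal((hidden_size, input_size)) * 0.1   # input -> hidden
U = rng.standard_normal((hidden_size, hidden_size)) * 0.1  # hidden -> hidden (the recurrent loop)
V = rng.standard_normal((output_size, hidden_size)) * 0.1  # hidden -> output

xs = rng.standard_normal((5, input_size))  # a sequence of 5 input time steps
h = np.zeros(hidden_size)                  # initial state

for t, x in enumerate(xs):
    # The state persists: h at step t depends on the current input AND the previous h.
    h = np.tanh(W @ x + U @ h)
    y = V @ h
    print(f"step {t}: output {y}")

Because h is threaded through the loop rather than recomputed from scratch, each output depends on the whole input sequence seen so far; training with BPTT would backpropagate gradients through this same loop.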