Long short-term memory (LSTM) is a particular architecture of recurrent neural network (RNN). RNNs are designed to preserve a memory of past events, something that is not possible with ordinary feedforward networks, which have no internal state. This is why RNNs are used in areas where classic networks do not produce good results, such as the prediction of time series (weather, stock prices, and so on), where the next value depends on previous data.
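As a minimal sketch of how an RNN carries memory forward (the names, shapes, and tanh activation here are illustrative assumptions, not taken from the text), the hidden state is updated at every time step from the current input and the previous state:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    # The hidden state h_prev summarizes all previous inputs; feeding it
    # back into the update is what lets the network "remember" the past.
    return np.tanh(W_x @ x_t + W_h @ h_prev + b)
```

Because the same update is applied at every step, information from earlier inputs can influence later predictions, which is exactly what a stateless feedforward network cannot do.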
An LSTM network consists of cells (LSTM blocks) that are linked together. Each cell contains three gates: the input gate, the output gate, and the forget gate. These implement the write, read, and reset functions on the cell's memory, respectively, so an LSTM module can regulate what is stored and what is deleted. Each gate is composed of a sigmoid neural layer followed by a pointwise multiplication: the sigmoid outputs values between 0 and 1 that describe how much of each component is let through.
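A minimal NumPy sketch of a single LSTM cell step may make the three gates concrete. The weight layout (W for input weights, U for recurrent weights, b for biases) and the candidate update g are standard LSTM details assumed here, not spelled out in the text above:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, params):
    """One LSTM cell step; params holds weights W (input), U (recurrent)
    and biases b for each gate -- names are illustrative."""
    W, U, b = params["W"], params["U"], params["b"]
    # Input gate: how much new information to write to the cell memory.
    i = sigmoid(W["i"] @ x_t + U["i"] @ h_prev + b["i"])
    # Forget gate: how much of the existing memory to reset (erase).
    f = sigmoid(W["f"] @ x_t + U["f"] @ h_prev + b["f"])
    # Output gate: how much of the memory to read out as the hidden state.
    o = sigmoid(W["o"] @ x_t + U["o"] @ h_prev + b["o"])
    # Candidate values that could be written to the cell state.
    g = np.tanh(W["g"] @ x_t + U["g"] @ h_prev + b["g"])
    # New cell state: keep what the forget gate allows, add gated new input.
    c_t = f * c_prev + i * g
    # New hidden state: read the squashed memory through the output gate.
    h_t = o * np.tanh(c_t)
    return h_t, c_t

# Usage with random weights (sizes chosen arbitrarily for the example):
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
params = {
    "W": {k: rng.standard_normal((n_hid, n_in)) * 0.1 for k in "ifog"},
    "U": {k: rng.standard_normal((n_hid, n_hid)) * 0.1 for k in "ifog"},
    "b": {k: np.zeros(n_hid) for k in "ifog"},
}
h, c = np.zeros(n_hid), np.zeros(n_hid)
h, c = lstm_step(rng.standard_normal(n_in), h, c, params)
```

Note how each gate is exactly a sigmoid layer followed by a pointwise multiplication: the forget gate scales the old memory, the input gate scales the new write, and the output gate scales the read, which is how the cell regulates what is stored and deleted.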