Recurrent Neural Network
Within the family of Artificial Neural Networks (ANNs), there are several variants that differ in the number of hidden layers and in how data flows through the network. One of these variants is the Recurrent Neural Network (RNN), in which the connections between neurons can form a cycle. Unlike feed-forward networks, RNNs can use internal memory during processing: their hidden-layer connections are propagated through time, which allows them to learn sequences (a minimal sketch of this recurrence follows the list below). RNN use cases include the following fields:
- Stock market predictions
- Image captioning
- Weather forecasting
- Time-series-based forecasts
- Language translation
- Speech recognition
- Handwriting recognition
- Audio or video processing
- Robotics action sequencing
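To make the idea of connections propagated through time concrete, here is a minimal sketch (not a production implementation) of a single recurrent layer written with NumPy. The layer sizes, the weight names `W_xh` and `W_hh`, and the tanh activation are illustrative assumptions; the point is that the same weights are reused at every time step and the hidden state `h` acts as the network's internal memory.

```python
import numpy as np

# Minimal sketch of one recurrent layer: the same weights are applied at
# every time step, and the hidden state h carries information forward.
rng = np.random.default_rng(0)

input_size, hidden_size, seq_len = 4, 8, 5                     # illustrative sizes (assumed)
W_xh = rng.standard_normal((hidden_size, input_size)) * 0.1    # input-to-hidden weights
W_hh = rng.standard_normal((hidden_size, hidden_size)) * 0.1   # hidden-to-hidden weights (the "cycle")
b_h = np.zeros(hidden_size)

x_seq = rng.standard_normal((seq_len, input_size))  # a toy input sequence
h = np.zeros(hidden_size)                           # initial hidden state (the internal memory)

for t, x_t in enumerate(x_seq):
    # h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h): the new hidden state depends
    # on the current input and on the previous hidden state.
    h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
    print(f"step {t}: hidden state norm = {np.linalg.norm(h):.3f}")
```

Deep learning frameworks expose this recurrence as a ready-made layer, but the loop above is the essential mechanism that distinguishes an RNN from a feed-forward network.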
The networks we have studied so far (feed-forward networks) are based on input data that is fed to the network and converted into an output. In the case of a supervised learning algorithm, the output is a label that identifies the input. Basically, these algorithms map raw data to specific categories by recognizing...