25 August 2024 | Ibomoiye Domor Mienye, Theo G. Swart, George Obaido
This paper provides a comprehensive review of Recurrent Neural Networks (RNNs) and their applications, highlighting advancements in architectures such as Long Short-Term Memory (LSTM) networks, gated recurrent units (GRUs), bidirectional LSTM (BiLSTM), echo state networks (ESNs), peephole LSTM, and stacked LSTM. The study examines the application of RNNs in various domains, including natural language processing (NLP), speech recognition, time series forecasting, autonomous vehicles, and anomaly detection. It discusses recent innovations, such as the integration of attention mechanisms and the development of hybrid models combining RNNs with convolutional neural networks (CNNs) and transformer architectures. The review aims to provide machine learning (ML) researchers and practitioners with a comprehensive overview of the current state and future directions of RNN research. The paper is organized into sections covering related works, the fundamentals of RNNs, advanced RNN variants, innovations in RNN architectures and training methodologies, public datasets for RNN research, applications in peer-reviewed literature, and future research directions.