Deep Residual Echo State Networks: exploring residual orthogonal connections in untrained Recurrent Neural Networks
Abstract
Deep Residual Echo State Networks (DeepResESNs) enhance long-term temporal modeling and memory capacity through hierarchical untrained residual layers, outperforming traditional shallow and deep reservoir computing methods.
Echo State Networks (ESNs) are a particular type of untrained Recurrent Neural Networks (RNNs) within the Reservoir Computing (RC) framework, popular for their fast and efficient learning. However, traditional ESNs often struggle with long-term information processing. In this paper, we introduce a novel class of deep untrained RNNs based on temporal residual connections, called Deep Residual Echo State Networks (DeepResESNs). We show that leveraging a hierarchy of untrained residual recurrent layers significantly boosts memory capacity and long-term temporal modeling. For the temporal residual connections, we consider different orthogonal configurations, including randomly generated and fixed-structure configurations, and we study their effect on network dynamics. A thorough mathematical analysis outlines necessary and sufficient conditions to ensure stable dynamics within DeepResESNs. Our experiments on a variety of time series tasks showcase the advantages of the proposed approach over traditional shallow and deep RC.
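As a concrete illustration, the sketch below shows how a DeepResESN-style forward pass could look in plain NumPy: a stack of untrained reservoir layers, each carrying its previous state forward through a fixed random orthogonal matrix (the temporal residual connection) before adding the nonlinear reservoir contribution, with each layer feeding the next. The function names, layer sizes, scaling hyperparameters, and the exact way the residual and reservoir terms are combined are illustrative assumptions rather than the paper's precise formulation; as usual in Reservoir Computing, only a linear readout on top of the collected states would be trained (e.g. with ridge regression).

```python
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n):
    # Random orthogonal matrix via QR decomposition of a Gaussian matrix.
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

def init_layer(n_in, n_units, spectral_radius=0.9, input_scaling=1.0):
    # Untrained reservoir layer: recurrent weights rescaled to a target spectral
    # radius, random input weights, and a fixed orthogonal matrix O for the
    # temporal residual (skip) connection. All values here are illustrative.
    W = rng.uniform(-1.0, 1.0, (n_units, n_units))
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = rng.uniform(-input_scaling, input_scaling, (n_units, n_in))
    O = random_orthogonal(n_units)
    return {"W": W, "W_in": W_in, "O": O}

def run_deep_res_esn(layers, u_seq, alpha=0.5):
    # u_seq has shape (T, n_in). The previous state of each layer is carried
    # forward through its orthogonal matrix O and mixed with the nonlinear
    # reservoir update (an assumed leaky-style combination, not necessarily
    # the exact update rule analyzed in the paper).
    states = [np.zeros(layer["W"].shape[0]) for layer in layers]
    collected = []
    for u in u_seq:
        x_in = u
        for i, layer in enumerate(layers):
            pre = layer["W_in"] @ x_in + layer["W"] @ states[i]
            states[i] = (1.0 - alpha) * (layer["O"] @ states[i]) + alpha * np.tanh(pre)
            x_in = states[i]  # hierarchical composition: layer i feeds layer i+1
        collected.append(np.concatenate(states))
    return np.array(collected)

# Example: three reservoir layers of 100 units each on a scalar input sequence.
# A trained linear readout over these collected states would complete the model.
layers = [init_layer(1, 100)] + [init_layer(100, 100) for _ in range(2)]
X = run_deep_res_esn(layers, rng.standard_normal((200, 1)))
print(X.shape)  # (200, 300)
```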
Community
This is an automated message from the Librarian Bot. I found the following papers similar to this paper.
The following papers were recommended by the Semantic Scholar API
- Residual Reservoir Memory Networks (2025)
- Scalable Equilibrium Propagation via Intermediate Error Signals for Deep Convolutional CRNNs (2025)
- Feature-Based Echo-State Networks: A Step Towards Minimalism and Interpretability in Reservoir Computer (2024)
- Time-Scale Coupling Between States and Parameters in Recurrent Neural Networks (2025)
- Hebbian Memory-Augmented Recurrent Networks: Engram Neurons in Deep Learning (2025)
- Minimal Deterministic Echo State Networks Outperform Random Reservoirs in Learning Chaotic Dynamics (2025)
- DeepKoopFormer: A Koopman Enhanced Transformer Based Architecture for Time Series Forecasting (2025)
Models citing this paper: 0
Datasets citing this paper: 0
Spaces citing this paper: 0