
From S4 to Mamba: A Comprehensive Survey on Structured State Space Models

Published as an arXiv preprint, 2025

Abstract

Recent advancements in sequence modeling have led to the emergence of Structured State Space Models (SSMs) as an efficient alternative to Recurrent Neural Networks (RNNs) and Transformers, addressing challenges in long-range dependency modeling and computational efficiency. While RNNs suffer from vanishing gradients and sequential inefficiencies, and Transformers face quadratic complexity, SSMs leverage structured recurrence and state-space representations to achieve superior long-sequence processing with linear or near-linear complexity. This survey provides a comprehensive review of SSMs, tracing their evolution from the foundational S4 model to its successors like Mamba, Simplified Structured State Space Sequence Model (S5), and Jamba, highlighting their improvements in computational efficiency, memory optimization, and inference speed. By comparing SSMs with traditional sequence models across domains such as natural language processing (NLP), speech recognition, vision, and time-series forecasting, we demonstrate their advantages in handling long-range dependencies while reducing computational overhead. Despite their potential, challenges remain in areas such as training optimization, hybrid modeling, and interpretability. This survey serves as a structured guide for researchers and practitioners, detailing the advancements, trade-offs, and future directions of SSM-based architectures in AI and deep learning.

Keywords: Computing methodologies · Machine learning · Deep learning theory · Structured State Space Models (SSMs) · Sequence modeling · Computational efficiency · Efficient transformer alternatives

SSM Architecture

Structured State Space Models (SSMs) represent a paradigm shift in sequence modeling, offering an efficient alternative to traditional Recurrent Neural Networks (RNNs) and Transformers. At their core, SSMs leverage structured recurrence and state-space representations to achieve superior long-sequence processing with linear or near-linear complexity.

The fundamental architecture of SSMs is based on continuous-time state-space equations that are discretized for practical implementation. This approach enables efficient processing of long sequences while maintaining the ability to capture complex temporal dependencies.
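In the S4 line of work, these equations take the standard linear state-space form below; a zero-order-hold (ZOH) discretization, shown here as one common choice, converts the continuous parameters into the discrete recurrence that is actually applied to token sequences.

```latex
% Continuous-time state-space model
x'(t) = A\,x(t) + B\,u(t), \qquad y(t) = C\,x(t) + D\,u(t)

% Zero-order-hold discretization with step size \Delta
\bar{A} = e^{\Delta A}, \qquad
\bar{B} = (\Delta A)^{-1}\bigl(e^{\Delta A} - I\bigr)\,\Delta B

% Discrete recurrence applied to an input sequence u_1, \dots, u_L
x_k = \bar{A}\,x_{k-1} + \bar{B}\,u_k, \qquad y_k = C\,x_k + D\,u_k
```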

Figure 1: Three Views of the Linear State Space Layer
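To make the figure's views concrete, the toy NumPy sketch below (illustrative values only, with the D term omitted) computes the same outputs once via the sequential recurrence and once via the equivalent causal convolution whose kernel is K = (CB̄, CĀB̄, CĀ²B̄, …); it is a minimal illustration, not an implementation from the survey.

```python
import numpy as np

# Toy discretized linear SSM (D term omitted): x_k = Ab @ x_{k-1} + Bb * u_k,  y_k = C @ x_k
rng = np.random.default_rng(0)
N, L = 4, 8                                                  # state dimension, sequence length
Ab = 0.9 * np.eye(N) + 0.05 * rng.standard_normal((N, N))   # discretized A (illustrative values)
Bb = rng.standard_normal((N, 1))                             # discretized B
C = rng.standard_normal((1, N))
u = rng.standard_normal(L)

# Recurrent view: O(L) sequential state updates (fast autoregressive inference).
x = np.zeros((N, 1))
y_rec = []
for k in range(L):
    x = Ab @ x + Bb * u[k]
    y_rec.append((C @ x).item())

# Convolutional view: the same map is a causal convolution with kernel
# K = (C Bb, C Ab Bb, C Ab^2 Bb, ...), which enables parallel training.
K = np.array([(C @ np.linalg.matrix_power(Ab, i) @ Bb).item() for i in range(L)])
y_conv = [float(np.dot(K[:k + 1][::-1], u[:k + 1])) for k in range(L)]

print(np.allclose(y_rec, y_conv))  # True: both views produce the same output sequence
```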

🌱 Evolution from S4 to Mamba

The journey of Structured State Space Models has been marked by significant architectural innovations, each building upon the previous to address specific computational and modeling challenges.

  • S4 (Structured State Space Sequence Model): The foundational model that introduced structured state-space representations for efficient sequence modeling, achieving linear complexity while maintaining strong performance on long-range tasks.

  • Mamba: A breakthrough architecture that combines the efficiency of SSMs with selective state updates, enabling even faster inference and better scaling to longer sequences. Mamba introduces a selective scan mechanism that adaptively processes the relevant parts of the input; a minimal sketch of this recurrence appears after this list.

  • S5 (Simplified Structured State Space Sequence Model): An optimized variant that simplifies the S4 architecture while maintaining its core benefits, offering improved training stability and easier implementation.

  • Jamba: A hybrid architecture that combines the strengths of SSMs with attention mechanisms, providing a flexible framework for various sequence modeling tasks.
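The sketch below is a minimal, unoptimized NumPy illustration of the selective-recurrence idea behind Mamba: the step size Δ and the B, C projections are recomputed from the input at every position, which is what lets the state retain or discard information selectively. The function names and random projections are placeholders standing in for learned layers, and the real implementation relies on a hardware-aware parallel scan rather than a Python loop.

```python
import numpy as np

def selective_scan(u, A, s_B, s_C, s_delta):
    """Minimal sketch of a Mamba-style selective SSM (illustrative, not the official code).

    u        : (L, D) input sequence
    A        : (D, N) diagonal state matrix entries, one row per channel
    s_B, s_C : map an input step (D,) -> (N,) input-dependent projections
    s_delta  : map an input step (D,) -> (D,) positive step sizes
    """
    L, D = u.shape
    N = A.shape[1]
    x = np.zeros((D, N))
    ys = np.zeros((L, D))
    for k in range(L):
        delta = s_delta(u[k])                    # (D,) input-dependent step size
        B_k, C_k = s_B(u[k]), s_C(u[k])          # (N,) input-dependent projections
        A_bar = np.exp(delta[:, None] * A)       # (D, N) discretized (diagonal) A
        B_bar = delta[:, None] * B_k[None, :]    # (D, N) simplified discretization of B
        x = A_bar * x + B_bar * u[k][:, None]    # selective state update
        ys[k] = x @ C_k                          # (D,) readout
    return ys

# Illustrative usage with random projections standing in for learned layers.
rng = np.random.default_rng(0)
L, D, N = 16, 8, 4
u = rng.standard_normal((L, D))
A = -np.abs(rng.standard_normal((D, N)))         # negative entries keep the recurrence stable
W_B, W_C, W_d = rng.standard_normal((D, N)), rng.standard_normal((D, N)), rng.standard_normal(D)
y = selective_scan(u, A,
                   s_B=lambda z: z @ W_B,
                   s_C=lambda z: z @ W_C,
                   s_delta=lambda z: np.log1p(np.exp(z * W_d)))  # softplus keeps steps positive
print(y.shape)  # (16, 8)
```

In contrast to S4, where Ā, B̄, and C are fixed after training, here the effective dynamics change with the input, trading the convolutional view for a scan that must be computed sequentially (or with a parallel scan in practice).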

These variants demonstrate the continuous evolution of SSM architectures, each addressing specific challenges in computational efficiency, memory usage, and modeling capabilities.

📈 Research Impact & Applications

This comprehensive survey provides a structured guide for researchers and practitioners working with Structured State Space Models. The work is available as an arXiv preprint and is currently under peer review, reflecting the growing interest in efficient alternatives for sequence modeling.

The survey covers multiple domains where SSMs have shown promising results:

  • Natural Language Processing (NLP): SSMs have demonstrated competitive performance with Transformers while requiring significantly less computational resources for long sequences.

  • Speech Recognition: The linear complexity of SSMs makes them particularly suitable for processing long audio sequences, where traditional models struggle with computational overhead.

  • Computer Vision: SSMs have been successfully applied to video understanding tasks, leveraging their ability to model temporal dependencies efficiently.

  • Time-Series Forecasting: The structured nature of SSMs makes them well-suited for modeling complex temporal patterns in various forecasting applications.

The research community has shown significant interest in SSMs as a potential alternative to Transformers, particularly for applications requiring long-range dependencies. The survey serves as a comprehensive resource for understanding the current state of SSM research and identifying future directions for development.

🔬 Technical Contributions

The survey provides detailed analysis of several key aspects of SSMs:

  1. Theoretical Foundations: Comprehensive coverage of the mathematical underpinnings of state-space modeling and its discretization for practical applications.

  2. Architectural Comparisons: Detailed comparison of SSMs with RNNs and Transformers across multiple dimensions including computational complexity, memory usage, and modeling capabilities.

  3. Implementation Considerations: Practical guidance on training optimization, memory management, and hybrid modeling approaches.

  4. Future Directions: Identification of open challenges and promising research directions in SSM development.

The work contributes to the broader understanding of efficient sequence modeling and provides a foundation for future research in this rapidly evolving field.

BibTeX Citation
@article{2503.18970,
  author   = {Somvanshi, Shriyank and Islam, Md Monzurul and Mimi, Mahmuda Sultana and Polock, Sazzad Bin Bashar and Chhetri, Gaurab and Das, Subasish},
  title    = {From S4 to Mamba: A Comprehensive Survey on Structured State Space Models},
  journal  = {arXiv preprint arXiv:2503.18970},
  year     = {2025},
  month    = mar,
  doi      = {10.48550/arXiv.2503.18970},
  url      = {https://arxiv.org/abs/2503.18970},
  keywords = {Structured State Space Models, S4, Mamba, Sequence Modeling}
}