
A Survey on Kolmogorov-Arnold Networks

Published in ACM Computing Surveys 2025

Abstract

This survey explores the theoretical foundations, evolution, applications, and future potential of Kolmogorov-Arnold Networks (KAN), a neural network model inspired by the Kolmogorov-Arnold representation theorem. KANs set themselves apart from traditional neural networks by employing learnable, spline-parameterized functions rather than fixed activation functions, allowing for flexible and interpretable representations of high-dimensional functions. The review examines KAN's architectural strengths, including adaptive edge-based activation functions that enhance parameter efficiency and scalability across varied applications such as time series forecasting, computational biomedicine, and graph learning. Key advancements, including Temporal-KAN (T-KAN), FastKAN, and Partial Differential Equation KAN (PDE-KAN), illustrate KAN's growing applicability in dynamic environments, significantly improving interpretability, computational efficiency, and adaptability for complex function approximation tasks. Moreover, the paper discusses KAN's integration with other architectures, such as convolutional, recurrent, and transformer-based models, showcasing its versatility in complementing established neural networks for tasks that require hybrid approaches. Despite its strengths, KAN faces computational challenges in high-dimensional and noisy data settings, sparking continued research into optimization strategies, regularization techniques, and hybrid models. This paper highlights KAN's expanding role in modern neural architectures and outlines future directions to enhance its computational efficiency, interpretability, and scalability in data-intensive applications.

CCS Concepts: Computing methodologies · Machine learning · Deep learning theory · Kolmogorov-Arnold Networks (KAN) · Model interpretability · Applied computing · Predictive analytics

KAN Architecture

Kolmogorov-Arnold Networks (KANs) introduce a novel edge-based approach where learnable spline functions replace traditional fixed activation functions. Instead of applying non-linearities at nodes (like ReLU), each edge in a KAN applies a parameterized transformation $\phi_{ij}(x)$, typically modeled using B-splines.

This design enables adaptive, interpretable, and compositional learning, inspired by the Kolmogorov–Arnold representation theorem. It enhances both model flexibility and explainability by learning customized activation behavior along each connection.
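For reference, the Kolmogorov-Arnold representation theorem states that any continuous multivariate function on a bounded domain can be written as a finite composition of continuous univariate functions and addition:

```latex
% Kolmogorov-Arnold representation theorem: every continuous
% f: [0,1]^n -> R decomposes into sums of univariate functions.
f(x_1, \ldots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)
```

KAN layers generalize this two-level structure by stacking many such sums of learnable univariate maps, as the figure below illustrates.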


Figure 1: Schematic representation of the KAN architecture. Each layer applies learnable univariate spline functions along the edges (between nodes), enabling edge-based transformations rather than traditional node-based activations. While the classical Kolmogorov-Arnold formulation applied outer functions only at the final stage, modern deep KAN architectures generalize this by applying spline-based transformations at every layer, ensuring compositional and interpretable modeling across the network.
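To make the edge-based design concrete, here is a minimal sketch of one KAN layer in NumPy/SciPy. It is an illustration under simplifying assumptions, not the reference implementation: every edge (i, j) owns its own B-spline coefficient vector, all edges share one knot grid, and node j simply sums its incoming edge outputs. All names (KANLayer, num_basis, etc.) are hypothetical.

```python
import numpy as np
from scipy.interpolate import BSpline

class KANLayer:
    """One KAN layer: output node j computes sum_i phi_ij(x_i)."""

    def __init__(self, in_dim, out_dim, num_basis=8, degree=3, x_range=(-1.0, 1.0)):
        self.in_dim, self.out_dim, self.degree = in_dim, out_dim, degree
        # Open uniform knot vector shared by all edges; a degree-k spline
        # with n basis functions needs n + k + 1 knots.
        inner = np.linspace(x_range[0], x_range[1], num_basis - degree + 1)
        self.knots = np.concatenate(
            [np.full(degree, inner[0]), inner, np.full(degree, inner[-1])]
        )
        # One coefficient vector per edge: these are the learnable parameters.
        self.coef = np.random.randn(out_dim, in_dim, num_basis) * 0.1

    def __call__(self, x):
        # x: (batch, in_dim) -> y: (batch, out_dim)
        y = np.zeros((x.shape[0], self.out_dim))
        for j in range(self.out_dim):
            for i in range(self.in_dim):
                phi_ij = BSpline(self.knots, self.coef[j, i], self.degree)
                y[:, j] += phi_ij(x[:, i])  # node j sums its incoming edges
        return y

layer = KANLayer(in_dim=3, out_dim=2)
x = np.random.uniform(-1.0, 1.0, size=(4, 3))
print(layer(x).shape)  # (4, 2)
```

In a real implementation the spline coefficients would be trained by gradient descent, and efficient versions evaluate all edges' basis functions in a single batched operation rather than this explicit double loop.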

🌱 Key Variants of KAN

Since its introduction, the Kolmogorov-Arnold Network (KAN) has inspired several powerful architectural variants. Each is tailored to specific tasks or computational challenges, while preserving the core idea of edge-based, learnable spline activations.

  • FastKAN: An optimized version of KAN that replaces B-splines with Gaussian RBFs for faster computation and improved scalability (a minimal sketch of this substitution follows the list below).
  • T-KAN: A time-aware extension of KAN that incorporates temporal patterns, designed for sequence and time-series modeling.
  • PDE-KAN: Combines KAN with physics-informed learning to solve partial differential equations, enforcing domain constraints through regularization.
  • Wav-KAN: Integrates wavelet transformations into KAN, enhancing signal decomposition and localized feature extraction.
  • SigKAN: Uses signature transforms to model temporal dynamics in continuous streams, improving interpretability in time-series data.
  • GKAN: Adapts KAN to graph-structured data, supporting node-level and graph-level tasks via spline-transformed edge weights.

These variants demonstrate KAN's adaptability across domains such as physics, time-series analysis, graph learning, and signal processing.
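As promised above, here is a minimal, hypothetical sketch of the FastKAN substitution: each edge function becomes a weighted sum of Gaussian RBFs, $\phi_{ij}(x) = \sum_k c_{ijk} \exp(-((x - \mu_k)/h)^2)$, which avoids recursive spline evaluation and vectorizes cleanly. The names, shapes, and bandwidth choice are illustrative assumptions, not taken from any particular FastKAN implementation.

```python
import numpy as np

def rbf_basis(x, centers, h):
    # x: (batch, in_dim) -> (batch, in_dim, num_basis)
    return np.exp(-((x[..., None] - centers) / h) ** 2)

rng = np.random.default_rng(0)
in_dim, out_dim, num_basis = 3, 2, 8
centers = np.linspace(-1.0, 1.0, num_basis)  # shared RBF grid
h = centers[1] - centers[0]                  # bandwidth ~ grid spacing
coef = rng.normal(scale=0.1, size=(out_dim, in_dim, num_basis))

x = rng.uniform(-1.0, 1.0, size=(4, in_dim))
basis = rbf_basis(x, centers, h)             # (4, in_dim, num_basis)
# Each output node sums its edge functions: contract over edges and basis.
y = np.einsum('bik,oik->bo', basis, coef)
print(y.shape)  # (4, 2)
```

Because the RBF basis is a single closed-form expression, the whole layer reduces to one einsum over precomputed basis values, which is the main source of FastKAN's speedup over spline evaluation.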

📈 Research Impact & Community Engagement

This comprehensive survey has been published in ACM Computing Surveys, one of the most prestigious journals in computer science. The work has garnered significant attention within the machine learning community, contributing to the growing understanding and adoption of Kolmogorov-Arnold Networks.

Beyond the academic publication, our research has fostered a vibrant community ecosystem. We maintain an Awesome KAN repository: a curated collection of KAN resources, including libraries, projects, tutorials, papers, and more, designed for researchers and developers working in this field. This repository serves as a comprehensive hub for anyone looking to explore, implement, or contribute to KAN-related research.

The work has been featured in various academic platforms and has sparked discussions about the future of interpretable neural architectures. To learn more about the broader impact of this research and its implications for the field, you can explore our publication story on Kudos, which provides additional insights into the research journey, community reception, and potential applications of KAN architectures.

BibTeX Citation
@article{10.1145/3743128,
  author    = {Somvanshi, Shriyank and Javed, Syed Aaqib and Islam, Md Monzurul and Pandit, Diwas and Das, Subasish},
  title     = {A Survey on Kolmogorov-Arnold Network},
  year      = {2025},
  month     = jun,
  publisher = {Association for Computing Machinery},
  address   = {New York, NY, USA},
  journal   = {ACM Comput. Surv.},
  issn      = {0360-0300},
  url       = {https://doi.org/10.1145/3743128},
  doi       = {10.1145/3743128},
  note      = {Just Accepted},
  keywords  = {Kolmogorov-Arnold Network}
}