Document Type

Article

Publication Date

2025

DOI

10.1109/ACCESS.2025.3642464

Publication Title

IEEE Access

Volume

13

Pages

210237-210245

Abstract

Feature Distillation (FD) strategies have proven effective in mitigating the Catastrophic Forgetting (CF) seen in Class Incremental Learning (CIL). However, current FD approaches enforce strict alignment of feature magnitudes and directions across incremental steps, limiting the model's ability to adapt to new knowledge. In this paper, we propose Structurally Stable Incremental Learning (S²IL), an FD method for CIL that mitigates forgetting by preserving the overall spatial patterns of features, promoting representations that are flexible enough to accommodate new knowledge (plasticity) while still preserving old knowledge (stability). We also demonstrate that S²IL achieves strong incremental accuracy and outperforms other FD methods on the standard benchmark datasets CIFAR-100, ImageNet-100, and ImageNet-1K. Notably, S²IL outperforms other methods by a significant margin in scenarios with a large number of incremental tasks. The source code is available at https://github.com/dlclub2311/Structurally-Stable-Incremental-Learning.
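To illustrate the contrast the abstract draws, the sketch below compares a conventional feature-distillation penalty, which pins down the exact magnitude and direction of each feature, against a structure-preserving alternative that only matches the pattern of pairwise similarities within a batch. This is a minimal illustration of the general idea, not the S²IL loss itself; the function names and the Gram-matrix formulation are assumptions for demonstration.

```python
import numpy as np

def strict_fd_loss(old_feats, new_feats):
    """Conventional FD: elementwise L2 match, penalizing any change
    in feature magnitude or direction across incremental steps."""
    return float(np.mean((new_feats - old_feats) ** 2))

def structural_fd_loss(old_feats, new_feats):
    """Structure-preserving FD (illustrative): match the pairwise
    cosine-similarity pattern across the batch instead of raw values,
    leaving magnitudes free to adapt to new classes."""
    def gram(f):
        # Row-normalize, then form the cosine-similarity (Gram) matrix.
        f = f / (np.linalg.norm(f, axis=1, keepdims=True) + 1e-8)
        return f @ f.T
    return float(np.mean((gram(new_feats) - gram(old_feats)) ** 2))
```

Under this formulation, uniformly rescaling the new features incurs a large strict-alignment penalty but essentially zero structural penalty, which is the kind of extra plasticity the abstract attributes to preserving spatial patterns rather than exact feature values.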

Rights

© 2025 The Authors.

This work is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) License.

Original Publication Citation

Balasubramanian, S., Krishna, P. Y., Sriram, T. S., Subramaniam, M. S., Pranav Phanindra Sai, M., & Mukkamala, R. (2025). S²IL: Structurally stable incremental learning. IEEE Access, 13, 210237-210245. https://doi.org/10.1109/ACCESS.2025.3642464

ORCID

0000-0001-6323-9789 (Mukkamala)
