Document Type

Article

Publication Date

2024
Publication Title

Information Processing and Management



Pages

103664 (1-17)


Abstract

Few-Shot Class-Incremental Learning (FSCIL) aims to learn new classes incrementally with a limited number of samples per class. It faces two issues: forgetting previously learned classes and overfitting on few-shot classes. An efficient strategy is to learn features that are discriminative in both base and incremental sessions. Current methods improve discriminability by manually designing inter-class margins based on empirical observations, which can be suboptimal. The emerging Neural Collapse (NC) theory provides a theoretically optimal inter-class margin for classification, serving as a basis for computing the margin adaptively. However, NC is formulated for closed, balanced data, not for sequential or few-shot imbalanced data. To address this gap, we propose a Meta-learning- and NC-based FSCIL method, MetaNC-FSCIL, which computes the optimal margin adaptively and maintains it at each incremental session. Specifically, we first compute the theoretically optimal margin based on the NC theory. We then introduce a novel loss function that is minimized precisely when the inter-class margin reaches its theoretical optimum. Motivated by the intuition that "learn how to preserve the margin" matches meta-learning's goal of "learn how to learn", we embed the loss function in base-session meta-training to preserve the margin for future meta-testing sessions. Experimental results demonstrate the effectiveness of MetaNC-FSCIL, achieving superior performance on multiple datasets. The code is available at
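The "theoretically optimal inter-class margin" referenced in the abstract comes from the Neural Collapse result that, at convergence, the K class prototypes align with a simplex equiangular tight frame (ETF), whose pairwise cosine similarity is exactly -1/(K-1). The sketch below (an illustration of the NC geometry only, not the paper's MetaNC-FSCIL training code; the function name `simplex_etf` is our own) constructs such a frame with NumPy and verifies that property.

```python
import numpy as np

def simplex_etf(K: int) -> np.ndarray:
    """Return a K x K matrix whose columns form a simplex ETF:
    K unit vectors with pairwise cosine similarity -1/(K-1)."""
    I = np.eye(K)
    ones = np.ones((K, K)) / K
    # Standard simplex-ETF construction: sqrt(K/(K-1)) * (I - (1/K) * 11^T)
    return np.sqrt(K / (K - 1)) * (I - ones)

K = 5
M = simplex_etf(K)
# Columns are already unit-norm; compute all pairwise cosine similarities.
cos = M.T @ M
off = cos[~np.eye(K, dtype=bool)]  # off-diagonal entries only
print(np.allclose(off, -1 / (K - 1)))  # True: maximal equiangular separation
```

Because -1/(K-1) is the most negative cosine achievable by K equiangular unit vectors, it serves as the adaptive margin target the abstract describes, rather than a hand-tuned constant.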


© 2024 The Authors.

This is an open access article under the Creative Commons Attribution 4.0 International (CC BY 4.0) License.

Data Availability

Article states: "I have shared the data through Github link."

Original Publication Citation

Ran, H., Li, W., Li, L., Tian, S., Ning, X., & Tiwari, P. (2024). Learning optimal inter-class margin adaptively for few-shot class-incremental learning via neural collapse-based meta-learning. Information Processing and Management, 61(3), 1-17, Article 103664.


ORCID

0000-0002-4323-2632 (Li)