Document Type
Article
Publication Date
2024
DOI
10.1016/j.neunet.2023.10.039
Publication Title
Neural Networks
Volume
169
Pages
307-324
Abstract
Large deep learning models are impressive, but they struggle when real-time data is not available. Few-shot class-incremental learning (FSCIL) poses a significant challenge for deep neural networks: learning new tasks from just a few labeled samples without forgetting previously learned ones. This setup can easily lead to catastrophic forgetting and overfitting, severely affecting model performance. Studying FSCIL helps overcome the limitations of deep learning models regarding data volume and acquisition time, while improving the practicality and adaptability of machine learning models. This paper provides a comprehensive survey on FSCIL. Unlike previous surveys, we aim to synthesize few-shot learning and incremental learning, introducing FSCIL from these two perspectives while reviewing over 30 theoretical research studies and more than 20 applied research studies. From the theoretical perspective, we provide a novel categorization that divides the field into five subcategories: traditional machine learning methods, meta learning-based methods, feature and feature space-based methods, replay-based methods, and dynamic network structure-based methods. We also evaluate the performance of recent theoretical research on benchmark datasets for FSCIL. From the application perspective, FSCIL has achieved impressive results in various fields of computer vision, such as image classification, object detection, and image segmentation, as well as in natural language processing and graph learning. We summarize the important applications. Finally, we point out potential future research directions, covering applications, problem setups, and theory development. Overall, this paper offers a comprehensive analysis of the latest advances in FSCIL from methodological, performance, and application perspectives.
Rights
© 2023 The Authors.
This is an open access article under the Creative Commons Attribution 4.0 International (CC BY 4.0) license.
Data Availability
Article states: "Its survey paper and we will make it open access."
Original Publication Citation
Tian, S., Li, L., Li, W., Ran, H., Ning, X., & Tiwari, P. (2024). A survey on few-shot class-incremental learning. Neural Networks, 169, 307-324. https://doi.org/10.1016/j.neunet.2023.10.039
Repository Citation
Tian, S., Li, L., Li, W., Ran, H., Ning, X., & Tiwari, P. (2024). A survey on few-shot class-incremental learning. Neural Networks, 169, 307-324. https://doi.org/10.1016/j.neunet.2023.10.039
Included in
Artificial Intelligence and Robotics Commons, Electrical and Computer Engineering Commons