Author affiliation: Beijing Univ Posts & Telecommun, State Key Lab Networking & Switching Technol, Beijing 100876, Peoples R China
Publication: IEEE TRANSACTIONS ON MULTIMEDIA (IEEE Trans Multimedia)
Year/Volume: 2025, Vol. 27
Pages: 1448-1460
Subject classification: 0810 [Engineering - Information and Communication Engineering]; 0808 [Engineering - Electrical Engineering]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0812 [Engineering - Computer Science and Technology (engineering or science degrees may be conferred)]
Funding: Innovation Research Group Project of NSFC; NSFC; Beijing Nova Program; Beijing Natural Science Foundation [L223002, JQ24020]; The 111 Project [B18008]
Keywords: Measurement; Feature extraction; Image classification; Extraterrestrial measurements; Few shot learning; Prototypes; Neural networks; Metalearning; Image color analysis; Hands; Fine-grained image classification; few-shot learning; metric learning; part-level relationship learning
Abstract: Recently, an increasing number of few-shot image classification methods have been proposed, aiming at a learning paradigm that trains a high-performance classification model with limited labeled samples. However, neglecting part-level relationships causes few-shot methods to struggle to distinguish closely similar subcategories, making it difficult for them to solve the fine-grained image classification problem. To tackle this challenging task, this paper proposes a fine-grained few-shot image classification method that exploits both intra-part and inter-part relationships among different samples. To establish comprehensive relationships, we first extract multiple discriminative descriptors from the input image, representing its different parts. Then, we propose to define the metric spaces by interpolating intra-part relationships, which helps the model adaptively find clear boundaries between these confusing classes. Finally, since an unlabeled image has high similarities to all classes, we project these similarities into a high-dimensional space according to the inter-part relationship and interpolate a parameterized classifier to discover the subtle differences among these similar classes. To evaluate the proposed method, we conduct extensive experiments on various fine-grained datasets. Without any pre-training/fine-tuning process, our approach clearly outperforms previous few-shot learning methods, which demonstrates its effectiveness.
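To make the pipeline described in the abstract concrete, the following is a minimal PyTorch sketch of how part descriptors, intra-part similarities, and a parameterized inter-part classifier could fit together. It is an illustration under assumptions, not the authors' implementation: the names (extract_parts, PartRelationClassifier), the top-k descriptor selection, the index-wise part matching, and all shapes are hypothetical.

```python
# Hypothetical sketch of part-level few-shot classification; not the
# paper's actual code. Descriptor selection, part matching, and shapes
# are all assumptions made for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

P = 8   # number of part descriptors per image (assumed)
D = 64  # descriptor dimension (assumed)

def extract_parts(feat_map: torch.Tensor, p: int = P) -> torch.Tensor:
    """Select the p most activated spatial locations of a CNN feature map
    (B, D, H, W) as part descriptors (B, p, D); a simple stand-in for any
    discriminative-descriptor extractor."""
    d = feat_map.size(1)
    flat = feat_map.flatten(2)                  # (B, D, H*W)
    energy = flat.norm(dim=1)                   # (B, H*W) activation strength
    idx = energy.topk(p, dim=1).indices         # (B, p) strongest locations
    idx = idx.unsqueeze(1).expand(-1, d, -1)    # (B, D, p)
    return flat.gather(2, idx).transpose(1, 2)  # (B, p, D)

class PartRelationClassifier(nn.Module):
    """Intra-part similarities -> inter-part relation vector -> class logit."""
    def __init__(self, num_parts: int = P):
        super().__init__()
        # Parameterized classifier over the per-part similarity vector,
        # a stand-in for the paper's inter-part relationship module.
        self.head = nn.Sequential(
            nn.Linear(num_parts, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, query_parts, proto_parts):
        # query_parts: (Q, P, D) query descriptors
        # proto_parts: (C, P, D) per-class prototype descriptors
        q = F.normalize(query_parts, dim=-1)
        c = F.normalize(proto_parts, dim=-1)
        # Intra-part relationship: cosine similarity of index-matched parts
        # (a simplification; a real model would align or attend over parts).
        sims = torch.einsum('qpd,cpd->qcp', q, c)  # (Q, C, P)
        # Inter-part relationship: the MLP weighs all parts jointly.
        return self.head(sims).squeeze(-1)         # (Q, C) logits

# Toy 5-way 1-shot episode with random tensors, for shape checking only.
feats = torch.randn(6, D, 7, 7)           # 5 support images + 1 query
parts = extract_parts(feats)              # (6, P, D)
protos, query = parts[:5], parts[5:]      # 1-shot: supports act as prototypes
logits = PartRelationClassifier()(query, protos)
print(logits.shape)                       # torch.Size([1, 5])
```

The MLP head is the simplest stand-in for a learned inter-part relation: rather than averaging per-part similarities, it can weigh and combine parts jointly, which matches the abstract's intuition of projecting per-class similarities into a higher-dimensional space before classification. In a multi-shot episode, prototypes would typically be each class's mean part descriptors over its support images.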