Author Affiliation: South China Normal University, Department of Software Engineering, Foshan 528225, Guangdong, People's Republic of China
Publication: Neurocomputing
Year/Volume: 2025, Vol. 624
Subject Classification: 08 [Engineering]; 0812 [Engineering - Computer Science and Technology (degrees awardable in Engineering or Science)]
Funding: Natural Science Foundation of Guangdong Province of China [2020B1515120089]
Keywords: Knowledge distillation; Object detectors
Abstract: Knowledge distillation is a popular technique for model compression. However, because knowledge distillation originated in image classification, when applied to object detection it concentrates primarily on the classification task and often overlooks the regression task. Emphasizing classification while neglecting regression gives distillation methods a skewed view of the model's learning condition. To address this, we propose Task Integration Distillation (TID), a method that integrates both the classification and regression tasks, enhancing the model's ability to accurately capture its learning condition. Inspired by real-world teaching strategies and the concept of learning conditions, TID weighs features derived from both tasks, ensuring balanced consideration of key and weak areas. We quantify the outputs of the detector's two tasks and map them onto the feature map; through this output mapping, we identify the key areas and weak areas of the features to assess the model's current learning state. Extensive experiments demonstrate that TID consistently outperforms existing methods, with a notable increase of about 2.0% in mean Average Precision (mAP) over recent feature decoupling and distillation approaches.
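Note: The sketch below illustrates the kind of task-integrated feature distillation the abstract describes, for a dense one-stage detector. It is a minimal approximation under stated assumptions, not the paper's actual implementation: the function name task_weighted_distill_loss, the tensor shapes, the alpha blending rule, and the use of per-location box IoU as the regression signal are all illustrative choices.

import torch

def task_weighted_distill_loss(feat_s, feat_t, cls_logits, box_quality, alpha=0.5):
    """Distill teacher features into the student, weighting each spatial
    location by signals from BOTH detection tasks.

    feat_s, feat_t : (B, C, H, W) student / teacher feature maps
    cls_logits     : (B, K, H, W) per-location class logits (classification task)
    box_quality    : (B, 1, H, W) per-location regression quality in [0, 1],
                     e.g. IoU between predicted and ground-truth boxes
                     (an assumed proxy for the regression task's output)
    """
    # Classification signal: confidence of the most likely class per location.
    cls_score = cls_logits.sigmoid().amax(dim=1, keepdim=True)        # (B,1,H,W)

    # Fuse the two task outputs into one importance map over the feature map;
    # locations highlighted by either task ("key areas") and locations where
    # the model is uncertain ("weak areas") both contribute to the weighting.
    importance = alpha * cls_score + (1.0 - alpha) * box_quality      # (B,1,H,W)

    # Normalize so the weights sum to 1 per image.
    importance = importance / importance.sum(dim=(2, 3), keepdim=True).clamp_min(1e-6)

    # Importance-weighted mean-squared error between student and teacher features.
    per_loc = (feat_s - feat_t).pow(2).mean(dim=1, keepdim=True)      # (B,1,H,W)
    return (importance * per_loc).sum(dim=(2, 3)).mean()

if __name__ == "__main__":
    B, C, K, H, W = 2, 256, 80, 32, 32
    loss = task_weighted_distill_loss(
        torch.randn(B, C, H, W), torch.randn(B, C, H, W),
        torch.randn(B, K, H, W), torch.rand(B, 1, H, W))
    print(float(loss))

Blending the two task signals before weighting the feature-matching loss is what separates this sketch from classification-only distillation, which would derive the importance map from cls_score alone.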