
Classification and Regression Task Integration in Distillation for Object Detectors

Authors: Su, Hai; Jian, Zhenwen; Wei, Yanghui; Yu, Songsen

Affiliation: South China Normal Univ, Dept Software Engn, Foshan 528225, Guangdong, Peoples R China

Publication: Neurocomputing

Year/Volume: 2025, Vol. 624


Subject classification: 08 [Engineering]; 0812 [Engineering: Computer Science and Technology (degrees conferrable in Engineering or Science)]

Funding: Natural Science Foundation of Guangdong Province of China [2020B1515120089]

Keywords: Knowledge distillation; Object detectors

Abstract: Knowledge distillation is a popular technique for model compression. However, because knowledge distillation originated in image classification, when applied to object detection it concentrates primarily on the classification task and often overlooks the regression task. Emphasizing classification while neglecting regression can lead to skewed estimates of the model's learning condition in knowledge distillation methods. To address this, we propose Task Integration Distillation (TID), a method that integrates both the classification and regression tasks, enhancing the model's ability to accurately capture its learning condition. Inspired by real-world teaching strategies and the concept of learning conditions, TID emphasizes features derived from both tasks, ensuring a balanced consideration of key and weak areas. We quantify the outputs of the two detector tasks and map them onto the feature map; through this output mapping, we identify the key and weak areas of the features to assess the model's current learning state. Extensive experiments demonstrate that TID consistently outperforms existing methods, with a notable increase of about 2.0% in mean Average Precision (mAP) over recent feature decoupling and distillation approaches.
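The abstract's central mechanism, quantifying both task outputs and mapping them onto the feature map to locate key and weak areas, can be illustrated with a minimal PyTorch sketch. Everything below (the function name, tensor shapes, the sigmoid confidence map, the alpha blend, and the IoU-style regression-quality input) is an assumption made for illustration, not the paper's actual formulation:

import torch

def task_weighted_distill_loss(student_feat, teacher_feat,
                               teacher_cls_logits, teacher_reg_quality,
                               alpha=0.5):
    # Classification confidence per location: max class probability.
    cls_map = teacher_cls_logits.sigmoid().amax(dim=1, keepdim=True)   # [B,1,H,W]

    # Blend the two task maps; high values mark "key" areas, low values
    # mark "weak" areas (the fixed alpha blend is an assumption).
    weight = alpha * cls_map + (1.0 - alpha) * teacher_reg_quality     # [B,1,H,W]

    # Normalize so the weights average to 1 over each feature map,
    # keeping the loss on the same scale as an unweighted mean.
    weight = weight / weight.mean(dim=(2, 3), keepdim=True).clamp_min(1e-6)

    # Per-location MSE between student and teacher features, reweighted
    # by the combined task map.
    per_loc = (student_feat - teacher_feat).pow(2).mean(dim=1, keepdim=True)
    return (weight * per_loc).mean()

# Toy usage with random tensors (B=2, 256 feature channels, 80 classes, 7x7 map).
s = torch.randn(2, 256, 7, 7)
t = torch.randn(2, 256, 7, 7)
cls_logits = torch.randn(2, 80, 7, 7)
reg_quality = torch.rand(2, 1, 7, 7)   # e.g. a predicted-box IoU proxy
print(task_weighted_distill_loss(s, t, cls_logits, reg_quality))

The point of the sketch is only the shape of the idea: both detector heads contribute a per-location signal, and the distillation loss is focused where those signals indicate the features matter most or are learned worst.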
