arXiv

Unlocking the Potential of Reverse Distillation for Anomaly Detection

Authors: Liu, Xinyue; Wang, Jianyuan; Leng, Biao; Zhang, Shuo

Affiliations: School of Computer Science and Engineering, Beihang University, China; School of Intelligence Science and Technology, University of Science and Technology Beijing, China; Beijing Key Lab of Traffic Data Analysis and Mining, School of Computer & Technology, Beijing Jiaotong University, China

Published in: arXiv

Year: 2024

Subject: Students

Abstract: Knowledge Distillation (KD) is a promising approach for unsupervised Anomaly Detection (AD). However, the student network's over-generalization often diminishes the crucial representation differences between teacher and student in anomalous regions, leading to detection failures. To address this problem, the widely accepted Reverse Distillation (RD) paradigm designs asymmetric teacher and student networks, using an encoder as the teacher and a decoder as the student. Yet, the design of RD does not ensure that the teacher encoder effectively distinguishes between normal and abnormal features or that the student decoder generates anomaly-free features. Additionally, the absence of skip connections results in a loss of fine details during feature reconstruction. To address these issues, we propose RD with Expert, which introduces a novel Expert-Teacher-Student network for simultaneous distillation of both the teacher encoder and student decoder. The added expert network enhances the student's ability to generate normal features and optimizes the teacher's differentiation between normal and abnormal features, reducing missed detections. Additionally, Guided Information Injection is designed to filter and transfer features from teacher to student, improving detail reconstruction and minimizing false positives. Experiments on several benchmarks prove that our method outperforms existing unsupervised AD methods under the RD paradigm, fully unlocking RD's potential. Copyright © 2024, The Authors. All rights reserved.
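The abstract's central idea in both standard KD and RD is that anomalies are scored by the representation discrepancy between teacher and student at each spatial location: where the student fails to reproduce the teacher's features, the region is flagged as anomalous. A minimal numpy sketch of this per-location scoring (using cosine distance, a common choice; the feature maps here are synthetic placeholders, not the paper's actual networks):

```python
import numpy as np

def anomaly_map(teacher_feat: np.ndarray, student_feat: np.ndarray) -> np.ndarray:
    """Per-location anomaly score for feature maps of shape (C, H, W):
    1 - cosine similarity between teacher and student feature vectors."""
    t = teacher_feat / (np.linalg.norm(teacher_feat, axis=0, keepdims=True) + 1e-8)
    s = student_feat / (np.linalg.norm(student_feat, axis=0, keepdims=True) + 1e-8)
    return 1.0 - np.sum(t * s, axis=0)  # shape (H, W); 0 = match, 2 = opposite

# Identical features score ~0 everywhere; a perturbed location scores higher.
rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 4, 4))        # hypothetical (C=8, H=4, W=4) features
normal_map = anomaly_map(feat, feat)
perturbed = feat.copy()
perturbed[:, 2, 2] *= -1.0                   # flip one location's feature vector
anom_map = anomaly_map(feat, perturbed)
```

In practice such maps are computed at several encoder/decoder scales, upsampled to input resolution, and summed; over-generalization is precisely the failure mode where the student reconstructs anomalous features too well, driving this score toward zero in defective regions.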
