Author affiliations: School of Computer Science and Technology, University of Electronic Science and Technology of China, Chengdu, Sichuan, China; Tsinghua Shenzhen International Graduate School, Tsinghua University, Shenzhen, Guangdong, China; West China Medical Center, Sichuan University, Chengdu, Sichuan, China; J. Crayton Pruitt Family Department of Biomedical Engineering, University of Florida, Gainesville, FL, United States
Publication: arXiv
Year/Volume/Issue: 2021
Subject: Convolutional neural networks
Abstract: Noisy labels, a mixture of correct and corrupted ones, are pervasive in practice. They can significantly degrade the performance of convolutional neural networks (CNNs), because CNNs easily overfit corrupted labels. To address this issue, inspired by the observation that deep neural networks tend to memorize probably-correct samples first and corrupt-label samples later, we propose a novel yet simple self-paced resistance framework that resists corrupted labels without using any clean validation data. The proposed framework first exploits the memorization effect of CNNs to learn a curriculum, which contains confident samples and provides meaningful supervision for the other training samples. It then uses the selected confident samples and a proposed resistance loss to update the model parameters; the resistance loss tends to smooth the parameter updates or drive the predictions toward equal probability over the classes, thereby resisting overfitting on corrupted labels. Finally, we unify these two modules into a single loss function and optimize it with an alternating learning scheme. Extensive experiments demonstrate the significantly superior performance of the proposed framework over recent state-of-the-art methods on noisy-label data. Source code of the proposed method is available at https://***/xsshi2015/Self-paced-Resistance-Learning. Copyright © 2021, The Authors. All rights reserved.
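The abstract describes a resistance loss that discourages overconfident fitting of possibly corrupted labels by pulling predictions toward equal probability over the classes. A minimal sketch of that idea, not the paper's exact formulation: combine cross-entropy on a selected confident sample with a KL term toward the uniform distribution, weighted by a hypothetical coefficient `alpha`.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def resistance_loss(logits, label, alpha=0.5):
    """Hypothetical sketch of a resistance-style loss: cross-entropy on a
    confident sample plus a term pulling the prediction toward the uniform
    distribution over classes, resisting overfitting on corrupted labels.
    `alpha` (assumed) trades off fitting against resistance."""
    probs = softmax(logits)
    ce = -math.log(probs[label])          # standard cross-entropy
    k = len(logits)
    # KL(uniform || p): zero when the prediction is already uniform,
    # large when the model is overconfident on any single class.
    resist = sum((1.0 / k) * math.log((1.0 / k) / p) for p in probs)
    return ce + alpha * resist
```

With uniform logits the resistance term vanishes and the loss reduces to plain cross-entropy; as the model grows overconfident, the term grows and damps the update, which is the smoothing behavior the abstract alludes to.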