Author Affiliations: Fujian Provincial Key Laboratory of Big Data Mining and Applications, School of Computer Science and Mathematics, Fujian University of Technology, Fuzhou 350118, China; Fujian Provincial Key Laboratory of Information Processing and Intelligent Control, College of Computer and Data Science, Minjiang University, Fuzhou 350121, China; School of Intelligent Systems Engineering, Shenzhen Campus of Sun Yat-sen University, Shenzhen 518107, Guangdong, China; The Key Laboratory of Cognitive Computing and Intelligent Information Processing of Fujian Education Institutions, Wuyi University, Wuyishan 354300, China; School of Physics and Mechanical and Electrical Engineering, Longyan University, Longyan 364012, China
Publication: SSRN
Year/Volume/Issue: 2024
Core Indexing:
Subject: Image classification
Abstract: Semi-supervised learning (SSL) is widely used in machine learning; it leverages both labeled and unlabeled data to improve model performance. SSL aims to maximize class mutual information, but because labels are scarce, noisy pseudo-labels introduce false class information, so these algorithms often require substantial training time to iteratively refine pseudo-labels before performance improves. To tackle this challenge, we propose a novel plug-and-play method named ASCL (Accelerating Semi-Supervised Learning via Contrastive Learning). The method combines contrastive learning with uncertainty-based selection to improve performance and accelerate the convergence of SSL algorithms. Contrastive learning initially emphasizes the mutual information between samples to reduce dependence on pseudo-labels, then gradually shifts to maximizing the mutual information between classes, in line with the objective of semi-supervised learning. Uncertainty-based selection provides a robust mechanism for acquiring pseudo-labels. Together, the contrastive learning module and the uncertainty-based selection module form a virtuous cycle that improves the performance of the proposed model. Extensive experiments demonstrate that ASCL outperforms state-of-the-art methods in both convergence efficiency and accuracy. In the setting where only one label is assigned per class on CIFAR-10, applying ASCL to Pseudo-Label, UDA (Unsupervised Data Augmentation for Consistency Training), and FixMatch yields substantial gains in classification accuracy: 16.32%, 6.9%, and 24.43% over the original results, respectively. Moreover, the required training time is reduced by almost 50%. © 2024, The Authors. All rights reserved.
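To make the two ingredients named in the abstract concrete, below is a minimal PyTorch-style sketch of (a) uncertainty-based pseudo-label selection via a confidence threshold and (b) an InfoNCE contrastive loss between two augmented views, which lower-bounds the mutual information between samples. All function names, thresholds, and tensor shapes here are illustrative assumptions; this is not the authors' actual ASCL implementation.

```python
# Hypothetical sketch of the two components the abstract describes.
# Threshold values and embedding dimensions are assumptions, not
# taken from the ASCL paper.
import torch
import torch.nn.functional as F

def select_pseudo_labels(logits: torch.Tensor, threshold: float = 0.95):
    """Keep only pseudo-labels whose predicted confidence exceeds a
    threshold -- a simple proxy for uncertainty-based selection."""
    probs = F.softmax(logits, dim=1)
    conf, pseudo = probs.max(dim=1)
    mask = conf >= threshold          # retain low-uncertainty samples only
    return pseudo, mask

def info_nce_loss(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5):
    """InfoNCE contrastive loss between two augmented views: matching
    rows of z1 and z2 are positives, all other rows are negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    sims = z1 @ z2.t() / tau          # pairwise cosine similarities
    targets = torch.arange(z1.size(0))
    return F.cross_entropy(sims, targets)

# Toy usage with random tensors standing in for model outputs.
logits = torch.randn(8, 10)                        # classifier logits, 10 classes
pseudo, mask = select_pseudo_labels(logits)
z1, z2 = torch.randn(8, 128), torch.randn(8, 128)  # embeddings of two views
loss = info_nce_loss(z1, z2)
print(f"kept {mask.sum().item()}/8 pseudo-labels, contrastive loss {loss.item():.3f}")
```

In a plug-and-play setup of the kind the abstract suggests, the contrastive term would be added to the base SSL algorithm's loss while the mask gates which pseudo-labeled samples contribute to the supervised term; the specific weighting schedule between the two objectives is a detail of the paper not reproduced here.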