Boundary-guided multi-scale refinement network for camouflaged object detection

Authors: Ye, Qian; Li, Qingwu; Huo, Guanying; Liu, Yan; Zhou, Yan

Affiliations: Hohai Univ, Coll Informat Sci & Engn, Changzhou 213200, Peoples R China; Hohai Univ, Jiangsu Key Lab Power Transmiss & Distribut Equipm, Changzhou 213200, Peoples R China

Published in: VISUAL COMPUTER (Visual Comput)

Year/Volume/Issue: 2025, Vol. 41, No. 8

Pages: 6271-6297

Subject Classification: 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0812 [Engineering - Computer Science and Technology (degrees conferrable in Engineering or Science)]

Funding: Jiangsu Provincial Key Research and Development Program

Keywords: Camouflaged object detection; Multi-scale feature extraction; Boundary-aware learning; Convolutional neural network; Multi-guidance

Abstract: Camouflaged object detection (COD) is significantly more challenging than traditional salient object detection (SOD) due to the high intrinsic similarity between camouflaged objects and their backgrounds, as well as complex environmental conditions. Although current deep learning methods have achieved remarkable performance across various scenarios, they still face limitations in challenging situations, such as occluded targets or scenes with multiple targets. Inspired by the human visual process of detecting camouflaged objects, we introduce BGMR-Net, a boundary-guided multi-scale refinement network designed to identify camouflaged objects accurately. Specifically, we propose the Global Information Extraction (GIE) module to expand the receptive field while preserving detailed cues. Additionally, we design the Boundary-Aware (BA) module, which integrates features across all scales and explores local information from neighboring layer features. Finally, we propose the Multi-information Fusion Dual Stream (MFDS) module, which combines various types of guidance information (i.e., side-output backbone guidance, boundary guidance, neighbor guidance, and global guidance) to generate more fine-grained results through a step-by-step refinement process. Extensive experiments on three benchmark datasets demonstrate that our method significantly outperforms 30 competing approaches. Our code is available at https://***/yeqian1961/BGMR-Net.
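Note: The abstract describes the GIE module only at a high level (expanding the receptive field while preserving detailed cues). As a loose illustration of that general idea, and not the authors' implementation, the PyTorch sketch below uses parallel dilated 3x3 convolutions for context plus a 1x1 skip branch for fine detail; the class name MultiScaleBlock, the dilation rates, and all layer choices are assumptions made for demonstration.

    import torch
    import torch.nn as nn

    class MultiScaleBlock(nn.Module):
        """Hypothetical sketch of a receptive-field-expanding block.
        Parallel dilated convolutions capture context at several scales,
        while a 1x1 skip branch preserves undilated, detailed cues.
        This is NOT the paper's GIE module, only an illustration."""
        def __init__(self, in_ch, out_ch, dilations=(1, 2, 4, 8)):
            super().__init__()
            self.branches = nn.ModuleList(
                nn.Sequential(
                    nn.Conv2d(in_ch, out_ch, 3, padding=d, dilation=d, bias=False),
                    nn.BatchNorm2d(out_ch),
                    nn.ReLU(inplace=True),
                )
                for d in dilations
            )
            # 1x1 skip branch keeps fine detail at the original receptive field
            self.skip = nn.Conv2d(in_ch, out_ch, 1, bias=False)
            # fuse the concatenated branch outputs back to out_ch channels
            self.fuse = nn.Conv2d(out_ch * (len(dilations) + 1), out_ch, 1)

        def forward(self, x):
            feats = [branch(x) for branch in self.branches] + [self.skip(x)]
            return self.fuse(torch.cat(feats, dim=1))

    if __name__ == "__main__":
        x = torch.randn(1, 64, 88, 88)  # e.g., a backbone side-output feature map
        print(MultiScaleBlock(64, 64)(x).shape)  # torch.Size([1, 64, 88, 88])

Because each 3x3 branch pads by its own dilation rate, spatial resolution is preserved while the effective receptive field grows, which is one common way to obtain multi-scale context without sacrificing detail.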
