Author affiliation: Faculty of Electrical Engineering and Computer Science, Ningbo University, Ningbo, China
Publication: Journal of Applied Remote Sensing (J. Appl. Remote Sens.)
Year/Volume/Issue: 2025, Vol. 19, No. 1
Core indexing:
Subject classification: 0810 [Engineering – Information and Communication Engineering]; 1205 [Management – Library, Information and Archives Management]; 070207 [Science – Optics]; 07 [Science]; 08 [Engineering]; 0714 [Science – Statistics (degrees awardable in science or economics)]; 0803 [Engineering – Optical Engineering]; 0701 [Science – Mathematics]; 0702 [Science – Physics]
Funding: This work was supported in part by the National Natural Science Foundation of China (Grant Nos. 42071323 and 42371331), the Joint Funds of the Zhejiang Provincial Natural Science Foundation of China (Grant No. LZJMZ24D050002), and the Public Welfare Science and Technology Project of Ningbo (Grant No. 202002N3104).
Abstract: Multi-label scene classification (MLSC) in remote sensing (RS) plays a crucial role in recognizing the intricate contents of RS images. However, the MLSC task is challenged by the complexity of label combinations and the diversity of visual content. Multispectral (MS) data provide valuable information for scene classification, and accurate classification requires fully exploiting the diverse information they contain. To this end, we propose a multispectral transformer network (MST-Net), which leverages transformer-based architectures to capture the diverse information within MS images and the complex relationships between labels and features. Specifically, MST-Net consists of a feature fusion encoder and a semantic query decoder. Within the encoder, we develop an MS deformable attention module based on a sampling strategy that reduces focus on redundant spectral areas, allowing for better integration of complementary MS information. In the decoder, geographic information is introduced as an inductive bias, exploiting the unique spatiotemporal characteristics of RS images to learn better class-related features. Extensive experiments were conducted on two RS multi-label datasets, LSCIDMRv2 and BigEarthNet. Comparisons with several state-of-the-art multi-label classification methods demonstrate the effectiveness and superiority of MST-Net. © 2024 Society of Photo-Optical Instrumentation Engineers (SPIE).
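The abstract's "MS deformable attention module" samples features at a small set of learned offset locations rather than attending densely, which is what lets it skip redundant spectral areas. The paper itself is not reproduced here, so the following is only a minimal NumPy sketch of the generic deformable-attention sampling idea; the function name `deformable_sample`, the nearest-neighbor interpolation, and all shapes are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def deformable_sample(feature, ref_points, offsets, weights):
    """Illustrative deformable-attention sampling (single head, one band).

    feature:    (H, W, C) feature map, e.g. from one spectral band
    ref_points: (Q, 2) reference locations in pixel coordinates (y, x)
    offsets:    (Q, K, 2) learned sampling offsets around each reference
    weights:    (Q, K) attention weights over the K sampled points
    returns:    (Q, C) per-query features aggregated from sparse samples
    """
    H, W, C = feature.shape
    Q, K, _ = offsets.shape
    out = np.zeros((Q, C))
    for q in range(Q):
        for k in range(K):
            # Sample at reference + learned offset; clamp to the map and
            # use nearest-neighbor lookup (real models use bilinear).
            y, x = ref_points[q] + offsets[q, k]
            yi = int(np.clip(round(y), 0, H - 1))
            xi = int(np.clip(round(x), 0, W - 1))
            out[q] += weights[q, k] * feature[yi, xi]
    return out
```

Because each query touches only K sampled points instead of all H×W positions, the cost is linear in the number of queries, which is the property that makes this style of attention practical for fusing several spectral bands.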