Author affiliation: Xidian University, Key Laboratory of Intelligent Perception and Image Understanding of the Ministry of Education of China, International Research Center of Intelligent Perception and Computation, School of Artificial Intelligence, Xi'an, China
Publication: IEEE Transactions on Multimedia (IEEE Trans Multimedia)
Year/Volume: 2024, Vol. 27
Pages: 2281-2292
Subject classification: 1205 [Management - Library, Information and Archival Management]; 0808 [Engineering - Electrical Engineering]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0812 [Engineering - Computer Science and Technology (degrees conferrable in engineering or science)]
Funding: National Natural Science Foundation of China; Higher Education Discipline Innovation Project; Program for Cheung Kong Scholars and Innovative Research Team in University; Science and Technology Innovation Project from the Chinese Ministry of Education; National Key Laboratory of Human-Machine Hybrid Augmented Intelligence, Xi'an Jiaotong University
Subject: Semantics
Abstract: Self-attention learns to capture long-range dependencies between token embeddings (e.g., image pixels). However, its memory overhead and computation cost are prohibitive because both are quadratic in the spatial resolution. A structural analysis of vanilla attention reveals two crucial roles: the correlation-based dependency structure and feature normalization. In this work, an effective Local-Global Semantics (LGS) module is proposed to alleviate these issues. The LGS module contains a group convolution and an Efficient Global Semantic Attention (EGSA). First, the group convolution aggregates local semantics. Second, EGSA formulates a general model of global semantic interaction: feature normalization is applied directly to the sequence representations (e.g., query, key, or value), and the linear semantic correlation is computed between channels. As a result, the memory overhead and computation cost of LGS are linear in the spatial resolution, and the module can be seamlessly incorporated into a backbone network. Experimental results verify its effectiveness on two popular detection datasets, MS COCO and PASCAL VOC. The code will be released at https://***/TimeIsFuture/LGS.
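The complexity claim in the abstract follows from computing the correlation between channels rather than between spatial positions. The PyTorch sketch below is a minimal, hypothetical rendering of such a block, not the authors' released implementation: the class names LocalGlobalSemantics and EfficientGlobalSemanticAttention, the grouping factor, the residual connections, and the exact normalization placement are all assumptions. It only illustrates the structure the abstract describes: a group convolution for local semantics followed by a channel-wise (C x C) attention whose cost grows linearly with the number of spatial positions N = H * W.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EfficientGlobalSemanticAttention(nn.Module):
    """Linear-complexity global attention sketch (assumed EGSA form).

    Normalization is applied to the query/key sequences themselves,
    and the correlation is taken between channels, so memory and
    compute are O(N * C^2) instead of the O(N^2 * C) of vanilla
    spatial attention.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.to_qkv = nn.Conv2d(dim, dim * 3, kernel_size=1, bias=False)
        self.proj = nn.Conv2d(dim, dim, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        # Each of q, k, v has shape (B, C, N) with N = H * W.
        q, k, v = self.to_qkv(x).flatten(2).chunk(3, dim=1)
        # Normalize each sequence over its spatial positions (an assumption
        # about where the paper's "feature normalization" is applied).
        q = F.normalize(q, dim=-1)
        k = F.normalize(k, dim=-1)
        context = k @ v.transpose(1, 2)        # (B, C, C) channel correlation
        out = context.transpose(1, 2) @ q      # (B, C, N)
        return self.proj(out.reshape(b, c, h, w))


class LocalGlobalSemantics(nn.Module):
    """Sketch of an LGS-style block: group conv + EGSA, with residuals."""

    def __init__(self, dim: int, groups: int = 8):
        super().__init__()
        # dim must be divisible by groups for a grouped convolution.
        self.local = nn.Conv2d(dim, dim, kernel_size=3, padding=1, groups=groups)
        self.global_attn = EfficientGlobalSemanticAttention(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x + self.local(x)         # aggregate local semantics
        x = x + self.global_attn(x)   # linear-cost global interaction
        return x
```

As a shape check, LocalGlobalSemantics(64)(torch.randn(2, 64, 32, 32)) returns a tensor of the same (2, 64, 32, 32) shape, and the intermediate context tensor is only 64 x 64 per sample regardless of the spatial resolution, which is the source of the linear memory footprint.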