
Multi-task multi-scale attention learning-based facial age estimation

Authors: Shi, Chaojun; Zhao, Shiwei; Zhang, Ke; Feng, Xiaohan

Affiliations: North China Elect Power Univ, Dept Elect & Commun Engn, Baoding, Hebei, Peoples R China; North China Elect Power Univ, Hebei Key Lab Power Internet Things Technol, Baoding, Hebei, Peoples R China

Published in: IET SIGNAL PROCESSING (IET Signal Proc.)

Year/Volume/Issue: 2023, Vol. 17, No. 2


Subject classification: 0808 [Engineering - Electrical Engineering]; 08 [Engineering]

Funding: National Natural Science Foundation of China [62076093, 62206095]; Fundamental Research Funds for the Central Universities [2020MS099, 2020YJ006, 2022MS078]

Keywords: computer vision; feature selection; feedforward neural nets; image processing; neural nets

Abstract: Face-based age estimation relies heavily on deep residual networks (ResNets), used as the backbone in the relevant research. However, ResNet-based methods ignore the importance of some large-scale facial information and of other facial age attributes. Inspired by the attention mechanism, a multi-task learning framework for face-based age estimation, called multi-task multi-scale attention, is proposed. First, the authors embed an alternating dilated-convolution structure into ResNet34 to construct a multi-scale attention module (MSA) that enlarges the network's receptive field, extracting local age-sensitive information while obtaining multi-scale features. The larger receptive field lets the MSA capture both large-scale and locally detailed feature information. Second, multi-task network branches are built to predict gender and race; sharing network parameters with these age-related tasks improves the accuracy of age estimation. Finally, a Kullback-Leibler divergence loss between a Dirac delta label and a Gaussian prediction is adopted to guide the training. Numerical tests on the MORPH Album II and Adience datasets demonstrate the superiority of the proposed method over other state-of-the-art ones.
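The loss described in the abstract can be illustrated with a minimal NumPy sketch (not the authors' implementation). Assuming the network outputs logits over discrete age bins and the label is a Dirac delta (one-hot) at the true age, the KL divergence between the label and the predicted distribution reduces to the negative log-probability assigned to the true age; the Gaussian-shaped logits below are a hypothetical stand-in for a trained prediction.

```python
import numpy as np

def softmax(logits):
    """Numerically stable softmax over age bins."""
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def kl_dirac_loss(logits, true_age):
    """KL divergence between a Dirac delta (one-hot) label at `true_age`
    and the predicted distribution over age bins. With a one-hot target,
    KL(label || pred) reduces to -log pred[true_age]."""
    pred = softmax(logits)
    return -np.log(pred[true_age])

# Hypothetical example: 101 age bins (0-100) with a Gaussian-shaped
# prediction peaked at age 30 (sigma = 5 bins).
ages = np.arange(101)
logits = -0.5 * ((ages - 30.0) / 5.0) ** 2
print(kl_dirac_loss(logits, true_age=30))   # low loss: prediction peak matches label
print(kl_dirac_loss(logits, true_age=40))   # higher loss: label far from the peak
```

The loss falls as the predicted distribution concentrates probability mass on the labelled age, which is exactly the behaviour the training objective rewards.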
