Author affiliations: Minnan Normal Univ, Sch Comp Sci, Zhangzhou 363000, Peoples R China; Minnan Normal Univ, Key Lab Data Sci & Intelligence Applicat, Zhangzhou 363000, Peoples R China
Publication: PATTERN RECOGNITION (Pattern Recogn.)
Year/Volume: 2025, Vol. 161
Core indexing:
Subject classification: 0808 [Engineering - Electrical Engineering]; 08 [Engineering]; 0812 [Engineering - Computer Science and Technology (degrees in Engineering or Science)]
Funding: National Natural Science Foundation of China; Natural Science Foundation of Fujian Province, China [2021J02049]
Keywords: Hierarchical feature selection; Strongly dependent relationship; Inter-category relevance; Semantic independence
Abstract: Hierarchical feature selection is crucial for simplifying hierarchical classification and improving time efficiency. Existing methods select features by distinguishing the weight matrix of the current node from those of its siblings and approximating the parent's weight matrix. However, this approach induces a strongly dependent relationship that overuses the weight matrix. As a result, the model can lose features that are independent of that system and characterized by semantic independence. In addition, the relevance among categories is ignored, because the overuse of the weight matrix means that only the correlation between features and categories is learned. To effectively address these issues, this work proposed a method named Hierarchical Feature Selection Driven by Inter-category Relevance and Semantic Independence (HFSIS) to weaken the strongly dependent relationship. HFSIS weakens this relationship by directly multiplying the Spearman correlation matrix with the weight matrix; thus, semantically independent features are preserved and the correlation between features and categories is still obtained. Finally, a rich-label information method was used to learn the relevance among the various categories. To demonstrate the superiority of HFSIS, it was compared with eleven state-of-the-art methods on eight hierarchical datasets. Notably, HFSIS surpasses the other algorithms when using its default parameter settings.
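As a rough illustration of the core step the abstract describes (multiplying a Spearman correlation matrix with the weight matrix so that feature relationships, rather than the weight matrix alone, drive the selection scores), the following minimal Python sketch shows one plausible form of that computation. The function name, the absolute-value weighting, the row-norm scoring convention, and the toy data are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch (assumptions noted above), not the HFSIS algorithm itself.
import numpy as np
from scipy.stats import spearmanr


def score_features(X, W):
    """X: (n_samples, n_features) data; W: (n_features, n_classes) weight matrix."""
    # Spearman rank correlation among features -> (n_features, n_features) matrix.
    R, _ = spearmanr(X)
    R = np.atleast_2d(R)
    # Multiply the correlation matrix with the weight matrix, as the abstract
    # describes, so each feature's score also reflects how it relates to the
    # other features instead of depending on W alone.
    S = np.abs(R) @ np.abs(W)
    # Rank features by the l2-norm of their weighted rows (a common convention
    # in embedded feature selection; an assumption here).
    return np.linalg.norm(S, axis=1)


# Toy usage: rank six synthetic features, best first.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))
W = rng.normal(size=(6, 3))
print(np.argsort(-score_features(X, W)))
```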