Rank-based Decomposable Losses in Machine Learning: A Survey

Authors: Hu, Shu; Wang, Xin; Lyu, Siwei

Affiliations: Department of Computer Information and Graphics Technology, Purdue School of Engineering and Technology, Indiana University-Purdue University Indianapolis, Indianapolis, IN 46202, United States; Heinz College of Information Systems and Public Policy, Carnegie Mellon University, Pittsburgh, PA 15213, United States; Department of Computer Science and Engineering, University at Buffalo, SUNY, Buffalo, NY 14260, United States

Published in: arXiv

Year: 2022

Keywords: Aggregates

Abstract: Recent works have revealed an essential paradigm in designing loss functions that differentiates individual losses from aggregate losses. The individual loss measures the quality of the model on a single sample, while the aggregate loss combines the individual losses/scores over the training samples. Both share a common procedure that aggregates a set of individual values into a single numerical value. The ranking order reflects the most fundamental relation among individual values in designing losses. In addition, decomposability, in which a loss can be decomposed into an ensemble of individual terms, becomes a significant property in organizing losses/scores. This survey provides a systematic and comprehensive review of rank-based decomposable losses in machine learning. Specifically, we provide a new taxonomy of loss functions that follows the perspectives of aggregate loss and individual loss. We identify the aggregator used to form such losses, which are examples of set functions. We organize the rank-based decomposable losses into eight categories. Following these categories, we review the literature on rank-based aggregate losses and rank-based individual losses. We describe general formulas for these losses and connect them with existing research topics. We also suggest future research directions spanning unexplored, remaining, and emerging issues in rank-based decomposable losses. Copyright © 2022, The Authors. All rights reserved.
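
As a concrete illustration of the abstract's aggregate-loss idea (this sketch is ours, not taken from the survey itself): the average top-k (ATk) loss is one well-known rank-based decomposable aggregate loss, recovering the maximum loss at k = 1 and the average loss at k = n. A minimal Python sketch, assuming per-sample individual losses have already been computed and using a hypothetical helper name average_top_k_loss:

    import numpy as np

    def average_top_k_loss(individual_losses, k):
        # Rank-based aggregate loss: average of the k largest individual
        # losses. k = 1 gives the maximum loss; k = n gives the average loss.
        losses = np.asarray(individual_losses, dtype=float)
        top_k = np.sort(losses)[::-1][:k]  # rank individual losses, keep the k largest
        return top_k.mean()

    # Usage: per-sample hinge losses on a toy batch (hypothetical values)
    per_sample = [0.0, 0.3, 1.2, 0.8, 2.5]
    print(average_top_k_loss(per_sample, k=2))  # (2.5 + 1.2) / 2 = 1.85

Because the aggregator operates only on the sorted order of the individual values, it is a set function of the per-sample losses, which is the property the survey's taxonomy is built around.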
