
Tackling Prevalent Conditions in Unsupervised Combinatorial Optimization: Cardinality, Minimum, Covering, and More

Authors: Bu, Fanchen; Jo, Hyeonsoo; Lee, Soo Yong; Ahn, Sungsoo; Shin, Kijung

Affiliations: School of Electrical Engineering, KAIST, Daejeon, Republic of Korea; Kim Jaechul Graduate School of Artificial Intelligence, KAIST, Seoul, Republic of Korea; Graduate School of Artificial Intelligence, POSTECH, Pohang, Republic of Korea; Department of Computer Science and Engineering, Pohang, Republic of Korea

Published in: arXiv

Year: 2024


Subject: Combinatorial optimization

Abstract: Combinatorial optimization (CO) is naturally discrete, making machine learning based on differentiable optimization inapplicable. Karalias & Loukas (2020) adapted the probabilistic method to incorporate CO into differentiable optimization. Their work ignited research on unsupervised learning for CO, which is composed of two main components: probabilistic objectives and derandomization. However, each component confronts unique challenges. First, deriving objectives under various conditions (e.g., cardinality constraints and minimum) is nontrivial. Second, the derandomization process is underexplored, and the existing derandomization methods are either random sampling or naive rounding. In this work, we aim to tackle prevalent (i.e., commonly involved) conditions in unsupervised CO. First, we concretize the targets for objective construction and derandomization with theoretical justification. Then, for various conditions commonly involved in different CO problems, we derive nontrivial objectives and derandomization procedures that meet these targets. Finally, we apply the derivations to various CO problems. Via extensive experiments on synthetic and real-world graphs, we validate the correctness of our derivations and show our empirical superiority w.r.t. both optimization quality and speed. © 2024, CC BY.
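The two components the abstract names — a differentiable probabilistic objective and a derandomization step — can be illustrated on a toy problem. The sketch below is an illustrative assumption, not the paper's actual derivations: it uses minimum vertex cover on a hypothetical 4-node path graph, an expected-cost-plus-penalty objective in the spirit of Karalias & Loukas (2020), and derandomization via the method of conditional expectations (coordinate-wise rounding that never increases the expected objective). The penalty weight `beta` is likewise a made-up choice.

```python
# Toy instance (assumption): a 4-node path graph for minimum vertex cover.
edges = [(0, 1), (1, 2), (2, 3)]
n = 4
beta = 2.0  # penalty weight; large enough that feasible solutions dominate

def objective(p):
    # Probabilistic objective: with each node i included independently
    # with probability p[i], this is the expected cover size plus
    # beta times the expected number of uncovered edges.
    return sum(p) + beta * sum((1 - p[u]) * (1 - p[v]) for u, v in edges)

def derandomize(p):
    # Method of conditional expectations: fix each coordinate to 0 or 1,
    # keeping whichever choice gives the smaller (conditional) expected
    # objective, so the objective value never increases along the way.
    p = list(p)
    for i in range(n):
        cand0, cand1 = p[:], p[:]
        cand0[i], cand1[i] = 0.0, 1.0
        p = cand0 if objective(cand0) <= objective(cand1) else cand1
    return [int(x) for x in p]

p = [0.5] * n            # "soft" assignment, e.g. output of a trained network
x = derandomize(p)       # discrete solution obtained by derandomization
covered = all(x[u] or x[v] for u, v in edges)  # feasibility check
```

On this instance the rounding returns a feasible cover of size 2 (the optimum for a 4-node path), and by construction the final integral objective is no worse than the initial expected one; in practice `p` would come from a neural network trained on the probabilistic objective rather than being a constant vector.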
