Author affiliation: Department of Computer Science and Engineering, University of Nevada, Reno, Virginia Street, Reno, NV 89557, United States
Publication: Machine Vision and Applications (Mach Vision Appl)
Year/Volume/Issue: 2025, Vol. 36, No. 4
Pages: 1-13
Subject classification: 0808 [Engineering - Electrical Engineering]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0811 [Engineering - Control Science and Engineering]; 0812 [Engineering - Computer Science and Technology (degrees may be conferred in engineering or science)]
Subject: Image segmentation
Abstract: Breast cancer continues to be one of the most lethal cancer types, mainly affecting women. However, thanks to the use of deep learning approaches, there has been a considerable boost in the performance of breast cancer detection methods. The loss function is a core element of any deep learning architecture and has a significant influence on its performance. It is particularly important for tasks such as breast mass segmentation, where challenging properties of the input images, such as pixel class imbalance, may lead to unstable training or poor detection results because the loss function is biased toward correctly segmenting the majority class. We propose a hybrid loss function incorporating both pixel-level and region-level losses, in which breast tissue density is used as a sample-level weighting signal. We refer to the proposed loss as Density-based Adaptive Sample-Level Prioritizing (Density-ASP) loss. Our motivation stems from the observation that mass segmentation becomes more challenging as breast density increases, which makes density a viable signal for controlling the effect of the region-level losses. We also propose evaluating the method using automated density estimation approaches. To demonstrate the effectiveness of the proposed Density-ASP loss, we conducted mass segmentation experiments on two publicly available datasets, INbreast and CBIS-DDSM. Our experimental results demonstrate that Density-ASP improves segmentation performance over commonly used hybrid losses across multiple metrics. © The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2025.
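Note: the record does not give the paper's exact formulation of Density-ASP. The sketch below is only an illustration of the general idea described in the abstract (a pixel-level loss plus a region-level loss whose contribution is scaled by a per-sample density score); the function names, the choice of binary cross-entropy and soft Dice, and the additive weighting scheme are assumptions, not the authors' published method.

```python
# Illustrative sketch of a density-weighted hybrid segmentation loss (PyTorch).
# Assumptions: binary cross-entropy as the pixel-level term, soft Dice as the
# region-level term, and a per-sample density score in [0, 1] scaling the Dice term.
import torch
import torch.nn.functional as F


def density_weighted_hybrid_loss(logits: torch.Tensor,
                                 target: torch.Tensor,
                                 density: torch.Tensor,
                                 eps: float = 1e-6) -> torch.Tensor:
    """Hypothetical Density-ASP-style loss.

    logits:  (B, 1, H, W) raw network outputs
    target:  (B, 1, H, W) binary mass masks
    density: (B,) per-sample breast-density scores in [0, 1], e.g. produced by
             an automated density estimator as the abstract suggests
    """
    # Pixel-level term: standard binary cross-entropy over all pixels.
    pixel_term = F.binary_cross_entropy_with_logits(logits, target)

    # Region-level term: soft Dice computed per sample.
    probs = torch.sigmoid(logits)
    dims = (1, 2, 3)  # reduce over channel and spatial dimensions
    intersection = (probs * target).sum(dims)
    union = probs.sum(dims) + target.sum(dims)
    dice_per_sample = 1.0 - (2.0 * intersection + eps) / (union + eps)

    # Scale each sample's region-level loss by its density score, so denser
    # (reportedly harder) mammograms contribute a stronger region-level signal.
    region_term = (density * dice_per_sample).mean()

    return pixel_term + region_term


if __name__ == "__main__":
    # Toy usage: a batch of 2 random "mammograms" with hypothetical density scores.
    logits = torch.randn(2, 1, 64, 64, requires_grad=True)
    target = (torch.rand(2, 1, 64, 64) > 0.9).float()  # sparse masks (class imbalance)
    density = torch.tensor([0.3, 0.8])
    loss = density_weighted_hybrid_loss(logits, target, density)
    loss.backward()
    print(float(loss))
```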