Weakly supervised incremental semantic segmentation (WISS) aims to enable deep neural networks to incrementally learn new classes using only image-level labels, without catastrophic forgetting. Although WISS eliminates the need for costly and time-consuming pixel-by-pixel annotation, image-level labels provide no information about the locations of new classes, resulting in inferior performance. To address this issue, we take inspiration from zero-shot learning and model the inter-class semantic relation using class names as text prompts, thereby facilitating knowledge transfer between classes. However, some class names in segmentation datasets are polysemous. We therefore design a new prompt template that better captures the semantic relation by appending synonyms and definitions of the corresponding classes. Guided by this semantic relation, we propose semantic relation weighted distillation to transfer knowledge from old to new classes, significantly improving plasticity while reducing forgetting. Additionally, we introduce a novel superclass-level distillation aimed at preserving shared global knowledge within each superclass, further alleviating catastrophic forgetting. We extensively evaluate our method by integrating it into state-of-the-art WISS approaches on the Pascal VOC and COCO datasets, and observe consistent performance gains across diverse experimental scenarios. Code is available at https://***/Magic-Nova77/PGSD.
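A minimal sketch of the two ideas described above: building an inter-class semantic relation matrix from class-name prompts augmented with synonyms and definitions, and using that matrix to weight a distillation loss between the old and new models. This is not the authors' released code; the prompt wording, the CLIP-like `text_encoder` interface, and the specific weighting scheme (max similarity of each old class to any new class) are illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def build_prompts(class_names, synonyms, definitions):
    """Append synonyms and a short definition to disambiguate polysemous class names (assumed template)."""
    return [
        f"a photo of a {name}, also called {', '.join(synonyms[name])}; {definitions[name]}"
        for name in class_names
    ]


@torch.no_grad()
def semantic_relation(text_encoder, prompts):
    """Cosine-similarity matrix (C x C) between class prompt embeddings; text_encoder is an assumed CLIP-like API."""
    emb = text_encoder(prompts)          # (C, D) text embeddings
    emb = F.normalize(emb, dim=-1)
    return emb @ emb.t()


def relation_weighted_distillation(new_logits, old_logits, relation):
    """
    Distill old-model knowledge into the new model, re-weighting each old class
    by its semantic affinity to the new classes (one plausible instantiation).

    new_logits: (B, C_old, H, W) new-model scores restricted to old-class channels.
    old_logits: (B, C_old, H, W) frozen old-model scores.
    relation:   (C, C) semantic relation matrix over all classes seen so far,
                ordered with the C_old old classes first.
    """
    num_old = old_logits.shape[1]
    # Weight each old class by its maximum similarity to any new class.
    weights = relation[:num_old, num_old:].max(dim=1).values
    weights = weights / weights.sum()

    log_p_new = F.log_softmax(new_logits, dim=1)
    p_old = F.softmax(old_logits, dim=1)
    # Per-class KL divergence averaged over batch and spatial positions.
    kl_per_class = (p_old * (p_old.clamp_min(1e-8).log() - log_p_new)).mean(dim=(0, 2, 3))
    return (weights.to(kl_per_class) * kl_per_class).sum()
```

Under these assumptions, classes semantically close to the newly introduced ones contribute more to the distillation term, which matches the abstract's goal of transferring knowledge between related classes while still constraining forgetting on the old ones.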