Published October 4, 2024 | Version v1
Conference paper · Open access

Category Adaptation Meets Projected Distillation in Generalized Continual Category Discovery

  • 1. IDEAS NCBR
  • 2. Warsaw University of Technology
  • 3. Gdańsk University of Technology
  • 4. Computer Vision Center

Description

Generalized Continual Category Discovery (GCCD) tackles learning from sequentially arriving, partially labeled datasets while uncovering new categories. Traditional methods rely on feature distillation to prevent forgetting old knowledge. However, this strategy restricts the model's ability to adapt and to effectively distinguish new categories. To address this, we introduce a novel technique that integrates a learnable projector with feature distillation, enhancing model adaptability without sacrificing past knowledge. The resulting distribution shift of the previously learned categories is mitigated with an auxiliary category adaptation network. We demonstrate that while each component offers modest benefits individually, their combination – dubbed CAMP (Category Adaptation Meets Projected distillation) – significantly improves the balance between learning new information and retaining old knowledge. CAMP exhibits superior performance across several GCCD and Class Incremental Learning scenarios. The code is available on GitHub.
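The core idea of projected distillation can be illustrated with a minimal NumPy sketch. It assumes an L2 distillation loss and a linear projector, which are illustrative simplifications; the paper's actual projector architecture and objective may differ. The point is that a learnable projector can absorb representation drift, so the distillation loss no longer forces the new backbone's features to stay identical to the old ones:

```python
import numpy as np

def projected_distillation_loss(feats_new, feats_old, W):
    # L2 distance between projected new features and frozen old features.
    # The learnable projector W absorbs the representation shift, so the
    # backbone can adapt to new categories while old knowledge remains
    # recoverable through W. (Hypothetical minimal sketch, not the paper's
    # exact formulation.)
    projected = feats_new @ W
    return np.mean(np.sum((projected - feats_old) ** 2, axis=1))

# Toy setup: the adapted features are a linear transform of the old ones.
rng = np.random.default_rng(0)
feats_new = rng.normal(size=(8, 4))
A = rng.normal(size=(4, 4))
feats_old = feats_new @ A

# A well-fitted projector drives the distillation loss to (near) zero,
# even though feats_new differs from feats_old (the backbone adapted).
loss_projected = projected_distillation_loss(feats_new, feats_old, A)

# Plain feature distillation (identity projector) still penalizes the
# shift, which is what restricts adaptation in traditional methods.
loss_plain = projected_distillation_loss(feats_new, feats_old, np.eye(4))

print(loss_projected < 1e-9, loss_plain > loss_projected)
```

In practice the projector would be trained jointly with the backbone, and an additional mechanism (the category adaptation network in CAMP's case) would handle the drift of previously learned category statistics.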

Files (8.9 MB)

CAMP.pdf (8.9 MB)
md5:11baa0cc104c3e5f65d968648625e3b3

Additional details

Funding

European Commission
ELIAS - European Lighthouse of AI for Sustainability 101120237

Software