Category Adaptation Meets Projected Distillation in Generalized Continual Category Discovery
Creators
Description
Generalized Continual Category Discovery (GCCD) tackles learning from sequentially arriving, partially labeled datasets while uncovering new categories. Traditional methods rely on feature distillation to prevent forgetting old knowledge, but this strategy restricts the model's ability to adapt and to effectively distinguish new categories. To address this, we introduce a novel technique that integrates a learnable projector with feature distillation, enhancing model adaptability without sacrificing past knowledge. The resulting distribution shift of previously learned categories is mitigated with an auxiliary category adaptation network. We demonstrate that while each component offers modest benefits individually, their combination, dubbed CAMP (Category Adaptation Meets Projected distillation), significantly improves the balance between learning new information and retaining old knowledge. CAMP exhibits superior performance across several GCCD and Class Incremental Learning scenarios. The code is available on GitHub.
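To make the projected-distillation idea concrete, the snippet below is a minimal PyTorch-style sketch, not the authors' implementation: the module names, the MLP projector architecture, the choice of an MSE distillation loss, and the weight `lam` are all illustrative assumptions. The point it shows is that the alignment to the frozen old-task features is absorbed by a small learnable projector rather than by the backbone itself, leaving the backbone free to adapt to new categories; the auxiliary category adaptation network that compensates for the resulting drift of old-category representations is not shown here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Projector(nn.Module):
    """Small learnable MLP mapping current features back toward the old feature space."""

    def __init__(self, dim: int, hidden: int = 512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def projected_distillation_loss(new_feats: torch.Tensor,
                                old_feats: torch.Tensor,
                                projector: Projector) -> torch.Tensor:
    """Distill frozen old-model features through the learnable projector.

    The projector, not the backbone, carries the alignment constraint,
    so the backbone can keep adapting its representation to new categories.
    """
    projected = projector(new_feats)
    return F.mse_loss(F.normalize(projected, dim=-1),
                      F.normalize(old_feats.detach(), dim=-1))


def training_step(x, backbone, old_backbone, projector, lam: float = 1.0):
    """Illustrative step; `backbone`, `old_backbone`, and `lam` are assumed names."""
    new_feats = backbone(x)              # trainable current encoder
    with torch.no_grad():
        old_feats = old_backbone(x)      # frozen encoder from the previous task
    kd = projected_distillation_loss(new_feats, old_feats, projector)
    # A category-discovery / classification loss on new_feats would be added here.
    return lam * kd
```

In this sketch the projector is trained jointly with the backbone on the distillation term only, which is one plausible reading of "projected distillation"; the exact loss form and training schedule in CAMP may differ.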
Files (8.9 MB)

| Name | Size |
|---|---|
| CAMP.pdf (md5:11baa0cc104c3e5f65d968648625e3b3) | 8.9 MB |
Additional details
Software
- Repository URL: https://arxiv.org/abs/2308.12112