Published January 4, 2022 | Version v1
Conference paper (Open Access)

Dataset Knowledge Transfer for Class-Incremental Learning without Memory

  • 1. Université Paris-Saclay, CEA, List
  • 2. Université Paris-Saclay, CEA, List and IMT Atlantique, Lab-STICC, team RAMBO
  • 3. West University of Timisoara

Description

Incremental learning enables artificial agents to learn from sequential data. While important progress has been made by exploiting deep neural networks, incremental learning remains very challenging. This is particularly the case when no memory of past data is allowed and catastrophic forgetting has a strong negative effect. We tackle class-incremental learning without memory by adapting prediction bias correction, a method which makes the predictions of past and new classes more comparable. It was originally proposed for the setting in which a memory is allowed and cannot be used directly without one, since samples of past classes are required. We introduce a two-step learning process which allows the transfer of bias correction parameters between reference and target datasets. Bias correction is first optimized offline on reference datasets which have an associated validation memory. The obtained correction parameters are then transferred to target datasets, for which no memory is available. Our second contribution is a finer modeling of bias correction, which learns its parameters per incremental state rather than using the usual past vs. new class modeling. The proposed dataset knowledge transfer is applicable to any incremental method which works without memory. We test its effectiveness by applying it to four existing methods. Evaluation with four target datasets and different configurations shows consistent improvement, with practically no computational or memory overhead.
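As a rough illustration of the per-state correction idea described above, the sketch below applies an affine rescaling (alpha_s, beta_s) to the logits of each class according to the incremental state in which that class was learned. The function name, data layout, and parameter values are illustrative assumptions, not the paper's actual implementation; in the method itself these parameters would be optimized on reference datasets and then transferred to the memoryless target dataset.

```python
import numpy as np

def correct_logits(logits, class_to_state, params):
    """Apply per-state affine bias correction to raw logits.

    logits: (num_classes,) raw scores from the incremental model.
    class_to_state: (num_classes,) index of the incremental state
        in which each class was first learned.
    params: dict mapping state index -> (alpha, beta); hypothetical
        correction parameters, stand-ins for the values that would be
        transferred from reference datasets.
    """
    corrected = logits.copy()
    for s, (alpha, beta) in params.items():
        mask = class_to_state == s
        # Rescale the logits of all classes belonging to state s.
        corrected[mask] = alpha * logits[mask] + beta
    return corrected

# Toy example: 3 states with 2 classes each. Past states (0 and 1)
# get their logits boosted; the current state (2) is left unchanged.
logits = np.array([2.1, 0.3, 1.7, 0.9, 3.2, 2.8])
class_to_state = np.array([0, 0, 1, 1, 2, 2])
params = {0: (1.3, 0.1), 1: (1.1, 0.05), 2: (1.0, 0.0)}
print(correct_logits(logits, class_to_state, params))
```

This finer per-state modeling generalizes the usual two-group (past vs. new) correction: with one parameter pair per state, earlier states, which suffer more from forgetting, can receive a stronger correction than recent ones.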

Files (1.2 MB)

0206-supp.pdf

md5:24c3bf01f8ed03e12f0879292021d2f8 (684.0 kB)
md5:15cf4590c35adc1bdcf830d698a54334 (557.2 kB)