Conference paper Open Access

Dataset Knowledge Transfer for Class-Incremental Learning without Memory

Habib Slim; Eden Belouadah; Adrian Popescu; Darian Onchis

Incremental learning enables artificial agents to learn from sequential data. While important progress has been made by exploiting deep neural networks, incremental learning remains very challenging. This is particularly the case when no memory of past data is allowed and catastrophic forgetting has a strong negative effect. We tackle class-incremental learning without memory by adapting prediction bias correction, a method which makes predictions of past and new classes more comparable. It was originally proposed for the setting where a memory is allowed, and it cannot be used directly without one, since samples of past classes are required. We introduce a two-step learning process which allows the transfer of bias correction parameters between reference and target datasets. Bias correction is first optimized offline on reference datasets which have an associated validation memory. The obtained correction parameters are then transferred to target datasets, for which no memory is available. Our second contribution is a finer modeling of bias correction: its parameters are learned per incremental state instead of the usual past-vs-new class modeling. The proposed dataset knowledge transfer is applicable to any incremental method which works without memory. We test its effectiveness by applying it to four existing methods. Evaluation on four target datasets and different configurations shows consistent improvement, with practically no computational or memory overhead.
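The abstract does not give the exact parameterization of the correction; as a rough sketch, assuming a BiC-style linear calibration of logits with one (alpha, beta) pair per incremental state rather than a single past/new pair, the per-state correction could look as follows. All function and variable names are hypothetical, and the transferred parameter values are purely illustrative:

```python
import numpy as np

def correct_logits(logits, state_of_class, alphas, betas):
    """Apply per-state linear calibration to raw logits.

    logits:         (n_samples, n_classes) raw scores from the incremental model
    state_of_class: (n_classes,) index of the incremental state each class was learned in
    alphas, betas:  (n_states,) calibration parameters, one pair per state
                    (optimized offline on a reference dataset, then transferred)
    """
    a = alphas[state_of_class]  # per-class scale, broadcast from its state
    b = betas[state_of_class]   # per-class shift, broadcast from its state
    return logits * a + b

# Toy usage: 3 incremental states, 2 classes learned in each.
rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 6))
state_of_class = np.array([0, 0, 1, 1, 2, 2])
# Hypothetical transferred parameters: logits of older states are boosted
# to counter bias toward new classes; the current state is left unchanged.
alphas = np.array([1.3, 1.1, 1.0])
betas = np.array([0.2, 0.1, 0.0])
preds = correct_logits(logits, state_of_class, alphas, betas).argmax(axis=1)
```

The point of the per-state variant is that each incremental state gets its own pair instead of all past classes sharing one, which matches the abstract's finer modeling claim; the actual optimization of the pairs on reference datasets is described in the paper itself.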

Files (1.2 MB)

Name            Size       Checksum
0206-supp.pdf   684.0 kB   md5:24c3bf01f8ed03e12f0879292021d2f8
0206.pdf        557.2 kB   md5:15cf4590c35adc1bdcf830d698a54334
                  All versions   This version
Views             21             21
Downloads         26             26
Data volume       16.5 MB        16.5 MB
Unique views      17             17
Unique downloads  17             17
