%0 Conference Proceedings
%T Incremental Meta-Learning via Episodic Replay Distillation for Few-Shot Image Recognition
%A Kai Wang
%A Xialei Liu
%A Andrew Bagdanov
%A Luis Herranz
%A Shangling Jui
%A Joost Van de Weijer
%B CVPR 2022 Workshop on Continual Learning (CLVision, 3rd Edition)
%D 2022
%F Kai Wang2022
%O LAMP; 600.147
%O exported from refbase (http://refbase.cvc.uab.es/show.php?record=3686), last updated on Thu, 27 Apr 2023 10:23:44 +0200
%X In this paper we consider the problem of incremental meta-learning, in which classes are presented incrementally in discrete tasks. We propose Episodic Replay Distillation (ERD), which mixes classes from the current task with exemplars from previous tasks when sampling episodes for meta-learning. To allow training to benefit from as large a variety of classes as possible, which leads to more generalizable feature representations, we propose the cross-task meta loss. Furthermore, we propose episodic replay distillation, which also exploits exemplars for improved knowledge distillation. Experiments on four datasets demonstrate that ERD surpasses the state-of-the-art. In particular, in the more challenging one-shot, long task sequence scenarios, we reduce the gap between Incremental Meta-Learning and the joint-training upper bound from 3.5% / 10.1% / 13.4% / 11.7% with the current state-of-the-art to 2.6% / 2.9% / 5.0% / 0.2% with our method on Tiered-ImageNet / Mini-ImageNet / CIFAR100 / CUB, respectively.
%K Training
%K Computer vision
%K Image recognition
%K Upper bound
%K Conferences
%K Pattern recognition
%K Task analysis
%U http://refbase.cvc.uab.es/files/WLB2022.pdf
%U http://dx.doi.org/10.1109/CVPRW56347.2022.00417
%P 3728-3738