TY  - CONF
AU  - Yu, Lu
AU  - Twardowski, Bartlomiej
AU  - Liu, Xialei
AU  - Herranz, Luis
AU  - Wang, Kai
AU  - Cheng, Yongmei
AU  - Jui, Shangling
AU  - van de Weijer, Joost
A2  - CVPR
PY  - 2020
TI  - Semantic Drift Compensation for Class-Incremental Learning of Embeddings
BT  - 33rd IEEE Conference on Computer Vision and Pattern Recognition
N2  - Class-incremental learning of deep networks sequentially increases the number of classes to be classified. During training, the network only has access to the data of one task at a time, where each task contains several classes. In this setting, networks suffer from catastrophic forgetting, which refers to the drastic drop in performance on previous tasks. The vast majority of methods have studied this scenario for classification networks, where for each new task the classification layer of the network must be augmented with additional weights to make room for the newly added classes. Embedding networks have the advantage that new classes can be naturally included in the network without adding new weights. Therefore, we study incremental learning for embedding networks. In addition, we propose a new method to estimate the drift of features, called semantic drift, and compensate for it without the need for any exemplars. We approximate the drift of previous tasks based on the drift that is experienced by current-task data. We perform experiments on fine-grained datasets, CIFAR100, and ImageNet-Subset. We demonstrate that embedding networks suffer significantly less from catastrophic forgetting. We outperform existing methods which do not require exemplars and obtain competitive results compared to methods which store exemplars. Furthermore, we show that our proposed SDC, when combined with existing methods to prevent forgetting, consistently improves results.
L1  - http://refbase.cvc.uab.es/files/YTL2020.pdf
N1  - LAMP; 600.141; 601.309; 602.200; 600.120
ID  - Lu Yu2020
ER  - 
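Editor's note: the compensation step described in the abstract (approximating the drift of previous-task features from the drift observed on current-task data) can be made concrete with a small sketch. The Python/NumPy fragment below is a minimal illustration layered on the abstract alone, not the authors' released code; the function name compensate_prototypes, the Gaussian proximity weighting, the bandwidth sigma, and all array shapes are assumptions.

import numpy as np

def compensate_prototypes(prototypes, feats_old, feats_new, sigma=0.3):
    """Shift stored class prototypes by a locally weighted estimate of
    embedding drift, measured on current-task data only (no exemplars).

    prototypes: (C, D) class means computed in the old embedding space
    feats_old:  (N, D) current-task features from the model before training
    feats_new:  (N, D) the same inputs, embedded by the model after training
    sigma:      weighting bandwidth (hypothetical default, not from the paper)
    """
    drift = feats_new - feats_old                      # per-sample drift vectors
    compensated = np.empty_like(prototypes)
    for c, mu in enumerate(prototypes):
        # Weight each drift vector by how close its old-space feature
        # lies to the prototype being corrected (assumed Gaussian kernel).
        d2 = np.sum((feats_old - mu) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        compensated[c] = mu + (w[:, None] * drift).sum(axis=0) / (w.sum() + 1e-8)
    return compensated

Under this reading, a test image would then be classified by nearest-prototype search in the new embedding space, using the compensated prototypes instead of prototypes recomputed from stored exemplars.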