%0 Conference Proceedings
%T Rotate your Networks: Better Weight Consolidation and Less Catastrophic Forgetting
%A Xialei Liu
%A Marc Masana
%A Luis Herranz
%A Joost Van de Weijer
%A Antonio Lopez
%A Andrew Bagdanov
%B 24th International Conference on Pattern Recognition
%D 2018
%F Xialei Liu2018
%O LAMP; ADAS; 601.305; 601.109; 600.124; 600.106; 602.200; 600.120; 600.118
%O exported from refbase (http://refbase.cvc.uab.es/show.php?record=3160), last updated on Tue, 08 Feb 2022 14:03:45 +0100
%X In this paper we propose an approach to avoiding catastrophic forgetting in sequential task learning scenarios. Our technique is based on a network reparameterization that approximately diagonalizes the Fisher Information Matrix of the network parameters. This reparameterization takes the form of a factorized rotation of parameter space which, when used in conjunction with Elastic Weight Consolidation (which assumes a diagonal Fisher Information Matrix), leads to significantly better performance on lifelong learning of sequential tasks. Experimental results on the MNIST, CIFAR-100, CUB-200 and Stanford-40 datasets demonstrate that we significantly improve the results of standard elastic weight consolidation, and that we obtain competitive results when compared to the state-of-the-art in lifelong learning without forgetting.
%U http://refbase.cvc.uab.es/files/LMH2018.pdf
%U http://dx.doi.org/10.1109/ICPR.2018.8545895
%P 2262-2268