Rotate your Networks: Better Weight Consolidation and Less Catastrophic Forgetting

Xialei Liu, Marc Masana, Luis Herranz Arribas, Joost van de Weijer, Antonio Manuel Lopez Peña, Andrew Bagdanov

Research output: Contribution to a journal › Article › Research › Peer-reviewed

139 Citations (Scopus)
1 Downloads (Pure)

Abstract

In this paper we propose an approach to avoiding catastrophic forgetting in sequential task learning scenarios. Our technique is based on a network reparameterization that approximately diagonalizes the Fisher Information Matrix of the network parameters. This reparameterization takes the form of a factorized rotation of parameter space which, when used in conjunction with Elastic Weight Consolidation (which assumes a diagonal Fisher Information Matrix), leads to significantly better performance on lifelong learning of sequential tasks. Experimental results on the MNIST, CIFAR-100, CUB-200 and Stanford-40 datasets demonstrate that we significantly improve the results of standard Elastic Weight Consolidation, and that we obtain competitive results when compared to the state-of-the-art in lifelong learning without forgetting.
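
To make the diagonal-Fisher assumption mentioned in the abstract concrete, here is a minimal sketch of the standard Elastic Weight Consolidation baseline that the paper improves on, written in PyTorch. This is not the authors' code: it only shows the squared-gradient estimate of the diagonal Fisher and the resulting quadratic penalty; the paper's contribution is a factorized rotation of parameter space that makes this diagonal approximation more accurate. All names here (diag_fisher, ewc_penalty, lam) are illustrative.

```python
# Minimal EWC sketch (baseline, not the paper's rotated variant).
import torch
import torch.nn.functional as F


def diag_fisher(model, loader, device="cpu"):
    """Estimate the diagonal of the empirical Fisher from squared gradients."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        model.zero_grad()
        loss = F.cross_entropy(model(x), y)
        loss.backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    # Average the per-batch squared gradients.
    return {n: f / len(loader) for n, f in fisher.items()}


def ewc_penalty(model, fisher, old_params, lam=100.0):
    """Quadratic penalty anchoring parameters important for previous tasks."""
    loss = 0.0
    for n, p in model.named_parameters():
        loss = loss + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return 0.5 * lam * loss
```

In use, one would store the Fisher diagonal and a detached copy of the parameters after finishing task t, then add ewc_penalty to the task loss while training task t+1. The rotated variant proposed in the paper computes these same quantities in a rotated parameter basis where the true Fisher is closer to diagonal.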
Original language: English
Pages (from-to): 2262-2268
Number of pages: 7
Journal: Proceedings - International Conference on Pattern Recognition
Publication status: Published - 2018
