A quantitative analysis of how the Variational Continual Learning method handles catastrophic forgetting

University essay from KTH/Skolan för elektroteknik och datavetenskap (EECS)

Authors: Caroline Larsen; Elin Ryman; [2020]


Abstract: Catastrophic forgetting is a problem that occurs when an artificial neural network in a continual learning setting overwrites previously learned information as new information is acquired. Several methods claiming to handle this problem are trained and evaluated on data sets with a small number of tasks, which does not represent a realistic continual learning situation where the number of tasks could be large. In this report, we examine how three versions of the Variational Continual Learning (VCL) method handle catastrophic forgetting when training an artificial neural network on a data set with 20 tasks, as well as on a data set with 5 tasks. The results show that all three versions of VCL performed well, although there were some signs of catastrophic forgetting. Notably, the two versions of VCL extended with an episodic memory achieved the highest accuracy of the three. In conclusion, we believe that all three versions of the VCL method handle the problem of catastrophic forgetting when trained on data sets with up to 20 tasks.
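
As background on the method (an illustrative sketch based on the original VCL formulation by Nguyen et al., 2018, not material taken from the essay itself): VCL treats continual learning as recursive Bayesian inference, where the approximate posterior learned on the previous task serves as the prior for the next one. With tasks t = 1, 2, ..., task data \mathcal{D}_t, and a variational family \mathcal{Q}, the update can be written as

    q_t(\theta) = \arg\min_{q \in \mathcal{Q}} \mathrm{KL}\!\left( q(\theta) \,\Big\|\, \tfrac{1}{Z_t}\, q_{t-1}(\theta)\, p(\mathcal{D}_t \mid \theta) \right), \qquad q_0(\theta) = p(\theta),

where Z_t is a normalizing constant. The episodic-memory (coreset) variants referred to above additionally keep a small buffer of examples from earlier tasks and use it to refine the approximate posterior before evaluation.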
