SPACE: Structured Compression and Sharing of Representational Space for Continual Learning
Humans learn incrementally from sequential experiences throughout their lives, an ability that has proven hard to emulate in artificial neural networks. Learning tasks incrementally causes neural networks to overwrite relevant information learned on older tasks, resulting in ‘Catastrophic Forgetting’. Efforts to overcome this phenomenon often