Catastrophic forgetting is the tendency of an artificial neural network to abruptly and completely forget previously learned information upon learning new information. We can mitigate this behaviour by using Elastic Weight Consolidation (EWC).
Elastic Weight Consolidation is a technique from DeepMind (Kirkpatrick et al., 2017, "Overcoming catastrophic forgetting in neural networks"). It slows down learning on the weights that matter most for previously learned tasks, so the network can be trained on new tasks sequentially without losing what it learned before. This property is essential for continual learning.
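The core idea fits in a few lines. After finishing task A, we estimate the diagonal of the Fisher information matrix F, snapshot the learned parameters θ*, and then penalize later training with the quadratic term (λ/2) · Σᵢ Fᵢ (θᵢ − θ*ᵢ)². Below is a minimal, illustrative sketch assuming TensorFlow 2.x / Keras; the names `fisher_diagonal`, `ewc_loss`, `model`, `x_old`, and `y_old` are hypothetical placeholders, not part of this repository's code.

```python
import tensorflow as tf

def fisher_diagonal(model, x_old, y_old, n_samples=200):
    """Estimate the diagonal of the Fisher information matrix,
    averaged over up to `n_samples` examples from the old task."""
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=False)
    fisher = [tf.zeros_like(v) for v in model.trainable_variables]
    n = min(n_samples, len(x_old))
    for i in range(n):
        with tf.GradientTape() as tape:
            probs = model(x_old[i:i + 1], training=False)
            # Negative log-likelihood of the observed label.
            nll = loss_fn(y_old[i:i + 1], probs)
        grads = tape.gradient(nll, model.trainable_variables)
        # Fisher diagonal ~ expected squared gradient of the log-likelihood.
        fisher = [f + tf.square(g) / n for f, g in zip(fisher, grads)]
    return fisher

def ewc_loss(model, task_loss, star_params, fisher, lam=100.0):
    """New-task loss plus the EWC penalty
    (lam / 2) * sum_i F_i * (theta_i - theta_star_i)^2."""
    penalty = tf.add_n([
        tf.reduce_sum(f * tf.square(v - v_star))
        for v, v_star, f in zip(model.trainable_variables, star_params, fisher)
    ])
    return task_loss + 0.5 * lam * penalty

# After training on task A: snapshot the weights and Fisher estimates,
# e.g. star_params = [tf.identity(v) for v in model.trainable_variables],
# then add ewc_loss(...) to the loss while training on task B.
```

During training on the new task, the penalty is simply added to that task's loss inside a custom training step; λ controls how strongly the old weights are anchored.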
To train the example model, run:

```
python3 keras_model.py
```