Overcoming-catastrophic-forgetting-nature-of-neural-networks

Catastrophic forgetting is the tendency of an artificial neural network to completely and abruptly forget previously learned information upon learning new information. We can overcome this forgetting nature by using Elastic Weight Consolidation.

Elastic Weight Consolidation (EWC) is a technique from DeepMind's research. Once the catastrophic-forgetting behaviour of a neural network is mitigated, the network can learn new tasks without forgetting what it learned on past ones, a property that is essential for continual learning.
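The core idea of EWC can be sketched in a few lines: after training on task A, each weight gets an importance estimate (its Fisher information), and while training on task B a quadratic penalty anchors important weights near their task-A values. Below is a minimal NumPy sketch of that penalty; the function names, toy arrays, and the `lam` regularization strength are illustrative assumptions, not code from this repository.

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=0.5):
    """EWC penalty: (lam/2) * sum_i F_i * (theta_i - theta*_i)^2.

    `theta_star` holds the weights learned on the previous task and
    `fisher` the per-weight importance; important weights (high Fisher)
    are penalized more strongly for drifting away.
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)

def total_loss(task_b_loss, theta, theta_star, fisher, lam=0.5):
    """New-task loss plus the EWC anchor term (hypothetical helper)."""
    return task_b_loss + ewc_penalty(theta, theta_star, fisher, lam)

# Toy example: the first weight matters a lot to task A (Fisher = 10),
# so moving it costs far more than moving the second (Fisher = 0.1).
theta_star = np.array([1.0, -2.0, 0.5])   # weights after task A
fisher     = np.array([10.0, 0.1, 1.0])   # per-weight importance
theta      = np.array([1.5, -1.0, 0.5])   # weights during task B training

penalty = ewc_penalty(theta, theta_star, fisher, lam=2.0)
print(penalty)
```

In practice the penalty is simply added to the new task's loss before backpropagation, so gradient descent trades off fitting task B against preserving the weights that task A depends on.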

To train the model, run:

python3 keras_model.py

References:
