
Classifier-with-KL-divergence-For-Knowledge-Destillation-

Classifier with KL divergence (for knowledge distillation)

This code trains a classifier via knowledge distillation. If you have the scores (logits) of one or more larger classifiers, you can use them as a teacher signal and train a smaller student classifier that achieves similar performance.
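The core of this setup is a KL-divergence loss between the teacher's softened scores and the student's predictions. Below is a minimal NumPy sketch of that loss; the function names, the temperature parameter, and the T² scaling convention (from Hinton et al.'s distillation paper) are illustrative assumptions, not necessarily how this repository's code is organized.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0, eps=1e-12):
    """KL(teacher || student) on temperature-softened distributions.

    The T^2 factor keeps gradient magnitudes comparable across temperatures,
    a common convention in distillation (an assumption here, not this repo's code).
    """
    p = softmax(teacher_logits, temperature)  # soft targets from the teacher
    q = softmax(student_logits, temperature)  # student predictions
    kl = np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)
    return (temperature ** 2) * kl.mean()
```

In practice this term is usually mixed with a standard cross-entropy loss on the hard labels; when the student's logits match the teacher's exactly, the loss above goes to zero.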
