zhuangzi926/KnowledgeDistillation-pytorch

Robust Knowledge Distillation in PyTorch

PyTorch implementations of several knowledge distillation (KD) papers.
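For reference, the classic KD objective (Hinton et al., 2015) that implementations like this typically build on can be sketched as follows. The function name and default hyperparameters below are illustrative assumptions, not this repo's actual API:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Classic KD loss: weighted sum of a temperature-softened KL term
    (scaled by T^2 to keep gradient magnitudes comparable) and the usual
    hard-label cross-entropy. `T` and `alpha` are illustrative defaults."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

With `alpha=1.0` the student trains purely on the teacher's soft targets; with `alpha=0.0` it reduces to ordinary supervised training.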

Dependencies

  • python>=3.7
  • torch>=1.6
  • torchvision>=0.7

TODO

  • Add experiment code for poisoning attacks under KD
  • Add experiment code for adversarial training under KD
  • Implement the core procedure from the attention transfer paper
  • Implement the core KD procedures from the FitNets paper
  • Add tests for network performance on MNIST and CIFAR-10
  • Add FitNet
  • Add Maxout Network
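For the attention transfer item above, the core procedure from the paper (Zagoruyko & Komodakis, 2017) can be sketched as below. This is an assumption about how it might be implemented here, not this repo's code; function names are hypothetical:

```python
import torch

def attention_map(feat, eps=1e-6):
    """Spatial attention map of a conv feature map (B, C, H, W):
    mean of squared activations over channels, flattened to (B, H*W)
    and L2-normalized per sample."""
    a = feat.pow(2).mean(dim=1).flatten(1)
    return a / (a.norm(p=2, dim=1, keepdim=True) + eps)

def at_loss(student_feat, teacher_feat):
    """Attention transfer loss: squared L2 distance between the
    normalized attention maps of matching student/teacher layers.
    Assumes the two feature maps share spatial size (H, W);
    channel counts may differ since channels are collapsed."""
    diff = attention_map(student_feat) - attention_map(teacher_feat)
    return diff.pow(2).sum(dim=1).mean()
```

In training, this term is typically added to the base KD or cross-entropy loss with a small weighting coefficient.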

Reference

Related repos
