Pinned
- torchdistill: A coding-free framework built on PyTorch for reproducible deep learning studies. Part of the PyTorch Ecosystem. 🏆 26 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemen…
- sc2-benchmark: [TMLR] "SC2 Benchmark: Supervised Compression for Split Computing"
- ladon-multi-task-sc2: [WACV 2025] "A Multi-task Supervised Compression Model for Split Computing"
- supervised-compression: [WACV 2022] "Supervised Compression for Resource-Constrained Edge Computing Systems"
- hnd-ghnd-object-detectors: [ICPR 2020] "Neural Compression and Filtering for Edge-assisted Real-time Object Detection in Challenged Networks" and [ACM MobiCom EMDL 2020] "Split Computing for Complex Object Detectors: Challen…
- head-network-distillation: [IEEE Access] "Head Network Distillation: Splitting Distilled Deep Neural Networks for Resource-constrained Edge Computing Systems" and [ACM MobiCom HotEdgeVideo 2019] "Distilled Split Deep Neural …