Official PyTorch implementation of the CVPR 2025 (Highlight) paper "Dataset Distillation with Neural Characteristic Function: A Minmax Perspective" (NCFM).
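NCFM compares real and synthetic data through characteristic functions, E[exp(i⟨t, x⟩)], evaluated at adversarially chosen frequencies t. As a rough illustration only, not the repository's actual objective, a minimal empirical characteristic-function discrepancy in PyTorch might look like the sketch below; `real_feat`, `syn_feat`, and `freqs` are placeholder names.

```python
import torch

def cf_discrepancy(real_feat: torch.Tensor,
                   syn_feat: torch.Tensor,
                   freqs: torch.Tensor) -> torch.Tensor:
    """Squared gap between the empirical characteristic functions of two
    batches, evaluated at sampled frequencies (illustrative sketch only)."""
    def ecf(x: torch.Tensor) -> torch.Tensor:
        proj = x @ freqs.t()  # <t, x> for every (sample, frequency) pair
        # E[exp(i<t, x>)] estimated as a batch mean, one value per frequency.
        return torch.complex(torch.cos(proj), torch.sin(proj)).mean(dim=0)
    return (ecf(real_feat) - ecf(syn_feat)).abs().pow(2).mean()
```

In the paper's minmax perspective, the synthetic set minimizes such a discrepancy while an adversary shapes the frequency distribution to maximize it; a non-adversarial baseline would simply draw `freqs = torch.randn(num_freqs, feat_dim)`.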
[IJCAI 2024] Papers about graph reduction including graph coarsening, graph condensation, graph sparsification, graph summarization, etc.
[ICLR'22] [KDD'22] [IJCAI'24] Implementation of "Graph Condensation for Graph Neural Networks"
(NeurIPS 2023 spotlight) Large-scale dataset distillation/condensation: at 50 IPC (images per class), it achieves the highest reported accuracy, 60.8%, on the original ImageNet-1K validation set.
Official PyTorch implementation of "Dataset Condensation via Efficient Synthetic-Data Parameterization" (ICML'22)
[ICLR 2024] Towards Lossless Dataset Distillation via Difficulty-Aligned Trajectory Matching
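This work builds on trajectory matching (MTT), which trains a student on the synthetic set for a few steps starting from an expert checkpoint and penalizes the normalized distance to a later expert checkpoint. A minimal, hypothetical sketch of that loss, with placeholder lists of parameter tensors:

```python
import torch

def trajectory_matching_loss(student_params, expert_start, expert_target):
    # Distance of the student (trained a few steps on synthetic data from
    # expert_start) to a later expert checkpoint, normalized by how far the
    # expert itself moved over the same trajectory segment.
    num = sum((s - t).pow(2).sum() for s, t in zip(student_params, expert_target))
    den = sum((a - t).pow(2).sum() for a, t in zip(expert_start, expert_target))
    return num / den
```

The difficulty alignment then selects which segment of the expert trajectory to match, roughly pairing small synthetic budgets with early (easy) segments and larger budgets with later ones.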
Awesome graph condensation papers, accompanying the TKDE'25 survey "Graph Condensation: A Survey."
Official PyTorch Implementation for the "Distilling Datasets Into Less Than One Image" paper.
Code for Backdoor Attacks Against Dataset Distillation
(CVPR 2025) Official implementation of DELT: A Simple Diversity-driven EarlyLate Training for Dataset Distillation, which outperforms the SOTA top-1 accuracy by +1.3% and increases per-class diversity by +5%.
Code for our paper "Towards Trustworthy Dataset Distillation" (Pattern Recognition 2025)
[ICLR 2024] "Data Distillation Can Be Like Vodka: Distilling More Times For Better Quality" by Xuxi Chen*, Yu Yang*, Zhangyang Wang, Baharan Mirzasoleiman
An Efficient Dataset Condensation Plugin and Its Application to Continual Learning. NeurIPS, 2023.
Optimization-free Dataset Distillation for Object Detection. Paper at: https://arxiv.org/abs/2506.01942
Dataset Distillation on 3D Point Clouds using Gradient Matching
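Gradient matching learns synthetic samples whose induced network gradients mimic those computed on real data, and carries over to point clouds unchanged given a suitable backbone. Below is a minimal, hypothetical PyTorch sketch of one update; `model`, `syn_x`, `syn_y`, and `syn_opt` are placeholders, and the layer-wise cosine criterion is one common matching choice rather than necessarily this repository's:

```python
import torch
import torch.nn.functional as F

def gradient_matching_step(model, real_x, real_y, syn_x, syn_y, syn_opt):
    params = list(model.parameters())

    # Target gradients: classification-loss gradients on a real batch.
    g_real = torch.autograd.grad(F.cross_entropy(model(real_x), real_y), params)
    g_real = [g.detach() for g in g_real]

    # Gradients on the learnable synthetic batch; keep the graph so the
    # matching loss can backpropagate into syn_x itself.
    g_syn = torch.autograd.grad(F.cross_entropy(model(syn_x), syn_y),
                                params, create_graph=True)

    # Layer-wise cosine distance between the two gradient sets.
    loss = sum(1 - F.cosine_similarity(a.flatten(), b.flatten(), dim=0)
               for a, b in zip(g_syn, g_real))
    syn_opt.zero_grad()
    loss.backward()
    syn_opt.step()
    return loss.item()
```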
A collection of dataset distillation papers.
Continual learning code for the SRe2L paper (NeurIPS 2023 spotlight).