- 2025/05/27: We release a new paper on shifting AI efficiency from model-centric to data-centric compression. The project page is available! Our paper was honored as the #2 Paper of the Day!
- 2024/12/24: We release an open-source repo, "Awesome-Token-level-Model-Compression", which collects recent awesome token compression papers! Feel free to contribute your suggestions!
TLDR: We argue that the focus of research for efficient AI is shifting from model-centric compression to data-centric compression. To this end, we have compiled a comprehensive summary of 200+ papers on token-level model compression.
Please click on each domain to explore the applications of token-level model compression across different downstream scenarios. In total, it includes 200+ awesome papers.
We use the following tags to summarize key information about each paper:
[1] DynamicViT: Efficient Vision Transformers with Dynamic Token Sparsification, NeurIPS 2021.
Rao, Yongming and Zhao, Wenliang and Liu, Benlin and Lu, Jiwen and Zhou, Jie and Hsieh, Cho-Jui.
BibTeX
@inproceedings{Rao2021:DynamicViT,
  title={{DynamicViT}: Efficient Vision Transformers with Dynamic Token Sparsification},
  author={Yongming Rao and Wenliang Zhao and Benlin Liu and Jiwen Lu and Jie Zhou and Cho{-}Jui Hsieh},
  booktitle={Advances in Neural Information Processing Systems},
  volume={34},
  pages={13937--13949},
  year={2021}
}
We summarize detailed information about each paper, such as its method abbreviation, downstream application tasks, compression method category, and BibTeX reference, through the following tags:
- Awesome Generation Acceleration: An open-source repository that curates a collection of recent awesome papers on AIGC acceleration.
If our findings help your research, please consider citing our paper in your publications:
@article{liu2025shifting,
title={Shifting AI Efficiency From Model-Centric to Data-Centric Compression},
author={Liu, Xuyang and Wen, Zichen and Wang, Shaobo and Chen, Junjie and Tao, Zhishan and Wang, Yubo and Jin, Xiangqi and Zou, Chang and Wang, Yiyu and Liao, Chenfei and Zheng, Xu and Chen, Honggang and Li, Weijia and Hu, Xuming and He, Conghui and Zhang, Linfeng},
journal={arXiv preprint arXiv:2505.19147},
year={2025}
}
Thanks to these contributors for their excellent work!