MVAR: Visual Autoregressive Modeling with Scale and Spatial Markovian Conditioning

Jinhua Zhang, Wei Long, Minghao Han, Weiyi You, Shuhang Gu


⭐If this work is helpful for you, please help star this repo. Thanks!🤗

✨ Key Contributions

1️⃣ VAR exhibits scale and spatial redundancy, causing high GPU memory consumption.

2️⃣ The proposed MVAR performs generation without relying on a KV cache during inference.

📑 Contents

  • 📰 News
  • 🛠️ Pipeline
  • ✅ Status
  • 🥇 Results
  • 🥰 Citation
  • Contact

📰 News

  • 2025-05-20: Our MVAR paper has been published on arXiv.

🛠️ Pipeline

Our MVAR introduces the scale and spatial Markovian assumption: next-scale prediction conditions only on the adjacent preceding scale, and the attention of each token is restricted to a localized neighborhood of size k at the corresponding positions on that adjacent scale.
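To make the conditioning concrete, below is a minimal, hypothetical sketch (not the released implementation, which is still under preparation) of how such an attention mask could be built in PyTorch. The function name markovian_attention_mask and the nearest-position mapping between scales are our assumptions for illustration; see the paper for the exact neighborhood construction.

import torch

def markovian_attention_mask(prev_hw: int, cur_hw: int, k: int) -> torch.Tensor:
    # Boolean mask of shape (cur_hw*cur_hw, prev_hw*prev_hw): entry (q, p) is
    # True iff query token q on the current scale may attend to key token p on
    # the adjacent preceding scale.
    mask = torch.zeros(cur_hw * cur_hw, prev_hw * prev_hw, dtype=torch.bool)
    for i in range(cur_hw):
        for j in range(cur_hw):
            # Map the query position on the current scale to its corresponding
            # position on the preceding scale.
            ci = i * prev_hw // cur_hw
            cj = j * prev_hw // cur_hw
            # Permit attention only inside the k x k neighborhood, clamped at
            # the borders of the preceding scale.
            for pi in range(max(ci - k // 2, 0), min(ci + k // 2 + 1, prev_hw)):
                for pj in range(max(cj - k // 2, 0), min(cj + k // 2 + 1, prev_hw)):
                    mask[i * cur_hw + j, pi * prev_hw + pj] = True
    return mask

# Example: queries on an 8x8 scale attend to at most a 3x3 neighborhood on the
# 4x4 preceding scale, i.e. at most 9 of its 16 tokens per query.
print(markovian_attention_mask(prev_hw=4, cur_hw=8, k=3).shape)  # torch.Size([64, 16])

Since every token depends only on a small fixed-size window of the adjacent scale, the keys and values of earlier scales never need to be cached, which is what removes the need for a KV cache during inference.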

✅ Status

  • 📄 Paper available on arXiv
  • 🧠 Codebase under preparation
  • 🚀 Planned improvements and model refinement

🥇 Results

Our MVAR model achieves a 3.0× reduction in GPU memory footprint compared to VAR. Detailed results can be found in the paper.

Detailed tables (see the paper):

  • Comparison of Quantitative Results: MVAR vs. VAR
  • Quantitative Results on the ImageNet 256×256 Benchmark
  • Ablation Study on Scale and Spatial Markovian Assumptions

🥰 Citation

Please cite us if our work is useful for your research.

@article{zhang2025mvar,
  title={MVAR: Visual Autoregressive Modeling with Scale and Spatial Markovian Conditioning},
  author={Zhang, Jinhua and Long, Wei and Han, Minghao and You, Weiyi and Gu, Shuhang},
  journal={arXiv preprint arXiv:2505.12742},
  year={2025}
}

Contact

If you have any questions, feel free to contact me at jinhua.zjh@gmail.com.
