Context Tokens are Anchors: Understanding the Repeat Curse in dMLLMs from an Information Flow Perspective
ICLR 2026
Qiyan Zhao, Xiaofeng Zhang*, Shuochen Chang, Qianyu Chen, Xiaosong Yuan, Xuhang Chen, Luoqi Liu, Jiajun Zhang, Xu-Yao Zhang, Da-Han Wang
```shell
git clone https://github.com/ErikZ719/CoTA
cd CoTA/train
bash init_env.sh
python generate_demo.py
```
```bibtex
@inproceedings{zhao2026context,
  title={Context Tokens are Anchors: Understanding the Repeat Curse in dMLLMs from an Information Flow Perspective},
  author={Zhao, Qiyan and Zhang, Xiaofeng and Chang, Shuochen and Chen, Qianyu and Yuan, Xiaosong and Chen, Xuhang and Liu, Luoqi and Zhang, Jiajun and Zhang, Xu-Yao and Wang, Da-Han},
  booktitle={The Fourteenth International Conference on Learning Representations (ICLR)},
  year={2026}
}
```

This repo is built on LLaDA-V (models) and dLLM-Cache. Many thanks for their efforts. Use of our code should also follow their original licenses.



