Code release: TBD.
We are grateful to the prior studies on DLLM KV caching that inspired this work. Special thanks to the authors of d2Cache for their excellent open-source code, which served as a valuable foundation for our experimental framework. We also thank the developers of LLaDA and Dream for making their models publicly available; both were instrumental in our experiments.
If you find this work useful, please cite our paper:
```bibtex
@article{cheong2026entropycache,
  title={EntropyCache: Decoded Token Entropy Guided KV Caching for Diffusion Language Models},
  author={Cheong, Minsoo and Son, Donghyun and Lim, Woosang and Yoo, Sungjoo},
  journal={arXiv preprint arXiv:2603.18489},
  year={2026}
}
```