CODA: Repurposing Continuous VAEs for Discrete Tokenization
Paper: https://arxiv.org/abs/2503.17760
This repository contains the CODA tokenizer, as introduced in CODA: Repurposing Continuous VAEs for Discrete Tokenization.
Project Page: https://lzy-tony.github.io/coda
Code: https://github.com/LeapLabTHU/CODA
CODA addresses the challenges of training conventional VQ tokenizers by decoupling compression and discretization. Instead of training from scratch, CODA adapts off-the-shelf continuous VAEs into discrete tokenizers, leading to stable and efficient training with strong visual fidelity.
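For intuition, here is a minimal sketch of the decoupling idea, assuming a PyTorch-style setup with a frozen, pretrained continuous VAE: compression is left entirely to the VAE, while a lightweight, learnable codebook discretizes its latents. All names and hyperparameters below are illustrative assumptions, not CODA's actual implementation; see the repository for the real code.

```python
import torch
import torch.nn as nn

class LatentQuantizer(nn.Module):
    """Nearest-neighbor quantizer over a frozen VAE's continuous latents.

    Hypothetical sketch: only this module would be trained, while the
    pretrained VAE encoder/decoder (the compression stage) stays fixed.
    """
    def __init__(self, codebook_size: int = 8192, latent_dim: int = 4):
        super().__init__()
        self.codebook = nn.Embedding(codebook_size, latent_dim)

    def forward(self, z: torch.Tensor):
        # z: (B, C, H, W) continuous latents from the pretrained VAE encoder
        b, c, h, w = z.shape
        flat = z.permute(0, 2, 3, 1).reshape(-1, c)       # (B*H*W, C)
        dist = torch.cdist(flat, self.codebook.weight)    # distances to all codes
        idx = dist.argmin(dim=1)                          # discrete token ids
        z_q = self.codebook(idx).view(b, h, w, c).permute(0, 3, 1, 2)
        # straight-through estimator so gradients flow around the hard argmin
        z_q = z + (z_q - z).detach()
        return z_q, idx.view(b, h, w)
```

In such a setup, the quantized latents `z_q` would be fed back to the frozen VAE decoder for reconstruction, so only the discretization component is optimized, which is the decoupling of compression and discretization that the paper describes.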