AI Lab Tracker
LLaDA
Type: model
Date: 2025-02-14
Lab: Ant Group
Experimental diffusion language model from Inclusion AI (Ant Group). It explores diffusion as an alternative to autoregressive generation; LLaDA-MoE extends the approach with a Mixture-of-Experts architecture.
Paper (arXiv)
GitHub
Project Page
HuggingFace (LLaDA-MoE)
Outputs (2)
LLaDA (model)
Diffusion language model from Inclusion AI.
Paper (arXiv:2502.09992)
GitHub
LLaDA-MoE (model)
Date: 2025-09-29
Mixture-of-Experts variant of LLaDA.
Paper (arXiv)
HuggingFace
Architecture: MoE
Tags: generation, architecture, research
Related: llada-2