AI Lab Tracker
Pangu Ultra MoE
Type: model, paper
Date: 2025-05-07
Lab: Huawei
A 718B-parameter sparse Mixture-of-Experts (MoE) model with 256 experts per layer and 39B active parameters per token. The flagship model announced at HDC 2025, later open-sourced as part of the openPangu initiative.
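Only about 5% of the model's parameters (39B of 718B) fire for any given token. The toy sketch below illustrates the kind of top-k expert routing that produces this sparsity, assuming standard softmax gating. The expert count (256) comes from the entry above; TOP_K, the layer sizes, and the random weights are illustrative placeholders, not Pangu Ultra MoE's actual configuration.

import numpy as np

rng = np.random.default_rng(0)

N_EXPERTS = 256   # experts per layer, per the entry above
TOP_K = 8         # assumed top-k; the real value is not given here
D_MODEL = 64      # toy hidden size
D_FF = 128        # toy expert FFN size

# Router and per-expert FFN weights (toy initialization).
router_w = rng.standard_normal((D_MODEL, N_EXPERTS)) * 0.02
expert_w1 = rng.standard_normal((N_EXPERTS, D_MODEL, D_FF)) * 0.02
expert_w2 = rng.standard_normal((N_EXPERTS, D_FF, D_MODEL)) * 0.02

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route each token to its top-k experts and mix their outputs."""
    logits = x @ router_w                           # (tokens, N_EXPERTS)
    topk = np.argsort(logits, axis=-1)[:, -TOP_K:]  # chosen expert indices
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        chosen = logits[t, topk[t]]
        weights = np.exp(chosen - chosen.max())
        weights /= weights.sum()                    # softmax over the k experts
        for w, e in zip(weights, topk[t]):
            h = np.maximum(x[t] @ expert_w1[e], 0)  # expert FFN (ReLU)
            out[t] += w * (h @ expert_w2[e])        # weighted expert mix
    return out

tokens = rng.standard_normal((4, D_MODEL))
y = moe_forward(tokens)
print(y.shape)  # (4, 64)
print(f"{TOP_K}/{N_EXPERTS} experts active per token "
      f"(~{TOP_K / N_EXPERTS:.1%} of expert parameters)")

At scale, this routing pattern is what lets total capacity grow into the hundreds of billions of parameters while per-token compute stays close to that of a roughly 39B-parameter dense model.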
Links: Paper (arXiv), HuggingFace
Outputs (2)
1. Pangu Ultra MoE 718B (model)
   Link: HuggingFace
   Architecture: MoE
   Parameters: 718B
   Active parameters: 39B
2. Pangu Ultra MoE: A Scalable Mixture-of-Experts Framework (paper)
   Link: Paper (arXiv), arXiv:2505.04519
Tags: moe, frontier, open-weight
Related: pangu-ultra, pangu-pro-moe, openpangu