AI Lab Tracker
Pangu Pro MoE
Type: paper
Date: 2025-05-27
Lab: Huawei
A 72B-parameter MoE model built on a novel Mixture of Grouped Experts (MoGE) architecture that activates 16B parameters per token. Open-sourced as part of the openPangu initiative.
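The core MoGE idea is to partition the routed experts into equal groups and have each token select the same number of experts from every group, so the activated-expert count per group (and hence per device hosting a group) is balanced by construction. Below is a minimal PyTorch sketch of such group-wise top-k routing; the function name and all sizes are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn.functional as F

def moge_route(router_logits: torch.Tensor, num_groups: int, k_per_group: int):
    """Group-wise top-k routing sketch (MoGE-style, illustrative).

    router_logits: [tokens, num_experts] gating scores.
    Experts are split into `num_groups` equal groups; each token picks
    its top `k_per_group` experts inside every group, so every group
    contributes the same number of activated experts per token.
    """
    tokens, num_experts = router_logits.shape
    group_size = num_experts // num_groups
    grouped = router_logits.view(tokens, num_groups, group_size)

    # Top-k within each group: [tokens, num_groups, k_per_group]
    scores, local_idx = grouped.topk(k_per_group, dim=-1)

    # Map group-local indices back to global expert indices.
    offsets = (torch.arange(num_groups, device=router_logits.device)
               .view(1, num_groups, 1) * group_size)
    expert_idx = (local_idx + offsets).flatten(1)   # [tokens, groups * k]

    # Normalize gate weights over all selected experts per token.
    gates = F.softmax(scores.flatten(1), dim=-1)    # [tokens, groups * k]
    return expert_idx, gates

# Illustrative usage: 4 tokens routed over 64 experts in 8 groups, top-1 per group.
logits = torch.randn(4, 64)
idx, w = moge_route(logits, num_groups=8, k_per_group=1)
```

Because each token activates exactly `num_groups * k_per_group` experts spread evenly across groups, no single group can be overloaded, which is the load-balancing property MoGE targets; consult the paper (arXiv:2505.21411) for the actual routing formulation and hyperparameters.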
Links: Paper (arXiv:2505.21411), HuggingFace
Tags: moe, open-weight
Related: pangu-ultra-moe, openpangu