Family of 10 open-source multimodal MoE models, ranging from 424B total parameters (47B active) down to a 0.3B dense model. Uses a heterogeneous MoE structure that shares parameters across modalities while reserving dedicated parameters for each modality. Initially announced March 16, 2025; fully open-sourced under Apache 2.0 on June 30, 2025. Achieves state-of-the-art results on multiple text and multimodal benchmarks.
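The heterogeneous MoE idea (shared experts plus modality-dedicated ones, with per-modality routing) can be sketched in a few lines. This is a minimal illustration, not the actual ERNIE 4.5 implementation: expert count, top-k value, and the use of plain linear maps as stand-in experts are all assumptions for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 8            # hidden size (illustrative)
N_SHARED = 2     # experts shared across modalities
N_PER_MOD = 2    # dedicated experts per modality
TOP_K = 2        # experts activated per token

def make_expert():
    # stand-in for an expert FFN: a single linear map
    return rng.normal(size=(D, D)) / np.sqrt(D)

shared = [make_expert() for _ in range(N_SHARED)]
dedicated = {m: [make_expert() for _ in range(N_PER_MOD)]
             for m in ("text", "vision")}
router = {m: rng.normal(size=(D, N_SHARED + N_PER_MOD))
          for m in ("text", "vision")}

def moe_forward(x, modality):
    # pool visible to this token: shared experts + its modality's dedicated ones
    experts = shared + dedicated[modality]
    logits = x @ router[modality]
    top = np.argsort(logits)[-TOP_K:]            # indices of top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                     # softmax over selected experts
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

x = rng.normal(size=D)
y_text = moe_forward(x, "text")      # routed through text-visible experts
y_vision = moe_forward(x, "vision")  # same input, vision-visible experts
```

Only `TOP_K` of the visible experts run per token, which is how a model with 424B total parameters can activate just 47B.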

Model Details

Architecture: Mixture-of-Experts (MoE)

Variants

Name                             Parameters           Notes
ERNIE-4.5-VL-424B-A47B           424B (47B active)    Flagship multimodal MoE model
ERNIE-4.5-VL-28B-A3B             28B (3B active)      Lightweight multimodal MoE
ERNIE-4.5-VL-28B-A3B-Thinking    28B (3B active)      Reasoning variant with thinking mode
ERNIE-4.5-21B-A3B                21B (3B active)      Text-only MoE
ERNIE-4.5-0.3B                   0.3B                 Compact dense model
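The variant names encode the total/active split: the suffix after `A` is the active-parameter count per token (e.g. `A47B` = 47B active). A quick calculation shows the activated fraction for each MoE variant; the variant list here just restates the table above.

```python
# total vs. active parameters (in billions), taken from the variant names above
variants = {
    "ERNIE-4.5-VL-424B-A47B": (424, 47),
    "ERNIE-4.5-VL-28B-A3B": (28, 3),
    "ERNIE-4.5-21B-A3B": (21, 3),
}

for name, (total, active) in variants.items():
    frac = active / total
    print(f"{name}: {active}B of {total}B active ({frac:.0%})")
```

Each MoE variant activates only roughly 11-14% of its parameters per token, which keeps inference cost closer to a much smaller dense model.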
Tags: moe, open-weight, multimodal