CPM is one of the first large-scale generative Chinese pre-trained language models. With 2.6 billion parameters, it demonstrated strong performance across a range of Chinese NLP tasks and laid the groundwork for the CPM series.

Model Details

Parameters 2.6B
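The 2.6B figure is consistent with a standard decoder-only Transformer at CPM-Large's reported scale (32 layers, hidden size 2560, a sub-word vocabulary of roughly 30,000). A minimal back-of-the-envelope sketch, treating the exact vocabulary size as an approximation:

```python
def transformer_params(n_layers: int, d_model: int, vocab_size: int) -> int:
    """Approximate parameter count of a decoder-only Transformer.

    Each layer contributes ~12 * d_model^2 parameters:
    4 * d_model^2 for the attention Q, K, V and output projections,
    plus 8 * d_model^2 for the two FFN matrices (d_model x 4*d_model each).
    Biases and layer norms are small and ignored here.
    """
    per_layer = 12 * d_model ** 2
    embeddings = vocab_size * d_model  # token embedding table
    return n_layers * per_layer + embeddings

# CPM-Large scale; vocab_size is an approximation for illustration.
total = transformer_params(n_layers=32, d_model=2560, vocab_size=30_000)
print(f"{total / 1e9:.2f}B parameters")  # → 2.59B parameters
```

The estimate lands within a couple percent of the quoted 2.6B, which is typical for this rule of thumb since it ignores biases, layer norms, and positional embeddings.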

Paper

arXiv: 2012.00413

Tags: nlp, training, research
