Baichuan-7B
Open-source 7-billion-parameter language model trained on 1.2 trillion tokens of Chinese and English data. Released under the Apache 2.0 license, it was the first model from Baichuan Intelligence and achieved competitive results on Chinese benchmarks.
Model Details
Architecture DENSE
Parameters 7B