
OpenBA: An Open-sourced 15B Bilingual Asymmetric seq2seq Model Pre-trained from Scratch
Papers citing "OpenBA: An Open-sourced 15B Bilingual Asymmetric seq2seq Model Pre-trained from Scratch" (50 papers shown)
| Title | Authors |
|---|---|
| CPM-2: Large-scale Cost-effective Pre-trained Language Models | Zhengyan Zhang, Yuxian Gu, Xu Han, Shengqi Chen, Chaojun Xiao, ..., Minlie Huang, Wentao Han, Yang Liu, Xiaoyan Zhu, Maosong Sun |