DaG LLM ver 1.0: Pioneering Instruction-Tuned Language Modeling for Korean NLP
Dongjun Jang, Sangah Lee, Sungjoo Byun, Jinwoong Kim, Jean Seo, Minseok Kim, Soyeon Kim, Chaeyoung Oh, Jaeyoon Kim, Hyemi Jo, Hyopil Shin

Abstract
This paper presents the DaG LLM (David and Goliath Large Language Model), a language model specialized for Korean and fine-tuned via instruction tuning on 41 tasks spanning 13 distinct categories.