RedWhale: An Adapted Korean LLM Through Efficient Continual Pretraining
21 August 2024
Anh-Dung Vo, Minseong Jung, Wonbeen Lee, Daewoo Choi
arXiv: 2408.11294
Papers citing "RedWhale: An Adapted Korean LLM Through Efficient Continual Pretraining" (3 of 3 papers shown)
Improving Multilingual Capabilities with Cultural and Local Knowledge in Large Language Models While Enhancing Native Performance
Ram Mohan Rao Kadiyala, Siddartha Pullakhandam, Siddhant Gupta, Drishti Sharma, Jebish Purbey, Kanwal Mehreen, Muhammad Arham, Hamza Farooq
13 Apr 2025
Adapting Multilingual LLMs to Low-Resource Languages using Continued Pre-training and Synthetic Corpus
Raviraj Joshi, Kanishk Singla, Anusha Kamath, Raunak Kalani, Rakesh Paul, Utkarsh Vaidya, Sanjay Singh Chauhan, Niranjan Wartikar, Eileen Long
SyDa, CLL
18 Oct 2024
Investigating Continual Pretraining in Large Language Models: Insights and Implications
Çağatay Yıldız, Nishaanth Kanna Ravichandran, Prishruit Punia, Matthias Bethge, Beyza Ermis
CLL, KELM, LRM
27 Feb 2024