Rethinking Multilingual Continual Pretraining: Data Mixing for Adapting LLMs Across Languages and Resources

- Paper • 2504.04152
- Zihao-Li/L2-Mono-Stag • Text Generation • 7B
- Zihao-Li/L2-Bi-Code-Alt • Text Generation • 7B
- Zihao-Li/L2-Bi-Code-Sel • Text Generation • 7B
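
These checkpoints are listed as 7B text-generation models. A minimal loading sketch, assuming the repos follow the standard Hugging Face causal-LM layout (not verified against the actual repo contents):

```python
# Minimal sketch, assuming Zihao-Li/L2-Mono-Stag is a standard
# Transformers causal-LM checkpoint; swap in L2-Bi-Code-Alt or
# L2-Bi-Code-Sel the same way.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Zihao-Li/L2-Mono-Stag"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Generate a short continuation from a prompt.
inputs = tokenizer("Continual pretraining adapts LLMs by", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```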