Luth: Efficient French Specialization for Small Language Models and Cross-Lingual Transfer • Paper • 2510.05846
⚛️ Liquid Nanos • Collection • Library of task-specific models: https://www.liquid.ai/blog/introducing-liquid-nanos-frontier-grade-performance-on-everyday-devices • 21 items
Article • Luth: Efficient French Specialization for Small Language Models • By MaxLSB and 1 other • Aug 11
Tulu 3 Datasets • Collection • All datasets released with Tulu 3, state-of-the-art open post-training recipes. • 33 items • Updated Sep 18
💧 LFM2 • Collection • LFM2 is a new generation of hybrid models, designed for on-device deployment. • 21 items
Falcon-H1 • Collection • Falcon-H1 family of hybrid-head language models (Transformer-SSM), including 0.5B, 1.5B, 1.5B-Deep, 3B, 7B, and 34B variants (pretrained & instruction-tuned). • 38 items • Updated Sep 14
TinyStories: How Small Can Language Models Be and Still Speak Coherent English? • Paper • 2305.07759 • Published May 12, 2023