For the T2T task of the Workshop on Asian Translation (2025), this collection contains models fine-tuned from the NLLB-200-XB base model on WAT data plus 100k Samanantar pairs.
Debasish Dhal
DebasishDhal99
WAT-2025-FinetunedModels