How to use rausch/pt-t5-sci-transfer-init-spm32k with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("rausch/pt-t5-sci-transfer-init-spm32k")
model = AutoModelForSeq2SeqLM.from_pretrained("rausch/pt-t5-sci-transfer-init-spm32k")
```
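Once loaded, the model can be used for generation in the usual seq2seq way. A minimal sketch (the Portuguese prompt and generation settings are illustrative; as a pretrained rather than instruction-tuned checkpoint, raw generations may need task-specific fine-tuning to be useful):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("rausch/pt-t5-sci-transfer-init-spm32k")
model = AutoModelForSeq2SeqLM.from_pretrained("rausch/pt-t5-sci-transfer-init-spm32k")

# Illustrative prompt; output quality depends on downstream fine-tuning.
inputs = tokenizer("O modelo T5 é uma arquitetura encoder-decoder.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```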
# PT-Trans-Init

A Portuguese scientific T5 model initialized from EN-T5-Sci using WECHSEL and a language-specific SentencePiece 32k tokenizer.
## Model Details
This is one of the non-English scientific T5 transfer models from the paper. It keeps the EN-T5-Sci Transformer weights and reinitializes the embedding layer with WECHSEL, using a target-language SentencePiece tokenizer.
- Paper name: PT-Trans-Init
- Model role: main
- Source/base model: EN-T5-Sci
- Code and pipeline: GitHub repository
- Architecture: T5 encoder-decoder
- SciLaD dataset: scilons/SciLaD-all-text-v1
- Evaluation benchmark: Global-MMLU
- Target-language tokenizer: language-specific SentencePiece 32k tokenizer trained on the Portuguese SciLaD split (see the sketch after this list)
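A hypothetical training call for such a tokenizer is sketched below; the input file name and options are assumptions, and the actual configuration lives in the linked pipeline repository:

```python
import sentencepiece as spm

spm.SentencePieceTrainer.train(
    input="scilad_pt_train.txt",  # assumed: Portuguese SciLaD text, one sentence per line
    model_prefix="pt_spm32k",
    vocab_size=32000,
    model_type="unigram",         # T5 tokenizers are typically unigram SentencePiece
    character_coverage=0.9995,
)
```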
Evaluated against:

- PT-Base-CP: the continued-pretraining control.
- Upstream target-language base model: the monolingual base comparison.
WECHSEL resources: English fastText embeddings and Portuguese fastText embeddings (pt), aligned with the Portuguese bilingual dictionary.
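A schematic sketch of WECHSEL-style embedding initialization follows; this is not the authors' implementation (which is in the linked repository), and the original method's sparse weighting is approximated here with a softmax over the k nearest neighbors:

```python
import numpy as np

def wechsel_style_init(src_emb, src_static, tgt_static, k=10, temp=0.1):
    """Each target-token embedding becomes a similarity-weighted average of
    source-model embeddings; similarities come from static fastText vectors
    assumed to be already aligned via the bilingual dictionary."""
    tgt_n = tgt_static / np.linalg.norm(tgt_static, axis=1, keepdims=True)
    src_n = src_static / np.linalg.norm(src_static, axis=1, keepdims=True)
    sims = tgt_n @ src_n.T                      # (tgt_vocab, src_vocab) cosine similarities
    new_emb = np.zeros((tgt_static.shape[0], src_emb.shape[1]), dtype=src_emb.dtype)
    for i in range(sims.shape[0]):
        idx = np.argsort(-sims[i])[:k]          # k nearest source tokens
        w = np.exp(sims[i, idx] / temp)
        w /= w.sum()                            # softmax weights over neighbors
        new_emb[i] = w @ src_emb[idx]           # weighted average of their embeddings
    return new_emb
```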
## Evaluation
Zero-shot Global-MMLU accuracy as aggregated in the paper:
| Metric | Accuracy (%) |
|---|---|
| Average | 24.98 |
| STEM | 24.64 |
| Humanities | 24.44 |
| Social Sciences | 24.34 |
| Other | 26.75 |
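One common recipe for computing such zero-shot multiple-choice accuracy with a seq2seq model is sketched below: each answer option is scored by its conditional log-likelihood given the question, and the highest-scoring option is taken as the prediction. The paper's exact evaluation harness may differ.

```python
import torch

def score_option(model, tokenizer, question, option):
    """Total log-likelihood of an answer option given the question."""
    enc = tokenizer(question, return_tensors="pt")
    labels = tokenizer(option, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(**enc, labels=labels).loss  # mean negative log-likelihood per token
    return -loss.item() * labels.shape[1]        # convert to total log-likelihood

def predict(model, tokenizer, question, options):
    scores = [score_option(model, tokenizer, question, o) for o in options]
    return max(range(len(options)), key=scores.__getitem__)  # index of best option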
## Limitations
The model is evaluated primarily with zero-shot Global-MMLU. Downstream task-specific evaluation is recommended before deployment in specialized scientific workflows.
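As a rough illustration of such a downstream check, a few supervised seq2seq training steps might look like the following; the toy input/target pair, learning rate, and task format are assumptions, not from the paper:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("rausch/pt-t5-sci-transfer-init-spm32k")
model = AutoModelForSeq2SeqLM.from_pretrained("rausch/pt-t5-sci-transfer-init-spm32k")
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Toy pair; replace with a real task dataset and a held-out evaluation split.
pairs = [("resuma: O estudo analisa transferência interlingual.",
          "Estudo sobre transferência interlingual.")]

model.train()
for src, tgt in pairs:
    batch = tokenizer(src, return_tensors="pt")
    labels = tokenizer(tgt, return_tensors="pt").input_ids
    loss = model(**batch, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```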
## Citation
- Title: Transferring Scientific English Pre-Trained Language Models to Multiple Languages Using Cross-Lingual Transfer
- Authors: Nikolas Rauscher, Fabio Barth, Georg Rehm
- Venue: LREC-COLING 2026, citation details TBA after publication
Model tree for rausch/pt-t5-sci-transfer-init-spm32k:

- Base model: rausch/en-t5-sci-continued-pretraining-487k