How to use preetk21/mt5-small-finetuned-amazon-en-es with Transformers:
```python
# Use a pipeline as a high-level helper.
# Warning: the "summarization" pipeline type is no longer supported in transformers v5.
# Either load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("summarization", model="preetk21/mt5-small-finetuned-amazon-en-es")
```
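On transformers v4.x the pipeline can then be called directly on input text; the review string below is a made-up example, not from the card:

```python
# Works on transformers v4.x only (see the note above); the input is a made-up example.
review = "I bought this kettle last month and it already stopped working. Very disappointed."
print(pipe(review)[0]["summary_text"])
```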
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("preetk21/mt5-small-finetuned-amazon-en-es")
model = AutoModelForSeq2SeqLM.from_pretrained("preetk21/mt5-small-finetuned-amazon-en-es")
```
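Once loaded directly, the model can generate a summary with the usual tokenize/generate/decode steps; the input text and the generation settings (`max_length`, `num_beams`) below are illustrative assumptions, not values from the card:

```python
# Minimal generation sketch; input text and generation settings are illustrative.
text = "This phone case looks great, but the edges started peeling after a week of use."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```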
This model is a fine-tuned version of google/mt5-small on the web_nlg dataset. It achieves the following results on the evaluation set (final epoch of the training log below):

- Loss: 0.1274
- Rouge1: 76.7573
- Rouge2: 70.2881
- Rougel: 74.6384
- Rougelsum: 74.6743

Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed
Training hyperparameters: More information needed

Training results:
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|---|---|---|---|---|---|---|---|
| 1.9276 | 1.0 | 4429 | 0.4272 | 68.6843 | 56.7537 | 65.8818 | 65.9389 |
| 0.5548 | 2.0 | 8858 | 0.2903 | 72.0968 | 62.884 | 69.6164 | 69.6271 |
| 0.3936 | 3.0 | 13287 | 0.2308 | 73.8306 | 65.8224 | 71.4996 | 71.4971 |
| 0.3093 | 4.0 | 17716 | 0.1632 | 75.0861 | 67.7273 | 72.9128 | 72.9615 |
| 0.2592 | 5.0 | 22145 | 0.1484 | 75.7699 | 68.7078 | 73.5831 | 73.5905 |
| 0.2295 | 6.0 | 26574 | 0.1353 | 76.4394 | 69.689 | 74.3168 | 74.3496 |
| 0.2117 | 7.0 | 31003 | 0.1289 | 76.6532 | 69.9438 | 74.5065 | 74.5616 |
| 0.2026 | 8.0 | 35432 | 0.1274 | 76.7573 | 70.2881 | 74.6384 | 74.6743 |
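The Rouge1/Rouge2/Rougel/Rougelsum columns are ROUGE F-scores, apparently reported on a 0-100 scale. A minimal sketch of computing such scores with the `evaluate` library, using made-up strings (`evaluate` itself returns fractions in [0, 1]):

```python
import evaluate

# Standard ROUGE scorer; predictions/references here are made-up examples.
rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["the cat sat on the mat"],
    references=["a cat was sitting on the mat"],
)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```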
Base model: google/mt5-small
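For reference, here is a hedged sketch of the kind of Seq2SeqTrainer fine-tuning run that produces a card like this, starting from google/mt5-small. Apart from the base checkpoint and the 8-epoch count visible in the table, every value below is a placeholder assumption (the card's actual hyperparameters and preprocessing are not recorded), and a tiny in-memory dataset stands in for web_nlg:

```python
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

checkpoint = "google/mt5-small"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

# Toy stand-in data; the real run used the web_nlg dataset.
toy = Dataset.from_dict({
    "document": ["John | birthPlace | London"],
    "summary": ["John was born in London."],
})

def tokenize(batch):
    model_inputs = tokenizer(batch["document"], truncation=True, max_length=512)
    labels = tokenizer(text_target=batch["summary"], truncation=True, max_length=64)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = toy.map(tokenize, batched=True, remove_columns=toy.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-webnlg",  # hypothetical output path
    num_train_epochs=8,                       # matches the 8 epochs in the table
    learning_rate=5.6e-5,                     # placeholder value, not from the card
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```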