# Update README.md

This model is a fine-tuned version of [NbAiLab/nb-bert-large](https://huggingface.co/NbAiLab/nb-bert-large).
We have released four models for Scandinavian NLI, of different sizes:

- [alexandrainst/scandi-nli-large-v2](https://huggingface.co/alexandrainst/scandi-nli-large-v2)
- alexandrainst/scandi-nli-large (this)
- [alexandrainst/scandi-nli-base](https://huggingface.co/alexandrainst/scandi-nli-base)
- [alexandrainst/scandi-nli-small](https://huggingface.co/alexandrainst/scandi-nli-small)

A demo of the large-v2 model can be found in [this Hugging Face Space](https://huggingface.co/spaces/alexandrainst/zero-shot-classification) - check it out!

The performance and model size of each of them can be found in the Performance section below.
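These are NLI models, which is what lets them power zero-shot classification: each candidate label is turned into a hypothesis (e.g. "Dette eksempel handler om {label}"), and the labels are ranked by their entailment scores. A minimal sketch of that ranking step, with made-up entailment logits standing in for the model's output (the premise, labels, and logit values below are hypothetical):

```python
import math

def rank_labels(entailment_logits):
    """Rank candidate labels by softmax over their entailment logits,
    as a zero-shot classifier does when exactly one label applies."""
    exps = {label: math.exp(z) for label, z in entailment_logits.items()}
    total = sum(exps.values())
    scores = {label: e / total for label, e in exps.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical entailment logits for the Danish premise "Jeg elsker fodbold"
# ("I love football") against one hypothesis per candidate label.
ranked = rank_labels({"sport": 2.1, "politik": -0.4, "religion": -1.3})
print(ranked[0][0])  # highest-scoring label: "sport"
```

With the real models, this is handled end-to-end by `pipeline("zero-shot-classification", model="alexandrainst/scandi-nli-large")` from the `transformers` library.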
The Scandinavian scores are the average of the Danish, Swedish and Norwegian scores.

| **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
| :-------- | :------ | :----------- | :----------- | :----------------------- |
| [`alexandrainst/scandi-nli-large-v2`](https://huggingface.co/alexandrainst/scandi-nli-large-v2) | **75.42%** | **75.41%** | **84.95%** | 354M |
| `alexandrainst/scandi-nli-large` (this) | 73.70% | 74.44% | 83.91% | 354M |
| [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 69.01% | 71.99% | 80.66% | 279M |
| [`alexandrainst/scandi-nli-base`](https://huggingface.co/alexandrainst/scandi-nli-base) | 67.42% | 71.54% | 80.09% | 178M |
| [`joeddav/xlm-roberta-large-xnli`](https://huggingface.co/joeddav/xlm-roberta-large-xnli) | 64.17% | 70.80% | 77.29% | 560M |

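For reference, the three scores reported in these tables (accuracy, macro-average F1, and the multiclass Matthews correlation coefficient) can be sketched in plain Python; the label sequences below are a toy example, not real model output:

```python
from collections import Counter
from math import sqrt

LABELS = ["entailment", "neutral", "contradiction"]

def nli_metrics(y_true, y_pred):
    """Compute accuracy, macro-F1 and multiclass MCC (Gorodkin's R_K)."""
    n = len(y_true)
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    accuracy = correct / n

    # Macro-F1: unweighted mean of the per-class F1 scores.
    f1s = []
    for label in LABELS:
        tp = sum(t == p == label for t, p in zip(y_true, y_pred))
        fp = sum(p == label and t != label for t, p in zip(y_true, y_pred))
        fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
        f1s.append(2 * tp / (2 * tp + fp + fn) if tp else 0.0)
    macro_f1 = sum(f1s) / len(LABELS)

    # Multiclass MCC from class counts and the number of correct predictions.
    t_k = Counter(y_true)   # true occurrences per class
    p_k = Counter(y_pred)   # predicted occurrences per class
    cov = correct * n - sum(p_k[l] * t_k[l] for l in LABELS)
    denom = sqrt((n * n - sum(v * v for v in p_k.values()))
                 * (n * n - sum(v * v for v in t_k.values())))
    mcc = cov / denom if denom else 0.0
    return accuracy, macro_f1, mcc

# Toy example: 6 gold NLI labels vs. 6 predictions.
gold = ["entailment", "neutral", "contradiction", "entailment", "neutral", "contradiction"]
pred = ["entailment", "neutral", "contradiction", "neutral", "neutral", "contradiction"]
acc, macro_f1, mcc = nli_metrics(gold, pred)
```

The same values can be obtained with `accuracy_score`, `f1_score(average="macro")` and `matthews_corrcoef` from `scikit-learn`, which is what evaluation scripts typically use.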
The test split is generated using [this gist](https://gist.github.com/saattrupda).

| **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
| :-------- | :------ | :----------- | :----------- | :----------------------- |
| [`alexandrainst/scandi-nli-large-v2`](https://huggingface.co/alexandrainst/scandi-nli-large-v2) | **75.65%** | **59.23%** | **87.89%** | 354M |
| `alexandrainst/scandi-nli-large` (this) | 73.80% | 58.41% | 86.98% | 354M |
| [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 68.37% | 57.10% | 83.25% | 279M |
| [`alexandrainst/scandi-nli-base`](https://huggingface.co/alexandrainst/scandi-nli-base) | 62.44% | 55.00% | 80.42% | 178M |
| [`NbAiLab/nb-bert-base-mnli`](https://huggingface.co/NbAiLab/nb-bert-base-mnli) | 56.92% | 53.25% | 76.39% | 178M |

We acknowledge that not evaluating on a gold standard dataset is not ideal, but …

| **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
| :-------- | :------ | :----------- | :----------- | :----------------------- |
| [`alexandrainst/scandi-nli-large-v2`](https://huggingface.co/alexandrainst/scandi-nli-large-v2) | **79.02%** | **85.99%** | **85.99%** | 354M |
| `alexandrainst/scandi-nli-large` (this) | 76.69% | 84.47% | 84.38% | 354M |
| [`joeddav/xlm-roberta-large-xnli`](https://huggingface.co/joeddav/xlm-roberta-large-xnli) | 75.35% | 83.42% | 83.55% | 560M |
| [`MoritzLaurer/mDeBERTa-v3-base-mnli-xnli`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-mnli-xnli) | 73.84% | 82.46% | 82.58% | 279M |
| [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 73.32% | 82.15% | 82.08% | 279M |

We acknowledge that not evaluating on a gold standard dataset is not ideal, but …

| **Model** | **MCC** | **Macro-F1** | **Accuracy** | **Number of Parameters** |
| :-------- | :------ | :----------- | :----------- | :----------------------- |
| [`alexandrainst/scandi-nli-large-v2`](https://huggingface.co/alexandrainst/scandi-nli-large-v2) | **71.59%** | **81.00%** | **80.96%** | 354M |
| `alexandrainst/scandi-nli-large` (this) | 70.61% | 80.43% | 80.36% | 354M |
| [`joeddav/xlm-roberta-large-xnli`](https://huggingface.co/joeddav/xlm-roberta-large-xnli) | 67.99% | 78.68% | 78.60% | 560M |
| [`alexandrainst/scandi-nli-base`](https://huggingface.co/alexandrainst/scandi-nli-base) | 67.53% | 78.24% | 78.33% | 178M |
| [`MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7`](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-xnli-multilingual-nli-2mil7) | 65.33% | 76.73% | 76.65% | 279M |

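As a quick sanity check on the tables, the Scandinavian scores are indeed the unweighted average of the three per-language scores; for example, for the large-v2 model's MCC:

```python
# MCC of alexandrainst/scandi-nli-large-v2 in the three per-language tables above.
per_language_mcc = [75.65, 79.02, 71.59]

# The unweighted mean matches the 75.42% MCC reported in the Scandinavian table.
scandi_mcc = sum(per_language_mcc) / len(per_language_mcc)
print(round(scandi_mcc, 2))  # → 75.42
```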