Commit c412cd4 · verified · 1 parent: 0b40293

Update README.md

Files changed (1): README.md (+18 −35)
README.md CHANGED
@@ -187,41 +187,24 @@ print(similarities.shape)
 
 ## Evaluation
 
-### Metrics
-
-#### Semantic Similarity
-* Dataset: `sts-dev`
-* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
-| Metric              | Value     |
-|:--------------------|:----------|
-| pearson_cosine      | 0.8391    |
-| **spearman_cosine** | **0.841** |
-| pearson_manhattan   | 0.8277    |
-| spearman_manhattan  | 0.8361    |
-| pearson_euclidean   | 0.8274    |
-| spearman_euclidean  | 0.8358    |
-| pearson_dot         | 0.8154    |
-| spearman_dot        | 0.818     |
-| pearson_max         | 0.8391    |
-| spearman_max        | 0.841     |
-
-#### Semantic Similarity
-* Dataset: `sts-test`
-* Evaluated with [<code>EmbeddingSimilarityEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)
-
-| Metric              | Value      |
-|:--------------------|:-----------|
-| pearson_cosine      | 0.813      |
-| **spearman_cosine** | **0.8173** |
-| pearson_manhattan   | 0.8114     |
-| spearman_manhattan  | 0.8164     |
-| pearson_euclidean   | 0.8103     |
-| spearman_euclidean  | 0.8158     |
-| pearson_dot         | 0.7908     |
-| spearman_dot        | 0.7887     |
-| pearson_max         | 0.813      |
-| spearman_max        | 0.8173     |
 
 
 ## <span style="color:blue">Acknowledgments</span>
 
 
 ## Evaluation
 
+| Model                                 | Dim  | # Params. | STS17 | STS22-v2 | Average |
+|---------------------------------------|------|-----------|-------|----------|---------|
+| Arabic-Triplet-Matryoshka-V2          | 768  | 135M      | 85    | 64       | 75      |
+| Arabert-all-nli-triplet-Matryoshka    | 768  | 135M      | 83    | 64       | 74      |
+| AraGemma-Embedding-300m               | 768  | 303M      | 84    | 62       | 73      |
+| **GATE-AraBert-V1**                   | 767  | 135M      | 83    | 63       | 73      |
+| Marbert-all-nli-triplet-Matryoshka    | 768  | 163M      | 82    | 61       | 72      |
+| Arabic-labse-Matryoshka               | 768  | 471M      | 82    | 61       | 72      |
+| AraEuroBert-Small                     | 768  | 210M      | 80    | 61       | 71      |
+| E5-all-nli-triplet-Matryoshka         | 384  | 278M      | 80    | 60       | 70      |
+| text-embedding-3-large                | 3072 | -         | 81    | 59       | 70      |
+| Arabic-all-nli-triplet-Matryoshka     | 768  | 135M      | 82    | 54       | 68      |
+| AraEuroBert-Mid                       | 1151 | 610M      | 83    | 53       | 68      |
+| paraphrase-multilingual-mpnet-base-v2 | 768  | 135M      | 79    | 55       | 67      |
+| AraEuroBert-Large                     | 2304 | 2.1B      | 79    | 55       | 67      |
+| text-embedding-ada-002                | 1536 | -         | 71    | 62       | 66      |
+| text-embedding-3-small                | 1536 | -         | 72    | 57       | 65      |
+
 
 
 ## <span style="color:blue">Acknowledgments</span>
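
The removed tables report metrics from sentence-transformers' `EmbeddingSimilarityEvaluator`; the headline `spearman_cosine` is the Spearman rank correlation between the cosine similarities of embedding pairs and human similarity labels. A minimal NumPy sketch of that computation on toy data (all vectors and labels below are illustrative, not outputs of any model here):

```python
import numpy as np

# Toy embeddings for four sentence pairs (illustrative values only).
emb_a = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.6, 0.8]])
emb_b = np.array([[1.0, 0.0], [1.0, 0.0], [1.0, 0.0], [0.8, 0.6]])
gold = np.array([5.0, 0.5, 4.2, 3.0])  # human-annotated similarity labels

# Cosine similarity of each embedding pair.
cos = (emb_a * emb_b).sum(axis=1) / (
    np.linalg.norm(emb_a, axis=1) * np.linalg.norm(emb_b, axis=1)
)

def ranks(x):
    # 0-based ranks; this toy data has no ties.
    r = np.empty(len(x))
    r[np.argsort(x)] = np.arange(len(x))
    return r

# Spearman correlation = Pearson correlation of the ranks.
spearman_cosine = np.corrcoef(ranks(cos), ranks(gold))[0, 1]
print(round(spearman_cosine, 4))  # → 0.8
```

The evaluator runs the same idea over the full `sts-dev`/`sts-test` pair sets with real model embeddings, and likewise for the Manhattan, Euclidean, and dot-product variants in the removed tables.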