Dataset Viewer
Column schema (37 columns):

| Column | dtype | min | max |
|---|---|---|---|
| path | string | 19 chars | 43 chars |
| generation | float64 | 0.01 | 0.49 |
| open_qa | float64 | 0.16 | 0.71 |
| brainstorm | float64 | 0 | 0.51 |
| rewrite | float64 | 0.02 | 0.51 |
| summarize | float64 | 0 | 0.49 |
| classify | float64 | 0.03 | 0.51 |
| closed_qa | float64 | 0.01 | 0.51 |
| extract | float64 | 0.02 | 0.5 |
| reasoning_over_numerical_data | float64 | 0.03 | 0.52 |
| multi-document_synthesis | float64 | 0 | 0.6 |
| fact_checking_or_attributed_qa | float64 | 0.03 | 0.5 |
| average | float64 | 0.02 | 0.49 |
| generation_rank | int64 | 1 | 33 |
| open_qa_rank | int64 | 1 | 28 |
| brainstorm_rank | int64 | 1 | 32 |
| rewrite_rank | int64 | 1 | 32 |
| summarize_rank | int64 | 1 | 34 |
| classify_rank | int64 | 1 | 30 |
| closed_qa_rank | int64 | 1 | 34 |
| extract_rank | int64 | 1 | 32 |
| reasoning_over_numerical_data_rank | int64 | 1 | 33 |
| multi-document_synthesis_rank | int64 | 1 | 32 |
| fact_checking_or_attributed_qa_rank | int64 | 1 | 27 |
| average_rank | int64 | 1 | 34 |
| generation_confi | string | 11 chars | 11 chars |
| open_qa_confi | string | 11 chars | 11 chars |
| brainstorm_confi | string | 11 chars | 11 chars |
| rewrite_confi | string | 11 chars | 11 chars |
| summarize_confi | string | 11 chars | 11 chars |
| classify_confi | string | 11 chars | 11 chars |
| closed_qa_confi | string | 11 chars | 11 chars |
| extract_confi | string | 11 chars | 11 chars |
| reasoning_over_numerical_data_confi | string | 11 chars | 11 chars |
| multi-document_synthesis_confi | string | 11 chars | 11 chars |
| fact_checking_or_attributed_qa_confi | string | 11 chars | 11 chars |
| average_confi | string | 13 chars | 13 chars |
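For working with these columns programmatically, here is a minimal sketch using the `datasets` library. The repository id is a placeholder (the page does not name it), and `split="train"` is an assumption about the split name:

```python
# Minimal sketch: load this dataset and read one row.
# NOTE: "<namespace>/<dataset-name>" is a placeholder repository id,
# and split="train" is an assumption about the split name.
from datasets import load_dataset

ds = load_dataset("<namespace>/<dataset-name>", split="train")

row = ds[0]
print(row["path"])  # model identifier, e.g. "meta-llama/Llama-2-13b-chat-hf"
print(row["average"], row["average_rank"], row["average_confi"])
```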
Each row is one model, identified by the `path` column. The data is shown in three tables below, all with the same column order as the schema above.

Scores:

| path | generation | open_qa | brainstorm | rewrite | summarize | classify | closed_qa | extract | reasoning_over_numerical_data | multi-document_synthesis | fact_checking_or_attributed_qa | average |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| meta-llama/Llama-2-13b-chat-hf | 0.179 | 0.343 | 0.245 | 0.191 | 0.307 | 0.249 | 0.24 | 0.22 | 0.125 | 0.104 | 0.198 | 0.1956 |
| meta-llama/Llama-2-70b-chat-hf | 0.239 | 0.377 | 0.306 | 0.22 | 0.275 | 0.296 | 0.339 | 0.238 | 0.197 | 0.13 | 0.222 | 0.239 |
| meta-llama/Llama-2-7b-chat-hf | 0.166 | 0.299 | 0.221 | 0.144 | 0.223 | 0.194 | 0.218 | 0.131 | 0.075 | 0.097 | 0.115 | 0.1538 |
| meta-llama/Llama-3.1-70B-Instruct | 0.445 | 0.706 | 0.486 | 0.48 | 0.493 | 0.507 | 0.512 | 0.502 | 0.521 | 0.453 | 0.496 | 0.4898 |
| allenai/Llama-3.1-Tulu-3-70B-DPO | 0.06 | 0.539 | 0.02 | 0.084 | 0.015 | 0.264 | 0.297 | 0.22 | 0.2 | 0.029 | 0.141 | 0.1191 |
| allenai/Llama-3.1-Tulu-3-70B | 0.448 | 0.456 | 0.504 | 0.434 | 0.364 | 0.45 | 0.354 | 0.399 | 0.439 | 0.53 | 0.319 | 0.4368 |
| allenai/Llama-3.1-Tulu-3-8B | 0.363 | 0.402 | 0.391 | 0.341 | 0.302 | 0.331 | 0.243 | 0.262 | 0.279 | 0.45 | 0.247 | 0.3354 |
| meta-llama/Meta-Llama-3.1-8B-Instruct | 0.398 | 0.564 | 0.482 | 0.386 | 0.379 | 0.4 | 0.389 | 0.334 | 0.311 | 0.36 | 0.334 | 0.3857 |
| mistralai/Mistral-7B-Instruct-v0.3 | 0.19 | 0.426 | 0.292 | 0.209 | 0.225 | 0.256 | 0.252 | 0.171 | 0.177 | 0.23 | 0.228 | 0.2266 |
| mistralai/Mistral-Large-Instruct-2407 | 0.492 | 0.471 | 0.513 | 0.51 | 0.485 | 0.47 | 0.307 | 0.426 | 0.452 | 0.6 | 0.43 | 0.4839 |
| mistralai/Mistral-Small-Instruct-2409 | 0.414 | 0.461 | 0.467 | 0.458 | 0.399 | 0.418 | 0.287 | 0.354 | 0.393 | 0.541 | 0.392 | 0.4287 |
| allenai/OLMo-2-1124-13B-Instruct | 0.376 | 0.431 | 0.386 | 0.399 | 0.391 | 0.438 | 0.319 | 0.248 | 0.237 | 0.422 | 0.295 | 0.356 |
| allenai/OLMo-2-1124-7B-Instruct | 0.318 | 0.348 | 0.369 | 0.319 | 0.252 | 0.254 | 0.183 | 0.183 | 0.173 | 0.393 | 0.208 | 0.2849 |
| allenai/OLMo-7B-0724-Instruct-hf | 0.064 | 0.157 | 0.127 | 0.053 | 0.104 | 0.087 | 0.059 | 0.05 | 0.052 | 0.079 | 0.06 | 0.075 |
| allenai/OLMo-7B-SFT | 0.052 | 0.348 | 0.02 | 0.05 | 0.054 | 0.187 | 0.156 | 0.072 | 0.047 | 0.024 | 0.081 | 0.0661 |
| microsoft/Phi-3-medium-4k-instruct | 0.286 | 0.564 | 0.337 | 0.294 | 0.356 | 0.415 | 0.334 | 0.248 | 0.385 | 0.181 | 0.255 | 0.3091 |
| Qwen/Qwen1.5-110B-Chat | 0.414 | 0.593 | 0.462 | 0.376 | 0.371 | 0.453 | 0.317 | 0.317 | 0.365 | 0.481 | 0.387 | 0.4076 |
| Qwen/Qwen2-72B-Instruct | 0.327 | 0.549 | 0.431 | 0.324 | 0.26 | 0.4 | 0.262 | 0.342 | 0.373 | 0.263 | 0.271 | 0.3371 |
| Qwen/Qwen2.5-72B-Instruct | 0.454 | 0.49 | 0.503 | 0.454 | 0.431 | 0.468 | 0.285 | 0.433 | 0.452 | 0.603 | 0.4 | 0.4621 |
| WizardLMTeam/WizardLM-13B-V1.2 | 0.163 | 0.353 | 0.231 | 0.186 | 0.213 | 0.266 | 0.243 | 0.146 | 0.11 | 0.132 | 0.141 | 0.179 |
| 01-ai/Yi-1.5-34B-Chat | 0.353 | 0.446 | 0.489 | 0.297 | 0.282 | 0.391 | 0.275 | 0.267 | 0.307 | 0.417 | 0.299 | 0.351 |
| databricks/dolly-v2-12b | 0.02 | 0.225 | 0.002 | 0.024 | 0.005 | 0.06 | 0.131 | 0.079 | 0.036 | 0.009 | 0.041 | 0.0353 |
| databricks/dolly-v2-7b | 0.022 | 0.255 | 0.003 | 0.016 | 0 | 0.05 | 0.097 | 0.087 | 0.039 | 0.002 | 0.055 | 0.0344 |
| nomic-ai/gpt4all-13b-snoozy | 0.06 | 0.549 | 0.02 | 0.059 | 0.02 | 0.102 | 0.04 | 0.067 | 0.064 | 0.022 | 0.054 | 0.0612 |
| TheBloke/koala-13B-HF | 0.03 | 0.333 | 0.013 | 0.046 | 0.032 | 0.144 | 0.134 | 0.082 | 0.06 | 0.031 | 0.051 | 0.0566 |
| TheBloke/koala-7B-HF | 0.027 | 0.245 | 0.004 | 0.037 | 0.005 | 0.095 | 0.064 | 0.089 | 0.041 | 0.02 | 0.049 | 0.0409 |
| mosaicml/mpt-7b-chat | 0.037 | 0.363 | 0.005 | 0.057 | 0.025 | 0.182 | 0.136 | 0.042 | 0.052 | 0.015 | 0.045 | 0.0553 |
| OpenAssistant/oasst-sft-1-pythia-12b | 0.012 | 0.216 | 0.003 | 0.016 | 0.005 | 0.03 | 0.01 | 0.02 | 0.032 | 0.018 | 0.028 | 0.0218 |
| allenai/tulu-2-dpo-13b | 0.124 | 0.358 | 0.137 | 0.144 | 0.171 | 0.244 | 0.151 | 0.181 | 0.129 | 0.095 | 0.161 | 0.1481 |
| allenai/tulu-2-dpo-70b | 0.211 | 0.328 | 0.227 | 0.221 | 0.235 | 0.281 | 0.255 | 0.238 | 0.224 | 0.188 | 0.229 | 0.2267 |
| allenai/tulu-2-dpo-7b | 0.1 | 0.284 | 0.076 | 0.102 | 0.119 | 0.164 | 0.158 | 0.116 | 0.057 | 0.062 | 0.117 | 0.1012 |
| allenai/tulu-v2.5-ppo-13b-uf-mean-70b-uf-rm | 0.186 | 0.196 | 0.387 | 0.177 | 0.243 | 0.129 | 0.05 | 0.111 | 0.126 | 0.223 | 0.152 | 0.1957 |
| lmsys/vicuna-13b-v1.5 | 0.11 | 0.49 | 0.075 | 0.129 | 0.134 | 0.224 | 0.25 | 0.173 | 0.11 | 0.049 | 0.137 | 0.1299 |
| lmsys/vicuna-7b-v1.5 | 0.072 | 0.373 | 0.051 | 0.096 | 0.077 | 0.219 | 0.213 | 0.126 | 0.074 | 0.044 | 0.083 | 0.095 |

Ranks (`*_rank` columns, 1 = best):

| path | generation | open_qa | brainstorm | rewrite | summarize | classify | closed_qa | extract | reasoning_over_numerical_data | multi-document_synthesis | fact_checking_or_attributed_qa | average |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| meta-llama/Llama-2-13b-chat-hf | 14 | 8 | 17 | 14 | 9 | 12 | 3 | 14 | 18 | 19 | 16 | 17 |
| meta-llama/Llama-2-70b-chat-hf | 14 | 8 | 14 | 14 | 9 | 12 | 3 | 14 | 12 | 16 | 8 | 14 |
| meta-llama/Llama-2-7b-chat-hf | 19 | 8 | 17 | 20 | 18 | 23 | 20 | 22 | 23 | 19 | 22 | 20 |
| meta-llama/Llama-3.1-70B-Instruct | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 5 | 1 | 1 |
| allenai/Llama-3.1-Tulu-3-70B-DPO | 23 | 1 | 26 | 24 | 27 | 12 | 3 | 14 | 12 | 24 | 16 | 22 |
| allenai/Llama-3.1-Tulu-3-70B | 1 | 8 | 1 | 5 | 1 | 1 | 3 | 1 | 4 | 1 | 3 | 4 |
| allenai/Llama-3.1-Tulu-3-8B | 7 | 8 | 9 | 9 | 9 | 12 | 3 | 5 | 9 | 5 | 8 | 8 |
| meta-llama/Meta-Llama-3.1-8B-Instruct | 7 | 1 | 1 | 5 | 1 | 1 | 1 | 5 | 9 | 11 | 3 | 7 |
| mistralai/Mistral-7B-Instruct-v0.3 | 14 | 8 | 14 | 14 | 18 | 12 | 3 | 14 | 12 | 12 | 8 | 14 |
| mistralai/Mistral-Large-Instruct-2407 | 1 | 8 | 1 | 1 | 1 | 1 | 3 | 1 | 1 | 1 | 1 | 1 |
| mistralai/Mistral-Small-Instruct-2409 | 1 | 8 | 1 | 1 | 1 | 1 | 3 | 5 | 4 | 1 | 3 | 4 |
| allenai/OLMo-2-1124-13B-Instruct | 7 | 8 | 9 | 5 | 1 | 1 | 3 | 5 | 12 | 5 | 8 | 8 |
| allenai/OLMo-2-1124-7B-Instruct | 12 | 8 | 9 | 9 | 9 | 12 | 20 | 14 | 12 | 5 | 16 | 12 |
| allenai/OLMo-7B-0724-Instruct-hf | 23 | 28 | 21 | 28 | 23 | 30 | 29 | 32 | 23 | 19 | 27 | 26 |
| allenai/OLMo-7B-SFT | 28 | 8 | 26 | 28 | 23 | 23 | 20 | 22 | 23 | 24 | 22 | 26 |
| microsoft/Phi-3-medium-4k-instruct | 12 | 1 | 14 | 9 | 9 | 1 | 3 | 5 | 4 | 16 | 8 | 12 |
| Qwen/Qwen1.5-110B-Chat | 1 | 1 | 1 | 5 | 1 | 1 | 3 | 5 | 4 | 5 | 3 | 4 |
| Qwen/Qwen2-72B-Instruct | 7 | 1 | 9 | 9 | 9 | 1 | 3 | 5 | 4 | 12 | 8 | 8 |
| Qwen/Qwen2.5-72B-Instruct | 1 | 8 | 1 | 1 | 1 | 1 | 3 | 1 | 1 | 1 | 3 | 1 |
| WizardLMTeam/WizardLM-13B-V1.2 | 19 | 8 | 17 | 14 | 18 | 12 | 3 | 14 | 18 | 16 | 16 | 17 |
| 01-ai/Yi-1.5-34B-Chat | 7 | 8 | 1 | 9 | 9 | 1 | 3 | 5 | 9 | 5 | 8 | 8 |
| databricks/dolly-v2-12b | 33 | 28 | 32 | 32 | 27 | 30 | 20 | 22 | 33 | 32 | 27 | 31 |
| databricks/dolly-v2-7b | 28 | 28 | 32 | 32 | 34 | 30 | 29 | 22 | 23 | 32 | 27 | 31 |
| nomic-ai/gpt4all-13b-snoozy | 23 | 1 | 26 | 24 | 27 | 23 | 29 | 22 | 23 | 24 | 27 | 26 |
| TheBloke/koala-13B-HF | 28 | 8 | 26 | 28 | 27 | 23 | 20 | 22 | 23 | 24 | 27 | 29 |
| TheBloke/koala-7B-HF | 28 | 28 | 26 | 28 | 27 | 30 | 29 | 22 | 23 | 24 | 27 | 31 |
| mosaicml/mpt-7b-chat | 28 | 8 | 26 | 24 | 27 | 23 | 20 | 32 | 23 | 32 | 27 | 29 |
| OpenAssistant/oasst-sft-1-pythia-12b | 33 | 28 | 32 | 32 | 27 | 30 | 34 | 32 | 33 | 24 | 27 | 34 |
| allenai/tulu-2-dpo-13b | 19 | 8 | 21 | 20 | 18 | 12 | 20 | 14 | 18 | 19 | 16 | 20 |
| allenai/tulu-2-dpo-70b | 14 | 8 | 17 | 14 | 9 | 12 | 3 | 5 | 12 | 12 | 8 | 14 |
| allenai/tulu-2-dpo-7b | 23 | 28 | 23 | 20 | 23 | 23 | 20 | 22 | 23 | 19 | 22 | 24 |
| allenai/tulu-v2.5-ppo-13b-uf-mean-70b-uf-rm | 14 | 28 | 9 | 14 | 9 | 23 | 29 | 22 | 18 | 12 | 16 | 17 |
| lmsys/vicuna-13b-v1.5 | 19 | 8 | 23 | 20 | 18 | 12 | 3 | 14 | 18 | 24 | 22 | 22 |
| lmsys/vicuna-7b-v1.5 | 23 | 8 | 23 | 24 | 23 | 12 | 20 | 22 | 23 | 24 | 22 | 24 |

Confidence intervals for each score (`*_confi` columns, upper / lower bound strings as given):

| path | generation | open_qa | brainstorm | rewrite | summarize | classify | closed_qa | extract | reasoning_over_numerical_data | multi-document_synthesis | fact_checking_or_attributed_qa | average |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| meta-llama/Llama-2-13b-chat-hf | +3.0 / -3.1 | +8.8 / -8.8 | +3.4 / -3.3 | +2.9 / -2.8 | +6.4 / -6.4 | +6.0 / -5.7 | +5.9 / -5.4 | +5.4 / -5.2 | +2.7 / -2.7 | +2.9 / -2.6 | +3.6 / -3.6 | +1.17 / -1.17 |
| meta-llama/Llama-2-70b-chat-hf | +3.5 / -3.4 | +9.3 / -9.3 | +3.7 / -3.6 | +3.1 / -3.1 | +6.2 / -5.9 | +6.5 / -6.0 | +6.2 / -6.2 | +5.2 / -5.2 | +3.4 / -3.3 | +3.1 / -2.9 | +4.0 / -3.7 | +1.28 / -1.28 |
| meta-llama/Llama-2-7b-chat-hf | +3.2 / -2.9 | +9.3 / -8.8 | +3.3 / -3.3 | +2.6 / -2.4 | +5.9 / -5.4 | +5.5 / -5.0 | +5.7 / -5.2 | +4.7 / -4.2 | +2.2 / -2.1 | +2.6 / -2.6 | +2.8 / -2.8 | +1.08 / -1.03 |
| meta-llama/Llama-3.1-70B-Instruct | +3.8 / -3.9 | +8.3 / -8.8 | +4.2 / -4.1 | +3.7 / -3.8 | +6.7 / -6.7 | +6.7 / -6.5 | +6.2 / -6.4 | +6.2 / -5.9 | +4.0 / -4.1 | +4.6 / -4.4 | +4.5 / -4.5 | +1.44 / -1.44 |
| allenai/Llama-3.1-Tulu-3-70B-DPO | +2.0 / -1.8 | +9.8 / -9.8 | +1.2 / -1.0 | +2.1 / -2.0 | +2.0 / -1.5 | +6.0 / -5.7 | +6.4 / -6.2 | +5.7 / -5.4 | +3.4 / -3.3 | +1.5 / -1.3 | +3.3 / -3.0 | +0.94 / -0.96 |
| allenai/Llama-3.1-Tulu-3-70B | +4.0 / -4.0 | +9.3 / -9.3 | +4.0 / -4.0 | +3.6 / -3.7 | +6.7 / -6.7 | +6.5 / -7.0 | +6.4 / -6.4 | +6.7 / -6.4 | +4.0 / -4.1 | +4.6 / -4.6 | +4.3 / -4.1 | +1.48 / -1.47 |
| allenai/Llama-3.1-Tulu-3-8B | +3.8 / -3.8 | +9.8 / -9.8 | +4.0 / -3.8 | +3.6 / -3.5 | +6.4 / -5.9 | +6.5 / -6.2 | +5.9 / -5.7 | +6.2 / -5.9 | +3.7 / -3.7 | +4.4 / -4.6 | +4.1 / -3.8 | +1.40 / -1.41 |
| meta-llama/Meta-Llama-3.1-8B-Instruct | +3.8 / -3.7 | +9.3 / -9.8 | +4.0 / -4.0 | +3.7 / -3.6 | +6.9 / -6.4 | +6.5 / -6.5 | +6.4 / -6.7 | +5.7 / -5.7 | +3.7 / -3.8 | +4.6 / -4.4 | +4.3 / -4.2 | +1.48 / -1.46 |
| mistralai/Mistral-7B-Instruct-v0.3 | +3.2 / -3.0 | +9.8 / -9.3 | +3.7 / -3.7 | +3.1 / -3.0 | +5.9 / -5.4 | +6.2 / -5.7 | +6.2 / -5.9 | +5.2 / -5.0 | +3.3 / -3.2 | +4.0 / -3.8 | +3.9 / -3.8 | +1.24 / -1.29 |
| mistralai/Mistral-Large-Instruct-2407 | +4.1 / -4.0 | +9.8 / -9.8 | +3.9 / -3.9 | +3.6 / -3.7 | +6.9 / -6.9 | +6.7 / -6.5 | +6.2 / -5.9 | +6.4 / -6.2 | +4.0 / -4.0 | +4.4 / -4.6 | +4.4 / -4.3 | +1.49 / -1.53 |
| mistralai/Mistral-Small-Instruct-2409 | +3.9 / -4.1 | +9.8 / -9.8 | +4.1 / -3.9 | +3.7 / -3.7 | +6.7 / -6.7 | +6.5 / -6.5 | +6.2 / -5.9 | +6.2 / -6.2 | +4.1 / -3.9 | +4.6 / -4.6 | +4.4 / -4.5 | +1.48 / -1.43 |
| allenai/OLMo-2-1124-13B-Instruct | +4.0 / -3.9 | +9.8 / -9.8 | +3.9 / -3.8 | +3.6 / -3.7 | +6.4 / -6.4 | +6.7 / -6.7 | +6.4 / -6.2 | +5.9 / -5.5 | +3.5 / -3.5 | +4.6 / -4.4 | +4.3 / -4.0 | +1.48 / -1.44 |
| allenai/OLMo-2-1124-7B-Instruct | +3.7 / -3.7 | +9.3 / -8.3 | +3.8 / -3.8 | +3.4 / -3.4 | +5.9 / -5.9 | +6.0 / -5.7 | +5.2 / -5.0 | +5.4 / -5.0 | +3.4 / -3.1 | +4.4 / -4.4 | +3.7 / -3.7 | +1.36 / -1.35 |
| allenai/OLMo-7B-0724-Instruct-hf | +2.0 / -1.8 | +7.8 / -6.9 | +2.7 / -2.7 | +1.7 / -1.6 | +4.5 / -4.0 | +4.0 / -3.5 | +3.5 / -3.0 | +3.2 / -2.7 | +1.9 / -1.9 | +2.6 / -2.4 | +2.2 / -2.1 | +0.82 / -0.80 |
| allenai/OLMo-7B-SFT | +1.8 / -1.7 | +9.3 / -9.3 | +1.2 / -1.0 | +1.7 / -1.6 | +3.5 / -3.0 | +5.2 / -5.2 | +5.0 / -4.7 | +3.5 / -3.2 | +1.9 / -1.7 | +1.5 / -1.3 | +2.6 / -2.5 | +0.74 / -0.72 |
| microsoft/Phi-3-medium-4k-instruct | +3.7 / -3.4 | +9.3 / -9.3 | +3.8 / -3.7 | +3.4 / -3.3 | +6.9 / -6.4 | +6.5 / -6.5 | +6.2 / -6.2 | +5.7 / -5.4 | +4.0 / -3.9 | +3.5 / -3.5 | +3.9 / -4.0 | +1.33 / -1.35 |
| Qwen/Qwen1.5-110B-Chat | +3.9 / -3.9 | +9.3 / -9.3 | +3.8 / -4.0 | +3.6 / -3.7 | +6.4 / -6.4 | +6.7 / -6.7 | +6.2 / -5.9 | +6.2 / -6.2 | +3.8 / -3.9 | +4.6 / -4.6 | +4.3 / -4.4 | +1.47 / -1.43 |
| Qwen/Qwen2-72B-Instruct | +3.8 / -3.7 | +9.8 / -9.8 | +4.0 / -4.2 | +3.4 / -3.4 | +6.2 / -5.9 | +6.5 / -6.5 | +5.9 / -5.7 | +6.4 / -6.4 | +3.9 / -4.0 | +4.2 / -4.0 | +4.0 / -4.0 | +1.43 / -1.42 |
| Qwen/Qwen2.5-72B-Instruct | +3.9 / -4.0 | +9.3 / -9.8 | +4.1 / -4.2 | +3.7 / -3.7 | +6.9 / -6.9 | +6.5 / -6.5 | +5.9 / -5.7 | +6.4 / -6.7 | +4.0 / -4.0 | +4.4 / -4.6 | +4.4 / -4.4 | +1.54 / -1.49 |
| WizardLMTeam/WizardLM-13B-V1.2 | +3.0 / -2.8 | +9.3 / -9.3 | +3.4 / -3.3 | +2.9 / -2.9 | +5.9 / -5.4 | +6.0 / -5.7 | +5.7 / -5.4 | +5.0 / -4.5 | +2.6 / -2.5 | +3.3 / -3.1 | +3.2 / -3.0 | +1.14 / -1.14 |
| 01-ai/Yi-1.5-34B-Chat | +3.8 / -3.8 | +9.3 / -9.3 | +4.2 / -4.0 | +3.4 / -3.3 | +6.4 / -5.9 | +6.7 / -6.2 | +5.9 / -5.9 | +5.9 / -5.7 | +3.7 / -3.7 | +4.4 / -4.4 | +4.2 / -4.1 | +1.44 / -1.40 |
| databricks/dolly-v2-12b | +1.2 / -1.0 | +7.8 / -7.8 | +0.3 / -0.2 | +1.1 / -1.0 | +1.0 / -0.5 | +3.5 / -3.0 | +4.7 / -4.5 | +3.7 / -3.5 | +1.7 / -1.5 | +0.9 / -0.7 | +1.9 / -1.7 | +0.55 / -0.54 |
| databricks/dolly-v2-7b | +1.3 / -1.2 | +8.8 / -7.8 | +0.5 / -0.3 | +1.0 / -0.9 | +0.0 / -0.0 | +3.5 / -2.5 | +4.2 / -4.0 | +4.2 / -3.7 | +1.7 / -1.5 | +0.4 / -0.2 | +2.2 / -1.9 | +0.54 / -0.55 |
| nomic-ai/gpt4all-13b-snoozy | +1.8 / -1.8 | +9.8 / -9.8 | +1.2 / -1.0 | +1.7 / -1.7 | +2.0 / -1.5 | +4.2 / -4.0 | +3.0 / -2.5 | +3.5 / -3.2 | +2.2 / -2.0 | +1.5 / -1.3 | +2.2 / -1.9 | +0.72 / -0.68 |
| TheBloke/koala-13B-HF | +1.5 / -1.2 | +9.3 / -8.8 | +1.0 / -0.8 | +1.6 / -1.5 | +2.5 / -2.2 | +5.0 / -4.5 | +4.7 / -4.2 | +3.7 / -3.5 | +2.1 / -1.9 | +1.8 / -1.5 | +2.2 / -1.9 | +0.69 / -0.69 |
| TheBloke/koala-7B-HF | +1.3 / -1.2 | +8.8 / -7.8 | +0.6 / -0.4 | +1.4 / -1.3 | +1.0 / -0.5 | +4.0 / -4.0 | +3.7 / -3.0 | +4.0 / -3.7 | +1.7 / -1.5 | +1.3 / -1.1 | +2.1 / -1.8 | +0.59 / -0.58 |
| mosaicml/mpt-7b-chat | +1.7 / -1.3 | +8.8 / -8.8 | +0.7 / -0.5 | +1.7 / -1.6 | +2.5 / -2.0 | +5.5 / -5.2 | +4.7 / -4.5 | +2.7 / -2.5 | +1.9 / -1.9 | +1.3 / -1.1 | +2.1 / -1.7 | +0.69 / -0.67 |
| OpenAssistant/oasst-sft-1-pythia-12b | +1.0 / -0.8 | +7.8 / -7.8 | +0.5 / -0.3 | +1.0 / -0.9 | +1.0 / -0.5 | +2.5 / -2.0 | +1.5 / -1.0 | +2.0 / -1.5 | +1.7 / -1.3 | +1.3 / -1.1 | +1.5 / -1.3 | +0.45 / -0.42 |
| allenai/tulu-2-dpo-13b | +2.7 / -2.6 | +9.3 / -8.8 | +2.8 / -2.7 | +2.6 / -2.6 | +5.2 / -5.0 | +6.0 / -5.7 | +5.2 / -4.7 | +5.2 / -5.0 | +2.8 / -2.7 | +2.9 / -2.6 | +3.4 / -3.2 | +1.05 / -1.07 |
| allenai/tulu-2-dpo-70b | +3.2 / -3.2 | +9.3 / -8.3 | +3.5 / -3.3 | +3.1 / -3.0 | +6.2 / -5.7 | +6.0 / -6.0 | +5.9 / -5.7 | +5.7 / -5.4 | +3.5 / -3.4 | +3.8 / -3.5 | +4.0 / -3.7 | +1.26 / -1.22 |
| allenai/tulu-2-dpo-7b | +2.5 / -2.3 | +8.8 / -8.8 | +2.1 / -2.1 | +2.3 / -2.2 | +4.5 / -4.5 | +5.0 / -5.0 | +5.0 / -5.0 | +4.5 / -4.0 | +2.0 / -1.9 | +2.2 / -2.2 | +3.0 / -2.8 | +0.92 / -0.92 |
| allenai/tulu-v2.5-ppo-13b-uf-mean-70b-uf-rm | +3.2 / -3.0 | +7.8 / -6.9 | +4.0 / -3.7 | +2.9 / -2.7 | +5.9 / -5.9 | +4.7 / -4.5 | +3.2 / -2.7 | +4.2 / -4.2 | +2.9 / -2.6 | +4.0 / -4.0 | +3.5 / -3.0 | +1.20 / -1.24 |
| lmsys/vicuna-13b-v1.5 | +2.7 / -2.5 | +9.3 / -9.3 | +2.2 / -2.0 | +2.6 / -2.4 | +5.0 / -4.5 | +5.7 / -5.5 | +5.9 / -5.7 | +5.0 / -4.7 | +2.7 / -2.6 | +2.2 / -1.8 | +3.0 / -3.0 | +1.02 / -0.97 |
| lmsys/vicuna-7b-v1.5 | +2.2 / -2.0 | +9.3 / -8.8 | +1.8 / -1.7 | +2.3 / -2.1 | +3.7 / -3.5 | +5.7 / -5.7 | +5.7 / -5.7 | +4.7 / -4.2 | +2.3 / -2.1 | +2.0 / -1.8 | +2.6 / -2.4 | +0.88 / -0.86 |
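Each `*_confi` cell is a fixed-format string such as `+3.0 / -3.1`, which is why the schema reports constant string lengths (11 for the task columns, 13 for `average_confi`). Below is a small sketch for turning those strings into numeric bounds; `parse_confi` is a hypothetical helper written for illustration, not part of the dataset:

```python
import re

def parse_confi(confi: str) -> tuple[float, float]:
    """Parse a confidence-interval string like '+3.0 / -3.1' into
    (upper, lower) magnitudes. Hypothetical helper for illustration."""
    m = re.fullmatch(r"\+(\d+\.\d+) / -(\d+\.\d+)", confi.strip())
    if m is None:
        raise ValueError(f"unexpected confidence-interval format: {confi!r}")
    return float(m.group(1)), float(m.group(2))

print(parse_confi("+3.0 / -3.1"))    # (3.0, 3.1)
print(parse_confi("+1.17 / -1.17"))  # (1.17, 1.17)
```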
- Downloads last month: 39
- Size of downloaded dataset files: 42.3 kB
- Size of the auto-converted Parquet files: 400 kB
- Number of rows: 34
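Since the viewer auto-converts the data to Parquet, the table can also be read directly with pandas through the `hf://` filesystem provided by `huggingface_hub`. This is a minimal sketch; the repository id and Parquet file path below are placeholders, not the actual locations:

```python
# Minimal sketch, assuming huggingface_hub's hf:// fsspec integration;
# both the repository id and the Parquet file path are placeholders.
import pandas as pd

df = pd.read_parquet("hf://datasets/<namespace>/<dataset-name>/<file>.parquet")
print(df.shape)  # expect (34, 37): 34 rows, 37 columns
```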