| model_type | model | AVG | CG | EL | FA | HE | MC | MR | MT | NLI | QA | RC | SUM | aio_char_f1 | alt-e-to-j_bert_score_ja_f1 | alt-e-to-j_bleu_ja | alt-e-to-j_comet_wmt22 | alt-j-to-e_bert_score_en_f1 | alt-j-to-e_bleu_en | alt-j-to-e_comet_wmt22 | chabsa_set_f1 | commonsensemoralja_exact_match | jamp_exact_match | janli_exact_match | jcommonsenseqa_exact_match | jemhopqa_char_f1 | jmmlu_exact_match | jnli_exact_match | jsem_exact_match | jsick_exact_match | jsquad_char_f1 | jsts_pearson | jsts_spearman | kuci_exact_match | mawps_exact_match | mbpp_code_exec | mbpp_pylint_check | mmlu_en_exact_match | niilc_char_f1 | wiki_coreference_set_f1 | wiki_dependency_set_f1 | wiki_ner_set_f1 | wiki_pas_set_f1 | wiki_reading_char_f1 | wikicorpus-e-to-j_bert_score_ja_f1 | wikicorpus-e-to-j_bleu_ja | wikicorpus-e-to-j_comet_wmt22 | wikicorpus-j-to-e_bert_score_en_f1 | wikicorpus-j-to-e_bleu_en | wikicorpus-j-to-e_comet_wmt22 | xlsum_ja_bert_score_ja_f1 | xlsum_ja_bleu_ja | xlsum_ja_rouge1 | xlsum_ja_rouge2 | xlsum_ja_rouge2_scaling | xlsum_ja_rougeLsum | architecture | precision | license | params | likes | revision | num_few_shot | add_special_tokens | llm_jp_eval_version | vllm_version |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| ⭕ : instruction-tuned | llm-jp/llm-jp-3-1.8b-instruct | 0.3923 | 0.002 | 0.4046 | 0.1919 | 0.2842 | 0.3697 | 0.418 | 0.7999 | 0.4786 | 0.4957 | 0.834 | 0.0371 | 0.6003 | 0.8445 | 10.6032 | 0.8842 | 0.9337 | 12.773 | 0.8448 | 0.4046 | 0.5421 | 0.3822 | 0.4847 | 0.3101 | 0.4307 | 0.2895 | 0.5838 | 0.6976 | 0.2446 | 0.834 | 0.4386 | 0.5179 | 0.257 | 0.418 | 0.002 | 0.0141 | 0.2789 | 0.4561 | 0.0288 | 0.172 | 0.0177 | 0.0139 | 0.7272 | 0.7723 | 7.495 | 0.7693 | 0.8829 | 9.2901 | 0.7011 | 0.6449 | 1.1871 | 18.1463 | 3.7324 | 0.0371 | 15.2299 | LlamaForCausalLM | bfloat16 | apache-2.0 | 1.868 | 20 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | llm-jp/llm-jp-3-1.8b-instruct | 0.2147 | 0.002 | 0 | 0.0442 | 0.1461 | 0.2648 | 0 | 0.7888 | 0.2062 | 0.1956 | 0.6768 | 0.0371 | 0.2575 | 0.8361 | 10.3819 | 0.8796 | 0.931 | 12.1086 | 0.8421 | 0 | 0.5321 | 0.1724 | 0.0028 | 0 | 0.0767 | 0.0263 | 0.2424 | 0.0265 | 0.587 | 0.6768 | -0.0876 | -0.0888 | 0.2624 | 0 | 0.002 | 0.0141 | 0.2659 | 0.2526 | 0 | 0 | 0 | 0 | 0.2208 | 0.7485 | 6.3026 | 0.7504 | 0.873 | 7.9213 | 0.683 | 0.6449 | 1.1871 | 18.1463 | 3.7324 | 0.0371 | 15.2299 | LlamaForCausalLM | bfloat16 | apache-2.0 | 1.868 | 20 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | llm-jp/llm-jp-3-13b-instruct | 0.5462 | 0.0843 | 0.4843 | 0.2782 | 0.4739 | 0.8438 | 0.71 | 0.8393 | 0.669 | 0.6097 | 0.9014 | 0.1143 | 0.8415 | 0.8618 | 13.1859 | 0.9068 | 0.9476 | 14.866 | 0.8726 | 0.4843 | 0.8903 | 0.5575 | 0.6403 | 0.8928 | 0.3473 | 0.4651 | 0.7309 | 0.7481 | 0.6682 | 0.9014 | 0.7957 | 0.7949 | 0.7484 | 0.71 | 0.0843 | 0.2731 | 0.4827 | 0.6403 | 0.0396 | 0.4027 | 0.0354 | 0.047 | 0.8665 | 0.82 | 10.6349 | 0.8302 | 0.9031 | 10.9633 | 0.7478 | 0.7015 | 2.9771 | 29.3132 | 11.4455 | 0.1143 | 25.0207 | LlamaForCausalLM | bfloat16 | apache-2.0 | 13.708 | 16 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | llm-jp/llm-jp-3-13b-instruct | 0.412 | 0.0843 | 0.0104 | 0.1583 | 0.2727 | 0.7287 | 0.374 | 0.8316 | 0.6881 | 0.3822 | 0.8868 | 0.1143 | 0.6447 | 0.8561 | 12.9677 | 0.9014 | 0.9483 | 15.0672 | 0.8752 | 0.0104 | 0.7492 | 0.5431 | 0.6014 | 0.8704 | 0.0655 | 0.1519 | 0.7896 | 0.7424 | 0.7642 | 0.8868 | 0.8204 | 0.8177 | 0.5663 | 0.374 | 0.0843 | 0.2731 | 0.3935 | 0.4366 | 0 | 0.0025 | 0 | 0 | 0.7892 | 0.7918 | 8.7021 | 0.8163 | 0.8927 | 9.9921 | 0.7335 | 0.7015 | 2.9771 | 29.3132 | 11.4455 | 0.1143 | 25.0207 | LlamaForCausalLM | bfloat16 | apache-2.0 | 13.708 | 16 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | llm-jp/llm-jp-3-3.7b-instruct | 0.4597 | 0 | 0.3691 | 0.2398 | 0.3628 | 0.5839 | 0.58 | 0.822 | 0.551 | 0.5709 | 0.8488 | 0.1283 | 0.7141 | 0.8504 | 10.634 | 0.8959 | 0.9388 | 13.2399 | 0.8562 | 0.3691 | 0.7791 | 0.4282 | 0.5236 | 0.5407 | 0.4693 | 0.3555 | 0.6845 | 0.7374 | 0.3816 | 0.8488 | 0.7156 | 0.6453 | 0.4321 | 0.58 | 0 | 0 | 0.37 | 0.5293 | 0.0142 | 0.3409 | 0.0088 | 0.0575 | 0.7774 | 0.7961 | 8.5097 | 0.8052 | 0.8954 | 9.6317 | 0.7307 | 0.7119 | 2.9956 | 35.8746 | 12.8065 | 0.1283 | 29.3662 | LlamaForCausalLM | bfloat16 | apache-2.0 | 3.783 | 7 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | llm-jp/llm-jp-3-3.7b-instruct | 0.3016 | 0 | 0 | 0.115 | 0.1869 | 0.4517 | 0.01 | 0.8106 | 0.5867 | 0.2262 | 0.8022 | 0.1283 | 0.3361 | 0.8374 | 10.3631 | 0.8874 | 0.939 | 13.6521 | 0.8578 | 0 | 0.5321 | 0.3649 | 0.6028 | 0.4209 | 0.0591 | 0.0181 | 0.6002 | 0.7115 | 0.6539 | 0.8022 | 0.2802 | 0.3889 | 0.4022 | 0.01 | 0 | 0 | 0.3556 | 0.2835 | 0 | 0 | 0 | 0 | 0.5751 | 0.7703 | 7.1206 | 0.7906 | 0.8781 | 8.625 | 0.7064 | 0.7119 | 2.9956 | 35.8746 | 12.8065 | 0.1283 | 29.3662 | LlamaForCausalLM | bfloat16 | apache-2.0 | 3.783 | 7 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | Qwen/Qwen2.5-32B-Instruct | 0.6553 | 0.5281 | 0.5894 | 0.2737 | 0.7757 | 0.8966 | 0.944 | 0.8479 | 0.8106 | 0.541 | 0.9047 | 0.097 | 0.553 | 0.8644 | 13.2738 | 0.9081 | 0.9554 | 17.7737 | 0.8859 | 0.5894 | 0.8975 | 0.6724 | 0.8431 | 0.958 | 0.5672 | 0.7515 | 0.8973 | 0.7835 | 0.8569 | 0.9047 | 0.8895 | 0.877 | 0.8343 | 0.944 | 0.5281 | 0.755 | 0.8 | 0.5029 | 0.0543 | 0.3837 | 0 | 0.1104 | 0.8204 | 0.8291 | 10.9975 | 0.8389 | 0.9045 | 11.1213 | 0.7585 | 0.6926 | 2.7959 | 25.855 | 9.7054 | 0.097 | 22.5323 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 120 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | Qwen/Qwen2.5-32B-Instruct | 0.5443 | 0.5281 | 0.107 | 0.1453 | 0.568 | 0.8739 | 0.79 | 0.8386 | 0.7647 | 0.3873 | 0.8871 | 0.097 | 0.4392 | 0.8489 | 11.3776 | 0.9009 | 0.9511 | 15.766 | 0.8797 | 0.107 | 0.9046 | 0.6494 | 0.7944 | 0.9303 | 0.2681 | 0.5561 | 0.82 | 0.798 | 0.7615 | 0.8871 | 0.8951 | 0.8761 | 0.7869 | 0.79 | 0.5281 | 0.755 | 0.58 | 0.4547 | 0.0281 | 0.0071 | 0.0354 | 0.0058 | 0.6499 | 0.8004 | 8.7234 | 0.8268 | 0.8969 | 9.5439 | 0.7471 | 0.6926 | 2.7959 | 25.855 | 9.7054 | 0.097 | 22.5323 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 120 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | Qwen/Qwen2.5-7B-Instruct | 0.5304 | 0.0221 | 0.498 | 0.1822 | 0.6385 | 0.8416 | 0.796 | 0.7302 | 0.7371 | 0.4006 | 0.8899 | 0.0983 | 0.3771 | 0.8222 | 11.2802 | 0.826 | 0.9327 | 15.5139 | 0.8347 | 0.498 | 0.859 | 0.6034 | 0.7431 | 0.9151 | 0.4305 | 0.6103 | 0.8295 | 0.6989 | 0.8106 | 0.8899 | 0.8729 | 0.8468 | 0.7507 | 0.796 | 0.0221 | 0.012 | 0.6666 | 0.3941 | 0.0449 | 0.2922 | 0.0354 | 0.0773 | 0.4614 | 0.7288 | 8.1478 | 0.6153 | 0.8592 | 9.0236 | 0.6446 | 0.692 | 2.1112 | 29.1184 | 9.8419 | 0.0983 | 23.7063 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 7.616 | 271 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | Qwen/Qwen2.5-7B-Instruct | 0.2896 | 0.0221 | 0.2081 | 0.0822 | 0.4418 | 0.496 | 0.292 | 0.6415 | 0.2336 | 0.2228 | 0.4478 | 0.0983 | 0.2136 | 0.7191 | 7.042 | 0.7345 | 0.831 | 11.0762 | 0.5875 | 0.2081 | 0.0143 | 0.5029 | 0 | 0.8213 | 0.2105 | 0.4936 | 0.3246 | 0 | 0.3408 | 0.4478 | 0.875 | 0.8434 | 0.6524 | 0.292 | 0.0221 | 0.012 | 0.3899 | 0.2442 | 0.01 | 0.0234 | 0 | 0 | 0.3778 | 0.6795 | 6.3014 | 0.6785 | 0.8165 | 7.1661 | 0.5654 | 0.692 | 2.1112 | 29.1184 | 9.8419 | 0.0983 | 23.7063 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 7.616 | 271 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | weblab-GENIAC/Tanuki-8B-dpo-v1.0 | 0.3779 | 0.0783 | 0.398 | 0.1031 | 0.3067 | 0.4096 | 0.3 | 0.8275 | 0.5318 | 0.4266 | 0.6665 | 0.1083 | 0.6303 | 0.8351 | 10.0227 | 0.899 | 0.938 | 13.2353 | 0.8598 | 0.398 | 0.6964 | 0.4368 | 0.5431 | 0.2717 | 0.2847 | 0.3152 | 0.5965 | 0.661 | 0.4216 | 0.6665 | 0.3897 | 0.3894 | 0.2606 | 0.3 | 0.0783 | 0.2088 | 0.2983 | 0.3647 | 0 | 0.0212 | 0 | 0.0096 | 0.4849 | 0.7795 | 7.8083 | 0.8166 | 0.8903 | 8.5388 | 0.7346 | 0.7025 | 2.3051 | 34.8952 | 10.8447 | 0.1083 | 28.3762 | LlamaForCausalLM | bfloat16 | apache-2.0 | 7.512 | 29 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | weblab-GENIAC/Tanuki-8B-dpo-v1.0 | 0.2122 | 0.0783 | 0 | 0.0873 | 0 | 0.1774 | 0 | 0.829 | 0.414 | 0.1057 | 0.534 | 0.1083 | 0.0853 | 0.8373 | 10.8897 | 0.9 | 0.9427 | 12.9964 | 0.8684 | 0 | 0.5321 | 0.3563 | 0.3861 | 0 | 0.098 | 0 | 0.1553 | 0.5997 | 0.5726 | 0.534 | 0 | 0 | 0 | 0 | 0.0783 | 0.2088 | 0 | 0.1337 | 0 | 0 | 0 | 0 | 0.4366 | 0.7729 | 6.9734 | 0.8122 | 0.8905 | 7.4821 | 0.7355 | 0.7025 | 2.3051 | 34.8952 | 10.8447 | 0.1083 | 28.3762 | LlamaForCausalLM | bfloat16 | apache-2.0 | 7.512 | 29 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | tokyotech-llm/Llama-3.1-Swallow-70B-Instruct-v0.1 | 0.62 | 0.0201 | 0.5529 | 0.3371 | 0.7313 | 0.9057 | 0.932 | 0.8565 | 0.7605 | 0.7131 | 0.9252 | 0.0856 | 0.8606 | 0.876 | 14.5217 | 0.9157 | 0.9603 | 19.3015 | 0.8924 | 0.5529 | 0.9296 | 0.6523 | 0.8069 | 0.9589 | 0.619 | 0.7063 | 0.7477 | 0.7923 | 0.8033 | 0.9252 | 0.878 | 0.8477 | 0.8287 | 0.932 | 0.0201 | 0.0422 | 0.7562 | 0.6597 | 0.0906 | 0.4663 | 0.1327 | 0.0903 | 0.9053 | 0.8501 | 13.863 | 0.8529 | 0.9137 | 12.9227 | 0.7651 | 0.6804 | 3.3097 | 20.1389 | 8.5601 | 0.0856 | 18.0061 | LlamaForCausalLM | bfloat16 | llama3.1;gemma | 70.554 | 3 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | tokyotech-llm/Llama-3.1-Swallow-70B-Instruct-v0.1 | 0.507 | 0.0201 | 0.2715 | 0.1917 | 0.6465 | 0.8642 | 0.744 | 0.7167 | 0.7081 | 0.5798 | 0.749 | 0.0856 | 0.7149 | 0.8644 | 13.3248 | 0.9114 | 0.8297 | 17.2139 | 0.5661 | 0.2715 | 0.9216 | 0.6351 | 0.7458 | 0.9267 | 0.4623 | 0.6058 | 0.576 | 0.7753 | 0.8084 | 0.749 | 0.8408 | 0.7986 | 0.7443 | 0.744 | 0.0201 | 0.0422 | 0.6873 | 0.5622 | 0.0163 | 0.0057 | 0.0619 | 0.0075 | 0.867 | 0.8134 | 10.2308 | 0.8336 | 0.8206 | 10.8013 | 0.5556 | 0.6804 | 3.3097 | 20.1389 | 8.5601 | 0.0856 | 18.0061 | LlamaForCausalLM | bfloat16 | llama3.1;gemma | 70.554 | 3 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | tokyotech-llm/Llama-3.1-Swallow-8B-Instruct-v0.2 | 0.565 | 0.0321 | 0.5385 | 0.2951 | 0.5743 | 0.8428 | 0.72 | 0.8484 | 0.7158 | 0.6428 | 0.9167 | 0.0883 | 0.7629 | 0.8709 | 13.8312 | 0.9115 | 0.9537 | 16.8957 | 0.8828 | 0.5385 | 0.8823 | 0.5345 | 0.725 | 0.9276 | 0.559 | 0.5507 | 0.8073 | 0.7601 | 0.752 | 0.9167 | 0.7888 | 0.7604 | 0.7184 | 0.72 | 0.0321 | 0.1365 | 0.5978 | 0.6066 | 0.0352 | 0.395 | 0.0885 | 0.1022 | 0.8549 | 0.835 | 11.7829 | 0.8406 | 0.9092 | 11.5867 | 0.7587 | 0.6827 | 2.8084 | 21.4684 | 8.8428 | 0.0883 | 18.9717 | LlamaForCausalLM | bfloat16 | llama3.1;gemma | 8.03 | 6 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | tokyotech-llm/Llama-3.1-Swallow-8B-Instruct-v0.2 | 0.4292 | 0.0321 | 0.1047 | 0.1583 | 0.4096 | 0.799 | 0.294 | 0.8335 | 0.6602 | 0.4712 | 0.8707 | 0.0883 | 0.6063 | 0.8593 | 12.1636 | 0.905 | 0.9508 | 15.8558 | 0.8781 | 0.1047 | 0.8682 | 0.5172 | 0.6528 | 0.8838 | 0.4527 | 0.3663 | 0.6167 | 0.7601 | 0.7542 | 0.8707 | 0.8008 | 0.7914 | 0.6449 | 0.294 | 0.0321 | 0.1365 | 0.4529 | 0.3547 | 0.0026 | 0.0004 | 0.0088 | 0 | 0.7798 | 0.8002 | 8.6761 | 0.818 | 0.8933 | 10.2604 | 0.7328 | 0.6827 | 2.8084 | 21.4684 | 8.8428 | 0.0883 | 18.9717 | LlamaForCausalLM | bfloat16 | llama3.1;gemma | 8.03 | 6 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | rinna/gemma-2-baku-2b-it | 0.4477 | 0 | 0.4434 | 0.1815 | 0.4444 | 0.7059 | 0.536 | 0.8302 | 0.4111 | 0.3726 | 0.8741 | 0.1256 | 0.4408 | 0.8527 | 10.6366 | 0.8981 | 0.9441 | 13.6152 | 0.8697 | 0.4434 | 0.7004 | 0.4023 | 0.5639 | 0.8534 | 0.2712 | 0.4084 | 0.4129 | 0.2986 | 0.3779 | 0.8741 | 0.6961 | 0.7189 | 0.5639 | 0.536 | 0 | 0 | 0.4805 | 0.4057 | 0.0348 | 0.0877 | 0.0708 | 0.0355 | 0.6786 | 0.7845 | 7.0376 | 0.8119 | 0.8944 | 8.8013 | 0.741 | 0.7177 | 2.2358 | 37.6981 | 12.5686 | 0.1256 | 30.0896 | Gemma2ForCausalLM | float16 | gemma | 2.614 | 14 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | rinna/gemma-2-baku-2b-it | 0.2449 | 0 | 0.0093 | 0.1214 | 0.0013 | 0.2852 | 0.008 | 0.823 | 0.3259 | 0.2184 | 0.7761 | 0.1256 | 0.3312 | 0.8416 | 9.5574 | 0.8942 | 0.9392 | 11.6762 | 0.8618 | 0.0093 | 0.8021 | 0.3649 | 0.5014 | 0 | 0.0083 | 0.0025 | 0.2991 | 0.1629 | 0.3012 | 0.7761 | 0.2881 | 0.2782 | 0.0533 | 0.008 | 0 | 0 | 0 | 0.3157 | 0 | 0 | 0 | 0 | 0.6068 | 0.7784 | 6.6772 | 0.7992 | 0.8924 | 7.8981 | 0.7369 | 0.7177 | 2.2358 | 37.6981 | 12.5686 | 0.1256 | 30.0896 | Gemma2ForCausalLM | float16 | gemma | 2.614 | 14 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | cyberagent/calm3-22b-chat | 0.5612 | 0 | 0.5404 | 0.247 | 0.5592 | 0.8527 | 0.684 | 0.8464 | 0.7519 | 0.6677 | 0.9118 | 0.1121 | 0.8433 | 0.8705 | 13.6161 | 0.9117 | 0.9553 | 16.6257 | 0.8857 | 0.5404 | 0.8677 | 0.5805 | 0.7875 | 0.933 | 0.5109 | 0.5408 | 0.8583 | 0.7715 | 0.7617 | 0.9118 | 0.8637 | 0.86 | 0.7575 | 0.684 | 0 | 0 | 0.5776 | 0.6489 | 0.001 | 0.2251 | 0.1239 | 0.0316 | 0.8532 | 0.8177 | 9.4163 | 0.8375 | 0.9039 | 10.3951 | 0.7506 | 0.7019 | 2.5206 | 31.9179 | 11.2189 | 0.1121 | 26.8509 | LlamaForCausalLM | bfloat16 | apache-2.0 | 22.543 | 67 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | cyberagent/calm3-22b-chat | 0.4792 | 0 | 0.2776 | 0.1668 | 0.4546 | 0.7844 | 0.578 | 0.8388 | 0.6843 | 0.4854 | 0.8887 | 0.1121 | 0.6197 | 0.8517 | 12.6994 | 0.907 | 0.9535 | 16.1239 | 0.8829 | 0.2776 | 0.8297 | 0.5517 | 0.6986 | 0.8954 | 0.3577 | 0.4818 | 0.7683 | 0.7689 | 0.6341 | 0.8887 | 0.8853 | 0.8589 | 0.6281 | 0.578 | 0 | 0 | 0.4275 | 0.4788 | 0.0019 | 0.0016 | 0.0022 | 0 | 0.8285 | 0.7865 | 7.7148 | 0.8174 | 0.9002 | 9.9171 | 0.748 | 0.7019 | 2.5206 | 31.9179 | 11.2189 | 0.1121 | 26.8509 | LlamaForCausalLM | bfloat16 | apache-2.0 | 22.543 | 67 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | elyza/Llama-3-ELYZA-JP-8B | 0.556 | 0.2972 | 0.5105 | 0.27 | 0.4999 | 0.8048 | 0.712 | 0.8285 | 0.6305 | 0.5457 | 0.9112 | 0.1058 | 0.6179 | 0.8476 | 11.1585 | 0.8843 | 0.9453 | 14.1463 | 0.8677 | 0.5105 | 0.8166 | 0.4856 | 0.6069 | 0.8981 | 0.5168 | 0.4761 | 0.7555 | 0.6654 | 0.6389 | 0.9112 | 0.8215 | 0.7794 | 0.6995 | 0.712 | 0.2972 | 0.7269 | 0.5236 | 0.5023 | 0.0209 | 0.3524 | 0.0885 | 0.0848 | 0.8035 | 0.8125 | 9.6534 | 0.8214 | 0.8976 | 10.308 | 0.7407 | 0.6986 | 2.4239 | 28.7904 | 10.5753 | 0.1058 | 24.7182 | LlamaForCausalLM | bfloat16 | llama3 | 8.03 | 76 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | elyza/Llama-3-ELYZA-JP-8B | 0.4106 | 0.2972 | 0.0401 | 0.1057 | 0.3654 | 0.5815 | 0.446 | 0.8151 | 0.549 | 0.3639 | 0.8475 | 0.1058 | 0.4164 | 0.8396 | 9.0926 | 0.886 | 0.94 | 12.3152 | 0.8587 | 0.0401 | 0.7778 | 0.3678 | 0.4972 | 0.563 | 0.3615 | 0.3211 | 0.5415 | 0.7096 | 0.629 | 0.8475 | 0.4597 | 0.4683 | 0.4036 | 0.446 | 0.2972 | 0.7269 | 0.4098 | 0.3137 | 0 | 0.0021 | 0 | 0.0006 | 0.5257 | 0.7819 | 7.1661 | 0.7972 | 0.89 | 9.2311 | 0.7182 | 0.6986 | 2.4239 | 28.7904 | 10.5753 | 0.1058 | 24.7182 | LlamaForCausalLM | bfloat16 | llama3 | 8.03 | 76 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | Rakuten/RakutenAI-7B-chat | 0.4627 | 0.0863 | 0.3989 | 0.1521 | 0.4763 | 0.8122 | 0.41 | 0.8169 | 0.5524 | 0.418 | 0.8645 | 0.1017 | 0.5386 | 0.8624 | 13.0202 | 0.8952 | 0.9509 | 16.8546 | 0.8778 | 0.3989 | 0.7708 | 0.4713 | 0.6458 | 0.9053 | 0.26 | 0.4377 | 0.7449 | 0.4444 | 0.4554 | 0.8645 | 0.7956 | 0.7642 | 0.7605 | 0.41 | 0.0863 | 0.3072 | 0.515 | 0.4555 | 0.0108 | 0.1118 | 0.0442 | 0.0445 | 0.5493 | 0.7868 | 9.6886 | 0.7865 | 0.8938 | 9.7413 | 0.708 | 0.6961 | 2.2839 | 28.4861 | 10.1816 | 0.1017 | 23.911 | MistralForCausalLM | bfloat16 | apache-2.0 | 7.373 | 59 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | Rakuten/RakutenAI-7B-chat | 0.3167 | 0.0863 | 0.0067 | 0.0985 | 0.0041 | 0.7895 | 0.02 | 0.8058 | 0.6148 | 0.2592 | 0.6967 | 0.1017 | 0.3986 | 0.8555 | 13.0705 | 0.8908 | 0.9425 | 14.4667 | 0.8668 | 0.0067 | 0.7455 | 0.477 | 0.6583 | 0.9133 | 0.0218 | 0.0003 | 0.8509 | 0.5941 | 0.4936 | 0.6967 | 0.7406 | 0.7445 | 0.7097 | 0.02 | 0.0863 | 0.3072 | 0.0078 | 0.3572 | 0.0033 | 0 | 0 | 0 | 0.4891 | 0.7883 | 8.5147 | 0.7951 | 0.8782 | 9.0212 | 0.6703 | 0.6961 | 2.2839 | 28.4861 | 10.1816 | 0.1017 | 23.911 | MistralForCausalLM | bfloat16 | apache-2.0 | 7.373 | 59 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | cyberagent/calm2-7b-chat | 0.3474 | 0 | 0.3317 | 0.1411 | 0.254 | 0.3255 | 0.108 | 0.7897 | 0.5213 | 0.5648 | 0.7719 | 0.0136 | 0.7041 | 0.8445 | 11.3228 | 0.8869 | 0.9368 | 13.4747 | 0.8479 | 0.3317 | 0.4679 | 0.4052 | 0.5028 | 0.2395 | 0.4973 | 0.2426 | 0.6253 | 0.4646 | 0.6085 | 0.7719 | -0.0728 | -0.0108 | 0.2691 | 0.108 | 0 | 0 | 0.2654 | 0.4929 | 0.0025 | 0.0851 | 0.0177 | 0.0324 | 0.5676 | 0.7587 | 6.7623 | 0.7498 | 0.8779 | 8.7567 | 0.6743 | 0.5871 | 0.4037 | 8.8424 | 1.354 | 0.0136 | 7.5443 | LlamaForCausalLM | bfloat16 | apache-2.0 | 7.009 | 76 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | cyberagent/calm2-7b-chat | 0.1367 | 0 | 0 | 0.0496 | 0.0416 | 0.1877 | 0.004 | 0.4887 | 0.0005 | 0.2584 | 0.4593 | 0.0136 | 0.4671 | 0.6331 | 2.2109 | 0.4951 | 0.8192 | 4.0251 | 0.5615 | 0 | 0.4679 | 0 | 0 | 0.0831 | 0.0573 | 0.0387 | 0 | 0.0025 | 0 | 0.4593 | 0.0689 | 0.0684 | 0.0121 | 0.004 | 0 | 0 | 0.0446 | 0.2508 | 0 | 0 | 0 | 0 | 0.2481 | 0.5869 | 1.81 | 0.4237 | 0.7916 | 2.6109 | 0.4745 | 0.5871 | 0.4037 | 8.8424 | 1.354 | 0.0136 | 7.5443 | LlamaForCausalLM | bfloat16 | apache-2.0 | 7.009 | 76 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | tokyotech-llm/Llama-3.1-Swallow-8B-v0.2 | 0.5414 | 0.012 | 0.4659 | 0.2799 | 0.5427 | 0.8066 | 0.732 | 0.8445 | 0.6617 | 0.6405 | 0.8886 | 0.0813 | 0.8015 | 0.8747 | 13.8553 | 0.9099 | 0.9546 | 16.517 | 0.883 | 0.4659 | 0.8555 | 0.5259 | 0.7125 | 0.9071 | 0.4925 | 0.5244 | 0.7773 | 0.7595 | 0.5334 | 0.8886 | 0.7765 | 0.7511 | 0.6572 | 0.732 | 0.012 | 0.0562 | 0.561 | 0.6276 | 0.007 | 0.3933 | 0.0531 | 0.0803 | 0.8658 | 0.8368 | 13.1609 | 0.8317 | 0.9093 | 12.1377 | 0.7534 | 0.6801 | 1.9566 | 20.1115 | 8.1373 | 0.0813 | 17.7927 | LlamaForCausalLM | bfloat16 | llama3.1 | 8.03 | 2 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | tokyotech-llm/Llama-3.1-Swallow-8B-v0.2 | 0.3813 | 0.012 | 0.024 | 0.1587 | 0.3409 | 0.623 | 0.382 | 0.8264 | 0.4968 | 0.4812 | 0.7682 | 0.0813 | 0.6752 | 0.8586 | 12.1659 | 0.8998 | 0.9498 | 14.0954 | 0.8754 | 0.024 | 0.8034 | 0.4684 | 0.5 | 0.6604 | 0.2578 | 0.2906 | 0.3878 | 0.7045 | 0.423 | 0.7682 | 0.0806 | 0.0855 | 0.4052 | 0.382 | 0.012 | 0.0562 | 0.3911 | 0.5105 | 0.002 | 0.0016 | 0 | 0 | 0.79 | 0.7912 | 8.2683 | 0.8 | 0.8949 | 9.4323 | 0.7302 | 0.6801 | 1.9566 | 20.1115 | 8.1373 | 0.0813 | 17.7927 | LlamaForCausalLM | bfloat16 | llama3.1 | 8.03 | 2 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | llm-jp/llm-jp-3-13b | 0.4861 | 0 | 0.4856 | 0.256 | 0.4043 | 0.6456 | 0.598 | 0.8373 | 0.5444 | 0.6721 | 0.8821 | 0.0218 | 0.8577 | 0.8682 | 13.5301 | 0.9064 | 0.95 | 16.6943 | 0.8747 | 0.4856 | 0.8507 | 0.3822 | 0.5028 | 0.6506 | 0.5161 | 0.3979 | 0.6011 | 0.7311 | 0.505 | 0.8821 | 0.2931 | 0.2998 | 0.4356 | 0.598 | 0 | 0.9578 | 0.4108 | 0.6424 | 0.0068 | 0.3231 | 0.0442 | 0.0366 | 0.8693 | 0.828 | 11.1755 | 0.8298 | 0.9036 | 11.0157 | 0.7381 | 0.6126 | 0.6253 | 13.6586 | 2.1977 | 0.0218 | 11.752 | LlamaForCausalLM | bfloat16 | apache-2.0 | 13.708 | 0 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | llm-jp/llm-jp-3-13b | 0.1505 | 0 | 0.0021 | 0.1179 | 0.0097 | 0.0003 | 0.006 | 0.6423 | 0.0166 | 0.308 | 0.5313 | 0.0218 | 0.4529 | 0.758 | 7.0049 | 0.7205 | 0.8881 | 10.7045 | 0.7285 | 0.0021 | 0.0008 | 0 | 0.0806 | 0 | 0.1155 | 0.0175 | 0.0008 | 0 | 0.0016 | 0.5313 | 0.2138 | 0.2025 | 0 | 0.006 | 0 | 0.9578 | 0.0019 | 0.3557 | 0 | 0 | 0 | 0 | 0.5897 | 0.6568 | 3.6264 | 0.5514 | 0.8298 | 5.779 | 0.5687 | 0.6126 | 0.6253 | 13.6586 | 2.1977 | 0.0218 | 11.752 | LlamaForCausalLM | bfloat16 | apache-2.0 | 13.708 | 0 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Llama-3.1-8B-Instruct | 0.5319 | 0.3213 | 0.4598 | 0.2316 | 0.5808 | 0.7657 | 0.736 | 0.7522 | 0.6298 | 0.4205 | 0.8881 | 0.0648 | 0.419 | 0.8496 | 11.4306 | 0.8909 | 0.9171 | 15.0559 | 0.7837 | 0.4598 | 0.8011 | 0.4971 | 0.6722 | 0.8838 | 0.4502 | 0.5171 | 0.7173 | 0.6553 | 0.6069 | 0.8881 | 0.7479 | 0.7498 | 0.6121 | 0.736 | 0.3213 | 0.6205 | 0.6446 | 0.3923 | 0.0187 | 0.3105 | 0.1239 | 0.0315 | 0.6735 | 0.7956 | 11.0723 | 0.7762 | 0.8332 | 9.7048 | 0.5581 | 0.663 | 3.1983 | 15.7749 | 6.481 | 0.0648 | 14.169 | LlamaForCausalLM | bfloat16 | llama3.1 | 8.03 | 3,059 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Llama-3.1-8B-Instruct | 0.2303 | 0.3213 | 0.0154 | 0.0483 | 0.0038 | 0.3839 | 0.076 | 0.4675 | 0.2672 | 0.2386 | 0.6467 | 0.0648 | 0.2288 | 0.6335 | 7.9862 | 0.5261 | 0.7807 | 12.5089 | 0.4486 | 0.0154 | 0.0008 | 0.4425 | 0.3583 | 0.7131 | 0.2612 | 0.0028 | 0.2219 | 0.0063 | 0.3071 | 0.6467 | 0.6254 | 0.5865 | 0.4377 | 0.076 | 0.3213 | 0.6205 | 0.0047 | 0.2259 | 0 | 0.008 | 0 | 0 | 0.2334 | 0.6074 | 7.3187 | 0.4883 | 0.7688 | 7.4026 | 0.4071 | 0.663 | 3.1983 | 15.7749 | 6.481 | 0.0648 | 14.169 | LlamaForCausalLM | bfloat16 | llama3.1 | 8.03 | 3,059 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | CohereForAI/aya-23-8B | 0.484 | 0 | 0.3829 | 0.1502 | 0.4958 | 0.7439 | 0.578 | 0.8245 | 0.6414 | 0.4883 | 0.8982 | 0.1213 | 0.5051 | 0.8583 | 12.3567 | 0.9043 | 0.9507 | 16.5267 | 0.8763 | 0.3829 | 0.748 | 0.5345 | 0.6611 | 0.8731 | 0.4958 | 0.4671 | 0.6187 | 0.7153 | 0.6773 | 0.8982 | 0.7539 | 0.7297 | 0.6105 | 0.578 | 0 | 0 | 0.5244 | 0.4639 | 0.0198 | 0.1152 | 0.0354 | 0.0154 | 0.5652 | 0.7963 | 8.9616 | 0.79 | 0.8934 | 9.7535 | 0.7274 | 0.7072 | 2.6432 | 30.8244 | 12.1131 | 0.1213 | 26.0328 | CohereForCausalLM | float16 | cc-by-nc-4.0 | 8.028 | 392 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | CohereForAI/aya-23-8B | 0.31 | 0 | 0.0009 | 0.0738 | 0.0287 | 0.5859 | 0.176 | 0.8075 | 0.4615 | 0.3263 | 0.8285 | 0.1213 | 0.4499 | 0.8423 | 10.334 | 0.893 | 0.9461 | 14.353 | 0.8684 | 0.0009 | 0.8041 | 0.3534 | 0.5222 | 0.5371 | 0.1938 | 0.05 | 0.2091 | 0.6742 | 0.5486 | 0.8285 | 0.504 | 0.4839 | 0.4165 | 0.176 | 0 | 0 | 0.0075 | 0.3353 | 0.0054 | 0 | 0 | 0 | 0.3636 | 0.7675 | 7.5791 | 0.7666 | 0.8807 | 8.2645 | 0.7021 | 0.7072 | 2.6432 | 30.8244 | 12.1131 | 0.1213 | 26.0328 | CohereForCausalLM | float16 | cc-by-nc-4.0 | 8.028 | 392 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | CohereForAI/aya-23-35B | 0.5742 | 0.1888 | 0.5045 | 0.2595 | 0.6038 | 0.8407 | 0.692 | 0.8443 | 0.7218 | 0.5991 | 0.9175 | 0.1443 | 0.7088 | 0.8668 | 13.519 | 0.9097 | 0.9564 | 17.9446 | 0.8862 | 0.5045 | 0.8236 | 0.569 | 0.7833 | 0.9446 | 0.5383 | 0.5688 | 0.7091 | 0.7538 | 0.7936 | 0.9175 | 0.8689 | 0.8381 | 0.754 | 0.692 | 0.1888 | 0.3936 | 0.6389 | 0.5501 | 0.022 | 0.3005 | 0.115 | 0.043 | 0.8169 | 0.8337 | 11.9736 | 0.8333 | 0.9054 | 11.0157 | 0.7482 | 0.7212 | 4.1534 | 34.0784 | 14.4071 | 0.1443 | 29.011 | CohereForCausalLM | float16 | cc-by-nc-4.0 | 34.981 | 264 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | CohereForAI/aya-23-35B | 0.4229 | 0.1888 | 0.0323 | 0.1458 | 0.3212 | 0.7576 | 0.346 | 0.8366 | 0.6044 | 0.5036 | 0.7708 | 0.1443 | 0.6728 | 0.8567 | 12.0892 | 0.9046 | 0.9545 | 16.4732 | 0.8841 | 0.0323 | 0.8582 | 0.4971 | 0.5847 | 0.8418 | 0.3595 | 0.1573 | 0.4626 | 0.7128 | 0.7646 | 0.7708 | 0.7269 | 0.7625 | 0.5728 | 0.346 | 0.1888 | 0.3936 | 0.4852 | 0.4784 | 0 | 0.0017 | 0 | 0 | 0.7275 | 0.8127 | 9.6579 | 0.8224 | 0.8987 | 10.0728 | 0.7353 | 0.7212 | 4.1534 | 34.0784 | 14.4071 | 0.1443 | 29.011 | CohereForCausalLM | float16 | cc-by-nc-4.0 | 34.981 | 264 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | CohereForAI/aya-expanse-8b | 0.5163 | 0 | 0.5127 | 0.2133 | 0.5521 | 0.8281 | 0.64 | 0.8427 | 0.5808 | 0.5008 | 0.9005 | 0.1088 | 0.5245 | 0.8658 | 13.3065 | 0.9096 | 0.9526 | 16.1479 | 0.8817 | 0.5127 | 0.864 | 0.5086 | 0.7 | 0.9223 | 0.4943 | 0.5137 | 0.4906 | 0.7393 | 0.4658 | 0.9005 | 0.8382 | 0.8038 | 0.6981 | 0.64 | 0 | 0 | 0.5905 | 0.4835 | 0.0103 | 0.2809 | 0.0708 | 0.032 | 0.6724 | 0.8152 | 9.3487 | 0.8304 | 0.8988 | 9.9183 | 0.7488 | 0.6887 | 2.3612 | 30.7157 | 10.8872 | 0.1088 | 24.9318 | CohereForCausalLM | float16 | cc-by-nc-4.0 | 8.028 | 276 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | CohereForAI/aya-expanse-8b | 0.371 | 0 | 0.0296 | 0.133 | 0.227 | 0.6603 | 0.398 | 0.8352 | 0.6017 | 0.3012 | 0.7864 | 0.1088 | 0.331 | 0.8544 | 12.701 | 0.9078 | 0.953 | 16.887 | 0.8831 | 0.0296 | 0.8464 | 0.4626 | 0.6333 | 0.6962 | 0.3262 | 0.0387 | 0.5066 | 0.7513 | 0.6548 | 0.7864 | 0.8373 | 0.8031 | 0.4384 | 0.398 | 0 | 0 | 0.4153 | 0.2465 | 0.0103 | 0 | 0 | 0.0017 | 0.6531 | 0.7814 | 7.1072 | 0.8112 | 0.8903 | 9.1866 | 0.7388 | 0.6887 | 2.3612 | 30.7157 | 10.8872 | 0.1088 | 24.9318 | CohereForCausalLM | float16 | cc-by-nc-4.0 | 8.028 | 276 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | CohereForAI/aya-expanse-32b | 0.5082 | 0.1426 | 0.535 | 0.2935 | 0.4225 | 0.8468 | 0.206 | 0.8553 | 0.5893 | 0.6495 | 0.9112 | 0.1388 | 0.7489 | 0.8755 | 14.7691 | 0.9167 | 0.9593 | 19.4268 | 0.8912 | 0.535 | 0.774 | 0.5891 | 0.1556 | 0.9625 | 0.5815 | 0.1367 | 0.8311 | 0.7822 | 0.5884 | 0.9112 | 0.8863 | 0.8648 | 0.8038 | 0.206 | 0.1426 | 0.255 | 0.7082 | 0.6179 | 0.0228 | 0.4033 | 0.0855 | 0.0853 | 0.8705 | 0.8462 | 13.016 | 0.8511 | 0.9099 | 11.557 | 0.7621 | 0.7167 | 3.6554 | 35.73 | 13.8915 | 0.1388 | 30.2516 | CohereForCausalLM | float16 | cc-by-nc-4.0 | 32.296 | 168 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | CohereForAI/aya-expanse-32b | 0.4795 | 0.1426 | 0.2165 | 0.1746 | 0.5658 | 0.84 | 0.514 | 0.8516 | 0.3936 | 0.5438 | 0.893 | 0.1388 | 0.6921 | 0.8627 | 13.3137 | 0.9142 | 0.9581 | 17.7546 | 0.8913 | 0.2165 | 0.883 | 0.5747 | 0.0306 | 0.9205 | 0.4042 | 0.5916 | 0.5645 | 0.3371 | 0.4613 | 0.893 | 0.8629 | 0.8428 | 0.7166 | 0.514 | 0.1426 | 0.255 | 0.5399 | 0.535 | 0.0115 | 0 | 0.0177 | 0.0043 | 0.8395 | 0.8184 | 9.7295 | 0.8422 | 0.9034 | 10.1636 | 0.7585 | 0.7167 | 3.6554 | 35.73 | 13.8915 | 0.1388 | 30.2516 | CohereForCausalLM | float16 | cc-by-nc-4.0 | 32.296 | 168 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-2-2b-it | 0.4051 | 0 | 0.3826 | 0.094 | 0.4379 | 0.5978 | 0.492 | 0.7515 | 0.5586 | 0.2522 | 0.8379 | 0.0513 | 0.2125 | 0.8197 | 8.5773 | 0.8372 | 0.9333 | 13.6777 | 0.8375 | 0.3826 | 0.5754 | 0.4253 | 0.5208 | 0.7819 | 0.3277 | 0.3838 | 0.5711 | 0.7014 | 0.5742 | 0.8379 | 0.5681 | 0.5442 | 0.4359 | 0.492 | 0 | 0 | 0.4921 | 0.2165 | 0.0123 | 0.1042 | 0.0973 | 0.0127 | 0.2434 | 0.7388 | 6.3148 | 0.6881 | 0.8677 | 7.5156 | 0.6434 | 0.6573 | 1.0802 | 16.8494 | 5.1204 | 0.0513 | 13.5661 | Gemma2ForCausalLM | bfloat16 | gemma | 2.614 | 678 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-2-2b-it | 0.1798 | 0 | 0 | 0.0487 | 0.0075 | 0.2901 | 0.044 | 0.576 | 0.2194 | 0.1283 | 0.6129 | 0.0513 | 0.0651 | 0.7304 | 5.5835 | 0.6514 | 0.792 | 3.1738 | 0.6637 | 0 | 0.4123 | 0.3276 | 0.0056 | 0.2055 | 0.2203 | 0.0031 | 0.145 | 0.3851 | 0.2336 | 0.6129 | 0.25 | 0.2473 | 0.2524 | 0.044 | 0 | 0 | 0.0118 | 0.0994 | 0 | 0 | 0 | 0 | 0.2434 | 0.6395 | 3.479 | 0.5003 | 0.7665 | 3.0008 | 0.4887 | 0.6573 | 1.0802 | 16.8494 | 5.1204 | 0.0513 | 13.5661 | Gemma2ForCausalLM | bfloat16 | gemma | 2.614 | 678 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-2-9b-it | 0.5206 | 0 | 0.45 | 0.2212 | 0.6149 | 0.801 | 0.75 | 0.8332 | 0.6079 | 0.4357 | 0.8883 | 0.1244 | 0.4964 | 0.8542 | 11.4808 | 0.8905 | 0.9446 | 15.7498 | 0.8611 | 0.45 | 0.8249 | 0.5 | 0.6819 | 0.8695 | 0.4161 | 0.5586 | 0.5592 | 0.6755 | 0.6231 | 0.8883 | 0.8474 | 0.815 | 0.7085 | 0.75 | 0 | 0 | 0.6713 | 0.3946 | 0.0114 | 0.294 | 0.0973 | 0.0792 | 0.624 | 0.8201 | 9.9977 | 0.8297 | 0.9044 | 10.9491 | 0.7513 | 0.7114 | 2.5273 | 36.1583 | 12.4421 | 0.1244 | 29.7442 | Gemma2ForCausalLM | bfloat16 | gemma | 9.242 | 551 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-2-9b-it | 0.3631 | 0 | 0 | 0.053 | 0.3421 | 0.6453 | 0.512 | 0.7368 | 0.6036 | 0.2294 | 0.7473 | 0.1244 | 0.196 | 0.8117 | 9.1697 | 0.8305 | 0.929 | 12.8368 | 0.8198 | 0 | 0.7219 | 0.5115 | 0.5764 | 0.7605 | 0.2548 | 0.4177 | 0.493 | 0.7197 | 0.7175 | 0.7473 | 0.7495 | 0.7103 | 0.4536 | 0.512 | 0 | 0 | 0.2665 | 0.2373 | 0.005 | 0.0021 | 0 | 0 | 0.2578 | 0.7155 | 6.0728 | 0.6682 | 0.8632 | 8.6791 | 0.6287 | 0.7114 | 2.5273 | 36.1583 | 12.4421 | 0.1244 | 29.7442 | Gemma2ForCausalLM | bfloat16 | gemma | 9.242 | 551 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-2-27b-it | 0.5809 | 0 | 0.4993 | 0.2738 | 0.676 | 0.8434 | 0.9 | 0.8539 | 0.7242 | 0.5932 | 0.8934 | 0.1322 | 0.6927 | 0.8709 | 13.2355 | 0.9126 | 0.9551 | 18.5224 | 0.8844 | 0.4993 | 0.8707 | 0.6552 | 0.8014 | 0.9267 | 0.5262 | 0.6278 | 0.6935 | 0.7727 | 0.6982 | 0.8934 | 0.9006 | 0.8776 | 0.7329 | 0.9 | 0 | 0 | 0.7243 | 0.5607 | 0.0127 | 0.3508 | 0.1062 | 0.0692 | 0.8303 | 0.8457 | 12.4513 | 0.8504 | 0.9127 | 12.1106 | 0.7681 | 0.7162 | 3.0568 | 35.8752 | 13.2254 | 0.1322 | 29.9498 | Gemma2ForCausalLM | bfloat16 | gemma | 27.227 | 443 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | google/gemma-2-27b-it | 0.4948 | 0 | 0.4455 | 0.1595 | 0.6023 | 0.721 | 0.798 | 0.8253 | 0.6287 | 0.449 | 0.6812 | 0.1322 | 0.5564 | 0.8321 | 11.1934 | 0.8824 | 0.9501 | 15.8838 | 0.8764 | 0.4455 | 0.7823 | 0.5259 | 0.6722 | 0.8365 | 0.3194 | 0.5479 | 0.521 | 0.7519 | 0.6726 | 0.6812 | 0.8502 | 0.8294 | 0.5442 | 0.798 | 0 | 0 | 0.6567 | 0.4713 | 0.0075 | 0.0012 | 0.0177 | 0 | 0.7712 | 0.7655 | 8.7339 | 0.7965 | 0.899 | 10.1975 | 0.7459 | 0.7162 | 3.0568 | 35.8752 | 13.2254 | 0.1322 | 29.9498 | Gemma2ForCausalLM | bfloat16 | gemma | 27.227 | 443 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Llama-3.1-70B-Instruct | 0.6638 | 0.6004 | 0.5611 | 0.2644 | 0.7678 | 0.8854 | 0.93 | 0.8467 | 0.7821 | 0.6456 | 0.9215 | 0.097 | 0.7264 | 0.8675 | 13.7097 | 0.9095 | 0.9576 | 18.4691 | 0.8881 | 0.5611 | 0.9031 | 0.658 | 0.8556 | 0.95 | 0.6331 | 0.7258 | 0.7823 | 0.7992 | 0.8155 | 0.9215 | 0.8729 | 0.8407 | 0.8032 | 0.93 | 0.6004 | 0.9719 | 0.8099 | 0.5773 | 0.0808 | 0.2779 | 0.0442 | 0.0576 | 0.8616 | 0.8444 | 16.6273 | 0.8355 | 0.9073 | 12.3916 | 0.7537 | 0.6883 | 3.4286 | 23.6076 | 9.7127 | 0.097 | 20.9281 | LlamaForCausalLM | bfloat16 | llama3.1 | 70.554 | 685 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Llama-3.1-70B-Instruct | 0.4997 | 0.6004 | 0.2763 | 0.153 | 0.6753 | 0.8254 | 0.09 | 0.8243 | 0.7071 | 0.3729 | 0.8744 | 0.097 | 0.4822 | 0.8627 | 12.6354 | 0.9047 | 0.8888 | 11.0722 | 0.8239 | 0.2763 | 0.893 | 0.6063 | 0.7278 | 0.8579 | 0.2659 | 0.6543 | 0.5793 | 0.7841 | 0.8382 | 0.8744 | 0.8862 | 0.8528 | 0.7254 | 0.09 | 0.6004 | 0.9719 | 0.6963 | 0.3706 | 0.002 | 0.0257 | 0.0265 | 0.0008 | 0.7099 | 0.8326 | 12.7335 | 0.8355 | 0.8921 | 10.8621 | 0.7332 | 0.6883 | 3.4286 | 23.6076 | 9.7127 | 0.097 | 20.9281 | LlamaForCausalLM | bfloat16 | llama3.1 | 70.554 | 685 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | nvidia/Llama-3.1-Nemotron-70B-Instruct-HF | 0.6624 | 0.5582 | 0.5734 | 0.2715 | 0.7673 | 0.8851 | 0.928 | 0.8495 | 0.7889 | 0.6453 | 0.9202 | 0.0984 | 0.727 | 0.8668 | 13.9208 | 0.9099 | 0.9577 | 18.3148 | 0.8882 | 0.5734 | 0.9051 | 0.6609 | 0.8694 | 0.9464 | 0.6205 | 0.7224 | 0.8028 | 0.7891 | 0.8224 | 0.9202 | 0.881 | 0.8533 | 0.8037 | 0.928 | 0.5582 | 0.8896 | 0.8123 | 0.5883 | 0.0984 | 0.3077 | 0.0265 | 0.0668 | 0.8582 | 0.8485 | 16.7533 | 0.8436 | 0.9059 | 12.0753 | 0.7565 | 0.6918 | 3.4088 | 23.8265 | 9.8542 | 0.0984 | 21.032 | LlamaForCausalLM | bfloat16 | llama3.1 | 70.554 | 1,680 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | nvidia/Llama-3.1-Nemotron-70B-Instruct-HF | 0.5134 | 0.5582 | 0.2532 | 0.1623 | 0.6794 | 0.8242 | 0.232 | 0.8424 | 0.7365 | 0.3953 | 0.8654 | 0.0984 | 0.4611 | 0.8644 | 12.7358 | 0.9064 | 0.9481 | 16.8129 | 0.8795 | 0.2532 | 0.893 | 0.6408 | 0.7583 | 0.8642 | 0.3665 | 0.6614 | 0.6537 | 0.7633 | 0.8662 | 0.8654 | 0.8959 | 0.8638 | 0.7154 | 0.232 | 0.5582 | 0.8896 | 0.6975 | 0.3583 | 0.0024 | 0.0281 | 0.0177 | 0 | 0.7631 | 0.8328 | 12.6953 | 0.8383 | 0.9004 | 11.1724 | 0.7455 | 0.6918 | 3.4088 | 23.8265 | 9.8542 | 0.0984 | 21.032 | LlamaForCausalLM | bfloat16 | llama3.1 | 70.554 | 1,680 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | cyberagent/Llama-3.1-70B-Japanese-Instruct-2407 | 0.6705 | 0.5823 | 0.5464 | 0.2862 | 0.7522 | 0.9108 | 0.924 | 0.8518 | 0.7869 | 0.7169 | 0.9188 | 0.0989 | 0.8472 | 0.8747 | 15.2977 | 0.9123 | 0.9586 | 18.0168 | 0.8888 | 0.5464 | 0.9311 | 0.658 | 0.85 | 0.9589 | 0.6432 | 0.7159 | 0.7634 | 0.8106 | 0.8526 | 0.9188 | 0.878 | 0.8431 | 0.8424 | 0.924 | 0.5823 | 0.9699 | 0.7885 | 0.6602 | 0.0513 | 0.416 | 0.0177 | 0.0609 | 0.8852 | 0.8463 | 14.9733 | 0.8474 | 0.9099 | 12.2264 | 0.7588 | 0.6921 | 3.1649 | 23.7422 | 9.8925 | 0.0989 | 20.9876 | LlamaForCausalLM | bfloat16 | llama3.1 | 70.554 | 64 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | cyberagent/Llama-3.1-70B-Japanese-Instruct-2407 | 0.4359 | 0.5823 | 0.1812 | 0.1673 | 0.1655 | 0.8665 | 0.246 | 0.8368 | 0.752 | 0.1899 | 0.7083 | 0.0989 | 0.2384 | 0.8693 | 14.3716 | 0.907 | 0.9579 | 18.4078 | 0.888 | 0.1812 | 0.9181 | 0.6839 | 0.7514 | 0.9106 | 0.196 | 0.0085 | 0.6878 | 0.7879 | 0.8492 | 0.7083 | 0.8787 | 0.862 | 0.7708 | 0.246 | 0.5823 | 0.9699 | 0.3225 | 0.1352 | 0.0024 | 0.0102 | 0 | 0.0009 | 0.823 | 0.7949 | 10.767 | 0.8057 | 0.9029 | 11.2189 | 0.7465 | 0.6921 | 3.1649 | 23.7422 | 9.8925 | 0.0989 | 20.9876 | LlamaForCausalLM | bfloat16 | llama3.1 | 70.554 | 64 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | SakanaAI/EvoLLM-JP-A-v1-7B | 0.3075 | 0.0783 | 0.0602 | 0.092 | 0.1096 | 0.4092 | 0.18 | 0.7626 | 0.5566 | 0.2941 | 0.7324 | 0.1075 | 0.2852 | 0.7996 | 7.8617 | 0.8364 | 0.9257 | 11.172 | 0.8301 | 0.0602 | 0.5496 | 0.4741 | 0.3583 | 0.3655 | 0.3017 | 0.1214 | 0.5329 | 0.6301 | 0.7875 | 0.7324 | 0.814 | 0.7803 | 0.3126 | 0.18 | 0.0783 | 0.2851 | 0.0977 | 0.2955 | 0.0064 | 0 | 0 | 0 | 0.4537 | 0.7262 | 5.4783 | 0.7133 | 0.8689 | 7.0128 | 0.6707 | 0.7015 | 2.2417 | 31.8526 | 10.7599 | 0.1075 | 26.5604 | MistralForCausalLM | bfloat16 | apache-2.0 | 7.242 | 11 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | SakanaAI/EvoLLM-JP-A-v1-7B | 0.4981 | 0.0783 | 0.4291 | 0.2025 | 0.4799 | 0.7665 | 0.662 | 0.8102 | 0.6447 | 0.411 | 0.8874 | 0.1075 | 0.4051 | 0.8354 | 9.5638 | 0.8798 | 0.9423 | 14.1638 | 0.8646 | 0.4291 | 0.8469 | 0.5718 | 0.6458 | 0.8686 | 0.4383 | 0.4262 | 0.6828 | 0.6806 | 0.6424 | 0.8874 | 0.8239 | 0.7917 | 0.5839 | 0.662 | 0.0783 | 0.2851 | 0.5337 | 0.3898 | 0.0079 | 0.3115 | 0.0354 | 0.0561 | 0.6015 | 0.7786 | 8.2927 | 0.7788 | 0.88 | 8.3322 | 0.7178 | 0.7015 | 2.2417 | 31.8526 | 10.7599 | 0.1075 | 26.5604 | MistralForCausalLM | bfloat16 | apache-2.0 | 7.242 | 11 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Llama-3.2-3B-Instruct | 0.2042 | 0.0201 | 0 | 0.0345 | 0.1693 | 0.1582 | 0.008 | 0.6778 | 0.3061 | 0.1221 | 0.6936 | 0.0562 | 0.0882 | 0.794 | 7.0688 | 0.8148 | 0.906 | 9.086 | 0.7803 | 0 | 0.01 | 0.3851 | 0.5 | 0.2118 | 0.1375 | 0.0508 | 0.2342 | 0.1307 | 0.2807 | 0.6936 | 0.0555 | 0.0543 | 0.2526 | 0.008 | 0.0201 | 0.0843 | 0.2879 | 0.1406 | 0 | 0.0009 | 0 | 0 | 0.1716 | 0.6874 | 3.9743 | 0.6419 | 0.8229 | 5.2307 | 0.4743 | 0.6484 | 1.9187 | 16.1297 | 5.6101 | 0.0562 | 14.0826 | LlamaForCausalLM | bfloat16 | llama3.2 | 3.213 | 638 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Llama-3.2-3B-Instruct | 0.4111 | 0.0201 | 0.403 | 0.1037 | 0.4761 | 0.5994 | 0.61 | 0.7681 | 0.3565 | 0.2786 | 0.8506 | 0.0562 | 0.2529 | 0.8285 | 9.1433 | 0.86 | 0.9366 | 13.7136 | 0.8465 | 0.403 | 0.6528 | 0.3477 | 0.4958 | 0.7551 | 0.3175 | 0.3973 | 0.3414 | 0.4059 | 0.1916 | 0.8506 | 0.1652 | 0.1046 | 0.3903 | 0.61 | 0.0201 | 0.0843 | 0.5548 | 0.2655 | 0.0153 | 0.1642 | 0.0265 | 0.022 | 0.2904 | 0.7271 | 7.1521 | 0.6776 | 0.8772 | 8.3252 | 0.6881 | 0.6484 | 1.9187 | 16.1297 | 5.6101 | 0.0562 | 14.0826 | LlamaForCausalLM | bfloat16 | llama3.2 | 3.213 | 638 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | rinna/llama-3-youko-8b-instruct | 0.5444 | 0.2811 | 0.4998 | 0.2651 | 0.4879 | 0.8302 | 0.526 | 0.8517 | 0.6812 | 0.5522 | 0.9004 | 0.113 | 0.6523 | 0.8605 | 11.8261 | 0.9056 | 0.9515 | 15.745 | 0.8797 | 0.4998 | 0.876 | 0.5661 | 0.6806 | 0.9169 | 0.485 | 0.4383 | 0.8069 | 0.7285 | 0.6237 | 0.9004 | 0.8224 | 0.7804 | 0.6976 | 0.526 | 0.2811 | 0.739 | 0.5375 | 0.5192 | 0.034 | 0.3175 | 0.0619 | 0.0749 | 0.8374 | 0.847 | 13.2105 | 0.8486 | 0.9147 | 12.3221 | 0.773 | 0.7051 | 3.0857 | 28.1083 | 11.3046 | 0.113 | 24.4835 | LlamaForCausalLM | bfloat16 | llama3 | 8.03 | 11 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | rinna/llama-3-youko-8b-instruct | 0.3585 | 0.2811 | 0.1149 | 0.1395 | 0.0168 | 0.6881 | 0.006 | 0.8366 | 0.5024 | 0.3991 | 0.8462 | 0.113 | 0.5068 | 0.8499 | 10.1041 | 0.8963 | 0.9469 | 13.8825 | 0.8722 | 0.1149 | 0.7733 | 0.4626 | 0.5861 | 0.7936 | 0.2486 | 0 | 0.4022 | 0.6824 | 0.3785 | 0.8462 | 0.4037 | 0.4642 | 0.4975 | 0.006 | 0.2811 | 0.739 | 0.0337 | 0.442 | 0 | 0.0008 | 0 | 0.0059 | 0.6909 | 0.8191 | 9.923 | 0.8214 | 0.9051 | 10.3137 | 0.7563 | 0.7051 | 3.0857 | 28.1083 | 11.3046 | 0.113 | 24.4835 | LlamaForCausalLM | bfloat16 | llama3 | 8.03 | 11 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | rinna/llama-3-youko-70b-instruct | 0.5358 | 0.4498 | 0.3448 | 0.1644 | 0.446 | 0.7631 | 0.808 | 0.8454 | 0.636 | 0.4595 | 0.8891 | 0.0871 | 0.4512 | 0.8558 | 12.2981 | 0.904 | 0.9512 | 17.0644 | 0.8793 | 0.3448 | 0.7911 | 0.5086 | 0.675 | 0.8445 | 0.4467 | 0.4581 | 0.4573 | 0.7677 | 0.7713 | 0.8891 | 0.86 | 0.8183 | 0.6538 | 0.808 | 0.4498 | 0.7028 | 0.434 | 0.4806 | 0 | 0.0224 | 0.0177 | 0.005 | 0.777 | 0.8197 | 10.0643 | 0.8368 | 0.9068 | 10.9401 | 0.7617 | 0.671 | 2.8969 | 21.0754 | 8.7164 | 0.0871 | 18.5691 | LlamaForCausalLM | bfloat16 | llama3 | 70.554 | 0 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | rinna/llama-3-youko-70b-instruct | 0.6505 | 0.4498 | 0.5876 | 0.2869 | 0.7458 | 0.8863 | 0.934 | 0.8575 | 0.7479 | 0.6521 | 0.9203 | 0.0871 | 0.7374 | 0.8677 | 13.7925 | 0.9099 | 0.9584 | 18.5638 | 0.8905 | 0.5876 | 0.9018 | 0.6724 | 0.7764 | 0.9517 | 0.6348 | 0.7151 | 0.7609 | 0.7942 | 0.7355 | 0.9203 | 0.8946 | 0.8666 | 0.8054 | 0.934 | 0.4498 | 0.7028 | 0.7765 | 0.584 | 0.0542 | 0.3116 | 0.1416 | 0.0677 | 0.8596 | 0.8471 | 13.5325 | 0.8505 | 0.9163 | 12.5548 | 0.7793 | 0.671 | 2.8969 | 21.0754 | 8.7164 | 0.0871 | 18.5691 | LlamaForCausalLM | bfloat16 | llama3 | 70.554 | 0 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🤝 : base merges and moerges | ryota39/Tora-12B | 0.5036 | 0 | 0.5251 | 0.1908 | 0.4971 | 0.7748 | 0.684 | 0.8132 | 0.6189 | 0.4522 | 0.8868 | 0.0972 | 0.4565 | 0.8568 | 11.6997 | 0.9014 | 0.9445 | 14.3901 | 0.8681 | 0.5251 | 0.8434 | 0.5259 | 0.7097 | 0.8794 | 0.4327 | 0.4685 | 0.5362 | 0.75 | 0.5728 | 0.8868 | 0.8288 | 0.8246 | 0.6016 | 0.684 | 0 | 0 | 0.5256 | 0.4674 | 0.0163 | 0.2399 | 0.0133 | 0.0305 | 0.6541 | 0.7675 | 6.7435 | 0.7813 | 0.8824 | 8.4109 | 0.702 | 0.6902 | 2.606 | 27.0004 | 9.7306 | 0.0972 | 22.9946 | MistralForCausalLM | bfloat16 | apache-2.0 | 12.248 | 1 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🤝 : base merges and moerges | ryota39/Tora-12B | 0.3728 | 0 | 0.0171 | 0.1148 | 0.2861 | 0.6261 | 0.56 | 0.798 | 0.5746 | 0.204 | 0.8228 | 0.0972 | 0.1097 | 0.8398 | 10.3864 | 0.8934 | 0.9431 | 13.1423 | 0.8644 | 0.0171 | 0.763 | 0.5201 | 0.7056 | 0.7417 | 0.2828 | 0.2824 | 0.4552 | 0.7544 | 0.4378 | 0.8228 | 0.8389 | 0.8398 | 0.3734 | 0.56 | 0 | 0 | 0.2898 | 0.2197 | 0.0025 | 0.0008 | 0.023 | 0 | 0.5474 | 0.734 | 5.0103 | 0.7377 | 0.8804 | 7.659 | 0.6964 | 0.6902 | 2.606 | 27.0004 | 9.7306 | 0.0972 | 22.9946 | MistralForCausalLM | bfloat16 | apache-2.0 | 12.248 | 1 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | Qwen/Qwen2.5-Coder-32B-Instruct | 0.6196 | 0.6847 | 0.5416 | 0.2769 | 0.6754 | 0.8225 | 0.896 | 0.8349 | 0.7292 | 0.4215 | 0.858 | 0.0744 | 0.3762 | 0.8533 | 11.6603 | 0.8952 | 0.9501 | 15.695 | 0.8762 | 0.5416 | 0.8437 | 0.6523 | 0.8708 | 0.908 | 0.5112 | 0.6315 | 0.7502 | 0.6818 | 0.6911 | 0.858 | 0.8716 | 0.845 | 0.716 | 0.896 | 0.6847 | 0.755 | 0.7193 | 0.3772 | 0.0529 | 0.3638 | 0.1239 | 0.0749 | 0.769 | 0.8104 | 9.7551 | 0.8238 | 0.8991 | 10.3091 | 0.7445 | 0.6725 | 1.8374 | 21.7121 | 7.4326 | 0.0744 | 18.911 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 958 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | Qwen/Qwen2.5-Coder-32B-Instruct | 0.4649 | 0.6847 | 0.2414 | 0.1542 | 0.5666 | 0.7859 | 0.332 | 0.8067 | 0.4242 | 0.2567 | 0.7872 | 0.0744 | 0.2702 | 0.825 | 8.635 | 0.8742 | 0.9437 | 13.3534 | 0.8633 | 0.2414 | 0.8236 | 0.4598 | 0.7194 | 0.8758 | 0.2579 | 0.5055 | 0.3977 | 0.1957 | 0.3485 | 0.7872 | 0.8391 | 0.8255 | 0.6582 | 0.332 | 0.6847 | 0.755 | 0.6278 | 0.2419 | 0.0231 | 0.0118 | 0.0708 | 0.0048 | 0.6604 | 0.7637 | 7.3008 | 0.7796 | 0.8868 | 8.9065 | 0.7098 | 0.6725 | 1.8374 | 21.7121 | 7.4326 | 0.0744 | 18.911 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 958 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | AXCXEPT/EZO-Common-9B-gemma-2-it | 0.3855 | 0.006 | 0.0125 | 0.0537 | 0.3964 | 0.6565 | 0.56 | 0.781 | 0.5944 | 0.284 | 0.7735 | 0.1228 | 0.2447 | 0.8243 | 9.2281 | 0.8642 | 0.9336 | 12.3934 | 0.8402 | 0.0125 | 0.7725 | 0.5259 | 0.5625 | 0.7498 | 0.313 | 0.4253 | 0.479 | 0.714 | 0.6905 | 0.7735 | 0.7556 | 0.7149 | 0.4472 | 0.56 | 0.006 | 0.0141 | 0.3675 | 0.2943 | 0.0064 | 0.0003 | 0 | 0 | 0.2616 | 0.7432 | 6.4405 | 0.7328 | 0.8779 | 8.6208 | 0.6867 | 0.714 | 2.8145 | 35.9536 | 12.2713 | 0.1228 | 29.3246 | Gemma2ForCausalLM | bfloat16 | gemma | 9.242 | 31 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | AXCXEPT/EZO-Common-9B-gemma-2-it | 0.5275 | 0.006 | 0.4616 | 0.2047 | 0.609 | 0.7961 | 0.764 | 0.8381 | 0.6519 | 0.4585 | 0.89 | 0.1228 | 0.4946 | 0.86 | 11.9103 | 0.8978 | 0.9452 | 16.075 | 0.8646 | 0.4616 | 0.8304 | 0.4943 | 0.6778 | 0.8633 | 0.4892 | 0.5498 | 0.7005 | 0.6711 | 0.7161 | 0.89 | 0.8535 | 0.8079 | 0.6945 | 0.764 | 0.006 | 0.0141 | 0.6682 | 0.3916 | 0.0131 | 0.2754 | 0.0796 | 0.0708 | 0.5846 | 0.8216 | 9.7229 | 0.8338 | 0.904 | 10.8852 | 0.7563 | 0.714 | 2.8145 | 35.9536 | 12.2713 | 0.1228 | 29.3246 | Gemma2ForCausalLM | bfloat16 | gemma | 9.242 | 31 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | AXCXEPT/EZO-Humanities-9B-gemma-2-it | 0.5294 | 0.0301 | 0.4571 | 0.1934 | 0.6078 | 0.8 | 0.758 | 0.8364 | 0.6586 | 0.4751 | 0.8835 | 0.1236 | 0.5006 | 0.8569 | 11.5508 | 0.8935 | 0.9446 | 15.2615 | 0.8634 | 0.4571 | 0.8447 | 0.5029 | 0.7222 | 0.8624 | 0.4724 | 0.5484 | 0.6927 | 0.6597 | 0.7154 | 0.8835 | 0.8472 | 0.8046 | 0.6929 | 0.758 | 0.0301 | 0.0562 | 0.6671 | 0.4522 | 0.017 | 0.2816 | 0.0354 | 0.0762 | 0.5568 | 0.8215 | 9.5432 | 0.8344 | 0.9036 | 10.7545 | 0.7545 | 0.7143 | 2.6119 | 36.4581 | 12.3595 | 0.1236 | 29.6341 | Gemma2ForCausalLM | bfloat16 | gemma | 9.242 | 21 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | AXCXEPT/EZO-Humanities-9B-gemma-2-it | 0.3874 | 0.0301 | 0.0083 | 0.0518 | 0.4085 | 0.6664 | 0.572 | 0.7624 | 0.6096 | 0.2605 | 0.7687 | 0.1236 | 0.2273 | 0.8157 | 9.1465 | 0.8474 | 0.9309 | 12.2476 | 0.8339 | 0.0083 | 0.7861 | 0.5259 | 0.5611 | 0.7516 | 0.2606 | 0.4211 | 0.537 | 0.7172 | 0.7069 | 0.7687 | 0.7466 | 0.7017 | 0.4616 | 0.572 | 0.0301 | 0.0562 | 0.396 | 0.2936 | 0.0014 | 0 | 0 | 0 | 0.2577 | 0.7292 | 6.0949 | 0.7 | 0.8718 | 8.3038 | 0.6684 | 0.7143 | 2.6119 | 36.4581 | 12.3595 | 0.1236 | 29.6341 | Gemma2ForCausalLM | bfloat16 | gemma | 9.242 | 21 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | AXCXEPT/EZO-Qwen2.5-32B-Instruct | 0.6217 | 0.2189 | 0.5779 | 0.2486 | 0.7633 | 0.8982 | 0.928 | 0.8501 | 0.8095 | 0.5476 | 0.9029 | 0.0942 | 0.5622 | 0.8649 | 13.1459 | 0.9085 | 0.9545 | 17.3022 | 0.8858 | 0.5779 | 0.9123 | 0.6925 | 0.8542 | 0.9625 | 0.5803 | 0.7343 | 0.8673 | 0.7784 | 0.8551 | 0.9029 | 0.9014 | 0.8823 | 0.8197 | 0.928 | 0.2189 | 0.2691 | 0.7923 | 0.5004 | 0.0025 | 0.3567 | 0 | 0.0806 | 0.8032 | 0.8248 | 10.4495 | 0.8444 | 0.9024 | 10.5557 | 0.7616 | 0.6932 | 2.5841 | 26.0188 | 9.4165 | 0.0942 | 22.2414 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 6 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | AXCXEPT/EZO-Qwen2.5-32B-Instruct | 0.5255 | 0.2189 | 0.164 | 0.1459 | 0.6883 | 0.8671 | 0.736 | 0.8388 | 0.7692 | 0.3722 | 0.886 | 0.0942 | 0.3589 | 0.8446 | 10.8444 | 0.9003 | 0.9508 | 15.2871 | 0.8811 | 0.164 | 0.8938 | 0.6839 | 0.7778 | 0.9357 | 0.3282 | 0.6408 | 0.8073 | 0.7614 | 0.8155 | 0.886 | 0.8954 | 0.8736 | 0.7718 | 0.736 | 0.2189 | 0.2691 | 0.7358 | 0.4294 | 0.0083 | 0.0044 | 0 | 0.005 | 0.7118 | 0.7931 | 8.3381 | 0.8245 | 0.8967 | 9.4364 | 0.7493 | 0.6932 | 2.5841 | 26.0188 | 9.4165 | 0.0942 | 22.2414 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 6 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | karakuri-ai/karakuri-lm-8x7b-chat-v0.1 | 0.579 | 0.4398 | 0.5106 | 0.2509 | 0.553 | 0.7859 | 0.738 | 0.8295 | 0.6669 | 0.5578 | 0.906 | 0.1308 | 0.6857 | 0.859 | 12.5191 | 0.9048 | 0.948 | 15.3313 | 0.8715 | 0.5106 | 0.8835 | 0.5086 | 0.6569 | 0.8391 | 0.4662 | 0.5165 | 0.8003 | 0.7633 | 0.6052 | 0.906 | 0.8325 | 0.7804 | 0.6349 | 0.738 | 0.4398 | 0.9157 | 0.5895 | 0.5214 | 0.0142 | 0.2769 | 0.115 | 0.0787 | 0.7696 | 0.8112 | 9.7984 | 0.8191 | 0.8947 | 9.9615 | 0.7225 | 0.714 | 3.0687 | 34.8302 | 13.0565 | 0.1308 | 29.23 | MixtralForCausalLM | bfloat16 | apache-2.0 | 46.703 | 20 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | karakuri-ai/karakuri-lm-8x7b-chat-v0.1 | 0.4643 | 0.4398 | 0.1461 | 0.1553 | 0.4483 | 0.6098 | 0.586 | 0.8132 | 0.5708 | 0.4013 | 0.8056 | 0.1308 | 0.5219 | 0.8413 | 9.5173 | 0.8921 | 0.9451 | 12.9912 | 0.8695 | 0.1461 | 0.7282 | 0.4684 | 0.6806 | 0.7596 | 0.2338 | 0.3988 | 0.3652 | 0.6818 | 0.6582 | 0.8056 | 0.8181 | 0.7938 | 0.3416 | 0.586 | 0.4398 | 0.9157 | 0.4979 | 0.4483 | 0.0208 | 0.0054 | 0.0354 | 0 | 0.7151 | 0.7783 | 7.1414 | 0.7944 | 0.8831 | 8.0624 | 0.6967 | 0.714 | 3.0687 | 34.8302 | 13.0565 | 0.1308 | 29.23 | MixtralForCausalLM | bfloat16 | apache-2.0 | 46.703 | 20 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Llama-3.2-1B-Instruct | 0.3059 | 0.006 | 0.3175 | 0.0299 | 0.3193 | 0.4104 | 0.448 | 0.6088 | 0.3826 | 0.2129 | 0.5948 | 0.0344 | 0.1483 | 0.7505 | 6.5622 | 0.7029 | 0.913 | 10.3897 | 0.7881 | 0.3175 | 0.4765 | 0.3391 | 0.5056 | 0.4683 | 0.3055 | 0.2801 | 0.5308 | 0.3163 | 0.2212 | 0.5948 | 0 | 0 | 0.2865 | 0.448 | 0.006 | 0.0321 | 0.3584 | 0.1849 | 0 | 0.0521 | 0 | 0.0123 | 0.0849 | 0.5608 | 1.4338 | 0.3883 | 0.8311 | 5.383 | 0.5558 | 0.6012 | 1.3537 | 12.4088 | 3.448 | 0.0344 | 9.9124 | LlamaForCausalLM | bfloat16 | llama3.2 | 1.236 | 573 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | meta-llama/Llama-3.2-1B-Instruct | 0.1052 | 0.006 | 0 | 0.0318 | 0.0014 | 0.189 | 0 | 0.4682 | 0 | 0.1129 | 0.3138 | 0.0344 | 0.0727 | 0.6638 | 1.5895 | 0.5399 | 0.7425 | 0.1903 | 0.4989 | 0 | 0.4051 | 0 | 0 | 0.0179 | 0.1733 | 0 | 0 | 0 | 0 | 0.3138 | -0.0837 | -0.1146 | 0.1441 | 0 | 0.006 | 0.0321 | 0.0028 | 0.0929 | 0 | 0 | 0 | 0 | 0.1589 | 0.5919 | 1.3013 | 0.4303 | 0.7328 | 0.3692 | 0.4039 | 0.6012 | 1.3537 | 12.4088 | 3.448 | 0.0344 | 9.9124 | LlamaForCausalLM | bfloat16 | llama3.2 | 1.236 | 573 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | AXCXEPT/Llama-3.1-70B-EZO-1.1-it | 0.4493 | 0.4759 | 0.1461 | 0.167 | 0.1811 | 0.87 | 0.374 | 0.8432 | 0.7253 | 0.3434 | 0.7049 | 0.1112 | 0.4223 | 0.8652 | 13.0595 | 0.9109 | 0.9559 | 17.4464 | 0.8853 | 0.1461 | 0.9236 | 0.6351 | 0.7486 | 0.9178 | 0.3892 | 0.0873 | 0.6265 | 0.7822 | 0.834 | 0.7049 | 0.8848 | 0.8564 | 0.7687 | 0.374 | 0.4759 | 0.7671 | 0.275 | 0.2188 | 0 | 0 | 0.0088 | 0 | 0.8263 | 0.8017 | 9.9795 | 0.8247 | 0.9003 | 10.2301 | 0.7519 | 0.7018 | 3.1569 | 26.8732 | 11.1146 | 0.1112 | 23.5595 | LlamaForCausalLM | bfloat16 | llama3.1 | 70.554 | 11 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | AXCXEPT/Llama-3.1-70B-EZO-1.1-it | 0.6537 | 0.4759 | 0.5341 | 0.2846 | 0.7542 | 0.9071 | 0.902 | 0.8441 | 0.7667 | 0.7038 | 0.9073 | 0.1112 | 0.8429 | 0.8606 | 13.1933 | 0.8997 | 0.9486 | 17.0482 | 0.8675 | 0.5341 | 0.9354 | 0.6178 | 0.7847 | 0.9607 | 0.6168 | 0.7201 | 0.7896 | 0.8056 | 0.836 | 0.9073 | 0.8743 | 0.8479 | 0.8253 | 0.902 | 0.4759 | 0.7671 | 0.7883 | 0.6516 | 0.0263 | 0.3609 | 0.1239 | 0.0555 | 0.8567 | 0.8353 | 13.154 | 0.8452 | 0.9071 | 10.9842 | 0.7638 | 0.7018 | 3.1569 | 26.8732 | 11.1146 | 0.1112 | 23.5595 | LlamaForCausalLM | bfloat16 | llama3.1 | 70.554 | 11 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | karakuri-ai/karakuri-lm-8x7b-instruct-v0.1 | 0.5485 | 0.0502 | 0.5043 | 0.2695 | 0.565 | 0.7869 | 0.72 | 0.8283 | 0.6865 | 0.596 | 0.9122 | 0.115 | 0.6905 | 0.8604 | 11.8326 | 0.9048 | 0.9486 | 15.6642 | 0.8721 | 0.5043 | 0.8442 | 0.5172 | 0.7292 | 0.8919 | 0.5456 | 0.5253 | 0.7576 | 0.7664 | 0.6621 | 0.9122 | 0.8363 | 0.7498 | 0.6245 | 0.72 | 0.0502 | 0.1245 | 0.6048 | 0.552 | 0.0222 | 0.3319 | 0.1239 | 0.0908 | 0.7789 | 0.8137 | 10.0974 | 0.8189 | 0.8917 | 9.9485 | 0.7173 | 0.6983 | 2.8331 | 30.452 | 11.4983 | 0.115 | 25.7002 | MixtralForCausalLM | bfloat16 | apache-2.0 | 46.704 | 19 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | karakuri-ai/karakuri-lm-8x7b-instruct-v0.1 | 0.4249 | 0.0502 | 0.2769 | 0.1339 | 0.4183 | 0.6429 | 0.5 | 0.8189 | 0.5858 | 0.3979 | 0.7338 | 0.115 | 0.4678 | 0.8427 | 10.4168 | 0.8933 | 0.9465 | 12.8796 | 0.8725 | 0.2769 | 0.8818 | 0.5144 | 0.6944 | 0.7364 | 0.2846 | 0.3812 | 0.3726 | 0.6723 | 0.6753 | 0.7338 | 0.8244 | 0.8011 | 0.3106 | 0.5 | 0.0502 | 0.1245 | 0.4554 | 0.4413 | 0.015 | 0.0011 | 0.0088 | 0 | 0.6445 | 0.7794 | 7.5151 | 0.7969 | 0.8851 | 8.337 | 0.7127 | 0.6983 | 2.8331 | 30.452 | 11.4983 | 0.115 | 25.7002 | MixtralForCausalLM | bfloat16 | apache-2.0 | 46.704 | 19 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | karakuri-ai/karakuri-lm-70b-chat-v0.1 | 0.52 | 0 | 0.4993 | 0.2285 | 0.516 | 0.8123 | 0.636 | 0.8387 | 0.5423 | 0.6236 | 0.9103 | 0.1135 | 0.7876 | 0.8589 | 12.8743 | 0.9023 | 0.9513 | 15.9804 | 0.8781 | 0.4993 | 0.8702 | 0.4138 | 0.7139 | 0.8856 | 0.5177 | 0.4679 | 0.4491 | 0.5871 | 0.5476 | 0.9103 | 0.6075 | 0.5192 | 0.6811 | 0.636 | 0 | 0.002 | 0.564 | 0.5656 | 0.0013 | 0.274 | 0.0354 | 0.0384 | 0.7933 | 0.8237 | 11.0849 | 0.8292 | 0.9051 | 11.4481 | 0.745 | 0.7048 | 2.0013 | 32.715 | 11.3497 | 0.1135 | 27.593 | LlamaForCausalLM | bfloat16 | other | 69.196 | 23 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | karakuri-ai/karakuri-lm-70b-chat-v0.1 | 0.3458 | 0 | 0.0708 | 0.1421 | 0.3197 | 0.4886 | 0.032 | 0.8226 | 0.5578 | 0.4086 | 0.848 | 0.1135 | 0.5177 | 0.8451 | 11.1751 | 0.8928 | 0.9485 | 14.225 | 0.8742 | 0.0708 | 0.7525 | 0.3649 | 0.5069 | 0.3342 | 0.2332 | 0.2231 | 0.5616 | 0.7102 | 0.645 | 0.848 | 0.202 | 0.4942 | 0.3792 | 0.032 | 0 | 0.002 | 0.4164 | 0.475 | 0.0025 | 0.0023 | 0 | 0 | 0.7056 | 0.7811 | 7.3019 | 0.7962 | 0.8954 | 9.1576 | 0.727 | 0.7048 | 2.0013 | 32.715 | 11.3497 | 0.1135 | 27.593 | LlamaForCausalLM | bfloat16 | other | 69.196 | 23 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | anthracite-org/magnum-v2.5-12b-kto | 0.5803 | 0.4659 | 0.5056 | 0.2575 | 0.5765 | 0.8432 | 0.742 | 0.8433 | 0.6803 | 0.4789 | 0.8929 | 0.0973 | 0.5346 | 0.8523 | 11.0049 | 0.9043 | 0.9507 | 15.6039 | 0.8798 | 0.5056 | 0.892 | 0.5546 | 0.75 | 0.9187 | 0.4296 | 0.5247 | 0.751 | 0.7557 | 0.5904 | 0.8929 | 0.8691 | 0.8385 | 0.7188 | 0.742 | 0.4659 | 0.8835 | 0.6283 | 0.4727 | 0.0461 | 0.3907 | 0.0265 | 0.0565 | 0.7675 | 0.8196 | 10.223 | 0.8337 | 0.901 | 9.7323 | 0.7555 | 0.6934 | 2.7577 | 28.2469 | 9.7173 | 0.0973 | 24.3763 | MistralForCausalLM | float16 | apache-2.0 | 12.248 | 40 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | anthracite-org/magnum-v2.5-12b-kto | 0.431 | 0.4659 | 0.2191 | 0.1406 | 0.3713 | 0.6504 | 0.256 | 0.7663 | 0.6381 | 0.3328 | 0.8032 | 0.0973 | 0.3706 | 0.8273 | 9.1971 | 0.8933 | 0.8894 | 13.3192 | 0.7395 | 0.2191 | 0.7866 | 0.5115 | 0.7389 | 0.6711 | 0.26 | 0.316 | 0.3977 | 0.7462 | 0.7964 | 0.8032 | 0.7715 | 0.7927 | 0.4936 | 0.256 | 0.4659 | 0.8835 | 0.4266 | 0.3679 | 0.0233 | 0.0011 | 0 | 0 | 0.6786 | 0.7848 | 7.6412 | 0.8147 | 0.8558 | 8.2921 | 0.6179 | 0.6934 | 2.7577 | 28.2469 | 9.7173 | 0.0973 | 24.3763 | MistralForCausalLM | float16 | apache-2.0 | 12.248 | 40 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | mistralai/Mistral-Nemo-Instruct-2407 | 0.5006 | 0.4438 | 0.3349 | 0.1314 | 0.505 | 0.6974 | 0.664 | 0.8173 | 0.6304 | 0.3476 | 0.8248 | 0.1102 | 0.4349 | 0.8356 | 9.7971 | 0.8909 | 0.9465 | 13.9763 | 0.8721 | 0.3349 | 0.7142 | 0.4339 | 0.7028 | 0.8284 | 0.2038 | 0.447 | 0.5255 | 0.7247 | 0.765 | 0.8248 | 0.8271 | 0.7887 | 0.5497 | 0.664 | 0.4438 | 0.8594 | 0.563 | 0.4041 | 0.0058 | 0 | 0.0265 | 0 | 0.6245 | 0.7781 | 8.1055 | 0.7868 | 0.8889 | 8.9971 | 0.7192 | 0.702 | 2.6477 | 30.1236 | 11.0167 | 0.1102 | 25.8189 | MistralForCausalLM | bfloat16 | apache-2.0 | 12.248 | 1,242 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | mistralai/Mistral-Nemo-Instruct-2407 | 0.582 | 0.4438 | 0.5297 | 0.251 | 0.5899 | 0.8386 | 0.754 | 0.8382 | 0.6393 | 0.5034 | 0.904 | 0.1102 | 0.5181 | 0.8563 | 11.6408 | 0.902 | 0.9503 | 15.776 | 0.8793 | 0.5297 | 0.8848 | 0.408 | 0.7458 | 0.9205 | 0.5032 | 0.5397 | 0.689 | 0.6907 | 0.6629 | 0.904 | 0.8645 | 0.821 | 0.7105 | 0.754 | 0.4438 | 0.8594 | 0.6401 | 0.489 | 0.0358 | 0.3292 | 0.0442 | 0.0898 | 0.7562 | 0.8181 | 9.9231 | 0.8269 | 0.8978 | 9.7514 | 0.7447 | 0.702 | 2.6477 | 30.1236 | 11.0167 | 0.1102 | 25.8189 | MistralForCausalLM | bfloat16 | apache-2.0 | 12.248 | 1,242 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | cyberagent/Mistral-Nemo-Japanese-Instruct-2408 | 0.5436 | 0.0181 | 0.5247 | 0.2218 | 0.5626 | 0.8079 | 0.768 | 0.8355 | 0.7091 | 0.52 | 0.9142 | 0.0977 | 0.5393 | 0.8645 | 12.7846 | 0.9087 | 0.9497 | 15.6384 | 0.8786 | 0.5247 | 0.8196 | 0.592 | 0.8375 | 0.9133 | 0.5091 | 0.5346 | 0.7194 | 0.7696 | 0.6272 | 0.9142 | 0.8762 | 0.8532 | 0.6908 | 0.768 | 0.0181 | 0.0422 | 0.5905 | 0.5116 | 0.0209 | 0.2932 | 0.0442 | 0.0512 | 0.6994 | 0.7948 | 7.9257 | 0.8172 | 0.893 | 9.2525 | 0.7373 | 0.6934 | 2.6755 | 26.798 | 9.7702 | 0.0977 | 23.1335 | MistralForCausalLM | bfloat16 | apache-2.0 | 12.248 | 23 | main | 4 | True | v1.4.1 | v0.6.3.post1 |
⭕ : instruction-tuned
|
cyberagent/Mistral-Nemo-Japanese-Instruct-2408
| 0.4725
| 0.0181
| 0.2617
| 0.1413
| 0.4206
| 0.7856
| 0.684
| 0.8286
| 0.7287
| 0.3388
| 0.892
| 0.0977
| 0.2819
| 0.85
| 12.1913
| 0.9051
| 0.9484
| 14.8813
| 0.8773
| 0.2617
| 0.8312
| 0.6063
| 0.7847
| 0.8937
| 0.3612
| 0.5095
| 0.804
| 0.7696
| 0.6787
| 0.892
| 0.8913
| 0.8646
| 0.632
| 0.684
| 0.0181
| 0.0422
| 0.3316
| 0.3733
| 0.0095
| 0.0039
| 0.0265
| 0.0092
| 0.6575
| 0.7695
| 6.2908
| 0.8015
| 0.8889
| 8.8337
| 0.7306
| 0.6934
| 2.6755
| 26.798
| 9.7702
| 0.0977
| 23.1335
|
MistralForCausalLM
|
bfloat16
|
apache-2.0
| 12.248
| 23
|
main
| 0
|
True
|
v1.4.1
|
v0.6.3.post1
| 🔶 : fine-tuned | Elizezen/Himeyuri-v0.1-12B | 0.5838 | 0.4719 | 0.5117 | 0.2548 | 0.5801 | 0.8502 | 0.742 | 0.844 | 0.6753 | 0.4922 | 0.8967 | 0.1033 | 0.5394 | 0.8537 | 11.1719 | 0.9055 | 0.9511 | 16.1927 | 0.88 | 0.5117 | 0.8945 | 0.5345 | 0.7472 | 0.9267 | 0.4574 | 0.5301 | 0.7564 | 0.7519 | 0.5864 | 0.8967 | 0.8673 | 0.8357 | 0.7295 | 0.742 | 0.4719 | 0.8735 | 0.6301 | 0.4798 | 0.0395 | 0.3845 | 0.0177 | 0.0627 | 0.7698 | 0.8222 | 10.426 | 0.8357 | 0.9015 | 9.9963 | 0.7547 | 0.6968 | 2.7793 | 28.6041 | 10.3203 | 0.1033 | 24.6055 | MistralForCausalLM | bfloat16 | apache-2.0 | 12.248 | 10 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | Elizezen/Himeyuri-v0.1-12B | 0.4116 | 0.4719 | 0.2167 | 0.1393 | 0.2734 | 0.5405 | 0.268 | 0.7667 | 0.6341 | 0.3081 | 0.8052 | 0.1033 | 0.3542 | 0.8028 | 9.2799 | 0.8537 | 0.9143 | 13.4908 | 0.777 | 0.2167 | 0.5168 | 0.5086 | 0.7264 | 0.6345 | 0.2323 | 0.2866 | 0.3895 | 0.7487 | 0.797 | 0.8052 | 0.8328 | 0.7993 | 0.4701 | 0.268 | 0.4719 | 0.8735 | 0.2601 | 0.3379 | 0.0106 | 0.0019 | 0 | 0.0014 | 0.6827 | 0.7734 | 7.6856 | 0.7975 | 0.8653 | 8.3046 | 0.6386 | 0.6968 | 2.7793 | 28.6041 | 10.3203 | 0.1033 | 24.6055 | MistralForCausalLM | bfloat16 | apache-2.0 | 12.248 | 10 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | spow12/ChatWaifu_v2.0_22B | 0.5755 | 0.4498 | 0.489 | 0.2754 | 0.5818 | 0.7434 | 0.774 | 0.8296 | 0.7435 | 0.425 | 0.9072 | 0.1113 | 0.399 | 0.849 | 11.2044 | 0.8882 | 0.9473 | 15.5331 | 0.872 | 0.489 | 0.8069 | 0.6236 | 0.8236 | 0.7757 | 0.492 | 0.5027 | 0.7576 | 0.7399 | 0.7727 | 0.9072 | 0.8752 | 0.829 | 0.6476 | 0.774 | 0.4498 | 0.9137 | 0.6609 | 0.3841 | 0.002 | 0.5037 | 0.1062 | 0.079 | 0.6864 | 0.8089 | 9.9046 | 0.8128 | 0.8999 | 10.5372 | 0.7454 | 0.7051 | 2.6042 | 32.0946 | 11.1286 | 0.1113 | 27.1554 | MistralForCausalLM | bfloat16 | cc-by-nc-4.0 | 22.247 | 6 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | spow12/ChatWaifu_v2.0_22B | 0.4908 | 0.4498 | 0.0584 | 0.1221 | 0.5491 | 0.6809 | 0.744 | 0.8135 | 0.6798 | 0.3484 | 0.842 | 0.1113 | 0.3116 | 0.826 | 9.8798 | 0.875 | 0.9448 | 14.384 | 0.8682 | 0.0584 | 0.7179 | 0.546 | 0.7611 | 0.7811 | 0.4214 | 0.4708 | 0.6742 | 0.7096 | 0.7079 | 0.842 | 0.8597 | 0.838 | 0.5437 | 0.744 | 0.4498 | 0.9137 | 0.6274 | 0.3123 | 0.002 | 0.004 | 0.0265 | 0.0103 | 0.5676 | 0.7695 | 6.8754 | 0.7822 | 0.8916 | 9.4839 | 0.7285 | 0.7051 | 2.6042 | 32.0946 | 11.1286 | 0.1113 | 27.1554 | MistralForCausalLM | bfloat16 | cc-by-nc-4.0 | 22.247 | 6 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | spow12/ChatWaifu_12B_v2.0 | 0.4009 | 0.4759 | 0.2443 | 0.1112 | 0.2387 | 0.4261 | 0.468 | 0.5958 | 0.6377 | 0.2964 | 0.825 | 0.0904 | 0.3011 | 0.7236 | 10.3014 | 0.6 | 0.8686 | 14.3307 | 0.521 | 0.2443 | 0.0098 | 0.4569 | 0.7431 | 0.7256 | 0.2627 | 0.3335 | 0.4528 | 0.7569 | 0.7788 | 0.825 | 0.8504 | 0.8101 | 0.5429 | 0.468 | 0.4759 | 0.8876 | 0.1439 | 0.3254 | 0.0025 | 0.001 | 0 | 0.0027 | 0.55 | 0.7188 | 8.0496 | 0.6563 | 0.8593 | 8.5589 | 0.6056 | 0.677 | 2.6975 | 24.4803 | 9.0565 | 0.0904 | 19.7651 | MistralForCausalLM | bfloat16 | cc-by-nc-4.0 | 12.248 | 11 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | spow12/ChatWaifu_12B_v2.0 | 0.5865 | 0.4759 | 0.5181 | 0.2591 | 0.5898 | 0.8507 | 0.752 | 0.8421 | 0.6773 | 0.4988 | 0.8976 | 0.0904 | 0.5378 | 0.8578 | 11.8316 | 0.9061 | 0.9515 | 15.9457 | 0.8809 | 0.5181 | 0.893 | 0.4885 | 0.75 | 0.9303 | 0.4831 | 0.5414 | 0.756 | 0.7626 | 0.6292 | 0.8976 | 0.8722 | 0.8432 | 0.7289 | 0.752 | 0.4759 | 0.8876 | 0.6382 | 0.4755 | 0.0492 | 0.358 | 0.0442 | 0.0699 | 0.7743 | 0.8223 | 10.3095 | 0.832 | 0.9009 | 9.937 | 0.7493 | 0.677 | 2.6975 | 24.4803 | 9.0565 | 0.0904 | 19.7651 | MistralForCausalLM | bfloat16 | cc-by-nc-4.0 | 12.248 | 11 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | augmxnt/shisa-gamma-7b-v1 | 0.4659 | 0.2369 | 0.0387 | 0.1173 | 0.3918 | 0.7352 | 0.476 | 0.8017 | 0.8178 | 0.4707 | 0.9121 | 0.1262 | 0.6101 | 0.8455 | 10.7929 | 0.8931 | 0.9422 | 13.3796 | 0.8623 | 0.0387 | 0.7751 | 0.6207 | 0.9486 | 0.8606 | 0.3184 | 0.3668 | 0.878 | 0.7854 | 0.8563 | 0.9121 | 0.9069 | 0.873 | 0.5701 | 0.476 | 0.2369 | 0.8574 | 0.4168 | 0.4836 | 0 | 0 | 0 | 0 | 0.5867 | 0.7639 | 7.1315 | 0.7741 | 0.8767 | 9.7167 | 0.6773 | 0.7158 | 2.6506 | 34.2569 | 12.6511 | 0.1262 | 28.614 | MistralForCausalLM | bfloat16 | apache-2.0 | 7.242 | 15 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🔶 : fine-tuned | augmxnt/shisa-gamma-7b-v1 | 0.547 | 0.2369 | 0.4704 | 0.2215 | 0.4648 | 0.8258 | 0.622 | 0.8253 | 0.7628 | 0.5497 | 0.9113 | 0.1262 | 0.666 | 0.8601 | 12.2649 | 0.9022 | 0.9458 | 15.6194 | 0.8687 | 0.4704 | 0.8765 | 0.6034 | 0.8722 | 0.9178 | 0.4577 | 0.4541 | 0.8118 | 0.7797 | 0.7469 | 0.9113 | 0.8235 | 0.7898 | 0.683 | 0.622 | 0.2369 | 0.8574 | 0.4756 | 0.5253 | 0 | 0.2735 | 0.0265 | 0.0612 | 0.7464 | 0.7979 | 8.7329 | 0.8034 | 0.8925 | 10.2192 | 0.7268 | 0.7158 | 2.6506 | 34.2569 | 12.6511 | 0.1262 | 28.614 | MistralForCausalLM | bfloat16 | apache-2.0 | 7.242 | 15 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | Qwen/QwQ-32B-Preview | 0.5785 | 0.6606 | 0.2402 | 0.1527 | 0.7286 | 0.8531 | 0.82 | 0.8396 | 0.7285 | 0.3296 | 0.8978 | 0.1134 | 0.1838 | 0.8478 | 11.7965 | 0.9037 | 0.9536 | 16.5924 | 0.8832 | 0.2402 | 0.895 | 0.6063 | 0.7139 | 0.9178 | 0.4363 | 0.6944 | 0.8012 | 0.7891 | 0.7319 | 0.8978 | 0.8994 | 0.8717 | 0.7464 | 0.82 | 0.6606 | 0.9478 | 0.7627 | 0.3686 | 0 | 0.0008 | 0.0531 | 0.0022 | 0.7073 | 0.7991 | 9.0753 | 0.8271 | 0.8955 | 9.2158 | 0.7442 | 0.7031 | 3.1354 | 29.4104 | 11.3342 | 0.1134 | 25.3233 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 594 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| 🟢 : pretrained | Qwen/QwQ-32B-Preview | 0.6715 | 0.6606 | 0.5794 | 0.2813 | 0.7756 | 0.8955 | 0.938 | 0.8487 | 0.7912 | 0.5775 | 0.9254 | 0.1134 | 0.5875 | 0.8633 | 12.6447 | 0.9093 | 0.9555 | 16.8456 | 0.8861 | 0.5794 | 0.9126 | 0.6609 | 0.8139 | 0.9517 | 0.6171 | 0.7557 | 0.8965 | 0.8049 | 0.7798 | 0.9254 | 0.8768 | 0.872 | 0.8223 | 0.938 | 0.6606 | 0.9478 | 0.7955 | 0.5279 | 0.0203 | 0.3842 | 0.1327 | 0.0816 | 0.7875 | 0.8335 | 12.1622 | 0.8381 | 0.9058 | 11.4057 | 0.7612 | 0.7031 | 3.1354 | 29.4104 | 11.3342 | 0.1134 | 25.3233 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 32.764 | 594 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | Qwen/Qwen2.5-14B-Instruct | 0.5476 | 0.4438 | 0.2239 | 0.1282 | 0.6851 | 0.8539 | 0.794 | 0.8041 | 0.7474 | 0.3898 | 0.8551 | 0.0982 | 0.4017 | 0.8419 | 9.2171 | 0.8957 | 0.9195 | 15.4526 | 0.8051 | 0.2239 | 0.8945 | 0.6351 | 0.8306 | 0.9285 | 0.3404 | 0.6631 | 0.7703 | 0.6894 | 0.8114 | 0.8551 | 0.884 | 0.8556 | 0.7388 | 0.794 | 0.4438 | 0.7068 | 0.7072 | 0.4274 | 0.0209 | 0.0088 | 0.0088 | 0.0008 | 0.6014 | 0.792 | 7.5747 | 0.811 | 0.875 | 8.7641 | 0.7048 | 0.6941 | 2.8388 | 25.5114 | 9.8292 | 0.0982 | 22.4451 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.77 | 126 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | Qwen/Qwen2.5-14B-Instruct | 0.6285 | 0.4438 | 0.5656 | 0.2613 | 0.7345 | 0.8802 | 0.892 | 0.8414 | 0.7568 | 0.5383 | 0.9019 | 0.0982 | 0.5091 | 0.8628 | 12.6532 | 0.9084 | 0.9458 | 16.5642 | 0.8663 | 0.5656 | 0.8988 | 0.6149 | 0.8278 | 0.9517 | 0.5883 | 0.7043 | 0.8225 | 0.762 | 0.7569 | 0.9019 | 0.8874 | 0.8586 | 0.7901 | 0.892 | 0.4438 | 0.7068 | 0.7648 | 0.5175 | 0.0969 | 0.3345 | 0.0147 | 0.0848 | 0.7754 | 0.8212 | 9.9418 | 0.8341 | 0.9023 | 10.5742 | 0.7567 | 0.6941 | 2.8388 | 25.5114 | 9.8292 | 0.0982 | 22.4451 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 14.77 | 126 | main | 4 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | AIDC-AI/Marco-o1 | 0.2171 | 0 | 0.1841 | 0.0632 | 0.1052 | 0.4735 | 0.042 | 0.6498 | 0.23 | 0.1423 | 0.4048 | 0.0936 | 0.1335 | 0.695 | 7.3933 | 0.7214 | 0.8533 | 11.4251 | 0.6421 | 0.1841 | 0 | 0.4828 | 0 | 0.7998 | 0.1625 | 0.203 | 0.3246 | 0 | 0.3428 | 0.4048 | 0.8694 | 0.835 | 0.6205 | 0.042 | 0 | 0 | 0.0073 | 0.1308 | 0 | 0.0071 | 0.0114 | 0.0028 | 0.2947 | 0.665 | 6.8745 | 0.6654 | 0.8227 | 7.7347 | 0.5703 | 0.6912 | 2.3785 | 26.7848 | 9.367 | 0.0936 | 22.6888 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 7.616 | 542 | main | 0 | False | v1.4.1 | v0.6.3.post1 |
| ⭕ : instruction-tuned | AIDC-AI/Marco-o1 | 0.5419 | 0 | 0.4799 | 0.2396 | 0.6505 | 0.8392 | 0.782 | 0.8328 | 0.7167 | 0.4246 | 0.9021 | 0.0936 | 0.3833 | 0.8513 | 11.848 | 0.895 | 0.949 | 15.5601 | 0.8771 | 0.4799 | 0.8615 | 0.5661 | 0.6889 | 0.9214 | 0.4758 | 0.6125 | 0.8221 | 0.7115 | 0.795 | 0.9021 | 0.8697 | 0.847 | 0.7347 | 0.782 | 0 | 0 | 0.6884 | 0.4146 | 0.036 | 0.3458 | 0.0708 | 0.0545 | 0.6909 | 0.8011 | 8.5787 | 0.8175 | 0.894 | 9.6318 | 0.7415 | 0.6912 | 2.3785 | 26.7848 | 9.367 | 0.0936 | 22.6888 | Qwen2ForCausalLM | bfloat16 | apache-2.0 | 7.616 | 542 | main | 4 | False | v1.4.1 | v0.6.3.post1 |