ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run999_AugV5_k20_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7784
  • Qwk: 0.7000
  • Mse: 0.7784
  • Rmse: 0.8822
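The metrics above can be reproduced from raw predictions. A minimal sketch (toy labels, not the actual evaluation set) of quadratic weighted kappa (Qwk), MSE, and RMSE, assuming the scores are discrete integer labels:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, equivalent to sklearn's
    cohen_kappa_score(..., weights="quadratic")."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix from the marginal label histograms
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    expected = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, 1 at the corners
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * observed[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * expected[i][j]
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

y_true, y_pred = [0, 1, 2, 2], [0, 1, 1, 2]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)   # 0.8
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)  # 0.25
rmse = math.sqrt(mse)  # 0.5
```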

Model description

More information needed

Intended uses & limitations

More information needed
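No usage details are provided, but since the checkpoint is on the Hub it can presumably be loaded with transformers in the usual way. A hedged sketch, assuming the head is a single-output regression (the identical Loss and Mse values above suggest MSE-loss regression on an organization score):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = ("MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_"
            "FineTuningAraBERT_run999_AugV5_k20_task1_organization")
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # placeholder: an Arabic essay to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
```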

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0196 2 6.7818 0.0176 6.7818 2.6042
No log 0.0392 4 4.6522 0.0543 4.6522 2.1569
No log 0.0588 6 3.0219 0.0848 3.0219 1.7384
No log 0.0784 8 2.3178 0.1096 2.3178 1.5224
No log 0.0980 10 1.9280 0.2857 1.9280 1.3885
No log 0.1176 12 1.7788 0.1495 1.7788 1.3337
No log 0.1373 14 1.9031 0.1356 1.9031 1.3795
No log 0.1569 16 2.0256 0.1626 2.0256 1.4233
No log 0.1765 18 1.7571 0.1391 1.7571 1.3256
No log 0.1961 20 1.6395 0.2124 1.6395 1.2804
No log 0.2157 22 1.6378 0.2479 1.6378 1.2797
No log 0.2353 24 1.6649 0.3125 1.6649 1.2903
No log 0.2549 26 1.4446 0.3902 1.4446 1.2019
No log 0.2745 28 1.3152 0.3967 1.3152 1.1468
No log 0.2941 30 1.3054 0.4202 1.3054 1.1425
No log 0.3137 32 1.3463 0.4677 1.3463 1.1603
No log 0.3333 34 1.5629 0.3040 1.5629 1.2501
No log 0.3529 36 1.6729 0.3881 1.6729 1.2934
No log 0.3725 38 1.4413 0.4030 1.4413 1.2006
No log 0.3922 40 1.3489 0.4429 1.3489 1.1614
No log 0.4118 42 1.1669 0.5344 1.1669 1.0802
No log 0.4314 44 1.1770 0.4961 1.1770 1.0849
No log 0.4510 46 1.3361 0.4219 1.3361 1.1559
No log 0.4706 48 1.1585 0.5191 1.1585 1.0764
No log 0.4902 50 1.0191 0.5556 1.0191 1.0095
No log 0.5098 52 1.0268 0.5120 1.0268 1.0133
No log 0.5294 54 1.0604 0.5238 1.0604 1.0298
No log 0.5490 56 1.0327 0.5469 1.0327 1.0162
No log 0.5686 58 1.0514 0.5397 1.0514 1.0254
No log 0.5882 60 1.1474 0.5692 1.1474 1.0712
No log 0.6078 62 1.2184 0.5692 1.2184 1.1038
No log 0.6275 64 1.2375 0.5581 1.2375 1.1124
No log 0.6471 66 1.2029 0.5156 1.2029 1.0968
No log 0.6667 68 1.2033 0.5625 1.2033 1.0969
No log 0.6863 70 1.1511 0.5649 1.1511 1.0729
No log 0.7059 72 1.1070 0.5556 1.1070 1.0521
No log 0.7255 74 1.1231 0.5397 1.1231 1.0598
No log 0.7451 76 1.1984 0.5649 1.1984 1.0947
No log 0.7647 78 1.3196 0.5294 1.3196 1.1488
No log 0.7843 80 1.3995 0.4604 1.3995 1.1830
No log 0.8039 82 1.3966 0.4604 1.3966 1.1818
No log 0.8235 84 1.2886 0.5116 1.2886 1.1352
No log 0.8431 86 1.2660 0.4688 1.2660 1.1252
No log 0.8627 88 1.3945 0.3810 1.3945 1.1809
No log 0.8824 90 1.3077 0.4252 1.3077 1.1435
No log 0.9020 92 1.1490 0.4706 1.1490 1.0719
No log 0.9216 94 1.2295 0.5692 1.2295 1.1088
No log 0.9412 96 1.2993 0.5606 1.2993 1.1399
No log 0.9608 98 1.1591 0.5909 1.1591 1.0766
No log 0.9804 100 1.0642 0.4839 1.0642 1.0316
No log 1.0 102 1.1627 0.4516 1.1627 1.0783
No log 1.0196 104 1.1855 0.4762 1.1855 1.0888
No log 1.0392 106 1.0879 0.4677 1.0879 1.0430
No log 1.0588 108 1.0408 0.5538 1.0408 1.0202
No log 1.0784 110 1.1507 0.6197 1.1507 1.0727
No log 1.0980 112 1.1172 0.6286 1.1172 1.0570
No log 1.1176 114 1.0629 0.5985 1.0629 1.0310
No log 1.1373 116 1.1036 0.6029 1.1036 1.0505
No log 1.1569 118 1.1590 0.5414 1.1590 1.0766
No log 1.1765 120 1.1617 0.5156 1.1617 1.0778
No log 1.1961 122 1.1835 0.5484 1.1835 1.0879
No log 1.2157 124 1.3976 0.4265 1.3976 1.1822
No log 1.2353 126 1.4555 0.3942 1.4555 1.2064
No log 1.2549 128 1.2962 0.4962 1.2962 1.1385
No log 1.2745 130 1.1217 0.5366 1.1217 1.0591
No log 1.2941 132 1.0895 0.4715 1.0895 1.0438
No log 1.3137 134 1.0374 0.5238 1.0374 1.0185
No log 1.3333 136 1.0111 0.5781 1.0111 1.0055
No log 1.3529 138 1.0378 0.5714 1.0378 1.0187
No log 1.3725 140 1.0869 0.5760 1.0869 1.0425
No log 1.3922 142 1.1235 0.6000 1.1235 1.0600
No log 1.4118 144 1.1780 0.5802 1.1780 1.0854
No log 1.4314 146 1.1252 0.6047 1.1252 1.0608
No log 1.4510 148 1.0457 0.5760 1.0457 1.0226
No log 1.4706 150 0.9769 0.5873 0.9769 0.9884
No log 1.4902 152 0.9491 0.6107 0.9491 0.9742
No log 1.5098 154 0.9790 0.5954 0.9790 0.9894
No log 1.5294 156 1.0869 0.5846 1.0869 1.0426
No log 1.5490 158 1.0766 0.5802 1.0766 1.0376
No log 1.5686 160 1.0261 0.6277 1.0261 1.0130
No log 1.5882 162 1.0265 0.6331 1.0265 1.0132
No log 1.6078 164 1.1473 0.5867 1.1473 1.0711
No log 1.6275 166 1.1671 0.6093 1.1671 1.0803
No log 1.6471 168 0.9925 0.6803 0.9925 0.9962
No log 1.6667 170 0.8294 0.6475 0.8294 0.9107
No log 1.6863 172 0.8660 0.6308 0.8660 0.9306
No log 1.7059 174 0.8770 0.6406 0.8770 0.9365
No log 1.7255 176 0.8696 0.6047 0.8696 0.9325
No log 1.7451 178 0.9208 0.5781 0.9208 0.9596
No log 1.7647 180 0.9094 0.5827 0.9094 0.9536
No log 1.7843 182 0.8658 0.6154 0.8658 0.9305
No log 1.8039 184 0.8416 0.6202 0.8416 0.9174
No log 1.8235 186 0.8190 0.6565 0.8190 0.9050
No log 1.8431 188 0.8062 0.7068 0.8062 0.8979
No log 1.8627 190 0.8040 0.6466 0.8040 0.8967
No log 1.8824 192 0.8161 0.6466 0.8161 0.9034
No log 1.9020 194 0.8314 0.6462 0.8314 0.9118
No log 1.9216 196 0.8418 0.6094 0.8418 0.9175
No log 1.9412 198 0.8488 0.6565 0.8488 0.9213
No log 1.9608 200 0.8410 0.6716 0.8410 0.9170
No log 1.9804 202 0.8487 0.6308 0.8487 0.9213
No log 2.0 204 0.8848 0.6047 0.8848 0.9407
No log 2.0196 206 0.9316 0.5760 0.9316 0.9652
No log 2.0392 208 0.9567 0.6190 0.9567 0.9781
No log 2.0588 210 0.9743 0.6190 0.9743 0.9870
No log 2.0784 212 0.9355 0.6406 0.9355 0.9672
No log 2.0980 214 0.9231 0.5426 0.9231 0.9608
No log 2.1176 216 0.9246 0.5649 0.9246 0.9615
No log 2.1373 218 0.8056 0.6418 0.8056 0.8975
No log 2.1569 220 0.7596 0.6567 0.7596 0.8715
No log 2.1765 222 0.7760 0.6763 0.7760 0.8809
No log 2.1961 224 0.7778 0.6316 0.7778 0.8819
No log 2.2157 226 0.8593 0.6620 0.8593 0.9270
No log 2.2353 228 0.9575 0.6187 0.9575 0.9785
No log 2.2549 230 0.9236 0.5839 0.9236 0.9610
No log 2.2745 232 0.9092 0.6423 0.9092 0.9535
No log 2.2941 234 0.9472 0.6176 0.9472 0.9733
No log 2.3137 236 1.0041 0.5926 1.0041 1.0020
No log 2.3333 238 1.1589 0.5714 1.1589 1.0765
No log 2.3529 240 1.1044 0.5344 1.1044 1.0509
No log 2.3725 242 1.0228 0.5354 1.0228 1.0113
No log 2.3922 244 0.9589 0.5781 0.9589 0.9793
No log 2.4118 246 0.9432 0.5714 0.9432 0.9712
No log 2.4314 248 0.9273 0.5827 0.9273 0.9629
No log 2.4510 250 0.8770 0.6107 0.8770 0.9365
No log 2.4706 252 0.8746 0.6107 0.8746 0.9352
No log 2.4902 254 0.9031 0.6212 0.9031 0.9503
No log 2.5098 256 1.0400 0.5600 1.0400 1.0198
No log 2.5294 258 1.1501 0.4407 1.1501 1.0724
No log 2.5490 260 1.1596 0.3826 1.1596 1.0768
No log 2.5686 262 1.1035 0.5366 1.1035 1.0505
No log 2.5882 264 1.0509 0.5645 1.0509 1.0251
No log 2.6078 266 1.0043 0.5873 1.0043 1.0022
No log 2.6275 268 0.9603 0.5600 0.9603 0.9800
No log 2.6471 270 0.9206 0.6047 0.9206 0.9595
No log 2.6667 272 0.9667 0.5758 0.9667 0.9832
No log 2.6863 274 0.9398 0.5926 0.9398 0.9694
No log 2.7059 276 0.9272 0.6232 0.9272 0.9629
No log 2.7255 278 0.9537 0.6131 0.9537 0.9766
No log 2.7451 280 0.9515 0.6357 0.9515 0.9754
No log 2.7647 282 0.9669 0.5736 0.9669 0.9833
No log 2.7843 284 1.0894 0.5581 1.0894 1.0437
No log 2.8039 286 1.1140 0.5312 1.1140 1.0555
No log 2.8235 288 1.0541 0.5280 1.0541 1.0267
No log 2.8431 290 1.0112 0.5984 1.0112 1.0056
No log 2.8627 292 0.9725 0.5891 0.9725 0.9862
No log 2.8824 294 0.9451 0.5649 0.9451 0.9722
No log 2.9020 296 0.9430 0.5649 0.9430 0.9711
No log 2.9216 298 0.9765 0.5238 0.9765 0.9882
No log 2.9412 300 1.0507 0.5203 1.0507 1.0250
No log 2.9608 302 1.0440 0.5246 1.0440 1.0217
No log 2.9804 304 1.0122 0.5410 1.0122 1.0061
No log 3.0 306 0.9528 0.5873 0.9528 0.9761
No log 3.0196 308 0.8923 0.5806 0.8923 0.9446
No log 3.0392 310 0.8179 0.5984 0.8179 0.9044
No log 3.0588 312 0.7404 0.6618 0.7404 0.8604
No log 3.0784 314 0.7016 0.7101 0.7016 0.8376
No log 3.0980 316 0.6815 0.7007 0.6815 0.8255
No log 3.1176 318 0.6921 0.6957 0.6921 0.8319
No log 3.1373 320 0.6881 0.7007 0.6881 0.8295
No log 3.1569 322 0.7288 0.6866 0.7288 0.8537
No log 3.1765 324 0.7873 0.6269 0.7873 0.8873
No log 3.1961 326 0.7895 0.6370 0.7895 0.8885
No log 3.2157 328 0.7754 0.6866 0.7754 0.8806
No log 3.2353 330 0.8123 0.6617 0.8123 0.9013
No log 3.2549 332 0.8661 0.5984 0.8661 0.9306
No log 3.2745 334 0.8900 0.6202 0.8900 0.9434
No log 3.2941 336 0.8836 0.6308 0.8836 0.9400
No log 3.3137 338 0.8621 0.6406 0.8621 0.9285
No log 3.3333 340 0.8041 0.6512 0.8041 0.8967
No log 3.3529 342 0.7597 0.6212 0.7597 0.8716
No log 3.3725 344 0.7960 0.6765 0.7960 0.8922
No log 3.3922 346 0.7790 0.5802 0.7790 0.8826
No log 3.4118 348 0.8225 0.5760 0.8225 0.9069
No log 3.4314 350 0.8779 0.5484 0.8779 0.9369
No log 3.4510 352 1.0074 0.4959 1.0074 1.0037
No log 3.4706 354 1.1250 0.4553 1.1250 1.0607
No log 3.4902 356 1.1112 0.4762 1.1112 1.0541
No log 3.5098 358 0.9957 0.5538 0.9957 0.9979
No log 3.5294 360 0.8745 0.5827 0.8745 0.9352
No log 3.5490 362 0.8605 0.6667 0.8605 0.9276
No log 3.5686 364 0.8609 0.6462 0.8609 0.9279
No log 3.5882 366 0.8289 0.6667 0.8289 0.9105
No log 3.6078 368 0.8040 0.6250 0.8040 0.8967
No log 3.6275 370 0.7942 0.6202 0.7942 0.8912
No log 3.6471 372 0.7906 0.6462 0.7906 0.8892
No log 3.6667 374 0.8043 0.6462 0.8043 0.8968
No log 3.6863 376 0.8543 0.6466 0.8543 0.9243
No log 3.7059 378 0.8877 0.6418 0.8877 0.9422
No log 3.7255 380 0.9122 0.6515 0.9122 0.9551
No log 3.7451 382 0.8954 0.6466 0.8954 0.9463
No log 3.7647 384 0.8760 0.6466 0.8760 0.9359
No log 3.7843 386 0.8540 0.6154 0.8540 0.9241
No log 3.8039 388 0.8909 0.6032 0.8909 0.9439
No log 3.8235 390 0.9215 0.6032 0.9215 0.9599
No log 3.8431 392 0.9296 0.5806 0.9296 0.9641
No log 3.8627 394 0.9548 0.5645 0.9548 0.9772
No log 3.8824 396 0.9898 0.5691 0.9898 0.9949
No log 3.9020 398 0.9848 0.5645 0.9848 0.9924
No log 3.9216 400 0.9266 0.6299 0.9266 0.9626
No log 3.9412 402 0.8332 0.6667 0.8332 0.9128
No log 3.9608 404 0.7684 0.6667 0.7684 0.8766
No log 3.9804 406 0.7530 0.6667 0.7530 0.8677
No log 4.0 408 0.7637 0.6769 0.7637 0.8739
No log 4.0196 410 0.8047 0.6667 0.8047 0.8970
No log 4.0392 412 0.8344 0.6457 0.8344 0.9135
No log 4.0588 414 0.8651 0.6562 0.8651 0.9301
No log 4.0784 416 0.9159 0.5806 0.9159 0.9570
No log 4.0980 418 0.9491 0.5366 0.9491 0.9742
No log 4.1176 420 0.9376 0.5691 0.9376 0.9683
No log 4.1373 422 0.8972 0.6190 0.8972 0.9472
No log 4.1569 424 0.9206 0.6667 0.9206 0.9595
No log 4.1765 426 0.9081 0.6615 0.9081 0.9529
No log 4.1961 428 0.8408 0.6667 0.8408 0.9170
No log 4.2157 430 0.8603 0.6015 0.8603 0.9275
No log 4.2353 432 0.9558 0.5758 0.9558 0.9776
No log 4.2549 434 0.9901 0.5496 0.9901 0.9950
No log 4.2745 436 0.9626 0.5528 0.9626 0.9811
No log 4.2941 438 0.9366 0.6357 0.9366 0.9678
No log 4.3137 440 0.9171 0.6615 0.9171 0.9576
No log 4.3333 442 0.8374 0.6912 0.8374 0.9151
No log 4.3529 444 0.7750 0.6571 0.7750 0.8803
No log 4.3725 446 0.7853 0.6809 0.7853 0.8862
No log 4.3922 448 0.7972 0.6853 0.7972 0.8929
No log 4.4118 450 0.7953 0.6522 0.7953 0.8918
No log 4.4314 452 0.8065 0.6423 0.8065 0.8980
No log 4.4510 454 0.8453 0.6222 0.8453 0.9194
No log 4.4706 456 0.8459 0.6222 0.8459 0.9197
No log 4.4902 458 0.8731 0.5954 0.8731 0.9344
No log 4.5098 460 0.8983 0.5891 0.8983 0.9478
No log 4.5294 462 0.9076 0.5938 0.9076 0.9527
No log 4.5490 464 0.9134 0.6142 0.9134 0.9557
No log 4.5686 466 0.9136 0.6142 0.9136 0.9558
No log 4.5882 468 0.8786 0.6406 0.8786 0.9373
No log 4.6078 470 0.8642 0.6462 0.8642 0.9296
No log 4.6275 472 0.8582 0.6462 0.8582 0.9264
No log 4.6471 474 0.8347 0.6718 0.8347 0.9136
No log 4.6667 476 0.9075 0.6250 0.9075 0.9526
No log 4.6863 478 0.9897 0.5366 0.9897 0.9948
No log 4.7059 480 1.0402 0.5289 1.0402 1.0199
No log 4.7255 482 1.0209 0.5806 1.0209 1.0104
No log 4.7451 484 0.9984 0.6349 0.9984 0.9992
No log 4.7647 486 0.9289 0.625 0.9289 0.9638
No log 4.7843 488 0.8248 0.6565 0.8248 0.9082
No log 4.8039 490 0.7036 0.7121 0.7036 0.8388
No log 4.8235 492 0.6861 0.7273 0.6861 0.8283
No log 4.8431 494 0.8577 0.6575 0.8577 0.9261
No log 4.8627 496 0.8961 0.6575 0.8961 0.9466
No log 4.8824 498 0.7991 0.6944 0.7991 0.8939
0.28 4.9020 500 0.7181 0.7448 0.7181 0.8474
0.28 4.9216 502 0.6866 0.7429 0.6866 0.8286
0.28 4.9412 504 0.7055 0.6963 0.7055 0.8399
0.28 4.9608 506 0.7292 0.6866 0.7292 0.8539
0.28 4.9804 508 0.7557 0.6957 0.7557 0.8693
0.28 5.0 510 0.7784 0.7000 0.7784 0.8822

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1