tunned_albert_model2

This model is a fine-tuned version of ckiplab/albert-tiny-chinese on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0275
  • Accuracy: 1.0
  • Precision: 1.0
  • Recall: 1.0
  • F1: 1.0
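
The card does not document the task or label set, but the metric set above (accuracy, precision, recall, F1) together with the sequence-classification base model suggests a text-classification fine-tune. The snippet below is a minimal, hypothetical inference sketch using the repository id shown on the model page (picard47at/tunned_albert_model2); the example input and label names are placeholders, not part of the card.

```python
# Hypothetical inference sketch -- the task, labels, and preprocessing are not
# documented in this card, so treat this as an assumption, not the authors' usage.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="picard47at/tunned_albert_model2",  # repo id taken from the model page
)

# Placeholder Chinese input; real inputs depend on the (undocumented) training data.
print(classifier("這是一個測試句子。"))
# -> e.g. [{'label': 'LABEL_1', 'score': 0.99}] (label names depend on the saved config)
```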

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 100
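
As a rough illustration (not the authors' training script), the hyperparameters above map onto Hugging Face TrainingArguments roughly as sketched below. The dataset, label count, tokenization, and evaluation cadence are assumptions: the training data is not documented, and the eval cadence is only inferred from the results table, which logs metrics every 3 steps.

```python
# Sketch only: maps the listed hyperparameters onto TrainingArguments/Trainer.
# The dataset, num_labels, max_length, and eval cadence are assumptions.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    BertTokenizerFast,  # ckiplab's model cards recommend BertTokenizerFast for these checkpoints
    Trainer,
    TrainingArguments,
)

tokenizer = BertTokenizerFast.from_pretrained("ckiplab/albert-tiny-chinese")
model = AutoModelForSequenceClassification.from_pretrained(
    "ckiplab/albert-tiny-chinese",
    num_labels=2,  # assumption; the actual label count is not documented
)

# Tiny dummy dataset purely so the sketch runs end to end; the real data is unknown.
raw = Dataset.from_dict({"text": ["正面的例子。", "負面的例子。"], "label": [1, 0]})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

encoded = raw.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="tunned_albert_model2",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",   # betas=(0.9, 0.999) and eps=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=3,          # inferred: the results table reports metrics every 3 steps
    logging_steps=3,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded,  # placeholders; the real train/eval split is unknown
    eval_dataset=encoded,
    tokenizer=tokenizer,
)
# trainer.train()
```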

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1
0.6954 0.3333 3 0.6878 0.8 0.8038 0.7873 0.7917
0.6833 0.6667 6 0.6773 0.5667 0.2833 0.5 0.3617
0.6695 1.0 9 0.6654 0.5667 0.2833 0.5 0.3617
0.6532 1.3333 12 0.6557 0.5667 0.2833 0.5 0.3617
0.6669 1.6667 15 0.6469 0.5667 0.2833 0.5 0.3617
0.652 2.0 18 0.6333 0.5667 0.2833 0.5 0.3617
0.6134 2.3333 21 0.6109 0.5667 0.2833 0.5 0.3617
0.5863 2.6667 24 0.5836 0.6 0.6296 0.5475 0.4886
0.5766 3.0 27 0.5563 0.6667 0.72 0.6244 0.6032
0.5285 3.3333 30 0.5302 0.7333 0.7764 0.7014 0.7
0.5476 3.6667 33 0.5070 0.7667 0.8011 0.7398 0.7436
0.4637 4.0 36 0.4830 0.8333 0.8500 0.8167 0.8237
0.469 4.3333 39 0.4656 0.8333 0.8500 0.8167 0.8237
0.4471 4.6667 42 0.4441 0.8333 0.8500 0.8167 0.8237
0.4235 5.0 45 0.4154 0.8667 0.8756 0.8552 0.8611
0.4401 5.3333 48 0.3934 0.9 0.9028 0.8937 0.8971
0.3664 5.6667 51 0.3736 0.8667 0.8756 0.8552 0.8611
0.3439 6.0 54 0.3777 0.8667 0.8756 0.8552 0.8611
0.3148 6.3333 57 0.3944 0.8333 0.8500 0.8167 0.8237
0.3827 6.6667 60 0.3624 0.8667 0.8756 0.8552 0.8611
0.3243 7.0 63 0.3175 0.9 0.9028 0.8937 0.8971
0.3449 7.3333 66 0.3008 0.9333 0.9321 0.9321 0.9321
0.2895 7.6667 69 0.2910 0.9333 0.9321 0.9321 0.9321
0.2867 8.0 72 0.3103 0.9 0.9028 0.8937 0.8971
0.3041 8.3333 75 0.3456 0.8667 0.8756 0.8552 0.8611
0.2793 8.6667 78 0.3280 0.9 0.9028 0.8937 0.8971
0.2609 9.0 81 0.2865 0.9 0.9028 0.8937 0.8971
0.2499 9.3333 84 0.2559 0.9 0.9028 0.8937 0.8971
0.2298 9.6667 87 0.2587 0.9 0.9028 0.8937 0.8971
0.2297 10.0 90 0.3077 0.9 0.9028 0.8937 0.8971
0.2286 10.3333 93 0.3392 0.8667 0.8756 0.8552 0.8611
0.2415 10.6667 96 0.3521 0.8333 0.8500 0.8167 0.8237
0.2151 11.0 99 0.2810 0.9 0.9028 0.8937 0.8971
0.1832 11.3333 102 0.2411 0.9333 0.9321 0.9321 0.9321
0.2159 11.6667 105 0.2552 0.9 0.9028 0.8937 0.8971
0.1732 12.0 108 0.2748 0.9 0.9028 0.8937 0.8971
0.1712 12.3333 111 0.2500 0.9 0.9028 0.8937 0.8971
0.1728 12.6667 114 0.2306 0.9333 0.9321 0.9321 0.9321
0.1295 13.0 117 0.2173 0.9333 0.9321 0.9321 0.9321
0.1377 13.3333 120 0.2244 0.9333 0.9321 0.9321 0.9321
0.113 13.6667 123 0.2151 0.9333 0.9321 0.9321 0.9321
0.1386 14.0 126 0.1908 0.9333 0.9321 0.9321 0.9321
0.1004 14.3333 129 0.1701 0.9333 0.9321 0.9321 0.9321
0.1525 14.6667 132 0.1809 0.9333 0.9321 0.9321 0.9321
0.0799 15.0 135 0.2101 0.9333 0.9321 0.9321 0.9321
0.0544 15.3333 138 0.1978 0.9333 0.9321 0.9321 0.9321
0.1573 15.6667 141 0.1659 0.9333 0.9321 0.9321 0.9321
0.0695 16.0 144 0.1742 0.9333 0.9321 0.9321 0.9321
0.1158 16.3333 147 0.1831 0.9333 0.9321 0.9321 0.9321
0.0427 16.6667 150 0.1980 0.9333 0.9321 0.9321 0.9321
0.0883 17.0 153 0.2152 0.9333 0.9321 0.9321 0.9321
0.0705 17.3333 156 0.2301 0.9333 0.9321 0.9321 0.9321
0.0711 17.6667 159 0.2214 0.9333 0.9321 0.9321 0.9321
0.0607 18.0 162 0.1914 0.9333 0.9321 0.9321 0.9321
0.0671 18.3333 165 0.1565 0.9333 0.9321 0.9321 0.9321
0.0803 18.6667 168 0.1608 0.9333 0.9321 0.9321 0.9321
0.0275 19.0 171 0.1890 0.9333 0.9321 0.9321 0.9321
0.072 19.3333 174 0.2328 0.9 0.9028 0.8937 0.8971
0.0468 19.6667 177 0.1891 0.9333 0.9321 0.9321 0.9321
0.0418 20.0 180 0.1773 0.9333 0.9321 0.9321 0.9321
0.0547 20.3333 183 0.1812 0.9333 0.9321 0.9321 0.9321
0.0211 20.6667 186 0.1740 0.9333 0.9321 0.9321 0.9321
0.0551 21.0 189 0.1685 0.9333 0.9321 0.9321 0.9321
0.0196 21.3333 192 0.1641 0.9333 0.9321 0.9321 0.9321
0.0469 21.6667 195 0.1629 0.9667 0.9722 0.9615 0.9657
0.0433 22.0 198 0.1651 0.9667 0.9722 0.9615 0.9657
0.0404 22.3333 201 0.1691 0.9667 0.9722 0.9615 0.9657
0.0348 22.6667 204 0.1731 0.9667 0.9722 0.9615 0.9657
0.018 23.0 207 0.1715 0.9667 0.9722 0.9615 0.9657
0.0211 23.3333 210 0.1668 0.9667 0.9722 0.9615 0.9657
0.0299 23.6667 213 0.1628 0.9667 0.9722 0.9615 0.9657
0.0219 24.0 216 0.1627 0.9667 0.9722 0.9615 0.9657
0.0119 24.3333 219 0.1627 0.9333 0.9321 0.9321 0.9321
0.0143 24.6667 222 0.1625 0.9333 0.9321 0.9321 0.9321
0.042 25.0 225 0.1627 0.9667 0.9722 0.9615 0.9657
0.0122 25.3333 228 0.1634 0.9667 0.9722 0.9615 0.9657
0.0215 25.6667 231 0.1632 0.9667 0.9722 0.9615 0.9657
0.0106 26.0 234 0.1579 0.9667 0.9722 0.9615 0.9657
0.0185 26.3333 237 0.1495 0.9667 0.9722 0.9615 0.9657
0.0102 26.6667 240 0.1406 0.9667 0.9722 0.9615 0.9657
0.0098 27.0 243 0.1297 0.9667 0.9722 0.9615 0.9657
0.0094 27.3333 246 0.1162 0.9667 0.9722 0.9615 0.9657
0.0158 27.6667 249 0.1127 0.9667 0.9722 0.9615 0.9657
0.0083 28.0 252 0.1165 0.9667 0.9722 0.9615 0.9657
0.0082 28.3333 255 0.1188 0.9667 0.9722 0.9615 0.9657
0.0083 28.6667 258 0.1178 0.9667 0.9722 0.9615 0.9657
0.0136 29.0 261 0.1239 0.9667 0.9722 0.9615 0.9657
0.008 29.3333 264 0.1270 0.9667 0.9722 0.9615 0.9657
0.0073 29.6667 267 0.1229 0.9667 0.9722 0.9615 0.9657
0.0126 30.0 270 0.1256 0.9667 0.9722 0.9615 0.9657
0.0071 30.3333 273 0.1258 0.9667 0.9722 0.9615 0.9657
0.0113 30.6667 276 0.1282 0.9667 0.9722 0.9615 0.9657
0.0066 31.0 279 0.1300 0.9667 0.9722 0.9615 0.9657
0.0066 31.3333 282 0.1273 0.9667 0.9722 0.9615 0.9657
0.007 31.6667 285 0.1164 0.9667 0.9722 0.9615 0.9657
0.0134 32.0 288 0.1143 0.9667 0.9722 0.9615 0.9657
0.0061 32.3333 291 0.1240 0.9667 0.9722 0.9615 0.9657
0.0075 32.6667 294 0.1392 0.9667 0.9722 0.9615 0.9657
0.0076 33.0 297 0.1301 0.9667 0.9722 0.9615 0.9657
0.0058 33.3333 300 0.0953 0.9667 0.9722 0.9615 0.9657
0.0055 33.6667 303 0.0573 0.9667 0.9722 0.9615 0.9657
0.0061 34.0 306 0.0387 0.9667 0.9722 0.9615 0.9657
0.0059 34.3333 309 0.0329 0.9667 0.9722 0.9615 0.9657
0.0051 34.6667 312 0.0350 0.9667 0.9722 0.9615 0.9657
0.0053 35.0 315 0.0379 0.9667 0.9722 0.9615 0.9657
0.0052 35.3333 318 0.0379 0.9667 0.9722 0.9615 0.9657
0.005 35.6667 321 0.0389 0.9667 0.9722 0.9615 0.9657
0.0052 36.0 324 0.0409 0.9667 0.9722 0.9615 0.9657
0.0049 36.3333 327 0.0408 0.9667 0.9722 0.9615 0.9657
0.005 36.6667 330 0.0405 0.9667 0.9722 0.9615 0.9657
0.0046 37.0 333 0.0393 0.9667 0.9722 0.9615 0.9657
0.0044 37.3333 336 0.0367 0.9667 0.9722 0.9615 0.9657
0.0047 37.6667 339 0.0342 0.9667 0.9722 0.9615 0.9657
0.0045 38.0 342 0.0316 0.9667 0.9722 0.9615 0.9657
0.0044 38.3333 345 0.0286 1.0 1.0 1.0 1.0
0.0041 38.6667 348 0.0275 1.0 1.0 1.0 1.0
0.0044 39.0 351 0.0283 1.0 1.0 1.0 1.0
0.0044 39.3333 354 0.0307 0.9667 0.9722 0.9615 0.9657
0.0041 39.6667 357 0.0327 0.9667 0.9722 0.9615 0.9657
0.0039 40.0 360 0.0344 0.9667 0.9722 0.9615 0.9657
0.0039 40.3333 363 0.0363 0.9667 0.9722 0.9615 0.9657
0.004 40.6667 366 0.0381 0.9667 0.9722 0.9615 0.9657
0.0041 41.0 369 0.0375 0.9667 0.9722 0.9615 0.9657
0.004 41.3333 372 0.0365 0.9667 0.9722 0.9615 0.9657
0.0037 41.6667 375 0.0358 0.9667 0.9722 0.9615 0.9657
0.0038 42.0 378 0.0364 0.9667 0.9722 0.9615 0.9657
0.0037 42.3333 381 0.0383 0.9667 0.9722 0.9615 0.9657
0.0039 42.6667 384 0.0394 0.9667 0.9722 0.9615 0.9657
0.0036 43.0 387 0.0398 0.9667 0.9722 0.9615 0.9657
0.0035 43.3333 390 0.0414 0.9667 0.9722 0.9615 0.9657
0.0035 43.6667 393 0.0429 0.9667 0.9722 0.9615 0.9657
0.0035 44.0 396 0.0423 0.9667 0.9722 0.9615 0.9657
0.0035 44.3333 399 0.0408 0.9667 0.9722 0.9615 0.9657
0.0035 44.6667 402 0.0393 0.9667 0.9722 0.9615 0.9657
0.0034 45.0 405 0.0387 0.9667 0.9722 0.9615 0.9657
0.0033 45.3333 408 0.0388 0.9667 0.9722 0.9615 0.9657
0.0034 45.6667 411 0.0392 0.9667 0.9722 0.9615 0.9657
0.0033 46.0 414 0.0386 0.9667 0.9722 0.9615 0.9657
0.0032 46.3333 417 0.0385 0.9667 0.9722 0.9615 0.9657
0.0033 46.6667 420 0.0374 0.9667 0.9722 0.9615 0.9657
0.0032 47.0 423 0.0362 0.9667 0.9722 0.9615 0.9657
0.0032 47.3333 426 0.0355 0.9667 0.9722 0.9615 0.9657
0.003 47.6667 429 0.0347 0.9667 0.9722 0.9615 0.9657
0.0029 48.0 432 0.0344 0.9667 0.9722 0.9615 0.9657
0.003 48.3333 435 0.0347 0.9667 0.9722 0.9615 0.9657
0.003 48.6667 438 0.0350 0.9667 0.9722 0.9615 0.9657
0.003 49.0 441 0.0352 0.9667 0.9722 0.9615 0.9657
0.0029 49.3333 444 0.0358 0.9667 0.9722 0.9615 0.9657
0.003 49.6667 447 0.0364 0.9667 0.9722 0.9615 0.9657
0.0029 50.0 450 0.0369 0.9667 0.9722 0.9615 0.9657
0.003 50.3333 453 0.0380 0.9667 0.9722 0.9615 0.9657
0.0027 50.6667 456 0.0398 0.9667 0.9722 0.9615 0.9657
0.0028 51.0 459 0.0396 0.9667 0.9722 0.9615 0.9657
0.0028 51.3333 462 0.0382 0.9667 0.9722 0.9615 0.9657
0.0028 51.6667 465 0.0365 0.9667 0.9722 0.9615 0.9657
0.0028 52.0 468 0.0357 0.9667 0.9722 0.9615 0.9657
0.0028 52.3333 471 0.0357 0.9667 0.9722 0.9615 0.9657
0.0026 52.6667 474 0.0360 0.9667 0.9722 0.9615 0.9657
0.0027 53.0 477 0.0363 0.9667 0.9722 0.9615 0.9657
0.0027 53.3333 480 0.0379 0.9667 0.9722 0.9615 0.9657
0.0027 53.6667 483 0.0378 0.9667 0.9722 0.9615 0.9657
0.0026 54.0 486 0.0375 0.9667 0.9722 0.9615 0.9657
0.0027 54.3333 489 0.0372 0.9667 0.9722 0.9615 0.9657
0.0024 54.6667 492 0.0369 0.9667 0.9722 0.9615 0.9657
0.0024 55.0 495 0.0367 0.9667 0.9722 0.9615 0.9657
0.0025 55.3333 498 0.0368 0.9667 0.9722 0.9615 0.9657
0.0025 55.6667 501 0.0369 0.9667 0.9722 0.9615 0.9657
0.0025 56.0 504 0.0367 0.9667 0.9722 0.9615 0.9657
0.0024 56.3333 507 0.0360 0.9667 0.9722 0.9615 0.9657
0.0025 56.6667 510 0.0362 0.9667 0.9722 0.9615 0.9657
0.0024 57.0 513 0.0358 0.9667 0.9722 0.9615 0.9657
0.0024 57.3333 516 0.0360 0.9667 0.9722 0.9615 0.9657
0.0024 57.6667 519 0.0356 0.9667 0.9722 0.9615 0.9657
0.0026 58.0 522 0.0354 0.9667 0.9722 0.9615 0.9657
0.0024 58.3333 525 0.0361 0.9667 0.9722 0.9615 0.9657
0.0023 58.6667 528 0.0357 0.9667 0.9722 0.9615 0.9657
0.0023 59.0 531 0.0349 0.9667 0.9722 0.9615 0.9657
0.0023 59.3333 534 0.0347 0.9667 0.9722 0.9615 0.9657
0.0023 59.6667 537 0.0335 0.9667 0.9722 0.9615 0.9657
0.0023 60.0 540 0.0337 0.9667 0.9722 0.9615 0.9657
0.0021 60.3333 543 0.0346 0.9667 0.9722 0.9615 0.9657
0.0023 60.6667 546 0.0352 0.9667 0.9722 0.9615 0.9657
0.0022 61.0 549 0.0351 0.9667 0.9722 0.9615 0.9657
0.0022 61.3333 552 0.0348 0.9667 0.9722 0.9615 0.9657
0.0022 61.6667 555 0.0343 0.9667 0.9722 0.9615 0.9657
0.0023 62.0 558 0.0343 0.9667 0.9722 0.9615 0.9657
0.0022 62.3333 561 0.0345 0.9667 0.9722 0.9615 0.9657
0.0022 62.6667 564 0.0341 0.9667 0.9722 0.9615 0.9657
0.0021 63.0 567 0.0345 0.9667 0.9722 0.9615 0.9657
0.0022 63.3333 570 0.0346 0.9667 0.9722 0.9615 0.9657
0.0021 63.6667 573 0.0349 0.9667 0.9722 0.9615 0.9657
0.0022 64.0 576 0.0351 0.9667 0.9722 0.9615 0.9657
0.0021 64.3333 579 0.0353 0.9667 0.9722 0.9615 0.9657
0.0021 64.6667 582 0.0361 0.9667 0.9722 0.9615 0.9657
0.0021 65.0 585 0.0360 0.9667 0.9722 0.9615 0.9657
0.0021 65.3333 588 0.0354 0.9667 0.9722 0.9615 0.9657
0.0021 65.6667 591 0.0353 0.9667 0.9722 0.9615 0.9657
0.0021 66.0 594 0.0346 0.9667 0.9722 0.9615 0.9657
0.0021 66.3333 597 0.0340 0.9667 0.9722 0.9615 0.9657
0.0021 66.6667 600 0.0337 0.9667 0.9722 0.9615 0.9657
0.002 67.0 603 0.0336 0.9667 0.9722 0.9615 0.9657
0.0019 67.3333 606 0.0335 0.9667 0.9722 0.9615 0.9657
0.0021 67.6667 609 0.0334 0.9667 0.9722 0.9615 0.9657
0.0021 68.0 612 0.0334 0.9667 0.9722 0.9615 0.9657
0.0021 68.3333 615 0.0335 0.9667 0.9722 0.9615 0.9657
0.002 68.6667 618 0.0334 0.9667 0.9722 0.9615 0.9657
0.002 69.0 621 0.0333 0.9667 0.9722 0.9615 0.9657
0.002 69.3333 624 0.0331 0.9667 0.9722 0.9615 0.9657
0.0018 69.6667 627 0.0335 0.9667 0.9722 0.9615 0.9657
0.0019 70.0 630 0.0333 0.9667 0.9722 0.9615 0.9657
0.002 70.3333 633 0.0329 0.9667 0.9722 0.9615 0.9657
0.0019 70.6667 636 0.0326 0.9667 0.9722 0.9615 0.9657
0.002 71.0 639 0.0329 0.9667 0.9722 0.9615 0.9657
0.002 71.3333 642 0.0333 0.9667 0.9722 0.9615 0.9657
0.0019 71.6667 645 0.0336 0.9667 0.9722 0.9615 0.9657
0.0019 72.0 648 0.0331 0.9667 0.9722 0.9615 0.9657
0.0019 72.3333 651 0.0325 0.9667 0.9722 0.9615 0.9657
0.0019 72.6667 654 0.0318 0.9667 0.9722 0.9615 0.9657
0.0019 73.0 657 0.0313 0.9667 0.9722 0.9615 0.9657
0.0018 73.3333 660 0.0314 0.9667 0.9722 0.9615 0.9657
0.0019 73.6667 663 0.0314 0.9667 0.9722 0.9615 0.9657
0.0018 74.0 666 0.0313 0.9667 0.9722 0.9615 0.9657
0.0019 74.3333 669 0.0315 0.9667 0.9722 0.9615 0.9657
0.0018 74.6667 672 0.0318 0.9667 0.9722 0.9615 0.9657
0.0018 75.0 675 0.0320 0.9667 0.9722 0.9615 0.9657
0.0019 75.3333 678 0.0318 0.9667 0.9722 0.9615 0.9657
0.0019 75.6667 681 0.0317 0.9667 0.9722 0.9615 0.9657
0.0018 76.0 684 0.0316 0.9667 0.9722 0.9615 0.9657
0.0019 76.3333 687 0.0315 0.9667 0.9722 0.9615 0.9657
0.0018 76.6667 690 0.0310 0.9667 0.9722 0.9615 0.9657
0.0018 77.0 693 0.0307 0.9667 0.9722 0.9615 0.9657
0.0018 77.3333 696 0.0305 0.9667 0.9722 0.9615 0.9657
0.0018 77.6667 699 0.0304 0.9667 0.9722 0.9615 0.9657
0.0019 78.0 702 0.0304 0.9667 0.9722 0.9615 0.9657
0.0018 78.3333 705 0.0305 0.9667 0.9722 0.9615 0.9657
0.0018 78.6667 708 0.0307 0.9667 0.9722 0.9615 0.9657
0.0018 79.0 711 0.0306 0.9667 0.9722 0.9615 0.9657
0.0017 79.3333 714 0.0309 0.9667 0.9722 0.9615 0.9657
0.0018 79.6667 717 0.0312 0.9667 0.9722 0.9615 0.9657
0.0017 80.0 720 0.0312 0.9667 0.9722 0.9615 0.9657
0.0018 80.3333 723 0.0309 0.9667 0.9722 0.9615 0.9657
0.0017 80.6667 726 0.0310 0.9667 0.9722 0.9615 0.9657
0.0017 81.0 729 0.0311 0.9667 0.9722 0.9615 0.9657
0.0017 81.3333 732 0.0314 0.9667 0.9722 0.9615 0.9657
0.0017 81.6667 735 0.0316 0.9667 0.9722 0.9615 0.9657
0.0018 82.0 738 0.0316 0.9667 0.9722 0.9615 0.9657
0.0018 82.3333 741 0.0312 0.9667 0.9722 0.9615 0.9657
0.0017 82.6667 744 0.0309 0.9667 0.9722 0.9615 0.9657
0.0017 83.0 747 0.0308 0.9667 0.9722 0.9615 0.9657
0.0017 83.3333 750 0.0307 0.9667 0.9722 0.9615 0.9657
0.0017 83.6667 753 0.0308 0.9667 0.9722 0.9615 0.9657
0.0017 84.0 756 0.0307 0.9667 0.9722 0.9615 0.9657
0.0017 84.3333 759 0.0303 0.9667 0.9722 0.9615 0.9657
0.0018 84.6667 762 0.0299 0.9667 0.9722 0.9615 0.9657
0.0017 85.0 765 0.0298 0.9667 0.9722 0.9615 0.9657
0.0018 85.3333 768 0.0296 0.9667 0.9722 0.9615 0.9657
0.0016 85.6667 771 0.0294 0.9667 0.9722 0.9615 0.9657
0.0017 86.0 774 0.0294 0.9667 0.9722 0.9615 0.9657
0.0016 86.3333 777 0.0294 0.9667 0.9722 0.9615 0.9657
0.0017 86.6667 780 0.0295 0.9667 0.9722 0.9615 0.9657
0.0016 87.0 783 0.0295 0.9667 0.9722 0.9615 0.9657
0.0017 87.3333 786 0.0295 0.9667 0.9722 0.9615 0.9657
0.0017 87.6667 789 0.0295 0.9667 0.9722 0.9615 0.9657
0.0016 88.0 792 0.0293 0.9667 0.9722 0.9615 0.9657
0.0017 88.3333 795 0.0291 0.9667 0.9722 0.9615 0.9657
0.0017 88.6667 798 0.0288 0.9667 0.9722 0.9615 0.9657
0.0016 89.0 801 0.0286 0.9667 0.9722 0.9615 0.9657
0.0017 89.3333 804 0.0286 0.9667 0.9722 0.9615 0.9657
0.0016 89.6667 807 0.0286 0.9667 0.9722 0.9615 0.9657
0.0017 90.0 810 0.0285 0.9667 0.9722 0.9615 0.9657
0.0016 90.3333 813 0.0283 0.9667 0.9722 0.9615 0.9657
0.0016 90.6667 816 0.0282 0.9667 0.9722 0.9615 0.9657
0.0017 91.0 819 0.0280 0.9667 0.9722 0.9615 0.9657
0.0016 91.3333 822 0.0280 0.9667 0.9722 0.9615 0.9657
0.0017 91.6667 825 0.0279 0.9667 0.9722 0.9615 0.9657
0.0017 92.0 828 0.0279 0.9667 0.9722 0.9615 0.9657
0.0016 92.3333 831 0.0277 0.9667 0.9722 0.9615 0.9657
0.0016 92.6667 834 0.0276 0.9667 0.9722 0.9615 0.9657
0.0016 93.0 837 0.0276 0.9667 0.9722 0.9615 0.9657
0.0017 93.3333 840 0.0277 0.9667 0.9722 0.9615 0.9657
0.0016 93.6667 843 0.0277 0.9667 0.9722 0.9615 0.9657
0.0017 94.0 846 0.0278 0.9667 0.9722 0.9615 0.9657
0.0016 94.3333 849 0.0279 0.9667 0.9722 0.9615 0.9657
0.0016 94.6667 852 0.0281 0.9667 0.9722 0.9615 0.9657
0.0015 95.0 855 0.0282 0.9667 0.9722 0.9615 0.9657
0.0016 95.3333 858 0.0283 0.9667 0.9722 0.9615 0.9657
0.0016 95.6667 861 0.0283 0.9667 0.9722 0.9615 0.9657
0.0017 96.0 864 0.0283 0.9667 0.9722 0.9615 0.9657
0.0016 96.3333 867 0.0283 0.9667 0.9722 0.9615 0.9657
0.0016 96.6667 870 0.0282 0.9667 0.9722 0.9615 0.9657
0.0017 97.0 873 0.0282 0.9667 0.9722 0.9615 0.9657
0.0016 97.3333 876 0.0283 0.9667 0.9722 0.9615 0.9657
0.0015 97.6667 879 0.0284 0.9667 0.9722 0.9615 0.9657
0.0016 98.0 882 0.0284 0.9667 0.9722 0.9615 0.9657
0.0016 98.3333 885 0.0284 0.9667 0.9722 0.9615 0.9657
0.0016 98.6667 888 0.0284 0.9667 0.9722 0.9615 0.9657
0.0016 99.0 891 0.0284 0.9667 0.9722 0.9615 0.9657
0.0016 99.3333 894 0.0284 0.9667 0.9722 0.9615 0.9657
0.0016 99.6667 897 0.0284 0.9667 0.9722 0.9615 0.9657
0.0016 100.0 900 0.0284 0.9667 0.9722 0.9615 0.9657
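
The card does not say how the per-evaluation metrics above were computed, but the pattern (e.g. precision 0.2833 alongside recall 0.5 early in training) is consistent with macro-averaged metrics on a small two-class evaluation set. The following is a hedged sketch of a compute_metrics function that produces metrics of this shape; it is an assumption, not the authors' code.

```python
# Hedged guess at a compute_metrics function matching the table's metric shape.
# Macro averaging over two classes is assumed, not documented.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }
```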

Framework versions

  • Transformers 4.49.0
  • PyTorch 2.5.1+cu124
  • Datasets 3.5.0
  • Tokenizers 0.21.0