out_dir

This model is a fine-tuned version of d4data/biomedical-ner-all on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7439
  • Accuracy: 0.8286

Model description

More information needed

Intended uses & limitations

More information needed
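
No usage guidance was provided with this card. As a minimal inference sketch (assuming the checkpoint is published as DangNguyenVan258/out_dir and keeps the token-classification head inherited from d4data/biomedical-ner-all):

```python
from transformers import pipeline

# Hypothetical model id; substitute the actual checkpoint path if it differs.
ner = pipeline(
    "token-classification",
    model="DangNguyenVan258/out_dir",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

# Example sentence; the label set of the fine-tuned model is not documented.
text = "The patient was prescribed 500 mg of metformin for type 2 diabetes."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```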

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 40
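
As a rough sketch, these settings correspond approximately to the following Trainer configuration (the dataset, tokenizer, and label set are not documented, so the output path below is a placeholder):

```python
from transformers import TrainingArguments

# Approximate reconstruction of the reported hyperparameters.
training_args = TrainingArguments(
    output_dir="out_dir",            # placeholder, matching the card title
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=40,
)
```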

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.0748        | 1.0   | 9    | 1.0905          | 0.7513   |
| 0.9884        | 2.0   | 18   | 1.0329          | 0.7630   |
| 0.926         | 3.0   | 27   | 0.9917          | 0.7661   |
| 0.8732        | 4.0   | 36   | 0.9571          | 0.7717   |
| 0.8217        | 5.0   | 45   | 0.9290          | 0.7779   |
| 0.7878        | 6.0   | 54   | 0.9038          | 0.7834   |
| 0.7485        | 7.0   | 63   | 0.8815          | 0.7863   |
| 0.7075        | 8.0   | 72   | 0.8649          | 0.7962   |
| 0.6833        | 9.0   | 81   | 0.8501          | 0.7981   |
| 0.659         | 10.0  | 90   | 0.8348          | 0.7997   |
| 0.6291        | 11.0  | 99   | 0.8247          | 0.8047   |
| 0.6063        | 12.0  | 108  | 0.8142          | 0.8052   |
| 0.5911        | 13.0  | 117  | 0.8057          | 0.8088   |
| 0.5676        | 14.0  | 126  | 0.7965          | 0.8097   |
| 0.5491        | 15.0  | 135  | 0.7948          | 0.8114   |
| 0.5333        | 16.0  | 144  | 0.7852          | 0.8160   |
| 0.5263        | 17.0  | 153  | 0.7797          | 0.8170   |
| 0.513         | 18.0  | 162  | 0.7724          | 0.8180   |
| 0.4978        | 19.0  | 171  | 0.7708          | 0.8202   |
| 0.4889        | 20.0  | 180  | 0.7695          | 0.8208   |
| 0.4786        | 21.0  | 189  | 0.7646          | 0.8218   |
| 0.4653        | 22.0  | 198  | 0.7593          | 0.8233   |
| 0.4507        | 23.0  | 207  | 0.7579          | 0.8242   |
| 0.4426        | 24.0  | 216  | 0.7564          | 0.8253   |
| 0.4402        | 25.0  | 225  | 0.7508          | 0.8240   |
| 0.4254        | 26.0  | 234  | 0.7539          | 0.8251   |
| 0.4237        | 27.0  | 243  | 0.7528          | 0.8266   |
| 0.4149        | 28.0  | 252  | 0.7494          | 0.8261   |
| 0.4145        | 29.0  | 261  | 0.7482          | 0.8260   |
| 0.4021        | 30.0  | 270  | 0.7479          | 0.8278   |
| 0.3996        | 31.0  | 279  | 0.7487          | 0.8271   |
| 0.3926        | 32.0  | 288  | 0.7478          | 0.8275   |
| 0.3954        | 33.0  | 297  | 0.7468          | 0.8277   |
| 0.391         | 34.0  | 306  | 0.7473          | 0.8278   |
| 0.3899        | 35.0  | 315  | 0.7466          | 0.8279   |
| 0.3852        | 36.0  | 324  | 0.7463          | 0.8282   |
| 0.3823        | 37.0  | 333  | 0.7451          | 0.8282   |
| 0.3826        | 38.0  | 342  | 0.7439          | 0.8285   |
| 0.3813        | 39.0  | 351  | 0.7440          | 0.8287   |
| 0.379         | 40.0  | 360  | 0.7439          | 0.8286   |

Framework versions

  • Transformers 4.54.0
  • Pytorch 2.6.0+cu124
  • Datasets 4.0.0
  • Tokenizers 0.21.2