electra-ner

This model is a fine-tuned version of NlpHUST/electra-base-vn on the hts98/UIT dataset. It achieves the following results on the evaluation set:

  • Loss: 2.2270
  • Precision: 0.6232
  • Recall: 0.6731
  • F1: 0.6472
  • Accuracy: 0.7938
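
The card does not include a usage snippet; below is a minimal inference sketch using the transformers token-classification pipeline. The repository id hts98/electra-ner and the example sentence are assumptions, not part of the original card.

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Assumed repository id for this card; replace with the actual checkpoint path.
model_id = "hts98/electra-ner"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges word pieces back into whole entity spans.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

# Hypothetical Vietnamese example; the base model is a Vietnamese ELECTRA.
print(ner("Trường Đại học Công nghệ Thông tin nằm ở Thành phố Hồ Chí Minh."))
```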

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
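
As a hedged sketch, the hts98/UIT dataset named above could be inspected with the datasets library; the split names and feature columns are assumptions, since the card does not document them.

```python
from datasets import load_dataset

# "hts98/UIT" is the dataset id named in this card; splits/features are assumed.
dataset = load_dataset("hts98/UIT")
print(dataset)  # inspect available splits and columns (e.g. tokens, ner_tags)
```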

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of equivalent TrainingArguments follows the list):

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 120.0
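
For reference, a minimal TrainingArguments sketch matching the values listed above; the output directory is an assumption, and options not listed in the card are left at their transformers defaults (which already give Adam betas=(0.9, 0.999), epsilon=1e-08, and a linear schedule).

```python
from transformers import TrainingArguments

# Sketch only: reproduces the listed hyperparameters; anything else is assumed.
training_args = TrainingArguments(
    output_dir="electra-ner",        # assumed; not stated in the card
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=120.0,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults, matching the card.
)
```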

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 487 | 0.7563 | 0.4074 | 0.5740 | 0.4765 | 0.7624 |
| 1.1928 | 2.0 | 974 | 0.6945 | 0.4871 | 0.6226 | 0.5466 | 0.7788 |
| 0.6432 | 3.0 | 1461 | 0.7379 | 0.5035 | 0.6348 | 0.5616 | 0.7798 |
| 0.4783 | 4.0 | 1948 | 0.7334 | 0.5022 | 0.6396 | 0.5626 | 0.7849 |
| 0.3609 | 5.0 | 2435 | 0.8053 | 0.5322 | 0.6535 | 0.5866 | 0.7819 |
| 0.2735 | 6.0 | 2922 | 0.8289 | 0.5283 | 0.6401 | 0.5789 | 0.7850 |
| 0.2243 | 7.0 | 3409 | 0.9323 | 0.5463 | 0.6396 | 0.5892 | 0.7812 |
| 0.1753 | 8.0 | 3896 | 0.9913 | 0.5343 | 0.6547 | 0.5884 | 0.7798 |
| 0.147 | 9.0 | 4383 | 1.0703 | 0.5336 | 0.6535 | 0.5875 | 0.7738 |
| 0.1204 | 10.0 | 4870 | 1.0797 | 0.5470 | 0.6357 | 0.5880 | 0.7853 |
| 0.1046 | 11.0 | 5357 | 1.1283 | 0.5346 | 0.6527 | 0.5878 | 0.7804 |
| 0.0839 | 12.0 | 5844 | 1.1800 | 0.5472 | 0.6516 | 0.5949 | 0.7841 |
| 0.0702 | 13.0 | 6331 | 1.2153 | 0.5662 | 0.6600 | 0.6095 | 0.7898 |
| 0.0631 | 14.0 | 6818 | 1.2912 | 0.5451 | 0.6443 | 0.5906 | 0.7824 |
| 0.0508 | 15.0 | 7305 | 1.3288 | 0.5665 | 0.6505 | 0.6056 | 0.7855 |
| 0.0463 | 16.0 | 7792 | 1.4110 | 0.5716 | 0.6485 | 0.6076 | 0.7857 |
| 0.0387 | 17.0 | 8279 | 1.4022 | 0.5641 | 0.6563 | 0.6067 | 0.7860 |
| 0.0352 | 18.0 | 8766 | 1.4306 | 0.5540 | 0.6555 | 0.6005 | 0.7892 |
| 0.031 | 19.0 | 9253 | 1.4502 | 0.5659 | 0.6482 | 0.6043 | 0.7931 |
| 0.0286 | 20.0 | 9740 | 1.5111 | 0.5469 | 0.6572 | 0.5970 | 0.7867 |
| 0.0262 | 21.0 | 10227 | 1.6086 | 0.5745 | 0.6468 | 0.6085 | 0.7826 |
| 0.0212 | 22.0 | 10714 | 1.6134 | 0.5790 | 0.6622 | 0.6178 | 0.7845 |
| 0.0188 | 23.0 | 11201 | 1.6032 | 0.5662 | 0.6533 | 0.6066 | 0.7831 |
| 0.0179 | 24.0 | 11688 | 1.6524 | 0.5649 | 0.6561 | 0.6071 | 0.7821 |
| 0.0156 | 25.0 | 12175 | 1.6468 | 0.5643 | 0.6614 | 0.6090 | 0.7832 |
| 0.0165 | 26.0 | 12662 | 1.6683 | 0.5753 | 0.6524 | 0.6115 | 0.7857 |
| 0.0141 | 27.0 | 13149 | 1.6882 | 0.5740 | 0.6499 | 0.6096 | 0.7868 |
| 0.013 | 28.0 | 13636 | 1.7041 | 0.5761 | 0.6443 | 0.6083 | 0.7892 |
| 0.0125 | 29.0 | 14123 | 1.7665 | 0.5803 | 0.6549 | 0.6153 | 0.7868 |
| 0.0113 | 30.0 | 14610 | 1.7239 | 0.5814 | 0.6538 | 0.6155 | 0.7935 |
| 0.0115 | 31.0 | 15097 | 1.8083 | 0.5721 | 0.6535 | 0.6101 | 0.7848 |
| 0.0121 | 32.0 | 15584 | 1.7592 | 0.5660 | 0.6628 | 0.6106 | 0.7925 |
| 0.0105 | 33.0 | 16071 | 1.7803 | 0.5799 | 0.6572 | 0.6161 | 0.7882 |
| 0.0089 | 34.0 | 16558 | 1.8192 | 0.5786 | 0.6513 | 0.6128 | 0.7871 |
| 0.0107 | 35.0 | 17045 | 1.8329 | 0.5668 | 0.6597 | 0.6097 | 0.7860 |
| 0.011 | 36.0 | 17532 | 1.8010 | 0.5714 | 0.6547 | 0.6102 | 0.7834 |
| 0.0087 | 37.0 | 18019 | 1.8314 | 0.5906 | 0.6544 | 0.6208 | 0.7898 |
| 0.0075 | 38.0 | 18506 | 1.8428 | 0.5912 | 0.6577 | 0.6227 | 0.7913 |
| 0.0075 | 39.0 | 18993 | 1.8757 | 0.5816 | 0.6678 | 0.6217 | 0.7893 |
| 0.0079 | 40.0 | 19480 | 1.8514 | 0.5897 | 0.6586 | 0.6223 | 0.7897 |
| 0.0086 | 41.0 | 19967 | 1.8783 | 0.5878 | 0.6655 | 0.6242 | 0.7897 |
| 0.0075 | 42.0 | 20454 | 1.8177 | 0.5868 | 0.6644 | 0.6232 | 0.7951 |
| 0.0071 | 43.0 | 20941 | 1.8850 | 0.6038 | 0.6650 | 0.6329 | 0.7940 |
| 0.0068 | 44.0 | 21428 | 1.9210 | 0.5996 | 0.6661 | 0.6311 | 0.7918 |
| 0.006 | 45.0 | 21915 | 1.9289 | 0.5892 | 0.6630 | 0.6239 | 0.7913 |
| 0.0077 | 46.0 | 22402 | 1.9011 | 0.5876 | 0.6602 | 0.6218 | 0.7938 |
| 0.0047 | 47.0 | 22889 | 1.9092 | 0.5856 | 0.6630 | 0.6219 | 0.7934 |
| 0.0073 | 48.0 | 23376 | 1.9654 | 0.5886 | 0.6639 | 0.6240 | 0.7885 |
| 0.0058 | 49.0 | 23863 | 1.9483 | 0.5809 | 0.6639 | 0.6196 | 0.7884 |
| 0.0081 | 50.0 | 24350 | 1.9434 | 0.5995 | 0.6566 | 0.6268 | 0.7899 |
| 0.0063 | 51.0 | 24837 | 1.9490 | 0.5938 | 0.6639 | 0.6269 | 0.7903 |
| 0.0052 | 52.0 | 25324 | 1.9654 | 0.6072 | 0.6552 | 0.6303 | 0.7879 |
| 0.007 | 53.0 | 25811 | 1.9699 | 0.5967 | 0.6591 | 0.6263 | 0.7880 |
| 0.0047 | 54.0 | 26298 | 1.9713 | 0.5967 | 0.6614 | 0.6274 | 0.7909 |
| 0.0041 | 55.0 | 26785 | 1.9534 | 0.5909 | 0.6630 | 0.6249 | 0.7895 |
| 0.0042 | 56.0 | 27272 | 1.9982 | 0.6028 | 0.6630 | 0.6315 | 0.7941 |
| 0.0045 | 57.0 | 27759 | 1.9968 | 0.6058 | 0.6544 | 0.6292 | 0.7921 |
| 0.0045 | 58.0 | 28246 | 1.9851 | 0.6039 | 0.6580 | 0.6298 | 0.7905 |
| 0.0039 | 59.0 | 28733 | 2.0431 | 0.6067 | 0.6653 | 0.6346 | 0.7891 |
| 0.0048 | 60.0 | 29220 | 2.0036 | 0.5953 | 0.6494 | 0.6212 | 0.7878 |
| 0.004 | 61.0 | 29707 | 1.9971 | 0.6022 | 0.6669 | 0.6329 | 0.7914 |
| 0.0032 | 62.0 | 30194 | 2.0073 | 0.6025 | 0.6605 | 0.6302 | 0.7912 |
| 0.0033 | 63.0 | 30681 | 2.0134 | 0.5962 | 0.6608 | 0.6269 | 0.7918 |
| 0.0035 | 64.0 | 31168 | 2.0015 | 0.5981 | 0.6619 | 0.6284 | 0.7937 |
| 0.0032 | 65.0 | 31655 | 1.9974 | 0.5905 | 0.6650 | 0.6255 | 0.7940 |
| 0.0036 | 66.0 | 32142 | 2.0523 | 0.5935 | 0.6672 | 0.6282 | 0.7892 |
| 0.0027 | 67.0 | 32629 | 2.0683 | 0.6010 | 0.6695 | 0.6334 | 0.7901 |
| 0.0039 | 68.0 | 33116 | 2.1081 | 0.5919 | 0.6608 | 0.6245 | 0.7876 |
| 0.0027 | 69.0 | 33603 | 2.0555 | 0.5973 | 0.6655 | 0.6296 | 0.7923 |
| 0.003 | 70.0 | 34090 | 2.1007 | 0.5912 | 0.6614 | 0.6243 | 0.7880 |
| 0.0023 | 71.0 | 34577 | 2.0916 | 0.6085 | 0.6709 | 0.6382 | 0.7937 |
| 0.0016 | 72.0 | 35064 | 2.1564 | 0.5940 | 0.6600 | 0.6252 | 0.7908 |
| 0.0028 | 73.0 | 35551 | 2.1620 | 0.5947 | 0.6633 | 0.6272 | 0.7863 |
| 0.0028 | 74.0 | 36038 | 2.1390 | 0.5991 | 0.6683 | 0.6318 | 0.7892 |
| 0.0025 | 75.0 | 36525 | 2.1204 | 0.6026 | 0.6681 | 0.6337 | 0.7925 |
| 0.0026 | 76.0 | 37012 | 2.1700 | 0.6011 | 0.6614 | 0.6298 | 0.7884 |
| 0.0026 | 77.0 | 37499 | 2.1478 | 0.5994 | 0.6639 | 0.6300 | 0.7924 |
| 0.0022 | 78.0 | 37986 | 2.1547 | 0.5954 | 0.6650 | 0.6282 | 0.7879 |
| 0.0026 | 79.0 | 38473 | 2.1489 | 0.5851 | 0.6686 | 0.6241 | 0.7879 |
| 0.0017 | 80.0 | 38960 | 2.1789 | 0.5903 | 0.6706 | 0.6279 | 0.7870 |
| 0.0016 | 81.0 | 39447 | 2.1882 | 0.6026 | 0.6639 | 0.6318 | 0.7877 |
| 0.0014 | 82.0 | 39934 | 2.1825 | 0.6015 | 0.6711 | 0.6344 | 0.7880 |
| 0.0019 | 83.0 | 40421 | 2.1753 | 0.6013 | 0.6661 | 0.6321 | 0.7903 |
| 0.0014 | 84.0 | 40908 | 2.1887 | 0.6001 | 0.6661 | 0.6314 | 0.7911 |
| 0.0011 | 85.0 | 41395 | 2.1974 | 0.6055 | 0.6667 | 0.6346 | 0.7913 |
| 0.0019 | 86.0 | 41882 | 2.1918 | 0.6025 | 0.6678 | 0.6335 | 0.7913 |
| 0.0014 | 87.0 | 42369 | 2.1962 | 0.6133 | 0.6588 | 0.6353 | 0.7901 |
| 0.0019 | 88.0 | 42856 | 2.1974 | 0.5953 | 0.6628 | 0.6272 | 0.7902 |
| 0.0009 | 89.0 | 43343 | 2.1818 | 0.6002 | 0.6667 | 0.6317 | 0.7918 |
| 0.0016 | 90.0 | 43830 | 2.2059 | 0.6140 | 0.6706 | 0.6410 | 0.7945 |
| 0.0013 | 91.0 | 44317 | 2.2013 | 0.6086 | 0.6720 | 0.6387 | 0.7922 |
| 0.001 | 92.0 | 44804 | 2.1723 | 0.6084 | 0.6689 | 0.6372 | 0.7945 |
| 0.0012 | 93.0 | 45291 | 2.1967 | 0.6104 | 0.6706 | 0.6391 | 0.7966 |
| 0.0023 | 94.0 | 45778 | 2.2024 | 0.6157 | 0.6695 | 0.6414 | 0.7939 |
| 0.0012 | 95.0 | 46265 | 2.2250 | 0.6097 | 0.6748 | 0.6406 | 0.7929 |
| 0.0015 | 96.0 | 46752 | 2.1938 | 0.6204 | 0.6734 | 0.6458 | 0.7914 |
| 0.0012 | 97.0 | 47239 | 2.1854 | 0.6012 | 0.6801 | 0.6382 | 0.7897 |
| 0.0008 | 98.0 | 47726 | 2.2005 | 0.6199 | 0.6734 | 0.6455 | 0.7930 |
| 0.0008 | 99.0 | 48213 | 2.1999 | 0.6088 | 0.6731 | 0.6394 | 0.7896 |
| 0.0011 | 100.0 | 48700 | 2.2228 | 0.6086 | 0.6695 | 0.6376 | 0.7931 |
| 0.0006 | 101.0 | 49187 | 2.2300 | 0.6110 | 0.6784 | 0.6429 | 0.7925 |
| 0.001 | 102.0 | 49674 | 2.2194 | 0.6059 | 0.6748 | 0.6385 | 0.7917 |
| 0.0007 | 103.0 | 50161 | 2.2048 | 0.6131 | 0.6742 | 0.6422 | 0.7947 |
| 0.0003 | 104.0 | 50648 | 2.2270 | 0.6232 | 0.6731 | 0.6472 | 0.7938 |
| 0.0008 | 105.0 | 51135 | 2.2284 | 0.6184 | 0.6742 | 0.6451 | 0.7952 |
| 0.0005 | 106.0 | 51622 | 2.2278 | 0.6080 | 0.6742 | 0.6394 | 0.7921 |
| 0.0004 | 107.0 | 52109 | 2.2571 | 0.6157 | 0.6759 | 0.6444 | 0.7926 |
| 0.0006 | 108.0 | 52596 | 2.2562 | 0.6069 | 0.6723 | 0.6379 | 0.7925 |
| 0.0005 | 109.0 | 53083 | 2.2255 | 0.6172 | 0.6717 | 0.6433 | 0.7950 |
| 0.0006 | 110.0 | 53570 | 2.2429 | 0.6104 | 0.6759 | 0.6415 | 0.7931 |
| 0.0004 | 111.0 | 54057 | 2.2416 | 0.6123 | 0.6742 | 0.6418 | 0.7927 |
| 0.0004 | 112.0 | 54544 | 2.2629 | 0.6123 | 0.6689 | 0.6394 | 0.7939 |
| 0.0004 | 113.0 | 55031 | 2.2645 | 0.6136 | 0.6748 | 0.6427 | 0.7932 |
| 0.0003 | 114.0 | 55518 | 2.2761 | 0.6208 | 0.6736 | 0.6461 | 0.7944 |
| 0.0004 | 115.0 | 56005 | 2.2684 | 0.6159 | 0.6745 | 0.6438 | 0.7937 |
| 0.0004 | 116.0 | 56492 | 2.2741 | 0.6160 | 0.6736 | 0.6436 | 0.7926 |
| 0.0003 | 117.0 | 56979 | 2.2576 | 0.6160 | 0.6736 | 0.6436 | 0.7939 |
| 0.001 | 118.0 | 57466 | 2.2543 | 0.6157 | 0.6731 | 0.6431 | 0.7943 |
| 0.0003 | 119.0 | 57953 | 2.2541 | 0.6163 | 0.6739 | 0.6438 | 0.7947 |
| 0.0005 | 120.0 | 58440 | 2.2552 | 0.6164 | 0.6728 | 0.6434 | 0.7944 |
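
The Precision/Recall/F1 columns above are entity-level scores of the kind produced by seqeval in the standard transformers token-classification scripts; the sketch below shows how such numbers are typically computed. The label_list tags are placeholders, since the card does not enumerate this model's tag set.

```python
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")

# Placeholder tag set; the card does not list the actual labels of hts98/UIT.
label_list = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Positions labeled -100 are special/sub-word tokens and are skipped.
    true_labels = [
        [label_list[l] for l in row if l != -100] for row in labels
    ]
    true_predictions = [
        [label_list[p] for p, l in zip(p_row, l_row) if l != -100]
        for p_row, l_row in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```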

Framework versions

  • Transformers 4.32.0.dev0
  • PyTorch 2.1.0+cu121
  • Datasets 3.1.0
  • Tokenizers 0.13.3