finbertv4_ftis_noPretrain

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 140.6824
  • Accuracy: 0.9460
  • Macro F1: 0.8637
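The macro F1 above is the unweighted mean of per-class F1 scores, so it penalizes weak performance on rare classes even when overall accuracy is high. A minimal pure-Python sketch of the metric (the `macro_f1` helper is illustrative, not part of this repository):

```python
def macro_f1(y_true, y_pred):
    """Macro F1: unweighted mean of per-class F1 scores."""
    labels = sorted(set(y_true) | set(y_pred))
    f1s = []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
    return sum(f1s) / len(f1s)
```

Because every class contributes equally, a 0.9460 accuracy can coexist with a 0.8637 macro F1 when some classes are harder than others.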

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 6731
  • training_steps: 134625
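With a linear scheduler and 6731 warmup steps, the learning rate ramps from 0 to 1e-4 during warmup, then decays linearly toward 0 by step 134625. A small sketch of that shape (the `linear_schedule_lr` function is a hypothetical illustration of the schedule, not the trainer's actual code):

```python
def linear_schedule_lr(step, base_lr=1e-4, warmup_steps=6731,
                       total_steps=134625):
    """Linear warmup to base_lr, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

For example, the rate is 0 at step 0, peaks at 1e-4 at step 6731, and returns to 0 at step 134625.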

Training results

Training Loss Epoch Step Validation Loss Accuracy Macro F1
45.549 0.0010 134 37.7000 0.1454 0.0430
12.1687 1.0010 268 120.4397 0.3653 0.0979
6.8804 2.0010 402 190.1079 0.5425 0.1458
6.0876 3.0010 536 177.8521 0.5840 0.1758
5.3127 4.0010 670 181.0252 0.6249 0.2063
4.4624 5.0010 804 165.3632 0.6468 0.2158
3.8216 6.0010 938 106.9821 0.6504 0.2401
3.0405 7.0009 1072 71.6789 0.6904 0.2770
2.3408 8.0009 1206 45.3098 0.6912 0.2993
2.0837 9.0009 1340 28.4964 0.7263 0.3498
1.7971 10.0009 1474 17.5005 0.7597 0.3933
1.5485 11.0009 1608 10.0478 0.7551 0.4004
1.4729 12.0009 1742 7.6028 0.7842 0.4397
1.3164 13.0009 1876 7.1881 0.8088 0.4893
1.1254 14.0009 2010 4.9870 0.8136 0.5220
1.0691 15.0009 2144 4.3065 0.8070 0.5180
1.1624 16.0009 2278 3.5147 0.8304 0.5520
0.9048 17.0009 2412 3.7801 0.8320 0.5597
0.9057 18.0009 2546 2.9275 0.8361 0.5762
0.903 19.0009 2680 2.5883 0.8463 0.6073
0.7304 20.0008 2814 2.6785 0.8540 0.6196
0.6946 21.0008 2948 2.5038 0.8464 0.6104
0.6078 22.0008 3082 2.3498 0.8606 0.6291
0.6088 23.0008 3216 2.4271 0.8530 0.6207
0.4886 24.0008 3350 2.4903 0.8647 0.6428
0.4934 25.0008 3484 2.5023 0.8676 0.6451
0.4467 26.0008 3618 3.1005 0.8743 0.6647
0.4272 27.0008 3752 2.8655 0.8730 0.6611
0.3755 28.0008 3886 3.4087 0.8734 0.6698
0.3419 29.0008 4020 3.0434 0.8803 0.6908
0.3178 30.0008 4154 4.0369 0.8847 0.6970
0.2935 31.0008 4288 4.9078 0.8787 0.6848
0.2843 32.0008 4422 5.1140 0.8852 0.6986
0.2528 33.0008 4556 5.2650 0.8851 0.6992
0.2419 34.0007 4690 5.6761 0.8939 0.7100
0.2291 35.0007 4824 5.0780 0.8919 0.7164
0.2123 36.0007 4958 6.7257 0.8803 0.7084
0.2129 37.0007 5092 6.6463 0.8879 0.7106
0.1909 38.0007 5226 7.0005 0.8976 0.7224
0.1811 39.0007 5360 7.3596 0.8996 0.7331
0.156 40.0007 5494 8.9054 0.9002 0.7356
0.1412 41.0007 5628 8.4261 0.9034 0.7396
0.1408 42.0007 5762 10.1692 0.9005 0.7437
0.1365 43.0007 5896 10.4511 0.9036 0.7492
0.1302 44.0007 6030 8.5259 0.9028 0.7474
0.115 45.0007 6164 8.8409 0.9020 0.7486
0.1145 46.0007 6298 12.7269 0.9079 0.7511
0.1086 47.0006 6432 11.0894 0.9079 0.7558
0.1066 48.0006 6566 9.6379 0.9109 0.7645
0.0865 49.0006 6700 12.0341 0.9088 0.7590
0.0902 50.0006 6834 11.4583 0.9131 0.7690
0.0857 51.0006 6968 8.3967 0.9123 0.7709
0.075 52.0006 7102 12.6557 0.9151 0.7722
0.0814 53.0006 7236 13.2238 0.9165 0.7808
0.077 54.0006 7370 10.6487 0.9161 0.7797
0.0639 55.0006 7504 11.9067 0.9181 0.7828
0.0648 56.0006 7638 11.7533 0.9161 0.7821
0.0602 57.0006 7772 10.3832 0.9206 0.7910
0.0538 58.0006 7906 12.8221 0.9213 0.7850
0.0586 59.0006 8040 11.5703 0.9208 0.7906
0.0414 60.0005 8174 11.2536 0.9214 0.7923
0.0395 61.0005 8308 13.5535 0.9246 0.7960
0.0518 62.0005 8442 13.5787 0.9223 0.7922
0.0441 63.0005 8576 11.8294 0.9198 0.7890
0.0464 64.0005 8710 12.4562 0.9192 0.7936
0.0423 65.0005 8844 8.8058 0.9243 0.8010
0.0455 66.0005 8978 12.6342 0.9232 0.7962
0.0372 67.0005 9112 13.5254 0.9261 0.8008
0.0386 68.0005 9246 12.5933 0.9269 0.8044
0.0351 69.0005 9380 16.2459 0.9235 0.8031
0.0337 70.0005 9514 10.9077 0.9246 0.8031
0.0362 71.0005 9648 11.3444 0.9279 0.8057
0.0289 72.0005 9782 12.8704 0.9276 0.8114
0.0277 73.0005 9916 12.9896 0.9232 0.8039
0.0337 74.0004 10050 14.7879 0.9243 0.8070
0.038 75.0004 10184 13.4084 0.9221 0.7950
0.0389 76.0004 10318 15.4665 0.9208 0.7877
0.0331 77.0004 10452 13.3072 0.9252 0.8066
0.0309 78.0004 10586 12.6113 0.9238 0.8079
0.0234 79.0004 10720 14.3185 0.9298 0.8135
0.0224 80.0004 10854 18.7701 0.9309 0.8149
0.0249 81.0004 10988 14.7982 0.9268 0.8159
0.0293 82.0004 11122 16.2099 0.9252 0.8096
0.0375 83.0004 11256 13.1446 0.9222 0.8083
0.0398 84.0004 11390 12.2897 0.9270 0.8069
0.0329 85.0004 11524 9.9588 0.9277 0.8101
0.0264 86.0004 11658 15.5504 0.9309 0.8207
0.0212 87.0003 11792 16.3288 0.9319 0.8223
0.0201 88.0003 11926 16.9663 0.9327 0.8222
0.0197 89.0003 12060 23.1126 0.9328 0.8241
0.0195 90.0003 12194 20.7772 0.9316 0.8216
0.0217 91.0003 12328 20.8325 0.9297 0.8180
0.0228 92.0003 12462 20.2237 0.9286 0.8192
0.0229 93.0003 12596 19.9432 0.9331 0.8228
0.0257 94.0003 12730 20.8237 0.9319 0.8147
0.0164 95.0003 12864 24.2862 0.9347 0.8267
0.0174 96.0003 12998 26.0493 0.9363 0.8256
0.0276 97.0003 13132 19.0496 0.9289 0.8211
0.0252 98.0003 13266 11.8151 0.9322 0.8221
0.0239 99.0003 13400 26.1875 0.9295 0.8220
0.032 100.0003 13534 21.7392 0.9347 0.8245
0.0725 101.0002 13668 17.1267 0.9274 0.8222
0.0326 102.0002 13802 16.2185 0.9338 0.8279
0.0311 103.0002 13936 16.9710 0.9314 0.8216
0.0226 104.0002 14070 25.8568 0.9327 0.8270
0.0166 105.0002 14204 22.9278 0.9368 0.8334
0.0124 106.0002 14338 25.7349 0.9392 0.8366
0.0107 107.0002 14472 32.1370 0.9384 0.8322
0.0104 108.0002 14606 33.9868 0.9382 0.8338
0.0245 109.0002 14740 19.0870 0.9359 0.8335
0.0148 110.0002 14874 27.1582 0.9343 0.8213
0.0123 111.0002 15008 21.2497 0.9382 0.8322
0.0121 112.0002 15142 21.2890 0.9396 0.8364
0.0098 113.0002 15276 33.8945 0.9322 0.8336
0.0127 114.0001 15410 23.7403 0.9354 0.8340
0.016 115.0001 15544 36.1000 0.9346 0.8317
0.0159 116.0001 15678 34.1377 0.9352 0.8306
0.0238 117.0001 15812 29.9560 0.9336 0.8289
0.0215 118.0001 15946 23.3953 0.9352 0.8337
0.0196 119.0001 16080 21.2258 0.9328 0.8361
0.016 120.0001 16214 17.8038 0.9359 0.8368
0.0317 121.0001 16348 19.2920 0.9325 0.8351
0.0144 122.0001 16482 29.3245 0.9377 0.8390
0.0142 123.0001 16616 29.8097 0.9377 0.8402
0.011 124.0001 16750 35.3951 0.9388 0.8432
0.0094 125.0001 16884 26.2506 0.9370 0.8426
0.0116 126.0001 17018 31.6060 0.9262 0.8371
0.0102 127.0001 17152 31.9086 0.9345 0.8435
0.0146 128.0000 17286 30.6725 0.9363 0.8391
0.0126 129.0000 17420 22.8775 0.9370 0.8364
0.0108 130.0000 17554 27.2359 0.9397 0.8434
0.0097 131.0000 17688 25.9838 0.9393 0.8417
0.008 132.0000 17822 38.6480 0.9406 0.8445
0.0067 133.0000 17956 49.8206 0.9410 0.8463
0.0058 133.0010 18090 49.0981 0.9399 0.8446
0.0065 134.0010 18224 37.6428 0.9398 0.8425
0.0062 135.0010 18358 46.4948 0.9381 0.8376
0.0065 136.0010 18492 33.9389 0.9409 0.8474
0.0344 137.0010 18626 17.4156 0.9336 0.8368
0.0194 138.0010 18760 20.2359 0.9378 0.8407
0.0205 139.0010 18894 24.1386 0.9416 0.8477
0.0091 140.0010 19028 38.4560 0.9403 0.8431
0.008 141.0009 19162 46.6457 0.9430 0.8494
0.0076 142.0009 19296 67.7721 0.9415 0.8450
0.006 143.0009 19430 53.1972 0.9438 0.8506
0.006 144.0009 19564 55.7990 0.9435 0.8510
0.0063 145.0009 19698 63.8235 0.9430 0.8528
0.0046 146.0009 19832 60.1267 0.9433 0.8506
0.0043 147.0009 19966 51.8146 0.9423 0.8491
0.0036 148.0009 20100 53.4295 0.9433 0.8526
0.0052 149.0009 20234 73.8274 0.9428 0.8525
0.0066 150.0009 20368 50.7785 0.9402 0.8476
0.0467 151.0009 20502 12.7505 0.9272 0.8218
0.0183 152.0009 20636 16.3303 0.9334 0.8400
0.017 153.0009 20770 32.9632 0.9407 0.8472
0.0054 154.0008 20904 45.1417 0.9422 0.8513
0.0046 155.0008 21038 58.7558 0.9427 0.8532
0.0037 156.0008 21172 71.0211 0.9435 0.8521
0.0036 157.0008 21306 78.0138 0.9436 0.8552
0.0035 158.0008 21440 70.2300 0.9429 0.8551
0.0032 159.0008 21574 84.0004 0.9429 0.8541
0.0031 160.0008 21708 76.8712 0.9437 0.8545
0.0034 161.0008 21842 63.4302 0.9436 0.8523
0.0069 162.0008 21976 36.4633 0.9415 0.8474
0.0135 163.0008 22110 15.8481 0.9342 0.8337
0.0183 164.0008 22244 19.2228 0.9341 0.8387
0.017 165.0008 22378 26.5695 0.9372 0.8459
0.0187 166.0008 22512 40.8066 0.9402 0.8527
0.0057 167.0008 22646 66.2340 0.9401 0.8451
0.0061 168.0007 22780 65.7938 0.9422 0.8555
0.0091 169.0007 22914 38.5296 0.9372 0.8454
0.0074 170.0007 23048 37.0112 0.9359 0.8422
0.0096 171.0007 23182 58.2084 0.9409 0.8490
0.0173 172.0007 23316 43.4838 0.9434 0.8525
0.0095 173.0007 23450 53.2788 0.9409 0.8555
0.0074 174.0007 23584 49.6918 0.9419 0.8589
0.0043 175.0007 23718 73.2480 0.9405 0.8538
0.0029 176.0007 23852 77.7240 0.9428 0.8590
0.0045 177.0007 23986 61.1120 0.9415 0.8508
0.0056 178.0007 24120 77.8616 0.9438 0.8572
0.0045 179.0007 24254 89.1427 0.9419 0.8544
0.0098 180.0007 24388 82.9110 0.9440 0.8579
0.0028 181.0006 24522 106.0760 0.9444 0.8607
0.003 182.0006 24656 112.8113 0.9445 0.8580
0.0028 183.0006 24790 87.1914 0.9437 0.8526
0.0037 184.0006 24924 100.4757 0.9445 0.8573
0.0028 185.0006 25058 125.8635 0.9443 0.8576
0.0028 186.0006 25192 122.9104 0.9452 0.8594
0.0029 187.0006 25326 137.2739 0.9422 0.8549
0.0074 188.0006 25460 76.9213 0.9421 0.8482
0.007 189.0006 25594 70.3398 0.9424 0.8543
0.0191 190.0006 25728 47.5316 0.9443 0.8576
0.0102 191.0006 25862 39.1774 0.9422 0.8566
0.0123 192.0006 25996 46.5033 0.9423 0.8555
0.0134 193.0006 26130 44.7733 0.9423 0.8523
0.0343 194.0005 26264 58.8501 0.9384 0.8524
0.0046 195.0005 26398 100.5458 0.9397 0.8571
0.0036 196.0005 26532 95.4706 0.9462 0.8600
0.0025 197.0005 26666 118.5605 0.9457 0.8605
0.002 198.0005 26800 115.6680 0.9458 0.8614
0.0022 199.0005 26934 95.8778 0.9445 0.8612
0.0019 200.0005 27068 105.3724 0.9460 0.8634
0.0016 201.0005 27202 138.2059 0.9460 0.8637
0.0018 202.0005 27336 115.5890 0.9448 0.8606
0.0084 203.0005 27470 94.2083 0.9421 0.8561
0.0032 204.0005 27604 88.8184 0.9439 0.8618
0.006 205.0005 27738 97.2172 0.9425 0.8586
0.0032 206.0005 27872 89.8283 0.9407 0.8597
0.008 207.0005 28006 65.2748 0.9440 0.8575
0.0071 208.0004 28140 82.2443 0.9455 0.8592
0.0055 209.0004 28274 94.2593 0.9438 0.8558
0.0119 210.0004 28408 87.6687 0.9425 0.8502
0.0118 211.0004 28542 42.2785 0.9392 0.8534
0.0091 212.0004 28676 59.0222 0.9417 0.8568
0.007 213.0004 28810 79.9297 0.9403 0.8590
0.0056 214.0004 28944 109.2196 0.9391 0.8515
0.0068 215.0004 29078 90.4840 0.9422 0.8598
0.0043 216.0004 29212 86.8893 0.9425 0.8565
0.0056 217.0004 29346 99.9204 0.9427 0.8611
0.0038 218.0004 29480 98.2803 0.9396 0.8587
0.0025 219.0004 29614 125.4547 0.9442 0.8625
0.0016 220.0004 29748 126.0455 0.9447 0.8631
0.0015 221.0003 29882 147.4570 0.9441 0.8612
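The headline accuracy (0.9460) and macro F1 (0.8637) match the row at step 27202 (epoch ~201), where macro F1 peaks in the table. A tiny sketch of selecting the best checkpoint by macro F1 from a few transcribed (step, macro F1) pairs:

```python
# A small sample of (step, macro_f1) pairs transcribed from the table above.
rows = [(18090, 0.8446), (27202, 0.8637), (29882, 0.8612)]

# Pick the checkpoint with the highest macro F1.
best_step, best_f1 = max(rows, key=lambda r: r[1])
print(best_step, best_f1)
```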

Framework versions

  • Transformers 4.46.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.20.1
Model details

  • Model size: 130M parameters
  • Tensor type: F32 (Safetensors)