histv4_ftis_pretrain_smlm

This model is a fine-tuned version of Arthur-Tsai/histv4_pretrain_smlm on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8107
  • Accuracy: 0.9403
  • Macro F1: 0.8477
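
The spread between accuracy (0.9403) and macro F1 (0.8477) points to class imbalance in the evaluation set, since macro F1 averages per-class F1 scores with equal weight. A minimal sketch of how these two numbers could be reproduced is shown below; it assumes a sequence-classification head, a standard tokenizer, and scikit-learn metrics, none of which is confirmed by this card.

```python
# Hedged sketch: reproduce the reported metrics, assuming a sequence-classification
# head and scikit-learn for scoring; neither is stated in this card. The architecture
# may be custom, hence trust_remote_code=True.
import numpy as np
import torch
from sklearn.metrics import accuracy_score, f1_score
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "Arthur-Tsai/histv4_ftis_pretrain_smlm"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained(model_id, trust_remote_code=True)
model.eval()

def evaluate(texts, labels):
    """Return the two metrics reported above for a small labeled batch."""
    inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    preds = logits.argmax(dim=-1).numpy()
    return {
        "accuracy": accuracy_score(labels, preds),
        # macro F1 averages per-class F1 without weighting by class frequency
        "macro_f1": f1_score(labels, preds, average="macro"),
    }
```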

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 8
  • eval_batch_size: 4
  • seed: 42
  • optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 6731
  • training_steps: 134625
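
As a rough guide to reproducing this configuration, the hyperparameters above map onto Transformers `TrainingArguments` roughly as sketched below; the output directory is hypothetical, and the model, tokenizer, and dataset wiring are omitted because the card does not describe them.

```python
# Hedged sketch: the hyperparameters listed above expressed as TrainingArguments.
# Model, tokenizer, and dataset setup are omitted (not documented in this card).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="histv4_ftis_pretrain_smlm",  # hypothetical output path
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=6731,
    max_steps=134625,  # "training_steps" in the list above
)
```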

Training results

Training Loss Epoch Step Validation Loss Accuracy Macro F1
41.5551 0.0010 134 23.5263 0.0544 0.0272
16.2558 1.0010 268 8.7211 0.1802 0.0537
6.9342 2.0010 402 6.4634 0.5184 0.1220
5.9178 3.0010 536 5.1432 0.5568 0.1404
5.201 4.0010 670 4.8583 0.5807 0.1479
4.7006 5.0010 804 4.0314 0.5984 0.1548
4.2736 6.0010 938 3.4220 0.6089 0.1647
3.8421 7.0009 1072 3.0184 0.6234 0.1776
3.2177 8.0009 1206 2.7418 0.6212 0.1735
3.0887 9.0009 1340 2.4012 0.6326 0.1792
2.7333 10.0009 1474 2.3004 0.6488 0.1914
2.5787 11.0009 1608 2.2017 0.6553 0.2067
2.4664 12.0009 1742 2.0509 0.6689 0.2198
2.3341 13.0009 1876 1.9938 0.6728 0.2479
2.131 14.0009 2010 2.0652 0.6776 0.2446
2.0956 15.0009 2144 1.8677 0.6889 0.2751
2.2169 16.0009 2278 1.8121 0.6926 0.2824
1.9588 17.0009 2412 1.8118 0.7018 0.3110
1.9699 18.0009 2546 1.7990 0.6966 0.3193
1.8866 19.0009 2680 1.6326 0.7145 0.3412
1.6753 20.0008 2814 1.6780 0.7134 0.3435
1.7359 21.0008 2948 1.6077 0.7326 0.3548
1.5679 22.0008 3082 1.3690 0.7725 0.4050
1.4809 23.0008 3216 1.4555 0.7674 0.4124
1.3325 24.0008 3350 1.3400 0.7898 0.4263
1.3643 25.0008 3484 1.3297 0.7869 0.4563
1.274 26.0008 3618 1.2339 0.8029 0.4740
1.2455 27.0008 3752 1.1572 0.8076 0.4825
1.1939 28.0008 3886 1.2200 0.8075 0.4868
1.0667 29.0008 4020 1.1408 0.8158 0.5129
1.0309 30.0008 4154 1.0759 0.8242 0.5365
0.9944 31.0008 4288 1.1034 0.8225 0.5291
0.9707 32.0008 4422 1.0481 0.8276 0.5551
0.9262 33.0008 4556 1.0645 0.8302 0.5516
0.9374 34.0007 4690 0.9864 0.8401 0.5692
0.8312 35.0007 4824 1.0473 0.8426 0.5848
0.8721 36.0007 4958 0.9661 0.8523 0.5934
0.7767 37.0007 5092 0.9365 0.8457 0.5993
0.7556 38.0007 5226 0.9342 0.8506 0.6013
0.7268 39.0007 5360 0.9538 0.8563 0.5919
0.6446 40.0007 5494 0.8707 0.8673 0.6246
0.6231 41.0007 5628 0.8759 0.8631 0.6210
0.5897 42.0007 5762 0.9076 0.8694 0.6320
0.5714 43.0007 5896 0.9501 0.8567 0.6286
0.598 44.0007 6030 1.0102 0.8491 0.6329
0.5199 45.0007 6164 0.8747 0.8757 0.6491
0.5363 46.0007 6298 0.8333 0.8725 0.6515
0.5191 47.0006 6432 0.9014 0.8717 0.6571
0.4919 48.0006 6566 0.9997 0.8581 0.6336
0.4623 49.0006 6700 0.8609 0.8793 0.6605
0.4495 50.0006 6834 0.8148 0.8808 0.6749
0.4388 51.0006 6968 0.9222 0.8722 0.6622
0.4272 52.0006 7102 0.9723 0.8688 0.6609
0.3909 53.0006 7236 0.7780 0.8886 0.6889
0.3846 54.0006 7370 0.8528 0.8855 0.6834
0.3653 55.0006 7504 0.7721 0.8926 0.6973
0.3885 56.0006 7638 0.7939 0.8934 0.6943
0.3468 57.0006 7772 0.7859 0.8946 0.7011
0.3341 58.0006 7906 0.7899 0.8885 0.6989
0.3416 59.0006 8040 0.7807 0.8947 0.7031
0.279 60.0005 8174 0.7811 0.8943 0.7118
0.301 61.0005 8308 0.8019 0.8912 0.6952
0.3008 62.0005 8442 0.7605 0.8967 0.7143
0.292 63.0005 8576 0.7776 0.8977 0.7150
0.285 64.0005 8710 0.8281 0.8943 0.7087
0.2631 65.0005 8844 0.8047 0.9035 0.7263
0.2593 66.0005 8978 0.7932 0.8993 0.7246
0.242 67.0005 9112 0.7740 0.9039 0.7298
0.2665 68.0005 9246 0.7261 0.9027 0.7325
0.2534 69.0005 9380 0.8071 0.9040 0.7329
0.2205 70.0005 9514 0.7483 0.9097 0.7411
0.2538 71.0005 9648 0.7166 0.9111 0.7388
0.223 72.0005 9782 0.7492 0.9100 0.7455
0.2131 73.0005 9916 0.7118 0.9126 0.7507
0.1985 74.0004 10050 0.6889 0.9079 0.7453
0.1967 75.0004 10184 0.7733 0.9042 0.7486
0.1911 76.0004 10318 0.7425 0.9132 0.7515
0.1917 77.0004 10452 0.7944 0.9094 0.7509
0.1907 78.0004 10586 0.7017 0.9141 0.7560
0.1806 79.0004 10720 0.7798 0.9142 0.7561
0.1897 80.0004 10854 0.7465 0.9163 0.7587
0.1771 81.0004 10988 0.7284 0.9149 0.7596
0.1654 82.0004 11122 0.8105 0.9079 0.7535
0.1562 83.0004 11256 0.7314 0.9132 0.7605
0.1809 84.0004 11390 0.6930 0.9175 0.7625
0.1709 85.0004 11524 0.7884 0.9150 0.7654
0.158 86.0004 11658 0.7529 0.9140 0.7635
0.1593 87.0003 11792 0.7558 0.9208 0.7668
0.1542 88.0003 11926 0.7057 0.9208 0.7722
0.144 89.0003 12060 0.7488 0.9183 0.7713
0.1559 90.0003 12194 0.7894 0.9160 0.7744
0.1395 91.0003 12328 0.8427 0.9142 0.7752
0.1354 92.0003 12462 0.7397 0.9158 0.7780
0.1485 93.0003 12596 0.7778 0.9196 0.7739
0.145 94.0003 12730 0.7901 0.9200 0.7789
0.135 95.0003 12864 0.7491 0.9237 0.7833
0.1434 96.0003 12998 0.7421 0.9176 0.7794
0.1408 97.0003 13132 0.8034 0.9204 0.7826
0.1188 98.0003 13266 0.8200 0.9187 0.7861
0.1323 99.0003 13400 0.7874 0.9185 0.7848
0.116 100.0003 13534 0.7543 0.9221 0.7865
0.1263 101.0002 13668 0.7470 0.9250 0.7888
0.1074 102.0002 13802 0.7529 0.9219 0.7870
0.1155 103.0002 13936 0.7924 0.9241 0.7924
0.124 104.0002 14070 0.7581 0.9235 0.7877
0.1146 105.0002 14204 0.7882 0.9232 0.7955
0.1155 106.0002 14338 0.8141 0.9262 0.7934
0.1187 107.0002 14472 0.6847 0.9253 0.7929
0.1113 108.0002 14606 0.7904 0.9231 0.7911
0.1079 109.0002 14740 0.7682 0.9234 0.7915
0.1114 110.0002 14874 0.7867 0.9220 0.7914
0.1134 111.0002 15008 0.7289 0.9269 0.7949
0.1099 112.0002 15142 0.7459 0.9249 0.7970
0.108 113.0002 15276 0.7593 0.9250 0.8014
0.0994 114.0001 15410 0.7299 0.9224 0.7995
0.1045 115.0001 15544 0.7006 0.9271 0.8025
0.1009 116.0001 15678 0.7219 0.9271 0.7998
0.1005 117.0001 15812 0.7966 0.9275 0.8023
0.0968 118.0001 15946 0.8041 0.9241 0.8047
0.1034 119.0001 16080 0.7535 0.9277 0.8023
0.0901 120.0001 16214 0.7161 0.9284 0.8053
0.0942 121.0001 16348 0.8485 0.9263 0.8041
0.1017 122.0001 16482 0.8011 0.9247 0.8016
0.1016 123.0001 16616 0.8531 0.9271 0.8060
0.0965 124.0001 16750 0.7779 0.9262 0.8075
0.1036 125.0001 16884 0.7804 0.9268 0.8064
0.0862 126.0001 17018 0.7374 0.9292 0.8099
0.0988 127.0001 17152 0.7485 0.9258 0.8068
0.1001 128.0000 17286 0.7971 0.9291 0.8123
0.0826 129.0000 17420 0.7788 0.9300 0.8102
0.0895 130.0000 17554 0.8128 0.9273 0.8093
0.086 131.0000 17688 0.6950 0.9308 0.8135
0.0834 132.0000 17822 0.7838 0.9287 0.8114
0.0847 133.0000 17956 0.8893 0.9295 0.8104
0.0837 133.0010 18090 0.7676 0.9319 0.8132
0.0811 134.0010 18224 0.7416 0.9279 0.8130
0.0783 135.0010 18358 0.7640 0.9318 0.8146
0.0881 136.0010 18492 0.8202 0.9310 0.8129
0.0807 137.0010 18626 0.8029 0.9268 0.8103
0.0901 138.0010 18760 0.8089 0.9294 0.8118
0.0881 139.0010 18894 0.8343 0.9311 0.8106
0.0794 140.0010 19028 0.8175 0.9263 0.8096
0.0831 141.0009 19162 0.7771 0.9320 0.8150
0.0802 142.0009 19296 0.8411 0.9266 0.8109
0.0756 143.0009 19430 0.7972 0.9261 0.8096
0.0775 144.0009 19564 0.8106 0.9324 0.8163
0.0791 145.0009 19698 0.7681 0.9317 0.8173
0.0828 146.0009 19832 0.8258 0.9275 0.8170
0.0776 147.0009 19966 0.8004 0.9274 0.8133
0.0835 148.0009 20100 0.7324 0.9322 0.8187
0.0746 149.0009 20234 0.7475 0.9320 0.8241
0.0692 150.0009 20368 0.8213 0.9299 0.8163
0.0754 151.0009 20502 0.9008 0.9300 0.8177
0.0787 152.0009 20636 0.7721 0.9339 0.8203
0.0701 153.0009 20770 0.7928 0.9312 0.8187
0.0851 154.0008 20904 0.7540 0.9297 0.8163
0.0733 155.0008 21038 0.7263 0.9352 0.8241
0.0775 156.0008 21172 0.7911 0.9317 0.8200
0.0697 157.0008 21306 0.8417 0.9342 0.8267
0.0845 158.0008 21440 0.8306 0.9319 0.8214
0.0706 159.0008 21574 0.8212 0.9316 0.8233
0.0692 160.0008 21708 0.8235 0.9298 0.8200
0.0778 161.0008 21842 0.8269 0.9329 0.8225
0.0869 162.0008 21976 0.8551 0.9324 0.8219
0.0662 163.0008 22110 0.8053 0.9324 0.8228
0.07 164.0008 22244 0.8416 0.9352 0.8264
0.069 165.0008 22378 0.8288 0.9342 0.8228
0.0705 166.0008 22512 0.7717 0.9339 0.8198
0.0672 167.0008 22646 0.7860 0.9355 0.8252
0.0714 168.0007 22780 0.7715 0.9353 0.8252
0.0669 169.0007 22914 0.9343 0.9330 0.8262
0.0636 170.0007 23048 0.8008 0.9304 0.8209
0.0691 171.0007 23182 0.7393 0.9316 0.8243
0.0658 172.0007 23316 0.7584 0.9328 0.8263
0.0683 173.0007 23450 0.7970 0.9320 0.8265
0.0718 174.0007 23584 0.9420 0.9332 0.8270
0.0714 175.0007 23718 0.8105 0.9336 0.8223
0.0673 176.0007 23852 0.8063 0.9315 0.8254
0.0685 177.0007 23986 0.8507 0.9324 0.8232
0.0672 178.0007 24120 0.8262 0.9354 0.8296
0.0666 179.0007 24254 0.8841 0.9358 0.8272
0.0623 180.0007 24388 0.8183 0.9354 0.8235
0.0616 181.0006 24522 0.7900 0.9350 0.8220
0.0625 182.0006 24656 0.7423 0.9360 0.8282
0.059 183.0006 24790 0.7458 0.9327 0.8281
0.0586 184.0006 24924 0.8024 0.9353 0.8282
0.0613 185.0006 25058 0.8241 0.9369 0.8297
0.0637 186.0006 25192 0.7518 0.9345 0.8327
0.0724 187.0006 25326 0.8642 0.9331 0.8256
0.0664 188.0006 25460 0.8348 0.9344 0.8317
0.0592 189.0006 25594 0.7909 0.9362 0.8312
0.0659 190.0006 25728 0.7717 0.9373 0.8328
0.0662 191.0006 25862 0.7877 0.9367 0.8308
0.0569 192.0006 25996 0.7889 0.9334 0.8325
0.0681 193.0006 26130 0.8186 0.9332 0.8280
0.0625 194.0005 26264 0.8294 0.9338 0.8287
0.0586 195.0005 26398 0.8620 0.9330 0.8279
0.0591 196.0005 26532 0.7812 0.9334 0.8297
0.0562 197.0005 26666 0.8911 0.9327 0.8291
0.0537 198.0005 26800 0.8063 0.9335 0.8324
0.0653 199.0005 26934 0.7315 0.9355 0.8354
0.0586 200.0005 27068 0.8556 0.9358 0.8360
0.0662 201.0005 27202 0.7125 0.9357 0.8332
0.0544 202.0005 27336 0.7896 0.9375 0.8347
0.058 203.0005 27470 0.7741 0.9346 0.8331
0.0569 204.0005 27604 0.8569 0.9358 0.8327
0.0541 205.0005 27738 0.8176 0.9360 0.8341
0.0639 206.0005 27872 0.8133 0.9347 0.8314
0.0544 207.0005 28006 0.8501 0.9379 0.8380
0.056 208.0004 28140 0.7397 0.9357 0.8334
0.0605 209.0004 28274 0.8294 0.9380 0.8344
0.0598 210.0004 28408 0.8295 0.9360 0.8314
0.0621 211.0004 28542 0.9268 0.9325 0.8285
0.0638 212.0004 28676 0.7732 0.9363 0.8370
0.0652 213.0004 28810 0.8035 0.9354 0.8364
0.0623 214.0004 28944 0.7354 0.9342 0.8313
0.0601 215.0004 29078 0.8142 0.9362 0.8346
0.0487 216.0004 29212 0.8914 0.9359 0.8362
0.0522 217.0004 29346 0.8071 0.9359 0.8349
0.0509 218.0004 29480 0.8832 0.9365 0.8364
0.059 219.0004 29614 0.9041 0.9349 0.8338
0.0548 220.0004 29748 0.7905 0.9387 0.8374
0.0615 221.0003 29882 0.9597 0.9338 0.8348
0.0539 222.0003 30016 0.8774 0.9338 0.8349
0.0558 223.0003 30150 0.7731 0.9358 0.8342
0.0627 224.0003 30284 0.7959 0.9410 0.8400
0.0515 225.0003 30418 0.9352 0.9356 0.8396
0.0519 226.0003 30552 0.8323 0.9339 0.8364
0.058 227.0003 30686 0.8642 0.9330 0.8365
0.0556 228.0003 30820 0.9974 0.9284 0.8343
0.0537 229.0003 30954 0.7829 0.9356 0.8361
0.0545 230.0003 31088 0.7687 0.9375 0.8351
0.0493 231.0003 31222 0.8174 0.9381 0.8359
0.0509 232.0003 31356 0.8749 0.9352 0.8365
0.0578 233.0003 31490 0.8092 0.9351 0.8389
0.054 234.0003 31624 0.9021 0.9320 0.8412
0.0546 235.0002 31758 0.8734 0.9371 0.8402
0.048 236.0002 31892 0.8324 0.9379 0.8424
0.0501 237.0002 32026 0.8272 0.9366 0.8400
0.0503 238.0002 32160 0.8997 0.9369 0.8430
0.047 239.0002 32294 0.8471 0.9380 0.8415
0.0449 240.0002 32428 0.8074 0.9359 0.8373
0.0485 241.0002 32562 0.8875 0.9318 0.8347
0.0546 242.0002 32696 0.7826 0.9346 0.8399
0.0601 243.0002 32830 0.8505 0.9390 0.8397
0.0485 244.0002 32964 0.8046 0.9395 0.8439
0.0536 245.0002 33098 0.8315 0.9372 0.8416
0.0463 246.0002 33232 0.8194 0.9383 0.8418
0.0536 247.0002 33366 0.8217 0.9375 0.8385
0.0461 248.0001 33500 0.7930 0.9388 0.8389
0.0487 249.0001 33634 0.8873 0.9390 0.8381
0.0603 250.0001 33768 0.7746 0.9331 0.8330
0.051 251.0001 33902 0.8490 0.9340 0.8359
0.0498 252.0001 34036 0.8098 0.9395 0.8417
0.0466 253.0001 34170 0.8295 0.9388 0.8410
0.0497 254.0001 34304 0.7591 0.9390 0.8442
0.0594 255.0001 34438 0.9901 0.9394 0.8402
0.0514 256.0001 34572 0.8113 0.9360 0.8419
0.0461 257.0001 34706 0.8267 0.9334 0.8364
0.0434 258.0001 34840 0.7760 0.9379 0.8386
0.0506 259.0001 34974 0.8669 0.9382 0.8408
0.0471 260.0001 35108 0.7772 0.9390 0.8402
0.0486 261.0001 35242 0.8321 0.9370 0.8405
0.0518 262.0000 35376 0.9146 0.9367 0.8392
0.0413 263.0000 35510 0.8359 0.9360 0.8413
0.0457 264.0000 35644 0.8839 0.9401 0.8442
0.0456 265.0000 35778 0.8783 0.9391 0.8431
0.0458 266.0000 35912 0.8283 0.9410 0.8461
0.0494 267.0000 36046 0.8519 0.9373 0.8374
0.0529 267.0010 36180 0.8561 0.9363 0.8440
0.0412 268.0010 36314 0.9018 0.9403 0.8450
0.0529 269.0010 36448 0.8497 0.9356 0.8282
0.0505 270.0010 36582 0.8640 0.9365 0.8379
0.0461 271.0010 36716 0.7976 0.9393 0.8413
0.0517 272.0010 36850 0.8718 0.9346 0.8372
0.0467 273.0010 36984 0.8357 0.9361 0.8399
0.0439 274.0010 37118 0.8862 0.9382 0.8425
0.0451 275.0009 37252 0.8501 0.9379 0.8431
0.047 276.0009 37386 0.7937 0.9384 0.8417
0.0495 277.0009 37520 0.8400 0.9368 0.8382
0.0463 278.0009 37654 0.7838 0.9355 0.8373
0.0489 279.0009 37788 0.8209 0.9400 0.8430
0.0442 280.0009 37922 0.8982 0.9344 0.8431
0.0495 281.0009 38056 0.7733 0.9387 0.8428
0.0414 282.0009 38190 0.8784 0.9394 0.8442
0.0392 283.0009 38324 0.9188 0.9371 0.8427
0.0431 284.0009 38458 0.8236 0.9404 0.8493
0.0487 285.0009 38592 0.8183 0.9396 0.8392
0.0464 286.0009 38726 0.8479 0.9374 0.8434
0.0416 287.0009 38860 0.8527 0.9396 0.8429
0.0454 288.0008 38994 0.8712 0.9354 0.8364
0.0514 289.0008 39128 0.8558 0.9394 0.8439
0.048 290.0008 39262 0.8637 0.9370 0.8394
0.0535 291.0008 39396 0.8481 0.9382 0.8410
0.048 292.0008 39530 0.8196 0.9404 0.8455
0.0406 293.0008 39664 0.8391 0.9399 0.8464
0.0473 294.0008 39798 0.8275 0.9406 0.8451
0.0468 295.0008 39932 0.7980 0.9398 0.8484
0.0419 296.0008 40066 0.8415 0.9403 0.8483
0.051 297.0008 40200 0.8705 0.9412 0.8453
0.0457 298.0008 40334 0.7646 0.9393 0.8454
0.0488 299.0008 40468 0.8858 0.9366 0.8446
0.0399 300.0008 40602 0.8592 0.9366 0.8453
0.0398 301.0008 40736 0.9023 0.9396 0.8458
0.0372 302.0007 40870 0.8517 0.9380 0.8453
0.0384 303.0007 41004 0.7671 0.9409 0.8462
0.0426 304.0007 41138 0.8307 0.9378 0.8466
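
Over the later epochs the validation loss plateaus around 0.7 to 0.9 while macro F1 continues to improve slowly. If the checkpoints retain the Trainer's `trainer_state.json`, these curves can be plotted directly from its `log_history`; the sketch below assumes that file is available locally and that the macro-F1 metric was logged under `eval_macro_f1`, which this card does not confirm.

```python
# Hedged sketch: plot the validation curves behind the table above from a
# checkpoint's trainer_state.json (path and metric key are assumptions).
import json
import matplotlib.pyplot as plt

with open("trainer_state.json") as f:
    history = json.load(f)["log_history"]

eval_logs = [entry for entry in history if "eval_loss" in entry]
steps = [entry["step"] for entry in eval_logs]

fig, loss_ax = plt.subplots()
loss_ax.plot(steps, [entry["eval_loss"] for entry in eval_logs], label="validation loss")
loss_ax.set_xlabel("step")
loss_ax.set_ylabel("validation loss")

f1_ax = loss_ax.twinx()
f1_ax.plot(steps, [entry.get("eval_macro_f1") for entry in eval_logs],
           color="tab:orange", label="macro F1")  # metric key is an assumption
f1_ax.set_ylabel("macro F1")

fig.legend(loc="upper right")
plt.show()
```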

Framework versions

  • Transformers 4.46.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.20.1