---
library_name: transformers
base_model: echodrift/terminator
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: terminator_finetune
  results: []
---

# terminator_finetune

This model is a fine-tuned version of [echodrift/terminator](https://huggingface.co/echodrift/terminator) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 4.3450
- F1: 0.4817

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4.676339096688447e-05
- train_batch_size: 16
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 40.0

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| No log | 0.9091 | 60 | 1.0333 | 0.4571 |
| No log | 1.8182 | 120 | 1.0937 | 0.4575 |
| No log | 2.7273 | 180 | 1.4988 | 0.4340 |
| No log | 3.6364 | 240 | 1.8738 | 0.4582 |
| No log | 4.5455 | 300 | 2.7333 | 0.4141 |
| No log | 5.4545 | 360 | 3.1445 | 0.4468 |
| No log | 6.3636 | 420 | 3.2106 | 0.5097 |
| No log | 7.2727 | 480 | 3.3219 | 0.4878 |
| 0.3564 | 8.1818 | 540 | 4.1566 | 0.4493 |
| 0.3564 | 9.0909 | 600 | 3.5661 | 0.4938 |
| 0.3564 | 10.0 | 660 | 3.5243 | 0.5015 |
| 0.3564 | 10.9091 | 720 | 3.7514 | 0.5057 |
| 0.3564 | 11.8182 | 780 | 4.0015 | 0.4608 |
| 0.3564 | 12.7273 | 840 | 4.4677 | 0.4278 |
| 0.3564 | 13.6364 | 900 | 4.0757 | 0.4677 |
| 0.3564 | 14.5455 | 960 | 4.4461 | 0.4501 |
| 0.0105 | 15.4545 | 1020 | 4.1675 | 0.4820 |
| 0.0105 | 16.3636 | 1080 | 4.2034 | 0.4752 |
| 0.0105 | 17.2727 | 1140 | 4.2144 | 0.4820 |
| 0.0105 | 18.1818 | 1200 | 4.2162 | 0.4871 |
| 0.0105 | 19.0909 | 1260 | 4.0772 | 0.4972 |
| 0.0105 | 20.0 | 1320 | 4.3442 | 0.4733 |
| 0.0105 | 20.9091 | 1380 | 4.2116 | 0.4912 |
| 0.0105 | 21.8182 | 1440 | 4.1968 | 0.4860 |
| 0.0008 | 22.7273 | 1500 | 4.2478 | 0.4855 |
| 0.0008 | 23.6364 | 1560 | 4.3012 | 0.5041 |
| 0.0008 | 24.5455 | 1620 | 4.6983 | 0.4779 |
| 0.0008 | 25.4545 | 1680 | 4.1226 | 0.5194 |
| 0.0008 | 26.3636 | 1740 | 4.1304 | 0.5282 |
| 0.0008 | 27.2727 | 1800 | 4.1460 | 0.5250 |
| 0.0008 | 28.1818 | 1860 | 4.1624 | 0.5271 |
| 0.0008 | 29.0909 | 1920 | 4.1758 | 0.5210 |
| 0.0008 | 30.0 | 1980 | 4.1815 | 0.5210 |
| 0.0005 | 30.9091 | 2040 | 4.1975 | 0.5154 |
| 0.0005 | 31.8182 | 2100 | 4.2007 | 0.5154 |
| 0.0005 | 32.7273 | 2160 | 4.2079 | 0.5160 |
| 0.0005 | 33.6364 | 2220 | 4.3222 | 0.4817 |
| 0.0005 | 34.5455 | 2280 | 4.3393 | 0.4817 |
| 0.0005 | 35.4545 | 2340 | 4.3413 | 0.4817 |
| 0.0005 | 36.3636 | 2400 | 4.3432 | 0.4817 |
| 0.0005 | 37.2727 | 2460 | 4.3440 | 0.4817 |
| 0.0001 | 38.1818 | 2520 | 4.3442 | 0.4817 |
| 0.0001 | 39.0909 | 2580 | 4.3442 | 0.4817 |
| 0.0001 | 40.0 | 2640 | 4.3450 | 0.4817 |

### Framework versions

- Transformers 4.47.0.dev0
- PyTorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3
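
## Reproducing the training setup

The hyperparameters listed above map directly onto `transformers.TrainingArguments`. The sketch below is a reconstruction rather than the original training script: `output_dir` is illustrative, the evaluation cadence is inferred from the results table (one evaluation roughly every 60 steps), and any arguments not listed on this card are left at their defaults.

```python
from transformers import TrainingArguments

# Reconstruction of the card's hyperparameters; values marked as assumptions
# are not taken from the card itself.
training_args = TrainingArguments(
    output_dir="terminator_finetune",        # illustrative, not from the card
    learning_rate=4.676339096688447e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    num_train_epochs=40.0,
    eval_strategy="steps",                   # assumption inferred from the results table
    eval_steps=60,                           # assumption inferred from the results table
)
```

## How to use

The task and label set of this checkpoint are not documented on this card. The snippet below is a minimal inference sketch that assumes a sequence-classification head (the card tracks F1) and a hypothetical repository id `echodrift/terminator_finetune`; adjust both to match the actual model.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "echodrift/terminator_finetune"  # hypothetical repository id

# Assumes a sequence-classification head; swap the Auto class if the task differs.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

inputs = tokenizer("Example input text", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

print(logits.argmax(dim=-1).item())  # predicted class index
```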