turkish-hs-degree-prediction

This model is a fine-tuned version of dbmdz/bert-base-turkish-uncased; the training dataset is not specified in the card. It achieves the following results on the evaluation set:

  • Loss: 2.4057
  • MSE: 2.4057
  • MAE: 1.0103
  • R²: 0.6519
  • Accuracy: 0.4909
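
The mix of regression metrics (MSE, MAE, R²) with an Accuracy figure suggests the model outputs a continuous degree that is rounded to a discrete label for accuracy. A minimal sketch of how such metrics are computed — the rounding scheme and the example values are assumptions for illustration, not taken from the card:

```python
# Hypothetical labels and model outputs, for illustration only.
labels = [0.0, 1.0, 2.0, 3.0, 4.0]
preds = [0.2, 1.4, 1.8, 3.6, 3.9]

n = len(labels)

# Mean squared error and mean absolute error.
mse = sum((p - y) ** 2 for p, y in zip(preds, labels)) / n
mae = sum(abs(p - y) for p, y in zip(preds, labels)) / n

# R² = 1 - (residual sum of squares / total sum of squares).
mean_y = sum(labels) / n
ss_res = sum((p - y) ** 2 for p, y in zip(preds, labels))
ss_tot = sum((y - mean_y) ** 2 for y in labels)
r2 = 1 - ss_res / ss_tot

# Accuracy: fraction of predictions that land on the true degree after rounding.
accuracy = sum(round(p) == y for p, y in zip(preds, labels)) / n
```

Note that when the training objective is MSE, the reported validation Loss and MSE coincide, as they do in this card.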

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-06
  • train_batch_size: 16
  • eval_batch_size: 20
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
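
The hyperparameters above can be expressed as a transformers `TrainingArguments` configuration; a minimal sketch, in which the output directory name is an assumption and the Adam betas/epsilon listed in the card are simply the optimizer's defaults:

```python
from transformers import TrainingArguments

# Sketch of the training configuration reported in the card.
training_args = TrainingArguments(
    output_dir="turkish-hs-degree-prediction",  # assumed name, not stated in the card
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=20,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```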

Training results

Training Loss  Epoch  Step  Validation Loss  MSE  MAE  R²  Accuracy
6.2323 0.1376 100 5.6129 5.6129 1.8340 0.1878 0.1210
5.1946 0.2751 200 4.2740 4.2740 1.5758 0.3815 0.2750
4.1391 0.4127 300 3.6789 3.6789 1.4001 0.4676 0.3203
3.643 0.5502 400 3.4047 3.4047 1.3039 0.5073 0.3664
3.1216 0.6878 500 3.2868 3.2868 1.2569 0.5244 0.3890
3.0051 0.8253 600 3.1198 3.1198 1.2476 0.5486 0.3812
2.7453 0.9629 700 2.9603 2.9603 1.1832 0.5716 0.4082
2.6567 1.1004 800 2.8558 2.8558 1.1548 0.5868 0.4291
2.6505 1.2380 900 2.8028 2.8028 1.1569 0.5944 0.4091
2.4613 1.3755 1000 2.6528 2.6528 1.1165 0.6161 0.4308
2.494 1.5131 1100 2.6037 2.6037 1.1299 0.6232 0.4117
2.5598 1.6506 1200 2.5742 2.5742 1.0774 0.6275 0.4552
2.3135 1.7882 1300 2.5063 2.5063 1.1045 0.6373 0.4169
2.3415 1.9257 1400 2.6034 2.6034 1.0952 0.6233 0.4482
2.1288 2.0633 1500 2.5379 2.5379 1.0790 0.6328 0.4456
2.1298 2.2008 1600 2.5946 2.5946 1.1048 0.6245 0.4360
2.0684 2.3384 1700 2.5250 2.5250 1.0503 0.6346 0.4656
1.9188 2.4759 1800 2.4779 2.4779 1.0466 0.6414 0.4630
2.0123 2.6135 1900 2.4917 2.4917 1.0547 0.6394 0.4639
1.9938 2.7510 2000 2.3888 2.3888 1.0420 0.6543 0.4578
1.8904 2.8886 2100 2.3776 2.3776 1.0356 0.6559 0.4569
1.9122 3.0261 2200 2.4560 2.4560 1.0394 0.6446 0.4630
1.7057 3.1637 2300 2.3577 2.3577 1.0316 0.6588 0.4621
1.7284 3.3012 2400 2.4423 2.4423 1.0436 0.6466 0.4595
1.7337 3.4388 2500 2.4151 2.4151 1.0406 0.6505 0.4682
1.6713 3.5763 2600 2.3681 2.3681 1.0152 0.6573 0.4665
1.8363 3.7139 2700 2.3612 2.3612 1.0313 0.6583 0.4621
1.719 3.8514 2800 2.4504 2.4504 1.0522 0.6454 0.4639
1.5755 3.9890 2900 2.3961 2.3961 1.0258 0.6533 0.4648
1.4174 4.1265 3000 2.4370 2.4370 1.0279 0.6473 0.4822
1.6906 4.2641 3100 2.4672 2.4672 1.0472 0.6430 0.4743
1.4928 4.4017 3200 2.5710 2.5710 1.0684 0.6280 0.4761
1.4598 4.5392 3300 2.4387 2.4387 1.0303 0.6471 0.4813
1.45 4.6768 3400 2.4174 2.4174 1.0193 0.6502 0.4856
1.3921 4.8143 3500 2.3611 2.3611 1.0084 0.6583 0.4822
1.4918 4.9519 3600 2.3255 2.3255 1.0267 0.6635 0.4621
1.3672 5.0894 3700 2.4369 2.4369 1.0370 0.6474 0.4787
1.2711 5.2270 3800 2.4173 2.4173 1.0317 0.6502 0.4752
1.2531 5.3645 3900 2.4008 2.4008 1.0170 0.6526 0.4900
1.3035 5.5021 4000 2.4040 2.4040 1.0133 0.6521 0.4891
1.3362 5.6396 4100 2.3974 2.3974 1.0042 0.6531 0.4943
1.2697 5.7772 4200 2.3708 2.3708 1.0200 0.6569 0.4708
1.2656 5.9147 4300 2.4463 2.4463 1.0346 0.6460 0.4813
1.2401 6.0523 4400 2.4412 2.4412 1.0320 0.6467 0.4778
1.1675 6.1898 4500 2.4380 2.4380 1.0226 0.6472 0.4874
1.3219 6.3274 4600 2.3325 2.3325 0.9982 0.6625 0.4943
1.1802 6.4649 4700 2.3582 2.3582 1.0133 0.6588 0.4778
1.2132 6.6025 4800 2.3610 2.3610 1.0035 0.6584 0.5013
1.109 6.7400 4900 2.4004 2.4004 1.0214 0.6526 0.4735
1.0964 6.8776 5000 2.4150 2.4150 1.0181 0.6505 0.4909
1.1356 7.0151 5100 2.4174 2.4174 1.0027 0.6502 0.5022
1.0781 7.1527 5200 2.4172 2.4172 1.0113 0.6502 0.4917
1.0545 7.2902 5300 2.4086 2.4086 1.0105 0.6515 0.4987
1.063 7.4278 5400 2.4091 2.4091 1.0082 0.6514 0.4874
1.2053 7.5653 5500 2.3560 2.3560 0.9998 0.6591 0.4970
1.0971 7.7029 5600 2.4048 2.4048 1.0113 0.6520 0.4839
1.0788 7.8404 5700 2.4743 2.4743 1.0383 0.6420 0.4648
1.0963 7.9780 5800 2.3513 2.3513 0.9999 0.6598 0.4961
1.0488 8.1155 5900 2.3650 2.3650 1.0014 0.6578 0.4978
1.013 8.2531 6000 2.4043 2.4043 1.0112 0.6521 0.4935
0.9864 8.3906 6100 2.4053 2.4053 1.0108 0.6519 0.4917
1.1504 8.5282 6200 2.4569 2.4569 1.0305 0.6445 0.4735
0.9722 8.6657 6300 2.4546 2.4546 1.0223 0.6448 0.4900
0.9892 8.8033 6400 2.3933 2.3933 1.0060 0.6537 0.4874
0.9928 8.9409 6500 2.4005 2.4005 1.0114 0.6526 0.4874
0.905 9.0784 6600 2.4132 2.4132 1.0160 0.6508 0.4830
0.9691 9.2160 6700 2.3942 2.3942 1.0092 0.6535 0.4961
0.9424 9.3535 6800 2.4073 2.4073 1.0117 0.6516 0.4900
0.8763 9.4911 6900 2.3933 2.3933 1.0009 0.6537 0.5030
1.0059 9.6286 7000 2.3984 2.3984 1.0075 0.6529 0.4952
1.0035 9.7662 7100 2.4082 2.4082 1.0112 0.6515 0.4917
1.1103 9.9037 7200 2.4057 2.4057 1.0103 0.6519 0.4909
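
Validation loss bottoms out around step 3600 (epoch ≈5) and plateaus afterward while training loss keeps falling, a typical overfitting pattern. A small sketch of picking the best checkpoint from such a log; the step/loss pairs are copied from a few rows of the table above:

```python
# (step, validation_loss) pairs copied from the table above.
history = [
    (3600, 2.3255),
    (4600, 2.3325),
    (5800, 2.3513),
    (7200, 2.4057),
]

# Select the checkpoint with the lowest validation loss.
best_step, best_loss = min(history, key=lambda row: row[1])
print(best_step, best_loss)  # → 3600 2.3255
```

In the Trainer API, setting `load_best_model_at_end=True` with `metric_for_best_model="eval_loss"` automates this selection.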

Framework versions

  • Transformers 4.45.2
  • PyTorch 2.5.1+cu124
  • Datasets 3.3.2
  • Tokenizers 0.20.3
Model size: 112M parameters (safetensors, F32)

Model tree

HrantDinkFoundation/turkish-hs-degree-prediction, fine-tuned from dbmdz/bert-base-turkish-uncased.