# arabic-hs-degree-prediction
This model is a fine-tuned version of aubmindlab/bert-base-arabert on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 3.1882
- MSE: 3.1882
- MAE: 1.1956
- R2: 0.5149
- Accuracy: 0.4225
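
A minimal inference sketch, assuming the checkpoint exposes a single-output regression head (the MSE/MAE/R2 metrics above suggest the degree is predicted as a continuous value); the input text is a placeholder:

```python
# Hedged inference sketch: assumes a single-output regression head.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "HrantDinkFoundation/arabic-hs-degree-prediction"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "..."  # placeholder: an Arabic input sentence
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 1) for a regression head
print(logits.squeeze().item())       # predicted degree as a continuous score
```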
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 20
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
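
As a sketch, these hyperparameters map onto `transformers.TrainingArguments` as shown below, assuming the model was trained with the `Trainer` API; the `output_dir` is illustrative:

```python
# Hedged sketch: the hyperparameters above expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabic-hs-degree-prediction",  # illustrative path
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=20,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08, per this card
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```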
### Training results
| Training Loss | Epoch  | Step | Validation Loss | MSE | MAE | R2 | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:---:|:---:|:--:|:--------:|
5.7507 | 0.1017 | 100 | 6.2822 | 6.2822 | 1.9385 | 0.0441 | 0.0830 |
5.299 | 0.2035 | 200 | 5.5817 | 5.5817 | 1.8516 | 0.1507 | 0.1395 |
5.53 | 0.3052 | 300 | 5.2052 | 5.2052 | 1.6906 | 0.2080 | 0.2566 |
4.8912 | 0.4069 | 400 | 4.8571 | 4.8571 | 1.6205 | 0.2610 | 0.2624 |
4.3919 | 0.5086 | 500 | 4.4455 | 4.4455 | 1.5954 | 0.3236 | 0.2740 |
4.4037 | 0.6104 | 600 | 4.2926 | 4.2926 | 1.5719 | 0.3469 | 0.2469 |
4.3753 | 0.7121 | 700 | 4.3337 | 4.3337 | 1.4822 | 0.3406 | 0.3190 |
3.9158 | 0.8138 | 800 | 4.3284 | 4.3284 | 1.4422 | 0.3414 | 0.3492 |
3.879 | 0.9156 | 900 | 3.9436 | 3.9436 | 1.4679 | 0.4000 | 0.2740 |
4.0891 | 1.0173 | 1000 | 3.7468 | 3.7468 | 1.4508 | 0.4299 | 0.2502 |
3.4442 | 1.1190 | 1100 | 3.8175 | 3.8175 | 1.4860 | 0.4191 | 0.2534 |
3.6186 | 1.2208 | 1200 | 3.6917 | 3.6917 | 1.4358 | 0.4383 | 0.2920 |
3.3972 | 1.3225 | 1300 | 3.5518 | 3.5518 | 1.3380 | 0.4596 | 0.3434 |
3.1744 | 1.4242 | 1400 | 3.6608 | 3.6608 | 1.4128 | 0.4430 | 0.3042 |
3.2872 | 1.5259 | 1500 | 3.4593 | 3.4593 | 1.3247 | 0.4736 | 0.3383 |
3.3208 | 1.6277 | 1600 | 3.4926 | 3.4926 | 1.3261 | 0.4686 | 0.3383 |
3.3749 | 1.7294 | 1700 | 3.4436 | 3.4436 | 1.2935 | 0.4760 | 0.3685 |
3.216 | 1.8311 | 1800 | 3.4275 | 3.4275 | 1.2742 | 0.4785 | 0.3711 |
2.9892 | 1.9329 | 1900 | 3.3504 | 3.3504 | 1.3199 | 0.4902 | 0.3222 |
2.9693 | 2.0346 | 2000 | 3.2945 | 3.2945 | 1.2992 | 0.4987 | 0.3434 |
2.7422 | 2.1363 | 2100 | 3.3692 | 3.3692 | 1.2683 | 0.4874 | 0.3865 |
2.6642 | 2.2380 | 2200 | 3.2903 | 3.2903 | 1.2463 | 0.4994 | 0.3910 |
2.6804 | 2.3398 | 2300 | 3.4485 | 3.4485 | 1.3091 | 0.4753 | 0.3659 |
2.8668 | 2.4415 | 2400 | 3.2589 | 3.2589 | 1.2682 | 0.5041 | 0.3698 |
2.9279 | 2.5432 | 2500 | 3.2159 | 3.2159 | 1.2381 | 0.5107 | 0.4006 |
2.5987 | 2.6450 | 2600 | 3.2272 | 3.2272 | 1.2692 | 0.5090 | 0.3621 |
2.5969 | 2.7467 | 2700 | 3.0977 | 3.0977 | 1.2095 | 0.5287 | 0.3923 |
2.603 | 2.8484 | 2800 | 3.2001 | 3.2001 | 1.2396 | 0.5131 | 0.3730 |
2.6199 | 2.9502 | 2900 | 3.3361 | 3.3361 | 1.3132 | 0.4924 | 0.3428 |
2.4894 | 3.0519 | 3000 | 3.0607 | 3.0607 | 1.2034 | 0.5343 | 0.3852 |
2.2048 | 3.1536 | 3100 | 3.3162 | 3.3162 | 1.3002 | 0.4954 | 0.3428 |
2.3359 | 3.2553 | 3200 | 3.1354 | 3.1354 | 1.1810 | 0.5229 | 0.4251 |
2.1222 | 3.3571 | 3300 | 3.1532 | 3.1532 | 1.1849 | 0.5202 | 0.4386 |
2.34 | 3.4588 | 3400 | 3.1854 | 3.1854 | 1.2317 | 0.5153 | 0.3749 |
2.1876 | 3.5605 | 3500 | 3.0598 | 3.0598 | 1.1787 | 0.5344 | 0.4174 |
2.3496 | 3.6623 | 3600 | 3.1563 | 3.1563 | 1.2685 | 0.5198 | 0.3312 |
2.2078 | 3.7640 | 3700 | 3.0814 | 3.0814 | 1.1775 | 0.5312 | 0.4322 |
2.2646 | 3.8657 | 3800 | 3.0456 | 3.0456 | 1.1873 | 0.5366 | 0.4006 |
2.2826 | 3.9674 | 3900 | 2.9952 | 2.9952 | 1.1629 | 0.5443 | 0.4232 |
2.2043 | 4.0692 | 4000 | 3.1038 | 3.1038 | 1.1657 | 0.5277 | 0.4412 |
2.0381 | 4.1709 | 4100 | 3.0986 | 3.0986 | 1.1875 | 0.5285 | 0.4077 |
1.771 | 4.2726 | 4200 | 3.1262 | 3.1262 | 1.1907 | 0.5243 | 0.4096 |
1.9437 | 4.3744 | 4300 | 3.0050 | 3.0050 | 1.1789 | 0.5428 | 0.4096 |
2.0255 | 4.4761 | 4400 | 3.0807 | 3.0807 | 1.1714 | 0.5313 | 0.4315 |
1.8701 | 4.5778 | 4500 | 3.1328 | 3.1328 | 1.1720 | 0.5233 | 0.4328 |
2.1164 | 4.6796 | 4600 | 3.1848 | 3.1848 | 1.1969 | 0.5154 | 0.4212 |
1.9437 | 4.7813 | 4700 | 3.0139 | 3.0139 | 1.1639 | 0.5414 | 0.4186 |
1.911 | 4.8830 | 4800 | 3.0545 | 3.0545 | 1.1866 | 0.5352 | 0.4013 |
2.0767 | 4.9847 | 4900 | 3.0239 | 3.0239 | 1.1791 | 0.5399 | 0.4122 |
1.5068 | 5.0865 | 5000 | 3.0965 | 3.0965 | 1.1789 | 0.5289 | 0.4264 |
1.6397 | 5.1882 | 5100 | 3.1433 | 3.1433 | 1.1853 | 0.5217 | 0.4154 |
1.7398 | 5.2899 | 5200 | 3.1553 | 3.1553 | 1.1764 | 0.5199 | 0.4367 |
1.703 | 5.3917 | 5300 | 3.1266 | 3.1266 | 1.1768 | 0.5243 | 0.4315 |
1.7731 | 5.4934 | 5400 | 3.2271 | 3.2271 | 1.2336 | 0.5090 | 0.3865 |
1.8083 | 5.5951 | 5500 | 3.3437 | 3.3437 | 1.2578 | 0.4912 | 0.3743 |
1.5717 | 5.6968 | 5600 | 3.1819 | 3.1819 | 1.1758 | 0.5158 | 0.4450 |
1.7819 | 5.7986 | 5700 | 3.0774 | 3.0774 | 1.1895 | 0.5318 | 0.4096 |
1.6773 | 5.9003 | 5800 | 3.3598 | 3.3598 | 1.2407 | 0.4888 | 0.3968 |
1.7355 | 6.0020 | 5900 | 3.1103 | 3.1103 | 1.1776 | 0.5268 | 0.4309 |
1.4878 | 6.1038 | 6000 | 3.3476 | 3.3476 | 1.2364 | 0.4906 | 0.4032 |
1.4709 | 6.2055 | 6100 | 3.0955 | 3.0955 | 1.1916 | 0.5290 | 0.4045 |
1.5061 | 6.3072 | 6200 | 3.0975 | 3.0975 | 1.1903 | 0.5287 | 0.4161 |
1.6804 | 6.4090 | 6300 | 3.1653 | 3.1653 | 1.2117 | 0.5184 | 0.4006 |
1.5214 | 6.5107 | 6400 | 3.1052 | 3.1052 | 1.1593 | 0.5275 | 0.4412 |
1.375 | 6.6124 | 6500 | 3.3933 | 3.3933 | 1.2594 | 0.4837 | 0.3788 |
1.5458 | 6.7141 | 6600 | 3.1259 | 3.1259 | 1.1923 | 0.5244 | 0.4135 |
1.3139 | 6.8159 | 6700 | 3.1141 | 3.1141 | 1.1774 | 0.5262 | 0.4257 |
1.3872 | 6.9176 | 6800 | 3.1789 | 3.1789 | 1.2179 | 0.5163 | 0.3949 |
1.5664 | 7.0193 | 6900 | 3.1296 | 3.1296 | 1.1811 | 0.5238 | 0.4264 |
1.3352 | 7.1211 | 7000 | 3.1715 | 3.1715 | 1.2001 | 0.5174 | 0.4077 |
1.4608 | 7.2228 | 7100 | 3.1279 | 3.1279 | 1.1844 | 0.5241 | 0.4264 |
1.3192 | 7.3245 | 7200 | 3.2494 | 3.2494 | 1.2249 | 0.5056 | 0.4026 |
1.3222 | 7.4262 | 7300 | 3.2562 | 3.2562 | 1.2057 | 0.5046 | 0.4199 |
1.3426 | 7.5280 | 7400 | 3.1636 | 3.1636 | 1.1844 | 0.5186 | 0.4225 |
1.1591 | 7.6297 | 7500 | 3.1366 | 3.1366 | 1.1895 | 0.5227 | 0.4167 |
1.3475 | 7.7314 | 7600 | 3.1563 | 3.1563 | 1.1891 | 0.5198 | 0.4193 |
1.3258 | 7.8332 | 7700 | 3.1369 | 3.1369 | 1.1841 | 0.5227 | 0.4225 |
1.4465 | 7.9349 | 7800 | 3.1352 | 3.1352 | 1.1856 | 0.5230 | 0.4232 |
1.4019 | 8.0366 | 7900 | 3.1505 | 3.1505 | 1.1820 | 0.5206 | 0.4315 |
1.2362 | 8.1384 | 8000 | 3.1832 | 3.1832 | 1.1930 | 0.5157 | 0.4270 |
1.3815 | 8.2401 | 8100 | 3.1928 | 3.1928 | 1.2023 | 0.5142 | 0.4161 |
1.2297 | 8.3418 | 8200 | 3.0869 | 3.0869 | 1.1768 | 0.5303 | 0.4232 |
1.2599 | 8.4435 | 8300 | 3.1602 | 3.1602 | 1.1933 | 0.5192 | 0.4116 |
1.3223 | 8.5453 | 8400 | 3.1198 | 3.1198 | 1.1772 | 0.5253 | 0.4277 |
1.262 | 8.6470 | 8500 | 3.1153 | 3.1153 | 1.1791 | 0.5260 | 0.4277 |
1.3021 | 8.7487 | 8600 | 3.2229 | 3.2229 | 1.2018 | 0.5096 | 0.4135 |
1.1122 | 8.8505 | 8700 | 3.1321 | 3.1321 | 1.1706 | 0.5234 | 0.4341 |
1.1733 | 8.9522 | 8800 | 3.1953 | 3.1953 | 1.2024 | 0.5138 | 0.4116 |
1.2663 | 9.0539 | 8900 | 3.1807 | 3.1807 | 1.1836 | 0.5160 | 0.4264 |
1.1937 | 9.1556 | 9000 | 3.1440 | 3.1440 | 1.1798 | 0.5216 | 0.4257 |
1.1342 | 9.2574 | 9100 | 3.1765 | 3.1765 | 1.1867 | 0.5167 | 0.4225 |
1.1479 | 9.3591 | 9200 | 3.1854 | 3.1854 | 1.1847 | 0.5153 | 0.4257 |
1.1885 | 9.4608 | 9300 | 3.2090 | 3.2090 | 1.1913 | 0.5117 | 0.4302 |
1.231 | 9.5626 | 9400 | 3.1888 | 3.1888 | 1.1956 | 0.5148 | 0.4167 |
1.1874 | 9.6643 | 9500 | 3.1671 | 3.1671 | 1.1905 | 0.5181 | 0.4244 |
1.1844 | 9.7660 | 9600 | 3.1895 | 3.1895 | 1.1938 | 0.5147 | 0.4219 |
1.1984 | 9.8678 | 9700 | 3.1726 | 3.1726 | 1.1896 | 0.5173 | 0.4225 |
1.1881 | 9.9695 | 9800 | 3.1882 | 3.1882 | 1.1956 | 0.5149 | 0.4225 |
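
One plausible `compute_metrics` function consistent with the columns above is sketched below; the rounding rule used for Accuracy is an assumption, since the card does not document how a continuous prediction is mapped to a discrete degree:

```python
# Hedged sketch of metric computation matching the table columns.
import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

def compute_metrics(eval_pred):
    preds, labels = eval_pred
    preds = np.asarray(preds).squeeze()
    return {
        "mse": mean_squared_error(labels, preds),
        "mae": mean_absolute_error(labels, preds),
        "r2": r2_score(labels, preds),
        # Assumption: degrees are integer-valued, so accuracy counts
        # predictions that round to the true degree.
        "accuracy": float(np.mean(np.round(preds) == labels)),
    }
```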
### Framework versions
- Transformers 4.45.2
- PyTorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.20.3