ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k2_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7705
  • Qwk: 0.6569
  • Mse: 0.7705
  • Rmse: 0.8778
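
Here Qwk denotes quadratic weighted kappa, a standard agreement metric for ordinal essay scores, and Rmse is simply the square root of Mse (√0.7705 ≈ 0.8778). A minimal sketch of how these metrics can be computed with scikit-learn, using made-up labels and predictions rather than values from this run:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold scores and rounded predictions -- not values from this run.
y_true = np.array([0, 1, 2, 3, 4, 2, 3])
y_pred = np.array([0, 1, 2, 2, 4, 3, 3])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # Rmse is the square root of Mse, as in the results above
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```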

Model description

More information needed

Intended uses & limitations

More information needed
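
Pending documented usage guidance, the checkpoint can presumably be loaded like any other Transformers checkpoint. A minimal sketch, assuming a single-output regression head for essay-organization scoring (suggested by the eval loss equaling the MSE, but not confirmed by this card); the essay text is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k2_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

essay = "..."  # placeholder: an Arabic essay to be scored for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().item())  # assumed: a single scalar organization score
```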

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
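
These settings map directly onto Hugging Face `TrainingArguments`. The sketch below is an assumption about how the run was configured: only the values listed above come from this card, and the output directory is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task1-organization",  # placeholder path
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```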

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:---|:---|:---|:---|:---|:---|:---|
| No log | 0.1333 | 2 | 7.1591 | 0.0 | 7.1591 | 2.6756 |
| No log | 0.2667 | 4 | 4.7877 | 0.0524 | 4.7877 | 2.1881 |
| No log | 0.4 | 6 | 3.1233 | 0.0702 | 3.1233 | 1.7673 |
| No log | 0.5333 | 8 | 2.1962 | 0.1727 | 2.1962 | 1.4819 |
| No log | 0.6667 | 10 | 1.9398 | 0.2903 | 1.9398 | 1.3928 |
| No log | 0.8 | 12 | 1.6027 | 0.1165 | 1.6027 | 1.2660 |
| No log | 0.9333 | 14 | 1.5506 | 0.1346 | 1.5506 | 1.2452 |
| No log | 1.0667 | 16 | 1.3790 | 0.2056 | 1.3790 | 1.1743 |
| No log | 1.2 | 18 | 1.2470 | 0.3186 | 1.2470 | 1.1167 |
| No log | 1.3333 | 20 | 1.1550 | 0.4333 | 1.1550 | 1.0747 |
| No log | 1.4667 | 22 | 1.0187 | 0.576 | 1.0187 | 1.0093 |
| No log | 1.6 | 24 | 1.0088 | 0.5738 | 1.0088 | 1.0044 |
| No log | 1.7333 | 26 | 1.0497 | 0.5289 | 1.0497 | 1.0246 |
| No log | 1.8667 | 28 | 1.2026 | 0.4098 | 1.2026 | 1.0966 |
| No log | 2.0 | 30 | 1.5324 | 0.4714 | 1.5324 | 1.2379 |
| No log | 2.1333 | 32 | 1.5250 | 0.4714 | 1.5250 | 1.2349 |
| No log | 2.2667 | 34 | 1.2375 | 0.4688 | 1.2375 | 1.1124 |
| No log | 2.4 | 36 | 1.0766 | 0.4 | 1.0766 | 1.0376 |
| No log | 2.5333 | 38 | 1.1511 | 0.512 | 1.1511 | 1.0729 |
| No log | 2.6667 | 40 | 1.0394 | 0.5414 | 1.0394 | 1.0195 |
| No log | 2.8 | 42 | 0.8275 | 0.7639 | 0.8275 | 0.9096 |
| No log | 2.9333 | 44 | 0.9014 | 0.6294 | 0.9014 | 0.9494 |
| No log | 3.0667 | 46 | 0.8321 | 0.6849 | 0.8321 | 0.9122 |
| No log | 3.2 | 48 | 0.9310 | 0.5985 | 0.9310 | 0.9649 |
| No log | 3.3333 | 50 | 1.1985 | 0.5255 | 1.1985 | 1.0948 |
| No log | 3.4667 | 52 | 1.2488 | 0.5522 | 1.2488 | 1.1175 |
| No log | 3.6 | 54 | 0.8848 | 0.5942 | 0.8848 | 0.9406 |
| No log | 3.7333 | 56 | 0.8724 | 0.6573 | 0.8724 | 0.9340 |
| No log | 3.8667 | 58 | 0.8395 | 0.6429 | 0.8395 | 0.9163 |
| No log | 4.0 | 60 | 0.8375 | 0.6667 | 0.8375 | 0.9151 |
| No log | 4.1333 | 62 | 1.4544 | 0.5217 | 1.4544 | 1.2060 |
| No log | 4.2667 | 64 | 1.7212 | 0.3623 | 1.7212 | 1.3119 |
| No log | 4.4 | 66 | 1.3372 | 0.5217 | 1.3372 | 1.1564 |
| No log | 4.5333 | 68 | 0.9463 | 0.6331 | 0.9463 | 0.9728 |
| No log | 4.6667 | 70 | 0.8381 | 0.6131 | 0.8381 | 0.9155 |
| No log | 4.8 | 72 | 0.9863 | 0.6143 | 0.9863 | 0.9931 |
| No log | 4.9333 | 74 | 0.8956 | 0.5909 | 0.8956 | 0.9464 |
| No log | 5.0667 | 76 | 0.9954 | 0.5985 | 0.9954 | 0.9977 |
| No log | 5.2 | 78 | 1.1145 | 0.6014 | 1.1145 | 1.0557 |
| No log | 5.3333 | 80 | 0.7857 | 0.7421 | 0.7857 | 0.8864 |
| No log | 5.4667 | 82 | 0.6092 | 0.8070 | 0.6092 | 0.7805 |
| No log | 5.6 | 84 | 0.6328 | 0.8249 | 0.6328 | 0.7955 |
| No log | 5.7333 | 86 | 0.6696 | 0.7958 | 0.6696 | 0.8183 |
| No log | 5.8667 | 88 | 0.6070 | 0.8415 | 0.6070 | 0.7791 |
| No log | 6.0 | 90 | 0.5202 | 0.8409 | 0.5202 | 0.7212 |
| No log | 6.1333 | 92 | 0.5561 | 0.8 | 0.5561 | 0.7457 |
| No log | 6.2667 | 94 | 0.6238 | 0.7222 | 0.6238 | 0.7898 |
| No log | 6.4 | 96 | 0.6279 | 0.7347 | 0.6279 | 0.7924 |
| No log | 6.5333 | 98 | 0.6088 | 0.7724 | 0.6088 | 0.7802 |
| No log | 6.6667 | 100 | 0.6743 | 0.7662 | 0.6743 | 0.8212 |
| No log | 6.8 | 102 | 0.7988 | 0.6792 | 0.7988 | 0.8937 |
| No log | 6.9333 | 104 | 0.9505 | 0.6835 | 0.9505 | 0.9750 |
| No log | 7.0667 | 106 | 0.9604 | 0.6918 | 0.9604 | 0.9800 |
| No log | 7.2 | 108 | 0.8895 | 0.6623 | 0.8895 | 0.9431 |
| No log | 7.3333 | 110 | 0.8564 | 0.6797 | 0.8564 | 0.9254 |
| No log | 7.4667 | 112 | 0.8144 | 0.6667 | 0.8144 | 0.9024 |
| No log | 7.6 | 114 | 0.7924 | 0.6761 | 0.7924 | 0.8902 |
| No log | 7.7333 | 116 | 0.8918 | 0.6569 | 0.8918 | 0.9444 |
| No log | 7.8667 | 118 | 1.0102 | 0.6015 | 1.0102 | 1.0051 |
| No log | 8.0 | 120 | 1.2182 | 0.5532 | 1.2182 | 1.1037 |
| No log | 8.1333 | 122 | 1.0953 | 0.5571 | 1.0953 | 1.0466 |
| No log | 8.2667 | 124 | 0.7035 | 0.6944 | 0.7035 | 0.8387 |
| No log | 8.4 | 126 | 0.6005 | 0.7922 | 0.6005 | 0.7749 |
| No log | 8.5333 | 128 | 0.5702 | 0.7922 | 0.5702 | 0.7551 |
| No log | 8.6667 | 130 | 0.6514 | 0.75 | 0.6514 | 0.8071 |
| No log | 8.8 | 132 | 0.8233 | 0.7125 | 0.8233 | 0.9074 |
| No log | 8.9333 | 134 | 0.7964 | 0.6968 | 0.7964 | 0.8924 |
| No log | 9.0667 | 136 | 0.7415 | 0.7152 | 0.7415 | 0.8611 |
| No log | 9.2 | 138 | 0.7455 | 0.7237 | 0.7455 | 0.8634 |
| No log | 9.3333 | 140 | 0.7137 | 0.7237 | 0.7137 | 0.8448 |
| No log | 9.4667 | 142 | 0.7399 | 0.7123 | 0.7399 | 0.8602 |
| No log | 9.6 | 144 | 0.8228 | 0.7006 | 0.8228 | 0.9071 |
| No log | 9.7333 | 146 | 0.9337 | 0.6707 | 0.9337 | 0.9663 |
| No log | 9.8667 | 148 | 0.8870 | 0.6875 | 0.8870 | 0.9418 |
| No log | 10.0 | 150 | 0.7380 | 0.6803 | 0.7380 | 0.8591 |
| No log | 10.1333 | 152 | 0.6247 | 0.7785 | 0.6247 | 0.7904 |
| No log | 10.2667 | 154 | 0.6099 | 0.7517 | 0.6099 | 0.7810 |
| No log | 10.4 | 156 | 0.6132 | 0.76 | 0.6132 | 0.7830 |
| No log | 10.5333 | 158 | 0.6745 | 0.7895 | 0.6745 | 0.8213 |
| No log | 10.6667 | 160 | 0.7912 | 0.7134 | 0.7912 | 0.8895 |
| No log | 10.8 | 162 | 0.8054 | 0.7 | 0.8054 | 0.8974 |
| No log | 10.9333 | 164 | 0.7294 | 0.7273 | 0.7294 | 0.8541 |
| No log | 11.0667 | 166 | 0.7523 | 0.7162 | 0.7523 | 0.8674 |
| No log | 11.2 | 168 | 0.7630 | 0.7162 | 0.7630 | 0.8735 |
| No log | 11.3333 | 170 | 0.8160 | 0.6849 | 0.8160 | 0.9033 |
| No log | 11.4667 | 172 | 0.9148 | 0.7114 | 0.9148 | 0.9565 |
| No log | 11.6 | 174 | 0.8240 | 0.7211 | 0.8240 | 0.9077 |
| No log | 11.7333 | 176 | 0.8015 | 0.7211 | 0.8015 | 0.8952 |
| No log | 11.8667 | 178 | 0.7172 | 0.7285 | 0.7172 | 0.8469 |
| No log | 12.0 | 180 | 0.6864 | 0.7532 | 0.6864 | 0.8285 |
| No log | 12.1333 | 182 | 0.6589 | 0.7484 | 0.6589 | 0.8117 |
| No log | 12.2667 | 184 | 0.7187 | 0.75 | 0.7187 | 0.8478 |
| No log | 12.4 | 186 | 0.9424 | 0.6747 | 0.9424 | 0.9708 |
| No log | 12.5333 | 188 | 0.9982 | 0.6624 | 0.9982 | 0.9991 |
| No log | 12.6667 | 190 | 0.8464 | 0.6980 | 0.8464 | 0.9200 |
| No log | 12.8 | 192 | 0.7313 | 0.7 | 0.7313 | 0.8552 |
| No log | 12.9333 | 194 | 0.7473 | 0.7059 | 0.7473 | 0.8645 |
| No log | 13.0667 | 196 | 0.7806 | 0.6667 | 0.7806 | 0.8835 |
| No log | 13.2 | 198 | 0.8623 | 0.6853 | 0.8623 | 0.9286 |
| No log | 13.3333 | 200 | 0.8941 | 0.6980 | 0.8941 | 0.9456 |
| No log | 13.4667 | 202 | 0.8030 | 0.7190 | 0.8030 | 0.8961 |
| No log | 13.6 | 204 | 0.6835 | 0.7211 | 0.6835 | 0.8267 |
| No log | 13.7333 | 206 | 0.6546 | 0.7517 | 0.6546 | 0.8091 |
| No log | 13.8667 | 208 | 0.6821 | 0.7248 | 0.6821 | 0.8259 |
| No log | 14.0 | 210 | 0.7493 | 0.7059 | 0.7493 | 0.8656 |
| No log | 14.1333 | 212 | 0.6796 | 0.7595 | 0.6796 | 0.8244 |
| No log | 14.2667 | 214 | 0.6097 | 0.7211 | 0.6097 | 0.7808 |
| No log | 14.4 | 216 | 0.6323 | 0.7517 | 0.6323 | 0.7952 |
| No log | 14.5333 | 218 | 0.7023 | 0.7595 | 0.7023 | 0.8380 |
| No log | 14.6667 | 220 | 0.7917 | 0.7485 | 0.7917 | 0.8898 |
| No log | 14.8 | 222 | 0.8905 | 0.7126 | 0.8905 | 0.9437 |
| No log | 14.9333 | 224 | 0.7866 | 0.7273 | 0.7866 | 0.8869 |
| No log | 15.0667 | 226 | 0.7307 | 0.7226 | 0.7307 | 0.8548 |
| No log | 15.2 | 228 | 0.6855 | 0.7467 | 0.6855 | 0.8279 |
| No log | 15.3333 | 230 | 0.7603 | 0.7162 | 0.7603 | 0.8720 |
| No log | 15.4667 | 232 | 0.8091 | 0.6853 | 0.8091 | 0.8995 |
| No log | 15.6 | 234 | 0.8997 | 0.6667 | 0.8997 | 0.9485 |
| No log | 15.7333 | 236 | 0.8520 | 0.6667 | 0.8520 | 0.9230 |
| No log | 15.8667 | 238 | 0.7621 | 0.7092 | 0.7621 | 0.8730 |
| No log | 16.0 | 240 | 0.7004 | 0.7361 | 0.7004 | 0.8369 |
| No log | 16.1333 | 242 | 0.7369 | 0.7703 | 0.7369 | 0.8585 |
| No log | 16.2667 | 244 | 0.7463 | 0.7467 | 0.7463 | 0.8639 |
| No log | 16.4 | 246 | 0.7566 | 0.7248 | 0.7566 | 0.8698 |
| No log | 16.5333 | 248 | 0.8413 | 0.6846 | 0.8413 | 0.9172 |
| No log | 16.6667 | 250 | 0.8310 | 0.6846 | 0.8310 | 0.9116 |
| No log | 16.8 | 252 | 0.7611 | 0.7211 | 0.7611 | 0.8724 |
| No log | 16.9333 | 254 | 0.7351 | 0.7133 | 0.7351 | 0.8574 |
| No log | 17.0667 | 256 | 0.7503 | 0.6980 | 0.7503 | 0.8662 |
| No log | 17.2 | 258 | 0.9328 | 0.6788 | 0.9328 | 0.9658 |
| No log | 17.3333 | 260 | 1.0674 | 0.6548 | 1.0674 | 1.0332 |
| No log | 17.4667 | 262 | 0.9606 | 0.6824 | 0.9606 | 0.9801 |
| No log | 17.6 | 264 | 0.7710 | 0.7013 | 0.7710 | 0.8781 |
| No log | 17.7333 | 266 | 0.7065 | 0.7368 | 0.7065 | 0.8406 |
| No log | 17.8667 | 268 | 0.7576 | 0.7337 | 0.7576 | 0.8704 |
| No log | 18.0 | 270 | 0.8783 | 0.6905 | 0.8783 | 0.9372 |
| No log | 18.1333 | 272 | 0.8999 | 0.6826 | 0.8999 | 0.9486 |
| No log | 18.2667 | 274 | 0.8225 | 0.7073 | 0.8225 | 0.9069 |
| No log | 18.4 | 276 | 0.7227 | 0.7368 | 0.7227 | 0.8501 |
| No log | 18.5333 | 278 | 0.6778 | 0.7516 | 0.6778 | 0.8233 |
| No log | 18.6667 | 280 | 0.6369 | 0.7654 | 0.6369 | 0.7981 |
| No log | 18.8 | 282 | 0.6252 | 0.7722 | 0.6252 | 0.7907 |
| No log | 18.9333 | 284 | 0.6745 | 0.7613 | 0.6745 | 0.8213 |
| No log | 19.0667 | 286 | 0.7654 | 0.7042 | 0.7654 | 0.8749 |
| No log | 19.2 | 288 | 0.9340 | 0.6438 | 0.9340 | 0.9664 |
| No log | 19.3333 | 290 | 0.9901 | 0.6479 | 0.9901 | 0.9950 |
| No log | 19.4667 | 292 | 0.9158 | 0.6761 | 0.9158 | 0.9570 |
| No log | 19.6 | 294 | 0.7656 | 0.7 | 0.7656 | 0.8750 |
| No log | 19.7333 | 296 | 0.6968 | 0.7465 | 0.6968 | 0.8347 |
| No log | 19.8667 | 298 | 0.7134 | 0.7234 | 0.7134 | 0.8446 |
| No log | 20.0 | 300 | 0.7965 | 0.7006 | 0.7965 | 0.8925 |
| No log | 20.1333 | 302 | 0.9796 | 0.6667 | 0.9796 | 0.9897 |
| No log | 20.2667 | 304 | 1.0642 | 0.6588 | 1.0642 | 1.0316 |
| No log | 20.4 | 306 | 1.0274 | 0.6667 | 1.0274 | 1.0136 |
| No log | 20.5333 | 308 | 0.9192 | 0.6864 | 0.9192 | 0.9588 |
| No log | 20.6667 | 310 | 0.8704 | 0.6667 | 0.8704 | 0.9329 |
| No log | 20.8 | 312 | 0.8688 | 0.6800 | 0.8688 | 0.9321 |
| No log | 20.9333 | 314 | 0.7876 | 0.6944 | 0.7876 | 0.8875 |
| No log | 21.0667 | 316 | 0.7202 | 0.6912 | 0.7202 | 0.8487 |
| No log | 21.2 | 318 | 0.6952 | 0.7111 | 0.6952 | 0.8338 |
| No log | 21.3333 | 320 | 0.6970 | 0.7338 | 0.6970 | 0.8349 |
| No log | 21.4667 | 322 | 0.6890 | 0.7429 | 0.6890 | 0.8300 |
| No log | 21.6 | 324 | 0.6939 | 0.7413 | 0.6939 | 0.8330 |
| No log | 21.7333 | 326 | 0.8418 | 0.6667 | 0.8418 | 0.9175 |
| No log | 21.8667 | 328 | 0.9336 | 0.6582 | 0.9336 | 0.9662 |
| No log | 22.0 | 330 | 0.8689 | 0.6709 | 0.8689 | 0.9322 |
| No log | 22.1333 | 332 | 0.7287 | 0.7432 | 0.7287 | 0.8536 |
| No log | 22.2667 | 334 | 0.7207 | 0.7552 | 0.7207 | 0.8489 |
| No log | 22.4 | 336 | 0.7480 | 0.7324 | 0.7480 | 0.8649 |
| No log | 22.5333 | 338 | 0.7340 | 0.7042 | 0.7340 | 0.8567 |
| No log | 22.6667 | 340 | 0.7411 | 0.6809 | 0.7411 | 0.8609 |
| No log | 22.8 | 342 | 0.8965 | 0.6797 | 0.8965 | 0.9468 |
| No log | 22.9333 | 344 | 1.1200 | 0.6509 | 1.1200 | 1.0583 |
| No log | 23.0667 | 346 | 1.1414 | 0.6463 | 1.1414 | 1.0684 |
| No log | 23.2 | 348 | 0.9674 | 0.6490 | 0.9674 | 0.9835 |
| No log | 23.3333 | 350 | 0.7491 | 0.7042 | 0.7491 | 0.8655 |
| No log | 23.4667 | 352 | 0.6221 | 0.7947 | 0.6221 | 0.7887 |
| No log | 23.6 | 354 | 0.5920 | 0.8258 | 0.5920 | 0.7694 |
| No log | 23.7333 | 356 | 0.6024 | 0.8182 | 0.6024 | 0.7761 |
| No log | 23.8667 | 358 | 0.6768 | 0.7654 | 0.6768 | 0.8227 |
| No log | 24.0 | 360 | 0.8628 | 0.6790 | 0.8628 | 0.9288 |
| No log | 24.1333 | 362 | 0.9549 | 0.6667 | 0.9549 | 0.9772 |
| No log | 24.2667 | 364 | 0.8936 | 0.6711 | 0.8936 | 0.9453 |
| No log | 24.4 | 366 | 0.7763 | 0.6950 | 0.7763 | 0.8811 |
| No log | 24.5333 | 368 | 0.7381 | 0.7518 | 0.7381 | 0.8591 |
| No log | 24.6667 | 370 | 0.7743 | 0.7143 | 0.7743 | 0.8800 |
| No log | 24.8 | 372 | 0.7473 | 0.7413 | 0.7473 | 0.8645 |
| No log | 24.9333 | 374 | 0.7225 | 0.7234 | 0.7225 | 0.8500 |
| No log | 25.0667 | 376 | 0.7859 | 0.6853 | 0.7859 | 0.8865 |
| No log | 25.2 | 378 | 0.9726 | 0.6667 | 0.9726 | 0.9862 |
| No log | 25.3333 | 380 | 1.0509 | 0.6289 | 1.0509 | 1.0251 |
| No log | 25.4667 | 382 | 0.9702 | 0.6460 | 0.9702 | 0.9850 |
| No log | 25.6 | 384 | 0.8070 | 0.7105 | 0.8070 | 0.8983 |
| No log | 25.7333 | 386 | 0.6603 | 0.7383 | 0.6603 | 0.8126 |
| No log | 25.8667 | 388 | 0.6037 | 0.8077 | 0.6037 | 0.7770 |
| No log | 26.0 | 390 | 0.5967 | 0.8077 | 0.5967 | 0.7725 |
| No log | 26.1333 | 392 | 0.6240 | 0.7651 | 0.6240 | 0.7899 |
| No log | 26.2667 | 394 | 0.7088 | 0.7368 | 0.7088 | 0.8419 |
| No log | 26.4 | 396 | 0.7735 | 0.7067 | 0.7735 | 0.8795 |
| No log | 26.5333 | 398 | 0.7934 | 0.6897 | 0.7934 | 0.8907 |
| No log | 26.6667 | 400 | 0.7827 | 0.6761 | 0.7827 | 0.8847 |
| No log | 26.8 | 402 | 0.7586 | 0.6715 | 0.7586 | 0.8710 |
| No log | 26.9333 | 404 | 0.7371 | 0.6715 | 0.7371 | 0.8586 |
| No log | 27.0667 | 406 | 0.7522 | 0.7123 | 0.7522 | 0.8673 |
| No log | 27.2 | 408 | 0.7569 | 0.7297 | 0.7569 | 0.8700 |
| No log | 27.3333 | 410 | 0.7414 | 0.7211 | 0.7414 | 0.8611 |
| No log | 27.4667 | 412 | 0.7036 | 0.6950 | 0.7036 | 0.8388 |
| No log | 27.6 | 414 | 0.7033 | 0.7324 | 0.7033 | 0.8386 |
| No log | 27.7333 | 416 | 0.7222 | 0.7376 | 0.7222 | 0.8498 |
| No log | 27.8667 | 418 | 0.7633 | 0.6714 | 0.7633 | 0.8737 |
| No log | 28.0 | 420 | 0.7897 | 0.6986 | 0.7897 | 0.8887 |
| No log | 28.1333 | 422 | 0.7657 | 0.7075 | 0.7657 | 0.8750 |
| No log | 28.2667 | 424 | 0.7561 | 0.7075 | 0.7561 | 0.8695 |
| No log | 28.4 | 426 | 0.7343 | 0.7297 | 0.7343 | 0.8569 |
| No log | 28.5333 | 428 | 0.7075 | 0.7042 | 0.7075 | 0.8412 |
| No log | 28.6667 | 430 | 0.6585 | 0.6993 | 0.6585 | 0.8115 |
| No log | 28.8 | 432 | 0.6659 | 0.6993 | 0.6659 | 0.8160 |
| No log | 28.9333 | 434 | 0.7024 | 0.7042 | 0.7024 | 0.8381 |
| No log | 29.0667 | 436 | 0.7678 | 0.6714 | 0.7678 | 0.8762 |
| No log | 29.2 | 438 | 0.8246 | 0.6522 | 0.8246 | 0.9081 |
| No log | 29.3333 | 440 | 0.8350 | 0.6522 | 0.8350 | 0.9138 |
| No log | 29.4667 | 442 | 0.8172 | 0.6522 | 0.8172 | 0.9040 |
| No log | 29.6 | 444 | 0.7619 | 0.6715 | 0.7619 | 0.8729 |
| No log | 29.7333 | 446 | 0.7177 | 0.7050 | 0.7177 | 0.8472 |
| No log | 29.8667 | 448 | 0.6903 | 0.7234 | 0.6903 | 0.8308 |
| No log | 30.0 | 450 | 0.7045 | 0.6950 | 0.7045 | 0.8394 |
| No log | 30.1333 | 452 | 0.7434 | 0.6950 | 0.7434 | 0.8622 |
| No log | 30.2667 | 454 | 0.7594 | 0.6619 | 0.7594 | 0.8714 |
| No log | 30.4 | 456 | 0.7679 | 0.6522 | 0.7679 | 0.8763 |
| No log | 30.5333 | 458 | 0.8242 | 0.6667 | 0.8242 | 0.9079 |
| No log | 30.6667 | 460 | 0.9060 | 0.6835 | 0.9060 | 0.9519 |
| No log | 30.8 | 462 | 0.9288 | 0.6835 | 0.9288 | 0.9637 |
| No log | 30.9333 | 464 | 0.8376 | 0.6755 | 0.8376 | 0.9152 |
| No log | 31.0667 | 466 | 0.7359 | 0.6853 | 0.7359 | 0.8578 |
| No log | 31.2 | 468 | 0.7001 | 0.7042 | 0.7001 | 0.8367 |
| No log | 31.3333 | 470 | 0.7119 | 0.7042 | 0.7119 | 0.8438 |
| No log | 31.4667 | 472 | 0.7656 | 0.6667 | 0.7656 | 0.8750 |
| No log | 31.6 | 474 | 0.8394 | 0.6761 | 0.8394 | 0.9162 |
| No log | 31.7333 | 476 | 0.8799 | 0.6755 | 0.8799 | 0.9380 |
| No log | 31.8667 | 478 | 0.8501 | 0.6875 | 0.8501 | 0.9220 |
| No log | 32.0 | 480 | 0.7808 | 0.7037 | 0.7808 | 0.8836 |
| No log | 32.1333 | 482 | 0.6656 | 0.725 | 0.6656 | 0.8159 |
| No log | 32.2667 | 484 | 0.5850 | 0.8054 | 0.5850 | 0.7648 |
| No log | 32.4 | 486 | 0.5780 | 0.7919 | 0.5780 | 0.7603 |
| No log | 32.5333 | 488 | 0.5805 | 0.7973 | 0.5805 | 0.7619 |
| No log | 32.6667 | 490 | 0.6103 | 0.7862 | 0.6103 | 0.7812 |
| No log | 32.8 | 492 | 0.6731 | 0.7042 | 0.6731 | 0.8205 |
| No log | 32.9333 | 494 | 0.7432 | 0.6809 | 0.7432 | 0.8621 |
| No log | 33.0667 | 496 | 0.7625 | 0.6714 | 0.7625 | 0.8732 |
| No log | 33.2 | 498 | 0.7409 | 0.6714 | 0.7409 | 0.8608 |
| 0.3345 | 33.3333 | 500 | 0.7438 | 0.6761 | 0.7438 | 0.8624 |
| 0.3345 | 33.4667 | 502 | 0.7868 | 0.6667 | 0.7868 | 0.8870 |
| 0.3345 | 33.6 | 504 | 0.8910 | 0.6994 | 0.8910 | 0.9439 |
| 0.3345 | 33.7333 | 506 | 0.9460 | 0.6786 | 0.9460 | 0.9726 |
| 0.3345 | 33.8667 | 508 | 0.8945 | 0.6826 | 0.8945 | 0.9458 |
| 0.3345 | 34.0 | 510 | 0.7660 | 0.7066 | 0.7660 | 0.8752 |
| 0.3345 | 34.1333 | 512 | 0.6677 | 0.75 | 0.6677 | 0.8171 |
| 0.3345 | 34.2667 | 514 | 0.6554 | 0.7651 | 0.6554 | 0.8096 |
| 0.3345 | 34.4 | 516 | 0.6841 | 0.7 | 0.6841 | 0.8271 |
| 0.3345 | 34.5333 | 518 | 0.7448 | 0.6906 | 0.7448 | 0.8630 |
| 0.3345 | 34.6667 | 520 | 0.8404 | 0.6370 | 0.8404 | 0.9167 |
| 0.3345 | 34.8 | 522 | 0.9012 | 0.6475 | 0.9012 | 0.9493 |
| 0.3345 | 34.9333 | 524 | 0.9121 | 0.6757 | 0.9121 | 0.9550 |
| 0.3345 | 35.0667 | 526 | 0.8488 | 0.6525 | 0.8488 | 0.9213 |
| 0.3345 | 35.2 | 528 | 0.7705 | 0.6569 | 0.7705 | 0.8778 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1