---
library_name: transformers
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: videomae-base-finetuned-ucf101-subset
    results: []
---

# videomae-base-finetuned-ucf101-subset

This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.9989
- Accuracy: 0.5144
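As a sketch of how input clips are prepared for this architecture, the default VideoMAE preprocessing can be run offline; note these are the library defaults, which may differ slightly from the preprocessing config saved with this checkpoint:

```python
import numpy as np
from transformers import VideoMAEImageProcessor

# Default VideoMAE processor (224x224 crops, ImageNet normalization);
# the checkpoint's own preprocessor_config.json takes precedence in practice.
processor = VideoMAEImageProcessor()

# One video = a list of 16 RGB frames (H, W, C), the default clip length.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="np")
print(inputs["pixel_values"].shape)  # (batch, frames, channels, height, width)
```

The resulting `pixel_values` tensor of shape `(1, 16, 3, 224, 224)` is what the model's `forward` expects.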

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08 and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 1920

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 4.2685 | 0.0083 | 16 | 4.2443 | 0.0144 |
| 4.2462 | 1.0083 | 32 | 4.2229 | 0.0165 |
| 4.2193 | 2.0083 | 48 | 4.2111 | 0.0129 |
| 4.2251 | 3.0083 | 64 | 4.2124 | 0.0136 |
| 4.231 | 4.0083 | 80 | 4.2131 | 0.0158 |
| 4.2197 | 5.0083 | 96 | 4.2091 | 0.0151 |
| 4.2239 | 6.0083 | 112 | 4.2100 | 0.0108 |
| 4.2246 | 7.0083 | 128 | 4.2119 | 0.0144 |
| 4.2131 | 8.0083 | 144 | 4.2030 | 0.0172 |
| 4.236 | 9.0083 | 160 | 4.1979 | 0.0129 |
| 4.177 | 10.0083 | 176 | 4.1328 | 0.0323 |
| 4.0656 | 11.0083 | 192 | 4.0373 | 0.0316 |
| 3.9339 | 12.0083 | 208 | 3.8848 | 0.0503 |
| 3.7197 | 13.0083 | 224 | 3.7681 | 0.0783 |
| 3.5657 | 14.0083 | 240 | 3.5790 | 0.1185 |
| 3.3108 | 15.0083 | 256 | 3.5642 | 0.1466 |
| 3.1687 | 16.0083 | 272 | 3.3210 | 0.1832 |
| 3.1376 | 17.0083 | 288 | 3.1950 | 0.2292 |
| 2.8366 | 18.0083 | 304 | 3.1270 | 0.2493 |
| 2.6811 | 19.0083 | 320 | 2.9997 | 0.2895 |
| 2.6104 | 20.0083 | 336 | 2.9776 | 0.2773 |
| 2.5156 | 21.0083 | 352 | 2.8625 | 0.3125 |
| 2.3804 | 22.0083 | 368 | 2.8223 | 0.2974 |
| 2.2389 | 23.0083 | 384 | 2.7042 | 0.3412 |
| 2.1889 | 24.0083 | 400 | 2.6846 | 0.3226 |
| 1.9829 | 25.0083 | 416 | 2.6066 | 0.3592 |
| 1.9466 | 26.0083 | 432 | 2.5845 | 0.3642 |
| 1.8991 | 27.0083 | 448 | 2.5150 | 0.3922 |
| 1.8629 | 28.0083 | 464 | 2.4958 | 0.3994 |
| 1.8563 | 29.0083 | 480 | 2.5036 | 0.3994 |
| 1.832 | 30.0083 | 496 | 2.4212 | 0.4037 |
| 1.7148 | 31.0083 | 512 | 2.3891 | 0.4253 |
| 1.6525 | 32.0083 | 528 | 2.3817 | 0.4109 |
| 1.6489 | 33.0083 | 544 | 2.3351 | 0.4274 |
| 1.6928 | 34.0083 | 560 | 2.3495 | 0.4188 |
| 1.503 | 35.0083 | 576 | 2.2961 | 0.4274 |
| 1.5126 | 36.0083 | 592 | 2.2620 | 0.4454 |
| 1.4732 | 37.0083 | 608 | 2.2635 | 0.4368 |
| 1.5553 | 38.0083 | 624 | 2.2326 | 0.4490 |
| 1.7115 | 39.0083 | 640 | 2.2266 | 0.4404 |
| 1.4851 | 40.0083 | 656 | 2.2690 | 0.4274 |
| 1.455 | 41.0083 | 672 | 2.1921 | 0.4569 |
| 1.4827 | 42.0083 | 688 | 2.2387 | 0.4504 |
| 1.4839 | 43.0083 | 704 | 2.2020 | 0.4432 |
| 1.2879 | 44.0083 | 720 | 2.1959 | 0.4397 |
| 1.2722 | 45.0083 | 736 | 2.2158 | 0.4440 |
| 1.2225 | 46.0083 | 752 | 2.1568 | 0.4662 |
| 1.1821 | 47.0083 | 768 | 2.1312 | 0.4763 |
| 1.2406 | 48.0083 | 784 | 2.1162 | 0.4784 |
| 1.1717 | 49.0083 | 800 | 2.1368 | 0.4756 |
| 1.2366 | 50.0083 | 816 | 2.1134 | 0.4835 |
| 1.2534 | 51.0083 | 832 | 2.0964 | 0.4734 |
| 1.2322 | 52.0083 | 848 | 2.1506 | 0.4641 |
| 1.2742 | 53.0083 | 864 | 2.1719 | 0.4591 |
| 1.132 | 54.0083 | 880 | 2.1557 | 0.4648 |
| 1.1306 | 55.0083 | 896 | 2.1010 | 0.4878 |
| 1.2719 | 56.0083 | 912 | 2.1482 | 0.4619 |
| 1.1549 | 57.0083 | 928 | 2.0961 | 0.4813 |
| 1.1495 | 58.0083 | 944 | 2.1246 | 0.4763 |
| 1.2539 | 59.0083 | 960 | 2.1118 | 0.4806 |
| 1.1719 | 60.0083 | 976 | 2.0666 | 0.4935 |
| 1.1108 | 61.0083 | 992 | 2.0630 | 0.4835 |
| 1.0417 | 62.0083 | 1008 | 2.0635 | 0.4899 |
| 1.1755 | 63.0083 | 1024 | 2.0862 | 0.4720 |
| 1.0512 | 64.0083 | 1040 | 2.0730 | 0.4878 |
| 0.9824 | 65.0083 | 1056 | 2.0709 | 0.4907 |
| 1.0924 | 66.0083 | 1072 | 2.1638 | 0.4612 |
| 1.1027 | 67.0083 | 1088 | 2.0572 | 0.4777 |
| 1.0956 | 68.0083 | 1104 | 2.0502 | 0.4892 |
| 0.8823 | 69.0083 | 1120 | 2.1128 | 0.4756 |
| 1.0344 | 70.0083 | 1136 | 2.0950 | 0.4727 |
| 1.0887 | 71.0083 | 1152 | 2.0543 | 0.4943 |
| 1.0763 | 72.0083 | 1168 | 2.0535 | 0.4907 |
| 0.9652 | 73.0083 | 1184 | 2.0280 | 0.5 |
| 1.0445 | 74.0083 | 1200 | 2.0551 | 0.4820 |
| 0.9844 | 75.0083 | 1216 | 2.0514 | 0.5043 |
| 1.0809 | 76.0083 | 1232 | 2.0552 | 0.5022 |
| 1.1158 | 77.0083 | 1248 | 2.0469 | 0.4878 |
| 0.9017 | 78.0083 | 1264 | 2.0516 | 0.4907 |
| 1.0449 | 79.0083 | 1280 | 2.0770 | 0.4864 |
| 1.0167 | 80.0083 | 1296 | 2.0374 | 0.4943 |
| 0.975 | 81.0083 | 1312 | 2.0631 | 0.4943 |
| 0.9285 | 82.0083 | 1328 | 2.0499 | 0.4871 |
| 0.9762 | 83.0083 | 1344 | 2.0618 | 0.4964 |
| 0.9454 | 84.0083 | 1360 | 2.0462 | 0.4993 |
| 0.8665 | 85.0083 | 1376 | 2.0765 | 0.4892 |
| 0.9202 | 86.0083 | 1392 | 2.0513 | 0.4950 |
| 0.8186 | 87.0083 | 1408 | 2.0254 | 0.5093 |
| 0.8659 | 88.0083 | 1424 | 2.1060 | 0.4792 |
| 0.8789 | 89.0083 | 1440 | 2.0296 | 0.4964 |
| 0.8592 | 90.0083 | 1456 | 2.0757 | 0.4849 |
| 0.8093 | 91.0083 | 1472 | 2.0289 | 0.4986 |
| 0.9074 | 92.0083 | 1488 | 2.0539 | 0.4921 |
| 0.82 | 93.0083 | 1504 | 2.0481 | 0.5014 |
| 0.8318 | 94.0083 | 1520 | 2.0309 | 0.5022 |
| 0.8337 | 95.0083 | 1536 | 2.0335 | 0.5050 |
| 0.9089 | 96.0083 | 1552 | 2.0456 | 0.5022 |
| 0.8189 | 97.0083 | 1568 | 2.0107 | 0.4986 |
| 0.7603 | 98.0083 | 1584 | 2.0147 | 0.5072 |
| 0.9197 | 99.0083 | 1600 | 2.0438 | 0.5014 |
| 0.8021 | 100.0083 | 1616 | 2.0311 | 0.5 |
| 0.7474 | 101.0083 | 1632 | 2.0348 | 0.5007 |
| 0.9423 | 102.0083 | 1648 | 2.0177 | 0.5007 |
| 0.8135 | 103.0083 | 1664 | 2.0140 | 0.5029 |
| 0.8244 | 104.0083 | 1680 | 2.0124 | 0.4986 |
| 0.8446 | 105.0083 | 1696 | 2.0022 | 0.5065 |
| 0.7965 | 106.0083 | 1712 | 1.9957 | 0.5108 |
| 0.8256 | 107.0083 | 1728 | 1.9995 | 0.5108 |
| 0.8448 | 108.0083 | 1744 | 2.0056 | 0.5093 |
| 0.7144 | 109.0083 | 1760 | 2.0084 | 0.5072 |
| 0.7869 | 110.0083 | 1776 | 1.9967 | 0.5115 |
| 0.8149 | 111.0083 | 1792 | 1.9973 | 0.5115 |
| 0.7896 | 112.0083 | 1808 | 2.0014 | 0.5122 |
| 0.8189 | 113.0083 | 1824 | 1.9989 | 0.5144 |
| 0.6775 | 114.0083 | 1840 | 1.9957 | 0.5122 |
| 0.8642 | 115.0083 | 1856 | 2.0001 | 0.5101 |
| 0.7308 | 116.0083 | 1872 | 1.9895 | 0.5086 |
| 0.8616 | 117.0083 | 1888 | 1.9854 | 0.5079 |
| 0.7763 | 118.0083 | 1904 | 1.9896 | 0.5072 |
| 0.8009 | 119.0083 | 1920 | 1.9903 | 0.5072 |

### Framework versions

- Transformers 4.48.0
- PyTorch 2.5.1+cu118
- Datasets 3.2.0
- Tokenizers 0.21.0