CTMAE-P2-V2-S1

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4718
  • Accuracy: 0.8261

Model description

More information needed

Intended uses & limitations

More information needed
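Although the intended uses are not documented, the checkpoint can be loaded like any other VideoMAE-based video classifier. A minimal sketch, assuming the base checkpoint's usual 16-frame clip input (the label set is unknown, since the fine-tuning dataset is not documented); `sample_frame_indices` and `classify_video` are illustrative helpers, not part of this repository:

```python
import numpy as np

def sample_frame_indices(num_video_frames, clip_len=16):
    """Pick clip_len evenly spaced frame indices spanning the whole video."""
    return np.linspace(0, num_video_frames - 1, num=clip_len).round().astype(int)

def classify_video(frames, model_id="beingbatman/CTMAE-P2-V2-S1"):
    """Classify a video given as a list of HxWx3 uint8 RGB frames.

    Requires `transformers` and `torch`; downloads the checkpoint on first use.
    """
    import torch
    from transformers import AutoImageProcessor, AutoModelForVideoClassification

    processor = AutoImageProcessor.from_pretrained(model_id)
    model = AutoModelForVideoClassification.from_pretrained(model_id)
    # Subsample the video to the clip length the model expects.
    clip = [frames[i] for i in sample_frame_indices(len(frames))]
    inputs = processor(clip, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return model.config.id2label[logits.argmax(-1).item()]
```

The processor handles resizing and normalization, so the input frames only need to be RGB arrays.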

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 3250
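The warmup ratio and step count above imply 325 warmup steps (0.1 × 3250). A minimal sketch of the resulting linear schedule, mirroring what `transformers`' `get_linear_schedule_with_warmup` computes:

```python
BASE_LR = 1e-05
TOTAL_STEPS = 3250
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # 325

def lr_at_step(step):
    """Linear warmup to BASE_LR, then linear decay to 0 at TOTAL_STEPS."""
    if step < WARMUP_STEPS:
        return BASE_LR * step / WARMUP_STEPS
    return BASE_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
```

The learning rate peaks at 1e-05 at step 325 and decays back to zero by the final step.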

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.62          | 0.02  | 65   | 0.7161          | 0.5435   |
| 0.5694        | 1.02  | 130  | 0.7819          | 0.5435   |
| 0.546         | 2.02  | 195  | 0.8927          | 0.5435   |
| 0.6022        | 3.02  | 260  | 0.6859          | 0.5435   |
| 0.5779        | 4.02  | 325  | 0.6449          | 0.5435   |
| 0.4662        | 5.02  | 390  | 0.8167          | 0.5435   |
| 0.5101        | 6.02  | 455  | 0.5114          | 0.7826   |
| 0.3779        | 7.02  | 520  | 0.5149          | 0.7391   |
| 0.3656        | 8.02  | 585  | 0.6273          | 0.6304   |
| 0.4837        | 9.02  | 650  | 0.9093          | 0.6522   |
| 0.6897        | 10.02 | 715  | 0.5653          | 0.6739   |
| 0.435         | 11.02 | 780  | 0.4927          | 0.7826   |
| 0.6362        | 12.02 | 845  | 0.5877          | 0.6739   |
| 0.4422        | 13.02 | 910  | 0.5351          | 0.8043   |
| 0.3913        | 14.02 | 975  | 0.7300          | 0.8043   |
| 0.6191        | 15.02 | 1040 | 1.1917          | 0.5652   |
| 0.2704        | 16.02 | 1105 | 0.5930          | 0.7826   |
| 0.3976        | 17.02 | 1170 | 0.5296          | 0.8043   |
| 0.3038        | 18.02 | 1235 | 0.6735          | 0.7609   |
| 0.2974        | 19.02 | 1300 | 0.4718          | 0.8261   |
| 0.2434        | 20.02 | 1365 | 0.5224          | 0.8261   |
| 0.4984        | 21.02 | 1430 | 1.2637          | 0.6957   |
| 0.1256        | 22.02 | 1495 | 0.7204          | 0.8261   |
| 0.448         | 23.02 | 1560 | 0.6897          | 0.7609   |
| 0.2702        | 24.02 | 1625 | 0.6801          | 0.8261   |
| 0.5101        | 25.02 | 1690 | 0.5134          | 0.8261   |
| 0.354         | 26.02 | 1755 | 0.8076          | 0.8043   |
| 0.4218        | 27.02 | 1820 | 0.7551          | 0.7826   |
| 1.1586        | 28.02 | 1885 | 1.1514          | 0.6522   |
| 0.3586        | 29.02 | 1950 | 1.1479          | 0.7391   |
| 0.4746        | 30.02 | 2015 | 0.9521          | 0.7174   |
| 0.6256        | 31.02 | 2080 | 0.8559          | 0.8043   |
| 0.4668        | 32.02 | 2145 | 0.9766          | 0.7826   |
| 0.1502        | 33.02 | 2210 | 0.9262          | 0.7826   |
| 0.5093        | 34.02 | 2275 | 0.9402          | 0.7609   |
| 0.2621        | 35.02 | 2340 | 0.9229          | 0.7609   |
| 0.1456        | 36.02 | 2405 | 0.7937          | 0.8261   |
| 0.1826        | 37.02 | 2470 | 0.9106          | 0.7826   |
| 0.3778        | 38.02 | 2535 | 0.9376          | 0.7826   |
| 0.1763        | 39.02 | 2600 | 0.9300          | 0.7826   |
| 0.1083        | 40.02 | 2665 | 1.1018          | 0.7609   |
| 0.1994        | 41.02 | 2730 | 0.8667          | 0.8261   |
| 0.0111        | 42.02 | 2795 | 0.9896          | 0.8043   |
| 0.0818        | 43.02 | 2860 | 1.0258          | 0.7826   |
| 0.1808        | 44.02 | 2925 | 0.9841          | 0.7826   |
| 0.1371        | 45.02 | 2990 | 0.9337          | 0.8043   |
| 0.0129        | 46.02 | 3055 | 0.8905          | 0.8043   |
| 0.1492        | 47.02 | 3120 | 0.9629          | 0.8261   |
| 0.0184        | 48.02 | 3185 | 1.0828          | 0.7174   |
| 0.1146        | 49.02 | 3250 | 1.0449          | 0.7826   |
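The evaluation results reported at the top of the card (loss 0.4718, accuracy 0.8261) match the row with the lowest validation loss, at step 1300. Selecting that checkpoint programmatically is a one-liner; the sketch below uses only a few representative rows from the table above:

```python
# (step, validation_loss, accuracy) — a few rows copied from the table above
rows = [
    (455, 0.5114, 0.7826),
    (780, 0.4927, 0.7826),
    (1300, 0.4718, 0.8261),
    (1365, 0.5224, 0.8261),
    (2665, 1.1018, 0.7609),
]

# Pick the checkpoint with the lowest validation loss.
best_step, best_loss, best_acc = min(rows, key=lambda r: r[1])
```

Over the full table the same selection yields step 1300, i.e. the metrics reported for this model.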

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0