CTMAE2_CS_V7_7

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set (the best-accuracy checkpoint, reached at step 6084):

  • Loss: 1.4462
  • Accuracy: 0.8222

Model description

More information needed

Intended uses & limitations

More information needed
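The card does not specify intended uses, but as a VideoMAE-based classifier the model consumes fixed-length clips of RGB frames. Below is a minimal sketch of the expected input layout, assuming the base model's defaults of 16 frames at 224×224 (an assumption inherited from MCG-NJU/videomae-large-finetuned-kinetics; in practice one would pair this with `VideoMAEImageProcessor` and `VideoMAEForVideoClassification` from `transformers`, loading this checkpoint by its hub id):

```python
import numpy as np

def sample_frame_indices(num_video_frames: int, clip_len: int = 16) -> np.ndarray:
    """Pick `clip_len` frame indices spread uniformly over the video."""
    return np.linspace(0, num_video_frames - 1, num=clip_len).astype(np.int64)

# Hypothetical 300-frame video, decoded to (frames, H, W, C) uint8.
video = np.random.randint(0, 256, size=(300, 224, 224, 3), dtype=np.uint8)
idx = sample_frame_indices(len(video))
clip = video[idx]  # (16, 224, 224, 3)

# VideoMAE-style input layout: (batch, frames, channels, height, width), float32.
pixel_values = clip.transpose(0, 3, 1, 2)[None].astype(np.float32) / 255.0
print(pixel_values.shape)  # (1, 16, 3, 224, 224)
```

Real preprocessing also involves resizing and per-channel normalization, which the processor class handles; the sketch only illustrates the frame sampling and tensor layout.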

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 5
  • eval_batch_size: 5
  • seed: 42
  • optimizer: adamw_torch (betas=(0.9, 0.999), epsilon=1e-08); no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 7750
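With a warmup ratio of 0.1 over 7750 steps, the linear schedule warms up for 775 steps to the peak rate of 1e-05, then decays linearly to zero. A plain-Python sketch of that shape (mirroring a linear warmup/decay scheduler, not the exact Hugging Face implementation):

```python
PEAK_LR = 1e-5
TOTAL_STEPS = 7750
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # 775

def lr_at(step: int) -> float:
    """Linear warmup to PEAK_LR, then linear decay to 0 at TOTAL_STEPS."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

print(lr_at(0), lr_at(WARMUP_STEPS), lr_at(TOTAL_STEPS))  # 0.0 1e-05 0.0
```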

Training results

Training Loss   Epoch     Step   Validation Loss   Accuracy
0.675           0.0201    156    0.8126            0.4667
0.6306          1.0201    312    0.8617            0.4667
0.5345          2.0201    468    0.6569            0.6222
0.36            3.0201    624    0.6944            0.6444
0.4708          4.0201    780    1.5217            0.5333
0.274           5.0201    936    0.7516            0.6
0.4109          6.0201    1092   0.7190            0.5778
0.639           7.0201    1248   0.8166            0.6444
0.5028          8.0201    1404   0.6623            0.7778
0.3825          9.0201    1560   0.5929            0.7111
0.4497          10.0201   1716   0.6088            0.7556
0.5095          11.0201   1872   1.6115            0.5556
0.3282          12.0201   2028   1.2386            0.5778
0.5552          13.0201   2184   1.5737            0.6
0.3099          14.0201   2340   1.4204            0.6
0.2961          15.0201   2496   1.5614            0.5333
0.3876          16.0201   2652   0.9776            0.7778
0.2942          17.0201   2808   1.1406            0.7778
0.4692          18.0201   2964   1.2078            0.7778
0.1124          19.0201   3120   1.3382            0.7333
0.3766          20.0201   3276   1.0293            0.8
0.2437          21.0201   3432   1.3097            0.7111
0.3116          22.0201   3588   1.4258            0.7111
0.0411          23.0201   3744   1.1758            0.7556
0.2582          24.0201   3900   1.5001            0.7333
0.0351          25.0201   4056   1.4352            0.7778
0.2065          26.0201   4212   1.3168            0.7556
0.4435          27.0201   4368   1.2661            0.8
0.0559          28.0201   4524   1.2387            0.7778
0.3598          29.0201   4680   1.4478            0.7333
0.0103          30.0201   4836   1.8970            0.6444
0.1988          31.0201   4992   1.3777            0.8
0.1377          32.0201   5148   1.8340            0.7333
0.281           33.0201   5304   1.5446            0.7778
0.1181          34.0201   5460   1.5648            0.7778
0.194           35.0201   5616   1.5953            0.7333
0.0012          36.0201   5772   1.6369            0.7556
0.2422          37.0201   5928   1.8253            0.7556
0.2209          38.0201   6084   1.4462            0.8222
0.2681          39.0201   6240   1.9831            0.6889
0.3073          40.0201   6396   1.5197            0.7556
0.1418          41.0201   6552   1.9243            0.7333
0.0021          42.0201   6708   1.6636            0.7556
0.0029          43.0201   6864   2.1923            0.6667
0.2099          44.0201   7020   1.7219            0.7556
0.0003          45.0201   7176   2.0043            0.6889
0.0269          46.0201   7332   1.8475            0.7333
0.0008          47.0201   7488   1.8957            0.7333
0.0003          48.0201   7644   1.8716            0.7556
0.1757          49.0137   7750   1.8668            0.7556
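The headline metrics (loss 1.4462, accuracy 0.8222) correspond to the highest-accuracy row of the log, at step 6084, rather than the final checkpoint. A short sketch of selecting that checkpoint, using a few (step, validation loss, accuracy) rows excerpted from the table above:

```python
# Excerpted (step, validation_loss, accuracy) rows from the training log.
rows = [
    (1404, 0.6623, 0.7778),
    (3276, 1.0293, 0.8),
    (6084, 1.4462, 0.8222),
    (7750, 1.8668, 0.7556),
]

# Pick the checkpoint with the highest validation accuracy.
best = max(rows, key=lambda r: r[2])
print(best)  # (6084, 1.4462, 0.8222)
```

Note that selecting by lowest validation loss instead would pick an earlier checkpoint (the loss minimum in the full log is 0.5929 at step 1560), so the selection criterion matters.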

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0