---
library_name: transformers
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-large-finetuned-kinetics
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: CTMAE-P2-V4-S1
  results: []
---

# CTMAE-P2-V4-S1

This model is a fine-tuned version of [MCG-NJU/videomae-large-finetuned-kinetics](https://huggingface.co/MCG-NJU/videomae-large-finetuned-kinetics) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4018
- Accuracy: 0.8696

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 6500

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.6432        | 0.0202  | 131  | 0.8218          | 0.5435   |
| 0.4243        | 1.0202  | 262  | 2.1313          | 0.5435   |
| 1.0867        | 2.0202  | 393  | 1.9686          | 0.5435   |
| 0.7155        | 3.0202  | 524  | 0.8454          | 0.5435   |
| 1.2743        | 4.0202  | 655  | 1.4546          | 0.5435   |
| 0.7952        | 5.0202  | 786  | 1.4258          | 0.5435   |
| 0.7364        | 6.0202  | 917  | 1.2201          | 0.5435   |
| 1.4976        | 7.0202  | 1048 | 0.9825          | 0.5435   |
| 0.7766        | 8.0202  | 1179 | 1.4507          | 0.5435   |
| 0.8254        | 9.0202  | 1310 | 1.0845          | 0.5435   |
| 0.7267        | 10.0202 | 1441 | 1.2709          | 0.5435   |
| 0.6193        | 11.0202 | 1572 | 1.2887          | 0.5435   |
| 0.637         | 12.0202 | 1703 | 0.6064          | 0.5870   |
| 0.7321        | 13.0202 | 1834 | 0.9141          | 0.5435   |
| 1.0134        | 14.0202 | 1965 | 1.6380          | 0.5435   |
| 0.2139        | 15.0202 | 2096 | 0.6076          | 0.6957   |
| 0.687         | 16.0202 | 2227 | 1.1890          | 0.5435   |
| 0.8722        | 17.0202 | 2358 | 0.5484          | 0.7826   |
| 1.176         | 18.0202 | 2489 | 0.4018          | 0.8696   |
| 0.8042        | 19.0202 | 2620 | 0.9282          | 0.7174   |
| 0.3013        | 20.0202 | 2751 | 0.5083          | 0.7609   |
| 0.646         | 21.0202 | 2882 | 0.5288          | 0.8261   |
| 1.3053        | 22.0202 | 3013 | 1.1224          | 0.6522   |
| 0.9903        | 23.0202 | 3144 | 0.4706          | 0.8043   |
| 0.7157        | 24.0202 | 3275 | 0.8163          | 0.7826   |
| 0.3073        | 25.0202 | 3406 | 1.1409          | 0.7609   |
| 0.5069        | 26.0202 | 3537 | 0.5148          | 0.8043   |
| 0.2683        | 27.0202 | 3668 | 1.2584          | 0.7609   |
| 0.4506        | 28.0202 | 3799 | 0.8241          | 0.7826   |
| 0.6501        | 29.0202 | 3930 | 0.9973          | 0.8261   |
| 0.6552        | 30.0202 | 4061 | 1.0375          | 0.7609   |
| 0.0676        | 31.0202 | 4192 | 0.9151          | 0.8043   |
| 0.0514        | 32.0202 | 4323 | 1.3664          | 0.7609   |
| 0.5447        | 33.0202 | 4454 | 1.4551          | 0.7391   |
| 0.3701        | 34.0202 | 4585 | 1.3388          | 0.7826   |
| 0.6869        | 35.0202 | 4716 | 1.3302          | 0.7826   |
| 0.1371        | 36.0202 | 4847 | 1.4859          | 0.7826   |
| 0.5565        | 37.0202 | 4978 | 1.4103          | 0.7609   |
| 0.0022        | 38.0202 | 5109 | 1.3428          | 0.7391   |
| 1.5897        | 39.0202 | 5240 | 1.4125          | 0.7391   |
| 0.1302        | 40.0202 | 5371 | 1.3893          | 0.7391   |
| 0.2463        | 41.0202 | 5502 | 1.5599          | 0.7609   |
| 0.0007        | 42.0202 | 5633 | 1.3542          | 0.7826   |
| 0.0004        | 43.0202 | 5764 | 1.8062          | 0.7391   |
| 0.0803        | 44.0202 | 5895 | 1.5014          | 0.7826   |
| 0.3332        | 45.0202 | 6026 | 1.5840          | 0.7391   |
| 0.0004        | 46.0202 | 6157 | 1.5471          | 0.7391   |
| 0.0004        | 47.0202 | 6288 | 1.3982          | 0.7826   |
| 0.0019        | 48.0202 | 6419 | 1.4889          | 0.7391   |
| 0.0006        | 49.0125 | 6500 | 1.4876          | 0.7609   |

### Framework versions

- Transformers 4.46.2
- Pytorch 2.0.1+cu117
- Datasets 3.0.1
- Tokenizers 0.20.0
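For readers reproducing this run, the optimizer and learning-rate schedule implied by the hyperparameters above can be sketched in plain PyTorch. This is an illustrative reconstruction, not the exact training script: the `torch.nn.Linear` stand-in replaces the actual VideoMAE classifier, and the `linear_warmup_decay` helper mirrors the linear-with-warmup schedule that the Trainer uses for `lr_scheduler_type: linear`.

```python
# Sketch: optimizer + linear warmup/decay schedule reconstructed from the
# hyperparameters above (lr=1e-05, betas=(0.9, 0.999), eps=1e-08,
# warmup_ratio=0.1, training_steps=6500). The Linear layer is a stand-in
# for the fine-tuned VideoMAE model.
import torch

model = torch.nn.Linear(8, 2)

optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-5, betas=(0.9, 0.999), eps=1e-8
)

total_steps = 6500
warmup_steps = int(0.1 * total_steps)  # warmup_ratio 0.1 -> 650 steps

def linear_warmup_decay(step: int) -> float:
    # Linear ramp from 0 to 1 over the warmup, then linear decay back to 0
    # over the remaining steps (same shape as a linear schedule with warmup).
    if step < warmup_steps:
        return step / max(1, warmup_steps)
    return max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, linear_warmup_decay)
```

With this schedule the learning rate starts at 0, peaks at 1e-05 after step 650, and decays linearly to 0 at step 6500; calling `scheduler.step()` after each `optimizer.step()` advances it.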