---
library_name: transformers
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: videomae-base-finetuned-ucf101-subset
  results: []
---
# videomae-base-finetuned-ucf101-subset

This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1806
- Accuracy: 0.4883
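
The snippet below is a minimal inference sketch for this video-classification checkpoint. The repository id of the fine-tuned model and the random 16-frame clip are placeholders; the preprocessor is loaded from the base checkpoint under the assumption that no image-processor config was saved with the fine-tuned weights.

```python
import numpy as np
import torch
from transformers import AutoImageProcessor, VideoMAEForVideoClassification

# Placeholder repository id; replace with the actual location of this checkpoint.
model_id = "your-username/videomae-base-finetuned-ucf101-subset"

# Processor taken from the base model, assuming the fine-tuned repo ships no preprocessor config.
processor = AutoImageProcessor.from_pretrained("MCG-NJU/videomae-base")
model = VideoMAEForVideoClassification.from_pretrained(model_id)

# Dummy clip of 16 RGB frames (channels-first); replace with real video frames.
video = list(np.random.randint(0, 256, (16, 3, 224, 224), dtype=np.uint8))

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print("Predicted class:", model.config.id2label[predicted_class])
```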

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 960
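
For reference, the sketch below shows how these settings map onto `transformers.TrainingArguments`. The `output_dir` and the per-epoch evaluation setting are assumptions (the latter inferred from the results table below), not values reported in this card.

```python
from transformers import TrainingArguments

# Illustrative mapping of the hyperparameters above onto TrainingArguments.
training_args = TrainingArguments(
    output_dir="videomae-base-finetuned-ucf101-subset",  # assumption, not reported in the card
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=960,          # training_steps above; takes precedence over num_train_epochs
    eval_strategy="epoch",  # assumption, inferred from the per-epoch rows in the results table
)
# These arguments would then be passed to transformers.Trainer together with the
# model, image processor, and train/eval datasets (not shown here).
```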
### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 4.2427        | 0.0323  | 31   | 4.2265          | 0.0033   |
| 4.2321        | 1.0323  | 62   | 4.2235          | 0.0100   |
| 4.24          | 2.0323  | 93   | 4.2282          | 0.0100   |
| 4.2445        | 3.0323  | 124  | 4.2250          | 0.0067   |
| 4.2327        | 4.0323  | 155  | 4.2244          | 0.0100   |
| 4.2104        | 5.0323  | 186  | 4.2100          | 0.0201   |
| 4.2374        | 6.0323  | 217  | 4.2022          | 0.0067   |
| 4.1597        | 7.0323  | 248  | 4.1188          | 0.0301   |
| 4.0522        | 8.0323  | 279  | 3.9351          | 0.0702   |
| 3.768         | 9.0323  | 310  | 3.6800          | 0.1070   |
| 3.5147        | 10.0323 | 341  | 3.5416          | 0.1104   |
| 3.2878        | 11.0323 | 372  | 3.7074          | 0.0702   |
| 2.9491        | 12.0323 | 403  | 3.3954          | 0.1070   |
| 2.806         | 13.0323 | 434  | 3.2552          | 0.1706   |
| 2.4568        | 14.0323 | 465  | 3.0654          | 0.2040   |
| 2.3102        | 15.0323 | 496  | 2.7440          | 0.3010   |
| 2.2079        | 16.0323 | 527  | 2.6789          | 0.3144   |
| 1.9638        | 17.0323 | 558  | 2.5920          | 0.3679   |
| 1.7914        | 18.0323 | 589  | 2.6152          | 0.3378   |
| 1.6925        | 19.0323 | 620  | 2.5971          | 0.3445   |
| 1.5124        | 20.0323 | 651  | 2.5767          | 0.3478   |
| 1.4834        | 21.0323 | 682  | 2.4439          | 0.3880   |
| 1.4565        | 22.0323 | 713  | 2.4057          | 0.3846   |
| 1.279         | 23.0323 | 744  | 2.5501          | 0.3545   |
| 1.1477        | 24.0323 | 775  | 2.3247          | 0.4482   |
| 1.2573        | 25.0323 | 806  | 2.1776          | 0.4883   |
| 1.0825        | 26.0323 | 837  | 2.1443          | 0.4783   |
| 1.2121        | 27.0323 | 868  | 2.1490          | 0.4783   |
| 1.0887        | 28.0323 | 899  | 2.1516          | 0.4716   |
| 1.1127        | 29.0323 | 930  | 2.1051          | 0.4883   |
| 0.9905        | 30.0312 | 960  | 2.1170          | 0.4816   |
### Framework versions

- Transformers 4.48.0
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0