---
library_name: transformers
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-large-finetuned-kinetics
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: CTMAE-P2-V4-S3
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# CTMAE-P2-V4-S3

This model is a fine-tuned version of [MCG-NJU/videomae-large-finetuned-kinetics](https://huggingface.co/MCG-NJU/videomae-large-finetuned-kinetics) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1094
- Accuracy: 0.7111

These figures match the checkpoint at step 9396 in the training results table below, suggesting the best checkpoint by this metric was kept rather than the final one.

## Model description

CTMAE-P2-V4-S3 is a video classification model based on the VideoMAE-large architecture. It starts from a checkpoint pre-trained with masked video autoencoding and fine-tuned on Kinetics (see the base model link above), and is fine-tuned further here on an undocumented downstream dataset.

## Intended uses & limitations

More information needed
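
In the absence of documented usage guidance, below is a minimal inference sketch assuming the model follows the standard VideoMAE video-classification interface in `transformers`. The repository id, the 16-frame clip length, and the 224x224 resolution are assumptions, and the random frames stand in for a real, sampled video clip.

```python
import numpy as np
import torch
from transformers import AutoImageProcessor, VideoMAEForVideoClassification

# Hypothetical hub id; replace with the actual repository path of this checkpoint.
checkpoint = "CTMAE-P2-V4-S3"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = VideoMAEForVideoClassification.from_pretrained(checkpoint)
model.eval()

# VideoMAE typically expects a clip of 16 RGB frames; random pixels are used
# here only as a stand-in for real video frames.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```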

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 13050
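
These settings correspond roughly to the following `TrainingArguments` sketch; the output directory and the steps-based evaluation cadence (every 261 steps, per the table below) are assumptions rather than documented settings.

```python
from transformers import TrainingArguments

# A sketch reconstructed from the hyperparameters listed above;
# not the original training script.
training_args = TrainingArguments(
    output_dir="CTMAE-P2-V4-S3",    # assumed output path
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",            # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=13050,
    eval_strategy="steps",          # assumed: the log evaluates every 261 steps
    eval_steps=261,
)
```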

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.5461        | 0.02  | 261   | 2.1854          | 0.5556   |
| 0.6074        | 1.02  | 522   | 2.6518          | 0.5556   |
| 1.5766        | 2.02  | 783   | 1.9843          | 0.5556   |
| 0.7713        | 3.02  | 1044  | 2.2332          | 0.5556   |
| 1.797         | 4.02  | 1305  | 1.7064          | 0.5556   |
| 0.8914        | 5.02  | 1566  | 1.8977          | 0.5556   |
| 0.7372        | 6.02  | 1827  | 2.2072          | 0.5556   |
| 1.0467        | 7.02  | 2088  | 1.7544          | 0.5556   |
| 1.2248        | 8.02  | 2349  | 2.0315          | 0.5556   |
| 0.7126        | 9.02  | 2610  | 1.7717          | 0.5556   |
| 1.2486        | 10.02 | 2871  | 2.0448          | 0.5556   |
| 2.2836        | 11.02 | 3132  | 2.1988          | 0.5556   |
| 0.8409        | 12.02 | 3393  | 1.6258          | 0.6444   |
| 0.4642        | 13.02 | 3654  | 1.3451          | 0.6667   |
| 0.007         | 14.02 | 3915  | 2.2438          | 0.5556   |
| 0.9377        | 15.02 | 4176  | 1.1871          | 0.6444   |
| 0.7025        | 16.02 | 4437  | 1.8905          | 0.6444   |
| 0.2657        | 17.02 | 4698  | 2.1760          | 0.6222   |
| 1.3937        | 18.02 | 4959  | 2.0622          | 0.6000   |
| 1.9924        | 19.02 | 5220  | 1.8416          | 0.6667   |
| 0.0009        | 20.02 | 5481  | 1.9068          | 0.6444   |
| 1.0231        | 21.02 | 5742  | 1.8428          | 0.6667   |
| 0.7099        | 22.02 | 6003  | 2.3108          | 0.6000   |
| 0.3243        | 23.02 | 6264  | 2.2084          | 0.5778   |
| 2.748         | 24.02 | 6525  | 1.8855          | 0.6889   |
| 0.0002        | 25.02 | 6786  | 1.9443          | 0.6667   |
| 1.1288        | 26.02 | 7047  | 1.6372          | 0.6444   |
| 0.0024        | 27.02 | 7308  | 2.0813          | 0.6444   |
| 1.3731        | 28.02 | 7569  | 2.1846          | 0.6444   |
| 0.0085        | 29.02 | 7830  | 2.2414          | 0.6222   |
| 0.0004        | 30.02 | 8091  | 2.5363          | 0.5778   |
| 0.7817        | 31.02 | 8352  | 2.8433          | 0.5778   |
| 0.3487        | 32.02 | 8613  | 2.6374          | 0.6444   |
| 0.0014        | 33.02 | 8874  | 3.0313          | 0.5778   |
| 0.0009        | 34.02 | 9135  | 2.6187          | 0.6667   |
| 0.014         | 35.02 | 9396  | 2.1094          | 0.7111   |
| 0.512         | 36.02 | 9657  | 2.1110          | 0.6667   |
| 0.0003        | 37.02 | 9918  | 3.0441          | 0.5778   |
| 0.0001        | 38.02 | 10179 | 2.4423          | 0.6889   |
| 0.0009        | 39.02 | 10440 | 2.3538          | 0.6889   |
| 0.0001        | 40.02 | 10701 | 2.4812          | 0.6667   |
| 0.0001        | 41.02 | 10962 | 2.5847          | 0.6667   |
| 0.0           | 42.02 | 11223 | 2.5525          | 0.6889   |
| 0.002         | 43.02 | 11484 | 2.6746          | 0.6889   |
| 0.0004        | 44.02 | 11745 | 2.4888          | 0.6667   |
| 0.0001        | 45.02 | 12006 | 2.5662          | 0.6444   |
| 0.0011        | 46.02 | 12267 | 2.5288          | 0.6667   |
| 0.0001        | 47.02 | 12528 | 2.5611          | 0.6667   |
| 0.7043        | 48.02 | 12789 | 2.7606          | 0.6667   |
| 0.0001        | 49.02 | 13050 | 2.7966          | 0.6667   |


### Framework versions

- Transformers 4.46.2
- Pytorch 2.0.1+cu117
- Datasets 3.0.1
- Tokenizers 0.20.0
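
To approximate this environment, the pinned versions above can be installed as follows; the CUDA 11.7 wheel index for PyTorch is an assumption based on the `+cu117` build tag.

```bash
pip install torch==2.0.1 --index-url https://download.pytorch.org/whl/cu117
pip install transformers==4.46.2 datasets==3.0.1 tokenizers==0.20.0
```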