---
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-small-finetuned-kinetics
tags:
- generated_from_trainer
metrics:
- accuracy
- matthews_correlation
model-index:
- name: videomae-small-finetuned-kinetics-finetuned-SNchunks-5c-a40
results: []
---
# videomae-small-finetuned-kinetics-finetuned-SNchunks-5c-a40
This model is a fine-tuned version of [MCG-NJU/videomae-small-finetuned-kinetics](https://huggingface.co/MCG-NJU/videomae-small-finetuned-kinetics) on an unspecified dataset of video chunks labeled with five soccer event classes (Ball out of play, Foul, Goal, Shots, Throw-in).
It achieves the following results on the evaluation set:
- Loss: 0.7599
- Accuracy: 0.7159
- Balanced Accuracy: 0.7157
- Matthews Correlation: 0.6515

Confusion matrix (rows: true class, columns: predicted class):

| True class | Ball out of play | Foul | Goal | Shots | Throw-in |
|:----------------:|:----------------:|:----:|:----:|:-----:|:--------:|
| Ball out of play | 1135 | 54 | 69 | 73 | 41 |
| Foul | 333 | 828 | 92 | 50 | 68 |
| Goal | 161 | 23 | 1008 | 165 | 13 |
| Shots | 306 | 34 | 292 | 705 | 27 |
| Throw-in | 102 | 17 | 16 | 9 | 1226 |

Per-class metrics:

| Class | Precision | Recall | F1-score | Support |
|:------------------:|:---------:|:------:|:--------:|:-------:|
| 0 Ball out of play | 0.5572 | 0.8273 | 0.6659 | 1372 |
| 1 Foul | 0.8661 | 0.6039 | 0.7116 | 1371 |
| 2 Goal | 0.6825 | 0.7358 | 0.7081 | 1370 |
| 3 Shots | 0.7036 | 0.5169 | 0.5959 | 1364 |
| 4 Throw-in | 0.8916 | 0.8949 | 0.8933 | 1370 |
| Macro avg | 0.7402 | 0.7157 | 0.7150 | 6847 |
| Weighted avg | 0.7402 | 0.7159 | 0.7151 | 6847 |
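
These metrics follow scikit-learn conventions. A minimal sketch of how the same set of numbers can be recomputed from integer class predictions; the function name and label ordering below are illustrative, not taken from the original training code:

```python
from sklearn.metrics import (
    accuracy_score,
    balanced_accuracy_score,
    classification_report,
    confusion_matrix,
    matthews_corrcoef,
)

# Class ids follow the order used throughout this card.
LABELS = ["Ball out of play", "Foul", "Goal", "Shots", "Throw-in"]

def evaluation_summary(y_true, y_pred):
    """Recompute the metric set reported above from lists of integer class ids."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "balanced_accuracy": balanced_accuracy_score(y_true, y_pred),
        "matthews_correlation": matthews_corrcoef(y_true, y_pred),
        # Rows are true classes, columns are predicted classes.
        "confusion_matrix": confusion_matrix(y_true, y_pred),
        # Per-class precision / recall / F1 / support plus macro and weighted averages.
        "per_class": classification_report(
            y_true, y_pred, target_names=LABELS, output_dict=True
        ),
    }
```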
## Model description
More information needed
## Intended uses & limitations
More information needed
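
No usage guidance was provided with this card. As a rough sketch, the checkpoint should load like any VideoMAE video classifier in Hugging Face Transformers; the repository id, the 16-frame clip length, and the dummy input below are assumptions, not part of the original documentation:

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

# Hypothetical repo id / local path for this checkpoint.
ckpt = "videomae-small-finetuned-kinetics-finetuned-SNchunks-5c-a40"
processor = VideoMAEImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)

# VideoMAE expects a fixed-length clip of RGB frames (16 frames for the base
# checkpoint); random frames stand in for a real video clip here.
frames = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(frames, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])  # one of the five event classes
```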
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10
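
As a rough guide, these settings correspond to a Hugging Face `TrainingArguments` configuration along the following lines; the output directory, evaluation/save strategy, and best-model selection are assumptions, not taken from the original script:

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; the run used 2 GPUs, so the
# effective train batch size is 16 per device x 2 devices x 2 accumulation steps = 64.
training_args = TrainingArguments(
    output_dir="videomae-small-finetuned-kinetics-finetuned-SNchunks-5c-a40",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",       # assumed from the per-epoch validation results below
    save_strategy="epoch",             # assumed
    load_best_model_at_end=True,       # assumed; the reported checkpoint matches epoch 6
    metric_for_best_model="accuracy",  # assumed
)
```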
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Balanced Accuracy | Matthews Correlation | Macro Precision | Macro Recall | Macro F1 | Weighted Precision | Weighted Recall | Weighted F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-----------------:|:--------------------:|:---------------:|:------------:|:--------:|:------------------:|:---------------:|:-----------:|
| 0.7825 | 1.0 | 428 | 0.9234 | 0.6207 | 0.6205 | 0.5371 | 0.6559 | 0.6205 | 0.6086 | 0.6559 | 0.6207 | 0.6087 |
| 0.8655 | 2.0 | 856 | 0.8769 | 0.6648 | 0.6646 | 0.5874 | 0.6802 | 0.6646 | 0.6583 | 0.6802 | 0.6648 | 0.6584 |
| 0.8065 | 3.0 | 1284 | 0.7639 | 0.7037 | 0.7035 | 0.6356 | 0.7285 | 0.7035 | 0.7034 | 0.7285 | 0.7037 | 0.7035 |
| 0.6598 | 4.0 | 1712 | 0.7694 | 0.6994 | 0.6992 | 0.6319 | 0.7270 | 0.6992 | 0.6960 | 0.7270 | 0.6994 | 0.6961 |
| 0.5968 | 5.0 | 2140 | 0.7820 | 0.6991 | 0.6989 | 0.6335 | 0.7310 | 0.6989 | 0.6967 | 0.7309 | 0.6991 | 0.6968 |
| 0.5675 | 6.0 | 2568 | 0.7603 | 0.7159 | 0.7157 | 0.6515 | 0.7402 | 0.7157 | 0.7150 | 0.7402 | 0.7159 | 0.7151 |
| 0.4824 | 7.0 | 2996 | 0.8064 | 0.6958 | 0.6956 | 0.6308 | 0.7380 | 0.6956 | 0.6971 | 0.7380 | 0.6958 | 0.6972 |
| 0.6574 | 8.0 | 3424 | 0.7998 | 0.7035 | 0.7033 | 0.6385 | 0.7381 | 0.7033 | 0.7024 | 0.7380 | 0.7035 | 0.7025 |
| 0.4709 | 9.0 | 3852 | 0.8032 | 0.7024 | 0.7021 | 0.6373 | 0.7374 | 0.7021 | 0.7022 | 0.7374 | 0.7024 | 0.7023 |
| 0.3689 | 10.0 | 4280 | 0.8093 | 0.7082 | 0.7079 | 0.6447 | 0.7410 | 0.7079 | 0.7074 | 0.7409 | 0.7082 | 0.7075 |

Per-class precision (P), recall (R) and F1-score by validation epoch. Class indices: 0 Ball out of play, 1 Foul, 2 Goal, 3 Shots, 4 Throw-in; per-class support is constant across epochs (1372 / 1371 / 1370 / 1364 / 1370, 6847 in total).

| Epoch | P 0 | R 0 | F1 0 | P 1 | R 1 | F1 1 | P 2 | R 2 | F1 2 | P 3 | R 3 | F1 3 | P 4 | R 4 | F1 4 |
|:-----:|:---:|:---:|:----:|:---:|:---:|:----:|:---:|:---:|:----:|:---:|:---:|:----:|:---:|:---:|:----:|
| 1.0 | 0.4499 | 0.7004 | 0.5479 | 0.8504 | 0.4354 | 0.5760 | 0.6225 | 0.6956 | 0.6570 | 0.6187 | 0.3402 | 0.4390 | 0.7378 | 0.9307 | 0.8231 |
| 2.0 | 0.5114 | 0.7340 | 0.6028 | 0.8160 | 0.5791 | 0.6775 | 0.6590 | 0.7095 | 0.6833 | 0.6218 | 0.3930 | 0.4816 | 0.7927 | 0.9073 | 0.8462 |
| 3.0 | 0.5716 | 0.7624 | 0.6533 | 0.8221 | 0.6608 | 0.7327 | 0.6176 | 0.8146 | 0.7025 | 0.7060 | 0.4736 | 0.5669 | 0.9254 | 0.8058 | 0.8615 |
| 4.0 | 0.5505 | 0.8061 | 0.6542 | 0.8771 | 0.5361 | 0.6655 | 0.6528 | 0.7686 | 0.7060 | 0.6840 | 0.4919 | 0.5723 | 0.8706 | 0.8934 | 0.8818 |
| 5.0 | 0.5184 | 0.8309 | 0.6385 | 0.8467 | 0.6083 | 0.7080 | 0.6860 | 0.7350 | 0.7097 | 0.7170 | 0.4348 | 0.5413 | 0.8867 | 0.8854 | 0.8860 |
| 6.0 | 0.5572 | 0.8273 | 0.6659 | 0.8661 | 0.6039 | 0.7116 | 0.6825 | 0.7358 | 0.7081 | 0.7036 | 0.5169 | 0.5959 | 0.8916 | 0.8949 | 0.8933 |
| 7.0 | 0.5093 | 0.8586 | 0.6393 | 0.8883 | 0.5740 | 0.6974 | 0.6901 | 0.7248 | 0.7070 | 0.6799 | 0.4765 | 0.5603 | 0.9226 | 0.8438 | 0.8814 |
| 8.0 | 0.5385 | 0.8316 | 0.6537 | 0.8439 | 0.6032 | 0.7035 | 0.6530 | 0.7912 | 0.7155 | 0.7273 | 0.4575 | 0.5617 | 0.9276 | 0.8328 | 0.8777 |
| 9.0 | 0.5265 | 0.8462 | 0.6491 | 0.8678 | 0.5791 | 0.6947 | 0.6793 | 0.7438 | 0.7101 | 0.6928 | 0.4795 | 0.5667 | 0.9205 | 0.8620 | 0.8903 |
| 10.0 | 0.5256 | 0.8455 | 0.6482 | 0.8444 | 0.6214 | 0.7160 | 0.6938 | 0.7409 | 0.7166 | 0.7231 | 0.4575 | 0.5604 | 0.9180 | 0.8745 | 0.8957 |

Confusion matrix per epoch (rows: true class, columns: predicted class, class order 0-4):

- Epoch 1: [[961, 43, 100, 80, 188], [461, 597, 105, 35, 173], [195, 22, 953, 167, 33], [454, 33, 354, 464, 59], [65, 7, 19, 4, 1275]]
- Epoch 2: [[1007, 78, 82, 82, 123], [328, 794, 78, 54, 117], [162, 36, 972, 182, 18], [398, 50, 313, 536, 67], [74, 15, 30, 8, 1243]]
- Epoch 3: [[1046, 89, 119, 82, 36], [271, 906, 104, 47, 43], [106, 21, 1116, 126, 1], [266, 35, 408, 646, 9], [141, 51, 60, 14, 1104]]
- Epoch 4: [[1106, 42, 82, 80, 62], [379, 735, 117, 60, 80], [133, 17, 1053, 159, 8], [293, 28, 340, 671, 32], [98, 16, 21, 11, 1224]]
- Epoch 5: [[1140, 50, 77, 59, 46], [360, 834, 85, 32, 60], [186, 26, 1007, 140, 11], [384, 56, 293, 593, 38], [129, 19, 6, 3, 1213]]
- Epoch 6: [[1135, 54, 69, 73, 41], [333, 828, 92, 50, 68], [161, 23, 1008, 165, 13], [306, 34, 292, 705, 27], [102, 17, 16, 9, 1226]]
- Epoch 7: [[1178, 37, 62, 69, 26], [396, 787, 80, 57, 51], [188, 14, 993, 172, 3], [378, 32, 287, 650, 17], [173, 16, 17, 8, 1156]]
- Epoch 8: [[1141, 55, 85, 65, 26], [341, 827, 113, 50, 40], [150, 19, 1084, 113, 4], [321, 47, 353, 624, 19], [166, 32, 25, 6, 1141]]
- Epoch 9: [[1161, 47, 70, 68, 26], [365, 794, 98, 62, 52], [177, 16, 1019, 155, 3], [353, 39, 297, 654, 21], [149, 19, 16, 5, 1181]]
- Epoch 10: [[1160, 58, 65, 58, 31], [343, 852, 86, 40, 50], [191, 23, 1015, 136, 5], [383, 52, 284, 624, 21], [130, 24, 13, 5, 1198]]
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+git8bfa463
- Datasets 2.13.1
- Tokenizers 0.13.3