izaitova committed
Commit 0181691 · verified · Parent(s): beae6a0

End of training

Files changed (1): README.md +13 -13
README.md CHANGED
@@ -15,14 +15,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [google/mt5-large](https://huggingface.co/google/mt5-large) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 2.9627
-- Loc: {'precision': 0.07002967359050445, 'recall': 0.13817330210772832, 'f1': 0.09294998030720757, 'number': 854}
-- Org: {'precision': 0.06141439205955335, 'recall': 0.1523076923076923, 'f1': 0.08753315649867373, 'number': 650}
-- Per: {'precision': 0.030874785591766724, 'recall': 0.07741935483870968, 'f1': 0.04414469650521153, 'number': 465}
-- Overall Precision: 0.0567
-- Overall Recall: 0.1285
-- Overall F1: 0.0787
-- Overall Accuracy: 0.3287
+- Loss: 0.5622
+- Loc: {'precision': 0.9222857142857143, 'recall': 0.9449648711943794, 'f1': 0.9334875650665124, 'number': 854}
+- Org: {'precision': 0.8973561430793157, 'recall': 0.8876923076923077, 'f1': 0.8924980665119876, 'number': 650}
+- Per: {'precision': 0.9014373716632443, 'recall': 0.9440860215053763, 'f1': 0.9222689075630252, 'number': 465}
+- Overall Precision: 0.9092
+- Overall Recall: 0.9259
+- Overall F1: 0.9175
+- Overall Accuracy: 0.9582
 
 ## Model description
 
@@ -47,14 +47,14 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- num_epochs: 4
+- num_epochs: 20
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Loc | Org | Per | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
-|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
-| 3.8187 | 2.0 | 10 | 3.1219 | {'precision': 0.06360022714366836, 'recall': 0.13114754098360656, 'f1': 0.08565965583173997, 'number': 854} | {'precision': 0.05763688760806916, 'recall': 0.15384615384615385, 'f1': 0.08385744234800839, 'number': 650} | {'precision': 0.027879677182685254, 'recall': 0.08172043010752689, 'f1': 0.04157549234135668, 'number': 465} | 0.0515 | 0.1270 | 0.0732 | 0.2983 |
-| 3.2942 | 4.0 | 20 | 2.9627 | {'precision': 0.07002967359050445, 'recall': 0.13817330210772832, 'f1': 0.09294998030720757, 'number': 854} | {'precision': 0.06141439205955335, 'recall': 0.1523076923076923, 'f1': 0.08753315649867373, 'number': 650} | {'precision': 0.030874785591766724, 'recall': 0.07741935483870968, 'f1': 0.04414469650521153, 'number': 465} | 0.0567 | 0.1285 | 0.0787 | 0.3287 |
+| Training Loss | Epoch | Step | Validation Loss | Loc | Org | Per | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:---:|:---:|:----:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
+| 0.1729 | 10.0 | 5000 | 0.4248 | {'precision': 0.9111361079865017, 'recall': 0.9484777517564403, 'f1': 0.9294320137693631, 'number': 854} | {'precision': 0.9027113237639554, 'recall': 0.8707692307692307, 'f1': 0.8864526233359435, 'number': 650} | {'precision': 0.9010309278350516, 'recall': 0.9397849462365592, 'f1': 0.92, 'number': 465} | 0.9060 | 0.9208 | 0.9134 | 0.9584 |
+| 0.0068 | 20.0 | 10000 | 0.5622 | {'precision': 0.9222857142857143, 'recall': 0.9449648711943794, 'f1': 0.9334875650665124, 'number': 854} | {'precision': 0.8973561430793157, 'recall': 0.8876923076923077, 'f1': 0.8924980665119876, 'number': 650} | {'precision': 0.9014373716632443, 'recall': 0.9440860215053763, 'f1': 0.9222689075630252, 'number': 465} | 0.9092 | 0.9259 | 0.9175 | 0.9582 |
 
 
 ### Framework versions
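The aggregate numbers in the updated card can be cross-checked against the per-entity breakdown: overall recall is the support-weighted (micro) average of the Loc/Org/Per recalls, and overall F1 is the harmonic mean of overall precision and recall. A minimal sketch in plain Python, using the values reported above (agreement is approximate, since the card rounds aggregates to four digits):

```python
# Sanity checks on the updated card's aggregate metrics.
# Per-entity recalls and supports ('number') copied from the card.
per_class = {  # entity type -> (recall, support)
    "LOC": (0.9449648711943794, 854),
    "ORG": (0.8876923076923077, 650),
    "PER": (0.9440860215053763, 465),
}

# Micro-averaged recall: total true positives over total gold entities.
true_positives = sum(r * n for r, n in per_class.values())
support = sum(n for _, n in per_class.values())
overall_recall = true_positives / support
print(round(overall_recall, 4))  # 0.9259, as reported

# Overall F1 is the harmonic mean of overall precision and recall.
overall_precision = 0.9092  # as reported
f1 = 2 * overall_precision * overall_recall / (overall_precision + overall_recall)
print(round(f1, 4))  # 0.9175, as reported
```

The same check applied to the old card's numbers reproduces its overall recall of 0.1285, which is one quick way to confirm the per-entity dicts and the aggregates come from the same evaluation run.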
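For readers unfamiliar with the optimizer line in the hyperparameter list, here is a pure-Python sketch of a single Adam update with the listed settings, betas=(0.9, 0.999) and epsilon=1e-08. The learning rate is illustrative only, since the card's learning-rate line falls outside this diff:

```python
def adam_step(param, grad, m, v, t, lr=1e-4, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter (lr here is illustrative)."""
    m = beta1 * m + (1 - beta1) * grad        # moving average of the gradient
    v = beta2 * v + (1 - beta2) * grad ** 2   # moving average of the squared gradient
    m_hat = m / (1 - beta1 ** t)              # bias correction for step t (1-indexed)
    v_hat = v / (1 - beta2 ** t)
    param -= lr * m_hat / (v_hat ** 0.5 + eps)
    return param, m, v

# One step from param=1.0 with gradient 0.5: at t=1 the bias-corrected
# update is approximately lr * sign(grad), so the parameter moves to ~0.9.
p, m, v = adam_step(1.0, 0.5, m=0.0, v=0.0, t=1, lr=0.1)
print(p)
```

In the actual training run this update is applied per tensor by the framework's Adam implementation; the sketch only unpacks what the betas and epsilon in the card control.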