ankush-003 committed
Commit 7e62ff0 · 1 Parent(s): cc4612d

update model card README.md

Files changed (1):
  1. README.md +132 -0
README.md ADDED
@@ -0,0 +1,132 @@
---
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
model-index:
- name: fine-tuned-roberta-nosql-injection
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# fine-tuned-roberta-nosql-injection

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000

## Model description

More information needed

## Intended uses & limitations

More information needed
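
Until the card is completed, the snippet below is only a minimal sketch of how the checkpoint could be loaded for inference. It assumes the model was fine-tuned as a sequence classifier over query strings; the repository id and the meaning of the output labels are assumptions, not details taken from this card.

```python
from transformers import pipeline

# Assumed repository id; replace with the actual model path if it differs.
classifier = pipeline(
    "text-classification",
    model="ankush-003/fine-tuned-roberta-nosql-injection",
)

# Example NoSQL-style payload; label names (e.g. LABEL_0 / LABEL_1) depend on
# how the classification head was configured during fine-tuning.
print(classifier('{"username": {"$ne": null}, "password": {"$ne": null}}'))
```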

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 75
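
For reference, these values roughly correspond to the `TrainingArguments` sketch below. This is an assumption-laden reconstruction, not the actual training script: the output directory, evaluation strategy, and the mapping of `train_batch_size` to a per-device batch size are guesses.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments mirroring the hyperparameters listed above;
# output_dir and evaluation_strategy are assumptions, not taken from the card.
training_args = TrainingArguments(
    output_dir="fine-tuned-roberta-nosql-injection",
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=75,
    evaluation_strategy="epoch",
)
```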

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.2572 | 1.0 | 158 | 0.2235 |
| 0.1175 | 2.0 | 316 | 0.0325 |
| 0.0454 | 3.0 | 474 | 0.1079 |
| 0.05 | 4.0 | 632 | 0.0212 |
| 0.0677 | 5.0 | 790 | 0.0713 |
| 0.0821 | 6.0 | 948 | 0.0007 |
| 0.0259 | 7.0 | 1106 | 0.0277 |
| 0.0422 | 8.0 | 1264 | 0.0068 |
| 0.0282 | 9.0 | 1422 | 0.0492 |
| 0.0273 | 10.0 | 1580 | 0.0008 |
| 0.0272 | 11.0 | 1738 | 0.0256 |
| 0.0859 | 12.0 | 1896 | 0.0000 |
| 0.0271 | 13.0 | 2054 | 0.0001 |
| 0.0058 | 14.0 | 2212 | 0.0583 |
| 0.0121 | 15.0 | 2370 | 0.0257 |
| 0.0189 | 16.0 | 2528 | 0.0631 |
| 0.0275 | 17.0 | 2686 | 0.0186 |
| 0.006 | 18.0 | 2844 | 0.0027 |
| 0.025 | 19.0 | 3002 | 0.0349 |
| 0.0377 | 20.0 | 3160 | 0.0004 |
| 0.0108 | 21.0 | 3318 | 0.0091 |
| 0.0233 | 22.0 | 3476 | 0.0772 |
| 0.0216 | 23.0 | 3634 | 0.0000 |
| 0.0255 | 24.0 | 3792 | 0.0607 |
| 0.0211 | 25.0 | 3950 | 0.0251 |
| 0.037 | 26.0 | 4108 | 0.0223 |
| 0.0057 | 27.0 | 4266 | 0.0375 |
| 0.0464 | 28.0 | 4424 | 0.0659 |
| 0.0446 | 29.0 | 4582 | 0.0235 |
| 0.0453 | 30.0 | 4740 | 0.0278 |
| 0.0033 | 31.0 | 4898 | 0.0417 |
| 0.0104 | 32.0 | 5056 | 0.0544 |
| 0.0084 | 33.0 | 5214 | 0.0000 |
| 0.0004 | 34.0 | 5372 | 0.0247 |
| 0.0185 | 35.0 | 5530 | 0.0002 |
| 0.0165 | 36.0 | 5688 | 0.0000 |
| 0.0381 | 37.0 | 5846 | 0.0000 |
| 0.0281 | 38.0 | 6004 | 0.0000 |
| 0.006 | 39.0 | 6162 | 0.0085 |
| 0.0083 | 40.0 | 6320 | 0.0000 |
| 0.0101 | 41.0 | 6478 | 0.0006 |
| 0.0282 | 42.0 | 6636 | 0.0003 |
| 0.0202 | 43.0 | 6794 | 0.0205 |
| 0.0053 | 44.0 | 6952 | 0.0275 |
| 0.0293 | 45.0 | 7110 | 0.0485 |
| 0.0119 | 46.0 | 7268 | 0.0000 |
| 0.0045 | 47.0 | 7426 | 0.0000 |
| 0.0066 | 48.0 | 7584 | 0.0268 |
| 0.0191 | 49.0 | 7742 | 0.0103 |
| 0.0007 | 50.0 | 7900 | 0.0386 |
| 0.0072 | 51.0 | 8058 | 0.0000 |
| 0.0031 | 52.0 | 8216 | 0.0000 |
| 0.0037 | 53.0 | 8374 | 0.0225 |
| 0.0135 | 54.0 | 8532 | 0.0003 |
| 0.0015 | 55.0 | 8690 | 0.0002 |
| 0.0066 | 56.0 | 8848 | 0.0025 |
| 0.0281 | 57.0 | 9006 | 0.0145 |
| 0.012 | 58.0 | 9164 | 0.0000 |
| 0.0065 | 59.0 | 9322 | 0.0000 |
| 0.0054 | 60.0 | 9480 | 0.0082 |
| 0.0104 | 61.0 | 9638 | 0.0000 |
| 0.0005 | 62.0 | 9796 | 0.0303 |
| 0.005 | 63.0 | 9954 | 0.0000 |
| 0.0092 | 64.0 | 10112 | 0.0412 |
| 0.0055 | 65.0 | 10270 | 0.0191 |
| 0.0092 | 66.0 | 10428 | 0.0158 |
| 0.0065 | 67.0 | 10586 | 0.0087 |
| 0.0004 | 68.0 | 10744 | 0.0000 |
| 0.0068 | 69.0 | 10902 | 0.0044 |
| 0.0043 | 70.0 | 11060 | 0.0022 |
| 0.0055 | 71.0 | 11218 | 0.0009 |
| 0.0063 | 72.0 | 11376 | 0.0000 |
| 0.0022 | 73.0 | 11534 | 0.0006 |
| 0.0116 | 74.0 | 11692 | 0.0014 |
| 0.0043 | 75.0 | 11850 | 0.0000 |

### Framework versions

- Transformers 4.31.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.11.0