osmajic committed on
Commit fdb90b3 · verified · 1 Parent(s): 593c250

Model save

Files changed (1)
  1. README.md +34 -8
README.md CHANGED
@@ -16,12 +16,7 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [google/paligemma-3b-pt-224](https://huggingface.co/google/paligemma-3b-pt-224) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - eval_loss: 0.1162
- - eval_runtime: 61.0128
- - eval_samples_per_second: 1.328
- - eval_steps_per_second: 0.672
- - epoch: 4.0
- - step: 200
+ - Loss: 0.1043
 
  ## Model description
 
@@ -40,7 +35,7 @@ More information needed
  ### Training hyperparameters
 
  The following hyperparameters were used during training:
- - learning_rate: 1e-05
+ - learning_rate: 2e-06
  - train_batch_size: 2
  - eval_batch_size: 2
  - seed: 42
@@ -48,7 +43,38 @@ The following hyperparameters were used during training:
  - total_train_batch_size: 8
  - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  - lr_scheduler_type: linear
- - num_epochs: 10
+ - num_epochs: 5
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:----:|:---------------:|
+ | 0.1502 | 0.2 | 10 | 0.1155 |
+ | 0.1414 | 0.4 | 20 | 0.1136 |
+ | 0.1481 | 0.6 | 30 | 0.1134 |
+ | 0.1384 | 0.8 | 40 | 0.1121 |
+ | 0.1472 | 1.0 | 50 | 0.1112 |
+ | 0.1357 | 1.2 | 60 | 0.1108 |
+ | 0.124 | 1.4 | 70 | 0.1102 |
+ | 0.1495 | 1.6 | 80 | 0.1096 |
+ | 0.1438 | 1.8 | 90 | 0.1090 |
+ | 0.1394 | 2.0 | 100 | 0.1078 |
+ | 0.1285 | 2.2 | 110 | 0.1071 |
+ | 0.131 | 2.4 | 120 | 0.1071 |
+ | 0.1375 | 2.6 | 130 | 0.1068 |
+ | 0.1243 | 2.8 | 140 | 0.1061 |
+ | 0.138 | 3.0 | 150 | 0.1056 |
+ | 0.1386 | 3.2 | 160 | 0.1055 |
+ | 0.1325 | 3.4 | 170 | 0.1052 |
+ | 0.1252 | 3.6 | 180 | 0.1052 |
+ | 0.1371 | 3.8 | 190 | 0.1049 |
+ | 0.122 | 4.0 | 200 | 0.1048 |
+ | 0.1149 | 4.2 | 210 | 0.1046 |
+ | 0.1318 | 4.4 | 220 | 0.1044 |
+ | 0.1169 | 4.6 | 230 | 0.1043 |
+ | 0.1412 | 4.8 | 240 | 0.1041 |
+ | 0.1514 | 5.0 | 250 | 0.1043 |
+
 
  ### Framework versions
 
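The hyperparameters in this commit can be sanity-checked with a short sketch. This is illustrative only: the helper functions are mine, not from the repository, and a gradient accumulation of 4 is inferred from the card's values (per-device batch 2 × 4 accumulation steps = total batch 8, assuming a single device). The linear scheduler is assumed to decay to zero with no warmup, which matches the Trainer default for `lr_scheduler_type: linear`.

```python
# Sketch of the batch-size arithmetic and linear LR schedule implied by the
# card's hyperparameters. Helper names are hypothetical; gradient accumulation
# of 4 is inferred (2 per-device x 4 steps = 8 total, single device assumed).

def total_train_batch_size(per_device_batch: int,
                           grad_accum_steps: int,
                           num_devices: int = 1) -> int:
    """Effective batch size seen by each optimizer step."""
    return per_device_batch * grad_accum_steps * num_devices


def linear_lr(base_lr: float, step: int, total_steps: int) -> float:
    """Linear decay from base_lr to 0 over total_steps (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)


# total_train_batch_size(2, 4) reproduces the card's total of 8.
# With 250 total steps (per the results table), the LR at step 125 is
# half the base rate of 2e-06.
```

With 250 optimizer steps over 5 epochs and an effective batch of 8, each epoch covers 50 steps (~400 training samples), which is consistent with the evaluation cadence in the results table.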