---
language:
- eng
license: cc0-1.0
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: drone-DinoVdeau-from-binary-large-2024_11_14-batch-size16_freeze_probs
model-index:
- name: drone-DinoVdeau-from-binary-large-2024_11_14-batch-size16_freeze_probs
results: []
---
drone-DinoVdeau-from-binary is a fine-tuned version of [drone-DinoVdeau-from-binary-large-2024_11_14-batch-size16_freeze_probs](https://huggingface.co/drone-DinoVdeau-from-binary-large-2024_11_14-batch-size16_freeze_probs). It achieves the following results on the test set:
- Loss: 0.4693
- F1 Micro: 0.0000
- F1 Macro: 0.0000
- Accuracy: 0.0000
- RMSE: 0.1576
- MAE: 0.1172
- KL Divergence: 0.4185
---
# Model description
drone-DinoVdeau-from-binary is a model built on top of the drone-DinoVdeau-from-binary-large-2024_11_14-batch-size16_freeze_probs model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).
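As a rough illustration, a head of this kind might look like the following PyTorch sketch. The hidden size, dropout rate, and layer ordering are assumptions for illustration, not values read from the released checkpoint; only the output size (12 classes) comes from the class table below.

```python
import torch.nn as nn

def build_head(hidden_size: int = 1024, num_classes: int = 12) -> nn.Sequential:
    """Hypothetical classification head: linear -> batch norm -> ReLU -> dropout."""
    return nn.Sequential(
        nn.Linear(hidden_size, hidden_size),
        nn.BatchNorm1d(hidden_size),          # assumed placement of batch normalization
        nn.ReLU(),
        nn.Dropout(p=0.5),                    # dropout rate is an assumption
        nn.Linear(hidden_size, num_classes),  # one logit per class (multilabel)
    )
```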
- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
---
# Intended uses & limitations
You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
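A minimal inference sketch with the `transformers` library is shown below. The repository id is an assumption based on this card's author and model name, and the 0.5 decision threshold is a common multilabel default, not a documented choice:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repo id (author + model name from this card).
model_id = "lombardata/drone-DinoVdeau-from-binary-large-2024_11_14-batch-size16_freeze_probs"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("reef_tile.jpg")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel head: one sigmoid per class, thresholded independently.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```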
---
# Training and evaluation data
Details on the estimated number of images for each class are given in the following table:
| Class | train | test | val | Total |
|:------------------------|--------:|-------:|------:|--------:|
| Acropore_branched | 1220 | 363 | 362 | 1945 |
| Acropore_digitised | 586 | 195 | 189 | 970 |
| Acropore_tabular | 308 | 133 | 119 | 560 |
| Algae | 4777 | 1372 | 1384 | 7533 |
| Dead_coral | 2513 | 671 | 693 | 3877 |
| Millepore | 136 | 55 | 59 | 250 |
| No_acropore_encrusting | 252 | 88 | 93 | 433 |
| No_acropore_massive | 2158 | 725 | 726 | 3609 |
| No_acropore_sub_massive | 2036 | 582 | 612 | 3230 |
| Rock | 5976 | 1941 | 1928 | 9845 |
| Rubble | 4851 | 1486 | 1474 | 7811 |
| Sand | 6155 | 2019 | 1990 | 10164 |
---
# Training procedure
## Training hyperparameters
The following hyperparameters were used during training:
- **Number of Epochs**: 62
- **Learning Rate**: 0.001
- **Train Batch Size**: 16
- **Eval Batch Size**: 16
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1 (see the sketch after this list)
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
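The optimizer and scheduler configuration above corresponds to the following PyTorch sketch; a small stand-in module replaces the real network (the frozen encoder plus the head described earlier):

```python
import torch.nn as nn
from torch.optim import Adam
from torch.optim.lr_scheduler import ReduceLROnPlateau

# Stand-in for the real network; with the encoder frozen, only the
# classification head's parameters would actually receive gradients.
model = nn.Linear(1024, 12)

optimizer = Adam(model.parameters(), lr=1e-3)  # initial learning rate: 0.001
scheduler = ReduceLROnPlateau(
    optimizer,
    mode="min",   # monitor validation loss, which should decrease
    factor=0.1,   # multiply the learning rate by 0.1 on plateau...
    patience=5,   # ...after 5 epochs without improvement
)

# Call scheduler.step(val_loss) once per epoch; the learning-rate column in
# the results table (0.001 -> 0.0001 -> ... -> 1e-07) is this rule firing.
```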
## Data Augmentation
Data were augmented using the following transformations (a pipeline sketch follows the two lists):

**Train Transforms**
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00
**Val Transforms**
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
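The transform names above match Kornia's augmentation classes, so the train pipeline plausibly looks like the sketch below; the input resolution, color-jitter strengths, and normalization statistics are assumptions:

```python
import torch
import torch.nn as nn
import kornia.augmentation as K

train_transforms = nn.Sequential(
    K.Resize((518, 518)),                        # assumed input resolution
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(0.1, 0.1, 0.1, 0.1, p=0.25),   # jitter strengths are assumptions
    K.RandomPerspective(p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),  # assumed ImageNet stats
                std=torch.tensor([0.229, 0.224, 0.225])),
)

# Kornia transforms operate on float tensors of shape (B, C, H, W) in [0, 1].
```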
## Training results
| Epoch | Validation Loss | MAE | RMSE | KL div | Learning Rate |
|------:|----------------:|-------:|-------:|-------:|--------------:|
| 1 | 0.4821 | 0.1308 | 0.1731 | 0.4219 | 0.001 |
| 2 | 0.4785 | 0.1263 | 0.1710 | 0.6148 | 0.001 |
| 3 | 0.4778 | 0.1273 | 0.1699 | 0.4880 | 0.001 |
| 4 | 0.4793 | 0.1290 | 0.1710 | 0.3418 | 0.001 |
| 5 | 0.4752 | 0.1280 | 0.1674 | 0.3456 | 0.001 |
| 6 | 0.4789 | 0.1254 | 0.1707 | 0.6403 | 0.001 |
| 7 | 0.4779 | 0.1288 | 0.1709 | 0.5492 | 0.001 |
| 8 | 0.4757 | 0.1270 | 0.1678 | 0.3383 | 0.001 |
| 9 | 0.4732 | 0.1231 | 0.1657 | 0.5418 | 0.001 |
| 10 | 0.4800 | 0.1321 | 0.1723 | 0.1547 | 0.001 |
| 11 | 0.4732 | 0.1256 | 0.1656 | 0.3437 | 0.001 |
| 12 | 0.4777 | 0.1293 | 0.1701 | 0.2947 | 0.001 |
| 13 | 0.4801 | 0.1248 | 0.1677 | 0.6136 | 0.001 |
| 14 | 0.4954 | 0.1253 | 0.1669 | inf | 0.001 |
| 15 | 0.4812 | 0.1254 | 0.1662 | inf | 0.001 |
| 16 | 0.4858 | 0.1243 | 0.1656 | inf | 0.0001 |
| 17 | 0.4708 | 0.1223 | 0.1628 | 0.4165 | 0.0001 |
| 18 | 0.4708 | 0.1216 | 0.1626 | 0.4066 | 0.0001 |
| 19 | 0.4710 | 0.1227 | 0.1632 | 0.3185 | 0.0001 |
| 20 | 0.4697 | 0.1205 | 0.1620 | 0.4651 | 0.0001 |
| 21 | 0.4692 | 0.1216 | 0.1614 | 0.3773 | 0.0001 |
| 22 | 0.4686 | 0.1203 | 0.1609 | 0.4611 | 0.0001 |
| 23 | 0.4702 | 0.1226 | 0.1621 | 0.2499 | 0.0001 |
| 24 | 0.4705 | 0.1213 | 0.1628 | 0.3702 | 0.0001 |
| 25 | 0.4678 | 0.1188 | 0.1601 | 0.5133 | 0.0001 |
| 26 | 0.4680 | 0.1179 | 0.1604 | 0.5665 | 0.0001 |
| 27 | 0.4681 | 0.1200 | 0.1604 | 0.4242 | 0.0001 |
| 28 | 0.4693 | 0.1215 | 0.1616 | 0.2968 | 0.0001 |
| 29 | 0.4685 | 0.1197 | 0.1607 | 0.3925 | 0.0001 |
| 30 | 0.4694 | 0.1221 | 0.1614 | 0.2495 | 0.0001 |
| 31 | 0.4679 | 0.1185 | 0.1601 | 0.4510 | 0.0001 |
| 32 | 0.4678 | 0.1193 | 0.1601 | 0.3886 | 1e-05 |
| 33 | 0.4687 | 0.1202 | 0.1606 | 0.3132 | 1e-05 |
| 34 | 0.4678 | 0.1195 | 0.1601 | 0.3958 | 1e-05 |
| 35 | 0.4671 | 0.1180 | 0.1595 | 0.4579 | 1e-05 |
| 36 | 0.4674 | 0.1184 | 0.1595 | 0.4391 | 1e-05 |
| 37 | 0.4680 | 0.1191 | 0.1600 | 0.3633 | 1e-05 |
| 38 | 0.4670 | 0.1186 | 0.1592 | 0.4303 | 1e-05 |
| 39 | 0.4673 | 0.1187 | 0.1596 | 0.4562 | 1e-05 |
| 40 | 0.4673 | 0.1189 | 0.1594 | 0.4065 | 1e-05 |
| 41 | 0.4678 | 0.1206 | 0.1599 | 0.3336 | 1e-05 |
| 42 | 0.4672 | 0.1178 | 0.1597 | 0.5312 | 1e-05 |
| 43 | 0.4672 | 0.1185 | 0.1592 | 0.3924 | 1e-05 |
| 44 | 0.4678 | 0.1194 | 0.1602 | 0.4259 | 1e-05 |
| 45 | 0.4670 | 0.1172 | 0.1594 | 0.5214 | 1e-06 |
| 46 | 0.4671 | 0.1188 | 0.1594 | 0.4175 | 1e-06 |
| 47 | 0.4666 | 0.1188 | 0.1589 | 0.4446 | 1e-06 |
| 48 | 0.4671 | 0.1180 | 0.1597 | 0.5755 | 1e-06 |
| 49 | 0.4676 | 0.1192 | 0.1600 | 0.4304 | 1e-06 |
| 50 | 0.4675 | 0.1204 | 0.1595 | 0.3337 | 1e-06 |
| 51 | 0.4669 | 0.1181 | 0.1591 | 0.3955 | 1e-06 |
| 52 | 0.4664 | 0.1175 | 0.1588 | 0.4761 | 1e-06 |
| 53 | 0.4668 | 0.1189 | 0.1590 | 0.4327 | 1e-06 |
| 54 | 0.4670 | 0.1187 | 0.1592 | 0.3725 | 1e-06 |
| 55 | 0.4674 | 0.1199 | 0.1595 | 0.3841 | 1e-06 |
| 56 | 0.4674 | 0.1190 | 0.1596 | 0.3822 | 1e-06 |
| 57 | 0.4670 | 0.1186 | 0.1593 | 0.4675 | 1e-06 |
| 58 | 0.4674 | 0.1189 | 0.1596 | 0.3738 | 1e-06 |
| 59 | 0.4667 | 0.1185 | 0.1589 | 0.4204 | 1e-07 |
| 60 | 0.4669 | 0.1178 | 0.1592 | 0.4532 | 1e-07 |
| 61 | 0.4673 | 0.1189 | 0.1596 | 0.4032 | 1e-07 |
| 62 | 0.4673 | 0.1189 | 0.1595 | 0.3407 | 1e-07 |
---
# Framework Versions
- **Transformers**: 4.41.0
- **Pytorch**: 2.5.0+cu124
- **Datasets**: 3.0.2
- **Tokenizers**: 0.19.1