---
license: other
base_model: nvidia/mit-b5
tags:
- image-segmentation
- vision
- generated_from_trainer
model-index:
- name: ecc_segformer_main
  results: []
---

# ecc_segformer_main

This model is a fine-tuned version of [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) on the rishitunu/ecc_crackdetector_dataset_main dataset.
It achieves the following results on the evaluation set (a sketch of how such metrics are computed is shown after the list):
- Loss: 0.1918
- Mean Iou: 0.2329
- Mean Accuracy: 0.4658
- Overall Accuracy: 0.4658
- Accuracy Background: nan
- Accuracy Crack: 0.4658
- Iou Background: 0.0
- Iou Crack: 0.4658
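
The metric names above match the `mean_iou` metric of the Hugging Face `evaluate` library. The toy sketch below shows how such numbers are computed from predicted and reference masks; the arrays and the `ignore_index` value are illustrative and are not taken from this evaluation.

```python
import numpy as np
import evaluate

mean_iou = evaluate.load("mean_iou")

# Toy 4x4 masks with two classes: 0 = background, 1 = crack.
prediction = np.array([[0, 0, 1, 1],
                       [0, 1, 1, 0],
                       [0, 0, 0, 0],
                       [1, 1, 0, 0]])
reference = np.array([[0, 1, 1, 1],
                      [0, 1, 1, 0],
                      [0, 0, 0, 0],
                      [1, 0, 0, 0]])

results = mean_iou.compute(
    predictions=[prediction],
    references=[reference],
    num_labels=2,
    ignore_index=255,  # illustrative; pixels with this value are excluded
)
print(results["mean_iou"], results["per_category_iou"], results["per_category_accuracy"])
```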

## Model description

`ecc_segformer_main` is a SegFormer semantic-segmentation model that fine-tunes the [nvidia/mit-b5](https://huggingface.co/nvidia/mit-b5) Mix Transformer encoder for crack detection. It predicts a per-pixel label over two classes, background and crack.

## Intended uses & limitations

The model is intended for semantic segmentation of cracks in images similar to those in the rishitunu/ecc_crackdetector_dataset_main dataset. Two limitations are worth noting: the background accuracy is reported as `nan` and the background IoU as 0.0, which suggests the background class was ignored when the metrics were computed, so the headline numbers essentially describe the crack class alone; and a crack IoU of roughly 0.47 means predicted masks should be reviewed before being used for any downstream assessment.
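
A minimal inference sketch, assuming the checkpoint is loaded with the `transformers` library; the repository id, image path, and label ordering below are assumptions rather than facts from this card.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Hypothetical Hub id and image path -- adjust both to your setup.
repo_id = "rishitunu/ecc_segformer_main"
image = Image.open("crack_example.jpg").convert("RGB")

processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample back to the input size.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # assumed label order: 0 = background, 1 = crack
```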

## Training and evaluation data

The model was fine-tuned and evaluated on the rishitunu/ecc_crackdetector_dataset_main dataset. Details about the number of images, the annotation process, and the train/validation split are not documented here.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch is shown after the list):
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 1337
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: polynomial
- training_steps: 10000
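
As a rough guide, the hyperparameters above correspond to the following `transformers.TrainingArguments`; the output directory and evaluation cadence are assumptions, not taken from the original run.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ecc_segformer_main",  # hypothetical output path
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=1337,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="polynomial",
    max_steps=10_000,
    evaluation_strategy="epoch",  # assumption: the results table logs one evaluation per epoch
)
```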

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Crack | Iou Background | Iou Crack |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:--------------:|:--------------:|:---------:|
| 0.1069        | 1.0   | 172   | 0.1376          | 0.1660   | 0.3320        | 0.3320           | nan                 | 0.3320         | 0.0            | 0.3320    |
| 0.0682        | 2.0   | 344   | 0.1327          | 0.2298   | 0.4596        | 0.4596           | nan                 | 0.4596         | 0.0            | 0.4596    |
| 0.0666        | 3.0   | 516   | 0.2478          | 0.1200   | 0.2401        | 0.2401           | nan                 | 0.2401         | 0.0            | 0.2401    |
| 0.0639        | 4.0   | 688   | 0.1732          | 0.1538   | 0.3076        | 0.3076           | nan                 | 0.3076         | 0.0            | 0.3076    |
| 0.0624        | 5.0   | 860   | 0.1027          | 0.2334   | 0.4668        | 0.4668           | nan                 | 0.4668         | 0.0            | 0.4668    |
| 0.0557        | 6.0   | 1032  | 0.1003          | 0.1851   | 0.3703        | 0.3703           | nan                 | 0.3703         | 0.0            | 0.3703    |
| 0.0563        | 7.0   | 1204  | 0.1512          | 0.2007   | 0.4014        | 0.4014           | nan                 | 0.4014         | 0.0            | 0.4014    |
| 0.054         | 8.0   | 1376  | 0.1000          | 0.2401   | 0.4802        | 0.4802           | nan                 | 0.4802         | 0.0            | 0.4802    |
| 0.0546        | 9.0   | 1548  | 0.0933          | 0.2238   | 0.4475        | 0.4475           | nan                 | 0.4475         | 0.0            | 0.4475    |
| 0.0498        | 10.0  | 1720  | 0.0964          | 0.2303   | 0.4606        | 0.4606           | nan                 | 0.4606         | 0.0            | 0.4606    |
| 0.0515        | 11.0  | 1892  | 0.1107          | 0.2258   | 0.4516        | 0.4516           | nan                 | 0.4516         | 0.0            | 0.4516    |
| 0.0453        | 12.0  | 2064  | 0.0961          | 0.2557   | 0.5115        | 0.5115           | nan                 | 0.5115         | 0.0            | 0.5115    |
| 0.0431        | 13.0  | 2236  | 0.1027          | 0.2396   | 0.4792        | 0.4792           | nan                 | 0.4792         | 0.0            | 0.4792    |
| 0.0418        | 14.0  | 2408  | 0.1027          | 0.2521   | 0.5042        | 0.5042           | nan                 | 0.5042         | 0.0            | 0.5042    |
| 0.0426        | 15.0  | 2580  | 0.1059          | 0.2561   | 0.5123        | 0.5123           | nan                 | 0.5123         | 0.0            | 0.5123    |
| 0.0377        | 16.0  | 2752  | 0.1193          | 0.2281   | 0.4561        | 0.4561           | nan                 | 0.4561         | 0.0            | 0.4561    |
| 0.0369        | 17.0  | 2924  | 0.1161          | 0.2486   | 0.4972        | 0.4972           | nan                 | 0.4972         | 0.0            | 0.4972    |
| 0.036         | 18.0  | 3096  | 0.1058          | 0.2515   | 0.5029        | 0.5029           | nan                 | 0.5029         | 0.0            | 0.5029    |
| 0.034         | 19.0  | 3268  | 0.1176          | 0.2434   | 0.4868        | 0.4868           | nan                 | 0.4868         | 0.0            | 0.4868    |
| 0.0337        | 20.0  | 3440  | 0.1162          | 0.2254   | 0.4509        | 0.4509           | nan                 | 0.4509         | 0.0            | 0.4509    |
| 0.0281        | 21.0  | 3612  | 0.1203          | 0.2213   | 0.4426        | 0.4426           | nan                 | 0.4426         | 0.0            | 0.4426    |
| 0.0354        | 22.0  | 3784  | 0.1266          | 0.2384   | 0.4768        | 0.4768           | nan                 | 0.4768         | 0.0            | 0.4768    |
| 0.0323        | 23.0  | 3956  | 0.1223          | 0.2409   | 0.4818        | 0.4818           | nan                 | 0.4818         | 0.0            | 0.4818    |
| 0.0299        | 24.0  | 4128  | 0.1356          | 0.2195   | 0.4390        | 0.4390           | nan                 | 0.4390         | 0.0            | 0.4390    |
| 0.0294        | 25.0  | 4300  | 0.1285          | 0.2318   | 0.4636        | 0.4636           | nan                 | 0.4636         | 0.0            | 0.4636    |
| 0.0295        | 26.0  | 4472  | 0.1274          | 0.2559   | 0.5119        | 0.5119           | nan                 | 0.5119         | 0.0            | 0.5119    |
| 0.0252        | 27.0  | 4644  | 0.1387          | 0.2413   | 0.4827        | 0.4827           | nan                 | 0.4827         | 0.0            | 0.4827    |
| 0.029         | 28.0  | 4816  | 0.1468          | 0.2236   | 0.4472        | 0.4472           | nan                 | 0.4472         | 0.0            | 0.4472    |
| 0.0218        | 29.0  | 4988  | 0.1448          | 0.2433   | 0.4866        | 0.4866           | nan                 | 0.4866         | 0.0            | 0.4866    |
| 0.0275        | 30.0  | 5160  | 0.1478          | 0.2318   | 0.4635        | 0.4635           | nan                 | 0.4635         | 0.0            | 0.4635    |
| 0.0233        | 31.0  | 5332  | 0.1377          | 0.2502   | 0.5005        | 0.5005           | nan                 | 0.5005         | 0.0            | 0.5005    |
| 0.0252        | 32.0  | 5504  | 0.1458          | 0.2399   | 0.4797        | 0.4797           | nan                 | 0.4797         | 0.0            | 0.4797    |
| 0.0245        | 33.0  | 5676  | 0.1431          | 0.2480   | 0.4960        | 0.4960           | nan                 | 0.4960         | 0.0            | 0.4960    |
| 0.0225        | 34.0  | 5848  | 0.1562          | 0.2439   | 0.4879        | 0.4879           | nan                 | 0.4879         | 0.0            | 0.4879    |
| 0.0242        | 35.0  | 6020  | 0.1633          | 0.2323   | 0.4646        | 0.4646           | nan                 | 0.4646         | 0.0            | 0.4646    |
| 0.0213        | 36.0  | 6192  | 0.1666          | 0.2274   | 0.4549        | 0.4549           | nan                 | 0.4549         | 0.0            | 0.4549    |
| 0.0256        | 37.0  | 6364  | 0.1665          | 0.2340   | 0.4680        | 0.4680           | nan                 | 0.4680         | 0.0            | 0.4680    |
| 0.0237        | 38.0  | 6536  | 0.1658          | 0.2410   | 0.4819        | 0.4819           | nan                 | 0.4819         | 0.0            | 0.4819    |
| 0.0192        | 39.0  | 6708  | 0.1705          | 0.2286   | 0.4572        | 0.4572           | nan                 | 0.4572         | 0.0            | 0.4572    |
| 0.0198        | 40.0  | 6880  | 0.1688          | 0.2322   | 0.4644        | 0.4644           | nan                 | 0.4644         | 0.0            | 0.4644    |
| 0.0214        | 41.0  | 7052  | 0.1717          | 0.2315   | 0.4630        | 0.4630           | nan                 | 0.4630         | 0.0            | 0.4630    |
| 0.0197        | 42.0  | 7224  | 0.1764          | 0.2338   | 0.4677        | 0.4677           | nan                 | 0.4677         | 0.0            | 0.4677    |
| 0.0187        | 43.0  | 7396  | 0.1764          | 0.2437   | 0.4874        | 0.4874           | nan                 | 0.4874         | 0.0            | 0.4874    |
| 0.0212        | 44.0  | 7568  | 0.1874          | 0.2259   | 0.4519        | 0.4519           | nan                 | 0.4519         | 0.0            | 0.4519    |
| 0.0188        | 45.0  | 7740  | 0.1854          | 0.2362   | 0.4725        | 0.4725           | nan                 | 0.4725         | 0.0            | 0.4725    |
| 0.0188        | 46.0  | 7912  | 0.1772          | 0.2320   | 0.4641        | 0.4641           | nan                 | 0.4641         | 0.0            | 0.4641    |
| 0.0228        | 47.0  | 8084  | 0.1783          | 0.2385   | 0.4770        | 0.4770           | nan                 | 0.4770         | 0.0            | 0.4770    |
| 0.0199        | 48.0  | 8256  | 0.1850          | 0.2317   | 0.4634        | 0.4634           | nan                 | 0.4634         | 0.0            | 0.4634    |
| 0.0202        | 49.0  | 8428  | 0.1872          | 0.2336   | 0.4672        | 0.4672           | nan                 | 0.4672         | 0.0            | 0.4672    |
| 0.0181        | 50.0  | 8600  | 0.1803          | 0.2405   | 0.4810        | 0.4810           | nan                 | 0.4810         | 0.0            | 0.4810    |
| 0.0157        | 51.0  | 8772  | 0.1874          | 0.2349   | 0.4697        | 0.4697           | nan                 | 0.4697         | 0.0            | 0.4697    |
| 0.0162        | 52.0  | 8944  | 0.1889          | 0.2332   | 0.4665        | 0.4665           | nan                 | 0.4665         | 0.0            | 0.4665    |
| 0.0178        | 53.0  | 9116  | 0.1948          | 0.2357   | 0.4715        | 0.4715           | nan                 | 0.4715         | 0.0            | 0.4715    |
| 0.0166        | 54.0  | 9288  | 0.1911          | 0.2333   | 0.4666        | 0.4666           | nan                 | 0.4666         | 0.0            | 0.4666    |
| 0.0193        | 55.0  | 9460  | 0.1959          | 0.2306   | 0.4611        | 0.4611           | nan                 | 0.4611         | 0.0            | 0.4611    |
| 0.0199        | 56.0  | 9632  | 0.1999          | 0.2330   | 0.4659        | 0.4659           | nan                 | 0.4659         | 0.0            | 0.4659    |
| 0.0177        | 57.0  | 9804  | 0.1943          | 0.2319   | 0.4639        | 0.4639           | nan                 | 0.4639         | 0.0            | 0.4639    |
| 0.019         | 58.0  | 9976  | 0.1926          | 0.2327   | 0.4653        | 0.4653           | nan                 | 0.4653         | 0.0            | 0.4653    |
| 0.0187        | 58.14 | 10000 | 0.1918          | 0.2329   | 0.4658        | 0.4658           | nan                 | 0.4658         | 0.0            | 0.4658    |


### Framework versions

- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cpu
- Datasets 2.14.4
- Tokenizers 0.13.3