---
library_name: transformers
license: apache-2.0
base_model: facebook/detr-resnet-50-dc5
tags:
- generated_from_trainer
model-index:
- name: facebook/detr-resnet-50-dc5
  results: []
---

# facebook/detr-resnet-50-dc5

This model is a fine-tuned version of [facebook/detr-resnet-50-dc5](https://huggingface.co/facebook/detr-resnet-50-dc5) on an unspecified object-detection dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5836
- mAP: 0.5257
- mAP@50: 0.6508
- mAP@75: 0.6241
- mAP (small): 0.0
- mAP (medium): 0.4752
- mAP (large): 0.7513
- mAR@1: 0.1853
- mAR@10: 0.6
- mAR@100: 0.7147
- mAR (small): 0.0
- mAR (medium): 0.6684
- mAR (large): 0.8923
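
These are standard COCO-style detection metrics, and the field names above mirror the keys (`map`, `map_50`, `mar_100`, ...) returned by `torchmetrics`' `MeanAveragePrecision`; it is not stated which tool the original evaluation used, but the following sketch illustrates how such numbers are computed (the box coordinates are made-up illustrative values, not from this model's evaluation):

```python
import torch
from torchmetrics.detection import MeanAveragePrecision

# COCO-style detection metric; compute() returns map, map_50, map_75,
# map_small/medium/large, mar_1/10/100, and mar_small/medium/large.
metric = MeanAveragePrecision(box_format="xyxy", iou_type="bbox")

# One image's predictions and ground truth (illustrative values only).
preds = [{
    "boxes": torch.tensor([[10.0, 10.0, 120.0, 200.0]]),
    "scores": torch.tensor([0.92]),
    "labels": torch.tensor([0]),
}]
target = [{
    "boxes": torch.tensor([[12.0, 8.0, 118.0, 205.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, target)
print(metric.compute())  # e.g. {'map': ..., 'map_50': ..., 'mar_100': ...}
```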

## Model description

DETR (DEtection TRansformer) pairs a ResNet-50 convolutional backbone with a transformer encoder-decoder that predicts a set of bounding boxes and class labels directly, trained end-to-end with a bipartite-matching loss. The DC5 variant dilates the last (C5) stage of the backbone, doubling the resolution of the final feature map at the cost of extra compute. This checkpoint fine-tunes that base model for a downstream object-detection task; details of the task and label set were not recorded in this card.

## Intended uses & limitations

The model is intended for object detection on data similar to its (unspecified) fine-tuning set. Note that mAP and mAR on small objects are 0.0 at every evaluation step above, so this checkpoint should not be relied on to detect small objects.
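
A minimal inference sketch with the `transformers` object-detection API (the repo id `your-username/detr-resnet-50-dc5-finetuned` and the image path are placeholders, since the checkpoint's hosted location is not given in this card):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

# Placeholder repo id; substitute the actual location of this checkpoint.
checkpoint = "your-username/detr-resnet-50-dc5-finetuned"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes to thresholded detections in (xmin, ymin, xmax, ymax).
target_sizes = torch.tensor([image.size[::-1]])  # PIL size is (w, h); we need (h, w)
results = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```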

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- training_steps: 400
- mixed_precision_training: Native AMP
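
A `TrainingArguments` sketch reproducing the settings above (the `output_dir` and the explicit AdamW betas/epsilon, which are also the defaults, are assumptions; the original training script is not included in this card):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr-resnet-50-dc5-finetuned",  # assumption; not recorded above
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=400,   # "training_steps: 400"
    fp16=True,       # "Native AMP" mixed precision
)
```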

### Training results

| Training Loss | Epoch   | Step | Validation Loss | mAP    | mAP@50 | mAP@75 | mAP (small) | mAP (medium) | mAP (large) | mAR@1  | mAR@10 | mAR@100 | mAR (small) | mAR (medium) | mAR (large) |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|:------:|:-----------:|:------------:|:-----------:|:------:|:------:|:-------:|:-----------:|:------------:|:-----------:|
| 4.1002        | 0.7692  | 10   | 4.1741          | 0.0003 | 0.001  | 0.0003 | 0.0       | 0.0062     | 0.0002    | 0.0    | 0.0    | 0.0441  | 0.0       | 0.0474     | 0.0462    |
| 1.772         | 1.5385  | 20   | 1.4577          | 0.0298 | 0.05   | 0.0286 | 0.0       | 0.0185     | 0.0656    | 0.0294 | 0.1206 | 0.4882  | 0.0       | 0.3421     | 0.7769    |
| 1.5665        | 2.3077  | 30   | 1.3869          | 0.0339 | 0.0549 | 0.0351 | 0.0       | 0.0407     | 0.0516    | 0.0029 | 0.0824 | 0.6059  | 0.0       | 0.5158     | 0.8308    |
| 2.0258        | 3.0769  | 40   | 1.2246          | 0.0561 | 0.0797 | 0.0593 | 0.0       | 0.0398     | 0.1166    | 0.0265 | 0.1206 | 0.6441  | 0.0       | 0.5789     | 0.8385    |
| 1.5082        | 3.8462  | 50   | 1.1988          | 0.0477 | 0.0869 | 0.0542 | 0.0       | 0.0927     | 0.063     | 0.0235 | 0.0853 | 0.6471  | 0.0       | 0.6316     | 0.7692    |
| 1.3716        | 4.6154  | 60   | 1.1917          | 0.0549 | 0.1014 | 0.0602 | 0.0       | 0.0902     | 0.0761    | 0.0588 | 0.1618 | 0.5971  | 0.0       | 0.5421     | 0.7692    |
| 1.2398        | 5.3846  | 70   | 1.0554          | 0.1329 | 0.1674 | 0.1485 | 0.0       | 0.1462     | 0.1957    | 0.0765 | 0.1882 | 0.7294  | 0.0       | 0.7474     | 0.8154    |
| 1.401         | 6.1538  | 80   | 0.9179          | 0.1176 | 0.1821 | 0.1315 | 0.0       | 0.0835     | 0.2295    | 0.0529 | 0.1794 | 0.7294  | 0.0       | 0.7211     | 0.8538    |
| 2.0328        | 6.9231  | 90   | 0.9198          | 0.1361 | 0.2109 | 0.1554 | 0.0       | 0.0937     | 0.2424    | 0.0559 | 0.2088 | 0.6882  | 0.0       | 0.6368     | 0.8692    |
| 1.6358        | 7.6923  | 100  | 0.9298          | 0.2252 | 0.2898 | 0.2523 | 0.0       | 0.2279     | 0.3487    | 0.1059 | 0.3176 | 0.6882  | 0.0       | 0.6263     | 0.8846    |
| 0.8849        | 8.4615  | 110  | 0.8894          | 0.1893 | 0.2435 | 0.2248 | 0.0       | 0.1438     | 0.3337    | 0.0971 | 0.2265 | 0.7265  | 0.0       | 0.7263     | 0.8385    |
| 1.1906        | 9.2308  | 120  | 0.8505          | 0.2105 | 0.2704 | 0.2598 | 0.0       | 0.1879     | 0.3317    | 0.1324 | 0.2706 | 0.6853  | 0.0       | 0.6474     | 0.8462    |
| 1.0404        | 10.0    | 130  | 0.7320          | 0.2508 | 0.2998 | 0.29   | 0.0       | 0.2031     | 0.4149    | 0.1588 | 0.2971 | 0.7471  | 0.0       | 0.7421     | 0.8692    |
| 1.1534        | 10.7692 | 140  | 0.7996          | 0.2832 | 0.374  | 0.3479 | 0.0       | 0.2502     | 0.411     | 0.1676 | 0.3647 | 0.6647  | 0.0       | 0.6263     | 0.8231    |
| 1.1725        | 11.5385 | 150  | 0.7990          | 0.3115 | 0.4464 | 0.3745 | 0.0       | 0.2972     | 0.4147    | 0.1294 | 0.3735 | 0.6588  | 0.0       | 0.6158     | 0.8231    |
| 0.891         | 12.3077 | 160  | 0.9007          | 0.2856 | 0.3519 | 0.3449 | 0.0       | 0.2607     | 0.3788    | 0.1029 | 0.3529 | 0.6735  | 0.0       | 0.6263     | 0.8462    |
| 1.1           | 13.0769 | 170  | 0.7376          | 0.2642 | 0.3608 | 0.3377 | 0.0       | 0.2281     | 0.4018    | 0.1176 | 0.3676 | 0.7176  | 0.0       | 0.7        | 0.8538    |
| 1.2631        | 13.8462 | 180  | 0.7162          | 0.306  | 0.4363 | 0.3899 | 0.0       | 0.2997     | 0.3933    | 0.1412 | 0.45   | 0.7059  | 0.0       | 0.7053     | 0.8154    |
| 1.0496        | 14.6154 | 190  | 0.7276          | 0.2811 | 0.3866 | 0.3483 | 0.0       | 0.3061     | 0.3685    | 0.1471 | 0.3882 | 0.7235  | 0.0       | 0.7316     | 0.8231    |
| 0.8883        | 15.3846 | 200  | 0.6855          | 0.3373 | 0.4578 | 0.4385 | 0.0       | 0.3441     | 0.4654    | 0.15   | 0.4824 | 0.7412  | 0.0       | 0.7579     | 0.8308    |
| 0.8471        | 16.1538 | 210  | 0.6733          | 0.4351 | 0.5932 | 0.5367 | 0.0       | 0.3702     | 0.6215    | 0.15   | 0.5412 | 0.7206  | 0.0       | 0.7158     | 0.8385    |
| 0.9084        | 16.9231 | 220  | 0.6526          | 0.4279 | 0.5632 | 0.4848 | 0.0       | 0.4011     | 0.572     | 0.1824 | 0.5647 | 0.7294  | 0.0       | 0.7105     | 0.8692    |
| 0.8872        | 17.6923 | 230  | 0.6218          | 0.4376 | 0.5753 | 0.5274 | 0.0       | 0.3879     | 0.6215    | 0.1559 | 0.5853 | 0.7382  | 0.0       | 0.7263     | 0.8692    |
| 0.9739        | 18.4615 | 240  | 0.6590          | 0.4494 | 0.6293 | 0.505  | 0.0       | 0.3889     | 0.65      | 0.1471 | 0.5853 | 0.7029  | 0.0       | 0.6895     | 0.8308    |
| 0.7596        | 19.2308 | 250  | 0.6367          | 0.4625 | 0.6229 | 0.5322 | 0.0       | 0.4106     | 0.6581    | 0.1529 | 0.5853 | 0.7118  | 0.0       | 0.7053     | 0.8308    |
| 0.7124        | 20.0    | 260  | 0.6601          | 0.4619 | 0.6411 | 0.5327 | 0.0       | 0.39       | 0.6852    | 0.1559 | 0.5765 | 0.6794  | 0.0       | 0.6421     | 0.8385    |
| 0.8369        | 20.7692 | 270  | 0.6363          | 0.4736 | 0.64   | 0.5738 | 0.0       | 0.3993     | 0.737     | 0.1559 | 0.5853 | 0.6853  | 0.0       | 0.6474     | 0.8462    |
| 0.8608        | 21.5385 | 280  | 0.6304          | 0.496  | 0.6406 | 0.5583 | 0.0       | 0.4484     | 0.6973    | 0.1588 | 0.5912 | 0.7     | 0.0       | 0.6579     | 0.8692    |
| 0.6174        | 22.3077 | 290  | 0.6825          | 0.4808 | 0.6714 | 0.5569 | 0.0       | 0.4264     | 0.6738    | 0.1529 | 0.5765 | 0.6735  | 0.0       | 0.6158     | 0.8615    |
| 0.5903        | 23.0769 | 300  | 0.6037          | 0.5187 | 0.6804 | 0.6126 | 0.0       | 0.4604     | 0.709     | 0.1824 | 0.6118 | 0.7206  | 0.0       | 0.6842     | 0.8846    |
| 0.6325        | 23.8462 | 310  | 0.6373          | 0.529  | 0.6819 | 0.6246 | 0.0       | 0.4489     | 0.7601    | 0.1765 | 0.5941 | 0.7088  | 0.0       | 0.6579     | 0.8923    |
| 0.8569        | 24.6154 | 320  | 0.6131          | 0.5382 | 0.6684 | 0.6357 | 0.0       | 0.4862     | 0.7382    | 0.1794 | 0.6147 | 0.7294  | 0.0       | 0.7        | 0.8846    |
| 0.7056        | 25.3846 | 330  | 0.5700          | 0.5244 | 0.6545 | 0.6089 | 0.0       | 0.4891     | 0.6871    | 0.1824 | 0.6176 | 0.75    | 0.0       | 0.7421     | 0.8769    |
| 0.5988        | 26.1538 | 340  | 0.5738          | 0.5437 | 0.7119 | 0.651  | 0.0       | 0.5362     | 0.6823    | 0.1853 | 0.6206 | 0.7529  | 0.0       | 0.7579     | 0.8615    |
| 0.5209        | 26.9231 | 350  | 0.6136          | 0.5153 | 0.6944 | 0.6047 | 0.0       | 0.4772     | 0.7054    | 0.1824 | 0.5882 | 0.7059  | 0.0       | 0.6789     | 0.8538    |
| 0.6547        | 27.6923 | 360  | 0.6338          | 0.5166 | 0.6645 | 0.6224 | 0.0       | 0.4842     | 0.7072    | 0.1882 | 0.5971 | 0.7088  | 0.0       | 0.6842     | 0.8538    |
| 0.6324        | 28.4615 | 370  | 0.6083          | 0.5143 | 0.6543 | 0.6279 | 0.0       | 0.4683     | 0.729     | 0.1853 | 0.6    | 0.7118  | 0.0       | 0.6789     | 0.8692    |
| 0.6323        | 29.2308 | 380  | 0.5748          | 0.529  | 0.6552 | 0.637  | 0.0       | 0.48       | 0.7529    | 0.1853 | 0.6088 | 0.7206  | 0.0       | 0.6842     | 0.8846    |
| 0.4509        | 30.0    | 390  | 0.5758          | 0.5311 | 0.652  | 0.6325 | 0.0       | 0.4923     | 0.7454    | 0.1882 | 0.6206 | 0.7324  | 0.0       | 0.7053     | 0.8846    |
| 0.8259        | 30.7692 | 400  | 0.5836          | 0.5257 | 0.6508 | 0.6241 | 0.0       | 0.4752     | 0.7513    | 0.1853 | 0.6    | 0.7147  | 0.0       | 0.6684     | 0.8923    |


### Framework versions

- Transformers 4.46.3
- PyTorch 2.4.0
- Datasets 3.1.0
- Tokenizers 0.20.0