
# aoi_clip_clean_new_sampler_fomula_clean
This model is a fine-tuned version of [OFA-Sys/chinese-clip-vit-base-patch16](https://huggingface.co/OFA-Sys/chinese-clip-vit-base-patch16) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 4.8862
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 40
- eval_batch_size: 44
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 60.0
- mixed_precision_training: Native AMP
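With a `linear` scheduler and no warmup listed, the learning rate decays linearly from 1e-05 to 0 over the full run. A minimal sketch of that schedule, assuming 88740 total optimizer steps (the last step in the results table, i.e. 60 epochs × 1479 steps) and zero warmup steps:

```python
def linear_lr(step, base_lr=1e-5, total_steps=88740, warmup_steps=0):
    """Learning rate at a given optimizer step for a linear
    warmup-then-decay schedule (warmup assumed to be 0 here)."""
    if step < warmup_steps:
        # Linear ramp from 0 up to base_lr during warmup.
        return base_lr * step / max(1, warmup_steps)
    # Linear decay from base_lr down to 0 at total_steps.
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# At the start of training the LR is the configured 1e-05; halfway
# through it has decayed to 5e-06; at the final step it reaches 0.
print(linear_lr(0), linear_lr(44370), linear_lr(88740))
```

This mirrors the behaviour of a standard linear decay schedule (e.g. `get_linear_schedule_with_warmup` in Transformers); the step counts here are taken from the results table, not from the training script itself.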
### Training results
| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 2.4122        | 6.0   | 8874  | 3.9394          |
| 2.2167        | 12.0  | 17748 | 4.1647          |
| 2.0965        | 18.0  | 26622 | 4.4300          |
| 2.0238        | 24.0  | 35496 | 4.5740          |
| 1.9938        | 30.0  | 44370 | 4.6265          |
| 1.9730        | 36.0  | 53244 | 4.6714          |
| 1.9583        | 42.0  | 62118 | 4.7931          |
| 1.9466        | 48.0  | 70992 | 4.7913          |
| 1.9415        | 54.0  | 79866 | 4.8448          |
| 1.9369        | 60.0  | 88740 | 4.8862          |
### Framework versions
- Transformers 4.42.3
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1