# pixtral-12b-transformers-v0
This model is a fine-tuned version of [mistral-community/pixtral-12b](https://huggingface.co/mistral-community/pixtral-12b) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0344
## Model description
More information needed
## Intended uses & limitations
More information needed
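No usage guidance is documented. Since the checkpoint is a PEFT adapter on top of mistral-community/pixtral-12b (see the framework versions below), a minimal, untested loading sketch might look like the following; the prompt and task are illustrative assumptions, as the training data is not described:

```python
import torch
from PIL import Image
from peft import PeftModel
from transformers import AutoProcessor, LlavaForConditionalGeneration

base_id = "mistral-community/pixtral-12b"
adapter_id = "minhtien2405/pixtral-12b-transformers-v0"

processor = AutoProcessor.from_pretrained(base_id)
base = LlavaForConditionalGeneration.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
# Attach the fine-tuned adapter weights from this repo to the base model
model = PeftModel.from_pretrained(base, adapter_id)

image = Image.open("example.jpg")  # placeholder image; the target task is undocumented
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "Describe this image."},  # illustrative prompt
        ],
    }
]
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
inputs = processor(images=[image], text=prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)
print(processor.decode(output[0], skip_special_tokens=True))
```

If a single merged checkpoint is preferred for inference, `model = model.merge_and_unload()` folds the adapter into the base weights.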
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 4
- optimizer: paged_adamw_8bit with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: constant
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 10
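For reference, the configuration above maps onto `transformers.TrainingArguments` roughly as follows. This is a reconstruction, not the original training script: `output_dir` is a placeholder, and the evaluation cadence is inferred from the results table below.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="pixtral-12b-transformers-v0",  # placeholder, not taken from the run
    learning_rate=2e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 1 * 4 = 4
    optim="paged_adamw_8bit",       # requires bitsandbytes; betas/epsilon are the defaults
    lr_scheduler_type="constant",
    warmup_ratio=0.03,              # listed above; note a plain constant schedule ignores warmup
    num_train_epochs=10,
    eval_strategy="steps",
    eval_steps=50,                  # inferred: the table below logs validation every 50 steps
)
```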
### Training results
| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.0752        | 0.1998 | 50   | 0.0506          |
| 0.035         | 0.3996 | 100  | 0.0443          |
| 0.0318        | 0.5994 | 150  | 0.0392          |
| 0.0296        | 0.7992 | 200  | 0.0329          |
| 0.0239        | 0.9990 | 250  | 0.0300          |
| 0.022         | 1.1958 | 300  | 0.0301          |
| 0.0201        | 1.3956 | 350  | 0.0343          |
| 0.0225        | 1.5954 | 400  | 0.0296          |
| 0.0197        | 1.7952 | 450  | 0.0272          |
| 0.0174        | 1.9950 | 500  | 0.0285          |
| 0.0167        | 2.1918 | 550  | 0.0266          |
| 0.0159        | 2.3916 | 600  | 0.0267          |
| 0.0141        | 2.5914 | 650  | 0.0262          |
| 0.017         | 2.7912 | 700  | 0.0270          |
| 0.0138        | 2.9910 | 750  | 0.0263          |
| 0.0126        | 3.1878 | 800  | 0.0272          |
| 0.0123        | 3.3876 | 850  | 0.0260          |
| 0.0111        | 3.5874 | 900  | 0.0249          |
| 0.011         | 3.7872 | 950  | 0.0257          |
| 0.015         | 3.9870 | 1000 | 0.0258          |
| 0.0094        | 4.1838 | 1050 | 0.0275          |
| 0.0093        | 4.3836 | 1100 | 0.0275          |
| 0.01          | 4.5834 | 1150 | 0.0272          |
| 0.0115        | 4.7832 | 1200 | 0.0269          |
| 0.012         | 4.9830 | 1250 | 0.0250          |
| 0.0079        | 5.1798 | 1300 | 0.0284          |
| 0.0081        | 5.3796 | 1350 | 0.0309          |
| 0.0096        | 5.5794 | 1400 | 0.0279          |
| 0.0094        | 5.7792 | 1450 | 0.0280          |
| 0.009         | 5.9790 | 1500 | 0.0276          |
| 0.0081        | 6.1758 | 1550 | 0.0329          |
| 0.008         | 6.3756 | 1600 | 0.0275          |
| 0.0076        | 6.5754 | 1650 | 0.0271          |
| 0.008         | 6.7752 | 1700 | 0.0292          |
| 0.0082        | 6.9750 | 1750 | 0.0270          |
| 0.005         | 7.1718 | 1800 | 0.0315          |
| 0.0069        | 7.3716 | 1850 | 0.0271          |
| 0.0063        | 7.5714 | 1900 | 0.0305          |
| 0.007         | 7.7712 | 1950 | 0.0282          |
| 0.006         | 7.9710 | 2000 | 0.0293          |
| 0.0059        | 8.1678 | 2050 | 0.0302          |
| 0.0042        | 8.3676 | 2100 | 0.0332          |
| 0.0053        | 8.5674 | 2150 | 0.0297          |
| 0.0053        | 8.7672 | 2200 | 0.0327          |
| 0.0057        | 8.9670 | 2250 | 0.0300          |
| 0.0066        | 9.1638 | 2300 | 0.0318          |
| 0.0042        | 9.3636 | 2350 | 0.0316          |
| 0.0037        | 9.5634 | 2400 | 0.0342          |
| 0.0066        | 9.7632 | 2450 | 0.0323          |
| 0.0045        | 9.9630 | 2500 | 0.0344          |
### Framework versions
- PEFT 0.14.0
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0