---
license: mit
base_model: microsoft/phi-2
tags:
- generated_from_trainer
model-index:
- name: V0507HMA15HB2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# V0507HMA15HB2
This model is a fine-tuned version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: -81.6054
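
A minimal inference sketch is shown below. It assumes the fine-tuned weights are published on the Hub and load as a standard causal LM; the repo ID, prompt format, dtype, and generation settings are illustrative, not taken from this card.

```python
# Minimal usage sketch (assumptions: repo ID is a placeholder, prompt format
# borrowed from the base phi-2 card; adjust dtype/device to your hardware).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "V0507HMA15HB2"  # placeholder; replace with the actual Hub repo ID

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # phi-2-sized models fit in fp16 on a single GPU
    device_map="auto",
)

prompt = "Instruct: Write a short poem about the sea.\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```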
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_steps: 100
- num_epochs: 3
- mixed_precision_training: Native AMP
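
For reference, these settings map roughly onto the following `TrainingArguments`. This is a sketch only: values listed above are copied verbatim, while the output directory and anything not listed (dataset handling, logging, saving, etc.) are assumptions.

```python
# Hypothetical reconstruction of the Trainer configuration from the list above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="V0507HMA15HB2",       # assumed; not stated in the card
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=16,   # 8 x 16 = total train batch size of 128
    seed=42,
    lr_scheduler_type="cosine_with_restarts",
    warmup_steps=100,
    num_train_epochs=3,
    fp16=True,                        # "Native AMP" mixed precision
)
```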
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| -10.1232 | 0.09 | 10 | -11.7425 |
| -13.0448 | 0.18 | 20 | -15.0037 |
| -17.5347 | 0.27 | 30 | -21.6644 |
| -25.551 | 0.36 | 40 | -31.2337 |
| -35.3456 | 0.45 | 50 | -41.1929 |
| -44.7681 | 0.54 | 60 | -50.2314 |
| -53.1455 | 0.63 | 70 | -57.6267 |
| -59.6872 | 0.73 | 80 | -63.3874 |
| -65.1855 | 0.82 | 90 | -67.4235 |
| -67.6972 | 0.91 | 100 | -68.9758 |
| -70.4407 | 1.0 | 110 | -72.7099 |
| -73.0595 | 1.09 | 120 | -72.9839 |
| -72.4114 | 1.18 | 130 | -73.4895 |
| -73.3489 | 1.27 | 140 | -73.0341 |
| -68.9142 | 1.36 | 150 | -71.6919 |
| -75.8434 | 1.45 | 160 | -76.9335 |
| -77.7082 | 1.54 | 170 | -79.3035 |
| -79.5405 | 1.63 | 180 | -78.0217 |
| -73.5315 | 1.72 | 190 | -72.0316 |
| -72.5674 | 1.81 | 200 | -74.5039 |
| -76.8928 | 1.9 | 210 | -77.8919 |
| -78.6004 | 1.99 | 220 | -79.7306 |
| -79.779 | 2.08 | 230 | -78.9037 |
| -78.5156 | 2.18 | 240 | -78.2094 |
| -77.3853 | 2.27 | 250 | -74.1239 |
| -77.7728 | 2.36 | 260 | -79.7795 |
| -80.4204 | 2.45 | 270 | -81.1776 |
| -81.1502 | 2.54 | 280 | -81.5114 |
| -81.4538 | 2.63 | 290 | -81.3391 |
| -81.3301 | 2.72 | 300 | -81.3797 |
| -81.3074 | 2.81 | 310 | -81.5299 |
| -81.527 | 2.9 | 320 | -81.5893 |
| -81.5978 | 2.99 | 330 | -81.6054 |
### Framework versions
- Transformers 4.36.0.dev0
- Pytorch 2.1.2+cu121
- Datasets 2.18.0
- Tokenizers 0.14.1
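
If reproducing this run, a quick way to confirm the environment matches the versions recorded above is a small check like the following (a sketch; the dev build of Transformers may need to be installed from source):

```python
# Compare installed library versions against those recorded in this card.
import datasets, tokenizers, torch, transformers

expected = {
    "transformers": "4.36.0.dev0",
    "torch": "2.1.2+cu121",
    "datasets": "2.18.0",
    "tokenizers": "0.14.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    have = installed[name]
    status = "OK" if have == want else f"MISMATCH (expected {want})"
    print(f"{name}: {have} {status}")
```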