windowz_test-022625

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how such per-class metrics are commonly computed follows the list):

  • Accuracy: 0.9908
  • F1: 0.9910
  • IoU: 0.9832
  • Per Class Metrics: {0: {'f1': 0.99751, 'iou': 0.99504, 'accuracy': 0.99628}, 1: {'f1': 0.98091, 'iou': 0.96254, 'accuracy': 0.99081}, 2: {'f1': 0.73081, 'iou': 0.5758, 'accuracy': 0.99448}}
  • Loss: 0.0169
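
The per-class values above can be reproduced with a standard confusion-matrix calculation. The following is a minimal sketch, assuming integer-labelled prediction and reference arrays; it is not the evaluation script that produced the numbers in this card, and the exact aggregation behind the overall Accuracy, F1, and IoU is not documented here.

```python
# Minimal sketch (not the original evaluation script) of how per-class
# F1, IoU and accuracy of the kind reported above are commonly computed.
import numpy as np

def per_class_metrics(preds: np.ndarray, labels: np.ndarray, num_classes: int) -> dict:
    """Treat each class as a one-vs-rest binary problem and report F1, IoU, accuracy."""
    results = {}
    for c in range(num_classes):
        pred_c = preds == c
        label_c = labels == c
        tp = np.logical_and(pred_c, label_c).sum()
        fp = np.logical_and(pred_c, ~label_c).sum()
        fn = np.logical_and(~pred_c, label_c).sum()
        tn = np.logical_and(~pred_c, ~label_c).sum()
        f1 = 2 * tp / (2 * tp + fp + fn) if (tp + fp + fn) else 0.0
        iou = tp / (tp + fp + fn) if (tp + fp + fn) else 0.0
        acc = (tp + tn) / (tp + tn + fp + fn)
        results[c] = {"f1": round(float(f1), 5),
                      "iou": round(float(iou), 5),
                      "accuracy": round(float(acc), 5)}
    return results

# Example with a 3-class label map, matching the three classes reported above.
preds = np.array([[0, 0, 1], [2, 1, 1]])
labels = np.array([[0, 0, 1], [2, 2, 1]])
print(per_class_metrics(preds, labels, num_classes=3))
```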

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (the sketch after this list shows how they map onto `TrainingArguments`):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 100
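
These settings map directly onto the Transformers `TrainingArguments` API. The snippet below is only a sketch of that mapping, assuming the run used the Hugging Face `Trainer`; the model class, dataset preparation, and any other arguments of the actual run are not documented in this card.

```python
# Sketch only: the hyperparameters above expressed as TrainingArguments.
# The output_dir, and the assumption that the Hugging Face Trainer was used,
# are illustrative, not taken from the original training setup.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="windowz_test-022625",  # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=1000,
    num_train_epochs=100,
)
```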

Training results

| Training Loss | Epoch | Step | IoU | Per Class Metrics | Validation Loss |
|:---:|:---:|:---:|:---:|:---|:---:|
| 0.4586 | 5.0 | 12815 | 0.9710 | {0: {'f1': 0.99385, 'iou': 0.98778, 'accuracy': 0.99081}, 1: {'f1': 0.96879, 'iou': 0.93947, 'accuracy': 0.98478}, 2: {'f1': 0.62073, 'iou': 0.45004, 'accuracy': 0.99365}} | 0.0890 |
| 0.4363 | 10.0 | 25630 | 0.9810 | {0: {'f1': 0.99729, 'iou': 0.9946, 'accuracy': 0.99595}, 1: {'f1': 0.97943, 'iou': 0.95969, 'accuracy': 0.99}, 2: {'f1': 0.6238, 'iou': 0.45327, 'accuracy': 0.99404}} | 0.0220 |
| 0.4148 | 15.0 | 38445 | 0.9785 | {0: {'f1': 0.99619, 'iou': 0.9924, 'accuracy': 0.99428}, 1: {'f1': 0.97538, 'iou': 0.95195, 'accuracy': 0.98824}, 2: {'f1': 0.71793, 'iou': 0.55998, 'accuracy': 0.99388}} | 0.0593 |
| 0.3935 | 20.0 | 51260 | 0.9743 | {0: {'f1': 0.99419, 'iou': 0.98845, 'accuracy': 0.99126}, 1: {'f1': 0.97367, 'iou': 0.94869, 'accuracy': 0.9874}, 2: {'f1': 0.67815, 'iou': 0.51303, 'accuracy': 0.99437}} | 0.0229 |
| 0.3755 | 25.0 | 64075 | 0.9826 | {0: {'f1': 0.99767, 'iou': 0.99534, 'accuracy': 0.99651}, 1: {'f1': 0.9796, 'iou': 0.96001, 'accuracy': 0.9902}, 2: {'f1': 0.71337, 'iou': 0.55445, 'accuracy': 0.99367}} | 0.0187 |
| 0.3834 | 30.0 | 76890 | 0.9814 | {0: {'f1': 0.99714, 'iou': 0.9943, 'accuracy': 0.99572}, 1: {'f1': 0.97847, 'iou': 0.95784, 'accuracy': 0.98967}, 2: {'f1': 0.71791, 'iou': 0.55995, 'accuracy': 0.99391}} | 0.0175 |
| 0.3609 | 35.0 | 89705 | 0.9832 | {0: {'f1': 0.99751, 'iou': 0.99504, 'accuracy': 0.99628}, 1: {'f1': 0.98091, 'iou': 0.96254, 'accuracy': 0.99081}, 2: {'f1': 0.73081, 'iou': 0.5758, 'accuracy': 0.99448}} | 0.0169 |
| 0.364 | 40.0 | 102520 | 0.9815 | {0: {'f1': 0.99756, 'iou': 0.99513, 'accuracy': 0.99635}, 1: {'f1': 0.97769, 'iou': 0.95635, 'accuracy': 0.98933}, 2: {'f1': 0.70736, 'iou': 0.54722, 'accuracy': 0.99295}} | 0.0210 |
| 0.3561 | 45.0 | 115335 | 0.9857 | {0: {'f1': 0.99789, 'iou': 0.9958, 'accuracy': 0.99685}, 1: {'f1': 0.98385, 'iou': 0.96822, 'accuracy': 0.99221}, 2: {'f1': 0.77212, 'iou': 0.62883, 'accuracy': 0.99536}} | 0.0208 |
| 0.3714 | 50.0 | 128150 | 0.9843 | {0: {'f1': 0.99769, 'iou': 0.99539, 'accuracy': 0.99654}, 1: {'f1': 0.98205, 'iou': 0.96473, 'accuracy': 0.99135}, 2: {'f1': 0.75937, 'iou': 0.61209, 'accuracy': 0.99479}} | 0.0170 |

Framework versions

  • Transformers 4.45.0
  • PyTorch 2.5.1+cu124
  • Datasets 2.21.0
  • Tokenizers 0.20.3
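
A quick way to confirm that a local environment matches these versions is to print them; a minimal sketch:

```python
# Minimal sketch: print installed versions to compare against those listed above.
import transformers, torch, datasets, tokenizers

print("Transformers:", transformers.__version__)  # card lists 4.45.0
print("PyTorch:", torch.__version__)              # card lists 2.5.1+cu124
print("Datasets:", datasets.__version__)          # card lists 2.21.0
print("Tokenizers:", tokenizers.__version__)      # card lists 0.20.3
```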

Safetensors

  • Model size: 544k params
  • Tensor type: F32