# distilbert-classn-LAlg-multihead-context-width-4

This model is a fine-tuned version of [dslim/distilbert-NER](https://huggingface.co/dslim/distilbert-NER) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.9001
- Accuracy: 0.7778
- F1: 0.7818
- Precision: 0.8035
- Recall: 0.7778
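
The card does not state how these metrics are averaged. Below is a minimal sketch of one plausible way to compute them with scikit-learn; the weighted averaging (under which recall equals accuracy, matching the numbers above) is an assumption, not something the card confirms:

```python
# Hypothetical sketch: one way the reported metrics could be computed.
# Weighted averaging is an assumption; the card does not specify it.
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(y_true, y_pred):
    accuracy = accuracy_score(y_true, y_pred)
    precision, recall, f1, _ = precision_recall_fscore_support(
        y_true, y_pred, average="weighted", zero_division=0
    )
    return {"accuracy": accuracy, "f1": f1, "precision": precision, "recall": recall}
```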
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 25
- mixed_precision_training: Native AMP
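
A minimal sketch of how these hyperparameters map onto 🤗 Transformers `TrainingArguments`. The `output_dir` is illustrative, the AdamW betas/epsilon are the library defaults, and the 50-step evaluation interval is inferred from the results table below:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-classn-LAlg-multihead-context-width-4",  # illustrative
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",        # betas=(0.9, 0.999) and epsilon=1e-08 are defaults
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=25,
    fp16=True,                  # "Native AMP" mixed-precision training
    eval_strategy="steps",      # the results table reports validation every 50 steps
    eval_steps=50,
)
```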
### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|---------------|---------|------|-----------------|----------|--------|-----------|--------|
| 2.4495        | 1.3514  | 50   | 2.5136          | 0.0556   | 0.0522 | 0.1279    | 0.0556 |
| 2.4414        | 2.7027  | 100  | 2.5005          | 0.0714   | 0.0678 | 0.0799    | 0.0714 |
| 2.4231        | 4.0541  | 150  | 2.4821          | 0.0714   | 0.0718 | 0.0911    | 0.0714 |
| 2.3702        | 5.4054  | 200  | 2.4614          | 0.0873   | 0.0946 | 0.1167    | 0.0873 |
| 2.3442        | 6.7568  | 250  | 2.4363          | 0.0714   | 0.0651 | 0.0683    | 0.0714 |
| 2.271         | 8.1081  | 300  | 2.3787          | 0.1349   | 0.1268 | 0.1264    | 0.1349 |
| 2.1474        | 9.4595  | 350  | 2.2815          | 0.1746   | 0.1619 | 0.1687    | 0.1746 |
| 1.9343        | 10.8108 | 400  | 2.0912          | 0.3413   | 0.3320 | 0.3770    | 0.3413 |
| 1.6174        | 12.1622 | 450  | 1.8072          | 0.4524   | 0.4455 | 0.5545    | 0.4524 |
| 1.1783        | 13.5135 | 500  | 1.4265          | 0.6111   | 0.6193 | 0.6449    | 0.6111 |
| 0.7923        | 14.8649 | 550  | 1.1770          | 0.6905   | 0.6985 | 0.7139    | 0.6905 |
| 0.4365        | 16.2162 | 600  | 1.0497          | 0.7143   | 0.7147 | 0.7417    | 0.7143 |
| 0.2509        | 17.5676 | 650  | 0.9858          | 0.7460   | 0.7486 | 0.7734    | 0.7460 |
| 0.1531        | 18.9189 | 700  | 0.9514          | 0.7460   | 0.7506 | 0.7804    | 0.7460 |
| 0.0923        | 20.2703 | 750  | 0.9163          | 0.7698   | 0.7709 | 0.7911    | 0.7698 |
| 0.0653        | 21.6216 | 800  | 0.9064          | 0.7778   | 0.7797 | 0.7978    | 0.7778 |
| 0.0509        | 22.9730 | 850  | 0.9130          | 0.7778   | 0.7801 | 0.8004    | 0.7778 |
| 0.04          | 24.3243 | 900  | 0.9001          | 0.7778   | 0.7818 | 0.8035    | 0.7778 |
### Framework versions

- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.1
- Tokenizers 0.21.0
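
For completeness, a hedged sketch of loading the checkpoint for inference. It assumes the checkpoint exposes a standard sequence-classification head; the "multihead" in the repo name suggests a custom architecture, in which case custom loading code would be needed:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "Heather-Driver/distilbert-classn-LAlg-multihead-context-width-4"

# Assumption: a standard sequence-classification head; a custom "multihead"
# architecture would require the author's own model code instead.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("Example sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])
```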
## Model tree

- Base model: [distilbert/distilbert-base-cased](https://huggingface.co/distilbert/distilbert-base-cased)
- Fine-tuned from: [dslim/distilbert-NER](https://huggingface.co/dslim/distilbert-NER)