Lagadro/teknofest_ner_mammo

This model is a fine-tuned version of dbmdz/bert-base-turkish-cased on a private dataset of mammogram reports. It achieves the following results on the evaluation set:

  • Train Loss: 0.0812
  • Validation Loss: 0.0961
  • Epoch: 4
  • Overall Accuracy: 0.968

Model description

This model is trained to extract named entities from mammography reports.
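
For inference, the model can be loaded through the Transformers token-classification pipeline. The sketch below is illustrative only: it assumes a TensorFlow environment (matching the framework versions listed later), the example sentence is made up, and the entity labels returned depend on the tag set of the private Teknofest dataset.

```python
from transformers import pipeline

# Load the fine-tuned model as a token-classification (NER) pipeline.
# framework="tf" because the checkpoint was trained with TensorFlow/Keras.
ner = pipeline(
    "token-classification",
    model="Lagadro/teknofest_ner_mammo",
    framework="tf",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

# Hypothetical report sentence, for illustration only:
# "A well-circumscribed mass of about 8 mm is seen in the upper outer
#  quadrant of the right breast."
text = "Sağ meme üst dış kadranda yaklaşık 8 mm boyutunda düzgün sınırlı kitle izlenmiştir."

for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```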

Training and evaluation data

The data has been provided by Teknofest.

Training hyperparameters

  • Optimizer: AdamWeightDecay
  • Learning Rate:
    • Module: keras.optimizers.schedules
    • Class: PolynomialDecay
    • Config:
      • Initial Learning Rate: 2e-05
      • Decay Steps: 260
      • End Learning Rate: 0.0
      • Power: 1.0
      • Cycle: False
      • Name: None
    • Registered Name: None
  • Decay: 0.0
  • Beta 1: 0.9
  • Beta 2: 0.999
  • Epsilon: 1e-08
  • Amsgrad: False
  • Weight Decay Rate: 0.01
  • Training Precision: float32
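
The optimizer above can be reconstructed with transformers.create_optimizer, which builds AdamWeightDecay with a linear PolynomialDecay schedule. This is a minimal sketch under the settings listed above; the warmup step count is not given in this card, so zero warmup is assumed.

```python
from transformers import create_optimizer

# Rebuild AdamWeightDecay with the linear (power=1.0) PolynomialDecay schedule:
# learning rate 2e-5 decayed to 0.0 over 260 steps, weight decay rate 0.01.
optimizer, lr_schedule = create_optimizer(
    init_lr=2e-5,
    num_train_steps=260,   # decay steps from the schedule config
    num_warmup_steps=0,    # assumption: no warmup is listed in the card
    weight_decay_rate=0.01,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# The optimizer can then be passed to model.compile() on a
# TFAutoModelForTokenClassification before calling model.fit().
```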

Training results

Train Loss   Validation Loss   Epoch
0.5769       0.2226            0
0.1650       0.1344            1
0.1119       0.1110            2
0.0907       0.0994            3
0.0812       0.0961            4

Framework versions

  • Transformers 4.41.2
  • TensorFlow 2.15.0
  • Datasets 2.19.2
  • Tokenizers 0.19.1