---
library_name: transformers
license: mit
base_model: pierreguillou/bert-base-cased-squad-v1.1-portuguese
tags:
- generated_from_trainer
model-index:
- name: ibama_29102024_20241029175942
  results: []
---
# ibama_29102024_20241029175942
This model is a fine-tuned version of [pierreguillou/bert-base-cased-squad-v1.1-portuguese](https://huggingface.co/pierreguillou/bert-base-cased-squad-v1.1-portuguese) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 4.1817
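This is an extractive question-answering model, so it can be loaded with the standard `question-answering` pipeline. A minimal sketch, assuming the checkpoint is available under the repo id or local path `ibama_29102024_20241029175942` (the question and context below are illustrative placeholders, not data from the training set):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; replace the model path with the actual
# repo id or local directory where this model was saved (assumption).
qa = pipeline(
    "question-answering",
    model="ibama_29102024_20241029175942",
)

# Illustrative Portuguese question/context pair (placeholder data).
result = qa(
    question="Qual foi o valor da multa aplicada?",
    context="O IBAMA aplicou uma multa de R$ 50.000,00 pela infração ambiental.",
)
print(result["answer"], result["score"])
```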
## Model description
Dataset with 1,750 records. Average context length: 2,467.44 characters.

- `train`: 1,421 records
- `test`: 329 records (20%)

Evaluation on the test split: `{'exact_match': 6.99, 'f1': 41.36}`
Results:

With 6,697-character contexts:

| Model | exact_match | f1 |
|:--|--:|--:|
| content/sample_data/ibama_29102024_20241029175942 | 3.98 | 38.43 |
| pierreguillou/bert-base-cased-squad-v1.1-portuguese | 6.42 | 37.48 |
| neuralmind/bert-base-portuguese-cased | 0.00 | 21.52 |

With 512-character contexts:

| Model | exact_match | f1 |
|:--|--:|--:|
| content/sample_data/ibama_29102024_20241029175942 | 12.68 | 70.77 |
| pierreguillou/bert-base-cased-squad-v1.1-portuguese | 1.41 | 38.42 |
| neuralmind/bert-base-portuguese-cased | 0.00 | 15.26 |

The fine-tuned model improves sharply when contexts are truncated to 512 characters, which keeps inputs within BERT's 512-token limit, while on the full-length contexts it offers little advantage over the base model.
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
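A sketch of the corresponding `TrainingArguments`; the output directory is an assumption, per-epoch evaluation is inferred from the validation losses below, and the Adam betas/epsilon are the `Trainer` defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ibama_29102024_20241029175942",  # assumed output path
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=4,
    fp16=True,               # Native AMP mixed precision
    eval_strategy="epoch",   # inferred from per-epoch validation losses
)
```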
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 45   | 4.5987          |
| No log        | 2.0   | 90   | 4.2668          |
| No log        | 3.0   | 135  | 4.2254          |
| No log        | 4.0   | 180  | 4.1817          |
### Framework versions
- Transformers 4.44.2
- Pytorch 2.5.0+cu121
- Datasets 3.0.2
- Tokenizers 0.19.1