---
datasets:
  - shay681/Precedents
language:
  - he
base_model:
  - google/mt5-small
pipeline_tag: text2text-generation
---

# Text2Text Precedents Finetuned Model

This model is a fine-tuned version of google/mt5-small on the shay681/Precedents dataset.
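The model can be loaded with the Transformers `text2text-generation` pipeline. A minimal inference sketch follows; it uses the base checkpoint `google/mt5-small` as a placeholder, since this repository's exact Hub id is not stated in the card — swap in the fine-tuned model's id when using it.

```python
from transformers import pipeline

# Placeholder id: replace "google/mt5-small" with this fine-tuned model's Hub id.
generator = pipeline(
    "text2text-generation",
    model="google/mt5-small",
)

# Hebrew legal text in, generated text out.
result = generator("טקסט משפטי לדוגמה", max_length=32)
print(result[0]["generated_text"])
```

The pipeline returns a list of dicts, one per input, each with a `generated_text` key.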

## Training and evaluation data

| Dataset    | Split      | # samples |
|------------|------------|-----------|
| Precedents | train      | 473,204   |
| Precedents | validation | 118,302   |

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- evaluation_strategy: "epoch"
- learning_rate: 5e-5
- train_batch_size: 4
- eval_batch_size: 4
- num_train_epochs: 5
- weight_decay: 0.01
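The hyperparameters above map directly onto a Transformers `Seq2SeqTrainingArguments` object. The sketch below is illustrative, not the author's actual training script; the `output_dir` name is an assumption.

```python
from transformers import Seq2SeqTrainingArguments

# Config fragment mirroring the listed hyperparameters; pass to Seq2SeqTrainer
# along with the model, tokenizer, and tokenized train/validation splits.
args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-precedents",  # assumed name
    evaluation_strategy="epoch",
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    num_train_epochs=5,
    weight_decay=0.01,
)
```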

### Framework versions

- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6

## Results

| Metric   | Value |
|----------|-------|
| Accuracy | 0.075 |
| F1       | 0.024 |

## About Me

Created by Shay Doner. This is my final project for my M.Sc. in Intelligent Systems at Afeka College in Tel Aviv. For collaboration inquiries, please contact me at: [email protected]