---
library_name: transformers
tags: []
---
# Time Saving Stated Aim Classifier
This is a RoBERTa-large model trained to classify whether an explicitly stated aim described in a British [historical patent](https://huggingface.co/datasets/matthewleechen/300YearsOfBritishPatents) is to save time.
Hyperparameters (a minimal fine-tuning sketch with these settings follows the list):
- learning rate = 3e-5
- batch size = 50
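The sketch below shows one way to reproduce these settings with the `Trainer` API. It is illustrative only: it fine-tunes the generic `roberta-large` checkpoint on a tiny in-memory toy dataset, since the actual annotated training data and preprocessing are not part of this card.
```python
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)

# Toy examples standing in for the annotated patent aims (texts and labels are invented).
toy = Dataset.from_dict({
    "text": [
        "improvements in looms whereby cloth may be woven in less time",
        "a new ornamental design for iron railings",
    ],
    "label": [1, 0],
})

tokenizer = AutoTokenizer.from_pretrained("roberta-large")
model = AutoModelForSequenceClassification.from_pretrained("roberta-large", num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

toy = toy.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="time_saving_aim_classifier",
    learning_rate=3e-5,              # lr from the card
    per_device_train_batch_size=50,  # batch size from the card
    num_train_epochs=1,
)

Trainer(model=model, args=args, train_dataset=toy).train()
```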
Validation set results:
```text
{'eval_loss': 0.6435797214508057,
'eval_accuracy': 0.85,
'eval_precision': 0.8466448445171849,
'eval_recall': 0.85,
'eval_f1': 0.8480286738351254,
'eval_runtime': 1.075,
'eval_samples_per_second': 55.816,
'eval_steps_per_second': 1.861}
```
Test set results:
```text
{'eval_loss': 0.9021337032318115,
'eval_accuracy': 0.8333333333333334,
'eval_precision': 0.8278777959629023,
'eval_recall': 0.8333333333333334,
'eval_f1': 0.825925925925926,
'eval_runtime': 0.8154,
'eval_samples_per_second': 73.583,
'eval_steps_per_second': 2.453}
```
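To classify a stated aim, the model can be loaded with the standard `text-classification` pipeline. The repository id below is a placeholder for this model's actual Hub path, and the label names depend on the saved config.
```python
from transformers import pipeline

# Placeholder repo id; substitute this model's actual Hub path.
classifier = pipeline("text-classification", model="<username>/<this-model-repo>")

aim = "The object of this invention is to enable bricks to be moulded in far less time."
print(classifier(aim))
# e.g. [{'label': 'LABEL_1', 'score': ...}]; label names depend on the saved config
```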