---
language: en
license: apache-2.0
datasets:
- nyu-mll/glue
---
# Bert-base-cased Fine Tuned Glue Mrpc Demo
This checkpoint was initialized from the pre-trained checkpoint bert-base-cased and subsequently fine-tuned on the MRPC task of GLUE using [this](https://colab.research.google.com/drive/162pW3wonGcMMrGxmA-jdxwy1rhqXd90x?usp=sharing) notebook.
Training was conducted for 3 epochs with a linearly decaying learning rate starting at 2e-05 and a total batch size of 32.
The model achieves a final training loss of 0.103 and an accuracy of 0.831.
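
The linearly decaying schedule mentioned above can be sketched as follows. This is a minimal illustration, not the notebook's exact code; the MRPC training-set size of roughly 3,668 examples is an assumption used only to estimate the step count.

```python
def linear_decay_lr(step, total_steps, peak_lr=2e-5):
    """Linearly decay the learning rate from peak_lr to 0 over total_steps."""
    return peak_lr * max(0.0, 1.0 - step / total_steps)

# Rough step count for 3 epochs at batch size 32
# (assumes ~3,668 MRPC training examples)
steps_per_epoch = (3668 + 31) // 32   # ceil division
total_steps = 3 * steps_per_epoch

print(linear_decay_lr(0, total_steps))            # peak learning rate at the first step
print(linear_decay_lr(total_steps, total_steps))  # decays to 0 by the last step
```

In practice the notebook likely relies on the scheduler built into the training framework rather than a hand-rolled function; this sketch only makes the decay behaviour explicit.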