Update README.md
README.md
CHANGED
@@ -28,14 +28,14 @@ You can download the 6 Chinese RoBERTa miniatures either from the [UER-py Github
 Here are scores on the development set of six Chinese tasks:
 
-| Model | Score |
+| Model              | Score | book_review | chnsenticorp | lcqmc | tnews(CLUE) | iflytek(CLUE) | ocnli(CLUE) |
 | ------------------ | :---: | :----: | :----------: | :---: | :---------: | :-----------: | :---------: |
-| RoBERTa-Tiny-WWM | 72.
-| RoBERTa-Mini-WWM | 76.
-| RoBERTa-Small-WWM | 77.
-| RoBERTa-Medium-WWM | 78.
-| RoBERTa-Base-WWM | 80.
-| RoBERTa-Large-WWM | 81.
+| RoBERTa-Tiny-WWM   | 72.2  | 83.6 | 91.8 | 81.8 | 62.1 | 55.4 | 58.6 |
+| RoBERTa-Mini-WWM   | 76.3  | 86.2 | 93.0 | 86.8 | 64.4 | 58.7 | 68.8 |
+| RoBERTa-Small-WWM  | 77.6  | 88.1 | 93.8 | 87.2 | 65.2 | 59.6 | 71.4 |
+| RoBERTa-Medium-WWM | 78.6  | 89.5 | 94.4 | 88.8 | 66.0 | 59.9 | 73.2 |
+| RoBERTa-Base-WWM   | 80.2  | 90.3 | 95.8 | 89.4 | 67.5 | 61.8 | 76.2 |
+| RoBERTa-Large-WWM  | 81.1  | 91.3 | 95.8 | 90.0 | 68.5 | 62.1 | 79.1 |
 
 For each task, we selected the best fine-tuning hyperparameters from the lists below, and trained with a sequence length of 128:
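The hunk header above references downloading the six Chinese RoBERTa-WWM miniatures from the UER-py Github. As a minimal usage sketch, such a checkpoint can be loaded with the Hugging Face `transformers` fill-mask pipeline; the model identifier below is an assumption for illustration and should be replaced with the name of the miniature actually downloaded.

```python
# Minimal sketch: masked-token prediction with one of the miniatures.
# The model identifier is an assumption for illustration, not confirmed by the diff.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="uer/roberta-tiny-wwm-chinese-cluecorpussmall")
print(unmasker("北京是[MASK]国的首都。"))  # expect candidates such as "中"
```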
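The Score column in the new table appears to be the unweighted mean of the six task scores; for example, for RoBERTa-Tiny-WWM: (83.6 + 91.8 + 81.8 + 62.1 + 55.4 + 58.6) / 6 ≈ 72.2.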
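The last line of the hunk mentions fine-tuning with a sequence length of 128. Below is a minimal fine-tuning sketch with Hugging Face `transformers`; the checkpoint name, the toy two-example dataset, and the hyperparameter values are assumptions for illustration only, while the reported scores come from the task datasets in the table and the best hyperparameters from the lists referenced in the README.

```python
# Minimal fine-tuning sketch with sequence length 128. The checkpoint name, the
# toy dataset, and the hyperparameters are illustrative assumptions, not the
# values used for the reported scores.
import torch
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "uer/roberta-tiny-wwm-chinese-cluecorpussmall"  # assumed identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

texts = ["这本书很好看。", "剧情太无聊了。"]  # toy sentiment examples
labels = [1, 0]
enc = tokenizer(texts, truncation=True, padding="max_length", max_length=128)

class ToyDataset(torch.utils.data.Dataset):
    """Wraps the tokenized examples for the Trainer."""
    def __init__(self, encodings, labels):
        self.encodings, self.labels = encodings, labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, idx):
        item = {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=3,
                           per_device_train_batch_size=32, learning_rate=3e-5),
    train_dataset=ToyDataset(enc, labels),
)
trainer.train()
```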