add the results on JSTS v1.1
README.md (changed):

```diff
@@ -37,15 +37,15 @@ The experimental results evaluated on the dev set of
 | Model                  | MARC-ja   | JSTS                | JNLI      | JCommonsenseQA |
 | ---------------------- | --------- | ------------------- | --------- | -------------- |
 |                        | acc       | Pearson/Spearman    | acc       | acc            |
-| **LUKE Japanese base** | **0.965** | **0.
+| **LUKE Japanese base** | **0.965** | **0.916**/**0.877** | **0.912** | **0.842**      |
 | _Baselines:_           |           |
-| Tohoku BERT base       | 0.958     | 0.
-| NICT BERT base         | 0.958     | 0.
-| Waseda RoBERTa base    | 0.962     | 0.
-| XLM RoBERTa base       | 0.961     | 0.
+| Tohoku BERT base       | 0.958     | 0.909/0.868         | 0.899     | 0.808          |
+| NICT BERT base         | 0.958     | 0.910/0.871         | 0.902     | 0.823          |
+| Waseda RoBERTa base    | 0.962     | 0.913/0.873         | 0.895     | 0.840          |
+| XLM RoBERTa base       | 0.961     | 0.877/0.831         | 0.893     | 0.687          |
 
 The baseline scores are obtained from
-[here](https://github.com/yahoojapan/JGLUE/
+[here](https://github.com/yahoojapan/JGLUE/blob/a6832af23895d6faec8ecf39ec925f1a91601d62/README.md).
 
 ### Citation
 
```
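For context, the JSTS column reports Pearson and Spearman correlations between a model's predicted sentence-similarity scores and the gold annotations. A minimal pure-Python sketch of those two metrics (the `gold`/`pred` arrays are made-up illustrations, not actual JSTS predictions):

```python
def pearson(x, y):
    """Pearson correlation: linear agreement between two score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def ranks(x):
    """1-based ranks of the values in x, averaging tied values."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and x[order[j + 1]] == x[order[i]]:
            j += 1
        for k in range(i, j + 1):      # tied values share the average rank
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def spearman(x, y):
    """Spearman correlation: Pearson on the rank-transformed scores."""
    return pearson(ranks(x), ranks(y))

# Hypothetical gold and predicted similarity scores (0-5 scale), for illustration.
gold = [4.2, 1.0, 3.5, 2.8, 0.4]
pred = [4.0, 1.3, 3.6, 2.5, 0.8]
print(f"{pearson(gold, pred):.3f}/{spearman(gold, pred):.3f}")  # prints 0.991/1.000
```

Note that Spearman is 1.0 here because the predictions preserve the gold ranking exactly, even though the raw values differ slightly.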