---
license: cc-by-nc-sa-4.0
language:
- eu
pretty_name: BL2MP
size_categories:
- 1K<n<10K
---

# BL2MP (Basque L2 student-based Minimal Pairs)

BL2MP is a test set designed to assess the grammatical knowledge of language models in Basque, inspired by the BLiMP benchmark. The BL2MP dataset includes examples sourced from the [bai&by](https://www.baiby.com/en/) language academy,
derived from essays written by students enrolled at the academy. These instances provide a wealth of authentic and natural grammatical errors, representing genuine mistakes made by learners and
thus offering a realistic reflection of real-world language errors.
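
As in BLiMP, a model can be evaluated on such minimal pairs by checking whether it assigns higher probability to the grammatical sentence than to its ungrammatical counterpart. Below is a minimal sketch of this idea using pseudo-log-likelihood scoring with a masked language model; the checkpoint and the example pair are illustrative placeholders, not necessarily the setup used in the paper.

```python
# BLiMP-style scoring sketch: a model "passes" a minimal pair when it gives
# the grammatical sentence a higher pseudo-log-likelihood (PLL) than the
# ungrammatical one. The checkpoint and the Basque pair are illustrative only.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

MODEL = "bert-base-multilingual-cased"  # placeholder, not the paper's model
tok = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForMaskedLM.from_pretrained(MODEL).eval()

def pll(sentence: str) -> float:
    """Sum log-probabilities of each token, masking one position at a time."""
    ids = tok(sentence, return_tensors="pt")["input_ids"][0]
    score = 0.0
    for i in range(1, len(ids) - 1):  # skip [CLS] and [SEP]
        masked = ids.clone()
        masked[i] = tok.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        score += torch.log_softmax(logits, dim=-1)[ids[i]].item()
    return score

good = "Etxea handia da."   # "The house is big." (grammatical)
bad = "Etxea handiak da."   # same sentence with a wrong declension (illustrative)
print("pair scored correctly:", pll(good) > pll(bad))
```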

We randomly selected 1,800 sentences from student essays provided by the bai&by academy, consistently adhering to the "minimal pairs" criterion. To ensure balanced diversity, we distributed the examples equally across three proficiency levels (A: Beginner, B: Intermediate, C: Advanced) and three error types (E1: Declension, E2: Verb, E3: Structure and Order), as shown in the table below. This approach represents a variety of proficiency levels and error types within the dataset.

See our paper [*How Well Can BERT Learn the Grammar of an Agglutinative and Flexible-Order Language? The Case of Basque.*](), accepted at LREC-COLING 2024, and check our [GitHub repository](https://github.com/orai-nlp/bl2mp) for more.


| Types                   | Levels | # of sentences |
|-------------------------|--------|----------------|
| E1: Declension          | A      | 200            |
|                         | B      | 200            |
|                         | C      | 200            |
| E2: Verb                | A      | 200            |
|                         | B      | 200            |
|                         | C      | 200            |
| E3: Structure and Order | A      | 200            |
|                         | B      | 200            |
|                         | C      | 200            |
| Total                   |        | 1,800          |
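
If the dataset is available on the Hugging Face Hub, it can be loaded with the `datasets` library. The snippet below is a sketch: the repository ID (`orai-nlp/bl2mp`), split name, and row schema are assumptions; check the repository files for the actual layout.

```python
# Sketch of loading BL2MP with the Hugging Face `datasets` library.
# NOTE: the repo ID and split name below are assumptions; inspect the
# dataset files for the actual schema and column names.
from datasets import load_dataset

ds = load_dataset("orai-nlp/bl2mp", split="test")  # assumed repo ID and split

for row in ds.select(range(3)):
    print(row)  # each row should hold a minimal pair plus level/error metadata
```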



Authors
-----------
Gorka Urbizu [1] [2], Muitze Zulaika [1], Xabier Saralegi [1], Ander Corral [1]

Affiliation of the authors: 

[1] Orai NLP Technologies

[2] University of the Basque Country



Licensing
-------------

Copyright (C) by Orai NLP Technologies. 

The corpora, datasets and models created in this work are licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0). To view a copy of this license, visit [http://creativecommons.org/licenses/by-nc-sa/4.0/](https://creativecommons.org/licenses/by-nc-sa/4.0/).


Acknowledgements
-------------------
If you use these corpora, datasets or models, please cite the following paper:

- G. Urbizu, M. Zulaika, X. Saralegi, A. Corral. How Well Can BERT Learn the Grammar of an Agglutinative and Flexible-Order Language? The Case of Basque. The 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024). May 2024, Torino, Italy.



Contact information
-----------------------
Gorka Urbizu, Muitze Zulaika: {g.urbizu,m.zulaika}@orai.eus