Umean committed
Commit d9bd4d2 · verified · 1 Parent(s): 1af003b

Update README.md

Files changed (1)
  1. README.md +33 -3
README.md CHANGED
@@ -1,3 +1,33 @@
- ---
- license: mit
- ---
+ ---
+ license: mit
+ datasets:
+ - Umean/B2NERD
+ language:
+ - en
+ - zh
+ library_name: peft
+ ---
+
+ # B2NER
+
+ We present B2NERD, a cohesive and efficient dataset that improves LLMs' generalization on the challenging Open NER task, refined from 54 existing English and Chinese datasets.
+ Our B2NER models, trained on B2NERD, outperform GPT-4 by 6.8-12.0 F1 points and surpass previous methods on 3 out-of-domain benchmarks across 15 datasets and 6 languages.
+
+ - 📖 Paper: [Beyond Boundaries: Learning a Universal Entity Taxonomy across Datasets and Languages for Open Named Entity Recognition](http://arxiv.org/abs/2406.11192)
+ - 🎮 GitHub Repo: https://github.com/UmeanNever/B2NER
+ - 📀 Data: See the data section below. You can download the data from [HuggingFace](https://huggingface.co/datasets/Umean/B2NERD) or [Google Drive](https://drive.google.com/file/d/11Wt4RU48i06OruRca2q_MsgpylzNDdjN/view?usp=drive_link).
+ - 💾 Model (LoRA Adapters): This repo hosts the B2NER LoRA adapter trained on top of InternLM2.5-7B. A minimal loading sketch is included at the end of this page.
+
+ **See the GitHub repo for more information about model usage and this work.**
+
+ # Cite
+ ```
+ @article{yang2024beyond,
+ title={Beyond Boundaries: Learning a Universal Entity Taxonomy across Datasets and Languages for Open Named Entity Recognition},
+ author={Yang, Yuming and Zhao, Wantong and Huang, Caishuang and Ye, Junjie and Wang, Xiao and Zheng, Huiyuan and Nan, Yang and Wang, Yuran and Xu, Xueying and Huang, Kaixin and others},
+ journal={arXiv preprint arXiv:2406.11192},
+ year={2024}
+ }
+ ```
+
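
For quick reference, below is a minimal sketch of loading this LoRA adapter with 🤗 Transformers and PEFT (the README's `library_name`). The base-model id and the adapter id used here are illustrative assumptions, not confirmed by this repo; check the GitHub repo for the exact base checkpoint and the instruction/prompt format the model expects.

```python
# Minimal sketch, assuming the adapter was trained on top of internlm2_5-7b-chat
# and that this repo's id is "Umean/B2NER-InternLM2.5-7B-LoRA" (both are placeholders;
# see the GitHub repo for the exact base checkpoint and prompt template).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "internlm/internlm2_5-7b-chat"        # assumed base model id
adapter_id = "Umean/B2NER-InternLM2.5-7B-LoRA"  # placeholder for this repo's id

tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, trust_remote_code=True, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the LoRA weights
model.eval()

# Build an instruction-style Open NER prompt following the format documented
# in the GitHub repo, then generate.
prompt = "..."  # fill in with the B2NER instruction template and your input text
inputs = tokenizer(prompt, return_tensors="pt").to(base_model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

If you prefer a standalone checkpoint, PEFT's `merge_and_unload()` can fold the LoRA weights into the base model before saving or serving.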