Update README.md
# example_title: "Chinese Example"
---

This is the B2NER model's LoRA adapter based on [InternLM2.5-7B](https://huggingface.co/internlm/internlm2_5-7b).

**See the GitHub repo for quick demo usage and more information about this work.**

## B2NER

We present B2NERD, a cohesive and efficient dataset, refined from 54 existing English and Chinese datasets, that improves LLMs' generalization on the challenging Open NER task.
Our B2NER models, trained on B2NERD, outperform GPT-4 by 6.8-12.0 F1 points and surpass previous methods on 3 out-of-domain benchmarks covering 15 datasets and 6 languages.

- Paper: [Beyond Boundaries: Learning a Universal Entity Taxonomy across Datasets and Languages for Open Named Entity Recognition](http://arxiv.org/abs/2406.11192)
- Code Repo: We provide code for both training and inference at https://github.com/UmeanNever/B2NER
- Data: See [B2NERD](https://huggingface.co/datasets/Umean/B2NERD).
- Model (LoRA Adapters): This repo hosts the B2NER LoRA adapter based on InternLM2.5-7B. See the [20B model](https://huggingface.co/Umean/B2NER-Internlm2-20B-LoRA) for a 20B adapter.

**See the GitHub repo for more information about model usage and this work.**

## Cite
```
@article{yang2024beyond,
  title={Beyond Boundaries: Learning a Universal Entity Taxonomy across Datasets and Languages for Open Named Entity Recognition},