Umean committed on
Commit e31081b · verified · 1 Parent(s): 2077f4d

Update README.md

Files changed (1)
  1. README.md +7 -4
README.md CHANGED
@@ -14,19 +14,22 @@ library_name: peft
   # example_title: "Chinese Example"
   ---
 
- # B2NER
+ This is the B2NER model's LoRA adapter based on [InternLM2.5-7B](https://huggingface.co/internlm/internlm2_5-7b).
+ **See github repo for quick demo usage and more information about this work.**
+
+ ## B2NER
 
   We present B2NERD, a cohesive and efficient dataset that can improve LLMs' generalization on the challenging Open NER task, refined from 54 existing English or Chinese datasets.
   Our B2NER models, trained on B2NERD, outperform GPT-4 by 6.8-12.0 F1 points and surpass previous methods in 3 out-of-domain benchmarks across 15 datasets and 6 languages.
 
   - 📖 Paper: [Beyond Boundaries: Learning a Universal Entity Taxonomy across Datasets and Languages for Open Named Entity Recognition](http://arxiv.org/abs/2406.11192)
- - 🎮 Github Repo: https://github.com/UmeanNever/B2NER
- - 📀 Data: See below data section. You can download from [HuggingFace](https://huggingface.co/datasets/Umean/B2NERD) or [Google Drive](https://drive.google.com/file/d/11Wt4RU48i06OruRca2q_MsgpylzNDdjN/view?usp=drive_link).
+ - 🎮 Code Repo: We provide codes for both training and inference at https://github.com/UmeanNever/B2NER
+ - 📀 Data: See [B2NERD](https://huggingface.co/datasets/Umean/B2NERD).
   - 💾 Model (LoRA Adapters): Current repo saves the B2NER model LoRA adapter based on InternLM2.5-7B. See [20B model](https://huggingface.co/Umean/B2NER-Internlm2-20B-LoRA) for a 20B adapter.
 
   **See github repo for more information about model usage and this work.**
 
- # Cite
+ ## Cite
   ```
   @article{yang2024beyond,
   title={Beyond Boundaries: Learning a Universal Entity Taxonomy across Datasets and Languages for Open Named Entity Recognition},