Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


Apollo-0.5B - bnb 8bits
- Model creator: https://huggingface.co/FreedomIntelligence/
- Original model: https://huggingface.co/FreedomIntelligence/Apollo-0.5B/
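
Not part of the original card: a minimal sketch of loading the model with the same bitsandbytes 8-bit scheme used for this upload, via the `transformers` API. The `device_map="auto"` placement and the choice to quantize the base checkpoint on the fly are assumptions; pointing `model_id` at this repo instead should load the pre-quantized weights the same way.

```python
# Hedged sketch: load Apollo-0.5B in 8-bit with bitsandbytes (assumes
# `transformers`, `accelerate`, and `bitsandbytes` are installed).
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "FreedomIntelligence/Apollo-0.5B"  # base checkpoint linked above

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # bnb 8-bit, as in this upload
    device_map="auto",  # assumption: let accelerate place the weights
)
```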


Original model description:
---
license: apache-2.0
---
# Multilingual Medicine: Model, Dataset, Benchmark, Code

Covering English, Chinese, French, Hindi, Spanish, and Arabic so far.


<p align="center">
   👨🏻‍💻<a href="https://github.com/FreedomIntelligence/Apollo" target="_blank">Github</a> • 📃 <a href="https://arxiv.org/abs/2403.03640" target="_blank">Paper</a> • 🌐 <a href="https://apollo.llmzoo.com/" target="_blank">Demo</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a>
   <br> <a href="./README_zh.md"> 中文 </a> | <a href="./README.md"> English </a>
</p>

![Apollo](assets/apollo_medium_final.png)

## 🌈 Update

* **[2024.04.25]** [MedJamba](https://huggingface.co/FreedomIntelligence/Apollo-MedJamba) released; training and evaluation code are available in the [repo](https://github.com/FreedomIntelligence/MedJamba).
* **[2024.03.07]** [Paper](https://arxiv.org/abs/2403.03640) released.
* **[2024.02.12]** <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a> and <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a> are published! 🎉
* **[2024.01.23]** Apollo repo is published! 🎉


## Results
🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-0.5B" target="_blank">Apollo-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-1.8B" target="_blank">Apollo-1.8B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-2B" target="_blank">Apollo-2B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-6B" target="_blank">Apollo-6B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-7B" target="_blank">Apollo-7B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-34B" target="_blank">Apollo-34B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-72B" target="_blank">Apollo-72B</a>

🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MedJamba" target="_blank">MedJamba</a>

🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-0.5B-GGUF" target="_blank">Apollo-0.5B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-2B-GGUF" target="_blank">Apollo-2B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-6B-GGUF" target="_blank">Apollo-6B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF" target="_blank">Apollo-7B-GGUF</a>


![Apollo](assets/result.png)

## Usage Format

User:{query}\nAssistant:{response}<|endoftext|>
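
Not part of the original card: a minimal sketch of using this plain-text template at inference time. The example question, the `max_new_tokens` cap, and the assumption that `<|endoftext|>` is the tokenizer's EOS token are illustrative, not taken from the card.

```python
# Hedged sketch: build the "User:{query}\nAssistant:" prompt and let the model
# complete the assistant turn (assumes `transformers` and `accelerate` are installed).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "FreedomIntelligence/Apollo-0.5B"  # the 8-bit copy in this repo can be substituted

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

query = "What are common symptoms of iron-deficiency anemia?"  # hypothetical example query
prompt = f"User:{query}\nAssistant:"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(
    **inputs,
    max_new_tokens=256,                   # illustrative cap
    eos_token_id=tokenizer.eos_token_id,  # assumes <|endoftext|> is the EOS token
)
# Strip the prompt tokens and print only the generated answer.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```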

## Dataset & Evaluation

- Dataset
  🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a>

<details><summary>Click to expand</summary>

![Apollo](assets/dataset.png)

- [Zip File](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/blob/main/ApolloCorpus.zip)
- [Data category](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/tree/main/train)
- Pretrain:
  - data item:
    - json_name: {data_source}_{language}_{data_type}.json
    - data_type: medicalBook, medicalGuideline, medicalPaper, medicalWeb (from online forums), medicalWiki
    - language: en (English), zh (Chinese), es (Spanish), fr (French), hi (Hindi)
    - data_type: qa (QA generated from the text)
    - data_type==text: list of strings
      ```
      [
        "string1",
        "string2",
        ...
      ]
      ```
    - data_type==qa: list of QA pairs (each a list of strings)
      ```
      [
        [
          "q1",
          "a1",
          "q2",
          "a2",
          ...
        ],
        ...
      ]
      ```
- SFT:
  - json_name: {data_source}_{language}.json
  - data_type: code, general, math, medicalExam, medicalPatient
  - data item: list of QA pairs (each a list of strings); see the loading sketch after this list
    ```
    [
      [
        "q1",
        "a1",
        "q2",
        "a2",
        ...
      ],
      ...
    ]
    ```

</details>
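
Not part of the original card: both JSON shapes documented above are plain lists, so they can be inspected with the standard library alone. In the sketch below the file names are hypothetical instances of the documented naming patterns; adjust the paths to wherever ApolloCorpus.zip was unpacked.

```python
# Hedged sketch: read the two ApolloCorpus JSON shapes described above.
import json

# data_type == text: a flat list of strings.
with open("textbook_en_text.json", encoding="utf-8") as f:  # hypothetical file name
    passages = json.load(f)  # ["string1", "string2", ...]

# data_type == qa (and the SFT files): a list of turn lists alternating q, a, q, a, ...
with open("exam_en.json", encoding="utf-8") as f:  # hypothetical file name
    dialogues = json.load(f)  # [["q1", "a1", "q2", "a2", ...], ...]

# Flatten the alternating turns into (question, answer) tuples.
qa_pairs = [
    (turns[i], turns[i + 1])
    for turns in dialogues
    for i in range(0, len(turns) - 1, 2)
]
print(f"{len(passages)} passages, {len(qa_pairs)} QA pairs")
```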



- Evaluation
  🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a>

<details><summary>Click to expand</summary>

- EN:
  - [MedQA-USMLE](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
  - [MedMCQA](https://huggingface.co/datasets/medmcqa/viewer/default/test)
  - [PubMedQA](https://huggingface.co/datasets/pubmed_qa): not used in the paper because its results fluctuated too much.
  - [MMLU-Medical](https://huggingface.co/datasets/cais/mmlu)
    - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
- ZH:
  - [MedQA-MCMLE](https://huggingface.co/datasets/bigbio/med_qa/viewer/med_qa_zh_4options_bigbio_qa/test)
  - [CMB-single](https://huggingface.co/datasets/FreedomIntelligence/CMB): not used in the paper
    - Randomly sampled 2,000 single-answer multiple-choice questions.
  - [CMMLU-Medical](https://huggingface.co/datasets/haonan-li/cmmlu)
    - Anatomy, Clinical_knowledge, College_medicine, Genetics, Nutrition, Traditional_chinese_medicine, Virology
  - [CExam](https://github.com/williamliujl/CMExam): not used in the paper
    - Randomly sampled 2,000 multiple-choice questions.

- ES: [Head_qa](https://huggingface.co/datasets/head_qa)
- FR: [Frenchmedmcqa](https://github.com/qanastek/FrenchMedMCQA)
- HI: [MMLU_HI](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Hindi)
  - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
- AR: [MMLU_Ara](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Arabic)
  - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine


</details>


## Results reproduction
<details><summary>Click to expand</summary>

**Waiting for Update**

</details>


## Citation
Please use the following citation if you intend to use our dataset for training or evaluation:

```
@misc{wang2024apollo,
   title={Apollo: Lightweight Multilingual Medical LLMs towards Democratizing Medical AI to 6B People},
   author={Xidong Wang and Nuo Chen and Junyin Chen and Yan Hu and Yidong Wang and Xiangbo Wu and Anningzhe Gao and Xiang Wan and Haizhou Li and Benyou Wang},
   year={2024},
   eprint={2403.03640},
   archivePrefix={arXiv},
   primaryClass={cs.CL}
}
```