Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)


Apollo-1.8B - GGUF
- Model creator: https://huggingface.co/FreedomIntelligence/
- Original model: https://huggingface.co/FreedomIntelligence/Apollo-1.8B/

| Name | Quant method | Size |
| ---- | ---- | ---- |
| [Apollo-1.8B.Q2_K.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q2_K.gguf) | Q2_K | 0.78GB |
| [Apollo-1.8B.Q3_K_S.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q3_K_S.gguf) | Q3_K_S | 0.89GB |
| [Apollo-1.8B.Q3_K.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q3_K.gguf) | Q3_K | 0.97GB |
| [Apollo-1.8B.Q3_K_M.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q3_K_M.gguf) | Q3_K_M | 0.97GB |
| [Apollo-1.8B.Q3_K_L.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q3_K_L.gguf) | Q3_K_L | 1.0GB |
| [Apollo-1.8B.IQ4_XS.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.IQ4_XS.gguf) | IQ4_XS | 1.01GB |
| [Apollo-1.8B.Q4_0.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q4_0.gguf) | Q4_0 | 1.04GB |
| [Apollo-1.8B.IQ4_NL.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.IQ4_NL.gguf) | IQ4_NL | 1.05GB |
| [Apollo-1.8B.Q4_K_S.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q4_K_S.gguf) | Q4_K_S | 1.08GB |
| [Apollo-1.8B.Q4_K.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q4_K.gguf) | Q4_K | 1.16GB |
| [Apollo-1.8B.Q4_K_M.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q4_K_M.gguf) | Q4_K_M | 1.16GB |
| [Apollo-1.8B.Q4_1.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q4_1.gguf) | Q4_1 | 1.13GB |
| [Apollo-1.8B.Q5_0.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q5_0.gguf) | Q5_0 | 1.22GB |
| [Apollo-1.8B.Q5_K_S.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q5_K_S.gguf) | Q5_K_S | 1.24GB |
| [Apollo-1.8B.Q5_K.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q5_K.gguf) | Q5_K | 1.31GB |
| [Apollo-1.8B.Q5_K_M.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q5_K_M.gguf) | Q5_K_M | 1.31GB |
| [Apollo-1.8B.Q5_1.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q5_1.gguf) | Q5_1 | 1.31GB |
| [Apollo-1.8B.Q6_K.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q6_K.gguf) | Q6_K | 1.47GB |
| [Apollo-1.8B.Q8_0.gguf](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf/blob/main/Apollo-1.8B.Q8_0.gguf) | Q8_0 | 1.82GB |


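To run one of these files locally, download it from this repository and load it with any GGUF-compatible runtime. The snippet below is a minimal sketch, assuming the `huggingface_hub` and `llama-cpp-python` packages; the Q4_K_M file is picked only as an example, and the prompt follows the usage format described in the original model card below.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one quant from this repository (Q4_K_M chosen as an example).
model_path = hf_hub_download(
    repo_id="RichardErkhov/FreedomIntelligence_-_Apollo-1.8B-gguf",
    filename="Apollo-1.8B.Q4_K_M.gguf",
)

# Load the GGUF file; context size is an assumption, adjust as needed.
llm = Llama(model_path=model_path, n_ctx=2048)

# Apollo expects "User:{query}\nAssistant:" and stops at <|endoftext|>.
prompt = "User:What is hypertension?\nAssistant:"
output = llm(prompt, max_tokens=256, stop=["<|endoftext|>"])
print(output["choices"][0]["text"])
```
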
Original model description:
---
license: apache-2.0
---
# Multilingual Medicine: Model, Dataset, Benchmark, Code

Covering English, Chinese, French, Hindi, Spanish, and Arabic so far


<p align="center">
   👨🏻‍💻<a href="https://github.com/FreedomIntelligence/Apollo" target="_blank">Github</a> •📃 <a href="https://arxiv.org/abs/2403.03640" target="_blank">Paper</a> • 🌐 <a href="https://apollo.llmzoo.com/" target="_blank">Demo</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a>
   <br> <a href="./README_zh.md"> 中文 </a> | <a href="./README.md"> English </a>
</p>

![Apollo](assets/apollo_medium_final.png)

## 🌈 Update

* **[2024.03.07]** [Paper](https://arxiv.org/abs/2403.03640) released.
* **[2024.02.12]** <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a> and <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a> are published! 🎉
* **[2024.01.23]** Apollo repo is published! 🎉

## Results
🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-0.5B" target="_blank">Apollo-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-1.8B" target="_blank">Apollo-1.8B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-2B" target="_blank">Apollo-2B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-6B" target="_blank">Apollo-6B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-7B" target="_blank">Apollo-7B</a>

🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-0.5B-GGUF" target="_blank">Apollo-0.5B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-2B-GGUF" target="_blank">Apollo-2B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-6B-GGUF" target="_blank">Apollo-6B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF" target="_blank">Apollo-7B-GGUF</a>


![Apollo](assets/result.png)


## Usage Format

`User:{query}\nAssistant:{response}<|endoftext|>`

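As a concrete illustration, a single-turn prompt can be assembled as below. This is a minimal sketch: the example question is invented, and `<|endoftext|>` should be used as the stop sequence during generation.

```python
def build_prompt(query: str) -> str:
    # Single-turn Apollo template: the model's reply follows "Assistant:"
    # and generation should stop at the <|endoftext|> token.
    return f"User:{query}\nAssistant:"

# Hypothetical example query, used only for illustration.
prompt = build_prompt("What are common symptoms of iron-deficiency anemia?")
print(prompt)  # User:What are common symptoms of iron-deficiency anemia?\nAssistant:
```
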
## Dataset & Evaluation

- Dataset
  🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a>

   <details><summary>Click to expand</summary>

   ![Apollo](assets/dataset.png)

   - [Zip File](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/blob/main/ApolloCorpus.zip)
   - [Data category](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/tree/main/train)
     - Pretrain:
       - json_name: {data_source}_{language}_{data_type}.json
       - data_type: medicalBook, medicalGuideline, medicalPaper, medicalWeb (from online forums), medicalWiki, or qa (QA pairs generated from the text)
       - language: en (English), zh (Chinese), es (Spanish), fr (French), hi (Hindi)
       - data item with data_type==text: list of strings
         ```
         [
           "string1",
           "string2",
           ...
         ]
         ```
       - data item with data_type==qa: list of QA pairs, each a list of strings (see the loading sketch after this section)
         ```
         [
           [
             "q1",
             "a1",
             "q2",
             "a2",
             ...
           ],
           ...
         ]
         ```
     - SFT:
       - json_name: {data_source}_{language}.json
       - data_type: code, general, math, medicalExam, medicalPatient
       - data item: list of QA pairs, each a list of strings
         ```
         [
           [
             "q1",
             "a1",
             "q2",
             "a2",
             ...
           ],
           ...
         ]
         ```

   </details>
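Both the pretrain `qa` files and the SFT files share the list-of-QA-pairs layout shown above. The sketch below is one way to read such a file with plain `json`; the path is hypothetical and merely follows the `{data_source}_{language}.json` naming scheme, so point it at whichever file you extracted from ApolloCorpus.zip.

```python
import json
from pathlib import Path

# Hypothetical path following the SFT naming scheme {data_source}_{language}.json.
sft_file = Path("train/medicalExam_en.json")

with sft_file.open(encoding="utf-8") as f:
    dialogues = json.load(f)  # [["q1", "a1", "q2", "a2", ...], ...]

for turns in dialogues[:3]:
    # Consecutive (question, answer) entries make up one dialogue.
    for question, answer in zip(turns[0::2], turns[1::2]):
        print(f"Q: {question}\nA: {answer}\n")
```
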


- Evaluation
  🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a>

   <details><summary>Click to expand</summary>

   - EN:
     - [MedQA-USMLE](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options)
     - [MedMCQA](https://huggingface.co/datasets/medmcqa/viewer/default/test)
     - [PubMedQA](https://huggingface.co/datasets/pubmed_qa): not used in the paper because its results fluctuated too much.
     - [MMLU-Medical](https://huggingface.co/datasets/cais/mmlu) (see the loading sketch after this list)
       - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
   - ZH:
     - [MedQA-MCMLE](https://huggingface.co/datasets/bigbio/med_qa/viewer/med_qa_zh_4options_bigbio_qa/test)
     - [CMB-single](https://huggingface.co/datasets/FreedomIntelligence/CMB): not used in the paper
       - Randomly sampled 2,000 single-answer multiple-choice questions.
     - [CMMLU-Medical](https://huggingface.co/datasets/haonan-li/cmmlu)
       - Anatomy, Clinical_knowledge, College_medicine, Genetics, Nutrition, Traditional_chinese_medicine, Virology
     - [CExam](https://github.com/williamliujl/CMExam): not used in the paper
       - Randomly sampled 2,000 multiple-choice questions.
   - ES: [Head_qa](https://huggingface.co/datasets/head_qa)
   - FR: [Frenchmedmcqa](https://github.com/qanastek/FrenchMedMCQA)
   - HI: [MMLU_HI](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Hindi)
     - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine
   - AR: [MMLU_Ara](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Arabic)
     - Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine

   </details>
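The MMLU-Medical subsets listed above correspond to configurations of the `cais/mmlu` dataset. A minimal loading sketch, assuming the Hugging Face `datasets` package and that the subset names below still match the `cais/mmlu` config naming:

```python
from datasets import load_dataset

# Medical-related MMLU subsets named in the evaluation list above
# (config names assumed to follow the cais/mmlu convention).
MEDICAL_SUBSETS = [
    "clinical_knowledge",
    "medical_genetics",
    "anatomy",
    "professional_medicine",
    "college_biology",
    "college_medicine",
]

for subset in MEDICAL_SUBSETS:
    test_split = load_dataset("cais/mmlu", subset, split="test")
    print(subset, len(test_split))
```
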


## Results reproduction
<details><summary>Click to expand</summary>

**Waiting for Update**

</details>




## Citation
Please use the following citation if you intend to use our dataset for training or evaluation:

```
@misc{wang2024apollo,
   title={Apollo: Lightweight Multilingual Medical LLMs towards Democratizing Medical AI to 6B People},
   author={Xidong Wang and Nuo Chen and Junyin Chen and Yan Hu and Yidong Wang and Xiangbo Wu and Anningzhe Gao and Xiang Wan and Haizhou Li and Benyou Wang},
   year={2024},
   eprint={2403.03640},
   archivePrefix={arXiv},
   primaryClass={cs.CL}
}
```