|
Quantization made by Richard Erkhov. |
|
|
|
[Github](https://github.com/RichardErkhov) |
|
|
|
[Discord](https://discord.gg/pvy7H8DZMG) |
|
|
|
[Request more models](https://github.com/RichardErkhov/quant_request) |
|
|
|
|
|
# Apollo-0.5B - EXL2
|
- Model creator: https://huggingface.co/FreedomIntelligence/ |
|
- Original model: https://huggingface.co/FreedomIntelligence/Apollo-0.5B/ |
|
|
|
|
|
## Available sizes |
|
|
|
| Branch | Bits | Description | |
|
| ------ | ---- | ----------- |
|
| [8_0](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-0.5B-exl2/tree/8_0) | 8.0 | Maximum quality that ExLlamaV2 can produce, near unquantized performance. |
|
| [6_5](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-0.5B-exl2/tree/6_5) | 6.5 | Very similar to 8.0, good tradeoff of size vs performance, **recommended**. |
|
| [5_0](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-0.5B-exl2/tree/5_0) | 5.0 | Slightly lower quality than 6.5, but usable on 8GB cards. |
|
| [4_25](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-0.5B-exl2/tree/4_25) | 4.25 | Bits per weight on par with 4-bit GPTQ, at slightly higher quality. |
|
| [3_5](https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-0.5B-exl2/tree/3_5) | 3.5 | Lower quality, only use if you have to. |
|
## Download instructions |
|
With git: |
|
```shell |
|
git clone --single-branch --branch 6_5 https://huggingface.co/RichardErkhov/FreedomIntelligence_-_Apollo-0.5B-exl2 Apollo-0.5B-6_5
|
``` |
|
With the huggingface-hub CLI:
|
```shell |
|
pip3 install huggingface-hub |
|
``` |
|
To download a specific branch, use the `--revision` parameter. For example, to download the 6.5 bpw branch: |
|
Linux: |
|
```shell |
|
huggingface-cli download RichardErkhov/FreedomIntelligence_-_Apollo-0.5B-exl2 --revision 6_5 --local-dir Apollo-0.5B-6_5 --local-dir-use-symlinks False
|
``` |
|
Windows (where `_` in folder names can occasionally cause problems):
|
|
|
```shell |
|
huggingface-cli download RichardErkhov/FreedomIntelligence_-_Apollo-0.5B-exl2 --revision 6_5 --local-dir Apollo-0.5B-6.5 --local-dir-use-symlinks False
|
``` |
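The same download can also be scripted from Python with `huggingface_hub`. A minimal sketch, assuming the repo lives under the RichardErkhov namespace as in the links above:

```python
# Minimal Python equivalent of the CLI download above.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="RichardErkhov/FreedomIntelligence_-_Apollo-0.5B-exl2",  # assumed namespace
    revision="6_5",                 # branch holding the 6.5 bpw quant
    local_dir="Apollo-0.5B-6_5",
)
```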
|
|
|
|
|
|
|
|
|
Original model description: |
|
--- |
|
license: apache-2.0 |
|
--- |
|
# Multilingual Medicine: Model, Dataset, Benchmark, Code |
|
|
|
Covering English, Chinese, French, Hindi, Spanish, and Arabic so far.
|
|
|
|
|
<p align="center"> |
|
👨🏻‍💻 <a href="https://github.com/FreedomIntelligence/Apollo" target="_blank">Github</a> • 📃 <a href="https://arxiv.org/abs/2403.03640" target="_blank">Paper</a> • 🌐 <a href="https://apollo.llmzoo.com/" target="_blank">Demo</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a> • 🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a>
|
<br> <a href="./README_zh.md">中文</a> | <a href="./README.md">English</a>
|
</p> |
|
|
|
 |
|
|
|
## Update
|
|
|
* **[2024.04.25]** [MedJamba](https://huggingface.co/FreedomIntelligence/Apollo-MedJamba) released; training and evaluation code are available in its [repo](https://github.com/FreedomIntelligence/MedJamba).
|
* **[2024.03.07]** [Paper](https://arxiv.org/abs/2403.03640) released. |
|
* **[2024.02.12]** <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a> and <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a> are published!
|
* **[2024.01.23]** Apollo repo is published!
|
|
|
|
|
## Results |
|
🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-0.5B" target="_blank">Apollo-0.5B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-1.8B" target="_blank">Apollo-1.8B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-2B" target="_blank">Apollo-2B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-6B" target="_blank">Apollo-6B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-7B" target="_blank">Apollo-7B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-34B" target="_blank">Apollo-34B</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-72B" target="_blank">Apollo-72B</a>
|
|
|
🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-MedJamba" target="_blank">MedJamba</a>
|
|
|
🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-0.5B-GGUF" target="_blank">Apollo-0.5B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-2B-GGUF" target="_blank">Apollo-2B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-6B-GGUF" target="_blank">Apollo-6B-GGUF</a> • 🤗 <a href="https://huggingface.co/FreedomIntelligence/Apollo-7B-GGUF" target="_blank">Apollo-7B-GGUF</a>
|
|
|
|
|
|
|
 |
|
|
|
|
|
## Usage Format |
|
|
|
```
User:{query}\nAssistant:{response}<|endoftext|>
```
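As an illustration only (not part of the original card): a minimal ExLlamaV2 inference sketch that applies this template to the 6.5 bpw quant from the download instructions above, assuming a recent exllamav2 (>= 0.1) with its dynamic generator API; the query string is illustrative.

```python
# Sketch: load the EXL2 quant and generate with the card's prompt template.
# Assumes exllamav2 >= 0.1 and the Apollo-0.5B-6_5 directory downloaded above.
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

config = ExLlamaV2Config("Apollo-0.5B-6_5")   # path to the downloaded branch
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)                   # load weights across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)

query = "What are common symptoms of influenza?"   # illustrative query
prompt = f"User:{query}\nAssistant:"               # template from this section
print(generator.generate(prompt=prompt, max_new_tokens=200))
```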
|
|
|
|
|
## Dataset & Evaluation |
|
|
|
- Dataset |
|
🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus" target="_blank">ApolloCorpus</a>
|
|
|
<details><summary>Click to expand</summary> |
|
|
|
 |
|
|
|
- [Zip File](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/blob/main/ApolloCorpus.zip) |
|
- [Data category](https://huggingface.co/datasets/FreedomIntelligence/ApolloCorpus/tree/main/train) |
|
- Pretrain: |
|
- data item: |
|
- json_name: {data_source}_{language}_{data_type}.json |
|
- data_type: medicalBook, medicalGuideline, medicalPaper, medicalWeb (from online forums), medicalWiki
|
- language: en (English), zh (Chinese), es (Spanish), fr (French), hi (Hindi)
|
- data_type: qa (QA pairs generated from the text)
|
- data_type==text: a list of strings
|
```json
[
    "string1",
    "string2",
    ...
]
```
|
- data_type==qa: a list of QA pairs (each a list of strings)
|
```json
[
    [
        "q1",
        "a1",
        "q2",
        "a2",
        ...
    ],
    ...
]
```
|
- SFT: |
|
- json_name: {data_source}_{language}.json |
|
- data_type: code, general, math, medicalExam, medicalPatient |
|
- data item: a list of QA pairs (each a list of strings); a loader sketch follows this block
|
```json
[
    [
        "q1",
        "a1",
        "q2",
        "a2",
        ...
    ],
    ...
]
```
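To make the layout concrete, a small sketch (the file name is illustrative) that loads one SFT file and flattens each QA-pair list into the Usage Format string above:

```python
# Sketch: convert one SFT file (a list of [q1, a1, q2, a2, ...] lists,
# as described above) into User/Assistant training strings.
import json

with open("medicalExam_en.json", encoding="utf-8") as f:  # illustrative file name
    dialogues = json.load(f)

samples = []
for turns in dialogues:
    text = ""
    for q, a in zip(turns[0::2], turns[1::2]):  # alternate (question, answer)
        text += f"User:{q}\nAssistant:{a}<|endoftext|>"
    samples.append(text)
```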
|
|
|
|
|
</details> |
|
|
|
|
|
|
|
- Evaluation |
|
🤗 <a href="https://huggingface.co/datasets/FreedomIntelligence/XMedbench" target="_blank">XMedBench</a>
|
|
|
<details><summary>Click to expand</summary> |
|
|
|
- EN: |
|
- [MedQA-USMLE](https://huggingface.co/datasets/GBaker/MedQA-USMLE-4-options) |
|
- [MedMCQA](https://huggingface.co/datasets/medmcqa/viewer/default/test) |
|
- [PubMedQA](https://huggingface.co/datasets/pubmed_qa): Because the results fluctuated too much, they were not used in the paper. |
|
- [MMLU-Medical](https://huggingface.co/datasets/cais/mmlu) |
|
- Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine |
|
- ZH: |
|
- [MedQA-MCMLE](https://huggingface.co/datasets/bigbio/med_qa/viewer/med_qa_zh_4options_bigbio_qa/test) |
|
- [CMB-single](https://huggingface.co/datasets/FreedomIntelligence/CMB): Not used in the paper |
|
- Randomly sampled 2,000 single-answer multiple-choice questions.
|
- [CMMLU-Medical](https://huggingface.co/datasets/haonan-li/cmmlu) |
|
- Anatomy, Clinical_knowledge, College_medicine, Genetics, Nutrition, Traditional_chinese_medicine, Virology |
|
- [CMExam](https://github.com/williamliujl/CMExam): Not used in the paper
|
- Randomly sampled 2,000 multiple-choice questions.
|
|
|
|
|
- ES: [Head_qa](https://huggingface.co/datasets/head_qa) |
|
- FR: [FrenchMedMCQA](https://github.com/qanastek/FrenchMedMCQA)
|
- HI: [MMLU_HI](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Hindi)
|
- Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine |
|
- AR: [MMLU_Ara](https://huggingface.co/datasets/FreedomIntelligence/MMLU_Arabic)
|
- Clinical knowledge, Medical genetics, Anatomy, Professional medicine, College biology, College medicine |
|
|
|
|
|
</details> |
|
|
|
|
|
## Results reproduction |
|
<details><summary>Click to expand</summary> |
|
|
|
**Waiting for Update** |
|
|
|
|
|
|
|
</details> |
|
|
|
|
|
|
|
|
|
## Citation |
|
Please use the following citation if you intend to use our dataset for training or evaluation: |
|
|
|
```bibtex
|
@misc{wang2024apollo, |
|
title={Apollo: Lightweight Multilingual Medical LLMs towards Democratizing Medical AI to 6B People}, |
|
author={Xidong Wang and Nuo Chen and Junyin Chen and Yan Hu and Yidong Wang and Xiangbo Wu and Anningzhe Gao and Xiang Wan and Haizhou Li and Benyou Wang}, |
|
year={2024}, |
|
eprint={2403.03640}, |
|
archivePrefix={arXiv}, |
|
primaryClass={cs.CL} |
|
} |
|
``` |
|
|
|
|
|
|