ABR-Finetuned-BioGPT

Many LLMs have broad general capabilities, but domain-specific knowledge is often more important. To fill this gap, we took the BioGPT model, which is pretrained on a large collection of biomedical abstracts, and fine-tuned it on antibiotic-resistance abstracts. To move closer to a ChatGPT-like assistant, several question-answering datasets were also included in the fine-tuning data.
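A minimal inference sketch using the Hugging Face transformers library is shown below. The repository id is a placeholder, since the full Hub path is not stated in this card, and the example prompt and generation settings are only illustrative.

```python
# Minimal inference sketch (assumes the model is published on the Hugging Face Hub;
# replace the repo id with the actual path, which is not given in this card).
from transformers import AutoTokenizer, AutoModelForCausalLM

repo_id = "your-username/ABR-Finetuned-BioGPT"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

prompt = "Question: What mechanisms allow bacteria to resist beta-lactam antibiotics?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```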

Data Trained On

Medical Domain Data

  • qiaojin/PubMedQA
  • Amirkid/MedQuad-dataset
  • PubMed ABR abstracts
  • medalpaca/medical_meadow_medical_flashcards
  • medalpaca/medical_meadow_wikidoc

General Question Answering

  • tatsu-lab/alpaca
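For reference, the datasets listed above can be pulled from the Hub with the `datasets` library; a hedged sketch is below. The question/answer template and the PubMedQA subset name (`pqa_labeled`) are assumptions, since this card does not document the exact formatting or configuration used during fine-tuning.

```python
# Sketch of assembling part of the fine-tuning corpus with the `datasets` library.
# The prompt template below is a hypothetical formatting choice, not the documented one.
from datasets import load_dataset

pubmedqa = load_dataset("qiaojin/PubMedQA", "pqa_labeled", split="train")
alpaca = load_dataset("tatsu-lab/alpaca", split="train")
flashcards = load_dataset("medalpaca/medical_meadow_medical_flashcards", split="train")

def to_prompt(question, answer):
    # Simple question/answer template (assumed, for illustration only).
    return f"Question: {question}\nAnswer: {answer}"

examples = [to_prompt(r["question"], r["long_answer"]) for r in pubmedqa]
examples += [to_prompt(r["instruction"], r["output"]) for r in alpaca]
examples += [to_prompt(r["input"], r["output"]) for r in flashcards]
```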
Model Details

  • Format: Safetensors
  • Model size: 1.57B params
  • Tensor type: F32