---
base_model: mistralai/Mistral-7B-v0.1
datasets:
  - Open-Orca/OpenOrca
  - pubmed
  - medmcqa
  - maximegmd/medqa_alpaca_format
language:
  - en
license: apache-2.0
metrics:
  - accuracy
tags:
  - medical
  - mlx
pipeline_tag: text-generation
---

# cogbuji/MrGrammaticaOntology-internistai-SCT-DRIFT-clinical-problem-0.6.5

The model cogbuji/MrGrammaticaOntology-internistai-SCT-DRIFT-clinical-problem-0.6.5 was converted to MLX format from internistai/base-7b-v0.2 using mlx-lm version 0.16.0.
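For reference, a conversion like this can be reproduced with mlx-lm's converter. A minimal sketch, assuming the mlx-lm 0.16.0 CLI (`-q` quantizes the weights and can be dropped for a full-precision copy; `mlx_model` is a hypothetical output directory, not the path used for this upload):

```shell
# Install the converter, then pull the source weights from the Hub
# and write an MLX-format copy to a local directory.
pip install mlx-lm==0.16.0
python -m mlx_lm.convert \
    --hf-path internistai/base-7b-v0.2 \
    --mlx-path mlx_model \
    -q
```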

## Use with mlx

```shell
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Download the model and tokenizer from the Hub, then generate a completion.
model, tokenizer = load("cogbuji/MrGrammaticaOntology-internistai-SCT-DRIFT-clinical-problem-0.6.5")
response = generate(model, tokenizer, prompt="hello", verbose=True)
```
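Because the base model descends from Mistral-7B, instruction-style prompts generally work better than bare text. A minimal sketch of manual Mistral-style `[INST]` wrapping, assuming the fine-tune keeps the base chat format (when the bundled tokenizer ships a `chat_template`, prefer `tokenizer.apply_chat_template(...)` instead):

```python
def format_mistral_prompt(user_message: str) -> str:
    """Wrap a user message in Mistral's [INST] chat markup.

    Assumption: this fine-tune follows the standard Mistral template.
    Verify against the tokenizer's own chat_template before relying on it.
    """
    return f"<s>[INST] {user_message} [/INST]"

prompt = format_mistral_prompt("What are common causes of chest pain?")
```

The formatted string can then be passed as the `prompt` argument to `generate` above.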