Run the Model

from transformers import LlamaTokenizer, LlamaForCausalLM

# Load the tokenizer for alexpaul/QI-large-v1
tokenizer = LlamaTokenizer.from_pretrained("alexpaul/QI-large-v1")

# Load the model in 8-bit precision (requires the bitsandbytes package)
# and let Accelerate place layers across available devices
base_model = LlamaForCausalLM.from_pretrained(
    "alexpaul/QI-large-v1",
    load_in_8bit=True,
    device_map="auto",
)
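Once the tokenizer and model are loaded, text can be generated with the standard transformers generate API. The sketch below is illustrative: the prompt and sampling parameters are assumptions, not values specified by the model card.

# Hypothetical prompt; replace with your own input
prompt = "Explain what this model does in one sentence."

# Tokenize and move the inputs to the same device as the model
inputs = tokenizer(prompt, return_tensors="pt").to(base_model.device)

# Generate up to 128 new tokens; sampling settings are illustrative defaults
output_ids = base_model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))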