vicuna-13b-v1.5-PL
ExLlamaV2 8 bpw (bits per weight) quant of https://huggingface.co/Aspik101/vicuna-13b-v1.5-PL-lora_unload
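
As a minimal sketch, the EXL2 weights can be loaded with the exllamav2 Python library. The local directory path, prompt, and sampling settings below are illustrative assumptions, not part of this repo:

```python
# Minimal sketch: run this EXL2 quant with the exllamav2 library.
# Assumes the repo was downloaded to a local directory first, e.g.:
#   huggingface-cli download altomek/vicuna-13b-v1.5-PL-8bpw-EXL2
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "./vicuna-13b-v1.5-PL-8bpw-EXL2"  # assumed local path
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)  # split layers across available GPUs

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7  # illustrative sampling settings

# Polish prompt, since this is a PL fine-tune
print(generator.generate_simple("Cześć! Jak się masz?", settings, num_tokens=128))
```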
Model tree for altomek/vicuna-13b-v1.5-PL-8bpw-EXL2
- Base model: Aspik101/vicuna-13b-v1.5-PL-lora_unload