---
language:
- eng
tags:
- sft
- StableLM
license:
- mit
datasets:
- LDJnr/LessWrong-Amplify-Instruct
- LDJnr/Pure-Dove
- LDJnr/Verified-Camel
quantized_by: bartowski
---
## Exllama v2 Quantizations of Nous-Capybara-7B-V1.9
Using turboderp's ExLlamaV2 v0.0.7 for quantization.
Each branch contains a quantization at a different bits per weight.
Conversion was done using wikitext.parquet as the calibration dataset.
Original model: https://huggingface.co/NousResearch/Nous-Capybara-7B-V1.9
- 4.0 bits per weight
- 6.0 bits per weight
- 8.0 bits per weight
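Each bits-per-weight option lives in its own branch of the repository; the branch names are assumed to match the labels above (e.g. `4.0`, as used in the clone command below). To check which branches are actually available before downloading:
```shell
# List the remote branches (one per quantization level) without cloning anything
git ls-remote --heads https://huggingface.co/bartowski/Nous-Capybara-7B-V1.9-exl2
```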
## Download instructions
With git:
```shell
git clone --single-branch --branch 4.0 https://huggingface.co/bartowski/Nous-Capybara-7B-V1.9-exl2
```
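Note that the weights are stored via Git LFS, so make sure it is set up before cloning (assuming `git-lfs` is already installed through your package manager):
```shell
# One-time Git LFS setup so the large weight files are fetched, not just their pointer files
git lfs install
```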
With the huggingface-hub Python library (credit to TheBloke for the instructions):
```shell
pip3 install huggingface-hub
```
To download the `main` branch (only useful if you only care about measurement.json) to a folder called `Nous-Capybara-7B-V1.9-exl2`:
```shell
mkdir Nous-Capybara-7B-V1.9-exl2
huggingface-cli download bartowski/Nous-Capybara-7B-V1.9-exl2 --local-dir Nous-Capybara-7B-V1.9-exl2 --local-dir-use-symlinks False
```
To download from a different branch, add the `--revision` parameter:
```shell
mkdir Nous-Capybara-7B-V1.9-exl2
huggingface-cli download bartowski/Nous-Capybara-7B-V1.9-exl2 --revision 4.0 --local-dir Nous-Capybara-7B-V1.9-exl2 --local-dir-use-symlinks False
```
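If you only need specific files (for example just the measurement.json from `main`), recent versions of huggingface-hub support filtering with `--include`; a sketch, assuming your installed version has that flag:
```shell
huggingface-cli download bartowski/Nous-Capybara-7B-V1.9-exl2 --include "measurement.json" --local-dir Nous-Capybara-7B-V1.9-exl2 --local-dir-use-symlinks False
```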