---
license: apache-2.0
base_model:
- Qwen/QwQ-32B
base_model_relation: quantized
pipeline_tag: text-generation
---
|
Disclaimer: I don't know what I'm doing. I am not an expert at quantizing.
|
|
|
Original Model: https://huggingface.co/Qwen/QwQ-32B |
|
| QwQ 32B EXL2 | Size |
| --- | --- |
| <a href="https://huggingface.co/cshared/Qwen-QwQ-32B-8.0bpw-exl2">**8.0bpw**</a> | 33.5 GB |
| ~~7.0bpw~~ | WIP |
| ~~6.5bpw~~ | WIP |
| ~~6.0bpw~~ | WIP |
| ~~5.5bpw~~ | WIP |
| ~~5.0bpw~~ | WIP |
| ~~4.5bpw~~ | WIP |
| ~~4.0bpw~~ | WIP |
| ~~3.75bpw~~ | WIP |
| ~~3.5bpw~~ | WIP |
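
Below is a minimal sketch of loading one of these EXL2 quants with the exllamav2 dynamic generator. The local directory path, prompt, and generation settings are placeholders, and the exact API may vary slightly between exllamav2 versions; adjust for your own setup and GPU memory.

```python
# Minimal sketch: run an EXL2 quant with exllamav2's dynamic generator.
# Assumes the quant has been downloaded locally (e.g. with `huggingface-cli download`)
# and that exllamav2 is installed; the path below is a placeholder.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2DynamicGenerator

model_dir = "./Qwen-QwQ-32B-8.0bpw-exl2"   # local copy of the 8.0bpw quant

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)    # cache is allocated while the model loads
model.load_autosplit(cache)                 # split layers across available GPUs
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2DynamicGenerator(model=model, cache=cache, tokenizer=tokenizer)
print(generator.generate(prompt="Briefly explain what EXL2 quantization is.",
                         max_new_tokens=256))
```

If you'd rather not write code, tools such as TabbyAPI or text-generation-webui's ExLlamaV2 loader can also serve EXL2 quants directly.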