---
license: apache-2.0
base_model:
- Qwen/QwQ-32B
base_model_relation: quantized
pipeline_tag: text-generation
---
Disclaimer: I don't know what I'm doing. I am not an expert at quantizing, so treat these quants as experimental.
Original Model: https://huggingface.co/Qwen/QwQ-32B
| QwQ 32B EXL2 | Size    |
|--------------|---------|
| 8.0bpw       | 33.5 GB |

Other bitrates are WIP.
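
To run the 8.0bpw quant locally, the exllamav2 Python library can load EXL2 weights directly. Below is a minimal sketch, assuming exllamav2 is installed and this repo has been downloaded to a local folder; the model path is a placeholder and the sampling settings are arbitrary examples, not recommendations from Qwen.

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

# Placeholder: point this at your local download of the 8.0bpw quant
model_dir = "models/QwQ-32B-exl2-8.0bpw"

# Load model config from the quantized folder
config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

# Load the model with autosplit across available GPUs
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)

tokenizer = ExLlamaV2Tokenizer(config)
generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)

# Example sampling settings (adjust to taste)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.6
settings.top_p = 0.95

prompt = "Explain EXL2 quantization in one sentence."
output = generator.generate_simple(prompt, settings, 200)
print(output)
```

The same weights should also load in front ends that wrap exllamav2, such as text-generation-webui or TabbyAPI, by pointing them at the downloaded folder.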