---
license: apache-2.0
base_model:
- Qwen/QwQ-32B
base_model_relation: quantized
pipeline_tag: text-generation
---

Disclaimer: I don't know what I'm doing. I am not an expert at quantizing.

Original model: https://huggingface.co/Qwen/QwQ-32B

| QwQ 32B EXL2 | Size |
| --- | --- |
| **8.0bpw** | 33.5 GB |
| ~~7.0bpw~~ | WIP |
| ~~6.5bpw~~ | WIP |
| ~~6.0bpw~~ | WIP |
| ~~5.5bpw~~ | WIP |
| ~~5.0bpw~~ | WIP |
| ~~4.5bpw~~ | WIP |
| ~~4.0bpw~~ | WIP |
| ~~3.75bpw~~ | WIP |
| ~~3.5bpw~~ | WIP |