Abliterated version of Mistral-Small-Instruct-2409, created with the code from https://github.com/andyrdt/refusal_direction.
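
For context, the abliteration technique in that repo removes a single "refusal direction" from the model's weights. The sketch below is only a minimal illustration of the idea, not the repo's actual code: the function names, tensor shapes, and the choice of which activations and weight matrices to use are assumptions.

```python
import torch

def refusal_direction(harmful_acts: torch.Tensor, harmless_acts: torch.Tensor) -> torch.Tensor:
    # Difference-of-means direction between residual-stream activations collected on
    # harmful vs. harmless prompts (hypothetical inputs of shape [n_prompts, d_model]).
    d = harmful_acts.mean(dim=0) - harmless_acts.mean(dim=0)
    return d / d.norm()

def orthogonalize(weight: torch.Tensor, direction: torch.Tensor) -> torch.Tensor:
    # Project the refusal direction out of a weight matrix that writes into the
    # residual stream (shape [d_model, d_in]), so the model can no longer express it.
    return weight - torch.outer(direction, direction @ weight)
```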

Quantized with these exllamav2 parameters. The first pass runs the calibration measurement and writes measurement.json:

```sh
python3 convert.py \
    -i ~/exllamav2/zetasepic_Mistral-Small-Instruct-2409-abliterated \
    -o ~/exllamav2/exl2/ \
    -om ~/exllamav2/ex2m/measurement.json \
    -l 16000 \
    -ml 16000 \
    -c erotiquant.parquet \
    -r 400 \
    -mr 50
```

The second pass reuses that measurement and produces the 4.5 bpw quant:

```sh
python3 convert.py \
    -i /root/exllamav2/zetasepic_Mistral-Small-Instruct-2409-abliterated \
    -o /root/temp/exl2/ \
    -nr \
    -m /root/exllamav2/measurement.json \
    -mr 50 \
    -cf /root/4.5bpw/ \
    -c erotiquant.parquet \
    -l 16000 \
    -r 400 \
    -b 4.5
```
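
The resulting 4.5 bpw EXL2 weights can be loaded with the exllamav2 Python API. A minimal sketch, assuming a recent exllamav2 release and a local copy of the quantized model; the model path, prompt, and sampling settings are placeholders:

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

model_dir = "/root/4.5bpw"  # placeholder: local path to the downloaded EXL2 quant

config = ExLlamaV2Config(model_dir)
model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)  # allocate cache while auto-splitting across GPUs
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8
settings.top_p = 0.9

prompt = "[INST] Write a short story about a lighthouse keeper. [/INST]"
print(generator.generate_simple(prompt, settings, 256))
```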