---
license: llama2
language:
- en
---

Quant of [Sao10K's](https://huggingface.co/Sao10K) [Euryale-1.3-L2-70B](https://huggingface.co/Sao10K/Euryale-1.3-L2-70B).

Fits into 24GB at 8k context lol

```
python3 convert.py \
    -i /input/Sao10K_Euryale-1.3-L2-70B/ \
    -c /input/wikitext/0000.parquet \
    -o /output/temp/ \
    -cf /output/2.18bpw/ \
    -b 2.18 \
    -hb 6
```
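A rough back-of-envelope for why 2.18bpw fits in 24GB: the quantized weights of a ~70B-parameter model at 2.18 bits per weight come to roughly 19GB, and an fp16 KV cache at 8k context adds about 2.7GB more. This sketch assumes the stock Llama-2-70B architecture (80 layers, 8 grouped KV heads, head dim 128) and ignores the higher-precision head layer (`-hb 6`), activation buffers, and framework overhead, so the real footprint is a bit higher but still under 24GB:

```python
# Back-of-envelope VRAM estimate for a 2.18 bpw quant of a ~70B model.
# Assumed architecture (Llama-2-70B): 80 layers, 8 KV heads (GQA), head dim 128.
# Ignores the higher-bit head layer, activations, and runtime overhead.

params = 70e9
bpw = 2.18
weights_gb = params * bpw / 8 / 1e9          # bits -> bytes -> GB

layers, kv_heads, head_dim = 80, 8, 128
bytes_per_token = 2 * layers * kv_heads * head_dim * 2  # K and V, fp16
ctx = 8192
kv_gb = bytes_per_token * ctx / 1e9

total_gb = weights_gb + kv_gb
print(f"weights ~{weights_gb:.1f} GB + KV cache ~{kv_gb:.1f} GB = ~{total_gb:.1f} GB")
```

With these assumptions the total lands around 22GB, which is consistent with the model fitting on a single 24GB card at 8k context.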