Update README.md
README.md CHANGED
@@ -4,8 +4,9 @@ language:
 - en
 thumbnail:
 tags:
-- text
+- text-generation
 - conversational
+- gpt-j
 - ggml
 inference: false
 ---
@@ -19,6 +20,7 @@ inference: false
 **Description:**
 - The `pygmalion-6b-main` files are quantized from the main branch of Pygmalion 6B. Also known as "experiment 2", released on January 13th.
 - The `pygmalion-6b-dev` files are quantized from the dev branch of Pygmalion 6B. Also known as "part 4/10 of experiment 7", released on March 12th.
+- The motivation behind these quantizations was to have one repository for both the main and dev versions of Pygmalion, as well as all quantization formats available. Some users may prefer the prose and creativity of Pygmalion 6B (and its lack of synthetic GPT-4 data) over newer models, or find 6B's requirements more affordable than 7B. For a modern alternative, [Pygmalion 2 7B](https://huggingface.co/TheBloke/Pygmalion-2-7B-GGUF) is worth investigating.
 
 **RAM usage:**
 Model | Startup RAM usage (KoboldCpp) | Startup RAM usage (Oobabooga)