Upload folder using huggingface_hub
- .gitattributes +5 -0
- README.md +45 -0
- utena-7b-nsfw-v2.Q3_K_S.gguf +3 -0
- utena-7b-nsfw-v2.Q4_K_M.gguf +3 -0
- utena-7b-nsfw-v2.Q5_K_M.gguf +3 -0
- utena-7b-nsfw-v2.Q6_K.gguf +3 -0
- utena-7b-nsfw-v2.Q8_0.gguf +3 -0
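
The commit title says the folder was pushed with huggingface_hub. A minimal sketch of such an upload, assuming a hypothetical local folder name and target repo id (neither appears in this commit view), could look like:

```python
# Sketch only: push a local folder of GGUF files to the Hugging Face Hub.
# The folder path and repo id below are assumptions, not taken from this commit.
from huggingface_hub import HfApi

api = HfApi()  # picks up the token from `huggingface-cli login` or HF_TOKEN
api.upload_folder(
    folder_path="./utena-7b-nsfw-v2-gguf",          # local dir with README.md and the .gguf files
    repo_id="your-username/utena-7b-nsfw-v2-GGUF",  # hypothetical target repository
    repo_type="model",
    commit_message="Upload folder using huggingface_hub",
)
```

upload_folder registers large binaries such as these .gguf files with Git LFS automatically, which is what the .gitattributes change below reflects.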
.gitattributes
CHANGED
@@ -33,3 +33,8 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
*.zst filter=lfs diff=lfs merge=lfs -text
*tfevents* filter=lfs diff=lfs merge=lfs -text
+utena-7b-nsfw-v2.Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+utena-7b-nsfw-v2.Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+utena-7b-nsfw-v2.Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+utena-7b-nsfw-v2.Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+utena-7b-nsfw-v2.Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
README.md
ADDED
@@ -0,0 +1,45 @@
+
+---
+license: openrail
+pipeline_tag: text-generation
+library_name: transformers
+language:
+- zh
+- en
+---
+
+## Original model card
+
+Buy me a coffee if you like this project ;)
+<a href="https://www.buymeacoffee.com/s3nh"><img src="https://www.buymeacoffee.com/assets/img/guidelines/download-assets-sm-1.svg" alt=""></a>
+
+#### Description
+
+GGUF format model files for [this project](https://huggingface.co/AI-B/UTENA-7B-NSFW-V2).
+
+### GGUF Specs
+
+GGUF is a format based on the existing GGJT, but makes a few changes to make it more extensible and easier to use. The following features are desired:
+
+- Single-file deployment: models can be easily distributed and loaded, and do not require any external files for additional information.
+- Extensible: new features can be added to GGML-based executors, and new information can be added to GGUF models, without breaking compatibility with existing models.
+- mmap compatibility: models can be loaded using mmap for fast loading and saving.
+- Easy to use: models can be easily loaded and saved using a small amount of code, with no need for external libraries, regardless of the language used.
+- Full information: all information needed to load a model is contained in the model file, and no additional information needs to be provided by the user.
+
+The key difference between GGJT and GGUF is the use of a key-value structure for the hyperparameters (now referred to as metadata), rather than a list of untyped values. This allows new metadata to be added without breaking compatibility with existing models, and lets the model be annotated with additional information that may be useful for inference or for identifying the model.
+
+### Inference
+
+User: Tell me a story about what quantization is and what we need to build one.
+
+Ashish: Quantization is the process of mapping continuous values to discrete values. For example, if you have a sound wave that varies continuously over time, you can quantize it to only allow certain frequencies at specific times. This can be useful for compressing data or reducing noise in signals. To build a quantizer, you would need a way to measure the input signal (such as a microphone), a method for determining the desired frequency range and resolution of the output, and a way to produce discrete values based on the measured input and desired output.
+
+User: What is an example of a use case for
+
+# Original model card
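
The GGUF Specs section above emphasises that a GGUF file is self-describing: hyperparameters live inside the file as typed key-value metadata. As a rough sketch (assuming the `gguf` Python package that ships with llama.cpp, and that one of the files from this commit has been downloaded locally), that metadata can be listed like this:

```python
# Sketch: dump the key-value metadata of a GGUF file (assumes `pip install gguf`).
from gguf import GGUFReader

reader = GGUFReader("utena-7b-nsfw-v2.Q4_K_M.gguf")  # any of the quantized files works

# Each metadata entry is a named, typed field rather than an untyped positional value.
for name, field in reader.fields.items():
    print(name, field.types)

# The tensor listing (names, shapes, quantization types) is stored in the same file.
for tensor in reader.tensors[:5]:
    print(tensor.name, tensor.shape, tensor.tensor_type)
```

The exact attribute names come from the gguf package and may vary by version; the point is simply that no side files are needed to inspect the model.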
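The inference transcript above was presumably produced by running one of these quantized files through a GGUF-capable runtime. A minimal sketch with llama-cpp-python (the author's actual setup and sampling settings are not stated, so the parameters below are guesses) would be:

```python
# Sketch: replay the prompt from the inference example against a local GGUF file.
# Assumes `pip install llama-cpp-python`; context size and sampling settings are assumptions.
from llama_cpp import Llama

llm = Llama(
    model_path="utena-7b-nsfw-v2.Q4_K_M.gguf",  # any quantization level from this repo
    n_ctx=4096,       # assumed context window
    n_gpu_layers=0,   # CPU-only; increase to offload layers to a GPU build
)

prompt = "User: Tell me a story about what quantization is and what we need to build one.\nAshish:"
out = llm(prompt, max_tokens=256, stop=["User:"])
print(out["choices"][0]["text"])
```

Smaller quantizations (Q3_K_S, Q4_K_M) trade some quality for memory; Q8_0 stays closest to the original weights but, at roughly 7.7 GB, needs the most RAM.
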
utena-7b-nsfw-v2.Q3_K_S.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ad81cfd7c0ca1573d92bcf66bb5c201058e0d44f748c6b59d064846956e9a87b
+size 3164567296
utena-7b-nsfw-v2.Q4_K_M.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:414fd5bdf18f5db31cb45e1b2824aa52f8675cfcec2580cd7fab80e95f02e1ed
+size 4368439040
utena-7b-nsfw-v2.Q5_K_M.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ae6688ae099c1e0bf405c08944dddbff2d04f0b89b67ef6c75bbf040a559e928
+size 5131409152
utena-7b-nsfw-v2.Q6_K.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ee58dc77f4106cce7a14844b8c639ed5c81008e73de029c7916febe65cc5556b
+size 5942064896
utena-7b-nsfw-v2.Q8_0.gguf
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:1e6e1540c934ee44b83b311176ba83a8c36a98a97c5f7cc345d1a882fedc234b
+size 7695857408