
Quantization made by Richard Erkhov.

Github

Discord

Request more models

TinyKAI-1B-v0.1 - bnb 8bits
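
As a rough usage sketch (not an official snippet from this card), an 8-bit bitsandbytes checkpoint like this one can typically be loaded directly with transformers; the repo id below is an assumption and may need to be adjusted to the actual quantized repository:

```python
# Minimal sketch: loading an 8-bit bitsandbytes quantization with transformers.
# Requires transformers, accelerate, and bitsandbytes to be installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "RichardErkhov/Keynote-Technology_-_TinyKAI-1B-v0.1-8bits"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    device_map="auto",  # places layers on available GPUs via accelerate
)

prompt = "Explain what a large language model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```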

Original model description:

license: apache-2.0
tags:
  - code
  - chatbot
datasets:
  - Keynote-Technology/PLANE-2K
  - togethercomputer/RedPajama-Data-V2

TinyKAI 1B


TinyKAI 1B is a fine-tuned LLM (Large Language Model) based on Falcon-rw-1B.

Direct Use

TinyKAI 1B is optimal for research on large language models, specifically the influence of web data on the properties of large language models (fairness, safety, limitations, capabilities, etc.).

Banned Use

Production use without adequate assessment of risks and mitigations, and any use case that may be considered irresponsible or harmful.

Limitations

TinyKAI 1B is trained on English data only and will not generate appropriately reasonable content in other languages. Because it is trained on data representative of the web, it carries the stereotypes and biases commonly encountered online. In addition, KAI-1B has a very low output limit (fewer than 2,000 characters) and struggles when asked to quote online sources.

Recommendations

We recommend that users of TinyKAI 1B consider fine-tuning it for personal use, and that precautions be taken for any commercial use.
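
For the personal fine-tuning route, a minimal sketch with PEFT/LoRA might look like the following; the base repo id, dataset file, target module name, and hyperparameters are assumptions rather than values from this card:

```python
# Minimal LoRA fine-tuning sketch (assumed setup, not the authors' recipe).
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base_id = "Keynote-Technology/TinyKAI-1B-v0.1"  # assumed original repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
tokenizer.pad_token = tokenizer.eos_token  # Falcon-style tokenizers have no pad token
model = AutoModelForCausalLM.from_pretrained(base_id)

# Falcon-rw-style blocks name their fused attention projection "query_key_value";
# adjust target_modules if the architecture differs.
lora_cfg = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                      target_modules=["query_key_value"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora_cfg)

# Plain-text corpus as a stand-in for whatever personal data is used.
dataset = load_dataset("text", data_files={"train": "my_corpus.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="tinykai-lora",
                           per_device_train_batch_size=1,
                           gradient_accumulation_steps=8,
                           num_train_epochs=1,
                           learning_rate=2e-4),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("tinykai-lora-adapter")
```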

Banned Use

TinyKAI-1B is governed by the Apache 2.0 license, which means that any use the license deems unacceptable is not allowed. We specifically ban the use of ANY AND ALL KAI MODELS for hate speech towards any particular person, group, or thing, due to legal and ethical issues.

Safetensors · Model size: 1.31B params · Tensor types: F32, FP16, I8