---
datasets:
- osyvokon/zno
- byebyebye/ukr-wiki-qa-v1
- byebyebye/ukr-wiki-qa-v2
language:
- uk
---
## Introduction
CodeKobzar13B is a generative model trained on Ukrainian Wikipedia data and Ukrainian language rules. It has knowledge of Ukrainian history, language, literature, and culture.
## Model Information
This model is based on [vicuna-13b-v1.5](https://huggingface.co/lmsys/vicuna-13b-v1.5).
## Model Usage
Use the following prompt template: <br>
`USER: {input} ASSISTANT:` <br>
We recommend the following generation settings: <br>
<b>Temperature:</b> 0.8 <br>
<b>Top-p:</b> 0.95
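
For example, a question is wrapped into the template before tokenization (a minimal sketch; the question string is only an illustration taken from the inference example below):
```python
# Fill the template above with a user question (illustrative example).
PROMPT_TEMPLATE = "USER: {input} ASSISTANT:"
question = "Яке місто в Україні називають найромантичнішим?"
formatted_prompt = PROMPT_TEMPLATE.format(input=question)
# -> "USER: Яке місто в Україні називають найромантичнішим? ASSISTANT:"
```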
### Inference
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_path = "ponoma16/CodeKobzar13B"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    low_cpu_mem_usage=True,
    torch_dtype=torch.float16,
    load_in_8bit=True,  # requires the bitsandbytes package
    device_map="auto",
)
model.eval()

# Wrap the question in the prompt template described above.
PROMPT_TEMPLATE = "USER: {prompt} ASSISTANT: "
prompt = "Яке місто в Україні називають найромантичнішим?"

input_ids = tokenizer(
    PROMPT_TEMPLATE.format(prompt=prompt),
    return_tensors="pt",
    truncation=True,
).input_ids.to(model.device)

# Recommended sampling settings: temperature 0.8, top-p 0.95.
outputs = model.generate(
    input_ids=input_ids,
    do_sample=True,
    top_p=0.95,
    temperature=0.8,
    max_new_tokens=150,
)
prediction = tokenizer.batch_decode(outputs.cpu().numpy(), skip_special_tokens=True)[0]
print(prediction)
```
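
Alternatively, generation can be run through the `transformers` text-generation `pipeline`. This is a minimal sketch, assuming plain fp16 weights (no 8-bit quantization, so `bitsandbytes` is not needed, but a 13B model in fp16 needs roughly 26 GB of GPU memory) and the recommended sampling settings above; `device_map="auto"` requires the `accelerate` package.
```python
import torch
from transformers import pipeline

# Assumption: plain fp16 weights, placed on available GPUs via device_map="auto".
generator = pipeline(
    "text-generation",
    model="ponoma16/CodeKobzar13B",
    torch_dtype=torch.float16,
    device_map="auto",
)

# The input must still follow the USER/ASSISTANT prompt template described above.
prompt = "USER: Яке місто в Україні називають найромантичнішим? ASSISTANT:"
result = generator(
    prompt,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
    max_new_tokens=150,
)
print(result[0]["generated_text"])
```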
## Contact
If you have any inquiries, please feel free to raise an issue or reach out to us via email at: [email protected], [email protected].
We're here to assist you!