Transformers
GGUF
Italian
English
conversational
aashish1904 committed on
Commit 9e1a8d3 · verified · 1 Parent(s): 78ab502

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +53 -0
README.md ADDED
@@ -0,0 +1,53 @@

---
library_name: transformers
license: apache-2.0
language:
- it
- en
datasets:
- DeepMount00/Sonnet-3.5-ITA-INSTRUCTION
- DeepMount00/Sonnet-3.5-ITA-DPO
---

[![QuantFactory Banner](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)](https://hf.co/QuantFactory)

# QuantFactory/Lexora-Medium-7B-GGUF
This is a quantized version of [DeepMount00/Lexora-Medium-7B](https://huggingface.co/DeepMount00/Lexora-Medium-7B) created using llama.cpp.
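
The GGUF files in this repo can also be loaded without `transformers`, directly through the llama.cpp Python bindings. The snippet below is a minimal sketch assuming the `llama-cpp-python` package and its `Llama.from_pretrained` helper; the quant filename pattern is a placeholder, so substitute one of the GGUF files actually published in this repository.

```python
# Minimal sketch using llama-cpp-python (pip install llama-cpp-python).
# The filename pattern below is a placeholder; replace it with one of the
# GGUF quants actually listed under QuantFactory/Lexora-Medium-7B-GGUF.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="QuantFactory/Lexora-Medium-7B-GGUF",
    filename="*Q4_K_M.gguf",  # placeholder quant level
    n_ctx=4096,               # context window for generation
)

# Chat-style prompt in Italian, mirroring the original model card's usage
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Quanto fa 60 meno 21?"}],  # "What is 60 minus 21?"
    max_tokens=256,
    temperature=0.0,
)
print(response["choices"][0]["message"]["content"])
```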

# Original Model Card

## How to Use

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "DeepMount00/Lexora-Medium-7B"

# Load the tokenizer and the full-precision base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Italian math word problem: "Marco bought 5 boxes of chocolates. Each box
# contains 12 chocolates. He decided to give 3 chocolates to each of his
# 7 friends. How many chocolates will he have left after handing them out?"
prompt = [{'role': 'user', 'content': """Marco ha comprato 5 scatole di cioccolatini. Ogni scatola contiene 12 cioccolatini. Ha deciso di dare 3 cioccolatini a ciascuno dei suoi 7 amici. Quanti cioccolatini gli rimarranno dopo averli distribuiti ai suoi amici?"""}]

# Build the chat-formatted input tensor
inputs = tokenizer.apply_chat_template(
    prompt,
    add_generation_prompt=True,
    return_tensors='pt'
)

# Near-zero temperature with sampling enabled: effectively greedy decoding
tokens = model.generate(
    inputs.to(model.device),
    max_new_tokens=1024,
    temperature=0.001,
    do_sample=True
)

print(tokenizer.decode(tokens[0], skip_special_tokens=False))
```
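
As a sanity check on the example above, the prompt works out to 5 × 12 = 60 chocolates bought and 3 × 7 = 21 given away, so a correct answer is 39. Note that `temperature=0.001` with `do_sample=True` makes generation effectively greedy, so the output should be close to deterministic across runs.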