MCES10 committed
Commit fbf383a · verified · 1 Parent(s): c9f4e17

Upload README.md with huggingface_hub

Files changed (1)
1. README.md +52 -0
README.md ADDED
@@ -0,0 +1,52 @@
---
license: mit
license_link: https://huggingface.co/microsoft/Phi-4-reasoning-plus/resolve/main/LICENSE
language:
- en
base_model: microsoft/Phi-4-reasoning-plus
pipeline_tag: text-generation
tags:
- phi
- nlp
- math
- code
- chat
- conversational
- reasoning
- mlx
- mlx-my-repo
inference:
  parameters:
    temperature: 0
widget:
- messages:
  - role: user
    content: What is the derivative of x^2?
library_name: transformers
---

# MCES10/Phi-4-reasoning-plus-mlx-fp16

The model [MCES10/Phi-4-reasoning-plus-mlx-fp16](https://huggingface.co/MCES10/Phi-4-reasoning-plus-mlx-fp16) was converted to MLX format from [microsoft/Phi-4-reasoning-plus](https://huggingface.co/microsoft/Phi-4-reasoning-plus) using mlx-lm version **0.22.3**.
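
If you want to reproduce a conversion like this yourself, mlx-lm ships a `convert` helper. The snippet below is a minimal sketch, assuming the `convert` function and its `mlx_path`/`dtype` arguments behave as in mlx-lm 0.22.x; the output path is illustrative, so check `mlx_lm.convert --help` for the options in your installed version.

```python
from mlx_lm import convert

# Download the original Hugging Face weights and write them out as an
# fp16 MLX checkpoint (no quantization). Path and dtype are illustrative.
convert(
    "microsoft/Phi-4-reasoning-plus",           # source repo on the Hub
    mlx_path="Phi-4-reasoning-plus-mlx-fp16",   # local output directory
    dtype="float16",                            # keep weights in fp16
)
```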

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

# Download (if needed) and load the fp16 MLX weights and tokenizer.
model, tokenizer = load("MCES10/Phi-4-reasoning-plus-mlx-fp16")

prompt = "hello"

# Wrap the prompt in the model's chat template when one is available.
if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
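
Phi-4-reasoning-plus tends to produce a long chain of thought before its final answer, so in practice you will usually want a larger token budget than the default. A minimal sketch, assuming `generate` accepts a `max_tokens` keyword as in mlx-lm 0.22.x; the prompt is taken from the widget example above and the budget is illustrative:

```python
from mlx_lm import load, generate

model, tokenizer = load("MCES10/Phi-4-reasoning-plus-mlx-fp16")

# Reasoning models emit their working before the answer, so give them room.
messages = [{"role": "user", "content": "What is the derivative of x^2?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# max_tokens is illustrative; adjust to your hardware and prompt.
response = generate(model, tokenizer, prompt=prompt, max_tokens=2048, verbose=True)
print(response)
```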