---
language:
- multilingual
license: mit
tags:
- nlp
- code
- mlx
license_link: https://huggingface.co/microsoft/Phi-3-medium-128k-instruct/resolve/main/LICENSE
pipeline_tag: text-generation
inference:
  parameters:
    temperature: 0.7
widget:
- messages:
  - role: user
    content: Can you provide ways to eat combinations of bananas and dragonfruits?
---

# ipihq/Phi-3-medium-128k-instruct

The model [ipihq/Phi-3-medium-128k-instruct](https://huggingface.co/ipihq/Phi-3-medium-128k-instruct) was converted to MLX format from [microsoft/Phi-3-medium-128k-instruct](https://huggingface.co/microsoft/Phi-3-medium-128k-instruct) using mlx-lm version **0.14.2**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("ipihq/Phi-3-medium-128k-instruct")

# Phi-3 is instruction-tuned, so format the prompt with its chat template
# rather than passing raw text
messages = [{"role": "user", "content": "hello"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
response = generate(model, tokenizer, prompt=prompt, verbose=True)
```
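
For reference, a minimal sketch of the chat layout that Phi-3's template renders (the `build_phi3_prompt` helper is hypothetical and for illustration only; in practice, prefer `tokenizer.apply_chat_template`, which reads the template shipped with the model):

```python
def build_phi3_prompt(messages):
    """Render a list of {role, content} dicts into Phi-3's chat layout.

    Each turn becomes "<|role|>\n{content}<|end|>\n", and a trailing
    "<|assistant|>\n" cues the model to generate its reply.
    """
    parts = [f"<|{m['role']}|>\n{m['content']}<|end|>\n" for m in messages]
    parts.append("<|assistant|>\n")
    return "".join(parts)

prompt = build_phi3_prompt([{"role": "user", "content": "hello"}])
# prompt == "<|user|>\nhello<|end|>\n<|assistant|>\n"
```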