Just a high-bpw quantization of functionary, intended as a drop-in replacement for OpenAI function calling. See the llama-cpp-python docs for details on loading and serving it.
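
A minimal sketch of how this might be used with llama-cpp-python, assuming a local copy of the GGUF file; the model path, `chat_format` value, and tool definition are placeholders, and the exact chat format (and whether an HF tokenizer must be passed) depends on the functionary version — check the llama-cpp-python docs for your setup:

```python
# Sketch: load a functionary GGUF with llama-cpp-python and request a tool call
# through the OpenAI-style chat completion API.
from llama_cpp import Llama

llm = Llama(
    model_path="./functionary.Q8_0.gguf",  # placeholder path to this quantized model
    chat_format="functionary",             # assumption: pick the format matching your functionary version
    n_ctx=4096,
)

response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool, for illustration only
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    tool_choice="auto",
)

# The response mirrors the OpenAI chat completion shape, including tool_calls.
print(response["choices"][0]["message"])
```

llama-cpp-python also ships an OpenAI-compatible server (`python -m llama_cpp.server`), which is the usual route when you want existing OpenAI client code to talk to this model unchanged.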