This model was exported from ibm-granite/granite-3b-code-instruct-128k using Optimum, with ONNX Runtime's basic optimizations applied.
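For reference, an export like this can be reproduced with Optimum's CLI. The exact flags below are assumptions, not the recorded command used for this repository; in particular, `--task text-generation-with-past` and `--optimize O1` (ONNX Runtime's basic optimization level) are guesses consistent with the description above.

```shell
# Sketch of the assumed export command (flags are assumptions, not the
# repository owner's recorded invocation).
pip install "optimum[exporters,onnxruntime]"

# --optimize O1 applies ONNX Runtime's basic, hardware-agnostic graph
# optimizations to the exported model.
optimum-cli export onnx \
  --model ibm-granite/granite-3b-code-instruct-128k \
  --task text-generation-with-past \
  --optimize O1 \
  granite-3b-code-instruct-128k-onnx/
```

Note that a 3B-parameter export exceeds the 2 GB protobuf limit, so the weights end up in the ONNX external data format, which is what the compatibility caveat below refers to.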

The repository owner maintains this model for use with Transformers.js and DirectML; unfortunately, the ONNX external data format is not yet supported there.

