This model was exported from ibm-granite/granite-3b-code-instruct-128k with Hugging Face Optimum, using float16 conversion and ONNX Runtime's basic graph optimizations.
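An export like the one described can be reproduced along these lines. This is a sketch, not the exact command used for this repository: the flag names assume a recent Optimum release, the output directory name is illustrative, and `--dtype fp16` generally requires a CUDA device.

```shell
# Install the Optimum ONNX exporter and runtime (versions are assumptions)
pip install "optimum[exporters]" onnx onnxruntime

# Export the base model to ONNX with float16 weights and
# ONNX Runtime's basic (O1) graph optimizations
optimum-cli export onnx \
  --model ibm-granite/granite-3b-code-instruct-128k \
  --task text-generation-with-past \
  --dtype fp16 --device cuda \
  --optimize O1 \
  granite-3b-code-instruct-128k-onnx-float16/
```

Models above the 2 GB protobuf limit are written in the ONNX external data format, which is what currently blocks Transformers.js from loading this export.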

The repository owner maintains this model for use with Transformers.js; unfortunately, the ONNX external data format is not yet supported there.

