T5-XXL Encoder
This repo contains copies of the T5-XXL encoder in various quantization formats, intended for use in InvokeAI.
Contents:

- `bfloat16/` - T5-XXL encoder cast to bfloat16. Copied from here.
- `bnb_llm_int8/` - T5-XXL encoder quantized using bitsandbytes LLM.int8() quantization.
- `optimum_quanto_qfloat8/` - T5-XXL encoder quantized using optimum-quanto qfloat8 quantization.
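Outside of InvokeAI, the `bfloat16/` copy can be loaded like any other `transformers` checkpoint. A minimal sketch; the repo id below is a placeholder and the tokenizer source is an assumption, so substitute the real ones:

```python
# Minimal loading sketch for the bfloat16/ copy. The repo id is a
# placeholder for this repo's actual id, and the tokenizer repo is an
# assumption. Requires `torch`, `transformers`, and `sentencepiece`.
import torch
from transformers import T5EncoderModel, T5Tokenizer

encoder = T5EncoderModel.from_pretrained(
    "your-org/t5-xxl-encoder",   # placeholder repo id
    subfolder="bfloat16",        # the bfloat16/ copy described above
    torch_dtype=torch.bfloat16,
)

# Encode a prompt and take the hidden states that downstream models consume.
tokenizer = T5Tokenizer.from_pretrained("google/t5-v1_1-xxl")  # assumed tokenizer source
inputs = tokenizer("a photo of a cat", return_tensors="pt")
hidden_states = encoder(input_ids=inputs.input_ids).last_hidden_state
```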
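The two quantized variants follow the standard workflows of their respective libraries. A rough sketch of how such weights can be produced, not necessarily the exact scripts used for this repo:

```python
# Sketch of the two quantization paths, reusing the placeholder repo id
# and the bfloat16 `encoder` from the sketch above.

# bnb_llm_int8/: bitsandbytes LLM.int8(), applied by transformers at load time.
from transformers import BitsAndBytesConfig, T5EncoderModel

int8_encoder = T5EncoderModel.from_pretrained(
    "your-org/t5-xxl-encoder",   # placeholder repo id
    subfolder="bfloat16",
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",           # needs `accelerate` and a CUDA device
)

# optimum_quanto_qfloat8/: optimum-quanto qfloat8 weight quantization.
from optimum.quanto import freeze, qfloat8, quantize

quantize(encoder, weights=qfloat8)  # swap Linear weights for qfloat8 versions
freeze(encoder)                     # materialize the quantized weights in place
```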