DLCs on Google Cloud

Below you can find a listing of all the Deep Learning Containers (DLCs) available on Google Cloud. A container is published for each supported combination of use case (training, inference), accelerator type (CPU, GPU, TPU), and framework (PyTorch, TGI, TEI).

The listing below only contains the latest version of each of the Hugging Face DLCs. The full listing of the containers published on Google Cloud can be found in the Google Cloud Deep Learning Containers documentation, in the Google Cloud Artifact Registry, or via the `gcloud container images list --repository="us-docker.pkg.dev/deeplearning-platform-release/gcr.io" | grep "huggingface-"` command.
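For reference, the same filtered listing can be retrieved from a terminal; this only assumes the gcloud CLI is installed and authenticated:

```bash
# List the Hugging Face DLCs published in the shared Deep Learning Containers repository
gcloud container images list \
  --repository="us-docker.pkg.dev/deeplearning-platform-release/gcr.io" \
  | grep "huggingface-"
```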

Text Generation Inference (TGI)

The Text Generation Inference (TGI) DLC is available for high-performance text generation with Large Language Models on GPU, with TPU support coming soon. The TGI DLC enables you to deploy any of the 140,000+ text generation models supported on the Hugging Face Hub, or any custom model, as long as its architecture is supported by TGI.

| Container URI | Path | Accelerator |
| --- | --- | --- |
| us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-text-generation-inference-cu121.2-2.ubuntu2204.py310 | text-generation-inference-gpu.2.2.0 | GPU |

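As a quick sanity check before deploying on Vertex AI or GKE, the TGI DLC can be run locally with Docker. The snippet below is a minimal sketch, assuming a machine with NVIDIA GPUs and the NVIDIA Container Toolkit, and that the image honors TGI's standard `MODEL_ID` environment variable; the model name and port mapping are illustrative, not prescribed by this page.

```bash
# Hypothetical local test of the TGI DLC (GPU machine with the NVIDIA Container Toolkit).
# MODEL_ID follows TGI's standard configuration; adjust the port mapping if the image
# listens on a different port than the one assumed here.
docker run --gpus all -p 8080:8080 \
  -e MODEL_ID="HuggingFaceH4/zephyr-7b-beta" \
  us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-text-generation-inference-cu121.2-2.ubuntu2204.py310
```

Once the server is up, requests can be sent to the mapped port using TGI's usual HTTP interface.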
Text Embeddings Inference (TEI)

The Text Embeddings Inference (TEI) DLC is available for high-performance serving of embedding models on both CPU and GPU. The TEI DLC enables you to deploy any of the 10,000+ embedding, re-ranking, or sequence classification models supported on the Hugging Face Hub, or any custom model, as long as its architecture is supported by TEI.

| Container URI | Path | Accelerator |
| --- | --- | --- |
| us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-text-embeddings-inference-cu122.1-4.ubuntu2204 | text-embeddings-inference-gpu.1.4.0 | GPU |
| us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-text-embeddings-inference-cpu.1-4 | text-embeddings-inference-cpu.1.4.0 | CPU |

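Similarly, here is a hedged local sketch for the TEI CPU DLC, assuming the image follows TEI's standard `MODEL_ID` configuration; the embedding model and port mapping are just examples.

```bash
# Hypothetical local test of the TEI CPU DLC; MODEL_ID follows TEI's standard
# configuration and the port mapping is illustrative.
docker run -p 8080:8080 \
  -e MODEL_ID="BAAI/bge-base-en-v1.5" \
  us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-text-embeddings-inference-cpu.1-4
```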
PyTorch Inference

The PyTorch Inference DLC is available for serving PyTorch models via 🤗 Transformers, including models trained with 🤗 TRL, Sentence Transformers, or 🧨 Diffusers, on both CPU and GPU.

| Container URI | Path | Accelerator |
| --- | --- | --- |
| us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-pytorch-inference-cu121.2-2.transformers.4-44.ubuntu2204.py311 | huggingface-pytorch-inference-gpu.2.2.2.transformers.4.44.0.py311 | GPU |
| us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-pytorch-inference-cpu.2-2.transformers.4-44.ubuntu2204.py311 | huggingface-pytorch-inference-cpu.2.2.2.transformers.4.44.0.py311 | CPU |

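As a sketch of how the PyTorch Inference DLC is typically configured, the example below assumes the image follows the Hugging Face inference-toolkit convention of reading `HF_MODEL_ID` and `HF_TASK` environment variables; the model, task, and port shown are illustrative assumptions rather than the definitive interface of this image.

```bash
# Hypothetical local test of the PyTorch Inference CPU DLC. HF_MODEL_ID / HF_TASK
# follow the Hugging Face inference-toolkit convention; swap in your own model and task.
docker run -p 8080:8080 \
  -e HF_MODEL_ID="distilbert/distilbert-base-uncased-finetuned-sst-2-english" \
  -e HF_TASK="text-classification" \
  us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-pytorch-inference-cpu.2-2.transformers.4-44.ubuntu2204.py311
```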
PyTorch Training

The PyTorch Training DLC is available for training PyTorch models via 🤗 Transformers. It includes support for training with libraries such as 🤗 TRL, Sentence Transformers, or 🧨 Diffusers, on GPU, with TPU support coming soon.

| Container URI | Path | Accelerator |
| --- | --- | --- |
| us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-pytorch-training-cu121.2-3.transformers.4-42.ubuntu2204.py310 | huggingface-pytorch-training-gpu.2.3.0.transformers.4.42.3.py310 | GPU |
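The training DLC is usually launched as a Vertex AI custom job or on GKE, but it can also be exercised locally. Below is a minimal sketch, assuming your training script is mounted into the container, that the image ships a standard Python environment with 🤗 Transformers and 🤗 TRL installed, and that it runs an arbitrary command; `train.py` is a hypothetical script name.

```bash
# Hypothetical local run of the PyTorch Training DLC on a GPU machine.
# Mount the current directory and run your own training script inside the container;
# how the image handles entrypoints/commands may differ, so treat this as a sketch.
docker run --gpus all -v "$(pwd)":/workspace -w /workspace \
  us-docker.pkg.dev/deeplearning-platform-release/gcr.io/huggingface-pytorch-training-cu121.2-3.transformers.4-42.ubuntu2204.py310 \
  python train.py
```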