Training on TPU

Welcome to the 🤗 Optimum-TPU training guide! This section covers how to fine-tune models using Google Cloud TPUs.

Currently Supported Models

The following models have been tested and validated for fine-tuning on TPU v5e and v6e:

Larger models are supported, but have not yet been tested.

Getting Started

Prerequisites

Before starting the training process, ensure you have:

  1. A configured Google Cloud TPU instance (see Deployment Guide)
  2. Optimum-TPU installed with PyTorch/XLA support (the PJRT_DEVICE variable tells PyTorch/XLA to target the TPU runtime):

```bash
pip install optimum-tpu -f https://storage.googleapis.com/libtpu-releases/index.html
export PJRT_DEVICE=TPU
```
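
Once these are in place, it is worth confirming that PyTorch/XLA can actually reach the TPU before launching a full training run. The check below is a minimal sketch using the standard torch_xla API; the exact device string you see depends on your TPU topology.

```python
# Minimal TPU visibility check (assumes PJRT_DEVICE=TPU is exported).
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()          # first available XLA (TPU) device
print(f"XLA device: {device}")    # e.g. "xla:0"

# Run a tiny computation on the device to confirm the runtime works.
x = torch.randn(2, 2, device=device)
print(x @ x)
```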

Example Training Scripts

We provide several example scripts to help you get started:

  1. Gemma Fine-tuning (see the sketch after this list)

  2. LLaMA Fine-tuning (see the sketch after this list)
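
Both examples follow the same basic recipe, differing mainly in the model checkpoint. The sketch below outlines that recipe with the Hugging Face Trainer; the model ID, dataset, and hyperparameters are illustrative placeholders rather than values from the example scripts, and the optimum.tpu.fsdp_v2 helpers are assumed here based on the project's example notebooks.

```python
# A minimal fine-tuning sketch for TPU. Model ID, dataset, and
# hyperparameters are illustrative placeholders; the optimum.tpu.fsdp_v2
# helpers are assumed from the project's example notebooks.
from datasets import load_dataset
from optimum.tpu import fsdp_v2
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Enable FSDPv2 (SPMD-based sharding) before the model is created.
fsdp_v2.use_fsdp_v2()

model_id = "google/gemma-2b"  # gated on the Hub; swap in a LLaMA checkpoint for example 2
tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:  # LLaMA tokenizers ship without a pad token
    tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_id)

# Illustrative instruction dataset; replace with your own data.
dataset = load_dataset("databricks/databricks-dolly-15k", split="train")

def tokenize(batch):
    return tokenizer(batch["instruction"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

# Returns the fsdp/fsdp_config entries that shard this model's decoder
# layers across the available TPU cores.
fsdp_args = fsdp_v2.get_fsdp_training_args(model)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="./tpu-finetuned",   # hypothetical output path
        per_device_train_batch_size=8,
        num_train_epochs=1,
        logging_steps=10,
        optim="adafactor",              # memory-frugal optimizer choice
        **fsdp_args,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Run a script like this directly with python on the TPU VM (with PJRT_DEVICE=TPU exported); under SPMD a single process typically drives all local TPU cores, so no multi-process launcher is needed.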