Welcome to the 🤗 Optimum-TPU training guide! This section covers how to fine-tune models using Google Cloud TPUs.
The following models have been tested and validated for fine-tuning on TPU v5e and v6e:
Larger models are supported as well, but have not yet been tested.
Before starting the training process, ensure you have:

1. Installed Optimum-TPU with its TPU runtime dependencies:

```bash
pip install optimum-tpu -f https://storage.googleapis.com/libtpu-releases/index.html
```

2. Configured the PJRT runtime to target the TPU:

```bash
export PJRT_DEVICE=TPU
```
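Before launching a long training run, it can be useful to verify that the `PJRT_DEVICE` environment variable is actually visible to your Python process. The sketch below is a minimal, hypothetical helper (`pjrt_device_configured` is not part of Optimum-TPU); it only checks the environment variable, not the physical presence of a TPU.

```python
import os

def pjrt_device_configured(expected: str = "TPU") -> bool:
    """Return True if the PJRT runtime is set to the expected device.

    This is a hypothetical convenience check, not an Optimum-TPU API:
    it inspects the PJRT_DEVICE environment variable exported above.
    """
    return os.environ.get("PJRT_DEVICE") == expected

# Normally you would `export PJRT_DEVICE=TPU` in the shell; we set it
# here only so the example is self-contained.
os.environ["PJRT_DEVICE"] = "TPU"
print(pjrt_device_configured())
```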
We provide several example scripts to help you get started:
- Gemma fine-tuning:
- LLaMA fine-tuning: