Run a calculation on a Cloud TPU VM by using JAX
This document provides a brief introduction to working with JAX and Cloud TPU.
Before you follow this quickstart, you must create a Google Cloud Platform
account, install the Google Cloud CLI, and configure the gcloud command.
For more information, see Set up an account and a Cloud TPU project.
Install the Google Cloud CLI
The Google Cloud CLI contains tools and libraries for interacting with Google Cloud products and services. For more information, see Installing the Google Cloud CLI.
Run the following commands to configure gcloud to use your Google Cloud
project and to install components needed for the TPU VM preview.

$ gcloud config set account your-email-account
$ gcloud config set project project-id
Enable the Cloud TPU API
$ gcloud services enable tpu.googleapis.com
Run the following command to create a service identity.
$ gcloud beta services identity create --service tpu.googleapis.com
Create a Cloud TPU VM with gcloud
With Cloud TPU VMs, your model and code run directly on the TPU host machine. You SSH directly into the TPU host. You can run arbitrary code, install packages, view logs, and debug code directly on the TPU Host.
Create your TPU VM by running the following command from Cloud Shell or from a terminal on your computer where the Google Cloud CLI is installed. Replace the version with
tpu-vm-base for v2 and v3 TPU versions, or
tpu-vm-v4-base for v4 TPUs.
$ gcloud compute tpus tpu-vm create tpu-name \
  --zone europe-west4-a \
  --accelerator-type v3-8 \
  --version tpu-vm-base
Connect to your Cloud TPU VM
SSH into your TPU VM by using the following command:
$ gcloud compute tpus tpu-vm ssh tpu-name --zone europe-west4-a
- tpu-name: The name of the TPU VM to which you are connecting.
- zone: The zone where you created your Cloud TPU.
Install JAX on your Cloud TPU VM
(vm)$ pip install "jax[tpu]>=0.2.16" -f https://storage.googleapis.com/jax-releases/libtpu_releases.html
Test that everything is installed correctly by checking that JAX sees the Cloud TPU cores and can run basic operations:
Start the Python 3 interpreter:

(vm)$ python3
>>> import jax
Display the number of TPU cores available:

>>> jax.device_count()
The number of TPU cores is displayed. This should be 8 if you are running on a v2 or v3 TPU, or 4 if you are running on a v4 TPU.
Perform a simple calculation:
>>> jax.numpy.add(1, 1)
The result of the add operation, 2, is displayed.
Exit the Python interpreter:

>>> exit()
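The interactive checks above can also be collected into a short script so you can rerun them after reinstalling JAX. This is a minimal sketch; the file name and function names are illustrative, not part of any JAX API, and on a machine without a TPU, jax.device_count() reports the available CPU or GPU devices instead of 8.

```python
# check_jax.py - verify that JAX can see its devices and run a basic op.
import jax
import jax.numpy as jnp


def check_devices():
    """Report the device count; on a v2/v3 TPU VM this should be 8."""
    n = jax.device_count()
    print(f"JAX sees {n} device(s): {jax.devices()}")
    return n


def check_compute():
    """Run a trivial computation to confirm the runtime works end to end."""
    result = jnp.add(1, 1)
    print(f"1 + 1 = {result}")
    return int(result)


if __name__ == "__main__":
    check_devices()
    check_compute()
```

Run it with `python3 check_jax.py` on the TPU VM after installing JAX.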
Running JAX code on a TPU VM
You can now run any JAX code you like. The Flax examples are a great place to start running standard ML models in JAX. For instance, to train a basic MNIST convolutional network:
Install TensorFlow Datasets
(vm)$ pip install --upgrade clu
(vm)$ git clone https://github.com/google/flax.git
(vm)$ pip install --user -e flax
Run the Flax MNIST training script
(vm)$ cd flax/examples/mnist
(vm)$ python3 main.py --workdir=/tmp/mnist \
  --config=configs/default.py \
  --config.learning_rate=0.05 \
  --config.num_epochs=5
The script output should look like this:
I0726 00:57:51.274136 139632684678208 train.py:146] epoch: 1, train_loss: 0.2423, train_accuracy: 92.96, test_loss: 0.0629, test_accuracy: 97.98
I0726 00:57:52.741929 139632684678208 train.py:146] epoch: 2, train_loss: 0.0594, train_accuracy: 98.15, test_loss: 0.0434, test_accuracy: 98.61
I0726 00:57:54.149238 139632684678208 train.py:146] epoch: 3, train_loss: 0.0417, train_accuracy: 98.73, test_loss: 0.0307, test_accuracy: 98.98
I0726 00:57:55.570881 139632684678208 train.py:146] epoch: 4, train_loss: 0.0309, train_accuracy: 99.03, test_loss: 0.0273, test_accuracy: 99.13
I0726 00:57:56.937045 139632684678208 train.py:146] epoch: 5, train_loss: 0.0251, train_accuracy: 99.21, test_loss: 0.0270, test_accuracy: 99.16
Clean up
When you are done with your TPU VM, follow these steps to clean up your resources.
Disconnect from the Compute Engine instance, if you have not already done so:

(vm)$ exit
Delete your Cloud TPU.
$ gcloud compute tpus tpu-vm delete tpu-name \
  --zone europe-west4-a
Verify the resources have been deleted by running the following command. Make sure your TPU is no longer listed. The deletion might take several minutes.
$ gcloud compute tpus tpu-vm list \
  --zone europe-west4-a
Performance notes
Here are a few important details that are particularly relevant to using TPUs in JAX.
One of the most common causes for slow performance on TPUs is introducing inadvertent padding:
- Arrays in the Cloud TPU are tiled. This entails padding one of the dimensions to a multiple of 8, and a different dimension to a multiple of 128.
- The matrix multiplication unit performs best with pairs of large matrices that minimize the need for padding.
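As an illustration of the tiling rule, the sketch below rounds a 2-D shape up to the (8, 128) tile and reports how much of the padded buffer is wasted. The helper names and the exact tile assignment per dimension are illustrative, not a JAX API; the point is that tile-aligned shapes avoid padding entirely.

```python
import math

TILE = (8, 128)  # per-dimension tile multiples used for 2-D arrays on TPU


def padded_shape(shape, tile=TILE):
    """Round each dimension up to the next tile multiple."""
    return tuple(math.ceil(d / t) * t for d, t in zip(shape, tile))


def padding_overhead(shape, tile=TILE):
    """Fraction of the padded buffer occupied by padding."""
    used = math.prod(shape)
    total = math.prod(padded_shape(shape, tile))
    return 1 - used / total


# A (256, 256) matrix is already tile-aligned: no padding.
# A (257, 129) matrix pads to (264, 256): roughly half the buffer is padding.
```

This is why dimensions that are multiples of 8 and 128 (or of larger powers of two) tend to perform best.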
By default, matrix multiplication in JAX on TPUs uses bfloat16 with float32 accumulation. This can be controlled with the precision argument on relevant jax.numpy function calls (matmul, dot, einsum, etc). In particular:
- precision=jax.lax.Precision.DEFAULT: uses mixed bfloat16 precision (fastest)
- precision=jax.lax.Precision.HIGH: uses multiple MXU passes to achieve higher precision
- precision=jax.lax.Precision.HIGHEST: uses even more MXU passes to achieve full float32 precision
JAX also adds the bfloat16 dtype, which you can use to explicitly cast arrays to bfloat16.
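A brief sketch of both controls follows; the array sizes are arbitrary, and on a backend without an MXU the precision argument is accepted but all three levels compute in float32.

```python
import jax
import jax.numpy as jnp

a = jnp.ones((128, 128), dtype=jnp.float32)
b = jnp.ones((128, 128), dtype=jnp.float32)

# Default on TPU: bfloat16 inputs to the MXU with float32 accumulation.
fast = jnp.matmul(a, b)

# Full float32 precision at the cost of extra MXU passes.
precise = jnp.matmul(a, b, precision=jax.lax.Precision.HIGHEST)

# Explicitly cast an array to the bfloat16 dtype.
a_bf16 = a.astype(jnp.bfloat16)
```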
Running JAX in a Colab
When you run JAX code in a Colab notebook, Colab automatically creates a legacy TPU node. TPU nodes have a different architecture than TPU VMs. For more information, see System Architecture.
For more information about Cloud TPU, see the Cloud TPU documentation.