Cloud TPU VM JAX quickstart

This document provides a brief introduction to working with JAX and Cloud TPU.

Before you begin, complete the following steps:

  1. Sign in to your Google Account. If you don't already have one, sign up for a new account.
  2. In the Google Cloud Console, select or create a Cloud project from the project selector page.
  3. Make sure that billing is enabled for your project.

Install the Google Cloud SDK

The Google Cloud SDK contains tools and libraries for interacting with Google Cloud products and services. For more information, see Installing the Google Cloud SDK.
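
On Linux or macOS, one common way to install the SDK is its interactive installer (a brief sketch; see the linked installation guide for platform-specific instructions):

  $ curl https://sdk.cloud.google.com | bash
  $ exec -l $SHELL
  $ gcloud init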

Configure the gcloud command

Run the following commands to configure gcloud to use your GCP project and install components needed for the TPU VM preview.

  $ gcloud config set account your-email-account
  $ gcloud config set project project-id
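
The TPU VM commands in this guide use the gcloud alpha command surface. If the alpha component is not already installed in your SDK, you can add it (a brief sketch; newer SDK versions may instead prompt you to install it on first use):

  $ gcloud components install alpha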

Enable the Cloud TPU API

  1. Enable the Cloud TPU API using the following gcloud command in Cloud Shell. (You may also enable it from the Google Cloud Console).

    $ gcloud services enable tpu.googleapis.com
    
  2. Run the following command to create a service identity.

    $ gcloud beta services identity create --service tpu.googleapis.com
    

Create a Cloud TPU VM with gcloud

With Cloud TPU VMs, your model and code run directly on the TPU host machine. You SSH directly into the TPU host, where you can run arbitrary code, install packages, view logs, and debug code.

  1. Create your TPU VM by running the following command from Cloud Shell or from a terminal on a computer where the Google Cloud SDK is installed.

    $ gcloud alpha compute tpus tpu-vm create tpu-name \
      --zone europe-west4-a \
      --accelerator-type v3-8 \
      --version v2-alpha

    Required fields

    zone
    The zone where you plan to create your Cloud TPU.
    accelerator-type
    The type of the Cloud TPU to create.
    version
    The Cloud TPU runtime version. Set this to `v2-alpha` when you are using JAX on single TPU devices, Pod slices, or entire Pods.

Connect to your Cloud TPU VM

SSH into your TPU VM by using the following command:

$ gcloud alpha compute tpus tpu-vm ssh tpu-name --zone europe-west4-a

Required fields

tpu-name
The name of the TPU VM to which you are connecting.
zone
The zone where you created your Cloud TPU.

Install JAX on your Cloud TPU VM

(vm)$ pip3 install --upgrade jax jaxlib

System check

Test that everything is installed correctly by checking that JAX sees the Cloud TPU cores and can run basic operations:

Start the Python 3 interpreter:

(vm)$ python3
>>> import jax

Display the number of TPU cores available:

>>> jax.device_count()

The number of TPU cores is displayed. For a v3-8, this should be 8.

Perform a simple calculation:

>>> jax.numpy.add(1, 1)

The result of the addition is displayed:

DeviceArray(2, dtype=int32)

Exit the Python interpreter:

>>> exit()

Running JAX code on a TPU VM

You can now run any JAX code you like. The Flax examples are a great place to start for running standard ML models in JAX. For instance, to train a basic MNIST convolutional network:

  1. Install TensorFlow Datasets.

    (vm)$ pip install --upgrade clu
    
  2. Install Flax.

    (vm)$ git clone https://github.com/google/flax.git
    (vm)$ pip install --user -e flax
    
  3. Run the Flax MNIST training script.

    (vm)$ cd flax/examples/mnist
    (vm)$ python3 main.py --workdir=/tmp/mnist \
    --config=configs/default.py \
    --config.learning_rate=0.05 \
    --config.num_epochs=5
    

    The script output should look like this:

    I0513 21:09:35.448946 140431261813824 train.py:125] train epoch: 1, loss: 0.2312, accuracy: 93.00
    I0513 21:09:36.402860 140431261813824 train.py:176] eval epoch: 1, loss: 0.0563, accuracy: 98.05
    I0513 21:09:37.321380

Cleaning up

When you are done with your TPU VM, follow these steps to clean up your resources.

  1. Disconnect from the Compute Engine instance, if you have not already done so:

    (vm)$ exit
    
  2. Delete your Cloud TPU.

    $ gcloud alpha compute tpus tpu-vm delete tpu-name \
      --zone europe-west4-a
    
  3. Verify the resources have been deleted by running the following command. Make sure your TPU is no longer listed. The deletion might take several minutes.
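
    (The command sketch below assumes the same gcloud alpha tpu-vm surface used elsewhere in this guide.)

    $ gcloud alpha compute tpus tpu-vm list --zone europe-west4-a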

Performance Notes

Here are a few important details that are particularly relevant to using TPUs in JAX.

Padding

One of the most common causes of slow performance on TPUs is inadvertent padding (a short sketch follows the list below):

  • Arrays on the Cloud TPU are tiled. This entails padding one of the dimensions to a multiple of 8, and a different dimension to a multiple of 128.
  • The matrix multiplication unit performs best with pairs of large matrices that minimize the need for padding.
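
As a rough illustration, the sketch below uses arbitrarily chosen shapes that are already multiples of the tile sizes, so the matrix multiplication should not require padding:

  import jax.numpy as jnp
  from jax import random

  key = random.PRNGKey(0)

  # Shapes that are multiples of the (8, 128) tile sizes avoid implicit padding.
  a = random.normal(key, (256, 512))
  b = random.normal(key, (512, 1024))
  c = jnp.dot(a, b)  # should run on the MXU without padding overhead

  # By contrast, a (257, 129) x (129, 3) product would be padded up to the next
  # tile boundaries, wasting compute and memory bandwidth.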

bfloat16 dtype

By default, matrix multiplication in JAX on TPUs uses bfloat16 with float32 accumulation. This can be controlled with the precision argument on relevant jax.numpy function calls (matmul, dot, einsum, etc.). In particular:

  • precision=jax.lax.Precision.DEFAULT: uses mixed bfloat16 precision (fastest)
  • precision=jax.lax.Precision.HIGH: uses multiple MXU passes to achieve higher precision
  • precision=jax.lax.Precision.HIGHEST: uses even more MXU passes to achieve full float32 precision

JAX also adds the bfloat16 dtype, which you can use to explicitly cast arrays to bfloat16, e.g., jax.numpy.array(x, dtype=jax.numpy.bfloat16).
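
The sketch below shows both controls; the shapes are arbitrary and chosen only for illustration:

  import jax
  import jax.numpy as jnp

  x = jnp.ones((256, 256), dtype=jnp.float32)
  y = jnp.ones((256, 256), dtype=jnp.float32)

  # Default precision: bfloat16 multiplication with float32 accumulation (fastest).
  fast = jnp.matmul(x, y)

  # Request full float32 precision at the cost of additional MXU passes.
  precise = jnp.matmul(x, y, precision=jax.lax.Precision.HIGHEST)

  # Explicitly cast an array to the bfloat16 dtype that JAX provides.
  x_bf16 = jnp.array(x, dtype=jnp.bfloat16)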

Running JAX in a Colab

When you run JAX code in a Colab notebook, Colab automatically creates a legacy TPU node. TPU nodes have a different architecture from TPU VMs. For more information, see System Architecture.
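
To point JAX at that TPU node from a Colab notebook, JAX versions from the era of this guide ship a small helper; a hedged sketch, assuming jax.tools.colab_tpu is available in your installed version:

  import jax.tools.colab_tpu
  jax.tools.colab_tpu.setup_tpu()  # connects JAX to the Colab-managed TPU node

  import jax
  print(jax.device_count())  # should report the cores of the attached TPU node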