Run a calculation on a Cloud TPU VM using TensorFlow
This quickstart shows you how to create a Cloud TPU VM and run a calculation on it using TensorFlow. For a more in-depth tutorial showing how to train a model on a Cloud TPU, see one of the Cloud TPU Tutorials.
Before you begin
Before you follow this quickstart, you must create a Google Cloud account, install the Google Cloud CLI, and configure the gcloud command. For more information, see Set up an account and a Cloud TPU project.
Create a Cloud TPU VM with gcloud
Create a Cloud TPU using the gcloud command.
$ gcloud compute tpus tpu-vm create tpu-name \
  --zone=europe-west4-a \
  --accelerator-type=v3-8 \
  --version=tpu-vm-tf-2.18.0-pjrt \
  --project=your-gcp-project-name
Command flag descriptions
tpu-name
- The name of the Cloud TPU to create.
zone
- The zone where you plan to create your Cloud TPU.
accelerator-type
- The accelerator type specifies the version and size of the Cloud TPU you want to create. For more information about supported accelerator types for each TPU version, see TPU versions.
version
- The TPU runtime version. The version used in this quickstart has TensorFlow preinstalled.
project
- The name of the Google Cloud project where you are creating your Cloud TPU.
For more information about the gcloud command, see the gcloud reference.
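If you are not sure which accelerator types or runtime versions are available in your zone, you can list them with gcloud. These two commands are a hedged example based on current gcloud releases, not part of the original steps:
$ gcloud compute tpus accelerator-types list --zone=europe-west4-a
$ gcloud compute tpus versions list --zone=europe-west4-a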
Connect to your Cloud TPU VM
Connect to your TPU VM using SSH:
$ gcloud compute tpus tpu-vm ssh tpu-name \
  --zone=europe-west4-a \
  --project=your-gcp-project-name
Install TensorFlow
Run the following command to install TensorFlow:
(vm)$ pip install tensorflow-tpu -f https://storage.googleapis.com/libtpu-tf-releases/index.html --force-reinstall
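To confirm the install succeeded before moving on, you can import TensorFlow and print its version. This is a minimal check, not part of the original steps:
(vm)$ python3 -c "import tensorflow as tf; print(tf.__version__)"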
Verify TensorFlow can access TPUs
Create a file named tpu-count.py in the current directory, and copy and paste the following script into it:

import tensorflow as tf

print(f"TensorFlow can access {len(tf.config.list_logical_devices('TPU'))} TPU cores")
Run the script:
(vm)$ python3 tpu-count.py
Output from the script shows the number of TPU cores available to the TPU VM:
TensorFlow can access 8 TPU cores
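If you want more detail than a count, a short variant of the same script prints each logical TPU device by name. This is a sketch built on the same tf.config call used above:

import tensorflow as tf

# Print each logical TPU device; a v3-8 exposes 8 TensorCores.
for device in tf.config.list_logical_devices('TPU'):
    print(device.name, device.device_type)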
Run a basic computation using TensorFlow
Once you are connected to the TPU VM, set the following environment variable.
(vm)$ export TPU_NAME=local
When creating your TPU, if you set the --version parameter to a version ending with -pjrt, set the following environment variables to enable the PJRT runtime:
(vm)$ export NEXT_PLUGGABLE_DEVICE_USE_C_API=true
(vm)$ export TF_PLUGGABLE_DEVICE_LIBRARY_PATH=/lib/libtpu.so
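If the test script below reports no TPUs, one quick sanity check is to confirm these variables are visible to the Python process. This snippet is an illustration, not part of the official quickstart:

import os

# Print the TPU-related environment variables exported above.
for var in ("TPU_NAME",
            "NEXT_PLUGGABLE_DEVICE_USE_C_API",
            "TF_PLUGGABLE_DEVICE_LIBRARY_PATH"):
    print(var, "=", os.environ.get(var, "<not set>"))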
Create a file named tpu-test.py in the current directory, and copy and paste the following script into it:
import tensorflow as tf

print("TensorFlow version " + tf.__version__)

@tf.function
def add_fn(x, y):
  z = x + y
  return z

# Connect to the TPU, initialize it, and create a TPU distribution strategy.
cluster_resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
tf.config.experimental_connect_to_cluster(cluster_resolver)
tf.tpu.experimental.initialize_tpu_system(cluster_resolver)
strategy = tf.distribute.TPUStrategy(cluster_resolver)

x = tf.constant(1.)
y = tf.constant(1.)

# strategy.run executes add_fn once on each TPU core (replica).
z = strategy.run(add_fn, args=(x, y))
print(z)
Run this script with the following command:
(vm)$ python3 tpu-test.py
This script performs a computation on each TensorCore of the TPU. The output looks similar to the following:
PerReplica:{
  0: tf.Tensor(2.0, shape=(), dtype=float32),
  1: tf.Tensor(2.0, shape=(), dtype=float32),
  2: tf.Tensor(2.0, shape=(), dtype=float32),
  3: tf.Tensor(2.0, shape=(), dtype=float32),
  4: tf.Tensor(2.0, shape=(), dtype=float32),
  5: tf.Tensor(2.0, shape=(), dtype=float32),
  6: tf.Tensor(2.0, shape=(), dtype=float32),
  7: tf.Tensor(2.0, shape=(), dtype=float32)
}
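Each replica returns its own copy of the result. If you want a single aggregated value instead, you can sum the per-replica results with strategy.reduce. This is a sketch that assumes it is appended to tpu-test.py after the lines above:

# Sum the per-replica results into one tensor: 2.0 from each of the
# 8 cores, so 16.0 on a v3-8.
total = strategy.reduce(tf.distribute.ReduceOp.SUM, z, axis=None)
print(total)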
Clean up
To avoid incurring charges to your Google Cloud account for the resources used on this page, follow these steps.
Disconnect from the Compute Engine instance, if you have not already done so:
(vm)$ exit
Your prompt should now be username@projectname, showing you are in the Cloud Shell.
Delete your Cloud TPU:
$ gcloud compute tpus tpu-vm delete tpu-name \
  --zone=europe-west4-a
Verify the resources have been deleted by running gcloud compute tpus tpu-vm list. The deletion might take several minutes:

$ gcloud compute tpus tpu-vm list --zone=europe-west4-a
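You can also check the TPU by name; after the deletion completes, the describe command reports that the resource was not found (assuming the same name and zone as above):
$ gcloud compute tpus tpu-vm describe tpu-name --zone=europe-west4-a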
What's next
For more information about Cloud TPU, see the Cloud TPU documentation and the Cloud TPU Tutorials.