[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["很难理解","hardToUnderstand","thumb-down"],["信息或示例代码不正确","incorrectInformationOrSampleCode","thumb-down"],["没有我需要的信息/示例","missingTheInformationSamplesINeed","thumb-down"],["翻译问题","translationIssue","thumb-down"],["其他","otherDown","thumb-down"]],["最后更新时间 (UTC):2025-09-04。"],[],[],null,["# Run a calculation on a Cloud TPU VM using PyTorch\n=================================================\n\nThis document provides a brief introduction to working with PyTorch and\nCloud TPU.\n| **Note:** This example shows how to run code on a v5litepod-8 (v5e) TPU which is a single-host TPU. Single-host TPUs have only 1 TPU VM. To run code on TPUs with more than one TPU VM (for example, v5litepod-32 or larger), see [Run PyTorch code on Cloud TPU slices](/tpu/docs/pytorch-pods).\n\n\nBefore you begin\n----------------\n\nBefore running the commands in this document, you must create a Google Cloud account,\ninstall the Google Cloud CLI, and configure the `gcloud` command. For more\ninformation, see [Set up the Cloud TPU environment](/tpu/docs/setup-gcp-account).\n\nCreate a Cloud TPU using `gcloud`\n---------------------------------\n\n1. Define some environment variables to make the commands easier to use.\n\n\n ```bash\n export PROJECT_ID=your-project-id\n export TPU_NAME=your-tpu-name\n export ZONE=us-east5-a\n export ACCELERATOR_TYPE=v5litepod-8\n export RUNTIME_VERSION=v2-alpha-tpuv5-lite\n ``` \n\n #### Environment variable descriptions\n\n \u003cbr /\u003e\n\n2. Create your TPU VM by running the following command:\n\n ```bash\n $ gcloud compute tpus tpu-vm create $TPU_NAME \\\n --project=$PROJECT_ID \\\n --zone=$ZONE \\\n --accelerator-type=$ACCELERATOR_TYPE \\\n --version=$RUNTIME_VERSION\n ```\n\nConnect to your Cloud TPU VM\n----------------------------\n\nConnect to your TPU VM over SSH using the following command: \n\n```bash\n$ gcloud compute tpus tpu-vm ssh $TPU_NAME \\\n --project=$PROJECT_ID \\\n --zone=$ZONE\n```\n\nIf you fail to connect to a TPU VM using SSH, it might be because the TPU VM\ndoesn't have an external IP address. To access a TPU VM without an external IP\naddress, follow the instructions in [Connect to a TPU VM without a public IP address](/tpu/docs/tpu-iap).\n\nInstall PyTorch/XLA on your TPU VM\n----------------------------------\n\n```bash\n$ (vm) sudo apt-get update\n$ (vm) sudo apt-get install libopenblas-dev -y\n$ (vm) pip install numpy\n$ (vm) pip install torch torch_xla[tpu] -f https://storage.googleapis.com/libtpu-releases/index.html\n```\n\nVerify PyTorch can access TPUs\n------------------------------\n\nUse the following command to verify PyTorch can access your TPUs: \n\n```bash\n$ (vm) PJRT_DEVICE=TPU python3 -c \"import torch_xla.core.xla_model as xm; print(xm.get_xla_supported_devices(\\\"TPU\\\"))\"\n```\n\nThe output from the command should look like the following: \n\n```\n['xla:0', 'xla:1', 'xla:2', 'xla:3', 'xla:4', 'xla:5', 'xla:6', 'xla:7']\n```\n\nPerform a basic calculation\n---------------------------\n\n1. Create a file named `tpu-test.py` in the current directory and copy and paste\n the following script into it:\n\n import torch\n import torch_xla.core.xla_model as xm\n\n dev = xm.xla_device()\n t1 = torch.randn(3,3,device=dev)\n t2 = torch.randn(3,3,device=dev)\n print(t1 + t2)\n\n2. Run the script:\n\n ```bash\n (vm)$ PJRT_DEVICE=TPU python3 tpu-test.py\n ```\n\n The output from the script shows the result of the computation: \n\n tensor([[-0.2121, 1.5589, -0.6951],\n [-0.7886, -0.2022, 0.9242],\n [ 0.8555, -1.8698, 1.4333]], device='xla:1')\n\n\nClean up\n--------\n\n\nTo avoid incurring charges to your Google Cloud account for\nthe resources used on this page, follow these steps.\n\n1. Disconnect from the Cloud TPU instance, if you have not already\n done so:\n\n ```bash\n (vm)$ exit\n ```\n\n Your prompt should now be `username@projectname`, showing you are in the\n Cloud Shell.\n2. Delete your Cloud TPU.\n\n ```bash\n $ gcloud compute tpus tpu-vm delete $TPU_NAME \\\n --project=$PROJECT_ID \\\n --zone=$ZONE\n ```\n3. Verify the resources have been deleted by running the following command. Make\n sure your TPU is no longer listed. The deletion might take several minutes.\n\n ```bash\n $ gcloud compute tpus tpu-vm list \\\n --zone=$ZONE\n ```\n\n\nWhat's next\n-----------\n\nRead more about Cloud TPU VMs:\n\n- [Run PyTorch code on TPU slices](/tpu/docs/pytorch-pods)\n- [Manage TPUs](/tpu/docs/managing-tpus-tpu-vm)\n- [Cloud TPU system architecture](/tpu/docs/system-architecture-tpu-vm)\n- [PyTorch/XLA documentation](https://pytorch.org/xla/release/1.7/index.html)"]]