How-to

Long Term Version Support is separate from any other support (such as Technical Support Services/TSS) or Deprecation Policy offered for any related Google Cloud Platform Services under the Google Cloud Platform Terms of Service.

This quickstart guide walks you through using the TensorFlow Enterprise Distribution, which contains custom-built TensorFlow binaries and related packages. It is designed to make it easy to get up and running with TensorFlow and to scale on Google Cloud Platform, and it is supported with security updates and selected bug fixes for three years.

Building applications with TensorFlow Enterprise is no different from developing a TensorFlow application with any of the supported products. This guide shows you how to get up and running with each of the products that support TensorFlow Enterprise:

  • AI Platform Notebooks
  • Deep Learning Virtual Machine Images
  • Deep Learning Containers

Deep Learning VM

Getting started with TensorFlow Enterprise on AI Platform Deep Learning VM Image

Get started with TensorFlow Enterprise on Deep Learning VM by following these steps:

  1. Create a Deep Learning VM on Google Cloud Platform by selecting it from the console, or from the command line as sketched after this list.
  2. Once the machine is provisioned, connect to it over SSH.
  3. Clone the TensorFlow model code from the TF Model Zoo:

    git clone https://github.com/tensorflow/models

  4. Refer to the README instructions to run the MNIST sample.
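
If you prefer the command line, you can create the Deep Learning VM and connect to it with gcloud instead of the console. This is a minimal sketch; the instance name, zone, machine type, and the tf-ent-1-15-cpu image family are example values that you should adjust to the TensorFlow Enterprise version and hardware you need.

    # Create a Deep Learning VM from a TensorFlow Enterprise image family
    # (instance name, zone, machine type, and image family are example values).
    gcloud compute instances create my-tf-enterprise-vm \
        --zone=us-west1-b \
        --machine-type=n1-standard-4 \
        --image-family=tf-ent-1-15-cpu \
        --image-project=deeplearning-platform-release

    # Connect to the instance over SSH once it is provisioned.
    gcloud compute ssh my-tf-enterprise-vm --zone=us-west1-b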

Notebooks

TensorFlow Enterprise on AI Platform Notebooks

Get started with TensorFlow Enterprise on AI Platform Notebooks by following these steps:

  1. Create an AI Platform Notebooks instance from the AI Platform console by clicking New instance.
  2. Select TensorFlow Enterprise 1.15 from the drop-down (choose the Without GPUs option).
  3. Once the machine is provisioned, click the Open JupyterLab link.
  4. Open and run the notebook tutorials/keras/basic_classification.ipynb from the Jupyter instance; to verify the installed TensorFlow version, see the quick check after this list.
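
As a quick check, you can confirm the TensorFlow build that the instance provides from a JupyterLab terminal; the version reported should correspond to the TensorFlow Enterprise release you selected (1.15.x in this example).

    # From a JupyterLab terminal: print the bundled TensorFlow version.
    # A TensorFlow Enterprise 1.15 instance is expected to report 1.15.x.
    python -c "import tensorflow as tf; print(tf.__version__)"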

DL Containers

TensorFlow Enterprise on AI Platform Deep Learning Containers

Get started with TensorFlow Enterprise on AI Platform Deep Learning Containers by following these steps:

  1. To run AI Platform Deep Learning Containers locally, see the blog post.
  2. Follow the steps to pull the Docker image (CPU or GPU) and run it in a container; a minimal CPU example is sketched after this list.
  3. Open your web browser and enter localhost:8080.
  4. Open and run the notebook tutorials/keras/basic_classification.ipynb from Jupyter.
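
The following is a minimal sketch of steps 2 and 3 for the CPU image, assuming Docker is installed locally. The image name gcr.io/deeplearning-platform-release/tf-cpu.1-15 and the mounted directory are example values; substitute the image and path that match your setup.

    # Pull a TensorFlow Enterprise CPU container image (image name is an example).
    docker pull gcr.io/deeplearning-platform-release/tf-cpu.1-15

    # Run the container, exposing JupyterLab on localhost:8080 and mounting a
    # local working directory into the container (the path is an example).
    docker run -d -p 8080:8080 \
        -v /path/to/local/dir:/home \
        gcr.io/deeplearning-platform-release/tf-cpu.1-15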