Using TensorFlow Enterprise with a local AI Platform Deep Learning Containers instance

This page describes how to get started using TensorFlow Enterprise with a local Deep Learning Containers instance.

In this example, you create and run a TensorFlow Enterprise Deep Learning Containers instance on your local machine. Then you open a JupyterLab notebook (included by default in the container instance) and run a classification tutorial that uses neural networks with Keras.

Before you begin

Complete the following steps to install the Cloud SDK and Docker, and then set up your local machine.

Install Cloud SDK and Docker

Complete these steps to install the Cloud SDK and Docker on your local machine.

  1. Download and install Cloud SDK on your local machine. Cloud SDK is a command line tool that you can use to interface with your instance.

  2. Download and install Docker.
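After both installs, a quick sanity check (a sketch, assuming `gcloud` and `docker` should be on your `PATH` once installed) confirms the tools are available:

```shell
# Check that the Cloud SDK and Docker CLIs are installed and on PATH.
TOOLS_MSG=""
for tool in gcloud docker; do
  if command -v "${tool}" >/dev/null 2>&1; then
    TOOLS_MSG="${TOOLS_MSG}${tool} found; "
  else
    TOOLS_MSG="${TOOLS_MSG}${tool} missing; "
  fi
done
echo "${TOOLS_MSG}"
```

If either tool is reported missing, revisit the corresponding install step before continuing.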

Set up your local machine

Complete these steps to set up your local machine.

  1. If you're using a Linux-based operating system, such as Ubuntu or Debian, use the following command to add your username to the docker group so that you can run Docker without using sudo. Replace USERNAME with your username.

    sudo usermod -a -G docker USERNAME
    

    You may need to restart your system after adding yourself to the docker group.

  2. Open Docker. To ensure that Docker is running, run the following Docker command, which prints the current date and time:

    docker run busybox date
    
  3. Use gcloud as the credential helper for Docker:

    gcloud auth configure-docker
    
  4. Optional: If you want to use the GPU-enabled containers, make sure you have a CUDA 10 compatible GPU, the associated driver, and nvidia-docker installed.
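For the GPU path, a quick check (a sketch; `nvidia-smi` ships with the NVIDIA driver, so its absence usually means the driver is not installed) shows whether a compatible GPU is visible:

```shell
# List visible GPUs via the NVIDIA driver; fall back gracefully on
# CPU-only machines where nvidia-smi is not installed.
if command -v nvidia-smi >/dev/null 2>&1; then
  GPU_CHECK=$(nvidia-smi -L)   # one line per visible GPU
  GPU_CHECK="${GPU_CHECK:-no GPUs reported by the driver}"
else
  GPU_CHECK="nvidia-smi not found: CPU-only machine or driver missing"
fi
echo "${GPU_CHECK}"
```

If no GPU is listed, use the CPU container in the next section instead.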

Create a Deep Learning Containers instance

To create a TensorFlow Enterprise Deep Learning Containers instance, complete the following steps for the type of local container that you want to create.

If you don't need to use a GPU-enabled container, use the following command. Replace /path/to/local/dir with the path to your local directory that you want to use.

docker run -d -p 8080:8080 -v /path/to/local/dir:/home \
  gcr.io/deeplearning-platform-release/tf2-cpu.2-3

If you want to use a GPU-enabled container, use the following command. Replace /path/to/local/dir with the path to your local directory that you want to use.

docker run --runtime=nvidia -d -p 8080:8080 -v /path/to/local/dir:/home \
  gcr.io/deeplearning-platform-release/tf2-gpu.2-3

This command starts the container in detached mode, mounts the local directory /path/to/local/dir to /home in the container, and maps port 8080 in the container to port 8080 on your local machine.
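To confirm that the container started and the port mapping took effect, you can list running containers (a sketch; the filter below matches the CPU image name from the command above, so adjust it if you used the GPU image):

```shell
# Show the ID and port mappings of any running Deep Learning
# Containers instance; print a hint if Docker is unavailable.
if command -v docker >/dev/null 2>&1; then
  PS_OUT=$(docker ps \
    --filter "ancestor=gcr.io/deeplearning-platform-release/tf2-cpu.2-3" \
    --format "{{.ID}}  {{.Ports}}")
  PS_OUT="${PS_OUT:-no matching container is running}"
else
  PS_OUT="docker not found: install Docker first"
fi
echo "${PS_OUT}"
```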

Open a JupyterLab notebook and run a classification tutorial

The container is preconfigured to start a JupyterLab server. Complete these steps to open a JupyterLab notebook and run a classification tutorial.

  1. In your local browser, visit http://localhost:8080 to access a JupyterLab notebook.

  2. On the left, double-click tutorials to open the folder, and navigate to and open tutorials/tf2_course/01_neural_nets_with_keras.ipynb.

  3. Click the run button to run cells of the tutorial.
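If the notebook page does not load, you can check from the command line whether the JupyterLab server is answering on port 8080 (a sketch, assuming `curl` is installed; an HTTP status such as 200 or 302 means the server inside the container is up, while 000 means the connection failed):

```shell
# Probe the mapped JupyterLab port and report the HTTP status code.
if command -v curl >/dev/null 2>&1; then
  HTTP_CODE=$(curl -s -o /dev/null -w "%{http_code}" \
    --max-time 5 http://localhost:8080 || true)
  HTTP_CODE="${HTTP_CODE:-000}"
else
  HTTP_CODE="curl not installed"
fi
echo "port 8080 check: ${HTTP_CODE}"
```

If the connection fails, verify that the container is still running and that nothing else on your machine is already using port 8080.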

Running your Deep Learning Containers instance on Google Cloud

To run your TensorFlow Enterprise Deep Learning Containers instance in a cloud environment, learn more about options for running containers on Google Cloud. For example, you may want to run your container on a Google Kubernetes Engine cluster.

What's next