AI Platform Notebooks enables you to create and manage virtual machine (VM) instances that are pre-packaged with JupyterLab. AI Platform Notebooks instances support the TensorFlow and PyTorch frameworks and have a pre-installed suite of Python and R deep learning packages. You can configure either CPU-only or GPU-enabled instances to optimize your workflow.
AI Platform Notebooks spares you the work of creating and configuring a Deep Learning virtual machine by providing verified, optimized, and tested images for your chosen framework.
Your notebook instances are protected by Google Cloud Platform (GCP) authentication and authorization, and are accessible through a notebook instance URL. Notebook instances also integrate with GitHub, so you can easily sync your notebook with a GitHub repository.
You can configure an AI Platform Notebooks instance to include the following:
- Python versions 2.7 and 3.*
- Python core packages:
  - many others
- R version 3.6
- R core packages:
  - rpy2 (a Python package for accessing R from Python notebooks)
  - many others
- NVIDIA packages with the latest NVIDIA driver for GPU-enabled instances:
  - CUDA 9.* and 10.*
  - cuDNN 7.*
  - NCCL 2.*
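As a rough sketch, a GPU-enabled instance built from one of these images can be created with the gcloud CLI. The instance name, image family, machine type, accelerator, and zone below are illustrative placeholders, and exact flag availability can vary by gcloud version:

```shell
# Sketch: create a GPU-enabled AI Platform Notebooks instance from a
# Deep Learning VM image. All names and values here are placeholders.
gcloud notebooks instances create example-tf-instance \
    --vm-image-project=deeplearning-platform-release \
    --vm-image-family=tf2-latest-gpu \
    --machine-type=n1-standard-4 \
    --accelerator-type=NVIDIA_TESLA_T4 \
    --accelerator-core-count=1 \
    --install-gpu-driver \
    --location=us-central1-b
```

With `--install-gpu-driver`, the instance installs the latest NVIDIA driver for you on first boot; omit the accelerator flags to create a CPU-only instance instead.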
VPC Service Controls
VPC Service Controls provides additional security for your AI Platform Notebooks instances. To learn more, read the Overview of VPC Service Controls. To use AI Platform Notebooks within a service perimeter, see Use a notebook instance within a service perimeter.
Using AI Platform Notebooks with Dataproc Hub
Dataproc Hub is a customized JupyterHub server. Administrators can create Dataproc Hub instances that can spawn single-user Dataproc clusters to host AI Platform Notebooks environments. See Configure Dataproc Hub.
Using AI Platform Notebooks with Dataflow
You can use AI Platform Notebooks to develop an Apache Beam pipeline interactively, and then run that pipeline on Dataflow. To create an Apache Beam AI Platform Notebooks instance that you can use with Dataflow, see Developing interactively with Apache Beam notebooks.