AI Platform Notebooks enables you to create and manage virtual machine (VM) instances that are pre-packaged with JupyterLab.
AI Platform Notebooks instances have a pre-installed suite of deep learning packages, including support for the TensorFlow and PyTorch frameworks. You can configure either CPU-only or GPU-enabled instances to best suit your needs.
Your notebook instances are protected by Google Cloud authentication and authorization, and are accessible through a notebook instance URL. Notebook instances also integrate with GitHub so that you can easily sync your notebooks with a GitHub repository.
AI Platform Notebooks saves you the difficulty of creating and configuring a deep learning virtual machine by providing verified, optimized, and tested images for your chosen framework.
You can configure an AI Platform Notebooks instance to include the following:
Python versions 2.7 and 3.*
Python core packages:
- many others
R version 3.6
R core packages:
- rpy2 (an R package for accessing R in Python notebooks)
- many others
NVIDIA packages with the latest NVIDIA driver for GPU-enabled instances:
- CUDA 9.*, 10.*, and 11.*
- cuDNN 7.*
- NCCL 2.*
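As a sketch of how an instance with one of these configurations might be created, the following uses the gcloud CLI. The instance name, zone, machine type, accelerator, and image family are illustrative placeholders, not values from this document; check your project's available images and quotas before running. The command is echoed rather than executed so the script is safe to run without a configured Google Cloud project.

```shell
# Hypothetical example: create a GPU-enabled AI Platform Notebooks instance.
# All values below are placeholders -- adjust them for your project.
INSTANCE_NAME="my-notebook"
ZONE="us-central1-a"

# Build the command as a string and print it instead of running it,
# so this sketch works even without gcloud installed or configured.
CMD="gcloud notebooks instances create ${INSTANCE_NAME} \
  --vm-image-project=deeplearning-platform-release \
  --vm-image-family=tf2-latest-gpu \
  --machine-type=n1-standard-8 \
  --accelerator-type=NVIDIA_TESLA_T4 \
  --accelerator-core-count=1 \
  --install-gpu-driver \
  --location=${ZONE}"
echo "${CMD}"
```

The `--install-gpu-driver` flag asks the service to install the NVIDIA driver on first boot, which matches the GPU package list above; a CPU-only instance would use a `-cpu` image family and omit the accelerator flags.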
VPC Service Controls
VPC Service Controls provides additional security for your AI Platform Notebooks instances. To learn more, read the Overview of VPC Service Controls. To use AI Platform Notebooks within a service perimeter, see Use a notebook instance within a service perimeter.
Using AI Platform Notebooks with Dataproc Hub
Dataproc Hub is a customized JupyterHub server. Administrators can create Dataproc Hub instances that can spawn single-user Dataproc clusters to host AI Platform Notebooks environments. See Configure Dataproc Hub.
Using AI Platform Notebooks with Dataflow
You can develop an Apache Beam pipeline interactively in an AI Platform Notebooks instance, and then run that pipeline on Dataflow. To create an Apache Beam AI Platform Notebooks instance that you can use with Dataflow, see Developing interactively with Apache Beam notebooks.
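To illustrate the notebook-to-Dataflow handoff, a pipeline script developed in a notebook is typically submitted to Dataflow by running it with the Dataflow runner and standard pipeline options. The script name, project, bucket, and region below are hypothetical placeholders; the command is echoed rather than executed so the sketch runs anywhere.

```shell
# Hypothetical example: run a Beam pipeline script (developed in a
# notebook) on Dataflow. All values below are placeholders.
PROJECT="my-project"
BUCKET="gs://my-bucket"
REGION="us-central1"

# Print the submission command instead of executing it, since it
# requires a real Google Cloud project and staging bucket.
CMD="python my_pipeline.py \
  --runner=DataflowRunner \
  --project=${PROJECT} \
  --region=${REGION} \
  --temp_location=${BUCKET}/tmp"
echo "${CMD}"
```

Without the `--runner=DataflowRunner` option, the same script runs locally on Beam's default direct runner, which is what executes when you iterate on the pipeline inside the notebook.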