Using VPC Service Controls with online prediction

VPC Service Controls helps you mitigate the risk of data exfiltration from AI Platform Prediction. With a service perimeter in place, VPC Service Controls ensures that your data does not leave the perimeter when you do the following:

  • Create models and model versions within a project inside the perimeter.
  • Send prediction requests to these resources.
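
For example, a prediction request like the following stays within the perimeter when both the calling project and the model are inside it. This is a minimal sketch, assuming a hypothetical project ID, model name, and input instances; it calls a regional endpoint, which is required inside a perimeter (see Limitations):

```python
# Minimal sketch: send an online prediction request from inside the perimeter.
# The project ID, model name, and input instances below are placeholders.
from google.api_core.client_options import ClientOptions
from googleapiclient import discovery

# Inside a perimeter you must call a regional endpoint, not the global one.
endpoint = "https://us-central1-ml.googleapis.com"
ml = discovery.build("ml", "v1",
                     client_options=ClientOptions(api_endpoint=endpoint))

response = ml.projects().predict(
    name="projects/your-project-id/models/your_model",
    body={"instances": [[1.0, 2.0, 3.0]]},
).execute()

print(response["predictions"])
```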

VPC Service Controls does not support batch prediction or AI Explanations. If you follow this guide to configure a service perimeter, you cannot use batch prediction or AI Explanations in any Google Cloud project inside that perimeter.

Creating a service perimeter

Follow the VPC Service Controls guide to creating a service perimeter. When you specify which services to restrict, make sure to include all of the following services:

  • AI Platform Training and Prediction API (ml.googleapis.com)
  • Pub/Sub API (pubsub.googleapis.com)
  • Cloud Storage API (storage.googleapis.com)
  • Google Kubernetes Engine API (container.googleapis.com)
  • Container Registry API (containerregistry.googleapis.com)
  • Cloud Logging API (logging.googleapis.com)

Your service perimeter must restrict all these services in order for AI Platform Training and AI Platform Prediction to work properly with VPC Service Controls.
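
The VPC Service Controls guide covers creating the perimeter in the Google Cloud console or with gcloud. If you script the setup instead, the following is a minimal sketch of the equivalent call to the Access Context Manager API; the access policy ID, project number, and perimeter name are hypothetical placeholders:

```python
# Minimal sketch: create a service perimeter that restricts the services
# listed above. The access policy ID, project number, and perimeter name
# are placeholders for your own values.
from googleapiclient import discovery

acm = discovery.build("accesscontextmanager", "v1")

perimeter = {
    "name": "accessPolicies/1234567890/servicePerimeters/ai_platform_perimeter",
    "title": "ai_platform_perimeter",
    "perimeterType": "PERIMETER_TYPE_REGULAR",
    "status": {
        "resources": ["projects/111111111111"],  # project number, not project ID
        "restrictedServices": [
            "ml.googleapis.com",
            "pubsub.googleapis.com",
            "storage.googleapis.com",
            "container.googleapis.com",
            "containerregistry.googleapis.com",
            "logging.googleapis.com",
        ],
    },
}

operation = acm.accessPolicies().servicePerimeters().create(
    parent="accessPolicies/1234567890", body=perimeter
).execute()

print(operation["name"])  # long-running operation that you can poll
```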

Limitations

After you have created a service perimeter and added your Google Cloud project to it, you can use AI Platform Prediction without any additional configuration. However, the following limitations apply:

  • You cannot use batch prediction.

  • You cannot use AI Explanations.

  • We recommend that you create a new Google Cloud project to set up integration with VPC Service Controls. If you instead configure a service perimeter for a project that already contains AI Platform Prediction resources, then you must account for the following constraint:

    If you created models in the project before you added the project to the service perimeter, then you can no longer use those models.

    For example, you cannot create model versions on models that were created outside the perimeter. Instead, you must create new models inside the perimeter, then create model versions on those new models.

  • If you remove your project from the service perimeter, then you cannot update or delete models that were created while the project was in the perimeter.

  • Legacy (MLS1) machine types are not available, and you cannot use the AI Platform Training and Prediction API's global endpoint. If you try to create a model version that uses a legacy (MLS1) machine type, version creation fails. You must use Compute Engine (N1) machine types and regional endpoints for online prediction (see the sketch after this list).

  • If you create a model or model version in the first few minutes after creating a service perimeter, then the operation might fail. Wait approximately 15 minutes for the VPC Service Controls restrictions to propagate to all the relevant Google Cloud services, and then try again.

  • When ml.googleapis.com is protected, your model versions cannot access resources outside the perimeter. They can access data in Cloud Storage and other Google Cloud services supported by VPC Service Controls in projects inside the perimeter, but requests to services outside the perimeter fail.

  • Without additional configuration, you cannot use the Google Cloud console to manage AI Platform Prediction resources of a project inside a service perimeter or to view access and stream logs. Learn about accessing resources protected by a service perimeter in the Google Cloud console.
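
To illustrate the machine type and endpoint requirements above, the following is a minimal sketch that creates a model and a model version from inside the perimeter. The project ID, deployment URI, runtime version, Python version, and framework are hypothetical placeholders; only the regional endpoint and the Compute Engine (N1) machine type reflect requirements described in this guide:

```python
# Minimal sketch: create a model and a model version from inside the perimeter.
# Project ID, deployment URI, runtime version, Python version, and framework
# are placeholders; the regional endpoint and N1 machine type are the parts
# required by VPC Service Controls.
from google.api_core.client_options import ClientOptions
from googleapiclient import discovery

endpoint = "https://us-central1-ml.googleapis.com"  # regional, not global
ml = discovery.build("ml", "v1",
                     client_options=ClientOptions(api_endpoint=endpoint))

project = "projects/your-project-id"

# Create the model on the regional endpoint.
ml.projects().models().create(
    parent=project, body={"name": "my_model"}
).execute()

# Create a version with a Compute Engine (N1) machine type; legacy (MLS1)
# machine types fail inside the perimeter.
ml.projects().models().versions().create(
    parent=f"{project}/models/my_model",
    body={
        "name": "v1",
        "deploymentUri": "gs://your-bucket/model/",
        "runtimeVersion": "2.11",
        "pythonVersion": "3.7",
        "framework": "TENSORFLOW",
        "machineType": "n1-standard-4",
    },
).execute()
```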

AI Platform Training and AI Platform Vizier

When you create a service perimeter that protects the AI Platform Training and Prediction API, VPC Service Controls protects both AI Platform Training and AI Platform Prediction. Read about how to use VPC Service Controls with AI Platform Training.

AI Platform Vizier also uses the AI Platform Training and Prediction API, but it is not yet fully supported by VPC Service Controls. However, AI Platform Vizier remains enabled when you configure a service perimeter to protect the AI Platform Training and Prediction API.

What's next