Cloud Dataflow access control guide

Overview

To limit access for users within a project or organization, you can use Cloud Dataflow IAM roles. These roles let you restrict access to Cloud Dataflow-related resources, rather than granting users the Viewer, Editor, or Owner role on the entire Google Cloud Platform project.

This page focuses on how to use Cloud Dataflow's IAM roles. For a detailed description of IAM and its features, see the Google Cloud Identity and Access Management developer's guide.

Every Cloud Dataflow method requires the caller to have the necessary permissions. For a list of the permissions and roles Cloud Dataflow supports, see the following section.

Permissions and roles

This section summarizes the permissions and roles Cloud Dataflow IAM supports.

Required permissions

The following table lists the permissions that the caller must have to call each method:

Method                         Required permissions
dataflow.jobs.create           dataflow.jobs.create
dataflow.jobs.cancel           dataflow.jobs.cancel
dataflow.jobs.updateContents   dataflow.jobs.updateContents
dataflow.jobs.list             dataflow.jobs.list
dataflow.jobs.get              dataflow.jobs.get
dataflow.messages.list         dataflow.messages.list
dataflow.metrics.get           dataflow.metrics.get
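
As an illustration, listing the jobs in a project calls a method that requires the dataflow.jobs.list permission. The following is a minimal sketch, assuming the Google API Python client (google-api-python-client) and Application Default Credentials are available; the project ID is a placeholder, and the call fails with a permission error if the caller lacks dataflow.jobs.list:

    # Minimal sketch: list Cloud Dataflow jobs in a project.
    # Assumes google-api-python-client and Application Default Credentials.
    from googleapiclient.discovery import build

    PROJECT_ID = "my-project"  # placeholder project ID

    dataflow = build("dataflow", "v1b3")
    response = dataflow.projects().jobs().list(projectId=PROJECT_ID).execute()

    # Each entry summarizes one job; this call requires dataflow.jobs.list.
    for job in response.get("jobs", []):
        print(job["id"], job["name"], job["currentState"])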

Roles

The following describes each Cloud Dataflow IAM role, along with the permissions it includes and the lowest resource level at which the role can be granted. Every permission is applicable to a particular resource type.

Role: roles/dataflow.admin
Title: Dataflow Admin
Description: Minimal role for creating and managing Cloud Dataflow jobs.
Permissions: compute.machineTypes.get, dataflow.*, resourcemanager.projects.get, resourcemanager.projects.list, storage.buckets.get, storage.objects.create, storage.objects.get, storage.objects.list
Lowest resource: Project

Role: roles/dataflow.developer
Title: Dataflow Developer
Description: Provides the permissions necessary to execute and manipulate Cloud Dataflow jobs.
Permissions: dataflow.*, resourcemanager.projects.get, resourcemanager.projects.list
Lowest resource: Project

Role: roles/dataflow.viewer
Title: Dataflow Viewer
Description: Provides read-only access to all Cloud Dataflow-related resources.
Permissions: dataflow.jobs.get, dataflow.jobs.list, dataflow.messages.*, dataflow.metrics.*, resourcemanager.projects.get, resourcemanager.projects.list
Lowest resource: Project

Role: roles/dataflow.worker
Title: Dataflow Worker
Description: Provides the permissions necessary for a Compute Engine service account to execute work units for a Cloud Dataflow pipeline.
Permissions: compute.instanceGroupManagers.update, compute.instances.delete, compute.instances.setDiskAutoDelete, dataflow.jobs.get, logging.logEntries.create, storage.objects.create, storage.objects.get
Lowest resource: Project

The Cloud Dataflow Worker role (roles/dataflow.worker) provides the permissions (dataflow.workItems.lease, dataflow.workItems.update, and dataflow.workItems.sendMessage) necessary for a Compute Engine service account to execute work units for an Apache Beam pipeline. It should typically only be assigned to such an account, and only includes the ability to request and update work from the Cloud Dataflow service.
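
For example, when launching a pipeline with the Apache Beam Python SDK, you can point the Dataflow workers at a Compute Engine service account that holds this role. The following is a minimal sketch, assuming the Beam SDK with GCP extras (apache-beam[gcp]) is installed; the project, region, bucket, and service account email are placeholders:

    # Minimal sketch: run a pipeline whose workers use a dedicated service account.
    # Assumes the Apache Beam Python SDK with GCP extras (apache-beam[gcp]).
    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(runner="DataflowRunner")
    gcp = options.view_as(GoogleCloudOptions)
    gcp.project = "my-project"                 # placeholder
    gcp.region = "us-central1"                 # placeholder
    gcp.temp_location = "gs://my-bucket/temp"  # placeholder staging bucket
    # Worker service account; it should hold roles/dataflow.worker (placeholder email).
    gcp.service_account_email = "dataflow-worker@my-project.iam.gserviceaccount.com"

    with beam.Pipeline(options=options) as p:
        (p | beam.Create(["hello", "world"]) | beam.Map(print))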

The Cloud Dataflow Service Agent role (roles/dataflow.serviceAgent) provides a Cloud Dataflow service account access to managed resources. This Service Agent role is assigned to a Cloud Dataflow service account when you create a Cloud Dataflow project. You cannot assign this role.

Creating jobs

To create a job, grant the roles/dataflow.admin role; it includes the minimal set of permissions required to run and examine jobs.
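
For instance, a caller with roles/dataflow.admin (or the equivalent permissions) can create a job from Google's public Word Count template. The following is a minimal sketch using the Google API Python client; the bucket names are placeholders, and the template path and parameters are assumptions based on the public Dataflow templates:

    # Minimal sketch: create a Dataflow job from a public template.
    # Assumes google-api-python-client and Application Default Credentials; requires
    # dataflow.jobs.create plus access to the Cloud Storage staging/output buckets.
    from googleapiclient.discovery import build

    PROJECT_ID = "my-project"          # placeholder
    OUTPUT_BUCKET = "gs://my-bucket"   # placeholder

    dataflow = build("dataflow", "v1b3")
    response = dataflow.projects().templates().launch(
        projectId=PROJECT_ID,
        gcsPath="gs://dataflow-templates/latest/Word_Count",  # assumed public template path
        body={
            "jobName": "wordcount-example",
            "parameters": {
                "inputFile": "gs://dataflow-samples/shakespeare/kinglear.txt",
                "output": f"{OUTPUT_BUCKET}/output/wordcount",
            },
            "environment": {"tempLocation": f"{OUTPUT_BUCKET}/temp"},
        },
    ).execute()

    print(response["job"]["id"])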

Alternatively, the following permissions are required:

Example role assignment

To illustrate the utility of the different Cloud Dataflow roles, consider the following breakdown:

Assigning Cloud Dataflow roles

Cloud Dataflow roles can currently be set on organizations and projects only.

To manage roles at the organizational level, see Access Control for Organizations Using IAM.

To set project-level roles, see Granting, changing, and revoking access to resources.
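
As a sketch of what a project-level assignment looks like programmatically (assuming the Cloud Resource Manager API and the Google API Python client; the project ID and member are placeholders), a binding for roles/dataflow.developer can be added with the standard read-modify-write pattern:

    # Minimal sketch: grant roles/dataflow.developer on a project.
    # Assumes google-api-python-client and credentials allowed to set the
    # project's IAM policy (for example, a project owner).
    from googleapiclient.discovery import build

    PROJECT_ID = "my-project"           # placeholder
    MEMBER = "user:alice@example.com"   # placeholder
    ROLE = "roles/dataflow.developer"

    crm = build("cloudresourcemanager", "v1")

    # Read-modify-write: fetch the current policy, add the binding, write it back.
    policy = crm.projects().getIamPolicy(resource=PROJECT_ID, body={}).execute()
    bindings = policy.setdefault("bindings", [])
    binding = next((b for b in bindings if b["role"] == ROLE), None)
    if binding is None:
        binding = {"role": ROLE, "members": []}
        bindings.append(binding)
    if MEMBER not in binding["members"]:
        binding["members"].append(MEMBER)

    crm.projects().setIamPolicy(resource=PROJECT_ID, body={"policy": policy}).execute()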
