To limit access for users within a project or organization, you can use Dataflow IAM roles. You can restrict access to Dataflow-related resources, as opposed to granting users viewer, editor, or owner access to the entire Google Cloud project.
This page focuses on how to use Dataflow's IAM roles. For a detailed description of IAM and its features, see the Google Cloud Identity and Access Management developer's guide.
Every Dataflow method requires the caller to have the necessary permissions. For a list of the permissions and roles Dataflow supports, see the following section.
Permissions and roles
This section summarizes the permissions and roles Dataflow IAM supports.
The following table lists the permissions that the caller must have to call each method:
The following table lists the Dataflow IAM roles with a corresponding list of all the permissions each role includes. Every permission is applicable to a particular resource type.
The Dataflow Worker role (roles/dataflow.worker) provides the permissions
necessary for a Compute Engine service account to execute work units
for an Apache Beam pipeline. It includes only the ability to request and
update work from the Dataflow service, so it should typically be assigned
only to such an account.
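To see exactly which permissions a predefined role such as the Dataflow Worker role bundles, you can inspect it with the gcloud CLI. A minimal sketch; the permission list in the output may change between releases:

```shell
# Print the title, description, and includedPermissions of the
# predefined Dataflow Worker role.
gcloud iam roles describe roles/dataflow.worker
```

This is useful when auditing what a controller service account can actually do.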
The Dataflow Service Agent role (roles/dataflow.serviceAgent) provides a
Dataflow service account access to managed resources. This Service Agent
role is assigned to a Dataflow service account when you create a
Dataflow project. You cannot assign this role yourself.
To create a job, grant the roles/dataflow.admin role; it provides
the minimal set of permissions required to run and examine jobs.
Alternatively, the following roles are required:
The roles/dataflow.developer role, to instantiate the job itself.
The roles/compute.viewer role, to access machine type information and view other settings.
The roles/storage.objectAdmin role, to provide permission to stage files on Cloud Storage.
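The narrower roles above can be granted with the gcloud CLI. A minimal sketch, assuming a hypothetical project ID my-project and developer account dev@example.com:

```shell
# Grant the three narrower job-creation roles to a developer.
# Project ID and user email are hypothetical placeholders.
for role in roles/dataflow.developer roles/compute.viewer roles/storage.objectAdmin; do
  gcloud projects add-iam-policy-binding my-project \
    --member="user:dev@example.com" \
    --role="$role"
done
```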
Example role assignment
To illustrate the utility of the different Dataflow roles, consider the following breakdown:
- The developer who creates and examines jobs needs the roles/dataflow.admin role.
- For more sophisticated permissions management, the developer interacting with the Dataflow job needs the roles/dataflow.developer role. Absent other role assignments, this role allows the developer to create and cancel Dataflow jobs, but not to interact with the individual VMs or access other Cloud services.
- They need the roles/storage.objectAdmin role or a related role to stage the required files.
- For debugging and quota checking, they need the project roles/compute.viewer role.
- The controller service account needs the roles/dataflow.worker role to process data for the Dataflow service. To access job data, the service account needs other roles, such as roles/storage.objectAdmin.
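The controller service account's bindings from the breakdown above could be granted like this. A sketch, assuming a hypothetical project ID my-project and a hypothetical controller service account named controller-sa:

```shell
# Allow the controller service account to execute Dataflow work units.
# The service account name and project ID are hypothetical placeholders.
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:controller-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/dataflow.worker"

# If the pipeline stages or reads job data in Cloud Storage,
# add a storage role as well.
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:controller-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"
```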
Assigning Dataflow roles
Dataflow roles can currently be set on organizations and projects only.
To manage roles at the organizational level, see Access Control for Organizations Using IAM.
To set project-level roles, see Granting, changing, and revoking access to resources.
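To verify which Dataflow roles are currently set at the project level, you can list the project's IAM policy and filter the bindings. A sketch, assuming a hypothetical project ID my-project:

```shell
# Show all project-level bindings whose role name contains "dataflow".
gcloud projects get-iam-policy my-project \
  --flatten="bindings[].members" \
  --format="table(bindings.role, bindings.members)" \
  --filter="bindings.role:dataflow"
```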