Troubleshoot Dataflow permissions

This page shows you how to investigate and resolve issues with Dataflow permissions.

To successfully run Dataflow jobs, your user account and the Dataflow service accounts must have the necessary access to resources. For a list of required roles and steps for granting these roles, see Security and permissions for pipelines on Google Cloud on the Dataflow security and permissions page.

In addition, when your Apache Beam pipelines access Google Cloud resources, the worker service account of your Dataflow project needs access to those resources. For a list of roles that your worker service account might need, see Example role assignment.

When one or more roles required for running a job are missing, an error might appear in the job logs or in the worker logs. For instructions on finding errors when a job fails, see Find information about pipeline failures.

To resolve permissions issues, you need to know which permission is missing and which account needs that permission. Look at the permission listed in the error message and find a role that contains that permission. Often, but not always, you need to assign the relevant role to the Dataflow worker service account.
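
To confirm whether a role contains the permission named in an error message, you can inspect the role's permission list with the gcloud CLI. The following command is a minimal sketch; roles/dataflow.worker is only an example role:

gcloud iam roles describe roles/dataflow.worker \
    --format="value(includedPermissions)"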

To add permissions, your user account needs to be allowed to manage access. For more information, see Manage access to service accounts and Manage access to other resources.

User does not have write access to project

When you try to run a Dataflow job, the job fails and you see an error similar to the following:

PERMISSION_DENIED: (Could not create workflow; user does not have write access to project: $PROJECT_ID Causes: (...): Permission 'dataflow.jobs.create' denied on project: '$PROJECT_ID'

This error occurs when your user account doesn't have the roles/dataflow.developer role.

To resolve this issue, grant your user account the roles/dataflow.developer role. In addition, make sure that your user account has the roles/iam.serviceAccountUser role. For more information, see Grant a single role in the Identity and Access Management documentation.
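
As a sketch, assuming you are allowed to manage IAM policy on the project and that PROJECT_ID and USER_EMAIL are placeholders for your values, you could grant both roles with the gcloud CLI:

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:USER_EMAIL" \
    --role="roles/dataflow.developer"

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="user:USER_EMAIL" \
    --role="roles/iam.serviceAccountUser"

Granting roles/iam.serviceAccountUser at the project level is the broadest option; you can also grant it only on the worker service account that the job uses.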

User does not have sufficient permissions on project

When you try to cancel a Dataflow job, you see an error similar to the following:

Could not cancel workflow; user does not have sufficient permissions on project:PROJECT_ID, or the job does not exist in the project. Causes: (...): Permission 'dataflow.jobs.cancel' denied on project: 'PROJECT_ID' Please ensure you have permission to access the job

Similar errors might occur when trying to drain or update a job.

This error occurs for one of the following reasons:

  • Your user account doesn't have the roles/dataflow.developer role. To resolve this issue, grant your user account the roles/dataflow.developer role. In addition, make sure that your user account has the roles/iam.serviceAccountUser role. For more information, see Grant a single role in the Identity and Access Management documentation.
  • The job ID is incorrect. It might contain a typo, or you might be using the job name to cancel the job instead of the job ID. A way to look up the correct job ID is sketched after this list.
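
If the problem is the job identifier rather than permissions, you can list the jobs in the region, find your job by name in the output, and then cancel it using its job ID. This is a sketch using the gcloud CLI; REGION and JOB_ID are placeholders:

gcloud dataflow jobs list --region=REGION

gcloud dataflow jobs cancel JOB_ID --region=REGION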

Permissions verification for worker service account failed

When you try to run a Dataflow job, you see an error similar to the following:

Workflow failed. Causes: Permissions verification for controller service account failed. All permissions in IAM role roles/dataflow.worker should be granted to controller service account PROJECT_NUMBER-compute@developer.gserviceaccount.com.

This error occurs when the worker service account doesn't have the roles/dataflow.worker role.

To resolve this issue, grant the worker service account the roles/dataflow.worker role. For more information, see Grant a single role in the Identity and Access Management documentation.
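
For example, if the worker service account is the default Compute Engine service account, you could grant the role with the following gcloud command. This is a sketch; PROJECT_ID and PROJECT_NUMBER are placeholders, and you should substitute your own worker service account email if you use a different one:

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/dataflow.worker"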

Pipeline validation failed

Before a new Dataflow job launches, Dataflow performs validation checks on the pipeline. When the validation checks find problems with the pipeline, Dataflow fails the job submission early to save time and compute resources. In the job logs, Dataflow includes log messages that contain the validation findings and instructions for resolving the issues.

When the pipeline validation check finds permission issues, you might see the following error in the job logs:

[The preflight pipeline validation failed for job JOB_ID.] Missing permissions
PERMISSION when accessing RESOURCE_PATH as Dataflow worker service account WORKER_SERVICE_ACCOUNT.

If permissions are missing for more than one resource, the job logs contain multiple permission error messages.

Before you resubmit your Dataflow job, fix the permission issues. For information about modifying roles and permissions, see Manage access to service accounts and Manage access to other resources.

If you want to override the pipeline validation and launch your job with validation errors, use the following pipeline option:

--experiment=enable_ppv_effect=false

There was a problem refreshing your credentials

When you try to run a Dataflow job, you see an error similar to the following:

Workflow failed. Causes: There was a problem refreshing your credentials.
Please check: 1. The Dataflow API is enabled for your project.
2. Make sure both the Dataflow service account and the controller service account have sufficient permissions.
If you are not specifying a controller service account, ensure the default Compute Engine service account PROJECT_NUMBER-compute@developer.gserviceaccount.com exists and has sufficient permissions.
If you have deleted the default Compute Engine service account, you must specify a controller service account

This error occurs when the worker service account doesn't have the roles/dataflow.worker role or when the Dataflow API isn't enabled.

First, verify that the worker service account has the roles/dataflow.worker role. If needed, grant the roles/dataflow.worker role to the worker service account. For more information, see Grant a single role in the Identity and Access Management documentation.

To enable the Dataflow API, see Enabling an API in your Google Cloud project.
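
The following gcloud commands sketch both checks; PROJECT_ID and PROJECT_NUMBER are placeholders, and the service account email assumes that you use the default Compute Engine service account as the worker service account:

gcloud projects get-iam-policy PROJECT_ID \
    --flatten="bindings[].members" \
    --filter="bindings.members:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --format="value(bindings.role)"

gcloud services enable dataflow.googleapis.com --project=PROJECT_ID

The first command lists the roles currently granted to the worker service account on the project; roles/dataflow.worker should appear in the output. The second command enables the Dataflow API if it isn't already enabled.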

Required 'compute.subnetworks.get' permission

When you try to run a Dataflow job on a Shared VPC network, you see an error similar to the following:

Required 'compute.subnetworks.get' permission for 'projects/project-id/regions/region/subnetworks/subnet-name' HTTP Code: 403

Shared VPC lets you export subnets from a VPC network in a host project to other service projects in the same organization. Instances in the service projects can have network connections in the shared subnets of the host project. For more information, see Shared VPC overview.

To resolve this issue, first verify that the service project is attached to the host project. For more information, see Attach service projects in the Provisioning Shared VPC page.
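
To check the attachment from the service project, you can use the following gcloud command, which prints the host project that the service project is attached to. This is a sketch; SERVICE_PROJECT_ID is a placeholder:

gcloud compute shared-vpc get-host-project SERVICE_PROJECT_ID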

Next, grant the Compute Network User role (roles/compute.networkUser) to the Compute Engine service account of the host project, the Dataflow worker service account of the service project, and the service account used to submit the job.

For more information, see Grant a single role in the Identity and Access Management documentation.
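
As a sketch, the following gcloud command grants the role on the host project; HOST_PROJECT_ID and SERVICE_ACCOUNT_EMAIL are placeholders, and you would run it once for each of the accounts listed earlier. Granting the role on the host project covers all of its subnets; for finer-grained access, you can instead grant it on individual subnets:

gcloud projects add-iam-policy-binding HOST_PROJECT_ID \
    --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
    --role="roles/compute.networkUser"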

Dataflow runner does not have access to bucket

When you try to list objects in a Cloud Storage bucket, the Dataflow job fails, and you see an error similar to the following:

"dataflow-runner@project-id.iam.gserviceaccount.com" does not have `storage.objects.list` access to the Google Cloud Storage Bucket

This error occurs when the worker service account doesn't have the roles/storage.objectViewer role.

To resolve this issue, grant the worker service account the roles/storage.objectViewer role. For more information, see Grant a single role in the Identity and Access Management documentation.
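
As a sketch, assuming BUCKET_NAME and WORKER_SERVICE_ACCOUNT_EMAIL are placeholders for your bucket and worker service account, you could grant the role on the bucket with the gcloud CLI:

gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
    --member="serviceAccount:WORKER_SERVICE_ACCOUNT_EMAIL" \
    --role="roles/storage.objectViewer"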

Cloud KMS key permission denied on resource

When you're using customer-managed encryption keys and try to create a Dataflow job, the job fails, and you see an error similar to the following:

Cloud KMS key permission 'cloudkms.cryptoKeyVersions.useToEncrypt' denied on resource
'projects/project-id/locations/location/keyRings/keyRingName/cryptoKeys/keyname' (or it may not exist). cannot be validated.
Please confirm the full key path is used (starts with projects) and that there are no typos.

This error occurs when the worker service account and the Dataflow service account don't have the roles/cloudkms.cryptoKeyEncrypterDecrypter role.

To resolve this issue, grant the roles/cloudkms.cryptoKeyEncrypterDecrypter role to the worker service account and to the Dataflow service account. For more information, see Granting Encrypter/Decrypter permissions in the Using customer-managed encryption keys page.
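
As a sketch, the following gcloud command grants the role on the key itself; KEY_NAME, KEY_RING, LOCATION, PROJECT_ID, and SERVICE_ACCOUNT_EMAIL are placeholders, and you would run it once for the worker service account and once for the Dataflow service account:

gcloud kms keys add-iam-policy-binding KEY_NAME \
    --keyring=KEY_RING \
    --location=LOCATION \
    --project=PROJECT_ID \
    --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
    --role="roles/cloudkms.cryptoKeyEncrypterDecrypter"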

Permission denied on resource

When you try to create a pipeline, the pipeline fails with the following error:

Permission 'datapipelines.pipelines.create' denied on resource '//datapipelines.googleapis.com/projects/PROJECT_ID/locations/REGION' (or it may not exist).

This error occurs if the worker service account of your project doesn't have access to the files and other resources associated with the pipeline.

To resolve this issue, assign the following roles to the worker service account:

  • roles/dataflow.admin
  • roles/dataflow.worker

For more information, see Worker service account in "Dataflow security and permissions."
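
As a sketch, assuming PROJECT_ID and WORKER_SERVICE_ACCOUNT_EMAIL are placeholders for your values, you could grant both roles with the gcloud CLI:

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:WORKER_SERVICE_ACCOUNT_EMAIL" \
    --role="roles/dataflow.admin"

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:WORKER_SERVICE_ACCOUNT_EMAIL" \
    --role="roles/dataflow.worker"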

Workflow failed

When you're using customer-managed encryption keys and try to create a Dataflow job, the job fails with the following error:

Workflow failed

This error can occur for the following reasons:

  • The key and the Dataflow job aren't in the same region, or a multi-regional key is used. Global and multi-regional keys are not supported. The region for your CMEK and the region for your Dataflow job must be the same.
  • The key name is not specified correctly. The key might not exist, or the name might have a typo. A way to check both is sketched after this list.
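
As a sketch, the following gcloud command confirms that the key exists and shows its details; KEY_NAME, KEY_RING, LOCATION, and PROJECT_ID are placeholders, and LOCATION must match the region of your Dataflow job:

gcloud kms keys describe KEY_NAME \
    --keyring=KEY_RING \
    --location=LOCATION \
    --project=PROJECT_ID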

Cloud KMS key can't protect resources for this job

When you're running a Dataflow job and trying to enable a customer-managed encryption key, the job fails, and you see an error similar to the following:

Cloud KMS key can't protect resources for this job. Please make sure the KMS key's region matches the Dataflow region

This error can occur for the following reasons:

  • The key and the Dataflow job aren't in the same region, or a multi-regional key is used. Global and multi-regional keys are not supported. The region for your CMEK and the region for your Dataflow job must be the same.
  • The dataflowKMSKey parameter is not specified correctly. The expected format is sketched after this list.
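
As a sketch, the key must be passed as a full resource name. With the Apache Beam Java SDK the option is typically --dataflowKmsKey, and with the Python SDK it is --dataflow_kms_key; confirm the exact parameter name for your SDK and launch method. All names in the path are placeholders:

--dataflowKmsKey=projects/PROJECT_ID/locations/REGION/keyRings/KEY_RING/cryptoKeys/KEY_NAME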

Vertical Autoscaling not working

When you're using Vertical Autoscaling, the job doesn't automatically scale vertically, and the following error appears in the job logs:

{"level":"error","ts":1708815877.1246133,"caller":"exporter/exporter.go:232","msg":"failed to get response from UAS: %v","error":"rpc error: code = PermissionDenied desc = The caller does not have permission","stacktrace":"google3/autoscaler/vitor/external/go/exporter/exporter.receiver\n\tautoscaler/vitor/external/go/exporter/exporter.go:232"}

This error occurs when the worker service account doesn't have the Dataflow Worker (roles/dataflow.worker) role.

To resolve this issue, grant the worker service account the roles/dataflow.worker role. For more information, see Grant a single role in the Identity and Access Management documentation.

If you're using a custom role for the worker service account, add the following permissions to the custom role:

  • autoscaling.sites.readRecommendations
  • autoscaling.sites.writeMetrics
  • autoscaling.sites.writeState
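
As a sketch, assuming CUSTOM_ROLE_ID and PROJECT_ID are placeholders for your custom role and project, you could add the permissions with the gcloud CLI:

gcloud iam roles update CUSTOM_ROLE_ID \
    --project=PROJECT_ID \
    --add-permissions=autoscaling.sites.readRecommendations,autoscaling.sites.writeMetrics,autoscaling.sites.writeState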