Quotas & limits

Quotas

The Dataflow managed service has the following quota limits:

  • Each user can make up to 3,000,000 requests per minute.
  • Each Dataflow job can use a maximum of 1,000 Compute Engine instances.
  • Each Google Cloud project can run 100 concurrent Dataflow jobs.
  • If you opt in to organization-level quotas, each organization can run 125 concurrent Dataflow jobs.
  • Each user can make up to 15,000 monitoring requests per minute.
  • Each Google Cloud project gets 160 Shuffle slots, which are sufficient to shuffle approximately 100 TB of data concurrently.
  • Each Google Cloud project gets 60 GB per minute per cloud region of Streaming Engine throughput to send data between Compute Engine instances and Streaming Engine.

To check your current usage of Dataflow-specific quota, follow these steps:

  1. In the Google Cloud Console, go to the APIs & Services page.
  2. Click Dashboard.
  3. Click Dataflow API.
  4. Click Quotas.
    For example, to check your current Shuffle slots quota usage, find the Shuffle slots chart on the Quotas page.
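
The Console is the primary place to review these charts. If you also want to read quota usage programmatically, one option is the Cloud Monitoring API, which exports consumer quota metrics. The sketch below is a minimal example assuming the google-cloud-monitoring Python client; the project ID is a placeholder, and the metric type shown is the generic consumer quota usage metric rather than anything Dataflow-specific from this page.

```python
# Minimal sketch: read consumer quota usage from the Cloud Monitoring API.
# Assumes the google-cloud-monitoring client library; the project ID is a
# placeholder, and the metric type is the generic consumer quota usage metric.
import time

from google.cloud import monitoring_v3

project_id = "my-project"  # placeholder
client = monitoring_v3.MetricServiceClient()

now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {
        "end_time": {"seconds": now},
        "start_time": {"seconds": now - 24 * 60 * 60},  # look back one day
    }
)

results = client.list_time_series(
    request={
        "name": f"projects/{project_id}",
        "filter": 'metric.type = "serviceruntime.googleapis.com/quota/allocation/usage"',
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)

for series in results:
    if series.points:
        # Resource labels identify the service and quota metric the series belongs to.
        print(series.resource.labels, series.points[0].value)
```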

The Dataflow service exercises various components of Google Cloud, such as BigQuery, Cloud Storage, Pub/Sub, and Compute Engine. These services (and other Google Cloud services) use quotas to cap the maximum number of resources you can use within a project. When you use Dataflow, you might need to adjust your quota settings for these services.

Compute Engine quotas

When you run your pipeline on the Dataflow service, Dataflow creates Compute Engine instances to run your pipeline code.
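
How many instances Dataflow creates, and how large they are, depends on your pipeline options, which in turn determine how much Compute Engine quota the job consumes. The following minimal sketch uses the Apache Beam Python SDK; the project, region, and bucket values are placeholders, not requirements from this page.

```python
# Minimal sketch (Apache Beam Python SDK): pipeline options that determine
# how many Compute Engine instances Dataflow can create and of what size.
# Project, region, and bucket values below are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",                 # placeholder project ID
    region="us-central1",                 # Compute Engine quota is checked in this region
    temp_location="gs://my-bucket/temp",  # placeholder bucket
    machine_type="n1-standard-4",         # 4 CPUs per worker count toward the CPU quota
    max_num_workers=10,                   # upper bound on Compute Engine instances
    disk_size_gb=50,                      # Persistent Disk attached to each worker
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | beam.Create([1, 2, 3])
        | beam.Map(lambda x: x * x)
    )
```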

Compute Engine quota is specified per region. Review your project's Compute Engine quota and request the following adjustments if needed:

  • CPUs: The default machine types for Dataflow are n1-standard-1 for batch and n1-standard-4 for streaming. FlexRS uses n1-standard-2 machines by default. During the beta release, FlexRS uses 90% preemptible VMs and 10% regular VMs. Compute Engine calculates the number of CPUs by summing each instance's total CPU count. For example, running 10 n1-standard-4 instances counts as 40 CPUs. See Compute Engine machine types for a mapping of machine types to CPU count. A rough quota-estimation sketch follows this list.
  • In-Use IP Addresses: The number of in-use IP addresses in your project must be sufficient to accommodate the desired number of instances. To use 10 Compute Engine instances, you'll need 10 in-use IP addresses.
  • Persistent Disk: Dataflow attaches Persistent Disk to each instance.
    • The default disk size is 250 GB for batch and 420 GB for streaming pipelines. For 10 instances, by default you need 2,500 GB of Persistent Disk for a batch job.
    • The default disk size is 25 GB for Dataflow Shuffle batch pipelines.
    • The default disk size is 30 GB for Streaming Engine streaming pipelines.
  • Managed Instance Groups: Dataflow deploys your Compute Engine instances as a Managed Instance Group. You'll need to ensure you have the following related quota available:
    • One Instance Group per Dataflow job
    • One Managed Instance Group per Dataflow job
    • One Instance Template per Dataflow job
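
To estimate up front how much of this quota a job will need, you can multiply the planned worker count by the per-worker figures above. The following sketch is an informal illustration of that arithmetic, not an official calculator; the machine-type-to-CPU mapping covers only the machine types mentioned on this page.

```python
# Rough quota estimate for a Dataflow job, based on the figures above.
# Illustration only; see Compute Engine machine types for the authoritative
# CPU counts of other machine types.
CPUS_PER_MACHINE_TYPE = {
    "n1-standard-1": 1,
    "n1-standard-2": 2,
    "n1-standard-4": 4,
}

def estimate_quota(max_workers: int, machine_type: str, disk_size_gb: int) -> dict:
    """Return the approximate Compute Engine quota a job consumes."""
    return {
        "cpus": max_workers * CPUS_PER_MACHINE_TYPE[machine_type],
        "in_use_ip_addresses": max_workers,   # one IP address per instance
        "persistent_disk_gb": max_workers * disk_size_gb,
        "instance_groups": 1,                 # one per Dataflow job
        "managed_instance_groups": 1,         # one per Dataflow job
        "instance_templates": 1,              # one per Dataflow job
    }

# Example from the CPU bullet above: 10 n1-standard-4 workers count as 40 CPUs,
# and with the default 250 GB batch disk they need 2,500 GB of Persistent Disk.
print(estimate_quota(max_workers=10, machine_type="n1-standard-4", disk_size_gb=250))
```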

Additional quotas

Depending on which sources and sinks you are using, you might also need additional quota.

  1. Pub/Sub: If you use Pub/Sub, you might need additional quota. When planning for quota, note that processing one message from Pub/Sub involves three operations. If you use custom timestamps, double your expected number of operations, because Dataflow creates a separate subscription to track custom timestamps. A short estimation sketch follows this list.
  2. BigQuery: If you use the BigQuery streaming API, quota limits and other restrictions apply.
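
For Pub/Sub quota planning, the multipliers described in the first item translate directly into an operations estimate. The following short sketch is an informal illustration of that arithmetic.

```python
# Informal arithmetic for Pub/Sub quota planning, based on the note above:
# each message processed by Dataflow involves 3 Pub/Sub operations, and using
# custom timestamps doubles that because a second subscription is created.
def pubsub_operations_per_second(messages_per_second: float,
                                 custom_timestamps: bool = False) -> float:
    operations = messages_per_second * 3
    if custom_timestamps:
        operations *= 2
    return operations

# Example: 1,000 messages/s with custom timestamps needs roughly 6,000 operations/s.
print(pubsub_operations_per_second(1000, custom_timestamps=True))
```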

Limits

This section describes practical production limits for Dataflow.

Limit | Amount
Maximum number of workers per pipeline. | 1,000
Maximum size for a job creation request. Pipeline descriptions with a lot of steps and very verbose names may hit this limit. | 10 MB
Maximum number of side input shards. | 20,000
Maximum size for a single element value in Streaming Engine. | 100 MB