Google Cloud Dataproc has the following daily API quota limits. The quotas are enforced at the project level and reset at midnight Pacific time.
Tools that are built on top of the Cloud Dataproc API, such as the gcloud command-line tool, also consume your Cloud Dataproc API quota:
| Quota | Limit |
| --- | --- |
| API calls | 10,000,000 requests/day |
| API rate | 100 requests/second/user |
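The per-user rate limit is enforced on the server side, but a client can also throttle itself to stay under it. The following is a minimal sketch of a generic client-side throttle (not part of any Google client library; the 100 requests/second figure comes from the quota table above):

```python
import time

class Throttle:
    """Simple client-side rate limiter: allows at most `rate` calls per second."""
    def __init__(self, rate):
        self.min_interval = 1.0 / rate  # minimum seconds between calls
        self.last_call = 0.0

    def wait(self):
        """Sleep just long enough to stay under the configured rate."""
        now = time.monotonic()
        elapsed = now - self.last_call
        if elapsed < self.min_interval:
            time.sleep(self.min_interval - elapsed)
        self.last_call = time.monotonic()

throttle = Throttle(rate=100)  # match the 100 requests/second/user quota
# Before each API request: throttle.wait(); then send the request.
```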
Cloud Dataproc clusters use other Google Cloud Platform products, which have project-level quotas that apply to Cloud Dataproc usage. Some services, such as Google Compute Engine and Google Cloud Storage, are required by Cloud Dataproc; others, such as BigQuery and Bigtable, can optionally be used with it.
The following services, which enforce quota limits, are required to create Cloud Dataproc clusters.
Cloud Dataproc clusters use Compute Engine virtual machines. Compute Engine quotas are split into regional and global limits, and these limits apply to the clusters you create. For example, creating a cluster with one n1-standard-4 master node and two n1-standard-4 worker nodes uses 12 virtual CPUs (3 × 4). This usage counts against the regional quota limit of 24 virtual CPUs.
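The arithmetic above can be sketched as a quick pre-flight check. The helper name is hypothetical; the per-machine-type vCPU count and the 24-vCPU regional quota are taken from the example:

```python
# vCPUs per machine type (an n1-standard-N machine has N vCPUs)
MACHINE_VCPUS = {"n1-standard-4": 4}

def cluster_vcpus(master_type, master_count, worker_type, worker_count):
    """Total virtual CPUs a cluster will consume."""
    return (MACHINE_VCPUS[master_type] * master_count +
            MACHINE_VCPUS[worker_type] * worker_count)

used = cluster_vcpus("n1-standard-4", 1, "n1-standard-4", 2)
print(used)        # 12 vCPUs, as in the example above
print(used <= 24)  # True: fits within the 24-vCPU regional quota
```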
Fill out a Google Compute Engine Quota Change Request Form to request additional Compute Engine quota for your project.
When you create a Cloud Dataproc cluster with default settings, the following Compute Engine resources are used:

| Resource | Quantity |
| --- | --- |
| Virtual machine (VM) instances | 3 |
| Persistent disk | 1500 GB |
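As a quick sanity check, these defaults can be scaled to estimate how many default clusters fit within a given persistent-disk quota. This is a sketch with a hypothetical quota value, not an official tool:

```python
# Compute Engine resources consumed by one default Cloud Dataproc cluster,
# taken from the table above
DEFAULT_CLUSTER = {"vm_instances": 3, "persistent_disk_gb": 1500}

def clusters_within_disk_quota(disk_quota_gb):
    """Max number of default clusters whose persistent disk fits the quota."""
    return disk_quota_gb // DEFAULT_CLUSTER["persistent_disk_gb"]

# 4096 GB is a placeholder quota for illustration
print(clusters_within_disk_quota(4096))  # 2 default clusters fit
```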
The following services, which have quota limits, can optionally be used with Cloud Dataproc clusters.
When reading data from or writing data to BigQuery, the BigQuery quota applies.
When reading data from or writing data to Bigtable, the Bigtable quota applies.