This document explains the pricing for Cloud Data Fusion. To see the pricing for other products, read the Pricing documentation.
For pricing purposes, usage is measured as the length of time, in minutes, from when a Cloud Data Fusion instance is created until it is deleted. Although the pricing rate is defined per hour, Cloud Data Fusion is billed by the minute: usage is converted to hours (for example, 30 minutes is 0.5 hours) so that hourly pricing can be applied to minute-by-minute use.
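This billing model can be sketched in a few lines (the hourly rate passed in below is a placeholder for whichever edition's rate applies, not an official SKU):

```python
def billable_cost(usage_minutes: float, hourly_rate: float) -> float:
    """Convert minute-level usage to fractional hours, then apply the hourly rate."""
    hours = usage_minutes / 60  # e.g., 30 minutes -> 0.5 hours
    return hours * hourly_rate

# 90 minutes of usage at a hypothetical $1.80/hour rate -> 1.5 hours * $1.80
print(round(billable_cost(90, 1.80), 2))
```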
If you pay in a currency other than USD, the prices listed in your currency on Google Cloud SKUs apply.
Cloud Data Fusion pricing is split across two functions: pipeline development and execution.
For pipeline development, Cloud Data Fusion offers the following three editions:
| Cloud Data Fusion edition | Price per instance per hour |
| --- | --- |
| Developer | $0.35 (~$250 per month) |
| Basic | $1.80 (~$1,100 per month) |
| Enterprise | $4.20 (~$3,000 per month) |
The Basic edition offers the first 120 hours per month per account free.
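As an illustrative sketch (not an official billing formula), the free allowance can be modeled by subtracting up to 120 hours from the month's Basic-edition usage before applying the hourly rate:

```python
BASIC_RATE = 1.80  # USD per instance-hour, Basic edition
FREE_HOURS = 120   # free Basic-edition hours per month per account

def basic_monthly_charge(hours_used: float) -> float:
    """Basic-edition development charge after the monthly free allowance."""
    billable = max(hours_used - FREE_HOURS, 0)
    return billable * BASIC_RATE

print(basic_monthly_charge(100))  # entirely covered by free hours -> 0
print(basic_monthly_charge(200))  # 80 billable hours at $1.80
```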
For pipeline execution, you are charged for the Dataproc clusters that Cloud Data Fusion creates to run your pipelines at the current Dataproc rates.
Comparison of Developer, Basic, and Enterprise editions
| Feature | Developer | Basic | Enterprise |
| --- | --- | --- | --- |
| Number of users | 2 (recommended)* | Unlimited | Unlimited |
| Workloads | Development, Product exploration | Testing, Sandbox, PoC | Production |
| Developer SDK for extensibility | | | |
| Data quality and cleansing library | | | |
| Private IP support | | | |
| Debugging and testing (programmatic & visual) | | | |
| Join, blend, aggregate transformations | | | |
| Structured, unstructured, semi-structured | | | |
| Integration metadata repository | | | |
| Integration lineage - field and dataset level | | | |
| Runtime | Default ephemeral 1-worker Dataproc | Default ephemeral 2-worker Dataproc | Ephemeral and dedicated Dataproc, Hadoop, EMR |
| Create and customize compute profiles | Versions 6.3 and above | | |
| DevOps support - REST API | | | |
| Triggers / schedules | | | |
| Execution environment selection | | | |
Usage of other Google Cloud resources
In addition to the development cost of a Cloud Data Fusion instance, you are billed only for the resources that you use to execute your pipelines, such as Dataproc, Cloud Storage, and BigQuery.
Currently, pricing for Cloud Data Fusion is the same for all supported regions.
- Changhua County, Taiwan
- Seoul, South Korea
- Jurong West, Singapore
- St. Ghislain, Belgium
- London, England, UK
- Montréal, Québec, Canada
- Osasco (São Paulo), Brazil
- Council Bluffs, Iowa, USA
- Moncks Corner, South Carolina, USA
- Ashburn, Northern Virginia, USA
- The Dalles, Oregon, USA
- Los Angeles, California, USA
Consider a Cloud Data Fusion instance that has been running for 10 hours, with no free Basic-edition hours remaining. The development charge for Cloud Data Fusion, by edition, is summarized in the following table:
| Edition | Cost per hour | Number of hours | Development cost |
| --- | --- | --- | --- |
| Developer | $0.35 | 10 | 10 * $0.35 = $3.50 |
| Basic | $1.80 | 10 | 10 * $1.80 = $18 |
| Enterprise | $4.20 | 10 | 10 * $4.20 = $42 |
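The arithmetic for the development charge can be reproduced with a small sketch, using the hourly rates from the pricing table earlier in this document:

```python
# Hourly development rates per edition (USD), from the pricing table above
RATES = {"Developer": 0.35, "Basic": 1.80, "Enterprise": 4.20}

def development_cost(edition: str, hours: float) -> float:
    """Development charge for running one instance of the given edition."""
    return RATES[edition] * hours

for edition in RATES:
    print(f"{edition}: ${development_cost(edition, 10):.2f}")
```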
During this 10-hour period, you ran a pipeline every hour that read raw data from Cloud Storage, performed transformations, and wrote the results to BigQuery. Each run took approximately 15 minutes to complete, so the Dataproc cluster created for each run was alive for 15 minutes (0.25 hours). Assume that each Dataproc cluster had the following configuration:
| Item | Machine type | Virtual CPUs | Attached persistent disk | Number in cluster |
| --- | --- | --- | --- | --- |
| Master node | n1-standard-4 | 4 | 500 GB | 1 |
| Worker nodes | n1-standard-4 | 4 | 500 GB | 5 |
The Dataproc clusters each have 24 virtual CPUs: 4 for the master and 20 spread across the workers. For Dataproc billing purposes, the pricing for this cluster would be based on those 24 virtual CPUs and the length of time each cluster ran.
Across all runs of your pipeline, the total Dataproc charge can be calculated as:

Dataproc charge = # of vCPUs * number of clusters * hours per cluster * Dataproc price per vCPU-hour = 24 * 10 * 0.25 * $0.01 = $0.60
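Plugging the example's numbers into this formula gives the same result (a sketch; the $0.01 figure is the per-vCPU-hour Dataproc premium used in the calculation above):

```python
# Cluster configuration and run history from the example above
VCPUS_PER_CLUSTER = 24    # 4 (master) + 5 workers * 4 vCPUs each
NUM_CLUSTERS = 10         # one ephemeral cluster per hourly run
HOURS_PER_CLUSTER = 0.25  # each run lasted about 15 minutes
DATAPROC_RATE = 0.01      # USD per vCPU per hour

charge = VCPUS_PER_CLUSTER * NUM_CLUSTERS * HOURS_PER_CLUSTER * DATAPROC_RATE
print(f"Dataproc charge: ${charge:.2f}")  # $0.60
```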
The Dataproc clusters use other Google Cloud products, which are billed separately. Specifically, these clusters would incur charges for Compute Engine and standard persistent disk provisioned space. You will also incur storage charges for Cloud Storage and BigQuery, depending on the amount of data your pipeline processes.
To determine these additional costs based on current rates, you can use the Google Cloud pricing calculator.