Cloud Dataprep is an interactive web application in which users define the data preparation rules by interacting with a sample of their data. Use of the application is free. Once a data preparation flow has been defined, the sample can be exported for free or the flow can be executed as a Cloud Dataprep job (using Google Cloud Dataflow) over the original dataset.
Each Cloud Dataprep job is billed as a multiple of the execution cost (Cloud Dataprep Units) of the Cloud Dataflow job that performs the data transformation.
|Cloud Dataprep flow execution price|
|---|
|1.16 × (cost of the Cloud Dataflow job that executed the Cloud Dataprep flow)¹|

¹ Cloud Dataprep jobs can execute with different resource configurations in order to optimize performance and efficiency, but the default Cloud Dataflow job configurations are typical.
To monitor or calculate the cost of a Cloud Dataprep job, navigate to the Cloud Dataflow monitoring page for your Cloud Dataprep job and note the resource consumption metrics (for example, vCPU, memory, and storage). Calculate the equivalent Cloud Dataflow cost, and then multiply that cost by 1.16.
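The calculation above can be sketched in a few lines of Python. The per-resource rates below are illustrative placeholders only, not actual Cloud Dataflow pricing; substitute the current rates for your region from the Cloud Dataflow price list.

```python
# Estimate a Cloud Dataprep job's cost from the resource consumption
# metrics shown on the Cloud Dataflow monitoring page.
# NOTE: the rates below are placeholders, NOT real Cloud Dataflow prices.

VCPU_RATE_PER_HOUR = 0.056           # USD per vCPU-hour (placeholder)
MEMORY_RATE_PER_GB_HOUR = 0.003      # USD per GB-hour of memory (placeholder)
STORAGE_RATE_PER_GB_HOUR = 0.000054  # USD per GB-hour of storage (placeholder)
DATAPREP_MULTIPLIER = 1.16           # Dataprep multiple of the Dataflow cost

def dataprep_job_cost(vcpu_hours, memory_gb_hours, storage_gb_hours):
    """Return the estimated Cloud Dataprep charge for one job."""
    dataflow_cost = (
        vcpu_hours * VCPU_RATE_PER_HOUR
        + memory_gb_hours * MEMORY_RATE_PER_GB_HOUR
        + storage_gb_hours * STORAGE_RATE_PER_GB_HOUR
    )
    return DATAPREP_MULTIPLIER * dataflow_cost

# Example: 10 vCPU-hours, 40 GB-hours of memory, 250 GB-hours of storage.
print(round(dataprep_job_cost(10, 40, 250), 4))
```

With the placeholder rates, the Dataflow cost for the example job is 0.6935 USD, so the Dataprep charge is 1.16 × 0.6935 ≈ 0.8045 USD.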
Cloud Dataprep jobs are charged in execution units, which bundle the underlying vCPU, memory, and storage consumption.

|Cloud Dataprep Units (per Hour)|
|---|
If you pay in a currency other than USD, the prices listed in your currency on Cloud Platform SKUs apply.
In addition to the Cloud Dataprep charges, a job may consume other Google Cloud Platform resources, each billed at its own rate, including but not limited to:
See the Google Cloud Platform Pricing Calculator to estimate Google Cloud Platform resource pricing.