Workflow Basics

Cloud Dataprep enables analysts, data specialists, and other domain experts to quickly cleanse and transform datasets of varying sizes for use in other analytics systems. Using an innovative set of web-based tools, you can import complex datasets and wrangle them for use in virtually any target system. Key capabilities include:

  • Import from flat files, databases, or distributed storage systems
  • Locate and remove or modify missing or mismatched data
  • Unnest complex data structures
  • Identify statistical outliers in your data for review and management
  • Perform lookups from one dataset into another reference dataset
  • Aggregate columnar data using a variety of aggregation functions
  • Normalize column values for more consistent usage and statistical modeling
  • Merge datasets with joins
  • Append one dataset to another through union operations
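Cloud Dataprep exposes these operations through its visual interface, but they correspond to familiar programmatic transforms. As a rough sketch only (plain Python with hypothetical sample data, not Dataprep's actual engine or API), here is what three of the capabilities above look like in code: removing rows with missing values, flagging statistical outliers, and performing a lookup into a reference dataset.

```python
# Hypothetical sample data standing in for an imported dataset.
from statistics import mean, stdev

orders = [
    {"id": 1, "region": "US", "amount": 120.0},
    {"id": 2, "region": "EU", "amount": None},     # missing value
    {"id": 3, "region": "US", "amount": 95.0},
    {"id": 4, "region": "EU", "amount": 4000.0},   # likely outlier
    {"id": 5, "region": "US", "amount": 110.0},
    {"id": 6, "region": "EU", "amount": 105.0},
    {"id": 7, "region": "US", "amount": 98.0},
    {"id": 8, "region": "EU", "amount": 130.0},
]
regions = {"US": "North America", "EU": "Europe"}  # reference dataset

# Locate and remove rows with missing amounts
# (Dataprep can also modify such values in place, e.g. by imputation).
clean = [r for r in orders if r["amount"] is not None]

# Flag values more than two standard deviations from the mean for review.
amounts = [r["amount"] for r in clean]
m, s = mean(amounts), stdev(amounts)
for r in clean:
    r["outlier"] = abs(r["amount"] - m) > 2 * s

# Lookup: enrich each row from the reference dataset.
for r in clean:
    r["region_name"] = regions.get(r["region"], "Unknown")
```

On this sample, the row with a missing amount is dropped, only the 4000.0 value is flagged for review, and every remaining row gains a `region_name` column. In Dataprep itself each of these steps is a recipe step built through the UI rather than code.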

Most of these operations can be executed with a few mouse clicks. This section provides a basic overview of common workflows through Cloud Dataprep.

Basic Workflow

  1. Review object overview: Before you begin, you should review the overview of the objects that are created and maintained in Cloud Dataprep. See Object Overview.
  2. Import data: Import data from a variety of sources. See Import Basics.
  3. Profile your data: Before, during, and after you transform your data, you can use the visual profiling tools to quickly analyze and make decisions about your data. See Profiling Basics.
  4. Build transform recipes: Use the various views in the Transformer Page to build your transform recipes and preview the results on sampled data. See Transform Basics.
  5. Run job: Launch a job to run your recipe on the full dataset. Review results and iterate as needed. See Running Job Basics.
  6. Export results: Export the generated results data for use outside of Cloud Dataprep. See Export Basics.
