Pipeline fundamentals for the Apache Beam SDKs

Apache Beam is an open source, unified model for defining both batch and streaming data-parallel processing pipelines. Before you get started with Dataflow, understand how to design, create, and test Apache Beam pipelines.

Apache Beam resources

On the Apache Beam website, you can find documentation on:

  • How to design your pipeline: shows how to determine your pipeline's structure, how to choose which transforms to apply to your data, and how to determine your input and output methods.

  • How to create your pipeline: explains the mechanics of using the classes in the Beam SDKs and the steps needed to build a pipeline (see the first sketch after this list).

  • How to test your pipeline: presents best practices for testing your pipelines (a testing sketch follows the list as well).
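
For orientation, here is a minimal sketch of the "create" step using the Beam Python SDK. The element values and transform labels are illustrative choices, not taken from the Beam documentation; a real pipeline would typically read from and write to external sources and sinks instead of an in-memory list.

```python
import apache_beam as beam

# Minimal sketch: build a pipeline that creates an in-memory PCollection,
# applies a simple transform, and prints the results.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "CreateWords" >> beam.Create(["hello", "beam", "pipeline"])  # input data (illustrative)
        | "Uppercase" >> beam.Map(str.upper)                           # a sample transform
        | "Print" >> beam.Map(print)                                   # stand-in for a real sink
    )
```

Using the `with` block runs the pipeline automatically when the block exits; on Dataflow you would also pass pipeline options that select the runner and project.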
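
And here is a small testing sketch, assuming the same uppercase transform as above. It uses Beam's `TestPipeline` and the `assert_that`/`equal_to` utilities to check the full output of a transform against a known expected result; the function name is hypothetical.

```python
import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to


def test_uppercase_transform():
    # Run the transform on a small, known input and assert on the complete output.
    with TestPipeline() as pipeline:
        output = (
            pipeline
            | beam.Create(["hello", "beam"])
            | beam.Map(str.upper)
        )
        assert_that(output, equal_to(["HELLO", "BEAM"]))
```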