Cloud Dataflow Documentation

Cloud Dataflow is a managed service for executing a wide variety of data processing patterns. The documentation on this site shows you how to deploy your batch and streaming data processing pipelines on the Cloud Dataflow service. Apache Beam is a unified programming model that lets you develop both batch and streaming pipelines. You create pipelines with the Apache Beam SDK, then run them on the Cloud Dataflow service.
Apache, Apache Beam, Beam, and the Beam logo are trademarks of The Apache Software Foundation in the United States and/or other countries.