

Ingest events for streaming into BigQuery, data lakes or operational databases.

New customers get $300 in free credits to spend on Pub/Sub. All customers get up to 10 GB for ingestion or delivery of messages free per month, not charged against your credits.

  • Ingest analytic events and stream them to BigQuery with Dataflow

  • No-ops, secure, scalable messaging or queue system

  • In-order and any-order at-least-once message delivery with pull and push modes

  • Secure data with fine-grained access controls and always-on encryption 
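The pull-delivery model behind these bullets can be sketched in a few lines. This is a hypothetical in-memory simulation, not the real `google-cloud-pubsub` client: a topic fans each message out to every subscription, and a pulled message stays queued until it is acknowledged, which is what makes delivery at-least-once.

```python
import collections

class Topic:
    """Hypothetical in-memory topic: fans messages out to subscriptions."""
    def __init__(self):
        self.subscriptions = []

    def publish(self, data):
        for sub in self.subscriptions:
            sub.queue.append(data)

class Subscription:
    """Hypothetical pull subscription with explicit acknowledgement."""
    def __init__(self, topic):
        self.queue = collections.deque()
        topic.subscriptions.append(self)

    def pull(self):
        # Pulling does not remove the message: until ack() is called,
        # it remains eligible for redelivery (at-least-once semantics).
        return self.queue[0] if self.queue else None

    def ack(self):
        if self.queue:
            self.queue.popleft()

topic = Topic()
sub = Subscription(topic)
topic.publish(b"event-1")
assert sub.pull() == b"event-1"   # delivered
assert sub.pull() == b"event-1"   # still pending: would be redelivered
sub.ack()
assert sub.pull() is None         # acknowledged, no longer delivered
```

In the real service, the same contract holds per subscriber: a message is redelivered until acknowledged or until its retention period expires.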


High availability made simple

Synchronous, cross-zone message replication and per-message receipt tracking ensure reliable delivery at any scale.

No-planning, auto-everything

Auto-scaling and auto-provisioning with no partitions eliminate capacity planning and ensure workloads are production ready from day one.

Easy, open foundation for real-time data systems

A fast, reliable way to land small records at any volume, Pub/Sub is an entry point for real-time and batch pipelines feeding BigQuery, data lakes, and operational databases. Use it with ETL/ELT pipelines in Dataflow.

Key features


Stream analytics and connectors

Native Dataflow integration enables reliable, expressive, exactly-once processing and integration of event streams in Java, Python, and SQL.
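Exactly-once processing on top of an at-least-once transport comes down to deduplication. The sketch below is a hypothetical pure-Python illustration of the idea, conceptually what Dataflow's Pub/Sub source does for you by deduplicating on message IDs; it is not Dataflow code.

```python
def process_stream(deliveries, handler):
    """Turn at-least-once deliveries into effectively-once processing
    by skipping message IDs that have already been handled."""
    seen = set()
    for message_id, payload in deliveries:
        if message_id in seen:
            continue  # duplicate redelivery: drop it
        seen.add(message_id)
        handler(payload)

results = []
# "m1" is delivered twice, as at-least-once semantics permit.
deliveries = [("m1", 1), ("m2", 2), ("m1", 1), ("m3", 3)]
process_stream(deliveries, results.append)
assert results == [1, 2, 3]  # each message processed exactly once
```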

In-order delivery at scale

Optional per-key ordering simplifies stateful application logic without sacrificing horizontal scale—no partitions required.
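The per-key ordering guarantee can be pictured as one FIFO sequence per ordering key. This is a hypothetical sketch of the contract, not the client API: messages sharing an ordering key arrive in publish order, while different keys are independent and can be processed in parallel.

```python
from collections import defaultdict

def group_by_ordering_key(messages):
    """Group (ordering_key, payload) pairs, preserving publish order
    within each key -- the guarantee ordered delivery provides."""
    per_key = defaultdict(list)
    for key, payload in messages:
        per_key[key].append(payload)
    return per_key

msgs = [("user-a", "login"), ("user-b", "login"),
        ("user-a", "purchase"), ("user-a", "logout")]
per_key = group_by_ordering_key(msgs)
# Within a key, publish order is preserved; across keys, no ordering
# is imposed, so keys can be spread across workers for scale.
assert per_key["user-a"] == ["login", "purchase", "logout"]
assert per_key["user-b"] == ["login"]
```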

Cost-optimized ingestion with Pub/Sub Lite

Complementing Pub/Sub, Pub/Sub Lite aims to be the lowest-cost option for high-volume event ingestion. Pub/Sub Lite offers regional or zonal storage, putting you in control of capacity management.




Google Cloud Basics

What is Pub/Sub?

Get a comprehensive overview of Pub/Sub, from core concepts and message flow to common use cases and integrations.

Introduction to Pub/Sub

Learn how to enable Pub/Sub in a Google Cloud project, create a Pub/Sub topic and subscription, and publish messages and pull them to the subscription.

Quickstart: Using client libraries

See how the Pub/Sub service allows applications to exchange messages reliably, quickly, and asynchronously.

In-order message delivery

Learn how scalable message ordering works and when to use it.

Choosing between Pub/Sub and Pub/Sub Lite

Understand how to make the most of both options.

Quickstart: Stream processing with Dataflow

Learn how to use Dataflow to read messages published to a Pub/Sub topic, window the messages by timestamp, and write the messages to Cloud Storage.
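The windowing step in that quickstart amounts to bucketing timestamped messages into fixed intervals before each window is written out. Below is a hypothetical pure-Python sketch of tumbling windows (the Dataflow job itself would express this with Apache Beam's windowing primitives); the 60-second window size is an assumption for illustration.

```python
from collections import defaultdict

def tumbling_windows(messages, size_s=60):
    """Assign (timestamp_s, payload) pairs to fixed, non-overlapping
    windows keyed by the window's start time."""
    windows = defaultdict(list)
    for ts, payload in messages:
        window_start = ts - (ts % size_s)
        windows[window_start].append(payload)
    return dict(windows)

msgs = [(5, "a"), (59, "b"), (61, "c"), (130, "d")]
# Messages at t=5 and t=59 share the [0, 60) window; t=61 falls in
# [60, 120); t=130 falls in [120, 180).
assert tumbling_windows(msgs) == {0: ["a", "b"], 60: ["c"], 120: ["d"]}
```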

Guide: Publishing messages to topics

Learn how to create a message containing your data and send a request to the Pub/Sub server to publish the message to the desired topic.
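A Pub/Sub message carries a bytes payload plus optional string-to-string attributes. The helper below is a hypothetical sketch of those constraints, not the real client library, which enforces the same rules when you call `publish`:

```python
def build_message(data, attributes=None):
    """Validate and assemble a Pub/Sub-style message: data must be
    bytes, and attributes (if any) must map strings to strings."""
    if not isinstance(data, bytes):
        raise TypeError("message data must be bytes")
    attributes = attributes or {}
    if not all(isinstance(k, str) and isinstance(v, str)
               for k, v in attributes.items()):
        raise TypeError("attributes must map str to str")
    return {"data": data, "attributes": attributes}

# Payloads are opaque bytes, so structured data is typically
# serialized (e.g. as JSON) before publishing.
msg = build_message(b'{"order_id": 42}', {"origin": "checkout"})
assert msg["attributes"]["origin"] == "checkout"
```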


Use cases


Use case
Stream analytics

Google’s stream analytics makes data more organized, useful, and accessible from the instant it’s generated. Built on Pub/Sub along with Dataflow and BigQuery, our streaming solution provisions the resources you need to ingest, process, and analyze fluctuating volumes of real-time data for real-time business insights. This abstracted provisioning reduces complexity and makes stream analytics accessible to both data analysts and data engineers.

[Diagram: stream analytics architecture in five stages — Trigger, Ingest, Enrich, Analyze, Activate. Edge devices (mobile, web, data store, IoT) push events to Pub/Sub (Ingest), which feeds Apache Beam / Dataflow streaming (Enrich). Results flow into BigQuery, AI Platform, and Bigtable (Analyze), with Dataflow batch jobs handling backfill and reprocessing. BigQuery feeds Data Studio, third-party BI, and Cloud Functions (Activate), which loop back to the edge devices. Creation flow: configure the source to push event messages to a Pub/Sub topic; create the topic and subscription; deploy a streaming or batch Dataflow job using templates, CLI, or notebooks; create datasets, tables, and models to receive the stream; build real-time dashboards and call external APIs.]
Use case
Asynchronous microservices integration

Pub/Sub works as a messaging middleware for traditional service integration or a simple communication medium for modern microservices. Push subscriptions deliver events to serverless webhooks on Cloud Functions, App Engine, Cloud Run, or custom environments on Google Kubernetes Engine or Compute Engine. Low-latency pull delivery is available when exposing webhooks is not an option or for efficient handling of higher throughput streams.
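A push subscription delivers each message as an HTTP POST carrying a JSON envelope in which the payload is base64-encoded under `message.data`; the endpoint acknowledges by returning a 2xx status. The handler below is a minimal sketch of decoding that envelope, independent of any particular serverless framework:

```python
import base64
import json

def handle_push(body):
    """Decode a Pub/Sub push-delivery envelope: the JSON body wraps a
    message whose data field is base64-encoded."""
    envelope = json.loads(body)
    message = envelope["message"]
    data = base64.b64decode(message.get("data", "")).decode("utf-8")
    return data, message.get("attributes", {})

# Example envelope, shaped like what Pub/Sub POSTs to a webhook.
body = json.dumps({
    "message": {
        "data": base64.b64encode(b"hello").decode(),
        "attributes": {"source": "demo"},
        "messageId": "1",
    },
    "subscription": "projects/p/subscriptions/s",
})
data, attrs = handle_push(body)
assert data == "hello" and attrs["source"] == "demo"
```

In Cloud Functions or Cloud Run, this decoding runs inside the request handler, and returning a success status acknowledges the message so it is not redelivered.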

All features


At-least-once delivery
Synchronous, cross-zone message replication and per-message receipt tracking ensure at-least-once delivery at any scale.
Open
Open APIs and client libraries in seven languages support cross-cloud and hybrid deployments.
Exactly-once processing