Choose a subscription type

This document helps you choose the appropriate type of Pub/Sub subscription suited to your business requirements.

Pub/Sub subscription comparison table

The following table offers some guidance in choosing the appropriate delivery mechanism for your application:

Features supported by Pub/Sub subscriptions

Use case

Pull subscription
  • Large volume of messages (GBs per second).
  • Efficiency and throughput of message processing are critical.
  • Environments where a public HTTPS endpoint with a non-self-signed SSL certificate is not feasible to set up.
Push subscription
  • Multiple topics that must be processed by the same webhook.
  • App Engine Standard and Cloud Functions subscribers.
  • Environments where Google Cloud dependencies (such as credentials and the client library) are not feasible to set up.
Export subscription
  • Large volume of messages that can scale up to multiple millions of messages per second.
  • Messages are directly sent to a Google Cloud resource without any additional processing.
Endpoints

Pull subscription

Any device on the internet that has authorized credentials is able to call the Pub/Sub API.

Push subscription
  • An HTTPS server with a non-self-signed certificate that is accessible on the public web.
  • The receiving endpoint can be decoupled from the Pub/Sub subscription, so that messages from multiple subscriptions are sent to a single endpoint.
Export subscription
  • A BigQuery dataset and table for a BigQuery subscription.
  • A Cloud Storage bucket for a Cloud Storage subscription.
Load balancing

Pull subscription
  • Multiple subscribers can make pull calls to the same "shared" subscription.
  • Each subscriber receives a subset of messages.
Push subscription

Push endpoints can be load balancers.

Export subscription

The Pub/Sub service automatically balances the load.

Configuration

Pull subscription

No configuration is necessary.

Push subscription
  • No configuration is necessary for App Engine apps in the same project as the subscriber.
  • Verification of push endpoints is not required in the Google Cloud console.
  • Endpoints must be reachable using DNS names and have SSL certificates installed.
Export subscription
  • A BigQuery dataset and table must exist for the BigQuery subscription, configured with the appropriate permissions.
  • A Cloud Storage bucket must exist for the Cloud Storage subscription, configured with the appropriate permissions.
Flow control

Pull subscription

The subscriber client controls the rate of delivery. The subscriber can dynamically modify the acknowledgment deadline, allowing message processing to be arbitrarily long.

Push subscription

The Pub/Sub server automatically implements flow control. There's no need to handle message flow at the client side. However, it's possible to indicate that the client cannot handle the current message load by passing back an HTTP error.

Export subscription

The Pub/Sub server automatically implements flow control to optimize writing messages to a Google Cloud resource.

Efficiency and throughput

Pull subscription

Achieves high throughput at low CPU and bandwidth cost by allowing batched delivery, batched acknowledgments, and massively parallel consumption. It can be inefficient if aggressive polling is used to minimize message delivery time.

Push subscription

Delivers one message per request and limits the maximum number of outstanding messages.

Export subscription

Scalability is dynamically handled by Pub/Sub servers.
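
To make the Configuration and Flow control rows concrete, the following is a minimal sketch that uses the Python client library (google-cloud-pubsub); the project, topic, subscription names, and push endpoint URL are placeholders. It creates a pull and a push subscription, then consumes from the pull subscription with client-side flow control.

```python
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

project_id = "my-project"  # placeholder project ID
topic = f"projects/{project_id}/topics/my-topic"  # placeholder topic

subscriber = pubsub_v1.SubscriberClient()

# Pull subscription: no delivery configuration is required.
pull_sub = subscriber.create_subscription(
    request={
        "name": f"projects/{project_id}/subscriptions/my-pull-sub",
        "topic": topic,
    }
)

# Push subscription: Pub/Sub delivers messages to a public HTTPS endpoint
# (placeholder URL) with a non-self-signed certificate.
subscriber.create_subscription(
    request={
        "name": f"projects/{project_id}/subscriptions/my-push-sub",
        "topic": topic,
        "push_config": {"push_endpoint": "https://example.com/pubsub-handler"},
    }
)

# Consume from the pull subscription. Flow control is enforced by the client:
# at most 100 messages (or 10 MB) are outstanding at any time.
flow_control = pubsub_v1.types.FlowControl(
    max_messages=100, max_bytes=10 * 1024 * 1024
)

def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    print(f"Received {message.message_id}")
    message.ack()  # acknowledge only after processing succeeds

streaming_pull = subscriber.subscribe(
    pull_sub.name, callback=callback, flow_control=flow_control
)
try:
    streaming_pull.result(timeout=30)  # run for 30 seconds in this sketch
except TimeoutError:
    streaming_pull.cancel()
    streaming_pull.result()
```

With the push subscription there is no client-side polling loop; Pub/Sub calls the endpoint and retries delivery until it receives a success status code.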

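A push endpoint is an ordinary HTTPS handler that acknowledges a message by returning a success status code. The sketch below assumes Flask purely for illustration (any web framework, App Engine app, or Cloud Function works), and the overloaded() and process() helpers are hypothetical placeholders. It also shows the push form of flow control described above: returning an error status tells Pub/Sub to back off and redeliver later.

```python
import base64

from flask import Flask, request

app = Flask(__name__)

def overloaded() -> bool:
    """Hypothetical capacity check; replace with a real signal such as queue depth."""
    return False

def process(data: str) -> None:
    """Hypothetical business logic."""
    print(f"Processed: {data}")

@app.route("/pubsub-handler", methods=["POST"])
def pubsub_handler():
    envelope = request.get_json(silent=True)
    if not envelope or "message" not in envelope:
        # Any non-success status is treated as a negative acknowledgment.
        return "Bad request: not a Pub/Sub push payload", 400

    if overloaded():
        # Returning an error makes Pub/Sub back off and redeliver later,
        # which is how a push endpoint signals that it cannot keep up.
        return "Temporarily overloaded", 429

    message = envelope["message"]
    data = base64.b64decode(message.get("data", "")).decode("utf-8")
    process(data)

    # A 102, 200, 201, 202, or 204 response acknowledges the message.
    return "", 204
```
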
When to use an export subscription

Without an export subscription, you need a pull or push subscription and a subscriber (such as Dataflow) to read messages and write them to a Google Cloud resource. The overhead of running a Dataflow job is not necessary when messages don't require additional processing before being stored.

Export subscriptions have the following advantages:

  • Simple deployment. You can set up an export subscription through a single workflow in the console, Google Cloud CLI, client library, or Pub/Sub API.

  • Low costs. Reduces the additional cost and latency of similar Pub/Sub pipelines that include Dataflow jobs. This cost optimization is useful for messaging systems that don't require additional processing before storage.

  • Minimal monitoring. Export subscriptions are part of the multi-tenant Pub/Sub service and don't require you to run separate monitoring jobs.

  • Flexibility. A BigQuery subscription can use the schema of the topic to which it is attached, which is not available with the basic Dataflow template for writing from Pub/Sub to BigQuery. Similarly, a Cloud Storage subscription offers configurable file batching options based on file size and elapsed time, which are not configurable in the basic Dataflow template for writing from Pub/Sub to Cloud Storage.
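
As a minimal sketch of the single-workflow setup mentioned above, using the Python client library (and assuming a google-cloud-pubsub release recent enough to support BigQuery and Cloud Storage subscriptions), the following creates one export subscription of each type. The project, dataset, table, bucket, and batching values are placeholders.

```python
from google.cloud import pubsub_v1
from google.protobuf import duration_pb2

project_id = "my-project"  # placeholder project ID
topic = f"projects/{project_id}/topics/my-topic"  # placeholder topic

subscriber = pubsub_v1.SubscriberClient()

# BigQuery subscription: messages are written directly to the table.
bigquery_config = pubsub_v1.types.BigQueryConfig(
    table=f"{project_id}.my_dataset.my_table",  # placeholder table
    use_topic_schema=True,  # requires a schema attached to the topic
)
subscriber.create_subscription(
    request={
        "name": f"projects/{project_id}/subscriptions/my-bigquery-sub",
        "topic": topic,
        "bigquery_config": bigquery_config,
    }
)

# Cloud Storage subscription: a file is finalized when either limit is reached.
cloud_storage_config = pubsub_v1.types.CloudStorageConfig(
    bucket="my-export-bucket",  # placeholder bucket
    filename_prefix="pubsub/",
    max_bytes=100 * 1024 * 1024,  # batch by file size (about 100 MB)
    max_duration=duration_pb2.Duration(seconds=300),  # or by elapsed time (5 min)
)
subscriber.create_subscription(
    request={
        "name": f"projects/{project_id}/subscriptions/my-storage-sub",
        "topic": topic,
        "cloud_storage_config": cloud_storage_config,
    }
)
```

The same setup is available through the Google Cloud console, the Google Cloud CLI, and the Pub/Sub API.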

However, a Dataflow pipeline is still recommended for Pub/Sub systems where some data transformation is required before the data is stored in a Google Cloud resource such as a BigQuery table or Cloud Storage bucket.

To learn how to stream data from Pub/Sub to BigQuery with transformation by using Dataflow, see Stream from Pub/Sub to BigQuery.

To learn how to stream data from Pub/Sub to Cloud Storage with transformation by using Dataflow, see Stream messages from Pub/Sub by using Dataflow.

What's next

Understand the workflow for each subscription type: