Known issues for Eventarc

This page lists known issues for Eventarc.

You can also check for existing issues or open new issues in the public issue trackers.

  • Newly created Cloud Audit Logs triggers can take up to two minutes to become operational.
  • Some Google Cloud event sources are known to publish duplicate Cloud Audit Logs. When duplicate logs are published, duplicate events are delivered to destinations. To avoid these duplicate events, create triggers that filter on fields that make each event unique (see the example after this list). This applies to the following event types:

    • Cloud Storage (serviceName: storage.googleapis.com, methodName: storage.buckets.list)
    • Compute Engine (serviceName: compute.googleapis.com, methodName: beta.compute.instances.insert)
    • BigQuery (serviceName: bigquery.googleapis.com)

    Note that since Workflows handles event deduplication, you don't have to ensure that the event is unique when you create a trigger for Workflows.
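
    For example, a Cloud Audit Logs trigger specifies its filters with repeated --event-filters flags. The following gcloud command is a minimal sketch: the trigger name, Cloud Run service, region, and service account are placeholder values, and which filter fields make your events unique depends on the event source.

      gcloud eventarc triggers create storage-bucket-list-trigger \
          --location=us-central1 \
          --event-filters="type=google.cloud.audit.log.v1.written" \
          --event-filters="serviceName=storage.googleapis.com" \
          --event-filters="methodName=storage.buckets.list" \
          --destination-run-service=my-service \
          --destination-run-region=us-central1 \
          --service-account=my-sa@my-project.iam.gserviceaccount.com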

  • Cross-project triggers are not yet supported. The service that receives the events for a trigger must be in the same Google Cloud project as the trigger. If requests to your service are triggered by messages published to a Pub/Sub topic, the topic must also be in the same project as the trigger. For a workaround, see Route events across Google Cloud projects.
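
    As a sketch (the project ID, topic, trigger, and service names below are placeholders), a trigger whose events come from a Pub/Sub topic references a topic in the trigger's own project:

      gcloud eventarc triggers create pubsub-trigger \
          --project=my-project \
          --location=us-central1 \
          --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
          --transport-topic=projects/my-project/topics/my-topic \
          --destination-run-service=my-service \
          --destination-run-region=us-central1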

  • Regardless of where the virtual machine instance is actually located, Cloud Audit Logs triggers for Compute Engine result in events that originate from a single region: us-central1. When creating your trigger, ensure that the trigger location is set to either us-central1 or global.
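
    For example (again with placeholder names), a trigger for Compute Engine audit logs sets --location=us-central1 even if the instances it watches run in another region:

      gcloud eventarc triggers create compute-insert-trigger \
          --location=us-central1 \
          --event-filters="type=google.cloud.audit.log.v1.written" \
          --event-filters="serviceName=compute.googleapis.com" \
          --event-filters="methodName=beta.compute.instances.insert" \
          --destination-run-service=my-service \
          --destination-run-region=us-central1 \
          --service-account=my-sa@my-project.iam.gserviceaccount.com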

  • When you use Workflows as the destination for an Eventarc trigger, events larger than the maximum Workflows arguments size fail to trigger workflow executions. For more information, see Quotas and limits.