Route events across Google Cloud projects


This tutorial shows you how to use Eventarc to read events from a source in one Google Cloud project and route them to a target destination in another Google Cloud project. You do this by using Pub/Sub as a cross-project transport layer.

Objectives

In this tutorial, you will:

  1. Create a topic in one project and publish to that topic from another project, and then route the events to an unauthenticated Cloud Run service using an Eventarc trigger.

  2. Use Pub/Sub notifications for Cloud Storage to publish Cloud Storage events from one project to another. Route the events to an unauthenticated Cloud Run service using an Eventarc trigger.

  3. Use Cloud Logging sinks to publish Cloud Audit Logs from one project to another. Route the events to an unauthenticated Cloud Run service using an Eventarc trigger.

Costs

In this document, you use billable components of Google Cloud, including Cloud Run, Compute Engine, Eventarc, Pub/Sub, and Cloud Storage.

To generate a cost estimate based on your projected usage, use the pricing calculator. New Google Cloud users might be eligible for a free trial.

Before you begin

Security constraints defined by your organization might prevent you from completing the following steps. For troubleshooting information, see Develop applications in a constrained Google Cloud environment.

You need two Google Cloud projects for this tutorial. Repeat the project selection and billing steps for both projects.

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Go to project selector

  3. Make sure that billing is enabled for your Google Cloud project.

  4. Install the Google Cloud CLI.
  5. To initialize the gcloud CLI, run the following command:

    gcloud init
  6. Update gcloud components:
    gcloud components update
  7. Log in using your account:
    gcloud auth login
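  8. Optional: If you aren't sure of your two project IDs, you can list the projects that you have access to:

    gcloud projects list --format="value(projectId)"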

Route Pub/Sub events across projects

Because Pub/Sub is a globally distributed service, you can create a topic in one project, publish to that topic from another project, and then use an Eventarc trigger to route the message to a Cloud Run service:

Cross-project eventing: Cloud Pub/Sub and Eventarc

  1. Set the Google Cloud project ID to your second project:

    gcloud config set project PROJECT_TWO_ID
    

    Replace PROJECT_TWO_ID with the ID of your second Google Cloud project.

  2. In your second project, do the following:

    1. Enable the Cloud Run and Eventarc APIs:

      gcloud services enable run.googleapis.com eventarc.googleapis.com
      
    2. Set the default location:

      REGION=REGION
      

      Replace REGION with the supported Eventarc location of your choice. For example, us-central1.

    3. Create a Pub/Sub topic:

      TOPIC=my-topic
      gcloud pubsub topics create $TOPIC
      
    4. Deploy an unauthenticated Cloud Run service using a prebuilt image, us-docker.pkg.dev/cloudrun/container/hello:

      gcloud run deploy hello \
          --image=us-docker.pkg.dev/cloudrun/container/hello \
          --allow-unauthenticated \
          --region=$REGION
      

      When you see the service URL, the deployment is complete.

    5. Connect the topic to the service with an Eventarc trigger:

      gcloud eventarc triggers create cross-project-trigger \
          --destination-run-service=hello \
          --destination-run-region=${REGION} \
          --location=${REGION} \
          --event-filters="type=google.cloud.pubsub.topic.v1.messagePublished" \
          --transport-topic=projects/PROJECT_TWO_ID/topics/$TOPIC
      

      This creates a trigger called cross-project-trigger.

  3. Set the Google Cloud project ID to your first project:

    gcloud config set project PROJECT_ONE_ID
    

    Replace PROJECT_ONE_ID with the ID of your first Google Cloud project.

  4. In your first project, publish a message to the topic in the second project:

    gcloud pubsub topics publish projects/PROJECT_TWO_ID/topics/$TOPIC --message="hello"
    
  5. Set the Google Cloud project ID to your second project:

    gcloud config set project PROJECT_TWO_ID
    
  6. In your second project, confirm that the generated event was logged:

    gcloud logging read "resource.labels.service_name=hello AND jsonPayload.message:hello" --format=json
    

    A logging entry similar to the following is returned:

    "message": "Received event of type google.cloud.pubsub.topic.v1.messagePublished. Event data: hello"
    
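
    If the log entry doesn't appear right away, wait a few moments and run the command again; delivery through Pub/Sub is asynchronous. As an optional check, you can also verify that the trigger exists and references the expected transport topic, using the trigger name and region from the earlier steps:

    gcloud eventarc triggers describe cross-project-trigger --location=$REGION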

Route Cloud Storage events across projects

Use Pub/Sub notifications for Cloud Storage to publish events from one project to another, and then route the events to a Cloud Run service through an Eventarc trigger:

Cross-project eventing: Cloud Storage and Eventarc

  1. Set the Google Cloud project ID to your first project:

    gcloud config set project PROJECT_ONE_ID
    
  2. Create a Cloud Storage bucket:

    PROJECT1=$(gcloud config get-value project)
    BUCKET=$PROJECT1-cross-project
    gcloud storage buckets create gs://$BUCKET --location=${REGION}
    
  3. Create a Pub/Sub notification for the bucket to the topic in your second project:

    gcloud storage buckets notifications create gs://$BUCKET --topic=projects/PROJECT_TWO_ID/topics/$TOPIC --payload-format=json
    
  4. Upload a file to the bucket:

    echo "Hello World" > random.txt
    gcloud storage cp random.txt gs://$BUCKET/random.txt
    
  5. Set the Google Cloud project ID to your second project:

    gcloud config set project PROJECT_TWO_ID
    
  6. In your second project, confirm that the generated event was logged:

    gcloud logging read "resource.labels.service_name=hello AND jsonPayload.message:random.txt" --format=json
    

    A logging entry similar to the following is returned:

    Received event of type google.cloud.pubsub.topic.v1.messagePublished. Event data: {
      "kind": "storage#object",
      "id": "project1-cross-project/random.txt/1635327604259719",
      "selfLink": "https://www.googleapis.com/storage/v1/b/project1-cross-project/o/random.txt",
      "name": "random.txt",
      "bucket": "project1-cross-project",
      "generation": "1635327604259719",
    [...]
    }
    
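
    As an optional check, you can list the notification configurations on the bucket to confirm that it publishes to the topic in your second project. This assumes the BUCKET variable is still set in your shell from the earlier step:

    gcloud storage buckets notifications list gs://$BUCKET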

Route Cloud Audit Logs events across projects

Requests to your service can be triggered when an audit log entry that matches specific filter criteria is written. (For more information, see Determine event filters for Cloud Audit Logs.) In this case, when a Compute Engine VM instance is created in your first project, a Cloud Logging sink routes the matching audit log entry to the Pub/Sub topic in your second project, and the Eventarc trigger that you created earlier routes the event to the Cloud Run service:

Cross-project eventing: Cloud Audit Logs and Eventarc

  1. Set the Google Cloud project ID to your first project:

    gcloud config set project PROJECT_ONE_ID
    
  2. In your first project, enable the Admin Read, Data Read, and Data Write Log Types for Compute Engine:

    Note that at the project level, you need the roles/owner Identity and Access Management (IAM) role to configure Data Access audit logs for your Google Cloud resources.

    1. Read your project's IAM policy and store it in a file:

      gcloud projects get-iam-policy PROJECT_ONE_ID > /tmp/policy.yaml
      
    2. Edit /tmp/policy.yaml, adding or changing only the Data Access audit logs configuration.

      auditConfigs:
      - auditLogConfigs:
        - logType: ADMIN_READ
        - logType: DATA_READ
        - logType: DATA_WRITE
        service: compute.googleapis.com
      
    3. Write your new IAM policy:

      gcloud projects set-iam-policy PROJECT_ONE_ID /tmp/policy.yaml
      

      If the preceding command reports a conflict with another change, then repeat these steps, starting with reading the project's IAM policy.

  3. In your first project, create a Cloud Logging sink to route Cloud Audit Logs to the topic in your second project:

    gcloud logging sinks create cross-project-sink \
        pubsub.googleapis.com/projects/PROJECT_TWO_ID/topics/my-topic \
        --log-filter='protoPayload.methodName="beta.compute.instances.insert"'
    

    A reminder similar to the following should be returned:

    Please remember to grant `serviceAccount:p1011272509317-375795@gcp-sa-logging.iam.gserviceaccount.com` the Pub/Sub Publisher role on the topic.
    
  4. Set the Google Cloud project ID to your second project:

    gcloud config set project PROJECT_TWO_ID
    
  5. In your second project, grant the role to the service account:

    gcloud pubsub topics add-iam-policy-binding my-topic \
        --member=SERVICE_ACCOUNT \
        --role=roles/pubsub.publisher
    

    Replace SERVICE_ACCOUNT with the service account email address returned in the previous step.

  6. Set the Google Cloud project ID to your first project:

    gcloud config set project PROJECT_ONE_ID
    
  7. In your first project, create a Compute Engine VM instance.

    If you are using the Google Cloud console to create the VM instance, you can accept the defaults for the purposes of this tutorial.

  8. Set the Google Cloud project ID to your second project:

    gcloud config set project PROJECT_TWO_ID
    
  9. In your second project, confirm that the generated event was logged:

    gcloud logging read "resource.labels.service_name=hello AND jsonPayload.message:beta.compute.instances.insert" --format=json
    

    A logging entry similar to the following is returned:

    Received event of type google.cloud.pubsub.topic.v1.messagePublished. Event data: {
      "logName": "projects/workflows-atamel/logs/cloudaudit.googleapis.com%2Factivity",
      "operation": {
        "id": "operation-1635330842489-5cf5321f4f454-ecc363cd-3883c08d",
        "last": true,
        "producer": "compute.googleapis.com"
      },
      "protoPayload": {
        "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
        "methodName": "beta.compute.instances.insert",
      }
    [...]
    }
    
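
    If you need to look up the sink's writer identity again, for example to grant the Pub/Sub Publisher role at a later time, you can read it from the sink in your first project:

    gcloud logging sinks describe cross-project-sink --project=PROJECT_ONE_ID --format='value(writerIdentity)'

    Replace PROJECT_ONE_ID with the ID of your first Google Cloud project.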

Clean up

If you created a new project for this tutorial, delete the project. If you used an existing project and want to keep it without the changes added in this tutorial, delete the resources created for the tutorial.

Delete the project

The easiest way to eliminate billing is to delete the project that you created for the tutorial.

To delete the project:

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.
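
You can also delete a project by using the gcloud CLI; for example:

    gcloud projects delete PROJECT_ID

Replace PROJECT_ID with the ID of the project that you want to delete.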

Delete tutorial resources

  1. Delete the Cloud Run service you deployed in this tutorial:

    gcloud run services delete SERVICE_NAME

    Replace SERVICE_NAME with the name of your service (in this tutorial, hello).

    You can also delete Cloud Run services from the Google Cloud console.

  2. Remove any gcloud CLI default configurations you added during the tutorial setup.

    For example:

    gcloud config unset run/region

    or

    gcloud config unset project

  3. Delete other Google Cloud resources created in this tutorial:

    • Delete the Eventarc trigger:

      gcloud eventarc triggers delete TRIGGER_NAME
      
      Replace TRIGGER_NAME with the name of your trigger.

    • Delete the Pub/Sub topic:

      gcloud pubsub topics delete TOPIC_ID
      
      Replace TOPIC_ID with the ID of your topic.

    • Delete the Cloud Logging sink:

      gcloud logging sinks delete SINK_NAME
      
      Replace SINK_NAME with the name of your sink.
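
    • If you created the Cloud Storage bucket and the Compute Engine VM instance in your first project only for this tutorial, you can delete them as well; for example:

      gcloud storage rm --recursive gs://BUCKET_NAME
      gcloud compute instances delete INSTANCE_NAME --zone=ZONE

      Replace BUCKET_NAME, INSTANCE_NAME, and ZONE with the bucket name, VM instance name, and zone that you used.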

What's next