This tutorial demonstrates using Cloud Run, the Cloud Vision API, and ImageMagick to detect and blur offensive images uploaded to a Cloud Storage bucket. It builds on the Use Pub/Sub with Cloud Run tutorial.
This tutorial walks through modifying an existing sample app. You can also download the completed sample if you want.
Objectives
- Write, build, and deploy an asynchronous data processing service to Cloud Run.
- Invoke the service by uploading a file to Cloud Storage, creating a Pub/Sub message.
- Use the Cloud Vision API to detect violent or adult content.
- Use ImageMagick to blur offensive images.
- Test the service by uploading an image of a flesh-eating zombie.
Costs
In this document, you use the following billable components of Google Cloud: Artifact Registry, Cloud Build, Pub/Sub, Cloud Run, Cloud Storage, and Cloud Vision.
To generate a cost estimate based on your projected usage, use the pricing calculator.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
- In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
- Make sure that billing is enabled for your Google Cloud project.
- Enable the Artifact Registry, Cloud Build, Pub/Sub, Cloud Run, Cloud Storage, and Cloud Vision APIs.
- Install and initialize the gcloud CLI.
- Update components:
gcloud components update
- Set up a Pub/Sub topic, a secure push subscription, and an initial Cloud Run service to handle messages by following the Use Pub/Sub tutorial.
Required roles
To get the permissions that you need to complete the tutorial, ask your administrator to grant you the following IAM roles on your project:
- Cloud Build Editor (roles/cloudbuild.builds.editor)
- Cloud Run Admin (roles/run.admin)
- Logs View Accessor (roles/logging.viewAccessor)
- Project IAM Admin (roles/resourcemanager.projectIamAdmin)
- Pub/Sub Admin (roles/pubsub.admin)
- Service Account User (roles/iam.serviceAccountUser)
- Service Usage Consumer (roles/serviceusage.serviceUsageConsumer)
- Storage Admin (roles/storage.admin)
For more information about granting roles, see Manage access to projects, folders, and organizations.
You might also be able to get the required permissions through custom roles or other predefined roles.
Setting up gcloud defaults
To configure gcloud with defaults for your Cloud Run service:
Set your default project:
gcloud config set project PROJECT_ID
Replace PROJECT_ID with the ID of the project you created for this tutorial.
Configure gcloud for your chosen region:
gcloud config set run/region REGION
Replace REGION with the supported Cloud Run region of your choice.
Cloud Run locations
Cloud Run is regional, which means the infrastructure that
runs your Cloud Run services is located in a specific region and is
managed by Google to be redundantly available across
all the zones within that region.
Your latency, availability, and durability requirements are the primary factors for selecting the region where your Cloud Run services run.
You can generally select the region nearest to your users, but you should also consider the location of the other Google Cloud products that your Cloud Run service uses.
Using Google Cloud products together across multiple locations can affect your service's latency as well as its cost.
Cloud Run is available in the following regions:
Subject to Tier 1 pricing:
- asia-east1 (Taiwan)
- asia-northeast1 (Tokyo)
- asia-northeast2 (Osaka)
- asia-south1 (Mumbai, India)
- europe-north1 (Finland) Low CO2
- europe-southwest1 (Madrid) Low CO2
- europe-west1 (Belgium) Low CO2
- europe-west4 (Netherlands) Low CO2
- europe-west8 (Milan)
- europe-west9 (Paris) Low CO2
- me-west1 (Tel Aviv)
- us-central1 (Iowa) Low CO2
- us-east1 (South Carolina)
- us-east4 (Northern Virginia)
- us-east5 (Columbus)
- us-south1 (Dallas) Low CO2
- us-west1 (Oregon) Low CO2
Subject to Tier 2 pricing:
- africa-south1 (Johannesburg)
- asia-east2 (Hong Kong)
- asia-northeast3 (Seoul, South Korea)
- asia-southeast1 (Singapore)
- asia-southeast2 (Jakarta)
- asia-south2 (Delhi, India)
- australia-southeast1 (Sydney)
- australia-southeast2 (Melbourne)
- europe-central2 (Warsaw, Poland)
- europe-west10 (Berlin) Low CO2
- europe-west12 (Turin)
- europe-west2 (London, UK) Low CO2
- europe-west3 (Frankfurt, Germany) Low CO2
- europe-west6 (Zurich, Switzerland) Low CO2
- me-central1 (Doha)
- me-central2 (Dammam)
- northamerica-northeast1 (Montreal) Low CO2
- northamerica-northeast2 (Toronto) Low CO2
- southamerica-east1 (Sao Paulo, Brazil) Low CO2
- southamerica-west1 (Santiago, Chile) Low CO2
- us-west2 (Los Angeles)
- us-west3 (Salt Lake City)
- us-west4 (Las Vegas)
If you already created a Cloud Run service, you can view the region in the Cloud Run dashboard in the Google Cloud console.
Understanding the sequence of operations
The flow of data in this tutorial follows these steps:
- A user uploads an image to a Cloud Storage bucket.
- Cloud Storage publishes a message about the new file to Pub/Sub.
- Pub/Sub pushes the message to the Cloud Run service.
- The Cloud Run service retrieves the image file referenced in the Pub/Sub message.
- The Cloud Run service uses the Cloud Vision API to analyze the image.
- If violent or adult content is detected, the Cloud Run service uses ImageMagick to blur the image.
- The Cloud Run service uploads the blurred image to another Cloud Storage bucket for use.
Subsequent use of the blurred image is left as an exercise for the reader.
Create an Artifact Registry standard repository
Create an Artifact Registry standard repository to store your container image:
gcloud artifacts repositories create REPOSITORY \
    --repository-format=docker \
    --location=REGION
Replace:
- REPOSITORY with a unique name for the repository.
- REGION with the Google Cloud region to be used for the Artifact Registry repository.
Set up Cloud Storage buckets
gcloud
Create a Cloud Storage bucket for uploading images, where INPUT_BUCKET_NAME is a globally unique bucket name:
gcloud storage buckets create gs://INPUT_BUCKET_NAME
The Cloud Run service only reads from this bucket.
Create a second Cloud Storage bucket to receive blurred images, where BLURRED_BUCKET_NAME is a globally unique bucket name:
gcloud storage buckets create gs://BLURRED_BUCKET_NAME
The Cloud Run service uploads blurred images to this bucket. Using a separate bucket prevents processed images from re-triggering the service.
By default, Cloud Run revisions execute as the Compute Engine default service account. If, instead, you are using a user-managed service account, ensure that you have assigned the required IAM roles so that it has the storage.objects.get permission for reading from INPUT_BUCKET_NAME and the storage.objects.create permission for uploading to BLURRED_BUCKET_NAME.
Terraform
To learn how to apply or remove a Terraform configuration, see Basic Terraform commands.
Create two Cloud Storage buckets: one for uploading original images and another for the Cloud Run service to upload blurred images.
To create both Cloud Storage buckets with globally unique names, add the following to your existing main.tf file:
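The sample's Terraform isn't reproduced here, so the following is a minimal sketch only. It assumes a random suffix is used to keep the bucket names globally unique, and hard-codes us-central1 as the location; the resource names imageproc_input and imageproc_output match the names referenced in the note below.

```hcl
# Random suffix so that both bucket names are globally unique.
resource "random_id" "bucket_suffix" {
  byte_length = 4
}

# Bucket that receives the original image uploads.
resource "google_storage_bucket" "imageproc_input" {
  name     = "input-bucket-${random_id.bucket_suffix.hex}"
  location = "us-central1"
}

# Bucket that the Cloud Run service writes blurred images to.
resource "google_storage_bucket" "imageproc_output" {
  name     = "output-bucket-${random_id.bucket_suffix.hex}"
  location = "us-central1"
}
```

Keeping the two buckets as separate resources mirrors the gcloud instructions: the service reads only from the input bucket and writes only to the output bucket.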
By default, Cloud Run revisions execute as the Compute Engine default service account. If, instead, you are using a user-managed service account, ensure that you have assigned the required IAM roles so that it has the storage.objects.get permission for reading from google_storage_bucket.imageproc_input and the storage.objects.create permission for uploading to google_storage_bucket.imageproc_output.
In the following steps, you create and deploy a service that processes notifications of file uploads to INPUT_BUCKET_NAME. You turn on notification delivery only after you deploy and test the service, to avoid premature invocation of the new service.
Modify the Pub/Sub tutorial sample code
This tutorial builds on the code assembled in the Use Pub/Sub tutorial. If you have not yet completed that tutorial, do so now, skipping the cleanup steps, then return here to add image processing behavior.
Add image processing code
The image processing code is separated from request handling for readability and ease of testing. To add image processing code:
Change to the directory of the Pub/Sub tutorial sample code.
Add code to import the image processing dependencies, including libraries to integrate with Google Cloud services, ImageMagick, and the file system.
Node.js

Open a new image.js file in your editor, and copy in the following:

Python

Open a new image.py file in your editor, and copy in the following:

Go

Open a new imagemagick/imagemagick.go file in your editor, and copy in the following:

Java

Open a new src/main/java/com/example/cloudrun/ImageMagick.java file in your editor, and copy in the following:

Add code to receive a Pub/Sub message as an event object and to control the image processing.
The event contains data about the originally uploaded image. This code determines whether the image needs to be blurred by checking the results of a Cloud Vision analysis for violent or adult content.
Node.js
Python
Go
Java
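The official samples are not reproduced above. As a rough sketch of the decision logic only (in Python, with a hypothetical helper name), the SafeSearch likelihoods returned by the Cloud Vision API can be mapped to a blur decision as follows. In the real service, the ratings come from the Vision API's safe_search_detection call; here the function takes them as plain strings so the logic is self-contained:

```python
# Sketch of the blur decision; a hypothetical helper, not the official sample.
# The Cloud Vision SafeSearch annotation rates each category on a likelihood
# scale from UNKNOWN up to VERY_LIKELY.
LIKELIHOODS = [
    "UNKNOWN", "VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY",
]

def should_blur(adult: str, violence: str, threshold: str = "POSSIBLE") -> bool:
    """Return True if either the adult or violence rating meets the threshold."""
    limit = LIKELIHOODS.index(threshold)
    return (LIKELIHOODS.index(adult) >= limit
            or LIKELIHOODS.index(violence) >= limit)
```

With this sketch, an image rated VERY_UNLIKELY on both categories passes through unchanged, while one rated LIKELY for violence triggers the blur.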
Retrieve the referenced image from the Cloud Storage input bucket created above, use ImageMagick to transform the image with a blur effect, and upload the result to the output bucket.
Node.js
Python
Go
Java
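The download-blur-upload code itself is in the samples rather than shown here. As an illustrative sketch (hypothetical helpers, assuming the ImageMagick convert binary is available in the container), the transformation boils down to a single command; building it as a list keeps it easy to hand to subprocess.run:

```python
import subprocess

def blur_command(src: str, dst: str, radius: str = "0x8") -> list[str]:
    """Build the ImageMagick command that writes a blurred copy of src to dst."""
    return ["convert", src, "-blur", radius, dst]

def blur_image(src: str, dst: str) -> None:
    """Run ImageMagick to blur the image (requires `convert` on PATH)."""
    subprocess.run(blur_command(src, dst), check=True)
```

In the real service, the source file is first downloaded from the input bucket with the Cloud Storage client library, and the blurred result is uploaded to BLURRED_BUCKET_NAME afterwards.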
Integrate image processing into the Pub/Sub sample code
To modify the existing service to incorporate the image processing code:
Add new dependencies for your service, including the Cloud Vision and Cloud Storage client libraries:
Node.js
npm install --save gm @google-cloud/storage @google-cloud/vision
Python

Add the necessary client libraries so that your requirements.txt file looks something like this:

Go

The Go sample application uses Go modules. The new dependencies added in the imagemagick/imagemagick.go import statement are downloaded automatically by the next command that needs them.

Java
Add the following dependency under <dependencyManagement> in the pom.xml:

Add the following dependencies under <dependencies> in the pom.xml:

Add the ImageMagick system package to your container by modifying the Dockerfile below the FROM statement. If you are using a multi-stage Dockerfile, place this in the final stage.

Debian/Ubuntu

Alpine

Read more about working with system packages in your Cloud Run service in the Using system packages tutorial.
Replace the existing Pub/Sub message handling code with a function call to our new blurring logic.
Node.js

The app.js file defines the Express.js app and prepares received Pub/Sub messages for use. Make the following changes:

- Add code to import the new image.js file
- Remove the existing "Hello World" code from the route
- Add code to further validate the Pub/Sub message
- Add code to call the new image processing function

When you are finished, the code will look like this:
Python

The main.py file defines the Flask app and prepares received Pub/Sub messages for use. Make the following changes:

- Add code to import the new image.py file
- Remove the existing "Hello World" code from the route
- Add code to further validate the Pub/Sub message
- Add code to call the new image processing function

When you are finished, the code will look like this:
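The finished sample is in the repository rather than reproduced here, but the validation step can at least be sketched with the standard library alone: a Pub/Sub push request wraps the event in a JSON envelope whose message.data field is base64-encoded, so the handler has to unwrap it before reading the Cloud Storage event. The helper below is illustrative, not the official sample code:

```python
import base64
import json

def parse_pubsub_envelope(body: bytes) -> dict:
    """Validate a Pub/Sub push envelope and return the decoded event payload.

    Raises ValueError if the envelope is malformed.
    """
    envelope = json.loads(body)
    if not isinstance(envelope, dict) or "message" not in envelope:
        raise ValueError("invalid Pub/Sub message format")
    data = envelope["message"].get("data", "")
    # The Cloud Storage notification (object name, bucket, ...) is
    # base64-encoded JSON inside message.data.
    return json.loads(base64.b64decode(data))

# Example envelope shaped like what Cloud Storage + Pub/Sub would deliver:
event = {"name": "zombie.jpg", "bucket": "INPUT_BUCKET_NAME"}
body = json.dumps(
    {"message": {"data": base64.b64encode(json.dumps(event).encode()).decode()}}
).encode()
```

In the Flask route, this kind of parsing runs inside a try/except that returns a 400 response on ValueError; on success, the decoded event's name and bucket fields are handed to the image-processing function.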
Go

The main.go file defines the HTTP service and prepares received Pub/Sub messages for use. Make the following changes:

- Add code to import the new imagemagick.go file
- Remove the existing "Hello World" code from the handler
- Add code to further validate the Pub/Sub message
- Add code to call the new image processing function
Java

The PubSubController.java file defines the controller that handles HTTP requests and prepares received Pub/Sub messages for use. Make the following changes:

- Add the new imports
- Remove the existing "Hello World" code from the controller
- Add code to further validate the Pub/Sub message
- Add code to call the new image processing function
Download the complete sample
To retrieve the complete Image Processing code sample for use:
Clone the sample app repository to your local machine:
Node.js
git clone https://github.com/GoogleCloudPlatform/nodejs-docs-samples.git
Alternatively, you can download the sample as a zip file and extract it.
Python
git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git
Alternatively, you can download the sample as a zip file and extract it.
Go
git clone https://github.com/GoogleCloudPlatform/golang-samples.git
Alternatively, you can download the sample as a zip file and extract it.
Java
git clone https://github.com/GoogleCloudPlatform/java-docs-samples.git
Alternatively, you can download the sample as a zip file and extract it.
Change to the directory that contains the Cloud Run sample code:
Node.js
cd nodejs-docs-samples/run/image-processing/
Python
cd python-docs-samples/run/image-processing/
Go
cd golang-samples/run/image-processing/
Java
cd java-docs-samples/run/image-processing/
Ship the code
Shipping code consists of three steps: building a container image with Cloud Build, uploading the container image to Artifact Registry, and deploying the container image to Cloud Run.
To ship your code:
Build your container and publish on Artifact Registry:
Node.js
gcloud builds submit --tag REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub

Where pubsub is the name of your service. Replace:
- PROJECT_ID with your Google Cloud project ID
- REPOSITORY with the name of the Artifact Registry repository.
- REGION with the Google Cloud region to be used for the Artifact Registry repository.
Upon success, you will see a SUCCESS message containing the ID, creation time, and image name. The image is stored in Artifact Registry and can be re-used if required.
Python
gcloud builds submit --tag REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub

Where pubsub is the name of your service. Replace:
- PROJECT_ID with your Google Cloud project ID
- REPOSITORY with the name of the Artifact Registry repository.
- REGION with the Google Cloud region to be used for the Artifact Registry repository.
Upon success, you will see a SUCCESS message containing the ID, creation time, and image name. The image is stored in Artifact Registry and can be re-used if required.
Go
gcloud builds submit --tag REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub

Where pubsub is the name of your service. Replace:
- PROJECT_ID with your Google Cloud project ID
- REPOSITORY with the name of the Artifact Registry repository.
- REGION with the Google Cloud region to be used for the Artifact Registry repository.
Upon success, you will see a SUCCESS message containing the ID, creation time, and image name. The image is stored in Artifact Registry and can be re-used if required.
Java
This sample uses Jib to build Docker images using common Java tools. Jib optimizes container builds without the need for a Dockerfile or having Docker installed. Learn more about building Java containers with Jib.

Using the Dockerfile, configure and build a base image with the system packages installed to override Jib's default base image:

gcloud builds submit --tag REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/imagemagick
Replace:
- PROJECT_ID with your Google Cloud project ID
- REPOSITORY with the name of the Artifact Registry repository.
- REGION with the Google Cloud region to be used for the Artifact Registry repository.
Use the gcloud credential helper to authorize Docker to push to your Artifact Registry.
gcloud auth configure-docker
Build your final container with Jib and publish on Artifact Registry:
mvn compile jib:build \
    -Dimage=REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub \
    -Djib.from.image=REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/imagemagick
Replace:
- PROJECT_ID with your Google Cloud project ID
- REPOSITORY with the name of the Artifact Registry repository.
- REGION with the Google Cloud region to be used for the Artifact Registry repository.
Run the following command to deploy your service, using the same service name you used in the Use Pub/Sub tutorial:
Node.js

gcloud run deploy pubsub-tutorial --image REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub --set-env-vars=BLURRED_BUCKET_NAME=BLURRED_BUCKET_NAME --no-allow-unauthenticated

Python

gcloud run deploy pubsub-tutorial --image REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub --set-env-vars=BLURRED_BUCKET_NAME=BLURRED_BUCKET_NAME --no-allow-unauthenticated

Go

gcloud run deploy pubsub-tutorial --image REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub --set-env-vars=BLURRED_BUCKET_NAME=BLURRED_BUCKET_NAME --no-allow-unauthenticated

Java

gcloud run deploy pubsub-tutorial --image REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub --set-env-vars=BLURRED_BUCKET_NAME=BLURRED_BUCKET_NAME --memory 512M --no-allow-unauthenticated
Where pubsub is the container name and pubsub-tutorial is the name of the service. Notice that the container image is deployed to the service and region that you configured previously under Setting up gcloud defaults. Replace:

- PROJECT_ID with your Google Cloud project ID
- REPOSITORY with the name of the Artifact Registry repository.
- REGION with the Google Cloud region to be used for the Artifact Registry repository.
- BLURRED_BUCKET_NAME with the name of the Cloud Storage bucket you created earlier to receive blurred images; this sets the environment variable.
The --no-allow-unauthenticated flag restricts unauthenticated access to the service. By keeping the service private, you can rely on Cloud Run's automatic Pub/Sub integration to authenticate requests. See Integrating with Pub/Sub for more details on how this is configured, and Managing Access for more details on IAM-based authentication.

Wait until the deployment is complete: this can take about half a minute. On success, the command line displays the service URL.
Turn on notifications from Cloud Storage
Configure Cloud Storage to publish a message to a Pub/Sub topic whenever a file (known as an object) is uploaded or changed. Send the notification to the previously created topic so that any new file upload invokes the service.
gcloud
gcloud storage service-agent --project=PROJECT_ID

gcloud storage buckets notifications create gs://INPUT_BUCKET_NAME --topic=myRunTopic --payload-format=json

Where myRunTopic is the topic you created in the previous tutorial. Replace INPUT_BUCKET_NAME with the name you used when you created the buckets.
For more details about storage bucket notifications, read object change notifications.
Terraform
To learn how to apply or remove a Terraform configuration, see Basic Terraform commands.
To enable notifications, the Cloud Storage service account unique to the project must exist and have the pubsub.publisher IAM permission on the Pub/Sub topic. To grant this permission and create a Cloud Storage notification, add the following to your existing main.tf file:
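The Terraform itself is not reproduced here; the sketch below shows the general shape of the configuration. It assumes the topic was declared earlier in main.tf as google_pubsub_topic.default (an assumed name), and it references the input bucket by the resource name used earlier in this tutorial:

```hcl
# Look up the project's Cloud Storage service account (creating it if needed).
data "google_storage_project_service_account" "gcs_account" {
}

# Allow that service account to publish to the Pub/Sub topic.
resource "google_pubsub_topic_iam_binding" "binding" {
  topic   = google_pubsub_topic.default.id
  role    = "roles/pubsub.publisher"
  members = ["serviceAccount:${data.google_storage_project_service_account.gcs_account.email_address}"]
}

# Publish a message whenever an object is uploaded to the input bucket.
resource "google_storage_notification" "notification" {
  bucket         = google_storage_bucket.imageproc_input.name
  payload_format = "JSON"
  topic          = google_pubsub_topic.default.id
  depends_on     = [google_pubsub_topic_iam_binding.binding]
}
```

The depends_on ensures the publish permission exists before Terraform attempts to create the notification, since creation fails without it.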
Try it out
Upload an offensive image, such as this image of a flesh-eating zombie:
curl -o zombie.jpg https://cdn.pixabay.com/photo/2015/09/21/14/24/zombie-949916_960_720.jpg

gcloud storage cp zombie.jpg gs://INPUT_BUCKET_NAME
where INPUT_BUCKET_NAME is the Cloud Storage bucket you created earlier for uploading images.
Navigate to the service logs:
- Navigate to the Cloud Run page in the Google Cloud console.
- Click the pubsub-tutorial service.
- Select the Logs tab. Logs might take a few moments to appear. If you don't see them immediately, check again after a few moments.
- Look for the Blurred image: zombie.png message.

You can view the blurred images in the BLURRED_BUCKET_NAME Cloud Storage bucket you created earlier: locate the bucket in the Cloud Storage page in the Google Cloud console.
Clean up
If you created a new project for this tutorial, delete the project. If you used an existing project and wish to keep it without the changes added in this tutorial, delete resources created for the tutorial.
Deleting the project
The easiest way to eliminate billing is to delete the project that you created for the tutorial.
To delete the project:
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.
Deleting tutorial resources
Delete the Cloud Run service you deployed in this tutorial:
gcloud run services delete SERVICE-NAME
Where SERVICE-NAME is your chosen service name.
You can also delete Cloud Run services from the Google Cloud console.
Remove the gcloud default region configuration you added during tutorial setup:
gcloud config unset run/region
Remove the project configuration:
gcloud config unset project
Delete other Google Cloud resources created in this tutorial:
- Delete the Pub/Sub topic
myRunTopic
- Delete the Pub/Sub subscription
myRunSubscription
- Delete your container image from Artifact Registry.
- Delete the invoker service account
cloud-run-pubsub-invoker@PROJECT_ID.iam.gserviceaccount.com
- Delete the Cloud Storage buckets created for the placeholders
INPUT_BUCKET_NAME
andBLURRED_BUCKET_NAME
- Delete the Pub/Sub topic
What's next
- Learn more about persisting data with Cloud Run using Cloud Storage.
- Understand how to use the Cloud Vision API to detect features other than explicit content.
- Explore reference architectures, diagrams, and best practices about Google Cloud. Take a look at our Cloud Architecture Center.