This tutorial describes how to transcode low-priority offline videos using Cloud Run jobs.
Prepare your application
To retrieve the code sample for use:
Clone the sample repository to your local machine:
git clone https://github.com/GoogleCloudPlatform/cloud-run-samples
Change to the directory that contains the Cloud Run sample code:
cd cloud-run-samples/jobs-video-encoding
Create Cloud Storage buckets
To store the videos for processing and to save the results of encoding, create the following two Cloud Storage buckets:
Create a bucket to store videos before processing:
gcloud storage buckets create gs://preprocessing-PROJECT_ID \
  --location LOCATION
Replace the following:
- PROJECT_ID: your project ID.
- LOCATION: the Cloud Storage location.
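For example, with a hypothetical project ID of my-project and the us-central1 location, the filled-in command looks like this:

gcloud storage buckets create gs://preprocessing-my-project \
  --location us-central1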
Grant the service account access to read from this bucket:
gcloud storage buckets add-iam-policy-binding gs://preprocessing-PROJECT_ID \
  --member="serviceAccount:video-encoding@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/storage.objectViewer"
Replace PROJECT_ID with your project ID.
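Optionally, confirm that the binding was applied by reading the bucket's IAM policy:

gcloud storage buckets get-iam-policy gs://preprocessing-PROJECT_ID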
Create a bucket to store transcoded videos after processing:
gcloud storage buckets create gs://transcoded-PROJECT_ID \
  --location LOCATION
Replace the following:
- PROJECT_ID: your project ID.
- LOCATION: the Cloud Storage location.
Grant the service account access to read from and write to this bucket:
gcloud storage buckets add-iam-policy-binding gs://transcoded-PROJECT_ID \
  --member="serviceAccount:video-encoding@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/storage.objectAdmin"
Replace PROJECT_ID with your project ID.
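Optionally, list the buckets in your project to confirm that both buckets were created:

gcloud storage ls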
Deploy a Cloud Run job
Create a Cloud Run job by using the Dockerfile in the sample repository and mounting the buckets that you created:
Navigate to the sample directory:
cd cloud-run-samples/jobs-video-encoding
Create an Artifact Registry repository if the default Cloud Run repository doesn't already exist:
gcloud artifacts repositories create cloud-run-source-deploy \
  --repository-format=docker \
  --location LOCATION
Replace LOCATION with the location of the repository.
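To check whether the repository already exists before you create it, you can describe it; gcloud returns an error if the repository doesn't exist:

gcloud artifacts repositories describe cloud-run-source-deploy \
  --location LOCATION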
Build the container image:
gcloud builds submit \
  --tag LOCATION-docker.pkg.dev/PROJECT_ID/cloud-run-source-deploy/IMAGE_NAME \
  --machine-type E2-HIGHCPU-32
Replace the following:
- PROJECT_ID: your project ID.
- LOCATION: the location of the repository.
- IMAGE_NAME: a name for the container image, for example, ffmpeg-image.
The --machine-type flag directs Cloud Build to use a larger machine type, which reduces build time.
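To confirm that the image was pushed, you can list the images in the repository, using the same placeholder values as in the build command:

gcloud artifacts docker images list \
  LOCATION-docker.pkg.dev/PROJECT_ID/cloud-run-source-deploy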
Deploy the job:
gcloud beta run jobs create video-encoding-job \
  --image LOCATION-docker.pkg.dev/PROJECT_ID/cloud-run-source-deploy/IMAGE_NAME \
  --region REGION \
  --memory 32Gi \
  --cpu 8 \
  --gpu 1 \
  --gpu-type nvidia-l4 \
  --no-gpu-zonal-redundancy \
  --max-retries 1 \
  --service-account video-encoding@PROJECT_ID.iam.gserviceaccount.com \
  --add-volume=name=input-volume,type=cloud-storage,bucket=preprocessing-PROJECT_ID,readonly=true \
  --add-volume-mount=volume=input-volume,mount-path=/inputs \
  --add-volume=name=output-volume,type=cloud-storage,bucket=transcoded-PROJECT_ID \
  --add-volume-mount=volume=output-volume,mount-path=/outputs
Replace the following:
- PROJECT_ID: your project ID.
- REGION: the name of the region. Note: This must be a region in which you have GPU quota.
- IMAGE_NAME: the name of the container image, for example, ffmpeg-image.
If this is the first time that you have deployed from source in this project, Cloud Run prompts you to create a default Artifact Registry repository.
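Optionally, verify the job's configuration, including its GPU settings and volume mounts, before you run it:

gcloud run jobs describe video-encoding-job \
  --region REGION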
Run the job
To run the job, follow these steps:
Upload an example video to encode:
gcloud storage cp gs://cloud-samples-data/video/cat.mp4 gs://preprocessing-PROJECT_ID
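To confirm that the upload succeeded, you can list the contents of the input bucket:

gcloud storage ls gs://preprocessing-PROJECT_ID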
Run the job:
gcloud run jobs execute video-encoding-job \
  --region REGION \
  --wait \
  --args="cat.mp4,encoded_cat.mp4,-vcodec,h264_nvenc,-cq,21,-movflags,+faststart"
The entrypoint.sh file requires an input file, an output file, and any arguments to send to FFmpeg; a sketch of this interface follows the next step.

Review the Cloud Run logs to make sure the video transcoded:
gcloud run jobs logs read video-encoding-job --region REGION
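For reference, the following is a minimal sketch of the interface that the --args value above assumes: the first argument names the input file on the /inputs volume, the second names the output file on the /outputs volume, and the remaining arguments are passed through to FFmpeg. The actual entrypoint.sh in the sample repository may differ in its details.

#!/bin/bash
# Minimal sketch only; the sample repository's entrypoint.sh may differ.
set -euo pipefail

INPUT_FILE="$1"    # for example, cat.mp4, read from the mounted /inputs volume
OUTPUT_FILE="$2"   # for example, encoded_cat.mp4, written to the mounted /outputs volume
shift 2            # the remaining arguments are passed straight to FFmpeg

ffmpeg -i "/inputs/${INPUT_FILE}" "$@" "/outputs/${OUTPUT_FILE}"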
Download the transcoded video:
gcloud storage cp gs://transcoded-PROJECT_ID/encoded_cat.mp4 .
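If FFmpeg is installed on your local machine, you can inspect the downloaded file with ffprobe to confirm that the video stream was encoded with H.264:

ffprobe encoded_cat.mp4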