Creating overlays

This page explains how to insert overlays into transcoded videos. An overlay consists of a JPEG image that is inserted on top of the output video, and can optionally be faded in or out over a specified time period. To insert an overlay, use the overlays array in the JobConfig template.

Upload an image to Cloud Storage

To get started, do the following to upload an overlay image to your Cloud Storage bucket:

  1. In the Cloud Console, go to the Cloud Storage Browser page.
  2. Click the name of your bucket to open it.
  3. Click Upload files.
  4. Select a JPEG file to upload from your local machine.
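
Alternatively, you can upload the image from the command line. The following is a minimal sketch using gsutil; the bucket name my-bucket, the input/ folder, and the file name my-overlay.jpg are placeholders for your own values:

    gsutil cp my-overlay.jpg gs://my-bucket/input/my-overlay.jpg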

Create an overlay

You can create two types of overlays: static and animated. Both types use a static image. You can show or hide static overlays. Animated overlays support fading the image in and out.

You can insert multiple overlays into a single output video.

Create a static overlay

In the image object, use the uri field to specify the overlay image in Cloud Storage. In the resolution object, set the x and y values from 0 to 1.0. A value of 0 maintains the source image resolution for that dimension; a value of 1.0 stretches the image to match the dimension of the output video. For example, use the values x: 1 and y: 0.5 to stretch the overlay image to the full width and half the height of the output video.

In the animations array, create an animationStatic object with x and y coordinates from 0 to 1.0. These coordinates are based on the output video resolution. Use the values x: 0 and y: 0 to position the top-left corner of the overlay in the top-left corner of the output video. Specify when the overlay should appear in the output video timeline using the startTimeOffset field.

To remove the static overlay, create an animationEnd object. Specify when the animation should end (that is, when the overlay should disappear) in the output video timeline using the startTimeOffset field.

You can add this configuration to a job template or include it in an ad-hoc job configuration:

REST & CMD LINE

Before using any of the request data below, make the following replacements:

  • PROJECT_ID: Your Google Cloud project ID listed in the IAM Settings.
  • LOCATION: The location where your job will run. Use a location from the following list:
    • us-central1
    • us-west1
    • us-east1
    • southamerica-east1
    • asia-east1
    • europe-west1
  • GCS_BUCKET_NAME: The name of the Cloud Storage bucket you created.
  • GCS_INPUT_VIDEO: The name of the video in your Cloud Storage bucket that you are transcoding, such as my-vid.mp4. This field should take into account any folders that you created in the bucket (for example, input/my-vid.mp4).
  • GCS_INPUT_OVERLAY: The name of the JPEG image in your Cloud Storage bucket that you are using for the overlay, such as my-overlay.jpg. This field should take into account any folders that you created in the bucket (for example, input/my-overlay.jpg).
  • GCS_OUTPUT_FOLDER: The Cloud Storage folder name where you want to save the encoded video outputs.

Request JSON body:

{
  "config": {
    "inputs": [
          {
            "key": "input0",
            "uri": "gs://GCS_BUCKET_NAME/GCS_INPUT_VIDEO"
          }
        ],
    "elementaryStreams": [
      {
        "key": "video-stream0",
        "videoStream": {
          "codec": "h264",
          "heightPixels": 360,
          "widthPixels": 640,
          "bitrateBps": 550000,
          "frameRate": 60
        }
      },
      {
        "key": "audio-stream0",
        "audioStream": {
          "codec": "aac",
          "bitrateBps": 64000
        }
      }
    ],
    "muxStreams": [
      {
        "key": "sd",
        "container": "mp4",
        "elementaryStreams": [
          "video-stream0",
          "audio-stream0"
        ]
      }
    ],
    "output": {
      "uri": "gs://GCS_BUCKET_NAME/GCS_OUTPUT_FOLDER/"
    },
    "overlays": [
      {
        "image": {
          "uri": "gs://GCS_BUCKET_NAME/GCS_INPUT_OVERLAY",
          "resolution": {
            "x": 1,
            "y": 0.5
          },
          "alpha": 1
        },
        "animations": [
          {
            "animationStatic": {
              "xy": {
                "x": 0,
                "y": 0
              },
              "startTimeOffset": "0s"
            }
          },
          {
            "animationEnd": {
              "startTimeOffset": "10s"
            }
          }
        ]
      }
    ]
  }
}

To send your request, save the request body to a file (for example, request.json) and POST it to the Transcoder API jobs endpoint.
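
For example, you can send the request with curl. This is a sketch only: it assumes you saved the request body as request.json, that the gcloud CLI is installed so it can mint an access token, and that v1beta1 matches the API version you are targeting.

    curl -X POST \
        -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
        -H "Content-Type: application/json; charset=utf-8" \
        -d @request.json \
        "https://transcoder.googleapis.com/v1beta1/projects/PROJECT_ID/locations/LOCATION/jobs"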

You should receive a JSON response similar to the following:

{
  "name": "projects/PROJECT_NUMBER/locations/LOCATION/jobs/JOB_ID",
  "config": {
   ...
  },
  "createTime": CREATE_TIME,
  "ttlAfterCompletionDays": 30
}

gcloud

  1. Create a config.json file that defines the job fields. Make the following replacements for an example job:
    • GCS_BUCKET_NAME: The name of the Cloud Storage bucket you created.
    • GCS_INPUT_VIDEO: The name of the video in your Cloud Storage bucket that you are transcoding, such as my-vid.mp4. This field should take into account any folders that you created in the bucket (for example, input/my-vid.mp4).
    • GCS_INPUT_OVERLAY: The name of the JPEG image in your Cloud Storage bucket that you are using for the overlay, such as my-overlay.jpg. This field should take into account any folders that you created in the bucket (for example, input/my-overlay.jpg).
    • LOCATION: The location where your job will run. Use a location from the following list:
      • us-central1
      • us-west1
      • us-east1
      • southamerica-east1
      • asia-east1
      • europe-west1
    • GCS_OUTPUT_FOLDER: The Cloud Storage folder name where you want to save the encoded video outputs.
    {
      "config": {
        "inputs": [
              {
                "key": "input0",
                "uri": "gs://GCS_BUCKET_NAME/GCS_INPUT_VIDEO"
              }
            ],
        "elementaryStreams": [
          {
            "key": "video-stream0",
            "videoStream": {
              "codec": "h264",
              "heightPixels": 360,
              "widthPixels": 640,
              "bitrateBps": 550000,
              "frameRate": 60
            }
          },
          {
            "key": "audio-stream0",
            "audioStream": {
              "codec": "aac",
              "bitrateBps": 64000
            }
          }
        ],
        "muxStreams": [
          {
            "key": "sd",
            "container": "mp4",
            "elementaryStreams": [
              "video-stream0",
              "audio-stream0"
            ]
          }
        ],
        "output": {
          "uri": "gs://GCS_BUCKET_NAME/GCS_OUTPUT_FOLDER/"
        },
        "overlays": [
          {
            "image": {
              "uri": "gs://GCS_BUCKET_NAME/GCS_INPUT_OVERLAY",
              "resolution": {
                "x": 1,
                "y": 0.5
              },
              "alpha": 1
            },
            "animations": [
              {
                "animationStatic": {
                  "xy": {
                    "x": 0,
                    "y": 0
                  },
                  "startTimeOffset": "0s"
                }
              },
              {
                "animationEnd": {
                  "startTimeOffset": "10s"
                }
              }
            ]
          }
        ]
      }
    }
    
  2. Run the following command:
    gcloud beta transcoder jobs create \
        --input-uri="gs://GCS_BUCKET_NAME/GCS_INPUT_VIDEO" \
        --location=LOCATION \
        --output-uri="gs://GCS_BUCKET_NAME/GCS_OUTPUT_FOLDER/" \
        --file="config.json"
    You should see a response similar to the following:
    {
      "name": "projects/PROJECT_NUMBER/locations/LOCATION/jobs/JOB_ID",
      "config": {
       ...
      },
      "createTime": CREATE_TIME,
      "ttlAfterCompletionDays": 30
    }
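  3. Optionally, check the job status. Jobs take a variable amount of time to run. The following is a sketch; it assumes the describe command is available in your gcloud beta transcoder component and that JOB_ID is the identifier from the name field in the response above:
    gcloud beta transcoder jobs describe JOB_ID \
        --location=LOCATION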
    

Go

Before trying this sample, follow the Go setup instructions in the Transcoder API Quickstart Using Client Libraries. For more information, see the Transcoder API Go API reference documentation.

import (
	"context"
	"fmt"
	"io"

	"github.com/golang/protobuf/ptypes/duration"

	transcoder "cloud.google.com/go/video/transcoder/apiv1beta1"
	transcoderpb "google.golang.org/genproto/googleapis/cloud/video/transcoder/v1beta1"
)

// createJobWithStaticOverlay creates a job based on a given configuration that
// includes a static overlay. See
// https://cloud.google.com/transcoder/docs/how-to/create-overlays#create-static-overlay
// for more information.
func createJobWithStaticOverlay(w io.Writer, projectID string, location string, inputURI string, overlayImageURI string, outputURI string) error {
	// projectID := "my-project-id"
	// location := "us-central1"
	// inputURI := "gs://my-bucket/my-video-file"
	// overlayImageURI := "gs://my-bucket/my-overlay-image-file" - Must be a JPEG
	// outputURI := "gs://my-bucket/my-output-folder/"
	ctx := context.Background()
	client, err := transcoder.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("NewClient: %v", err)
	}
	defer client.Close()

	req := &transcoderpb.CreateJobRequest{
		Parent: fmt.Sprintf("projects/%s/locations/%s", projectID, location),
		Job: &transcoderpb.Job{
			InputUri:  inputURI,
			OutputUri: outputURI,
			JobConfig: &transcoderpb.Job_Config{
				Config: &transcoderpb.JobConfig{
					ElementaryStreams: []*transcoderpb.ElementaryStream{
						&transcoderpb.ElementaryStream{
							Key: "video_stream0",
							ElementaryStream: &transcoderpb.ElementaryStream_VideoStream{
								VideoStream: &transcoderpb.VideoStream{
									Codec:        "h264",
									BitrateBps:   550000,
									FrameRate:    60,
									HeightPixels: 360,
									WidthPixels:  640,
								},
							},
						},
						&transcoderpb.ElementaryStream{
							Key: "audio_stream0",
							ElementaryStream: &transcoderpb.ElementaryStream_AudioStream{
								AudioStream: &transcoderpb.AudioStream{
									Codec:      "aac",
									BitrateBps: 64000,
								},
							},
						},
					},
					MuxStreams: []*transcoderpb.MuxStream{
						&transcoderpb.MuxStream{
							Key:               "sd",
							Container:         "mp4",
							ElementaryStreams: []string{"video_stream0", "audio_stream0"},
						},
					},
					Overlays: []*transcoderpb.Overlay{
						&transcoderpb.Overlay{
							Image: &transcoderpb.Overlay_Image{
								Uri: overlayImageURI,
								Resolution: &transcoderpb.Overlay_NormalizedCoordinate{
									X: 1,
									Y: 0.5,
								},
								Alpha: 1,
							},
							Animations: []*transcoderpb.Overlay_Animation{
								&transcoderpb.Overlay_Animation{
									AnimationType: &transcoderpb.Overlay_Animation_AnimationStatic{
										AnimationStatic: &transcoderpb.Overlay_AnimationStatic{
											Xy: &transcoderpb.Overlay_NormalizedCoordinate{
												X: 0,
												Y: 0,
											},
											StartTimeOffset: &duration.Duration{
												Seconds: 0,
											},
										},
									},
								},

								&transcoderpb.Overlay_Animation{
									AnimationType: &transcoderpb.Overlay_Animation_AnimationEnd{
										AnimationEnd: &transcoderpb.Overlay_AnimationEnd{
											StartTimeOffset: &duration.Duration{
												Seconds: 10,
											},
										},
									},
								},
							},
						},
					},
				},
			},
		},
	}
	// Creates the job. Jobs take a variable amount of time to run.
	// You can query for the job state; see getJob() in get_job.go.
	response, err := client.CreateJob(ctx, req)
	if err != nil {
		return fmt.Errorf("createJobWithStaticOverlay: %v", err)
	}

	fmt.Fprintf(w, "Job: %v", response.GetName())
	return nil
}

Java

Before trying this sample, follow the Java setup instructions in the Transcoder API Quickstart Using Client Libraries. For more information, see the Transcoder API Java API reference documentation.


import com.google.cloud.video.transcoder.v1beta1.AudioStream;
import com.google.cloud.video.transcoder.v1beta1.CreateJobRequest;
import com.google.cloud.video.transcoder.v1beta1.ElementaryStream;
import com.google.cloud.video.transcoder.v1beta1.Input;
import com.google.cloud.video.transcoder.v1beta1.Job;
import com.google.cloud.video.transcoder.v1beta1.JobConfig;
import com.google.cloud.video.transcoder.v1beta1.LocationName;
import com.google.cloud.video.transcoder.v1beta1.MuxStream;
import com.google.cloud.video.transcoder.v1beta1.Output;
import com.google.cloud.video.transcoder.v1beta1.Overlay;
import com.google.cloud.video.transcoder.v1beta1.Overlay.AnimationEnd;
import com.google.cloud.video.transcoder.v1beta1.Overlay.AnimationStatic;
import com.google.cloud.video.transcoder.v1beta1.Overlay.NormalizedCoordinate;
import com.google.cloud.video.transcoder.v1beta1.TranscoderServiceClient;
import com.google.cloud.video.transcoder.v1beta1.VideoStream;
import com.google.protobuf.Duration;
import java.io.IOException;

public class CreateJobWithStaticOverlay {

  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "my-project-id";
    String location = "us-central1";
    String inputUri = "gs://my-bucket/my-video-file";
    String overlayImageUri = "gs://my-bucket/my-overlay-image.jpg"; // Must be a JPEG
    String outputUri = "gs://my-bucket/my-output-folder/";

    createJobWithStaticOverlay(projectId, location, inputUri, overlayImageUri, outputUri);
  }

  // Creates a job from an ad-hoc configuration and adds a static overlay to it.
  public static void createJobWithStaticOverlay(
      String projectId, String location, String inputUri, String overlayImageUri, String outputUri)
      throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests.
    try (TranscoderServiceClient transcoderServiceClient = TranscoderServiceClient.create()) {

      VideoStream videoStream0 =
          VideoStream.newBuilder()
              .setCodec("h264")
              .setBitrateBps(550000)
              .setFrameRate(60)
              .setHeightPixels(360)
              .setWidthPixels(640)
              .build();

      AudioStream audioStream0 =
          AudioStream.newBuilder().setCodec("aac").setBitrateBps(64000).build();

      // Create the overlay image. Only JPEG is supported. Image resolution is based on output
      // video resolution. To respect the original image aspect ratio, set either x or y to 0.0.
      // This example stretches the overlay image the full width and half of the height of the
      // output video.
      Overlay.Image overlayImage =
          Overlay.Image.newBuilder()
              .setUri(overlayImageUri)
              .setResolution(NormalizedCoordinate.newBuilder().setX(1).setY(0.5).build())
              .setAlpha(1)
              .build();

      // Create the starting animation (when the overlay appears). Use the values x: 0 and y: 0 to
      // position the top-left corner of the overlay in the top-left corner of the output video.
      Overlay.Animation animationStart =
          Overlay.Animation.newBuilder()
              .setAnimationStatic(
                  AnimationStatic.newBuilder()
                      .setXy(NormalizedCoordinate.newBuilder().setX(0).setY(0).build())
                      .setStartTimeOffset(Duration.newBuilder().setSeconds(0).build())
                      .build())
              .build();

      // Create the ending animation (when the overlay disappears). In this example, the overlay
      // disappears at the 10-second mark in the output video.
      Overlay.Animation animationEnd =
          Overlay.Animation.newBuilder()
              .setAnimationEnd(
                  AnimationEnd.newBuilder()
                      .setStartTimeOffset(Duration.newBuilder().setSeconds(10).build())
                      .build())
              .build();

      // Create the overlay and add the image and animations to it.
      Overlay overlay =
          Overlay.newBuilder()
              .setImage(overlayImage)
              .addAnimations(animationStart)
              .addAnimations(animationEnd)
              .build();

      JobConfig config =
          JobConfig.newBuilder()
              .addInputs(Input.newBuilder().setKey("input0").setUri(inputUri))
              .setOutput(Output.newBuilder().setUri(outputUri))
              .addElementaryStreams(
                  ElementaryStream.newBuilder()
                      .setKey("video_stream0")
                      .setVideoStream(videoStream0))
              .addElementaryStreams(
                  ElementaryStream.newBuilder()
                      .setKey("audio_stream0")
                      .setAudioStream(audioStream0))
              .addMuxStreams(
                  MuxStream.newBuilder()
                      .setKey("sd")
                      .setContainer("mp4")
                      .addElementaryStreams("video_stream0")
                      .addElementaryStreams("audio_stream0")
                      .build())
              .addOverlays(overlay) // Add the overlay to the job config
              .build();

      var createJobRequest =
          CreateJobRequest.newBuilder()
              .setJob(
                  Job.newBuilder()
                      .setInputUri(inputUri)
                      .setOutputUri(outputUri)
                      .setConfig(config)
                      .build())
              .setParent(LocationName.of(projectId, location).toString())
              .build();

      // Send the job creation request and process the response.
      Job job = transcoderServiceClient.createJob(createJobRequest);
      System.out.println("Job: " + job.getName());
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the Transcoder API Quickstart Using Client Libraries. For more information, see the Transcoder API Node.js API reference documentation.

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// projectId = 'my-project-id';
// location = 'us-central1';
// inputUri = 'gs://my-bucket/my-video-file';
// overlayImageUri = 'gs://my-bucket/my-overlay-image-file'; // Must be a JPEG
// outputUri = 'gs://my-bucket/my-output-folder/';

// Imports the Transcoder library
const {TranscoderServiceClient} = require('@google-cloud/video-transcoder');

// Instantiates a client
const transcoderServiceClient = new TranscoderServiceClient();

async function createJobFromStaticOverlay() {
  // Construct request
  const request = {
    parent: transcoderServiceClient.locationPath(projectId, location),
    job: {
      inputUri: inputUri,
      outputUri: outputUri,
      config: {
        elementaryStreams: [
          {
            key: 'video-stream0',
            videoStream: {
              codec: 'h264',
              heightPixels: 360,
              widthPixels: 640,
              bitrateBps: 550000,
              frameRate: 60,
            },
          },
          {
            key: 'audio-stream0',
            audioStream: {
              codec: 'aac',
              bitrateBps: 64000,
            },
          },
        ],
        muxStreams: [
          {
            key: 'sd',
            container: 'mp4',
            elementaryStreams: ['video-stream0', 'audio-stream0'],
          },
        ],
        overlays: [
          {
            image: {
              uri: overlayImageUri,
              resolution: {
                x: 1,
                y: 0.5,
              },
              alpha: 1.0,
            },
            animations: [
              {
                animationStatic: {
                  xy: {
                    x: 0,
                    y: 0,
                  },
                  startTimeOffset: {
                    seconds: 0,
                  },
                },
              },
              {
                animationEnd: {
                  startTimeOffset: {
                    seconds: 10,
                  },
                },
              },
            ],
          },
        ],
      },
    },
  };

  // Run request
  const [response] = await transcoderServiceClient.createJob(request);
  console.log(`Job: ${response.name}`);
}

createJobFromStaticOverlay();

Python

Before trying this sample, follow the Python setup instructions in the Transcoder API Quickstart Using Client Libraries. For more information, see the Transcoder API Python API reference documentation.


import argparse

from google.cloud.video import transcoder_v1beta1
from google.cloud.video.transcoder_v1beta1.services.transcoder_service import (
    TranscoderServiceClient,
)
from google.protobuf import duration_pb2 as duration


def create_job_with_static_overlay(
    project_id, location, input_uri, overlay_image_uri, output_uri
):
    """Creates a job based on an ad-hoc job configuration that includes a static image overlay.

    Args:
        project_id: The GCP project ID.
        location: The location to start the job in.
        input_uri: Uri of the video in the Cloud Storage bucket.
        overlay_image_uri: Uri of the JPEG image for the overlay in the Cloud Storage bucket. Must be a JPEG.
        output_uri: Uri of the video output folder in the Cloud Storage bucket."""

    client = TranscoderServiceClient()

    parent = f"projects/{project_id}/locations/{location}"
    job = transcoder_v1beta1.types.Job()
    job.input_uri = input_uri
    job.output_uri = output_uri
    job.config = transcoder_v1beta1.types.JobConfig(
        elementary_streams=[
            transcoder_v1beta1.types.ElementaryStream(
                key="video-stream0",
                video_stream=transcoder_v1beta1.types.VideoStream(
                    codec="h264",
                    height_pixels=360,
                    width_pixels=640,
                    bitrate_bps=550000,
                    frame_rate=60,
                ),
            ),
            transcoder_v1beta1.types.ElementaryStream(
                key="audio-stream0",
                audio_stream=transcoder_v1beta1.types.AudioStream(
                    codec="aac", bitrate_bps=64000
                ),
            ),
        ],
        mux_streams=[
            transcoder_v1beta1.types.MuxStream(
                key="sd",
                container="mp4",
                elementary_streams=["video-stream0", "audio-stream0"],
            ),
        ],
        overlays=[
            transcoder_v1beta1.types.Overlay(
                image=transcoder_v1beta1.types.Overlay.Image(
                    uri=overlay_image_uri,
                    resolution=transcoder_v1beta1.types.Overlay.NormalizedCoordinate(
                        x=1,
                        y=0.5,
                    ),
                    alpha=1,
                ),
                animations=[
                    transcoder_v1beta1.types.Overlay.Animation(
                        animation_static=transcoder_v1beta1.types.Overlay.AnimationStatic(
                            xy=transcoder_v1beta1.types.Overlay.NormalizedCoordinate(
                                x=0,
                                y=0,
                            ),
                            start_time_offset=duration.Duration(
                                seconds=0,
                            ),
                        ),
                    ),
                    transcoder_v1beta1.types.Overlay.Animation(
                        animation_end=transcoder_v1beta1.types.Overlay.AnimationEnd(
                            start_time_offset=duration.Duration(
                                seconds=10,
                            ),
                        ),
                    ),
                ],
            ),
        ],
    )
    response = client.create_job(parent=parent, job=job)
    print(f"Job: {response.name}")
    return response

In the output video, the static overlay has the following characteristics:

  • It appears at the beginning of the timeline and is visible for 10 seconds.
  • It stretches across the full width and half the height of the output video.
  • It is positioned in the top-left corner of the output video.

See the sample output video for this configuration. This video uses a sample overlay image.
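
Once the transcoding job finishes, you can verify the result by listing the files written to the output folder. This is a minimal sketch using gsutil; substitute your own bucket and folder names:

    gsutil ls gs://GCS_BUCKET_NAME/GCS_OUTPUT_FOLDER/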

Create an animated overlay

In the image object, use the uri field to specify the overlay image in Cloud Storage. In the resolution object, set the x and y values from 0 to 1.0. A value of 0 maintains the source image resolution for that dimension; a value of 1.0 stretches the image to match the dimension of the output video. For example, use the values x: 0 and y: 0 to maintain the original resolution of the overlay image.

In the animations array, create an animationFade object with a fadeType of FADE_IN. Set the x and y coordinates from 0 to 1.0. These coordinates are based on the output video resolution. Use the values x: 0.5 and y: 0.5 to position the top-left corner of the overlay at the center of the output video. Specify when the overlay should start to appear in the output video timeline using the startTimeOffset field. The overlay becomes fully visible at the time set in the endTimeOffset field.

To fade out the overlay, create another animationFade object. This time, set the fadeType to FADE_OUT. Specify the position coordinates and the start and end times as before.

You can add this configuration to a job template or include it in an ad-hoc job configuration:

REST & CMD LINE

Before using any of the request data below, make the following replacements:

  • PROJECT_ID: Your Google Cloud project ID listed in the IAM Settings.
  • LOCATION: The location where your job will run. Use a location from the following list:
    • us-central1
    • us-west1
    • us-east1
    • southamerica-east1
    • asia-east1
    • europe-west1
  • GCS_BUCKET_NAME: The name of the Cloud Storage bucket you created.
  • GCS_INPUT_VIDEO: The name of the video in your Cloud Storage bucket that you are transcoding, such as my-vid.mp4. This field should take into account any folders that you created in the bucket (for example, input/my-vid.mp4).
  • GCS_INPUT_OVERLAY: The name of the JPEG image in your Cloud Storage bucket that you are using for the overlay, such as my-overlay.jpg. This field should take into account any folders that you created in the bucket (for example, input/my-overlay.jpg).
  • GCS_OUTPUT_FOLDER: The Cloud Storage folder name where you want to save the encoded video outputs.

Request JSON body:

{
  "config": {
    "inputs": [
          {
            "key": "input0",
            "uri": "gs://GCS_BUCKET_NAME/GCS_INPUT_VIDEO"
          }
        ],
    "elementaryStreams": [
      {
        "key": "video-stream0",
        "videoStream": {
          "codec": "h264",
          "heightPixels": 360,
          "widthPixels": 640,
          "bitrateBps": 550000,
          "frameRate": 60
        }
      },
      {
        "key": "audio-stream0",
        "audioStream": {
          "codec": "aac",
          "bitrateBps": 64000
        }
      }
    ],
    "muxStreams": [
      {
        "key": "sd",
        "container": "mp4",
        "elementaryStreams": [
          "video-stream0",
          "audio-stream0"
        ]
      }
    ],
    "output": {
      "uri": "gs://GCS_BUCKET_NAME/GCS_OUTPUT_FOLDER/"
    },
    "overlays": [
      {
        "image": {
          "uri": "gs://GCS_BUCKET_NAME/GCS_INPUT_OVERLAY",
          "resolution": {
            "x": 0,
            "y": 0
          },
          "alpha": 1
        },
        "animations": [
          {
            "animationFade": {
              "fadeType": "FADE_IN",
              "xy": {
                "x": 0.5,
                "y": 0.5
              },
              "startTimeOffset": "5s",
              "endTimeOffset": "10s"
            }
          },
          {
            "animationFade": {
              "fadeType": "FADE_OUT",
              "xy": {
                "x": 0.5,
                "y": 0.5
              },
              "startTimeOffset": "12s",
              "endTimeOffset": "15s"
            }
          }
        ]
      }
    ]
  }
}

To send your request, save the request body to a file (for example, request.json) and POST it to the Transcoder API jobs endpoint, as shown in the curl sketch in the static overlay section.

You should receive a JSON response similar to the following:

{
  "name": "projects/PROJECT_NUMBER/locations/LOCATION/jobs/JOB_ID",
  "config": {
   ...
  },
  "createTime": CREATE_TIME,
  "ttlAfterCompletionDays": 30
}

gcloud

  1. Create a config.json file that defines the job fields. Make the following replacements for an example job:
    • GCS_BUCKET_NAME: The name of the Cloud Storage bucket you created.
    • GCS_INPUT_VIDEO: The name of the video in your Cloud Storage bucket that you are transcoding, such as my-vid.mp4. This field should take into account any folders that you created in the bucket (for example, input/my-vid.mp4).
    • GCS_INPUT_OVERLAY: The name of the JPEG image in your Cloud Storage bucket that you are using for the overlay, such as my-overlay.jpg. This field should take into account any folders that you created in the bucket (for example, input/my-overlay.jpg).
    • LOCATION: The location where your job will run. Use a location from the following list:
      • us-central1
      • us-west1
      • us-east1
      • southamerica-east1
      • asia-east1
      • europe-west1
    • GCS_OUTPUT_FOLDER: The Cloud Storage folder name where you want to save the encoded video outputs.
    {
      "config": {
        "inputs": [
              {
                "key": "input0",
                "uri": "gs://GCS_BUCKET_NAME/GCS_INPUT_VIDEO"
              }
            ],
        "elementaryStreams": [
          {
            "key": "video-stream0",
            "videoStream": {
              "codec": "h264",
              "heightPixels": 360,
              "widthPixels": 640,
              "bitrateBps": 550000,
              "frameRate": 60
            }
          },
          {
            "key": "audio-stream0",
            "audioStream": {
              "codec": "aac",
              "bitrateBps": 64000
            }
          }
        ],
        "muxStreams": [
          {
            "key": "sd",
            "container": "mp4",
            "elementaryStreams": [
              "video-stream0",
              "audio-stream0"
            ]
          }
        ],
        "output": {
          "uri": "gs://GCS_BUCKET_NAME/GCS_OUTPUT_FOLDER/"
        },
        "overlays": [
          {
            "image": {
              "uri": "gs://GCS_BUCKET_NAME/GCS_INPUT_OVERLAY",
              "resolution": {
                "x": 0,
                "y": 0
              },
              "alpha": 1
            },
            "animations": [
              {
                "animationFade": {
                  "fadeType": "FADE_IN",
                  "xy": {
                    "x": 0.5,
                    "y": 0.5
                  },
                  "startTimeOffset": "5s",
                  "endTimeOffset": "10s"
                }
              },
              {
                "animationFade": {
                  "fadeType": "FADE_OUT",
                  "xy": {
                    "x": 0.5,
                    "y": 0.5
                  },
                  "startTimeOffset": "12s",
                  "endTimeOffset": "15s"
                }
              }
            ]
          }
        ]
      }
    }
    
  2. Run the following command:
    gcloud beta transcoder jobs create \
        --input-uri="gs://GCS_BUCKET_NAME/GCS_INPUT_VIDEO" \
        --location=LOCATION \
        --output-uri="gs://GCS_BUCKET_NAME/GCS_OUTPUT_FOLDER/" \
        --file="config.json"
    You should see a response similar to the following:
    {
      "name": "projects/PROJECT_NUMBER/locations/LOCATION/jobs/JOB_ID",
      "config": {
       ...
      },
      "createTime": CREATE_TIME,
      "ttlAfterCompletionDays": 30
    }
    

Go

Before trying this sample, follow the Go setup instructions in the Transcoder API Quickstart Using Client Libraries. For more information, see the Transcoder API Go API reference documentation.

import (
	"context"
	"fmt"
	"io"

	"github.com/golang/protobuf/ptypes/duration"

	transcoder "cloud.google.com/go/video/transcoder/apiv1beta1"
	transcoderpb "google.golang.org/genproto/googleapis/cloud/video/transcoder/v1beta1"
)

// createJobWithAnimatedOverlay creates a job based on a given configuration that
// includes an animated overlay. See
// https://cloud.google.com/transcoder/docs/how-to/create-overlays#create-animated-overlay
// for more information.
func createJobWithAnimatedOverlay(w io.Writer, projectID string, location string, inputURI string, overlayImageURI string, outputURI string) error {
	// projectID := "my-project-id"
	// location := "us-central1"
	// inputURI := "gs://my-bucket/my-video-file"
	// overlayImageURI := "gs://my-bucket/my-overlay-image-file" - Must be a JPEG
	// outputURI := "gs://my-bucket/my-output-folder/"
	ctx := context.Background()
	client, err := transcoder.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("NewClient: %v", err)
	}
	defer client.Close()

	req := &transcoderpb.CreateJobRequest{
		Parent: fmt.Sprintf("projects/%s/locations/%s", projectID, location),
		Job: &transcoderpb.Job{
			InputUri:  inputURI,
			OutputUri: outputURI,
			JobConfig: &transcoderpb.Job_Config{
				Config: &transcoderpb.JobConfig{
					ElementaryStreams: []*transcoderpb.ElementaryStream{
						&transcoderpb.ElementaryStream{
							Key: "video_stream0",
							ElementaryStream: &transcoderpb.ElementaryStream_VideoStream{
								VideoStream: &transcoderpb.VideoStream{
									Codec:        "h264",
									BitrateBps:   550000,
									FrameRate:    60,
									HeightPixels: 360,
									WidthPixels:  640,
								},
							},
						},
						&transcoderpb.ElementaryStream{
							Key: "audio_stream0",
							ElementaryStream: &transcoderpb.ElementaryStream_AudioStream{
								AudioStream: &transcoderpb.AudioStream{
									Codec:      "aac",
									BitrateBps: 64000,
								},
							},
						},
					},
					MuxStreams: []*transcoderpb.MuxStream{
						&transcoderpb.MuxStream{
							Key:               "sd",
							Container:         "mp4",
							ElementaryStreams: []string{"video_stream0", "audio_stream0"},
						},
					},
					Overlays: []*transcoderpb.Overlay{
						&transcoderpb.Overlay{
							Image: &transcoderpb.Overlay_Image{
								Uri: overlayImageURI,
								Resolution: &transcoderpb.Overlay_NormalizedCoordinate{
									X: 0,
									Y: 0,
								},
								Alpha: 1,
							},
							Animations: []*transcoderpb.Overlay_Animation{
								&transcoderpb.Overlay_Animation{
									AnimationType: &transcoderpb.Overlay_Animation_AnimationFade{
										AnimationFade: &transcoderpb.Overlay_AnimationFade{
											FadeType: transcoderpb.Overlay_FADE_IN,
											Xy: &transcoderpb.Overlay_NormalizedCoordinate{
												X: 0.5,
												Y: 0.5,
											},
											StartTimeOffset: &duration.Duration{
												Seconds: 5,
											},
											EndTimeOffset: &duration.Duration{
												Seconds: 10,
											},
										},
									},
								},

								&transcoderpb.Overlay_Animation{
									AnimationType: &transcoderpb.Overlay_Animation_AnimationFade{
										AnimationFade: &transcoderpb.Overlay_AnimationFade{
											FadeType: transcoderpb.Overlay_FADE_OUT,
											Xy: &transcoderpb.Overlay_NormalizedCoordinate{
												X: 0.5,
												Y: 0.5,
											},
											StartTimeOffset: &duration.Duration{
												Seconds: 12,
											},
											EndTimeOffset: &duration.Duration{
												Seconds: 15,
											},
										},
									},
								},
							},
						},
					},
				},
			},
		},
	}
	// Creates the job. Jobs take a variable amount of time to run.
	// You can query for the job state; see getJob() in get_job.go.
	response, err := client.CreateJob(ctx, req)
	if err != nil {
		return fmt.Errorf("createJobWithAnimatedOverlay: %v", err)
	}

	fmt.Fprintf(w, "Job: %v", response.GetName())
	return nil
}

Java

Before trying this sample, follow the Java setup instructions in the Transcoder API Quickstart Using Client Libraries. For more information, see the Transcoder API Java API reference documentation.


import com.google.cloud.video.transcoder.v1beta1.AudioStream;
import com.google.cloud.video.transcoder.v1beta1.CreateJobRequest;
import com.google.cloud.video.transcoder.v1beta1.ElementaryStream;
import com.google.cloud.video.transcoder.v1beta1.Input;
import com.google.cloud.video.transcoder.v1beta1.Job;
import com.google.cloud.video.transcoder.v1beta1.JobConfig;
import com.google.cloud.video.transcoder.v1beta1.LocationName;
import com.google.cloud.video.transcoder.v1beta1.MuxStream;
import com.google.cloud.video.transcoder.v1beta1.Output;
import com.google.cloud.video.transcoder.v1beta1.Overlay;
import com.google.cloud.video.transcoder.v1beta1.Overlay.Animation;
import com.google.cloud.video.transcoder.v1beta1.Overlay.AnimationFade;
import com.google.cloud.video.transcoder.v1beta1.Overlay.FadeType;
import com.google.cloud.video.transcoder.v1beta1.Overlay.NormalizedCoordinate;
import com.google.cloud.video.transcoder.v1beta1.TranscoderServiceClient;
import com.google.cloud.video.transcoder.v1beta1.VideoStream;
import com.google.protobuf.Duration;
import java.io.IOException;

public class CreateJobWithAnimatedOverlay {

  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "my-project-id";
    String location = "us-central1";
    String inputUri = "gs://my-bucket/my-video-file";
    String overlayImageUri = "gs://my-bucket/my-overlay-image.jpg"; // Must be a JPEG
    String outputUri = "gs://my-bucket/my-output-folder/";

    createJobWithAnimatedOverlay(projectId, location, inputUri, overlayImageUri, outputUri);
  }

  // Creates a job from an ad-hoc configuration and adds an animated overlay to it.
  public static void createJobWithAnimatedOverlay(
      String projectId, String location, String inputUri, String overlayImageUri, String outputUri)
      throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests.
    try (TranscoderServiceClient transcoderServiceClient = TranscoderServiceClient.create()) {

      VideoStream videoStream0 =
          VideoStream.newBuilder()
              .setCodec("h264")
              .setBitrateBps(550000)
              .setFrameRate(60)
              .setHeightPixels(360)
              .setWidthPixels(640)
              .build();
      AudioStream audioStream0 =
          AudioStream.newBuilder().setCodec("aac").setBitrateBps(64000).build();

      // Create the overlay image. Only JPEG is supported. Image resolution is based on output
      // video resolution. This example uses the values x: 0 and y: 0 to maintain the original
      // resolution of the overlay image.
      Overlay.Image overlayImage =
          Overlay.Image.newBuilder()
              .setUri(overlayImageUri)
              .setResolution(NormalizedCoordinate.newBuilder().setX(0).setY(0).build())
              .setAlpha(1)
              .build();

      // Create the starting animation (when the overlay starts to fade in). Use the values x: 0.5
      // and y: 0.5 to position the top-left corner of the overlay in the center of the output
      // video.
      Overlay.Animation animationFadeIn =
          Animation.newBuilder()
              .setAnimationFade(
                  AnimationFade.newBuilder()
                      .setFadeType(FadeType.FADE_IN)
                      .setXy(NormalizedCoordinate.newBuilder().setX(0.5).setY(0.5).build())
                      .setStartTimeOffset(Duration.newBuilder().setSeconds(5).build())
                      .setEndTimeOffset(Duration.newBuilder().setSeconds(10).build())
                      .build())
              .build();

      // Create the ending animation (when the overlay starts to fade out). The overlay will start
      // to fade out at the 12-second mark in the output video.
      Overlay.Animation animationFadeOut =
          Animation.newBuilder()
              .setAnimationFade(
                  AnimationFade.newBuilder()
                      .setFadeType(FadeType.FADE_OUT)
                      .setXy(NormalizedCoordinate.newBuilder().setX(0.5).setY(0.5).build())
                      .setStartTimeOffset(Duration.newBuilder().setSeconds(12).build())
                      .setEndTimeOffset(Duration.newBuilder().setSeconds(15).build())
                      .build())
              .build();

      // Create the overlay and add the image and animations to it.
      Overlay overlay =
          Overlay.newBuilder()
              .setImage(overlayImage)
              .addAnimations(animationFadeIn)
              .addAnimations(animationFadeOut)
              .build();

      JobConfig config =
          JobConfig.newBuilder()
              .addInputs(Input.newBuilder().setKey("input0").setUri(inputUri))
              .setOutput(Output.newBuilder().setUri(outputUri))
              .addElementaryStreams(
                  ElementaryStream.newBuilder()
                      .setKey("video_stream0")
                      .setVideoStream(videoStream0))
              .addElementaryStreams(
                  ElementaryStream.newBuilder()
                      .setKey("audio_stream0")
                      .setAudioStream(audioStream0))
              .addMuxStreams(
                  MuxStream.newBuilder()
                      .setKey("sd")
                      .setContainer("mp4")
                      .addElementaryStreams("video_stream0")
                      .addElementaryStreams("audio_stream0")
                      .build())
              .addOverlays(overlay) // Add the overlay to the job config
              .build();

      var createJobRequest =
          CreateJobRequest.newBuilder()
              .setJob(
                  Job.newBuilder()
                      .setInputUri(inputUri)
                      .setOutputUri(outputUri)
                      .setConfig(config)
                      .build())
              .setParent(LocationName.of(projectId, location).toString())
              .build();

      // Send the job creation request and process the response.
      Job job = transcoderServiceClient.createJob(createJobRequest);
      System.out.println("Job: " + job.getName());
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the Transcoder API Quickstart Using Client Libraries. For more information, see the Transcoder API Node.js API reference documentation.

/**
 * TODO(developer): Uncomment these variables before running the sample.
 */
// projectId = 'my-project-id';
// location = 'us-central1';
// inputUri = 'gs://my-bucket/my-video-file';
// overlayImageUri = 'gs://my-bucket/my-overlay-image-file'; // Must be a JPEG
// outputUri = 'gs://my-bucket/my-output-folder/';

// Imports the Transcoder library
const {TranscoderServiceClient} = require('@google-cloud/video-transcoder');

// Instantiates a client
const transcoderServiceClient = new TranscoderServiceClient();

async function createJobFromAnimatedOverlay() {
  // Construct request
  const request = {
    parent: transcoderServiceClient.locationPath(projectId, location),
    job: {
      inputUri: inputUri,
      outputUri: outputUri,
      config: {
        elementaryStreams: [
          {
            key: 'video-stream0',
            videoStream: {
              codec: 'h264',
              heightPixels: 360,
              widthPixels: 640,
              bitrateBps: 550000,
              frameRate: 60,
            },
          },
          {
            key: 'audio-stream0',
            audioStream: {
              codec: 'aac',
              bitrateBps: 64000,
            },
          },
        ],
        muxStreams: [
          {
            key: 'sd',
            container: 'mp4',
            elementaryStreams: ['video-stream0', 'audio-stream0'],
          },
        ],
        overlays: [
          {
            image: {
              uri: overlayImageUri,
              resolution: {
                x: 0,
                y: 0,
              },
              alpha: 1.0,
            },
            animations: [
              {
                animationFade: {
                  fadeType: 'FADE_IN',
                  xy: {
                    x: 0.5,
                    y: 0.5,
                  },
                  startTimeOffset: {
                    seconds: 5,
                  },
                  endTimeOffset: {
                    seconds: 10,
                  },
                },
              },
              {
                animationFade: {
                  fadeType: 'FADE_OUT',
                  xy: {
                    x: 0.5,
                    y: 0.5,
                  },
                  startTimeOffset: {
                    seconds: 12,
                  },
                  endTimeOffset: {
                    seconds: 15,
                  },
                },
              },
            ],
          },
        ],
      },
    },
  };

  // Run request
  const [response] = await transcoderServiceClient.createJob(request);
  console.log(`Job: ${response.name}`);
}

createJobFromAnimatedOverlay();

Python

Before trying this sample, follow the Python setup instructions in the Transcoder API Quickstart Using Client Libraries. For more information, see the Transcoder API Python API reference documentation.


import argparse

from google.cloud.video import transcoder_v1beta1
from google.cloud.video.transcoder_v1beta1.services.transcoder_service import (
    TranscoderServiceClient,
)
from google.protobuf import duration_pb2 as duration


def create_job_with_animated_overlay(
    project_id, location, input_uri, overlay_image_uri, output_uri
):
    """Creates a job based on an ad-hoc job configuration that includes an animated image overlay.

    Args:
        project_id: The GCP project ID.
        location: The location to start the job in.
        input_uri: Uri of the video in the Cloud Storage bucket.
        overlay_image_uri: Uri of the JPEG image for the overlay in the Cloud Storage bucket. Must be a JPEG.
        output_uri: Uri of the video output folder in the Cloud Storage bucket."""

    client = TranscoderServiceClient()

    parent = f"projects/{project_id}/locations/{location}"
    job = transcoder_v1beta1.types.Job()
    job.input_uri = input_uri
    job.output_uri = output_uri
    job.config = transcoder_v1beta1.types.JobConfig(
        elementary_streams=[
            transcoder_v1beta1.types.ElementaryStream(
                key="video-stream0",
                video_stream=transcoder_v1beta1.types.VideoStream(
                    codec="h264",
                    height_pixels=360,
                    width_pixels=640,
                    bitrate_bps=550000,
                    frame_rate=60,
                ),
            ),
            transcoder_v1beta1.types.ElementaryStream(
                key="audio-stream0",
                audio_stream=transcoder_v1beta1.types.AudioStream(
                    codec="aac", bitrate_bps=64000
                ),
            ),
        ],
        mux_streams=[
            transcoder_v1beta1.types.MuxStream(
                key="sd",
                container="mp4",
                elementary_streams=["video-stream0", "audio-stream0"],
            ),
        ],
        overlays=[
            transcoder_v1beta1.types.Overlay(
                image=transcoder_v1beta1.types.Overlay.Image(
                    uri=overlay_image_uri,
                    resolution=transcoder_v1beta1.types.Overlay.NormalizedCoordinate(
                        x=0,
                        y=0,
                    ),
                    alpha=1,
                ),
                animations=[
                    transcoder_v1beta1.types.Overlay.Animation(
                        animation_fade=transcoder_v1beta1.types.Overlay.AnimationFade(
                            fade_type=transcoder_v1beta1.types.Overlay.FadeType.FADE_IN,
                            xy=transcoder_v1beta1.types.Overlay.NormalizedCoordinate(
                                x=0.5,
                                y=0.5,
                            ),
                            start_time_offset=duration.Duration(
                                seconds=5,
                            ),
                            end_time_offset=duration.Duration(
                                seconds=10,
                            ),
                        ),
                    ),
                    transcoder_v1beta1.types.Overlay.Animation(
                        animation_fade=transcoder_v1beta1.types.Overlay.AnimationFade(
                            fade_type=transcoder_v1beta1.types.Overlay.FadeType.FADE_OUT,
                            xy=transcoder_v1beta1.types.Overlay.NormalizedCoordinate(
                                x=0.5,
                                y=0.5,
                            ),
                            start_time_offset=duration.Duration(
                                seconds=12,
                            ),
                            end_time_offset=duration.Duration(
                                seconds=15,
                            ),
                        ),
                    ),
                ],
            ),
        ],
    )
    response = client.create_job(parent=parent, job=job)
    print(f"Job: {response.name}")
    return response

In the resulting video, the animated overlay has the following characteristics:

  1. It starts to fade in at the 5-second mark in the output video. The alpha value for the overlay starts at 0 and ends at 1.0. The top-left corner of the overlay appears at the center of the output video. The overlay appears at the original resolution of the overlay image.
  2. After it fades in, the overlay remains fully visible for 2 seconds.
  3. It starts to fade out at the 12-second mark in the output video. The alpha value for the overlay starts at 1.0 and ends at 0.
  4. The overlay disappears by the 15-second mark.

See the sample output video for this configuration. This video uses a sample overlay image.