This page explains how to insert overlays into transcoded videos. An overlay
consists of a JPEG image that is inserted on top of the output video and can
be faded in or out over a specified period of time. To insert an
overlay, use the
overlays array in the
JobConfig template.
Upload an image to Cloud Storage
To get started, do the following to upload an overlay image to your
Cloud Storage bucket:
Select a JPEG file to upload from your local machine.
Create an overlay
You can create two types of overlays: static or
animated. Both types of overlay use a static image.
You can show or hide static overlays. Animated overlays support
fade-in and fade-out animations of the image.
You can insert multiple overlays into a single output video.
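As a hedged sketch of that last point: in a REST request, the overlays field of the JobConfig is a JSON array, so a single job can carry several overlays. The snippet below models that array as Python dicts with two static overlays; the gs:// URIs and time offsets are illustrative placeholders, not values from this page.

```python
# Sketch of a JobConfig "overlays" array carrying two static overlays.
# The gs:// URIs and time offsets are illustrative placeholders.
overlays = [
    {
        "image": {"uri": "gs://my-bucket/overlay-one.jpg",
                  "resolution": {"x": 0, "y": 0}, "alpha": 1},
        "animations": [
            {"animationStatic": {"xy": {"x": 0, "y": 0}, "startTimeOffset": "0s"}},
            {"animationEnd": {"startTimeOffset": "10s"}},
        ],
    },
    {
        "image": {"uri": "gs://my-bucket/overlay-two.jpg",
                  "resolution": {"x": 0, "y": 0}, "alpha": 1},
        "animations": [
            {"animationStatic": {"xy": {"x": 0.5, "y": 0.5}, "startTimeOffset": "12s"}},
            {"animationEnd": {"startTimeOffset": "20s"}},
        ],
    },
]
print(len(overlays))  # 2
```

Each entry follows the same image-plus-animations shape used throughout this page; the overlays render independently on the output timeline.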
Create a static overlay
In the image object, use the
uri field to
specify the overlay image in Cloud Storage. In the resolution object, set
the x and y values from 0 to 1.0. A value of 0 preserves the source image's
resolution for that dimension; a value of 1.0 stretches the image to match
the dimension of the output video. For example, use the values x: 1 and y:
0.5 to stretch the overlay image to the full width and half of the height of
the output video.
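That scaling rule can be illustrated with a small, hypothetical helper (not part of the Transcoder API): a value of 0 keeps the image's own dimension, and a nonzero value scales the corresponding output-video dimension.

```python
def overlay_pixel_size(norm_x, norm_y, image_w, image_h, video_w, video_h):
    """Illustrative helper (not part of the Transcoder API): resolve the
    rendered overlay size implied by normalized resolution values.
    0 keeps the source image's dimension; a nonzero value scales the
    output video's dimension."""
    width = image_w if norm_x == 0 else round(norm_x * video_w)
    height = image_h if norm_y == 0 else round(norm_y * video_h)
    return width, height

# x: 1, y: 0.5 on a 640x360 output: full width, half height.
print(overlay_pixel_size(1, 0.5, 200, 100, 640, 360))  # (640, 180)
```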
In the animations array, create an animationStatic object with x and y
coordinates from 0 to 1.0. These coordinates are based on the resolution of
the output video. Use the values x: 0 and y: 0 to position the top-left
corner of the overlay in the top-left corner of the output video. Specify
when the overlay should appear in the output video timeline by using the
startTimeOffset field.
To remove the static animation, create an animationEnd object. Specify when
the animation should end (that is, when the overlay disappears) in the
output video timeline by using the startTimeOffset field.
LOCATION: the location where your job will run. Use
one of the supported regions:
us-central1
us-west1
us-west2
us-east1
us-east4
southamerica-east1
asia-east1
asia-south1
asia-southeast1
europe-west1
europe-west2
europe-west4
GCS_BUCKET_NAME: the name of the
Cloud Storage bucket you created.
GCS_INPUT_VIDEO: the name of the video in the
Cloud Storage bucket that you are transcoding, such as my-vid.mp4.
This field should account for any folders you created in the bucket (for example, input/my-vid.mp4).
GCS_INPUT_OVERLAY: the name of the JPEG
image in the Cloud Storage bucket that you are using for the overlay, such as
my-overlay.jpg. This field should account for any folders you created in the bucket (for example, input/my-overlay.jpg).
GCS_OUTPUT_FOLDER: the name of the
Cloud Storage folder where you want to save the encoded video outputs.
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Video.Transcoder.V1;
public class CreateJobWithStaticOverlaySample
{
public Job CreateJobWithStaticOverlay(
string projectId, string location, string inputUri, string overlayImageUri, string outputUri)
{
// Create the client.
TranscoderServiceClient client = TranscoderServiceClient.Create();
// Build the parent location name.
LocationName parent = new LocationName(projectId, location);
// Build the job config.
VideoStream videoStream0 = new VideoStream
{
H264 = new VideoStream.Types.H264CodecSettings
{
BitrateBps = 550000,
FrameRate = 60,
HeightPixels = 360,
WidthPixels = 640
}
};
AudioStream audioStream0 = new AudioStream
{
Codec = "aac",
BitrateBps = 64000
};
// Create the overlay image. Only JPEG is supported. Image resolution is based on output
// video resolution. To respect the original image aspect ratio, set either x or y to 0.0.
// This example stretches the overlay image the full width and half of the height of the
// output video.
Overlay.Types.Image overlayImage = new Overlay.Types.Image
{
Uri = overlayImageUri,
Alpha = 1,
Resolution = new Overlay.Types.NormalizedCoordinate
{
X = 1,
Y = 0.5
}
};
// Create the starting animation (when the overlay appears). Use the values x: 0 and y: 0 to
// position the top-left corner of the overlay in the top-left corner of the output video.
Overlay.Types.Animation animationStart = new Overlay.Types.Animation
{
AnimationStatic = new Overlay.Types.AnimationStatic
{
Xy = new Overlay.Types.NormalizedCoordinate
{
X = 0,
Y = 0
},
StartTimeOffset = new Google.Protobuf.WellKnownTypes.Duration
{
Seconds = 0
}
}
};
// Create the ending animation (when the overlay disappears). In this example, the overlay
// disappears at the 10-second mark in the output video.
Overlay.Types.Animation animationEnd = new Overlay.Types.Animation
{
AnimationEnd = new Overlay.Types.AnimationEnd
{
StartTimeOffset = new Google.Protobuf.WellKnownTypes.Duration
{
Seconds = 10
}
}
};
// Create the overlay and add the image and animations to it.
Overlay overlay = new Overlay
{
Image = overlayImage,
Animations = { animationStart, animationEnd }
};
ElementaryStream elementaryStream0 = new ElementaryStream
{
Key = "video_stream0",
VideoStream = videoStream0
};
ElementaryStream elementaryStream1 = new ElementaryStream
{
Key = "audio_stream0",
AudioStream = audioStream0
};
MuxStream muxStream0 = new MuxStream
{
Key = "sd",
Container = "mp4",
ElementaryStreams = { "video_stream0", "audio_stream0" }
};
Input input = new Input
{
Key = "input0",
Uri = inputUri
};
Output output = new Output
{
Uri = outputUri
};
JobConfig jobConfig = new JobConfig
{
Inputs = { input },
Output = output,
ElementaryStreams = { elementaryStream0, elementaryStream1 },
MuxStreams = { muxStream0 },
Overlays = { overlay }
};
// Build the job.
Job newJob = new Job
{
InputUri = inputUri,
OutputUri = outputUri,
Config = jobConfig
};
// Call the API.
Job job = client.CreateJob(parent, newJob);
// Return the result.
return job;
}
}
import com.google.cloud.video.transcoder.v1.AudioStream;
import com.google.cloud.video.transcoder.v1.CreateJobRequest;
import com.google.cloud.video.transcoder.v1.ElementaryStream;
import com.google.cloud.video.transcoder.v1.Input;
import com.google.cloud.video.transcoder.v1.Job;
import com.google.cloud.video.transcoder.v1.JobConfig;
import com.google.cloud.video.transcoder.v1.LocationName;
import com.google.cloud.video.transcoder.v1.MuxStream;
import com.google.cloud.video.transcoder.v1.Output;
import com.google.cloud.video.transcoder.v1.Overlay;
import com.google.cloud.video.transcoder.v1.Overlay.AnimationEnd;
import com.google.cloud.video.transcoder.v1.Overlay.AnimationStatic;
import com.google.cloud.video.transcoder.v1.Overlay.NormalizedCoordinate;
import com.google.cloud.video.transcoder.v1.TranscoderServiceClient;
import com.google.cloud.video.transcoder.v1.VideoStream;
import com.google.protobuf.Duration;
import java.io.IOException;
public class CreateJobWithStaticOverlay {
public static void main(String[] args) throws IOException {
// TODO(developer): Replace these variables before running the sample.
String projectId = "my-project-id";
String location = "us-central1";
String inputUri = "gs://my-bucket/my-video-file";
String overlayImageUri = "gs://my-bucket/my-overlay-image.jpg"; // Must be a JPEG
String outputUri = "gs://my-bucket/my-output-folder/";
createJobWithStaticOverlay(projectId, location, inputUri, overlayImageUri, outputUri);
}
// Creates a job from an ad-hoc configuration and adds a static overlay to it.
public static void createJobWithStaticOverlay(
String projectId, String location, String inputUri, String overlayImageUri, String outputUri)
throws IOException {
// Initialize client that will be used to send requests. This client only needs to be created
// once, and can be reused for multiple requests.
try (TranscoderServiceClient transcoderServiceClient = TranscoderServiceClient.create()) {
VideoStream videoStream0 =
VideoStream.newBuilder()
.setH264(
VideoStream.H264CodecSettings.newBuilder()
.setBitrateBps(550000)
.setFrameRate(60)
.setHeightPixels(360)
.setWidthPixels(640))
.build();
AudioStream audioStream0 =
AudioStream.newBuilder().setCodec("aac").setBitrateBps(64000).build();
// Create the overlay image. Only JPEG is supported. Image resolution is based on output
// video resolution. To respect the original image aspect ratio, set either x or y to 0.0.
// This example stretches the overlay image the full width and half of the height of the
// output video.
Overlay.Image overlayImage =
Overlay.Image.newBuilder()
.setUri(overlayImageUri)
.setResolution(NormalizedCoordinate.newBuilder().setX(1).setY(0.5).build())
.setAlpha(1)
.build();
// Create the starting animation (when the overlay appears). Use the values x: 0 and y: 0 to
// position the top-left corner of the overlay in the top-left corner of the output video.
Overlay.Animation animationStart =
Overlay.Animation.newBuilder()
.setAnimationStatic(
AnimationStatic.newBuilder()
.setXy(NormalizedCoordinate.newBuilder().setX(0).setY(0).build())
.setStartTimeOffset(Duration.newBuilder().setSeconds(0).build())
.build())
.build();
// Create the ending animation (when the overlay disappears). In this example, the overlay
// disappears at the 10-second mark in the output video.
Overlay.Animation animationEnd =
Overlay.Animation.newBuilder()
.setAnimationEnd(
AnimationEnd.newBuilder()
.setStartTimeOffset(Duration.newBuilder().setSeconds(10).build())
.build())
.build();
// Create the overlay and add the image and animations to it.
Overlay overlay =
Overlay.newBuilder()
.setImage(overlayImage)
.addAnimations(animationStart)
.addAnimations(animationEnd)
.build();
JobConfig config =
JobConfig.newBuilder()
.addInputs(Input.newBuilder().setKey("input0").setUri(inputUri))
.setOutput(Output.newBuilder().setUri(outputUri))
.addElementaryStreams(
ElementaryStream.newBuilder()
.setKey("video_stream0")
.setVideoStream(videoStream0))
.addElementaryStreams(
ElementaryStream.newBuilder()
.setKey("audio_stream0")
.setAudioStream(audioStream0))
.addMuxStreams(
MuxStream.newBuilder()
.setKey("sd")
.setContainer("mp4")
.addElementaryStreams("video_stream0")
.addElementaryStreams("audio_stream0")
.build())
.addOverlays(overlay) // Add the overlay to the job config
.build();
var createJobRequest =
CreateJobRequest.newBuilder()
.setJob(
Job.newBuilder()
.setInputUri(inputUri)
.setOutputUri(outputUri)
.setConfig(config)
.build())
.setParent(LocationName.of(projectId, location).toString())
.build();
// Send the job creation request and process the response.
Job job = transcoderServiceClient.createJob(createJobRequest);
System.out.println("Job: " + job.getName());
}
}
}
use Google\Cloud\Video\Transcoder\V1\AudioStream;
use Google\Cloud\Video\Transcoder\V1\ElementaryStream;
use Google\Cloud\Video\Transcoder\V1\Job;
use Google\Cloud\Video\Transcoder\V1\JobConfig;
use Google\Cloud\Video\Transcoder\V1\MuxStream;
use Google\Cloud\Video\Transcoder\V1\Overlay;
use Google\Cloud\Video\Transcoder\V1\TranscoderServiceClient;
use Google\Cloud\Video\Transcoder\V1\VideoStream;
use Google\Protobuf\Duration;
/**
* Creates a job based on a supplied job config that includes a static image overlay.
*
* @param string $projectId The ID of your Google Cloud Platform project.
* @param string $location The location of the job.
* @param string $inputUri Uri of the video in the Cloud Storage bucket.
* @param string $overlayImageUri Uri of the JPEG image for the overlay in the Cloud Storage bucket. Must be a JPEG.
* @param string $outputUri Uri of the video output folder in the Cloud Storage bucket.
*/
function create_job_with_static_overlay($projectId, $location, $inputUri, $overlayImageUri, $outputUri)
{
// Instantiate a client.
$transcoderServiceClient = new TranscoderServiceClient();
$formattedParent = $transcoderServiceClient->locationName($projectId, $location);
$jobConfig =
(new JobConfig())->setElementaryStreams([
(new ElementaryStream())
->setKey('video-stream0')
->setVideoStream(
(new VideoStream())
->setH264(
(new VideoStream\H264CodecSettings())
->setBitrateBps(550000)
->setFrameRate(60)
->setHeightPixels(360)
->setWidthPixels(640)
)
),
(new ElementaryStream())
->setKey('audio-stream0')
->setAudioStream(
(new AudioStream())
->setCodec('aac')
->setBitrateBps(64000)
)
])->setMuxStreams([
(new MuxStream())
->setKey('sd')
->setContainer('mp4')
->setElementaryStreams(['video-stream0', 'audio-stream0'])
])->setOverlays([
(new Overlay())
->setImage(
(new Overlay\Image())
->setUri($overlayImageUri)
->setResolution(
(new Overlay\NormalizedCoordinate())
->setX(1)
->setY(0.5)
)
->setAlpha(1)
)
->setAnimations([
(new Overlay\Animation())
->setAnimationStatic(
(new Overlay\AnimationStatic())
->setXy(
(new Overlay\NormalizedCoordinate())
->setY(0)
->setX(0)
)
->setStartTimeOffset(
(new Duration())
->setSeconds(0)
)
),
(new Overlay\Animation())
->setAnimationEnd(
(new Overlay\AnimationEnd())
->setStartTimeOffset(
(new Duration())
->setSeconds(10)
)
)
])
]);
$job = (new Job())
->setInputUri($inputUri)
->setOutputUri($outputUri)
->setConfig($jobConfig);
$response = $transcoderServiceClient->createJob($formattedParent, $job);
// Print job name.
printf('Job: %s' . PHP_EOL, $response->getName());
}
import argparse
from google.cloud.video import transcoder_v1
from google.cloud.video.transcoder_v1.services.transcoder_service import (
TranscoderServiceClient,
)
from google.protobuf import duration_pb2 as duration
def create_job_with_static_overlay(
project_id, location, input_uri, overlay_image_uri, output_uri
):
"""Creates a job based on an ad-hoc job configuration that includes a static image overlay.
Args:
project_id: The GCP project ID.
location: The location to start the job in.
input_uri: Uri of the video in the Cloud Storage bucket.
overlay_image_uri: Uri of the JPEG image for the overlay in the Cloud Storage bucket. Must be a JPEG.
output_uri: Uri of the video output folder in the Cloud Storage bucket."""
client = TranscoderServiceClient()
parent = f"projects/{project_id}/locations/{location}"
job = transcoder_v1.types.Job()
job.input_uri = input_uri
job.output_uri = output_uri
job.config = transcoder_v1.types.JobConfig(
elementary_streams=[
transcoder_v1.types.ElementaryStream(
key="video-stream0",
video_stream=transcoder_v1.types.VideoStream(
h264=transcoder_v1.types.VideoStream.H264CodecSettings(
height_pixels=360,
width_pixels=640,
bitrate_bps=550000,
frame_rate=60,
),
),
),
transcoder_v1.types.ElementaryStream(
key="audio-stream0",
audio_stream=transcoder_v1.types.AudioStream(
codec="aac", bitrate_bps=64000
),
),
],
mux_streams=[
transcoder_v1.types.MuxStream(
key="sd",
container="mp4",
elementary_streams=["video-stream0", "audio-stream0"],
),
],
overlays=[
transcoder_v1.types.Overlay(
image=transcoder_v1.types.Overlay.Image(
uri=overlay_image_uri,
resolution=transcoder_v1.types.Overlay.NormalizedCoordinate(
x=1,
y=0.5,
),
alpha=1,
),
animations=[
transcoder_v1.types.Overlay.Animation(
animation_static=transcoder_v1.types.Overlay.AnimationStatic(
xy=transcoder_v1.types.Overlay.NormalizedCoordinate(
x=0,
y=0,
),
start_time_offset=duration.Duration(
seconds=0,
),
),
),
transcoder_v1.types.Overlay.Animation(
animation_end=transcoder_v1.types.Overlay.AnimationEnd(
start_time_offset=duration.Duration(
seconds=10,
),
),
),
],
),
],
)
response = client.create_job(parent=parent, job=job)
print(f"Job: {response.name}")
return response
Create an animated overlay
In the image object, use the
uri field to
specify the overlay image in Cloud Storage. In the resolution object, set
the x and y values from 0 to 1.0. A value of 0 preserves the source image's
resolution for that dimension; a value of 1.0 stretches the image to match
the dimension of the output video. For example, use the values x: 0 and y: 0
to keep the original resolution of the overlay image.
In the animations array, create an animationFade object with a fadeType of
FADE_IN. Set the x and y coordinates from 0 to 1.0. These coordinates are
based on the resolution of the output video. Use the values x: 0.5 and y: 0.5 to
position the top-left corner of the overlay in the center of the output video.
Specify when the overlay should start to appear in the output video timeline
by using the startTimeOffset field. The overlay should be fully visible by the
time set in the endTimeOffset field.
To fade the overlay out, create another animationFade object. This time, set
fadeType to FADE_OUT. Enter the position coordinates and the start and end
times as before.
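The two steps above can be sketched as the animations array of a REST JobConfig overlay, modeled here as Python dicts. The specific offsets (fade in from 5s to 10s, fade out from 12s to 15s) match the samples below and are otherwise arbitrary.

```python
# "animations" array for an animated overlay, as it would appear in a
# REST JobConfig (modeled here as Python dicts; offsets are example values).
animations = [
    {
        "animationFade": {
            "fadeType": "FADE_IN",
            "xy": {"x": 0.5, "y": 0.5},  # top-left corner at video center
            "startTimeOffset": "5s",     # fade-in begins
            "endTimeOffset": "10s",      # fully visible by this time
        }
    },
    {
        "animationFade": {
            "fadeType": "FADE_OUT",
            "xy": {"x": 0.5, "y": 0.5},
            "startTimeOffset": "12s",    # fade-out begins
            "endTimeOffset": "15s",      # fully transparent by this time
        }
    },
]
print([a["animationFade"]["fadeType"] for a in animations])
```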
LOCATION: the location where your job will run. Use
one of the supported regions:
us-central1
us-west1
us-west2
us-east1
us-east4
southamerica-east1
asia-east1
asia-south1
asia-southeast1
europe-west1
europe-west2
europe-west4
GCS_BUCKET_NAME: the name of the
Cloud Storage bucket you created.
GCS_INPUT_VIDEO: the name of the video in the
Cloud Storage bucket that you are transcoding, such as my-vid.mp4.
This field should account for any folders you created in the bucket (for example, input/my-vid.mp4).
GCS_INPUT_OVERLAY: the name of the JPEG
image in the Cloud Storage bucket that you are using for the overlay, such as
my-overlay.jpg. This field should account for any folders you created in the bucket (for example, input/my-overlay.jpg).
GCS_OUTPUT_FOLDER: the name of the
Cloud Storage folder where you want to save the encoded video outputs.
import com.google.cloud.video.transcoder.v1.AudioStream;
import com.google.cloud.video.transcoder.v1.CreateJobRequest;
import com.google.cloud.video.transcoder.v1.ElementaryStream;
import com.google.cloud.video.transcoder.v1.Input;
import com.google.cloud.video.transcoder.v1.Job;
import com.google.cloud.video.transcoder.v1.JobConfig;
import com.google.cloud.video.transcoder.v1.LocationName;
import com.google.cloud.video.transcoder.v1.MuxStream;
import com.google.cloud.video.transcoder.v1.Output;
import com.google.cloud.video.transcoder.v1.Overlay;
import com.google.cloud.video.transcoder.v1.Overlay.Animation;
import com.google.cloud.video.transcoder.v1.Overlay.AnimationFade;
import com.google.cloud.video.transcoder.v1.Overlay.FadeType;
import com.google.cloud.video.transcoder.v1.Overlay.NormalizedCoordinate;
import com.google.cloud.video.transcoder.v1.TranscoderServiceClient;
import com.google.cloud.video.transcoder.v1.VideoStream;
import com.google.protobuf.Duration;
import java.io.IOException;
public class CreateJobWithAnimatedOverlay {
public static void main(String[] args) throws IOException {
// TODO(developer): Replace these variables before running the sample.
String projectId = "my-project-id";
String location = "us-central1";
String inputUri = "gs://my-bucket/my-video-file";
String overlayImageUri = "gs://my-bucket/my-overlay-image.jpg"; // Must be a JPEG
String outputUri = "gs://my-bucket/my-output-folder/";
createJobWithAnimatedOverlay(projectId, location, inputUri, overlayImageUri, outputUri);
}
// Creates a job from an ad-hoc configuration and adds an animated overlay to it.
public static void createJobWithAnimatedOverlay(
String projectId, String location, String inputUri, String overlayImageUri, String outputUri)
throws IOException {
// Initialize client that will be used to send requests. This client only needs to be created
// once, and can be reused for multiple requests.
try (TranscoderServiceClient transcoderServiceClient = TranscoderServiceClient.create()) {
VideoStream videoStream0 =
VideoStream.newBuilder()
.setH264(
VideoStream.H264CodecSettings.newBuilder()
.setBitrateBps(550000)
.setFrameRate(60)
.setHeightPixels(360)
.setWidthPixels(640))
.build();
AudioStream audioStream0 =
AudioStream.newBuilder().setCodec("aac").setBitrateBps(64000).build();
// Create the overlay image. Only JPEG is supported. Image resolution is based on output
// video resolution. This example uses the values x: 0 and y: 0 to maintain the original
// resolution of the overlay image.
Overlay.Image overlayImage =
Overlay.Image.newBuilder()
.setUri(overlayImageUri)
.setResolution(NormalizedCoordinate.newBuilder().setX(0).setY(0).build())
.setAlpha(1)
.build();
// Create the starting animation (when the overlay starts to fade in). Use the values x: 0.5
// and y: 0.5 to position the top-left corner of the overlay in the center of the
// output video.
Overlay.Animation animationFadeIn =
Animation.newBuilder()
.setAnimationFade(
AnimationFade.newBuilder()
.setFadeType(FadeType.FADE_IN)
.setXy(NormalizedCoordinate.newBuilder().setX(0.5).setY(0.5).build())
.setStartTimeOffset(Duration.newBuilder().setSeconds(5).build())
.setEndTimeOffset(Duration.newBuilder().setSeconds(10).build())
.build())
.build();
// Create the ending animation (when the overlay starts to fade out). The overlay will start
// to fade out at the 12-second mark in the output video.
Overlay.Animation animationFadeOut =
Animation.newBuilder()
.setAnimationFade(
AnimationFade.newBuilder()
.setFadeType(FadeType.FADE_OUT)
.setXy(NormalizedCoordinate.newBuilder().setX(0.5).setY(0.5).build())
.setStartTimeOffset(Duration.newBuilder().setSeconds(12).build())
.setEndTimeOffset(Duration.newBuilder().setSeconds(15).build())
.build())
.build();
// Create the overlay and add the image and animations to it.
Overlay overlay =
Overlay.newBuilder()
.setImage(overlayImage)
.addAnimations(animationFadeIn)
.addAnimations(animationFadeOut)
.build();
JobConfig config =
JobConfig.newBuilder()
.addInputs(Input.newBuilder().setKey("input0").setUri(inputUri))
.setOutput(Output.newBuilder().setUri(outputUri))
.addElementaryStreams(
ElementaryStream.newBuilder()
.setKey("video_stream0")
.setVideoStream(videoStream0))
.addElementaryStreams(
ElementaryStream.newBuilder()
.setKey("audio_stream0")
.setAudioStream(audioStream0))
.addMuxStreams(
MuxStream.newBuilder()
.setKey("sd")
.setContainer("mp4")
.addElementaryStreams("video_stream0")
.addElementaryStreams("audio_stream0")
.build())
.addOverlays(overlay) // Add the overlay to the job config
.build();
var createJobRequest =
CreateJobRequest.newBuilder()
.setJob(
Job.newBuilder()
.setInputUri(inputUri)
.setOutputUri(outputUri)
.setConfig(config)
.build())
.setParent(LocationName.of(projectId, location).toString())
.build();
// Send the job creation request and process the response.
Job job = transcoderServiceClient.createJob(createJobRequest);
System.out.println("Job: " + job.getName());
}
}
}
use Google\Cloud\Video\Transcoder\V1\AudioStream;
use Google\Cloud\Video\Transcoder\V1\ElementaryStream;
use Google\Cloud\Video\Transcoder\V1\Job;
use Google\Cloud\Video\Transcoder\V1\JobConfig;
use Google\Cloud\Video\Transcoder\V1\MuxStream;
use Google\Cloud\Video\Transcoder\V1\Overlay;
use Google\Cloud\Video\Transcoder\V1\TranscoderServiceClient;
use Google\Cloud\Video\Transcoder\V1\VideoStream;
use Google\Protobuf\Duration;
/**
* Creates a job based on a supplied job config that includes an animated overlay.
*
* @param string $projectId The ID of your Google Cloud Platform project.
* @param string $location The location of the job.
* @param string $inputUri Uri of the video in the Cloud Storage bucket.
* @param string $overlayImageUri Uri of the JPEG image for the overlay in the Cloud Storage bucket. Must be a JPEG.
* @param string $outputUri Uri of the video output folder in the Cloud Storage bucket.
*/
function create_job_with_animated_overlay($projectId, $location, $inputUri, $overlayImageUri, $outputUri)
{
// Instantiate a client.
$transcoderServiceClient = new TranscoderServiceClient();
$formattedParent = $transcoderServiceClient->locationName($projectId, $location);
$jobConfig =
(new JobConfig())->setElementaryStreams([
(new ElementaryStream())
->setKey('video-stream0')
->setVideoStream(
(new VideoStream())->setH264(
(new VideoStream\H264CodecSettings())
->setBitrateBps(550000)
->setFrameRate(60)
->setHeightPixels(360)
->setWidthPixels(640)
)
),
(new ElementaryStream())
->setKey('audio-stream0')
->setAudioStream(
(new AudioStream())
->setCodec('aac')
->setBitrateBps(64000)
)
])->setMuxStreams([
(new MuxStream())
->setKey('sd')
->setContainer('mp4')
->setElementaryStreams(['video-stream0', 'audio-stream0'])
])->setOverlays([
(new Overlay())->setImage(
(new Overlay\Image())
->setUri($overlayImageUri)
->setResolution(
(new Overlay\NormalizedCoordinate())
->setX(0)
->setY(0)
)
->setAlpha(1)
)->setAnimations([
(new Overlay\Animation())->setAnimationFade(
(new Overlay\AnimationFade())
->setFadeType(Overlay\FadeType::FADE_IN)
->setXy(
(new Overlay\NormalizedCoordinate())
->setY(0.5)
->setX(0.5)
)
->setStartTimeOffset(new Duration(['seconds' => 5]))
->setEndTimeOffset(new Duration(['seconds' => 10]))
),
(new Overlay\Animation())->setAnimationFade(
(new Overlay\AnimationFade())
->setFadeType(Overlay\FadeType::FADE_OUT)
->setXy(
(new Overlay\NormalizedCoordinate())
->setY(0.5)
->setX(0.5)
)
->setStartTimeOffset(new Duration(['seconds' => 12]))
->setEndTimeOffset(new Duration(['seconds' => 15]))
)
])
]);
$job = (new Job())
->setInputUri($inputUri)
->setOutputUri($outputUri)
->setConfig($jobConfig);
$response = $transcoderServiceClient->createJob($formattedParent, $job);
// Print job name.
printf('Job: %s' . PHP_EOL, $response->getName());
}
In the resulting video, the animated overlay has the following characteristics:
It starts to fade in at the five-second mark in the output video. The overlay's
alpha value starts at 0 and ends at 1.0. The top-left corner
of the overlay appears in the center of the output video. The overlay is shown
at the original resolution of the overlay image.
After it fades in, the overlay is displayed for two seconds.
It starts to fade out at the 12-second mark in the output video. The overlay's
alpha value starts at 1.0 and ends at 0.
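Assuming linear fades, that alpha timeline can be sketched as a small illustrative function (not part of the Transcoder API):

```python
def overlay_alpha(t):
    """Illustrative only, assuming linear fades: overlay alpha at second t
    for the example above (fade in 5s-10s, fully visible until 12s,
    fade out 12s-15s)."""
    if t < 5 or t > 15:
        return 0.0           # overlay not shown
    if t < 10:
        return (t - 5) / 5   # fading in: 0 -> 1.0
    if t <= 12:
        return 1.0           # fully visible for two seconds
    return (15 - t) / 3      # fading out: 1.0 -> 0

print(overlay_alpha(7.5))  # 0.5
```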
Last updated 2023-12-15 UTC.