How to Stream Live Videos

The Cloud Video Intelligence Streaming API supports standard live streaming protocols such as RTSP, RTMP, and HLS. The AIStreamer ingestion pipeline behaves as a streaming proxy, converting live streaming protocols into a bidirectional streaming gRPC connection.

To support live streaming protocols, Video Intelligence uses the GStreamer open media framework.

Step 1: Create a named pipe

A named pipe is used to communicate between GStreamer and the AIStreamer ingestion proxy. The two processes run inside the same Docker container.

$ export PIPE_NAME=/path_to_pipe/pipe_name
$ mkfifo $PIPE_NAME
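
As a quick sanity check (using a hypothetical /tmp path), you can verify the FIFO works by running a writer and a reader concurrently. A named pipe blocks each side until the other end is opened:

```shell
# Create a FIFO at a hypothetical path and pass one message through it.
export PIPE_NAME=/tmp/aistreamer_pipe
mkfifo $PIPE_NAME
printf 'hello' > $PIPE_NAME &   # writer blocks until a reader opens the pipe
cat $PIPE_NAME                  # reader side; prints "hello"
rm $PIPE_NAME
```

The same blocking behavior applies in production: GStreamer (the writer) and the ingestion proxy (the reader) must both be running for data to flow.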

Step 2: Run AIStreamer ingestion proxy

C++ examples are available for you to use. The examples include a single binary that supports all features. To build the examples, see the build instructions.

The following example shows how to use the binary from the command line.

$ export GOOGLE_APPLICATION_CREDENTIALS=/path_to_credential/credential_json
$ export CONFIG=/path_to_config/config_json
$ export PIPE_NAME=/path_to_pipe/pipe_name
$ export TIMEOUT=3600
$ ./streaming_client_main --alsologtostderr --endpoint "dns:///" \
      --video_path=$PIPE_NAME --use_pipe=true --config=$CONFIG --timeout=$TIMEOUT

Here, $GOOGLE_APPLICATION_CREDENTIALS specifies the file path of the JSON file that contains your service account key.

An example configuration file ($CONFIG in the previous example) is available with the C++ examples.
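
For orientation, the configuration file is a text representation of the StreamingVideoConfig message. The sketch below is illustrative only, assuming streaming label detection is the desired feature; consult the actual example file for the exact schema:

```
feature: STREAMING_LABEL_DETECTION
```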

Make sure to set the correct timeout flag on the command line. For example, if you need to stream one hour of video, the timeout value should be at least 3600 seconds.
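
For example, to budget a timeout for a two-hour stream (the two hours are an arbitrary illustration):

```shell
# Timeout in seconds: hours * 3600.
export TIMEOUT=$((2 * 3600))
echo $TIMEOUT   # prints 7200
```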

Step 3: Run the GStreamer pipeline

GStreamer supports multiple live streaming protocols, including but not limited to:

  • HTTP Live Streaming (HLS)

  • Real-time Streaming Protocol (RTSP)

  • Real-time Transport Protocol (RTP)

  • Real-time Messaging Protocol (RTMP)

  • WebRTC

  • Streaming from Webcam

Video Intelligence uses a GStreamer pipeline to convert these live streaming protocols into a decodable video stream, and writes the stream into the named pipe created in Step 1.

The following examples demonstrate how to use the live streaming library with the HLS, RTSP, and RTMP protocols.

HTTP Live Streaming (HLS)

$ export PIPE_NAME=/path_to_pipe/pipe_name
$ export HLS_SOURCE=http://abc.def/playlist.m3u8
$ gst-launch-1.0 -v souphttpsrc location=$HLS_SOURCE ! hlsdemux ! filesink location=$PIPE_NAME

Real-time Streaming Protocol (RTSP)

$ export PIPE_NAME=/path_to_pipe/pipe_name
$ export RTSP_SOURCE=rtsp://ip_addr:port/stream
$ gst-launch-1.0 -v rtspsrc location=$RTSP_SOURCE ! rtpjitterbuffer ! rtph264depay \
      ! h264parse ! mp4mux ! filesink location=$PIPE_NAME

Real-time Messaging Protocol (RTMP)

$ export PIPE_NAME=/path_to_pipe/pipe_name
$ export RTMP_SOURCE=rtmp://host/app/stream
$ gst-launch-1.0 -v rtmpsrc location=$RTMP_SOURCE ! flvdemux ! flvmux ! filesink location=$PIPE_NAME

Build instructions

The example binary is built using Bazel. A Docker example with all build dependencies configured is also provided. You can find the compiled streaming_client_main binary in the $BIN_DIR directory of the Docker image.

For more information on using Docker, see Using Docker & Kubernetes.

Flow control

The Cloud Video Intelligence Streaming API server has built-in flow control. StreamingAnnotateVideoRequest requests are rejected, and the gRPC streaming connection is stopped immediately, in either of the following cases:

  • When the AIStreamer ingestion client sends requests to Google servers too frequently

  • When the AIStreamer ingestion client sends too much data to Google servers (more than 20 MB per second)
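
If your source can burst above the bandwidth limit, one option is to throttle writes into the named pipe before they reach the proxy, for example with the pv utility's rate-limit flag (-L). This is a sketch under the assumption that pv is installed; dd stands in for a real GStreamer pipeline:

```shell
# Rate-limit data flowing toward the pipe to at most 20 MB/s using pv -L.
# dd generates 1 MB of dummy data in place of a real GStreamer source.
dd if=/dev/zero bs=1M count=1 2>/dev/null | pv -q -L 20m > /tmp/throttled.bin
wc -c < /tmp/throttled.bin   # all 1048576 bytes arrive, at a bounded rate
rm /tmp/throttled.bin
```

In a real deployment you would place the throttle between the GStreamer pipeline's output and $PIPE_NAME, keeping the pipeline itself unchanged.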
