Inside Google Cloud

Robot dance party: How we created an entire animated short at Next ‘18

August 14, 2018
https://storage.googleapis.com/gweb-cloudblog-publish/images/robot_dance_party.max-2600x2600.png
Laith Massarweh

Media, Entertainment & Gaming Marketing Lead

Adrian Graham

Senior Cloud Solutions Architect

As content creators know all too well, creating Hollywood-caliber visual effects and animated features is no easy task. And with an increasingly global talent community, creators need access to powerful compute resources and specialized hardware so they can collaborate seamlessly and produce the awe-inspiring movie magic that we all know and love.

In June we announced a variety of solutions aimed at helping content creators do exactly that. A few weeks later, at Google Cloud Next, we challenged ourselves to do something bigger and more unique than a product demo. To demonstrate what’s possible, we built an animated short over the course of three days.

To do it, we invited some like-minded artists who share our vision to set up a live cloud-based animation studio on the second floor of Moscone Center. These artists worked throughout the three days of the show to model, animate, and render the spot, and deliver a finished short.

We called it Robot Dance Party.  

Here’s how it came together.

Setting up the studio

With our director and team of animators in place, our first task was to figure out how to run their workstations in the cloud. To showcase the artists' workflows at the expo, we connected all workstation displays to an overhead screen, controlled by the director, and switchable to any of the five connected monitors.

https://storage.googleapis.com/gweb-cloudblog-publish/images/Next18_studio.max-1800x1800.png

Each artist connected to their virtual workstation through a Zero Client, a hardware endpoint designed to optimize the experience with a purpose-built processor to perform image decompression and decoding, with support for up to four monitors.

https://storage.googleapis.com/gweb-cloudblog-publish/images/NextCloud_Studio_architecture.max-1700x1700.png

Introducing vCraig

Although three artists worked from the Moscone Center in San Francisco, we also wanted to demonstrate how easily Google Cloud enables remote collaboration. One of our artists, Craig, worked from his home in Los Angeles—we called him “vCraig,” short for Virtual Craig. Using Google Meet, a camera, microphone, and speakers, anyone could walk up to vCraig's desk and collaborate with him throughout the show.

https://storage.googleapis.com/gweb-cloudblog-publish/images/vCraig_Pzj6jUp.max-1800x1800.png
vCraig in action

vCraig worked on the same workstation configuration as our other artists, writing to the same shared storage, so his finished animation was available immediately—no data synchronization required. Any number of artists working from any location could be added in this way, collaborating with the team in the same fashion.

Building a virtual workstation

Each artist was assigned a virtual workstation—a new solution we announced at Next ‘18—consisting of:

  • A virtual machine running on GCP.

  • An NVIDIA GPU.

  • An NVIDIA graphics driver, licensed by our NVIDIA GRID License Server.

https://storage.googleapis.com/gweb-cloudblog-publish/images/Cloud_Studio_architecture.max-1700x1700.png
The Cloud Studio architecture

Delivering the desktop

There are many remote desktop protocols available that deliver a virtual workstation to an end user, but we used Teradici PCoIP, which is widely used in the media and entertainment industry. PCoIP can deliver color-accurate, uncompressed graphics, making it particularly valuable for visual effects and animation, which require both multiple high-resolution displays and color accuracy across different users and display types.

Orchestrating the workstations

While it’s possible to deploy a single workstation by hand—installing software, licenses, and libraries yourself—many creators need to deploy and manage hundreds of virtual workstations and assign them to hundreds of users.

For this, you need to implement an orchestration layer, also known as a connection broker. Production managers use a connection broker to deploy, assign, and manage resources across an organization. From a single dashboard, you can control who gets what kind of workstation, and when and how it can be accessed.

Connection brokers also provide Multi-Factor Authentication (MFA), and can route users to resources closest to their physical location to reduce latency.
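To illustrate the routing idea, here is a minimal sketch of how a broker might pick the closest resource for a user. The region names and latency figures are hypothetical, and real brokers use far more sophisticated health and entitlement checks; this only shows the core "lowest round-trip time wins" selection.

```python
# Hypothetical sketch: route a user to the region with the lowest
# measured round-trip time. Region names and latencies are made up.
measured_latency_ms = {
    "us-west1": 18,
    "us-central1": 42,
    "europe-west1": 145,
}

# Pick the region whose measured latency is smallest.
best_region = min(measured_latency_ms, key=measured_latency_ms.get)
print(best_region)  # us-west1
```

In practice the broker would refresh these measurements continuously and fall back to the next-closest region if the preferred one has no free workstations.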

For the Cloud Studio, BeBop Technology provided a fully managed workstation deployment to all artists, building out and monitoring the infrastructure so the artists could get to work within a few minutes of the initial project kick-off meeting.

Storing the data

Artists created assets throughout the production pipeline: models were built, textures were painted, characters animated, and the final scenes rendered as 4K images. Because all of this data needed to be shared among the artists, they required high-capacity, high-speed storage accessible from every workstation. We used a premium Cloud Filestore instance, mounted as an NFS drive on each workstation.
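The "no synchronization required" property follows from every workstation mounting the same share: a file written by one artist is immediately visible to the others. The sketch below simulates this with a temporary directory standing in for the NFS mount point; the file name and mount path are illustrative, not from the production.

```python
# Illustrative sketch: two workstations share one NFS-mounted path, so a
# frame saved by one artist is instantly visible to another. A temporary
# directory stands in for the real mount point (e.g. /mnt/filestore).
import tempfile
from pathlib import Path

shared_mount = Path(tempfile.mkdtemp())  # stand-in for the Filestore mount

# Artist A saves a rendered frame to the shared storage.
(shared_mount / "shot01_frame0001.exr").write_bytes(b"fake-exr-data")

# Artist B, on another workstation with the same mount, lists the directory
# and sees the new frame with no copy or sync step.
print(sorted(p.name for p in shared_mount.iterdir()))
# ['shot01_frame0001.exr']
```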

Rendering the animation

While the design, modeling, animation, and look development of the piece were completed on virtual workstations, rendering all 1,455 final frames required a render farm solution. We used Zync Render, a Renderfarm-as-a-Service running on GCP that can be deployed in minutes, and works with major 3D applications and renderers. The final piece was rendered in V-Ray for Maya, which is used widely in the visual effects, animation, and architectural visualization industries.

Render times for the 13 shots ranged from two to 30 minutes each, with multiple iterations of each shot needed to achieve the final look of the piece. In total, nearly 1,500 render hours were needed to generate the final product. Zync is able to deploy up to 500 render workers per project, up to a total of 48,000 vCPUs. This means that artists were able to assign a single render worker to each frame submitted, getting rendered shots back in the time it takes to render a single frame.
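The speedup from one-worker-per-frame can be made concrete with a little arithmetic: rendered sequentially, a shot takes the sum of its frames' render times; with a worker per frame, wall-clock time is bounded by the slowest frame. The per-frame times below are hypothetical, not the production's actual numbers.

```python
# Illustrative sketch: why assigning one render worker per frame means a
# shot finishes in roughly the time of its single slowest frame.
# These per-frame render times (in minutes) are hypothetical.
frame_minutes = [2, 5, 30, 12, 8]

sequential_minutes = sum(frame_minutes)  # one worker, frames back to back
parallel_minutes = max(frame_minutes)    # one worker per frame, all at once

print(sequential_minutes)  # 57
print(parallel_minutes)    # 30
```

With render times of 2 to 30 minutes per shot, this is what let the team get whole shots back in the time it takes to render a single frame.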

Bringing our vision to life

To prepare for the three days of the show, our artists collaborated to create moodboards, break down each shot, create an animatic, and do initial character design.

In collaboration with our Google Cloud brand team, our director homed in on an engaging concept that featured robots doing a funky dance.

https://storage.googleapis.com/gweb-cloudblog-publish/images/RobotDanceParty.max-800x800.png

We named our two dancers Fred and Ginger, after Fred Astaire and Ginger Rogers (they are dancing, and this is show business, after all). The four-limbed fellow orchestrating the dance party is named DJ Half-K—although this name has a nerdier origin story. You might already know that Kubernetes, our open source container orchestration system, is often abbreviated as K8s. DJ Half-K “orchestrates” the dance party, but only has four arms, hence…DJ Half-K. (We warned you it was nerdy.)

Without further ado, we’d like to show you the final cut of Fred, Ginger, and DJ Half-K doing their thing!

https://storage.googleapis.com/gweb-cloudblog-publish/images/thumbnail_for_youtube.max-1300x1300.jpg

To learn more about how Google Cloud can help you achieve your creative vision, stop by our booth at SIGGRAPH this week, or visit our website.
