
Disney Supercomputer Renders Big Hero 6

Sept. 5, 2014
Looking for a supercomputer? Disney’s using one to create its latest animated movie, Big Hero 6.
Fig. 1. Baymax and Hiro are the stars in Disney Animation's Big Hero 6.

Big Hero 6 (BH6) is a story about a boy, Hiro, and his super robot, Baymax, built by his older brother, Tadashi. It is a very big robot (Fig. 1). But that is not this story.

This story is about a boy and his super computer. The boy in this case is Andy Hendrickson, Chief Technology Officer at Walt Disney Animation Studios. He oversaw the creation of BH6, working with directors Don Hall and Chris Williams, both Disney Animation veterans.

Today we look at the system and methodology used to turn BH6 from the directors' vision into virtual reality.

Rendering BH6

Walt Disney Animation Studios' render farm is actually a cloud spread across four sites (Fig. 2). With over 55,000 Intel cores, it would rank around 75th among the world's supercomputers. There is some custom acceleration support, but GPUs and FPGAs are something for the future. The system has 400 Tbytes of memory and sucks down 1.5 MW of power, which is actually a modest amount for a system of this size.
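To see why 1.5 MW is modest for a farm this size, here is a quick back-of-the-envelope sketch in Python; the totals are the article's figures, and the per-core numbers simply follow from them.

```python
# Back-of-the-envelope math: totals are the figures quoted above,
# per-core numbers are derived from them.
cores = 55_000        # Intel cores across the four sites
memory_tbytes = 400   # total memory
power_mw = 1.5        # total power draw

watts_per_core = power_mw * 1_000_000 / cores     # ~27 W per core
gbytes_per_core = memory_tbytes * 1_000 / cores   # ~7.3 Gbytes per core

print(f"{watts_per_core:.0f} W and {gbytes_per_core:.1f} Gbytes per core")
```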

Fig. 2. Walt Disney Animation Studios' supercomputer is among the 75 largest in the world, with over 55,000 cores.

The system is built around 1U COTS servers linked via 10 Gigabit Ethernet. All non-volatile storage is solid-state disks (SSDs), so the only things moving in the system are the cooling fans.

The Disney archives are currently 4 Pbytes. The average movie consumes about 4 Tbytes of information.


Disney's in-house CODA job-distribution software makes the render farm appear as a single virtual system. It handles a range of chores from rendering to asset management. The system typically performs 1 million render hours per day, equating to about 400 jobs per day, and it is so automated that there is no overnight staff.
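Those throughput figures imply a lot of work per job. A rough sketch, assuming the quoted render hours are core-hours (the article doesn't say explicitly):

```python
# Rough throughput implications of the figures quoted above, assuming
# "render hours" means core-hours (the article does not say explicitly).
render_hours_per_day = 1_000_000
jobs_per_day = 400
cores = 55_000

core_hours_per_job = render_hours_per_day / jobs_per_day   # ~2,500 core-hours
avg_busy_cores = render_hours_per_day / 24                 # ~41,700 cores
utilization = avg_busy_cores / cores                       # ~76% of the farm

print(f"~{core_hours_per_job:,.0f} core-hours per job, "
      f"~{utilization:.0%} average utilization")
```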

Of course, there is an app for it, letting everyone check job status from an iPhone or Android phone.

A director can interact with the system using dPIX, an internally created asset-browsing system. It can show all the renderings done in the last 24 hours, allowing progressive creative refinement, and it can be used on a PC or in a darkened theater.

Lighting BH6

The big change with BH6 is the use of Disney's new rendering system, Hyperion. Hyperion brings a new lighting approach to Disney that supports global illumination. Directors can set up a scene much like a physical environment, placing light sources and objects and assigning the materials that cover those objects.
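Hyperion's actual scene format isn't public, but a simplified, hypothetical sketch of the kind of setup described above might look like this; the file names and material fields are invented purely for illustration.

```python
# A hypothetical, simplified scene description in the spirit of the workflow
# described above: light sources, objects, and assigned materials.
# Hyperion's real scene format is not public; every name here is invented.
scene = {
    "lights": [
        {"type": "area", "position": (0.0, 5.0, 0.0), "watts": 150.0},
        {"type": "sun",  "direction": (-0.3, -1.0, 0.2)},
    ],
    "objects": [
        {"mesh": "baymax_body.obj", "material": "soft_vinyl"},
        {"mesh": "hiro_head.obj",   "material": "skin"},
    ],
    "materials": {
        "soft_vinyl": {"base_color": (0.95, 0.95, 0.97), "roughness": 0.4},
        # Subsurface parameters are what let light glow through thin features.
        "skin": {"base_color": (0.90, 0.70, 0.60), "subsurface_radius_mm": 3.0},
    },
}
```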

Ray tracing tracks the light from the source. This is typical for rendering animated scenes, but Hyperion uses an energy-conserving light-transport approach that addresses subsurface scattering. This allows effects like the glow of light through an ear or through a translucent object.

The approach is more complex, but it allows Hyperion to employ a single shader based on a bidirectional scattering distribution function (BSDF). Hyperion also uses a streaming geometry rendering system, all of which allows more complex scenes to be rendered more quickly. The combination supports complexity an order of magnitude beyond what Disney previously had available.
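Hyperion's shader itself is proprietary, but a generic, textbook-style sketch shows what a single energy-conserving diffuse BSDF looks like and why it keeps the light carried along a path bounded:

```python
import math
import random

# Textbook-style sketch, not Hyperion's shader: a single diffuse (Lambertian)
# BSDF and the energy-conserving throughput rule a path tracer applies at
# each bounce.

def sample_lambertian(albedo):
    """Return (direction, weight) for a cosine-weighted diffuse bounce."""
    u1, u2 = random.random(), random.random()
    r, phi = math.sqrt(u1), 2.0 * math.pi * u2
    direction = (r * math.cos(phi), r * math.sin(phi), math.sqrt(1.0 - u1))
    # With cosine-weighted sampling, BSDF * cos(theta) / pdf reduces to the
    # albedo. Energy conservation means albedo <= 1: a surface never reflects
    # more light than it receives, so path throughput can only shrink.
    return direction, albedo

throughput = 1.0
for bounce in range(4):
    _, weight = sample_lambertian(albedo=0.8)
    throughput *= weight   # 0.8, 0.64, 0.512, 0.410 ...
print(f"throughput after 4 bounces: {throughput:.3f}")
```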

Disney animators were able to build an extremely detailed version of San Fransokyo (a combination of San Francisco and Tokyo), where BH6 is set. The city was actually based on county assessor maps of San Francisco. This was an artistic challenge: the setting is three times as complex as any previous Disney film.

Hyperion took two years to build and is now in production use on films like BH6. Right now it is used to render 2K images, but with high dynamic range. The market demand for 4K images isn't there yet, and moving up is just a matter of scaling.

Scalable quality is another thing Hyperion does well: dial down the quality and the rendering time drops as well. This is useful when a director needs to see what a scene will look like but final quality is not required.
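As a rough illustration of that tradeoff, with sample counts that are purely hypothetical:

```python
import math

# Illustrative numbers only: in a Monte Carlo renderer, time scales roughly
# linearly with samples per pixel while noise falls as 1/sqrt(samples).
final_spp = 1024    # hypothetical final-frame sample count
preview_spp = 64    # hypothetical quick-look sample count

time_fraction = preview_spp / final_spp             # ~6% of the render time
noise_factor = math.sqrt(final_spp / preview_spp)   # ~4x the noise

print(f"preview takes ~{time_fraction:.0%} of the time, "
      f"with ~{noise_factor:.0f}x the noise")
```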

