Big Hero 6 (BH6) is a story about a boy, Hiro, and his super robot, Baymax, built by his older brother, Tadashi. Baymax is a very big robot (Fig. 1), but that is not this story.
This story is about a boy and his supercomputer. The boy in this case is Andy Hendrickson, Chief Technology Officer at Walt Disney Animation Studios. He oversaw the creation of BH6, working with directors Don Hall and Chris Williams, both Disney Animation veterans.
Today we look at the system and methodology used to turn BH6 from the directors' vision into virtual reality.
Walt Disney Animation Studios' render farm is actually a cloud spread across four sites (Fig. 2). With more than 55,000 Intel cores, it would rank around 75th among the world's supercomputers. There is some custom acceleration support, but GPUs and FPGAs are something for the future. The system has 400 Tbytes of memory and draws 1.5 MW of power, which is actually modest for a system of this size.
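Those headline numbers can be sanity-checked with a little back-of-envelope arithmetic. The totals below come straight from the article; the per-core breakdowns are simply derived from them:

```python
# Back-of-envelope breakdown of the render farm's published specs.
cores = 55_000            # Intel cores spread across four sites
power_w = 1.5e6           # 1.5 MW total draw
memory_bytes = 400e12     # 400 Tbytes of memory

watts_per_core = power_w / cores          # ~27 W per core
gb_per_core = memory_bytes / cores / 1e9  # ~7.3 GB per core

print(f"{watts_per_core:.1f} W/core, {gb_per_core:.1f} GB/core")
```

Roughly 27 W and 7 GB per core, which is why 1.5 MW counts as modest for a farm this size.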
The system is built around 1U COTS servers connected by 10 Gigabit Ethernet. All non-volatile storage is solid-state drives (SSDs); the only moving parts in the system are the cooling fans.
The Disney archives currently hold 4 Pbytes; the average movie consumes about 4 Tbytes of information.
Disney's in-house CODA job-distribution system makes the farm appear as a single virtual system. It handles a range of chores, from rendering to asset management. The system typically performs 1 million render hours per day, which equates to about 400 jobs per day. It is so automated that there is no overnight staff.
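CODA's internals aren't public, so the following is only a toy sketch of the general farm-scheduling idea: a priority-ordered queue of render jobs drained by whichever worker is least loaded. All of the names and numbers here are hypothetical:

```python
import heapq

def distribute(jobs, num_workers):
    """Greedily assign (priority, hours, name) jobs to the least-loaded worker.

    Lower priority number = more urgent. A real system like CODA also
    handles dependencies, retries, and asset management; this toy only
    balances load across workers.
    """
    # Min-heap of (accumulated_hours, worker_id)
    workers = [(0.0, w) for w in range(num_workers)]
    heapq.heapify(workers)
    assignment = {w: [] for w in range(num_workers)}
    for priority, hours, name in sorted(jobs):
        load, w = heapq.heappop(workers)
        assignment[w].append(name)
        heapq.heappush(workers, (load + hours, w))
    return assignment

# Hypothetical render jobs: (priority, estimated hours, shot name)
jobs = [(0, 3.0, "shot_012"), (1, 1.5, "shot_007"), (0, 2.0, "shot_020")]
print(distribute(jobs, 2))
```

The greedy least-loaded heuristic keeps every core busy, which matters when you're burning a million render hours a day.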
Of course, there is an app for that, letting everyone check job status from an iPhone or Android phone.
A director can interact with the system through dPIX, an internally developed asset-browsing system. It can show all the renderings done in the last 24 hours, supporting progressive creative refinement, and it works on a PC or in a darkened theater.
The big change with BH6 is the use of Disney's new rendering system, called Hyperion. Hyperion brings a new lighting pipeline to Disney, one that supports global illumination. Artists can set up a scene much as they would a physical environment, placing light sources and objects and selecting the materials that cover those objects.
Ray tracing tracks the light from the source. That is typical for rendering animated scenes, but Hyperion adds an energy-conserving transport approach that addresses subsurface scattering. This allows effects like the glow of light through an ear or through a translucent object.
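The physical idea behind "energy-conserving transport" is simple: at each bounce, the path's throughput is multiplied by a surface reflectance that never exceeds one, so light can only be attenuated, never amplified. A minimal sketch of that bookkeeping (the albedo values are made up for illustration, and this is not Hyperion's actual code):

```python
import random

def trace_throughput(albedos, max_bounces=8, rr_start=3):
    """Follow one light path through a list of per-bounce surface albedos.

    Energy conservation: every albedo is in [0, 1], so the throughput
    is monotonically non-increasing. Russian roulette terminates dim
    paths early without biasing the average result.
    """
    throughput = 1.0
    for bounce, albedo in enumerate(albedos[:max_bounces]):
        assert 0.0 <= albedo <= 1.0, "non-physical material"
        throughput *= albedo
        if bounce >= rr_start:  # Russian roulette on dim paths
            p_survive = max(throughput, 0.05)
            if random.random() > p_survive:
                return 0.0
            throughput /= p_survive  # compensate to keep the estimator unbiased
    return throughput

random.seed(1)
print(trace_throughput([0.8, 0.6, 0.9]))  # 0.8 * 0.6 * 0.9 = 0.432
```

Subsurface scattering extends the same accounting below the surface, which is what makes a backlit ear glow instead of going black.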
The approach is more complex, but it allows Hyperion to employ a single shader built on a bidirectional scattering distribution function (BSDF). Hyperion also uses a streaming geometry rendering system, all of which allows more complex scenes to be rendered more quickly. The combination supports complexity an order of magnitude beyond what Disney previously had available.
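A "single shader" built on a BSDF means one evaluation interface covers every material, with only the parameters changing, instead of a zoo of ad hoc per-material shaders. A hypothetical minimal version using the simplest BSDF, a Lambertian diffuse reflector:

```python
import math

class LambertianBSDF:
    """Simplest physically based BSDF: uniform diffuse reflection.

    Dividing the albedo by pi keeps the lobe energy-conserving when
    integrated over the cosine-weighted hemisphere.
    """
    def __init__(self, albedo):
        self.albedo = albedo

    def eval(self, cos_theta_in, cos_theta_out):
        if cos_theta_in <= 0 or cos_theta_out <= 0:
            return 0.0  # light must arrive and leave above the surface
        return self.albedo / math.pi

def shade(bsdf, light_radiance, cos_theta_in, cos_theta_out):
    """One 'single shader' used for every material: only the BSDF varies."""
    return light_radiance * bsdf.eval(cos_theta_in, cos_theta_out) * cos_theta_in

# Swap in a different BSDF (glossy, subsurface, ...) and shade() is unchanged.
matte = LambertianBSDF(albedo=0.5)
print(shade(matte, light_radiance=2.0, cos_theta_in=1.0, cos_theta_out=1.0))
```

The design payoff is that adding a new material means writing a new BSDF, not a new shading pipeline.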
Disney animators were able to build an extremely detailed version of San Fransokyo (a mashup of San Francisco and Tokyo), where BH6 is set. The city was actually based on county assessor maps of San Francisco. This was an artistic challenge because the film is three times as complex as any previous Disney film.
Hyperion took two years to build and is now in production use on films like BH6. Right now it renders 2K images, but with high dynamic range. Market demand for 4K isn't there yet; when it arrives, moving up is just a matter of scaling.
Scalable quality is another thing Hyperion does well: dial down the quality and the rendering time drops with it. This is useful when a director needs to see what a scene will look like but final quality is not required.
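In a Monte Carlo renderer, "dialing down quality" usually means taking fewer samples per pixel: render time scales linearly with sample count, while noise only falls off as 1/sqrt(N). A toy illustration of that tradeoff, using a Monte Carlo estimate of pi as a stand-in for a pixel (this is an analogy, not Hyperion's sampler):

```python
import random

def noisy_estimate(samples):
    """Monte Carlo estimate of pi by dart-throwing at a quarter circle.

    Error shrinks like 1/sqrt(samples) while cost grows linearly with
    samples -- the same tradeoff a renderer makes when a director asks
    for a quick preview instead of a final-quality frame.
    """
    hits = sum(random.random()**2 + random.random()**2 <= 1.0
               for _ in range(samples))
    return 4.0 * hits / samples

random.seed(0)
preview = noisy_estimate(100)      # fast but noisy
final = noisy_estimate(100_000)    # 1000x the cost, ~30x less noise
print(preview, final)
```

That asymmetry is exactly why a quick-look mode is cheap: halving the noise costs four times the render hours, so skipping the last bit of polish saves most of the budget.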