Interview With The Producer Of Tower Of The Dragon

Feb. 25, 2013
Turning Tower of the Dragon into a full-length feature film could cost a lot less if producer Tracy McSheery has his way. He talks with Technology Editor Bill Wong about how an array of new technologies is making this possible.

Tower of the Dragon is a 3D animated movie (Fig. 1) about a young girl, Lyric, who has to save the world and learns that making friends and being a friend is the most powerful magic of all. The Tower of the Dragon Kickstarter project has a $50,000 target. This will allow creation of the first previsualization pass. It is the first step to making a complete motion picture. You can support the project via Kickstarter.

Figure 1. The Tower of the Dragon can be turned into a full-length motion picture using image capture and special effects.

This production isn't exactly scrimping, with eight songs, hundreds of characters, and a hundred sets, yet it has a budget of under two million dollars. It also takes a novel, distributed approach to creating the movie, one designed to keep the budget well below what conventional moviemaking methods would require.

Bill Wong sits down with the producer, Tracy McSheery, to find out how magic and technology come together in this compelling animated movie. Tracy works with PhaseSpace, which has motion-capture technology (see “Technologies Combine For Low-Cost Motion Capture VFX”) that will be used in the project.

Wong: How is this different from all the other animations out there?

McSheery: It isn't, really. The goal is to tell a story that deals with issues my three kids dealt with and that I dealt with 50 years earlier. There are lots of good stories out there, and we hope we can make them easier to bring to the silver screen. What is different is that we are writing the story around the advantages and limitations of the tools we have, rather than trying to do everything the director wants and sacrificing the budget or the look. It's said that the first thing you learn as a director is how to compromise. The only thing we aren't willing to be flexible on is making the story make sense and be something that you want to watch with your kids a dozen times.

Wong: What role does technology play in creating the animation?

McSheery: We are outsiders working with Hollywood insiders. This is a great partnership because, with their hundreds of years of combined experience, they can tell us what worked and what didn't, and we can use our relationships with Autodesk, HP, Intel, Nvidia, and PhaseSpace to figure out ways to reproduce the successes at an affordable price and, hopefully, avoid the failures, or at least make new and original mistakes at a lower cost. HP has provided some really powerful workstations, and we are doing tests with the characters and sets to refine them before production really starts. Autodesk and Nvidia are giving us access to their upcoming products, and instead of compromising on the look of the characters, we can design them so that they render quickly and look good. We are using the Beam device from Suitable Technologies to let the director, producers, and other creative members visit us by 'beaming in' instead of traveling here for meetings, which destroys productivity. Finally, we'll be rendering at 120 frames per second for excellent image quality, especially in 3D.

Wong: Tell me more about 120 frames per second. Why is that important, and if it is, why hasn't it been done before?

McSheery: Frame rate is just as important as resolution when things are moving. Doug Trumbull came up with the idea of Showscan 30 years ago, working with Richard Yuricich, but back then it used four times as much film, so it was expensive. I heard about it before I met Richard and realized that the digital cinema projectors from Christie and others were capable of 120 frames per second or higher, and that it doesn't cost any more beyond the extra rendering time and a few more terabytes of hard disk space. I had worked on 3D shutter glasses for the Amiga and 3DO machines, which was how I was invited to do a demo for James Cameron and met Richard. The good news was that they selected my technology. The bad news: I turned them down when they explained that, for sanitary reasons, they needed to wash or clean the glasses daily. I did get a Siggraph party invite from James Cameron and spent the evening chatting, but he hasn't invited me back since. For animated movies, the experience is sharper when moving objects don't have the stuttering or strobing effect that you see at 24 and 30 frames per second. While 48 is nice, and 96 better, we'll be pushing the limits of the digital projectors so that the flying dragons will really appear smooth.
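McSheery's "a few more terabytes" can be sanity-checked with back-of-the-envelope arithmetic. The sketch below uses illustrative assumptions, not figures from this production: a 90-minute feature and roughly 12 MB per rendered 2K frame.

```python
# Back-of-the-envelope storage estimate for higher frame rates.
# Assumed (hypothetical) parameters: 90-minute runtime, ~12 MB per frame.
RUNTIME_MIN = 90
FRAME_MB = 12

def storage_tb(fps, runtime_min=RUNTIME_MIN, frame_mb=FRAME_MB):
    """Return (total frames rendered, approximate storage in TB) at a given rate."""
    frames = fps * runtime_min * 60
    return frames, frames * frame_mb / 1_000_000  # MB -> TB (decimal)

for fps in (24, 48, 96, 120):
    frames, tb = storage_tb(fps)
    print(f"{fps:3d} fps: {frames:,} frames, ~{tb:.1f} TB")
```

Under these assumptions, going from 24 fps to 120 fps grows storage from roughly 1.6 TB to about 7.8 TB, which is consistent with the "few more terabytes" characterization; rendering time scales by the same factor of five.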

Wong: What's Autodesk's role in this production?

McSheery: They've given us access to MotionBuilder 2014, along with the latest versions of Maya, so we can work with actors and try out the story in real time. Using MotionBuilder, we can import digital sets and add the CG characters, and then the actors can get to know their character by seeing how it moves with them, making adjustments to get the effect they want. Then the same actor can become a second character, and with a few changes in the way they walk, move, or gesture, they can become dozens of characters in a single day. The cinematographer can work live, or go back later in MotionBuilder, to decide on the virtual lens and the framing of the shot; that information is then captured and sent to Maya for rendering at a higher resolution.

Wong: What's Nvidia's role, and why aren't others using GPUs to make movies?

McSheery: They are; they just don't always advertise it. We aren't using any tool that isn't commercially available, which is part of the secret sauce of our production. Let the games industry or the CAD industry pay for the expensive development, and craft the movie around the state-of-the-art tools that are available. We are experimenting with Nvidia's Quadro K5000 and some real-time rendering tools to take the characters and objects and render them in real time, or at just minutes per frame at full resolution later on.

Wong: How are you using the Suitable Technologies Beam (Fig. 2) to make a movie?

McSheery: The director lives in Massachusetts, the choreographer in New York, the composer in LA, and the creative talent is all over the planet. The Beam allows them to visit and interact with us without having to travel here or bring everything to the conference room for productivity-killing mass meetings. We plan to use its beam-forming, noise-canceling microphone array to record some of the dialogue, using the screen like a teleprompter, to save on studio recording. We expect visiting directors who want to use our technology to make their own live-action or animated movies to 'beam in' and work with us to understand how they can take advantage of these cost savings.

Figure 2. Suitable Technologies' Beam will provide telepresence that can allow a director or choreographer to work remotely.

Wong: So this isn't about outsiders making movies competing with Hollywood?

McSheery: No, it's about productivity tools, vetted by proving they work in our own feature. If Hollywood can't use better tools to compete with offshore pricing, the industry will simply move away. In fact, if we do this right, they will be making more movies at a lower price and with less risk, which is healthy for all of us who weren't getting eight-figure paychecks anyway.

Wong: How does motion capture fit into this?

McSheery: PhaseSpace can capture fingers and faces at 960 frames per second, with real-time data clean enough to train the actors in how to make the characters come alive and then let them interact with each other (Fig. 3). This allows a half-dozen actors to do the main characters for the entire movie in days. They can then go back and do the background characters. We hope to have homages to Charlie Chaplin, Buster Keaton, the Marx Brothers, the Three Stooges, and others hidden in the background, because instead of being complicated, it's about the actors' ability to make us believe they are Charlie Chaplin, not about costs or budgets.

Figure 3. PhaseSpace uses LED emitters that flash in a unique sequence, allowing each marker to be recognized easily in the recorded video. This makes motion capture more automatic and significantly less expensive than conventional techniques.
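The idea behind the flashing markers can be illustrated with a short sketch: if each active LED blinks a unique binary ID over successive camera frames, matching the observed on/off pattern identifies the marker automatically. The 4-bit IDs, marker names, and frame window below are hypothetical, for illustration only, and do not represent PhaseSpace's actual protocol.

```python
# Illustrative decoding of active-LED marker IDs from per-frame observations.
# IDs and names are made up for this sketch, not PhaseSpace's real scheme.
MARKER_IDS = {0b1010: "left_hand", 0b0110: "right_hand", 0b1101: "head"}

def decode_marker(observations):
    """Fold a list of per-frame on/off readings into a binary code and look it up."""
    code = 0
    for lit in observations:
        code = (code << 1) | (1 if lit else 0)
    return MARKER_IDS.get(code, "unknown")

print(decode_marker([True, False, True, False]))  # -> left_hand
print(decode_marker([True, True, False, True]))   # -> head
```

Because each marker identifies itself this way, the capture software never has to guess which dot is which between frames, which is what makes the pipeline more automatic than marker systems that rely on manual labeling or geometric inference.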

Wong: What do you see as the benefit to the entertainment industry if you succeed?

McSheery: I hope you'll see better movies, because studios can afford to do the previsualizations and iterate them a few times before they even hire the big-name talent. That's not to say you won't let the star improvise and bring new life to the characters, but at least everyone will know what the movie they are making is going to be about, and that the script translates from a printed idea into a vision.

Wong: What's the next movie if this one succeeds? A sequel?

McSheery: This will be enough for me for a while. I am trying to talk Alexei Panshin into doing his “Rite of Passage,” or David Brin into doing The Practice Effect, or any one of a thousand other science-fiction books, especially Anne McCaffrey's Dragonriders of Pern or To Ride Pegasus.
