How Video Game Engines Are Changing The Way Animation Is Made [Annecy]


While big productions like “The Mandalorian” have used video game engines before for visual effects, “Super Giant Robot Brothers” is the first project to be completely rendered inside a video game engine. The Reel FX-produced TV show is an action-comedy about, well, two giant robot brothers who are tasked with fighting kaiju that threaten to destroy the world. The trailer sells a fun action cartoon with good physical comedy, but what is most exciting about the show is its shot composition and camera movement. There’s a sense of blocking, a use of low angles, and a style of lighting that looks more like what you’d find in live-action than in a cartoon.

The director of “Super Giant Robot Brothers,” Mark Andrews, introduced the trailer and talked about how Unreal Engine allowed for a streamlined process in which a smaller team can work on the same project simultaneously, while also having more authorship over each stage of production. This stands in contrast with the current pipeline, which is more of a loop: one team works on a part of the project, sends it off to another team, and gets it sent back again and again if any changes are needed. That loop puts enormous pressure on communicating ideas clearly, which isn’t always easy. During Genndy Tartakovsky’s panel premiering the first episode of his show “Unicorn: Warriors Eternal,” the director spoke of the struggle of translating his ideas into something he could communicate to the animation studio, which then adapts those ideas for the screen.

With Unreal Engine, however, every change and every addition can be shared with the entire animation team instantly, since it is rendered in real time. As Andrews said during the presentation, this keeps everyone on the same page at all times, eliminating the back-and-forth, cutting costs, and removing the need for larger teams micromanaging everything.
