With some Unreal Engine experience now under his belt, Edward joined Fox VFX Lab, where he was asked to head up their new VAD (virtual art department) and build a team. In these early days of virtual production, Edward was using Unreal Engine to create pitches for films and to visualize environments for directors, enabling them to do virtual scouting, and to set up shots, color, and lighting that were then fed to the visual effects vendor where they would finish the film.

After his time at Fox VFX Lab, Edward and his team were asked to be a part of the VAD for The Mandalorian. “It was the foundation of us starting up a studio solely devoted to the art of being a real real-time studio,” he says. “I was trying to build for what I saw that was coming: the future of visual effects. We could all feel that this was happening.”

Shortly thereafter, Mold3D Studio was invited back to join the VAD for The Mandalorian Season 2. Around this time, the studio was also approached by Epic to work on the Unreal Engine 5 reveal demo. They put the experience gained on previous projects like The Mandalorian to good use when tasked with creating extremely complex and high-resolution 3D models to show off Nanite, UE5’s virtualized micropolygon geometry system.

“It was exciting to get a taste of what’s coming in UE5 in the future,” says Edward. “We’re currently using UE4, until UE5 is production-ready. There have been some great advances in the latest UE 4.27 release, especially in the realm of virtual production, but features like Nanite and Lumen are really going to change the game.”

Virtual production techniques help Mold3D Studio create Slay in a pandemic

After the UE5 demo wrapped, Edward began talking to Epic about Slay. The proposal was to create a finished piece of final-pixel animated content in Unreal Engine. With the company now starting to get a name for environment art, they were excited to illustrate their expertise in story development and character design. With the exception of Windwalker Echo, the Slay assets, including her adversary, were all designed and created by Mold3D Studio.

Just as Slay was getting greenlit, the pandemic hit. Edward and his team set up a remote working environment that would enable them to work on real-time rendered animated content, as well as other projects that they had on their books. “We quickly created a way to enable our company to work remotely on an animated short by making our pipeline virtual,” said Edward.

Interestingly, it was virtual production techniques that ended up making this all possible. With the mocap happening in Las Vegas at SuperAlloy Interactive, Edward’s team in Burbank directed the actors via Zoom, while viewing the results on the characters in real time in Unreal Engine, making it easy to ensure they had the takes they wanted.

After the main motion was captured, the team did a second session with the actor just for facial capture. For this, they used the Live Link Face iOS app. “We were able to look at her takes with the recording that came out of the iPhone and also, on the day, we could see the camera looking at her,” says Edward.

“Although we probably would have done a lot of things the same way we had if there was no pandemic, we were thankfully able to rely on the virtual production aspect of the filmmaking to save the day,” says Edward.

The team had previously modeled the assets in Maya and ZBrush, before blocking out the animation in Maya and bringing it into Unreal Engine via FBX, where they also blocked out the cameras in Sequencer, Unreal Engine’s built-in multi-track nonlinear editor. Taking advantage of the engine’s ability to render the files in real time, they brought in animation daily, starting in a very crude state, even while the models themselves were still being finalized.

“It was great to see the previs with light and color,” says Edward. “You gained a taste of what it was going to look like right away, instead of having to wait a few months to start getting a little bit more involved. That was valuable, as it helped us visualize and finesse the look along the way.”

For look development, the team used lots of Unreal Engine’s materials and shaders, including vertex shaders and decals, to not only give a unique effect but to help maintain real-time performance. “It’s combining tricks, techniques, and processes that were learned in the years I spent working in the visual effects and animation industry, with the benefits of being able to quickly iterate and visualize the results in real time,” says Edward.

In addition to this, they used Unreal Engine’s Landscape toolset to create the terrain, and Quixel Megascans, which are free for all use with Unreal Engine, to populate the environment. Effects, such as the glowing orb, were mostly done in Niagara, Unreal Engine’s visual effects system.