Chaos Hopes to Make Virtual Production Better, Quicker and Cheaper With New Software Development (EXCLUSIVE)
German visualization tech developer Chaos, whose software includes the Engineering Emmy and SciTech Academy Award-honored V-Ray renderer, is developing new tech aimed at offering studios what it believes will be a less costly, more efficient and higher-quality option for virtual production volumes.
To test Project Arena — a high quality real-time renderer — Chaos assembled a team of production pros to create a short using the tech in a virtual production environment. The Western-themed short, which is expected to debut in the spring, is being lensed by American Society of Cinematographers past president Richard Crudo, while the virtual production line producer is James Blevins, co-founder of virtual production firm MESH and a former post supervisor on “The Mandalorian.”
The goal of the new renderer is to help productions more quickly move 3D-created scenes from popular creation tools such as Maya and Houdini onto LED screens. “I don’t know any other way to express it, except we have something which is now producing the highest level of quality more cheaply and efficiently,” says Blevins. “It’s about making every single shot as efficient as possible.”
“Virtual art departments would have very different roles. They would not necessarily spend a lot of time converting data,” Chaos Labs’ director of special projects Christopher Nichols suggests, noting that the company is seeking feedback from artists as it develops the tech.
“It represents a huge step forward for cinematographers by allowing us to do our jobs more creatively, quickly and efficiently,” adds Crudo. “It delivers a much more precise method of accomplishing what up to now has been a generally cumbersome task. My eyes are always the final judge of what I’m doing, and my experience with it thus far has been thoroughly convincing. It’s destined to become the standard for all volume and LED wall work.”
The key to Chaos’ new development is the use of a type of rendering known as ray tracing (the new development combines Chaos’ V-Ray ray tracing tool with additional new tech). Nichols believes that the ray tracing approach is “the most accurate” at representing lighting and cameras, but the challenge has been that “you can’t always get a real time experience.”
To get those real-time experiences, virtual production workflows today typically involve a game engine, which uses “rasterized” rendering. Explains Nichols, “rasterized rendering is a perfectly good solution for rendering. The only problem is that you have to fake a lot of the things to mimic the quality of what you get naturally out of ray tracing. You may hear now that a certain video game supports ray tracing, [but] they support some ray tracing and that’s enough to make it look a little bit better.”
Real-time ray tracing, Nichols contends, “changes the whole game, because by fully ray tracing, we are [producing] a much closer representation of what an actual camera does. And we’re doing it now in real time. We feel that is going to allow people to get back to filmmaking and not be disrupted by a lot of technology.”
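To make the distinction Nichols describes more concrete, here is a minimal, purely illustrative Python sketch; it is not Chaos code and does not reflect how Project Arena or V-Ray work internally. It shades the same toy sphere twice: once with only a local lighting model, the sort of per-pixel approximation rasterizers rely on, and once with an added shadow ray toward the light, one small example of the physically based queries a full ray tracer makes for shadows, reflections and bounced light. For brevity, both passes find the visible surface the same way; the contrast is confined to the shading step.

import math

# Toy scene: one visible sphere, one invisible "blocker" sphere that only casts a shadow,
# and a single point light. The framebuffer is printed as text characters.
WIDTH, HEIGHT = 60, 24
SPHERE = ((0.0, 0.0, 3.0), 1.0)      # (center, radius) of the visible sphere
BLOCKER = ((0.9, 0.9, 1.8), 0.35)    # small sphere between the light and the surface
LIGHT = (3.0, 3.0, 0.0)              # point light position

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(x / n for x in v)

def intersect(origin, direction, sphere):
    """Distance along a unit-direction ray to the sphere, or None if it is missed."""
    center, radius = sphere
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None

def shade(px, py, trace_shadows):
    # Pinhole camera at the origin: one ray per pixel to find the visible surface.
    x = (px + 0.5) / WIDTH * 2.0 - 1.0
    y = 1.0 - (py + 0.5) / HEIGHT * 2.0
    ray = normalize((x, y, 1.0))
    t = intersect((0.0, 0.0, 0.0), ray, SPHERE)
    if t is None:
        return " "                                   # background
    hit = tuple(r * t for r in ray)
    normal = normalize(sub(hit, SPHERE[0]))
    to_light = normalize(sub(LIGHT, hit))
    diffuse = max(0.0, dot(normal, to_light))        # local Lambert shading only
    if trace_shadows and intersect(hit, to_light, BLOCKER) is not None:
        diffuse = 0.0                                # shadow ray toward the light is blocked
    return ".:-=+*#%@"[min(8, int(diffuse * 9))]

for label, shadows in (("local shading only (the kind of shortcut rasterizers lean on)", False),
                       ("with shadow rays (one small piece of what full ray tracing adds)", True)):
    print(label)
    for py in range(HEIGHT):
        print("".join(shade(px, py, shadows) for px in range(WIDTH)))

In the first image the blocker has no effect at all; in the second, the extra ray per pixel reveals its shadow. A production ray tracer extends that idea to reflections, soft shadows and indirect light, which is the workload that has historically made full ray tracing hard to run in real time.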