A new way to create.
HYBRID VIRTUAL PRODUCTION
LED VOLUME STAGE VIRTUAL PRODUCTION
Impossible Objects is energized by both producing and researching new ways to explore the spectrum of virtual production, from fully animated visualizations to live-action LED volume stage production. By embracing the non-linear nature of virtual production, we hope to push the boundaries of creative efficiency and real-time workflows.
"Real-time" rendering is one of the most fascinating new languages for creating cinematic visual effects. An actor's body and facial performance can be captured from anywhere in the world and brought into an infinite number of virtual environments, with none of the long traditional render times; the possibilities are quite literally endless.
With Virtual Production, the approach to automotive filmmaking will never be the same.
Traditionally, shooting cars has been a refined practice of controlling the uncontrollable. And the uncontrollable factors are often the same.
The sun is always moving, and crews spend their days chasing, or trying to avoid, weather and lighting conditions they cannot control.
The vehicle often can't be driven at top speed... and sometimes can't be driven at all.
In visual effects, linear pipelines keep creatives from making magical in-the-moment decisions because of lengthy render times.
With real-time filmmaking and virtual production, we have the freedom to be more innovative than ever before.
With our proprietary automotive rig blueprint, we can immediately rig any vehicle of our choosing with an accurate sense of weight and movement, and drive it through any virtual environment our imagination can dream up.
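To make "an accurate sense of weight" concrete, here is a minimal sketch, written in Python rather than an Unreal Blueprint, of the kind of weight-transfer math a vehicle rig relies on to make a car pitch under braking and roll through corners. The class and parameter names are illustrative assumptions, not our proprietary setup.

```python
from dataclasses import dataclass

@dataclass
class VehicleRig:
    """Hypothetical rig parameters for a sprung vehicle body."""
    mass_kg: float        # sprung mass of the chassis
    cg_height_m: float    # height of the center of gravity
    wheelbase_m: float    # front-to-rear axle distance
    track_m: float        # left-to-right wheel distance

    def weight_transfer(self, accel_long: float, accel_lat: float):
        """Return (longitudinal, lateral) load transfer in newtons
        for accelerations given in m/s^2. A rig maps these loads to
        visible body pitch (braking dive) and roll (cornering lean)."""
        long_n = self.mass_kg * accel_long * self.cg_height_m / self.wheelbase_m
        lat_n = self.mass_kg * accel_lat * self.cg_height_m / self.track_m
        return long_n, lat_n

# Hard braking (-8 m/s^2) while cornering (4 m/s^2) on a 1500 kg sedan.
rig = VehicleRig(mass_kg=1500, cg_height_m=0.5, wheelbase_m=2.8, track_m=1.6)
braking, cornering = rig.weight_transfer(accel_long=-8.0, accel_lat=4.0)
```

In a real-time engine these per-frame loads would feed suspension spring offsets, so the body settles naturally instead of gliding weightlessly through the environment.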
Real-Time Technologies Deliver Impossible Results
For "Battle At Home Screen" we were responsible for the entire production from start to finish, working alongside ad agency Omelet as well as the teams at Google and Diablo. The studio was intent on doing as much of the work as possible in Unreal Engine.
Our team brought this vision to life using accelerated virtual production workflows to blend visual effects with live action. Using Unreal Engine and NVIDIA RTX A6000-powered Dell Precision 7920 workstations, the team created all the stunning cinematics and graphics, from high-fidelity textures and reflections to realistic camera movement and lighting.
Recently, Faceware invited Impossible Objects to help test its facial capture software with Unreal's MetaHumans. Until now, creating and animating photo-real facial rigs has been difficult, time-consuming, and costly. As a key Unreal partner, Faceware is removing those animation barriers by providing a production-quality facial tracking tool capable of driving MetaHuman assets.
With the MetaHuman Creator, we can empower actors to slip right into the shoes of real-time animated characters, bringing them to life in a way that is wholly unique yet entirely human. These characters give an actor full freedom to perform naturally while intricate micro-expressions are captured in real time. Paired with NVIDIA RTX technology, our pipeline turns this motion capture into cinematic photorealism in real time.
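As a sketch of the kind of real-time processing this involves, the snippet below smooths a stream of per-frame blendshape weights before they drive a facial rig, damping tracker jitter while keeping micro-expressions responsive. The function and shape names are hypothetical illustrations, not Faceware's or Epic's actual API.

```python
def smooth_weights(stream, alpha=0.6):
    """Exponentially smooth per-frame blendshape weight dicts.

    alpha near 1.0 favors the incoming capture (responsive);
    alpha near 0.0 favors the previous frame (stable).
    Yields one smoothed dict per input frame.
    """
    state = {}
    for frame in stream:
        for shape, weight in frame.items():
            prev = state.get(shape, weight)  # first frame passes through
            state[shape] = alpha * weight + (1.0 - alpha) * prev
        yield dict(state)

# Hypothetical captured frames; "jawOpen" spikes as tracker noise might.
frames = [
    {"jawOpen": 0.0, "browInnerUp": 0.2},
    {"jawOpen": 1.0, "browInnerUp": 0.2},
    {"jawOpen": 0.1, "browInnerUp": 0.25},
]
smoothed = list(smooth_weights(frames))
```

In production, a filter like this would sit between the tracker's network stream and the rig update, tuned per shape so fast regions (the jaw) stay snappy while subtle regions (the brows) stay clean.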
Nemosyne, created by Luc Delamare and Kevin Stewart, is a short film demonstrating the potential for cinematic photorealism in Unreal Engine, and how virtual production cinematographers with live-action backgrounds can work more efficiently in the virtual production space.