For cutting-edge visual effects and virtual production, creative teams and studios benefit from digital sets and environments that can be updated in real time.
A crucial element in any virtual production environment is a sky dome, often used to provide realistic lighting for virtual environments and in-camera visual effects. Legendary studio Industrial Light & Magic (ILM) is tapping into the power of AI to take its skies to new heights with NVIDIA Omniverse DeepSearch and Omniverse Enterprise.
Capturing photorealistic details of a sky can be tricky. At SIGGRAPH today, ILM showcased how its team used the NVIDIA Omniverse DeepSearch tool and natural-language queries to rapidly search a massive asset library and create a captivating sky dome.
The demo shows how Omniverse Enterprise gives filmmakers the flexibility to develop the ideal look and lighting to further their stories, helping artists save time, boost productivity and accelerate creativity in virtual production.
After narrowing down their search results, the ILM team auditions the remaining sky domes in virtual reality to assess whether each asset is the right match for the shot. By using VR, ILM can approximate what the skies will look like on a virtual production set.
The Sky’s the Limit With AI
An extensive library with thousands of references and 3D assets offers advantages, but it also presents some challenges without an efficient way to search through all the data.
Typically, users set up folders or tag items with keywords, which can be incredibly time-consuming. This is especially true for a studio like ILM, which has over 40 years' worth of material in its reference library, including photography, matte paintings, backdrops and other elements captured over the decades.
With hundreds of thousands of untagged pieces of content, it’s impractical for the ILM team to manually search through them on a production schedule.
Omniverse DeepSearch, however, lets ILM search intuitively through untagged assets using text or a 2D image. DeepSearch uses AI to categorize and retrieve images automatically, saving the creative team significant time by removing the need to manually tag each asset.
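To make the idea concrete, here is a minimal sketch of the general technique such AI-powered search relies on: encoding untagged images and a text query into a shared embedding space and ranking assets by similarity. It uses the open-source CLIP model via Hugging Face's transformers library; the folder path and query are hypothetical, and this illustrates the approach rather than the DeepSearch API itself.

```python
# Sketch of embedding-based natural-language image search (the general
# technique behind tools like DeepSearch). Uses open-source CLIP via
# Hugging Face transformers; paths and queries are hypothetical.
from pathlib import Path

import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")


def embed_images(image_paths):
    """Encode untagged images into normalized feature vectors."""
    images = [Image.open(p).convert("RGB") for p in image_paths]
    inputs = processor(images=images, return_tensors="pt")
    with torch.no_grad():
        feats = model.get_image_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)


def search(query, image_paths, image_feats, top_k=5):
    """Rank images by cosine similarity to a natural-language query."""
    inputs = processor(text=[query], return_tensors="pt", padding=True)
    with torch.no_grad():
        text_feat = model.get_text_features(**inputs)
    text_feat = text_feat / text_feat.norm(dim=-1, keepdim=True)
    scores = (image_feats @ text_feat.T).squeeze(-1)
    best = scores.topk(min(top_k, len(image_paths)))
    return [(image_paths[int(i)], scores[int(i)].item()) for i in best.indices]


# Hypothetical usage: index a folder of sky references, then query it.
paths = sorted(Path("sky_reference_library").glob("*.jpg"))
feats = embed_images(paths)
for path, score in search("overcast sky at golden hour", paths, feats):
    print(f"{score:.3f}  {path}")
```

Because the image embeddings are computed once and reused for every query, searches like this can return results quickly enough to fit a production review session.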
“With Omniverse DeepSearch, we have the ability to search through data in real time, which is key for production,” said Landis Fields, real time principal creative at ILM. “And being able to search through assets with natural language allows for our creative teams to easily find what they’re looking for, helping them achieve the final look and feel of a scene much more efficiently than before.”
DeepSearch also works on USD files, so the ILM team can review search results and bring images into the 3D space in Omniverse Enterprise. The artists can then interact with the 3D environment using a VR headset.
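As an illustration of what pulling a chosen result into the 3D scene can look like at the USD level, here is a minimal sketch using the open-source pxr (OpenUSD) Python API: it references a sky-dome asset into a stage and adds a dome light for image-based lighting. The file paths and prim names are hypothetical, and this is not ILM's or Omniverse's internal workflow.

```python
# Sketch of referencing a selected sky-dome USD asset into a stage and
# lighting the scene with its HDRI. Paths and prim names are hypothetical.
from pxr import Sdf, Usd, UsdGeom, UsdLux

stage = Usd.Stage.CreateNew("set_layout.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.y)

# Reference the chosen sky-dome asset under its own prim.
sky_prim = stage.DefinePrim("/World/SkyDome")
sky_prim.GetReferences().AddReference("assets/skies/overcast_golden_hour.usd")

# Add a dome light pointing at the matching HDRI for image-based lighting.
dome_light = UsdLux.DomeLight.Define(stage, Sdf.Path("/World/DomeLight"))
dome_light.CreateTextureFileAttr().Set("assets/skies/overcast_golden_hour.hdr")

stage.GetRootLayer().Save()
```

Keeping the sky as a reference rather than a copy means swapping in a different dome from a later search is a one-line change to the referenced asset path.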
With NVIDIA DeepSearch and Omniverse Enterprise, ILM has the potential to accelerate creative pipelines, lower costs and enhance production workflows to create captivating content for virtual productions.
Join NVIDIA at SIGGRAPH to learn more about the latest Omniverse announcements, watch the company’s special address on demand and see the global premiere of NVIDIA’s documentary, The Art of Collaboration: NVIDIA, Omniverse, and GTC, on Wednesday, Aug. 10, at 10 a.m. PT.