Artists, engineers, architects and automakers are coming together around a new standard, born in the digital animation industry, that promises to weave all our virtual worlds together.
That’s the conclusion of a group of panelists from a wide range of industries who gathered at NVIDIA GTC21 this week to talk about Pixar’s Universal Scene Description standard, or USD.
“You have people from automotive, advertising, engineering, gaming, and software and we’re all having this rich conversation about USD,” said Perry Nightingale, head of creative at WPP, one of the world’s largest advertising and communications companies. “We’re basically experiencing this live in our work.”
Born at Pixar
Conceived at Pixar more than a decade ago and released as open-source in 2016, USD provides a rich, common language for defining, packaging, assembling and editing 3D data for a growing array of industries and applications.
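For a sense of what that common language looks like in practice, here is a minimal sketch using USD's Python API (the pxr module): one contributor defines a small asset, and a separate scene assembles it by sublayering. The file names are hypothetical.

```python
# A minimal sketch of USD as a common 3D language, using its Python API.
# File names are illustrative only.
from pxr import Usd, UsdGeom

# Define: author a small asset layer containing a single cube.
asset = Usd.Stage.CreateNew("chair_asset.usda")
UsdGeom.Cube.Define(asset, "/Chair/Seat")
asset.GetRootLayer().Save()

# Assemble: a separate scene pulls that asset in as a sublayer, so each
# contributor can keep editing their own layer without overwriting others.
scene = Usd.Stage.CreateNew("showroom.usda")
scene.GetRootLayer().subLayerPaths = ["chair_asset.usda"]
scene.GetRootLayer().Save()
```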
Most recently, the technology has been adopted by NVIDIA to build Omniverse — a platform that creates a real-time, shared 3D world to speed collaboration among far-flung workers and even train robots, so they can more safely and efficiently work alongside humans.
The panel — moderated by veteran gaming journalist Dean Takahashi — included Martha Tsigkari, a partner at architects Foster + Partners; Mattias Wikenmalm, a senior visualization expert at Volvo Cars; WPP’s Nightingale; Lori Hufford, vice president of applications integration at engineering software company Bentley Systems; Susanna Holt, vice president at 3D software company Autodesk; and Ivar Dahlberg, a technical artist with Stockholm-based gaming studio Embark Studios.
It also featured two of the engineers who helped create the USD standard at Pixar: F. Sebastian Grassia, project lead for USD at Pixar, and Guido Quaroni, now senior director of engineering for 3D and immersive at Adobe.
Joining them was NVIDIA Distinguished Engineer Michael Kass, who, along with NVIDIA’s Rev Lebaredian, helped lead the effort to build NVIDIA Omniverse.
A Sci-Fi Metaverse Come to Life
Omniverse was made to create and experience shared virtual 3D worlds, ones not unlike the science-fiction metaverse described by Neal Stephenson in his early 1990s novel “Snow Crash.” Of course, the full vision of the fictional metaverse remains in the future, but judging by the panel, it’s a future that’s rapidly approaching.
A central goal of Omniverse was to seamlessly connect as many tools, applications and technologies as possible. To do this, Kass and Lebaredian knew they needed to represent the data using a powerful, expressive and battle-tested open standard. USD fit the bill exactly.
“The fact that you’ve built something so general and extensible that it addresses very nicely the needs of all the participants on this call — that’s an extraordinary achievement,” Kass told USD pioneers Grassia and Quaroni.
One of NVIDIA’s key additions to the USD ecosystem is a replication system. An application programmer can use the standard USD API to query a scene and alter it at will. With no special effort on the part of the programmer, the system keeps track of everything that changes.
In real time, the changes can be published to NVIDIA’s Omniverse Nucleus server, which sends them along to all subscribers. As a result, different teams in different places using different tools can work together and see each other’s changes without noticeable delay.
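In USD terms, "query a scene and alter it at will" can be as simple as the sketch below, which uses the standard pxr Python API to traverse a stage and move a prim. The scene file is hypothetical, and the replication and publishing behavior described above is supplied by Omniverse, not by this snippet.

```python
# A hedged sketch of the standard USD API usage described above: open a
# stage, query it, and alter it. Running inside Omniverse, the same edits
# would be tracked and published to a Nucleus server for other subscribers.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.Open("factory_scene.usd")   # hypothetical scene file

# Query: collect every mesh prim in the scene.
meshes = [prim for prim in stage.Traverse() if prim.IsA(UsdGeom.Mesh)]
print(f"Scene contains {len(meshes)} meshes")

# Alter: translate the first mesh (assumes it has no existing translate op).
# With Omniverse's replication layer, this change would reach all
# subscribers without noticeable delay.
if meshes:
    UsdGeom.Xformable(meshes[0]).AddTranslateOp().Set(Gf.Vec3d(0.0, 1.0, 0.0))

stage.GetRootLayer().Save()
```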
That technology has become invaluable in architecture, engineering and construction, where large teams from many different disciplines can now collaborate far more easily.
“You need a way for the creative people to do things that can be passed directly to the engineers and consultants in a seamless way,” Tsigkari said. “The structural engineer doesn’t care about my windows, doesn’t care about my doors.”
USD allows the structural engineer to see what they do care about.
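One USD feature that supports this kind of filtering is the stage population mask, which composes only the parts of a shared scene a given discipline needs. A minimal sketch, with hypothetical prim paths and file names:

```python
# Load only the structural portion of a shared building scene; windows,
# doors and finishes under other prim paths are never composed.
from pxr import Usd, Sdf

mask = Usd.StagePopulationMask()
mask.Add(Sdf.Path("/Building/Structure"))

stage = Usd.Stage.OpenMasked("building.usd", mask)   # hypothetical file
for prim in stage.Traverse():
    print(prim.GetPath())
```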
USD and NVIDIA Omniverse provide a way to link a wide variety of specialized tools — for creatives, engineers and others — in real time.
“We do see the different industries converging and that’s not going to work if they can’t talk to one another,” said Autodesk’s Holt.
One valuable application is the ability to create product mockups in real time. For too long, Nightingale said, creative teams would have to present clients with 2D mockups of their designs because the tools used by the design teams were incompatible with those of the marketing team. Now those mockups can be in 3D and updated instantly as the design team makes changes.
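USD referencing is one mechanism that can make such live mockups possible: the marketing scene references the design team's asset rather than copying it, so the mockup reflects the latest design whenever the stage is recomposed. A minimal sketch, with hypothetical file names:

```python
# A marketing mockup scene that references the design team's live USD asset.
from pxr import Usd

mockup = Usd.Stage.CreateNew("mockup_scene.usda")
product = mockup.DefinePrim("/Mockup/Product")
product.GetReferences().AddReference("product_design.usd")  # hypothetical asset
mockup.GetRootLayer().Save()
```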
Virtual Worlds Where AI, Robots and Autonomous Vehicles Can Learn
Capabilities like these aren’t just critical for humans. USD also promises to be the foundation for virtual worlds where new products can be simulated and rigorously tested.
USD and Omniverse are at the center of NVIDIA’s DRIVE simulation platform, Kass explained, which gives automakers a sandbox where they can test new autonomous vehicles. Nothing should go out into the real world until it’s thoroughly tested in simulation, he said.
“We want all of our mistakes to happen in the virtual world, and we based that entire virtual world on USD,” Kass said.
There’s also potential for technologies like USD to allow participants in the kinds of virtual worlds game makers are building to play a larger role in shaping those worlds in real time.
“One of the interesting things we’re seeing is how players can be part of creating a world,” Dahlberg said.
“Now there are a lot more opportunities where you create something together with the inhabitants of that world,” he added.
The first steps have already been taken, however: thanks to USD, exchanging data about shared 3D worlds is becoming easier.
“If we can actually get that out of the way, when that’s easy to do, we can start building a proper metaverse,” Volvo’s Wikenmalm said.
For more, see “Plumbing the Metaverse with USD” by Michael Kass.