Training, testing and validating autonomous vehicles requires a continuous pipeline — or data factory — to introduce new scenarios and refine deep neural networks.
A key component of this process is simulation. With high-fidelity, physically based simulation, AV developers can test a virtually limitless number of scenarios, repeatably and at scale. And like much of the technology surrounding AI, simulation is constantly evolving and improving, steadily closing the gap between the real and virtual worlds.
NVIDIA DRIVE Sim, built on Omniverse, provides a virtual proving ground for AV testing and validation. It’s a highly accurate simulation platform that enables groundbreaking tools, including synthetic data generation and neural reconstruction, for building digital twins of driving environments and scenarios.
Matt Cragun, senior product manager for AV simulation at NVIDIA, joined the AI Podcast to discuss the development of simulation for self-driving technology, detailing the origins and inner workings of DRIVE Sim.
He also provided a sneak peek into the frontiers researchers are exploring for this critical testing and validation technology.
Neural Reconstruction Engine in NVIDIA DRIVE Sim
NVIDIA researchers have developed an AI pipeline, known as the Neural Reconstruction Engine, that constructs a 3D scene from recorded sensor data in NVIDIA DRIVE Sim.
First demonstrated at GTC22, these AI tools bring the real world directly into simulation to increase realism and speed up autonomous vehicle production.
NRE uses multiple AI networks to create interactive 3D test environments where developers can modify the scene and see how the world reacts. Developers can change scenarios, add synthetic objects, and apply randomizations—such as a child following a bouncing ball into the road—making the initial scenarios even more challenging.
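To make the idea of scenario randomization more concrete, here is a minimal, hypothetical Python sketch of how a variation like the bouncing-ball example might be parameterized and sampled. It does not use the actual DRIVE Sim API; the `ChildChasingBallEvent` and `sample_variations` names, and the parameter ranges, are illustrative assumptions only.

```python
# Hypothetical sketch of scenario randomization for simulation testing.
# NOT the DRIVE Sim API; all classes, names and ranges are illustrative.
import random
from dataclasses import dataclass

@dataclass
class ChildChasingBallEvent:
    """Parameters for a 'child follows bouncing ball into road' variation."""
    trigger_distance_m: float   # how far ahead of the ego vehicle the event starts
    child_speed_mps: float      # child's running speed
    entry_angle_deg: float      # angle at which the child enters the roadway
    time_of_day_h: float        # lighting condition for the rendered scene

def sample_variations(n: int, seed: int = 0) -> list[ChildChasingBallEvent]:
    """Sample n randomized variations of the base scenario."""
    rng = random.Random(seed)
    return [
        ChildChasingBallEvent(
            trigger_distance_m=rng.uniform(10.0, 40.0),
            child_speed_mps=rng.uniform(1.0, 3.5),
            entry_angle_deg=rng.uniform(30.0, 90.0),
            time_of_day_h=rng.uniform(6.0, 20.0),
        )
        for _ in range(n)
    ]

if __name__ == "__main__":
    for event in sample_variations(3):
        print(event)  # each sampled variation would drive one simulated test run
```

In a workflow like the one described above, each sampled variation would configure a separate simulated test run against a reconstructed scene, letting developers probe how the AV stack responds across many versions of the same challenging situation.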
You Might Also Like
Driver’s Ed: How Waabi Uses AI, Simulation to Teach Autonomous Vehicles to Drive
Teaching the AI brains of autonomous vehicles to understand the world as humans do requires billions of miles of driving experience. The road to achieving this astronomical level of driving leads to the virtual world. Learn how Waabi uses powerful high-fidelity simulations to train and develop production-level autonomous vehicles.
Polestar’s Dennis Nobelius on the Sustainable Performance Brand’s Plans
Driving enjoyment and autonomous driving capabilities can complement one another in intelligent, sustainable vehicles. Learn about the automaker’s plans to unveil its third vehicle, the Polestar 3, the tech inside it, and what the company’s racing heritage brings to the intersection of smarts and sustainability.
GANTheftAuto: Harrison Kinsley on AI-Generated Gaming Environments
Humans playing games against machines is nothing new, but now computers can develop their own games for people to play. Programming enthusiast and social media influencer Harrison Kinsley created GANTheftAuto, a neural network that generates a playable chunk of the classic video game Grand Theft Auto V.
Subscribe to the AI Podcast: Now Available on Amazon Music
The AI Podcast is now available through Amazon Music.
In addition, get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn.
Make the AI Podcast better: Have a few minutes to spare? Fill out this listener survey.