Driver’s Ed: How Waabi Uses AI, Simulation to Teach Autonomous Vehicles to Drive

Teaching the AI brains of autonomous vehicles to understand the world as humans do requires billions of miles of driving experience. The road to achieving this astronomical amount of driving experience leads through the virtual world.

On the latest episode of the AI Podcast, Waabi CEO and founder Raquel Urtasun joins NVIDIA’s Katie Burke Washabaugh to talk about the role simulation technology plays in developing production-level autonomous vehicles.

Waabi is an autonomous-vehicle system startup that uses powerful, high-fidelity simulation to run multiple scenarios simultaneously and tailor training to rare and dangerous situations that are difficult to encounter in the real world.

Urtasun is also a professor of computer science at the University of Toronto. Before starting Waabi, she led the Uber Advanced Technologies Group as chief scientist and head of research and development.

You Might Also Like

Polestar’s Dennis Nobelius on the Sustainable Performance Brand’s Plans

Driving enjoyment and autonomous driving capabilities can complement one another in intelligent, sustainable vehicles. Learn about the automaker’s plans to unveil its third vehicle, the Polestar 3, the tech inside it, and what the company’s racing heritage brings to the intersection of smarts and sustainability.

GANTheftAuto: Harrison Kinsley on AI-Generated Gaming Environments

Humans playing games against machines is nothing new, but now computers can develop their own games for people to play. Programming enthusiast and social media influencer Harrison Kinsley created GANTheftAuto, an AI-based neural network that generates a playable chunk of the classic video game Grand Theft Auto V.

How Audio Analytic Is Teaching Machines to Listen

From active noise cancellation to digital assistants that are always listening for your commands, audio is perhaps one of the most important but often overlooked aspects of modern technology in our daily lives. Dr. Chris Mitchell, CEO and founder of Audio Analytic, discusses the challenges, and the fun, involved in teaching machines to listen.

Subscribe to the AI Podcast: Now available on Amazon Music

You can now listen to the AI Podcast through Amazon Music.

You can also get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn.

Have a few minutes to spare? Fill out our listener survey.


GFN Thursday Caught in 4K: 27 Games Arriving on GeForce NOW in May, Alongside 4K Streaming to PC and Mac Apps

Enjoy the finer things in life. May is looking pixel perfect for GeForce NOW gamers.

RTX 3080 members can now take their games to the next level, streaming at 4K resolution on the GeForce NOW PC and Mac native apps — joining 4K support in the living room with SHIELD TV.

There’s also a list of 10 titles ready to play today, led by Star Wars Battlefront II, Star Wars Jedi: Fallen Order and Star Wars: Squadrons — all part of the 27 total games joining the GeForce NOW library in May.

Play in 4K Today

GeForce NOW is always upgrading. As of today, RTX 3080 members playing from the PC and Mac native apps can stream at 4K resolution at 60 frames per second.

Stream in 4K on GeForce NOW with an RTX 3080 membership.

4K streaming gets a boost from NVIDIA DLSS, groundbreaking AI rendering technology that increases graphics performance using dedicated Tensor Core AI processors on RTX GPUs. DLSS taps into the power of a deep learning neural network to boost frame rates and generate beautiful, sharp images for games.

RTX 3080 members get the benefits of ultra-low latency that rivals native console gaming. They can also enable customized in-game graphics settings like RTX ON, taking games to a cinematic level, and enjoy maximized eight-hour play sessions.

On top of this, GeForce NOW is leveling up mobile gamers with support for more 120Hz devices capable of streaming at 120 FPS with RTX 3080 memberships. Newly supported devices include the Samsung Galaxy S22 and S22 Ultra, Galaxy Z Fold3 and Flip3, and OnePlus 9 Pro.

Visit our RTX 3080 setup guides for more information on 4K resolution on PC and macOS, and check out all of the new 120Hz Android devices.

Fan-Favorite Star Wars Games, Now in 4K

This week delivers three new games from Electronic Arts, and the far reaches of the galaxy, to the cloud — all streaming at up to 4K quality with a GeForce NOW RTX 3080 membership.

Be the hero in Star Wars Battlefront II (Steam and Origin, Rated T for Teen by the ESRB). Experience rich multiplayer battlegrounds of all three eras — prequel, classic and new trilogy — or rise as a new hero and discover a gripping single-player story spanning 30 years.

A galaxy-spanning adventure awaits in Star Wars Jedi: Fallen Order (Steam and Origin, Rated T for Teen by the ESRB). Rebuild the Jedi Order by developing powerful Force abilities and mastering the art of the lightsaber — all while staying one step ahead of the Empire in this third-person action-adventure title.

Speed through the stars and master the art of starfighter combat in Star Wars: Squadrons (Steam and Origin, Rated T for Teen by the ESRB). Buckle up and feel the adrenaline of first-person, multiplayer space dogfights alongside your squadron in this authentic piloting experience.

Play all of these fantastic titles, now streaming to PC, Mac, Chromebook, mobile devices and more with GeForce NOW.

Mayday, Mayday – May Games Inbound

We’re kicking the month off with May’s new gaming arrivals.

Our new “Instant Play Free Demos” row just got a little bit bigger. Members can now try out Backbone: Prologue streaming on the cloud before jumping into the full title.

Members can also dive into 27 total games coming in May, with 10 titles leading the pack this week.

The following are ready to stream today:

Plus, it’s evil. It’s dead. Evil Dead: The Game (Epic Games Store) is coming to GeForce NOW this month.

Get ready to play ‘Evil Dead: The Game’ this month – groovy!

Step into the shoes of Ash Williams or his friends from the iconic Evil Dead franchise in a game loaded with co-op and PVP multiplayer action. Survive as a team of four and brandish your short barrel shotgun, chainsaw, cleavers and more against the armies of darkness. Or take control of the Kandarian Demon and seek to swallow their souls — all at up to 1440p and 120 FPS or 4K at 60 FPS on PC and Mac, and up to 4K on SHIELD TV with an RTX 3080 membership.

Coming in May:

  • Brigandine The Legend of Runersia (New release on Steam, May 11)
  • Neptunia x SENRAN KAGURA: Ninja Wars (New release on Steam, May 11)
  • Cepheus Protocol Anthology (New release on Steam, May 13)
  • Evil Dead: The Game (New release on Epic Games Store, May 13)
  • Old World (New release on Steam, May 19)
  • Vampire: The Masquerade Swansong (New release on Epic Games Store, May 19)
  • Crossfire: Legion (New release on Steam, May 24)
  • Out There: Oceans of Time (New release on Steam, May 26)
  • My Time at Sandrock (New release on Steam, May 26)
  • Turbo Sloths (New release on Steam, May 27)
  • Pogostuck: Rage With Your Friends (Steam)
  • Raji: An Ancient Epic (Steam and Epic Games Store)
  • Star Conflict (Steam)
  • THE KING OF FIGHTERS XV (Steam and Epic Games Store)
  • The Planet Crafter (Steam)
  • The Political Machine 2020 (Steam)
  • Yet Another Zombie Defense HD (Steam)

A Lotta Extra From April

On top of the titles announced in April, an extra 22 ended up coming to the cloud. Check out the additional games that were added last month:

The game Cities in Motion 2 (Steam) was also announced in April, but didn’t quite make it.

Finally, we’ve got a question for you. With all of these new games, we’re pretty sure we know the answer to this one. Let us know on Twitter or in the comments below.



Setting AIs on SIGGRAPH: Top Academic Researchers Collaborate With NVIDIA to Tackle Graphics’ Greatest Challenges

NVIDIA’s latest academic collaborations in graphics research have produced a reinforcement learning model that smoothly simulates athletic moves, ultra-thin holographic glasses for virtual reality, and a real-time rendering technique for objects illuminated by hidden light sources.

These projects — and over a dozen more — will be on display at SIGGRAPH 2022, taking place Aug. 8-11 in Vancouver and online. NVIDIA researchers have 16 technical papers accepted at the conference, representing work with 14 universities including Dartmouth College, Stanford University, the Swiss Federal Institute of Technology Lausanne and Tel Aviv University.

The papers span the breadth of graphics research, with advancements in neural content creation tools, display and human perception, the mathematical foundations of computer graphics and neural rendering.

Neural Tool for Multi-Skilled Simulated Characters

When a reinforcement learning model is used to develop a physics-based animated character, the AI typically learns just one skill at a time: walking, running or perhaps cartwheeling. But researchers from UC Berkeley, the University of Toronto and NVIDIA have created a framework that enables AI to learn a whole repertoire of skills — demonstrated above with a warrior character who can wield a sword, use a shield and get back up after a fall.

Achieving these smooth, lifelike motions for animated characters is usually tedious and labor-intensive, with developers starting from scratch to train the AI for each new task. As outlined in this paper, the research team enabled the reinforcement learning AI to reuse previously learned skills when responding to new scenarios, improving efficiency and reducing the need for additional motion data.
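To make that concrete, here is a minimal, hypothetical sketch of the latent-skill architecture this line of work generally uses: a frozen, pretrained low-level controller maps a state and a latent skill vector to joint actions, while a small high-level policy learns only which skill to invoke for the new task. All names, dimensions and network sizes are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

OBS_DIM, ACT_DIM, SKILL_DIM = 64, 12, 16  # illustrative sizes

class LowLevelPolicy(nn.Module):
    """Pretrained on motion data: maps (state, latent skill) -> joint actions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(OBS_DIM + SKILL_DIM, 256), nn.ReLU(),
            nn.Linear(256, ACT_DIM))

    def forward(self, obs, z):
        return self.net(torch.cat([obs, z], dim=-1))

class HighLevelPolicy(nn.Module):
    """Trained per task: picks which latent skill to invoke at each step."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(OBS_DIM, 256), nn.ReLU(),
            nn.Linear(256, SKILL_DIM))

    def forward(self, obs):
        # Keep skills on the unit sphere, matching a normalized latent space.
        return F.normalize(self.net(obs), dim=-1)

low_level = LowLevelPolicy()
low_level.requires_grad_(False)  # frozen: previously learned skills are reused
high_level = HighLevelPolicy()   # only this small network learns the new task

obs = torch.randn(1, OBS_DIM)
action = low_level(obs, high_level(obs))  # new behavior without retraining from scratch
```

Because the low-level controller stays frozen, each new task only has to learn in the compact skill space rather than relearning joint-level motor control.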

Tools like this one can be used by creators in animation, robotics, gaming and therapeutics. At SIGGRAPH, NVIDIA researchers will also present papers about 3D neural tools for surface reconstruction from point clouds and interactive shape editing, plus 2D tools for AI to better understand gaps in vector sketches and improve the visual quality of time-lapse videos.

Bringing Virtual Reality to Lightweight Glasses 

Most virtual reality users access 3D digital worlds by putting on bulky head-mounted displays, but researchers are working on lightweight alternatives that resemble standard eyeglasses.

A collaboration between NVIDIA and Stanford researchers has packed the technology needed for 3D holographic images into a wearable display just a couple of millimeters thick. The 2.5-millimeter display is less than half the thickness of other thin VR displays, known as pancake lenses, which use a technique called folded optics that can only support 2D images.

The researchers accomplished this feat by approaching display quality and display size as a computational problem, and co-designing the optics with an AI-powered algorithm.

While prior VR displays require distance between a magnifying eyepiece and a display panel to create a hologram, this new design uses a spatial light modulator, a tool that can create holograms right in front of the user’s eyes, without needing this gap. Additional components — a pupil-replicating waveguide and geometric phase lens — further reduce the device’s bulkiness.

It’s one of two VR collaborations between Stanford and NVIDIA at the conference, with another paper proposing a new computer-generated holography framework that improves image quality while optimizing bandwidth usage. A third paper in this field of display and perception research, co-authored with New York University and Princeton University scientists, measures how rendering quality affects the speed at which users react to on-screen information.

Lightbulb Moment: New Levels of Real-Time Lighting Complexity

Accurately simulating the pathways of light in a scene in real time has always been considered the “holy grail” of graphics. Work detailed in a paper by the University of Utah’s School of Computing and NVIDIA is raising the bar, introducing a path resampling algorithm that enables real-time rendering of scenes with complex lighting, including hidden light sources.

Think of walking into a dim room, with a glass vase on a table illuminated indirectly by a street lamp located outside. The glossy surface creates a long light path, with rays bouncing many times between the light source and the viewer’s eye. Computing these light paths is usually too complex for real-time applications like games, so it’s mostly done for films or other offline rendering applications.

This paper highlights the use of statistical resampling techniques — where the algorithm reuses computations thousands of times while tracing these complex light paths — during rendering to approximate the light paths efficiently in real time. The researchers applied the algorithm to a classic challenging scene in computer graphics, pictured below: an indirectly lit set of teapots made of metal, ceramic and glass.
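A toy example helps show the core resampling idea: draw many cheap candidate samples, weight each against an approximate target, and keep one in proportion to its weight so the whole pool's work is amortized into a single surviving path. The Python sketch below shows the basic resampled importance sampling estimator under those assumptions; the paper's algorithm generalizes it considerably.

```python
import random

def ris_estimate(f, p_hat, sample_p, pdf_p, num_candidates=32):
    """One resampled-importance-sampling estimate of the integral of f.

    Cheap candidates come from a source distribution (sample_p/pdf_p); each
    is weighted by how well it matches the target p_hat, an inexpensive
    approximation of f. One survivor is kept in proportion to its weight,
    so the candidate pool's work is reused instead of discarded.
    """
    candidates = [sample_p() for _ in range(num_candidates)]
    weights = [p_hat(x) / pdf_p(x) for x in candidates]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    y = random.choices(candidates, weights=weights, k=1)[0]
    # Unbiased estimator: f(y) / p_hat(y) times the average candidate weight.
    return f(y) / p_hat(y) * (total / num_candidates)

# Toy usage: integrate f(x) = x^2 over [0, 1] (exact value 1/3), with
# p_hat = f and uniform source sampling.
runs = 10000
est = sum(ris_estimate(lambda x: x * x,
                       lambda x: x * x + 1e-12,  # epsilon avoids divide-by-zero
                       random.random,
                       lambda x: 1.0)
          for _ in range(runs)) / runs
print(est)  # ~0.333
```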

Related NVIDIA-authored papers at SIGGRAPH include a new sampling strategy for inverse volume rendering, a novel mathematical representation for 2D shape manipulation, software to create samplers with improved uniformity for rendering and other applications, and a way to turn biased rendering algorithms into more efficient unbiased ones.

Neural Rendering: NeRFs, GANs Power Synthetic Scenes

Neural rendering algorithms learn from real-world data to create synthetic images — and NVIDIA research projects are developing state-of-the-art tools to do so in 2D and 3D.

In 2D, the StyleGAN-NADA model, developed in collaboration with Tel Aviv University, generates images with specific styles based on a user’s text prompts, without requiring example images for reference. For instance, a user could generate vintage car images, turn their dog into a painting or transform houses into huts.
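The key ingredient behind this kind of text-guided fine-tuning is a directional CLIP loss: the shift between images from the frozen source generator and the fine-tuned generator is pushed to align with the shift between the source and target prompts in CLIP's embedding space. Below is a minimal sketch of that loss, assuming OpenAI's clip package and loading on CPU for simplicity; the generator training loop, preprocessing and the paper's regularizers are omitted.

```python
import torch
import torch.nn.functional as F
import clip  # OpenAI's CLIP package: pip install git+https://github.com/openai/CLIP.git

# Load on CPU for simplicity (the fp16 CUDA path needs matching input dtypes).
model, _ = clip.load("ViT-B/32", device="cpu")

def directional_clip_loss(src_images, tgt_images, src_text, tgt_text):
    """Push the image-space shift to align with the text-space shift.

    src_images: CLIP-preprocessed batch from the frozen source generator
    tgt_images: the same latents rendered by the generator being fine-tuned
    """
    with torch.no_grad():
        t_src = model.encode_text(clip.tokenize([src_text]))
        t_tgt = model.encode_text(clip.tokenize([tgt_text]))
    text_dir = F.normalize(t_tgt - t_src, dim=-1)
    img_dir = F.normalize(model.encode_image(tgt_images)
                          - model.encode_image(src_images), dim=-1)
    # 1 - cosine similarity between the image shift and the text shift.
    return (1.0 - (img_dir * text_dir).sum(dim=-1)).mean()

# Toy call with random "images" just to exercise the shapes:
dummy = torch.randn(2, 3, 224, 224)
loss = directional_clip_loss(dummy, dummy + 0.1 * torch.randn_like(dummy),
                             "photo", "cubist painting")
print(loss)
```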

And in 3D, researchers at NVIDIA and the University of Toronto are developing tools that can support the creation of large-scale virtual worlds. Instant neural graphics primitives, the NVIDIA paper behind the popular Instant NeRF tool, will be presented at SIGGRAPH.

NeRFs, 3D scenes based on a collection of 2D images, are just one capability of the neural graphics primitives technique. It can be used to represent any complex spatial information, with applications including image compression, highly accurate representations of 3D shapes and ultra-high resolution images.
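Much of the technique's speed comes from replacing a large neural network with learned feature grids indexed by a spatial hash, so a query is little more than a table lookup plus an interpolation. Below is a toy 2D sketch of that lookup in Python; the real method uses many 3D resolution levels feeding a small MLP in fused CUDA kernels, and every size here is an illustrative assumption.

```python
import numpy as np

TABLE_SIZE = 2 ** 14                                 # illustrative table size
PRIMES = np.array([1, 2654435761], dtype=np.uint64)  # per-dimension hash primes

def hash_coords(ix, iy):
    """Spatial hash of integer grid coordinates into the feature table."""
    return int((np.uint64(ix) * PRIMES[0]) ^ (np.uint64(iy) * PRIMES[1])) % TABLE_SIZE

def encode(point, table, resolution):
    """Bilinearly interpolate learned features at `point` in [0, 1]^2."""
    x, y = point[0] * resolution, point[1] * resolution
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    corners = [(x0, y0), (x0 + 1, y0), (x0, y0 + 1), (x0 + 1, y0 + 1)]
    weights = [(1 - fx) * (1 - fy), fx * (1 - fy), (1 - fx) * fy, fx * fy]
    feats = [table[hash_coords(cx, cy)] for cx, cy in corners]
    return sum(w * f for w, f in zip(weights, feats))

# Toy usage: random 2-component features, queried at one point. In training,
# the table entries would be optimized by gradient descent instead.
rng = np.random.default_rng(0)
table = rng.normal(size=(TABLE_SIZE, 2)).astype(np.float32)
print(encode((0.25, 0.75), table, resolution=64))
```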

This work pairs with a University of Toronto collaboration that compresses 3D neural graphics primitives just as JPEG is used to compress 2D images. This can help users store and share 3D maps and entertainment experiences between small devices like phones and robots.

There are more than 300 NVIDIA researchers around the globe, with teams focused on topics including AI, computer graphics, computer vision, self-driving cars and robotics. Learn more about NVIDIA Research.


‘In the NVIDIA Studio’ Welcomes Concept Designer Yangtian Li

Editor’s note: This post is part of our weekly In the NVIDIA Studio series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how NVIDIA Studio technology accelerates creative workflows. 

This week In the NVIDIA Studio, we welcome Yangtian Li, a senior concept artist at Singularity6.

Li is a concept designer and illustrator who has worked on some of the biggest video game franchises, including Call of Duty, Magic: The Gathering and Vainglory. Her artwork also appears in book illustrations and magazines.

Dreams of the Past (left) and Elf Archer by Yangtian Li.

Li’s impressive portfolio features character portraits of strong, graceful, empowered women. Their backstories and elegance are inspired by her own life experiences and world travels.

Snake Witch by Yangtian Li.

Li is now based in Seattle, but her artistic journey began in Chengdu, China. Her hometown serves as the inspiration behind her extraordinary portrait, Snake Witch. This unique and provocative work is based on tribal black magic from Chinese folklore. Li drew and painted the piece, powered by a GeForce RTX GPU and the NVIDIA Studio platform.

Up close, viewers can feel the allure of the Snake Witch.

Snake Witch is a product of Li’s fascination with black magic, or “Gu,” where tribal witch doctors would gather toxic creatures, use their venom to make poison and practice the dark arts. “I always thought the stories were fascinating, so I wanted to do a take,” Li said. “Snakes are more appealing to me, and it helps to create some interesting compositions.”

Li chooses to keep it simple before diving into the details.

After primarily working in 2D, Li moves to 3D to support her concept process. She adds, “having a high-end GPU really helps speed up the process when it comes to using Blender, ZBrush and so on.” These speed-ups, she says, are particularly noticeable with GPU-accelerated rendering in Blender Cycles 3.0, achieving results that are over 5x faster with a GeForce RTX laptop GPU compared to a MacBook Pro M1 Max or CPU alone.


With the Snake Witch’s character foundation in a good place, Li used the Liquify filter to subtly distort her subject’s facial features.

Liquify is one of over 30 GPU-accelerated features in Adobe Photoshop, like AI-powered Neural Filters, that help artists explore creative ideas and make complex adjustments in seconds.


“The more life experiences you have, your understanding of the world evolves, and that will be reflected in your art.”

Li uses adjustment layers in the coloring phase of her process, allowing for non-destructive edits while trying to achieve the correct color tone.

If unsatisfied with an adjustment, Li can simply delete it while the original image remains intact.

Finally, Li adjusts lighting, using the Opacity feature to block light from the right of the image, adding a modicum of realistic flair.

The devil is in the details, or in this case, the Snake Witch.

Remaining in a productive flow state is critical for Li, as it is for many artists, and her GeForce RTX GPU allows her to spend more time in that magical creative zone where ideas come to life faster and more naturally.

Li goes into greater detail on how she created Snake Witch in her Studio Session. This three-part series includes her processes for initial sketching, color and detail optimization, and finishing touches.

Previously, Li has worked as a senior concept artist and designer at Amazon Game Studios, Niantic and Treyarch, among others.

Check out Li’s portfolio and favorite projects on Instagram.

Accelerating Adobe Creators In the NVIDIA Studio

More resources are available to creators seeking additional NVIDIA Studio features and optimizations that accelerate Adobe creative apps.

Follow a step-by-step tutorial in Photoshop that details how to apply a texture from a photo to a 3D visualization render.

Learn how to work significantly faster in Adobe Lightroom by utilizing the AI-powered masking tools Select Subject and Select Sky.

Follow NVIDIA Studio on Facebook, Twitter and Instagram, access tutorials on the Studio YouTube channel, and get updates directly in your inbox by joining the NVIDIA Studio newsletter.


Mown Away: Startup Rolls Out Autonomous Lawnmower With Cutting Edge Tech

Two years after their 3D vision startup Replica Labs was acquired, co-founders Jack Morrison and Isaac Roberts were restless, seeking another adventure. Then, in 2018, while Morrison was mowing his lawn, it struck him: autonomous lawn mowers.

The two, along with Davis Foster, co-founded Scythe Robotics. The company, based in Boulder, Colo., has a 40-person team working with robotics and computer vision to deliver what it believes to be the first commercial electric self-driving mower service.

Scythe’s machine, dubbed M.52, collects a dizzying array of data from eight cameras and more than a dozen other sensors, processed by NVIDIA Jetson AGX Xavier edge AI computing modules.

The company plans to rent its machines to customers much as in a software-as-a-service model, but priced on the acreage of cut grass, reducing upfront costs.

“I thought, if I didn’t enjoy mowing the lawn, what about the folks who are doing it every day? Wasn’t there something better they could do with their time?” said Morrison. “It turned out there’s a strong, resounding ‘yes’ from the industry.”

The startup, a member of the NVIDIA Inception program, says it already has thousands of reservations for its on-demand robots. Meanwhile, it has a handful of pilots, including one with Clean Scapes, a large Austin-based commercial landscaping company.

Scythe’s electric machines are coming as regulatory and corporate governance concerns highlight the need for cleaner landscaping technologies.

What M.52 Can Do

Scythe’s M.52 machine is about as state of the art as it gets. Its cameras support vision on all sides, and its more than a dozen sensors include ultrasonics, accelerometers, gyroscopes, magnetometers, GPS and wheel encoders.

To begin a job, the M.52 needs only to be driven manually around the perimeter of an area once. Scythe’s robot mower relies on its cameras, GPS and wheel encoders to plot maps of its environment with simultaneous localization and mapping, or SLAM.
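Wheel encoders earn their place in that sensor list because they give the mapping pipeline a cheap, continuous motion estimate between camera and GPS updates. As a simple illustration of what encoder-based dead reckoning looks like, and not Scythe's actual software, here is a minimal differential-drive odometry update in Python; the wheel-base value and function names are assumptions for the example.

```python
import math

def update_pose(x, y, heading, d_left, d_right, wheel_base):
    """Advance a 2D pose from left/right wheel travel since the last tick."""
    d_center = (d_left + d_right) / 2.0        # forward distance travelled
    d_theta = (d_right - d_left) / wheel_base  # change in heading, radians
    # Integrate along the midpoint heading for a better small-arc estimate.
    x += d_center * math.cos(heading + d_theta / 2.0)
    y += d_center * math.sin(heading + d_theta / 2.0)
    return x, y, heading + d_theta

# One tick: left wheel moved 0.50 m, right wheel 0.52 m, 1.2 m wheel base
# (all values hypothetical) -- the mower advances in a slight leftward arc.
print(update_pose(0.0, 0.0, 0.0, 0.50, 0.52, 1.2))
```

In a full SLAM pipeline, estimates like this are fused with camera and GPS observations to correct the drift that pure dead reckoning accumulates.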

After that, the operator can direct the M.52 to an area and specify a direction and stripe pattern, and it completes the job unsupervised. If it encounters an obstacle that shouldn’t be there — like a bicycle on the grass — it can send alerts for an assist.

“Jetson AGX Xavier is a big enabler of all of this, as it can be used in powerful machines, brings a lot of compute, really low power, and hooks into the whole ecosystem of autonomous machine sensors,” said Morrison.

Scythe’s robo-mowers pack enough battery for eight hours of use on a charge, which can come from a standard Level 2 EV charger. And the company says fewer moving parts than combustion-engine mowers means less maintenance and a longer service life.

Also, the machine can go 12 miles per hour and includes a platform for operators to stand on for a ride. It’s designed for job sites where mowers may need to travel some distance to get to the area of work.

Riding the RaaS Wave

The U.S. market for landscaping services is expected to reach more than $115 billion this year, up from about $70 billion in 2012, according to IBISWorld research.

Scythe is among an emerging class of startups offering robots as a service, or RaaS, in which customers pay according to usage.

It’s also among companies operating robotics services whose systems share similarities with AVs, Morrison said.

“An advantage of Xavier is using these automotive-grade camera standards that allow the imagery to come across with really low latency,” he said. “We are able to turn things around from photon to motion quite quickly.”

Earth-Friendly Machines

Trends toward lower carbon footprints in businesses are a catalyst for Scythe, said Morrison. LEED certification for greener buildings counts the equipment used in maintenance — such as landscaping — driving interest in electric equipment.

Legislation from California, which aims to prohibit sales of gas-driven tools by 2024, factors in as well. A gas-powered commercial lawn mower operated for one hour emits as much of certain pollutants as a passenger vehicle driven 300 miles, according to a fact sheet released by the California Air Resources Board in 2017.

“There’s a lot of excitement around what we are doing because it’s becoming a necessity, and those in landscaping businesses know that they won’t be able to secure the equipment they are used to,” said Morrison.

Learn more about Scythe Robotics in this GTC presentation.


Meet the Omnivore: 3D Artist Creates Towering Work With NVIDIA Omniverse

Editor’s note: This post is a part of our Meet the Omnivore series, which features individual creators and developers who use NVIDIA Omniverse to accelerate their 3D workflows and create virtual worlds.


Edward McEvenue grew up making claymations in LEGO towns. Now, he’s creating photorealistic animations in virtual cities, drawing on more than a decade of experience in the motion graphics industry.

The Toronto-based 3D artist learned about NVIDIA Omniverse, a 3D design collaboration and world simulation platform, a few months ago on social media. Within weeks he was using it to build virtual cities.

McEvenue used the Omniverse Create app to make a 3D representation of New York City, composed of 32 million polygons.

For another scene, he animated a high-rise building with fracture-and-reveal visual effects.

Light reflects off the building’s glass windows photorealistically, and vehicles whiz by in a physically accurate way — thanks to ray tracing and path tracing enabled by RTX technology, as well as Omniverse’s physics-based simulation capabilities.

McEvenue’s artistic process incorporates many third-party applications, including Autodesk 3ds Max for modeling and animation; Epic Games Unreal Engine for scene layout; tyFlow for visual effects and particle simulations; Redshift and V-Ray for rendering; and Adobe After Effects for post-production compositing.

All of these tools can be connected seamlessly in Omniverse, which is built on Pixar’s Universal Scene Description, an easily extensible, open-source 3D scene description and file format.

Real-Time Rendering and Multi-App Workflows

EDSTUDIOS, McEvenue’s freelance company, creates 3D animations and motion graphics for films, advertisements and other visual projects.

He says Omniverse helps shave a week off his average delivery time for projects, since it eliminates the need to send animations to a separate render farm, and allows him to bring in assets from his favorite design applications.

“It’s a huge relief to finally feel like rendering isn’t a bottleneck for creativity anymore,” McEvenue said. “With Omniverse’s real-time rendering capabilities, you get instant feedback, which is incredibly freeing for the artistic process as it allows creators to focus time and energy into making their designs rather than facing long waits for the beautiful images to display.”

A virtual representation of the ancient sculpture ‘Laocoön and His Sons’ — made by McEvenue with Autodesk 3ds Max, Unreal Engine 5 and Omniverse Create.

McEvenue says he can output more visuals, faster, with one computer running Omniverse than he previously could with nine. He uses an NVIDIA RTX 3080 Ti GPU and NVIDIA Studio Drivers to accelerate his workflow.

In addition to freelance projects, McEvenue has worked on commercial campaigns across Canada, the U.S. and Uganda as a co-founder of the film production company DAY JOB.

“Omniverse takes away a lot of the friction in dealing with exporting formats, recreating shaders or linking materials,” McEvenue said. “And it’s very intuitively designed, so it’s incredibly easy to get up and running.”

Watch a Twitch stream replay that dives deeper into McEvenue’s workflow, as well as highlights the launch of the Unreal Engine 5 Omniverse Connector.

Join in on the Creation

Creators across the world can download NVIDIA Omniverse for free, and enterprise teams can use the platform for their 3D projects.

Join the #MadeInMachinima contest, running through June 27, for a chance to win the latest NVIDIA Studio laptop.

Learn more about Omniverse by watching GTC sessions on demand — featuring visionaries from the Omniverse team, Adobe, Autodesk, Bentley Systems, Epic Games, Pixar, Unity, Walt Disney Studios and more.

Connect your workflows to Omniverse with software from Adobe, Autodesk, Epic Games, Maxon, Reallusion and more.

Follow Omniverse on Instagram, Twitter, YouTube and Medium for additional resources and inspiration. Check out the Omniverse forums and join our Discord Server to chat with the community.


How DNEG Helped Win Another Visual-Effects Oscar by Bringing ‘Dune’ to Life With NVIDIA RTX

Featuring stunning visuals from futuristic interstellar worlds, including colossal sand creatures, Dune captivated audiences around the world.

The sci-fi film picked up six Oscars last month at the 94th Academy Awards, including for Best Sound and Visual Effects. Adapted from Frank Herbert’s 1965 novel of the same name, Dune tells the story of Paul Atreides, a heroic character whose family travels to the dangerous planet of Arrakis.

To bring the story to life, now seven-time Academy Award-winning studio DNEG used a blend of practical and visual effects, creating spectacular graphics that capture the dystopian worlds of Dune.

The film’s production visual effects supervisor, Paul Lambert, said that his focus was on seamlessly augmenting or enhancing what was already accomplished with the beautiful production design and cinematography — grounding the visual effects in reality. DNEG contributed to 28 sequences and over 1,000 VFX shots in the film, and the artists worked from multiple locations using NVIDIA RTX Virtual Workstations.

A Mix of Sand and Simulations

Sand, inevitably, was going to play a major part in Dune, and 18 tons of it were used to make the film. But the VFX team also digitally replicated every aspect of it to perfectly blend simulated sand into the shots.

“One of the things we came up against early on was getting that essence of scale,” said Paul Salvini, global CTO of DNEG. “In simulations, each grain of sand is literally the size of a pixel, which means we needed huge volumes of sand, and that turned into petabytes of data.”

All imagery courtesy of DNEG.

Beyond replicating the look and feel of sand, the team needed to realistically capture its movement. This became even more challenging when it came to finding a way to depict massive sandworms moving through the desert.

The artists spent months building, modeling and sculpting the creature into shape. They took inspiration from baleen whales — months of research revealed that when a colossal object moves through sand, the environment around it behaves like water, similar to how a whale moves through the ocean.

DNEG then simulated each sand particle to see how it would cascade off a sandworm, or how the dunes would ripple as the creature moved around. For the latter, the practical effects team created a sand-displacement effect by placing a vibrating metal plate under real sand, and the VFX team expanded it to simulate the effect on a much larger scale.

“It’s tricky to do, because it’s super complex and computationally expensive to figure out how one grain of sand is connected to another grain — and have all of this act on a massive scale,” said Salvini. “It was an iterative process, and it takes a long time to actually simulate all of these particles.”

DNEG used a combination of Dell Precision workstations and Dell PowerEdge R740 servers with NVIDIA RTX and server GPUs to iterate quickly and make changes, ensuring the simulations with the sandworm looked realistic.

To add more realism to the creature, the artists looked to the bristly teeth of baleen whales. The VFX team modeled different versions of the teeth and used a scattering system in the Houdini app, which allowed them to populate the worm’s mouth at render time.

Using Isotropix Clarisse and NVIDIA RTX, DNEG artists rendered graphics in hours instead of days. This allowed them to receive feedback on the visuals nearly instantly. It also helped increase their number of iterations, enabling final shots and high-quality images at a much quicker pace.

Enhancing Production Workflows With Virtualization

DNEG was one of the first studios to implement NVIDIA virtual GPUs at scale with NVIDIA RTX Virtual Workstation software. NVIDIA RTX-powered virtual workstations deliver incredible flexibility, allowing DNEG to adjust the number of users on a particular server based on the current workload.

Virtual machines are also cost effective. As newer GPUs and expanded software packages enter the data center, DNEG can deploy these to its users to maintain optimal performance for each artist.

“To give our artists more compute power, we can easily increase NVIDIA vGPU profile sizes and reduce the number of users we put on each server,” said Daire Byrne, global head of systems at DNEG. “We don’t need to replace any equipment to keep working with maximum performance.”

And because creators can securely log into RTX-powered virtual workstations from anywhere in the world, DNEG artists can work remotely, while still maintaining high productivity.

“Every show we get is different from the last, and NVIDIA RTX Virtual Workstations let us scale the memory and performance characteristics up or down to meet the needs of our artists,” said Byrne.

Living in the Future of Virtual Worlds

DNEG continues its pioneering work with NVIDIA Omniverse Enterprise as the studio looks to the future of connected, collaborative workflows.

“The virtual world is where filmmaking is going,” said Salvini. “We now have advanced tools and technologies that are capable of delivering photorealistic virtual environments and digital characters, allowing us to create incredible, beautiful stylized worlds.”

With the shift toward real-time technology and more seamless, collaborative content creation pipelines, DNEG sees greater opportunities to interact with the filmmakers and art teams across the globe. This will allow for many more iterations to accomplish artistic goals in a fraction of the time.

DNEG uses Omniverse Enterprise with Dell Precision workstations with NVIDIA RTX A6000 GPUs, and Dell PowerEdge R7525 servers with NVIDIA A40 GPUs.

Learn more about how DNEG is transforming global film production workflows in the studio’s GTC session, now available on demand.


Your Odyssey Awaits: Stream ‘Lost Ark’ to Nearly Any Device This GFN Thursday

It’s a jam-packed GFN Thursday.

This week brings the popular, free-to-play, action role-playing game Lost Ark to gamers across nearly all their devices, streaming on GeForce NOW. And that’s not all.

GFN Thursday also delivers an upgraded experience in the 2.0.40 update. M1-based MacBooks, iMacs and Mac Minis are now supported natively.

Plus, membership gift cards can now be redeemed for RTX 3080 memberships, there’s a membership reward for the “Heroic Edition” of Guild Wars 2, and 14 new games are joining the GeForce NOW library this week.

Embark on an Odyssey

Visit Arkesia, the vast and vibrant world of Lost Ark, now streaming to PC, Mac, Chromebook and more with GeForce NOW.

Explore new lands and encounter vibrant cultures, strange creatures and unexpected marvels waiting to be discovered. Seek out lost treasures in dungeons, test your mettle on epic quests, show your strength in intense raids against enemies and go head-to-head in expert PvP duels. Do it all solo, grouped with friends or matched with other players in this massive open world.

Forge your own destiny across almost all devices. Play to your fighting style with iconic character classes — each with their own distinct abilities — at up to 1440p or 1600p and 120 frames per second on PC and up to 4K on SHIELD TV with an RTX 3080 membership.

Choose weapons and gear to assist you on your journey while gaming on the go with a mobile phone. Dive deep into non-combat skills, crafting, guilds, social systems and other rich features that bring the world alive, streaming even on a MacBook at up to 1600p or an iMac at up to 1440p.

The adventure is yours. Do it all, streaming on GeForce NOW.

Level Up With the GeForce NOW 2.0.40 Update

The newest update to the cloud enables the GeForce NOW macOS app to natively support the Apple M1 chip. This update provides lower power consumption, faster app startup times and an overall elevated GeForce NOW experience on M1-based MacBooks, iMacs and Mac Minis.

Enjoy PC gaming across nearly all devices.

The upgrade brings some additional benefits to members.

Members can more easily discover new games to play in the app with the added Genre row at the bottom of the Games menu. Useful sorting options include the ability to see All Games available in specific regions and by device type, and multiple filters can help narrow down the list.

Finally, members can enjoy an improved Streaming Statistics Overlay that now includes server-side rendering frame rates, and quickly toggles through Standard, Compact and Off modes using the hotkey Ctrl+N. Members can also now complete the whole log-in process on play.geforcenow.com within a single browser tab.

Give the Gift of Gaming

What could be as good as playing with the power of a GeForce RTX 3080? Try giving that power to a loved one, streaming from the cloud.

It’s the gift that keeps on giving for gamers.

GeForce NOW RTX 3080 memberships are now available as digital gift cards with two-, three- and six-month options. GeForce NOW membership gift cards can be used to redeem an RTX 3080 membership or a Priority membership, depending on the recipient’s preference, so you can’t go wrong.

Spoil a special gamer in your life by giving them access to the cloud across their devices or bring a buddy onto the service to party up and play.

Gift cards can be added to an existing GeForce NOW account or redeemed on a new one. Existing Founders, Priority and RTX 3080 members will have the number of months added to their accounts. For more information, visit the GeForce NOW website.

Rewards of Heroic Quality

This week, members can receive the “Heroic Edition” of ArenaNet’s critically acclaimed free-to-play massively multiplayer online role-playing game Guild Wars 2. The Guild Wars 2 “Heroic Edition” comes with a full treasure trove of goodies, including the Suit of Legacy Armor, an 18-slot inventory expansion and four Heroic Boosters.

You bring the heroics, we’ll bring the rewards in Guild Wars 2.

Getting membership rewards for streaming games on the cloud is easy. Log in to your NVIDIA account and select “GEFORCE NOW” from the header, scroll down to “REWARDS” and click the “UPDATE REWARDS SETTINGS” button. Check the box in the dialogue window that pops up to start receiving special offers and in-game goodies.

Sign up for the GeForce NOW newsletter, including notifications for when rewards are available, by logging into your NVIDIA account and selecting “PREFERENCES” from the header. Check the “Gaming & Entertainment” box, and “GeForce NOW” under topic preferences.

No Time Like Playtime

Lead your faction and battle for control over the harsh desert planet of Arrakis.

To cap it all off, GFN Thursday has new games to play, as always. Members can look for the following 14 games ready to stream this week:

  • Dune: Spice Wars (New release on Steam)
  • Holomento (New release on Steam)
  • Prehistoric Kingdom (New release on Steam and Epic Games Store) 
  • Romans: Age of Caesar (New release on Steam)
  • Sea of Craft (New release on Steam)
  • Trigon: Space Story (New release on Steam)
  • Vampire: The Masquerade – Bloodhunt (New release on Steam)
  • Conan Exiles (Epic Games Store)
  • Crawl (Steam)
  • Flashing Lights – Police, Firefighting, Emergency Services Simulator (Steam)
  • Galactic Civilizations II: Ultimate Edition (Steam)
  • Jupiter Hell (Steam)
  • Lost Ark (Steam)
  • SOL CRESTA (Steam)

Finally, we’ve got a question for you this week and are only accepting wrong answers. Let us know what you think on Twitter or in the comments below.


Answers Blowin’ in the Wind: HPC Code Gives Renewable Energy a Lift

A hundred and forty turbines in the North Sea — and some GPUs in the cloud — pumped wind under the wings of David Standingford and Jamil Appa’s dream.

As colleagues at a British aerospace firm, they shared a vision of starting a company to apply their expertise in high performance computing across many industries.

They formed Zenotech and coded what they learned about computational fluid dynamics into a program called zCFD. They also built a tool, EPIC, that simplified running it and other HPC jobs on the latest hardware in public clouds.

But getting visibility beyond their Bristol, U.K. home was a challenge, given the lack of large, open datasets to show what their tools could do.

Harvesting a Wind Farm’s Data

Their problem dovetailed with one in the wind energy sector.

As government subsidies for wind farms declined, investors demanded a deeper analysis of a project’s likely return on investment, something traditional tools didn’t deliver. A U.K. government project teamed Zenotech with consulting firms in renewable energy and SSE, a large British utility willing to share data on its North Sea wind farm, one of the largest in the world.

Zenotech simulated SSE’s Greater Gabbard wind farm, one of the largest of many in the North Sea.

Using zCFD, Zenotech simulated the likely energy output of the farm’s 140 turbines. The program accounted for dozens of wind speeds and directions. This included key but previously untracked phenomena in front of and behind a turbine, like the combined effects the turbines had on each other.

“These so-called ‘wake’ and ‘blockage’ effects around a wind farm can even impact atmospheric currents,” said Standingford, a director and co-founder of Zenotech.
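zCFD resolves these effects with full high-fidelity CFD, but a back-of-the-envelope formula shows why wakes matter so much to a project's economics. The classic Jensen (Park) wake-deficit model, a textbook approximation rather than anything Zenotech uses, estimates the wind-speed loss behind a turbine from just its thrust coefficient and a wake-expansion constant. The Python sketch below applies it; the parameter values are illustrative assumptions.

```python
import math

def jensen_wake_deficit(ct, rotor_diameter, distance_downwind, k=0.05):
    """Fractional wind-speed loss directly downwind of a turbine.

    ct: thrust coefficient (often ~0.8 near rated operation)
    k:  wake-expansion constant (~0.05 offshore, ~0.075 onshore)
    """
    if distance_downwind <= 0:
        return 0.0
    expansion = 1.0 + 2.0 * k * distance_downwind / rotor_diameter
    return (1.0 - math.sqrt(1.0 - ct)) / expansion ** 2

# A turbine five rotor diameters behind its neighbor (120 m rotor, values
# hypothetical): even this crude model predicts a sizable loss.
deficit = jensen_wake_deficit(ct=0.8, rotor_diameter=120.0,
                              distance_downwind=600.0)
print(f"{deficit:.1%} wind-speed reduction")  # roughly 25%
```

Even this crude model predicts a downwind turbine can lose roughly a quarter of its incoming wind speed, which is exactly the kind of interaction a high-fidelity simulation can quantify and a farm layout can then minimize.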

The program can also track small but significant terrain effects on the wind, such as when trees in a nearby forest lose their leaves.

SSE validated that the final simulation came within 2 percent of the utility’s measured data, giving zCFD a stellar reference.

Accelerated in the Cloud

Icing the cake, Zenotech showed cloud GPUs delivered results fast and cost effectively.

For example, the program ran 43x faster on NVIDIA A100 Tensor Core GPUs than on CPUs, at a quarter of the CPUs’ cost, the company reported in a recent GTC session (viewable on demand). NVIDIA NCCL libraries that speed communications between GPU systems boosted results by up to a further 15 percent.

Zenotech benchmarked zCFD on NVIDIA GPUs (green on the left chart) versus CPUs (blue), as well as GPU clusters with and without NCCL libraries (right chart).

As a result, work that took more than five hours on CPUs ran in less than 50 minutes on GPUs. The ability to analyze nuanced wind effects in detail and finish a report in a day “got people’s attention,” Standingford said.

“The tools and computing power to perform high-fidelity simulations of wind projects are now affordable and accessible to the wider industry,” he concluded in a report on the project.

Lowering Carbon, Easing Climate Change

Wind energy is among the largest, most cost-effective contributors to lowering carbon emissions, notes Farah Hariri, technical lead for the team helping NVIDIA’s customers manage their transitions to net-zero emissions.

“By modeling both wake interactions and the blockage effects, zCFD helps wind farms extract the maximum energy for the minimum installation costs,” she said.

This kind of fast yet detailed analysis lowers risk for investors, making wind farms more economically attractive than traditional energy sources, said Richard Whiting, a partner at Everose, one of the consultants who worked on the project with Zenotech.

Looking forward, Whiting estimates more than 2,100 gigawatts of wind farms could be in operation worldwide by 2030, up 3x from 2020. It’s a growing opportunity on many levels.

“In future, we expect projects will use larger arrays of larger turbines so the modeling challenge will only get bigger,” he added.

Less Climate Change, More Business

Helping renewable projects get off the ground also put wind in Zenotech’s sails.

Since the SSE analysis, the company has helped design wind farms or turbines in at least 10 other projects across Europe and Asia. And half of Zenotech’s business is now outside the U.K.

As the company expands, it’s also revisiting its roots in aerospace, lifting the prospects of new kinds of businesses for drones and air taxis.

A Parallel Opportunity

For example, Cardiff Airport is sponsoring live trials where emerging companies use zCFD on cloud GPUs to predict wind shifts in urban environments so they can map safe, efficient routes.

“It’s a forward-thinking way to use today’s managed airspace to plan future services like air taxis and automated airport inspections,” said Standingford.

“We’re seeing a lot of innovation in small aircraft platforms, and we’re working with top platform makers and key sites in the U.K.”

It’s one more way the company is keeping its finger to the wind.


What Is Conversational AI? ZeroShot Bot CEO Jason Mars Explains

Entrepreneur Jason Mars calls conversation our “first technology.”

Before humans invented the wheel, crafted a spear or tamed fire, we mastered the superpower of talking to one another.

That makes conversation an incredibly important tool.

But if you’ve dealt with the automated chatbots deployed by the customer service arms of just about any big organization lately — whether banks or airlines — you also know how hard it can be to get it right.

Deep learning AI and new techniques such as zero-shot learning promise to change that.

On this episode of NVIDIA’s AI Podcast, host Noah Kravitz — whose intelligence is anything but artificial — spoke with Mars about how the latest AI techniques intersect with the very ancient art of conversation.

In addition to being an entrepreneur and CEO of several startups, including Zero Shot Bot, Mars is an associate professor of computer science at the University of Michigan and the author of “Breaking Bots: Inventing a New Voice in the AI Revolution” (ForbesBooks, 2021).

You Might Also Like

NVIDIA’s Liila Torabi Talks the New Era of Robotics Through Isaac Sim

Robots aren’t limited to the assembly line. Liila Torabi, senior product manager for Isaac Sim, a robotics and AI simulation platform powered by NVIDIA Omniverse, talks about where the field’s headed.

GANTheftAuto: Harrison Kinsley on AI-Generated Gaming Environments

Humans playing games against machines is nothing new, but now computers can develop their own games for people to play. Programming enthusiast and social media influencer Harrison Kinsley created GANTheftAuto, an AI-based neural network that generates a playable chunk of the classic video game Grand Theft Auto V.

The Driving Force: How Ford Uses AI to Create Diverse Driving Data

The neural networks powering autonomous vehicles require petabytes of driving data to learn how to operate. Nikita Jaipuria and Rohan Bhasin from Ford Motor Company explain how they use generative adversarial networks (GANs) to fill in the gaps of real-world data used in AV training.

Subscribe to the AI Podcast: Now available on Amazon Music

You can now listen to the AI Podcast through Amazon Music.

You can also get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn.

Make the AI Podcast Better: Have a few minutes to spare? Fill out our listener survey.
