When a community embraces sustainability, it can reap multiple benefits: gainful employment for vulnerable populations, more resilient local ecosystems and a cleaner environment.
This Earth Day, we’re announcing our four latest corporate social responsibility investments in India, home to more than 2,700 NVIDIANs. These initiatives are part of our multi-year efforts in the country, which focus on investing in social innovation, job creation and climate action.
Last year, we funded projects that aided migrant workers affected by COVID-19, increased green cover, furthered sustainable waste management processes and improved livelihoods through job creation.
One of this year’s projects will build 10 water-harvesting structures and a dozen systems for diversion-based irrigation, a technique that irrigates farms by redirecting water from rivers or streams. It will benefit around 4,000 individuals from vulnerable migrant households by increasing vegetative cover and the irrigation potential of the land. The foundation will also create community-based initiatives to augment rural household income through the sale of non-timber forest products such as medicinal plants, leaves and honey.
We’re supporting the organization Grow-Trees’ efforts to plant local, non-invasive trees in the Dalma Wildlife Sanctuary, home to dozens of endangered Asiatic elephants. Located in the eastern state of Jharkhand, the sanctuary project will employ tribal women and other villagers to plant more than 26,000 trees to improve the environment and restore elephant migration routes.
Hyderabad-based Naandi is securing sustainable livelihoods for tribal communities by encouraging the organic farming of coffee and other crops. We’re funding this Rockefeller Foundation Award-winning project to transform depleted soil into carbon-rich landscapes, improving plant health across 3,000 acres of coffee farms and elevating the coffee to a gourmet product that boosts the income of thousands of farming families.
Energy Harvest aims to reduce open-field burning by connecting small farmers with machinery owners and straw buyers, paving the way for alternative energy sources made from agricultural waste. The initiative — which will use AI and edge devices to identify farm fires and track local emission levels — will create dozens of employment opportunities, benefit more than 100 farmers and improve air quality by saving hundreds of acres from burning.
NVIDIA has previously funded projects in India that provided education programs for underprivileged youth, taught computer skills to young women, supported people with disabilities and opened 50 community libraries in remote areas. Many of these initiatives have been centered on communities near our three offices — in Bangalore, Hyderabad and Pune.
GFN Thursday is our ongoing commitment to bringing great PC games and service updates to our members each week. Every Thursday, we share updates on what’s new in the cloud — games, exclusive features, and news on GeForce NOW.
This week, it includes the latest updates for two popular games: the free Chaos Wastes expansion for Fatshark’s Warhammer: Vermintide 2, and The Lost Gods DLC for Ubisoft’s Immortals Fenyx Rising.
Since GeForce NOW is streaming the PC version of these games, members receive the full experience — with expansion and DLC support — and can play with and against millions of other PC players.
The GeForce NOW library also grows by 15 games this week, with game releases from NewCore Games, Ubisoft and THQ.
A High-Stakes Adventure
GeForce NOW members can explore the Chaos Wastes with their fellow heroes in Warhammer: Vermintide 2 in this new rogue-lite-inspired game mode. Teams of up to four are built from the ground up, working together on tactics while preparing for the unexpected. As the team progresses, the rewards grow greater. Failure is not an option.
Discover 15 new locations in the free expansion and prepare for an extra challenge as cursed areas — with changing landscapes influenced by the current ruler — bring a more sinister threat.
Warhammer: Vermintide 2 Chaos Wastes is streaming now on GeForce NOW.
God’s Eye View of the Lost Gods
Immortals Fenyx Rising – The Lost Gods, the third narrative DLC, launched and is streaming today on GeForce NOW. Unfolding entirely from an overhead, god’s-eye perspective, the adventure centers on Ash, a new mortal champion, following a series of catastrophic disasters.
Ash’s mission is to travel to a new land, the Pyrite Island, to find and reunite the gods who left Olympos in a huff after a falling-out with Zeus. These “lost gods,” including Poseidon and Hades, will all need to be convinced to return to the Pantheon and restore balance to the world. Naturally, there are plenty of monsters standing between them and Ash, which players can dispatch using a new, brawler-inspired combat system.
Get Your Game On
It’s a busy GFN Thursday this week with 15 games joining the GeForce NOW library today.
In one title launching on Steam today, you play as an adorable yet trouble-making turnip. Avoid paying taxes, solve plantastic puzzles, harvest crops and battle massive beasts, all in a journey to tear down a corrupt vegetable government!
Enter the Chaos-infested Caligari Sector and purge the unclean with the most powerful agents of the Imperium of Man! Warhammer 40,000: Inquisitor – Martyr is a grim action-RPG featuring multiple classes of the Inquisition who carry out the Emperor’s will.
Two games from Ubisoft’s long-running city-building franchise, Anno, release on GeForce NOW today. Anno 2070 offers a new world full of challenges, where you’ll need to master resources, diplomacy and trade in the most comprehensive economic management system in the Anno series.
In Anno 2205, you join humankind’s next step into the future with the promise to build a better tomorrow. You conquer Earth, establishing rich, bustling cities and grand industrial complexes, but to secure the prosperity of your people, you must travel into space.
Though admittedly prone to breaking kitchen appliances like ovens and microwaves, Julie Bernauer — senior solutions architect for machine learning and deep learning at NVIDIA — led the small team that successfully built Selene, the world’s fifth-fastest supercomputer.
Adding to an already impressive feat, Bernauer’s team brought up Selene as the world went into lockdown in early 2020. Using skeleton crews, social distancing protocols and remote cable validation, they achieved in a few weeks what typically takes months with a larger installation team.
Bernauer told NVIDIA AI Podcast host Noah Kravitz that the goal in creating Selene was primarily to support NVIDIA’s researchers. Referencing her time as a doctoral student, Bernauer explained how expense and infrastructure constraints often prevent researchers from working on larger models.
With Selene, the infrastructure is modular, can be scaled up or down depending on what users require and allows different types of research to run simultaneously. Bernauer said that Selene is proving most useful to autonomous vehicle and language modeling research at the moment.
Going forward, Bernauer envisions some of the power and efficiency of systems like Selene becoming more available on widely accessible devices, such as laptops or edge products such as cars.
Key Points From This Episode:
Selene’s unique, pandemic-safe installation is further explained in an NVIDIA blog detailing the specific efforts of Bernauer’s team and the lessons learned from past NVIDIA supercomputers such as SATURNV and Circe.
Bernauer joined NVIDIA in 2015, after spending 15 years in academia. She obtained her Ph.D. in structural genomics from Université Paris-Sud, after which she studied with Nobel Prize winner Michael Levitt at Stanford.
Tweetables:
“[Selene] is an infrastructure that people can share, where we can do different types of research at a time” — Julie Bernauer [8:30]
“We did [Selene] for ourselves, but we also did it … to figure out how to make a product better by going through the experience” — Julie Bernauer [13:27]
Marc Hamilton, vice president of solutions architecture and engineering at NVIDIA, speaks about overseeing the construction of the U.K.’s most powerful supercomputer, Cambridge-1. Built on the NVIDIA DGX SuperPOD architecture, the system will be used by AstraZeneca, GSK, Oxford Nanopore and more.
Hugging Face is more than just an adorable emoji — it’s a company that’s demystifying AI by transforming the latest developments in deep learning into usable code. Research engineer Sam Shleifer talks about the company’s NLP technology, which is used at over 1,000 companies.
Bryan Catanzaro, vice president of applied deep learning research at NVIDIA, walks through some of the latest developments at NVIDIA research … as well as shares a story involving Andrew Ng and cats.
NVIDIA DRIVE-powered cars electrified the atmosphere this week at Auto Shanghai.
The global auto show is the oldest in China and has become the stage to debut the latest vehicles. And this year, automakers, suppliers and startups developing on NVIDIA DRIVE brought a new energy to the event with a wave of intelligent electric vehicles and self-driving systems.
The automotive industry is transforming into a technology industry — next-generation lineups will be completely programmable and connected to a network, supported by software engineers who will invent new software and services for the life of the car.
Just as the battery capacity of an electric vehicle provides miles of range, the computing capacity of these new vehicles will give years of new delight.
EVs for Everyday
Automakers have been introducing electric vehicle technology with one or two specialized models. Now, these lineups are becoming diversified, with an EV for every taste.
Joining the recently launched EQS flagship sedan and EQA SUV on the show floor, the Mercedes-Benz EQB adds a new flavor to the all-electric EQ family. The compact SUV brings smart electromobility in a family size, with seven seats and AI features.
Like its EQA sibling, the EQB features the latest generation MBUX AI cockpit, powered by NVIDIA DRIVE. The high-performance system includes an augmented reality head-up display, AI voice assistant and rich interactive graphics to enable the driver to enjoy personalized, intelligent features.
EV maker Xpeng is bringing its new energy technology to the masses with the P5 sedan. It joins the P7 sports sedan in offering intelligent mobility with NVIDIA DRIVE.
The P5 will be the first to bring Xpeng’s Navigation Guided Pilot (NGP) capabilities to public roads. The automated driving system leverages the automaker’s full-stack XPILOT 3.5, powered by NVIDIA DRIVE AGX Xavier. The new architecture processes data from 32 sensors — including two lidars, 12 ultrasonic sensors, five millimeter-wave radars and 13 high-definition cameras — integrated into 360-degree dual-perception fusion to handle challenging and complex road conditions.
Also making its auto show debut was the NIO ET7, which was first unveiled during a company event in January. The ET7 is the first vehicle that features NIO’s Adam supercomputer, which leverages four NVIDIA DRIVE Orin processors to achieve more than 1,000 trillion operations per second (TOPS).
The flagship vehicle leapfrogs current model capabilities, with more than 600 miles of battery range and advanced autonomous driving. With Adam, the ET7 can perform point-to-point autonomy, using 33 sensors and high-performance compute to continuously expand the domains in which it operates — from urban to highway driving to battery swap stations.
Elsewhere on the show floor, SAIC’s R Auto exhibited the intelligent ES33. This smart, futuristic vehicle equipped with R-Tech leverages the high performance of NVIDIA DRIVE Orin to deliver automated driving features for a safer, more convenient ride.
SAIC- and Alibaba-backed IM Motors — which stands for intelligence in motion — also made its auto show debut with the electric L7 sedan and SUV, powered by NVIDIA DRIVE. These first two vehicles will have autonomous parking and other automated driving features, as well as a 93kWh battery that comes standard.
Improving Intelligence
In addition to automaker reveals, suppliers and self-driving startups showcased their latest technology built on NVIDIA DRIVE.
Global supplier ZF continued to push the bounds of autonomous driving performance with the latest iteration of its ProAI Supercomputer. With NVIDIA DRIVE Orin at its core, the scalable autonomous driving compute platform supports systems with level 2 capabilities all the way to full self-driving, with up to 1,000 TOPS of performance.
Autonomous driving startup Momenta demonstrated the newest capabilities of MPilot, its autopilot and valet parking system. The software, which is designed for mass production vehicles, leverages DRIVE Orin, which enhances production efficiency for a more streamlined time to market.
From advanced self-driving systems to smart, electric vehicles of all sizes, the NVIDIA DRIVE ecosystem stole the show this week at Auto Shanghai.
As AI is increasingly established as a world-changing field, the U.S. has an opportunity not only to demonstrate global leadership, but to establish a solid economic foundation for the future of the technology.
A panel of experts convened last week at GTC to shed light on this topic, with the co-chairs of the Congressional AI Caucus, U.S. Reps. Jerry McNerney (D-CA) and Anthony Gonzalez (R-OH), leading a discussion that reflects Washington’s growing interest in the topic.
The panel also included Hodan Omaar, AI policy lead at the Center for Data Innovation; Russell Wald, director of policy at Stanford University’s Institute for Human-Centered AI; and Damon Woodard, director of AI partnerships at the University of Florida’s AI Initiative.
“AI is getting increased interest among my colleagues on both sides of the aisle, and this is going to continue for some time,” McNerney said. Given that momentum, Gonzalez said the U.S. should be on the bleeding edge of AI development “for both economic and geopolitical reasons.”
Along those lines, the first thing the pair wanted to learn was how panelists viewed the importance of legislative efforts to fund and support AI research and development.
Wald expressed enthusiasm over legislation Congress passed last year as part of the National Defense Authorization Act, which he said would have an expansive effect on the market for AI.
Wald also said he was surprised at the findings of Stanford’s “Government by Algorithm” report, which detailed the federal government’s use of AI to do things such as track suicide risk among veterans, support SEC insider trading investigations and identify Medicare fraud.
Woodard suggested that continued leadership and innovation coming from Washington is critical if AI is to deliver on its promise.
“AI can play a big role in the economy,” said Woodard. “Having this kind of input from the government is important before we can have the kind of advancements that we need.”
The Role of Universities
Woodard and UF are already doing their part. Woodard’s role at the school includes helping transform it into a so-called “AI university.” In response to a question from Gonzalez about what that transition looks like, he said it required establishing a world-class AI infrastructure, performing cutting-edge AI research and incorporating AI throughout the curriculum.
“We want to make sure every student has some exposure to AI as it relates to their field of study,” said Woodard.
He said the school has more than 200 faculty members engaged in AI-related research, and that it’s committed to hiring 100 more. And while Woodard believes the university’s efforts will lead to more qualified AI professionals and AI innovation around its campus in Gainesville, he also said that partnerships, especially those that encourage diversity, are critical to encouraging more widespread industry development.
Along those lines, UF has joined an engineering consortium and will provide 15 historically Black colleges and two Hispanic-serving schools with access to its prodigious AI resources.
Omaar said such efforts are especially important when considering how unequally the high performance computing resources needed to conduct AI research are distributed.
In response to a question from McNerney about a recent National Science Foundation report, Omaar noted the finding that the U.S. Department of Energy is only providing support to about a third of the researchers seeking access to HPC resources.
“Many universities are conducting AI research without the tools they need,” she said.
Omaar said she’d like to see the NSF focus its funding on supporting efforts in states where HPC resources are scarce but AI research activity is high.
McNerney announced that he would soon introduce legislation requiring NSF to determine what AI resources are necessary for significant research output.
Moving Toward National AI Research Resources
These myriad challenges point to the benefits that could come from a more coordinated national effort. To that end, Gonzalez asked about the potential of the National AI Research Resource Task Force Act, and the national AI research cloud that would result from it.
Wald called the legislation a “game-changing AI initiative,” noting that the limited number of universities with AI research computing resources has pushed AI research into the private sector, where the objectives are driven by shorter-term financial goals rather than long-term societal benefits.
“What we see is an imbalance in the AI research ecosystem,” Wald said. The federal legislation would establish a pathway for a national AI research hub, which “has the potential to unleash American AI innovation,” he said.
The way Omaar sees it, the nationwide collaboration that would likely result — among politicians, industry and academia — is necessary for AI to reach its potential.
“Since AI will impact us all,” she said, “it’s going to need everyone’s contribution.”
Meet Paige Frank: Avid hoopster, Python coder and robotics enthusiast.
Still in high school, the Pittsburgh sophomore is so hooked on AI and robotics, she’s already a mentor to other curious teens.
“Honestly, I never was that interested in STEM. I wanted to be a hair stylist as a kid, which is also cool, but AI is clearly important for our future!” said Paige. “Everything changed in my freshman year, when I heard about the AI Pathways Institute.”
The initiative, known as AIPI for short, began in 2019 as a three-week pilot program offered by Boys & Girls Clubs of Western Pennsylvania (BGCWPA). Paige was in the first cohort of 40 youth to attend AIPI, which also included Tomi Oalore and Makiyah Carrington.
Building on the success of that program, NVIDIA and BGCWPA this week have entered into a three-year partnership with the goal of expanding access to AI education to more students, particularly those from underserved and underrepresented communities.
Core to the collaboration is the creation of an AI Pathways Toolkit to make it easy for Boys & Girls Clubs nationwide and other education-focused organizations to deliver the curriculum to their participants.
“At first it was hard. But once we understood AI fundamentals from the AIPI coursework that the staff at BGCWPA taught and by using the NVIDIA Jetson online videos, it all began to come together,” said Paige. “Learning robotics hands-on with the Jetson Nano made it much easier. And it was exciting to actually see our programming in action as the Jetbot robot navigated the maze we created for the project.”
New AI Pathways to the Future
AI is spreading rapidly. But a major challenge to developing AI skills is access to hands-on learning and adequate computing resources. The AI Pathways Toolkit aims to make AI and robotics curriculum accessible for all students, even those without previous coding experience. It’s meant to prepare — and inspire — more students, like Paige, to see themselves as builders of our AI future.
Another obstacle to AI skills development can be perception. “I wasn’t that excited at first — there’s this thing that it’s too nerdy,” commented Paige, who says most of her friends felt similarly. “But once you get into coding and see how things work on the Jetbot, it’s real fun.”
She sees this transformation in action at her new internship as a mentor with the BGCWPA, where she helps kids get started with AI and coding. “Even kids who aren’t that involved at first really get into it. It’s so inspiring,” she said.
Boys & Girls on an AI Mission
Comprising 14 clubhouses and two Career Works Centers, BGCWPA offers programs, services and outreach that serve more than 12,000 youth ages 4-18 across the region. The AIPI is a part of its effort to provide young people with the tools needed to activate and advance their potential.
With support from NVIDIA, BGCWPA developed the initial three-week AIPI summer camp to introduce local high school students to AI and machine learning. Its curriculum was developed by BGCWPA Director of STEM Education Christine Nguyen and representatives from Carnegie Mellon University using NVIDIA’s educational materials, including the Jetson AI Specialist certification program.
The pilot in 2019 included two local summer camps with a focus on historically underrepresented communities encompassing six school districts. The camp attendees also created a hands-on project using the award-winning Jetson Nano developer kit and Jetbot robotics toolkit.
“We know how important it is to provide all students with opportunities to impact the future of technology,” said Nguyen. “We’re excited to utilize the NVIDIA Jetson AI certification materials with our students as they work toward being leaders in the fields of AI and robotics.”
Students earned a stipend in a work-based learning experience, and all of the participants demonstrated knowledge gained in the “Five Big Ideas in AI,” a framework created by AI4K12, a group working to develop guidelines for K-12 AI education. They also got to visit companies and see AI in action, learn human-centered design and present a capstone project that focused on a social problem they wanted to solve with AI.
“With the support of NVIDIA, we’re helping students from historically underrepresented communities build confidence and skills in the fields of AI, ML and robotics,” said Lisa Abel-Palmieri, Ph.D., president and CEO of BGCWPA. “Students are encouraged to develop personal and professional connections with a diverse group of peers who share similar passions. We also equip participants with the vital knowledge and tools to implement technology that addresses bias in AI and benefits society as a whole.”
From Summer Camp to Yearlong Program
Helping youth get started on a pathway to careers in AI and robotics has become an urgent need. Moreover, learning to develop AI applications requires real-world skills and resources that are often scarce in underserved and underrepresented communities.
NVIDIA’s partnership with BGCWPA includes a funding grant and access to technical resources, enabling the group to continue to develop a new AI Pathways Toolkit and open-source curriculum supported by staff tools and training.
The curriculum scales the summer camp model into a yearlong program that creates a pathway for students to gain AI literacy through hands-on development with the NVIDIA Jetson Nano and Jetbot kits. And the tools and training will make it easy for educators, including the Boys & Girls Clubs’ Youth Development Professionals, to deliver the curriculum to their students.
The toolkit, when completed, will be made available to the network of Boys & Girls Clubs across the U.S., with the goal of implementing the program at 80 clubs by the middle of 2024. The open-source curriculum will also be available to other organizations interested in implementing AI education programs around the world.
As for Paige’s future plans: “I want to strengthen my coding skills and become a Python pro. I also would like to start a robotics club at my high school. And I definitely want to pursue computer science in college. I have a lot of goals,” she said.
Boys & Girls Club Joins AI Educators at GTC21
Abel-Palmieri was a featured panelist at a special event at GTC21 last week. With a record 1,600+ sessions this year, GTC offers a wealth of content — from getting started with AI for those new to the field, to advanced sessions for developers deploying real-world robotics applications. Register for free to view on-demand replays.
Joining Abel-Palmieri on the panel, “Are You Smarter Than a Fifth Grader Who Knows AI?” (SE2802), were Babak Mostaghimi, assistant superintendent of Curriculum, Instructional Support and Innovation for Gwinnett County Public Schools of Suwanee, Georgia; Jim Gibbs, CEO of Meter Feeder; Justin “Mr. Fascinate” Shaifer; and Maynard Okereke (a.k.a. Hip Hop MD) from STEM Success Summit.
Free GTC sessions to help students learn the basics of AI or brush up on robotics skills include:
Jetson 101: Learning Edge AI Fundamentals (S32700)
Build Edge AI Projects with the Jetson Community (S32750)
Many GTC hands-on sessions are designed to help educators learn and teach AI, including: “Duckietown on NVIDIA Jetson: Hands-On AI in the Classroom” with ETH Zurich (S32637) and “Hands-On Deep Learning Robotics Curriculum in High Schools with Jetson Nano” with CAVEDU Education (S32702).
NVIDIA has also launched the Jetson Nano 2GB Developer Kit Grant Program with a goal to further democratize AI and robotics. The new program offers limited quantities of Jetson Developer Kits to professors, educators and trainers across the globe.
A rising technology star in Southeast Asia just put a sparkle in its AI.
Vingroup, Vietnam’s largest conglomerate, is installing the most powerful AI supercomputer in the region. The NVIDIA DGX SuperPOD will power VinAI Research, Vingroup’s machine-learning lab, in global initiatives that span autonomous vehicles, healthcare and consumer services.
One of the lab’s most important missions is to develop the AI smarts for an upcoming fleet of autonomous electric cars from VinFast, the group’s automotive division, driving its way to global markets.
New Hub on the AI Map
It’s a world-class challenge for the team led by Hung Bui. A top-tier AI researcher and an alum of Google’s DeepMind unit — with nearly 6,000 citations from more than 200 papers and an International Math Olympiad win in his youth — he’s up for the challenge.
In barely two years, Hung has built a team that now includes 200 researchers. Last year, as a warm-up, they published as many as 20 papers at top conferences, pushing the boundaries of AI while driving new capabilities into the sprawling group’s many products.
“By July, a fleet of cars will start sending us their data from operating 24/7 in real traffic conditions over millions of miles on roads in the U.S. and Europe, and that’s just the start — the volume of data will only increase,” said Hung.
Hung foresees a need to retrain those models on a daily basis as new data arrives. He believes the DGX SuperPOD can accelerate by at least 10x the AI work of the NVIDIA DGX A100 system VinAI currently uses, letting engineers update their models every 24 hours.
“That’s the goal, it will save a lot of engineering time, but we will need a lot of help from NVIDIA,” said Hung, who hopes to have the new cluster of 20 DGX A100 systems — linked by an NVIDIA Mellanox HDR 200Gb/s InfiniBand network — in place by May.
Developing World-Class Talent
With a DGX SuperPOD in place, Hung hopes to attract and develop more world-class AI talent in Vietnam. It’s a goal shared widely at Vingroup.
In October, the company hosted a ceremony to mark the end of the initial year of studies for the first 260 students at its VinUniversity. Vietnam’s first private, nonprofit college — founded and funded by Vingroup — it so far offers programs in business, engineering, computer science and health sciences.
It’s a kind of beacon pointing to a better future, like Landmark 81, the 81-story skyscraper — the country’s tallest — that the group built and operates on the banks of the Saigon River.
“AI technology is a way to move the company forward, and it can make a lot of impact on the lives of people in Vietnam,” he said, noting other group divisions use DGX systems to advance medical imaging and diagnosis.
Making Life Better with AI
Hung has seen AI’s impact firsthand. His early work in the field at SRI International, in Silicon Valley, helped spawn the technology that powers the Siri assistant in Apple’s iPhone.
More recently, VinAI developed an AI model that lets users of VinSmart handsets unlock their phones using facial recognition — even if they’re wearing a COVID mask. At the same time, core AI researchers on his team developed PhoBERT, a Vietnamese version of the giant Transformer model used for natural-language processing.
It’s the kind of world-class work that, two years ago, Vingroup’s chairman and Vietnam’s first billionaire, Pham Nhat Vuong, wanted from VinAI Research. He personally convinced Hung to leave a position as a research scientist on the DeepMind team and join Vingroup.
Navigating the AI Future
Last year, to help power its efforts, VinAI became the first company in Southeast Asia to install a DGX A100 system.
“We’ve been using the latest hardware and software from NVIDIA quite successfully in speech recognition, NLP and computer vision, and now we’re taking our work to the next level with a perception system for driving,” he said.
It’s a challenge Hung gets to gauge daily amid a rising tide of pedestrians, bicycles, scooters and cars on his way to his office in Hanoi.
“When I came back to Vietnam, I had to relearn how to drive here — the traffic conditions are very different from the U.S.,” he said.
“After a while I got the hang of it, but it got me thinking a machine probably will do an even better job — Vietnam’s driving conditions provide the ultimate challenge for systems trying to reach level 5 autonomy,” he added.
Artists, engineers, architects and automakers are coming together around a new standard — born in the digital animation industry — that promises to weave all our virtual worlds together.
That’s the conclusion of a group of panelists from a wide range of industries who gathered at NVIDIA GTC21 this week to talk about Pixar’s Universal Scene Description standard, or USD.
“You have people from automotive, advertising, engineering, gaming, and software and we’re all having this rich conversation about USD,” said Perry Nightingale, head of creative at WPP, one of the world’s largest advertising and communications companies. “We’re basically experiencing this live in our work.”
Born at Pixar
Conceived at Pixar more than a decade ago and released as open-source in 2016, USD provides a rich, common language for defining, packaging, assembling and editing 3D data for a growing array of industries and applications.
Most recently, the technology has been adopted by NVIDIA to build Omniverse — a platform that creates a real-time, shared 3D world to speed collaboration among far-flung workers and even train robots, so they can more safely and efficiently work alongside humans.
The panel — moderated by veteran gaming journalist Dean Takahashi — included Martha Tsigkari, a partner at architects Foster + Partners; Mattias Wikenmalm, a senior visualization expert at Volvo Cars; WPP’s Nightingale; Lori Hufford, vice president of applications integration at engineering software company Bentley Systems; Susanna Holt, vice president at 3D software company Autodesk; and Ivar Dahlberg, a technical artist with Stockholm-based gaming studio Embark Studios.
It also featured two of the engineers who helped create the USD standard at Pixar — F. Sebastian Grassia, project lead for USD at Pixar, and Guido Quaroni, now senior director of engineering of 3D and immersive at Adobe.
Joining them was NVIDIA Distinguished Engineer Michael Kass, who, along with NVIDIA’s Rev Lebaredian, helped lead the effort to build NVIDIA Omniverse.
A Sci-Fi Metaverse Come to Life
Omniverse was made to create and experience shared virtual 3D worlds, ones not unlike the science-fiction metaverse described by Neal Stephenson in his early 1990s novel “Snow Crash.” Of course, the full vision of the fictional metaverse remains in the future, but judging by the panel, it’s a future that’s rapidly approaching.
A central goal of Omniverse was to seamlessly connect as many tools, applications and technologies as possible. To do this, Kass and Lebaredian knew they needed to represent the data using a powerful, expressive and battle-tested open standard. USD fit the bill exactly.
“The fact that you’ve built something so general and extensible that it addresses very nicely the needs of all the participants on this call — that’s an extraordinary achievement,” Kass told USD pioneers Grassia and Quaroni.
One of NVIDIA’s key additions to the USD ecosystem is a replication system. An application programmer can use the standard USD API to query a scene and alter it at will. With no special effort on the part of the programmer, the system keeps track of everything that changes.
In real time, the changes can be published to NVIDIA’s Omniverse Nucleus server, which sends them along to all subscribers. As a result, different teams in different places using different tools can work together and see each other’s changes without noticeable delay.
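For readers curious about what that standard API looks like in practice, here is a minimal sketch using the open-source USD Python bindings (pxr). The stage name, prim paths and values are purely illustrative, and the Nucleus publishing step is only noted in a comment — this is not the Omniverse replication code itself.

```python
# A minimal sketch of querying and altering a scene through the standard
# USD API, assuming the open-source pxr Python bindings. All names below
# (factory_scene.usda, /Factory/Robot, etc.) are illustrative only.
from pxr import Usd, UsdGeom, Gf

# Create a stage -- the container for a USD scene graph.
stage = Usd.Stage.CreateNew("factory_scene.usda")

# Define a transformable prim and a cube beneath it.
robot = UsdGeom.Xform.Define(stage, "/Factory/Robot")
body = UsdGeom.Cube.Define(stage, "/Factory/Robot/Body")

# Query the scene: traverse all prims and inspect their paths and types.
for prim in stage.Traverse():
    print(prim.GetPath(), prim.GetTypeName())

# Alter the scene: move the robot and resize its body.
robot.AddTranslateOp().Set(Gf.Vec3d(1.0, 0.0, 2.5))
body.GetSizeAttr().Set(0.5)

# Persist the edits; a collaboration layer such as Omniverse Nucleus can
# watch changes like these and forward them to subscribers.
stage.GetRootLayer().Save()
```

Because every tool reads and writes this same scene description, edits made through one application can be picked up by any other that speaks USD.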
That technology has become invaluable in architecture, engineering and construction, where large teams from many different disciplines can now collaborate far more easily.
“You need a way for the creative people to do things that can be passed directly to the engineers and consultants in a seamless way,” Tsigkari said. “The structural engineer doesn’t care about my windows, doesn’t care about my doors.”
USD allows the structural engineer to see what they do care about.
USD and NVIDIA Omniverse provide a way to link a wide variety of specialized tools — for creatives, engineers and others — in real time.
“We do see the different industries converging and that’s not going to work if they can’t talk to one another,” said Autodesk’s Holt.
One valuable application is the ability to create product mockups in real time. For too long, Nightingale said, creative teams would have to present clients with 2D mockups of their designs because the tools used by the design teams were incompatible with those of the marketing team. Now those mockups can be in 3D and updated instantly as the design team makes changes.
Virtual Worlds Where AI, Robots and Autonomous Vehicles Can Learn
Capabilities like these aren’t just critical for humans. USD also promises to be the foundation for virtual worlds where new products can be simulated and rigorously tested.
USD and Omniverse are at the center of NVIDIA’s DRIVE simulation platform, Kass explained, which gives automakers a sandbox where they can test new autonomous vehicles. Nothing should go out into the real world until it’s thoroughly tested in simulation, he said.
“We want all of our mistakes to happen in the virtual world, and we based that entire virtual world on USD,” Kass said.
There’s also potential for technologies like USD to allow participants in the kind of virtual worlds game makers are building to play a larger role in shaping those worlds in real time.
“One of the interesting things we’re seeing is how players can be part of creating a world,” Dahlberg said.
“Now there are a lot more opportunities where you create something together with the inhabitants of that world,” he added.
The first steps, however, have already been taken — thanks to USD — making it easier to exchange data about shared 3D worlds.
“If we can actually get that out of the way, when that’s easy to do, we can start building a proper metaverse,” Volvo’s Wikenmalm said.
To help developers hone their craft, NVIDIA this week introduced more than 50 new and updated tools and training materials for data scientists, researchers, students and developers of all kinds.
The offerings range from software development kits for conversational AI and ray tracing, to hands-on courses from the NVIDIA Deep Learning Institute.
They’re available to all members of the NVIDIA Developer Program, a free-to-join global community of over 2.5 million technology innovators who are revolutionizing industries through accelerated computing.
Training for Success
Learning new and advanced software development skills is vital to staying ahead in a competitive job market. DLI offers a comprehensive learning experience on a wide range of important topics in AI, data science and accelerated computing. Courses include hands-on exercises and are available in both self-paced and instructor-led formats.
The five courses cover topics such as deep learning, data science, autonomous driving and conversational AI. All include hands-on exercises that accelerate learning and mastery of the material. DLI workshops are led by NVIDIA-certified instructors and include access to fully configured GPU-accelerated servers in the cloud for each participant.
These instructor-led workshops will be available to enterprise customers and the general public. DLI recently launched public workshops for its popular instructor-led courses, increasing accessibility to individual developers, data scientists, researchers and students.
To extend training further, DLI is releasing a new book, “Learning Deep Learning,” that provides a complete guide to deep learning theory and practical applications. Authored by NVIDIA Engineer Magnus Ekman, it explores how deep neural networks are applied to solve complex and challenging problems. Pre-orders are available now through Amazon.
New and Accelerated SDKs, Plus Updated Technical Tools
SDKs are a key component that can make or break an application’s performance. Dozens of new and updated kits for high performance computing, computer vision, data science, conversational AI, recommender systems and real-time graphics are available so developers can meet virtually any challenge. Updated tools are also in place to help developers accelerate application development.
Updated tools available now:
NGC is a GPU-optimized hub for AI and HPC software with a catalog of hundreds of SDKs, AI, ML and HPC containers, pre-trained models and Helm charts that simplify and accelerate workflows from end to end. Pre-trained models help developers jump-start their AI projects for a variety of use cases, including computer vision and speech.
New SDK (coming soon):
TAO (Train, Adapt, Optimize) is a GUI-based, workflow-driven framework that simplifies and accelerates the creation of enterprise AI applications and services. Enterprises can fine-tune pre-trained models using transfer learning or federated learning to produce domain-specific models in hours rather than months, eliminating the need for large training runs and deep AI expertise. Learn more about TAO.
New and updated SDKs and frameworks available now:
Jarvis, a fully accelerated application framework for building multimodal conversational AI services. It includes state-of-the-art models pre-trained for thousands of hours on NVIDIA DGX systems, the Transfer Learning Toolkit for adapting those models to domains with zero coding, and optimized end-to-end speech, vision and language pipelines that run in real time. Learn more.
Maxine, a GPU-accelerated SDK with state-of-the-art AI features for developers to build virtual collaboration and content creation applications such as video conferencing and live streaming. Maxine’s AI SDKs — video effects, audio effects and augmented reality — are highly optimized and include modular features that can be chained into end-to-end pipelines to deliver the highest performance possible on GPUs, both on PCs and in data centers. Learn more.
Merlin, an application framework, currently in open beta, that enables the development of deep learning recommender systems — from data preprocessing to model training and inference — all accelerated on NVIDIA GPUs. Read more about Merlin.
DeepStream, an AI streaming analytics toolkit for building high-performance, low-latency, complex video analytics apps and services.
Triton Inference Server, which lets teams deploy trained AI models from any framework, from local storage or a cloud platform, on any GPU- or CPU-based infrastructure. A minimal client-side sketch appears after this list.
TensorRT, for high-performance deep learning inference, includes an inference optimizer and runtime that deliver low latency and high throughput. TensorRT 8 is 2x faster for transformer-based models and includes new techniques to achieve accuracy similar to FP32 while using high-performance INT8 precision.
RTX technology, which helps developers harness and bring realism to their games:
DLSS is a deep learning neural network that helps graphics developers boost frame rates and generate beautiful, sharp images for their projects. It includes performance headroom to maximize ray-tracing settings and increase output resolution. Unity has announced that DLSS will be natively supported in Unity Engine 2021.2.
RTX Direct Illumination (RTXDI) makes it possible to render, in real time, scenes with millions of dynamic lights without worrying about performance or resource constraints.
RTX Global Illumination (RTXGI) leverages the power of ray tracing to scalably compute multi-bounce indirect lighting without bake times, light leaks or expensive per-frame costs.
Real-Time Denoisers (NRD) is a spatio-temporal API-agnostic denoising library that’s designed to work with low ray-per-pixel signals.
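As a concrete illustration of the Triton workflow mentioned above, here is a minimal client-side sketch. It assumes the tritonclient Python package, a Triton server listening on localhost:8000 and a hypothetical model named "resnet50" with tensor names "input__0" and "output__0" — those identifiers are assumptions for illustration, not details from this announcement.

```python
# Hypothetical Triton Inference Server client call over HTTP.
# Model name, tensor names and shapes are placeholders.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# Build the request: one FP32 input tensor filled with dummy data.
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)
infer_input = httpclient.InferInput("input__0", list(batch.shape), "FP32")
infer_input.set_data_from_numpy(batch)

# Ask for the output tensor by name and run inference.
infer_output = httpclient.InferRequestedOutput("output__0")
response = client.infer(model_name="resnet50",
                        inputs=[infer_input],
                        outputs=[infer_output])

print(response.as_numpy("output__0").shape)
```

The same client code works regardless of which framework produced the model behind "resnet50," which is the portability the server is designed to provide.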
The rollout of 5G for edge AI services promises to fuel a magic carpet ride into the future for everything from autonomous vehicles to supply chains and education.
That was a key takeaway from a panel of five 5G experts speaking at NVIDIA’s GPU Technology Conference this week.
With speed boosts of up to 10x over 4G, 5G will bring game-changing features to cellular networks, such as low latency, improved reliability and built-in security. It will also radically improve AI services such as online gaming, autonomous vehicles and logistics robots. In addition, AI on 5G could help deliver services like online learning and microbanking to remote and underdeveloped parts of the world.
Executives from Verizon, Wind River, Mavenir, Google and NVIDIA shared their views on the wide-ranging impact 5G will have on edge AI services. And if even half of their predictions come true within the next decade, the future promises exciting times.
Enhance Human Experience
The next generation of applications is going to enhance the human experience and create new opportunities, said Ganesh Harinath, VP and CTO of 5G MEC and AI platforms at Verizon. But he said the networking requirements for the future call for edge computing.
“The inferencing aspect of machine learning has to be moved closer and closer to where the signals are generated,” said Harinath.
Propel Digital World
Nermin Mohamed, head of telco solutions at embedded systems software provider Wind River, said that 5G, AI and edge computing are “the three magic words that will propel the digital connected world.”
She said that companies are looking at 5G as an accelerator for their revenue and that the rollout of 5G grew four times faster than 4G over the past 18 months.
Bridge Digital Divide
The availability of 5G will usher in digital services to remote places, bridging the digital divide, said Pardeep Kohli, president and CEO of telecom software company Mavenir.
With 5G “you can have low latency and a good experience where this type of connectivity can be used for having an education” where it might otherwise not be available, said Kohli.
Reshape Telecom, Edge
Open ecosystems are key to encouraging developers to build applications, said Shailesh Shukla, vice president and general manager for Networking and Telecom at Google Cloud.
“With the advent of 5G and AI, there is an opportunity now to reshape the broader telecom infrastructure and the edge industry by doing something very similar to what was done with Google and Android,” Shukla said.
‘Heady Mix Ahead’
A lot of the applications — autonomous vehicles, augmented and virtual reality — have been restrained by network limitations, said Ronnie Vasishta, NVIDIA senior vice president for Telecoms. NVIDIA has been investing in GPU and DPU platforms for accelerated compute to support the ecosystem of edge AI applications and telecom partners, he said.
“Sometimes we underestimate the impact that 5G will have on our lives,” he said. “We’re really in for a heady mix ahead of us with the combination of AI and 5G.”