Take the Future for a Spin at GTC 2022

Don’t miss the chance to experience the breakthroughs that are driving the future of autonomy.

NVIDIA GTC will bring together the leaders, researchers and developers who are ushering in the era of autonomous vehicles. The virtual conference, running March 21-24, also features experts from industries transformed by AI, such as healthcare, robotics and finance.

And it’s all free to attend.

The conference kicks off with a brilliant display of the latest in AI development: the opening keynote on March 22, delivered by NVIDIA founder and CEO Jensen Huang.

The whole week is packed with more than 900 sessions covering autonomous vehicles, AI, supercomputing and more. Conference-goers also have the opportunity to network and learn from in-house experts on the latest in AI and self-driving development.

Here’s a sneak peek of what to expect at GTC next month:

Learn From Luminaries

It seems that nearly every week there’s a new development in the field of AI, but how do these breakthroughs translate to autonomous vehicles?

Hear how industry leaders are harnessing the latest AI innovations to accelerate intelligent transportation — from global automakers and suppliers to startups and researchers.

Automotive session highlights include:

  • Stefan Sicklinger, head of Big Loop and advanced systems division at CARIAD/VW Group, covers the process of leveraging fleet data to develop and improve autonomous driving software at scale.
  • Magnus Östberg, chief software officer of Mercedes-Benz, discusses how the premium automaker is creating software-defined features for the next generation of luxury.
  • Xiaodi Hou, co-founder and CTO of TuSimple, walks through the autonomous trucking startup’s approach to achieving level 4 autonomy.
  • Raquel Urtasun, CEO of Waabi, details how the startup, which recently emerged from stealth, takes an AI-first approach to autonomous driving development.
  • Michael Keckeisen, director of ProAI at ZF Group, outlines the role of supercomputers in developing and deploying safer, more efficient transportation.
  • Developer-focused sessions from Cruise, Luminar, Microsoft, Ouster, Pony.ai, Zoox and more.

Inside NVIDIA DRIVE

Attendees also have the opportunity to get the inside scoop on the latest NVIDIA DRIVE solutions directly from the minds behind them.

NVIDIA DRIVE Developer Days is a series of deep-dive sessions on building safe and robust autonomous vehicles. These sessions are led by the NVIDIA engineering team, which will highlight the newest DRIVE features and discuss how to apply them to autonomous vehicle development.

This virtual content is available to all GTC attendees — register for free today and seize the opportunity to get a firsthand look at the autonomous future.

Bringing Novel Idea to Life, NVIDIA Artists Create Retro Writer’s Room in Omniverse With ‘The Storyteller’

Real-time rendering and photorealistic graphics used to be tall tales, but NVIDIA Omniverse has turned them from fiction into fact.

NVIDIA’s own artists are writing new chapters in Omniverse, an accelerated 3D design platform that connects and enhances 3D apps and creative workflows, to showcase these stories.

Combined with the NVIDIA Studio platform, Omniverse and Studio-validated hardware enable creatives to push the limits of their imagination and design rich, captivating virtual worlds like never before.

One of the latest projects in Omniverse, The Storyteller, showcases a stunning retro-style writer’s room filled with leather-bound books, metallic typewriters and wooden furniture. Artists from NVIDIA used Omniverse, Autodesk 3ds Max and Substance 3D Painter to capture the essence of the room, creating detailed 3D models with realistic lighting.

Just for Reference — It Begins With Images

To kick off the project, lead environment artist Andrew Averkin looked at various interior images to use as references for the scene. From retro furniture and toys to vintage record players and sturdy bookshelves, these images were used as guidance and inspiration throughout the creative process.

The team of artists also collected various 3D models to create the assets that would populate and bring mood and atmosphere to the scene.

For one key element, the writer’s table, the team added extra details, such as texturing done in Substance 3D Painter, to create more layers of realism.

3D Assets, Assemble!

Once the 3D assets were completed, Averkin assembled the scene in Autodesk 3ds Max connected to Omniverse Create, a scene composition app that can handle complex 3D scenes and objects.

With Autodesk 3ds Max connected to Create, Averkin had a much more iterative workflow — he was able to place 3D models in the scene, make changes to them on the spot, and continue assembling the scene until he achieved the look and feel he wanted.

“The best part was that I used all the tools in Autodesk 3ds Max to quickly assemble the scene. And with Omniverse Create, I used path-traced render mode to get high-quality, photorealistic renders of the scene in real time,” said Averkin. “I also used Assembly Tool, which is a set of tools that allowed me to work with the 3D models in a more efficient way — from scattering objects to painting surfaces.”

Averkin used the Autodesk 3ds Max Omniverse Connector — a plug-in that enables users to quickly and easily convert 3D content to Universal Scene Description, or USD — to export the scene from Autodesk 3ds Max to Omniverse Create. This made it easier to sync his work from one app to another, and continue working on the project inside Omniverse.
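
For readers curious what the Universal Scene Description side of that handoff looks like, here is a minimal sketch using Pixar’s open-source USD Python API. It is not the Connector’s own code, and the file name and prim names are hypothetical.

```python
# Minimal USD-authoring sketch with Pixar's open-source pxr API.
# This only illustrates the format the Omniverse Connector writes;
# the file name and prim paths below are hypothetical.
from pxr import Usd, UsdGeom, Gf

stage = Usd.Stage.CreateNew("writers_room.usda")         # hypothetical file
UsdGeom.Xform.Define(stage, "/WritersRoom")              # root transform prim
desk = UsdGeom.Cube.Define(stage, "/WritersRoom/Desk")   # stand-in prop
desk.GetSizeAttr().Set(1.5)

# Place the prop, as an exporter does when writing object transforms.
UsdGeom.XformCommonAPI(desk.GetPrim()).SetTranslate(Gf.Vec3d(0.0, 0.75, 0.0))

stage.GetRootLayer().Save()
```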

A Story Rendered Complete

To put the final touches on the Storyteller project, the artists worked with the simple-to-use tools in Omniverse Create to add realistic, ambient lighting and shadows.

“I wanted the lighting to look like the kind you see after the rain, or on a cloudy day,” said Averkin. “I also used rectangular light behind the window, so it could brighten the indoor part of the room and provide some nice shadows.”

To stage the composition, the team placed 30 or so cameras around the room to capture its different angles and perspectives, so viewers could be immersed in the scene.

For the final render of The Storyteller, the artists used Omniverse RTX Renderer in path-traced mode to get the most realistic result.

Some shots were rendered on an NVIDIA Studio system powered by two NVIDIA RTX A6000 GPUs. The team also used Omniverse Farm — a system layer that lets users create their own render farm — to accelerate the rendering process and achieve the final design significantly faster.

Watch the final cut of The Storyteller, and learn more about Omniverse at GTC, taking place on March 21-24.

GTC, which is free to attend, will feature sessions that dive into virtual worlds and digital twins, including Averkin’s talk, “An Artist’s Omniverse: How to Build Virtual Worlds.”

Creators can download NVIDIA Omniverse for free and get started with step-by-step tutorials on our Omniverse YouTube channel. For additional resources and inspiration, follow Omniverse on Instagram, Twitter and Medium. To chat with the community, check out the Omniverse forums and join our Discord server.

What Is Edge AI and How Does It Work?

Recent strides in the efficacy of AI, the adoption of IoT devices and the power of edge computing have come together to unlock the power of edge AI.

This has opened new opportunities for edge AI that were previously unimaginable — from helping radiologists identify pathologies in the hospital, to driving cars down the freeway, to helping us pollinate plants.

Countless analysts and businesses are talking about and implementing edge computing, which traces its origins to the 1990s, when content delivery networks were created to serve web and video content from edge servers deployed close to users.

Today, almost every business has job functions that can benefit from the adoption of edge AI. In fact, edge applications are driving the next wave of AI in ways that improve our lives at home, at work, in school and in transit.

Learn more about what edge AI is, its benefits and how it works, examples of edge AI use cases, and the relationship between edge computing and cloud computing.

What Is Edge AI? 

Edge AI is the deployment of AI applications in devices throughout the physical world. It’s called “edge AI” because the AI computation is done near the user at the edge of the network, close to where the data is located, rather than centrally in a cloud computing facility or private data center.

Since the internet has global reach, the edge of the network can be any location: a retail store, factory, hospital or the devices all around us, like traffic lights, autonomous machines and phones.

Edge AI: Why Now? 

Organizations from every industry are looking to increase automation to improve processes, efficiency and safety.

To help them, computer programs need to recognize patterns and execute tasks repeatedly and safely. But the world is unstructured and the range of tasks that humans perform covers infinite circumstances that are impossible to fully describe in programs and rules.

Advances in edge AI have opened opportunities for machines and devices, wherever they may be, to operate with the “intelligence” of human cognition. AI-enabled smart applications learn to perform similar tasks under different circumstances, much like real life.

The efficacy of deploying AI models at the edge arises from three recent innovations.

  1. Maturation of neural networks: Neural networks and related AI infrastructure have finally developed to the point of allowing for generalized machine learning. Organizations are learning how to successfully train AI models and deploy them in production at the edge.
  2. Advances in compute infrastructure: Powerful distributed computational power is required to run AI at the edge. Recent advances in highly parallel GPUs have been adapted to execute neural networks.
  3. Adoption of IoT devices: The widespread adoption of the Internet of Things has fueled the explosion of big data. With the sudden ability to collect data in every aspect of a business — from industrial sensors, smart cameras, robots and more — we now have the data and devices necessary to deploy AI models at the edge. Moreover, 5G is providing IoT a boost with faster, more stable and secure connectivity.

Why Deploy AI at the Edge? What Are the Benefits of Edge AI? 

Since AI algorithms are capable of understanding language, sights, sounds, smells, temperature, faces and other analog forms of unstructured information, they’re particularly useful in places occupied by end users with real-world problems. These AI applications would be impractical or even impossible to deploy in a centralized cloud or enterprise data center due to issues related to latency, bandwidth and privacy.

The benefits of edge AI include:

  • Intelligence: AI applications are more powerful and flexible than conventional applications that can respond only to inputs that the programmer had anticipated. In contrast, an AI neural network is not trained how to answer a specific question, but rather how to answer a particular type of question, even if the question itself is new. Without AI, applications couldn’t possibly process infinitely diverse inputs like texts, spoken words or video.
  • Real-time insights: Since edge technology analyzes data locally rather than in a faraway cloud delayed by long-distance communications, it responds to users’ needs in real time.
  • Reduced cost: By bringing processing power closer to the edge, applications need less internet bandwidth, greatly reducing networking costs.
  • Increased privacy: AI can analyze real-world information without ever exposing it to a human being, greatly increasing privacy for anyone whose appearance, voice, medical image or any other personal information needs to be analyzed. Edge AI further enhances privacy by containing that data locally, uploading only the analysis and insights to the cloud. Even if some of the data is uploaded for training purposes, it can be anonymized to protect user identities. By preserving privacy, edge AI simplifies the challenges associated with data regulatory compliance.
  • High availability: Decentralization and offline capabilities make edge AI more robust since internet access is not required for processing data. This results in higher availability and reliability for mission-critical, production-grade AI applications.
  • Persistent improvement: AI models grow increasingly accurate as they train on more data. When an edge AI application confronts data that it cannot accurately or confidently process, it typically uploads it so that the AI can retrain and learn from it. So the longer a model is in production at the edge, the more accurate the model will be.

How Does Edge AI Technology Work?

Lifecycle of an edge AI application.

For machines to see, perform object detection, drive cars, understand speech, speak, walk or otherwise emulate human skills, they need to functionally replicate human intelligence.

AI employs a data structure called a deep neural network to replicate human cognition. These DNNs are trained to answer specific types of questions by being shown many examples of that type of question along with correct answers.

This training process, known as “deep learning,” often runs in a data center or the cloud due to the vast amount of data required to train an accurate model, and the need for data scientists to collaborate on configuring the model. After training, the model graduates to become an “inference engine” that can answer real-world questions.
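
A toy illustration of that train-then-deploy flow, in PyTorch with synthetic data: train a small network on examples with known answers, then freeze it as a portable “inference engine.” The data, model shape and file name are assumptions for illustration, not any production pipeline.

```python
# Toy train-then-deploy flow: learn from labeled examples, then
# export a frozen model for inference. The synthetic data and the
# model architecture are illustrative assumptions.
import torch
import torch.nn as nn

x = torch.randn(512, 2)                 # synthetic inputs
y = (x.sum(dim=1) > 0).long()           # their "correct answers"

model = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for _ in range(200):                    # the "deep learning" phase
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

model.eval()                            # graduates to inference duty
torch.jit.script(model).save("edge_model.pt")  # artifact for the edge
```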

In edge AI deployments, the inference engine runs on some kind of computer or device in far-flung locations such as factories, hospitals, cars, satellites and homes. When the AI stumbles on a problem, the troublesome data is commonly uploaded to the cloud for further training of the original AI model, which at some point replaces the inference engine at the edge. This feedback loop plays a significant role in boosting model performance; once edge AI models are deployed, they only get smarter and smarter.
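
That feedback loop can be as simple as a confidence threshold: answer locally, and queue anything the model is unsure about for upload and retraining. A hedged sketch, where the model file, threshold and upload function are all hypothetical:

```python
# Sketch of the edge feedback loop: infer locally, and set aside
# low-confidence inputs for cloud retraining. The model file, the
# threshold and upload_for_retraining() are hypothetical.
import torch

model = torch.jit.load("edge_model.pt")
model.eval()
CONFIDENCE_THRESHOLD = 0.8              # assumed cutoff

def upload_for_retraining(sample: torch.Tensor) -> None:
    ...  # placeholder: ship the sample to the cloud training pipeline

def handle(sample: torch.Tensor) -> int:
    with torch.no_grad():
        probs = torch.softmax(model(sample.unsqueeze(0)), dim=1)
    confidence, label = probs.max(dim=1)
    if confidence.item() < CONFIDENCE_THRESHOLD:
        upload_for_retraining(sample)   # troublesome data goes back
    return label.item()
```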

What Are Examples of Edge AI Use Cases? 

AI is the most powerful technology force of our time, and it’s now revolutionizing the world’s largest industries.

Across manufacturing, healthcare, financial services, transportation, energy and more, edge AI is driving new business outcomes in every sector, including:

  • Intelligent forecasting in energy: For critical industries such as energy, in which discontinuous supply can threaten the health and welfare of the general population, intelligent forecasting is key. Edge AI models help to combine historical data, weather patterns, grid health and other information to create complex simulations that inform more efficient generation, distribution and management of energy resources to customers.
  • Predictive maintenance in manufacturing: Sensor data can be used to detect anomalies early and predict when a machine will fail. Sensors on equipment scan for flaws and alert management if a machine needs a repair so the issue can be addressed early, avoiding costly downtime.
  • AI-powered instruments in healthcare: Modern medical instruments at the edge are becoming AI-enabled with devices that use ultra-low-latency streaming of surgical video to allow for minimally invasive surgeries and insights on demand.
  • Smart virtual assistants in retail: Retailers are looking to improve the digital customer experience by introducing voice ordering to replace text-based searches with voice commands. With voice ordering, shoppers can easily search for items, ask for product information and place online orders using smart speakers or other intelligent mobile devices.

What Role Does Cloud Computing Play in Edge Computing? 

AI applications can run in a data center like those in public clouds, or out in the field at the network’s edge, near the user. Cloud computing and edge computing each offer benefits that can be combined when deploying edge AI.

The cloud offers benefits related to infrastructure cost, scalability, high utilization, resilience from server failure, and collaboration. Edge computing offers faster response times, lower bandwidth costs and resilience from network failure.

There are several ways in which cloud computing can support an edge AI deployment:

  • The cloud can run the model during its training period.
  • The cloud continues to run the model as it is retrained with data that comes from the edge.
  • The cloud can run AI inference engines that supplement the models in the field when high compute power is more important than response time. For example, a voice assistant might respond to its name, but send complex requests back to the cloud for parsing (see the sketch after this list).
  • The cloud serves up the latest versions of the AI model and application.
  • The same edge AI often runs across a fleet of devices in the field, with software managed from the cloud.
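
As a sketch of the split described in the third bullet, a device might answer simple requests locally and defer the rest to a cloud inference service. The intents, endpoint URL and payload format here are hypothetical:

```python
# Hedged sketch of hybrid edge/cloud inference: simple intents are
# answered on the device; complex ones go to a cloud service. The
# intents, endpoint and payload shape are illustrative assumptions.
import json
import urllib.request

LOCAL_INTENTS = {"what time is it", "set a timer"}  # assumed on-device skills
CLOUD_ENDPOINT = "https://example.com/parse"        # hypothetical service

def answer_locally(utterance: str) -> str:
    return f"(handled on device) {utterance}"

def answer(utterance: str) -> str:
    if utterance in LOCAL_INTENTS:
        return answer_locally(utterance)            # fast, low latency
    request = urllib.request.Request(               # heavy parsing in cloud
        CLOUD_ENDPOINT,
        data=json.dumps({"text": utterance}).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)["reply"]
```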

Learn more about the best practices for hybrid edge architectures.

The Future of Edge AI 

Thanks to the commercial maturation of neural networks, proliferation of IoT devices, advances in parallel computation and 5G, there is now robust infrastructure for generalized machine learning. This is allowing enterprises to capitalize on the colossal opportunity to bring AI into their places of business and act upon real-time insights, all while decreasing costs and increasing privacy.

We are only in the early innings of edge AI, and still the possible applications seem endless.

Learn how your organization can deploy edge AI by checking out the top considerations for deploying AI at the edge.

Performance You Can Feel: Putting GeForce NOW RTX 3080 Membership’s Ultra-Low Latency to the Test This GFN Thursday

GeForce NOW’s RTX 3080 membership is the next generation of cloud gaming. This GFN Thursday looks at one of the tier’s major benefits: ultra-low-latency streaming from the cloud.

This week also brings a new app update that lets members log in via Discord, a members-only World of Warships reward and eight titles joining the GeForce NOW library.

Full Speed Ahead

The GeForce NOW RTX 3080 membership tier is kind of like magic. When members play on underpowered PCs, Macs, Chromebooks, SHIELD TVs, Android devices, iPhones and iPads, they’re streaming the full PC games that they own with all of the benefits of GeForce RTX 3080 GPUs — like ultra-low latency.

A few milliseconds of latency — the time it takes from a keystroke or mouse click to seeing the result on screen — can be the difference between a round-winning moment and a disappointing delay in the game.

The GeForce NOW RTX 3080 membership, powered by GeForce NOW SuperPODs, reduces latency through faster game rendering, more efficient encoding and higher streaming frame rates. Each step helps deliver cloud gaming that rivals many local gaming experiences.

With game rendering on the new SuperPODs, all GeForce NOW RTX 3080 members will feel a discernible reduction in latency. However, GeForce NOW RTX 3080 members playing on the PC, Mac and Android apps will observe the greatest benefits by streaming at up to 120 frames per second.
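
To make the frame-rate math concrete: at 120 frames per second, each frame takes about 8.3 milliseconds, half the 16.7 ms of a 60 FPS stream, which is one reason higher streaming frame rates trim click-to-pixel latency. A back-of-envelope sketch with purely illustrative numbers (not measured GeForce NOW figures):

```python
# Back-of-envelope click-to-pixel budget. Every component value is
# an illustrative assumption, not a measured GeForce NOW figure; the
# point is how frame time shrinks as the streaming rate rises.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

budget_ms = {
    "input + network round trip": 15.0,     # assumed
    "render one frame": frame_time_ms(120),
    "encode + decode": 8.0,                 # assumed
    "display scan-out": frame_time_ms(120),
}

print(f"frame time @60 FPS:  {frame_time_ms(60):.1f} ms")
print(f"frame time @120 FPS: {frame_time_ms(120):.1f} ms")
print(f"illustrative total:  {sum(budget_ms.values()):.1f} ms")
```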

The result? Digital Foundry proclaimed it’s “the best streaming system we’ve played.”

That means whether you’re hitting your shots in Rainbow Six Siege, using a game-changing ability in Apex Legends or even sweating out a fast-paced shooter like Counter-Strike: Global Offensive, it’s all streaming in real time on GeForce NOW.

But don’t just take our word for it. We asked a few cloud gaming experts to put the GeForce NOW RTX 3080 membership to the test. Cloud Gaming Xtreme looked at how close RTX 3080 click-to-pixel latency is to their local PC, while GameTechPlanet called it “the one to beat when it comes to cloud gaming, for picture quality, graphics, and input latency.”

And Virtual Cloud said of the latency results that it “really shocked me just how good it felt,” and that “when swapping from my local PC to play on GeForce NOW, I really couldn’t tell a difference at all.”

Live in the Future

On top of this, the membership comes with the longest gaming session length on GeForce NOW — clocking in at eight glorious hours. It also gives members full control to customize in-game graphics settings and enables RTX ON, rendering environments in cinematic quality for supported games.

Explore Night City with RTX ON, all streaming from the cloud with RTX 3080-class performance.

That means RTX 3080 members can experience titles like Cyberpunk 2077 with maximized, uninterrupted playtime at beautiful, immersive cinematic quality. Gamers can also enjoy the title’s new update, Patch 1.5, released earlier this week, which adds ray-traced shadows from local lights and introduces a variety of improvements and quality-of-life changes across the board, along with fresh pieces of free additional content.

With improved Fixer and Gig gameplay, enhanced UI and crowd reaction AI, map and open world tweaks, new narrative interactions, weapons, gear, customization options and more, now is the best time to stream Cyberpunk 2077 with RTX ON.

Ready to play? Start gaming today with a six-month GeForce NOW RTX 3080 membership for $99.99, with 10 percent off for Founders. Check out our membership FAQ for more information.

Dive Into the Cloud with Discord in the 2.0.38 Update

The new GeForce NOW update adds Discord as a convenient new option for creating and logging in to NVIDIA accounts. Members can now use their Discord logins to access their GeForce NOW accounts. That’s one less password to remember.

The 2.0.38 update on GeForce NOW PC and Mac apps also supports Discord’s Rich Presence feature, which lets members easily display the game they’re currently playing in their Discord user status. The feature can be enabled or disabled through the GeForce NOW settings menu.
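
Rich Presence is an open Discord feature, so for the curious, here is a rough sketch of what a status update looks like using the community pypresence library. The application ID and game title are hypothetical, and this is not GeForce NOW’s own integration.

```python
# Illustration of a Discord Rich Presence update via the community
# pypresence library. The client ID and game title are hypothetical;
# this is not GeForce NOW's implementation.
import time
from pypresence import Presence

rpc = Presence("123456789012345678")  # hypothetical Discord app ID
rpc.connect()

rpc.update(
    details="Streaming from the cloud",
    state="Playing Cyberpunk 2077",   # the game shown in the status
    start=int(time.time()),           # session timer
)
time.sleep(60)  # presence persists while the process stays alive
```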

‘World of Warships’ Rewards

GeForce NOW delivers top-level performance and gaming goodies to members playing on the cloud.

Say ‘ahoy’ to these two awesome ships arriving today.

In celebration of the second anniversary of GeForce NOW, GFN Thursdays in February are full of rewards.

Today, members can get rewards for the naval warfare battle game World of Warships. Add two new ships to your fleet this week with the Charleston or the Dreadnought, redeemable for players via the Epic Games Store based on playtime in the game.

Getting membership rewards for streaming games on the cloud is easy. Log in to your NVIDIA account and select “GEFORCE NOW” from the header, scroll down to “REWARDS” and click the “UPDATE REWARDS SETTINGS” button. Check the box in the dialogue window that shows up to start receiving special offers and in-game goodies.

Sign up for the GeForce NOW newsletter, including notifications for when rewards are available, by logging into your NVIDIA account and selecting “PREFERENCES” from the header. Check the “Gaming & Entertainment” box, and “GeForce NOW” under topic preferences, to receive the latest updates.

Ready, Set … Game!

Master your magic skills to become a sorcerer and save an uncharted world from impending disaster in SpellMaster: The Saga.

GFN Thursday always brings in a new batch of games for members to play. Catch the following eight new titles ready to stream this week:

  • SpellMaster: The Saga (New release on Steam)
  • Ashes of the Singularity: Escalation (Steam)
  • Citadel: Forged With Fire (Steam)
  • Galactic Civilizations III (Steam)
  • Haven (Steam)
  • People Playground (Steam)
  • Train Valley 2 (Steam)
  • Valley (Steam)

We make every effort to launch games on GeForce NOW as close to their release as possible, but, in some instances, games may not be available immediately.

Great gaming is calling. Let us know who you’re playing with on Twitter.

Reimagining Modern Luxury: NVIDIA Announces Partnership with Jaguar Land Rover

Jaguar Land Rover and NVIDIA are redefining modern luxury, infusing intelligence into the customer experience.

As part of its Reimagine strategy, Jaguar Land Rover announced today that it will develop its upcoming vehicles on the full-stack NVIDIA DRIVE Hyperion 8 platform, with DRIVE Orin delivering a wide spectrum of active safety, automated driving and parking systems, as well as driver assistance systems built on DRIVE AV software. The system will also deliver AI features inside the vehicle, including driver and occupant monitoring and advanced visualization, leveraging the DRIVE IX software stack.

The iconic maker of modern luxury vehicles and the leader in AI computing will work together to build software-defined features for future Jaguar and Land Rover vehicles, with continuously improving automated driving and intelligent features starting in 2025.

The result will be some of the world’s most desirable vehicles, preserving the design purity of the distinct Jaguar and Land Rover personalities while transforming the experiences for customers at every step of the journey.

These vehicles will be built on a unified computer architecture that delivers software-defined services for ongoing customer value and innovative new business models. The combination of centralized compute and intelligent features upgraded over the air also enhances supply chain management.

The next step in this reimagination of responsible, modern luxury is implementing safe and convenient AI-powered features.

“Our long-term strategic partnership with NVIDIA will unlock a world of potential for our future vehicles as the business continues its transformation into a truly global, digital powerhouse,” said Thierry Bolloré, CEO of Jaguar Land Rover.

End-to-End Intelligence

Future Jaguar and Land Rover vehicles will be developed with NVIDIA AI from end to end.

This development begins in the data center. Engineers from both companies will work together to train, test and validate new automated driving features using NVIDIA data center solutions.

This includes data center hardware, software and workflows needed to develop and validate autonomous driving technology, from raw data collection through validation. NVIDIA DGX supercomputers provide the building blocks required for DNN development and training, while DRIVE Sim enables the necessary validation, replay and testing in simulation to enable a safe autonomous driving experience.

With NVIDIA Omniverse, engineers can collaborate virtually as well as exhaustively test and validate these DNNs with high-fidelity synthetic data generation.

Jaguar Land Rover will deploy this full-stack solution on NVIDIA DRIVE Hyperion — the central nervous system of the vehicle — which features the DRIVE Orin centralized AI compute platform — the car’s brain. DRIVE Hyperion includes the safety, security systems, networking and surrounding sensors used for autonomous driving, parking and intelligent cockpit applications.

The future vehicles will be continuously improved and supported throughout their lifetimes by some of the world’s foremost software and AI engineers at NVIDIA and Jaguar Land Rover.

“Next-generation cars will transform automotive into one of the largest and most advanced technology industries,” said NVIDIA founder and CEO Jensen Huang. “Fleets of software-defined, programmable cars will offer new functionalities and services for the life of the vehicles.”

A Responsible Future

This new intelligent architecture, in addition to the transition to zero emissions powertrains, ensures Jaguar and Land Rover vehicles will not only transform the experience of customers, but also benefit the surrounding environment.

In addition to an all-electric future, the automaker is aiming to achieve net-zero carbon emissions across its supply chain, products and operations by 2039, incorporating sustainability into its long-heralded heritage.

By developing vehicles that are intelligent and backed by the high-performance compute of NVIDIA, Jaguar Land Rover is investing in technology that is safe for all road users, as well as convenient and comfortable.

This attention to responsibility in the new era of modern luxury extends the unique, emotional attachment that Jaguar and Land Rover vehicles inspire for even more decades to come.

The Greatest Podcast Ever Recorded

Is this the best podcast ever recorded? Let’s just say you don’t need a GPU to know that’s a stretch. But it’s pretty great if you’re a fan of tall tales.

And better still if you’re not a fan of stretching the truth at all.

That’s because detecting hyperbole may one day get more manageable, thanks to researchers at the University of Copenhagen working in the growing field of exaggeration detection.

Dustin Wright and Isabelle Augenstein have used NVIDIA GPUs to train an “exaggeration detection system” to identify overenthusiastic claims in health science reporting.

Their work comes as the pandemic has fueled demand for understandable, accurate information. And social media has made health misinformation more widespread.

Their paper leverages “few-shot learning,” a technique that lets developers wring more intelligence out of less data, and a new version of a technique called pattern exploiting training.

Research like Wright and Augenstein’s could one day speed more precise health sciences news to more people.

AI Podcast host Noah Kravitz — whose fishing stories we will never trust again after this episode — spoke with Wright about the work.

Key Points From This Episode

  • Approximately 33% of press releases about scientific papers tend to exaggerate the findings in the papers, which leads to news articles exaggerating the findings of these papers.
  • Wright’s exaggeration detection project aims to provide people like journalists with accurate information to ensure that they report accurately on science.
  • The project, accelerated using an NVIDIA Titan X GPU, uses a novel, multitask-capable version of a technique called Pattern Exploiting Training, which the researchers dubbed MT-PET.
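
For a flavor of the pattern-exploiting idea behind PET (this is not the authors’ MT-PET code), one can wrap a claim in a cloze pattern and let a masked language model score verbalized labels. The pattern wording and the label words below are illustrative assumptions:

```python
# Rough sketch of pattern-exploiting classification in the spirit of
# PET, not the authors' MT-PET implementation. The cloze pattern and
# the label verbalizers are illustrative assumptions.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def exaggeration_scores(claim: str) -> dict:
    # Score how the model completes a judgment about the claim.
    pattern = f"{claim} All in all, this claim is [MASK]."
    preds = fill_mask(pattern, targets=["accurate", "exaggerated"])
    return {p["token_str"]: p["score"] for p in preds}

print(exaggeration_scores("Coffee cures cancer, a new study shows."))
```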

Tweetables:

“Can we leverage patterns that the language model has picked up on from massive language model pretraining, and be able to do classification with any text?” – Dustin Wright [7:28]

“About 33% of the time, press releases will exaggerate the scientific papers and as a result, that means about 33% of news articles exaggerate the findings in scientists’ papers.” – Dustin Wright [9:50]

“This is progress towards a system that could assist, for example, journalists in ensuring that they’re doing accurate reporting on science.” – Dustin Wright [16:20]

You Might Also Like

NVIDIA’s Liila Torabi Talks the New Era of Robotics Through Isaac Sim

Robots aren’t limited to the assembly line. Liila Torabi, senior product manager for Isaac Sim, a robotics and AI simulation platform powered by NVIDIA Omniverse, talks about where the field’s headed.

GANTheftAuto: Harrison Kinsley on AI-Generated Gaming Environments

Humans playing games against machines is nothing new, but now computers can develop their own games for people to play. Programming enthusiast and social media influencer Harrison Kinsley created GANTheftAuto, an AI-based neural network that generates a playable chunk of the classic video game Grand Theft Auto V.

The Driving Force: How Ford Uses AI to Create Diverse Driving Data

The neural networks powering autonomous vehicles require petabytes of driving data to learn how to operate. Nikita Jaipuria and Rohan Bhasin from Ford Motor Company explain how they use generative adversarial networks (GANs) to fill in the gaps of real-world data used in AV training.

Subscribe to the AI Podcast: Now Available on Amazon Music

You can now listen to the AI Podcast through Amazon Music.

You can also get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn.

Make the AI Podcast Better: Have a few minutes to spare? Fill out our listener survey.

Featured image: postcard, copyright expired

Atos Previews Energy-Efficient, AI-Augmented Hybrid Supercomputer

Stepping deeper into the era of exascale AI, Atos gave the first look at its next-generation high-performance computer.

The BullSequana XH3000 combines Atos’ patented fourth-generation liquid-cooled HPC design with NVIDIA technologies to deliver both more performance and energy efficiency.

Giving users a choice of Arm or x86 computing architectures, it will come in versions using NVIDIA Grace, Intel, AMD or SiPearl processors. For accelerated computing, it supports nodes with four NVIDIA Tensor Core GPUs.

The XH3000 also flexibly employs network options including NVIDIA Quantum-2 InfiniBand and NVIDIA ConnectX-7 InfiniBand and Ethernet adapters to scale these powerful computing nodes to HPC systems capable of 10 mixed precision AI exaflops.

Hybrid HPC+AI+Quantum System

The result is a flexible, hybrid computing platform capable of running the most demanding HPC simulations, AI jobs and even emerging workloads in quantum computing.

The BullSequana XH3000 “will no doubt enable, through the gateway of exascale, some of the key scientific and industrial innovation breakthroughs of the future,” said Rodolphe Belmer, CEO of Atos, in a virtual event revealing the system.

With customers in more than 70 countries, Atos is #1 in supercomputing in Europe, India and South America and especially renowned in France, where it maintains its headquarters as well as a manufacturing and R&D base.

A Broad Collaboration

Optimizations for the XH3000 were among the first projects for EXAIL, the joint Excellence AI Lab that Atos and NVIDIA announced in November.

John Josephakis, global vice president of sales and business development for HPC/supercomputing at NVIDIA, congratulated the team behind the system in a video message.

“By combining the well-known expertise Atos has with NVIDIA AI and HPC technologies and work at our joint lab, this platform will allow researchers to get significant insights much faster to grand challenges both in supercomputing and industrial HPC,” he said.

EXAIL’s work spans climate research, healthcare and genomics, quantum computing, edge AI/computer vision and cybersecurity. Its researchers can access application frameworks such as NVIDIA Clara for healthcare and NVIDIA Morpheus for security as well as the NVIDIA cuQuantum SDK for quantum computing and the NVIDIA HPC SDK that runs hundreds of scientific and technical applications.

A Long, Productive Relationship

Atos built one of Europe’s first supercomputers to employ the NVIDIA Ampere architecture, the JUWELS Booster at the Jülich Supercomputing Center. It uses 3,744 NVIDIA A100 Tensor Core GPUs to deliver 2.5 exaflops of mixed-precision AI performance.

To provide a deeper understanding of climate change, Atos and NVIDIA researchers will run AI models on the system, currently ranked No. 8 on the TOP500 list of the world’s fastest supercomputers. Jülich researchers used the system in April to conduct a state-of-the-art quantum circuit simulation.

Last year, Atos led deployment of BerzeLiUs, a system built on the NVIDIA DGX SuperPOD and Sweden’s largest supercomputer. The company also has delivered supercomputing infrastructure in Europe, India and South America based on NVIDIA DGX systems.

Next up, Atos is building Leonardo, a supercomputer at the Italian inter-university consortium CINECA. It will pack 14,000 NVIDIA A100 GPUs on an NVIDIA Quantum InfiniBand network and is expected to become the world’s fastest AI supercomputer, capable of 10 exaflops of mixed-precision AI performance.

With the first glimpse of the BullSequana XH3000, it’s clear there’s much more to come from the collaboration of Atos and NVIDIA.

Peak Performance: Production Studio Sets the Stage for Virtual Opening Ceremony at European Football Championship

At the latest UEFA Champions League Finals, one of the world’s most anticipated annual soccer events, pop stars Marshmello, Khalid and Selena Gomez shared the stage for a dazzling opening ceremony at Portugal’s third-largest football stadium — without ever stepping foot in it.

The stunning video performance took place in a digital twin of the Estádio do Dragão, or Dragon Stadium, rendered by Madrid-based MR Factory, a company that specializes in virtual production.

The studio, which has been at the forefront of using virtual productions for film and television since the 1990s, now brings its virtual sets to life with the help of NVIDIA Studio, RTX GPUs and Omniverse, a real-time collaboration and simulation platform.

MR Factory’s previous projects include Netflix’s popular series Money Heist and Sky Rojo, and feature films like Jeepers Creepers: Reborn.

With NVIDIA RTX technology and real-time rendering, MR Factory can create stunning visuals and 3D models faster than before. And with NVIDIA Omniverse Enterprise, MR Factory enables remote designers and artists to collaborate in one virtual space.

These advanced solutions help the company accelerate design workflows and take virtual productions to the next level.

Images courtesy of MR Factory.

“NVIDIA is powering this new workflow that allows us to improve creative opportunities while reducing production times and costs,” said Óscar Olarte, co-founder and CTO of MR Factory. “Instead of traveling to places like Australia or New York, we can create these scenarios virtually — you go from creating content to creating worlds.”

Setting the Virtual Stage for UEFA Champions 

MR Factory received the UEFA Champions League Finals opening ceremony project when there were heavy restrictions on travel due to the pandemic. The event was initially set to take place at Istanbul’s Ataturk Stadium, the largest sporting arena in Turkey.

MR Factory captured images of the stadium and used them to create a 3D model for the music video. But with the pandemic’s shifting conditions, UEFA changed the location to a stadium in Porto, on Portugal’s coast — with just two weeks until the project’s deadline.

MR Factory had to quickly create another 3D model of the new stadium. The team used NVIDIA technology to achieve this, with real-time rendering tools accelerating their creative workflows. To create the stunning graphics and set up the scenes for virtual production, MR Factory uses leading applications such as Autodesk Arnold, DaVinci Resolve, OctaneRender and Unreal Engine.

“One of the most exciting technologies for us right now is NVIDIA RTX because it allows us to render faster, and in real time,” said Olarte. “We can mix real elements with virtual elements instantly.”

MR Factory also uses camera-tracking technology, which allows it to capture all camera and lens movements on stage. The team uses that footage to combine live elements with the virtual production environment in real time.

Over 80 people across Spain worked on the virtual opening ceremony and, with the help of NVIDIA RTX, the team was able to complete the integrations from scratch, render all the visuals and finish the project in time for the event.

Making Vast Virtual Worlds 

One of MR Factory’s core philosophies is enabling remote work, as this provides the company with more opportunities to hire talent from anywhere. The studio then empowers that talent with the best creative tools.

Additionally, MR Factory has been developing the metaverse as a way to produce films and television scenes. The pandemic accentuated the need for real-time collaboration and interaction between remote teams, and NVIDIA Omniverse Enterprise helps MR Factory achieve this.

With Omniverse Enterprise, MR Factory can drastically reduce production times, since multiple people can work simultaneously on the same project. Instead of completing a scene in a week, five artists can work in Omniverse and have the scene ready in a day, Olarte said.

“For us, virtual production is a way of creating worlds — and from these worlds come video games and movies,” he added. “So we’re building a library of content while we’re producing it, and the library is compatible with NVIDIA Omniverse Enterprise.”

MR Factory uses a render farm with 200 NVIDIA RTX A6000 GPUs, which provide artists with the GPU memory they need to quickly produce stunning work, deliver high-quality virtual productions and render in real time.

MR Factory plans to use Omniverse Enterprise and the render farm on future projects, so they can streamline creative workflows and bring virtual worlds together.

The same tools that MR Factory uses to create the virtual worlds of tomorrow are also available at no cost to millions of individual NVIDIA Studio creators with GeForce RTX and NVIDIA RTX GPUs.

Learn more about NVIDIA RTX, Omniverse and other powerful technologies behind the latest virtual productions by registering for free for GTC, taking place March 21-24.

New Levels Unlocked: Africa’s Game Developers Reach Toward the Next Generation 

Looking for a challenge? Try maneuvering a Kenyan minibus through traffic or dropping seed balls on deforested landscapes.

Or download Africa’s Legends and battle through fiendishly difficult puzzles with Ghana’s Ananse or Nigeria’s Oya by your side.

Games like these are connecting with a hyper-connected African youth population that’s growing fast.

Africa is the youngest region in the world.

Sixty percent of the continent’s population is under 25, and by 2030 the UN predicts the youth population of Africa will increase by 42 percent.

Disposable incomes are rising and high-speed internet connections are proliferating, too.

Africa is projected to have more than 680 million mobile phone users by the end of 2025, driving a surge in the number of gamers.

Across the region, 177 million gamers, or 95 percent of the total, play on mobile devices.

As a result, Africa is the fastest-growing region for mobile game downloads, according to mobile insights firm App Annie.

Kenya’s Usiku Games and Ghana’s Leti Arts are among the new generation of African game developers who are pioneering games that connect with the experiences, challenges and histories of these gamers.

Each is developing mobile games aimed at educating youth on a continent where 41 percent of the population is under 15.

And more are coming: in January, South African startup Carry1st raised $20 million from marquee investors such as Andreessen Horowitz and Google for a mobile game publishing platform targeting the African market.

The timing couldn’t be better. Market research firm Mordor Intelligence expects gaming revenue on the continent to grow at a 12 percent annual rate through 2026 compared to 9.6 percent for the entire world.

A Path for Africa’s Gaming Developers

As creators of Okoa Simba, the first game developed in Kenya to be published globally, Nairobi-based Usiku Games believes it can serve as a role model for future game developers in Africa.

Usiku Games is determined to reach younger audiences with educational messages that are embedded within compelling games and animations.

“Our games directly influence the knowledge and behavior of youth on topics such as gender-based violence, mental health, sexual and reproductive health, education, and peaceful resolution of conflicts,” said Usiku Games founder and CEO Jay Shapiro.

One of its projects includes working in Unreal Engine with NVIDIA technologies to create a 3D game focused on HIV prevention and contraception for teen girls.

“For game developers such as myself, this is about making something that will capture the imagination and inspire vulnerable youth in Africa, and all parts of the world,” said Shapiro, a Toronto native, who has lived in Singapore, New York, Mexico and Cambodia. “I want to create rich, visually compelling stories that impact and serve the next generation.”

Creating Visual Stories With NVIDIA GPUs

As the first gaming studio in Ghana, Leti Arts, founded in 2009, uses NVIDIA GPUs to help build mobile games and digital comics based on African history and folklore.

“Games with African settings made by Africans are the best way to cultivate a sense of cultural authenticity,” said Leti co-founder and CEO Eyram Tawia.

A comic and computer game enthusiast since junior high school, Tawia, a Mandela Washington Fellow in the flagship program of the U.S. government’s Young African Leaders Initiative, wanted to turn the stories he’d heard and drawn as a child into immersive experiences.

“Art and culture contribute just as much to an economy as jobs,” Tawia said. “They help increase a community’s social capital, attracting talent, growth and innovation.”

The nine-person company’s most successful games include Africa’s Legends (2014) and The Hottseat (2019).

The long-term vision for Leti Arts is to make games from Africa for the world. Tawia says the high quality of its games enables gamers to better relate with the games and content being produced.

The continent is home to a growing number of game studios. In addition to Usiku Games and Leti Arts, they include Maliyo Games, Kirro Games, Kayfo Games and others.

More games, and game developers, are coming. Tawia and Leti Arts have worked to mentor talent through internships, boot camps and workshops.

Last year, Leti trained and supported over 30 game developers in partnership with ITTHYK Gaming, in a program sponsored by Microsoft.

Tolo Sagala is the heroine of Leti Arts’ “Africa’s Legends – The Game.”

Expanding the Omniverse

Both Usiku and Leti Arts, which are members of NVIDIA Inception, a global program designed to nurture cutting-edge startups, are also exploring NVIDIA Omniverse for real-time 3D design collaboration, AI-powered animation and game development.

With Africa’s gaming industry worth well over half a billion dollars in 2021, investments are also booming for African gaming startups.

“As Africa’s demand for local and regional gaming content grows, more startups are entering this space,” said Kate Kallot, head of Emerging Areas at NVIDIA.

“Africa’s gaming landscape is punctuated by a growing base of startups and studios who are challenging the norms of traditional games, and their impact is anticipated to reach well beyond the continent itself to other game developers and audiences.”

Learn more about Leti Arts and Usiku Games, among others, by catching up on our GTC session focused on the African gaming industry. 

And check out entrepreneur and Leti Arts founder Eyram Tawia’s book, “Uncompromising Passion: The Humble Beginnings of an African Game Industry” (CreateSpace Independent Publishing Platform, 2016).

Play PC Games on Your Phone With GeForce NOW This GFN Thursday

Who says you have to put your play on pause just because you’re not at your PC?

This GFN Thursday takes a look at how GeForce NOW makes PC gaming possible on Android and iOS mobile devices to support gamers on the go.

This week also comes with sweet in-game rewards for members playing Eternal Return: two unique skins, Military Doctor Cathy and an NVIDIA-exclusive GeForce NOW Silvia.

Plus, there’s the newest content on the cloud with season 12 of Apex Legends, the ‘Joseph: Collapse’ Far Cry 6 DLC and 10 games joining the GeForce NOW library this week.

GTG: Good-to-Go to Game on the Go

Thanks to the power of the cloud, GeForce NOW transforms nearly any Android or iOS mobile device into a powerful gaming rig. For gamers looking for more ways to play from their phones, GeForce NOW offers full or partial controller support for the real PC versions of over 800 games, including 35 free-to-play titles.

Take your games with you when you stream from the cloud to mobile devices.

GeForce NOW levels up gameplay on phones by processing all of your gaming in the cloud and streaming it straight to your device.

Members playing on iOS can get all of the benefits of PC gaming without leaving the Apple ecosystem. They can enjoy top PC picks like Apex Legends or Rocket League for competitive fun or AAA titles like Far Cry 6 or Assassin’s Creed: Valhalla. They can also party up with friends in popular online multiplayer games like Destiny 2 and ARK: Survival Evolved or take on a classic arcade challenge with Overcooked! or Cuphead.

Gamers playing on Android devices can also play these titles from the GeForce NOW library and take their mobile gameplay even further with the perks of the GeForce NOW RTX 3080 membership. RTX 3080 memberships enable PC gaming at 120 frames per second on select 120Hz Android phones, such as the Samsung S21.

RTX 3080 members also stream with the maximized eight-hour session lengths on the service and can turn RTX ON for cinematic graphics and real-time ray tracing in supported games like Ghostrunner and Dying Light 2 Stay Human.

And, mobile gamers on AT&T’s network can take advantage of a special promotion. New or existing customers with a 5G device on a 5G unlimited plan — or another qualifying unlimited plan — can score a six-month GeForce NOW Priority membership at no charge (a $49.99 value). Priority members can stream at up to 60 FPS with priority access to GeForce NOW servers, extended session lengths and RTX ON in supported games. For more info on this special offer, read here.

Gamepads to Support Play

Enjoy gaming on the go, and stay comfortable during long play sessions, by pairing your device with one of GeForce NOW’s recommended gamepads. No extra software is needed.

You’re in control. Play your way on mobile devices with these top picks from GeForce NOW.

Gamers playing on an iPhone can consider the Backbone One as well as the Razer Kishi. Android users can also enjoy the Razer Kishi, or the Razer Raiju and the Razer Junglecat.

Other great options for Android gamers include the Steelseries Stratus Duo and the GLAP controller to provide high-performance gameplay.

For more details and to pick out the best option for your GeForce NOW mobile experience, check out these recommended devices.

Eternal Rewards

GeForce NOW isn’t just about enabling great gaming experiences. It’s also about enriching those experiences for members playing on the cloud.

Look your best as you battle against the rest in Eternal Return with these two skins.

To celebrate the second anniversary of GeForce NOW, GFN Thursdays in February are full of rewards.

Starting today, members can get rewarded in the free-to-play, multiplayer arena game Eternal Return. Redeem your rewards in-game to show off your style with a Military Doctor Cathy skin and represent your love for the cloud with a custom GeForce NOW Silvia skin.

Getting membership rewards for streaming games on the cloud is easy. Log in to your NVIDIA account and select “GEFORCE NOW” from the header, then scroll down to “REWARDS” and click the “UPDATE REWARDS SETTINGS” button. Check the box in the dialogue window that shows up and start receiving special offers and in-game goodies.

Sign up for the GeForce NOW newsletter, including notifications when rewards are available, by logging into your NVIDIA account and selecting “PREFERENCES” from the header. Check the “Gaming & Entertainment” box, and “GeForce NOW” under topic preferences to receive the latest updates.

For more updates on rewards coming in February, stay tuned to upcoming GFN Thursdays.

‘Fortnite’ Mobile Closed Beta Update

Over the past few weeks, multiple waves of invites have been sent to members to join the Fortnite limited time closed beta on iOS Safari and Android — and the feedback gathered thus far has been incredibly helpful.

Some gamers have come across occasional, unexpected in-game framerate dips below 30 FPS when using touch controls on iPhone, iPad and Android devices. It’s most commonly encountered when using the ADS/First Person shooting mode, but has also been observed while spectating, during rapid camera movements and other scenarios.

Throughout the beta we’ll continue to improve the experience — including working on a solution to improve frame rates — and will have more information when an update is available.

In the meantime, we’re looking forward to inviting more gamers into the Fortnite mobile closed beta on GeForce NOW and will continue to optimize and improve the experience.

More Gaming Goodness, Anywhere

This week also brings new content to the cloud, expanding two titles.

Rise to the challenge in Apex Legends Defiance, the newest season of the popular free-to-play battle royale hero shooter. Season 12 comes with a new limited-time mode and battle pass, rewards, map updates to Olympus and the newest Legend, Mad Maggie.

Take things to the next level in Far Cry 6 with the release of the third DLC, ‘Joseph: Collapse,’ telling a new story about the villainous Joseph Seed.

You know kung fu. Play as a young kung fu student on a path of revenge, hunting for the murderers of his family in Sifu.

Plus, GFN Thursday comes with a new batch of games joining the GeForce NOW library, with 10 titles ready to stream this week.

We make every effort to launch games on GeForce NOW as close to their release as possible, but, in some instances, games may not be available immediately.

Finally, due to extended maintenance, Myth of Empires will be removed from the GeForce NOW library.

What device are you playing on this weekend? Let us know on Twitter and check out our setup while you’re at it.
