Performance You Can Feel: Putting GeForce NOW RTX 3080 Membership’s Ultra-Low Latency to the Test This GFN Thursday

GeForce NOW’s RTX 3080 membership is the next generation of cloud gaming. This GFN Thursday looks at one of the tier’s major benefits: ultra-low-latency streaming from the cloud.

This week also brings a new app update that lets members log in via Discord, a members-only World of Warships reward and eight titles joining the GeForce NOW library.

Full Speed Ahead

The GeForce NOW RTX 3080 membership tier is kind of like magic. When members play on underpowered PCs, Macs, Chromebooks, SHIELD TVs, Android devices, iPhones and iPads, they’re streaming the full PC games that they own with all of the benefits of GeForce RTX 3080 GPUs — like ultra-low latency.

A few milliseconds of latency — the time it takes from a keystroke or mouse click to seeing the result on screen — can be the difference between a round-winning moment and a disappointing delay in the game.

The GeForce NOW RTX 3080 membership, powered by GeForce NOW SuperPODs, reduces latency through faster game rendering, more efficient encoding and higher streaming frame rates. Each step helps deliver cloud gaming that rivals many local gaming experiences.

With game rendering on the new SuperPODs, all GeForce NOW RTX 3080 members will feel a discernible reduction in latency. However, GeForce NOW RTX 3080 members playing on the PC, Mac and Android apps will observe the greatest benefits by streaming at up to 120 frames per second.
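
For a rough sense of why higher frame rates help, consider the back-of-the-envelope arithmetic below; frame time is only one contributor to click-to-pixel latency, alongside input capture, encoding, network transit and decoding.

```python
# Back-of-the-envelope only: how per-frame time shrinks as frame rate rises.
# Real click-to-pixel latency also includes input, encode, network and decode time.
for fps in (60, 120):
    print(f"At {fps} fps, each frame takes about {1000 / fps:.1f} ms")
```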

The result? Digital Foundry proclaimed it’s “the best streaming system we’ve played.”

That means whether you’re hitting your shots in Rainbow Six Siege, using a game-changing ability in Apex Legends or even sweating out a fast-paced shooter like Counter-Strike: Global Offensive, it’s all streaming in real time on GeForce NOW.

But don’t just take our word for it. We asked a few cloud gaming experts to put the GeForce NOW RTX 3080 membership to the test. Cloud Gaming Xtreme looked at how close RTX 3080 click-to-pixel latency is to their local PC, while GameTechPlanet called it “the one to beat when it comes to cloud gaming, for picture quality, graphics, and input latency.”

And Virtual Cloud noted that the latency “really shocked me just how good it felt,” and that “when swapping from my local PC to play on GeForce NOW, I really couldn’t tell a difference at all.”

Live in the Future

On top of this, the membership comes with the longest gaming session length on GeForce NOW — clocking in at eight glorious hours. It also gives members full control to customize in-game graphics settings and turn RTX ON, rendering environments in cinematic quality for supported games.

Cyberpunk 2077 on GeForce NOW
Explore Night City with RTX ON, all streaming from the cloud with RTX 3080-class performance.

That means RTX 3080 members can experience titles like Cyberpunk 2077 with maximized, uninterrupted playtime at beautiful, immersive cinematic quality. Gamers can also enjoy the title’s new update – Patch 1.5 – released earlier this week, adding ray-traced shadows from local lights and introducing a variety of improvements and quality-of-life changes across the board, along with fresh pieces of free additional content.

With improved Fixer and Gig gameplay, enhanced UI and crowd reaction AI, map and open world tweaks, new narrative interactions, weapons, gear, customization options and more, now is the best time to stream Cyberpunk 2077 with RTX ON.

Ready to play? Start gaming today with a six-month GeForce NOW RTX 3080 membership for $99.99, with 10 percent off for Founders. Check out our membership FAQ for more information.

Dive Into the Cloud with Discord in the 2.0.38 Update

The new GeForce NOW update adds Discord as a convenient new option for creating and logging in to NVIDIA accounts, so members can use their Discord credentials to access GeForce NOW. That’s one less password to remember.

The 2.0.38 update on GeForce NOW PC and Mac apps also supports Discord’s Rich Presence feature, which lets members easily display the game they’re currently playing in their Discord user status. The feature can be enabled or disabled through the GeForce NOW settings menu.
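
For the curious, here’s a rough idea of how Rich Presence is typically driven from a desktop app, sketched with the third-party pypresence library. The application ID and status strings are placeholders, and this is purely illustrative; it is not how the GeForce NOW app itself is implemented.

```python
# Illustrative only: publishing a "now playing" status over Discord Rich
# Presence using the third-party pypresence library. The application ID and
# strings below are placeholders, not GeForce NOW internals.
import time
from pypresence import Presence

rpc = Presence("000000000000000000")  # placeholder Discord application ID
rpc.connect()

# Show the game currently being streamed in the user's Discord status.
rpc.update(details="Streaming on GeForce NOW", state="Cyberpunk 2077")
time.sleep(15)  # Discord throttles presence updates to roughly one per 15 seconds
rpc.clear()
rpc.close()
```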

‘World of Warships’ Rewards

GeForce NOW delivers top-level performance and gaming goodies to members playing on the cloud.

World of Warships Rewards on GeForce NOW
Say ‘ahoy’ to these two awesome ships arriving today.

In celebration of the second anniversary of GeForce NOW, GFN Thursdays in February are full of rewards.

Today, members can get rewards for the naval warfare game World of Warships. Add two new ships to your fleet this week with the Charleston or the Dreadnought, redeemable via the Epic Games Store based on playtime in the game.

Getting membership rewards for streaming games on the cloud is easy. Log in to your NVIDIA account and select “GEFORCE NOW” from the header, scroll down to “REWARDS” and click the “UPDATE REWARDS SETTINGS” button. Check the box in the dialogue window that shows up to start receiving special offers and in-game goodies.

Sign up for the GeForce NOW newsletter, including notifications for when rewards are available, by logging into your NVIDIA account and selecting “PREFERENCES” from the header. Check the “Gaming & Entertainment” box, and “GeForce NOW” under topic preferences, to receive the latest updates.

Ready, Set … Game!

SpellMaster the Saga on GeForce NOW
Master your magic skills to become a sorcerer and save an uncharted world from impending disaster in SpellMaster: The Saga.

GFN Thursday always brings in a new batch of games for members to play. Catch the following eight new titles ready to stream this week:

  • SpellMaster: The Saga (New release on Steam)
  • Ashes of the Singularity: Escalation (Steam)
  • Citadel: Forged With Fire (Steam)
  • Galactic Civilizations III (Steam)
  • Haven (Steam)
  • People Playground (Steam)
  • Train Valley 2 (Steam)
  • Valley (Steam)

We make every effort to launch games on GeForce NOW as close to their release as possible, but, in some instances, games may not be available immediately.

Great gaming is calling. Let us know who you’re playing with on Twitter.


Reimagining Modern Luxury: NVIDIA Announces Partnership with Jaguar Land Rover

Jaguar Land Rover and NVIDIA are redefining modern luxury, infusing intelligence into the customer experience.

As part of its Reimagine strategy, Jaguar Land Rover announced today that it will develop its upcoming vehicles on the full-stack NVIDIA DRIVE Hyperion 8 platform, with DRIVE Orin delivering a wide spectrum of active safety, automated driving and parking systems, as well as driver assistance systems built on DRIVE AV software. The system will also deliver AI features inside the vehicle, including driver and occupant monitoring and advanced visualization, leveraging the DRIVE IX software stack.

The iconic maker of modern luxury vehicles and the leader in AI computing will work together to build software-defined features for future Jaguar and Land Rover vehicles, with continuously improving automated driving and intelligent features starting in 2025.

The result will be some of the world’s most desirable vehicles, preserving the design purity of the distinct Jaguar and Land Rover personalities while transforming the experiences for customers at every step of the journey.

These vehicles will be built on a unified computer architecture that delivers software-defined services for ongoing customer value and innovative new business models. The combination of centralized compute and intelligent features upgraded over the air also enhances supply chain management.

The next step in this reimagination of responsible, modern luxury is implementing safe and convenient AI-powered features.

“Our long-term strategic partnership with NVIDIA will unlock a world of potential for our future vehicles as the business continues its transformation into a truly global, digital powerhouse,” said Thierry Bolloré, CEO of Jaguar Land Rover.

End-to-End Intelligence

Future Jaguar and Land Rover vehicles will be developed with NVIDIA AI from end to end.

This development begins in the data center. Engineers from both companies will work together to train, test and validate new automated driving features using NVIDIA data center solutions.

This includes data center hardware, software and workflows needed to develop and validate autonomous driving technology, from raw data collection through validation. NVIDIA DGX supercomputers provide the building blocks required for DNN development and training, while DRIVE Sim enables the necessary validation, replay and testing in simulation to enable a safe autonomous driving experience.

With NVIDIA Omniverse, engineers can collaborate virtually as well as exhaustively test and validate these DNNs with high-fidelity synthetic data generation.

Jaguar Land Rover will deploy this full-stack solution on NVIDIA DRIVE Hyperion — the central nervous system of the vehicle — which features the DRIVE Orin centralized AI compute platform — the car’s brain. DRIVE Hyperion includes the safety, security systems, networking and surrounding sensors used for autonomous driving, parking and intelligent cockpit applications.

NVIDIA DRIVE Orin

The future vehicles will be continuously improved and supported throughout their lifetimes by some of the world’s foremost software and AI engineers at NVIDIA and Jaguar Land Rover.

“Next-generation cars will transform automotive into one of the largest and most advanced technology industries,” said NVIDIA founder and CEO Jensen Huang. “Fleets of software-defined, programmable cars will offer new functionalities and services for the life of the vehicles.”

A Responsible Future

This new intelligent architecture, in addition to the transition to zero emissions powertrains, ensures Jaguar and Land Rover vehicles will not only transform the experience of customers, but also benefit the surrounding environment.

In addition to an all-electric future, the automaker is aiming to achieve net-zero carbon emissions across its supply chain, products and operations by 2039, incorporating sustainability into its long-heralded heritage.

By developing vehicles that are intelligent and backed by the high-performance compute of NVIDIA, Jaguar Land Rover is investing in technology that is safe for all road users, as well as convenient and comfortable.

This attention to responsibility in the new era of modern luxury extends the unique, emotional attachment that Jaguar and Land Rover vehicles inspire for even more decades to come.


The Greatest Podcast Ever Recorded

Is this the best podcast ever recorded? Let’s just say you don’t need a GPU to know that’s a stretch. But it’s pretty great if you’re a fan of tall tales.

And better still if you’re not a fan of stretching the truth at all.

That’s because detecting hyperbole may one day get more manageable, thanks to researchers at the University of Copenhagen working in the growing field of exaggeration detection.

Dustin Wright and Isabelle Augenstein have used NVIDIA GPUs to train an “exaggeration detection system” to identify overenthusiastic claims in health science reporting.

Their work comes as the pandemic has fueled demand for understandable, accurate information. And social media has made health misinformation more widespread.

Their paper leverages “few-shot learning,” a technique that lets developers wring more intelligence out of less data, and a new version of a technique called pattern exploiting training.
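
To give a flavor of the cloze-style idea behind pattern exploiting training, here is a minimal zero-shot sketch using a generic Hugging Face masked language model. The pattern, verbalizer words and labels are illustrative assumptions, not the MT-PET setup from the paper, and real PET would go on to fine-tune with a handful of labeled examples.

```python
# Minimal sketch of cloze-style classification: wrap a claim in a pattern
# containing a mask token, then compare the masked LM's scores for two
# "verbalizer" words. Pattern, verbalizer and labels are illustrative only.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

name = "bert-base-uncased"  # assumption: any masked language model works for the sketch
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForMaskedLM.from_pretrained(name)

verbalizer = {"exaggerated": "yes", "faithful": "no"}

def classify(claim: str) -> str:
    prompt = f"{claim} Question: Is this claim overstated? Answer: {tokenizer.mask_token}."
    inputs = tokenizer(prompt, return_tensors="pt")
    mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]
    with torch.no_grad():
        logits = model(**inputs).logits[0, mask_pos]
    scores = {label: logits[tokenizer.convert_tokens_to_ids(word)].item()
              for label, word in verbalizer.items()}
    return max(scores, key=scores.get)

print(classify("Drinking coffee cures all known diseases, a new study suggests."))
```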

Research like Wright and Augenstein’s could one day speed more precise health sciences news to more people.

AI Podcast host Noah Kravitz — whose fishing stories we will never trust again after this episode — spoke with Wright about the work.

Key Points From This Episode

  • Approximately 33 percent of press releases about scientific papers exaggerate the papers’ findings, which leads to news articles exaggerating those findings in turn.
  • Wright’s exaggeration detection project aims to provide people like journalists with accurate information to ensure that they report accurately on science.
  • The project, accelerated using an NVIDIA Titan X GPU, uses a novel, multitask-capable version of a technique called Pattern Exploiting Training, which they dubbed MT-PET.

Tweetables:

“Can we leverage patterns that the language model has picked up on from masked language model pretraining, and be able to do classification with any text?” – Dustin Wright [7:28]

“About 33% of the time, press releases will exaggerate the scientific papers and as a result, that means about 33% of news articles exaggerate the findings in scientists’ papers.” – Dustin Wright [9:50]

“This is progress towards a system that could assist, for example, journalists in ensuring that they’re doing accurate reporting on science.” – Dustin Wright [16:20]

You Might Also Like

NVIDIA’s Liila Torabi Talks the New Era of Robotics Through Isaac Sim

Robots aren’t limited to the assembly line. Liila Torabi, senior product manager for Isaac Sim, a robotics and AI simulation platform powered by NVIDIA Omniverse, talks about where the field’s headed.

GANTheftAuto: Harrison Kinsley on AI-Generated Gaming Environments

Humans playing games against machines is nothing new, but now computers can develop their own games for people to play. Programming enthusiast and social media influencer Harrison Kinsley created GANTheftAuto, an AI-based neural network that generates a playable chunk of the classic video game Grand Theft Auto V.

The Driving Force: How Ford Uses AI to Create Diverse Driving Data

The neural networks powering autonomous vehicles require petabytes of driving data to learn how to operate. Nikita Jaipuria and Rohan Bhasin from Ford Motor Company explain how they use generative adversarial networks (GANs) to fill in the gaps of real-world data used in AV training.

Subscribe to the AI Podcast: Now Available on Amazon Music

You can now listen to the AI Podcast through Amazon Music.

You can also get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn.

Make the AI Podcast Better: Have a few minutes to spare? Fill out our listener survey

 

 

Featured image: postcard, copyright expired


Atos Previews Energy-Efficient, AI-Augmented Hybrid Supercomputer

Stepping deeper into the era of exascale AI, Atos gave the first look at its next-generation high-performance computer.

The BullSequana XH3000 combines Atos’ patented fourth-generation liquid-cooled HPC design with NVIDIA technologies to deliver both more performance and energy efficiency.

Giving users a choice of Arm or x86 computing architectures, it will come in versions using NVIDIA Grace, Intel, AMD or SiPearl processors. For accelerated computing, it supports nodes with four NVIDIA Tensor Core GPUs.

The XH3000 also flexibly employs network options including NVIDIA Quantum-2 InfiniBand and NVIDIA ConnectX-7 InfiniBand and Ethernet adapters to scale these powerful computing nodes to HPC systems capable of 10 mixed precision AI exaflops.

Hybrid HPC+AI+Quantum System

The result is a flexible, hybrid computing platform capable of running the most demanding HPC simulations, AI jobs and even emerging workloads in quantum computing.

The BullSequana XH3000 “will no doubt enable, through the gateway of exascale, some of the key scientific and industrial innovation breakthroughs of the future,” said Rodolphe Belmer, CEO of Atos, in a virtual event revealing the system.

With customers in more than 70 countries, Atos is #1 in supercomputing in Europe, India and South America and especially renowned in France, where it maintains its headquarters as well as a manufacturing and R&D base.

A Broad Collaboration

Optimizations for the XH3000 were among the first projects for EXAIL, the joint Excellence AI Lab that Atos and NVIDIA announced in November.

John Josephakis, global vice president of sales and business development for HPC/supercomputing at NVIDIA, congratulated the team behind the system in a video message.

“By combining the well-known expertise Atos has with NVIDIA AI and HPC technologies and work at our joint lab, this platform will allow researchers to get significant insights much faster to grand challenges both in supercomputing and industrial HPC,” he said.

EXAIL’s work spans climate research, healthcare and genomics, quantum computing, edge AI/computer vision and cybersecurity. Its researchers can access application frameworks such as NVIDIA Clara for healthcare and NVIDIA Morpheus for security as well as the NVIDIA cuQuantum SDK for quantum computing and the NVIDIA HPC SDK that runs hundreds of scientific and technical applications.

A Long, Productive Relationship

Atos built one of Europe’s first supercomputers to employ the NVIDIA Ampere architecture, the JUWELS Booster at the Jülich Supercomputing Center. It uses 3,744 NVIDIA A100 Tensor Core GPUs to deliver 2.5 exaflops of mixed-precision AI performance.

To provide a deeper understanding of climate change, Atos and NVIDIA researchers will run AI models on the system, currently ranked No. 8 on the TOP500 list of the world’s fastest supercomputers. Jülich researchers used the system in April to conduct a state-of-the-art quantum circuit simulation.

Last year, Atos led deployment of BerzeLiUs, a system built on the NVIDIA DGX SuperPOD and Sweden’s largest supercomputer. The company also has delivered supercomputing infrastructure in Europe, India and South America based on NVIDIA DGX systems.

Next up, Atos is building Leonardo, a supercomputer at the Italian inter-university consortium CINECA. It will pack 14,000 NVIDIA A100 GPUs on an NVIDIA Quantum InfiniBand network and is expected to become the world’s fastest AI supercomputer, capable of 10 exaflops of mixed-precision AI performance.

With the first glimpse of the BullSequana XH3000, it’s clear there’s much more to come from the collaboration of Atos and NVIDIA.


Peak Performance: Production Studio Sets the Stage for Virtual Opening Ceremony at European Football Championship

At the latest UEFA Champions League Finals, one of the world’s most anticipated annual soccer events, pop stars Marshmello, Khalid and Selena Gomez shared the stage for a dazzling opening ceremony at Portugal’s third-largest football stadium — without ever stepping foot in it.

The stunning video performance took place in a digital twin of the Estádio do Dragão, or Dragon Stadium, rendered by Madrid-based MR Factory, a company that specializes in virtual production.

The studio, which has been at the forefront of using virtual productions for film and television since the 1990s, now brings its virtual sets to life with the help of NVIDIA Studio, RTX GPUs and Omniverse, a real-time collaboration and simulation platform.

MR Factory’s previous projects include Netflix’s popular series Money Heist and Sky Rojo, and feature films like Jeepers Creepers: Reborn.

With NVIDIA RTX technology and real-time rendering, MR Factory can create stunning visuals and 3D models faster than before. And with NVIDIA Omniverse Enterprise, MR Factory enables remote designers and artists to collaborate in one virtual space.

These advanced solutions help the company accelerate design workflows and take virtual productions to the next level.

Images courtesy of MR Factory.

“NVIDIA is powering this new workflow that allows us to improve creative opportunities while reducing production times and costs,” said Óscar Olarte, co-founder and CTO of MR Factory. “Instead of traveling to places like Australia or New York, we can create these scenarios virtually — you go from creating content to creating worlds.”

Setting the Virtual Stage for the UEFA Champions League Finals

MR Factory received the UEFA Champions League Finals opening ceremony project when there were heavy restrictions on travel due to the pandemic. The event was initially set to take place at Istanbul’s Ataturk Stadium, the largest sporting arena in Turkey.

MR Factory captured images of the stadium and used them to create a 3D model for the music video. But with the pandemic’s shifting conditions, UEFA changed the location to a stadium in Porto, on Portugal’s coast — with just two weeks until the project’s deadline.

MR Factory had to quickly create another 3D model of the new stadium. The team used NVIDIA technology to achieve this, with real-time rendering tools accelerating their creative workflows. To create the stunning graphics and set up the scenes for virtual production, MR Factory uses leading applications such as Autodesk Arnold, DaVinci Resolve, OctaneRender and Unreal Engine.

“One of the most exciting technologies for us right now is NVIDIA RTX because it allows us to render faster, and in real time,” said Olarte. “We can mix real elements with virtual elements instantly.”

MR Factory also uses camera-tracking technology, which captures all camera and lens movements on stage. The team then uses that tracking data to combine live elements with the virtual production environment in real time.

Over 80 people across Spain worked on the virtual opening ceremony and, with the help of NVIDIA RTX, the team was able to complete the integrations from scratch, render all the visuals and finish the project in time for the event.

Making Vast Virtual Worlds 

One of MR Factory’s core philosophies is enabling remote work, as this provides the company with more opportunities to hire talent from anywhere. The studio then empowers that talent with the best creative tools.

Additionally, MR Factory has been developing the metaverse as a way to produce films and television scenes. The pandemic accentuated the need for real-time collaboration and interaction between remote teams, and NVIDIA Omniverse Enterprise helps MR Factory achieve this.

With Omniverse Enterprise, MR Factory can drastically reduce production times, since multiple people can work simultaneously on the same project. Instead of completing a scene in a week, five artists can work in Omniverse and have the scene ready in a day, Olarte said.

“For us, virtual production is a way of creating worlds — and from these worlds come video games and movies,” he added. “So we’re building a library of content while we’re producing it, and the library is compatible with NVIDIA Omniverse Enterprise.”

MR Factory uses a render farm with 200 NVIDIA RTX A6000 GPUs, which provide artists with the GPU memory they need to quickly produce stunning work, deliver high-quality virtual productions and render in real time.

MR Factory plans to use Omniverse Enterprise and the render farm on future projects, so they can streamline creative workflows and bring virtual worlds together.

The same tools that MR Factory uses to create the virtual worlds of tomorrow are also available at no cost to millions of individual NVIDIA Studio creators with GeForce RTX and NVIDIA RTX GPUs.

Learn more about NVIDIA RTX, Omniverse and other powerful technologies behind the latest virtual productions by registering for free for GTC, taking place March 21-24.


New Levels Unlocked: Africa’s Game Developers Reach Toward the Next Generation 

Looking for a challenge? Try maneuvering a Kenyan minibus through traffic or dropping seed balls on deforested landscapes.

Or download Africa’s Legends and battle through fiendishly difficult puzzles with Ghana’s Ananse or Nigeria’s Oya by your side.

Games like these are connecting with a hyper-connected African youth population that’s growing fast.

Africa is the youngest region in the world.

Sixty percent of the continent’s population is under 25, and the UN predicts Africa’s youth population will increase by 42 percent by 2030.

Disposable incomes are rising and high-speed internet connections are proliferating, too.

Africa is projected to have more than 680 million mobile phone users by the end of 2025, driving a surge in the number of gamers.

Across the region, 177 million gamers, or 95 percent, use mobile devices.

As a result, Africa is the fastest-growing region for mobile game downloads, according to mobile insights firm App Annie.

Kenya’s Usiku Games and Ghana’s Leti Arts are among the new generation of African game developers who are pioneering games that connect with the experiences, challenges and histories of these gamers.

Each is developing mobile games aimed at educating youth on a continent where 41 percent of the population is under 15.

And more are coming: in January, South African startup Carry1st raised $20 million from marquee investors such as Andreessen Horowitz and Google for a mobile game publishing platform targeting the African market.

The timing couldn’t be better. Market research firm Mordor Intelligence expects gaming revenue on the continent to grow at a 12 percent annual rate through 2026 compared to 9.6 percent for the entire world.

A Path for Africa’s Gaming Developers

As creators of Okoa Simba, the first game developed in Kenya to be published globally, Nairobi-based Usiku Games believes it can serve as a role model for future game developers in Africa.

Usiku Games is determined to reach younger audiences with educational messages that are embedded within compelling games and animations.

“Our games directly influence the knowledge and behavior of youth on topics such as gender-based violence, mental health, sexual and reproductive health, education, and peaceful resolution of conflicts,” said Usiku Games founder and CEO Jay Shapiro.

One of its projects includes working in Unreal Engine with NVIDIA technologies to create a 3D game focused on HIV prevention and contraception for teen girls.

“For game developers such as myself, this is about making something that will capture the imagination and inspire vulnerable youth in Africa, and all parts of the world,” said Shapiro, a Toronto native, who has lived in Singapore, New York, Mexico and Cambodia. “I want to create rich, visually compelling stories that impact and serve the next generation.”

Creating Visual Stories With NVIDIA GPUs

As the first gaming studio in Ghana, Leti Arts, founded in 2009, uses NVIDIA GPUs to help build mobile games and digital comics based on African history and folklore.

“Games with African settings made by Africans are the best way to cultivate a sense of cultural authenticity,” said Leti co-founder and CEO Eyram Tawia.

A comic and computer game enthusiast since junior high school, Tawia, a Mandela Washington Fellow under the flagship program of the U.S. Government’s Young African Leaders Initiative, wanted to turn the stories he’d heard and drawn as a child into immersive experiences.

“Art and culture contribute just as much to an economy as jobs,” Tawia said. “They help increase a community’s social capital, attracting talent, growth and innovation.”

The nine-person company’s most successful games include Africa’s Legends (2014) and The Hottseat (2019).

The long-term vision for Leti Arts is to make games from Africa for the world. Tawia says the high quality of its games helps players better relate to the games and content being produced.

The continent is home to a growing number of game studios. In addition to Usiku Games and Leti Arts, they include Maliyo Games, Kirro Games, Kayfo Games and others.

More games, and game developers, are coming. Tawia and Leti Arts have worked to mentor talent through internships, boot camps and workshops.

Last year Leti trained and supported over 30 game developers in partnership with ITTHYK Gaming and sponsored by Microsoft.

Tolo Sagala is the heroine of Leti Arts’ “Africa’s Legends – The Game.”

Expanding the Omniverse

Both Usiku and Leti Arts, which are members of NVIDIA Inception, a global program designed to nurture cutting-edge startups, are also exploring NVIDIA Omniverse for real-time 3D design collaboration, AI-powered animation and game development.

With Africa’s gaming industry worth well over half a billion dollars in 2021, investments are also booming for African gaming startups.

“As Africa’s demand for local and regional gaming content grows, more startups are entering this space,” said Kate Kallot, head of Emerging Areas at NVIDIA.

“Africa’s gaming landscape is punctuated by a growing base of startups and studios who are challenging the norms of traditional games, and their impact is anticipated to reach well beyond the continent itself to other game developers and audiences.”

Learn more about Leti Arts and Usiku Games, among others, by catching up on our GTC session focused on the African gaming industry. 

And check out entrepreneur and Leti Arts founder Eyram Tawia’s book, “Uncompromising Passion: The Humble Beginnings of an African Game Industry” (CreateSpace Independent Publishing Platform, 2016).


Play PC Games on Your Phone With GeForce NOW This GFN Thursday

Who says you have to put your play on pause just because you’re not at your PC?

This GFN Thursday takes a look at how GeForce NOW makes PC gaming possible on Android and iOS mobile devices to support gamers on the go.

This week also comes with sweet in-game rewards for members playing Eternal Return, with two unique skins, including Military Doctor Cathy and an NVIDIA exclusive GeForce NOW Silvia.

Plus, there’s the newest content on the cloud with season 12 of Apex Legends, the ‘Joseph: Collapse’ Far Cry 6 DLC and 10 games joining the GeForce NOW library this week.

GTG: Good-to-Go to Game on the Go

Thanks to the power of the cloud, GeForce NOW transforms nearly any Android or iOS mobile device into a powerful gaming rig. With gamers looking for more ways to play from their phones, GeForce NOW has full or partial controller support when connected to a phone for the real PC versions of over 800 games, including 35 free-to-play titles.

Mobile gaming on GeForce NOW
Take your games with you when you stream from the cloud to mobile devices.

GeForce NOW levels up gameplay on phones by processing all of your gaming in the cloud and streaming it straight to your device.

Members playing on iOS can get all of the benefits of PC gaming without leaving the Apple ecosystem. They can enjoy top PC picks like Apex Legends or Rocket League for competitive fun or AAA titles like Far Cry 6 or Assassin’s Creed: Valhalla. They can also party up with friends in popular online multiplayer games like Destiny 2 and ARK: Survival Evolved or take on a classic arcade challenge with Overcooked! or Cuphead.

Gamers playing on Android devices can also play these titles from the GeForce NOW library and take their mobile gameplay even further with the perks of the GeForce NOW RTX 3080 membership. RTX 3080 memberships enable PC gaming at 120 frames per second on select 120Hz Android phones, such as the Samsung S21.

RTX 3080 members also stream with the maximized eight-hour session lengths on the service and can turn RTX ON for cinematic graphics and real-time ray tracing in supported games like Ghostrunner and Dying Light 2 Stay Human.

And, mobile gamers on AT&T’s network can take advantage of a special promotion. New or existing customers with a 5G device on a 5G unlimited plan — or another qualifying unlimited plan — can score a six-month GeForce NOW Priority membership at no charge (a $49.99 value). Priority members can stream at up to 60 FPS with priority access to GeForce NOW servers, extended session lengths and RTX ON in supported games. For more info on this special offer, read here.

Gamepads to Support Play

Improve your cloud gaming experience and stay comfortable during long play sessions by pairing your device with one of GeForce NOW’s recommended gamepads. No extra software is needed.

Destiny 2 Witch Queen Mobile on GeForce NOW
You’re in control. Play your way on mobile devices with these top picks from GeForce NOW.

Gamers playing on an iPhone can consider the Backbone One as well as the Razer Kishi. Android users can also enjoy the Razer Kishi, or the Razer Raiju and the Razer Junglecat.

Other great options for Android gamers include the SteelSeries Stratus Duo and the GLAP controller, which provide high-performance gameplay.

For more details and to pick out the best option for your GeForce NOW mobile experience, check out these recommended devices.

Eternal Rewards

GeForce NOW isn’t just about enabling great gaming experiences. It’s also about enriching those experiences for members playing on the cloud.

Eternal Return Rewards on GeForce NOW
Look your best as you battle against the rest in Eternal Return with these two skins.

To celebrate the second anniversary of GeForce NOW, GFN Thursdays in February are full of rewards.

Starting today, members can get rewarded in the free-to-play, multiplayer arena game Eternal Return. Redeem your rewards in-game to show off your style with a Military Doctor Cathy skin and represent your love for the cloud with a custom GeForce NOW Silvia skin.

Getting membership rewards for streaming games on the cloud is easy. Log in to your NVIDIA account and select “GEFORCE NOW” from the header, then scroll down to “REWARDS” and click the “UPDATE REWARDS SETTINGS” button. Check the box in the dialogue window that shows up and start receiving special offers and in-game goodies.

Sign up for the GeForce NOW newsletter, including notifications when rewards are available, by logging into your NVIDIA account and selecting “PREFERENCES” from the header. Check the “Gaming & Entertainment” box, and “GeForce NOW” under topic preferences to receive the latest updates.

For more updates on rewards coming in February, stay tuned to upcoming GFN Thursdays.

‘Fortnite’ Mobile Closed Beta Update

Over the past few weeks, multiple waves of invites have been sent to members to join the Fortnite limited time closed beta on iOS Safari and Android — and the feedback gathered thus far has been incredibly helpful.

Some gamers have come across occasional, unexpected in-game framerate dips below 30 FPS when using touch controls on iPhone, iPad and Android devices. It’s most commonly encountered when using the ADS/First Person shooting mode, but has also been observed while spectating, during rapid camera movements and other scenarios.

Throughout the beta we’ll continue to improve the experience — including working on a solution to improve frame rates — and will have more information when an update is available.

In the meantime, we’re looking forward to inviting more gamers into the Fortnite mobile closed beta on GeForce NOW and will continue to optimize and improve the experience.

More Gaming Goodness, Anywhere

This week also brings new content to the cloud, expanding two titles.

Rise to the challenge in Apex Legends Defiance, the newest season of the popular free-to-play battle royale hero shooter. Season 12 comes with a new limited-time mode and battle pass, rewards, map updates to Olympus and the newest Legend, Mad Maggie.

Take things to the next level in Far Cry 6 with the release of the third DLC, ‘Joseph: Collapse,’ telling a new story about the villainous Joseph Seed.

Sifu on GeForce NOW
You know kung fu. Play as a young kung fu student on a path of revenge, hunting for the murderers of his family in Sifu.

Plus, GFN Thursday comes with a new batch of games joining the GeForce NOW library. Check out these 10 titles ready to stream this week:

We make every effort to launch games on GeForce NOW as close to their release as possible, but, in some instances, games may not be available immediately.

Finally, due to extended maintenance, Myth of Empires will be removed from the GeForce NOW library.

What device are you playing on this weekend? Let us know on Twitter and check out our setup while you’re at it.


Startup Taps Finance Micromodels for Data Annotation Automation

After meeting at an entrepreneur matchmaking event, Ulrik Hansen and Eric Landau teamed up to parlay their experience in financial trading systems into a platform for faster data labeling.

In 2020, the pair of finance industry veterans founded Encord to adapt micromodels typical in finance to automated data annotation. Micromodels are neural networks that require less time to deploy because they’re trained on less data and used for specific tasks.
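
As a purely illustrative sketch of the idea (the architecture, sizes and task below are assumptions, not Encord’s actual models), a micromodel can be as small as a few convolutional layers trained for one narrow labeling task:

```python
# Illustrative micromodel sketch: a tiny, task-specific classifier that is
# quick to train on a small labeled set. Sizes and task are assumptions.
import torch
import torch.nn as nn

class FrameMicroModel(nn.Module):
    """Small binary classifier for a single, narrow annotation task."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = FrameMicroModel()
frame = torch.randn(1, 3, 224, 224)        # one video frame
print(torch.sigmoid(model(frame)).item())  # probability the frame gets this label
```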

Encord’s NVIDIA GPU-driven service promises to automate as much as 99 percent of businesses’ manual data labeling with its micromodels.

“Instead of building one big model that does everything, we’re just combining a lot of smaller models together, and that’s very similar to how a lot of these trading systems work,” said Landau.

The startup, based in London, recently landed $12.5 million in Series A funding.

Encord is an NVIDIA Metropolis partner and a member of NVIDIA Inception, a program that offers go-to-market support, expertise and technology for AI, data science and HPC startups. NVIDIA Metropolis is an application framework that makes it easier for developers to combine video cameras and sensors with AI-enabled video analytics.

The company said it has attracted business in gastrointestinal endoscopy, radiology, thermal imaging, smart cities, agriculture, autonomous transportation and retail applications.

‘Augmenting Doctors’ for SurgEase

Back in 2021, the partners hunkered down near Laguna Beach, Calif., at the home of Landau’s parents, to build Encord while attending Y Combinator. And they had also just landed a first customer, SurgEase.

London-based SurgEase offers telepresence technology for gastroenterology. The company’s hardware device and software enable remote physicians to monitor high-definition images and video captured in colonoscopies.

“You could have a doctor in an emerging economy do the diagnostics or detection, as well as a doctor from one of the very best hospitals in the U.S.,” said Hansen.

To improve diagnostics, SurgEase is also applying video data to training AI models for detection. Encord’s micromodels are being applied to annotate the video data that’s used for SurgEase’s models. The idea is to give doctors a second set of eyes on procedures.

“Encord’s software has been instrumental in aiding us in solving some of the hardest problems in endoscopic disease assessment,” said SurgEase CEO Fareed Iqbal.

With AI-aided diagnostics, clinicians using SurgEase might spot more things sooner so that people don’t need more severe procedures down the line, said Hansen. Doctors also don’t always agree, so it can help cut through the noise with another opinion, said Landau.

“It’s really augmenting doctors,” said Landau.

King’s College London: 6x Faster

King’s College London needed a way to annotate images in precancerous polyp videos. It turned to Encord for annotation automation because having highly skilled clinicians label such large datasets was costly.

The result was that the micromodels annotated about 6.4x faster than manual labeling and handled about 97 percent of the datasets automatically, with the rest requiring manual labeling from clinicians.

Encord enabled King’s College London to cut model development time from one year to two months, moving AI into production faster.

Triton: Quickly Into Inference

Encord was initially setting out to build its own inference engine, running on its API server. But Hansen and Landau decided using NVIDIA Triton would save a lot of engineering time and get them quickly into production.

Triton offers open-source software for taking AI into production by simplifying how models run in any framework and on any GPU or CPU for all inference types.

Also, it allowed them to focus on their early customers by not having to build inference engine architecture themselves.

People using Encord’s platform can train a micromodel and run inference very soon after that, enabled by Triton, Hansen said.
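
As a rough sketch of what such an inference call can look like with Triton’s Python HTTP client (the model name, tensor names and shapes below are hypothetical placeholders, not Encord’s deployment):

```python
# Hypothetical example of querying a model served by NVIDIA Triton with the
# Python HTTP client; model and tensor names are placeholders.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

frame = np.random.rand(1, 3, 224, 224).astype(np.float32)  # stand-in for a video frame

infer_input = httpclient.InferInput("input__0", list(frame.shape), "FP32")
infer_input.set_data_from_numpy(frame)

response = client.infer(model_name="micromodel_demo", inputs=[infer_input])
print(response.as_numpy("output__0").shape)
```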

“With Triton, we get the native support for all these machine learning libraries like PyTorch and it’s compatible with CUDA,” said Hansen. “It saved us a lot of time and hassles.”


Burgers, Fries and a Side of AI: Startup Offers Taste of Drive-Thru Convenience

Eating into open hours and menus, a labor shortage has gobbled up fast-food service employees, but some restaurants are trying out a new staff member to bring back the drive-thru good times: AI.

Toronto startup HuEx is in pilot tests with a conversational AI assistant for drive-thrus to help support service at several popular Canadian chains.

Chronically understaffed, food services jobs have among the highest rate of employee departures, according to the U.S. Bureau of Labor Statistics.

HuEx’s voice service — dubbed AiDA — is helping behind the drive-up window at popular fast-service chains across North America.

AiDA handles order requests from customers at the drive-thru speaker box. Driven by HuEx’s proprietary models running on the NVIDIA Jetson edge AI platform, AiDA transcribes the voice orders to text for staff members to see and serve. And it can reply with voice in response.

It can understand 300,000-plus product combinations. “Things like ‘coffee with milk, coffee with sugar’ are common, but some people even order coffee with butter — it can handle that, too,” said Anik Seth, founder and CEO of HuEx.

The company is a member of NVIDIA Inception, a program that offers go-to-market support, expertise and technology for AI, data science and HPC startups.

All in the Family

Seth is intimately familiar with fast-service restaurants. He is part of a family business operating multiple quick-service restaurant locations.

He has seen team members and guests struggle during drive-thru interactions, a common problem he aims to address.

“AiDA’s voice recognition technology is easily handled by the NVIDIA Jetson for real-time interactions, which helps smooth the ordering process,” he said.

Talk AI to Me

The technology, integrated with the existing drive-thru headset system, allows team members to hear the orders and jump in to assist if needed.

AiDA, first deployed in 2018, has been used in “thousands of transactions” in implementations in Canada, said Seth.

The system promises to help improve service time by taking on the drive-thru while other team members focus on fulfilling orders. Its natural language processing system is capable of 90 percent accuracy when taking orders, he said.

As new menu items, specials and promotions are introduced, the database is updated constantly to answer questions about them.

“The team is always in the know,” Seth said. “The moment you order a coffee, the AI is taking the order, while simultaneously, there’s a team member fulfilling it.”

Image credit: Robert Penaloza via Unsplash.


Meet the Omnivore: Developer Sleighs Complex Manufacturing Workflows With Digital Twin of Santa’s Workshop

Editor’s note: This is one in a series of Meet the Omnivore posts, featuring individual creators and developers who use the NVIDIA Omniverse 3D simulation and collaboration platform to boost their artistic or engineering processes.

Don’t be fooled by the candy canes, hot cocoa and CEO’s jolly demeanor.

Santa’s workshop is the very model of a 21st-century enterprise: pioneering mass customization and perfecting a worldwide distribution system able to meet almost bottomless global demand.

Michael Wagner

So it makes sense that Michael Wagner, CTO of ipolog, a digital twin software company for assembly and logistics planning, would make a virtual representation, or digital twin, of Santa’s workshop.

Digital twins like Wagner’s “santa-factory” can be used “to map optimal employee paths around a facility, simulate processes like material flow, as well as detect bottlenecks before they occur,” he said.

Wagner built an NVIDIA Omniverse Extension — a tool to use in conjunction with Omniverse apps — for what he calls the science of santa-facturing.

A rendering of the assembly room in Santa’s workshop, created with NVIDIA Omniverse.

Creating the ‘Santa-Facturing’ Extension

To deck the halls of the santa-factory, Wagner needed a virtual environment where he could depict the North Pole, Santa himself, hundreds of elves and millions of toy parts. Omniverse provided the tools to create such a highly detailed environment.

“Omniverse is the only platform that’s able to visualize such a vast amount of components in high fidelity and make the simulation physically accurate,” Wagner said. “My work is a proof of concept — if Omniverse is fit to visualize Santa’s factory, it’s fit to visualize the daily material provisioning load for a real-world automotive factory, for example, which has a similar order of complexity.”

Ipolog recently provided BMW with highly detailed elements like racks and boxes for a digital twin of the automaker’s factory.

With the help of ipolog software and other tools, BMW is creating a digital twin-based factory of the future with NVIDIA Omniverse, which enables the automaker to simulate complex production scenarios taking place in more than 6 million square meters of factory space.

Digital twin simulation speeds output and increases efficiency for BMW’s entire production cycle — from the examination of engineering detail for vehicle parts to the optimization of workflow at the factory-plant level.

Wagner used Omniverse Kit, a toolkit for building Omniverse-native extensions and applications, to create the santa-facturing environment.

The developer is also exploring Omniverse Code — a recently launched app that serves as an integrated development environment for developers to easily build Omniverse extensions, apps or microservices.

“The principle of building on the shoulders of giants is in the DNA of the Omniverse ecosystem and the kit-based environment,” Wagner said. “Existing open-source extensions, which any developer can contribute to, provide a good base from which to start off and quickly create a dedicated app or extension for digital twins.”
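
For a sense of how little code it takes to get started, here is a minimal Kit extension skeleton of the kind Wagner describes; the class name, window and UI contents are illustrative placeholders, not ipolog’s actual tool.

```python
# Minimal Omniverse Kit extension skeleton; the window and controls are
# placeholders, not the actual santa-facturing extension.
import omni.ext
import omni.ui as ui


class SantaFacturingSketchExtension(omni.ext.IExt):
    def on_startup(self, ext_id):
        # Called when the extension is enabled: build a simple window.
        self._window = ui.Window("Santa-Facturing (sketch)", width=320, height=140)
        with self._window.frame:
            with ui.VStack():
                ui.Label("Material flow simulation controls would go here.")
                ui.Button("Run simulation", clicked_fn=lambda: print("simulating..."))

    def on_shutdown(self):
        # Called when the extension is disabled: release UI resources.
        self._window = None
```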

Visualizing the ‘Santa-Factory’

Using Omniverse, which includes PhysX — a software development kit that provides advanced physics simulation — Wagner transformed 2D illustrations of the santa-factory into a physically accurate 3D scene. The process was simple, he said. He “piled up a lot of elements and let PhysX work its magic.”

A 2D representation of Santa’s workshop turned into a 3D rendering using Omniverse.
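
In USD terms, “letting PhysX work its magic” roughly means applying physics schemas to prims and letting the simulation settle them. The sketch below, assuming a USD build that includes the UsdPhysics schema, uses illustrative paths, counts and transforms rather than anything from the actual scene.

```python
# Rough sketch: mark prims as rigid bodies with colliders so a physics
# simulation (such as PhysX in Omniverse) can pile them up naturally.
# Paths, counts and transforms are illustrative only.
from pxr import Gf, Usd, UsdGeom, UsdPhysics

stage = Usd.Stage.CreateNew("pileup_sketch.usda")
UsdPhysics.Scene.Define(stage, "/World/physicsScene")

# A static ground slab for the falling parts to land on.
ground = UsdGeom.Cube.Define(stage, "/World/ground")
ground.AddTranslateOp().Set(Gf.Vec3d(0.0, -50.0, 0.0))
ground.AddScaleOp().Set(Gf.Vec3f(500.0, 1.0, 500.0))
UsdPhysics.CollisionAPI.Apply(ground.GetPrim())

# A tower of "toy parts" that will tumble into a pile when simulated.
for i in range(20):
    part = UsdGeom.Cube.Define(stage, f"/World/part_{i}")
    part.AddTranslateOp().Set(Gf.Vec3d(0.0, 10.0 * i + 5.0, 0.0))
    UsdPhysics.RigidBodyAPI.Apply(part.GetPrim())
    UsdPhysics.CollisionAPI.Apply(part.GetPrim())

stage.GetRootLayer().Save()
```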

To create the glacial North Pole environment, Wagner used the Unreal Engine 4 Omniverse Connector. To bring the trusty elves to life, he brought in animations from Blender. And to convert the huge datasets to Universal Scene Description format, Wagner worked with Germany-based 3D software development company NetAllied Systems.

A rendering of elves tending to reindeer near Santa’s workshop, created with NVIDIA Omniverse.

What better example of material supply and flow in manufacturing than millions of toy parts getting delivered to Santa’s workshop? Watch Wagner’s stunning demo of this, created in Omniverse:

Such use of digital twin simulations, Wagner said, allows manufacturers to visualize and plan their most efficient workflow, often reducing the time it takes to complete a manufacturing project by 30 percent.

Looking forward, Wagner and his team at ipolog plan to create a full suite of apps, extensions and backend services to enable a manufacturing virtual world entirely based on Omniverse.

Learn more about the santa-facturing project and how Wagner uses Omniverse Kit.

Attend Wagner’s session on digital twins for manufacturing at GTC, which will take place March 21-24.

Creators and developers can download NVIDIA Omniverse for free and get started with step-by-step tutorials on the Omniverse YouTube channel. Follow Omniverse on Instagram, Twitter and Medium for additional resources and inspiration. Check out the Omniverse forums and join our Discord Server to chat with the community.
