Update Complete: GFN Thursday Brings New Features, Games and More

No Thursday is complete without GFN Thursday, our weekly celebration of the news, updates and great games GeForce NOW members can play — all streaming from the cloud across nearly all of your devices.

This week’s exciting updates to the GeForce NOW app include new features, faster session loading and a bunch of new games joining the GFN library.

… Better, Faster, Stronger

There’s a lot happening behind the scenes as our team continuously works to improve your cloud-gaming experience with each session.

Our cloud-gaming engineers work to further improve your streaming sessions, optimize games and develop new features. We also continually refine the user experience in the GeForce NOW app, which is now rolling out version 2.0.29 with several improvements.

Game in the Fast Lane

From the Settings pane in the GeForce NOW app, you can link your Epic Games Store account to take advantage of some new features.

One feature we’re currently testing with our Founders and Priority members is preloading, which loads parts of your game before you arrive so your launch times will be faster. Members testing this feature should see sessions launch up to a minute faster from the moment they click play in the GeForce NOW app. Free members are not guaranteed preloaded sessions and may see slightly longer startup times.

To enable the benefits of preloading, we’re also testing a new account linking feature that lets you play games without having to log in to your game store account each time. Both preloading and account linking are currently enabled for Fortnite’s PC build on GeForce NOW. We anticipate expanding these features to more GeForce NOW games in the future.

PC, macOS and Chromebook users can enable the new account linking features from a new tile on the My Library row in-app. This takes you to the Settings pane, where you can turn on account linking for Fortnite under Connections. Once complete, you won’t need to log in to your Epic Account to play Fortnite’s PC build on any other supported GeForce NOW platform, and you’ll be eligible for preloaded sessions.

Find What You’re Looking For

We’re also improving search results in the app to make managing libraries easier and get members in games faster. Searching for games to add to libraries will now return full page search results, providing easier access to the game’s details and a quicker way to add it to your library.

The newest version of the GeForce NOW app includes improved search, account linking for Epic Games Store, and a whole lot more.

If you’re playing on GeForce NOW from a Chrome browser, we’ve recently added our in-game overlay. The overlay lets members configure many in-stream features, such as FreeStyle filters, network labels, microphone toggles and ending game sessions. To bring up the overlay, use Ctrl + G on PC and Chromebook, or CMD + G on macOS.

And no GeForce NOW app update would be complete without squashed bugs. To get the full lowdown, check out the version 2.0.29 Release Highlights from the Settings pane in the app.

These updates are just a few of the improvements we’re working on. We have a ton more in store, and every update is designed to make sure that when GFN members play their favorite PC games from our cloud servers, they’re getting the best possible experience.

Rolling in the Deep (Silver)

We recently spoke with our friends at Deep Silver about new updates coming for KING Art Games’ Iron Harvest and 4A Games’ Metro Exodus: Enhanced Edition, both of which will be supported on GeForce NOW. Catch up on all the details here.

Get Your Game On

The latest in the classic R-Type series comes to GeForce NOW this week.

Finally, below are the games joining the GeForce NOW library this week. 

What do you think of the newest GeForce NOW updates? Let us know on Twitter or in the comments below.

The post Update Complete: GFN Thursday Brings New Features, Games and More appeared first on The Official NVIDIA Blog.


GFN Thursday: Rolling in the Deep (Silver) with Major ‘Metro Exodus’ and ‘Iron Harvest’ Updates

GFN Thursday reaches a fever pitch this week as we take a deeper look at two major updates coming to GeForce NOW from Deep Silver in the weeks ahead.

Catching Even More Rays

Metro Exodus was one of the first RTX games added to GeForce NOW. It’s still one of the most-played RTX games on the service. Back in February, developer 4A Games shared news of an Enhanced Edition coming to PC that takes advantage of a new Fully Ray-Traced Lighting Pipeline.

Today, we can share that it’s coming to GeForce NOW, day-and-date, on May 6.

The PC Enhanced Edition features significant updates to Metro Exodus’ real-time ray tracing implementations. Players will see improvements to the groundbreaking Ray-Traced Global Illumination (RTGI) featured in the base game, as well as new updates for the Ray-Traced Emissive Lighting techniques pioneered in “The Two Colonels” expansion.

The PC Enhanced Edition also includes additional ray-tracing features, like Advanced Ray-Traced Reflections, and support for NVIDIA DLSS 2.0 on NVIDIA hardware — including GeForce NOW.

The list of RTX features coming to the PC Enhanced Edition is massive:

  • Fully ray-traced lighting throughout — every light source is now ray traced
  • Next-gen ray tracing and denoising
  • Next-gen temporal reconstruction technology
  • Per-pixel ray-traced global illumination
  • Ray-traced emissive surfaces with area shadows
  • Infinite number of ray-traced light bounces
  • Atmosphere and transparent surfaces receiving ray-traced bounced lighting
  • Full ray-traced lighting model with color bleeding for every light source
  • Advanced ray-traced reflections
  • DX12 Ultimate support, including DXR 1.1 and variable rate shading
  • GPU FP16 support and thousands of optimized shaders
  • Support for DLSS 2.0
  • Addition of FOV (field of view) slider to main game options

In short, the game is going to look even more amazing. And, starting next week, members who own Metro Exodus will have access to the update on GeForce NOW. But remember, to access the enhanced visuals you’ll need to be a Founder or Priority member.

Don’t own Metro Exodus yet? Head to Steam or the Epic Games Store and get ahead of the game.

Metro Exodus PC Enhanced Edition includes updated real-time ray tracing across both the base game and the expansion, which GeForce NOW Founders and Priority members can experience across nearly all of their devices.

A Strategic Move

Iron Harvest, the classic-style real-time strategy game with an epic single-player campaign, multiplayer and co-op, set in the alternate reality of 1920+, is getting new DLC on May 27. Dubbed “Operation Eagle,” the update brings a new faction to the game’s alternate version of World War I: the USA.

You’ll guide this new faction through seven new single-player missions, while learning how to use the game’s new Aircraft units across all of the game’s playable factions, including Polania, Saxony and Rusviet.

“Operation Eagle” also adds new multiplayer maps that RTS fans will love, and the new USA campaign can be played cooperatively with friends.

Iron Harvest’s “Operation Eagle” DLC will be available on GeForce NOW day-and-date. You can learn more about the update here.

Don’t Take Our Word for It

The team at Deep Silver was gracious enough to answer a few questions we had about these great updates.

Q: We’re suckers for beautifully ray-traced PC games, on a scale from 1-to-OMG, how great does Metro Exodus PC Enhanced Edition look?

A: We’re quietly confident that the Metro Exodus PC Enhanced Edition will register at the OMG end of the scale, but you don’t need to take our word for it – Digital Foundry declared that, “Metro Exodus’ PC Enhanced Edition’s Global Illumination produces without a doubt, the best lighting I’ve ever witnessed in a video game.”

Q: What does it mean for the team to leverage GeForce NOW to bring these new real-time ray-tracing updates in Metro Exodus PC Enhanced Edition to gamers across their devices?

A: We believe hardware-accelerated ray-tracing GPUs are the future, but right now the number of players with ray-tracing-capable GPUs is a small, albeit growing, percentage of the total PC audience. GeForce NOW will give those players yet to upgrade their gaming hardware a glimpse into the future.

Q: How does “Operation Eagle” build on the story in Iron Harvest? We’re excited to try this new faction.

A: The American Union of Usonia stayed out of the Great War and became an economic and military powerhouse, unnoticed by Europe’s old elites. Relying heavily on mighty “Diesel Birds,” the Usonia faction brings more variety to the Iron Harvest battlefields. Additional new buildings and new units for all factions will enhance the Iron Harvest roster to give players even more options to find the perfect attack and defence strategy.

Q: How do you see GeForce NOW expanding the audience of gamers who can play Metro Exodus and Iron Harvest?

A: We’re committed to bringing the Metro Exodus experience to as many platforms as we can without compromising on the quality of the experience; GeForce NOW puts our state-of-the-art ray-traced version of Metro Exodus into the hands of gamers regardless of their own hardware setup.

Q: Is there anything else you’d want to share with your fans who are streaming Metro Exodus and Iron Harvest from the cloud?

A: Watch out for the jump scares. You have been warned.

There’s probably a jump scare coming here, right? GeForce NOW members can find out on May 6.

GFN Thursday

In addition to rolling with a pair of Deep Silver announcements this week, members get their regular dose of GFN Thursday goodness. Read more about that and other updates this week here.

Getting excited for more ray-traced goodness in Metro Exodus? Can’t wait to get your hands on “Operation Eagle”? Let us know on Twitter or in the comments below.


Perceiving with Confidence: How AI Improves Radar Perception for Autonomous Vehicles

Editor’s note: This is the latest post in our NVIDIA DRIVE Labs series, which takes an engineering-focused look at individual autonomous vehicle challenges and how NVIDIA DRIVE addresses them. Catch up on all of our automotive posts, here.

Autonomous vehicles don’t just need to detect the moving traffic that surrounds them — they must also be able to tell what isn’t in motion.

At first glance, camera-based perception may seem sufficient to make these determinations. However, low lighting, inclement weather or conditions where objects are heavily occluded can affect cameras’ vision. This means diverse and redundant sensors, such as radar, must also be capable of performing this task. However, additional radar sensors that leverage only traditional processing may not be enough.

In this DRIVE Labs video, we show how AI can address the shortcomings of traditional radar signal processing in distinguishing moving and stationary objects to bolster autonomous vehicle perception.

Traditional radar processing bounces radar signals off objects in the environment and analyzes the strength and density of reflections that come back. If a sufficiently strong and dense cluster of reflections comes back, classical radar processing can determine this is likely some kind of large object. If that cluster also happens to be moving over time, then that object is probably a car.

While this approach can work well for inferring a moving vehicle, the same may not be true for a stationary one. In this case, the object produces a dense cluster of reflections, but doesn’t move. According to classical radar processing, this means the object could be a railing, a broken-down car, a highway overpass or some other object. The approach often has no way of distinguishing which.
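As a purely illustrative sketch (not NVIDIA’s actual pipeline, and with made-up thresholds), the classical approach described above boils down to density-based clustering of radar returns, followed by a motion check on each cluster’s centroid. A cluster that passes the density test but fails the motion test is exactly the ambiguous case described here:

```python
import math

def cluster_reflections(points, radius=2.0, min_count=4, min_strength=5.0):
    """Greedy density clustering of radar returns.

    points: list of (x, y, strength) tuples in a bird's-eye-view frame.
    Returns clusters (centroid + members) that are dense and strong
    enough to plausibly be a large object; sparse returns are dropped.
    """
    unused = list(points)
    clusters = []
    while unused:
        members = [unused.pop()]
        # Absorb every remaining point within `radius` of any member.
        changed = True
        while changed:
            changed = False
            for p in unused[:]:
                if any(math.hypot(p[0] - m[0], p[1] - m[1]) <= radius
                       for m in members):
                    members.append(p)
                    unused.remove(p)
                    changed = True
        if (len(members) >= min_count
                and sum(m[2] for m in members) >= min_strength):
            cx = sum(m[0] for m in members) / len(members)
            cy = sum(m[1] for m in members) / len(members)
            clusters.append({"centroid": (cx, cy), "members": members})
    return clusters

def is_moving(centroid_t0, centroid_t1, threshold=0.5):
    """Classify a tracked cluster as moving if its centroid shifted enough
    between two frames."""
    return math.hypot(centroid_t1[0] - centroid_t0[0],
                      centroid_t1[1] - centroid_t0[1]) > threshold
```

A stationary railing and a stalled car both survive the density test and both fail the motion test, which is the ambiguity the radar DNN below is designed to resolve.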

Introducing Radar DNN

One way to overcome the limitations of this approach is with AI in the form of a deep neural network (DNN).

Specifically, we trained a DNN to detect moving and stationary objects, as well as accurately distinguish between different types of stationary obstacles, using data from radar sensors.

Training the DNN first required overcoming radar data sparsity problems. Since radar reflections can be quite sparse, it’s practically infeasible for humans to visually identify and label vehicles from radar data alone.

Figure 1. Example of propagating bounding box labels for cars from the lidar data domain into the radar data domain.

Lidar, however, can create a 3D image of surrounding objects using laser pulses. Thus, ground truth data for the DNN was created by propagating bounding box labels from the corresponding lidar dataset onto the radar data as shown in Figure 1. In this way, the ability of a human labeler to visually identify and label cars from lidar data is effectively transferred into the radar domain.
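A hypothetical sketch of the propagation idea (the box format, coordinate frame and helper names here are illustrative assumptions, not the actual dataset schema): once a lidar-derived bounding box is expressed in the same bird’s-eye-view frame as the radar returns, every radar point falling inside the box inherits the box’s label:

```python
from dataclasses import dataclass

@dataclass
class Box2D:
    """Axis-aligned bird's-eye-view box derived from a lidar label."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    label: str

def propagate_labels(radar_points, lidar_boxes):
    """Transfer lidar box labels onto sparse radar returns.

    radar_points: list of (x, y) positions in the same BEV frame as the
    boxes (in practice, a calibration transform maps lidar coordinates
    into the radar frame first). Points outside every box are labeled
    'background'.
    """
    labels = []
    for x, y in radar_points:
        label = "background"
        for box in lidar_boxes:
            if box.x_min <= x <= box.x_max and box.y_min <= y <= box.y_max:
                label = box.label
                break
        labels.append(label)
    return labels
```

This is how a human labeler’s work in the dense lidar domain can supervise a network that sees only sparse radar returns.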

Moreover, through this process, the radar DNN not only learns to detect cars, but also their 3D shape, dimensions and orientation, which classical methods cannot easily do.

With this additional information, the radar DNN is able to distinguish between different types of obstacles — even if they’re stationary — increase confidence of true positive detections, and reduce false positive detections.

The higher confidence 3D perception results from the radar DNN in turn enables AV prediction, planning and control software to make better driving decisions, particularly in challenging scenarios. For radar, classically difficult problems like accurate shape and orientation estimation, detecting stationary vehicles as well as vehicles under highway overpasses become feasible with far fewer failures.

The radar DNN output is integrated smoothly with classical radar processing. Together, these two components form the basis of our radar obstacle perception software stack.

This stack is designed to both offer full redundancy to camera-based obstacle perception and enable radar-only input to planning and control, as well as enable fusion with camera- or lidar-perception software.

With such comprehensive radar perception capabilities, autonomous vehicles can perceive their surroundings with confidence.

To learn more about the software functionality we’re building, check out the rest of our DRIVE Labs series.


Making Movie Magic, NVIDIA Powers 13 Years of Oscar-Winning Visual Effects

For the 13th year running, NVIDIA professional GPUs have powered the dazzling visuals and cinematics behind every Academy Award nominee for Best Visual Effects.

The 93rd annual Academy Awards will take place on Sunday, April 25, with five VFX nominees in the running:

  • The Midnight Sky
  • Tenet
  • Mulan
  • The One and Only Ivan
  • Love and Monsters

NVIDIA professional GPUs have been behind award-winning graphics in films for over a decade. During that time, the most stunning visual effects shots have formed the backdrop for the Best Visual Effects Oscar.

Although some traditional nominees, namely tentpole summer blockbusters, weren’t released in 2020 because of the pandemic, this year’s lineup still brought innovative tools, new techniques and impressive visuals to the big screen.

For the visuals in The Midnight Sky, Framestore delivered the breathtaking VFX and deft keyframe animation for which they are renowned. Add in cutting-edge film tech like ILM Stagecraft and Anyma, and George Clooney supervising previsualization and face replacement sequences, and it’s no wonder that Framestore swept the Visual Effects Society Awards this year.

Christopher Nolan’s latest film, Tenet, is made up of 300 VFX shots that create a sense of time inversion. During action sequences, DNEG used new temporal techniques to show time moving forward and in reverse.

In Paramount’s Love and Monsters, a sci-fi comedy about giant creatures, Toronto-based visual effects company Mr. X delivers top-notch graphics that earned the studio its first Oscar nomination. From colossal snails to complex crustaceans, the film featured 13 unique, mutated creatures. The VFX and animation teams crafted the creatures’ movements based on how each would interact in a post-apocalyptic world.

And to create the impressive set extensions, scenic landscapes and massive crowds in Disney’s most recent live-action film, Mulan, Weta Digital tapped NVIDIA GPU-accelerated technology to immerse the audience in a world of epic scale.

While only one visual effects team will accept an award at Sunday’s ceremony, millions of artists are creating stunning visuals and cinematics with NVIDIA RTX. Whether it’s powering virtual production sets or accelerating AI tools, RTX technology is shaping the future of storytelling.

Learn more about NVIDIA technology in media and entertainment.

Featured image courtesy of Framestore. © NETFLIX


Cultivating AI: AgTech Industry Taps NVIDIA GPUs to Protect the Planet

What began as a budding academic movement into farm AI projects has now blossomed into a field of startups creating agriculture technology with a positive social impact for Earth.

Whether it’s the threat to honey bees worldwide from varroa mites, devastation to citrus markets from citrus greening, or contamination of groundwater caused by agrochemicals — AI startups are enlisting NVIDIA GPUs to help solve these problems.

With Earth Day today, here’s looking at some of the work of developers, researchers and entrepreneurs who are harnessing NVIDIA GPUs to protect the planet.

The Bee’s Knees: Parasite Prevention 

Bees are under siege by varroa parasites destroying their colonies. And saving the world’s honeybee population is about a lot more than just honey: bees are now so scarce that all kinds of farmers need to rent them to get their crops pollinated.

Beewise, a startup based in Israel, has developed robo hives with computer vision for infestation identification and treatment capabilities. In December, TIME magazine named the Beewise Beehome to its “Best Inventions of 2020” list. Others are using deep learning to understand hives better and look at improved hive designs.

Orange You Glad AI Helps

If it weren’t for AI, that glass of orange juice for breakfast might be a puckery one. A rampant “citrus greening” disease is decimating orchards and souring fruit worldwide. Thankfully, University of Florida researchers are developing computer vision for smart sprayers of agrochemicals, which are now being licensed and deployed in pilot tests by CCI, an agricultural equipment company.

The system can adjust in real time to turn off or on the application of crop protection products or fertilizers as well as adjust the amount sprayed based on the plant’s size.
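In spirit, the real-time decision described above reduces to simple control logic. This is an illustrative sketch only (the function, rate values and cap are made-up assumptions, not the University of Florida system): the nozzle switches off when no plant is detected, and the dose scales with the detected plant size up to a cap:

```python
def spray_rate(plant_detected, plant_area_m2, base_rate_l_per_min=1.0):
    """Return a spray rate scaled to detected plant size.

    The base rate, area scaling and 2x cap are illustrative values.
    """
    if not plant_detected:
        return 0.0  # nozzle off: nothing under the sprayer
    # Scale dosage with detected canopy area, capped at twice the base rate.
    return min(base_rate_l_per_min * plant_area_m2,
               2.0 * base_rate_l_per_min)
```

The computer vision model supplies `plant_detected` and the size estimate; the savings come from the zero-rate and capped-rate branches.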

SeeTree, based in Israel, is tackling citrus greening, too. It offers a GPU-driven tree analytics platform of image recognition algorithms, sensors, drones and a data collection app.

The startup uses the NVIDIA Jetson TX2 to process images and CUDA as the interface for cameras at orchards. The TX2 enables it to perform fruit detection in orchards as well as provide farms with a yield estimation tool.

AI Land of Sky Blue Water

Bilberry, located in Paris, develops weed recognition powered by the NVIDIA Jetson edge AI platform for precision application of herbicides. The startup has helped customers reduce the usage of chemicals by as much as 92 percent.

FarmWise, based in San Francisco, offers farmers an AI-driven robotic machine for pulling weeds rather than spraying them, reducing groundwater contamination.

Also, John Deere-owned Blue River offers precision spraying of crops to reduce the usage of agrochemicals harmful to land and water.

And two students from India last year developed Nindamani, an AI-driven, weed-removal robot prototype that took top honors at the AI at the Edge Challenge on Hackster.io.

Milking AI for Dairy Farmers 

AI is going to the cows, too. Advanced Animal Diagnostics, based in Morrisville, North Carolina, offers a portable testing device to predict animal performance and detect infections in cattle before they take hold. Its tests are processed on NVIDIA GPUs in the cloud. The machine can help reduce usage of antibiotics.

Similarly, SomaDetect aims to improve milk production with AI. The Halifax, Nova Scotia, company runs deep learning models on NVIDIA GPUs to analyze milk images.

Photo courtesy of Mark Kelly on Unsplash


Green for Good: How We’re Supporting Sustainability Efforts in India

When a community embraces sustainability, it can reap multiple benefits: gainful employment for vulnerable populations, more resilient local ecosystems and a cleaner environment.

This Earth Day, we’re announcing our four latest corporate social responsibility investments in India, home to more than 2,700 NVIDIANs. These initiatives are part of our multi-year efforts in the country, which focus on investing in social innovation, job creation and climate action.

Last year, we funded projects that aided migrant workers affected by COVID-19, increased green cover, furthered sustainable waste management processes and improved livelihoods through job creation.

The organizations we’re supporting this year are:

Foundation for Ecological Security

This project will build 10 water-harvesting structures and a dozen systems for diversion-based irrigation, a technique to irrigate farms by redirecting water from rivers or streams. It will benefit around 4,000 individuals from vulnerable migrant households by increasing vegetative cover and the irrigation potential of the land. The foundation will also create community-based initiatives to augment rural household income through the sale of non-timber forest products such as medicinal plants, leaves or honey.

Impact Guru Foundation

We’re supporting the organization Grow-Trees’ efforts to plant local, non-invasive trees in the Dalma Wildlife Sanctuary, home to dozens of endangered Asiatic elephants. Located in the northeastern state of Jharkhand, this project will employ tribal women and other villagers to plant more than 26,000 trees to improve the environment and reinstate elephant migration routes.

Naandi Foundation

Hyderabad-based Naandi is securing sustainable livelihoods for tribal communities by encouraging the organic farming of coffee and other crops. We’re funding this Rockefeller Foundation Award-winning project to transform depleted soil into carbon-rich landscapes, improving plant health across 3,000 acres of coffee farms and raising coffee quality to a gourmet product that boosts the income of thousands of farming families.

Energy Harvest Charitable Trust

Energy Harvest aims to reduce open-field burning by connecting small farmers with machinery owners and straw buyers, paving paths for alternative energy sources using agricultural waste. The initiative — which will use AI and edge devices to identify farm fires and track local emission levels — will create dozens of employment opportunities, benefit more than 100 farmers and improve air quality by saving hundreds of acres from burning.

NVIDIA has previously funded projects in India that provided education programs for underprivileged youth, taught computer skills to young women, supported people with disabilities and opened 50 community libraries in remote areas. Many of these initiatives have centered in communities near our three offices — Bangalore, Hyderabad and Pune.

Learn more about corporate social responsibility at NVIDIA.


GFN Thursday Drops the Hammer with ‘Vermintide 2 Chaos Wastes’ Free Expansion, ‘Immortals Fenyx Rising The Lost Gods’ DLC

GFN Thursday is our ongoing commitment to bringing great PC games and service updates to our members each week. Every Thursday, we share updates on what’s new in the cloud — games, exclusive features, and news on GeForce NOW.

This week, it includes the latest updates for two popular games: Fatshark’s free expansion Warhammer: Vermintide 2 Chaos Wastes, and The Lost Gods DLC for Ubisoft’s Immortals Fenyx Rising.

Since GeForce NOW is streaming the PC version of these games, members receive the full experience — with expansion and DLC support — and can play with and against millions of other PC players.

The GeForce NOW library also grows by 15 games this week, with game releases from NewCore Games, Ubisoft and THQ.

A High-Stakes Adventure

GeForce NOW members can explore the Chaos Wastes with their fellow heroes in Warhammer: Vermintide 2’s new rogue-lite-inspired game mode. Teams of up to four are built from the ground up, working together on tactics while preparing for the unexpected. As a team progresses, the rewards grow greater. Failure is not an option.

Explore the Chaos Wastes together with your fellow heroes in Warhammer: Vermintide 2’s new rogue-lite inspired game mode.

Discover 15 new locations in the free expansion and prepare for an extra challenge as cursed areas — with changing landscapes influenced by the current ruler — bring a more sinister threat.

Warhammer: Vermintide 2 Chaos Wastes is streaming now on GeForce NOW.

God’s Eye View of the Lost Gods

Immortals Fenyx Rising – The Lost Gods, the third narrative DLC, launched today and is streaming now on GeForce NOW. Unfolding entirely from an overhead, god’s-eye perspective, the adventure centers on Ash, a new mortal champion, in the aftermath of a series of catastrophic disasters.

Meet a new hero, Ash, embarking on an epic journey to reunite the Greek gods in Immortals Fenyx Rising’s newest DLC.

Ash’s mission is to travel to a new land, the Pyrite Island, to find and reunite the gods who left Olympos in a huff after a falling-out with Zeus. These “lost gods,” including Poseidon and Hades, will all need to be convinced to return to the Pantheon and restore balance to the world. Naturally, there are plenty of monsters standing between them and Ash, which players can dispatch using a new, brawler-inspired combat system.

Get Your Game On

It’s a busy GFN Thursday this week with 15 games joining the GeForce NOW library today.

Turnip Boy Commits Tax Evasion, releasing day-and-date with the Steam launch, is one of 15 new games this GFN Thursday.

Turnip Boy Commits Tax Evasion (Steam)

Launching on Steam today, play as an adorable yet trouble-making turnip. Avoid paying taxes, solve plantastic puzzles, harvest crops and battle massive beasts all in a journey to tear down a corrupt vegetable government!

Warhammer 40,000: Inquisitor – Martyr (Steam)

Enter the Chaos-infested Caligari Sector and purge the unclean with the most powerful agents of the Imperium of Man! Warhammer 40,000: Inquisitor – Martyr is a grim, action-RPG featuring multiple classes of the Inquisition who will carry out the Emperor’s will.

Anno 2070 (Steam) and Anno 2205 (Steam)

Two games from Ubisoft’s long-running city-building franchise, Anno, release on GeForce NOW today. Anno 2070 offers a new world full of challenges, where you’ll need to master resources, diplomacy and trade in the most comprehensive economic management system in the Anno series.

In Anno 2205, you join humankind’s next step into the future with the promise to build a better tomorrow. You conquer Earth, establishing rich, bustling cities and grand industrial complexes, but to secure the prosperity of your people, you must travel into space.

In addition, members can look for the following:

What are you playing? Let us know on Twitter using #GFNThursday, or in the comments below.


Mooning Over Selene: NVIDIA’s Julie Bernauer Talks Setting Up One of World’s Fastest Supercomputers

Though admittedly prone to breaking kitchen appliances like ovens and microwaves, Julie Bernauer — senior solutions architect for machine learning and deep learning at NVIDIA — led the small team that successfully built Selene, the world’s fifth-fastest supercomputer.

Adding to an already impressive feat, Bernauer’s team brought up Selene as the world went into lockdown in early 2020. Using skeleton crews, social distancing protocols and remote cable validation, they achieved in a few weeks what typically takes months with a larger install team.

 

Bernauer told NVIDIA AI Podcast host Noah Kravitz about the goal in creating Selene, which was primarily to support NVIDIA’s researchers. Referencing her time as a doctoral student, Bernauer explains how researchers are often prevented from working on larger models due to expense and infrastructure.

With Selene, the infrastructure is modular and can be scaled up or down depending on what users require, and allows for different types of research to be performed simultaneously. Bernauer said that Selene is proving most useful to autonomous vehicle and language modeling research at the moment.

Going forward, Bernauer envisions some of the power and efficiency of systems like Selene becoming more available on widely accessible devices, such as laptops or edge products such as cars.

Key Points From This Episode:

  • Selene’s unique, pandemic-safe installation is further explained in an NVIDIA blog detailing the specific efforts of Bernauer’s team and the lessons learned from past NVIDIA supercomputers such as SATURNV and Circe.
  • Bernauer joined NVIDIA in 2015, after spending 15 years in academia. She obtained her Ph.D. in structural genomics from Université Paris-Sud, after which she studied with Nobel Prize winner Michael Levitt at Stanford.

Tweetables:

“[Selene] is an infrastructure that people can share, where we can do different types of research at a time” — Julie Bernauer [8:30]

“We did [Selene] for ourselves, but we also did it … to figure out how to make a product better by going through the experience” — Julie Bernauer [13:27]

You Might Also Like:

NVIDIA’s Marc Hamilton on Building the Cambridge-1 Supercomputer During a Pandemic

Marc Hamilton, vice president of solutions architecture and engineering at NVIDIA, speaks about overseeing the construction of the U.K.’s most powerful supercomputer, Cambridge-1. Built on the NVIDIA DGX SuperPOD architecture, the system will be used by AstraZeneca, GSK, Oxford Nanopore and more.

Hugging Face’s Sam Shleifer Talks Natural Language Processing

Hugging Face is more than just an adorable emoji — it’s a company that’s demystifying AI by transforming the latest developments in deep learning into usable code. Research engineer Sam Shleifer talks about the company’s NLP technology, which is used at over 1,000 companies.

NVIDIA’s Bryan Catanzaro on the Latest from NVIDIA Research

Bryan Catanzaro, vice president of applied deep learning research at NVIDIA, walks through some of the latest developments at NVIDIA research … as well as shares a story involving Andrew Ng and cats.


The Future’s So Bright: NVIDIA DRIVE Shines at Auto Shanghai

NVIDIA DRIVE-powered cars electrified the atmosphere this week at Auto Shanghai.

The global auto show is the oldest in China and has become the stage to debut the latest vehicles. And this year, automakers, suppliers and startups developing on NVIDIA DRIVE brought a new energy to the event with a wave of intelligent electric vehicles and self-driving systems.

The automotive industry is transforming into a technology industry — next-generation lineups will be fully programmable and network-connected, supported by software engineers who will develop new features and services throughout the life of the car.

Just as the battery capacity of an electric vehicle provides miles of range, the computing capacity of these new vehicles will give years of new delight.

EVs for Everyday

Automakers have been introducing electric vehicle technology with one or two specialized models. Now those lineups are diversifying, with an EV for every taste.

The all-new Mercedes-Benz EQB.

Joining the recently launched EQS flagship sedan and EQA SUV on the showfloor, the Mercedes-Benz EQB adds a new flavor to the all-electric EQ family. The compact SUV brings smart electromobility in a family size, with seven seats and AI features.

The latest generation MBUX AI cockpit, featured in the Mercedes-Benz EQB.

Like its EQA sibling, the EQB features the latest generation MBUX AI cockpit, powered by NVIDIA DRIVE. The high-performance system includes an augmented reality head-up display, AI voice assistant and rich interactive graphics to enable the driver to enjoy personalized, intelligent features.

EV maker Xpeng is bringing its new energy technology to the masses with the P5 sedan. It joins the P7 sports sedan in offering intelligent mobility with NVIDIA DRIVE.

The Xpeng P5.

The P5 will be the first to bring Xpeng’s Navigation Guided Pilot (NGP) capabilities to public roads. The automated driving system leverages the automaker’s full-stack XPILOT 3.5, powered by NVIDIA DRIVE AGX Xavier. The new architecture processes data from 32 sensors — including two lidars, 12 ultrasonic sensors, five millimeter-wave radars and 13 high-definition cameras — integrated into 360-degree dual-perception fusion to handle challenging and complex road conditions.
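As a quick sanity check on the figures above, the individual sensor counts do add up to the quoted 32-sensor total. A minimal sketch (the dictionary keys are illustrative labels, not part of any Xpeng API):

```python
# Sensor suite of the Xpeng P5's XPILOT 3.5 system, as described above.
# Key names are illustrative only, not Xpeng's actual configuration schema.
sensors = {
    "lidar": 2,
    "ultrasonic": 12,
    "mmwave_radar": 5,
    "hd_camera": 13,
}

total = sum(sensors.values())
print(total)  # 32 — matches the 32-sensor figure quoted above
```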

Also making its auto show debut was the NIO ET7, first unveiled at a company event in January. The ET7 is the first vehicle to feature NIO’s Adam supercomputer, which leverages four NVIDIA DRIVE Orin processors to achieve more than 1,000 trillion operations per second (TOPS).

The NIO ET7.

The flagship vehicle leapfrogs current model capabilities, with more than 600 miles of battery range and advanced autonomous driving. With Adam, the ET7 can perform point-to-point autonomy, using 33 sensors and high-performance compute to continuously expand the domains in which it operates — from urban to highway driving to battery swap stations.
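The “more than 1,000 TOPS” figure follows directly from the per-chip spec: NVIDIA has publicly stated that each DRIVE Orin SoC delivers roughly 254 TOPS. A minimal back-of-the-envelope sketch, assuming that published per-SoC number:

```python
# Publicly stated spec: each NVIDIA DRIVE Orin SoC delivers ~254 TOPS.
TOPS_PER_ORIN = 254
ORIN_COUNT = 4  # NIO's Adam supercomputer uses four Orin processors

total_tops = TOPS_PER_ORIN * ORIN_COUNT
print(total_tops)  # 1016 — i.e., "more than 1,000 TOPS" as stated
```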

Elsewhere on the showfloor, SAIC’s R Auto exhibited the intelligent ES33. This smart, futuristic vehicle, equipped with R-Tech, leverages the high performance of NVIDIA DRIVE Orin to deliver automated driving features for a safer, more convenient ride.

The R Auto ES33.

SAIC- and Alibaba-backed IM Motors — which stands for intelligence in motion — also made its auto show debut with the electric L7 sedan and SUV, powered by NVIDIA DRIVE. These first two vehicles will have autonomous parking and other automated driving features, as well as a 93kWh battery that comes standard.

The IM Motors L7.

Improving Intelligence

In addition to automaker reveals, suppliers and self-driving startups showcased their latest technology built on NVIDIA DRIVE.

The scalable ZF ProAI Supercomputer.

Global supplier ZF continued to push the bounds of autonomous driving performance with the latest iteration of its ProAI Supercomputer. With NVIDIA DRIVE Orin at its core, the scalable autonomous driving compute platform supports systems with level 2 capabilities all the way to full self-driving, with up to 1,000 TOPS of performance.

A Momenta test vehicle with MPilot automated driving system.

Autonomous driving startup Momenta demonstrated the newest capabilities of MPilot, its autopilot and valet parking system. The software, designed for mass-production vehicles, leverages NVIDIA DRIVE Orin to streamline development and shorten time to market.

From advanced self-driving systems to smart, electric vehicles of all sizes, the NVIDIA DRIVE ecosystem stole the show this week at Auto Shanghai.

The post The Future’s So Bright: NVIDIA DRIVE Shines at Auto Shanghai appeared first on The Official NVIDIA Blog.


Hanging in the Balance: More Research Coordination, Collaboration Needed for AI to Reach Its Potential, Experts Say

As AI is increasingly established as a world-changing field, the U.S. has an opportunity not only to demonstrate global leadership, but to establish a solid economic foundation for the future of the technology.

A panel of experts convened last week at GTC to shed light on this topic, with the co-chairs of the Congressional AI Caucus, U.S. Reps. Jerry McNerney (D-CA) and Anthony Gonzalez (R-OH), leading a discussion that reflects Washington’s growing interest in the topic.

The panel also included Hodan Omaar, AI policy lead at the Center for Data Innovation; Russell Wald, director of policy at Stanford University’s Institute for Human-Centered AI; and Damon Woodard, director of AI partnerships at the University of Florida’s AI Initiative.

“AI is getting increased interest among my colleagues on both sides of the aisle, and this is going to continue for some time,” McNerney said. Given that momentum, Gonzalez said the U.S. should be on the bleeding edge of AI development “for both economic and geopolitical reasons.”

Along those lines, the first thing the pair wanted to learn was how panelists viewed the importance of legislative efforts to fund and support AI research and development.

Wald expressed enthusiasm over legislation Congress passed last year as part of the National Defense Authorization Act, which he said would have an expansive effect on the market for AI.

Wald also said he was surprised at the findings of Stanford’s “Government by Algorithm” report, which detailed the federal government’s use of AI to do things such as track suicide risk among veterans, support SEC insider trading investigations and identify Medicare fraud.

Woodard suggested that continued leadership and innovation coming from Washington is critical if AI is to deliver on its promise.

“AI can play a big role in the economy,” said Woodard. “Having this kind of input from the government is important before we can have the kind of advancements that we need.”

The Role of Universities

Woodard and UF are already doing their part. Woodard’s role at the school includes helping transform it into a so-called “AI university.” In response to a question from Gonzalez about what that transition looks like, he said it required establishing a world-class AI infrastructure, performing cutting-edge AI research and incorporating AI throughout the curriculum.

“We want to make sure every student has some exposure to AI as it relates to their field of study,” said Woodard.

He said the school has more than 200 faculty members engaged in AI-related research, and that it’s committed to hiring 100 more. And while Woodard believes the university’s efforts will lead to more qualified AI professionals and AI innovation around its campus in Gainesville, he also said that partnerships, especially those that encourage diversity, are critical to encouraging more widespread industry development.

Along those lines, UF has joined an engineering consortium and will provide 15 historically Black colleges and two Hispanic-serving schools with access to its prodigious AI resources.

Omaar said such efforts are especially important given how unequally the high-performance computing (HPC) resources needed to conduct AI research are distributed.

In response to a question from McNerney about a recent National Science Foundation report, Omaar noted the finding that the U.S. Department of Energy is only providing support to about a third of the researchers seeking access to HPC resources.

“Many universities are conducting AI research without the tools they need,” she said.

Omaar said she’d like to see the NSF focus its funding on supporting efforts in states where HPC resources are scarce but AI research activity is high.

McNerney announced that he would soon introduce legislation requiring NSF to determine what AI resources are necessary for significant research output.

Moving Toward National AI Research Resources

These myriad challenges point to the benefits that could come from a more coordinated national effort. To that end, Gonzalez asked about the potential of the National AI Research Resource Task Force Act, and the national AI research cloud that would result from it.

Wald called the legislation a “game-changing AI initiative,” noting that the limited number of universities with AI research computing resources has pushed AI research into the private sector, where the objectives are driven by shorter-term financial goals rather than long-term societal benefits.

“What we see is an imbalance in the AI research ecosystem,” Wald said. The federal legislation would establish a pathway for a national AI research hub, which “has the potential to unleash American AI innovation,” he said.

The way Omaar sees it, the nationwide collaboration that would likely result — among politicians, industry and academia — is necessary for AI to reach its potential.

“Since AI will impact us all,” she said, “it’s going to need everyone’s contribution.”

The post Hanging in the Balance: More Research Coordination, Collaboration Needed for AI to Reach Its Potential, Experts Say appeared first on The Official NVIDIA Blog.
