A Further Step to Getting GeForce Cards into the Hands of Gamers

GeForce products are made for gamers — and packed with innovations. Our RTX 30 Series is built on our second-generation RTX architecture, with dedicated RT Cores and Tensor Cores, delivering amazing visuals and performance to gamers and creators.

Because NVIDIA GPUs are programmable, users regularly discover new applications for them, from weather simulation and gene sequencing to deep learning and robotics. Mining cryptocurrency is one of them.

Halving Hash Rate

To help get GeForce GPUs in the hands of gamers, we announced in February that all GeForce RTX 3060 graphics cards would ship with a reduced Ethereum hash rate.

Today, we’re taking additional measures by applying a reduced ETH hash rate to newly manufactured GeForce RTX 3080, RTX 3070 and RTX 3060 Ti graphics cards. These cards will start shipping in late May.

Clear Communication to Gamers

Because these GPUs originally launched with a full hash rate, we want to ensure that customers know exactly what they’re getting when they buy GeForce products. To help with this, our GeForce partners are labeling the GeForce RTX 3080, RTX 3070 and RTX 3060 Ti cards with a “Lite Hash Rate,” or “LHR,” identifier. The identifier will be in retail product listings and on the box.

This reduced hash rate only applies to newly manufactured cards with the LHR identifier and not to cards already purchased.

GeForce Is Made for Gaming

GeForce RTX GPUs have introduced a range of cutting-edge technologies — RTX real-time ray tracing, AI-powered DLSS frame-rate boosting, NVIDIA Reflex low-latency rendering for faster system response, and many more — created to meet the needs of gamers and those who create digital experiences.

We believe this additional step will get more GeForce cards at better prices into the hands of gamers everywhere.

 


NVIDIA BlueField DPUs Fuel Unprecedented Data Center Transformation

Cloud computing and AI are pushing the boundaries of scale and performance for data centers.

Anticipating this shift, industry leaders such as Baidu, Palo Alto Networks, Red Hat and VMware are using NVIDIA BlueField DPUs to transform their data center infrastructure into higher-performing, more secure and more agile platforms, and to bring differentiated products and services to market.

NVIDIA BlueField DPUs are designed to meet the infrastructure requirements of modern data centers running today’s cloud computing and AI workloads:

  • Upgraded Efficiency: BlueField DPUs enable all available CPU resources to run business applications, freeing up CPUs that would otherwise have been used to support software-defined networking.
  • Increased Ability to Scale: Cloud-native applications are highly distributed and create intensive “east-west” traffic within data centers. BlueField DPUs provide a high-throughput, low-latency network environment for scale-out applications.
  • Leading-Edge Security: Multi-tenancy and infrastructure elasticity in cloud data centers pose privacy and confidentiality risks that are addressed by BlueField DPUs.
  • Enhanced Performance: BlueField DPUs provide robust and powerful networking to handle the growing prevalence of GPU-accelerated computing in the cloud, enterprise and edge.

It can seem overwhelming for organizations to deliver on these new requirements quickly and efficiently, while also protecting each individual workload. Below are examples of how NVIDIA’s customers and partners are leveraging BlueField DPUs to offload, accelerate and isolate infrastructure software in ways that dramatically improve data center performance, scalability and security.

Baidu Delivers on Promise of Bare-Metal Clouds

Baidu has deployed NVIDIA BlueField DPUs for its scale-out bare-metal cloud infrastructure. One of the key advantages of bare-metal cloud computing lies in its ability to deliver predictable and consistent performance by allowing direct access to the hardware.

Traditional bare-metal infrastructure lacks the operational flexibility and agility that the virtualized cloud provides. BlueField empowers Baidu to easily provision bare-metal compute instances to millions of companies in China with high-speed networking and tenant isolation.

Bare-metal cloud is an ideal use case for DPUs: NVIDIA BlueField has helped Baidu transform bare-metal servers from hardware-defined infrastructure into software-defined, hardware-accelerated cloud platforms.

Red Hat Brings Composable Compute to the Edge

Red Hat closely collaborates with NVIDIA to enable GPU-accelerated AI computing and drive innovation across the entire stack.

The open-source software titan has developed a DPU-powered composable compute infrastructure to address the stringent requirements for performance, latency and security in edge data centers. When deployed in a fleet of 50,000 nodes, this state-of-the-art infrastructure would save up to $70 million through BlueField’s efficient, software-defined hardware acceleration engines.

At a basic level, Red Hat’s composable compute is optimized for the edge as it uses bare-metal as a service together with BlueField’s domain-specific hardware acceleration to boost the performance of enterprise workloads running on the flagship Red Hat Enterprise Linux operating system or the full-scale OpenShift Kubernetes platform.

A leader in application containerization, Red Hat extends the software microservices concept to hardware, enabling zero-trust security and operational efficiency for enterprise IT and DevOps alike.

Palo Alto Networks Accelerates 5G-Native Security

5G wireless networks are set to reshape digital business and raise new security concerns. At the forefront of cybersecurity, Palo Alto Networks has taken a unique approach to bringing security to 5G networks.

Building on its expertise in network, cloud and device security, the company has created a 5G-native security solution that consists of 5G context-driven security, automation and service protection. To address the rigorous data processing requirements of 5G, Palo Alto Networks has integrated its industry-leading next-generation firewall with NVIDIA BlueField DPUs.

The result is a software-defined, hardware-accelerated firewall architecture that will provide line-rate security processing at 100Gb/s. This innovative architecture is available today for service providers and enterprises to test.

VMware Redefines the Hybrid Cloud Architecture

At GTC 2020, VMware and NVIDIA announced a broad partnership to deliver both an AI-Ready Enterprise Platform and a new architecture for data center, cloud and edge that uses BlueField DPUs to support existing and next-generation applications.

A critical component of the AI-Ready Enterprise Platform, the NVIDIA AI Enterprise software suite runs on VMware vSphere and is optimized, certified and supported by NVIDIA to help thousands of VMware customers in the world’s largest industries unlock the power of AI.

Additionally, VMware is collaborating with NVIDIA to define a next-generation cloud architecture based on VMware Cloud Foundation and BlueField that will offer increased performance, a zero-trust distributed security model and simplified operations. Known as VMware Project Monterey, the effort will help customers take advantage of both CPUs and DPUs while extending VMware infrastructure value to bare-metal applications for the first time.

Together, VMware and NVIDIA are building a high-performance, more secure, efficient and AI-ready cloud platform for the 300,000-plus organizations using VMware’s virtualization platform today.

Next Up: NVIDIA BlueField-3 

Unveiled by NVIDIA CEO Jensen Huang during his GTC21 keynote address, BlueField-3 is the product of the company’s multiyear innovation roadmap, which will see a new BlueField generation every 18 months.

NVIDIA BlueField-3 redefines the art of the possible by delivering the industry’s first DPU to offer 400Gb/s Ethernet and InfiniBand networking with up to 10x the processing power of its predecessor.

Huang also announced NVIDIA DOCA 1.0, a powerful application framework that enables developers to rapidly create applications and services on top of NVIDIA BlueField DPUs. Every new generation of BlueField will support DOCA, which preserves the development investment as each DPU generation evolves.

For customers, this means that today’s applications and data center infrastructure will run even faster when BlueField-3 arrives.

Transforming the Data Center

NVIDIA and its broad partner ecosystem are on a mission to transform the data center by leveraging the NVIDIA BlueField family of data processing units to power the next wave of accelerated cloud and AI computing applications.

Learn more about NVIDIA BlueField data processing units and apply for early access to the NVIDIA DOCA SDK.

Be sure to watch the sessions at GTC21 presented in partnership with Red Hat, Palo Alto Networks and VMware.


DiDi Chooses NVIDIA DRIVE for New Fleet of Self-Driving Robotaxis

Robotaxis are one major step closer to becoming reality.

DiDi Autonomous Driving, the self-driving technology arm of mobility technology leader Didi Chuxing, announced last month a strategic partnership with Volvo Cars on autonomous vehicles for DiDi’s self-driving test fleet.

Volvo’s autonomous drive-ready XC90 cars will be the first to integrate DiDi Gemini, a new self-driving hardware platform equipped with NVIDIA DRIVE AGX Pegasus. These vehicles will eventually be deployed in robotaxi services.

These self-driving test vehicles mark significant progress toward commercial robotaxi services.

Robotaxis are autonomous vehicles that can operate on their own in geofenced areas, such as cities or residential communities. With a set of high-resolution sensors and a supercomputing platform in place of a driver, they can safely operate 24 hours a day, seven days a week.

And as a safer alternative to current modes of transit, robotaxis are expected to draw quick adoption once deployed at scale, making up more than 5 percent of vehicle miles traveled worldwide by 2030.

With the high-performance, energy-efficient compute of NVIDIA DRIVE at their core, these vehicles developed by DiDi are poised to help accelerate this landmark transition.

Doubling Up on Redundancy

The key to DiDi’s robotaxi ambitions is its new self-driving hardware platform, DiDi Gemini.

Achieving fully autonomous vehicles requires centralized, high-performance compute. The amount of sensor data a robotaxi needs to process is 100x greater than that of today’s most advanced vehicles.

The complexity in software also increases exponentially, with an array of redundant and diverse deep neural networks running simultaneously as part of an integrated software stack.

Built on NVIDIA DRIVE AGX Pegasus, DiDi Gemini achieves 700 trillion operations per second (TOPS) of performance, and includes up to 50 high-resolution sensors and an ASIL-D rated fallback system. It is architected with multi-layered redundant protections to enhance the overall safety of the autonomous driving experience.
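To make that redundancy concrete, below is a minimal, hypothetical sketch (not DiDi’s or NVIDIA’s actual software) of how an integrated stack might cross-check the outputs of two diverse perception models before trusting a detection.

```python
# Illustrative sketch only: cross-checking two redundant, diverse perception models.
# All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str
    confidence: float

def cross_check(primary: list[Detection], secondary: list[Detection],
                min_confidence: float = 0.5) -> list[Detection]:
    """Keep only objects that both independent models report with enough confidence."""
    agreed = {d.label for d in secondary if d.confidence >= min_confidence}
    return [d for d in primary if d.confidence >= min_confidence and d.label in agreed]

# Two diverse models looking at the same camera frame.
model_a = [Detection("pedestrian", 0.92), Detection("cyclist", 0.41)]
model_b = [Detection("pedestrian", 0.88), Detection("car", 0.77)]
print(cross_check(model_a, model_b))  # only the pedestrian is confirmed by both models
```

In a real stack, the disagreement path matters as much as the agreement path: a mismatch would typically trigger a more conservative driving policy or hand control to a fallback system like the ASIL-D rated one mentioned above.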

The Gemini platform was designed using Didi Chuxing’s massive database of ride-hailing data as well as real-world autonomous driving test data to deliver the optimal self-driving hardware experience.

A New Generation of Collaboration

DiDi’s test fleet also marks a new era in technology collaboration.

DiDi and Volvo Cars plan to build a long-term partnership, expanding the autonomous test fleets across China and the U.S. and scaling up commercial robotaxi operations. The NVIDIA DRIVE platform enables continuous improvement over the air, facilitating these future plans of development and expansion.

This collaboration combines long-held legacies in vehicle safety, ride-hailing expertise and AI computing to push the bounds of transportation technology for safer, more efficient everyday mobility.


AI Slam Dunk: Startup’s Checkout-Free Stores Provide Stadiums Fast Refreshments

With live sports making a comeback, one thing remains a constant: Nobody likes to miss big plays while waiting in line for a cold drink or snack.

Zippin offers sports fans checkout-free refreshments, and it’s racking up wins among stadiums as well as retailers, hotels, apartments and offices. The startup, based in San Francisco, develops image-recognition models that run on the NVIDIA Jetson edge AI platform to help track customer purchases.

People can simply enter their credit card details into the company’s app, scan into a Zippin-driven store, grab a cold one and any snacks, and go. Their receipt is available in the app afterwards. Customers can also bypass the app and simply use a credit card to enter the stores and Zippin automatically keeps track of their purchases and charges them.

“We don’t want fans to be stuck waiting in line,” said Motilal Agrawal, co-founder and chief scientist at Zippin.

As sports and entertainment venues begin to reopen in limited capacities, Zippin’s grab-and-go stores are offering quicker shopping and better social distancing without checkout lines.

Zippin is a member of NVIDIA Inception, a virtual accelerator program that helps startups in AI and data science get to market faster. “The Inception team met with us, loaned us our first NVIDIA GPU and gave us guidance on NVIDIA SDKs for our application,” Agrawal said.

Streak of Stadiums

Zippin has launched in three stadiums so far, all in the U.S. It’s in negotiations to develop checkout-free shopping for several other major sports venues in the country.

In March, the San Antonio Spurs’ AT&T Center reopened with limited capacity for the NBA season, unveiling a Zippin-enabled Drink MKT beverage store. Basketball fans can scan in with the Zippin mobile app or use their credit card, grab drinks and go. Cameras and shelves with scales identify purchases to automatically charge customers.

The San Antonio debut follows Zippin’s arrival at Denver’s Mile High Stadium in November for limited-capacity Broncos games. Before that, Zippin unveiled its first stadium deployment at the Golden 1 Center in Sacramento, where customers can purchase popcorn, draft beer and other snacks and drinks during Sacramento Kings basketball games and concerts.

“Our mission is to accelerate the adoption of checkout-free stores, and sporting venues are the ideal location to benefit from our approach,” Agrawal said.

Zippin Store Advances  

In addition to stadiums, Zippin has launched stores within stores for grab-and-go food and beverages in Lojas Americanas, a large retail chain in Brazil.

In Russia, the startup has opened a store within a store at an Azbuka Vkusa supermarket in Moscow. Zippin is also in Japan, where it runs a pilot store in Tokyo with convenience store chain Lawson at an office location, as well as another store inside the Yokohama Techno Tower Hotel.

As an added benefit for retailers, Zippin’s platform can track products to help automate inventory management.

“We provide a retailer dashboard to see how much inventory there is for each individual item and which items have run low on stock. We can help to know exactly how much is in the store — all these detailed analytics are part of our offering,” Agrawal said.

Jetson Processing

Zippin relies on the NVIDIA Jetson AI platform for inference at 30 frames per second for its models, enabling split-second decisions on customer purchases. The application’s processing speed means it can keep up with a crowded store.

The company runs convolutional neural networks for product identification and store location identification to help track customer purchases. Zippin’s retail implementations also use smart shelves to determine whether a product was removed from or returned to a shelf.

The NVIDIA edge AI-driven platform can then process the shelf data and the video data together — sensor fusion — to determine almost instantly who grabbed what.
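As a rough illustration of that fusion step, here is a minimal, hypothetical sketch (not Zippin’s actual code) that matches a shelf weight change against camera-tracked shoppers near that shelf and against the product catalog; every name and number below is invented.

```python
# Illustrative sketch only: fusing a shelf weight-change event with camera-based
# shopper tracking to attribute a pick. Not Zippin's implementation.
from dataclasses import dataclass

@dataclass
class ShelfEvent:
    shelf_id: str
    weight_delta_g: float   # negative = item removed, positive = item returned
    timestamp: float

@dataclass
class VisionEvent:
    shelf_id: str
    shopper_id: str          # assigned when the shopper scans in at the entrance
    timestamp: float

def attribute_pick(shelf: ShelfEvent, vision: list[VisionEvent],
                   catalog: dict[str, float], window_s: float = 2.0):
    """Match a weight drop to the shopper the cameras saw at that shelf around the same time."""
    nearby = [v for v in vision
              if v.shelf_id == shelf.shelf_id and abs(v.timestamp - shelf.timestamp) <= window_s]
    if not nearby or shelf.weight_delta_g >= 0:
        return None
    shopper = min(nearby, key=lambda v: abs(v.timestamp - shelf.timestamp)).shopper_id
    # Pick the catalog item whose unit weight best explains the measured change.
    item = min(catalog, key=lambda sku: abs(catalog[sku] + shelf.weight_delta_g))
    return shopper, item

catalog = {"soda_can": 355.0, "popcorn_bag": 120.0}   # grams, hypothetical
shelf = ShelfEvent("A3", -356.0, 100.0)
vision = [VisionEvent("A3", "shopper_42", 99.6)]
print(attribute_pick(shelf, vision, catalog))          # -> ('shopper_42', 'soda_can')
```

In production, this kind of attribution has to run continuously against the 30-frames-per-second camera streams described above, which is where the Jetson platform’s inference throughput comes in.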

“It can deploy and work effectively on two out of three sensors (visual, weight and location) and then figure out the products on the fly, with training ongoing in action in deployment to improve the system,” said Agrawal.


GFN Thursday Set to Evolve as Biomutant Comes to GeForce NOW on May 25

GeForce NOW is always evolving, and so is this week’s GFN Thursday. Biomutant, the new open-world action RPG from Experiment 101 and THQ Nordic, is coming to GeForce NOW when it releases on May 25.

Everybody Was Kung Fu Fighting

Biomutant puts you in the role of an anthropomorphic rodent with swords, guns and martial arts moves to explore a strange, new open world. Your kung fu creature will evolve as you play thanks to the game’s mutations system, granting new powers and resistances to help you on your journey.

Each mutation will change how you fight and survive, with combat that challenges you to build ever-increasing attack combinations. You’ll mix and match melee attacks, ranged weapons and Psi powers to emerge victorious.

As you evolve your warrior’s abilities, you’ll also change your in-game look. Players will craft new weapons and armor from items found during their adventures. For gamers who love customizing their build, Biomutant looks to be a dream game.

Adventure Across Your Devices

PC gamers have been eagerly anticipating Biomutant’s release, and now GeForce NOW members will be able to take their adventure with them at GeForce quality, across nearly all of their devices.

“It is great that PC gamers can play Biomutant even without relying on powerful hardware via GeForce NOW,” said Stefan Ljungqvist, art and creative director at Experiment 101.

“We love that GeForce NOW will help introduce even more players to this ever-evolving furry creature and its lush game world,” said Florian Emmerich, head of PR at THQ Nordic, Biomutant’s publisher. “Now players can discover everything we have in store for them, even if their PC isn’t the latest and greatest.”

[Image: Biomutant on GeForce NOW. There’s a gorgeous world to explore in Biomutant, and with GeForce NOW, you can explore it even on lower-powered devices.]

From what we’ve seen so far, the world that Experiment 101 has built is lush and gorgeous. GeForce NOW members will get to explore it as it was meant to be seen, even on a Chromebook, Mac or mobile device.

How will you evolve when Biomutant releases on May 25? Let us know in the comments below.

Get Your Game On

What GFN Thursday would be complete without more games? Members can look forward to the following this week:

Returning to GeForce NOW:

  • The Wonderful 101: Remastered (Steam)

What are you planning to play this weekend? Let us know on Twitter or in the comments below.


Keeping Games up to Date in the Cloud

GeForce NOW ensures your favorite games are automatically up to date, sparing you from downloading game updates and patches yourself. Simply log in, click PLAY and enjoy an optimal cloud gaming experience.

Here’s an overview of how the service keeps your library game ready at all times.

Updating Games for All GeForce NOW Members

When a gamer downloads an update on their PC, all that matters is their individual download.

In the cloud, the GeForce NOW team goes through the steps of patching or maintenance and makes the updated game available for millions of members around the world.

Depending on when you happen to hop on GeForce NOW, you may not even see these updates taking place. In some cases, you’ll be able to keep playing your game while we’re updating game bits. In others, we may need to complete the backend work before it’s ready to play.

Patching in GeForce NOW

Patching is the process of making changes to a game or its supporting data to update, fix or improve it.

It typically takes a few minutes, sometimes longer, for a game to patch on GeForce NOW.

When a developer releases a new version of a game, work is done on the backend of GeForce NOW to download that patch, replicate each bit to all of our storage systems, test for the proper security features and, finally, copy it to all of our data centers worldwide, where it becomes available to gamers.

GeForce NOW handles this entire process so gamers and developers can focus on, well, gaming and developing.
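For a rough sense of that flow, here is a minimal, hypothetical sketch (not GeForce NOW’s production code) of a patch rollout pipeline; every function below is an illustrative stand-in for the real download, replication, security-check and distribution steps.

```python
# Illustrative sketch only: the high-level patch rollout flow described above.
# All function names and behaviors are hypothetical stand-ins.

def fetch_patch_from_publisher(game_id: str, version: str) -> bytes:
    return b"...patch bits..."        # placeholder for the real download

def passes_security_checks(patch: bytes) -> bool:
    return len(patch) > 0             # placeholder for signature and malware scanning

def replicate_to_storage(patch: bytes, regions: list[str]) -> None:
    for region in regions:
        print(f"replicated {len(patch)} bytes to storage in {region}")

def roll_out(game_id: str, version: str, regions: list[str]) -> None:
    patch = fetch_patch_from_publisher(game_id, version)
    if not passes_security_checks(patch):
        raise RuntimeError("patch failed security checks; keeping previous version live")
    replicate_to_storage(patch, regions)
    for region in regions:
        print(f"{game_id} {version} is now game ready in {region}")

roll_out("biomutant", "1.0.1", ["us-west", "eu-central", "ap-northeast"])
```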

Different Types of Patching 

Three types of patching occur on GeForce NOW:

  • On-seat patching lets you play your game as-is while the patch downloads in the background, meaning you’re always game ready.
  • Offline patching happens for games that don’t support on-seat patching. Our automated software system downloads the patch on the backend and then updates the game. Offline patching can take minutes to hours.
  • Distributed patching is a quicker type of patching: only the changed bits are downloaded and installed, rather than the entire game (often 100GB or so), and then copied onto our servers. Typically this takes 30 minutes or less.

GeForce NOW continues to work with game developers to receive patches before they’re released on PC, allowing for real-time preparations in the cloud and resulting in zero wait for gamers, period.

Maintenance Mode Explained 

Maintenance mode is the status of a game that’s been taken offline for bigger fixes such as bugs that cause poor performance, issues with save games, or patches that need more time to deploy.

Depending on severity, a game in maintenance can be offline anywhere from days to weeks. We work to keep these maintenance times to a minimum, and often work directly with game developers to resolve these issues.

Eliminate Waiting for More Gaming

Our goal is to do all of the patching and maintenance behind the scenes — so that when you’re ready to play, you’re game ready and playing instantly.

Get started with your gaming adventures on GeForce NOW.

Follow GeForce NOW on Facebook and Twitter and stay up to date on the latest features and game launches. 


Create in Record Time with New NVIDIA Studio Laptops from Dell, HP, Lenovo, Gigabyte, MSI and Razer

New NVIDIA Studio laptops from Dell, HP, Lenovo, Gigabyte, MSI and Razer were announced today as part of the record-breaking GeForce laptop launch. The new Studio laptops are powered by GeForce RTX 30 Series and NVIDIA RTX professional laptop GPUs, including designs with the new GeForce RTX 3050 Ti and 3050 laptop GPUs, and the latest 11th Gen Intel mobile processors.

Studio laptops with GeForce RTX 3050 Ti and 3050 GPUs are perfect for graphic designers, photographers and video editors, bringing high performance at affordable prices to artists and students.

The NVIDIA Broadcast app has been updated to version 1.2, bringing new room echo removal and video noise removal features, updated general noise removal and the ability to stack multiple effects for NVIDIA RTX users. The update can be downloaded here.

And it’s all supported by the May NVIDIA Studio Driver, also available today.

Creating Made Even Easier

With the latest NVIDIA Ampere architecture, Studio laptops accelerate creative workflows with real-time ray tracing, AI and dedicated video acceleration. Creative apps run faster than ever, taking full advantage of new AI features that save time and enable all creators to apply effects that previously were limited to the most seasoned experts.

The new line of NVIDIA Studio laptops introduces a wider range of options, making finding the perfect system easier than ever.

  • Creative aficionados who are into photography, graphic design or video editing can do more, faster, with new GeForce RTX 3050 Ti and 3050 laptops, and RTX A2000 professional laptops. They introduce AI acceleration, best-in-class video hardware encoding and GPU acceleration in hundreds of apps. With reduced power consumption and 14-inch designs as thin as 16mm, they bring RTX to the mainstream, making them perfect for students and creators on the go.
  • Advanced creators can step up to laptops powered by GeForce RTX 3070 and 3060 laptop GPUs or NVIDIA RTX A4000 and A3000 professional GPUs. They offer greater performance in up to 6K video editing and 3D rendering, providing great value in elegant Max-Q designs that can be paired with 1440p displays, widely available in laptops for the first time.
  • Expert creators will enjoy the power provided by the GeForce RTX 3080 laptop GPU, available in two variants, with 8GB or 16GB of video memory, or the NVIDIA RTX A5000 professional GPU, with 16GB of video memory. The additional memory is perfect for working with large 3D assets or editing 8K HDR RAW videos. At 16GB, these laptops provide creators working across multiple apps with plenty of memory to ensure these apps run smoothly.

The laptops are powered by third-generation Max-Q technologies. Dynamic Boost 2.0 intelligently shifts power between the GPU, GPU memory and CPU to accelerate apps, improving battery life. WhisperMode 2.0 controls the acoustic volume for the laptop, using AI-powered algorithms to dynamically manage the CPU, GPU and fan speeds to deliver quieter acoustics. For 3D artists, NVIDIA DLSS 2.0 utilizes dedicated AI processors on RTX GPUs called Tensor Cores to boost frame rates in real-time 3D applications such as D5 Render, Unreal Engine 4 and NVIDIA Omniverse.

Meet the New NVIDIA Studio Laptops 

Thirteen new Studio laptops were introduced today, including:

  • Nine new models from Dell: the professional-grade Precision 5560, 5760, 7560 and 7760; creator dream team XPS 15 and XPS 17; redesigned Inspiron 15 Plus and Inspiron 16 Plus; and the small-business-ready Vostro 7510. The Dell Precision 5560 and XPS 15 debut with elegant, thin, world-class designs featuring creator-grade panels.
  • HP debuts the updated ZBook Studio G8, the world’s most powerful laptop of its size, featuring an incredible DreamColor display with a 120Hz refresh rate and a wide array of configuration options including GeForce RTX 3080 16GB and NVIDIA RTX A5000 laptop GPUs.
  • Lenovo introduced the IdeaPad 5i Pro, with a factory-calibrated, 100 percent sRGB panel, available in 14- and 16-inch configurations with GeForce RTX 3050, as well as the ThinkBook 16p, powered by GeForce RTX 3060.

Gigabyte, MSI and Razer also refreshed their Studio laptops, originally launched earlier this year, with new Intel 11th Gen CPUs, including the Gigabyte AERO 15 OLED and 17 HDR, MSI Creator 15 and Razer Blade 15.

The Proof is in the Perf’ing

The Studio ecosystem is flush with support for top creative apps. In total, more than 60 have RTX-specific benefits.

GeForce RTX 30 Series and NVIDIA RTX professional Studio laptops save time (and money) by enabling creators to complete creative tasks faster.

Video specialists can expect to edit 3.1x faster in Adobe Premiere Pro on Studio laptops with a GeForce RTX 3050 Ti, and 3.9x faster with an RTX 3060, compared to CPU alone.

Studio laptops shave hours off a single project by reducing time in playback, unlocking GPU-accelerated effects in real-time video editing and frame rendering, and faster exports with NVIDIA encoding.

Color grading in Blackmagic Design’s DaVinci Resolve, and editing using features such as face refinement and optical flow, is 6.8x faster with a GeForce RTX 3050 Ti than on CPU alone.

Edits that took 14 minutes with RTX 3060 would have taken 2 hours with just the CPU.

3D artists working in Blender on a GeForce RTX 3080-powered laptop can render an astonishing 24x faster than with the CPU alone.

A heavy scene that would take 1 hour to render on a MacBook Pro 16 takes only 8 minutes to render on an RTX 3080.

Adobe Photoshop Lightroom completes Enhance Details on RAW photos 3.7x faster with a GeForce RTX 3050 Ti, compared to an 11th Gen Intel i7 CPU, while Adobe Illustrator users can zoom and pan canvases twice as fast with an RTX 3050.

Regardless of your creative field, Studio laptops with GeForce RTX 30 Series and RTX professional laptop GPUs will speed up your workflow.

May Studio Driver and Creative App Updates

Two popular creator applications added Tensor Core support this month, accelerating workflows. Both, along with the new Studio laptops, are supported by the latest Studio Driver.

Topaz Labs Gigapixel upscales imagery by up to 600 percent while maintaining impressive original image quality.

Video Enhance AI is a collection of upscaling, denoising and restoration features.

With Studio Driver support, both Topaz apps are at least 6x faster with a GeForce RTX 3060 than with a CPU alone.

Recent NVIDIA Omniverse updates include new app and connector betas, available to download now.

Omniverse Machinima offers a suite of tools and extensions that enable users to render realistic graphics and animation using scenes and characters from games. Omniverse Audio2Face creates realistic facial expressions and motions to match any voice-over track.

AVerMedia integrated the NVIDIA Broadcast virtual background and audio noise removal AI features natively into their software suite to improve broadcasting abilities without requiring special equipment.

Download the May Studio Driver (release 462.59) today through GeForce Experience or from the driver download page to get the latest optimizations for the new Studio laptops and applications.

Get regular updates for creators by subscribing to the NVIDIA Studio newsletter and following us on Facebook, Twitter and Instagram.


Sharpening Its Edge: U.S. Postal Service Opens AI Apps on Edge Network

In 2019, the U.S. Postal Service needed to identify and track items in its torrent of more than 100 million pieces of daily mail.

A USPS AI architect had an idea. Ryan Simpson wanted to expand an image analysis system a postal team was developing into something much broader that could tackle this needle-in-a-haystack problem.

With edge AI servers strategically located at its processing centers, he believed USPS could analyze the billions of images each center generated. The resulting insights, expressed in a few key data points, could be shared quickly over the network.

Simpson, half a dozen architects at NVIDIA and others designed the deep-learning models needed in a three-week sprint that felt like one long hackathon. The work was the genesis of the Edge Computing Infrastructure Program (ECIP, pronounced EE-sip), a distributed edge AI system that’s up and running on the NVIDIA EGX platform at USPS today.

An AI Platform at the Edge

It turns out edge AI is a kind of stage for many great performances. ECIP is already running a second app that acts like automated eyes, tracking items for a variety of business needs.

[Image: USPS camera gantry. Cameras mounted on the sorting machines capture addresses, barcodes and other data such as hazardous materials symbols. Courtesy of U.S. Postal Service.]

“It used to take eight or 10 people several days to track down items; now it takes one or two people a couple hours,” said Todd Schimmel, the manager who oversees USPS systems including ECIP, which uses NVIDIA-Certified edge servers from Hewlett Packard Enterprise.

Another analysis was even more telling. It said a computer vision task that would have required two weeks on a network of servers with 800 CPUs can now get done in 20 minutes on the four NVIDIA V100 Tensor Core GPUs in one of the HPE Apollo 6500 servers.

Today, each edge server processes 20 terabytes of images a day from more than 1,000 mail processing machines. Open-source software from NVIDIA, the Triton Inference Server, acts as the digital mailperson, delivering the AI models each of the 195 systems needs — when and how they need them.

Next App for the Edge

USPS put out a request for what could be the next app for ECIP, one that uses optical character recognition (OCR) to streamline its imaging workflow.

“In the past, we would have bought new hardware, software — a whole infrastructure for OCR; or if we used a public cloud service, we’d have to get images to the cloud, which takes a lot of bandwidth and has significant costs when you’re talking about approximately a billion images,” said Schimmel.

Today, the new OCR use case will live as a deep learning model in a container on ECIP managed by Kubernetes and served by Triton.

The same systems software smoothed the initial deployment of ECIP in the early weeks of the pandemic. Operators rolled out containers to get the first systems running as others were being delivered, updating them as the full network of nearly 200 nodes was installed.

“The deployment was very streamlined,” Schimmel said. “We awarded the contract in September 2019, started deploying systems in February 2020 and finished most of the hardware by August — the USPS was very happy with that,” he added.

Triton Expedites Model Deliveries

Part of the software magic dust under ECIP’s hood, Triton automates the delivery of different AI models to different systems that may have different versions of GPUs and CPUs supporting different deep-learning frameworks. That saves a lot of time for edge AI systems like the ECIP network of almost 200 distributed servers.
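As a rough illustration of what requesting inference from a Triton-served model looks like, here is a minimal, hypothetical client sketch using Triton’s Python HTTP client; the model name, tensor names, shape and preprocessing are invented stand-ins, not the USPS configuration.

```python
# Illustrative sketch only: a generic Triton Inference Server client request.
# The model name ("package_detector"), tensor names and shapes are hypothetical.
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# One 1024x1024 RGB image, normalized to [0, 1], NCHW layout (hypothetical preprocessing).
image = np.random.rand(1, 3, 1024, 1024).astype(np.float32)

infer_input = httpclient.InferInput("input__0", list(image.shape), "FP32")
infer_input.set_data_from_numpy(image)
requested = httpclient.InferRequestedOutput("output__0")

# Triton routes the request to whichever version of the model is loaded on this server.
response = client.infer(model_name="package_detector", inputs=[infer_input], outputs=[requested])
detections = response.as_numpy("output__0")
print(detections.shape)
```

Because the client only names the model and its tensors, the same request works whether Triton is serving that model from a TensorRT, ONNX or framework-native backend on a given server’s GPUs or CPUs.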

[Image: NVIDIA DGX servers at USPS. AI algorithms were developed on NVIDIA DGX servers at a U.S. Postal Service Engineering facility. Courtesy of NVIDIA.]

The app that checks for mail items alone requires coordinating the work of more than a half dozen deep-learning models, each checking for specific features. And operators expect to enhance the app with more models enabling more features in the future.

“The models we have deployed so far help manage the mail and the Postal Service — it helps us maintain our mission,” Schimmel said.

A Pipeline of Edge AI Apps

So far, departments across USPS from enterprise analytics to finance and marketing have spawned ideas for as many as 30 applications for ECIP. Schimmel hopes to get a few of them up and running this year.

One would automatically check if a package carries the right postage for its size, weight and destination. Another one would automatically decipher a damaged barcode and could be online as soon as this summer.

“This has a benefit for us and our customers, letting us know where a specific parcel is at — it’s not a silver bullet, but it will fill a gap and boost our performance,” he said.

The work is part of a broader effort at USPS to explore its digital footprint and unlock the value of its data in ways that benefit customers.

“We’re at the very beginning of our journey with edge AI. Every day, people in our organization are thinking of new ways to apply machine learning to new facets of robotics, data processing and image handling,” he said.

Learn more about the benefits of edge computing and the NVIDIA EGX platform, as well as how NVIDIA’s edge AI solutions are transforming every industry.

Pictured at top: Postal Service employees perform spot checks to ensure packages are properly handled and sorted. Courtesy of U.S. Postal Service.


GFN Thursday: 61 Games Join GeForce NOW Library in May

May’s shaping up to be a big month for bringing fan-favorites to GeForce NOW. And since it’s the first week of the month, this week’s GFN Thursday is all about the games members can look forward to joining the service this month.

In total, we’re adding 61 games to the GeForce NOW library in May, including 17 coming this week.

Joining This Week

This week’s additions include games from Remedy Entertainment, a classic Wild West FPS and a free title on Epic Games Store. Here are a few highlights:

Alan Wake (Steam)

A Dark Presence stalks the small town of Bright Falls, pushing Alan Wake to the brink of sanity in his fight to unravel the mystery and save his love.

Call of Juarez: Gunslinger (Steam)

From the dust of a gold mine to the dirt of a saloon, Call of Juarez Gunslinger is a real homage to Wild West tales. Live the epic and violent journey of a ruthless bounty hunter on the trail of the West’s most notorious outlaws.

Pine (Free on Epic Games Store until May 13)

An open-world action-adventure game set in a simulated world in which humans never reached the top of the food chain. Fight with or against a variety of species as you make your way to a new home for your human tribe.

Members can also look for the following titles later today:

[Image: MotoGP21 on GeForce NOW. Push your bike to the limit in MotoGP21, joining the GeForce NOW library this week.]

More in May

This week is just the beginning. We have a giant list of titles joining GeForce NOW throughout the month, including:

In Case You Missed It

In April, we added 27 more titles beyond those shared on April 1. Check out these games, streaming straight from the cloud:

Time to start planning your month, members. What are you going to play? Let us know on Twitter or in the comments below.


Putting the AI in Retail: Walmart’s Grant Gelven on Prediction Analytics at Supercenter Scale

With only one U.S. state without a Walmart supercenter — and over 4,600 stores across the country — the retail giant’s prediction analytics work with data on an enormous scale.

Grant Gelven, a machine learning engineer at Walmart Global Tech, joined NVIDIA AI Podcast host Noah Kravitz for the latest episode of the AI Podcast.

Gelven spoke about the big data and machine learning methods making it possible to improve everything from the customer experience to stocking to item pricing.

Gelven’s most recent project has been a dynamic pricing system, which reduces excess food waste by pricing perishable goods at a cost that ensures they’ll be sold. This improves suppliers’ ability to deliver the correct volume of items and customers’ ability to purchase them, while lessening the company’s impact on the environment.
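As a toy illustration of that idea (not Walmart’s system), here is a hypothetical sketch that marks a perishable item down just enough that forecast demand clears the remaining stock before it expires; the simple demand model and every number below are invented.

```python
# Toy sketch only: mark down a perishable item until predicted demand clears remaining
# stock before expiry. Not Walmart's pricing system; the demand model is hypothetical.
def markdown_price(base_price: float, units_left: int, days_left: int,
                   daily_demand_at_base: float, price_elasticity: float = 1.5) -> float:
    """Lower the price until forecast demand over the remaining days covers the stock."""
    price = base_price
    for _ in range(20):                       # small search over candidate discounts
        demand = daily_demand_at_base * (base_price / price) ** price_elasticity
        if demand * days_left >= units_left:  # forecast sells out before expiry
            return round(price, 2)
        price *= 0.95                         # try a further 5% markdown
    return round(price, 2)

# 40 packs of strawberries left, 2 days of shelf life, ~8 packs/day sell at $4.00.
print(markdown_price(base_price=4.00, units_left=40, days_left=2, daily_demand_at_base=8.0))
```

The production problem is, of course, vastly larger: as noted below, it means making predictions like this for hundreds of millions of distinct units distributed across the country.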

The models that Gelven’s team works on are extremely large, with hundreds of millions of parameters. They’re impossible to run without GPUs, which are helping accelerate dataset preparation and training.

The improvements that machine learning has made to Walmart’s retail predictions reach even further than streamlining business operations. Gelven points out that it’s ultimately helped customers worldwide get the essential goods they need, by allowing enterprises to react to crises and changing market conditions.

Key Points From This Episode:

  • Gelven’s goal for enterprise AI and machine learning models isn’t just to solve single use case problems, but to improve the entire customer experience through a complex system of thousands of models working simultaneously.
  • Five years ago, the time from concept to model to operations took roughly a year. Gelven explains that GPU acceleration, open-source software, and various other new tools have drastically reduced deployment times.

Tweetables:

“Solving these prediction problems really means we have to be able to make predictions about hundreds of millions of distinct units that are distributed all over the country.” — Grant Gelven [3:17]

“To give customers exactly what they need when they need it, I think is probably one of the most important things that a business or service provider can do.” — Grant Gelven [16:11]

You Might Also Like:

Focal Systems Brings AI to Grocery Stores

CEO Francois Chaubard explains how Focal Systems is applying deep learning and computer vision to automate portions of retail stores to streamline store operations and get customers in and out more efficiently.

Credit Check: Capital One’s Kyle Nicholson on Modern Machine Learning in Finance

Kyle Nicholson, a senior software engineer at Capital One, talks about how modern machine learning techniques have become a key tool for financial and credit analysis.

HP’s Jared Dame on How AI, Data Science Are Driving Demand for Powerful New Workstations

Jared Dame, Z by HP’s director of business development and strategy for AI, data science and edge technologies, speaks about the role HP’s workstations play in cutting-edge AI and data science.

Tune in to the AI Podcast

Get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn. If your favorite isn’t listed here, drop us a note.


Make the AI Podcast Better

Have a few minutes to spare? Fill out this listener survey. Your answers will help us make a better podcast.
