Get Outta My Streams, Get Into My Car: Aston Martin Designs Immersive Extended Reality Experience for Customers

Legendary car manufacturer Aston Martin is using the latest virtual and mixed reality technologies to drive new experiences for customers and designers.

The company worked with Lenovo to use VR and AR to deliver a unique experience that lets customers explore its first luxury SUV, the Aston Martin DBX, without physically visiting dealerships or offices.

With the Lenovo ThinkStation P620 powered by NVIDIA RTX A6000 graphics, Aston Martin is able to serve up an immersive experience of the Aston Martin DBX. The stunning demo consists of over 10 million polygons, enabling users to view incredibly detailed, photorealistic visuals in virtual, augmented and mixed reality — collectively known as extended reality, or XR.

“It’s our partnership with Lenovo workstations — and in particular, ThinkStation P620 — which has enabled us to take this to the next level,” said Pete Freedman, vice president and chief marketing officer of Aston Martin Lagonda. “Our aim has always been to provide our customers with a truly immersive experience, one that feels like it brings them to the center of the automotive product, and we’ve only been able to do this with the NVIDIA RTX A6000.”

NVIDIA RTX Brings the XR Factor

Customers would typically visit Aston Martin dealerships, attend motor shows or tour the company’s facilities in the U.K. to explore the latest car models. A team would walk them through the design and features in person.

But after everyone started working remotely, Aston Martin decided to take a fresh look at what’s truly possible and investigate options to take the experience directly to customers — virtually.

With the help of teams from Lenovo and Varjo, an XR headset maker, the automaker developed the demo that provides an immersive look at the new Aston Martin DBX using VR and XR.

The experience, which is rendered from the NVIDIA RTX-powered ThinkStation P620, allows virtual participants to enter the environment and see a pixel-perfect representation of the Aston Martin DBX. Customers with XR headsets can explore the virtual vehicle from anywhere in the world, and see details such as the stitching and lettering on the steering wheel, leather and chrome accents, and even the reflections within the paint.

The real-time reflections and illumination in the demo were enabled by Varjo’s pass-through mixed reality technology. The Varjo XR-3’s LiDAR with RGB Depth Fusion using NVIDIA’s Optical Flow gives users the perception that the car is in the room, seamlessly blending the real world and virtual car together.

With the NVIDIA RTX A6000, the immersive demo runs smoothly and efficiently, providing users with high-quality graphics and stunning detail.

“As you dial up the detail, you need high-end GPUs. You need large GPU frame buffers to build the most photorealistic experiences, and that’s exactly what the NVIDIA RTX A6000 delivers,” said Mike Leach, worldwide solution portfolio lead at Lenovo.

The NVIDIA RTX A6000 is based on the NVIDIA Ampere GPU architecture and delivers a 48GB frame buffer. This allows teams to create high-fidelity VR and AR experiences with consistent framerates.

Aston Martin will expand its use of VR and XR to enhance internal workflows, as well. With this new experience, the design teams can work in virtual environments and iterate more quickly earlier in the process, instead of creating costly models.

Watch Lenovo’s GTC session to hear more about Aston Martin’s story.

Learn more about NVIDIA RTX and how our latest technology is powering the most immersive environments across industries.


AI Researcher Explains Deep Learning’s Collision Course with Particle Physics

For a particle physicist, the world’s biggest questions — how did the universe originate and what’s beyond it — can only be answered with help from the world’s smallest building blocks.

James Kahn, a consultant with German research platform Helmholtz AI and a collaborator on the global Belle II particle physics experiment, uses AI and the NVIDIA DGX A100 to understand the fundamental rules governing particle decay.

Kahn spoke with NVIDIA AI Podcast host Noah Kravitz about the specifics of how AI is accelerating particle physics.

He also touched on his work at Helmholtz AI. Kahn helps researchers in fields spanning medicine to earth sciences apply AI to the problems they’re solving. His wide-ranging career — from particle physicist to computer scientist — shows how AI accelerates every industry.

Key Points From This Episode:

  • Particle physics research, which involves numerous simulations and constant adjustments, demands massive AI horsepower. Kahn’s team used the DGX A100 to reduce the time it takes to optimize simulations from a week to roughly a day.
  • The majority of Kahn’s work is global — at Helmholtz AI, he collaborates with researchers from Beijing to Tel Aviv, with projects located anywhere from the Southern Ocean to Spain. And at the Belle II experiment, Kahn is one of more than 1,000 researchers from 26 countries.

Tweetables:

“If you’re trying to simulate all the laws of physics, that’s a lot of simulations … that’s where these big, powerful machines come into play.” — James Kahn [6:02]

“AI is seeping into every aspect of research.” — James Kahn [16:37]

You Might Also Like:

Speed of Light: SLAC’s Ryan Coffee Talks Ultrafast Science

Particle physicist Ryan Coffee, senior staff scientist at the SLAC National Accelerator Laboratory, talks about how he is putting deep learning to work.

A Conversation About Go, Sci-Fi, Deep Learning and Computational Chemistry

Olexandr Isayev, an assistant professor at the UNC Eshelman School of Pharmacy at the University of North Carolina at Chapel Hill, explains how deep learning, abstract strategy board game Go, sci-fi and computational chemistry intersect.

How Deep Learning Can Accelerate the Quest for Cheap, Clean Fusion Energy

William Tang, principal research physicist at the Princeton Plasma Physics Laboratory, is one of the world’s foremost experts on how the science of fusion energy and HPC intersect. He talks about how he sees AI enabling the quest to deliver fusion energy.

Tune in to the AI Podcast

Get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn. If your favorite isn’t listed here, drop us a note.


Make the AI Podcast Better

Have a few minutes to spare? Fill out this listener survey. Your answers will help us make a better podcast.


From Gaming to Enterprise AI: Don’t Miss NVIDIA’s Computex 2021 Keynote

NVIDIA will deliver a double-barrelled keynote packed with innovations in AI, the cloud, data centers and gaming at Computex 2021 in Taiwan, on June 1.

NVIDIA’s Jeff Fisher, senior vice president of GeForce gaming products, will discuss how NVIDIA is addressing the explosive growth in worldwide gaming. And Manuvir Das, head of enterprise computing at the company, will talk about democratizing AI to put more AI capabilities within reach of more enterprises.

Hosted by the Taiwan External Trade Development Council, Computex has long been one of the world’s largest enterprise and consumer trade shows. Alongside its partners, NVIDIA has introduced a host of innovations at Computex over the years.

This year’s show will be both live and digital, giving technology enthusiasts around the world an opportunity to watch. You can tune in to the keynote, titled “The Transformational Power of Accelerated Computing, from Gaming to the Enterprise Data Center,” from our event landing page, or from our YouTube channel starting at 1 p.m. Taiwan time on June 1 (10 p.m. Pacific time on May 31).

Besides the keynote, NVIDIA will hold three talks at Computex forums.

Ali Kani, vice president and general manager of automotive at NVIDIA, will talk about “Transforming the Transportation Industry with AI” at the Future Car Forum on June 1, from 11 a.m. to 1 p.m. Taiwan time.

Jerry Chen, NVIDIA’s head of global business development for manufacturing and industrials, will discuss “The Promise of Digital Transformation: How AI-Infused Industrial Systems Are Rising to Meet the Challenges” at the AIoT Forum on June 2, at 11 a.m. Taiwan time.

And Richard Kerris, head of worldwide developer relations and general manager of NVIDIA Omniverse, will deliver a talk on the topic of “The Metaverse Begins: NVIDIA Omniverse and a Future of Shared Worlds,” on June 3 from 3:30 to 4 p.m. Taiwan time.

Key image credit: Arlene Hu, some rights reserved


A Further Step to Getting GeForce Cards into the Hands of Gamers

GeForce products are made for gamers — and packed with innovations. Our RTX 30 Series is built on our second-generation RTX architecture, with dedicated RT Cores and Tensor Cores, delivering amazing visuals and performance to gamers and creators.

Because NVIDIA GPUs are programmable, users regularly discover new applications for them, from weather simulation and gene sequencing to deep learning and robotics. Mining cryptocurrency is one of them.

Halving Hash Rate

To help get GeForce GPUs in the hands of gamers, we announced in February that all GeForce RTX 3060 graphics cards shipped with a reduced Ethereum hash rate.

Today, we’re taking additional measures by applying a reduced ETH hash rate to newly manufactured GeForce RTX 3080, RTX 3070 and RTX 3060 Ti graphics cards. These cards will start shipping in late May.

Clear Communication to Gamers

Because these GPUs originally launched with a full hash rate, we want to ensure that customers know exactly what they’re getting when they buy GeForce products. To help with this, our GeForce partners are labeling the GeForce RTX 3080, RTX 3070 and RTX 3060 Ti cards with a “Lite Hash Rate,” or “LHR,” identifier. The identifier will be in retail product listings and on the box.

This reduced hash rate only applies to newly manufactured cards with the LHR identifier and not to cards already purchased.

GeForce Is Made for Gaming

GeForce RTX GPUs have introduced a range of cutting-edge technologies — RTX real-time ray tracing, AI-powered DLSS frame rate booster, NVIDIA Reflex super-fast response rendering for best system latency, and many more — created to meet the needs of gamers and those who create digital experiences.

We believe this additional step will get more GeForce cards at better prices into the hands of gamers everywhere.

 


NVIDIA BlueField DPUs Fuel Unprecedented Data Center Transformation

Cloud computing and AI are pushing the boundaries of scale and performance for data centers.

Anticipating this shift, industry leaders such as Baidu, Palo Alto Networks, Red Hat and VMware are using NVIDIA BlueField DPUs to transform their data centers into higher-performing, more secure and more agile platforms, and to bring differentiated products and services to market.

NVIDIA BlueField DPUs are designed to meet the infrastructure requirements of modern data centers running today’s cloud computing and AI workloads:

  • Upgraded Efficiency: BlueField DPUs enable all available CPU resources to run business applications, freeing up CPUs that would otherwise have been used to support software-defined networking.
  • Increased Ability to Scale: Cloud-native applications are highly distributed and create intensive “east-west” traffic within data centers. BlueField DPUs provide a high-throughput, low-latency network environment for scale-out applications.
  • Leading-Edge Security: Multi-tenancy and infrastructure elasticity in cloud data centers pose privacy and confidentiality risks that are addressed by BlueField DPUs.
  • Enhanced Performance: BlueField DPUs provide robust and powerful networking to handle the growing prevalence of GPU-accelerated computing in the cloud, enterprise and edge.

It can seem overwhelming for organizations to deliver on these new requirements quickly and efficiently, while also protecting each individual workload. Below are examples of how NVIDIA’s customers and partners are leveraging BlueField DPUs to offload, accelerate and isolate infrastructure software in ways that dramatically improve data center performance, scalability and security.

Baidu Delivers on Promise of Bare-Metal Clouds

Baidu has deployed NVIDIA BlueField DPUs for its scale-out bare-metal cloud infrastructure. One of the key advantages of bare-metal cloud computing lies in its ability to deliver predictable and consistent performance by allowing direct access to the hardware.

Traditional bare-metal infrastructure lacks the operational flexibility and agility that the virtualized cloud provides. BlueField empowers Baidu to easily provision bare-metal compute instances to millions of companies in China with high-speed networking and tenant isolation.

An ideal use case for DPUs, NVIDIA BlueField has helped Baidu transform its bare-metal servers from hardware-defined infrastructure into software-defined, hardware-accelerated cloud platforms.

Red Hat Brings Composable Compute to the Edge

Red Hat closely collaborates with NVIDIA to enable GPU-accelerated AI computing and drive innovation across the entire stack.

The open-source software titan has developed a DPU-powered composable compute infrastructure to address the stringent requirements for performance, latency and security in edge data centers. When deployed in a fleet of 50,000 nodes, this state-of-the-art infrastructure would save up to $70 million thanks to BlueField’s efficient, software-defined hardware acceleration engines.

At a basic level, Red Hat’s composable compute is optimized for the edge as it uses bare-metal as a service together with BlueField’s domain-specific hardware acceleration to boost the performance of enterprise workloads running on the flagship Red Hat Enterprise Linux operating system or the full-scale OpenShift Kubernetes platform.

A leader in application containerization, Red Hat is extending the software microservices concept to hardware, enabling zero-trust security and operational efficiency for enterprise IT and DevOps teams alike.

Palo Alto Networks Accelerates 5G-Native Security

5G wireless networks are set to reshape digital business and raise new security concerns. At the forefront of cybersecurity, Palo Alto Networks has taken a unique approach to bringing security to 5G networks.

Building on its expertise in network, cloud and device security, the company has created a 5G-native security solution that consists of 5G context-driven security, automation and service protection. To address the demanding data processing requirements of 5G, Palo Alto Networks has integrated its industry-leading next-generation firewall with NVIDIA BlueField DPUs.

The result is a software-defined, hardware-accelerated firewall architecture that will provide line-rate security processing at 100Gb/s. The architecture is available today for service providers and enterprises to test.

VMware Redefines the Hybrid Cloud Architecture

At GTC 2020, VMware and NVIDIA announced a broad partnership to deliver both an AI-Ready Enterprise Platform and a new architecture for data center, cloud and edge that uses BlueField DPUs to support existing and next-generation applications.

A critical component of the AI-Ready Enterprise Platform, the NVIDIA AI Enterprise software suite runs on VMware vSphere and is optimized, certified and supported by NVIDIA to help thousands of VMware customers in the world’s largest industries to unlock the power of AI.

Additionally, VMware is collaborating with NVIDIA to define a next-generation cloud architecture based on VMware Cloud Foundation and BlueField that will offer increased performance, a zero-trust distributed security model and simplified operations. Known as VMware Project Monterey, this effort will help customers take advantage of both CPUs and DPUs while extending VMware infrastructure value to bare-metal applications for the first time.

Together, VMware and NVIDIA are building a high-performance, more secure, efficient and AI-ready cloud platform for the 300,000-plus organizations using VMware’s virtualization platform today.

Next Up: NVIDIA BlueField-3 

Unveiled by NVIDIA CEO Jensen Huang during his GTC21 keynote address, BlueField-3 is the product of the company’s multiyear innovation roadmap, which will see a new BlueField generation every 18 months.

NVIDIA BlueField-3 redefines the art of the possible by delivering the industry’s first DPU to offer 400Gb/s Ethernet and InfiniBand networking with up to 10x the processing power of its predecessor.

Huang also announced NVIDIA DOCA 1.0, a powerful application framework that enables developers to rapidly create applications and services on top of NVIDIA BlueField DPUs. Every new generation of BlueField will support DOCA, which preserves the development investment as each DPU generation evolves.

For customers, this means that today’s applications and data center infrastructure will run even faster when BlueField-3 arrives.

Transforming the Data Center

NVIDIA and its broad partner ecosystem are on a mission to transform the data center by leveraging the NVIDIA BlueField family of data processing units to power the next wave of accelerated cloud and AI computing applications.

Learn more about NVIDIA BlueField data processing units and apply for early access to NVIDIA DOCA SDK.

Be sure to watch the sessions at GTC21 presented in partnership with Red Hat, Palo Alto Networks and VMware.


DiDi Chooses NVIDIA DRIVE for New Fleet of Self-Driving Robotaxis

Robotaxis are one major step closer to becoming reality.

DiDi Autonomous Driving, the self-driving technology arm of mobility technology leader Didi Chuxing, announced last month a strategic partnership with Volvo Cars on autonomous vehicles for DiDi’s self-driving test fleet.

Volvo’s autonomous drive-ready XC90 cars will be the first to integrate DiDi Gemini, a new self-driving hardware platform equipped with NVIDIA DRIVE AGX Pegasus. These vehicles will eventually be deployed in robotaxi services.

These self-driving test vehicles represent significant progress toward commercial robotaxi services.

Robotaxis are autonomous vehicles that can operate on their own in geofenced areas, such as cities or residential communities. With a set of high-resolution sensors and a supercomputing platform in place of a driver, they can safely operate 24 hours a day, seven days a week.

And as a safer alternative to current modes of transit, robotaxis are expected to draw quick adoption once deployed at scale, making up more than 5 percent of vehicle miles traveled worldwide by 2030.

With the high-performance, energy-efficient compute of NVIDIA DRIVE at their core, these vehicles developed by DiDi are poised to help accelerate this landmark transition.

Doubling Up on Redundancy

The key to DiDi’s robotaxi ambitions is its new self-driving hardware platform, DiDi Gemini.

Achieving fully autonomous vehicles requires centralized, high-performance compute. The amount of sensor data a robotaxi needs to process is 100x greater than that of today’s most advanced vehicles.

The complexity in software also increases exponentially, with an array of redundant and diverse deep neural networks running simultaneously as part of an integrated software stack.

Built on NVIDIA DRIVE AGX Pegasus, DiDi Gemini achieves 700 trillion operations per second (TOPS) of performance, and includes up to 50 high-resolution sensors and an ASIL-D rated fallback system. It is architected with multi-layered redundant protections to enhance the overall safety of the autonomous driving experience.

The Gemini platform was designed using Didi Chuxing’s massive database of ride-hailing data as well as real-world autonomous driving test data to deliver the optimal self-driving hardware experience.

A New Generation of Collaboration

DiDi’s test fleet also marks a new era in technology collaboration.

DiDi and Volvo Cars plan to build a long-term partnership, expanding the autonomous test fleets across China and the U.S. and scaling up commercial robotaxi operations. The NVIDIA DRIVE platform enables continuous improvement over the air, facilitating these future plans of development and expansion.

This collaboration combines long-held legacies in vehicle safety, ride-hailing expertise and AI computing to push the bounds of transportation technology for safer, more efficient everyday mobility.


AI Slam Dunk: Startup’s Checkout-Free Stores Provide Stadiums Fast Refreshments

With live sports making a comeback, one thing remains a constant: Nobody likes to miss big plays while waiting in line for a cold drink or snack.

Zippin offers sports fans checkout-free refreshments, and it’s racking up wins among stadiums as well as retailers, hotels, apartments and offices. The startup, based in San Francisco, develops image-recognition models that run on the NVIDIA Jetson edge AI platform to help track customer purchases.

People can simply enter their credit card details into the company’s app, scan into a Zippin-driven store, grab a cold one and any snacks, and go. Their receipt is available in the app afterwards. Customers can also bypass the app and simply use a credit card to enter the stores; Zippin automatically keeps track of their purchases and charges them.

“We don’t want fans to be stuck waiting in line,” said Motilal Agrawal, co-founder and chief scientist at Zippin.

As sports and entertainment venues begin to reopen in limited capacities, Zippin’s grab-and-go stores are offering quicker shopping and better social distancing without checkout lines.

Zippin is a member of NVIDIA Inception, a virtual accelerator program that helps startups in AI and data science get to market faster. “The Inception team met with us, loaned us our first NVIDIA GPU and gave us guidance on NVIDIA SDKs for our application,” said Agrawal.

Streak of Stadiums

Zippin has launched in three stadiums so far, all in the U.S. It’s in negotiations to develop checkout-free shopping for several other major sports venues in the country.

In March, the San Antonio Spurs’ AT&T Center reopened with limited capacity for the NBA season, unveiling a Zippin-enabled Drink MKT beverage store. Basketball fans can scan in with the Zippin mobile app or use their credit card, grab drinks and go. Cameras and shelves with scales identify purchases to automatically charge customers.

The debut in San Antonio comes after Zippin arrived at Mile High Stadium in Denver in November for limited-capacity Broncos games. Before that, Zippin unveiled its first stadium store at the Golden 1 Center in Sacramento, where customers can purchase popcorn, draft beer and other snacks and drinks during Sacramento Kings basketball games and concerts.

“Our mission is to accelerate the adoption of checkout-free stores, and sporting venues are the ideal location to benefit from our approach,” Agrawal said.

Zippin Store Advances  

In addition to stadiums, Zippin has launched stores within stores for grab-and-go food and beverages in Lojas Americanas, a large retail chain in Brazil.

In Russia, the startup has put a store within a store inside an Azbuka Vkusa supermarket in Moscow. Zippin is also in Japan, where it has a pilot store in Tokyo with the convenience store chain Lawson in an office location, as well as another store within the Yokohama Techno Tower Hotel.

As an added benefit for retailers, Zippin’s platform can track products to help automate inventory management.

“We provide a retailer dashboard to see how much inventory there is for each individual item and which items have run low on stock. We can help to know exactly how much is in the store — all these detailed analytics are part of our offering,” Agrawal said.

Jetson Processing

Zippin relies on the NVIDIA Jetson AI platform for inference at 30 frames per second for its models, enabling split-second decisions on customer purchases. The application’s processing speed means it can keep up with a crowded store.

The company runs convolutional neural networks for product identification and store location identification to help track customer purchases. In Zippin’s retail implementations, stores also use smart shelves to determine whether a product was removed from or returned to a shelf.

The NVIDIA edge AI-driven platform can then process the shelf data and the video data together — sensor fusion — to determine almost instantly who grabbed what.
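As a rough illustration of how this kind of fusion can work, the sketch below matches a shelf weight change to the shopper the cameras tracked at that shelf within a short time window. It is a minimal, hypothetical example of the general technique, not Zippin’s implementation; the class names, fields and one-second window are assumptions.

    # Hypothetical sketch of fusing shelf-scale events with camera detections to
    # attribute a purchase. Not Zippin's code; all names and thresholds are illustrative.
    from dataclasses import dataclass

    @dataclass
    class ShelfEvent:
        shelf_id: str
        product_id: str
        weight_delta_g: float   # negative when an item is taken, positive when put back
        timestamp: float        # seconds

    @dataclass
    class VisionDetection:
        shopper_id: str         # identity assigned by the camera-tracking CNNs
        shelf_id: str           # shelf the tracked shopper is reaching into
        timestamp: float

    def attribute_purchase(event, detections, window_s=1.0):
        """Match a shelf weight change to the shopper seen at that shelf within a
        short time window; return (shopper_id, product_id, quantity) or None."""
        candidates = [d for d in detections
                      if d.shelf_id == event.shelf_id
                      and abs(d.timestamp - event.timestamp) <= window_s]
        if not candidates:
            return None  # fall back to vision-only or weight-only heuristics
        closest = min(candidates, key=lambda d: abs(d.timestamp - event.timestamp))
        quantity = 1 if event.weight_delta_g < 0 else -1  # taken vs. returned
        return closest.shopper_id, event.product_id, quantity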

“It can deploy and work effectively on two out of three sensors (visual, weight and location) and then figure out the products on the fly, with training continuing during deployment to improve the system,” said Agrawal.


GFN Thursday Set to Evolve as Biomutant Comes to GeForce NOW on May 25

GeForce NOW is always evolving, and so is this week’s GFN Thursday. Biomutant, the new open-world action RPG from Experiment 101 and THQ Nordic, is coming to GeForce NOW when it releases on May 25.

Everybody Was Kung Fu Fighting

Biomutant puts you in the role of an anthropomorphic rodent with swords, guns and martial arts moves to explore a strange, new open world. Your kung fu creature will evolve as you play thanks to the game’s mutations system, granting new powers and resistances to help you on your journey.

Each mutation will change how you fight and survive, with combat that challenges you to build ever-increasing attack combinations. You’ll mix and match melee attacks, ranged weapons and Psi powers to emerge victorious.

As you evolve your warrior’s abilities, you’ll also change your in-game look. Players will craft new weapons and armor from items found during their adventures. For gamers who love customizing their build, Biomutant looks to be a dream game.

Adventure Across Your Devices

PC gamers have been eagerly anticipating Biomutant’s release, and now GeForce NOW members will be able to take their adventure with them at GeForce quality, across nearly all of their devices.

“It is great that PC gamers can play Biomutant even without relying on powerful hardware via GeForce NOW,” said Stefan Ljungqvist, art and creative director at Experiment 101.

“We love that GeForce NOW will help introduce even more players to this ever-evolving furry creature and its lush game world,” said Florian Emmerich, head of PR at THQ Nordic, Biomutant’s publisher. “Now players can discover everything we have in store for them, even if their PC isn’t the latest and greatest.”

Biomutant on GeForce NOW: There’s a gorgeous world to explore, and with GeForce NOW, you can explore it even on lower-powered devices.

From what we’ve seen so far, the world that Experiment 101 has built is lush and gorgeous. GeForce NOW members will get to explore it as it was meant to be seen, even on a Chromebook, Mac or mobile device.

How will you evolve when Biomutant releases on May 25? Let us know in the comments below.

Get Your Game On

What GFN Thursday is complete without more games? Members can look forward to the following this week:

Returning to GeForce NOW:

  • The Wonderful 101: Remastered (Steam)

What are you planning to play this weekend? Let us know on Twitter or in the comments below.


Keeping Games up to Date in the Cloud

GeForce NOW ensures your favorite games are automatically up to date, sparing you from downloading game updates and patches. Simply log in, click PLAY and enjoy an optimal cloud gaming experience.

Here’s an overview of how the service keeps your library game ready at all times.

Updating Games for All GeForce NOW Members

When a gamer downloads an update on their PC, all that matters is their individual download.

In the cloud, the GeForce NOW team goes through the steps of patching or maintenance and makes the updated game available for millions of members around the world.

Depending on when you happen to hop on GeForce NOW, you may not even see these updates taking place. In some cases, you’ll be able to keep playing your game while we’re updating game bits. In others, we may need to complete the backend work before it’s ready to play.

Patching in GeForce NOW

Patching is the process of making changes to a game or its supporting data to update, fix or improve it.

It typically takes a few minutes, sometimes longer, for a game to patch on GeForce NOW.

When a developer releases a new version of a game, work is done on the backend of GeForce NOW to download that patch, replicate each bit across all of our storage systems, test for the proper security features and finally copy it back to all of our data centers worldwide, making it available to gamers.

GeForce NOW handles this entire process so gamers and developers can focus on, well, gaming and developing.

Different Types of Patching 

Three types of patching occur on GeForce NOW:

  • On-seat patching allows you to play your game as-is while the patch is downloading in the background. Meaning you’re always game ready.
  • Offline patching happens for games that don’t support on-seat patching. Our automated software system downloads the patch on the backend then updates. Offline patching can take minutes to hours.
  • Distributed patching is a quicker type of patching. Specific bits are downloaded and installed, instead of the entire game (100 GB or so). We then finish updating and copy the updated bits to our servers. Typically this takes 30 minutes or less (see the sketch after this list).
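To make the distributed patching idea concrete, here is a minimal, generic sketch of chunk-based delta updating: the installed build’s per-chunk hashes are compared against the new build’s manifest, and only chunks that differ are downloaded. It illustrates the general technique under assumed names and chunk sizes, not GeForce NOW’s actual backend.

    # Generic sketch of chunk-level delta patching. Hypothetical example only;
    # chunk size, manifest format and function names are assumptions.
    import hashlib

    CHUNK_SIZE = 4 * 1024 * 1024  # 4 MB chunks, an arbitrary choice for illustration

    def chunk_hashes(path):
        """Return the SHA-256 hash of each fixed-size chunk of a file."""
        hashes = []
        with open(path, "rb") as f:
            while True:
                chunk = f.read(CHUNK_SIZE)
                if not chunk:
                    break
                hashes.append(hashlib.sha256(chunk).hexdigest())
        return hashes

    def chunks_to_download(installed_hashes, new_manifest):
        """Compare installed chunk hashes against the new build's manifest and
        return the indexes of chunks that are new or have changed."""
        return [i for i, h in enumerate(new_manifest)
                if i >= len(installed_hashes) or installed_hashes[i] != h]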

GeForce NOW continues to work with game developers to receive patch updates before they release on PC, allowing for real-time preparations in the cloud and resulting in zero wait for gamers, period.

Maintenance Mode Explained 

Maintenance mode is the status of a game that’s been taken offline for bigger fixes such as bugs that cause poor performance, issues with save games, or patches that need more time to deploy.

Depending on severity, a game in maintenance can be offline anywhere from days to weeks. We work to keep these maintenance times to a minimum, and often work directly with game developers to resolve these issues.

Eliminate Waiting for More Gaming

Our goal is to do all of the patching and maintenance behind the scenes — so that when you’re ready to play, you’re game ready and playing instantly.

Get started with your gaming adventures on GeForce NOW.

Follow GeForce NOW on Facebook and Twitter and stay up to date on the latest features and game launches. 


Create in Record Time with New NVIDIA Studio Laptops from Dell, HP, Lenovo, Gigabyte, MSI and Razer

New NVIDIA Studio laptops from Dell, HP, Lenovo, Gigabyte, MSI and Razer were announced today as part of the record-breaking GeForce laptop launch. The new Studio laptops are powered by GeForce RTX 30 Series and NVIDIA RTX professional laptop GPUs, including designs with the new GeForce RTX 3050 Ti and 3050 laptop GPUs, and the latest 11th Gen Intel mobile processors.

GeForce RTX 3050 Ti and 3050 Studio laptops are perfect for graphic designers, photographers and video editors, bringing high performance and affordability to artists and students.

The NVIDIA Broadcast app has been updated to version 1.2, bringing new room echo removal and video noise removal features, updated general noise removal and the ability to stack multiple effects for NVIDIA RTX users. The update can be downloaded here.

And it’s all supported by the May NVIDIA Studio Driver, also available today.

Creating Made Even Easier

With the latest NVIDIA Ampere architecture, Studio laptops accelerate creative workflows with real-time ray tracing, AI and dedicated video acceleration. Creative apps run faster than ever, taking full advantage of new AI features that save time and enable all creators to apply effects that previously were limited to the most seasoned experts.

The new line of NVIDIA Studio laptops introduces a wider range of options, making finding the perfect system easier than ever.

  • Creative aficionados who are into photography, graphic design or video editing can do more, faster, with new GeForce RTX 3050 Ti and 3050 laptops, and RTX A2000 professional laptops. They introduce AI acceleration, best-in-class video hardware encoding and GPU acceleration in hundreds of apps. With reduced power consumption and 14-inch designs as thin as 16mm, they bring RTX to the mainstream, making them perfect for students and creators on the go.
  • Advanced creators can step up to laptops powered by GeForce RTX 3070 and 3060 laptop GPUs or NVIDIA RTX A4000 and A3000 professional GPUs. They offer greater performance in up to 6K video editing and 3D rendering, providing great value in elegant Max-Q designs that can be paired with 1440p displays, widely available in laptops for the first time.
  • Expert creators will enjoy the power provided by the GeForce RTX 3080 laptop GPU, available in two variants, with 8GB or 16GB of video memory, or the NVIDIA RTX A5000 professional GPU, with 16GB of video memory. The additional memory is perfect for working with large 3D assets or editing 8K HDR RAW videos. At 16GB, these laptops provide creators working across multiple apps with plenty of memory to ensure these apps run smoothly.

The laptops are powered by third-generation Max-Q technologies. Dynamic Boost 2.0 intelligently shifts power between the GPU, GPU memory and CPU to accelerate apps, improving battery life. WhisperMode 2.0 controls the acoustic volume for the laptop, using AI-powered algorithms to dynamically manage the CPU, GPU and fan speeds to deliver quieter acoustics. For 3D artists, NVIDIA DLSS 2.0 utilizes dedicated AI processors on RTX GPUs called Tensor Cores to boost frame rates in real-time 3D applications such as D5 Render, Unreal Engine 4 and NVIDIA Omniverse.

Meet the New NVIDIA Studio Laptops 

Thirteen new Studio laptops were introduced today, including:

  • Nine new models from Dell: the professional-grade Precision 5560, 5760, 7560 and 7760; the creator dream team XPS 15 and XPS 17; the redesigned Inspiron 15 Plus and Inspiron 16 Plus; and the small business-ready Vostro 7510. The Dell Precision 5560 and XPS 15 debut with elegant, thin, world-class designs featuring creator-grade panels.
  • HP debuts the updated ZBook Studio G8, the world’s most powerful laptop of its size, featuring an incredible DreamColor display with a 120Hz refresh rate and a wide array of configuration options including GeForce RTX 3080 16GB and NVIDIA RTX A5000 laptop GPUs.
  • Lenovo introduced the IdeaPad 5i Pro, with a factory-calibrated, 100 percent sRGB panel, available in 14 and 16-inch configurations with GeForce RTX 3050, as well as the ThinkBook 16p, powered by GeForce RTX 3060.

Gigabyte, MSI and Razer also refreshed their Studio laptops, originally launched earlier this year, with new Intel 11th Gen CPUs, including the Gigabyte AERO 15 OLED and 17 HDR, MSI Creator 15 and Razer Blade 15.

The Proof is in the Perf’ing

The Studio ecosystem is flush with support for top creative apps. In total, more than 60 have RTX-specific benefits.

GeForce RTX 30 Series and NVIDIA RTX professional Studio laptops save time (and money) by enabling creators to complete creative tasks faster.

Video specialists can expect to edit 3.1x faster in Adobe Premiere Pro on Studio laptops with a GeForce RTX 3050 Ti, and 3.9x faster with an RTX 3060, compared to CPU alone.

Studio laptops shave hours off a single project by reducing time in playback, unlocking GPU-accelerated effects in real-time video editing and frame rendering, and faster exports with NVIDIA encoding.

Color grading in Blackmagic Design’s DaVinci Resolve, along with editing using features such as face refinement and optical flow, is 6.8x faster with a GeForce RTX 3050 Ti than on CPU alone.

Edits that took 14 minutes with RTX 3060 would have taken 2 hours with just the CPU.

3D artists working in Blender on a GeForce RTX 3080-powered laptop can render an astonishing 24x faster than on CPU alone.

A heavy scene that would take 1 hour to render on a MacBook Pro 16 takes only 8 minutes on an RTX 3080.

Adobe Photoshop Lightroom completes Enhance Details on RAW photos 3.7x faster with a GeForce RTX 3050 Ti, compared to an 11th Gen Intel i7 CPU, while Adobe Illustrator users can zoom and pan canvases twice as fast with an RTX 3050.

Regardless of your creative field, Studio laptops with GeForce RTX 30 Series and RTX professional laptop GPUs will speed up your workflow.

May Studio Driver and Creative App Updates

Two popular creator applications added Tensor Core support this month, accelerating workflows. Both, along with the new Studio laptops, are supported by the latest Studio Driver.

Topaz Labs Gigapixel enhances imagery up to 600 percent while maintaining impressive original image quality.

Video Enhance AI is a collection of upscaling, denoising and restoration features.

With Studio Driver support, both Topaz apps are at least 6x faster with a GeForce RTX 3060 than with a CPU alone.

Recent NVIDIA Omniverse updates include new app and connector betas, available to download now.

Omniverse Machinima offers a suite of tools and extensions that enable users to render realistic graphics and animation using scenes and characters from games. Omniverse Audio2Face creates realistic facial expressions and motions to match any voice-over track.

AVerMedia integrated the NVIDIA Broadcast virtual background and audio noise removal AI features natively into their software suite to improve broadcasting abilities without requiring special equipment.

Download the May Studio Driver (release 462.59) today through GeForce Experience or from the driver download page to get the latest optimizations for the new Studio laptops and applications.

Get regular updates for creators by subscribing to the NVIDIA Studio newsletter and following us on Facebook, Twitter and Instagram.
