Cloud Control: Production Studio Taylor James Elevates Remote Workflows With NVIDIA Technology

WFH was likely one of the most-used acronyms of the past year, with more businesses than ever looking to enhance their employees’ remote experiences.

Creative production studio Taylor James found a cloud-based solution to maintain efficiency and productivity — even while working remotely — with NVIDIA RTX Virtual Workstations on AWS.

With locations in New York, Los Angeles, London and Mexico, Taylor James creates stunning content for projects like interactive experiences, product commercials for automotive clients and visuals for healthcare campaigns.

Previously, the IT team at Taylor James faced the challenge of supporting an inconsistent desktop environment, where some users had more powerful machines and capabilities than others.

The studio also wanted to upgrade its infrastructure of on-premises hardware, which consisted of render farms and storage servers with lengthy service contracts and minimal opportunity for cost efficiencies.

Adopting NVIDIA RTX Virtual Workstations on AWS, Taylor James migrated all of its production operations to the cloud, providing all of its artists with powerful virtual machines accessible from anywhere.

This allowed the studio to better equip its creative teams to do their best work, while expanding its workforce by attracting top candidates who could work from anywhere — without the constraints of in-office desktop equipment.

“With NVIDIA RTX Virtual Workstations on AWS, we have access to the latest technology all the time,” said Mark Knowles, general manager of Creative Production at Taylor James. “This provides us with the ultimate flexibility and scalability.”

Enhancing Content Creation From the Cloud

“NVIDIA’s technology enables our artists to create accurate simulation of real-world objects and physics, and produce the highest quality visual effects.”
— Mark Knowles, general manager of Creative Production, Taylor James

Taylor James’s digital artists require access to graphics and compute-intensive applications for rendering and creative production, such as Autodesk Maya, Arnold, 3ds Max and Foundry Nuke, many of which are now accelerated by NVIDIA RTX technology.

These artists also use real-time rendering with apps like Maxon Cinema 4D and Redshift, and produce high-resolution automotive rendering for online car configurators with Unreal Engine and Chaos V-Ray.

Taylor James relied on NVIDIA GPU-accelerated physical workstations in the onsite environment, so when it came time to move to AWS, the firm selected virtual workstation instances accelerated by NVIDIA RTX technology.

NVIDIA RTX Virtual Workstations come with all the benefits of RTX technology, including real-time ray tracing, AI, rasterization and simulation.

With NVIDIA RTX, artists realize the dream of real-time cinematic-quality rendering of photorealistic environments with perfectly accurate shadows, reflections and refractions, so that they can create amazing content faster than ever.

“Our designers and artists are masters at creating the most compelling, powerful storytelling that breaks the boundaries of visual production and provides transformative experiences to our audience,” Knowles said. “We require the most powerful accelerated virtual workstation technology, which can only be provided by NVIDIA.”

NVIDIA RTX also brings the power of AI to visual computing, which dramatically accelerates creativity by automating repetitive tasks, enabling all-new creative assistants and optimizing compute-intensive processes.

Artists also get immediate access to additional resources — like augmented and virtual reality tools with NVIDIA CloudXR, as well as IT support to spin up new virtual workstations tailored to specific tasks in a matter of minutes.

Learn more about NVIDIA RTX Virtual Workstations, and explore how NVIDIA is helping professionals tackle the most complex computing challenges.

The post Cloud Control: Production Studio Taylor James Elevates Remote Workflows With NVIDIA Technology appeared first on The Official NVIDIA Blog.

Scooping up Customers: Startup’s No-Code AI Gains Traction for Industrial Inspection

Bill Kish founded Ruckus Wireless two decades ago to make Wi-Fi networking easier. Now, he’s doing the same for computer vision in industrial AI.

In 2015, Kish started Cogniac, a company that offers a self-service computer vision platform and development support.

Like in the early days of Wi-Fi deployment, the rollout of AI is challenging, he said. Cogniac’s answer is to offer companies a fast track to building datasets on their own for models by scanning parts and equipment, using its no-code AI platform.

No-code AI platforms enable people to work with visual tools and user interfaces — for labeling data, for example — to help develop applications without any programming skill required.

It’s a strategy that’s working. Cogniac customers include Ford, freight giant BNSF Railway and tractor maker Doosan Bobcat. The startup turbocharges these businesses with NVIDIA GPUs for all training and inference.

Cogniac, based in Silicon Valley, recently landed a $20 million Series B investment. The company is an NVIDIA Metropolis partner and a member of NVIDIA Inception, a program that offers go-to-market support, expertise and technology for AI, data science and HPC startups. NVIDIA Metropolis is an application framework that makes it easier for developers to combine video cameras and sensors with AI-enabled video analytics.

BNSF Spots Railroad Damages

North America’s largest freight railway network, BNSF Railway has more than 32,000 miles of track across 28 U.S. states and more than 8,000 locomotives — making it a massive challenge to keep up with inspections. BNSF relies on Cogniac to build models for inspections of train tracks, train wheels and other parts.

Cogniac enables BNSF to use mobile devices to gather images of defects that can be automatically fed into models. BNSF has about 200 convolutional neural networks in production and several times that under development with Cogniac, said Kish. They’re helping the railway look for missing cotter pins and bolts on cars, tankers that have been left open and hundreds of other safety-related inspections.

Using GPUs onboard trains, Cogniac’s AI enables ongoing inspections of railways to detect broken rails. It also helps prioritize maintenance. Previously, inspections required closing down tracks for about a day to scrutinize sections of track, he said.

“It’s a huge win for the railways to be able to inspect their assets as a part of their ongoing operations,” said Kish.

Ford Detects Sheet Metal Defects

Ford relies on Cogniac for real-time inspections of sheet metal used in F-150 trucks, the best-selling vehicle in North America. Body panels and inner door panels are made of stamped aluminum sheet metal that is pressed into different shapes using a stamping tool.

But those stamped panels can sometimes have defects such as small splits that need to be detected before installation into vehicles.

“Our edge computing is processing gigapixels using dozens of cameras to capture the contours of the surfaces, enabled by NVIDIA GPUs,” said Kish.

Doosan Bobcat Bulldozes Errors

Doosan Bobcat, a Korean maker of compact tractors, had a big problem with missing parts in the build kits it sent out for tractor orders. Those parts kits are built in Minnesota and then sent to North Dakota for assembly, and missing parts stalled output.

Now the tractor giant uses Cogniac’s vision pipelines to monitor the parts kits it puts together for building different configurations of its Bobcat tractors.

Before Cogniac, one-third of the kits that Doosan Bobcat put together for building tractors were incomplete or incorrect. Since implementing Cogniac, kit errors are now one out of 20,000, according to Doosan Bobcat.
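As a quick sanity check on those numbers, the implied improvement factor can be computed directly — a back-of-envelope sketch, not a metric reported by Doosan Bobcat:

```python
# Improvement factor implied by the kit-error rates quoted above.
before_error_rate = 1 / 3       # roughly one in three kits incomplete or incorrect
after_error_rate = 1 / 20_000   # one kit error per 20,000 after adopting Cogniac

improvement = before_error_rate / after_error_rate
print(f"~{improvement:,.0f}x fewer kit errors")  # ~6,667x
```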

“Chances are that there are going to be parts that are missing or wrong,” said Kish. “It’s money that’s not going out the door.”

Prepare for Genshin Impact, Coming to GeForce NOW in Limited Beta

GeForce NOW is charging into the new year at full force.

This GFN Thursday comes with the news that Genshin Impact, the popular open-world action role-playing game, will be coming to the cloud this year, arriving in a limited beta.

Plus, this year’s CES announcements were packed with news for GeForce NOW. Battlefield 4: Premium Edition and Battlefield V: Definitive Edition join the exhilarating collection of Electronic Arts titles streaming from the cloud. A GeForce NOW promotion is also available for AT&T 5G customers, and GeForce NOW will be streaming on Samsung TVs later this year.

And, as always, for the first GFN Thursday of the month, we’ve got the list of new games joining the GeForce NOW library. In January, eight titles are joining the cloud, with two ready to start streaming this week.

Genshin Impact Coming to GeForce NOW

Get ready for an epic fantasy adventure, Traveler. Genshin Impact is coming to GeForce NOW in a limited beta for members on Windows PCs. To learn if they have access, members with a miHoYo account can search for Genshin Impact in the GeForce NOW Windows app to see if the beta is available.

Play Genshin Impact on GeForce NOW
Discover the world of Teyvat. Paimon will do her best to be a great guide!

You are a traveler from another world, stranded in the mysterious and fantastic land of Teyvat. Embark on a journey to reunite with your lost sibling and explore the seven nations, each presided over by one of the seven elemental gods known as the Archons.

Manipulate and master the various elements to defeat enemies and solve challenging puzzles. Meet the inhabitants of Teyvat and build up your party from over 40 playable characters — with more to come — each with unique abilities, stories, personalities and combat styles.

Experience an immersive campaign, charge head-on into battles solo or invite friends to join your adventures in a vast magical world.

ICYMI: GeForce NOW Announcements From CES

GeForce NOW at CES
GeForce NOW is kicking off the new year by bringing more games, more devices and more networks to our cloud gaming ecosystem.

Members can now embrace unrivaled destruction in Battlefield 4: Premium Edition and experience the ultimate war in Battlefield V: Definitive Edition, both streaming from the cloud.

These new additions will join the existing EA catalog available to GeForce NOW members, which includes the massively popular, high-speed hero shooter Apex Legends, already streamed by more than 1 million members.

For the mobile gamer, AT&T and NVIDIA have joined forces as collaborators in 5G technical innovation to deliver one of the world’s best gaming experiences. To celebrate, a special promotion is available for certain AT&T 5G subscribers.

Going from the small screen to the big screen, look for GeForce NOW to arrive on select Samsung TVs later this year.

Our CES blog has more information on these announcements.

Joining in January

Rainbow Six Extraction on GeForce NOW
Grab your gadgets and choose your operator in Tom Clancy’s Rainbow Six Extraction, streaming in January.

A new month (and a whole new year) means a new batch of games. 2022 kicks off with eight new titles coming throughout January.

Coming to GeForce NOW on Jan. 20 is Tom Clancy’s Rainbow Six Extraction. Join millions of players in the Rainbow Six universe. Go it alone or in a squad of three in thrilling co-op gameplay. Pick from 18 unique Operators and enter the containment zone to take on the ever-evolving, lethal alien threat known as the Archæans.

Gear up for two games arriving this week:

Also coming in January:

  • The Anacrusis (New release on Steam, Jan. 13)
  • Tom Clancy’s Rainbow Six Extraction (New release on Ubisoft Connect, Jan. 20)
  • Mortal Online 2 (Early access on Steam)
  • Ready or Not (Early access on Steam)
  • Fly Corp (Steam)
  • Garfield Kart – Furious Racing (Steam)

We make every effort to launch games on GeForce NOW as close to their release as possible, but, in some instances, games may not be available immediately.

More From December

On top of the games announced in December, some extra games made it to the cloud last month — including titles from the Epic Games Store’s 15 Days of Free Games promotion. Check out these additions:

What are you planning to play this weekend? Let us know on Twitter or in the comments below.

Teamwork Makes AVs Work: NVIDIA and Deloitte Deliver Turnkey Solutions for AV Developers

Autonomous vehicles are born in the data center, which is why NVIDIA and Deloitte are delivering a strong foundation for developers to deploy robust self-driving technology.

At CES this week, the companies detailed their collaboration, which is aimed at easing the biggest pain points in AV development. Deloitte, a leading global consulting firm, is pairing with NVIDIA to offer a range of services for data generation, collection, ingestion, curation, labeling and deep neural network (DNN) training with NVIDIA DGX SuperPOD.

Building AVs requires massive amounts of data. A fleet of 50 test vehicles driving six hours a day generates 1.6 petabytes daily — if all that data were stored on standard 1GB flash drives, they’d cover more than 100 football fields. Yet that isn’t enough.
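Those figures imply a substantial per-vehicle data rate, which can be worked out directly — a back-of-envelope sketch assuming decimal units (1 PB = 10**15 bytes) and data spread evenly across the fleet:

```python
# Per-vehicle sensor data rate implied by the fleet figures above.
fleet_size = 50        # test vehicles
hours_per_day = 6      # driving hours per vehicle per day
daily_total_pb = 1.6   # petabytes generated by the whole fleet per day

daily_total_bytes = daily_total_pb * 10**15
per_vehicle_hour_tb = daily_total_bytes / fleet_size / hours_per_day / 10**12
per_vehicle_sec_gb = daily_total_bytes / fleet_size / (hours_per_day * 3600) / 10**9

print(f"{per_vehicle_hour_tb:.2f} TB per vehicle-hour")        # ~5.33 TB
print(f"{per_vehicle_sec_gb:.2f} GB per second while driving")  # ~1.48 GB
```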

On top of this collected data, AV training and validation requires data from rare and dangerous scenarios that the vehicle may encounter, but could be difficult to come across in standard data gathering. That’s where simulated data comes in.

NVIDIA DGX systems and advanced training tools enable streamlined, large-scale DNN training and optimization. Using the power of GPUs and AI, developers can seamlessly collect and curate data to comprehensively train DNNs for autonomous vehicle perception, planning, driving and more.

Developers can also train and test these DNNs in simulation with NVIDIA DRIVE Sim, a physically accurate, cloud-based simulation platform. It taps into NVIDIA’s core technologies — including NVIDIA RTX, Omniverse and AI — to deliver a wide range of real-world scenarios for AV development and validation.

DRIVE Sim can generate high-fidelity synthetic data with ground truth using NVIDIA Omniverse Replicator to train the vehicle’s perception systems or test the decision-making processes.

It can also be connected to the AV stack in software-in-the-loop or hardware-in-the-loop configurations to validate the complete integrated system.

“The robust AI infrastructure provided by NVIDIA DGX SuperPOD is paving the way for our clients to develop transformative autonomous driving solutions for safer and more efficient transportation,” said Ashok Divakaran, Connected and Autonomous Vehicle Lead at Deloitte.

A Growing Partnership

Deloitte is at the forefront of AI innovation, services and research, including AV development.

In March, it announced the launch of the Deloitte Center for AI Computing, a first-of-its-kind center designed to accelerate the development of innovative AI solutions for its clients.

The center is built on NVIDIA DGX A100 systems to bring together the supercomputing architecture and expertise that Deloitte clients require as they become AI-fueled organizations.

This collaboration now extends to AV development, using robust AI infrastructure to architect solutions for truly intelligent transportation.

NVIDIA DGX POD is the foundation, providing an AI compute infrastructure based on a scalable, tested reference architecture featuring up to eight DGX A100 systems, NVIDIA networking and high-performance storage.

To further scale AV development and speed time to results, customers can choose the NVIDIA DGX SuperPOD, which includes 20 or more DGX systems plus networking and storage.

With Deloitte’s long-standing work with the automotive industry and investment in AI innovation, combined with the unprecedented compute of NVIDIA DGX systems, developers will have access to the best AV training solutions for truly revolutionary products.

Deloitte’s leadership in AI is paired with a broad and deep set of technical capabilities and services. Among its ranks are more than 5,500 systems integration developers, 2,000 data scientists and 4,500 cybersecurity practitioners. In 2020, Deloitte was named the global leader in worldwide system integration services by the International Data Corporation.

Deloitte also has deep experience with the automotive industry, serving three-quarters of the Fortune 1000 automotive companies.

Streamlined Solutions

With combined experience and cutting-edge technology, NVIDIA and Deloitte are offering robust data center solutions for AV developers, encompassing infrastructure, data management, machine learning operations and synthetic data generation.

These services begin with Infrastructure-as-a-Service, which provides management of the DGX SuperPOD infrastructure in an on-prem or co-location environment. Experts design and set up this AI infrastructure, as well as provide ongoing infra operations, for streamlined and efficient AV development.

With Data-Management-as-a-Service, developers can use tools for data ingestion and curation, enabling scale and automation for DNN training.

NVIDIA and Deloitte can also improve data scientist productivity up to 30 percent with MLOps-as-a-Service. This turnkey solution deploys and supports enterprise-grade MLOps software to train DNNs and accelerate accuracy.

Finally, NVIDIA and Deloitte make it possible to curate specific scenarios for comprehensive DNN training with Synthetic Data Generation-as-a-Service. Developers can take advantage of simulation expertise to generate high-fidelity training data to cover the rare and hazardous situations AVs must be able to handle safely.

Equipped with these invaluable tools, AV developers now have the capability to ease some of the largest bottlenecks in DNN training to deliver safer, more efficient transportation.

‘AI Dungeon’ Creator Nick Walton Uses AI to Generate Infinite Gaming Storylines

What started as Nick Walton’s college hackathon project grew into AI Dungeon, a popular text adventure game with over 1.5 million users.

Walton is the co-founder and CEO of Latitude, a Utah-based startup that uses AI to create unique gaming storylines.

He spoke with NVIDIA AI Podcast host Noah Kravitz about how natural language processing methods can generate infinite open-ended adventure plots for interactive games like AI Dungeon, which draws an average of 150,000 new players each month.

Key Points From This Episode:

  • In AI Dungeon, players type in their actions or responses to prompts — for example, “You’re about to enter a world of endless possibilities, where you can do absolutely anything you can imagine … Will you proceed?” — and AI keeps the story going.
  • Unlike other text adventure games that use pre-written content, AI Dungeon offers infinite unique possibilities for each storyline, as the AI adapts and responds to almost any user input.
  • Users can dive into adventures individually or in multiplayer mode, which allows players with distinct characters to take turns interacting with the AI within the same game session. To kick off the story, they can choose from a list of initial prompts or create custom adventures.

Tweetables:

“There’s something really compelling here in this ability to have stories that can go anywhere.” — Nick Walton [3:30]

In gaming and in the world, AI enables “new experiences that are no longer static and predetermined, but dynamic and alive.” — Nick Walton [7:16]

You Might Also Like:

GANTheftAuto: Harrison Kinsley on AI-Generated Gaming Environments

Humans playing games against machines is nothing new, but now computers can develop games for people to play. Programming enthusiast and social media influencer Harrison Kinsley created GANTheftAuto, an AI-based neural network that generates a playable chunk of the classic video game Grand Theft Auto V.

Matt Ginsberg Built a GPU-Powered Crossword Solver to Take on Top Word Nerds

Dr.Fill, the crossword puzzle-playing AI created by Matt Ginsberg — serial entrepreneur, pioneering AI researcher and former research professor — scored higher than any human earlier this year at the American Crossword Puzzle Tournament.

Maya Ackerman on LyricStudio, an AI-Based Songwriting Assistant

Maya Ackerman is the CEO of WaveAI, a Silicon Valley startup using AI and machine learning to, as the company motto puts it, “unlock new heights of human creative expression.” The startup’s LyricStudio software is an AI-based lyric and poetry writing assistant.

Subscribe to the AI Podcast

Get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn. If your favorite isn’t listed here, drop us a note.

Make the AI Podcast Better

Have a few minutes to spare? Fill out this listener survey. Your answers will help us make a better podcast.

NVIDIA Builds Isaac AMR Platform to Aid $9 Trillion Logistics Industry

Manufacturing and fulfillment centers are profoundly complex. Whenever new earbuds or socks land at your doorstep in hours or a vehicle rolls off an assembly line, a maze of magic happens with AI-driven logistics.

Massive facilities like these are constantly in flux. Robots travel miles of aisles, rolling up millions of products to assist teams of people on the move. Obstacles are ever present.

Today we’re introducing the Isaac Autonomous Mobile Robot (AMR) platform to optimize operational efficiency and accelerate deployment of AMRs. Isaac AMR extends NVIDIA Isaac capabilities for building and deploying robotics applications, bringing mapping, site analytics and fleet optimization onto NVIDIA EGX servers.

Industrial facilities of this type can be as big as city blocks or stadiums. They are constantly reconfigured or scaled up to meet product demands of the moment. Path planning and rerouting for autonomous robots need to move in lockstep.

At industrial scale, even small routing optimizations can save billions of dollars in the $9 trillion logistics industry.

Autonomous mobile robot deployments are estimated to reach 53,000 sites by 2025, up from 9,000 in 2020, according to Interact Analysis. Meanwhile, supply chains strain to keep up with increasing e-commerce amid worker shortages and COVID-19 restrictions.

One hurdle is the ability to rapidly and autonomously develop high-accuracy robot maps, which need to be continuously updated as operations scale or fluctuate. Increasing mobile robots’ situational awareness of changing environments and continuously reoptimizing their routes — all while developing new skill sets through simulation — is also paramount to operational efficiency.

Isaac AMR is the result of years of research and product development at NVIDIA. The framework is available on the NVIDIA NGC software hub and within the NVIDIA Omniverse platform, tapping into Metropolis and ReOpt at first, and soon DeepMap and more NVIDIA technologies.

Scaling Operations With Isaac AMR

The AI and computing challenges of autonomous mobile robots for manufacturing and fulfillment aren’t all that different from those of autonomous vehicles.

Obstacles and people must be avoided. Destinations need to be reached. Thousands of sensors driven by GPU-accelerated algorithms help fleets of autonomous robots solve the traveling salesman problem — finding the shortest path between multiple destinations — among ever-changing industrial workflows in real time.
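The post doesn’t detail ReOpt’s solvers; as a minimal illustration of the routing problem described above, the classic nearest-neighbor heuristic (always visit the closest unvisited stop) gives a rough feel for it. The waypoints below are hypothetical:

```python
import math

def nearest_neighbor_route(points, start=0):
    """Greedy TSP heuristic: from each stop, visit the closest unvisited one.
    Production solvers such as NVIDIA ReOpt use far more sophisticated,
    GPU-accelerated optimization; this is only a sketch of the problem."""
    unvisited = set(range(len(points))) - {start}
    route = [start]
    while unvisited:
        here = points[route[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(here, points[i]))
        route.append(nxt)
        unvisited.remove(nxt)
    return route

# Hypothetical pick/drop locations on a warehouse floor (meters).
waypoints = [(0, 0), (5, 0), (1, 1), (6, 1), (0, 6)]
print(nearest_neighbor_route(waypoints))  # [0, 2, 1, 3, 4]
```

Real deployments also re-run this kind of optimization continuously as constraints change, which is where dynamic reoptimization pays off.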

The Isaac AMR platform uses NVIDIA Omniverse for creating digital twins of the facility where AMRs will be deployed. NVIDIA Isaac Sim (built on Omniverse) simulates the behavior of robot fleets, people and other machines in the digital twins with high-fidelity physics and perception. It also enables synthetic data generation for training of AI models.

Isaac AMR consists of GPU-accelerated AI technologies and SDKs including DeepMap, ReOpt and Metropolis. These technologies are securely orchestrated and cloud delivered with NVIDIA Fleet Command.

DeepMap Delivers Mapping Advances

NVIDIA’s recent acquisition of DeepMap brings advances in mapping for autonomous vehicles to the AMR industry as well.

AMR deployments can access the DeepMap platform’s cloud-based SDK to help accelerate robot mapping of large facilities from weeks to days, while achieving centimeter-level accuracy.

The DeepMap Update Client enables robot maps to be updated as frequently as necessary, in real time. And the DeepMap SDK delivers layers of intelligence to maps by adding semantic understanding — so robots can identify the objects pixels represent and know if they can move one way or not. It’s also capable of addressing both indoor and outdoor map building.

As part of the Isaac AMR platform, NVIDIA DeepMap integrates with other components, such as Metropolis, ReOpt, Isaac Sim via Omniverse and more.

NVIDIA Metropolis Adds Real-Time Situational Awareness

Mapping doesn’t account for everything in these environments. And the advanced sensors onboard AMRs aren’t always sufficient to ensure safe and efficient operation.

The NVIDIA Metropolis video analytics platform meets the need for higher level real-time “outside-in” perception by providing access to cameras and sensors deployed all over the factory or warehouse floor.

With Metropolis, AMRs have access to additional layers of situational awareness on the factory floor, enabling them to avoid high-congestion areas, eliminate blind spots and gain enhanced visibility of both people and other AMRs. In addition, Metropolis’s pre-trained models provide a head start in customizing for site-specific needs.

ReOpt Libraries Transform Logistics 

NVIDIA ReOpt AI software libraries can be used to optimize vehicle route planning and logistics in real time, which can be applied to fleets of AMRs.

Many factors need to be considered in deciding the optimal AMR fleet size to be deployed for large, complex environments. Robot speeds, battery life, transport size and weight, and facilities layout all factor in.

Using Isaac Sim, companies can simulate multiple AMR interactions with NVIDIA ReOpt. These simulations run quickly and accurately in digital twins of environments such as warehouses, and can be carried out before deploying robots in production as situations change, saving time and money.

And, once deployed, routes have to be continuously re-optimized to deliver the most operational efficiency. NVIDIA ReOpt provides for dynamic reoptimization of routes to a fleet of heterogeneous AMRs based on a number of constraints.

Deploying AMRs Into Production

Available on NVIDIA EGX servers, the Isaac AMR platform enhances the development of AI-driven logistics by providing a complete path to build industrial and human-robot simulations as well as route optimizations.

The Isaac AMR platform is built to be enterprise-class and cloud-ready. The NVIDIA technologies that make up the platform can be securely deployed and managed on EGX servers with NVIDIA Fleet Command.

Check out the latest developments on Omniverse, Metropolis, DeepMap and ReOpt.

Learn more about the Isaac Robotics Platform and use the contact us form on the portal if you have any questions. 

Gamers, Creators, Drivers Feel GeForce RTX, NVIDIA AI Everywhere

Putting the power of graphics and AI at the fingertips of more users than ever, NVIDIA announced today new laptops and autonomous vehicles using GeForce RTX and NVIDIA AI platforms and expanded reach for GeForce NOW cloud gaming across Samsung TVs and the AT&T network.

A virtual address prior to CES showed next-gen games, new tools for creating virtual worlds, and AI-powered vehicles — all accelerated by NVIDIA technologies.

160-Plus Portable Powerhouses

“Ray tracing and AI are defining the next generation of content,” said Jeff Fisher, senior vice president of NVIDIA’s GeForce business, announcing more than 160 thin-and-light laptops using RTX 30 Series GPUs in a smorgasbord of mobile designs.

They will serve 3 billion gamers and tens of millions of content creators worldwide, who are increasingly turning to portable PCs.

Many of the systems will pack a new GeForce RTX 3080 Ti laptop GPU, the first time the flagship 80 Ti class comes to mobile PCs. It sports 16GB of the fastest ever GDDR6 memory in a laptop and delivers higher performance than the desktop TITAN RTX, with prices starting at $2,499.

More will light up with the GeForce RTX 3070 Ti laptop GPU, the newest member of our fastest-growing class of GPUs. It can drive displays at 100 frames per second at 1440p resolution, with prices starting at $1,499.

GeForce RTX laptops announced at CES

Laptops with the new GPUs will be available starting Feb. 1. They bring a fourth generation of Max-Q Technologies, a systems design approach that enables the thinnest, lightest, quietest gaming laptops.

The latest Max-Q applies AI to optimally balance GPU/CPU power use, battery discharge, image quality and frame rates. Laptops running it can provide up to 70 percent more battery life while delivering more performance.

Mainstream and Monster GPUs

Fisher also announced the GeForce RTX 3050, which brings ray tracing and accelerated AI to mainstream desktops.

The GPU lets mass market PCs run the latest games at 60 frames per second, thanks to graphics and AI cores in its NVIDIA Ampere architecture.

The RTX 3050 sports 8GB of GDDR6 memory. The card will be available from partners worldwide on Jan. 27 starting at $249.

“RTX is the new standard, and the GeForce RTX 3050 makes it more accessible than ever,” said Fisher, showing a “monster” RTX 3090 Ti (pictured below) also in the wings.

GeForce RTX 3090 Ti

The RTX 3090 Ti will pack 24GB of GDDR6X running at 21 Gbit/s, the fastest memory ever. The GPU will crank out 40 teraflops for shaders, 78 teraflops for ray tracing and a whopping 320 teraflops of AI muscle. More details will be coming later this month.

New Initiatives with AT&T, Samsung

Giving high-performance mobile gaming new wings, AT&T and NVIDIA are working together to provide customers an optimized experience. And AT&T customers are getting a special offer to enjoy NVIDIA GeForce NOW on 5G.

“Starting today, AT&T customers with a 5G device on a qualifying plan can get a six-month GeForce NOW Priority Membership at no charge,” said Fisher.

GeForce NOW on AT&T's 5G network

For its part, Samsung will offer NVIDIA’s cloud gaming service on its smart TVs through the Samsung Gaming Hub around midyear.

It’s the latest move to bring GeForce NOW to the living room. In November, LG showed a beta version of the service on its 2021 WebOS smart TVs.

As for games, GeForce NOW continues to expand support for the Electronic Arts catalog, adding Battlefield 4 and Battlefield 5 to GeForce NOW’s online library of more than 1,100 PC games played by 15 million subscribers.

Heart-Pounding RTX and Reflex Games

Speaking of new games, NVIDIA announced 10 new RTX games.

Fisher showed clips from four upcoming titles that this year will support RTX, DLSS — NVIDIA’s AI-powered graphics enhancing technology — or both.

Escape From Tarkov and Rainbow Six Extraction are adding DLSS, while The Day Before and Dying Light 2 Stay Human are adding both DLSS and ray tracing.

In addition, seven new titles support NVIDIA Reflex for low-latency gameplay. They include Sony’s God of War, Rainbow Six Extraction and iRacing, taking Reflex into the world’s premier online racing simulator.

And we’re raising the bar in esports. A new class of 27-inch esports displays with 1440p resolution and G-SYNC at up to 360 Hz refresh rates will be available soon from AOC, ASUS, MSI and ViewSonic.

Building 3D Virtual Worlds

For the 45 million professionals who create games, movies and more, Fisher described tools transforming their workflows.

“We are at the dawn of the next digital frontier. Interconnected 3D virtual worlds … with shops, homes, people, robots, factories, museums … will be built by an expanding number of creators, collaborating across the globe,” he said.

NVIDIA Omniverse, the powerful platform for artists to collaborate and accelerate 3D work, remains free and is now generally available for GeForce and NVIDIA RTX Studio creators.

NVIDIA Omniverse available on GeForce RTX Studio

Omniverse uses Pixar’s open-standard Universal Scene Description (USD) to connect tools from more than 40 software development partners into a single 3D design platform. That lets creators across the globe collaborate in Omniverse on shared 3D workflows.
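Omniverse scenes exchanged this way are ordinary USD files. As a rough illustration (a hypothetical fragment, not an actual Omniverse scene), a minimal scene in USD's human-readable .usda format looks like this:

```usda
#usda 1.0
(
    defaultPrim = "World"
)

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 2.0
    }
}
```

Because every connected app reads and writes this same shared scene description, an edit made in one tool, such as moving the sphere, becomes visible to collaborators in all the others.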

“This is the future of 3D content creation and how virtual worlds will be built,” Fisher said.

He detailed new capabilities in Omniverse, including:

  • Omniverse Nucleus Cloud, a one-click-to-collaborate 3D scene sharing feature, now in early access
  • Updates to Omniverse Audio2Face, an AI-enabled app that animates a 3D face based on an audio track, including direct export to Epic’s MetaHuman for creating realistic characters
  • New assets from Mechwarrior 5 and Shadow Warrior 3 added to the Omniverse Machinima library
  • And a wealth of free digital assets now available in the Omniverse launcher from leading 3D marketplaces.

In addition, we’ve upgraded NVIDIA Canvas, our Studio app that converts brushstrokes into photorealistic images. Available as a free download, it now supports 4x the resolution and new materials like flowers and bushes, thanks to the efforts of NVIDIA researchers who developed GauGAN2.

It’s part of NVIDIA Studio’s broad and deep software stack, which accelerates more than 200 of the industry’s top creative applications.

GeForce RTX laptops include the Razer Blade 14

All that software is available in systems as svelte as the Razer Blade 14, a new ultraportable with an RTX 3080 Ti.

“I feel like I can go anywhere with it and still am able to produce the quality artwork that I could do at home,” said a 3D artist who took it on a test drive and shared his experience in the special address.

Taking AI to the Street

In autonomous cars and trucks — another hot topic at CES — NVIDIA announced more companies adopting its open DRIVE Hyperion platform that includes a high-performance computer and sensor architecture meeting the safety requirements of fully autonomous vehicles.

The latest-generation DRIVE Hyperion 8 is designed with redundant NVIDIA DRIVE Orin systems-on-a-chip, 12 state-of-the-art surround cameras, nine radars, 12 ultrasonic sensors, one front-facing lidar and three interior-sensing cameras. It’s architected to be functionally safe, so that if one computer or sensor fails, a backup is available to ensure that the AV can drive its passengers to a safe place.

Electric vehicle makers such as Volvo-backed Polestar, along with EV companies in China including NIO, Xpeng, Li Auto, R Auto and IM Motors, have all adopted DRIVE Hyperion.

“These new electric cars will get better and better over time with each over-the-air update,” said Ali Kani, vice president and general manager of NVIDIA’s automotive business.

“Such companies can benefit from new business models that are software driven,” he added.

Electric vehicle makers adopting DRIVE Hyperion

Robotaxi services like Cruise, Zoox and DiDi and trucking services like Volvo, Navistar and Plus are also embracing DRIVE Hyperion.

Autonomous trucking company TuSimple announced at CES that it will build its new platform on NVIDIA DRIVE Orin. It’s working with leading delivery companies such as UPS, Navistar and Penske, and its technology has already improved arrival times for long-haul routes of the U.S. Postal Service.

Such vehicles will help fill an estimated shortage of more than 140,000 drivers by 2027 in the U.S. alone.

TuSimple adopts DRIVE Orin

In addition, five leading automotive suppliers — Desay, Flex, Quanta, Valeo and ZF — now support DRIVE Hyperion.

Kani ended the session with a demo of NVIDIA DRIVE Concierge.

An always-on, digital assistant for drivers, it’s one example of helpful agents created with NVIDIA Omniverse Avatar using the company’s speech AI, computer vision, natural language understanding, recommendation engines and simulation technologies.

DRIVE Concierge is the latest addition to NVIDIA’s rich real-time, AI-based software stack for autonomous vehicles. The end-to-end solution also includes code to train AI models in the cloud and run complex simulations in data centers to safely test and validate an entire AV system.

“From AI in the car to AI in the cloud, NVIDIA is paving the way to safer and more efficient transportation,” said Kani.

For more news, check out our CES page and watch the full special address below.

The post Gamers, Creators, Drivers Feel GeForce RTX, NVIDIA AI Everywhere appeared first on The Official NVIDIA Blog.


NVIDIA Canvas Updated With New AI Model Delivering 4x Resolution and More Materials

As art evolves and artists’ abilities grow, so must their creative tools.

NVIDIA Canvas, the free beta app and part of the NVIDIA Studio suite of creative apps and tools, has brought the real-time painting tool GauGAN to anyone with an NVIDIA RTX GPU.

Artists use advanced AI to quickly turn simple brushstrokes into realistic landscape images, speeding up concept exploration and freeing more time to visualize ideas.

The NVIDIA Canvas update released today, powered by the GauGAN2 AI model and NVIDIA RTX GPU Tensor Cores, generates backgrounds with increased quality and 4x higher resolution, and adds five new materials to paint with.

Updates to NVIDIA Canvas, the launch of NVIDIA Omniverse in general availability and newly announced NVIDIA Studio products highlight all the Studio goodness at CES. Read the full recap.

Artwork Made More Easel-y

The NVIDIA Canvas update enables even more realistic images with greater definition and fewer artifacts, adding significant photorealism, all in up to 1K pixel resolution for increased visual quality.

Finished work can easily be integrated into existing creative workflows by exporting to Adobe Photoshop or an artist’s preferred creative app.

A dramatic 4X increase in pixel resolution.

Artists start in Canvas by sketching simple shapes and lines; the AI model then immediately fills the screen with real-world materials such as water, grass, snow, mountains and more for show-stopping results.

The update adds five new materials to create richer landscape environments: straw, flowers, mud, dirt and bushes.

New materials including mud, dirt and bush add more realism and variety to backgrounds.

These new materials can be further customized with nine preset styles that modify the look and feel of the painting. Artists have the option to upload their own imagery to create a one-of-a-kind background.

NVIDIA Canvas dials up the realism with greater definition and fewer artifacts.

With Canvas’s new AI, the creative possibilities are nearly limitless.

Start Painting With AI

Download the latest NVIDIA Canvas app update, give us feedback, and install the January NVIDIA Studio Driver to optimize your creative app arsenal.

Share your Canvas paintings with the hashtag #StudioShare for a chance to be featured on the Studio Facebook, Twitter and Instagram channels. Subscribe to the Studio YouTube channel for tutorials, tips and tricks by industry-leading artists, and stay up to date on all things Studio by signing up for the NVIDIA Studio newsletter.


Groundbreaking Updates to NVIDIA Studio Power the 3D Virtual Worlds of Tomorrow, Today

We’re at the dawn of the next digital frontier. Creativity is fueling new developments in design, innovation and virtual worlds.

For the creators driving this future, we’ve built NVIDIA Studio, a fully accelerated platform with high-performance GPUs as the heartbeat for laptops and desktops.

This hardware is paired with exclusive NVIDIA RTX-accelerated software optimizations in top creative apps and a suite of tools like NVIDIA Omniverse, Canvas and Broadcast, which help creators enhance their workflows.

And it’s all supported by specialized drivers that are updated monthly for performance and reliability — like the January Studio Driver, available starting today.

The interconnected 3D virtual worlds of tomorrow are being built today. NVIDIA Omniverse, designed to be the foundation that connects these virtual worlds, is now available to millions of NVIDIA Studio creators using GeForce RTX and NVIDIA RTX GPUs.

We’ve also introduced GeForce RTX 3080 Ti and 3070 Ti-based Studio laptops, groundbreaking hardware with heightened levels of performance — especially on battery.

Updated with a major increase in fidelity, new materials and an upgraded AI model, NVIDIA Canvas enables artists to turn simple brushstrokes into realistic landscape images by using AI.

Omniverse Available to Millions of Individual Creators and Artists Worldwide

Expanding the NVIDIA Studio ecosystem, NVIDIA Omniverse is now available at no cost to millions of individual creators with GeForce RTX and NVIDIA RTX GPUs.

Bolstered by new features and tools, NVIDIA’s real-time design collaboration and simulation platform empowers artists, designers and creators to connect and collaborate in leading 3D design applications from their RTX-powered laptop, desktop or workstation.

Multi-app workflows can grind to a halt with near-constant exporting and importing. With Omniverse, creators can connect their favorite 3D design tools to a single scene and simultaneously create and edit between the apps.

 

We’ve also announced platform developments for Omniverse Machinima with new free game characters, objects and environments from Mechwarrior 5, Shadow Warrior 3, Squad and Mount & Blade II: Bannerlord; and Omniverse Audio2Face with new blendshape support and direct export to Epic’s MetaHuman; plus early access to new platform features like Omniverse Nucleus Cloud — enabling simple “one-click-to-collaborate” sharing of large Omniverse 3D scenes.

Learn more about Omniverse, the latest enhancements and its general availability, and download the latest version at nvidia.com/omniverse.

New Studio Laptops With GeForce RTX 3080 Ti and RTX 3070 Ti GPUs

NVIDIA Studio laptops provide the best mobile performance for 3D creation. The new GeForce RTX 3080 Ti Laptop GPU features 16GB of the fastest GDDR6 memory ever shipped in a laptop and higher performance than the desktop TITAN RTX.

The new GeForce RTX 3070 Ti also delivers fantastic performance — it’s up to 70 percent faster than RTX 2070 SUPER laptops.

Next-generation laptop technologies are amping up performance. We’ve worked with CPU vendors on CPU Optimizer. It’s a new, low-level framework enabling the GPU to further optimize performance, temperature and power of next-gen CPUs. As a result, CPU efficiency is improved and power is transferred to the GPU for more performance in creative applications.

In compute-heavy apps like Adobe Premiere, Blender and Matlab, we’ve developed Rapid Core Scaling. It enables the GPU to sense the real-time demands of applications and use only the cores it needs rather than all of them. This frees up power that can be used to run the active cores at higher frequencies, delivering up to 3x more performance for intensive creative work on the go.

ASUS, MSI and Razer are launching new laptops with a wide range of designs — and up to GeForce RTX 3080 Ti GPUs — starting in February.

Paint by AI With NVIDIA Canvas

Bolstered by work from the NVIDIA Research team developing GauGAN2, NVIDIA Canvas is now available with 4x higher resolution and five new materials.

The GauGAN2 AI model incorporated in the latest update helps deliver more realistic images with greater definition and fewer artifacts.

 

Five new materials — straw, flowers, mud, dirt and bush — liven up scenes and create richer landscape environments.

Read more about the latest NVIDIA Canvas update.

Reliable Performance

Creators can download the January Studio Driver, available now with improved performance and reliability for the Omniverse and Canvas updates.

 

With monthly updates, NVIDIA Studio Drivers deliver smooth performance on creative applications and the best possible experience when using NVIDIA GPUs. Extensive multi-app workflow testing ensures the latest apps run smoothly.

New Desktop Solutions, Coming Soon

Finally, the GeForce RTX 3050 GPU brings even more choice for creators. Our new entry-level GPU provides the most accessible way of getting great RTX benefits — real-time ray tracing, AI, a top-notch video encoder and video acceleration. Starting at just $249, it’s a great way to start creating with RTX. Look for availability from partners worldwide on Jan. 27.

 

One more thing: Keep an eye out for more information on GeForce RTX 3090 Ti later this month. It’ll have a huge 24GB of lightning-fast video memory, making it perfect for conquering nearly any creative task.

Subscribe to the Studio YouTube channel for tutorials, tips and tricks by industry-leading artists, and stay up to date on all things Studio by signing up for the NVIDIA Studio newsletter.


NVIDIA Makes Free Version of Omniverse Available to Millions of Individual Creators and Artists Worldwide

Designed to be the foundation that connects virtual worlds, NVIDIA Omniverse is now available to millions of individual NVIDIA Studio creators using GeForce RTX and NVIDIA RTX GPUs.

In a special address at CES, NVIDIA also announced new platform developments for Omniverse Machinima and Omniverse Audio2Face, new platform features like Nucleus Cloud and 3D marketplaces, as well as ecosystem updates.

With Omniverse, NVIDIA’s real-time 3D design collaboration and virtual world simulation platform, artists, designers and creators can use leading design applications to create 3D assets and scenes from their laptop or workstation.

Since its open beta launch a year ago, Omniverse has been downloaded by almost 100,000 creators who are accelerating their workflows with its core rendering, physics and AI technologies.

“With this technology, content creators get more than just a fast renderer,” said Zhelong Xu, a digital artist and Omniverse Creator based in Shanghai. “NVIDIA Omniverse and RTX give artists a powerful platform with infinite possibilities.”

Creators like Xu are the people who will use Omniverse’s tools to build and collaborate on the vast amounts of content needed for this next generation of the web. They’re building interconnected 3D virtual worlds for commerce, entertainment, creativity and industry.

These boundless worlds will be populated with shops, homes, people, robots, factories, museums — a staggering amount of 3D content. This content is challenging to produce, typically requiring multiple, often incompatible tools.

Omniverse connects these independent 3D design worlds into a shared virtual scene.

The culmination of over 20 years of NVIDIA’s groundbreaking work, Omniverse brings graphics, AI, simulation and scalable compute into a single platform to enhance existing 3D workflows.

With Omniverse, free for individual users, GeForce RTX Studio creators can connect their favorite 3D design tools to a single scene and simultaneously create and edit between the apps.

Download it at nvidia.com/omniverse.

Omniverse Enterprise, the paid subscription for professional teams, was made available at GTC in November and is sold by NVIDIA’s global partner network. Today’s announcement brings Omniverse capabilities into the hands of individual creators.

New Omniverse Features 

New features within Omniverse include: 

  • Omniverse Nucleus Cloud enables simple “one-click-to-collaborate” sharing of large Omniverse 3D scenes, meaning artists can collaborate from across the room or across the globe without transferring massive datasets. Changes made by one artist are reflected back to every collaborator — like working on a cloud-shared document, but for a 3D scene.
  • New support for the Omniverse ecosystem from leading 3D marketplaces and digital asset libraries gives creators an even easier way to build their scenes. TurboSquid by Shutterstock, CGTrader, Sketchfab and Twinbru have released thousands of Omniverse-ready assets for creators, all based on the Universal Scene Description (USD) format and found directly in the Omniverse Launcher. Reallusion’s ActorCore, Daz3D and e-on software’s PlantCatalog will soon release their own Omniverse-ready assets.
  • Omniverse Machinima for RTX creators who love to game — now featuring new, free characters, objects and environments from leading game titles like Mechwarrior 5 and Shadow Warrior 3, plus Mount & Blade II: Bannerlord and Squad assets in the Machinima library. Creators can remix and recreate their own game cinematics with these assets by dragging and dropping them into their scenes.
  • Omniverse Audio2Face, a revolutionary AI-enabled app that instantly animates a 3D face with just an audio track, now offers blendshape support and direct export to Epic’s MetaHuman Creator app. This leaves the tedious, manual blend-shaping process to AI, so artists and creators can spend more time on their creative workflows.

Omniverse Ecosystem Expands

The NVIDIA Omniverse ecosystem expands with new Omniverse Connectors, extensions and asset libraries — built by many partners.

Today, there are 14 connectors to applications like Autodesk 3ds Max, Autodesk Maya and Epic Games’ Unreal Engine — with many more in the pipeline, including an Adobe Substance 3D Material Extension coming soon.

The latest Omniverse Connectors, extensions, and asset libraries include:

  • e-on software’s VUE, an all-in-one application that lets users create digital 3D nature, from skies and volumetric clouds to terrains, large-scale ecosystems, wind-swept vegetation populations, open water bodies, roads and rocks, all based on nature’s rules. It includes a native Omniverse live link connector that syncs all scene modifications directly to Omniverse stages.
  • e-on software’s PlantFactory, a vegetation application that allows modeling of foliage as small as twigs or as big as giant redwood trees from scratch. It also models animation such as wind and permits asset export in a wide variety of formats, with a direct link from PlantFactory to Omniverse.
  • e-on software’s PlantCatalog, which provides a collection of over 120 ready-made procedural vegetation assets.
  • Twinbru, a “digital twin of physical fabric” provider that supplies interior and exterior furnishing fabrics for drapery, sheers, curtains and upholstery applications, offering 21,000 different fabrics and 11,000 digitized fabric twins — all high quality and physically accurate — now integrated into Omniverse to help streamline manufacturing and architectural designs.

Watch the NVIDIA special address on demand for a recap of all of the company’s announcements at CES.

Creators can download NVIDIA Omniverse for free, submit their work to the NVIDIA Omniverse gallery, and find resources through our forums, Medium, Twitter, YouTube, Twitch, Instagram and Discord server.
