Rise to the Cloud: ‘Monster Hunter Rise’ and ‘Sunbreak’ Expansion Coming Soon to GeForce NOW

Fellow Hunters, get ready! This GFN Thursday welcomes Capcom’s Monster Hunter Rise and the expansion Sunbreak to the cloud, arriving soon for members.

Settle down for the weekend with 10 new games supported in the GeForce NOW library, including The Settlers: New Allies.

Plus, Amsterdam and Ashburn are next to light up on the RTX 4080 server map, giving nearby Ultimate members the power of an RTX 4080 gaming rig in the cloud. Keep checking the weekly GFN Thursday to see where the RTX 4080 SuperPOD upgrade rolls out next.

Palicos, Palamutes and Wyverns, Oh My

The hunt is on! Monster Hunter Rise, the popular action role-playing game from Capcom, is joining GeForce NOW soon. Protect the bustling Kamura Village from ferocious monsters; take on hunting quests with a variety of weapons and new hunting actions with the Wirebug; and work alongside a colorful cast of villagers to defend their home from the Rampage — a catastrophic event that nearly destroyed the village 50 years prior.

Members can expand the hunt with Monster Hunter Rise: Sunbreak, which adds new quests, monsters, locales, gear and more. And regular updates keep Hunters on the job, like February’s Free Title Update 4, which marks the return of the Elder Dragon Velkhana, the lord of the tundra that freezes all in its path.

Monster Hunter Rise Sunbreak on GeForce NOW
Carve out more time for monster hunting by playing in the cloud.

Whether playing solo or with a buddy, GeForce NOW members can take on dangerous new monsters anytime, anywhere. Ultimate members can protect Kamura Village at up to 4K at 120 frames per second — or immerse themselves in the most epic monster battles at ultrawide resolutions and 120 fps. Members won’t need to wait for downloads or worry about storage space, and can take the action with them across nearly all of their devices.

Rise to the challenge by upgrading today and get ready for Monster Hunter Rise to hit GeForce NOW soon.

New Week, New Games

The Settlers New Allies on GeForce NOW
Onward! There’s much to explore in the Forgotten Plains.

Kick off the weekend with 10 new titles, including The Settlers: New Allies. Choose among three unique factions and explore this whole new world powered by state-of-the-art graphics. Your settlement has never looked so lively.

Check out the full list of this week’s additions:

  • Labyrinth of Galleria: The Moon Society (New release on Steam)
  • Wanted: Dead (New release on Steam and Epic)
  • Elderand (New release on Steam, Feb. 16)
  • Wild West Dynasty (New release on Steam, Feb. 16)
  • The Settlers: New Allies (New release on Ubisoft, Feb. 17)
  • Across the Obelisk (Steam)
  • Captain of Industry (Steam)
  • Cartel Tycoon (Steam)
  • SimRail — The Railway Simulator (Steam)
  • Warpips (Epic Games Store)

The monthlong #3YearsOfGFN celebration continues on our Twitter and Facebook channels. Members shared the most beautiful place they’ve visited in-game on GFN.

And make sure to check out the question we have this week for GeForce NOW’s third anniversary celebration!

 

Redefining Workstations: NVIDIA, Intel Unlock Full Potential of Creativity and Productivity for Professionals

AI-augmented applications, photorealistic rendering, simulation and other technologies are helping professionals achieve business-critical results from multi-app workflows faster than ever.

Running these data-intensive, complex workflows, as well as sharing data and collaborating across geographically dispersed teams, requires workstations with high-end CPUs, GPUs and advanced networking.

To help meet these demands, Intel and NVIDIA are powering new platforms with the latest Intel Xeon W and Intel Xeon Scalable processors, paired with NVIDIA RTX 6000 Ada generation GPUs, as well as NVIDIA ConnectX-6 SmartNICs.

These new workstations bring together the highest levels of AI computing, rendering and simulation horsepower to tackle demanding workloads across data science, manufacturing, broadcast, media and entertainment, healthcare and more.

“Professionals require advanced power and performance to run the most intensive workflows, like using AI, rendering in real time or running multiple applications simultaneously,” said Bob Pette, vice president of professional visualization at NVIDIA. “The new Intel- and NVIDIA-Ada powered workstations deliver unprecedented speed, power and efficiency, enabling professionals everywhere to take on the most complex workflows across all industries.”

“The latest Intel Xeon W processors — featuring a breakthrough new compute architecture — are uniquely designed to help professional users tackle the most challenging current and future workloads,” said Roger Chandler, vice president and general manager of Creator and Workstation Solutions in the Client Computing Group at Intel. “Combining our new Intel Xeon workstation processors with the latest NVIDIA GPUs will unleash the innovation and creativity of professional creators, artists, engineers, designers, data scientists and power users across the world.”

Serving New Workloads 

Metaverse applications and the rise of generative AI require a new level of computing power from the underlying hardware. Creating digital twins in simulated, photorealistic environments that obey the laws of physics, and planning factories within them, are just two examples of workflows made possible by NVIDIA Omniverse Enterprise, a platform for creating and operating metaverse applications.

BMW Group, for example, is using NVIDIA Omniverse Enterprise to design an end-to-end digital twin of an entire factory. This involves collaboration with thousands of planners, product engineers and facility managers in a single virtual environment to design, plan, simulate and optimize highly complex manufacturing systems before a factory is actually built or a new product is integrated into the real world.

The need for accelerated computing power is growing exponentially due to the explosion of AI-augmented workflows, from traditional R&D and data science workloads to edge devices on factory floors or in security offices, to generative AI solutions for text conversations and text-to-image applications.

Extended reality (XR) solutions for collaborative work also require significant computing resources. Examples of XR applications include design reviews, product design validation, maintenance and support training, rehearsals, interactive digital twins and location-based entertainment. All of these demand high-resolution, photoreal images to create the most intuitive and compelling immersive experiences, whether available locally or streamed to wireless devices.

Next-Generation Platform Features 

With a breakthrough new compute architecture for faster individual CPU cores and new embedded multi-die interconnect bridge packaging, the Xeon W-3400 and Xeon W-2400 series of processors enable unprecedented scalability for increased workload performance. Available with up to 56 cores in a single socket, the top-end Intel Xeon w9-3495X processor features a redesigned memory controller and larger L3 cache, delivering up to 28% more single-threaded(1) and 120% more multi-threaded(2) performance over the previous-generation Xeon W processors.

Based on the NVIDIA Ada Lovelace GPU architecture, the latest NVIDIA RTX 6000 brings incredible power efficiency and performance to the new workstations. It features 142 third-generation RT Cores, 568 fourth-generation Tensor Cores and 18,176 latest-generation CUDA cores combined with 48GB of high-performance graphics memory to provide up to 2x ray-tracing, AI, graphics and compute performance over the previous generation.

NVIDIA ConnectX-6 Dx SmartNICs enable professionals to handle demanding, high-bandwidth 3D rendering and computer-aided design tasks, as well as traditional office work with line-speed network connectivity support based on two 25Gbps ports and GPUDirect technology for increasing GPU bandwidth by 10x over standard NICs. The high-speed, low-latency networking and streaming capabilities enable teams to move and ingest large datasets or to allow remote individuals to collaborate across applications for design and visualization.

Availability 

The new generation of workstations powered by the latest Intel Xeon W and Intel Xeon Scalable processors and NVIDIA RTX 6000 Ada Generation GPUs will be available for preorder beginning today from BOXX and HP, with more coming soon from other workstation system integrators.

To learn more, tune into the launch event.

 

(1) Based on SPEC CPU 2017_Int (1-copy) using Intel validation platform comparing Intel Xeon w9-3495X (56c) versus previous generation Intel Xeon W-3275 (28c).
(2) Based on SPEC CPU 2017_Int (n-copy) using Intel validation platform comparing Intel Xeon w9-3495X (56c) versus previous generation Intel Xeon W-3275 (28c).

Blender Alpha Release Comes to Omniverse, Introducing Scene Optimization Tools, Improved AI-Powered Character Animation

Whether creating realistic digital humans that can express emotion or building immersive virtual worlds, 3D artists can reach new heights with NVIDIA Omniverse, a platform for creating and operating metaverse applications.

A new Blender alpha release, now available in the Omniverse Launcher, lets users of the 3D graphics software optimize scenes and streamline workflows with AI-powered character animations.

Save Time, Effort With New Blender Add-Ons

The new scene optimization add-on in the Blender release enables creators to fix bad geometry and generate automatic UVs, or 2D maps of 3D objects. It also reduces the number of polygons that need to be rendered, improving the scene’s overall performance while significantly lowering file size and CPU and GPU memory usage.
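
For readers curious what these operations look like outside the add-on, here is a minimal, hypothetical sketch (not the add-on’s actual code) of polygon reduction and automatic UV generation on the active mesh using Blender’s Python API:

```python
# Hypothetical illustration of decimation + automatic UVs in Blender's Python API.
# Assumes a mesh object is selected as the active object.
import bpy

obj = bpy.context.active_object

# Reduce polygon count with a Decimate modifier, keeping roughly 30% of the faces.
decimate = obj.modifiers.new(name="Decimate", type='DECIMATE')
decimate.ratio = 0.3
bpy.ops.object.modifier_apply(modifier=decimate.name)

# Generate automatic UVs with Smart UV Project.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.uv.smart_project()
bpy.ops.object.mode_set(mode='OBJECT')
```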

Plus, with a new Audio2Face add-on, anyone can now accomplish what used to require a technical rigger or animator.

A panel in the add-on makes it easier to use Blender characters in Audio2Face, an AI-enabled tool that automatically generates realistic facial expressions from an audio file.

This new functionality eases the process of bringing generated face shapes back onto rigs — that is, digital skeletons — by applying shapes exported through the Universal Scene Description (USD) framework onto a character, even one that is fully rigged with a working digital skeleton for its whole body. The integration of the facial shapes doesn’t alter the rig, so Audio2Face shapes and animation can be applied to characters — whether for games, shows and films, or simulations — at any point in the artist’s workflow.

Realistic Character Animation Made Easy

Audio2Face puts AI-powered facial animation in the hands of every Blender user who works with Omniverse.

Using the new Blender add-on for Audio2Face, animator and popular YouTuber Marko Matosevic, aka Markom 3D, rigged and animated a Battletoads-inspired character using just an audio file.

Australia-based Matosevic joined Dave Tyner, a technical evangelist at NVIDIA, on a livestream to showcase their live collaboration across time zones, connecting 3D applications in a real-time Omniverse jam session. The two used the new Blender alpha release with Omniverse to make progress on one of Matosevic’s short animations.

The new Blender release was also on display last month at CES in The Artists’ Metaverse, a demo featuring seven artists, across time zones, who used Omniverse Nucleus Cloud, Autodesk, SideFX, Unreal Engine and more to create a short cinematic in real time.

Creators can save time and simplify processes with the add-ons available in Omniverse’s Blender build.

NVIDIA principal artist Zhelong Xu, for example, used Blender and Omniverse to visualize an NVIDIA-themed “Year of the Rabbit” zodiac.

“I got the desired effect very quickly and tested a variety of lighting effects,” said Xu, an award-winning 3D artist who’s previously worked at top game studio Tencent and made key contributions to an animated show on Netflix.

Get Plugged Into the Omniverse 

Learn more about Blender and Omniverse integrations by watching a community livestream on Wednesday, Feb. 15, at 11 a.m. PT via Twitch and YouTube.

And the session catalog for NVIDIA GTC, a global AI conference running online March 20-23, features hundreds of curated talks and workshops for 3D creators and developers. Register free to hear from NVIDIA experts and industry luminaries on the future of technology.

Creators and developers can download NVIDIA Omniverse free. Enterprises can try Omniverse Enterprise free on NVIDIA LaunchPad. Follow NVIDIA Omniverse on Instagram, Medium, Twitter and YouTube for additional resources and inspiration. Check out the Omniverse forums, and join our Discord server and Twitch channel to chat with the community.

Making a Splash: AI Can Help Protect Ocean Goers From Deadly Rips

Surfers, swimmers and beachgoers face a hidden danger in the ocean: rip currents. These narrow channels of water can flow away from the shore at speeds up to 2.5 meters per second, making them one of the biggest safety risks for those enjoying the ocean.

To help keep beachgoers safe, Christo Rautenbach, a coastal and estuarine physical processes scientist, has teamed up with the National Institute of Water and Atmospheric Research in New Zealand to develop a real-time rip current identification tool using deep learning.

On this episode of the NVIDIA AI Podcast, host Noah Kravitz interviews Rautenbach about how AI can be used to identify rip currents and the potential for the tool to be used globally to help reduce the number of fatalities caused by rip currents.

Developed in collaboration with Surf Lifesaving New Zealand, the rip current identification tool has achieved a detection rate of roughly 90% in trials. Rautenbach also shares the research behind the technology, which was published in the November 2022 edition of the journal Remote Sensing.

You Might Also Like

Art(ificial) Intelligence: Pindar Van Arman Builds Robots That Paint
Pindar Van Arman, an American artist and roboticist, designs painting robots that explore the differences between human and computational creativity. Since his first system in 2005, he has built multiple artificially creative robots. The most famous, Cloud Painter, was awarded first place at Robotart 2018.

Real or Not Real? Attorney Steven Frank Uses Deep Learning to Authenticate Art
Steven Frank is a partner at the law firm Morgan Lewis, specializing in intellectual property and commercial technology law. He’s also half of the husband-wife team that used convolutional neural networks to authenticate artistic masterpieces, including da Vinci’s Salvator Mundi, with AI’s help.

GANTheftAuto: Harrison Kinsley on AI-Generated Gaming Environments
Humans playing games against machines is nothing new, but now computers can develop games for people to play. Programming enthusiast and social media influencer Harrison Kinsley created GANTheftAuto, an AI-based neural network that generates a playable chunk of the classic video game Grand Theft Auto V.

Subscribe to the AI Podcast on Your Favorite Platform

You can now listen to the AI Podcast through Amazon Music, Apple Music, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn.

Featured image credit: T. Caulfield

3D Creators Share Art From the Heart This Week ‘In the NVIDIA Studio’

Editor’s note: This post is part of our weekly In the NVIDIA Studio series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how NVIDIA Studio technology improves creative workflows. We’re also deep diving on new GeForce RTX 40 Series GPU features, technologies and resources, and how they dramatically accelerate content creation.

Love and creativity are in the air this Valentine’s Day In the NVIDIA Studio, as 3D artist Molly Brady presents Birth of Venus (Redux), a parody scene inspired by Sandro Botticelli’s iconic painting The Birth of Venus.

Plus, join the #ShareYourHeART challenge by sharing what Valentine’s Day means to you in a scene built with NVIDIA Omniverse, a platform for creating and operating metaverse applications. Use the hashtag to post artwork — whether heartened by love, chocolate, teddy bears or anything else Valentine’s-themed — for a chance to be featured across NVIDIA social media channels.

3D artist Tanja Langgner’s delectable scene with chocolate hearts, featured below, is just one example.

Also, get a chance to win a GeForce RTX 3090 Ti GPU in the NVIDIA Instant NeRF VR sweepstakes. Named by TIME Magazine as one of the best inventions of 2022, NVIDIA Instant NeRF enables creators to rapidly create 3D models from 2D images and use them in virtual scenes. The tool provides a glimpse into the future of photography, 3D graphics and virtual worlds. Enter the sweepstakes by creating your own NeRF scene, and look to influencer Paul Trillo’s Instagram for inspiration.

New NVIDIA Studio laptops powered by GeForce RTX 40 Series Laptop GPUs are now available, including MSI’s Stealth 17 Studio and Razer’s Blade 16 and 18 models — with more on the way. Learn why PC World said the “RTX 4090 pushes laptops to blistering new frontiers: Yes, it’s fast, but also much more.”

Download the latest NVIDIA Studio Driver to enhance existing app features and reduce repetitive tasks. ON1 NoNoise AI, an app that quickly removes image noise while preserving and enhancing photo details, released an update speeding this process by an average of 50% on GeForce RTX 40 Series GPUs.

And NVIDIA GTC, a global conference for the era of AI and the metaverse, is running online March 20-23, with a slew of creator sessions, Omniverse tutorials and more — all free with registration. Learn more below.

A Satirical Valentine

Molly Brady is a big fan of caricatures.

“I love parody,” she gleefully admitted. “Nothing pleases me more than taking the air out of something serious and stoic.”

Botticelli’s The Birth of Venus painting, often referenced and revered, presented Brady with an opportunity for humor through her signature visual style.

According to Brady, “3D allows you to mix stylistic artwork with real-world limitations,” which is why the touchable, cinematic look of stop-motion animation heavily inspires her work.

“Stop-motion reforms found items into set pieces for fantastical worlds, giving them a new life and that brings me immense joy,” she said.

Brady’s portfolio features colorful, vibrant visuals with a touch of whimsical humor.

Brady composited Birth of Venus (Redux) with placeholder meshes, focusing on the central creature figure, before confirming the composition and scale were to her liking. She then sculpted finer details in the flexible 3D modeling app Foundry Modo, assisted by RTX acceleration in OTOY OctaneRender, which was made possible by her GeForce RTX 4090 GPUs.

Advanced sculpting was completed in Modo.

She then applied materials and staged lighting with precision, gaining speed from the RTX-accelerated ray-tracing renderer. Brady can deploy OctaneRender, her preferred 3D renderer, in over 20 3D applications, including Autodesk 3ds Max, Blender and Maxon’s Cinema 4D.

After rendering the image, Brady deployed several post-processing features in Adobe Photoshop to help ensure the colors popped, as well as to add grain to compensate for any compression when posted on social media. Her RTX GPU affords over 30 GPU-accelerated features, such as blur gallery, object selection, liquify and smart sharpen.

“Art has been highly therapeutic for me, not just as an outlet to express emotion but to reflect how I see myself and what I value,” Brady said. “Whenever I feel overwhelmed by the pressure of expectation, whether internal or external, I redirect my efforts and instead create something that brings me joy.”

3D artist Molly Brady.

View more of Brady’s artwork on Instagram.

Valen(time) to Join the #ShareYourHeART Challenge

The photorealistic, chocolate heart plate beside a rose-themed mug and napkins, featured below, is 3D artist and illustrator Tanja Langgner’s stunning #ShareYourHeART challenge entry.

Hungry?

Langgner gathered assets and sculpted the heart shape using McNeel Rhino and Maxon ZBrush. Next, she assembled the pieces in Blender and added textures using Adobe Substance 3D Painter. The scene was then exported from Blender as a USD file and brought into Omniverse Create, where the artist added lighting and virtual cameras to capture the sweets with the perfect illumination and angles.
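
The USD handoff from Blender can also be scripted. As a minimal sketch (the file path is illustrative, and export options vary by Blender version), the built-in USD exporter can be driven from Python:

```python
# Export the current Blender scene to a USD file for import into Omniverse Create.
# File path is illustrative; run from Blender's Python console or a script.
import bpy

bpy.ops.wm.usd_export(filepath="/tmp/chocolate_hearts.usd")
```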

“The main reason I started using Omniverse was its capability to link all my favorite apps,” Langgner said. “Saving time on exporting, importing and recreating materials in each app is a dream come true.”

Learn more about Langgner’s creative journey at the upcoming Community Spotlight livestream on the Omniverse Twitch channel and YouTube on Wednesday, Feb. 22, from 11 a.m. to 12 p.m. PT.

Join the #ShareYourHeART challenge by posting your own Valentine’s-themed Omniverse scene on social media using the hashtag. Entries could be featured on the NVIDIA Omniverse Twitter, LinkedIn and Instagram accounts.

Creative Boosts at GTC 

Experience this spring’s GTC for more inspiring content, expert-led sessions and a must-see keynote to accelerate your life’s creative work.

Catch these sessions live or watch on demand:

  • 3D Art Goes Multiplayer: Behind the Scenes of Adobe Substance’s “End of Summer” Project With Omniverse [S51239]
  • 3D and Beyond: How 3D Artists Can Build a Side Hustle in the Metaverse [SE52117]
  • NVIDIA Omniverse User Group [SE52047]
  • Accelerate the Virtual Production Pipeline to Produce an Award-Winning Sci-Fi Short Film [S51496]
  • 3D by AI: How Generative AI Will Make Building Virtual Worlds Easier [S52163]
  • Custom World Building With AI Avatars: The Little Martians Sci-Fi Project [S51360]
  • AI-Powered, Real-Time, Markerless: The New Era of Motion Capture [S51845]

Search the GTC session catalog or check out the Media and Entertainment and Omniverse topics for additional creator-focused talks.

Follow NVIDIA Studio on Instagram, Twitter and Facebook. Access tutorials on the Studio YouTube channel and get updates directly in your inbox by subscribing to the Studio newsletter. Learn more about Omniverse on Instagram, Medium, Twitter and YouTube for additional resources and inspiration. Check out the Omniverse forums, and join our Discord server and Twitch channel to chat with the community.

Crossing Continents: XPENG G9 SUV and P7 Sedan Set Course for Scandinavia, the Netherlands

Electric automaker XPENG’s flagship G9 SUV and P7 sports sedan are now available for order in Sweden, Denmark, Norway and the Netherlands — an expansion revealed last week at the eCar Expo in Stockholm.

The intelligent electric vehicles are built on the high-performance NVIDIA DRIVE Orin centralized compute architecture and deliver AI capabilities that are continuously upgradable through over-the-air software updates.

“This announcement represents a significant milestone as we build our presence in Europe,” said Brian Gu, vice chair and president of XPENG. “We believe both vehicles deliver a new level of sophistication and a people-first mobility experience, and will be the electric vehicles of choice for many European customers.”

Safety Never Takes a Back Seat 

The XPENG G9 and P7 come equipped with XPENG’s proprietary XPILOT Advanced Driver Assistance System, which offers safety, driving and parking support through a variety of smart functions.

The system is supported by 29 sensors, including high-definition radars, ultrasonic sensors, and surround-view and high-perception cameras, enabling the vehicles to safely tackle diverse driving scenarios.

The EVs are engineered to meet the European New Car Assessment Programme’s five-star safety standards, along with the European Union’s stringent whole vehicle type approval certification.

Leading Charge for the Long Haul

The rear-wheel-drive (RWD), long-range version of the XPENG G9 can travel up to 354 miles on a single charge and features a new powertrain system for ultrafast charging, going from 10% to 80% in just 20 minutes. The P7 RWD, long-range model also has optimized charging power to reach 80% in 29 minutes, while offering up to 358 miles on a single charge.

To ensure an easy, fast charging experience, XPENG customers can access more than 400,000 charging stations in Europe through the automaker’s collaboration with major third-party charging operators and mobility service providers.

The XPENG G9 features faster charging and longer range, up to 354 miles on a single charge.

Beauty Backed by Brains and Brawn

With the high compute power found only with DRIVE Orin, the XPENG G9 and P7 advanced driving systems boast superior performance, while sporting sleek and elegant designs, quality craftsmanship and comfort to meet the most discerning of tastes.

The upgraded in-car Xmart operating system (OS) features a new 3D user interface that offers support in English and other European languages, depending on the market. The OS comes with an improved voice assistant that can distinguish voice commands from four zones in the cabin. It also features wide infotainment screens and a library of in-car apps to assist and entertain both the driver and passengers.

The G9 and P7 are available in all-wheel drive (AWD) or RWD configurations. XPENG reports that the G9’s AWD version delivers up to 551 horsepower, and can accelerate from 0 to 100 kilometers per hour in 3.9 seconds, while the upgraded P7 AWD model can do the same in 4.1 seconds.

The XPENG P7 features an immersive cabin experience.

Deliveries of the P7 will begin in June, while the G9 is expected to start in September. To support demand in key European markets, XPENG plans to open delivery and service centers in Lørenskog, Norway, this month — as well as in Badhoevedorp, the Netherlands; Stäket, Sweden; and Hillerød, Denmark in Q2 2023.

The EV maker expects to open additional authorized service locations in other European countries by the end of the year.

Featured image: Next-gen XPENG P7 sports sedan, powered by NVIDIA DRIVE Orin.

3D Artist Brings Ride and Joy to Automotive Designs With Real-Time Renders Using NVIDIA RTX

Designing automotive visualizations can be incredibly time consuming. To make the renders look as realistic as possible, artists need to consider material textures, paints, realistic lighting and reflections, and more.

For 3D artist David Baylis, it’s important to include these details and still create high-resolution renders in a short amount of time. That’s why he uses the NVIDIA RTX A6000 GPU, which allows him to use features like real-time ray tracing so he can quickly get the highest-fidelity image.

The RTX A6000 also enables Baylis to handle massive amounts of data, thanks to its 48GB of VRAM. In computer graphics, the higher the resolution of the image, the more memory is used. And with the RTX A6000, Baylis can work with larger scenes without worrying about memory limits slowing him down.

Bringing Realistic Details to Life With RTX

To create his automotive visualizations, Baylis starts with 3D modeling in Autodesk 3ds Max software. He’ll set up the scene and work on the car model before importing it to Unreal Engine, where he works on lighting and shading for the final render.

In Unreal Engine, Baylis can experiment with details such as different car paints to see what works best on the 3D model. Seeing all the changes in real time enables Baylis to iterate and experiment with design choices, so he can quickly achieve the look and feel he’s aiming for.

In one of his latest projects, Baylis created a scene with an astounding polycount of more than 50 million triangles. Using the RTX A6000, he could easily move around the scene to see the car from different angles. Even in path-traced mode, the A6000 allows Baylis to maintain high frame rates while switching from one angle to the next.

Rendering at a higher resolution is important to create photorealistic visuals. In the example below, Baylis shows a car model rendered at 4K resolution. But when zoomed in, the graphics start to appear blurry.

When the car is rendered at 12K resolution, the details on the car become sharper. By rendering at higher resolutions, the artist can include extra details to make the car look even more realistic. With the RTX A6000, Baylis said the 12K render took under 10 minutes to complete.

It’s not just the real-time ray tracing and path tracing that help Baylis enhance his designs. There’s another component he said he never thought would make an impact on creative workflows — GPU memory.

The RTX A6000 GPU is equipped with 48GB of VRAM, which allows Baylis to load incredibly high-resolution textures and high-polygon assets. The VRAM is especially helpful for automotive renders because the datasets behind them can be massive.

The large memory of the RTX A6000 allows him to easily manage the data.

“If we throw more polygons into the scene, or if we include more scanned assets, it tends to use a lot of VRAM, but the RTX A6000 can handle all of it,” explained Baylis. “It’s great not having to think about optimizing all those assets in the scene. Instead, we can just scan the data in, even if the assets are 8K, 16K or even 24K resolution.”

When Baylis rendered one still frame at 8K resolution, he saw it only took up 24GB of VRAM. So he pushed the resolution higher to 12K, using almost 35GB of VRAM — with plenty of headroom to spare.
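
To put those numbers in perspective, here’s a back-of-the-envelope sketch, assuming 16:9 resolutions and 32-bit float RGBA buffers (both assumptions, since the exact render settings aren’t given). It estimates only the raw image-buffer size; the 24GB and 35GB totals above are dominated by the scene’s textures and geometry rather than the frame itself:

```python
# Rough estimate of raw render-buffer memory at different resolutions.
# Resolutions assume a 16:9 aspect ratio; the "12K" width of 11520 px is an assumption.
BYTES_PER_PIXEL = 4 * 4  # RGBA, 32-bit float per channel

for label, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320), "12K": (11520, 6480)}.items():
    gigabytes = w * h * BYTES_PER_PIXEL / 1e9
    print(f"{label}: {w}x{h} = {w * h / 1e6:.0f} megapixels, ~{gigabytes:.2f} GB per float RGBA buffer")
```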

“This is an important feature to highlight, because when people look at new GPUs, they immediately look at benchmarks and how fast it can render things,” said Baylis. “And it’s good if you can render graphics a minute or two faster, but if you really want to take projects to the finish line, you need more VRAM.”

Using NVLink, Baylis can bridge two NVIDIA RTX A6000 GPUs to scale memory and performance. With one GPU, it takes about a minute to render a path-traced image of the car; with dual RTX A6000 GPUs connected over NVLink, the render time drops by almost half. NVLink also pools GPU memory, providing 96GB of VRAM in total, which makes Baylis’ animation workflows much faster and easier to manage.

Check out more of Baylis’ work in the video below, and learn more about NVIDIA RTX. And join us at NVIDIA GTC, which takes place March 20-23, to learn more about the latest technologies shaping the future of design and visualization.

Gather Your Party: GFN Thursday Brings ‘Baldur’s Gate 3’ to the Cloud

Venture to the Forgotten Realms this GFN Thursday in Baldur’s Gate 3, streaming on GeForce NOW.

Celebrations for the cloud gaming service’s third anniversary continue with a Dying Light 2 reward that’s to die for. It’s the cherry on top of three new titles joining the GeForce NOW library this week.

Roll for Initiative

Mysterious abilities are awakening inside you. Embrace corruption or fight against darkness itself in Baldur’s Gate 3 (Steam), a next-generation role-playing game set in the world of Dungeons & Dragons.

Choose from a wide selection of D&D races and classes, or play as an origin character with a handcrafted background, and stream it all from the cloud, even on underpowered PCs, Macs and mobile devices. Adventure, loot, battle and romance as you journey through the Forgotten Realms and beyond. Play alone and select companions carefully, or as a party of up to four in multiplayer.

Level up to the GeForce NOW Ultimate membership to experience the power of an RTX 4080 in the cloud and all of its benefits, including up to 4K 120 frames per second gameplay on PC and Mac, and ultrawide resolution support for a truly immersive experience.

Dying 2 Celebrate This Anniversary

To celebrate the third anniversary of GeForce NOW, members can now check their accounts to make sure they received the gift of free Dying Light 2 rewards.

Dying Light 2 GeForce NOW Anniversary Reward
You’re all set to survive the post-apocalyptic wasteland with this loadout.

Claim a new in-game outfit dubbed “Post-Apo,” complete with a Rough Duster, Bleak Pants, Well-Worn Boots, Tattered Leather Gauntlets, Dystopian Mask and Spiked Bracers to scavenge around and parkour in. Members who upgrade to Ultimate and Priority memberships can claim extra loot with this haul, including the Patchy Paraglider and Scrap Slicer weapon.

Visit the GeForce NOW Rewards portal to start receiving special offers and in-game goodies.

Welcome to the Weekend

Recipe for Disaster on GeForce NOW
Uh… maybe we should order takeout.

Buckle up for three more games supported in the GeForce NOW library this week.

  • Recipe for Disaster (Free on Epic Games, Feb. 9-16)
  • Baldur’s Gate 3 (Steam)
  • Inside the Backrooms (Steam)

Members continue to celebrate #3YearsOfGFN on our social channels, sharing their favorite cloud gaming devices:

Follow #3YearsOfGFN on Twitter and Facebook all month long and check out this week’s question.

 

New NVIDIA Studio Laptops Powered by GeForce RTX 4090, 4080 Laptop GPUs Unleash Creativity

Editor’s note: This post is part of our weekly In the NVIDIA Studio series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how NVIDIA Studio technology improves creative workflows. We’re also deep diving on new GeForce RTX 40 Series GPU features, technologies and resources, and how they dramatically accelerate content creation.

The first NVIDIA Studio laptops powered by GeForce RTX 40 Series Laptop GPUs are now available, starting with systems from MSI and Razer — with many more to come.

Featuring GeForce RTX 4090 and 4080 Laptop GPUs, the new Studio laptops use the NVIDIA Ada Lovelace architecture and fifth-generation Max-Q technologies for maximum performance and efficiency. They’re fueled by powerful NVIDIA RTX technology like DLSS 3, which routinely increases frame rates by 2x or more.

Backed by the NVIDIA Studio platform, these laptops give creators exclusive access to tools and apps — including NVIDIA Omniverse, Canvas and Broadcast — and deliver breathtaking visuals with full ray tracing and time-saving AI features.

They come preinstalled with regularly updated NVIDIA Studio Drivers. This month’s driver is available for download starting today.

And when creating turns to gaming, the laptops enable playing at previously impossible levels of detail and speed.

Plus, In the NVIDIA Studio this week highlights the making of The Artists’ Metaverse, a video showcasing the journey of 3D collaboration between seven creators, across several time zones, using multiple creative apps simultaneously — all powered by NVIDIA Omniverse.

The Future of Content Creation, Anywhere

NVIDIA Studio laptops, powered by new GeForce RTX 40 Series Laptop GPUs, deliver the largest-ever generational leap in portable performance and are the world’s fastest laptops for creating and gaming.

These creative powerhouses run up to 3x more efficiently than the previous generation, enabling users to power through creative workloads in a fraction of the time, all using thin, light laptops — with 14-inch designs coming soon for the first time.

MSI’s Stealth 17 Studio comes with up to a GeForce RTX 4090 Laptop GPU.

MSI’s Stealth 17 Studio comes with up to a GeForce RTX 4090 Laptop GPU and an optional 17-inch, Mini LED 4K, 144Hz, 1000 Nits, DisplayHDR 1000 display — perfect for creators of all types. It’s available in various configurations at Amazon, Best Buy, B&H and Newegg.

New Razer Blade Studio Laptops come preinstalled with NVIDIA Broadcast.

Razer is upgrading its Blade laptops with up to a GeForce RTX 4090 Laptop GPU. Available with a 16- or 18-inch HDR-capable, dual-mode, mini-LED display, they feature a Creator mode that enables sharp, ultra-high-definition+ native resolution at 120Hz. They’re available at Razer, Amazon, Best Buy, B&H and Newegg.

The MSI Stealth 17 Studio and Razer Blade 16 and 18 come preinstalled with NVIDIA Broadcast. The app’s recent update to version 1.4 added an Eye Contact feature, ideal for content creators who want to record themselves while reading notes or avoid having to stare directly at the camera. The feature also lets video conference presenters appear as if they’re looking at their audience, improving engagement.

Designed for gamers, new units from ASUS, GIGABYTE and Lenovo are also available today and deliver great performance in creator applications with access to NVIDIA Studio benefits.

Groundbreaking Performance

The new Studio laptops have been put through rigorous testing, and many reviewers are detailing the new levels of performance and AI-powered creativity that GeForce RTX 4090 and 4080 Laptop GPUs make possible. Here’s what some are saying:

“NVIDIA’s GeForce RTX 4090 pushes laptops to blistering new frontiers: Yes, it’s fast, but also much more.” — PC World

“GeForce RTX 4090 Laptops can also find the favor of content creators thanks to NVIDIA Studio as well as AV1 support and the double NVENC encoder.” — HDBLOG.IT

“With its GeForce RTX 4090… and bright, beautiful dual-mode display, the Razer Blade 16 can rip through games with aplomb, while being equally adept at taxing content creation workloads.” — Hot Hardware

“The Nvidia GeForce RTX 4090 mobile GPU is a step up in performance, as we’d expect from the hottest graphics chip.” — PC Magazine

“Another important point – particularly in the laptop domain – is the presence of enhanced AV1 support and dual hardware encoders. That’s really useful for streamers or video editors using a machine like this.” – KitGuru

Pick up the latest Studio systems or configure a custom system today.

Revisiting ‘The Artists’ Metaverse’

Seven talented artists join us In the NVIDIA Studio this week to discuss building The Artists’ Metaverse — a spotlight demo from last month’s CES. The group reflected on how easy it was to collaborate in real time from different parts of the world.

It started in NVIDIA Omniverse, a hub that interconnects 3D workflows, replacing linear pipelines with live-sync creation. The artists connected to the platform via Omniverse Cloud.

“Setting up the Omniverse Cloud collaboration demo was a super easy process,” said award-winning 3D creator Rafi Nizam. “It was cool to see avatars appearing as people popped in, and the user interface makes it really clear when you’re working in a live state.”

Assets were exported into Omniverse with ease, thanks to the Universal Scene Description format.

Filmmaker Jae Solina, aka JSFILMZ, animated characters in Omniverse using Xsens and Unreal Engine.

“Prior to Omniverse, creating animations was such a hassle, let alone getting photorealistic animations,” Solina said. “Instead of having to reformat and upload files individually, everything is done in Omniverse in real time, leading to serious time saved.”

 

Jeremy Lightcap reflected on the incredible visual quality of the virtual scene, highlighting the seamless movement within the viewport.

The balloon 3D model was sculpted by hand in Gravity Sketch and imported into Omniverse.

“We had three Houdini simulations, a volume database file storm cloud, three different characters with motion capture and a very dense Western town set with about 100 materials,” Lightcap said. “I’m not sure how many other programs could handle that and still give you instant, path-traced lighting results.”

 

For Ashley Goldstein, an NVIDIA 3D artist and tutorialist, the demo highlighted the versatility of Omniverse. “I could update the scene and save it as a new USD layer, so when someone else opened it up, they had all of my updates immediately,” she said. “Or, if they were working on the scene at the same time, they’d be instantly notified of the updates and could fetch new content.”
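
For readers curious how that layering works under the hood, here is a minimal, hypothetical sketch using the open-source USD Python API (file and prim names are invented; this isn’t Omniverse-specific code). One artist’s edits live in their own layer, and the shared scene sublayers it so everyone composes the same updates:

```python
# Requires the USD Python bindings (e.g., `pip install usd-core`).
from pxr import Usd, UsdGeom, Sdf

# An artist saves their edits as a separate layer...
artist_layer = Sdf.Layer.CreateNew("ashley_updates.usda")
edit_stage = Usd.Stage.Open(artist_layer)
balloon = UsdGeom.Sphere.Define(edit_stage, "/World/Balloon")
balloon.GetRadiusAttr().Set(2.0)
artist_layer.Save()

# ...and the shared scene simply sublayers that file, so anyone opening the
# scene composes the updates without overwriting other contributors' work.
shared_stage = Usd.Stage.CreateNew("shared_scene.usda")
shared_stage.GetRootLayer().subLayerPaths.append("ashley_updates.usda")
shared_stage.GetRootLayer().Save()
```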

Applying colors and textures to the balloon in Adobe Substance 3D Painter.

Edward McEvenue, aka edstudios, reflected on the immense value Omniverse on RTX hardware provides, displaying fully ray-traced graphics with instant feedback. “3D production is a very iterative process, where you have to make hundreds if not thousands of small decisions along the way before finalizing a scene,” he said. “Using GPU acceleration with RTX path tracing in the viewport makes that process so much easier, as you get near-instant feedback on the changes you’re making, with all of the full-quality lighting, shadows, reflections, materials and post-production effects directly in the working viewport.”

Edits to the 3D model in Blender are reflected in real time with photorealistic detail in Omniverse.

3D artist Shangyu Wang noted Omniverse is his preferred 3D collaborative content-creation platform. “Autodesk’s Unreal Live Link for Maya gave me a ray-traced, photorealistic preview of the scene in real time, no waiting to see the final render result,” he said.

Fellow 3D artist Pekka Varis mentioned Omniverse’s positive trajectory. “New features are coming in faster than I can keep up!” he said. “It can become the main standard of the metaverse.”

Omniverse transcends location, time and apps, creating a shared space where collaboration, communication and creativity reign supreme.

Download Omniverse today, free for all NVIDIA and GeForce RTX GPU owners — including those with new GeForce RTX 40 Series laptops.

Follow NVIDIA Studio on Instagram, Twitter and Facebook. Access tutorials on the Studio YouTube channel and get updates directly in your inbox by subscribing to the Studio newsletter. Learn more about Omniverse on Instagram, Medium, Twitter and YouTube for additional resources and inspiration. Check out the Omniverse forums, and join our Discord server and Twitch channel to chat with the community.

Vietnam’s VinBrain Deploys Healthcare AI Models to 100+ Hospitals

Doctors rarely make diagnoses based on a single factor — they look at a mix of data types, such as a patient’s symptoms, laboratory and radiology reports, and medical history.

VinBrain, a Vietnam-based health-tech startup, is ensuring that AI diagnostics can take a similarly holistic view across vital signs, blood tests, medical images and more.

“Multimodal data is key to delivering precision care that can improve patient outcomes,” said Steven Truong, CEO of VinBrain. “Our medical imaging models, for instance, can analyze chest X-rays and make automated observations about abnormal findings in a patient’s heart, lungs and bones.”

If a medical-imaging AI model reports that a patient’s scan shows lung consolidation, Truong explained, doctors could combine the X-ray analysis with a large language model that reads health records to learn the patient has a fever — helping clinicians more quickly determine a more specific diagnosis of pneumonia.

Funded by Vingroup — one of Vietnam’s largest public companies — VinBrain is the creator of DrAid, which is the only AI software for automated X-ray diagnostics in Southeast Asia, and among the first AI platforms to be cleared by the FDA to detect features suggestive of collapsed lungs from chest X-rays.

Trained on a dataset of more than 2.5 million images, DrAid is deployed in more than 100 hospitals in Vietnam, Myanmar, New Zealand and the U.S. The software applies AI analysis to medical images for more than 120,000 patients each month. VinBrain is also building a host of other AI applications, including a telehealth product that analyzes lab test results, medical reports and other electronic health records.

The company is part of NVIDIA Inception, a global program designed to offer cutting-edge startups expertise, technology and go-to-market support. The VinBrain team has also collaborated with Microsoft and with academic researchers at Stanford University, Harvard University, the University of Toronto and the University of California, San Diego to develop its core AI technology and submit research publications to top conferences.

Many Models, Easy Deployment

The VinBrain team has developed more than 300 AI models that process speech, text, video and images — including X-ray, CT and MRI data.

“Healthcare is complex, so the pipeline requires hundreds of models for each step, such as preprocessing, segmentation, object detection and post-processing,” Truong said. “We aim to package these models together so everything runs on GPU servers at the hospital — like a refrigerator or household appliance.”

VinBrain recently launched DrAid Appliance, an on-premises, NVIDIA GPU-powered device for automatic screening of medical imaging studies that could improve doctors’ productivity by up to 80%, the team estimates.

The company also offers a hybrid solution, where images are preprocessed at the edge with DrAid Appliance, then sent to NVIDIA GPUs in the cloud for more demanding computational workloads.

Another way to access VinBrain’s DrAid software is through Ferrum Health, an NVIDIA Inception company that has developed a secure platform to help healthcare organizations deploy AI applications across therapeutic areas.

Accelerating AI Training and Inference

VinBrain trains its AI models — which include medical imaging, intelligent video analytics, automatic speech recognition, natural language processing and text-to-speech — using NVIDIA DGX SuperPOD. Adopting DGX SuperPOD enabled VinBrain to achieve near-linear scaling for model training, delivering 100x faster training compared with CPU-only systems and significantly shortening the turnaround time for model development.

The team is using software from NVIDIA AI Enterprise, an end-to-end solution for production AI, which includes the NVIDIA Clara platform, the MONAI open-source framework for medical imaging development and the NVIDIA NeMo conversational AI toolkit for its transcription model.

“To develop good AI models, you can’t just train once and be done,” said Truong. “It’s an evolving process to refine the neural networks.”

VinBrain has set up an early validation pipeline for its AI projects: The company tests its early-stage models across a couple dozen hospitals in Vietnam to collect performance data, gather feedback and fine-tune its neural networks.

In addition to using NVIDIA DGX SuperPOD for AI training, the company has adopted NVIDIA GPUs to improve run-time efficiency and deployment. It uses the NVIDIA Triton Inference Server and NVIDIA TensorRT to streamline inference for its hundreds of AI models on cloud-based NVIDIA Tensor Core GPUs.
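
As a rough illustration of what serving a model this way looks like from the client side, here is a hypothetical sketch using Triton’s Python HTTP client (the model name, input/output tensor names and shapes are invented; VinBrain’s actual models and interfaces are not public):

```python
# pip install tritonclient[http] numpy
import numpy as np
import tritonclient.http as httpclient

client = httpclient.InferenceServerClient(url="localhost:8000")

# A preprocessed chest X-ray: batch of 1, single channel, 512x512, float32.
image = np.random.rand(1, 1, 512, 512).astype(np.float32)

inputs = [httpclient.InferInput("INPUT__0", list(image.shape), "FP32")]
inputs[0].set_data_from_numpy(image)
outputs = [httpclient.InferRequestedOutput("OUTPUT__0")]

response = client.infer(model_name="chest_xray_classifier", inputs=inputs, outputs=outputs)
scores = response.as_numpy("OUTPUT__0")  # e.g., per-finding probabilities
print(scores)
```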

“We shifted to NVIDIA GPUs for inference because of the higher throughput, faster response time and, most importantly, the cost ratio,” Truong said.

After switching from CPUs to NVIDIA Tensor Core GPUs, the team was able to accelerate inference for medical imaging AI by more than 3x, and video streaming by more than 30x.

“In the coming years, we want to become the top company solving the problem of multimodality in healthcare data,” said Truong. “Using AI and edge computing, we aim to improve the quality and accessibility of healthcare, making intelligent insights accessible to patients and doctors across countries.”

Register for NVIDIA GTC, taking place online March 20-23, to learn more about AI in healthcare.
